Data Engineer Intern gaining hands-on experience in TD's big data platform. Collaborating on software development and system enhancements while learning about analytical tools and technologies.
Responsibilities
Apply Software Development Life Cycle (SDLC) and Project Management Life Cycle (PMLC) methodologies to complete specific development tasks for projects
Ensure defect-free programming by testing and debugging with available/appropriate tools, and participate in peer code reviews
Design, develop and deploy system enhancements
Perform Unit and System Integration tests
Document system deployment plans for system implementation and code deployment
Participate in Proof-of-Concept (POC) on new technologies and document findings
Provide recommendations with full usability analysis
Perform root-cause analysis and resolution of application defects
Adhere to standard security coding practices to ensure application is free of coding vulnerabilities
Continuously enhance knowledge and expertise, keep current with leading-edge technology trends and developments, and develop expertise in TD services, applications, infrastructure, and analytical tools and techniques that can contribute to effective solution development and delivery
Requirements
Currently enrolled in a relevant technology program, such as Computer Science, Engineering, Information Technology Management, Financial Analysis and Risk Management, Computing, etc.
Familiarity with at least one of the following technologies: Java, Scala, Python, PySpark, or SQL
Practical experience/knowledge of data generation, data masking, data subsetting, data archiving, data virtualization, project management, management reporting, data modelling, and/or database development
Knowledge of test data management tools (CA TDM or Delphix)
Familiarity with source code management tools such as Git
Familiarity with Jira and Confluence
A thirst for constantly exploring emerging technology
Skills that would be an asset, but not required: familiarity with RDBMS and big data; data warehousing, ETL, and ELT concepts; Pandas, NumPy, scikit-learn, and Matplotlib; Ruby; Linux; an interest or aptitude in financial systems and the financial industry; cloud technology (Azure, AWS, GCP)