Data Engineer Intern/Co-op working on TD’s big data platform. Apply software development and project management methodologies while collaborating with technical teams.
Responsibilities
Apply the Software Development Life Cycle (SDLC) and Project Management Life Cycle (PMLC) methodologies to complete specific development tasks for projects
Ensure defect-free programming by testing and debugging with appropriate tools, and participate in peer code reviews
Design, develop and deploy system enhancements
Perform unit and system integration tests
Document system deployment plans for system implementation and code deployment
Participate in Proof-of-Concept (POC) on new technologies and document findings
Provide recommendations with full usability analysis
Perform root-cause analysis and resolution of application defects
Adhere to standard security coding practices to ensure application is free of coding vulnerabilities
Continuously enhance knowledge and expertise, keep current with leading-edge technology trends and developments, and develop expertise in TD services, applications, infrastructure, and analytical tools and techniques that contribute to effective solution development and delivery
Requirements
Currently enrolled in a relevant technology program, such as Computer Science, Engineering, Information Technology Management, Computing, etc.
Familiarity with at least one of the following technologies: Java, Scala, Python, PySpark, or SQL
Practical experience/knowledge of data generation, data masking, data subsetting, data archiving, data virtualization, project management, management reporting, data modelling, and/or database development
Knowledge of test data management tools (CA TDM or Delphix)
Familiarity with source code management tools such as Git
Familiarity with Jira and Confluence
A thirst for constantly exploring emerging technology
Skills that would be an asset, but not required:
Familiarity with RDBMS and big data technologies
Familiarity with data warehousing, ETL, and ELT concepts
Familiarity with Pandas, NumPy, Scikit-learn, Matplotlib, Ruby, and Linux
Interest or aptitude in Financial Systems and the Financial Industry
Familiarity with Cloud technology (Azure, AWS, GCP)
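The requirements above mention practical knowledge of data masking. As a minimal sketch of that kind of task — using hypothetical data and a simple hash-based scheme, not TD's actual tooling (the posting names CA TDM and Delphix for that) — masking a sensitive column might look like:

```python
# Hypothetical illustration of column-level data masking; not TD's tooling.
import hashlib

def mask_value(value: str, salt: str = "demo-salt") -> str:
    """Replace a sensitive value with a deterministic, irreversible token."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return digest[:12]  # short token; identical inputs yield identical tokens

customers = [
    {"id": 1, "email": "alice@example.com"},
    {"id": 2, "email": "bob@example.com"},
]

# Mask the sensitive column. Because the masking is deterministic, the
# masked data still supports joins and deduplication in test environments.
masked = [{**row, "email": mask_value(row["email"])} for row in customers]
print(masked)
```

Deterministic masking like this is one common choice; randomized or format-preserving schemes trade referential integrity for stronger privacy.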
Data Architecture Advisor designing robust data management solutions. Involved in data integration, analytics, and architecture implementation for diverse business sectors.
Data Engineer with 6+ years' experience needed in Mississauga, ON. Must have strong Python and Spark skills for designing software solutions and data analysis.
Sr. IFRS17 Data Engineer contract role in Toronto (hybrid). Must have IFRS17 experience, insurance domain knowledge, and expertise in Python, PySpark, SQL, and data analysis.
Senior Software Engineer developing data-heavy services and data security solutions for Coinbase. Building and maintaining scalable data integration tools and self-service applications.
Senior Data Engineer role requiring Python & PySpark expertise. Design scalable data solutions, build data processing programs, and work with HBase/Hive datasets.
Geospatial Data Engineer building data processing systems for autonomous flight decisions alongside the CTO. Architecting the system's evolution from initial implementation to production-grade infrastructure.
Engineer building scalable data platforms for a telematics startup tackling complex backend challenges. Designing and optimizing the data platform for the API and managing large-scale data processing.
Engineer building scalable data platforms for commercial trucking API company. Designing and optimizing the data platform to tackle complex backend challenges.
Data Engineer at RAVL designing, building, and operating data pipelines for decision-making. Transforming raw data into reliable assets for analytics and machine learning.
Principal Data Engineer at RAVL designing secure and scalable data and AI platforms. Leading architectural guidance and building capabilities across the organization to enhance data operations.