Data Engineer role in Toronto requiring 2+ years experience with large datasets, Python optimization, and data integration. Hybrid work with 3 days on-site.
Exciting Opportunity: Data Engineer
Location: Toronto – Hybrid (3 days on site)

Requirements

Must-Have Skills:
1. 2+ years of solid experience working with large datasets and complex data transformations
2. 2+ years of experience with Python, with hands-on experience optimizing data-heavy workloads (demonstrated ability to improve the performance of existing code, e.g., pandas optimization, algorithmic improvements)
3. 1+ years of experience sourcing and integrating data from multiple systems or formats (e.g., files, databases, APIs)
4. Familiarity with best practices for writing clean, maintainable, and testable code

Nice-to-Have Skills:
1. Experience with data engineering frameworks or tools (e.g., PySpark, Dask, Airflow)
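To illustrate the kind of pandas optimization the must-have skills refer to, here is a minimal sketch (the `DataFrame` columns and the 10% discount are hypothetical, invented for the example): replacing a row-wise `apply` with vectorized column arithmetic, which is typically orders of magnitude faster on large datasets because the loop runs in optimized C rather than Python.

```python
import numpy as np
import pandas as pd

# Hypothetical sample data: one million order rows.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "price": rng.uniform(10, 100, 1_000_000),
    "qty": rng.integers(1, 10, 1_000_000),
})

def total_slow(frame: pd.DataFrame) -> pd.Series:
    # Row-wise apply: invokes a Python lambda for every row.
    return frame.apply(lambda row: row["price"] * row["qty"] * 0.9, axis=1)

def total_fast(frame: pd.DataFrame) -> pd.Series:
    # Vectorized arithmetic: whole columns at once, no per-row Python calls.
    return frame["price"] * frame["qty"] * 0.9
```

Both functions return the same values; the vectorized version simply avoids per-row interpreter overhead, which is the sort of "improving performance of existing code" the posting describes.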
Data Engineer building data integration pipelines for data lakes and warehouses. Collaborating with stakeholders to meet business requirements in a leading publishing company.
Google Cloud Data Engineer implementing data ingestion and analytics frameworks at Fueled. Specializing in Google Cloud Platform and modern data modeling.
Consulting Senior Data Architect specializing in Microsoft Fabric solutions for digital products. Engaging in hands-on delivery, architecture, and governance for data engineering in a remote capacity.
Data Engineer at Motive delivering data infrastructure for the AI era. Collaborating with stakeholders, building models, and implementing innovative tooling.
Data Architect designing and governing data foundations for analytics and AI applications at Clio. Collaborating cross-functionally to develop high-quality data models and standards.
IAM/Data Engineer role in Toronto (Hybrid). Requires 4+ years in ETL, data pipelines, cloud platforms, and skills in Windows IAM, Ansible, Terraform, SQL, Python/Java, Spark/Kafka.
Data Migration Specialist managing client data migrations to gaiia's platform. Collaborating with teams to ensure accurate and timely data transitions.
Senior Data Architect/Strategist at Robots & Pencils blending advanced data knowledge with problem solving to drive intelligent products and smarter business decisions.
Principal Data Architect at PointClickCare ensuring coherent and scalable data architecture. Driving unified data direction while collaborating with Engineering Architecture team for AI enablement.