Azure & Databricks Data Engineer role designing and building data pipelines using Microsoft tech stack. 11-month contract, hybrid work in Oshawa, $90-95/hr.
Responsibilities
The Azure and Databricks Data Engineer designs, builds, and supports data-driven applications that enable innovative, customer-centric digital experiences.
- Work as part of a cross-discipline agile team, collaborating to solve problems across business areas.
- Build reliable, supportable, and performant data lake and data warehouse products to support reporting, analytics, applications, and innovation.
- Apply best practices in development, security, accessibility, and design to deliver high-quality services.
- Develop modular, scalable ELT/ETL pipelines and data infrastructure leveraging diverse enterprise data sources.
- Create curated common data models in collaboration with Data Modelers and Data Architects to support business intelligence, reporting, and downstream systems.
- Clean, prepare, and optimize datasets with strong lineage and quality controls throughout the integration cycle.
- Support BI Analysts with dimensional modeling and aggregation optimization for visualization and reporting.
- Collaborate with Business Analysts, Data Scientists, Senior Data Engineers, Data Analysts, Solution Architects, and Data Modelers.
Requirements
- Completion of a four-year university program in computer science, engineering, or a related data discipline.
- Experience designing and building data pipelines, with strong Python, PySpark, SparkSQL, and SQL skills.
- Experience with Azure Data Factory, ADLS, Synapse, and Databricks, including building pipelines for data lakehouses and warehouses.
- Strong understanding of data structures, governance, and data quality principles.
- Effective communication skills for technical and non-technical audiences.
Data Engineer building data integration pipelines for data lakes and warehouses. Collaborating with stakeholders to meet business requirements in a leading publishing company.
Google Cloud Data Engineer implementing data ingestion and analytics frameworks at Fueled. Specializing in Google Cloud Platform and modern data modeling.
Consulting Senior Data Architect specializing in Microsoft Fabric solutions for digital products. Engage in hands-on delivery, architecture, and governance for data engineering in a remote capacity.
Data Engineer at Motive delivering data infrastructure for the AI era. Collaborating with stakeholders, building models, and implementing innovative tooling.
Data Architect designing and governing data foundations for analytics and AI applications at Clio. Collaborating cross-functionally to develop high-quality data models and standards.
IAM/Data Engineer role in Toronto (Hybrid). Requires 4+ years in ETL, data pipelines, cloud platforms, and skills in Windows IAM, Ansible, Terraform, SQL, Python/Java, Spark/Kafka.
Data Migration Specialist managing client data migrations to gaiia's platform. Collaborating with teams to ensure accurate and timely data transitions.
Senior Data Architect/Strategist at Robots & Pencils blending advanced data knowledge with problem-solving to drive intelligent products and smarter business decisions.
Principal Data Architect at PointClickCare ensuring coherent and scalable data architecture. Driving unified data direction while collaborating with Engineering Architecture team for AI enablement.