Senior Data Engineer designing cloud-native data pipelines for modern analytics and reporting. Collaborating with cross-functional teams to support operational workloads in a hybrid-friendly environment.
Responsibilities
Design, build, and maintain scalable data pipelines and architectures to support analytical and operational workloads.
Develop and optimize ETL/ELT pipelines, ensuring efficient data extraction, transformation, and loading from various sources.
Work closely with backend and platform engineers to integrate data pipelines into cloud-native applications.
Manage and optimize cloud data warehouses, primarily BigQuery, ensuring performance, scalability, and cost efficiency (a brief illustrative sketch follows this list).
Implement data governance, security, and privacy best practices, ensuring compliance with company policies and regulations.
Collaborate with analytics teams to define data models and enable self-service reporting and BI capabilities.
Develop and maintain data documentation, including data dictionaries, lineage tracking, and metadata management.
Monitor, troubleshoot, and optimize data pipelines, ensuring high availability and reliability.
Stay up to date with emerging data engineering technologies and best practices, continuously improving our data infrastructure.
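To give candidates a concrete feel for this kind of work, below is a minimal sketch of a cost-conscious batch load into a partitioned BigQuery table using the google-cloud-bigquery Python client. It is illustrative only: the project, dataset, bucket, and column names are hypothetical placeholders, not part of our actual stack.

    from google.cloud import bigquery

    # Hypothetical identifiers, for illustration only.
    TABLE_ID = "example-project.analytics.events"
    SOURCE_URI = "gs://example-bucket/events/*.json"

    client = bigquery.Client()

    # Partitioning by date keeps scans (and cost) bounded when queries
    # filter on event_date; clustering narrows them further.
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,
        time_partitioning=bigquery.TimePartitioning(field="event_date"),
        clustering_fields=["customer_id"],
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    load_job = client.load_table_from_uri(SOURCE_URI, TABLE_ID, job_config=job_config)
    load_job.result()  # Block until the load completes; raises on failure.
    print(f"Loaded {client.get_table(TABLE_ID).num_rows} rows into {TABLE_ID}")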
Requirements
5+ years of experience in data engineering, with expertise in building scalable data pipelines and cloud-native data architectures.
Strong proficiency in SQL for data modeling, transformation, and performance optimization.
Expertise in Python for data processing, automation, and pipeline development.
Experience with cloud data platforms, particularly Google Cloud Platform (GCP).
Hands-on experience with Google BigQuery, Cloud Storage, and Pub/Sub.
Strong knowledge of ETL/ELT frameworks such as dbt, Dataflow, or Apache Beam.
Familiarity with workflow orchestration tools like Dagster, Apache Airflow, or Google Cloud Workflows (see the sketch after this list).
Understanding of data privacy, security, and compliance best practices.
Strong problem-solving skills, with the ability to debug and optimize complex data workflows.
Excellent communication and collaboration skills.
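As a flavor of the orchestration experience we look for, here is a minimal Airflow DAG skeleton chaining extract, transform, and load steps. The DAG, task names, and callables are hypothetical; Dagster or Cloud Workflows would express the same idea differently.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    # Placeholder callables; a real pipeline would call extraction,
    # dbt/Beam transformation, and BigQuery load logic here.
    def extract_events():
        print("pulling raw events from source systems")

    def transform_events():
        print("applying transformations to staged data")

    def load_events():
        print("loading modeled tables into the warehouse")

    # `schedule` assumes Airflow 2.4+; older releases use `schedule_interval`.
    with DAG(
        dag_id="daily_events_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract", python_callable=extract_events)
        transform = PythonOperator(task_id="transform", python_callable=transform_events)
        load = PythonOperator(task_id="load", python_callable=load_events)

        extract >> transform >> load  # Run the three steps strictly in sequence.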
Benefits
Great projects: Broaden your skills on a range of engaging projects with international brands that have a global impact.
An inclusive and safe environment: We’re truly committed to building a culture where you are celebrated and everyone feels welcome and safe.
Learning opportunities: We offer generous training budgets, including partner tech certifications, custom learning plans, workshops, mentorship, and peer support.
Generous vacation policy: Work-life balance is key to our team’s success, so we offer flexible paid time off (PTO), allowing ample time away from work to promote overall well-being.
Customizable benefits: Tailor your extended health and dental plan to your needs, priorities, and preferences.
Flexible work arrangements: We work in a variety of ways, from fully remote to in-office to a blend of both.