Intermediate Data Engineer designing and building data pipelines for travel industry data management. Collaborating across teams to ensure reliable data for analytics and reporting.
Responsibilities
Design, develop, and maintain robust ETL/ELT pipelines to integrate data from multiple sources into a centralized cloud-based data platform
Build scalable data ingestion, transformation, and enrichment processes using Python, SQL, and PySpark
Optimize data workflows for performance, scalability, and cost efficiency in the cloud
Implement data quality and validation checks to ensure trust in reporting, analytics, and data-driven products
Collaborate with cross-functional teams to translate business requirements into technical data solutions
Support large-scale transformations using distributed processing frameworks
Troubleshoot and resolve issues in data pipelines, ensuring reliability and uptime
Participate in code reviews and contribute to engineering standards and best practices
Document data processes, pipelines, and schemas to improve transparency and reusability
Stay current with modern data engineering tools, practices, and cloud technologies, with a passion for continual learning and knowledge sharing
Build with stakeholders in mind, not just raw pipelines
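The data-quality responsibility above can be sketched as a simple validation gate in Python. The record schema, field names, and rules here are illustrative assumptions, not part of the role description:

```python
# Minimal sketch of a data-quality validation step for travel booking
# records. Schema and rules are hypothetical examples.

def validate_booking(record: dict) -> list[str]:
    """Return a list of data-quality violations for one booking record."""
    errors = []
    # Required fields must be present and non-empty.
    for field in ("booking_id", "origin", "destination"):
        if not record.get(field):
            errors.append(f"missing field: {field}")
    # Fares must be non-negative numbers.
    fare = record.get("fare")
    if not isinstance(fare, (int, float)) or fare < 0:
        errors.append("invalid fare")
    return errors

def run_quality_checks(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split records into (valid, rejected) so bad rows never reach reporting."""
    valid, rejected = [], []
    for record in records:
        issues = validate_booking(record)
        (rejected if issues else valid).append({**record, "issues": issues})
    return valid, rejected
```

In a production pipeline the same pattern would typically run inside the transformation layer, with rejected rows routed to a quarantine table for review rather than silently dropped.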
Requirements
3+ years of experience in data engineering, data development, or data management
Strong hands-on experience with Snowflake and modern data warehouse concepts (data lakes, lakehouse, streaming)
Proficiency in Python and SQL for building and optimizing data pipelines
Hands-on experience with AWS services such as S3, Glue, Lambda, Redshift, and data platforms such as Snowflake
Experience with ETL/ELT, data modeling, and data warehousing concepts
Experience with orchestration tools (Airflow, Dagster)
Hands-on experience with PySpark and distributed data processing frameworks (e.g., AWS EMR)
Knowledge of pipeline performance optimization and debugging
Strong problem-solving, analytical, and collaboration skills
Experience with version control (Git) and CI/CD workflows
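The Python-and-SQL pipeline skills listed above can be illustrated with a minimal ELT step. This is a sketch only: sqlite3 stands in for a cloud warehouse such as Snowflake or Redshift, and the table names and schema are hypothetical:

```python
# Minimal ELT sketch: land raw rows, then transform in SQL.
# sqlite3 is a stand-in for a cloud warehouse; schema is illustrative.
import sqlite3

def load_raw(conn: sqlite3.Connection, rows: list[tuple]) -> None:
    """Load step: land raw booking rows into a staging table."""
    conn.execute("CREATE TABLE IF NOT EXISTS stg_bookings (origin TEXT, fare REAL)")
    conn.executemany("INSERT INTO stg_bookings VALUES (?, ?)", rows)

def transform(conn: sqlite3.Connection) -> None:
    """Transform step: aggregate fares per origin into a reporting table."""
    conn.execute("DROP TABLE IF EXISTS rpt_fares_by_origin")
    conn.execute(
        """
        CREATE TABLE rpt_fares_by_origin AS
        SELECT origin, COUNT(*) AS bookings, SUM(fare) AS total_fare
        FROM stg_bookings
        GROUP BY origin
        """
    )

conn = sqlite3.connect(":memory:")
load_raw(conn, [("YYZ", 420.0), ("YYZ", 380.0), ("YVR", 510.0)])
transform(conn)
```

In practice each step would be a task in an orchestrator such as Airflow or Dagster, with the load and transform functions scheduled and retried independently.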
Data Engineer at Motive delivering data infrastructure for the AI era. Collaborating with stakeholders, building models, and implementing innovative tooling.
Data Architect designing and governing data foundations for analytics and AI applications at Clio. Collaborating cross-functionally to develop high-quality data models and standards.
IAM/Data Engineer role in Toronto (Hybrid). Requires 4+ years in ETL, data pipelines, cloud platforms, and skills in Windows IAM, Ansible, Terraform, SQL, Python/Java, Spark/Kafka.
Data Migration Specialist managing client data migrations to gaiia's platform. Collaborating with teams to ensure accurate and timely data transitions.
Senior Data Architect/Strategist at Robots & Pencils blending advanced data knowledge with problem solving to drive intelligent products and smarter business decisions.
Principal Data Architect at PointClickCare ensuring coherent and scalable data architecture. Driving unified data direction while collaborating with Engineering Architecture team for AI enablement.
Senior Data Engineer developing the data management layer for a financial brokerage platform with scalability for larger customers. Collaborating with teams in a fully remote, diverse environment.
Technical Lead overseeing data engineers, analysts, and architects to implement data solutions. Leading modernization of data infrastructures for diverse business objectives.
Data Engineer joining a consulting firm in Toronto with a world-class team of engineers. Producing high-quality data tools and pipelines while collaborating with leading companies.
Director of Data Engineering & AI Strategy driving Google Marketing Platform capabilities for global marketing partner Incubeta. Hands-on technical leadership at the intersection of ad tech and media.