Data Engineer building scalable ETL/ELT data pipelines using GCP and BigQuery. Collaborating across teams to ensure robust data infrastructure and analytics-ready datasets.
Responsibilities
Design, build, and maintain scalable and reliable batch and real-time ETL/ELT data pipelines using cloud services such as GCP Dataflow, Cloud Functions, Pub/Sub, and Cloud Composer.
Architect and implement robust data infrastructure capable of handling high-volume data ingestion and processing.
Develop and manage our central data warehouse in Google BigQuery.
Design and implement data models, schemas, and table structures optimized for performance, scalability, and long-term maintainability.
Write clean, efficient, and maintainable SQL and Python code to transform raw data into curated, analysis-ready datasets.
Build reliable transformation workflows that support analytics, reporting, and data science initiatives.
Monitor, troubleshoot, and optimize data infrastructure to ensure high performance, reliability, and cost efficiency.
Implement BigQuery best practices, including partitioning, clustering, query optimization, and materialized views.
Build and maintain curated data models that serve as the “source of truth” for business intelligence and reporting.
Ensure data is optimized and readily accessible for BI tools such as Looker and other analytics platforms.
Implement automated data quality checks, validation rules, and monitoring frameworks to ensure the integrity and reliability of data pipelines and warehouse systems.
Establish processes for data governance, observability, and lineage tracking.
Work closely with software engineers, data analysts, and data scientists to understand their data requirements and provide the necessary infrastructure and data products.
Lead and support client and stakeholder communication, working with enterprise clients to translate business needs into scalable data solutions.
Partner with product teams and leadership to ensure that technical data solutions align with business strategy and client expectations.
Take ownership of data platforms and architecture decisions, helping shape the future direction of our analytics and data infrastructure.
Identify opportunities to improve data reliability, automate workflows, and generate new insights through data.
Contribute to a collaborative, high-performing engineering culture with strong communication and teamwork.
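The automated data-quality checks and validation rules mentioned above can be sketched as a small rule set applied to incoming rows before they land in the warehouse. This is a minimal illustration only; the column names (order_id, amount) and the two rules are hypothetical, not part of the role's actual stack.

```python
def validate_rows(rows, required_fields=("order_id", "amount")):
    """Split rows into (valid, errors) using two example rules:
    required fields must be present, and amount must be non-negative."""
    valid, errors = [], []
    for i, row in enumerate(rows):
        missing = [f for f in required_fields if row.get(f) is None]
        if missing:
            errors.append((i, f"missing fields: {missing}"))
        elif row["amount"] < 0:
            errors.append((i, "negative amount"))
        else:
            valid.append(row)
    return valid, errors

rows = [
    {"order_id": 1, "amount": 19.99},
    {"order_id": 2, "amount": -5.00},    # fails the non-negative rule
    {"order_id": None, "amount": 3.50},  # fails the required-field rule
]
valid, errors = validate_rows(rows)
```

In practice the same pattern scales up through a framework such as Great Expectations or scheduled BigQuery assertion queries, with failures routed to monitoring rather than collected in a list.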
Requirements
5+ years of hands-on experience in data engineering, data integration, or data platform development.
Degree in Computer Science, Engineering, Mathematics, or a related STEM discipline.
Strong programming and query skills in SQL and Python.
Experience working with distributed version control systems such as Git in an Agile/Scrum environment.
Experience designing and orchestrating ETL pipelines, particularly with Databricks.
Experience working within cloud environments (GCP, AWS, or Azure).
Experience with database systems such as MongoDB and Elasticsearch.
Strong understanding of data warehousing and dimensional modeling methodologies.
Hands-on experience with Airflow and Hadoop.
Experience using Docker for containerized workflows and reproducible environments.
Ability to identify opportunities to improve data quality, reliability, and automation.
Strong business awareness and communication skills, with the ability to collaborate with both technical teams and business stakeholders.