About the role

  • Data Engineer focusing on building scalable data solutions with GCP and BigQuery for Fortune 500 companies. Join our team to architect data pipelines and support analytics initiatives.

Responsibilities

  • Design, build, and maintain scalable and reliable batch and real-time ETL/ELT data pipelines using GCP services like Dataflow, Cloud Functions, Pub/Sub, and Cloud Composer.
  • Develop and manage our central data warehouse in Google BigQuery. Implement data models, schemas, and table structures optimized for performance and scalability.
  • Write clean, efficient, and robust code (primarily in SQL and Python) to transform raw data into curated, analysis-ready datasets.
  • Monitor, troubleshoot, and optimize our data infrastructure for performance, reliability, and cost-effectiveness. Implement BigQuery best practices, including partitioning, clustering, and materialized views.
  • Build and maintain curated data models that serve as the "source of truth" for business intelligence and reporting, ensuring data is ready for consumption by BI tools like Looker.
  • Implement automated data quality checks, validation rules, and monitoring to ensure the accuracy and integrity of our data pipelines and warehouse.
  • Work closely with software engineers, data analysts, and data scientists to understand their data requirements and provide the necessary infrastructure and data products.
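As a minimal sketch of two of the responsibilities above — BigQuery partitioning/clustering and automated data quality checks — the following Python fragment is illustrative only: the table name, columns, and thresholds are hypothetical assumptions, not taken from this posting. The DDL is kept as a string so the example runs without a live BigQuery project.

```python
# BigQuery DDL illustrating the partitioning and clustering best practices
# mentioned above (hypothetical table and columns; stored as a string so
# this file runs without BigQuery credentials).
CREATE_EVENTS_TABLE = """
CREATE TABLE IF NOT EXISTS analytics.events (
    event_id STRING NOT NULL,
    user_id  STRING,
    event_ts TIMESTAMP NOT NULL
)
PARTITION BY DATE(event_ts)  -- prune scanned bytes (and cost) by date
CLUSTER BY user_id           -- co-locate rows for common filter columns
"""

def null_rate(rows: list[dict], column: str) -> float:
    """Fraction of rows where `column` is missing or NULL."""
    if not rows:
        return 0.0
    nulls = sum(1 for r in rows if r.get(column) is None)
    return nulls / len(rows)

def check_quality(rows: list[dict], column: str,
                  max_null_rate: float = 0.01) -> bool:
    """Automated validation rule: pass only if the NULL rate stays
    under a threshold, e.g. before publishing a curated dataset."""
    return null_rate(rows, column) <= max_null_rate
```

In practice a check like this would run as a pipeline step (e.g. an Airflow task) and fail the run on violation, rather than silently loading bad data.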

Requirements

  • 3-5+ years of hands-on experience in a Data Engineering, Software Engineering, or a similar role.
  • Strong proficiency in a programming language such as Python or Java for data processing and automation.
  • Mastery of SQL for complex data manipulation, DDL/DML operations, and query optimization.
  • Proven expertise in using BigQuery as a data warehouse, including data modeling, performance tuning, and cost management.
  • Hands-on experience building data pipelines using the GCP ecosystem (e.g., Dataflow, Pub/Sub, Cloud Storage, Cloud Composer/Airflow).
  • Deep understanding of ETL/ELT principles and data warehousing architecture (e.g., Star Schema, Data Lakes).
  • Strong problem-solving and troubleshooting skills with a focus on building scalable, maintainable, and automated systems.
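As a hedged illustration of the ETL/ELT principles listed above, here is a minimal "T" step in Python that turns raw staged records into a curated, analysis-ready dataset. All record and field names are hypothetical assumptions, not drawn from this posting.

```python
from datetime import datetime, timezone

# Hypothetical raw records as they might land in a staging layer.
RAW = [
    {"id": "1", "amount": "19.99", "ts": "2024-01-05T10:00:00"},
    {"id": "1", "amount": "19.99", "ts": "2024-01-05T10:00:00"},  # duplicate
    {"id": "2", "amount": "5.00",  "ts": "2024-01-06T12:30:00"},
]

def transform(raw: list[dict]) -> list[dict]:
    """ELT-style transform: deduplicate by id and cast string fields
    to typed, analysis-ready values."""
    seen: set[str] = set()
    curated = []
    for row in raw:
        if row["id"] in seen:
            continue  # keep the first occurrence only
        seen.add(row["id"])
        curated.append({
            "order_id": row["id"],
            "amount": float(row["amount"]),
            "order_ts": datetime.fromisoformat(row["ts"])
                                .replace(tzinfo=timezone.utc),
        })
    return curated
```

In a warehouse setting the same logic would typically live in SQL (e.g. a `SELECT DISTINCT` with casts materialized into a curated table); the Python version here just keeps the example self-contained.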

Benefits

  • Comprehensive Benefits: We cover 100% of health, dental, and vision insurance premiums for you and your dependents, so there are no out-of-pocket premium costs. Eligibility begins on day one.
  • Access extensive learning and development resources to keep leveling up your skills.

Job type

Full Time

Experience level

Mid level, Senior

Salary

CA$80,000 - CA$110,000 per year

Degree requirement

Bachelor's Degree

Tech skills

Airflow, BigQuery, Cloud, ETL, Google Cloud Platform, Java, Python, SQL

Location requirements

Hybrid, Toronto, Canada
