Data Engineer – Cloud / GCP

Posted via LinkedIn Recruiter (not a company profile)

Posted 2 weeks ago


About the role

  • Data Engineer role focusing on GCP to build scalable data pipelines for global analytics. Hybrid position in Toronto requiring bilingual English/Spanish skills.

Responsibilities

  • Build and maintain modern ELT/ETL data pipelines from scratch.
  • Integrate and orchestrate workflows using Airflow and cloud-native tools.
  • Apply SQL and Python to manipulate, clean, and validate large datasets.
  • Support CI/CD pipelines using GitHub, Bitbucket, and Terraform.
  • Conduct data quality checks and monitor pipeline performance.
  • Translate technical concepts for non-technical stakeholders.

Requirements

Must-Have Skills:

  • 3–4 years in ELT/ETL data pipelines.
  • 2–4 years with GCP and Airflow.
  • 3+ years with CI/CD pipelines and source control (GitHub, Bitbucket, Terraform).
  • 2–4 years in data modeling, SQL, and Python.
  • Bilingual English/Spanish (mandatory).

Nice-to-Have:

  • Power BI or other visualization tools.
  • Experience with DevOps or Agile/Scrum teams.
  • Banking or financial institution (FI) sector experience.

Benefits

  • Work at the center of a data modernization journey for a global financial program.
  • Make a real impact by shaping data-driven insights across multiple geographies.
  • Hybrid work environment with flexible collaboration.

Job type

Contractor

Experience level

Mid level

Salary

Not specified

Degree requirement

No Education Requirement

Tech skills

Google Cloud Platform, Airflow, SQL, Python, GitHub, Bitbucket, Terraform, Power BI

Location requirements

Toronto, Ontario (LinkedIn Recruiter post)
