About the role

  • Data Engineer at The Fedcap Group, responsible for architecting and leading enterprise data warehouse solutions that enable scalable growth and operational excellence across the organization.

Responsibilities

  • Deliver reliable, analytics-ready data models
  • Build secure and compliant data infrastructure
  • Lead development of dbt transformation workflows
  • Ensure performance and cost optimization
  • Enable end-to-end data pipelines
  • Directly support business and analytics teams
  • Collaborate with the Head of Data and Analytics to implement the enterprise Medallion Architecture (Bronze → Silver → Gold)
  • Design, build, and maintain data ingestion pipelines in Azure Data Factory (ADF)
  • Configure and manage secure integrations between Azure and Snowflake
  • Develop and optimize Snowflake data models
  • Implement role-based access control (RBAC), data masking, and row/column-level security in Snowflake
  • Build and maintain a modular dbt framework
  • Create and manage CI/CD pipelines for dbt using GitHub Actions or Azure DevOps
  • Write and optimize complex SQL and Python scripts
  • Implement data validation, quality checks, and monitoring frameworks
  • Collaborate directly with BI, Analytics, and Data Science teams
  • Take end-to-end ownership of assigned data engineering projects: requirements → design → build → deploy → support
  • Document pipelines, transformations, and models to ensure reproducibility and team-wide adoption
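The validation and quality-check responsibilities above can be sketched as a minimal rule-based check in Python. This is an illustrative sketch only: the field names, rules, and record shapes are assumptions, not details from the posting, and a production pipeline would more likely express such checks as dbt tests or a monitoring framework.

```python
# Minimal sketch of a rule-based data validation step, as might run
# between the Bronze and Silver layers of a Medallion-style pipeline.
# All field names and rules below are illustrative assumptions.

def validate_records(records, rules):
    """Split records into passing and failing sets.

    records: list of dicts (one per row).
    rules:   dict mapping field name -> predicate that must hold.
    Returns (passed, failed), where failed pairs each bad record
    with the list of fields that violated a rule or were missing.
    """
    passed, failed = [], []
    for rec in records:
        errors = [field for field, check in rules.items()
                  if field not in rec or not check(rec[field])]
        if errors:
            failed.append((rec, errors))
        else:
            passed.append(rec)
    return passed, failed


if __name__ == "__main__":
    raw = [
        {"id": 1, "amount": 120.0, "region": "CA"},
        {"id": 2, "amount": -5.0, "region": "CA"},  # negative amount
        {"id": 3, "amount": 40.0},                  # missing region
    ]
    rules = {
        "id": lambda v: isinstance(v, int) and v > 0,
        "amount": lambda v: v >= 0,
        "region": lambda v: isinstance(v, str) and len(v) == 2,
    }
    passed, failed = validate_records(raw, rules)
    print(len(passed), len(failed))  # → 1 2
```

Failing records carry the names of the violated fields, so the same structure can feed a quarantine table or an alerting hook in a monitoring framework.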

Requirements

  • Bachelor’s degree in Information Systems, Computer Science, Engineering, or a related field.
  • An advanced degree in a related field is a plus; however, hands-on experience is strongly preferred.
  • Snowflake SnowPro Advanced: Data Engineer or Architect certification (preferred).
  • 5+ years of proven experience in data engineering roles.
  • Deep expertise in enterprise system implementations, data lifecycle management, modular frameworks, and data platform architecture.
  • Strong hands-on experience with dbt, Azure, and Snowflake is a must.
  • Demonstrated ability to design and implement scalable, secure, and modular data pipelines.
  • Experience with data quality frameworks, lineage, and governance practices.
  • Track record of delivering end-to-end data solutions in cloud environments.

Benefits

  • Professional development opportunities
  • Flexible work arrangements

Job type

Full Time

Experience level

Senior

Salary

Not specified

Degree requirement

Bachelor's Degree

Tech skills

Azure, Cloud, Python, SQL

Location requirements

Remote (Canada)
