About the role

  • Data Architect at ShyftLabs leading the design of a Customer Data Mart supporting AI solutions for Fortune 500 clients. Collaborating with teams to implement scalable, secure, and modern data architectures.

Responsibilities

  • Own the technical vision and architecture for the Unified Customer Data Mart, ensuring solutions are scalable, secure, compliant, and aligned with enterprise standards.
  • Design and implement end-to-end data pipeline architectures, including raw data ingestion (Bronze), data cleaning and standardization (Silver), and curated data marts (Gold) that serve CDP, reporting, and activation use cases.
  • Define and evolve data modeling standards for customer data, including customer dimensions, transaction facts, engagement events, web behavior, support interactions, and loyalty activity.
  • Decompose complex business requirements into structured technical solutions and drive alignment with client stakeholders.
  • Formulate, compare, and present multiple architectural approaches for data ingestion, transformation, identity resolution, and consumption patterns, guiding clients and internal teams toward optimal long-term solutions that balance speed, maintainability, and scalability.
  • Architect and build production-grade data pipelines using DBT and Airflow that support customer analytics, segmentation, and reporting at scale.
  • Partner directly with client stakeholders to understand business objectives, translate customer journey requirements into robust technical designs, and act as a trusted technical advisor on data architecture decisions.
  • Lead and mentor cross-functional teams, including Analytics Engineers, Data Engineers, and BI developers, setting a high bar for technical quality, code review standards, and documentation practices.
  • Influence and contribute to data governance initiatives, including PII handling, data quality frameworks, identity resolution strategies, and platform reliability.

Requirements

  • Deep expertise in SQL and Python, with demonstrated ability to design, optimize, and troubleshoot complex distributed data systems.
  • 5+ years of experience in data engineering and/or data architecture, with a proven track record of building and scaling enterprise-level data platforms.
  • Extensive experience designing and implementing data lakes, cloud data warehouses, and modern analytics architectures in production environments.
  • Hands-on experience with DBT for transformations and modular data modeling.
  • Hands-on experience with Google BigQuery (mandatory) or equivalent cloud warehouses (Snowflake, Databricks).
  • Hands-on experience with Airflow (or similar orchestration frameworks).
  • Proven experience implementing medallion or layered data architectures, including raw ingestion, conformed layers, and curated marts.
  • Strong foundation in dimensional modeling, star/snowflake schemas, conformed dimensions, and designing for both analytical and operational use cases.
  • Experience with Customer Data Platforms (CDPs) and multi-channel customer data integration, including identity resolution (deterministic and probabilistic matching).
  • Experience designing for security and compliance, including PII masking, access controls, row-level security (RLS) policies, encryption, and privacy regulations (GDPR/CCPA).
  • Strong understanding of cloud architecture principles, including storage optimization, cost management, security patterns, and scalability in GCP environments.
  • Demonstrated ability to operate independently with full architectural ownership while influencing senior stakeholders in client-facing environments.
  • Experience leading and mentoring engineers, setting architectural standards, and driving technical governance.

Benefits

  • Comprehensive Benefits: 100% coverage for health, dental, and vision insurance for you and your dependents from day one.
  • Hybrid Flexibility: 4 days per week in our downtown Toronto office.
  • Continuous learning opportunities and influence over technical direction.
  • Shape applied research and AI strategy in a fast-growing, product-focused data company.

Job title

Data Architect

Job type

Full Time

Experience level

Mid level, Senior

Salary

CA$120,000 - CA$160,000 per year

Degree requirement

Bachelor's Degree

Tech skills

Airflow, BigQuery, Cloud, Google Cloud Platform, Python, SQL

Location requirements

Hybrid, Toronto, Canada
