About the role

  • Solution Architect designing secure data architectures built on Snowflake and GCP for modernization initiatives at Apply Digital. Leading large-scale transformations while collaborating with enterprise clients and cross-functional teams.

Responsibilities

  • Design and architect modern, scalable, and composable data platforms across multi-cloud environments (AWS, GCP, Azure), supporting real-time analytics, AI/ML, and customer experience use cases.
  • Assess client data ecosystems and create strategic modernization roadmaps, from on-prem-to-cloud migrations to enabling unified customer data and activation.
  • Create and enforce architectural standards, governance models, and reference architectures across data lakehouse, mesh, and medallion patterns.
  • Evaluate and recommend data storage and compute technologies (e.g., BigQuery, Snowflake, Redshift, Delta Lake, Apache Iceberg) based on performance, scalability, cost, and security trade-offs.
  • Design robust, scalable ETL/ELT pipelines using both batch and streaming tools (e.g., dbt, Airflow, Kafka, Kinesis, Pub/Sub), and optimize data flows for speed and reliability.
  • Define and enforce data governance frameworks, including RBAC/ABAC models, lineage tracking, metadata management, and compliance (GDPR, CCPA, SOC2).
  • Develop architectural guidance for AI/ML data pipelines, including the design of feature stores, real-time inference pipelines, and MLOps alignment.
  • Recommend tools and practices for data cataloging, quality, observability, and security, leveraging platforms such as Collibra, DataHub, or Apache Atlas.
  • Provide strategic advisory to business and technical stakeholders, aligning data architecture decisions with broader digital transformation and ROI goals.
  • Create clear, client-facing deliverables: reference architectures, integration strategies, migration plans, and governance models.

Requirements

  • 10+ years of experience in data architecture, platform engineering, or solution architecture within enterprise or consulting environments.
  • Proven experience architecting and implementing large-scale data modernization programs using Snowflake (required) and/or GCP, AWS, Azure.
  • Snowflake SnowPro Advanced Architect certification (GCP Professional Data Engineer or Architect certification is a nice to have).
  • Expertise in data modeling (dimensional, Data Vault), schema evolution, and building systems to handle both structured and unstructured data.
  • Strong understanding of data lake, data warehouse, lakehouse, and data mesh architectures, and when to apply each.
  • Experience designing high-performance, scalable pipelines using dbt, Apache Airflow, Kafka, Kinesis, Pub/Sub, or similar tools.
  • Knowledge of Terraform for Infrastructure as Code (IaC) in data environments.
  • Experience integrating CDPs and MarTech/AdTech platforms for real-time personalization.
  • A consultative mindset — you’re comfortable assessing ambiguous environments, providing strategic guidance, and influencing stakeholders toward long-term, scalable solutions.

Benefits

  • Great projects: Broaden your skills on a range of engaging projects with international brands that have a global impact.
  • An inclusive and safe environment: We’re truly committed to building a culture where you are celebrated and everyone feels welcome and safe.
  • Learning opportunities: We offer generous training budgets, including partner tech certifications, custom learning plans, workshops, mentorship, and peer support.
  • Generous vacation policy: Work-life balance is key to our team’s success, so we offer flexible personal time off (PTO), allowing ample time away from work to promote overall well-being.
  • Customizable benefits: Tailor your extended health and dental plan to your needs, priorities, and preferences.
  • Flexible work arrangements: We work in a variety of ways, from remote, to in-office, to a blend of both.

Job type

Full Time

Experience level

Senior, Lead

Salary

Not specified

Degree requirement

Bachelor's Degree

Tech skills

AirflowAmazon RedshiftApacheAWSAzureBigQueryCloudETLGoogle Cloud PlatformKafkaTerraformVault

Location requirements

Hybrid (Toronto, Canada)