
About the role

  • Senior Data Solution Architect designing and implementing data architecture on GCP for a warehouse intelligence platform at Fulfillment IQ.

Responsibilities

  • Design and implement the end-to-end data architecture for a multi-site warehouse intelligence platform on GCP
  • Develop a dual-layer data strategy, including analytics and real-time operational data layers
  • Design and implement CDC pipelines using Fivetran, Debezium, or Oracle GoldenGate
  • Develop the real-time operational data layer using Apache Flink or GCP Dataflow
  • Design integration patterns between the platform, Blue Yonder WMS, MuleSoft middleware, and downstream analytics consumers
  • Develop data pipelines that scale to production volumes across 50+ sites
  • Collaborate with the BI team to configure Polaris catalog and Iceberg table partitioning strategy
  • Establish data quality, lineage, and observability standards across all pipelines
  • Participate in architecture reviews and provide technical leadership on data-related decisions

Requirements

  • 8+ years of experience in data architecture or data engineering, with at least 3 years in a solution architect capacity
  • 3+ years of experience with Snowflake, including data engineering, data modeling, and data warehousing
  • Deep GCP experience, including BigQuery, Dataflow, Pub/Sub, Cloud Storage, Cloud SQL, and Cloud Spanner
  • Hands-on experience with Apache Iceberg and CDC expertise
  • Experience with streaming architecture, including Apache Flink, GCP Dataflow, or Apache Kafka Streams
  • SQL mastery and experience with Oracle databases
  • Strong understanding of the supply chain and logistics domains
  • Strong communication and collaboration skills
  • Experience with Apache Kafka, Blue Yonder, MuleSoft integration patterns, and multi-tenant/multi-site data architectures (preferred)
  • Familiarity with GenAI/LLM architectures and their data requirements (preferred)
  • Experience with MDM tools or patterns (preferred)
  • GCP Professional Data Engineer or equivalent certification (preferred)

Benefits

  • Comprehensive health and dental coverage for you and your family (region-specific plans)
  • Employee wellness programs, where applicable
  • Competitive paid time off (PTO), sick leave, and public holidays
  • Flexible leave policies that respect local labor standards
  • Retirement savings programs and employer contributions, with region-specific plans (e.g., CPP and supplementary plans in Canada)
  • Dedicated learning and development budget
  • Support for skills development, leadership growth, and career progression
  • Remote and hybrid work options
  • Flexible working hours aligned to role and client needs
  • Equipment and workstation allowances
  • Internet and business travel reimbursements
  • Employee stock options (ESOP), where applicable
  • Team events, meetups, and company offsites

Job type

Contract

Experience level

Senior

Salary

CA$120 - CA$145 per hour

Degree requirement

No Education Requirement

Tech skills

Apache, BigQuery, Cloud, Google Cloud Platform, Kafka, Oracle, SQL

Location requirements

Hybrid (Toronto, Canada)
