Data Engineer

Posted 23 hours ago


About the role

  • Data Engineer at Motive delivering data infrastructure for the AI era: collaborating with business stakeholders, building data models, and implementing innovative tooling.

Responsibilities

  • Collaborate & Strategize: Partner closely with business stakeholders to understand their challenges and design end-to-end architecture that solves complex business problems.
  • Build & Maintain Data Models: Design, develop, and own robust, efficient, and scalable data models in Snowflake and Iceberg using dbt and advanced SQL.
  • Orchestrate & Automate: Build and manage reliable data pipelines and CI/CD workflows using tools like Airflow, Python, and Terraform to ensure data is fresh, trustworthy, and infrastructure is version-controlled.
  • Champion Data Quality: Implement rigorous testing, documentation, and data governance practices to maintain a single source of truth.
  • Enable Analytics & Workflows: Act as the Product Owner and Tech Lead for your data domains, taking responsibility for end-to-end data product delivery, from raw ingestion to data models enabling analytics and data apps in tools like Tableau and Retool.
  • Innovate with AI: Help us build our next-generation data infrastructure by integrating AI capabilities (like Snowflake Cortex AI) to democratize analytics and empower the business.
  • Architect Observability: Implement monitoring and alerting frameworks (e.g., dbt packages or Monte Carlo monitors) to proactively catch "silent" data failures before stakeholders do.

Requirements

  • 6+ years of experience in Analytics Engineering, Data Engineering, or a similar role.
  • Deep expertise in SQL and developing complex data models for analytical purposes (e.g., dimensional modeling).
  • Hands-on experience with:
      • Data Warehousing: High proficiency in Snowflake (preferred) and experience with Open Table Formats like Iceberg.
      • Data Transformation: dbt
      • Orchestration & ETL: Airflow, Fivetran, Airbyte
      • Cloud Platform: AWS
      • Programming/Ingestion: Python
      • Infrastructure as Code: Terraform
  • AI-Augmented Development: Proficiency using AI coding assistants (Cursor, Copilot, or Claude) to accelerate development and automate routine tasks.
  • A strong analytical mindset with a proven ability to solve ambiguous business problems with data.
  • Excellent communication skills and experience working cross-functionally.
  • Self-starter with the ability to plan and manage your own work.
  • A user focus, with the ability to understand how data consumers will use the data products you build.

Benefits

  • Health, pharmacy, optical and dental care benefits
  • Paid time off
  • Sick time off
  • Short term and long term disability coverage
  • Life insurance
  • 401k contribution

Job type

Full Time

Experience level

Mid level, Senior

Salary

CA$133,000 - CA$182,000 per year

Degree requirement

Bachelor's Degree

Tech skills

Airflow, AWS, Cloud, ETL, Python, SQL, Tableau, Terraform

Location requirements

Remote (Canada)
