About the role

  • Data Engineer owning the full data stack for Practice Better, a health and wellness platform. You'll collaborate with teams across the company to ensure reliable data and develop AI-assisted workflows.

Responsibilities

  • Own and evolve the full data infrastructure: Snowflake, dbt, Stitch, Rivery, and pipeline orchestration
  • Build and maintain ELT/ETL pipelines and dbt models supporting analytics, reporting, and the AI warehouse agent
  • Manage Snowflake for performance, cost, and reliability
  • Build and maintain integrations between production systems — Stripe, Square, HubSpot, and others — and the data warehouse
  • Implement data quality monitoring, testing frameworks, and alerting so problems surface before they reach stakeholders
  • Trace anomalies to root causes and ensure leadership is making decisions on reliable data
  • Establish and maintain data governance standards: documentation, access controls, and metric definitions
  • Partner with Product and Engineering to instrument event tracking and ensure complete data capture across the customer journey
  • Design and evolve AI-assisted data workflows, including our internal warehouse agent, ensuring they’re powered by clean, well-modeled, trustworthy data
  • Bring strong intuition for AI failure modes and how to constrain systems for reliability
  • Enable stakeholders to self-serve through AI-assisted tooling; your work is what makes that possible
  • Partner with the Head of RevOps on data model integrity, executive reporting, and prioritizing what gets built or fixed first
  • Work closely with the analytics team to understand how data is consumed across Sigma, Amplitude, Hightouch, and the AI warehouse agent, so infrastructure decisions account for downstream impact
  • Partner with GTM stakeholders across Growth, Customer Success, Marketing, and Payments to understand data dependencies, surface issues early, and translate business needs into technical requirements

Requirements

  • 5+ years as a data engineer, analytics engineer, or backend engineer with a data focus
  • Experience as the primary or sole data engineer on a small, high-growth team (you’ve owned a full stack end-to-end, not just a slice of one)
  • Strong SQL and hands-on experience with dbt, Snowflake, Stitch, and Rivery
  • Python proficiency for pipeline development and automation
  • Experience building and optimizing ELT/ETL pipelines at scale, with strong understanding of data warehouse architecture and dimensional modeling
  • A pragmatic approach to AI — focused on outcomes, not demos. You’ve built or worked with AI-assisted data workflows in production, understand their failure modes, and know how to constrain them for reliability
  • Comfort working with large, imperfect datasets. You diagnose data quality issues independently and don’t wait for someone to hand you a spec
  • Experience with version control (Git), CI/CD practices, and infrastructure as code
  • Clear, proactive written communication. You document your work so the next person isn’t starting from scratch.

Benefits

  • Comprehensive health and dental benefits from day 1
  • RRSP matching
  • Generous paid parental leave
  • Annual learning stipends
  • Unlimited vacation
  • $750 annual Health & Wellness Allowance
  • $1,000 annual Learning & Development Allowance to support your growth
  • $500 annual Home Office Allowance to set up a productive remote workspace
  • Personalized support for family-building and fertility journeys
  • Confidential, digital mental health support from licensed professionals
  • Company-wide holiday closure in December
  • Regular virtual company-wide events, lunches, and team socials to stay connected

Job type

Full Time

Experience level

Mid level, Senior

Salary

CA$135,000 - CA$145,000 per year

Degree requirement

Bachelor's Degree

Tech skills

ETL, Python, SQL

Location requirements

Remote (Canada)
