Senior Data Platform Engineer

Posted last week


About the role

  • As a Senior Data Platform Engineer, you will optimize data processes for a Montreal IT consulting firm, with a focus on governance, ingestion pipelines, and scalable data management architecture.

Responsibilities

  • Deliver curated, reusable datasets for BI, analytics, and ML.
  • Build and run ingestion, transformation, and serving pipelines (batch and streaming).
  • Implement governance: ownership, PII handling, RBAC, retention, audit and compliance.
  • Set up and monitor data quality and observability (checks, alerts, incident handling).
  • Design and own scalable data platform architecture (lake/lakehouse/warehouse).
  • Define and track dataset SLOs/SLAs (freshness, completeness, latency).
  • Define and enforce data standards: modeling, layering, naming, metadata, contracts.

Requirements

  • 6+ years in data architecture, data engineering, and DataOps (production environments)
  • Strong SQL experience
  • Strong Python experience
  • Experience building and maintaining production data pipelines
  • Experience designing and operating cloud data platforms (AWS and/or Azure or equivalent)
  • Experience with data lakes, lakehouses, and/or data warehouses
  • Experience with data modeling and layering
  • Experience with data contracts and schema management
  • Experience with data governance (ownership, PII handling, RBAC, retention, auditability)
  • Experience with data quality controls and observability
  • Experience with metadata and data lineage
  • Experience with CI/CD practices for data systems
  • Experience with infrastructure as code (e.g., Terraform or equivalent)
  • Experience with batch and streaming data processing
  • Experience with at least one orchestration tool (Airflow, Dagster, Prefect, or similar)
  • Experience with at least one transformation tool (dbt or equivalent)
  • Experience with at least one streaming system (Kafka, Kinesis, Event Hubs, Flink, or Spark Streaming)
  • Experience with Lakehouse table formats (Delta, Iceberg, or Hudi)
  • Experience with catalog and governance tools (Purview, Collibra, DataHub, Unity Catalog, or Alation)
  • Experience with BI tools (Power BI, Tableau, or Looker)
  • Experience with data quality/observability tools (Great Expectations, Soda, or equivalent)
  • Familiarity with Kubernetes and containerized workloads

Benefits

  • Flexible work arrangements
  • Professional development opportunities

Job type

Full Time

Experience level

Senior

Salary

Not specified

Degree requirement

Bachelor's Degree

Tech skills

Airflow, AWS, Azure, Cloud, Kafka, Kubernetes, Python, Spark, SQL, Tableau, Terraform, Unity Catalog

Location requirements

Hybrid (Montreal, Canada)
