About the role

  • Senior Databricks Engineer responsible for designing and implementing modern data platforms on Azure Databricks, working closely with client teams to migrate and optimize data workloads while ensuring best practices.

Responsibilities

  • Lead the implementation of modern data platforms and architectures on Databricks, including ETL, workload migrations, and Unity Catalog.
  • Design and implement data pipelines using Azure Databricks (PySpark, SQL).
  • Build and optimize batch and streaming data workloads.
  • Migrate legacy data workloads to Azure and Databricks.
  • Implement Delta Lake patterns (medallion architecture, CDC, data quality).
  • Integrate Databricks with Azure services such as ADLS, ADF, Azure Key Vault, and DevOps/GitHub.
  • Optimize performance and cost (cluster sizing, job orchestration, query tuning).
  • Collaborate with solution architects, analytics engineers, and client stakeholders.
  • Support client enablement through knowledge transfer and documentation.
  • Provide hands-on solution delivery, guiding and working closely with client engineers to ensure best practices.
  • Implement governance models in Unity Catalog, including data access, lineage, and security frameworks.
  • Evaluate and prioritize high-value AI and ML use cases, embedding generative AI into client strategies.
  • Act as a thought leader by contributing to client workshops, executive roundtables, and industry discussions.

Requirements

  • 5–7+ years of experience in data engineering.
  • 3+ years of hands-on experience with Databricks, including advanced features (Delta Lake, Unity Catalog, MLflow).
  • Proven experience leading large-scale data migrations (ETL, workloads, cloud platforms).
  • Strong expertise in Azure environments.
  • Multi-cloud experience is an asset.
  • Experience working with Azure data services (ADLS, ADF, Synapse, etc.).
  • Solid understanding of modern data architectures (lakehouse, medallion, ELT/ETL).
  • Experience with CI/CD for data workloads.
  • Databricks or cloud certifications required.

Benefits

  • Health insurance
  • Retirement plans
  • Paid time off
  • Flexible working arrangements
  • Professional development

Job type

Full Time

Experience level

Mid level, Senior

Salary

Not specified

Degree requirement

Bachelor's Degree

Tech skills

Azure, Cloud, ETL, PySpark, SQL, Unity, Vault

Location requirements

Remote (Canada)
