About the role

  • DevOps Engineer automating delivery pipelines for Journey Capital, a leading online lender in Canada. Focus on CI/CD, cloud infrastructure, and enhancing engineering productivity through automation.

Responsibilities

  • Accelerate Automation & Delivery Velocity: Replace manual/semi-automated steps with end-to-end automation across build, test, deploy, environment provisioning, and operational runbooks.
  • Identify SDLC bottlenecks with DEV, QA, Salesforce and data teams; deliver automation that measurably improves cycle time, deployment frequency, and change failure rate.
  • Establish reusable patterns (Jenkins shared libraries, IaC modules, testing harnesses) to scale automation across teams.
  • CI/CD Modernization & Orchestration: Enhance Jenkins/Bitbucket (& Gearset for Salesforce) pipelines to support multi-language builds (Java, JS/TS, Python, Apex), artifact flow through Nexus, and promotion across dev/UAT/prod with automated rollback and environment parity checks.
  • Integrate automated test suites (unit, integration, e2e, contract) as quality gates; surface signal early and reduce reliance on manual testing.
  • Implement release strategies (blue/green, canary, rolling) for EC2/ECS/EKS workloads.
  • AWS Infrastructure Automation: Provision and govern AWS resources (VPC, EC2, ECS, EKS, ELB, S3, RDS [Postgres/MySQL], IAM, CloudWatch/CloudTrail, KMS, SSM) with Terraform/CloudFormation and PR-based workflows to eliminate configuration drift.
  • Optimize capacity, resilience, and cost; automate backups, DR, and security baselines.
  • Identity, Security & Compliance: Integrate Keycloak (OIDC/SAML) with apps and services; automate client/realm config and secret lifecycles.
  • Shift-left security with SAST/SCA/secret scanning, container image scanning, and signed SBOMs embedded in CI/CD.
  • Observability & SDLC Operational Excellence: Standardize telemetry (metrics, logs, traces), define SLOs/SLIs, automate alerting, and codify incident response and postmortems.
  • Automate environment resets, data refreshes, smoke tests, release readiness, and config promotion.
  • Data, Analytics & AI Alignment: Orchestrate and integrate with data pipelines so applications, data transformations, and analytics refreshes release coherently. Partner with data engineering to automate pipeline deployments (batch/near-real-time), schema migration flows, and secure data access to RDS (Postgres/MySQL) and S3. Coordinate Tableau Cloud extract/refresh automation as part of release trains.
  • Integrated Repositories: Define versioning and promotion standards for code, infrastructure, data schemas, and ML assets (e.g., model artifacts), ensuring traceability from commit → build → deploy.
  • MLOps & AI Engineering: Work with AI/ML teams to productionize models with CI/CD for training/packaging/serving, model registry and approval workflows, and safe rollout/rollback of model versions. Enable feature/config management and secrets/IAM for data and model services, and integrate model-specific monitoring (data quality, drift, performance). Support serving patterns that embed models into microservices or batch scoring jobs on AWS (tooling such as MLflow/SageMaker or equivalents, as appropriate).
  • QA Automation Engineering Alignment: Partner closely with QA Automation Engineering to integrate automated UI, API, and end-to-end test suites into CI/CD, ensuring consistent gating, fast feedback loops, and reduced reliance on manual validation. Contribute to test environment reliability, improve test data automation, and make QA automation a first-class, scalable component of the delivery pipeline.

Requirements

  • Proven track record turning manual or semi-automated delivery into fully automated pipelines that lift throughput and reliability.
  • Deep Jenkins experience (declarative pipelines, shared libs, multi-stage) and Nexus artifact governance; strong Git workflows and release orchestration.
  • Hands-on with AWS (EC2, ECS, EKS, ELB, S3, RDS, IAM) and networking; skilled with Terraform (preferred) or CloudFormation; container best practices (multi-stage Dockerfiles, image hardening, RBAC, ingress/autoscaling).
  • Comfortable embedding automated test suites (Java, JS/TS, Python) as pipeline gates; experience with security scanning and SBOMs; strong telemetry and actionable alerting.
  • Experience partnering with data/analytics teams to automate data pipelines and integrate data refreshes into application releases.
  • Familiarity with MLOps concepts: model packaging, registries, approval workflows, automated promotion, monitoring for drift and data quality, and secure model serving.
  • Practical knowledge of Postgres/MySQL operations and secure data access on AWS; awareness of Tableau Cloud deployment/refresh patterns.
  • Bias to automate and remove toil; pragmatic, security-first, and documentation-driven. A strong communicator who partners effectively with developers, QA, and engineering teams, providing clarity, documentation, and guidance while driving engineering excellence across the ecosystem.

Benefits

  • Competitive compensation
  • Flexible work schedule
  • Remote or in-office work
  • Personalized benefits program
  • $1,500 for professional training and classes
  • Free English or French tutoring classes
  • Free gym access
  • Free coffee & snacks
  • Regular events & team building activities

Job type

Full Time

Experience level

Mid level, Senior

Salary

Not specified

Degree requirement

No Education Requirement

Tech skills

AWS, Cloud, EC2, Java, JavaScript, Jenkins, Microservices, MySQL, Postgres, Python, Realm, SDLC, Tableau, Terraform, TypeScript

Location requirements

Hybrid, Montreal, Canada
