About the role

  • Data Engineer architecting and building cloud-based systems for Semios Group, an agricultural technology company. Responsibilities include managing data interfaces, scalable infrastructure, and delivering actionable insights.

Responsibilities

  • Architect and build cloud-based systems to manage and improve the interface between Semios data and its consumers.
  • Design, develop and maintain scalable infrastructure to process and store data, integrate data-driven models and automate manual processes.
  • Implement highly scalable big data analytics systems in a cloud environment.
  • Design and build reliable, monitorable and fault-tolerant data systems & data processes.
  • Create data tools for analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Continuously identify bottlenecks in the data stack and optimize queries and processes for cost and performance.
  • Write clear documentation of data processes and products.

Requirements

  • Advanced SQL skills: the ability to write elegant queries, written for humans first and machines second.
  • The ability to thrive both autonomously and in a team environment.
  • Hands-on experience with provisioning and developing on cloud platforms (familiarity with GCP is a definite plus).
  • Experience with at least one data warehouse (BigQuery, Redshift, Snowflake, or on-premises).
  • Excellent verbal and written communication skills: a talent for distilling complex ideas for different audiences.
  • In-depth experience with Big Data and a proven track record of effective collection, storage, and access.
  • Proven experience with workflow and scheduling tools (e.g., Prefect, Airflow, Dagster, Kubeflow) and version control (Git).
  • Fluency in Python, Node.js, or another imperative language, or the ability to learn one quickly and with enthusiasm.
  • Excellent troubleshooting skills to rapidly identify and resolve issues.
  • Nice to have:
  • Significant exposure to at least one relational database (Postgres, MySQL).
  • Real world experience with containers (Docker) & container management systems (Kubernetes).
  • Experience with or interest in working with IoT Cloud and IoT data.
  • Familiarity with data transformation tools (dbt, SQLMesh, Dataform) and syncing tools (e.g., dlt, Fivetran, Airbyte).
  • Experience enabling Machine Learning workflows (MLOps).
  • Advanced education in Big Data and/or Data Engineering, whether from academia or certifications.

Benefits

  • Purposeful Work: Make a global impact by advancing sustainable food production.
  • Our People: Work with a fun, collaborative, and supportive team.
  • Recharge: Generous vacation policy, company-paid holidays and year-end winter break.
  • Work Flexibility: Hybrid working arrangements and strong work-life balance culture.
  • Prioritize Your Well-Being: Access comprehensive health plans designed to support your physical and mental health.
  • Group RRSP: Includes a 3% company-paid match after one year of employment.
  • Convenient Location: Office is easily reached via transit and bike paths.

Job type

Full Time

Experience level

Mid level, Senior

Salary

CA$98,000 - CA$125,000 per year

Degree requirement

Bachelor's Degree

Tech skills

Airflow, Amazon Redshift, BigQuery, Cloud, Docker, Google Cloud Platform, IoT, Kubernetes, MySQL, Node.js, Postgres, Python, SQL

Location requirements

Hybrid, Vancouver, Canada
