Data Engineer developing and maintaining data pipelines and analytics for Samsara's Data Platform, working collaboratively to integrate diverse data sources and enable data-driven decision-making.
Responsibilities
Develop and maintain end-to-end (E2E) data pipelines and backend ingestion, and participate in building Samsara's Data Platform to enable advanced automation and analytics.
Work with data from a variety of sources, including but not limited to CRM, product, marketing, order flow, and support ticket volume data.
Manage critical data pipelines to enable our growth initiatives and advanced analytics.
Facilitate data integration and transformation requirements for moving data between applications, ensuring interoperability of applications with data layers and the data lake.
Develop and improve the current data architecture, and strengthen data quality, monitoring, observability, and data availability.
Write data transformations in SQL/Python to generate data products consumed by customer systems and by the Analytics, Marketing Operations, and Sales Operations teams (an illustrative sketch of such a transformation follows this list).
Champion, role model, and embed Samsara’s cultural principles as we scale globally and across new offices.
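For illustration only (not part of the role description): a minimal sketch of the kind of SQL/Python transformation this posting describes, assuming a hypothetical raw_orders source table and using SQLite as a stand-in for a real warehouse connection.

    # Illustrative sketch only -- table names and schema are hypothetical.
    import sqlite3
    import pandas as pd

    conn = sqlite3.connect("warehouse.db")  # stand-in for a real warehouse connection

    # Aggregate hypothetical raw order events into a daily data product.
    daily_orders = pd.read_sql_query(
        """
        SELECT order_date,
               region,
               COUNT(*)         AS order_count,
               SUM(order_total) AS revenue
        FROM raw_orders
        GROUP BY order_date, region
        """,
        conn,
    )

    # Materialize the result for downstream Analytics / Ops consumers.
    daily_orders.to_sql("daily_orders_summary", conn, if_exists="replace", index=False)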
Requirements
A Bachelor’s degree in computer science, data engineering, data science, information technology, or an equivalent engineering program.
3+ years of experience in data engineering, ETL development, or database architecture.
3+ years of experience building and maintaining large-scale, production-grade end-to-end data pipelines, including data modeling.
Experience with modern cloud-based data-lake and data-warehousing technology stacks, and familiarity with typical data-engineering tools, ETL/ELT, and data-warehousing processes and best practices.
Experience leading end-to-end projects, including serving as the central point of contact for stakeholders.
Ability to engage directly with internal cross-functional stakeholders to understand their data needs and design scalable solutions.
3+ years of experience with Python and SQL.
Exposure to ETL tools such as Fivetran, dbt, or equivalent.
Exposure to Python-based API frameworks for data pipelines (a minimal illustrative ingestion step is sketched after this list).
RDBMS: MySQL, AWS RDS/Aurora MySQL, PostgreSQL, Oracle, MS SQL Server, or equivalent.
Cloud: AWS, Azure and/or GCP.
Data warehouse: Databricks, Google BigQuery, AWS Redshift, Snowflake, or equivalent.
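Again purely illustrative of the tooling named above (the endpoint, fields, and staging table are invented for the example): a minimal hand-rolled Python ingestion step of the kind that API-based pipeline tools automate.

    # Illustrative sketch only -- endpoint, fields, and table are hypothetical.
    import sqlite3
    import requests

    # Pull records from a hypothetical support-ticket API.
    resp = requests.get("https://api.example.com/v1/tickets", timeout=30)
    resp.raise_for_status()
    rows = [(t["id"], t["status"], t["created_at"]) for t in resp.json()]

    # Load them into a staging table (SQLite as a stand-in warehouse).
    conn = sqlite3.connect("warehouse.db")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS stg_tickets (id TEXT, status TEXT, created_at TEXT)"
    )
    conn.executemany("INSERT INTO stg_tickets VALUES (?, ?, ?)", rows)
    conn.commit()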