Data Engineer at KPI specializing in cloud-based digital evolution and data integration projects. Collaborating with data scientists to ensure reliable and accessible data for clients.
Responsibilities
Design and implement projects that integrate data from multiple sources to support analysis and decision-making.
Ensure data is accessible, reliable, and easy to work with for both routine and ad-hoc needs.
Collaborate closely with data scientists and AI engineers to support software solution development.
Requirements
Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related discipline.
Minimum of five years of experience in data engineering or a related field.
Demonstrated ability to optimize and troubleshoot data engineering processes.
Strong proficiency in Python and database management systems (T-SQL, SQL, NoSQL), including query optimization.
Solid foundational software engineering skills; familiarity with DevOps, UNIX, Git, Docker, and object-oriented principles.
Experience with data processing tools and frameworks such as Apache Spark, Apache Flink, Snowflake, dbt, Airflow, Dagster, and Databricks.
Understanding of data storage, collection, and aggregation models, and how to apply them to business problems.
Knowledge of database indexing, especially clustered columnstore tables.
Strong ability to handle, analyze, and interpret data.
Understanding of data security both at rest and in transit.
Ability to deliver performant, scalable solutions and proactively anticipate system enhancements.
Experience with data integrity practices, including data manipulation, error handling, and modeling.
Experience deploying Docker-based solutions in cloud environments.
Benefits
The opportunity to join and grow an ever-expanding professional network of high-profile clients and reputable colleagues.
Permanent, full-time job (40 hours a week).
Hybrid (work from home and/or office) and flexible schedule.
Competitive salary and bonus structure.
Attractive group insurance plan.
Retirement savings plan with matching.
Family company culture.
Flex-Fridays in the summertime.
Free use of the gym within the building.
Subsidized catering service & free snacks at the office.
Salesforce Data Architect designing and optimizing enterprise-grade data architectures across Salesforce platforms. Collaborating with team members to ensure data quality and readiness for analytics.
Senior Data Engineer with a strong background in Google Cloud services at Valtech. Leading data engineering projects and developing highly available data pipelines.
Sr. Databricks Spark Developer role designing and optimizing data pipelines for banking. Requires Databricks/Spark experience in financial services with strong communication skills.
Data Integration Developer for market risk systems. Responsible for ETL/ELT development, SQL database programming, and supporting risk management systems in a hybrid Mississauga contract role.
Azure & Databricks Data Engineer role designing and building data pipelines using the Microsoft tech stack. 11-month contract, hybrid work in Oshawa, $90-95/hr.
Data Engineering Developer responsible for designing and implementing data flows using cloud technologies like AWS and Databricks. Collaborating within a strong data science team to optimize data for machine learning.
Sr. Manager leading a data engineering team to optimize data infrastructure for insurance. Driving innovative data solutions and managing cross-functional collaborations within a remote setup.