Data Engineering Specialist responsible for designing and maintaining cloud data integration processes for Cogeco Communications. Collaborating with analysts and architects to enhance data governance and quality.
Responsibilities
Reporting to the Lead Data Engineer, the Data Engineering Specialist is responsible for designing, developing, and maintaining data integration and transformation processes in our cloud-based data platform.
Develop and orchestrate data pipelines that ingest data from various sources (e.g., MySQL, Oracle, PostgreSQL, flat files) into a cloud-based environment, and move data across multiple systems according to business needs and requirements.
Collaborate with Data Analysts and Data Architects to define data models, requirements, and architecture for optimal performance in databases (e.g., BigQuery or other cloud-based relational databases).
Ensure robust ETL/ELT processes that support scalability, reliability, and efficient data access.
Implement and maintain data governance frameworks and standards, focusing on data classification, lineage, and documentation.
Utilize Collibra or similar platforms to manage data catalogs, business glossaries, and data policies.
Work closely with stakeholders to uphold best practices for data security, compliance, and privacy.
Identify, design, and implement process enhancements for data delivery, ensuring scalability and cost-effectiveness.
Automate manual tasks using scripting languages (e.g., Bash, Python) and enterprise scheduling/orchestration tools such as Airflow.
Conduct root cause analysis to troubleshoot data issues and implement solutions that enhance data reliability.
Partner with cross-functional teams (IT, Analytics, Data Science, etc.) to gather data requirements and improve data-driven decision-making.
Provide subject matter expertise on cloud data services, data classification standards, and governance tools.
Monitor and communicate platform performance, proactively recommending optimizations to align with organizational goals.
Requirements
Experience with at least one major cloud platform (AWS, Azure, GCP), with GCP exposure considered a significant asset.
Strong understanding of RDBMS (PostgreSQL, MySQL, Oracle, SQL Server) with the ability to optimize SQL queries and maintain database performance.
Familiarity with version control systems (Git) to manage codebase changes and maintain a clean development workflow.
Familiarity with data governance and classification concepts, leveraging Collibra or similar platforms to manage data lineage, business glossaries, and metadata.
Knowledge of Linux/UNIX environments, and experience working with APIs (XML, JSON, REST, SOAP).
Demonstrated ability to build large-scale, complex data pipelines for ETL/ELT processes.
Hands-on experience with scripting/programming languages (e.g., Python, Bash) to automate data workflows and error handling.
Strong analytical and problem-solving skills with the ability to work with unstructured datasets.
Functional knowledge of encryption technologies (SSL, TLS, SSH) and data protection measures.
Experience implementing governance best practices to ensure data security and regulatory compliance.
Excellent communication and collaboration skills to partner effectively with cross-functional teams.
Curiosity and a growth mindset, with the initiative to explore emerging data technologies.
Bilingualism (written and spoken) is an asset for interfacing with stakeholders in Ontario and across the United States.
Bachelor’s degree in Information Technology, Computer Science, or a related field; or an equivalent combination of education and experience.
5 years of progressive experience in data engineering, data analytics, or a similar role.
Proven track record in architecting, optimizing, and delivering enterprise-grade data solutions on a major cloud platform (AWS, Azure, or GCP).
Benefits
Flexibility: Yes, we think that what you do matters. At work and at home.
Fun: We laugh a lot, it makes every day brighter.
Discounted services: We provide amazing services to our clients, and you’ll get them at home, because you deserve them.
Rewarding Pay: Let's be honest, everybody likes to make a good salary. We offer attractive compensation packages, and it comes with a great culture.
Benefits: We’ve got you covered.
Career Evolution: Join us and we will give you the tools to achieve your career goals!
Technology: Do you have a passion for technology? Excellent, we do too. Here, you will manage, influence, play, create, fix, and shape the industry.