Intermediate Data Engineer developing solutions for big data reporting and analytics. Collaborating with stakeholders to deliver precision reporting and performance measurement for the organization.
Responsibilities
Working in collaboration with customers across the organization (Strategic Analytics, Claims, Insurance, Finance, Driver Licensing, Road Safety, etc.) to plan, scope, execute and sustain data-based solutions.
Responding to internal and external ad hoc requests, reviewing and clarifying data requirements, and ensuring report content is acceptable within policy and privacy protocols.
Providing subject matter expertise within the department and to clients on data sources, reporting workflows, business processes, and the appropriate tools with which to analyze their data.
Participating with corporate data user teams, developing data validation and test plans, performing user acceptance testing, and providing feedback to development and sustainment teams.
Conducting analysis for moderate to complex requests, defining data fields and determining data availability, developing information layout, format and interactivity. Presenting findings and providing clarification.
Requirements
Advanced skill in object-oriented programming languages such as Scala, Python, or Java.
Working experience with Big Data platforms, including exposure to the Hadoop ecosystem (Spark, HDFS, Hive, Kafka) and AWS big data services (AWS Glue, Lambda, and S3).
Experience with version control and CI/CD tools such as Jenkins, Git, Fisheye, and SVN.
Experience in developing efficient and robust data transformation & ingestion pipelines
Working experience in building and deploying machine learning models.
Advanced SQL knowledge and experience working with relational and NoSQL databases.
Strong analytical skills related to working with unstructured datasets.
Detail-oriented, with demonstrated ability to work with large and complex datasets while meeting deadlines, managing multiple priorities, and working effectively under time pressure.
Strong understanding of data quality management processes, data analysis, and data profiling.
Ability to apply critical thinking skills to troubleshoot and perform root cause analysis on technical problems and solution design.
Ability to provide technical advice and guidance to staff in resolving complex data ingestion and transformation issues.
Experience with performance tuning and code optimization.
Ability to design, develop, and enforce best practices and standards for data engineering.
Ability to work effectively with a team or independently, as well as lead small teams as needed.
Demonstrated leadership in coaching junior staff members and new hires.
Understanding of Agile Methodologies.
Excellent interpersonal, verbal and written communication skills to work with Managers, Directors and Executive level stakeholders.
Experience with reporting and visualization tools such as Tableau, user interface design, and iterative customer-driven design processes is an additional asset.
Data Engineer building data integration pipelines for data lakes and warehouses. Collaborating with stakeholders to meet business requirements in a leading publishing company.
Google Cloud Data Engineer implementing data ingestion and analytics frameworks at Fueled. Specializing in Google Cloud Platform and modern data modeling.
Consulting Senior Data Architect specializing in Microsoft Fabric solutions for digital products. Engaging in hands-on delivery, architecture, and governance for data engineering in a remote capacity.
Data Engineer at Motive delivering data infrastructure for the AI era. Collaborating with stakeholders, building models, and implementing innovative tooling.
Data Architect designing and governing data foundations for analytics and AI applications at Clio. Collaborating cross-functionally to develop high-quality data models and standards.
IAM/Data Engineer role in Toronto (Hybrid). Requires 4+ years in ETL, data pipelines, cloud platforms, and skills in Windows IAM, Ansible, Terraform, SQL, Python/Java, Spark/Kafka.
Data Migration Specialist managing client data migrations to gaiia's platform. Collaborating with teams to ensure accurate and timely data transitions.
Senior Data Architect/Strategist at Robots & Pencils blending advanced data knowledge with problem solving to drive intelligent products and smarter business decisions.
Principal Data Architect at PointClickCare ensuring coherent and scalable data architecture. Driving unified data direction while collaborating with Engineering Architecture team for AI enablement.