About the role

  • Intermediate Data Engineer developing solutions for big data reporting and analytics. Collaborating with stakeholders to deliver precision reporting and performance measurement for the organization.

Responsibilities

  • Working in collaboration with customers across the organization (Strategic Analytics, Claims, Insurance, Finance, Driver Licensing, Road Safety, etc.) to plan, scope, execute and sustain data-based solutions.
  • Responding to internal and external ad hoc requests, reviewing and clarifying data requirements, and ensuring report content complies with policy and privacy protocols.
  • Providing subject matter expertise within the department and to clients on data sources, reporting workflows, business processes, and the appropriate tools with which to analyze their data.
  • Participating with corporate data user teams, developing data validation and test plans, performing user acceptance testing, and providing feedback to development and sustainment teams.
  • Conducting analysis for moderate to complex requests, defining data fields and determining data availability, developing information layout, format and interactivity. Presenting findings and providing clarification.

Requirements

  • Advanced skill in object-oriented programming languages such as Scala, Python, or Java.
  • Working experience with Big Data platforms, with exposure to the Hadoop ecosystem (Spark, HDFS, Hive, Kafka) and AWS big data technologies (AWS Glue, Lambda, and S3).
  • Experience with CI/CD tools such as Jenkins, Git, Fisheye, and SVN.
  • Experience developing efficient and robust data transformation and ingestion pipelines.
  • Working experience in building and deploying machine learning models.
  • Advanced working SQL knowledge and experience working with Relational and NoSQL databases.
  • Strong analytical skills related to working with unstructured datasets.
  • Demonstrated ability to work with large and complex datasets, while managing priorities and responding to time pressures.
  • Detail-oriented with demonstrated ability to meet deadlines, manage multiple priorities and work effectively under pressure.
  • Strong understanding of data quality management processes, data analysis, and data profiling.
  • Ability to apply critical thinking skills to troubleshoot and perform root cause analysis on technical problems and solution design.
  • Ability to provide technical advice and guidance to staff in resolving complex data ingestion and transformation issues.
  • Experience with performance tuning and code optimization.
  • Ability to design, develop, and enforce best practices and standards around data engineering.
  • Ability to work effectively with a team or independently, as well as lead small teams as needed.
  • Demonstrated leadership in coaching junior staff members and new hires.
  • Understanding of Agile Methodologies.
  • Excellent interpersonal, verbal and written communication skills to work with Managers, Directors and Executive level stakeholders.
  • Experience with reporting and visualization tools such as Tableau, user interface design, and iterative customer-driven design processes would be an additional asset.

Benefits

  • Competitive salary
  • Comprehensive benefits
  • Collaborative work environment

Job type

Full Time

Experience level

Mid level, Senior

Salary

CA$92,729 - CA$100,427 per year

Degree requirement

Bachelor's Degree

Tech skills

AWS, Hadoop, HDFS, Java, Jenkins, Kafka, NoSQL, Python, Scala, Spark, SQL, Subversion, Tableau

Location requirements

Hybrid; North Vancouver, Canada
