Data Engineer working with Google Cloud technologies, assisting clients with data solutions and pipelines. Collaborating with teams to optimize data infrastructure and promote agile practices.
Responsibilities
Build the infrastructure that enables analytics and data science teams to deliver innovative, impactful solutions for clients.
Assist clients in migrating their existing business intelligence and data warehouse solutions to Google Cloud.
Design, develop, and optimize robust data pipelines, making data easily accessible for visualization and machine learning applications.
Design and implement new data warehouse and data mart solutions, including transforming, testing, deploying, and documenting data.
Apply data modeling techniques to client solutions.
Optimize data storage for warehouse technologies.
Architect, maintain, and troubleshoot cloud-based infrastructure to ensure high availability and performance.
Work closely with technology partners such as Google Cloud, Snowflake, dbt, and Looker, mastering their technologies and building a network with their engineers.
Collaborate in an agile and dynamic environment with a team of data engineers, BI analysts, data scientists, and machine learning experts.
Apply software engineering best practices to analytics processes, such as version control, testing, and continuous integration.
Requirements
4+ years in a data-related role (e.g., Data Engineer, Data Analyst, Analytics Engineer).
Hands-on experience with Looker, dbt, modern data warehouses like Snowflake or BigQuery, and Kimball data modeling.
Expertise in Python and/or Java, with proficiency in SQL.
5+ years of experience in designing and building scalable data solutions.
Ability to write tested, resilient, and well-documented code.
Experience in building and maintaining cloud infrastructure (GCP or AWS is a plus).
Ability to take ownership and drive projects from concept to completion.
Natural ability to manage multiple initiatives and clients simultaneously.
Skilled in writing analytical SQL, with an understanding of the difference between SQL that works and performant SQL.
Experience in translating business requirements into technical solutions.
Ability to communicate complex ideas simply to a wide range of audiences.
Experience in providing technical guidance and direction on projects.
Complete alignment with our culture of transparency, empathy, accountability, and performance.
Benefits
20 days of paid vacation per calendar year
Public Holidays for your Province of Residence
5 Wellness days (sickness, personal time, mental health)
5 Lifestyle days (religious events, volunteer day, sick day)
Matching Group Retirement Savings Plan after 3 months
Competitive Group Insurance plan on Day 1 - individual premium paid 100%!
Virtual Medicine and Family Assistance Program - 100% employer-paid!
Home office budget - We are 100% remote!
CAD $70/month for internet/phone expenses
CAD $1,500 every 3 years for tech accessories and office equipment (monitor, keyboard, mouse, desk, etc.) starting on Day 1
Company-supplied MacBook Pro or Air
CAD $400/year for books, relevant app subscriptions, or an e-reader
Opportunities for paid certifications
Opportunities for professional and personal learning through Google and other training programs