Senior Data Engineer responsible for data infrastructure in Cost Engineering. Enabling cost optimization and sustainability initiatives while collaborating across teams.
Responsibilities
Lead the design, implementation, and evolution of scalable, reliable data infrastructure that underpins Spotify’s cost and carbon intelligence.
Own end-to-end data pipelines for cloud cost, usage, and emissions data: spanning ingestion, transformation, modeling, and serving layers.
Partner closely with Data Scientists, Engineering, Finance, and Procurement to translate complex analytical and business needs into robust data architectures.
Set technical direction and standards for data modeling, orchestration, testing, and observability within the Cost Engineering domain.
Build and maintain curated, analytics-ready datasets that power executive reporting, forecasting, and optimization initiatives.
Ensure data accuracy, consistency, and timeliness for high-stakes cost and emissions reporting used to guide strategic infrastructure investments.
Proactively identify opportunities to improve the scalability, reliability, and cost efficiency of the data platform itself.
Mentor other engineers and act as a technical sounding board, raising the overall bar for data engineering excellence on the team.
Work across all Missions at Spotify to embed cost and climate awareness into decision-making, with a focus on accurate attribution of spend and carbon impact.
Requirements
A senior data engineer with a strong track record of owning and operating production-critical data systems end to end.
Hold a degree in computer science, engineering, or a related technical field, or equivalent proven experience.
Experienced in designing data architectures that scale with both data volume and organizational complexity.
Comfortable leading technical discussions, influencing build decisions, and aligning partners around long-term solutions.
Thrive in environments with evolving requirements, balancing speed of delivery with adaptability and correctness.
Strong communicator who can explain complex technical concepts clearly to both technical and non-technical audiences.
Familiar with financial, billing, or usage data, and able to connect infrastructure metrics to real business and sustainability impact.
Hands-on experience with cloud data platforms (GCP preferred).
Highly proficient in Python, SQL, and dbt; experienced with modern orchestration frameworks and with data quality and observability tooling.
Experience with at least one data processing framework such as Spark, Flink, or Dataflow.
Data Engineer building data integration pipelines for data lakes and warehouses. Collaborating with stakeholders to meet business requirements in a leading publishing company.
Google Cloud Data Engineer implementing data ingestion and analytics frameworks at Fueled. Specializing in Google Cloud Platform and modern data modeling.
Consulting Senior Data Architect specializing in Microsoft Fabric solutions for digital products. Engage in hands-on delivery, architecture, and governance for data engineering in a remote capacity.
Data Engineer at Motive delivering data infrastructure for the AI era. Collaborating with stakeholders, building models, and implementing innovative tooling.
Data Architect designing and governing data foundations for analytics and AI applications at Clio. Collaborating cross-functionally to develop high-quality data models and standards.
IAM/Data Engineer role in Toronto (Hybrid). Requires 4+ years in ETL, data pipelines, cloud platforms, and skills in Windows IAM, Ansible, Terraform, SQL, Python/Java, Spark/Kafka.
Data Migration Specialist managing client data migrations to gaiia's platform. Collaborating with teams to ensure accurate and timely data transitions.
Senior Data Architect/Strategist at Robots & Pencils blending advanced data knowledge with problem solving to drive intelligent products and smarter business decisions.
Principal Data Architect at PointClickCare ensuring coherent and scalable data architecture. Driving unified data direction while collaborating with Engineering Architecture team for AI enablement.