Senior Data Engineer developing the data management layer for a financial brokerage platform, built to scale for larger customers. Collaborating with teams in a fully remote, diverse environment.
Responsibilities
Design and oversee key forward- and reverse-ETL patterns to deliver data to relevant stakeholders.
Develop scalable patterns in the transformation layer to ensure repeatable integrations with BI tools across various business verticals.
Expand and maintain the constantly evolving components of the Alpaca Data Lakehouse architecture.
Collaborate closely with sales, marketing, product, and operations teams to address key data flow needs.
Operate the system and resolve production issues in a timely manner.
Requirements
7+ years of experience in data engineering, including 2+ years of building scalable, low-latency data platforms capable of handling >100M events/day.
Proficiency in at least one programming language, with strong working knowledge of Python and SQL.
Experience with cloud-native technologies like Docker, Kubernetes, and Helm.
Strong hands-on experience with relational database systems and object-storage-backed table formats like Apache Iceberg.
Strong hands-on experience with Google Cloud Platform and its various data-related services (Composer, Dataproc, Datastream, etc.).
Experience in building scalable transformation layers, preferably through formalized SQL models (e.g., dbt).
Ability to work in a fast-paced environment and adapt solutions to changing business needs.
Experience with ETL orchestrators/frameworks like Apache Airflow and Airbyte.
Production experience with streaming systems like Kafka.
Exposure to infrastructure, DevOps, and Infrastructure as Code (IaC) tooling like Terraform.
Deep knowledge of distributed systems, storage, transactions, and query processing utilizing open-source distributed query engines like Trino (formerly PrestoSQL).
Benefits
Competitive Salary & Stock Options
Health Benefits
New Hire Home-Office Setup: One-time USD $500
Monthly Stipend: USD $150 per month via a Brex Card
Data Engineer at Motive delivering data infrastructure for the AI era. Collaborating with stakeholders, building models, and implementing innovative tooling.
Data Architect designing and governing data foundations for analytics and AI applications at Clio. Collaborating cross-functionally to develop high-quality data models and standards.
IAM/Data Engineer role in Toronto (Hybrid). Requires 4+ years in ETL, data pipelines, cloud platforms, and skills in Windows IAM, Ansible, Terraform, SQL, Python/Java, Spark/Kafka.
Data Migration Specialist managing client data migrations to gaiia's platform. Collaborating with teams to ensure accurate and timely data transitions.
Senior Data Architect/Strategist at Robots & Pencils blending advanced data knowledge with problem-solving to drive intelligent products and smarter business decisions.
Principal Data Architect at PointClickCare ensuring coherent and scalable data architecture. Driving unified data direction while collaborating with Engineering Architecture team for AI enablement.
Technical Lead overseeing data engineers, analysts, and architects to implement data solutions. Leading modernization of data infrastructures for diverse business objectives.
Data Engineer joining a consulting firm in Toronto with a world-class team of engineers. Producing high-quality data tools and pipelines while collaborating with leading companies.
Director of Data Engineering & AI Strategy driving Google Marketing Platform capabilities for global marketing partner Incubeta. Hands-on technical leadership at the intersection of ad tech and media.
Data Engineer at Alberta Blue Cross designing and implementing data solutions for business analytics. Collaborating on data pipelines and analytics projects in a hybrid work environment.