Data Engineer at Appsilon responsible for building scalable data pipelines and ensuring data quality for global organizations. Collaborate with cross-functional teams to leverage massive datasets.
Responsibilities
Design, build, and maintain scalable data pipelines across diverse environments
Integrate data from multiple internal and external sources into data warehouses or data lakes
Collaborate closely with Data Scientists, ML Engineers, and Developers to ensure data quality, structure, and availability
Monitor and improve data integrity, performance, and reliability
Build and optimize database schemas, data models, and documentation
Implement data governance, security best practices, and compliance standards
Requirements
Backend Python Development
Strong experience building scalable backend systems in Python
Comfortable with modern language features (type hints, decorators, generators)
Able to design clean, maintainable APIs using FastAPI, Django REST Framework, or Flask (a minimal sketch follows this list)
Good understanding of performance optimization and Python internals
A collaborative mindset — you enjoy working closely with cross-functional teams
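For context, a minimal sketch of the style of backend code this implies, assuming FastAPI and Pydantic; the Dataset model, endpoints, and in-memory store are hypothetical illustrations, not part of Appsilon's stack:

from typing import Iterator

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Dataset(BaseModel):
    # Type hints drive request validation and the generated API docs.
    id: int
    name: str
    row_count: int

# In-memory store standing in for a real database.
_datasets: dict[int, Dataset] = {}

@app.post("/datasets")
def create_dataset(dataset: Dataset) -> Dataset:
    # Pydantic has already validated the body against the model above.
    _datasets[dataset.id] = dataset
    return dataset

@app.get("/datasets/{dataset_id}")
def read_dataset(dataset_id: int) -> Dataset:
    if dataset_id not in _datasets:
        raise HTTPException(status_code=404, detail="Dataset not found")
    return _datasets[dataset_id]

def iter_names() -> Iterator[str]:
    # Generator: yields lazily instead of building an intermediate list.
    for dataset in _datasets.values():
        yield dataset.name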
Data Engineering
Hands-on experience designing and operating ETL/ELT pipelines (see the sketch after this list)
Solid SQL skills and ability to model, optimize, and maintain database structures
Experience integrating data from multiple sources (databases, APIs, streaming)
Familiarity with large-scale data processing tools or distributed systems
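For context, a minimal batch ETL sketch using only the Python standard library; the orders.csv file, its columns, and the orders table are hypothetical, and a production pipeline would add logging, validation, and dead-letter handling:

import csv
import sqlite3
from pathlib import Path
from typing import Iterable, Iterator

SOURCE = Path("orders.csv")  # hypothetical input file
DB = Path("warehouse.db")    # hypothetical warehouse target

def extract(path: Path) -> Iterator[dict[str, str]]:
    # Stream rows so large files never sit fully in memory.
    with path.open(newline="") as f:
        yield from csv.DictReader(f)

def transform(rows: Iterable[dict[str, str]]) -> Iterator[tuple[int, str, float]]:
    # Cast and normalize each record before loading.
    for row in rows:
        yield (int(row["order_id"]), row["customer"].strip(), float(row["amount"]))

def load(records: Iterable[tuple[int, str, float]], db: Path) -> None:
    with sqlite3.connect(db) as conn:
        conn.execute(
            """CREATE TABLE IF NOT EXISTS orders (
                   order_id INTEGER PRIMARY KEY,
                   customer TEXT NOT NULL,
                   amount REAL NOT NULL
               )"""
        )
        # Upsert keeps reruns idempotent.
        conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", records)

if __name__ == "__main__":
    load(transform(extract(SOURCE)), DB)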
Nice to have
Experience with cloud platforms (AWS/Azure/GCP)
Knowledge of R
Experience with Docker, Kubernetes, and CI/CD tools (GitHub Actions, GitLab CI)
Understanding of data governance, metadata management, and security
Experience in life sciences, biotech, genomics, or enterprise data environments
Prior remote work experience with international teams
Life science skills: Molecular Biology & Bioinformatics, Clinical Trials (Data Tools & Flow), CDISC & Clinical Data Standards, Nextflow
Benefits
Competitive B2B compensation with clear salary ranges (up to 23,000 PLN net B2B)
Modern equipment (MacBook / ThinkPad + Linux environment)
Budget for professional development (certifications, courses, conferences)
Opportunity to collaborate with industry experts on innovative data products
A supportive, ambitious, and friendly team that cares about excellence
Data Engineer focusing on building scalable GCP solutions for TELUS Digital. Working with vendor technologies and integrating them with internal systems.
Design and implement end-to-end data architectures using Microsoft Fabric components with emphasis on the Medallion architecture pattern. Contract position in Winnipeg.
Principal Data Engineer with 10+ years' experience to lead AWS cloud-native data platform design and enterprise data strategy. Hybrid role in Toronto with 2 days onsite.
Data Architecture Advisor designing robust data management solutions. Involved in data integration, analytics, and architecture implementation for diverse business sectors.
Data Engineer with 6+ years' experience needed in Mississauga, ON. Must have strong Python and Spark skills for designing software solutions and data analysis.
Sr. IFRS17 Data Engineer contract role in Toronto (hybrid). Must have IFRS17 experience, insurance domain knowledge, and expertise in Python, PySpark, SQL, and data analysis.
Senior Software Engineer developing data-heavy services and data security solutions for Coinbase. Building and maintaining scalable data integration tools and self-service applications.
Senior Data Engineer role requiring Python & PySpark expertise. Design scalable data solutions, build data processing programs, and work with HBase/Hive datasets.
Data Engineer Intern/Co-op working on TD's big data platform. Apply software development and project management methodologies while collaborating with technical teams.
Geospatial Data Engineer building data processing systems for autonomous flight decisions alongside the CTO. Architecting the system's evolution from initial implementation to production-grade infrastructure.