Senior Software Engineer, Data Platform

Posted 4 days ago

About the role

  • As a Senior Software Engineer at Lithic, you will develop backend services and APIs for data access, collaborate with the Analytics Engineering team, and contribute to data governance processes.

Responsibilities

  • Design, build, and maintain backend services and REST APIs that serve data from various SQL subsystems and other data sources
  • Develop well-tested, production-grade Python services with clean API contracts, proper authentication, versioning, and error handling
  • Work closely with the Analytics Engineering team to expose modeled data (billing, settlement, finance) through APIs that downstream consumers can rely on
  • Build internal tooling and services that enable the broader organization to self-serve their data needs without writing SQL
  • Participate in code reviews, system design discussions, and engineering best practices across the Infrastructure org
  • Contribute to service observability: logging, metrics, alerting, and on-call practices for the services you own
  • Maintain and improve existing data pipelines that move data from source systems into Snowflake (Airflow, Airbyte)
  • Contribute to the dbt project alongside the Analytics Engineering team — model improvements, test coverage, and data quality
  • Support data governance practices including access controls, lineage documentation, and data quality standards

Requirements

  • Strong Python proficiency with experience building backend services and REST APIs
  • Experience with web frameworks such as FastAPI, Flask, Django, or similar
  • Solid SQL skills and hands-on experience with modern cloud data warehouses (Snowflake strongly preferred)
  • Experience designing and building production APIs with proper authentication, versioning, and error handling
  • Familiarity with CI/CD, automated testing, and operational reliability practices
  • A track record of shipping reliable, well-tested services in production environments
  • Comfort navigating ambiguity and driving projects forward with minimal oversight
  • Experience with data pipeline development using tools like Airflow, Airbyte, Dagster, or similar (preferred)
  • Familiarity with dbt or similar transformation frameworks (preferred)
  • Experience in fintech, payments, or other financial services environments (preferred)
  • Familiarity with AWS services (Lambda, S3, RDS, API Gateway, ECS/Fargate) (preferred)
  • Kafka or event streaming experience (preferred)
  • Infrastructure-as-code experience (Terraform, Pulumi) (preferred)

Benefits

  • Unlimited PTO
  • 12 weeks of fully paid parental leave
  • 4-week fully paid sabbatical (earned at your 5-year anniversary)
  • Work From Anywhere: work from anywhere in the world for 4 weeks each year
  • 3% cashback on card purchases with your complimentary Privacy.com employee account
  • Health, vision, and dental insurance; HSA Contribution Match
  • 401(k) match
  • Voluntary Life Insurance and STD/LTD

Job type

Full Time

Experience level

Senior

Salary

CA$170,000 - CA$280,000 per year

Degree requirement

Bachelor's Degree

Tech skills

Airflow, AWS, Cloud, Django, Flask, Kafka, Python, SQL, Terraform

Location requirements

Remote (Canada)
