Teradata Lead contract role in Toronto requiring strong data management skills and hands-on experience with big data technologies like Hadoop, Spark, and Kafka.
Responsibilities
- Strong skills in data management: data modeling and database design, EDW solution architecture, data integration, data movement and replication, business intelligence, data quality, reference and master data management, and data governance.
- Hands-on implementation experience with a combination of the following technologies: Hadoop distributions, Storm and Spark Streaming, Kafka, Spark advanced analytics, NoSQL databases such as HBase and Cassandra, and data processing frameworks (see the sketch after this list).
- Experience designing and implementing big data solutions, including requirements analysis, technical architecture design, application design and development, testing, and deployment of the proposed solution.
- Expertise in database integration patterns and performance considerations for operational and EDW databases.
- Experience with patterns and technologies in emerging areas: Big Data, Hadoop, Hortonworks, Aster, NoSQL databases, visualization tools, advanced analytics, BI virtualization, cloud, etc.
- Domain knowledge in financial services platforms and enterprise functions (e.g., Banking, Insurance, Investments, Wealth Management, AML, Risk, Finance, Fraud).
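As a rough illustration of the Kafka and Spark Streaming work this role calls for, here is a minimal PySpark Structured Streaming sketch that reads a Kafka topic and parses JSON events. It is not part of the posting; the broker address (broker:9092), topic name (transactions), and schema fields (account_id, amount) are hypothetical, and it assumes the spark-sql-kafka connector is available on the cluster.

    # Minimal sketch: consume a Kafka topic with Spark Structured Streaming
    # and write parsed records to the console. All names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import DoubleType, StringType, StructType

    spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

    # Hypothetical event schema, for illustration only.
    schema = (StructType()
              .add("account_id", StringType())
              .add("amount", DoubleType()))

    # Read the stream from Kafka and parse the JSON payload.
    events = (spark.readStream
              .format("kafka")
              .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
              .option("subscribe", "transactions")               # hypothetical topic
              .load()
              .select(from_json(col("value").cast("string"), schema).alias("e"))
              .select("e.*"))

    # Console sink for illustration; a real pipeline would target HBase,
    # a warehouse table, or cloud storage instead.
    query = (events.writeStream
             .outputMode("append")
             .format("console")
             .start())
    query.awaitTermination()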
Salesforce Data Architect designing and optimizing enterprise-grade data architectures across Salesforce platforms. Collaborating with team members to ensure data quality and readiness for analytics.
Senior Data Engineer at Valtech with a strong background in Google Cloud services. Leading data engineering projects and developing highly available data pipelines.
Sr. Databricks Spark Developer role designing and optimizing data pipelines for banking. Requires Databricks/Spark experience in financial services with strong communication skills.
Data Integration Developer for market risk systems. Responsible for ETL/ELT development, SQL database programming, and supporting risk management systems in a hybrid Mississauga contract role.
Azure & Databricks Data Engineer role designing and building data pipelines using the Microsoft tech stack. 11-month contract, hybrid work in Oshawa, $90-95/hr.
Data Engineering Developer responsible for designing and implementing data flows using cloud technologies like AWS and Databricks. Collaborating within a strong data science team to optimize data for machine learning.
Sr. Manager leading a data engineering team to optimize data infrastructure for insurance. Driving innovative data solutions and managing cross-functional collaborations within a remote setup.