Our client is a Toronto-based firm founded by leading engineers and data scientists, with deep roots in capital markets and cloud engineering.
Our mission is to transform data into strategic assets for financial services firms—driving growth and innovation through tailored, modern solutions.
You'll work on cutting-edge projects in asset and wealth management, handling large volumes of structured and unstructured data—free from legacy constraints. As part of a lean, agile team, you'll have the autonomy to shape architecture, innovate, and deliver impactful solutions.
What You'll Do
Design & Build: Develop scalable ELT pipelines using Azure Databricks (PySpark) and orchestrate workflows via ADF or Synapse (see the first sketch after this list).
Data Modeling: Build and maintain data marts in Snowflake, using streams, tasks, clustering, and RBAC for efficient data access (see the second sketch after this list).
Quality Assurance: Implement automated data-quality tests using Great Expectations or dbt to ensure reliability and accuracy (see the third sketch after this list).
DevOps & Infrastructure: Deploy and manage infrastructure with Terraform and Azure DevOps, ensuring a stable production environment.
Performance Tuning: Work closely with our Platform Lead to optimize cost and performance (e.g., Spot pools, warehouse sizing).
Collaboration & Documentation: Write clear design docs, participate in code reviews, and actively share knowledge within the team.
Agile Environment: Contribute to sprints, demos, and retrospectives to continually improve our processes.
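For the Design & Build responsibility, here is a minimal sketch of the kind of PySpark ELT step this role involves, as it might run on Azure Databricks. The storage path, table names, and columns are hypothetical placeholders, not details from this posting.

```python
# Minimal PySpark ELT sketch for Azure Databricks (all names hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("trades_elt").getOrCreate()

# Extract: load raw trade events from a (hypothetical) ADLS landing zone.
raw = spark.read.json("abfss://landing@examplestorage.dfs.core.windows.net/trades/")

# Light transform: normalize types and drop obviously bad rows before loading.
cleaned = (
    raw.withColumn("trade_ts", F.to_timestamp("trade_ts"))
       .withColumn("notional", F.col("notional").cast("decimal(18,2)"))
       .filter(F.col("trade_id").isNotNull())
)

# Load: append into a Delta table partitioned by trade date.
(
    cleaned.withColumn("trade_date", F.to_date("trade_ts"))
           .write.format("delta")
           .mode("append")
           .partitionBy("trade_date")
           .saveAsTable("bronze.trades")
)
```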
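For the Data Modeling responsibility, a sketch of Snowflake change capture with a stream and a scheduled task, issued here through snowflake-connector-python. The connection parameters, identifiers, warehouse, and role are assumptions for illustration only.

```python
# Sketch: incremental mart refresh in Snowflake via a stream + task.
# All identifiers and credentials below are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account", user="example_user", password="...",
    warehouse="TRANSFORM_WH", database="ANALYTICS", schema="MARTS",
)
cur = conn.cursor()

# Stream tracks changes on the staging table since the last consume.
cur.execute("CREATE OR REPLACE STREAM trades_stream ON TABLE STAGING.TRADES")

# Task wakes on a schedule and loads only the new rows into the mart.
cur.execute("""
    CREATE OR REPLACE TASK refresh_trades_mart
      WAREHOUSE = TRANSFORM_WH
      SCHEDULE = '15 MINUTE'
    AS
      INSERT INTO MARTS.FCT_TRADES
      SELECT trade_id, trade_ts, notional
      FROM trades_stream
      WHERE METADATA$ACTION = 'INSERT'
""")
cur.execute("ALTER TASK refresh_trades_mart RESUME")  # tasks start suspended

# RBAC: expose the mart to a (hypothetical) analyst role, read-only.
cur.execute("GRANT SELECT ON TABLE MARTS.FCT_TRADES TO ROLE ANALYST")
conn.close()
```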
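For the Quality Assurance responsibility, a sketch of the kind of assertions that Great Expectations or dbt tests automate (not-null, uniqueness, range checks), expressed here as plain pandas checks for brevity. The frame and column names are hypothetical.

```python
# Sketch: hand-rolled versions of the checks a data-quality framework
# like Great Expectations or dbt would automate (hypothetical data).
import pandas as pd

df = pd.DataFrame({
    "trade_id": [1, 2, 3],
    "notional": [1_000.00, 250.50, 9_999.99],
})

failures = []
if df["trade_id"].isnull().any():
    failures.append("trade_id contains nulls")       # not-null check
if df["trade_id"].duplicated().any():
    failures.append("trade_id is not unique")        # uniqueness check
if (df["notional"] < 0).any():
    failures.append("notional has negative values")  # range check

assert not failures, f"data-quality failures: {failures}"
```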
What You Bring
Cloud & Data Engineering: Hands-on experience with the Azure data stack, including ADF/Synapse and Databricks.
Big Data Processing: Expertise in Spark/PySpark, with a focus on tuning and optimization.
Database Mastery: Proficient in SQL and Snowflake, with solid knowledge of schema design and access controls.
Programming: Strong Python skills, with an emphasis on modular, testable code (see the sketch after this list).
CI/CD & Infrastructure-as-Code: Experience with Git workflows, Terraform/Bicep, and CI/CD pipelines.
Data Quality Focus: A mindset geared toward building trust in data through automated validation and testing.
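As an illustration of the modular, testable Python the Programming bullet calls for, here is a small pure transform kept separate from I/O, with a pytest-style test. The function, its schema, and the sample values are hypothetical.

```python
# Sketch: a pure, easily unit-tested transform (hypothetical schema).
from decimal import Decimal

def net_notional(trades: list[dict]) -> Decimal:
    """Sum signed notionals, treating sells as negative."""
    total = Decimal("0")
    for t in trades:
        sign = Decimal("-1") if t["side"] == "SELL" else Decimal("1")
        total += sign * Decimal(str(t["notional"]))
    return total

def test_net_notional():
    trades = [
        {"side": "BUY", "notional": 100.0},
        {"side": "SELL", "notional": 40.0},
    ]
    assert net_notional(trades) == Decimal("60")
```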
Nice to Have
Experience with financial services or asset management data.
Knowledge of Databricks Delta Live Tables, dbt, or Apache Airflow.
Familiarity with observability tools like Datadog or Prometheus.
Experience mentoring junior engineers or facilitating agile ceremonies.
What We Offer
A stimulating hybrid work environment with strong engineering values.
Autonomy and ownership over architecture and project delivery.
Competitive salary + performance bonus.
Annual training & development budget.
Comprehensive benefits and pathways for career advancement in a growing consulting firm focused on financial data.
How to Apply
Please send your resume or LinkedIn profile to [email protected], and include a short note about a pipeline or data solution you're proud of.