Senior Python Data Platform Engineer (Snowflake)
Location: Montreal, Quebec, Canada (Onsite - Local Candidates Only)
Experience: 10+ Years in Data Engineering
Responsibilities
- Build and enhance a Python-based data pipeline framework using Airflow, dbt, Spark, and Snowflake (see the sketch after this list)
- Establish and enforce best practices for orchestration, data modeling, and platform integration
- Work on data ingestion from multiple sources, optimizing for performance and scalability
- Collaborate closely with cross-functional teams including analysts, data scientists, and infrastructure engineers
- Own testing, deployment (CI/CD), performance tuning, and monitoring of pipelines
- Contribute to integrations with internal systems: data catalog, data quality, incident logging, and reporting tools
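To give a flavour of the first responsibility above, here is a minimal sketch of one pipeline step of the kind this framework might orchestrate: a PySpark transformation written to Snowflake through the Spark Snowflake connector. This is illustrative only and not part of the role; the bucket path, table name, and connection options are placeholders, and the example assumes the spark-snowflake connector is available on the cluster.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_trades_load").getOrCreate()

# Ingest a raw source file and apply a light aggregation (placeholder path).
raw = spark.read.parquet("s3://example-bucket/raw/trades/")
daily = (
    raw.withColumn("trade_date", F.to_date("trade_ts"))
       .groupBy("trade_date", "symbol")
       .agg(F.sum("notional").alias("total_notional"))
)

# Write the result to Snowflake via the Spark connector ("snowflake" format).
# Connection details below are placeholders; authentication options are omitted.
sf_options = {
    "sfURL": "account.snowflakecomputing.com",
    "sfUser": "svc_pipeline",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "STAGING",
    "sfWarehouse": "LOAD_WH",
}
(
    daily.write.format("snowflake")
         .options(**sf_options)
         .option("dbtable", "DAILY_TRADES")
         .mode("overwrite")
         .save()
)
```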
Required Skills
- Bachelor's degree in Computer Science, IT, or related field
- 10+ years in complex, high-volume data environments
- 7+ years of SQL and PL/SQL, plus Python for data pipelines (Pandas, PySpark, NumPy)
- 3+ years working in hybrid environments (on-prem + cloud)
- 3+ years hands-on with Apache Airflow DAGs (branching, dynamic task generation, error handling); see the sketch after this list
Proven experience with:
- Snowflake
- Apache Spark
- Airflow or Dagster
- Structured, semi-structured, and unstructured data stores
- Strong understanding of data modeling, E-R models, ETL best practices, and performance tuning
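As a point of reference for the Airflow requirement above, the following is a minimal sketch (not part of the role itself) of a DAG exercising the three named patterns: branching, dynamic task generation (task mapping), and retry-based error handling. It assumes a recent Airflow 2.x; the DAG, task, and source names are illustrative.

```python
from datetime import datetime

from airflow.decorators import dag, task
from airflow.operators.empty import EmptyOperator
from airflow.utils.trigger_rule import TriggerRule


@dag(schedule=None, start_date=datetime(2024, 1, 1), catchup=False)
def example_ingest():
    # Dynamic task generation: one extract task per source, expanded at runtime.
    @task(retries=3)  # Error handling: retry transient failures before failing the run.
    def extract(source: str) -> str:
        return f"extracted {source}"

    # Branching: pick the load strategy based on a runtime condition.
    @task.branch
    def choose_load(full_refresh: bool = False) -> str:
        return "full_load" if full_refresh else "incremental_load"

    full_load = EmptyOperator(task_id="full_load")
    incremental_load = EmptyOperator(task_id="incremental_load")
    # The join runs as long as one branch succeeded and nothing failed.
    done = EmptyOperator(task_id="done", trigger_rule=TriggerRule.NONE_FAILED_MIN_ONE_SUCCESS)

    branch = choose_load()
    extract.expand(source=["orders", "customers", "trades"]) >> branch
    branch >> [full_load, incremental_load] >> done


example_ingest()
```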
Nice to Have
- Hands-on experience with dbt
- Familiarity with advanced data warehousing concepts (e.g., factless fact tables, temporal models)
- Financial services or risk domain experience