Job Title: Senior Data Engineer - Databricks
Job Type: 6-month contract to permanent
Location: Vancouver, BC (Hybrid, 3 days in-office)
Hours per week: 40
Rate: $60/hr (incorporated); $115K to $120K upon conversion to permanent
Overview
Our client is embarking on a transformative data journey to consolidate their data infrastructure into a unified Databricks Lakehouse platform. As a Senior Data Engineer, you will lead the technical implementation of their Databricks migration, establish the foundational data architecture, and mentor team members to build a scalable, reliable data ecosystem. This role is critical for designing and implementing a comprehensive data strategy that drives business value through advanced analytics and AI capabilities.
Responsibilities
- Design and implement a medallion architecture (bronze, silver, gold) within Databricks to support data transformation, quality, and accessibility
- Develop robust ETL/ELT pipelines to ingest data from various sources, including the Square Point of Sale (POS) system and the client's ERP (INforce)
- Establish governance frameworks using Unity Catalog to ensure data security, compliance, and proper access controls
- Collaborate with business stakeholders to standardize business definitions and logic through a well-designed semantic layer
- Mentor and upskill team members on Databricks technologies, best practices, and modern data engineering techniques
- Optimize data pipelines for performance and cost efficiency in cloud environments
- Implement data quality monitoring and testing frameworks to ensure reliable analytics
- Enable a single source of truth across the organization by consolidating disparate data sources into a unified Lakehouse platform
- Accelerate business decision-making through standardized metrics and efficient data processing
- Establish a foundation for advanced analytics, AI/ML capabilities, and forecasting
- Build a scalable data architecture that supports the company's long-term data strategy
- Create a culture of data excellence by mentoring team members and establishing best practices
- Drive the successful implementation of multi-phase data migration from legacy systems to modern cloud architecture
Qualifications
- 5+ years of experience in data engineering, with at least 2 years of hands-on experience with Databricks
- Strong proficiency in Python and SQL for data transformation and pipeline development
- Experience implementing medallion architecture and Delta Lake in production environments
- Practical knowledge of Unity Catalog and data governance frameworks
- Experience with cloud platforms, preferably AWS
- Demonstrated ability to mentor junior team members and communicate technical concepts effectively
- Experience with data modeling and implementing semantic layers
- Familiarity with data visualization tools (Power BI preferred)
- Experience migrating data from on-premises systems to cloud environments
- Databricks certification (e.g., Databricks Certified Data Engineer) is a plus