Intermediate Data Engineer - Databricks, Microsoft Fabric, and Snowflake

SDK Tek Services Ltd.
Calgary, AB
Posted 3 days ago
Job details:
Full-time
Experienced

Who We Are

SDK Tek Services is one of Canada's leading data services firms and an official partner of Microsoft, Databricks, and Snowflake. Since 2016, we've helped clients modernize how they use data, building cloud-first platforms that drive competitive advantage. We specialize in modern DataOps, advanced analytics, and scalable data architecture. Our delivery model blends consulting, automation, and long-term managed services.

About the Role

We're hiring an Intermediate Data Engineer with experience across Databricks, Microsoft Fabric, and Snowflake. You'll be part of a team delivering modern data pipelines and cloud platform solutions for enterprise clients. This role involves hands-on development, integration, and optimization of big data environments across the Azure, Fabric, and Snowflake ecosystems.

What You'll Do

  • Develop and maintain data pipelines and orchestration workflows using Databricks, Microsoft Fabric, and Snowflake.
  • Integrate and transform structured and semi-structured data from diverse sources using Spark, PySpark, SQL, and native tools (see the illustrative sketch after this list).
  • Contribute to data lakehouse and warehouse design using Delta Lake, Snowflake, and Fabric Lakehouses.
  • Work with clients to ingest, clean, and model data in scalable cloud environments.
  • Support cross-platform integrations and ensure data lineage, quality, and performance standards are met.
  • Participate in project planning, estimations, and code reviews as part of a collaborative delivery team.
  • Support internal IP development and reusability frameworks.
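For candidates wondering what this work looks like day to day, here is a minimal, illustrative PySpark sketch of the kind of ingest-clean-model pipeline described above. The landing path, column names, and table name are hypothetical, not drawn from any actual client environment:

    # Illustrative sketch only: the landing path, columns, and table name
    # below are hypothetical, not from a real client engagement.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

    # Ingest semi-structured JSON from a cloud data lake landing zone.
    raw = spark.read.json("abfss://landing@example.dfs.core.windows.net/orders/")

    # Clean and model: drop rows missing key fields, normalize types.
    clean = (
        raw.dropna(subset=["order_id", "order_ts"])
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    )

    # Append to a Delta Lake table for downstream analytics.
    clean.write.format("delta").mode("append").saveAsTable("sales.orders")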

What You Bring

  • 2+ years of hands-on data engineering experience with at least two of the following: Databricks, Microsoft Fabric, or Snowflake.
  • Proficiency in SQL, Python, and/or PySpark for data transformation and orchestration.
  • Experience building scalable ETL/ELT pipelines and working with large datasets.
  • Familiarity with Azure Data Services (e.g., Data Lake, Data Factory, Synapse).
  • Exposure to CI/CD practices and version control (e.g., Git, Azure DevOps, GitHub Actions).
  • Strong problem-solving skills and ability to work in a fast-paced consulting environment.
  • Ability to communicate clearly with both technical and non-technical audiences.

Preferred but Not Required

  • Certification in Snowflake, Databricks, or Microsoft Fabric.
  • Experience with Delta Live Tables, dbt, or Fabric Dataflows.
  • Exposure to FinOps or cloud cost optimization best practices.
