Join a strategic initiative with a leading banking client to modernize its data infrastructure and enhance business intelligence capabilities using Azure services and Databricks.
Key Responsibilities:
- Data Pipeline Modernization: Migrate data ingestion from legacy VM-based sources to Azure Data Factory (ADF) and Databricks.
- ETL Development: Build robust ETL pipelines that ingest data from SFTP, APIs, and databases into Delta Lake, following the Medallion (Bronze/Silver/Gold) architecture.
- Visualization Enablement: Support integration between Plotly Dash and Databricks for real-time data insights.
- Databricks Administration: Configure Unity Catalog, manage clusters, define schemas, and enforce secure access controls.
- DevOps & Deployment: Implement CI/CD pipelines using Azure DevOps or GitHub Enterprise for secure and automated deployments.
Must-Have Skills:
- 7+ years of experience with ADF, Databricks, and Azure Functions (C#)
- Strong Python and notebook development skills (Spark, pandas)
- Experience with Unity Catalog migration and Delta table design
- Proficiency with Microsoft Entra ID (SSO/RBAC) and Azure PaaS services (SQL, ADLS, Synapse)
- CI/CD pipeline setup for Databricks, including cluster and secret management
Nice-to-Have Skills:
- Experience with Delta Live Tables
- Familiarity with secure, regulated environments
- Financial services industry background
Soft Skills:
- Strong communication and leadership
- Ability to work independently and lead small teams
- Proactive problem-solving in agile environments