
Tech Lead Data Engineering

Canada, SK
Posted yesterday
Job details:
Remote
Full-time
Experienced

Job Title: Tech Lead - Data Engineering
Location: Remote - Canada

Job Summary:
We are looking for a highly skilled and motivated Tech Lead of Data Engineering to spearhead the development of scalable and efficient data engineering solutions. The ideal candidate will possess deep expertise in Python, PySpark, AWS services, and streaming data platforms, with a proven ability to integrate complex data sources and develop distributed data processing frameworks. This role requires a strong technical leader who can guide the team, solve complex challenges, and deliver optimal solutions that align with client requirements. Healthcare experience is an added benefit.

Key Responsibilities:
  1. Technical Leadership:
  • Provide hands-on technical leadership to the team in designing and implementing data engineering solutions.
  • Lead by example in adopting best practices for coding, testing, and deployment.
  2. ETL Development:
  • Design and develop robust ETL pipelines using AWS Glue, Lambda, and other AWS services to process large volumes of data efficiently.
  • Implement complex data transformations and integrate data from multiple sources such as APIs, databases, and streaming platforms.
  3. Distributed Data Processing:
  • Develop distributed data processing frameworks to ensure performance and scalability in handling large datasets.
  • Optimize the performance of data processing jobs for both batch and real-time workloads.
  4. Solutioning & Architecture:
  • Provide optimal data engineering solutions aligned with client requirements and business objectives.
  • Collaborate with architects to design scalable and secure data solutions leveraging AWS cloud services.
  5. AWS Expertise:
  • Utilize AWS services (e.g., S3, Glue, Lambda, Kinesis, DynamoDB) to build efficient and scalable cloud-based solutions.
  • Stay updated with the latest AWS services and features to continuously improve system performance and cost efficiency.
  6. Stakeholder Collaboration:
  • Work closely with clients, business analysts, and other stakeholders to understand requirements and translate them into technical solutions.
  • Communicate progress, challenges, and solutions effectively to both technical and non-technical stakeholders.
Qualifications:
  • Education:
    • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • Experience:
    • 10+ years of experience in data engineering.
    • Strong hands-on expertise in Python, PySpark, and AWS services for data processing and integration.
    • Extensive knowledge of distributed data processing frameworks and best practices.
  • Skills:
    • Strong problem-solving and solution-oriented mindset to deliver optimal results.
    • Excellent knowledge of data integration techniques and cloud-based architecture.
    • Proficiency in implementing complex data transformations and scalable data workflows.
    • Exceptional team leadership and mentoring abilities.
    • Strong communication skills for effective stakeholder collaboration.
Preferred Qualifications:
  • AWS Certified Solutions Architect certification or equivalent.
  • Familiarity with Terraform or CloudFormation for AWS infrastructure as code.
  • Wealth Management domain experience.
Why Join Us?
  • Be a key player in building cutting-edge data engineering solutions for large-scale projects.
  • Work with a talented and collaborative team in a dynamic environment.
  • Competitive salary, benefits, and opportunities for career growth.
