Job Title: Big Data Developer
Location: Mississauga - Hybrid - 3 days/week
Term: 12-month contract, with possible extension

The ideal candidate will have both a passion for and extensive experience in large-scale data handling, with deep expertise across Big Data transformation programs, from architecture through maintenance. The candidate will work closely with Enterprise Application teams, technology partners, Enterprise Architects, Data Governance teams, Business Analysts, and Quality Engineers to meet the IDP Program objectives.
Key Responsibilities:
• Translate high-level and functional requirements and data models (dimensional, semi-structured, and transactional use cases) into technical designs.
• Develop batch and real-time data ingestion pipelines involving a wide range of technologies such as Spark, Hadoop, Hive, and middleware.
• Develop programs to migrate historical data from legacy platforms to the Big Data platform.
• Participate in UAT/SIT test cycles and release cycles; triage and resolve issues.
• Perform code reviews and test case reviews, and ensure functional and non-functional requirements are met.
• Analyse platform and software version upgrades, and evaluate new tools and technologies for handling growing data volumes.
Required Skills:
• Experience: 6 to 8 years of hands-on big data development experience, with a focus on Apache Spark, Scala, and distributed systems.
• Experience designing, building, and optimizing on the Cloudera platform, including designing, developing, and optimizing scalable distributed data processing pipelines using Apache Spark, Scala, and Hadoop ecosystem tools such as Hive, HDFS, and YARN.
• Real-Time Data Streaming: Experience with streaming platforms such as Apache Kafka or Spark Streaming.