We are looking for a Big Data Consultant for a contract hybrid role in Mississauga. The ideal candidate will have a passion for and extensive experience in large-scale data handling, with deep expertise in Big Data transformation programs spanning architecture through maintenance. If you have the relevant skills, please apply with a copy of your updated resume and contact details.
Key Responsibilities:
• Translate high-level and functional requirements and data models (dimensional, semi-structured, and transactional use cases) into technical designs.
• Develop batch and real-time data ingestion pipelines using a wide range of technologies such as Spark, Hadoop, Hive, and middleware.
• Develop programs to migrate historical data from legacy platforms to the Big Data platform.
• Participate in SIT/UAT test cycles and release cycles; triage and resolve issues.
• Perform code reviews and test-case reviews, and ensure functional and non-functional requirements are met.
• Analyse platform and software version upgrades, and evaluate new tools and technologies for handling growing data volumes.
Required Skills:
• Experience: 6 to 8 years of hands-on experience in big data development, focusing on Apache Spark, Scala, and distributed systems.
• Cloudera Platform: Experience designing, building, and optimizing on the Cloudera platform, including designing, developing, and optimizing scalable distributed data processing pipelines using Apache Spark, Scala, and Hadoop ecosystem tools such as Hive, HDFS, and YARN.
• Real-Time Data Streaming: Experience with streaming platforms such as Apache Kafka or Spark Streaming.