
Contract Data Architect

Improving
Calgary, AB
Posted today
Job Details:
Full-time
Contract
Experienced

This role is open across Canada, but it requires the contractor to be on the client's site in Calgary from Monday to Friday for the duration of the 4-week contract. Please provide an all-inclusive rate when applying for this role.

Job description

Improving Calgary offers an exceptional work environment where employees can engage with cutting-edge technology and a diverse range of projects, all within a positive and inclusive cultural experience. With a rich history and strong presence in the IT services and solutions industry, Improving Calgary, formerly known as Quadrus, served Canada for over twenty years before becoming part of Improving in 2013.

Our client has a legacy database environment that receives information from multiple retail application environments. It is primarily used for business intelligence purposes but also serves as an operational database in some cases. They would like to move off this legacy environment onto a cloud-platform data lake, including migrating the methods used to put data into and pull data out of the legacy environment from custom code to industry standards.

Key Deliverables:

The deliverable is a solution architecture and plan for the migration, which would be wrapped into an SOW for the client to approve for a much longer-term project. It would be ideal for the individual who created the solution architecture and plan to be involved in the larger project.

Required Skills:

  • Experience designing scalable and robust architecture that aligns with business goals and technical requirements. This includes creating architectural diagrams, defining data flows, and ensuring system interoperability.
  • Able to contribute to the creation of a roadmap and plan (resources, effort, duration and costing) to complete the proposed solution.
  • Experience with ETL processes, particularly custom C++ and UNIX batch jobs.
  • Proficiency in data integration from multiple sources and formats
  • Expertise in designing and managing databases
  • Experience with Data Lakes on AWS
  • Knowledge of Spark and Confluent Kafka for handling large volumes of data and real-time processing
