We are seeking a highly skilled Data Streaming Architect with deep expertise in Apache Kafka to lead the design, implementation, and optimization of streaming data platforms across our enterprise systems. This role combines hands-on engineering depth with strategic architectural guidance, serving as a consulting architect to cross-functional teams. The ideal candidate is passionate about distributed systems and real-time data processing, and champions best practices at scale.
Key Responsibilities
Architecture & Design
Design scalable, resilient, and secure Kafka-based data streaming architectures.
Evaluate existing systems and recommend integration strategies with Kafka and related technologies (e.g., Kafka Connect, ksqlDB, Schema Registry).
Architect end-to-end streaming pipelines, including producers, brokers, consumers, and external sinks/sources.
Consulting & Strategic Guidance
Serve as a consulting architect to internal teams and stakeholders, advising on streaming strategies aligned with business goals.
Define and communicate streaming architecture best practices, standards, and governance models across teams.
Lead architectural reviews and assessments, ensuring compliance with performance, scalability, and reliability standards.
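To illustrate the kind of standard this role would define and enforce, here is a minimal sketch of a Kafka topic naming-convention check. The `<domain>.<entity>.<event-type>` pattern is a hypothetical convention chosen for this example, not an established company standard; the architect in this role would define the actual rules.

```python
import re

# Hypothetical naming standard: <domain>.<entity>.<event-type>, each
# segment lowercase alphanumeric (hyphens allowed). Illustrative only --
# the real convention would be defined as part of this role.
TOPIC_PATTERN = re.compile(
    r"^[a-z][a-z0-9-]*\.[a-z][a-z0-9-]*\.[a-z][a-z0-9-]*$"
)

def is_valid_topic_name(name: str) -> bool:
    """Return True if the topic name follows the illustrative standard."""
    return bool(TOPIC_PATTERN.match(name))
```

A check like this could run in CI or in a self-service topic-provisioning tool, turning a written governance document into an enforceable gate.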
Implementation & Enablement
Guide development teams in implementing Kafka integrations, tuning configurations, and resolving bottlenecks.
Prototype and validate new technologies and frameworks to support advanced streaming use cases.
Create technical artifacts including architecture diagrams, solution documents, and runbooks.
Mentorship & Leadership
Mentor engineering teams on Kafka usage patterns, anti-patterns, and performance tuning.
Deliver training sessions and workshops to upskill internal teams on event-driven architectures and stream processing.
Operational Oversight
Collaborate with DevOps and SRE teams to ensure effective monitoring, alerting, and disaster recovery for Kafka platforms.
Establish SLAs and operational playbooks for mission-critical streaming infrastructure.
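As a concrete illustration of codifying such an SLA, the sketch below flags consumer groups whose lag exceeds a per-tier threshold. The tier names and lag limits are assumptions invented for this example; real thresholds would be set with the DevOps and SRE teams.

```python
# Hypothetical SLA tiers mapping service criticality to a maximum
# tolerated consumer lag (in messages). Values are illustrative only.
SLA_MAX_LAG = {"critical": 1_000, "standard": 100_000}

def breaches_sla(tier: str, consumer_lag: int) -> bool:
    """Return True if the observed lag exceeds the tier's allowed maximum."""
    return consumer_lag > SLA_MAX_LAG[tier]
```

In practice a check like this would consume lag metrics from the monitoring stack and feed the alerting rules described in the playbooks above.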