Singapore, Singapore (Hybrid)

Overview
We are looking for a seasoned Confluent Solution Architect with deep expertise in event streaming platforms to design, implement, and optimize scalable Kafka-based solutions. The role demands strong architectural ownership, hands-on technical leadership, and close collaboration with cross-functional and customer teams. The ideal candidate brings extensive experience with Apache Kafka and the Confluent ecosystem, and a track record of delivering resilient, high-performance streaming architectures aligned with modern cloud and enterprise data strategies.
Key Responsibilities
Design, architect, and implement end-to-end event streaming solutions using Apache Kafka and Confluent Platform/Confluent Cloud.
Act as the technical authority for Kafka-based architectures, advising on best practices, scalability, performance, and reliability.
Collaborate with application, data, platform, and DevOps teams to enable event-driven architectures and real-time data pipelines.
Lead solution design discussions with stakeholders and customers, translating business requirements into robust technical architectures.
Implement and manage Kafka deployments across cloud environments (AWS, GCP, Azure), including multi-cloud setups.
Guide integration using Kafka Connect, Kafka Streams, ksqlDB, Schema Registry, REST Proxy, and Flink-based streaming solutions (see the Kafka Streams sketch after this list).
Ensure governance, security, and compliance through RBAC, audit logs, data lineage, and stream governance practices.
Oversee CI/CD pipelines and containerized deployments (Docker, Kubernetes), including Confluent for Kubernetes (CFK).
Support platform monitoring, observability, and performance tuning using tools such as Prometheus, Grafana, and Splunk.
Mentor engineers and contribute to architectural standards, patterns, and continuous improvement initiatives.
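For a concrete sense of the hands-on side of the role, the sketch below shows a minimal Kafka Streams topology of the kind this position designs, reviews, and tunes. It is illustrative only: the topic names (orders.raw, orders.clean), the broker address, and the transformation are hypothetical placeholders, not details of any actual environment.

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    import java.util.Properties;

    public class OrderEventFilter {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-event-filter"); // names the app and its consumer group
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // assumed broker address
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();

            // Read raw events, drop empty payloads, apply a stand-in transformation, route onward.
            KStream<String, String> orders = builder.stream("orders.raw");        // hypothetical input topic
            orders.filter((key, value) -> value != null && !value.isEmpty())
                  .mapValues(String::toUpperCase)                                 // placeholder for real enrichment
                  .to("orders.clean");                                            // hypothetical output topic

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));     // graceful shutdown
            streams.start();
        }
    }

A small detail worth noting: the shutdown hook closes the topology gracefully so offsets and local state are committed cleanly, exactly the kind of reliability concern this role owns.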
Required Skills
Total experience: 10+ years in software engineering, data engineering, or platform architecture roles.
Relevant experience: 5+ years of hands-on work with Apache Kafka (open source or managed distributions such as Confluent, Amazon MSK, or Cloudera).
Strong programming proficiency in Java, Python, or Scala.
Solid understanding of event-driven architecture, real-time data streaming patterns, and distributed systems (see the consumer sketch after this list).
Experience deploying and operating Kafka on AWS, GCP, or Azure.
Hands-on experience with Docker, Kubernetes, and CI/CD pipelines.
Strong problem-solving skills with excellent verbal and written communication abilities.
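To illustrate the baseline Kafka proficiency expected, here is an idiomatic Java consumer loop with manual offset commits, reading the hypothetical orders.clean topic from the earlier sketch. The broker address, group ID, and topic name are assumptions for illustration.

    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;

    public class OrderEventConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "order-processors");        // consumer group: the unit of scaling
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);           // commit only after processing

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("orders.clean"));                      // hypothetical topic
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        System.out.printf("key=%s value=%s partition=%d offset=%d%n",
                                record.key(), record.value(), record.partition(), record.offset());
                    }
                    consumer.commitSync();                                        // at-least-once delivery
                }
            }
        }
    }

Disabling auto-commit and committing only after records are processed gives at-least-once delivery; candidates should be comfortable weighing this against exactly-once alternatives such as Kafka transactions.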
Preferred / Added Advantage
Proven experience with the Confluent Kafka ecosystem, including Kafka Connect, Kafka Streams, ksqlDB, Schema Registry, REST Proxy, and Confluent Control Center (see the Schema Registry sketch after this list).
Hands-on exposure to Confluent Cloud services, including fully managed ksqlDB and Apache Flink.
Familiarity with Stream Governance, Data Lineage, Stream Catalog, Audit Logs, and RBAC.
Experience with Confluent Platform, managed services, multi-cloud deployments, and Confluent for Kubernetes (CFK).
Knowledge of data mesh architectures, ZooKeeper-to-KRaft migration, and modern event streaming design patterns.
Exposure to monitoring and observability tools such as Prometheus, Grafana, and Splunk.
Experience integrating Kafka with data lakes, data warehouses, or big data ecosystems.
Confluent certifications (Developer, Administrator, Flink Developer) are a strong plus.
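As an illustration of the Schema Registry integration mentioned above, the sketch below produces an Avro record with Confluent's KafkaAvroSerializer. The broker and registry addresses, the topic, and the Order schema are hypothetical placeholders.

    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    import java.util.Properties;

    public class AvroOrderProducer {
        private static final String ORDER_SCHEMA =                                // hypothetical schema
            "{\"type\":\"record\",\"name\":\"Order\",\"fields\":["
            + "{\"name\":\"orderId\",\"type\":\"string\"},"
            + "{\"name\":\"amount\",\"type\":\"double\"}]}";

        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                      "io.confluent.kafka.serializers.KafkaAvroSerializer");      // Confluent Avro serializer
            props.put("schema.registry.url", "http://localhost:8081");            // assumed registry address
            props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, true);            // safe retries, no duplicates

            Schema schema = new Schema.Parser().parse(ORDER_SCHEMA);
            GenericRecord order = new GenericData.Record(schema);
            order.put("orderId", "o-1001");
            order.put("amount", 42.50);

            try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
                // The serializer looks up (or registers) the schema in Schema Registry
                // and embeds its ID in each message so consumers can resolve it.
                producer.send(new ProducerRecord<>("orders.avro", "o-1001", order)); // hypothetical topic
                producer.flush();
            }
        }
    }

The serializer registers the schema with Schema Registry (subject to the configured compatibility rules) and embeds the schema ID in each message, which is what enables centrally governed, evolvable data contracts.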
Personal Attributes
Strong analytical and problem-solving capabilities.
Strong initiative, adaptability, and ownership mindset.
Highly customer-oriented, with a focus on quality and delivery excellence.