
We are looking for an experienced Confluent Consulting Engineer to design, develop, and maintain real-time data streaming solutions using Apache Kafka and Confluent technologies. The ideal candidate will have a solid background in distributed systems, event-driven architectures, and cloud-native deployments, and will work closely with cross-functional teams to deliver scalable, high-performance streaming systems.
Job Responsibilities:
- Design and implement real-time data pipelines and event-driven architectures using Apache Kafka or Confluent Platform.
- Develop and maintain Kafka producers, consumers, and streaming applications in Java, Python, or Scala (a minimal producer sketch follows this list).
- Integrate Kafka with various data sources and sinks using Kafka Connect and related connectors.
- Deploy and manage Kafka clusters on cloud platforms (AWS, GCP, Azure) or in on-premises environments.
- Ensure scalability, reliability, and performance of streaming applications.
- Collaborate with DevOps teams to build and maintain CI/CD pipelines and containerized deployments using Docker and Kubernetes.
- Monitor and troubleshoot Kafka infrastructure using tools like Prometheus, Grafana, or Splunk.
- Provide technical guidance and best practices for event streaming and Confluent ecosystem adoption.
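To give a concrete sense of the day-to-day work, below is a minimal sketch of the kind of producer code this role involves, using the standard Apache Kafka Java client. The broker address (localhost:9092), topic name (orders), and payload are illustrative placeholders, not details of any actual engagement.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class OrderProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker address and topic name are placeholders for illustration.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Durability settings typical of production pipelines.
        props.put(ProducerConfig.ACKS_CONFIG, "all");
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record =
                new ProducerRecord<>("orders", "order-123", "{\"amount\": 42.50}");
            // Asynchronous send; the callback surfaces broker-side errors.
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.printf("Written to %s-%d@%d%n",
                        metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
            producer.flush();
        }
    }
}
```

Equivalent client libraries exist for Python and Scala; the role accepts any of the three.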
Job Requirements:
- Minimum 5 years of hands-on experience with Apache Kafka (open source or distributions such as Confluent Platform, Cloudera, or Amazon MSK).
- Strong proficiency in Java, Python, or Scala.
- Deep understanding of event-driven architecture and streaming data patterns (see the Kafka Streams sketch after this list).
- Experience with cloud platforms (AWS, GCP, or Azure).
- Familiarity with Docker, Kubernetes, and CI/CD pipelines.
- Excellent analytical, problem-solving, and communication skills.
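As a rough illustration of the streaming patterns referenced above, here is a minimal Kafka Streams topology that counts events per key. The application id, broker address, and topic names are assumptions made for the example only.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

public class PaymentCountApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Application id and broker address are hypothetical placeholders.
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "payment-count-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Read an input topic, count events per key, and emit running totals.
        KStream<String, String> payments = builder.stream("payments");
        KTable<String, Long> counts = payments
            .groupByKey()
            .count();
        counts.toStream().to("payment-counts", Produced.with(Serdes.String(), Serdes.Long()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        // Close the topology cleanly on JVM shutdown.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```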
Preferred/Desired Skills:
- Experience with Confluent Platform and its ecosystem (Kafka Streams, Kafka Connect, Schema Registry, ksqlDB/KSQL, REST Proxy, Control Center); a Schema Registry sketch follows this list.
- Hands-on experience with Confluent Cloud services and Apache Flink.
- Knowledge of Stream Governance, Data Lineage, Stream Catalog, RBAC, and related components.
- Confluent certifications (Developer, Administrator, or Flink Developer) are a plus.
- Experience with multi-cloud deployments, Confluent for Kubernetes, or data mesh architectures.
- Exposure to monitoring tools (Prometheus, Grafana, Splunk) and big data technologies (data lakes, data warehouses).
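To illustrate the Schema Registry integration listed among the preferred skills, here is a minimal sketch of an Avro producer using Confluent's KafkaAvroSerializer. The Schema Registry URL, topic name, and schema are assumptions for the example only.

```java
import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class AvroOrderProducer {
    // A hypothetical Avro schema, inlined for the sake of a self-contained example.
    private static final String ORDER_SCHEMA =
        "{\"type\":\"record\",\"name\":\"Order\",\"fields\":["
        + "{\"name\":\"id\",\"type\":\"string\"},"
        + "{\"name\":\"amount\",\"type\":\"double\"}]}";

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // KafkaAvroSerializer registers the schema with Schema Registry on first use.
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
            "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081"); // placeholder URL

        Schema schema = new Schema.Parser().parse(ORDER_SCHEMA);
        GenericRecord order = new GenericData.Record(schema);
        order.put("id", "order-123");
        order.put("amount", 42.50);

        try (KafkaProducer<String, Object> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders", "order-123", order));
            producer.flush();
        }
    }
}
```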