
Kafka Consultant

Infosys · Charlotte, NC (via LinkedIn)
Not Applicable · Posted April 2, 2026
Requirements
  • At least 11 years of relevant Information Technology experience.
  • Infosys is unable to provide immigration sponsorship for this role at this time.
  • This position may require travel to project/client location.
  • At least 8 years of experience with Apache Kafka and Confluent Kafka.
  • Experience in defining architectural standards and best practices for streaming data pipelines.
  • Experience with deploying, configuring, and maintaining Kafka clusters in production environments.
  • Experience in ensuring high availability, fault tolerance, and scalability of Kafka infrastructure.
  • Experience in integrating Kafka with other enterprise systems (e.g., databases, cloud services, microservices).
  • Experience in ensuring data consistency, integrity, and quality across streaming pipelines.
  • Experience in defining message schemas, data models, and serialization formats (e.g., Avro, Protobuf).
  • Experience implementing authentication, authorization, and encryption mechanisms.
  • Experience in setting up monitoring tools (e.g., Prometheus, Grafana, Splunk) for Kafka performance and health.
  • Understanding of Confluent Kafka admin activities such as cluster setup, topic management, security & access control, monitoring & logging, and consumer group management.
  • The job may entail extensive travel.
Preferred Skills
  • At least 4 years of experience with streaming technologies (e.g., Flink, Spark Streaming) and integration technologies (e.g., REST, SOAP, MQ).
  • Deep understanding of Kafka internals, architecture, and operational best practices.
  • Experience with cloud platforms (such as AWS, Azure, GCP) and container orchestration (Kubernetes)
  • Strong background in Java, Spring Boot, .NET, and microservices architecture, with hands-on experience in CI/CD tools and Infrastructure as Code (Terraform, Ansible).
  • Experience in leading the design and implementation of enterprise-grade Kafka platforms using Confluent Kafka.
  • Experience in collaborating with business and technical stakeholders to align Kafka solutions with enterprise goals.
  • The job may also entail sitting and working at a computer for extended periods of time.
Education
  • (Required) – Bachelor’s degree or foreign equivalent required from an accredited institution.
  • (Not required) – Will also consider three years of progressive experience in the specialty in lieu of every year of education.