We are seeking an experienced Kafka Streaming Platform Engineer to design, develop, and manage real-time data streaming solutions using Apache Kafka. This role is essential in building and optimizing data pipelines, ensuring system scalability, and integrating Kafka with various systems. The engineer will play a key role in maintaining and optimizing Kafka clusters, working with Kafka Connect, Kafka Streams, and other associated technologies.
You will collaborate with cross-functional teams, including data architects, DBAs, and application developers, and leverage your experience with DevOps practices (CI/CD, Docker, Kubernetes) for automated deployments and infrastructure management. Strong problem-solving skills and a solid understanding of event-driven architectures are essential for this role.

Key Responsibilities:

1. Data Pipeline Development
- Design, develop, and maintain data pipelines using Apache Kafka, including real-time streaming applications.
- Build highly scalable data solutions that can handle large volumes of data in real time.

2. Kafka Cluster Management
- Install, configure, monitor, and optimize Kafka clusters, ensuring high availability and efficient resource utilization.
- Manage Kafka Connect and ensure smooth integration with external systems (databases, messaging systems, etc.).

3. Event-Driven Architecture Implementation
- Design and implement event-driven architectures using Kafka Streams, Kafka Connect, KSQL, and other Kafka-related technologies.
- Leverage pub/sub patterns to enable real-time event processing and improve system responsiveness.

4. Integration with Other Systems
- Integrate Kafka with various databases (relational, NoSQL), applications, and other data processing platforms.
- Apply Change Data Capture (CDC) techniques using Kafka Connect, Debezium, or custom connectors to enable real-time data synchronization across platforms.

5. Performance Optimization
- Ensure data ingestion and transformation pipelines are optimized for performance, reliability, and scalability.
- Continuously monitor and improve Kafka cluster performance, minimizing latency and maximizing throughput.

6. Troubleshooting and Monitoring
- Proactively monitor Kafka clusters to identify and resolve performance or availability issues.
- Troubleshoot complex data streaming issues and ensure high uptime and system stability.

7. Code Quality and Best Practices
- Apply software engineering best practices such as object-oriented programming (OOP), test-driven development (TDD), and design patterns to ensure code maintainability and quality.
- Write efficient, clean, and well-documented code to meet project requirements.

8. DevOps Practices
- Use DevOps methodologies (CI/CD, Docker, Kubernetes) for automated deployments and infrastructure management.
- Automate infrastructure provisioning, Kafka cluster scaling, and deployment workflows.

9. Documentation
- Create and maintain technical documentation, including data flow diagrams, design documents, and operational procedures.
- Ensure that all processes and architecture are well documented and easily understood by the team.

10. Collaboration and Stakeholder Communication
- Collaborate with cross-functional teams, including data architects, DBAs, and application developers, to align on project goals and technical solutions.
- Provide technical support and guidance to team members, ensuring high-quality implementations.

Requirements

Required Skills and Experience:
- 8+ years of experience with Apache Kafka, including expertise in Kafka Connect, Kafka Streams, and Kafka brokers.
- Strong proficiency in Java, with experience in object-oriented programming (OOP).
- Shell scripting and Python experience (desirable).
- Solid understanding of event-driven architectures, pub/sub patterns, and real-time data processing.
- Database knowledge: experience with MongoDB, relational databases, or other data storage solutions.
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Solid experience with DevOps practices, including CI/CD, Docker, and Kubernetes.
- Strong troubleshooting and problem-solving skills, with the ability to identify and resolve complex issues in Kafka clusters.
- Schema Registry experience and familiarity with Kafka-related tooling (e.g., Confluent, Schema Registry, Control Center) is a plus.
- Experience with Change Data Capture (CDC), Debezium, or similar tools is highly desirable.