
Kafka Streaming Platform Engineer

Next-Link
Contract
On-site
Amstelveen, Noord-Holland, Netherlands

Job Overview:

We are seeking an experienced Kafka Streaming Platform Engineer to design, develop, and manage real-time data streaming solutions using Apache Kafka. The role is central to building and optimizing data pipelines, ensuring system scalability, and integrating Kafka with a wide range of systems. The engineer will maintain and optimize Kafka clusters and work with Kafka Connect, Kafka Streams, and other associated technologies.

You will collaborate with cross-functional teams, including data architects, DBAs, and application developers, and leverage your experience with DevOps practices (CI/CD, Docker, Kubernetes) for automated deployments and infrastructure management. Strong problem-solving skills and a solid understanding of event-driven architectures are essential for this role.

Key Responsibilities:

1. Data Pipeline Development

• Design, develop, and maintain data pipelines using Apache Kafka, including real-time streaming applications (see the illustrative sketch below).

• Build highly scalable data solutions that can handle large volumes of data in real time.
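
For illustration only, a pipeline of the kind this responsibility describes might begin with a plain Java producer like the minimal sketch below. The broker address, topic name ("orders"), key, and payload are hypothetical placeholders, not details of any existing Next-Link system.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class OrderEventProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Broker address and topic name are illustrative assumptions.
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.ACKS_CONFIG, "all"); // wait for all in-sync replicas

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                ProducerRecord<String, String> record =
                    new ProducerRecord<>("orders", "order-123", "{\"status\":\"CREATED\"}");
                // send() is asynchronous; the callback reports partition/offset or an error.
                producer.send(record, (metadata, exception) -> {
                    if (exception != null) {
                        exception.printStackTrace();
                    } else {
                        System.out.printf("Written to %s-%d@%d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                    }
                });
            }
        }
    }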

2. Kafka Cluster Management

• Install, configure, monitor, and optimize Kafka clusters, ensuring high availability and efficient resource utilization.

• Manage Kafka Connect and ensure smooth integration with external systems (databases, messaging systems, etc.).

3. Event-Driven Architecture Implementation

• Design and implement event-driven architectures using Kafka Streams, Kafka Connect, KSQL, and other Kafka-related technologies (see the sketch below).

• Leverage pub/sub patterns to enable real-time event processing and improve system responsiveness.
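
A minimal Kafka Streams sketch of such an event-driven flow is shown below. The application id, the "payments.raw" and "payments.processed" topics, and the trivial filter/transform are assumptions for illustration; a real topology would encode the actual business logic and serdes.

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class PaymentEnrichmentApp {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "payment-enrichment"); // hypothetical app id
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            // Read raw events, drop heartbeats, apply a stand-in transformation, publish downstream.
            KStream<String, String> payments = builder.stream("payments.raw");
            payments
                .filter((key, value) -> value != null && !value.contains("\"type\":\"heartbeat\""))
                .mapValues(value -> value.toUpperCase()) // placeholder for real enrichment logic
                .to("payments.processed");

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
            streams.start();
        }
    }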

4. Integration with Other Systems

• Integrate Kafka with various databases (relational, NoSQL), applications, and other data processing platforms.

• Apply Change Data Capture (CDC) techniques using Kafka Connect, Debezium, or custom connectors to enable real-time data synchronization across platforms (see the sketch below).
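
As one hedged example of the CDC pattern: a Debezium connector typically publishes change events to per-table topics, which a downstream Java consumer can apply to a target system. The topic name ("dbserver1.inventory.customers"), consumer group id, and broker address below are illustrative assumptions only.

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class CustomerChangeConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "customer-sync"); // hypothetical group id
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                // Debezium emits one topic per captured table; this name is illustrative.
                consumer.subscribe(List.of("dbserver1.inventory.customers"));
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        // Each value is a change event (before/after image); apply it to the target system here.
                        System.out.printf("key=%s change=%s%n", record.key(), record.value());
                    }
                }
            }
        }
    }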

5. Performance Optimization

• Ensure data ingestion and transformation pipelines are optimized for performance, reliability, and scalability.

• Continuously monitor and improve Kafka cluster performance, minimizing latency and maximizing throughput (see the tuning sketch below).
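
As a sketch of the kind of throughput/latency tuning this involves, the snippet below collects a few common producer settings (batching, compression, idempotence). The specific values are assumptions for illustration and would be benchmarked against the actual workload and durability requirements rather than adopted as-is.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.ProducerConfig;

    public class ProducerTuning {
        // Example throughput-oriented producer settings; values are workload-dependent.
        public static Properties tunedProducerProps() {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
            props.put(ProducerConfig.LINGER_MS_CONFIG, "20");            // batch small records for up to 20 ms
            props.put(ProducerConfig.BATCH_SIZE_CONFIG, "65536");         // 64 KiB batches per partition
            props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "lz4");     // trade CPU for network/disk savings
            props.put(ProducerConfig.ACKS_CONFIG, "all");                 // keep durability while tuning throughput
            props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");  // avoid duplicates on retries
            return props;
        }
    }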

6. Troubleshooting and Monitoring

• Proactively monitor Kafka clusters to identify and resolve performance or availability issues.

• Troubleshoot complex data streaming issues and ensure high uptime and system stability.

7. Code Quality and Best Practices

• Apply software engineering best practices such as object-oriented programming (OOP), test-driven development (TDD), and design patterns to ensure code maintainability and quality.

• Write efficient, clean, and well-documented code that meets project requirements.

8. DevOps Practices

• Utilize DevOps methodologies (CI/CD, Docker, Kubernetes) for automated deployments and infrastructure management.

• Automate infrastructure provisioning, Kafka cluster scaling, and deployment workflows.

9. Documentation

• Create and maintain technical documentation, including data flow diagrams, design documents, and operational procedures.

• Ensure that all processes and architecture are well documented and easily understood by the team.

10. Collaboration and Stakeholder Communication

• Collaborate with cross-functional teams, including data architects, DBAs, and application developers, to align on project goals and technical solutions.

• Provide technical support and guidance to team members, ensuring high-quality implementations.



Requirements

Required Skills and Experience:
• 8+ years of experience with Apache Kafka, including expertise in Kafka Connect, Kafka Streams, and Kafka brokers.

• Strong proficiency in Java, with experience in object-oriented programming (OOP).

• Shell scripting and Python experience (desirable).

• Solid understanding of event-driven architectures, pub/sub patterns, and real-time data processing.

• Database knowledge: experience with MongoDB, relational databases, or other data storage solutions.

• Experience with cloud platforms such as AWS, Azure, or GCP.

• Solid experience with DevOps practices, including CI/CD, Docker, and Kubernetes.

• Strong troubleshooting and problem-solving skills, with the ability to identify and resolve complex issues in Kafka clusters.

• Schema Registry experience and familiarity with Kafka-related tooling (e.g., Confluent, Schema Registry, Control Center) is a plus.

• Experience with Change Data Capture (CDC), Debezium, or similar tools is highly desirable.


