Jobvertise
Kafka Architect
Location: US-IL-Deerfield
Jobcode: 6ebc894dc4513775aa670bf1a023ae83-122020

Requirements:
- 3+ years of hands-on Kafka/Confluent/data-streaming development and platform operations experience
- Experience working with Kafka connectors and the producer and consumer APIs
- Strong understanding of Kafka architecture including offset management, partition strategy and DR requirements
- Good understanding of streaming message delivery semantics (at-most-once, at-least-once, exactly-once).
- Good understanding of Spark framework
- Strong understanding of streaming message formats such as Avro.
- Strong understanding of data architecture, data integration, and data quality management. Should be able to create a secure, performance-centric architecture based on client requirements.
- Ability to articulate design standards and patterns
- Application integration experience, including event-driven processing with Kafka, Kafka with Flink, and sink connectors
- Provide expertise in Kafka brokers, ZooKeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Confluent Control Center.
- Ensure optimum performance, high availability and stability of solutions
- Create topics, set up cluster redundancy, deploy monitoring tools and alerts, and apply best practices.
- Provide administration and operations of the Kafka platform, such as provisioning, access control lists (ACLs), and Kerberos and SSL configurations.
- Strong written and oral communication and presentation skills
- Strong analytical and problem solving skills.
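The partition-strategy requirement above comes down to routing records by key: Kafka's default partitioner hashes the record key (murmur2 in the Java client) modulo the partition count, so all records with the same key land on the same partition and stay ordered. A minimal sketch of that idea, using CRC32 as a deterministic stand-in for Kafka's actual hash:

```python
import zlib


def pick_partition(key: bytes, num_partitions: int) -> int:
    """Key-based partitioning sketch: same key -> same partition.

    Kafka's default partitioner uses murmur2; CRC32 here is just a
    simple deterministic stand-in for illustration.
    """
    return zlib.crc32(key) % num_partitions


# All events for one customer map to the same partition,
# preserving their relative order.
p1 = pick_partition(b"customer-42", 6)
p2 = pick_partition(b"customer-42", 6)
assert p1 == p2  # deterministic routing

# Changing the partition count remaps keys -- one reason partition
# strategy and capacity need to be designed up front.
print(pick_partition(b"customer-42", 6), pick_partition(b"customer-42", 12))
```

This is also why repartitioning an existing topic is disruptive: the key-to-partition mapping changes, breaking per-key ordering guarantees for in-flight data.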
Roles & Responsibilities:
- Build Kafka/Confluent data-streaming applications and related operational tooling
- Develop Kafka connectors and applications using the producer and consumer APIs
- Design Kafka architecture including offset management, partition strategy and DR requirements
- Build Spark Streaming applications using Scala/Python
- Communicate and Explain Design and Architecture to the client.
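The delivery-semantics point above hinges on when a consumer commits its offset relative to processing. The following broker-free simulation (plain Python, no Kafka client; a real consumer would use a library such as confluent-kafka) sketches the at-least-once pattern: process first, commit after, so a crash between the two re-delivers the record instead of losing it.

```python
# Broker-free simulation of consumer offset commits -- an illustration
# of at-least-once semantics, not a real Kafka consumer.
log = ["evt-0", "evt-1", "evt-2", "evt-3"]  # one partition's record log
committed = 0    # last committed offset (next record to read on restart)
processed = []


def poll_and_process(crash_at=None):
    """At-least-once: process the record BEFORE committing its offset.

    A crash after processing but before the commit means the record is
    re-delivered (and re-processed) on restart -- duplicates are
    possible, loss is not.
    """
    global committed
    for offset in range(committed, len(log)):
        processed.append(log[offset])   # 1. process the record
        if offset == crash_at:
            return                      # simulated crash: commit never happens
        committed = offset + 1          # 2. commit the offset


poll_and_process(crash_at=1)  # crash after processing evt-1, before committing
poll_and_process()            # restart resumes from the last committed offset
print(processed)              # evt-1 appears twice: at-least-once delivery
```

Flipping the two steps (commit before processing) yields at-most-once: a crash after the commit loses the record instead of duplicating it. Exactly-once delivery layers idempotent producers and transactions on top of this same offset mechanism.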
Varite