Project: KafkaExample
KafkaExample-master
├── demokafka.0.8.2.2
│   ├── src
│   │   ├── main
│   │   │   ├── resources
│   │   │   │   └── log4j.properties
│   │   │   └── java
│   │   │       └── com/jasongj/kafka
│   │   │           ├── DemoLowLevelConsumer.java
│   │   │           ├── DemoHighLevelConsumer.java
│   │   │           ├── RoundRobinPartitioner.java
│   │   │           ├── ProducerDemo.java
│   │   │           └── HashPartitioner.java
│   │   └── test
│   │       └── java
│   │           └── com/jasongj/kafka
│   │               └── AppTest.java
│   └── pom.xml
├── pom.xml
├── LICENSE
├── demokafka.0.10.1.0
│   ├── src
│   │   ├── main
│   │   │   ├── resources
│   │   │   │   ├── orders.csv
│   │   │   │   ├── items.csv
│   │   │   │   ├── log4j.properties
│   │   │   │   └── users.csv
│   │   │   └── java
│   │   │       └── com/jasongj/kafka
│   │   │           ├── stream
│   │   │           │   ├── WordCountProcessor.java
│   │   │           │   ├── serdes
│   │   │           │   │   ├── GenericSerializer.java
│   │   │           │   │   ├── GenericDeserializer.java
│   │   │           │   │   └── SerdesFactory.java
│   │   │           │   ├── WordCountDSL.java
│   │   │           │   ├── PurchaseAnalysis.java
│   │   │           │   ├── timeextractor
│   │   │           │   │   └── OrderTimestampExtractor.java
│   │   │           │   ├── WordCountTopology.java
│   │   │           │   ├── model
│   │   │           │   │   ├── Item.java
│   │   │           │   │   ├── Order.java
│   │   │           │   │   └── User.java
│   │   │           │   └── producer
│   │   │           │       ├── UserProducer.java
│   │   │           │       ├── OrderProducer.java
│   │   │           │       └── ItemProducer.java
│   │   │           ├── consumer
│   │   │           │   ├── DemoConsumerAutoCommit.java
│   │   │           │   ├── DemoConsumerCommitPartition.java
│   │   │           │   ├── DemoConsumerCommitCallback.java
│   │   │           │   ├── DemoConsumerAssign.java
│   │   │           │   ├── DemoConsumerFlowControl.java
│   │   │           │   ├── DemoConsumerManualCommit.java
│   │   │           │   ├── DemoConsumerInterceptor.java
│   │   │           │   └── DemoConsumerRebalance.java
│   │   │           ├── connect
│   │   │           │   ├── ConsoleSourceConnect.java
│   │   │           │   ├── ConsoleSourceTask.java
│   │   │           │   ├── ConsoleSinkTask.java
│   │   │           │   └── ConsoleSinkConnect.java
│   │   │           └── producer
│   │   │               ├── EvenProducerInterceptor.java
│   │   │               ├── ProducerDemo.java
│   │   │               ├── ProducerDemoCallback.java
│   │   │               └── HashPartitioner.java
│   │   └── test
│   │       └── java
│   │           └── com/jasongj/kafka
│   │               └── AppTest.java
│   └── pom.xml
├── README.md
└── .gitignore
Kafka Usage Examples

Kafka 0.8.2.2 examples
Producer examples
HashPartitioner example
Implements a HashPartitioner so that messages with the same key are always sent to the same partition.
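A minimal sketch of such a partitioner against the 0.8 producer API (kafka.producer.Partitioner); this is illustrative, not the project's actual HashPartitioner.java:

```java
import kafka.producer.Partitioner;
import kafka.utils.VerifiableProperties;

// Illustrative sketch: messages whose keys hash to the same value
// always land in the same partition.
public class HashPartitionerSketch implements Partitioner {

    // The 0.8 producer instantiates partitioners reflectively and
    // requires this constructor signature.
    public HashPartitionerSketch(VerifiableProperties props) {
    }

    @Override
    public int partition(Object key, int numPartitions) {
        // Same key -> same hash -> same partition,
        // as long as the partition count does not change.
        return (key.hashCode() & Integer.MAX_VALUE) % numPartitions;
    }
}
```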
RoundRobinPartitioner example
Provides a round-robin message routing algorithm to balance load across partitions.
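A minimal sketch under the same 0.8 API, rotating over partitions with a shared counter (illustrative, not the project's actual class):

```java
import java.util.concurrent.atomic.AtomicLong;

import kafka.producer.Partitioner;
import kafka.utils.VerifiableProperties;

// Illustrative sketch: ignore the key and rotate through partitions,
// spreading messages evenly for load balancing.
public class RoundRobinPartitionerSketch implements Partitioner {

    private static final AtomicLong COUNTER = new AtomicLong();

    public RoundRobinPartitionerSketch(VerifiableProperties props) {
    }

    @Override
    public int partition(Object key, int numPartitions) {
        // Mask keeps the counter non-negative even after overflow.
        return (int) ((COUNTER.getAndIncrement() & Long.MAX_VALUE) % numPartitions);
    }
}
```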
High Level Consumer example
Uses consumer groups from the High Level API so that a message is delivered to only one consumer within a group (unicast) and to every subscribing group (broadcast across groups).
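A minimal sketch of a 0.8 High Level Consumer; the ZooKeeper address, group id, and topic name are placeholders:

```java
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.Properties;

import kafka.consumer.Consumer;
import kafka.consumer.ConsumerConfig;
import kafka.consumer.ConsumerIterator;
import kafka.consumer.KafkaStream;
import kafka.javaapi.consumer.ConsumerConnector;
import kafka.message.MessageAndMetadata;

// Illustrative sketch: consumers sharing the same group.id split the topic's
// partitions between them; consumers in different groups each see every message.
public class HighLevelConsumerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("zookeeper.connect", "localhost:2181"); // placeholder
        props.put("group.id", "demo-group");              // placeholder
        props.put("auto.offset.reset", "smallest");

        ConsumerConnector connector =
                Consumer.createJavaConsumerConnector(new ConsumerConfig(props));

        // One stream for the topic; the connector handles group coordination.
        Map<String, List<KafkaStream<byte[], byte[]>>> streams =
                connector.createMessageStreams(Collections.singletonMap("demo-topic", 1));

        ConsumerIterator<byte[], byte[]> it = streams.get("demo-topic").get(0).iterator();
        while (it.hasNext()) {
            MessageAndMetadata<byte[], byte[]> msg = it.next();
            System.out.printf("partition=%d offset=%d value=%s%n",
                    msg.partition(), msg.offset(), new String(msg.message()));
        }
    }
}
```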
Low Level Consumer example
Uses the Low Level API to get precise control over message consumption.
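A minimal sketch of the 0.8 SimpleConsumer fetch loop; host, port, topic, partition, and starting offset are placeholders, and leader discovery plus error handling are omitted:

```java
import java.nio.ByteBuffer;

import kafka.api.FetchRequest;
import kafka.api.FetchRequestBuilder;
import kafka.javaapi.FetchResponse;
import kafka.javaapi.consumer.SimpleConsumer;
import kafka.javaapi.message.ByteBufferMessageSet;
import kafka.message.MessageAndOffset;

// Illustrative sketch: the application talks directly to the partition leader
// and picks the exact offset to read from, which is what gives precise control.
public class LowLevelConsumerSketch {
    public static void main(String[] args) {
        SimpleConsumer consumer =
                new SimpleConsumer("localhost", 9092, 100000, 64 * 1024, "demo-low-level");

        FetchRequest request = new FetchRequestBuilder()
                .clientId("demo-low-level")
                .addFetch("demo-topic", 0, 0L, 100000) // topic, partition, start offset, max bytes
                .build();

        FetchResponse response = consumer.fetch(request);
        ByteBufferMessageSet messageSet = response.messageSet("demo-topic", 0);
        for (MessageAndOffset messageAndOffset : messageSet) {
            ByteBuffer payload = messageAndOffset.message().payload();
            byte[] bytes = new byte[payload.limit()];
            payload.get(bytes);
            System.out.println(messageAndOffset.offset() + ": " + new String(bytes));
        }
        consumer.close();
    }
}
```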
Kafka 0.10.1.0 examples
Producer example
The producer supports a send callback.
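A minimal sketch of a send with callback against the new producer API; broker address, topic, and key/value are placeholders:

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

// Illustrative sketch: the callback fires once the broker acknowledges
// (or rejects) the record.
public class ProducerCallbackSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("demo-topic", "key-1", "hello"),
                    (metadata, exception) -> {
                        if (exception != null) {
                            exception.printStackTrace();
                        } else {
                            System.out.printf("sent to partition %d at offset %d%n",
                                    metadata.partition(), metadata.offset());
                        }
                    });
        }
    }
}
```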
Partitioner example
The Partitioner interface differs from the one in older versions and supports richer message routing and distribution semantics.
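A minimal sketch of the new interface (org.apache.kafka.clients.producer.Partitioner): it sees the topic, the serialized key and value bytes, and cluster metadata, so routing can use much richer information. The hashing scheme here is illustrative, not the project's:

```java
import java.util.Arrays;
import java.util.Map;

import org.apache.kafka.clients.producer.Partitioner;
import org.apache.kafka.common.Cluster;

// Illustrative sketch: hash the serialized key bytes over the live partition count.
public class NewPartitionerSketch implements Partitioner {

    @Override
    public void configure(Map<String, ?> configs) {
    }

    @Override
    public int partition(String topic, Object key, byte[] keyBytes,
                         Object value, byte[] valueBytes, Cluster cluster) {
        int numPartitions = cluster.partitionsForTopic(topic).size();
        if (keyBytes == null) {
            return 0; // no key: fall back to a fixed partition in this sketch
        }
        return (Arrays.hashCode(keyBytes) & Integer.MAX_VALUE) % numPartitions;
    }

    @Override
    public void close() {
    }
}
```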
Consumer example
In Kafka 0.10 the new Consumer uses a single API that covers both the High Level API and the Low Level API of 0.8 and earlier versions.
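A minimal sketch showing both styles with the new consumer: subscribe() gives the group-managed behaviour of the old High Level API, while assign()/seek() (commented out) give the manual partition and offset control of the old Low Level API. Broker address, group id, and topic are placeholders:

```java
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class NewConsumerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder
        props.put("group.id", "demo-group");              // placeholder
        props.put("enable.auto.commit", "false");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Group-managed subscription: partitions are assigned and rebalanced automatically.
            consumer.subscribe(Collections.singletonList("demo-topic"));

            // Manual control instead (old Low Level style):
            // consumer.assign(Collections.singletonList(new TopicPartition("demo-topic", 0)));
            // consumer.seek(new TopicPartition("demo-topic", 0), 42L);

            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(100);
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("partition=%d offset=%d value=%s%n",
                            record.partition(), record.offset(), record.value());
                }
                consumer.commitSync(); // manual offset commit
            }
        }
    }
}
```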
Stream Low Level Processor API examples
Stream Topology example
Implements word count using the low-level Processor API of Kafka Streams.
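A minimal sketch of a word-count Processor against the 0.10.1-era Processor API (schedule/punctuate were reworked in later releases); the "Counts" store name is a placeholder and the TopologyBuilder wiring of source, store, and sink is omitted:

```java
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.processor.Processor;
import org.apache.kafka.streams.processor.ProcessorContext;
import org.apache.kafka.streams.state.KeyValueIterator;
import org.apache.kafka.streams.state.KeyValueStore;

// Illustrative sketch: count words in a local state store and forward the
// running counts downstream on every punctuation.
public class WordCountProcessorSketch implements Processor<String, String> {

    private ProcessorContext context;
    private KeyValueStore<String, Integer> store;

    @Override
    @SuppressWarnings("unchecked")
    public void init(ProcessorContext context) {
        this.context = context;
        this.context.schedule(1000); // punctuate roughly every second
        this.store = (KeyValueStore<String, Integer>) context.getStateStore("Counts");
    }

    @Override
    public void process(String key, String line) {
        for (String word : line.toLowerCase().split("\\W+")) {
            Integer old = store.get(word);
            store.put(word, old == null ? 1 : old + 1);
        }
    }

    @Override
    public void punctuate(long timestamp) {
        try (KeyValueIterator<String, Integer> it = store.all()) {
            while (it.hasNext()) {
                KeyValue<String, Integer> entry = it.next();
                context.forward(entry.key, entry.value.toString());
            }
        }
        context.commit();
    }

    @Override
    public void close() {
    }
}
```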
Stream DSL example
Implements word count using the Kafka Streams DSL.
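A minimal sketch of the DSL version against the 0.10.1-era API (KStreamBuilder and count(storeName); newer releases use StreamsBuilder instead); broker address, topic names, and the store name are placeholders:

```java
import java.util.Arrays;
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KStreamBuilder;
import org.apache.kafka.streams.kstream.KTable;

// Illustrative sketch: split each line into words, re-key by word, count,
// and write the running counts to an output topic.
public class WordCountDSLSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-dsl");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
        props.put(StreamsConfig.VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());

        KStreamBuilder builder = new KStreamBuilder();
        KStream<String, String> lines = builder.stream("words-input"); // placeholder topic

        KTable<String, Long> counts = lines
                .flatMapValues(line -> Arrays.asList(line.toLowerCase().split("\\W+")))
                .map((key, word) -> new KeyValue<>(word, word))
                .groupByKey()
                .count("Counts");                                      // placeholder store name

        counts.to(Serdes.String(), Serdes.Long(), "words-output");     // placeholder topic

        KafkaStreams streams = new KafkaStreams(builder, props);
        streams.start();
    }
}
```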
Purchase Analysis
Shows how to join a KStream with a KTable, how to create custom Serializer/Deserializer and Serde implementations, and how to use Kafka Streams transform and window operations.
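A compact sketch of the KStream-KTable join against the 0.10.1-era DSL; topic names, String values, and serdes are placeholders rather than the project's Order/Item/User models, and the custom serdes, transform, and window steps are only noted in comments:

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KStreamBuilder;
import org.apache.kafka.streams.kstream.KTable;

// Illustrative sketch: enrich an order stream with the latest user record by
// joining a KStream with a KTable on the record key.
public class PurchaseAnalysisSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "purchase-analysis-sketch");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder

        KStreamBuilder builder = new KStreamBuilder();

        // KTable: changelog view keeping the latest value per key.
        KTable<String, String> users =
                builder.table(Serdes.String(), Serdes.String(), "users", "users-store");
        // KStream: every order event, keyed by the same key as the table.
        KStream<String, String> orders =
                builder.stream(Serdes.String(), Serdes.String(), "orders");

        // KStream-KTable join: each order is paired with the current user record.
        KStream<String, String> enriched =
                orders.join(users, (order, user) -> order + " | " + user);

        // The project's PurchaseAnalysis additionally plugs in its own
        // Serializer/Deserializer/Serde classes for the model objects and uses
        // transform() and windowed aggregations; those steps are omitted here.
        enriched.to(Serdes.String(), Serdes.String(), "orders-enriched"); // placeholder topic

        new KafkaStreams(builder, props).start();
    }
}
```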