kafka-connect-zeebe

This Kafka Connect connector for Zeebe can do two things:

- Source: activate Zeebe jobs and publish them as records on a Kafka topic
- Sink: consume records from a Kafka topic and correlate them as messages to workflows in Zeebe

(figure: Overview)

See this blog post for some background on the implementation.

Examples

See the Examples directory.

Installation and quickstart

You will find information on how to build the connector and how to run Kafka and Zeebe to get started quickly here:

Installation

Connectors

The plugin comes with two connectors, a source and a sink connector.

The source connector activates Zeebe jobs, publishes them as Kafka records, and completes them once they have been committed to Kafka. The sink connector consumes Kafka records and publishes messages constructed from those records to Zeebe.

Sink connector

In a workflow model you can wait for certain events by name (extracted from the payload by messageNameJsonPath):

(figure: Overview)

The sink connector consumes Kafka records and publishes messages constructed from those records to Zeebe. This uses the Zeebe Message Correlation features: if, for example, no matching workflow instance is found, the message is buffered for its time-to-live (TTL) and then discarded. You can therefore simply ingest all records from a Kafka topic and check whether they correlate to something in Zeebe.

Configuration

In order to communicate with Zeebe, the connector has to create a Zeebe client, which is configured through the standard Zeebe client properties (such as the broker contact point and the request timeout).

For client and job worker configuration, we reuse the same system properties that Zeebe itself uses, so if you already have a properties file for those, it should work as-is.

The connector does not yet support schemas and currently expects all records to be JSON. Therefore, in order to properly construct a message, we use JSON paths to extract properties from the Kafka record. These paths are all configurable per connector.
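
As an illustration, a minimal sink configuration might look like the sketch below. Treat the connector class and the message.path.* keys as assumptions based on this project's conventions, and verify them against the sample properties file linked below.

```properties
name=zeebe-sink
connector.class=io.zeebe.kafka.connect.ZeebeSinkConnector
tasks.max=1
topics=payments

# Standard Zeebe client properties, reused as-is
zeebe.client.broker.contactPoint=localhost:26500
zeebe.client.requestTimeout=10000

# JSON paths used to build the Zeebe message from the record value
# (key names are illustrative; see the sample sink properties)
message.path.messageName=$.eventType
message.path.correlationKey=$.orderId
message.path.variables=$.variables
message.path.timeToLive=$.ttl
```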

You can find sample properties for the sink connector here.

Source connector

Similar to receiving messages, in a workflow model you can also throw messages (i.e. the message throw event). Zeebe does not yet support this BPMN element; however, we can still communicate with external systems through Kafka by using service tasks.

In a workflow you can then add a service task with a configurable task type, which will create a record on the configured Kafka topic:

(figure: Overview)

Under the hood, the connector will create one job worker per configured task type, consume their jobs, and publish records to Kafka based on those. As we do not yet support schemas, the record values are a JSON representation of the job itself, and the record keys are the job key.
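
As a rough sketch, a record value could look like the JSON below; the exact field set is an assumption (the names mirror common Zeebe job attributes), so inspect a real record to confirm.

```json
{
  "key": 22517998136,
  "type": "sendMessage",
  "customHeaders": { "topic": "payments" },
  "workflowInstanceKey": 22517998120,
  "variables": { "orderId": "order-123", "eventType": "OrderPaid" }
}
```

The corresponding record key would then be the numeric job key, 22517998136 in this example.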

Configuration

In order to communicate with Zeebe, the connector has to create a Zeebe client, which is configured through the standard Zeebe client properties (such as the broker contact point and the request timeout).

For client and job worker configuration, we reuse the same system properties that Zeebe itself uses, so if you already have a properties file for those, it should work as-is.
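
For illustration only, a source configuration could look like the following sketch; the connector class and the job.* keys are assumptions to double-check against the sample properties file linked below.

```properties
name=zeebe-source
connector.class=io.zeebe.kafka.connect.ZeebeSourceConnector
tasks.max=1

# Standard Zeebe client / job worker properties, reused as-is
zeebe.client.broker.contactPoint=localhost:26500
zeebe.client.worker.maxJobsActive=32

# One job worker is created per listed job type (illustrative values)
job.types=sendMessage
# Name of the custom job header that holds the target Kafka topic (assumption)
job.header.topics=kafka-topic
```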

You can find sample properties for the source connector here.

Filtering Variables

You can filter the variables sent to Kafka by adding a custom header with the key "job.variables" to the "sendMessage" task.

Set the value of this header to a comma-separated list of the variable names to pass to Kafka.

If this custom header is not present, then all variables in the scope will be sent to Kafka by default.

(figure: Filter Variables)
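
For example, to forward only two variables (the names below are made up), the custom header on the task would carry:

```properties
# Custom header key and value on the task; only these variables reach Kafka
job.variables=orderId, customerId
```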

Configuring Error Handling in Kafka Connect (e.g. Logging or Dead Letter Queues)

Kafka Connect allows you to configure what happens if a message cannot be processed. A great explanation can be found in Kafka Connect Deep Dive – Error Handling and Dead Letter Queues. This of course also applies to this connector.
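
For instance, a sink connector can tolerate failing records, log them, and route them to a dead letter queue using Kafka Connect's standard error-handling properties (the topic name below is just an example):

```properties
errors.tolerance=all
errors.log.enable=true
errors.log.include.messages=true
# Dead letter queues are supported for sink connectors only
errors.deadletterqueue.topic.name=zeebe-sink-dlq
errors.deadletterqueue.context.headers.enable=true
```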

Confluent Hub

This project is set up to be released on Confluent Hub.

When