Azure Kafka Batch through Eventhubs with Azure Schema Registry Source
Provided by: "Apache Software Foundation"
Support Level for this Kamelet is: "Preview"
Receive data in batches from Kafka topics on Azure Eventhubs, deserialized through Azure Schema Registry, and commit the offsets either automatically or manually through KafkaManualCommit.
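For illustration, here is a minimal sketch of a route that opts into manual commits. The parameter keys (autoCommitEnable, allowManualCommit) and the commitHandler bean are assumptions for this sketch, not definitions from this page: the bean stands for user code that calls commit() on the KafkaManualCommit object the Kafka consumer exposes on the exchange.

- route:
    from:
      uri: "kamelet:kafka-batch-azure-schema-registry-source"
      parameters:
        # assumed parameter keys; see the configuration table below
        autoCommitEnable: false
        allowManualCommit: true
      steps:
        - to:
            uri: "kamelet:log-sink"
        # hypothetical user-supplied bean that invokes commit() on the
        # KafkaManualCommit object carried by the exchange
        - to:
            uri: "bean:commitHandler"

If auto commit is left at its default (true), offsets are committed periodically and no manual commit step is needed.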
Configuration Options
The following table summarizes the configuration options available for the kafka-batch-azure-schema-registry-source Kamelet; a configuration sketch follows the table:
| Name | Description | Type | Default | Example |
|---|---|---|---|---|
| Azure Schema Registry URL | Required The Azure Schema Registry URL. | string | | |
| Bootstrap Servers | Required Comma-separated list of Kafka broker URLs. | string | | |
| Password | Required Password to authenticate to Kafka. | string | | |
| Topic Names | Required Comma-separated list of Kafka topic names. | string | | |
| Allow Manual Commit | Whether to allow doing manual commits. | boolean | false | |
| Auto Commit Enable | If true, periodically commit the offset of messages already fetched by the consumer. | boolean | true | |
| Auto Offset Reset | What to do when there is no initial offset. The value can be one of: latest, earliest, none. | string | latest | |
| Batch Dimension | The maximum number of records returned in a single call to poll(). | int | 500 | |
| Consumer Group | A string that uniquely identifies the group of consumers to which this source belongs. | string | | my-group-id |
| Automatically Deserialize Headers | When enabled, the Kamelet source deserializes all message headers to their String representation. | boolean | true | |
| Max Poll Interval | The maximum delay, in milliseconds, between invocations of poll() when using consumer group management. | int | | |
| Poll On Error Behavior | What to do if Kafka threw an exception while polling for new messages. The value can be one of: DISCARD, ERROR_HANDLER, RECONNECT, RETRY, STOP. | string | ERROR_HANDLER | |
| Poll Timeout Interval | The timeout, in milliseconds, used when polling the KafkaConsumer. | int | 5000 | |
| SASL Mechanism | The Simple Authentication and Security Layer (SASL) mechanism used. | string | PLAIN | |
| Security Protocol | Protocol used to communicate with brokers. SASL_PLAINTEXT, PLAINTEXT, SASL_SSL and SSL are supported. | string | SASL_SSL | |
| Specific Avro Value Type | The specific Avro type the value deserializer has to deal with. | string | | com.example.Order |
| Topic Is Pattern | Whether the topic is a pattern (regular expression). This can be used to subscribe to a dynamic number of topics matching the pattern. | boolean | false | |
| Value Deserializer | Deserializer class for value that implements the Deserializer interface. | string | com.microsoft.azure.schemaregistry.kafka.avro.KafkaAvroDeserializer | |
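The table shows display names only; the underlying property keys are not listed on this page. The following is a minimal configuration sketch, assuming the property keys azureRegistryUrl, bootstrapServers, password, topic, and specificAvroValueType (inferred from the display names) and using placeholder values throughout.

- route:
    from:
      uri: "kamelet:kafka-batch-azure-schema-registry-source"
      parameters:
        # all keys below are assumed from the display names in the table above;
        # all values are placeholders for your Event Hubs namespace and topic
        azureRegistryUrl: "my-namespace.servicebus.windows.net"
        bootstrapServers: "my-namespace.servicebus.windows.net:9093"
        password: "{{eventhubs.connection.string}}"
        topic: "my-topic"
        specificAvroValueType: "com.example.Order"
      steps:
        - to:
            uri: "kamelet:log-sink"

Here com.example.Order mirrors the example in the table; in practice it would be the generated Avro class for your schema.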
Dependencies
At runtime, the kafka-batch-azure-schema-registry-source Kamelet relies upon the presence of the following dependencies:
- camel:kafka
- camel:core
- camel:kamelet
- camel:azure-schema-registry
- mvn:com.microsoft.azure:azure-schemaregistry-kafka-avro:1.1.1
- mvn:com.azure:azure-data-schemaregistry-apacheavro:1.1.20
- mvn:com.azure:azure-identity:1.13.2
Camel JBang usage
Prerequisites
- You've installed JBang.
- You have executed the following command:

jbang app install camel@apache/camel
Suppose you have a file named route.yaml with the following content:
- route:
    from:
      uri: "kamelet:kafka-batch-azure-schema-registry-source"
      parameters:
        .
        .
        .
      steps:
        - to:
            uri: "kamelet:log-sink"
You can now run it directly with the following command:
camel run route.yaml