Kafka Streaming Output and Integration

George Alpizar

Overview

The Kafka output streams analytics and insights from Edge Delta to your Kafka endpoint.


Review Parameters

Review the following parameters that you can configure in the Edge Delta App:

name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

This parameter is required. 

Review the following example:

name: kafka
integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

This parameter is optional. 

Review the following example:

integration_name: orgs-kafka
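
The multiple-instance case described above might look like the following sketch. The instance names, topics, and the orgs-kafka integration name are hypothetical, and whether per-instance fields such as topic can be set alongside integration_name is an assumption:

```yaml
# Two outputs reuse the same organization-level integration (hypothetical names).
# Workflows refer to each instance by its unique "name" value.
- name: kafka-logs
  type: kafka
  integration_name: orgs-kafka
  topic: example_logs_topic      # hypothetical per-instance topic
- name: kafka-metrics
  type: kafka
  integration_name: orgs-kafka
  topic: example_metrics_topic   # hypothetical per-instance topic
```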
type

Enter kafka.

This parameter is required. 

Review the following example:

type: kafka
endpoint

Enter your Kafka broker address.

This parameter is required. 

Review the following example:

endpoint: localhost:2888,localhost:3888
topic

Enter your Kafka topic name.

This parameter is required. 

Review the following example:

topic: example_kafka_topic
required_acks

Enter the number of acknowledgments that the leader must receive before considering a request to be complete.

This parameter is optional. 

Review the following example:

required_acks: 10
batch_size

Enter the maximum number of messages to buffer before being sent to a partition.

The default limit is 100 messages. 

This parameter is optional. 

Review the following example:

batch_size: 1000
batch_bytes

Enter a limit (in bytes) for the maximum size of a request before being sent to a partition. 

The default value is 1048576.

This parameter is optional. 

Review the following example:

batch_bytes: 10000
batch_timeout

Enter a time limit for how often incomplete message batches will be flushed to Kafka.

This parameter is optional. 

Review the following example:

batch_timeout: 1m
async

Enter true or false to enable or disable asynchronous communication between Edge Delta and Kafka.

This parameter is optional. 

Review the following example:

async: true

tls:

  disable_verify 

To disable TLS certificate verification, in the YAML file, enter disable_verify: true.

To enable TLS certificate verification, in the YAML file, you can enter disable_verify: false, or you can remove this line entirely. 

This parameter is optional. 

Review the following example:

tls:
  disable_verify: true
ca_file

Enter the absolute file path to the CA certificate file. 

This parameter is optional. 

Review the following example:

tls:
  ca_file: /var/etc/kafka/ca_file
ca_path

Enter the absolute path to the directory to scan for the CA certificate file.

This parameter is optional. 

Review the following example:

tls:
  ca_path: /var/etc/kafka
crt_file

Enter the absolute path to the certificate file. 

This parameter is optional. 

Review the following example:

tls:
  crt_file: /var/etc/kafka/crt_file
key_file

Enter the absolute path to the private key file. 

This parameter is optional. 

Review the following example:

tls:
  key_file: /var/etc/kafka/keyfile
key_password

Enter the password for the key file.

This parameter is optional. 

Review the following example:

tls:
  key_password: p@ssword123
client_auth_type

Enter a client authentication type. 

You can enter:

  • noclientcert
  • requestclientcert
  • requireanyclientcert
  • verifyclientcertifgiven
  • requireandverifyclientcert 

The default setting is noclientcert.

This parameter is optional. 

Review the following example:

tls:
  client_auth_type: noclientcert

tls: 

  min_version

Enter the minimum version of TLS to accept. 

This parameter is optional. 

Review the following example:

tls:
  min_version: TLSv1_1

tls: 

  max_version

Enter the maximum version of TLS to accept. 

This parameter is optional. 

Review the following example:

tls:
  max_version: TLSv1_3

sasl:

  username

Enter your Kafka SASL username.

This parameter is optional. 

Review the following example:

sasl:
  username: kafka_username

sasl:

  password

Enter your Kafka SASL password.

This parameter is optional. 

Review the following example:

sasl:
  password: p@ssword123

sasl:

  mechanism

Enter the Kafka SASL mechanism to use for authentication.

You can enter: 

  • PLAIN
  • SCRAM-SHA-256
  • SCRAM-SHA-512

This parameter is optional. 

Review the following example:

sasl:
  mechanism: PLAIN
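
Taken together, the three SASL parameters above form one block. The following sketch shows SCRAM-based authentication; the username and password values are placeholders:

```yaml
sasl:
  username: kafka_username
  password: p@ssword123
  mechanism: SCRAM-SHA-512
```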
features

This parameter defines which data types to stream to the destination.

If you do not provide a value, then all data types will be streamed.

To learn more, review the Review Feature Types section in Stream Outputs and Integrations Overview.

This parameter is optional. 

Review the following example:

features: log,metric
buffer_ttl

Enter a length of time to retry streaming data that failed to be delivered.

After this length of time is reached, the failed streaming data will no longer be retried.

This parameter is optional. 

Review the following example: 

buffer_ttl: 2h
buffer_path

Enter a folder path to temporarily store failed streaming data.

The failed streaming data will be retried until the data reaches its destinations or until the Buffer TTL value is reached.

If you enter a path that does not exist, then the agent will create directories, as needed.

This parameter is optional.

Review the following example:

buffer_path: /var/log/edgedelta/pushbuffer/
buffer_max_bytesize

Enter the maximum size of failed streaming data that you want to retry.

If the failed streaming data is larger than this size, then the failed streaming data will not be retried.

This parameter is optional.

Review the following example:

buffer_max_bytesize: 100MB
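
Taken together, the three buffer parameters above describe a single retry policy. The following sketch uses the illustrative values from the examples:

```yaml
# Failed streaming data is written under buffer_path and retried
# until delivery succeeds, until 2 hours pass (buffer_ttl), or
# unless an item exceeds 100MB (buffer_max_bytesize).
buffer_ttl: 2h
buffer_path: /var/log/edgedelta/pushbuffer/
buffer_max_bytesize: 100MB
```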

Review Sample Configuration

The following sample configuration displays an output without the name of the organization-level integration:

    - name: kafka
      type: kafka
      endpoint: localhost:2888,localhost:3888 # brokers
      topic: example_kafka_topic
      required_acks: 10
      batch_size: 1000
      batch_bytes: 10000
      batch_timeout: 1m
      async: true
      features: log,metric
      tls:
        disable_verify: true
        ca_file: /var/etc/kafka/ca_file
        ca_path: /var/etc/kafka
        crt_file: /var/etc/kafka/crt_file
        key_file: /var/etc/kafka/keyfile
        key_password: p@ssword123
        client_auth_type: noclientcert 
      sasl:
        username: kafka_username
        password: p@ssword123
        mechanism: PLAIN 
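
As a counterpart to the sample above, an output that references an organization-level integration might reduce to the following sketch; orgs-kafka is a hypothetical integration name, and the remaining fields are assumed to be populated from the integration:

```yaml
    - name: kafka
      type: kafka
      integration_name: orgs-kafka  # hypothetical organization-level integration
```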

 
