Review Parameters for Stream Outputs and Integrations

George Alpizar

Overview

You can use this document to better understand the parameters for stream outputs and integrations.

In the Edge Delta App, when you create an integration or an individual output, similar parameters will display. As a result, this document applies to both outputs and integrations.

Note

To learn how to create an output or integration, see Create and Manage Outputs and Integrations.


Review Supported Stream Destinations

Edge Delta supports the following stream destinations:


Splunk

The Splunk output will stream analytics and insights to a Splunk HEC endpoint.

Before you begin

To create an output, you must have available a Splunk HEC token and HEC endpoint.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable type Enter splunk. Required
Endpoint endpoint

Enter the full Splunk HEC URI for this integration.

To send data in JSON format, you must point this endpoint to Splunk's raw collector API (services/collector/raw).

Review the following example:

endpoint: ..../services/collector/raw
Required
Token token Enter the Splunk HEC token for this integration. Required
Index index Enter the Splunk index for this integration.  Optional
Features features

This parameter defines which data types to stream to the destination. 

If you do not provide a value, then metric, edac, cluster will be set.

To learn more, see Review Feature Types.

Optional
Not applicable disable_verify

To disable TLS certificate verification:

  • In the YAML file, enter disable_verify: true.

To enable TLS certificate verification:

  • In the YAML file, enter disable_verify: false, or remove this line entirely.
Optional

Not applicable

custom_tags

This parameter defines key-value pairs that are streamed with every request.

Optional

The following example displays an output without the name of the organization-level integration:

    - name: my-splunk
      type: splunk
      endpoint: "<protocol>://<host>:<port>/<endpoint>"
      token: "32-character GUID token"
      custom_tags:
        "app": "test"
        "region": "us-west-2"
        "File Path": "{{.FileGlobPath}}"
        "K8s PodName": "{{.K8sPodName}}"
        "K8s Namespace": "{{.K8sNamespace}}"
        "K8s ControllerKind": "{{.K8sControllerKind}}"
        "K8s ContainerName": "{{.K8sContainerName}}"
        "K8s ContainerImage": "{{.K8sContainerImage}}"
        "K8s ControllerLogicalName": "{{.K8sControllerLogicalName}}"
        "ECSCluster": "{{.ECSCluster}}"
        "ECSContainerName": "{{.ECSContainerName}}"
        "ECSTaskVersion": "{{.ECSTaskVersion}}"
        "ECSTaskFamily": "{{.ECSTaskFamily}}"
        "DockerContainerName": "{{.DockerContainerName}}"
        "ConfigID": "{{.ConfigID}}"
        "Host": "{{.Host}}"
        "Source": "{{.Source}}"
        "SourceType": "{{.SourceType}}"
        "Tag": "{{.Tag}}"

The following example displays when the name of the organization-level integration is entered:

      - integration_name: my-org-splunk

The following example displays if there are multiple instances of the same destination that need to route different data types to different Splunk indexes:

- name: edac-splunk-dest
  integration_name: orgs-splunk
  features: edac
  index: edac-index
- integration_name: orgs-splunk
  name: metric-splunk-dest
  features: metric
  index: metric-index

Sumo Logic

The Sumo Logic output will stream analytics and insights to a Sumo Logic HTTPS endpoint.

Before you begin

Before you can create an output, you must have available the Sumo Logic HTTPS endpoint.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable  type Enter sumologic. Required
Endpoint endpoint Enter the full HTTPS URL for this endpoint. Required
Features features

This parameter defines which data types to stream to the destination.

If you do not provide a value, then all will be set.

To learn more, see Review Feature Types.

Optional
Not applicable  send_as_json

Enter true or false to send data in a JSON format, which allows the fields to be auto-parsed and extracted in Sumo.

Optional

The following example displays an output without the name of the organization-level integration:

    - name: '{{ Env "TEST_SUMO" "sumo-us" }}'
      type: sumologic
      endpoint: "https://endpoint4.collection.us2.sumologic.com/receiver/v1/http/XYZ"
      custom_tags:
        "app": "transaction_manager"
        "region": "us-west-2"
        "File Path": "{{.FileGlobPath}}"
        "K8s PodName": "{{.K8sPodName}}"
        "K8s Namespace": "{{.K8sNamespace}}"
        "K8s ControllerKind": "{{.K8sControllerKind}}"
        "K8s ContainerName": "{{.K8sContainerName}}"
        "K8s ContainerImage": "{{.K8sContainerImage}}"
        "K8s ControllerLogicalName": "{{.K8sControllerLogicalName}}"
        "ECSCluster": "{{.ECSCluster}}"
        "ECSContainerName": "{{.ECSContainerName}}"
        "ECSTaskVersion": "{{.ECSTaskVersion}}"
        "ECSTaskFamily": "{{.ECSTaskFamily}}"
        "DockerContainerName": "{{.DockerContainerName}}"
        "ConfigID": "{{.ConfigID}}"
        "Host": "{{.Host}}"
        "Source": "{{.Source}}"
        "SourceType": "{{.SourceType}}"
        "Tag": "{{.Tag}}"
    - name: sumo-us-2
      type: sumologic
      endpoint: '{{ Env "EMPTY" "https://endpoint4.collection.us2.sumologic.com/receiver/v1/http/XYZ" }}'
      # send_as_json can be used to send data to Sumologic in JSON format instead of the usual cost-optimized format (JSON format is easier to use/consume)
      send_as_json: true

AWS CloudWatch

The AWS CloudWatch output will stream logs to a specified CloudWatch log group.

Before you begin

Before you can create an output, you must have available the CloudWatch log group name and log stream name.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable type Enter cloudwatch. Required
Region region Enter the AWS region to send logs to. Required
Log Group Name log_group_name Enter the CloudWatch log group name. Required
Log Stream Name log_stream_name

Enter the CloudWatch log stream name.

You can enter a name or prefix, but not both.

Required
Log Stream Prefix log_stream_prefix

Enter the CloudWatch log stream prefix.

You can enter a name or prefix, but not both.

Required
Allow Label Override allow_label_override

Enter true or false. If true, the default values of the log group name, log stream name, and log stream prefix can be overridden by setting the ed_log_group_name, ed_log_stream_name, and ed_log_stream_prefix labels.

Optional
Auto Configure auto_configure

Enter true or false for the parameter to automatically create:

  • LogGroupName in the /ecs/task_definition_family format
  • LogStreamPrefix in the ecs/container_name/task_id format

This parameter is only supported for ECS environments.

Additionally, when this parameter is set, only the region configuration can be provided.

Optional
Not applicable auto_create

If this parameter is set to true, then the agent will create the log group and log stream, which requires additional IAM permissions.

If this parameter is not set, then the log group and log stream must already exist, and the agent only needs permission to put log events.

To learn more, review the examples below.

Optional
Features features

This parameter defines which data types to stream to the destination.

To learn more, see Review Feature Types.

Optional

The following example displays an output without the name of the organization-level integration:

      - name: cloudwatch
        type: cloudwatch
        region: us-west-2
        log_group_name: /ecs/microservice
        log_stream_prefix: ecs
        auto_create: true
        features: log
  • If the auto_create parameter is not set, then assign the following permission to taskExecutionRoleArn to put log events into CloudWatch. Review the following example:

        {
          "Version": "2012-10-17",
          "Statement": [{
            "Effect": "Allow",
            "Action": [
              "logs:PutLogEvents"
            ],
            "Resource": "*"
          }]
        }
  • If the auto_create parameter is set, then assign the following permission to taskExecutionRoleArn. Review the following example:

        {
          "Version": "2012-10-17",
          "Statement": [{
            "Effect": "Allow",
            "Action": [
              "logs:CreateLogStream",
              "logs:CreateLogGroup",
              "logs:DescribeLogStreams",
              "logs:PutLogEvents"
            ],
            "Resource": "*"
          }]
        }

Datadog

The Datadog output will stream analytics and insights to a Datadog environment.

Before you begin

Before you can create an output, you must have available a Datadog API Key.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable type Enter datadog. Required
Log Host log_host

Enter a Datadog log host to send log data. 

Optional
Event Host event_host Enter a Datadog event host to send event data. Optional
Metric Host metric_host Enter a Datadog metric host to send metric data. Optional
Api Key api_key Enter a Datadog API key. Required
Send Alert As Datadog Log alert_as_log

Select (or enter) true to send alerts as a log. 

  • Additionally, for this configuration to work, you must also select alert as a feature type. 

Select (or enter) false to send alerts as events. 

Optional
Custom Tags custom_tags This parameter defines key-value pairs that are streamed with every request. Optional
Features features

This parameter defines which data types to stream to the destination.

If you do not provide a value, then all will be set.

To learn more, see Review Feature Types.

Optional
Buffer TTL  buffer_ttl

Enter a length of time to retry failed streaming data.

After this length of time is reached, the failed streaming data will no longer be retried.

This parameter is optional. 

Review the following example: 

buffer_ttl: 2h
Optional

Buffer Path buffer_path

Enter a folder path to temporarily store failed streaming data.

The failed streaming data will be retried until the data reaches its destinations or until the Buffer TTL value is reached.

If you enter a path that does not exist, then the agent will create directories, as needed.

This parameter is optional.

Review the following example:

buffer_path: /var/log/edgedelta/pushbuffer/
Optional
Buffer Max Size buffer_max_bytesize

Enter the maximum size of failed streaming data that you want to retry.

If the failed streaming data is larger than this size, then the failed streaming data will not be retried.

This parameter is optional.

Review the following example:

buffer_max_bytesize: 100MB
Optional

The following example displays an output without the name of the organization-level integration:

    - name: datadog-custom
      type: datadog
      # If provided, custom installation of datadog log host can be reached.
      log_host: "<ADD DATADOG LOG_HOST>"
      # If provided, custom installation of datadog metric host can be reached.
      metric_host: "<ADD DATADOG METRIC_HOST>"
      # If provided, custom installation of datadog event host can be reached.
      event_host: "<ADD DATADOG EVENT_HOST>"
      api_key: '{{ Env "TEST_DD_APIKEY" }}'
      features: metric
      custom_tags:
        "app": "starbucks_pos_transaction_manager"
        "region": "us-west-2"
    - name: datadog-alert-as-log
      type: datadog
      api_key: '{{ Env "TEST_DD_APIKEY" }}'
      features: metric, alert, edac
      alert_as_log: true # this indicates the alert will be sent as a log instead of event by default

New Relic

The New Relic output will stream analytics and insights to a New Relic environment.

Before you begin

Before you can create an output, you must have available the New Relic Insert API key.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable type Enter newrelic. Required
Log Host log_host

Enter the New Relic log host. 

Optional
Metric Host metric_host Enter the New Relic metric host to send metric data. Optional
Api Key api_key

Enter a New Relic ingest license key.

Required
Features features

This parameter defines which data types to stream to the destination.

If you do not provide a value, then all will be set.

To learn more, see Review Feature Types.

Optional

The following example displays an output without the name of the organization-level integration:

    - name: newrelic
      type: newrelic
      log_host: "<ADD NEWRELIC LOG_HOST>"
      metric_host: "<ADD NEWRELIC METRIC_HOST>"
      api_key: '{{ Env "TEST_NR_APIKEY" }}'
      features: metric
      custom_tags:
        "app": "starbucks_pos_transaction_manager"
        "region": "us-west-2"
        "File Path": "{{.FileGlobPath}}"
        "K8s PodName": "{{.K8sPodName}}"
        "K8s Namespace": "{{.K8sNamespace}}"
        "K8s ControllerKind": "{{.K8sControllerKind}}"
        "K8s ContainerName": "{{.K8sContainerName}}"
        "K8s ContainerImage": "{{.K8sContainerImage}}"
        "K8s ControllerLogicalName": "{{.K8sControllerLogicalName}}"
        "ECSCluster": "{{.ECSCluster}}"
        "ECSContainerName": "{{.ECSContainerName}}"
        "ECSTaskVersion": "{{.ECSTaskVersion}}"
        "ECSTaskFamily": "{{.ECSTaskFamily}}"
        "DockerContainerName": "{{.DockerContainerName}}"
        "ConfigID": "{{.ConfigID}}"
        "Host": "{{.Host}}"
        "Source": "{{.Source}}"
        "SourceType": "{{.SourceType}}"
        "Tag": "{{.Tag}}"

Honeycomb

The Honeycomb output will stream analytics and insights to a Honeycomb environment.

Before you begin

Before you can create an output, you must have available a Honeycomb API key.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable type Enter honeycomb. Required
Host host Enter a name to create a custom installation of the Honeycomb host that will receive the data. Optional
Api Key api_key Enter the Honeycomb API key. Required
Dataset Name dataset_name

Enter a name to create a dataset that will send data to Honeycomb.

At Honeycomb, datasets are high-level buckets for your data.

Required
Unpacking unpacking

Enter true or false to unpack and flatten nested JSON objects into unique columns.

If you do not want to flatten nested fields, then enter false.

Optional
Custom Tags custom_tags

This parameter defines key-value pairs that are streamed with every request.

Optional
Features features

This parameter defines which data types to stream to the destination.

If you do not provide a value, then all will be set.

To learn more, see Review Feature Types.

Optional

The following example displays an output without the name of the organization-level integration:

    - name: honeycomb
      type: honeycomb
      host: "<ADD HONEYCOMB HOST>" # Host is optional; the default is "api.honeycomb.io"
      api_key: '{{ Env "TEST_HC_APIKEY" }}'
      dataset_name: "<ADD-DATASET_NAME>"
      unpacking: false
      features: metric,log,edac
      custom_tags:
        "app": "starbucks_pos_transaction_manager"
        "region": "us-west-2"
        "File Path": "{{.FileGlobPath}}"
        "K8s PodName": "{{.K8sPodName}}"
        "K8s Namespace": "{{.K8sNamespace}}"
        "K8s ControllerKind": "{{.K8sControllerKind}}"
        "K8s ContainerName": "{{.K8sContainerName}}"
        "K8s ContainerImage": "{{.K8sContainerImage}}"
        "K8s ControllerLogicalName": "{{.K8sControllerLogicalName}}"
        "ECSCluster": "{{.ECSCluster}}"
        "ECSContainerName": "{{.ECSContainerName}}"
        "ECSTaskVersion": "{{.ECSTaskVersion}}"
        "ECSTaskFamily": "{{.ECSTaskFamily}}"
        "DockerContainerName": "{{.DockerContainerName}}"
        "ConfigID": "{{.ConfigID}}"
        "Host": "{{.Host}}"
        "Source": "{{.Source}}"
        "SourceType": "{{.SourceType}}"
        "Tag": "{{.Tag}}"

AppDynamics

The AppDynamics output will stream analytics and insights to an AppDynamics environment.

Before you begin

Before you can create an output, you must have available the AppDynamics Global account name and an API key.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional 
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable type Enter appdynamics. Required
Host host

To create a custom installation, enter the name of the AppDynamics host that will receive the data.

Optional
Not applicable global_account_name Enter the AppDynamics global account name. Required
Api Key api_key Enter the AppDynamics API key. Required
Schema schema_name Enter a name to create a schema that will send data to AppDynamics. Required
Features features

This parameter defines which data types to stream to the destination.

If you do not provide a value, then all will be set.

Optional
Custom Tags custom_tags This parameter defines key-value pairs that are streamed with every request. Optional

The following example displays an output without the name of the organization-level integration:

      - name: appdynamics-integration
        type: appdynamics
        # If provided, custom installation of appdynamics host can be reached.
        host: "<add appdynamics host>"
        global_account_name: "<add global account name>"
        api_key: "<add appdynamics api key>"
        # No whitespaces in schema name
        schema_name: "<add-schema-name>"
        features: log,metric,edac
        custom_tags:
          "app": "transaction_manager"
          "region": "us-west-2"

InfluxDB

The InfluxDB output will stream analytics and insights to your InfluxDB deployment.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable  type Enter influxdb. Required
Version version

Enter the version number of the InfluxDB deployment.

This parameter supports versions 1.x and 2.x.

An empty version will default to version 2.x.

Optional
Endpoint endpoint

Enter the InfluxDB endpoint.

Required
Port port

Enter the port to connect to InfluxDB.

 
DB db Enter the InfluxDB database to stream data to. This parameter is only required for version 1.x. Required
Organization ID organization

Enter the organization ID that corresponds to the desired bucket.

This parameter is only required for version 2.x.

Required
Bucket Name bucket

Enter the name of the bucket that the agent will stream data to.  

This parameter is only required for version 2.x.

Required
Token token Enter your InfluxDB token.   
HTTP Username  http_user

Enter the InfluxDB user credentials.

This parameter is only required for version 1.x.

Required
HTTP Password  http_password

Enter the InfluxDB password for the connecting user.

This parameter is only required for version 1.x.

Required
Disable Verify  disable_verify

To disable TLS certificate verification:

  • In the visual editor, select True.
  • In the YAML file, enter disable_verify: true.

To enable TLS certificate verification:

  • In the visual editor, select False.
  • In the YAML file, enter disable_verify: false, or remove this line entirely.
Optional
CA File Path  ca_file_path Enter the absolute file path to the CA certificate file.  Optional
CA Path  ca_path Enter the absolute path of the directory to scan for CA certificate files. Optional
CRT File  crt_file Enter the absolute path to the certificate file.  Optional
Key File  key_file Enter the absolute path to the private key file.  Optional
Key Password  key_password Enter the password for the key file. Optional
Client Auth Type  client_auth_type

Select a client authorization type. 

The default setting is noclientcert.

Optional
Features features

This parameter defines which data types to stream to the destination.

If you do not provide a value, then all will be set.

To learn more, see Review Feature Types.

Optional

The following example displays an output without the name of the organization-level integration:

    - name: influxdb-integration
      type: influxdb
      endpoint: "https://influxdb.<your-domain>.com/"
      token: YOUR_API_TOKEN
      # empty version or version 2.x requires bucket and organization info
      bucket: testbucket
      organization: yourorganization
      port: 443
    - name: influxdb-integration-v1x
      type: influxdb
      version: 1.x
      endpoint: "https://influxdb.<your-domain>.com/"
      token: YOUR_API_TOKEN
      port: 443
      http_user: admin
      http_password: your_http_password
      # version 1.x requires db info
      db: "specific_influxdb_database"

Wavefront

The Wavefront output will stream analytics and insights to your Wavefront environment.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration  integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable type Enter wavefront. Required
Endpoint endpoint Enter the Wavefront endpoint. Required
Token token Enter the Wavefront API token. Required
Features features

This parameter defines which data types to stream to the destination.

For Wavefront, you can only select metric.

To learn more, see Review Feature Types.

Optional

The following example displays an output without the name of the organization-level integration:

    - name: wavefront-integration
      type: wavefront
      endpoint: "https://{your wavefront domain}.wavefront.com/report"
      token: "<add wavefront api token>"

Dynatrace

The Dynatrace output will stream analytics and insights to your Dynatrace environment.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional 
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable type Enter dynatrace. Required
Log Endpoint log_endpoint  Enter the Dynatrace log endpoint.  Optional
Metric Endpoint metric_endpoint  Enter the Dynatrace metric endpoint.  Optional
Token log_token 

Enter the Dynatrace log token.

You must enter a token to support log streaming. 

Optional
Custom Tags custom_tags 

This parameter defines key-value pairs that are streamed with every request.

Optional
Disable Verify  disable_verify 

To disable TLS certificate verification:

  • In the visual editor, select True.
  • In the YAML file, enter disable_verify: true.

To enable TLS certificate verification:

  • In the visual editor, select False.
  • In the YAML file, enter disable_verify: false, or remove this line entirely.
Optional
CA File Path  ca_file_path 

Enter the absolute file path to the CA certificate file. 

Optional
CA Path  ca_path

Enter the absolute path of the directory to scan for CA certificate files.

Optional
CRT File  crt_file

Enter the absolute path to the certificate file. 

Optional
Key File  key_file

Enter the absolute path to the private key file. 

Optional
Key Password  key_password 

Enter the password for the key file.

Optional
Client Auth Type  client_auth_type 

Select a client authorization type. 

The default setting is noclientcert.

Optional
Features features

This parameter defines which data types to stream to the destination.

To learn more, see Review Feature Types.

Optional 
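
Unlike the other destinations in this document, the Dynatrace section does not include a sample output. The following sketch, modeled on the surrounding examples and using only the parameters in the table above, shows how they might combine; all endpoint and token values are placeholders, not real Dynatrace values:

```yaml
    - name: dynatrace-integration
      type: dynatrace
      # If provided, log data is streamed to the Dynatrace log endpoint.
      log_endpoint: "<ADD DYNATRACE LOG_ENDPOINT>"
      # If provided, metric data is streamed to the Dynatrace metric endpoint.
      metric_endpoint: "<ADD DYNATRACE METRIC_ENDPOINT>"
      # A log token is required to support log streaming.
      log_token: "<ADD DYNATRACE LOG_TOKEN>"
      custom_tags:
        "app": "transaction_manager"
        "region": "us-west-2"
```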

EDPort

The EDPort output will stream analytics and insights to your EDPort environment.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional 
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable type Enter edport. Required
Endpoint endpoint

Enter the EDPort endpoint. 

You must enter an endpoint to support HTTP/S streaming.

Optional
Host host

Enter the EDPort host. 

You must enter a host to support TCP streaming.

Optional
Port port

Enter the EDPort port. 

You must enter a port to support TCP streaming.

Optional
Schema schema Select the format type for the streaming data, such as json. Optional
Not applicable disable_verify

To disable TLS certificate verification:

  • In the YAML file, enter disable_verify: true.

To enable TLS certificate verification:

  • In the YAML file, enter disable_verify: false, or remove this line entirely.
Optional
Not applicable protocol

Enter a protocol type, such as https, tls, etc. 

Optional
Not applicable listen 

Enter a network interface where the agent can listen for data. 

The default value for this parameter is 0.0.0.0.

Review the following example: 

    - labels: "ed-port-with-network-interface"
      port: 4545
      protocol: tcp
      # Listen is for network interface and default value is "0.0.0.0".
      listen: 127.0.0.1
    - labels: "ed-port-https-with-tls"
      protocol: https
      listen: localhost
      port: 443

 

Optional
Not applicable tls

Enter a TLS configuration.

In some cases, a data source may require a TLS connection to send data. If a TLS connection is not detected, then you may not be able to receive specific data. 

Review the following example: 

    - labels: "ed-port-tcp-with-tls"
      port: 4545
      protocol: tcp
      tls:
        crt_file: /certs/server-cert.pem
        key_file: /certs/server-key.pem
        ca_file: /certs/ca.pem
    - labels: "ed-port-https-with-tls"
      protocol: https
      listen: localhost
      port: 443
      tls:
        crt_file: /certs/server-cert.pem
        key_file: /certs/server-key.pem
        ca_file: /certs/ca.pem
Optional
Features features

This parameter defines which data types to stream to the destination.

To learn more, see Review Feature Types.

Optional
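
The parameters above can be sketched as a YAML output. The following is illustrative only; the names, endpoint, host, and port values are placeholders, and which fields you set depends on whether you stream over HTTP/S or TCP:

```yaml
outputs:
  streams:
  # HTTP/S streaming: an endpoint is required (placeholder URL).
  - name: edport-https-example
    type: edport
    protocol: https
    endpoint: https://localhost:443
  # TCP streaming: a host and port are required instead (placeholder values).
  - name: edport-tcp-example
    type: edport
    protocol: tcp
    host: localhost
    port: 4545
```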

OpenMetrics

The OpenMetrics output will stream analytics and insights to your OpenMetrics environment.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional 
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable type

Enter openmetrics.

Required
Endpoint endpoint

Enter the OpenMetrics endpoint to which data should stream. 

Optional 
Custom Tags custom_tags This parameter defines key-value pairs that are streamed with every request. Optional
Features features

This parameter defines which data types to stream to the destination.

To learn more, see Review Feature Types.

Optional

The following example displays an output without the name of the organization-level integration:

outputs:
  streams:
  - name: 'openmetrics-documentation-streaming-example'
    type: openmetrics
    endpoint: http://localhost:8428/api/v1/import/prometheus

Scalyr

The Scalyr output will stream analytics and insights to your Scalyr environment.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable type Enter scalyr. Required
Endpoint endpoint Enter the Scalyr endpoint. Required
Features features

This parameter defines which data types to stream to the destination.

To learn more, see Review Feature Types.

Optional

The following example displays an output without the name of the organization-level integration:

      - name: scalyr-integration
        type: scalyr
        endpoint: "https://app.scalyr.com/api/uploadLogs?token={scalyr log access write key}"

Elastic

The Elastic output will stream analytics and insights to your Elasticsearch environment.

Before you begin

Edge Delta recommends that you review and complete the steps listed in the Configure Elastic Index for the Agent document.

  • This process will help you prepare your Elasticsearch environment to become an Edge Delta streaming target.

 

Note

For the connection URL, you must provide either cloud_id or address; you cannot enter both parameters. For authentication, you must provide either token or user/password; you cannot enter both parameters.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable type Enter elastic. Required
Index index Enter the name of the Elastic index (or index template) where Edge Delta should stream the data.  Required
Cloud Id cloud_id

Enter the cloud ID of the Elasticsearch backend.

You must enter a Cloud ID or an Address.

Optional
User user Enter the username of the Elasticsearch credentials. Optional
Password password Enter the password for the connecting user. Optional
Token token Enter the Elasticsearch API key. Optional
Address address

Enter the address list of the Elasticsearch backend.

You must enter a Cloud ID or an Address.

To locate your Elasticsearch URL, review the related forum topics from Elasticsearch. 

Optional
Region region Enter the AWS region destination to send logs. Optional
Role ARN  role_arn

To assume an AWS IAM role, enter the account ID and role name:

  • role_arn: "arn:aws:iam::<ACCOUNT_ID>:role/<ROLE_NAME>"
Optional
External ID  external_id

Enter a unique identifier to avoid the confused deputy problem.

Optional
Send JSON Logs As Is  send_as_json

Enter true or false to enable or disable this feature. 

If you select true, then Edge Delta will send logs without a defined JSON object wrapping. In other words, no metadata will be attached. 

The configured index should handle the mapping of the fields in the JSON log. 

Optional
EDAC Enrichment - EDAC ID 

edac_enrichment:

   edac_id_field

Enter a field name to display in the final JSON object. The EDAC's ID will display as a value.  Optional
EDAC Enrichment - Metric Name 

edac_enrichment:

 metric_name_field

Enter a field name to display in the final JSON object. The EDAC's metric name will display as a value.  Optional
Custom Tags custom_tags This parameter defines key-value pairs that are streamed with every request. Optional
Features features

This parameter defines which data types to stream to the destination.

If you do not provide a value, then all data types will be streamed.

To learn more, see Review Feature Types.

Optional

The following example displays an output without the name of the organization-level integration:

      - name: elastic-integration
        type: elastic
        index: "index name"
        # you can provide cloud_id or an address list, but not both at the same time
        cloud_id: "<add elasticsearch cloud_id>"
        #address:
         #- <elasticsearch endpoint address_1>
         #- <elasticsearch endpoint address_2>
        # you can provide token or user/pass for auth but not both at the same time
        token: "elasticsearch api key"
        #user: "elasticsearch username"
        #password: "elasticsearch password"
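
The example above omits the edac_enrichment parameter from the table. A hedged sketch of how it might be nested, with hypothetical field names:

```yaml
      - name: elastic-integration
        type: elastic
        index: "index name"
        token: "elasticsearch api key"
        # edac_enrichment field names below are placeholders
        edac_enrichment:
          edac_id_field: "edac_id"           # JSON field that carries the EDAC's ID
          metric_name_field: "metric_name"   # JSON field that carries the EDAC's metric name
```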

Azure AppInsight

The Azure AppInsight output will stream analytics and insights to your Azure endpoint.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable  type Enter azure. Required
API Key api_key Enter the Azure AppInsight API key. Required
Endpoint endpoint Enter the Azure AppInsight endpoint. Required
Features features

This parameter defines which data types to stream to the destination.

If you do not provide a value, then all data types will be streamed.

To learn more, see Review Feature Types.

Optional

 

The following example displays an output without the name of the organization-level integration:

      - name: azure-integration
        type: azure
        endpoint: https://dc.services.visualstudio.com/v2/track
        api_key: "Azure AppInsight api key" 
        features: "metric"

Kafka

The Kafka output will stream analytics and insights to your Kafka endpoint.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable type Enter kafka. Required
Broker endpoint Enter your Kafka broker address. Required
Topic topic Enter your Kafka topic name. Required
Required Acknowledgements  required_acks

Enter the number of acknowledgments that the leader must receive before considering a request to be complete.

To learn more, view this article from Kafka.

Optional
Batch Size batch_size

Enter the maximum number of messages to buffer before being sent to a partition.

The default limit is 100 messages. 

Optional
Batch Bytes batch_bytes

Enter a limit (in bytes) for the maximum size of a request before being sent to a partition. 

The default value is 1048576.

Optional
Batch Timeout Duration  batch_timeout

Enter a time limit for how often incomplete message batches will be flushed to Kafka.

Optional
Async async

Enter true or false to enable or disable asynchronous communication between Edge Delta and Kafka.

Optional
Disable Verify  disable_verify 

To disable TLS verification of the certificate:

  • In the visual editor, select True.
  • In the YAML file, enter disable_verify: true.

To enable TLS verification of the certificate:

  • In the visual editor, select False.
  • In the YAML file, enter disable_verify: false, or remove this line entirely. 
Optional 
CA File Path  ca_file_path Enter the absolute file path to the CA certificate file.  Optional
CA Path  ca_path Enter the absolute path to scan the CA certificate file. Optional
CRT File  crt_file Enter the absolute path to the certificate file.  Optional
Key File  key_file Enter the absolute path to the private key file.  Optional
Key Password  key_password Enter the password for the key file. Optional
Client Auth Type  client_auth_type

Select a client authentication type. 

The default setting is noclientcert.

Optional
Username username Enter your Kafka SASL username. Optional
Password password Enter your Kafka SASL password. Optional
Mechanism mechanism

Enter a Kafka SASL mechanism type to implement a secure authentication.

You can enter: 

  • PLAIN
  • SCRAM-SHA-256
  • SCRAM-SHA-512
Optional
Features features

This parameter defines which data types to stream to the destination.

If you do not provide a value, then all data types will be streamed.

To learn more, see Review Feature Types.

Optional

 

The following example displays an output without the name of the organization-level integration:

    - name: kafka
      type: kafka
      endpoint: localhost:2888,localhost:3888 # brokers
      topic: example_kafka_topic
      required_acks: 10
      batch_size: 1000
      batch_bytes: 10000
      batch_timeout: 1m
      async: true
      features: log,metric
      tls:
        disable_verify: true
        ca_file: /var/etc/kafka/ca_file
        ca_path: /var/etc/kafka
        crt_file: /var/etc/kafka/crt_file
        key_file: /var/etc/kafka/keyfile
        key_password: p@ssword123
        client_auth_type: noclientcert # possible selections: noclientcert, requestclientcert, requireanyclientcert, verifyclientcertifgiven, requireandverifyclientcert
      sasl:
        username: kafka_username
        password: p@ssword123
        mechanism: PLAIN # possible selections: PLAIN, SCRAM-SHA-256, SCRAM-SHA-512

SignalFx

The SignalFx output will stream analytics and insights to your SignalFx endpoint.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional 
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable type Enter signalfx. Required
Endpoint endpoint Enter your SignalFx endpoint. Required
Token  token Enter your SignalFx API token. Required
Features features

This parameter defines which data types to stream to the destination.

If you do not provide a value, then all data types will be streamed.

To learn more, see Review Feature Types.

Optional

The following example displays an output without the name of the organization-level integration:

      - name: signalfx-integration
        type: signalfx
        endpoint: https://ingest.us1.signalfx.com/v2
        token: "<add signalfx api token>"
        features: "metric,log"

Humio

The Humio output will stream analytics and insights to your Humio endpoint.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional 
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable  type Enter humio. Required
Endpoint endpoint Enter the Humio endpoint. You can use a cloud endpoint or a self-hosted endpoint. Required
Token  token Enter the Humio API token. Required
Features features

This parameter defines which data types to stream to the destination.

If you do not provide a value, then all data types will be streamed.

To learn more, see Review Feature Types.

Optional

The following example displays an output without the name of the organization-level integration:

      - name: humio-integration
        type: humio
        endpoint: http://localhost:8080
        token: "<add humio api token here>"
        features: "metric,log"

Loggly

The Loggly output will stream analytics and insights to your Loggly endpoint.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional 
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable type Enter loggly. Required
Endpoint endpoint

Enter a Loggly endpoint.

You can use a cloud endpoint or a self-hosted endpoint.

The default endpoint is https://logs-01.loggly.com.

Optional
Token token Enter a Loggly API token. Required
Grouped Events grouped_events

Enter true or false.

To group and send log entries based on shared properties, such as source type and source properties, enter true.

Optional
Features features

This parameter defines which data types to stream to the destination.

If you do not provide a value, then all data types will be streamed.

To learn more, see Review Feature Types.

Optional

The following example displays an output without the name of the organization-level integration:

    - name: loggly
      type: loggly
      endpoint: https://logs-01.loggly.com
      token: token12345
      features: log
      grouped_events: true # enables event grouping for Loggly: one payload is generated per observation group, using the "events" field

Logz.io

The Logz.io output will stream analytics and insights to your Logz.io endpoint.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor Parameter Description Required or Optional
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable type Enter logzio. Required
Endpoint endpoint Enter the Logz.io endpoint. You can use a cloud endpoint or a self-hosted endpoint. Required
Token token Enter the Logz.io log token. This parameter is required to support log streaming. Optional
Metric Token metric_token Enter the Logz.io metric token. This parameter is required to support metric streaming. Optional
Custom Tags custom_tags This parameter defines key-value pairs that are streamed with every request. Optional
Features features

This parameter defines which data types to stream to the destination.

If you do not provide a value, then all data types will be streamed.

To learn more, see Review Feature Types.

Optional

The following example displays an output without the name of the organization-level integration:

      - name: logzio
        type: logzio
        endpoint: "https://app-eu.logz.io:8071"
        token: "<add logz.io log shipping token>"
        metric_token: "<add logz.io metric shipping token>"
        custom_tags:
          "app": "starbucks_pos_transaction_manager"
          "region": "us-west-2"

Loki

The Loki output will stream analytics and insights to your Loki endpoint.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor Parameter Description Required or Optional
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable type Enter loki. Required
Endpoint endpoint Enter the Loki endpoint. Required
User user Enter the username for Loki. Optional
API Key api_key Enter the Loki API key. Optional
Send Alert As Loki Log  alert_as_log

Select (or enter) true to send alerts as a log. 

  • Additionally, for this configuration to work, you must also select log as a feature type. 

Select (or enter) false to send alerts as events. 

Optional
Custom Tags custom_tags

This parameter defines key-value pairs that are streamed with every request.

This parameter supports templating.

Optional
Message Template message_template

This parameter customizes the message content.

This parameter supports templating.

Optional
Features features

This parameter defines which data types to stream to the destination.

You can select log, edac, and/or cluster pattern.

To learn more, see Review Feature Types.

Optional

As an optional step, you can customize the message payload and custom tags that are sent to a Loki destination.

  • Loki does not support the - character in key names.

Review the following template fields:

Field Description
Tag This field is the user-defined tag that describes the environment, such as prod_us_west_2_cluster.
Host This field is the hostname of the environment where the agent is running on.
ConfigID This field is the configuration ID of the corresponding agent.
Source This field is the source name, specifically the identifier of the source, such as docker container id or file name.
SourceType This field is the source type, such as Docker or system.
FileGlobPath This field is the file global path.
K8sPodName This field is the Kubernetes pod name.
K8sNamespace This field is the Kubernetes namespace.
K8sControllerKind This field is the Kubernetes controller kind.
K8sContainerName This field is the Kubernetes container name.
K8sContainerImage This field is the Kubernetes container image.
K8sControllerLogicalName This field is the Kubernetes controller logical name.
ECSCluster This field is the ECS cluster name.
ECSContainerName This field is the ECS container name.
ECSTaskVersion This field is the ECS task version.
ECSTaskFamily This field is the ECS task family.
DockerContainerName This field is the Docker container name.

The following example displays an output without the name of the organization-level integration:

      - name: loki-integration
        type: loki
        endpoint: "https://localhost:3000/loki/api/v1/push"
        api_key: "api_key"
        user: "user"
        custom_tags:
          "app": "test"
          "region": "us-west-2"
        message_template:
          "File Path": "{{.FileGlobPath}}"
          "K8s PodName": "{{.K8sPodName}}"
          "K8s Namespace": "{{.K8sNamespace}}"
          "K8s ControllerKind": "{{.K8sControllerKind}}"
          "K8s ContainerName": "{{.K8sContainerName}}"
          "K8s ContainerImage": "{{.K8sContainerImage}}"
          "K8s ControllerLogicalName": "{{.K8sControllerLogicalName}}"
          "ECSCluster": "{{.ECSCluster}}"
          "ECSContainerName": "{{.ECSContainerName}}"
          "ECSTaskVersion": "{{.ECSTaskVersion}}"
          "ECSTaskFamily": "{{.ECSTaskFamily}}"
          "DockerContainerName": "{{.DockerContainerName}}"
          "ConfigID": "{{.ConfigID}}"
          "Host": "{{.Host}}"
          "Source": "{{.Source}}"
          "SourceType": "{{.SourceType}}"
          "Tag": "{{.Tag}}"

FluentD

The FluentD output will stream analytics and insights to your FluentD endpoint.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable  type Enter fluentd. Required
Host host Enter the FluentD host. This parameter is required to support TCP streaming. Required
Port port Enter the FluentD port. This parameter is required to support TCP streaming. Required
Encoder encoder Enter the encoder type to use while streaming data to FluentD. Supported values are raw and msgpack. Optional
Tag Prefix tag_prefix This parameter is used by the FluentD pusher to determine which FluentD tag to use. If the source config already defines a tag enrichment, then that configuration will be used. Otherwise, a tag will be generated in the following format: "{TagPrefix}{Agent's Tag}" Optional
Features features

This parameter defines which data types to stream to the destination.

If you do not provide a value, then all data types will be streamed.

To learn more, see Review Feature Types.

Optional

The following example displays an output without the name of the organization-level integration:

    - name: fluentd-log-fwd
      type: fluentd
      host: log-repo-host
      port: 23131
      encoder: msgpack
      pool_size: 10
      # tag_prefix; agent setting tag value is appended to this prefix
      # and used as fluentd forward tag (the payload itself will still have edgedelta_tag=agentsettings.tag)
      # tag_prefix is only used as fluentd tag if the corresponding data doesn't have a tag defined in enrichments
      tag_prefix: "tail.ed."
      features: log

Azure Event Hubs

The Azure Event Hub Stream output will stream analytics and insights to an Azure Event Hubs endpoint.

Before you begin

To enable this integration, you must have an Azure AD token.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable type Enter eventhubstream. Required
Endpoint endpoint Enter the Event Hubs endpoint. Required
Token token Enter the Azure AD token. Required
Features features

This parameter defines which data types to stream to the destination.

If you do not provide a value, then all data types will be streamed.

To learn more, see Review Feature Types.

Optional

The following example displays an output without the name of the organization-level integration:

      - name: eventhub-stream
        type: eventhubstream
        endpoint: "https://namespace.servicebus.windows.net/hub/messages"
        token: "azure-ad-token"
        features: log,metric

Cribl

The Cribl output will stream analytics and insights to your Cribl endpoint.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor Parameter Description Required or Optional 
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable type Enter cribl. Required
Endpoint endpoint Enter the full Cribl ingress endpoint.  Required
Token token Enter the Cribl token. Required
Disable Verify  disable_verify

To disable TLS verification of the certificate:

  • In the visual editor, select True.
  • In the YAML file, enter disable_verify: true.

To enable TLS verification of the certificate:

  • In the visual editor, select False.
  • In the YAML file, enter disable_verify: false, or remove this line entirely. 
Optional
CA File Path  ca_file_path

Enter the absolute file path to the CA certificate file. 

Optional
CA Path  ca_path

Enter the absolute path to scan the CA certificate file. 

Optional
CRT File  crt_file

Enter the absolute path to the certificate file. 

Optional
Key File  key_file

Enter the absolute path to the private key file. 

Optional
Key Password  key_password

Enter the password for the key file. 

Optional
Client Auth Type  client_auth_type

Select a client authentication type. 

The default setting is noclientcert.

Optional
Features features

This parameter defines which data types to stream to the destination.

To learn more, see Review Feature Types.

Optional

The following example displays an output without the name of the organization-level integration:

    - name: cribl-http
      type: cribl
      endpoint: http://in.logstream..cribl.cloud:10080/crible/_bulk
      token: ""
      features: log,edac,metric,alert

Graylog

The Graylog output will stream analytics and insights to your Graylog endpoint.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable type

Enter graylog.

Required
Host host

Enter the Graylog host.

Required
Port port

Enter the Graylog port.

Required
Custom Tags custom_tags

This parameter defines key-value pairs that are streamed with every request.

Optional
Features features

This parameter defines which data types to stream to the destination.

To learn more, see Review Feature Types.

Optional

The following example displays an output without the name of the organization-level integration:

outputs:
  streams:
  - name: graylog-documentation-test-stream
    type: graylog
    host: graylog.example.org
    port: 514
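
The optional custom_tags parameter can be sketched as key-value pairs on the same output. The tag names and values below are illustrative:

outputs:
  streams:
  - name: graylog-documentation-test-stream
    type: graylog
    host: graylog.example.org
    port: 514
    custom_tags:
      "app": "billing-service"
      "environment": "production"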

AWS S3

The AWS S3 output will stream analytics and insights to an S3 bucket.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional 
Not applicable type

Enter s3stream.

Required
Bucket bucket

Enter the target S3 bucket.

Required
Region region Enter the specified S3 bucket's region. Required
AWS Key aws_key_id

Enter the AWS key ID associated with the specified bucket. 

Optional
AWS Secret Key aws_sec_key Enter the AWS secret key associated with the specified bucket.  Optional
Flush Interval flush_interval

Enter a time interval at which data, including buffered data, is flushed (forced) to the destination.

Optional
Flush Byte Size flush_bytesize

Enter a data size threshold at which data, including buffered data, is flushed (forced) to the destination.

Optional
Features features

This parameter defines which data types to stream to the destination.

To learn more, see Review Feature Types.

Optional

The following example displays an output without the name of the organization-level integration:

    - name: my-s3-streamer
      type: s3stream
      aws_key_id: {{ Env "AWS_KEY_ID" }}
      aws_sec_key: {{ Env "AWS_SECRET_KEY" }}
      bucket: testbucket
      region: us-east-2
      flush_interval: 30s # Default is 3 minutes
      flush_bytesize: 1M # If the buffer reaches this threshold, it is flushed even if the flush interval has not yet elapsed

ObserveInc

The ObserveInc output will stream analytics and insights to your ObserveInc endpoint. 

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional 
Not applicable type

Enter observeinc.

Required
Endpoint endpoint

Enter an HTTP streaming endpoint.

To learn more, see the ObserveInc documentation.

Required
Custom Tags custom_tags

This parameter defines key-value pairs that are streamed with every request.

Optional 
Features features

This parameter defines which data types to stream to the destination.

To learn more, see Review Feature Types.

Optional

The following example displays an output without the name of the organization-level integration:

    - name: my-observeinc
      type: observeinc
      endpoint: "http://localhost:5555"
      features: metric,log,health,alert,event
      custom_tags:
        "Host": "{{.Host}}"
        "Source": "{{.Source}}"
        "SourceType": "{{.SourceType}}"
        "Tag": "{{.Tag}}"

GCP Cloud Monitoring

The Cloud Monitoring output will stream custom Google Cloud metrics to a Cloud project.

Review the following parameters that you can configure in the Edge Delta App:

Visual Editor YAML Description Required or Optional
Name name

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

Required
Integration integration_name

This parameter only appears when you create an individual output.

This parameter refers to the organization-level integration created in the Integrations page. If you enter this name, then the rest of the fields will be automatically populated.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name field. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.

Optional
Not applicable type Enter cloudmonitoring. Required
Project ID project_id Enter the identifier of the GCP project associated with this resource. Required
Key key

Enter a key for the agent to use for authentication.

You must enter a key or key path.

Optional
Key Path key_path

Enter a path to a key file for the agent to use for authentication.

You must enter a key or key path.

Optional
Features features

This parameter defines which data types to stream to the destination.

To learn more, see Review Feature Types.

Optional

The following example displays an output without the name of the organization-level integration:

    - name: my-cloudmonitoring
      type: cloudmonitoring
      project_id: edgedelta
      key: '{{ Env "CLOUDMONITORING_KEY" }}'
      features: metric
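
If you prefer to reference a key file on disk instead of an inline key, a sketch using key_path looks like the following. The file path is a hypothetical placeholder:

    - name: my-cloudmonitoring
      type: cloudmonitoring
      project_id: edgedelta
      key_path: /var/edgedelta/cloudmonitoring-key.json
      features: metric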

Review Example of Stream and Trigger Outputs

The following example displays a configuration with a stream output and a trigger output:

outputs:
  streams:
      - name: sumo-logic-integration
        type: sumologic
        endpoint: "https://[SumoEndpoint]/receiver/v1/http/[UniqueHTTPCollectorCode]"
  triggers:
      - name: slack-integration
        type: slack
        endpoint: "https://hooks.slack.com/services/T00000000/B00000000/XXXXXXXXXXXXXXXXXXXXXXXX"

Review Feature Types

In the Edge Delta App, features are the data types that the Edge Delta agent collects (or generates), and then sends to a streaming destination.

Based on the streaming destination, you can add the following features to the output:

Feature Type Description
metric

This feature sends metrics that are generated by processors in the workflow.

By default, this feature type is enabled.

edac

This feature sends contextual logs that happened around an anomaly. 

edac represents Edge Delta Anomaly Context.

By default, this feature type is enabled.

cluster pattern

This feature sends cluster patterns in the following format: "{cluster-pattern}, {count}"

By default, this feature type is enabled.

cluster sample

This feature sends cluster samples, specifically: 

  • Timestamp
  • Host
  • Source
  • Log

logs This feature forwards raw logs to a streaming destination.
topk This feature sends top-k records that are generated by the top-k processor.
health

This feature sends data regarding the health of the agent's internal components.

After an agent is restarted (or reloaded), data is reported every 15 seconds during the initial 2 minutes.

Afterwards, data for components with a status: nok are reported every minute.

Additionally, regardless of a component's status, data for all components are reported every 60 minutes. 

heartbeat

This feature indicates if the agent is active and running. Specifically, a value of 1.0 indicates a running agent. 

Review the following example: 

"value":1.0
alert

This feature sends detected alerts, which can include the following information: 

  • EDAC ID
  • Source information
  • Metric name
  • Host
all This feature enables the metric, edac, and cluster features for streaming destinations.
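
As a sketch, these feature types are specified as a comma-separated list in the features field of an output. The example below limits a hypothetical Cribl output to metrics and anomaly context:

    - name: cribl-http
      type: cribl
      endpoint: http://in.logstream..cribl.cloud:10080/cribl/_bulk
      token: ""
      features: metric,edac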

Add a Source Type

The agent supports the following source types:

  • K8s

  • Docker

  • ECS

  • File

  • Custom

    • If you enter custom, then you must add the field_mappings parameter to define the source.

    • Review the following example:

          - labels: "my-kafka-events"
            endpoint: "something"
            topic: "topic"
            group_id: "my-group"
            sasl:
              username: kafka_username
              password: p@ssword123
              mechanism: PLAIN
            source_detection:
              source_type: "Custom"
              optional: false
              field_mappings:
                namespace: "kubernetes.namespace"
                serviceName: "service"
                roleName: "user.role"
                systemType: "system"

Supplemental Information for Splunk Users

Before you can set up a Splunk output or integration, you must have the HEC token and HEC endpoint available. At a high level, to set up a Splunk output or integration, you must:

  • Configure an HEC token in Splunk
  • Determine the correct HEC endpoint in Splunk
  • Import the Edge Delta dashboard into Splunk

Note

The process to set up a Splunk output varies for Splunk Cloud and Splunk Enterprise users.


Step 1: Configure an HEC Token in Splunk

Option 1: Splunk Cloud

To create a Splunk HTTP Event Collector (HEC) and token:

  1. In the Splunk Web UI, navigate to Settings, then click Add Data.
  2. Click Monitor, and then click HTTP Event Collector.
  3. In the field, enter a name for the HEC, and then click Next.
  4. Confirm the index information or use the default index, and then click Review.
  5. Click Submit.
  6. Copy the displayed token value. You can enter this information in the Token field in the Edge Delta App.

Option 2: Splunk Enterprise

To ensure HTTP Event Collector (HEC) is enabled:

  1. In the Splunk Enterprise Web UI, navigate to Settings, then click Data Inputs.
  2. Click HTTP Event Collector.
  3. Click Global Settings.
  4. In the All Tokens toggle button, select Enabled.

To create a Splunk HTTP Event Collector (HEC) and token:

  1. In the Splunk Web UI, navigate to Settings, then click Add Data.
  2. Click Monitor, and then click HTTP Event Collector.
  3. In the field, enter a name for the HEC, and then click Next.
  4. Confirm the index information or use the default index, and then click Review.
  5. Click Submit.
  6. Copy the displayed token value. You can enter this information in the Token field in the Edge Delta App.

Step 2: Determine your HEC Endpoint

Before you continue, verify that you have the following information:

  • Splunk deployment type (Enterprise, Cloud, Free Trial, etc.)
  • Splunk hostname (from Splunk Browser URI)
  • Input Protocol (HTTPS is default)

Option 1: Splunk Cloud Format (Cloud, Free Trial, Cloud on GCP)

Replace <splunk_hostname> with your organization’s hostname:

  • Splunk Cloud
    • URI Format: https://http-inputs-<splunk_hostname>:443/services/collector/event
  • Splunk Free Trial
    • URI Format: https://inputs.<splunk_hostname>:8088/services/collector/event
  • Splunk Cloud on GCP
    • URI Format: https://http-inputs.<splunk_hostname>:443/services/collector/event

Option 2: Splunk Enterprise

  • URI Format: https://<splunk_hostname>:8088/services/collector/event
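
After you determine the endpoint, the resulting Splunk output can be sketched in YAML. The hostname and environment variable below are hypothetical placeholders:

    - name: my-splunk
      type: splunk
      endpoint: https://<splunk_hostname>:8088/services/collector/event
      token: '{{ Env "SPLUNK_HEC_TOKEN" }}'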

Step 3: Import the Edge Delta Dashboard to Splunk

  1. In Splunk, navigate to the Search interface.
  2. Click Dashboards.
  3. Click Create New Dashboard.
  4. Enter and configure a dashboard name, description, and permissions.  
  5. Click Classic Dashboards, and then click Create.
  6. In the Edit Dashboard page, switch from UI to Source.
  7. Replace the existing XML with the XML provided by Edge Delta.
    • Contact your Edge Delta Sales Engineer to obtain the XML.
  8. Switch back to UI.
  9. Click Save.
