Overview
The Datadog output streams analytics and insights to a Datadog environment.
Before you begin
Before you can create an output, you must have a Datadog API key available.
- To learn how to create a new Datadog API key, see the Datadog documentation on API keys.
Review Sample Configuration
The following sample configuration displays outputs that provide an API key directly instead of referencing an organization-level integration by name:
- name: datadog-default
  type: datadog
  api_key: '{{ Env "TEST_DD_APIKEY" }}'
  custom_tags:
    "app": "transaction_manager"
    "region": "us-west-2"
    "File Path": "{{.FileGlobPath}}"
    "K8s PodName": "{{.K8sPodName}}"
    "K8s Namespace": "{{.K8sNamespace}}"
    "K8s ControllerKind": "{{.K8sControllerKind}}"
    "K8s ContainerName": "{{.K8sContainerName}}"
    "K8s ContainerImage": "{{.K8sContainerImage}}"
    "K8s ControllerLogicalName": "{{.K8sControllerLogicalName}}"
    "ECSCluster": "{{.ECSCluster}}"
    "ECSContainerName": "{{.ECSContainerName}}"
    "ECSTaskVersion": "{{.ECSTaskVersion}}"
    "ECSTaskFamily": "{{.ECSTaskFamily}}"
    "DockerContainerName": "{{.DockerContainerName}}"
    "ConfigID": "{{.ConfigID}}"
    "Host": "{{.Host}}"
    "Source": "{{.Source}}"
    "SourceType": "{{.SourceType}}"
    "Tag": "{{.Tag}}"
- name: datadog-custom
  type: datadog
  log_host: "<ADD DATADOG LOG_HOST>"
  metric_host: "<ADD DATADOG METRIC_HOST>"
  event_host: "<ADD DATADOG EVENT_HOST>"
  api_key: '{{ Env "TEST_DD_APIKEY" }}'
  features: metric
  custom_tags:
    "app": "starbucks_pos_transaction_manager"
    "region": "us-west-2"
- name: datadog-alert-as-log
  type: datadog
  api_key: '{{ Env "TEST_DD_APIKEY" }}'
  features: metric, alert, edac
  alert_as_log: true # send alerts as logs instead of events (events are the default)
- name: datadog-buffered-output
  type: datadog
  api_key: '{{ Env "TEST_DD_APIKEY" }}'
  features: metric, alert, edac
  buffer_path: /var/log/edgedelta/pushbuffer/
  buffer_ttl: 2h
  buffer_max_bytesize: 100MB
Review Parameters
Review the following parameters that you can configure in the Edge Delta App.
name
Required
Enter a descriptive name for the output or integration.
For outputs, this name will be used to map this destination to a workflow.
Review the following example:
name: datadog-default
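For context, a workflow refers to this output by its name in the workflow's destinations list. The following is a minimal sketch only; the workflow name, input label, and processor label (example_workflow, my_input, my_processor) are hypothetical placeholders and not part of this output configuration:
workflows:
  example_workflow:
    input_labels:
      - my_input        # hypothetical input label
    processors:
      - my_processor    # hypothetical processor label
    destinations:
      - datadog-default # the output name defined above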
integration_name
Optional
This parameter refers to the organization-level integration created on the Integrations page.
If you need to add multiple instances of the same integration to the configuration, then you can give each instance a custom name via the name parameter. In this situation, use that name to refer to the specific instance of the destination in your workflows.
Review the following example:
integration_name: orgs-datadog
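For example, to add two instances of the same organization-level integration, give each instance its own name and then refer to those names in your workflows. This is a minimal sketch; the integration name orgs-datadog and the instance names are hypothetical:
- name: datadog-logs
  integration_name: orgs-datadog
  type: datadog
  features: log
- name: datadog-metrics
  integration_name: orgs-datadog
  type: datadog
  features: metric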
type
Required
Enter datadog.
Review the following example:
type: datadog
log_host
Optional
Enter a Datadog log host to send log data.
Review the following example:
log_host: "<ADD DATADOG LOG_HOST>"
event_host
Optional
Enter a Datadog event host to send event data.
Review the following example:
event_host: "<ADD DATADOG EVENT_HOST>"
metric_host
Optional
Enter a Datadog metric host to send metric data.
Review the following example:
metric_host: "<ADD DATADOG METRIC_HOST>"
api_key
Required
Enter a Datadog API key. In the samples above, the key is read from an environment variable with the Env template function rather than hard-coded in the configuration.
Review the following example:
api_key: '{{ Env "TEST_DD_APIKEY" }}'
alert_as_log
Optional
Enter true to send alerts as logs.
- For this configuration to work, you must also include alert as a feature type.
Enter false (the default) to send alerts as events.
Review the following example:
alert_as_log: true
custom_tags
Optional
This parameter defines key-value pairs that are streamed with every request.
Review the following example:
custom_tags: "app": "starbucks_pos_transaction_manager" "region": "us-west-2"
features
Optional
This parameter defines which data types to stream to the destination.
To learn more, review the Review Feature Types section in Stream Outputs and Integrations Overview.
Review the following example:
features: log,edac,metric,alert
buffer_ttl
Optional
Enter the length of time to keep retrying failed streaming data.
After this length of time is reached, the failed streaming data will no longer be retried.
Review the following example:
buffer_ttl: 2h
buffer_path
Optional
Enter a folder path to temporarily store failed streaming data.
The failed streaming data will be retried until the data reaches its destination or until the buffer_ttl value is reached.
If you enter a path that does not exist, then the agent will create directories, as needed.
Review the following example:
buffer_path: /var/log/edgedelta/pushbuffer/
buffer_max_bytesize
Optional
Enter the maximum size of failed streaming data that you want to retry.
If the failed streaming data is larger than this size, then it will not be retried.
Review the following example:
buffer_max_bytesize: 100MB