Overview
This document describes the parameters for stream outputs and integrations.
In the Edge Delta App, the same parameters are displayed whether you create an integration or an individual output. As a result, this document applies to both outputs and integrations.
Note
To learn how to create an output or integration, see Create and Manage Outputs and Integrations.
Review Supported Streaming Destinations
Edge Delta supports the following streaming destinations:
| Supported Streaming Outputs | Description |
| --- | --- |
| AppDynamics | The AppDynamics output streams analytics and insights to an AppDynamics environment. To learn more, see AppDynamics Streaming Output and Integration. |
| AWS CloudWatch | The AWS CloudWatch output streams logs to a specified CloudWatch log group. To learn more, see AWS CloudWatch Streaming Output and Integration. |
| AWS S3 | The AWS S3 output streams analytics and insights to an S3 bucket. To learn more, see AWS S3 Streaming Output and Integration. |
| Azure AppInsight | The Azure AppInsight output streams analytics and insights to your Azure endpoint. To learn more, see Azure AppInsight Streaming Output and Integration. |
| Azure Event Hub | The Azure Event Hub output streams analytics and insights to an Azure Event Hubs endpoint. To learn more, see Azure Event Hub Streaming Output and Integration. |
| Cribl | The Cribl output streams analytics and insights to your Cribl endpoint. To learn more, see Cribl Streaming Output and Integration. |
| Datadog | The Datadog output streams analytics and insights to a Datadog environment. To learn more, see Datadog Streaming Output and Integration. |
| Dynatrace | The Dynatrace output streams analytics and insights to your Dynatrace environment. To learn more, see Dynatrace Streaming Output and Integration. |
| Elastic | The Elastic output streams analytics and insights to your Elasticsearch environment. To learn more, see Elastic Streaming Output and Integration. |
| EDPort | The EDPort output streams analytics and insights to your EDPort environment. To learn more, see EDPort Streaming Output and Integration. |
| FluentD | The FluentD output streams analytics and insights to your FluentD endpoint. To learn more, see FluentD Streaming Output and Integration. |
| GCP Cloud Monitoring | The Cloud Monitoring output streams custom Google Cloud metrics to a Cloud project. To learn more, see GCP Cloud Monitoring Streaming Outputs and Integrations. |
| Graylog | The Graylog output streams analytics and insights to your Graylog endpoint. To learn more, see Graylog Streaming Output and Integration. |
| Honeycomb | The Honeycomb output streams analytics and insights to a Honeycomb environment. To learn more, see Honeycomb Streaming Output and Integration. |
| InfluxDB | The InfluxDB output streams analytics and insights to your InfluxDB deployment. To learn more, see InfluxDB Streaming Output and Integration. |
| Humio | The Humio output streams analytics and insights to your Humio endpoint. To learn more, see Humio Streaming Output and Integration. |
| Kafka | The Kafka output streams analytics and insights to your Kafka endpoint. To learn more, see Kafka Streaming Output and Integration. |
| Loggly | The Loggly output streams analytics and insights to your Loggly endpoint. To learn more, see Loggly Streaming Output and Integration. |
| Logz.io | The Logz.io output streams analytics and insights to your Logz.io endpoint. To learn more, see Logz.io Streaming Output and Integration. |
| Loki | The Loki output streams analytics and insights to your Loki endpoint. To learn more, see Loki Streaming Output and Integration. |
| New Relic | The New Relic output streams analytics and insights to a New Relic environment. To learn more, see New Relic Streaming Output and Integration. |
| ObserveInc | The ObserveInc output streams analytics and insights to your ObserveInc endpoint. To learn more, see ObserveInc Streaming Output and Integration. |
| OpenMetrics | The OpenMetrics output streams analytics and insights to your OpenMetrics environment. To learn more, see OpenMetrics Streaming Output and Integration. |
| Scalyr | The Scalyr output streams analytics and insights to your Scalyr environment. To learn more, see Scalyr Streaming Output and Integration. |
| SignalFx | The SignalFx output streams analytics and insights to your SignalFx endpoint. To learn more, see SignalFx Streaming Output and Integration. |
| Splunk | The Splunk output streams analytics and insights to a Splunk HEC endpoint. To learn more, see Splunk Streaming Output and Integration. |
| Sumo Logic | The Sumo Logic output streams analytics and insights to a Sumo Logic HTTPS endpoint. To learn more, see Sumo Logic Streaming Output and Integration. |
| Wavefront | The Wavefront output streams analytics and insights to your Wavefront environment. To learn more, see Wavefront Streaming Output and Integration. |
Review Feature Types
In the Edge Delta App, features are the data types that the Edge Delta agent collects (or generates), and then sends to a streaming destination.
Based on the streaming destination, you can add the following features to the output:
| Feature Type | Description |
| --- | --- |
| metric | This feature sends metrics that are generated by processors in the workflow. By default, this feature type is enabled. |
| edac | This feature sends contextual logs that occurred around an anomaly. edac stands for Edge Delta Anomaly Context. By default, this feature type is enabled. |
| cluster_patterns | This feature sends cluster patterns in the following format: `"{cluster-pattern}, {count}"`. By default, this feature type is enabled. |
| cluster_samples | This feature sends cluster samples. |
| log | This feature forwards raw logs to a streaming destination. |
| topk | This feature sends top-k records that are generated by the top-k processor. |
| health | This feature sends data regarding the health of the agent's internal components. After an agent is restarted (or reloaded), data is reported every 15 seconds during the initial 2 minutes. Additionally, regardless of a component's status, data for all components is reported every 60 minutes. |
| heartbeat | This feature indicates whether the agent is active and running. Specifically, a value of 1.0 indicates a running agent, for example: `"value":1.0` |
| alert | This feature sends detected alerts. |
| all | This feature enables the metric, edac, and cluster features for streaming destinations. |
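As a minimal sketch, features can be listed on a streaming output in the agent's YAML configuration. The output name, type, and the exact spelling of the `features` key below are assumptions for illustration; consult the destination-specific pages linked above for the parameters your output accepts:

```yaml
# Hypothetical streaming output entry (illustrative only).
# The "features" key and its comma-separated value are assumptions.
- name: my-example-output
  type: datadog
  api_key: "<your-api-key>"
  features: metric,edac,cluster_patterns
```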
Add a Source Type
The agent supports the following source types:
- K8s
- Docker
- ECS
- File
- Custom

If you enter Custom, then you must add the field_mappings parameter to define the source. Review the following example:

```yaml
- labels: "my-kafka-events"
  endpoint: "something"
  topic: "topic"
  group_id: "my-group"
  sasl:
    username: kafka_username
    password: p@ssword123
    mechanism: PLAIN
  source_detection:
    source_type: "Custom"
    optional: false
    field_mappings:
      namespace: "kubernetes.namespace"
      serviceName: "service"
      roleName: "user.role"
      systemType: "system"
```