Filters Overview

George Alpizar


You can use this document to learn about the filter-related configuration parameters available in a configuration file.

You can use a filter to discard unnecessary logs or protect sensitive data. In other words, a filter refines and transforms collected logs before additional processing takes place. As a result, filters can reduce the agent's resource load by lowering the volume of ingested logs.


Some input types offer additional filtering options that complement the generic filters described in this page.

For example, when you create a Kubernetes input, you can add Kubernetes-specific filters to the input, such as filters for namespaces or pods. You can also apply the filters listed in this page to the Kubernetes input.

To learn more about inputs, see Inputs.

Review Supported Filter Types

The Edge Delta App supports the following filter types:


APM

This filter type can be used to process Elastic APM logs.

To learn more, see APM Filters.


Base64

This filter type can be used to decode base64-encoded data.

To learn more, see Base64 Filters.
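Following the top-level filter schema shown later in this page (a name, a type, and type-specific fields), a base64 filter definition might look like the sketch below. The exact `type` keyword is an assumption, so confirm it on the Base64 Filters page:

```yaml
filters:
  - name: decode_payload
    # "base64" as the type keyword is an assumption; see Base64 Filters
    type: base64
```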

Buffered Trace

This filter type handles trace logs.

  • Edge Delta defines a trace log as a set of logs that can be tied together with an ID, such as a trace ID or request ID.

To learn more, see Buffered Trace Filters.
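As a rough sketch only, a buffered trace filter could be defined as follows. Every field name here beyond `name` is an assumption; see Buffered Trace Filters for the actual schema:

```yaml
filters:
  - name: failed_traces
    type: buffered-trace                          # type keyword is an assumption
    # assumed fields: how to extract the shared ID and detect failed traces
    trace_id_pattern: "request_id=(?P<id>\\w+)"
    failure_pattern: "status=5\\d\\d"
```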


Combination

This filter type allows you to combine existing filters into a single filter.

To learn more, see Combination Filters.
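A combination filter references other filters by name. The sketch below assumes the member filters are listed under a `filters` key and that an `and`/`or` operator can be set; verify both against the Combination Filters page:

```yaml
filters:
  - name: error
    type: regex
    pattern: "error"
  - name: error_combo
    type: combination      # type keyword is an assumption
    operator: and          # assumed field
    filters:
      - error
      - mask_card
```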

Custom Attributes

This filter type can filter for custom attributes. 

To learn more, see Custom Attribute Filters.
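As an illustrative sketch only (the `type` keyword and the key/value fields are assumptions; see Custom Attribute Filters for the actual schema), a custom attribute filter might look like this:

```yaml
filters:
  - name: billing_team_only
    type: custom-attribute   # type keyword is an assumption
    # assumed fields describing the attribute to match
    key: "team"
    value: "billing"
```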

JSON Field Drop

This filter type can filter and drop specified JSON fields from the incoming logs. 

To learn more, see JSON Field Drop Filters.
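A JSON field drop filter names the fields to remove from incoming logs. In the sketch below, the `type` keyword and the field-list key are assumptions; confirm them on the JSON Field Drop Filters page:

```yaml
filters:
  - name: drop_sensitive
    type: json-field-drop    # type keyword is an assumption
    # assumed field listing the JSON paths to remove
    field_paths:
      - "user.password"
      - "user.ssn"
```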

JSON Field Extractor

This filter type extracts a field's value and replaces the whole JSON content with the field's value.

To learn more, see JSON Field Extractor Filters.
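Because this filter replaces the whole JSON content with a single field's value, its definition mainly needs to name that field. The `type` keyword and the path key below are assumptions; see JSON Field Extractor Filters for the exact schema:

```yaml
filters:
  - name: extract_message
    type: json-field-extract   # type keyword is an assumption
    # assumed field naming the JSON path whose value replaces the log
    field_path: "log.message"
```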


Mask

This filter type hides (or masks) specific data, based on the configured regex pattern.

To learn more, see Mask Filters.
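A mask filter pairs a regex pattern with replacement behavior. In the sketch below, `name`, `type`, and `pattern` follow the filter schema used elsewhere in this page, while the replacement field name is an assumption; check the Mask Filters page:

```yaml
filters:
  - name: mask_card
    type: mask
    pattern: "card_number=\\d+"
    mask: "card_number=******"   # replacement field name is an assumption
```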

OTLP (OpenTelemetry)

This filter type can be used to process OTLP (OpenTelemetry) logs.

To learn more, see OTLP (OpenTelemetry) Filters.


Regex

This filter type passes all log lines that match the specified regular expression and discards all unmatched logs.

To learn more, see Regex Filters.
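Using the same `name`/`type`/`pattern` schema as the top-level example later in this page, a regex filter that keeps only warning and error lines could look like this (the pattern itself is illustrative):

```yaml
filters:
  - name: warn_or_error
    type: regex
    pattern: "level=(warn|error)"
```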

Split Lines

This filter type can be used to match, then split a single log into multiple logs.

To learn more, see Split Lines Filters.
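As a sketch only, a split lines filter matches a boundary within a single log and splits on it. Both the `type` keyword and the pattern field below are assumptions; see Split Lines Filters for the actual fields:

```yaml
filters:
  - name: split_stack_trace
    type: split_lines        # type keyword is an assumption
    # assumed field: the pattern on which a single log is split
    pattern: "\\\\n"
```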

Create and Manage a Filter

At a high level, there are two ways to access filters:

  • If you need to create a new configuration, then you can use the visual editor to populate a YAML file, as well as make changes directly in the YAML file.
  • If you already have an existing configuration, then you can update configurations in the YAML file.

Option 1: Access the visual editor for a new configuration

  1. In the Edge Delta App, on the left-side navigation, click Data Pipeline, and then click Agent Settings.
  2. Click Create Configuration.
  3. Click Visual.
  4. On the right side, select Filters.
  5. Complete the missing fields.
  6. To make additional changes to the configuration file, click the back button, and then select a new configuration parameter to manage.
  7. To save the configuration and exit the visual editor, click Save.
  8. Refresh the page to view the newly created configuration in the table.

Option 2: Access the YAML file for an existing configuration

  1. In the Edge Delta App, on the left-side navigation, click Data Pipeline, and then click Agent Settings.
  2. Locate the desired configuration, then under Actions, click the vertical ellipses, and then click Edit.
  3. Review the YAML file, make your changes, and then click Save.
    • To learn about these configurations, see Review Supported Filter Types.
    • In a YAML file, filters are defined at the top level. Review the following example:
  filters:
    - name: error
      type: regex
      pattern: "error"

Understand the Workflow of a Filter

After you define a filter, you can reference it at different places in the YAML file:

  • Input filters apply right after the data ingestion from the input, but before running the workflows associated with the input.
  • Workflow filters apply before the processor runs within the workflow.
  • Processor filters apply before the processor runs, regardless of which workflow the processor is running within.


The following example displays a file input with error and mask_card filters:

      - labels: "billing"
        path: "/var/log/billing/*.log"
        filters:
          - error
          - mask_card

To learn how inputs can be filtered, see Inputs.


The following example displays a workflow with the error filter:

      example-workflow:   # workflow name is illustrative
        input_labels:
          - system_stats
          - agent_stats
          - application_logs
        filters:
          - error
        processors:
          - error-check
          - fail-check
          - success-check
        destinations:
          - sumo-logic-devops-integration
          - slack-devops-integration

To learn how workflows can be filtered, see Workflows.


The following example displays the Dimension Counter Processor with the not_debug filter.

  - name: "log"
    pattern: "level=(?P<level>\\w+) "
    dimensions: ["level"]
    trigger_thresholds:
      anomaly_probability_percentage: 90
    filters:
      - not_debug

To learn more, see Processors Overview.
