Version: 3.7.0

Custom Data via Kafka

Consume Kafka topics to ingest any data type—metrics, alerts, traces, logs, or entities—into ONE.

Prerequisites

  • ONE platform can reach the Kafka brokers.

Getting Started

Source Side

Publish your records to a Kafka topic (JSON or UTF-8 text recommended).
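A minimal producer sketch in Python, assuming the `kafka-python` client and illustrative topic, broker, and field names (none of these come from ONE itself):

```python
import json

# kafka-python is one common client; any Kafka producer library works.
# from kafka import KafkaProducer  # uncomment when a broker is reachable

def encode_record(record: dict) -> bytes:
    """Serialize a record as UTF-8 JSON, the recommended payload format."""
    return json.dumps(record, ensure_ascii=False).encode("utf-8")

# Hypothetical custom metric payload; use whatever schema your
# data stream (configured below) expects.
record = {
    "name": "custom.cpu.usage",
    "value": 73.5,
    "timestamp": 1700000000000,
    "tags": {"host": "web-01"},
}

payload = encode_record(record)

# With a reachable broker:
# producer = KafkaProducer(bootstrap_servers="broker1:9092,broker2:9092")
# producer.send("one-custom-data", payload)  # topic name is an example
# producer.flush()
```

Plain UTF-8 text payloads work the same way; only the serialization step changes.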

ONE Side

  • Navigate to Integrations → Plugins and click Start Integration on the Kafka card.
  • Enter the broker list, topic name, and any optional consumer properties, e.g.
    auto.offset.reset=earliest
    (full param list: https://kafka.apache.org/documentation/#connectconfigs ).
  • Save the task; it is created in the disabled state.
  • Enable the task manually after you finish the data stream setup.
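Consumer properties use Kafka's standard key=value form, one pair per line. A small illustrative set (the values are examples, not recommendations, and ONE may not expose every property):

```properties
# Start from the earliest retained offset on first consumption
auto.offset.reset=earliest
# Cap the number of records fetched per poll
max.poll.records=500
```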

Configure Data Stream

ONE does not provide a default stream for custom payloads.
Create a manual data stream to parse, transform, and store the records before enabling the task.

Verification

  • Metrics: Insight → Data Model → Metrics or Insight → Data Explorer
  • Logs: Logs → Log Query
  • Events: Insight → Event Query
  • Traces: Insight → Data Query → Trace
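If nothing appears in the views above, a quick upstream check is to confirm the records actually reached the topic using Kafka's bundled console consumer (the broker address and topic name below are placeholders):

```shell
# Read a few records from the topic to confirm they were published.
# kafka-console-consumer.sh ships with the Kafka distribution.
kafka-console-consumer.sh \
  --bootstrap-server broker1:9092 \
  --topic one-custom-data \
  --from-beginning \
  --max-messages 5
```

If records show up here but not in ONE, recheck the task's broker list, topic name, and the manual data stream configuration.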