
Kafka

1. Overview

Apache Kafka is a distributed event streaming platform capable of handling trillions of events a day. It is used for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications.

The GoInsight Kafka node allows you to interact directly with your Kafka clusters within your automation workflows. You can seamlessly produce messages and retrieve metadata about your topics and partitions. Key capabilities include:

  • Producing Messages: Send records or messages to a specified topic.
  • Topic Management: List all available topics or retrieve details for a specific topic.
  • Partition Inspection: Get metadata for all partitions within a topic or for a single, specific partition.

2. Prerequisites

Before using this node, you need access to a running Apache Kafka cluster, along with the connection details required to reach it, such as broker addresses and any authentication credentials.

3. Credentials

For a detailed guide on how to obtain and configure credentials, please refer to our official documentation: Credentials Configuration Guide.

4. Supported Operations

Summary

This node provides operations to interact with Kafka resources such as topics, partitions, and messages. The table below summarizes the available actions.

Resource | Operation | Description
Topic | Get a Topic | Retrieves a topic by name from Apache Kafka.
Topic | Get Many Topics | Retrieves all topics from the Kafka cluster.
Partition | Get a Partition | Retrieves metadata for a single partition of a topic.
Partition | Get Many Partitions | Retrieves metadata for all partitions of a topic.
Message | Create a Produce Message | Produces (sends) one or more records to the specified topic.

Operation Details

Get a Topic

Retrieves a topic by name from Apache Kafka.

Input Parameters:

  • TopicName: Name of the topic to retrieve

Output:

  • Topic (object): Topic data retrieved from Kafka
  • StatusCode (number): HTTP status code or operation status code (-1 for parameter error, 500 for exceptions)
  • ErrorMessage (string): Error description; returns an empty string on success
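
For orientation, a successful Get a Topic call might return output shaped like the sketch below. The StatusCode and ErrorMessage values follow the contract described above, while the success code of 200 and the inner fields of Topic (name, partitions, leader, replicas, isr) are assumptions based on typical Kafka topic metadata, so the exact structure may differ in your environment.

    {
      "Topic": {
        "name": "user-signups",
        "partitions": [
          { "partition": 0, "leader": 1, "replicas": [1, 2], "isr": [1, 2] }
        ]
      },
      "StatusCode": 200,
      "ErrorMessage": ""
    }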

Get Many Topics

Retrieves all topics from the Kafka cluster.

Output:

  • Topics (object-array): Topic data retrieved from Kafka.
  • StatusCode (number): HTTP status code or operation status code (-1 for parameter error, 500 for exceptions)
  • ErrorMessage (string): Error description; returns an empty string on success
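
Similarly, a Get Many Topics call might return output like the following sketch; the topic names and the inner shape of each entry in Topics are illustrative assumptions, not a documented schema.

    {
      "Topics": [
        { "name": "user-signups" },
        { "name": "order-events" }
      ],
      "StatusCode": 200,
      "ErrorMessage": ""
    }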

Get a Partition

Retrieves metadata for a single partition of a topic in Apache Kafka.

Input Parameters:

  • TopicName: Name of the topic

Options:

  • PartitionId: ID of the partition to inspect

Output:

  • Partition (object): Partition data retrieved from Kafka
  • StatusCode (number): HTTP status code or operation status code (-1 for parameter error, 500 for exceptions)
  • ErrorMessage (string): Error description; returns an empty string on success
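
As a sketch, requesting TopicName user-signups with PartitionId 0 might return output like the following; the inner fields of Partition (leader, replicas, isr) reflect typical Kafka partition metadata and, like the success code of 200, are assumptions rather than a guaranteed schema.

    {
      "Partition": { "partition": 0, "leader": 1, "replicas": [1, 2], "isr": [1, 2] },
      "StatusCode": 200,
      "ErrorMessage": ""
    }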

Get Many Partitions

Retrieves metadata for all partitions of a topic in Apache Kafka.

Input Parameters:

  • TopicName: Name of the topic

Output:

  • Partitions (object-array): Partition data retrieved from Kafka
  • StatusCode (number): HTTP status code or operation status code (-1 for parameter error, 500 for exceptions)
  • ErrorMessage (string): Error description; returns an empty string on success

Create a Produce Message

Produces (sends) one or more records to the specified topic in Apache Kafka.

Input Parameters:

  • TopicName: Name of the topic to send records to
  • Records: List of records to be sent to the topic

Options:

  • PartitionId: Partition ID to which the records should be sent

Output:

  • Topic (object): Topic data returned by the produce request
  • StatusCode (number): HTTP status code or operation status code (-1 for parameter error, 500 for exceptions)
  • ErrorMessage (string): Error description; returns an empty string on success
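
As a minimal sketch, producing one record to a hypothetical sensor-readings topic would set TopicName to sensor-readings and Records to a value like the one below; the expected output that follows assumes a success StatusCode of 200 and an illustrative shape for the returned Topic object, neither of which is a documented schema.

    Records (input):
    [
      {
        "key": "sensor-42",
        "value": "{\"temperature\": 21.5, \"unit\": \"C\"}"
      }
    ]

    Output (illustrative):
    {
      "Topic": { "name": "sensor-readings" },
      "StatusCode": 200,
      "ErrorMessage": ""
    }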

5. Example Usage

This section will guide you through creating a simple workflow to send a message to a Kafka topic using the Create a Produce Message operation.

Workflow Overview

The workflow will consist of three nodes: Start -> Create a Produce Message -> Answer.

Step-by-Step Guide

  1. Add the Tool Node:
    • In the workflow canvas, click the "+" button to add a new node.
    • In the panel that appears, select the "Tools" tab.
    • Find and select "Kafka" from the list of tools.
    • From the list of supported operations for Kafka, click on Create a Produce Message to add the node to your canvas.
  2. Configure the Node:
    • Click on the newly added Create a Produce Message node to open its configuration panel on the right.
    • Configure Credentials: In the credentials field at the top of the panel, click the dropdown menu and select your pre-configured Kafka credentials.
    • Fill in Parameters: Complete the input fields as follows:
      • TopicName: Enter the name of the topic you want to send a message to, for example, user-signups.
      • Records: This field expects an array of objects, where each object represents a message. You can use a JSON editor or reference output from a previous node. For a simple test, you can enter a static value like the one shown after this list.
  3. Run and Validate:
    • Once all required parameters are correctly filled, any error indicators on the workflow canvas will disappear.
    • Click the "Run" button in the top-right corner of the canvas to execute the workflow.
    • After a successful execution, you can click the log icon in the top-right corner to view the detailed inputs and outputs of the node, confirming that the message was sent successfully.
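
As referenced in step 2, a minimal static value for the Records field could look like the example below; the key and the fields inside the stringified value are placeholders for your own data.

    [
      {
        "key": "user-123",
        "value": "{\"email\": \"jane@example.com\", \"source\": \"signup-form\"}"
      }
    ]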

After completing these steps, your workflow is fully configured. When executed, it will send the specified record to your Kafka topic.

6. FAQs

Q: I'm getting a connection error. What should I check?

A: Connection errors are often related to incorrect configuration or network issues. Please verify the following:

  • Broker Addresses: Ensure the broker addresses in your credentials are correct and reachable from the GoInsight environment.
  • Authentication: Double-check that your authentication mechanism (e.g., SASL/SCRAM) and credentials are correct.
  • Network/Firewall: Confirm that there are no firewalls or network security groups blocking the connection between GoInsight and your Kafka brokers.

Q: How should I format the Records parameter for the "Create a Produce Message" operation?

A: The Records parameter must be an array of JSON objects. Each object in the array represents a single message to be sent to Kafka. A message object should contain a value and can optionally include a key. The value should typically be a stringified JSON object.

  • Example:
    [
      {
        "key": "order-456",
        "value": "{\"productId\": \"prod-abc\", \"quantity\": 2, \"price\": 50.00}"
      },
      {
        "key": "order-457",
        "value": "{\"productId\": \"prod-xyz\", \"quantity\": 1, \"price\": 120.50}"
      }
    ]

7. Official Documentation

For more in-depth information about Apache Kafka and its concepts, please refer to the official documentation.

Kafka Official Documentation
