
KnowledgeFocus LLM Node

Definition:

KnowledgeFocus LLM is a pre-configured LLM node on the GoInsight platform, designed specifically for InteractFlow. It incorporates built-in security and knowledge-base restrictions, making it a foundational LLM for creating Q&A chatbots through a guided setup process. Responses are drawn from an approved knowledge base, which helps prevent irrelevant answers and jailbreak attempts.

How to Configure:

1. Model

GoInsight supports a wide range of globally recognized models, including the Microsoft Azure GPT series (GPT-4o mini, GPT-4o, o1, o3-mini), OpenAI's GPT series, Claude 3.5 series, DeepSeek series, Qwen-plus, among others. Users can configure the model temperature as needed. For optimal results, choose the appropriate model based on your specific scenario and task requirements.
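As a rough illustration, the model settings this section describes boil down to two choices. The names below are hypothetical; the actual configuration is done in the node's UI panel, not in code.

```python
# Hypothetical sketch of the two settings configured in the GoInsight node panel.
node_config = {
    "model": "gpt-4o-mini",  # any supported model, e.g. GPT-4o, Claude 3.5, DeepSeek
    "temperature": 0.2,      # lower values give more deterministic, factual answers
}

def is_valid(config):
    """Basic sanity check: most providers accept temperatures in [0, 2]."""
    return 0.0 <= config["temperature"] <= 2.0
```

For knowledge-base Q&A, a low temperature is usually preferable, since the goal is faithful answers rather than creative variation.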

2. Knowledge Base Retrieval Results

"Context" refers to the background information supplied to the KnowledgeFocus LLM to improve its response accuracy. Think of it as a set of hints the LLM draws on to generate accurate answers to your inquiries.
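A minimal sketch of what "context" means in practice (the helper name is hypothetical; GoInsight assembles this internally): passages retrieved from the knowledge base are joined into a single block that the LLM reads before answering.

```python
def format_context(chunks):
    """Join retrieved knowledge-base passages into a single context block
    that is handed to the LLM as background 'hints'."""
    return "\n---\n".join(chunk.strip() for chunk in chunks)
```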

3. User

The User field is automatically populated with the query from the Start node, so no manual input is needed.

4. Prompts

The KnowledgeFocus LLM node uses two types of prompts to guide its responses:

  • User Prompt: Custom instructions given to the LLM, combined with the system's default directives. You can also choose to replace the system instructions entirely.
  • System Prompt: Built-in, high-level instructions that guide how the LLM responds. Overriding them may reduce response accuracy.
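Conceptually, the two prompt types end up in different roles of the final request. The layout below is a hypothetical sketch; the platform's real prompt template is internal.

```python
def build_prompt(system_prompt, user_prompt, context, question):
    """Combine the system prompt, retrieved context, user prompt, and the
    user's question into a chat-style message list (hypothetical layout)."""
    return [
        {"role": "system", "content": f"{system_prompt}\n\nContext:\n{context}"},
        {"role": "user", "content": f"{user_prompt}\n\n{question}"},
    ]
```

This is why overriding the system prompt is risky: it carries the safety and knowledge-base restrictions, while the user prompt only adds task-specific instructions on top.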

5. Token Distribution

Controls how tokens (roughly, words or word pieces) are allocated between the input (source content) and the output (response). Allocating more tokens to the input allows deeper content analysis, while allocating more to the output allows more detailed answers.
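The trade-off can be sketched as splitting one fixed budget. The function name and ratio parameter are hypothetical, mirroring what the UI slider does:

```python
def split_token_budget(total_tokens, input_ratio):
    """Split a fixed token budget between input (analysis depth) and
    output (answer detail). Hypothetical helper mirroring the UI slider."""
    input_tokens = int(total_tokens * input_ratio)
    return input_tokens, total_tokens - input_tokens
```

For example, with a 4096-token budget, a 0.75 input ratio leaves 1024 tokens for the answer; shifting the ratio toward the output trades analysis depth for answer length.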

6. Memory

When memory is enabled, each input includes the chat history of the conversation (per-node memory is not supported), helping the LLM understand the context and interpret questions more accurately across conversation turns.

  • History Count: Choose how many previous messages the LLM remembers (1 to 50).
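The effect of History Count can be sketched as trimming the message list before each request (hypothetical helper; the platform does this internally):

```python
def trim_history(messages, history_count):
    """Keep only the most recent `history_count` messages (1-50 in the UI)
    so the chat history sent with each input stays bounded."""
    return messages[-history_count:]
```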

7. Output Variable

Text (String): The final text result generated by the LLM, which can be referenced by downstream nodes.

Differences Between KnowledgeFocus LLM Node and General LLM Node:

  • Similarities: Both are LLM (Large Language Model) nodes that generate text responses.
  • Differences: The KnowledgeFocus LLM node comes with pre-defined templates and security rules, and its responses are restricted to the attached knowledge base. It is ideal for document-based Q&A and FAQ systems built on stored knowledge. The General LLM node allows fully customizable instructions and open-ended prompts, making it best suited for creative tasks that don't need a knowledge base, such as story writing.

Note:

Use the KnowledgeFocus LLM node for workflows involving FAQs or other knowledge-driven tasks. If no knowledge base is involved, the General LLM node is the better choice.

Updated on: Jun 25, 2025