Documentation Index

Fetch the complete documentation index at: https://koreai.mintlify.app/llms.txt

Use this file to discover all available pages before exploring further.

Build conversational AI assistants for customers, agents, and employees across voice and digital channels. Automation AI uses DialogGPT and AI Agents to deliver multi-turn conversations that understand intent, retain context, and respond naturally.

Key Components

┌──────────────────────────────────────────────────────┐
│                     Automation AI                    │
└───────────────────────────┬──────────────────────────┘
                            │
           ┌────────────────┼────────────────┐
           ▼                ▼                ▼
    ┌─────────────┐  ┌─────────────┐  ┌──────────────┐
    │  DialogGPT  │  │ Agent Flows │  │  AI Agents   │
    │─────────────│  │─────────────│  │──────────────│
    │   Agentic   │  │ Dialog Task │  │ Tool-Calling │
    │orchestration│  │      +      │  │      +       │
    │ No training │  │ Agent Nodes │  │   External   │
    │ data needed │  │             │  │ Integrations │
    └──────┬──────┘  └──────┬──────┘  └──────┬───────┘
           └────────────────┼────────────────┘
                            │
           ┌────────────────┴────────────────┐
           ▼                                 ▼
 ┌──────────────────┐            ┌──────────────────┐
 │   Conversation   │            │    Evaluation    │
 │    Management    │            │──────────────────│
 │──────────────────│            │  Testing Suite   │
 │  Interruptions   │            │  Validate flows  │
 │  Clarifications  │            │  Pre-deployment  │
 │ Context Switches │            │      checks      │
 └──────────────────┘            └──────────────────┘
DialogGPT: Agentic orchestration engine that routes conversations using generative models; no training data required.
Agent Flows: Conversational workflows combining Dialog Tasks and Agent Nodes for goal-driven service interactions.
AI Agents: Agent Nodes with tool calling that handle complex tasks via contextual intelligence and external integrations.
Conversation Management: Handles interruptions, clarifications, and context switches mid-conversation.
Evaluation: Testing suite to validate conversational workflows before deployment.

DialogGPT Orchestration

DialogGPT analyzes each user message and routes it to the appropriate handler:
┌─────────────────────────────────────────────────────────┐
│                      User Message                       │
└──────────────────────────┬──────────────────────────────┘
                           │
                           ▼
┌─────────────────────────────────────────────────────────┐
│                  DialogGPT Orchestrator                 │
│                                                         │
│  Analyzes intent, context, and confidence to route to:  │
│                                                         │
│   ┌─────────────┐   ┌─────────────┐   ┌─────────────┐   │
│   │    Agent    │   │ Generative  │   │    Agent    │   │
│   │    Flows    │   │     AI      │   │   Handoff   │   │
│   │             │   │             │   │             │   │
│   │ Structured  │   │ LLM-powered │   │  Transfer   │   │
│   │    tasks    │   │             │   │  to human   │   │
│   └─────────────┘   └─────────────┘   └─────────────┘   │
└─────────────────────────────────────────────────────────┘
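The routing decision above can be pictured as a simple confidence check. This is purely illustrative: the actual orchestration logic is internal to DialogGPT, and the function name, field names, and threshold below are all invented for the sketch.

```javascript
// Illustrative only: DialogGPT's real routing is internal to the platform.
// The `analysis` shape and the 0.7 threshold are hypothetical.
function routeMessage(analysis) {
  if (analysis.needsHuman) {
    return "Agent Handoff";   // transfer to a human agent
  }
  if (analysis.matchedFlow && analysis.confidence >= 0.7) {
    return "Agent Flows";     // structured, goal-driven task
  }
  return "Generative AI";     // fall back to an LLM-powered answer
}
```

The key idea the sketch captures is that a structured flow is only chosen when both an intent match and sufficient confidence are present; everything else falls through to a generative answer or a human.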

Build an AI Agent

Follow these steps to create, configure, test, and deploy an AI Agent.

1. Create and Configure the App

Create the app: Create a DialogGPT-based app. The platform auto-enables the required XO GPT models; Dialogs and FAQs are on by default. Guided Onboarding →
Configure DialogGPT: Define Conversation Types and set models for Chunk Shortlisting and Conversation Orchestration. Enable or disable Intent and Conversation Events and override defaults as needed. Conversation Orchestration →

2. Configure Generative AI

Integrate an LLM: Connect to a supported LLM provider, a bring-your-own model, or Kore.ai XO GPT. For Tool Calling, Streaming Responses, or Dynamic Variables, use a custom prompt with a pre-built or custom integration. LLM Integration →
Create a custom prompt: Tailor model behavior per use case; build from scratch or import an existing prompt. Agent Node supports tool calling and prompt streaming with OpenAI/Azure OpenAI response formats in custom JavaScript V2 prompts. Prompts Library →
Enable GenAI features: Activate LLM-powered features that accelerate development and improve runtime performance. Features must be explicitly enabled before use. GenAI Features →
Configure data safeguards: Enable PII/sensitive data anonymization at the assistant and LLM level. Data Anonymization → Set up Guardrails to enforce appropriate AI outputs. Guardrails →
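The OpenAI/Azure OpenAI response format mentioned above follows the well-known Chat Completions request shape. Below is a minimal sketch of the kind of body a custom JavaScript V2 prompt might assemble; the model name and the `context` fields are placeholders, not names defined by the platform.

```javascript
// Sketch of an OpenAI-style Chat Completions request body, the format
// custom JavaScript V2 prompts can target. "gpt-4o" and the `context`
// fields are placeholders, not values mandated by the platform.
function buildPrompt(context) {
  return {
    model: "gpt-4o",
    messages: [
      { role: "system", content: "You are a helpful service assistant." },
      { role: "user", content: context.userMessage }
    ],
    stream: true   // request streamed (incremental) responses
  };
}
```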

3. Build Flows and Connect Knowledge

Create a Dialog: Define conversation flows using interlinked nodes. Nodes retrieve data, perform actions, call external apps, send messages, and control branching logic. Dialog Tasks →
Add an Agent Node: Use LLMs and tool calling to handle complex tasks, collect entities, and integrate with external systems. Supports multilingual conversations and contextual intelligence. Agent Node →
Connect Search AI: Index content from websites, documents (PDF, Office), and third-party systems (ServiceNow, Confluence, etc.) to give DialogGPT a reliable knowledge base. Content Sources →
Add supporting nodes: Use Prompt, Entity, and Agent Transfer nodes with transitions to complete the conversation flow. Nodes Overview →
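As a mental model, a dialog can be pictured as a map of interlinked nodes joined by transitions. The object shape below is invented purely for illustration; it is not the platform's actual dialog definition or export format.

```javascript
// Hypothetical picture of a dialog flow. Node types mirror those named
// above (Entity, Prompt, Agent Transfer), but this shape is illustrative
// only, not the platform's definition format.
const dialog = {
  start: "askAccount",
  nodes: {
    askAccount:   { type: "Entity",         prompt: "What is your account number?", next: "fetchBalance" },
    fetchBalance: { type: "Agent Node",     tool: "lookupBalance",                  next: "showBalance" },
    showBalance:  { type: "Prompt",         message: "Here is your balance.",       next: "handoff" },
    handoff:      { type: "Agent Transfer", next: null }
  }
};

// Check that every transition points at a real node (or ends the flow).
function validateTransitions(flow) {
  return Object.values(flow.nodes).every(
    n => n.next === null || n.next in flow.nodes
  );
}
```

Thinking of a dialog this way makes the "interlinked nodes" idea concrete: each node does one job and hands the conversation to the next one.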

4. Test

Interactive testing: Validate your app in real time using the built-in Playground before publishing. Playground →
Batch testing: Upload CSV or JSON test cases to validate accuracy and reliability at scale. Generates comprehensive performance metrics. Batch Testing →
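A batch test case generally pairs an utterance with its expected outcome. The field names below are hypothetical; consult the Batch Testing documentation for the exact schema the platform accepts.

```javascript
// Hypothetical batch test cases; field names are illustrative, not the
// platform's required schema.
const testCases = [
  { input: "I want to reset my password",  expected: "Password Reset Flow" },
  { input: "What are your business hours?", expected: "FAQ" }
];

// The same cases as CSV: a header row plus one line per case.
const csv = [
  "input,expected",
  ...testCases.map(t => `"${t.input}","${t.expected}"`)
].join("\n");
```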

5. Deploy

Enable a channel: Connect the agent to one or more voice or digital channels. The agent isn’t accessible to users until at least one channel is enabled. Digital Channels →
Configure agent transfer: Set up transfer to a human agent. Integrations are platform-hosted; no custom BotKit is required. Agent Transfer →
Publish: Submit the app through the publishing flow for admin review before making it available to end users. Publishing →