v11.23.1 April 11, 2026
This update includes enhancements and bug fixes. The key enhancements included in this release are summarized below.
Agent Flow
Context-Aware Custom Prompts for Rephrase Responses
The context object is now supported in the post-processor scripts within the Rephrase Responses custom prompts, regardless of the model used. Previously, these scripts only supported static content. Developers can now render templates dynamically using real-time context data, enabling greater flexibility across post-processor scripts.
Agent Transfer
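As an illustrative sketch only (the `context` shape and field names below are assumptions for the example, not the platform's documented schema), a post-processor that renders a rephrased response using real-time context data might look like this:

```javascript
// Hypothetical post-processor sketch: inject live context data into the
// rephrased response instead of relying on static content.
// The context structure shown here is illustrative, not the actual schema.
function postProcessor(response, context) {
  const userName = context.session?.UserContext?.firstName || "there";
  const channel = context.session?.BotUserSession?.channel || "web";
  // Render the template dynamically with values available at runtime.
  return `Hi ${userName} (${channel}): ${response.trim()}`;
}

const rephrased = postProcessor("  Your order has shipped.  ", {
  session: {
    UserContext: { firstName: "Asha" },
    BotUserSession: { channel: "web" },
  },
});
```

The same response text now yields different output per user and channel, which is the flexibility the context object enables.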
Salesforce MIAW - Display Real-Time Queue Position
End users can now see their real-time position in the Salesforce MIAW queue directly in the chat window while waiting to connect with a live agent. The position updates automatically as the queue moves, improving transparency and reducing drop-offs. Learn more→
Agent Transfer Node
Developers can now configure a dialog trigger directly within the Agent Transfer node to automatically execute a predefined dialog, such as a feedback survey, after every agent conversation ends. Available for Genesys integrations only. Learn more→
v11.23.0 March 28, 2026
This update includes enhancements and bug fixes. The key enhancements included in this release are summarized below.
Agent Flow
Conversation History Length Doubled in the Agent Node and DialogGPT
The conversation history now supports up to 50 messages, doubling the previous limit. This improves retention for custom prompts and prevents mid-conversation restarts, context loss, and repeated questions. You can configure the limit independently for DialogGPT and the Agent Node. Learn more→
Agent Transfer
Control Session Closure Message for Genesys
A new Disable End Conversation Message setting in Genesys configurations allows administrators to control whether users see a session closure message after an agent conversation ends. The setting is disabled by default. Learn more→
Genesys Agent Message ID Available via Context Variable
After an agent handoff, when an agent responds in Genesys, the message ID is now exposed via a context variable and recorded in debug logs. This message ID helps trace the agent who handled the conversation. Learn more→
v11.22.1 March 14, 2026
This update includes bug fixes.
v11.22.0 February 28, 2026
This update includes enhancements and bug fixes. The key enhancements included in this release are summarized below.
DialogGPT
DialogGPT-Based Apps Now Support nlMeta
DialogGPT-based apps now support handling interruptions across linked apps using the nlMeta object. This object allows you to pass information directly to the AI Agent, which prioritizes and executes the specified intent before processing any other input. Learn more→
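As a hypothetical sketch (the payload field names such as `intent` are illustrative assumptions, not the platform's documented nlMeta schema), a message carrying an nlMeta object to prioritize a specific intent might look like this:

```javascript
// Hypothetical sketch of an incoming message carrying an nlMeta object.
// Field names are illustrative; consult the platform docs for the real schema.
const message = {
  text: "also, what's my account balance?",
  nlMeta: {
    intent: "CheckBalance", // intent to prioritize before other input
  },
};

// The AI Agent would prioritize the nlMeta intent, if one is specified,
// before falling back to normal intent detection.
function resolveIntent(msg, detectedIntent) {
  return msg.nlMeta?.intent || detectedIntent;
}
```

Here the specified intent wins over whatever intent detection would otherwise produce, which matches the prioritize-and-execute behavior described above.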
Agent Flow
Context-Aware Custom Prompts in Agent Node
The context object is now supported in pre-processor and post-processor scripts within Agent Node custom prompts. Previously, these scripts only supported static content. With this update, developers can render templates dynamically using real-time context data, enabling greater flexibility across pre- and post-processor features.
v11.21.1 January 31, 2026
This update includes bug fixes.
v11.21.0 January 17, 2026
This update includes enhancements and bug fixes. The key enhancements included in this release are summarized below.
DialogGPT
Multiple Intent Descriptions
DialogGPT now allows users to add multiple descriptions for each intent, giving the flexibility to define various phrasings, perspectives, and explanations. This enhancement helps users achieve broader semantic coverage and significantly improves the accuracy of intent shortlisting and detection, which is especially valuable when working with broad, overlapping, or domain-specific intents.
Agent Node
Ability to Render Rich UI Components from LLM Responses
The Agent Node can now pass structured JSON responses from the LLM to client channels for rich UI presentation. When users enable the “Parse Rich Templates” option in custom prompt settings (available for V1 and V2 prompts), the node passes the JSON payloads as structured responses to the platform, which then sends them as templates to client channels. The structured JSON can be produced either by prompting the model to generate responses in JSON format or by building the templates in the prompt post-processor. Client channels render these templates as supported UI components such as cards, lists, tables, and suggestion chips, enabling visually engaging information display beyond plain text.
Security (API Scopes)
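As a minimal sketch under stated assumptions (the template shape below, including `template_type` and `elements`, is illustrative and not the platform's exact payload format), a prompt post-processor could wrap a JSON array from the model into a list template like this:

```javascript
// Hypothetical post-processor sketch: convert the LLM's JSON output into a
// structured template payload that a client channel can render as a list.
// The payload shape is an assumption for illustration only.
function toListTemplate(llmOutput) {
  const items = JSON.parse(llmOutput); // expect a JSON array from the model
  return {
    type: "template",
    payload: {
      template_type: "list",
      elements: items.map((it) => ({
        title: it.name,
        subtitle: it.detail,
      })),
    },
  };
}

const tpl = toListTemplate('[{"name":"Savings","detail":"$1,200"}]');
```

The channel would then render the structured payload as a list component instead of displaying the raw JSON as plain text.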
End-to-End Request and Response Payload Encryption for APIs
Payload encryption support has been extended to request payloads for selected APIs, in addition to the existing response payload encryption. The “Enforce Request and Response Payload Encryption” option (renamed from “Enforce Response Payload Encryption”) now encrypts both request and response payloads. When enabled, the system generates a key to encrypt payloads in both directions, ensuring full end-to-end data protection for APIs within the JWT application’s assigned scope.
Flows & Channels
Enhanced Response Structure for the Get Linked Apps API
The Get Linked Apps API has been enhanced with a revised response structure to retrieve all linked applications associated with a universal or parent app.
Agent Transfer
Read Receipts for WhatsApp Cloud API Integration
The Platform now supports WhatsApp Cloud API read receipts, allowing end users to see message status indicators (delivered and read) across both bot and agent conversations. This feature is automatically available for all existing and new WhatsApp Cloud API integrations.
Availability of Salesforce MIAW Conversation ID
The Platform now exposes the Salesforce MIAW Conversation ID in the bot user context during agent transfer. This enables Message, Script, and Call Flow nodes, as well as external integrations, to access the active MIAW conversation identifier at runtime for tracking, correlation, and conditional logic.
NLU Config
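As an illustrative sketch (the context path `BotUserSession.miawConversationId` is an assumption for this example, not the documented variable name), a Script node could read the active MIAW conversation identifier like this:

```javascript
// Hypothetical Script-node sketch: read the Salesforce MIAW Conversation ID
// from the bot user context at runtime for tracking or conditional logic.
// The exact context path is illustrative; check the platform docs.
function getMiawConversationId(context) {
  const id = context.session?.BotUserSession?.miawConversationId;
  if (!id) {
    throw new Error("No active MIAW conversation in context");
  }
  return id;
}

const ctx = {
  session: { BotUserSession: { miawConversationId: "conv-42" } },
};
```

An external integration could pass the returned identifier along with its own records to correlate platform sessions with Salesforce-side conversations.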
BGE M3 Embeddings Support
The Platform now supports BGE M3 Embeddings as an additional option for Knowledge Graph, Machine Learning, and Few-Shot NLP use cases across all languages. Users can select BGE M3 alongside existing options (MPNet and LaBSE) to enhance multilingual performance and improve retrieval accuracy.