Workflow tools are built by connecting nodes on a visual canvas. Each node type serves a specific purpose—processing data, making decisions, integrating with external systems, or pausing for human review.
```
Start → Validate → API Call → Transform → Condition → End
                       ↓
                 Error Handler
```
- Synchronous: waits for the response (timeout: 5–180s, default 60s)
- Asynchronous: continues without waiting (timeout: 30–300s, default 60s; a No timeout option is available for an indefinite wait)
Request Definition
API endpoint URL or cURL, auth profile, headers, and body
Auth options:
- Pre-authorize the integration: Use system-level credentials shared across all users.
- Allow users to authorize: Each user authenticates at runtime (for example, Google Drive).
On Success / On Failure: Configure downstream nodes for each path.
Async timeout: When using No timeout for asynchronous operations, enable the No timeout option on both the API node and the parent tool; otherwise timeout errors may still occur.
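The timeout ranges above can be pictured with a small sketch. The helper below is illustrative only (not a platform API); it clamps a requested timeout to the allowed window for each mode, using the ranges stated in this section:

```javascript
// Illustrative only: clamp a configured timeout (in seconds) to the
// documented range for each mode — sync 5–180s, async 30–300s, default 60s.
function effectiveTimeout(mode, requested = 60) {
  const [min, max] = mode === "sync" ? [5, 180] : [30, 300];
  return Math.min(Math.max(requested, min), max);
}
```

For example, requesting a 300-second timeout on a synchronous call would be capped at the 180-second maximum.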
Invoke a function from an imported and deployed script.
Select Script: Choose a deployed script from the Script name list. Deploy scripts via Settings > Manage custom scripts.
Select Function: Choose a function from the Function name list. Only one function per node; only deployed scripts are listed.
Map Input Arguments: Assign static or dynamic values to each argument. Select the correct data type (String, Number, JSON, Boolean). Type {{ to trigger context variable suggestions.
Test: Click Test, enter values in the Input panel, then click Execute.
Output (result key) is saved to {{context.steps.functionnodename.output}}. Errors (stderr) are saved to {{context.steps.functionnodename.error}}.
Input argument mapping is required for deployment. You can test the function and tool, but you cannot deploy until all mapping errors are fixed.
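As an illustration, a deployed script might expose a function like the following. The name, arguments, and logic are invented for this sketch; the arguments correspond to the node's mapped input arguments, and the returned value is what surfaces at {{context.steps.functionnodename.output}}:

```javascript
// Hypothetical deployed-script function invoked by a Function node.
// Mapped input arguments: total (Number), tier (String).
function classifyOrder(total, tier) {
  const isHighValue = Number(total) > 1000;
  // The returned object becomes the node's output in the context object.
  return {
    tier,
    isHighValue,
    label: isHighValue ? "priority" : "standard"
  };
}
```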
Function nodes can write custom keys directly to the context object, making computed or intermediate values available to any downstream node in the flow. This is useful for storing control variables, enriched data, or flags that multiple nodes need to reference.
```javascript
// Store a simple value
context.userTier = "premium";

// Store structured data computed from upstream outputs
context.enrichedOrder = {
  orderId: context.steps.FetchOrder.output.id,
  isHighValue: context.steps.FetchOrder.output.total > 1000,
  processedAt: new Date().toISOString()
};
```
Once set, reference the custom key in any downstream node using {{context.userTier}} or {{context.enrichedOrder.isHighValue}}.
Do not override system-reserved keys in the context object.
Use memory stores to retain and share data across steps or sessions. Data is stored as JSON; always check the memory store schema for field names and types. Memory stores are accessed per their defined scope — use projections to retrieve only the fields you need.
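As a language-level illustration of projections (this is plain JavaScript, not a platform memory-store API), retrieving only the fields you need from a stored JSON record might look like:

```javascript
// Generic projection helper (hypothetical, not platform code): return a new
// object containing only the requested fields from a memory-store record.
function project(record, fields) {
  return Object.fromEntries(
    fields.filter(f => f in record).map(f => [f, record[f]])
  );
}
```

Requesting only `orderId` and `status` from a larger record would then return a two-field object, leaving the rest of the stored JSON untouched.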
Add at least one service provider connection via Settings > Integrations before configuring this node. Test the connection in Settings to confirm it works.
IF Condition: Enter a context variable (for example, {{context.ambiguous_sub_categories}}), choose an operator, and enter a value or another variable (for example, {{context.steps.NodeName.output}}). Combine multiple criteria with AND/OR.
Routing: Set Go To (IF met) and ELSE (IF not met) nodes.
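A rough sketch of how combined criteria resolve at runtime. The operator names and values here are invented for illustration; actual operands stand in for context variables such as {{context.steps.NodeName.output}}:

```javascript
// Illustrative evaluation of an IF condition with two criteria combined by AND.
const ops = {
  greater_than: (a, b) => a > b,   // hypothetical operator names
  equals: (a, b) => a === b
};

const criteria = [
  { left: 0.92, op: "greater_than", right: 0.8 },
  { left: "billing", op: "equals", right: "billing" }
];

const ifMet = criteria.every(c => ops[c.op](c.left, c.right)); // AND
const anyMet = criteria.some(c => ops[c.op](c.left, c.right)); // OR
```

When `ifMet` is true the flow takes the Go To node; otherwise it takes the ELSE node.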
- Write your own: Enter a System Prompt (model role) and a Human Prompt (task instructions)
- Prompt Hub: Select a saved prompt and version; optionally customize it
| Setting | Description |
| --- | --- |
| Select Model | Choose a configured LLM |
| Timeout | 30–180 seconds (default: 60s) |
| Response JSON schema | Optional; define the structure for predictable output |
| Model Configurations | Temperature, Top-p, Top-k, Max Tokens |
System vs. Human prompts:
System Prompt: Sets the model’s role. Example: “You are a helpful assistant.”
Human Prompt: The task or question. Example: “Summarize this error log.” Use {{context.variable_name}} for dynamic values.
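For example, a prompt pair for a log-summarization task might look like the following. The node name FetchLogs is hypothetical; the {{...}} placeholder is resolved from the context object at runtime:

```javascript
// Example System/Human prompt pair for an LLM node (illustrative).
const systemPrompt =
  "You are a helpful assistant that summarizes error logs.";
const humanPrompt =
  "Summarize this error log:\n{{context.steps.FetchLogs.output}}";
```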
Tool calling settings:

| Setting | Description |
| --- | --- |
| Add Tools | Select up to 3 tools from your account |
| Exit node execution after | Number of model calls before exiting to the failure path |
| Tool choice | Auto (model decides) or Required (always calls a tool) |
| Parallel tool calls | True for simultaneous calls; False for sequential calls |
When tool calling is enabled, the model autonomously decides whether to use its internal knowledge or invoke an external tool to complete the task.
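The "Exit node execution after" setting can be pictured as a call-count guard, sketched below. The helper and its shapes are illustrative, not platform code:

```javascript
// Sketch: run model calls up to a limit; if no call completes the task,
// exit to the failure path (mirrors "Exit node execution after").
function runWithExitLimit(callModel, maxCalls) {
  for (let calls = 1; calls <= maxCalls; calls++) {
    const step = callModel(calls); // may answer directly or invoke a tool
    if (step.done) return { path: "success", output: step.output, calls };
  }
  return { path: "failure", calls: maxCalls };
}
```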
Set up a Search AI App: Configure a Search AI application and enable the Answer Generation API scope.
Link Search AI in the Platform: Go to Settings > Integrations > Search AI > Link an App. Enter the app credentials, test the connection, and confirm. Use https://platform.kore.ai for the Search AI URL.
When the workflow reaches the Human Node, it sends a POST request to the configured endpoint. Execution pauses until the reviewer responds, times out, or a delivery failure occurs.
| Mode | Behavior |
| --- | --- |
| Sync | The workflow pauses and waits for the reviewer's response within the endpoint timeout |
| Async | The workflow sends an immediate acknowledgement and continues; it notifies the callback URL when the request is sent, when the response is received, and when the final output is generated |
The callback URL remains active until the configured wait time elapses; late or duplicate responses are ignored.

Sync mode scenarios:
| Scenario | Debug Panel Output | Flow Outcome |
| --- | --- | --- |
| Reviewer responds before timeout | Response JSON with key-value pairs | Continues along the Success path |
| Channel/platform failure | Error message in the Response section | Continues along the Failure path |
| Timeout - Terminate | Response JSON with null values | Ends along the End/Terminate path |
| Timeout - Skip & Continue | Response JSON with null values | Skips the Human node; continues to the next configured node |
| Endpoint timeout before reviewer response | Timeout error JSON | Stops with a Timeout error |
| Endpoint timeout after reviewer response | Timeout error JSON | Stops with a Timeout error |
Async mode scenarios:
| Scenario | Debug Panel Output | Flow Outcome |
| --- | --- | --- |
| Reviewer responds before timeout | Response JSON with key-value pairs | Continues along the Success path |
| Channel/platform failure | Error message in the Response section | Continues along the Failure path |
| Timeout - Terminate | Response JSON with null values | Ends along the Terminate path |
| Timeout - Skip & Continue | Response JSON with null values | Skips the Human node; continues to the next configured node |
1. Request Destination: Select Custom Request in Send & wait for response (currently the only supported option).
2. Request Definition: Click Define Request and provide:
| Field | Details |
| --- | --- |
| Request Type | POST only |
| API Endpoint URL | Endpoint URL or cURL command |
| Auth Profile | Pre-authorize (system credentials) or allow users to authorize (per-user runtime auth) |
| Headers | Key-value pairs; CallbackURL and Token are auto-included |
| Body | Auto-generated at runtime from Input Fields + Reviewer Note; extra keys added for testing are ignored at runtime |
3. Input Fields: Define fields the reviewer must fill in.
- Supported types: Text, Number, Boolean, Date
- Set default values and mark each field required or optional
- Pre-fill with context variables: {{context.user.name}}
- Click Payload preview to inspect the full payload
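A hypothetical auto-generated body is sketched below, assuming two reviewer-filled input fields and one field pre-filled from a context variable. The field names, values, and overall shape are illustrative; the actual payload comes from Payload preview:

```javascript
// Hypothetical Human node request body (illustrative shape only).
const payload = {
  inputFields: {
    Approval: null,                 // Boolean, required — filled by the reviewer
    Comments: "",                   // Text, optional
    RequesterName: "Ada Lovelace"   // pre-filled from {{context.user.name}}
  },
  reviewerNote: {
    subject: "Approve refund request",
    body: "Please review the attached order details and approve or reject."
  }
};
```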
4. Reviewer Note:
| Field | Description |
| --- | --- |
| Subject line | Email subject or message title |
| Message body | Context or instructions for the reviewer (resolved at runtime) |
| Assign to | Reviewer's email address |
5. Timeout Behavior:
- No timeout: Waits indefinitely.
- Set timeout: Default 120 seconds (configurable in seconds, minutes, hours, or days).
6. Outcome Paths:
| Outcome | Behavior |
| --- | --- |
| On Success | All mandatory fields received a response; the workflow continues along the success path; supports parallel branching |
| On Timeout - Terminate | No response within the timeout; the flow ends via the End node; if the End node is deleted, the flow automatically switches to Skip |
| On Timeout - Skip | No response within the timeout; continues with null output to the next node |
```
// Full response payload
{{context.steps.NodeName.output}}

// Specific fields
{{context.steps.NodeName.output.Approval}}
{{context.steps.NodeName.output.Comments}}
```
Inside Loops: The loop does not advance to the next iteration until the Human node receives a response.
In Parallel Branches: The branch merge waits for the Human node to complete before continuing.
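The loop behavior can be sketched in plain JavaScript, where `humanReview` stands in for the paused Human node and each `await` marks the point where iteration blocks until the reviewer responds (or the timeout fires):

```javascript
// Sketch: a loop does not advance to the next item until the Human node
// for the current item resolves.
async function runLoop(items, humanReview) {
  const results = [];
  for (const item of items) {
    // Execution pauses here until a response, timeout, or failure occurs.
    results.push(await humanReview(item));
  }
  return results;
}
```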