Experiments

Adding Experiments

You can define experiment parameters and add them to an active SearchAssist app. Activating an experiment splits search query traffic randomly among its variants.

Follow these steps to create an experiment:

  1. Click the Analytics menu tab.
  2. Select Experiments on the drop-down menu.
  3. Click Add Experiment.
  4. In the New Experiment dialog box, enter a unique name in the field.

Adding Variants

By default, SearchAssist displays two rows of variants. Click Add variant to add up to two more variants, for a maximum of four.

  1. Enter a unique Variant Name.
  2. Click the Index field and select an option on the drop-down menu.
  3. Click the Search Configuration field and select an option on the drop-down menu.
  4. Repeat steps to create a second variant.
  5. (Optional) Click Add Variant to add more variants.

Note: Make sure you train each index before running an experiment. Untrained indices prevent the experiment from running.

Customizing Traffic Settings

  1. Under the Traffic section, adjust the slider to set the percentage of user query traffic routed to each variant.
  2. In the Duration field, enter the number of days the experiment should run, up to a maximum of 90 days.
  3. Click Add.
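SearchAssist handles the traffic split internally. As an illustrative sketch only (not the product's actual implementation), assigning each incoming query to a variant according to the configured traffic percentages amounts to a weighted random choice; the variant names below are hypothetical:

```python
import random

def assign_variant(weights):
    """Pick a variant name at random, weighted by its traffic percentage.

    `weights` maps variant names to traffic percentages that sum to 100.
    """
    names = list(weights)
    return random.choices(names, weights=[weights[n] for n in names], k=1)[0]

# Example: a 70/30 split between two hypothetical variants.
split = {"variant_a": 70, "variant_b": 30}
counts = {name: 0 for name in split}
for _ in range(10_000):
    counts[assign_variant(split)] += 1
# counts["variant_a"] ends up at roughly 7,000 of the 10,000 queries.
```

Because each query is assigned independently at random, the observed split converges to the configured percentages only over a large number of queries.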

Running Experiments

To run an experiment:

  1. Click the Analytics menu tab.
  2. Click Experiments in the drop-down menu.
  3. On the Experiments page, find the list of experiments.
  4. Click Run on the respective experiment.

Note: You can run only one experiment at a time.

Once the experiment starts, SearchAssist changes the status to Active and displays the duration.

Notes

  1. You cannot run multiple experiments simultaneously. To run a new experiment, pause or stop the currently Active experiment.
  2. You cannot edit a completed experiment.

Getting Insights from Experiments

You can monitor an active experiment's progress by clicking it. Based on insights drawn from the experiment metrics, you can make informed decisions, such as which variant to apply to achieve your business objectives.

For each variant, you can view the following analytics:

  1. Index assigned
  2. Search configuration mapped
  3. Traffic percentage assigned
  4. Number of users
  5. Number of searches
  6. Count of clicks
  7. Click-through rate for each variant (i.e., the number of unique clicks per appearance)
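SearchAssist computes and reports these metrics for you. As a hedged sketch of how the last metric, click-through rate (unique clicks per appearance), is conventionally calculated:

```python
def click_through_rate(unique_clicks, appearances):
    """Return CTR as a percentage: unique clicks per appearance.

    Guards against division by zero for variants with no traffic yet.
    """
    if appearances == 0:
        return 0.0
    return 100.0 * unique_clicks / appearances

# Example: a variant shown 2,000 times and clicked 150 times.
print(click_through_rate(150, 2000))  # 7.5
```

Comparing CTR across variants is more meaningful than comparing raw click counts, because each variant receives a different share of the traffic.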

Managing Experiments

SearchAssist allows you to perform various actions on experiments. Hover over the right side of an experiment row to display the action icons.

  • Pause or stop an active experiment
  • Restart or delete a paused or stopped experiment
  • Edit an experiment

You cannot edit a completed experiment. To edit a configured, but not active, experiment:

  1. Click the Configured label. The experiment table expands.
  2. Click View configuration.
  3. Make changes.
  4. Click Add.
