How to Get Started with the Generative AI Controller in ServiceNow


Ramya Priya Balasubramanian

Practice Head ServiceNow

February 14, 2025

Discover the seamless way to automate intelligent responses and content creation

Introduction to the Generative AI Controller

The Generative AI Controller is a new feature in the Vancouver release of ServiceNow, designed to integrate large language models (LLMs) from providers like OpenAI and Azure OpenAI. It enables users to leverage AI for content generation, summarization, question-answering, and sentiment analysis directly within ServiceNow. This tool provides built-in actions compatible with ServiceNow environments, offering a seamless way to automate intelligent responses and content creation in workflows.

Capabilities

  • Summarize
  • Content Generation
  • QnA (used in Now Assist capabilities)
  • Sentiment Analysis
  • Generic Prompt


1. Summarize

This capability takes large blocks of text (like customer messages, reports, or long descriptions) and generates a condensed summary. It’s especially useful in cases where a quick overview is needed, such as summarizing customer support logs, incident reports, or meeting notes.

  • Example Use: Summarizing a long customer support chat into a few key points to quickly assess the problem.

2. Content Generation

This feature generates content based on a topic or prompt provided by the user. It can create new text by understanding the context of the prompt, making it useful for drafting emails, creating knowledge base articles, or generating automated responses in Virtual Agent. 

  • Example Use: Creating an initial draft for a knowledge base article based on a simple prompt or creating personalized responses for customer inquiries.

3. QnA (Questions and Answers)

This feature allows the system to answer specific questions based on provided context. Primarily designed for use in Now Assist capabilities, this helps virtual agents and automated workflows to retrieve precise information from large documents or knowledge bases.

  • Example Use: A customer asks a virtual agent about a specific policy or procedure, and the QnA function provides an accurate answer based on stored content.

4. Sentiment Analysis

Sentiment Analysis helps detect the emotional tone of a piece of text, identifying whether the content is positive, neutral, or negative. It’s particularly useful for customer service, where sentiment detection can guide the prioritization of cases or personalize responses.

  • Example Use: Flagging negative customer feedback for quick escalation or measuring customer satisfaction trends in support interactions.

5. Generic Prompt

This capability allows for custom prompts where the user can directly interact with the AI model without predefined functions. It’s versatile, letting users define unique tasks based on specific needs.

  • Example Use: An IT manager could use a custom prompt to get insights into new security trends, or a customer support team could experiment with prompts to generate unique responses.

Each of these capabilities is integrated into ServiceNow’s Flow Designer and Virtual Agent, allowing developers to embed AI-driven responses and automate processes that previously required manual input. They enable ServiceNow users to create a more personalized, efficient, and proactive service experience.

Configuration Guide

Step 1: Install the Generative AI Controller Plugin

The “sn_generative_ai” plugin is necessary to enable the Generative AI Controller in your instance. Note that this plugin is not available for Personal Developer Instances (PDIs), so it must be installed on an enterprise instance.

  1. Go to Application Manager in your ServiceNow instance.
  2. In the search bar, type Generative AI Controller to locate the plugin.
  3. Click the Generative AI Controller plugin in the search results.
  4. Select Install and follow any additional prompts to complete the installation.

Step 2: Obtain an API Key from a Supported Provider

  1. Sign up with a supported generative AI provider such as OpenAI or Microsoft Azure OpenAI.
  2. Generate an API key from the provider’s portal:
    • For OpenAI: OpenAI Platform
    • For Microsoft Azure: Azure OpenAI Service
  3. Save the API key securely; it will be used for configuration in the next step.

Step 3: Configure API Credentials in ServiceNow

  1. In your ServiceNow instance, navigate to Connection & Credential Aliases.
  2. Open the OpenAI record or create a new record if necessary.
  3. Under Related Links, select Create New Connection & Credential.
  4. Enter the API key in the API Key field and save.

Step 4: Set System Properties for Provider Configuration

  1. Navigate to System Properties (sys_properties.list) in ServiceNow.
  2. Locate the property named com.sn.generative.ai.provider.
  3. Set the value to either openai or azure based on the provider you’ve configured.
  4. Save the changes.
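
If you prefer to set this property from a background script rather than the sys_properties list, a minimal sketch (using the property name referenced in the steps above; confirm it matches your instance) could look like this:

```javascript
// Minimal sketch: point the Generative AI Controller at a provider
// from a background script. The property name comes from the step above;
// verify it exists in sys_properties on your instance before relying on it.
gs.setProperty('com.sn.generative.ai.provider', 'openai'); // or 'azure'
gs.info('Generative AI provider is now: ' + gs.getProperty('com.sn.generative.ai.provider'));
```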

Step 5: Test and Use Generative AI Controller in Flow Designer or Script

  1. Flow Designer
    • Create a new flow and set a trigger condition.
    • Add an action from Generative AI Controller and select capabilities like “Summarization” or “Content Generation.”

  2. Scripting
    • You can directly call actions like sn_generative_ai.summarize in scripts for more custom use cases; see the snippets below.

Snippets

Summarize
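
A minimal server-side sketch of the Summarize capability, run through the Flow Designer scripting API (sn_fd.FlowAPI). The action name follows this article's sn_generative_ai.summarize example, and the input and output names are assumptions; verify both against the actions installed with the Generative AI Controller in Flow Designer.

```javascript
// Sketch: summarize a block of text by invoking the controller's action
// from a server-side script. Action and parameter names are assumed.
var inputs = {
    input_text: 'Customer reported repeated VPN drops since Monday. ' +
                'They tried reconnecting, rebooting, and reinstalling the client, ' +
                'but the issue persists on both home and office networks.'
};

var result = sn_fd.FlowAPI.getRunner()
    .action('sn_generative_ai.summarize')   // assumed internal action name
    .inForeground()                          // run synchronously and wait for outputs
    .withInputs(inputs)
    .run();

gs.info('Summary: ' + JSON.stringify(result.getOutputs()));
```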

Generate content
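
A similar sketch for the Content Generation capability; the action name and the prompt input used here are assumptions to check against your instance.

```javascript
// Sketch: generate draft content from a topic or prompt.
var result = sn_fd.FlowAPI.getRunner()
    .action('sn_generative_ai.generate_content')  // assumed action name
    .inForeground()
    .withInputs({
        prompt: 'Draft a short knowledge base article explaining how users can reset their VPN password.'
    })
    .run();

gs.info('Generated content: ' + JSON.stringify(result.getOutputs()));
```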

Generic Prompt

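A sketch of a free-form call through the Generic Prompt capability; again, the internal action name and input key are assumptions rather than confirmed identifiers.

```javascript
// Sketch: send a custom prompt straight to the configured LLM provider.
var result = sn_fd.FlowAPI.getRunner()
    .action('sn_generative_ai.generic_prompt')    // assumed action name
    .inForeground()
    .withInputs({
        prompt: 'List three questions an agent should ask when a laptop will not power on.'
    })
    .run();

gs.info('Response: ' + JSON.stringify(result.getOutputs()));
```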

QnA

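A sketch of the QnA capability answering a question against supplied context; confirm the action and parameter names shown here, which are assumptions, before reusing the pattern.

```javascript
// Sketch: answer a question using only the supplied context text.
var result = sn_fd.FlowAPI.getRunner()
    .action('sn_generative_ai.qna')               // assumed action name
    .inForeground()
    .withInputs({
        question: 'How many days do employees have to submit expense reports?',
        context: 'Expense reports must be submitted within 30 days of the purchase date. ' +
                 'Reports submitted late require director approval.'
    })
    .run();

gs.info('Answer: ' + JSON.stringify(result.getOutputs()));
```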

Frequently Asked Questions

  • Which LLM providers are supported? OpenAI and Azure OpenAI.
  • Which capabilities are included? Summarization, Content Generation, QnA, and Generic Prompt.
  • Which ServiceNow release is required? Vancouver Patch 2 or later.
  • How is it licensed? Inquire with your account reps.
  • Where can it be used? Virtual Agent Designer, Mobile App Builder, Flow Designer, and scripting.
  • Is my data sent to third parties? Yes. Data and queries you make from the app are sent to OpenAI or Azure OpenAI. Review their data privacy policies and decide what usage policy best fits your organization.
  • Does it consume Integration Hub transactions? Yes. Embedded Integration Hub spokes in the Generative AI Controller connect to the third-party LLM providers, and LLM transactions from a ServiceNow production instance are counted as Integration Hub transactions (transactions from sub-production instances are not).
  • Is content moderation applied? Yes. When OpenAI is the provider, its moderation API is applied automatically; for Azure OpenAI, its built-in moderation is used.
  • Can generated content be inaccurate? While the possibility of hallucinations has been minimized through selective use cases and prompt engineering, there is always a risk of inaccurate information, so human-in-the-loop review of generated content is recommended.
  • Can the built-in prompts or model parameters be changed? Currently no. Prompts are read-only; the default temperature is 0 and the default max_tokens is 500.
  • Which models are used? GPT-3, GPT-3.5 Turbo, and GPT-4.
  • What does usage cost? It depends on the provider and the length of the output; for example, OpenAI charges $0.002 per 1K tokens for GPT-3.5 Turbo.
  • Any tips for Azure OpenAI configuration? Make sure your deployment name is entered as the model name.

Final Thought

The Generative AI Controller in ServiceNow simplifies AI adoption for enterprises, enhancing automation and decision-making. With built-in AI actions, users can streamline workflows, improve customer interactions, and generate content efficiently—all within the ServiceNow ecosystem.

Author

Benjamin Samson

 
