OpenAI Alert - Plug Sample #6: Reach maximum potential with SalesIQ Zobot and ChatGPT Integration
In recent times, AI-powered tools have seen a remarkable surge in usage thanks to their ability to enhance performance across diverse industries and sectors. One such tool is ChatGPT, which businesses have started to adopt for a wide range of purposes.
ChatGPT is an AI-powered chatbot that understands natural language and responds to queries in real time. SalesIQ's Zobot, on the other hand, is an excellent tool for automating customer engagement, helping businesses interact with their customers and perform actions based on their unique business needs.
By integrating ChatGPT and SalesIQ's Zobot, businesses can create a seamless customer service experience that is both efficient and productive. In this post, let's discuss how to integrate ChatGPT with SalesIQ's Zobot (Codeless bot builder) using Plugs.
Here is a sample plug to integrate Zobot with the ChatGPT assistant.
What can this plug do?
- The first plug integrates the GPT model text-davinci-003 with Zobot using your OpenAI API key.
- The second plug integrates the gpt-3.5-turbo model with Zobot using your OpenAI API key.
- Both plugs fetch a response from GPT for any text input, such as a question or issue from the visitor.
Note: These scripts cannot train the GPT models on your business resources.
How to create the ChatGPT Plug?
Step 1 - Get an API key from OpenAI
- Navigate to the OpenAI developer section.
- Click " Log in " on the top right corner to log in with your account.
- If you belong to multiple organization, select your organization.
- Finally, click on " Create new secret key " to generate an API key for your account.
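For reference, here is how the secret key you just generated is used later in the plug scripts (a minimal sketch; the placeholder stands in for your actual key):

// Sketch: the secret key is sent as a Bearer token in the Authorization header.
// Replace <your API key> with the key generated above.
token = "Bearer <your API key>";
header = Map();
header.put("Authorization",token);
header.put("Content-Type","application/json");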
Step 2 - Create the Plug
- In your SalesIQ Dashboard, navigate to Settings > Developers > Plugs and click Add.
- Give your plug a name and description, select SalesIQ Scripts as the platform, and click Create Plug.
- Click on Parameters and provide the following
- Input Parameter: question | Data Type: String
- Output Parameter: answer | Data Type: String
- Copy and paste the script below (text-davinci-003 or gpt-3.5-turbo model) and replace the token with your API key.
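Both scripts follow the same skeleton: the input parameter (question) is read from the session, and the output parameter (answer) is returned in a map. Here is a minimal sketch of that shape (placeholder only; use the complete scripts below):

// Minimal plug skeleton (sketch only; the complete scripts replace the placeholder line).
if(session.containsKey("question"))
{
	question = session.get("question").get("value");
}
// The complete scripts call OpenAI here and store the reply in answer.
answer = "GPT reply goes here";
response = Map();
response.put("answer",answer);
return response;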
Plug script to integrate with GPT - text-davinci-003 model
if(session.containsKey("question"))
{
	question = session.get("question").get("value");
}
// Replace the token with your API key | token = "Bearer <your API key>";
token = "Bearer sk-oxxxxxxxxxxxxxxxxxxxxxxxxxxxxxK";
header = Map();
header.put("Authorization",token);
header.put("Content-Type","application/json");
// Request body for the completions endpoint
chatgpt = {"model":"text-davinci-003","prompt":question,"temperature":0.9,"max_tokens":250,"top_p":1,"frequency_penalty":0.0,"presence_penalty":0.6,"stop":{" Human:"," AI:"}};
response = invokeurl
[
	url :"https://api.openai.com/v1/completions"
	type :POST
	parameters:chatgpt.toString()
	headers:header
];
info response;
// Extract the generated text from the first choice
answer = response.get("choices").get(0).get("text");
info answer;
// Return the plug's output parameter
response = Map();
response.put("answer",answer);
return response;
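Optionally, you can guard against API failures (invalid key, rate limits, and so on) before reading choices. This check is not part of the original sample; it's a sketch that assumes OpenAI's standard error object with an "error" key, and it applies equally to the gpt-3.5-turbo script below. Add it right after the info response; line:

// Optional guard (sketch): return a fallback answer if OpenAI reports an error.
if(response.containsKey("error"))
{
	info response.get("error");
	errorResponse = Map();
	errorResponse.put("answer","Sorry, I couldn't fetch a response right now. Please try again later.");
	return errorResponse;
}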
Plug script to integrate with GPT - gpt-3.5-turbo model
if(session.containsKey("question"))
{
	question = session.get("question").get("value");
}
// Replace the token with your API key | token = "Bearer <your API key>";
token = "Bearer sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx2";
header = Map();
header.put("Authorization",token);
header.put("Content-Type","application/json");
// Request body for the chat completions endpoint
chatgpt = {"model":"gpt-3.5-turbo","messages":{{"role":"user","content":question}},"temperature":0.9,"max_tokens":250,"top_p":1,"frequency_penalty":0.0,"presence_penalty":0.6,"stop":{" Human:"," AI:"}};
response = invokeurl
[
	url :"https://api.openai.com/v1/chat/completions"
	type :POST
	parameters:chatgpt.toString()
	headers:header
];
info response;
// Extract the assistant's reply from the first choice
answer = response.get("choices").get(0).get("message").get("content");
info answer;
// Return the plug's output parameter
response = Map();
response.put("answer",answer);
return response;
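As noted above, these scripts cannot train the GPT models on your business resources. With gpt-3.5-turbo, however, you can steer the replies by prepending a system message to the messages list. This is an optional sketch (the company name and prompt text are only examples), and it replaces the chatgpt assignment in the script above, where question is already set:

// Optional variation (sketch): prepend a system message with business context.
systemPrompt = "You are a friendly support assistant for Zylker. Keep answers short and polite.";
messages = List();
messages.add({"role":"system","content":systemPrompt});
messages.add({"role":"user","content":question});
chatgpt = {"model":"gpt-3.5-turbo","messages":messages,"temperature":0.9,"max_tokens":250,"top_p":1,"frequency_penalty":0.0,"presence_penalty":0.6};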
- Click on Save to save the plug.
- Test the plug by giving any input.
- The API response from GPT will be displayed, so you can verify the output.
- Finally, click on Publish .
How to incorporate plugs in the Codeless bot builder?
- Navigate to Settings > Bot > Add, provide the necessary information, and select Codeless Bot as the bot platform, or open an existing bot.
- Select Plugs under Action Cards and choose the required plug (only published plugs will be listed here).
- Provide the bot context variables for the plug input (question) and output (answer).
- In this case, visitor.question is the visitor's input (a question or an issue) stored in the bot context by the visitor fields card.
- The plug executes and returns the answer from the GPT response as the output (answer), which must be stored in a bot context variable (gptResponse) to display it to the visitor.
- Click Save.
- Then, using any input/response card, display the answer. Type % to list all the dynamic and context variables.
Heads up:
- Use a single choice card to display the answer and provide two follow-up actions such as "I've another question" for the visitor to ask another question and "End chat" to end the conversation.
- Save the visitor's choice in a bot context variable and, using the criteria router, route the flow accordingly.
Cheers,
Sasidar Thandapani