How to Build a Customer Support Chatbot with LangChain and DeepInfra: A Step-by-Step Guide

by Mike Young, June 10th, 2023
Too Long; Didn't Read

Chatbots have become a staple in modern businesses, enhancing customer service while boosting efficiency. When developing a responsive and effective chatbot, three elements are essential: the model, the PromptTemplate, and memory. We will walk through the process of building a customer support chatbot with LangChain and DeepInfra. We’ll pretend that this chatbot “works” at an online clothing store.

You might have come across chatbots in your daily interactions online, but have you ever considered the tech that powers these digital helpers? Chatbots, especially in the realm of customer support, have become a staple in modern businesses, enhancing customer service while boosting efficiency. Today, we'll delve into how LangChain and DeepInfra are enabling these chatbots, making them more responsive and effective.

Fundamental Components of a Chatbot

Let's get to the basics first - what are the core components of a chatbot? When developing a responsive and effective chatbot, three elements are essential: the model, PromptTemplate, and memory.


The model represents the AI brain behind the chatbot, taking charge of understanding and responding to user inputs. The PromptTemplate guides the chatbot's responses, ensuring they stay relevant to the conversation. Lastly, the memory maintains state across interactions, enabling the chatbot to remember past conversations and use them to understand the context of current ones.

Step-by-Step Guide: Building a Customer Support Chatbot with LangChain and DeepInfra

Now let's get our hands dirty. We will walk through the process of building a customer support chatbot with LangChain and DeepInfra. We’ll pretend that this chatbot “works” at an online clothing store and can help customers choose clothes.

Acquiring the DeepInfra API key

DeepInfra, with its simple API and scalable, production-ready infrastructure, allows you to run top AI models with ease. First things first, you'll need to use this link to acquire a DeepInfra API key so you can interact with their service. Once you have it, you can set the API token in your environment as follows:


from getpass import getpass
import os

# Prompt for the DeepInfra API token securely, so it never appears on screen
DEEPINFRA_API_TOKEN = getpass("DeepInfra API token: ")
os.environ["DEEPINFRA_API_TOKEN"] = DEEPINFRA_API_TOKEN
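If you're running this in a non-interactive environment (a script or CI job, say), prompting with getpass won't work. A small sketch that falls back to an already-set environment variable first (the helper name ensure_deepinfra_token is mine, not part of DeepInfra's API):

```python
import os
from getpass import getpass


def ensure_deepinfra_token() -> str:
    """Return the DeepInfra API token, prompting only when it isn't already set."""
    token = os.environ.get("DEEPINFRA_API_TOKEN")
    if not token:
        token = getpass("DeepInfra API token: ")
        os.environ["DEEPINFRA_API_TOKEN"] = token
    return token
```

This way the same code works both in a notebook (where it prompts once) and in production (where the variable is injected by the environment).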

Setting up the LangChain and DeepInfra environments

Next, you'll want to set up your LangChain and DeepInfra environments. Import the necessary components and instantiate the DeepInfra model. For example, you can use a model like 'databricks/dolly-v2-12b':


from langchain import LLMChain, PromptTemplate
from langchain.memory import ConversationBufferWindowMemory
from langchain.llms import DeepInfra

# Create the DeepInfra instance
llm = DeepInfra(model_id="databricks/dolly-v2-12b")
llm.model_kwargs = {
    'temperature': 0.7,         # sampling randomness; lower values are more deterministic
    'repetition_penalty': 1.2,  # discourages the model from repeating itself
    'max_new_tokens': 250,      # cap on the length of each generated response
    'top_p': 0.9,               # nucleus sampling threshold
}

A Note: Selecting and deploying the right model for the chatbot

You can use many different models in place of databricks/dolly-v2-12b. The example shows that model, but there are many others available on DeepInfra. Since there are a lot of options, you may want to use a tool like AIModels.fyi to find suitable LLMs to use with LangChain. You have the freedom to search, filter, and sort AI models to find the one that fits your project best. Check out the DeepInfra page to find alternative models to play around with.

Creating a PromptTemplate to guide the chatbot's responses

Now, it's time to define a PromptTemplate to guide your chatbot's responses. This will ensure your chatbot's responses are aligned with the context and user's input. I tried several different templates and it wasn’t easy to get one that worked perfectly. The process of coming up with the right prompt is called prompt engineering. Eventually, I was able to re-use a template I found on Pinecone’s site.


template = """Given the following user prompt and conversation log, formulate a question that would be the most relevant to provide the user with an answer from a knowledge base.
  You should follow the following rules when generating an answer:
  - Always prioritize the user prompt over the conversation log.
  - Ignore any conversation log that is not directly related to the user prompt.
  - Only attempt to answer if a question was posed.
  - The question should be a single sentence.
  - You should remove any punctuation from the question.
  - You should remove any words that are not relevant to the question.
  - If you are unable to formulate a question, respond with the same USER PROMPT you got.

Conversation log: {history}
USER PROMPT: {human_input}
Your response:
"""

prompt = PromptTemplate(
    input_variables=["history", "human_input"], 
    template=template
)
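Under the hood, PromptTemplate is doing variable substitution on {history} and {human_input}. You can preview exactly what the model will see by rendering the template yourself; here's a sanity check using plain str.format on an abbreviated version of the template above (LangChain's own prompt.format(history=..., human_input=...) performs the equivalent substitution):

```python
# Abbreviated template with the same input variables as the full one above
template = (
    "Given the following user prompt and conversation log, formulate a question "
    "that would be the most relevant to provide the user with an answer from a knowledge base.\n"
    "Conversation log: {history}\n"
    "USER PROMPT: {human_input}\n"
    "Your response:\n"
)

rendered = template.format(
    history="Human: Hi\nAI: Hello! How can I help you today?",
    human_input="What shirts go well with chinos?",
)
print(rendered)
```

Rendering the prompt like this is a quick way to debug prompt-engineering issues before spending tokens on actual model calls.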

Initializing the chatbot and setting up memory

With your model and PromptTemplate ready, the next step is to initialize the chatbot and set up memory to maintain the state across interactions.


# Now using DeepInfra with the LLMChain
llm_chain = LLMChain(
    llm=llm,
    prompt=prompt,
    verbose=True,
    # Keep only the last two exchanges as conversational context
    memory=ConversationBufferWindowMemory(k=2),
)
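The k=2 argument means only the last two exchanges are injected into {history}; older turns fall out of the window. The windowing behaviour can be sketched in plain Python (a simplified stand-in to illustrate the idea, not LangChain's actual implementation):

```python
from collections import deque


class WindowMemory:
    """Keep only the last k (human, ai) exchanges, like ConversationBufferWindowMemory(k=k)."""

    def __init__(self, k: int):
        self.turns = deque(maxlen=k)  # oldest exchanges are evicted automatically

    def save(self, human: str, ai: str) -> None:
        self.turns.append((human, ai))

    def history(self) -> str:
        return "\n".join(f"Human: {h}\nAI: {a}" for h, a in self.turns)


memory = WindowMemory(k=2)
memory.save("Hi", "Hello! How can I help?")
memory.save("Any shirt ideas?", "V-neck T-shirts are a safe bet.")
memory.save("And shoes?", "Sandals work well in summer.")
print(memory.history())  # only the two most recent exchanges remain
```

Choosing k is a trade-off: a larger window gives the model more context but uses more of the prompt budget on every call.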

Running the chatbot and interacting with it

Finally, you can now interact with your chatbot. Let's see an example:


output = llm_chain.predict(human_input="Hello! What clothes do you recommend I buy to rebuild my summer wardrobe")
print(output)
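To turn that single call into a back-and-forth session, you can feed the chain a sequence of messages; the memory configured above carries context between turns. A minimal helper (chat_session is my name for it, not a LangChain API):

```python
def chat_session(chain, messages):
    """Send each message through the chain in order, returning the replies.

    Works with any object exposing predict(human_input=...), such as the
    llm_chain built above.
    """
    return [chain.predict(human_input=m) for m in messages]


# Usage with the chain from above (uncomment to run against DeepInfra):
# replies = chat_session(llm_chain, [
#     "Hello! What clothes do you recommend for summer?",
#     "Which of those would work at the beach?",
# ])
```

Because the second message refers back to the first ("those"), the windowed memory is what lets the chatbot resolve it.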


The resulting response recommends some clothes:


In the context of summer wardrobe recommendations, you should buy your clothes from the following list:
- V-neck T-shirts
- Tank Tops
- Solid Color Swim Shorts
- Swim Shorts
- Skirts
- Cardigans
- Sandals

The Concept of Memory in Chatbots

Memory plays a critical role in chatbots. It helps maintain context and history in chatbot interactions, enabling the chatbot to recall past conversations and understand the context of current ones. This ability is fundamental in creating a more human-like interaction, enhancing the user's experience. There’s a lot to dive into on the topic of memory, and I recommend you check out this guide for more information.

More Resources and Examples

For further understanding, I recommend checking out resources such as the ChatGPT Clone notebook on Langchain’s site, the Conversation Memory notebook, and the Conversation Agent notebook. These resources offer deeper dives into the concepts of memory, with Memory Key Concepts and Memory Examples offering practical guides.


You should also check out the other Langchain guides on AIModels.fyi.


DeepInfra also has robust documentation for their platform and even has a blog you can visit to get detailed posts, guides, and articles.

Conclusion

Building a chatbot for customer support using LangChain and DeepInfra may seem complex initially, but once you understand the fundamental components and steps, the process becomes much more straightforward. Leveraging these technologies can significantly enhance customer service, increase business efficiency, and improve overall customer satisfaction. As we move forward, the potential of these technologies is truly immense, and I look forward to seeing how they'll continue to evolve and impact the realm of customer service. Thanks for reading, and have fun building!


Also published here.