Chatbot for Your Google Documents Using LangChain and OpenAI

Introduction
In this article, we'll create a chatbot for your Google Documents with OpenAI and LangChain. Why do we need to do this in the first place? It can get tedious to copy and paste your Google Docs content into OpenAI, and OpenAI has a token limit that only lets you add a specific amount of data. So if you want to do this at scale, or do it programmatically, you'll need a library to help you out; that's where LangChain comes into the picture. You can create business impact by connecting LangChain with Google Drive and OpenAI to summarize your documents and ask related questions. These documents could be your product documentation, your research papers, or the internal knowledge base that your company uses.
Learning Objectives
- Learn how to fetch your Google Docs content using LangChain.
- Learn how to integrate your Google Docs content with an OpenAI LLM.
- Learn how to summarize your document's content and ask questions about it.
- Learn how to create a chatbot that answers questions based on your documents.
This article was published as a part of the Data Science Blogathon.
Load Your Documents
Before we get started, we need to set up our documents in Google Drive. The essential piece here is a document loader that LangChain provides called GoogleDriveLoader. You initialize this class and then pass it a list of document IDs.
from langchain.document_loaders import GoogleDriveLoader
import os

loader = GoogleDriveLoader(document_ids=["YOUR DOCUMENT ID"],
                           credentials_path="PATH TO credentials.json FILE")
docs = loader.load()
You can find your document ID in your document link: it is the string between the forward slashes right after /d/.
For example, if your document link is https://docs.google.com/document/d/1zqC3_bYM8Jw4NgF, then your document ID is "1zqC3_bYM8Jw4NgF".
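If you prefer not to copy the ID by hand, a small helper like the one below can pull it out of the link. This is just an illustrative snippet (the function name is made up for this article), not part of LangChain:

# Hypothetical helper: extract the document ID from a Google Docs link
def extract_doc_id(doc_link: str) -> str:
    # The ID sits between "/d/" and the next "/" (or the end of the URL)
    return doc_link.split("/d/")[1].split("/")[0]

print(extract_doc_id("https://docs.google.com/document/d/1zqC3_bYM8Jw4NgF"))  # -> 1zqC3_bYM8Jw4NgF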
You can pass the list of these document IDs to the document_ids parameter, and the cool part is that you can also pass a Google Drive folder ID that contains your documents. If your folder link is https://drive.google.com/drive/u/0/folders/OuKkeghlPiGgWZdM1TzuzM, then the folder ID is "OuKkeghlPiGgWZdM1TzuzM".
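A minimal sketch of loading every document in a Drive folder might look like this (the folder ID below is just the placeholder from the example link above):

from langchain.document_loaders import GoogleDriveLoader

# Load every Google Doc inside a Drive folder instead of listing individual document IDs
folder_loader = GoogleDriveLoader(folder_id="OuKkeghlPiGgWZdM1TzuzM",
                                  credentials_path="PATH TO credentials.json FILE")
folder_docs = folder_loader.load()
print(f"Loaded {len(folder_docs)} documents from the folder")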
Authorize Google Drive Credentials
Step 1:
Enable the Google Drive API using this link: https://console.cloud.google.com/flows/enableapi?apiid=drive.googleapis.com. Make sure you are logged into the same Gmail account where your documents are stored in Drive.

Step 2: Go to the Google Cloud console by clicking this link. Create credentials, choose "OAuth client ID", and set the application type to Desktop app.


Step 3: After creating the OAuth client, download the secrets file by clicking "DOWNLOAD JSON". You can follow Google's steps if you have any doubts while creating the credentials file.

Step 4: Upgrade your Google API Python client by running the pip command below:
pip install --upgrade google-api-python-client google-auth-httplib2 google-auth-oauthlib
Then we need to pass the path of our JSON file to GoogleDriveLoader.
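For reference, the loader call with the credentials file might look like the sketch below. The token_path argument is optional; assuming default behavior in the GoogleDriveLoader version you have installed, the first run opens a browser window to authorize access and caches a token for later runs:

from langchain.document_loaders import GoogleDriveLoader

loader = GoogleDriveLoader(document_ids=["YOUR DOCUMENT ID"],
                           credentials_path="credentials.json",  # the OAuth client secrets file you downloaded
                           token_path="token.json")              # where the authorized token is cached after the first run
docs = loader.load()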
Summarizing Your Documents
Make sure you have your OpenAI API key available. If not, follow the steps below (a short sketch for setting the key as an environment variable follows these steps):
1. Go to https://openai.com/ and create your account.
2. Log in to your account and select 'API' on your dashboard.
3. Now click on your profile icon, then select 'View API Keys'.
4. Select 'Create new secret key', copy it, and save it.
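The snippets in this article read the key from an environment variable, so set it before running them; a quick sketch:

import os

# Option 1: set it in your shell before starting Python
#   export OPENAI_API_KEY="sk-..."
# Option 2: set it from Python (for example, after reading it from a secrets manager)
os.environ["OPENAI_API_KEY"] = "YOUR OPENAI API KEY"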
Next, we need to load our OpenAI LLM and summarize the loaded docs. In the code below, we use a summarization chain provided by LangChain, created with load_summarize_chain(), and store it in a variable named chain; it takes the input documents and produces a concise summary using the map_reduce approach. Substitute your API key in the code below.
from langchain.llms import OpenAI
from langchain.chains.summarize import load_summarize_chain
llm = OpenAI(temperature=0, openai_api_key=os.environ['OPENAI_API_KEY'])
chain = load_summarize_chain(llm, chain_type="map_reduce", verbose=False)
chain.run(docs)
If you run this code, you will get a summary of your documents. If you want to see what LangChain is doing under the covers, change verbose to True, and you can see the logic LangChain uses and how it is "thinking". You can observe that LangChain automatically inserts the query to summarize your document, and the entire text (query + document content) is passed to OpenAI, which then generates the summary.
Below is a use case where I stored a document related to a product, SecondaryEquityHub, in Google Drive and summarized it using the map_reduce chain type and the load_summarize_chain() function. I set verbose=True to see how LangChain works internally.
from langchain.document_loaders import GoogleDriveLoader
import os

loader = GoogleDriveLoader(document_ids=["ceHbuZXVTJKe1BT5apJMTUvG9_59-yyknQsz9ZNIEwQ8"],
                           credentials_path="../../desktop_credentials.json")
docs = loader.load()

from langchain.llms import OpenAI
from langchain.chains.summarize import load_summarize_chain

llm = OpenAI(temperature=0, openai_api_key=os.environ['OPENAI_API_KEY'])
chain = load_summarize_chain(llm, chain_type="map_reduce", verbose=True)
chain.run(docs)
Output:

We can observe that LangChain inserted the prompt to generate a summary for the given document.

We can see the concise summary, along with the product features present in the document, generated by LangChain using the OpenAI LLM.
More Use Cases
1. Research: We can use this functionality while doing research. Instead of intensively reading an entire research paper word by word, we can use the summarization functionality to get a quick glance at the paper.
2. Education: Educational institutions can get curated summaries of textbook content from extensive data, academic books, and papers.
3. Business Intelligence: Data analysts have to go through a large set of documents to extract insights. Using this functionality, they can reduce that effort considerably.
4. Legal Case Analysis: Legal professionals can use this functionality to find crucial arguments more efficiently across their vast collections of previous, related case documents.
Let's say we want to ask questions about the content of a given document; for that, we need to load a different chain named load_qa_chain. Next, we initialize this chain with a chain_type parameter. In our case, we use chain_type="stuff". This is a simple chain type: it takes all of the content, concatenates it, and passes it to the LLM.
Other chain_types (a short sketch comparing them follows this list):
- map_reduce: The model first looks at each document individually and stores its insights; at the end, it combines all of these insights and looks at them again to produce the final response.
- refine: It iteratively looks at each document in the document_ids list, refining the answer with the latest information it finds as it goes.
- map_rerank: The model looks at each document individually and assigns a score to its insights. Finally, it returns the one with the highest score.
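Here is a rough sketch of how you could swap between these chain types with load_qa_chain, assuming the same llm and docs as above:

from langchain.chains.question_answering import load_qa_chain

query = "Who is the founder of Analytics Vidhya?"

# Any of these chain types plugs into load_qa_chain; "stuff" is the simplest,
# while the others trade extra LLM calls for larger inputs or answer re-ranking.
for chain_type in ["stuff", "map_reduce", "refine", "map_rerank"]:
    qa_chain = load_qa_chain(llm, chain_type=chain_type)
    print(chain_type, "->", qa_chain.run(input_documents=docs, question=query))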
Next, we run our chain by passing in the input documents and the query.
from langchain.chains.question_answering import load_qa_chain

query = "Who is the founder of Analytics Vidhya?"
chain = load_qa_chain(llm, chain_type="stuff")
chain.run(input_documents=docs, question=query)
When you run this code, LangChain automatically inserts the prompt along with your document content before sending it to the OpenAI LLM. Under the hood, LangChain helps us with prompt engineering by providing optimized prompts to extract the required content from documents. If you want to see the prompts it uses internally, just set verbose=True, and you can see the prompt in the output.
from langchain.chains.question_answering import load_qa_chain

query = "Who is the founder of Analytics Vidhya?"
chain = load_qa_chain(llm, chain_type="stuff", verbose=True)
chain.run(input_documents=docs, question=query)
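If you would rather read the prompt template without re-running the chain, the "stuff" chain usually exposes its underlying LLM chain. A sketch, assuming this attribute layout matches the LangChain version you have installed:

# Peek at the prompt template that the "stuff" chain sends to the LLM
print(chain.llm_chain.prompt.template)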
Build Your Chatbot
Now we need to find a way to turn this model into a question-answering chatbot. We mainly need to follow the three points below to create a chatbot.
1. The chatbot should remember the chat history to understand the context of the ongoing conversation.
2. The chat history should be updated after every prompt the user sends to the bot.
3. The chatbot should keep running until the user wants to exit the conversation.
from langchain.chains.question_answering import load_qa_chain
from langchain.document_loaders import GoogleDriveLoader
from langchain.llms import OpenAI
import os

# Function to load the LangChain question-answering chain
def load_langchain_qa():
    llm = OpenAI(temperature=0, openai_api_key=os.environ['OPENAI_API_KEY'])
    chain = load_qa_chain(llm, chain_type="stuff", verbose=True)
    return chain

# Function to handle user input and generate responses
def chatbot():
    print("Chatbot: Hi! I'm your friendly chatbot. Ask me anything or type 'exit' to end the conversation.")
    loader = GoogleDriveLoader(document_ids=["YOUR DOCUMENT ID"],
                               credentials_path="PATH TO credentials.json FILE")
    docs = loader.load()
    # Initialize the LangChain question-answering chain
    chain = load_langchain_qa()
    # List to store the chat history
    chat_history = []
    while True:
        user_input = input("You: ")
        if user_input.lower() == "exit":
            print("Chatbot: Goodbye! Have a great day.")
            break
        # Append the user's question to the chat history
        chat_history.append("You: " + user_input)
        # Answer the question from the loaded documents using the question-answering chain
        answer = chain.run(input_documents=docs, question=user_input)
        # Append the chatbot's response to the chat history
        chat_history.append("Chatbot: " + answer)
        # Print the chatbot's response
        print("Chatbot:", answer)

if __name__ == "__main__":
    chatbot()
We initialized our Google Drive documents and the OpenAI LLM. Next, we created a list to store the chat history and updated it after every prompt. Then we created an infinite while loop that stops when the user types "exit".
Conclusion
In this article, we have seen how to create a chatbot that provides insights about the contents of your Google Documents. Integrating LangChain, OpenAI, and Google Drive is one of the most useful use cases in any field, whether medical, research, industrial, or engineering. Instead of reading through entire datasets and analyzing them manually to get insights, which costs a lot of human time and effort, we can use this technology to automate describing, summarizing, analyzing, and extracting insights from our data files.
Key Takeaways
- Google Documents can be fetched into Python using LangChain's GoogleDriveLoader class and Google Drive API credentials.
- By integrating the OpenAI LLM with LangChain, we can summarize our documents and ask questions about them.
- We can get insights from multiple documents by choosing appropriate chain types like map_reduce, stuff, refine, and map_rerank.
Frequently Asked Questions
A. To build an intelligent chatbot, you need appropriate data, and then you need to give ChatGPT access to this data. Finally, you need to provide conversation memory to the bot so it can store the chat history and understand the context.
A. One option is to use LangChain's GoogleDriveLoader to fetch a Google Doc; then you can initialize the OpenAI LLM using your API keys and pass the file's contents to the LLM.
A. First, you need to enable the Google Drive API and get your Google Drive API credentials; then you can pass your file's document ID to the OpenAI ChatGPT model using LangChain's GoogleDriveLoader.
A. ChatGPT can't access our documents directly. However, we can either copy and paste the content into ChatGPT or fetch the documents' contents using LangChain and then pass them to ChatGPT after initializing it with our secret keys.
The media shown in this article is not owned by Analytics Vidhya and is used at the Author's discretion.