In my previous blog post, https://telephonyhub.in/2024/07/16/get-started-with-langchain-and-llm-applications/, we covered LangChain installation and its capabilities. Now we need to turn those capabilities into useful APIs so that anyone can easily consume them in their own applications. Here we will see how to expose some LangChain features as REST APIs using the Flask framework.
In my previous post we installed the required LangChain packages; now we need to install the packages below to work with APIs:
pip3 install flask
pip3 install flask_limiter
Now we are ready to start our Flask server. First, clone the git repository below:
git clone https://github.com/ankitjayswal87/LangChainLLM.git
Once you clone this repository you will find several Python files inside. First, create a .env file with the command below:
vim .env
Add the following details to it:
OPENAI_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
LANGCHAIN_API_KEY="lsv2_xxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
MISTRAL_API_KEY="xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
Inside the cloned project folder you will find the “flask_lang_chain.py” file. Run the command below from the terminal to start the Flask server:
python3 flask_lang_chain.py
Great! You will see that the server has now started.
If you face any errors while starting the Flask server, use the command below to install the required packages and dependencies:
pip3 install -r requirements.txt
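For reference, the entry point of such a Flask app typically looks like the rough sketch below. This is only an illustration, not the actual contents of flask_lang_chain.py; the port is an assumption, since the example URLs later in this post use plain http://localhost.

# Simplified sketch of a Flask entry point for the LangChain APIs documented below.
# Illustrative only; see flask_lang_chain.py in the repository for the real code.
from dotenv import load_dotenv   # from the python-dotenv package
from flask import Flask

load_dotenv()                    # loads OPENAI_API_KEY etc. from the .env file created above
app = Flask(__name__)

# ... API routes such as /lang_chain_api/simple_llm_call are registered here ...

if __name__ == "__main__":
    # Port 80 is an assumption because the example URLs below use http://localhost with no port.
    app.run(host="0.0.0.0", port=80)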
LangChain APIs Documentation
Here we will look at some useful LangChain APIs that anyone can integrate into their applications. This makes these AI capabilities available to people who do not know LangChain but still want to consume these features.
Simple LLM Call
Description: This is a simple API that takes your input query string and returns an appropriate answer from the OpenAI model. You can ask anything meaningful and the model will return an answer.
URL: http://localhost/lang_chain_api/simple_llm_call
Method: POST
Body:
Provide the JSON string below in the body of your API request. It contains the required and/or optional parameters to be submitted with the request.
{"query":"tell me three lines about Rashmika Mandana","model":"openai"}
Below are the descriptions for the body parameters.
Parameter Name | Data Type | Description |
---|---|---|
query | string | your query string to ask llm |
model | string | Available options are: 'openai', 'ollama' |
CURL Request:
curl -X POST 'http://localhost/lang_chain_api/simple_llm_call' --header 'Content-Type: application/json' --data-raw '{"query":"tell me three lines about Rashmika Mandana","model":"openai"}'
Response:
{ "response": "1. Rashmika Mandana is a popular Indian actress known for her work in Telugu and Kannada films.\n2. She made her acting debut in 2016 with the Kannada film \"Kirik Party\" and quickly rose to fame in the industry.\n3. Rashmika has won several awards for her performances and is considered one of the most promising young talents in Indian cinema." }
LLM Call With Prompt
Description: In this API call you can pass your query string with variables, so in every call you can customize the query with your own values. This gives you more control and lets you ask the LLM in a more generic, reusable way.
URL: http://localhost/lang_chain_api/llm_call_with_prompt
Method: POST
Body:
Provide the JSON string below in the body of your API request. It contains the required and/or optional parameters to be submitted with the request.
{"template":"I want to open {field1} {field2} company, please suggest me 3 best names for it","fields":{"field1":"Indian","field2":"readymade cloths"},"model":"openai"}
Below are the descriptions for the body parameters.
Parameter Name | Data Type | Description |
---|---|---|
template | string | your query template with variables. E.g. I want to open {field1} {field2} company, please suggest me 3 best names for it |
fields | string | field values in json format. E.g. {"field1":"Indian","field2":"readymade cloths"} |
model | string | Available options are: 'openai', 'ollama' |
CURL Request:
curl -X POST 'http://localhost/lang_chain_api/llm_call_with_prompt' \
--header 'Content-Type: application/json' \
--data-raw '{"template":"I want to open {field1} {field2} company, please suggest me 3 best names for it","fields":{"field1":"Indian","field2":"readymade cloths"},"model":"openai"}'
Response:
{ "response": "1. Desi Threads\n2. Ethnic Elegance\n3. Indian Attire Co." }
LLM Call with Structured Output
Description: In this API call you can pass your query string along with a JSON schema. The JSON schema describes the desired output format for the response. Say for the query “let me know the highest runs of Sachin Tendulkar in cricket in ODI” we are only interested in the ‘game’, ‘player’ and ‘score’ parameters; we will then get exactly these keys in our JSON response, which is easier to handle and process further in any programming language.
URL: http://localhost/lang_chain_api/llm_call_with_structured_output
Method: POST
Body:
Provide the JSON string below in the body of your API request. It contains the required and/or optional parameters to be submitted with the request.
{"query":"let me know the highest runs of Sachin Tendulkar in cricket in ODI","output_schema":{
  "title": "Sports",
  "description": "Get highest score of player",
  "type": "object",
  "properties": {
    "game": {
      "type": "string",
      "description": "The sport in sentence"
    },
    "player": {
      "type": "string",
      "description": "The player in the sport"
    },
    "score": {
      "type": "integer",
      "description": "How many runs by player"
    }
  },
  "required": ["game", "player", "score"]
},"model":"openai"}
Below are the descriptions for the body parameters.
Parameter Name | Data Type | Description |
---|---|---|
query | string | your query to llm model |
output_schema | string | the desired json format in which you want output from llm. You can modify the variables here 'game', 'player', 'score' as per your query |
model | string | Available options are: 'openai', 'ollama' |
CURL Request:
curl -X POST 'http://localhost/lang_chain_api/llm_call_with_structured_output' \
--header 'Content-Type: application/json' \
--data-raw '{"query":"let me know the highest runs of Sachin Tendulkar in cricket in ODI","output_schema":{ "title": "Sports", "description": "Get highest score of player", "type": "object", "properties": { "game": { "type": "string", "description": "The sport in sentence" }, "player": { "type": "string", "description": "The player in the sport" }, "score": { "type": "integer", "description": "How many runs by player" } }, "required": ["game", "player","score"] },"model":"openai"}'
Response:
{ "response": { "game": "cricket", "player": "Sachin Tendulkar", "score": 200 } }
Ask To Vector Database
Description: In this API call you can query your vector database and get a response. How to generate and store the vector database is shown in Example 6 of my previous post, https://telephonyhub.in/2024/07/16/get-started-with-langchain-and-llm-applications/. Here we assume a vector database is already available to query. A vector database is simply a numerical representation of sentences; when you query it, it returns the most similar vectors and their sentences in the response.
URL: http://localhost/lang_chain_api/ask_to_vector_db
Method: POST
Body:
Provide the JSON string below in the body of your API request. It contains the required and/or optional parameters to be submitted with the request.
{"vector_db":"openai_vector_data","search_type":"similarity","similar_results":2,"query":"what is CPaaS solution"}
Below are the descriptions for the body parameters.
Parameter Name | Data Type | Description |
---|---|---|
vector_db | string | name of your vector db folder |
search_type | string | possible values are 'similarity', 'mmr' |
similar_results | int | number of similar results to return |
query | string | user query string |
CURL Request:
curl -X POST 'http://localhost/lang_chain_api/ask_to_vector_db' \
--header 'Content-Type: application/json' \
--data-raw '{"vector_db":"openai_vector_data","search_type":"similarity","similar_results":2,"query":"what is CPaaS solution"}'
Response:
{ "response": [ { "answer": "Benefits of CPaaS Solution:\n– Telecom companies can offer their voice services in form of API and that way they can expose their services in more effective way\n– Customer does not require to do heavy VoIP server setup\n– Telecom companies do not require to route physical SIP trunk lines to customer premises\n– Customers do not require to care about server administration, any down time or scaling the voice traffic\n\nAbout my CPaaS Solution:", "source": "openai_test_data.txt" }, { "answer": "There are many reasons for choosing CPaaS solution. Many telecom companies and call center providers are adopting this solution as it is deployed in highly scalable environment. In this solution one can expose his/her VoIP services in form of API and web hook responses only. This solution provides programmable interface to develop the IVR call flows, now no need to develop dialplan every time for a new IVR. The all heavy lifting is done in core development. People just need to Buy DID number,", "source": "openai_test_data.txt" } ] }
LLM Call with RAG
Description: In this API call you can use an external knowledge base and craft the answer with the power of an LLM. We will ask the same question, “what is CPaaS solution”, and see the difference in how the answer is delivered. Compare the responses in both cases: the previous API call returned only the requested number of similar-looking chunks, while the RAG call produces a more mature answer drawn from our own external knowledge base. For both API calls the source is the same file, “openai_test_data.txt”, but the previous call just returns similar sentences whereas the RAG call gives an effective, presentable answer.
URL: http://localhost/lang_chain_api/ask_to_vector_db_rag
Method: POST
Body:
Provide the JSON string below in the body of your API request. It contains the required and/or optional parameters to be submitted with the request.
{"vector_db":"openai_vector_data","query":"what is CPaaS solution"}
Below are the descriptions for the body parameters.
Parameter Name | Data Type | Description |
---|---|---|
vector_db | string | name of your local vector database folder |
query | string | user query string |
CURL Request:
curl -X POST 'http://localhost/lang_chain_api/ask_to_vector_db_rag' \
--header 'Content-Type: application/json' \
--data-raw '{"vector_db":"openai_vector_data","query":"what is CPaaS solution"}'
Response:
{ "response": "A CPaaS solution is a way for telecom companies to offer their voice services in the form of APIs, allowing them to expose their services more effectively. It eliminates the need for heavy VoIP server setup, physical SIP trunk lines, server administration, downtime, and scaling voice traffic for customers. It also provides a programmable interface for developing IVR call flows and allows services to be delivered in the form of APIs, relieving customers of the burden of VoIP server setup and maintenance." }
Summarize Web Article
Description: This API call summarizes a web article. You just need to provide the web URL as a parameter and the API will return a summary of the article at that URL.
URL: http://localhost/lang_chain_api/summarize_web_article
Method: POST
Body:
Provide the JSON string below in the body of your API request. It contains the required and/or optional parameters to be submitted with the request.
{"web_url":"https://telephonyhub.in/2024/07/29/host-langchain-apis-with-flask/","summarize_type":"stuff","model":"openai"}
Below are the descriptions for the body parameters.
Parameter Name | Data Type | Description |
---|---|---|
web_url | string | your web URL to summarize |
summarize_type | string | possible values 'stuff' and 'refine' |
model | string | Available options are: 'openai', 'ollama' |
CURL Request:
curl -X POST 'http://localhost/lang_chain_api/summarize_web_article' \
--header 'Content-Type: application/json' \
--data-raw '{"web_url":"https://telephonyhub.in/2024/07/29/host-langchain-apis-with-flask/","summarize_type":"stuff","model":"openai"}'
Response:
{ "response": "This blog post discusses how to host LangChain APIs with Flask in order to create useful REST APIs for consuming LangChain features. It provides instructions on installing necessary packages, cloning a repository, setting up environment variables, and starting the Flask server. The post also includes examples of different LangChain API calls, such as simple LLM calls, LLM calls with prompts, LLM calls with structured output, querying a vector database, and using external knowledge bases with the RAG model. Each API call is explained with its URL, method, body parameters, CURL request, and response." }
Talk To Database
Description: In this API call you can ask questions of your MySQL database in natural language. You pass your DB connection details and your question, and it returns a computed answer from SQL. Internally it prepares the SQL query for your question and returns a nicely formatted answer.
URL: http://localhost/lang_chain_api/talk_to_database
Method: POST
Body:
Provide the JSON string below in the body of your API request. It contains the required and/or optional parameters to be submitted with the request.
{
"db_user": "admin","db_password": "YOUR_DB_PASSWORD","db_name":"store","template":"You are an expert AI assistant who answers user queries from looking into the mysql database called store.The store database has following tables:company_employee,t_shirts The company_employee table has following fields:first_name,last_name,age The t_shirts table has following fields:colour,size,type Please answer for below user_query: {user_query}","query":"who is nisha patel"
}
Below are the descriptions for the body parameters.
Parameter Name | Data Type | Description |
---|---|---|
db_user | string | your mysql database username |
db_password | string | your mysql database password |
db_name | string | your mysql database name |
template | string | description about your database and tables. Explain your database here |
query | string | your query in human language to the database |
CURL Request:
curl -X POST 'http://localhost/lang_chain_api/talk_to_database' \
--header 'Content-Type: application/json' \
--data-raw '{ "db_user": "admin","db_password": "YOUR_DB_PASSWORD","db_name":"store","template":"You are an expert AI assistant who answers user queries from looking into the mysql database called store.The store database has following tables:company_employee,t_shirts The company_employee table has following fields:first_name,last_name,age The t_shirts table has following fields:colour,size,type Please answer for below user_query: {user_query}","query":"who is nisha patel" }'
Response:
{ "response": "Nisha Patel is an employee in the company_employee table. She is 35 years old." }
Classification LLM Call
Description: This API call tells you the ‘sentiment’, ‘aggressiveness’ and ‘language’ of a spoken sentence. You just pass the user’s response string and it is classified along these key parameters to reveal the overall nature of the sentence.
URL: http://localhost/lang_chain_api/classification_llm_call
Method: POST
Body:
Provide the JSON string below in the body of your API request. It contains the required and/or optional parameters to be submitted with the request.
{
"query":"This is fifth time I am calling in your customer support, still I did not get the solution of cable TV channels"
}
Below are the descriptions for the body parameters.
Parameter Name | Data Type | Description |
---|---|---|
query | string | user response of which the sentiment you want to know |
CURL Request:
curl -X POST 'http://localhost/lang_chain_api/classification_llm_call' \
--header 'Content-Type: application/json' \
--data-raw '{ "query":"This is fifth time I am calling in your customer support, still I did not get the solution of cable TV channels" }'
Response:
{ "response": { "aggressiveness": 4, "language": "english", "sentiment": "negative" } }