I have Python code that takes a natural-language question, such as "How many customers do I have?", and converts it to a SQL query using the function below. The prompt template is simply the schema of the table to check. This code runs 100% fine:
def query_maker(user_input):
    # Build the SQL query using an LLM chain
    openaiLLM = ChatOpenAI(model="gpt-3.5-turbo", temperature=0.7, openai_api_key=api_key, cache=False)
    prompt_template = PromptTemplate.from_template("{system_prompt}\n{user_input}")
    chain = LLMChain(llm=openaiLLM, prompt=prompt_template)
    query = chain.run({"system_prompt": query_maker_gpt_system_prompt, "user_input": user_input})
    return query
I now want to convert this to use a local LLM running in LM Studio, since the data is private and I don't want it sent to OpenAI. My local model is Llama 2, and my attempt is as follows:
def query_maker(user_input):
    # API endpoint of the local LLM served by LM Studio
    api_url = "http://localhost:1234/v1/chat/completions"
    # Construct the payload for the API request
    payload = {
        "model": "TheBloke/Llama-2-7B-Chat-GGUF/llama-2-7b-chat.Q4_0.gguf",
        "inputs": {
            "system_prompt": query_maker_gpt_system_prompt,
            "user_input": user_input
        },
        "parameters": {
            "temperature": 0.7
        }
    }
    # Send the request to the local API
    try:
        response = requests.post(api_url, json=payload)
        response_data = response.json()
        # Extract the query from the response
        if 'choices' in response_data and response_data['choices']:
            query = response_data['choices'][0]['message']['content']
            print(query)
            return query
        else:
            return "No response from model."
    except Exception as e:
        print("Failed to connect to the local model:", e)
        return "Error connecting to the model."
This is giving all kinds of errors. The console shows:
[2024-04-23 22:46:10.677] [ERROR] 'messages' field is required
and the browser shows (presumably because the fallback string from query_maker ends up being executed as SQL):
('42000', '[42000] [Microsoft][ODBC Driver 17 for SQL Server]Syntax error, permission violation, or other nonspecific error (0) (SQLExecDirectW)')
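From the console error it looks like the endpoint expects an OpenAI-style "messages" array rather than the custom "inputs"/"parameters" keys I used. A minimal sketch of the payload shape I suspect it wants (the two prompt strings below are placeholders standing in for my real variables):

```python
import json

# Placeholders for query_maker_gpt_system_prompt and user_input above
query_maker_gpt_system_prompt = "You convert questions into SQL queries."
user_input = "How many customers do I have?"

# OpenAI-compatible chat payload: system prompt and user input become
# entries in a "messages" list, and temperature moves to the top level
payload = {
    "model": "TheBloke/Llama-2-7B-Chat-GGUF/llama-2-7b-chat.Q4_0.gguf",
    "messages": [
        {"role": "system", "content": query_maker_gpt_system_prompt},
        {"role": "user", "content": user_input},
    ],
    "temperature": 0.7,
}

print(json.dumps(payload, indent=2))
```

Is this the right shape for LM Studio's /v1/chat/completions endpoint?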
Has anyone else managed to get this working?