I’m attempting to use LangChain’s AzureChatOpenAI with the gpt-35-turbo-16k model in a Node.js application to create an OpenAI Functions Agent. I’ve set up all the required environment variables for Azure OpenAI. However, when I run the function I get an error that isn’t very descriptive.

Here’s the relevant code snippet:

import { pull } from "langchain/hub";
import type { ChatPromptTemplate } from "@langchain/core/prompts";
import { AzureChatOpenAI } from "@langchain/azure-openai";
import { AgentExecutor, createOpenAIFunctionsAgent } from "langchain/agents";

const model = new AzureChatOpenAI({
  azureOpenAIApiDeploymentName: "gpt-35-turbo-16k",
  azureOpenAIApiVersion: "2024-03-01-preview",
  modelName: "gpt-35-turbo-16k",
  temperature: 0,
  maxRetries: 1,
});

export async function ChatWithFunctionAgent(question: string) {
  try {
    // Get the prompt to use - you can modify this!
    // If you want to see the prompt in full, you can view it at:
    // https://smith.langchain.com/hub/hwchase17/openai-functions-agent
    const prompt = await pull<ChatPromptTemplate>(
      "hwchase17/openai-functions-agent"
    );
    const agent = await createOpenAIFunctionsAgent({
      llm: model,
      tools: [],
      prompt,
    });
    const agentExecutor = new AgentExecutor({
      agent,
      tools: [],
    });
    const result = await agentExecutor.invoke({
      input: question,
    });
    return result;
  } catch (error) {
    console.error(error);
    throw error;
  }
}

And here’s the error output:

Http function processed request for url "http://localhost:7071/api/chat"
[1] [2024-04-04T05:27:52.461Z] Error: [object Object]
[1] [2024-04-04T05:27:52.461Z]     at /home/ujwal/code/roc-next/azure-func/node_modules/@langchain/core/dist/utils/async_caller.cjs:98:23
[1] [2024-04-04T05:27:52.461Z]     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
[2024-04-04T05:27:52.461Z]     at async RetryOperation._fn (/home/ujwal/code/roc-next/azure-func/node_modules/p-retry/index.js:50:12) {
[1] [2024-04-04T05:27:52.461Z]   attemptNumber: 2,
[1] [2024-04-04T05:27:52.461Z]   retriesLeft: 0
[1] [2024-04-04T05:27:52.461Z] }
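Since `Error: [object Object]` hides the real failure, I also swapped the catch logging for a small serialization helper so that whatever gets thrown is printed in full (this helper and its name are my own addition, not part of LangChain):

```typescript
// Non-Error throwables (e.g. plain API response objects) stringify as
// "[object Object]" unless serialized explicitly. This helper handles both
// real Error instances and arbitrary thrown values.
function describeError(error: unknown): string {
  if (error instanceof Error) {
    return `${error.name}: ${error.message}\n${error.stack ?? ""}`;
  }
  // Use the object's own property names as the replacer so non-enumerable
  // fields are included in the output too.
  return JSON.stringify(error, Object.getOwnPropertyNames(error ?? {}), 2);
}
```

With `console.error(describeError(error))` in the catch block, the underlying Azure/OpenAI error body shows up instead of `[object Object]`.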

What I’ve tried:

  • I’ve already double-checked to ensure all the required environment variables are set, including those needed for authentication with Azure.

  • I’ve invoked the model directly, and that works fine, so there’s no issue with the Azure credentials or connection.

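For completeness, these are the variables I’ve set (values redacted; the names are from my reading of the `@langchain/azure-openai` README and may differ from the ones used by `@langchain/openai`):

```shell
# Values redacted; variable names assumed from the @langchain/azure-openai docs
export AZURE_OPENAI_API_KEY="<redacted>"
export AZURE_OPENAI_API_ENDPOINT="https://<resource-name>.openai.azure.com"
export AZURE_OPENAI_API_DEPLOYMENT_NAME="gpt-35-turbo-16k"
```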
Expected behavior:

  • AzureChatOpenAI should work with LangChain’s OpenAI Functions Agent.

Any help would be greatly appreciated!