Can't feed custom method response to OpenAI LLM
I have the following chat loop based on the OpenAI API, configured to call custom functions (via Betalgo.OpenAI v8.5.0):