LiteLLM and Llama-Index Service Context creation
I am going crazy trying to find a solution to this. I am working with a proxy server for OpenAI models, and I'm using an SSH tunnel to reach the server on my localhost.
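Roughly, the setup I am trying to get working looks like the sketch below. This is a minimal sketch, not my exact code: the port (8000), model name, and dummy key are placeholders for whatever the SSH tunnel and proxy actually expose, and it assumes a llama-index 0.9.x-era install where `ServiceContext` and the bundled `LiteLLM` LLM wrapper are available.

```python
# Minimal sketch, assuming llama-index 0.9.x with its bundled LiteLLM wrapper.
# Port 8000, the model name, and the dummy key are placeholders.
from llama_index import ServiceContext
from llama_index.llms import LiteLLM

# Point the LLM at the local end of the SSH tunnel; extra kwargs like
# api_base/api_key are forwarded by the wrapper to litellm.completion().
llm = LiteLLM(
    model="gpt-3.5-turbo",
    api_base="http://localhost:8000",  # local end of the ssh tunnel
    api_key="dummy-key",               # the proxy may ignore or validate this
)

# Build the ServiceContext around the proxied LLM instead of the default OpenAI client.
service_context = ServiceContext.from_defaults(llm=llm)
```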