I want to modify an LLM prompt at runtime by replacing some strings with user input.
The idea is to do this when the chain starts executing, before it calls the LLM.
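
For concreteness, the setup looks roughly like this (the model, template, and the `TONE` marker are placeholder examples, not my actual code):

```python
from langchain.chains import LLMChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

# "TONE" is a marker I want to swap for a user-supplied string
# just before the chain calls the LLM.
prompt = PromptTemplate.from_template("Respond in a TONE tone: {question}")
chain = LLMChain(llm=OpenAI(), prompt=prompt)
```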
I tried creating a custom callback handler and implementing its `on_chain_start` method.
The method signature is as follows:

```python
def on_chain_start(
    self, serialized: Dict[str, Any], inputs: Dict[str, Any], **kwargs: Any
) -> Any
```

In `serialized['kwargs']['prompt']['kwargs']['template']` I can see the current prompt's template, and I am able to change it manually, but when the chain execution continues the original prompt is used, not the modified one from the handler.
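
For reference, here is a minimal sketch of that handler (the class name and the replacement value are placeholders):

```python
from typing import Any, Dict

from langchain.callbacks.base import BaseCallbackHandler


class PromptRewriteHandler(BaseCallbackHandler):
    """Tries to rewrite the prompt template when the chain starts."""

    def on_chain_start(
        self, serialized: Dict[str, Any], inputs: Dict[str, Any], **kwargs: Any
    ) -> Any:
        prompt_kwargs = serialized["kwargs"]["prompt"]["kwargs"]
        user_value = "friendly"  # stand-in for real user input
        # I can read and overwrite the template here...
        prompt_kwargs["template"] = prompt_kwargs["template"].replace("TONE", user_value)
        # ...but the chain still runs with the original, unmodified prompt.
```

I register the handler when invoking the chain, e.g. `chain.run(question="What is LangChain?", callbacks=[PromptRewriteHandler()])`.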

How can I change the prompt's template at runtime using the `on_chain_start` callback method?

Thanks.
