How can I use LangChain’s RedisChatMessageHistory as the chat_memory for ConversationSummaryBufferMemory to keep my token usage low?
I’m building a context-aware chatbot that answers from a vector-store retriever. The chain is wired together with create_stuff_documents_chain, create_retrieval_chain, and RunnableWithMessageHistory.
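Here is a minimal sketch of the memory wiring I have in mind. This is an assumption on my part, not tested code: the session id, Redis URL, model name, and token limit are all placeholders, and it assumes langchain, langchain-community, langchain-openai, a local Redis server, and an OpenAI API key are available.

```python
# Sketch only -- placeholder session id, Redis URL, and limits.
from langchain_community.chat_message_histories import RedisChatMessageHistory
from langchain.memory import ConversationSummaryBufferMemory
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)

# Redis-backed raw message store for one conversation session.
history = RedisChatMessageHistory(
    session_id="user-123",            # placeholder session id
    url="redis://localhost:6379/0",   # placeholder Redis URL
)

# Summary-buffer memory on top of the Redis history: once the buffered
# turns exceed max_token_limit, older turns are condensed into a running
# summary, which is what should keep prompt token usage low.
memory = ConversationSummaryBufferMemory(
    llm=llm,
    chat_memory=history,
    max_token_limit=1000,   # placeholder budget
    return_messages=True,
)
```

What I can’t figure out is how this memory object fits with RunnableWithMessageHistory, which expects a get_session_history callable returning a plain chat message history (like RedisChatMessageHistory itself) rather than a ConversationSummaryBufferMemory.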