How to Track Token Usage with TikToken Library for Anthropic Models in llama-index Query Engine?
I’m facing an issue with tracking token usage for Anthropic models using the tiktoken library. tiktoken natively supports only OpenAI models' encodings, but I’m working with the Claude-3 model family from Anthropic.
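For context, here is a minimal sketch of the kind of setup I mean: llama-index's `TokenCountingHandler` accepts a `tokenizer` callable (a function from a string to a list, whose length is taken as the token count), which is where a tiktoken encoder would normally go. The whitespace-splitting stand-in below is purely illustrative and will not match Anthropic's real token counts; the llama-index wiring is shown in comments as an assumption, not verified code.

```python
# Sketch: a tokenizer callable in the shape TokenCountingHandler expects
# (str -> list; len(result) is used as the token count).
# tiktoken ships no Claude encoding, so this crude whitespace split is only
# a placeholder — counts will NOT match Anthropic's actual tokenizer.

def approx_claude_tokenizer(text: str) -> list:
    # stand-in heuristic; swap in a real Claude tokenizer for accurate counts
    return text.split()

# Assumed llama-index wiring (requires the llama-index package):
# from llama_index.core.callbacks import TokenCountingHandler, CallbackManager
# token_counter = TokenCountingHandler(tokenizer=approx_claude_tokenizer)
# callback_manager = CallbackManager([token_counter])

print(len(approx_claude_tokenizer("Count tokens for a Claude-3 prompt")))  # → 6
```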