Tag Archive for huggingface-transformers

How to improve text output from Hugging Face models?

I am getting weird output from generative text AI models. I expected that Hugging Face models described as similar to GPT-3, GPT-4, or Llama would perform like their web-UI counterparts on OpenAI or Poe. However, when I run them I receive very strange, nonsensical output. Is there something I am missing that causes this?
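The usual culprits are using a base (non-instruct) checkpoint, skipping the model's chat template, or generating with the default greedy decoding and short length limit. Below is a minimal sketch, assuming an instruction-tuned checkpoint such as meta-llama/Llama-2-7b-chat-hf (swap in any chat model you have access to) and typical sampling settings; the exact parameters are assumptions, not the web UIs' actual configuration, and device_map="auto" requires the accelerate package.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: an instruction-tuned checkpoint; base models such as plain
# gpt2 or Llama only do raw next-token completion and will not "chat".
model_id = "meta-llama/Llama-2-7b-chat-hf"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# Wrap the prompt in the model's chat format instead of feeding raw text.
messages = [{"role": "user", "content": "Explain beam search in one paragraph."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Sampling settings in the ballpark of what hosted chat UIs use; greedy
# decoding with the default length limit is a common source of odd output.
output_ids = model.generate(
    input_ids,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
    repetition_penalty=1.1,
)
# Decode only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```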

Change Hugging Face HF_MODULES_CACHE

When I use from_pretrained to load a model, I get an error that I don't have write permission to the "./cache" directory, specifically when this line is invoked: https://github.com/huggingface/transformers/blob/main/src/transformers/dynamic_module_utils.py#L54.
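One answer-style sketch: the dynamic-module cache location comes from the HF_MODULES_CACHE environment variable (falling back to a path under HF_HOME), and it is read when transformers is imported, so point it at a writable directory before the import. The paths below are placeholders.

```python
import os

# HF_MODULES_CACHE is read when transformers is imported, so set it first.
# Both paths are placeholders; use any directory you can write to.
os.environ["HF_MODULES_CACHE"] = "/tmp/hf_modules"
# os.environ["HF_HOME"] = "/tmp/hf_home"  # alternative: relocate every HF cache

from transformers import AutoModel

# cache_dir only relocates the downloaded weights; the dynamic-module cache
# configured above is what dynamic_module_utils.py tries to create.
model = AutoModel.from_pretrained("bert-base-uncased", cache_dir="/tmp/hf_weights")
```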

Hugging Face Trainer with 2 optimizers

Is there any way to use the Hugging Face Trainer with two optimizers? I need to train two parts of my model iteratively, but the Trainer object seems to take only one optimizer.
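Out of the box, Trainer's optimizers argument accepts a single (optimizer, scheduler) pair. One workaround sketch: subclass torch.optim.Optimizer to bundle two optimizers and alternate between them on each step. The encoder/head split, learning rates, and alternating schedule below are assumptions for illustration, and checkpoint resume or accelerate integration may need extra care; treat this as a starting point, not a drop-in solution.

```python
import torch
from transformers import Trainer, TrainingArguments

class AlternatingOptimizer(torch.optim.Optimizer):
    """Bundles two optimizers and steps them on alternating calls."""

    def __init__(self, opt_a: torch.optim.Optimizer, opt_b: torch.optim.Optimizer):
        self.opt_a, self.opt_b = opt_a, opt_b
        self._calls = 0
        # Register both optimizers' param groups so LR schedulers and
        # Trainer's learning-rate logging see every parameter.
        super().__init__(opt_a.param_groups + opt_b.param_groups, defaults={})

    def step(self, closure=None):
        # Train part A on even steps, part B on odd steps.
        inner = self.opt_a if self._calls % 2 == 0 else self.opt_b
        loss = inner.step(closure)
        self._calls += 1
        return loss

    def zero_grad(self, set_to_none=True):
        self.opt_a.zero_grad(set_to_none=set_to_none)
        self.opt_b.zero_grad(set_to_none=set_to_none)

    def state_dict(self):
        return {"opt_a": self.opt_a.state_dict(), "opt_b": self.opt_b.state_dict()}

    def load_state_dict(self, state_dict):
        self.opt_a.load_state_dict(state_dict["opt_a"])
        self.opt_b.load_state_dict(state_dict["opt_b"])

# Hypothetical wiring: `model.encoder` / `model.head` and `train_dataset`
# stand in for your own two model parts and data.
opt_a = torch.optim.AdamW(model.encoder.parameters(), lr=1e-5)
opt_b = torch.optim.AdamW(model.head.parameters(), lr=1e-3)
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out"),
    train_dataset=train_dataset,
    optimizers=(AlternatingOptimizer(opt_a, opt_b), None),  # None = default scheduler
)
```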