NEFTune gives 0 training loss in Transformers
I’m trying to fine-tune my model with NEFTune. The model is trained on Turkish, and I’m getting a training loss of zero. I’ve tried another model, Turkish-GPT2, and there is no issue there; everything works. So the problem is probably with this specific model, but I don’t know how to debug this.
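For context, in recent `transformers` versions NEFTune is usually turned on by passing `neftune_noise_alpha` to `TrainingArguments` (or to `SFTTrainer` in `trl`). The technique itself only adds uniform noise to the embedding outputs during training, so by itself it shouldn’t drive the loss to zero; a zero loss usually points at the labels or the model head instead. Here is a minimal NumPy sketch of what NEFTune injects (function name and values are mine, following the published scaling `alpha / sqrt(seq_len * dim)`):

```python
import numpy as np

def neftune_noise(embeddings: np.ndarray, alpha: float, rng: np.random.Generator) -> np.ndarray:
    """Add NEFTune-style uniform noise to token embeddings (training only)."""
    seq_len, dim = embeddings.shape
    # NEFTune scales Uniform(-1, 1) noise by alpha / sqrt(seq_len * dim)
    scale = alpha / np.sqrt(seq_len * dim)
    return embeddings + rng.uniform(-scale, scale, size=embeddings.shape)

rng = np.random.default_rng(0)
emb = rng.standard_normal((16, 64))       # 16 tokens, 64-dim embeddings
noisy = neftune_noise(emb, alpha=5.0, rng=rng)
```

Since the perturbation is this small and bounded, a constant zero loss is more likely caused by something like all labels being set to the ignore index (-100) for this particular model/tokenizer combination than by NEFTune itself.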