[Jinja] GPT4All-Chat 3.5.1 breaks TheBloke/OpenHermes-2.5-Mistral-7B-GGUF #3281
I respectfully disagree. Many models were not working with prior versions of GPT4All-Chat (v3.4.2 and earlier), and users were forced to painfully change the chat template in the model's tokenizer_config.json (in particular the …). Here is also a statement by Gonzochess75 from Discord:
> But I agree that maybe there should have been more testing, and fixing of a wider variety of models, before pushing the update to users.
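For context on the edits being described: a model's chat template lives in its tokenizer_config.json as a Jinja string under the `chat_template` key. A minimal ChatML-style entry (the format OpenHermes-2.5 uses) looks roughly like the sketch below; the exact template shipped with any particular model may differ, so this is an illustration, not the file's actual contents:

```json
{
  "chat_template": "{% for message in messages %}{{ '<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>\n' }}{% endfor %}{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}"
}
```

Editing this string by hand inside a JSON file (with all newlines escaped as `\n`) is the "painful" step users previously had to do when a model's template didn't work.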
Out of the 12 models I use regularly, none of them work as of the last update. They all worked wonderfully under 3.4.2 and all are broken under 3.5.1.
Is every non-Nomic model being rendered useless and inoperable going to be addressed as a FLAW, or is it a new feature that I am supposed to be happy and joyous about at all times?
Basically, is there a plan to go back to some kind of Jinja-based default prompt that just works with any side-loaded model? Lawyer talk and nanny gaslighting aside, the expected real-world behavior was that side-loaded models just worked once put into the model folder: compatible with many, many models, at any quant from 4-bit to 16-bit, almost any model you wanted. Thousands, in fact.
That was the functionality up until 3.5, after all. Truth be told, 99% of Hugging Face models just worked out of the box with very little or NO fiddling. Will all those THOUSANDS of LLM models on Hugging Face that made your product popular, because they side-loaded EASILY, now have to stop advertising as working with your product?
TheBloke's models are suddenly not compatible with the software?
TheBloke/OpenHermes-2.5-Mistral-7B-GGUF
Ran the script ... it won't just find and install the correct template?
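For reference, OpenHermes-2.5-Mistral-7B is a ChatML-format model. A minimal plain-Python sketch of what a ChatML chat template expands to is below; this assumes the standard ChatML markers and is an illustration of the prompt format, not GPT4All's actual template renderer:

```python
def apply_chatml(messages, add_generation_prompt=True):
    """Expand a list of {'role', 'content'} messages into a ChatML prompt.

    Mirrors what the Jinja chat template shipped with ChatML models
    (such as OpenHermes-2.5) produces; illustrative only.
    """
    prompt = ""
    for m in messages:
        prompt += f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n"
    if add_generation_prompt:
        # Open the assistant turn so the model continues from here.
        prompt += "<|im_start|>assistant\n"
    return prompt


if __name__ == "__main__":
    msgs = [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ]
    print(apply_chatml(msgs))
```

If the template GPT4All applies to a side-loaded model does not produce this shape for a ChatML model, generations will typically degrade or fail outright.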
Nope. This is a lot of lost productivity.
TheBloke and folks like him ... MADE YOU ...
Those side-loaded models are why many folks use you. We only use this software because it gave us the EASE and FREEDOM to use any model WE CHOSE, on our machine, under OUR TERMS. 99 out of 100 AI models across 5 or 6 architectures used to work perfectly with the defaults, out of the box.
If that goes, your software gets uninstalled. I don't use your curated models daily at all, so whether they work or not matters not in the least to me. My freedom to use the model I want, like I could last week, does.