Can GPT4ALL be used? #120
Unanswered
mrvikramkohli
asked this question in Q&A
Replies: 1 comment 2 replies
-
If I remember correctly, you can simply use the OpenAI estimators but point the backend at your local GPT4All instance instead of api.openai.com; GPT4All should expose OpenAI-compatible API endpoints. Alternatively, you can use my fork, Scikit-Ollama, but that shouldn't be necessary. If you look at the issues where I commented, there should also be an example I gave. Currently on mobile, so I can't really search well.
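Roughly, something like the sketch below should do it. It isn't tested: it assumes GPT4All's local API server is enabled (it usually listens on `http://localhost:4891/v1`) and that your scikit-llm version supports custom OpenAI-compatible URLs via `SKLLMConfig.set_gpt_url` and the `custom_url::` model prefix; please double-check both against the backend families doc. The model name is just a placeholder for whatever you have loaded in GPT4All.

```python
# Rough, untested sketch: assumes GPT4All's local API server is running
# (by default at http://localhost:4891/v1) and that this scikit-llm version
# supports custom OpenAI-compatible URLs via set_gpt_url + "custom_url::".
from skllm.config import SKLLMConfig
from skllm.models.gpt.classification.zero_shot import ZeroShotGPTClassifier

SKLLMConfig.set_openai_key("any-string")              # local servers typically ignore the key
SKLLMConfig.set_gpt_url("http://localhost:4891/v1/")  # assumed GPT4All server address

# "custom_url::" tells scikit-llm to call the configured URL instead of the
# OpenAI API; the model name must match the model loaded in GPT4All (placeholder here).
clf = ZeroShotGPTClassifier(model="custom_url::Llama-3-8B-Instruct")
clf.fit(None, ["positive", "negative", "neutral"])    # zero-shot: only candidate labels needed
print(clf.predict(["The new release works great on my laptop."]))
```

Any estimator that uses the OpenAI backend should work the same way; only the config and the model prefix change.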
-
Hi,
I believe there was a very old post about the ability to use GPT4All with Scikit-LLM, but it was experimental. I would like to run GPT4All locally. In the backend families doc I don't see GPT4All listed, but I wanted to check: if I change the namespace to GPT4All, will it work? And for a local instance, what would I set in the SKLLM config, and which estimators are available to use?
Thanks for your help!
Vikram