Llama 2 support?

Replies: 1 comment
Technically, Llama 2 7b and 13b can be used by sideloading the models into gpt4all (the underlying architecture appears to be identical to Llama 1). You can find the instructions here: https://docs.gpt4all.io/gpt4all_chat.html#sideloading-any-ggml-model. You would also need to install the latest scikit-llm directly from git. Unfortunately, the 70b version is not supported for now.
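For anyone trying this end to end, here is a rough sketch of what the setup could look like. The repository URL, the `gpt4all::` model prefix, the `openai_model` parameter name, and the GGML filename are assumptions based on the scikit-llm docs of that period, not a verified recipe:

```python
# Minimal sketch: using a sideloaded Llama 2 GGML model via scikit-llm's
# GPT4All backend. Model filename is a placeholder for whatever file you
# sideloaded following the gpt4all docs linked above.
#
# Install the latest scikit-llm from git first (repository URL assumed):
#   pip install git+https://github.com/iryna-kondr/scikit-llm.git
from skllm import ZeroShotGPTClassifier
from skllm.datasets import get_classification_dataset

# Small demo classification dataset bundled with scikit-llm
X, y = get_classification_dataset()

# Point the classifier at the local GGML model instead of an OpenAI model
clf = ZeroShotGPTClassifier(openai_model="gpt4all::llama-2-7b-chat.ggmlv3.q4_0.bin")
clf.fit(X, y)
labels = clf.predict(X)
```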