gptneox.cpp for OpenAssistant models #1202
byroneverson started this conversation in Show and tell
-
Cool, gg's implementation seems to have an error. It would be nice if you could use your experience with NeoX and apply it: https://github.com/ggerganov/ggml/tree/master/examples/stablelm#warning
-
Put together a quick fork of llama.cpp for using OpenAssistant StableLM and Pythia models. So far the 7B StableLM and 12B Pythia models are supported, but I'm sure I will add more soon as they are made available. There are some easy-to-use scripts in the scripts directory for those of you who want to try it out quickly. Refer to the README-GPTNEOX for more details.
https://github.com/byroneverson/gptneox.cpp
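
For anyone who wants a rough idea of the workflow before reading the README, a minimal sketch might look like the following. It assumes the fork keeps llama.cpp's Makefile build, and the script name in the last step is only a placeholder; README-GPTNEOX and the scripts directory have the actual commands and model names.

```bash
# Clone and build the fork (assumes the upstream llama.cpp Makefile still works here)
git clone https://github.com/byroneverson/gptneox.cpp
cd gptneox.cpp
make

# Run one of the helper scripts in scripts/ to fetch, convert, and chat with a model.
# The script name below is a placeholder; check README-GPTNEOX for the real ones.
./scripts/<your-model-script>.sh
```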