1B Model not using tools when running from ollama #216
-
When using the home-llm 1B model, it correctly uses the tools, like turning off the lights. However, it is very slow once the devices are added to the prompt, so I tried running the model on ollama so I could use a GPU. Without ollama it responds correctly and turns off the lights, but with ollama it answers with:

(I removed one ` for GitHub's formatting.) It seems like it wants to use the tool, but somewhere it goes wrong.
Replies: 1 comment 1 reply
-
Make sure in your settings that the prompt format is set to:

Home-LLM (v1-v3)
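For reference, when the prompt format matches what the model was trained on, a Home-LLM model wraps its service call in a fenced ```homeassistant block containing JSON roughly like the following (the entity name here is a made-up example from my own setup; yours will differ):

```json
{"service": "light.turn_off", "target_device": "light.living_room"}
```

If the backend applies a different chat template (which ollama does by default), the model can produce an almost-correct block like the one quoted above that the integration then fails to parse, which matches the behavior you are seeing.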