Ollama code completion has a problem #893

Open
fashen97 opened this issue Feb 20, 2025 · 2 comments
Labels
bug Something isn't working

Comments

@fashen97

What happened?

When I choose the deepseek-coder-v2:latest model, code completion works well, but when I choose the deepseek-r1:7b model, the completion is just ```, not useful code.

Relevant log output or stack trace

No response

Steps to reproduce

No response

CodeGPT version

2.16.3-241.1

Operating System

macOS

fashen97 added the bug label on Feb 20, 2025
@cavebatsofware

cavebatsofware commented Feb 28, 2025

I am having this problem with codellama:7b and codellama:13b-code (as well as other models) when using code completion with Ollama and CodeGPT (ProxyAI 2.16.4-241.1) against an Ollama server (local network, port forwarded locally) version 0.5.7. Below is a screenshot of the error showing up in WebStorm 2024.3.4. I'm on macOS 15.3.1.

[Screenshot: completion error shown in WebStorm]

{"error":"registry.ollama.ai/library/codellama:13b does not support insert"}

Stacktrace

java.lang.RuntimeException
	at ee.carlrobert.llm.completion.CompletionEventSourceListener.onFailure(CompletionEventSourceListener.java:118)
	at okhttp3.internal.sse.RealEventSource.processResponse(RealEventSource.kt:52)
	at okhttp3.internal.sse.RealEventSource.onResponse(RealEventSource.kt:46)
	at okhttp3.internal.connection.RealCall$AsyncCall.run(RealCall.kt:519)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
	at java.base/java.lang.Thread.run(Thread.java:1583)

I should mention that the chat function is working fine with the same models.

Could be related to issue #799
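
For anyone who wants to reproduce this outside the IDE, here is a minimal sketch in Python (stdlib only; fim_complete is my own helper name, and it assumes Ollama on its default local port). Sending a suffix field to /api/generate is what puts Ollama on its insert/fill-in-the-middle path, and models whose template has no suffix section answer with the same "does not support insert" error shown above:

import json
import urllib.error
import urllib.request

# Assumption: Ollama running locally on its default port.
OLLAMA_URL = "http://localhost:11434/api/generate"

def fim_complete(model: str, prefix: str, suffix: str) -> dict:
    """Request a fill-in-the-middle completion via Ollama's `suffix` field."""
    payload = {
        "model": model,
        "prompt": prefix,
        "suffix": suffix,  # presence of `suffix` selects the insert/FIM path
        "stream": False,
    }
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())
    except urllib.error.HTTPError as err:
        # Models whose template lacks a suffix section land here, e.g.
        # {"error":"registry.ollama.ai/library/codellama:13b does not support insert"}
        return json.loads(err.read())

# Expected to reproduce the error above with codellama:13b; a FIM-capable
# model should return a normal completion instead.
print(fim_complete("codellama:13b", "def add(a, b):\n    return ", "\n"))

You can also inspect a model's template (POST /api/show, or `ollama show`) to check whether it contains a suffix/FIM section at all.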

@cavebatsofware

cavebatsofware commented Feb 28, 2025

Update on this:

It appears that if I change the plugin settings and check the "Use built-in FIM template" box, it works as expected.

This works for me:

[Screenshot: settings with "Use built-in FIM template" checked]

This does not work for me right now:

[Screenshot: settings with "Use built-in FIM template" unchecked]
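
For context on why that checkbox helps, here is a sketch of what a client-side (built-in) FIM template effectively does, using CodeLlama's documented infill tokens (other model families spell their FIM tokens differently, and the model name below is just an example). The client assembles the infill prompt itself and sends it as a plain prompt, so the server never takes the insert path that produced the error above:

import json
import urllib.request

# Assumptions: local Ollama on its default port; <PRE>/<SUF>/<MID> are
# CodeLlama's infill tokens and do not apply to other model families.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_codellama_fim_prompt(prefix: str, suffix: str) -> str:
    # CodeLlama's documented infill format: <PRE> {prefix} <SUF>{suffix} <MID>
    return f"<PRE> {prefix} <SUF>{suffix} <MID>"

payload = {
    "model": "codellama:13b-code",
    "prompt": build_codellama_fim_prompt("def add(a, b):\n    return ", "\n"),
    "raw": True,    # bypass the server-side template so the tokens pass through
    "stream": False,
}
req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read()).get("response"))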
