Conversation roles must alternate user/assistant/user/assistant/... #2112
Replies: 6 comments 13 replies
-
Mistral Instruct and Mixtral Instruct don't accept a system prompt. This error message comes directly from their official chat template.
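Since the default Mistral/Mixtral template rejects a leading "system" message, a common workaround is to merge the system prompt into the first user message before sending the request. A minimal sketch (`merge_system_into_user` is a hypothetical helper, not part of vLLM; the `\n\n` separator is an assumption):

```python
def merge_system_into_user(messages):
    """Fold a leading "system" message into the first "user" message so the
    conversation satisfies templates that require strict user/assistant
    alternation. Hypothetical helper, not part of vLLM."""
    if len(messages) >= 2 and messages[0]["role"] == "system":
        system, first_user, rest = messages[0], messages[1], messages[2:]
        merged = {
            "role": "user",
            "content": system["content"] + "\n\n" + first_user["content"],
        }
        return [merged] + rest
    return list(messages)

msgs = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Who won the world series in 2020?"},
]
print(merge_system_into_user(msgs))
```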
-
whoops! ok, got it!!!!!!!!!!!!! THANK YOU!!!!! VLLM IS AWESOME!!!
-
If you are interested in a Python code snippet that does streaming, it would look like this (the base URL, model selection, and message content are assumptions for a locally launched vLLM server):

```python
from openai import OpenAI

# Point the OpenAI client at the local vLLM OpenAI-compatible server.
openai_api_key = "EMPTY"
openai_api_base = "http://localhost:8000/v1"
client = OpenAI(api_key=openai_api_key, base_url=openai_api_base)

# Pick the first model the server exposes.
models = client.models.list()
model = models.data[0].id

chat_completion = client.chat.completions.create(
    model=model,
    messages=[{"role": "user", "content": "Who won the world series in 2020?"}],
    stream=True,
)

# Print tokens as they arrive.
for chunk in chat_completion:
    content = chunk.choices[0].delta.content
    if content:
        print(content, end="", flush=True)
```
-
This should be an issue to be fixed on the vLLM side. The "system" role is not only used for the system prompt; it can also be used to provide "system" messages during a conversation. GPT (the official OpenAI API) can handle the system role.
-
Faced this error on my first try with vLLM. My workaround is to use a custom chat template, which you can pass to vLLM with the `--chat-template` argument. The content of the template file is:

```jinja
{%- for message in messages %}
{%- if message['role'] == 'system' -%}
{{- message['content'] -}}
{%- else -%}
{%- if message['role'] == 'user' -%}
{{-'[INST] ' + message['content'].rstrip() + ' [/INST]'-}}
{%- else -%}
{{-'' + message['content'] + '</s>' -}}
{%- endif -%}
{%- endif -%}
{%- endfor -%}
{%- if add_generation_prompt -%}
{{-''-}}
{%- endif -%}
```

It now works like a charm 🥳
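For a quick sanity check, the template above can be mirrored in plain Python to see what prompt string it produces. This is just a local-verification sketch, not the file vLLM loads:

```python
def apply_custom_template(messages, add_generation_prompt=False):
    """Plain-Python equivalent of the Jinja chat template above (a sketch)."""
    out = []
    for m in messages:
        if m["role"] == "system":
            # System text passes through unchanged, avoiding the role error.
            out.append(m["content"])
        elif m["role"] == "user":
            out.append("[INST] " + m["content"].rstrip() + " [/INST]")
        else:
            # Assistant turn, terminated with the end-of-sequence token.
            out.append(m["content"] + "</s>")
    # The template's add_generation_prompt branch emits an empty string.
    return "".join(out)

print(apply_custom_template([
    {"role": "system", "content": "You are a helpful assistant. "},
    {"role": "user", "content": "Hello"},
]))
# → You are a helpful assistant. [INST] Hello [/INST]
```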
-
Hi!
Instead of running the LLM inside my LangChain code with the transformers library, I launched an LLM server and had LangChain connect to it as if it were an OpenAI server. The model is managed by vLLM, and you just use it as a "prompt engineer". This made my issue of the prompt appearing inside the answer disappear. Moreover, once the vLLM server is launched, you don't have to manage it anymore, and you can improve your code without relaunching the LLM each time.

(In reply to Anshuman Kumar's question: "@jeanfredd How did you fix this issue??")
-
We are getting this error: `{"object":"error","message":"Conversation roles must alternate user/assistant/user/assistant/...","type":"invalid_request_error","param":null,"code":null}`

using the exact same curl example from the documentation:

```shell
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "mistralai/Mistral-7B-Instruct-v0.2",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "Who won the world series in 2020?"}
    ]
  }'
```

I am sure it is simple, but any ideas?
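As noted earlier in the thread, Mistral's default chat template rejects the "system" role, which is what trips the alternation check on this request. One workaround is to fold the system prompt into the first user message; a sketch of the adjusted request body (the merged wording is an assumption):

```python
import json

# Same request body as the curl example, but with the system prompt folded
# into the single user message so roles strictly alternate.
payload = {
    "model": "mistralai/Mistral-7B-Instruct-v0.2",
    "messages": [
        {
            "role": "user",
            "content": "You are a helpful assistant.\n\n"
                       "Who won the world series in 2020?",
        }
    ],
}
print(json.dumps(payload, indent=2))
```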