🚀 The feature, motivation and pitch
vLLM is a great project. I have run into the following problem: how can I acquire the log probabilities (logprobs) of a given input and output pair?
For example, when the question is "who are you?", I want to obtain the logprobs of the answer "I am a GPT model". I could not find any function that does this.
Alternatives
No response
Additional context
No response
Before submitting a new issue...
Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.
You can pass sampling_params=SamplingParams(prompt_logprobs=...) for offline inference. A corresponding option is also available for online inference; please read the docs.
It seems that does not quite fit this problem. When I ask the LLM "who are you?", I want it to output the logprobs of the answer "I am a GPT model", where "I am a GPT model" is the expected answer that I supply myself.
What is expected is a function like:

def get_logps(question, answer):
    ...
    return logps
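Building on the maintainer's suggestion, one possible sketch of such a helper: concatenate the question and the expected answer into a single prompt, run it through vLLM with SamplingParams(prompt_logprobs=0) and max_tokens=1, and then sum the returned per-position logprobs over the answer's token positions. The extraction step is shown below with plain floats and toy numbers; in real vLLM output each position is a dict mapping token IDs to Logprob objects (with the first position being None), so the function name and data shape here are illustrative assumptions, not part of the vLLM API.

```python
# Hypothetical sketch of the logprob-extraction step for get_logps.
# Assumption: `prompt_logprobs` has one entry per prompt token, where each
# entry is either None (vLLM reports no logprob for the very first prompt
# token) or the float logprob of the token that appears at that position.
# In real vLLM usage you would first flatten each per-position dict of
# Logprob objects down to the logprob of the actual token.

def sum_answer_logprobs(prompt_logprobs, num_answer_tokens):
    """Sum the logprobs of the last `num_answer_tokens` prompt positions.

    The prompt is assumed to be question + answer concatenated, so the
    answer occupies the final `num_answer_tokens` positions.
    """
    answer_entries = prompt_logprobs[-num_answer_tokens:]
    return sum(lp for lp in answer_entries if lp is not None)

# Toy walk-through: pretend the tokenizer split "who are you?" into 4
# tokens and "I am a GPT model" into 4 tokens (numbers are made up).
fake_prompt_logprobs = [None, -1.2, -0.8, -0.5,   # question tokens
                        -2.0, -0.3, -1.1, -0.7]   # answer tokens
logp = sum_answer_logprobs(fake_prompt_logprobs, num_answer_tokens=4)
print(logp)  # total logprob of the answer tokens, approximately -4.1
```

The summed value is the log-probability the model assigns to the whole answer given the question, which is what a get_logps(question, answer) function would return.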