ContextChatEngine does not use context in chat #1228
For further context, I modified /examples/pg-vector-store/query.ts to test the difference between the streaming ContextChatEngine and the queryEngine. This uses the example data, which has a snippet putting Duracell's share of the global alkaline battery market at 29%, so a context-aware engine should be able to answer that question.

Output from streaming chat with context retriever:

For the most accurate and up-to-date information on Duracell's market share in the global alkaline battery market, I recommend consulting recent market research reports or financial statements from Duracell's parent company, Berkshire Hathaway. Industry analysis firms like Nielsen, Statista, or Market Research Future often publish detailed reports that include market share data for major players in the battery industry.

Output from queryEngine:

So we can see the ContextChatEngine is not using the context from the above queries. The script used is below (run after /examples/pg-vector-store/load-docs.ts):

import {
  ContextChatEngine,
  OpenAI,
  PGVectorStore,
  Settings,
  VectorStoreIndex,
} from "llamaindex";

Settings.llm = new OpenAI({ model: "gpt-4o" });

async function main() {
  try {
    const pgvs = new PGVectorStore();
    // Optional - set your collection name; the default is no filter on this field.
    // pgvs.setCollection();

    const messages = [
      {
        // "as const" keeps the role as the literal type expected by ChatMessage.
        role: "user" as const,
        content:
          "What is duracell's market share of the global alkaline battery market?",
      },
    ];
    const query = messages[messages.length - 1].content;

    const index = await VectorStoreIndex.fromVectorStore(pgvs);

    // Below using chatEngine
    const retriever = index.asRetriever({
      topK: {
        TEXT: 3,
        IMAGE: 3,
      },
    });
    const chatEngine = new ContextChatEngine({
      retriever,
    });

    console.log("Output from streaming chat with context retriever:");
    const stream = await chatEngine.chat({
      message: query,
      chatHistory: messages,
      stream: true,
      verbose: true,
    });
    for await (const chunk of stream) {
      process.stdout.write(chunk.response);
    }
    console.log("\n");

    // Below using queryEngine
    const queryEngine = await index.asQueryEngine();
    const response = await queryEngine.query({ query });
    console.log("Output from queryEngine:");
    console.log(response.message.content);
  } catch (err) {
    console.error(err);
  }
}

main()
  .catch(console.error)
  .finally(() => {
    // Exit explicitly so the open Postgres connection does not keep the process alive.
    process.exit(0);
  });
What nodes do you use when calling .chat()?
Code for both
Seems like it's an issue with
Thanks for debugging. I will check it later.
For now I don't have much time to create minimal reproduction code. But I think this might be related to the Memory refactor, or it's a bug from very early commits. We lack test cases here, so if you can provide a more detailed expected behavior in this ticket, I will add it to my todo list.
@marcusschiesser could you please help me on this? |
@marcusschiesser sure, I'll have a look
The issue is present in 6.1 onwards.
When calling .chat() on a ContextChatEngine, the responses clearly do not use the context.
If I run .retrieve directly on the retriever from Index.asRetriever, context is returned just fine. Context is also returned okay if I call
.contextGenerator.generate(prompt)
on the chat engine. It is only when calling .chat that the context does not seem to be included, judging by the response.