Replacing models #20
Comments
wzdavid: Yes, they can be replaced. Put the embedding model under localmodels. A fine-tuned LLM can be accessed by configuring its API.
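A minimal sketch of what that might look like using plain LlamaIndex components (ThinkRAG's own configuration code may differ; the local path, model name, endpoint URL, and API key below are placeholders, not values from the project):

```python
# Sketch only, not ThinkRAG's actual config: load a fine-tuned embedding model
# from the local "localmodels" directory and point the LLM at an
# OpenAI-compatible endpoint serving the fine-tuned model (e.g. via vLLM).
from llama_index.core import Settings
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.openai_like import OpenAILike

# Fine-tuned embedding model copied into the local models directory
# (hypothetical path and model name).
Settings.embed_model = HuggingFaceEmbedding(
    model_name="localmodels/bge-large-zh-finetuned"
)

# Fine-tuned LLM exposed through an OpenAI-compatible API
# (placeholder URL and key).
Settings.llm = OpenAILike(
    model="my-finetuned-llm",
    api_base="http://localhost:8000/v1",
    api_key="EMPTY",
    is_chat_model=True,
)
```

With these settings in place, indexing and querying would use the fine-tuned models end to end.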
OK, thank you very much for the answer. I'll try it the way you suggested.
Hello, regarding the open-source ThinkRAG framework: I want to fine-tune a model and compare its performance before and after fine-tuning. To do that, I'd like to use the framework to batch-process a set of input questions and generate answers, rather than only asking questions one at a time through the front end. Can the framework support this? Looking forward to your reply.
ThinkRAG is built on LlamaIndex. LlamaIndex has an evaluation module that can assess RAG quality, which may fit your scenario, but ThinkRAG does not use it yet.
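As a rough illustration of that idea (a sketch using plain LlamaIndex outside ThinkRAG; the data directory, questions file, and choice of evaluators are assumptions), one could batch-run a list of questions through a query engine and score the answers with LlamaIndex's evaluation module, once with the base models configured and once with the fine-tuned ones:

```python
# Sketch: batch-query a list of questions and score the answers with
# LlamaIndex's built-in evaluators. Assumes Settings.llm / Settings.embed_model
# are already configured (see the earlier sketch); file names are placeholders.
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.evaluation import FaithfulnessEvaluator, RelevancyEvaluator

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

faithfulness = FaithfulnessEvaluator()  # uses the globally configured LLM as judge
relevancy = RelevancyEvaluator()

with open("questions.txt", encoding="utf-8") as f:
    questions = [line.strip() for line in f if line.strip()]

for q in questions:
    response = query_engine.query(q)
    f_result = faithfulness.evaluate_response(response=response)
    r_result = relevancy.evaluate_response(query=q, response=response)
    print(q, str(response), f_result.passing, r_result.passing)
```

Comparing the answers and pass rates of the two runs gives a before/after view of the fine-tuning.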
I'd like to ask: can the individual modules of this RAG framework (the embedding model or the LLM) be replaced with fine-tuned versions?