PROMPT TOO LONG? WHAT DOES THAT MEAN? #1227
Comments
It probably means that you have hit Bolt's limits with regard to project size. I suspect Bolt tries to send all of your source code into the LLM's context window, and if your project already contains too much source code, the LLM rejects the prompt.
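For anyone curious how quickly a project can blow past the limit, here is a rough sketch of estimating a project's token footprint. To be clear, this is not how Bolt actually counts tokens; the 4-characters-per-token heuristic and the skipped directory names are my own assumptions:

```python
import os

# Rough heuristic: ~4 characters per token for English text and code.
CHARS_PER_TOKEN = 4
CONTEXT_LIMIT = 200_000  # the 200k-token window mentioned in this thread


def estimate_project_tokens(root: str) -> int:
    """Walk a project tree and roughly estimate its token count."""
    total_chars = 0
    for dirpath, dirnames, filenames in os.walk(root):
        # Prune directories that would never be sent as source context.
        dirnames[:] = [d for d in dirnames if d not in {"node_modules", ".git", "dist"}]
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "r", encoding="utf-8") as f:
                    total_chars += len(f.read())
            except (UnicodeDecodeError, OSError):
                continue  # skip binaries and unreadable files
    return total_chars // CHARS_PER_TOKEN
```

If `estimate_project_tokens("my-project")` comes back anywhere near `CONTEXT_LIMIT`, a tool that ships the whole tree with every prompt would start getting rejections just like the one in this issue.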
Makes sense. And it explains why I was not using many tokens earlier, but once they reached 10 million they got exhausted very fast.
Exactly. Even when you ask Bolt to just change a constant value somewhere, it will cost you those 200k input tokens plus additional output tokens (3-4 times more expensive than input tokens).

Somewhere on Twitter the Bolt team explained that they want to give developers the option to restrict the context to code parts you choose manually. But that would also mean that, as a user of Bolt, you would have to make a lot of decisions and have an actual understanding of the code that Bolt created. It would no longer be a tool that product managers and ordinary non-coders can work with; even an experienced developer might have a hard time understanding huge amounts of code created by Bolt. Alternatively, Bolt could find a good strategy for automatically choosing which parts of your code to fit into the context window.

I suspect that unless we get much bigger context windows and cheaper prices, this cannot scale beyond a point, and we will have to take control of the code ourselves. Increasing the context window also tends to degrade the quality of the LLM's answers, so there may be a natural limitation; maybe the next generation of LLMs will not have such limitations anymore.

Anyway, for a huge number of use cases, Bolt's approach already works just fine today: small homepages, small tools, scripts, npm packages, and so on. Let's see where things go; more and more products like Bolt will come along, each with different strengths.
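The "automatically choosing which parts of your code fit" idea could be sketched very crudely like this. This is purely illustrative, not Bolt's actual strategy; a real tool would rank files by relevance to the prompt rather than by size, and the function name is hypothetical:

```python
def select_files_for_context(files: dict[str, int], budget: int) -> list[str]:
    """Greedy sketch: pick the smallest files first until the token budget is spent.

    `files` maps a file path to its estimated token count.
    """
    chosen, used = [], 0
    for path, tokens in sorted(files.items(), key=lambda kv: kv[1]):
        if used + tokens <= budget:
            chosen.append(path)
            used += tokens
    return chosen
```

Even this toy version shows the trade-off being discussed: any selection strategy silently drops files, and whether the LLM's answer stays correct depends on whether the dropped files mattered for the request.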
@samora254 thanks for your report, and we appreciate your patience as we are a small team working to support all of the new users! The context on this error, workaround ideas, and future updates on the R&D we are doing on this issue are being tracked in #1322 going forward, so please subscribe there!

@ayalpani is dead on with everything in that message! We are working to manage the context window better automatically, potentially by implementing a multi-agent approach: you would have multiple AIs, each with a 200k context window, working together to help you manage different aspects of the project. This is all R&D at this point, but very exciting.
You're doing a great job. I can't wait to see what the platform becomes by the end of the year. Let me find some more dollars to finalize the app so I can ship ASAP. Students in Kenya are going to love the product I am putting together for them.
Hi Alexander and Ayalpani, if you're still here: I've been trying to deploy my project to Netlify all night without success. I have attached the error.
Describe the bug
I get an error when issuing a prompt. The error says: "prompt is too long: 202135 tokens > 200000 maximum".
Link to the Bolt URL that caused the error
https://bolt.new/~/sb1-969yan
Steps to reproduce
I am giving normal commands for changes to my app, but the commands are not accepted. I've checked that I still have enough tokens, and my prompts are short paragraphs.
Expected behavior
Effect the changes requested.
Screen Recording / Screenshot
No response
Platform
Additional context
No response