
PROMPT TOO LONG? WHAT DOES THAT MEAN? #1227

Closed

samora254 opened this issue Oct 31, 2024 · 6 comments

Comments
@samora254

Describe the bug

I get an error when issuing a prompt. The error says, "prompt is too long: 202135 tokens > 200000 maximum".

Link to the Bolt URL that caused the error

https://bolt.new/~/sb1-969yan

Steps to reproduce

I am giving normal commands for changes to my app, but the commands are not accepted. I've checked that I still have enough tokens, and my prompts are short paragraphs.

Expected behavior

Apply the changes requested.

Screen Recording / Screenshot

No response

Platform

Browser name = Chrome
Full version = 130.0.0.0
Major version = 130
navigator.appName = Netscape
navigator.userAgent = Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/130.0.0.0 Safari/537.36
performance.memory = {
  "totalJSHeapSize": 173556632,
  "usedJSHeapSize": 164409892,
  "jsHeapSizeLimit": 4294705152
}
Username = samora254
Chat ID = e36485e1dab6

Additional context

No response

@ayalpani

ayalpani commented Nov 1, 2024

It probably means that you have hit Bolt's limits with regard to project size. I believe Bolt tries to send all of your source code into the LLM's context window, and if your project already contains too much source code, the LLM rejects the request.
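If that explanation is right, you can roughly gauge whether your project fits in a 200k-token window before sending a prompt. Below is a minimal sketch, assuming the common heuristic of ~4 characters per token (actual tokenizers vary by model, and Bolt's real file selection is unknown; the function name and extension list are my own):

```python
import os

# Rough heuristic: ~4 characters per token for English text and code.
# (An assumption for illustration; real tokenizers differ by model.)
CHARS_PER_TOKEN = 4
CONTEXT_LIMIT = 200_000  # the 200k-token limit cited in the error message

def estimate_project_tokens(root: str,
                            exts=(".js", ".ts", ".tsx", ".css", ".html", ".json")) -> int:
    """Walk a project directory and estimate the total tokens of its source files."""
    total_chars = 0
    for dirpath, dirnames, filenames in os.walk(root):
        # Skip dependency/VCS folders that would presumably not be sent to the model.
        dirnames[:] = [d for d in dirnames if d not in ("node_modules", ".git")]
        for name in filenames:
            if name.endswith(exts):
                path = os.path.join(dirpath, name)
                try:
                    with open(path, encoding="utf-8", errors="ignore") as f:
                        total_chars += len(f.read())
                except OSError:
                    pass  # unreadable file; ignore
    return total_chars // CHARS_PER_TOKEN

tokens = estimate_project_tokens(".")
print(f"~{tokens} tokens; over limit: {tokens > CONTEXT_LIMIT}")
```

An estimate near or above 200k would be consistent with the "prompt is too long" error even when the prompt itself is a short paragraph.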

@samora254
Author

samora254 commented Nov 1, 2024 via email

@ayalpani

ayalpani commented Nov 1, 2024

Exactly. Even when you ask Bolt to just change a constant value somewhere, it will cost you those 200k input tokens plus additional output tokens (which are 3-4 times more expensive than input tokens). Somewhere on Twitter the Bolt team explained that they want to give developers the option to restrict the context to code parts you choose manually.
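To make the cost asymmetry concrete, here is a small sketch using hypothetical rates that match the 3-4x ratio mentioned above ($1 per million input tokens, $4 per million output tokens; these are illustrative figures, not Bolt's or any provider's actual pricing):

```python
# Hypothetical rates chosen only to illustrate the ~4x output/input price ratio.
INPUT_PRICE_PER_MTOK = 1.00   # dollars per million input tokens (assumed)
OUTPUT_PRICE_PER_MTOK = 4.00  # dollars per million output tokens (assumed)

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of a single LLM request at the assumed rates."""
    return (input_tokens * INPUT_PRICE_PER_MTOK
            + output_tokens * OUTPUT_PRICE_PER_MTOK) / 1_000_000

# Changing one constant still ships the whole ~200k-token project as input,
# while the edit itself might produce only ~500 output tokens:
cost = request_cost(200_000, 500)
print(f"${cost:.4f}")  # prints "$0.2020"
```

At these assumed rates the input side dominates, which is why restricting the context to the relevant files would cut costs far more than shortening the reply.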

But that would also mean that, as a user of Bolt, you'd have to make a lot of decisions and have an actual understanding of the code that Bolt created. It would no longer be a tool that product managers and ordinary non-coders can work with. Even an experienced developer might have a hard time understanding the huge amounts of code Bolt generates.

Alternatively, Bolt could find a good strategy for automatically choosing which parts of your code to fit into the context window. My guess is that unless we get much bigger context windows and cheaper prices, this cannot scale beyond a point, and we'll have to take control of the code ourselves. Also, increasing the context window may degrade the quality of the LLM's answers, so there might be a natural limitation. Maybe the next generation of LLMs won't have such limitations anymore.

Anyway, for a huge number of use cases, Bolt's approach already works just fine today: small homepages, small tools, scripts, npm packages, and so on.

Let's see where things go; more and more products like Bolt will come up, each with different strengths.

@kc0tlh
Collaborator

kc0tlh commented Nov 1, 2024

@samora254 thanks for your report, and we appreciate your patience as we are a small team working to support all of the new users! The context on this error, workaround ideas, and future updates on the R&D we are doing on this issue are being tracked in #1322 going forward, so please subscribe there!

@ayalpani is dead on with everything in that message! We are working to manage the context window better automatically, potentially by implementing a multi-agent approach: multiple AIs, each with a 200k context window, working together to help you manage different aspects of the project. This is all R&D at this point, but very exciting.

@kc0tlh kc0tlh closed this as completed Nov 1, 2024
@samora254
Author

samora254 commented Nov 2, 2024 via email

@samora254
Author

samora254 commented Nov 7, 2024 via email
