Connections past the H2SET_MAX_CONCURRENT_STREAMS limit are dropped, is there any waiting list mechanism available? #3271
Comments
The server setting this limit is usually in a situation where it doesn't have the resources (typically memory) to handle more than the stated number of streams, and it told the client so when the h2 connection was set up. If the client ignores the limit (which is a standardized h2 thing), the server doesn't have to bend itself out of shape trying to accommodate it; sending a GOAWAY is reasonable. So it's a client-side problem to work within the parameters the server described.

Lws client-side does have a queuing concept, but I don't think it's wired up to the remote peer's h2 CONCURRENT_STREAMS limit. IIRC, if it's h2 it will try to mux any new client request that can be done on an existing connection. The lib could do it, or you could track how many unterminated http client actions are still ongoing and delay on the app client side, as in the sketch below.
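A minimal sketch of the "delay on the app client side" idea, not taken from the lws sources: keep at most MAX_INFLIGHT client HTTP actions outstanding and hold the rest in a small list until a slot frees up. The connect API and callback reasons used are the normal lws client-HTTP ones; the queue, the limits, the paths and the host/port are invented for illustration.

```c
#include <libwebsockets.h>
#include <string.h>

#define MAX_INFLIGHT 8		/* match the server's MAX_CONCURRENT_STREAMS */
#define MAX_PENDING  64

static struct lws_context *context;
static const char *pending[MAX_PENDING];	/* paths not yet started */
static int n_pending, inflight;

static void try_start_next(void)
{
	struct lws_client_connect_info i;

	while (inflight < MAX_INFLIGHT && n_pending) {
		memset(&i, 0, sizeof(i));
		i.context	= context;
		i.address	= "localhost";
		i.port		= 7681;
		i.path		= pending[--n_pending];
		i.host		= i.address;
		i.origin	= i.address;
		i.method	= "GET";
		i.protocol	= "http";
		i.ssl_connection = LCCSCF_USE_SSL | LCCSCF_ALLOW_SELFSIGNED |
				   LCCSCF_PIPELINE;	/* mux onto one h2 conn */

		if (lws_client_connect_via_info(&i))
			inflight++;
	}
}

static int callback_http(struct lws *wsi, enum lws_callback_reasons reason,
			 void *user, void *in, size_t len)
{
	switch (reason) {
	case LWS_CALLBACK_COMPLETED_CLIENT_HTTP:
	case LWS_CALLBACK_CLOSED_CLIENT_HTTP:
	case LWS_CALLBACK_CLIENT_CONNECTION_ERROR:
		/* a stream finished (or failed): free the slot, start another */
		if (inflight)
			inflight--;
		try_start_next();
		break;
	default:
		break;
	}

	return lws_callback_http_dummy(wsi, reason, user, in, len);
}

static const struct lws_protocols protocols[] = {
	{ "http", callback_http, 0, 0, 0, NULL, 0 },
	{ NULL, NULL, 0, 0, 0, NULL, 0 }
};

int main(void)
{
	struct lws_context_creation_info info;
	int n = 0;

	memset(&info, 0, sizeof(info));
	info.port	= CONTEXT_PORT_NO_LISTEN;	/* client-only context */
	info.protocols	= protocols;
	info.options	= LWS_SERVER_OPTION_DO_SSL_GLOBAL_INIT;

	context = lws_create_context(&info);
	if (!context)
		return 1;

	/* queue more requests than the server will allow concurrently */
	pending[n_pending++] = "/index.html";
	pending[n_pending++] = "/app.js";
	pending[n_pending++] = "/style.css";
	try_start_next();

	while (n >= 0)
		n = lws_service(context, 0);

	lws_context_destroy(context);

	return 0;
}
```

The point of the shape is that the slot is released on any of the three terminal outcomes (completed, closed, connection error), so a failed stream can't leak a slot and stall the rest of the queue.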
Thanks @lws-team for the quick response. I agree with you on the resource limitation. Let's assume I have a server set up with HTML, CSS, JS, and image files under a mount path. The default landing page is index.html, which includes some JS files and image files. When the client (a Chrome or Firefox browser) connects to the server (https://localhost:7681), it receives MAX_CONCURRENT_STREAMS (8) in the server's SETTINGS and somehow does not honor it, sending more than 8 requests (for images or JS files). In resource-limited environments, sending a GOAWAY is completely understandable.
If there's no problem serving more streams simultaneously resource-wise, you can control what the server sends out for CONCURRENT_STREAMS. If it's freertos (a guess, because it seems to default to 8 for you), e.g. something like the sketch below.
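The snippet that originally followed isn't in this capture; what follows is a hedged sketch of one way to raise the advertised value, assuming the lws version in use exposes the http2_settings[] vhost override in struct lws_context_creation_info (index 0 nonzero enables the overrides, indices 1..6 follow the h2 SETTINGS identifiers, so index 3 is SETTINGS_MAX_CONCURRENT_STREAMS). The value 24 is illustrative, not from the original reply; size it to what your platform's memory can actually back.

```c
#include <libwebsockets.h>
#include <string.h>

static struct lws_context *context;

int setup_server(void)
{
	struct lws_context_creation_info info;

	memset(&info, 0, sizeof(info));
	info.port = 7681;
	/* ... mounts, cert/key paths etc. exactly as in your existing setup ... */

	info.http2_settings[0] = 1;	/* nonzero: use the overrides below */
	info.http2_settings[3] = 24;	/* advertised MAX_CONCURRENT_STREAMS */

	context = lws_create_context(&info);

	return context ? 0 : 1;
}
```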
Hi Andy,
I have set up an http2 server with H2SET_MAX_CONCURRENT_STREAMS as 8. When the client sends more than 8 requests, the server is closing all the streams.
Do we currently have any mechanism to maintain a waiting list for streams once the concurrent limit is crossed?
If not, should this be implemented in the library or in the application?
Thanks.