Performance regression in 2.6.1 (after updating from 2.6.0) when using upsertQueryEntries #4909

Open
ceisele-r opened this issue Mar 27, 2025 · 5 comments
Labels
bug, RTK-Query

Comments

@ceisele-r

When using upsertQueryEntries to upsert ~5000 entries, performance degrades massively after updating from 2.6.0 to 2.6.1: the browser hangs completely for a few seconds, where it previously performed well.

[Image]

I suspect #4872 is the cause of the issue, though I have not fully traced it down to that specific change.

But I can confirm that upsertQueryEntries performs well in 2.6.0, while the same code makes the browser hang for ~5s after the update to 2.6.1.
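For reference, the kind of call involved looks roughly like this (a minimal sketch; the API slice, endpoint name, and data shape are made up for illustration and are not the actual app code):

import { configureStore } from '@reduxjs/toolkit'
import { createApi, fetchBaseQuery } from '@reduxjs/toolkit/query'

// Hypothetical API slice; the real endpoints are not shown in this issue.
const api = createApi({
  baseQuery: fetchBaseQuery({ baseUrl: '/api' }),
  tagTypes: ['Item'],
  endpoints: (build) => ({
    getItem: build.query<{ id: number; name: string }, number>({
      query: (id) => `items/${id}`,
      providesTags: (result, error, id) => [{ type: 'Item', id }],
    }),
  }),
})

const store = configureStore({
  reducer: { [api.reducerPath]: api.reducer },
  middleware: (getDefault) => getDefault().concat(api.middleware),
})

// Upsert ~5000 entries into the cache in a single dispatch.
const entries = Array.from({ length: 5000 }, (_, i) => ({
  endpointName: 'getItem' as const,
  arg: i,
  value: { id: i, name: `Item ${i}` },
}))

store.dispatch(api.util.upsertQueryEntries(entries))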

@markerikson
Collaborator

@ceisele-r Hmm. Certainly possible. Could you provide a repro that shows this happening?

@ceisele-r
Author

@markerikson yeah, please see the following:
v2.6.0 with good performance:
https://codesandbox.io/p/devbox/rtk-query-4909-forked-wfr6yd?workspaceId=ws_AYiY7CpdS8ZRZrgT98ctux

v2.6.1 with bad performance:
https://codesandbox.io/p/devbox/redux-with-redux-toolkit-2-forked-z7ng7l?workspaceId=ws_AYiY7CpdS8ZRZrgT98ctux

After clicking the "Test" button, the query starts and upserts the entries.
Once completed, it displays the loaded data.

As you can see in the second (v2.6.1) demo, the preview hangs and the upserts take way longer.
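One quick way to compare the two sandboxes is to time the dispatch itself (a rough sketch; it assumes the api, store, and entries setup from the earlier snippet):

// Measure how long the single upsert dispatch takes on each RTK version.
const start = performance.now()
store.dispatch(api.util.upsertQueryEntries(entries))
console.log(`upsertQueryEntries took ${(performance.now() - start).toFixed(1)} ms`)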

@markerikson
Collaborator

Ouch. Yeah. Confirmed:

[Image]

I think the issue is that internally this is reusing the existing logic for updating the api.provided slice of state, which maps Tag > ID > QueryCacheKey[]:

[Image]

That logic looks like:

updateProvidedBy: {
  reducer(
    draft,
    action: PayloadAction<{
      queryCacheKey: QueryCacheKey
      providedTags: readonly FullTagDescription<string>[]
    }>,
  ) {
    const { queryCacheKey, providedTags } = action.payload

    // Cleanup pass: scan every tag type and every ID bucket to remove any
    // existing occurrences of this cache key.
    for (const tagTypeSubscriptions of Object.values(draft)) {
      for (const idSubscriptions of Object.values(tagTypeSubscriptions)) {
        const foundAt = idSubscriptions.indexOf(queryCacheKey)
        if (foundAt !== -1) {
          idSubscriptions.splice(foundAt, 1)
        }
      }
    }

    // Re-add pass: register the cache key under each tag it now provides.
    for (const { type, id } of providedTags) {
      const subscribedQueries = ((draft[type] ??= {})[
        id || '__internal_without_id'
      ] ??= [])
      const alreadySubscribed = subscribedQueries.includes(queryCacheKey)
      if (!alreadySubscribed) {
        subscribedQueries.push(queryCacheKey)
      }
    }
  },

Unfortunately it's doing a brute-force cleanup of any existing uses of the query cache key in the subscriptions, which means a nested loop over every tag type, ID, and cache-key array. That alone is O(n), but we then rerun that entire scan for each upserted entry over a state that keeps growing, so the total work is roughly O(n^2), and every access goes through the Immer draft proxy on top of that.

Not immediately sure how to rework this logic, but agreed this is a noticeable issue and needs to be addressed.
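To make the cost concrete, here is a simplified model of that cleanup pass (a sketch, not the actual RTK source):

// Simplified shape of the api.provided state: tag type -> ID -> cache keys.
type QueryCacheKey = string
type ProvidedState = Record<string, Record<string, QueryCacheKey[]>>

// The cleanup step scans every (tag, id) bucket looking for the cache key,
// so a single upsert costs O(total cache keys already in the state).
function removeCacheKey(draft: ProvidedState, queryCacheKey: QueryCacheKey) {
  for (const idBuckets of Object.values(draft)) {
    for (const keys of Object.values(idBuckets)) {
      const foundAt = keys.indexOf(queryCacheKey)
      if (foundAt !== -1) {
        keys.splice(foundAt, 1)
      }
    }
  }
}

// upsertQueryEntries repeats that scan once per entry over a state that keeps
// growing, so 5000 upserts do on the order of N^2 work, with every read and
// write going through the Immer draft proxy.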

@markerikson
Collaborator

Okay, so the good news is I've got a prospective fix. I added a lookup table mapping cache keys to their provided tags so we can remove them easily when needed, and that dropped the upsertQueryEntries runtime for this example from 23 seconds to 91 milliseconds.

The catch is that I'm having to alter the structure of the api.provided state, and that currently breaks the Redux DevTools RTK Query display. So we'll need to coordinate getting the DevTools updated to handle the two possible state structures.

I've got a PR up at #4910 that should fix the perf issue, but actually releasing it is going to have to wait until we can get the DevTools updated to handle the change. You should be able to use the PR preview build in the meantime.
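For illustration, the reverse-lookup idea could look something like this (a sketch of the approach described above, not the actual code in #4910):

type QueryCacheKey = string
type TagDescription = { type: string; id?: string | number }

// Existing forward index (tag type -> ID -> cache keys) plus a reverse index
// (cache key -> tags it currently provides) so removal doesn't need a full scan.
interface ProvidedState {
  tags: Record<string, Record<string, QueryCacheKey[]>>
  keys: Record<QueryCacheKey, TagDescription[]>
}

function updateProvidedBy(
  draft: ProvidedState,
  queryCacheKey: QueryCacheKey,
  providedTags: readonly TagDescription[],
) {
  // Remove the key only from the buckets the reverse index says it is in:
  // O(tags for this key) instead of scanning the entire state.
  for (const { type, id } of draft.keys[queryCacheKey] ?? []) {
    const bucket = draft.tags[type]?.[id || '__internal_without_id']
    if (bucket) {
      const foundAt = bucket.indexOf(queryCacheKey)
      if (foundAt !== -1) {
        bucket.splice(foundAt, 1)
      }
    }
  }

  // Re-add the key under its new tags and record them in the reverse index.
  for (const { type, id } of providedTags) {
    const bucket = ((draft.tags[type] ??= {})[id || '__internal_without_id'] ??= [])
    if (!bucket.includes(queryCacheKey)) {
      bucket.push(queryCacheKey)
    }
  }
  draft.keys[queryCacheKey] = [...providedTags]
}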

@ceisele-r
Author

Great @markerikson, thank you for the fast confirmation and fix 👏
It's great to have that performance bottleneck eliminated.
