
Refactor how we handle PreExecution errors #3787

Draft · wants to merge 2 commits into base: main
Conversation

@patrick91 (Member) commented on Feb 18, 2025

Summary by Sourcery

Enhancements:

  • Refactors how pre-execution errors are handled in subscriptions, ensuring that errors occurring before a subscription executes are properly caught and sent to the client.

sourcery-ai bot (Contributor) commented on Feb 18, 2025

Reviewer's Guide by Sourcery

This pull request refactors how PreExecutionError is handled in the subscription protocols (graphql_transport_ws and graphql_ws). The changes ensure that errors occurring before a subscription executes are correctly propagated to the client, and they simplify the subscription execution flow by removing the wrapper function and returning the async generator directly.

Sequence diagram for handling PreExecutionError in graphql_transport_ws

sequenceDiagram
  participant Client
  participant Server
  participant Schema
  participant Operation

  Client->>Server: SubscribeMessage
  Server->>Schema: schema.subscribe(query, variables, operation_name, context, root_value)
  Schema->>Schema: _subscribe(...)
  alt PreExecutionError
    Schema-->>Server: PreExecutionError
    Server->>Operation: send_initial_errors(errors)
    Operation->>Client: Sends error message
  else SubscriptionResult
    Schema-->>Server: AsyncGenerator[ExecutionResult]
    Server->>Operation: result_source
    loop For each result in result_source
      Operation->>Client: send_next(result)
    end
    Operation->>Client: send_operation_message({type: "complete"})
  end

Sequence diagram for handling PreExecutionError in graphql_ws

sequenceDiagram
  participant Client
  participant Server
  participant Schema

  Client->>Server: Subscribe (query, variables, operation_name)
  Server->>Schema: schema.subscribe(query, variables, operation_name, context, root_value)
  Schema->>Schema: _subscribe(...)
  alt PreExecutionError
    Schema-->>Server: PreExecutionError
    Server->>Server: send_message({type: "error", id: operation_id, payload: error})
    Server->>Client: Sends error message
  else SubscriptionResult
    Schema-->>Server: AsyncGenerator[ExecutionResult]
    Server->>Server: result_source
    loop For each result in result_source
      Server->>Server: send_data_message(result, operation_id)
      Server->>Client: Sends data message
    end
    Server->>Server: send_message({type: "complete", id: operation_id})
    Server->>Client: Sends complete message
  end
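The legacy-protocol message shapes in this diagram can be sketched as plain dict builders. The `type` / `id` / `payload` fields follow the legacy graphql-ws protocol as shown above; the helper function names are illustrative, not strawberry's actual functions.

```python
# Sketch of the legacy graphql_ws message shapes used in the diagram.
# Helper names are hypothetical; field layout follows the legacy protocol.
from typing import Any, Dict


def error_message(operation_id: str, error: Dict[str, Any]) -> Dict[str, Any]:
    # Sent when the first result is a PreExecutionError.
    return {"type": "error", "id": operation_id, "payload": error}


def data_message(operation_id: str, result: Dict[str, Any]) -> Dict[str, Any]:
    # Sent for each result yielded by the subscription generator.
    return {"type": "data", "id": operation_id, "payload": result}


def complete_message(operation_id: str) -> Dict[str, Any]:
    # Sent once the generator is exhausted.
    return {"type": "complete", "id": operation_id}
```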

File-Level Changes

Change: Refactors the handling of PreExecutionError in subscription protocols so that errors occurring before subscription execution are correctly propagated to the client.

Details:
  • Ensures PreExecutionError is correctly sent to the client as an error message.
  • Simplifies the subscription execution flow by directly returning the async generator.
  • Removes the wrapper function that was previously used to handle the first result from the subscription.
  • Handles PreExecutionError directly within the main execution loop.
  • Adds a check for PreExecutionError as the first result of the subscription and sends an error message if it occurs.

Files:
  • strawberry/subscriptions/protocols/graphql_transport_ws/handlers.py
  • strawberry/subscriptions/protocols/graphql_ws/handlers.py
  • strawberry/schema/schema.py

@patrick91 (Member, Author) commented:

The next step would be to change the PreExecution error into an exception.

@patrick91 patrick91 marked this pull request as ready for review February 18, 2025 18:17
sourcery-ai bot (Contributor) left a comment:


Hey @patrick91 - I've reviewed your changes - here's some feedback:

Overall Comments:

  • It looks like you're handling PreExecutionError differently for subscriptions and other operations; is this intentional?
  • Consider adding a test case specifically for PreExecutionError in subscriptions to ensure the fix works as expected.
Here's what I looked at during the review
  • 🟢 General issues: all looks good
  • 🟢 Security: all looks good
  • 🟢 Testing: all looks good
  • 🟡 Complexity: 1 issue found
  • 🟢 Documentation: all looks good


else:
    async for result in first_res_or_agen:
        is_first_result = True

issue (complexity): Consider separating the processing of the first result from the rest of the asynchronous iterator to improve readability.

Consider separating the processing of the first result from the subsequent ones to eliminate the inline flag and nested condition: obtain the first value from the async iterator directly, then process the rest. This keeps the behavior identical while making the control flow easier to follow. For instance:

# Replace the current else clause with:

# A sentinel distinguishes "no results" from a legitimate None result.
_NO_RESULT = object()
iterator = result_source.__aiter__()
try:
    first_result = await iterator.__anext__()
except StopAsyncIteration:
    first_result = _NO_RESULT

if first_result is _NO_RESULT:
    pass  # Optionally, handle the case of no results.
elif isinstance(first_result, PreExecutionError):
    await operation.send_initial_errors(first_result.errors)
else:
    await operation.send_next(first_result)
    async for result in iterator:
        await operation.send_next(result)

This refactoring removes the need for an is_first_result flag and simplifies the logic while keeping all behavior intact. (The sentinel avoids misreading a yielded None as "no results", and naming the variable iterator avoids shadowing the built-in aiter.)

@botberry (Member) commented:

Hi, thanks for contributing to Strawberry 🍓!

We noticed that this PR is missing a RELEASE.md file. We use that to automatically do releases here on GitHub and, most importantly, to PyPI!

So as soon as this PR is merged, a release will be made 🚀.

Here's an example of RELEASE.md:

Release type: patch

Description of the changes, ideally with some examples, if adding a new feature.

Release type can be one of patch, minor or major. We use semver, so make sure to pick the appropriate type. If in doubt feel free to ask :)

Here's the tweet text:

🆕 Release (next) is out! Thanks to @patrick91 for the PR 👏

Get it here 👉 https://strawberry.rocks/release/(next)


codecov bot commented Feb 18, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 95.24%. Comparing base (951e56c) to head (0adf2e4).

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #3787      +/-   ##
==========================================
- Coverage   97.23%   95.24%   -2.00%     
==========================================
  Files         503      499       -4     
  Lines       33529    32360    -1169     
  Branches     1716     1679      -37     
==========================================
- Hits        32603    30820    -1783     
- Misses        707     1275     +568     
- Partials      219      265      +46     

@patrick91 patrick91 marked this pull request as draft February 18, 2025 18:23

codspeed-hq bot commented Feb 18, 2025

CodSpeed Performance Report

Merging #3787 will not alter performance

Comparing refactor/ext (0adf2e4) with main (951e56c)

Summary

✅ 21 untouched benchmarks
