
fix: Tests Failing in Windows Environment #757

Open
jessicatarra opened this issue Sep 16, 2024 · 2 comments · May be fixed by #751
Labels
bug Something isn't working

Comments

jessicatarra (Contributor) commented Sep 16, 2024

Is there an existing issue for this?

  • I have searched the existing issues.

Version

main

Description

Currently, there are two issues related to the Windows environment:

  1. The exit codes of some melos commands are unreliable. For example, melos test --no-select is expected to behave the same as melos exec --dir-exists=test --concurrency 1 -- "dart test". However, in the Windows environment pipeline, running the latter command directly yields the actual exit code of the process, while the former does not.

  2. Due to the recent fix that maintains the working directory across script steps (merged in PR #711), the log output looks like the following:

    Microsoft Windows [Version 10.0.20348.2655]
    (c) Microsoft Corporation. All rights reserved.

    d:\a\melos\melos\packages\melos\.dart_tool\a9ed4760>      echo "➡️ step: melos run list" && melos run list || VER>NUL && if %ERRORLEVEL% NEQ 0 (echo __FAILURE_COMMAND_END__) else (echo )
    "➡️ step: melos run list"
    melos run list
      └> echo "list script"
         └> RUNNING

    "list script"

    melos run list
      └> echo "list script"
         └> SUCCESS

    d:\a\melos\melos\packages\melos\.dart_tool\a9ed4760>
    d:\a\melos\melos\packages\melos\.dart_tool\a9ed4760>      echo "➡️ step: echo "hello world"" && echo "hello world" || VER>NUL && if %ERRORLEVEL% NEQ 0 (echo __FAILURE_COMMAND_END__) else (echo )
    "➡️ step: echo "hello world""
    "hello world"

    d:\a\melos\melos\packages\melos\.dart_tool\a9ed4760>
    d:\a\melos\melos\packages\melos\.dart_tool\a9ed4760>SUCCESS
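
A plausible mechanism for the unreliable exit codes in item 1 is visible in the generated batch script above: chaining each step with "|| VER>NUL" resets %ERRORLEVEL%, so a step's failure never reaches the caller. A minimal POSIX-shell sketch of the same masking effect (the commands here are illustrative placeholders, not Melos's actual runner):

```shell
# A command that fails on its own reports a nonzero exit code:
sh -c 'exit 1'
echo "direct exit code: $?"     # prints 1

# Appending "|| <fallback>" (analogous to "|| VER>NUL" in the batch
# script) swallows the failure; the fallback's status wins:
sh -c 'exit 1' || true
echo "wrapped exit code: $?"    # prints 0
```

This is why invoking the underlying command directly reports the real status while the wrapped invocation exits 0.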

Steps to reproduce

Run the test suite as a GitHub Actions Workflow:
https://github.com/invertase/melos/actions/runs/10889115119

Expected behavior

  • Status codes should be reliable and consistent across environments
  • Log output for melos script with steps should be consistently formatted across environments

Screenshots

No response

Additional context and comments

No response

jessicatarra added the "bug" label Sep 16, 2024
jessicatarra linked a pull request Sep 16, 2024 that will close this issue
bvoq commented Sep 17, 2024

This also happens when you add steps as follows:

scripts:
  test:
    steps:
      - melos exec -f --dir-exists=test -- flutter test
      - echo "Continue the step like nothing happened"

Running melos test exits with code 0 on macOS, so this is not platform specific.
However, running melos exec -f --dir-exists=test -- flutter test directly exits with code 1.
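
The behavior described above is consistent with a step runner that executes each step unconditionally and reports only the final step's status. A hedged sketch in plain POSIX shell (not Melos internals) of why the overall exit code comes out as 0:

```shell
# If steps run unconditionally, the last step's status becomes the
# script's exit code, so step 1's failure is lost:
run_steps() {
  sh -c 'exit 1'                                  # step 1: fails
  echo "Continue the step like nothing happened"  # step 2: succeeds
}
run_steps
echo "overall exit code: $?"    # prints 0

# Stopping at the first failure preserves the code:
run_steps_strict() {
  sh -c 'exit 1' || return $?   # step 1 fails -> propagate its status
  echo "never reached"
}
run_steps_strict
echo "strict exit code: $?"     # prints 1
```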

spydon (Collaborator) commented Sep 17, 2024

@bvoq this issue is about Melos's internal Windows tests. The issue you're describing should be reported separately, though.
