
Yarn 2: cache is deleted before pruning dev dependencies, making build slow #918

Closed

osdiab opened this issue May 22, 2021 · 9 comments · Fixed by #990

Comments

@osdiab

osdiab commented May 22, 2021

Describe the bug
If you use yarn berry, the buildpack deletes the cache before pruning dev dependencies:

rm -rf "$cache_dir"

The result is that my builds take way longer than they should: the first cached build is super fast, but when the buildpack prunes dev deps it re-downloads all of my dependencies. This sucks! I don't really understand the justification for deleting the cache there, but it adds around 2+ minutes to my builds for a step that should be near instant.

For now I was going to just disable pruning with the YARN_PRODUCTION=false flag, but that doesn't work either (see #906).
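
For reference, YARN_PRODUCTION is read from a Heroku config var, so setting it would look something like the following (the app name is just a placeholder):

    # Hypothetical app name; this tells the buildpack to skip pruning devDependencies
    heroku config:set YARN_PRODUCTION=false --app my-app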

Versions (please complete the following information):

  • Heroku Stack: heroku-18
  • Node Version: 14
  • NPM or Yarn Version: yarn 2
  • Buildpack Version: latest from git
@danielleadams
Contributor

The cache is deleted because running the prune command does not actually remove the devDependencies. Right now, the trade-off for a smaller slug size is a longer build, but we can continue to investigate whether there's a better way to do this.

@osdiab
Author

osdiab commented May 30, 2021

Discussion on the yarn berry package suggests that the Yarn maintainers chose not to build this functionality in, and instead let people provide it as a yarn plugin:

If you're open to using that plugin, it could make this possible fairly easily!
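
For context, the plugin route would look roughly like this (a sketch based on the Yarn 2 docs, not something the buildpack does today):

    # Adds the workspace-tools plugin to the repo so it's available at build time
    yarn plugin import workspace-tools
    # Installs only production dependencies across all workspaces
    yarn workspaces focus --all --production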

@danielleadams removed their assignment Jun 5, 2021
@imajes

imajes commented Jun 29, 2021

@danielleadams, @heroku: is there a fix for this, or any movement? Our build times have gotten significantly worse because of it, and I'm conscious that it just puts more pressure on Yarn mirrors, which are often provided out of generosity. Re-downloading everything because the cache was purged is not very economical.

@danielleadams
Contributor

@imajes Unfortunately, I have left Heroku, so I'm no longer working on this issue.

@MaxMartiFoundry

You can work around this by uninstalling the workspace-tools plugin before this step in the buildpack runs:

if has_yarn_workspace_plugin_installed "$build_dir"; then
  echo "Running 'yarn workspaces focus --all --production'"
  meta_set "workspace-plugin-present" "true"
  # The cache is removed beforehand because the command is running an install on devDeps, and
  # it will not remove the existing dependencies beforehand.
  rm -rf "$cache_dir"
  monitor "yarn-prune" yarn workspaces focus --all --production
  meta_set "skipped-prune" "false"
else
  meta_set "workspace-plugin-present" "false"
  echo "Skipping because the Yarn workspace plugin is not present. Add the plugin to your source code with 'yarn plugin import workspace-tools'."
fi

Then add this to the "scripts" section of package.json:

    "heroku-postbuild": "yarn plugin remove @yarnpkg/plugin-workspace-tools",

@lizthegrey

lizthegrey commented Feb 10, 2022

@joshwlewis I think this is the thing you mentioned that still remains to be fixed for Yarn 2 parity after #978 got merged.

@lizthegrey

Thanks for fixing this; v193 does indeed appear to work. Hopefully a future v195 will contain the revert of the revert in v194!

@colincasey
Contributor

@lizthegrey v195 release does contain #999. It's currently running for only 20% of builds but we'll roll it out more broadly if it appears to handle pruning without tripping over the edge case that caused us to revert #990.

@hibachrach

I'm confused: this is still a problem, no? Can this issue be reopened? See #1056 for an example.
