Local Cache Not Being Reused Across Builds with Local File System Caching in Docker Buildx #2884
Closed · aditya-kashyapp started this conversation in Show and tell · 0 replies
Current Implementation:
In our CI/CD pipeline setup with Jenkins, we want to leverage Docker Buildx for efficient image builds while using a centralized cache stored on a local file system (e.g., /mnt/cache, which is mounted into the Jenkins worker pod via a PV/PVC).
We run Docker inside the Jenkins worker pod, but after Kubernetes switched its runtime from Docker to containerd, the time taken to build Docker images inside the worker increased significantly.
So we decided to switch from the plain docker build command to docker buildx for building our images, which lets us leverage the Buildx cache export and import feature (--cache-from and --cache-to).
For a given build pipeline, if a build succeeds, its layers are cached and exported to a central cache location so the next build on the same pipeline can reuse them. That location is mounted as a PV backed by EFS and is used for both storing and retrieving the cache.
We are experiencing an issue where Docker Buildx is not reusing the cache effectively across builds, despite using the local file system cache. Specifically, when building different components (build pipelines) with the same dependencies (i.e., the same base image and the same requirement.txt for Python package installation), the cached layers are not picked up during the build, even though the cached blobs are present on the file system.
We configure DOCKER_BUILDKIT and CACHE_PATH as environment variables in the Pod Spec.
Jenkins dynamically creates and destroys worker pods for each build.
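For reference, the buildx invocation we are aiming for looks roughly like the sketch below. The image name and build context are placeholders, not our actual values; the command is composed into a string here only so the cache flags are easy to see:

```shell
#!/bin/sh
# Placeholder values standing in for the pipeline's real variables.
CACHE_PATH=/mnt/cache
IMAGE=component1:latest   # hypothetical image tag, for illustration only

# Import cache from, and export cache to, the shared local directory.
BUILD_CMD="docker buildx build \
  --cache-from type=local,src=${CACHE_PATH} \
  --cache-to type=local,dest=${CACHE_PATH},mode=max \
  -t ${IMAGE} --load ."

echo "${BUILD_CMD}"
```

In the real pipeline this command is executed directly rather than echoed.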
Challenges:
Cache Not Utilized Across Components:
For example (build pipelines are referred to as components for simplicity): if component1 is built using base image1 and, after a successful build, its layers are exported to the central cache, then when component2 (which shares base image1) is built, it does not reuse the cached layers exported by the component1 build. This results in redundant downloads and rebuilds of layers that already exist.
Cache Not Reused After Build Failures:
When a build fails, the cached layers from the previous successful build are not reused for the next subsequent build.
This leads to repeated rebuilding from scratch for subsequent attempts.
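One thing I noticed in the docs that may be relevant to both challenges: if I understand them correctly, the local cache exporter defaults to mode=min, which only records layers present in the final image, while mode=max also records layers of intermediate steps and stages, giving later builds more to match against. (The cache export itself only happens at the end of a successful build, so a failed build exports nothing.) The flag difference, with placeholder values:

```shell
#!/bin/sh
# Illustrative flags only, not our actual script.
CACHE_PATH=/mnt/cache

# mode=min (the default) caches only the resulting image's layers;
# mode=max additionally caches layers of intermediate build steps.
CACHE_TO="type=local,dest=${CACHE_PATH},mode=max"
echo "--cache-to ${CACHE_TO}"
```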
This is our Build.sh script for building the Docker image, where we create the Docker builder and build the image for each run:
These are the values of the Constants:
${CACHE_PATH} = /mnt/cache
${DEPENDENCIES} = dependencies
${ROOT_FOLDER} = /home/jenkins/agent/workspace/
The cache path and dependencies are constant for every build pipeline, but the root folder location changes with the build pipeline.
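The pattern in our Build.sh is roughly the following (a minimal sketch with placeholder names, not the actual script). As I understand it, local cache export requires a builder using the docker-container driver; the default docker driver cannot export type=local cache:

```shell
#!/bin/sh
# Placeholder values mirroring the constants above; builder name is hypothetical.
CACHE_PATH=/mnt/cache
ROOT_FOLDER=/home/jenkins/agent/workspace/
BUILDER_NAME=jenkins-builder

# Create (and select) a builder backed by the docker-container driver,
# which supports the local cache exporter.
CREATE_CMD="docker buildx create --name ${BUILDER_NAME} --driver docker-container --use"

# Build using the shared local cache for both import and export.
BUILD_CMD="docker buildx build \
  --builder ${BUILDER_NAME} \
  --cache-from type=local,src=${CACHE_PATH} \
  --cache-to type=local,dest=${CACHE_PATH},mode=max \
  -t component1:latest ${ROOT_FOLDER}component1"

echo "${CREATE_CMD}"
echo "${BUILD_CMD}"
```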
Jenkins logs where the build layers are cached: for the same build pipeline, if the previous build was successful, the layers are reused.
Logs for a build where the layers are not cached: even though the same base Docker image is used, the cache is not picked up when the build belongs to a different pipeline.
Questions:
I found this warning in the Docker docs and am not sure what exactly it means. Is the cache location overwritten? I can still see layers from older builds in the cache folder:
As a general rule, each cache writes to some location. No location can be written to twice, without overwriting the previously cached data. If you want to maintain multiple scoped caches (for example, a cache per Git branch), then ensure that you use different locations for exported cache.
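If I read that warning correctly, it suggests scoping each writer to its own location so exports do not clobber each other, while reads can come from several locations (as far as I know, --cache-from may be repeated). A sketch of what I think it is suggesting, with placeholder names:

```shell
#!/bin/sh
# Placeholder values; the scoping key could equally be a Git branch name.
CACHE_PATH=/mnt/cache
COMPONENT=component1
SCOPED_CACHE="${CACHE_PATH}/${COMPONENT}"

# Read from a shared cache and the component's own cache, but write only
# to the component-scoped location so no two pipelines overwrite each other.
echo "--cache-from type=local,src=${CACHE_PATH}/base"
echo "--cache-from type=local,src=${SCOPED_CACHE}"
echo "--cache-to type=local,dest=${SCOPED_CACHE},mode=max"
```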
Expected Result:
Multiple Docker builds should be able to share a cache directory where:
Steps tried:
Note: We want to use Local Cache export/import.