chore(release): Backport relevant release-24.11 changes (#146)
* chore(release): Update stackableRelease to 24.11

* chore(release): Update image references with stackable24.11.0

* chore(release): Replace githubusercontent references main->release-24.11

* chore: Change docs version from nightly to 24.11

* chore: Remove superfluous page-aliases

* chore: Explicitly bump 24.11.0 to 24.11.1

These changes should be pulled into `main`, but this script needs more
work.

* chore(release): Update image references with stackable24.11.1

* fix(stack/end-to-end-security): Skip DB restore if the DB exists (#139)

Otherwise it breaks with:

```
ERROR [flask_migrate] Error: Requested revision 17fcea065655 overlaps with other requested revisions b7851ee5522f
```

The latter revision is the one that already exists in the uploaded dump.
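The fix amounts to a guard around the restore step. A minimal, hypothetical sketch — `database_exists` here checks a marker file purely for illustration; the real stack would query Postgres (e.g. via `psql`) before importing the dump:

```shell
# Hypothetical skip-if-exists guard. The marker-file check stands in for a
# real Postgres existence query, and the touch stands in for the dump import.
DB_DIR=$(mktemp -d)

database_exists() { [ -e "$DB_DIR/$1" ]; }

restore_db() {
  if database_exists "$1"; then
    echo "database $1 exists, skipping restore"
  else
    touch "$DB_DIR/$1"  # stand-in for the actual restore of the uploaded dump
    echo "restored $1"
  fi
}

restore_db superset  # → restored superset
restore_db superset  # → database superset exists, skipping restore
```

Running the restore only when the database is absent means flask_migrate never sees the already-applied revision from the dump alongside the requested one.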

* change references back to main etc.

* prepare update_refs for next release

* typo

* revert changes for products

* set images back to dev except 24.3.0 exceptions

* Update .scripts/update_refs.sh

Co-authored-by: Nick <[email protected]>

---------

Co-authored-by: Techassi <[email protected]>
Co-authored-by: Techassi <[email protected]>
Co-authored-by: Nick Larsen <[email protected]>
Co-authored-by: Nick <[email protected]>
5 people authored Jan 30, 2025
1 parent de005da commit d4baa48
Showing 42 changed files with 42 additions and 53 deletions.
6 changes: 3 additions & 3 deletions .scripts/update_refs.sh
@@ -35,10 +35,10 @@ function prepend {
 function maybe_commit {
 [ "$COMMIT" == "true" ] || return 0
 local MESSAGE="$1"
-PATCH=$(mktemp)
+PATCH=$(mktemp --suffix=.diff)
 git add -u
 git diff --staged > "$PATCH"
-git commit -S -m "$MESSAGE" --no-verify
+git diff-index --quiet HEAD -- || git commit -S -m "$MESSAGE" --no-verify
 echo "patch written to: $PATCH" | prepend "\t"
 }

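The guarded commit above can be exercised in isolation. A throwaway-repo sketch (the identity values are placeholders, and the `-S` signing flag is dropped so the example runs without a GPG key):

```shell
# Demonstrate that the diff-index guard skips `git commit` when nothing is
# staged, instead of letting the commit fail with "nothing to commit".
repo=$(mktemp -d)
git -C "$repo" init -q
git -C "$repo" -c user.name=demo -c user.email=demo@example.com \
  commit -q --allow-empty -m "init"

# Nothing is staged: diff-index exits 0, so the right-hand side never runs.
git -C "$repo" diff-index --quiet HEAD -- || \
  git -C "$repo" -c user.name=demo -c user.email=demo@example.com \
    commit -q -m "Update image references" --no-verify

git -C "$repo" rev-list --count HEAD  # still 1 commit
```

Without the guard, `maybe_commit` aborts the whole script on any release step that happens to produce no changes.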
@@ -55,7 +55,7 @@ if [[ "$CURRENT_BRANCH" == release-* ]]; then

 # Replace 0.0.0-dev refs with ${STACKABLE_RELEASE}.0
 # TODO (@NickLarsenNZ): handle patches later, and what about release-candidates?
-SEARCH='stackable(0\.0\.0-dev|24\.7\.[0-9]+)' # TODO (@NickLarsenNZ): After https://github.com/stackabletech/stackable-cockpit/issues/310, only search for 0.0.0-dev
+SEARCH='stackable(0\.0\.0-dev)'
 REPLACEMENT="stackable${STACKABLE_RELEASE}.0" # TODO (@NickLarsenNZ): Be a bit smarter about patch releases.
 MESSAGE="Update image references with $REPLACEMENT"
 echo "$MESSAGE"
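On a release branch the script boils down to this regex substitution applied across every matching file. A minimal sketch of the same `SEARCH`/`REPLACEMENT` pair on a single line (the release number 24.11 is an example value; the real script derives it from the branch name):

```shell
# The substitution performed by update_refs.sh, shown on one image reference.
SEARCH='stackable(0\.0\.0-dev)'
STACKABLE_RELEASE="24.11"
REPLACEMENT="stackable${STACKABLE_RELEASE}.0"

echo "image: docker.stackable.tech/stackable/tools:1.0.0-stackable0.0.0-dev" \
  | sed -E "s/$SEARCH/$REPLACEMENT/g"
# → image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.11.0
```

This is why narrowing `SEARCH` to only `0.0.0-dev` matters: the old alternation would also rewrite pinned `24.7.x` references that are intentionally left on an older release.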
@@ -8,7 +8,7 @@ spec:
 spec:
 containers:
 - name: start-pyspark-job
-image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
+image: docker.stackable.tech/stackable/tools:1.0.0-stackable0.0.0-dev
 # N.B. it is possible for the scheduler to report that a DAG exists, only for the worker task to fail if a pod is unexpectedly
 # restarted. Additionally, the db-init job takes a few minutes to complete before the cluster is deployed. The wait/watch steps
 # below are not "water-tight" but add a layer of stability by at least ensuring that the db is initialized and ready and that
@@ -8,7 +8,7 @@ spec:
 spec:
 containers:
 - name: start-date-job
-image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
+image: docker.stackable.tech/stackable/tools:1.0.0-stackable0.0.0-dev
 # N.B. it is possible for the scheduler to report that a DAG exists, only for the worker task to fail if a pod is unexpectedly
 # restarted. Additionally, the db-init job takes a few minutes to complete before the cluster is deployed. The wait/watch steps
 # below are not "water-tight" but add a layer of stability by at least ensuring that the db is initialized and ready and that
@@ -9,11 +9,11 @@ spec:
 serviceAccountName: demo-serviceaccount
 initContainers:
 - name: wait-for-kafka
-image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
+image: docker.stackable.tech/stackable/tools:1.0.0-stackable0.0.0-dev
 command: ["bash", "-c", "echo 'Waiting for all kafka brokers to be ready' && kubectl wait --for=condition=ready --timeout=30m pod -l app.kubernetes.io/instance=kafka -l app.kubernetes.io/name=kafka"]
 containers:
 - name: create-nifi-ingestion-job
-image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable0.0.0-dev
 command: ["bash", "-c", "curl -O https://raw.githubusercontent.com/stackabletech/demos/main/demos/data-lakehouse-iceberg-trino-spark/LakehouseKafkaIngest.xml && python -u /tmp/script/script.py"]
 volumeMounts:
 - name: script
@@ -12,11 +12,11 @@ spec:
 serviceAccountName: demo-serviceaccount
 initContainers:
 - name: wait-for-kafka
-image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
+image: docker.stackable.tech/stackable/tools:1.0.0-stackable0.0.0-dev
 command: ["bash", "-c", "echo 'Waiting for all kafka brokers to be ready' && kubectl wait --for=condition=ready --timeout=30m pod -l app.kubernetes.io/name=kafka -l app.kubernetes.io/instance=kafka"]
 containers:
 - name: create-spark-ingestion-job
-image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
+image: docker.stackable.tech/stackable/tools:1.0.0-stackable0.0.0-dev
 command: ["bash", "-c", "echo 'Submitting Spark job' && kubectl apply -f /tmp/manifest/spark-ingestion-job.yaml"]
 volumeMounts:
 - name: manifest
@@ -9,11 +9,11 @@ spec:
 serviceAccountName: demo-serviceaccount
 initContainers:
 - name: wait-for-testdata
-image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
+image: docker.stackable.tech/stackable/tools:1.0.0-stackable0.0.0-dev
 command: ["bash", "-c", "echo 'Waiting for job load-test-data to finish' && kubectl wait --for=condition=complete --timeout=30m job/load-test-data"]
 containers:
 - name: create-tables-in-trino
-image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable0.0.0-dev
 command: ["bash", "-c", "python -u /tmp/script/script.py"]
 volumeMounts:
 - name: script
@@ -8,7 +8,7 @@ spec:
 spec:
 containers:
 - name: setup-superset
-image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable0.0.0-dev
 command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/data-lakehouse-iceberg-trino-spark/superset-assets.zip && python -u /tmp/script/script.py"]
 volumeMounts:
 - name: script
4 changes: 2 additions & 2 deletions demos/end-to-end-security/create-spark-report.yaml
@@ -12,7 +12,7 @@ spec:
 serviceAccountName: demo-serviceaccount
 initContainers:
 - name: wait-for-trino-tables
-image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable0.0.0-dev
 command:
 - bash
 - -euo
@@ -23,7 +23,7 @@ spec:
 kubectl wait --timeout=30m --for=condition=complete job/create-tables-in-trino
 containers:
 - name: create-spark-report
-image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable0.0.0-dev
 command:
 - bash
 - -euo
2 changes: 1 addition & 1 deletion demos/end-to-end-security/create-trino-tables.yaml
@@ -8,7 +8,7 @@ spec:
 spec:
 containers:
 - name: create-tables-in-trino
-image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable0.0.0-dev
 command: ["bash", "-c", "python -u /tmp/script/script.py"]
 volumeMounts:
 - name: script
@@ -9,7 +9,7 @@ spec:
 spec:
 containers:
 - name: create-hfile-and-import-to-hbase
-image: docker.stackable.tech/stackable/hbase:2.4.18-stackable24.7.0
+image: docker.stackable.tech/stackable/hbase:2.4.18-stackable0.0.0-dev
 env:
 - name: HADOOP_USER_NAME
 value: stackable
@@ -8,7 +8,7 @@ spec:
 spec:
 containers:
 - name: create-druid-ingestion-job
-image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable0.0.0-dev
 command: ["bash", "-c", "curl -X POST --insecure -H 'Content-Type: application/json' -d @/tmp/ingestion-job-spec/ingestion-job-spec.json https://druid-coordinator:8281/druid/indexer/v1/supervisor"]
 volumeMounts:
 - name: ingestion-job-spec
@@ -8,7 +8,7 @@ spec:
 spec:
 containers:
 - name: create-nifi-ingestion-job
-image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable0.0.0-dev
 command: ["bash", "-c", "curl -O https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-earthquake-data/IngestEarthquakesToKafka.xml && python -u /tmp/script/script.py"]
 volumeMounts:
 - name: script
2 changes: 1 addition & 1 deletion demos/nifi-kafka-druid-earthquake-data/setup-superset.yaml
@@ -8,7 +8,7 @@ spec:
 spec:
 containers:
 - name: setup-superset
-image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable0.0.0-dev
 command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-earthquake-data/superset-assets.zip && python -u /tmp/script/script.py"]
 volumeMounts:
 - name: script
@@ -8,7 +8,7 @@ spec:
 spec:
 containers:
 - name: create-druid-ingestion-job
-image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable0.0.0-dev
 command: ["bash", "-c", "curl -X POST --insecure -H 'Content-Type: application/json' -d @/tmp/ingestion-job-spec/stations-ingestion-job-spec.json https://druid-coordinator:8281/druid/indexer/v1/supervisor && curl -X POST --insecure -H 'Content-Type: application/json' -d @/tmp/ingestion-job-spec/measurements-ingestion-job-spec.json https://druid-coordinator:8281/druid/indexer/v1/supervisor && curl -X POST --insecure -H 'Content-Type: application/json' -d @/tmp/ingestion-job-spec/measurements-compaction-job-spec.json https://druid-coordinator:8281/druid/coordinator/v1/config/compaction"]
 volumeMounts:
 - name: ingestion-job-spec
@@ -8,7 +8,7 @@ spec:
 spec:
 containers:
 - name: create-nifi-ingestion-job
-image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable0.0.0-dev
 command: ["bash", "-c", "curl -O https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-water-level-data/IngestWaterLevelsToKafka.xml && python -u /tmp/script/script.py"]
 volumeMounts:
 - name: script
@@ -8,7 +8,7 @@ spec:
 spec:
 containers:
 - name: setup-superset
-image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable0.0.0-dev
 command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-water-level-data/superset-assets.zip && python -u /tmp/script/script.py"]
 volumeMounts:
 - name: script
2 changes: 1 addition & 1 deletion demos/signal-processing/Dockerfile-nifi
@@ -1,3 +1,3 @@
-FROM docker.stackable.tech/stackable/nifi:1.27.0-stackable24.7.0
+FROM docker.stackable.tech/stackable/nifi:1.27.0-stackable0.0.0-dev

 RUN curl --fail -o /stackable/nifi/postgresql-42.6.0.jar "https://repo.stackable.tech/repository/misc/postgresql-timescaledb/postgresql-42.6.0.jar"
4 changes: 2 additions & 2 deletions demos/signal-processing/create-nifi-ingestion-job.yaml
@@ -9,13 +9,13 @@ spec:
 serviceAccountName: demo-serviceaccount
 initContainers:
 - name: wait-for-timescale-job
-image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
+image: docker.stackable.tech/stackable/tools:1.0.0-stackable0.0.0-dev
 command: ["bash", "-c", "echo 'Waiting for timescaleDB tables to be ready'
 && kubectl wait --for=condition=complete job/create-timescale-tables-job"
 ]
 containers:
 - name: create-nifi-ingestion-job
-image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable0.0.0-dev
 command: ["bash", "-c", "export PGPASSWORD=$(cat /timescale-admin-credentials/password) && \
 curl -O https://raw.githubusercontent.com/stackabletech/demos/main/demos/signal-processing/DownloadAndWriteToDB.xml && \
 sed -i \"s/PLACEHOLDERPGPASSWORD/$PGPASSWORD/g\" DownloadAndWriteToDB.xml && \
2 changes: 1 addition & 1 deletion demos/signal-processing/create-timescale-tables.yaml
@@ -9,7 +9,7 @@ spec:
 serviceAccountName: demo-serviceaccount
 initContainers:
 - name: wait-for-timescale
-image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
+image: docker.stackable.tech/stackable/tools:1.0.0-stackable0.0.0-dev
 command: ["bash", "-c", "echo 'Waiting for timescaleDB to be ready'
 && kubectl wait --for=condition=ready --timeout=30m pod -l app.kubernetes.io/name=postgresql-timescaledb"
 ]
@@ -8,11 +8,11 @@ spec:
 spec:
 initContainers:
 - name: wait-for-testdata
-image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable0.0.0-dev
 command: ["bash", "-c", "echo 'Waiting for job load-ny-taxi-data to finish' && kubectl wait --for=condition=complete --timeout=30m job/load-ny-taxi-data"]
 containers:
 - name: create-spark-anomaly-detection-job
-image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable0.0.0-dev
 command: ["bash", "-c", "echo 'Submitting Spark job' && kubectl apply -f /tmp/manifest/spark-ad-job.yaml"]
 volumeMounts:
 - name: manifest
@@ -8,7 +8,7 @@ spec:
 spec:
 containers:
 - name: setup-superset
-image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable0.0.0-dev
 command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/spark-k8s-anomaly-detection-taxi-data/superset-assets.zip && python -u /tmp/script/script.py"]
 volumeMounts:
 - name: script
2 changes: 1 addition & 1 deletion demos/trino-taxi-data/create-table-in-trino.yaml
@@ -8,7 +8,7 @@ spec:
 spec:
 containers:
 - name: create-ny-taxi-data-table-in-trino
-image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable0.0.0-dev
 command: ["bash", "-c", "python -u /tmp/script/script.py"]
 volumeMounts:
 - name: script
2 changes: 1 addition & 1 deletion demos/trino-taxi-data/setup-superset.yaml
@@ -8,7 +8,7 @@ spec:
 spec:
 containers:
 - name: setup-superset
-image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable0.0.0-dev
 command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/trino-taxi-data/superset-assets.zip && python -u /tmp/script/script.py"]
 volumeMounts:
 - name: script
1 change: 0 additions & 1 deletion docs/modules/demos/pages/airflow-scheduled-job.adoc
@@ -1,5 +1,4 @@
 = airflow-scheduled-job
-:page-aliases: stable@stackablectl::demos/airflow-scheduled-job.adoc
 :description: This demo installs Airflow with Postgres and Redis on Kubernetes, showcasing DAG scheduling, job runs, and status verification via the Airflow UI.

 Install this demo on an existing Kubernetes cluster:
@@ -1,5 +1,4 @@
 = data-lakehouse-iceberg-trino-spark
-:page-aliases: stable@stackablectl::demos/data-lakehouse-iceberg-trino-spark.adoc
 :description: This demo shows a data workload with real-world data volumes using Trino, Kafka, Spark, NiFi, Superset and OPA.

 :demo-code: https://github.com/stackabletech/demos/blob/main/demos/data-lakehouse-iceberg-trino-spark/create-spark-ingestion-job.yaml
1 change: 0 additions & 1 deletion docs/modules/demos/pages/hbase-hdfs-load-cycling-data.adoc
@@ -1,5 +1,4 @@
 = hbase-hdfs-cycling-data
-:page-aliases: stable@stackablectl::demos/hbase-hdfs-load-cycling-data.adoc
 :description: Load cyclist data from HDFS to HBase on Kubernetes using Stackable's demo. Install, copy data, create HFiles, and query efficiently.

 :kaggle: https://www.kaggle.com/datasets/timgid/cyclistic-dataset-google-certificate-capstone?select=Divvy_Trips_2020_Q1.csv
1 change: 0 additions & 1 deletion docs/modules/demos/pages/index.adoc
@@ -1,5 +1,4 @@
 = Demos
-:page-aliases: stable@stackablectl::demos/index.adoc
 :description: Explore Stackable demos showcasing data platform architectures. Includes external components for evaluation.

 The pages in this section guide you on how to use the demos provided by Stackable.
@@ -1,5 +1,4 @@
 = jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data
-:page-aliases: stable@stackablectl::demos/jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data.adoc

 :scikit-lib: https://scikit-learn.org/stable/modules/generated/sklearn.ensemble.IsolationForest.html
 :k8s-cpu: https://kubernetes.io/docs/tasks/debug/debug-cluster/resource-metrics-pipeline/#cpu
1 change: 0 additions & 1 deletion docs/modules/demos/pages/logging.adoc
@@ -1,5 +1,4 @@
 = logging
-:page-aliases: stable@stackablectl::demos/logging.adoc
 :description: Deploy a logging stack with OpenSearch, Vector, and Zookeeper for log data analysis using OpenSearch Dashboards in Kubernetes.

 :k8s-cpu: https://kubernetes.io/docs/tasks/debug/debug-cluster/resource-metrics-pipeline/#cpu
@@ -1,5 +1,4 @@
 = nifi-kafka-druid-earthquake-data
-:page-aliases: stable@stackablectl::demos/nifi-kafka-druid-earthquake-data.adoc
 :description: Install this demo for a showcase of using Kafka, Druid and Superset to view the global earthquake distribution.

 :superset-docs: https://superset.apache.org/docs/using-superset/creating-your-first-dashboard/#creating-charts-in-explore-view
@@ -1,5 +1,4 @@
 = nifi-kafka-druid-water-level-data
-:page-aliases: stable@stackablectl::demos/nifi-kafka-druid-water-level-data.adoc
 :description: Install this demo for a showcase of using Kafka, Druid and Superset to visualize water levels in across Germany.

 :superset: https://superset.apache.org/docs/using-superset/creating-your-first-dashboard/#creating-charts-in-explore-view
@@ -1,5 +1,4 @@
 = spark-k8s-anomaly-detection-taxi-data
-:page-aliases: stable@stackablectl::demos/spark-k8s-anomaly-detection-taxi-data.adoc
 :description: Deploy a Kubernetes-based Spark demo for anomaly detection using the popular New York taxi dataset, featuring Trino, Spark, MinIO, and Superset.

 :scikit-lib: https://scikit-learn.org/stable/modules/generated/sklearn.ensemble.IsolationForest.html
1 change: 0 additions & 1 deletion docs/modules/demos/pages/trino-iceberg.adoc
@@ -1,5 +1,4 @@
 = trino-iceberg
-:page-aliases: stable@stackablectl::demos/trino-iceberg.adoc
 :description: Install and explore Trino with Apache Iceberg for efficient SQL queries and scalable data management in a demo environment.

 :k8s-cpu: https://kubernetes.io/docs/tasks/debug/debug-cluster/resource-metrics-pipeline/#cpu
1 change: 0 additions & 1 deletion docs/modules/demos/pages/trino-taxi-data.adoc
@@ -1,5 +1,4 @@
 = trino-taxi-data
-:page-aliases: stable@stackablectl::demos/trino-taxi-data.adoc
 :description: Install and demo Trino with NYC taxi data: Query with SQL, visualize with Superset, and explore data in MinIO and Trino on Kubernetes.

 :superset-docs: https://superset.apache.org/docs/creating-charts-dashboards/creating-your-first-dashboard#creating-charts-in-explore-view
2 changes: 1 addition & 1 deletion stacks/_templates/jupyterhub.yaml
@@ -50,7 +50,7 @@ options:
 HADOOP_CONF_DIR: "/home/jovyan/hdfs"
 initContainers:
 - name: download-notebook
-image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
+image: docker.stackable.tech/stackable/tools:1.0.0-stackable0.0.0-dev
 command: ['sh', '-c', 'curl https://raw.githubusercontent.com/stackabletech/demos/main/stacks/jupyterhub-pyspark-hdfs/notebook.ipynb -o /notebook/notebook.ipynb']
 volumeMounts:
 - mountPath: /notebook
2 changes: 1 addition & 1 deletion stacks/_templates/keycloak.yaml
@@ -48,7 +48,7 @@ spec:
 - name: tls
 mountPath: /tls/
 - name: create-auth-class
-image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable0.0.0-dev
 command: ["/bin/bash", "-c"]
 args:
 - |