Cannot find catalog plugin class for catalog 'glue_catalog' #222
Comments
You may be missing the following configuration parameters: [configuration attachment lost in extraction]

Try removing the config below first: config("spark.sql.defaultCatalog", "glue_catalog"). This is the configuration I used with no issue: [original snippet lost in extraction; a hedged reconstruction follows]
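(The commenter's actual snippet did not survive extraction. A typical working configuration for Iceberg with the Glue catalog looks something like the sketch below; the warehouse bucket is a placeholder, not from the original comment.)
"""
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.glue_catalog", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.glue_catalog.catalog-impl", "org.apache.iceberg.aws.glue.GlueCatalog")
    .config("spark.sql.catalog.glue_catalog.io-impl", "org.apache.iceberg.aws.s3.S3FileIO")
    # Placeholder bucket -- Iceberg's Glue catalog needs a warehouse location
    # to create new tables.
    .config("spark.sql.catalog.glue_catalog.warehouse", "s3://my-bucket/warehouse/")
    .getOrCreate()
)
"""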
The DATALAKE_FORMATS environment variable was not working. [attachment lost in extraction]
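(For context: with the stock image, AWS's local-development instructions pass DATALAKE_FORMATS at docker run time rather than baking it into a derived image. A rough sketch, with the profile name as a placeholder:)
"""
docker run -it \
  -v ~/.aws:/home/glue_user/.aws \
  -e AWS_PROFILE=my-profile \
  -e DATALAKE_FORMATS=iceberg \
  amazon/aws-glue-libs:glue_libs_4.0.0_image_01 \
  pyspark
"""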
Hello! I'm hitting an error with the Iceberg setup on the glue_libs_4.0.0_image_01 image.
Summary
Cannot use Iceberg in the glue_libs_4.0.0_image_01 image.
Steps to Reproduce
Dockerfile
"""
FROM amazon/aws-glue-libs:glue_libs_4.0.0_image_01
USER root
WORKDIR /app
COPY . /app
# This is a yum-based (Amazon Linux) image; clean yum caches instead of
# apt's /var/lib/apt/lists/*, which does not exist here.
RUN yum update -y && yum install -y \
        wget \
        curl \
        python3-pip \
    && yum clean all
RUN curl "https://d1vvhvl2y92vvt.cloudfront.net/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip" && \
    unzip awscliv2.zip && \
    ./aws/install
RUN pip3 install pyspark==3.5.4
ENV DATALAKE_FORMATS=iceberg
ENTRYPOINT ["bash"]
"""
main.py
"""
from pyspark.sql import SparkSession
spark = (
SparkSession.builder.config("spark.sql.catalog.glue_catalog", "org.apache.iceberg.spark.SparkCatalog")
.config("spark.sql.catalog.glue_catalog.catalog-impl", "org.apache.iceberg.aws.glue.GlueCatalog")
.config("spark.sql.catalog.glue_catalog.io-impl", "org.apache.iceberg.aws.s3.S3FileIO")
.getOrCreate()
)
spark.sql("CREATE TABLE glue_catalog.my_database.my_table (id BIGINT, name STRING) USING iceberg")
"""
After building the image, running the container, and connecting to AWS, I get the following error:
"""
An error occurred while calling o47.sql.
: org.apache.spark.SparkException: Cannot find catalog plugin class for catalog 'glue_catalog': org.apache.iceberg.spark.SparkCatalog
"""
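This exception usually means the Iceberg Spark runtime jar is not on the driver classpath. In this Dockerfile that is plausible because the pip-installed pyspark 3.5.4 shadows the Glue image's bundled Spark 3.3, so the Iceberg jars that DATALAKE_FORMATS wires into the bundled Spark are never loaded. Below is a minimal sketch of a session that pulls the runtime itself; the Maven coordinates, Iceberg version, and warehouse path are assumptions, not from the original report:
"""
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    # Assumed coordinates: the runtime must match the Spark actually in use
    # (pyspark 3.5.4 here -> the 3.5 runtime; Glue 4.0's bundled Spark 3.3
    # would need iceberg-spark-runtime-3.3_2.12 instead).
    .config(
        "spark.jars.packages",
        "org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.5.2,"
        "org.apache.iceberg:iceberg-aws-bundle:1.5.2",
    )
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.glue_catalog", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.glue_catalog.catalog-impl", "org.apache.iceberg.aws.glue.GlueCatalog")
    .config("spark.sql.catalog.glue_catalog.io-impl", "org.apache.iceberg.aws.s3.S3FileIO")
    # Placeholder warehouse path; GlueCatalog needs one to create tables.
    .config("spark.sql.catalog.glue_catalog.warehouse", "s3://my-bucket/warehouse/")
    .getOrCreate()
)

spark.sql("CREATE TABLE glue_catalog.my_database.my_table (id BIGINT, name STRING) USING iceberg")
"""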