docker.io/kubeflow/spark-operator:2.1.0 linux/amd64

docker.io/kubeflow/spark-operator:2.1.0 - China mirror download source · Views: 53
This is the Spark Operator image published by the Kubeflow project. The Spark Operator lets you run and manage Apache Spark applications on a Kubernetes cluster: with this image you can deploy, scale, and monitor Spark jobs declaratively, without driving the Kubernetes API yourself.
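In practice the operator is usually installed with its Helm chart; the sketch below points the chart at the domestic mirror. The value names (image.registry, image.repository, image.tag) are assumptions based on the kubeflow/spark-operator 2.x chart; confirm them with `helm show values` before relying on this.

```shell
# Install the operator from the Kubeflow chart repo, overriding the image
# so that nodes pull from the domestic mirror instead of docker.io.
# NOTE: value names below are assumed from the 2.x chart; verify first.
helm repo add spark-operator https://kubeflow.github.io/spark-operator
helm repo update
helm install spark-operator spark-operator/spark-operator \
  --namespace spark-operator --create-namespace \
  --set image.registry=swr.cn-north-4.myhuaweicloud.com \
  --set image.repository=ddn-k8s/docker.io/kubeflow/spark-operator \
  --set image.tag=2.1.0
```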
Source image: docker.io/kubeflow/spark-operator:2.1.0
Domestic mirror: swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.1.0
Image ID: sha256:a585bb80ff18409222dd9a8ec3181c6cbee6064d310f6b88cf5462f10d6afda0
Image tag: 2.1.0
Size: 1.05GB
Registry: docker.io
Project info: Docker Hub page · project tags
CMD: (not set)
Entrypoint: /usr/bin/entrypoint.sh
Working directory: /opt/spark/work-dir
OS/Arch: linux/amd64
Views: 53
Contributor: de**********i@gmail.com
Image created: 2024-12-11T05:43:47.146733197Z
Synced: 2024-12-24 17:11
Updated: 2025-01-30 18:26
Environment variables

PATH=/opt/java/openjdk/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
JAVA_HOME=/opt/java/openjdk
LANG=en_US.UTF-8
LANGUAGE=en_US:en
LC_ALL=en_US.UTF-8
JAVA_VERSION=jdk-11.0.25+9
SPARK_TGZ_URL=https://archive.apache.org/dist/spark/spark-3.5.3/spark-3.5.3-bin-hadoop3.tgz
SPARK_TGZ_ASC_URL=https://archive.apache.org/dist/spark/spark-3.5.3/spark-3.5.3-bin-hadoop3.tgz.asc
GPG_KEY=0A2D660358B6F6F8071FD16F6606986CF5A8447C
SPARK_HOME=/opt/spark
Image labels

org.opencontainers.image.created: 2024-12-11T05:42:20.357Z
org.opencontainers.image.description: Kubernetes operator for managing the lifecycle of Apache Spark applications on Kubernetes.
org.opencontainers.image.licenses: Apache-2.0
org.opencontainers.image.ref.name: ubuntu
org.opencontainers.image.revision: 664b9d01c42a04a5327e582cc23215c34e9a5020
org.opencontainers.image.source: https://github.com/kubeflow/spark-operator
org.opencontainers.image.title: spark-operator
org.opencontainers.image.url: https://github.com/kubeflow/spark-operator
org.opencontainers.image.version: 2.1.0
Image security scan: see the Trivy scan report

OS: ubuntu 20.04 · Scan engine: Trivy · Scanned: 2024-12-24 17:11

Low: 134 · Medium: 1213 · High: 37 · Critical: 5
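To reproduce a scan like this locally, point Trivy at the mirrored image; a minimal sketch (output format and severity filtering vary by Trivy version):

```shell
# Scan the mirrored image, reporting only HIGH and CRITICAL findings.
trivy image --severity HIGH,CRITICAL \
  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.1.0
```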

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.1.0
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.1.0  docker.io/kubeflow/spark-operator:2.1.0

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.1.0
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.1.0  docker.io/kubeflow/spark-operator:2.1.0
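Note that `ctr` defaults to the `default` containerd namespace, while the Kubernetes CRI uses `k8s.io`; an image pulled as above may therefore be invisible to kubelet. A sketch of the namespaced variant:

```shell
# Pull and retag inside the k8s.io namespace that Kubernetes containers use.
ctr -n k8s.io images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.1.0
ctr -n k8s.io images tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.1.0 docker.io/kubeflow/spark-operator:2.1.0
```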

Shell quick-replace command

sed -i 's#kubeflow/spark-operator:2.1.0#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.1.0#' deployment.yaml
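If the image reference is spread across several manifests, the same substitution can be applied tree-wide; a sketch, where `manifests/` is a placeholder for your manifest directory:

```shell
# Rewrite every YAML file under manifests/ that references the upstream image.
grep -rl --include='*.yaml' 'kubeflow/spark-operator:2.1.0' manifests/ \
  | xargs sed -i 's#kubeflow/spark-operator:2.1.0#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.1.0#'
```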

Ansible quick distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.1.0 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.1.0  docker.io/kubeflow/spark-operator:2.1.0'

Ansible quick distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.1.0 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.1.0  docker.io/kubeflow/spark-operator:2.1.0'
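Once the operator is running, Spark jobs are submitted as SparkApplication custom resources rather than via spark-submit. A minimal sketch against the operator's v1beta2 API; the namespace, service account name, and Spark runtime image are illustrative and depend on how the chart was installed:

```shell
# Submit a SparkPi job as a SparkApplication CR; the operator turns it into
# driver and executor pods.
kubectl apply -f - <<'EOF'
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: spark-pi
  namespace: default
spec:
  type: Scala
  mode: cluster
  image: spark:3.5.3                      # illustrative Spark runtime image
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: local:///opt/spark/examples/jars/spark-examples.jar
  sparkVersion: "3.5.3"
  driver:
    cores: 1
    memory: 512m
    serviceAccount: spark-operator-spark  # name depends on the Helm release
  executor:
    cores: 1
    instances: 1
    memory: 512m
EOF
```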

Image build history


# 2024-12-11 13:43:47  0.00B Configure the command run when the container starts
ENTRYPOINT ["/usr/bin/entrypoint.sh"]
                        
# 2024-12-11 13:43:47  1.04KB Copy new files or directories into the container
COPY entrypoint.sh /usr/bin/ # buildkit
                        
# 2024-12-11 13:43:47  64.95MB Copy new files or directories into the container
COPY /workspace/bin/spark-operator /usr/bin/spark-operator # buildkit
                        
# 2024-12-11 13:42:51  0.00B Specify the user the container runs as
USER 185:185
                        
# 2024-12-11 13:42:51  0.00B Run a command and create a new image layer
RUN |2 SPARK_UID=185 SPARK_GID=185 /bin/sh -c mkdir -p /etc/k8s-webhook-server/serving-certs /home/spark &&     chmod -R g+rw /etc/k8s-webhook-server/serving-certs &&     chown -R spark /etc/k8s-webhook-server/serving-certs /home/spark # buildkit
                        
# 2024-12-11 13:42:51  0.00B Run a command and create a new image layer
RUN |2 SPARK_UID=185 SPARK_GID=185 /bin/sh -c apt-get update     && apt-get install -y tini     && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2024-12-11 13:42:51  0.00B Specify the user the container runs as
USER root
                        
# 2024-12-11 13:42:51  0.00B Define a build argument
ARG SPARK_GID=185
                        
# 2024-12-11 13:42:51  0.00B Define a build argument
ARG SPARK_UID=185
                        
# 2024-10-10 14:58:10  0.00B Specify the user the container runs as
USER spark
                        
# 2024-10-10 14:58:10  303.03MB Run a command and create a new image layer
RUN /bin/sh -c set -ex;     apt-get update;     apt-get install -y python3 python3-pip;     rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2024-10-10 14:58:10  0.00B Specify the user the container runs as
USER root
                        
# 2024-10-10 14:58:10  0.00B Configure the command run when the container starts
ENTRYPOINT ["/opt/entrypoint.sh"]
                        
# 2024-10-10 14:58:10  0.00B Specify the user the container runs as
USER spark
                        
# 2024-10-10 14:58:10  0.00B Set the working directory to /opt/spark/work-dir
WORKDIR /opt/spark/work-dir
                        
# 2024-10-10 14:58:10  0.00B Set environment variable SPARK_HOME
ENV SPARK_HOME=/opt/spark
                        
# 2024-10-10 14:58:10  4.74KB Copy new files or directories into the container
COPY entrypoint.sh /opt/ # buildkit
                        
# 2024-10-10 14:58:10  361.77MB Run a command and create a new image layer
RUN |1 spark_uid=185 /bin/sh -c set -ex;     export SPARK_TMP="$(mktemp -d)";     cd $SPARK_TMP;     wget -nv -O spark.tgz "$SPARK_TGZ_URL";     wget -nv -O spark.tgz.asc "$SPARK_TGZ_ASC_URL";     export GNUPGHOME="$(mktemp -d)";     gpg --batch --keyserver hkps://keys.openpgp.org --recv-key "$GPG_KEY" ||     gpg --batch --keyserver hkps://keyserver.ubuntu.com --recv-keys "$GPG_KEY";     gpg --batch --verify spark.tgz.asc spark.tgz;     gpgconf --kill all;     rm -rf "$GNUPGHOME" spark.tgz.asc;         tar -xf spark.tgz --strip-components=1;     chown -R spark:spark .;     mv jars /opt/spark/;     mv RELEASE /opt/spark/;     mv bin /opt/spark/;     mv sbin /opt/spark/;     mv kubernetes/dockerfiles/spark/decom.sh /opt/;     mv examples /opt/spark/;     ln -s "$(basename /opt/spark/examples/jars/spark-examples_*.jar)" /opt/spark/examples/jars/spark-examples.jar;     mv kubernetes/tests /opt/spark/;     mv data /opt/spark/;     mv python/pyspark /opt/spark/python/pyspark/;     mv python/lib /opt/spark/python/lib/;     mv R /opt/spark/;     chmod a+x /opt/decom.sh;     cd ..;     rm -rf "$SPARK_TMP"; # buildkit
                        
# 2024-10-10 14:58:10  0.00B Set environment variables SPARK_TGZ_URL SPARK_TGZ_ASC_URL GPG_KEY
ENV SPARK_TGZ_URL=https://archive.apache.org/dist/spark/spark-3.5.3/spark-3.5.3-bin-hadoop3.tgz SPARK_TGZ_ASC_URL=https://archive.apache.org/dist/spark/spark-3.5.3/spark-3.5.3-bin-hadoop3.tgz.asc GPG_KEY=0A2D660358B6F6F8071FD16F6606986CF5A8447C
                        
# 2024-10-10 14:58:10  50.89MB Run a command and create a new image layer
RUN |1 spark_uid=185 /bin/sh -c set -ex;     apt-get update;     apt-get install -y gnupg2 wget bash tini libc6 libpam-modules krb5-user libnss3 procps net-tools gosu libnss-wrapper;     mkdir -p /opt/spark;     mkdir /opt/spark/python;     mkdir -p /opt/spark/examples;     mkdir -p /opt/spark/work-dir;     chmod g+w /opt/spark/work-dir;     touch /opt/spark/RELEASE;     chown -R spark:spark /opt/spark;     echo "auth required pam_wheel.so use_uid" >> /etc/pam.d/su;     rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2024-10-10 14:58:10  64.84KB Run a command and create a new image layer
RUN |1 spark_uid=185 /bin/sh -c groupadd --system --gid=${spark_uid} spark &&     useradd --system --uid=${spark_uid} --gid=spark spark # buildkit
                        
# 2024-10-10 14:58:10  0.00B Define a build argument
ARG spark_uid=185
                        
# 2024-10-10 14:58:10  0.00B Configure the command run when the container starts
ENTRYPOINT ["/__cacert_entrypoint.sh"]
                        
# 2024-10-10 14:58:10  5.31KB Copy new files or directories into the container
COPY --chmod=755 entrypoint.sh /__cacert_entrypoint.sh # buildkit
                        
# 2024-10-10 14:58:10  0.00B Run a command and create a new image layer
RUN /bin/sh -c set -eux;     echo "Verifying install ...";     echo "java --version"; java --version;     echo "Complete." # buildkit
                        
# 2024-10-10 14:58:10  141.01MB Run a command and create a new image layer
RUN /bin/sh -c set -eux;     ARCH="$(dpkg --print-architecture)";     case "${ARCH}" in        amd64)          ESUM='84cd7101f39172a4db085fb52940595bb14dad6bc3afb5bf82ee497eceaf86d3';          BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.25%2B9/OpenJDK11U-jre_x64_linux_hotspot_11.0.25_9.tar.gz';          ;;        arm64)          ESUM='e37ba6636e31f3c9191ac7e3fd0ab7fb354a2f3b320d68bfb95efd1e053134c9';          BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.25%2B9/OpenJDK11U-jre_aarch64_linux_hotspot_11.0.25_9.tar.gz';          ;;        armhf)          ESUM='6b7b1297da762cf2b1eb4834073e6a45cda82a359efb17a89eba3fc6b59b4d26';          BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.25%2B9/OpenJDK11U-jre_arm_linux_hotspot_11.0.25_9.tar.gz';          ;;        ppc64el)          ESUM='7e7edaf34c29c304514d60f40f6c9ce58eb3e32b0dec20bb6ccd1cfbb4456698';          BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.25%2B9/OpenJDK11U-jre_ppc64le_linux_hotspot_11.0.25_9.tar.gz';          ;;        s390x)          ESUM='4ec884fe3874e258ae2253d535d3d92d6c313542fd973e8963c2eb87d68fb273';          BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.25%2B9/OpenJDK11U-jre_s390x_linux_hotspot_11.0.25_9.tar.gz';          ;;        *)          echo "Unsupported arch: ${ARCH}";          exit 1;          ;;     esac;     wget --progress=dot:giga -O /tmp/openjdk.tar.gz ${BINARY_URL};     wget --progress=dot:giga -O /tmp/openjdk.tar.gz.sig ${BINARY_URL}.sig;     export GNUPGHOME="$(mktemp -d)";     gpg --batch --keyserver keyserver.ubuntu.com --recv-keys 3B04D753C9050D9A5D343F39843C48A565F8F04B;     gpg --batch --verify /tmp/openjdk.tar.gz.sig /tmp/openjdk.tar.gz;     rm -r "${GNUPGHOME}" /tmp/openjdk.tar.gz.sig;     echo "${ESUM} */tmp/openjdk.tar.gz" | sha256sum -c -;     mkdir -p "$JAVA_HOME";     tar --extract         --file /tmp/openjdk.tar.gz         --directory "$JAVA_HOME"         --strip-components 1         --no-same-owner     ;     rm -f /tmp/openjdk.tar.gz ${JAVA_HOME}/lib/src.zip;     find "$JAVA_HOME/lib" -name '*.so' -exec dirname '{}' ';' | sort -u > /etc/ld.so.conf.d/docker-openjdk.conf;     ldconfig;     java -Xshare:dump; # buildkit
                        
# 2024-10-10 14:58:10  0.00B Set environment variable JAVA_VERSION
ENV JAVA_VERSION=jdk-11.0.25+9
                        
# 2024-10-10 14:58:10  52.87MB Run a command and create a new image layer
RUN /bin/sh -c set -eux;     apt-get update;     DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends         curl         wget         gnupg         fontconfig         ca-certificates p11-kit         tzdata         locales     ;     echo "en_US.UTF-8 UTF-8" >> /etc/locale.gen;     locale-gen en_US.UTF-8;     rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2024-10-10 14:58:10  0.00B Set environment variables LANG LANGUAGE LC_ALL
ENV LANG=en_US.UTF-8 LANGUAGE=en_US:en LC_ALL=en_US.UTF-8
                        
# 2024-10-10 14:58:10  0.00B Set environment variable PATH
ENV PATH=/opt/java/openjdk/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2024-10-10 14:58:10  0.00B Set environment variable JAVA_HOME
ENV JAVA_HOME=/opt/java/openjdk
                        
# 2024-10-10 14:58:10  0.00B 
/bin/sh -c #(nop)  CMD ["/bin/bash"]
                        
# 2024-10-10 14:58:10  72.81MB 
/bin/sh -c #(nop) ADD file:7486147a645d8835a5181c79f00a3606c6b714c83bcbfcd8862221eb14690f9e in / 
                        
# 2024-10-10 14:58:10  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.version=20.04
                        
# 2024-10-10 14:58:10  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.ref.name=ubuntu
                        
# 2024-10-10 14:58:10  0.00B 
/bin/sh -c #(nop)  ARG LAUNCHPAD_BUILD_ARCH
                        
# 2024-10-10 14:58:10  0.00B 
/bin/sh -c #(nop)  ARG RELEASE
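The layer listing above can be reproduced locally once the image has been pulled; `docker history` prints the same creation commands and per-layer sizes:

```shell
# Show each layer's creating command and size without truncation.
docker history --no-trunc docker.io/kubeflow/spark-operator:2.1.0
```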
                        
                    

Image information

{
    "Id": "sha256:a585bb80ff18409222dd9a8ec3181c6cbee6064d310f6b88cf5462f10d6afda0",
    "RepoTags": [
        "kubeflow/spark-operator:2.1.0",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.1.0"
    ],
    "RepoDigests": [
        "kubeflow/spark-operator@sha256:ff96ec3e900dc4610b5c18cab89946d6cd4242c1b4cf1f6192824afd553a2b0f",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator@sha256:4cb43c71e41819294356e29b12dd63daef96ff146011f28019700c4ad9dc45ad"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2024-12-11T05:43:47.146733197Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "185:185",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/opt/java/openjdk/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "JAVA_HOME=/opt/java/openjdk",
            "LANG=en_US.UTF-8",
            "LANGUAGE=en_US:en",
            "LC_ALL=en_US.UTF-8",
            "JAVA_VERSION=jdk-11.0.25+9",
            "SPARK_TGZ_URL=https://archive.apache.org/dist/spark/spark-3.5.3/spark-3.5.3-bin-hadoop3.tgz",
            "SPARK_TGZ_ASC_URL=https://archive.apache.org/dist/spark/spark-3.5.3/spark-3.5.3-bin-hadoop3.tgz.asc",
            "GPG_KEY=0A2D660358B6F6F8071FD16F6606986CF5A8447C",
            "SPARK_HOME=/opt/spark"
        ],
        "Cmd": null,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "/opt/spark/work-dir",
        "Entrypoint": [
            "/usr/bin/entrypoint.sh"
        ],
        "OnBuild": null,
        "Labels": {
            "org.opencontainers.image.created": "2024-12-11T05:42:20.357Z",
            "org.opencontainers.image.description": "Kubernetes operator for managing the lifecycle of Apache Spark applications on Kubernetes.",
            "org.opencontainers.image.licenses": "Apache-2.0",
            "org.opencontainers.image.ref.name": "ubuntu",
            "org.opencontainers.image.revision": "664b9d01c42a04a5327e582cc23215c34e9a5020",
            "org.opencontainers.image.source": "https://github.com/kubeflow/spark-operator",
            "org.opencontainers.image.title": "spark-operator",
            "org.opencontainers.image.url": "https://github.com/kubeflow/spark-operator",
            "org.opencontainers.image.version": "2.1.0"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 1047404404,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/7dc8ad933e699d651cb9f43a79db6315e08c28ebf5515817fe3a2400a05da6d6/diff:/var/lib/docker/overlay2/1dfd2a9223c697a6ca2c96d0c9007b71bde776aa2cd91daffd1497535397166b/diff:/var/lib/docker/overlay2/4ab98b322dd7092c5350f51d21c34f7bea684f250b88dd937d1aabb60003f2c0/diff:/var/lib/docker/overlay2/a74f201e812bca8fab0c09840ab42abfec44be9af4c0bdf18d6ce9619d4cf4b4/diff:/var/lib/docker/overlay2/52f351b28a7b20d09a44637e3585971da4856d48ef06a03ad12a4ccf7662ab1d/diff:/var/lib/docker/overlay2/97dfbbbd464608bf1282a15def33261b6c6193231a94b7eaa69bdf3000dfdc3c/diff:/var/lib/docker/overlay2/153cef14c04f77e16b82fc4d50d54955117480d982f1d2bbd22f7aa3a10f9c77/diff:/var/lib/docker/overlay2/3ea68c59a882a4a21c0cba3d00bd02dd00091aca92b15e5f5c9169c2d2d280d1/diff:/var/lib/docker/overlay2/40ea43c55ea51a66af3b9743608e37a46ddad6bc7dc40d5b9b6d9b6d037365ee/diff:/var/lib/docker/overlay2/94a629ee4b2451511d577e4fab7b35eacb5e806744d6818992353bb640eccd31/diff:/var/lib/docker/overlay2/1801b78a3803e61839ddc8b12075dd49f5c0a9916154038349baf01aa75d4dca/diff:/var/lib/docker/overlay2/2a7538cef17086dea1c75c6246a23016a82b3e1b53ac382cea105dbd8618553c/diff:/var/lib/docker/overlay2/061cfdb7a7d417d3c2cb338841a353101b1a410028ce07415b72418529391fd7/diff:/var/lib/docker/overlay2/364ee650fcb55abc90ea2c6b6b0e7b00ae28674a17a09f26a611d7681de96bc4/diff",
            "MergedDir": "/var/lib/docker/overlay2/3f1856d09f997cd85b780e789602641126d87f9351327958732616317361ee7e/merged",
            "UpperDir": "/var/lib/docker/overlay2/3f1856d09f997cd85b780e789602641126d87f9351327958732616317361ee7e/diff",
            "WorkDir": "/var/lib/docker/overlay2/3f1856d09f997cd85b780e789602641126d87f9351327958732616317361ee7e/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:fffe76c64ef2dee2d80a8bb3ad13d65d596d04a45510b1956a976a69215dae92",
            "sha256:1a6e7e1fc78a1c01ecbc8ce3693faf9816b1380df134458cac72f340d43d1fa8",
            "sha256:60b89701d33eefe6014db3d21b24c8bdb8c0054848cfea3ce44d07e1a22ecf7b",
            "sha256:2a574e81bd82d2c5c8d8a482e12bb8fa85d206da01f4da10ff5b39c0059d1629",
            "sha256:06f79ba26fd42b919085db9e9f953097c2779853058475d8bcbd08e26ddc07a4",
            "sha256:5785eadf7f8e11009e0d1edf0d3fb0347fa29f5a0d5881f85dfdcf2e2cca1da7",
            "sha256:2628edda9834fb4146690501cec2f7a7c4b2709bcddff1dcf0dadc674939caf8",
            "sha256:331e29774e06a2cc1ce82ca53448bb311f0aab48ce19dbbc6a01b2d2372a6ffe",
            "sha256:ab601ad9ad6709daa012040c40e5b314385e76c1df5381a6aed9a3bab8659900",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:e9bb2ff0a7e7ad52f4e6866f04a5da6561a68e4ddd0a8bcab9d4af26e8054df3",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:8541b3d2227e9fdd5b9107d80617e37bb882faf2c50378e11fa5a0658097cc5d",
            "sha256:3460c943cf1834e73b9f16a73c400b50e2e2b8dc696129dc0fb615bdf413f64a",
            "sha256:0ab7811d682bc636d87666db43482b15560bc76bf36fe4b8ca51308690e03182"
        ]
    },
    "Metadata": {
        "LastTagTime": "2024-12-24T17:11:08.691010433+08:00"
    }
}
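This JSON matches the output format of `docker image inspect`, so the same metadata can be read on any host that has the image:

```shell
# Dump full metadata, or extract a single field such as the entrypoint.
docker image inspect docker.io/kubeflow/spark-operator:2.1.0
docker image inspect --format '{{json .Config.Entrypoint}}' docker.io/kubeflow/spark-operator:2.1.0
```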

More versions

docker.io/kubeflow/spark-operator:2.0.2                 linux/amd64 · docker.io · 1.05GB · synced 2024-11-28 17:32 · 56 views

docker.io/kubeflow/spark-operator:2.1.0                 linux/amd64 · docker.io · 1.05GB · synced 2024-12-24 17:11 · 52 views

docker.io/kubeflow/spark-operator:v1beta2-1.6.2-3.5.0   linux/amd64 · docker.io · 1.04GB · synced 2025-01-15 10:20 · 37 views