
ghcr.io/kubeflow/spark-operator/controller:2.5.0 linux/amd64

ghcr.io/kubeflow/spark-operator/controller:2.5.0 - China mirror download source

This image is the Spark Operator controller component from the Kubeflow project. It manages and runs Apache Spark applications on Kubernetes clusters, providing lifecycle management, resource scheduling, and monitoring for Spark jobs, so that users can deploy and operate Spark workloads efficiently in a K8s environment.

Source image ghcr.io/kubeflow/spark-operator/controller:2.5.0
Mirror image swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/kubeflow/spark-operator/controller:2.5.0
Image ID sha256:ebe165a0262eaf87e403bacebdf1f82c1bc9667b16ddc7fe902062fd9505a995
Image tag 2.5.0
Size 1.36GB
Registry ghcr.io
CMD
Entrypoint /usr/bin/entrypoint.sh
Working directory /opt/spark/work-dir
OS/Platform linux/amd64
Image created 2026-03-19T00:30:24.670807041Z
Synced at 2026-04-27 16:39
Environment variables
PATH=/opt/java/openjdk/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
JAVA_HOME=/opt/java/openjdk
LANG=en_US.UTF-8
LANGUAGE=en_US:en
LC_ALL=en_US.UTF-8
JAVA_VERSION=jdk-17.0.18+8
SPARK_TGZ_URL=https://www.apache.org/dyn/closer.lua/spark/spark-4.0.1/spark-4.0.1-bin-hadoop3.tgz?action=download
SPARK_TGZ_ASC_URL=https://www.apache.org/dyn/closer.lua/spark/spark-4.0.1/spark-4.0.1-bin-hadoop3.tgz.asc?action=download
GPG_KEY=F28C9C925C188C35E345614DEDA00CE834F0FC5C
SPARK_HOME=/opt/spark
Image labels
org.opencontainers.image.created: 2026-03-19T00:28:42.339Z
org.opencontainers.image.description: Kubernetes operator for managing the lifecycle of Apache Spark applications on Kubernetes.
org.opencontainers.image.licenses: Apache-2.0
org.opencontainers.image.ref.name: ubuntu
org.opencontainers.image.revision: b3cca21e7889986e921de211401db6fea2f650c7
org.opencontainers.image.source: https://github.com/kubeflow/spark-operator
org.opencontainers.image.title: spark-operator
org.opencontainers.image.url: https://github.com/kubeflow/spark-operator
org.opencontainers.image.version: 2.5.0

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/kubeflow/spark-operator/controller:2.5.0
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/kubeflow/spark-operator/controller:2.5.0  ghcr.io/kubeflow/spark-operator/controller:2.5.0
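The mirror reference is simply the upstream reference prefixed with this site's SWR repository path, so the pull/tag pair can be derived mechanically for any image. A minimal sketch (`mirror_ref` is a hypothetical helper, not part of any tool):

```shell
# Derive the mirror reference by prefixing the upstream reference with the
# SWR repository path used by this mirror site (mirror_ref is hypothetical).
mirror_ref() {
    echo "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/$1"
}

SRC="ghcr.io/kubeflow/spark-operator/controller:2.5.0"
mirror_ref "$SRC"
# → swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/kubeflow/spark-operator/controller:2.5.0
```

Tagging the pulled image back to `$SRC` lets existing manifests reference the upstream name unchanged.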

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/kubeflow/spark-operator/controller:2.5.0
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/kubeflow/spark-operator/controller:2.5.0  ghcr.io/kubeflow/spark-operator/controller:2.5.0
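Note that `ctr` operates on containerd's `default` namespace unless told otherwise, while kubelet resolves images from the `k8s.io` namespace; on a Kubernetes node you likely want to add `-n k8s.io` to both commands. A sketch that only builds and prints the adjusted commands (nothing is executed against containerd here):

```shell
# Build the namespaced ctr commands as strings; kubelet reads images from
# containerd's k8s.io namespace, not from the default namespace ctr uses.
MIRROR="swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/kubeflow/spark-operator/controller:2.5.0"
UPSTREAM="ghcr.io/kubeflow/spark-operator/controller:2.5.0"
PULL_CMD="ctr -n k8s.io images pull $MIRROR"
TAG_CMD="ctr -n k8s.io images tag $MIRROR $UPSTREAM"
echo "$PULL_CMD"
echo "$TAG_CMD"
```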

Quick shell replacement command

sed -i 's#ghcr.io/kubeflow/spark-operator/controller:2.5.0#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/kubeflow/spark-operator/controller:2.5.0#' deployment.yaml
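The same substitution can be applied across a whole directory of manifests with `find`. A sketch under made-up paths (the temp directory and `deployment.yaml` below exist only for the demo); `-i.bak` keeps a `.bak` backup of each rewritten file:

```shell
# Replace the upstream image reference with the mirror reference in every
# .yaml file under a directory, keeping .bak backups of the originals.
SRC='ghcr.io/kubeflow/spark-operator/controller:2.5.0'
DST="swr.cn-north-4.myhuaweicloud.com/ddn-k8s/$SRC"
WORK="$(mktemp -d)"
printf 'image: %s\n' "$SRC" > "$WORK/deployment.yaml"
find "$WORK" -name '*.yaml' -exec sed -i.bak "s#$SRC#$DST#g" {} +
cat "$WORK/deployment.yaml"
# → image: swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/kubeflow/spark-operator/controller:2.5.0
```

Using `#` as the sed delimiter avoids escaping the slashes inside the image references.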

Quick Ansible rollout - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/kubeflow/spark-operator/controller:2.5.0 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/kubeflow/spark-operator/controller:2.5.0  ghcr.io/kubeflow/spark-operator/controller:2.5.0'

Quick Ansible rollout - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/kubeflow/spark-operator/controller:2.5.0 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/kubeflow/spark-operator/controller:2.5.0  ghcr.io/kubeflow/spark-operator/controller:2.5.0'

Image build history


# 2026-03-19 08:30:24  0.00B Configure the command run when the container starts
ENTRYPOINT ["/usr/bin/entrypoint.sh"]
                        
# 2026-03-19 08:30:24  1.04KB Copy new files or directories into the container
COPY entrypoint.sh /usr/bin/ # buildkit
                        
# 2026-03-19 08:30:24  78.01MB Copy new files or directories into the container
COPY /workspace/bin/spark-operator /usr/bin/spark-operator # buildkit
                        
# 2026-03-19 08:29:18  0.00B Specify the user for running the container
USER 185:185
                        
# 2026-03-19 08:29:18  0.00B Run a command and create a new image layer
RUN |2 SPARK_UID=185 SPARK_GID=185 /bin/sh -c mkdir -p /etc/k8s-webhook-server/serving-certs /home/spark &&     chmod -R g+rw /etc/k8s-webhook-server/serving-certs &&     chown -R spark /etc/k8s-webhook-server/serving-certs /home/spark # buildkit
                        
# 2026-03-19 08:29:17  0.00B Run a command and create a new image layer
RUN |2 SPARK_UID=185 SPARK_GID=185 /bin/sh -c apt-get update     && apt-get install -y tini     && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2026-03-19 08:29:17  0.00B Specify the user for running the container
USER root
                        
# 2026-03-19 08:29:17  0.00B Define a build argument
ARG SPARK_GID=185
                        
# 2026-03-19 08:29:17  0.00B Define a build argument
ARG SPARK_UID=185
                        
# 2026-03-17 11:25:09  0.00B Specify the user for running the container
USER spark
                        
# 2026-03-17 11:25:09  321.83MB Run a command and create a new image layer
RUN /bin/sh -c set -ex;     apt-get update;     apt-get install -y python3 python3-pip;     rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2026-03-17 11:25:09  0.00B Specify the user for running the container
USER root
                        
# 2026-03-17 10:50:11  0.00B Configure the command run when the container starts
ENTRYPOINT ["/opt/entrypoint.sh"]
                        
# 2026-03-17 10:50:11  0.00B Specify the user for running the container
USER spark
                        
# 2026-03-17 10:50:11  0.00B Set the working directory to /opt/spark/work-dir
WORKDIR /opt/spark/work-dir
                        
# 2026-03-17 10:50:11  0.00B Set environment variable SPARK_HOME
ENV SPARK_HOME=/opt/spark
                        
# 2026-03-17 10:50:11  4.74KB Copy new files or directories into the container
COPY entrypoint.sh /opt/ # buildkit
                        
# 2026-03-17 10:50:11  488.98MB Run a command and create a new image layer
RUN |1 spark_uid=185 /bin/sh -c set -ex;     export SPARK_TMP="$(mktemp -d)";     cd $SPARK_TMP;     wget -nv -O spark.tgz "$SPARK_TGZ_URL";     wget -nv -O spark.tgz.asc "$SPARK_TGZ_ASC_URL";     export GNUPGHOME="$(mktemp -d)";     gpg --batch --keyserver hkps://keys.openpgp.org --recv-key "$GPG_KEY" ||     gpg --batch --keyserver hkps://keyserver.ubuntu.com --recv-keys "$GPG_KEY";     gpg --batch --verify spark.tgz.asc spark.tgz;     gpgconf --kill all;     rm -rf "$GNUPGHOME" spark.tgz.asc;         tar -xf spark.tgz --strip-components=1;     chown -R spark:spark .;     mv jars /opt/spark/;     mv RELEASE /opt/spark/;     mv bin /opt/spark/;     mv sbin /opt/spark/;     mv kubernetes/dockerfiles/spark/decom.sh /opt/;     mv examples /opt/spark/;     ln -s "$(basename /opt/spark/examples/jars/spark-examples_*.jar)" /opt/spark/examples/jars/spark-examples.jar;     mv kubernetes/tests /opt/spark/;     mv data /opt/spark/;     mv python/pyspark /opt/spark/python/pyspark/;     mv python/lib /opt/spark/python/lib/;     mv R /opt/spark/;     chmod a+x /opt/decom.sh;     cd ..;     rm -rf "$SPARK_TMP"; # buildkit
                        
# 2026-03-17 10:48:50  0.00B Set environment variables SPARK_TGZ_URL SPARK_TGZ_ASC_URL GPG_KEY
ENV SPARK_TGZ_URL=https://www.apache.org/dyn/closer.lua/spark/spark-4.0.1/spark-4.0.1-bin-hadoop3.tgz?action=download SPARK_TGZ_ASC_URL=https://www.apache.org/dyn/closer.lua/spark/spark-4.0.1/spark-4.0.1-bin-hadoop3.tgz.asc?action=download GPG_KEY=F28C9C925C188C35E345614DEDA00CE834F0FC5C
                        
# 2026-03-17 10:48:50  53.37MB Run a command and create a new image layer
RUN |1 spark_uid=185 /bin/sh -c set -ex;     apt-get update;     apt-get install -y gnupg2 wget bash tini libc6 libpam-modules krb5-user libnss3 procps net-tools gosu libnss-wrapper;     mkdir -p /opt/spark;     mkdir /opt/spark/python;     mkdir -p /opt/spark/examples;     mkdir -p /opt/spark/work-dir;     chmod g+w /opt/spark/work-dir;     touch /opt/spark/RELEASE;     chown -R spark:spark /opt/spark;     echo "auth required pam_wheel.so use_uid" >> /etc/pam.d/su;     rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2026-03-17 10:48:39  64.83KB Run a command and create a new image layer
RUN |1 spark_uid=185 /bin/sh -c groupadd --system --gid=${spark_uid} spark &&     useradd --system --uid=${spark_uid} --gid=spark -d /nonexistent spark # buildkit
                        
# 2026-03-17 10:48:39  0.00B Define a build argument
ARG spark_uid=185
                        
# 2026-03-17 09:22:59  0.00B Set the default command to run
CMD ["jshell"]
                        
# 2026-03-17 09:22:59  0.00B Configure the command run when the container starts
ENTRYPOINT ["/__cacert_entrypoint.sh"]
                        
# 2026-03-17 09:22:59  5.31KB Copy new files or directories into the container
COPY --chmod=755 entrypoint.sh /__cacert_entrypoint.sh # buildkit
                        
# 2026-03-17 09:22:59  0.00B Run a command and create a new image layer
RUN /bin/sh -c set -eux;     echo "Verifying install ...";     fileEncoding="$(echo 'System.out.println(System.getProperty("file.encoding"))' | jshell -s -)"; [ "$fileEncoding" = 'UTF-8' ]; rm -rf ~/.java;     echo "javac --version"; javac --version;     echo "java --version"; java --version;     echo "Complete." # buildkit
                        
# 2026-03-17 09:22:58  280.28MB Run a command and create a new image layer
RUN /bin/sh -c set -eux;     ARCH="$(dpkg --print-architecture)";     case "${ARCH}" in        amd64)          ESUM='0c94cbb54325c40dcf026143eb621562017db5525727f2d9131a11250f72c450';          BINARY_URL='https://github.com/adoptium/temurin17-binaries/releases/download/jdk-17.0.18%2B8/OpenJDK17U-jdk_x64_linux_hotspot_17.0.18_8.tar.gz';          ;;        arm64)          ESUM='592a6702b3a07a0e0b82cb38aaab149bfce1b0c24d6b57ddb410bd9009333095';          BINARY_URL='https://github.com/adoptium/temurin17-binaries/releases/download/jdk-17.0.18%2B8/OpenJDK17U-jdk_aarch64_linux_hotspot_17.0.18_8.tar.gz';          ;;        armhf)          ESUM='21050b8325b62cb3fca4f871aadbddc04c67e21f3ab57236439aa951cbcb17ae';          BINARY_URL='https://github.com/adoptium/temurin17-binaries/releases/download/jdk-17.0.18%2B8/OpenJDK17U-jdk_arm_linux_hotspot_17.0.18_8.tar.gz';          ;;        ppc64el)          ESUM='5ab89fbde560e1a09386f389dd7881715b896f49c6e9aa974f72d551337dba5e';          BINARY_URL='https://github.com/adoptium/temurin17-binaries/releases/download/jdk-17.0.18%2B8/OpenJDK17U-jdk_ppc64le_linux_hotspot_17.0.18_8.tar.gz';          ;;        s390x)          ESUM='3693469655bcfa2fa5e70907245a2b3bc4236db7d9fa1b9feb0ab7abd235da09';          BINARY_URL='https://github.com/adoptium/temurin17-binaries/releases/download/jdk-17.0.18%2B8/OpenJDK17U-jdk_s390x_linux_hotspot_17.0.18_8.tar.gz';          ;;        *)          echo "Unsupported arch: ${ARCH}";          exit 1;          ;;     esac;     wget --progress=dot:giga -O /tmp/openjdk.tar.gz ${BINARY_URL};     wget --progress=dot:giga -O /tmp/openjdk.tar.gz.sig ${BINARY_URL}.sig;     export GNUPGHOME="$(mktemp -d)";     gpg --batch --keyserver keyserver.ubuntu.com --recv-keys 3B04D753C9050D9A5D343F39843C48A565F8F04B;     gpg --batch --verify /tmp/openjdk.tar.gz.sig /tmp/openjdk.tar.gz;     rm -rf "${GNUPGHOME}" /tmp/openjdk.tar.gz.sig;     echo "${ESUM} */tmp/openjdk.tar.gz" | sha256sum -c -;     mkdir -p "$JAVA_HOME";     tar --extract         --file /tmp/openjdk.tar.gz         --directory "$JAVA_HOME"         --strip-components 1         --no-same-owner     ;     rm -f /tmp/openjdk.tar.gz ${JAVA_HOME}/lib/src.zip;     find "$JAVA_HOME/lib" -name '*.so' -exec dirname '{}' ';' | sort -u > /etc/ld.so.conf.d/docker-openjdk.conf;     ldconfig;     java -Xshare:dump; # buildkit
                        
# 2026-03-17 09:22:51  0.00B Set environment variable JAVA_VERSION
ENV JAVA_VERSION=jdk-17.0.18+8
                        
# 2026-03-17 09:22:51  56.81MB Run a command and create a new image layer
RUN /bin/sh -c set -eux;     apt-get update;     DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends         curl         wget         gnupg         fontconfig         ca-certificates p11-kit         binutils         tzdata         locales     ;     echo "en_US.UTF-8 UTF-8" >> /etc/locale.gen;     locale-gen en_US.UTF-8;     rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2026-03-17 09:22:51  0.00B Set environment variables LANG LANGUAGE LC_ALL
ENV LANG=en_US.UTF-8 LANGUAGE=en_US:en LC_ALL=en_US.UTF-8
                        
# 2026-03-17 09:22:51  0.00B Set environment variable PATH
ENV PATH=/opt/java/openjdk/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2026-03-17 09:22:51  0.00B Set environment variable JAVA_HOME
ENV JAVA_HOME=/opt/java/openjdk
                        
# 2026-02-24 15:30:08  0.00B 
/bin/sh -c #(nop)  CMD ["/bin/bash"]
                        
# 2026-02-24 15:30:08  77.88MB 
/bin/sh -c #(nop) ADD file:87202021c36509f80e5414aa2307ce867cd2e3b5f0d0f3bd0c98749793bd1fb4 in / 
                        
# 2026-02-24 15:30:06  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.version=22.04
                        
# 2026-02-24 15:30:06  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.ref.name=ubuntu
                        
# 2026-02-24 15:30:06  0.00B 
/bin/sh -c #(nop)  ARG LAUNCHPAD_BUILD_ARCH
                        
# 2026-02-24 15:30:06  0.00B 
/bin/sh -c #(nop)  ARG RELEASE
                        
                    

Image information

{
    "Id": "sha256:ebe165a0262eaf87e403bacebdf1f82c1bc9667b16ddc7fe902062fd9505a995",
    "RepoTags": [
        "ghcr.io/kubeflow/spark-operator/controller:2.5.0",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/kubeflow/spark-operator/controller:2.5.0"
    ],
    "RepoDigests": [
        "ghcr.io/kubeflow/spark-operator/controller@sha256:82b6f8df1986b865bd33272ba76605b7beaa012d430e3d32b1b2aac420f03137",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/kubeflow/spark-operator/controller@sha256:bf10fb90ca4c62f6df55585edd910c05a3973cf7ce38e2bf5adaefd776049294"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2026-03-19T00:30:24.670807041Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "185:185",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/opt/java/openjdk/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "JAVA_HOME=/opt/java/openjdk",
            "LANG=en_US.UTF-8",
            "LANGUAGE=en_US:en",
            "LC_ALL=en_US.UTF-8",
            "JAVA_VERSION=jdk-17.0.18+8",
            "SPARK_TGZ_URL=https://www.apache.org/dyn/closer.lua/spark/spark-4.0.1/spark-4.0.1-bin-hadoop3.tgz?action=download",
            "SPARK_TGZ_ASC_URL=https://www.apache.org/dyn/closer.lua/spark/spark-4.0.1/spark-4.0.1-bin-hadoop3.tgz.asc?action=download",
            "GPG_KEY=F28C9C925C188C35E345614DEDA00CE834F0FC5C",
            "SPARK_HOME=/opt/spark"
        ],
        "Cmd": null,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "/opt/spark/work-dir",
        "Entrypoint": [
            "/usr/bin/entrypoint.sh"
        ],
        "OnBuild": null,
        "Labels": {
            "org.opencontainers.image.created": "2026-03-19T00:28:42.339Z",
            "org.opencontainers.image.description": "Kubernetes operator for managing the lifecycle of Apache Spark applications on Kubernetes.",
            "org.opencontainers.image.licenses": "Apache-2.0",
            "org.opencontainers.image.ref.name": "ubuntu",
            "org.opencontainers.image.revision": "b3cca21e7889986e921de211401db6fea2f650c7",
            "org.opencontainers.image.source": "https://github.com/kubeflow/spark-operator",
            "org.opencontainers.image.title": "spark-operator",
            "org.opencontainers.image.url": "https://github.com/kubeflow/spark-operator",
            "org.opencontainers.image.version": "2.5.0"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 1357211954,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/de1a49bb1026bb1c424623eb4cf9561e2ae2d0faef20ccafb1a068ea6ee4dbae/diff:/var/lib/docker/overlay2/be548829fd1cebb173da604b9a8142fad9a7d5a5bb9752e5c6895344c18c1b76/diff:/var/lib/docker/overlay2/034d33b4bd28d3449fc2ab94f4ac42588403dcff857329fd01ce9635b465fbe6/diff:/var/lib/docker/overlay2/cec14a8c5735dec42636bc339508bdca22f5bcc935625caaecac283754d56919/diff:/var/lib/docker/overlay2/ba1f46548e668384a1b8982c12f4a87b9407891755443ccf04efa3988234489f/diff:/var/lib/docker/overlay2/a0c9eb517d7571cd3b6247e08c7013a052e15c717bf655b419847f8bac0e285a/diff:/var/lib/docker/overlay2/533211c73f2857a5920ebf3c19188dac3d01a632397be5dccc87c471007e3ab0/diff:/var/lib/docker/overlay2/92d8694f53551d3aedab67703d2ddfb005a6d6e997504beb44941f67c1c8b013/diff:/var/lib/docker/overlay2/1ea80a07e6c96569726828199b584cd37f6a21f93d7792105232e490e169a297/diff:/var/lib/docker/overlay2/71fb652924407d76f1d5267669f4cefae96f6a31a098d34b4312b2a0de1d0315/diff:/var/lib/docker/overlay2/34ef6f0118a4efae1d6761e2b04669d58f6c9864b6be9c415a301eb536ba45ba/diff:/var/lib/docker/overlay2/8f0d74a899fa44fd09eb7b3f495fce8486446b2cbfd7354497a6f381fff09019/diff:/var/lib/docker/overlay2/b7c451bca27ce0f03265eb53cc1fccf1804feb8d6c69c0478487054795a85438/diff:/var/lib/docker/overlay2/d846fcd7e5404b055d5b5fbf3176781b22fde37043daf5917dfcb548dbc0ef31/diff",
            "MergedDir": "/var/lib/docker/overlay2/6f2471f63977c41e03ab7336bbbc5e2f6198a2509efea863a0fe46d6d8422ac0/merged",
            "UpperDir": "/var/lib/docker/overlay2/6f2471f63977c41e03ab7336bbbc5e2f6198a2509efea863a0fe46d6d8422ac0/diff",
            "WorkDir": "/var/lib/docker/overlay2/6f2471f63977c41e03ab7336bbbc5e2f6198a2509efea863a0fe46d6d8422ac0/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:fd7b85c1b84e3dcd6e601489a9676aeb7126f78d71babf9637fab93280660536",
            "sha256:c730154d906e365ecf5ad3177429fe997481261a79c848b87fd0686c9cb5e6f2",
            "sha256:aeaad1f66245130458d35cbb9c01d576ed42aaf9e5483da82037c6f91e41a0fa",
            "sha256:44811e11756a1511a5074c2a986202fcc39098bcb85b18051fafe404796dc912",
            "sha256:70ef899564ac16a6b1c315a336c02eaa6bf2762ac58673ddb0cab4387179ae48",
            "sha256:cb066cb54ea3b4459b762f7b447facf69c06bb120047da2e931500db5a1e2217",
            "sha256:eb7b2f0ff5f2a6ef9eeed28bbe3adfee0fb64029f775b1bedf5754baf082f421",
            "sha256:828ffbf1ebd71ca6f6990f95b4b45c445cad06d498787fc4c05fc02f4bc270fa",
            "sha256:a37ac52c7fdb9212969be97120613eebea50d3c8c5fd3bc2d59ba29a99a49af7",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:947ac208aa80ef925cad843ec2853c4eb308de5e5a64b9ddd106575ba84775c8",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:c98ac252dca36b841c1c91b064f8a34a959881d3d07e12381c5aa5cbacf60a9b",
            "sha256:66959451ca0929b5f019b65b5d3469eb0a5f7975c9a9c7089d9d35847574a426",
            "sha256:69a3fe2179ee435d0ffaa2fab5cac08c362dd111defb8059cfdc9f99e1f91dee"
        ]
    },
    "Metadata": {
        "LastTagTime": "2026-04-27T16:37:30.023209637+08:00"
    }
}

More versions

ghcr.io/kubeflow/spark-operator/controller:2.5.0

linux/amd64  ghcr.io  1.36GB  2026-04-27 16:39