docker.io/spark:4.0.1-scala2.13-java21-python3-ubuntu linux/amd64

Apache Spark is a fast, general-purpose, distributed in-memory processing engine.

Spark Docker Image

This is the Docker image for Apache Spark, providing a complete Spark cluster environment.

  • Based on Ubuntu 22.04 with OpenJDK 21 (Eclipse Temurin, jdk-21.0.9+10)
  • Includes Scala 2.13 and Python 3 runtimes
  • Works with storage systems such as HDFS, HBase, and Cassandra
  • Can be deployed via Docker Compose or Kubernetes
Source image  docker.io/spark:4.0.1-scala2.13-java21-python3-ubuntu
CN mirror     swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:4.0.1-scala2.13-java21-python3-ubuntu
Image ID      sha256:03086b8c65cc1e8b1c5c262c2e5fcca8c404962cf7c9fae6ec457aedfe50b957
Image tag     4.0.1-scala2.13-java21-python3-ubuntu
Size          1.30GB
Registry      docker.io
CMD           (none)
Entrypoint    /opt/entrypoint.sh
Working dir   /opt/spark/work-dir
OS/Arch       linux/amd64
Contributor   is****i@163.com
Image created 2025-11-14T01:19:40.684407064Z
Synced        2025-11-19 18:03
Updated       2025-11-19 22:13
Environment variables

PATH=/opt/java/openjdk/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
JAVA_HOME=/opt/java/openjdk
LANG=en_US.UTF-8
LANGUAGE=en_US:en
LC_ALL=en_US.UTF-8
JAVA_VERSION=jdk-21.0.9+10
SPARK_TGZ_URL=https://www.apache.org/dyn/closer.lua/spark/spark-4.0.1/spark-4.0.1-bin-hadoop3.tgz?action=download
SPARK_TGZ_ASC_URL=https://www.apache.org/dyn/closer.lua/spark/spark-4.0.1/spark-4.0.1-bin-hadoop3.tgz.asc?action=download
GPG_KEY=F28C9C925C188C35E345614DEDA00CE834F0FC5C
SPARK_HOME=/opt/spark
Image labels

org.opencontainers.image.ref.name=ubuntu
org.opencontainers.image.version=22.04

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:4.0.1-scala2.13-java21-python3-ubuntu
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:4.0.1-scala2.13-java21-python3-ubuntu  docker.io/spark:4.0.1-scala2.13-java21-python3-ubuntu
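The mirror reference is just the upstream reference with the mirror registry prefix prepended. A small helper (hypothetical, not part of this site) makes the pattern explicit and derives the pull/tag commands for any docker.io image:

```shell
# Hypothetical helper: derive mirror pull/tag commands for a docker.io image.
# MIRROR_PREFIX is the CN mirror namespace used throughout this page.
MIRROR_PREFIX="swr.cn-north-4.myhuaweicloud.com/ddn-k8s"

mirror_cmds() {
    src="docker.io/$1"                # upstream reference, e.g. docker.io/spark:tag
    mirror="${MIRROR_PREFIX}/${src}"  # mirror reference = prefix + upstream
    echo "docker pull ${mirror}"
    echo "docker tag ${mirror} ${src}"
}

mirror_cmds "spark:4.0.1-scala2.13-java21-python3-ubuntu"
```

The output matches the two commands above, and the same function works for any other docker.io image name.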

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:4.0.1-scala2.13-java21-python3-ubuntu
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:4.0.1-scala2.13-java21-python3-ubuntu  docker.io/spark:4.0.1-scala2.13-java21-python3-ubuntu

Shell quick-replace command

sed -i 's#spark:4.0.1-scala2.13-java21-python3-ubuntu#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:4.0.1-scala2.13-java21-python3-ubuntu#' deployment.yaml
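Before touching real manifests, you can dry-run the same sed expression against a throwaway file (the path below is illustrative). Note two caveats: the `-i` flag as written is GNU sed (BSD/macOS sed needs `sed -i ''`), and the pattern also matches the tail of the mirror reference, so running the command twice double-prefixes the image name.

```shell
# Create a throwaway manifest that references the upstream image.
cat > /tmp/deployment-demo.yaml <<'EOF'
    containers:
      - name: spark
        image: spark:4.0.1-scala2.13-java21-python3-ubuntu
EOF

# Same substitution as above: rewrite the image to the CN mirror.
sed -i 's#spark:4.0.1-scala2.13-java21-python3-ubuntu#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:4.0.1-scala2.13-java21-python3-ubuntu#' /tmp/deployment-demo.yaml

# Inspect the result: the image line now points at the mirror.
grep 'image:' /tmp/deployment-demo.yaml
```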

Ansible bulk distribution (Docker)

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:4.0.1-scala2.13-java21-python3-ubuntu && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:4.0.1-scala2.13-java21-python3-ubuntu  docker.io/spark:4.0.1-scala2.13-java21-python3-ubuntu'

Ansible bulk distribution (Containerd)

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:4.0.1-scala2.13-java21-python3-ubuntu && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:4.0.1-scala2.13-java21-python3-ubuntu  docker.io/spark:4.0.1-scala2.13-java21-python3-ubuntu'
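If ansible is not available, the same fan-out can be sketched as a plain loop over a host inventory (the hosts file and ssh access are assumptions). Shown here in dry-run form that only prints the per-host command instead of executing it:

```shell
# Assumed inventory file: one hostname per line.
printf 'node1\nnode2\n' > /tmp/hosts-demo.txt

IMG_MIRROR="swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:4.0.1-scala2.13-java21-python3-ubuntu"
IMG_LOCAL="docker.io/spark:4.0.1-scala2.13-java21-python3-ubuntu"

# Dry run: echo the ssh command for each host rather than running it.
while read -r host; do
    echo "ssh ${host} 'docker pull ${IMG_MIRROR} && docker tag ${IMG_MIRROR} ${IMG_LOCAL}'"
done < /tmp/hosts-demo.txt
```

Dropping the `echo` (keeping the `ssh ... '...'` invocation) would perform the actual distribution.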

Image build history


# 2025-11-14 09:19:40  0.00B Set the user the container runs as
USER spark
                        
# 2025-11-14 09:19:40  318.41MB Run a command and create a new image layer
RUN /bin/sh -c set -ex;     apt-get update;     apt-get install -y python3 python3-pip;     rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2025-11-14 09:19:40  0.00B Set the user the container runs as
USER root
                        
# 2025-11-14 08:40:29  0.00B Configure the command run at container start
ENTRYPOINT ["/opt/entrypoint.sh"]
                        
# 2025-11-14 08:40:29  0.00B Set the user the container runs as
USER spark
                        
# 2025-11-14 08:40:29  0.00B Set the working directory to /opt/spark/work-dir
WORKDIR /opt/spark/work-dir
                        
# 2025-11-14 08:40:29  0.00B Set environment variable SPARK_HOME
ENV SPARK_HOME=/opt/spark
                        
# 2025-11-14 08:40:29  4.74KB Copy files or directories into the container
COPY entrypoint.sh /opt/ # buildkit
                        
# 2025-11-14 08:40:29  488.98MB Run a command and create a new image layer
RUN |1 spark_uid=185 /bin/sh -c set -ex;     export SPARK_TMP="$(mktemp -d)";     cd $SPARK_TMP;     wget -nv -O spark.tgz "$SPARK_TGZ_URL";     wget -nv -O spark.tgz.asc "$SPARK_TGZ_ASC_URL";     export GNUPGHOME="$(mktemp -d)";     gpg --batch --keyserver hkps://keys.openpgp.org --recv-key "$GPG_KEY" ||     gpg --batch --keyserver hkps://keyserver.ubuntu.com --recv-keys "$GPG_KEY";     gpg --batch --verify spark.tgz.asc spark.tgz;     gpgconf --kill all;     rm -rf "$GNUPGHOME" spark.tgz.asc;         tar -xf spark.tgz --strip-components=1;     chown -R spark:spark .;     mv jars /opt/spark/;     mv RELEASE /opt/spark/;     mv bin /opt/spark/;     mv sbin /opt/spark/;     mv kubernetes/dockerfiles/spark/decom.sh /opt/;     mv examples /opt/spark/;     ln -s "$(basename /opt/spark/examples/jars/spark-examples_*.jar)" /opt/spark/examples/jars/spark-examples.jar;     mv kubernetes/tests /opt/spark/;     mv data /opt/spark/;     mv python/pyspark /opt/spark/python/pyspark/;     mv python/lib /opt/spark/python/lib/;     mv R /opt/spark/;     chmod a+x /opt/decom.sh;     cd ..;     rm -rf "$SPARK_TMP"; # buildkit
                        
# 2025-11-14 08:39:57  0.00B Set environment variables SPARK_TGZ_URL, SPARK_TGZ_ASC_URL, GPG_KEY
ENV SPARK_TGZ_URL=https://www.apache.org/dyn/closer.lua/spark/spark-4.0.1/spark-4.0.1-bin-hadoop3.tgz?action=download SPARK_TGZ_ASC_URL=https://www.apache.org/dyn/closer.lua/spark/spark-4.0.1/spark-4.0.1-bin-hadoop3.tgz.asc?action=download GPG_KEY=F28C9C925C188C35E345614DEDA00CE834F0FC5C
                        
# 2025-11-14 08:39:57  53.36MB Run a command and create a new image layer
RUN |1 spark_uid=185 /bin/sh -c set -ex;     apt-get update;     apt-get install -y gnupg2 wget bash tini libc6 libpam-modules krb5-user libnss3 procps net-tools gosu libnss-wrapper;     mkdir -p /opt/spark;     mkdir /opt/spark/python;     mkdir -p /opt/spark/examples;     mkdir -p /opt/spark/work-dir;     chmod g+w /opt/spark/work-dir;     touch /opt/spark/RELEASE;     chown -R spark:spark /opt/spark;     echo "auth required pam_wheel.so use_uid" >> /etc/pam.d/su;     rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2025-11-14 08:39:46  64.83KB Run a command and create a new image layer
RUN |1 spark_uid=185 /bin/sh -c groupadd --system --gid=${spark_uid} spark &&     useradd --system --uid=${spark_uid} --gid=spark -d /nonexistent spark # buildkit
                        
# 2025-11-14 08:39:46  0.00B Define a build argument
ARG spark_uid=185
                        
# 2025-11-14 07:21:42  0.00B Set the default command
CMD ["jshell"]
                        
# 2025-11-14 07:21:42  0.00B Configure the command run at container start
ENTRYPOINT ["/__cacert_entrypoint.sh"]
                        
# 2025-11-14 07:21:42  5.31KB Copy files or directories into the container
COPY --chmod=755 entrypoint.sh /__cacert_entrypoint.sh # buildkit
                        
# 2025-11-14 07:21:42  0.00B Run a command and create a new image layer
RUN /bin/sh -c set -eux;     echo "Verifying install ...";     fileEncoding="$(echo 'System.out.println(System.getProperty("file.encoding"))' | jshell -s -)"; [ "$fileEncoding" = 'UTF-8' ]; rm -rf ~/.java;     echo "javac --version"; javac --version;     echo "java --version"; java --version;     echo "Complete." # buildkit
                        
# 2025-11-14 07:21:41  307.02MB Run a command and create a new image layer
RUN /bin/sh -c set -eux;     ARCH="$(dpkg --print-architecture)";     case "${ARCH}" in        amd64)          ESUM='810d3773df7e0d6c4394e4e244b264c8b30e0b05a0acf542d065fd78a6b65c2f';          BINARY_URL='https://github.com/adoptium/temurin21-binaries/releases/download/jdk-21.0.9%2B10/OpenJDK21U-jdk_x64_linux_hotspot_21.0.9_10.tar.gz';          ;;        arm64)          ESUM='edf0da4debe7cf475dbe320d174d6eed81479eb363f41e38a2efb740428c603a';          BINARY_URL='https://github.com/adoptium/temurin21-binaries/releases/download/jdk-21.0.9%2B10/OpenJDK21U-jdk_aarch64_linux_hotspot_21.0.9_10.tar.gz';          ;;        ppc64el)          ESUM='ac5a0394a234269b4e20459649ac93cb702cde29b3e46a0bcf3aa53958f2d4a4';          BINARY_URL='https://github.com/adoptium/temurin21-binaries/releases/download/jdk-21.0.9%2B10/OpenJDK21U-jdk_ppc64le_linux_hotspot_21.0.9_10.tar.gz';          ;;        s390x)          ESUM='e8ede0fb48aaa3a0cc1ac7c8522f8ca7938bdbb8be0d603b61134de7f898aff4';          BINARY_URL='https://github.com/adoptium/temurin21-binaries/releases/download/jdk-21.0.9%2B10/OpenJDK21U-jdk_s390x_linux_hotspot_21.0.9_10.tar.gz';          ;;        *)          echo "Unsupported arch: ${ARCH}";          exit 1;          ;;     esac;     wget --progress=dot:giga -O /tmp/openjdk.tar.gz ${BINARY_URL};     wget --progress=dot:giga -O /tmp/openjdk.tar.gz.sig ${BINARY_URL}.sig;     export GNUPGHOME="$(mktemp -d)";     gpg --batch --keyserver keyserver.ubuntu.com --recv-keys 3B04D753C9050D9A5D343F39843C48A565F8F04B;     gpg --batch --verify /tmp/openjdk.tar.gz.sig /tmp/openjdk.tar.gz;     rm -rf "${GNUPGHOME}" /tmp/openjdk.tar.gz.sig;     echo "${ESUM} */tmp/openjdk.tar.gz" | sha256sum -c -;     mkdir -p "$JAVA_HOME";     tar --extract         --file /tmp/openjdk.tar.gz         --directory "$JAVA_HOME"         --strip-components 1         --no-same-owner     ;     rm -f /tmp/openjdk.tar.gz ${JAVA_HOME}/lib/src.zip;     find "$JAVA_HOME/lib" -name '*.so' -exec dirname '{}' ';' | sort -u > /etc/ld.so.conf.d/docker-openjdk.conf;     ldconfig;     java -Xshare:dump; # buildkit
                        
# 2025-11-14 07:21:34  0.00B Set environment variable JAVA_VERSION
ENV JAVA_VERSION=jdk-21.0.9+10
                        
# 2025-11-14 07:21:34  56.81MB Run a command and create a new image layer
RUN /bin/sh -c set -eux;     apt-get update;     DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends         curl         wget         gnupg         fontconfig         ca-certificates p11-kit         binutils         tzdata         locales     ;     echo "en_US.UTF-8 UTF-8" >> /etc/locale.gen;     locale-gen en_US.UTF-8;     rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2025-11-14 07:21:34  0.00B Set environment variables LANG, LANGUAGE, LC_ALL
ENV LANG=en_US.UTF-8 LANGUAGE=en_US:en LC_ALL=en_US.UTF-8
                        
# 2025-11-14 07:21:34  0.00B Set environment variable PATH
ENV PATH=/opt/java/openjdk/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2025-11-14 07:21:34  0.00B Set environment variable JAVA_HOME
ENV JAVA_HOME=/opt/java/openjdk
                        
# 2025-10-14 01:23:20  0.00B 
/bin/sh -c #(nop)  CMD ["/bin/bash"]
                        
# 2025-10-14 01:23:20  77.87MB 
/bin/sh -c #(nop) ADD file:d025507456f1d7d19195885b1c02a346454d60c9348cbd3be92431f2d7e2666e in / 
                        
# 2025-10-14 01:23:18  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.version=22.04
                        
# 2025-10-14 01:23:18  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.ref.name=ubuntu
                        
# 2025-10-14 01:23:18  0.00B 
/bin/sh -c #(nop)  ARG LAUNCHPAD_BUILD_ARCH
                        
# 2025-10-14 01:23:18  0.00B 
/bin/sh -c #(nop)  ARG RELEASE
                        
                    

Image information

{
    "Id": "sha256:03086b8c65cc1e8b1c5c262c2e5fcca8c404962cf7c9fae6ec457aedfe50b957",
    "RepoTags": [
        "spark:4.0.1-scala2.13-java21-python3-ubuntu",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:4.0.1-scala2.13-java21-python3-ubuntu"
    ],
    "RepoDigests": [
        "spark@sha256:84738ac8824a2da1d74c225f4371b4e6109295d39ed75eee085d07286505bd55",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark@sha256:5cd152b425a654cc8233f6c002e589883039c39cd054b745b5c48a895a1d6a32"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2025-11-14T01:19:40.684407064Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "spark",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/opt/java/openjdk/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "JAVA_HOME=/opt/java/openjdk",
            "LANG=en_US.UTF-8",
            "LANGUAGE=en_US:en",
            "LC_ALL=en_US.UTF-8",
            "JAVA_VERSION=jdk-21.0.9+10",
            "SPARK_TGZ_URL=https://www.apache.org/dyn/closer.lua/spark/spark-4.0.1/spark-4.0.1-bin-hadoop3.tgz?action=download",
            "SPARK_TGZ_ASC_URL=https://www.apache.org/dyn/closer.lua/spark/spark-4.0.1/spark-4.0.1-bin-hadoop3.tgz.asc?action=download",
            "GPG_KEY=F28C9C925C188C35E345614DEDA00CE834F0FC5C",
            "SPARK_HOME=/opt/spark"
        ],
        "Cmd": null,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "/opt/spark/work-dir",
        "Entrypoint": [
            "/opt/entrypoint.sh"
        ],
        "OnBuild": null,
        "Labels": {
            "org.opencontainers.image.ref.name": "ubuntu",
            "org.opencontainers.image.version": "22.04"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 1302525882,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/bdc1c9dbb37ee7c1505d5fa74169ed4f43c00d1d9e9ed69ea60632ac8d938713/diff:/var/lib/docker/overlay2/43a54fd9b70768c5771029aee9e08f351dbc0063e4dbfe3ff80176baf8e22ca5/diff:/var/lib/docker/overlay2/a8ccf4cee1c5f2641fb5bade284125f4d29e1d6c2767500c0204df8579f4ce47/diff:/var/lib/docker/overlay2/1460b77099bd4cd90ec9114480f90a307b080b93a4300cce579a4e679424c248/diff:/var/lib/docker/overlay2/b8d773090326653f7e924e8af3bad0f64ee6fd4960a6870ba8ef7b37093f4932/diff:/var/lib/docker/overlay2/99f7733142a1c32f4bfe809652d025026a0ecad4d6d5b2b374fe482053fc991f/diff:/var/lib/docker/overlay2/06d5cc29f21a50412ddc03962c0545f480b6dcd82a3a1cd49e04668aa9b5d181/diff:/var/lib/docker/overlay2/70a290ecb90e8304f24b7b2dff742047b98ec29f0e9405720f2a745f0f0db8cd/diff:/var/lib/docker/overlay2/f3e2cd18e8a550e7319c268f08f63da2c52ea89ca4a0050b4250d5400877db69/diff:/var/lib/docker/overlay2/6f6ec8e5321ca8688879ac4e8387377602db46a15370198137f7e7fb60a45a73/diff",
            "MergedDir": "/var/lib/docker/overlay2/76864bd457b079dcce729a3446763dd5f33cfec5b7da3669e4f6e584ba158edd/merged",
            "UpperDir": "/var/lib/docker/overlay2/76864bd457b079dcce729a3446763dd5f33cfec5b7da3669e4f6e584ba158edd/diff",
            "WorkDir": "/var/lib/docker/overlay2/76864bd457b079dcce729a3446763dd5f33cfec5b7da3669e4f6e584ba158edd/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:73974f74b436f39a2fdb6461b1e3f7c3e41c73325776fa71d16b942a5b4a365b",
            "sha256:831c5368931554699471d1ed9d336c36d9ec370677e41457f910de8e8338d2ee",
            "sha256:e024fae40503f036a90dbbbdf96e0f665081a491c20f1e5ef3b008704bff3a4d",
            "sha256:d2137e6133c89825ada7ce343e29fa969361b96d8ef0bc0ceaedd509e4853378",
            "sha256:db168814201228770d952e6d185aa66f83b34c67de467a8b3c7fdea9d8f7fc36",
            "sha256:c996f8f47ec2f4530a0f642a3f5e5b23b2f1d2437585ae0d28bfe2b6c811b169",
            "sha256:63a68cbdac573819b887f44c80ac43a3190872beeaa11eecf2d8852a57878cee",
            "sha256:44fc57e60c08f3dcf00fde685c7bf442ed73593cf89d46d44bd73aa1d0da7d43",
            "sha256:847e2e5916f7f7b1129dbf23b209e1131b19ed2260004655bc8177c0bdb0d1d0",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:8cd87d42b6a7e4540fbff522c85c0d7e8c4efd54a8aca3e373ec2b9658e56944"
        ]
    },
    "Metadata": {
        "LastTagTime": "2025-11-19T18:01:33.370620315+08:00"
    }
}

More versions

docker.io/spark:latest
  linux/amd64 | docker.io | 984.32MB | synced 2024-09-25 17:19 | 638 views

docker.io/spark:3.5.3
  linux/amd64 | docker.io | 982.46MB | synced 2024-10-29 01:25 | 408 views

docker.io/sparkison/m3u-editor:latest
  linux/amd64 | docker.io | 989.06MB | synced 2025-08-15 20:40 | 227 views

docker.io/spark:3.5.7
  linux/amd64 | docker.io | 1.15GB | synced 2025-10-10 10:53 | 126 views

docker.io/spark:4.0.1-scala2.13-java21-python3-ubuntu
  linux/amd64 | docker.io | 1.30GB | synced 2025-11-19 18:03 | 10 views