docker.io/spark:3.5.7 linux/amd64

docker.io/spark:3.5.7 - China-mainland download mirror
Apache Spark is a fast, general-purpose, distributed in-memory processing engine.

Spark Docker Image

This is the Docker image for Apache Spark, providing a complete Spark cluster environment.

  • Based on Ubuntu 22.04 with OpenJDK 11 (Eclipse Temurin jdk-11.0.28+6)
  • Includes the Scala 2.12 and Python 3.x runtimes
  • Works with storage systems such as HDFS, HBase and Cassandra
  • Can be deployed with Docker Compose or Kubernetes (a quick docker run sketch follows this list)
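
As a quick way to stand up a standalone cluster straight from this image, the following is a minimal sketch only: the network name, container names and published ports are illustrative, and the mirrored tag from the pull commands further down is used.

docker network create spark-net
docker run -d --name spark-master --network spark-net -p 8080:8080 -p 7077:7077 swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:3.5.7 /opt/spark/bin/spark-class org.apache.spark.deploy.master.Master
docker run -d --name spark-worker --network spark-net swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:3.5.7 /opt/spark/bin/spark-class org.apache.spark.deploy.worker.Worker spark://spark-master:7077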
Source image docker.io/spark:3.5.7
CN mirror swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:3.5.7
Image ID sha256:e1798f073e8fcc27e62ef7c753f8729f35860a73c0f9f811444dbe2c29921942
Image tag 3.5.7
Size 1.15GB
Registry docker.io
Project info Docker Hub page / project tags
CMD (not set)
Entrypoint /opt/entrypoint.sh
Working directory /opt/spark/work-dir
OS/Platform linux/amd64
Image created 2025-10-07T16:34:28Z
Synced 2025-10-10 10:53
Updated 2025-10-10 16:39
Environment variables
PATH=/opt/java/openjdk/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin JAVA_HOME=/opt/java/openjdk LANG=en_US.UTF-8 LANGUAGE=en_US:en LC_ALL=en_US.UTF-8 JAVA_VERSION=jdk-11.0.28+6 SPARK_TGZ_URL=https://www.apache.org/dyn/closer.lua/spark/spark-3.5.7/spark-3.5.7-bin-hadoop3.tgz?action=download SPARK_TGZ_ASC_URL=https://www.apache.org/dyn/closer.lua/spark/spark-3.5.7/spark-3.5.7-bin-hadoop3.tgz.asc?action=download GPG_KEY=564CA14951C29266889F9C5B90E2BA86F7A9B307 SPARK_HOME=/opt/spark
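
The PATH, JAVA_HOME and SPARK_HOME settings above make a quick sanity check easy; for example, the following should report Spark 3.5.7 on the bundled Temurin JDK 11 (the entrypoint passes arbitrary commands straight through):

docker run --rm swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:3.5.7 /opt/spark/bin/spark-submit --version
docker run --rm swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:3.5.7 java -version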
Image labels
org.opencontainers.image.ref.name: ubuntu
org.opencontainers.image.version: 22.04

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:3.5.7
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:3.5.7  docker.io/spark:3.5.7
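
Optionally, confirm that the re-tagged copy matches the image ID listed above:

docker image inspect --format '{{.Id}}' docker.io/spark:3.5.7
# should print sha256:e1798f073e8fcc27e62ef7c753f8729f35860a73c0f9f811444dbe2c29921942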

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:3.5.7
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:3.5.7  docker.io/spark:3.5.7
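
On Kubernetes nodes, note that the kubelet reads images from containerd's k8s.io namespace, while plain ctr uses the default namespace. If the image is meant for pods, the same commands namespaced with -n k8s.io are usually what you want:

ctr -n k8s.io images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:3.5.7
ctr -n k8s.io images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:3.5.7  docker.io/spark:3.5.7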

Shell quick-replace command

sed -i 's#spark:3.5.7#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:3.5.7#' deployment.yaml
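
The pattern above rewrites every occurrence of spark:3.5.7 and is not idempotent: running it twice prefixes the name twice. If your manifests reference the bare image name, a more targeted variant across a directory could look like this (a sketch; the exact "image: spark:3.5.7" spelling is an assumption about your YAML):

grep -rl 'image: spark:3.5.7' . | xargs sed -i 's#image: spark:3.5.7#image: swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:3.5.7#'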

Ansible quick distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:3.5.7 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:3.5.7  docker.io/spark:3.5.7'

Ansible quick distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:3.5.7 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:3.5.7  docker.io/spark:3.5.7'
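
To verify the image actually landed on every node after either distribution step (the k8s host group is the one used above; grep exits non-zero, so Ansible will flag any node missing the image):

ansible k8s -m shell -a 'docker images | grep "spark.*3\.5\.7"'
ansible k8s -m shell -a 'ctr images ls -q | grep "spark:3.5.7"'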

Image build history


# 2025-10-08 00:34:28  0.00B Set the user for running the container
USER spark
                        
# 2025-10-08 00:34:28  332.30MB Run a command and create a new image layer
RUN /bin/sh -c set -ex;     apt-get update;     apt-get install -y python3 python3-pip;     rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2025-10-08 00:34:28  0.00B Set the user for running the container
USER root
                        
# 2025-10-08 00:34:28  0.00B Configure the command run when the container starts
ENTRYPOINT ["/opt/entrypoint.sh"]
                        
# 2025-10-08 00:34:28  0.00B Set the user for running the container
USER spark
                        
# 2025-10-08 00:34:28  0.00B Set the working directory to /opt/spark/work-dir
WORKDIR /opt/spark/work-dir
                        
# 2025-10-08 00:34:28  0.00B Set environment variable SPARK_HOME
ENV SPARK_HOME=/opt/spark
                        
# 2025-10-08 00:34:28  4.74KB Copy new files or directories into the container
COPY entrypoint.sh /opt/ # buildkit
                        
# 2025-10-08 00:34:28  361.82MB Run a command and create a new image layer
RUN |1 spark_uid=185 /bin/sh -c set -ex;     export SPARK_TMP="$(mktemp -d)";     cd $SPARK_TMP;     wget -nv -O spark.tgz "$SPARK_TGZ_URL";     wget -nv -O spark.tgz.asc "$SPARK_TGZ_ASC_URL";     export GNUPGHOME="$(mktemp -d)";     gpg --batch --keyserver hkps://keys.openpgp.org --recv-key "$GPG_KEY" ||     gpg --batch --keyserver hkps://keyserver.ubuntu.com --recv-keys "$GPG_KEY";     gpg --batch --verify spark.tgz.asc spark.tgz;     gpgconf --kill all;     rm -rf "$GNUPGHOME" spark.tgz.asc;         tar -xf spark.tgz --strip-components=1;     chown -R spark:spark .;     mv jars /opt/spark/;     mv RELEASE /opt/spark/;     mv bin /opt/spark/;     mv sbin /opt/spark/;     mv kubernetes/dockerfiles/spark/decom.sh /opt/;     mv examples /opt/spark/;     ln -s "$(basename /opt/spark/examples/jars/spark-examples_*.jar)" /opt/spark/examples/jars/spark-examples.jar;     mv kubernetes/tests /opt/spark/;     mv data /opt/spark/;     mv python/pyspark /opt/spark/python/pyspark/;     mv python/lib /opt/spark/python/lib/;     mv R /opt/spark/;     chmod a+x /opt/decom.sh;     cd ..;     rm -rf "$SPARK_TMP"; # buildkit
                        
# 2025-10-08 00:34:28  0.00B Set environment variables SPARK_TGZ_URL SPARK_TGZ_ASC_URL GPG_KEY
ENV SPARK_TGZ_URL=https://www.apache.org/dyn/closer.lua/spark/spark-3.5.7/spark-3.5.7-bin-hadoop3.tgz?action=download SPARK_TGZ_ASC_URL=https://www.apache.org/dyn/closer.lua/spark/spark-3.5.7/spark-3.5.7-bin-hadoop3.tgz.asc?action=download GPG_KEY=564CA14951C29266889F9C5B90E2BA86F7A9B307
                        
# 2025-10-08 00:34:28  53.35MB Run a command and create a new image layer
RUN |1 spark_uid=185 /bin/sh -c set -ex;     apt-get update;     apt-get install -y gnupg2 wget bash tini libc6 libpam-modules krb5-user libnss3 procps net-tools gosu libnss-wrapper;     mkdir -p /opt/spark;     mkdir /opt/spark/python;     mkdir -p /opt/spark/examples;     mkdir -p /opt/spark/work-dir;     chmod g+w /opt/spark/work-dir;     touch /opt/spark/RELEASE;     chown -R spark:spark /opt/spark;     echo "auth required pam_wheel.so use_uid" >> /etc/pam.d/su;     rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2025-10-08 00:34:28  64.83KB Run a command and create a new image layer
RUN |1 spark_uid=185 /bin/sh -c groupadd --system --gid=${spark_uid} spark &&     useradd --system --uid=${spark_uid} --gid=spark -d /nonexistent spark # buildkit
                        
# 2025-10-08 00:34:28  0.00B Define a build argument
ARG spark_uid=185
                        
# 2025-08-01 19:04:34  0.00B Set the default command to run
CMD ["jshell"]
                        
# 2025-08-01 19:04:34  0.00B Configure the command run when the container starts
ENTRYPOINT ["/__cacert_entrypoint.sh"]
                        
# 2025-08-01 19:04:34  5.31KB Copy new files or directories into the container
COPY --chmod=755 entrypoint.sh /__cacert_entrypoint.sh # buildkit
                        
# 2025-08-01 19:04:34  0.00B Run a command and create a new image layer
RUN /bin/sh -c set -eux;     echo "Verifying install ...";     fileEncoding="$(echo 'System.out.println(System.getProperty("file.encoding"))' | jshell -s -)"; [ "$fileEncoding" = 'UTF-8' ]; rm -rf ~/.java;     echo "javac --version"; javac --version;     echo "java --version"; java --version;     echo "Complete." # buildkit
                        
# 2025-08-01 19:04:34  278.92MB Run a command and create a new image layer
RUN /bin/sh -c set -eux;     ARCH="$(dpkg --print-architecture)";     case "${ARCH}" in        amd64)          ESUM='7dfd551795a8884b26cbb02e0301da95db40160bb194f48271dc2ef9367f50c2';          BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.28%2B6/OpenJDK11U-jdk_x64_linux_hotspot_11.0.28_6.tar.gz';          ;;        arm64)          ESUM='32c316cb3998a9c9dee2829fbb577ea1c0ed666700cec73e049d44c342bb19af';          BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.28%2B6/OpenJDK11U-jdk_aarch64_linux_hotspot_11.0.28_6.tar.gz';          ;;        armhf)          ESUM='b33c99068804bbd7e4aa4bd1c5419ae88ec77833e5e5339ab06a00546a2b0711';          BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.28%2B6/OpenJDK11U-jdk_arm_linux_hotspot_11.0.28_6.tar.gz';          ;;        ppc64el)          ESUM='e272abd162b3de68093630929453feba3e63a5ab1bbb912379f6a4aa968ef06a';          BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.28%2B6/OpenJDK11U-jdk_ppc64le_linux_hotspot_11.0.28_6.tar.gz';          ;;        s390x)          ESUM='ac3f94fdcc5372e90f44fad9cd03ec0e3fd3535fea06c120f85e4a7534c6de04';          BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.28%2B6/OpenJDK11U-jdk_s390x_linux_hotspot_11.0.28_6.tar.gz';          ;;        *)          echo "Unsupported arch: ${ARCH}";          exit 1;          ;;     esac;     wget --progress=dot:giga -O /tmp/openjdk.tar.gz ${BINARY_URL};     wget --progress=dot:giga -O /tmp/openjdk.tar.gz.sig ${BINARY_URL}.sig;     export GNUPGHOME="$(mktemp -d)";     gpg --batch --keyserver keyserver.ubuntu.com --recv-keys 3B04D753C9050D9A5D343F39843C48A565F8F04B;     gpg --batch --verify /tmp/openjdk.tar.gz.sig /tmp/openjdk.tar.gz;     rm -rf "${GNUPGHOME}" /tmp/openjdk.tar.gz.sig;     echo "${ESUM} */tmp/openjdk.tar.gz" | sha256sum -c -;     mkdir -p "$JAVA_HOME";     tar --extract         --file /tmp/openjdk.tar.gz         --directory "$JAVA_HOME"         --strip-components 1         --no-same-owner     ;     rm -f /tmp/openjdk.tar.gz ${JAVA_HOME}/lib/src.zip;     find "$JAVA_HOME/lib" -name '*.so' -exec dirname '{}' ';' | sort -u > /etc/ld.so.conf.d/docker-openjdk.conf;     ldconfig;     java -Xshare:dump; # buildkit
                        
# 2025-08-01 19:04:34  0.00B Set environment variable JAVA_VERSION
ENV JAVA_VERSION=jdk-11.0.28+6
                        
# 2025-08-01 19:04:34  42.92MB Run a command and create a new image layer
RUN /bin/sh -c set -eux;     apt-get update;     DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends         curl         wget         gnupg         fontconfig         ca-certificates p11-kit         tzdata         locales     ;     echo "en_US.UTF-8 UTF-8" >> /etc/locale.gen;     locale-gen en_US.UTF-8;     rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2025-08-01 19:04:34  0.00B Set environment variables LANG LANGUAGE LC_ALL
ENV LANG=en_US.UTF-8 LANGUAGE=en_US:en LC_ALL=en_US.UTF-8
                        
# 2025-08-01 19:04:34  0.00B Set environment variable PATH
ENV PATH=/opt/java/openjdk/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2025-08-01 19:04:34  0.00B Set environment variable JAVA_HOME
ENV JAVA_HOME=/opt/java/openjdk
                        
# 2025-08-01 19:04:34  0.00B 
/bin/sh -c #(nop)  CMD ["/bin/bash"]
                        
# 2025-08-01 19:04:34  77.87MB 
/bin/sh -c #(nop) ADD file:32d41b6329e8f89fa4ac92ef97c04b7cfd5e90fb74e1509c3e27d7c91195b7c7 in / 
                        
# 2025-08-01 19:04:34  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.version=22.04
                        
# 2025-08-01 19:04:34  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.ref.name=ubuntu
                        
# 2025-08-01 19:04:34  0.00B 
/bin/sh -c #(nop)  ARG LAUNCHPAD_BUILD_ARCH
                        
# 2025-08-01 19:04:34  0.00B 
/bin/sh -c #(nop)  ARG RELEASE
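
Two details in the history above matter at run time: the image switches to USER spark (UID 185, from the spark_uid build arg), and the Spark examples are shipped under /opt/spark/examples. A host directory mounted over /opt/spark/work-dir therefore usually needs to be writable by UID 185, and the bundled SparkPi job makes a convenient smoke test. A sketch (the host path, local[2] master and the 100 argument are illustrative):

mkdir -p ./spark-work && sudo chown 185:185 ./spark-work
docker run --rm -v "$(pwd)/spark-work:/opt/spark/work-dir" swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:3.5.7 /opt/spark/bin/spark-submit --master 'local[2]' --class org.apache.spark.examples.SparkPi /opt/spark/examples/jars/spark-examples.jar 100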
                        
                    

Image information

{
    "Id": "sha256:e1798f073e8fcc27e62ef7c753f8729f35860a73c0f9f811444dbe2c29921942",
    "RepoTags": [
        "spark:3.5.7",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:3.5.7"
    ],
    "RepoDigests": [
        "spark@sha256:5b75b64a42010ba072c2b0a987cc8d1c4b9ec70e0e99b1c72c7f1871855ff631",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark@sha256:28ac9c3a44f0d68e53f67362e94f0eb03cf594132dd509e825146df7c290b27f"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2025-10-07T16:34:28Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "spark",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/opt/java/openjdk/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "JAVA_HOME=/opt/java/openjdk",
            "LANG=en_US.UTF-8",
            "LANGUAGE=en_US:en",
            "LC_ALL=en_US.UTF-8",
            "JAVA_VERSION=jdk-11.0.28+6",
            "SPARK_TGZ_URL=https://www.apache.org/dyn/closer.lua/spark/spark-3.5.7/spark-3.5.7-bin-hadoop3.tgz?action=download",
            "SPARK_TGZ_ASC_URL=https://www.apache.org/dyn/closer.lua/spark/spark-3.5.7/spark-3.5.7-bin-hadoop3.tgz.asc?action=download",
            "GPG_KEY=564CA14951C29266889F9C5B90E2BA86F7A9B307",
            "SPARK_HOME=/opt/spark"
        ],
        "Cmd": null,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "/opt/spark/work-dir",
        "Entrypoint": [
            "/opt/entrypoint.sh"
        ],
        "OnBuild": null,
        "Labels": {
            "org.opencontainers.image.ref.name": "ubuntu",
            "org.opencontainers.image.version": "22.04"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 1147267224,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/047e92e14a729757a97f0bc3ba52ef7e47a87b92bfd7b949dd23fd5dc7a6e9ba/diff:/var/lib/docker/overlay2/28934776a7ea72e8a5c2add54bddaf0e126b8ad771241bcb88821e0d20945059/diff:/var/lib/docker/overlay2/e6e4a5a97d8c29f102c61b971bebfea0764bd36a36a3b9d573d54fbce21f2011/diff:/var/lib/docker/overlay2/29902e24ccfbf53129ccb2a38c64050199c3b5e575915d606a56f4f38d6b0305/diff:/var/lib/docker/overlay2/17f695b490e128fab80248aa9c329db98a5b2ed43aefb8e2f2f3aa2d5350d83f/diff:/var/lib/docker/overlay2/556a0ebf049bce635f7f0c41d1c130888be22e4cd68594e184125ea9defa98ff/diff:/var/lib/docker/overlay2/928b0a61f9a5cb25723fb51a83b485db13c90f642c4adde53cef3ef7ed5d1907/diff:/var/lib/docker/overlay2/d795813bac3a7da89a6e5138b5354e5ac2d6164c2fcd05ab3cbee6b0ef091585/diff:/var/lib/docker/overlay2/590c3c2757d1744a711f8f0a1766a7e1840b1e20d3a749c1a83bec8ee01f90aa/diff:/var/lib/docker/overlay2/99a8a7af45ffa1dc430375fde8c3084ee85be5839f687d98d2857fb82cd37c67/diff",
            "MergedDir": "/var/lib/docker/overlay2/a3d0ad101424d8aaf011be594fa35766c9501e94ee1072bdecf11a8b93df3d23/merged",
            "UpperDir": "/var/lib/docker/overlay2/a3d0ad101424d8aaf011be594fa35766c9501e94ee1072bdecf11a8b93df3d23/diff",
            "WorkDir": "/var/lib/docker/overlay2/a3d0ad101424d8aaf011be594fa35766c9501e94ee1072bdecf11a8b93df3d23/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:767e56ba346ae714b6e6b816baa839051145ed78cfa0e4524a86cc287b0c4b00",
            "sha256:ed8d24f15f58ca1d507fbd49d73db4a20c1768d5b5eb4349d55b685700b166da",
            "sha256:8f5d757681956607236326be20e821e4acbb5f144842e173438956c41a07a7bf",
            "sha256:71023687c9c68fe6c9f7607a3d5528f06e42ef0bcfa9a7211369a1725d0a4f4f",
            "sha256:42f09734fd259a0b4adecb32e678c4c0f3c9c9dd7e019bb69517c57ddc034038",
            "sha256:4402b1e5d5e9cf9f34fc0dcc5d132af37e24f0a30a89faf0f5b758faff689944",
            "sha256:40fec6d69bf2a94a4f80489a6d7a3a5eb9ab61b8359aed46a2b4016e7df467b9",
            "sha256:684811c1392caddcc2180aa8853c1e6691ce1d87b73fb7b4827040a65ae0c5a9",
            "sha256:ec48606a8654fb39611ffea1f9c6e2c12a9f633495091fd4ae3e283a65f442f7",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:b73c2fb94371f6372db7f58e0923e4f470ea25e1a3e9b1bff8b27ee4aa65e78e"
        ]
    },
    "Metadata": {
        "LastTagTime": "2025-10-10T10:51:52.498901873+08:00"
    }
}
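
The JSON above corresponds to docker image inspect output; it can be reproduced locally against the mirrored tag, or narrowed to a single field:

docker image inspect swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:3.5.7
docker image inspect --format '{{json .Config.Env}}' swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/spark:3.5.7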

More versions

docker.io/spark:latest                 linux/amd64  docker.io  984.32MB  2024-09-25 17:19  565
docker.io/spark:3.5.3                  linux/amd64  docker.io  982.46MB  2024-10-29 01:25  337
docker.io/sparkison/m3u-editor:latest  linux/amd64  docker.io  989.06MB  2025-08-15 20:40  136
docker.io/spark:3.5.7                  linux/amd64  docker.io  1.15GB    2025-10-10 10:53  11