docker.io/apache/spark:3.5.3 linux/amd64

docker.io/apache/spark:3.5.3 - China mirror download source. Verified publisher: apache

Apache Spark image

This image ships a prebuilt Apache Spark environment for running Spark jobs and applications. It provides Spark's core components, including:

  • Spark Core
  • Spark SQL
  • Spark Streaming
  • Spark MLlib
  • Spark GraphX

With this image you can start using Spark immediately, with no manual installation or configuration. It is an easy-to-use, reliable option for a wide range of Spark workloads.
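As a minimal smoke test, the image's bundled examples can be run directly. A hedged sketch (SparkPi is the stock example class shipped with Spark under /opt/spark/examples; shown here as a dry run, drop the `echo` to actually execute):

```shell
# Quick-start sketch: run the bundled SparkPi example in local mode.
# Dry run only -- remove `echo` to execute (requires a docker daemon).
IMAGE=docker.io/apache/spark:3.5.3
echo docker run --rm "$IMAGE" /opt/spark/bin/run-example SparkPi 10
```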

Source image: docker.io/apache/spark:3.5.3
China mirror: swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.5.3
Image ID: sha256:d3ea4aeb842bc149cfcb143f7692273c9077b84bef2942a43aeb36ac0b8169dc
Tag: 3.5.3
Size: 984.74MB
Registry: docker.io
Entrypoint: /opt/entrypoint.sh
Working directory: /opt/spark/work-dir
OS/Arch: linux/amd64
Created: 2024-09-27T17:09:33.652793318Z
Synced: 2024-11-23 19:03
Updated: 2025-01-18 11:10
Environment variables
PATH=/opt/java/openjdk/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
JAVA_HOME=/opt/java/openjdk
LANG=en_US.UTF-8
LANGUAGE=en_US:en
LC_ALL=en_US.UTF-8
JAVA_VERSION=jdk-11.0.24+8
SPARK_TGZ_URL=https://archive.apache.org/dist/spark/spark-3.5.3/spark-3.5.3-bin-hadoop3.tgz
SPARK_TGZ_ASC_URL=https://archive.apache.org/dist/spark/spark-3.5.3/spark-3.5.3-bin-hadoop3.tgz.asc
GPG_KEY=0A2D660358B6F6F8071FD16F6606986CF5A8447C
SPARK_HOME=/opt/spark
Image labels
org.opencontainers.image.ref.name: ubuntu
org.opencontainers.image.version: 20.04
Image security scan

OS: ubuntu 20.04  Scan engine: Trivy  Scan time: 2024-11-23 19:05

Low: 136  Medium: 1184  High: 38  Critical: 5

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.5.3
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.5.3  docker.io/apache/spark:3.5.3
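The pull-and-retag pair above can be generalized for any docker.io image with a small helper. A sketch (`mirror_name` is a hypothetical function name; the mirror prefix is the one used throughout this page):

```shell
# Hypothetical helper: map an upstream docker.io image name to its
# mirror path by prefixing the SWR namespace from this page.
MIRROR_PREFIX=swr.cn-north-4.myhuaweicloud.com/ddn-k8s
mirror_name() {
  echo "${MIRROR_PREFIX}/docker.io/$1"
}
mirror_name apache/spark:3.5.3
```

Usage would then be `docker pull "$(mirror_name apache/spark:3.5.3)"` followed by `docker tag` back to the upstream name.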

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.5.3
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.5.3  docker.io/apache/spark:3.5.3

Quick shell replacement command

sed -i 's#apache/spark:3.5.3#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.5.3#' deployment.yaml
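The rewrite can be checked end to end on a throwaway manifest (hedged: this assumes GNU sed's `-i` as on Linux; BSD/macOS sed needs `-i ''`):

```shell
# Demo of the sed rewrite on a temporary manifest fragment.
tmp=$(mktemp)
cat > "$tmp" <<'EOF'
    containers:
      - name: spark
        image: apache/spark:3.5.3
EOF
sed -i 's#apache/spark:3.5.3#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.5.3#' "$tmp"
REWRITTEN=$(grep 'image:' "$tmp")
echo "$REWRITTEN"
rm -f "$tmp"
```

Note the replacement string itself contains `apache/spark:3.5.3`, so running the substitution a second time would prepend the mirror prefix again; apply it once per file.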

Quick distribution with Ansible - Docker

ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.5.3 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.5.3  docker.io/apache/spark:3.5.3'

Quick distribution with Ansible - Containerd

ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.5.3 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.5.3  docker.io/apache/spark:3.5.3'

Image build history


# 2024-09-28 01:09:33  0.00B Set the user the container runs as
USER spark
                        
# 2024-09-28 01:09:33  303.42MB Run a command and create a new image layer
RUN /bin/sh -c set -ex;     apt-get update;     apt-get install -y python3 python3-pip;     rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2024-09-28 01:09:33  0.00B Set the user the container runs as
USER root

# 2024-09-28 00:45:36  0.00B Configure the command run when the container starts
ENTRYPOINT ["/opt/entrypoint.sh"]
                        
# 2024-09-28 00:45:36  0.00B Set the user the container runs as
USER spark

# 2024-09-28 00:45:36  0.00B Set the working directory to /opt/spark/work-dir
WORKDIR /opt/spark/work-dir

# 2024-09-28 00:45:36  0.00B Set environment variable SPARK_HOME
ENV SPARK_HOME=/opt/spark

# 2024-09-28 00:45:36  4.74KB Copy files or directories into the container
COPY entrypoint.sh /opt/ # buildkit
                        
# 2024-09-28 00:45:36  361.77MB Run a command and create a new image layer
RUN |1 spark_uid=185 /bin/sh -c set -ex;     export SPARK_TMP="$(mktemp -d)";     cd $SPARK_TMP;     wget -nv -O spark.tgz "$SPARK_TGZ_URL";     wget -nv -O spark.tgz.asc "$SPARK_TGZ_ASC_URL";     export GNUPGHOME="$(mktemp -d)";     gpg --batch --keyserver hkps://keys.openpgp.org --recv-key "$GPG_KEY" ||     gpg --batch --keyserver hkps://keyserver.ubuntu.com --recv-keys "$GPG_KEY";     gpg --batch --verify spark.tgz.asc spark.tgz;     gpgconf --kill all;     rm -rf "$GNUPGHOME" spark.tgz.asc;         tar -xf spark.tgz --strip-components=1;     chown -R spark:spark .;     mv jars /opt/spark/;     mv RELEASE /opt/spark/;     mv bin /opt/spark/;     mv sbin /opt/spark/;     mv kubernetes/dockerfiles/spark/decom.sh /opt/;     mv examples /opt/spark/;     ln -s "$(basename $(ls /opt/spark/examples/jars/spark-examples_*.jar))" /opt/spark/examples/jars/spark-examples.jar;     mv kubernetes/tests /opt/spark/;     mv data /opt/spark/;     mv python/pyspark /opt/spark/python/pyspark/;     mv python/lib /opt/spark/python/lib/;     mv R /opt/spark/;     chmod a+x /opt/decom.sh;     cd ..;     rm -rf "$SPARK_TMP"; # buildkit
                        
# 2024-09-28 00:45:13  0.00B Set environment variables SPARK_TGZ_URL SPARK_TGZ_ASC_URL GPG_KEY
ENV SPARK_TGZ_URL=https://archive.apache.org/dist/spark/spark-3.5.3/spark-3.5.3-bin-hadoop3.tgz SPARK_TGZ_ASC_URL=https://archive.apache.org/dist/spark/spark-3.5.3/spark-3.5.3-bin-hadoop3.tgz.asc GPG_KEY=0A2D660358B6F6F8071FD16F6606986CF5A8447C
                        
# 2024-09-28 00:45:13  59.91MB Run a command and create a new image layer
RUN |1 spark_uid=185 /bin/sh -c set -ex;     apt-get update;     apt-get install -y gnupg2 wget bash tini libc6 libpam-modules krb5-user libnss3 procps net-tools gosu libnss-wrapper;     mkdir -p /opt/spark;     mkdir /opt/spark/python;     mkdir -p /opt/spark/examples;     mkdir -p /opt/spark/work-dir;     chmod g+w /opt/spark/work-dir;     touch /opt/spark/RELEASE;     chown -R spark:spark /opt/spark;     echo "auth required pam_wheel.so use_uid" >> /etc/pam.d/su;     rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2024-09-28 00:45:04  64.84KB Run a command and create a new image layer
RUN |1 spark_uid=185 /bin/sh -c groupadd --system --gid=${spark_uid} spark &&     useradd --system --uid=${spark_uid} --gid=spark spark # buildkit
                        
# 2024-09-28 00:45:04  0.00B Define a build argument
ARG spark_uid=185

# 2024-08-22 15:58:33  0.00B Configure the command run when the container starts
ENTRYPOINT ["/__cacert_entrypoint.sh"]
                        
# 2024-08-22 15:58:33  4.74KB Copy files or directories into the container
COPY --chmod=755 entrypoint.sh /__cacert_entrypoint.sh # buildkit

# 2024-08-22 15:58:33  0.00B Run a command and create a new image layer
RUN /bin/sh -c set -eux;     echo "Verifying install ...";     echo "java --version"; java --version;     echo "Complete." # buildkit
                        
# 2024-08-22 15:58:33  140.96MB Run a command and create a new image layer
RUN /bin/sh -c set -eux;     ARCH="$(dpkg --print-architecture)";     case "${ARCH}" in        amd64)          ESUM='e0c1938093da3780e4494d366a4e6b75584dde8d46a19acea6691ae11df4cda5';          BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.24%2B8/OpenJDK11U-jre_x64_linux_hotspot_11.0.24_8.tar.gz';          ;;        arm64)          ESUM='1fe97cdaad47d7d108f329c6e4560b46748ef7f2948a1027812ade0bbc2a3597';          BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.24%2B8/OpenJDK11U-jre_aarch64_linux_hotspot_11.0.24_8.tar.gz';          ;;        armhf)          ESUM='bf893085627c6ec484e63aa1290276b23bcfee547459da6b0432ae9c5c1be22a';          BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.24%2B8/OpenJDK11U-jre_arm_linux_hotspot_11.0.24_8.tar.gz';          ;;        ppc64el)          ESUM='8ee351314182df93fbad96139bb74b97814944d66197896e388404a1ecfa06b3';          BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.24%2B8/OpenJDK11U-jre_ppc64le_linux_hotspot_11.0.24_8.tar.gz';          ;;        s390x)          ESUM='5b331f093bb03126334bbbc24f05f60681baeda461d860e4e2cdb693ee54e0ed';          BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.24%2B8/OpenJDK11U-jre_s390x_linux_hotspot_11.0.24_8.tar.gz';          ;;        *)          echo "Unsupported arch: ${ARCH}";          exit 1;          ;;     esac;     wget --progress=dot:giga -O /tmp/openjdk.tar.gz ${BINARY_URL};     echo "${ESUM} */tmp/openjdk.tar.gz" | sha256sum -c -;     mkdir -p "$JAVA_HOME";     tar --extract         --file /tmp/openjdk.tar.gz         --directory "$JAVA_HOME"         --strip-components 1         --no-same-owner     ;     rm -f /tmp/openjdk.tar.gz ${JAVA_HOME}/lib/src.zip;     find "$JAVA_HOME/lib" -name '*.so' -exec dirname '{}' ';' | sort -u > /etc/ld.so.conf.d/docker-openjdk.conf;     ldconfig;     java -Xshare:dump; # buildkit
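The JDK layer above pins each tarball with a sha256 digest before extracting it. The same verification idiom (`echo "<sum> *<file>" | sha256sum -c -`) is reproducible on any local file:

```shell
# Reproduce the checksum-verification idiom from the build step.
# sha256sum -c exits non-zero on any mismatch, aborting the build.
tmp=$(mktemp)
printf 'payload\n' > "$tmp"
ESUM=$(sha256sum "$tmp" | awk '{print $1}')
echo "${ESUM} *${tmp}" | sha256sum -c -
rm -f "$tmp"
```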
                        
# 2024-08-22 15:58:33  0.00B Set environment variable JAVA_VERSION
ENV JAVA_VERSION=jdk-11.0.24+8

# 2024-08-22 15:58:33  45.80MB Run a command and create a new image layer
RUN /bin/sh -c set -eux;     apt-get update;     DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends         curl         wget         fontconfig         ca-certificates p11-kit         tzdata         locales     ;     echo "en_US.UTF-8 UTF-8" >> /etc/locale.gen;     locale-gen en_US.UTF-8;     rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2024-08-22 15:58:33  0.00B Set environment variables LANG LANGUAGE LC_ALL
ENV LANG=en_US.UTF-8 LANGUAGE=en_US:en LC_ALL=en_US.UTF-8

# 2024-08-22 15:58:33  0.00B Set environment variable PATH
ENV PATH=/opt/java/openjdk/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin

# 2024-08-22 15:58:33  0.00B Set environment variable JAVA_HOME
ENV JAVA_HOME=/opt/java/openjdk
                        
# 2024-08-13 17:26:48  0.00B 
/bin/sh -c #(nop)  CMD ["/bin/bash"]
                        
# 2024-08-13 17:26:48  72.81MB 
/bin/sh -c #(nop) ADD file:e7cff353f027ecf0a2cb1cdd51714de3b083a11a0d965f104489f9a7e6925056 in / 
                        
# 2024-08-13 17:26:46  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.version=20.04
                        
# 2024-08-13 17:26:46  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.ref.name=ubuntu
                        
# 2024-08-13 17:26:46  0.00B 
/bin/sh -c #(nop)  ARG LAUNCHPAD_BUILD_ARCH
                        
# 2024-08-13 17:26:46  0.00B 
/bin/sh -c #(nop)  ARG RELEASE
                        
                    

Image information

{
    "Id": "sha256:d3ea4aeb842bc149cfcb143f7692273c9077b84bef2942a43aeb36ac0b8169dc",
    "RepoTags": [
        "apache/spark:3.5.3",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.5.3"
    ],
    "RepoDigests": [
        "apache/spark@sha256:d0f5d63bfbbffdbec08debb4911554ba671a2fa7398374db8db314d80328f576",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark@sha256:1d1493d5acb98bdf5e3376a123848ee0ef45663bcc70911f20fb576205a35a95"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2024-09-27T17:09:33.652793318Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "spark",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/opt/java/openjdk/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "JAVA_HOME=/opt/java/openjdk",
            "LANG=en_US.UTF-8",
            "LANGUAGE=en_US:en",
            "LC_ALL=en_US.UTF-8",
            "JAVA_VERSION=jdk-11.0.24+8",
            "SPARK_TGZ_URL=https://archive.apache.org/dist/spark/spark-3.5.3/spark-3.5.3-bin-hadoop3.tgz",
            "SPARK_TGZ_ASC_URL=https://archive.apache.org/dist/spark/spark-3.5.3/spark-3.5.3-bin-hadoop3.tgz.asc",
            "GPG_KEY=0A2D660358B6F6F8071FD16F6606986CF5A8447C",
            "SPARK_HOME=/opt/spark"
        ],
        "Cmd": null,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "/opt/spark/work-dir",
        "Entrypoint": [
            "/opt/entrypoint.sh"
        ],
        "OnBuild": null,
        "Labels": {
            "org.opencontainers.image.ref.name": "ubuntu",
            "org.opencontainers.image.version": "20.04"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 984741019,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/7d501a45988c0d8083408ef5b94090fc7f56d752c822b6384d4a5c224dd59f15/diff:/var/lib/docker/overlay2/e03fd0cd642832a57eed234a457f700677d4d1bd6c7eb96d25622d460daf1f87/diff:/var/lib/docker/overlay2/14087142c47e591382d150d2a03230fe4ed158ee8b4bb38ed38bb3da7b09f5e5/diff:/var/lib/docker/overlay2/cb2ad94f4a728c9655298960403c8456b8a88ea12fbefb97c98745fc0ed25f9e/diff:/var/lib/docker/overlay2/87f43794cf1b7e656b1f7dd9b5f7bb7cd8cfbc2604d29b92374d97632e638018/diff:/var/lib/docker/overlay2/28bbe5c88ba0dd804cfcaaa9a2cb8286a05d2a83ac2d6b772703711bd02756c4/diff:/var/lib/docker/overlay2/a744fe1bb9aea44667dfdf4260ef39385e8ddef4ea90316afe89c95b28bcb812/diff:/var/lib/docker/overlay2/578e01379a6b2506288183a4ae59322da0afd07ef63bf3d6601186c5b2d5d3f1/diff:/var/lib/docker/overlay2/e00cd7ba9b629675b6eca28bac83aa0212c87f13d6d84dcc67dba11855454844/diff:/var/lib/docker/overlay2/666611870c5b851bee1e441704a53c1c6f5beaae296df2fa4ee5b1a9ff40e37f/diff",
            "MergedDir": "/var/lib/docker/overlay2/665f620cd255ae4b9f4875a586ffa955b671f20fd2fa66f304ba46cb292a32a1/merged",
            "UpperDir": "/var/lib/docker/overlay2/665f620cd255ae4b9f4875a586ffa955b671f20fd2fa66f304ba46cb292a32a1/diff",
            "WorkDir": "/var/lib/docker/overlay2/665f620cd255ae4b9f4875a586ffa955b671f20fd2fa66f304ba46cb292a32a1/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:3ec3ded77c0ce89e931f92aed086b2a2c774a6fbd51617853decc8afa4e1087a",
            "sha256:8cdab93a842f1ebf4bfcd2da131a8b77a3d34a77c27fc239e0c0ef1803705f6d",
            "sha256:2f30f6654b4ec617de7f4513be9a6bbbb91f18194b0481496a76b04c3e112590",
            "sha256:4a5ab425d6e141aeea4423031c339b6b9a735ecb5f2d55952cc6704cf9ed8b3c",
            "sha256:b2650c6d730e3b3e60b27ee239244e841d56c85165215f2f557f2fae3d772276",
            "sha256:e8d880a0a760a02caef668593f1fe1af706b451ab559fb40f4f26fc96ec89940",
            "sha256:ae866224aa472db5a8a2c4e4a0bac17640332e40212493c328150f8d0e17d233",
            "sha256:356c8b2412ab6a373ef51f65fc062a97041aa459931a34a4b8660571d5465890",
            "sha256:1a701ede636236be195237753f7506c8d3571946961c47e94090e4f5e94a83a6",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:182e6e58b00dce4d43942230fbe06b9c3439011128f42f6cb4ab6da650cd1572"
        ]
    },
    "Metadata": {
        "LastTagTime": "2024-11-23T19:02:35.693340456+08:00"
    }
}
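Individual fields can be pulled out of a saved inspect dump like the one above with plain grep; a rough sketch (`jq` or `docker image inspect --format` is the robust route when available):

```shell
# Extract a single field from a saved inspect dump with grep.
# Fragile by design -- fine for a quick look, use jq for real scripts.
tmp=$(mktemp)
cat > "$tmp" <<'EOF'
{
    "WorkingDir": "/opt/spark/work-dir",
    "User": "spark"
}
EOF
WORKDIR_FIELD=$(grep -o '"WorkingDir": "[^"]*"' "$tmp")
echo "$WORKDIR_FIELD"
rm -f "$tmp"
```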

More versions

docker.io/apache/spark:3.4.4
linux/amd64  docker.io  974.48MB  2024-10-29 01:26

docker.io/apache/spark:v3.2.3
linux/amd64  docker.io  612.37MB  2024-11-11 16:11

docker.io/apache/spark:3.5.3
linux/amd64  docker.io  984.74MB  2024-11-23 19:03

docker.io/apache/spark:3.5.3-java17
linux/amd64  docker.io  1.15GB  2024-11-24 00:41

docker.io/apache/spark:3.5.3-scala2.12-java11-r-ubuntu
linux/amd64  docker.io  1.32GB  2024-11-24 01:13

docker.io/apache/spark:3.5.3-scala2.12-java11-ubuntu
linux/amd64  docker.io  681.32MB  2024-11-24 01:31

docker.io/apache/spark:3.5.3-scala2.12-java17-ubuntu
linux/amd64  docker.io  828.98MB  2024-11-24 01:33

docker.io/apache/spark:3.3.3
linux/amd64  docker.io  939.31MB  2024-12-03 11:56

docker.io/apache/spark-py:v3.1.3
linux/amd64  docker.io  886.30MB  2024-12-11 19:44