docker.io/apache/spark:3.3.3 linux/amd64

docker.io/apache/spark:3.3.3 - China mirror download source · Views: 55 · Verified publisher: apache

Apache Spark Image

This image ships a prebuilt Apache Spark environment for running Spark jobs and applications. It provides Spark's core components, including:

  • Spark Core
  • Spark SQL
  • Spark Streaming
  • Spark MLlib
  • Spark GraphX

With this image you can get started with Spark quickly, without installing or configuring anything by hand. It is an easy-to-use, reliable option for a wide range of Spark workloads; a quick smoke test is shown below.
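
For example, the bundled SparkPi example can be submitted in local mode straight from the image. This is a minimal sketch: the examples jar path and name below follow the standard Spark 3.3.3 distribution layout (Scala 2.12 build) and may differ in your copy.

docker run --rm swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.3.3 \
  /opt/spark/bin/spark-submit \
  --master "local[2]" \
  --class org.apache.spark.examples.SparkPi \
  /opt/spark/examples/jars/spark-examples_2.12-3.3.3.jar 100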

Source image  docker.io/apache/spark:3.3.3
China mirror  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.3.3
Image ID  sha256:f00ba10a5676bf2d02ea10a73e244f57140cb86a6299a2dfee9795ab1c175443
Image tag  3.3.3
Size  939.31MB
Registry  docker.io
Project info  Docker Hub page · project tags
CMD
Entrypoint  /opt/entrypoint.sh
Working directory  /opt/spark/work-dir
OS/Platform  linux/amd64
Views  55
Contributor
Image created  2023-08-22T10:18:27.458046245Z
Synced  2024-12-03 11:56
Updated  2025-01-18 11:10
Environment variables
PATH=/opt/java/openjdk/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
JAVA_HOME=/opt/java/openjdk
LANG=en_US.UTF-8
LANGUAGE=en_US:en
LC_ALL=en_US.UTF-8
JAVA_VERSION=jdk-11.0.20+8
SPARK_TGZ_URL=https://archive.apache.org/dist/spark/spark-3.3.3/spark-3.3.3-bin-hadoop3.tgz
SPARK_TGZ_ASC_URL=https://archive.apache.org/dist/spark/spark-3.3.3/spark-3.3.3-bin-hadoop3.tgz.asc
GPG_KEY=F6468A4FF8377B4F1C07BC2AA077F928A0BF68D8
SPARK_HOME=/opt/spark
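
As a quick check, these variables can be read from inside the container; the sketch below assumes the default entrypoint passes arbitrary commands through (the standard Spark entrypoint.sh does) and simply prints SPARK_HOME and the Spark version.

docker run --rm swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.3.3 \
  bash -c 'echo "SPARK_HOME=$SPARK_HOME"; "$SPARK_HOME/bin/spark-submit" --version'
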
Image labels
org.opencontainers.image.ref.name: ubuntu  org.opencontainers.image.version: 20.04
Image security scan  View the Trivy scan report

OS: ubuntu 20.04  Scan engine: Trivy  Scan time: 2024-12-03 11:58

Low-severity vulnerabilities: 218  Medium: 1872  High: 96  Critical: 6

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.3.3
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.3.3  docker.io/apache/spark:3.3.3

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.3.3
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.3.3  docker.io/apache/spark:3.3.3
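
If the node runs containerd as the Kubernetes CRI, kubelet resolves images from the k8s.io namespace, so it may be necessary to pull and tag there instead; a hedged variant (the namespace name depends on your containerd configuration):

ctr -n k8s.io images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.3.3
ctr -n k8s.io images tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.3.3 docker.io/apache/spark:3.3.3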

Shell quick-replace command

sed -i 's#apache/spark:3.3.3#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.3.3#' deployment.yaml
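
To apply the same replacement across a directory of manifests rather than a single file, sed can be driven by grep; a sketch assuming the manifests use a .yaml extension and live under ./manifests:

grep -rl --include='*.yaml' 'apache/spark:3.3.3' ./manifests | xargs sed -i 's#apache/spark:3.3.3#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.3.3#'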

Ansible quick distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.3.3 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.3.3  docker.io/apache/spark:3.3.3'

Ansible quick distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.3.3 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.3.3  docker.io/apache/spark:3.3.3'

Image build history


# 2023-08-22 18:18:27  0.00B Set the user used when running the container
USER spark
                        
# 2023-08-22 18:18:27  303.01MB Run a command and create a new image layer
RUN /bin/sh -c set -ex;     apt-get update;     apt-get install -y python3 python3-pip;     rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2023-08-22 18:18:27  0.00B Set the user used when running the container
USER root
                        
# 2023-08-22 18:04:13  0.00B Configure the command to run when the container starts
ENTRYPOINT ["/opt/entrypoint.sh"]
                        
# 2023-08-22 18:04:13  0.00B Set the user used when running the container
USER spark
                        
# 2023-08-22 18:04:13  0.00B Set the working directory to /opt/spark/work-dir
WORKDIR /opt/spark/work-dir
                        
# 2023-08-22 18:04:13  0.00B Set the environment variable SPARK_HOME
ENV SPARK_HOME=/opt/spark
                        
# 2023-08-22 18:04:13  4.54KB Copy new files or directories into the container
COPY entrypoint.sh /opt/ # buildkit
                        
# 2023-08-22 18:04:13  320.05MB Run a command and create a new image layer
RUN |1 spark_uid=185 /bin/sh -c set -ex;     export SPARK_TMP="$(mktemp -d)";     cd $SPARK_TMP;     wget -nv -O spark.tgz "$SPARK_TGZ_URL";     wget -nv -O spark.tgz.asc "$SPARK_TGZ_ASC_URL";     export GNUPGHOME="$(mktemp -d)";     gpg --batch --keyserver hkps://keys.openpgp.org --recv-key "$GPG_KEY" ||     gpg --batch --keyserver hkps://keyserver.ubuntu.com --recv-keys "$GPG_KEY";     gpg --batch --verify spark.tgz.asc spark.tgz;     gpgconf --kill all;     rm -rf "$GNUPGHOME" spark.tgz.asc;         tar -xf spark.tgz --strip-components=1;     chown -R spark:spark .;     mv jars /opt/spark/;     mv bin /opt/spark/;     mv sbin /opt/spark/;     mv kubernetes/dockerfiles/spark/decom.sh /opt/;     mv examples /opt/spark/;     mv kubernetes/tests /opt/spark/;     mv data /opt/spark/;     mv python/pyspark /opt/spark/python/pyspark/;     mv python/lib /opt/spark/python/lib/;     mv R /opt/spark/;     chmod a+x /opt/decom.sh;     cd ..;     rm -rf "$SPARK_TMP"; # buildkit
                        
# 2023-08-22 18:00:38  0.00B Set the environment variables SPARK_TGZ_URL SPARK_TGZ_ASC_URL GPG_KEY
ENV SPARK_TGZ_URL=https://archive.apache.org/dist/spark/spark-3.3.3/spark-3.3.3-bin-hadoop3.tgz SPARK_TGZ_ASC_URL=https://archive.apache.org/dist/spark/spark-3.3.3/spark-3.3.3-bin-hadoop3.tgz.asc GPG_KEY=F6468A4FF8377B4F1C07BC2AA077F928A0BF68D8
                        
# 2023-08-22 18:00:38  57.33MB Run a command and create a new image layer
RUN |1 spark_uid=185 /bin/sh -c set -ex;     apt-get update;     apt-get install -y gnupg2 wget bash tini libc6 libpam-modules krb5-user libnss3 procps net-tools gosu libnss-wrapper;     mkdir -p /opt/spark;     mkdir /opt/spark/python;     mkdir -p /opt/spark/examples;     mkdir -p /opt/spark/work-dir;     chmod g+w /opt/spark/work-dir;     touch /opt/spark/RELEASE;     chown -R spark:spark /opt/spark;     echo "auth required pam_wheel.so use_uid" >> /etc/pam.d/su;     rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2023-08-22 18:00:25  64.84KB Run a command and create a new image layer
RUN |1 spark_uid=185 /bin/sh -c groupadd --system --gid=${spark_uid} spark &&     useradd --system --uid=${spark_uid} --gid=spark spark # buildkit
                        
# 2023-08-22 18:00:25  0.00B Define a build argument
ARG spark_uid=185
                        
# 2023-08-15 02:10:24  0.00B 
/bin/sh -c #(nop)  ENTRYPOINT ["/__cacert_entrypoint.sh"]
                        
# 2023-08-15 02:10:24  1.18KB 
/bin/sh -c #(nop) COPY file:8b8864b3e02a33a579dc216fd51b28a6047bc8eeaa03045b258980fe0cf7fcb3 in /__cacert_entrypoint.sh 
                        
# 2023-08-09 03:22:46  0.00B 
/bin/sh -c echo Verifying install ...     && fileEncoding="$(echo 'System.out.println(System.getProperty("file.encoding"))' | jshell -s -)"; [ "$fileEncoding" = 'UTF-8' ]; rm -rf ~/.java     && echo java --version && java --version     && echo Complete.
                        
# 2023-08-09 03:22:45  140.28MB 
/bin/sh -c set -eux;     ARCH="$(dpkg --print-architecture)";     case "${ARCH}" in        aarch64|arm64)          ESUM='45e190920fb3ec61ee5213a7bd98553abf2ae7692eb9daa504fcdc9d59a7cfc4';          BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.20%2B8/OpenJDK11U-jre_aarch64_linux_hotspot_11.0.20_8.tar.gz';          ;;        armhf|arm)          ESUM='1e2a02364084b2d054e88a871f3efaa4450ae4f087b8f806fd95c15d5affcc7b';          BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.20%2B8/OpenJDK11U-jre_arm_linux_hotspot_11.0.20_8.tar.gz';          ;;        ppc64el|powerpc:common64)          ESUM='61034834b61bf080392218b25dcac2d9e3505b5e4f53539704d496be4181aadf';          BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.20%2B8/OpenJDK11U-jre_ppc64le_linux_hotspot_11.0.20_8.tar.gz';          ;;        s390x|s390:64-bit)          ESUM='0c7050976914e0613179446de62bb20d2845ae809f6d31bc0ed8d136f8fd3d9b';          BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.20%2B8/OpenJDK11U-jre_s390x_linux_hotspot_11.0.20_8.tar.gz';          ;;        amd64|i386:x86-64)          ESUM='ffb070c26ea22771f78769c569c9db3412e6486434dc6df1fd3c3438285766e7';          BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.20%2B8/OpenJDK11U-jre_x64_linux_hotspot_11.0.20_8.tar.gz';          ;;        *)          echo "Unsupported arch: ${ARCH}";          exit 1;          ;;     esac; 	  wget -O /tmp/openjdk.tar.gz ${BINARY_URL}; 	  echo "${ESUM} */tmp/openjdk.tar.gz" | sha256sum -c -; 	  mkdir -p "$JAVA_HOME"; 	  tar --extract 	      --file /tmp/openjdk.tar.gz 	      --directory "$JAVA_HOME" 	      --strip-components 1 	      --no-same-owner 	  ;     rm -f /tmp/openjdk.tar.gz ${JAVA_HOME}/lib/src.zip;     find "$JAVA_HOME/lib" -name '*.so' -exec dirname '{}' ';' | sort -u > /etc/ld.so.conf.d/docker-openjdk.conf;     ldconfig;     java -Xshare:dump;
                        
# 2023-08-09 03:22:00  0.00B 
/bin/sh -c #(nop)  ENV JAVA_VERSION=jdk-11.0.20+8
                        
# 2023-08-09 03:20:25  45.78MB 
/bin/sh -c apt-get update     && DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends tzdata curl wget ca-certificates fontconfig locales p11-kit     && echo "en_US.UTF-8 UTF-8" >> /etc/locale.gen     && locale-gen en_US.UTF-8     && rm -rf /var/lib/apt/lists/*
                        
# 2023-08-03 10:34:40  0.00B 
/bin/sh -c #(nop)  ENV LANG=en_US.UTF-8 LANGUAGE=en_US:en LC_ALL=en_US.UTF-8
                        
# 2023-08-03 10:34:40  0.00B 
/bin/sh -c #(nop)  ENV PATH=/opt/java/openjdk/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2023-08-03 10:34:40  0.00B 
/bin/sh -c #(nop)  ENV JAVA_HOME=/opt/java/openjdk
                        
# 2023-08-01 14:16:46  0.00B 
/bin/sh -c #(nop)  CMD ["/bin/bash"]
                        
# 2023-08-01 14:16:45  72.79MB 
/bin/sh -c #(nop) ADD file:233702cd816c07bc9fed02881b11fb3bdcaee41f3ce3ec1c9f0c4a060b155d5b in / 
                        
# 2023-08-01 14:16:44  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.version=20.04
                        
# 2023-08-01 14:16:44  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.ref.name=ubuntu
                        
# 2023-08-01 14:16:44  0.00B 
/bin/sh -c #(nop)  ARG LAUNCHPAD_BUILD_ARCH
                        
# 2023-08-01 14:16:43  0.00B 
/bin/sh -c #(nop)  ARG RELEASE
                        
                    

Image info

{
    "Id": "sha256:f00ba10a5676bf2d02ea10a73e244f57140cb86a6299a2dfee9795ab1c175443",
    "RepoTags": [
        "apache/spark:3.3.3",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.3.3"
    ],
    "RepoDigests": [
        "apache/spark@sha256:df1f30f4e33ae553f5af5a7f1b349851b0316895c260637378bd203eb68a2234",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark@sha256:1f29eb4bb11e24bd60c16ce3c89ebbc9855bc3c22dbec0f0bf9dd9b62f98698a"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2023-08-22T10:18:27.458046245Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "spark",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/opt/java/openjdk/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "JAVA_HOME=/opt/java/openjdk",
            "LANG=en_US.UTF-8",
            "LANGUAGE=en_US:en",
            "LC_ALL=en_US.UTF-8",
            "JAVA_VERSION=jdk-11.0.20+8",
            "SPARK_TGZ_URL=https://archive.apache.org/dist/spark/spark-3.3.3/spark-3.3.3-bin-hadoop3.tgz",
            "SPARK_TGZ_ASC_URL=https://archive.apache.org/dist/spark/spark-3.3.3/spark-3.3.3-bin-hadoop3.tgz.asc",
            "GPG_KEY=F6468A4FF8377B4F1C07BC2AA077F928A0BF68D8",
            "SPARK_HOME=/opt/spark"
        ],
        "Cmd": null,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "/opt/spark/work-dir",
        "Entrypoint": [
            "/opt/entrypoint.sh"
        ],
        "OnBuild": null,
        "Labels": {
            "org.opencontainers.image.ref.name": "ubuntu",
            "org.opencontainers.image.version": "20.04"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 939314036,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/9ce9bdc567bfbf4ab28f017b5677b9b74fdbfef8b98c9a1343b8c1f7466574c6/diff:/var/lib/docker/overlay2/ca4c00ac9adb80849ad8e55fe9e53ccb84d4319cbf80cb1daf8f25205fb05fa0/diff:/var/lib/docker/overlay2/379dedd26dfdf8b39aecdbe57c106f8063d68db0de82cd72819957ad55a41542/diff:/var/lib/docker/overlay2/f20a438b78f4c8595761f3ab7975bfe92ca682bf0b8a157aeb2097afc65fbebc/diff:/var/lib/docker/overlay2/4571b00540f19508717264a27ca8da909d5c59d033d18e69eb44d1473275648f/diff:/var/lib/docker/overlay2/efe6b667d1ab9b80918e86176bd24e9e9bc8c0c4950c83c19fbab6ec2825540a/diff:/var/lib/docker/overlay2/0a3b08722aa524e4d31c591e56d74dc970e8e9a0e39c767ac2ac5e0847d745f3/diff:/var/lib/docker/overlay2/96de4d4fc11e435926e12a685910969ac5927c5fe406bedae07f17bbe8e8ab6e/diff:/var/lib/docker/overlay2/7f370514ba7f543738e6b94647ee38bef911513b7365542aacd5044d3c141ee4/diff:/var/lib/docker/overlay2/4b2b8d2ade56584c9a2974512fdef770b7bbf844f4e5a74dc3cc33cb29abc606/diff",
            "MergedDir": "/var/lib/docker/overlay2/c892c60603ab2a363a389fbc85d44ff39e72136b24eabbd6aef2ed38af1bade6/merged",
            "UpperDir": "/var/lib/docker/overlay2/c892c60603ab2a363a389fbc85d44ff39e72136b24eabbd6aef2ed38af1bade6/diff",
            "WorkDir": "/var/lib/docker/overlay2/c892c60603ab2a363a389fbc85d44ff39e72136b24eabbd6aef2ed38af1bade6/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:954c82bdeb5fcc80094317528fa3fcbb1026aeff64f872527d35ec9b4343b84d",
            "sha256:0b4923f16514fbdf96c8a0ca9eec9282449776e30840066e2a848a0fe324de5f",
            "sha256:2802c81551d24cf028479e0c6ae3d992740a00c0b71f29eaa2d40974f52fc020",
            "sha256:c549c0edaca7e17eb5cb3bed8081e9b2ee92382ec8a24f7957fc541b4a5e5bde",
            "sha256:60e89035d98776a784cfcf9868857f63e7137933a4990fa42f9cc4b5bfc49304",
            "sha256:733ab64c25c95033c4c3df157f659f3b2bfb29b26f6291a536ced0c123b51f12",
            "sha256:d85730979894ce324e98c7be61011780be2a7c38fa7578304ba16a8baad23eb5",
            "sha256:421393740affb9bc1d1e3ebefac9d20949910af140351e846b90980727949827",
            "sha256:94beda9f2ced5b24a136ce40d56e196d8d69c894a1c64f72d50fc0d329f85c6e",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:d1c0cfcfb7159153ee45c76f0aa5ccceb024e11e380805e2067ea9bb670875d5"
        ]
    },
    "Metadata": {
        "LastTagTime": "2024-12-03T11:56:48.815641373+08:00"
    }
}
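
The JSON above follows the output format of docker image inspect; after pulling, a comparable dump (with host-specific fields such as GraphDriver differing) can be produced locally with:

docker image inspect swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark:3.3.3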

More versions

docker.io/apache/spark:3.4.4
linux/amd64 · docker.io · 974.48MB · 2024-10-29 01:26 · 111 views

docker.io/apache/spark:v3.2.3
linux/amd64 · docker.io · 612.37MB · 2024-11-11 16:11 · 74 views

docker.io/apache/spark:3.5.3
linux/amd64 · docker.io · 984.74MB · 2024-11-23 19:03 · 55 views

docker.io/apache/spark:3.5.3-java17
linux/amd64 · docker.io · 1.15GB · 2024-11-24 00:41 · 53 views

docker.io/apache/spark:3.5.3-scala2.12-java11-r-ubuntu
linux/amd64 · docker.io · 1.32GB · 2024-11-24 01:13 · 50 views

docker.io/apache/spark:3.5.3-scala2.12-java11-ubuntu
linux/amd64 · docker.io · 681.32MB · 2024-11-24 01:31 · 60 views

docker.io/apache/spark:3.5.3-scala2.12-java17-ubuntu
linux/amd64 · docker.io · 828.98MB · 2024-11-24 01:33 · 48 views

docker.io/apache/spark:3.3.3
linux/amd64 · docker.io · 939.31MB · 2024-12-03 11:56 · 54 views

docker.io/apache/spark-py:v3.1.3
linux/amd64 · docker.io · 886.30MB · 2024-12-11 19:44 · 47 views