docker.io/apache/spark-py:v3.1.3 linux/amd64

docker.io/apache/spark-py:v3.1.3 - China mirror download source · Verified publisher: apache
This Docker image bundles Apache Spark together with its Python API (PySpark). Spark and the Python bindings come pre-configured, so users can run Python-based Spark applications by creating a container directly from this image, with no need to install or configure a Spark environment themselves.
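For a quick smoke test, the image ships Spark's example programs under /opt/spark/examples (see the COPY steps in the build history below); a sketch that runs the bundled PySpark Pi example in local mode, assuming a working Docker daemon and skipping otherwise:

```shell
# Run the bundled PySpark Pi example with spark-submit in local mode.
# The entrypoint passes unrecognized commands through, so spark-submit
# runs directly. No-op when Docker is not installed.
if command -v docker >/dev/null 2>&1; then
  docker run --rm docker.io/apache/spark-py:v3.1.3 \
    /opt/spark/bin/spark-submit --master 'local[2]' \
    /opt/spark/examples/src/main/python/pi.py 10
else
  echo "docker not found; skipping"
fi
```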
Source image: docker.io/apache/spark-py:v3.1.3
Mirror image: swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark-py:v3.1.3
Image ID: sha256:2308916c7e213b0b2d85655ab2f154154f9a489d16c72076e961853c86027e74
Image tag: v3.1.3
Size: 886.30MB
Registry: docker.io
Project info: Docker Hub page / project tags
CMD: (not set)
Entrypoint: /opt/entrypoint.sh
Working directory: /opt/spark/work-dir
OS/Arch: linux/amd64
Contributor: 13*******7@qq.com
Image created: 2022-02-21T19:47:34.775420112Z
Synced: 2024-12-11 19:44
Updated: 2024-12-26 11:46
Environment variables:
PATH=/usr/local/openjdk-11/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin JAVA_HOME=/usr/local/openjdk-11 LANG=C.UTF-8 JAVA_VERSION=11.0.14.1 SPARK_HOME=/opt/spark
Image security scan: Trivy scan report

OS: debian 11.2 · Scan engine: Trivy · Scanned: 2024-12-11 19:46

Vulnerabilities — Low: 604 · Medium: 2235 · High: 868 · Critical: 97

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark-py:v3.1.3
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark-py:v3.1.3  docker.io/apache/spark-py:v3.1.3
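The pull-then-retag pattern above generalizes to any upstream image: the mirror reference is simply the upstream one prefixed with the swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ repository path (inferred from the commands above). A minimal helper to derive the mirror name:

```shell
# Derive the mirror reference for an upstream image reference.
# The ddn-k8s prefix is taken from the pull commands above.
mirror_ref() {
  printf 'swr.cn-north-4.myhuaweicloud.com/ddn-k8s/%s\n' "$1"
}

mirror_ref "docker.io/apache/spark-py:v3.1.3"
# → swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark-py:v3.1.3
```

Retagging back to the upstream name (as in the `docker tag` line above) lets existing manifests that reference `docker.io/apache/spark-py:v3.1.3` keep working unchanged.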

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark-py:v3.1.3
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark-py:v3.1.3  docker.io/apache/spark-py:v3.1.3
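Note for Kubernetes nodes: the kubelet pulls images through containerd's CRI, which stores them in the `k8s.io` namespace, while plain `ctr` defaults to the `default` namespace, so images pulled as above may not be visible to pods. A guarded sketch (no-op on machines without `ctr`):

```shell
# Pull and retag inside containerd's k8s.io namespace so the kubelet
# can see the image; skip silently when ctr is not installed.
if command -v ctr >/dev/null 2>&1; then
  ctr -n k8s.io images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark-py:v3.1.3
  ctr -n k8s.io images tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark-py:v3.1.3 docker.io/apache/spark-py:v3.1.3
else
  echo "ctr not found; skipping"
fi
```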

Shell quick-replace command

sed -i 's#apache/spark-py:v3.1.3#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark-py:v3.1.3#' deployment.yaml
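The single-file substitution above extends to a whole manifest tree with `find`; a sketch under the assumption that manifests live under the current directory and use the `.yaml` extension (adjust the search root and glob to your layout; GNU `sed -i` syntax, BSD/macOS sed needs `-i ''`):

```shell
# Rewrite the image reference across every YAML manifest in the tree.
find . -name '*.yaml' -exec \
  sed -i 's#apache/spark-py:v3.1.3#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark-py:v3.1.3#' {} +
```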

Ansible quick distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark-py:v3.1.3 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark-py:v3.1.3  docker.io/apache/spark-py:v3.1.3'

Ansible quick distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark-py:v3.1.3 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark-py:v3.1.3  docker.io/apache/spark-py:v3.1.3'

Image build history


# 2022-02-22 03:47:34  0.00B Set the user the container runs as
USER 185

# 2022-02-22 03:47:34  0.00B Define a build argument
ARG spark_uid=185

# 2022-02-22 03:47:34  0.00B Set the command run when the container starts
ENTRYPOINT ["/opt/entrypoint.sh"]

# 2022-02-22 03:47:34  0.00B Set the working directory to /opt/spark/work-dir
WORKDIR /opt/spark/work-dir

# 2022-02-22 03:47:34  929.92KB Copy files into the container
COPY python/lib /opt/spark/python/lib # buildkit

# 2022-02-22 03:47:34  3.77MB Copy files into the container
COPY python/pyspark /opt/spark/python/pyspark # buildkit

# 2022-02-22 03:47:34  350.30MB Run a command, creating a new image layer
RUN /bin/sh -c apt-get update &&     apt install -y python3 python3-pip &&     pip3 install --upgrade pip setuptools &&     rm -r /root/.cache && rm -rf /var/cache/apt/* # buildkit

# 2022-02-22 03:46:57  0.00B Run a command, creating a new image layer
RUN /bin/sh -c mkdir ${SPARK_HOME}/python # buildkit

# 2022-02-22 03:46:57  0.00B Set the user the container runs as
USER 0

# 2022-02-22 03:46:57  0.00B Set the working directory to /
WORKDIR /

# 2022-02-22 03:45:53  0.00B Set the user the container runs as
USER 185

# 2022-02-22 03:45:53  0.00B Set the command run when the container starts
ENTRYPOINT ["/opt/entrypoint.sh"]

# 2022-02-22 03:45:53  0.00B Run a command, creating a new image layer
RUN |1 spark_uid=185 /bin/sh -c chmod a+x /opt/decom.sh # buildkit

# 2022-02-22 03:45:53  0.00B Run a command, creating a new image layer
RUN |1 spark_uid=185 /bin/sh -c chmod g+w /opt/spark/work-dir # buildkit

# 2022-02-22 03:45:53  0.00B Set the working directory to /opt/spark/work-dir
WORKDIR /opt/spark/work-dir

# 2022-02-22 03:45:53  0.00B Set the SPARK_HOME environment variable
ENV SPARK_HOME=/opt/spark

# 2022-02-22 03:45:53  968.74KB Copy files into the container
COPY data /opt/spark/data # buildkit

# 2022-02-22 03:45:53  11.27KB Copy files into the container
COPY kubernetes/tests /opt/spark/tests # buildkit

# 2022-02-22 03:45:53  3.05MB Copy files into the container
COPY examples /opt/spark/examples # buildkit

# 2022-02-22 03:45:53  1.28KB Copy files into the container
COPY kubernetes/dockerfiles/spark/decom.sh /opt/ # buildkit

# 2022-02-22 03:45:52  3.49KB Copy files into the container
COPY kubernetes/dockerfiles/spark/entrypoint.sh /opt/ # buildkit

# 2022-02-22 03:45:52  45.07KB Copy files into the container
COPY sbin /opt/spark/sbin # buildkit

# 2022-02-22 03:45:52  53.79KB Copy files into the container
COPY bin /opt/spark/bin # buildkit

# 2022-02-22 03:45:52  230.24MB Copy files into the container
COPY jars /opt/spark/jars # buildkit

# 2022-02-22 03:45:51  69.01MB Run a command, creating a new image layer
RUN |1 spark_uid=185 /bin/sh -c set -ex &&     sed -i 's/http:\/\/deb.\(.*\)/https:\/\/deb.\1/g' /etc/apt/sources.list &&     apt-get update &&     ln -s /lib /lib64 &&     apt install -y bash tini libc6 libpam-modules krb5-user libnss3 procps &&     mkdir -p /opt/spark &&     mkdir -p /opt/spark/examples &&     mkdir -p /opt/spark/work-dir &&     touch /opt/spark/RELEASE &&     rm /bin/sh &&     ln -sv /bin/bash /bin/sh &&     echo "auth required pam_wheel.so use_uid" >> /etc/pam.d/su &&     chgrp root /etc/passwd && chmod ug+rw /etc/passwd &&     rm -rf /var/cache/apt/* # buildkit

# 2022-02-22 03:45:51  0.00B Define a build argument
ARG spark_uid=185
                        
# 2022-02-12 08:02:43  142.65MB 
/bin/sh -c set -eux; 		arch="$(dpkg --print-architecture)"; 	case "$arch" in 		'amd64') 			downloadUrl='https://github.com/AdoptOpenJDK/openjdk11-upstream-binaries/releases/download/jdk-11.0.14.1%2B1/OpenJDK11U-jre_x64_linux_11.0.14.1_1.tar.gz'; 			;; 		'arm64') 			downloadUrl='https://github.com/AdoptOpenJDK/openjdk11-upstream-binaries/releases/download/jdk-11.0.14.1%2B1/OpenJDK11U-jre_aarch64_linux_11.0.14.1_1.tar.gz'; 			;; 		*) echo >&2 "error: unsupported architecture: '$arch'"; exit 1 ;; 	esac; 		savedAptMark="$(apt-mark showmanual)"; 	apt-get update; 	apt-get install -y --no-install-recommends 		dirmngr 		gnupg 		wget 	; 	rm -rf /var/lib/apt/lists/*; 		wget --progress=dot:giga -O openjdk.tgz "$downloadUrl"; 	wget --progress=dot:giga -O openjdk.tgz.asc "$downloadUrl.sign"; 		export GNUPGHOME="$(mktemp -d)"; 	gpg --batch --keyserver keyserver.ubuntu.com --recv-keys EAC843EBD3EFDB98CC772FADA5CD6035332FA671; 	gpg --batch --keyserver keyserver.ubuntu.com --keyserver-options no-self-sigs-only --recv-keys CA5F11C6CE22644D42C6AC4492EF8D39DC13168F; 	gpg --batch --list-sigs --keyid-format 0xLONG CA5F11C6CE22644D42C6AC4492EF8D39DC13168F 		| tee /dev/stderr 		| grep '0xA5CD6035332FA671' 		| grep 'Andrew Haley'; 	gpg --batch --verify openjdk.tgz.asc openjdk.tgz; 	gpgconf --kill all; 	rm -rf "$GNUPGHOME"; 		mkdir -p "$JAVA_HOME"; 	tar --extract 		--file openjdk.tgz 		--directory "$JAVA_HOME" 		--strip-components 1 		--no-same-owner 	; 	rm openjdk.tgz*; 		apt-mark auto '.*' > /dev/null; 	[ -z "$savedAptMark" ] || apt-mark manual $savedAptMark > /dev/null; 	apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false; 		{ 		echo '#!/usr/bin/env bash'; 		echo 'set -Eeuo pipefail'; 		echo 'trust extract --overwrite --format=java-cacerts --filter=ca-anchors --purpose=server-auth "$JAVA_HOME/lib/security/cacerts"'; 	} > /etc/ca-certificates/update.d/docker-openjdk; 	chmod +x /etc/ca-certificates/update.d/docker-openjdk; 	
/etc/ca-certificates/update.d/docker-openjdk; 		find "$JAVA_HOME/lib" -name '*.so' -exec dirname '{}' ';' | sort -u > /etc/ld.so.conf.d/docker-openjdk.conf; 	ldconfig; 		java -Xshare:dump; 		java --version
                        
# 2022-02-12 07:58:04  0.00B 
/bin/sh -c #(nop)  ENV JAVA_VERSION=11.0.14.1
                        
# 2022-01-26 17:25:48  0.00B 
/bin/sh -c #(nop)  ENV LANG=C.UTF-8
                        
# 2022-01-26 17:25:47  0.00B 
/bin/sh -c #(nop)  ENV PATH=/usr/local/openjdk-11/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2022-01-26 17:25:47  27.00B 
/bin/sh -c { echo '#/bin/sh'; echo 'echo "$JAVA_HOME"'; } > /usr/local/bin/docker-java-home && chmod +x /usr/local/bin/docker-java-home && [ "$JAVA_HOME" = "$(docker-java-home)" ] # backwards compatibility
                        
# 2022-01-26 17:25:46  0.00B 
/bin/sh -c #(nop)  ENV JAVA_HOME=/usr/local/openjdk-11
                        
# 2022-01-26 17:18:16  4.88MB 
/bin/sh -c set -eux; 	apt-get update; 	apt-get install -y --no-install-recommends 		ca-certificates p11-kit 	; 	rm -rf /var/lib/apt/lists/*
                        
# 2022-01-26 09:40:36  0.00B 
/bin/sh -c #(nop)  CMD ["bash"]
                        
# 2022-01-26 09:40:35  80.39MB 
/bin/sh -c #(nop) ADD file:90495c24c897ec47982e200f732f8be3109fcd791691ddffae0756898f91024f in / 
                        
                    

Image information (docker inspect)

{
    "Id": "sha256:2308916c7e213b0b2d85655ab2f154154f9a489d16c72076e961853c86027e74",
    "RepoTags": [
        "apache/spark-py:v3.1.3",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark-py:v3.1.3"
    ],
    "RepoDigests": [
        "apache/spark-py@sha256:a9f097faaa1c628b15d0f01f507de195d6c0745da4132903b6fdc9d88da8fc5d",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/apache/spark-py@sha256:3e46e4151e4dcac1b7ec0edf53c11be314848b56046cc221870fc74ffa846e6f"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2022-02-21T19:47:34.775420112Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "185",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/usr/local/openjdk-11/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "JAVA_HOME=/usr/local/openjdk-11",
            "LANG=C.UTF-8",
            "JAVA_VERSION=11.0.14.1",
            "SPARK_HOME=/opt/spark"
        ],
        "Cmd": null,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "/opt/spark/work-dir",
        "Entrypoint": [
            "/opt/entrypoint.sh"
        ],
        "OnBuild": null,
        "Labels": null
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 886298260,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/d7f624435c1042200db4c8c06ee51ad2ef74187ec09098bb9428f2191feea5fc/diff:/var/lib/docker/overlay2/7dbb2115ef2735410843e1f868c385e5a690d0139c4002093f9109b0a9916527/diff:/var/lib/docker/overlay2/6246f9d17241c644bb99ca55d72d0438b1f62e9bf3dfe8701afb1f9f9364dc0f/diff:/var/lib/docker/overlay2/2b64345f91726c44639886f9cf87a08e4dca25ce77954b96ab263a58f1d6a92f/diff:/var/lib/docker/overlay2/128fe367fc6bb1f2820acad96b5964ab8998b5d7c7db0195238f939a9336e5ec/diff:/var/lib/docker/overlay2/1eaf4abf1eafc569eac5f533135ad2238e87fc7230ab7eecbad9c35704af0676/diff:/var/lib/docker/overlay2/152651eea254731ebb070078e2d31ca0a659c187056f8fff404fd7395b41ab09/diff:/var/lib/docker/overlay2/e88a7c7b82234f727faec077865fc80339303d2861e0bf4815a090624859e9e9/diff:/var/lib/docker/overlay2/7e13950c71121b1f06ac69b58410cd2089a888a4df2a234fe48cced5a1bb2891/diff:/var/lib/docker/overlay2/8e1ce71ca94a5503f43277daad0dca3e32d1b29edec51ac50fe06c90b6532cbd/diff:/var/lib/docker/overlay2/1e63fb787dcc8a8f2f961901dfc576b69ac48f00945de8de788a36dbf8d1c35d/diff:/var/lib/docker/overlay2/bbde743fdb4ed5dd59533954c8ad9124e52f6b19d75e915260b162a2732aede6/diff:/var/lib/docker/overlay2/4d0ba6a96ade84431e7e732f1e66473aabf3ff0f9e84b2f794d04f288f92a52d/diff:/var/lib/docker/overlay2/663e3631ba40ab9d4048fe0bdf649e8e053fe3eac9f4849306c43e0f5bfcf7d8/diff:/var/lib/docker/overlay2/9a2f5c3420879319d2ea8c6cc54c8e1332cfc683aa721bc1ce675c689f326524/diff:/var/lib/docker/overlay2/c9e29ed0e13240f1a717af68eb17a0770c79ce308837560ff9896de577450c26/diff:/var/lib/docker/overlay2/4ee71878652fc9dd9812ded97fdbbaa4545c64f619fa0101262925a5158c67b6/diff",
            "MergedDir": "/var/lib/docker/overlay2/81164af707f7d9c07153b742d5f897ff6bf36526b0225143847246bb387864e8/merged",
            "UpperDir": "/var/lib/docker/overlay2/81164af707f7d9c07153b742d5f897ff6bf36526b0225143847246bb387864e8/diff",
            "WorkDir": "/var/lib/docker/overlay2/81164af707f7d9c07153b742d5f897ff6bf36526b0225143847246bb387864e8/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:7d0ebbe3f5d26c1b5ec4d5dbb6fe3205d7061f9735080b0162d550530328abd6",
            "sha256:7da834c1ebd33c97e3e26c8de9b5ee92e542ec3e020789f587dac23b363a8d83",
            "sha256:20f064be7fc07e0a5df9776325c2bfe21c4472c6e70924fe76e797ebbafd7776",
            "sha256:2fd9f728b7a7103890df84c95a3c7f880848af1d74c5190d2a8d39bff533c36f",
            "sha256:07bc0f91d0014163213220844bcb93e381cb209c07febe1c3163843310fa867f",
            "sha256:9f505b282ce4db2fd20d94507c6c036c26063b8fdbf76768bf1e71e94952b324",
            "sha256:48f204c8837814fb5fa103f25968705eaedce38d0f9f7b82118cc623815c18f2",
            "sha256:ddbb6da6ad6ec7567e15318e6502feb22da9d25ad7cb7acd39a37401c555ca07",
            "sha256:fe4b661571064f69d9cd5bfaa122ebe6458bae6e2f3137a6c898d4d4ba990a6b",
            "sha256:fe1fcd47e77a04b977f4882f7beaffb057a6b71b249572b7c6ae9270afe94813",
            "sha256:823c32fbff6b736c2519680bb42f6e15250eec09c85e885e24a45a53a623f093",
            "sha256:81130b6b76323914d010fb9255525814180bab69f356a7e07bb7533e6679fc49",
            "sha256:9eacb60f3f8c385ed6bae73e98f519246d3a7c683251f205be622c5ac51e4443",
            "sha256:786225448ba0348144fb53220644b4b3d6237fa493c3eefe2b88ac8d4f93ef8f",
            "sha256:d8ece09b95f894b4e229717ba67ee5e932813978f984e86884d8c487972676d6",
            "sha256:67c48ef82c6357a78745ac33392b1a38e33a836304379e4a66aa5f4fb196ebdb",
            "sha256:dda9c3a498b1ad5cae054a2ec4a25d7c36e9c4514aae731407f906fddf34019f",
            "sha256:575daefc7aaa5dd820ad88b2cb9e1b8051931033b9f5401de24130a86ac94609"
        ]
    },
    "Metadata": {
        "LastTagTime": "2024-12-11T19:43:31.031001579+08:00"
    }
}

More versions

docker.io/apache/spark-py:v3.1.3

linux/amd64 | docker.io | 886.30MB | synced 2024-12-11 19:44