docker.io/kubeflow/spark-operator:2.0.2 linux/amd64

docker.io/kubeflow/spark-operator:2.0.2 - China mirror download source
This is the Spark Operator image published by the Kubeflow project. Spark Operator lets you run and manage Apache Spark applications on a Kubernetes cluster: with this image you can conveniently deploy, scale, and monitor Spark jobs without interacting with the Kubernetes API directly.
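
Below is a minimal sketch of installing the operator from this mirror with Helm. The chart repository URL comes from the upstream project; the image.registry/image.repository/image.tag value keys are assumptions based on the kubeflow/spark-operator 2.x chart and may differ in other chart versions.

# Sketch only: verify the image.* value keys against your chart version's values.yaml
helm repo add spark-operator https://kubeflow.github.io/spark-operator
helm install spark-operator spark-operator/spark-operator \
  --namespace spark-operator --create-namespace \
  --set image.registry=swr.cn-north-4.myhuaweicloud.com \
  --set image.repository=ddn-k8s/docker.io/kubeflow/spark-operator \
  --set image.tag=2.0.2
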
Source image: docker.io/kubeflow/spark-operator:2.0.2
China mirror: swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.0.2
Image ID: sha256:59792962f4316f88dfefe2bda4d444c29061311e7e38844f13bd7f2796290a69
Image tag: 2.0.2
Size: 1.05GB
Registry: docker.io
Project info: Docker Hub page / project tags
CMD: (none)
Entrypoint: /usr/bin/entrypoint.sh
Working directory: /opt/spark/work-dir
OS/Arch: linux/amd64
Image created: 2024-10-11T01:47:08.245071835Z
Synced: 2024-11-28 17:32
Updated: 2024-12-21 10:12
Environment variables

PATH=/opt/java/openjdk/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
JAVA_HOME=/opt/java/openjdk
LANG=en_US.UTF-8
LANGUAGE=en_US:en
LC_ALL=en_US.UTF-8
JAVA_VERSION=jdk-11.0.24+8
SPARK_TGZ_URL=https://archive.apache.org/dist/spark/spark-3.5.2/spark-3.5.2-bin-hadoop3.tgz
SPARK_TGZ_ASC_URL=https://archive.apache.org/dist/spark/spark-3.5.2/spark-3.5.2-bin-hadoop3.tgz.asc
GPG_KEY=D76E23B9F11B5BF6864613C4F7051850A0AF904D
SPARK_HOME=/opt/spark
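
To confirm these variables inside the image, one option is to override the entrypoint and print the environment (a sketch, assuming the mirror image has already been pulled locally):

# Print the image's environment without starting the operator binary
docker run --rm --entrypoint env \
  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.0.2
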
Image labels

org.opencontainers.image.created: 2024-10-11T01:45:46.390Z
org.opencontainers.image.description: Kubernetes operator for managing the lifecycle of Apache Spark applications on Kubernetes.
org.opencontainers.image.licenses: Apache-2.0
org.opencontainers.image.ref.name: ubuntu
org.opencontainers.image.revision: ef9a2a134b80f8c5368db53615d9aa766c67ad0a
org.opencontainers.image.source: https://github.com/kubeflow/spark-operator
org.opencontainers.image.title: spark-operator
org.opencontainers.image.url: https://github.com/kubeflow/spark-operator
org.opencontainers.image.version: 2.0.2
Image security scan (Trivy report)

OS: ubuntu 20.04    Scanner: Trivy    Scan time: 2024-11-28 17:32

Low: 131    Medium: 1059    High: 38    Critical: 5
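
To reproduce a similar scan locally (a sketch: Trivy must be installed, and the counts will drift as its vulnerability database updates):

# Scan the mirrored image; the severity filter is optional
trivy image --severity HIGH,CRITICAL \
  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.0.2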

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.0.2
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.0.2  docker.io/kubeflow/spark-operator:2.0.2
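
After pulling and re-tagging, you can check that the local image matches the image ID listed above (a sketch using standard docker commands):

# Should print sha256:59792962f4316f88dfefe2bda4d444c29061311e7e38844f13bd7f2796290a69
docker image inspect --format '{{.Id}}' docker.io/kubeflow/spark-operator:2.0.2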

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.0.2
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.0.2  docker.io/kubeflow/spark-operator:2.0.2
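
Note that ctr works in containerd's "default" namespace unless told otherwise, while the Kubernetes CRI uses the k8s.io namespace; a sketch that makes the image visible to kubelet on a node:

# Pull and tag inside the k8s.io namespace so Kubernetes can use the image
ctr -n k8s.io images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.0.2
ctr -n k8s.io images tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.0.2 docker.io/kubeflow/spark-operator:2.0.2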

Shell quick-replace command

sed -i 's#kubeflow/spark-operator:2.0.2#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.0.2#' deployment.yaml
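
The sed expression above rewrites a single deployment.yaml; a sketch that extends it to every manifest under a directory (./manifests is only an example path):

# Replace the image reference in all YAML files under ./manifests (example path)
find ./manifests -name '*.yaml' -exec \
  sed -i 's#kubeflow/spark-operator:2.0.2#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.0.2#g' {} +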

Ansible quick distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.0.2 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.0.2  docker.io/kubeflow/spark-operator:2.0.2'

Ansible quick distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.0.2 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.0.2  docker.io/kubeflow/spark-operator:2.0.2'
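
Both Ansible one-liners above are shipped commented out; to run them, drop the leading # and make sure a k8s host group exists in your inventory. A sketch, where hosts.ini is a hypothetical inventory path:

# [k8s]              <- example inventory group defined in hosts.ini (hypothetical)
# node1.example.com
# node2.example.com
ansible k8s -i hosts.ini -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.0.2 && docker tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.0.2 docker.io/kubeflow/spark-operator:2.0.2'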

Image history

Size       Created              Layer
0.00B 2024-10-11 09:47:08 ENTRYPOINT ["/usr/bin/entrypoint.sh"]
657.00B 2024-10-11 09:47:08 COPY entrypoint.sh /usr/bin/ # buildkit
64.60MB 2024-10-11 09:47:08 COPY /workspace/bin/spark-operator /usr/bin/spark-operator # buildkit
0.00B 2024-10-11 09:46:12 USER spark
0.00B 2024-10-11 09:46:12 RUN /bin/sh -c mkdir -p /etc/k8s-webhook-server/serving-certs /home/spark && chmod -R g+rw /etc/k8s-webhook-server/serving-certs && chown -R spark /etc/k8s-webhook-server/serving-certs /home/spark # buildkit
0.00B 2024-10-11 09:46:12 RUN /bin/sh -c apt-get update && apt-get install -y tini && rm -rf /var/lib/apt/lists/* # buildkit
0.00B 2024-10-11 09:46:12 USER root
0.00B 2024-08-12 17:09:28 USER spark
303.04MB 2024-08-12 17:09:28 RUN /bin/sh -c set -ex; apt-get update; apt-get install -y python3 python3-pip; rm -rf /var/lib/apt/lists/* # buildkit
0.00B 2024-08-12 17:09:28 USER root
0.00B 2024-08-12 17:09:28 ENTRYPOINT ["/opt/entrypoint.sh"]
0.00B 2024-08-12 17:09:28 USER spark
0.00B 2024-08-12 17:09:28 WORKDIR /opt/spark/work-dir
0.00B 2024-08-12 17:09:28 ENV SPARK_HOME=/opt/spark
4.74KB 2024-08-12 17:09:28 COPY entrypoint.sh /opt/ # buildkit
361.74MB 2024-08-12 17:09:28 RUN |1 spark_uid=185 /bin/sh -c set -ex; export SPARK_TMP="$(mktemp -d)"; cd $SPARK_TMP; wget -nv -O spark.tgz "$SPARK_TGZ_URL"; wget -nv -O spark.tgz.asc "$SPARK_TGZ_ASC_URL"; export GNUPGHOME="$(mktemp -d)"; gpg --batch --keyserver hkps://keys.openpgp.org --recv-key "$GPG_KEY" || gpg --batch --keyserver hkps://keyserver.ubuntu.com --recv-keys "$GPG_KEY"; gpg --batch --verify spark.tgz.asc spark.tgz; gpgconf --kill all; rm -rf "$GNUPGHOME" spark.tgz.asc; tar -xf spark.tgz --strip-components=1; chown -R spark:spark .; mv jars /opt/spark/; mv bin /opt/spark/; mv sbin /opt/spark/; mv kubernetes/dockerfiles/spark/decom.sh /opt/; mv examples /opt/spark/; mv kubernetes/tests /opt/spark/; mv data /opt/spark/; mv python/pyspark /opt/spark/python/pyspark/; mv python/lib /opt/spark/python/lib/; mv R /opt/spark/; chmod a+x /opt/decom.sh; cd ..; rm -rf "$SPARK_TMP"; # buildkit
0.00B 2024-08-12 17:09:28 ENV SPARK_TGZ_URL=https://archive.apache.org/dist/spark/spark-3.5.2/spark-3.5.2-bin-hadoop3.tgz SPARK_TGZ_ASC_URL=https://archive.apache.org/dist/spark/spark-3.5.2/spark-3.5.2-bin-hadoop3.tgz.asc GPG_KEY=D76E23B9F11B5BF6864613C4F7051850A0AF904D
57.88MB 2024-08-12 17:09:28 RUN |1 spark_uid=185 /bin/sh -c set -ex; apt-get update; apt-get install -y gnupg2 wget bash tini libc6 libpam-modules krb5-user libnss3 procps net-tools gosu libnss-wrapper; mkdir -p /opt/spark; mkdir /opt/spark/python; mkdir -p /opt/spark/examples; mkdir -p /opt/spark/work-dir; chmod g+w /opt/spark/work-dir; touch /opt/spark/RELEASE; chown -R spark:spark /opt/spark; echo "auth required pam_wheel.so use_uid" >> /etc/pam.d/su; rm -rf /var/lib/apt/lists/* # buildkit
64.84KB 2024-08-12 17:09:28 RUN |1 spark_uid=185 /bin/sh -c groupadd --system --gid=${spark_uid} spark && useradd --system --uid=${spark_uid} --gid=spark spark # buildkit
0.00B 2024-08-12 17:09:28 ARG spark_uid=185
0.00B 2024-08-12 17:09:28 ENTRYPOINT ["/__cacert_entrypoint.sh"]
4.74KB 2024-08-12 17:09:28 COPY --chmod=755 entrypoint.sh /__cacert_entrypoint.sh # buildkit
0.00B 2024-08-12 17:09:28 RUN /bin/sh -c set -eux; echo "Verifying install ..."; echo "java --version"; java --version; echo "Complete." # buildkit
140.96MB 2024-08-12 17:09:28 RUN /bin/sh -c set -eux; ARCH="$(dpkg --print-architecture)"; case "${ARCH}" in amd64) ESUM='e0c1938093da3780e4494d366a4e6b75584dde8d46a19acea6691ae11df4cda5'; BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.24%2B8/OpenJDK11U-jre_x64_linux_hotspot_11.0.24_8.tar.gz'; ;; arm64) ESUM='1fe97cdaad47d7d108f329c6e4560b46748ef7f2948a1027812ade0bbc2a3597'; BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.24%2B8/OpenJDK11U-jre_aarch64_linux_hotspot_11.0.24_8.tar.gz'; ;; armhf) ESUM='bf893085627c6ec484e63aa1290276b23bcfee547459da6b0432ae9c5c1be22a'; BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.24%2B8/OpenJDK11U-jre_arm_linux_hotspot_11.0.24_8.tar.gz'; ;; ppc64el) ESUM='8ee351314182df93fbad96139bb74b97814944d66197896e388404a1ecfa06b3'; BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.24%2B8/OpenJDK11U-jre_ppc64le_linux_hotspot_11.0.24_8.tar.gz'; ;; s390x) ESUM='5b331f093bb03126334bbbc24f05f60681baeda461d860e4e2cdb693ee54e0ed'; BINARY_URL='https://github.com/adoptium/temurin11-binaries/releases/download/jdk-11.0.24%2B8/OpenJDK11U-jre_s390x_linux_hotspot_11.0.24_8.tar.gz'; ;; *) echo "Unsupported arch: ${ARCH}"; exit 1; ;; esac; wget --progress=dot:giga -O /tmp/openjdk.tar.gz ${BINARY_URL}; echo "${ESUM} */tmp/openjdk.tar.gz" | sha256sum -c -; mkdir -p "$JAVA_HOME"; tar --extract --file /tmp/openjdk.tar.gz --directory "$JAVA_HOME" --strip-components 1 --no-same-owner ; rm -f /tmp/openjdk.tar.gz ${JAVA_HOME}/lib/src.zip; find "$JAVA_HOME/lib" -name '*.so' -exec dirname '{}' ';' | sort -u > /etc/ld.so.conf.d/docker-openjdk.conf; ldconfig; java -Xshare:dump; # buildkit
0.00B 2024-08-12 17:09:28 ENV JAVA_VERSION=jdk-11.0.24+8
45.82MB 2024-08-12 17:09:28 RUN /bin/sh -c set -eux; apt-get update; DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends curl wget fontconfig ca-certificates p11-kit tzdata locales ; echo "en_US.UTF-8 UTF-8" >> /etc/locale.gen; locale-gen en_US.UTF-8; rm -rf /var/lib/apt/lists/* # buildkit
0.00B 2024-08-12 17:09:28 ENV LANG=en_US.UTF-8 LANGUAGE=en_US:en LC_ALL=en_US.UTF-8
0.00B 2024-08-12 17:09:28 ENV PATH=/opt/java/openjdk/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
0.00B 2024-08-12 17:09:28 ENV JAVA_HOME=/opt/java/openjdk
0.00B 2024-08-12 17:09:28 /bin/sh -c #(nop) CMD ["/bin/bash"]
72.81MB 2024-08-12 17:09:28 /bin/sh -c #(nop) ADD file:6a209aa51ba684c0a39769619c42058ca99311b87563c7b079319a8bb91bec1f in /
0.00B 2024-08-12 17:09:28 /bin/sh -c #(nop) LABEL org.opencontainers.image.version=20.04
0.00B 2024-08-12 17:09:28 /bin/sh -c #(nop) LABEL org.opencontainers.image.ref.name=ubuntu
0.00B 2024-08-12 17:09:28 /bin/sh -c #(nop) ARG LAUNCHPAD_BUILD_ARCH
0.00B 2024-08-12 17:09:28 /bin/sh -c #(nop) ARG RELEASE
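
The layer listing above matches what docker history reports; to reproduce it locally against the re-tagged image (a sketch):

# Show layer sizes and the instructions that created them, without truncation
docker history --no-trunc docker.io/kubeflow/spark-operator:2.0.2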

Image info

{
    "Id": "sha256:59792962f4316f88dfefe2bda4d444c29061311e7e38844f13bd7f2796290a69",
    "RepoTags": [
        "kubeflow/spark-operator:2.0.2",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator:2.0.2"
    ],
    "RepoDigests": [
        "kubeflow/spark-operator@sha256:39907913e188632033b002331d36529b43b911d8c479d9aec25d4dc91c394dbe",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/kubeflow/spark-operator@sha256:2e8bdb5970f6e67cc7fe68c5a41670d0418788688ea57c48db8f467f236d464d"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2024-10-11T01:47:08.245071835Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "spark",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/opt/java/openjdk/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "JAVA_HOME=/opt/java/openjdk",
            "LANG=en_US.UTF-8",
            "LANGUAGE=en_US:en",
            "LC_ALL=en_US.UTF-8",
            "JAVA_VERSION=jdk-11.0.24+8",
            "SPARK_TGZ_URL=https://archive.apache.org/dist/spark/spark-3.5.2/spark-3.5.2-bin-hadoop3.tgz",
            "SPARK_TGZ_ASC_URL=https://archive.apache.org/dist/spark/spark-3.5.2/spark-3.5.2-bin-hadoop3.tgz.asc",
            "GPG_KEY=D76E23B9F11B5BF6864613C4F7051850A0AF904D",
            "SPARK_HOME=/opt/spark"
        ],
        "Cmd": null,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "/opt/spark/work-dir",
        "Entrypoint": [
            "/usr/bin/entrypoint.sh"
        ],
        "OnBuild": null,
        "Labels": {
            "org.opencontainers.image.created": "2024-10-11T01:45:46.390Z",
            "org.opencontainers.image.description": "Kubernetes operator for managing the lifecycle of Apache Spark applications on Kubernetes.",
            "org.opencontainers.image.licenses": "Apache-2.0",
            "org.opencontainers.image.ref.name": "ubuntu",
            "org.opencontainers.image.revision": "ef9a2a134b80f8c5368db53615d9aa766c67ad0a",
            "org.opencontainers.image.source": "https://github.com/kubeflow/spark-operator",
            "org.opencontainers.image.title": "spark-operator",
            "org.opencontainers.image.url": "https://github.com/kubeflow/spark-operator",
            "org.opencontainers.image.version": "2.0.2"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 1046930976,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/8f91574a4a0954dcbc76d72b2afcaa8779b4987ce4864107e8d231c5eb33464b/diff:/var/lib/docker/overlay2/3624c105b35a4dbf7a1149f36da423f17d3871a7cfd00b3e0234c4b5fae59e0b/diff:/var/lib/docker/overlay2/cd952f239d4999077983ea985a861dc67e26d4d890978b0b5fdf62f69cd805ec/diff:/var/lib/docker/overlay2/01c573e3e23d5fb705496465edd1765ced039e278152cc397284c471c7281fe5/diff:/var/lib/docker/overlay2/d32fa9ab6d720e8078eea2e11aae64a7b5bea12075ec5b58496b7ab7df7708a6/diff:/var/lib/docker/overlay2/31929b37e921f6fcf0f88eb3d6fe99b64e716deb856f6b94894866c6fec375a2/diff:/var/lib/docker/overlay2/4c531ce751d3ccee8d48f9f5b8a52731ef220a55a9582865a8fb72288c5aa339/diff:/var/lib/docker/overlay2/398904af131ea12b7e47cc6e272b4181427196a09abb93ade8d50824522ca4e6/diff:/var/lib/docker/overlay2/ff89a4a58be49508b3ed964edbab52ed0d6c0fa2ddbda34ba31c47b54280d3f6/diff:/var/lib/docker/overlay2/e9611645ff35e26e1958ea56a4ff49719d48cbb86e8654a69d55ce6153c3c0b0/diff:/var/lib/docker/overlay2/d052a35a489c8b01146d8d149a7e2812dbfc55469139a0ad5cadde29391a3926/diff:/var/lib/docker/overlay2/1280ffe88e6d387d58fa527a129ed37ccd02024a1413efc92cf6adbe34c9b834/diff:/var/lib/docker/overlay2/57d0f1c81696d4390215f6a1087ecddb0cfb807739d99e61608ea8c1170918f3/diff:/var/lib/docker/overlay2/e03e767a68eea9e72e887a3a62851395c231aaec3e6c60c06d211adee3e60043/diff",
            "MergedDir": "/var/lib/docker/overlay2/e7d7716c91d9d1abb9fc09c665fc6298e1fbd55a73d92eb77a3be3994ab70744/merged",
            "UpperDir": "/var/lib/docker/overlay2/e7d7716c91d9d1abb9fc09c665fc6298e1fbd55a73d92eb77a3be3994ab70744/diff",
            "WorkDir": "/var/lib/docker/overlay2/e7d7716c91d9d1abb9fc09c665fc6298e1fbd55a73d92eb77a3be3994ab70744/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:20eafb43e177262767f62cacb9ee87b70eae2d0804c03b5f6f387ca9e64c70f9",
            "sha256:8b24e3f33d17387933d58c44cf7aab94f1b9022bd09dabd9246cc999ae2f938b",
            "sha256:3614f92a09e3016ffac890dada7847ff35e8f58922d7638f959a6631cb5ffbf2",
            "sha256:6cdd75f7488965ed11263a5dcec1c8dce94b1c2d028432b141d79e9b35527a8c",
            "sha256:f1fd336d6d574a3318d65fa6d6c66aef36e9db8479402290bd0548f48e81e87e",
            "sha256:07c58db2707955e366d40b5acc77d89104477fcf64d2e95b0a720e8a5089b4de",
            "sha256:abec6948c0a9fc6129cf0cb5dc4696a0e5dbfdd95eb4487285394b0ca1d2954e",
            "sha256:a2d9b5794067f8a2ccd4fffda8a32738b638b5007ba0efe777e5c409ed692f8e",
            "sha256:d67e06579df2ef180f45c964fccd5ce6d285bce6f232d4d5b75e4c8ae288d25a",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:1d464edf4e899a24b207481754f9268dcc2744f9bf898abaac9cb17054028a38",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:97f2b3fbe69497d77fa246ae85bda9f949f75849912e89558a02488580f14cac",
            "sha256:ff17ad681be7f17932b74c054b524e61b5f4ac828772e90b26fd0dadeb821431",
            "sha256:938d3a4375656a33607c6f1b67a83e2b496f90237b0759e7102c4a13ed2ae630"
        ]
    },
    "Metadata": {
        "LastTagTime": "2024-11-28T17:30:53.240048853+08:00"
    }
}
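
The JSON above is docker image inspect output; to regenerate it against the locally re-tagged image (a sketch):

# Dump the full image configuration as JSON
docker image inspect docker.io/kubeflow/spark-operator:2.0.2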

More versions

docker.io/kubeflow/spark-operator:2.0.2    linux/amd64    docker.io    1.05GB    2024-11-28 17:32