docker.io/acryldata/datahub-kafka-setup:edb9a87 linux/amd64

docker.io/acryldata/datahub-kafka-setup:edb9a87 - China-hosted download mirror
DataHub Kafka Setup: a Docker container image used to set up and configure Apache Kafka for a DataHub deployment (its kafka-setup.sh creates the Kafka topics DataHub expects).
Source image    docker.io/acryldata/datahub-kafka-setup:edb9a87
Mirror image    swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:edb9a87
Image ID        sha256:e02d38323695ff5d712320acc6669eabb927a05c556bb6bc48e10be4c36e385d
Image tag       edb9a87
Size            604.05MB
Registry        docker.io
CMD /bin/sh -c ./kafka-setup.sh
Entrypoint      (none)
Working dir     /opt/kafka
OS/Platform     linux/amd64
Contributor     10*******2@qq.com
Image created   2024-08-15T22:22:00.947738483Z
Synced          2024-08-18 11:52
Updated         2025-02-22 00:53
Environment variables

PATH=/sbin:/opt/kafka/bin/:/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
LANG=C.UTF-8
GPG_KEY=7169605F62C751356D054A26A821E680E5FA6305
PYTHON_VERSION=3.12.5
PYTHON_PIP_VERSION=24.2
PYTHON_GET_PIP_URL=https://github.com/pypa/get-pip/raw/66d8a0f637083e2c3ddffc0cb1e65ce126afb856/public/get-pip.py
PYTHON_GET_PIP_SHA256=6fb7b781206356f45ad79efbb19322caa6c2a5ad39092d0d44d0fec94117e118
KAFKA_VERSION=3.7.0
SCALA_VERSION=2.13
METADATA_AUDIT_EVENT_NAME=MetadataAuditEvent_v4
METADATA_CHANGE_EVENT_NAME=MetadataChangeEvent_v4
FAILED_METADATA_CHANGE_EVENT_NAME=FailedMetadataChangeEvent_v4
DATAHUB_USAGE_EVENT_NAME=DataHubUsageEvent_v1
METADATA_CHANGE_LOG_VERSIONED_TOPIC_NAME=MetadataChangeLog_Versioned_v1
METADATA_CHANGE_LOG_TIMESERIES_TOPIC_NAME=MetadataChangeLog_Timeseries_v1
METADATA_CHANGE_PROPOSAL_TOPIC_NAME=MetadataChangeProposal_v1
FAILED_METADATA_CHANGE_PROPOSAL_TOPIC_NAME=FailedMetadataChangeProposal_v1
PLATFORM_EVENT_TOPIC_NAME=PlatformEvent_v1
DATAHUB_UPGRADE_HISTORY_TOPIC_NAME=DataHubUpgradeHistory_v1
USE_CONFLUENT_SCHEMA_REGISTRY=TRUE
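
Any of these can be overridden at run time with -e. A minimal sketch; note that KAFKA_BOOTSTRAP_SERVER is an assumed variable name taken from upstream DataHub compose files (it is not listed on this page), and broker:29092 is a placeholder address:

# KAFKA_BOOTSTRAP_SERVER is assumed, broker:29092 is a placeholder
docker run --rm \
  -e KAFKA_BOOTSTRAP_SERVER=broker:29092 \
  -e PLATFORM_EVENT_TOPIC_NAME=PlatformEvent_v1 \
  docker.io/acryldata/datahub-kafka-setup:edb9a87
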
Image labels
name=kafka version=3.7.0

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:edb9a87
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:edb9a87  docker.io/acryldata/datahub-kafka-setup:edb9a87
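
To confirm the retagged image is the same build described on this page, compare its ID with the sha256 shown above (a quick local check, assuming Docker is installed):

docker image inspect --format '{{.Id}}' docker.io/acryldata/datahub-kafka-setup:edb9a87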

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:edb9a87
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:edb9a87  docker.io/acryldata/datahub-kafka-setup:edb9a87
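
Note: on Kubernetes nodes, images used by the kubelet normally live in containerd's k8s.io namespace, so a hedged variant of the commands above adds -n k8s.io:

ctr -n k8s.io images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:edb9a87
ctr -n k8s.io images tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:edb9a87 docker.io/acryldata/datahub-kafka-setup:edb9a87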

Shell quick-replace command

sed -i 's#acryldata/datahub-kafka-setup:edb9a87#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:edb9a87#' deployment.yaml
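
If the image reference appears in several manifests, a sketch that rewrites every matching file in one pass (the ./manifests directory is a placeholder):

grep -rl 'acryldata/datahub-kafka-setup:edb9a87' ./manifests | xargs sed -i 's#acryldata/datahub-kafka-setup:edb9a87#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:edb9a87#g'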

Ansible bulk distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:edb9a87 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:edb9a87  docker.io/acryldata/datahub-kafka-setup:edb9a87'

Ansible bulk distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:edb9a87 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:edb9a87  docker.io/acryldata/datahub-kafka-setup:edb9a87'
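
Both Ansible one-liners assume an inventory group named k8s. A minimal inventory sketch (host names are placeholders):

# inventory.ini
[k8s]
node1.example.com
node2.example.com

# run with: ansible -i inventory.ini k8s -m shell -a '...'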

Image build history


# 2024-08-16 06:22:00  0.00B Set the default command to run
CMD ["/bin/sh" "-c" "./kafka-setup.sh"]
                        
# 2024-08-16 06:22:00  2.18KB Run a command and create a new image layer
RUN |3 ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine APACHE_DOWNLOAD_URL=null GITHUB_REPO_URL=https://github.com /bin/sh -c chmod +x ./kafka-setup.sh ./kafka-topic-workers.sh ./kafka-ready.sh # buildkit
                        
# 2024-08-16 06:22:00  658.00B Copy new files or directories into the container
COPY docker/kafka-setup/env_to_properties.py ./env_to_properties.py # buildkit
                        
# 2024-08-08 05:13:17  317.00B Copy new files or directories into the container
COPY docker/kafka-setup/kafka-ready.sh ./kafka-ready.sh # buildkit
                        
# 2024-08-08 05:13:17  2.18KB Copy new files or directories into the container
COPY docker/kafka-setup/kafka-topic-workers.sh ./kafka-topic-workers.sh # buildkit
                        
# 2024-08-08 05:13:17  303.00B Copy new files or directories into the container
COPY docker/kafka-setup/kafka-config.sh ./kafka-config.sh # buildkit
                        
# 2024-08-08 05:13:17  5.40KB Copy new files or directories into the container
COPY docker/kafka-setup/kafka-setup.sh ./kafka-setup.sh # buildkit
                        
# 2024-08-08 05:13:17  0.00B Set environment variable USE_CONFLUENT_SCHEMA_REGISTRY
ENV USE_CONFLUENT_SCHEMA_REGISTRY=TRUE
                        
# 2024-08-08 05:13:17  0.00B Set environment variable DATAHUB_UPGRADE_HISTORY_TOPIC_NAME
ENV DATAHUB_UPGRADE_HISTORY_TOPIC_NAME=DataHubUpgradeHistory_v1
                        
# 2024-08-08 05:13:17  0.00B Set environment variable PLATFORM_EVENT_TOPIC_NAME
ENV PLATFORM_EVENT_TOPIC_NAME=PlatformEvent_v1
                        
# 2024-08-08 05:13:17  0.00B Set environment variable FAILED_METADATA_CHANGE_PROPOSAL_TOPIC_NAME
ENV FAILED_METADATA_CHANGE_PROPOSAL_TOPIC_NAME=FailedMetadataChangeProposal_v1
                        
# 2024-08-08 05:13:17  0.00B Set environment variable METADATA_CHANGE_PROPOSAL_TOPIC_NAME
ENV METADATA_CHANGE_PROPOSAL_TOPIC_NAME=MetadataChangeProposal_v1
                        
# 2024-08-08 05:13:17  0.00B Set environment variable METADATA_CHANGE_LOG_TIMESERIES_TOPIC_NAME
ENV METADATA_CHANGE_LOG_TIMESERIES_TOPIC_NAME=MetadataChangeLog_Timeseries_v1
                        
# 2024-08-08 05:13:17  0.00B Set environment variable METADATA_CHANGE_LOG_VERSIONED_TOPIC_NAME
ENV METADATA_CHANGE_LOG_VERSIONED_TOPIC_NAME=MetadataChangeLog_Versioned_v1
                        
# 2024-08-08 05:13:17  0.00B Set environment variable DATAHUB_USAGE_EVENT_NAME
ENV DATAHUB_USAGE_EVENT_NAME=DataHubUsageEvent_v1
                        
# 2024-08-08 05:13:17  0.00B Set environment variable FAILED_METADATA_CHANGE_EVENT_NAME
ENV FAILED_METADATA_CHANGE_EVENT_NAME=FailedMetadataChangeEvent_v4
                        
# 2024-08-08 05:13:17  0.00B Set environment variable METADATA_CHANGE_EVENT_NAME
ENV METADATA_CHANGE_EVENT_NAME=MetadataChangeEvent_v4
                        
# 2024-08-08 05:13:17  0.00B Set environment variable METADATA_AUDIT_EVENT_NAME
ENV METADATA_AUDIT_EVENT_NAME=MetadataAuditEvent_v4
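
The topic/event names set in the layers above are presumably what kafka-setup.sh creates on the broker at startup. For reference, a roughly equivalent manual creation of one topic with the stock Kafka CLI would look like this (broker address, partition and replication counts are placeholders):

kafka-topics.sh --bootstrap-server broker:9092 --create --if-not-exists --topic MetadataChangeProposal_v1 --partitions 1 --replication-factor 1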
                        
# 2024-08-08 05:13:17  14.87MB Copy files or directories into the container
ADD --chown=kafka:kafka https://github.com/aws/aws-msk-iam-auth/releases/download/v2.0.3/aws-msk-iam-auth-2.0.3-all.jar /opt/kafka/libs # buildkit
                        
# 2024-08-08 05:13:17  14.87MB Copy files or directories into the container
ADD --chown=kafka:kafka https://github.com/aws/aws-msk-iam-auth/releases/download/v2.0.3/aws-msk-iam-auth-2.0.3-all.jar /usr/share/java/cp-base-new # buildkit
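
The aws-msk-iam-auth jar added in these two layers provides the AWS_MSK_IAM SASL mechanism for connecting to Amazon MSK with IAM authentication. A hedged client-properties sketch for that path (property values follow the aws-msk-iam-auth project's documentation; adjust to your setup):

security.protocol=SASL_SSL
sasl.mechanism=AWS_MSK_IAM
sasl.jaas.config=software.amazon.msk.auth.iam.IAMLoginModule required;
sasl.client.callback.handler.class=software.amazon.msk.auth.iam.IAMClientCallbackHandler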
                        
# 2024-08-08 05:13:17  1.31KB Copy new files or directories into the container
COPY /etc/cp-base-new/log4j.properties /etc/cp-base-new/log4j.properties # buildkit
                        
# 2024-08-08 05:13:16  39.06MB Copy new files or directories into the container
COPY /usr/share/java/cp-base-new/ /usr/share/java/cp-base-new/ # buildkit
                        
# 2024-08-08 05:13:16  0.00B Run a command and create a new image layer
RUN |3 ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine APACHE_DOWNLOAD_URL=null GITHUB_REPO_URL=https://github.com /bin/sh -c ls -la # buildkit
                        
# 2024-08-08 05:13:16  0.00B Set the working directory to /opt/kafka
WORKDIR /opt/kafka
                        
# 2024-08-08 05:13:16  0.00B Set environment variable PATH
ENV PATH=/sbin:/opt/kafka/bin/:/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2024-08-08 05:13:16  124.06MB Run a command and create a new image layer
RUN |3 ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine APACHE_DOWNLOAD_URL=null GITHUB_REPO_URL=https://github.com /bin/sh -c mkdir -p /opt   && if [ "${APACHE_DOWNLOAD_URL}" != "null" ] ; then mirror="${APACHE_DOWNLOAD_URL}/" ; else mirror=$(curl --stderr /dev/null https://www.apache.org/dyn/closer.cgi\?as_json\=1 | jq -r '.preferred'); fi   && curl -sSL "${mirror}kafka/${KAFKA_VERSION}/kafka_${SCALA_VERSION}-${KAFKA_VERSION}.tgz"   | tar -xzf - -C /opt   && mv /opt/kafka_${SCALA_VERSION}-${KAFKA_VERSION} /opt/kafka   && adduser -DH -s /sbin/nologin kafka   && chown -R kafka: /opt/kafka   && rm -rf /tmp/*   && apk del --purge .build-deps # buildkit
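
The layer above resolves the preferred Apache download mirror at build time via closer.cgi and jq; the same lookup can be reproduced standalone (requires curl and jq):

curl -sSL 'https://www.apache.org/dyn/closer.cgi?as_json=1' | jq -r '.preferred'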
                        
# 2024-08-08 05:13:14  173.59MB Run a command and create a new image layer
RUN |3 ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine APACHE_DOWNLOAD_URL=null GITHUB_REPO_URL=https://github.com /bin/sh -c apk add --no-cache -t .build-deps git curl ca-certificates jq gcc musl-dev libffi-dev zip # buildkit
                        
# 2024-08-08 05:13:10  181.24MB Run a command and create a new image layer
RUN |3 ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine APACHE_DOWNLOAD_URL=null GITHUB_REPO_URL=https://github.com /bin/sh -c apk --no-cache add openjdk17-jre-headless --repository=${ALPINE_REPO_URL}/edge/community # buildkit
                        
# 2024-08-08 05:13:08  2.79MB Run a command and create a new image layer
RUN |3 ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine APACHE_DOWNLOAD_URL=null GITHUB_REPO_URL=https://github.com /bin/sh -c apk add --no-cache bash coreutils # buildkit
                        
# 2024-08-08 05:13:16  0.00B Run a command and create a new image layer
RUN |3 ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine APACHE_DOWNLOAD_URL=null GITHUB_REPO_URL=https://github.com /bin/sh -c if [ "${ALPINE_REPO_URL}" != "http://dl-cdn.alpinelinux.org/alpine" ] ; then sed -i "s#http.*://dl-cdn.alpinelinux.org/alpine#${ALPINE_REPO_URL}#g" /etc/apk/repositories ; fi # buildkit
                        
# 2024-08-08 05:13:16  0.00B Add metadata labels
LABEL name=kafka version=3.7.0
                        
# 2024-08-08 05:13:16  0.00B Set environment variable SCALA_VERSION
ENV SCALA_VERSION=2.13
                        
# 2024-08-08 05:13:16  0.00B Set environment variable KAFKA_VERSION
ENV KAFKA_VERSION=3.7.0
                        
# 2024-08-08 05:13:16  0.00B Define a build argument
ARG GITHUB_REPO_URL=https://github.com
                        
# 2024-08-08 05:13:16  0.00B Define a build argument
ARG APACHE_DOWNLOAD_URL=null
                        
# 2024-08-08 05:13:16  0.00B Define a build argument
ARG ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine
                        
# 2024-08-07 23:49:22  0.00B Set the default command to run
CMD ["python3"]
                        
# 2024-08-07 23:49:22  12.38MB Run a command and create a new image layer
RUN /bin/sh -c set -eux; 		wget -O get-pip.py "$PYTHON_GET_PIP_URL"; 	echo "$PYTHON_GET_PIP_SHA256 *get-pip.py" | sha256sum -c -; 		export PYTHONDONTWRITEBYTECODE=1; 		python get-pip.py 		--disable-pip-version-check 		--no-cache-dir 		--no-compile 		"pip==$PYTHON_PIP_VERSION" 	; 	rm -f get-pip.py; 		pip --version # buildkit
                        
# 2024-08-07 23:49:22  0.00B Set environment variable PYTHON_GET_PIP_SHA256
ENV PYTHON_GET_PIP_SHA256=6fb7b781206356f45ad79efbb19322caa6c2a5ad39092d0d44d0fec94117e118
                        
# 2024-08-07 23:49:22  0.00B Set environment variable PYTHON_GET_PIP_URL
ENV PYTHON_GET_PIP_URL=https://github.com/pypa/get-pip/raw/66d8a0f637083e2c3ddffc0cb1e65ce126afb856/public/get-pip.py
                        
# 2024-08-07 23:49:22  0.00B Set environment variable PYTHON_PIP_VERSION
ENV PYTHON_PIP_VERSION=24.2
                        
# 2024-08-07 23:49:22  32.00B Run a command and create a new image layer
RUN /bin/sh -c set -eux; 	for src in idle3 pydoc3 python3 python3-config; do 		dst="$(echo "$src" | tr -d 3)"; 		[ -s "/usr/local/bin/$src" ]; 		[ ! -e "/usr/local/bin/$dst" ]; 		ln -svT "$src" "/usr/local/bin/$dst"; 	done # buildkit
                        
# 2024-08-07 23:49:22  32.37MB Run a command and create a new image layer
RUN /bin/sh -c set -eux; 		apk add --no-cache --virtual .build-deps 		gnupg 		tar 		xz 				bluez-dev 		bzip2-dev 		dpkg-dev dpkg 		expat-dev 		findutils 		gcc 		gdbm-dev 		libc-dev 		libffi-dev 		libnsl-dev 		libtirpc-dev 		linux-headers 		make 		ncurses-dev 		openssl-dev 		pax-utils 		readline-dev 		sqlite-dev 		tcl-dev 		tk 		tk-dev 		util-linux-dev 		xz-dev 		zlib-dev 	; 		wget -O python.tar.xz "https://www.python.org/ftp/python/${PYTHON_VERSION%%[a-z]*}/Python-$PYTHON_VERSION.tar.xz"; 	wget -O python.tar.xz.asc "https://www.python.org/ftp/python/${PYTHON_VERSION%%[a-z]*}/Python-$PYTHON_VERSION.tar.xz.asc"; 	GNUPGHOME="$(mktemp -d)"; export GNUPGHOME; 	gpg --batch --keyserver hkps://keys.openpgp.org --recv-keys "$GPG_KEY"; 	gpg --batch --verify python.tar.xz.asc python.tar.xz; 	gpgconf --kill all; 	rm -rf "$GNUPGHOME" python.tar.xz.asc; 	mkdir -p /usr/src/python; 	tar --extract --directory /usr/src/python --strip-components=1 --file python.tar.xz; 	rm python.tar.xz; 		cd /usr/src/python; 	gnuArch="$(dpkg-architecture --query DEB_BUILD_GNU_TYPE)"; 	./configure 		--build="$gnuArch" 		--enable-loadable-sqlite-extensions 		$(test "$gnuArch" != 'riscv64-linux-musl' && echo '--enable-optimizations') 		--enable-option-checking=fatal 		--enable-shared 		--with-lto 		--with-system-expat 		--without-ensurepip 	; 	nproc="$(nproc)"; 	EXTRA_CFLAGS="-DTHREAD_STACK_SIZE=0x100000"; 	LDFLAGS="${LDFLAGS:--Wl},--strip-all"; 	make -j "$nproc" 		"EXTRA_CFLAGS=${EXTRA_CFLAGS:-}" 		"LDFLAGS=${LDFLAGS:-}" 		"PROFILE_TASK=${PROFILE_TASK:-}" 	; 	rm python; 	make -j "$nproc" 		"EXTRA_CFLAGS=${EXTRA_CFLAGS:-}" 		"LDFLAGS=${LDFLAGS:--Wl},-rpath='\$\$ORIGIN/../lib'" 		"PROFILE_TASK=${PROFILE_TASK:-}" 		python 	; 	make install; 		cd /; 	rm -rf /usr/src/python; 		find /usr/local -depth 		\( 			\( -type d -a \( -name test -o -name tests -o -name idle_test \) \) 			-o \( -type f -a \( -name '*.pyc' -o -name '*.pyo' -o -name 'libpython*.a' \) \) 		\) -exec rm -rf '{}' + 	; 		find /usr/local -type f -executable -not \( -name '*tkinter*' \) -exec scanelf --needed --nobanner --format '%n#p' '{}' ';' 		| tr ',' '\n' 		| sort -u 		| awk 'system("[ -e /usr/local/lib/" $1 " ]") == 0 { next } { print "so:" $1 }' 		| xargs -rt apk add --no-network --virtual .python-rundeps 	; 	apk del --no-network .build-deps; 		python3 --version # buildkit
                        
# 2024-08-07 23:49:22  0.00B Set environment variable PYTHON_VERSION
ENV PYTHON_VERSION=3.12.5
                        
# 2024-08-07 23:49:22  0.00B Set environment variable GPG_KEY
ENV GPG_KEY=7169605F62C751356D054A26A821E680E5FA6305
                        
# 2024-08-07 23:49:22  1.02MB Run a command and create a new image layer
RUN /bin/sh -c set -eux; 	apk add --no-cache 		ca-certificates 		tzdata 	; # buildkit
                        
# 2024-08-07 23:49:22  0.00B Set environment variable LANG
ENV LANG=C.UTF-8
                        
# 2024-08-07 23:49:22  0.00B Set environment variable PATH
ENV PATH=/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2024-07-23 06:26:43  0.00B 
/bin/sh -c #(nop)  CMD ["/bin/sh"]
                        
# 2024-07-23 06:26:43  7.80MB 
/bin/sh -c #(nop) ADD file:99093095d62d0421541d882f9ceeddb2981fe701ec0aa9d2c08480712d5fed21 in / 
                        
                    

Image info

{
    "Id": "sha256:e02d38323695ff5d712320acc6669eabb927a05c556bb6bc48e10be4c36e385d",
    "RepoTags": [
        "acryldata/datahub-kafka-setup:edb9a87",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:edb9a87"
    ],
    "RepoDigests": [
        "acryldata/datahub-kafka-setup@sha256:f8a31e63422a6084096c9890085c4d187144d836243ac3128d6b373cbc0a7a4b",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup@sha256:b303e9f8b385111801d73d5f840262c592c9aca28c5ced95c6d5d337722e27da"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2024-08-15T22:22:00.947738483Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/sbin:/opt/kafka/bin/:/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "LANG=C.UTF-8",
            "GPG_KEY=7169605F62C751356D054A26A821E680E5FA6305",
            "PYTHON_VERSION=3.12.5",
            "PYTHON_PIP_VERSION=24.2",
            "PYTHON_GET_PIP_URL=https://github.com/pypa/get-pip/raw/66d8a0f637083e2c3ddffc0cb1e65ce126afb856/public/get-pip.py",
            "PYTHON_GET_PIP_SHA256=6fb7b781206356f45ad79efbb19322caa6c2a5ad39092d0d44d0fec94117e118",
            "KAFKA_VERSION=3.7.0",
            "SCALA_VERSION=2.13",
            "METADATA_AUDIT_EVENT_NAME=MetadataAuditEvent_v4",
            "METADATA_CHANGE_EVENT_NAME=MetadataChangeEvent_v4",
            "FAILED_METADATA_CHANGE_EVENT_NAME=FailedMetadataChangeEvent_v4",
            "DATAHUB_USAGE_EVENT_NAME=DataHubUsageEvent_v1",
            "METADATA_CHANGE_LOG_VERSIONED_TOPIC_NAME=MetadataChangeLog_Versioned_v1",
            "METADATA_CHANGE_LOG_TIMESERIES_TOPIC_NAME=MetadataChangeLog_Timeseries_v1",
            "METADATA_CHANGE_PROPOSAL_TOPIC_NAME=MetadataChangeProposal_v1",
            "FAILED_METADATA_CHANGE_PROPOSAL_TOPIC_NAME=FailedMetadataChangeProposal_v1",
            "PLATFORM_EVENT_TOPIC_NAME=PlatformEvent_v1",
            "DATAHUB_UPGRADE_HISTORY_TOPIC_NAME=DataHubUpgradeHistory_v1",
            "USE_CONFLUENT_SCHEMA_REGISTRY=TRUE"
        ],
        "Cmd": [
            "/bin/sh",
            "-c",
            "./kafka-setup.sh"
        ],
        "ArgsEscaped": true,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "/opt/kafka",
        "Entrypoint": null,
        "OnBuild": null,
        "Labels": {
            "name": "kafka",
            "version": "3.7.0"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 604053649,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/06cd1390b7a3c07673e3eca56fb811170771aa4df1bc89a148723f64165d6288/diff:/var/lib/docker/overlay2/09566222b59fa4cb98b20ccaa62456be741d78202fa034049d87d0da42a97243/diff:/var/lib/docker/overlay2/ab98b7cdfc527cda102c1b617f961c456a55b525dd20a3b5ceeb6013f117dec0/diff:/var/lib/docker/overlay2/cc8cdeaea9ecf8ae5b311b8060bf95643a89f39d9b7e51682966b7ddcd99034d/diff:/var/lib/docker/overlay2/6a3346503e4d6e36050ba898ead802ac892e5b1a2bfc3e503957cf90248cb273/diff:/var/lib/docker/overlay2/b55ca9dbe022981ddfa915669e8bd0ecd7a2a0929b0521fe6486b0e492b99e34/diff:/var/lib/docker/overlay2/399f7b18aebaf7f8d480d7b112e1334e0ecb9f9cf1a4c6c9b6901824a2a385c8/diff:/var/lib/docker/overlay2/4344dec715d07f58ae7256854cdda41753dde8fa1b49f18727752552b5e88346/diff:/var/lib/docker/overlay2/5bf05cc7a25de94c3c8b9e7b34f144bb933c0938de042ad399040684cb7b598d/diff:/var/lib/docker/overlay2/15f9703ba412b82c9aa3847ce1ff7c2798b1370e7444d9fb9ac6ea3df6df2908/diff:/var/lib/docker/overlay2/d3ca00a7d2b4c8e018c281bb6d83878cd9631819e33c9ca0ea8efbeccb964e86/diff:/var/lib/docker/overlay2/cf97db52765cebacdaae5521b7fd393d38fa3b479a28f62da9a8c7cdc9df2d0a/diff:/var/lib/docker/overlay2/5b23358caeaccc7457e8b4373114cc5f6e81b2595baa56d3f102dea5d16150a3/diff:/var/lib/docker/overlay2/038c0ed155735c196cb21ed52a031236a5d81f873c53ab3b2f78cb5e7cb71879/diff:/var/lib/docker/overlay2/c67c310d5237c3cf32d78555794ac8de4efd13cec5cbfc93da6637e6a60e17f8/diff:/var/lib/docker/overlay2/ac6e38e45472e57b3d9ea7c55f3c32d06af4696a99ff35cf303a7221f3f3a2a0/diff:/var/lib/docker/overlay2/8800cdd146548e7543b4067380193d4723312f7901d493cb26a31f280d5a38b7/diff:/var/lib/docker/overlay2/bc28800394e7afaa03c87d9764c62452e2d912d2ef7963bcd02dd8087cd1f18f/diff:/var/lib/docker/overlay2/b04618f0ee536c0e987793c6ef6fcfc77acb57935a1dade78a6dafe461589b06/diff:/var/lib/docker/overlay2/3ee4fd7eaedfcdb96ddfcac65de8325384353e4dda4292edb35c333e4217a612/diff:/var/lib/docker/overlay2/197612c0cb8cf4a976e6909c9e1ea5c606d639df6dab5cfe2c9b2e53929fc5d7/diff",
            "MergedDir": "/var/lib/docker/overlay2/16fb6638ae8fc27bec1801d77f4599689d85b436245f33611165c2d8ef73a47e/merged",
            "UpperDir": "/var/lib/docker/overlay2/16fb6638ae8fc27bec1801d77f4599689d85b436245f33611165c2d8ef73a47e/diff",
            "WorkDir": "/var/lib/docker/overlay2/16fb6638ae8fc27bec1801d77f4599689d85b436245f33611165c2d8ef73a47e/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:78561cef0761903dd2f7d09856150a6d4fb48967a8f113f3e33d79effbf59a07",
            "sha256:9ac07527ca5530875925969103c05674e4e67d62166d80949a7c86df4a216a25",
            "sha256:13f521543aaded6cefb31cb8e0615b8ba693e0b6204b54dbed022d281c96dd97",
            "sha256:ca9577f6528c5b48b3681b4815e6fcc0287204ac2ac357707b5504dbe8b1352d",
            "sha256:3d4d52c356301b94b9cd13727f6359652a04087d51f71f7978599f9183267f19",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:7da1d3b307a82b3d57bf4228be1ab735b001c594b6ba4e8f2498e5b44bf028b1",
            "sha256:3f40f11335a54a839e524ba820331f8f944a076d8c01a51685577117454f6163",
            "sha256:6993a8983ba95bd7120627a6953c9986e8482e422a124de13f7b6ffcc404b9d3",
            "sha256:36ac9e6b288ff7f65ac06087d3769d20beab371af238d58c7fc3911d998dc5f1",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:67503680104f9412b3cce7c90302f3fc93e1fbd98f48eb7a4f275d2dab3365b2",
            "sha256:a724bf4591f31ab0a4d9b41cd5c4d2da82627d661bb38517221bf7870972f683",
            "sha256:16646907010752e6ea199a95224471fe944a086369cca7e59fca12e1a92ade6a",
            "sha256:8b9b69b806b2a9a05266d3ac8a40acfdc7b64e1a05abe331fbcd2cdb9b6f705b",
            "sha256:2e86401e7a87ad75884f0e8afea811b7034f9a282a2e81cd4b0fe01153047a42",
            "sha256:291c96fb00c0dc80219682f7283aa7f928bb7de0d6f81bcbe0d9310e620bb272",
            "sha256:b856ca54efc8687ac31e2fc95a654a4b1dd2e43ffb18e1e6638588b0753d086a",
            "sha256:a7bb6dfc3f5f298ba4f7c188670a5146bacfb41344faec565aa36f92aa773eaa",
            "sha256:1401553bc6d81059eb64519d0475e813ac5b6f4cf1acce977c80731c3ae2a093",
            "sha256:a9cd58d482edc258040bf31553ada01131c023c14d356e9767630f52d9e5f608"
        ]
    },
    "Metadata": {
        "LastTagTime": "2024-08-18T11:50:49.862225616+08:00"
    }
}
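
The JSON above is the kind of output docker image inspect produces; after pulling the mirror copy you can regenerate it locally with:

docker image inspect docker.io/acryldata/datahub-kafka-setup:edb9a87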

More versions

docker.io/acryldata/datahub-kafka-setup:edb9a87
linux/amd64  docker.io  604.05MB  synced 2024-08-18 11:52

docker.io/acryldata/datahub-kafka-setup:head
linux/amd64  docker.io  605.30MB  synced 2024-09-04 17:13