docker.io/acryldata/datahub-kafka-setup:v0.13.3 linux/amd64

docker.io/acryldata/datahub-kafka-setup:v0.13.3 (China mirror download source)
DataHub Kafka Setup: a Docker container image that runs DataHub's Kafka setup scripts, creating and configuring the Kafka topics DataHub requires.
Source image: docker.io/acryldata/datahub-kafka-setup:v0.13.3
China mirror: swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v0.13.3
Image ID: sha256:6af0264087550d9b7cb5e51845ac5a60df24276c4f50dd9237a37135bd2c597d
Tag: v0.13.3
Size: 602.16MB
Registry: docker.io
Project info: Docker Hub page
CMD: /bin/sh -c ./kafka-setup.sh
Entrypoint: (none)
Working directory: /opt/kafka
OS/Arch: linux/amd64
Image created: 2024-05-23T23:18:52.74715601Z
Synced: 2025-04-23 12:40
Updated: 2025-04-24 20:32
Environment variables

PATH=/sbin:/opt/kafka/bin/:/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
LANG=C.UTF-8
GPG_KEY=7169605F62C751356D054A26A821E680E5FA6305
PYTHON_VERSION=3.12.3
PYTHON_PIP_VERSION=24.0
PYTHON_GET_PIP_URL=https://github.com/pypa/get-pip/raw/dbf0c85f76fb6e1ab42aa672ffca6f0a675d9ee4/public/get-pip.py
PYTHON_GET_PIP_SHA256=dfe9fd5c28dc98b5ac17979a953ea550cec37ae1b47a5116007395bfacff2ab9
KAFKA_VERSION=3.7.0
SCALA_VERSION=2.13
METADATA_AUDIT_EVENT_NAME=MetadataAuditEvent_v4
METADATA_CHANGE_EVENT_NAME=MetadataChangeEvent_v4
FAILED_METADATA_CHANGE_EVENT_NAME=FailedMetadataChangeEvent_v4
DATAHUB_USAGE_EVENT_NAME=DataHubUsageEvent_v1
METADATA_CHANGE_LOG_VERSIONED_TOPIC_NAME=MetadataChangeLog_Versioned_v1
METADATA_CHANGE_LOG_TIMESERIES_TOPIC_NAME=MetadataChangeLog_Timeseries_v1
METADATA_CHANGE_PROPOSAL_TOPIC_NAME=MetadataChangeProposal_v1
FAILED_METADATA_CHANGE_PROPOSAL_TOPIC_NAME=FailedMetadataChangeProposal_v1
PLATFORM_EVENT_TOPIC_NAME=PlatformEvent_v1
DATAHUB_UPGRADE_HISTORY_TOPIC_NAME=DataHubUpgradeHistory_v1
USE_CONFLUENT_SCHEMA_REGISTRY=TRUE
Image labels

name: kafka
version: 3.7.0

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v0.13.3
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v0.13.3  docker.io/acryldata/datahub-kafka-setup:v0.13.3
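
The pull-then-tag pair above works because the mirror reference is just the upstream reference with a fixed prefix prepended; the original reference can be recovered mechanically by stripping that prefix. A minimal shell sketch (pure string manipulation, no registry access):

```shell
# Derive the upstream image reference from the mirror reference by stripping
# the SWR mirror prefix (prefix taken from the commands above).
mirror_prefix="swr.cn-north-4.myhuaweicloud.com/ddn-k8s/"
mirror_ref="${mirror_prefix}docker.io/acryldata/datahub-kafka-setup:v0.13.3"
orig_ref="${mirror_ref#"$mirror_prefix"}"
echo "$orig_ref"   # docker.io/acryldata/datahub-kafka-setup:v0.13.3
```

The same derivation gives the second argument to `docker tag` for any image hosted on this mirror.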

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v0.13.3
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v0.13.3  docker.io/acryldata/datahub-kafka-setup:v0.13.3

Note: on Kubernetes nodes where kubelet drives containerd, pass `-n k8s.io` to `ctr` so the pulled and tagged image lands in the namespace the runtime reads (`ctr` otherwise uses its `default` namespace).

Shell quick-replace command

sed -i 's#acryldata/datahub-kafka-setup:v0.13.3#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v0.13.3#' deployment.yaml
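
Since `sed -i` edits the manifest in place, it is worth previewing the substitution before applying it. A sketch against a throwaway file (the `deployment.yaml` name above and the sample manifest here are illustrative):

```shell
# Build a sample manifest, preview the substitution without -i,
# then apply it in place while keeping a .bak backup.
tmp=$(mktemp)
cat > "$tmp" <<'EOF'
    image: acryldata/datahub-kafka-setup:v0.13.3
EOF
# Preview only: prints the rewritten line, leaves the file untouched.
sed 's#acryldata/datahub-kafka-setup:v0.13.3#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v0.13.3#' "$tmp"
# Apply in place; the original is preserved as "$tmp.bak".
sed -i.bak 's#acryldata/datahub-kafka-setup:v0.13.3#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v0.13.3#' "$tmp"
grep -c 'swr.cn-north-4' "$tmp"   # 1
```

Using `#` as the delimiter avoids escaping the many `/` characters in the image paths, which is why the command on this page is written that way.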

Ansible quick distribution (Docker)

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v0.13.3 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v0.13.3  docker.io/acryldata/datahub-kafka-setup:v0.13.3'

Ansible quick distribution (Containerd)

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v0.13.3 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v0.13.3  docker.io/acryldata/datahub-kafka-setup:v0.13.3'

Image build history


# 2024-05-24 07:18:52  0.00B Set the default command to run
CMD ["/bin/sh" "-c" "./kafka-setup.sh"]
                        
# 2024-05-24 07:18:52  2.18KB Run a command and create a new layer
RUN |3 ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine APACHE_DOWNLOAD_URL=null GITHUB_REPO_URL=https://github.com /bin/sh -c chmod +x ./kafka-setup.sh ./kafka-topic-workers.sh ./kafka-ready.sh # buildkit
                        
# 2024-05-24 07:18:52  317.00B Copy new files or directories into the container
COPY docker/kafka-setup/kafka-ready.sh ./kafka-ready.sh # buildkit

# 2024-05-24 07:18:52  2.18KB Copy new files or directories into the container
COPY docker/kafka-setup/kafka-topic-workers.sh ./kafka-topic-workers.sh # buildkit

# 2024-05-24 07:18:52  303.00B Copy new files or directories into the container
COPY docker/kafka-setup/kafka-config.sh ./kafka-config.sh # buildkit

# 2024-05-24 07:18:52  7.69KB Copy new files or directories into the container
COPY docker/kafka-setup/kafka-setup.sh ./kafka-setup.sh # buildkit
                        
# 2024-05-24 07:18:52  0.00B Set environment variable USE_CONFLUENT_SCHEMA_REGISTRY
ENV USE_CONFLUENT_SCHEMA_REGISTRY=TRUE

# 2024-05-24 07:18:52  0.00B Set environment variable DATAHUB_UPGRADE_HISTORY_TOPIC_NAME
ENV DATAHUB_UPGRADE_HISTORY_TOPIC_NAME=DataHubUpgradeHistory_v1

# 2024-05-24 07:18:52  0.00B Set environment variable PLATFORM_EVENT_TOPIC_NAME
ENV PLATFORM_EVENT_TOPIC_NAME=PlatformEvent_v1

# 2024-05-24 07:18:52  0.00B Set environment variable FAILED_METADATA_CHANGE_PROPOSAL_TOPIC_NAME
ENV FAILED_METADATA_CHANGE_PROPOSAL_TOPIC_NAME=FailedMetadataChangeProposal_v1

# 2024-05-24 07:18:52  0.00B Set environment variable METADATA_CHANGE_PROPOSAL_TOPIC_NAME
ENV METADATA_CHANGE_PROPOSAL_TOPIC_NAME=MetadataChangeProposal_v1

# 2024-05-24 07:18:52  0.00B Set environment variable METADATA_CHANGE_LOG_TIMESERIES_TOPIC_NAME
ENV METADATA_CHANGE_LOG_TIMESERIES_TOPIC_NAME=MetadataChangeLog_Timeseries_v1

# 2024-05-24 07:18:52  0.00B Set environment variable METADATA_CHANGE_LOG_VERSIONED_TOPIC_NAME
ENV METADATA_CHANGE_LOG_VERSIONED_TOPIC_NAME=MetadataChangeLog_Versioned_v1

# 2024-05-24 07:18:52  0.00B Set environment variable DATAHUB_USAGE_EVENT_NAME
ENV DATAHUB_USAGE_EVENT_NAME=DataHubUsageEvent_v1

# 2024-05-24 07:18:52  0.00B Set environment variable FAILED_METADATA_CHANGE_EVENT_NAME
ENV FAILED_METADATA_CHANGE_EVENT_NAME=FailedMetadataChangeEvent_v4

# 2024-05-24 07:18:52  0.00B Set environment variable METADATA_CHANGE_EVENT_NAME
ENV METADATA_CHANGE_EVENT_NAME=MetadataChangeEvent_v4

# 2024-05-24 07:18:52  0.00B Set environment variable METADATA_AUDIT_EVENT_NAME
ENV METADATA_AUDIT_EVENT_NAME=MetadataAuditEvent_v4

# 2024-05-24 07:18:52  14.87MB Add files or directories to the container
ADD --chown=kafka:kafka https://github.com/aws/aws-msk-iam-auth/releases/download/v2.0.3/aws-msk-iam-auth-2.0.3-all.jar /opt/kafka/libs # buildkit

# 2024-05-24 07:18:52  14.87MB Add files or directories to the container
ADD --chown=kafka:kafka https://github.com/aws/aws-msk-iam-auth/releases/download/v2.0.3/aws-msk-iam-auth-2.0.3-all.jar /usr/share/java/cp-base-new # buildkit

# 2024-05-24 07:18:52  1.31KB Copy new files or directories into the container
COPY /etc/cp-base-new/log4j.properties /etc/cp-base-new/log4j.properties # buildkit

# 2024-05-24 07:18:52  39.06MB Copy new files or directories into the container
COPY /usr/share/java/cp-base-new/ /usr/share/java/cp-base-new/ # buildkit
                        
# 2024-05-24 07:18:52  0.00B Run a command and create a new layer
RUN |3 ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine APACHE_DOWNLOAD_URL=null GITHUB_REPO_URL=https://github.com /bin/sh -c ls -la # buildkit
                        
# 2024-05-24 07:18:52  0.00B Set working directory to /opt/kafka
WORKDIR /opt/kafka
                        
# 2024-05-24 07:18:52  0.00B Set environment variable PATH
ENV PATH=/sbin:/opt/kafka/bin/:/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2024-05-24 07:18:52  124.06MB Run a command and create a new layer
RUN |3 ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine APACHE_DOWNLOAD_URL=null GITHUB_REPO_URL=https://github.com /bin/sh -c mkdir -p /opt   && if [ "${APACHE_DOWNLOAD_URL}" != "null" ] ; then mirror="${APACHE_DOWNLOAD_URL}/" ; else mirror=$(curl --stderr /dev/null https://www.apache.org/dyn/closer.cgi\?as_json\=1 | jq -r '.preferred'); fi   && curl -sSL "${mirror}kafka/${KAFKA_VERSION}/kafka_${SCALA_VERSION}-${KAFKA_VERSION}.tgz"   | tar -xzf - -C /opt   && mv /opt/kafka_${SCALA_VERSION}-${KAFKA_VERSION} /opt/kafka   && adduser -DH -s /sbin/nologin kafka   && chown -R kafka: /opt/kafka   && rm -rf /tmp/*   && apk del --purge .build-deps # buildkit
                        
# 2024-05-24 07:18:44  173.57MB Run a command and create a new layer
RUN |3 ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine APACHE_DOWNLOAD_URL=null GITHUB_REPO_URL=https://github.com /bin/sh -c apk add --no-cache -t .build-deps git curl ca-certificates jq gcc musl-dev libffi-dev zip # buildkit
                        
# 2024-05-24 07:18:40  181.17MB Run a command and create a new layer
RUN |3 ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine APACHE_DOWNLOAD_URL=null GITHUB_REPO_URL=https://github.com /bin/sh -c apk --no-cache add openjdk17-jre-headless --repository=${ALPINE_REPO_URL}/edge/community # buildkit
                        
# 2024-05-24 07:18:36  2.79MB Run a command and create a new layer
RUN |3 ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine APACHE_DOWNLOAD_URL=null GITHUB_REPO_URL=https://github.com /bin/sh -c apk add --no-cache bash coreutils # buildkit
                        
# 2024-05-24 07:18:35  0.00B Run a command and create a new layer
RUN |3 ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine APACHE_DOWNLOAD_URL=null GITHUB_REPO_URL=https://github.com /bin/sh -c if [ "${ALPINE_REPO_URL}" != "http://dl-cdn.alpinelinux.org/alpine" ] ; then sed -i "s#http.*://dl-cdn.alpinelinux.org/alpine#${ALPINE_REPO_URL}#g" /etc/apk/repositories ; fi # buildkit
                        
# 2024-05-24 07:18:35  0.00B Add metadata labels
LABEL name=kafka version=3.7.0

# 2024-05-24 07:18:35  0.00B Set environment variable SCALA_VERSION
ENV SCALA_VERSION=2.13

# 2024-05-24 07:18:35  0.00B Set environment variable KAFKA_VERSION
ENV KAFKA_VERSION=3.7.0

# 2024-05-24 07:18:35  0.00B Define a build argument
ARG GITHUB_REPO_URL

# 2024-05-24 07:18:35  0.00B Define a build argument
ARG APACHE_DOWNLOAD_URL

# 2024-05-24 07:18:35  0.00B Define a build argument
ARG ALPINE_REPO_URL
                        
# 2024-05-22 20:26:30  0.00B Set the default command to run
CMD ["python3"]
                        
# 2024-05-22 20:26:30  10.32MB Run a command and create a new layer
RUN /bin/sh -c set -eux; 		wget -O get-pip.py "$PYTHON_GET_PIP_URL"; 	echo "$PYTHON_GET_PIP_SHA256 *get-pip.py" | sha256sum -c -; 		export PYTHONDONTWRITEBYTECODE=1; 		python get-pip.py 		--disable-pip-version-check 		--no-cache-dir 		--no-compile 		"pip==$PYTHON_PIP_VERSION" 	; 	rm -f get-pip.py; 		pip --version # buildkit
                        
# 2024-05-22 20:26:30  0.00B Set environment variable PYTHON_GET_PIP_SHA256
ENV PYTHON_GET_PIP_SHA256=dfe9fd5c28dc98b5ac17979a953ea550cec37ae1b47a5116007395bfacff2ab9

# 2024-05-22 20:26:30  0.00B Set environment variable PYTHON_GET_PIP_URL
ENV PYTHON_GET_PIP_URL=https://github.com/pypa/get-pip/raw/dbf0c85f76fb6e1ab42aa672ffca6f0a675d9ee4/public/get-pip.py

# 2024-05-22 20:26:30  0.00B Set environment variable PYTHON_PIP_VERSION
ENV PYTHON_PIP_VERSION=24.0
                        
# 2024-05-22 20:26:30  32.00B Run a command and create a new layer
RUN /bin/sh -c set -eux; 	for src in idle3 pydoc3 python3 python3-config; do 		dst="$(echo "$src" | tr -d 3)"; 		[ -s "/usr/local/bin/$src" ]; 		[ ! -e "/usr/local/bin/$dst" ]; 		ln -svT "$src" "/usr/local/bin/$dst"; 	done # buildkit
                        
# 2024-05-22 20:26:30  32.62MB Run a command and create a new layer
RUN /bin/sh -c set -eux; 		apk add --no-cache --virtual .build-deps 		gnupg 		tar 		xz 				bluez-dev 		bzip2-dev 		dpkg-dev dpkg 		expat-dev 		findutils 		gcc 		gdbm-dev 		libc-dev 		libffi-dev 		libnsl-dev 		libtirpc-dev 		linux-headers 		make 		ncurses-dev 		openssl-dev 		pax-utils 		readline-dev 		sqlite-dev 		tcl-dev 		tk 		tk-dev 		util-linux-dev 		xz-dev 		zlib-dev 	; 		wget -O python.tar.xz "https://www.python.org/ftp/python/${PYTHON_VERSION%%[a-z]*}/Python-$PYTHON_VERSION.tar.xz"; 	wget -O python.tar.xz.asc "https://www.python.org/ftp/python/${PYTHON_VERSION%%[a-z]*}/Python-$PYTHON_VERSION.tar.xz.asc"; 	GNUPGHOME="$(mktemp -d)"; export GNUPGHOME; 	gpg --batch --keyserver hkps://keys.openpgp.org --recv-keys "$GPG_KEY"; 	gpg --batch --verify python.tar.xz.asc python.tar.xz; 	gpgconf --kill all; 	rm -rf "$GNUPGHOME" python.tar.xz.asc; 	mkdir -p /usr/src/python; 	tar --extract --directory /usr/src/python --strip-components=1 --file python.tar.xz; 	rm python.tar.xz; 		cd /usr/src/python; 	gnuArch="$(dpkg-architecture --query DEB_BUILD_GNU_TYPE)"; 	./configure 		--build="$gnuArch" 		--enable-loadable-sqlite-extensions 		--enable-optimizations 		--enable-option-checking=fatal 		--enable-shared 		--with-lto 		--with-system-expat 		--without-ensurepip 	; 	nproc="$(nproc)"; 	EXTRA_CFLAGS="-DTHREAD_STACK_SIZE=0x100000"; 	LDFLAGS="${LDFLAGS:--Wl},--strip-all"; 	make -j "$nproc" 		"EXTRA_CFLAGS=${EXTRA_CFLAGS:-}" 		"LDFLAGS=${LDFLAGS:-}" 		"PROFILE_TASK=${PROFILE_TASK:-}" 	; 	rm python; 	make -j "$nproc" 		"EXTRA_CFLAGS=${EXTRA_CFLAGS:-}" 		"LDFLAGS=${LDFLAGS:--Wl},-rpath='\$\$ORIGIN/../lib'" 		"PROFILE_TASK=${PROFILE_TASK:-}" 		python 	; 	make install; 		cd /; 	rm -rf /usr/src/python; 		find /usr/local -depth 		\( 			\( -type d -a \( -name test -o -name tests -o -name idle_test \) \) 			-o \( -type f -a \( -name '*.pyc' -o -name '*.pyo' -o -name 'libpython*.a' \) \) 		\) -exec rm -rf '{}' + 	; 		find /usr/local -type f -executable -not \( -name '*tkinter*' \) -exec 
scanelf --needed --nobanner --format '%n#p' '{}' ';' 		| tr ',' '\n' 		| sort -u 		| awk 'system("[ -e /usr/local/lib/" $1 " ]") == 0 { next } { print "so:" $1 }' 		| xargs -rt apk add --no-network --virtual .python-rundeps 	; 	apk del --no-network .build-deps; 		python3 --version # buildkit
                        
# 2024-05-22 20:26:30  0.00B Set environment variable PYTHON_VERSION
ENV PYTHON_VERSION=3.12.3

# 2024-05-22 20:26:30  0.00B Set environment variable GPG_KEY
ENV GPG_KEY=7169605F62C751356D054A26A821E680E5FA6305

# 2024-05-22 20:26:30  1.02MB Run a command and create a new layer
RUN /bin/sh -c set -eux; 	apk add --no-cache 		ca-certificates 		tzdata 	; # buildkit

# 2024-05-22 20:26:30  0.00B Set environment variable LANG
ENV LANG=C.UTF-8

# 2024-05-22 20:26:30  0.00B Set environment variable PATH
ENV PATH=/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2024-05-23 02:18:12  0.00B Set the default command to run
/bin/sh -c #(nop)  CMD ["/bin/sh"]

# 2024-05-23 02:18:11  7.79MB Add the base filesystem layer
/bin/sh -c #(nop) ADD file:e3abcdba177145039cfef1ad882f9f81a612a24c9f044b19f713b95454d2e3f6 in / 

Image information

{
    "Id": "sha256:6af0264087550d9b7cb5e51845ac5a60df24276c4f50dd9237a37135bd2c597d",
    "RepoTags": [
        "acryldata/datahub-kafka-setup:v0.13.3",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v0.13.3"
    ],
    "RepoDigests": [
        "acryldata/datahub-kafka-setup@sha256:a14c812106cfec78548f9893ea092703b0dd632bceb18f041010420dd9728fcd",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup@sha256:2881e8769b0d63d3d643876cf694f91d63660f1a819ffab16839042b83c3ec86"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2024-05-23T23:18:52.74715601Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/sbin:/opt/kafka/bin/:/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "LANG=C.UTF-8",
            "GPG_KEY=7169605F62C751356D054A26A821E680E5FA6305",
            "PYTHON_VERSION=3.12.3",
            "PYTHON_PIP_VERSION=24.0",
            "PYTHON_GET_PIP_URL=https://github.com/pypa/get-pip/raw/dbf0c85f76fb6e1ab42aa672ffca6f0a675d9ee4/public/get-pip.py",
            "PYTHON_GET_PIP_SHA256=dfe9fd5c28dc98b5ac17979a953ea550cec37ae1b47a5116007395bfacff2ab9",
            "KAFKA_VERSION=3.7.0",
            "SCALA_VERSION=2.13",
            "METADATA_AUDIT_EVENT_NAME=MetadataAuditEvent_v4",
            "METADATA_CHANGE_EVENT_NAME=MetadataChangeEvent_v4",
            "FAILED_METADATA_CHANGE_EVENT_NAME=FailedMetadataChangeEvent_v4",
            "DATAHUB_USAGE_EVENT_NAME=DataHubUsageEvent_v1",
            "METADATA_CHANGE_LOG_VERSIONED_TOPIC_NAME=MetadataChangeLog_Versioned_v1",
            "METADATA_CHANGE_LOG_TIMESERIES_TOPIC_NAME=MetadataChangeLog_Timeseries_v1",
            "METADATA_CHANGE_PROPOSAL_TOPIC_NAME=MetadataChangeProposal_v1",
            "FAILED_METADATA_CHANGE_PROPOSAL_TOPIC_NAME=FailedMetadataChangeProposal_v1",
            "PLATFORM_EVENT_TOPIC_NAME=PlatformEvent_v1",
            "DATAHUB_UPGRADE_HISTORY_TOPIC_NAME=DataHubUpgradeHistory_v1",
            "USE_CONFLUENT_SCHEMA_REGISTRY=TRUE"
        ],
        "Cmd": [
            "/bin/sh",
            "-c",
            "./kafka-setup.sh"
        ],
        "ArgsEscaped": true,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "/opt/kafka",
        "Entrypoint": null,
        "OnBuild": null,
        "Labels": {
            "name": "kafka",
            "version": "3.7.0"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 602159594,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/5596e4892c45e5ad379f2b37e82858da358ffd1c7af26ea4f465bd73f96389ba/diff:/var/lib/docker/overlay2/f8b15ded0e087528c16577ad2767cce250eb19bc0d21086968e408010660746e/diff:/var/lib/docker/overlay2/2faab38d928703396d1d22554c150d0726722770dcd4c87508346648f1c63e9d/diff:/var/lib/docker/overlay2/bbf50d832a391730c49b816e91f2972f1c9d5129bdc49c864e2472b54d318ffd/diff:/var/lib/docker/overlay2/6bfc062e646b3abbead8de2e9b3bf0cc19cfb88b80b8c1e942a7fd9359597bad/diff:/var/lib/docker/overlay2/fc77959e8a03f2657fb157f32161841f79b6f22fd017ef5249544d6e73f34898/diff:/var/lib/docker/overlay2/32883b0c8620ff5aab1e460b00a0cb4736d16ab60388dc3d1a05ba54af5265e9/diff:/var/lib/docker/overlay2/6778e0f2507e000b95f389ea263f972e0f86dccff1f29b560681e50fea08594e/diff:/var/lib/docker/overlay2/dae83fe8b7a497eb79db1bc66bfa1d8879966edf194d3ec505f74b41e553576d/diff:/var/lib/docker/overlay2/6e9bb81d4932e1bb06cf08bb64b73f69940d82d979664200eeef564e14b3123f/diff:/var/lib/docker/overlay2/0757826a172c9d06063629bfad638949325c2f61eb148289c2d18936d5f18648/diff:/var/lib/docker/overlay2/48b0e943abe3d5661678207410ead6b81266985208e0b7600bd90bf747c27ca9/diff:/var/lib/docker/overlay2/12000cca94b671946a69f0852fcfc0bee7253f626d688943bedbba5cf0c05775/diff:/var/lib/docker/overlay2/8e70bf8c399adb9fa837354ca12201e7b5f2b96865a53ccb6d3021f69cb4f6ed/diff:/var/lib/docker/overlay2/47bf52c61a435c74d9e081fd70830815f6fa3751f0d576ae6218e9f47f165f81/diff:/var/lib/docker/overlay2/287fadcbfe5fd688d84bb2dc15de6fe3bcf705125cbf3527ed5b683e8257d16f/diff:/var/lib/docker/overlay2/46941647a6d1a0a53eadff03892a9acd3cd14c360edda1e3cef31223cf06c0c5/diff:/var/lib/docker/overlay2/fc1aefb634089d9c8c10011939095ea0a3f022e7491540462466e2b52133b2fc/diff:/var/lib/docker/overlay2/cc869ae6b6029525171dc0fe28f09d508088a9b51e04fb13362902e42199cee4/diff:/var/lib/docker/overlay2/0ba7f042e18abf4e70cf02edfbbb10efb0f2a80391a75e5169bb2177e228332b/diff",
            "MergedDir": "/var/lib/docker/overlay2/92619f83394e161e852e944f0b7f3701a8d06db3be814fa90e71ddd211b61e66/merged",
            "UpperDir": "/var/lib/docker/overlay2/92619f83394e161e852e944f0b7f3701a8d06db3be814fa90e71ddd211b61e66/diff",
            "WorkDir": "/var/lib/docker/overlay2/92619f83394e161e852e944f0b7f3701a8d06db3be814fa90e71ddd211b61e66/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:02f2bcb26af5ea6d185dcf509dc795746d907ae10c53918b6944ac85447a0c72",
            "sha256:2b4aacde20aba82fae6121eac6bc0e093c71a139dbfb912e4887b4877f39a862",
            "sha256:2d4dfc806acc4792c2968f88b0e1d364d1e46ae53c6551ea3a60c0b9a53dd88c",
            "sha256:f17b1f60a5008430de2f767bd362525600b08a4915bed5783bc7a778c47c92dd",
            "sha256:b460a573bcd3c94fb8285c29e46e2ffd3f3d3fd1f68ddfbc6dd30414c51dfe55",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:a668ed34c23527981f697447802301e55280ce8a4db78fcecb9461a22a5ddb8f",
            "sha256:0113c2ba534927d63d981271fc15c3ec0d1e47750eb4c2a2f6ccf68394f3ccf7",
            "sha256:1834edadf0a0b42aaabc2c512231dfb86d5766d5588dada178b9581ba4c63954",
            "sha256:f041de07e9339615c2f42f1109fa35ec242845ddd110d0b6a3d18f0ab56c7c94",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:641defbd565c7cbcc49987b40aae8b9ad70c8ef7c760b40afead1321c3907fe7",
            "sha256:22c977a38a7ef0a88f7179bf503f54b1cb47efcc9f7e12bf75e96799e1a5b1c3",
            "sha256:8a04b9b1849dcfab33f9767bedcbbed87a5846550b1f6c2fa176ce7799ddc98f",
            "sha256:99b157070c9bbf4fb932140e04c3b2d13e48629d9d05dc56a04f3d49e6e337d1",
            "sha256:8d1c1a3a4deff4b0fba8086c52a7fd30b76aae35ac87aed7f0517116b3f230b7",
            "sha256:a0cae1ab9174cfb77892f30221afe9df3abf1b0e517bed08de5421db15dbe9e7",
            "sha256:dfdfe2fdb97ec4c5469c5235c5b1e3bd68165c3c1b3d7ce9f69dc47625bf166b",
            "sha256:8559fbd52bf5b7c1e9bb5b85e57ba163cd7434be3c780d73da65c94b5e43d4f2",
            "sha256:b45eae398fabd59610ce98151a78c05fa70dbac595c6a8c09b520fab8682c6c2"
        ]
    },
    "Metadata": {
        "LastTagTime": "2025-04-23T12:39:26.323775118+08:00"
    }
}
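
The `Size` field in the inspect output above is in bytes; the 602.16MB figure quoted on this page follows from dividing by 10^6 (decimal megabytes), which can be checked directly:

```shell
# Convert the inspect Size (602159594 bytes) to the decimal-MB figure
# shown in the metadata table on this page.
size_mb=$(awk 'BEGIN { printf "%.2fMB", 602159594 / 1000000 }')
echo "$size_mb"   # 602.16MB
```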

More versions

docker.io/acryldata/datahub-kafka-setup:edb9a87
linux/amd64 · docker.io · 604.05MB · 2024-08-18 11:52

docker.io/acryldata/datahub-kafka-setup:head
linux/amd64 · docker.io · 605.30MB · 2024-09-04 17:13

docker.io/acryldata/datahub-kafka-setup:v0.13.3
linux/amd64 · docker.io · 602.16MB · 2025-04-23 12:40