docker.io/acryldata/datahub-kafka-setup:v1.2.0.1 linux/amd64

DataHub Kafka Setup: a Docker container image used to set up and configure an Apache Kafka cluster.
Source image      docker.io/acryldata/datahub-kafka-setup:v1.2.0.1
China mirror      swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v1.2.0.1
Image ID          sha256:a5502899aff7c40514c72c65cde32e5fca49550c812cd6edf622caa2213754aa
Image tag         v1.2.0.1
Size              824.36MB
Registry          docker.io
Project info      Docker Hub page · project tags
CMD               /bin/sh -c ./kafka-setup.sh
Entrypoint        (none)
Working directory /opt/kafka
OS/Arch           linux/amd64
Views             12
Contributor       my******t@163.com
Image created     2025-08-22T15:43:54.004632546Z
Synced            2025-08-27 11:09
Updated           2025-08-27 17:12

Environment variables

PATH=/sbin:/opt/kafka/bin/:/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
GPG_KEY=7169605F62C751356D054A26A821E680E5FA6305
PYTHON_VERSION=3.13.7
PYTHON_SHA256=5462f9099dfd30e238def83c71d91897d8caa5ff6ebc7a50f14d4802cdaaa79a
KAFKA_VERSION=4.0.0
SCALA_VERSION=2.13
METADATA_AUDIT_EVENT_NAME=MetadataAuditEvent_v4
METADATA_CHANGE_EVENT_NAME=MetadataChangeEvent_v4
FAILED_METADATA_CHANGE_EVENT_NAME=FailedMetadataChangeEvent_v4
DATAHUB_USAGE_EVENT_NAME=DataHubUsageEvent_v1
METADATA_CHANGE_LOG_VERSIONED_TOPIC_NAME=MetadataChangeLog_Versioned_v1
METADATA_CHANGE_LOG_TIMESERIES_TOPIC_NAME=MetadataChangeLog_Timeseries_v1
METADATA_CHANGE_PROPOSAL_TOPIC_NAME=MetadataChangeProposal_v1
FAILED_METADATA_CHANGE_PROPOSAL_TOPIC_NAME=FailedMetadataChangeProposal_v1
PLATFORM_EVENT_TOPIC_NAME=PlatformEvent_v1
DATAHUB_UPGRADE_HISTORY_TOPIC_NAME=DataHubUpgradeHistory_v1
USE_CONFLUENT_SCHEMA_REGISTRY=TRUE

Image labels

name=kafka  version=4.0.0
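
For context, this setup container is normally run once against an existing Kafka broker: it executes kafka-setup.sh to create the DataHub topics named in the environment variables above and then exits. A minimal, hedged sketch of such a run follows; KAFKA_BOOTSTRAP_SERVER is assumed to be the variable kafka-setup.sh reads for the broker address, so verify it against the script in your version, and any of the topic-name variables above can be overridden the same way.

# Hypothetical one-shot run; KAFKA_BOOTSTRAP_SERVER is an assumed variable name,
# check kafka-setup.sh inside the image before relying on it.
docker run --rm \
  -e KAFKA_BOOTSTRAP_SERVER=my-kafka:9092 \
  -e PLATFORM_EVENT_TOPIC_NAME=PlatformEvent_v1 \
  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v1.2.0.1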

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v1.2.0.1
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v1.2.0.1  docker.io/acryldata/datahub-kafka-setup:v1.2.0.1
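
To pin the exact build instead of the mutable tag, the mirror digest listed under RepoDigests in the image info below can be pulled directly; a sketch:

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup@sha256:d4dd9654d738a0be2f7899c147a4b1c12acad9b742baad88539fe4f152e7d903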

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v1.2.0.1
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v1.2.0.1  docker.io/acryldata/datahub-kafka-setup:v1.2.0.1
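
On a Kubernetes node that uses containerd, the kubelet resolves images from the k8s.io namespace, so the pull and re-tag usually need an explicit -n k8s.io; a sketch (adjust the namespace to your cluster's configuration):

ctr -n k8s.io images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v1.2.0.1
ctr -n k8s.io images tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v1.2.0.1 docker.io/acryldata/datahub-kafka-setup:v1.2.0.1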

Shell quick-replace command

sed -i 's#acryldata/datahub-kafka-setup:v1.2.0.1#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v1.2.0.1#' deployment.yaml
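
To apply the same substitution to every manifest in a directory rather than a single deployment.yaml, a hedged variant using find (review the changes, for example with git diff, before committing):

find . -name '*.yaml' -exec sed -i 's#acryldata/datahub-kafka-setup:v1.2.0.1#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v1.2.0.1#g' {} +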

Ansible quick distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v1.2.0.1 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v1.2.0.1  docker.io/acryldata/datahub-kafka-setup:v1.2.0.1'

Ansible quick distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v1.2.0.1 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v1.2.0.1  docker.io/acryldata/datahub-kafka-setup:v1.2.0.1'
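
After distribution, image presence on each node can be spot-checked with the same ad-hoc pattern; the commands below are illustrative and assume the same k8s inventory group:

#ansible k8s -m shell -a 'docker images | grep datahub-kafka-setup'
#ansible k8s -m shell -a 'ctr images ls | grep datahub-kafka-setup'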

Image build history


# 2025-08-22 23:43:54  0.00B Set the default command
CMD ["/bin/sh" "-c" "./kafka-setup.sh"]
                        
# 2025-08-22 23:43:54  2.18KB Run command and create a new image layer
RUN |6 ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine APACHE_DOWNLOAD_URL=null GITHUB_REPO_URL=https://github.com MAVEN_CENTRAL_REPO_URL=https://repo1.maven.org/maven2 SNAKEYAML_VERSION=2.4 COMMONS_BEAN_UTILS_VERSION=1.11.0 /bin/sh -c chmod +x ./kafka-setup.sh ./kafka-topic-workers.sh ./kafka-ready.sh # buildkit
                        
# 2025-08-22 23:43:53  658.00B Copy new files or directories into the container
COPY docker/kafka-setup/env_to_properties.py ./env_to_properties.py # buildkit
                        
# 2025-08-22 23:43:53  317.00B Copy new files or directories into the container
COPY docker/kafka-setup/kafka-ready.sh ./kafka-ready.sh # buildkit
                        
# 2025-08-22 23:43:53  2.18KB Copy new files or directories into the container
COPY docker/kafka-setup/kafka-topic-workers.sh ./kafka-topic-workers.sh # buildkit
                        
# 2025-08-22 23:43:53  323.00B Copy new files or directories into the container
COPY docker/kafka-setup/kafka-config.sh ./kafka-config.sh # buildkit
                        
# 2025-08-22 23:43:53  5.40KB Copy new files or directories into the container
COPY docker/kafka-setup/kafka-setup.sh ./kafka-setup.sh # buildkit
                        
# 2025-08-22 23:43:53  0.00B Set environment variable USE_CONFLUENT_SCHEMA_REGISTRY
ENV USE_CONFLUENT_SCHEMA_REGISTRY=TRUE
                        
# 2025-08-22 23:43:53  0.00B Set environment variable DATAHUB_UPGRADE_HISTORY_TOPIC_NAME
ENV DATAHUB_UPGRADE_HISTORY_TOPIC_NAME=DataHubUpgradeHistory_v1
                        
# 2025-08-22 23:43:53  0.00B Set environment variable PLATFORM_EVENT_TOPIC_NAME
ENV PLATFORM_EVENT_TOPIC_NAME=PlatformEvent_v1
                        
# 2025-08-22 23:43:53  0.00B Set environment variable FAILED_METADATA_CHANGE_PROPOSAL_TOPIC_NAME
ENV FAILED_METADATA_CHANGE_PROPOSAL_TOPIC_NAME=FailedMetadataChangeProposal_v1
                        
# 2025-08-22 23:43:53  0.00B Set environment variable METADATA_CHANGE_PROPOSAL_TOPIC_NAME
ENV METADATA_CHANGE_PROPOSAL_TOPIC_NAME=MetadataChangeProposal_v1
                        
# 2025-08-22 23:43:53  0.00B Set environment variable METADATA_CHANGE_LOG_TIMESERIES_TOPIC_NAME
ENV METADATA_CHANGE_LOG_TIMESERIES_TOPIC_NAME=MetadataChangeLog_Timeseries_v1
                        
# 2025-08-22 23:43:53  0.00B Set environment variable METADATA_CHANGE_LOG_VERSIONED_TOPIC_NAME
ENV METADATA_CHANGE_LOG_VERSIONED_TOPIC_NAME=MetadataChangeLog_Versioned_v1
                        
# 2025-08-22 23:43:53  0.00B Set environment variable DATAHUB_USAGE_EVENT_NAME
ENV DATAHUB_USAGE_EVENT_NAME=DataHubUsageEvent_v1
                        
# 2025-08-22 23:43:53  0.00B Set environment variable FAILED_METADATA_CHANGE_EVENT_NAME
ENV FAILED_METADATA_CHANGE_EVENT_NAME=FailedMetadataChangeEvent_v4
                        
# 2025-08-22 23:43:53  0.00B Set environment variable METADATA_CHANGE_EVENT_NAME
ENV METADATA_CHANGE_EVENT_NAME=MetadataChangeEvent_v4
                        
# 2025-08-22 23:43:53  0.00B Set environment variable METADATA_AUDIT_EVENT_NAME
ENV METADATA_AUDIT_EVENT_NAME=MetadataAuditEvent_v4
                        
# 2025-08-22 23:43:53  13.69MB Copy files or directories into the container
ADD --chown=kafka:kafka https://github.com/aws/aws-msk-iam-auth/releases/download/v2.3.2/aws-msk-iam-auth-2.3.2-all.jar /opt/kafka/libs # buildkit
                        
# 2025-08-22 23:43:53  13.69MB Copy files or directories into the container
ADD --chown=kafka:kafka https://github.com/aws/aws-msk-iam-auth/releases/download/v2.3.2/aws-msk-iam-auth-2.3.2-all.jar /usr/share/java/cp-base-new # buildkit
                        
# 2025-08-22 23:43:52  493.61KB Run command and create a new image layer
RUN |6 ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine APACHE_DOWNLOAD_URL=null GITHUB_REPO_URL=https://github.com MAVEN_CENTRAL_REPO_URL=https://repo1.maven.org/maven2 SNAKEYAML_VERSION=2.4 COMMONS_BEAN_UTILS_VERSION=1.11.0 /bin/sh -c rm /usr/share/java/cp-base-new/commons-beanutils-*.jar   && wget -P /usr/share/java/cp-base-new $MAVEN_CENTRAL_REPO_URL/commons-beanutils/commons-beanutils/$COMMONS_BEAN_UTILS_VERSION/commons-beanutils-$COMMONS_BEAN_UTILS_VERSION.jar   && rm /opt/kafka/libs/commons-beanutils-*.jar   && cp /usr/share/java/cp-base-new/commons-beanutils-$COMMONS_BEAN_UTILS_VERSION.jar  /opt/kafka/libs # buildkit
                        
# 2025-08-22 23:43:52  0.00B Define build argument
ARG COMMONS_BEAN_UTILS_VERSION=1.11.0
                        
# 2025-08-22 23:43:52  339.82KB Run command and create a new image layer
RUN |5 ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine APACHE_DOWNLOAD_URL=null GITHUB_REPO_URL=https://github.com MAVEN_CENTRAL_REPO_URL=https://repo1.maven.org/maven2 SNAKEYAML_VERSION=2.4 /bin/sh -c rm /usr/share/java/cp-base-new/snakeyaml-*.jar   && wget -P /usr/share/java/cp-base-new $MAVEN_CENTRAL_REPO_URL/org/yaml/snakeyaml/$SNAKEYAML_VERSION/snakeyaml-$SNAKEYAML_VERSION.jar # buildkit
                        
# 2025-08-22 23:43:51  0.00B Define build argument
ARG SNAKEYAML_VERSION=2.4
                        
# 2025-08-22 23:43:51  0.00B Define build argument
ARG MAVEN_CENTRAL_REPO_URL
                        
# 2025-08-22 23:43:51  336.00B Copy new files or directories into the container
COPY /etc/cp-base-new/log4j2.yaml /etc/cp-base-new/log4j2.yaml # buildkit
                        
# 2025-08-22 23:43:50  50.24MB Copy new files or directories into the container
COPY /usr/share/java/cp-base-new/ /usr/share/java/cp-base-new/ # buildkit
                        
# 2025-08-22 23:43:48  0.00B Run command and create a new image layer
RUN |3 ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine APACHE_DOWNLOAD_URL=null GITHUB_REPO_URL=https://github.com /bin/sh -c ls -la # buildkit
                        
# 2025-08-22 23:43:46  0.00B Set working directory to /opt/kafka
WORKDIR /opt/kafka
                        
# 2025-08-22 23:43:44  0.00B Set environment variable PATH
ENV PATH=/sbin:/opt/kafka/bin/:/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2025-08-22 23:43:44  137.09MB Run command and create a new image layer
RUN |3 ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine APACHE_DOWNLOAD_URL=null GITHUB_REPO_URL=https://github.com /bin/sh -c mkdir -p /opt   && if [ "${APACHE_DOWNLOAD_URL}" != "null" ] ; then mirror="${APACHE_DOWNLOAD_URL}/" ; else mirror=$(curl --stderr /dev/null https://www.apache.org/dyn/closer.cgi\?as_json\=1 | jq -r '.preferred'); fi   && curl -sSL "${mirror}kafka/${KAFKA_VERSION}/kafka_${SCALA_VERSION}-${KAFKA_VERSION}.tgz"   | tar -xzf - -C /opt   && mv /opt/kafka_${SCALA_VERSION}-${KAFKA_VERSION} /opt/kafka   && adduser -DH -s /sbin/nologin kafka   && chown -R kafka: /opt/kafka   && rm -rf /tmp/*   && apk del --purge .build-deps # buildkit
                        
# 2025-08-22 23:43:35  379.23MB Run command and create a new image layer
RUN |3 ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine APACHE_DOWNLOAD_URL=null GITHUB_REPO_URL=https://github.com /bin/sh -c apk --no-cache --update-cache --available upgrade     && apk --no-cache add 'c-ares>1.34.5'  --repository=${ALPINE_REPO_URL}/edge/main     && apk --no-cache add -t .build-deps git curl ca-certificates jq gcc musl-dev libffi-dev zip # buildkit
                        
# 2025-08-22 23:43:21  181.89MB Run command and create a new image layer
RUN |3 ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine APACHE_DOWNLOAD_URL=null GITHUB_REPO_URL=https://github.com /bin/sh -c apk --no-cache add openjdk17-jre-headless --repository=${ALPINE_REPO_URL}/edge/community # buildkit
                        
# 2025-08-22 23:43:11  2.82MB Run command and create a new image layer
RUN |3 ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine APACHE_DOWNLOAD_URL=null GITHUB_REPO_URL=https://github.com /bin/sh -c apk add --no-cache bash coreutils # buildkit
                        
# 2025-08-22 23:43:06  0.00B Run command and create a new image layer
RUN |3 ALPINE_REPO_URL=http://dl-cdn.alpinelinux.org/alpine APACHE_DOWNLOAD_URL=null GITHUB_REPO_URL=https://github.com /bin/sh -c if [ "${ALPINE_REPO_URL}" != "http://dl-cdn.alpinelinux.org/alpine" ] ; then sed -i "s#http.*://dl-cdn.alpinelinux.org/alpine#${ALPINE_REPO_URL}#g" /etc/apk/repositories ; fi # buildkit
                        
# 2025-08-22 23:43:06  0.00B Add metadata labels
LABEL name=kafka version=4.0.0
                        
# 2025-08-22 23:43:06  0.00B Set environment variable SCALA_VERSION
ENV SCALA_VERSION=2.13
                        
# 2025-08-22 23:43:06  0.00B Set environment variable KAFKA_VERSION
ENV KAFKA_VERSION=4.0.0
                        
# 2025-08-22 23:43:06  0.00B Define build argument
ARG GITHUB_REPO_URL
                        
# 2025-08-22 23:43:06  0.00B Define build argument
ARG APACHE_DOWNLOAD_URL
                        
# 2025-08-22 23:43:06  0.00B Define build argument
ARG ALPINE_REPO_URL
                        
# 2025-08-15 05:49:23  0.00B Set the default command
CMD ["python3"]
                        
# 2025-08-15 05:49:23  36.00B Run command and create a new image layer
RUN /bin/sh -c set -eux; 	for src in idle3 pip3 pydoc3 python3 python3-config; do 		dst="$(echo "$src" | tr -d 3)"; 		[ -s "/usr/local/bin/$src" ]; 		[ ! -e "/usr/local/bin/$dst" ]; 		ln -svT "$src" "/usr/local/bin/$dst"; 	done # buildkit
                        
# 2025-08-15 05:49:23  35.59MB Run command and create a new image layer
RUN /bin/sh -c set -eux; 		apk add --no-cache --virtual .build-deps 		gnupg 		tar 		xz 				bluez-dev 		bzip2-dev 		dpkg-dev dpkg 		findutils 		gcc 		gdbm-dev 		libc-dev 		libffi-dev 		libnsl-dev 		libtirpc-dev 		linux-headers 		make 		ncurses-dev 		openssl-dev 		pax-utils 		readline-dev 		sqlite-dev 		tcl-dev 		tk 		tk-dev 		util-linux-dev 		xz-dev 		zlib-dev 	; 		wget -O python.tar.xz "https://www.python.org/ftp/python/${PYTHON_VERSION%%[a-z]*}/Python-$PYTHON_VERSION.tar.xz"; 	echo "$PYTHON_SHA256 *python.tar.xz" | sha256sum -c -; 	wget -O python.tar.xz.asc "https://www.python.org/ftp/python/${PYTHON_VERSION%%[a-z]*}/Python-$PYTHON_VERSION.tar.xz.asc"; 	GNUPGHOME="$(mktemp -d)"; export GNUPGHOME; 	gpg --batch --keyserver hkps://keys.openpgp.org --recv-keys "$GPG_KEY"; 	gpg --batch --verify python.tar.xz.asc python.tar.xz; 	gpgconf --kill all; 	rm -rf "$GNUPGHOME" python.tar.xz.asc; 	mkdir -p /usr/src/python; 	tar --extract --directory /usr/src/python --strip-components=1 --file python.tar.xz; 	rm python.tar.xz; 		cd /usr/src/python; 	gnuArch="$(dpkg-architecture --query DEB_BUILD_GNU_TYPE)"; 	./configure 		--build="$gnuArch" 		--enable-loadable-sqlite-extensions 		--enable-option-checking=fatal 		--enable-shared 		$(test "${gnuArch%%-*}" != 'riscv64' && echo '--with-lto') 		--with-ensurepip 	; 	nproc="$(nproc)"; 	EXTRA_CFLAGS="-DTHREAD_STACK_SIZE=0x100000"; 	LDFLAGS="${LDFLAGS:--Wl},--strip-all"; 		arch="$(apk --print-arch)"; 		case "$arch" in 			x86_64|aarch64) 				EXTRA_CFLAGS="${EXTRA_CFLAGS:-} -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer"; 				;; 			x86) 				;; 			*) 				EXTRA_CFLAGS="${EXTRA_CFLAGS:-} -fno-omit-frame-pointer"; 				;; 		esac; 	make -j "$nproc" 		"EXTRA_CFLAGS=${EXTRA_CFLAGS:-}" 		"LDFLAGS=${LDFLAGS:-}" 	; 	rm python; 	make -j "$nproc" 		"EXTRA_CFLAGS=${EXTRA_CFLAGS:-}" 		"LDFLAGS=${LDFLAGS:--Wl},-rpath='\$\$ORIGIN/../lib'" 		python 	; 	make install; 		cd /; 	rm -rf /usr/src/python; 		find /usr/local -depth 		\( 			\( -type d -a \( -name test -o -name tests -o -name idle_test \) \) 			-o \( -type f -a \( -name '*.pyc' -o -name '*.pyo' -o -name 'libpython*.a' \) \) 		\) -exec rm -rf '{}' + 	; 		find /usr/local -type f -executable -not \( -name '*tkinter*' \) -exec scanelf --needed --nobanner --format '%n#p' '{}' ';' 		| tr ',' '\n' 		| sort -u 		| awk 'system("[ -e /usr/local/lib/" $1 " ]") == 0 { next } { print "so:" $1 }' 		| xargs -rt apk add --no-network --virtual .python-rundeps 	; 	apk del --no-network .build-deps; 		export PYTHONDONTWRITEBYTECODE=1; 	python3 --version; 	pip3 --version # buildkit
                        
# 2025-08-15 05:49:23  0.00B Set environment variable PYTHON_SHA256
ENV PYTHON_SHA256=5462f9099dfd30e238def83c71d91897d8caa5ff6ebc7a50f14d4802cdaaa79a
                        
# 2025-08-15 05:49:23  0.00B Set environment variable PYTHON_VERSION
ENV PYTHON_VERSION=3.13.7
                        
# 2025-08-15 05:49:23  0.00B Set environment variable GPG_KEY
ENV GPG_KEY=7169605F62C751356D054A26A821E680E5FA6305
                        
# 2025-08-15 05:49:23  986.35KB Run command and create a new image layer
RUN /bin/sh -c set -eux; 	apk add --no-cache 		ca-certificates 		tzdata 	; # buildkit
                        
# 2025-08-15 05:49:23  0.00B Set environment variable PATH
ENV PATH=/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2025-07-15 19:01:16  0.00B Set the default command
CMD ["/bin/sh"]
                        
# 2025-07-15 19:01:16  8.31MB Copy files or directories into the container
ADD alpine-minirootfs-3.22.1-x86_64.tar.gz / # buildkit
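
The history listed above can be reproduced locally from the pulled image with docker history, for example:

docker history --no-trunc swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v1.2.0.1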
                        
                    

Image info

{
    "Id": "sha256:a5502899aff7c40514c72c65cde32e5fca49550c812cd6edf622caa2213754aa",
    "RepoTags": [
        "acryldata/datahub-kafka-setup:v1.2.0.1",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v1.2.0.1"
    ],
    "RepoDigests": [
        "acryldata/datahub-kafka-setup@sha256:0985afccd17b0af78ce73574ace4c8e6208562b1901c69f6db28f25896975298",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup@sha256:d4dd9654d738a0be2f7899c147a4b1c12acad9b742baad88539fe4f152e7d903"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2025-08-22T15:43:54.004632546Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/sbin:/opt/kafka/bin/:/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "GPG_KEY=7169605F62C751356D054A26A821E680E5FA6305",
            "PYTHON_VERSION=3.13.7",
            "PYTHON_SHA256=5462f9099dfd30e238def83c71d91897d8caa5ff6ebc7a50f14d4802cdaaa79a",
            "KAFKA_VERSION=4.0.0",
            "SCALA_VERSION=2.13",
            "METADATA_AUDIT_EVENT_NAME=MetadataAuditEvent_v4",
            "METADATA_CHANGE_EVENT_NAME=MetadataChangeEvent_v4",
            "FAILED_METADATA_CHANGE_EVENT_NAME=FailedMetadataChangeEvent_v4",
            "DATAHUB_USAGE_EVENT_NAME=DataHubUsageEvent_v1",
            "METADATA_CHANGE_LOG_VERSIONED_TOPIC_NAME=MetadataChangeLog_Versioned_v1",
            "METADATA_CHANGE_LOG_TIMESERIES_TOPIC_NAME=MetadataChangeLog_Timeseries_v1",
            "METADATA_CHANGE_PROPOSAL_TOPIC_NAME=MetadataChangeProposal_v1",
            "FAILED_METADATA_CHANGE_PROPOSAL_TOPIC_NAME=FailedMetadataChangeProposal_v1",
            "PLATFORM_EVENT_TOPIC_NAME=PlatformEvent_v1",
            "DATAHUB_UPGRADE_HISTORY_TOPIC_NAME=DataHubUpgradeHistory_v1",
            "USE_CONFLUENT_SCHEMA_REGISTRY=TRUE"
        ],
        "Cmd": [
            "/bin/sh",
            "-c",
            "./kafka-setup.sh"
        ],
        "ArgsEscaped": true,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "/opt/kafka",
        "Entrypoint": null,
        "OnBuild": null,
        "Labels": {
            "name": "kafka",
            "version": "4.0.0"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 824364670,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/2aa0b6411a906e2985cdd19818af6b0fa847d3a405c56ac7ee53c7aa641a92be/diff:/var/lib/docker/overlay2/4fdae689f321cfba34f23cb1de4178eb73b3742051ff0ef455c43b9afe15bd91/diff:/var/lib/docker/overlay2/f697faeb125603b714fad548445534c6aa6875a94412a91b71ce99caaaf11022/diff:/var/lib/docker/overlay2/6b73ce87fd465500231fffae3463c86dce75be9e047ac98109e9eecfed8d78ae/diff:/var/lib/docker/overlay2/1c79c82624744e2d5fedc593983d86d270829067d167c2ad1666a65f8c3f1c88/diff:/var/lib/docker/overlay2/1b0c266b76f285f364d52d66f4ef32c87d12a8a221cadbaba70270e9690db046/diff:/var/lib/docker/overlay2/faafa2e26a775e93d2ea95efd20bc61413ba30410765c61f46de37b2ec2fa93b/diff:/var/lib/docker/overlay2/8ac925ebe17cb8f213fb8f12098167f0acaabba83960343ae5d9283d0888eb8a/diff:/var/lib/docker/overlay2/00ef099c1a43bb0bc3ebcd7f87d9b370f519f87fb9d7bdbc5b97ac1b6f9089be/diff:/var/lib/docker/overlay2/e58bd6e7c9405751863e39a1884914bf34651f4ca8195fb414f79c14887b9307/diff:/var/lib/docker/overlay2/12bbbf55872a129c3f8dc2a34604cebec08b013044c75c6a5628ee1b828dec9c/diff:/var/lib/docker/overlay2/a2773ab55fcf110c88082936434fd1437435d924fb74c5331605c9ecbbb65a09/diff:/var/lib/docker/overlay2/0a6446e1e20e895c4afe127cbc8488f1be61cf670059cf946832b6546c2ebc77/diff:/var/lib/docker/overlay2/4a6977d8f15a25f502e66ce5418fb62e0664fb09bbfb2fb5dc188dd3defab03b/diff:/var/lib/docker/overlay2/67db5c197a6c09b7591d552d0d7678023a01edaf2e43a60c88d14aed6e263d75/diff:/var/lib/docker/overlay2/79dee73d9a6233c418a8b290bff77e2d008116fa4e4a3abdc189c8beafc40e6c/diff:/var/lib/docker/overlay2/d29e711febfa53bc1d2f0feb8b27e4c1012b18593f7a12924fb6731a8397ca7d/diff:/var/lib/docker/overlay2/6065e62f69de6c8fbaf025ac5d6b686fa40c9d9f1ed12b2010d3c4bb75966934/diff:/var/lib/docker/overlay2/d95568461f14934419bc8fea3a7c6419ddc061c467dbde70a05aaf60743cd174/diff:/var/lib/docker/overlay2/bae041e600717cd917d48789d03a7c6abff725dbbc3ab6da8e050bd1a07ddea4/diff:/var/lib/docker/overlay2/135f217b3fe5257135d9830e1f205a3abab4e3ceaa138254f8be7310fe521a11/diff:/var/lib/docker/overlay2/fce31083bf78fb669ced5e473ada166388fb6191ae6144c91b94c9a1629510cc/diff",
            "MergedDir": "/var/lib/docker/overlay2/aa084a2a0651dc5da43e2e83f34f77e5dbfc4db209b56c504110245e679a9cd8/merged",
            "UpperDir": "/var/lib/docker/overlay2/aa084a2a0651dc5da43e2e83f34f77e5dbfc4db209b56c504110245e679a9cd8/diff",
            "WorkDir": "/var/lib/docker/overlay2/aa084a2a0651dc5da43e2e83f34f77e5dbfc4db209b56c504110245e679a9cd8/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:418dccb7d85a63a6aa574439840f7a6fa6fd2321b3e2394568a317735e867d35",
            "sha256:062ca0d899245921454443a8bca59333e9f65ce0971881e23a4ed8e38ae5eb7f",
            "sha256:d2a0c772fb406ad6f280d8c4ff7f414f5062093bacd6198b945b79b8ff84afdb",
            "sha256:7e8988ece024a5f9dca59003db4b857c72fdb3e33bda91d6b97b8d066ca04cd1",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:45580e29eaa59ab2a553fa050dcb226ef559d2ba34138c6e8ae98500d9b68dd7",
            "sha256:1f142bcfd20b507ce4143df356350a6d8e59125d86b2c66991dee785ede2acf7",
            "sha256:87c9be6aa27e97d2d136ded273d0698d7654d39fa6ccb1f38ed5544564e1c2fa",
            "sha256:ad221c28542ed2af14ae1ace99a5ab7011000df501bee69daae8c18599c027e7",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:5e73a84d0d074bae674ddd81bb5179b3c8ef0bda277656c23bffb67b373df9bc",
            "sha256:d457ab560d4a25a6ef78e7e568a28ed8992f7837ea26c0d751065118b45f16e6",
            "sha256:1a9e14d3237e6e714b768158b2caf88cb7b912c82c188af8d1f4f596503a2227",
            "sha256:8e91e552e5ce432c436130f6e697322e09f56d78040422c46a5ffcd21bd18497",
            "sha256:977fb62e924f0b6e9ff28a797a0be4aed9d1e71e469901d36ab9286ba7c30123",
            "sha256:ce5cc22e9c51a4daf82dee826c4076e276a8418c62c1ff9c0267a6238e283f24",
            "sha256:2ddeb3741b4428bb8c8bd5c567479110d6fa5b7a09c0f951196d02cf6b9233a6",
            "sha256:93f5c41f0ecbc535076349f3600ff412b806181b0b1f0bad56c6ff4bf9cbc2f9",
            "sha256:00966bbe0db03e4988853946c79c8251eb5f3a0dac84d8ab12ea821af429f2cb",
            "sha256:ee3c72c74aee17dcff34a97581f3c4d9456e137895287808cdb3823fe5d5ea6b",
            "sha256:f478e06c2fb649ef0c1d437c03244287a34e18f1efa09f613a61af2f5f068022",
            "sha256:732aef1d5aae3b13279837772ed270951a372a874759cd79816c3c2ce730bac1"
        ]
    },
    "Metadata": {
        "LastTagTime": "2025-08-27T11:08:32.899244762+08:00"
    }
}
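
The block above is standard docker image inspect output, so individual fields can be extracted with a Go template once the image is local; a quick sketch:

docker image inspect --format 'ID={{.Id}}  Size={{.Size}}  WorkingDir={{.Config.WorkingDir}}' swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/acryldata/datahub-kafka-setup:v1.2.0.1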

More versions

docker.io/acryldata/datahub-kafka-setup:edb9a87
linux/amd64 · docker.io · 604.05MB · 2024-08-18 11:52 · 255 views

docker.io/acryldata/datahub-kafka-setup:head
linux/amd64 · docker.io · 605.30MB · 2024-09-04 17:13 · 271 views

docker.io/acryldata/datahub-kafka-setup:v0.13.3
linux/amd64 · docker.io · 602.16MB · 2025-04-23 12:40 · 94 views

docker.io/acryldata/datahub-kafka-setup:v1.2.0.1
linux/amd64 · docker.io · 824.36MB · 2025-08-27 11:09 · 11 views