
ghcr.io/ztx888/halowebui:main linux/amd64

ghcr.io/ztx888/halowebui:main - China mirror download source

ghcr.io/ztx888/halowebui is HaloWebUI, a customized build of the official Open WebUI: per the image's OCI labels, it localizes the interface into Chinese and adds model billing and usage statistics on top of upstream. Despite the name, it is an AI chat web UI, not a frontend for the Halo blog platform.

Source image  ghcr.io/ztx888/halowebui:main
China mirror  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ztx888/halowebui:main
Image ID  sha256:5d74395b15fa91d53b26d24df5cadceaaaf80cc72a277201632530da22a3a3ac
Tag  main
Size  1.49GB
Registry  ghcr.io
CMD  bash start.sh
Entrypoint  (none)
Working directory  /app/backend
OS/Arch  linux/amd64
Image created  2026-05-07T05:21:57.334619584Z
Synced  2026-05-09 00:36
Exposed ports  8080/tcp
Environment variables
PATH=/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin LANG=C.UTF-8 GPG_KEY=A035C8C19219BA821ECEA86B64E628F8D684696D PYTHON_VERSION=3.11.15 PYTHON_SHA256=272179ddd9a2e41a0fc8e42e33dfbdca0b3711aa5abf372d3f2d51543d09b625 ENV=prod PORT=8080 USE_OLLAMA_DOCKER=false INSTALL_PROFILE=core PRELOAD_LOCAL_MODELS=false USE_CUDA_DOCKER=false USE_CUDA_DOCKER_VER=cu121 USE_EMBEDDING_MODEL_DOCKER=sentence-transformers/all-MiniLM-L6-v2 USE_RERANKING_MODEL_DOCKER= ENABLE_LOCAL_MODEL_RUNTIME=false OLLAMA_BASE_URL=/ollama OPENAI_API_BASE_URL= SCARF_NO_ANALYTICS=true DO_NOT_TRACK=true ANONYMIZED_TELEMETRY=false WHISPER_MODEL=base WHISPER_MODEL_DIR=/app/backend/data/cache/whisper/models RAG_EMBEDDING_MODEL=sentence-transformers/all-MiniLM-L6-v2 RAG_RERANKING_MODEL= SENTENCE_TRANSFORMERS_HOME=/app/backend/data/cache/embedding/models TIKTOKEN_ENCODING_NAME=cl100k_base TIKTOKEN_CACHE_DIR=/app/backend/data/cache/tiktoken HF_HOME=/app/backend/data/cache/embedding/models HALO_RUNTIME_PROFILE=main HOME=/root WEBUI_BUILD_VERSION=b2eecdce3d1915daa8b3057605c64c0378148f48 DOCKER=true
Image labels
org.opencontainers.image.created: 2026-05-07T05:16:24.202Z
org.opencontainers.image.description: 基于官方OpenWebUI,汉化界面提高中文使用体验,增加了模型计费和用量统计 (Based on the official OpenWebUI, with a Chinese-localized interface plus model billing and usage statistics)
org.opencontainers.image.licenses: BSD-3-Clause
org.opencontainers.image.revision: b2eecdce3d1915daa8b3057605c64c0378148f48
org.opencontainers.image.source: https://github.com/ztx888/HaloWebUI
org.opencontainers.image.title: HaloWebUI
org.opencontainers.image.url: https://github.com/ztx888/HaloWebUI
org.opencontainers.image.version: main
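The metadata above maps onto a minimal docker-compose sketch. The service name, named volume, and the persistence mount are my assumptions (the build history creates /app/backend/data and the cache env vars point under it); this is not an official compose file for the image.

```yaml
# Minimal sketch, assumptions noted above.
services:
  halowebui:                                  # assumed service name
    image: swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ztx888/halowebui:main
    ports:
      - "8080:8080"                           # image exposes 8080/tcp
    volumes:
      - halowebui-data:/app/backend/data      # assumed persistence mount
    restart: unless-stopped
volumes:
  halowebui-data:
```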

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ztx888/halowebui:main
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ztx888/halowebui:main  ghcr.io/ztx888/halowebui:main
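The two commands above can be generated from a single mirror prefix, which helps when several images sit behind the same mirror. A dry-run sketch (mirror_cmds is my naming; pipe its output to sh to actually execute):

```shell
# Print the pull-and-retag commands for any image behind this mirror.
# Printed as-is this is a dry run; pipe to "sh" to execute.
mirror_cmds() {
    mirror="swr.cn-north-4.myhuaweicloud.com/ddn-k8s"
    echo "docker pull ${mirror}/$1"
    echo "docker tag ${mirror}/$1 $1"
}

mirror_cmds ghcr.io/ztx888/halowebui:main
```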

Containerd pull commands (on Kubernetes nodes, add -n k8s.io so the retagged image lands in the namespace the kubelet reads from)

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ztx888/halowebui:main
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ztx888/halowebui:main  ghcr.io/ztx888/halowebui:main

Quick shell replacement (sed)

sed -i 's#ghcr.io/ztx888/halowebui:main#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ztx888/halowebui:main#' deployment.yaml
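The one-liner above rewrites a single file; this sketch extends it to a whole manifest tree. The demo file is created only to make the effect visible, and '#' is used as the sed delimiter so the slashes in the image references need no escaping.

```shell
# Rewrite every *.yaml under a directory to pull from the mirror (sketch).
SRC="ghcr.io/ztx888/halowebui:main"
DST="swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ztx888/halowebui:main"

workdir="$(mktemp -d)"                          # demo tree; use your repo root instead
printf 'image: %s\n' "$SRC" > "$workdir/deployment.yaml"

# '#' as the delimiter avoids escaping every '/' in the image references
find "$workdir" -name '*.yaml' -exec sed -i "s#${SRC}#${DST}#g" {} +

cat "$workdir/deployment.yaml"
```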

Quick Ansible rollout - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ztx888/halowebui:main && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ztx888/halowebui:main  ghcr.io/ztx888/halowebui:main'

Quick Ansible rollout - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ztx888/halowebui:main && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ztx888/halowebui:main  ghcr.io/ztx888/halowebui:main'
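For repeatable rollouts, the ad-hoc command above can be kept as a small playbook. A sketch, reusing the k8s host group from the examples; the task names are mine, and ansible.builtin.shell mirrors the ad-hoc && chain rather than using a Docker-specific module:

```yaml
# mirror-pull.yml - sketch equivalent of the ad-hoc Docker command above
- hosts: k8s
  gather_facts: false
  tasks:
    - name: Pull from the China mirror and retag to the original reference
      ansible.builtin.shell: >-
        docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ztx888/halowebui:main
        && docker tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ztx888/halowebui:main
        ghcr.io/ztx888/halowebui:main
```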

Image build history


# 2026-05-07 13:21:57  0.00B Set the default command to execute
CMD ["bash" "start.sh"]
                        
# 2026-05-07 13:21:57  0.00B Set environment variable DOCKER
ENV DOCKER=true
                        
# 2026-05-07 13:21:57  0.00B Set environment variable WEBUI_BUILD_VERSION
ENV WEBUI_BUILD_VERSION=b2eecdce3d1915daa8b3057605c64c0378148f48
                        
# 2026-05-07 13:21:57  0.00B Define a build argument
ARG BUILD_HASH=b2eecdce3d1915daa8b3057605c64c0378148f48
                        
# 2026-05-07 13:21:57  0.00B Set the user the container runs as
USER 0:0
                        
# 2026-05-07 13:21:57  0.00B Set the command that checks container health
HEALTHCHECK &{["CMD-SHELL" "python -c \"import json, os, urllib.request; response = urllib.request.urlopen(f'http://localhost:{os.environ.get(\\\"PORT\\\", \\\"8080\\\")}/health'); assert json.loads(response.read())[\\\"status\\\"] is True\" || exit 1"] "0s" "0s" "0s" "0s" '\x00'}
                        
# 2026-05-07 13:21:57  0.00B Declare the port the container listens on at runtime
EXPOSE [8080/tcp]
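The HEALTHCHECK entry above shells out to a Python one-liner: fetch http://localhost:$PORT/health and require {"status": true} in the JSON body. The same probe unrolled into a readable function (health_ok is my naming, and the explicit timeout is an addition):

```python
# Unrolled sketch of the image's HEALTHCHECK probe.
import json
import os
import urllib.request

def health_ok(url=None):
    """Return True iff the /health endpoint reports {"status": true}."""
    port = os.environ.get("PORT", "8080")
    url = url or f"http://localhost:{port}/health"
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            return json.loads(resp.read()).get("status") is True
    except (OSError, ValueError):
        # connection/HTTP errors and non-JSON bodies count as unhealthy
        return False
```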
                        
# 2026-05-07 13:21:57  0.00B Run a command and create a new image layer
RUN |12 USE_CUDA=false USE_OLLAMA=false INSTALL_PROFILE=core PRELOAD_LOCAL_MODELS=false USE_CUDA_VER=cu121 USE_EMBEDDING_MODEL=sentence-transformers/all-MiniLM-L6-v2 USE_RERANKING_MODEL= USE_TIKTOKEN_ENCODING_NAME=cl100k_base HALO_PG_CLIENT_MAJORS=14 15 16 17 18 UID=0 GID=0 HALO_RUNTIME_PROFILE=main /bin/sh -c chown -R $UID:$GID /app "$HOME" # buildkit
                        
# 2026-05-07 13:21:51  30.87MB Copy new files or directories into the container
COPY --chown=0:0 /app/backend/open_webui/static /app/backend/open_webui/static # buildkit
                        
# 2026-05-07 13:21:50  34.78MB Copy new files or directories into the container
COPY --chown=0:0 ./backend . # buildkit
                        
# 2026-05-07 13:21:49  4.69KB Copy new files or directories into the container
COPY --chown=0:0 /app/package.json /app/package.json # buildkit
                        
# 2026-05-07 13:21:49  601.00B Copy new files or directories into the container
COPY --chown=0:0 /app/CHANGELOG.md /app/CHANGELOG.md # buildkit
                        
# 2026-05-07 13:21:49  92.17MB Copy new files or directories into the container
COPY --chown=0:0 /app/build /app/build # buildkit
                        
# 2026-04-23 23:44:56  36.00B Run a command and create a new image layer
RUN |12 USE_CUDA=false USE_OLLAMA=false INSTALL_PROFILE=core PRELOAD_LOCAL_MODELS=false USE_CUDA_VER=cu121 USE_EMBEDDING_MODEL=sentence-transformers/all-MiniLM-L6-v2 USE_RERANKING_MODEL= USE_TIKTOKEN_ENCODING_NAME=cl100k_base HALO_PG_CLIENT_MAJORS=14 15 16 17 18 UID=0 GID=0 HALO_RUNTIME_PROFILE=main /bin/sh -c echo -n 00000000-0000-0000-0000-000000000000 > "$HOME/.cache/chroma/telemetry_user_id" # buildkit
                        
# 2026-04-23 23:44:56  0.00B Run a command and create a new image layer
RUN |12 USE_CUDA=false USE_OLLAMA=false INSTALL_PROFILE=core PRELOAD_LOCAL_MODELS=false USE_CUDA_VER=cu121 USE_EMBEDDING_MODEL=sentence-transformers/all-MiniLM-L6-v2 USE_RERANKING_MODEL= USE_TIKTOKEN_ENCODING_NAME=cl100k_base HALO_PG_CLIENT_MAJORS=14 15 16 17 18 UID=0 GID=0 HALO_RUNTIME_PROFILE=main /bin/sh -c mkdir -p "$HOME/.cache/chroma" /app/backend/data # buildkit
                        
# 2026-05-07 12:50:34  0.00B Run a command and create a new image layer
RUN |12 USE_CUDA=false USE_OLLAMA=false INSTALL_PROFILE=core PRELOAD_LOCAL_MODELS=false USE_CUDA_VER=cu121 USE_EMBEDDING_MODEL=sentence-transformers/all-MiniLM-L6-v2 USE_RERANKING_MODEL= USE_TIKTOKEN_ENCODING_NAME=cl100k_base HALO_PG_CLIENT_MAJORS=14 15 16 17 18 UID=0 GID=0 HALO_RUNTIME_PROFILE=main /bin/sh -c if [ $UID -ne 0 ]; then     if [ $GID -ne 0 ]; then         addgroup --gid $GID app;     fi;     adduser --uid $UID --gid $GID --home $HOME --disabled-password --no-create-home app;     fi # buildkit
                        
# 2026-04-23 23:44:56  1.21GB Run a command and create a new image layer
RUN |12 USE_CUDA=false USE_OLLAMA=false INSTALL_PROFILE=core PRELOAD_LOCAL_MODELS=false USE_CUDA_VER=cu121 USE_EMBEDDING_MODEL=sentence-transformers/all-MiniLM-L6-v2 USE_RERANKING_MODEL= USE_TIKTOKEN_ENCODING_NAME=cl100k_base HALO_PG_CLIENT_MAJORS=14 15 16 17 18 UID=0 GID=0 HALO_RUNTIME_PROFILE=main /bin/sh -c set -eux;     extra_apt_packages="";     build_apt_packages="gcc python3-dev";     pg_client_packages="";     requirements_file="requirements/${INSTALL_PROFILE}.txt";     test -f "${requirements_file}";     case "$INSTALL_PROFILE" in         local-audio) extra_apt_packages="ffmpeg libsm6 libxext6" ;;         docs-full|full) extra_apt_packages="pandoc ffmpeg libsm6 libxext6" ;;     esac;     for major in ${HALO_PG_CLIENT_MAJORS}; do         pg_client_packages="${pg_client_packages} postgresql-client-${major}";     done;     apt-get update;     apt-get install -y --no-install-recommends         curl         ca-certificates         tzdata         ${build_apt_packages}         ${extra_apt_packages};     install -d /usr/share/postgresql-common/pgdg;     curl -fsSL -o /usr/share/postgresql-common/pgdg/apt.postgresql.org.asc         https://www.postgresql.org/media/keys/ACCC4CF8.asc;     . 
/etc/os-release;     echo "deb [signed-by=/usr/share/postgresql-common/pgdg/apt.postgresql.org.asc] https://apt.postgresql.org/pub/repos/apt ${VERSION_CODENAME}-pgdg main"         > /etc/apt/sources.list.d/pgdg.list;     apt-get update;     apt-get install -y --no-install-recommends ${pg_client_packages};     if [ "$HALO_RUNTIME_PROFILE" = "main" ]; then         apt-get install -y --no-install-recommends nodejs npm git;     elif [ "$HALO_RUNTIME_PROFILE" = "slim" ]; then         apt-get install -y --no-install-recommends nodejs;     fi;     if [ "$USE_OLLAMA" = "true" ]; then         curl -fsSL https://ollama.com/install.sh | sh;     fi;     pip install --no-cache-dir uv;     if [ "$INSTALL_PROFILE" = "local-rag" ] || [ "$INSTALL_PROFILE" = "local-audio" ] || [ "$INSTALL_PROFILE" = "full" ]; then         if [ "$USE_CUDA" = "true" ]; then             pip install torch torchvision torchaudio --index-url "https://download.pytorch.org/whl/${USE_CUDA_DOCKER_VER}" --no-cache-dir;         else             pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cpu --no-cache-dir;         fi;     fi;     if [ "$INSTALL_PROFILE" = "core" ]; then         uv pip install --system -r requirements/core.txt --no-cache-dir;     else         uv pip install --system -r "${requirements_file}" --no-cache-dir;     fi;     if [ "$HALO_RUNTIME_PROFILE" = "main" ]; then         uv pip install --system -r requirements/storage-s3.txt --no-cache-dir;     fi;     if [ "$PRELOAD_LOCAL_MODELS" = "true" ]; then         if [ "$INSTALL_PROFILE" = "local-rag" ] || [ "$INSTALL_PROFILE" = "full" ]; then             python -c "import os; from sentence_transformers import SentenceTransformer; SentenceTransformer(os.environ['RAG_EMBEDDING_MODEL'], device='cpu')";         fi;         if [ "$INSTALL_PROFILE" = "local-audio" ] || [ "$INSTALL_PROFILE" = "full" ]; then             python -c "import os; from faster_whisper import WhisperModel; 
WhisperModel(os.environ['WHISPER_MODEL'], device='cpu', compute_type='int8', download_root=os.environ['WHISPER_MODEL_DIR'])";         fi;         python -c "import os; import tiktoken; tiktoken.get_encoding(os.environ['TIKTOKEN_ENCODING_NAME'])";     fi;     apt-get purge -y --auto-remove gcc python3-dev;     rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2026-04-22 22:03:37  2.77KB Copy new files or directories into the container
COPY ./backend/requirements ./requirements # buildkit
                        
# 2026-04-22 22:03:37  0.00B Set the working directory to /app/backend
WORKDIR /app/backend
                        
# 2026-04-22 22:03:37  0.00B Set environment variables ENV PORT USE_OLLAMA_DOCKER INSTALL_PROFILE PRELOAD_LOCAL_MODELS USE_CUDA_DOCKER USE_CUDA_DOCKER_VER USE_EMBEDDING_MODEL_DOCKER USE_RERANKING_MODEL_DOCKER ENABLE_LOCAL_MODEL_RUNTIME OLLAMA_BASE_URL OPENAI_API_BASE_URL SCARF_NO_ANALYTICS DO_NOT_TRACK ANONYMIZED_TELEMETRY WHISPER_MODEL WHISPER_MODEL_DIR RAG_EMBEDDING_MODEL RAG_RERANKING_MODEL SENTENCE_TRANSFORMERS_HOME TIKTOKEN_ENCODING_NAME TIKTOKEN_CACHE_DIR HF_HOME HALO_RUNTIME_PROFILE HOME
ENV ENV=prod PORT=8080 USE_OLLAMA_DOCKER=false INSTALL_PROFILE=core PRELOAD_LOCAL_MODELS=false USE_CUDA_DOCKER=false USE_CUDA_DOCKER_VER=cu121 USE_EMBEDDING_MODEL_DOCKER=sentence-transformers/all-MiniLM-L6-v2 USE_RERANKING_MODEL_DOCKER= ENABLE_LOCAL_MODEL_RUNTIME=false OLLAMA_BASE_URL=/ollama OPENAI_API_BASE_URL= SCARF_NO_ANALYTICS=true DO_NOT_TRACK=true ANONYMIZED_TELEMETRY=false WHISPER_MODEL=base WHISPER_MODEL_DIR=/app/backend/data/cache/whisper/models RAG_EMBEDDING_MODEL=sentence-transformers/all-MiniLM-L6-v2 RAG_RERANKING_MODEL= SENTENCE_TRANSFORMERS_HOME=/app/backend/data/cache/embedding/models TIKTOKEN_ENCODING_NAME=cl100k_base TIKTOKEN_CACHE_DIR=/app/backend/data/cache/tiktoken HF_HOME=/app/backend/data/cache/embedding/models HALO_RUNTIME_PROFILE=main HOME=/root
                        
# 2026-04-22 22:03:37  0.00B Define a build argument
ARG HALO_RUNTIME_PROFILE=main

# 2026-04-22 22:03:37  0.00B Define a build argument
ARG GID=0

# 2026-04-22 22:03:37  0.00B Define a build argument
ARG UID=0

# 2026-04-22 22:03:37  0.00B Define a build argument
ARG HALO_PG_CLIENT_MAJORS=14 15 16 17 18

# 2026-04-22 22:03:37  0.00B Define a build argument
ARG USE_TIKTOKEN_ENCODING_NAME=cl100k_base

# 2026-04-22 22:03:37  0.00B Define a build argument
ARG USE_RERANKING_MODEL=

# 2026-04-22 22:03:37  0.00B Define a build argument
ARG USE_EMBEDDING_MODEL=sentence-transformers/all-MiniLM-L6-v2

# 2026-04-22 22:03:37  0.00B Define a build argument
ARG USE_CUDA_VER=cu121

# 2026-04-22 22:03:37  0.00B Define a build argument
ARG PRELOAD_LOCAL_MODELS=false

# 2026-04-22 22:03:37  0.00B Define a build argument
ARG INSTALL_PROFILE=core

# 2026-04-22 22:03:37  0.00B Define a build argument
ARG USE_OLLAMA=false

# 2026-04-22 22:03:37  0.00B Define a build argument
ARG USE_CUDA=false
                        
# 2026-04-22 10:12:18  0.00B Set the default command to execute
CMD ["python3"]
                        
# 2026-04-22 10:12:18  36.00B Run a command and create a new image layer
RUN /bin/sh -c set -eux; 	for src in idle3 pip3 pydoc3 python3 python3-config; do 		dst="$(echo "$src" | tr -d 3)"; 		[ -s "/usr/local/bin/$src" ]; 		[ ! -e "/usr/local/bin/$dst" ]; 		ln -svT "$src" "/usr/local/bin/$dst"; 	done # buildkit
                        
# 2026-04-22 10:12:18  45.51MB Run a command and create a new image layer
RUN /bin/sh -c set -eux; 		savedAptMark="$(apt-mark showmanual)"; 	apt-get update; 	apt-get install -y --no-install-recommends 		dpkg-dev 		gcc 		gnupg 		libbluetooth-dev 		libbz2-dev 		libc6-dev 		libdb-dev 		libffi-dev 		libgdbm-dev 		liblzma-dev 		libncursesw5-dev 		libreadline-dev 		libsqlite3-dev 		libssl-dev 		make 		tk-dev 		uuid-dev 		wget 		xz-utils 		zlib1g-dev 	; 		wget -O python.tar.xz "https://www.python.org/ftp/python/${PYTHON_VERSION%%[a-z]*}/Python-$PYTHON_VERSION.tar.xz"; 	echo "$PYTHON_SHA256 *python.tar.xz" | sha256sum -c -; 	wget -O python.tar.xz.asc "https://www.python.org/ftp/python/${PYTHON_VERSION%%[a-z]*}/Python-$PYTHON_VERSION.tar.xz.asc"; 	GNUPGHOME="$(mktemp -d)"; export GNUPGHOME; 	gpg --batch --keyserver hkps://keys.openpgp.org --recv-keys "$GPG_KEY"; 	gpg --batch --verify python.tar.xz.asc python.tar.xz; 	gpgconf --kill all; 	rm -rf "$GNUPGHOME" python.tar.xz.asc; 	mkdir -p /usr/src/python; 	tar --extract --directory /usr/src/python --strip-components=1 --file python.tar.xz; 	rm python.tar.xz; 		cd /usr/src/python; 	gnuArch="$(dpkg-architecture --query DEB_BUILD_GNU_TYPE)"; 	./configure 		--build="$gnuArch" 		--enable-loadable-sqlite-extensions 		--enable-optimizations 		--enable-option-checking=fatal 		--enable-shared 		$(test "${gnuArch%%-*}" != 'riscv64' && echo '--with-lto') 		--with-ensurepip 	; 	nproc="$(nproc)"; 	EXTRA_CFLAGS="$(dpkg-buildflags --get CFLAGS)"; 	LDFLAGS="$(dpkg-buildflags --get LDFLAGS)"; 	LDFLAGS="${LDFLAGS:-} -Wl,--strip-all"; 	make -j "$nproc" 		"EXTRA_CFLAGS=${EXTRA_CFLAGS:-}" 		"LDFLAGS=${LDFLAGS:-}" 	; 	rm python; 	make -j "$nproc" 		"EXTRA_CFLAGS=${EXTRA_CFLAGS:-}" 		"LDFLAGS=${LDFLAGS:-} -Wl,-rpath='\$\$ORIGIN/../lib'" 		python 	; 	make install; 		cd /; 	rm -rf /usr/src/python; 		find /usr/local -depth 		\( 			\( -type d -a \( -name test -o -name tests -o -name idle_test \) \) 			-o \( -type f -a \( -name '*.pyc' -o -name '*.pyo' -o -name 'libpython*.a' \) \) 		\) -exec rm -rf '{}' + 	; 		ldconfig; 		
apt-mark auto '.*' > /dev/null; 	apt-mark manual $savedAptMark; 	find /usr/local -type f -executable -not \( -name '*tkinter*' \) -exec ldd '{}' ';' 		| awk '/=>/ { so = $(NF-1); if (index(so, "/usr/local/") == 1) { next }; gsub("^/(usr/)?", "", so); printf "*%s\n", so }' 		| sort -u 		| xargs -rt dpkg-query --search 		| awk 'sub(":$", "", $1) { print $1 }' 		| sort -u 		| xargs -r apt-mark manual 	; 	apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false; 	rm -rf /var/lib/apt/lists/*; 		export PYTHONDONTWRITEBYTECODE=1; 	python3 --version; 		pip3 install 		--disable-pip-version-check 		--no-cache-dir 		--no-compile 		'setuptools==79.0.1' 		'wheel<0.46' 	; 	pip3 --version # buildkit
                        
# 2026-04-22 10:04:27  0.00B Set environment variable PYTHON_SHA256
ENV PYTHON_SHA256=272179ddd9a2e41a0fc8e42e33dfbdca0b3711aa5abf372d3f2d51543d09b625
                        
# 2026-04-22 10:04:27  0.00B Set environment variable PYTHON_VERSION
ENV PYTHON_VERSION=3.11.15
                        
# 2026-04-22 10:04:27  0.00B Set environment variable GPG_KEY
ENV GPG_KEY=A035C8C19219BA821ECEA86B64E628F8D684696D
                        
# 2026-04-22 10:04:27  9.26MB Run a command and create a new image layer
RUN /bin/sh -c set -eux; 	apt-get update; 	apt-get install -y --no-install-recommends 		ca-certificates 		netbase 		tzdata 	; 	rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2026-04-22 10:04:27  0.00B Set environment variable LANG
ENV LANG=C.UTF-8
                        
# 2026-04-22 10:04:27  0.00B Set environment variable PATH
ENV PATH=/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2026-04-21 08:00:00  74.83MB Base layer
# debian.sh --arch 'amd64' out/ 'bookworm' '@1776729600'
                        
                    

Image information (docker inspect)

{
    "Id": "sha256:5d74395b15fa91d53b26d24df5cadceaaaf80cc72a277201632530da22a3a3ac",
    "RepoTags": [
        "ghcr.io/ztx888/halowebui:main",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ztx888/halowebui:main"
    ],
    "RepoDigests": [
        "ghcr.io/ztx888/halowebui@sha256:57621275a53fcaff6557ea1303610950d039b75b48b8cbc7daf7f22f972cbf29",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ztx888/halowebui@sha256:bb2ea94c835808d15665f97f5f8d81770ccda80dd681e7d8afadce5917952c37"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2026-05-07T05:21:57.334619584Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "0:0",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "ExposedPorts": {
            "8080/tcp": {}
        },
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "LANG=C.UTF-8",
            "GPG_KEY=A035C8C19219BA821ECEA86B64E628F8D684696D",
            "PYTHON_VERSION=3.11.15",
            "PYTHON_SHA256=272179ddd9a2e41a0fc8e42e33dfbdca0b3711aa5abf372d3f2d51543d09b625",
            "ENV=prod",
            "PORT=8080",
            "USE_OLLAMA_DOCKER=false",
            "INSTALL_PROFILE=core",
            "PRELOAD_LOCAL_MODELS=false",
            "USE_CUDA_DOCKER=false",
            "USE_CUDA_DOCKER_VER=cu121",
            "USE_EMBEDDING_MODEL_DOCKER=sentence-transformers/all-MiniLM-L6-v2",
            "USE_RERANKING_MODEL_DOCKER=",
            "ENABLE_LOCAL_MODEL_RUNTIME=false",
            "OLLAMA_BASE_URL=/ollama",
            "OPENAI_API_BASE_URL=",
            "SCARF_NO_ANALYTICS=true",
            "DO_NOT_TRACK=true",
            "ANONYMIZED_TELEMETRY=false",
            "WHISPER_MODEL=base",
            "WHISPER_MODEL_DIR=/app/backend/data/cache/whisper/models",
            "RAG_EMBEDDING_MODEL=sentence-transformers/all-MiniLM-L6-v2",
            "RAG_RERANKING_MODEL=",
            "SENTENCE_TRANSFORMERS_HOME=/app/backend/data/cache/embedding/models",
            "TIKTOKEN_ENCODING_NAME=cl100k_base",
            "TIKTOKEN_CACHE_DIR=/app/backend/data/cache/tiktoken",
            "HF_HOME=/app/backend/data/cache/embedding/models",
            "HALO_RUNTIME_PROFILE=main",
            "HOME=/root",
            "WEBUI_BUILD_VERSION=b2eecdce3d1915daa8b3057605c64c0378148f48",
            "DOCKER=true"
        ],
        "Cmd": [
            "bash",
            "start.sh"
        ],
        "Healthcheck": {
            "Test": [
                "CMD-SHELL",
                "python -c \"import json, os, urllib.request; response = urllib.request.urlopen(f'http://localhost:{os.environ.get(\\\"PORT\\\", \\\"8080\\\")}/health'); assert json.loads(response.read())[\\\"status\\\"] is True\" || exit 1"
            ]
        },
        "ArgsEscaped": true,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "/app/backend",
        "Entrypoint": null,
        "OnBuild": null,
        "Labels": {
            "org.opencontainers.image.created": "2026-05-07T05:16:24.202Z",
            "org.opencontainers.image.description": "基于官方OpenWebUI,汉化界面提高中文使用体验,增加了模型计费和用量统计",
            "org.opencontainers.image.licenses": "BSD-3-Clause",
            "org.opencontainers.image.revision": "b2eecdce3d1915daa8b3057605c64c0378148f48",
            "org.opencontainers.image.source": "https://github.com/ztx888/HaloWebUI",
            "org.opencontainers.image.title": "HaloWebUI",
            "org.opencontainers.image.url": "https://github.com/ztx888/HaloWebUI",
            "org.opencontainers.image.version": "main"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 1492768928,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/b2d65c470ec5fb1f4799da80064347dc8370121168bc4e22f7a3cae4d859ca62/diff:/var/lib/docker/overlay2/9799d1448eb267b2c08aa78abae7a8aa66fde288b0c9f8dca6fbb89e6446f19e/diff:/var/lib/docker/overlay2/67da706faa254e5daae09b3a507232e6d1299a68cb7a9574a31452195d6d4971/diff:/var/lib/docker/overlay2/85c3673e179cd39c37831f38158520d747f20c598046303c0c3f83fa18d35171/diff:/var/lib/docker/overlay2/7bcf007511e3cabde1e81f45eb83ee9b4489e852b80a0b0ebf27db51ca929083/diff:/var/lib/docker/overlay2/9b6c04531ed380c29193f60675000fa0ea381572f33ea60067f4f0149bd0cf88/diff:/var/lib/docker/overlay2/ee5b6aea1fc5e3fbdbc244c38f07d18f9163c1d186ed6515527a6616c2151c68/diff:/var/lib/docker/overlay2/11f621c77d2e33e628cff02fddd470a5ec93c19fde47f7ac85b6d5c31d808a52/diff:/var/lib/docker/overlay2/1389075e3840e4424b32b5bffce0962d794d6696f6dc87c2e36d9d4a203020bf/diff:/var/lib/docker/overlay2/b11dfe813758670d6678b4ae8491b7a492a3b52285af14ff118706237cc3873b/diff:/var/lib/docker/overlay2/3b2c453a5274b32e30d4c921dd0b3f4da4764e50d2cf82f8bcb4d83aaac2a3ce/diff:/var/lib/docker/overlay2/4d9cd7680276d21958fd04e4d14bbcd119371cccbc6f9bda7536f878ee3a7c36/diff:/var/lib/docker/overlay2/64217af372e21f1cc9bfdab0962acb006386db90ef08431d8b9758ae43401f19/diff:/var/lib/docker/overlay2/cc990fbc81de48078fc0604b9b6d36a0fec4e1e82221b08ef8be5312c6ba6c53/diff:/var/lib/docker/overlay2/cf43f95bbd2394100154831c0af08fee9aac29e6f0ad6ee1ddfcd2e2dfd607a9/diff",
            "MergedDir": "/var/lib/docker/overlay2/2ed8c37d127df49509c6e60fa2cda9a433f26e7c14f242f8c075ee92fae60480/merged",
            "UpperDir": "/var/lib/docker/overlay2/2ed8c37d127df49509c6e60fa2cda9a433f26e7c14f242f8c075ee92fae60480/diff",
            "WorkDir": "/var/lib/docker/overlay2/2ed8c37d127df49509c6e60fa2cda9a433f26e7c14f242f8c075ee92fae60480/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:0da811fd3ed46c38cea69079fa395a3d715dbdbdd5c8177107c450bf6332bbfa",
            "sha256:039df08804d0e605098b0b563d52f6065273780641b5b8517a25d470c8ce96ce",
            "sha256:b2689a916b50d244e615fd29a0786eb6dc1f51f5c354f4903a56bcfd3bc1c897",
            "sha256:724fc0dfc2934d5e16711d1be01fe409f2b3caa056c1ff30fcd6d61ec0e3b025",
            "sha256:c897e311a1e6353dcc9c362071b2c34c8dda47a0fc8d9cc9e8bff3b50356aa6a",
            "sha256:c254e9608a8683bbafd14c0dc5b9a6ff6a7ba8246a69984bdbd21c395af2a60f",
            "sha256:8d6df1082318277b60f0f136e868e7c5d7aa64a3f2cdfc646cf73a87dbd2e67b",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:a74aff649bb5abd4d28c87555820b07b8f3f29663a9b40fedac74d690f6cea04",
            "sha256:15603c1ea5a40ac675de2e10f36bcf6f6cab8ade9a88401ebb291bd1e41c5f6a",
            "sha256:8c30eeccb76f7203383506aa1f3b9ab546f41af691ccafc2a6fea20f9c8c1858",
            "sha256:f3302ea790c3cc308352102939d6c425772215e9ae0e2ea6b9c9f99067037816",
            "sha256:bc4c2622c04960b7727b00043b5e607c4938b4dd4d36c926b72911011b26d1b6",
            "sha256:4b359252fe54bc684d3677a2fa317730370531d59501e3653ee6777540f6ba96",
            "sha256:d0ba3b1dab97144337d16392b0e67c105c2fdf8a616255ad4add34350786716e",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef"
        ]
    },
    "Metadata": {
        "LastTagTime": "2026-05-09T00:29:26.512662083+08:00"
    }
}

More versions

ghcr.io/ztx888/halowebui:main

linux/amd64  ghcr.io  1.49GB  synced 2026-05-09 00:36