ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu linux/amd64

ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu (domestic download mirror)

A Hugging Face inference image for text generation. It is designed to provide high-performance text-generation serving, optimized for both latency and throughput. The image supports a wide range of models and exposes an easy-to-use API for deployment and scaling.

Source image ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu
Domestic mirror swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu
Image ID sha256:1f0561ad3a99411617467f125e67646dffe7158909c3f443e2a4b3aea53eac43
Image tag sha-f08b44a-intel-cpu
Size 6.78GB
Source registry ghcr.io
CMD --json-output
Entrypoint text-generation-launcher
Working directory /usr/src
OS/Arch linux/amd64
Image created 2025-05-22T13:37:03.507702705Z
Synced 2025-05-23 20:34
Updated 2025-05-31 07:20
Environment variables

PATH=/opt/conda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
HUGGINGFACE_HUB_CACHE=/data
HF_HUB_ENABLE_HF_TRANSFER=1
PORT=80
LD_PRELOAD=/opt/conda/lib/libtcmalloc.so
CCL_ROOT=/opt/conda/lib/python3.11/site-packages/oneccl_bindings_for_pytorch
I_MPI_ROOT=/opt/conda/lib/python3.11/site-packages/oneccl_bindings_for_pytorch
FI_PROVIDER_PATH=/opt/conda/lib/python3.11/site-packages/oneccl_bindings_for_pytorch/opt/mpi/libfabric/lib/prov:/usr/lib64/libfabric
LD_LIBRARY_PATH=/opt/conda/lib/python3.11/site-packages/oneccl_bindings_for_pytorch/opt/mpi/libfabric/lib:/opt/conda/lib/python3.11/site-packages/oneccl_bindings_for_pytorch/lib:/opt/conda/lib/
UV_SYSTEM_PYTHON=1
ATTENTION=flashdecoding-ipex
PREFIX_CACHING=1
PREFILL_CHUNKING=1
CUDA_GRAPHS=0
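Several of these defaults, such as PORT and the Hugging Face cache location HUGGINGFACE_HUB_CACHE=/data, are meant to be combined with runtime flags. A minimal, untested sketch of serving a model on CPU with this image; the model id here is only an illustrative assumption, not something the page specifies:

```shell
# Hypothetical example: serve a small model on CPU with this image.
# The host ./data directory backs the in-container HF cache (/data),
# and host port 8080 maps to the in-container PORT (80).
docker run --rm -p 8080:80 \
  -v "$PWD/data:/data" \
  ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu \
  --model-id Qwen/Qwen2.5-0.5B-Instruct

# Once the router is up, requests go to the mapped port, e.g.:
# curl http://127.0.0.1:8080/generate -X POST \
#   -H 'Content-Type: application/json' \
#   -d '{"inputs":"Hello","parameters":{"max_new_tokens":16}}'
```

Arguments after the image name replace the default CMD (`--json-output`) and are passed to the `text-generation-launcher` entrypoint.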
Image labels

org.opencontainers.image.created: 2025-05-22T13:31:36.965Z
org.opencontainers.image.description: Large Language Model Text Generation Inference
org.opencontainers.image.licenses: Apache-2.0
org.opencontainers.image.ref.name: ubuntu
org.opencontainers.image.revision: f08b44ade5c64ce87aff7ff4d74f766282f579a3
org.opencontainers.image.source: https://github.com/huggingface/text-generation-inference
org.opencontainers.image.title: text-generation-inference
org.opencontainers.image.url: https://github.com/huggingface/text-generation-inference
org.opencontainers.image.version: latest-intel-cpu

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu  ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu
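The `docker tag` step maps the mirrored reference back to its upstream name, so manifests that still reference `ghcr.io/...` keep working. The mapping is a pure prefix strip, which can be sketched with POSIX parameter expansion:

```shell
# The mirror reference is the upstream reference with the mirror registry
# prefix prepended; stripping that prefix recovers the upstream name.
MIRROR_PREFIX="swr.cn-north-4.myhuaweicloud.com/ddn-k8s/"
MIRROR_REF="${MIRROR_PREFIX}ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu"
UPSTREAM_REF="${MIRROR_REF#"$MIRROR_PREFIX"}"
echo "$UPSTREAM_REF"   # ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu
```

The same rule applies to any image on this mirror, which is why the pull and tag commands always come in pairs.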

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu  ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu

Shell quick-replace command

sed -i 's#ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu#' deployment.yaml
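Since `sed -i` edits the manifest in place, the substitution is worth sanity-checking on a throwaway copy first; a minimal sketch (the file path is arbitrary):

```shell
# Build a throwaway manifest fragment and apply the same substitution.
cat > /tmp/deployment-check.yaml <<'EOF'
        image: ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu
EOF
sed -i 's#ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu#' /tmp/deployment-check.yaml
cat /tmp/deployment-check.yaml
```

Using `#` as the sed delimiter avoids escaping the many `/` characters in the image references. Note that BSD/macOS `sed -i` requires a backup suffix argument (`sed -i ''`).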

Ansible quick distribution (Docker)

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu  ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu'

Ansible quick distribution (Containerd)

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu  ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu'

Image build history


# 2025-05-22 21:37:03  0.00B Set the default command to execute
CMD ["--json-output"]

# 2025-05-22 21:37:03  0.00B Configure the command run when the container starts
ENTRYPOINT ["text-generation-launcher"]

# 2025-05-22 21:37:03  0.00B Set environment variable CUDA_GRAPHS
ENV CUDA_GRAPHS=0

# 2025-05-22 21:37:03  0.00B Set environment variable PREFILL_CHUNKING
ENV PREFILL_CHUNKING=1

# 2025-05-22 21:37:03  0.00B Set environment variable PREFIX_CACHING
ENV PREFIX_CACHING=1

# 2025-05-22 21:37:03  0.00B Set environment variable ATTENTION
ENV ATTENTION=flashdecoding-ipex

# 2025-05-22 21:37:03  6.98MB Copy new files or directories into the container
COPY /usr/src/target/release-opt/text-generation-launcher /usr/local/bin/text-generation-launcher # buildkit

# 2025-05-22 21:37:03  39.42MB Copy new files or directories into the container
COPY /usr/src/target/release-opt/text-generation-router /usr/local/bin/text-generation-router # buildkit

# 2025-05-22 21:37:03  11.85MB Copy new files or directories into the container
COPY /usr/src/target/release-opt/text-generation-benchmark /usr/local/bin/text-generation-benchmark # buildkit

# 2025-05-21 21:47:37  646.16MB Run a command and create a new image layer
RUN |3 MAMBA_VERSION=23.1.0-1 PYTHON_VERSION=3.11.10 TARGETPLATFORM=linux/amd64 /bin/sh -c cd server &&     make gen-server &&     pip install -U pip uv &&     uv pip install -e ".[accelerate, compressed-tensors, peft, outlines]" --no-cache-dir # buildkit

# 2025-05-21 21:47:30  0.00B Set environment variable UV_SYSTEM_PYTHON
ENV UV_SYSTEM_PYTHON=1

# 2025-05-21 21:47:30  0.00B Copy new files or directories into the container
COPY server/Makefile server/Makefile # buildkit

# 2025-05-21 21:47:30  2.83MB Copy new files or directories into the container
COPY server server # buildkit

# 2025-05-06 17:17:07  13.42KB Copy new files or directories into the container
COPY proto proto # buildkit

# 2025-05-06 17:17:07  0.00B Set environment variable LD_LIBRARY_PATH
ENV LD_LIBRARY_PATH=/opt/conda/lib/python3.11/site-packages/oneccl_bindings_for_pytorch/opt/mpi/libfabric/lib:/opt/conda/lib/python3.11/site-packages/oneccl_bindings_for_pytorch/lib:/opt/conda/lib/

# 2025-05-06 17:17:07  0.00B Set environment variable LD_LIBRARY_PATH
ENV LD_LIBRARY_PATH=/opt/conda/lib/python3.11/site-packages/oneccl_bindings_for_pytorch/opt/mpi/libfabric/lib:/opt/conda/lib/python3.11/site-packages/oneccl_bindings_for_pytorch/lib

# 2025-05-06 17:17:07  0.00B Set environment variable FI_PROVIDER_PATH
ENV FI_PROVIDER_PATH=/opt/conda/lib/python3.11/site-packages/oneccl_bindings_for_pytorch/opt/mpi/libfabric/lib/prov:/usr/lib64/libfabric

# 2025-05-06 17:17:07  0.00B Set environment variable I_MPI_ROOT
ENV I_MPI_ROOT=/opt/conda/lib/python3.11/site-packages/oneccl_bindings_for_pytorch

# 2025-05-06 17:17:07  0.00B Set environment variable CCL_ROOT
ENV CCL_ROOT=/opt/conda/lib/python3.11/site-packages/oneccl_bindings_for_pytorch

# 2025-05-06 17:17:07  0.00B Set environment variable LD_PRELOAD
ENV LD_PRELOAD=/opt/conda/lib/libtcmalloc.so

# 2025-05-06 17:17:07  206.57MB Run a command and create a new image layer
RUN |3 MAMBA_VERSION=23.1.0-1 PYTHON_VERSION=3.11.10 TARGETPLATFORM=linux/amd64 /bin/sh -c pip install https://intel-extension-for-pytorch.s3.amazonaws.com/ipex_stable/cpu/oneccl_bind_pt-2.7.0%2Bcpu-cp311-cp311-linux_x86_64.whl # buildkit

# 2025-05-06 17:17:04  502.12MB Run a command and create a new image layer
RUN |3 MAMBA_VERSION=23.1.0-1 PYTHON_VERSION=3.11.10 TARGETPLATFORM=linux/amd64 /bin/sh -c pip install https://intel-extension-for-pytorch.s3.amazonaws.com/ipex_stable/cpu/intel_extension_for_pytorch-2.7.0%2Bcpu-cp311-cp311-linux_x86_64.whl # buildkit

# 2025-05-21 21:47:30  0.00B Set the working directory to /usr/src
WORKDIR /usr/src

# 2025-05-06 17:16:58  970.99MB Run a command and create a new image layer
RUN |3 MAMBA_VERSION=23.1.0-1 PYTHON_VERSION=3.11.10 TARGETPLATFORM=linux/amd64 /bin/sh -c pip install triton==3.2.0 py-libnuma # buildkit

# 2025-05-06 17:16:53  1.11GB Run a command and create a new image layer
RUN |3 MAMBA_VERSION=23.1.0-1 PYTHON_VERSION=3.11.10 TARGETPLATFORM=linux/amd64 /bin/sh -c pip install torch==2.7.0 torchvision==0.22.0 torchaudio==2.7.0 --index-url https://download.pytorch.org/whl/cpu # buildkit

# 2025-05-06 02:09:28  1.94GB Run a command and create a new image layer
RUN |3 MAMBA_VERSION=23.1.0-1 PYTHON_VERSION=3.11.10 TARGETPLATFORM=linux/amd64 /bin/sh -c conda install -c conda-forge gperftools mkl # buildkit

# 2025-05-06 02:04:36  348.89MB Run a command and create a new image layer
RUN |3 MAMBA_VERSION=23.1.0-1 PYTHON_VERSION=3.11.10 TARGETPLATFORM=linux/amd64 /bin/sh -c case ${TARGETPLATFORM} in          "linux/arm64")  exit 1 ;;          *)              /opt/conda/bin/conda update -y conda &&                           /opt/conda/bin/conda install -y "python=${PYTHON_VERSION}" ;;     esac &&     /opt/conda/bin/conda clean -ya # buildkit

# 2025-05-06 02:01:59  388.41MB Run a command and create a new image layer
RUN |3 MAMBA_VERSION=23.1.0-1 PYTHON_VERSION=3.11.10 TARGETPLATFORM=linux/amd64 /bin/sh -c chmod +x ~/mambaforge.sh &&     bash ~/mambaforge.sh -b -p /opt/conda &&     rm ~/mambaforge.sh # buildkit

# 2025-05-06 02:01:54  87.01MB Run a command and create a new image layer
RUN |3 MAMBA_VERSION=23.1.0-1 PYTHON_VERSION=3.11.10 TARGETPLATFORM=linux/amd64 /bin/sh -c case ${TARGETPLATFORM} in          "linux/arm64")  MAMBA_ARCH=aarch64  ;;          *)              MAMBA_ARCH=x86_64   ;;     esac &&     curl -fsSL -v -o ~/mambaforge.sh -O  "https://github.com/conda-forge/miniforge/releases/download/${MAMBA_VERSION}/Mambaforge-${MAMBA_VERSION}-Linux-${MAMBA_ARCH}.sh" # buildkit

# 2025-05-06 02:01:54  0.00B Set environment variable PATH
ENV PATH=/opt/conda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin

# 2025-05-06 02:01:54  0.00B Define a build argument
ARG TARGETPLATFORM=linux/amd64

# 2025-05-06 02:01:54  0.00B Define a build argument
ARG PYTHON_VERSION=3.11.10

# 2025-05-06 02:01:54  0.00B Define a build argument
ARG MAMBA_VERSION=23.1.0-1

# 2025-05-06 02:01:54  0.00B Set environment variables HUGGINGFACE_HUB_CACHE HF_HUB_ENABLE_HF_TRANSFER PORT
ENV HUGGINGFACE_HUB_CACHE=/data HF_HUB_ENABLE_HF_TRANSFER=1 PORT=80

# 2025-05-06 02:01:53  5.96KB Run a command and create a new image layer
RUN /bin/sh -c update-alternatives --set c++ /usr/bin/g++ # buildkit

# 2025-05-06 02:01:53  5.83KB Run a command and create a new image layer
RUN /bin/sh -c update-alternatives --install /usr/bin/c++ c++ /usr/bin/g++ 30 # buildkit

# 2025-05-06 02:01:53  5.61KB Run a command and create a new image layer
RUN /bin/sh -c update-alternatives --set cc /usr/bin/gcc # buildkit

# 2025-05-06 02:01:53  5.48KB Run a command and create a new image layer
RUN /bin/sh -c update-alternatives --install /usr/bin/cc cc /usr/bin/gcc 30 # buildkit

# 2025-05-06 02:01:53  5.31KB Run a command and create a new image layer
RUN /bin/sh -c update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-12 12 # buildkit

# 2025-05-06 02:01:53  5.12KB Run a command and create a new image layer
RUN /bin/sh -c update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-12 12 # buildkit

# 2025-05-06 02:01:53  440.42MB Run a command and create a new image layer
RUN /bin/sh -c apt-get update && DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends     curl     ca-certificates     make     g++-12     gcc-12     git     wget     cmake     libnuma-dev # buildkit

# 2025-04-28 17:44:42  0.00B
/bin/sh -c #(nop)  CMD ["/bin/bash"]

# 2025-04-28 17:44:42  77.86MB
/bin/sh -c #(nop) ADD file:59e67123ba6a5d9eea9813e7b2a767696f767c15c5b23c61c4d5bd6ba6fa9ac6 in /

# 2025-04-28 17:44:40  0.00B
/bin/sh -c #(nop)  LABEL org.opencontainers.image.version=22.04

# 2025-04-28 17:44:40  0.00B
/bin/sh -c #(nop)  LABEL org.opencontainers.image.ref.name=ubuntu

# 2025-04-28 17:44:40  0.00B
/bin/sh -c #(nop)  ARG LAUNCHPAD_BUILD_ARCH

# 2025-04-28 17:44:40  0.00B
/bin/sh -c #(nop)  ARG RELEASE

Image information

{
    "Id": "sha256:1f0561ad3a99411617467f125e67646dffe7158909c3f443e2a4b3aea53eac43",
    "RepoTags": [
        "ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:sha-f08b44a-intel-cpu"
    ],
    "RepoDigests": [
        "ghcr.io/huggingface/text-generation-inference@sha256:96761cc0ed7c8448f81fcdc4c145c12d67fe6df783228e216c59742091322da3",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference@sha256:62ca63f8db3d91c79f114387f4158d56e74f351539d2b2455991bf512ca5c283"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2025-05-22T13:37:03.507702705Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/opt/conda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "HUGGINGFACE_HUB_CACHE=/data",
            "HF_HUB_ENABLE_HF_TRANSFER=1",
            "PORT=80",
            "LD_PRELOAD=/opt/conda/lib/libtcmalloc.so",
            "CCL_ROOT=/opt/conda/lib/python3.11/site-packages/oneccl_bindings_for_pytorch",
            "I_MPI_ROOT=/opt/conda/lib/python3.11/site-packages/oneccl_bindings_for_pytorch",
            "FI_PROVIDER_PATH=/opt/conda/lib/python3.11/site-packages/oneccl_bindings_for_pytorch/opt/mpi/libfabric/lib/prov:/usr/lib64/libfabric",
            "LD_LIBRARY_PATH=/opt/conda/lib/python3.11/site-packages/oneccl_bindings_for_pytorch/opt/mpi/libfabric/lib:/opt/conda/lib/python3.11/site-packages/oneccl_bindings_for_pytorch/lib:/opt/conda/lib/",
            "UV_SYSTEM_PYTHON=1",
            "ATTENTION=flashdecoding-ipex",
            "PREFIX_CACHING=1",
            "PREFILL_CHUNKING=1",
            "CUDA_GRAPHS=0"
        ],
        "Cmd": [
            "--json-output"
        ],
        "ArgsEscaped": true,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "/usr/src",
        "Entrypoint": [
            "text-generation-launcher"
        ],
        "OnBuild": null,
        "Labels": {
            "org.opencontainers.image.created": "2025-05-22T13:31:36.965Z",
            "org.opencontainers.image.description": "Large Language Model Text Generation Inference",
            "org.opencontainers.image.licenses": "Apache-2.0",
            "org.opencontainers.image.ref.name": "ubuntu",
            "org.opencontainers.image.revision": "f08b44ade5c64ce87aff7ff4d74f766282f579a3",
            "org.opencontainers.image.source": "https://github.com/huggingface/text-generation-inference",
            "org.opencontainers.image.title": "text-generation-inference",
            "org.opencontainers.image.url": "https://github.com/huggingface/text-generation-inference",
            "org.opencontainers.image.version": "latest-intel-cpu"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 6778597375,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/d85377e89edbdd488f9e5b95302cd8fc554f76ff00cf7fe607535e559b9da414/diff:/var/lib/docker/overlay2/259ddecb6107c93775d8b771a1a573a9ec693e81513f1a901eb4bc8ace351a13/diff:/var/lib/docker/overlay2/e4321becf0a1063d5efc37361ab891ad9c0823c955436e282e90c637e878f925/diff:/var/lib/docker/overlay2/95860b9689dca24704c4ae83a41a92180b0cdc49604ab1a0f3d320ed10a85096/diff:/var/lib/docker/overlay2/169148cc9234e4ec3d2f523f5a494cb07898d3ddb9ee67846a4d0a02b6666e6c/diff:/var/lib/docker/overlay2/59ebdbdcb367e4be97aca75a110c4d7914f5056c9c30aff664aa1db459edd926/diff:/var/lib/docker/overlay2/0bb5c02c23e38dc01df13ca1a0c06050ec4ec2528653abb30150ee0f338dbfdc/diff:/var/lib/docker/overlay2/f9693cd8b7761810169f4381139591f75662115f2a46a4b727b2447d7eaa6874/diff:/var/lib/docker/overlay2/a7e7800d3791f51618b556989434c8a16fee032d4d0a6e5eb2608d692b02bd79/diff:/var/lib/docker/overlay2/6b1bd71f0b6bfc6f1e963cca65f9a43aab3bbb23f5c2559f43d871773b1095d2/diff:/var/lib/docker/overlay2/60773b51d6ac045adcb98f08ff1538ac97701960d5b65bd2f420e0c4022be2c1/diff:/var/lib/docker/overlay2/29017fe3ed2075dfaf4a7362f2aacee8caca08810d5d24ba1564f2ab41888bb7/diff:/var/lib/docker/overlay2/09595bb81675875fb908f52a9794ed35118697c7ce6e136d085d8fc65e55660c/diff:/var/lib/docker/overlay2/e740d63f23718d51d57d93663d854f44d814ede5da09f9239491d0a172fd3d83/diff:/var/lib/docker/overlay2/5a411dee1a529a446c8af77b54e745ca0f5159c21a6bb891ba88d27a3a259496/diff:/var/lib/docker/overlay2/09e8ba1f7410439ebf63b52bb626afe010a6b4ef6849796a8cc463341575b472/diff:/var/lib/docker/overlay2/fb5d252c4e0c4dba979857a7f0db13a9a46548a89f4ed78ec13aea03fa9ab069/diff:/var/lib/docker/overlay2/4f52cc29f6b9449cdbccc6ac09618ee9bf2c3e2a23f9c2e988cf032322cbc958/diff:/var/lib/docker/overlay2/d002b474cda27385019246759bc6cd1a2936d4c7fca1de84cadfb062d1fdaeb7/diff:/var/lib/docker/overlay2/2f9af40a7f68408d0565db80599dc03cc23eeef4e8ca1c1fe4a9d0b748b6035c/diff:/var/lib/docker/overlay2/9400eb686aca941431d1867af6f0f1d4f3b8aa9a14f992c1ef
a3b38fdb685fbc/diff:/var/lib/docker/overlay2/2068c496b8f606b88c90cf4e711c08db8ccfde5616e55f7ba88f8624c477c9da/diff:/var/lib/docker/overlay2/4fcc5403e042c93b3ca4d38d146f8e6b90dd77bfab9e3b637ca3380d6edf5498/diff",
            "MergedDir": "/var/lib/docker/overlay2/70b81dcb817096c2f43e7120664dffcf6c381946198652cb2c77a46a8da2a2da/merged",
            "UpperDir": "/var/lib/docker/overlay2/70b81dcb817096c2f43e7120664dffcf6c381946198652cb2c77a46a8da2a2da/diff",
            "WorkDir": "/var/lib/docker/overlay2/70b81dcb817096c2f43e7120664dffcf6c381946198652cb2c77a46a8da2a2da/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:346f14bf17b9846c3e67362b00c16c9085772033cb7c58742c4e1efa22466d38",
            "sha256:b6ff320e83ac24b42cc86c0038587c0899ecb7aa34f35362db08bb0f7bc8e62d",
            "sha256:68c1e58998cec3983a906759ac7cdc7b3826d8f865d252c6c5097406fa2ead94",
            "sha256:6c733fc4c56e5401d81fbec0d433891bc778679a58fcdb81abd78f2f5a9b0fbb",
            "sha256:f29e7ff76b9d0c666ae3e2812f1afdd595533c081ab260982be0562e35768292",
            "sha256:fcd431fd1772295bd0b2b3ba44a5d3bb1c912000286af919d90591846176ca66",
            "sha256:50b3c4e8833577985cec02290d750550cc9df78fc424342a62c364dd071150f2",
            "sha256:265a591b5d61adc22cd953575eabe1c65e480e3330c7cad806430830149c4d40",
            "sha256:9cd388385878624708feb93743b07880cd2b60fb49c361d196d422030b25f095",
            "sha256:3675b14e3d4451b09e58271d992dfe953f6af73f298f18ef1b5310bd92c8e5a0",
            "sha256:40a00fbe570c3d0290aa65e6fdb08cd58ead4949eb5a2277b96453f220ebcd92",
            "sha256:2073ad096e87b0642560de318e91dcb5933d2125d89174103b16477d84943d48",
            "sha256:6d9d2fa55eb51dfac54572dfe539dbd8af0e6ca266b1d0c557c11382c2a661d3",
            "sha256:e75be54f1562548a253da6ededc4f48e98c2dbaea6c094bae2e27f4ff1695b20",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:b599619d05aae857f70ca1f10c8b5565ba055529d9e427e241ede5c86107e253",
            "sha256:017597be9baa85a5190d6052214e2072b175fec622bc5620cacb231e61551ded",
            "sha256:f29e6a603fc2e22742aa0ee752fedac1d1c3832f5a9640d9ddefb8b3ddfe88f0",
            "sha256:9eaeaf85601a0eb19144341e46d8d78798f2c3b367e99b8db6c14b84036e8cd4",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:bf29ae7c180750bb6d61325a9ead7f587089a314b920b80ad99eb84b19f7d1f3",
            "sha256:8ac88fafaba0a994f7424f0de359234e6c7d95c54e579cf4ed817295fea4443d",
            "sha256:08acfe1cc129c6e227beecabbd6cbf15c1903697fbc3898fb3770e62b033030f",
            "sha256:96ff59f06be79deb4e7332a2f77821a30b38b41ca14d1794f8ab6318e1f4d86d"
        ]
    },
    "Metadata": {
        "LastTagTime": "2025-05-23T20:28:48.779040561+08:00"
    }
}

More versions

ghcr.io/huggingface/text-generation-inference:2.1.1
linux/amd64 | ghcr.io | 10.66GB | 2024-09-07 04:52 | 195

ghcr.io/huggingface/text-generation-inference:2.2
linux/amd64 | ghcr.io | 11.37GB | 2024-09-07 05:20 | 489

ghcr.io/huggingface/text-generation-inference:2.3.0
linux/amd64 | ghcr.io | 13.75GB | 2024-09-23 15:50 | 571

ghcr.io/huggingface/text-generation-inference:2.4.0
linux/amd64 | ghcr.io | 14.11GB | 2024-11-08 17:53 | 234

ghcr.io/huggingface/text-generation-inference:3.1.0
linux/amd64 | ghcr.io | 12.24GB | 2025-02-11 04:56 | 226

ghcr.io/huggingface/text-generation-inference:3.1.1
linux/amd64 | ghcr.io | 16.21GB | 2025-03-10 04:51 | 89

ghcr.io/huggingface/text-generation-inference:3.2.1
linux/amd64 | ghcr.io | 16.22GB | 2025-03-25 00:58 | 236