ghcr.io/huggingface/text-generation-inference:2.3.0 linux/amd64

ghcr.io/huggingface/text-generation-inference:2.3.0 - China mirror download source

A Hugging Face inference image for text generation. It is designed to provide a high-performance text-generation service, optimized for latency and throughput. The image supports a wide range of models and exposes an easy-to-use API, making it straightforward to deploy and scale.
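
Quick start (a minimal sketch, not the only way to run the image): the model ID and host port below are illustrative, a GPU driver satisfying NVIDIA_REQUIRE_CUDA (CUDA >= 12.1) is assumed, and models are cached under /data inside the container.

# Serve a model (replace the --model-id value with the model you actually want).
docker run --gpus all --shm-size 1g -p 8080:80 -v $PWD/data:/data \
  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0 \
  --model-id HuggingFaceH4/zephyr-7b-beta

# Query the generate endpoint once the server is up.
curl 127.0.0.1:8080/generate \
  -X POST \
  -H 'Content-Type: application/json' \
  -d '{"inputs":"What is Deep Learning?","parameters":{"max_new_tokens":20}}'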

Source image    ghcr.io/huggingface/text-generation-inference:2.3.0
China mirror    swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0
Image ID        sha256:cf86e18ed3214bb145a6ae124fb569b1a3194f57d03788fa4b03253f849f5a1a
Image tag       2.3.0
Size            13.75GB
Registry        ghcr.io
CMD             (none)
Entrypoint      /tgi-entrypoint.sh
Working dir     /usr/src
OS/Platform     linux/amd64
Contributor     gu******u@myhexin.com
Image created   2024-09-20T15:52:30.619696814Z
Synced          2024-09-23 15:50
Updated         2025-07-07 03:50
Environment variables
PATH=/opt/conda/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin NVARCH=x86_64 NVIDIA_REQUIRE_CUDA=cuda>=12.1 brand=tesla,driver>=470,driver<471 brand=unknown,driver>=470,driver<471 brand=nvidia,driver>=470,driver<471 brand=nvidiartx,driver>=470,driver<471 brand=geforce,driver>=470,driver<471 brand=geforcertx,driver>=470,driver<471 brand=quadro,driver>=470,driver<471 brand=quadrortx,driver>=470,driver<471 brand=titan,driver>=470,driver<471 brand=titanrtx,driver>=470,driver<471 brand=tesla,driver>=525,driver<526 brand=unknown,driver>=525,driver<526 brand=nvidia,driver>=525,driver<526 brand=nvidiartx,driver>=525,driver<526 brand=geforce,driver>=525,driver<526 brand=geforcertx,driver>=525,driver<526 brand=quadro,driver>=525,driver<526 brand=quadrortx,driver>=525,driver<526 brand=titan,driver>=525,driver<526 brand=titanrtx,driver>=525,driver<526 NV_CUDA_CUDART_VERSION=12.1.55-1 NV_CUDA_COMPAT_PACKAGE=cuda-compat-12-1 CUDA_VERSION=12.1.0 LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64:/opt/conda/lib/ NVIDIA_VISIBLE_DEVICES=all NVIDIA_DRIVER_CAPABILITIES=compute,utility CONDA_PREFIX=/opt/conda HF_HOME=/data HF_HUB_ENABLE_HF_TRANSFER=1 PORT=80 LD_PRELOAD=/opt/conda/lib/python3.11/site-packages/nvidia/nccl/lib/libnccl.so.2 EXLLAMA_NO_FLASH_ATTN=1
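
Of these, HF_HOME=/data places the Hugging Face model cache at /data inside the container, and PORT=80 appears to be the serving port read by the launcher. A hedged override sketch (the host cache path /opt/tgi-cache and port 3000 are placeholders):

# Persist the model cache on the host and move the serving port.
docker run --gpus all --shm-size 1g \
  -v /opt/tgi-cache:/data \
  -e PORT=3000 -p 3000:3000 \
  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0 \
  --model-id HuggingFaceH4/zephyr-7b-beta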
Image labels
maintainer: NVIDIA CORPORATION <cudatools@nvidia.com>
org.opencontainers.image.created: 2024-09-20T16:22:53.303Z
org.opencontainers.image.description: Large Language Model Text Generation Inference
org.opencontainers.image.licenses: Apache-2.0
org.opencontainers.image.ref.name: ubuntu
org.opencontainers.image.revision: 169178b937d0c4173b0fdcd6bf10a858cfe4f428
org.opencontainers.image.source: https://github.com/huggingface/text-generation-inference
org.opencontainers.image.title: text-generation-inference
org.opencontainers.image.url: https://github.com/huggingface/text-generation-inference
org.opencontainers.image.version: 2.3.0
Image security scan (Trivy scan report)

OS: ubuntu 22.04    Scan engine: Trivy    Scan time: 2024-10-24 21:44

Low-severity vulnerabilities: 159    Medium: 750    High: 2    Critical: 0
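
To reproduce or refresh the scan locally (assuming Trivy is installed on the host):

# Re-scan the mirrored image; --severity limits output to the higher-risk findings.
trivy image --severity HIGH,CRITICAL swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0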

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0  ghcr.io/huggingface/text-generation-inference:2.3.0

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0  ghcr.io/huggingface/text-generation-inference:2.3.0

Shell quick-replace command

sed -i 's#ghcr.io/huggingface/text-generation-inference:2.3.0#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0#' deployment.yaml
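
The command above rewrites a single deployment.yaml in place. A sketch for applying the same substitution across a directory of manifests (the ./k8s path is illustrative):

find ./k8s -name '*.yaml' -print0 | xargs -0 sed -i 's#ghcr.io/huggingface/text-generation-inference:2.3.0#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0#'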

Ansible quick distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0  ghcr.io/huggingface/text-generation-inference:2.3.0'

Ansible quick distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0  ghcr.io/huggingface/text-generation-inference:2.3.0'
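
Note: on Kubernetes nodes the kubelet looks up images in containerd's k8s.io namespace, so a variant that pulls and tags in that namespace may be needed (the "k8s" host group is the same as in the examples above):

#ansible k8s -m shell -a 'ctr -n k8s.io images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0 && ctr -n k8s.io images tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0 ghcr.io/huggingface/text-generation-inference:2.3.0'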

Image build history


# 2024-09-20 23:52:30  0.00B Configure the command to run when the container starts
ENTRYPOINT ["/tgi-entrypoint.sh"]
                        
# 2024-09-20 23:52:30  0.00B Run a command and create a new image layer
RUN /bin/sh -c chmod +x /tgi-entrypoint.sh # buildkit
                        
# 2024-09-20 23:52:30  130.00B Copy new files or directories into the container
COPY ./tgi-entrypoint.sh /tgi-entrypoint.sh # buildkit
                        
# 2024-09-20 23:52:30  5.95MB Copy new files or directories into the container
COPY /usr/src/target/release-opt/text-generation-launcher /usr/local/bin/text-generation-launcher # buildkit
                        
# 2024-09-20 23:52:30  36.13MB Copy new files or directories into the container
COPY /usr/src/target/release-opt/text-generation-router /usr/local/bin/text-generation-router # buildkit
                        
# 2024-09-20 23:52:30  11.02MB Copy new files or directories into the container
COPY /usr/src/target/release-opt/text-generation-benchmark /usr/local/bin/text-generation-benchmark # buildkit
                        
# 2024-09-20 20:38:16  220.15MB Run a command and create a new image layer
RUN /bin/sh -c apt-get update && DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends         build-essential         g++         && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2024-09-20 20:38:06  0.00B Set environment variable EXLLAMA_NO_FLASH_ATTN
ENV EXLLAMA_NO_FLASH_ATTN=1
                        
# 2024-09-20 20:38:06  0.00B Set environment variable LD_LIBRARY_PATH
ENV LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64:/opt/conda/lib/
                        
# 2024-09-20 20:38:06  0.00B Set environment variable LD_PRELOAD
ENV LD_PRELOAD=/opt/conda/lib/python3.11/site-packages/nvidia/nccl/lib/libnccl.so.2
                        
# 2024-09-20 20:38:06  1.90GB Run a command and create a new image layer
RUN /bin/sh -c cd server &&     make gen-server &&     pip install -r requirements_cuda.txt &&     pip install ".[bnb, accelerate, marlin, moe, quantize, peft, outlines]" --no-cache-dir &&     pip install nvidia-nccl-cu12==2.22.3 # buildkit
                        
# 2024-09-20 23:52:30  0.00B Copy new files or directories into the container
COPY server/Makefile server/Makefile # buildkit
                        
# 2024-09-20 20:37:09  2.10MB Copy new files or directories into the container
COPY server server # buildkit
                        
# 2024-09-20 01:49:50  12.51KB Copy new files or directories into the container
COPY proto proto # buildkit
                        
# 2024-09-20 01:49:50  331.50KB Run a command and create a new image layer
RUN /bin/sh -c pip install einops --no-cache-dir # buildkit
                        
# 2024-09-20 01:49:49  1.85GB Copy new files or directories into the container
COPY /opt/conda/lib/python3.11/site-packages/flashinfer/ /opt/conda/lib/python3.11/site-packages/flashinfer/ # buildkit
                        
# 2024-09-20 01:49:26  23.81MB Copy new files or directories into the container
COPY /usr/src/causal-conv1d/build/lib.linux-x86_64-cpython-311/ /opt/conda/lib/python3.11/site-packages # buildkit
                        
# 2024-09-20 01:49:26  183.76MB Copy new files or directories into the container
COPY /usr/src/mamba/build/lib.linux-x86_64-cpython-311/ /opt/conda/lib/python3.11/site-packages # buildkit
                        
# 2024-09-20 01:48:41  135.10MB Copy new files or directories into the container
COPY /usr/src/vllm/build/lib.linux-x86_64-cpython-311 /opt/conda/lib/python3.11/site-packages # buildkit
                        
# 2024-09-12 03:37:59  9.60MB Copy new files or directories into the container
COPY /usr/src/fbgemm/fbgemm_gpu/_skbuild/linux-x86_64-3.11/cmake-install /opt/conda/lib/python3.11/site-packages # buildkit
                        
# 2024-09-12 03:37:59  24.03MB Copy new files or directories into the container
COPY /usr/src/lorax-punica/server/punica_kernels/build/lib.linux-x86_64-cpython-311 /opt/conda/lib/python3.11/site-packages # buildkit
                        
# 2024-09-12 03:37:59  41.52MB Copy new files or directories into the container
COPY /usr/src/eetq/build/lib.linux-x86_64-cpython-311 /opt/conda/lib/python3.11/site-packages # buildkit
                        
# 2024-09-12 03:32:15  9.69MB Copy new files or directories into the container
COPY /usr/src/llm-awq/awq/kernels/build/lib.linux-x86_64-cpython-311 /opt/conda/lib/python3.11/site-packages # buildkit
                        
# 2024-09-12 03:32:15  94.93MB Copy new files or directories into the container
COPY /usr/src/exllamav2/build/lib.linux-x86_64-cpython-311 /opt/conda/lib/python3.11/site-packages # buildkit
                        
# 2024-09-12 03:32:15  555.78KB Copy new files or directories into the container
COPY /usr/src/build/lib.linux-x86_64-cpython-311 /opt/conda/lib/python3.11/site-packages # buildkit
                        
# 2024-09-12 03:32:15  3.03MB Copy new files or directories into the container
COPY /usr/src/build/lib.linux-x86_64-cpython-311 /opt/conda/lib/python3.11/site-packages # buildkit
                        
# 2024-09-12 03:32:15  664.43MB Copy new files or directories into the container
COPY /opt/conda/lib/python3.11/site-packages/flash_attn_2_cuda.cpython-311-x86_64-linux-gnu.so /opt/conda/lib/python3.11/site-packages # buildkit
                        
# 2024-09-12 03:32:15  10.71MB Copy new files or directories into the container
COPY /usr/src/flash-attention/csrc/rotary/build/lib.linux-x86_64-cpython-311 /opt/conda/lib/python3.11/site-packages # buildkit
                        
# 2024-09-12 03:32:15  725.06MB Copy new files or directories into the container
COPY /usr/src/flash-attention/csrc/layer_norm/build/lib.linux-x86_64-cpython-311 /opt/conda/lib/python3.11/site-packages # buildkit
                        
# 2024-09-12 03:32:14  153.86MB Copy new files or directories into the container
COPY /usr/src/flash-attention/build/lib.linux-x86_64-cpython-311 /opt/conda/lib/python3.11/site-packages # buildkit
                        
# 2024-09-12 03:18:46  7.32GB Copy new files or directories into the container
COPY /opt/conda /opt/conda # buildkit
                        
# 2024-08-09 21:02:09  91.99MB Run a command and create a new image layer
RUN /bin/sh -c apt-get update && DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends         libssl-dev         ca-certificates         make         curl         git         && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2024-09-20 23:52:30  0.00B Set the working directory to /usr/src
WORKDIR /usr/src
                        
# 2024-09-20 23:52:30  0.00B Set environment variables HF_HOME, HF_HUB_ENABLE_HF_TRANSFER, and PORT
ENV HF_HOME=/data HF_HUB_ENABLE_HF_TRANSFER=1 PORT=80
                        
# 2024-09-20 23:52:30  0.00B Set environment variables PATH and CONDA_PREFIX
ENV PATH=/opt/conda/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin CONDA_PREFIX=/opt/conda
                        
# 2023-11-10 13:44:29  0.00B Set environment variable NVIDIA_DRIVER_CAPABILITIES
ENV NVIDIA_DRIVER_CAPABILITIES=compute,utility
                        
# 2023-11-10 13:44:29  0.00B Set environment variable NVIDIA_VISIBLE_DEVICES
ENV NVIDIA_VISIBLE_DEVICES=all
                        
# 2023-11-10 13:44:29  17.29KB Copy new files or directories into the container
COPY NGC-DL-CONTAINER-LICENSE / # buildkit
                        
# 2023-11-10 13:44:29  0.00B Set environment variable LD_LIBRARY_PATH
ENV LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64
                        
# 2023-11-10 13:44:29  0.00B Set environment variable PATH
ENV PATH=/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2023-11-10 13:44:29  46.00B Run a command and create a new image layer
RUN |1 TARGETARCH=amd64 /bin/sh -c echo "/usr/local/nvidia/lib" >> /etc/ld.so.conf.d/nvidia.conf     && echo "/usr/local/nvidia/lib64" >> /etc/ld.so.conf.d/nvidia.conf # buildkit
                        
# 2023-11-10 13:44:29  149.59MB Run a command and create a new image layer
RUN |1 TARGETARCH=amd64 /bin/sh -c apt-get update && apt-get install -y --no-install-recommends     cuda-cudart-12-1=${NV_CUDA_CUDART_VERSION}     ${NV_CUDA_COMPAT_PACKAGE}     && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2023-11-10 13:44:18  0.00B Set environment variable CUDA_VERSION
ENV CUDA_VERSION=12.1.0
                        
# 2023-11-10 13:44:18  10.56MB Run a command and create a new image layer
RUN |1 TARGETARCH=amd64 /bin/sh -c apt-get update && apt-get install -y --no-install-recommends     gnupg2 curl ca-certificates &&     curl -fsSLO https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/${NVARCH}/cuda-keyring_1.0-1_all.deb &&     dpkg -i cuda-keyring_1.0-1_all.deb &&     apt-get purge --autoremove -y curl     && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2023-11-10 13:44:18  0.00B Add a metadata label
LABEL maintainer=NVIDIA CORPORATION <cudatools@nvidia.com>
                        
# 2023-11-10 13:44:18  0.00B Define a build argument
ARG TARGETARCH
                        
# 2023-11-10 13:44:18  0.00B Set environment variable NV_CUDA_COMPAT_PACKAGE
ENV NV_CUDA_COMPAT_PACKAGE=cuda-compat-12-1
                        
# 2023-11-10 13:44:18  0.00B Set environment variable NV_CUDA_CUDART_VERSION
ENV NV_CUDA_CUDART_VERSION=12.1.55-1
                        
# 2023-11-10 13:44:18  0.00B Set environment variable NVIDIA_REQUIRE_CUDA
ENV NVIDIA_REQUIRE_CUDA=cuda>=12.1 brand=tesla,driver>=470,driver<471 brand=unknown,driver>=470,driver<471 brand=nvidia,driver>=470,driver<471 brand=nvidiartx,driver>=470,driver<471 brand=geforce,driver>=470,driver<471 brand=geforcertx,driver>=470,driver<471 brand=quadro,driver>=470,driver<471 brand=quadrortx,driver>=470,driver<471 brand=titan,driver>=470,driver<471 brand=titanrtx,driver>=470,driver<471 brand=tesla,driver>=525,driver<526 brand=unknown,driver>=525,driver<526 brand=nvidia,driver>=525,driver<526 brand=nvidiartx,driver>=525,driver<526 brand=geforce,driver>=525,driver<526 brand=geforcertx,driver>=525,driver<526 brand=quadro,driver>=525,driver<526 brand=quadrortx,driver>=525,driver<526 brand=titan,driver>=525,driver<526 brand=titanrtx,driver>=525,driver<526
                        
# 2023-11-10 13:44:18  0.00B Set environment variable NVARCH
ENV NVARCH=x86_64
                        
# 2023-10-05 15:33:32  0.00B 
/bin/sh -c #(nop)  CMD ["/bin/bash"]
                        
# 2023-10-05 15:33:32  77.82MB 
/bin/sh -c #(nop) ADD file:63d5ab3ef0aab308c0e71cb67292c5467f60deafa9b0418cbb220affcd078444 in / 
                        
# 2023-10-05 15:33:30  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.version=22.04
                        
# 2023-10-05 15:33:30  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.ref.name=ubuntu
                        
# 2023-10-05 15:33:30  0.00B 
/bin/sh -c #(nop)  ARG LAUNCHPAD_BUILD_ARCH
                        
# 2023-10-05 15:33:30  0.00B 
/bin/sh -c #(nop)  ARG RELEASE
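
After pulling, a similar layer-by-layer view can be reproduced locally (sizes and timestamps come from the image metadata):

docker history --no-trunc swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0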
                        
                    

Image information

{
    "Id": "sha256:cf86e18ed3214bb145a6ae124fb569b1a3194f57d03788fa4b03253f849f5a1a",
    "RepoTags": [
        "ghcr.io/huggingface/text-generation-inference:2.3.0",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0"
    ],
    "RepoDigests": [
        "ghcr.io/huggingface/text-generation-inference@sha256:dfcffa0498a806255fd14e462e864664519adf470bd7747379939208111b2138",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference@sha256:4dc58cf2e113b69c1943997084cdc91de45dd3124cdedd6cb4095a9aea97fadc"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2024-09-20T15:52:30.619696814Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/opt/conda/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "NVARCH=x86_64",
            "NVIDIA_REQUIRE_CUDA=cuda\u003e=12.1 brand=tesla,driver\u003e=470,driver\u003c471 brand=unknown,driver\u003e=470,driver\u003c471 brand=nvidia,driver\u003e=470,driver\u003c471 brand=nvidiartx,driver\u003e=470,driver\u003c471 brand=geforce,driver\u003e=470,driver\u003c471 brand=geforcertx,driver\u003e=470,driver\u003c471 brand=quadro,driver\u003e=470,driver\u003c471 brand=quadrortx,driver\u003e=470,driver\u003c471 brand=titan,driver\u003e=470,driver\u003c471 brand=titanrtx,driver\u003e=470,driver\u003c471 brand=tesla,driver\u003e=525,driver\u003c526 brand=unknown,driver\u003e=525,driver\u003c526 brand=nvidia,driver\u003e=525,driver\u003c526 brand=nvidiartx,driver\u003e=525,driver\u003c526 brand=geforce,driver\u003e=525,driver\u003c526 brand=geforcertx,driver\u003e=525,driver\u003c526 brand=quadro,driver\u003e=525,driver\u003c526 brand=quadrortx,driver\u003e=525,driver\u003c526 brand=titan,driver\u003e=525,driver\u003c526 brand=titanrtx,driver\u003e=525,driver\u003c526",
            "NV_CUDA_CUDART_VERSION=12.1.55-1",
            "NV_CUDA_COMPAT_PACKAGE=cuda-compat-12-1",
            "CUDA_VERSION=12.1.0",
            "LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64:/opt/conda/lib/",
            "NVIDIA_VISIBLE_DEVICES=all",
            "NVIDIA_DRIVER_CAPABILITIES=compute,utility",
            "CONDA_PREFIX=/opt/conda",
            "HF_HOME=/data",
            "HF_HUB_ENABLE_HF_TRANSFER=1",
            "PORT=80",
            "LD_PRELOAD=/opt/conda/lib/python3.11/site-packages/nvidia/nccl/lib/libnccl.so.2",
            "EXLLAMA_NO_FLASH_ATTN=1"
        ],
        "Cmd": null,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "/usr/src",
        "Entrypoint": [
            "/tgi-entrypoint.sh"
        ],
        "OnBuild": null,
        "Labels": {
            "maintainer": "NVIDIA CORPORATION \u003ccudatools@nvidia.com\u003e",
            "org.opencontainers.image.created": "2024-09-20T16:22:53.303Z",
            "org.opencontainers.image.description": "Large Language Model Text Generation Inference",
            "org.opencontainers.image.licenses": "Apache-2.0",
            "org.opencontainers.image.ref.name": "ubuntu",
            "org.opencontainers.image.revision": "169178b937d0c4173b0fdcd6bf10a858cfe4f428",
            "org.opencontainers.image.source": "https://github.com/huggingface/text-generation-inference",
            "org.opencontainers.image.title": "text-generation-inference",
            "org.opencontainers.image.url": "https://github.com/huggingface/text-generation-inference",
            "org.opencontainers.image.version": "2.3.0"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 13750495023,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/8d3420dc9f1e96b37358a8f7bdeb842c41bbb1c5e5112dd4f214e0d187e87a4f/diff:/var/lib/docker/overlay2/93b9824751b0c77459aa55a1f1c199ea399301d870d5e4e2b48e8cb4e6ab4ea8/diff:/var/lib/docker/overlay2/75a637159c79b16446a02d0b36a7ce913559dd0e94426f1654edbf4bb1f70944/diff:/var/lib/docker/overlay2/f444af95724d3a7547788408f0c67d93780be62579cd28b304ade6aeaf204f1a/diff:/var/lib/docker/overlay2/dcb7223648156b5cf0a3b04b82a6b157fb20724a25b21a3b7316658bf8732378/diff:/var/lib/docker/overlay2/8d9dee90e96907b6188533ea9d570f2dccff9262700d67870e1dbd4efa1f9211/diff:/var/lib/docker/overlay2/79b5a70fcc67e6d76df125904c81e39472a647bea710dd37ad012c3b2ae635a2/diff:/var/lib/docker/overlay2/acad97648841c3798802c904f6b993ac95c6c0c43f43f67f67d88c4a8a93fb3d/diff:/var/lib/docker/overlay2/124ab4023ec7249cc41ca9cd85b7d05992af3a3a101e765ce32c984327b4bf19/diff:/var/lib/docker/overlay2/c3586f864fc0c4cb974c5851ef35b731771f7ed6d9e428a0184d10e4e6f87f20/diff:/var/lib/docker/overlay2/07ee997f45766c02699c3402f19ccbe18962ad3be1e36df717aebaffb8070b47/diff:/var/lib/docker/overlay2/8855b523684e06e95e06b2c25bb7b9a803b79b461769186021543089814cdd51/diff:/var/lib/docker/overlay2/aaf0802f839877c077ad552e7f9065ea108ae8a24b04c9753445ce4e5190dc9c/diff:/var/lib/docker/overlay2/26c7d1ba1f2afd2d63d3c055bbcdf7bb07acda576c4fe9f94f94d39bda653e44/diff:/var/lib/docker/overlay2/1f95518c00248a5f0aa2ea9884c9965e6a1c95089d4b7884d165c78db1506fcd/diff:/var/lib/docker/overlay2/bbf9494e895406f739325675162e679915c34642f6aa114fbea3dca6b3ccb5e0/diff:/var/lib/docker/overlay2/a47d197ad567d8c6742e463d537d480aac5a08710e52119102f78039e78898f1/diff:/var/lib/docker/overlay2/9c5f95ca94292eb769906b2bcdb05441f086a5acb7d2ca0a737fff12a2466122/diff:/var/lib/docker/overlay2/9a1628ea667e29676261a2234fe3a5c570e45a15fd58133feb7ec443fdbf2bc3/diff:/var/lib/docker/overlay2/98c6bff0d7351c775ed3d4c4cce81c862b10be94f784617693200e79677f122b/diff:/var/lib/docker/overlay2/2e46a20926684f31275bf877872e81ac9a2edc282b621f513dc634e02244994b/diff:/var/lib/docker/overlay2/4fa207bd5f8e2680f5395fe425fffd255b422e9636b8689c3d679c6d629dd3c3/diff:/var/lib/docker/overlay2/80ec0e1f0c6f2d8ad326c1655793939b551bf3e83b1a6cffd999a5e0d28b0295/diff:/var/lib/docker/overlay2/8498b0dded1d2a4702398305c87d1e332063d279149a151a26c7cfa42a8f004d/diff:/var/lib/docker/overlay2/8881da9986f3355bef43fa3e0d7e6ab4db6217ca27235784e33f55203ee90307/diff:/var/lib/docker/overlay2/53e9c421527c6da6521e7e931a5bc8af3046423bf648b6c7ff3c3965ecc3e7ea/diff:/var/lib/docker/overlay2/9bf836674b5ea77d1d6b609f7016d4e94eef7934eb9fc235cd138fe40b1100fd/diff:/var/lib/docker/overlay2/7a61305da1c37eee299f69cfd940813bed37ec9bf020ccecee6c7704084d6e48/diff:/var/lib/docker/overlay2/86c98ab0ed88b61b06cf59d67d016cbf05a2c7cea2a7a4fe53a6be12a6bb87d7/diff:/var/lib/docker/overlay2/dddcc2c36b2b15125b046f443eaf956385215dd4fe1629756f29c2e1077ee125/diff:/var/lib/docker/overlay2/7c0ec148c160c668fe5b36bcb65f7d5badb55eb5dcdea8490e826c2a41b578a9/diff:/var/lib/docker/overlay2/b213e3c8abc27592f101ffbb1d0f2c437b5effe9b384d0f550f46a4f894f180d/diff:/var/lib/docker/overlay2/f2905627b4505cda033dd62b5a5dc1676edda5a6e1bda7cd6e6e2048fcf5aee0/diff",
            "MergedDir": "/var/lib/docker/overlay2/5b520d460e376f09e3bd90e5f20025e3d8c299897b9e72c9ee25e5f729f37fdc/merged",
            "UpperDir": "/var/lib/docker/overlay2/5b520d460e376f09e3bd90e5f20025e3d8c299897b9e72c9ee25e5f729f37fdc/diff",
            "WorkDir": "/var/lib/docker/overlay2/5b520d460e376f09e3bd90e5f20025e3d8c299897b9e72c9ee25e5f729f37fdc/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:256d88da41857db513b95b50ba9a9b28491b58c954e25477d5dad8abb465430b",
            "sha256:7b9433fba79bfc9269aab8277ea9975364db1c1f775a7ee6b14b5dffa045b294",
            "sha256:765423415d690bf8ca1510e7147d5b86dba15dcf1a3b1a515f1a85cc5dd439bb",
            "sha256:e4b1bddcbe6378dfa58bf1faa040813b74938129eb4bb06cbf083240335c5c54",
            "sha256:cd77f58b80cdcfac5fcdef06b2033fedc1115073afae035a14b6692cb5cd6650",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:99b0f0055021ba424ea2bac69d5ff058312daa3e1c69fce868398b91ef7a99f5",
            "sha256:552c660b1e8ca76d7f0bb62b6629d1786d6ddd3334d21880444f714fa839dec4",
            "sha256:9e2989174898a8a2a288b634b7146e3d8204254a60ffc5129c30f8df9c8ba339",
            "sha256:09b3046669acc3370807fcf00df283a9ed29327131d64cbeb3e6ebde590506a0",
            "sha256:14c0be4c7c06a671b628fb93c0ecb0b22686a0ca1f8cc43ec1c47fa8843d318b",
            "sha256:c5a6e30e2280eddef2d103b72767262f863fe80813abc6367319fcde9e374a26",
            "sha256:e8df3713c0e2bd346a18801a672ad90aa93fe41f1d5348bf079097d86e056952",
            "sha256:13341a9da2cfe33e07d3168b4cb9516059190e8bb064e9573c29cfed0ef7e9c4",
            "sha256:f4f69f8530066d6c7717c6b5d8e1adb1c595f2c8fb65a3bc557ea8fe9cc03903",
            "sha256:258dcd29effdcb96973d3366d801f8263965742a6113baa3f606c8e8f0fe33ee",
            "sha256:662d544d557fad83e2978b8940b2aca2afc612c6fc49a3f825e06095e7c191da",
            "sha256:0b55d48e5f53af7b940dea8201ce6743d1886b899ab1f5d15f1e971108202ac4",
            "sha256:cb06598c55737a6dfcfda4372ffd5e68d99c26e5dc3e26bd22b68ad2c97651b1",
            "sha256:ceb2a8430a40d03e2a28f14344716b5768313811e7861e859a8858114c9cb2db",
            "sha256:dbfac2c3a9d78de7442e1775904d30acc4bd6a7c139d2c8749613ad33d55da42",
            "sha256:8f7f090e9dfa2e579e3899ab662ffdd182040a2b49e1d6e8efa7c2ec66e5ff6a",
            "sha256:36272860d9b56b510961b85e8866d4f8c916c20dd1de66f940a866c46a88920c",
            "sha256:fd394c45da14de5a9975e2307c5bd7bb22d04e3ef3e4e8bc90a112545df2d65d",
            "sha256:7b0feca58ac3331f4bc9c52b8d6abe74a629dc2c506fc5aaf77e337938b3b381",
            "sha256:bb59d79290aa729974af5bf01d67a0b10a5c0348a9b6ac0b6ab0b0762af6e4d6",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:01cd183462db112795a7220197c822019bbc60cce3a921834d373f5bcc6ab122",
            "sha256:5810aec5ec8b9c0a358a3d9a5bb04e38c312f3f2533a046d1916c4083eca27a6",
            "sha256:11dd00dee41a9924ce93b408451f9d3b8fcbaf4a3460884999900e6df54a795a",
            "sha256:fe1d98a4384792f28b5c4534484aa589259adfd3641913d0714d528a4583d207",
            "sha256:63be15e6d01a5e546b1e56cfac44bd0c56a12c7826f68ae01e619a090dbe17ec",
            "sha256:cb5c93cd01715ad2dce2dd9d5a630cbfb6399cce2bdd75ae2f7f146f2276c65d",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef"
        ]
    },
    "Metadata": {
        "LastTagTime": "2024-09-23T15:34:28.806865356+08:00"
    }
}
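
The block above matches the structure of docker inspect output; assuming the image has been pulled locally, the same data can be retrieved with the command below (the CLI wraps the object in a JSON array):

docker image inspect swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0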

More versions

ghcr.io/huggingface/text-generation-inference:2.1.1    linux/amd64    ghcr.io    10.66GB    2024-09-07 04:52
ghcr.io/huggingface/text-generation-inference:2.2      linux/amd64    ghcr.io    11.37GB    2024-09-07 05:20
ghcr.io/huggingface/text-generation-inference:2.3.0    linux/amd64    ghcr.io    13.75GB    2024-09-23 15:50
ghcr.io/huggingface/text-generation-inference:2.4.0    linux/amd64    ghcr.io    14.11GB    2024-11-08 17:53
ghcr.io/huggingface/text-generation-inference:3.1.0    linux/amd64    ghcr.io    12.24GB    2025-02-11 04:56
ghcr.io/huggingface/text-generation-inference:3.1.1    linux/amd64    ghcr.io    16.21GB    2025-03-10 04:51
ghcr.io/huggingface/text-generation-inference:3.2.1    linux/amd64    ghcr.io    16.22GB    2025-03-25 00:58