ghcr.io/huggingface/text-generation-inference:2.3.0 linux/amd64

ghcr.io/huggingface/text-generation-inference:2.3.0 - China mirror download source
Image description:

This image provides Hugging Face's Text Generation Inference (TGI) server for serving pre-trained text-generation models, and it can be used in a wide range of scenarios such as text generation and language understanding. It builds on the Hugging Face Transformers ecosystem, supports many pre-trained model families, and delivers strong performance across a variety of tasks.

Supported use cases include:

  • Text generation
  • Language understanding
  • Question answering, and more

With this image you can quickly build and deploy your own applications. Feel free to try it out and experiment!
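
As a quick start, a minimal launch command might look like the sketch below. It assumes a GPU host with the NVIDIA container toolkit installed, and <model-id> is only a placeholder for whichever Hugging Face model you want to serve. The image already sets HF_HOME=/data and PORT=80, so mounting a host directory at /data caches downloaded weights and publishing container port 80 exposes the API:

docker run --gpus all --shm-size 1g -p 8080:80 -v $PWD/tgi-data:/data \
  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0 \
  --model-id <model-id>

Once the server is up, a request against TGI's standard /generate endpoint should return generated text, for example:

curl 127.0.0.1:8080/generate -X POST -H 'Content-Type: application/json' \
  -d '{"inputs":"What is deep learning?","parameters":{"max_new_tokens":20}}'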

Source image ghcr.io/huggingface/text-generation-inference:2.3.0
China mirror swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0
Image ID sha256:cf86e18ed3214bb145a6ae124fb569b1a3194f57d03788fa4b03253f849f5a1a
Image tag 2.3.0
Size 13.75GB
Registry ghcr.io
CMD
Entrypoint /tgi-entrypoint.sh
Working directory /usr/src
OS/Platform linux/amd64
Views 184
Contributor gu******u@myhexin.com
Image created 2024-09-20T15:52:30.619696814Z
Synced 2024-09-23 15:50
Updated 2024-11-13 06:36
Environment variables
PATH=/opt/conda/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
NVARCH=x86_64
NVIDIA_REQUIRE_CUDA=cuda>=12.1 brand=tesla,driver>=470,driver<471 brand=unknown,driver>=470,driver<471 brand=nvidia,driver>=470,driver<471 brand=nvidiartx,driver>=470,driver<471 brand=geforce,driver>=470,driver<471 brand=geforcertx,driver>=470,driver<471 brand=quadro,driver>=470,driver<471 brand=quadrortx,driver>=470,driver<471 brand=titan,driver>=470,driver<471 brand=titanrtx,driver>=470,driver<471 brand=tesla,driver>=525,driver<526 brand=unknown,driver>=525,driver<526 brand=nvidia,driver>=525,driver<526 brand=nvidiartx,driver>=525,driver<526 brand=geforce,driver>=525,driver<526 brand=geforcertx,driver>=525,driver<526 brand=quadro,driver>=525,driver<526 brand=quadrortx,driver>=525,driver<526 brand=titan,driver>=525,driver<526 brand=titanrtx,driver>=525,driver<526
NV_CUDA_CUDART_VERSION=12.1.55-1
NV_CUDA_COMPAT_PACKAGE=cuda-compat-12-1
CUDA_VERSION=12.1.0
LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64:/opt/conda/lib/
NVIDIA_VISIBLE_DEVICES=all
NVIDIA_DRIVER_CAPABILITIES=compute,utility
CONDA_PREFIX=/opt/conda
HF_HOME=/data
HF_HUB_ENABLE_HF_TRANSFER=1
PORT=80
LD_PRELOAD=/opt/conda/lib/python3.11/site-packages/nvidia/nccl/lib/libnccl.so.2
EXLLAMA_NO_FLASH_ATTN=1
Image labels
maintainer: NVIDIA CORPORATION <cudatools@nvidia.com>
org.opencontainers.image.created: 2024-09-20T16:22:53.303Z
org.opencontainers.image.description: Large Language Model Text Generation Inference
org.opencontainers.image.licenses: Apache-2.0
org.opencontainers.image.ref.name: ubuntu
org.opencontainers.image.revision: 169178b937d0c4173b0fdcd6bf10a858cfe4f428
org.opencontainers.image.source: https://github.com/huggingface/text-generation-inference
org.opencontainers.image.title: text-generation-inference
org.opencontainers.image.url: https://github.com/huggingface/text-generation-inference
org.opencontainers.image.version: 2.3.0
Image security scan (Trivy)

OS: ubuntu 22.04 | Scan engine: Trivy | Scan time: 2024-10-24 21:44

Low-severity vulnerabilities: 159 | Medium: 750 | High: 2 | Critical: 0

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0  ghcr.io/huggingface/text-generation-inference:2.3.0
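
After re-tagging, the mirror-specific name can optionally be removed so that only the canonical ghcr.io name remains (an optional cleanup step, not part of the page's own instructions); the image layers are kept because the ghcr.io tag still references them:

docker rmi swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0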

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0  ghcr.io/huggingface/text-generation-inference:2.3.0
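
On Kubernetes nodes, containerd usually manages images in the k8s.io namespace, so kubelet will only see the image if it is pulled and tagged there. A variant of the same commands for that setup (an assumption about your node configuration, not part of the original instructions) would be:

ctr -n k8s.io images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0
ctr -n k8s.io images tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0 ghcr.io/huggingface/text-generation-inference:2.3.0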

Shell quick-replace command

sed -i 's#ghcr.io/huggingface/text-generation-inference:2.3.0#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0#' deployment.yaml
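
The same substitution can be applied to every manifest under a directory in one pass; the ./manifests path below is only an example, and the trailing g flag makes sed replace every occurrence in each file:

find ./manifests -name '*.yaml' -exec sed -i 's#ghcr.io/huggingface/text-generation-inference:2.3.0#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0#g' {} +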

Ansible quick distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0  ghcr.io/huggingface/text-generation-inference:2.3.0'

Ansible quick distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0  ghcr.io/huggingface/text-generation-inference:2.3.0'

Image history

Size | Created | Layer instruction
0.00B 2024-09-20 23:52:30 ENTRYPOINT ["/tgi-entrypoint.sh"]
0.00B 2024-09-20 23:52:30 RUN /bin/sh -c chmod +x /tgi-entrypoint.sh # buildkit
130.00B 2024-09-20 23:52:30 COPY ./tgi-entrypoint.sh /tgi-entrypoint.sh # buildkit
5.95MB 2024-09-20 23:52:30 COPY /usr/src/target/release-opt/text-generation-launcher /usr/local/bin/text-generation-launcher # buildkit
36.13MB 2024-09-20 23:52:30 COPY /usr/src/target/release-opt/text-generation-router /usr/local/bin/text-generation-router # buildkit
11.02MB 2024-09-20 23:52:30 COPY /usr/src/target/release-opt/text-generation-benchmark /usr/local/bin/text-generation-benchmark # buildkit
220.15MB 2024-09-20 20:38:16 RUN /bin/sh -c apt-get update && DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends build-essential g++ && rm -rf /var/lib/apt/lists/* # buildkit
0.00B 2024-09-20 20:38:06 ENV EXLLAMA_NO_FLASH_ATTN=1
0.00B 2024-09-20 20:38:06 ENV LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64:/opt/conda/lib/
0.00B 2024-09-20 20:38:06 ENV LD_PRELOAD=/opt/conda/lib/python3.11/site-packages/nvidia/nccl/lib/libnccl.so.2
1.90GB 2024-09-20 20:38:06 RUN /bin/sh -c cd server && make gen-server && pip install -r requirements_cuda.txt && pip install ".[bnb, accelerate, marlin, moe, quantize, peft, outlines]" --no-cache-dir && pip install nvidia-nccl-cu12==2.22.3 # buildkit
0.00B 2024-09-20 23:52:30 COPY server/Makefile server/Makefile # buildkit
2.10MB 2024-09-20 20:37:09 COPY server server # buildkit
12.51KB 2024-09-20 01:49:50 COPY proto proto # buildkit
331.50KB 2024-09-20 01:49:50 RUN /bin/sh -c pip install einops --no-cache-dir # buildkit
1.85GB 2024-09-20 01:49:49 COPY /opt/conda/lib/python3.11/site-packages/flashinfer/ /opt/conda/lib/python3.11/site-packages/flashinfer/ # buildkit
23.81MB 2024-09-20 01:49:26 COPY /usr/src/causal-conv1d/build/lib.linux-x86_64-cpython-311/ /opt/conda/lib/python3.11/site-packages # buildkit
183.76MB 2024-09-20 01:49:26 COPY /usr/src/mamba/build/lib.linux-x86_64-cpython-311/ /opt/conda/lib/python3.11/site-packages # buildkit
135.10MB 2024-09-20 01:48:41 COPY /usr/src/vllm/build/lib.linux-x86_64-cpython-311 /opt/conda/lib/python3.11/site-packages # buildkit
9.60MB 2024-09-12 03:37:59 COPY /usr/src/fbgemm/fbgemm_gpu/_skbuild/linux-x86_64-3.11/cmake-install /opt/conda/lib/python3.11/site-packages # buildkit
24.03MB 2024-09-12 03:37:59 COPY /usr/src/lorax-punica/server/punica_kernels/build/lib.linux-x86_64-cpython-311 /opt/conda/lib/python3.11/site-packages # buildkit
41.52MB 2024-09-12 03:37:59 COPY /usr/src/eetq/build/lib.linux-x86_64-cpython-311 /opt/conda/lib/python3.11/site-packages # buildkit
9.69MB 2024-09-12 03:32:15 COPY /usr/src/llm-awq/awq/kernels/build/lib.linux-x86_64-cpython-311 /opt/conda/lib/python3.11/site-packages # buildkit
94.93MB 2024-09-12 03:32:15 COPY /usr/src/exllamav2/build/lib.linux-x86_64-cpython-311 /opt/conda/lib/python3.11/site-packages # buildkit
555.78KB 2024-09-12 03:32:15 COPY /usr/src/build/lib.linux-x86_64-cpython-311 /opt/conda/lib/python3.11/site-packages # buildkit
3.03MB 2024-09-12 03:32:15 COPY /usr/src/build/lib.linux-x86_64-cpython-311 /opt/conda/lib/python3.11/site-packages # buildkit
664.43MB 2024-09-12 03:32:15 COPY /opt/conda/lib/python3.11/site-packages/flash_attn_2_cuda.cpython-311-x86_64-linux-gnu.so /opt/conda/lib/python3.11/site-packages # buildkit
10.71MB 2024-09-12 03:32:15 COPY /usr/src/flash-attention/csrc/rotary/build/lib.linux-x86_64-cpython-311 /opt/conda/lib/python3.11/site-packages # buildkit
725.06MB 2024-09-12 03:32:15 COPY /usr/src/flash-attention/csrc/layer_norm/build/lib.linux-x86_64-cpython-311 /opt/conda/lib/python3.11/site-packages # buildkit
153.86MB 2024-09-12 03:32:14 COPY /usr/src/flash-attention/build/lib.linux-x86_64-cpython-311 /opt/conda/lib/python3.11/site-packages # buildkit
7.32GB 2024-09-12 03:18:46 COPY /opt/conda /opt/conda # buildkit
91.99MB 2024-08-09 21:02:09 RUN /bin/sh -c apt-get update && DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends libssl-dev ca-certificates make curl git && rm -rf /var/lib/apt/lists/* # buildkit
0.00B 2024-09-20 23:52:30 WORKDIR /usr/src
0.00B 2024-09-20 23:52:30 ENV HF_HOME=/data HF_HUB_ENABLE_HF_TRANSFER=1 PORT=80
0.00B 2024-09-20 23:52:30 ENV PATH=/opt/conda/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin CONDA_PREFIX=/opt/conda
0.00B 2023-11-10 13:44:29 ENV NVIDIA_DRIVER_CAPABILITIES=compute,utility
0.00B 2023-11-10 13:44:29 ENV NVIDIA_VISIBLE_DEVICES=all
17.29KB 2023-11-10 13:44:29 COPY NGC-DL-CONTAINER-LICENSE / # buildkit
0.00B 2023-11-10 13:44:29 ENV LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64
0.00B 2023-11-10 13:44:29 ENV PATH=/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
46.00B 2023-11-10 13:44:29 RUN |1 TARGETARCH=amd64 /bin/sh -c echo "/usr/local/nvidia/lib" >> /etc/ld.so.conf.d/nvidia.conf && echo "/usr/local/nvidia/lib64" >> /etc/ld.so.conf.d/nvidia.conf # buildkit
149.59MB 2023-11-10 13:44:29 RUN |1 TARGETARCH=amd64 /bin/sh -c apt-get update && apt-get install -y --no-install-recommends cuda-cudart-12-1=${NV_CUDA_CUDART_VERSION} ${NV_CUDA_COMPAT_PACKAGE} && rm -rf /var/lib/apt/lists/* # buildkit
0.00B 2023-11-10 13:44:18 ENV CUDA_VERSION=12.1.0
10.56MB 2023-11-10 13:44:18 RUN |1 TARGETARCH=amd64 /bin/sh -c apt-get update && apt-get install -y --no-install-recommends gnupg2 curl ca-certificates && curl -fsSLO https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/${NVARCH}/cuda-keyring_1.0-1_all.deb && dpkg -i cuda-keyring_1.0-1_all.deb && apt-get purge --autoremove -y curl && rm -rf /var/lib/apt/lists/* # buildkit
0.00B 2023-11-10 13:44:18 LABEL maintainer=NVIDIA CORPORATION <cudatools@nvidia.com>
0.00B 2023-11-10 13:44:18 ARG TARGETARCH
0.00B 2023-11-10 13:44:18 ENV NV_CUDA_COMPAT_PACKAGE=cuda-compat-12-1
0.00B 2023-11-10 13:44:18 ENV NV_CUDA_CUDART_VERSION=12.1.55-1
0.00B 2023-11-10 13:44:18 ENV NVIDIA_REQUIRE_CUDA=cuda>=12.1 brand=tesla,driver>=470,driver<471 brand=unknown,driver>=470,driver<471 brand=nvidia,driver>=470,driver<471 brand=nvidiartx,driver>=470,driver<471 brand=geforce,driver>=470,driver<471 brand=geforcertx,driver>=470,driver<471 brand=quadro,driver>=470,driver<471 brand=quadrortx,driver>=470,driver<471 brand=titan,driver>=470,driver<471 brand=titanrtx,driver>=470,driver<471 brand=tesla,driver>=525,driver<526 brand=unknown,driver>=525,driver<526 brand=nvidia,driver>=525,driver<526 brand=nvidiartx,driver>=525,driver<526 brand=geforce,driver>=525,driver<526 brand=geforcertx,driver>=525,driver<526 brand=quadro,driver>=525,driver<526 brand=quadrortx,driver>=525,driver<526 brand=titan,driver>=525,driver<526 brand=titanrtx,driver>=525,driver<526
0.00B 2023-11-10 13:44:18 ENV NVARCH=x86_64
0.00B 2023-10-05 15:33:32 /bin/sh -c #(nop) CMD ["/bin/bash"]
77.82MB 2023-10-05 15:33:32 /bin/sh -c #(nop) ADD file:63d5ab3ef0aab308c0e71cb67292c5467f60deafa9b0418cbb220affcd078444 in /
0.00B 2023-10-05 15:33:30 /bin/sh -c #(nop) LABEL org.opencontainers.image.version=22.04
0.00B 2023-10-05 15:33:30 /bin/sh -c #(nop) LABEL org.opencontainers.image.ref.name=ubuntu
0.00B 2023-10-05 15:33:30 /bin/sh -c #(nop) ARG LAUNCHPAD_BUILD_ARCH
0.00B 2023-10-05 15:33:30 /bin/sh -c #(nop) ARG RELEASE

Image information

{
    "Id": "sha256:cf86e18ed3214bb145a6ae124fb569b1a3194f57d03788fa4b03253f849f5a1a",
    "RepoTags": [
        "ghcr.io/huggingface/text-generation-inference:2.3.0",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference:2.3.0"
    ],
    "RepoDigests": [
        "ghcr.io/huggingface/text-generation-inference@sha256:dfcffa0498a806255fd14e462e864664519adf470bd7747379939208111b2138",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-generation-inference@sha256:4dc58cf2e113b69c1943997084cdc91de45dd3124cdedd6cb4095a9aea97fadc"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2024-09-20T15:52:30.619696814Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/opt/conda/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "NVARCH=x86_64",
            "NVIDIA_REQUIRE_CUDA=cuda\u003e=12.1 brand=tesla,driver\u003e=470,driver\u003c471 brand=unknown,driver\u003e=470,driver\u003c471 brand=nvidia,driver\u003e=470,driver\u003c471 brand=nvidiartx,driver\u003e=470,driver\u003c471 brand=geforce,driver\u003e=470,driver\u003c471 brand=geforcertx,driver\u003e=470,driver\u003c471 brand=quadro,driver\u003e=470,driver\u003c471 brand=quadrortx,driver\u003e=470,driver\u003c471 brand=titan,driver\u003e=470,driver\u003c471 brand=titanrtx,driver\u003e=470,driver\u003c471 brand=tesla,driver\u003e=525,driver\u003c526 brand=unknown,driver\u003e=525,driver\u003c526 brand=nvidia,driver\u003e=525,driver\u003c526 brand=nvidiartx,driver\u003e=525,driver\u003c526 brand=geforce,driver\u003e=525,driver\u003c526 brand=geforcertx,driver\u003e=525,driver\u003c526 brand=quadro,driver\u003e=525,driver\u003c526 brand=quadrortx,driver\u003e=525,driver\u003c526 brand=titan,driver\u003e=525,driver\u003c526 brand=titanrtx,driver\u003e=525,driver\u003c526",
            "NV_CUDA_CUDART_VERSION=12.1.55-1",
            "NV_CUDA_COMPAT_PACKAGE=cuda-compat-12-1",
            "CUDA_VERSION=12.1.0",
            "LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64:/opt/conda/lib/",
            "NVIDIA_VISIBLE_DEVICES=all",
            "NVIDIA_DRIVER_CAPABILITIES=compute,utility",
            "CONDA_PREFIX=/opt/conda",
            "HF_HOME=/data",
            "HF_HUB_ENABLE_HF_TRANSFER=1",
            "PORT=80",
            "LD_PRELOAD=/opt/conda/lib/python3.11/site-packages/nvidia/nccl/lib/libnccl.so.2",
            "EXLLAMA_NO_FLASH_ATTN=1"
        ],
        "Cmd": null,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "/usr/src",
        "Entrypoint": [
            "/tgi-entrypoint.sh"
        ],
        "OnBuild": null,
        "Labels": {
            "maintainer": "NVIDIA CORPORATION \u003ccudatools@nvidia.com\u003e",
            "org.opencontainers.image.created": "2024-09-20T16:22:53.303Z",
            "org.opencontainers.image.description": "Large Language Model Text Generation Inference",
            "org.opencontainers.image.licenses": "Apache-2.0",
            "org.opencontainers.image.ref.name": "ubuntu",
            "org.opencontainers.image.revision": "169178b937d0c4173b0fdcd6bf10a858cfe4f428",
            "org.opencontainers.image.source": "https://github.com/huggingface/text-generation-inference",
            "org.opencontainers.image.title": "text-generation-inference",
            "org.opencontainers.image.url": "https://github.com/huggingface/text-generation-inference",
            "org.opencontainers.image.version": "2.3.0"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 13750495023,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/8d3420dc9f1e96b37358a8f7bdeb842c41bbb1c5e5112dd4f214e0d187e87a4f/diff:/var/lib/docker/overlay2/93b9824751b0c77459aa55a1f1c199ea399301d870d5e4e2b48e8cb4e6ab4ea8/diff:/var/lib/docker/overlay2/75a637159c79b16446a02d0b36a7ce913559dd0e94426f1654edbf4bb1f70944/diff:/var/lib/docker/overlay2/f444af95724d3a7547788408f0c67d93780be62579cd28b304ade6aeaf204f1a/diff:/var/lib/docker/overlay2/dcb7223648156b5cf0a3b04b82a6b157fb20724a25b21a3b7316658bf8732378/diff:/var/lib/docker/overlay2/8d9dee90e96907b6188533ea9d570f2dccff9262700d67870e1dbd4efa1f9211/diff:/var/lib/docker/overlay2/79b5a70fcc67e6d76df125904c81e39472a647bea710dd37ad012c3b2ae635a2/diff:/var/lib/docker/overlay2/acad97648841c3798802c904f6b993ac95c6c0c43f43f67f67d88c4a8a93fb3d/diff:/var/lib/docker/overlay2/124ab4023ec7249cc41ca9cd85b7d05992af3a3a101e765ce32c984327b4bf19/diff:/var/lib/docker/overlay2/c3586f864fc0c4cb974c5851ef35b731771f7ed6d9e428a0184d10e4e6f87f20/diff:/var/lib/docker/overlay2/07ee997f45766c02699c3402f19ccbe18962ad3be1e36df717aebaffb8070b47/diff:/var/lib/docker/overlay2/8855b523684e06e95e06b2c25bb7b9a803b79b461769186021543089814cdd51/diff:/var/lib/docker/overlay2/aaf0802f839877c077ad552e7f9065ea108ae8a24b04c9753445ce4e5190dc9c/diff:/var/lib/docker/overlay2/26c7d1ba1f2afd2d63d3c055bbcdf7bb07acda576c4fe9f94f94d39bda653e44/diff:/var/lib/docker/overlay2/1f95518c00248a5f0aa2ea9884c9965e6a1c95089d4b7884d165c78db1506fcd/diff:/var/lib/docker/overlay2/bbf9494e895406f739325675162e679915c34642f6aa114fbea3dca6b3ccb5e0/diff:/var/lib/docker/overlay2/a47d197ad567d8c6742e463d537d480aac5a08710e52119102f78039e78898f1/diff:/var/lib/docker/overlay2/9c5f95ca94292eb769906b2bcdb05441f086a5acb7d2ca0a737fff12a2466122/diff:/var/lib/docker/overlay2/9a1628ea667e29676261a2234fe3a5c570e45a15fd58133feb7ec443fdbf2bc3/diff:/var/lib/docker/overlay2/98c6bff0d7351c775ed3d4c4cce81c862b10be94f784617693200e79677f122b/diff:/var/lib/docker/overlay2/2e46a20926684f31275bf877872e81ac9a2edc282b621f513dc634e02244994b/diff:/var/lib/docker/overlay2/4fa207bd5f8e2680f5395fe425fffd255b422e9636b8689c3d679c6d629dd3c3/diff:/var/lib/docker/overlay2/80ec0e1f0c6f2d8ad326c1655793939b551bf3e83b1a6cffd999a5e0d28b0295/diff:/var/lib/docker/overlay2/8498b0dded1d2a4702398305c87d1e332063d279149a151a26c7cfa42a8f004d/diff:/var/lib/docker/overlay2/8881da9986f3355bef43fa3e0d7e6ab4db6217ca27235784e33f55203ee90307/diff:/var/lib/docker/overlay2/53e9c421527c6da6521e7e931a5bc8af3046423bf648b6c7ff3c3965ecc3e7ea/diff:/var/lib/docker/overlay2/9bf836674b5ea77d1d6b609f7016d4e94eef7934eb9fc235cd138fe40b1100fd/diff:/var/lib/docker/overlay2/7a61305da1c37eee299f69cfd940813bed37ec9bf020ccecee6c7704084d6e48/diff:/var/lib/docker/overlay2/86c98ab0ed88b61b06cf59d67d016cbf05a2c7cea2a7a4fe53a6be12a6bb87d7/diff:/var/lib/docker/overlay2/dddcc2c36b2b15125b046f443eaf956385215dd4fe1629756f29c2e1077ee125/diff:/var/lib/docker/overlay2/7c0ec148c160c668fe5b36bcb65f7d5badb55eb5dcdea8490e826c2a41b578a9/diff:/var/lib/docker/overlay2/b213e3c8abc27592f101ffbb1d0f2c437b5effe9b384d0f550f46a4f894f180d/diff:/var/lib/docker/overlay2/f2905627b4505cda033dd62b5a5dc1676edda5a6e1bda7cd6e6e2048fcf5aee0/diff",
            "MergedDir": "/var/lib/docker/overlay2/5b520d460e376f09e3bd90e5f20025e3d8c299897b9e72c9ee25e5f729f37fdc/merged",
            "UpperDir": "/var/lib/docker/overlay2/5b520d460e376f09e3bd90e5f20025e3d8c299897b9e72c9ee25e5f729f37fdc/diff",
            "WorkDir": "/var/lib/docker/overlay2/5b520d460e376f09e3bd90e5f20025e3d8c299897b9e72c9ee25e5f729f37fdc/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:256d88da41857db513b95b50ba9a9b28491b58c954e25477d5dad8abb465430b",
            "sha256:7b9433fba79bfc9269aab8277ea9975364db1c1f775a7ee6b14b5dffa045b294",
            "sha256:765423415d690bf8ca1510e7147d5b86dba15dcf1a3b1a515f1a85cc5dd439bb",
            "sha256:e4b1bddcbe6378dfa58bf1faa040813b74938129eb4bb06cbf083240335c5c54",
            "sha256:cd77f58b80cdcfac5fcdef06b2033fedc1115073afae035a14b6692cb5cd6650",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:99b0f0055021ba424ea2bac69d5ff058312daa3e1c69fce868398b91ef7a99f5",
            "sha256:552c660b1e8ca76d7f0bb62b6629d1786d6ddd3334d21880444f714fa839dec4",
            "sha256:9e2989174898a8a2a288b634b7146e3d8204254a60ffc5129c30f8df9c8ba339",
            "sha256:09b3046669acc3370807fcf00df283a9ed29327131d64cbeb3e6ebde590506a0",
            "sha256:14c0be4c7c06a671b628fb93c0ecb0b22686a0ca1f8cc43ec1c47fa8843d318b",
            "sha256:c5a6e30e2280eddef2d103b72767262f863fe80813abc6367319fcde9e374a26",
            "sha256:e8df3713c0e2bd346a18801a672ad90aa93fe41f1d5348bf079097d86e056952",
            "sha256:13341a9da2cfe33e07d3168b4cb9516059190e8bb064e9573c29cfed0ef7e9c4",
            "sha256:f4f69f8530066d6c7717c6b5d8e1adb1c595f2c8fb65a3bc557ea8fe9cc03903",
            "sha256:258dcd29effdcb96973d3366d801f8263965742a6113baa3f606c8e8f0fe33ee",
            "sha256:662d544d557fad83e2978b8940b2aca2afc612c6fc49a3f825e06095e7c191da",
            "sha256:0b55d48e5f53af7b940dea8201ce6743d1886b899ab1f5d15f1e971108202ac4",
            "sha256:cb06598c55737a6dfcfda4372ffd5e68d99c26e5dc3e26bd22b68ad2c97651b1",
            "sha256:ceb2a8430a40d03e2a28f14344716b5768313811e7861e859a8858114c9cb2db",
            "sha256:dbfac2c3a9d78de7442e1775904d30acc4bd6a7c139d2c8749613ad33d55da42",
            "sha256:8f7f090e9dfa2e579e3899ab662ffdd182040a2b49e1d6e8efa7c2ec66e5ff6a",
            "sha256:36272860d9b56b510961b85e8866d4f8c916c20dd1de66f940a866c46a88920c",
            "sha256:fd394c45da14de5a9975e2307c5bd7bb22d04e3ef3e4e8bc90a112545df2d65d",
            "sha256:7b0feca58ac3331f4bc9c52b8d6abe74a629dc2c506fc5aaf77e337938b3b381",
            "sha256:bb59d79290aa729974af5bf01d67a0b10a5c0348a9b6ac0b6ab0b0762af6e4d6",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:01cd183462db112795a7220197c822019bbc60cce3a921834d373f5bcc6ab122",
            "sha256:5810aec5ec8b9c0a358a3d9a5bb04e38c312f3f2533a046d1916c4083eca27a6",
            "sha256:11dd00dee41a9924ce93b408451f9d3b8fcbaf4a3460884999900e6df54a795a",
            "sha256:fe1d98a4384792f28b5c4534484aa589259adfd3641913d0714d528a4583d207",
            "sha256:63be15e6d01a5e546b1e56cfac44bd0c56a12c7826f68ae01e619a090dbe17ec",
            "sha256:cb5c93cd01715ad2dce2dd9d5a630cbfb6399cce2bdd75ae2f7f146f2276c65d",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef"
        ]
    },
    "Metadata": {
        "LastTagTime": "2024-09-23T15:34:28.806865356+08:00"
    }
}

More versions

ghcr.io/huggingface/text-generation-inference:2.1.1
linux/amd64 | ghcr.io | 10.66GB | synced 2024-09-07 04:52 | views 90

ghcr.io/huggingface/text-generation-inference:2.2
linux/amd64 | ghcr.io | 11.37GB | synced 2024-09-07 05:20 | views 151

ghcr.io/huggingface/text-generation-inference:2.3.0
linux/amd64 | ghcr.io | 13.75GB | synced 2024-09-23 15:50 | views 183

ghcr.io/huggingface/text-generation-inference:2.4.0
linux/amd64 | ghcr.io | 14.11GB | synced 2024-11-08 17:53 | views 23