ghcr.io/huggingface/text-embeddings-inference:cpu-1.8 linux/amd64

ghcr.io/huggingface/text-embeddings-inference:cpu-1.8 - China mirror download source

Text Embeddings Inference

Hugging Face provides a Docker image for text embeddings inference.
Source image  ghcr.io/huggingface/text-embeddings-inference:cpu-1.8
China mirror  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8
Image ID  sha256:787a2856287437bf62affc8e8fe57fa99a1ebca54e28cf8b0246e0481aa89751
Image tag  cpu-1.8
Size  684.32MB
Registry  ghcr.io
CMD  --json-output
Entrypoint  text-embeddings-router
Working directory  (not set)
OS/Platform  linux/amd64
Image created  2025-08-05T07:07:23.481594946Z
Synced  2025-09-04 10:19
Updated  2025-09-05 15:08
Environment variables
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
HUGGINGFACE_HUB_CACHE=/data
PORT=80
MKL_ENABLE_INSTRUCTIONS=AVX512_E4
RAYON_NUM_THREADS=8
LD_PRELOAD=/usr/local/libfakeintel.so
LD_LIBRARY_PATH=/usr/local/lib
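
The PORT and HUGGINGFACE_HUB_CACHE variables above suggest a typical way to run the image locally. A minimal sketch, assuming the router's --model-id flag; the model ID (BAAI/bge-small-en-v1.5) and host paths are placeholders, not part of this page:

docker run -d --name tei-cpu \
  -p 8080:80 \
  -v $PWD/tei-data:/data \
  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8 \
  --model-id BAAI/bge-small-en-v1.5 --json-output

# Quick check once the model has loaded (endpoint assumed from the upstream project docs)
curl 127.0.0.1:8080/embed -X POST -H 'Content-Type: application/json' -d '{"inputs": "hello"}'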
Image labels
org.opencontainers.image.created: 2025-08-05T08:33:33.037Z
org.opencontainers.image.description: A blazing fast inference solution for text embeddings models
org.opencontainers.image.licenses: Apache-2.0
org.opencontainers.image.revision: 2bff275313a7b93e9a5d4dc1dbfdce8e72c7d820
org.opencontainers.image.source: https://github.com/huggingface/text-embeddings-inference
org.opencontainers.image.title: text-embeddings-inference
org.opencontainers.image.url: https://github.com/huggingface/text-embeddings-inference
org.opencontainers.image.version: cpu-1.8.0

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8  ghcr.io/huggingface/text-embeddings-inference:cpu-1.8
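
To confirm that the retagged image is the same content as the upstream one, its digest can be compared against the RepoDigests listed in the image information below; a small check with the Docker CLI:

docker inspect --format '{{json .RepoDigests}}' ghcr.io/huggingface/text-embeddings-inference:cpu-1.8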

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8  ghcr.io/huggingface/text-embeddings-inference:cpu-1.8
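
On Kubernetes nodes that use containerd through the CRI, images are looked up in the k8s.io namespace, so the pull and tag usually need the -n flag; a hedged variant of the commands above:

ctr -n k8s.io images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8
ctr -n k8s.io images tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8 ghcr.io/huggingface/text-embeddings-inference:cpu-1.8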

Shell quick replace command

sed -i 's#ghcr.io/huggingface/text-embeddings-inference:cpu-1.8#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8#' deployment.yaml
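
For a whole directory of manifests rather than a single deployment.yaml, the same substitution can be applied to every file that references the image; a sketch assuming the manifests live under manifests/ (run it once only, since the mirror path itself contains the original string):

grep -rl 'ghcr.io/huggingface/text-embeddings-inference:cpu-1.8' manifests/ | \
  xargs sed -i 's#ghcr.io/huggingface/text-embeddings-inference:cpu-1.8#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8#'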

Ansible quick distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8  ghcr.io/huggingface/text-embeddings-inference:cpu-1.8'

Ansible quick distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8  ghcr.io/huggingface/text-embeddings-inference:cpu-1.8'
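
After distribution, the image can be spot-checked on every node with the same inventory group (the k8s group name is taken from the commands above; uncomment to run):

#ansible k8s -m shell -a 'ctr images ls | grep text-embeddings-inference'
#ansible k8s -m shell -a 'docker images | grep text-embeddings-inference'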

Image build history


# 2025-08-05 15:07:23  0.00B Set the default command to execute
CMD ["--json-output"]
                        
# 2025-08-05 15:07:23  0.00B Configure the command run when the container starts
ENTRYPOINT ["text-embeddings-router"]
                        
# 2025-08-05 15:07:23  70.09MB Copy new files or directories into the container
COPY /usr/src/target/release/text-embeddings-router /usr/local/bin/text-embeddings-router # buildkit
                        
# 2025-07-23 16:42:57  15.03KB Copy new files or directories into the container
COPY /usr/src/libfakeintel.so /usr/local/libfakeintel.so # buildkit
                        
# 2025-07-23 16:42:57  65.71MB Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_avx512.so.2 /usr/local/lib/libmkl_avx512.so.2 # buildkit
                        
# 2025-07-23 16:42:57  48.84MB Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_avx2.so.2 /usr/local/lib/libmkl_avx2.so.2 # buildkit
                        
# 2025-07-23 16:42:57  14.25MB Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_vml_avx512.so.2 /usr/local/lib/libmkl_vml_avx512.so.2 # buildkit
                        
# 2025-07-23 16:42:57  14.79MB Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_vml_avx2.so.2 /usr/local/lib/libmkl_vml_avx2.so.2 # buildkit
                        
# 2025-07-23 16:42:57  41.16MB Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_def.so.2 /usr/local/lib/libmkl_def.so.2 # buildkit
                        
# 2025-07-23 16:42:57  8.75MB Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_vml_def.so.2 /usr/local/lib/libmkl_vml_def.so.2 # buildkit
                        
# 2025-07-23 16:42:57  71.24MB Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_core.so.2 /usr/local/lib/libmkl_core.so.2 # buildkit
                        
# 2025-07-23 16:42:57  41.93MB Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_intel_thread.so.2 /usr/local/lib/libmkl_intel_thread.so.2 # buildkit
                        
# 2025-07-23 16:42:57  24.38MB Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_intel_lp64.so.2 /usr/local/lib/libmkl_intel_lp64.so.2 # buildkit
                        
# 2025-07-23 16:40:20  208.36MB Run a command and create a new image layer
RUN /bin/sh -c apt-get update && DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends     libomp-dev     ca-certificates     libssl-dev     curl     && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2025-07-23 16:40:20  0.00B Set environment variables HUGGINGFACE_HUB_CACHE PORT MKL_ENABLE_INSTRUCTIONS RAYON_NUM_THREADS LD_PRELOAD LD_LIBRARY_PATH
ENV HUGGINGFACE_HUB_CACHE=/data PORT=80 MKL_ENABLE_INSTRUCTIONS=AVX512_E4 RAYON_NUM_THREADS=8 LD_PRELOAD=/usr/local/libfakeintel.so LD_LIBRARY_PATH=/usr/local/lib
                        
# 2025-07-21 08:00:00  74.81MB 
# debian.sh --arch 'amd64' out/ 'bookworm' '@1753056000'
                        
                    

Image information

{
    "Id": "sha256:787a2856287437bf62affc8e8fe57fa99a1ebca54e28cf8b0246e0481aa89751",
    "RepoTags": [
        "ghcr.io/huggingface/text-embeddings-inference:cpu-1.8",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8"
    ],
    "RepoDigests": [
        "ghcr.io/huggingface/text-embeddings-inference@sha256:dd16409ca456bd862ffffbb076851709619630166fa7ec26621653301d981225",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference@sha256:a6d362d8cf5d144440e926fd31d165864a2aa817b75b0b9d8967e9b59a4c3d45"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2025-08-05T07:07:23.481594946Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "HUGGINGFACE_HUB_CACHE=/data",
            "PORT=80",
            "MKL_ENABLE_INSTRUCTIONS=AVX512_E4",
            "RAYON_NUM_THREADS=8",
            "LD_PRELOAD=/usr/local/libfakeintel.so",
            "LD_LIBRARY_PATH=/usr/local/lib"
        ],
        "Cmd": [
            "--json-output"
        ],
        "ArgsEscaped": true,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "",
        "Entrypoint": [
            "text-embeddings-router"
        ],
        "OnBuild": null,
        "Labels": {
            "org.opencontainers.image.created": "2025-08-05T08:33:33.037Z",
            "org.opencontainers.image.description": "A blazing fast inference solution for text embeddings models",
            "org.opencontainers.image.licenses": "Apache-2.0",
            "org.opencontainers.image.revision": "2bff275313a7b93e9a5d4dc1dbfdce8e72c7d820",
            "org.opencontainers.image.source": "https://github.com/huggingface/text-embeddings-inference",
            "org.opencontainers.image.title": "text-embeddings-inference",
            "org.opencontainers.image.url": "https://github.com/huggingface/text-embeddings-inference",
            "org.opencontainers.image.version": "cpu-1.8.0"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 684315260,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/a43077cf94a9bef27ac4990d0d984c9d90d197ac984bf6e747447e94a103d8cb/diff:/var/lib/docker/overlay2/5cdc1abf0c7c2ec585b5e5b0085a4a9f680f0ce05a565f35f45ebc6408508bbe/diff:/var/lib/docker/overlay2/002cd997227c86028fe8625d5b6d33d1d9ea78539f97a6994c7c0730431bec97/diff:/var/lib/docker/overlay2/1230bf78feb3ec800876abcaa5e9e2d6dab4e1be8acca46eaeee05d0e4c3892c/diff:/var/lib/docker/overlay2/c73d51833f5f90175abb0b84763f54e1e92661e01ffd554d0f0db7bcab9f0408/diff:/var/lib/docker/overlay2/7b66f862d102ac95ecedeea4c0e5d7163001adc476b96ebd2576ef89cae75930/diff:/var/lib/docker/overlay2/518417eeba05c218ea079af6aab289d35e4d750f276656f6b5b6ababf7e01257/diff:/var/lib/docker/overlay2/fbe79db90cf58109eba2bd2d2a7f02316a57b32949865a9ed9a012e3378631e4/diff:/var/lib/docker/overlay2/1f7d710d38fc1f72c5e6635bf10cde14ed5207bb5f7a12a3e49a578991112e46/diff:/var/lib/docker/overlay2/05ed80b1e8d02d8fcb3ce8b82b38c7b131dcb0bd815593dcfcc0eab56d2f5aee/diff:/var/lib/docker/overlay2/e165ed9cd824a7993fbfb9026310446e9c7cf2b2c8e40cb69e5509c6b015819b/diff:/var/lib/docker/overlay2/78f84327c675385035d735018d8f3e092b1a95be91b6e02564557eb58a1c2945/diff",
            "MergedDir": "/var/lib/docker/overlay2/853258e34b5aedf5b748e78011dff52658da3c17a8f6dfcca9680934f6bbedee/merged",
            "UpperDir": "/var/lib/docker/overlay2/853258e34b5aedf5b748e78011dff52658da3c17a8f6dfcca9680934f6bbedee/diff",
            "WorkDir": "/var/lib/docker/overlay2/853258e34b5aedf5b748e78011dff52658da3c17a8f6dfcca9680934f6bbedee/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:7cc7fe68eff66f19872441a51938eecc4ad33746d2baa3abc081c1e6fe25988e",
            "sha256:d177d333376ee94befa1f6bcccee9f9f59fbc9ec25a76e4b8ba9fbce9a0630d9",
            "sha256:3aea23e3305a50a5d9c203e3be3d120fc8c25fbe68ec516aecb479e8b004a088",
            "sha256:c9eb0924061a882436a1cda655772e0fb86b47bf12dd44d05be673bbc62618c6",
            "sha256:3dfa2f0a3c1b36d19ba77ee39c509185c9b9567dc4bb8d8c58b0c65e31db961d",
            "sha256:ba8aab7b4eb9fc24fce4191deffa104800a73fc75890711c61e354a339c12f72",
            "sha256:c2c20ba9c17341cb6d31b5e34cae0ce5d9d3c982b416efd21c43b737941b8e51",
            "sha256:fadbcffaf3a5489bf0af8b18574f359f921581911f090fddde3fd9aad5edf703",
            "sha256:015fa5286d89689b4df6ba32afc5c9df6a04d8b8cb074c9381f8c70be2b0a6a9",
            "sha256:61c9331339fc2e5680203b98fc3aeee851739c5996fcaef04733ec7c2a37eb6a",
            "sha256:97d6cfc43952ff856c29508b98e63b86b016ec99417a925f088f963155b330dd",
            "sha256:ea4e356bbae746477034811f8f7d2bff513072e9a456140e4d9caa4eec8c357a",
            "sha256:9bd0a2f5198a0de10d7ab7323e9e203f840aaa284656746c4e02020f4c411d25"
        ]
    },
    "Metadata": {
        "LastTagTime": "2025-09-04T10:19:10.744172554+08:00"
    }
}
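
Individual fields can be pulled from a locally pulled copy of the image without reading the full inspect output; a small sketch, assuming docker and jq are available:

docker inspect ghcr.io/huggingface/text-embeddings-inference:cpu-1.8 | jq '.[0].Config.Env, .[0].Config.Labels'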

More versions

ghcr.io/huggingface/text-embeddings-inference:cpu-1.2        linux/amd64  ghcr.io  636.68MB  2024-07-25 11:53
ghcr.io/huggingface/text-embeddings-inference:turing-1.5     linux/amd64  ghcr.io  900.87MB  2024-10-23 11:14
ghcr.io/huggingface/text-embeddings-inference:cpu-latest     linux/amd64  ghcr.io  660.31MB  2024-12-04 09:13
ghcr.io/huggingface/text-embeddings-inference:1.6            linux/amd64  ghcr.io  1.21GB    2025-02-25 09:27
ghcr.io/huggingface/text-embeddings-inference:cpu-1.6        linux/amd64  ghcr.io  659.95MB  2025-02-25 09:47
ghcr.io/huggingface/text-embeddings-inference:86-1.6.1       linux/amd64  ghcr.io  1.31GB    2025-03-31 11:03
ghcr.io/huggingface/text-embeddings-inference:86-1.7.0       linux/amd64  ghcr.io  1.11GB    2025-04-09 09:33
ghcr.io/huggingface/text-embeddings-inference:hopper-1.7     linux/amd64  ghcr.io  1.11GB    2025-04-14 16:27
ghcr.io/huggingface/text-embeddings-inference:latest         linux/amd64  ghcr.io  1.11GB    2025-04-17 14:51
ghcr.io/huggingface/text-embeddings-inference:cpu-1.7        linux/amd64  ghcr.io  683.64MB  2025-04-29 22:28
ghcr.io/huggingface/text-embeddings-inference:1.7            linux/amd64  ghcr.io  1.11GB    2025-05-14 09:19
ghcr.io/huggingface/text-embeddings-inference:86-1.7.1       linux/amd64  ghcr.io  1.11GB    2025-06-09 16:24
ghcr.io/huggingface/text-embeddings-inference:hopper-1.7.1   linux/amd64  ghcr.io  1.12GB    2025-06-11 17:26
ghcr.io/huggingface/text-embeddings-inference:1.7.1          linux/amd64  ghcr.io  1.11GB    2025-06-13 16:46
ghcr.io/huggingface/text-embeddings-inference:1.7.4          linux/amd64  ghcr.io  1.11GB    2025-07-08 09:31
ghcr.io/huggingface/text-embeddings-inference:1.8.0          linux/amd64  ghcr.io  1.11GB    2025-08-15 16:52
ghcr.io/huggingface/text-embeddings-inference:1.8            linux/amd64  ghcr.io  1.11GB    2025-08-15 17:00
ghcr.io/huggingface/text-embeddings-inference:cpu-1.8        linux/amd64  ghcr.io  684.32MB  2025-09-04 10:19