ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2 linux/amd64

ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2 — China download mirror

Text Embeddings Inference

Hugging Face provides a Docker image for Text Embeddings Inference (TEI), a toolkit for serving text embedding models.
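A minimal sketch of trying the mirrored image locally. The model ID `BAAI/bge-small-en-v1.5`, the container name, and the port mapping are example choices, not part of this page; any embedding model supported by TEI works with `--model-id`:

```shell
# Pull from the China mirror and serve an embedding model on local port 8080.
# /data is the model cache inside the container (HUGGINGFACE_HUB_CACHE=/data).
docker run -d --name tei -p 8080:80 \
  -v "$PWD/tei-data:/data" \
  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2 \
  --model-id BAAI/bge-small-en-v1.5

# Once the server is up, request embeddings from the /embed endpoint.
curl 127.0.0.1:8080/embed \
  -X POST -H 'Content-Type: application/json' \
  -d '{"inputs": "What is deep learning?"}'
```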
Source image   ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2
Mirror image   swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2
Image ID       sha256:bb548efc1adaca7d5841d704b417a548530e68389832feb49852d4d2836c0fcc
Image tag      cpu-1.9.2
Size           686.07MB
Registry       ghcr.io
CMD            --json-output
Entrypoint     text-embeddings-router
Working dir    (not set)
OS/Arch        linux/amd64
Image created  2026-02-25T10:49:53.3717003Z
Synced at      2026-03-23 09:55
Environment variables

PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
HUGGINGFACE_HUB_CACHE=/data
PORT=80
MKL_ENABLE_INSTRUCTIONS=AVX512_E4
RAYON_NUM_THREADS=8
LD_PRELOAD=/usr/local/libfakeintel.so
LD_LIBRARY_PATH=/usr/local/lib
Image labels

org.opencontainers.image.created: 2026-02-25T11:18:28.705Z
org.opencontainers.image.description: A blazing fast inference solution for text embeddings models
org.opencontainers.image.licenses: Apache-2.0
org.opencontainers.image.revision: 1d6ceb4883230aee3a4e53b7d5d6c0b5477a335c
org.opencontainers.image.source: https://github.com/huggingface/text-embeddings-inference
org.opencontainers.image.title: text-embeddings-inference
org.opencontainers.image.url: https://github.com/huggingface/text-embeddings-inference
org.opencontainers.image.version: cpu-1.9.2

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2  ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2  ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2

Shell quick-replace command

sed -i 's#ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2#' deployment.yaml
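To sanity-check the substitution before touching real manifests, you can run it against a throwaway file first. The file path and the surrounding YAML below are made up for the demo; only the `sed` expression is from this page:

```shell
# Create a throwaway manifest that references the upstream image.
cat > /tmp/deployment-demo.yaml <<'EOF'
spec:
  containers:
    - name: tei
      image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2
EOF

# Same substitution as above, pointed at the demo file.
sed -i 's#ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2#' /tmp/deployment-demo.yaml

# The image field now points at the mirror.
grep 'image:' /tmp/deployment-demo.yaml
```

Note that `sed -i` with no backup suffix is GNU sed syntax; on BSD/macOS use `sed -i ''`.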

Ansible bulk distribution (Docker)

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2  ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2'

Ansible bulk distribution (Containerd)

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2  ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2'
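All the mirror references above follow one convention: prefix the original image reference with the mirror registry path. A small helper (the function name `mirror_ref` is made up for illustration) can generate the pull-and-retag pair for any image:

```shell
MIRROR_PREFIX="swr.cn-north-4.myhuaweicloud.com/ddn-k8s"

# Print the pull + tag command pair for a given upstream image reference.
mirror_ref() {
  local src="$1"
  local mirrored="${MIRROR_PREFIX}/${src}"
  echo "docker pull ${mirrored}"
  echo "docker tag ${mirrored} ${src}"
}

mirror_ref ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2
```

Running this prints exactly the two Docker commands from the section above; pipe the output to `sh` (or into an Ansible task) to execute them.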

Image build history


# 2026-02-25 18:49:53  0.00B  Set the default command to run
CMD ["--json-output"]

# 2026-02-25 18:49:53  0.00B  Configure the command run when the container starts
ENTRYPOINT ["text-embeddings-router"]

# 2026-02-25 18:49:53  71.81MB  Copy new files or directories into the container
COPY /usr/src/target/release/text-embeddings-router /usr/local/bin/text-embeddings-router # buildkit

# 2026-02-25 18:45:40  15.04KB  Copy new files or directories into the container
COPY /usr/src/libfakeintel.so /usr/local/libfakeintel.so # buildkit

# 2026-02-25 18:45:40  65.71MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_avx512.so.2 /usr/local/lib/libmkl_avx512.so.2 # buildkit

# 2026-02-25 18:45:40  48.84MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_avx2.so.2 /usr/local/lib/libmkl_avx2.so.2 # buildkit

# 2026-02-25 18:45:40  14.25MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_vml_avx512.so.2 /usr/local/lib/libmkl_vml_avx512.so.2 # buildkit

# 2026-02-25 18:45:40  14.79MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_vml_avx2.so.2 /usr/local/lib/libmkl_vml_avx2.so.2 # buildkit

# 2026-02-25 18:45:40  41.16MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_def.so.2 /usr/local/lib/libmkl_def.so.2 # buildkit

# 2026-02-25 18:45:40  8.75MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_vml_def.so.2 /usr/local/lib/libmkl_vml_def.so.2 # buildkit

# 2026-02-25 18:45:40  71.24MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_core.so.2 /usr/local/lib/libmkl_core.so.2 # buildkit

# 2026-02-25 18:45:40  41.93MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_intel_thread.so.2 /usr/local/lib/libmkl_intel_thread.so.2 # buildkit

# 2026-02-25 18:45:40  24.38MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_intel_lp64.so.2 /usr/local/lib/libmkl_intel_lp64.so.2 # buildkit

# 2026-02-25 18:43:17  208.38MB  Run a command and create a new image layer
RUN /bin/sh -c apt-get update && DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends libomp-dev ca-certificates libssl-dev curl && rm -rf /var/lib/apt/lists/* # buildkit

# 2026-02-25 18:43:17  0.00B  Set environment variables HUGGINGFACE_HUB_CACHE PORT MKL_ENABLE_INSTRUCTIONS RAYON_NUM_THREADS LD_PRELOAD LD_LIBRARY_PATH
ENV HUGGINGFACE_HUB_CACHE=/data PORT=80 MKL_ENABLE_INSTRUCTIONS=AVX512_E4 RAYON_NUM_THREADS=8 LD_PRELOAD=/usr/local/libfakeintel.so LD_LIBRARY_PATH=/usr/local/lib

# 2026-02-23 08:00:00  74.83MB  Base layer (Debian bookworm)
# debian.sh --arch 'amd64' out/ 'bookworm' '@1771804800'

Image information (docker inspect)

{
    "Id": "sha256:bb548efc1adaca7d5841d704b417a548530e68389832feb49852d4d2836c0fcc",
    "RepoTags": [
        "ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.9.2"
    ],
    "RepoDigests": [
        "ghcr.io/huggingface/text-embeddings-inference@sha256:16230cd8f679ae5f8a51d585033628b1c0d45cd70c2bc9b0d208170d71218fdb",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference@sha256:27dc2c5b0dcfac211e1aff6fc770d3df8fd77a56b087692763872c23e002830c"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2026-02-25T10:49:53.3717003Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "HUGGINGFACE_HUB_CACHE=/data",
            "PORT=80",
            "MKL_ENABLE_INSTRUCTIONS=AVX512_E4",
            "RAYON_NUM_THREADS=8",
            "LD_PRELOAD=/usr/local/libfakeintel.so",
            "LD_LIBRARY_PATH=/usr/local/lib"
        ],
        "Cmd": [
            "--json-output"
        ],
        "ArgsEscaped": true,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "",
        "Entrypoint": [
            "text-embeddings-router"
        ],
        "OnBuild": null,
        "Labels": {
            "org.opencontainers.image.created": "2026-02-25T11:18:28.705Z",
            "org.opencontainers.image.description": "A blazing fast inference solution for text embeddings models",
            "org.opencontainers.image.licenses": "Apache-2.0",
            "org.opencontainers.image.revision": "1d6ceb4883230aee3a4e53b7d5d6c0b5477a335c",
            "org.opencontainers.image.source": "https://github.com/huggingface/text-embeddings-inference",
            "org.opencontainers.image.title": "text-embeddings-inference",
            "org.opencontainers.image.url": "https://github.com/huggingface/text-embeddings-inference",
            "org.opencontainers.image.version": "cpu-1.9.2"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 686067813,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/d4362ea15c03275cac7c7e864b90a3372a87ada1d9ecc4e871101b0551b53a68/diff:/var/lib/docker/overlay2/dd3ea4bad345d815a51d6835399841255bb163a52c8636bbbb8caff5c857f85e/diff:/var/lib/docker/overlay2/9e66805072245aefe6a49edd8f8951d199dfb5fc4060a0e4ec7ac126f4691be7/diff:/var/lib/docker/overlay2/8f14adba2af71758b45e81c2a88532236d53045a5a8989824594c40840d58d0f/diff:/var/lib/docker/overlay2/3428816369ab575a067a656e59bd36100eda0314d9b4c9434fa5ad5647c5f052/diff:/var/lib/docker/overlay2/b9ce1054eb6ecad4560e0175769181d0fabd3e42dce5843e9e199d8236934718/diff:/var/lib/docker/overlay2/a482b3493ea378a3338f825fab4849c3f16b4809e554bef326fb75a6b240bf74/diff:/var/lib/docker/overlay2/8d41cc1e902f1012b5123040459b56810a715e207176bd174f5b5babb59e5c5e/diff:/var/lib/docker/overlay2/74d1a17e71f7202ab24a4a25e94833bca0db928f3216030a68ccb4a950ff7045/diff:/var/lib/docker/overlay2/36e55f788efc0f4d69e8091e0e66353f606c9080cb441890665452bc0a2973bc/diff:/var/lib/docker/overlay2/0eb3d7103d183f4c79e0d86cf274fa33f1335c94ccd7d27ab5f40be18480d146/diff:/var/lib/docker/overlay2/499cc6d85d37f89b7556d988116a714b6db64e0cc2224e5dbeca56ff996625ab/diff",
            "MergedDir": "/var/lib/docker/overlay2/81b84d391e7ab76633017a7c7e3c8fc3c7ea16cb47287ca604a684e3578cace8/merged",
            "UpperDir": "/var/lib/docker/overlay2/81b84d391e7ab76633017a7c7e3c8fc3c7ea16cb47287ca604a684e3578cace8/diff",
            "WorkDir": "/var/lib/docker/overlay2/81b84d391e7ab76633017a7c7e3c8fc3c7ea16cb47287ca604a684e3578cace8/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:9eb9a78eeb101ab215acf43ffb2a709d5fca9ca2a22178564c7ee5cc30774c60",
            "sha256:3f02a100eea9304a52ad0becc968d1c82949f9cb1ccdf1d8f13951808b37ad8c",
            "sha256:b3ad72a53b2d35d9a8d1e0993d6a0a502370d87b03488c47817367a7a648d2eb",
            "sha256:1991979f99e888faf15d910f2ab5659a2351d98cd5d5b5faf727409ad9b54dac",
            "sha256:4d031fdff5db5c8340de43b286fcae6962fe23112cb7731473daa16b7d3a1cbb",
            "sha256:c419384c2c8f2ccc81820afafc7dc41e3ae71067510aead732c478f4decff303",
            "sha256:8fe1fab83c96ff7275474e69d9fe94cda327192e697872204015ae6faada1af3",
            "sha256:d3a0a6db82bf5c6a5b4a2c3c211ab8dfdd139cd5bec63537e834a9804f104b68",
            "sha256:212f2ce6c5e8ffed4d3a3e5b24539ffa965adcabcab430c648e5e51a73e382c5",
            "sha256:30b09e3b17eef8e91e79fa9e704d431fc08bf78a2184a29e5c22bc5e9b4f95dd",
            "sha256:e1f6805ce71dd590fdf03ef295673da18c722d7a1121bdaed4e925b39d3caabf",
            "sha256:95ff351556db7fced8c5860dc6b47de8f4298b7ea0bc0ae0cd4922bf0644aae9",
            "sha256:f01ce2c2a4774c52831ac047e00e4136e414a6b891bba4627fedb2691b806932"
        ]
    },
    "Metadata": {
        "LastTagTime": "2026-03-23T09:54:53.872973316+08:00"
    }
}

More versions (ghcr.io/huggingface/text-embeddings-inference)

Tag            Platform     Size      Synced            Views
cpu-1.2        linux/amd64  636.68MB  2024-07-25 11:53  1105
turing-1.5     linux/amd64  900.87MB  2024-10-23 11:14  1062
cpu-latest     linux/amd64  660.31MB  2024-12-04 09:13  1273
1.6            linux/amd64  1.21GB    2025-02-25 09:27  1340
cpu-1.6        linux/amd64  659.95MB  2025-02-25 09:47  685
86-1.6.1       linux/amd64  1.31GB    2025-03-31 11:03  478
86-1.7.0       linux/amd64  1.11GB    2025-04-09 09:33  642
hopper-1.7     linux/amd64  1.11GB    2025-04-14 16:27  501
latest         linux/amd64  1.11GB    2025-04-17 14:51  776
cpu-1.7        linux/amd64  683.64MB  2025-04-29 22:28  765
1.7            linux/amd64  1.11GB    2025-05-14 09:19  495
86-1.7.1       linux/amd64  1.11GB    2025-06-09 16:24  361
hopper-1.7.1   linux/amd64  1.12GB    2025-06-11 17:26  459
1.7.1          linux/amd64  1.11GB    2025-06-13 16:46  407
1.7.4          linux/amd64  1.11GB    2025-07-08 09:31  367
1.8.0          linux/amd64  1.11GB    2025-08-15 16:52  300
1.8            linux/amd64  1.11GB    2025-08-15 17:00  645
cpu-1.8        linux/amd64  684.32MB  2025-09-04 10:19  510
cuda-1.8.1     linux/amd64  2.65GB    2025-09-11 17:54  549
hopper-1.8     linux/amd64  1.12GB    2025-09-11 18:36  274
86-1.8.2       linux/amd64  1.11GB    2025-09-18 21:54  313
cpu-1.7.2      linux/amd64  684.24MB  2025-10-14 15:13  274
1.7.2          linux/amd64  1.11GB    2025-10-14 16:56  266
89-1.8         linux/amd64  1.11GB    2025-10-22 18:00  339
turing-1.8     linux/amd64  930.07MB  2025-10-31 18:57  403
cpu-1.8.2      linux/amd64  685.78MB  2025-12-08 15:35  250
86-1.9         linux/amd64  5.01GB    2026-02-25 00:16  98
cuda-1.9       linux/amd64  5.17GB    2026-02-25 09:12  176
cuda-1.9.2     linux/amd64  5.17GB    2026-03-13 00:40  87
cpu-1.9.2      linux/amd64  686.07MB  2026-03-23 09:55  11