ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2 linux/amd64

ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2 - China mirror download source (views: 16)

Text Embeddings Inference

Hugging Face provides a Docker image for text embeddings inference.
Source image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2
China mirror: swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2
Image ID: sha256:b23aad1404afd476a252efad1e3cee5dd95324e165cded13188e2373f615397b
Image tag: cpu-1.7.2
Size: 684.24MB
Registry: ghcr.io
CMD: --json-output
Entrypoint: text-embeddings-router
Working directory:
OS/Arch: linux/amd64
Views: 16
Contributor:
Image created: 2025-06-16T06:53:53.069217957Z
Synced: 2025-10-14 15:13
Updated: 2025-10-15 16:58
Environment variables:
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin HUGGINGFACE_HUB_CACHE=/data PORT=80 MKL_ENABLE_INSTRUCTIONS=AVX512_E4 RAYON_NUM_THREADS=8 LD_PRELOAD=/usr/local/libfakeintel.so LD_LIBRARY_PATH=/usr/local/lib
Image labels:
org.opencontainers.image.created: 2025-06-16T06:47:24.896Z
org.opencontainers.image.description: A blazing fast inference solution for text embeddings models
org.opencontainers.image.licenses: Apache-2.0
org.opencontainers.image.revision: a69cc2ee285ca87a8c7a6b8fc9abc1be360f8335
org.opencontainers.image.source: https://github.com/huggingface/text-embeddings-inference
org.opencontainers.image.title: text-embeddings-inference
org.opencontainers.image.url: https://github.com/huggingface/text-embeddings-inference
org.opencontainers.image.version: cpu-1.7.2
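
The environment variables above describe how the container behaves at runtime: the router listens on PORT=80 and caches model downloads under HUGGINGFACE_HUB_CACHE=/data. A minimal run-and-query sketch, assuming the mirror image has already been pulled; the model ID BAAI/bge-small-en-v1.5 and the host port/path are placeholders, not part of this listing:

# Serve an example embedding model; arguments after the image name are passed to text-embeddings-router.
docker run -d --name tei-cpu -p 8080:80 -v "$PWD/tei-data:/data" \
  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2 \
  --model-id BAAI/bge-small-en-v1.5

# Once the model has loaded, request embeddings over HTTP (TEI's /embed endpoint).
curl 127.0.0.1:8080/embed -X POST -H 'Content-Type: application/json' \
  -d '{"inputs": "What is deep learning?"}'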

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2  ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2
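
After tagging, the local image can be checked against the image ID published in this listing; a quick sketch using docker inspect:

docker inspect --format '{{.Id}}' ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2
# Expected output: sha256:b23aad1404afd476a252efad1e3cee5dd95324e165cded13188e2373f615397b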

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2  ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2
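
Note that ctr operates on a containerd namespace; on Kubernetes nodes the kubelet uses the k8s.io namespace, so the pull and tag typically need -n k8s.io for the image to be visible to pods (a sketch, assuming a default containerd setup):

ctr -n k8s.io images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2
ctr -n k8s.io images tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2 ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2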

Quick shell replacement command

sed -i 's#ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2#' deployment.yaml
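
The sed expression rewrites the image reference in place; a self-contained sketch with a hypothetical deployment.yaml fragment shows the effect:

# Create a throwaway manifest fragment (hypothetical example content).
cat > /tmp/deployment.yaml <<'EOF'
    spec:
      containers:
      - name: tei
        image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2
EOF

# Rewrite the image reference to the China mirror, then confirm the change.
sed -i 's#ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2#' /tmp/deployment.yaml
grep image: /tmp/deployment.yaml
# image: swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2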

Ansible quick distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2  ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2'

Ansible quick distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2  ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2'
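
Both ad-hoc commands target an Ansible host group named k8s; a minimal inventory sketch (host names are placeholders) that would make them runnable:

# Write a throwaway inventory; node names are hypothetical.
cat > /tmp/inventory.ini <<'EOF'
[k8s]
node1.example.internal
node2.example.internal
EOF

# Pull the mirror image on every host in the k8s group.
ansible -i /tmp/inventory.ini k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2'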

Image build history

# 2025-06-16 14:53:53  0.00B  Set the default command to execute
CMD ["--json-output"]

# 2025-06-16 14:53:53  0.00B  Configure the command run when the container starts
ENTRYPOINT ["text-embeddings-router"]

# 2025-06-16 14:53:53  70.03MB  Copy new files or directories into the container
COPY /usr/src/target/release/text-embeddings-router /usr/local/bin/text-embeddings-router # buildkit

# 2025-06-11 15:15:52  15.03KB  Copy new files or directories into the container
COPY /usr/src/libfakeintel.so /usr/local/libfakeintel.so # buildkit

# 2025-06-11 15:15:52  65.71MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_avx512.so.2 /usr/local/lib/libmkl_avx512.so.2 # buildkit

# 2025-06-11 15:15:52  48.84MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_avx2.so.2 /usr/local/lib/libmkl_avx2.so.2 # buildkit

# 2025-06-11 15:15:52  14.25MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_vml_avx512.so.2 /usr/local/lib/libmkl_vml_avx512.so.2 # buildkit

# 2025-06-11 15:15:52  14.79MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_vml_avx2.so.2 /usr/local/lib/libmkl_vml_avx2.so.2 # buildkit

# 2025-06-11 15:15:52  41.16MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_def.so.2 /usr/local/lib/libmkl_def.so.2 # buildkit

# 2025-06-11 15:15:52  8.75MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_vml_def.so.2 /usr/local/lib/libmkl_vml_def.so.2 # buildkit

# 2025-06-11 15:15:52  71.24MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_core.so.2 /usr/local/lib/libmkl_core.so.2 # buildkit

# 2025-06-11 15:15:51  41.93MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_intel_thread.so.2 /usr/local/lib/libmkl_intel_thread.so.2 # buildkit

# 2025-06-11 15:15:51  24.38MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_intel_lp64.so.2 /usr/local/lib/libmkl_intel_lp64.so.2 # buildkit

# 2025-06-11 15:13:24  208.35MB  Run a command and create a new image layer
RUN /bin/sh -c apt-get update && DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends     libomp-dev     ca-certificates     libssl-dev     curl     && rm -rf /var/lib/apt/lists/* # buildkit

# 2025-06-11 15:13:24  0.00B  Set environment variables HUGGINGFACE_HUB_CACHE PORT MKL_ENABLE_INSTRUCTIONS RAYON_NUM_THREADS LD_PRELOAD LD_LIBRARY_PATH
ENV HUGGINGFACE_HUB_CACHE=/data PORT=80 MKL_ENABLE_INSTRUCTIONS=AVX512_E4 RAYON_NUM_THREADS=8 LD_PRELOAD=/usr/local/libfakeintel.so LD_LIBRARY_PATH=/usr/local/lib

# 2025-06-10 08:00:00  74.81MB
# debian.sh --arch 'amd64' out/ 'bookworm' '@1749513600'
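
The final two layers define how the container starts: ENTRYPOINT is text-embeddings-router and CMD supplies the default --json-output flag, so any arguments given after the image name replace that default while still being passed to the router. A quick sketch (assuming the CLI exposes a --help flag, as most Rust clap-based tools do):

# Starts the container with the default CMD, i.e. text-embeddings-router --json-output.
docker run --rm ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2

# Runs text-embeddings-router --help instead; the extra argument replaces the default CMD.
docker run --rm ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2 --help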

Image information

{
    "Id": "sha256:b23aad1404afd476a252efad1e3cee5dd95324e165cded13188e2373f615397b",
    "RepoTags": [
        "ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2"
    ],
    "RepoDigests": [
        "ghcr.io/huggingface/text-embeddings-inference@sha256:bbdbd7793b70e0329abcb55a6fb96cf7ee68e459305da95dfa79fb643f086317",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference@sha256:c2ac83dbf0e9e7dd2e3794c901b3563d2a6df49c5245a5f91752d2e87ae9f144"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2025-06-16T06:53:53.069217957Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "HUGGINGFACE_HUB_CACHE=/data",
            "PORT=80",
            "MKL_ENABLE_INSTRUCTIONS=AVX512_E4",
            "RAYON_NUM_THREADS=8",
            "LD_PRELOAD=/usr/local/libfakeintel.so",
            "LD_LIBRARY_PATH=/usr/local/lib"
        ],
        "Cmd": [
            "--json-output"
        ],
        "ArgsEscaped": true,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "",
        "Entrypoint": [
            "text-embeddings-router"
        ],
        "OnBuild": null,
        "Labels": {
            "org.opencontainers.image.created": "2025-06-16T06:47:24.896Z",
            "org.opencontainers.image.description": "A blazing fast inference solution for text embeddings models",
            "org.opencontainers.image.licenses": "Apache-2.0",
            "org.opencontainers.image.revision": "a69cc2ee285ca87a8c7a6b8fc9abc1be360f8335",
            "org.opencontainers.image.source": "https://github.com/huggingface/text-embeddings-inference",
            "org.opencontainers.image.title": "text-embeddings-inference",
            "org.opencontainers.image.url": "https://github.com/huggingface/text-embeddings-inference",
            "org.opencontainers.image.version": "cpu-1.7.2"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 684244434,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/5a0c5b627cb179f87d550abc0b19a5634341b8cee4676f20800fa45a1e4cc090/diff:/var/lib/docker/overlay2/4cf610589ceece3ffeb33eccd5dff3103b9b37e0f99000d8a9a92d7c9e7c6bac/diff:/var/lib/docker/overlay2/93aa57c97722fc8ca2a70ea1306468e44fc14485c4972445f911947a0e6f3430/diff:/var/lib/docker/overlay2/cb275d798c6ed717a12cfa19d06e8eed718dd971d6840d3464f0d9f601c3c271/diff:/var/lib/docker/overlay2/70e05f48e35b1a6f1ffa7b4b7a34116a003182f5b95a258893fd5130656f55b4/diff:/var/lib/docker/overlay2/d9e4a5acd01c824fcebe824f6619405e401d1d1d8e1e463daef10ef2f656d825/diff:/var/lib/docker/overlay2/5ec939a48e5a55e63ec792d7af27712286afd31bca03de757ff1653e53f2aac8/diff:/var/lib/docker/overlay2/b0243a0990f0e666080918627812d7b8c8d60f3392663df0f67287c3e9a7b204/diff:/var/lib/docker/overlay2/ca75dbaf6d482aad14f4646d1929269114b8cc0f81bb81c0644a4ccbb7f75f20/diff:/var/lib/docker/overlay2/cdecaf1b355347810913405f1451e091cabcb49a1258bb563b736448c7f3cb3b/diff:/var/lib/docker/overlay2/0f8a3cb510be908ece732bc2fdd9af1c3cf82245ced8487ffbf4cf66e24b7630/diff:/var/lib/docker/overlay2/03ffd856b1728e543c5e9502b4b1efbc7384b7a1f3f458f47fd42dd87e857ad8/diff",
            "MergedDir": "/var/lib/docker/overlay2/7eef1b37c22aa3369fce8aef83dda204ac9dca278dec9d9f5e1635c78ff079ec/merged",
            "UpperDir": "/var/lib/docker/overlay2/7eef1b37c22aa3369fce8aef83dda204ac9dca278dec9d9f5e1635c78ff079ec/diff",
            "WorkDir": "/var/lib/docker/overlay2/7eef1b37c22aa3369fce8aef83dda204ac9dca278dec9d9f5e1635c78ff079ec/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:7fb72a7d1a8e984ccd01277432de660162a547a00de77151518dc9033cfb8cb4",
            "sha256:b90936f20f7871402e27cd93ad9093ee236845a93781c137fc58bbf825d017b7",
            "sha256:1c5e8e91c7a0f0bb2943492f414e797ce0063ae449062ddcd73556f85157e7bd",
            "sha256:87e6213454ece3c56ff137b21a5283f4d4a35e163fcde7b025f240de482f74f4",
            "sha256:bab4d54788f809be9acad2b655c230c4b44fc5ac2d71f778c6bf29280b414f9e",
            "sha256:0fea109026391a4aff91d4b4e65c63b868168584ef45be445ac6e721f8fa75a2",
            "sha256:063e04d3591dd753876edebf2ac5ebb67d8eca12e4c059ffe298582a8fd7b258",
            "sha256:7d969b737d4d6b0f677c91dcf0891815fbff46f2df2b7f14afc73383298d5951",
            "sha256:e4aa7b4afc181693edbbbfc0fe9d84a937a27265a823ec4f46b0d64576a6dcb1",
            "sha256:9fd7b0e81f23e9b132fbe508427d191f57ea82a00ab420afe37a359c23a927fc",
            "sha256:d81fc26e9a851d392ea2529f42250afd1e0a5aecdc01d560518f2bbfe02d48c8",
            "sha256:445ee1bc6872c07b6c4af5a6c7dad25f2e768195a57aae687a9b5579757888c2",
            "sha256:06b6d128077aa63e314b883c39f715a532090fa66315dd3fff17d2fbcf9bdb2c"
        ]
    },
    "Metadata": {
        "LastTagTime": "2025-10-14T15:12:55.394165241+08:00"
    }
}
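
The RepoDigests above allow pinning the image by content rather than by tag, which guards against the tag being re-pushed later; for example, the mirror copy can be pulled by its digest (a sketch):

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference@sha256:c2ac83dbf0e9e7dd2e3794c901b3563d2a6df49c5245a5f91752d2e87ae9f144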

More versions

Image | Platform | Registry | Size | Synced | Views
ghcr.io/huggingface/text-embeddings-inference:cpu-1.2 | linux/amd64 | ghcr.io | 636.68MB | 2024-07-25 11:53 | 669
ghcr.io/huggingface/text-embeddings-inference:turing-1.5 | linux/amd64 | ghcr.io | 900.87MB | 2024-10-23 11:14 | 593
ghcr.io/huggingface/text-embeddings-inference:cpu-latest | linux/amd64 | ghcr.io | 660.31MB | 2024-12-04 09:13 | 639
ghcr.io/huggingface/text-embeddings-inference:1.6 | linux/amd64 | ghcr.io | 1.21GB | 2025-02-25 09:27 | 883
ghcr.io/huggingface/text-embeddings-inference:cpu-1.6 | linux/amd64 | ghcr.io | 659.95MB | 2025-02-25 09:47 | 393
ghcr.io/huggingface/text-embeddings-inference:86-1.6.1 | linux/amd64 | ghcr.io | 1.31GB | 2025-03-31 11:03 | 247
ghcr.io/huggingface/text-embeddings-inference:86-1.7.0 | linux/amd64 | ghcr.io | 1.11GB | 2025-04-09 09:33 | 255
ghcr.io/huggingface/text-embeddings-inference:hopper-1.7 | linux/amd64 | ghcr.io | 1.11GB | 2025-04-14 16:27 | 245
ghcr.io/huggingface/text-embeddings-inference:latest | linux/amd64 | ghcr.io | 1.11GB | 2025-04-17 14:51 | 387
ghcr.io/huggingface/text-embeddings-inference:cpu-1.7 | linux/amd64 | ghcr.io | 683.64MB | 2025-04-29 22:28 | 354
ghcr.io/huggingface/text-embeddings-inference:1.7 | linux/amd64 | ghcr.io | 1.11GB | 2025-05-14 09:19 | 224
ghcr.io/huggingface/text-embeddings-inference:86-1.7.1 | linux/amd64 | ghcr.io | 1.11GB | 2025-06-09 16:24 | 200
ghcr.io/huggingface/text-embeddings-inference:hopper-1.7.1 | linux/amd64 | ghcr.io | 1.12GB | 2025-06-11 17:26 | 267
ghcr.io/huggingface/text-embeddings-inference:1.7.1 | linux/amd64 | ghcr.io | 1.11GB | 2025-06-13 16:46 | 217
ghcr.io/huggingface/text-embeddings-inference:1.7.4 | linux/amd64 | ghcr.io | 1.11GB | 2025-07-08 09:31 | 201
ghcr.io/huggingface/text-embeddings-inference:1.8.0 | linux/amd64 | ghcr.io | 1.11GB | 2025-08-15 16:52 | 122
ghcr.io/huggingface/text-embeddings-inference:1.8 | linux/amd64 | ghcr.io | 1.11GB | 2025-08-15 17:00 | 233
ghcr.io/huggingface/text-embeddings-inference:cpu-1.8 | linux/amd64 | ghcr.io | 684.32MB | 2025-09-04 10:19 | 121
ghcr.io/huggingface/text-embeddings-inference:cuda-1.8.1 | linux/amd64 | ghcr.io | 2.65GB | 2025-09-11 17:54 | 111
ghcr.io/huggingface/text-embeddings-inference:hopper-1.8 | linux/amd64 | ghcr.io | 1.12GB | 2025-09-11 18:36 | 81
ghcr.io/huggingface/text-embeddings-inference:86-1.8.2 | linux/amd64 | ghcr.io | 1.11GB | 2025-09-18 21:54 | 78
ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2 | linux/amd64 | ghcr.io | 684.24MB | 2025-10-14 15:13 | 15
ghcr.io/huggingface/text-embeddings-inference:1.7.2 | linux/amd64 | ghcr.io | 1.11GB | 2025-10-14 16:56 | 15