ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2 linux/amd64

ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2 - China mirror download source

Text Embeddings Inference

Hugging Face provides a Docker image for text embeddings inference.
Source image       ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2
China mirror       swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2
Image ID           sha256:07c57167078bd5c684f47a6cf82eaa76d515f0dae6b4f800d02e375d6e221673
Image tag          cpu-1.8.2
Size               685.78MB
Source registry    ghcr.io
CMD                --json-output
Entrypoint         text-embeddings-router
Working directory  (not set)
OS/Platform        linux/amd64
Image created      2025-09-09T14:40:32.925176038Z
Synced at          2025-12-08 15:35
Updated at         2025-12-09 03:28
Environment variables
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
HUGGINGFACE_HUB_CACHE=/data
PORT=80
MKL_ENABLE_INSTRUCTIONS=AVX512_E4
RAYON_NUM_THREADS=8
LD_PRELOAD=/usr/local/libfakeintel.so
LD_LIBRARY_PATH=/usr/local/lib
Image labels
org.opencontainers.image.created: 2025-09-09T14:46:06.898Z
org.opencontainers.image.description: A blazing fast inference solution for text embeddings models
org.opencontainers.image.licenses: Apache-2.0
org.opencontainers.image.revision: d7af1fcc509902d8cc66cebf5a61c5e8e000e442
org.opencontainers.image.source: https://github.com/huggingface/text-embeddings-inference
org.opencontainers.image.title: text-embeddings-inference
org.opencontainers.image.url: https://github.com/huggingface/text-embeddings-inference
org.opencontainers.image.version: cpu-1.8.2
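
To sanity-check the image once it has been pulled from the mirror (see the pull commands below), it can be started directly with Docker. A minimal sketch, assuming an example model id BAAI/bge-small-en-v1.5 and a local ./tei-data cache directory; port 80 and the /data cache path come from the PORT and HUGGINGFACE_HUB_CACHE defaults listed above:

# Start the router; BAAI/bge-small-en-v1.5 is just an example model id
docker run -d --name tei-cpu \
  -p 8080:80 \
  -v "$PWD/tei-data:/data" \
  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2 \
  --model-id BAAI/bge-small-en-v1.5

# Once the model has loaded, request an embedding from the /embed endpoint
curl 127.0.0.1:8080/embed -X POST \
  -H 'Content-Type: application/json' \
  -d '{"inputs": "What is deep learning?"}'

Arguments placed after the image name are forwarded to the text-embeddings-router entrypoint shown above.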

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2  ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2
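
To pin the exact content rather than the mutable tag, the image can also be pulled by digest; the digest below is the mirror entry from the RepoDigests field in the image information section further down:

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference@sha256:fd941c0abd20fb6fa004b2769035fcabfc7184ebef0f7b65608f32b119f93208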

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2  ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2
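
On a Kubernetes node where containerd is the runtime, the kubelet resolves images in the k8s.io namespace, so the pull and tag usually need to target that namespace explicitly; a sketch assuming the default namespace layout:

ctr -n k8s.io images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2
ctr -n k8s.io images tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2 ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2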

Shell quick replace command

sed -i 's#ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2#' deployment.yaml
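
To rewrite every manifest under a directory instead of a single deployment.yaml, the same substitution can be driven by find; a sketch assuming the manifests live under ./manifests:

find ./manifests -name '*.yaml' -exec \
  sed -i 's#ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2#' {} +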

Ansible quick distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2  ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2'

Ansible quick distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2  ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2'
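
After distribution, a quick check that every node in the same k8s inventory group actually holds the re-tagged image (assuming the same Ansible inventory as the commands above):

#ansible k8s -m shell -a 'docker images ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2'
#ansible k8s -m shell -a 'ctr images ls | grep text-embeddings-inference'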

Image build history


# 2025-09-09 22:40:32  0.00B  Set the default command to execute
CMD ["--json-output"]

# 2025-09-09 22:40:32  0.00B  Configure the command run when the container starts
ENTRYPOINT ["text-embeddings-router"]

# 2025-09-09 22:40:32  71.54MB  Copy new files or directories into the container
COPY /usr/src/target/release/text-embeddings-router /usr/local/bin/text-embeddings-router # buildkit

# 2025-09-09 15:10:50  15.03KB  Copy new files or directories into the container
COPY /usr/src/libfakeintel.so /usr/local/libfakeintel.so # buildkit

# 2025-09-09 15:10:50  65.71MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_avx512.so.2 /usr/local/lib/libmkl_avx512.so.2 # buildkit

# 2025-09-09 15:10:50  48.84MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_avx2.so.2 /usr/local/lib/libmkl_avx2.so.2 # buildkit

# 2025-09-09 15:10:50  14.25MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_vml_avx512.so.2 /usr/local/lib/libmkl_vml_avx512.so.2 # buildkit

# 2025-09-09 15:10:50  14.79MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_vml_avx2.so.2 /usr/local/lib/libmkl_vml_avx2.so.2 # buildkit

# 2025-09-09 15:10:50  41.16MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_def.so.2 /usr/local/lib/libmkl_def.so.2 # buildkit

# 2025-09-09 15:10:50  8.75MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_vml_def.so.2 /usr/local/lib/libmkl_vml_def.so.2 # buildkit

# 2025-09-09 15:10:50  71.24MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_core.so.2 /usr/local/lib/libmkl_core.so.2 # buildkit

# 2025-09-09 15:10:50  41.93MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_intel_thread.so.2 /usr/local/lib/libmkl_intel_thread.so.2 # buildkit

# 2025-09-09 15:10:50  24.38MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_intel_lp64.so.2 /usr/local/lib/libmkl_intel_lp64.so.2 # buildkit

# 2025-09-09 15:08:33  208.37MB  Run a command and create a new image layer
RUN /bin/sh -c apt-get update && DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends     libomp-dev     ca-certificates     libssl-dev     curl     && rm -rf /var/lib/apt/lists/* # buildkit

# 2025-09-09 15:08:33  0.00B  Set environment variables HUGGINGFACE_HUB_CACHE PORT MKL_ENABLE_INSTRUCTIONS RAYON_NUM_THREADS LD_PRELOAD LD_LIBRARY_PATH
ENV HUGGINGFACE_HUB_CACHE=/data PORT=80 MKL_ENABLE_INSTRUCTIONS=AVX512_E4 RAYON_NUM_THREADS=8 LD_PRELOAD=/usr/local/libfakeintel.so LD_LIBRARY_PATH=/usr/local/lib

# 2025-09-08 08:00:00  74.81MB  Base layer
# debian.sh --arch 'amd64' out/ 'bookworm' '@1757289600'
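
Once the image is present locally, the same layer history can be reproduced for verification with docker history; --no-trunc keeps the full build commands visible:

docker history --no-trunc ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2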

Image information

{
    "Id": "sha256:07c57167078bd5c684f47a6cf82eaa76d515f0dae6b4f800d02e375d6e221673",
    "RepoTags": [
        "ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2"
    ],
    "RepoDigests": [
        "ghcr.io/huggingface/text-embeddings-inference@sha256:4d632b76bd14cb57044a1ffb0ad48ab0ba4939e705a9a615ccc740658575c26e",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference@sha256:fd941c0abd20fb6fa004b2769035fcabfc7184ebef0f7b65608f32b119f93208"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2025-09-09T14:40:32.925176038Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "HUGGINGFACE_HUB_CACHE=/data",
            "PORT=80",
            "MKL_ENABLE_INSTRUCTIONS=AVX512_E4",
            "RAYON_NUM_THREADS=8",
            "LD_PRELOAD=/usr/local/libfakeintel.so",
            "LD_LIBRARY_PATH=/usr/local/lib"
        ],
        "Cmd": [
            "--json-output"
        ],
        "ArgsEscaped": true,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "",
        "Entrypoint": [
            "text-embeddings-router"
        ],
        "OnBuild": null,
        "Labels": {
            "org.opencontainers.image.created": "2025-09-09T14:46:06.898Z",
            "org.opencontainers.image.description": "A blazing fast inference solution for text embeddings models",
            "org.opencontainers.image.licenses": "Apache-2.0",
            "org.opencontainers.image.revision": "d7af1fcc509902d8cc66cebf5a61c5e8e000e442",
            "org.opencontainers.image.source": "https://github.com/huggingface/text-embeddings-inference",
            "org.opencontainers.image.title": "text-embeddings-inference",
            "org.opencontainers.image.url": "https://github.com/huggingface/text-embeddings-inference",
            "org.opencontainers.image.version": "cpu-1.8.2"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 685776344,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/8622a13450dacb18411f0e6c77cc6b82e71a01c3bf8aedf791ea74c86f161d6d/diff:/var/lib/docker/overlay2/f635b0086f38c11cf1a6ebd464ade94abd57935b70842c27bbf6868d007b1608/diff:/var/lib/docker/overlay2/c7919f88844cc9b3cada0a33dba5699eeaa98ccaf2cb47c4679f3af141d8a720/diff:/var/lib/docker/overlay2/72c63b211673701606240fc13d6e601e0b4e2f37261d9496f217f8d914c102c8/diff:/var/lib/docker/overlay2/9ebac13cced9578d259e05e371d43d4603fed50e47d2fe5c109f2cf452e5359a/diff:/var/lib/docker/overlay2/e8b3ab2bb3354f457a1a41de24347f06d900083276ac4442724a0b709665c3f3/diff:/var/lib/docker/overlay2/1a0ffa1066dc89e84611a3658257c4a2aeb862a95395353590ff4094272e9d73/diff:/var/lib/docker/overlay2/2ac6efa1d8877724d3cbdbf81f4001e7c1ee3000e912ba0a69b81b98acdb00b7/diff:/var/lib/docker/overlay2/df7d85b1809850689d85cf9867b87af5d58c6a6a0c436b30f1a69071c40566cf/diff:/var/lib/docker/overlay2/c677f6d302d3b27cae3fc813067ab483e4ec2386ddd2939f32187424e4132002/diff:/var/lib/docker/overlay2/38837eaa55a5190a89e36e6d65d53a96b2125a9dbaf15acfad2e42df60aad8d8/diff:/var/lib/docker/overlay2/caf3cfbefbc35502f2369d94d6b203b4148a47aaa764f79cb7c7a270d1b9c02e/diff",
            "MergedDir": "/var/lib/docker/overlay2/55b6a487ba8228344b1e60d9df6783615b9678342fa0b26a15c587a3d2e2c010/merged",
            "UpperDir": "/var/lib/docker/overlay2/55b6a487ba8228344b1e60d9df6783615b9678342fa0b26a15c587a3d2e2c010/diff",
            "WorkDir": "/var/lib/docker/overlay2/55b6a487ba8228344b1e60d9df6783615b9678342fa0b26a15c587a3d2e2c010/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:36f5f951f60a9fa1d51878e76fc16ba7b752f4d464a21b758a8ac88f0992c488",
            "sha256:eef8ac43376ebfbdc6af2452528d300f670e28adbed701a7e36f489d97556ae4",
            "sha256:f81e2269a5e9fdacae3ebde4cc24bffe1b1d4d726fdd796ce0caeabfd8e9b2dc",
            "sha256:fcf643345d5cf7612094d713352f27f494cc686fe072d3a4b709fffb4e713bb5",
            "sha256:7c287a25db2d24e62ac541f204725387763646f5543d94437ebd6d645b840c9d",
            "sha256:8f01ad90fdf7508370be4b23e7b00abd632256555eebbe7919c81607dcddc2e9",
            "sha256:88043379d8d2c834481968176cc9b5acdc07f1760953f338745c13165828c83b",
            "sha256:723551a04e47c41ece175ec7336f6c30c4b3a0377be1818c18be10fe571fe3f5",
            "sha256:45d54ec5663f81629b7b65940bdf18efb12332d0a159a29e22819f28a0850eec",
            "sha256:49c03f64ed462eab49a9fb3302b1d0ec1748b9f19b75d32dfaf42be5e6a27e0e",
            "sha256:15719c87250df61e7d422c8d6c3df9dbadd7547d4164b3ab6b90e85ee3aa3cdb",
            "sha256:8c748141fc94d8759a1cc96fb6e654356eeab8ab8fbebe03d4be3dd7f485f130",
            "sha256:8c6bcb168b1ba579fd93cff688f7f09b27152e661a79f10ddf25200e70b98710"
        ]
    },
    "Metadata": {
        "LastTagTime": "2025-12-08T15:35:51.860703157+08:00"
    }
}
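
The JSON above follows the output format of docker image inspect, so the runtime-relevant fields can be extracted directly on any host that has the image; a sketch assuming jq is installed:

docker image inspect ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2 \
  | jq '.[0].Config | {Env, Entrypoint, Cmd, Labels}'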

More versions

Tag | Platform | Source | Size | Synced

ghcr.io/huggingface/text-embeddings-inference:cpu-1.2 | linux/amd64 | ghcr.io | 636.68MB | 2024-07-25 11:53
ghcr.io/huggingface/text-embeddings-inference:turing-1.5 | linux/amd64 | ghcr.io | 900.87MB | 2024-10-23 11:14
ghcr.io/huggingface/text-embeddings-inference:cpu-latest | linux/amd64 | ghcr.io | 660.31MB | 2024-12-04 09:13
ghcr.io/huggingface/text-embeddings-inference:1.6 | linux/amd64 | ghcr.io | 1.21GB | 2025-02-25 09:27
ghcr.io/huggingface/text-embeddings-inference:cpu-1.6 | linux/amd64 | ghcr.io | 659.95MB | 2025-02-25 09:47
ghcr.io/huggingface/text-embeddings-inference:86-1.6.1 | linux/amd64 | ghcr.io | 1.31GB | 2025-03-31 11:03
ghcr.io/huggingface/text-embeddings-inference:86-1.7.0 | linux/amd64 | ghcr.io | 1.11GB | 2025-04-09 09:33
ghcr.io/huggingface/text-embeddings-inference:hopper-1.7 | linux/amd64 | ghcr.io | 1.11GB | 2025-04-14 16:27
ghcr.io/huggingface/text-embeddings-inference:latest | linux/amd64 | ghcr.io | 1.11GB | 2025-04-17 14:51
ghcr.io/huggingface/text-embeddings-inference:cpu-1.7 | linux/amd64 | ghcr.io | 683.64MB | 2025-04-29 22:28
ghcr.io/huggingface/text-embeddings-inference:1.7 | linux/amd64 | ghcr.io | 1.11GB | 2025-05-14 09:19
ghcr.io/huggingface/text-embeddings-inference:86-1.7.1 | linux/amd64 | ghcr.io | 1.11GB | 2025-06-09 16:24
ghcr.io/huggingface/text-embeddings-inference:hopper-1.7.1 | linux/amd64 | ghcr.io | 1.12GB | 2025-06-11 17:26
ghcr.io/huggingface/text-embeddings-inference:1.7.1 | linux/amd64 | ghcr.io | 1.11GB | 2025-06-13 16:46
ghcr.io/huggingface/text-embeddings-inference:1.7.4 | linux/amd64 | ghcr.io | 1.11GB | 2025-07-08 09:31
ghcr.io/huggingface/text-embeddings-inference:1.8.0 | linux/amd64 | ghcr.io | 1.11GB | 2025-08-15 16:52
ghcr.io/huggingface/text-embeddings-inference:1.8 | linux/amd64 | ghcr.io | 1.11GB | 2025-08-15 17:00
ghcr.io/huggingface/text-embeddings-inference:cpu-1.8 | linux/amd64 | ghcr.io | 684.32MB | 2025-09-04 10:19
ghcr.io/huggingface/text-embeddings-inference:cuda-1.8.1 | linux/amd64 | ghcr.io | 2.65GB | 2025-09-11 17:54
ghcr.io/huggingface/text-embeddings-inference:hopper-1.8 | linux/amd64 | ghcr.io | 1.12GB | 2025-09-11 18:36
ghcr.io/huggingface/text-embeddings-inference:86-1.8.2 | linux/amd64 | ghcr.io | 1.11GB | 2025-09-18 21:54
ghcr.io/huggingface/text-embeddings-inference:cpu-1.7.2 | linux/amd64 | ghcr.io | 684.24MB | 2025-10-14 15:13
ghcr.io/huggingface/text-embeddings-inference:1.7.2 | linux/amd64 | ghcr.io | 1.11GB | 2025-10-14 16:56
ghcr.io/huggingface/text-embeddings-inference:89-1.8 | linux/amd64 | ghcr.io | 1.11GB | 2025-10-22 18:00
ghcr.io/huggingface/text-embeddings-inference:turing-1.8 | linux/amd64 | ghcr.io | 930.07MB | 2025-10-31 18:57
ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.2 | linux/amd64 | ghcr.io | 685.78MB | 2025-12-08 15:35