ghcr.io/huggingface/text-embeddings-inference:cpu-1.7 linux/amd64

ghcr.io/huggingface/text-embeddings-inference:cpu-1.7 - China mirror download source

Text Embeddings Inference

Hugging Face provides a Docker image for text embeddings inference.
Source image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
China mirror: swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
Image ID: sha256:951eb1273aff9715b133d82efec1866b535be672d63b8f8dd99f757518e1ea4c
Image tag: cpu-1.7
Size: 683.64MB
Registry: ghcr.io
CMD: --json-output
Entrypoint: text-embeddings-router
Working directory: (not set)
OS/Arch: linux/amd64
Image created: 2025-04-08T12:02:48.549850732Z
Synced: 2025-04-29 22:28
Updated: 2025-05-16 11:08
Environment variables:
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin HUGGINGFACE_HUB_CACHE=/data PORT=80 MKL_ENABLE_INSTRUCTIONS=AVX512_E4 RAYON_NUM_THREADS=8 LD_PRELOAD=/usr/local/libfakeintel.so LD_LIBRARY_PATH=/usr/local/lib
Image labels:
org.opencontainers.image.created=2025-04-08T11:56:15.606Z
org.opencontainers.image.description=A blazing fast inference solution for text embeddings models
org.opencontainers.image.licenses=Apache-2.0
org.opencontainers.image.revision=72dac20cbc4a99502a3a57605206d6991fb0494c
org.opencontainers.image.source=https://github.com/huggingface/text-embeddings-inference
org.opencontainers.image.title=text-embeddings-inference
org.opencontainers.image.url=https://github.com/huggingface/text-embeddings-inference
org.opencontainers.image.version=cpu-1.7.0
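
These OCI labels travel with the image and can be read back locally after a pull; a minimal sketch, assuming the image is available under its original name:

# Print the OCI labels of the pulled image as JSON
docker inspect --format '{{json .Config.Labels}}' ghcr.io/huggingface/text-embeddings-inference:cpu-1.7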

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7  ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
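
After pulling and retagging, the image can be started under its upstream name. The sketch below is illustrative only: the model ID, host port, and cache directory are placeholders, and the /embed request follows the project's documented API.

# Start the embedding server (model weights are cached in ./tei-data)
docker run -d --name tei -p 8080:80 -v $PWD/tei-data:/data \
  ghcr.io/huggingface/text-embeddings-inference:cpu-1.7 \
  --model-id BAAI/bge-small-en-v1.5

# Smoke test: request an embedding for a single input
curl 127.0.0.1:8080/embed \
  -H 'Content-Type: application/json' \
  -d '{"inputs": "What is deep learning?"}'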

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7  ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
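
On Kubernetes nodes where the kubelet talks to containerd, pod images are looked up in the k8s.io namespace, so the same commands are usually run with -n k8s.io (a hedged variant; adjust to your node setup):

ctr -n k8s.io images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
ctr -n k8s.io images tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7 ghcr.io/huggingface/text-embeddings-inference:cpu-1.7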

Shell quick-replace command

sed -i 's#ghcr.io/huggingface/text-embeddings-inference:cpu-1.7#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7#' deployment.yaml
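
The sed call rewrites every occurrence of the upstream reference in deployment.yaml in place; a quick way to confirm the image field now points at the mirror (the manifest name is whatever file you edited):

grep -n 'image:' deployment.yaml
# expected output (illustrative):
#   image: swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7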

Ansible batch distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7  ghcr.io/huggingface/text-embeddings-inference:cpu-1.7'

Ansible batch distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7  ghcr.io/huggingface/text-embeddings-inference:cpu-1.7'

Image build history


# 2025-04-08 20:02:48  0.00B  Set the default command to execute
CMD ["--json-output"]

# 2025-04-08 20:02:48  0.00B  Configure the command run at container start
ENTRYPOINT ["text-embeddings-router"]

# 2025-04-08 20:02:48  69.41MB  Copy new files or directories into the container
COPY /usr/src/target/release/text-embeddings-router /usr/local/bin/text-embeddings-router # buildkit

# 2025-04-08 16:47:26  15.03KB  Copy new files or directories into the container
COPY /usr/src/libfakeintel.so /usr/local/libfakeintel.so # buildkit

# 2025-04-08 16:47:26  65.71MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_avx512.so.2 /usr/local/lib/libmkl_avx512.so.2 # buildkit

# 2025-04-08 16:47:26  48.84MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_avx2.so.2 /usr/local/lib/libmkl_avx2.so.2 # buildkit

# 2025-04-08 16:47:26  14.25MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_vml_avx512.so.2 /usr/local/lib/libmkl_vml_avx512.so.2 # buildkit

# 2025-04-08 16:47:26  14.79MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_vml_avx2.so.2 /usr/local/lib/libmkl_vml_avx2.so.2 # buildkit

# 2025-04-08 16:47:26  41.16MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_def.so.2 /usr/local/lib/libmkl_def.so.2 # buildkit

# 2025-04-08 16:47:26  8.75MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_vml_def.so.2 /usr/local/lib/libmkl_vml_def.so.2 # buildkit

# 2025-04-08 16:47:26  71.24MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_core.so.2 /usr/local/lib/libmkl_core.so.2 # buildkit

# 2025-04-08 16:47:26  41.93MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_intel_thread.so.2 /usr/local/lib/libmkl_intel_thread.so.2 # buildkit

# 2025-04-08 16:47:26  24.38MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_intel_lp64.so.2 /usr/local/lib/libmkl_intel_lp64.so.2 # buildkit

# 2025-04-08 16:46:40  208.34MB  Run a command and create a new image layer
RUN /bin/sh -c apt-get update && DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends     libomp-dev     ca-certificates     libssl-dev     curl     && rm -rf /var/lib/apt/lists/* # buildkit

# 2025-04-08 16:46:40  0.00B  Set environment variables HUGGINGFACE_HUB_CACHE PORT MKL_ENABLE_INSTRUCTIONS RAYON_NUM_THREADS LD_PRELOAD LD_LIBRARY_PATH
ENV HUGGINGFACE_HUB_CACHE=/data PORT=80 MKL_ENABLE_INSTRUCTIONS=AVX512_E4 RAYON_NUM_THREADS=8 LD_PRELOAD=/usr/local/libfakeintel.so LD_LIBRARY_PATH=/usr/local/lib

# 2025-04-07 08:00:00  74.83MB  Debian bookworm base layer
# debian.sh --arch 'amd64' out/ 'bookworm' '@1743984000'

Image information

{
    "Id": "sha256:951eb1273aff9715b133d82efec1866b535be672d63b8f8dd99f757518e1ea4c",
    "RepoTags": [
        "ghcr.io/huggingface/text-embeddings-inference:cpu-1.7",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.7"
    ],
    "RepoDigests": [
        "ghcr.io/huggingface/text-embeddings-inference@sha256:8a4905e35746cdc3012eaf46cf32ae108969d72345738945f291a1b489946297",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference@sha256:9d06adedc8b2ea874c618d59ee5f23de68eb896d29493771e563ee4d5d814650"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2025-04-08T12:02:48.549850732Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "HUGGINGFACE_HUB_CACHE=/data",
            "PORT=80",
            "MKL_ENABLE_INSTRUCTIONS=AVX512_E4",
            "RAYON_NUM_THREADS=8",
            "LD_PRELOAD=/usr/local/libfakeintel.so",
            "LD_LIBRARY_PATH=/usr/local/lib"
        ],
        "Cmd": [
            "--json-output"
        ],
        "ArgsEscaped": true,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "",
        "Entrypoint": [
            "text-embeddings-router"
        ],
        "OnBuild": null,
        "Labels": {
            "org.opencontainers.image.created": "2025-04-08T11:56:15.606Z",
            "org.opencontainers.image.description": "A blazing fast inference solution for text embeddings models",
            "org.opencontainers.image.licenses": "Apache-2.0",
            "org.opencontainers.image.revision": "72dac20cbc4a99502a3a57605206d6991fb0494c",
            "org.opencontainers.image.source": "https://github.com/huggingface/text-embeddings-inference",
            "org.opencontainers.image.title": "text-embeddings-inference",
            "org.opencontainers.image.url": "https://github.com/huggingface/text-embeddings-inference",
            "org.opencontainers.image.version": "cpu-1.7.0"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 683640236,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/ec0701fe6ab1d65ffb7adf1e17ad39a2e2517a546873f3ba6596b1d329c7c545/diff:/var/lib/docker/overlay2/0598d9098757b10a44880dd981f348f60ac67c5b36c54306fe27d2236a4e0169/diff:/var/lib/docker/overlay2/603a290113e6eebdd9b148a0227595399fe9e911b81a0e58dae9aaf0e1da3481/diff:/var/lib/docker/overlay2/71ac8191879ea9c1f1edb7c127545e25d283af83f7a188d18c2bcecc11701efe/diff:/var/lib/docker/overlay2/327156d896f9380df4d086fcb75a519fabe8a071ed54134ba07143d31dcc88b3/diff:/var/lib/docker/overlay2/b255ae4e513738dedc1a8c539dbe513ad63845fcc819a2d79ca8e1cfbeea2051/diff:/var/lib/docker/overlay2/efbf591d7fe2a6d0a5a7efb82bcabc6283c175e90c0f4ef14fe15318b68431b1/diff:/var/lib/docker/overlay2/75155260558f437186ddef45ed686bf5afd957b3b37b7176f18861d6aace974a/diff:/var/lib/docker/overlay2/fc0baffd081b6b1dec78708facf5df4d109557fe3634587e1c35d256f47ff2e7/diff:/var/lib/docker/overlay2/aa864fdce5ec7eb08b1f952130d23cccb97fa50eda819badbf154820c42312c9/diff:/var/lib/docker/overlay2/f65cdbc528ffbb9bfdf72ba4301b5229a1ffdec8c2bc795cb0774ee86f99dc02/diff:/var/lib/docker/overlay2/6c87f3c40f6916490852332f5b9f29b4cf427a690869e3d003326ba790adfed3/diff",
            "MergedDir": "/var/lib/docker/overlay2/ef5507d2f05a4952ea5a7e3561754e61be62a2af283b3b64009fc6a17db544a0/merged",
            "UpperDir": "/var/lib/docker/overlay2/ef5507d2f05a4952ea5a7e3561754e61be62a2af283b3b64009fc6a17db544a0/diff",
            "WorkDir": "/var/lib/docker/overlay2/ef5507d2f05a4952ea5a7e3561754e61be62a2af283b3b64009fc6a17db544a0/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:ea680fbff095473bb8a6c867938d6d851e11ef0c177fce983ccc83440172bd72",
            "sha256:6bd9d9c567fcb056134660631f7d894fa5e632896ed9d9b5124c30d51e7b3ce7",
            "sha256:cb0bc89fa30b4e868d4621bb371dcd33e29df99c6f86ef68217b40f9449b95ea",
            "sha256:cdfc9d81ac758f0f3f4f6a062d7ae814ef375ebfab4dcf85c5235d1546ea8061",
            "sha256:39db813dd9d7bf099be1e2e333afa377f9d8dcabc8c4e81e6dfdd21e8192a1c6",
            "sha256:2cde8e04c3276d90d958d82275fc8c5ed22cfd86fca25eba1a69a79cde823803",
            "sha256:2bba86eb808e9fa4e8d4af984b7d7f0b83a35b9c264ef1ff00fda589fbf86cc3",
            "sha256:3c5928363b964191dac3c9dbe85c644b22dac995fb0a1e932a65d93a79983429",
            "sha256:e113cad5e0938f55c524f0419b089f707a0d37144aea1d3bd55ceb6a51e547ee",
            "sha256:d3d5aa85b9169d05330c4ee47bfc73c16683de05bf827a6dc9f732b194637ec9",
            "sha256:88d73e34c76ea61053bfc5ba419d25ce63b1a1cc6309462456c2b4f2b36ef1fa",
            "sha256:71b6648dc9a55fe1b67a2c1330fbdc766b292ac873e8ff1217c2de0a5a0ae4da",
            "sha256:36bf19c9c83d65524d6e735d3992f3d8c534ae4ee308bdf65a9bcc1a763de348"
        ]
    },
    "Metadata": {
        "LastTagTime": "2025-04-29T22:27:47.763649181+08:00"
    }
}
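
The RepoDigests above give immutable references for both the upstream image and its mirrored copy; pinning a pull to the mirror digest (a sketch using values taken from the inspect output) avoids surprises if the cpu-1.7 tag is ever re-pushed:

# Pull by content digest instead of by tag
docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference@sha256:9d06adedc8b2ea874c618d59ee5f23de68eb896d29493771e563ee4d5d814650
# The resulting image ID should match the Id recorded above (sha256:951eb1...)
docker inspect --format '{{.Id}}' swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference@sha256:9d06adedc8b2ea874c618d59ee5f23de68eb896d29493771e563ee4d5d814650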

More versions

ghcr.io/huggingface/text-embeddings-inference:cpu-1.2      linux/amd64  ghcr.io  636.68MB  2024-07-25 11:53
ghcr.io/huggingface/text-embeddings-inference:turing-1.5   linux/amd64  ghcr.io  900.87MB  2024-10-23 11:14
ghcr.io/huggingface/text-embeddings-inference:cpu-latest   linux/amd64  ghcr.io  660.31MB  2024-12-04 09:13
ghcr.io/huggingface/text-embeddings-inference:1.6          linux/amd64  ghcr.io  1.21GB    2025-02-25 09:27
ghcr.io/huggingface/text-embeddings-inference:cpu-1.6      linux/amd64  ghcr.io  659.95MB  2025-02-25 09:47
ghcr.io/huggingface/text-embeddings-inference:86-1.6.1     linux/amd64  ghcr.io  1.31GB    2025-03-31 11:03
ghcr.io/huggingface/text-embeddings-inference:86-1.7.0     linux/amd64  ghcr.io  1.11GB    2025-04-09 09:33
ghcr.io/huggingface/text-embeddings-inference:hopper-1.7   linux/amd64  ghcr.io  1.11GB    2025-04-14 16:27
ghcr.io/huggingface/text-embeddings-inference:latest       linux/amd64  ghcr.io  1.11GB    2025-04-17 14:51
ghcr.io/huggingface/text-embeddings-inference:cpu-1.7      linux/amd64  ghcr.io  683.64MB  2025-04-29 22:28
ghcr.io/huggingface/text-embeddings-inference:1.7          linux/amd64  ghcr.io  1.11GB    2025-05-14 09:19