ghcr.io/huggingface/text-embeddings-inference:cpu-1.6 linux/amd64

ghcr.io/huggingface/text-embeddings-inference:cpu-1.6 - China mirror download source

Text Embeddings Inference

Hugging Face provides a Docker image for text embeddings inference.
Source image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.6
China mirror: swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.6
Image ID: sha256:2c3dcccc45ec9f64c605c02126316bbbb5a4700f5354b1d5800b199c7ed6a8df
Image tag: cpu-1.6
Size: 659.95MB
Source registry: ghcr.io
CMD: --json-output
Entrypoint: text-embeddings-router
Working directory: (not set)
OS/Platform: linux/amd64
Views: 7
Contributor: 77******9@qq.com
Image created: 2024-12-13T16:00:12.843404321Z
Sync time: 2025-02-25 09:47
Update time: 2025-02-25 14:35
Environment Variables
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin HUGGINGFACE_HUB_CACHE=/data PORT=80 MKL_ENABLE_INSTRUCTIONS=AVX512_E4 RAYON_NUM_THREADS=8 LD_PRELOAD=/usr/local/libfakeintel.so LD_LIBRARY_PATH=/usr/local/lib
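
These variables hint at how the container is typically run: the model cache lives under /data (HUGGINGFACE_HUB_CACHE) and the HTTP server listens on port 80 (PORT). Below is a minimal run sketch against the mirror tag; the host cache path, published port, and the model ID passed via --model-id are placeholders, and the --model-id flag comes from the upstream text-embeddings-inference documentation rather than this page.

# Minimal run sketch: publish container port 80 (PORT=80) on host port 8080,
# persist the model cache (HUGGINGFACE_HUB_CACHE=/data) on the host, and
# override the CPU thread count (default RAYON_NUM_THREADS=8 in the image).
# Arguments after the image name replace the default CMD (--json-output).
# The --model-id flag and the model ID are assumptions from the upstream TEI docs.
docker run -d --name tei-cpu \
  -p 8080:80 \
  -v "$PWD/tei-data:/data" \
  -e RAYON_NUM_THREADS=4 \
  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.6 \
  --model-id BAAI/bge-small-en-v1.5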
Image Labels
org.opencontainers.image.created: 2024-12-13T15:55:35.403Z
org.opencontainers.image.description: A blazing fast inference solution for text embeddings models
org.opencontainers.image.licenses: Apache-2.0
org.opencontainers.image.revision: 57d8fc8128ab94fcf06b4463ba0d83a4ca25f89b
org.opencontainers.image.source: https://github.com/huggingface/text-embeddings-inference
org.opencontainers.image.title: text-embeddings-inference
org.opencontainers.image.url: https://github.com/huggingface/text-embeddings-inference
org.opencontainers.image.version: cpu-1.6.0
Image Security Scan: Trivy scan report

OS: debian 12.8  Scan engine: Trivy  Scan time: 2025-02-25 09:47

Low vulnerabilities: 107  Medium: 20  High: 7  Critical: 1
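
These counts come from the site's own scan; they can be reproduced locally with the Trivy CLI (assuming it is installed), for example restricted to the higher severities:

# Re-scan the mirror image locally; --severity is a standard Trivy flag
trivy image --severity HIGH,CRITICAL swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.6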

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.6
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.6  ghcr.io/huggingface/text-embeddings-inference:cpu-1.6
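
After retagging, manifests that reference the upstream name resolve to the locally cached copy. A quick check that the retag points at the expected image is to compare the image ID against the one listed above:

# Should print sha256:2c3dcccc45ec9f64c605c02126316bbbb5a4700f5354b1d5800b199c7ed6a8df
docker image inspect --format '{{.Id}}' ghcr.io/huggingface/text-embeddings-inference:cpu-1.6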

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.6
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.6  ghcr.io/huggingface/text-embeddings-inference:cpu-1.6
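
On Kubernetes nodes that use containerd, the kubelet looks for images in the k8s.io namespace, which plain ctr commands do not use by default. A variant of the commands above with the namespace set explicitly (adjust if your cluster uses a different namespace):

ctr -n k8s.io images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.6
ctr -n k8s.io images tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.6 ghcr.io/huggingface/text-embeddings-inference:cpu-1.6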

Quick shell replacement command

sed -i 's#ghcr.io/huggingface/text-embeddings-inference:cpu-1.6#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.6#' deployment.yaml
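
The same substitution can be applied across a directory of manifests rather than a single file; a sketch using find (paths are placeholders):

# Rewrite the image reference in every YAML manifest under the current directory
find . -name '*.yaml' -exec sed -i 's#ghcr.io/huggingface/text-embeddings-inference:cpu-1.6#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.6#g' {} +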

Ansible quick distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.6 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.6  ghcr.io/huggingface/text-embeddings-inference:cpu-1.6'

Ansible quick distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.6 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.6  ghcr.io/huggingface/text-embeddings-inference:cpu-1.6'
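
For larger clusters, the same ad-hoc command can be rolled out gradually by limiting it to a host pattern and tuning parallelism. The group name k8s comes from the commands above, while the host pattern and fork count are placeholders:

# Pull on a subset of nodes first, 10 hosts at a time
ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.6' --limit 'worker*' --forks 10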

Image Build History
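
The layer listing below can be reproduced against a locally pulled copy with docker history; a minimal sketch:

# Show layer creation time, size, and the instruction that produced each layer
docker history --no-trunc --format '{{.CreatedAt}}\t{{.Size}}\t{{.CreatedBy}}' ghcr.io/huggingface/text-embeddings-inference:cpu-1.6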


# 2024-12-14 00:00:12  0.00B  Set the default command to execute
CMD ["--json-output"]

# 2024-12-14 00:00:12  0.00B  Configure the command run when the container starts
ENTRYPOINT ["text-embeddings-router"]

# 2024-12-14 00:00:12  46.49MB  Copy new files or directories into the container
COPY /usr/src/target/release/text-embeddings-router /usr/local/bin/text-embeddings-router # buildkit

# 2024-12-12 01:11:20  15.03KB  Copy new files or directories into the container
COPY /usr/src/libfakeintel.so /usr/local/libfakeintel.so # buildkit

# 2024-12-12 01:11:20  65.71MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_avx512.so.2 /usr/local/lib/libmkl_avx512.so.2 # buildkit

# 2024-12-12 01:11:20  48.84MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_avx2.so.2 /usr/local/lib/libmkl_avx2.so.2 # buildkit

# 2024-12-12 01:11:20  14.25MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_vml_avx512.so.2 /usr/local/lib/libmkl_vml_avx512.so.2 # buildkit

# 2024-12-12 01:11:20  14.79MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_vml_avx2.so.2 /usr/local/lib/libmkl_vml_avx2.so.2 # buildkit

# 2024-12-12 01:11:20  41.16MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_def.so.2 /usr/local/lib/libmkl_def.so.2 # buildkit

# 2024-12-12 01:11:20  8.75MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_vml_def.so.2 /usr/local/lib/libmkl_vml_def.so.2 # buildkit

# 2024-12-12 01:11:20  71.24MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_core.so.2 /usr/local/lib/libmkl_core.so.2 # buildkit

# 2024-12-12 01:11:20  41.93MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_intel_thread.so.2 /usr/local/lib/libmkl_intel_thread.so.2 # buildkit

# 2024-12-12 01:11:20  24.38MB  Copy new files or directories into the container
COPY /opt/intel/oneapi/mkl/latest/lib/intel64/libmkl_intel_lp64.so.2 /usr/local/lib/libmkl_intel_lp64.so.2 # buildkit

# 2024-12-12 01:06:03  207.59MB  Run a command and create a new image layer
RUN /bin/sh -c apt-get update && DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends     libomp-dev     ca-certificates     libssl-dev     curl     && rm -rf /var/lib/apt/lists/* # buildkit

# 2024-12-12 01:06:03  0.00B  Set environment variables HUGGINGFACE_HUB_CACHE PORT MKL_ENABLE_INSTRUCTIONS RAYON_NUM_THREADS LD_PRELOAD LD_LIBRARY_PATH
ENV HUGGINGFACE_HUB_CACHE=/data PORT=80 MKL_ENABLE_INSTRUCTIONS=AVX512_E4 RAYON_NUM_THREADS=8 LD_PRELOAD=/usr/local/libfakeintel.so LD_LIBRARY_PATH=/usr/local/lib

# 2024-12-02 08:00:00  74.82MB
# debian.sh --arch 'amd64' out/ 'bookworm' '@1733097600'

Image Information

{
    "Id": "sha256:2c3dcccc45ec9f64c605c02126316bbbb5a4700f5354b1d5800b199c7ed6a8df",
    "RepoTags": [
        "ghcr.io/huggingface/text-embeddings-inference:cpu-1.6",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference:cpu-1.6"
    ],
    "RepoDigests": [
        "ghcr.io/huggingface/text-embeddings-inference@sha256:d62ce8d730557a25a54f4697794f464ea58adad6b1eacd07faf78c4be019a58a",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/huggingface/text-embeddings-inference@sha256:12b99c0761435aec8a458597fb819330d794ae4185ae043f9d5635d07edb04c2"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2024-12-13T16:00:12.843404321Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "HUGGINGFACE_HUB_CACHE=/data",
            "PORT=80",
            "MKL_ENABLE_INSTRUCTIONS=AVX512_E4",
            "RAYON_NUM_THREADS=8",
            "LD_PRELOAD=/usr/local/libfakeintel.so",
            "LD_LIBRARY_PATH=/usr/local/lib"
        ],
        "Cmd": [
            "--json-output"
        ],
        "ArgsEscaped": true,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "",
        "Entrypoint": [
            "text-embeddings-router"
        ],
        "OnBuild": null,
        "Labels": {
            "org.opencontainers.image.created": "2024-12-13T15:55:35.403Z",
            "org.opencontainers.image.description": "A blazing fast inference solution for text embeddings models",
            "org.opencontainers.image.licenses": "Apache-2.0",
            "org.opencontainers.image.revision": "57d8fc8128ab94fcf06b4463ba0d83a4ca25f89b",
            "org.opencontainers.image.source": "https://github.com/huggingface/text-embeddings-inference",
            "org.opencontainers.image.title": "text-embeddings-inference",
            "org.opencontainers.image.url": "https://github.com/huggingface/text-embeddings-inference",
            "org.opencontainers.image.version": "cpu-1.6.0"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 659949684,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/93c738e4ba407a5b1bd815c3446d036dfad204916327d527e642807de0d41679/diff:/var/lib/docker/overlay2/4e6a72746fc176014cd17dbc483e6241f4b08db7b6abda53ef54b4f55a451e02/diff:/var/lib/docker/overlay2/c81baf6a0d10a9afca2191bc585356a66ffcbe12740a9210e78160d3b1b5d857/diff:/var/lib/docker/overlay2/f6633a8dbf38bc7d11c1dc2f552bf9b76d9d1fe279c0cd2efdb50134efa46b81/diff:/var/lib/docker/overlay2/c1c783569400a5cda4106e0a3cabebd97e21d5887b8eeaebe24a2b69d2c39f61/diff:/var/lib/docker/overlay2/1bb117d1d6444dc41fb9a8c9996e5f290c9137f1da35cbad81063217e96f38a5/diff:/var/lib/docker/overlay2/54faddfa15fca3e67ceb072020f8cd7b2abbba70ccd4c80833f791dd7953d9d1/diff:/var/lib/docker/overlay2/105b501f2df620af35fb86d006d3185f95a5f409879c719dfd2e8f9ad0fc8b26/diff:/var/lib/docker/overlay2/4864e473e9eed73ac4a757a6d6190777d1fe3ceb8e9036855eb53ede1df3b406/diff:/var/lib/docker/overlay2/4533c268ae7117c4eae21ff485b64308015de712771af35db8b59aed549207f4/diff:/var/lib/docker/overlay2/0f8cf26a44195ff516b49aca812246a49201f9a8194988720c8d57e8b09b78ce/diff:/var/lib/docker/overlay2/404e59c4c9adc52232d40ea19adf6c8d76b2fc9392dc54e49d9840bbd9ffc580/diff",
            "MergedDir": "/var/lib/docker/overlay2/60fe4db4fcdcd6fc98eb57d4f72c0a969122568a5c989811a5c2542b1a09d6ec/merged",
            "UpperDir": "/var/lib/docker/overlay2/60fe4db4fcdcd6fc98eb57d4f72c0a969122568a5c989811a5c2542b1a09d6ec/diff",
            "WorkDir": "/var/lib/docker/overlay2/60fe4db4fcdcd6fc98eb57d4f72c0a969122568a5c989811a5c2542b1a09d6ec/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:c0f1022b22a9b36851b358f44e5475e39d166e71a8073cf53c894a299239b1c5",
            "sha256:57f0332fe32f295d2be042b1ec9800970829f4a39f947cefae5e5701af449e59",
            "sha256:20f20bf9fbcf9ebca4dd842aa350123001323e48c9338f0ab572f96a142f52db",
            "sha256:a8416e909c394fc5b475e05584caa5b225968b0e049734c575a128c9086209d9",
            "sha256:95f62b304db694c56e606757898ee3c01004f42bd6ef23492f8880f45a0f685c",
            "sha256:197e4c6fee8319689e94382f09f2b14eb6b0a2a7b426c83eee15985f00055551",
            "sha256:63319be6362167dc369a51c72e9f238b47f03d3c47adbc4b0e1b51b6220653d6",
            "sha256:f31140f7a868563172544d44f1368a85ff16d28168cc26ef853deb50043a7597",
            "sha256:fb8ec576e909c916a1b59c8ecd449c59c0a529adfa328e3ff98edd08d93ca345",
            "sha256:3a0c548d6e8e638d3ca66949e057eb477691a53701e3d1e661e3e17f7bbb07e1",
            "sha256:d2a583f1816118478a42f46358941298485c6a22cb9564493196075a6a14eaf3",
            "sha256:a382cde9e178657f8968580548ccf54371e94496d0bffa95d10b562c9391ed72",
            "sha256:e4773e0d9d5ac2ba9ffc30dd9bc47d0be5adc4a8ae5e0a6d011532c1dff473c1"
        ]
    },
    "Metadata": {
        "LastTagTime": "2025-02-25T09:47:20.805816573+08:00"
    }
}
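
The RepoDigests field above is the most reliable way to confirm that a node pulled the exact build documented here; a small verification sketch:

# Should list the ghcr.io and swr.cn-north-4 digests shown in RepoDigests above
docker image inspect --format '{{json .RepoDigests}}' ghcr.io/huggingface/text-embeddings-inference:cpu-1.6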

More Versions

ghcr.io/huggingface/text-embeddings-inference:cpu-1.2
linux/amd64  ghcr.io  636.68MB  2024-07-25 11:53  Views: 307

ghcr.io/huggingface/text-embeddings-inference:turing-1.5
linux/amd64  ghcr.io  900.87MB  2024-10-23 11:14  Views: 174

ghcr.io/huggingface/text-embeddings-inference:cpu-latest
linux/amd64  ghcr.io  660.31MB  2024-12-04 09:13  Views: 99

ghcr.io/huggingface/text-embeddings-inference:1.6
linux/amd64  ghcr.io  1.21GB  2025-02-25 09:27  Views: 7

ghcr.io/huggingface/text-embeddings-inference:cpu-1.6
linux/amd64  ghcr.io  659.95MB  2025-02-25 09:47  Views: 6