docker.io/xprobe/xinference:latest linux/amd64

docker.io/xprobe/xinference:latest - China mirror download source (views: 175)

Note: this is the latest tag of the image, so this site cannot guarantee that it is the most recent build.

Xinference (Xorbits Inference) is a Docker container image that provides a simple, ready-to-use environment for machine-learning inference. The image ships common GPU inference backends such as PyTorch, vLLM, llama-cpp-python and SGLang on a CUDA 12.4 base (see the build history below) and can be used to quickly prototype and deploy model-serving workloads.
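A minimal run sketch, assuming the xinference-local command and default port 9997 described in the upstream Xinference documentation; adjust the port mapping, GPU flags and any model-cache mounts for your environment:

# Hedged quick start: launch a local Xinference endpoint from the mirrored image
docker run -d --name xinference --gpus all -p 9997:9997 \
  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:latest \
  xinference-local -H 0.0.0.0 --port 9997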
Source image       docker.io/xprobe/xinference:latest
China mirror       swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:latest
Image ID           sha256:f4a9926b5439d5252d6b0a6285a678679055c919995f33246c97f2cf4c2adeaf
Image tag          latest
Size               27.44GB
Registry           docker.io
Project info       Docker Hub page / project tags
CMD                /bin/bash
Entrypoint         (none)
Working directory  /opt/inference
OS/Arch            linux/amd64
Views              175
Contributor
Image created      2025-04-03T14:07:04.480740488Z
Synced             2025-04-06 00:40
Updated            2025-05-16 08:48
Environment variables
PATH=/usr/local/nvm/versions/node/v14.21.1/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin NVARCH=x86_64 NVIDIA_REQUIRE_CUDA=cuda>=12.4 brand=tesla,driver>=470,driver<471 brand=unknown,driver>=470,driver<471 brand=nvidia,driver>=470,driver<471 brand=nvidiartx,driver>=470,driver<471 brand=geforce,driver>=470,driver<471 brand=geforcertx,driver>=470,driver<471 brand=quadro,driver>=470,driver<471 brand=quadrortx,driver>=470,driver<471 brand=titan,driver>=470,driver<471 brand=titanrtx,driver>=470,driver<471 brand=tesla,driver>=525,driver<526 brand=unknown,driver>=525,driver<526 brand=nvidia,driver>=525,driver<526 brand=nvidiartx,driver>=525,driver<526 brand=geforce,driver>=525,driver<526 brand=geforcertx,driver>=525,driver<526 brand=quadro,driver>=525,driver<526 brand=quadrortx,driver>=525,driver<526 brand=titan,driver>=525,driver<526 brand=titanrtx,driver>=525,driver<526 brand=tesla,driver>=535,driver<536 brand=unknown,driver>=535,driver<536 brand=nvidia,driver>=535,driver<536 brand=nvidiartx,driver>=535,driver<536 brand=geforce,driver>=535,driver<536 brand=geforcertx,driver>=535,driver<536 brand=quadro,driver>=535,driver<536 brand=quadrortx,driver>=535,driver<536 brand=titan,driver>=535,driver<536 brand=titanrtx,driver>=535,driver<536 NV_CUDA_CUDART_VERSION=12.4.127-1 NV_CUDA_COMPAT_PACKAGE=cuda-compat-12-4 CUDA_VERSION=12.4.1 LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64:/usr/local/lib/python3.10/dist-packages/nvidia/cublas/lib NVIDIA_VISIBLE_DEVICES=all NVIDIA_DRIVER_CAPABILITIES=compute,utility DEBIAN_FRONTEND=noninteractive VLLM_USAGE_SOURCE=production-docker-image NVM_DIR=/usr/local/nvm NODE_VERSION=14.21.1
Image labels
maintainer: NVIDIA CORPORATION <cudatools@nvidia.com>
org.opencontainers.image.ref.name: ubuntu
org.opencontainers.image.version: 20.04

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:latest
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:latest  docker.io/xprobe/xinference:latest
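To confirm the pulled image matches the metadata listed above, you can compare the local image ID and digest with standard Docker commands (a hedged check; output formatting varies slightly across Docker versions):

# Should print the image ID listed above (sha256:f4a9926b5439...)
docker inspect --format '{{.Id}}' docker.io/xprobe/xinference:latest
# Shows the repository digest recorded for the mirror
docker images --digests swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference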

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:latest
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:latest  docker.io/xprobe/xinference:latest
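On Kubernetes nodes where containerd is the runtime, the kubelet resolves images in the k8s.io namespace rather than ctr's default namespace, so you may need to pull and tag there explicitly, for example:

ctr -n k8s.io images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:latest
ctr -n k8s.io images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:latest  docker.io/xprobe/xinference:latest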

Shell quick-replace command

sed -i 's#xprobe/xinference:latest#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:latest#' deployment.yaml
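Without the g flag, sed rewrites only the first match on each line, which is normally enough for an image: field; a quick way to confirm the manifest now points at the mirror:

grep -n 'image:' deployment.yaml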

Ansible bulk distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:latest && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:latest  docker.io/xprobe/xinference:latest'

Ansible bulk distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:latest && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:latest  docker.io/xprobe/xinference:latest'
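Both one-liners assume an inventory group named k8s, as in the commands above; a hedged follow-up to verify the image is present on every node:

#ansible k8s -m shell -a 'docker images | grep xinference'
#ansible k8s -m shell -a 'ctr -n k8s.io images ls | grep xinference'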

Image build history


# 2025-04-03 22:07:04  0.00B 
/bin/sh -c #(nop)  CMD ["/bin/bash"]
                        
# 2025-04-03 22:07:04  0.00B 
/bin/sh -c #(nop)  ENTRYPOINT []
                        
# 2025-04-03 22:07:01  461.93MB 
|1 PIP_INDEX=https://pypi.org/simple /bin/sh -c /opt/conda/bin/conda create -n ffmpeg-env -c conda-forge 'ffmpeg<7' -y &&     ln -s /opt/conda/envs/ffmpeg-env/bin/ffmpeg /usr/local/bin/ffmpeg &&     ln -s /opt/conda/envs/ffmpeg-env/bin/ffprobe /usr/local/bin/ffprobe &&     /opt/conda/bin/conda clean --all -y
                        
# 2025-04-03 22:06:25  476.11MB 
|1 PIP_INDEX=https://pypi.org/simple /bin/sh -c wget -O Miniforge3.sh "https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-$(uname)-$(uname -m).sh" &&     bash Miniforge3.sh -b -p /opt/conda &&     rm Miniforge3.sh
                        
# 2025-04-03 22:06:02  16.22GB 
|1 PIP_INDEX=https://pypi.org/simple /bin/sh -c pip install --upgrade -i "$PIP_INDEX" pip &&     pip install -i "$PIP_INDEX" "diskcache>=5.6.1" "jinja2>=2.11.3" &&     pip install "llama-cpp-python>=0.2.82" -i https://abetlen.github.io/llama-cpp-python/whl/cu124 &&     pip install -i "$PIP_INDEX" --upgrade-strategy only-if-needed -r /opt/inference/xinference/deploy/docker/requirements.txt &&     pip install -i "$PIP_INDEX" --no-deps sglang &&     pip uninstall flashinfer -y &&     pip install flashinfer-python -i https://flashinfer.ai/whl/cu124/torch2.5 &&     cd /opt/inference &&     python3 setup.py build_web &&     git restore . &&     pip install -i "$PIP_INDEX" --no-deps "." &&     pip uninstall xllamacpp -y &&     pip install xllamacpp --index-url https://xorbitsai.github.io/xllamacpp/whl/cu124 &&     pip cache purge
                        
# 2025-04-03 21:46:37  0.00B 
/bin/sh -c #(nop)  ARG PIP_INDEX=https://pypi.org/simple
                        
# 2025-04-03 21:46:37  0.00B 
/bin/sh -c #(nop)  ENV LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64:/usr/local/lib/python3.10/dist-packages/nvidia/cublas/lib
                        
# 2025-04-03 21:46:37  0.00B 
/bin/sh -c #(nop)  ENV PATH=/usr/local/nvm/versions/node/v14.21.1/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2025-04-03 21:46:34  440.51MB 
/bin/sh -c apt-get -y update   && apt install -y wget curl procps git libgl1   && printf "\ndeb https://mirrors.tuna.tsinghua.edu.cn/ubuntu/ jammy main restricted universe multiverse" >> /etc/apt/sources.list   && apt-get -y update   && apt-get install -y --only-upgrade libstdc++6 && apt install -y libc6   && mkdir -p $NVM_DIR   && curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.7/install.sh | bash   && . $NVM_DIR/nvm.sh   && nvm install $NODE_VERSION   && nvm alias default $NODE_VERSION   && nvm use default   && apt-get -yq clean
                        
# 2025-04-03 21:45:40  0.00B 
/bin/sh -c #(nop)  ENV NODE_VERSION=14.21.1
                        
# 2025-04-03 21:45:40  0.00B 
/bin/sh -c #(nop)  ENV NVM_DIR=/usr/local/nvm
                        
# 2025-04-03 21:45:40  0.00B 
/bin/sh -c #(nop) WORKDIR /opt/inference
                        
# 2025-04-03 21:45:39  123.88MB 
/bin/sh -c #(nop) COPY dir:7738f00585893a079d26f23595774b8615b9a88724db3c1d1c6ca5095c81f07b in /opt/inference 
                        
# 2024-09-05 08:27:15  0.00B  Configure the command run when the container starts
ENTRYPOINT ["python3" "-m" "vllm.entrypoints.openai.api_server"]
                        
# 2024-09-05 08:27:15  0.00B  Set environment variable VLLM_USAGE_SOURCE
ENV VLLM_USAGE_SOURCE=production-docker-image
                        
# 2024-09-05 08:27:15  49.59MB  Run a command and create a new image layer
RUN /bin/sh -c pip install accelerate hf_transfer 'modelscope!=1.15.0' # buildkit
                        
# 2024-09-05 08:26:55  1.84GB  Run a command and create a new image layer
RUN |2 CUDA_VERSION=12.4.1 PYTHON_VERSION=3.10 /bin/sh -c . /etc/environment &&     python3 -m pip install https://github.com/flashinfer-ai/flashinfer/releases/download/v0.1.6/flashinfer-0.1.6+cu121torch2.4-cp${PYTHON_VERSION_STR}-cp${PYTHON_VERSION_STR}-linux_x86_64.whl # buildkit
                        
# 2024-09-05 08:26:02  6.96GB  Run a command and create a new image layer
RUN |2 CUDA_VERSION=12.4.1 PYTHON_VERSION=3.10 /bin/sh -c python3 -m pip install dist/*.whl --verbose # buildkit
                        
# 2024-09-05 07:39:00  32.26KB  Run a command and create a new image layer
RUN |2 CUDA_VERSION=12.4.1 PYTHON_VERSION=3.10 /bin/sh -c ldconfig /usr/local/cuda-$(echo $CUDA_VERSION | cut -d. -f1,2)/compat/ # buildkit
                        
# 2024-09-05 07:39:00  612.25MB  Run a command and create a new image layer
RUN |2 CUDA_VERSION=12.4.1 PYTHON_VERSION=3.10 /bin/sh -c echo 'tzdata tzdata/Areas select America' | debconf-set-selections     && echo 'tzdata tzdata/Zones/America select Los_Angeles' | debconf-set-selections     && apt-get update -y     && apt-get install -y ccache software-properties-common git curl sudo vim python3-pip     && add-apt-repository ppa:deadsnakes/ppa     && apt-get update -y     && apt-get install -y python${PYTHON_VERSION} python${PYTHON_VERSION}-dev python${PYTHON_VERSION}-venv libibverbs-dev     && update-alternatives --install /usr/bin/python3 python3 /usr/bin/python${PYTHON_VERSION} 1     && update-alternatives --set python3 /usr/bin/python${PYTHON_VERSION}     && ln -sf /usr/bin/python${PYTHON_VERSION}-config /usr/bin/python3-config     && curl -sS https://bootstrap.pypa.io/get-pip.py | python${PYTHON_VERSION}     && python3 --version && python3 -m pip --version # buildkit
                        
# 2024-09-05 07:36:15  136.00B  Run a command and create a new image layer
RUN |2 CUDA_VERSION=12.4.1 PYTHON_VERSION=3.10 /bin/sh -c PYTHON_VERSION_STR=$(echo ${PYTHON_VERSION} | sed 's/\.//g') &&     echo "export PYTHON_VERSION_STR=${PYTHON_VERSION_STR}" >> /etc/environment # buildkit
                        
# 2024-07-23 15:03:19  0.00B  Set environment variable DEBIAN_FRONTEND
ENV DEBIAN_FRONTEND=noninteractive
                        
# 2024-07-23 15:03:19  0.00B  Set working directory to /vllm-workspace
WORKDIR /vllm-workspace
                        
# 2024-07-23 15:03:19  0.00B  Define a build argument
ARG PYTHON_VERSION=3.10
                        
# 2024-07-23 15:03:19  0.00B  Define a build argument
ARG CUDA_VERSION=12.4.1
                        
# 2024-04-23 07:42:36  0.00B  Set environment variable NVIDIA_DRIVER_CAPABILITIES
ENV NVIDIA_DRIVER_CAPABILITIES=compute,utility
                        
# 2024-04-23 07:42:36  0.00B  Set environment variable NVIDIA_VISIBLE_DEVICES
ENV NVIDIA_VISIBLE_DEVICES=all
                        
# 2024-04-23 07:42:36  17.29KB  Copy a new file or directory into the container
COPY NGC-DL-CONTAINER-LICENSE / # buildkit
                        
# 2024-04-23 07:42:36  0.00B  Set environment variable LD_LIBRARY_PATH
ENV LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64
                        
# 2024-04-23 07:42:36  0.00B  Set environment variable PATH
ENV PATH=/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2024-04-23 07:42:36  46.00B  Run a command and create a new image layer
RUN |1 TARGETARCH=amd64 /bin/sh -c echo "/usr/local/nvidia/lib" >> /etc/ld.so.conf.d/nvidia.conf     && echo "/usr/local/nvidia/lib64" >> /etc/ld.so.conf.d/nvidia.conf # buildkit
                        
# 2024-04-23 07:42:36  155.93MB  Run a command and create a new image layer
RUN |1 TARGETARCH=amd64 /bin/sh -c apt-get update && apt-get install -y --no-install-recommends     cuda-cudart-12-4=${NV_CUDA_CUDART_VERSION}     ${NV_CUDA_COMPAT_PACKAGE}     && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2024-04-23 07:42:24  0.00B  Set environment variable CUDA_VERSION
ENV CUDA_VERSION=12.4.1
                        
# 2024-04-23 07:42:24  18.32MB  Run a command and create a new image layer
RUN |1 TARGETARCH=amd64 /bin/sh -c apt-get update && apt-get install -y --no-install-recommends     gnupg2 curl ca-certificates &&     curl -fsSL https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/${NVARCH}/3bf863cc.pub | apt-key add - &&     echo "deb https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/${NVARCH} /" > /etc/apt/sources.list.d/cuda.list &&     apt-get purge --autoremove -y curl     && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2024-04-23 07:42:24  0.00B  Add a metadata label
LABEL maintainer=NVIDIA CORPORATION <cudatools@nvidia.com>
                        
# 2024-04-23 07:42:24  0.00B  Define a build argument
ARG TARGETARCH
                        
# 2024-04-23 07:42:24  0.00B  Set environment variable NV_CUDA_COMPAT_PACKAGE
ENV NV_CUDA_COMPAT_PACKAGE=cuda-compat-12-4
                        
# 2024-04-23 07:42:24  0.00B  Set environment variable NV_CUDA_CUDART_VERSION
ENV NV_CUDA_CUDART_VERSION=12.4.127-1
                        
# 2024-04-23 07:42:24  0.00B  Set environment variable NVIDIA_REQUIRE_CUDA
ENV NVIDIA_REQUIRE_CUDA=cuda>=12.4 brand=tesla,driver>=470,driver<471 brand=unknown,driver>=470,driver<471 brand=nvidia,driver>=470,driver<471 brand=nvidiartx,driver>=470,driver<471 brand=geforce,driver>=470,driver<471 brand=geforcertx,driver>=470,driver<471 brand=quadro,driver>=470,driver<471 brand=quadrortx,driver>=470,driver<471 brand=titan,driver>=470,driver<471 brand=titanrtx,driver>=470,driver<471 brand=tesla,driver>=525,driver<526 brand=unknown,driver>=525,driver<526 brand=nvidia,driver>=525,driver<526 brand=nvidiartx,driver>=525,driver<526 brand=geforce,driver>=525,driver<526 brand=geforcertx,driver>=525,driver<526 brand=quadro,driver>=525,driver<526 brand=quadrortx,driver>=525,driver<526 brand=titan,driver>=525,driver<526 brand=titanrtx,driver>=525,driver<526 brand=tesla,driver>=535,driver<536 brand=unknown,driver>=535,driver<536 brand=nvidia,driver>=535,driver<536 brand=nvidiartx,driver>=535,driver<536 brand=geforce,driver>=535,driver<536 brand=geforcertx,driver>=535,driver<536 brand=quadro,driver>=535,driver<536 brand=quadrortx,driver>=535,driver<536 brand=titan,driver>=535,driver<536 brand=titanrtx,driver>=535,driver<536
                        
# 2024-04-23 07:42:24  0.00B  Set environment variable NVARCH
ENV NVARCH=x86_64
                        
# 2024-04-11 02:50:37  0.00B 
/bin/sh -c #(nop)  CMD ["/bin/bash"]
                        
# 2024-04-11 02:50:37  72.81MB 
/bin/sh -c #(nop) ADD file:ea2128e23dce0162557abadd80656bd5ae047d573095d1d4323eb4154490dfdc in / 
                        
# 2024-04-11 02:50:35  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.version=20.04
                        
# 2024-04-11 02:50:35  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.ref.name=ubuntu
                        
# 2024-04-11 02:50:35  0.00B 
/bin/sh -c #(nop)  ARG LAUNCHPAD_BUILD_ARCH
                        
# 2024-04-11 02:50:35  0.00B 
/bin/sh -c #(nop)  ARG RELEASE
                        
                    

Image information

{
    "Id": "sha256:f4a9926b5439d5252d6b0a6285a678679055c919995f33246c97f2cf4c2adeaf",
    "RepoTags": [
        "xprobe/xinference:latest",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:latest"
    ],
    "RepoDigests": [
        "xprobe/xinference@sha256:fa13e32bab9e5bfe88a8ff2c3a74c5e671ace260d8a10f7ad3ec061be4e81cef",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference@sha256:b2b0869afa0da748eabade9afabe9e1fbc5f4dda2ca91f09be9c1f8f2915206b"
    ],
    "Parent": "",
    "Comment": "",
    "Created": "2025-04-03T14:07:04.480740488Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "20.10.17",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/usr/local/nvm/versions/node/v14.21.1/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "NVARCH=x86_64",
            "NVIDIA_REQUIRE_CUDA=cuda\u003e=12.4 brand=tesla,driver\u003e=470,driver\u003c471 brand=unknown,driver\u003e=470,driver\u003c471 brand=nvidia,driver\u003e=470,driver\u003c471 brand=nvidiartx,driver\u003e=470,driver\u003c471 brand=geforce,driver\u003e=470,driver\u003c471 brand=geforcertx,driver\u003e=470,driver\u003c471 brand=quadro,driver\u003e=470,driver\u003c471 brand=quadrortx,driver\u003e=470,driver\u003c471 brand=titan,driver\u003e=470,driver\u003c471 brand=titanrtx,driver\u003e=470,driver\u003c471 brand=tesla,driver\u003e=525,driver\u003c526 brand=unknown,driver\u003e=525,driver\u003c526 brand=nvidia,driver\u003e=525,driver\u003c526 brand=nvidiartx,driver\u003e=525,driver\u003c526 brand=geforce,driver\u003e=525,driver\u003c526 brand=geforcertx,driver\u003e=525,driver\u003c526 brand=quadro,driver\u003e=525,driver\u003c526 brand=quadrortx,driver\u003e=525,driver\u003c526 brand=titan,driver\u003e=525,driver\u003c526 brand=titanrtx,driver\u003e=525,driver\u003c526 brand=tesla,driver\u003e=535,driver\u003c536 brand=unknown,driver\u003e=535,driver\u003c536 brand=nvidia,driver\u003e=535,driver\u003c536 brand=nvidiartx,driver\u003e=535,driver\u003c536 brand=geforce,driver\u003e=535,driver\u003c536 brand=geforcertx,driver\u003e=535,driver\u003c536 brand=quadro,driver\u003e=535,driver\u003c536 brand=quadrortx,driver\u003e=535,driver\u003c536 brand=titan,driver\u003e=535,driver\u003c536 brand=titanrtx,driver\u003e=535,driver\u003c536",
            "NV_CUDA_CUDART_VERSION=12.4.127-1",
            "NV_CUDA_COMPAT_PACKAGE=cuda-compat-12-4",
            "CUDA_VERSION=12.4.1",
            "LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64:/usr/local/lib/python3.10/dist-packages/nvidia/cublas/lib",
            "NVIDIA_VISIBLE_DEVICES=all",
            "NVIDIA_DRIVER_CAPABILITIES=compute,utility",
            "DEBIAN_FRONTEND=noninteractive",
            "VLLM_USAGE_SOURCE=production-docker-image",
            "NVM_DIR=/usr/local/nvm",
            "NODE_VERSION=14.21.1"
        ],
        "Cmd": [
            "/bin/bash"
        ],
        "Image": "sha256:edefec303b44c89f4c7c1e244c731d90fefb3bf150dac6766cdd63fc5ea16d64",
        "Volumes": null,
        "WorkingDir": "/opt/inference",
        "Entrypoint": null,
        "OnBuild": null,
        "Labels": {
            "maintainer": "NVIDIA CORPORATION \u003ccudatools@nvidia.com\u003e",
            "org.opencontainers.image.ref.name": "ubuntu",
            "org.opencontainers.image.version": "20.04"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 27438477830,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/8bbff7c45a21b3bf4545aa5a6d6f06a6b41e322c2b7062fe847f5c31c149a785/diff:/var/lib/docker/overlay2/b6af2900efccf2646094c7e42cbbd6bad47edea7de9a10aaf360d64ecf8dde2a/diff:/var/lib/docker/overlay2/8ed21d4f9c0aca8c119d019b2c3d96a220121807c691e48d93e4e12fbb077a4e/diff:/var/lib/docker/overlay2/d07ac60a43b5d962762c307078d5aea3148db913d3e4b89dd03db49424046feb/diff:/var/lib/docker/overlay2/a6719a04d49d72dbff3dfc845cff808a5f2da0a7a604ce135c606404ff89af3f/diff:/var/lib/docker/overlay2/420855220a7707eb039e01dc96fdb49e0fcf494c20b09587f2ba6173b88de1f4/diff:/var/lib/docker/overlay2/82241de74a9ab6232aaf33bf9c668750d3d4dc67297fc88f1546ab83e6ea0183/diff:/var/lib/docker/overlay2/aea2d330d7d8409a241c7d7e56f8a82b5c04676b80387e0ff1c62a8ea2afb457/diff:/var/lib/docker/overlay2/a7dbcdcffad48cf54b08071ba8c289f856d2591bbba2b90eb7e1da65cff81b3f/diff:/var/lib/docker/overlay2/8cd8a3d50056dcc1e0cfb5a3c5fa2131db579e7b4d0fbb48ec704fb0515efbd5/diff:/var/lib/docker/overlay2/ffa224a2b2a2577e821891624d82e820a960cb3a99ba005fdfaf2f0647cd55ee/diff:/var/lib/docker/overlay2/0e6a6f4e5573529fa558e6b24566b38c6a84719b789ed96fb3009246f453765e/diff:/var/lib/docker/overlay2/dab94514c02fdb2620589f34a5bc510c4d563497315aafb167384c4513545546/diff:/var/lib/docker/overlay2/7adf3f9f81928cd4a4354e04808a47a1310fd41b49e578a96651cb49d5de79e3/diff:/var/lib/docker/overlay2/658950ec1e0e5fc86cb0d943568f69cb9ee68b401873dba7cdc367fb861d436d/diff:/var/lib/docker/overlay2/f684c3053f2c10cf7db88609b35810a4687299b260d0dcb69cd75c9f7ac79b37/diff",
            "MergedDir": "/var/lib/docker/overlay2/652ce7cdcf4b1982726f4a5e9034d609fb47df65803d09e7fcc4cd5592e5a7ca/merged",
            "UpperDir": "/var/lib/docker/overlay2/652ce7cdcf4b1982726f4a5e9034d609fb47df65803d09e7fcc4cd5592e5a7ca/diff",
            "WorkDir": "/var/lib/docker/overlay2/652ce7cdcf4b1982726f4a5e9034d609fb47df65803d09e7fcc4cd5592e5a7ca/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:106e8431b412f51ccd75ea46a2d5cb4343b23273cbcf50188377cb93aa9a6d82",
            "sha256:cd76869b72ab2a56badf9a068c3f6231203c316e6d1f7a9206e1a9a1a8009fd4",
            "sha256:2ecbf7829cb760264b926dfd4b3cd036c1bdbb50d311bb1c31a6c217ca208bb4",
            "sha256:5136ffc45974618abb8b5ac96d2be9a16fecf79d6ab6759d896f5355c52f10b9",
            "sha256:809d3bb9c80fb3d31d4c061ba0b38ba4e83b6329e33c2cb2bbf27251a8e527c6",
            "sha256:f64d335e5a99796db8621cc422978eba0bb9fbde78c7af53ca1867d545fde504",
            "sha256:2c1ed2c1da9656089f29d332d556e64364a9d7d3baf34f653823afc0fa33926e",
            "sha256:6e44a86d7e24ab88030cf839ab32d42a95208f9277b5906041b7a35ae7ddf21b",
            "sha256:3d42f132c006a5c80dbffaf9c2e38cfcaf379037b6b88273490e307a10f1a8ef",
            "sha256:3f3eb8c7c911cf80afc93c8ebd25b05e177a984d216a8cab762060d0f5395150",
            "sha256:220ab014ee964b4828b3fdc4d6cd323cbdeb9b4a237f47d2dd926f8fec617d36",
            "sha256:5d2b4a0ead904aadca0a0b294ae31c19a972e6c487a1be208d007d387635d556",
            "sha256:695c014fc18e0aeaf660354a3e7eb2e7e089b1affea7488521498a026628e6fe",
            "sha256:5707fbb7ad305dd7abe6b2904ef3c42fff24dc769c8aea13025644fe055beb30",
            "sha256:88f95dbebc3b00c8a486c571e223b0cfa10f0ce7a21187028968fb3cc0c28929",
            "sha256:37d33dd44b953fffad7801f7fd57f8364c836db85403d3dc8e261dc83aaf7e32",
            "sha256:97e3e4bf66b8207a221dfd63856e6f1880ddb9578e8e7ea4aa2aeeb1a04f2d08"
        ]
    },
    "Metadata": {
        "LastTagTime": "2025-04-06T00:23:54.149119859+08:00"
    }
}
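
The JSON above is standard docker image inspect output; it can be reproduced locally after pulling the image, for example:

docker image inspect swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:latest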

More versions

docker.io/xprobe/xinference:v0.13.3        linux/amd64  docker.io  15.44GB  2024-08-03 22:15  views: 811
docker.io/xprobe/xinference:v0.14.0        linux/amd64  docker.io  15.53GB  2024-08-05 11:11  views: 650
docker.io/xprobe/xinference:latest-cpu     linux/amd64  docker.io  6.75GB   2024-09-07 01:27  views: 554
docker.io/xprobe/xinference:v0.12.1        linux/amd64  docker.io  26.68GB  2024-09-07 03:07  views: 285
docker.io/xprobe/xinference:v0.15.0-cpu    linux/amd64  docker.io  6.75GB   2024-09-19 22:49  views: 317
docker.io/xprobe/xinference:v0.12.0        linux/amd64  docker.io  27.66GB  2024-09-25 04:01  views: 288
docker.io/xprobe/xinference:v0.15.2        linux/amd64  docker.io  17.55GB  2024-09-30 13:35  views: 238
docker.io/xprobe/xinference:v0.15.4        linux/amd64  docker.io  17.54GB  2024-10-16 01:44  views: 317
docker.io/xprobe/xinference:v0.16.3        linux/amd64  docker.io  17.59GB  2024-11-13 00:44  views: 184
docker.io/xprobe/xinference:v1.0.0         linux/amd64  docker.io  17.62GB  2024-11-19 00:15  views: 219
docker.io/xprobe/xinference:v1.0.1         linux/amd64  docker.io  17.60GB  2024-12-04 00:49  views: 209
docker.io/xprobe/xinference:v1.1.0         linux/amd64  docker.io  18.25GB  2024-12-18 00:16  views: 290
docker.io/xprobe/xinference:v1.2.0         linux/amd64  docker.io  17.01GB  2025-01-15 00:31  views: 142
docker.io/xprobe/xinference:v1.2.1         linux/amd64  docker.io  23.34GB  2025-01-27 00:55  views: 280
docker.io/xprobe/xinference:v1.2.2         linux/amd64  docker.io  23.55GB  2025-02-13 01:30  views: 286
docker.io/xprobe/xinference:v1.3.0         linux/amd64  docker.io  23.65GB  2025-03-03 01:51  views: 330
docker.io/xprobe/xinference:v1.3.1.post1   linux/amd64  docker.io  25.86GB  2025-03-13 01:23  views: 184
docker.io/xprobe/xinference:v1.3.1         linux/amd64  docker.io  25.83GB  2025-03-19 01:04  views: 360
docker.io/xprobe/xinference:v1.4.0         linux/amd64  docker.io  26.27GB  2025-03-22 01:34  views: 365
docker.io/xprobe/xinference:v0.15.4-cpu    linux/amd64  docker.io  6.78GB   2025-04-01 20:33  views: 72
docker.io/xprobe/xinference:v1.4.0-cpu     linux/amd64  docker.io  8.82GB   2025-04-02 00:33  views: 153
docker.io/xprobe/xinference:latest         linux/amd64  docker.io  27.44GB  2025-04-06 00:40  views: 174
docker.io/xprobe/xinference:v1.4.1         linux/amd64  docker.io  27.44GB  2025-04-06 13:27  views: 331
docker.io/xprobe/xinference:nightly-main   linux/amd64  docker.io  26.91GB  2025-04-12 01:26  views: 69
docker.io/xprobe/xinference:v1.5.0         linux/amd64  docker.io  26.63GB  2025-04-22 01:22  views: 200
docker.io/xprobe/xinference:v1.5.0.post2   linux/amd64  docker.io  27.24GB  2025-04-25 05:27  views: 120
docker.io/xprobe/xinference:v1.4.1-cpu     linux/amd64  docker.io  8.66GB   2025-04-30 00:55  views: 71
docker.io/xprobe/xinference:v1.5.1         linux/amd64  docker.io  27.95GB  2025-05-02 01:16  views: 306
docker.io/xprobe/xinference:v1.6.0         linux/amd64  docker.io  26.72GB  2025-05-17 01:24  views: 12