docker.io/xprobe/xinference:v1.4.0 linux/amd64

docker.io/xprobe/xinference:v1.4.0 - China mirror download source
Xinference (Xorbits Inference) is a Docker container image that provides a simple, ready-to-use environment for machine learning inference. The v1.4.0 image is built on Ubuntu 20.04 with CUDA 12.4 and, as the build history below shows, bundles inference engines such as vLLM, llama-cpp-python, and SGLang, so it can be used to quickly deploy and serve models without further setup.
Source image docker.io/xprobe/xinference:v1.4.0
China mirror swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:v1.4.0
Image ID sha256:ce854f81084e78a46035ef854991b50f0d47d65460b1d7f6124be72e49c2fe29
Image tag v1.4.0
Size 26.27GB
Registry docker.io
Project info Docker Hub page / project tags
CMD /bin/bash
Entrypoint
Working directory /opt/inference
OS/Platform linux/amd64
Views 217
Contributor
Image created 2025-03-21T07:43:05.607245055Z
Synced 2025-03-22 01:34
Updated 2025-04-03 02:28
Environment variables
PATH=/usr/local/nvm/versions/node/v14.21.1/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
NVARCH=x86_64
NVIDIA_REQUIRE_CUDA=cuda>=12.4 brand=tesla,driver>=470,driver<471 brand=unknown,driver>=470,driver<471 brand=nvidia,driver>=470,driver<471 brand=nvidiartx,driver>=470,driver<471 brand=geforce,driver>=470,driver<471 brand=geforcertx,driver>=470,driver<471 brand=quadro,driver>=470,driver<471 brand=quadrortx,driver>=470,driver<471 brand=titan,driver>=470,driver<471 brand=titanrtx,driver>=470,driver<471 brand=tesla,driver>=525,driver<526 brand=unknown,driver>=525,driver<526 brand=nvidia,driver>=525,driver<526 brand=nvidiartx,driver>=525,driver<526 brand=geforce,driver>=525,driver<526 brand=geforcertx,driver>=525,driver<526 brand=quadro,driver>=525,driver<526 brand=quadrortx,driver>=525,driver<526 brand=titan,driver>=525,driver<526 brand=titanrtx,driver>=525,driver<526 brand=tesla,driver>=535,driver<536 brand=unknown,driver>=535,driver<536 brand=nvidia,driver>=535,driver<536 brand=nvidiartx,driver>=535,driver<536 brand=geforce,driver>=535,driver<536 brand=geforcertx,driver>=535,driver<536 brand=quadro,driver>=535,driver<536 brand=quadrortx,driver>=535,driver<536 brand=titan,driver>=535,driver<536 brand=titanrtx,driver>=535,driver<536
NV_CUDA_CUDART_VERSION=12.4.127-1
NV_CUDA_COMPAT_PACKAGE=cuda-compat-12-4
CUDA_VERSION=12.4.1
LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64:/usr/local/lib/python3.10/dist-packages/nvidia/cublas/lib
NVIDIA_VISIBLE_DEVICES=all
NVIDIA_DRIVER_CAPABILITIES=compute,utility
DEBIAN_FRONTEND=noninteractive
VLLM_USAGE_SOURCE=production-docker-image
NVM_DIR=/usr/local/nvm
NODE_VERSION=14.21.1
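NVIDIA_VISIBLE_DEVICES and NVIDIA_DRIVER_CAPABILITIES are read by the NVIDIA Container Toolkit. As a rough sketch (not from this page), GPU visibility is usually narrowed at run time rather than by editing the image, for example:

docker run --rm --gpus device=0 swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:v1.4.0 nvidia-smi   # expose only GPU 0; requires the NVIDIA Container Toolkit on the host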
Image labels
maintainer: NVIDIA CORPORATION <cudatools@nvidia.com>
org.opencontainers.image.ref.name: ubuntu
org.opencontainers.image.version: 20.04

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:v1.4.0
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:v1.4.0  docker.io/xprobe/xinference:v1.4.0
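Once re-tagged, the image can be started the same way as the upstream xprobe/xinference image. A minimal sketch, assuming the upstream xinference-local entrypoint and an illustrative port and cache path (not taken from this page):

docker run -d --name xinference --gpus all \
  -p 9997:9997 \
  -v /data/xinference:/root/.xinference \
  docker.io/xprobe/xinference:v1.4.0 \
  xinference-local -H 0.0.0.0 --port 9997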

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:v1.4.0
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:v1.4.0  docker.io/xprobe/xinference:v1.4.0
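On a Kubernetes node, containerd keeps the images used by the kubelet in the k8s.io namespace, so images pulled into ctr's default namespace will not be visible to pods. A hedged variant of the same commands for that case:

ctr -n k8s.io images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:v1.4.0
ctr -n k8s.io images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:v1.4.0  docker.io/xprobe/xinference:v1.4.0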

Quick shell replacement command

sed -i 's#xprobe/xinference:v1.4.0#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:v1.4.0#' deployment.yaml
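Note that the pattern above also matches inside references that already carry the docker.io/ prefix, which would leave a doubled prefix in the result. If your manifest writes the image as docker.io/xprobe/xinference:v1.4.0, a sketch with the prefix anchored on the left-hand side:

sed -i 's#docker.io/xprobe/xinference:v1.4.0#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:v1.4.0#' deployment.yaml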

Quick Ansible distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:v1.4.0 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:v1.4.0  docker.io/xprobe/xinference:v1.4.0'
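The commented command assumes an Ansible inventory group named k8s. A minimal sketch of such a group, with hypothetical hostnames:

cat >> /etc/ansible/hosts <<'EOF'
[k8s]
node1.example.com
node2.example.com
EOF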

Quick Ansible distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:v1.4.0 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:v1.4.0  docker.io/xprobe/xinference:v1.4.0'
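To confirm the distribution worked, a hedged follow-up check that lists the re-tagged image on every node:

#ansible k8s -m shell -a 'ctr images ls | grep xinference'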

Image build history
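The entries below mirror what docker history reports for this image. A sketch for reproducing the listing locally after pulling (the format string is just one readable choice):

docker history --no-trunc --format '{{.CreatedAt}}\t{{.Size}}\t{{.CreatedBy}}' docker.io/xprobe/xinference:v1.4.0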


# 2025-03-21 15:43:05  0.00B 
/bin/sh -c #(nop)  CMD ["/bin/bash"]
                        
# 2025-03-21 15:43:05  0.00B 
/bin/sh -c #(nop)  ENTRYPOINT []
                        
# 2025-03-21 15:43:02  459.84MB 
|1 PIP_INDEX=https://pypi.org/simple /bin/sh -c /opt/conda/bin/conda create -n ffmpeg-env -c conda-forge 'ffmpeg<7' -y &&     ln -s /opt/conda/envs/ffmpeg-env/bin/ffmpeg /usr/local/bin/ffmpeg &&     ln -s /opt/conda/envs/ffmpeg-env/bin/ffprobe /usr/local/bin/ffprobe &&     /opt/conda/bin/conda clean --all -y
                        
# 2025-03-21 15:42:25  476.11MB 
|1 PIP_INDEX=https://pypi.org/simple /bin/sh -c wget -O Miniforge3.sh "https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-$(uname)-$(uname -m).sh" &&     bash Miniforge3.sh -b -p /opt/conda &&     rm Miniforge3.sh
                        
# 2025-03-21 15:41:56  15.08GB 
|1 PIP_INDEX=https://pypi.org/simple /bin/sh -c pip install --upgrade -i "$PIP_INDEX" pip &&     pip install -i "$PIP_INDEX" "diskcache>=5.6.1" "jinja2>=2.11.3" &&     pip install "llama-cpp-python>=0.2.82" -i https://abetlen.github.io/llama-cpp-python/whl/cu124 &&     pip install -i "$PIP_INDEX" --upgrade-strategy only-if-needed -r /opt/inference/xinference/deploy/docker/requirements.txt &&     pip install -i "$PIP_INDEX" --no-deps sglang &&     pip uninstall flashinfer -y &&     pip install flashinfer-python -i https://flashinfer.ai/whl/cu124/torch2.5 &&     cd /opt/inference &&     python3 setup.py build_web &&     git restore . &&     pip install -i "$PIP_INDEX" --no-deps "." &&     pip uninstall xllamacpp -y &&     pip install xllamacpp --index-url https://xorbitsai.github.io/xllamacpp/whl/cu124 &&     pip cache purge
                        
# 2025-03-21 15:25:14  0.00B 
/bin/sh -c #(nop)  ARG PIP_INDEX=https://pypi.org/simple
                        
# 2025-03-21 15:25:14  0.00B 
/bin/sh -c #(nop)  ENV LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64:/usr/local/lib/python3.10/dist-packages/nvidia/cublas/lib
                        
# 2025-03-21 15:25:14  0.00B 
/bin/sh -c #(nop)  ENV PATH=/usr/local/nvm/versions/node/v14.21.1/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2025-03-21 15:25:11  440.00MB 
/bin/sh -c apt-get -y update   && apt install -y wget curl procps git libgl1   && printf "\ndeb https://mirrors.tuna.tsinghua.edu.cn/ubuntu/ jammy main restricted universe multiverse" >> /etc/apt/sources.list   && apt-get -y update   && apt-get install -y --only-upgrade libstdc++6 && apt install -y libc6   && mkdir -p $NVM_DIR   && curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.7/install.sh | bash   && . $NVM_DIR/nvm.sh   && nvm install $NODE_VERSION   && nvm alias default $NODE_VERSION   && nvm use default   && apt-get -yq clean
                        
# 2025-03-21 15:20:45  0.00B 
/bin/sh -c #(nop)  ENV NODE_VERSION=14.21.1
                        
# 2025-03-21 15:20:45  0.00B 
/bin/sh -c #(nop)  ENV NVM_DIR=/usr/local/nvm
                        
# 2025-03-21 15:20:45  0.00B 
/bin/sh -c #(nop) WORKDIR /opt/inference
                        
# 2025-03-21 15:20:43  97.92MB 
/bin/sh -c #(nop) COPY dir:4d28d48efb587aae534e429bea49b7699d543f4aa0651cb93141b6fa6a48de21 in /opt/inference 
                        
# 2024-09-05 08:27:15  0.00B Configure the command run when the container starts
ENTRYPOINT ["python3" "-m" "vllm.entrypoints.openai.api_server"]
                        
# 2024-09-05 08:27:15  0.00B Set environment variable VLLM_USAGE_SOURCE
ENV VLLM_USAGE_SOURCE=production-docker-image
                        
# 2024-09-05 08:27:15  49.59MB Run a command and create a new image layer
RUN /bin/sh -c pip install accelerate hf_transfer 'modelscope!=1.15.0' # buildkit
                        
# 2024-09-05 08:26:55  1.84GB Run a command and create a new image layer
RUN |2 CUDA_VERSION=12.4.1 PYTHON_VERSION=3.10 /bin/sh -c . /etc/environment &&     python3 -m pip install https://github.com/flashinfer-ai/flashinfer/releases/download/v0.1.6/flashinfer-0.1.6+cu121torch2.4-cp${PYTHON_VERSION_STR}-cp${PYTHON_VERSION_STR}-linux_x86_64.whl # buildkit
                        
# 2024-09-05 08:26:02  6.96GB Run a command and create a new image layer
RUN |2 CUDA_VERSION=12.4.1 PYTHON_VERSION=3.10 /bin/sh -c python3 -m pip install dist/*.whl --verbose # buildkit
                        
# 2024-09-05 07:39:00  32.26KB Run a command and create a new image layer
RUN |2 CUDA_VERSION=12.4.1 PYTHON_VERSION=3.10 /bin/sh -c ldconfig /usr/local/cuda-$(echo $CUDA_VERSION | cut -d. -f1,2)/compat/ # buildkit
                        
# 2024-09-05 07:39:00  612.25MB Run a command and create a new image layer
RUN |2 CUDA_VERSION=12.4.1 PYTHON_VERSION=3.10 /bin/sh -c echo 'tzdata tzdata/Areas select America' | debconf-set-selections     && echo 'tzdata tzdata/Zones/America select Los_Angeles' | debconf-set-selections     && apt-get update -y     && apt-get install -y ccache software-properties-common git curl sudo vim python3-pip     && add-apt-repository ppa:deadsnakes/ppa     && apt-get update -y     && apt-get install -y python${PYTHON_VERSION} python${PYTHON_VERSION}-dev python${PYTHON_VERSION}-venv libibverbs-dev     && update-alternatives --install /usr/bin/python3 python3 /usr/bin/python${PYTHON_VERSION} 1     && update-alternatives --set python3 /usr/bin/python${PYTHON_VERSION}     && ln -sf /usr/bin/python${PYTHON_VERSION}-config /usr/bin/python3-config     && curl -sS https://bootstrap.pypa.io/get-pip.py | python${PYTHON_VERSION}     && python3 --version && python3 -m pip --version # buildkit
                        
# 2024-09-05 07:36:15  136.00B Run a command and create a new image layer
RUN |2 CUDA_VERSION=12.4.1 PYTHON_VERSION=3.10 /bin/sh -c PYTHON_VERSION_STR=$(echo ${PYTHON_VERSION} | sed 's/\.//g') &&     echo "export PYTHON_VERSION_STR=${PYTHON_VERSION_STR}" >> /etc/environment # buildkit
                        
# 2024-07-23 15:03:19  0.00B Set environment variable DEBIAN_FRONTEND
ENV DEBIAN_FRONTEND=noninteractive
                        
# 2024-07-23 15:03:19  0.00B Set the working directory to /vllm-workspace
WORKDIR /vllm-workspace
                        
# 2024-07-23 15:03:19  0.00B Define a build argument
ARG PYTHON_VERSION=3.10
                        
# 2024-07-23 15:03:19  0.00B Define a build argument
ARG CUDA_VERSION=12.4.1
                        
# 2024-04-23 07:42:36  0.00B Set environment variable NVIDIA_DRIVER_CAPABILITIES
ENV NVIDIA_DRIVER_CAPABILITIES=compute,utility
                        
# 2024-04-23 07:42:36  0.00B Set environment variable NVIDIA_VISIBLE_DEVICES
ENV NVIDIA_VISIBLE_DEVICES=all
                        
# 2024-04-23 07:42:36  17.29KB Copy new files or directories into the container
COPY NGC-DL-CONTAINER-LICENSE / # buildkit
                        
# 2024-04-23 07:42:36  0.00B Set environment variable LD_LIBRARY_PATH
ENV LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64
                        
# 2024-04-23 07:42:36  0.00B Set environment variable PATH
ENV PATH=/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2024-04-23 07:42:36  46.00B Run a command and create a new image layer
RUN |1 TARGETARCH=amd64 /bin/sh -c echo "/usr/local/nvidia/lib" >> /etc/ld.so.conf.d/nvidia.conf     && echo "/usr/local/nvidia/lib64" >> /etc/ld.so.conf.d/nvidia.conf # buildkit
                        
# 2024-04-23 07:42:36  155.93MB Run a command and create a new image layer
RUN |1 TARGETARCH=amd64 /bin/sh -c apt-get update && apt-get install -y --no-install-recommends     cuda-cudart-12-4=${NV_CUDA_CUDART_VERSION}     ${NV_CUDA_COMPAT_PACKAGE}     && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2024-04-23 07:42:24  0.00B Set environment variable CUDA_VERSION
ENV CUDA_VERSION=12.4.1
                        
# 2024-04-23 07:42:24  18.32MB Run a command and create a new image layer
RUN |1 TARGETARCH=amd64 /bin/sh -c apt-get update && apt-get install -y --no-install-recommends     gnupg2 curl ca-certificates &&     curl -fsSL https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/${NVARCH}/3bf863cc.pub | apt-key add - &&     echo "deb https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/${NVARCH} /" > /etc/apt/sources.list.d/cuda.list &&     apt-get purge --autoremove -y curl     && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2024-04-23 07:42:24  0.00B Add a metadata label
LABEL maintainer=NVIDIA CORPORATION <cudatools@nvidia.com>
                        
# 2024-04-23 07:42:24  0.00B Define a build argument
ARG TARGETARCH
                        
# 2024-04-23 07:42:24  0.00B Set environment variable NV_CUDA_COMPAT_PACKAGE
ENV NV_CUDA_COMPAT_PACKAGE=cuda-compat-12-4
                        
# 2024-04-23 07:42:24  0.00B Set environment variable NV_CUDA_CUDART_VERSION
ENV NV_CUDA_CUDART_VERSION=12.4.127-1
                        
# 2024-04-23 07:42:24  0.00B Set environment variable NVIDIA_REQUIRE_CUDA
ENV NVIDIA_REQUIRE_CUDA=cuda>=12.4 brand=tesla,driver>=470,driver<471 brand=unknown,driver>=470,driver<471 brand=nvidia,driver>=470,driver<471 brand=nvidiartx,driver>=470,driver<471 brand=geforce,driver>=470,driver<471 brand=geforcertx,driver>=470,driver<471 brand=quadro,driver>=470,driver<471 brand=quadrortx,driver>=470,driver<471 brand=titan,driver>=470,driver<471 brand=titanrtx,driver>=470,driver<471 brand=tesla,driver>=525,driver<526 brand=unknown,driver>=525,driver<526 brand=nvidia,driver>=525,driver<526 brand=nvidiartx,driver>=525,driver<526 brand=geforce,driver>=525,driver<526 brand=geforcertx,driver>=525,driver<526 brand=quadro,driver>=525,driver<526 brand=quadrortx,driver>=525,driver<526 brand=titan,driver>=525,driver<526 brand=titanrtx,driver>=525,driver<526 brand=tesla,driver>=535,driver<536 brand=unknown,driver>=535,driver<536 brand=nvidia,driver>=535,driver<536 brand=nvidiartx,driver>=535,driver<536 brand=geforce,driver>=535,driver<536 brand=geforcertx,driver>=535,driver<536 brand=quadro,driver>=535,driver<536 brand=quadrortx,driver>=535,driver<536 brand=titan,driver>=535,driver<536 brand=titanrtx,driver>=535,driver<536
                        
# 2024-04-23 07:42:24  0.00B Set environment variable NVARCH
ENV NVARCH=x86_64
                        
# 2024-04-11 02:50:37  0.00B 
/bin/sh -c #(nop)  CMD ["/bin/bash"]
                        
# 2024-04-11 02:50:37  72.81MB 
/bin/sh -c #(nop) ADD file:ea2128e23dce0162557abadd80656bd5ae047d573095d1d4323eb4154490dfdc in / 
                        
# 2024-04-11 02:50:35  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.version=20.04
                        
# 2024-04-11 02:50:35  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.ref.name=ubuntu
                        
# 2024-04-11 02:50:35  0.00B 
/bin/sh -c #(nop)  ARG LAUNCHPAD_BUILD_ARCH
                        
# 2024-04-11 02:50:35  0.00B 
/bin/sh -c #(nop)  ARG RELEASE
                        
                    

Image information

{
    "Id": "sha256:ce854f81084e78a46035ef854991b50f0d47d65460b1d7f6124be72e49c2fe29",
    "RepoTags": [
        "xprobe/xinference:v1.4.0",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:v1.4.0"
    ],
    "RepoDigests": [
        "xprobe/xinference@sha256:159d0d98c5a417311637b27e18105db644fdea5912f9b6f18670087e9a8108f7",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference@sha256:8897f14abfd9bc10075410e7b8bf0256ad50a8a342000ff09a02362eec8d259e"
    ],
    "Parent": "",
    "Comment": "",
    "Created": "2025-03-21T07:43:05.607245055Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "20.10.17",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/usr/local/nvm/versions/node/v14.21.1/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "NVARCH=x86_64",
            "NVIDIA_REQUIRE_CUDA=cuda\u003e=12.4 brand=tesla,driver\u003e=470,driver\u003c471 brand=unknown,driver\u003e=470,driver\u003c471 brand=nvidia,driver\u003e=470,driver\u003c471 brand=nvidiartx,driver\u003e=470,driver\u003c471 brand=geforce,driver\u003e=470,driver\u003c471 brand=geforcertx,driver\u003e=470,driver\u003c471 brand=quadro,driver\u003e=470,driver\u003c471 brand=quadrortx,driver\u003e=470,driver\u003c471 brand=titan,driver\u003e=470,driver\u003c471 brand=titanrtx,driver\u003e=470,driver\u003c471 brand=tesla,driver\u003e=525,driver\u003c526 brand=unknown,driver\u003e=525,driver\u003c526 brand=nvidia,driver\u003e=525,driver\u003c526 brand=nvidiartx,driver\u003e=525,driver\u003c526 brand=geforce,driver\u003e=525,driver\u003c526 brand=geforcertx,driver\u003e=525,driver\u003c526 brand=quadro,driver\u003e=525,driver\u003c526 brand=quadrortx,driver\u003e=525,driver\u003c526 brand=titan,driver\u003e=525,driver\u003c526 brand=titanrtx,driver\u003e=525,driver\u003c526 brand=tesla,driver\u003e=535,driver\u003c536 brand=unknown,driver\u003e=535,driver\u003c536 brand=nvidia,driver\u003e=535,driver\u003c536 brand=nvidiartx,driver\u003e=535,driver\u003c536 brand=geforce,driver\u003e=535,driver\u003c536 brand=geforcertx,driver\u003e=535,driver\u003c536 brand=quadro,driver\u003e=535,driver\u003c536 brand=quadrortx,driver\u003e=535,driver\u003c536 brand=titan,driver\u003e=535,driver\u003c536 brand=titanrtx,driver\u003e=535,driver\u003c536",
            "NV_CUDA_CUDART_VERSION=12.4.127-1",
            "NV_CUDA_COMPAT_PACKAGE=cuda-compat-12-4",
            "CUDA_VERSION=12.4.1",
            "LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64:/usr/local/lib/python3.10/dist-packages/nvidia/cublas/lib",
            "NVIDIA_VISIBLE_DEVICES=all",
            "NVIDIA_DRIVER_CAPABILITIES=compute,utility",
            "DEBIAN_FRONTEND=noninteractive",
            "VLLM_USAGE_SOURCE=production-docker-image",
            "NVM_DIR=/usr/local/nvm",
            "NODE_VERSION=14.21.1"
        ],
        "Cmd": [
            "/bin/bash"
        ],
        "Image": "sha256:e30d978130069b4ed0bc82ed1e79cb6d730dd76864d66ea02783bbce6175629d",
        "Volumes": null,
        "WorkingDir": "/opt/inference",
        "Entrypoint": null,
        "OnBuild": null,
        "Labels": {
            "maintainer": "NVIDIA CORPORATION \u003ccudatools@nvidia.com\u003e",
            "org.opencontainers.image.ref.name": "ubuntu",
            "org.opencontainers.image.version": "20.04"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 26274527506,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/3ff7fe52e614d5409be9843a778dcf037aa3c1caf81bffc19acc94c0fd6a77b0/diff:/var/lib/docker/overlay2/29fe523c40ba24a443511f1350353fc56fe0298196d865a70a00d0d3b19676c6/diff:/var/lib/docker/overlay2/7f582965ab37ea233ceaa5e0e623e6c7f1d45fa0a52fc09ec516d19dcd95340b/diff:/var/lib/docker/overlay2/a686ba5c7d4953c014ad465dd3a52fdec51abd8ebda887be88c7529fd5bef085/diff:/var/lib/docker/overlay2/a6719a04d49d72dbff3dfc845cff808a5f2da0a7a604ce135c606404ff89af3f/diff:/var/lib/docker/overlay2/420855220a7707eb039e01dc96fdb49e0fcf494c20b09587f2ba6173b88de1f4/diff:/var/lib/docker/overlay2/82241de74a9ab6232aaf33bf9c668750d3d4dc67297fc88f1546ab83e6ea0183/diff:/var/lib/docker/overlay2/aea2d330d7d8409a241c7d7e56f8a82b5c04676b80387e0ff1c62a8ea2afb457/diff:/var/lib/docker/overlay2/a7dbcdcffad48cf54b08071ba8c289f856d2591bbba2b90eb7e1da65cff81b3f/diff:/var/lib/docker/overlay2/8cd8a3d50056dcc1e0cfb5a3c5fa2131db579e7b4d0fbb48ec704fb0515efbd5/diff:/var/lib/docker/overlay2/ffa224a2b2a2577e821891624d82e820a960cb3a99ba005fdfaf2f0647cd55ee/diff:/var/lib/docker/overlay2/0e6a6f4e5573529fa558e6b24566b38c6a84719b789ed96fb3009246f453765e/diff:/var/lib/docker/overlay2/dab94514c02fdb2620589f34a5bc510c4d563497315aafb167384c4513545546/diff:/var/lib/docker/overlay2/7adf3f9f81928cd4a4354e04808a47a1310fd41b49e578a96651cb49d5de79e3/diff:/var/lib/docker/overlay2/658950ec1e0e5fc86cb0d943568f69cb9ee68b401873dba7cdc367fb861d436d/diff:/var/lib/docker/overlay2/f684c3053f2c10cf7db88609b35810a4687299b260d0dcb69cd75c9f7ac79b37/diff",
            "MergedDir": "/var/lib/docker/overlay2/52c8e4fa909dbf5198a5ce0b4fb9be508e6cf72cee7e4a7efb95a3b68bb35ed4/merged",
            "UpperDir": "/var/lib/docker/overlay2/52c8e4fa909dbf5198a5ce0b4fb9be508e6cf72cee7e4a7efb95a3b68bb35ed4/diff",
            "WorkDir": "/var/lib/docker/overlay2/52c8e4fa909dbf5198a5ce0b4fb9be508e6cf72cee7e4a7efb95a3b68bb35ed4/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:106e8431b412f51ccd75ea46a2d5cb4343b23273cbcf50188377cb93aa9a6d82",
            "sha256:cd76869b72ab2a56badf9a068c3f6231203c316e6d1f7a9206e1a9a1a8009fd4",
            "sha256:2ecbf7829cb760264b926dfd4b3cd036c1bdbb50d311bb1c31a6c217ca208bb4",
            "sha256:5136ffc45974618abb8b5ac96d2be9a16fecf79d6ab6759d896f5355c52f10b9",
            "sha256:809d3bb9c80fb3d31d4c061ba0b38ba4e83b6329e33c2cb2bbf27251a8e527c6",
            "sha256:f64d335e5a99796db8621cc422978eba0bb9fbde78c7af53ca1867d545fde504",
            "sha256:2c1ed2c1da9656089f29d332d556e64364a9d7d3baf34f653823afc0fa33926e",
            "sha256:6e44a86d7e24ab88030cf839ab32d42a95208f9277b5906041b7a35ae7ddf21b",
            "sha256:3d42f132c006a5c80dbffaf9c2e38cfcaf379037b6b88273490e307a10f1a8ef",
            "sha256:3f3eb8c7c911cf80afc93c8ebd25b05e177a984d216a8cab762060d0f5395150",
            "sha256:220ab014ee964b4828b3fdc4d6cd323cbdeb9b4a237f47d2dd926f8fec617d36",
            "sha256:5d2b4a0ead904aadca0a0b294ae31c19a972e6c487a1be208d007d387635d556",
            "sha256:4e86c7fc05460c45921b651bdc94fb074bf0177aef3d5bfc3e0d690dc838cf89",
            "sha256:c692b8b4f7023688063dbc0aaff5e74f5b5057f62098bce735473ce7f7bee713",
            "sha256:3796e48046d9256d7b4fcc38a2c10a1f1cd47a6f5b6d46ad46d2850edf6fb0f9",
            "sha256:1d2744f6cc30d74cd2cf7ab2726f134894395afc67051da673134709e8973a30",
            "sha256:e71d6f5f7faa44c2c0f659944bba6b698d7b495c0edae21a612707570e81bb8b"
        ]
    },
    "Metadata": {
        "LastTagTime": "2025-03-22T01:19:30.339898601+08:00"
    }
}
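The block above is standard docker image inspect output. To check that a locally pulled copy matches this record, compare the Id field (a sketch):

docker image inspect --format '{{.Id}}' swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/xprobe/xinference:v1.4.0
# expected: sha256:ce854f81084e78a46035ef854991b50f0d47d65460b1d7f6124be72e49c2fe29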

More versions

Image tag                                  Platform     Registry   Size     Synced            Views
docker.io/xprobe/xinference:v0.13.3        linux/amd64  docker.io  15.44GB  2024-08-03 22:15  688
docker.io/xprobe/xinference:v0.14.0        linux/amd64  docker.io  15.53GB  2024-08-05 11:11  531
docker.io/xprobe/xinference:latest-cpu     linux/amd64  docker.io  6.75GB   2024-09-07 01:27  402
docker.io/xprobe/xinference:v0.12.1        linux/amd64  docker.io  26.68GB  2024-09-07 03:07  238
docker.io/xprobe/xinference:v0.15.0-cpu    linux/amd64  docker.io  6.75GB   2024-09-19 22:49  263
docker.io/xprobe/xinference:v0.12.0        linux/amd64  docker.io  27.66GB  2024-09-25 04:01  225
docker.io/xprobe/xinference:v0.15.2        linux/amd64  docker.io  17.55GB  2024-09-30 13:35  203
docker.io/xprobe/xinference:v0.15.4        linux/amd64  docker.io  17.54GB  2024-10-16 01:44  286
docker.io/xprobe/xinference:v0.16.3        linux/amd64  docker.io  17.59GB  2024-11-13 00:44  158
docker.io/xprobe/xinference:v1.0.0         linux/amd64  docker.io  17.62GB  2024-11-19 00:15  179
docker.io/xprobe/xinference:v1.0.1         linux/amd64  docker.io  17.60GB  2024-12-04 00:49  158
docker.io/xprobe/xinference:v1.1.0         linux/amd64  docker.io  18.25GB  2024-12-18 00:16  217
docker.io/xprobe/xinference:v1.2.0         linux/amd64  docker.io  17.01GB  2025-01-15 00:31  114
docker.io/xprobe/xinference:v1.2.1         linux/amd64  docker.io  23.34GB  2025-01-27 00:55  223
docker.io/xprobe/xinference:v1.2.2         linux/amd64  docker.io  23.55GB  2025-02-13 01:30  255
docker.io/xprobe/xinference:v1.3.0         linux/amd64  docker.io  23.65GB  2025-03-03 01:51  274
docker.io/xprobe/xinference:v1.3.1.post1   linux/amd64  docker.io  25.86GB  2025-03-13 01:23  126
docker.io/xprobe/xinference:v1.3.1         linux/amd64  docker.io  25.83GB  2025-03-19 01:04  160
docker.io/xprobe/xinference:v1.4.0         linux/amd64  docker.io  26.27GB  2025-03-22 01:34  216
docker.io/xprobe/xinference:v0.15.4-cpu    linux/amd64  docker.io  6.78GB   2025-04-01 20:33  12
docker.io/xprobe/xinference:v1.4.0-cpu     linux/amd64  docker.io  8.82GB   2025-04-02 00:33  16