docker.io/nvcr.io/nvidia/tritonserver:23.05-py3 linux/amd64

docker.io/nvcr.io/nvidia/tritonserver:23.05-py3 - China mirror download source

This is a Docker image of NVIDIA Triton Inference Server. Triton Inference Server is a high-performance inference server for deploying deep learning models from multiple frameworks (e.g. TensorFlow, PyTorch, TensorRT), providing model version management, model deployment, and efficient inference serving.
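As a quick reference, a minimal way to serve models with this image (a sketch: it assumes the NVIDIA container toolkit is installed on the host and that ./model_repository is a placeholder for your own Triton model repository) is:

docker run --gpus=all --rm -p 8000:8000 -p 8001:8001 -p 8002:8002 -v $PWD/model_repository:/models swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/nvcr.io/nvidia/tritonserver:23.05-py3 tritonserver --model-repository=/models

Ports 8000, 8001, and 8002 are Triton's default HTTP, gRPC, and metrics ports.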

Source image: docker.io/nvcr.io/nvidia/tritonserver:23.05-py3
China mirror: swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/nvcr.io/nvidia/tritonserver:23.05-py3
Image ID: sha256:b89e3ec436749c7c24b0d142afe88aecd434e4c948dfd45e6cd3b04a2ff148e5
Image tag: 23.05-py3
Size: 12.88GB
Registry: docker.io
CMD: (not set)
Entrypoint: /opt/nvidia/nvidia_entrypoint.sh
Working directory: /opt/tritonserver
OS/Arch: linux/amd64
Views: 10
Image created: 2023-05-26T22:28:45.142127653Z
Synced: 2026-01-30 00:29
Updated: 2026-01-30 04:13
Environment Variables
PATH=/opt/tritonserver/bin:/usr/local/mpi/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/ucx/bin CUDA_VERSION=12.1.1.009 CUDA_DRIVER_VERSION=530.30.02 CUDA_CACHE_DISABLE=1 NVIDIA_REQUIRE_JETPACK_HOST_MOUNTS= _CUDA_COMPAT_PATH=/usr/local/cuda/compat ENV=/etc/shinit_v2 BASH_ENV=/etc/bash.bashrc SHELL=/bin/bash NVIDIA_REQUIRE_CUDA=cuda>=9.0 NCCL_VERSION=2.18.1 CUBLAS_VERSION=12.1.3.1 CUFFT_VERSION=11.0.2.54 CURAND_VERSION=10.3.2.106 CUSPARSE_VERSION=12.1.0.106 CUSOLVER_VERSION=11.4.5.107 CUTENSOR_VERSION=1.7.0.1 NPP_VERSION=12.1.0.4 NVJPEG_VERSION=12.2.0.2 CUDNN_VERSION=8.9.1.23 TRT_VERSION=8.6.1.2+cuda12.0.1.011 TRTOSS_VERSION=23.05 NSIGHT_SYSTEMS_VERSION=2023.2.1.89 NSIGHT_COMPUTE_VERSION=2023.1.1.4 DALI_VERSION=1.25.0 DALI_BUILD=7922358 POLYGRAPHY_VERSION=0.47.1 TRANSFORMER_ENGINE_VERSION=0.8 LD_LIBRARY_PATH=/opt/tritonserver/backends/onnxruntime:/usr/local/cuda/compat/lib:/usr/local/nvidia/lib:/usr/local/nvidia/lib64 NVIDIA_VISIBLE_DEVICES=all NVIDIA_DRIVER_CAPABILITIES=compute,utility,video NVIDIA_PRODUCT_NAME=Triton Server GDRCOPY_VERSION=2.3 HPCX_VERSION=2.14 MOFED_VERSION=5.4-rdmacore36.0 OPENUCX_VERSION=1.14.0 OPENMPI_VERSION=4.1.4 RDMACORE_VERSION=36.0 OPAL_PREFIX=/opt/hpcx/ompi OMPI_MCA_coll_hcoll_enable=0 LIBRARY_PATH=/usr/local/cuda/lib64/stubs: TRITON_SERVER_VERSION=2.34.0 NVIDIA_TRITON_SERVER_VERSION=23.05 TF_ADJUST_HUE_FUSED=1 TF_ADJUST_SATURATION_FUSED=1 TF_ENABLE_WINOGRAD_NONFUSED=1 TF_AUTOTUNE_THRESHOLD=2 TRITON_SERVER_GPU_ENABLED=1 TRITON_SERVER_USER=triton-server DEBIAN_FRONTEND=noninteractive TCMALLOC_RELEASE_RATE=200 DCGM_VERSION=2.4.7 NVIDIA_BUILD_ID=61161506
Image Labels
com.amazonaws.sagemaker.capabilities.accept-bind-to-port=true com.amazonaws.sagemaker.capabilities.multi-models=true com.nvidia.build.id=61161506 com.nvidia.build.ref=cef5288e1d981eb7cf43622960293e7f2c4aae5f com.nvidia.cublas.version=12.1.3.1 com.nvidia.cuda.version=9.0 com.nvidia.cudnn.version=8.9.1.23 com.nvidia.cufft.version=11.0.2.54 com.nvidia.curand.version=10.3.2.106 com.nvidia.cusolver.version=11.4.5.107 com.nvidia.cusparse.version=12.1.0.106 com.nvidia.cutensor.version=1.7.0.1 com.nvidia.nccl.version=2.18.1 com.nvidia.npp.version=12.1.0.4 com.nvidia.nsightcompute.version=2023.1.1.4 com.nvidia.nsightsystems.version=2023.2.1.89 com.nvidia.nvjpeg.version=12.2.0.2 com.nvidia.tensorrt.version=8.6.1.2+cuda12.0.1.011 com.nvidia.tensorrtoss.version=23.05 com.nvidia.tritonserver.version=2.34.0 com.nvidia.volumes.needed=nvidia_driver org.opencontainers.image.ref.name=ubuntu org.opencontainers.image.version=22.04
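These labels can also be used to verify a locally pulled copy of the image; for example, an optional check using Docker's label filter:

docker images --filter "label=com.nvidia.tritonserver.version=2.34.0"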

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/nvcr.io/nvidia/tritonserver:23.05-py3
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/nvcr.io/nvidia/tritonserver:23.05-py3  docker.io/nvcr.io/nvidia/tritonserver:23.05-py3
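As an optional check, the image ID of the re-tagged image can be compared against the Image ID shown above:

docker inspect --format '{{.Id}}' docker.io/nvcr.io/nvidia/tritonserver:23.05-py3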

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/nvcr.io/nvidia/tritonserver:23.05-py3
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/nvcr.io/nvidia/tritonserver:23.05-py3  docker.io/nvcr.io/nvidia/tritonserver:23.05-py3
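On Kubernetes nodes where the kubelet uses containerd, images generally need to live in the k8s.io namespace to be visible to the kubelet. If that applies to your cluster, the same commands can be run with the -n k8s.io flag, for example:

ctr -n k8s.io images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/nvcr.io/nvidia/tritonserver:23.05-py3
ctr -n k8s.io images tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/nvcr.io/nvidia/tritonserver:23.05-py3 docker.io/nvcr.io/nvidia/tritonserver:23.05-py3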

Shell quick-replace command

sed -i 's#nvcr.io/nvidia/tritonserver:23.05-py3#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/nvcr.io/nvidia/tritonserver:23.05-py3#' deployment.yaml
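If the image reference appears in several manifests, the same substitution can be applied to every matching YAML file under the current directory; a sketch (adjust the glob and paths to your repository layout):

grep -rl 'nvcr.io/nvidia/tritonserver:23.05-py3' --include='*.yaml' . | xargs sed -i 's#nvcr.io/nvidia/tritonserver:23.05-py3#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/nvcr.io/nvidia/tritonserver:23.05-py3#'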

Ansible quick distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/nvcr.io/nvidia/tritonserver:23.05-py3 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/nvcr.io/nvidia/tritonserver:23.05-py3  docker.io/nvcr.io/nvidia/tritonserver:23.05-py3'

Ansible quick distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/nvcr.io/nvidia/tritonserver:23.05-py3 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/nvcr.io/nvidia/tritonserver:23.05-py3  docker.io/nvcr.io/nvidia/tritonserver:23.05-py3'
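After distribution, a quick sanity check that every node now carries the tag (assuming the same k8s inventory group used in the commands above) could be:

#ansible k8s -m shell -a 'docker images docker.io/nvcr.io/nvidia/tritonserver:23.05-py3'
#ansible k8s -m shell -a 'ctr images ls | grep 23.05-py3'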

Image build history


# 2023-05-27 06:28:45  6.93KB 
/bin/sh -c #(nop) COPY --chown=1000:1000file:7ba4b93a8f8ee0495fd0ad42d82c40984446acbb0b828a0e715c75eebba2f542 in /usr/bin/. 
                        
# 2023-05-27 06:28:44  0.00B 
/bin/sh -c #(nop)  LABEL com.amazonaws.sagemaker.capabilities.multi-models=true
                        
# 2023-05-27 06:28:43  0.00B 
/bin/sh -c #(nop)  LABEL com.amazonaws.sagemaker.capabilities.accept-bind-to-port=true
                        
# 2023-05-27 06:28:43  3.01MB 
/bin/sh -c #(nop) COPY --chown=1000:1000file:6d5d6be54a7e1bc76ff422ff2a96234d0a7877b5edacfdd48581e009ee550303 in . 
                        
# 2023-05-27 06:28:42  0.00B 
/bin/sh -c #(nop) WORKDIR /opt/tritonserver
                        
# 2023-05-27 06:28:17  5.23GB 
/bin/sh -c #(nop) COPY --chown=1000:1000dir:2644b16cf3c4a8ec9b018f2500ae54f7a014b293ecfd623ba345700a7bd2d113 in tritonserver 
                        
# 2023-05-27 06:27:10  0.00B 
/bin/sh -c #(nop) WORKDIR /opt
                        
# 2023-05-27 06:27:10  0.00B 
/bin/sh -c #(nop)  LABEL com.nvidia.build.ref=cef5288e1d981eb7cf43622960293e7f2c4aae5f
                        
# 2023-05-27 06:27:10  0.00B 
/bin/sh -c #(nop)  LABEL com.nvidia.build.id=61161506
                        
# 2023-05-27 06:27:10  0.00B 
/bin/sh -c #(nop)  ENV NVIDIA_BUILD_ID=61161506
                        
# 2023-05-27 06:27:09  733.00B 
/bin/sh -c #(nop) COPY dir:60f804cc97daeac125a0631b2cbf94981378694edb14de2191d325a32ab7f093 in /opt/nvidia/entrypoint.d/ 
                        
# 2023-05-27 06:27:09  0.00B 
/bin/sh -c #(nop)  ENV NVIDIA_PRODUCT_NAME=Triton Server
                        
# 2023-05-27 06:27:09  0.00B 
|2 TRITON_CONTAINER_VERSION=23.05 TRITON_VERSION=2.34.0 /bin/sh -c rm -fr /opt/tritonserver/*
                        
# 2023-05-27 06:27:08  0.00B 
/bin/sh -c #(nop) WORKDIR /opt/tritonserver
                        
# 2023-05-27 06:27:04  146.51MB 
|2 TRITON_CONTAINER_VERSION=23.05 TRITON_VERSION=2.34.0 /bin/sh -c apt-get update &&     apt-get install -y --no-install-recommends             python3 libarchive-dev             python3-pip             libpython3-dev &&     pip3 install --upgrade pip &&     pip3 install --upgrade wheel setuptools &&     pip3 install --upgrade numpy &&     rm -rf /var/lib/apt/lists/*
                        
# 2023-05-27 06:26:29  49.29KB 
|2 TRITON_CONTAINER_VERSION=23.05 TRITON_VERSION=2.34.0 /bin/sh -c ln -sf ${_CUDA_COMPAT_PATH}/lib.real ${_CUDA_COMPAT_PATH}/lib  && echo ${_CUDA_COMPAT_PATH}/lib > /etc/ld.so.conf.d/00-cuda-compat.conf  && ldconfig  && rm -f ${_CUDA_COMPAT_PATH}/lib
                        
# 2023-05-27 06:26:26  701.59MB 
|2 TRITON_CONTAINER_VERSION=23.05 TRITON_VERSION=2.34.0 /bin/sh -c curl -o /tmp/cuda-keyring.deb     https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/cuda-keyring_1.0-1_all.deb     && apt install /tmp/cuda-keyring.deb && rm /tmp/cuda-keyring.deb &&     apt-get update && apt-get install -y datacenter-gpu-manager=1:2.4.7
                        
# 2023-05-27 06:25:53  0.00B 
/bin/sh -c #(nop)  ENV DCGM_VERSION=2.4.7
                        
# 2023-05-22 03:25:45  0.00B 
/bin/sh -c #(nop)  ENV TCMALLOC_RELEASE_RATE=200
                        
# 2023-05-22 03:25:43  109.98MB 
|2 TRITON_CONTAINER_VERSION=23.05 TRITON_VERSION=2.34.0 /bin/sh -c apt-get update &&     apt-get install -y --no-install-recommends             software-properties-common             libb64-0d             libcurl4-openssl-dev             libre2-9             git             gperf             dirmngr             libgoogle-perftools-dev             libnuma-dev             curl             libjemalloc-dev             libgomp1 &&     rm -rf /var/lib/apt/lists/*
                        
# 2023-05-22 03:24:50  0.00B 
/bin/sh -c #(nop)  ENV DEBIAN_FRONTEND=noninteractive
                        
# 2023-05-22 03:24:49  329.08KB 
|2 TRITON_CONTAINER_VERSION=23.05 TRITON_VERSION=2.34.0 /bin/sh -c userdel tensorrt-server > /dev/null 2>&1 || true &&     if ! id -u $TRITON_SERVER_USER > /dev/null 2>&1 ; then         useradd $TRITON_SERVER_USER;     fi &&     [ `id -u $TRITON_SERVER_USER` -eq 1000 ] &&     [ `id -g $TRITON_SERVER_USER` -eq 1000 ]
                        
# 2023-05-22 03:24:47  0.00B 
/bin/sh -c #(nop)  ENV TRITON_SERVER_USER=triton-server
                        
# 2023-05-22 03:24:47  0.00B 
/bin/sh -c #(nop)  ENV TRITON_SERVER_GPU_ENABLED=1
                        
# 2023-05-22 03:24:46  0.00B 
/bin/sh -c #(nop)  ENV TF_AUTOTUNE_THRESHOLD=2
                        
# 2023-05-22 03:24:46  0.00B 
/bin/sh -c #(nop)  ENV TF_ENABLE_WINOGRAD_NONFUSED=1
                        
# 2023-05-22 03:24:46  0.00B 
/bin/sh -c #(nop)  ENV TF_ADJUST_SATURATION_FUSED=1
                        
# 2023-05-22 03:24:46  0.00B 
/bin/sh -c #(nop)  ENV TF_ADJUST_HUE_FUSED=1
                        
# 2023-05-22 03:24:46  0.00B 
/bin/sh -c #(nop)  ENV LD_LIBRARY_PATH=/opt/tritonserver/backends/onnxruntime:/usr/local/cuda/compat/lib:/usr/local/nvidia/lib:/usr/local/nvidia/lib64
                        
# 2023-05-22 03:24:45  0.00B 
/bin/sh -c #(nop)  ENV PATH=/opt/tritonserver/bin:/usr/local/mpi/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/ucx/bin
                        
# 2023-05-22 03:24:45  0.00B 
/bin/sh -c #(nop)  LABEL com.nvidia.tritonserver.version=2.34.0
                        
# 2023-05-22 03:24:45  0.00B 
/bin/sh -c #(nop)  ENV NVIDIA_TRITON_SERVER_VERSION=23.05
                        
# 2023-05-22 03:24:44  0.00B 
/bin/sh -c #(nop)  ENV TRITON_SERVER_VERSION=2.34.0
                        
# 2023-05-22 03:24:43  0.00B 
/bin/sh -c #(nop)  ARG TRITON_CONTAINER_VERSION
                        
# 2023-05-22 03:24:42  0.00B 
/bin/sh -c #(nop)  ARG TRITON_VERSION
                        
# 2023-05-02 05:15:07  83.29MB Run command and create a new image layer
RUN |9 GDRCOPY_VERSION=2.3 HPCX_VERSION=2.14 RDMACORE_VERSION=36.0 MOFED_VERSION=5.4-rdmacore36.0 OPENUCX_VERSION=1.14.0 OPENMPI_VERSION=4.1.4 TARGETARCH=amd64 HPCX_CUDA_MAJMIN=11.0 HPCX_CUDA_VERSION=11.8.0.065 /bin/sh -c if find /opt/hpcx/ -name "*so.*[0-9]" -type f -print -exec ldd {} \; | grep "not found" | sort -u | grep -q "libcudart.so.${HPCX_CUDA_MAJMIN} => not found"; then       echo "hpcx version depends on CUDA ${HPCX_CUDA_MAJMIN} so installing libcudart from ${HPCX_CUDA_VERSION}"  &&       BASE=2 /nvidia/build-scripts/installCUDA.sh ${HPCX_CUDA_VERSION};     fi # buildkit
                        
# 2023-05-02 05:14:55  0.00B Define build argument
ARG HPCX_CUDA_VERSION=11.8.0.065

# 2023-05-02 05:14:55  0.00B Define build argument
ARG HPCX_CUDA_MAJMIN=11.0

# 2023-05-02 05:14:55  0.00B Set environment variable LIBRARY_PATH
ENV LIBRARY_PATH=/usr/local/cuda/lib64/stubs:
                        
# 2023-05-02 05:14:55  868.07MB Run command and create a new image layer
RUN |7 GDRCOPY_VERSION=2.3 HPCX_VERSION=2.14 RDMACORE_VERSION=36.0 MOFED_VERSION=5.4-rdmacore36.0 OPENUCX_VERSION=1.14.0 OPENMPI_VERSION=4.1.4 TARGETARCH=amd64 /bin/sh -c export DEVEL=1 BASE=0  && /nvidia/build-scripts/installNCU.sh  && /nvidia/build-scripts/installCUDA.sh  && /nvidia/build-scripts/installLIBS.sh  && /nvidia/build-scripts/installNCCL.sh  && /nvidia/build-scripts/installCUDNN.sh  && /nvidia/build-scripts/installCUTENSOR.sh  && /nvidia/build-scripts/installTRT.sh  && /nvidia/build-scripts/installNSYS.sh  && if [ -f "/tmp/cuda-${_CUDA_VERSION_MAJMIN}.patch" ]; then patch -p0 < /tmp/cuda-${_CUDA_VERSION_MAJMIN}.patch; fi  && rm -f /tmp/cuda-*.patch # buildkit
                        
# 2023-05-02 05:08:47  1.49KB Copy new files or directories into the container
COPY cuda-*.patch /tmp # buildkit

# 2023-05-02 05:08:47  0.00B Set environment variable OMPI_MCA_coll_hcoll_enable
ENV OMPI_MCA_coll_hcoll_enable=0

# 2023-05-02 05:08:47  0.00B Set environment variables OPAL_PREFIX PATH
ENV OPAL_PREFIX=/opt/hpcx/ompi PATH=/usr/local/mpi/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/ucx/bin
                        
# 2023-05-02 05:08:47  183.79MB Run command and create a new image layer
RUN |7 GDRCOPY_VERSION=2.3 HPCX_VERSION=2.14 RDMACORE_VERSION=36.0 MOFED_VERSION=5.4-rdmacore36.0 OPENUCX_VERSION=1.14.0 OPENMPI_VERSION=4.1.4 TARGETARCH=amd64 /bin/sh -c cd /nvidia  && ( cd opt/rdma-core/                             && dpkg -i libibverbs1_*.deb                            libibverbs-dev_*.deb                         librdmacm1_*.deb                             librdmacm-dev_*.deb                          libibumad3_*.deb                             libibumad-dev_*.deb                          ibverbs-utils_*.deb                          ibverbs-providers_*.deb           && rm $(dpkg-query -L                                    libibverbs-dev                               librdmacm-dev                                libibumad-dev                            | grep "\(\.so\|\.a\)$")          )                                            && ( cd opt/gdrcopy/                              && dpkg -i libgdrapi_*.deb                   )                                         && ( cp -r opt/hpcx /opt/                                         && cp etc/ld.so.conf.d/hpcx.conf /etc/ld.so.conf.d/          && ln -sf /opt/hpcx/ompi /usr/local/mpi                      && ln -sf /opt/hpcx/ucx  /usr/local/ucx                      && sed -i 's/^\(hwloc_base_binding_policy\) = core$/\1 = none/' /opt/hpcx/ompi/etc/openmpi-mca-params.conf         && sed -i 's/^\(btl = self\)$/#\1/'                             /opt/hpcx/ompi/etc/openmpi-mca-params.conf       )                                                         && ldconfig # buildkit
                        
# 2023-05-02 05:08:47  0.00B Define build argument
ARG TARGETARCH

# 2023-05-02 05:08:47  0.00B Set environment variables GDRCOPY_VERSION HPCX_VERSION MOFED_VERSION OPENUCX_VERSION OPENMPI_VERSION RDMACORE_VERSION
ENV GDRCOPY_VERSION=2.3 HPCX_VERSION=2.14 MOFED_VERSION=5.4-rdmacore36.0 OPENUCX_VERSION=1.14.0 OPENMPI_VERSION=4.1.4 RDMACORE_VERSION=36.0

# 2023-05-02 05:08:47  0.00B Define build argument
ARG OPENMPI_VERSION

# 2023-05-02 05:08:47  0.00B Define build argument
ARG OPENUCX_VERSION

# 2023-05-02 05:08:47  0.00B Define build argument
ARG MOFED_VERSION=5.4-rdmacore36.0

# 2023-05-02 05:08:47  0.00B Define build argument
ARG RDMACORE_VERSION

# 2023-05-02 05:08:47  0.00B Define build argument
ARG HPCX_VERSION

# 2023-05-02 05:08:47  0.00B Define build argument
ARG GDRCOPY_VERSION
                        
# 2023-05-02 05:08:41  84.77MB Run command and create a new image layer
RUN /bin/sh -c export DEBIAN_FRONTEND=noninteractive  && apt-get update  && apt-get install -y --no-install-recommends         build-essential         git         libglib2.0-0         less         libnl-route-3-200         libnl-3-dev         libnl-route-3-dev         libnuma-dev         libnuma1         libpmi2-0-dev         nano         numactl         openssh-client         vim         wget  && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2023-05-02 04:56:14  148.72KB Copy new files or directories into the container
COPY NVIDIA_Deep_Learning_Container_License.pdf /workspace/ # buildkit

# 2023-05-02 04:56:13  0.00B Configure the command run when the container starts
ENTRYPOINT ["/opt/nvidia/nvidia_entrypoint.sh"]

# 2023-05-02 04:56:13  0.00B Set environment variable NVIDIA_PRODUCT_NAME
ENV NVIDIA_PRODUCT_NAME=CUDA

# 2023-05-02 04:56:13  14.53KB Copy new files or directories into the container
COPY entrypoint/ /opt/nvidia/ # buildkit

# 2023-05-02 04:56:13  0.00B Set environment variables PATH LD_LIBRARY_PATH NVIDIA_VISIBLE_DEVICES NVIDIA_DRIVER_CAPABILITIES
ENV PATH=/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin LD_LIBRARY_PATH=/usr/local/cuda/compat/lib:/usr/local/nvidia/lib:/usr/local/nvidia/lib64 NVIDIA_VISIBLE_DEVICES=all NVIDIA_DRIVER_CAPABILITIES=compute,utility,video

# 2023-05-02 04:56:13  0.00B Define build argument
ARG _LIBPATH_SUFFIX
                        
# 2023-05-02 04:56:13  46.00B Run command and create a new image layer
RUN |21 CUDA_VERSION=12.1.1.009 CUDA_DRIVER_VERSION=530.30.02 JETPACK_HOST_MOUNTS= NCCL_VERSION=2.18.1 CUBLAS_VERSION=12.1.3.1 CUFFT_VERSION=11.0.2.54 CURAND_VERSION=10.3.2.106 CUSPARSE_VERSION=12.1.0.106 CUSOLVER_VERSION=11.4.5.107 CUTENSOR_VERSION=1.7.0.1 NPP_VERSION=12.1.0.4 NVJPEG_VERSION=12.2.0.2 CUDNN_VERSION=8.9.1.23 TRT_VERSION=8.6.1.2+cuda12.0.1.011 TRTOSS_VERSION=23.05 NSIGHT_SYSTEMS_VERSION=2023.2.1.89 NSIGHT_COMPUTE_VERSION=2023.1.1.4 DALI_VERSION=1.25.0 DALI_BUILD=7922358 POLYGRAPHY_VERSION=0.47.1 TRANSFORMER_ENGINE_VERSION=0.8 /bin/sh -c echo "/usr/local/nvidia/lib" >> /etc/ld.so.conf.d/nvidia.conf  && echo "/usr/local/nvidia/lib64" >> /etc/ld.so.conf.d/nvidia.conf # buildkit
                        
# 2023-05-02 04:56:12  13.39KB Copy files or directories into the container
ADD docs.tgz / # buildkit

# 2023-05-02 04:56:12  0.00B Set environment variables DALI_VERSION DALI_BUILD POLYGRAPHY_VERSION TRANSFORMER_ENGINE_VERSION
ENV DALI_VERSION=1.25.0 DALI_BUILD=7922358 POLYGRAPHY_VERSION=0.47.1 TRANSFORMER_ENGINE_VERSION=0.8

# 2023-05-02 04:56:12  0.00B Define build argument
ARG TRANSFORMER_ENGINE_VERSION

# 2023-05-02 04:56:12  0.00B Define build argument
ARG POLYGRAPHY_VERSION

# 2023-05-02 04:56:12  0.00B Define build argument
ARG DALI_BUILD

# 2023-05-02 04:56:12  0.00B Define build argument
ARG DALI_VERSION

# 2023-05-02 04:56:12  0.00B Add metadata labels
LABEL com.nvidia.nccl.version=2.18.1 com.nvidia.cublas.version=12.1.3.1 com.nvidia.cufft.version=11.0.2.54 com.nvidia.curand.version=10.3.2.106 com.nvidia.cusparse.version=12.1.0.106 com.nvidia.cusolver.version=11.4.5.107 com.nvidia.cutensor.version=1.7.0.1 com.nvidia.npp.version=12.1.0.4 com.nvidia.nvjpeg.version=12.2.0.2 com.nvidia.cudnn.version=8.9.1.23 com.nvidia.tensorrt.version=8.6.1.2+cuda12.0.1.011 com.nvidia.tensorrtoss.version=23.05 com.nvidia.nsightsystems.version=2023.2.1.89 com.nvidia.nsightcompute.version=2023.1.1.4
                        
# 2023-05-02 04:56:12  4.66GB Run command and create a new image layer
RUN |17 CUDA_VERSION=12.1.1.009 CUDA_DRIVER_VERSION=530.30.02 JETPACK_HOST_MOUNTS= NCCL_VERSION=2.18.1 CUBLAS_VERSION=12.1.3.1 CUFFT_VERSION=11.0.2.54 CURAND_VERSION=10.3.2.106 CUSPARSE_VERSION=12.1.0.106 CUSOLVER_VERSION=11.4.5.107 CUTENSOR_VERSION=1.7.0.1 NPP_VERSION=12.1.0.4 NVJPEG_VERSION=12.2.0.2 CUDNN_VERSION=8.9.1.23 TRT_VERSION=8.6.1.2+cuda12.0.1.011 TRTOSS_VERSION=23.05 NSIGHT_SYSTEMS_VERSION=2023.2.1.89 NSIGHT_COMPUTE_VERSION=2023.1.1.4 /bin/sh -c /nvidia/build-scripts/installNCCL.sh  && /nvidia/build-scripts/installLIBS.sh  && /nvidia/build-scripts/installCUDNN.sh  && /nvidia/build-scripts/installTRT.sh  && /nvidia/build-scripts/installNSYS.sh  && /nvidia/build-scripts/installNCU.sh  && /nvidia/build-scripts/installCUTENSOR.sh # buildkit
                        
# 2023-05-02 04:52:46  0.00B Set environment variables NCCL_VERSION CUBLAS_VERSION CUFFT_VERSION CURAND_VERSION CUSPARSE_VERSION CUSOLVER_VERSION CUTENSOR_VERSION NPP_VERSION NVJPEG_VERSION CUDNN_VERSION TRT_VERSION TRTOSS_VERSION NSIGHT_SYSTEMS_VERSION NSIGHT_COMPUTE_VERSION
ENV NCCL_VERSION=2.18.1 CUBLAS_VERSION=12.1.3.1 CUFFT_VERSION=11.0.2.54 CURAND_VERSION=10.3.2.106 CUSPARSE_VERSION=12.1.0.106 CUSOLVER_VERSION=11.4.5.107 CUTENSOR_VERSION=1.7.0.1 NPP_VERSION=12.1.0.4 NVJPEG_VERSION=12.2.0.2 CUDNN_VERSION=8.9.1.23 TRT_VERSION=8.6.1.2+cuda12.0.1.011 TRTOSS_VERSION=23.05 NSIGHT_SYSTEMS_VERSION=2023.2.1.89 NSIGHT_COMPUTE_VERSION=2023.1.1.4
                        
# 2023-05-02 04:52:46  0.00B Define build argument
ARG NSIGHT_COMPUTE_VERSION

# 2023-05-02 04:52:46  0.00B Define build argument
ARG NSIGHT_SYSTEMS_VERSION

# 2023-05-02 04:52:46  0.00B Define build argument
ARG TRTOSS_VERSION

# 2023-05-02 04:52:46  0.00B Define build argument
ARG TRT_VERSION

# 2023-05-02 04:52:46  0.00B Define build argument
ARG CUDNN_VERSION

# 2023-05-02 04:52:46  0.00B Define build argument
ARG NVJPEG_VERSION

# 2023-05-02 04:52:46  0.00B Define build argument
ARG NPP_VERSION

# 2023-05-02 04:52:46  0.00B Define build argument
ARG CUTENSOR_VERSION

# 2023-05-02 04:52:46  0.00B Define build argument
ARG CUSOLVER_VERSION

# 2023-05-02 04:52:46  0.00B Define build argument
ARG CUSPARSE_VERSION

# 2023-05-02 04:52:46  0.00B Define build argument
ARG CURAND_VERSION

# 2023-05-02 04:52:46  0.00B Define build argument
ARG CUFFT_VERSION

# 2023-05-02 04:52:46  0.00B Define build argument
ARG CUBLAS_VERSION

# 2023-05-02 04:52:46  0.00B Define build argument
ARG NCCL_VERSION

# 2023-05-02 04:52:46  0.00B Add metadata labels
LABEL com.nvidia.volumes.needed=nvidia_driver com.nvidia.cuda.version=9.0
                        
# 2023-05-02 04:52:46  0.00B Set environment variables _CUDA_COMPAT_PATH ENV BASH_ENV SHELL NVIDIA_REQUIRE_CUDA
ENV _CUDA_COMPAT_PATH=/usr/local/cuda/compat ENV=/etc/shinit_v2 BASH_ENV=/etc/bash.bashrc SHELL=/bin/bash NVIDIA_REQUIRE_CUDA=cuda>=9.0
                        
# 2023-05-02 04:52:46  58.45KB Run command and create a new image layer
RUN |3 CUDA_VERSION=12.1.1.009 CUDA_DRIVER_VERSION=530.30.02 JETPACK_HOST_MOUNTS= /bin/sh -c cp -vprd /nvidia/. /  &&  patch -p0 < /etc/startup_scripts.patch  &&  rm -f /etc/startup_scripts.patch # buildkit
                        
# 2023-05-02 04:52:46  405.88MB Run command and create a new image layer
RUN |3 CUDA_VERSION=12.1.1.009 CUDA_DRIVER_VERSION=530.30.02 JETPACK_HOST_MOUNTS= /bin/sh -c /nvidia/build-scripts/installCUDA.sh # buildkit
                        
# 2023-05-02 04:52:46  0.00B Set environment variables CUDA_VERSION CUDA_DRIVER_VERSION CUDA_CACHE_DISABLE NVIDIA_REQUIRE_JETPACK_HOST_MOUNTS
ENV CUDA_VERSION=12.1.1.009 CUDA_DRIVER_VERSION=530.30.02 CUDA_CACHE_DISABLE=1 NVIDIA_REQUIRE_JETPACK_HOST_MOUNTS=
                        
# 2023-05-02 04:52:46  0.00B Define build argument
ARG JETPACK_HOST_MOUNTS

# 2023-05-02 04:52:46  0.00B Define build argument
ARG CUDA_DRIVER_VERSION

# 2023-05-02 04:52:46  0.00B Define build argument
ARG CUDA_VERSION
                        
# 2023-04-29 08:47:59  316.78MB Run command and create a new image layer
RUN /bin/sh -c export DEBIAN_FRONTEND=noninteractive  && apt-get update  && apt-get install -y --no-install-recommends         apt-utils         build-essential         ca-certificates         curl         libncurses5         libncursesw5         patch         wget         rsync         unzip         jq         gnupg         libtcmalloc-minimal4 # buildkit
                        
# 2023-03-08 12:44:27  0.00B 
/bin/sh -c #(nop)  CMD ["/bin/bash"]
                        
# 2023-03-08 12:44:27  77.81MB 
/bin/sh -c #(nop) ADD file:c8ef6447752cab2541ffca9e3cfa27d581f3491bc8f356f6eafd951243609341 in / 
                        
# 2023-03-08 12:44:25  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.version=22.04
                        
# 2023-03-08 12:44:25  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.ref.name=ubuntu
                        
# 2023-03-08 12:44:25  0.00B 
/bin/sh -c #(nop)  ARG LAUNCHPAD_BUILD_ARCH
                        
# 2023-03-08 12:44:25  0.00B 
/bin/sh -c #(nop)  ARG RELEASE
                        
                    

Image information

{
    "Id": "sha256:b89e3ec436749c7c24b0d142afe88aecd434e4c948dfd45e6cd3b04a2ff148e5",
    "RepoTags": [
        "nvcr.io/nvidia/tritonserver:23.05-py3",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/nvcr.io/nvidia/tritonserver:23.05-py3"
    ],
    "RepoDigests": [
        "nvcr.io/nvidia/tritonserver@sha256:0190dbdba3012f0c3ed9a7eaf57f0c8612443b36a3fbcbb4cff807a74dbdb819",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/nvcr.io/nvidia/tritonserver@sha256:96fad686736ec08ba6ccd9f4be37045f84571f476690be1e39f0e537309975c4"
    ],
    "Parent": "",
    "Comment": "",
    "Created": "2023-05-26T22:28:45.142127653Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/opt/tritonserver/bin:/usr/local/mpi/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/ucx/bin",
            "CUDA_VERSION=12.1.1.009",
            "CUDA_DRIVER_VERSION=530.30.02",
            "CUDA_CACHE_DISABLE=1",
            "NVIDIA_REQUIRE_JETPACK_HOST_MOUNTS=",
            "_CUDA_COMPAT_PATH=/usr/local/cuda/compat",
            "ENV=/etc/shinit_v2",
            "BASH_ENV=/etc/bash.bashrc",
            "SHELL=/bin/bash",
            "NVIDIA_REQUIRE_CUDA=cuda\u003e=9.0",
            "NCCL_VERSION=2.18.1",
            "CUBLAS_VERSION=12.1.3.1",
            "CUFFT_VERSION=11.0.2.54",
            "CURAND_VERSION=10.3.2.106",
            "CUSPARSE_VERSION=12.1.0.106",
            "CUSOLVER_VERSION=11.4.5.107",
            "CUTENSOR_VERSION=1.7.0.1",
            "NPP_VERSION=12.1.0.4",
            "NVJPEG_VERSION=12.2.0.2",
            "CUDNN_VERSION=8.9.1.23",
            "TRT_VERSION=8.6.1.2+cuda12.0.1.011",
            "TRTOSS_VERSION=23.05",
            "NSIGHT_SYSTEMS_VERSION=2023.2.1.89",
            "NSIGHT_COMPUTE_VERSION=2023.1.1.4",
            "DALI_VERSION=1.25.0",
            "DALI_BUILD=7922358",
            "POLYGRAPHY_VERSION=0.47.1",
            "TRANSFORMER_ENGINE_VERSION=0.8",
            "LD_LIBRARY_PATH=/opt/tritonserver/backends/onnxruntime:/usr/local/cuda/compat/lib:/usr/local/nvidia/lib:/usr/local/nvidia/lib64",
            "NVIDIA_VISIBLE_DEVICES=all",
            "NVIDIA_DRIVER_CAPABILITIES=compute,utility,video",
            "NVIDIA_PRODUCT_NAME=Triton Server",
            "GDRCOPY_VERSION=2.3",
            "HPCX_VERSION=2.14",
            "MOFED_VERSION=5.4-rdmacore36.0",
            "OPENUCX_VERSION=1.14.0",
            "OPENMPI_VERSION=4.1.4",
            "RDMACORE_VERSION=36.0",
            "OPAL_PREFIX=/opt/hpcx/ompi",
            "OMPI_MCA_coll_hcoll_enable=0",
            "LIBRARY_PATH=/usr/local/cuda/lib64/stubs:",
            "TRITON_SERVER_VERSION=2.34.0",
            "NVIDIA_TRITON_SERVER_VERSION=23.05",
            "TF_ADJUST_HUE_FUSED=1",
            "TF_ADJUST_SATURATION_FUSED=1",
            "TF_ENABLE_WINOGRAD_NONFUSED=1",
            "TF_AUTOTUNE_THRESHOLD=2",
            "TRITON_SERVER_GPU_ENABLED=1",
            "TRITON_SERVER_USER=triton-server",
            "DEBIAN_FRONTEND=noninteractive",
            "TCMALLOC_RELEASE_RATE=200",
            "DCGM_VERSION=2.4.7",
            "NVIDIA_BUILD_ID=61161506"
        ],
        "Cmd": null,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "/opt/tritonserver",
        "Entrypoint": [
            "/opt/nvidia/nvidia_entrypoint.sh"
        ],
        "OnBuild": null,
        "Labels": {
            "com.amazonaws.sagemaker.capabilities.accept-bind-to-port": "true",
            "com.amazonaws.sagemaker.capabilities.multi-models": "true",
            "com.nvidia.build.id": "61161506",
            "com.nvidia.build.ref": "cef5288e1d981eb7cf43622960293e7f2c4aae5f",
            "com.nvidia.cublas.version": "12.1.3.1",
            "com.nvidia.cuda.version": "9.0",
            "com.nvidia.cudnn.version": "8.9.1.23",
            "com.nvidia.cufft.version": "11.0.2.54",
            "com.nvidia.curand.version": "10.3.2.106",
            "com.nvidia.cusolver.version": "11.4.5.107",
            "com.nvidia.cusparse.version": "12.1.0.106",
            "com.nvidia.cutensor.version": "1.7.0.1",
            "com.nvidia.nccl.version": "2.18.1",
            "com.nvidia.npp.version": "12.1.0.4",
            "com.nvidia.nsightcompute.version": "2023.1.1.4",
            "com.nvidia.nsightsystems.version": "2023.2.1.89",
            "com.nvidia.nvjpeg.version": "12.2.0.2",
            "com.nvidia.tensorrt.version": "8.6.1.2+cuda12.0.1.011",
            "com.nvidia.tensorrtoss.version": "23.05",
            "com.nvidia.tritonserver.version": "2.34.0",
            "com.nvidia.volumes.needed": "nvidia_driver",
            "org.opencontainers.image.ref.name": "ubuntu",
            "org.opencontainers.image.version": "22.04"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 12875359271,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/4497c9ba4576c68c5f441679c7bafb3f5bbfcb59c0a5f975173fd2d59d6f3806/diff:/var/lib/docker/overlay2/43b2c257289ebab73b2a3eac29e10e0d15768425ae83e8d0b85df60ff09dae42/diff:/var/lib/docker/overlay2/f60a9e64ddbe1ad81e9bf00c72b4be2174d96e28d663a9a55c4c20e25957a25a/diff:/var/lib/docker/overlay2/b808c557fb8705f7dc63fae0bfa5b63898cea727cabc2687e88e5f07d826ac3b/diff:/var/lib/docker/overlay2/b5514273aa20cd649bf42c9ed8e92b4455ee26cde18af23267d6c148a73fab0b/diff:/var/lib/docker/overlay2/8955db4b1a45da6da4c1995ea5708aebcbca95ba0f94cd0eca18e41bbb86640e/diff:/var/lib/docker/overlay2/7373844ce7f00c24e98dbff75650ba672ec14e49292763470bc1f237ec2ba0c5/diff:/var/lib/docker/overlay2/d37416afc03cc3bdd1171d8ba426d6827fd588fdfb40efbb45eb57e58826978a/diff:/var/lib/docker/overlay2/5f9b6356b537f2ff8d218aeef08d2974a06bfdefe2d59871272ef9e5edcc6828/diff:/var/lib/docker/overlay2/73cf74cd9827c26e0bebef55474d31e4c03778d5c1b363bed83da8da4bc982d0/diff:/var/lib/docker/overlay2/a8736df8a2aee4557aa117fc8d40e7e0ae014733bc303cc4fa7470dcf8323452/diff:/var/lib/docker/overlay2/3ad14e54f6c86b154161e4bf7b54aef240f337945ff1e04986943b51ad37bb1a/diff:/var/lib/docker/overlay2/4c40ceaf2d037299a3cf585904e49bf264fcf22ab94eb7c32ebe66d0457fe3d0/diff:/var/lib/docker/overlay2/a11fe0608761c4deae19781e4fe48ead75beb3ff8c15b3191f9f7ce12c072b99/diff:/var/lib/docker/overlay2/e90291a2d47082f3895839ae357ad7a311e81fcbdd8f739c1223469cc99bbdf0/diff:/var/lib/docker/overlay2/958719a5d0a5526c3fce28d4c64211a0921b512a835e29e1ff76f7695506664e/diff:/var/lib/docker/overlay2/5018c88b5f320a9bb4a97ca9312501dca1d575c13df334989d1a4f82b748b480/diff:/var/lib/docker/overlay2/cdefe165dbcee566c62eb30a8bfafc6a8c5efa690432d2691ae6c3aee4177301/diff:/var/lib/docker/overlay2/4a26da36298de008120da44957e228e32a4cd251cce20ba73492b4f827b01d52/diff:/var/lib/docker/overlay2/573d5c83dff8c35d9e2d47c19ab21428e697d59701a29513cb4aab5a0661c1e1/diff:/var/lib/docker/overlay2/dbdd7f651d7a0c9a83433af19acf2b04b92ccd939800c6c93f1d88e82a503a67/diff:/var/lib/docker/overlay2/b6ebbf2ebf8bb427d0aee6491c24009349c58721b7726e00f213b1b7008491ac/diff:/var/lib/docker/overlay2/d3f00860d417e273a50440cff15443608a9c2589c021a911204b6b3e5478aadb/diff",
            "MergedDir": "/var/lib/docker/overlay2/93c49ce7416ef5b7240a4681c22a260c19c436e3b194005517cb7b604e6b5b85/merged",
            "UpperDir": "/var/lib/docker/overlay2/93c49ce7416ef5b7240a4681c22a260c19c436e3b194005517cb7b604e6b5b85/diff",
            "WorkDir": "/var/lib/docker/overlay2/93c49ce7416ef5b7240a4681c22a260c19c436e3b194005517cb7b604e6b5b85/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:b93c1bd012ab8fda60f5b4f5906bf244586e0e3292d84571d3abb56472248466",
            "sha256:f1c52a5f1f6e6e6bfd7c5bcbbe7f39ee0b2ab4449e1455bc7a2b8e6cb61e1ecf",
            "sha256:87da2c54cfd3f45e622dba6361b068f3b5f0eca5fffd8d2bd7dd419c55ea21ba",
            "sha256:da1fa333cd99d45712b64f3c2bc820bf9cbd3d9e139b122d7a773de597147f78",
            "sha256:b5eef279b1b981f0a5f47d6844e9f0f56de2a686fee58b961ba8ba881467b8fe",
            "sha256:fbf380ecac6898b086e9193c48baac438ca69cdba8eecce1fc1d58f46942e455",
            "sha256:b31064da75689eddb88ef3a1d342bf284ac5a16f92a450725db639b90a41cf0f",
            "sha256:67f37f41694c26a071f77424e294db1680979612499db89e1699848fdfb190eb",
            "sha256:44962a08a3494107d8928f57386d74b8dea1c39ac3a904f9ed757ac4f935c853",
            "sha256:12fa423146a29368067c32ddb5e30ebaedbea4536adaefa502bdf7aa71804d4d",
            "sha256:1aaa5323b21392ab2a73934f97fa0322444e5d351b5a65ed8fbc5d2bbfeb2aaf",
            "sha256:1bffb63a5613bed00cb532125e6b906f3c40d11d2f694e1e34554feaee59de4f",
            "sha256:cc4020295ad27aa141507db93f8edfb44f42ee2255e0ea8a1039671643081209",
            "sha256:1649b46d7a72bf9cd580151da49f6e3d908f8eb179a2d703fed27f91fc3a0dfd",
            "sha256:6bd8c17e10b4a0a9efd5e5b838d13cc32c2d100c318a55750cc7821291515519",
            "sha256:f295797963ca7885a4c6ab3c98289f7bf647ca5e1523ed31b63e8ced5d9d146d",
            "sha256:3fd0471593d27020e65c02fee9554e6786d153be73a5cfcb3f705d907bd9ae09",
            "sha256:fa239584025fb80fd9a8c8a2d9c2b1e7eb0d2fccb95b7ded5dae8b4b2e3af520",
            "sha256:2035a96da1f228b01ec0740c855fcda70a835a6685dea643241181dfacbf3035",
            "sha256:bf52669ad7a7527e629afb9589e1bd9f0e09e21e8307bfa4481dcaaf606f1844",
            "sha256:aa931ff7c236cb9386c44a0b3ade7f729b74bf75caeb4c8f004e10a4faec4861",
            "sha256:1ae5f1de08c011a78907b1920dce2426480656f83f63820723aedac6c8f5db84",
            "sha256:4c56cd505558511afd3153ffd96301ed5a064a7ac48759a3627c20f3fd708a4d",
            "sha256:81eb4a4d68b5d6e8b144457c44c0113c8584aa9696676403e7ee01d8fc56435c"
        ]
    },
    "Metadata": {
        "LastTagTime": "2026-01-30T00:10:10.125818634+08:00"
    }
}

More versions

docker.io/nvcr.io/nvidia/tritonserver:24.11-trtllm-python-py3
linux/amd64 | docker.io | 24.86GB | 2025-02-26 02:25 | 1089

docker.io/nvcr.io/nvidia/tritonserver:25.04-py3
linux/amd64 | docker.io | 19.59GB | 2025-05-22 00:46 | 1317

docker.io/nvcr.io/nvidia/tritonserver:25.02-trtllm-python-py3
linux/amd64 | docker.io | 30.15GB | 2025-05-25 02:51 | 825

docker.io/nvcr.io/nvidia/tritonserver:22.12-py3
linux/amd64 | docker.io | 14.00GB | 2025-06-04 06:32 | 529

docker.io/nvcr.io/nvidia/tritonserver:25.05-trtllm-python-py3
linux/amd64 | docker.io | 32.90GB | 2025-06-09 03:41 | 801

docker.io/nvcr.io/nvidia/tritonserver:25.04-vllm-python-py3
linux/amd64 | docker.io | 23.34GB | 2025-06-15 02:52 | 495

docker.io/nvcr.io/nvidia/tritonserver:25.05-vllm-python-py3
linux/amd64 | docker.io | 23.93GB | 2025-06-17 01:33 | 864

docker.io/nvcr.io/nvidia/tritonserver:23.05-py3
linux/amd64 | docker.io | 12.88GB | 2026-01-30 00:29 | 9