docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04 linux/arm64

docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04 - China mirror download source. Note: this is a linux/arm64 image.
The llama_cpp image packages llama.cpp, a C/C++ inference engine for large language models, built here with CUDA 12.8 for NVIDIA Jetson (L4T r36.4 / JetPack 6) on Ubuntu 24.04.
Source image docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04
China mirror swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04-linuxarm64
Image ID sha256:b47912ab278c53a8b9a8725beb6bd28b4afce4b38c83122d219ae6a1a48bf99b
Image tag b5283-r36.4-cu128-24.04-linuxarm64
Size 7.08GB
Registry docker.io
Project info Docker Hub homepage / project tags
CMD /bin/bash
Entrypoint (none)
Working directory /
OS/Arch linux/arm64
Image created 2025-05-05T16:03:23.731957936Z
Synced 2025-11-04 16:25
Updated 2025-11-04 23:27
Environment variables
PATH=/opt/venv/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
DEBIAN_FRONTEND=noninteractive
LANGUAGE=en_US:en
LANG=en_US.UTF-8
LC_ALL=en_US.UTF-8
WGET_FLAGS=--quiet --show-progress --progress=bar:force:noscroll --no-check-certificate
MULTIARCH_URL=https://apt.jetson-ai-lab.dev/multiarch
TAR_INDEX_URL=https://apt.jetson-ai-lab.dev/jp6/cu128/24.04
PIP_INDEX_URL=https://pypi.jetson-ai-lab.dev/jp6/cu128
PIP_TRUSTED_HOST=
TWINE_REPOSITORY_URL=http://alice:3141/jp6/cu128
TWINE_USERNAME=jp6
TWINE_PASSWORD=NvidiaJetson24
SCP_UPLOAD_URL=jao-51:/dist/jp6/cu128/24.04
SCP_UPLOAD_USER=nvidia
SCP_UPLOAD_PASS=nvidia
CUDA_HOME=/usr/local/cuda
NVCC_PATH=/usr/local/cuda/bin/nvcc
NVIDIA_VISIBLE_DEVICES=all
NVIDIA_DRIVER_CAPABILITIES=all
CUDAARCHS=87
CUDA_ARCHITECTURES=87
CUDNN_LIB_PATH=/usr/lib/aarch64-linux-gnu
CUDNN_LIB_INCLUDE_PATH=/usr/include
CMAKE_CUDA_COMPILER=/usr/local/cuda/bin/nvcc
CUDA_NVCC_EXECUTABLE=/usr/local/cuda/bin/nvcc
CUDACXX=/usr/local/cuda/bin/nvcc
TORCH_NVCC_FLAGS=-Xfatbin -compress-all
CUDA_BIN_PATH=/usr/local/cuda/bin
CUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda
LD_LIBRARY_PATH=/usr/local/cuda/compat:/usr/local/cuda/lib64:
PYTHON_VERSION=3.12
PYTHONFAULTHANDLER=1
PYTHONUNBUFFERED=1
PYTHONIOENCODING=utf-8
PYTHONHASHSEED=random
PIP_NO_CACHE_DIR=true
PIP_CACHE_PURGE=true
PIP_ROOT_USER_ACTION=ignore
PIP_DISABLE_PIP_VERSION_CHECK=on
PIP_DEFAULT_TIMEOUT=100
PIP_WHEEL_DIR=/opt/wheels
PIP_VERBOSE=1
TWINE_NON_INTERACTIVE=1
OPENBLAS_CORETYPE=ARMV8
NUMPY_PACKAGE=numpy
NUMPY_VERSION_MAJOR=2
TRANSFORMERS_CACHE=/data/models/huggingface
HUGGINGFACE_HUB_CACHE=/data/models/huggingface
HF_HOME=/data/models/huggingface
Image labels
org.opencontainers.image.ref.name=ubuntu
org.opencontainers.image.version=24.04
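
After pulling the image (pull commands follow below), it can be started on a Jetson device with GPU access. A minimal sketch, assuming the NVIDIA container runtime is installed on the host and that a GGUF model has already been placed under /data/models; the model filename and the presence of llama-server on the container PATH are assumptions, not confirmed by this page:

# Start an interactive shell with GPU access; mounting /data/models keeps the
# HF_HOME / model cache from the environment above on the host
docker run --runtime nvidia -it --rm --network host \
  -v /data/models:/data/models \
  docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04

# Inside the container: llama.cpp was built with -DLLAMA_BUILD_SERVER=ON,
# so the HTTP server can be pointed at a local GGUF file (path is a placeholder)
llama-server -m /data/models/your-model.gguf --n-gpu-layers 999 --host 0.0.0.0 --port 8080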

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04-linuxarm64
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04-linuxarm64  docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04
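
A quick sanity check after re-tagging, using standard docker commands (nothing below is specific to this mirror):

# The platform should report linux/arm64 and the ID should match the one listed above
docker image inspect --format '{{.Os}}/{{.Architecture}} {{.Id}}' docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04
docker images --digests dustynv/llama_cpp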

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04-linuxarm64
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04-linuxarm64  docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04
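
To confirm the pull under containerd, standard ctr subcommands can be used. Note that on Kubernetes nodes the kubelet typically reads from the k8s.io namespace, so the pull/tag commands above may need an extra '-n k8s.io' depending on how the node is configured:

# List the image and verify that all of its content is present
ctr images ls | grep llama_cpp
ctr images check | grep llama_cpp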

Quick shell replacement command

sed -i 's#dustynv/llama_cpp:b5283-r36.4-cu128-24.04#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04-linuxarm64#' deployment.yaml
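
If the old reference appears in several manifests, the same substitution can be run across a directory. A sketch using standard grep/xargs; ./manifests is a placeholder path:

grep -rl 'dustynv/llama_cpp:b5283-r36.4-cu128-24.04' ./manifests \
  | xargs sed -i 's#dustynv/llama_cpp:b5283-r36.4-cu128-24.04#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04-linuxarm64#'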

Ansible quick distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04-linuxarm64 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04-linuxarm64  docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04'

Ansible quick distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04-linuxarm64 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04-linuxarm64  docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04'
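
Both ad-hoc commands fan out to every host in the k8s inventory group. Since this image is linux/arm64 only, it can be worth restricting the run and raising the fork count; a sketch using standard ansible flags (the 'jetson*' host pattern is hypothetical):

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04-linuxarm64' --limit 'jetson*' -f 10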

Image build history
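
The layer history below comes from the image metadata; after pulling, the same listing can be reproduced locally:

docker history --no-trunc --format '{{.CreatedAt}}\t{{.Size}}\t{{.CreatedBy}}' docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04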


# 2025-05-06 00:03:23  62.34KB 
|8 FORCE_BUILD=off LLAMA_CPP_BRANCH=b5283 LLAMA_CPP_BRANCH_PY=main LLAMA_CPP_FLAGS=-DGGML_CUDA=on -DGGML_CUDA_F16=on -DLLAMA_CURL=on -DGGML_CUDA_FA_ALL_QUANTS=ON -DLLAMA_BUILD_SERVER=ON -DLLAMA_BUILD_EXAMPLES=ON -DLLAMA_BUILD_TESTS=OFF LLAMA_CPP_VERSION=5283 LLAMA_CPP_VERSION_PY=0.3.8 SOURCE_DIR=/opt/llama_cpp_python TMP=/tmp/llama_cpp /bin/sh -c if [ ! -f "$TMP/.llama_cpp" ]; then       echo "FAILED to install llama.cpp $LLAMA_CPP_VERSION";       exit 1;     fi
                        
# 2025-05-06 00:03:22  342.47MB 
|8 FORCE_BUILD=off LLAMA_CPP_BRANCH=b5283 LLAMA_CPP_BRANCH_PY=main LLAMA_CPP_FLAGS=-DGGML_CUDA=on -DGGML_CUDA_F16=on -DLLAMA_CURL=on -DGGML_CUDA_FA_ALL_QUANTS=ON -DLLAMA_BUILD_SERVER=ON -DLLAMA_BUILD_EXAMPLES=ON -DLLAMA_BUILD_TESTS=OFF LLAMA_CPP_VERSION=5283 LLAMA_CPP_VERSION_PY=0.3.8 SOURCE_DIR=/opt/llama_cpp_python TMP=/tmp/llama_cpp /bin/sh -c $TMP/install.sh || $TMP/build.sh || true
                        
# 2025-05-06 00:03:00  4.42KB 
/bin/sh -c #(nop) COPY file:8d42431863726d218c4ec812e3f83ed4d81a182c11449a67c5de4aa4f879b404 in /usr/local/bin/llama_cpp_benchmark.py 
                        
# 2025-05-06 00:03:00  2.15KB 
/bin/sh -c #(nop) COPY multi:8991b91d4ac96d2bf70348e3f78528ade3878b557056ea0d4a67bce7b0a3e764 in /tmp/llama_cpp/ 
                        
# 2025-05-05 20:31:31  0.00B 
/bin/sh -c #(nop)  ARG LLAMA_CPP_VERSION LLAMA_CPP_VERSION_PY LLAMA_CPP_BRANCH LLAMA_CPP_BRANCH_PY LLAMA_CPP_FLAGS FORCE_BUILD=off SOURCE_DIR=/opt/llama_cpp_python TMP=/tmp/llama_cpp
                        
# 2025-04-24 15:35:04  61.46KB 
|2 SUDONIM_PATCH_DIR=off SUDONIM_SOURCE_DIR=/opt/sudonim /bin/sh -c if [ ${SUDONIM_PATCH_DIR} != "off" ]; then         cd ${SUDONIM_SOURCE_DIR}/patches/${SUDONIM_PATCH_DIR} ;         for script in ./*.sh; do             bash $script ;         done     fi
                        
# 2025-04-24 15:35:04  474.00B 
/bin/sh -c #(nop) COPY dir:f567f8065695115e90a628efd3f1c1f926e0f740f025f8864c6cad1832d4c516 in /opt/sudonim/patches/ 
                        
# 2025-04-24 15:35:04  1.78MB 
|2 SUDONIM_PATCH_DIR=off SUDONIM_SOURCE_DIR=/opt/sudonim /bin/sh -c git clone https://github.com/dusty-nv/sudonim ${SUDONIM_SOURCE_DIR} &&     pip3 install -e ${SUDONIM_SOURCE_DIR}
                        
# 2025-04-24 15:34:56  330.00B 
/bin/sh -c #(nop) ADD f56a19b3cf094215744a993384e33caac28ca93811e49c2c0356a8c09b45ab59 in /tmp/sudonim_version.json 
                        
# 2025-04-24 15:34:55  0.00B 
/bin/sh -c #(nop)  ARG SUDONIM_SOURCE_DIR=/opt/sudonim SUDONIM_PATCH_DIR=off
                        
# 2025-04-24 15:34:55  13.57MB 
/bin/sh -c set -ex     && pip3 install         huggingface_hub[cli]         dataclasses         && huggingface-cli --help     && huggingface-downloader --help     && pip3 show huggingface_hub     && python3 -c 'import huggingface_hub; print(huggingface_hub.__version__)'         && apt-get update     && rm -rf /var/lib/apt/lists/*     && apt-get clean
                        
# 2025-04-24 15:34:42  3.48KB 
/bin/sh -c #(nop) COPY file:b9e93da08167ff3d361a60be33029d9e580dcf9c5b7efa3223bc2908fbaac57d in /usr/local/bin/_huggingface-downloader.py 
                        
# 2025-04-24 15:34:42  499.00B 
/bin/sh -c #(nop) COPY file:31583bc005bc47928bcc5337891ac48bf6e3bc3d3bd525b755d487e841b21faf in /usr/local/bin/ 
                        
# 2025-04-24 15:34:42  0.00B 
/bin/sh -c #(nop)  ENV TRANSFORMERS_CACHE=/data/models/huggingface HUGGINGFACE_HUB_CACHE=/data/models/huggingface HF_HOME=/data/models/huggingface
                        
# 2025-04-24 15:34:41  60.05MB 
/bin/sh -c bash /tmp/numpy/install.sh
                        
# 2025-04-24 15:34:33  482.00B 
/bin/sh -c #(nop) COPY file:85e6a5b8709bad26521b8e24657f666cdaec430e7afce1728906776ab07947c2 in /tmp/numpy/ 
                        
# 2025-04-24 15:34:33  0.00B 
/bin/sh -c #(nop)  ENV NUMPY_PACKAGE=numpy NUMPY_VERSION_MAJOR=2
                        
# 2025-04-24 15:34:33  0.00B 
/bin/sh -c #(nop)  ARG NUMPY_PACKAGE=numpy NUMPY_VERSION_MAJOR=2
                        
# 2025-04-24 15:34:33  0.00B 
/bin/sh -c #(nop)  ENV OPENBLAS_CORETYPE=ARMV8
                        
# 2025-04-24 15:34:32  63.03MB 
/bin/sh -c /tmp/cmake/install.sh
                        
# 2025-04-24 15:34:25  373.00B 
/bin/sh -c #(nop) COPY file:424a79e7f0d3b6050533f2c4aa2744b5a3d0c8d5295a86d844cd8e981ccf276c in /tmp/cmake/install.sh 
                        
# 2025-04-21 22:13:27  129.78MB 
|1 TMP=/tmp/python /bin/sh -c $TMP/install.sh
                        
# 2025-04-21 22:12:32  1.97KB 
/bin/sh -c #(nop) COPY file:1e03bb20d3573380c21a34ae324b701e85d56f97fb5bfb176bfd2cb6382c5ce1 in /tmp/python/ 
                        
# 2025-04-21 22:12:32  0.00B 
/bin/sh -c #(nop)  ENV PYTHON_VERSION=3.12 PYTHONFAULTHANDLER=1 PYTHONUNBUFFERED=1 PYTHONIOENCODING=utf-8 PYTHONHASHSEED=random PIP_NO_CACHE_DIR=true PIP_CACHE_PURGE=true PIP_ROOT_USER_ACTION=ignore PIP_DISABLE_PIP_VERSION_CHECK=on PIP_DEFAULT_TIMEOUT=100 PIP_WHEEL_DIR=/opt/wheels PIP_VERBOSE=1 TWINE_NON_INTERACTIVE=1 DEBIAN_FRONTEND=noninteractive PATH=/opt/venv/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2025-04-21 22:12:32  0.00B 
/bin/sh -c #(nop)  ARG PYTHON_VERSION TMP=/tmp/python
                        
# 2025-04-21 19:03:34  703.45KB 
|3 CUDNN_DEB=cudnn-local-tegra-repo-ubuntu2404-9.8.0 CUDNN_PACKAGES=libcudnn9-cuda-12 libcudnn9-dev-cuda-12 libcudnn9-samples CUDNN_URL=https://developer.download.nvidia.com/compute/cudnn/9.8.0/local_installers/cudnn-local-tegra-repo-ubuntu2404-9.8.0_1.0-1_arm64.deb /bin/sh -c cd /usr/src/cudnn_samples_v*/conv_sample/ &&     make -j$(nproc)
                        
# 2025-04-21 19:03:21  1.22GB 
|3 CUDNN_DEB=cudnn-local-tegra-repo-ubuntu2404-9.8.0 CUDNN_PACKAGES=libcudnn9-cuda-12 libcudnn9-dev-cuda-12 libcudnn9-samples CUDNN_URL=https://developer.download.nvidia.com/compute/cudnn/9.8.0/local_installers/cudnn-local-tegra-repo-ubuntu2404-9.8.0_1.0-1_arm64.deb /bin/sh -c echo "Downloading ${CUDNN_DEB}" &&     rm -rf /tmp/cudnn && mkdir /tmp/cudnn && cd /tmp/cudnn &&     wget ${WGET_FLAGS} ${CUDNN_URL} &&     dpkg -i *.deb &&     cp /var/cudnn-*-repo-*/cudnn-*-keyring.gpg /usr/share/keyrings/ &&     apt-get update &&     apt-cache search cudnn &&     apt list --installed | grep 'cuda\|cudnn\|cublas' &&     apt-get install -y --no-install-recommends ${CUDNN_PACKAGES} file &&     rm -rf /var/lib/apt/lists/* &&     apt-get clean &&     dpkg --list | grep cudnn &&     dpkg -P ${CUDNN_DEB} &&     rm -rf /tmp/cudnn
                        
# 2025-04-21 19:01:56  47.27MB 
|3 CUDNN_DEB=cudnn-local-tegra-repo-ubuntu2404-9.8.0 CUDNN_PACKAGES=libcudnn9-cuda-12 libcudnn9-dev-cuda-12 libcudnn9-samples CUDNN_URL=https://developer.download.nvidia.com/compute/cudnn/9.8.0/local_installers/cudnn-local-tegra-repo-ubuntu2404-9.8.0_1.0-1_arm64.deb /bin/sh -c ls /etc/apt/sources.list.d/ &&     apt-get update &&     apt-cache search cudnn
                        
# 2025-04-21 19:01:49  0.00B 
/bin/sh -c #(nop)  ARG CUDNN_PACKAGES
                        
# 2025-04-21 19:01:49  0.00B 
/bin/sh -c #(nop)  ARG CUDNN_DEB
                        
# 2025-04-21 19:01:49  0.00B 
/bin/sh -c #(nop)  ARG CUDNN_URL
                        
# 2025-04-21 19:01:48  0.00B 
/bin/sh -c #(nop) WORKDIR /
                        
# 2025-04-21 19:01:48  0.00B 
/bin/sh -c #(nop)  ENV NVIDIA_VISIBLE_DEVICES=all NVIDIA_DRIVER_CAPABILITIES=all CUDAARCHS=87 CUDA_ARCHITECTURES=87 CUDA_HOME=/usr/local/cuda CUDNN_LIB_PATH=/usr/lib/aarch64-linux-gnu CUDNN_LIB_INCLUDE_PATH=/usr/include CMAKE_CUDA_COMPILER=/usr/local/cuda/bin/nvcc CUDA_NVCC_EXECUTABLE=/usr/local/cuda/bin/nvcc CUDACXX=/usr/local/cuda/bin/nvcc TORCH_NVCC_FLAGS=-Xfatbin -compress-all CUDA_BIN_PATH=/usr/local/cuda/bin CUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda PATH=/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin LD_LIBRARY_PATH=/usr/local/cuda/compat:/usr/local/cuda/lib64: DEBIAN_FRONTEND=noninteractive
                        
# 2025-04-21 19:01:48  0.00B 
/bin/sh -c #(nop)  ENV NVCC_PATH=/usr/local/cuda/bin/nvcc
                        
# 2025-04-21 19:01:48  0.00B 
/bin/sh -c #(nop)  ENV CUDA_HOME=/usr/local/cuda
                        
# 2025-04-21 19:01:45  4.30GB 
|5 CUDA_ARCH_LIST=87 CUDA_DEB=cuda-tegra-repo-ubuntu2204-12-8-local CUDA_PACKAGES=cuda-toolkit* CUDA_URL=https://developer.download.nvidia.com/compute/cuda/12.8.1/local_installers/cuda-tegra-repo-ubuntu2204-12-8-local_12.8.1-1_arm64.deb DISTRO=ubuntu2404 /bin/sh -c /tmp/cuda/install.sh
                        
# 2025-04-21 18:59:00  1.36KB 
/bin/sh -c #(nop) COPY file:0259a14a19a16aa974693e2b9b7acd338142217fcffb000c9c7d027c22d9771d in /tmp/cuda/install.sh 
                        
# 2025-04-21 18:59:00  0.00B 
/bin/sh -c #(nop)  ARG CUDA_URL CUDA_DEB CUDA_PACKAGES CUDA_ARCH_LIST DISTRO=ubuntu2004
                        
# 2025-04-21 18:59:00  0.00B 
/bin/sh -c #(nop)  ENV MULTIARCH_URL=https://apt.jetson-ai-lab.dev/multiarch TAR_INDEX_URL=https://apt.jetson-ai-lab.dev/jp6/cu128/24.04 PIP_INDEX_URL=https://pypi.jetson-ai-lab.dev/jp6/cu128 PIP_TRUSTED_HOST= TWINE_REPOSITORY_URL=http://alice:3141/jp6/cu128 TWINE_USERNAME=jp6 TWINE_PASSWORD=NvidiaJetson24 SCP_UPLOAD_URL=jao-51:/dist/jp6/cu128/24.04 SCP_UPLOAD_USER=nvidia SCP_UPLOAD_PASS=nvidia
                        
# 2025-04-21 18:59:00  0.00B 
/bin/sh -c #(nop)  ARG PIP_INDEX_REPO PIP_UPLOAD_REPO PIP_UPLOAD_USER PIP_UPLOAD_PASS PIP_TRUSTED_HOSTS TAR_INDEX_URL MULTIARCH_URL SCP_UPLOAD_URL SCP_UPLOAD_USER SCP_UPLOAD_PASS
                        
# 2025-04-21 18:59:00  2.66KB 
/bin/sh -c #(nop) COPY multi:604af4218d2c26088788b4e95f41b1499210d510044800998bdc4e31c4169982 in /usr/local/bin/ 
                        
# 2025-04-21 18:58:59  793.76MB 
/bin/sh -c set -ex     && apt-get update     && apt-get install -y --no-install-recommends         locales         locales-all         tzdata     && locale-gen en_US $LANG     && update-locale LC_ALL=$LC_ALL LANG=$LANG     && locale         && apt-get install -y --no-install-recommends         build-essential         software-properties-common         apt-transport-https         ca-certificates         lsb-release         pkg-config         gnupg         git         git-lfs         gdb         wget         wget2         curl         nano         zip         unzip         time         sshpass         ssh-client     && apt-get clean     && rm -rf /var/lib/apt/lists/*         && gcc --version     && g++ --version
                        
# 2025-04-21 18:57:15  0.00B 
/bin/sh -c #(nop)  ENV DEBIAN_FRONTEND=noninteractive LANGUAGE=en_US:en LANG=en_US.UTF-8 LC_ALL=en_US.UTF-8 WGET_FLAGS=--quiet --show-progress --progress=bar:force:noscroll --no-check-certificate
                        
# 2025-01-27 12:14:54  0.00B 
/bin/sh -c #(nop)  CMD ["/bin/bash"]
                        
# 2025-01-27 12:14:54  100.73MB 
/bin/sh -c #(nop) ADD file:68158f1ff76fd4de9f92666ad22571e6cd11df166255c2814a135773fdd6acd7 in / 
                        
# 2025-01-27 12:14:51  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.version=24.04
                        
# 2025-01-27 12:14:51  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.ref.name=ubuntu
                        
# 2025-01-27 12:14:51  0.00B 
/bin/sh -c #(nop)  ARG LAUNCHPAD_BUILD_ARCH
                        
# 2025-01-27 12:14:51  0.00B 
/bin/sh -c #(nop)  ARG RELEASE
                        
                    

Image info
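
The full inspect document recorded by the mirror follows. After pulling, a comparable output can be generated locally (GraphDriver paths and LastTagTime will naturally differ per host):

docker image inspect docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04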

{
    "Id": "sha256:b47912ab278c53a8b9a8725beb6bd28b4afce4b38c83122d219ae6a1a48bf99b",
    "RepoTags": [
        "dustynv/llama_cpp:b5283-r36.4-cu128-24.04",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04-linuxarm64"
    ],
    "RepoDigests": [
        "dustynv/llama_cpp@sha256:178e4173a23fef4e1689d429505baf5156e0e041d564d70fbbfa5697484f9770",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/dustynv/llama_cpp@sha256:3dfbc5c644870cfa1ad2b60eef3c9da7bc562ee17df8b58794f19ebf7ea5eefe"
    ],
    "Parent": "",
    "Comment": "",
    "Created": "2025-05-05T16:03:23.731957936Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "24.0.7",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/opt/venv/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "DEBIAN_FRONTEND=noninteractive",
            "LANGUAGE=en_US:en",
            "LANG=en_US.UTF-8",
            "LC_ALL=en_US.UTF-8",
            "WGET_FLAGS=--quiet --show-progress --progress=bar:force:noscroll --no-check-certificate",
            "MULTIARCH_URL=https://apt.jetson-ai-lab.dev/multiarch",
            "TAR_INDEX_URL=https://apt.jetson-ai-lab.dev/jp6/cu128/24.04",
            "PIP_INDEX_URL=https://pypi.jetson-ai-lab.dev/jp6/cu128",
            "PIP_TRUSTED_HOST=",
            "TWINE_REPOSITORY_URL=http://alice:3141/jp6/cu128",
            "TWINE_USERNAME=jp6",
            "TWINE_PASSWORD=NvidiaJetson24",
            "SCP_UPLOAD_URL=jao-51:/dist/jp6/cu128/24.04",
            "SCP_UPLOAD_USER=nvidia",
            "SCP_UPLOAD_PASS=nvidia",
            "CUDA_HOME=/usr/local/cuda",
            "NVCC_PATH=/usr/local/cuda/bin/nvcc",
            "NVIDIA_VISIBLE_DEVICES=all",
            "NVIDIA_DRIVER_CAPABILITIES=all",
            "CUDAARCHS=87",
            "CUDA_ARCHITECTURES=87",
            "CUDNN_LIB_PATH=/usr/lib/aarch64-linux-gnu",
            "CUDNN_LIB_INCLUDE_PATH=/usr/include",
            "CMAKE_CUDA_COMPILER=/usr/local/cuda/bin/nvcc",
            "CUDA_NVCC_EXECUTABLE=/usr/local/cuda/bin/nvcc",
            "CUDACXX=/usr/local/cuda/bin/nvcc",
            "TORCH_NVCC_FLAGS=-Xfatbin -compress-all",
            "CUDA_BIN_PATH=/usr/local/cuda/bin",
            "CUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda",
            "LD_LIBRARY_PATH=/usr/local/cuda/compat:/usr/local/cuda/lib64:",
            "PYTHON_VERSION=3.12",
            "PYTHONFAULTHANDLER=1",
            "PYTHONUNBUFFERED=1",
            "PYTHONIOENCODING=utf-8",
            "PYTHONHASHSEED=random",
            "PIP_NO_CACHE_DIR=true",
            "PIP_CACHE_PURGE=true",
            "PIP_ROOT_USER_ACTION=ignore",
            "PIP_DISABLE_PIP_VERSION_CHECK=on",
            "PIP_DEFAULT_TIMEOUT=100",
            "PIP_WHEEL_DIR=/opt/wheels",
            "PIP_VERBOSE=1",
            "TWINE_NON_INTERACTIVE=1",
            "OPENBLAS_CORETYPE=ARMV8",
            "NUMPY_PACKAGE=numpy",
            "NUMPY_VERSION_MAJOR=2",
            "TRANSFORMERS_CACHE=/data/models/huggingface",
            "HUGGINGFACE_HUB_CACHE=/data/models/huggingface",
            "HF_HOME=/data/models/huggingface"
        ],
        "Cmd": [
            "/bin/bash"
        ],
        "Image": "sha256:392287ed45b3f431526933f81533684f4e4856130bb59972afd9b9af90dd2208",
        "Volumes": null,
        "WorkingDir": "/",
        "Entrypoint": null,
        "OnBuild": null,
        "Labels": {
            "org.opencontainers.image.ref.name": "ubuntu",
            "org.opencontainers.image.version": "24.04"
        }
    },
    "Architecture": "arm64",
    "Variant": "v8",
    "Os": "linux",
    "Size": 7075305336,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/71d4e60dfb2d5f379ecf154259f08dd99f2c99e8c199e0b146e079b831981512/diff:/var/lib/docker/overlay2/f815f3db12a2bd9620275aa61359286100150ad4fa107a300b2759b26445833e/diff:/var/lib/docker/overlay2/b7849df3b3b93a7d2386309a57edb2ee2898422eb1f389d93134277d2001f26e/diff:/var/lib/docker/overlay2/075b677fab38967ca923e324d37a0e2e20cd657b3ffe5fed105381bfabcbdfa5/diff:/var/lib/docker/overlay2/11d35a9d17a1b025516906bd96d12d77c555e59b1a8e575673d354f91d88afbd/diff:/var/lib/docker/overlay2/6fa1986d4da41a7de7b77d81e30a93b740dfb077873ea14a0ad3ad69feaeaf0a/diff:/var/lib/docker/overlay2/bfa885a0cf4af2e5dc5de094df12ef0660e96567c4cfc08254512bf17f08d227/diff:/var/lib/docker/overlay2/e30992ee56c6921fe76263b74c8c81c803cfe8152602444d5236adead60ba7f6/diff:/var/lib/docker/overlay2/2d1befa4bbb640e2d7fbea6555183db9ea6069423fd565d650ef44871a6cf2db/diff:/var/lib/docker/overlay2/e2fd8e761dd647f63d18f89acf802fece49171b779955afa2b0d414a63a0fe74/diff:/var/lib/docker/overlay2/a08a4c0f0c38a12d059b1113ef24bf79533cee44b94c623560e643c2fbf177d9/diff:/var/lib/docker/overlay2/0dd3c32971a6a66769aa5ae4bd86c596addf812b17cbee59e66aa344fa4e232c/diff:/var/lib/docker/overlay2/25bf81360894a735ab1b236f06711ae62cbe65c9a7bcfeeea1772370f7a0d707/diff:/var/lib/docker/overlay2/287e36d2426177b63400896ff5188e89f173bc826cb775b8189da665dbb1bbf8/diff:/var/lib/docker/overlay2/c0d8b32b1ea654700a47b05e5363fbef792ed91b355b9121876e8898b27263a4/diff:/var/lib/docker/overlay2/8caa197197d8c1eb6704b198a6cacb562d88b28d2998af041db474ea56549a4d/diff:/var/lib/docker/overlay2/d85dbf01b0ab64e6f2f3f6b480195bc6e17cb50397c8568755b3930b9dcbaf18/diff:/var/lib/docker/overlay2/5a9940360a2d5878249e2004edb7aa057f4c700c65baac85c9f6c88f6eb24e7c/diff:/var/lib/docker/overlay2/30b1f4e30e6484172b4d321e7349ceb0fe5b74fdfce9a854cf3d5a4a59205ca1/diff:/var/lib/docker/overlay2/353292ba5f927a2c1c950a91a6b3fb12886159f89a8a79b0c0e5773fe366df2a/diff:/var/lib/docker/overlay2/e1ac48c11198e36d9463cc7d9c9ef38365970321e84922733eda713ebda148cb/diff:/var/lib/docker/overlay2/ade543eb050201aa8d0422c29357adc6f612ec9fa840c094595fc67657c4bffc/diff:/var/lib/docker/overlay2/e19b24831b416f2fe1510f489d14be6c8d342b2802256012fdf2d62804c9f38e/diff:/var/lib/docker/overlay2/d963ef7b080424daaf6ec013b54afe2d8d1d5e2ab2b1b02b587229532d3f22c4/diff",
            "MergedDir": "/var/lib/docker/overlay2/e4812b96fa3a8f18ce7eb11881bc7b4a58a1465422427154965589ee21a4f794/merged",
            "UpperDir": "/var/lib/docker/overlay2/e4812b96fa3a8f18ce7eb11881bc7b4a58a1465422427154965589ee21a4f794/diff",
            "WorkDir": "/var/lib/docker/overlay2/e4812b96fa3a8f18ce7eb11881bc7b4a58a1465422427154965589ee21a4f794/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:375990b2a90a8d8f332d9b9422d948f7068a3313bf5a1c9fbb91ff2d29046130",
            "sha256:df3adf198c3943f2ea080e5a5a27b8f50a4c4f739a71054d449c42e68dc13aa0",
            "sha256:8edca82f5167a8ed155eb1f9c6ed1a7977e3f7809856802d7ee7a7952cfd0811",
            "sha256:7fa1eddcce76af129bee94fe03a3f7083c76c2b47979efab6ae7e395cbaec5cf",
            "sha256:738a9ee7d9e73f4041c9592c7291d7f070f4d1d1d79951bafa7f7b863fad155d",
            "sha256:1b0c8d57a40bae839c24f0f03509595785da67c38ba23dbb6b175cb5629dcb15",
            "sha256:db6d3dcdc32885bf842ffeda870d78f346ba46b331be05075295dbe716b36aff",
            "sha256:a6c1b42864aa89bae9ce53e550cbe004991bc2b43d34c0c8c545d4a3e62093d5",
            "sha256:4a1a64586ee769e7975710be7c655712ab64798aba50b1df0695b89b376c82e3",
            "sha256:7974ec64a2c609cd6746eae0a50bfff861442ea6f7ce38950c732773047fd604",
            "sha256:267ae235016ae10858c2c17d80dae5b3025cb68abbb2c9df266078f012b37bce",
            "sha256:cf8bf6422f09b8ff287f8dd58d17519ee9392d17072af81e117ad6a0f993a292",
            "sha256:12ade4eb21590896f3ec31ad0d0a7b978eb82ae2ec7622ca315ef368d7505ac9",
            "sha256:2d48dca9876440795b952a20539e81030b550ac75a0ec2575b1e0d7ea6bc70d7",
            "sha256:7851d81f465c978655e89596e632b8b8689a60f83583b59c35b889fe97f1917e",
            "sha256:8b86fa162fa55fc05c08a8b3b2e0975420ef15717793a0a13d50ac18d4993149",
            "sha256:23c96dd79a134926800ceefe02dc9a7594ee51bbc45a97a6221069915b7ca7ca",
            "sha256:365bb29aab9cf26e87158782d877db1ff747e856cf8e197c6df540a70a6cea37",
            "sha256:47a00027f8c4a71b2a42b14898d1b25b4674e09228c8157511157f1945a23cef",
            "sha256:d0577059beceaf96bed4269226c8617bd4dd99f5859c79b197b5868c9c8c03ae",
            "sha256:6899597a505f98551eec0de1bbb21e4056ab6aa0725074b4a404641790b3bf83",
            "sha256:ce6c622782cfb731dae89127cfd019d1f250fc59372bde2beb1df45ef589ae9f",
            "sha256:b768fc6903ea6cca6097c97823210c5f66e1a458410dddc9c6a7c5ef88355f6c",
            "sha256:9d48306b8168e2912cc93856d773624a5b337d46fac3236a881e94ab59aacda1",
            "sha256:3ef8525a03fc9b494b5c1892f441a9152f16136745199bd383a8d54321d540cf"
        ]
    },
    "Metadata": {
        "LastTagTime": "2025-11-04T16:24:17.883114267+08:00"
    }
}

More versions

docker.io/dustynv/llama_cpp:r36.2.0
linux/arm64 | docker.io | 10.65GB | synced 2024-11-02 16:10

docker.io/dustynv/llama_cpp:b5283-r36.4-cu128-24.04
linux/arm64 | docker.io | 7.08GB | synced 2025-11-04 16:25