docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12 linux/amd64

docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12 - China-region mirror download source

LocalAI Docker Image

This Docker image packages LocalAI, a free, open-source, self-hosted alternative to OpenAI: a drop-in replacement API for running LLMs (gguf, transformers, diffusers and more), audio, and image generation locally on consumer-grade hardware.

Image Purpose

* Quickly deploy and run LocalAI inside a Docker container
* Conveniently develop and experiment with local AI projects
* Easily share and deploy models

Image Contents

* LocalAI and all of its dependencies
* The necessary configuration files and tools
* Example models (the AIO image preconfigures a default model set)

Usage

Refer to the official LocalAI documentation for detailed usage and installation instructions.
Source image: docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12
China mirror: swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12
Image ID: sha256:157392a4cf54bbcd104578c6a6815158320aefe5316c05bcabafb4fb326477af
Image tag: v3.10.1-aio-gpu-nvidia-cuda-12
Size: 6.73GB
Registry: docker.io
CMD: (none)
Entrypoint: /aio/entrypoint.sh
Working directory: /
OS/Arch: linux/amd64
Image created: 2026-01-23T15:53:48.727939849Z
Synced: 2026-02-03 01:18
Updated: 2026-02-03 08:49
Exposed ports
8080/tcp
Volume mounts
/backends /configuration /models
Environment variables
PATH=/opt/rocm/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
DEBIAN_FRONTEND=noninteractive
BUILD_TYPE=cublas
HEALTHCHECK_ENDPOINT=http://localhost:8080/readyz
NVIDIA_DRIVER_CAPABILITIES=compute,utility
NVIDIA_REQUIRE_CUDA=cuda>=12.0
NVIDIA_VISIBLE_DEVICES=all
Image labels
org.opencontainers.image.created: 2026-01-23T15:43:03.119Z
org.opencontainers.image.description: :robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more. Features: Generate Text, MCP, Audio, Video, Images, Voice Cloning, Distributed, P2P and decentralized inference
org.opencontainers.image.licenses: MIT
org.opencontainers.image.ref.name: ubuntu
org.opencontainers.image.revision: 923ebbb3440dcd105e13097a819f7a77193ab06f
org.opencontainers.image.source: https://github.com/mudler/LocalAI
org.opencontainers.image.title: LocalAI
org.opencontainers.image.url: https://github.com/mudler/LocalAI
org.opencontainers.image.version: v3.10.1-aio-gpu-nvidia-cuda-12

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12  docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12
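
After pulling and retagging, the image can be started directly. The following is a minimal sketch based on the exposed port, volumes, and entrypoint documented above; the host paths and the --gpus all flag are illustrative assumptions and require the NVIDIA Container Toolkit on the host.

# Hedged example: run the AIO CUDA 12 image with GPU access, publish the API on 8080,
# and persist the documented /models, /backends and /configuration volumes.
docker run -d --name local-ai \
  --gpus all \
  -p 8080:8080 \
  -v $PWD/models:/models \
  -v $PWD/backends:/backends \
  -v $PWD/configuration:/configuration \
  docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12

# Readiness can then be checked against the documented HEALTHCHECK endpoint.
curl -f http://localhost:8080/readyz && echo ready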

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12  docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12
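
On Kubernetes nodes that use containerd as the runtime, the image typically needs to live in the k8s.io namespace for the kubelet to find it. A minimal sketch; only the -n k8s.io flag differs from the commands above.

ctr -n k8s.io images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12
ctr -n k8s.io images tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12 docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12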

Quick shell replacement command

sed -i 's#localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12#' deployment.yaml
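
For reference, a hedged sketch of what the replacement does: it rewrites the image: field of a hypothetical deployment.yaml to point at the Huawei Cloud mirror while leaving everything else untouched.

# Create a hypothetical deployment.yaml that references the upstream image (illustration only).
cat > deployment.yaml <<'EOF'
apiVersion: apps/v1
kind: Deployment
metadata:
  name: local-ai
spec:
  replicas: 1
  selector:
    matchLabels:
      app: local-ai
  template:
    metadata:
      labels:
        app: local-ai
    spec:
      containers:
      - name: local-ai
        image: localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12
        ports:
        - containerPort: 8080
EOF

# Apply the replacement from above, then confirm the image field now points at the mirror.
sed -i 's#localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12#' deployment.yaml
grep 'image:' deployment.yaml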

Quick Ansible distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12  docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12'

Quick Ansible distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12  docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12'
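
After distribution, a quick spot-check that the retagged image is present on every node can reuse the same ad-hoc pattern (the k8s host group is taken from the commands above; use the line that matches the node runtime).

#ansible k8s -m shell -a 'docker image inspect docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12 > /dev/null && echo present'
#ansible k8s -m shell -a 'ctr -n k8s.io images ls | grep localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12'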

Image build history


# 2026-01-23 23:53:48  0.00B Configure the command run when the container starts
ENTRYPOINT ["/aio/entrypoint.sh"]
                        
# 2026-01-23 23:53:48  26.28KB Copy new files or directories into the container
COPY aio/ /aio # buildkit
                        
# 2026-01-23 23:53:48  59.59MB Run a command and create a new image layer
RUN /bin/sh -c apt-get update && apt-get install -y pciutils && apt-get clean # buildkit
                        
# 2026-01-23 23:47:00  0.00B Configure the command run when the container starts
ENTRYPOINT ["/entrypoint.sh"]
                        
# 2026-01-23 23:47:00  0.00B Declare the port the container listens on at runtime
EXPOSE [8080/tcp]
                        
# 2026-01-23 23:47:00  0.00B Create mount points for persisting or sharing data
VOLUME [/models /backends /configuration]
                        
# 2026-01-23 23:47:00  0.00B Specify the command used to check container health
HEALTHCHECK --interval=1m0s --timeout=10m0s --retries=10 CMD curl -f ${HEALTHCHECK_ENDPOINT} || exit 1
                        
# 2026-01-23 23:47:00  0.00B Run a command and create a new image layer
RUN |7 BUILD_TYPE=cublas CUDA_MAJOR_VERSION=12 CUDA_MINOR_VERSION=9 SKIP_DRIVERS=false TARGETARCH=amd64 TARGETVARIANT= UBUNTU_VERSION=2404 /bin/sh -c mkdir -p /models /backends # buildkit
                        
# 2026-01-23 23:47:00  78.43MB Copy new files or directories into the container
COPY /build/local-ai ./ # buildkit
                        
# 2026-01-23 23:45:07  777.00B Copy new files or directories into the container
COPY ./entrypoint.sh . # buildkit
                        
# 2026-01-23 23:45:07  0.00B Set the working directory to /
WORKDIR /
                        
# 2026-01-23 23:45:07  0.00B Set environment variable NVIDIA_VISIBLE_DEVICES
ENV NVIDIA_VISIBLE_DEVICES=all
                        
# 2026-01-23 23:45:07  0.00B Set environment variable NVIDIA_REQUIRE_CUDA
ENV NVIDIA_REQUIRE_CUDA=cuda>=12.0
                        
# 2026-01-23 23:45:07  0.00B Set environment variable NVIDIA_DRIVER_CAPABILITIES
ENV NVIDIA_DRIVER_CAPABILITIES=compute,utility
                        
# 2026-01-23 23:45:07  0.00B Define a build argument
ARG CUDA_MAJOR_VERSION=12
                        
# 2026-01-23 23:45:07  0.00B Set environment variable HEALTHCHECK_ENDPOINT
ENV HEALTHCHECK_ENDPOINT=http://localhost:8080/readyz
                        
# 2026-01-23 23:45:07  0.00B Set environment variable PATH
ENV PATH=/opt/rocm/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2026-01-23 23:45:07  0.00B Set environment variable PATH
ENV PATH=/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2026-01-23 23:45:07  0.00B Run a command and create a new image layer
RUN |7 BUILD_TYPE=cublas CUDA_MAJOR_VERSION=12 CUDA_MINOR_VERSION=9 SKIP_DRIVERS=false TARGETARCH=amd64 TARGETVARIANT= UBUNTU_VERSION=2404 /bin/sh -c expr "${BUILD_TYPE}" = intel && echo "intel" > /run/localai/capability || echo "not intel" # buildkit
                        
# 2026-01-23 23:45:07  0.00B Run a command and create a new image layer
RUN |7 BUILD_TYPE=cublas CUDA_MAJOR_VERSION=12 CUDA_MINOR_VERSION=9 SKIP_DRIVERS=false TARGETARCH=amd64 TARGETVARIANT= UBUNTU_VERSION=2404 /bin/sh -c if [ "${BUILD_TYPE}" = "hipblas" ]; then     ln -s /opt/rocm-**/lib/llvm/lib/libomp.so /usr/lib/libomp.so     ; fi # buildkit
                        
# 2026-01-23 23:45:07  0.00B Run a command and create a new image layer
RUN |7 BUILD_TYPE=cublas CUDA_MAJOR_VERSION=12 CUDA_MINOR_VERSION=9 SKIP_DRIVERS=false TARGETARCH=amd64 TARGETVARIANT= UBUNTU_VERSION=2404 /bin/sh -c if [ "${BUILD_TYPE}" = "hipblas" ] && [ "${SKIP_DRIVERS}" = "false" ]; then         apt-get update &&         apt-get install -y --no-install-recommends             hipblas-dev             rocblas-dev &&         apt-get clean &&         rm -rf /var/lib/apt/lists/* &&         echo "amd" > /run/localai/capability &&         ldconfig     ; fi # buildkit
                        
# 2026-01-23 23:45:07  0.00B Run a command and create a new image layer
RUN |7 BUILD_TYPE=cublas CUDA_MAJOR_VERSION=12 CUDA_MINOR_VERSION=9 SKIP_DRIVERS=false TARGETARCH=amd64 TARGETVARIANT= UBUNTU_VERSION=2404 /bin/sh -c if [ "${BUILD_TYPE}" = "clblas" ] && [ "${SKIP_DRIVERS}" = "false" ]; then         apt-get update &&         apt-get install -y --no-install-recommends             libclblast-dev &&         apt-get clean &&         rm -rf /var/lib/apt/lists/*     ; fi # buildkit
                        
# 2026-01-23 23:45:07  0.00B Run a command and create a new image layer
RUN |7 BUILD_TYPE=cublas CUDA_MAJOR_VERSION=12 CUDA_MINOR_VERSION=9 SKIP_DRIVERS=false TARGETARCH=amd64 TARGETVARIANT= UBUNTU_VERSION=2404 /bin/sh -c <<EOT bash
    if [ "${BUILD_TYPE}" = "cublas" ] && [ "${TARGETARCH}" = "arm64" ]; then
        wget https://developer.download.nvidia.com/compute/cudss/0.6.0/local_installers/cudss-local-tegra-repo-ubuntu${UBUNTU_VERSION}-0.6.0_0.6.0-1_arm64.deb && \
        dpkg -i cudss-local-tegra-repo-ubuntu${UBUNTU_VERSION}-0.6.0_0.6.0-1_arm64.deb && \
        cp /var/cudss-local-tegra-repo-ubuntu${UBUNTU_VERSION}-0.6.0/cudss-*-keyring.gpg /usr/share/keyrings/ && \
        apt-get update && apt-get -y install cudss cudss-cuda-${CUDA_MAJOR_VERSION} && \
        wget https://developer.download.nvidia.com/compute/nvpl/25.5/local_installers/nvpl-local-repo-ubuntu${UBUNTU_VERSION}-25.5_1.0-1_arm64.deb && \
        dpkg -i nvpl-local-repo-ubuntu${UBUNTU_VERSION}-25.5_1.0-1_arm64.deb && \
        cp /var/nvpl-local-repo-ubuntu${UBUNTU_VERSION}-25.5/nvpl-*-keyring.gpg /usr/share/keyrings/ && \
        apt-get update && apt-get install -y nvpl
    fi
EOT # buildkit
                        
# 2026-01-23 23:45:07  0.00B Run a command and create a new image layer
RUN |7 BUILD_TYPE=cublas CUDA_MAJOR_VERSION=12 CUDA_MINOR_VERSION=9 SKIP_DRIVERS=false TARGETARCH=amd64 TARGETVARIANT= UBUNTU_VERSION=2404 /bin/sh -c <<EOT bash
    if [ "${BUILD_TYPE}" = "cublas" ] && [ "${TARGETARCH}" = "arm64" ]; then
        echo "nvidia-l4t-cuda-${CUDA_MAJOR_VERSION}" > /run/localai/capability
    fi
EOT # buildkit
                        
# 2026-01-23 23:45:07  5.93GB Run a command and create a new image layer
RUN |7 BUILD_TYPE=cublas CUDA_MAJOR_VERSION=12 CUDA_MINOR_VERSION=9 SKIP_DRIVERS=false TARGETARCH=amd64 TARGETVARIANT= UBUNTU_VERSION=2404 /bin/sh -c <<EOT bash
    if ( [ "${BUILD_TYPE}" = "cublas" ] || [ "${BUILD_TYPE}" = "l4t" ] ) && [ "${SKIP_DRIVERS}" = "false" ]; then
        apt-get update && \
        apt-get install -y  --no-install-recommends \
            software-properties-common pciutils
        if [ "amd64" = "$TARGETARCH" ]; then
            curl -O https://developer.download.nvidia.com/compute/cuda/repos/ubuntu${UBUNTU_VERSION}/x86_64/cuda-keyring_1.1-1_all.deb
        fi
        if [ "arm64" = "$TARGETARCH" ]; then
            if [ "${CUDA_MAJOR_VERSION}" = "13" ]; then
                curl -O https://developer.download.nvidia.com/compute/cuda/repos/ubuntu${UBUNTU_VERSION}/sbsa/cuda-keyring_1.1-1_all.deb
            else
                curl -O https://developer.download.nvidia.com/compute/cuda/repos/ubuntu${UBUNTU_VERSION}/arm64/cuda-keyring_1.1-1_all.deb
            fi
        fi
        dpkg -i cuda-keyring_1.1-1_all.deb && \
        rm -f cuda-keyring_1.1-1_all.deb && \
        apt-get update && \
        apt-get install -y --no-install-recommends \
            cuda-nvcc-${CUDA_MAJOR_VERSION}-${CUDA_MINOR_VERSION} \
            libcufft-dev-${CUDA_MAJOR_VERSION}-${CUDA_MINOR_VERSION} \
            libcurand-dev-${CUDA_MAJOR_VERSION}-${CUDA_MINOR_VERSION} \
            libcublas-dev-${CUDA_MAJOR_VERSION}-${CUDA_MINOR_VERSION} \
            libcusparse-dev-${CUDA_MAJOR_VERSION}-${CUDA_MINOR_VERSION} \
            libcusolver-dev-${CUDA_MAJOR_VERSION}-${CUDA_MINOR_VERSION}
        if [ "${CUDA_MAJOR_VERSION}" = "13" ] && [ "arm64" = "$TARGETARCH" ]; then
            apt-get install -y --no-install-recommends \
            libcufile-${CUDA_MAJOR_VERSION}-${CUDA_MINOR_VERSION} libcudnn9-cuda-${CUDA_MAJOR_VERSION} cuda-cupti-${CUDA_MAJOR_VERSION}-${CUDA_MINOR_VERSION} libnvjitlink-${CUDA_MAJOR_VERSION}-${CUDA_MINOR_VERSION}
        fi
        apt-get clean && \
        rm -rf /var/lib/apt/lists/* && \
        echo "nvidia-cuda-${CUDA_MAJOR_VERSION}" > /run/localai/capability
    fi
EOT # buildkit
                        
# 2026-01-23 23:43:40  0.00B Run a command and create a new image layer
RUN |7 BUILD_TYPE=cublas CUDA_MAJOR_VERSION=12 CUDA_MINOR_VERSION=9 SKIP_DRIVERS=false TARGETARCH=amd64 TARGETVARIANT= UBUNTU_VERSION=2404 /bin/sh -c <<EOT bash
    if [ "${BUILD_TYPE}" = "vulkan" ] && [ "${SKIP_DRIVERS}" = "false" ]; then
        apt-get update && \
        apt-get install -y  --no-install-recommends \
            software-properties-common pciutils wget gpg-agent && \
        apt-get install -y libglm-dev cmake libxcb-dri3-0 libxcb-present0 libpciaccess0 \
            libpng-dev libxcb-keysyms1-dev libxcb-dri3-dev libx11-dev g++ gcc \
            libwayland-dev libxrandr-dev libxcb-randr0-dev libxcb-ewmh-dev \
            git python-is-python3 bison libx11-xcb-dev liblz4-dev libzstd-dev \
            ocaml-core ninja-build pkg-config libxml2-dev wayland-protocols python3-jsonschema \
            clang-format qtbase5-dev qt6-base-dev libxcb-glx0-dev sudo xz-utils mesa-vulkan-drivers
        if [ "amd64" = "$TARGETARCH" ]; then
            wget "https://sdk.lunarg.com/sdk/download/1.4.335.0/linux/vulkansdk-linux-x86_64-1.4.335.0.tar.xz" && \
            tar -xf vulkansdk-linux-x86_64-1.4.335.0.tar.xz && \
            rm vulkansdk-linux-x86_64-1.4.335.0.tar.xz && \
            mkdir -p /opt/vulkan-sdk && \
            mv 1.4.335.0 /opt/vulkan-sdk/ && \
            cd /opt/vulkan-sdk/1.4.335.0 && \
            ./vulkansdk --no-deps --maxjobs \
                vulkan-loader \
                vulkan-validationlayers \
                vulkan-extensionlayer \
                vulkan-tools \
                shaderc && \
            cp -rfv /opt/vulkan-sdk/1.4.335.0/x86_64/bin/* /usr/bin/ && \
            cp -rfv /opt/vulkan-sdk/1.4.335.0/x86_64/lib/* /usr/lib/x86_64-linux-gnu/ && \
            cp -rfv /opt/vulkan-sdk/1.4.335.0/x86_64/include/* /usr/include/ && \
            cp -rfv /opt/vulkan-sdk/1.4.335.0/x86_64/share/* /usr/share/ && \
            rm -rf /opt/vulkan-sdk
        fi
        if [ "arm64" = "$TARGETARCH" ]; then
            mkdir vulkan && cd vulkan && \
            curl -L -o vulkan-sdk.tar.xz https://github.com/mudler/vulkan-sdk-arm/releases/download/1.4.335.0/vulkansdk-ubuntu-24.04-arm-1.4.335.0.tar.xz && \
            tar -xvf vulkan-sdk.tar.xz && \
            rm vulkan-sdk.tar.xz && \
            cd 1.4.335.0 && \
            cp -rfv aarch64/bin/* /usr/bin/ && \
            cp -rfv aarch64/lib/* /usr/lib/aarch64-linux-gnu/ && \
            cp -rfv aarch64/include/* /usr/include/ && \
            cp -rfv aarch64/share/* /usr/share/ && \
            cd ../.. && \
            rm -rf vulkan
        fi
        ldconfig && \
        apt-get clean && \
        rm -rf /var/lib/apt/lists/* && \
        echo "vulkan" > /run/localai/capability
    fi
EOT # buildkit
                        
# 2026-01-23 23:43:40  8.00B Run a command and create a new image layer
RUN |7 BUILD_TYPE=cublas CUDA_MAJOR_VERSION=12 CUDA_MINOR_VERSION=9 SKIP_DRIVERS=false TARGETARCH=amd64 TARGETVARIANT= UBUNTU_VERSION=2404 /bin/sh -c echo "default" > /run/localai/capability # buildkit
                        
# 2026-01-23 23:43:40  0.00B Run a command and create a new image layer
RUN |7 BUILD_TYPE=cublas CUDA_MAJOR_VERSION=12 CUDA_MINOR_VERSION=9 SKIP_DRIVERS=false TARGETARCH=amd64 TARGETVARIANT= UBUNTU_VERSION=2404 /bin/sh -c mkdir -p /run/localai # buildkit
                        
# 2026-01-23 23:43:40  0.00B Define a build argument
ARG UBUNTU_VERSION=2404
                        
# 2026-01-23 23:43:40  0.00B Set environment variable BUILD_TYPE
ENV BUILD_TYPE=cublas
                        
# 2026-01-23 23:43:40  0.00B Define a build argument
ARG TARGETVARIANT=
                        
# 2026-01-23 23:43:40  0.00B Define a build argument
ARG TARGETARCH=amd64
                        
# 2026-01-23 23:43:40  0.00B Define a build argument
ARG SKIP_DRIVERS=false
                        
# 2026-01-23 23:43:40  0.00B Define a build argument
ARG CUDA_MINOR_VERSION=9
                        
# 2026-01-23 23:43:40  0.00B Define a build argument
ARG CUDA_MAJOR_VERSION=12
                        
# 2026-01-23 23:43:40  0.00B Define a build argument
ARG BUILD_TYPE=cublas
                        
# 2026-01-23 23:43:40  586.32MB Run a command and create a new image layer
RUN /bin/sh -c apt-get update &&     apt-get install -y --no-install-recommends         ca-certificates curl wget espeak-ng libgomp1         ffmpeg libopenblas0 libopenblas-dev sox &&     apt-get clean &&     rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2026-01-23 23:43:40  0.00B Set environment variable DEBIAN_FRONTEND
ENV DEBIAN_FRONTEND=noninteractive
                        
# 2026-01-13 13:37:27  0.00B 
/bin/sh -c #(nop)  CMD ["/bin/bash"]
                        
# 2026-01-13 13:37:27  78.12MB 
/bin/sh -c #(nop) ADD file:3077ee44db3cc7d38740d60a05c81418dd3825a007db473658464f52689e867b in / 
                        
# 2026-01-13 13:37:25  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.version=24.04
                        
# 2026-01-13 13:37:25  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.ref.name=ubuntu
                        
# 2026-01-13 13:37:25  0.00B 
/bin/sh -c #(nop)  ARG LAUNCHPAD_BUILD_ARCH
                        
# 2026-01-13 13:37:25  0.00B 
/bin/sh -c #(nop)  ARG RELEASE
                        
                    

Image information (docker inspect)

{
    "Id": "sha256:157392a4cf54bbcd104578c6a6815158320aefe5316c05bcabafb4fb326477af",
    "RepoTags": [
        "localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12"
    ],
    "RepoDigests": [
        "localai/localai@sha256:01b05d27876a8e2d6f207b13943b94b13bee44cc523206caaf324fb6d05f9393",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai@sha256:11eb93bf135b6b6365d4fa6bf3383bcbfd306f24f21b7859f12e07b2636d061d"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2026-01-23T15:53:48.727939849Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "ExposedPorts": {
            "8080/tcp": {}
        },
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/opt/rocm/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "DEBIAN_FRONTEND=noninteractive",
            "BUILD_TYPE=cublas",
            "HEALTHCHECK_ENDPOINT=http://localhost:8080/readyz",
            "NVIDIA_DRIVER_CAPABILITIES=compute,utility",
            "NVIDIA_REQUIRE_CUDA=cuda\u003e=12.0",
            "NVIDIA_VISIBLE_DEVICES=all"
        ],
        "Cmd": null,
        "Healthcheck": {
            "Test": [
                "CMD-SHELL",
                "curl -f ${HEALTHCHECK_ENDPOINT} || exit 1"
            ],
            "Interval": 60000000000,
            "Timeout": 600000000000,
            "Retries": 10
        },
        "Image": "",
        "Volumes": {
            "/backends": {},
            "/configuration": {},
            "/models": {}
        },
        "WorkingDir": "/",
        "Entrypoint": [
            "/aio/entrypoint.sh"
        ],
        "OnBuild": null,
        "Labels": {
            "org.opencontainers.image.created": "2026-01-23T15:43:03.119Z",
            "org.opencontainers.image.description": ":robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement,  running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more. Features: Generate Text, MCP, Audio, Video, Images, Voice Cloning, Distributed, P2P and decentralized inference",
            "org.opencontainers.image.licenses": "MIT",
            "org.opencontainers.image.ref.name": "ubuntu",
            "org.opencontainers.image.revision": "923ebbb3440dcd105e13097a819f7a77193ab06f",
            "org.opencontainers.image.source": "https://github.com/mudler/LocalAI",
            "org.opencontainers.image.title": "LocalAI",
            "org.opencontainers.image.url": "https://github.com/mudler/LocalAI",
            "org.opencontainers.image.version": "v3.10.1-aio-gpu-nvidia-cuda-12"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 6732975303,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/08062db82027d65734edb55d84618483bd0e7ecfa45f4f598cd736ce56ad762d/diff:/var/lib/docker/overlay2/4f03314cfcb39d901615360ae91ec76fab3b8dc8d64757cce1af01a2d52bb66d/diff:/var/lib/docker/overlay2/21e310e846f6c6355a25ff84dfc97599f697f3ba338dae3930051b4e74a1beab/diff:/var/lib/docker/overlay2/a878478dcca2586863e631f075631a760fe11b7cfd77c1c9d3bd5eee8a4305d2/diff:/var/lib/docker/overlay2/4836f4fdfa10bb1ec6fd25559b706ec0d6f84e18303005c3b9fa63279523d8a8/diff:/var/lib/docker/overlay2/f7ed3c8e9a7dc70a25c322f4d0271ee595c94fb9c5e00c61df12f1c2c2dea5ec/diff:/var/lib/docker/overlay2/151490ad06e90d5dd637e34be731315e60baa8af251fcd92434a8888c183ef4f/diff:/var/lib/docker/overlay2/3097be21af5cd980908291e29c7377ec9abe278c4e23398aceb007d3dc4dfd18/diff:/var/lib/docker/overlay2/80281f301efc31debce5f1491cc11ffa252757315b9c10a660b982d7fa347d5d/diff:/var/lib/docker/overlay2/409ebe8e2042abc6e521dca4ab0945d72013a6f5f64295dd9fa1e9638cb0c78f/diff:/var/lib/docker/overlay2/0adadf4393e110e02f39841302cfba012bf1fdd2c0df4056f3fe924f0ad3fe51/diff:/var/lib/docker/overlay2/3bb158655bc0d03af0cd63bca1a4673da16c4f9dde613937b954f3b3e0c09d6b/diff:/var/lib/docker/overlay2/f2a1de9148f62fc8c99d15bb893c2561ed95d02583573aa1a430c9e51ea50648/diff:/var/lib/docker/overlay2/5111c8b08033020b6a4b967617700b8fba613051839f4bbb0d52078a3f88894d/diff:/var/lib/docker/overlay2/f60dd8d277cdb967336cea2237834acdb5559f5adeafc6b80701209d2e76cfe0/diff:/var/lib/docker/overlay2/179e8393bd17d570d7628e461a839899c153848fb1db08586925dddd7887188a/diff",
            "MergedDir": "/var/lib/docker/overlay2/03871d53a081e2166cca4fb8ac00e27ae2c63ae42acf00f958e505945a670072/merged",
            "UpperDir": "/var/lib/docker/overlay2/03871d53a081e2166cca4fb8ac00e27ae2c63ae42acf00f958e505945a670072/diff",
            "WorkDir": "/var/lib/docker/overlay2/03871d53a081e2166cca4fb8ac00e27ae2c63ae42acf00f958e505945a670072/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:123a078714d5ea9382d4d9f550753aefce8b34ec5ae11ae8273038d3bcbb943f",
            "sha256:0db52089e6d413b4ee7bc8a3b6f006bf5deab7f923fb3e601ec5ec585746de43",
            "sha256:8a794d0439cfae9f1c85a1e850470472803db830c24ff1c9c6bbb1abec5b15e9",
            "sha256:333a242690556da11767056c797705e62707cadcc5d1ff2550976cbbddc298e5",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:29e53c0d9b8210f7a86584543a68aab1a0212a60104b606bd27698b876cd65cb",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:0e63efd22514efe1b136c476e99f1f28808521e1efbf4d64dc569339ef534fef",
            "sha256:cc5eb39827fd0d9c9ee4b4cf45cc96196b119e837882336340ec16441fe0d31d",
            "sha256:fc0b908b69320189d808c4900fa8c7ea193321d2219bbcbfe4ff9afc1dea0c9c",
            "sha256:79610dd80768405250a3d3c309289a587e43ff9da3851ecb524023bdd5032cd1",
            "sha256:cac0fb0387436b717c49260627780ad67ebe68ba489f80d1236c1aed3a27d814"
        ]
    },
    "Metadata": {
        "LastTagTime": "2026-02-03T01:08:15.678718203+08:00"
    }
}
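
To confirm a locally pulled copy matches the metadata above, the image ID and repository digests can be compared using docker's Go-template output; a minimal sketch:

docker image inspect docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12 \
  --format 'ID: {{.Id}}  Digests: {{.RepoDigests}}'
# Expected ID: sha256:157392a4cf54bbcd104578c6a6815158320aefe5316c05bcabafb4fb326477af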

More versions

Image                                                       Platform      Registry    Size       Synced              Views
docker.io/localai/localai:latest-aio-cpu                    linux/amd64   docker.io   6.42GB     2024-11-08 14:23    1105
docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12     linux/amd64   docker.io   45.94GB    2024-11-21 01:51    632
docker.io/localai/localai:master-aio-gpu-nvidia-cuda-12     linux/amd64   docker.io   42.47GB    2025-02-28 01:41    581
docker.io/localai/localai:master-vulkan-ffmpeg-core         linux/amd64   docker.io   5.91GB     2025-03-03 18:48    423
docker.io/localai/localai:latest-aio-gpu-hipblas            linux/amd64   docker.io   88.18GB    2025-03-10 02:52    602
docker.io/localai/localai:latest-gpu-nvidia-cuda-12         linux/amd64   docker.io   41.80GB    2025-03-11 02:52    603
docker.io/localai/localai:v2.29.0-cublas-cuda12             linux/amd64   docker.io   13.57GB    2025-05-22 16:07    498
docker.io/localai/localai:v2.29.0-aio-gpu-nvidia-cuda-12    linux/amd64   docker.io   48.62GB    2025-05-29 02:49    417
docker.io/localai/localai:latest-gpu-nvidia-cuda-11         linux/amd64   docker.io   4.11GB     2025-09-05 09:51    229
docker.io/localai/localai:master-gpu-nvidia-cuda-13         linux/amd64   docker.io   4.46GB     2025-12-23 01:50    141
docker.io/localai/localai:latest                            linux/arm64   docker.io   487.63MB   2025-12-26 21:58    83
docker.io/localai/localai:latest-gpu-intel                  linux/amd64   docker.io   16.44GB    2025-12-27 01:47    97
docker.io/localai/localai:latest                            linux/amd64   docker.io   732.47MB   2026-01-20 15:37    79
docker.io/localai/localai:v3.10.1-aio-gpu-nvidia-cuda-12    linux/amd64   docker.io   6.73GB     2026-02-03 01:18    9