docker.io/localai/localai:master-vulkan-ffmpeg-core linux/amd64

docker.io/localai/localai:master-vulkan-ffmpeg-core - China-based mirror download source

LocalAI Docker Image

This Docker image packages LocalAI, a free, open-source, self-hosted drop-in replacement for the OpenAI API that runs language, audio, and image models locally on consumer-grade hardware.

Intended uses

* Quickly deploy and run LocalAI in a Docker container (see the run example under Usage below)
* Conveniently develop and experiment with machine-learning projects
* Easily share and deploy machine-learning models

Image contents

* LocalAI and all of its dependencies
* The necessary configuration files and tools
* Sample datasets and models

Usage

Refer to the official LocalAI documentation for detailed usage and installation instructions.
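A minimal run sketch (not taken from the official docs): publish port 8080, which the image exposes, and mount a host directory over the /build/models volume it declares. The host path ./models and the container name local-ai are only example values.

docker run -d --name local-ai -p 8080:8080 -v $(pwd)/models:/build/models swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:master-vulkan-ffmpeg-core

Because this is the Vulkan build (BUILD_TYPE=vulkan), the container may additionally need access to the host GPU device (for example --device /dev/dri) depending on your setup.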
Source image: docker.io/localai/localai:master-vulkan-ffmpeg-core
China mirror: swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:master-vulkan-ffmpeg-core
Image ID: sha256:a8fdaf72af7fce5050079d1cc1337b43c521818e39fc8bca87b6cee43a4af1c5
Image tag: master-vulkan-ffmpeg-core
Size: 5.91GB
Source registry: docker.io
Project info: Docker Hub page / project tags
CMD: (none)
Entrypoint: /build/entrypoint.sh
Working directory: /build
OS/Platform: linux/amd64
Image created: 2025-03-03T01:08:05.425575379Z
Synced: 2025-03-03 18:48
Updated: 2025-03-26 04:20
Exposed ports:
8080/tcp
Environment variables:
PATH=/opt/rocm/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/root/go/bin:/usr/local/go/bin
DEBIAN_FRONTEND=noninteractive
EXTERNAL_GRPC_BACKENDS=coqui:/build/backend/python/coqui/run.sh,transformers:/build/backend/python/transformers/run.sh,rerankers:/build/backend/python/rerankers/run.sh,autogptq:/build/backend/python/autogptq/run.sh,bark:/build/backend/python/bark/run.sh,diffusers:/build/backend/python/diffusers/run.sh,faster-whisper:/build/backend/python/faster-whisper/run.sh,kokoro:/build/backend/python/kokoro/run.sh,vllm:/build/backend/python/vllm/run.sh,exllama2:/build/backend/python/exllama2/run.sh
BUILD_TYPE=vulkan
REBUILD=false
HEALTHCHECK_ENDPOINT=http://localhost:8080/readyz
MAKEFLAGS=--jobs=4 --output-sync=target
NVIDIA_DRIVER_CAPABILITIES=compute,utility
NVIDIA_REQUIRE_CUDA=cuda>=.0
NVIDIA_VISIBLE_DEVICES=all
Image labels
org.opencontainers.image.created: 2025-03-03T00:53:46.633Z
org.opencontainers.image.description: :robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more models architectures. Features: Generate Text, Audio, Video, Images, Voice Cloning, Distributed, P2P inference
org.opencontainers.image.licenses: MIT
org.opencontainers.image.ref.name: ubuntu
org.opencontainers.image.revision: d616058b124cddc09ef3d80f391821509da07eb3
org.opencontainers.image.source: https://github.com/mudler/LocalAI
org.opencontainers.image.title: LocalAI
org.opencontainers.image.url: https://github.com/mudler/LocalAI
org.opencontainers.image.version: master-vulkan-ffmpeg-core
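The environment variables above are defaults baked into the image and can be overridden at run time with docker run -e. The HEALTHCHECK_ENDPOINT value also shows how readiness is probed; a sketch, assuming the container from the run example above with port 8080 published:

curl -f http://localhost:8080/readyz
docker exec local-ai sh -c 'curl -f "$HEALTHCHECK_ENDPOINT"'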

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:master-vulkan-ffmpeg-core
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:master-vulkan-ffmpeg-core  docker.io/localai/localai:master-vulkan-ffmpeg-core
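After pulling and retagging, it is worth confirming that the local tag points at the image ID listed above; a quick check:

docker images | grep localai/localai
docker image inspect --format '{{.Id}}' docker.io/localai/localai:master-vulkan-ffmpeg-core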

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:master-vulkan-ffmpeg-core
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:master-vulkan-ffmpeg-core  docker.io/localai/localai:master-vulkan-ffmpeg-core
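On Kubernetes nodes, containerd typically keeps kubelet-visible images in the k8s.io namespace, so the same commands are usually run with an explicit namespace:

ctr -n k8s.io images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:master-vulkan-ffmpeg-core
ctr -n k8s.io images tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:master-vulkan-ffmpeg-core docker.io/localai/localai:master-vulkan-ffmpeg-core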

Quick shell replacement command

sed -i 's#localai/localai:master-vulkan-ffmpeg-core#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:master-vulkan-ffmpeg-core#' deployment.yaml
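Before editing in place, a dry run helps verify the substitution only touches the intended image reference (deployment.yaml is just the example manifest name used above):

grep -n 'localai/localai:master-vulkan-ffmpeg-core' deployment.yaml
sed 's#localai/localai:master-vulkan-ffmpeg-core#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:master-vulkan-ffmpeg-core#' deployment.yaml | grep localai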

Ansible bulk distribution (Docker)

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:master-vulkan-ffmpeg-core && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:master-vulkan-ffmpeg-core  docker.io/localai/localai:master-vulkan-ffmpeg-core'

Ansible bulk distribution (Containerd)

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:master-vulkan-ffmpeg-core && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:master-vulkan-ffmpeg-core  docker.io/localai/localai:master-vulkan-ffmpeg-core'
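Both one-liners assume an Ansible inventory group named k8s; a reachability check before distributing, and a spot check afterwards on Docker nodes, might look like:

ansible k8s -m ping
ansible k8s -m shell -a 'docker images | grep localai/localai'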

Image build history


# 2025-03-03 09:08:05  0.00B Set the command to run when the container starts
ENTRYPOINT ["/build/entrypoint.sh"]

# 2025-03-03 09:08:05  0.00B Declare the port the container listens on at runtime
EXPOSE map[8080/tcp:{}]

# 2025-03-03 09:08:05  0.00B Create a mount point for persisting or sharing data
VOLUME [/build/models]

# 2025-03-03 09:08:05  0.00B Define the command used to check container health
HEALTHCHECK &{["CMD-SHELL" "curl -f ${HEALTHCHECK_ENDPOINT} || exit 1"] "1m0s" "10m0s" "0s" "0s" '\n'}
                        
# 2025-03-03 09:08:05  0.00B Run a command and create a new image layer
RUN |13 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= BUILD_TYPE=vulkan CUDA_MAJOR_VERSION= CUDA_MINOR_VERSION= SKIP_DRIVERS=false FFMPEG=true IMAGE_TYPE=core EXTRA_BACKENDS= MAKEFLAGS=--jobs=4 --output-sync=target /bin/bash -c mkdir -p /build/models # buildkit
                        
# 2025-03-03 09:08:05  0.00B Run a command and create a new image layer
RUN |13 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= BUILD_TYPE=vulkan CUDA_MAJOR_VERSION= CUDA_MINOR_VERSION= SKIP_DRIVERS=false FFMPEG=true IMAGE_TYPE=core EXTRA_BACKENDS= MAKEFLAGS=--jobs=4 --output-sync=target /bin/bash -c if [[ ( "${EXTRA_BACKENDS}" =~ "vllm" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then         make -C backend/python/vllm     ; fi &&     if [[ ( "${EXTRA_BACKENDS}" =~ "autogptq" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then         make -C backend/python/autogptq     ; fi &&     if [[ ( "${EXTRA_BACKENDS}" =~ "bark" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then         make -C backend/python/bark     ; fi &&     if [[ ( "${EXTRA_BACKENDS}" =~ "rerankers" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then         make -C backend/python/rerankers     ; fi # buildkit
                        
# 2025-03-03 09:08:05  0.00B Run a command and create a new image layer
RUN |13 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= BUILD_TYPE=vulkan CUDA_MAJOR_VERSION= CUDA_MINOR_VERSION= SKIP_DRIVERS=false FFMPEG=true IMAGE_TYPE=core EXTRA_BACKENDS= MAKEFLAGS=--jobs=4 --output-sync=target /bin/bash -c if [[ ( "${EXTRA_BACKENDS}" =~ "kokoro" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then         make -C backend/python/kokoro     ; fi &&     if [[ ( "${EXTRA_BACKENDS}" =~ "exllama2" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then         make -C backend/python/exllama2     ; fi &&     if [[ ( "${EXTRA_BACKENDS}" =~ "transformers" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then         make -C backend/python/transformers     ; fi # buildkit
                        
# 2025-03-03 09:08:05  0.00B Run a command and create a new image layer
RUN |13 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= BUILD_TYPE=vulkan CUDA_MAJOR_VERSION= CUDA_MINOR_VERSION= SKIP_DRIVERS=false FFMPEG=true IMAGE_TYPE=core EXTRA_BACKENDS= MAKEFLAGS=--jobs=4 --output-sync=target /bin/bash -c if [[ ( "${EXTRA_BACKENDS}" =~ "coqui" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then         make -C backend/python/coqui     ; fi &&     if [[ ( "${EXTRA_BACKENDS}" =~ "faster-whisper" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then         make -C backend/python/faster-whisper     ; fi &&     if [[ ( "${EXTRA_BACKENDS}" =~ "diffusers" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then         make -C backend/python/diffusers     ; fi # buildkit
                        
# 2025-03-03 09:08:04  0.00B Run a command and create a new image layer
RUN |13 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= BUILD_TYPE=vulkan CUDA_MAJOR_VERSION= CUDA_MINOR_VERSION= SKIP_DRIVERS=false FFMPEG=true IMAGE_TYPE=core EXTRA_BACKENDS= MAKEFLAGS=--jobs=4 --output-sync=target /bin/bash -c if [[ ( "${IMAGE_TYPE}" == "extras ")]]; then         apt-get -qq -y install espeak-ng     ; fi # buildkit
                        
# 2025-03-03 09:08:04  0.00B 
SHELL [/bin/bash -c]
                        
# 2025-03-03 09:08:04  34.64MB Copy new files or directories into the container
COPY /build/sources/go-piper/piper-phonemize/pi/lib/* /usr/lib/ # buildkit
                        
# 2025-03-03 09:08:04  582.33MB Copy new files or directories into the container
COPY /build/local-ai ./ # buildkit
                        
# 2025-03-03 09:08:02  1.51GB Run a command and create a new image layer
RUN |13 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= BUILD_TYPE=vulkan CUDA_MAJOR_VERSION= CUDA_MINOR_VERSION= SKIP_DRIVERS=false FFMPEG=true IMAGE_TYPE=core EXTRA_BACKENDS= MAKEFLAGS=--jobs=4 --output-sync=target /bin/sh -c make prepare-sources # buildkit
                        
# 2025-03-03 09:07:41  1.21GB Copy new files or directories into the container
COPY /opt/grpc /usr/local # buildkit
                        
# 2025-03-03 09:07:38  743.01MB Copy new files or directories into the container
COPY /build/sources ./sources/ # buildkit
                        
# 2025-03-03 08:56:08  16.10MB Copy new files or directories into the container
COPY . . # buildkit
                        
# 2025-03-03 08:56:08  0.00B Set the working directory to /build
WORKDIR /build

# 2025-03-03 08:56:08  150.50MB Run a command and create a new image layer
RUN |13 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= BUILD_TYPE=vulkan CUDA_MAJOR_VERSION= CUDA_MINOR_VERSION= SKIP_DRIVERS=false FFMPEG=true IMAGE_TYPE=core EXTRA_BACKENDS= MAKEFLAGS=--jobs=4 --output-sync=target /bin/sh -c if [ "${FFMPEG}" = "true" ]; then         apt-get update &&         apt-get install -y --no-install-recommends             ffmpeg &&         apt-get clean &&         rm -rf /var/lib/apt/lists/*     ; fi # buildkit
                        
# 2025-03-03 08:55:54  0.00B Set environment variable NVIDIA_VISIBLE_DEVICES
ENV NVIDIA_VISIBLE_DEVICES=all

# 2025-03-03 08:55:54  0.00B Set environment variable NVIDIA_REQUIRE_CUDA
ENV NVIDIA_REQUIRE_CUDA=cuda>=.0

# 2025-03-03 08:55:54  0.00B Set environment variable NVIDIA_DRIVER_CAPABILITIES
ENV NVIDIA_DRIVER_CAPABILITIES=compute,utility

# 2025-03-03 08:55:54  0.00B Define a build argument
ARG CUDA_MAJOR_VERSION=

# 2025-03-03 08:55:54  0.00B Set environment variable MAKEFLAGS
ENV MAKEFLAGS=--jobs=4 --output-sync=target

# 2025-03-03 08:55:54  0.00B Set environment variable HEALTHCHECK_ENDPOINT
ENV HEALTHCHECK_ENDPOINT=http://localhost:8080/readyz

# 2025-03-03 08:55:54  0.00B Set environment variable REBUILD
ENV REBUILD=false

# 2025-03-03 08:55:54  0.00B Set environment variable BUILD_TYPE
ENV BUILD_TYPE=vulkan

# 2025-03-03 08:55:54  0.00B Define a build argument
ARG MAKEFLAGS=--jobs=4 --output-sync=target

# 2025-03-03 08:55:54  0.00B Define a build argument
ARG EXTRA_BACKENDS

# 2025-03-03 08:55:54  0.00B Define a build argument
ARG IMAGE_TYPE=core

# 2025-03-03 08:55:54  0.00B Define a build argument
ARG TARGETARCH=amd64

# 2025-03-03 08:55:54  0.00B Define a build argument
ARG BUILD_TYPE=vulkan

# 2025-03-03 08:55:54  0.00B Define a build argument
ARG FFMPEG=true

# 2025-03-03 08:55:54  0.00B Run a command and create a new image layer
RUN |9 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= BUILD_TYPE=vulkan CUDA_MAJOR_VERSION= CUDA_MINOR_VERSION= SKIP_DRIVERS=false /bin/sh -c if [ "${BUILD_TYPE}" = "hipblas" ] && [ "${SKIP_DRIVERS}" = "false" ]; then         apt-get update &&         apt-get install -y --no-install-recommends             hipblas-dev             rocblas-dev &&         apt-get clean &&         rm -rf /var/lib/apt/lists/* &&         ldconfig     ; fi # buildkit
                        
# 2025-03-03 08:55:54  0.00B Run a command and create a new image layer
RUN |9 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= BUILD_TYPE=vulkan CUDA_MAJOR_VERSION= CUDA_MINOR_VERSION= SKIP_DRIVERS=false /bin/sh -c if [ "${BUILD_TYPE}" = "clblas" ] && [ "${SKIP_DRIVERS}" = "false" ]; then         apt-get update &&         apt-get install -y --no-install-recommends             libclblast-dev &&         apt-get clean &&         rm -rf /var/lib/apt/lists/*     ; fi # buildkit
                        
# 2025-03-03 08:55:54  0.00B Run a command and create a new image layer
RUN |9 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= BUILD_TYPE=vulkan CUDA_MAJOR_VERSION= CUDA_MINOR_VERSION= SKIP_DRIVERS=false /bin/sh -c <<EOT bash
    if [ "${BUILD_TYPE}" = "cublas" ] && [ "${SKIP_DRIVERS}" = "false" ]; then
        apt-get update && \
        apt-get install -y  --no-install-recommends \
            software-properties-common pciutils
        if [ "amd64" = "$TARGETARCH" ]; then
            curl -O https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/cuda-keyring_1.1-1_all.deb
        fi
        if [ "arm64" = "$TARGETARCH" ]; then
            curl -O https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/arm64/cuda-keyring_1.1-1_all.deb
        fi
        dpkg -i cuda-keyring_1.1-1_all.deb && \
        rm -f cuda-keyring_1.1-1_all.deb && \
        apt-get update && \
        apt-get install -y --no-install-recommends \
            cuda-nvcc-${CUDA_MAJOR_VERSION}-${CUDA_MINOR_VERSION} \
            libcufft-dev-${CUDA_MAJOR_VERSION}-${CUDA_MINOR_VERSION} \
            libcurand-dev-${CUDA_MAJOR_VERSION}-${CUDA_MINOR_VERSION} \
            libcublas-dev-${CUDA_MAJOR_VERSION}-${CUDA_MINOR_VERSION} \
            libcusparse-dev-${CUDA_MAJOR_VERSION}-${CUDA_MINOR_VERSION} \
            libcusolver-dev-${CUDA_MAJOR_VERSION}-${CUDA_MINOR_VERSION} && \
        apt-get clean && \
        rm -rf /var/lib/apt/lists/*
    fi
EOT # buildkit
                        
# 2025-03-03 08:55:53  742.40MB Run a command and create a new image layer
RUN |9 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= BUILD_TYPE=vulkan CUDA_MAJOR_VERSION= CUDA_MINOR_VERSION= SKIP_DRIVERS=false /bin/sh -c <<EOT bash
    if [ "${BUILD_TYPE}" = "vulkan" ] && [ "${SKIP_DRIVERS}" = "false" ]; then
        apt-get update && \
        apt-get install -y  --no-install-recommends \
            software-properties-common pciutils wget gpg-agent && \
        wget -qO - https://packages.lunarg.com/lunarg-signing-key-pub.asc | apt-key add - && \
        wget -qO /etc/apt/sources.list.d/lunarg-vulkan-jammy.list https://packages.lunarg.com/vulkan/lunarg-vulkan-jammy.list && \
        apt-get update && \
        apt-get install -y \
            vulkan-sdk && \
        apt-get clean && \
        rm -rf /var/lib/apt/lists/*
    fi
EOT # buildkit
                        
# 2025-03-03 08:54:48  0.00B Set environment variable BUILD_TYPE
ENV BUILD_TYPE=vulkan

# 2025-03-03 08:54:48  0.00B Define a build argument
ARG SKIP_DRIVERS=false

# 2025-03-03 08:54:48  0.00B Define a build argument
ARG CUDA_MINOR_VERSION=

# 2025-03-03 08:54:48  0.00B Define a build argument
ARG CUDA_MAJOR_VERSION=

# 2025-03-03 08:54:48  0.00B Define a build argument
ARG BUILD_TYPE=vulkan

# 2025-03-03 08:54:48  0.00B Set the working directory to /build
WORKDIR /build

# 2025-03-03 08:54:48  110.52MB Run a command and create a new image layer
RUN |5 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= /bin/sh -c apt-get update &&     apt-get install -y --no-install-recommends         libopenblas-dev &&     apt-get clean &&     rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2025-03-03 08:54:42  0.00B Set environment variable PATH
ENV PATH=/opt/rocm/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/root/go/bin:/usr/local/go/bin
                        
# 2025-03-03 08:54:42  0.00B Set environment variable PATH
ENV PATH=/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/root/go/bin:/usr/local/go/bin
                        
# 2025-03-03 08:54:42  0.00B Run a command and create a new image layer
RUN |5 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= /bin/sh -c echo "Target Variant: $TARGETVARIANT" # buildkit
                        
# 2025-03-03 08:54:42  0.00B Run a command and create a new image layer
RUN |5 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= /bin/sh -c echo "Target Architecture: $TARGETARCH" # buildkit
                        
# 2025-03-03 08:54:42  0.00B Run a command and create a new image layer
RUN |5 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= /bin/sh -c test -n "$TARGETARCH"     || (echo 'warn: missing $TARGETARCH, either set this `ARG` manually, or run using `docker buildkit`') # buildkit
                        
# 2025-03-03 08:54:42  219.34KB Run a command and create a new image layer
RUN |5 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= /bin/sh -c update-ca-certificates # buildkit
                        
# 2025-03-03 08:54:41  0.00B Copy new files or directories into the container
COPY --chmod=644 custom-ca-certs/* /usr/local/share/ca-certificates/ # buildkit
                        
# 2025-03-03 08:54:41  136.48MB Run a command and create a new image layer
RUN |5 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= /bin/sh -c go install google.golang.org/protobuf/cmd/protoc-gen-go@v1.34.2 &&     go install google.golang.org/grpc/cmd/protoc-gen-go-grpc@1958fcbe2ca8bd93af633f11e97d44e567e945af # buildkit
                        
# 2025-03-03 08:54:33  0.00B Set environment variable PATH
ENV PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/root/go/bin:/usr/local/go/bin
                        
# 2025-03-03 08:54:33  222.31MB Run a command and create a new image layer
RUN |5 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= /bin/sh -c curl -L -s https://go.dev/dl/go${GO_VERSION}.linux-${TARGETARCH}.tar.gz | tar -C /usr/local -xz # buildkit
                        
# 2025-03-03 08:54:29  68.61MB Run a command and create a new image layer
RUN |5 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= /bin/sh -c <<EOT bash
    if [ "${CMAKE_FROM_SOURCE}}" = "true" ]; then
        curl -L -s https://github.com/Kitware/CMake/releases/download/v${CMAKE_VERSION}/cmake-${CMAKE_VERSION}.tar.gz -o cmake.tar.gz && tar xvf cmake.tar.gz && cd cmake-${CMAKE_VERSION} && ./configure && make && make install
    else
        apt-get update && \
        apt-get install -y \
            cmake && \
        apt-get clean && \
        rm -rf /var/lib/apt/lists/*
    fi
EOT # buildkit
                        
# 2025-03-03 08:54:22  308.87MB Run a command and create a new image layer
RUN |5 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= /bin/sh -c apt-get update &&     apt-get install -y --no-install-recommends         build-essential         ccache         ca-certificates         curl libssl-dev         git         unzip upx-ucl &&     apt-get clean &&     rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2025-03-03 08:54:22  0.00B Set environment variable EXTERNAL_GRPC_BACKENDS
ENV EXTERNAL_GRPC_BACKENDS=coqui:/build/backend/python/coqui/run.sh,transformers:/build/backend/python/transformers/run.sh,rerankers:/build/backend/python/rerankers/run.sh,autogptq:/build/backend/python/autogptq/run.sh,bark:/build/backend/python/bark/run.sh,diffusers:/build/backend/python/diffusers/run.sh,faster-whisper:/build/backend/python/faster-whisper/run.sh,kokoro:/build/backend/python/kokoro/run.sh,vllm:/build/backend/python/vllm/run.sh,exllama2:/build/backend/python/exllama2/run.sh
                        
# 2025-03-03 08:54:22  0.00B Set environment variable DEBIAN_FRONTEND
ENV DEBIAN_FRONTEND=noninteractive
                        
# 2025-03-03 08:54:22  0.00B Define a build argument
ARG TARGETVARIANT=

# 2025-03-03 08:54:22  0.00B Define a build argument
ARG TARGETARCH=amd64

# 2025-03-03 08:54:22  0.00B Define a build argument
ARG CMAKE_FROM_SOURCE=false

# 2025-03-03 08:54:22  0.00B Define a build argument
ARG CMAKE_VERSION=3.26.4

# 2025-03-03 08:54:22  0.00B Define a build argument
ARG GO_VERSION=1.22.6

# 2025-03-03 08:54:22  0.00B Set the user the container runs as
USER root
                        
# 2025-01-26 13:31:11  0.00B 
/bin/sh -c #(nop)  CMD ["/bin/bash"]
                        
# 2025-01-26 13:31:10  77.86MB 
/bin/sh -c #(nop) ADD file:1b6c8c9518be42fa2afe5e241ca31677fce58d27cdfa88baa91a65a259be3637 in / 
                        
# 2025-01-26 13:31:07  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.version=22.04
                        
# 2025-01-26 13:31:07  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.ref.name=ubuntu
                        
# 2025-01-26 13:31:07  0.00B 
/bin/sh -c #(nop)  ARG LAUNCHPAD_BUILD_ARCH
                        
# 2025-01-26 13:31:07  0.00B 
/bin/sh -c #(nop)  ARG RELEASE
                        
                    

Image information

{
    "Id": "sha256:a8fdaf72af7fce5050079d1cc1337b43c521818e39fc8bca87b6cee43a4af1c5",
    "RepoTags": [
        "localai/localai:master-vulkan-ffmpeg-core",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:master-vulkan-ffmpeg-core"
    ],
    "RepoDigests": [
        "localai/localai@sha256:06d294730d40c5984cd250650ac9dd64378003134985f33e8f0f75c8620bc03c",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai@sha256:49d9b196d23bc3b64d962ae2a9dd66f035ead29910c905770a5d2b3a4985fec2"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2025-03-03T01:08:05.425575379Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "root",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "ExposedPorts": {
            "8080/tcp": {}
        },
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/opt/rocm/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/root/go/bin:/usr/local/go/bin",
            "DEBIAN_FRONTEND=noninteractive",
            "EXTERNAL_GRPC_BACKENDS=coqui:/build/backend/python/coqui/run.sh,transformers:/build/backend/python/transformers/run.sh,rerankers:/build/backend/python/rerankers/run.sh,autogptq:/build/backend/python/autogptq/run.sh,bark:/build/backend/python/bark/run.sh,diffusers:/build/backend/python/diffusers/run.sh,faster-whisper:/build/backend/python/faster-whisper/run.sh,kokoro:/build/backend/python/kokoro/run.sh,vllm:/build/backend/python/vllm/run.sh,exllama2:/build/backend/python/exllama2/run.sh",
            "BUILD_TYPE=vulkan",
            "REBUILD=false",
            "HEALTHCHECK_ENDPOINT=http://localhost:8080/readyz",
            "MAKEFLAGS=--jobs=4 --output-sync=target",
            "NVIDIA_DRIVER_CAPABILITIES=compute,utility",
            "NVIDIA_REQUIRE_CUDA=cuda\u003e=.0",
            "NVIDIA_VISIBLE_DEVICES=all"
        ],
        "Cmd": null,
        "Healthcheck": {
            "Test": [
                "CMD-SHELL",
                "curl -f ${HEALTHCHECK_ENDPOINT} || exit 1"
            ],
            "Interval": 60000000000,
            "Timeout": 600000000000,
            "Retries": 10
        },
        "Image": "",
        "Volumes": {
            "/build/models": {}
        },
        "WorkingDir": "/build",
        "Entrypoint": [
            "/build/entrypoint.sh"
        ],
        "OnBuild": null,
        "Labels": {
            "org.opencontainers.image.created": "2025-03-03T00:53:46.633Z",
            "org.opencontainers.image.description": ":robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI,  running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more models architectures. Features: Generate Text, Audio, Video, Images, Voice Cloning, Distributed, P2P inference",
            "org.opencontainers.image.licenses": "MIT",
            "org.opencontainers.image.ref.name": "ubuntu",
            "org.opencontainers.image.revision": "d616058b124cddc09ef3d80f391821509da07eb3",
            "org.opencontainers.image.source": "https://github.com/mudler/LocalAI",
            "org.opencontainers.image.title": "LocalAI",
            "org.opencontainers.image.url": "https://github.com/mudler/LocalAI",
            "org.opencontainers.image.version": "master-vulkan-ffmpeg-core"
        },
        "Shell": [
            "/bin/bash",
            "-c"
        ]
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 5910982339,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/dd2d63db14a1d7be80e162eb5aaca3e0e72072327bc667a04a85c4ec9a40c000/diff:/var/lib/docker/overlay2/0867422c5ff0f7ffe6d049a5469cdca939c5abe273f810f2d01d5375254152d9/diff:/var/lib/docker/overlay2/ec138263db70bc659097c95be036f252f1c22cdb9ac1662ea32a7ba89e66f1ad/diff:/var/lib/docker/overlay2/6da6a2b7f5f66d3326d078ffbbc227bc851fa386372211d9567da17d97687681/diff:/var/lib/docker/overlay2/1498d0f562e7cdcd319cad3475fd251cb3f8a63987d9420e2307744ce9abbed0/diff:/var/lib/docker/overlay2/c2e958257e17a715a8260d9364be69e9bcb8addecdaeac2d6582b145342356e9/diff:/var/lib/docker/overlay2/e5b7a500efc4251eea99d27bf2560956bad5ee94f850d40de38446cc7f0def50/diff:/var/lib/docker/overlay2/130ba456ebbca44022619a0dc673132d190ae5b5751a49697a3423a5d3a4fdaa/diff:/var/lib/docker/overlay2/65efd12f4cce30eac072ffd8fb307e793dd013ff110d872fafed86d46865b457/diff:/var/lib/docker/overlay2/7aa2e803885f3d180cadf4344bfead9b0f6a49406634767436718afc482be131/diff:/var/lib/docker/overlay2/65fce5be968aa71dc6407b6f649633dcd51131a724e3c106d9a71919d539da36/diff:/var/lib/docker/overlay2/2b7b9f6af8807c002dde84681b5810ea4c110c2d4722d00f43020e543ea01028/diff:/var/lib/docker/overlay2/eff4629f6af81bc33ab55ccaf2fc7a9ce3606b825d8a139553a6507dbc383c32/diff:/var/lib/docker/overlay2/ea7ff6450780e7df98cd6832d2a08cf5c668dbbbef329bed55bc18c2b85f2436/diff:/var/lib/docker/overlay2/23c58f737f7c5a6c2f7e57cab6ea14fbb05b5ad053e678c2d16964818634bb0a/diff:/var/lib/docker/overlay2/90497b9b61ee3c868637ace312a3c1aa79b4417bd4c96ea00b5c0159e7850a02/diff:/var/lib/docker/overlay2/3268788374bba886de950bc55f59b345d6f6f26236b48b0cbfabbd970bb0b455/diff:/var/lib/docker/overlay2/e33e89105341dd754f168e13c886f8a0604926f00b191c4090299d4f761d10c5/diff:/var/lib/docker/overlay2/9a40c8f386836851ff51cf11c0054087c0c7d874a26fb8a312c2723ced39b15c/diff:/var/lib/docker/overlay2/253c5ac1e9552406f7ff3b5e71036e42563a3b440b15ca174f444dba45e50aa9/diff:/var/lib/docker/overlay2/18d5c8f1a3b0738ace26c445b28c3671863384c4207b63da5b59711d3558d64d/diff:/var/lib/docker/overlay2/8be8321189cb30638f157ba4ce2dc0490a5e03ca3705797dbd75feaa30291673/diff:/var/lib/docker/overlay2/71f6d836892aaa72cce01b24e9fb4530698e9feac26e04a1d9f2ad445f63f7f0/diff:/var/lib/docker/overlay2/a23f4e8244b16ab68ad7c159706a5fc8afc61ea21cf5129a966f5104e0fc0ac2/diff:/var/lib/docker/overlay2/6a9a6bccaa430788ea1151270cc1c7f9f7e4db5eadc81cb08faa6e2eab1d1b73/diff:/var/lib/docker/overlay2/003a2e3a10e463b484d07e753400c38e7774eab735ec46bdeeae5a4ff8d266e2/diff:/var/lib/docker/overlay2/e4eafe9a73d57766ac636428c7ad3a8578fc0d1c78dd31b299c7505282414f3d/diff:/var/lib/docker/overlay2/ace3f972cf88bd330727fa9a25fd0df2c3fec1df161ac9102bf9f5739b40b82c/diff",
            "MergedDir": "/var/lib/docker/overlay2/91b1e1b8000f7c1221a1609d4dd4910f350c3b95201fd720a40e089d06a88351/merged",
            "UpperDir": "/var/lib/docker/overlay2/91b1e1b8000f7c1221a1609d4dd4910f350c3b95201fd720a40e089d06a88351/diff",
            "WorkDir": "/var/lib/docker/overlay2/91b1e1b8000f7c1221a1609d4dd4910f350c3b95201fd720a40e089d06a88351/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:270a1170e7e398434ff1b31e17e233f7d7b71aa99a40473615860068e86720af",
            "sha256:85474af5600119435d9bc808bf831ec2083ad415f800f42bbb6c96ccbd999e40",
            "sha256:e68a825eacbfc8677b20bd8f117a0fd4789bc9c90b218f6e9ced5fbda9f6986b",
            "sha256:42794ef29432cbe0667444342a2f0f9dd447913d55647a93e90592fa1bae23c0",
            "sha256:f75ca1545afa33bc6d981e3b4627442583c066f4cdc3412fb95696b5a4613658",
            "sha256:6555089f025eecbe129ce7a2da89305d382cda674affccdcce9eaa6c2cc25482",
            "sha256:558dcdfffd29b647bab6abd936815c8b7de68870fa3abd84ec69becee06c7f33",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:5fdc358a868175f9df4e0954c5ee3f1ba76f5904322cfcb8eff5226464134813",
            "sha256:f65b499b91499c1b979afab78b10a02fc9591cf9b9f48b2388ce915d2cafaf12",
            "sha256:df984be93d89faa11e74fef75ce013db3172cfdb0dda7add986df6b7c20766be",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:07c012d12b7cf85ac643f177c224412693dc3d3a297fa820c4395e488a247f22",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:3259e6f3bd62b9ba8246d38cca5780527e1e567459e85bfc0d04aa222c23c620",
            "sha256:1d75a899858c40c07c9d6afd3b52d3a32f926f14a5aa96981d9c97bf11cc177f",
            "sha256:07cb08656714804b7e047977e313280462ea36d19d72b64eb650ac41dadf1452",
            "sha256:65f0a0d9d670f49e7631a17651e6c6e550ad11b73ca353b5ef28cc48fa779a53",
            "sha256:65e67952eea7d62ea2658813e4447061d028bf6003656032229ccbce8b814867",
            "sha256:053b4f4ce8e636d148d3c8e1a775ba398d0c8aef011988950a2c3bfc52dbb590",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:04ed6e21d0fe5b6093f1bf64038d8ac963c1ac86ba58ae7813a6f3d1f9ea3bb4"
        ]
    },
    "Metadata": {
        "LastTagTime": "2025-03-03T18:43:43.684026872+08:00"
    }
}
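The JSON above is standard docker image inspect output; once the image has been pulled, the same data (or a single field such as the entrypoint) can be reproduced locally:

docker image inspect swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:master-vulkan-ffmpeg-core
docker image inspect --format '{{json .Config.Entrypoint}}' swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:master-vulkan-ffmpeg-core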

More versions

docker.io/localai/localai:latest-aio-cpu
linux/amd64 · docker.io · 6.42GB · synced 2024-11-08 14:23

docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12
linux/amd64 · docker.io · 45.94GB · synced 2024-11-21 01:51

docker.io/localai/localai:master-aio-gpu-nvidia-cuda-12
linux/amd64 · docker.io · 42.47GB · synced 2025-02-28 01:41

docker.io/localai/localai:master-vulkan-ffmpeg-core
linux/amd64 · docker.io · 5.91GB · synced 2025-03-03 18:48

docker.io/localai/localai:latest-aio-gpu-hipblas
linux/amd64 · docker.io · 88.18GB · synced 2025-03-10 02:52

docker.io/localai/localai:latest-gpu-nvidia-cuda-12
linux/amd64 · docker.io · 41.80GB · synced 2025-03-11 02:52