docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12 linux/amd64

docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12 - China mirror download source

Note: this image uses the latest tag, so this site cannot guarantee that it is the most recent version of the image.

LocalAI Docker Image

This is a Docker image containing the LocalAI software. LocalAI is a free, open-source, self-hosted drop-in replacement for the OpenAI API that runs inference locally on consumer-grade hardware and supports gguf, transformers, diffusers and many other model architectures. This particular tag is the all-in-one (AIO) build for NVIDIA GPUs with CUDA 12.

Image Uses

* Quickly deploy and run LocalAI inside a Docker container (see the example below)
* Conveniently develop and experiment with machine-learning projects
* Easily share and deploy machine-learning models
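For illustration only, a minimal start command for this GPU image could look like the following sketch. It assumes the NVIDIA Container Toolkit is installed (the image requires CUDA >= 12.0) and uses placeholder names and host paths; port 8080 and the /build/models volume come from the image metadata below.

# "localai" and the host models directory are placeholders;
# the image exposes 8080/tcp and declares a /build/models volume
docker run -d --name localai \
  --gpus all \
  -p 8080:8080 \
  -v "$PWD/models:/build/models" \
  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12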

Image Contents

* The LocalAI software and all of its dependencies
* The necessary configuration files and tools
* Sample datasets and models

Usage Notes

Please refer to the official LocalAI documentation for detailed usage and installation instructions.
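As a rough sketch of exercising the running container (the endpoint paths follow the OpenAI-compatible API that LocalAI advertises; MODEL_NAME below is a placeholder, not something taken from this page):

# the image's built-in health check polls this endpoint
curl -f http://localhost:8080/readyz

# list the models preconfigured in the AIO image
curl http://localhost:8080/v1/models

# OpenAI-style chat completion; replace MODEL_NAME with a name returned above
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "MODEL_NAME", "messages": [{"role": "user", "content": "Hello"}]}'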
Source image    docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12
China mirror    swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12
Image ID        sha256:ed567b6d1cdeff319d33d901c52c2f5fb93f4acf359fd937dba847f82e57ba27
Image tag       latest-aio-gpu-nvidia-cuda-12
Size            45.94GB
Registry        docker.io
Project info    Docker Hub page / project tags
CMD             (not set)
Entrypoint      /aio/entrypoint.sh
Working dir     /build
OS/Platform     linux/amd64
Views           11
Contributor     48*****7@qq.com
Image created   2024-11-10T20:03:00.006501711Z
Synced          2024-11-21 01:51
Updated         2024-11-21 12:42
Exposed ports
8080/tcp
Environment variables
PATH=/root/.cargo/bin:/opt/rocm/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/root/go/bin:/usr/local/go/bin
DEBIAN_FRONTEND=noninteractive
EXTERNAL_GRPC_BACKENDS=coqui:/build/backend/python/coqui/run.sh,huggingface-embeddings:/build/backend/python/sentencetransformers/run.sh,transformers:/build/backend/python/transformers/run.sh,sentencetransformers:/build/backend/python/sentencetransformers/run.sh,rerankers:/build/backend/python/rerankers/run.sh,autogptq:/build/backend/python/autogptq/run.sh,bark:/build/backend/python/bark/run.sh,diffusers:/build/backend/python/diffusers/run.sh,openvoice:/build/backend/python/openvoice/run.sh,vall-e-x:/build/backend/python/vall-e-x/run.sh,vllm:/build/backend/python/vllm/run.sh,mamba:/build/backend/python/mamba/run.sh,exllama2:/build/backend/python/exllama2/run.sh,transformers-musicgen:/build/backend/python/transformers-musicgen/run.sh,parler-tts:/build/backend/python/parler-tts/run.sh
BUILD_TYPE=cublas
REBUILD=false
HEALTHCHECK_ENDPOINT=http://localhost:8080/readyz
MAKEFLAGS=--jobs=3 --output-sync=target
NVIDIA_DRIVER_CAPABILITIES=compute,utility
NVIDIA_REQUIRE_CUDA=cuda>=12.0
NVIDIA_VISIBLE_DEVICES=all
Image labels
org.opencontainers.image.created: 2024-11-10T18:10:05.710Z
org.opencontainers.image.description: :robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more models architectures. Features: Generate Text, Audio, Video, Images, Voice Cloning, Distributed inference
org.opencontainers.image.licenses: MIT
org.opencontainers.image.ref.name: ubuntu
org.opencontainers.image.revision: 9099d0c77e9e52f4a63c53aa546cc47f1e0cfdb1
org.opencontainers.image.source: https://github.com/mudler/LocalAI
org.opencontainers.image.title: LocalAI
org.opencontainers.image.url: https://github.com/mudler/LocalAI
org.opencontainers.image.version: v2.23.0-aio-gpu-nvidia-cuda-12
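The NVIDIA_* variables above come from the NVIDIA container runtime and default to exposing every GPU (NVIDIA_VISIBLE_DEVICES=all). As an illustrative sketch, not something documented on this page, the container can be pinned to a single GPU by narrowing the device request at run time:

# restrict the container to GPU 0 instead of the image default of all GPUs;
# container name and host path are placeholders
docker run -d --name localai \
  --gpus device=0 \
  -p 8080:8080 \
  -v "$PWD/models:/build/models" \
  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12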

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12  docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12
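As an optional sanity check (not part of the mirror's instructions), the retagged image can be compared against the image ID listed above:

docker inspect --format '{{.Id}}' docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12
# expected output: sha256:ed567b6d1cdeff319d33d901c52c2f5fb93f4acf359fd937dba847f82e57ba27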

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12  docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12
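One caveat, assuming the node runs Kubernetes on containerd (an assumption, not stated on this page): the kubelet looks for images in the k8s.io containerd namespace, so the pull and tag usually need the -n k8s.io flag:

ctr -n k8s.io images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12
ctr -n k8s.io images tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12 docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12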

Shell quick-replace command

sed -i 's#localai/localai:latest-aio-gpu-nvidia-cuda-12#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12#' deployment.yaml
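If the image reference appears in more than one manifest, a similar replacement can be applied across a directory. This sketch assumes GNU sed and plain-text manifests under the current directory, and, like the single-file command above, it is not idempotent (run it only once per tree):

# rewrite every file that still references the Docker Hub image name
grep -rl 'localai/localai:latest-aio-gpu-nvidia-cuda-12' . \
  | xargs sed -i 's#localai/localai:latest-aio-gpu-nvidia-cuda-12#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12#'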

Ansible quick distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12  docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12'

Ansible quick distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12  docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12'

Image history

Size  Created  Layer
0.00B 2024-11-11 04:03:00 ENTRYPOINT ["/aio/entrypoint.sh"]
29.78KB 2024-11-11 04:03:00 COPY aio/ /aio # buildkit
58.60MB 2024-11-11 04:02:59 RUN /bin/bash -c apt-get update && apt-get install -y pciutils && apt-get clean # buildkit
0.00B 2024-11-11 03:33:24 ENTRYPOINT ["/build/entrypoint.sh"]
0.00B 2024-11-11 03:33:24 EXPOSE map[8080/tcp:{}]
0.00B 2024-11-11 03:33:24 VOLUME [/build/models]
0.00B 2024-11-11 03:33:24 HEALTHCHECK &{["CMD-SHELL" "curl -f ${HEALTHCHECK_ENDPOINT} || exit 1"] "1m0s" "10m0s" "0s" "0s" '\n'}
0.00B 2024-11-11 03:33:24 RUN |15 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= BUILD_TYPE=cublas CUDA_MAJOR_VERSION=12 CUDA_MINOR_VERSION=0 FFMPEG=true BUILD_TYPE=cublas TARGETARCH=amd64 IMAGE_TYPE=extras EXTRA_BACKENDS= MAKEFLAGS=--jobs=3 --output-sync=target CUDA_MAJOR_VERSION=12 /bin/bash -c mkdir -p /build/models # buildkit
10.95GB 2024-11-11 03:33:23 RUN |15 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= BUILD_TYPE=cublas CUDA_MAJOR_VERSION=12 CUDA_MINOR_VERSION=0 FFMPEG=true BUILD_TYPE=cublas TARGETARCH=amd64 IMAGE_TYPE=extras EXTRA_BACKENDS= MAKEFLAGS=--jobs=3 --output-sync=target CUDA_MAJOR_VERSION=12 /bin/bash -c if [[ ( "${EXTRA_BACKENDS}" =~ "vllm" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then make -C backend/python/vllm ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "autogptq" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then make -C backend/python/autogptq ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "bark" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then make -C backend/python/bark ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "rerankers" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then make -C backend/python/rerankers ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "mamba" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then make -C backend/python/mamba ; fi # buildkit
14.27GB 2024-11-11 03:30:12 RUN |15 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= BUILD_TYPE=cublas CUDA_MAJOR_VERSION=12 CUDA_MINOR_VERSION=0 FFMPEG=true BUILD_TYPE=cublas TARGETARCH=amd64 IMAGE_TYPE=extras EXTRA_BACKENDS= MAKEFLAGS=--jobs=3 --output-sync=target CUDA_MAJOR_VERSION=12 /bin/bash -c if [[ ( "${EXTRA_BACKENDS}" =~ "vall-e-x" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then make -C backend/python/vall-e-x ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "openvoice" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then make -C backend/python/openvoice ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "sentencetransformers" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then make -C backend/python/sentencetransformers ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "exllama2" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then make -C backend/python/exllama2 ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "transformers" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then make -C backend/python/transformers ; fi # buildkit
7.55GB 2024-11-11 03:24:45 RUN |15 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= BUILD_TYPE=cublas CUDA_MAJOR_VERSION=12 CUDA_MINOR_VERSION=0 FFMPEG=true BUILD_TYPE=cublas TARGETARCH=amd64 IMAGE_TYPE=extras EXTRA_BACKENDS= MAKEFLAGS=--jobs=3 --output-sync=target CUDA_MAJOR_VERSION=12 /bin/bash -c if [[ ( "${EXTRA_BACKENDS}" =~ "coqui" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then make -C backend/python/coqui ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "parler-tts" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then make -C backend/python/parler-tts ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "diffusers" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then make -C backend/python/diffusers ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "transformers-musicgen" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then make -C backend/python/transformers-musicgen ; fi # buildkit
0.00B 2024-11-11 03:23:57 SHELL [/bin/bash -c]
6.83MB 2024-11-11 03:23:57 COPY /build/backend-assets/grpc/stablediffusion ./backend-assets/grpc/stablediffusion # buildkit
34.64MB 2024-11-11 03:23:57 COPY /build/sources/go-piper/piper-phonemize/pi/lib/* /usr/lib/ # buildkit
1.35GB 2024-11-11 03:23:57 COPY /build/local-ai ./ # buildkit
1.45GB 2024-11-11 03:23:52 RUN |15 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= BUILD_TYPE=cublas CUDA_MAJOR_VERSION=12 CUDA_MINOR_VERSION=0 FFMPEG=true BUILD_TYPE=cublas TARGETARCH=amd64 IMAGE_TYPE=extras EXTRA_BACKENDS= MAKEFLAGS=--jobs=3 --output-sync=target CUDA_MAJOR_VERSION=12 /bin/sh -c make prepare-sources # buildkit
1.21GB 2024-11-11 03:23:32 COPY /opt/grpc /usr/local # buildkit
1.80GB 2024-11-11 03:23:26 COPY /build/sources ./sources/ # buildkit
14.39MB 2024-11-11 02:14:33 COPY . . # buildkit
0.00B 2024-11-11 02:14:33 WORKDIR /build
45.84MB 2024-11-11 02:14:33 RUN |15 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= BUILD_TYPE=cublas CUDA_MAJOR_VERSION=12 CUDA_MINOR_VERSION=0 FFMPEG=true BUILD_TYPE=cublas TARGETARCH=amd64 IMAGE_TYPE=extras EXTRA_BACKENDS= MAKEFLAGS=--jobs=3 --output-sync=target CUDA_MAJOR_VERSION=12 /bin/sh -c if [ "${FFMPEG}" = "true" ]; then apt-get update && apt-get install -y --no-install-recommends ffmpeg && apt-get clean && rm -rf /var/lib/apt/lists/* ; fi # buildkit
0.00B 2024-11-11 02:14:24 ENV NVIDIA_VISIBLE_DEVICES=all
0.00B 2024-11-11 02:14:24 ENV NVIDIA_REQUIRE_CUDA=cuda>=12.0
0.00B 2024-11-11 02:14:24 ENV NVIDIA_DRIVER_CAPABILITIES=compute,utility
0.00B 2024-11-11 02:14:24 ARG CUDA_MAJOR_VERSION=12
0.00B 2024-11-11 02:14:24 ENV MAKEFLAGS=--jobs=3 --output-sync=target
0.00B 2024-11-11 02:14:24 ENV HEALTHCHECK_ENDPOINT=http://localhost:8080/readyz
0.00B 2024-11-11 02:14:24 ENV REBUILD=false
0.00B 2024-11-11 02:14:24 ENV BUILD_TYPE=cublas
0.00B 2024-11-11 02:14:24 ARG MAKEFLAGS=--jobs=3 --output-sync=target
0.00B 2024-11-11 02:14:24 ARG EXTRA_BACKENDS
0.00B 2024-11-11 02:14:24 ARG IMAGE_TYPE=extras
0.00B 2024-11-11 02:14:24 ARG TARGETARCH=amd64
0.00B 2024-11-11 02:14:24 ARG BUILD_TYPE=cublas
0.00B 2024-11-11 02:14:24 ARG FFMPEG=true
0.00B 2024-11-11 02:14:24 RUN |8 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= BUILD_TYPE=cublas CUDA_MAJOR_VERSION=12 CUDA_MINOR_VERSION=0 /bin/sh -c if [ "${BUILD_TYPE}" = "hipblas" ]; then apt-get update && apt-get install -y --no-install-recommends hipblas-dev rocblas-dev && apt-get clean && rm -rf /var/lib/apt/lists/* && ldconfig ; fi # buildkit
0.00B 2024-11-11 02:14:24 RUN |8 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= BUILD_TYPE=cublas CUDA_MAJOR_VERSION=12 CUDA_MINOR_VERSION=0 /bin/sh -c if [ "${BUILD_TYPE}" = "clblas" ]; then apt-get update && apt-get install -y --no-install-recommends libclblast-dev && apt-get clean && rm -rf /var/lib/apt/lists/* ; fi # buildkit
3.92GB 2024-11-11 02:14:24 RUN |8 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= BUILD_TYPE=cublas CUDA_MAJOR_VERSION=12 CUDA_MINOR_VERSION=0 /bin/sh -c <<EOT bash if [ "${BUILD_TYPE}" = "cublas" ]; then apt-get update && \ apt-get install -y --no-install-recommends \ software-properties-common pciutils if [ "amd64" = "$TARGETARCH" ]; then curl -O https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/cuda-keyring_1.1-1_all.deb fi if [ "arm64" = "$TARGETARCH" ]; then curl -O https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/arm64/cuda-keyring_1.1-1_all.deb fi dpkg -i cuda-keyring_1.1-1_all.deb && \ rm -f cuda-keyring_1.1-1_all.deb && \ apt-get update && \ apt-get install -y --no-install-recommends \ cuda-nvcc-${CUDA_MAJOR_VERSION}-${CUDA_MINOR_VERSION} \ libcufft-dev-${CUDA_MAJOR_VERSION}-${CUDA_MINOR_VERSION} \ libcurand-dev-${CUDA_MAJOR_VERSION}-${CUDA_MINOR_VERSION} \ libcublas-dev-${CUDA_MAJOR_VERSION}-${CUDA_MINOR_VERSION} \ libcusparse-dev-${CUDA_MAJOR_VERSION}-${CUDA_MINOR_VERSION} \ libcusolver-dev-${CUDA_MAJOR_VERSION}-${CUDA_MINOR_VERSION} && \ apt-get clean && \ rm -rf /var/lib/apt/lists/* fi EOT # buildkit
0.00B 2024-11-11 02:12:38 RUN |8 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= BUILD_TYPE=cublas CUDA_MAJOR_VERSION=12 CUDA_MINOR_VERSION=0 /bin/sh -c <<EOT bash if [ "${BUILD_TYPE}" = "vulkan" ]; then apt-get update && \ apt-get install -y --no-install-recommends \ software-properties-common pciutils wget gpg-agent && \ wget -qO - https://packages.lunarg.com/lunarg-signing-key-pub.asc | apt-key add - && \ wget -qO /etc/apt/sources.list.d/lunarg-vulkan-jammy.list https://packages.lunarg.com/vulkan/lunarg-vulkan-jammy.list && \ apt-get update && \ apt-get install -y \ vulkan-sdk && \ apt-get clean && \ rm -rf /var/lib/apt/lists/* fi EOT # buildkit
0.00B 2024-11-11 02:12:38 ENV BUILD_TYPE=cublas
0.00B 2024-11-11 02:12:38 ARG CUDA_MINOR_VERSION=0
0.00B 2024-11-11 02:12:38 ARG CUDA_MAJOR_VERSION=12
0.00B 2024-11-11 02:12:38 ARG BUILD_TYPE=cublas
34.81MB 2024-11-11 02:12:38 RUN |5 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= /bin/sh -c pip install --user grpcio-tools # buildkit
275.39MB 2024-11-11 02:12:34 RUN |5 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= /bin/sh -c apt-get update && apt-get install -y --no-install-recommends espeak-ng espeak python3-pip python-is-python3 python3-dev llvm python3-venv && apt-get clean && rm -rf /var/lib/apt/lists/* && pip install --upgrade pip # buildkit
1.24GB 2024-11-11 02:12:16 RUN |5 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= /bin/sh -c curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y # buildkit
0.00B 2024-11-11 02:11:50 ENV PATH=/root/.cargo/bin:/opt/rocm/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/root/go/bin:/usr/local/go/bin
33.74MB 2024-11-11 02:11:50 RUN |5 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= /bin/sh -c curl -LsSf https://astral.sh/uv/install.sh | UV_INSTALL_DIR=/usr/bin sh # buildkit
0.00B 2024-11-11 02:11:48 WORKDIR /build
28.00B 2024-11-11 02:11:48 RUN |5 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= /bin/sh -c ln -s /usr/include/opencv4/opencv2 /usr/include/opencv2 # buildkit
906.15MB 2024-11-11 02:11:48 RUN |5 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= /bin/sh -c apt-get update && apt-get install -y --no-install-recommends libopenblas-dev libopencv-dev && apt-get clean && rm -rf /var/lib/apt/lists/* # buildkit
0.00B 2024-11-11 02:10:59 ENV PATH=/opt/rocm/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/root/go/bin:/usr/local/go/bin
0.00B 2024-11-11 02:10:59 ENV PATH=/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/root/go/bin:/usr/local/go/bin
0.00B 2024-11-11 02:10:59 RUN |5 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= /bin/sh -c echo "Target Variant: $TARGETVARIANT" # buildkit
0.00B 2024-11-11 02:10:59 RUN |5 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= /bin/sh -c echo "Target Architecture: $TARGETARCH" # buildkit
0.00B 2024-11-11 02:10:59 RUN |5 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= /bin/sh -c test -n "$TARGETARCH" || (echo 'warn: missing $TARGETARCH, either set this `ARG` manually, or run using `docker buildkit`') # buildkit
219.34KB 2024-11-11 02:10:59 RUN |5 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= /bin/sh -c update-ca-certificates # buildkit
0.00B 2024-11-11 02:10:58 COPY --chmod=644 custom-ca-certs/* /usr/local/share/ca-certificates/ # buildkit
136.48MB 2024-11-11 02:10:58 RUN |5 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= /bin/sh -c go install google.golang.org/protobuf/cmd/protoc-gen-go@v1.34.2 && go install google.golang.org/grpc/cmd/protoc-gen-go-grpc@1958fcbe2ca8bd93af633f11e97d44e567e945af # buildkit
0.00B 2024-11-11 02:10:45 ENV PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/root/go/bin:/usr/local/go/bin
222.31MB 2024-11-11 02:10:45 RUN |5 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= /bin/sh -c curl -L -s https://go.dev/dl/go${GO_VERSION}.linux-${TARGETARCH}.tar.gz | tar -C /usr/local -xz # buildkit
68.61MB 2024-11-11 02:10:42 RUN |5 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= /bin/sh -c <<EOT bash if [ "${CMAKE_FROM_SOURCE}}" = "true" ]; then curl -L -s https://github.com/Kitware/CMake/releases/download/v${CMAKE_VERSION}/cmake-${CMAKE_VERSION}.tar.gz -o cmake.tar.gz && tar xvf cmake.tar.gz && cd cmake-${CMAKE_VERSION} && ./configure && make && make install else apt-get update && \ apt-get install -y \ cmake && \ apt-get clean && \ rm -rf /var/lib/apt/lists/* fi EOT # buildkit
289.90MB 2024-11-11 02:10:36 RUN |5 GO_VERSION=1.22.6 CMAKE_VERSION=3.26.4 CMAKE_FROM_SOURCE=false TARGETARCH=amd64 TARGETVARIANT= /bin/sh -c apt-get update && apt-get install -y --no-install-recommends build-essential ccache ca-certificates curl libssl-dev git unzip upx-ucl && apt-get clean && rm -rf /var/lib/apt/lists/* # buildkit
0.00B 2024-11-11 02:10:36 ENV EXTERNAL_GRPC_BACKENDS=coqui:/build/backend/python/coqui/run.sh,huggingface-embeddings:/build/backend/python/sentencetransformers/run.sh,transformers:/build/backend/python/transformers/run.sh,sentencetransformers:/build/backend/python/sentencetransformers/run.sh,rerankers:/build/backend/python/rerankers/run.sh,autogptq:/build/backend/python/autogptq/run.sh,bark:/build/backend/python/bark/run.sh,diffusers:/build/backend/python/diffusers/run.sh,openvoice:/build/backend/python/openvoice/run.sh,vall-e-x:/build/backend/python/vall-e-x/run.sh,vllm:/build/backend/python/vllm/run.sh,mamba:/build/backend/python/mamba/run.sh,exllama2:/build/backend/python/exllama2/run.sh,transformers-musicgen:/build/backend/python/transformers-musicgen/run.sh,parler-tts:/build/backend/python/parler-tts/run.sh
0.00B 2024-11-11 02:10:36 ENV DEBIAN_FRONTEND=noninteractive
0.00B 2024-11-11 02:10:36 ARG TARGETVARIANT=
0.00B 2024-11-11 02:10:36 ARG TARGETARCH=amd64
0.00B 2024-11-11 02:10:36 ARG CMAKE_FROM_SOURCE=false
0.00B 2024-11-11 02:10:36 ARG CMAKE_VERSION=3.26.4
0.00B 2024-11-11 02:10:36 ARG GO_VERSION=1.22.6
0.00B 2024-11-11 02:10:36 USER root
0.00B 2024-09-12 00:25:18 /bin/sh -c #(nop) CMD ["/bin/bash"]
77.86MB 2024-09-12 00:25:17 /bin/sh -c #(nop) ADD file:ebe009f86035c175ba244badd298a2582914415cf62783d510eab3a311a5d4e1 in /
0.00B 2024-09-12 00:25:16 /bin/sh -c #(nop) LABEL org.opencontainers.image.version=22.04
0.00B 2024-09-12 00:25:16 /bin/sh -c #(nop) LABEL org.opencontainers.image.ref.name=ubuntu
0.00B 2024-09-12 00:25:16 /bin/sh -c #(nop) ARG LAUNCHPAD_BUILD_ARCH
0.00B 2024-09-12 00:25:16 /bin/sh -c #(nop) ARG RELEASE

Image info (docker inspect)

{
    "Id": "sha256:ed567b6d1cdeff319d33d901c52c2f5fb93f4acf359fd937dba847f82e57ba27",
    "RepoTags": [
        "localai/localai:latest-aio-gpu-nvidia-cuda-12",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12"
    ],
    "RepoDigests": [
        "localai/localai@sha256:d6a3789cd9d159357003b66386bcc62b61e8043793fdd168ab5cd36ab63cb124",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/localai/localai@sha256:b42b30fef9a11805574ed54e09ef3696cdf7705df785efd735733bb303cc5289"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2024-11-10T20:03:00.006501711Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "root",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "ExposedPorts": {
            "8080/tcp": {}
        },
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/root/.cargo/bin:/opt/rocm/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/root/go/bin:/usr/local/go/bin",
            "DEBIAN_FRONTEND=noninteractive",
            "EXTERNAL_GRPC_BACKENDS=coqui:/build/backend/python/coqui/run.sh,huggingface-embeddings:/build/backend/python/sentencetransformers/run.sh,transformers:/build/backend/python/transformers/run.sh,sentencetransformers:/build/backend/python/sentencetransformers/run.sh,rerankers:/build/backend/python/rerankers/run.sh,autogptq:/build/backend/python/autogptq/run.sh,bark:/build/backend/python/bark/run.sh,diffusers:/build/backend/python/diffusers/run.sh,openvoice:/build/backend/python/openvoice/run.sh,vall-e-x:/build/backend/python/vall-e-x/run.sh,vllm:/build/backend/python/vllm/run.sh,mamba:/build/backend/python/mamba/run.sh,exllama2:/build/backend/python/exllama2/run.sh,transformers-musicgen:/build/backend/python/transformers-musicgen/run.sh,parler-tts:/build/backend/python/parler-tts/run.sh",
            "BUILD_TYPE=cublas",
            "REBUILD=false",
            "HEALTHCHECK_ENDPOINT=http://localhost:8080/readyz",
            "MAKEFLAGS=--jobs=3 --output-sync=target",
            "NVIDIA_DRIVER_CAPABILITIES=compute,utility",
            "NVIDIA_REQUIRE_CUDA=cuda\u003e=12.0",
            "NVIDIA_VISIBLE_DEVICES=all"
        ],
        "Cmd": null,
        "Healthcheck": {
            "Test": [
                "CMD-SHELL",
                "curl -f ${HEALTHCHECK_ENDPOINT} || exit 1"
            ],
            "Interval": 60000000000,
            "Timeout": 600000000000,
            "Retries": 10
        },
        "Image": "",
        "Volumes": {
            "/build/models": {}
        },
        "WorkingDir": "/build",
        "Entrypoint": [
            "/aio/entrypoint.sh"
        ],
        "OnBuild": null,
        "Labels": {
            "org.opencontainers.image.created": "2024-11-10T18:10:05.710Z",
            "org.opencontainers.image.description": ":robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI,  running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more models architectures. Features: Generate Text, Audio, Video, Images, Voice Cloning, Distributed inference",
            "org.opencontainers.image.licenses": "MIT",
            "org.opencontainers.image.ref.name": "ubuntu",
            "org.opencontainers.image.revision": "9099d0c77e9e52f4a63c53aa546cc47f1e0cfdb1",
            "org.opencontainers.image.source": "https://github.com/mudler/LocalAI",
            "org.opencontainers.image.title": "LocalAI",
            "org.opencontainers.image.url": "https://github.com/mudler/LocalAI",
            "org.opencontainers.image.version": "v2.23.0-aio-gpu-nvidia-cuda-12"
        },
        "Shell": [
            "/bin/bash",
            "-c"
        ]
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 45940797844,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/ff0a288fcd239fd43b4b2e28f255333625c14a7c070f110a02242f042980580f/diff:/var/lib/docker/overlay2/bc457da3caa28dfbec6a5565f79531bd1a3d99ca6f7ca8edc347dc7f0ce9df76/diff:/var/lib/docker/overlay2/039e59a0ecc6e4174aab5b984067a86e7645464abbf8f862036cc044d8ebd2a9/diff:/var/lib/docker/overlay2/e02d2f3df76ef7b0edfd9b1653620fe490387567ff530da2c139868983bf65be/diff:/var/lib/docker/overlay2/9c0b3873850e8ddf3d63189ed3f4c504c266a455908452fb02005613b536a8b7/diff:/var/lib/docker/overlay2/1ef466067b90f88c8d2c1e1eb1268939ff16cc9a5fa1ebf47dd7cf6120c0d2ab/diff:/var/lib/docker/overlay2/2e8c4d379619985dd738d613c34885ffa2c1f714db88d65bcb3b1f0f5cd5953e/diff:/var/lib/docker/overlay2/5f539bad5f7055a53aa751c62b24950de21d65ce72e3a573c55a6af50b4d3ef3/diff:/var/lib/docker/overlay2/8d1005fc2eb911be470820e67d5c2da0149ef550b1e717fd58af28a80454f59b/diff:/var/lib/docker/overlay2/d3511109fe35bf914cc66f807d0882a983620ddc315341a66aedf9a2cd530cda/diff:/var/lib/docker/overlay2/c388b208044275108575847f1b23a82c8944ceca814ea177034fa898a766cf1c/diff:/var/lib/docker/overlay2/71586233c659688f96a452f721d5c6d2f9d476dcb95b73cfbd08aadf80e313b8/diff:/var/lib/docker/overlay2/c676d85b36c3e4a4d92a61bb7bcef187410955b03822c115897ff52b73dd98df/diff:/var/lib/docker/overlay2/27999c91bc7181346e6b937d27731f11938efc07258dd000b7fd061f3a4c9f2e/diff:/var/lib/docker/overlay2/01cc1658f56d7f6368ca45b9cd8728ecf8da05a2c5094e72bb5f911ef7376407/diff:/var/lib/docker/overlay2/ef3c279b910c07d332667b9b182c826eac89a5efce3649aaf5a908b867de632a/diff:/var/lib/docker/overlay2/7c09ba2407260a1feec94e5b81f9a427243c04e2df72dd8f9ad50f5e8324d59e/diff:/var/lib/docker/overlay2/8d783a1307969b9968376a01baf22995bfc65881b60bc61c96c62634e8cad454/diff:/var/lib/docker/overlay2/e8cce7c168a60b12c3c72a4404be64ac88e70e0ae6a9da8831fff9caad87850d/diff:/var/lib/docker/overlay2/26475026d2af753640514c077df8ddb166fcc5fb844a518725cc4d62d5cb5e31/diff:/var/lib/docker/overlay2/aa05f2abfc4de7761bd283a8e2ea909f1e2f8da0cff7aa82716709e46f4efb18/diff:/var/lib/docker/overlay2/14fcbc48c9a182f832f65115922a91127d9b842710990b05c71e79be6a33972d/diff:/var/lib/docker/overlay2/9ac6447f69e7f2b1203069bc34b67f3d6178f440a87dae4f7d235cc44fc05de8/diff:/var/lib/docker/overlay2/8ec5c6964d40be9d0d8da3f1974ae62aff12ac3fd0b9575e343734f6491af8c3/diff:/var/lib/docker/overlay2/fae77a97628f8e4b013a1c8a1916f5ee7e34bf1397cbf0c519a9ac5f3d4e38b2/diff:/var/lib/docker/overlay2/e5885a29d8e88d5f9d608e8ae5edf022f235ac3ea233c2c8361223e090634212/diff:/var/lib/docker/overlay2/348f4462a24508c805172fcae37201036341138fe1b67c59a6b9f33504e4f8ac/diff:/var/lib/docker/overlay2/9da7319627005708aed2331865b20cfca1a05d07c5c573f5b44476740dbe69b1/diff:/var/lib/docker/overlay2/b402a6fb327ad2b968e21bc49afca41a850785fe9a5076528bec6bb5b26dde4f/diff:/var/lib/docker/overlay2/9f97de3e67a70919be7f2f185af688f64215a638c8851628c03bca57366ccdc1/diff:/var/lib/docker/overlay2/c88ccfdc45510215300a21bd690422d10f40e606500a67f5215b0824edc1028d/diff:/var/lib/docker/overlay2/8482565118a85c37e4960ee2a21566ee52f820a24eed64eaebce34ab8f49d951/diff:/var/lib/docker/overlay2/45dad6981383909f38d4a229a42826c3fe8efd59a63f2e1f73d53b86139ce0f9/diff:/var/lib/docker/overlay2/60d332a1ded06dec413eb9c62c61a28c68eb667b66d47b0c1c881dcd8712b9ed/diff:/var/lib/docker/overlay2/4cfb2ff6eb670d08d805fcc326973c76acabc424b2f6ce5f1903149f34750452/diff",
            "MergedDir": "/var/lib/docker/overlay2/ad549ae16d885d6f59cb3c5ecaeefac7b6c29b816c393db05dd3264ddbb57533/merged",
            "UpperDir": "/var/lib/docker/overlay2/ad549ae16d885d6f59cb3c5ecaeefac7b6c29b816c393db05dd3264ddbb57533/diff",
            "WorkDir": "/var/lib/docker/overlay2/ad549ae16d885d6f59cb3c5ecaeefac7b6c29b816c393db05dd3264ddbb57533/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:2573e0d8158209ed54ab25c87bcdcb00bd3d2539246960a3d592a1c599d70465",
            "sha256:b2849f0e01697ee612bd5ef5e0f4151d99cb828921bd80ad46a9d0b4f4ce7925",
            "sha256:34e168bce6e4c4cc28e5ffad8cd60ec7065019c4453e6033062f8e455dfa85cd",
            "sha256:3ff5b090a0559e833bd4d12af06083346eadbecdd494dfc898c4dff08d84e0ef",
            "sha256:ba8335a7d8d93c31f0b8063b561bc2e0dc6a3d6e53b14dd08acc159d8862794a",
            "sha256:7f2cad03f6a3aa9e3a60cb8c83ed75e5b2307d9b79296ba8eea4ba89b077ac70",
            "sha256:6ea5c3a600629d7f8f1e96276207bbd61430f23a6367f35f04ffcde6b19c545a",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:d4f0395f8abf300c0867c90b2bfecf54bede13d110d7d44037e6c2565ef9a277",
            "sha256:b3c97be33c00d9da68d6b92ea4364b6593eecbe6b509525d453ec6305e0fee3f",
            "sha256:ea9f103a0a7a9b36d2d5350c692700e1c9773b36272248e993c0885120ff64d4",
            "sha256:35c53ad9b8d7744cd228496477d9e94d6163976cb2e36e513870f896bf71be09",
            "sha256:3c43ad93336b12e92f0dd9228c6c8a438676019ebb88f4a6160d5631b235721e",
            "sha256:daa9fa3026abbc3657c1fd81234642465a3970830997525ad5268e78f5ed0dd6",
            "sha256:e3540a67d66eb9d9d16cf59bfab117de6a50a3e846d131504b69cf8742b0a4ee",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:1998caa9d682dd79d846fffab6d0c4065583b35861a7f11b20fe3ea0428d3b9e",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:d94f4e11c337a0644c9f3d06b660ed17fb928669a612cd54a1570ba16fb88fb4",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:a0f3ee44ea0152769360ac947ee63994cf88c293c7da840d131d1c1664dbe66b",
            "sha256:92c4c2f7552b8b9aae5fcb91b2e0a7cfca77e52a265ab3ca71f83888e8be992e",
            "sha256:f1ce41fbfaca59b19ba488fc520b7f14a4603151307a5f7c4a32c2d0bf23eebb",
            "sha256:83e0712ecdbb7ed1a3a09e39dc6aebeb95ea52885e2d5afafe8c8707a2351d9c",
            "sha256:ebf6e5cb5e3581f6a97e3e7bd2acacfabe893763281af8822d043ae239e15a39",
            "sha256:a98c2a87ddd31d44efa923facb7b433c5b4260648ac04f9d6d956366b3a870a5",
            "sha256:d24169ce0425519ddf67ec27541fbfdd3834e20dc4ea3d054464bf82be9ff976",
            "sha256:b34ed20c4264ccb88c27193018cdf7e531f10f8a377ad0618c3e540b383f8bd6",
            "sha256:0f78d835d444efc5c01fb7adf410c03cd1dac7b3273b646c0512615ed94b6cee",
            "sha256:d865c28c18ce2783dfcc36a794e272fcb32e50153da52239ff964b5ba5bfc3d7",
            "sha256:01dad8da16afdb37313c47b993d6608333b12e3c16c8da8cfc4ddd373d2a1b5c",
            "sha256:84b05252526cbebf26005270d66e0da6a7199f90c1b60f43dbd8a22f0784c022",
            "sha256:8af4b79bfa0530ed60304c9b40abaf64124bb0c87fb0ce9813d248ab02268fc6"
        ]
    },
    "Metadata": {
        "LastTagTime": "2024-11-21T01:05:17.215537156+08:00"
    }
}

More versions

docker.io/localai/localai:latest-aio-cpu

linux/amd64 · docker.io · 6.42GB · 2024-11-08 14:23 · 27

docker.io/localai/localai:latest-aio-gpu-nvidia-cuda-12

linux/amd64 · docker.io · 45.94GB · 2024-11-21 01:51 · 10