ghcr.io/ggerganov/llama.cpp:full-vulkan linux/amd64

ghcr.io/ggerganov/llama.cpp:full-vulkan - China mirror download source
Description of the image ghcr.io/ggerganov/llama.cpp:

llama.cpp is an open-source C/C++ inference engine, created by Georgi Gerganov, for running LLaMA-family large language models (originally released by Meta) and many other models efficiently on local hardware. The full-vulkan tag bundles the complete toolset (conversion scripts, quantization, CLI, and server) built with Vulkan GPU acceleration, which enables GPU inference on a wide range of graphics hardware without requiring CUDA. Typical uses include local chat assistants, content generation, and other natural-language applications.

Source image ghcr.io/ggerganov/llama.cpp:full-vulkan
China mirror swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ggerganov/llama.cpp:full-vulkan
Image ID sha256:72e4b3ce9cdbd87e2cf90ea5f73e9cc8a251390d37f1a2ca556eecd984f487b9
Image tag full-vulkan
Size 2.20GB
Registry ghcr.io
CMD
Entrypoint /app/tools.sh
Working directory /app
OS/Platform linux/amd64
Views 69
Contributor
Image created 2025-03-03T04:23:57.832119567Z
Synced 2025-03-03 17:58
Updated 2025-04-18 16:48
Environment variables
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
Image labels
org.opencontainers.image.ref.name=ubuntu
org.opencontainers.image.version=24.04

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ggerganov/llama.cpp:full-vulkan
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ggerganov/llama.cpp:full-vulkan  ghcr.io/ggerganov/llama.cpp:full-vulkan
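Once the mirror image is pulled and re-tagged, it can be run directly. A hedged sketch, assuming a GGUF model under ./models on the host and Vulkan-capable GPU drivers; the model filename, prompt, and device flag are illustrative, and the `--run` mode is dispatched by the image's /app/tools.sh entrypoint:

```shell
# Run a one-shot generation with the full-vulkan image.
# --device /dev/dri exposes the GPU to Vulkan (exact device varies by driver);
# the model path /models/model.gguf is a placeholder for your own file.
docker run --rm -it \
  --device /dev/dri \
  -v "$PWD/models:/models" \
  ghcr.io/ggerganov/llama.cpp:full-vulkan \
  --run -m /models/model.gguf -p "Hello" -n 64
```

Without `--device /dev/dri` (or an equivalent GPU passthrough), the Vulkan build falls back to much slower CPU-only inference or fails to find a device, depending on the driver setup.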

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ggerganov/llama.cpp:full-vulkan
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ggerganov/llama.cpp:full-vulkan  ghcr.io/ggerganov/llama.cpp:full-vulkan

Quick shell replacement command

sed -i 's#ghcr.io/ggerganov/llama.cpp:full-vulkan#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ggerganov/llama.cpp:full-vulkan#' deployment.yaml
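The sed one-liner above rewrites a single deployment.yaml. A hedged sketch (demo paths only; /tmp/manifests-demo is a stand-in for your real manifest directory) that applies the same substitution across every file that references the upstream image:

```shell
# Source and mirror image references; '#' as the sed delimiter
# avoids escaping the slashes in the image paths.
SRC='ghcr.io/ggerganov/llama.cpp:full-vulkan'
DST='swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ggerganov/llama.cpp:full-vulkan'

# Demo manifest (stand-in for your real Kubernetes manifests).
mkdir -p /tmp/manifests-demo
printf 'image: %s\n' "$SRC" > /tmp/manifests-demo/deployment.yaml

# grep -rl lists every file containing the reference; sed rewrites in place.
grep -rl "$SRC" /tmp/manifests-demo | xargs sed -i "s#$SRC#$DST#g"

cat /tmp/manifests-demo/deployment.yaml
```

Note that the mirror reference contains the upstream reference as a suffix, so running the substitution twice over the same file would double the mirror prefix; apply it once per manifest set.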

Ansible quick distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ggerganov/llama.cpp:full-vulkan && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ggerganov/llama.cpp:full-vulkan  ghcr.io/ggerganov/llama.cpp:full-vulkan'

Ansible quick distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ggerganov/llama.cpp:full-vulkan && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ggerganov/llama.cpp:full-vulkan  ghcr.io/ggerganov/llama.cpp:full-vulkan'

Image build history


# 2025-03-03 12:23:57  0.00B  Configure the command run when the container starts
ENTRYPOINT ["/app/tools.sh"]
                        
# 2025-03-03 12:23:57  1.66GB  Run a command and create a new image layer
RUN /bin/sh -c apt-get update     && apt-get install -y     git     python3     python3-pip     python3-wheel     && pip install --break-system-packages --upgrade setuptools     && pip install --break-system-packages -r requirements.txt     && apt autoremove -y     && apt clean -y     && rm -rf /tmp/* /var/tmp/*     && find /var/cache/apt/archives /var/lib/apt/lists -not -name lock -type f -delete     && find /var/cache -type f -delete # buildkit
                        
# 2025-03-03 12:23:09  0.00B  Set the working directory to /app
WORKDIR /app
                        
# 2025-03-03 12:23:09  102.86MB  Copy new files or directories into the container
COPY /app/full /app # buildkit
                        
# 2025-03-03 12:23:08  29.06MB  Copy new files or directories into the container
COPY /app/lib/ /app # buildkit
                        
# 2025-03-03 12:18:37  329.14MB  Run a command and create a new image layer
RUN /bin/sh -c apt-get update     && apt-get install -y libgomp1 curl libvulkan-dev     && apt autoremove -y     && apt clean -y     && rm -rf /tmp/* /var/tmp/*     && find /var/cache/apt/archives /var/lib/apt/lists -not -name lock -type f -delete     && find /var/cache -type f -delete # buildkit
                        
# 2025-01-27 12:14:03  0.00B 
/bin/sh -c #(nop)  CMD ["/bin/bash"]
                        
# 2025-01-27 12:14:03  78.13MB 
/bin/sh -c #(nop) ADD file:6df775300d76441aa33f31b22c1afce8dfe35c8ffbc14ef27c27009235b12a95 in / 
                        
# 2025-01-27 12:14:00  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.version=24.04
                        
# 2025-01-27 12:14:00  0.00B 
/bin/sh -c #(nop)  LABEL org.opencontainers.image.ref.name=ubuntu
                        
# 2025-01-27 12:14:00  0.00B 
/bin/sh -c #(nop)  ARG LAUNCHPAD_BUILD_ARCH
                        
# 2025-01-27 12:14:00  0.00B 
/bin/sh -c #(nop)  ARG RELEASE
                        
                    

Image information

{
    "Id": "sha256:72e4b3ce9cdbd87e2cf90ea5f73e9cc8a251390d37f1a2ca556eecd984f487b9",
    "RepoTags": [
        "ghcr.io/ggerganov/llama.cpp:full-vulkan",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ggerganov/llama.cpp:full-vulkan"
    ],
    "RepoDigests": [
        "ghcr.io/ggerganov/llama.cpp@sha256:aeddec51aa41adc5d60f95155f96f330ae194000ff48ca5dac7a57a86da8f8f1",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/ghcr.io/ggerganov/llama.cpp@sha256:c15e6b0cdee82c8fd6a08c0f9728f3af3871513c0ba8e96be345f76436097ff2"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2025-03-03T04:23:57.832119567Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
        ],
        "Cmd": null,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "/app",
        "Entrypoint": [
            "/app/tools.sh"
        ],
        "OnBuild": null,
        "Labels": {
            "org.opencontainers.image.ref.name": "ubuntu",
            "org.opencontainers.image.version": "24.04"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 2195338492,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/a213f638f9b9d57b9086c994bba26843036cdfb76ad53db215baacc06a9b0e64/diff:/var/lib/docker/overlay2/158b3e88c335b207d39283b82fca6818e164a5b7a52b45bb0d53422da129f271/diff:/var/lib/docker/overlay2/8e7cb2ee809e307177bcd5a14014e77128508c4113d027059e4f8aa64ed5056d/diff:/var/lib/docker/overlay2/495c990d1350aa6c6da011c7301759f5e93b0c54e5fc24d8c76d9852ee35eace/diff:/var/lib/docker/overlay2/d5ba5778451cb9d6cd53a762324cbf17a65345e17306b42b60d69ba8f9186927/diff",
            "MergedDir": "/var/lib/docker/overlay2/793b799034f9c412cf05d755b200882144d3a344d15a0f729889743ae18023b0/merged",
            "UpperDir": "/var/lib/docker/overlay2/793b799034f9c412cf05d755b200882144d3a344d15a0f729889743ae18023b0/diff",
            "WorkDir": "/var/lib/docker/overlay2/793b799034f9c412cf05d755b200882144d3a344d15a0f729889743ae18023b0/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:4b7c01ed0534d4f9be9cf97d068da1598c6c20b26cb6134fad066defdb6d541d",
            "sha256:175ebd64349d16df823564d1afa2d772011e6fcb714fb43097bdbd2b89f0511b",
            "sha256:079b3f17db4cd9b5b334b54cc03c276db7b82dd73082de65d803b90f08a7e01a",
            "sha256:6385b4b44ffca1b9df51a851393a537d59db415afb4cb2cdd0f21564d51f69e6",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:d9764bee3f7a82d30f716958ca2b0240389426f6148d3e8cebdec46f203ee8b9"
        ]
    },
    "Metadata": {
        "LastTagTime": "2025-03-03T17:56:38.201589231+08:00"
    }
}

More versions

ghcr.io/ggerganov/llama.cpp:server-cuda
linux/amd64 | ghcr.io | 2.73GB | synced 2024-09-12 11:55 | 357 views

ghcr.io/ggerganov/llama.cpp:server-cuda--b1-7d1a378
linux/amd64 | ghcr.io | 2.32GB | synced 2024-11-03 15:07 | 157 views

ghcr.io/ggerganov/llama.cpp:server-cuda--b1-a59f8fd
linux/amd64 | ghcr.io | 2.55GB | synced 2024-11-03 15:35 | 253 views

ghcr.io/ggerganov/llama.cpp:light
linux/amd64 | ghcr.io | 175.71MB | synced 2024-11-05 16:15 | 148 views

ghcr.io/ggerganov/llama.cpp:full
linux/amd64 | ghcr.io | 3.52GB | synced 2024-11-08 14:49 | 265 views

ghcr.io/ggerganov/llama.cpp:server-cuda-b4641
linux/amd64 | ghcr.io | 2.67GB | synced 2025-02-05 14:38 | 78 views

ghcr.io/ggerganov/llama.cpp:server-cuda-b4646
linux/amd64 | ghcr.io | 2.67GB | synced 2025-02-06 19:31 | 157 views

ghcr.io/ggerganov/llama.cpp:full-cuda
linux/amd64 | ghcr.io | 4.68GB | synced 2025-02-07 15:47 | 195 views

ghcr.io/ggerganov/llama.cpp:server-cuda-b4563
linux/amd64 | ghcr.io | 2.68GB | synced 2025-02-10 16:54 | 98 views

ghcr.io/ggerganov/llama.cpp:full-vulkan
linux/amd64 | ghcr.io | 2.20GB | synced 2025-03-03 17:58 | 68 views