docker.io/openeuler/llm-server:1.0.0-oe2203sp3 linux/amd64


openeuler/llm-server Image Description

This is an LLM (large language model) server image based on the openEuler operating system. It ships with the environment and dependencies needed to run LLM models preinstalled, so users can quickly deploy and serve their own LLM applications. As the build history below shows, this tag bundles llama-cpp-python[server] built against OpenBLAS and serves models in GGUF format; check the specific tag and its documentation for details on which frameworks and models are included.

Source image     docker.io/openeuler/llm-server:1.0.0-oe2203sp3
China mirror     swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/openeuler/llm-server:1.0.0-oe2203sp3
Image ID         sha256:2383d0d43f1fd13017f6f19759ff8c486e02e0f834624aa62f2a14016c30ab31
Image tag        1.0.0-oe2203sp3
Size             1.01GB
Registry         docker.io
CMD              (not set)
Entrypoint       /bin/sh -c python3 -m llama_cpp.server --host 0.0.0.0 --port 8000 --api_key $KEY --model $MODEL --model_alias $MODEL_NAME --n_threads $THREADS --n_ctx $CONTEXT
Working dir      /
OS/Platform      linux/amd64
Image created    2024-06-13T02:03:06.495855249Z
Synced at        2025-03-20 08:59
Updated at       2025-05-14 06:03
Exposed ports    8000/tcp
Environment variables
    PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
    KEY=sk-123456
    MODEL=/models/model.gguf
    MODEL_NAME=qwen-1.5
    THREADS=8
    CONTEXT=8192
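A minimal run sketch based on the Entrypoint and environment variables above; the host path /data/models and the GGUF file name are examples only and must point at a model you have downloaded yourself:

docker run -d --name llm-server -p 8000:8000 \
  -v /data/models:/models \
  -e MODEL=/models/qwen1_5-7b-chat-q4_k_m.gguf \
  -e MODEL_NAME=qwen-1.5 \
  -e KEY=sk-123456 -e THREADS=8 -e CONTEXT=8192 \
  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/openeuler/llm-server:1.0.0-oe2203sp3

If the mounted file is simply named model.gguf, the MODEL override can be dropped, since that is the image default.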

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/openeuler/llm-server:1.0.0-oe2203sp3
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/openeuler/llm-server:1.0.0-oe2203sp3  docker.io/openeuler/llm-server:1.0.0-oe2203sp3
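Optionally, confirm that the pulled image matches the Image ID listed above:

docker inspect --format '{{.Id}}' swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/openeuler/llm-server:1.0.0-oe2203sp3
# Expected output: sha256:2383d0d43f1fd13017f6f19759ff8c486e02e0f834624aa62f2a14016c30ab31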

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/openeuler/llm-server:1.0.0-oe2203sp3
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/openeuler/llm-server:1.0.0-oe2203sp3  docker.io/openeuler/llm-server:1.0.0-oe2203sp3
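On Kubernetes nodes where containerd is the runtime, the kubelet resolves images from the k8s.io namespace, so the pull and tag usually need the -n flag (adjust the namespace to your environment):

ctr -n k8s.io images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/openeuler/llm-server:1.0.0-oe2203sp3
ctr -n k8s.io images tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/openeuler/llm-server:1.0.0-oe2203sp3 docker.io/openeuler/llm-server:1.0.0-oe2203sp3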

Shell quick-replace command

sed -i 's#openeuler/llm-server:1.0.0-oe2203sp3#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/openeuler/llm-server:1.0.0-oe2203sp3#' deployment.yaml
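To rewrite every manifest under a directory instead of a single deployment.yaml, a find-based variant works as well (a sketch; keep the files under version control or back them up first):

find . -name '*.yaml' -exec sed -i 's#openeuler/llm-server:1.0.0-oe2203sp3#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/openeuler/llm-server:1.0.0-oe2203sp3#g' {} +
grep -rn 'llm-server:1.0.0-oe2203sp3' . --include='*.yaml'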

Ansible bulk distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/openeuler/llm-server:1.0.0-oe2203sp3 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/openeuler/llm-server:1.0.0-oe2203sp3  docker.io/openeuler/llm-server:1.0.0-oe2203sp3'
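After distribution, a quick ad-hoc check confirms the image is present on every node (a sketch, assuming the same k8s host group as above):

#ansible k8s -m shell -a 'docker images --digests | grep llm-server'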

Ansible bulk distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/openeuler/llm-server:1.0.0-oe2203sp3 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/openeuler/llm-server:1.0.0-oe2203sp3  docker.io/openeuler/llm-server:1.0.0-oe2203sp3'

Image build history


# 2024-06-13 10:03:06  0.00B  Configure the command to run when the container starts
ENTRYPOINT ["/bin/sh" "-c" "python3 -m llama_cpp.server --host 0.0.0.0 --port 8000 --api_key $KEY --model $MODEL --model_alias $MODEL_NAME --n_threads $THREADS --n_ctx $CONTEXT"]

# 2024-06-13 10:03:06  0.00B  Declare the port the container listens on at runtime
EXPOSE 8000/tcp

# 2024-06-13 10:03:06  0.00B  Set environment variable CONTEXT
ENV CONTEXT=8192

# 2024-06-13 10:03:06  0.00B  Set environment variable THREADS
ENV THREADS=8

# 2024-06-13 10:03:06  0.00B  Set environment variable MODEL_NAME
ENV MODEL_NAME=qwen-1.5

# 2024-06-13 10:03:06  0.00B  Set environment variable MODEL
ENV MODEL=/models/model.gguf

# 2024-06-13 10:03:06  0.00B  Set environment variable KEY
ENV KEY=sk-123456

# 2024-06-13 10:03:06  155.86MB  Run a command and create a new image layer
RUN /bin/sh -c CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" pip3 install -i https://pypi.tuna.tsinghua.edu.cn/simple --no-cache-dir guidance llama-cpp-python[server] # buildkit

# 2024-06-13 09:59:57  46.62MB  Run a command and create a new image layer
RUN /bin/sh -c git clone https://github.com/OpenMathLib/OpenBLAS.git &&     cd OpenBLAS &&     make -j "$(nproc)" &&     make install &&     cd .. &&     rm -rf OpenBLAS # buildkit

# 2024-06-13 09:52:20  554.89MB  Run a command and create a new image layer
RUN /bin/sh -c sed -i 's|http://repo.openeuler.org/|https://mirrors.huaweicloud.com/openeuler/|g' /etc/yum.repos.d/openEuler.repo &&    yum update -y &&    yum install -y python3 python3-pip python3-devel shadow-utils cmake gcc g++ git make &&    yum clean all # buildkit

# 2024-04-28 04:34:43  0.00B  Set the default command to run
CMD ["bash"]

# 2024-04-28 04:34:43  82.32MB  Run a command and create a new image layer
RUN |1 TARGETARCH=amd64 /bin/sh -c ln -sf /usr/share/zoneinfo/UTC /etc/localtime &&     sed -i "s/TMOUT=300/TMOUT=0/g" /etc/bashrc &&     yum -y update && yum clean all # buildkit

# 2024-04-28 04:34:23  171.31MB  Copy a file or directory into the container
ADD openEuler-docker-rootfs.amd64.tar.xz / # buildkit

# 2024-04-28 04:34:23  0.00B  Define a build argument
ARG TARGETARCH
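The last three RUN layers show what this tag actually ships: openEuler 22.03 SP3 packages, an OpenBLAS build from source, and llama-cpp-python[server] installed with OpenBLAS enabled. A hedged way to confirm the installed version without starting the server (the Python one-liner is just an example probe):

docker run --rm --entrypoint python3 \
  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/openeuler/llm-server:1.0.0-oe2203sp3 \
  -c "import llama_cpp; print(llama_cpp.__version__)"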
                        
                    

Image information

{
    "Id": "sha256:2383d0d43f1fd13017f6f19759ff8c486e02e0f834624aa62f2a14016c30ab31",
    "RepoTags": [
        "openeuler/llm-server:1.0.0-oe2203sp3",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/openeuler/llm-server:1.0.0-oe2203sp3"
    ],
    "RepoDigests": [
        "openeuler/llm-server@sha256:7a52af9932130aa0cac9024cb62107317cb37a9ca0724ebe1824538ebec79071",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/openeuler/llm-server@sha256:698b4ededf676b7f89d01af257d7a02048396c60469746350d39c57dc7ea68ac"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2024-06-13T02:03:06.495855249Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "ExposedPorts": {
            "8000/tcp": {}
        },
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "KEY=sk-123456",
            "MODEL=/models/model.gguf",
            "MODEL_NAME=qwen-1.5",
            "THREADS=8",
            "CONTEXT=8192"
        ],
        "Cmd": null,
        "ArgsEscaped": true,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "/",
        "Entrypoint": [
            "/bin/sh",
            "-c",
            "python3 -m llama_cpp.server --host 0.0.0.0 --port 8000 --api_key $KEY --model $MODEL --model_alias $MODEL_NAME --n_threads $THREADS --n_ctx $CONTEXT"
        ],
        "OnBuild": null,
        "Labels": null
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 1010998507,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/d894c087b10f2dedad4bc0074e13fd7db4ad0a7053f3a251a57274f2acc869ab/diff:/var/lib/docker/overlay2/e148d813ae4aeebe8b6e6e687fd18cd94c9a46b573c530a17d779a62b3a2dd18/diff:/var/lib/docker/overlay2/ba2354b8ccdd414d5a10a8ad871402f8bd3b09a4c20158b90642c6535b4b231b/diff:/var/lib/docker/overlay2/03763da3d4ed30d298af94a3543298097d69a2f402510ed0738fa339de24c9f1/diff",
            "MergedDir": "/var/lib/docker/overlay2/d655aa326e20b8c18e41676e6979acee3011f4605e71b950b46f1610dad17efb/merged",
            "UpperDir": "/var/lib/docker/overlay2/d655aa326e20b8c18e41676e6979acee3011f4605e71b950b46f1610dad17efb/diff",
            "WorkDir": "/var/lib/docker/overlay2/d655aa326e20b8c18e41676e6979acee3011f4605e71b950b46f1610dad17efb/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:64e8964136d1688cb0eacb25065818a6acff6f9682fb3bf4dbddd06f091caa0c",
            "sha256:3bee1347a14f84b55a8a755d5da4d30fd8538129b77dceea4a678ae1d6fc41e5",
            "sha256:6bb4946f568df1eb685cf21aab2e07521c2d67d3783419a092bbbf46794780f0",
            "sha256:d9d97a6922381ed4f26b813045791c918f57527e60e33c1a3f8bc167e4811fd6",
            "sha256:4f19b7a672055352272062019b2373f67bd6f0ea47077bc22e6b8f8e259954a2"
        ]
    },
    "Metadata": {
        "LastTagTime": "2025-03-20T08:58:14.387534687+08:00"
    }
}
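With a container running (see the docker run sketch above), llama-cpp-python's server exposes an OpenAI-compatible HTTP API on port 8000. A quick smoke test using the image-default KEY and MODEL_NAME values (replace both in real deployments):

curl -s http://localhost:8000/v1/models -H "Authorization: Bearer sk-123456"
curl -s http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-123456" \
  -d '{"model": "qwen-1.5", "messages": [{"role": "user", "content": "Hello"}]}'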

More versions

docker.io/openeuler/llm-server:1.0.0-oe2203sp3

linux/amd64    docker.io    1.01GB    2025-03-20 08:59