
docker.io/unclecode/crawl4ai:0.8.5 linux/arm64

docker.io/unclecode/crawl4ai:0.8.5 - CN mirror download. Note: this image is built for the linux/arm64 architecture.

This is the docker.io/unclecode/crawl4ai container image: 🔥🕷️ Crawl4AI, an open-source, LLM-friendly web crawler and scraper. For full usage documentation, see the maintainer's README on the project's Docker Hub page.

Source image: docker.io/unclecode/crawl4ai:0.8.5
CN mirror: swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8.5-linuxarm64
Image ID: sha256:803c2734ab5d3fd5ed7dc712ac643924225ab6f74f2e27ba5bb6109afa36db12
Image tag: 0.8.5-linuxarm64
Size: 6.92GB
Registry: docker.io
CMD: supervisord -c supervisord.conf
Entrypoint: (none)
Working directory: /app
OS/Arch: linux/arm64
Image created: 2026-03-21T06:49:38.724079077Z
Synced: 2026-04-07 14:43
Exposed ports: 6379/tcp
Environment variables:
PATH=/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin LANG=C.UTF-8 GPG_KEY=7169605F62C751356D054A26A821E680E5FA6305 PYTHON_VERSION=3.12.13 PYTHON_SHA256=c08bc65a81971c1dd5783182826503369466c7e67374d1646519adf05207b684 C4AI_VERSION=0.8.5 PYTHONFAULTHANDLER=1 PYTHONHASHSEED=random PYTHONUNBUFFERED=1 PIP_NO_CACHE_DIR=1 PYTHONDONTWRITEBYTECODE=1 PIP_DISABLE_PIP_VERSION_CHECK=1 PIP_DEFAULT_TIMEOUT=100 DEBIAN_FRONTEND=noninteractive REDIS_HOST=localhost REDIS_PORT=6379 PYTHON_ENV=production
Image labels:
c4ai.version: 0.8.5 | description: 🔥🕷️ Crawl4AI: Open-source LLM Friendly Web Crawler & scraper | maintainer: unclecode | version: 1.0
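The environment variables above show that the bundled Redis endpoint is configurable through REDIS_HOST and REDIS_PORT, defaulting to localhost:6379. A minimal sketch of resolving the effective endpoint the way those defaults imply (shell only, no Redis required):

```shell
# Resolve the Redis endpoint: honor REDIS_HOST/REDIS_PORT if set,
# otherwise fall back to the image defaults localhost:6379.
REDIS_ENDPOINT="${REDIS_HOST:-localhost}:${REDIS_PORT:-6379}"
echo "redis endpoint: ${REDIS_ENDPOINT}"
```

At run time the defaults can be overridden with environment flags, e.g. `docker run -e REDIS_HOST=redis.internal ...` (the host name here is illustrative).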

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8.5-linuxarm64
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8.5-linuxarm64  docker.io/unclecode/crawl4ai:0.8.5
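After retagging, the image can be started directly. The image's HEALTHCHECK probes http://localhost:11235/health, so publishing port 11235 exposes the API; the health check also fails when the container sees less than 2GB of RAM, so allot at least that much. A sketch (container name is illustrative; verify flags against the Crawl4AI docs):

```shell
docker run -d --name crawl4ai -p 11235:11235 docker.io/unclecode/crawl4ai:0.8.5
# once the health check passes:
curl -f http://localhost:11235/health
```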

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8.5-linuxarm64
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8.5-linuxarm64  docker.io/unclecode/crawl4ai:0.8.5
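One caveat when pulling with ctr on Kubernetes nodes: ctr operates in containerd's `default` namespace unless told otherwise, while the Kubernetes CRI looks up images in the `k8s.io` namespace. If the image is meant for kubelet, pass the namespace explicitly:

```shell
ctr -n k8s.io images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8.5-linuxarm64
ctr -n k8s.io images tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8.5-linuxarm64 docker.io/unclecode/crawl4ai:0.8.5
```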

Shell quick-replace command

sed -i 's#unclecode/crawl4ai:0.8.5#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8.5-linuxarm64#' deployment.yaml
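The rewrite is easy to dry-run on a scratch copy before touching a real manifest; a minimal sketch (file path and contents are illustrative; GNU sed's `-i` is assumed, BSD sed needs `-i ''`):

```shell
# Dry-run the image-reference rewrite on a throwaway manifest.
cat > /tmp/deployment-demo.yaml <<'EOF'
        image: unclecode/crawl4ai:0.8.5
EOF
sed -i 's#unclecode/crawl4ai:0.8.5#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8.5-linuxarm64#' /tmp/deployment-demo.yaml
cat /tmp/deployment-demo.yaml
```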

Ansible bulk distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8.5-linuxarm64 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8.5-linuxarm64  docker.io/unclecode/crawl4ai:0.8.5'

Ansible bulk distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8.5-linuxarm64 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8.5-linuxarm64  docker.io/unclecode/crawl4ai:0.8.5'
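For repeatable rollouts, the same ad-hoc commands can be captured in a playbook, which is easier to review and rerun (the host group `k8s` matches the commands above; the file name is illustrative):

```yaml
# pull-crawl4ai.yml: equivalent of the ad-hoc Docker distribution command
- hosts: k8s
  tasks:
    - name: Pull crawl4ai from the CN mirror and retag
      ansible.builtin.shell: |
        docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8.5-linuxarm64
        docker tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8.5-linuxarm64 docker.io/unclecode/crawl4ai:0.8.5
```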

Image build history


# 2026-03-21 14:49:38  0.00B Set the default command to run
CMD ["supervisord" "-c" "supervisord.conf"]
                        
# 2026-03-21 14:49:38  0.00B Set environment variable PYTHON_ENV
ENV PYTHON_ENV=production
                        
# 2026-03-21 14:49:38  0.00B Set the user the container runs as
USER appuser
                        
# 2026-03-21 14:49:38  0.00B Declare the port the container listens on at runtime
EXPOSE [6379/tcp]
                        
# 2026-03-21 14:49:38  0.00B Define the container health-check command
HEALTHCHECK &{["CMD-SHELL" "bash -c '    MEM=$(free -m | awk \"/^Mem:/{print \\$2}\");     if [ $MEM -lt 2048 ]; then         echo \"⚠️ Warning: Less than 2GB RAM available! Your container might need a memory boost! 🚀\";         exit 1;     fi &&     redis-cli ping > /dev/null &&     curl -f http://localhost:11235/health || exit 1'"] "30s" "10s" "5s" "0s" '\x03'}
                        
# 2026-03-21 14:49:38  0.00B Run a command and create a new image layer
RUN |10 C4AI_VER=0.8.5 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 REDIS_VERSION=6:7.2.7-1rl1~bookworm1 /bin/sh -c mkdir -p /var/lib/redis /var/log/redis && chown -R appuser:appuser /var/lib/redis /var/log/redis # buildkit
                        
# 2026-03-21 14:49:38  1.31GB Run a command and create a new image layer
RUN |10 C4AI_VER=0.8.5 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 REDIS_VERSION=6:7.2.7-1rl1~bookworm1 /bin/sh -c chown -R appuser:appuser ${APP_HOME} # buildkit
                        
# 2026-03-21 14:49:35  115.41KB Copy files or directories into the image
COPY deploy/docker/static /app/static # buildkit
                        
# 2026-03-21 14:49:35  1.20MB Copy files or directories into the image
COPY deploy/docker/* /app/ # buildkit
                        
# 2026-03-21 14:49:35  0.00B Run a command and create a new image layer
RUN |10 C4AI_VER=0.8.5 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 REDIS_VERSION=6:7.2.7-1rl1~bookworm1 /bin/sh -c mkdir -p /home/appuser/.cache     && chown -R appuser:appuser /home/appuser/.cache # buildkit
                        
# 2026-03-21 14:49:33  1.31GB Run a command and create a new image layer
RUN |10 C4AI_VER=0.8.5 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 REDIS_VERSION=6:7.2.7-1rl1~bookworm1 /bin/sh -c crawl4ai-doctor # buildkit
                        
# 2026-03-21 14:48:57  630.11MB Run a command and create a new image layer
RUN |10 C4AI_VER=0.8.5 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 REDIS_VERSION=6:7.2.7-1rl1~bookworm1 /bin/sh -c mkdir -p /home/appuser/.cache/ms-playwright     && cp -r /root/.cache/ms-playwright/chromium-* /home/appuser/.cache/ms-playwright/     && chown -R appuser:appuser /home/appuser/.cache/ms-playwright # buildkit
                        
# 2026-03-21 14:48:55  849.40MB Run a command and create a new image layer
RUN |10 C4AI_VER=0.8.5 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 REDIS_VERSION=6:7.2.7-1rl1~bookworm1 /bin/sh -c playwright install --with-deps # buildkit
                        
# 2026-03-21 14:44:37  1.23GB Run a command and create a new image layer
RUN |10 C4AI_VER=0.8.5 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 REDIS_VERSION=6:7.2.7-1rl1~bookworm1 /bin/sh -c crawl4ai-setup # buildkit
                        
# 2026-03-21 14:41:12  14.53MB Run a command and create a new image layer
RUN |10 C4AI_VER=0.8.5 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 REDIS_VERSION=6:7.2.7-1rl1~bookworm1 /bin/sh -c pip install --no-cache-dir --upgrade pip &&     /tmp/install.sh &&     python -c "import crawl4ai; print('✅ crawl4ai is ready to rock!')" &&     python -c "from playwright.sync_api import sync_playwright; print('✅ Playwright is feeling dramatic!')" # buildkit
                        
# 2026-03-21 14:39:17  672.60MB Run a command and create a new image layer
RUN |10 C4AI_VER=0.8.5 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 REDIS_VERSION=6:7.2.7-1rl1~bookworm1 /bin/sh -c if [ "$INSTALL_TYPE" = "all" ] ; then         pip install "/tmp/project/[all]" &&         python -m crawl4ai.model_loader ;     elif [ "$INSTALL_TYPE" = "torch" ] ; then         pip install "/tmp/project/[torch]" ;     elif [ "$INSTALL_TYPE" = "transformer" ] ; then         pip install "/tmp/project/[transformer]" &&         python -m crawl4ai.model_loader ;     else         pip install "/tmp/project" ;     fi # buildkit
                        
# 2026-03-21 14:34:27  0.00B Run a command and create a new image layer
RUN |10 C4AI_VER=0.8.5 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 REDIS_VERSION=6:7.2.7-1rl1~bookworm1 /bin/sh -c if [ "$INSTALL_TYPE" = "all" ] ; then         pip install --no-cache-dir             torch             torchvision             torchaudio             scikit-learn             nltk             transformers             tokenizers &&         python -m nltk.downloader punkt stopwords ;     fi # buildkit
                        
# 2026-03-21 14:34:27  115.24MB Run a command and create a new image layer
RUN |10 C4AI_VER=0.8.5 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 REDIS_VERSION=6:7.2.7-1rl1~bookworm1 /bin/sh -c pip install --no-cache-dir -r requirements.txt # buildkit
                        
# 2026-03-21 14:32:46  303.00B Copy files or directories into the image
COPY deploy/docker/requirements.txt . # buildkit
                        
# 2026-03-21 14:32:46  1.31KB Copy files or directories into the image
COPY deploy/docker/supervisord.conf . # buildkit
                        
# 2026-03-21 14:32:46  37.33MB Copy files or directories into the image
COPY . /tmp/project/ # buildkit
                        
# 2026-03-21 14:32:46  435.00B Run a command and create a new image layer
RUN |10 C4AI_VER=0.8.5 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 REDIS_VERSION=6:7.2.7-1rl1~bookworm1 /bin/sh -c echo '#!/bin/bash\nif [ "$USE_LOCAL" = "true" ]; then\n    echo "📦 Installing from local source..."\n    pip install --no-cache-dir /tmp/project/\nelse\n    echo "🌐 Installing from GitHub..."\n    for i in {1..3}; do \n        git clone --branch ${GITHUB_BRANCH} ${GITHUB_REPO} /tmp/crawl4ai && break || \n        { echo "Attempt $i/3 failed! Taking a short break... ☕"; sleep 5; }; \n    done\n    pip install --no-cache-dir /tmp/crawl4ai\nfi' > /tmp/install.sh && chmod +x /tmp/install.sh # buildkit
                        
# 2026-03-21 14:32:45  0.00B Set the working directory to /app
WORKDIR /app
                        
# 2026-03-21 14:32:45  0.00B Run a command and create a new image layer
RUN |10 C4AI_VER=0.8.5 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 REDIS_VERSION=6:7.2.7-1rl1~bookworm1 /bin/sh -c mkdir -p /home/appuser && chown -R appuser:appuser /home/appuser # buildkit
                        
# 2026-03-21 14:32:45  4.50KB Run a command and create a new image layer
RUN |10 C4AI_VER=0.8.5 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 REDIS_VERSION=6:7.2.7-1rl1~bookworm1 /bin/sh -c groupadd -r appuser && useradd --no-log-init -r -g appuser appuser # buildkit
                        
# 2026-03-21 14:32:45  64.19MB Run a command and create a new image layer
RUN |10 C4AI_VER=0.8.5 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 REDIS_VERSION=6:7.2.7-1rl1~bookworm1 /bin/sh -c if [ "$TARGETARCH" = "arm64" ]; then     echo "🦾 Installing ARM-specific optimizations";     apt-get update && apt-get install -y --no-install-recommends     libopenblas-dev     && apt-get clean     && rm -rf /var/lib/apt/lists/*; elif [ "$TARGETARCH" = "amd64" ]; then     echo "🖥️ Installing AMD64-specific optimizations";     apt-get update && apt-get install -y --no-install-recommends     libomp-dev     && apt-get clean     && rm -rf /var/lib/apt/lists/*; else     echo "Skipping platform-specific optimizations (unsupported platform)"; fi # buildkit
                        
# 2026-03-21 14:32:17  0.00B Run a command and create a new image layer
RUN |10 C4AI_VER=0.8.5 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 REDIS_VERSION=6:7.2.7-1rl1~bookworm1 /bin/sh -c if [ "$ENABLE_GPU" = "true" ] && [ "$TARGETARCH" = "amd64" ] ; then     apt-get update && apt-get install -y --no-install-recommends     nvidia-cuda-toolkit     && apt-get clean     && rm -rf /var/lib/apt/lists/* ; else     echo "Skipping NVIDIA CUDA Toolkit installation (unsupported platform or GPU disabled)"; fi # buildkit
                        
# 2026-03-21 14:32:17  25.87MB Run a command and create a new image layer
RUN |10 C4AI_VER=0.8.5 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 REDIS_VERSION=6:7.2.7-1rl1~bookworm1 /bin/sh -c apt-get update && apt-get dist-upgrade -y     && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2026-03-21 14:31:45  32.70MB Run a command and create a new image layer
RUN |10 C4AI_VER=0.8.5 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 REDIS_VERSION=6:7.2.7-1rl1~bookworm1 /bin/sh -c apt-get update && apt-get install -y --no-install-recommends     libglib2.0-0     libnss3     libnspr4     libatk1.0-0     libatk-bridge2.0-0     libcups2     libdrm2     libdbus-1-3     libxcb1     libxkbcommon0     libx11-6     libxcomposite1     libxdamage1     libxext6     libxfixes3     libxrandr2     libgbm1     libpango-1.0-0     libcairo2     libasound2     libatspi2.0-0     && apt-get clean     && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2026-03-21 14:30:50  467.52MB Run a command and create a new image layer
RUN |10 C4AI_VER=0.8.5 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 REDIS_VERSION=6:7.2.7-1rl1~bookworm1 /bin/sh -c apt-get update && apt-get install -y --no-install-recommends     build-essential     curl     wget     gnupg     git     cmake     pkg-config     python3-dev     libjpeg-dev     redis-tools${REDIS_VERSION:+=$REDIS_VERSION}     redis-server${REDIS_VERSION:+=$REDIS_VERSION}     supervisor     && apt-get clean     && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2026-03-21 14:28:22  2.38KB Run a command and create a new image layer
RUN |10 C4AI_VER=0.8.5 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 REDIS_VERSION=6:7.2.7-1rl1~bookworm1 /bin/sh -c curl -fsSL https://packages.redis.io/gpg | gpg --dearmor -o /usr/share/keyrings/redis-archive-keyring.gpg     && echo "deb [signed-by=/usr/share/keyrings/redis-archive-keyring.gpg] https://packages.redis.io/deb bookworm main"     > /etc/apt/sources.list.d/redis.list # buildkit
                        
# 2026-03-21 14:28:20  10.99MB Run a command and create a new image layer
RUN |10 C4AI_VER=0.8.5 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 REDIS_VERSION=6:7.2.7-1rl1~bookworm1 /bin/sh -c apt-get update && apt-get install -y --no-install-recommends curl gnupg     && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2026-03-21 14:28:20  0.00B Add a metadata label
LABEL version=1.0
                        
# 2026-03-21 14:28:20  0.00B Add a metadata label
LABEL description=🔥🕷️ Crawl4AI: Open-source LLM Friendly Web Crawler & scraper
                        
# 2026-03-21 14:28:20  0.00B Add a metadata label
LABEL maintainer=unclecode
                        
# 2026-03-21 14:28:20  0.00B Define a build argument
ARG REDIS_VERSION=6:7.2.7-1rl1~bookworm1
                        
# 2026-03-21 14:28:20  0.00B Define a build argument
ARG TARGETARCH=arm64
                        
# 2026-03-21 14:28:20  0.00B Define a build argument
ARG ENABLE_GPU=false
                        
# 2026-03-21 14:28:20  0.00B Define a build argument
ARG INSTALL_TYPE=default
                        
# 2026-03-21 14:28:20  0.00B Define a build argument
ARG PYTHON_VERSION=3.12
                        
# 2026-03-21 14:28:20  0.00B Set environment variables PYTHONFAULTHANDLER PYTHONHASHSEED PYTHONUNBUFFERED PIP_NO_CACHE_DIR PYTHONDONTWRITEBYTECODE PIP_DISABLE_PIP_VERSION_CHECK PIP_DEFAULT_TIMEOUT DEBIAN_FRONTEND REDIS_HOST REDIS_PORT
ENV PYTHONFAULTHANDLER=1 PYTHONHASHSEED=random PYTHONUNBUFFERED=1 PIP_NO_CACHE_DIR=1 PYTHONDONTWRITEBYTECODE=1 PIP_DISABLE_PIP_VERSION_CHECK=1 PIP_DEFAULT_TIMEOUT=100 DEBIAN_FRONTEND=noninteractive REDIS_HOST=localhost REDIS_PORT=6379
                        
# 2026-03-21 14:28:20  0.00B Define a build argument
ARG USE_LOCAL=true
                        
# 2026-03-21 14:28:20  0.00B Define a build argument
ARG GITHUB_BRANCH=main
                        
# 2026-03-21 14:28:20  0.00B Define a build argument
ARG GITHUB_REPO=https://github.com/unclecode/crawl4ai.git
                        
# 2026-03-21 14:28:20  0.00B Define a build argument
ARG APP_HOME=/app
                        
# 2026-03-21 14:28:20  0.00B Add a metadata label
LABEL c4ai.version=0.8.5
                        
# 2026-03-21 14:28:20  0.00B Set environment variable C4AI_VERSION
ENV C4AI_VERSION=0.8.5
                        
# 2026-03-21 14:28:20  0.00B Define a build argument
ARG C4AI_VER=0.8.5
                        
# 2026-03-17 07:15:37  0.00B Set the default command to run
CMD ["python3"]
                        
# 2026-03-17 07:15:37  36.00B Run a command and create a new image layer
RUN /bin/sh -c set -eux; 	for src in idle3 pip3 pydoc3 python3 python3-config; do 		dst="$(echo "$src" | tr -d 3)"; 		[ -s "/usr/local/bin/$src" ]; 		[ ! -e "/usr/local/bin/$dst" ]; 		ln -svT "$src" "/usr/local/bin/$dst"; 	done # buildkit
                        
# 2026-03-17 07:15:37  43.60MB Run a command and create a new image layer
RUN /bin/sh -c set -eux; 		savedAptMark="$(apt-mark showmanual)"; 	apt-get update; 	apt-get install -y --no-install-recommends 		dpkg-dev 		gcc 		gnupg 		libbluetooth-dev 		libbz2-dev 		libc6-dev 		libdb-dev 		libffi-dev 		libgdbm-dev 		liblzma-dev 		libncursesw5-dev 		libreadline-dev 		libsqlite3-dev 		libssl-dev 		make 		tk-dev 		uuid-dev 		wget 		xz-utils 		zlib1g-dev 	; 		wget -O python.tar.xz "https://www.python.org/ftp/python/${PYTHON_VERSION%%[a-z]*}/Python-$PYTHON_VERSION.tar.xz"; 	echo "$PYTHON_SHA256 *python.tar.xz" | sha256sum -c -; 	wget -O python.tar.xz.asc "https://www.python.org/ftp/python/${PYTHON_VERSION%%[a-z]*}/Python-$PYTHON_VERSION.tar.xz.asc"; 	GNUPGHOME="$(mktemp -d)"; export GNUPGHOME; 	gpg --batch --keyserver hkps://keys.openpgp.org --recv-keys "$GPG_KEY"; 	gpg --batch --verify python.tar.xz.asc python.tar.xz; 	gpgconf --kill all; 	rm -rf "$GNUPGHOME" python.tar.xz.asc; 	mkdir -p /usr/src/python; 	tar --extract --directory /usr/src/python --strip-components=1 --file python.tar.xz; 	rm python.tar.xz; 		cd /usr/src/python; 	gnuArch="$(dpkg-architecture --query DEB_BUILD_GNU_TYPE)"; 	./configure 		--build="$gnuArch" 		--enable-loadable-sqlite-extensions 		--enable-optimizations 		--enable-option-checking=fatal 		--enable-shared 		$(test "${gnuArch%%-*}" != 'riscv64' && echo '--with-lto') 		--with-ensurepip 	; 	nproc="$(nproc)"; 	EXTRA_CFLAGS="$(dpkg-buildflags --get CFLAGS)"; 	LDFLAGS="$(dpkg-buildflags --get LDFLAGS)"; 	LDFLAGS="${LDFLAGS:-} -Wl,--strip-all"; 	arch="$(dpkg --print-architecture)"; arch="${arch##*-}"; 	case "$arch" in 		amd64|arm64) 			EXTRA_CFLAGS="${EXTRA_CFLAGS:-} -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer"; 			;; 		i386) 			;; 		*) 			EXTRA_CFLAGS="${EXTRA_CFLAGS:-} -fno-omit-frame-pointer"; 			;; 	esac; 	make -j "$nproc" 		"EXTRA_CFLAGS=${EXTRA_CFLAGS:-}" 		"LDFLAGS=${LDFLAGS:-}" 	; 	rm python; 	make -j "$nproc" 		"EXTRA_CFLAGS=${EXTRA_CFLAGS:-}" 		"LDFLAGS=${LDFLAGS:-} -Wl,-rpath='\$\$ORIGIN/../lib'" 		python 	; 
	make install; 		cd /; 	rm -rf /usr/src/python; 		find /usr/local -depth 		\( 			\( -type d -a \( -name test -o -name tests -o -name idle_test \) \) 			-o \( -type f -a \( -name '*.pyc' -o -name '*.pyo' -o -name 'libpython*.a' \) \) 		\) -exec rm -rf '{}' + 	; 		ldconfig; 		apt-mark auto '.*' > /dev/null; 	apt-mark manual $savedAptMark; 	find /usr/local -type f -executable -not \( -name '*tkinter*' \) -exec ldd '{}' ';' 		| awk '/=>/ { so = $(NF-1); if (index(so, "/usr/local/") == 1) { next }; gsub("^/(usr/)?", "", so); printf "*%s\n", so }' 		| sort -u 		| xargs -rt dpkg-query --search 		| awk 'sub(":$", "", $1) { print $1 }' 		| sort -u 		| xargs -r apt-mark manual 	; 	apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false; 	rm -rf /var/lib/apt/lists/*; 		export PYTHONDONTWRITEBYTECODE=1; 	python3 --version; 	pip3 --version # buildkit
                        
# 2026-03-17 07:04:02  0.00B Set environment variable PYTHON_SHA256
ENV PYTHON_SHA256=c08bc65a81971c1dd5783182826503369466c7e67374d1646519adf05207b684
                        
# 2026-03-17 07:04:02  0.00B Set environment variable PYTHON_VERSION
ENV PYTHON_VERSION=3.12.13
                        
# 2026-03-17 07:04:02  0.00B Set environment variable GPG_KEY
ENV GPG_KEY=7169605F62C751356D054A26A821E680E5FA6305
                        
# 2026-03-17 07:04:02  9.25MB Run a command and create a new image layer
RUN /bin/sh -c set -eux; 	apt-get update; 	apt-get install -y --no-install-recommends 		ca-certificates 		netbase 		tzdata 	; 	rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2026-03-17 07:04:02  0.00B Set environment variable LANG
ENV LANG=C.UTF-8
                        
# 2026-03-17 07:04:02  0.00B Set environment variable PATH
ENV PATH=/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2026-03-16 08:00:00  97.21MB 
# debian.sh --arch 'arm64' out/ 'bookworm' '@1773619200'
                        
                    

Image information (docker inspect)

{
    "Id": "sha256:803c2734ab5d3fd5ed7dc712ac643924225ab6f74f2e27ba5bb6109afa36db12",
    "RepoTags": [
        "unclecode/crawl4ai:0.8.5",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8.5-linuxarm64"
    ],
    "RepoDigests": [
        "unclecode/crawl4ai@sha256:f95ba27c4a4c07841eeff35b956906c5311961000e704dec3ebeeb96b5c46c60",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai@sha256:3feaf01e9eb53aeca173aa2166324282a399fb3fb30b50e0a23b6c4d54d98932"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2026-03-21T06:49:38.724079077Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "appuser",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "ExposedPorts": {
            "6379/tcp": {}
        },
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "LANG=C.UTF-8",
            "GPG_KEY=7169605F62C751356D054A26A821E680E5FA6305",
            "PYTHON_VERSION=3.12.13",
            "PYTHON_SHA256=c08bc65a81971c1dd5783182826503369466c7e67374d1646519adf05207b684",
            "C4AI_VERSION=0.8.5",
            "PYTHONFAULTHANDLER=1",
            "PYTHONHASHSEED=random",
            "PYTHONUNBUFFERED=1",
            "PIP_NO_CACHE_DIR=1",
            "PYTHONDONTWRITEBYTECODE=1",
            "PIP_DISABLE_PIP_VERSION_CHECK=1",
            "PIP_DEFAULT_TIMEOUT=100",
            "DEBIAN_FRONTEND=noninteractive",
            "REDIS_HOST=localhost",
            "REDIS_PORT=6379",
            "PYTHON_ENV=production"
        ],
        "Cmd": [
            "supervisord",
            "-c",
            "supervisord.conf"
        ],
        "Healthcheck": {
            "Test": [
                "CMD-SHELL",
                "bash -c '    MEM=$(free -m | awk \"/^Mem:/{print \\$2}\");     if [ $MEM -lt 2048 ]; then         echo \"⚠️ Warning: Less than 2GB RAM available! Your container might need a memory boost! 🚀\";         exit 1;     fi \u0026\u0026     redis-cli ping \u003e /dev/null \u0026\u0026     curl -f http://localhost:11235/health || exit 1'"
            ],
            "Interval": 30000000000,
            "Timeout": 10000000000,
            "StartPeriod": 5000000000,
            "Retries": 3
        },
        "ArgsEscaped": true,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "/app",
        "Entrypoint": null,
        "OnBuild": null,
        "Labels": {
            "c4ai.version": "0.8.5",
            "description": "🔥🕷️ Crawl4AI: Open-source LLM Friendly Web Crawler \u0026 scraper",
            "maintainer": "unclecode",
            "version": "1.0"
        }
    },
    "Architecture": "arm64",
    "Os": "linux",
    "Size": 6921640452,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/77fae03bc17879e95fd948b8be2cb9a9923f55b11ca37ddd67c6661df715326f/diff:/var/lib/docker/overlay2/edc0dca3ce3664e60ccbd46afbe1d32f95ed1b118979e4f041b632bc3f589926/diff:/var/lib/docker/overlay2/be1624f323ecd476dc3afa2d86737359c52e3ef2e42fbee9eb527dce8100d65c/diff:/var/lib/docker/overlay2/db1bc60b2514977dec979b37cfe642bb3b3e7b3bba91bd4af6dff4e8deeedbc6/diff:/var/lib/docker/overlay2/e4e3030db82a4e4509c607ac3734bb0e83e599f659bd9fa1a92cbcbf416d8595/diff:/var/lib/docker/overlay2/dae3b3bcfc0eead668bad7dc227906244b399e0c3c8f1dcaf602a8a92ab5124e/diff:/var/lib/docker/overlay2/42b6699a36a2ff2fedf23beca44087b4d32c9df5745c4c78371c00f59a29dab2/diff:/var/lib/docker/overlay2/5427fb533462571ebfeed8bab0f3162094d8f7c0d98f4d9b95c8a8a8841ef427/diff:/var/lib/docker/overlay2/4b69bd558748545bb0e2dc28d217fa222107a91a02e0f1e570c854e9861395a5/diff:/var/lib/docker/overlay2/284a1e00cf0302948cb4f3a19e736a577f58b8532bb8c90c3e1675691f3e60f2/diff:/var/lib/docker/overlay2/4c971c77e1c6865a3cc345ac5cdefa4e7cf4ea961fb305c1f59d4867c022d182/diff:/var/lib/docker/overlay2/ae7bf5e6f42c9ef1a5eb37c3c866a7d7e37e912fd554a9d87d0defb1eaafdada/diff:/var/lib/docker/overlay2/d81bbc4353dcb50cfeeceeb5dcb04ae129c4e13688822b9f584d6f2b201aac61/diff:/var/lib/docker/overlay2/3dc72b823349893ce9bcd40efcf2bbf2a29ac9d6a127802c33a576a3fde480d3/diff:/var/lib/docker/overlay2/47de71fdfc5d1e0e02efc382b9f39d6ab70e29657d23a76993280b5577d4d6ac/diff:/var/lib/docker/overlay2/e1fd0613c14672681a6278930ce117dad6c3501dc69f32eaa5705c85c99f9505/diff:/var/lib/docker/overlay2/a85033ade41ad42892ad261c55386049397596737198c4c59f0a1553bd132333/diff:/var/lib/docker/overlay2/bd0d2b84330b4e5cac320ebdcab8960e3454daad8d1b188cc3241bded6d515f1/diff:/var/lib/docker/overlay2/79a73773b544093240ce562f608e4be3eff9ad9734a5a1108817f0bfbd9bc201/diff:/var/lib/docker/overlay2/cd9c9bd0439ac7185e45560b182ef5b2cc19ad2f16d5a0c54670a6ec31ba1517/diff:/var/lib/docker/overlay2/1474994d0d74727770600b242c25a00d932bb1db77aee26253
51502f379ba593/diff:/var/lib/docker/overlay2/4b615a27f5d286220d9549922d3463bf5f922fa5a66a50ba762250bdd3073d86/diff:/var/lib/docker/overlay2/02485dc9ed2a24c946fbad916b0ec3a4cbb9bd27bce990fd4dc48a0174de4c61/diff:/var/lib/docker/overlay2/4c41682d410ea663ac9b82d19adf97a04c0d2d2d636533f0720a816b04f39496/diff:/var/lib/docker/overlay2/6de6b744d4fc2d43924e616b42788ea7164359d458e3be1bb1445edcb66b302a/diff:/var/lib/docker/overlay2/d97b3ba9149f120fc039b46f3e80b5664589fcb3bd271c2f6bb53190edc418e1/diff:/var/lib/docker/overlay2/5b46ddcb06ae5e22e3b1500eb0b11a29d3724c4394fbb8b353e22c9922a51f70/diff:/var/lib/docker/overlay2/010d070995f0793e9a8e85e19e27985460f45fbd5d042dba34a57ea43c033b8e/diff:/var/lib/docker/overlay2/b05b351ff22e481a24a41af5cc6e4c035cea4f8b4cec7f6c6201ae27b43b3bf9/diff:/var/lib/docker/overlay2/9010d75114e22d372a88f14436e1812afb46fdf6368b43c63c474844bd8e17b5/diff",
            "MergedDir": "/var/lib/docker/overlay2/f6174b5b24e7872f54ff87969f313bafaa97d3d2a3893b5b9a84dfb442c5cf42/merged",
            "UpperDir": "/var/lib/docker/overlay2/f6174b5b24e7872f54ff87969f313bafaa97d3d2a3893b5b9a84dfb442c5cf42/diff",
            "WorkDir": "/var/lib/docker/overlay2/f6174b5b24e7872f54ff87969f313bafaa97d3d2a3893b5b9a84dfb442c5cf42/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:6ef58325ba0a417962b349e6810a7177443d67fed7462c1d19da22af8b7197e0",
            "sha256:7df2a8ebfaaebdf55359911987063a53f8be2d215514b93487df85f15ad77eed",
            "sha256:19d6e5da0c1b6fa24c75546543c1ef67df036935cbb967ff3014c8346b4b090f",
            "sha256:8f9eabc6c6acc66a842db02f59dfa60ce5d2d330db519da327392c6f1c684c0b",
            "sha256:5950b978221d2c4fa596157825b0cbe35f2395601062655c3752c5a756821f17",
            "sha256:b51c3cb8b81199b852bc5f2e203765dc6a1b107c11404debdbb1b0d734b7b322",
            "sha256:ed12e7321feff2cd26659a199b66e221b186c46ac6e216b8d358fb52b78f2cb2",
            "sha256:dc3915773d3dc3f3b278edb3b43572f5f7183019d3839b811dfd3a8bc75879a9",
            "sha256:4018e65ae58feed1316a8d8673306800e3eecdd6d1f502fa930e1fb256fa1df5",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:bcb4f437d838fc4c2d808b1d00d83a0554f543fe44f0e86afa99c7007946f6b2",
            "sha256:31e429406e5572303b9cd8703266a7e521021afb4b93db099b5cee925cffdc92",
            "sha256:50c5a28354ea0e016ee714bbc399bbf4f49940b301817bafd04982172a7c042a",
            "sha256:42ce5f9d5e82953f1ade1f661f397cf0605690756ebc95bfea50f5f4670e7296",
            "sha256:b4722525c4f9d099871cc9e240ef62d35fbea2b204a4a62c19c878b10ba720b0",
            "sha256:995480a933a5f0c82d3ef2ea877f74354fb7e4a89025c893f02a7d1728636a2c",
            "sha256:b3015f6dd55cd80bd6ecbfd49e8588175aa0923f2051c893504ae24dbaa6835a",
            "sha256:394e6dc02c13fb74216b76949b6fb25e0139308f3cfedf5caa124df84df1a3f2",
            "sha256:f3e1a957dea35509b93389e9c90e306ca649d87d2bfafe2a0530f3ce9d4c4ab0",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:6a7107275f1bcc65af39453d2f8c6220b1335fcf9b135be81164780c3571026e",
            "sha256:8e4e0a859ab99a2ba2dd5c1cbeed17f490c537b07752da67c137429d98486e33",
            "sha256:aa11a89a84e3d289503ea83c4ad6d21ecec5b90a6c0ab2ca4053652861ceb960",
            "sha256:cfa88043366ee2a3dcbe86711555273f6ffa32d6aa19c5752af9e0fd364cdadd",
            "sha256:b5fe063f0e663a4fc7cf222332ff55b71fbe774dfa961177859c94096fe08f69",
            "sha256:0c1cde79e00e16c4dcfbc927dcaa689d89b6a5893a55344ab579eca7f479a002",
            "sha256:ba3ca8facaaa57eb28d01f2c09eb5d65c6799d02b40b4e4f0fa17f89ec76f751",
            "sha256:4e2ddb9a51644aaf96eb0b5cc13cf8f3c35197049450ca9dc1a1b0265570aae3",
            "sha256:46c70403cc596b1627434cdfa894a8b08b6148f4cd0d21059e5b203669bcb781",
            "sha256:4542dd8d93443b5028ce73d60ec9df0ed60edb00fd83dbe96f14fc4aa89d7fa2",
            "sha256:bd588fe807b484d575b0449b0b4cb4aeabe7de0916ad069a03556951a666fcd3"
        ]
    },
    "Metadata": {
        "LastTagTime": "2026-04-07T14:37:48.243198636+08:00"
    }
}
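The Healthcheck entry in the inspect output combines three probes: a 2GB memory floor parsed from `free -m`, a `redis-cli ping`, and an HTTP check against port 11235. The memory-floor logic can be sketched in isolation; a canned `free -m` line stands in for the live command so the snippet runs anywhere:

```shell
# Sketch of the HEALTHCHECK's memory floor, using a sample `free -m` line.
sample='Mem:            7821        2048        5773'
MEM=$(printf '%s\n' "$sample" | awk '/^Mem:/{print $2}')
if [ "$MEM" -lt 2048 ]; then
    echo "warning: less than 2GB RAM available"
else
    echo "mem ok: ${MEM}MB"
fi
```

In the image itself this check runs every 30s with a 10s timeout, a 5s start period, and 3 retries, per the Healthcheck block above.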

More versions

docker.io/unclecode/crawl4ai:basic-amd64 (linux/amd64, 2.32GB, synced 2024-12-30 11:12)
docker.io/unclecode/crawl4ai:all-amd64 (linux/amd64, 8.53GB, synced 2025-01-08 00:15)
docker.io/unclecode/crawl4ai:0.6.0-r2 (linux/amd64, 4.16GB, synced 2025-05-19 15:26)
docker.io/unclecode/crawl4ai:0.6.0-r2 (linux/arm64, 3.98GB, synced 2025-06-12 21:13)
docker.io/unclecode/crawl4ai:latest (linux/amd64, 4.16GB, synced 2025-07-08 15:51)
docker.io/unclecode/crawl4ai:latest (linux/arm64, 6.64GB, synced 2025-08-01 11:12)
docker.io/unclecode/crawl4ai:0.7.2 (linux/arm64, 6.64GB, synced 2025-08-01 11:14)
docker.io/unclecode/crawl4ai:0.7.2 (linux/amd64, 4.17GB, synced 2025-08-06 15:57)
docker.io/unclecode/crawl4ai:0.7.4 (linux/amd64, 5.84GB, synced 2025-09-12 17:36)
docker.io/unclecode/crawl4ai:0.7.7 (linux/amd64, 4.36GB, synced 2025-11-29 11:28)
docker.io/unclecode/crawl4ai:0.8 (linux/amd64, 3.80GB, synced 2026-01-21 14:31)
docker.io/unclecode/crawl4ai:0.8.5 (linux/arm64, 6.92GB, synced 2026-04-07 14:43)