docker.io/unclecode/crawl4ai:0.7.2 linux/arm64

docker.io/unclecode/crawl4ai:0.7.2 - China-accessible mirror source. Views: 43. Note: this is a linux/arm64 image.

This is the Docker image docker.io/unclecode/crawl4ai. Per the image's own labels, Crawl4AI is an open-source, LLM-friendly web crawler and scraper; see the maintainer's Docker Hub page or README for full documentation.

Source image       docker.io/unclecode/crawl4ai:0.7.2
China mirror       swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.7.2-linuxarm64
Image ID           sha256:2f7bb73bc7c67bee501d227cd5021924e353c90e8afb5ebb762b375757a270a9
Image tag          0.7.2-linuxarm64
Size               6.64GB
Registry           docker.io
Project info       Docker Hub page · project tags
CMD                supervisord -c supervisord.conf
Entrypoint         (not set)
Working directory  /app
OS/Platform        linux/arm64
Views              43
Contributor        pe****p@outlook.com
Image created      2025-07-25T10:06:31.966206815Z
Synced             2025-08-01 11:14
Updated            2025-08-13 01:00
Exposed ports      6379/tcp
Environment variables
    PATH=/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
    LANG=C.UTF-8
    GPG_KEY=7169605F62C751356D054A26A821E680E5FA6305
    PYTHON_VERSION=3.12.11
    PYTHON_SHA256=c30bb24b7f1e9a19b11b55a546434f74e739bb4c271a3e3a80ff4380d49f7adb
    C4AI_VERSION=0.7.0-r1
    PYTHONFAULTHANDLER=1
    PYTHONHASHSEED=random
    PYTHONUNBUFFERED=1
    PIP_NO_CACHE_DIR=1
    PYTHONDONTWRITEBYTECODE=1
    PIP_DISABLE_PIP_VERSION_CHECK=1
    PIP_DEFAULT_TIMEOUT=100
    DEBIAN_FRONTEND=noninteractive
    REDIS_HOST=localhost
    REDIS_PORT=6379
    PYTHON_ENV=production
Image labels
    c4ai.version: 0.7.0-r1
    description: 🔥🕷️ Crawl4AI: Open-source LLM Friendly Web Crawler & scraper
    maintainer: unclecode
    version: 1.0

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.7.2-linuxarm64
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.7.2-linuxarm64  docker.io/unclecode/crawl4ai:0.7.2
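The mirror reference follows a fixed naming rule: prefix the original `docker.io/...` reference with the SWR repository path and append `-linux<arch>` to the tag. A minimal sketch of that rule (the `mirror_ref` function name is illustrative, not part of any tool):

```shell
# mirror_ref: derive the accelerated mirror reference for a docker.io image.
# Naming rule assumed from this page: SWR prefix + original ref + "-linux<arch>" tag suffix.
mirror_ref() {
  local image="$1" arch="$2"   # e.g. unclecode/crawl4ai:0.7.2  arm64
  printf 'swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/%s-linux%s\n' "$image" "$arch"
}

mirror_ref unclecode/crawl4ai:0.7.2 arm64
# → swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.7.2-linuxarm64
```

Re-tagging back to the original `docker.io` reference (as in the commands above) lets existing manifests keep their upstream image names.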

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.7.2-linuxarm64
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.7.2-linuxarm64  docker.io/unclecode/crawl4ai:0.7.2

Quick shell replacement command

sed -i 's#unclecode/crawl4ai:0.7.2#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.7.2-linuxarm64#' deployment.yaml
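A slightly safer variant of the same replacement keeps a backup of the manifest and verifies the substitution afterwards (the `deployment.yaml` below is a throwaway demo file, not your real manifest):

```shell
src='unclecode/crawl4ai:0.7.2'
dst='swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.7.2-linuxarm64'

printf 'image: %s\n' "$src" > deployment.yaml    # demo manifest with the source image ref
sed -i.bak "s#$src#$dst#g" deployment.yaml       # '#' delimiter avoids escaping the '/' in the ref
grep -q "$dst" deployment.yaml && echo replaced  # → replaced
```

The `.bak` copy preserves the original manifest in case the rollout has to be reverted.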

Ansible quick rollout - Docker

ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.7.2-linuxarm64 && docker tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.7.2-linuxarm64 docker.io/unclecode/crawl4ai:0.7.2'

Ansible quick rollout - Containerd

ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.7.2-linuxarm64 && ctr images tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.7.2-linuxarm64 docker.io/unclecode/crawl4ai:0.7.2'
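For repeatable rollouts, the same ad-hoc command can be written as a playbook. A hypothetical equivalent, assuming the same `k8s` inventory group as the examples above:

```yaml
# rollout-crawl4ai.yml (illustrative; adjust hosts/become to your environment)
- hosts: k8s
  become: true
  tasks:
    - name: Pull crawl4ai from the China mirror
      ansible.builtin.shell: >
        docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.7.2-linuxarm64

    - name: Re-tag to the original docker.io reference
      ansible.builtin.shell: >
        docker tag swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.7.2-linuxarm64
        docker.io/unclecode/crawl4ai:0.7.2
```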

Image build history


# 2025-07-25 18:06:31  0.00B Set the default command to execute
CMD ["supervisord", "-c", "supervisord.conf"]
                        
# 2025-07-25 18:06:31  0.00B Set environment variable PYTHON_ENV
ENV PYTHON_ENV=production
                        
# 2025-07-25 18:06:31  0.00B Set the user the container runs as
USER appuser
                        
# 2025-07-25 18:06:31  0.00B Declare the port the container listens on
EXPOSE 6379/tcp
                        
# 2025-07-25 18:06:31  0.00B Define the container health-check command
HEALTHCHECK --interval=30s --timeout=10s --start-period=5s --retries=3 \
    CMD bash -c 'MEM=$(free -m | awk "/^Mem:/{print \$2}"); if [ $MEM -lt 2048 ]; then echo "⚠️ Warning: Less than 2GB RAM available! Your container might need a memory boost! 🚀"; exit 1; fi && redis-cli ping > /dev/null && curl -f http://localhost:11235/health || exit 1'
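This health check chains three probes: a memory floor of 2048 MB, a `redis-cli ping`, and an HTTP GET on `http://localhost:11235/health`. The memory guard on its own, restated as a sketch (the `mem_ok` helper is illustrative, not part of the image):

```shell
# Memory floor from the HEALTHCHECK above: fail if total RAM (MB) is below 2048.
mem_ok() {
  local mem_mb="$1" floor="${2:-2048}"
  [ "$mem_mb" -ge "$floor" ]
}

mem_ok 4096 && echo "enough RAM"           # → enough RAM
mem_ok 1024 || echo "less than 2GB RAM"    # → less than 2GB RAM
```

In the real check, `mem_mb` comes from `free -m`, and a failure marks the container unhealthy after 3 retries.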
                        
# 2025-07-25 18:06:31  0.00B Run a command and create a new image layer
RUN |9 C4AI_VER=0.7.0-r1 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 /bin/sh -c mkdir -p /var/lib/redis /var/log/redis && chown -R appuser:appuser /var/lib/redis /var/log/redis # buildkit
                        
# 2025-07-25 18:06:31  1.33GB Run a command and create a new image layer
RUN |9 C4AI_VER=0.7.0-r1 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 /bin/sh -c chown -R appuser:appuser ${APP_HOME} # buildkit
                        
# 2025-07-25 18:06:28  43.66KB Copy new files or directories into the container
COPY deploy/docker/static /app/static # buildkit
                        
# 2025-07-25 18:06:28  890.81KB Copy new files or directories into the container
COPY deploy/docker/* /app/ # buildkit
                        
# 2025-07-25 18:06:28  1.33GB Run a command and create a new image layer
RUN |9 C4AI_VER=0.7.0-r1 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 /bin/sh -c crawl4ai-doctor # buildkit
                        
# 2025-07-25 18:05:37  601.65MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.7.0-r1 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 /bin/sh -c mkdir -p /home/appuser/.cache/ms-playwright     && cp -r /root/.cache/ms-playwright/chromium-* /home/appuser/.cache/ms-playwright/     && chown -R appuser:appuser /home/appuser/.cache/ms-playwright # buildkit
                        
# 2025-07-25 18:05:35  826.87MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.7.0-r1 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 /bin/sh -c playwright install --with-deps # buildkit
                        
# 2025-07-25 18:02:12  1.19GB Run a command and create a new image layer
RUN |9 C4AI_VER=0.7.0-r1 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 /bin/sh -c crawl4ai-setup # buildkit
                        
# 2025-07-25 17:58:44  14.24MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.7.0-r1 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 /bin/sh -c pip install --no-cache-dir --upgrade pip &&     /tmp/install.sh &&     python -c "import crawl4ai; print('✅ crawl4ai is ready to rock!')" &&     python -c "from playwright.sync_api import sync_playwright; print('✅ Playwright is feeling dramatic!')" # buildkit
                        
# 2025-07-25 17:56:51  487.27MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.7.0-r1 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 /bin/sh -c if [ "$INSTALL_TYPE" = "all" ] ; then         pip install "/tmp/project/[all]" &&         python -m crawl4ai.model_loader ;     elif [ "$INSTALL_TYPE" = "torch" ] ; then         pip install "/tmp/project/[torch]" ;     elif [ "$INSTALL_TYPE" = "transformer" ] ; then         pip install "/tmp/project/[transformer]" &&         python -m crawl4ai.model_loader ;     else         pip install "/tmp/project" ;     fi # buildkit
                        
# 2025-07-25 17:51:53  0.00B Run a command and create a new image layer
RUN |9 C4AI_VER=0.7.0-r1 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 /bin/sh -c if [ "$INSTALL_TYPE" = "all" ] ; then         pip install --no-cache-dir             torch             torchvision             torchaudio             scikit-learn             nltk             transformers             tokenizers &&         python -m nltk.downloader punkt stopwords ;     fi # buildkit
                        
# 2025-07-25 17:51:53  108.22MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.7.0-r1 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 /bin/sh -c pip install --no-cache-dir -r requirements.txt # buildkit
                        
# 2025-07-25 17:50:20  302.00B Copy new files or directories into the container
COPY deploy/docker/requirements.txt . # buildkit
                        
# 2025-07-25 17:50:20  1.31KB Copy new files or directories into the container
COPY deploy/docker/supervisord.conf . # buildkit
                        
# 2025-07-25 17:50:20  33.53MB Copy new files or directories into the container
COPY . /tmp/project/ # buildkit
                        
# 2025-07-25 17:50:20  435.00B Run a command and create a new image layer
RUN |9 C4AI_VER=0.7.0-r1 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 /bin/sh -c echo '#!/bin/bash\nif [ "$USE_LOCAL" = "true" ]; then\n    echo "📦 Installing from local source..."\n    pip install --no-cache-dir /tmp/project/\nelse\n    echo "🌐 Installing from GitHub..."\n    for i in {1..3}; do \n        git clone --branch ${GITHUB_BRANCH} ${GITHUB_REPO} /tmp/crawl4ai && break || \n        { echo "Attempt $i/3 failed! Taking a short break... ☕"; sleep 5; }; \n    done\n    pip install --no-cache-dir /tmp/crawl4ai\nfi' > /tmp/install.sh && chmod +x /tmp/install.sh # buildkit
                        
# 2025-07-25 17:50:20  0.00B Set the working directory to /app
WORKDIR /app
                        
# 2025-07-25 17:50:20  0.00B Run a command and create a new image layer
RUN |9 C4AI_VER=0.7.0-r1 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 /bin/sh -c mkdir -p /home/appuser && chown -R appuser:appuser /home/appuser # buildkit
                        
# 2025-07-25 17:50:20  4.50KB Run a command and create a new image layer
RUN |9 C4AI_VER=0.7.0-r1 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 /bin/sh -c groupadd -r appuser && useradd --no-log-init -r -g appuser appuser # buildkit
                        
# 2025-07-25 17:50:19  64.18MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.7.0-r1 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 /bin/sh -c if [ "$TARGETARCH" = "arm64" ]; then     echo "🦾 Installing ARM-specific optimizations";     apt-get update && apt-get install -y --no-install-recommends     libopenblas-dev     && apt-get clean     && rm -rf /var/lib/apt/lists/*; elif [ "$TARGETARCH" = "amd64" ]; then     echo "🖥️ Installing AMD64-specific optimizations";     apt-get update && apt-get install -y --no-install-recommends     libomp-dev     && apt-get clean     && rm -rf /var/lib/apt/lists/*; else     echo "Skipping platform-specific optimizations (unsupported platform)"; fi # buildkit
                        
# 2025-07-25 17:49:53  0.00B Run a command and create a new image layer
RUN |9 C4AI_VER=0.7.0-r1 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 /bin/sh -c if [ "$ENABLE_GPU" = "true" ] && [ "$TARGETARCH" = "amd64" ] ; then     apt-get update && apt-get install -y --no-install-recommends     nvidia-cuda-toolkit     && apt-get clean     && rm -rf /var/lib/apt/lists/* ; else     echo "Skipping NVIDIA CUDA Toolkit installation (unsupported platform or GPU disabled)"; fi # buildkit
                        
# 2025-07-25 17:49:53  0.00B Run a command and create a new image layer
RUN |9 C4AI_VER=0.7.0-r1 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 /bin/sh -c apt-get update && apt-get dist-upgrade -y     && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2025-07-25 17:49:34  32.70MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.7.0-r1 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 /bin/sh -c apt-get update && apt-get install -y --no-install-recommends     libglib2.0-0     libnss3     libnspr4     libatk1.0-0     libatk-bridge2.0-0     libcups2     libdrm2     libdbus-1-3     libxcb1     libxkbcommon0     libx11-6     libxcomposite1     libxdamage1     libxext6     libxfixes3     libxrandr2     libgbm1     libpango-1.0-0     libcairo2     libasound2     libatspi2.0-0     && apt-get clean     && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2025-07-25 17:48:40  475.52MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.7.0-r1 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=arm64 /bin/sh -c apt-get update && apt-get install -y --no-install-recommends     build-essential     curl     wget     gnupg     git     cmake     pkg-config     python3-dev     libjpeg-dev     redis-server     supervisor     && apt-get clean     && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2025-07-25 17:48:40  0.00B Add a metadata label
LABEL version=1.0
                        
# 2025-07-25 17:48:40  0.00B Add a metadata label
LABEL description=🔥🕷️ Crawl4AI: Open-source LLM Friendly Web Crawler & scraper
                        
# 2025-07-25 17:48:40  0.00B Add a metadata label
LABEL maintainer=unclecode
                        
# 2025-07-25 17:48:40  0.00B Define a build argument
ARG TARGETARCH=arm64
                        
# 2025-07-25 17:48:40  0.00B Define a build argument
ARG ENABLE_GPU=false
                        
# 2025-07-25 17:48:40  0.00B Define a build argument
ARG INSTALL_TYPE=default
                        
# 2025-07-25 17:48:40  0.00B Define a build argument
ARG PYTHON_VERSION=3.12
                        
# 2025-07-25 17:48:40  0.00B Set environment variables PYTHONFAULTHANDLER PYTHONHASHSEED PYTHONUNBUFFERED PIP_NO_CACHE_DIR PYTHONDONTWRITEBYTECODE PIP_DISABLE_PIP_VERSION_CHECK PIP_DEFAULT_TIMEOUT DEBIAN_FRONTEND REDIS_HOST REDIS_PORT
ENV PYTHONFAULTHANDLER=1 PYTHONHASHSEED=random PYTHONUNBUFFERED=1 PIP_NO_CACHE_DIR=1 PYTHONDONTWRITEBYTECODE=1 PIP_DISABLE_PIP_VERSION_CHECK=1 PIP_DEFAULT_TIMEOUT=100 DEBIAN_FRONTEND=noninteractive REDIS_HOST=localhost REDIS_PORT=6379
                        
# 2025-07-25 17:48:40  0.00B Define a build argument
ARG USE_LOCAL=true
                        
# 2025-07-25 17:48:40  0.00B Define a build argument
ARG GITHUB_BRANCH=main
                        
# 2025-07-25 17:48:40  0.00B Define a build argument
ARG GITHUB_REPO=https://github.com/unclecode/crawl4ai.git
                        
# 2025-07-25 17:48:40  0.00B Define a build argument
ARG APP_HOME=/app
                        
# 2025-07-25 17:48:40  0.00B Add a metadata label
LABEL c4ai.version=0.7.0-r1
                        
# 2025-07-25 17:48:40  0.00B Set environment variable C4AI_VERSION
ENV C4AI_VERSION=0.7.0-r1
                        
# 2025-07-25 17:48:40  0.00B Define a build argument
ARG C4AI_VER=0.7.0-r1
                        
# 2025-06-04 08:40:01  0.00B Set the default command to execute
CMD ["python3"]
                        
# 2025-06-04 08:40:01  36.00B Run a command and create a new image layer
RUN /bin/sh -c set -eux; 	for src in idle3 pip3 pydoc3 python3 python3-config; do 		dst="$(echo "$src" | tr -d 3)"; 		[ -s "/usr/local/bin/$src" ]; 		[ ! -e "/usr/local/bin/$dst" ]; 		ln -svT "$src" "/usr/local/bin/$dst"; 	done # buildkit
                        
# 2025-06-04 08:40:01  43.59MB Run a command and create a new image layer
RUN /bin/sh -c set -eux; 		savedAptMark="$(apt-mark showmanual)"; 	apt-get update; 	apt-get install -y --no-install-recommends 		dpkg-dev 		gcc 		gnupg 		libbluetooth-dev 		libbz2-dev 		libc6-dev 		libdb-dev 		libffi-dev 		libgdbm-dev 		liblzma-dev 		libncursesw5-dev 		libreadline-dev 		libsqlite3-dev 		libssl-dev 		make 		tk-dev 		uuid-dev 		wget 		xz-utils 		zlib1g-dev 	; 		wget -O python.tar.xz "https://www.python.org/ftp/python/${PYTHON_VERSION%%[a-z]*}/Python-$PYTHON_VERSION.tar.xz"; 	echo "$PYTHON_SHA256 *python.tar.xz" | sha256sum -c -; 	wget -O python.tar.xz.asc "https://www.python.org/ftp/python/${PYTHON_VERSION%%[a-z]*}/Python-$PYTHON_VERSION.tar.xz.asc"; 	GNUPGHOME="$(mktemp -d)"; export GNUPGHOME; 	gpg --batch --keyserver hkps://keys.openpgp.org --recv-keys "$GPG_KEY"; 	gpg --batch --verify python.tar.xz.asc python.tar.xz; 	gpgconf --kill all; 	rm -rf "$GNUPGHOME" python.tar.xz.asc; 	mkdir -p /usr/src/python; 	tar --extract --directory /usr/src/python --strip-components=1 --file python.tar.xz; 	rm python.tar.xz; 		cd /usr/src/python; 	gnuArch="$(dpkg-architecture --query DEB_BUILD_GNU_TYPE)"; 	./configure 		--build="$gnuArch" 		--enable-loadable-sqlite-extensions 		--enable-optimizations 		--enable-option-checking=fatal 		--enable-shared 		$(test "$gnuArch" != 'riscv64-linux-musl' && echo '--with-lto') 		--with-ensurepip 	; 	nproc="$(nproc)"; 	EXTRA_CFLAGS="$(dpkg-buildflags --get CFLAGS)"; 	LDFLAGS="$(dpkg-buildflags --get LDFLAGS)"; 	LDFLAGS="${LDFLAGS:--Wl},--strip-all"; 		arch="$(dpkg --print-architecture)"; arch="${arch##*-}"; 		case "$arch" in 			amd64|arm64) 				EXTRA_CFLAGS="${EXTRA_CFLAGS:-} -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer"; 				;; 			i386) 				;; 			*) 				EXTRA_CFLAGS="${EXTRA_CFLAGS:-} -fno-omit-frame-pointer"; 				;; 		esac; 	make -j "$nproc" 		"EXTRA_CFLAGS=${EXTRA_CFLAGS:-}" 		"LDFLAGS=${LDFLAGS:-}" 	; 	rm python; 	make -j "$nproc" 		"EXTRA_CFLAGS=${EXTRA_CFLAGS:-}" 		
"LDFLAGS=${LDFLAGS:--Wl},-rpath='\$\$ORIGIN/../lib'" 		python 	; 	make install; 		cd /; 	rm -rf /usr/src/python; 		find /usr/local -depth 		\( 			\( -type d -a \( -name test -o -name tests -o -name idle_test \) \) 			-o \( -type f -a \( -name '*.pyc' -o -name '*.pyo' -o -name 'libpython*.a' \) \) 		\) -exec rm -rf '{}' + 	; 		ldconfig; 		apt-mark auto '.*' > /dev/null; 	apt-mark manual $savedAptMark; 	find /usr/local -type f -executable -not \( -name '*tkinter*' \) -exec ldd '{}' ';' 		| awk '/=>/ { so = $(NF-1); if (index(so, "/usr/local/") == 1) { next }; gsub("^/(usr/)?", "", so); printf "*%s\n", so }' 		| sort -u 		| xargs -r dpkg-query --search 		| cut -d: -f1 		| sort -u 		| xargs -r apt-mark manual 	; 	apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false; 	rm -rf /var/lib/apt/lists/*; 		export PYTHONDONTWRITEBYTECODE=1; 	python3 --version; 	pip3 --version # buildkit
                        
# 2025-06-04 08:40:01  0.00B Set environment variable PYTHON_SHA256
ENV PYTHON_SHA256=c30bb24b7f1e9a19b11b55a546434f74e739bb4c271a3e3a80ff4380d49f7adb
                        
# 2025-06-04 08:40:01  0.00B Set environment variable PYTHON_VERSION
ENV PYTHON_VERSION=3.12.11
                        
# 2025-06-04 08:40:01  0.00B Set environment variable GPG_KEY
ENV GPG_KEY=7169605F62C751356D054A26A821E680E5FA6305
                        
# 2025-06-04 08:40:01  9.18MB Run a command and create a new image layer
RUN /bin/sh -c set -eux; 	apt-get update; 	apt-get install -y --no-install-recommends 		ca-certificates 		netbase 		tzdata 	; 	rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2025-06-04 08:40:01  0.00B Set environment variable LANG
ENV LANG=C.UTF-8
                        
# 2025-06-04 08:40:01  0.00B Set environment variable PATH
ENV PATH=/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2025-06-04 08:40:01  97.21MB Base image layer (Debian bookworm rootfs)
# debian.sh --arch 'arm64' out/ 'bookworm' '@1753056000'
                        
                    

Image information

{
    "Id": "sha256:2f7bb73bc7c67bee501d227cd5021924e353c90e8afb5ebb762b375757a270a9",
    "RepoTags": [
        "unclecode/crawl4ai:0.7.2",
        "unclecode/crawl4ai:latest",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.7.2-linuxarm64",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:latest-linuxarm64"
    ],
    "RepoDigests": [
        "unclecode/crawl4ai@sha256:ebc33dcc0292b86f0016fa87baabb8cb14fbf975722a85ca91a1081f5f36c879",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai@sha256:687a31caad3d204e03eac9d0b8308631c27b8a53d0298da260d7ccb510fa2742"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2025-07-25T10:06:31.966206815Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "appuser",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "ExposedPorts": {
            "6379/tcp": {}
        },
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "LANG=C.UTF-8",
            "GPG_KEY=7169605F62C751356D054A26A821E680E5FA6305",
            "PYTHON_VERSION=3.12.11",
            "PYTHON_SHA256=c30bb24b7f1e9a19b11b55a546434f74e739bb4c271a3e3a80ff4380d49f7adb",
            "C4AI_VERSION=0.7.0-r1",
            "PYTHONFAULTHANDLER=1",
            "PYTHONHASHSEED=random",
            "PYTHONUNBUFFERED=1",
            "PIP_NO_CACHE_DIR=1",
            "PYTHONDONTWRITEBYTECODE=1",
            "PIP_DISABLE_PIP_VERSION_CHECK=1",
            "PIP_DEFAULT_TIMEOUT=100",
            "DEBIAN_FRONTEND=noninteractive",
            "REDIS_HOST=localhost",
            "REDIS_PORT=6379",
            "PYTHON_ENV=production"
        ],
        "Cmd": [
            "supervisord",
            "-c",
            "supervisord.conf"
        ],
        "Healthcheck": {
            "Test": [
                "CMD-SHELL",
                "bash -c '    MEM=$(free -m | awk \"/^Mem:/{print \\$2}\");     if [ $MEM -lt 2048 ]; then         echo \"⚠️ Warning: Less than 2GB RAM available! Your container might need a memory boost! 🚀\";         exit 1;     fi \u0026\u0026     redis-cli ping \u003e /dev/null \u0026\u0026     curl -f http://localhost:11235/health || exit 1'"
            ],
            "Interval": 30000000000,
            "Timeout": 10000000000,
            "StartPeriod": 5000000000,
            "Retries": 3
        },
        "ArgsEscaped": true,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "/app",
        "Entrypoint": null,
        "OnBuild": null,
        "Labels": {
            "c4ai.version": "0.7.0-r1",
            "description": "🔥🕷️ Crawl4AI: Open-source LLM Friendly Web Crawler \u0026 scraper",
            "maintainer": "unclecode",
            "version": "1.0"
        }
    },
    "Architecture": "arm64",
    "Os": "linux",
    "Size": 6642920720,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/a6e59e996df0885b2e36a096f02020e8657a75459641af9da4a6b2f2ba898ac2/diff:/var/lib/docker/overlay2/af6257f3011ad8bc4c9f9735c50df2c7799be44c2eb692926ed9c7bf490be841/diff:/var/lib/docker/overlay2/ce284a7b80e2e58bec28930539427ab2ed5e57832b0263b47e8fcbaf008790f9/diff:/var/lib/docker/overlay2/fb1d6f1ac2ee226dcc5b60677e1ca65aa99f4166ee5b41d8620fd7474830f24a/diff:/var/lib/docker/overlay2/853807d86fc8b2a2d83e6a0e99aca6acc349df73bb6b730879aa230110ca2318/diff:/var/lib/docker/overlay2/0bd8b0a847a64f9f1ea591791bc4e7df3ae4598cf44086841265ccc9be108329/diff:/var/lib/docker/overlay2/8f5c6c35ad8add606fbe536e8d099f83708cb04c53d83eabc27c157d06993a86/diff:/var/lib/docker/overlay2/6b90344edc198b1ec2c7e7824ac0f9de51e3ed0a7b07a60d5184f2c5176c0280/diff:/var/lib/docker/overlay2/e32790af90846686ddd8635ad2c03635c0263f1079debd5b02c561864cc16c92/diff:/var/lib/docker/overlay2/05f0bb9089e1c07013453b10847ec59081486920e8af2168cc36a867f4c74e44/diff:/var/lib/docker/overlay2/5a41075c0f955d142f5c011dcd1a8fbd2cb30cce641a306992c70dfef92bf532/diff:/var/lib/docker/overlay2/d73ac2c723e5ebdd66c4dbc9822796652c6e96812609e3d70481d9f9200973cf/diff:/var/lib/docker/overlay2/d7834f09e775d3b80d5315d93363ca57cc5d0540813b150702a6881ec8879b5d/diff:/var/lib/docker/overlay2/2841e990221c6100cdb64706866bd7b86283b95ab8b17e73dcde43121f10da48/diff:/var/lib/docker/overlay2/67a9c2f7d5a5373b2a41ecc8f8662bb6239f7d3955b618891cf13fa7d512a965/diff:/var/lib/docker/overlay2/50f51875c83dbfeed8eff3f8f2397f8f4f1d36a9c85bbf2a5d8f132d9f9bb195/diff:/var/lib/docker/overlay2/f34432e0d8059c1adee7d8a818bee808c7858558ff8869535063cb1312f65057/diff:/var/lib/docker/overlay2/d9065af1aab175518ac3834f078a97169a69e5ca9b8ead0c0c6dcbf544b42b60/diff:/var/lib/docker/overlay2/4ddeefc53f980851173ab2832f43b1702b346179732145fc81a588b6df312c97/diff:/var/lib/docker/overlay2/4da8e7dddb6c1456437e7659a838d069027f9f15d222674ac36facbd02c08834/diff:/var/lib/docker/overlay2/570a55847ada29626b1ccf4fddff65a6c1e204c3ec3e0ad4204846e871b8a32c/diff:/var/lib/docker/overlay2/07667363b2bbafedf1c3cdda639c6a52c39a9ef056e34d3c657b5c3259384703/diff:/var/lib/docker/overlay2/79f1bbf50c66ef5fa56cce5fa2a3deec2e95d418e2856bc216a13d71bba459e4/diff:/var/lib/docker/overlay2/fa23edfb2d7e63b529a8644db7ee9e153b3e0dc870d18aa3caca914699eec84b/diff:/var/lib/docker/overlay2/dc8a6be4278f3974866cc047ac7337b47f473072407be1573480cf2684b38ee3/diff:/var/lib/docker/overlay2/70adad60265542deccf5069e34bf28893823246f5cab2c931fe4305345b14e73/diff:/var/lib/docker/overlay2/87389907bc1ad20728e04557fef3c78551de3059bc92b2785b10bb1bdb00777c/diff",
            "MergedDir": "/var/lib/docker/overlay2/7b84940b6dacbed2015e8785f0256d586e3ecd7813d1132848d020fc56423fc3/merged",
            "UpperDir": "/var/lib/docker/overlay2/7b84940b6dacbed2015e8785f0256d586e3ecd7813d1132848d020fc56423fc3/diff",
            "WorkDir": "/var/lib/docker/overlay2/7b84940b6dacbed2015e8785f0256d586e3ecd7813d1132848d020fc56423fc3/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:dd97e58b4e812b247f3cd261fa7eac247b7c5896b8e34d9474762678cae7774d",
            "sha256:e26286284f6ea22d800d847e7f0ccee9e39591de19270b17eec80289a6270c32",
            "sha256:d7451490396181beac54ea74fc9bb0f2709e46f29f6a853c8c5512e94a5f51ff",
            "sha256:05ed7b2bcaa5ff09e57a7906942a046cbbcb668ac35dcc6151bf0b53f4b4ed03",
            "sha256:af7a20bcfecb9c9d1a9d0979686043ddccd67c041b9eac36d1bb73a7a3166d7b",
            "sha256:e59832f9622369bcb7c911453e60508b8398ed02fa198f7f82c7ace2aecf4558",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:89dc5d58461384e0d4e02e315033c432d1063334d705698fa4fdd1f601dcd91f",
            "sha256:68d450863a86e1d1f0cd94dd874849c257f61552ab68593d31a68c90fcd8df7b",
            "sha256:3ed0fc957a29d066f1b3449f48c03da6929feac1808fa361a860c6b58ab241d2",
            "sha256:f1e1c31fb543239146babe4cd54cd70cd7322d90f9692e1664384bdb836ec607",
            "sha256:cedee098900a0964f6391ff53cda686c12e6b903b2fa163f4fdb24bbc0040c7a",
            "sha256:0e0f08491185df23cef2ecde1553ec2fa520c46c9d795344647f36eccd11cc6a",
            "sha256:fa4b3680304cdbf3fb1bd5adee56296a890cfd48777e3a7acf1385f5562a0b09",
            "sha256:d4c992a3467036421b755e088a87618d0e7735b0080c28f5d6b9f6ae6893fdc3",
            "sha256:ab3a9f9b4af9edcd63d3babe75f734012a54973601da39186c6f61333a25e6eb",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:8f8bec1c851d1e11e1beb6c56ffcb5cb26b3944cca5181f39e190f937a5a7b21",
            "sha256:43ddd934903ee282a957634814f9339e6aa24872661b8be3f2060abba9b1089a",
            "sha256:5d240b202638f1b76c991a5a56c44aabc447f53ce942d9fdb37deda3ff73b4ad",
            "sha256:fabb99ddfa09d4c3b1ff903768090a2b0b8b4efc7a89a0e28a6ca2d4e15c409b",
            "sha256:b141016c7405a1e8ae0f8edec5ca5410c0fb1046ffe8435c0d1ab80b56095fd0",
            "sha256:b10dbbc5d44e1587696d29352155658d9dbb1c3ad04b04b4e4418b87017a462d",
            "sha256:c447197fae05fe7f342c5cc5ac5b0844597089f76c01171a87da4266a0f56021",
            "sha256:2ff9f795d5d4caa3511c5556cc477fd583f3cf671d1b93f507ff1940d02be018",
            "sha256:662c88747d201f9d1f830bf7da6d9d92083df2991823e5337b3c7d402e2e4bc4",
            "sha256:1829a3fdb2b3ecd26f2cd00b5ff149af29d18fae2a07289e96827e159edf1649"
        ]
    },
    "Metadata": {
        "LastTagTime": "2025-08-01T11:14:37.47991291+08:00"
    }
}
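The `Size` field in the inspect output is in bytes; the 6.64GB figure shown in the header is its decimal-GB conversion. As a quick check:

```shell
# Convert the inspect output's Size (bytes) to the page's decimal-GB figure.
awk -v bytes=6642920720 'BEGIN { printf "%.2fGB\n", bytes / 1e9 }'
# → 6.64GB
```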

More versions

docker.io/unclecode/crawl4ai:basic-amd64   linux/amd64  docker.io  2.32GB  2024-12-30 11:12  210 views
docker.io/unclecode/crawl4ai:all-amd64     linux/amd64  docker.io  8.53GB  2025-01-08 00:15  578 views
docker.io/unclecode/crawl4ai:0.6.0-r2      linux/amd64  docker.io  4.16GB  2025-05-19 15:26  162 views
docker.io/unclecode/crawl4ai:0.6.0-r2      linux/arm64  docker.io  3.98GB  2025-06-12 21:13  71 views
docker.io/unclecode/crawl4ai:latest        linux/amd64  docker.io  4.16GB  2025-07-08 15:51  94 views
docker.io/unclecode/crawl4ai:latest        linux/arm64  docker.io  6.64GB  2025-08-01 11:12  44 views
docker.io/unclecode/crawl4ai:0.7.2         linux/arm64  docker.io  6.64GB  2025-08-01 11:14  42 views
docker.io/unclecode/crawl4ai:0.7.2         linux/amd64  docker.io  4.17GB  2025-08-06 15:57  35 views