docker.io/unclecode/crawl4ai:0.8 linux/amd64

docker.io/unclecode/crawl4ai:0.8 - China mirror download source

This is the Docker container image docker.io/unclecode/crawl4ai. According to its image labels, Crawl4AI is an open-source, LLM-friendly web crawler and scraper; see the maintainer's documentation and README on Docker Hub for full details.

Source image docker.io/unclecode/crawl4ai:0.8
China mirror swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8
Image ID sha256:477d748c74a55104bc2d0839a120b4890924ec092aeecfd507e325f498e99d34
Image tag 0.8
Size 3.80GB
Registry docker.io
Project info Docker Hub page
CMD supervisord -c supervisord.conf
Entrypoint
Working directory /app
OS/platform linux/amd64
Image created 2026-01-16T10:47:25.88042212Z
Synced 2026-01-21 14:31
Updated 2026-01-22 03:08
Exposed ports
6379/tcp
Environment variables
PATH=/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin LANG=C.UTF-8 GPG_KEY=7169605F62C751356D054A26A821E680E5FA6305 PYTHON_VERSION=3.12.12 PYTHON_SHA256=fb85a13414b028c49ba18bbd523c2d055a30b56b18b92ce454ea2c51edc656c4 C4AI_VERSION=0.8.0 PYTHONFAULTHANDLER=1 PYTHONHASHSEED=random PYTHONUNBUFFERED=1 PIP_NO_CACHE_DIR=1 PYTHONDONTWRITEBYTECODE=1 PIP_DISABLE_PIP_VERSION_CHECK=1 PIP_DEFAULT_TIMEOUT=100 DEBIAN_FRONTEND=noninteractive REDIS_HOST=localhost REDIS_PORT=6379 PYTHON_ENV=production
Image labels
c4ai.version=0.8.0 · description=🔥🕷️ Crawl4AI: Open-source LLM Friendly Web Crawler & scraper · maintainer=unclecode · version=1.0

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8  docker.io/unclecode/crawl4ai:0.8
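The pull-then-tag pair above works because the mirror follows a simple naming scheme: the mirror host and `ddn-k8s` namespace are prefixed to the full upstream reference. A minimal sketch of deriving the mirror name in shell (the actual `docker pull`/`docker tag` lines are commented out so the snippet runs without Docker):

```shell
#!/bin/sh
# The mirror name is just the full upstream reference prefixed with the
# mirror host and namespace, so retagging is easy to script.
MIRROR_PREFIX="swr.cn-north-4.myhuaweicloud.com/ddn-k8s"
SRC="docker.io/unclecode/crawl4ai:0.8"
MIRROR="${MIRROR_PREFIX}/${SRC}"
echo "$MIRROR"
# Uncomment to actually pull and retag (requires Docker on PATH):
# docker pull "$MIRROR" && docker tag "$MIRROR" "$SRC"
```

The same loop-free pattern extends to any image on this mirror: prepend the prefix, pull, retag back to the upstream name.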

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8  docker.io/unclecode/crawl4ai:0.8

Quick shell replacement command

sed -i 's#unclecode/crawl4ai:0.8#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8#' deployment.yaml
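One caveat with the substitution above: the pattern is not anchored on `docker.io/`, so a manifest that already spells `docker.io/unclecode/crawl4ai:0.8` would end up with a doubled registry prefix. A quick dry run on a scratch file (the path and manifest contents are illustrative) shows the intended rewrite:

```shell
#!/bin/sh
# Try the substitution on a throwaway manifest before editing real ones.
tmp=$(mktemp)
printf '    image: unclecode/crawl4ai:0.8\n' > "$tmp"
sed -i 's#unclecode/crawl4ai:0.8#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8#' "$tmp"
cat "$tmp"   # the image line now points at the mirror
```

Adjust the search pattern to match exactly how your manifests spell the image reference.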

Ansible quick distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8  docker.io/unclecode/crawl4ai:0.8'

Ansible quick distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8  docker.io/unclecode/crawl4ai:0.8'

Image build history


# 2026-01-16 18:47:25  0.00B Set the default command to run
CMD ["supervisord" "-c" "supervisord.conf"]
                        
# 2026-01-16 18:47:25  0.00B Set environment variable PYTHON_ENV
ENV PYTHON_ENV=production
                        
# 2026-01-16 18:47:25  0.00B Set the user the container runs as
USER appuser
                        
# 2026-01-16 18:47:25  0.00B Declare the port the container listens on
EXPOSE [6379/tcp]
                        
# 2026-01-16 18:47:25  0.00B Define the container health check command
HEALTHCHECK &{["CMD-SHELL" "bash -c '    MEM=$(free -m | awk \"/^Mem:/{print \\$2}\");     if [ $MEM -lt 2048 ]; then         echo \"⚠️ Warning: Less than 2GB RAM available! Your container might need a memory boost! 🚀\";         exit 1;     fi &&     redis-cli ping > /dev/null &&     curl -f http://localhost:11235/health || exit 1'"] "30s" "10s" "5s" "0s" '\x03'}
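The serialized HEALTHCHECK entry above is hard to read. In plain terms it fails the container when total RAM is under 2048 MiB, when Redis does not answer PING, or when the crawl4ai API on port 11235 is unreachable (interval 30s, timeout 10s, start period 5s, 3 retries). A readable sketch of the same logic — `check_mem` and `health_check` are our own names, and `free`, `redis-cli`, and `curl` are assumed present as they are inside the container:

```shell
#!/bin/sh
# Fail (non-zero exit) when total memory is below the 2048 MiB threshold.
check_mem() {
    mem_mb=$1              # total MiB, e.g. $(free -m | awk '/^Mem:/{print $2}')
    [ "$mem_mb" -ge 2048 ]
}

# Full health probe, mirroring the image's HEALTHCHECK command:
# memory threshold, then Redis PING, then the API health endpoint.
health_check() {
    check_mem "$(free -m | awk '/^Mem:/{print $2}')" &&
        redis-cli ping > /dev/null &&
        curl -f http://localhost:11235/health
}
```
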
                        
# 2026-01-16 18:47:25  0.00B Run a command and create a new image layer
RUN |9 C4AI_VER=0.8.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c mkdir -p /var/lib/redis /var/log/redis && chown -R appuser:appuser /var/lib/redis /var/log/redis # buildkit
                        
# 2026-01-16 18:47:25  1.30MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.8.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c chown -R appuser:appuser ${APP_HOME} # buildkit
                        
# 2026-01-16 18:47:25  115.26KB Copy new files or directories into the container
COPY deploy/docker/static /app/static # buildkit
                        
# 2026-01-16 18:47:25  1.19MB Copy new files or directories into the container
COPY deploy/docker/* /app/ # buildkit
                        
# 2026-01-16 18:47:25  0.00B Run a command and create a new image layer
RUN |9 C4AI_VER=0.8.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c mkdir -p /home/appuser/.cache     && chown -R appuser:appuser /home/appuser/.cache # buildkit
                        
# 2026-01-16 18:47:24  286.06KB Run a command and create a new image layer
RUN |9 C4AI_VER=0.8.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c crawl4ai-doctor # buildkit
                        
# 2026-01-16 18:47:20  373.24MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.8.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c mkdir -p /home/appuser/.cache/ms-playwright     && cp -r /root/.cache/ms-playwright/chromium-* /home/appuser/.cache/ms-playwright/     && chown -R appuser:appuser /home/appuser/.cache/ms-playwright # buildkit
                        
# 2026-01-16 18:47:19  930.75MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.8.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c playwright install --with-deps # buildkit
                        
# 2026-01-16 18:46:49  877.28MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.8.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c crawl4ai-setup # buildkit
                        
# 2026-01-16 18:46:24  14.04MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.8.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c pip install --no-cache-dir --upgrade pip &&     /tmp/install.sh &&     python -c "import crawl4ai; print('✅ crawl4ai is ready to rock!')" &&     python -c "from playwright.sync_api import sync_playwright; print('✅ Playwright is feeling dramatic!')" # buildkit
                        
# 2026-01-16 18:46:11  658.33MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.8.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c if [ "$INSTALL_TYPE" = "all" ] ; then         pip install "/tmp/project/[all]" &&         python -m crawl4ai.model_loader ;     elif [ "$INSTALL_TYPE" = "torch" ] ; then         pip install "/tmp/project/[torch]" ;     elif [ "$INSTALL_TYPE" = "transformer" ] ; then         pip install "/tmp/project/[transformer]" &&         python -m crawl4ai.model_loader ;     else         pip install "/tmp/project" ;     fi # buildkit
                        
# 2026-01-16 18:45:37  0.00B Run a command and create a new image layer
RUN |9 C4AI_VER=0.8.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c if [ "$INSTALL_TYPE" = "all" ] ; then         pip install --no-cache-dir             torch             torchvision             torchaudio             scikit-learn             nltk             transformers             tokenizers &&         python -m nltk.downloader punkt stopwords ;     fi # buildkit
                        
# 2026-01-16 18:45:37  117.01MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.8.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c pip install --no-cache-dir -r requirements.txt # buildkit
                        
# 2026-01-16 18:45:26  303.00B Copy new files or directories into the container
COPY deploy/docker/requirements.txt . # buildkit
                        
# 2026-01-16 18:45:26  1.31KB Copy new files or directories into the container
COPY deploy/docker/supervisord.conf . # buildkit
                        
# 2026-01-16 18:45:26  34.74MB Copy new files or directories into the container
COPY . /tmp/project/ # buildkit
                        
# 2026-01-16 18:45:26  435.00B Run a command and create a new image layer
RUN |9 C4AI_VER=0.8.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c echo '#!/bin/bash\nif [ "$USE_LOCAL" = "true" ]; then\n    echo "📦 Installing from local source..."\n    pip install --no-cache-dir /tmp/project/\nelse\n    echo "🌐 Installing from GitHub..."\n    for i in {1..3}; do \n        git clone --branch ${GITHUB_BRANCH} ${GITHUB_REPO} /tmp/crawl4ai && break || \n        { echo "Attempt $i/3 failed! Taking a short break... ☕"; sleep 5; }; \n    done\n    pip install --no-cache-dir /tmp/crawl4ai\nfi' > /tmp/install.sh && chmod +x /tmp/install.sh # buildkit
                        
# 2026-01-16 18:45:26  0.00B Set the working directory to /app
WORKDIR /app
                        
# 2026-01-16 18:45:26  0.00B Run a command and create a new image layer
RUN |9 C4AI_VER=0.8.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c mkdir -p /home/appuser && chown -R appuser:appuser /home/appuser # buildkit
                        
# 2026-01-16 18:45:26  4.50KB Run a command and create a new image layer
RUN |9 C4AI_VER=0.8.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c groupadd -r appuser && useradd --no-log-init -r -g appuser appuser # buildkit
                        
# 2026-01-16 18:45:26  143.33MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.8.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c if [ "$TARGETARCH" = "arm64" ]; then     echo "🦾 Installing ARM-specific optimizations";     apt-get update && apt-get install -y --no-install-recommends     libopenblas-dev     && apt-get clean     && rm -rf /var/lib/apt/lists/*; elif [ "$TARGETARCH" = "amd64" ]; then     echo "🖥️ Installing AMD64-specific optimizations";     apt-get update && apt-get install -y --no-install-recommends     libomp-dev     && apt-get clean     && rm -rf /var/lib/apt/lists/*; else     echo "Skipping platform-specific optimizations (unsupported platform)"; fi # buildkit
                        
# 2026-01-16 18:45:21  0.00B Run a command and create a new image layer
RUN |9 C4AI_VER=0.8.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c if [ "$ENABLE_GPU" = "true" ] && [ "$TARGETARCH" = "amd64" ] ; then     apt-get update && apt-get install -y --no-install-recommends     nvidia-cuda-toolkit     && apt-get clean     && rm -rf /var/lib/apt/lists/* ; else     echo "Skipping NVIDIA CUDA Toolkit installation (unsupported platform or GPU disabled)"; fi # buildkit
                        
# 2026-01-16 18:45:21  0.00B Run a command and create a new image layer
RUN |9 C4AI_VER=0.8.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c apt-get update && apt-get dist-upgrade -y     && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2026-01-16 18:45:19  30.98MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.8.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c apt-get update && apt-get install -y --no-install-recommends     libglib2.0-0     libnss3     libnspr4     libatk1.0-0     libatk-bridge2.0-0     libcups2     libdrm2     libdbus-1-3     libxcb1     libxkbcommon0     libx11-6     libxcomposite1     libxdamage1     libxext6     libxfixes3     libxrandr2     libgbm1     libpango-1.0-0     libcairo2     libasound2     libatspi2.0-0     && apt-get clean     && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2026-01-16 18:45:12  495.87MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.8.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c apt-get update && apt-get install -y --no-install-recommends     build-essential     curl     wget     gnupg     git     cmake     pkg-config     python3-dev     libjpeg-dev     redis-server     supervisor     && apt-get clean     && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2026-01-16 18:45:12  0.00B Add a metadata label
LABEL version=1.0
                        
# 2026-01-16 18:45:12  0.00B Add a metadata label
LABEL description=🔥🕷️ Crawl4AI: Open-source LLM Friendly Web Crawler & scraper
                        
# 2026-01-16 18:45:12  0.00B Add a metadata label
LABEL maintainer=unclecode
                        
# 2026-01-16 18:45:12  0.00B Define a build argument
ARG TARGETARCH=amd64
                        
# 2026-01-16 18:45:12  0.00B Define a build argument
ARG ENABLE_GPU=false
                        
# 2026-01-16 18:45:12  0.00B Define a build argument
ARG INSTALL_TYPE=default
                        
# 2026-01-16 18:45:12  0.00B Define a build argument
ARG PYTHON_VERSION=3.12
                        
# 2026-01-16 18:45:12  0.00B Set environment variables PYTHONFAULTHANDLER PYTHONHASHSEED PYTHONUNBUFFERED PIP_NO_CACHE_DIR PYTHONDONTWRITEBYTECODE PIP_DISABLE_PIP_VERSION_CHECK PIP_DEFAULT_TIMEOUT DEBIAN_FRONTEND REDIS_HOST REDIS_PORT
ENV PYTHONFAULTHANDLER=1 PYTHONHASHSEED=random PYTHONUNBUFFERED=1 PIP_NO_CACHE_DIR=1 PYTHONDONTWRITEBYTECODE=1 PIP_DISABLE_PIP_VERSION_CHECK=1 PIP_DEFAULT_TIMEOUT=100 DEBIAN_FRONTEND=noninteractive REDIS_HOST=localhost REDIS_PORT=6379
                        
# 2026-01-16 18:45:12  0.00B Define a build argument
ARG USE_LOCAL=true
                        
# 2026-01-16 18:45:12  0.00B Define a build argument
ARG GITHUB_BRANCH=main
                        
# 2026-01-16 18:45:12  0.00B Define a build argument
ARG GITHUB_REPO=https://github.com/unclecode/crawl4ai.git
                        
# 2026-01-16 18:45:12  0.00B Define a build argument
ARG APP_HOME=/app
                        
# 2026-01-16 18:45:12  0.00B Add a metadata label
LABEL c4ai.version=0.8.0
                        
# 2026-01-16 18:45:12  0.00B Set environment variable C4AI_VERSION
ENV C4AI_VERSION=0.8.0
                        
# 2026-01-16 18:45:12  0.00B Define a build argument
ARG C4AI_VER=0.8.0
                        
# 2026-01-13 11:15:56  0.00B Set the default command to run
CMD ["python3"]
                        
# 2026-01-13 11:15:56  36.00B Run a command and create a new image layer
RUN /bin/sh -c set -eux; 	for src in idle3 pip3 pydoc3 python3 python3-config; do 		dst="$(echo "$src" | tr -d 3)"; 		[ -s "/usr/local/bin/$src" ]; 		[ ! -e "/usr/local/bin/$dst" ]; 		ln -svT "$src" "/usr/local/bin/$dst"; 	done # buildkit
                        
# 2026-01-13 11:15:56  40.22MB Run a command and create a new image layer
RUN /bin/sh -c set -eux; 		savedAptMark="$(apt-mark showmanual)"; 	apt-get update; 	apt-get install -y --no-install-recommends 		dpkg-dev 		gcc 		gnupg 		libbluetooth-dev 		libbz2-dev 		libc6-dev 		libdb-dev 		libffi-dev 		libgdbm-dev 		liblzma-dev 		libncursesw5-dev 		libreadline-dev 		libsqlite3-dev 		libssl-dev 		make 		tk-dev 		uuid-dev 		wget 		xz-utils 		zlib1g-dev 	; 		wget -O python.tar.xz "https://www.python.org/ftp/python/${PYTHON_VERSION%%[a-z]*}/Python-$PYTHON_VERSION.tar.xz"; 	echo "$PYTHON_SHA256 *python.tar.xz" | sha256sum -c -; 	wget -O python.tar.xz.asc "https://www.python.org/ftp/python/${PYTHON_VERSION%%[a-z]*}/Python-$PYTHON_VERSION.tar.xz.asc"; 	GNUPGHOME="$(mktemp -d)"; export GNUPGHOME; 	gpg --batch --keyserver hkps://keys.openpgp.org --recv-keys "$GPG_KEY"; 	gpg --batch --verify python.tar.xz.asc python.tar.xz; 	gpgconf --kill all; 	rm -rf "$GNUPGHOME" python.tar.xz.asc; 	mkdir -p /usr/src/python; 	tar --extract --directory /usr/src/python --strip-components=1 --file python.tar.xz; 	rm python.tar.xz; 		cd /usr/src/python; 	gnuArch="$(dpkg-architecture --query DEB_BUILD_GNU_TYPE)"; 	./configure 		--build="$gnuArch" 		--enable-loadable-sqlite-extensions 		--enable-optimizations 		--enable-option-checking=fatal 		--enable-shared 		$(test "${gnuArch%%-*}" != 'riscv64' && echo '--with-lto') 		--with-ensurepip 	; 	nproc="$(nproc)"; 	EXTRA_CFLAGS="$(dpkg-buildflags --get CFLAGS)"; 	LDFLAGS="$(dpkg-buildflags --get LDFLAGS)"; 	LDFLAGS="${LDFLAGS:--Wl},--strip-all"; 		arch="$(dpkg --print-architecture)"; arch="${arch##*-}"; 		case "$arch" in 			amd64|arm64) 				EXTRA_CFLAGS="${EXTRA_CFLAGS:-} -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer"; 				;; 			i386) 				;; 			*) 				EXTRA_CFLAGS="${EXTRA_CFLAGS:-} -fno-omit-frame-pointer"; 				;; 		esac; 	make -j "$nproc" 		"EXTRA_CFLAGS=${EXTRA_CFLAGS:-}" 		"LDFLAGS=${LDFLAGS:-}" 	; 	rm python; 	make -j "$nproc" 		"EXTRA_CFLAGS=${EXTRA_CFLAGS:-}" 		"LDFLAGS=${LDFLAGS:--Wl},-rpath='\$\$ORIGIN/../lib'" 		
python 	; 	make install; 		cd /; 	rm -rf /usr/src/python; 		find /usr/local -depth 		\( 			\( -type d -a \( -name test -o -name tests -o -name idle_test \) \) 			-o \( -type f -a \( -name '*.pyc' -o -name '*.pyo' -o -name 'libpython*.a' \) \) 		\) -exec rm -rf '{}' + 	; 		ldconfig; 		apt-mark auto '.*' > /dev/null; 	apt-mark manual $savedAptMark; 	find /usr/local -type f -executable -not \( -name '*tkinter*' \) -exec ldd '{}' ';' 		| awk '/=>/ { so = $(NF-1); if (index(so, "/usr/local/") == 1) { next }; gsub("^/(usr/)?", "", so); printf "*%s\n", so }' 		| sort -u 		| xargs -rt dpkg-query --search 		| awk 'sub(":$", "", $1) { print $1 }' 		| sort -u 		| xargs -r apt-mark manual 	; 	apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false; 	rm -rf /var/lib/apt/lists/*; 		export PYTHONDONTWRITEBYTECODE=1; 	python3 --version; 	pip3 --version # buildkit
                        
# 2026-01-13 11:06:39  0.00B Set environment variable PYTHON_SHA256
ENV PYTHON_SHA256=fb85a13414b028c49ba18bbd523c2d055a30b56b18b92ce454ea2c51edc656c4
                        
# 2026-01-13 11:06:39  0.00B Set environment variable PYTHON_VERSION
ENV PYTHON_VERSION=3.12.12
                        
# 2026-01-13 11:06:39  0.00B Set environment variable GPG_KEY
ENV GPG_KEY=7169605F62C751356D054A26A821E680E5FA6305
                        
# 2026-01-13 11:06:39  9.26MB Run a command and create a new image layer
RUN /bin/sh -c set -eux; 	apt-get update; 	apt-get install -y --no-install-recommends 		ca-certificates 		netbase 		tzdata 	; 	rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2026-01-13 11:06:39  0.00B Set environment variable LANG
ENV LANG=C.UTF-8
                        
# 2026-01-13 11:06:39  0.00B Set environment variable PATH
ENV PATH=/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2026-01-12 08:00:00  74.81MB 
# debian.sh --arch 'amd64' out/ 'bookworm' '@1768176000'
                        
                    

Image information

{
    "Id": "sha256:477d748c74a55104bc2d0839a120b4890924ec092aeecfd507e325f498e99d34",
    "RepoTags": [
        "unclecode/crawl4ai:0.8",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.8"
    ],
    "RepoDigests": [
        "unclecode/crawl4ai@sha256:4d8b065bf185962733cb5f9701f4122d03383fa1ab6b5f6a9873f04fa0416a84",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai@sha256:61f0dafc7b0f6df2c78ee13b62591183e20178beef767599ccd4b7bd65058d6a"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2026-01-16T10:47:25.88042212Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "appuser",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "ExposedPorts": {
            "6379/tcp": {}
        },
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "LANG=C.UTF-8",
            "GPG_KEY=7169605F62C751356D054A26A821E680E5FA6305",
            "PYTHON_VERSION=3.12.12",
            "PYTHON_SHA256=fb85a13414b028c49ba18bbd523c2d055a30b56b18b92ce454ea2c51edc656c4",
            "C4AI_VERSION=0.8.0",
            "PYTHONFAULTHANDLER=1",
            "PYTHONHASHSEED=random",
            "PYTHONUNBUFFERED=1",
            "PIP_NO_CACHE_DIR=1",
            "PYTHONDONTWRITEBYTECODE=1",
            "PIP_DISABLE_PIP_VERSION_CHECK=1",
            "PIP_DEFAULT_TIMEOUT=100",
            "DEBIAN_FRONTEND=noninteractive",
            "REDIS_HOST=localhost",
            "REDIS_PORT=6379",
            "PYTHON_ENV=production"
        ],
        "Cmd": [
            "supervisord",
            "-c",
            "supervisord.conf"
        ],
        "Healthcheck": {
            "Test": [
                "CMD-SHELL",
                "bash -c '    MEM=$(free -m | awk \"/^Mem:/{print \\$2}\");     if [ $MEM -lt 2048 ]; then         echo \"⚠️ Warning: Less than 2GB RAM available! Your container might need a memory boost! 🚀\";         exit 1;     fi \u0026\u0026     redis-cli ping \u003e /dev/null \u0026\u0026     curl -f http://localhost:11235/health || exit 1'"
            ],
            "Interval": 30000000000,
            "Timeout": 10000000000,
            "StartPeriod": 5000000000,
            "Retries": 3
        },
        "ArgsEscaped": true,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "/app",
        "Entrypoint": null,
        "OnBuild": null,
        "Labels": {
            "c4ai.version": "0.8.0",
            "description": "🔥🕷️ Crawl4AI: Open-source LLM Friendly Web Crawler \u0026 scraper",
            "maintainer": "unclecode",
            "version": "1.0"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 3802748149,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/c8f93280cdfb45b6fd27c37196122060d7c22e8f069f187a4899fb957872b42d/diff:/var/lib/docker/overlay2/dab24d5fd881ac2f9b08a5cd93f94474c9d3c395145ad9e28db37c26633b11e4/diff:/var/lib/docker/overlay2/6ddcd28f3508e34d41bce774f6af0723c35681cdecba0e885d895ec94831788f/diff:/var/lib/docker/overlay2/9efcea1f5c6cae2c142f4b4355be6f838ac4a21fb70e05e84850f1315ee0efb1/diff:/var/lib/docker/overlay2/2e8bcce2188135d9e9131995c3194febd278cbdd940a33d11dad2aaf1fd44382/diff:/var/lib/docker/overlay2/2f7b37f2866cfefceb59d0310b6da599d7e48acd947d38b194cb514a3647e8db/diff:/var/lib/docker/overlay2/efc0533da5f64f7383737dbc8e95005ee876f65a7825bbc918e1663dcd94c2e5/diff:/var/lib/docker/overlay2/c14cd6c626dbe88d98e9821316b006e95f4f3d9cbcf7d265be94f4dd4f7efe9e/diff:/var/lib/docker/overlay2/402de7f8f2fefeadf928627c5c66c699d3d7bd68be435904fb32c84952044e38/diff:/var/lib/docker/overlay2/a303b6d547482ed763d06b3e983e4fd8436bd90626db045e55519dbebd8eb760/diff:/var/lib/docker/overlay2/645732a40fbaed09cf83db232ecca1abf7057ea8aff4c4e335bcd53c5f252552/diff:/var/lib/docker/overlay2/733f4449b88752f2cf607e260d92e141ee52f62ecd6a17784d3f39c8de4f6bf1/diff:/var/lib/docker/overlay2/49eedc0519928bf3ae43b05c92f5d66f5c86425674e6974991c0593e1d63cdba/diff:/var/lib/docker/overlay2/782c0193e2bc34ad921c0639e2b393375541ebb84bdb5ef95669db606c6c4c2b/diff:/var/lib/docker/overlay2/6501e01e9bb8192e1e612b8572e7a2dddcd5009699f63f622a083682120593a5/diff:/var/lib/docker/overlay2/fcb7b662df94e536c7fd49c14554179f60a8a4426e1910d1efe8ebe4055d2647/diff:/var/lib/docker/overlay2/decb637d6973a8572f2192a4976e92f97cd008481cbdd3fdc0443f35c59a4ab2/diff:/var/lib/docker/overlay2/92a110e0e760aa99bd1ef23ba74097ffee0117e7b4a1e13413781ae40cf7604a/diff:/var/lib/docker/overlay2/234b9476559cace2ac24f966c157ba52c312e2424dc1f81fd55b7d9fb5f87ab5/diff:/var/lib/docker/overlay2/c96cb3ee63cd7b60e61b6f22871e18a0785c6cc6d8e761ea52660341e5b8a264/diff:/var/lib/docker/overlay2/3bf13b3dd47eae1535f191094ce344f06a0aa1bea50857c21d
423242074b693d/diff:/var/lib/docker/overlay2/cf8ef35af37273bae29f8a990c5af2d956871d874f51cbfbb21095055704ffd8/diff:/var/lib/docker/overlay2/53e6e9a3a510f3519f59ac802e6f7a1ec55ba7a2b5029223bb762434e8e8b2fe/diff:/var/lib/docker/overlay2/dd30495db1b8c8570adc90d65d2f6e9462e0e781dc1c3469f96e9f590ffb508c/diff:/var/lib/docker/overlay2/be7e28fa9debb852b284e0bfe2aa5d34496a8a10427d405bbd1f55db6b50c0a4/diff:/var/lib/docker/overlay2/76d0e307d1c74ea25c553a19603544512cb0563e7dbcfbd89fedd9e51092c3f4/diff:/var/lib/docker/overlay2/fa202adf3b4a131d5ac1e150a87ab22c8c064a6b18586fb872c0aa0640402921/diff:/var/lib/docker/overlay2/f43eec0733d59eca13ec1ffe2759f6d875f0b5fbf6b092449572f7b92db425cb/diff",
            "MergedDir": "/var/lib/docker/overlay2/b7b6077e1385b51d86590ae55941c0a3e70ff8783ec8dfda6c414b96919720f2/merged",
            "UpperDir": "/var/lib/docker/overlay2/b7b6077e1385b51d86590ae55941c0a3e70ff8783ec8dfda6c414b96919720f2/diff",
            "WorkDir": "/var/lib/docker/overlay2/b7b6077e1385b51d86590ae55941c0a3e70ff8783ec8dfda6c414b96919720f2/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:e0e6002570470d87b99366522e2deadfd07fd6abb0c481198c1e336f9117e5a6",
            "sha256:975333dc3dfdbaa41b8ad42335280e24444abe97dc4a9ed96d79b8ffa392953c",
            "sha256:d89377a197495541b09fbff7720431c392b01c1e4f6f5b1243198cb9aede1e3d",
            "sha256:5a8d961c5be8711baa60b0f598338d4be20e37fea282975f8b2f7c2265a56d78",
            "sha256:61f9a4cc385ff570bc3e3c828e0717a669691d89f63dafe6308e867fee261548",
            "sha256:9cb1e9ab6eda20a3368a742ffeed068773304ca88ee020e701ce4a0be4944161",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:965dc8c089f935cdca250b0b0daccdb1a0c3cad8007d75b91e87be3d8d3c4170",
            "sha256:e161eba3d4a6643b3d07be860a7281d4c709faa0eb5abcb0aa8f041822c51668",
            "sha256:c20be64e6d713fec5662e6d68d0e0971d3f690498a597b97d910dd804a52d959",
            "sha256:4558ed97a7a074f37bf6a097dddaa8a1f56c1c5b870b7d81c289c9bf031b85ed",
            "sha256:1e2f80abc4ca7aedbbf359254c7ccb02e3fc403ad9122f7d613464a41534c244",
            "sha256:80de0be06e2be091cca81bb67e0e6874695592e13b1286ac4c0de6f08b6f3c8e",
            "sha256:f57a739b6c5fca20fb57dbf1bff5c20a0f5fab52bfe80e5d932065aa71547dad",
            "sha256:b5d1ef7e024c9daf5a69158932515bbf23142e2adb34c7f7bb0cd1a76fa73d0e",
            "sha256:5962f8f80ee6f23283c26b3e8e474ff966915927f1935344da5a1e4d967f5741",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:b8fc745711c105d4cbf30b245a711c33e549fbbf46ad53971fad1030d7360373",
            "sha256:ab5225dabfe86b0a38d538ec14824a465f480a4d03362f0b749396e3c0c79f94",
            "sha256:1ea78e4615a04b9ac2be21b4c8557f2a37489757fe3b2613db778d3e304c2d9d",
            "sha256:ae523f387354ac1972dd9f2b1a7f91c43ffbcf2b4fabd82b50bc72ac24ad59b6",
            "sha256:f57ab7dee9dfe1b4a249f2f6ea3c54c2bdec28c35faf15c19b889367e41a6fc5",
            "sha256:26715a4743698a083c6e48ded03da66e49ba0b2f78dedfc7d6bc2c8f3543325a",
            "sha256:f2f08e7a1903d70ad9becbc1d6963ddfd0f9dacbc048bd0460a87bbc6ae34210",
            "sha256:d4b64bd312d17a39d9ecaefc099026b1b17ef78d723ce60157d2eb6c8fd292a8",
            "sha256:b61394f0d09acccc1b6e1298db399731ae0f686cce033dcead985fe64ff9fc57",
            "sha256:4cee1f33274816e9c541b9edcf9d42260cf85403c9b7d529125a2dfca206303e",
            "sha256:92e2aad49c77926d1b4e377225f4b6af60841690c5e5f76fcc8d6e7dbd75d23a"
        ]
    },
    "Metadata": {
        "LastTagTime": "2026-01-21T14:27:45.729174558+08:00"
    }
}

More versions

docker.io/unclecode/crawl4ai:basic-amd64
linux/amd64 · docker.io · 2.32GB · 2024-12-30 11:12

docker.io/unclecode/crawl4ai:all-amd64
linux/amd64 · docker.io · 8.53GB · 2025-01-08 00:15

docker.io/unclecode/crawl4ai:0.6.0-r2
linux/amd64 · docker.io · 4.16GB · 2025-05-19 15:26

docker.io/unclecode/crawl4ai:0.6.0-r2
linux/arm64 · docker.io · 3.98GB · 2025-06-12 21:13

docker.io/unclecode/crawl4ai:latest
linux/amd64 · docker.io · 4.16GB · 2025-07-08 15:51

docker.io/unclecode/crawl4ai:latest
linux/arm64 · docker.io · 6.64GB · 2025-08-01 11:12

docker.io/unclecode/crawl4ai:0.7.2
linux/arm64 · docker.io · 6.64GB · 2025-08-01 11:14

docker.io/unclecode/crawl4ai:0.7.2
linux/amd64 · docker.io · 4.17GB · 2025-08-06 15:57

docker.io/unclecode/crawl4ai:0.7.4
linux/amd64 · docker.io · 5.84GB · 2025-09-12 17:36

docker.io/unclecode/crawl4ai:0.7.7
linux/amd64 · docker.io · 4.36GB · 2025-11-29 11:28

docker.io/unclecode/crawl4ai:0.8
linux/amd64 · docker.io · 3.80GB · 2026-01-21 14:31