docker.io/unclecode/crawl4ai:0.6.0-r2 linux/amd64

docker.io/unclecode/crawl4ai:0.6.0-r2 - China mirror download source

This is the Docker container image docker.io/unclecode/crawl4ai. According to the image's own labels, it packages Crawl4AI, an open-source, LLM-friendly web crawler and scraper; see the maintainer's documentation or README on Docker Hub for full details.

Source image: docker.io/unclecode/crawl4ai:0.6.0-r2
China mirror: swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.6.0-r2
Image ID: sha256:1dd69a22e325044031c4ab71417c598dba158959ff8e829eeac3636b99880348
Tag: 0.6.0-r2
Size: 4.16GB
Registry: docker.io
CMD: supervisord -c supervisord.conf
Entrypoint: (none)
Working directory: /app
OS/Arch: linux/amd64
Created: 2025-04-24T10:42:20.747112336Z
Synced: 2025-05-19 15:26
Updated: 2025-05-22 15:49
Exposed ports:
6379/tcp
Environment variables:
PATH=/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin LANG=C.UTF-8 GPG_KEY=7169605F62C751356D054A26A821E680E5FA6305 PYTHON_VERSION=3.12.10 PYTHON_SHA256=07ab697474595e06f06647417d3c7fa97ded07afc1a7e4454c5639919b46eaea C4AI_VERSION=0.6.0 PYTHONFAULTHANDLER=1 PYTHONHASHSEED=random PYTHONUNBUFFERED=1 PIP_NO_CACHE_DIR=1 PYTHONDONTWRITEBYTECODE=1 PIP_DISABLE_PIP_VERSION_CHECK=1 PIP_DEFAULT_TIMEOUT=100 DEBIAN_FRONTEND=noninteractive REDIS_HOST=localhost REDIS_PORT=6379 PYTHON_ENV=production
Image labels:
c4ai.version: 0.6.0
description: 🔥🕷️ Crawl4AI: Open-source LLM Friendly Web Crawler & scraper
maintainer: unclecode
version: 1.0

Docker pull commands

docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.6.0-r2
docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.6.0-r2  docker.io/unclecode/crawl4ai:0.6.0-r2
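Once pulled and retagged, the image can be started directly. A minimal docker-compose sketch, assuming the Crawl4AI API listens on port 11235 (the port probed by this image's HEALTHCHECK); the service name and the extra shared-memory size for headless Chromium are illustrative assumptions, not taken from this page:

```yaml
services:
  crawl4ai:
    image: docker.io/unclecode/crawl4ai:0.6.0-r2
    ports:
      - "11235:11235"   # API port probed by the image's health check
    shm_size: "1g"      # assumption: headless Chromium typically needs extra /dev/shm
    restart: unless-stopped
```

Note that 6379/tcp (Redis) is declared by the image but served by the bundled supervisord-managed Redis; it normally does not need to be published.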

Containerd pull commands

ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.6.0-r2
ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.6.0-r2  docker.io/unclecode/crawl4ai:0.6.0-r2

Quick shell replacement command

sed -i 's#unclecode/crawl4ai:0.6.0-r2#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.6.0-r2#' deployment.yaml
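To see the effect, here is a self-contained dry run on a throwaway manifest (the /tmp file path and the manifest contents are hypothetical, for illustration only):

```shell
# Create a sample manifest referencing the upstream image (hypothetical file).
cat > /tmp/deployment.yaml <<'EOF'
apiVersion: apps/v1
kind: Deployment
spec:
  template:
    spec:
      containers:
      - name: crawl4ai
        image: unclecode/crawl4ai:0.6.0-r2
EOF

# Rewrite the image reference in place to point at the China mirror.
sed -i 's#unclecode/crawl4ai:0.6.0-r2#swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.6.0-r2#' /tmp/deployment.yaml

grep 'image:' /tmp/deployment.yaml
```

One caveat: the command is not idempotent. The mirror reference itself still contains the substring `unclecode/crawl4ai:0.6.0-r2`, so running the same sed a second time would prepend the mirror prefix again and produce a broken, doubled image reference.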

Ansible bulk distribution - Docker

#ansible k8s -m shell -a 'docker pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.6.0-r2 && docker tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.6.0-r2  docker.io/unclecode/crawl4ai:0.6.0-r2'

Ansible bulk distribution - Containerd

#ansible k8s -m shell -a 'ctr images pull swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.6.0-r2 && ctr images tag  swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.6.0-r2  docker.io/unclecode/crawl4ai:0.6.0-r2'

Image build history


# 2025-04-24 18:42:20  0.00B Set the default command to run
CMD ["supervisord" "-c" "supervisord.conf"]
                        
# 2025-04-24 18:42:20  0.00B Set environment variable PYTHON_ENV
ENV PYTHON_ENV=production
                        
# 2025-04-24 18:42:20  0.00B Set the user the container runs as
USER appuser
                        
# 2025-04-24 18:42:20  0.00B Declare the port the container listens on
EXPOSE map[6379/tcp:{}]
                        
# 2025-04-24 18:42:20  0.00B Define the container health-check command
HEALTHCHECK &{["CMD-SHELL" "bash -c '    MEM=$(free -m | awk \"/^Mem:/{print \\$2}\");     if [ $MEM -lt 2048 ]; then         echo \"⚠️ Warning: Less than 2GB RAM available! Your container might need a memory boost! 🚀\";         exit 1;     fi &&     redis-cli ping > /dev/null &&     curl -f http://localhost:11235/health || exit 1'"] "30s" "10s" "5s" "0s" '\x03'}
                        
# 2025-04-24 18:42:20  0.00B Run a command and create a new image layer
RUN |9 C4AI_VER=0.6.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c mkdir -p /var/lib/redis /var/log/redis && chown -R appuser:appuser /var/lib/redis /var/log/redis # buildkit
                        
# 2025-04-24 18:42:20  929.56KB Run a command and create a new image layer
RUN |9 C4AI_VER=0.6.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c chown -R appuser:appuser ${APP_HOME} # buildkit
                        
# 2025-04-24 18:42:20  42.97KB Copy files or directories into the container
COPY deploy/docker/static /app/static # buildkit
                        
# 2025-04-24 18:42:20  885.00KB Copy files or directories into the container
COPY deploy/docker/* /app/ # buildkit
                        
# 2025-04-24 18:42:20  296.35KB Run a command and create a new image layer
RUN |9 C4AI_VER=0.6.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c crawl4ai-doctor # buildkit
                        
# 2025-04-24 18:42:07  588.49MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.6.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c mkdir -p /home/appuser/.cache/ms-playwright     && cp -r /root/.cache/ms-playwright/chromium-* /home/appuser/.cache/ms-playwright/     && chown -R appuser:appuser /home/appuser/.cache/ms-playwright # buildkit
                        
# 2025-04-24 18:42:06  896.81MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.6.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c playwright install --with-deps # buildkit
                        
# 2025-04-24 18:40:31  1.16GB Run a command and create a new image layer
RUN |9 C4AI_VER=0.6.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c crawl4ai-setup # buildkit
                        
# 2025-04-24 18:39:55  2.38MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.6.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c pip install --no-cache-dir --upgrade pip &&     /tmp/install.sh &&     python -c "import crawl4ai; print('✅ crawl4ai is ready to rock!')" &&     python -c "from playwright.sync_api import sync_playwright; print('✅ Playwright is feeling dramatic!')" # buildkit
                        
# 2025-04-24 18:39:43  292.54MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.6.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c if [ "$INSTALL_TYPE" = "all" ] ; then         pip install "/tmp/project/[all]" &&         python -m crawl4ai.model_loader ;     elif [ "$INSTALL_TYPE" = "torch" ] ; then         pip install "/tmp/project/[torch]" ;     elif [ "$INSTALL_TYPE" = "transformer" ] ; then         pip install "/tmp/project/[transformer]" &&         python -m crawl4ai.model_loader ;     else         pip install "/tmp/project" ;     fi # buildkit
                        
# 2025-04-24 18:39:18  0.00B Run a command and create a new image layer
RUN |9 C4AI_VER=0.6.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c if [ "$INSTALL_TYPE" = "all" ] ; then         pip install --no-cache-dir             torch             torchvision             torchaudio             scikit-learn             nltk             transformers             tokenizers &&         python -m nltk.downloader punkt stopwords ;     fi # buildkit
                        
# 2025-04-24 18:39:18  108.53MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.6.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c pip install --no-cache-dir -r requirements.txt # buildkit
                        
# 2025-04-24 18:39:09  281.00B Copy files or directories into the container
COPY deploy/docker/requirements.txt . # buildkit
                        
# 2025-04-24 18:39:09  1.31KB Copy files or directories into the container
COPY deploy/docker/supervisord.conf . # buildkit
                        
# 2025-04-24 18:39:09  313.62MB Copy files or directories into the container
COPY . /tmp/project/ # buildkit
                        
# 2025-04-23 15:51:43  435.00B Run a command and create a new image layer
RUN |9 C4AI_VER=0.6.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c echo '#!/bin/bash\nif [ "$USE_LOCAL" = "true" ]; then\n    echo "📦 Installing from local source..."\n    pip install --no-cache-dir /tmp/project/\nelse\n    echo "🌐 Installing from GitHub..."\n    for i in {1..3}; do \n        git clone --branch ${GITHUB_BRANCH} ${GITHUB_REPO} /tmp/crawl4ai && break || \n        { echo "Attempt $i/3 failed! Taking a short break... ☕"; sleep 5; }; \n    done\n    pip install --no-cache-dir /tmp/crawl4ai\nfi' > /tmp/install.sh && chmod +x /tmp/install.sh # buildkit
                        
# 2025-04-23 15:51:43  0.00B Set the working directory to /app
WORKDIR /app
                        
# 2025-04-23 15:51:43  0.00B Run a command and create a new image layer
RUN |9 C4AI_VER=0.6.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c mkdir -p /home/appuser && chown -R appuser:appuser /home/appuser # buildkit
                        
# 2025-04-23 15:51:43  4.50KB Run a command and create a new image layer
RUN |9 C4AI_VER=0.6.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c groupadd -r appuser && useradd --no-log-init -r -g appuser appuser # buildkit
                        
# 2025-04-23 15:51:43  143.32MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.6.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c if [ "$TARGETARCH" = "arm64" ]; then     echo "🦾 Installing ARM-specific optimizations";     apt-get update && apt-get install -y --no-install-recommends     libopenblas-dev     && apt-get clean     && rm -rf /var/lib/apt/lists/*; elif [ "$TARGETARCH" = "amd64" ]; then     echo "🖥️ Installing AMD64-specific optimizations";     apt-get update && apt-get install -y --no-install-recommends     libomp-dev     && apt-get clean     && rm -rf /var/lib/apt/lists/*; else     echo "Skipping platform-specific optimizations (unsupported platform)"; fi # buildkit
                        
# 2025-04-23 15:51:34  0.00B Run a command and create a new image layer
RUN |9 C4AI_VER=0.6.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c if [ "$ENABLE_GPU" = "true" ] && [ "$TARGETARCH" = "amd64" ] ; then     apt-get update && apt-get install -y --no-install-recommends     nvidia-cuda-toolkit     && apt-get clean     && rm -rf /var/lib/apt/lists/* ; else     echo "Skipping NVIDIA CUDA Toolkit installation (unsupported platform or GPU disabled)"; fi # buildkit
                        
# 2025-04-23 15:51:34  0.00B Run a command and create a new image layer
RUN |9 C4AI_VER=0.6.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c apt-get update && apt-get dist-upgrade -y     && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2025-04-23 15:51:30  30.97MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.6.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c apt-get update && apt-get install -y --no-install-recommends     libglib2.0-0     libnss3     libnspr4     libatk1.0-0     libatk-bridge2.0-0     libcups2     libdrm2     libdbus-1-3     libxcb1     libxkbcommon0     libx11-6     libxcomposite1     libxdamage1     libxext6     libxfixes3     libxrandr2     libgbm1     libpango-1.0-0     libcairo2     libasound2     libatspi2.0-0     && apt-get clean     && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2025-04-23 15:51:13  503.30MB Run a command and create a new image layer
RUN |9 C4AI_VER=0.6.0 APP_HOME=/app GITHUB_REPO=https://github.com/unclecode/crawl4ai.git GITHUB_BRANCH=main USE_LOCAL=true PYTHON_VERSION=3.12 INSTALL_TYPE=default ENABLE_GPU=false TARGETARCH=amd64 /bin/sh -c apt-get update && apt-get install -y --no-install-recommends     build-essential     curl     wget     gnupg     git     cmake     pkg-config     python3-dev     libjpeg-dev     redis-server     supervisor     && apt-get clean     && rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2025-04-23 15:51:13  0.00B Add a metadata label
LABEL version=1.0
                        
# 2025-04-23 15:51:13  0.00B Add a metadata label
LABEL description=🔥🕷️ Crawl4AI: Open-source LLM Friendly Web Crawler & scraper
                        
# 2025-04-23 15:51:13  0.00B Add a metadata label
LABEL maintainer=unclecode
                        
# 2025-04-23 15:51:13  0.00B Define a build argument
ARG TARGETARCH=amd64
                        
# 2025-04-23 15:51:13  0.00B Define a build argument
ARG ENABLE_GPU=false
                        
# 2025-04-23 15:51:13  0.00B Define a build argument
ARG INSTALL_TYPE=default
                        
# 2025-04-23 15:51:13  0.00B Define a build argument
ARG PYTHON_VERSION=3.12
                        
# 2025-04-23 15:51:13  0.00B Set environment variables PYTHONFAULTHANDLER PYTHONHASHSEED PYTHONUNBUFFERED PIP_NO_CACHE_DIR PYTHONDONTWRITEBYTECODE PIP_DISABLE_PIP_VERSION_CHECK PIP_DEFAULT_TIMEOUT DEBIAN_FRONTEND REDIS_HOST REDIS_PORT
ENV PYTHONFAULTHANDLER=1 PYTHONHASHSEED=random PYTHONUNBUFFERED=1 PIP_NO_CACHE_DIR=1 PYTHONDONTWRITEBYTECODE=1 PIP_DISABLE_PIP_VERSION_CHECK=1 PIP_DEFAULT_TIMEOUT=100 DEBIAN_FRONTEND=noninteractive REDIS_HOST=localhost REDIS_PORT=6379
                        
# 2025-04-23 15:51:13  0.00B Define a build argument
ARG USE_LOCAL=true
                        
# 2025-04-23 15:51:13  0.00B Define a build argument
ARG GITHUB_BRANCH=main
                        
# 2025-04-23 15:51:13  0.00B Define a build argument
ARG GITHUB_REPO=https://github.com/unclecode/crawl4ai.git
                        
# 2025-04-23 15:51:13  0.00B Define a build argument
ARG APP_HOME=/app
                        
# 2025-04-23 15:51:13  0.00B Add a metadata label
LABEL c4ai.version=0.6.0
                        
# 2025-04-23 15:51:13  0.00B Set environment variable C4AI_VERSION
ENV C4AI_VERSION=0.6.0
                        
# 2025-04-23 15:51:13  0.00B Define a build argument
ARG C4AI_VER=0.6.0
                        
# 2025-04-09 03:02:43  0.00B Set the default command to run
CMD ["python3"]
                        
# 2025-04-09 03:02:43  36.00B Run a command and create a new image layer
RUN /bin/sh -c set -eux; 	for src in idle3 pip3 pydoc3 python3 python3-config; do 		dst="$(echo "$src" | tr -d 3)"; 		[ -s "/usr/local/bin/$src" ]; 		[ ! -e "/usr/local/bin/$dst" ]; 		ln -svT "$src" "/usr/local/bin/$dst"; 	done # buildkit
                        
# 2025-04-09 03:02:43  40.19MB Run a command and create a new image layer
RUN /bin/sh -c set -eux; 		savedAptMark="$(apt-mark showmanual)"; 	apt-get update; 	apt-get install -y --no-install-recommends 		dpkg-dev 		gcc 		gnupg 		libbluetooth-dev 		libbz2-dev 		libc6-dev 		libdb-dev 		libffi-dev 		libgdbm-dev 		liblzma-dev 		libncursesw5-dev 		libreadline-dev 		libsqlite3-dev 		libssl-dev 		make 		tk-dev 		uuid-dev 		wget 		xz-utils 		zlib1g-dev 	; 		wget -O python.tar.xz "https://www.python.org/ftp/python/${PYTHON_VERSION%%[a-z]*}/Python-$PYTHON_VERSION.tar.xz"; 	echo "$PYTHON_SHA256 *python.tar.xz" | sha256sum -c -; 	wget -O python.tar.xz.asc "https://www.python.org/ftp/python/${PYTHON_VERSION%%[a-z]*}/Python-$PYTHON_VERSION.tar.xz.asc"; 	GNUPGHOME="$(mktemp -d)"; export GNUPGHOME; 	gpg --batch --keyserver hkps://keys.openpgp.org --recv-keys "$GPG_KEY"; 	gpg --batch --verify python.tar.xz.asc python.tar.xz; 	gpgconf --kill all; 	rm -rf "$GNUPGHOME" python.tar.xz.asc; 	mkdir -p /usr/src/python; 	tar --extract --directory /usr/src/python --strip-components=1 --file python.tar.xz; 	rm python.tar.xz; 		cd /usr/src/python; 	gnuArch="$(dpkg-architecture --query DEB_BUILD_GNU_TYPE)"; 	./configure 		--build="$gnuArch" 		--enable-loadable-sqlite-extensions 		--enable-optimizations 		--enable-option-checking=fatal 		--enable-shared 		--with-lto 		--with-ensurepip 	; 	nproc="$(nproc)"; 	EXTRA_CFLAGS="$(dpkg-buildflags --get CFLAGS)"; 	LDFLAGS="$(dpkg-buildflags --get LDFLAGS)"; 	LDFLAGS="${LDFLAGS:--Wl},--strip-all"; 		arch="$(dpkg --print-architecture)"; arch="${arch##*-}"; 		case "$arch" in 			amd64|arm64) 				EXTRA_CFLAGS="${EXTRA_CFLAGS:-} -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer"; 				;; 			i386) 				;; 			*) 				EXTRA_CFLAGS="${EXTRA_CFLAGS:-} -fno-omit-frame-pointer"; 				;; 		esac; 	make -j "$nproc" 		"EXTRA_CFLAGS=${EXTRA_CFLAGS:-}" 		"LDFLAGS=${LDFLAGS:-}" 	; 	rm python; 	make -j "$nproc" 		"EXTRA_CFLAGS=${EXTRA_CFLAGS:-}" 		"LDFLAGS=${LDFLAGS:--Wl},-rpath='\$\$ORIGIN/../lib'" 		python 	; 	make install; 		cd /; 	rm -rf 
/usr/src/python; 		find /usr/local -depth 		\( 			\( -type d -a \( -name test -o -name tests -o -name idle_test \) \) 			-o \( -type f -a \( -name '*.pyc' -o -name '*.pyo' -o -name 'libpython*.a' \) \) 		\) -exec rm -rf '{}' + 	; 		ldconfig; 		apt-mark auto '.*' > /dev/null; 	apt-mark manual $savedAptMark; 	find /usr/local -type f -executable -not \( -name '*tkinter*' \) -exec ldd '{}' ';' 		| awk '/=>/ { so = $(NF-1); if (index(so, "/usr/local/") == 1) { next }; gsub("^/(usr/)?", "", so); printf "*%s\n", so }' 		| sort -u 		| xargs -r dpkg-query --search 		| cut -d: -f1 		| sort -u 		| xargs -r apt-mark manual 	; 	apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false; 	rm -rf /var/lib/apt/lists/*; 		export PYTHONDONTWRITEBYTECODE=1; 	python3 --version; 	pip3 --version # buildkit
                        
# 2025-04-09 03:02:43  0.00B Set environment variable PYTHON_SHA256
ENV PYTHON_SHA256=07ab697474595e06f06647417d3c7fa97ded07afc1a7e4454c5639919b46eaea
                        
# 2025-04-09 03:02:43  0.00B Set environment variable PYTHON_VERSION
ENV PYTHON_VERSION=3.12.10
                        
# 2025-04-09 03:02:43  0.00B Set environment variable GPG_KEY
ENV GPG_KEY=7169605F62C751356D054A26A821E680E5FA6305
                        
# 2025-04-09 03:02:43  9.23MB Run a command and create a new image layer
RUN /bin/sh -c set -eux; 	apt-get update; 	apt-get install -y --no-install-recommends 		ca-certificates 		netbase 		tzdata 	; 	rm -rf /var/lib/apt/lists/* # buildkit
                        
# 2025-04-09 03:02:43  0.00B Set environment variable LANG
ENV LANG=C.UTF-8
                        
# 2025-04-09 03:02:43  0.00B Set environment variable PATH
ENV PATH=/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
                        
# 2025-04-07 08:00:00  74.83MB 
# debian.sh --arch 'amd64' out/ 'bookworm' '@1743984000'
                        
                    

Image details (docker inspect)

{
    "Id": "sha256:1dd69a22e325044031c4ab71417c598dba158959ff8e829eeac3636b99880348",
    "RepoTags": [
        "unclecode/crawl4ai:0.6.0-r2",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai:0.6.0-r2"
    ],
    "RepoDigests": [
        "unclecode/crawl4ai@sha256:b33e98fba44182409a99a84db655c152344be710e3bd9e97079004375a90148a",
        "swr.cn-north-4.myhuaweicloud.com/ddn-k8s/docker.io/unclecode/crawl4ai@sha256:0260e6f751a025dd3eb75074b325ba41780b2ec9eb122661008f88b1ae7e1fa5"
    ],
    "Parent": "",
    "Comment": "buildkit.dockerfile.v0",
    "Created": "2025-04-24T10:42:20.747112336Z",
    "Container": "",
    "ContainerConfig": null,
    "DockerVersion": "",
    "Author": "",
    "Config": {
        "Hostname": "",
        "Domainname": "",
        "User": "appuser",
        "AttachStdin": false,
        "AttachStdout": false,
        "AttachStderr": false,
        "ExposedPorts": {
            "6379/tcp": {}
        },
        "Tty": false,
        "OpenStdin": false,
        "StdinOnce": false,
        "Env": [
            "PATH=/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
            "LANG=C.UTF-8",
            "GPG_KEY=7169605F62C751356D054A26A821E680E5FA6305",
            "PYTHON_VERSION=3.12.10",
            "PYTHON_SHA256=07ab697474595e06f06647417d3c7fa97ded07afc1a7e4454c5639919b46eaea",
            "C4AI_VERSION=0.6.0",
            "PYTHONFAULTHANDLER=1",
            "PYTHONHASHSEED=random",
            "PYTHONUNBUFFERED=1",
            "PIP_NO_CACHE_DIR=1",
            "PYTHONDONTWRITEBYTECODE=1",
            "PIP_DISABLE_PIP_VERSION_CHECK=1",
            "PIP_DEFAULT_TIMEOUT=100",
            "DEBIAN_FRONTEND=noninteractive",
            "REDIS_HOST=localhost",
            "REDIS_PORT=6379",
            "PYTHON_ENV=production"
        ],
        "Cmd": [
            "supervisord",
            "-c",
            "supervisord.conf"
        ],
        "Healthcheck": {
            "Test": [
                "CMD-SHELL",
                "bash -c '    MEM=$(free -m | awk \"/^Mem:/{print \\$2}\");     if [ $MEM -lt 2048 ]; then         echo \"⚠️ Warning: Less than 2GB RAM available! Your container might need a memory boost! 🚀\";         exit 1;     fi \u0026\u0026     redis-cli ping \u003e /dev/null \u0026\u0026     curl -f http://localhost:11235/health || exit 1'"
            ],
            "Interval": 30000000000,
            "Timeout": 10000000000,
            "StartPeriod": 5000000000,
            "Retries": 3
        },
        "ArgsEscaped": true,
        "Image": "",
        "Volumes": null,
        "WorkingDir": "/app",
        "Entrypoint": null,
        "OnBuild": null,
        "Labels": {
            "c4ai.version": "0.6.0",
            "description": "🔥🕷️ Crawl4AI: Open-source LLM Friendly Web Crawler \u0026 scraper",
            "maintainer": "unclecode",
            "version": "1.0"
        }
    },
    "Architecture": "amd64",
    "Os": "linux",
    "Size": 4161895142,
    "GraphDriver": {
        "Data": {
            "LowerDir": "/var/lib/docker/overlay2/9b2bd77c3160aa20366df2e84bd0f13633b270c18a556edc1450ad5268a8013e/diff:/var/lib/docker/overlay2/407990c18cec0910bf8cf1c459083f7e8089b805864ec6734f26dcf0845b0655/diff:/var/lib/docker/overlay2/7573d421fcdbfd7198e1838e3094b8adb4f67b5c56ee469dc50cf5971e0943c1/diff:/var/lib/docker/overlay2/d61cbb27ee30fb99ed16de35b859d98554baf7465726e6f9dd1decd6e2de0db2/diff:/var/lib/docker/overlay2/f3b589529894af97ccae9078f582a1b6061bf4f18e37b22d1e9a5e931bc26148/diff:/var/lib/docker/overlay2/78ce16af9c74fba4ab04383051dc2697647a89455bec8980b651adf3ae252e79/diff:/var/lib/docker/overlay2/461f0b6f048eed3f710909a922193d6200566022281af4508aaca783bea191e6/diff:/var/lib/docker/overlay2/70e536e9ad1926a5aae669101004ff5fd49ce58efcdf9ce27ae531f98e1899f7/diff:/var/lib/docker/overlay2/1567184ea3bdd49c0cf46099c4f7ca7729aec48515121708092114684a14a056/diff:/var/lib/docker/overlay2/eb65ce684f6c12d501aecfc18511f82cc135c9a5fccc1a6b4c76827560aacd70/diff:/var/lib/docker/overlay2/5075abb6d75fe2cb159854e3916d545cdcb4eeddcba3a5477b45d5a3c2a408bc/diff:/var/lib/docker/overlay2/2e2c037103b4b4f9136d45378f1b638582c5a848e99282705c2f11192fd8797a/diff:/var/lib/docker/overlay2/97faa7d748af7268bc70a81424a50effc730e10b65b133208c7469b9aa121bd2/diff:/var/lib/docker/overlay2/cabf50c668537f82b09468970d249ed21866d0dd76881aa14ff802f23a4f9114/diff:/var/lib/docker/overlay2/f6dc7549f71b00e1711cb751cb6fac7c3b2dab7dd29c66073f98bb98a33e39e3/diff:/var/lib/docker/overlay2/0b2eb12f4c9d865666037130ecf033e7498c65fd6c2fa10a707e0c46690f78fc/diff:/var/lib/docker/overlay2/fce85418a073bcaf6ae3cb44f96c0c604a03fd132bc15b51385c513eb46658e1/diff:/var/lib/docker/overlay2/0e3c1497a3c5bb0871d0756b5415cc1df3d947cf11099290848b6aa56b77d0ac/diff:/var/lib/docker/overlay2/5a2e1837d3615738c8bb7f2e57393f9799307da1f223b46d92c898a4e9ec6513/diff:/var/lib/docker/overlay2/3c022a1b446aab70da6efa329db1e70953dac9a8c4c07dc3a2be57d87b3c133c/diff:/var/lib/docker/overlay2/a03c484a37479a2f62d3c1d430cd2b6fc4f48147f7a273724a
a3494095b3e8dc/diff:/var/lib/docker/overlay2/b7da92cbd413073f82cc67eff45267519cf66592b9e10a24b7c848467a88967c/diff:/var/lib/docker/overlay2/a35049bee4abbd12d2bcbde5a2f129e841557c4a1ea6ea421dbb69ecfa5b95be/diff:/var/lib/docker/overlay2/3e6fa7f05f16db2f7308afd76e63337fca3a7d8dd02dc77dc0ef59c6955fb5ab/diff:/var/lib/docker/overlay2/e4c38474a2e99e575f60b3789879e68fd167e98c60d4c963776a9a49129e5064/diff:/var/lib/docker/overlay2/9886df6e4efeefdae2c801249f7aa816a44eae0471cc3ec669283027068eb555/diff:/var/lib/docker/overlay2/6c87f3c40f6916490852332f5b9f29b4cf427a690869e3d003326ba790adfed3/diff",
            "MergedDir": "/var/lib/docker/overlay2/55bd8ffaeace4795872cc467e0260f51c7a1f02d5c44a14227b6dd59cbe7fdc4/merged",
            "UpperDir": "/var/lib/docker/overlay2/55bd8ffaeace4795872cc467e0260f51c7a1f02d5c44a14227b6dd59cbe7fdc4/diff",
            "WorkDir": "/var/lib/docker/overlay2/55bd8ffaeace4795872cc467e0260f51c7a1f02d5c44a14227b6dd59cbe7fdc4/work"
        },
        "Name": "overlay2"
    },
    "RootFS": {
        "Type": "layers",
        "Layers": [
            "sha256:ea680fbff095473bb8a6c867938d6d851e11ef0c177fce983ccc83440172bd72",
            "sha256:67c69e32800ff6bdaf973dd90010ca63bb7858f7077bba0abde1680d234e96e4",
            "sha256:e72ee629d8b7663fa594f85a170619785422794b9e02a496f58756f89e6560d1",
            "sha256:68b89356c898a133e6c335725f8fd6aca40f155c84d951481cb087b5d55658a9",
            "sha256:d1e33567fea43b7dc86c327909b3064855d0cd5972aa9e5103920785286b77ca",
            "sha256:1a6559a942c1b5f9889fd73d1472a19c557bc84c418347098aa9916227c0d529",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:0b3c2db44b34bf682607e924f4d1dfc720b0f0ed9c45573aacf82ee4b86dec2e",
            "sha256:82727c37066a292fe33211a93603392fffab541f9349334a521abb534e1f1a3f",
            "sha256:e86419bac1ab4966b36050390ec92e5ad23daf2ce48f63bf67bebbc472f585b2",
            "sha256:884ef2c50cf57cdd448d7dac3fd1e8f153b18d6c09a2d589436bc1a8ea194c11",
            "sha256:b31a419289ed8b473a146d504e0dbbf27b35f6cfbdc6f452e58a9fda1b762801",
            "sha256:997732126fa73140381d73a58dbd6139c5825725abc41fef754a76fcfdd89363",
            "sha256:e1226c6822867ed577096c1ed5a2e7b3192c0b41a0e10826b49e438097c80403",
            "sha256:9a17edb8fcc39d82a43ffa0c1737c1dcace7f77caa458e124a20cad512187a4f",
            "sha256:5359760763c8ed72c190ad769283aca5475bf4e93dc5574ed1b387838f255e3b",
            "sha256:5f70bf18a086007016e948b04aed3b82103a36bea41755b6cddfaf10ace3c6ef",
            "sha256:a0b9113b5ec63a0799a38cbe23b07fd75990c6b351f1d6e3bd3a3b55c5fbeb5b",
            "sha256:7aa4b76471d9b9ec745f1acea862ec8d0589632e94edd3b07ae4f8bdf622a2f0",
            "sha256:4954c2a0b7dae112e12329cb33ed938e7e4132d10a5189d44bf6eb3ac32e08ce",
            "sha256:a164b730fca530510b3302ecfee0a8105e5c2fbc589237e1c3d6828069a6af5f",
            "sha256:05faf6c23fbfb398e7715afbace9a490221ff3c39172da62a768d84a13a2348b",
            "sha256:1ee1014d109cde4e3c3e972d9ce90bc8f237d1cfc0fc1fd88986cc30db4ec69c",
            "sha256:e80e0ec7eb96d9988b55e7904c94a6f241b72561ab346229fd21ec0d671315b2",
            "sha256:69391181a9b955a0c7197dc4faa3e1ed52b2dd70740dee49b772dd6d7b4bbee6",
            "sha256:ae78e8c1c3acb54f1b9207f7608871d204e2af92096a55963ac9cb18a77d1c39",
            "sha256:43d95e0a2f9215c3358e47f38724883322c3f52c267e68930f2fc0c8c3d87b96"
        ]
    },
    "Metadata": {
        "LastTagTime": "2025-05-19T15:23:08.611224807+08:00"
    }
}

More versions

docker.io/unclecode/crawl4ai:basic-amd64
linux/amd64 | docker.io | 2.32GB | 2024-12-30 11:12

docker.io/unclecode/crawl4ai:all-amd64
linux/amd64 | docker.io | 8.53GB | 2025-01-08 00:15

docker.io/unclecode/crawl4ai:0.6.0-r2
linux/amd64 | docker.io | 4.16GB | 2025-05-19 15:26