Automating Cloudflare Pages Preview Deployments with Lighthouse and AI Analysis

Streamline Cloudflare Pages preview deploys, Lighthouse audits, and AI diagnostics with a single repeatable script.

Modern frontend teams can stitch Cloudflare Pages previews, Google Lighthouse, and AI analysis together to validate every change before it merges.

Why run Lighthouse against a Cloudflare preview URL?

Preview deployments behave exactly like production. They run on Cloudflare's edge network, respect the CDN cache, and use the same optimization pipeline. Auditing that environment captures real-world hydration costs, rendering stalls, and CDN behavior that a localhost run simply misses.

Once Lighthouse finishes, hand the JSON to an AI assistant. It can triage unused scripts, image bloat, hydration delays, render-blocking assets, accessibility issues, and SEO pitfalls without a human staring at metric tables all afternoon.
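As a sketch of that hand-off, the snippet below distills a Lighthouse JSON report into the short list of failing audits an assistant actually needs. The inline report is illustrative sample data; point REPORT at a real output file such as one produced under reports/lighthouse/.

```shell
# Sketch: distill a Lighthouse JSON report into the shortlist of failing
# audits worth pasting into an AI assistant prompt. The inline report is
# sample data; point REPORT at a real Lighthouse output file instead.
REPORT="$(mktemp)"
cat > "$REPORT" <<'JSON'
{"audits":{
  "unused-javascript":{"title":"Reduce unused JavaScript","score":0.45},
  "render-blocking-resources":{"title":"Eliminate render-blocking resources","score":0.6},
  "viewport":{"title":"Has a viewport meta tag","score":1}}}
JSON

# List every audit scoring below 0.9; paste the result into your prompt.
SUMMARY="$(REPORT="$REPORT" node <<'NODE'
const fs = require("fs");
const data = JSON.parse(fs.readFileSync(process.env.REPORT, "utf8"));
for (const [id, audit] of Object.entries(data.audits)) {
  if (typeof audit.score === "number" && audit.score < 0.9) {
    console.log(`${id}: ${audit.title} (score ${audit.score})`);
  }
}
NODE
)"
echo "$SUMMARY"
```

Filtering on the numeric score keeps informational audits (which report `score: null`) out of the prompt.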

Prepare Chrome in WSL or Linux

Before running Lighthouse locally, point the CLI at a system Chrome build and pass stable headless flags. Add the exports below to `~/.bashrc` so every new shell picks them up.

Set environment variables

export CHROME_PATH="/usr/bin/google-chrome-stable"
export CHROME_FLAGS="--headless --no-sandbox --disable-gpu --disable-dev-shm-usage"

Reload the shell (this applies the exports only if you added them to `~/.bashrc`)

source ~/.bashrc
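Before handing the binary to Lighthouse, it is worth a quick sanity check that the configured path exists and can render a page headlessly. The default path below mirrors the export above; adjust it if your distribution installs Chrome elsewhere.

```shell
# Sanity check: confirm the configured Chrome binary exists and can
# render a page headlessly before Lighthouse tries to launch it.
CHROME_PATH="${CHROME_PATH:-/usr/bin/google-chrome-stable}"
if [ -x "$CHROME_PATH" ] && "$CHROME_PATH" --headless --no-sandbox --disable-gpu \
     --dump-dom "about:blank" > /dev/null 2>&1; then
  STATUS="Chrome headless OK: $("$CHROME_PATH" --version)"
else
  STATUS="Chrome not usable at $CHROME_PATH"
fi
echo "$STATUS"
```

If the check fails inside WSL, the usual culprits are a missing `google-chrome-stable` package or sandboxing restrictions, which is why the script passes `--no-sandbox`.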

Cloudflare preview deployment + Lighthouse automation

Replace the placeholders with your project slug and preview hostname. The script deploys `./out` via Wrangler, waits five seconds for the preview to initialize, and runs Lighthouse in mobile and desktop modes. Afterwards it strips the noisy `is-crawlable` audit, since Cloudflare serves preview deployments with a noindex header that would otherwise trip SEO alarms.

deploy-and-audit.sh

#!/usr/bin/env bash
set -euo pipefail

OUT_DIR="${1:-reports/lighthouse}"
TIMESTAMP="$(date -u +%Y%m%dT%H%M%SZ)"
PREVIEW_URL="${PREVIEW_URL:-https://preview.example.pages.dev}"

mkdir -p "$OUT_DIR"

DEPLOY_LOG="$(mktemp)"

echo "Deploying ./out to Cloudflare Pages (preview branch)..."
npx --yes wrangler pages deploy ./out \
  --project-name example-project \
  --branch preview \
  2>&1 | tee "$DEPLOY_LOG"

# Wrangler prints the unique deployment URL in its output; for a stable
# target this script audits the branch alias supplied via PREVIEW_URL.
DEPLOY_URL="$PREVIEW_URL"

echo "Preview URL: $DEPLOY_URL"
echo "Waiting 5 seconds for preview to warm up..."
sleep 5

CHROME_FLAGS=${CHROME_FLAGS:---headless --no-sandbox --disable-gpu --disable-dev-shm-usage}

run_lighthouse() {
  local label="$1"
  local preset="$2"
  local outfile="${OUT_DIR}/${TIMESTAMP}-${label}.json"

  # chrome-launcher honors the CHROME_PATH environment variable exported
  # earlier, so no separate Chrome path flag is passed here.
  local args=(
    "$DEPLOY_URL"
    "--output=json"
    "--output-path=$outfile"
    "--chrome-flags=$CHROME_FLAGS"
    "--max-wait-for-load=45000"
    "--quiet"
  )

  if [[ -n "$preset" ]]; then
    args+=("--preset=$preset")
  fi

  echo "Running Lighthouse (${label}) for ${DEPLOY_URL}"
  npx --yes lighthouse "${args[@]}"

  # Strip the is-crawlable audit in place. Cloudflare serves previews
  # with a noindex header, so the audit only adds noise. The report path
  # is passed via the environment; node reads the script from stdin.
  OUTFILE="$outfile" node <<'NODE'
const fs = require("fs");
const path = process.env.OUTFILE;
const data = JSON.parse(fs.readFileSync(path, "utf8"));
if (data?.audits?.["is-crawlable"]) {
  delete data.audits["is-crawlable"];
}
fs.writeFileSync(path, JSON.stringify(data));
NODE

  echo "Saved ${outfile}"
}

run_lighthouse "mobile" ""
run_lighthouse "desktop" "desktop"

echo "All reports complete. Files are in ${OUT_DIR}"
echo "Wrangler log: ${DEPLOY_LOG}"

Workflow benefits

Running the script before every merge keeps regressions out of the main branch and leaves the interpretation to an AI reviewer.

  • Prove that preview deployments behave like production before shipping.
  • Catch render-blocking scripts, hydration stalls, and accessibility issues early.
  • Feed Lighthouse JSON into an AI assistant to generate concrete remediation tasks.
  • Reuse the script locally, in CI, or as a required PR check.
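As a sketch of the PR-check idea (the threshold and paths are assumptions, not part of the script above): after deploy-and-audit.sh finishes, a CI step can fail the job when the performance category score drops below a floor. The inline report here stands in for the newest file in reports/lighthouse/.

```shell
# Sketch of a CI merge gate: fail the job when the Lighthouse performance
# category score drops below a threshold. The inline report is sample
# data standing in for the newest file in reports/lighthouse/.
THRESHOLD="${THRESHOLD:-0.85}"
REPORT="$(mktemp)"
cat > "$REPORT" <<'JSON'
{"categories":{"performance":{"score":0.91},"accessibility":{"score":0.97}}}
JSON

# Pull the performance category score out of the report.
SCORE="$(REPORT="$REPORT" node -e 'const fs = require("fs");
const data = JSON.parse(fs.readFileSync(process.env.REPORT, "utf8"));
console.log(data.categories.performance.score);')"

# awk handles the floating-point comparison portably.
if awk -v s="$SCORE" -v t="$THRESHOLD" 'BEGIN { exit !(s >= t) }'; then
  echo "performance $SCORE meets threshold $THRESHOLD"
else
  echo "performance $SCORE is below threshold $THRESHOLD"
  exit 1
fi
```

A non-zero exit from the gate is enough for most CI systems to mark the check as failed and block the merge.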

The result is a lightweight, automation-friendly QA loop tailored for Cloudflare Pages. Every preview build ships with hard data, and every audit ends with action items instead of guesswork.