# web-automation
Automated web browsing and scraping with the Playwright-compatible CloakBrowser, covering one-shot extraction and broader persistent automation under a single skill.
## What this skill is for
- One-shot extraction from one URL with JSON output
- Automating web workflows
- Authenticated session flows (logins/cookies)
- Extracting page content to markdown
- Working with bot-protected or dynamic pages
## Command selection
- Use `node skills/web-automation/scripts/extract.js "<URL>"` for one-shot extraction from a single URL
- Use `npx tsx scrape.ts ...` for markdown scraping modes
- Use `npx tsx browse.ts ...`, `auth.ts`, or `flow.ts` for interactive or authenticated flows
- Use `node skills/web-automation/scripts/zillow-discover.js "<street-address>"` or `har-discover.js` to resolve a real-estate listing URL from an address
- Use `node skills/web-automation/scripts/zillow-photos.js "<listing-url>"` or `har-photos.js` for real-estate photo extraction before attempting generic gallery automation
## Requirements
- Node.js 20+
- `pnpm`
- Network access to download the CloakBrowser binary on first use or via preinstall
## First-time setup
```bash
cd ~/.openclaw/workspace/skills/web-automation/scripts
pnpm install
npx cloakbrowser install
pnpm approve-builds
pnpm rebuild better-sqlite3 esbuild
```
## Updating CloakBrowser
```bash
cd ~/.openclaw/workspace/skills/web-automation/scripts
pnpm up cloakbrowser playwright-core
npx cloakbrowser install
pnpm approve-builds
pnpm rebuild better-sqlite3 esbuild
```
## System libraries (for OpenClaw Docker builds)
```bash
export OPENCLAW_DOCKER_APT_PACKAGES="ffmpeg jq curl libnss3 libatk1.0-0 libatk-bridge2.0-0 libcups2 libdrm2 libxkbcommon0 libxcomposite1 libxdamage1 libxfixes3 libxrandr2 libgbm1 libasound2"
```
## Native module note
If `pnpm install` warns that build scripts were ignored for native modules such as `better-sqlite3` or `esbuild`, run:
```bash
pnpm approve-builds
pnpm rebuild better-sqlite3 esbuild
```
Without this, helper scripts may fail before launch because the native bindings are missing.
## Prerequisite check
Before running automation, verify the local install and CloakBrowser wiring:
```bash
cd ~/.openclaw/workspace/skills/web-automation/scripts
node check-install.js
```
If this fails, stop and fix setup before troubleshooting site automation.
## Exec approvals allowlist
If OpenClaw keeps prompting for approval when running this skill, add a local allowlist for the main agent:
```bash
openclaw approvals allowlist add --agent main "/opt/homebrew/bin/node"
openclaw approvals allowlist add --agent main "/usr/bin/env"
openclaw approvals allowlist add --agent main "~/.openclaw/workspace/skills/web-automation/scripts/*.js"
openclaw approvals allowlist add --agent main "~/.openclaw/workspace/skills/web-automation/scripts/node_modules/.bin/*"
```
Verify with:
```bash
openclaw approvals get
```
Notes:
- If `node` lives somewhere else, replace `/opt/homebrew/bin/node` with the output of `which node`.
- If matching is inconsistent, replace `~/.openclaw/...` with the full absolute path for the machine.
- Keep the allowlist scoped to the main agent unless there is a clear reason to widen it.
- Prefer file-based commands like `node check-install.js`, `node zillow-photos.js ...`, and `node har-photos.js ...` over inline `node -e ...`. Inline interpreter eval is more likely to trigger approval friction.
- The same applies to `zillow-discover.js` and `har-discover.js`: keep discovery file-based, not inline.
## Common commands
```bash
# Install / wiring check
cd ~/.openclaw/workspace/skills/web-automation/scripts
node check-install.js
# One-shot JSON extraction
node extract.js "https://example.com"
# Zillow listing discovery from address
node zillow-discover.js "4141 Whiteley Dr, Corpus Christi, TX 78418"
# HAR listing discovery from address
node har-discover.js "4141 Whiteley Dr, Corpus Christi, TX 78418"
# Zillow photo extraction
node zillow-photos.js "https://www.zillow.com/homedetails/..."
# HAR photo extraction
node har-photos.js "https://www.har.com/homedetail/..."
# Browse a page with persistent profile
npx tsx browse.ts --url "https://example.com"
# Scrape markdown
npx tsx scrape.ts --url "https://example.com" --mode main --output page.md
# Authenticate flow
npx tsx auth.ts --url "https://example.com/login"
# General natural-language browser flow
npx tsx flow.ts --instruction 'go to https://search.fiorinis.com then type "pippo" then press enter then wait 2s'
```
## Real-estate listing discovery and photo extraction
Use the dedicated Zillow and HAR discovery/photo commands before trying a free-form gallery flow.
### Zillow discovery
```bash
cd ~/.openclaw/workspace/skills/web-automation/scripts
node zillow-discover.js "4141 Whiteley Dr, Corpus Christi, TX 78418"
```
What it does:
- opens the Zillow address URL with CloakBrowser
- resolves directly to a property page when Zillow supports the address slug
- otherwise looks for a `homedetails` listing link in the rendered page
- returns the discovered listing URL as JSON
### HAR discovery
```bash
cd ~/.openclaw/workspace/skills/web-automation/scripts
node har-discover.js "4141 Whiteley Dr, Corpus Christi, TX 78418"
```
What it does:
- opens the HAR address search page
- looks for a confident `homedetail` match in rendered results
- returns the discovered listing URL when HAR exposes a strong enough match
- returns `listingUrl: null` when HAR discovery is not confident enough
### Zillow photos
```bash
cd ~/.openclaw/workspace/skills/web-automation/scripts
node zillow-photos.js "https://www.zillow.com/homedetails/4141-Whiteley-Dr-Corpus-Christi-TX-78418/2103723704_zpid/"
```
What it does:
- opens the listing page with CloakBrowser
- tries the `See all photos` / `See all X photos` entry point
- if the click path proves flaky, falls back to the listing's embedded `__NEXT_DATA__` payload
- returns direct `photos.zillowstatic.com` image URLs as JSON
Expected success shape:
- `complete: true`
- `expectedPhotoCount` matches `photoCount`
- `imageUrls` contains the listing photo set
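The same success shape applies to both photo scripts, so a single check works for either. A sketch, assuming the JSON output uses exactly the `complete`, `expectedPhotoCount`, `photoCount`, and `imageUrls` field names listed above:

```javascript
// Validate the success shape of a photo-extraction result.
function isCompletePhotoResult(result) {
  return result.complete === true
    && result.expectedPhotoCount === result.photoCount
    && Array.isArray(result.imageUrls)
    && result.imageUrls.length === result.photoCount;
}
```

A caller can use this to decide whether to retry or fall back to a free-form gallery flow when a run comes back partial.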
### HAR photos
```bash
cd ~/.openclaw/workspace/skills/web-automation/scripts
node har-photos.js "https://www.har.com/homedetail/4141-whiteley-dr-corpus-christi-tx-78418/14069438"
```
What it does:
- opens the HAR listing page
- clicks `Show all photos` / `View all photos`
- extracts the direct `pics.harstatic.com` image URLs from the all-photos page
Expected success shape:
- `complete: true`
- `expectedPhotoCount` matches `photoCount`
- `imageUrls` contains the listing photo set
### Test commands
From `skills/web-automation/scripts`:
```bash
node check-install.js
npm run test:photos
node zillow-discover.js "<street-address>"
node har-discover.js "<street-address>"
node zillow-photos.js "<zillow-listing-url>"
node har-photos.js "<har-listing-url>"
```
Use the live Zillow and HAR URLs above for a known-good regression check.
## One-shot extraction (`extract.js`)
Use `extract.js` when the task is just: open one URL, render it, and return structured content.
### Features
- JavaScript rendering
- lightweight stealth and bounded anti-bot shaping
- JSON-only output
- optional screenshot and saved HTML
- browser sandbox left enabled
### Options
```bash
WAIT_TIME=5000 node skills/web-automation/scripts/extract.js "https://example.com"
SCREENSHOT_PATH=/tmp/page.png node skills/web-automation/scripts/extract.js "https://example.com"
SAVE_HTML=true node skills/web-automation/scripts/extract.js "https://example.com"
HEADLESS=false node skills/web-automation/scripts/extract.js "https://example.com"
USER_AGENT="Mozilla/5.0 ..." node skills/web-automation/scripts/extract.js "https://example.com"
```
### Output fields
- `requestedUrl`
- `finalUrl`
- `title`
- `content`
- `metaDescription`
- `status`
- `elapsedSeconds`
- `challengeDetected`
- optional `screenshot`
- optional `htmlFile`
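A minimal sketch for consuming that JSON, assuming the field names listed above; it surfaces `challengeDetected` first so callers can retry or switch to an interactive flow instead of treating a bot-challenge page as content:

```javascript
// Summarize an extract.js result object into a small, caller-friendly shape.
function summarizeExtraction(result) {
  if (result.challengeDetected) {
    // The page served a bot challenge; the content is not usable.
    return { ok: false, reason: 'challenge', url: result.finalUrl };
  }
  return {
    ok: true,
    url: result.finalUrl,
    title: result.title,
    chars: (result.content || '').length,
  };
}
```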
## Persistent browsing profile
`browse.ts`, `auth.ts`, `flow.ts`, and `scrape.ts` use a persistent CloakBrowser profile so sessions survive across runs.
Canonical env vars:
- `CLOAKBROWSER_PROFILE_PATH`
- `CLOAKBROWSER_HEADLESS`
- `CLOAKBROWSER_USERNAME`
- `CLOAKBROWSER_PASSWORD`
Legacy aliases still supported for compatibility:
- `CAMOUFOX_PROFILE_PATH`
- `CAMOUFOX_HEADLESS`
- `CAMOUFOX_USERNAME`
- `CAMOUFOX_PASSWORD`
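A sketch of canonical-over-legacy resolution for these pairs, assuming the suffixes line up exactly as listed (`CLOAKBROWSER_*` preferred, `CAMOUFOX_*` as fallback):

```javascript
// Resolve a profile setting, preferring the canonical CLOAKBROWSER_* name
// and falling back to the legacy CAMOUFOX_* alias.
function resolveEnv(name, env = process.env) {
  const canonical = `CLOAKBROWSER_${name}`;
  const legacy = `CAMOUFOX_${name}`;
  return env[canonical] ?? env[legacy];
}
```

For example, `resolveEnv('HEADLESS')` honors `CLOAKBROWSER_HEADLESS` when both variables are set.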
## Natural-language flow runner (`flow.ts`)
Use `flow.ts` when you want a general command style like:
- "go to this site"
- "find this button and click it"
- "type this and press enter"
### Example
```bash
npx tsx flow.ts --instruction 'go to https://example.com then click on "Sign in" then type "stef@example.com" in #email then press enter'
```
You can also use JSON steps for deterministic runs:
```bash
npx tsx flow.ts --steps '[{"action":"goto","url":"https://example.com"},{"action":"click","text":"Sign in"}]'
```
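For longer deterministic runs, building the step list in code and serializing it avoids hand-escaping JSON on the command line. A sketch, assuming only the `goto` and `click` step shapes shown in the example above:

```javascript
// Build a deterministic step list for `flow.ts --steps` and serialize it.
function buildSteps(url, buttonText) {
  return [
    { action: 'goto', url },
    { action: 'click', text: buttonText },
  ];
}

const steps = JSON.stringify(buildSteps('https://example.com', 'Sign in'));
// Then pass it as: npx tsx flow.ts --steps "$steps" (mind shell quoting)
```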