🕷️
Decodo Scraper
Scraping Browser & Automation

Install Command
npx clawhub@latest install decodo
Installation Guide
1. Check Environment
Make sure Node.js 22+ and OpenClaw are installed. Run openclaw --version in your terminal to verify.
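If you want to script the environment check, the major-version comparison can be sketched like this (the sample version string is illustrative; in practice you would capture the output of node --version):

```shell
# Parse a Node.js version string and check it meets the 22+ requirement.
ver="v22.11.0"           # illustrative sample of `node --version` output
major="${ver#v}"         # strip the leading "v" -> 22.11.0
major="${major%%.*}"     # keep only the major version -> 22
[ "$major" -ge 22 ] && echo "Node.js version OK"
```

The same pattern works for gating any minimum-version requirement in an install script.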
2. Run Installation
Run the install command above in your terminal. ClawHub will automatically download and install Decodo Scraper to the ~/.openclaw/skills/ directory.
3. Verify Installation
Run openclaw skills list to check your installed skills and confirm Decodo Scraper appears in the list.
4. Configure (Optional)
Follow the configuration instructions in the description below to add skill settings to ~/.config/openclaw/openclaw.json5.
Manual Installation: Copy the Skill folder to ~/.openclaw/skills/ or the skills/ directory in your project root. Make sure the folder contains a SKILL.md file.
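The manual copy amounts to the following (shown here against a temporary directory so it is safe to run; substitute ~/.openclaw/skills/ and the real skill folder in practice):

```shell
# Illustrative manual install: copy a skill folder into a skills/ directory.
mkdir -p /tmp/demo/src/decodo /tmp/demo/skills
printf '# Decodo Scraper\n' > /tmp/demo/src/decodo/SKILL.md   # the required manifest
cp -r /tmp/demo/src/decodo /tmp/demo/skills/
# Confirm the copied folder still contains SKILL.md
test -f /tmp/demo/skills/decodo/SKILL.md && echo "SKILL.md present"
```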
Anti-Bot Bypass · IP Rotation · Structured Extraction
Detailed Description
Decodo Scraper is an enterprise-grade web scraping skill that reliably collects web content through the Decodo Web Scraping API.
Core Features
- Smart Scraping: Automatically handles JavaScript rendering, CAPTCHAs, and Cloudflare protection
- IP Rotation: Uses a global proxy pool with automatic IP rotation to avoid bans
- Structured Extraction: Converts webpage content into structured data (JSON, CSV)
- Batch Collection: Supports parallel scraping of batch URL lists
- Cache Management: Intelligently caches scraped content to reduce duplicate requests
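The cache-management idea, avoiding duplicate requests by keying responses on the URL, can be sketched as follows. This is an illustrative stand-in, not the skill's actual cache implementation: fetch_cached and the /tmp cache directory are hypothetical, and the real skill would call the Decodo API where the comment indicates.

```shell
# Sketch of URL-keyed caching to skip duplicate fetches (illustrative only).
cache=/tmp/demo_cache
mkdir -p "$cache"

fetch_cached() {
  url="$1"
  # Derive a cache key from the URL (cksum used here for simplicity)
  key=$(printf '%s' "$url" | cksum | cut -d' ' -f1)
  if [ -f "$cache/$key" ]; then
    echo "cache hit: $url"
  else
    echo "fetched: $url" > "$cache/$key"   # real skill would call the Decodo API here
    echo "cache miss: $url"
  fi
}

fetch_cached https://example.com/a   # first request: miss, stored
fetch_cached https://example.com/a   # repeat request: served from cache
```

The first call prints a cache miss and stores the result; the repeat call is answered from the cache without refetching.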
Configuration
{
  skills: {
    decodo: {
      apiKey: "xxx",
      outputFormat: "markdown",
      javascript: true,
      proxy: "residential"
    }
  }
}
Use Cases
- Data collection from websites with complex anti-bot protections
- E-commerce price monitoring and competitive analysis
- Batch fetching search result page content
- News website content aggregation and archiving