Credit Optimization Guide
Minimize scraping costs by choosing the right tools, implementing smart caching strategies, and optimizing your workflows for maximum value.
Quick Wins (Save 50-80%)
Use fetch_url instead of search_web when you know the URL
Saves 4 credits per request (1 credit vs 5 credits)
💰 80% cost reduction
Try static scraping before browser automation
Use fetch_url (1 credit) before scrape_with_actions (5 credits)
💰 80% cost reduction for static content
Cache results locally
Store scraped data in Redis or a database to avoid re-scraping the same URLs
💰 90%+ reduction on repeated requests
Use batch_scrape for multiple URLs
Same cost (1 credit/URL) but faster and more efficient than individual requests
⚡ 5x faster throughput
1. Tool Selection Strategy
Always start with the cheapest tool that meets your needs, then upgrade only if necessary.
1. Do you know the URL?
✅ Yes → Use fetch_url (1 credit)
❌ No → Use search_web (5 credits)
2. Is the content in the initial HTML?
✅ Yes → Use fetch_url (1 credit) + parse locally (free)
❌ No (JavaScript-rendered) → Use scrape_with_actions (5 credits)
3. Do you need structured extraction?
✅ Yes → Use scrape_structured (2 credits)
❌ No (raw HTML is fine) → Use fetch_url (1 credit)
4. Do you need AI-powered research?
✅ Yes → Use deep_research (10 credits)
❌ No → Use cheaper alternatives
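The decision tree above can be sketched as a small helper. The tool names and credit costs come from this guide; the `choose_tool` function and its parameter names are illustrative, not part of the API:

```python
# Credit costs as listed in this guide.
CREDIT_COST = {
    "fetch_url": 1,
    "scrape_structured": 2,
    "search_web": 5,
    "scrape_with_actions": 5,
    "deep_research": 10,
}

def choose_tool(know_url, js_rendered=False, structured=False, ai_research=False):
    """Walk the decision tree and return (tool, credits) for the cheapest fit."""
    if ai_research:
        tool = "deep_research"          # 4. AI-powered research
    elif not know_url:
        tool = "search_web"             # 1. URL unknown
    elif js_rendered:
        tool = "scrape_with_actions"    # 2. content not in initial HTML
    elif structured:
        tool = "scrape_structured"      # 3. structured extraction needed
    else:
        tool = "fetch_url"              # cheapest default
    return tool, CREDIT_COST[tool]
```

For example, `choose_tool(know_url=True)` returns `("fetch_url", 1)`, the cheapest path.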
| Use Case | Wrong Tool | Right Tool | Savings |
|---|---|---|---|
| Fetch HTML | search_web (5) | fetch_url (1) | 80% |
| Extract text | scrape_with_actions (5) | extract_text (1) | 80% |
| Get metadata | scrape_structured (2) | extract_metadata (1) | 50% |
| Research topic | deep_research (10) | search_web (5) + fetch_url (1×3) | 20% |
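The table's worst case is worth spelling out as arithmetic. This is a hedged sketch: the specific pairing (`deep_research` as the wrong tool, `fetch_url` plus local text extraction as the right one) is an assumption built from the guide's credit costs, not a preserved example:

```python
# Wrong: using the 10-credit AI research tool for a page whose URL is known.
wrong_cost = 10                 # deep_research

# Right: fetch the raw HTML (1 credit) and extract the text (1 credit).
right_cost = 1 + 1              # fetch_url + extract_text

savings = 1 - right_cost / wrong_cost
print(f"{savings:.0%} saved")   # 80% saved
```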
2. Caching Strategies
Avoid re-scraping the same content by implementing smart caching.
3. Batch vs Individual Requests
Use batch processing for multiple URLs to improve throughput and reduce overhead.
Individual Requests
⏱️ Time: ~5 seconds per URL
💰 Cost: 1 credit per URL
📊 Throughput: 12 URLs/minute
✅ Use for: fewer than 10 URLs
Batch Requests (Recommended)
⏱️ Time: ~15 seconds for 50 URLs
💰 Cost: 1 credit per URL
📊 Throughput: 200 URLs/minute
✅ Use for: 10+ URLs (16x faster!)
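The batching advantage can be sketched with a simple chunking helper. The batch size of 50 and the per-request timings reuse the numbers above; they are assumptions for illustration, not benchmarks:

```python
def chunked(urls, size=50):
    """Split a URL list into batch_scrape-sized chunks."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

urls = [f"https://example.com/page/{i}" for i in range(120)]
batches = chunked(urls)   # 120 URLs -> batches of 50, 50, 20

# Rough timing model: credits are identical (1/URL) either way,
# but batching cuts round trips dramatically.
individual_seconds = len(urls) * 5    # ~5 s per individual request
batch_seconds = len(batches) * 15     # ~15 s per batch
```

Here 120 individual requests would take roughly 600 seconds versus about 45 seconds in three batches, at exactly the same credit cost.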
4. Cost/Benefit Analysis
Calculate the ROI of your scraping operations.
Free Plan
$0
1,000 credits
= 1,000 fetch_url requests
Hobby Plan
$19
5,000 credits
= $0.0038/credit
Professional
$99
50,000 credits
≈ $0.002/credit
Cost Per 1,000 URLs (Hobby Plan)
✅ Using fetch_url (1 credit): $3.80
⚠️ Using scrape_structured (2 credits): $7.60
❌ Using scrape_with_actions (5 credits): $19.00
❌ Using deep_research (10 credits): $38.00
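These figures follow directly from the per-credit prices above. A small calculator, assuming flat per-credit pricing with no overage tiers (an assumption; the guide only lists plan totals):

```python
# Per-credit prices implied by the plans above.
PRICE_PER_CREDIT = {"hobby": 19 / 5_000, "professional": 99 / 50_000}

def cost_per_1000_urls(credits_per_url, plan="hobby"):
    """Dollar cost of scraping 1,000 URLs at a given credit cost per URL."""
    return 1000 * credits_per_url * PRICE_PER_CREDIT[plan]
```

On the Hobby plan, `cost_per_1000_urls(1)` gives $3.80 and `cost_per_1000_urls(10)` gives $38.00, matching the list above; the same 1,000 fetch_url requests cost only $1.98 on Professional.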
Optimization Summary
Always start with fetch_url
Cache results for at least 1 hour
Use batch_scrape for 10+ URLs
Avoid deep_research for simple tasks
Parse HTML locally when possible
Use onlyMainContent: true for batch scraping
Don't scrape the same URL twice within 24 hours
Check if site has API before scraping
Use webhooks for large batches (100+ URLs)
Monitor usage dashboard weekly