CrawlForge API Reference

batch_scrape

Scrape multiple URLs in parallel with async job management, webhook notifications, and configurable concurrency. Perfect for bulk data collection and automated workflows.

Use Cases

Bulk Data Collection

Scrape product catalogs, news articles, or research papers across multiple pages simultaneously

Competitor Analysis

Monitor pricing, features, and content across competitor websites in one batch

Automated Workflows

Integrate with webhooks for real-time processing as scraping jobs complete

Scheduled Reporting

Generate daily reports by batch scraping dashboards, analytics, or status pages

Content Archival

Archive multiple pages as screenshots or PDFs for compliance or historical records

Parallel Processing

Control concurrency levels to optimize speed while respecting rate limits

Endpoint

POST /api/v1/tools/batch_scrape
Auth required · 2 req/s on Free plan · 5 credits per request

Parameters

urls (string[], required)
    Array of URLs to scrape (1-50 URLs).
    Example: ["https://example.com", "https://example.org"]

formats (string[], optional, default ["markdown"])
    Output formats for each URL: markdown, html, text, screenshot, or pdf.
    Example: ["markdown", "screenshot"]

webhook (string, optional)
    Webhook URL to receive the job-completion notification.
    Example: https://yourapp.com/webhook/scrape-complete

maxConcurrency (number, optional, default 5)
    Maximum concurrent requests (1-10).
    Example: 10

timeout (number, optional, default 30000)
    Timeout per URL in milliseconds.
    Example: 45000

onlyMainContent (boolean, optional, default false)
    Extract only the main content, removing boilerplate.
    Example: true
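These limits can be checked client-side before a request spends credits. A minimal validation sketch, assuming nothing beyond the documented ranges (the function name is illustrative, not part of the API):

```python
def validate_batch_params(urls, formats=None, max_concurrency=5, timeout=30000):
    """Check batch_scrape parameters against the documented limits."""
    allowed_formats = {"markdown", "html", "text", "screenshot", "pdf"}
    if not 1 <= len(urls) <= 50:
        raise ValueError("urls must contain between 1 and 50 entries")
    if not all(u.startswith(("http://", "https://")) for u in urls):
        raise ValueError("every entry in urls must be an absolute http(s) URL")
    for fmt in formats or ["markdown"]:
        if fmt not in allowed_formats:
            raise ValueError(f"unsupported format: {fmt}")
    if not 1 <= max_concurrency <= 10:
        raise ValueError("maxConcurrency must be between 1 and 10")
    if timeout <= 0:
        raise ValueError("timeout must be a positive number of milliseconds")
```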

Request Examples

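A minimal sketch of submitting a batch with Python's standard library. The API host (api.crawlforge.com) and the bearer-token Authorization header are assumptions, not confirmed by this page:

```python
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # assumption: bearer-token auth

payload = {
    "urls": ["https://example.com", "https://example.org"],
    "formats": ["markdown", "screenshot"],
    "webhook": "https://yourapp.com/webhook/scrape-complete",
    "maxConcurrency": 10,
    "onlyMainContent": True,
}

def submit_batch(payload: dict) -> dict:
    """POST the batch and return the parsed JSON job object."""
    req = urllib.request.Request(
        "https://api.crawlforge.com/api/v1/tools/batch_scrape",  # host is an assumption
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# submit_batch(payload)  # returns the job object shown in the response example
```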

Response Example

200 OK (156 ms)

{
  "success": true,
  "data": {
    "jobId": "batch_1234567890abcdef",
    "status": "processing",
    "totalUrls": 3,
    "completed": 0,
    "successful": 0,
    "failed": 0,
    "startedAt": "2025-10-01T12:00:00Z",
    "estimatedCompletionAt": "2025-10-01T12:02:00Z",
    "results": []
  },
  "credits_used": 5,
  "credits_remaining": 995,
  "processing_time": 156
}
Field Descriptions

data.jobId                  Unique identifier for tracking this batch job
data.status                 Job status: queued, processing, completed, or failed
data.totalUrls              Total number of URLs in the batch
data.completed              Number of URLs processed (successful + failed)
data.estimatedCompletionAt  Estimated completion time based on concurrency
credits_used                5 credits per batch request (flat fee)
credits_remaining           Your remaining credit balance
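Until the webhook fires, a job can be polled for completion. The status route itself is not shown on this page (the 404 error below implies job lookup exists), so this sketch takes the fetch function as a parameter rather than assuming a URL:

```python
import time

TERMINAL_STATUSES = {"completed", "failed"}

def poll_job(fetch_status, job_id: str, interval: float = 2.0,
             max_wait: float = 120.0) -> dict:
    """Poll until the job reaches a terminal status or max_wait elapses.

    fetch_status is any callable mapping a job ID to the job's `data`
    object, e.g. a GET against the status endpoint (route not shown here).
    """
    deadline = time.monotonic() + max_wait
    while True:
        job = fetch_status(job_id)
        if job["status"] in TERMINAL_STATUSES:
            return job
        if time.monotonic() >= deadline:
            raise TimeoutError(f"job {job_id} still {job['status']} after {max_wait}s")
        time.sleep(interval)
```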

Webhook Payload

When the batch completes, your webhook URL will receive a JSON notification payload (webhook-payload.json).
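Assuming the notification mirrors the job object from the response example above (jobId, status, and the per-URL counts; the exact payload shape is an assumption), a minimal stdlib receiver might look like this:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def summarize_webhook(body: bytes) -> str:
    """Parse a completion notification and return a one-line summary.

    Assumes the payload mirrors the job object in the response example:
    jobId, status, totalUrls, successful, failed.
    """
    job = json.loads(body)["data"]
    return (f"{job['jobId']}: {job['status']} "
            f"({job['successful']}/{job['totalUrls']} ok, {job['failed']} failed)")

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        print(summarize_webhook(self.rfile.read(length)))
        self.send_response(200)  # acknowledge quickly; do heavy work elsewhere
        self.end_headers()

# HTTPServer(("", 8080), WebhookHandler).serve_forever()  # run behind an HTTPS proxy
```

Note the server must be reachable over HTTPS, since HTTP webhooks are rejected (see Error Handling below); in practice that means terminating TLS at a proxy in front of this handler.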

Error Handling

Too Many URLs (400 Bad Request)

Maximum 50 URLs per batch. Split large batches into multiple requests.
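Splitting an oversized URL list is a one-liner:

```python
def chunk_urls(urls, batch_size=50):
    """Split a URL list into batches no larger than the 50-URL limit."""
    return [urls[i:i + batch_size] for i in range(0, len(urls), batch_size)]
```

Each resulting chunk can then be submitted as its own batch_scrape request.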

Invalid Webhook URL (400 Bad Request)

Webhook must be a valid HTTPS URL. HTTP webhooks are not supported for security.

Insufficient Credits (402 Payment Required)

The 5-credit flat fee is charged upfront when the batch is submitted. Add more credits before retrying.

Job Not Found (404 Not Found)

The job ID doesn't exist or has expired. Jobs are retained for 7 days after completion.

Pro Tip: Use webhooks for large batches instead of polling. This reduces API calls and improves reliability. Failed URLs don't consume credits; only successful scrapes are charged.

Credit Cost

5 credits per request (flat fee)
Process up to 50 URLs per batch with parallel execution and webhook notifications.

What's Included:

  • Up to 50 URLs per batch
  • Parallel processing with configurable concurrency
  • Multiple output formats (markdown, HTML, text, screenshot, PDF)
  • Webhook notifications on completion
  • Async job management

Plan Recommendations:

Free Plan: 1,000 credits = 200 batch requests

Hobby Plan: 5,000 credits = 1,000 batch requests ($19/mo)

Professional Plan: 50,000 credits = 10,000 batch requests ($99/mo)

Related Tools

  • crawl_deep: Discover URLs first, then batch scrape them (4 credits)
  • deep_research: Multi-stage research across multiple sources (10 credits)
  • screenshot: Capture visual snapshots in batch (2 credits per screenshot)
  • stealth_mode: Bypass bot detection for protected sites (5 credits)
Ready to try batch_scrape? Sign up for free and get 1,000 credits to start building.