Windsurf is a next-generation AI code editor built by Codeium, designed around the concept of "Flows" -- multi-step AI actions that understand your entire codebase. CrawlForge MCP extends Windsurf's capabilities beyond your local files and into the live web, letting the AI fetch documentation, scrape reference implementations, research APIs, and extract data -- all from within your editor.
This guide shows you how to configure CrawlForge as an MCP server in Windsurf and use it for real development tasks.
Table of Contents
- Prerequisites
- Why CrawlForge in Windsurf
- Step 1: Install CrawlForge MCP Server
- Step 2: Configure Windsurf MCP Settings
- Step 3: Verify the Connection
- Practical Use Cases
- Credit Cost Breakdown
- Troubleshooting
- Next Steps
Prerequisites
- Windsurf IDE installed (windsurf.com)
- Node.js 18+ installed
- A CrawlForge API key -- sign up free for 1,000 credits
Why CrawlForge in Windsurf
Windsurf's Cascade AI agent can already read your codebase and make multi-file edits. But it cannot access external resources:
- It cannot read the latest API documentation for a library you are using
- It cannot check how a competitor implements a specific feature
- It cannot pull real-world HTML to test your parser against
- It cannot research best practices across multiple sources
CrawlForge solves this. Once configured, Windsurf's AI agent gains access to all 18 CrawlForge tools -- and selects the right one based on what you ask.
Step 1: Install CrawlForge MCP Server
Open your terminal and install the CrawlForge MCP server globally:
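A typical global install looks like this (the package name matches the one referenced in the troubleshooting section of this guide):

```shell
# Install the CrawlForge MCP server globally so Windsurf can launch it
npm install -g crawlforge-mcp-server
```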
Verify the installation:
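To confirm the binary landed on your PATH, you can run (the `--version` flag is an assumption; check the package's `--help` output if it differs):

```shell
# Should print the install path; empty output means PATH is missing the npm global bin
which crawlforge-mcp-server

# Print the installed version
crawlforge-mcp-server --version
```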
Step 2: Configure Windsurf MCP Settings
Windsurf reads MCP server configurations from a JSON file. Open your Windsurf MCP config:
macOS / Linux:
Windows:
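The default config location below is the commonly documented path for Windsurf; treat it as an assumption and verify against Windsurf's MCP settings UI if your install differs:

```shell
# macOS / Linux
open ~/.codeium/windsurf/mcp_config.json

# Windows (PowerShell)
notepad "$env:USERPROFILE\.codeium\windsurf\mcp_config.json"
```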
Add the CrawlForge server to the mcpServers object:
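A minimal entry follows the standard MCP server schema; the environment variable name `CRAWLFORGE_API_KEY` is an assumption here, so confirm it against the CrawlForge dashboard docs:

```json
{
  "mcpServers": {
    "crawlforge": {
      "command": "npx",
      "args": ["crawlforge-mcp-server"],
      "env": {
        "CRAWLFORGE_API_KEY": "cf_live_your_api_key_here"
      }
    }
  }
}
```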
Replace cf_live_your_api_key_here with your actual API key from the CrawlForge dashboard.
Alternative: Project-Level Configuration
For team projects, create a .windsurf/mcp_config.json in your project root. This keeps the configuration version-controlled (but remember to add your API key via environment variables rather than hardcoding):
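A sketch of a project-level config that avoids hardcoding the key: export `CRAWLFORGE_API_KEY` in your shell profile and reference it here. Whether Windsurf expands `${...}` placeholders is an assumption; if it does not, omit the `env` block entirely and let the server process inherit the variable from your environment:

```json
{
  "mcpServers": {
    "crawlforge": {
      "command": "npx",
      "args": ["crawlforge-mcp-server"],
      "env": {
        "CRAWLFORGE_API_KEY": "${CRAWLFORGE_API_KEY}"
      }
    }
  }
}
```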
Step 3: Verify the Connection
Restart Windsurf to pick up the new MCP config. Then open the Cascade panel (Cmd+L / Ctrl+L) and type:
Fetch the homepage of https://crawlforge.dev and tell me what the product does.
Windsurf should automatically invoke the CrawlForge fetch_url or extract_content tool and return a summary. If you see the results, the connection is working.
You can verify which tools are available by asking:
What CrawlForge tools do you have access to?
Cascade will list all 18 tools with their descriptions.
Practical Use Cases
Fetch Documentation While Coding
Ask Cascade to read the latest docs for any library:
Read the Next.js docs for the App Router metadata API and show me how to add Open Graph tags.
CrawlForge's extract_content tool fetches the documentation page, and Cascade applies it directly to your code. Cost: 2 credits.
Research API Design Patterns
Search for best practices for REST API pagination in 2026 and summarize the top 3 approaches.
Cascade uses search_web to find articles, then extract_content to read them. Cost: ~11 credits (5 for search + 2 per page for 3 pages).
Generate Test Data from Real Websites
Scrape 5 product listings from https://books.toscrape.com and generate TypeScript interfaces that match the data structure.
CrawlForge's scrape_structured extracts the data, and Cascade generates typed interfaces from the response. Cost: 2 credits.
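For a sense of the output, here is an illustrative sketch of the kind of interface Cascade might generate from those listings. The field names and the sample record are assumptions based on the data visible on books.toscrape.com, not actual tool output:

```typescript
// Hypothetical shape inferred from a books.toscrape.com product listing
interface ProductListing {
  title: string;
  priceGbp: number;   // listed price, e.g. 51.77
  inStock: boolean;   // derived from the availability text
  rating: number;     // star rating, 1-5
  detailUrl: string;  // link to the product detail page
}

// Sample record matching one listing on the site
const sample: ProductListing = {
  title: "A Light in the Attic",
  priceGbp: 51.77,
  inStock: true,
  rating: 3,
  detailUrl:
    "https://books.toscrape.com/catalogue/a-light-in-the-attic_1000/index.html",
};

console.log(`${sample.title}: £${sample.priceGbp}`);
```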
Monitor a Competitor's Changelog
Fetch the changelog at https://competitor.com/changelog and list all features released in the last 30 days.
Uses extract_content to read the page and filters by date. Cost: 2 credits.
Credit Cost Breakdown
| Task | Tool | Credits |
|---|---|---|
| Fetch a documentation page | extract_content | 2 |
| Search for articles | search_web | 5 |
| Extract structured data | scrape_structured | 2 |
| Crawl a multi-page docs site | crawl_deep | 5 |
| Get page metadata | extract_metadata | 1 |
| Analyze content sentiment | analyze_content | 3 |
A typical development session uses 10-30 credits. The Free tier (1,000 credits/month) covers most individual developer needs.
Troubleshooting
"MCP server not found": Verify the global install path is in your system PATH. Run which crawlforge-mcp-server to check. If it returns nothing, reinstall with npm install -g crawlforge-mcp-server.
"Authentication failed": Double-check your API key in mcp_config.json. Keys start with cf_live_ for production or cf_test_ for development.
Tools not appearing: Restart Windsurf after editing the MCP config. The server loads on IDE startup.
Slow responses: CrawlForge tools that render JavaScript (scrape_with_actions, stealth_mode) take 3-8 seconds. Static tools like fetch_url and extract_text respond in under 1 second.
Next Steps
With CrawlForge configured in Windsurf, explore these workflows:
- Use stealth mode to access sites with anti-bot protection
- Build a research agent workflow for market analysis
- Explore the full CrawlForge Quick Start for MCP configuration options
- Check all 18 tools and their capabilities
Give your AI editor access to the live web. Start free with 1,000 credits -- no credit card required.