
How to Use CrawlForge with Windsurf IDE

CrawlForge Team
Engineering Team
April 9, 2026
7 min read


Windsurf is a next-generation AI code editor built by Codeium, designed around the concept of "Flows" -- multi-step AI actions that understand your entire codebase. CrawlForge MCP extends Windsurf's capabilities beyond your local files and into the live web, letting the AI fetch documentation, scrape reference implementations, research APIs, and extract data -- all from within your editor.

This guide shows you how to configure CrawlForge as an MCP server in Windsurf and use it for real development tasks.

Table of Contents

  • Prerequisites
  • Why CrawlForge in Windsurf
  • Step 1: Install CrawlForge MCP Server
  • Step 2: Configure Windsurf MCP Settings
  • Step 3: Verify the Connection
  • Practical Use Cases
  • Credit Cost Breakdown
  • Troubleshooting
  • Next Steps

Prerequisites

  • Windsurf IDE installed (windsurf.com)
  • Node.js 18+ installed
  • A CrawlForge API key -- sign up free for 1,000 credits

Why CrawlForge in Windsurf

Windsurf's Cascade AI agent can already read your codebase and make multi-file edits. But it cannot access external resources:

  • It cannot read the latest API documentation for a library you are using
  • It cannot check how a competitor implements a specific feature
  • It cannot pull real-world HTML to test your parser against
  • It cannot research best practices across multiple sources

CrawlForge solves this. Once configured, Windsurf's AI agent gains access to all 18 CrawlForge tools -- and selects the right one based on what you ask.

Step 1: Install CrawlForge MCP Server

Open your terminal and install the CrawlForge MCP server globally:

```bash
npm install -g crawlforge-mcp-server
```

Verify the installation:

```bash
which crawlforge-mcp-server
# should print the path to the installed binary
```

Step 2: Configure Windsurf MCP Settings

Windsurf reads MCP server configurations from a JSON file. Open your Windsurf MCP config:

macOS / Linux:

```bash
# Windsurf keeps MCP config in the Codeium settings directory
nano ~/.codeium/windsurf/mcp_config.json
```

Windows:

```bash
notepad %USERPROFILE%\.codeium\windsurf\mcp_config.json
```

Add the CrawlForge server to the mcpServers object:

```json
{
  "mcpServers": {
    "crawlforge": {
      "command": "crawlforge-mcp-server",
      "env": {
        "CRAWLFORGE_API_KEY": "cf_live_your_api_key_here"
      }
    }
  }
}
```

Replace cf_live_your_api_key_here with your actual API key from the CrawlForge dashboard.

Alternative: Project-Level Configuration

For team projects, create a .windsurf/mcp_config.json in your project root. This keeps the configuration version-controlled (but remember to add your API key via environment variables rather than hardcoding):

```json
{
  "mcpServers": {
    "crawlforge": {
      "command": "crawlforge-mcp-server",
      "env": {
        "CRAWLFORGE_API_KEY": "${CRAWLFORGE_API_KEY}"
      }
    }
  }
}
```

Step 3: Verify the Connection

Restart Windsurf to pick up the new MCP config. Then open the Cascade panel (Cmd+L / Ctrl+L) and type:

Fetch the homepage of https://crawlforge.dev and tell me what the product does.

Windsurf should automatically invoke the CrawlForge fetch_url or extract_content tool and return a summary. If you see the results, the connection is working.

You can verify which tools are available by asking:

What CrawlForge tools do you have access to?

Cascade will list all 18 tools with their descriptions.

Practical Use Cases

Fetch Documentation While Coding

Ask Cascade to read the latest docs for any library:

Read the Next.js docs for the App Router metadata API and show me how to add Open Graph tags.

CrawlForge's extract_content tool fetches the documentation page, and Cascade applies it directly to your code. Cost: 2 credits.

Research API Design Patterns

Search for best practices for REST API pagination in 2026 and summarize the top 3 approaches.

Cascade uses search_web to find articles, then extract_content to read them. Cost: ~11 credits (5 for search + 2 per page for 3 pages).

Generate Test Data from Real Websites

Scrape 5 product listings from https://books.toscrape.com and generate TypeScript interfaces that match the data structure.

CrawlForge's scrape_structured extracts the data, and Cascade generates typed interfaces from the response. Cost: 2 credits.
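The interfaces Cascade produces will vary with the data it sees; a plausible result for books.toscrape.com listings might look like the sketch below (the `BookListing` shape and `toBookListing` helper are illustrative, not CrawlForge output):

```typescript
// Illustrative interface for a books.toscrape.com product listing
interface BookListing {
  title: string;
  priceGbp: number;   // numeric price, parsed from a string like "£51.77"
  inStock: boolean;   // derived from the availability text
  rating: number;     // star rating, 1-5
  detailUrl: string;  // URL of the product detail page
}

// Normalize one raw scraped record (all string fields) into the typed shape
function toBookListing(raw: Record<string, string>): BookListing {
  return {
    title: raw.title,
    priceGbp: parseFloat(raw.price.replace(/[^0-9.]/g, "")),
    inStock: /in stock/i.test(raw.availability),
    rating: Number(raw.rating),
    detailUrl: raw.url,
  };
}
```

Pairing the generated interface with a small normalizer like this keeps raw scraped strings out of the rest of your codebase.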

Monitor a Competitor's Changelog

Fetch the changelog at https://competitor.com/changelog and list all features released in the last 30 days.

Uses extract_content to read the page and filters by date. Cost: 2 credits.

Credit Cost Breakdown

| Task | Tool | Credits |
| --- | --- | --- |
| Fetch a documentation page | extract_content | 2 |
| Search for articles | search_web | 5 |
| Extract structured data | scrape_structured | 2 |
| Crawl a multi-page docs site | crawl_deep | 5 |
| Get page metadata | extract_metadata | 1 |
| Analyze content sentiment | analyze_content | 3 |

A typical development session uses 10-30 credits. The Free tier (1,000 credits/month) covers most individual developer needs.
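The per-tool costs above make session budgeting simple arithmetic. As a sketch (the `researchCost` helper is illustrative, not a CrawlForge API), the earlier research example works out like this:

```typescript
// Per-tool credit costs, mirroring the breakdown table above
const CREDIT_COSTS = {
  search_web: 5,
  extract_content: 2,
  scrape_structured: 2,
  crawl_deep: 5,
  extract_metadata: 1,
  analyze_content: 3,
} as const;

// One web search followed by reading `pagesRead` result pages
function researchCost(pagesRead: number): number {
  return CREDIT_COSTS.search_web + CREDIT_COSTS.extract_content * pagesRead;
}
```

For three pages, `researchCost(3)` gives the ~11 credits quoted in the research example above.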

Troubleshooting

"MCP server not found": Verify the global install path is in your system PATH. Run which crawlforge-mcp-server to check. If it returns nothing, reinstall with npm install -g crawlforge-mcp-server.

"Authentication failed": Double-check your API key in mcp_config.json. Keys start with cf_live_ for production or cf_test_ for development.

Tools not appearing: Restart Windsurf after editing the MCP config. The server loads on IDE startup.

Slow responses: CrawlForge tools that render JavaScript (scrape_with_actions, stealth_mode) take 3-8 seconds. Static tools like fetch_url and extract_text respond in under 1 second.

Next Steps

With CrawlForge configured in Windsurf, explore these workflows:

  • Use stealth mode to access sites with anti-bot protection
  • Build a research agent workflow for market analysis
  • Explore the full CrawlForge Quick Start for MCP configuration options
  • Check all 18 tools and their capabilities

Give your AI editor access to the live web. Start free with 1,000 credits -- no credit card required.

Tags

windsurf, ide, integration, web-scraping, mcp, tutorial, ai-engineering

About the Author


CrawlForge Team

Engineering Team

Building the most comprehensive web scraping MCP server. We create tools that help developers extract, analyze, and transform web data for AI applications.


Related Articles

How to Use CrawlForge with LangGraph Agents (Tutorials, Apr 24, 8 min read)
Build stateful web scraping agents with LangGraph and CrawlForge. TypeScript guide covering graph nodes, state management, and conditional scraping flows.

How to Use CrawlForge with Dify Workflows (Tutorials, Apr 22, 7 min read)
Add CrawlForge as a custom tool in Dify for web scraping in your LLM app workflows. No-code and API integration guide with workflow examples.

How to Use CrawlForge with Mastra AI Agents (Tutorials, Apr 21, 7 min read)
Build AI agents with web scraping capabilities using Mastra and CrawlForge. TypeScript setup guide with tool integration, workflows, and agent examples.


© 2025-2026 CrawlForge. All rights reserved.