
How to Install CrawlForge MCP and Use It in Claude Code: A Beginner's Guide

CrawlForge Team
Engineering Team
December 26, 2025
10 min read

If you're new to Claude Code (Anthropic's terminal-based AI assistant) and want to give it the power to scrape websites, search the web, and extract content, you're in the right place. This beginner-friendly guide will walk you through everything from installation to your first successful web scrape.

What You'll Learn

By the end of this guide, you'll be able to:

  • Install CrawlForge MCP on your computer
  • Set up your free API key (1,000 credits included)
  • Configure Claude Code to use CrawlForge
  • Run your first web scraping commands

Time required: About 5-10 minutes

What is CrawlForge MCP?

Before we dive in, let's understand what we're installing:

  • CrawlForge is a professional web scraping service with 18 different tools
  • MCP stands for Model Context Protocol - it's how AI assistants like Claude connect to external tools
  • Claude Code is Anthropic's command-line AI assistant that runs in your terminal

When you connect CrawlForge to Claude Code, you can simply ask Claude to "fetch this webpage" or "search for information about X" and it will use CrawlForge's tools automatically.

Prerequisites

Before starting, make sure you have:

  1. Node.js 18 or higher installed

    • Check by running: node --version
    • Download from nodejs.org if needed
  2. Claude Code installed

    • If you don't have it yet, install with: npm install -g @anthropic-ai/claude-code
  3. A terminal/command prompt open

That's it! Let's get started.

Step 1: Install CrawlForge MCP Server

Open your terminal and run this command:

```bash
npm install -g crawlforge-mcp-server
```

What this does:

  • Downloads the CrawlForge MCP server package
  • Installs it globally on your computer (the -g flag)
  • Makes it available from any directory

You should see output like:

added 1 package in 2s

Step 2: Set Up Your API Key

Now we need to get your free API key and configure it. Run:

```bash
# Assumed setup entry point - see crawlforge.dev/docs if this differs
npx crawlforge-mcp-server setup
```

This interactive setup wizard will:

  1. Ask if you have an API key

    • If yes: Enter it when prompted
    • If no: It will open your browser to create a free account
  2. Guide you to get your free API key

    • Go to crawlforge.dev/signup
    • Create your account (no credit card required)
    • You'll receive 1,000 free credits instantly
    • Copy your API key (it starts with cf_live_)
  3. Configure your credentials securely

    • The setup stores your API key at ~/.crawlforge/config.json
    • This keeps it safe and separate from your projects
  4. Verify everything is working

    • The setup tests your connection to confirm it's ready
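If you want to double-check that the key was saved, a quick sanity check like the one below can help. This is an illustrative sketch only: the field name inside ~/.crawlforge/config.json (assumed here to be apiKey) may differ from what the wizard actually writes.

```python
import json
from pathlib import Path

def has_valid_key(config_path: Path, field: str = "apiKey") -> bool:
    """Return True if the config file exists and holds a cf_live_ key.

    NOTE: the field name "apiKey" is an assumption for illustration;
    check the file the setup wizard actually wrote on your machine.
    """
    if not config_path.exists():
        return False
    data = json.loads(config_path.read_text())
    key = data.get(field, "")
    # CrawlForge keys start with the cf_live_ prefix
    return isinstance(key, str) and key.startswith("cf_live_")

if __name__ == "__main__":
    print(has_valid_key(Path.home() / ".crawlforge" / "config.json"))
```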

Alternative: Manual Configuration

If you prefer to set up manually, you can:

Option A: Environment Variable

```bash
# Variable name shown here is illustrative - check the CrawlForge docs
export CRAWLFORGE_API_KEY="cf_live_your_key_here"
```

Option B: Config File

Create ~/.crawlforge/config.json (the field name shown here is illustrative; the setup wizard writes the canonical format):

```json
{
  "apiKey": "cf_live_your_key_here"
}
```

Step 3: Configure Claude Code

Now we need to tell Claude Code about CrawlForge. There are two ways to do this:

Method 1: Add to Claude Code Settings (Recommended)

From your terminal (not inside a Claude Code session), register the server with the claude mcp add command:

```bash
claude mcp add crawlforge -- npx crawlforge-mcp-server
```

This adds CrawlForge to your Claude Code configuration automatically.

Method 2: Edit Configuration File

You can also manually edit your Claude Code MCP settings. The configuration file location depends on your operating system:

macOS:

~/.config/claude/mcp_servers.json

Windows:

%APPDATA%\claude\mcp_servers.json

Linux:

~/.config/claude/mcp_servers.json

Add this configuration (the standard MCP server entry format):

```json
{
  "mcpServers": {
    "crawlforge": {
      "command": "npx",
      "args": ["crawlforge-mcp-server"]
    }
  }
}
```

Step 4: Restart Claude Code

For the changes to take effect, restart Claude Code:

  1. If Claude Code is running, exit it (type exit or press Ctrl+C)
  2. Start it again:
```bash
claude
```

You should see CrawlForge listed in the available MCP tools.

Step 5: Your First Web Scrape!

Now for the fun part - let's test it! Start Claude Code and try these commands:

Example 1: Fetch a Simple Webpage

Fetch the content from https://example.com

Claude will use the fetch_url tool (1 credit) and show you the HTML content.

Example 2: Extract Clean Text

Extract the main text content from https://news.ycombinator.com

Claude will use extract_text (1 credit) to get clean, readable text without HTML tags.

Example 3: Get All Links from a Page

List all the links on https://crawlforge.dev

Claude will use extract_links (1 credit) to find and list every link.

Example 4: Search the Web

Search the web for "best practices for web scraping in 2025"

Claude will use search_web (5 credits) to find relevant results.

Example 5: Deep Research

Research the latest developments in AI and summarize your findings

Claude will use deep_research (10 credits) to search multiple sources, verify information, and synthesize a comprehensive answer.

Understanding Credits

CrawlForge uses a credit-based system. Each tool costs a certain number of credits:

| Tool Type | Credits | Examples |
|-----------|---------|----------|
| Basic | 1 | fetch_url, extract_text, extract_links, extract_metadata |
| Advanced | 2-3 | scrape_structured, summarize_content, analyze_content |
| Premium | 5-10 | search_web, crawl_deep, batch_scrape, deep_research |

Your free account includes 1,000 credits - that's enough for:

  • 1,000 basic page fetches, OR
  • 100 deep research queries, OR
  • A mix of different operations

Pro tip: Start with basic tools like fetch_url and extract_text to conserve credits while learning!
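Before kicking off a large job, you can estimate what it will cost against your balance. The per-tool costs below are taken from the table above; treat them as illustrative, since actual billing is defined by CrawlForge.

```python
# Per-tool credit costs, taken from the table above (illustrative subset).
CREDIT_COSTS = {
    "fetch_url": 1,
    "extract_text": 1,
    "extract_links": 1,
    "extract_metadata": 1,
    "search_web": 5,
    "deep_research": 10,
}

def estimate_credits(plan: dict[str, int]) -> int:
    """Total credits for a planned mix of tool calls."""
    return sum(CREDIT_COSTS[tool] * count for tool, count in plan.items())

# Example: 200 fetches, 50 text extractions, 10 searches, 5 deep-research runs
plan = {"fetch_url": 200, "extract_text": 50, "search_web": 10, "deep_research": 5}
total = estimate_credits(plan)
print(total, "credits")  # 200 + 50 + 50 + 50 = 350
print(total <= 1000)     # fits comfortably in the free tier
```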

All 18 Available Tools

Here's what you can do with CrawlForge:

Basic Tools (1 credit each)

  • fetch_url - Get raw HTML from any URL
  • extract_text - Extract clean text without HTML
  • extract_links - Get all links from a page
  • extract_metadata - Get SEO metadata, Open Graph tags, etc.

Content Tools (2-3 credits)

  • scrape_structured - Extract data using CSS selectors
  • extract_content - Smart article/main content extraction
  • summarize_content - AI-powered summarization
  • analyze_content - Sentiment, language, and topic analysis

Site Tools (2-5 credits)

  • map_site - Discover all pages on a website
  • crawl_deep - Crawl multiple pages with depth control
  • batch_scrape - Process many URLs at once

Research Tools (5-10 credits)

  • search_web - Web search via Google/DuckDuckGo
  • deep_research - Multi-stage research with verification

Advanced Tools (3-10 credits)

  • stealth_mode - Anti-detection browsing
  • scrape_with_actions - Browser automation (clicks, forms)
  • process_document - Extract text from PDFs
  • localization - Geo-targeted scraping
  • generate_llms_txt - Create AI interaction guidelines

Troubleshooting

"Command not found: npx"

Make sure Node.js is installed:

```bash
node --version
npx --version
```

If not, download from nodejs.org.
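If you prefer to check all of the required commands at once, a small script like this just looks each one up on your PATH:

```python
import shutil

def missing_tools(required: list[str]) -> list[str]:
    """Return the commands from `required` that are not found on PATH."""
    return [cmd for cmd in required if shutil.which(cmd) is None]

if __name__ == "__main__":
    gaps = missing_tools(["node", "npm", "npx"])
    if gaps:
        print("Missing:", ", ".join(gaps), "- install Node.js from nodejs.org")
    else:
        print("All required commands found.")
```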

"API key not found"

Run the setup again:

```bash
# Assumed setup entry point - see crawlforge.dev/docs if this differs
npx crawlforge-mcp-server setup
```

Or manually check your config file at ~/.crawlforge/config.json.

"Insufficient credits"

Check your balance at crawlforge.dev/dashboard. You can:

  • Upgrade to a paid plan for more credits
  • Purchase additional credit packs

Claude Code doesn't see CrawlForge

  1. Make sure you restarted Claude Code after configuration
  2. Check that the MCP server is properly configured
  3. Try running npx crawlforge-mcp-server manually to see if there are errors

What's Next?

Now that you have CrawlForge set up, here are some ideas:

  1. Build a research workflow - Ask Claude to research topics and compile reports
  2. Monitor competitors - Track changes on competitor websites
  3. Collect data - Extract product information, prices, or reviews
  4. Create content - Gather information for blog posts or documentation

Pricing Plans

When you need more credits:

| Plan | Credits/Month | Price | Best For |
|------|---------------|-------|----------|
| Free | 1,000 | $0 | Testing & learning |
| Hobby | 5,000 | $19/mo | Personal projects |
| Professional | 50,000 | $99/mo | Production use |
| Business | 250,000 | $399/mo | Heavy usage |

Credits never expire and roll over month-to-month on paid plans!

Need Help?

  • Documentation: crawlforge.dev/docs
  • GitHub Issues: github.com/mysleekdesigns/crawlforge-mcp/issues
  • Email: support@crawlforge.dev
  • Discord: Join our community

Ready to start? You've got 1,000 free credits waiting for you at crawlforge.dev/signup. Happy scraping!

Tags

Claude Code · MCP · Installation · Beginners · Tutorial · npm


