How to Use CrawlForge with Smithery: MCP Registry Guide

CrawlForge Team
Engineering Team
April 19, 2026
6 min read

Smithery is the largest MCP server registry, hosting over 100,000 tools and integrations for AI agents. CrawlForge is listed on Smithery, which means you can discover, install, and manage your web scraping MCP server without touching a config file manually.

This guide walks you through the complete Smithery workflow -- from searching the registry to calling CrawlForge tools in production.

Table of Contents

  • What Is Smithery?
  • Prerequisites
  • Step 1: Install the Smithery CLI
  • Step 2: Find CrawlForge on Smithery
  • Step 3: Connect to CrawlForge
  • Step 4: Browse Available Tools
  • Step 5: Call Tools from the CLI
  • Step 6: Use Namespaces for Environments
  • Credit Cost Reference
  • When to Use Smithery vs Direct Installation
  • Next Steps

What Is Smithery?

Smithery is a marketplace and registry for MCP servers. Think of it as npm for AI agent tools. Instead of manually configuring each MCP server in your client's settings file, Smithery provides a CLI and web interface to search, install, and manage MCP connections.

Smithery handles OAuth flows, credential storage, token refresh, and session lifecycle automatically. For teams running multiple MCP servers across dev and production environments, this eliminates significant configuration overhead.

Prerequisites

  • Node.js 18+ installed
  • A CrawlForge account with an API key (free tier includes 1,000 credits)
  • Terminal access (macOS, Linux, or Windows with WSL)

Step 1: Install the Smithery CLI

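The CLI ships as an npm package; a minimal install sketch, assuming the published package name is @smithery/cli:

```shell
# Install the Smithery CLI globally (requires Node.js 18+)
npm install -g @smithery/cli

# Or invoke it ad hoc without a global install
npx @smithery/cli --help
```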

Authenticate with your Smithery account:

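A sketch of the authentication step; the subcommand name is an assumption here, so check smithery --help if it is named differently in your CLI version:

```shell
# Log in to your Smithery account (opens a browser window for confirmation)
smithery login
```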

This opens your browser for confirmation. Once authenticated, the CLI stores your credentials locally.

Step 2: Find CrawlForge on Smithery

Search the Smithery registry for CrawlForge:

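The search might look like the following; treat the subcommand name as an assumption and verify it against the CLI's help output:

```shell
# Search the Smithery registry for CrawlForge
smithery search crawlforge
```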

You will see CrawlForge listed with its description, tool count (18 tools), and connection URL. You can also browse the Smithery web catalog to read reviews and documentation before installing.

Step 3: Connect to CrawlForge

Add CrawlForge as a managed MCP connection:

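A sketch of adding the connection; the subcommand and flag names are assumptions, and $CRAWLFORGE_API_KEY stands in for the key from your CrawlForge dashboard:

```shell
# Add CrawlForge as a managed MCP connection, supplying your API key
smithery connect crawlforge --key "$CRAWLFORGE_API_KEY"
```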

Verify the connection status:

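Checking the connection might look like this (subcommand name assumed):

```shell
# List managed connections and their status
# Expect a row for crawlforge with status "connected"
smithery status
```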

You should see crawlforge with status connected. If it shows auth_required, follow the authorization URL provided in the output.

Step 4: Browse Available Tools

List all 18 CrawlForge tools through Smithery:

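A sketch of browsing tools, using the tool get subcommand described below and fetch_url from the credit table as an example; the exact syntax is an assumption:

```shell
# List every CrawlForge tool exposed through Smithery
smithery tools crawlforge

# Inspect one tool's full JSON schema (parameters and response format)
smithery tool get crawlforge fetch_url
```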

The tool get command shows the full JSON schema for each tool, including required parameters, optional fields, and response format. This is useful for building automation scripts.

Step 5: Call Tools from the CLI

Execute CrawlForge tools directly from your terminal:

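A sketch of a direct tool call; the call subcommand and --params flag are assumptions, and the JSON payload follows the fetch_url tool's obvious shape:

```shell
# Call a CrawlForge tool from the terminal (costs 1 credit for fetch_url)
smithery call crawlforge fetch_url --params '{"url": "https://example.com"}'
```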

When piped, Smithery outputs JSONL, which integrates cleanly with jq and other CLI tools:

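Piping into jq might look like this; the response shape (a links array of objects with href fields) is an assumption for illustration:

```shell
# Extract all link targets from a page and filter them with jq
smithery call crawlforge extract_links --params '{"url": "https://example.com"}' \
  | jq -r '.links[].href'
```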

Step 6: Use Namespaces for Environments

Smithery namespaces let you isolate CrawlForge connections across dev, staging, and production:

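A sketch of the namespace setup; the subcommand and flag names are assumptions, as are the $CRAWLFORGE_DEV_KEY and $CRAWLFORGE_PROD_KEY variables standing in for separate API keys:

```shell
# Create isolated namespaces for each environment
smithery namespace create dev
smithery namespace create prod

# Connect CrawlForge in each namespace with its own API key
smithery connect crawlforge --namespace dev --key "$CRAWLFORGE_DEV_KEY"
smithery connect crawlforge --namespace prod --key "$CRAWLFORGE_PROD_KEY"
```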

This prevents accidental credit consumption on your production account during development.

Credit Cost Reference

Credits  Tools
1        fetch_url, extract_text, extract_links, extract_metadata
2        scrape_structured, extract_content, summarize_content, generate_llms_txt
3        map_site, process_document, analyze_content, localization
5        search_web, crawl_deep, batch_scrape, scrape_with_actions, stealth_mode
10       deep_research

The free tier includes 1,000 credits. A typical research workflow -- search (5) + extract 3 pages (6) + summarize (2) -- costs 13 credits total.

When to Use Smithery vs Direct Installation

Approach            Best For                                                                              Setup Time
Smithery            Teams managing multiple MCP servers, CI/CD pipelines, multi-environment deployments   2 minutes
Direct npm install  Solo developers, Claude Code / Cursor users who want zero dependencies                1 minute
API-only            Non-MCP clients, custom integrations, server-to-server workflows                      5 minutes

If you are already using Smithery for other MCP servers (GitHub, Slack, Notion), adding CrawlForge through Smithery keeps everything in one management layer. If CrawlForge is your only MCP server, direct installation is simpler.

Next Steps

  • CrawlForge Quick Start -- direct installation guide for Claude Code
  • 18 Web Scraping Tools Overview -- what each tool does and when to use it
  • View Pricing -- credit packs and subscription plans
  • Smithery Documentation -- full Smithery CLI reference

Ready to start scraping? Sign up free and get 1,000 credits -- no credit card required. Then install via Smithery or npm and start extracting web data in under 60 seconds.

Tags

smithery, mcp, integration, tutorial, mcp-registry, web-scraping, ai-tools

About the Author

CrawlForge Team

Engineering Team

Building the most comprehensive web scraping MCP server. We create tools that help developers extract, analyze, and transform web data for AI applications.
