
How to Use CrawlForge with Vercel AI SDK

CrawlForge Team
Engineering Team
April 7, 2026
8 min read


The Vercel AI SDK gives you a unified interface for calling LLMs from any framework -- Next.js, SvelteKit, Nuxt, or plain Node. CrawlForge gives your LLM access to live web data through 18 specialized scraping tools. Together, they let you build AI applications that can search, fetch, extract, and analyze web content in real time.

This tutorial shows you how to register CrawlForge tools with the Vercel AI SDK's tool() API, so your AI can scrape the web as naturally as it generates text.

Table of Contents

  • Prerequisites
  • How It Works: Tools in the Vercel AI SDK
  • Step 1: Create the CrawlForge Tool Wrapper
  • Step 2: Register Tools with generateText
  • Step 3: Build a Streaming Chat with Web Access
  • Advanced: Multi-Tool Research Agent
  • Credit Cost Breakdown
  • Best Practices
  • Next Steps

Prerequisites


Get your CrawlForge API key at crawlforge.dev/signup -- 1,000 free credits included.
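A typical setup, assuming a Node.js 18+ project and the OpenAI provider (any AI SDK provider works), might look like this:

```shell
# Install the Vercel AI SDK core, a model provider, and Zod for tool schemas
npm install ai @ai-sdk/openai zod

# API keys (the CrawlForge key comes from crawlforge.dev/signup; values are placeholders)
export CRAWLFORGE_API_KEY="cf_..."
export OPENAI_API_KEY="sk-..."
```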

How It Works: Tools in the Vercel AI SDK

The Vercel AI SDK lets you define tools that the LLM can invoke during a conversation. Each tool has a name, description, parameter schema (using Zod), and an execute function. The LLM reads the tool descriptions, decides when to call them, and incorporates the results into its response.

CrawlForge's REST API maps perfectly to this pattern: each of the 18 tools becomes a callable function with typed parameters.

Step 1: Create the CrawlForge Tool Wrapper

First, build a reusable helper that calls any CrawlForge endpoint:

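One possible shape for this helper -- the base URL and bearer-token auth scheme below are assumptions for illustration, so substitute the endpoint shown in your CrawlForge dashboard:

```typescript
// lib/crawlforge.ts
// Hypothetical base URL; replace with the endpoint from your CrawlForge dashboard.
const CRAWLFORGE_BASE = "https://api.crawlforge.dev/v1";

export const crawlForgeUrl = (toolName: string) => `${CRAWLFORGE_BASE}/${toolName}`;

// Calls any CrawlForge tool endpoint with JSON params and bearer-token auth.
export async function callCrawlForge(
  toolName: string,
  params: Record<string, unknown>,
): Promise<unknown> {
  const res = await fetch(crawlForgeUrl(toolName), {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.CRAWLFORGE_API_KEY}`,
    },
    body: JSON.stringify(params),
  });
  if (!res.ok) {
    throw new Error(`CrawlForge ${toolName} failed: ${res.status} ${await res.text()}`);
  }
  return res.json();
}
```

Every tool-specific wrapper below can delegate to this one function, so adding another of the 18 tools is a one-liner.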

Step 2: Register Tools with generateText

Now register CrawlForge tools using the Vercel AI SDK's tool() function:


When the LLM encounters a question that requires web data, it automatically invokes the right CrawlForge tool, receives the results, and weaves them into its response. No manual orchestration needed.

Step 3: Build a Streaming Chat with Web Access

For a real-time chat experience, use streamText instead of generateText:


The maxSteps: 5 parameter lets the LLM chain multiple tool calls -- for example, searching the web first, then extracting content from the top result.

Frontend Component


Advanced: Multi-Tool Research Agent

Combine multiple CrawlForge tools for deeper research workflows:


This agent will:

  1. Search Google for relevant articles (5 credits)
  2. Extract content from the top results (2 credits each)
  3. Analyze the content for themes (3 credits each)

Total cost for a 3-source research run: ~20 credits (5 + 3×2 + 3×3).

Credit Cost Breakdown

Tool                Credits  Best For
fetch_url           1        Raw HTML retrieval
extract_text        1        Clean text extraction
extract_links       1        Link discovery
extract_metadata    1        Page metadata (title, OG tags)
extract_content     2        Readable content extraction
scrape_structured   2        CSS selector-based data extraction
summarize_content   2        Text summarization
analyze_content     3        Topic and sentiment analysis
search_web          5        Google search results
deep_research       10       Multi-source research with citations

See the full pricing breakdown for all 18 tools.

Best Practices

Minimize credit usage. Use extract_content (2 credits) instead of deep_research (10 credits) when you only need one page. The tool descriptions guide the LLM toward the cheapest option that satisfies the query.

Set maxSteps carefully. A higher maxSteps value allows more tool calls per response but consumes more credits. Start with 3-5 and increase only if the LLM consistently needs more.

Cache frequently accessed pages. If your app repeatedly fetches the same URL, cache the CrawlForge response in Redis or a database rather than re-fetching each time.
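A minimal in-memory TTL cache illustrates the idea (a sketch only; swap the Map for Redis or a database in production):

```typescript
// Cache CrawlForge responses so repeat lookups spend zero credits.
type CacheEntry = { value: unknown; expires: number };
const cache = new Map<string, CacheEntry>();

export async function cachedFetch(
  key: string,
  ttlMs: number,
  fetcher: () => Promise<unknown>,
): Promise<unknown> {
  const hit = cache.get(key);
  if (hit && hit.expires > Date.now()) return hit.value; // cache hit: no API call
  const value = await fetcher();
  cache.set(key, { value, expires: Date.now() + ttlMs });
  return value;
}
```

Wrap each tool's execute function with cachedFetch, keyed on the tool name plus its parameters.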

Use Zod descriptions. The .describe() strings on your Zod parameters help the LLM understand what values to pass. Be specific: "The full URL including https://" is better than "URL".

Next Steps

You now have a Vercel AI SDK application with live web access. Expand from here:

  • Add the remaining CrawlForge tools (18 in total) as needed
  • Implement stealth mode for sites with anti-bot protection
  • Build a deep research agent for automated market analysis
  • Check the CrawlForge Quick Start for MCP client setup

Build AI apps that see the web. Start free with 1,000 credits -- no credit card required.

Tags

vercel-ai-sdk, nextjs, integration, web-scraping, ai-engineering, tutorial, typescript

About the Author


CrawlForge Team

Engineering Team

Building the most comprehensive web scraping MCP server. We create tools that help developers extract, analyze, and transform web data for AI applications.


Related Articles

  • How to Use CrawlForge with LangGraph Agents (Apr 24, 8 min read) -- Build stateful web scraping agents with LangGraph and CrawlForge. TypeScript guide covering graph nodes, state management, and conditional scraping flows.
  • How to Use CrawlForge with Mastra AI Agents (Apr 21, 7 min read) -- Build AI agents with web scraping capabilities using Mastra and CrawlForge. TypeScript setup guide with tool integration, workflows, and agent examples.
  • How to Use CrawlForge with Anthropic Claude API (Apr 15, 9 min read) -- Connect CrawlForge web scraping tools to the Claude API via tool_use. Build AI applications with live web data using TypeScript and Claude Sonnet.


© 2025-2026 CrawlForge. All rights reserved.