Get real-time web data, automatically handle anti-bot systems, extract structured data, and capture screenshots with your favorite tools, such as Cursor or LangChain.
👉 Live web data access
👉 Anti-bot handling & proxy infrastructure
👉 Structured data extraction
👉 Screenshot capture
👉 Secure auth options
Works with your favorite AI tools: Claude Desktop, Cursor, Cline, Windsurf, LangChain, LlamaIndex, CrewAI, OpenAI function calling, n8n, Make, and Zapier.
🕸️ Job aggregation
🕸️ Price monitoring
🕸️ Content research & aggregation
🕸️ Competitor intelligence & market analysis
🕸️ Real-estate analysis
🕸️ RAG & LLM data ingestion
🕸️ Automated multi-step workflows
instructions
Purpose: Provide guidance and required parameters for successful scraping calls.
Features: Supplies best-practice recommendations, parameter selection hints, error-handling guidance, and the mandatory pow value needed before calling web_get_page or web_scrape.
Example usage:
{
  "tool": "web_get_page",
  "parameters": {
    "url": "https://news.ycombinator.com",
    "pow": "obtained_from_instruction_tool",
    "format": "markdown",
    "format_options": ["only_content"]
  }
}
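Since the pow value is mandatory for web_get_page and web_scrape, a session typically starts by calling the instructions tool itself. Assuming it follows the same parameter-less call shape as info_account (an assumption; the exact shape is not documented above), that first call might look like:

{ "tool": "instructions" }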
web_scrape
Purpose: Provide full-control, customizable scraping for complex sites and workflows.
Features: Supports browser automation, login/auth flows, custom HTTP methods and headers, cookie management, multi-step interactions, and AI-powered data extraction.
Example usage:
{
  "tool": "web_scrape",
  "parameters": {
    "url": "https://web-scraping.dev/login",
    "pow": "obtained_from_instruction_tool",
    "render_js": true,
    "js_scenario": [
      { "fill": { "selector": "input[name='username']", "value": "myuser" } },
      { "fill": { "selector": "input[name='password']", "value": "mypass" } },
      { "click": { "selector": "button[type='submit']" } },
      { "wait_for_navigation": { "timeout": 5000 } }
    ]
  }
}
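The login scenario above exercises the browser-automation side; the sketch below illustrates the custom HTTP method, header, and cookie support also listed under Features. The parameter names (method, headers, cookies, body) and the target URL are assumptions for illustration and may differ from the actual tool schema:

{
  "tool": "web_scrape",
  "parameters": {
    "url": "https://web-scraping.dev/api/login",
    "pow": "obtained_from_instruction_tool",
    "method": "POST",
    "headers": { "Content-Type": "application/json" },
    "cookies": { "session": "example-session-id" },
    "body": "{\"username\": \"myuser\", \"password\": \"mypass\"}"
  }
}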
screenshot
Purpose: Capture visual snapshots of webpages.
Features: Supports full-page screenshots or element-level captures using CSS selectors.
Example usage:
{
  "tool": "screenshot",
  "parameters": {
    "url": "https://web-scraping.dev/pricing",
    "capture": ".pricing-table",
    "format": "png",
    "options": ["load_images", "block_banners"]
  }
}
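For a full-page snapshot rather than a single element, a minimal sketch could replace the CSS selector with a full-page capture value; the literal "fullpage" value is an assumption and the accepted value may differ:

{
  "tool": "screenshot",
  "parameters": {
    "url": "https://web-scraping.dev/pricing",
    "capture": "fullpage",
    "format": "png"
  }
}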
info_account
Purpose: Retrieve real-time details about your Scrapfly account and project.
Features: Returns account metadata, project configuration, subscription status, credit usage, remaining quota, and concurrency limits.
Example usage:
{ "tool": "info_account" }
1. Locate Your Configuration File
Claude Desktop stores its configuration in a JSON file. Open the file for your operating system:
macOS:
~/Library/Application Support/Claude/claude_desktop_config.json
Windows:
%APPDATA%\Claude\claude_desktop_config.json
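If the file does not exist yet, create it with an empty server map so the next step has somewhere to add the Scrapfly entry (a minimal starting point):

{
  "mcpServers": {}
}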
2. Choose Authentication Method
Select your preferred authentication method.
OAuth2 (recommended): Add the following configuration to your claude_desktop_config.json file:
{
  "mcpServers": {
    "scrapfly": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://mcp.scrapfly.io/mcp"
      ]
    }
  }
}
API Key: Add the following configuration to your claude_desktop_config.json file:
{
  "mcpServers": {
    "scrapfly": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://mcp.scrapfly.io/mcp?key=YOUR-API-KEY"
      ]
    }
  }
}
3. Restart Claude Desktop
After saving the configuration file, completely quit and restart Claude Desktop to apply the changes.
4. Verify the Integration
After restarting, check that the MCP tools are available: