
Integration Guide

ScraperCity + Cursor

Scrape B2B leads without leaving Cursor. Add ScraperCity as an MCP server and Cursor can pull Apollo contacts, extract Google Maps businesses, find and validate emails, look up phone numbers, and export results to CSV - all through natural language in Composer.

What Is a Cursor MCP Server?

Model Context Protocol (MCP) is an open standard that lets Cursor's AI agent talk to external tools and data sources. Without MCP, Cursor can only work with files open in your editor. With an MCP server connected, Cursor's Agent can call APIs, query databases, and run scrapers as part of any task - all without you switching tabs or writing boilerplate HTTP code.

ScraperCity runs as a local MCP server via npx. When you open Composer in Agent mode, Cursor automatically discovers every ScraperCity tool - Apollo scraping, Google Maps extraction, email finding, email validation, phone lookup, skip tracing, and more - and decides which ones to call based on your prompt. You write plain English; Cursor handles the API calls, async polling, and file output.

Setup: Cursor MCP Server Configuration

Step 1: Get your ScraperCity API key

Sign up at app.scrapercity.com and copy your API key from the API Docs page. Plans start at $49/mo and include access to all 25 scrapers.

Step 2: Add the MCP config to your project

Create or edit .cursor/mcp.json in your project root. This is a project-scoped config - it only activates when you open that project in Cursor. To make ScraperCity available in every project instead, use ~/.cursor/mcp.json in your home directory.

// .cursor/mcp.json
{
  "mcpServers": {
    "scrapercity": {
      "command": "npx",
      "args": ["-y", "--package", "scrapercity", "scrapercity-mcp"],
      "env": {
        "SCRAPERCITY_API_KEY": "your_api_key_here"
      }
    }
  }
}

Replace your_api_key_here with your actual key. The env block passes the key to the MCP server process as an environment variable. Note that the key still lives in mcp.json itself, so if the project is shared, add .cursor/mcp.json to .gitignore to keep the key out of version control.

Step 3: Restart Cursor and verify the connection

After saving the config, restart Cursor. Navigate to Cursor Settings > Tools & MCP. You should see scrapercity listed with a green dot indicating it is connected. A red dot means the server failed to start - see the Troubleshooting section below.

You can also verify quickly by opening a new Composer session in Agent mode and asking: "What ScraperCity tools do you have access to?" Cursor will list all available scrapers.

Step 4: Open Composer in Agent mode and start scraping

Open Composer (Cmd+I on Mac, Ctrl+I on Windows), switch to Agent mode, and describe what you need in plain English. MCP tools only run in Agent mode - they are not available in normal Cursor chat. Cursor will ask for confirmation before running each tool call unless you enable Yolo mode in settings.

What You Can Ask Cursor

Scrape all real estate agents in Miami from Google Maps. Save to data/miami-agents.csv and write a Python script that sends each one a personalized email using their business name.

Cursor scrapes the data, saves the CSV, then builds a working email script in the same session - with full context of the scraped fields.

Find the business email for every company in my targets.csv file. Add an email column to the file with the results.

Cursor reads the CSV, calls the ScraperCity email finder for each row, and writes the enriched file back. All inside the IDE.

Build me a Node.js script that queries the ScraperCity API for marketing managers at SaaS companies, validates their emails, and pushes them to HubSpot via the Contacts API.

Cursor writes the complete script with proper error handling, rate limiting, and field mapping - using the ScraperCity tools to test the API calls live.

Check my ScraperCity wallet balance. If I have credits, scrape Apollo for 1000 CTOs at companies with 50-200 employees in the UK.

Cursor checks the balance first, then conditionally runs the scrape. Results download as a CSV into your project directory.

Scrape the top 200 Shopify stores selling fitness equipment. For each one, find the owner's email and validate it. Export everything to leads/fitness-stores.csv.

Cursor chains three ScraperCity tools in sequence: Store Leads for the Shopify data, Email Finder for the contacts, Email Validator to check deliverability. One prompt, three API calls, one output file.

I have a list of 50 LinkedIn URLs in prospects.txt. Look up the mobile number for each one and add it to a new column.

Cursor reads the file, calls the Mobile Finder tool for each URL, and writes the enriched output back to your project. Results typically come back within 1-5 minutes per contact.

Available ScraperCity Tools in Cursor

Every ScraperCity scraper is available through the MCP server. Cursor's Agent picks the right tool automatically based on your prompt, or you can name a specific tool explicitly.

| Tool | What It Does | Cost | Time |
| --- | --- | --- | --- |
| Apollo Scraper | B2B contacts by title, industry, and location | $0.0039/lead | 11-48+ hrs |
| Google Maps | Local businesses with phones, emails, and reviews | $0.01/place | 5-30 min |
| Email Finder | Business email from name + company domain | $0.05/contact | 1-10 min |
| Email Validator | Verify deliverability, catch-all detection, MX records | $0.0036/email | 1-10 min |
| Mobile Finder | Phone numbers from LinkedIn profile or email | $0.25/input | 1-5 min |
| People Finder | Skip trace by name, email, phone, or address | $0.02/result | 2-10 min |
| Store Leads | Shopify/WooCommerce stores with owner contacts | $0.0039/lead | Instant |
| BuiltWith | All websites using a specific technology | $4.99/search | 1-5 min |
| Website Finder | Contact info scraped from any domain | Per domain | 5-15 min |
| Yelp | Business listings from Yelp by category and city | $0.01/listing | 5-15 min |
| Zillow Agents | Real estate agent listings and contact data | Per agent | 5-15 min |
| Property Lookup | Property data and owner contact information | $0.15/address | 2-10 min |
| Lead Database | Query 3M+ B2B contacts instantly ($649/mo plan) | Included | Instant |
| Wallet | Check your ScraperCity credit balance | Free | Instant |

Common Workflows

The real power of connecting ScraperCity to Cursor is combining scraping with code generation in a single session. Here are complete end-to-end workflows you can run in one Composer thread.

Email validation pipeline

  1. Upload a CSV of contacts to your project
  2. Ask Cursor to validate every email address using ScraperCity Email Validator ($0.0036/email)
  3. Cursor adds a deliverability column (valid / invalid / catch-all) to the file
  4. Ask Cursor to generate a Python script that filters the list and uploads valid addresses to Mailchimp via API

This pattern removes bad addresses before you send - protecting sender reputation without leaving the IDE.
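The filtering in step 4 takes only a few lines. A minimal sketch of the filter logic - the deliverability column and its status values mirror step 3, and the Mailchimp upload itself is left to the script Cursor generates:

```python
def filter_valid(rows, status_column="deliverability"):
    """Keep only rows the validator marked 'valid'.

    Rows flagged 'invalid' or 'catch-all' are dropped before upload.
    """
    return [row for row in rows if row.get(status_column) == "valid"]

# Rows shaped like the enriched CSV from step 3:
contacts = [
    {"email": "a@example.com", "deliverability": "valid"},
    {"email": "b@example.com", "deliverability": "invalid"},
    {"email": "c@example.com", "deliverability": "catch-all"},
]
valid = filter_valid(contacts)
print(len(valid))  # 1
```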

Google Maps to outreach campaign

  1. Ask Cursor to scrape plumbers in Chicago from Google Maps (returns phones, emails, reviews, addresses)
  2. Ask Cursor to validate the email addresses from the scrape results
  3. Ask Cursor to write a personalized cold email template using each business's name and review count
  4. Ask Cursor to build a Node.js script that sends the campaign via SendGrid

Cursor keeps context of the scraped fields throughout - it knows the column names when writing the email template and the send script.
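The template step boils down to a small render function. A sketch - the field names name and review_count are assumptions, so match them to the actual columns in your Google Maps scrape:

```python
def render_email(business):
    """Fill a cold-email template with fields from the scraped row.

    'name' and 'review_count' are assumed column names - adjust to
    whatever your scrape output actually uses.
    """
    return (
        f"Hi {business['name']} team,\n\n"
        f"I noticed you have {business['review_count']} reviews on Google Maps - "
        "clearly your customers are happy..."
    )

print(render_email({"name": "Windy City Plumbing", "review_count": 212}))
```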

Apollo enrichment + CRM push

  1. Ask Cursor to scrape VP of Sales contacts at logistics companies with 100-500 employees from Apollo
  2. Cursor polls for results automatically (Apollo scrapes take 11-48+ hours - Cursor handles the wait)
  3. Once complete, ask Cursor to find mobile numbers for the highest-priority contacts
  4. Ask Cursor to write a script that pushes all contacts to your CRM via its API with correct field mapping

Set a webhook at app.scrapercity.com/dashboard/webhooks to get notified when long-running Apollo scrapes finish.
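A webhook receiver only needs to parse the completion payload and decide what to do next. A sketch of that handler - the payload shape (run_id, status) is an assumption, so check the actual schema in your webhook settings:

```python
import json

def handle_webhook(raw_body):
    """Parse a ScraperCity completion webhook and pick the next action.

    The payload fields 'run_id' and 'status' are assumed - verify the
    real schema at app.scrapercity.com/dashboard/webhooks.
    """
    payload = json.loads(raw_body)
    if payload.get("status") == "completed":
        return {"action": "download", "run_id": payload["run_id"]}
    return {"action": "wait", "run_id": payload.get("run_id")}

result = handle_webhook('{"run_id": "abc123", "status": "completed"}')
```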

Why Cursor for Lead Scraping

IDE context

Cursor sees your full project. Ask it to scrape leads AND build the application that uses them in the same conversation.

Agent mode

Composer Agent mode lets Cursor chain multiple tool calls autonomously - scrape, validate, enrich, and export in one prompt.

Multi-model support

Cursor works with Claude, GPT, and other models. All of them can call ScraperCity tools through MCP regardless of which model you pick.

Project-level config

.cursor/mcp.json scopes the ScraperCity connection to specific projects. Different projects can use different API keys.

File read and write

Cursor reads your existing CSVs, enriches them with ScraperCity data, and writes the results back - no copy-paste between tools.

Terminal access

In Agent mode, Cursor can run terminal commands alongside ScraperCity tool calls - install dependencies, run scripts, and verify output in the same thread.

Performance Tips

Keep your active MCP server list short

Cursor loads tool descriptions for every enabled MCP server into the agent's context. With too many servers active at once, the agent gets slower at picking the right tool. Disable MCP servers you are not actively using via Cursor Settings > Tools & MCP.

Batch your email validation requests

When validating large lists, ask Cursor to validate in batches of 100-200 at a time rather than one by one. This keeps individual tool calls fast and makes it easy to retry a batch if a call fails without reprocessing the entire list.
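The batching itself is trivial to express. A sketch of the chunking helper Cursor might generate, assuming a default batch size of 150:

```python
def batches(items, size=150):
    """Split a list into consecutive chunks for batched tool calls.

    A failed batch can then be retried alone instead of re-validating
    the entire list.
    """
    for start in range(0, len(items), size):
        yield items[start:start + size]

emails = [f"user{i}@example.com" for i in range(450)]
chunks = list(batches(emails, size=150))
print(len(chunks))  # 3 batches of 150
```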

Use webhooks for Apollo scrapes

Apollo results take 11-48+ hours. Instead of leaving Cursor polling overnight, set up a webhook at app.scrapercity.com/dashboard/webhooks to receive a notification when the results are ready. Then start a new Cursor session to download and process them.

Check your wallet before large scrapes

Ask Cursor to check your ScraperCity wallet balance before running expensive scrapes. The Wallet tool is free and instant, and it confirms you have enough credits before committing to a large Apollo run.

Use project-scoped keys for team repos

If multiple people are using the same codebase, each person should set their own SCRAPERCITY_API_KEY in their local .cursor/mcp.json. Add .cursor/mcp.json to .gitignore to prevent API keys from being committed.
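The ignore rule is a single line. A minimal .gitignore fragment for this setup:

```
# Keep per-developer API keys out of the repo
.cursor/mcp.json
```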

Troubleshooting

ScraperCity tools don't appear in Cursor Agent

Check that the server shows a green dot in Cursor Settings > Tools & MCP. If it shows a red dot, open the Output panel (Cmd+Shift+U on Mac, Ctrl+Shift+U on Windows), select MCP from the dropdown, and read the startup error. The most common causes are: Node.js not installed or not on your PATH, a missing or incorrect SCRAPERCITY_API_KEY, or a JSON syntax error in mcp.json.

Tools are visible in settings but Cursor never calls them

Make sure you are using Agent mode in Composer. MCP tools are only called in Agent mode - they are not available in regular Cursor chat. Switch the mode toggle in the Composer panel to Agent before sending your prompt.

Server shows a green dot but tools keep failing

Test your API key by running npx scrapercity wallet in your terminal. If that returns an error, the key is invalid or expired - generate a new one from app.scrapercity.com/dashboard/api-docs. Also confirm that the env block in mcp.json is using the key name SCRAPERCITY_API_KEY exactly as shown.

Cursor asks to run a tool but the call times out

For most scrapers this is expected behavior - Apollo scrapes take 11-48+ hours and Cursor may time out waiting. Ask Cursor to save the run ID to a file in your project, then start a new session later and ask it to poll that run ID and download results. You can also set a webhook to get notified when the scrape completes.
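The save-and-poll pattern can be sketched as a small loop. Because the exact ScraperCity status endpoint and response shape are not documented here, the fetch function is injected by the caller (e.g. a requests.get wrapper) - these names are assumptions:

```python
def poll_until_complete(run_id, fetch_status, wait, max_attempts=10):
    """Poll a long-running scrape until it reports completion.

    fetch_status(run_id) is assumed to return a dict with a 'status'
    key; wait() is called between checks (e.g. time.sleep(300)).
    """
    for _ in range(max_attempts):
        result = fetch_status(run_id)
        if result.get("status") == "completed":
            return result
        wait()
    raise TimeoutError(f"run {run_id} not finished after {max_attempts} checks")

# Simulated run that completes on the third check:
responses = iter([{"status": "running"}, {"status": "running"},
                  {"status": "completed", "csv": "results.csv"}])
done = poll_until_complete("abc123", lambda rid: next(responses), wait=lambda: None)
print(done["csv"])  # results.csv
```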

mcp.json changes are not taking effect

Cursor loads the MCP config on startup. After editing mcp.json, you must fully restart Cursor (not just reload the window) for changes to take effect. You can also click the Reload button next to the server in Cursor Settings > Tools & MCP.

Duplicate request error from the ScraperCity API

ScraperCity blocks identical requests submitted within 30 seconds to prevent duplicate charges. If Cursor retries a failed tool call immediately with the same parameters, it will get a duplicate protection error. Wait 30 seconds before retrying, or change a parameter slightly (e.g. adjust the result limit by 1).
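A retry helper that waits out the dedupe window looks roughly like this. The 30-second figure comes from the duplicate protection described above; the clock and sleep functions are injected so the sketch stays testable:

```python
import time

DEDUPE_WINDOW = 30  # seconds ScraperCity blocks identical requests

def retry_after_window(submit, clock=time.monotonic, sleep=time.sleep):
    """Retry a failed submission only after the 30-second dedupe window.

    submit() is assumed to raise on failure; on the retry path we sleep
    out whatever remains of the window before resubmitting.
    """
    started = clock()
    try:
        return submit()
    except Exception:
        remaining = DEDUPE_WINDOW - (clock() - started)
        if remaining > 0:
            sleep(remaining)  # wait out the dedupe window before retrying
        return submit()
```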
