Integration Guide
Add ScraperCity as an MCP server in Cline and scrape B2B data without leaving VS Code. Pull Apollo contacts, extract Google Maps businesses, find and validate emails, look up phone numbers, and export results - all through natural language prompts inside your editor.
Cline is a VS Code extension that turns your editor into an autonomous AI coding agent. It reads your files, writes code, runs terminal commands, and - through the Model Context Protocol (MCP) - calls external tools and APIs on your behalf. MCP servers are the mechanism that connects Cline to outside data sources. Once you register an MCP server in Cline, it becomes a callable tool available to any prompt you write.
ScraperCity publishes an MCP server that exposes all 25 of its B2B data scrapers as Cline-callable tools. That means you can ask Cline to scrape leads, validate emails, run a reverse phone number lookup, or execute a skip trace - and Cline handles the API call, async polling, result download, and file export as part of a single workflow, right alongside your code.
Cline supports multiple AI model providers - Anthropic Claude, OpenAI GPT-4o, Google Gemini, and others. The ScraperCity MCP server works identically regardless of which model you have configured. You also keep Cline's plan-and-approve step, so you can review every ScraperCity tool call before it executes and before any credits are spent.
Cline stores MCP server config in cline_mcp_settings.json. Add the ScraperCity block and restart.
Search for Cline in the VS Code Extensions panel and install it. Once installed, the Cline icon appears in the activity bar. Open the Cline pane and configure your model provider and API key before proceeding.
Log in to your ScraperCity account and navigate to app.scrapercity.com/dashboard/api-docs to copy your API key. All plans ($49/mo, $149/mo, $649/mo) include full API access to every scraper.
In the Cline pane, click the MCP Servers icon in the top navigation bar, then select the Configure tab and click Configure MCP Servers. This opens cline_mcp_settings.json. Add the ScraperCity block inside the mcpServers object:
{
  "mcpServers": {
    "scrapercity": {
      "command": "npx",
      "args": ["-y", "--package", "scrapercity", "scrapercity-mcp"],
      "env": {
        "SCRAPERCITY_API_KEY": "your_api_key_here"
      }
    }
  }
}

Replace your_api_key_here with your actual key. Save the file.
After saving, Cline detects the config change. Click the Restart Server button next to the scrapercity entry in the MCP Servers panel. A green indicator means the server is connected. You can now mention ScraperCity tasks in any Cline prompt and the tool will appear in Cline's plan step before execution.
These examples show the kinds of natural language prompts Cline handles end-to-end using the ScraperCity MCP server.
“Scrape 500 restaurants in Chicago from Google Maps. Save results to data/chicago-restaurants.csv and tell me how many have email addresses.”
Cline calls the Maps scraper ($0.01/place), waits for completion (5-30 min), downloads the CSV, reads it, and reports the email coverage.
“Read the company list in targets.txt and find the marketing director email for each company using ScraperCity. Write a Python script that does this with proper rate limiting.”
Cline builds a reusable script that reads the file, calls the email finder ($0.05/contact) per company, handles errors, and writes results to a new CSV.
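The script Cline generates for this prompt might look like the sketch below. The base URL, endpoint path, payload fields, and response shape here are assumptions for illustration - check the ScraperCity API docs for the real contract.

```python
import csv
import json
import os
import time
import urllib.request

API_KEY = os.environ.get("SCRAPERCITY_API_KEY", "")
BASE_URL = "https://api.scrapercity.com"  # hypothetical base URL
RATE_LIMIT_SECONDS = 1.5                  # conservative spacing between calls

def read_companies(text: str) -> list[str]:
    """One company name per line, blank lines skipped."""
    return [line.strip() for line in text.splitlines() if line.strip()]

def find_email(company: str) -> dict:
    """Submit one email-finder request.

    The endpoint path and payload shape are assumptions, not the
    documented contract.
    """
    payload = json.dumps({"title": "Marketing Director", "company": company}).encode()
    req = urllib.request.Request(
        f"{BASE_URL}/v1/email-finder",  # assumed path
        data=payload,
        headers={"Authorization": f"Bearer {API_KEY}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def run(in_path: str = "targets.txt", out_path: str = "found_emails.csv") -> None:
    companies = read_companies(open(in_path).read())
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["company", "email", "status"])
        for company in companies:
            try:
                result = find_email(company)
                writer.writerow([company, result.get("email", ""), "ok"])
            except Exception as exc:
                writer.writerow([company, "", f"error: {exc}"])
            time.sleep(RATE_LIMIT_SECONDS)  # simple fixed-interval rate limiting

if __name__ == "__main__":
    run()
```

The fixed sleep between calls is the simplest rate-limiting strategy; Cline may instead generate exponential backoff on errors depending on how you phrase the prompt.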
“Check my ScraperCity wallet balance, then scrape Apollo for all VPs of Engineering at companies with 200+ employees in Texas. Validate the emails before saving.”
Cline chains wallet check, Apollo scrape ($0.0039/lead, 11-48+ hours), email validation ($0.0036/email), and CSV export into one autonomous workflow.
“Pull all Shopify stores selling outdoor gear and find the owner contact info for each one. Export a spreadsheet I can upload to my CRM.”
Cline calls the Store Leads scraper ($0.0039/lead) for ecommerce store discovery, then chains the Website Finder or Email Finder for each domain to enrich owner contacts.
“I have a list of 200 email addresses in leads.csv. Validate each one for deliverability and flag any catch-all or invalid addresses in a new column.”
Cline reads the CSV, batches the addresses through the ScraperCity email validation API ($0.0036/email), and rewrites the file with a status column for each record.
“Run a skip trace on the names in prospects.csv using their city and state, then write the results back to the same file with phone and address columns added.”
Cline calls the People Finder (skip trace) tool ($0.02/result) for each row, polls for results, and merges the returned contact fields back into the original spreadsheet.
Every ScraperCity tool is available as a Cline-callable MCP tool once the server is configured. Here is what you can instruct Cline to use:
Apollo Leads - B2B contacts by title, industry, location
Google Maps - Local businesses with phones, emails, reviews
Email Validator - Deliverability, catch-all, MX record checks
Email Finder - Business email from name + company
Mobile Finder - Phone numbers from LinkedIn or email
People Finder (Skip Trace) - Contact data by name, email, phone, or address
Store Leads - Shopify/WooCommerce stores with contacts
BuiltWith - All sites using a given technology stack
Lead Database - 3M+ B2B contacts, instant query
Property Lookup - Property data and owner contact info
Yelp - Business listings from Yelp
YouTube Email - Business emails for YouTube channels
Full catalog includes 25 scrapers: Airbnb Email, Angi, Zillow Agents, BizBuySell, Crexi, Criminal Records, Website Finder, and more. All accessible through the same Cline MCP config.
ScraperCity's email validation API checks each address for format validity, MX record existence, catch-all domain behavior, and SMTP deliverability - returning a clean status for each contact. At $0.0036 per email, you can validate a list of 10,000 addresses for under $40.
Through Cline, validation fits naturally into broader workflows. Instead of exporting a CSV, uploading it to a standalone tool, and reimporting, you ask Cline to validate inline. It reads your leads file, submits addresses to the ScraperCity validation endpoint, waits for the 1-10 minute processing window, and writes the results back into the same file with a deliverability status column.
A typical prompt for bulk validation looks like this:
"I exported 3,000 leads from our CRM into leads_export.csv.
Validate every email address using ScraperCity and add a
'email_status' column with the result. Flag catch-all and
invalid addresses so I can filter them before sending."

Cline will batch the validation requests, handle polling, and update the CSV. You can chain this directly after an Apollo scrape or Google Maps extraction so you finish with only deliverable contacts in your output file.
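The CSV rewrite step Cline performs at the end of this workflow can be sketched as a small pure function. The email_status column name and the status labels ('valid', 'catch_all', 'invalid') are assumptions about the validation response, not the documented values.

```python
import csv
import io

def add_status_column(csv_text: str, statuses: dict) -> str:
    """Append an email_status column to CSV text, looking each row's
    email up in a {email: status} map returned by the validation job.
    Unmatched addresses are marked 'unknown'."""
    reader = csv.DictReader(io.StringIO(csv_text))
    rows = list(reader)
    fields = list(reader.fieldnames) + ["email_status"]
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fields)
    writer.writeheader()
    for row in rows:
        row["email_status"] = statuses.get(row["email"], "unknown")
        writer.writerow(row)
    return out.getvalue()
```

Once the file carries a status column, filtering to deliverable rows before an email send is a one-line operation in any spreadsheet tool.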
ScraperCity's People Finder is a skip trace API that accepts any of: full name + city/state, email address, phone number, or physical address - and returns available contact data including phone numbers, addresses, and associated records at $0.02 per result.
The Mobile Finder tool complements this with a dedicated reverse phone number lookup: provide a LinkedIn profile URL or email address and it returns associated mobile numbers at $0.25 per input. This is particularly useful for enriching B2B contacts where direct dials are missing.
Through Cline, both tools integrate into your existing data files without a separate app or manual upload step:
"I have a spreadsheet of real estate investors in prospects.xlsx
with name, city, and state columns. Run a skip trace on each
row using ScraperCity People Finder and add phone, email, and
full address columns to the output file."

Cline reads the file, iterates through each record, submits it to the ScraperCity skip trace endpoint, polls through the 2-10 minute processing window, and merges the returned fields back into the spreadsheet. You end up with an enriched file ready for outreach without writing a line of code yourself.
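The merge step at the end of that flow can be sketched as below. The field names (phone, email, full_address) are assumptions about the People Finder response shape; a missing or failed lookup leaves the new columns blank rather than dropping the row.

```python
def merge_trace_fields(rows, results, keys=("phone", "email", "full_address")):
    """Merge skip-trace results (one per input row, in order) back into
    the original rows, adding blank values where a lookup returned
    nothing. Input rows are not mutated."""
    merged = []
    for row, result in zip(rows, results):
        enriched = dict(row)
        for key in keys:
            enriched[key] = (result or {}).get(key, "")
        merged.append(enriched)
    return merged
```

Keeping rows and results positionally aligned means the original spreadsheet order survives the enrichment, which matters when you write the file back in place.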
Because Cline operates autonomously within VS Code, it can chain multiple ScraperCity tools together in a single session. Here are workflows developers and sales teams use regularly.
Google Maps Scraper to Validated Email List
Apollo Lead Scrape to CRM-Ready File
Ecommerce Store Prospecting Pipeline
Most issues connecting Cline to the ScraperCity MCP server fall into one of these categories.
Server shows a red indicator or 'connection closed' error
Open cline_mcp_settings.json and confirm the SCRAPERCITY_API_KEY value has no extra spaces or quotes. Save the file and click Restart Server in the Cline MCP Servers panel. If the error persists, confirm that Node.js (v18+) is installed by running node --version in your terminal.
'npx' command not found on Windows
Cline spawns MCP server processes using the PATH available to VS Code, which can differ from your terminal PATH. On Windows, try using the full path to npx.cmd (e.g., C:\\Program Files\\nodejs\\npx.cmd) as the command value in your config, or ensure Node.js is in the system PATH rather than a user-only PATH variable.
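With an absolute path, the server block looks like the sketch below. The path shown is the default Node.js install location - adjust it to wherever node --version reports your installation lives.

```json
{
  "mcpServers": {
    "scrapercity": {
      "command": "C:\\Program Files\\nodejs\\npx.cmd",
      "args": ["-y", "--package", "scrapercity", "scrapercity-mcp"],
      "env": {
        "SCRAPERCITY_API_KEY": "your_api_key_here"
      }
    }
  }
}
```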
Tool call does not appear in Cline's plan step
Confirm the server shows a green indicator in the MCP Servers panel. If the server is connected but tools are missing, try disabling and re-enabling the server using the toggle switch next to the scrapercity entry. Cline re-fetches the tool list on each enable.
API key error / 401 Unauthorized from ScraperCity
Your API key is found at app.scrapercity.com/dashboard/api-docs. Paste it directly into the SCRAPERCITY_API_KEY field in cline_mcp_settings.json with no extra whitespace. Check that your ScraperCity plan is active and your wallet has a positive balance before running paid scrapers.
Request timeout on long-running Apollo scrapes
Apollo scrapes take 11-48+ hours. Cline's default MCP network timeout is 1 minute. For Apollo jobs, increase the Network Timeout setting in the Cline MCP server config panel (accessible by clicking the scrapercity server entry). Set it to the maximum available value and let Cline poll the status endpoint rather than waiting for a single synchronous response.
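The polling pattern Cline falls back to can be sketched as a small loop. The 'completed'/'failed' status labels and the shape of the status response are assumptions about the ScraperCity API, not documented values.

```python
import time

def poll_until_done(check_status, interval=60, timeout=48 * 3600):
    """Poll a job-status callable until it reports a terminal state.

    check_status() should return a dict like {"status": ...}; the
    terminal labels 'completed' and 'failed' are assumed here.
    Raises TimeoutError if the deadline passes first.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        job = check_status()
        if job.get("status") in ("completed", "failed"):
            return job
        time.sleep(interval)
    raise TimeoutError("job did not reach a terminal state in time")
```

A generous timeout plus a modest polling interval keeps the workflow alive across a multi-hour Apollo job without holding a single HTTP request open.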
Duplicate scrape blocked / 30-second lockout
ScraperCity blocks identical requests submitted within 30 seconds of each other as duplicate protection. If you are iterating on a prompt and resubmitting the same parameters quickly, wait at least 30 seconds between attempts or slightly vary the request parameters.
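If you script resubmissions rather than prompting by hand, the spacing rule is easy to enforce with a small helper. The canonicalization below (sorted-key JSON of the parameters) is one reasonable way to detect "identical" requests; ScraperCity's own duplicate check may compare differently.

```python
import json

DEDUP_WINDOW_SECONDS = 30.0  # ScraperCity's duplicate-protection window

def seconds_until_allowed(params, last_sent, now, window=DEDUP_WINDOW_SECONDS):
    """How long to wait before resubmitting identical parameters.

    `last_sent` maps a canonical parameter string to the time it was
    last submitted; `now` is the current time on the same clock.
    Returns 0.0 when the request may be sent immediately.
    """
    key = json.dumps(params, sort_keys=True)
    last = last_sent.get(key)
    if last is None or now - last >= window:
        return 0.0
    return window - (now - last)
```

Before each submit, call this, time.sleep() the returned value, then record the submit time back into last_sent under the same canonical key.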
For long-running jobs, you can also register a webhook at app.scrapercity.com/dashboard/webhooks and have Cline include the webhook URL in its scrape requests. You get a POST notification when results are ready instead of polling.
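A minimal receiver for that POST can be sketched with the standard library alone. The job_id and download_url field names are assumptions - inspect a real delivery to see the actual payload shape.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def parse_webhook(body: bytes) -> dict:
    """Decode the JSON payload of a webhook delivery (empty body -> {})."""
    return json.loads(body or b"{}")

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = parse_webhook(self.rfile.read(length))
        # Field names below are assumed, not documented.
        print("job finished:", payload.get("job_id"), payload.get("download_url"))
        self.send_response(200)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), WebhookHandler).serve_forever()
```

In practice the URL you register must be reachable from the internet, so during development a tunneling tool in front of the local port is the usual setup.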