
Integration Guide

ScraperCity + Cline

Add ScraperCity as an MCP server in Cline and scrape B2B data without leaving VS Code. Pull Apollo contacts, extract Google Maps businesses, find and validate emails, look up phone numbers, and export results - all through natural language prompts inside your editor.

Cline and MCP Servers

Cline is a VS Code extension that turns your editor into an autonomous AI coding agent. It reads your files, writes code, runs terminal commands, and - through the Model Context Protocol (MCP) - calls external tools and APIs on your behalf. MCP servers are the mechanism that connects Cline to outside data sources. Once you register an MCP server in Cline, it becomes a callable tool available to any prompt you write.

ScraperCity publishes an MCP server that exposes all 25 of its B2B data scrapers as Cline-callable tools. That means you can ask Cline to scrape leads, validate emails, run a reverse phone number lookup, or execute a skip trace - and Cline handles the API call, async polling, result download, and file export as part of a single workflow, right alongside your code.

Cline supports multiple AI model providers - Anthropic Claude, OpenAI GPT-4o, Google Gemini, and others. The ScraperCity MCP server works identically regardless of which model you have configured. You also keep Cline's plan-and-approve step, so you can review every ScraperCity tool call before it executes and before any credits are spent.

Setup

Cline stores MCP server config in cline_mcp_settings.json. Add the ScraperCity block and restart.

Step 1: Install Cline from the VS Code Marketplace

Search for Cline in the VS Code Extensions panel and install it. Once installed, the Cline icon appears in the activity bar. Open the Cline pane and configure your model provider and API key before proceeding.

Step 2: Get your ScraperCity API key

Log in to your ScraperCity account and navigate to app.scrapercity.com/dashboard/api-docs to copy your API key. All plans ($49/mo, $149/mo, $649/mo) include full API access to every scraper.

Step 3: Add ScraperCity to cline_mcp_settings.json

In the Cline pane, click the MCP Servers icon in the top navigation bar, then select the Configure tab and click Configure MCP Servers. This opens cline_mcp_settings.json. Add the ScraperCity block inside the mcpServers object:

{
  "mcpServers": {
    "scrapercity": {
      "command": "npx",
      "args": ["-y", "--package", "scrapercity", "scrapercity-mcp"],
      "env": {
        "SCRAPERCITY_API_KEY": "your_api_key_here"
      }
    }
  }
}

Replace your_api_key_here with your actual key. Save the file.

Step 4: Restart the server and verify

After saving, Cline detects the config change. Click the Restart Server button next to the scrapercity entry in the MCP Servers panel. A green indicator means the server is connected. You can now mention ScraperCity tasks in any Cline prompt and the tool will appear in Cline's plan step before execution.

What You Can Ask Cline

These examples show the kinds of natural language prompts Cline handles end-to-end using the ScraperCity MCP server.

Scrape 500 restaurants in Chicago from Google Maps. Save results to data/chicago-restaurants.csv and tell me how many have email addresses.

Cline calls the Maps scraper ($0.01/place), waits for completion (5-30 min), downloads the CSV, reads it, and reports the email coverage.

Read the company list in targets.txt and find the marketing director email for each company using ScraperCity. Write a Python script that does this with proper rate limiting.

Cline builds a reusable script that reads the file, calls the email finder ($0.05/contact) per company, handles errors, and writes results to a new CSV.
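A minimal sketch of the core loop such a script might use, assuming a `lookup` callable that stands in for the actual ScraperCity email-finder request (the real endpoint and payload shape are defined in the API docs), so only the rate-limiting and error-handling skeleton is shown:

```python
import time

class RateLimiter:
    """Enforce a minimum interval between successive API calls."""
    def __init__(self, min_interval: float):
        self.min_interval = min_interval
        self._last = 0.0

    def wait(self) -> None:
        elapsed = time.monotonic() - self._last
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self._last = time.monotonic()

def find_emails(companies, lookup, min_interval=1.0):
    """Call `lookup(company)` for each company, spacing out requests.

    `lookup` is a placeholder for the real email-finder call; failures
    are recorded per row instead of aborting the whole run.
    """
    limiter = RateLimiter(min_interval)
    results = []
    for company in companies:
        limiter.wait()
        try:
            results.append({"company": company, "email": lookup(company), "error": ""})
        except Exception as exc:
            results.append({"company": company, "email": "", "error": str(exc)})
    return results
```

In the generated script, `lookup` would be replaced by the real HTTP call, and the result list written out with `csv.DictWriter`.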

Check my ScraperCity wallet balance, then scrape Apollo for all VPs of Engineering at companies with 200+ employees in Texas. Validate the emails before saving.

Cline chains wallet check, Apollo scrape ($0.0039/lead, 11-48+ hours), email validation ($0.0036/email), and CSV export into one autonomous workflow.

Pull all Shopify stores selling outdoor gear and find the owner contact info for each one. Export a spreadsheet I can upload to my CRM.

Cline calls the Store Leads scraper ($0.0039/lead) for ecommerce store discovery, then chains the Website Finder or Email Finder for each domain to enrich owner contacts.

I have a list of 200 email addresses in leads.csv. Validate each one for deliverability and flag any catch-all or invalid addresses in a new column.

Cline reads the CSV, batches the addresses through the ScraperCity email validation API ($0.0036/email), and rewrites the file with a status column for each record.

Run a skip trace on the names in prospects.csv using their city and state, then write the results back to the same file with phone and address columns added.

Cline calls the People Finder (skip trace) tool ($0.02/result) for each row, polls for results, and merges the returned contact fields back into the original spreadsheet.

Available Scrapers Through Cline

Every ScraperCity tool is available as a Cline-callable MCP tool once the server is configured. Here is what you can instruct Cline to use:

Apollo Leads

B2B contacts by title, industry, location

$0.0039/lead · 11-48+ hrs

Google Maps

Local businesses with phones, emails, reviews

$0.01/place · 5-30 min

Email Validator

Deliverability, catch-all, MX record checks

$0.0036/email · 1-10 min

Email Finder

Business email from name + company

$0.05/contact · 1-10 min

Mobile Finder

Phone numbers from LinkedIn or email

$0.25/input · 1-5 min

People Finder (Skip Trace)

Contact data by name, email, phone, or address

$0.02/result · 2-10 min

Store Leads

Shopify/WooCommerce stores with contacts

$0.0039/lead · Instant

BuiltWith

All sites using a given technology stack

$4.99/search · 1-5 min

Lead Database

3M+ B2B contacts, instant query

$649/mo plan · Instant

Property Lookup

Property data and owner contact info

$0.15/address · 2-10 min

Yelp

Business listings from Yelp

$0.01/listing · 5-15 min

YouTube Email

Business emails for YouTube channels

Per channel · 5-15 min

Full catalog includes 25 scrapers: Airbnb Email, Angi, Zillow Agents, BizBuySell, Crexi, Criminal Records, Website Finder, and more. All accessible through the same Cline MCP config.

Email Validation API Through Cline

ScraperCity's email validation API checks each address for format validity, MX record existence, catch-all domain behavior, and SMTP deliverability - returning a clean status for each contact. At $0.0036 per email, you can validate a list of 10,000 addresses for under $40.

Through Cline, validation fits naturally into broader workflows. Instead of exporting a CSV, uploading it to a standalone tool, and reimporting, you ask Cline to validate inline. It reads your leads file, submits addresses to the ScraperCity validation endpoint, waits for the 1-10 minute processing window, and writes the results back into the same file with a deliverability status column.

A typical prompt for bulk validation looks like this:

"I exported 3,000 leads from our CRM into leads_export.csv.
Validate every email address using ScraperCity and add a
'email_status' column with the result. Flag catch-all and
invalid addresses so I can filter them before sending."

Cline will batch the validation requests, handle polling, and update the CSV. You can chain this directly after an Apollo scrape or Google Maps extraction so that only deliverable contacts end up in your output file.
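The write-back step of that workflow, appending a status column to the original file, can be sketched as follows. The column names (`email`, `email_status`) and the status strings are illustrative, not the API's actual field names:

```python
import csv

def add_status_column(in_path, out_path, statuses, email_col="email",
                      status_col="email_status"):
    """Copy a CSV, appending a validation status for each row.

    `statuses` maps an email address to a status string (e.g. "valid",
    "catch_all", "invalid"); addresses with no result are marked "unknown".
    """
    with open(in_path, newline="") as f:
        rows = list(csv.DictReader(f))
    fieldnames = (list(rows[0].keys()) + [status_col]) if rows else [email_col, status_col]
    for row in rows:
        row[status_col] = statuses.get(row.get(email_col, ""), "unknown")
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)
```

The same helper works whether the statuses come from one batch call or many, so a generated script can reuse it after any validation pass.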

Skip Trace API and Phone Number Lookup Through Cline

ScraperCity's People Finder is a skip trace API that accepts any of: full name + city/state, email address, phone number, or physical address - and returns available contact data including phone numbers, addresses, and associated records at $0.02 per result.

The Mobile Finder tool complements this with a dedicated reverse phone number lookup API workflow: provide a LinkedIn profile URL or email address and it returns associated mobile numbers at $0.25 per input. This is particularly useful for enriching B2B contacts where direct dials are missing.

Through Cline, both tools integrate into your existing data files without a separate app or manual upload step:

"I have a spreadsheet of real estate investors in prospects.xlsx
with name, city, and state columns. Run a skip trace on each
row using ScraperCity People Finder and add phone, email, and
full address columns to the output file."

Cline reads the file, iterates through each record, submits it to the ScraperCity skip trace endpoint, polls during the 2-10 minute processing window, and merges the returned fields back into the spreadsheet. You end up with an enriched file ready for outreach, without writing a line of code yourself.
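The per-row submit-and-poll loop can be sketched with injected `submit` and `poll` callables standing in for the real ScraperCity API calls (illustrative names, not the actual client API):

```python
import time

def skip_trace_rows(rows, submit, poll, interval=5.0, timeout=600.0):
    """Submit each record for skip tracing and merge results back in.

    `submit(row) -> job_id` and `poll(job_id) -> dict | None` stand in
    for the real API calls; `poll` returns None while a job is still
    processing. Rows whose job times out are kept unenriched.
    """
    enriched = []
    for row in rows:
        job_id = submit(row)
        deadline = time.monotonic() + timeout
        result = None
        while time.monotonic() < deadline:
            result = poll(job_id)
            if result is not None:
                break
            time.sleep(interval)
        enriched.append({**row, **(result or {})})
    return enriched
```

Injecting the API calls keeps the loop testable offline and makes it easy to swap in the real endpoints later.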

Common Multi-Step Workflows

Because Cline operates autonomously within VS Code, it can chain multiple ScraperCity tools together in a single session. Here are workflows developers and sales teams use regularly.

Google Maps Scraper to Validated Email List

  1. Ask Cline to scrape Google Maps for a category and location (e.g., "HVAC contractors in Phoenix")
  2. Cline scrapes all business listings including name, phone, and website ($0.01/place)
  3. Cline then calls the Email Finder for each domain to find the owner/contact email ($0.05/contact)
  4. Cline runs all found emails through the Email Validator ($0.0036/email)
  5. The final CSV contains only deliverable contacts, ready for outreach
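The steps above can be sketched as one composed function, with `find_email` and `validate` standing in for the Email Finder and Email Validator calls (illustrative names, not the actual client API):

```python
def maps_to_validated_list(places, find_email, validate):
    """Chain the tools: for each scraped place, look up an email,
    validate it, and keep only deliverable contacts.

    `find_email(domain) -> str | None` and `validate(email) -> str`
    are placeholders for the real API calls; "valid" is an assumed
    deliverable status string.
    """
    out = []
    for place in places:
        domain = place.get("website", "")
        if not domain:
            continue  # no site to look up an email for
        email = find_email(domain)
        if email and validate(email) == "valid":
            out.append({**place, "email": email})
    return out
```
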

Apollo Lead Scrape to CRM-Ready File

  1. Provide a job title, industry, and location target to Cline
  2. Cline submits an Apollo scrape request ($0.0039/lead, 11-48+ hour delivery)
  3. Cline sets a webhook or polls the status endpoint until complete
  4. On completion, Cline downloads the CSV and validates all email addresses
  5. Cline formats the output columns to match your CRM import template
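For the polling in step 3, a capped exponential backoff is a sensible schedule for a job that can run 11-48+ hours; the defaults below are illustrative choices, not ScraperCity recommendations:

```python
def backoff_schedule(base=60.0, factor=2.0, cap=1800.0):
    """Yield polling delays (seconds) that grow geometrically up to a cap.

    Polling a multi-hour job every minute wastes requests; backing off
    from 1 minute to a 30-minute ceiling is a reasonable compromise.
    """
    delay = base
    while True:
        yield min(delay, cap)
        delay *= factor
```

A polling loop would draw the next sleep interval from this generator between status checks until the job reports completion.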

Ecommerce Store Prospecting Pipeline

  1. Ask Cline to pull Shopify or WooCommerce stores in a niche using Store Leads ($0.0039/lead)
  2. Cline enriches each domain with the Website Finder for contact details
  3. Cline adds a BuiltWith check on target domains to confirm the tech stack ($4.99/search)
  4. Cline writes a final prospect list with store URL, contact, and technology data

Troubleshooting

Most issues connecting Cline to the ScraperCity MCP server fall into one of these categories.

Server shows a red indicator or 'connection closed' error

Open cline_mcp_settings.json and confirm the SCRAPERCITY_API_KEY value has no extra spaces or quotes. Save the file and click Restart Server in the Cline MCP Servers panel. If the error persists, confirm that Node.js (v18+) is installed by running node --version in your terminal.

'npx' command not found on Windows

Cline spawns MCP server processes using the PATH available to VS Code, which can differ from your terminal PATH. On Windows, try using the full path to npx.cmd (e.g., C:\\Program Files\\nodejs\\npx.cmd) as the command value in your config, or ensure Node.js is in the system PATH rather than a user-only PATH variable.

Tool call does not appear in Cline's plan step

Confirm the server shows a green indicator in the MCP Servers panel. If the server is connected but tools are missing, try disabling and re-enabling the server using the toggle switch next to the scrapercity entry. Cline re-fetches the tool list on each enable.

API key error / 401 Unauthorized from ScraperCity

Your API key is found at app.scrapercity.com/dashboard/api-docs. Paste it directly into the SCRAPERCITY_API_KEY field in cline_mcp_settings.json with no extra whitespace. Check that your ScraperCity plan is active and your wallet has a positive balance before running paid scrapers.

Request timeout on long-running Apollo scrapes

Apollo scrapes take 11-48+ hours. Cline's default MCP network timeout is 1 minute. For Apollo jobs, increase the Network Timeout setting in the Cline MCP server config panel (accessible by clicking the scrapercity server entry). Set it to the maximum available value and let Cline poll the status endpoint rather than waiting for a single synchronous response.

Duplicate scrape blocked / 30-second lockout

ScraperCity blocks identical requests submitted within 30 seconds of each other as duplicate protection. If you are iterating on a prompt and resubmitting the same parameters quickly, wait at least 30 seconds between attempts or slightly vary the request parameters.
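A client-side guard that mirrors this window lets scripts fail fast locally instead of hitting the server-side block; this is a local convenience sketch, not part of the ScraperCity API:

```python
import time

class DuplicateGuard:
    """Refuse to resubmit an identical request fingerprint within a window.

    Mirrors ScraperCity's 30-second duplicate protection on the client
    side. `clock` is injectable for testing.
    """
    def __init__(self, window=30.0, clock=time.monotonic):
        self.window = window
        self.clock = clock
        self._seen = {}

    def allow(self, fingerprint: str) -> bool:
        now = self.clock()
        last = self._seen.get(fingerprint)
        if last is not None and now - last < self.window:
            return False
        self._seen[fingerprint] = now
        return True
```

A fingerprint could be a hash of the scraper name plus its sorted parameters; any stable serialization of the request works.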
