
Integration Guide

ScraperCity + Claude Code

Pull B2B leads from the Lead Database using natural language prompts in your terminal. Claude Code handles the API calls, pagination, and CSV export.

What You Need

ScraperCity API Key

Generate one at app.scrapercity.com/dashboard/api-docs. Requires the $649/mo plan.

Claude Code

Install with npm install -g @anthropic-ai/claude-code. Requires Node.js 18+.

Anthropic Account

Claude Pro, Max, or API access to run Claude Code.

Setup (2 Minutes)

1. Set your API key as an environment variable

Add this to your shell profile (~/.bashrc, ~/.zshrc, etc.):

export SCRAPERCITY_API_KEY="your_api_key_here"

Alternatively, create a .env file in your project directory. Claude Code reads .env files automatically.
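If you go the .env route, the file holds just the one key/value pair (same placeholder value as above, no `export` keyword):

```
SCRAPERCITY_API_KEY=your_api_key_here
```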

2. Open Claude Code and give it context

Launch Claude Code in your terminal and paste this context block at the start of your session. This tells Claude Code everything it needs to call the API correctly:

I need you to call the ScraperCity Lead Database API.

Endpoint: GET https://app.scrapercity.com/api/v1/database/leads
Auth: Bearer $SCRAPERCITY_API_KEY
Max 100 results per request. Paginate by incrementing the page param: ?page=1, then ?page=2, and so on.
Response shape: { data: [...leads], pagination: { page, limit, total, totalPages }, rate_limit: { daily_used, daily_limit } }

Available filters (all query params):
- title, seniority, department, industry, country, state, city
- companyName, companyDomain, companySize
- minEmployees, maxEmployees
- hasEmail=true, hasPhone=true
- page, limit (max 100)

Array params use repeated keys: ?seniority=vp&seniority=director
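As a sanity check, a single request URL built from those rules might look like this sketch (the filter values are illustrative, not required):

```python
from urllib.parse import urlencode

BASE = "https://app.scrapercity.com/api/v1/database/leads"

# Array params use repeated keys, so pass a list of pairs rather than a dict.
params = [("seniority", "vp"), ("seniority", "director"),
          ("hasEmail", "true"), ("limit", "100"), ("page", "1")]

url = f"{BASE}?{urlencode(params)}"
```

Passing a sequence of pairs to `urlencode` is what preserves the repeated `seniority` keys; a plain dict would collapse them to one value.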

3. Ask for what you want in plain English

Now just tell Claude Code what leads you want. It will build the API call, paginate through results, and save the output.

Example Prompts

Find all CTOs in the United States with email addresses. Pull every page and save to ctos-usa.csv.

Claude Code will loop through all pages at 100 per request and write a CSV with name, email, title, company, and LinkedIn URL.
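The pagination loop Claude Code generates reduces to something like this sketch (response shape per the context block; `fetch_page` stands in for the actual HTTP call):

```python
def fetch_all(fetch_page, limit=100):
    """Collect leads from every page; fetch_page(page, limit) returns a parsed response dict."""
    leads, page = [], 1
    while True:
        resp = fetch_page(page, limit)
        leads.extend(resp["data"])
        # totalPages comes back in the pagination object on every response.
        if page >= resp["pagination"]["totalPages"]:
            break
        page += 1
    return leads
```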

Query the lead database for VPs and Directors in the sales department at companies with 200-1000 employees in California. Deduplicate by email and save as a CSV.

Uses multiple seniority values, employee range filters, and deduplication logic. Claude Code handles all of it.
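The deduplication step is a few lines of logic; this sketch assumes each lead dict carries an `email` field, as in the response shape above:

```python
def dedupe_by_email(leads):
    """Keep the first lead seen for each (case-insensitive) email address."""
    seen, unique = set(), []
    for lead in leads:
        email = (lead.get("email") or "").strip().lower()
        if email and email not in seen:
            seen.add(email)
            unique.append(lead)
    return unique
```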

I have a file called domains.txt with one company domain per line. For each domain, query the lead database for up to 5 contacts with email addresses. Save the combined results to enriched-contacts.csv with a 200ms delay between requests.

Claude Code reads the file, loops through each domain, calls the API with companyDomain filter, and combines everything.
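The per-domain loop is a rate-limited iteration like this sketch, where `fetch_contacts` is a placeholder for the companyDomain-filtered API call:

```python
import time

def enrich_domains(domains, fetch_contacts, per_domain=5, delay=0.2):
    """For each domain, pull up to per_domain contacts, pausing between requests."""
    combined = []
    for domain in domains:
        combined.extend(fetch_contacts(domain, per_domain))
        time.sleep(delay)  # 200ms courtesy delay between API calls
    return combined
```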

Write a bash script that pulls all leads where industry is "computer software" and country is "United States" with emails, paginates through everything, and saves to leads-YYYY-MM-DD.csv. Print the daily rate limit remaining when done.

Generates a reusable script you can run on a cron. Claude Code reads the rate_limit object from each response to track usage.

Tips

Add the context block to a CLAUDE.md file in your project root. Claude Code reads this file automatically at the start of every session, so you never have to re-paste the API details.

Ask Claude Code to check rate_limit.daily_used in the response and stop before hitting 100,000. This prevents wasted requests if you are running large pulls.
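The guard you are asking for amounts to a check like this sketch (field names per the rate_limit object in the context block; `reserve` is an illustrative safety margin, not part of the API):

```python
def within_budget(resp, reserve=100):
    """True if more requests can run without exhausting the daily quota."""
    rl = resp["rate_limit"]
    return rl["daily_used"] + reserve <= rl["daily_limit"]
```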

For recurring pulls, ask Claude Code to generate a standalone bash or Python script. Run it on a cron and you have a daily lead pipeline without needing Claude Code in the loop.

Use hasEmail=true on almost every query. Leads without email addresses are not useful for outbound campaigns.

FAQ
