A powerful and open-source Google Maps scraper for extracting business data at scale. Available as CLI, Web UI, REST API, or deployable to Kubernetes/AWS Lambda.
| Feature | Description |
|---|---|
| Free & Open Source | MIT licensed, no hidden costs or usage limits |
| Multiple Interfaces | CLI, Web UI, REST API - use what fits your workflow |
| High Performance | ~120 places/minute with optimized concurrency |
| 33+ Data Points | Business details, reviews, emails, coordinates, and more |
| Production Ready | Scale from a single machine to Kubernetes clusters |
| Flexible Output | CSV, JSON, PostgreSQL, S3, or custom plugins |
| Proxy Support | Built-in SOCKS5/HTTP/HTTPS proxy rotation |
- Quick Start
- Installation
- Features
- Extracted Data Points
- Configuration
- Advanced Usage
- Performance
- Contributing
- License
Start the web interface with a single command:

```bash
mkdir -p gmapsdata && docker run -v $PWD/gmapsdata:/gmapsdata -p 8080:8080 google-maps-scraper -data-folder /gmapsdata
```

Then open http://localhost:8080 in your browser.
Note: Results take at least 3 minutes to appear (minimum configured runtime).
macOS Users: Docker command may not work. See MacOS Instructions.
```bash
touch results.csv && docker run \
  -v $PWD/example-queries.txt:/example-queries \
  -v $PWD/results.csv:/results.csv \
  google-maps-scraper \
  -depth 1 \
  -input /example-queries \
  -results /results.csv \
  -exit-on-inactivity 3m
```

Tip: Use `google-maps-scraper:latest-rod` for the Rod version with faster container startup.
Want emails? Add the `-email` flag.
Want all reviews (up to ~300)? Add the `-extra-reviews` flag and use `-json` output.
When running the web server, a full REST API is available:
| Endpoint | Method | Description |
|---|---|---|
| `/api/v1/jobs` | POST | Create a new scraping job |
| `/api/v1/jobs` | GET | List all jobs |
| `/api/v1/jobs/{id}` | GET | Get job details |
| `/api/v1/jobs/{id}` | DELETE | Delete a job |
| `/api/v1/jobs/{id}/download` | GET | Download results as CSV |
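As a sketch, the endpoints above can be driven from a small client. The paths come from the table, and the base URL matches the web server's default address; the JSON payload fields accepted by `create_job` are an assumption for illustration, not a documented schema.

```python
# Minimal client sketch for the scraper's REST API.
# Endpoint paths are taken from the table above; the job payload
# schema passed to create_job is a hypothetical example.
import json
import urllib.request


class ScraperClient:
    def __init__(self, base_url="http://localhost:8080"):
        self.base_url = base_url.rstrip("/")

    def jobs_url(self, job_id=None, download=False):
        """Build the URL for the jobs collection or a single job."""
        url = f"{self.base_url}/api/v1/jobs"
        if job_id is not None:
            url += f"/{job_id}"
            if download:
                url += "/download"
        return url

    def create_job(self, payload):
        """POST a new scraping job (payload fields are assumptions)."""
        req = urllib.request.Request(
            self.jobs_url(),
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)


client = ScraperClient()
print(client.jobs_url("abc123", download=True))
# → http://localhost:8080/api/v1/jobs/abc123/download
```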
Two Docker image variants are available:
| Image | Tag | Browser Engine | Best For |
|---|---|---|---|
| Playwright (default) | `latest`, `vX.X.X` | Playwright | Most users, better stability |
| Rod | `latest-rod`, `vX.X.X-rod` | Rod/Chromium | Lightweight, faster startup |
```bash
# Playwright version (default)
docker pull google-maps-scraper

# Rod version (alternative)
docker pull google-maps-scraper:latest-rod
```

Requirements: Go 1.25.6+
```bash
git clone https://github.com/yourusername/google-maps-scraper.git
cd google-maps-scraper
go mod download

# Playwright version (default)
go build
./google-maps-scraper -input example-queries.txt -results results.csv -exit-on-inactivity 3m

# Rod version (alternative)
go build -tags rod
./google-maps-scraper -input example-queries.txt -results results.csv -exit-on-inactivity 3m
```

First run downloads required browser libraries (Playwright or Chromium depending on version).
| Feature | Description |
|---|---|
| 33+ Data Points | Business name, address, phone, website, reviews, coordinates, and more |
| Email Extraction | Optional crawling of business websites for email addresses |
| Multiple Output Formats | CSV, JSON, PostgreSQL, S3, or custom plugins |
| Proxy Support | SOCKS5, HTTP, HTTPS with authentication |
| Scalable Architecture | Single machine to Kubernetes cluster |
| REST API | Programmatic control for automation |
| Web UI | User-friendly browser interface |
| Fast Mode (Beta) | Quick extraction of up to 21 results per query |
| AWS Lambda | Serverless execution support (experimental) |
Click to expand all 33 data points
| # | Field | Description |
|---|---|---|
| 1 | `input_id` | Internal identifier for the input query |
| 2 | `link` | Direct URL to the Google Maps listing |
| 3 | `title` | Business name |
| 4 | `category` | Business type (e.g., Restaurant, Hotel) |
| 5 | `address` | Street address |
| 6 | `open_hours` | Operating hours |
| 7 | `popular_times` | Visitor traffic patterns |
| 8 | `website` | Official business website |
| 9 | `phone` | Contact phone number |
| 10 | `plus_code` | Location shortcode |
| 11 | `review_count` | Total number of reviews |
| 12 | `review_rating` | Average star rating |
| 13 | `reviews_per_rating` | Breakdown by star rating |
| 14 | `latitude` | GPS latitude |
| 15 | `longitude` | GPS longitude |
| 16 | `cid` | Google's unique Customer ID |
| 17 | `status` | Business status (open/closed/temporary) |
| 18 | `descriptions` | Business description |
| 19 | `reviews_link` | Direct link to reviews |
| 20 | `thumbnail` | Thumbnail image URL |
| 21 | `timezone` | Business timezone |
| 22 | `price_range` | Price level |
| 23 | `data_id` | Internal Google Maps identifier |
| 24 | `images` | Associated image URLs |
| 25 | `reservations` | Reservation booking link |
| 26 | `order_online` | Online ordering link |
| 27 | `menu` | Menu link |
| 28 | `owner` | Owner-claimed status |
| 29 | `complete_address` | Full formatted address |
| 30 | `about` | Additional business info |
| 31 | `user_reviews` | Customer reviews (text, rating, timestamp) |
| 32 | `emails` | Extracted email addresses (requires `-email` flag) |
| 33 | `user_reviews_extended` | Extended reviews up to ~300 (requires `-extra-reviews`) |
| 34 | `place_id` | Google's unique place id |
Custom Input IDs: Define your own IDs in the input file:

```
Matsuhisa Athens #!#MyCustomID
```
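The `#!#` marker splits each input line into a query and your custom ID. A minimal sketch of that parsing convention:

```python
# Sketch of the "#!#" input-file convention: everything before the
# marker is the search query, everything after is the custom ID.
def parse_input_line(line):
    """Split an input line into (query, custom_id or None)."""
    if "#!#" in line:
        query, custom_id = line.split("#!#", 1)
        return query.strip(), custom_id.strip()
    return line.strip(), None


print(parse_input_line("Matsuhisa Athens #!#MyCustomID"))
# → ('Matsuhisa Athens', 'MyCustomID')
```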
```text
Usage: google-maps-scraper [options]

Core Options:
  -input string                Path to input file with queries (one per line)
  -results string              Output file path (default: stdout)
  -json                        Output JSON instead of CSV
  -depth int                   Max scroll depth in results (default: 10)
  -c int                       Concurrency level (default: half of CPU cores)

Email & Reviews:
  -email                       Extract emails from business websites
  -extra-reviews               Collect extended reviews (up to ~300)

Location Settings:
  -lang string                 Language code, e.g., 'de' for German (default: "en")
  -geo string                  Coordinates for search, e.g., '37.7749,-122.4194'
  -zoom int                    Zoom level 0-21 (default: 15)
  -radius float                Search radius in meters (default: 10000)

Web Server:
  -web                         Run web server mode
  -addr string                 Server address (default: ":8080")
  -data-folder string          Data folder for web runner (default: "webdata")

Database:
  -dsn string                  PostgreSQL connection string
  -produce                     Produce seed jobs only (requires -dsn)

Proxy:
  -proxies string              Comma-separated proxy list
                               Format: protocol://user:pass@host:port

Advanced:
  -exit-on-inactivity duration Exit after inactivity (e.g., '5m')
  -fast-mode                   Quick mode with reduced data
  -debug                       Show browser window
  -writer string               Custom writer plugin (format: 'dir:pluginName')
```

Run `./google-maps-scraper -h` for the complete list.
For larger scraping jobs, proxies help avoid rate limiting. Here's how to configure them:
```bash
./google-maps-scraper \
  -input queries.txt \
  -results results.csv \
  -proxies 'socks5://user:pass@host:port,http://host2:port2' \
  -depth 1 -c 2
```

Supported protocols: socks5, socks5h, http, https
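Since a malformed `-proxies` value only surfaces at runtime, it can help to validate the list up front. As a sketch (not part of the scraper itself), the snippet below splits the comma-separated value and checks each entry against the supported schemes listed above:

```python
# Sketch: validate a -proxies value of the form
# 'protocol://user:pass@host:port,...' before invoking the CLI.
from urllib.parse import urlsplit

SUPPORTED_SCHEMES = {"socks5", "socks5h", "http", "https"}


def split_proxies(value):
    """Split the comma-separated list, rejecting unsupported schemes."""
    proxies = []
    for raw in value.split(","):
        entry = raw.strip()
        scheme = urlsplit(entry).scheme
        if scheme not in SUPPORTED_SCHEMES:
            raise ValueError(f"unsupported proxy scheme: {scheme!r}")
        proxies.append(entry)
    return proxies
```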
Email extraction is disabled by default. When enabled, the scraper visits each business website to find email addresses.
```bash
./google-maps-scraper -input queries.txt -results results.csv -email
```

Note: Email extraction increases processing time significantly.
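Conceptually, `-email` crawls each business website and pulls out email-looking strings. The regex sketch below illustrates that idea only; it is not the scraper's actual extraction code.

```python
# Illustrative sketch of email extraction from a fetched page.
# The real scraper's crawling logic is more involved than this.
import re

EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")


def extract_emails(html):
    """Return unique email addresses found in a page, in order."""
    seen, out = set(), []
    for match in EMAIL_RE.findall(html):
        if match.lower() not in seen:
            seen.add(match.lower())
            out.append(match)
    return out


print(extract_emails('<a href="mailto:info@example.com">info@example.com</a>'))
# → ['info@example.com']
```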
Fast mode returns up to 21 results per query, ordered by distance. Useful for quick data collection with basic fields.
```bash
./google-maps-scraper \
  -input queries.txt \
  -results results.csv \
  -fast-mode \
  -zoom 15 \
  -radius 5000 \
  -geo '37.7749,-122.4194'
```

Warning: Fast mode is in Beta. You may experience blocking.
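To make the `-geo`/`-radius` semantics concrete: results center on the given coordinates within the given radius in meters. The haversine sketch below checks whether a point falls inside such a radius; it mirrors the flags' meaning and is not the scraper's own filtering code.

```python
# Sketch: is a point within `radius_m` meters of a (lat, lon) center?
# Uses the haversine great-circle distance with a mean Earth radius.
import math

EARTH_RADIUS_M = 6371000


def within_radius(center, point, radius_m):
    """True if `point` lies within `radius_m` meters of `center`."""
    lat1, lon1 = map(math.radians, center)
    lat2, lon2 = map(math.radians, point)
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    distance_m = 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
    return distance_m <= radius_m


# San Francisco center; a point ~1.1 km north is inside a 5 km radius.
print(within_radius((37.7749, -122.4194), (37.7849, -122.4194), 5000))
# → True
```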
For distributed scraping across multiple machines:
1. Start PostgreSQL:

```bash
docker-compose -f docker-compose.dev.yaml up -d
```

2. Seed the jobs:

```bash
./google-maps-scraper \
  -dsn "postgres://postgres:postgres@localhost:5432/postgres" \
  -produce \
  -input example-queries.txt \
  -lang en
```

3. Run scrapers (on multiple machines):

```bash
./google-maps-scraper \
  -c 2 \
  -depth 1 \
  -dsn "postgres://postgres:postgres@localhost:5432/postgres"
```

On Kubernetes, run the scrapers as a Deployment pointed at the same database:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: google-maps-scraper
spec:
  replicas: 3  # Adjust based on needs
  selector:
    matchLabels:
      app: google-maps-scraper
  template:
    metadata:
      labels:
        app: google-maps-scraper
    spec:
      containers:
        - name: google-maps-scraper
          image: google-maps-scraper:latest
          args: ["-c", "1", "-depth", "10", "-dsn", "postgres://user:pass@host:5432/db"]
          resources:
            requests:
              memory: "512Mi"
              cpu: "500m"
```

Note: The headless browser requires significant CPU/memory resources.
Create custom output handlers using Go plugins:
1. Write the plugin (see examples/plugins/example_writer.go)
2. Build:

```bash
go build -buildmode=plugin -tags=plugin -o myplugin.so myplugin.go
```

3. Run:

```bash
./google-maps-scraper -writer ~/plugins:MyWriter -input queries.txt
```

Expected throughput: ~120 places/minute (with `-c 8 -depth 1`)
| Keywords | Results/Keyword | Total Jobs | Estimated Time |
|---|---|---|---|
| 100 | 16 | 1,600 | ~13 minutes |
| 1,000 | 16 | 16,000 | ~2.5 hours |
| 10,000 | 16 | 160,000 | ~22 hours |
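The estimates above follow from simple arithmetic: total jobs (keywords × results per keyword) divided by the ~120 places/minute throughput, plus some overhead. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope runtime estimate for the table above:
# keywords * results_per_keyword / throughput = minutes of scraping.
def estimated_minutes(keywords, results_per_keyword=16, places_per_minute=120):
    return keywords * results_per_keyword / places_per_minute


for n in (100, 1_000, 10_000):
    print(f"{n} keywords -> ~{estimated_minutes(n):.1f} minutes")
```

For 100 keywords this gives ~13.3 minutes, matching the table; the larger rows in the table round up, allowing for overhead.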
For large-scale scraping, use the PostgreSQL provider with Kubernetes.
Contributions are welcome! Please:
- Open an issue to discuss your idea
- Fork the repository
- Create a pull request
See AGENTS.md for development guidelines.
This project is licensed under the MIT License.
Please use this scraper responsibly and in accordance with applicable laws and regulations. Unauthorized scraping may violate terms of service.
