Coming soon — preview of how it will work

The dedicated Crawlbase Make module is in development. The setup steps below are a preview of the planned flow and may change before release. Email us to be notified when it lands.

Need it today? Use Make's built-in HTTP app and call the Crawling API directly — outputs map cleanly to downstream modules with one extra wiring step.
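As an interim sketch of that HTTP-app workaround, the request the HTTP module would send is a plain GET against the Crawling API with the token and target URL as query parameters. The helper below builds that URL; `YOUR_TOKEN` is a placeholder, and the optional `format=json` parameter is shown only as an example.

```python
from urllib.parse import urlencode

CRAWLBASE_API = "https://api.crawlbase.com/"

def build_crawl_url(token: str, target_url: str, **params) -> str:
    """Build the GET URL a Make HTTP module would call against the
    Crawling API: token, target URL, plus any optional parameters."""
    query = {"token": token, "url": target_url, **params}
    return CRAWLBASE_API + "?" + urlencode(query)

# The string you would paste into the HTTP module's URL field:
url = build_crawl_url("YOUR_TOKEN", "https://example.com", format="json")
```

In Make, you would typically keep the token in a scenario variable rather than hard-coding it in the URL field.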

Setup

  1. In your Make scenario editor, click the + button and search for Crawlbase.
  2. Pick a module (Crawl, Scrape, Screenshot, Crawler).
  3. Click Add next to Connection and paste your token. Save.
  4. Configure module fields. Outputs are structured (status, body, headers as named items) so downstream modules can map them directly.
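To make step 4 concrete, here is a hypothetical shape for a Crawl module's output bundle, based only on the "status, body, headers as named items" description above; the exact key names in the shipped module may differ.

```python
# Illustrative output bundle (key names are assumptions, not the
# shipped module's contract).
crawl_output = {
    "status": 200,                               # HTTP status of the crawl
    "headers": {"content-type": "text/html"},    # response headers, named
    "body": "<html>...</html>",                  # raw page HTML
}

# A downstream module maps these by name, e.g. a Router filter on status:
is_ok = crawl_output["status"] == 200
```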

Available modules

  • Crawl a URL — standard Crawling API with all parameters as form fields.
  • Scrape Structured Data — pick a scraper from a dropdown; the output bundles all extracted fields.
  • Capture Screenshot — returns the image as a binary that other modules (Drive, S3, email) can save or attach.
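As a standalone analogue of consuming the screenshot module's binary output (the way a Drive or S3 module would), the sketch below writes image bytes to disk. The PNG bytes are faked here so the example runs offline; in Make, the Capture Screenshot module supplies them.

```python
from pathlib import Path

def save_screenshot(data: bytes, dest: Path) -> int:
    """Write binary screenshot data to dest; return bytes written."""
    return dest.write_bytes(data)

# Stub payload: PNG magic header plus padding (16 bytes of zeros).
fake_png = b"\x89PNG\r\n\x1a\n" + b"\x00" * 16
written = save_screenshot(fake_png, Path("shot.png"))
```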

Example scenarios

  • Watch and notify: Schedule (every 6h) → Scrape product → Comparator → Email if price changed
  • Lead pipeline: Webhook → Scrape LinkedIn → HubSpot upsert
  • Visual archive: Schedule → Screenshot → Google Drive folder by date
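The comparator step in the "watch and notify" scenario above can be sketched as a single decision: email only when the scraped price has moved. The function and threshold below are illustrative, not the module's actual fields.

```python
def price_changed(previous: float, current: float, tolerance: float = 0.0) -> bool:
    """True when the freshly scraped price differs from the stored one
    by more than `tolerance` (so the Email module should fire)."""
    return abs(current - previous) > tolerance

# Stored price 19.99, newly scraped price 17.49 -> notify.
should_email = price_changed(19.99, 17.49)
```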