Google Sheets

Export scraped data directly to Google Sheets. Perfect for price monitoring, lead collection, and data analysis.

Integration Methods

Direct Export

Send scraping results directly to Sheets from the Scrpy dashboard.

Zapier

Use Zapier for automated, no-code workflows.

Apps Script

Write a custom Google Apps Script for advanced automation.

Python

Use the gspread library together with the Scrpy SDK.

Google Apps Script Method

Add this script to your Google Sheet (Extensions → Apps Script):

javascript
const SCRPY_API_KEY = 'sk_live_xxxxx';

function scrapeToSheet() {
  const sheet = SpreadsheetApp.getActiveSpreadsheet().getActiveSheet();
  
  const response = UrlFetchApp.fetch('https://api.scrpy.co/v1/scrape', {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer ' + SCRPY_API_KEY,
      'Content-Type': 'application/json'
    },
    payload: JSON.stringify({
      url: 'https://example.com/products',
      selectors: {
        name: '.product-name',
        price: '.price | parseNumber',
        url: 'a::attr(href)'
      }
    })
  });
  
  const result = JSON.parse(response.getContentText());
  
  if (result.success) {
    const data = result.data;
    sheet.appendRow([
      new Date(),
      data.name,
      data.price,
      data.url
    ]);
  }
}

// Set up daily trigger
function createTrigger() {
  ScriptApp.newTrigger('scrapeToSheet')
    .timeBased()
    .everyDays(1)
    .atHour(9)
    .create();
}
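
Run createTrigger() once from the Apps Script editor to install the daily trigger; the first run will prompt you to authorize the script's access to external services and your spreadsheet.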

Python + gspread Method

python
import gspread
from datetime import datetime
from oauth2client.service_account import ServiceAccountCredentials
from scrpy import Scrpy

# Setup Google Sheets (service account credentials from the Google Cloud console)
scope = [
    'https://spreadsheets.google.com/feeds',
    'https://www.googleapis.com/auth/drive'
]
creds = ServiceAccountCredentials.from_json_keyfile_name('credentials.json', scope)
gc = gspread.authorize(creds)
sheet = gc.open('Scraping Results').sheet1

# Setup Scrpy
client = Scrpy(api_key='sk_live_xxxxx')

# Scrape and export
result = client.scrape(
    url='https://example.com/products',
    selectors={
        'name': '.product-name',
        'price': '.price | parseNumber'
    }
)

# Append a row: timestamp first, then the scraped fields
sheet.append_row([
    datetime.now().isoformat(),
    result.data['name'],
    result.data['price']
])
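
Note: share the spreadsheet with the service account's client_email from credentials.json, otherwise gc.open() raises SpreadsheetNotFound. Both gspread and oauth2client install from PyPI (pip install gspread oauth2client).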

Bulk Export from Jobs

python
from scrpy import Scrpy
import gspread

client = Scrpy(api_key='sk_live_xxxxx')

# Authorize with the same service account credentials as above
gc = gspread.service_account(filename='credentials.json')
sheet = gc.open('Products').sheet1

# Create bulk job
job = client.jobs.create(
    name='Product Scrape',
    urls=[f'https://store.com/product/{i}' for i in range(1, 101)],
    selectors={
        'name': '.product-name',
        'price': '.price | parseNumber',
        'sku': '.sku'
    }
)

# Wait for completion
job = client.jobs.wait(job.id)

# Collect all results as rows
rows = []
for result in job.results:
    rows.append([
        result.url,
        result.data.get('name', ''),
        result.data.get('price', ''),
        result.data.get('sku', '')
    ])

# Batch update for efficiency (one API call)
sheet.append_rows(rows)
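
Because append_rows() writes every row in a single API request, a 100-URL job stays well within the Sheets API write quotas that per-row append_row() calls can quickly exhaust.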

Common Use Cases

💰 Price Monitoring

Track competitor prices daily with automatic spreadsheet updates. Create charts to visualize trends.

📋 Lead Lists

Build contact lists from directories with names, emails, and company info.

🏠 Real Estate Data

Collect property listings with prices, locations, and features for market analysis.

📰 Content Aggregation

Aggregate headlines and articles from multiple sources into one sheet.
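
A minimal sketch of the aggregation case, reusing the client and sheet objects from the Python example above; the source URLs and selectors are illustrative placeholders, not real endpoints.

python
from datetime import datetime

# Placeholder sources, each mapped to its own selectors
sources = {
    'https://news-site-a.example.com': {'headline': 'h2.headline'},
    'https://news-site-b.example.com': {'headline': '.story-title'},
}

rows = []
for url, selectors in sources.items():
    result = client.scrape(url=url, selectors=selectors)
    rows.append([
        datetime.now().isoformat(),   # when it was scraped
        url,                          # source URL for traceability
        result.data.get('headline', '')
    ])

# One batched write instead of one call per source
sheet.append_rows(rows)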

Best Practices

Add Timestamps

Always include a timestamp column to track when data was scraped.

Use Batch Updates

For large datasets, use append_rows() instead of individual append_row() calls.

Include Source URL

Store the source URL for each row to trace data back to its origin.
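
A minimal sketch tying these three practices together, reusing the job and sheet objects from the bulk-export example above:

python
from datetime import datetime

def result_to_row(result):
    """Build one sheet row: timestamp first, scraped fields, source URL last."""
    return [
        datetime.now().isoformat(),        # Add Timestamps
        result.data.get('name', ''),
        result.data.get('price', ''),
        result.url                         # Include Source URL
    ]

# Use Batch Updates: build all rows in memory, then write once
rows = [result_to_row(r) for r in job.results]
sheet.append_rows(rows)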

Related