`supabase-lead-ops`: Supabase PostgreSQL operations for lead pipeline management

# Supabase Lead Operations

You have access to the LeadsPanther Supabase PostgreSQL database via the lead-engine CLI. **Never write lead data to markdown files.** All lead data goes through the database.

## Connection

- **DATABASE_URL** is set in the environment (`.env`)
- **Lead engine path**: `C:/Users/Administrator/.openclaw/workspace/leadspanther-lead-engine`
- All CLI commands below target the lead-engine package via `npm --prefix $LE run <script> -- <args>`, so they work from any directory

## Database Schema

### Core Tables
- **leads** — 37 columns: id, source, company_name, website, phone, email, address1, city, state, zip, gbp_rating, gbp_review_count, gbp_category, gbp_maps_url, gbp_analysis (jsonb), ready_for_outreach (bool), ready_for_hubspot (bool), hubspot_company_id, hubspot_synced_at, etc.
- **clients** — id, name, tier (basic/standard/premium), lead_cap_per_day, delivery_method (sheets/hubspot/webhook)
- **lead_assignments** — lead_id → client_id mapping (exclusive)
- **exclusivity_locks** — address-level exclusivity (90-day default)
- **runs** — batch run logs
- **hubspot_tokens** — OAuth tokens per portal
- **conversations** — messaging threads
- **messages** — individual messages with direction, payload, handling status
- **telephony_events** — call/SMS events
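
For ad-hoc inspection, the schema above can be queried directly. A minimal sketch, assuming `psql` is installed and `DATABASE_URL` is exported; column names are taken from the `leads` table above, and the `echo` keeps it a dry run:

```shell
# Build the query once so it can be reused; columns come from the leads schema.
sql="SELECT id, company_name, city, gbp_rating, gbp_review_count
FROM leads
WHERE ready_for_outreach = true
ORDER BY gbp_review_count DESC NULLS LAST
LIMIT 20"
# Dry run -- swap echo for the real call: psql "$DATABASE_URL" -c "$sql"
echo "$sql"
```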

## CLI Commands

All commands use `--prefix` so they work from **any directory**. Set the path once in a POSIX shell such as Git Bash (the `$LE` references below are shell expansions, so `cmd.exe`'s `set` will not work):

```
export LE=C:/Users/Administrator/.openclaw/workspace/leadspanther-lead-engine
```

### Import Leads
```
npm --prefix $LE run import:google-maps -- --query "roofing company" --location "Dallas, TX" --pages 3 --num 20
npm --prefix $LE run import:google-maps -- --query "dental office" --counties "Tarrant,Dallas,Collin,Denton" --pages 2
```
- Inserts leads into `leads` table with dedup (ON CONFLICT DO NOTHING)
- Sources: `google_places` (if GOOGLE_PLACES_API_KEY set) or `google_maps_serper` (fallback)
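
The dedup behaviour can be illustrated in SQL. This is a sketch only: the real column list and unique constraint live in the lead engine, and the values here are invented:

```shell
# Illustrative insert showing the ON CONFLICT DO NOTHING dedup pattern;
# with no conflict target, any unique-constraint violation is skipped silently.
dedup_sql="INSERT INTO leads (source, company_name, city, state)
VALUES ('google_maps_serper', 'Acme Roofing', 'Dallas', 'TX')
ON CONFLICT DO NOTHING"
echo "$dedup_sql"
```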

### Full Pipeline (import → enrich → analyze → ready → hubspot)
```
npm --prefix $LE run pipeline:run -- --counties "Tarrant,Dallas,Collin,Denton" --pages 2
```
This runs all 7 steps in sequence: import → places details → emails → website screenshots → GBP analysis → compute ready → HubSpot sync.
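
Long multi-step runs occasionally fail on a network step. A small retry wrapper can make unattended runs more robust; a sketch, assuming the CLI exits non-zero on failure:

```shell
# retry N CMD... -- rerun CMD up to N times until it exits 0.
retry() {
  local attempts=$1; shift
  local n=0
  until "$@"; do
    n=$((n + 1))
    if (( n >= attempts )); then
      return 1
    fi
    sleep "$n"   # simple linear backoff between attempts
  done
}
# Usage: retry 3 npm --prefix "$LE" run pipeline:run -- --counties "Tarrant,Dallas,Collin,Denton" --pages 2
```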

### Individual Enrichment Steps
```
npm --prefix $LE run enrich:places-details -- --limit 200
npm --prefix $LE run enrich:emails -- --limit 100
npm --prefix $LE run enrich:website-screens -- --limit 50
npm --prefix $LE run analyze:gbp -- --limit 1000
npm --prefix $LE run compute:ready -- --limit 5000
npm --prefix $LE run hubspot:import-google-maps -- --limit 200
```

### Query Pipeline Metrics
```
npm --prefix $LE run query:pipeline
npm --prefix $LE run query:pipeline -- --format json
npm --prefix $LE run query:pipeline -- --ready    # only ready_for_outreach leads
npm --prefix $LE run query:pipeline -- --recent 7 # last 7 days
```
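
The JSON output can feed other scripts. A minimal sketch, using a hypothetical `ready` field as a stand-in for the real payload; inspect the actual `--format json` output before relying on field names:

```shell
# Stand-in for: metrics=$(npm --prefix "$LE" run query:pipeline -- --format json)
metrics='{"total_leads": 1200, "ready": 340}'   # hypothetical shape
# Extract a single numeric field with sed (jq is cleaner if available).
ready=$(printf '%s' "$metrics" | sed -n 's/.*"ready"[: ]*\([0-9][0-9]*\).*/\1/p')
echo "$ready"
```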

### Export
```
npm --prefix $LE run google:export-companies   # export to Google Sheets
npm --prefix $LE run hubspot:import-companies  # import to HubSpot
```

## Lead Lifecycle

1. **Import**: Agent discovers businesses → `import:google-maps` inserts into DB
2. **Enrich**: `enrich:places-details` fills phone/website/address from Google Places API
3. **Email**: `enrich:emails` scrapes contact pages for email addresses
4. **Analyze**: `analyze:gbp` generates GBP analysis (rating, flags, size guess, summary)
5. **Ready Gate**: `compute:ready` sets `ready_for_outreach` and `ready_for_hubspot` flags
6. **HubSpot Sync**: `hubspot:import-google-maps` pushes ready leads to HubSpot CRM
7. **Assignment**: Leads assigned to clients via `lead_assignments` table

## Important Rules

- **Never log leads to .md files** — always use the lead-engine CLI
- After importing new leads, always run the enrichment pipeline
- Respect rate limits: max 100 businesses per import session
- Budget enrichment: max 25 companies + 10 people per session via Crustdata
- Log all credit usage and report to Friedrich (Finance)
- When querying for outreach targets, filter by `ready_for_outreach = true`
- Report pipeline metrics via `query:pipeline` — do not manually count from files

## Verticals to Rotate (DFW)
dental, HVAC, plumbing, landscaping, cleaning, lawyers, accountants, roofing, insurance, real estate, recruiting, consulting, auto repair, med spas, chiropractic
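
One way to rotate is to derive the day's vertical from the day of year. A sketch, assuming `$LE` is set as above; the `echo` keeps it a dry run:

```shell
# Pick one vertical per day, cycling through the list.
verticals=(dental HVAC plumbing landscaping cleaning lawyers accountants \
  roofing insurance "real estate" recruiting consulting "auto repair" \
  "med spas" chiropractic)
idx=$(( 10#$(date +%j) % ${#verticals[@]} ))   # 10# avoids octal parsing of leading zeros
echo npm --prefix "$LE" run import:google-maps -- \
  --query "${verticals[$idx]}" --counties "Tarrant,Dallas,Collin,Denton" --pages 2
```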