Block retry storms before they drain your API budget.
Loop detection + automatic blocking in 3 lines of code.
`while True: scrape()`. Except you didn't write that. The LLM did.
429 → retry → retry again, with no cap on attempts. $89 in 20 minutes.
Ctrl+C doesn't work when it's running on a server. You notice at 3am.
Run `gate.check()` before `scrape()`. One check before each API call.
Patterns tracked in real-time. No config needed.
Retry storm blocked. Your wallet survives.
```json
{
  "allowed": false,
  "error": "loop_detected",
  "pattern_count": 12,
  "reason": "Same scraping request repeated 12x in 47 seconds",
  "action": "firecrawl_scrape"
}
```
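Under the hood, detection like this can be done with a sliding window over recent calls. The sketch below is a hypothetical illustration of the idea, not the ProceedGate implementation: count identical `(user, action)` calls inside a time window and block once a threshold is exceeded. The class and threshold names are invented for this example.

```typescript
// Hypothetical sliding-window loop detector (illustration only, not the
// ProceedGate internals): identical calls within the window are counted,
// and the call is blocked once the count exceeds a threshold.
type CheckResult = { allowed: boolean; error?: string; pattern_count?: number };

class LoopDetector {
  private history = new Map<string, number[]>(); // key -> call timestamps (ms)

  constructor(
    private windowMs = 60_000, // look-back window
    private maxRepeats = 10    // identical calls allowed per window
  ) {}

  check(userId: string, action: string, now = Date.now()): CheckResult {
    const key = `${userId}:${action}`;
    // Keep only timestamps still inside the window, then record this call.
    const recent = (this.history.get(key) ?? []).filter(
      (t) => now - t <= this.windowMs
    );
    recent.push(now);
    this.history.set(key, recent);

    if (recent.length > this.maxRepeats) {
      return { allowed: false, error: "loop_detected", pattern_count: recent.length };
    }
    return { allowed: true };
  }
}
```

Because old timestamps age out of the window, a legitimate slow retry cadence passes while a tight retry loop trips the threshold within seconds.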
3 lines of code to protect your scraping budget.
```typescript
import { ProceedGate } from '@proceedgate/sdk';

const gate = new ProceedGate({ apiKey: 'pg_ws_...' });

// Before your scraping call
async function scrapeWithProtection(url: string) {
  const check = await gate.check({
    user_id: 'agent-1',
    action: 'firecrawl_scrape',
    cost: 1, // 1 credit per scrape
    metadata: { url }
  });

  if (!check.allowed) {
    console.log(`🚫 Blocked: ${check.error}`);
    console.log(`💰 Saved: $${check.cost_saved}`);
    return null;
  }

  // Safe to proceed
  return await firecrawl.scrape(url);
}
```
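The gate stops loops server-side, but a client-side cap is a cheap extra safety net: even if the gate is unreachable, a single call can never retry forever. The helper below is a hypothetical sketch (not part of the ProceedGate SDK) of bounded retries with exponential backoff.

```typescript
// Hypothetical client-side safety net (not part of the ProceedGate SDK):
// cap total attempts and back off exponentially, so a stream of 429s can
// never turn into an unbounded retry storm.
async function withBoundedRetry<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 500
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < maxAttempts - 1) {
        // Exponential backoff: 500 ms, 1 s, 2 s, ... then stop for good.
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
      }
    }
  }
  throw lastError; // hard stop: bounded worst-case spend
}
```

Wrapping `scrapeWithProtection` in `withBoundedRetry` gives a hard ceiling on attempts per URL, regardless of what the model decides to do.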
Built for scraping agents. Also works for research & web automation.
| Feature | Generic Rate Limiting | ProceedGate |
|---|---|---|
| Detects retry loops | ❌ No | ✅ Yes |
| Tracks cost saved | ❌ No | ✅ Yes |
| Per-agent budgets | ❌ No | ✅ Yes |
| Webhooks on blocks | ❌ No | ✅ Yes |
| Works across tools | ❌ Tool-specific | ✅ Universal |
| Setup time | Hours/days | 5 minutes |
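The per-agent budget row amounts to a credit ledger with a hard cap per agent. As a hypothetical sketch of that idea (the hosted service does this server-side; class and method names here are invented):

```typescript
// Hypothetical per-agent budget ledger (illustration of the concept, not the
// ProceedGate implementation): each agent has a credit cap, and a spend is
// refused once it would exceed that cap.
class BudgetLedger {
  private spent = new Map<string, number>();

  constructor(private capPerAgent: number) {}

  trySpend(agentId: string, cost: number): boolean {
    const used = this.spent.get(agentId) ?? 0;
    if (used + cost > this.capPerAgent) return false; // over budget: block
    this.spent.set(agentId, used + cost);
    return true;
  }

  remaining(agentId: string): number {
    return this.capPerAgent - (this.spent.get(agentId) ?? 0);
  }
}
```

Because each agent has its own row in the ledger, one runaway agent exhausts only its own cap instead of the whole workspace budget.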
Stop building your own scraping budget control. We did it for you.