Update Protocol
SpoolBench updates pages on a signal-driven cadence rather than calendar intervals. Each signal — price drift, ASIN liveness loss, AI consensus shift, competitor downfall, methodology version bump — has a defined threshold and a routed edit tier (surgical paragraph rewrite, section rewrite, or structural rebuild). This page documents the full mapping.
Signal Sources
Every page on SpoolBench is monitored by a fixed set of automated signal generators. Each generator watches a specific data source on a defined cadence and writes structured signal records into the queue when its threshold trips.
| Signal | Source | Cadence | Threshold | Default edit tier |
|---|---|---|---|---|
| AI citation drop | Perplexity / ChatGPT / Google AI Mode | Weekly | Citation count drops >25% week over week | Tier 2 — section rewrite |
| ASIN liveness loss | Amazon catalog (live re-fetch) | Weekly | ASIN unavailable, redirected, or 404 | Tier 2 — replacement product |
| Price drift | Keepa price history | Weekly | Price moves >20% in 7 days | Tier 1 — surgical paragraph rewrite |
| AI consensus shift | 3-engine AI consensus tracker | Weekly | Top recommendation changes across >1 engine | Tier 2 — recommendation reframing |
| New release candidate | Keepa movers + scan-new-releases | Weekly | Score ≥ 75 with ≥ 2 corroborating signals | Tier 2 — insert + demote |
| Competitor downfall | DataForSEO domain rating + ranked-keyword diff | Weekly | DR drop ≥ 5 OR keyword footprint loss ≥ 30% | Tier 2 — refresh broadly |
| Methodology version bump | Pipeline release event | Per release | Major version increment in atomization or scoring | Tier 3 — structural rebuild |
| Empty citation slot | citation-decay-track tracker | Weekly | Topic with high AI-engine query volume + zero SpoolBench citation | New full Phase 3 page |
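A signal record from the table above can be sketched as a small dataclass plus a threshold predicate. The field names and the `price_drift_tripped` helper are illustrative assumptions, not SpoolBench's actual schema; only the 20%-in-7-days threshold comes from the table.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class Signal:
    """Hypothetical shape of a structured signal record written to the queue."""
    signal_type: str   # e.g. "price_drift", "asin_liveness_loss"
    page_id: str       # page the signal fires against
    fired_on: date     # day the generator's threshold tripped
    edit_tier: int     # 1, 2, or 3 per the routing table

def price_drift_tripped(price_7d_ago: float, price_now: float) -> bool:
    """Price-drift threshold from the table: price moves >20% in 7 days."""
    if price_7d_ago <= 0:
        return False  # no meaningful baseline to compare against
    return abs(price_now - price_7d_ago) / price_7d_ago > 0.20
```

A $100 → $125 move trips the generator; a $100 → $115 move does not.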
Edit Tiers
Every signal routes to one of three tiers. Tier choice is codable — the signal-router applies a fixed lookup, not editorial judgment, so the same signal always produces the same kind of edit.
- Tier 1 — Surgical. One paragraph rewritten in place. Heading structure, internal links, and schema fields untouched except `dateModified` and the affected `Offer` or `Review` block. Used for single-claim invalidation (price drift, dead ASIN).
- Tier 2 — Section-scoped. The affected section plus the page intro are rewritten. Demoted picks stay on the page as "former pick" / "previous pick" (the Forbes pattern — preserves ranking signals). Used for recommendation reframing (top pick replaced, AI consensus shifted).
- Tier 3 — Full rewrite. Whole-page re-evaluation. Old content archives to `research/archived-versions/` for rollback. Used only for methodology version bumps, niche reframing, or repeated quality-failure quarantines.
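The "fixed lookup, not editorial judgment" routing can be sketched as a literal dict. Signal names and tier numbers follow the table above; the dict itself is an illustrative assumption about how the signal-router stores the mapping.

```python
# Fixed signal -> edit-tier lookup. The router applies this mapping
# verbatim, so the same signal always produces the same kind of edit.
SIGNAL_TO_TIER = {
    "price_drift": 1,               # Tier 1 — surgical paragraph rewrite
    "ai_citation_drop": 2,          # Tier 2 — section rewrite
    "asin_liveness_loss": 2,        # Tier 2 — replacement product
    "ai_consensus_shift": 2,        # Tier 2 — recommendation reframing
    "new_release_candidate": 2,     # Tier 2 — insert + demote
    "competitor_downfall": 2,       # Tier 2 — refresh broadly
    "methodology_version_bump": 3,  # Tier 3 — structural rebuild
}

def route(signal_type: str) -> int:
    """Deterministic tier choice; an unknown signal type fails loudly
    rather than falling back to a guessed tier."""
    return SIGNAL_TO_TIER[signal_type]
```

Keeping the mapping as data rather than branching logic makes the routing auditable: the table on this page and the lookup in code can be diffed directly.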
Ranking Protection Rails
Every Tier 2 and Tier 3 edit captures a Google Search Console baseline before the edit ships: clicks, impressions, and average position for the page's top 5 query targets. Fourteen days after the edit goes live, the system re-pulls the same metrics and compares.
If position drops > 5 places on ≥ 2 of the 5 targets, the edit is flagged as a regression. The admin panel surfaces a one-click "roll back to archived version" option so any edit can be undone before the decay compounds. Pages younger than 30 days are excluded from refresh eligibility — Google's honeymoon / recalibration window has to stabilize before any edit ships against it.
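The regression flag above reduces to a comparison over the five baseline positions. This is a minimal sketch assuming positions arrive as parallel lists per query target; only the ">5 places on ≥2 of 5 targets" rule comes from the text.

```python
def is_regression(baseline_pos: list[float], current_pos: list[float]) -> bool:
    """Flag a Tier 2/3 edit as a regression if average position drops
    more than 5 places on at least 2 of the page's top-5 query targets.
    Higher position number = worse ranking, so a drop is current - baseline > 5.
    """
    drops = sum(
        1
        for before, after in zip(baseline_pos, current_pos)
        if after - before > 5
    )
    return drops >= 2
```

For example, a page whose top targets move from positions (3, 4, 5, 6, 7) to (10, 12, 5, 6, 7) has two >5-place drops and gets flagged; a single bad target does not trip the flag.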
Changelog Entries
Every edit appends a structured entry to the page's bottom-of-page changelog. Each entry carries the trigger date, the trigger type (machine-readable enum + human-readable label), a one-line "what changed" summary, optional structured detail prose, and a methodology version stamp.
Append-forever: changelogs grow as pages mature (the RTINGS pattern). A single same-day signal cannot stack entries — each edit is keyed by a stable trigger key and repeat fires of the same key are no-ops.
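The trigger-key idempotence rule can be sketched as a guarded append. The entry shape is an assumption; the invariant — repeat fires of the same key are no-ops — is the one stated above.

```python
def append_changelog(changelog: list[dict], entry: dict) -> list[dict]:
    """Append-forever changelog with trigger-key idempotence.

    An entry is appended only if no existing entry carries the same
    stable trigger key, so a signal that fires twice in one day cannot
    stack duplicate entries.
    """
    existing_keys = {e["trigger_key"] for e in changelog}
    if entry["trigger_key"] not in existing_keys:
        changelog.append(entry)
    return changelog
```

Because entries are never removed, the list only grows as the page matures, matching the append-forever guarantee.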
Audit Log + Merkle Verification
Every edit writes one line to `research/audit-log.jsonl` on commit: trigger key, source URL, atom set fingerprint, claim hash, page hash, model version, and timestamp. The log is append-only — the pipeline does not delete or rewrite past entries.

A daily job hashes that day's lines into a SHA-256 Merkle root and commits the root to `research/audit-merkle.jsonl`. The chain links each day's root to the previous day's root, so any "Updated" date on the site can be verified end-to-end: page → trigger key → audit log line → Merkle root → daily commit. Tampering with a past entry breaks the chain at the verification step.
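The daily rooting and chaining step can be sketched as follows. This is a minimal illustration, not SpoolBench's implementation: the odd-leaf handling (carrying an unpaired leaf up) and the `chain_root` concatenation are common conventions the real scheme may vary.

```python
import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(lines: list[bytes]) -> bytes:
    """SHA-256 Merkle root over one day's audit-log lines.

    Leaves are hashed individually, then pairwise up to a single root;
    an unpaired leaf is carried up to the next level unchanged.
    """
    level = [_h(line) for line in lines]
    if not level:
        return _h(b"")  # conventional empty-day root
    while len(level) > 1:
        nxt = []
        for i in range(0, len(level), 2):
            if i + 1 < len(level):
                nxt.append(_h(level[i] + level[i + 1]))
            else:
                nxt.append(level[i])  # odd leaf: promote unpaired
        level = nxt
    return level[0]

def chain_root(prev_day_root: bytes, today_root: bytes) -> bytes:
    """Link today's root to yesterday's. Editing any past log line changes
    that day's root, which changes every subsequent chained root."""
    return _h(prev_day_root + today_root)
```

Verification walks the same path in reverse: recompute the day's root from the logged lines, then confirm each chained root matches what was committed.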
Page-bottom changelogs surface the same audit data to readers, so the verification chain is reader-facing, not just internal tooling.
Spotted a page where the date moved but the content didn't seem to? Reach us via the contact form on our about page with the URL — we'll check the audit chain and surface the trigger that fired.