695 files in 3 days
Delivered a large-scale documentation rebrand in days instead of weeks
Turned a multi-week rename into a 3-day operation by building 6 AI-assisted commands and a standalone Python redirect validator. 695 files touched, 349 URL mappings validated, and 486 images audited across 4 products. Zero 404s in production.
695
Files touched
3 days
End-to-end execution
349
URL mappings validated
486
Images audited
The challenge
Four products, renamed at the same time, across a large documentation site, with multiple writers and a fixed deadline. The conventional approach — manual find-and-replace, screenshot review, redirect spreadsheets — would have taken weeks.
Speed wasn't the only issue. A documentation rename isn't just a find-and-replace problem. It touches YAML config files, frontmatter, component props, changelog entries, SVG icons, and redirect files. Miss a configuration reference and the build crashes. Miss a redirect and a user hits a 404 from a bookmark, a search result, or a cross-product link.
The largest single rename touched 390 files across 5 commits: 265 cross-reference files spanning 17 other products, 88 partial files, roughly 70 render path updates. That was one of four.
The approach
None of the tools were designed upfront. Each one came from hitting a specific problem during execution. By the end of three days, the toolkit had grown to 6 AI-assisted commands and a standalone Python CLI.
The rename engine
The core /rename-product command handled folder renames, content updates, redirects, cross-references, and bridge phrasing ("formerly X"). It ran in two passes: first the product's own files, then cross-references in other products. That let multiple writers work in parallel without merge conflicts. Batch mode split each rename into content changes first, structural changes second — keeping diffs readable.
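The command itself is an AI-assisted prompt rather than a script, but the two-pass structure translates into a short sketch. Everything below is illustrative: the product names, the docs/ layout, and the .mdx glob are assumptions rather than the real configuration, and the structural pass (folder renames, redirect files) is omitted.

```python
from pathlib import Path

# Hypothetical names and layout; the real command took these interactively.
OLD_NAME, NEW_NAME = "Acme Widgets", "Acme Components"
OLD_SLUG, NEW_SLUG = "acme-widgets", "acme-components"
DOCS = Path("docs")

def rewrite(path: Path, bridge_first_mention: bool = False) -> None:
    """Swap old names for new in one file, with optional bridge phrasing."""
    text = path.read_text(encoding="utf-8")
    if OLD_NAME not in text and OLD_SLUG not in text:
        return
    if bridge_first_mention and OLD_NAME in text:
        # Shield the first mention with a sentinel so the blanket replace
        # below doesn't rewrite the "formerly" clause itself.
        text = text.replace(OLD_NAME, "\x00BRIDGE\x00", 1)
    text = text.replace(OLD_NAME, NEW_NAME).replace(OLD_SLUG, NEW_SLUG)
    text = text.replace("\x00BRIDGE\x00", f"{NEW_NAME} (formerly {OLD_NAME})")
    path.write_text(text, encoding="utf-8")

# Pass 1: the product's own tree. Committed on its own, so the diff is
# one mechanical sweep of a single folder.
own_tree = DOCS / OLD_SLUG
for f in own_tree.rglob("*.mdx"):
    rewrite(f, bridge_first_mention=True)

# Pass 2: cross-references in every other product's tree. Keeping this
# in a separate commit is what let writers renaming other products
# work in parallel without merge conflicts.
for f in DOCS.rglob("*.mdx"):
    if own_tree not in f.parents:
        rewrite(f)
```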
The validation layer
/validate ran 16 automated checks against the repository, each classified by severity: BUILD (build crashes), RUNTIME (UI breaks), CORRECTNESS (wrong content), COSMETIC (minor). It detected stale product references, missing redirects, broken internal links, and missing SVG icons. Instead of waiting for a slow build to catch failures, it found them in seconds.
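The checks lived inside an AI-assisted command, but the severity model maps directly onto code. Here is a minimal sketch of how such a check registry could look in Python; the single example check, the docs path, and the .mdx glob are hypothetical.

```python
from dataclasses import dataclass
from enum import Enum
from pathlib import Path

class Severity(Enum):
    BUILD = 0        # build crashes
    RUNTIME = 1      # UI breaks
    CORRECTNESS = 2  # wrong content
    COSMETIC = 3     # minor

@dataclass
class Finding:
    severity: Severity
    path: Path
    detail: str

CHECKS = []  # each check scans the repo and returns a list of Findings

def check(fn):
    CHECKS.append(fn)
    return fn

@check
def stale_product_name(repo: Path) -> list[Finding]:
    old = "Acme Widgets"  # hypothetical old name
    return [Finding(Severity.CORRECTNESS, f, f"still mentions {old!r}")
            for f in repo.rglob("*.mdx")
            if old in f.read_text(encoding="utf-8")]

def run(repo: Path) -> int:
    findings = sorted((f for c in CHECKS for f in c(repo)),
                      key=lambda f: f.severity.value)
    for f in findings:
        print(f"[{f.severity.name}] {f.path}: {f.detail}")
    # Exit non-zero only for severities that would break the build or UI.
    return 1 if any(f.severity in (Severity.BUILD, Severity.RUNTIME)
                    for f in findings) else 0

if __name__ == "__main__":
    raise SystemExit(run(Path("docs")))
```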
Visual image scanning
/check-images used AI vision to scan screenshots for outdated product names, a problem grep can't solve. All 113 SVGs in the scan were fully vectorized with zero <text> elements, meaning the old names were baked into vector paths. The only way to find them was to render each SVG to PNG and inspect the result visually.
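The visual read was the AI part; the mechanical step before it can be sketched with an off-the-shelf rasterizer. This sketch assumes cairosvg for rendering; the directories, output resolution, and product name are placeholders.

```python
from pathlib import Path
import xml.etree.ElementTree as ET

import cairosvg  # pip install cairosvg

def has_text_elements(svg: Path) -> bool:
    """True if the SVG carries <text> nodes that plain grep could search."""
    return any(el.tag.rsplit("}", 1)[-1] == "text"
               for el in ET.parse(svg).iter())

out_dir = Path("renders")
out_dir.mkdir(exist_ok=True)

for svg in Path("docs/images").rglob("*.svg"):
    if has_text_elements(svg):
        continue  # searchable text: grep handles these
    # Fully vectorized: any old name is baked into paths, so render to
    # PNG and let a vision model read the pixels.
    cairosvg.svg2png(url=str(svg),
                     write_to=str(out_dir / f"{svg.stem}.png"),
                     output_width=1024)
```

Each rendered PNG then goes to the vision model with a question along the lines of "does this image show the old product name anywhere?"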
The redirect validator
A standalone Python CLI handled redirect validation in 4 stages: scraping the live sitemap to create a baseline before any renames, comparing the baseline against the redirect mapping CSV to find gaps, validating the GitHub preview deployment, and confirming production after merge. Per-product filtering meant each writer could validate their own rename independently.
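The real CLI adds per-product flags and reporting; this is a sketch of the stages under assumed names. The sitemap host, the CSV's source/target columns, and the product slug are placeholders, not the actual values.

```python
import csv
import xml.etree.ElementTree as ET
from urllib.request import urlopen

SITEMAP = "https://docs.example.com/sitemap.xml"  # placeholder host
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def fetch_baseline(sitemap_url: str) -> set[str]:
    """Stage 1: snapshot every live URL before any rename lands."""
    with urlopen(sitemap_url) as resp:
        tree = ET.parse(resp)
    return {loc.text.strip() for loc in tree.findall(".//sm:loc", NS)}

def find_gaps(baseline: set[str], mapping_csv: str, product: str) -> set[str]:
    """Stage 2: baseline URLs for one product that no redirect covers."""
    with open(mapping_csv, newline="") as fh:
        covered = {row["source"] for row in csv.DictReader(fh)}
    return {u for u in baseline if f"/{product}/" in u and u not in covered}

def redirect_ok(host: str, source: str, target: str) -> bool:
    """Stages 3 and 4: follow a redirect on the preview or production
    host and check it lands on the mapped target."""
    with urlopen(host + source) as resp:
        return resp.geturl().rstrip("/") == (host + target).rstrip("/")

baseline = fetch_baseline(SITEMAP)
for url in sorted(find_gaps(baseline, "redirects.csv", "acme-widgets")):
    print("no redirect for:", url)
```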
The outcomes
| Metric | Value |
|---|---|
| Products renamed | 4 |
| Files touched | ~695 |
| Insertions / deletions | ~2,848 / ~2,816 |
| URL mappings validated | 349 |
| Redirect rules added or updated | 71 |
| Images audited | 486 |
| Images flagged for update | 67 |
| Reports generated | 29 |
| Commits | 17 |
| Execution window | 3 days (Feb 16–18) |
The image scan caught things nothing else would have. Among third-party vendor screenshots, 33 had the old product abbreviation in the filename, but only 11 of those actually showed the old name on screen. Another 10 with no abbreviation in the filename did show it. Only the visual scan found them.
After the project, both commands were generalized: the hardcoded product tables were replaced with interactive prompts. Anyone on the team can run them for any future rename, no setup required.
Why it mattered
Compressing a multi-week process into 3 days meant product teams launched with docs that already matched the new names. Users following old bookmarks didn't hit 404s. The image audit caught stale screenshots that would otherwise have sat wrong for months without anyone noticing.
The transition held up. No broken links, no customer complaints. Product leadership confirmed it landed cleanly — for a customer-facing rename across four products and 695 files, that was the real test.
But the more durable thing was the toolkit. Products get renamed again. Domains change. The tools now sit in the team's hands permanently — the next rename starts with working infrastructure, not a blank page. Each run makes the tooling better, because edge cases that trip it up become rules.
None of this was predicted. The problems worth solving only became visible during the work. That's why the tools are specific — each one started as a response to something that actually broke, not something that might break.