Editorial illustration of structured data flowing between two inventory systems

A month of work, three hours of work

Over 6,000 SKUs of product data migrated between two inventory systems across two Google Meet calls.

A friend of mine mentioned over coffee that a British e-commerce business he invests in was stuck migrating years of product data into a new inventory system. Their internal estimate was about a month. I told him I'd take a look. We did it in about three hours, across two Google Meet calls.

How I did it: I taught my Claude to do the work. I was the teacher, the business owner was the subject-matter expert, and we worked together live over Google Meet to teach the Claude what needed to be done. Then the Claude did it.

(People always ask me what "teaching the Claude" means. It means getting Claude to acquire the know-how and the judgment to do the work — not just running it once, but building up what it knows until it can carry the task the next time too. This engagement was a classic example.)

The first thing wasn't moving data. It was finding out what was already in the destination.

Master reference first — extracting the destination system's current state from PDFs before any transformation
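A minimal sketch of that first step, assuming the destination system's PDF reports have already been dumped to plain text (the column layout, SKU format, and parser below are illustrative assumptions, not the actual extraction):

```python
import re

# Illustrative line shape from a PDF-to-text dump of the destination
# system's stock report: "SKU-0042  Widget, blue  12"
ROW = re.compile(r"^(SKU-\d+)\s{2,}(.+?)\s{2,}(\d+)$")

def parse_report(text: str) -> dict[str, dict]:
    """Build a master reference of what the destination already holds."""
    master = {}
    for line in text.splitlines():
        m = ROW.match(line.strip())
        if m:
            sku, name, qty = m.groups()
            master[sku] = {"name": name, "qty": int(qty)}
    return master

dump = """SKU-0042  Widget, blue  12
SKU-0043  Widget, red  7"""
print(parse_report(dump))
```

The point is the ordering, not the parser: until this master reference exists, there is nothing reliable to transform the source data against.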

Then the real work: making the rows say the same thing every time.

A single product row before and after the cleanup pass, with fragments merged and notation normalized
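One concrete normalization of that kind, sketched with a made-up pack-size rule (the real rules came from the business owner's conventions, not from me):

```python
import re

def normalize_pack_size(value: str) -> str:
    """Collapse variants like '12x500ml' and '12 X 500 ML' into one notation."""
    m = re.match(r"^\s*(\d+)\s*[x×X]\s*(\d+)\s*(ml|ML|l|L)\s*$", value)
    if not m:
        return value  # leave unrecognized values alone for a later pass
    count, size, unit = m.groups()
    return f"{count} x {size}{unit.lower()}"

print(normalize_pack_size("12x500ml"))     # -> "12 x 500ml"
print(normalize_pack_size("12 X 500 ML"))  # -> "12 x 500ml"
```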

Sequenced by dependency, so each pass could trust the last.

Four dependency-ordered passes: structural, notation, mapping, fuzzy match
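The ordering matters because each pass only has to handle input the previous pass has already cleaned, which keeps every rule simple. A sketch of that sequencing, with stand-in pass functions (the actual passes were far richer):

```python
def structural(rows):
    return [r for r in rows if r.strip()]          # drop empty fragments

def notation(rows):
    return [r.replace("×", "x").lower() for r in rows]

def mapping(rows):
    table = {"widget blue 12x500ml": "SKU-0042"}   # illustrative lookup
    return [table.get(r, r) for r in rows]

PASSES = [structural, notation, mapping]           # dependency order

def run(rows):
    for p in PASSES:
        rows = p(rows)                             # each pass trusts the last
    return rows

print(run(["Widget Blue 12×500ml", "", "unknown item"]))
```

Note that `mapping` would misfire on `"Widget Blue 12×500ml"` if `notation` had not already run; that is what "each pass could trust the last" buys.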

Every change logged with the rule that fired, so the client could read the work.

Audit trail showing old value, new value, and the rule that fired for every row
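A minimal shape for that kind of audit log, assuming each cleanup rule reports its own name whenever it actually changes a value (names and fields here are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class Change:
    row_id: str
    field: str
    old: str
    new: str
    rule: str  # the rule that fired, so the client can read the work

audit: list[Change] = []

def apply_rule(row_id, field, value, rule_name, fn):
    """Run one rule; log a Change only if the value actually moved."""
    new = fn(value)
    if new != value:
        audit.append(Change(row_id, field, value, new, rule_name))
    return new

v = apply_rule("row-17", "pack_size", "12×500ml", "normalize-x",
               lambda s: s.replace("×", "x"))
print(v, audit[0].rule)
```

Logging only real changes keeps the trail readable: every entry answers "what changed, and which rule decided it".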

99.5% matched automatically. 87 flagged for a human, each with the reason attached.

Staged validation — auto-applied matches, manual-review queue, and the reasons each row was flagged
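A sketch of that staging using stdlib fuzzy matching; the 0.92 threshold and `difflib` are stand-ins for whatever matcher and cutoff were actually used:

```python
from difflib import SequenceMatcher

DESTINATION = ["Widget, blue 500ml", "Widget, red 500ml"]  # illustrative

def best_match(name, candidates):
    scored = [(SequenceMatcher(None, name, c).ratio(), c) for c in candidates]
    return max(scored)

def stage(names, threshold=0.92):
    """Split rows into auto-applied matches and a manual-review queue."""
    auto, review = [], []
    for n in names:
        score, match = best_match(n, DESTINATION)
        if score >= threshold:
            auto.append((n, match))
        else:
            # attach the reason so the human sees why it was flagged
            review.append((n, match, f"best score {score:.2f} below {threshold}"))
    return auto, review

auto, review = stage(["Widget blue 500ml", "Gadget 1L"])
print(f"{len(auto)} auto-applied, {len(review)} for review")
```

The flagged rows carry their reason with them, which is what lets a human clear the review queue quickly instead of re-deriving each decision.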

That engagement produced a protocol I now apply to any inventory-system migration. The full 21-technique version lives in my know-how library.

If you have a migration-shaped problem, the first session looks like this: an hour on your real data. If the shape fits, we keep going.

Let's talk →
