Incident story · Ransomware
Tuesday phishing email. Friday encryption. Eleven days to find out the backups worked.
Tuesday morning, 9:14 AM. The bookkeeper at a 75-attorney Midwest litigation firm clicked a link in an email she thought was a court filing notice from a federal docket she knew well. By Friday at 4:40 PM, every document on the firm's file server was encrypted and a ransom note sat on the managing partner's desktop. The firm had backups. It still took eleven days to be back at work — and seven of those days were spent finding out that the backups they had paid for, scheduled, and never tested were five days stale at the worst possible moment.
What happened
Tuesday to the following Tuesday — day by day
Tuesday 9:14 AM. The bookkeeper, a 14-year employee with finance and HR responsibilities, received an email styled as a federal court filing notice. The link led to a fake login page that asked her to re-authenticate her firm email to download the filing. She did. Two minutes later her credentials were on a Telegram channel and an attacker in another country was logged into her mailbox.
Tuesday afternoon. The attacker did nothing visible. They read her email folders to understand who she emailed and what she had access to. They watched her forward an invoice to a partner. They noted she could connect to the firm's file server, the practice management system, and the accounting system.
Wednesday and Thursday. The attacker moved laterally. Using credentials harvested by a small tool installed on her workstation, they reached the IT admin's account, then a domain administrator account. They mapped every file share. They identified that the backup server was on the same domain as production — a single set of credentials reached both. They quietly disabled the nightly backup job, so nothing new was captured after Tuesday night. Nobody noticed because the backup software still reported "success — last run Tuesday."
Friday 4:40 PM. A paralegal tried to open a deposition transcript and got a file extension she had never seen. Within ninety seconds three attorneys reported the same problem from different floors. The managing partner walked into his office to find a ransom note open on his screen demanding $400,000 in Bitcoin. The firm's IT manager — one full-time person and a part-time contractor — pulled the plug on every server he could reach. By 7 PM the firm was effectively offline.
Saturday. The insurance carrier's breach hotline routed the firm to a breach coach, a DFIR firm, and a ransomware negotiator. The DFIR firm began forensics; the negotiator opened a back-channel to the ransomware crew to buy time. The partners assembled in a conference room with no working computers and started writing client notifications by hand.
Sunday through Wednesday. The IT manager and the DFIR firm worked through backup tapes. The first restoration attempt failed — the most recent backup was from Tuesday night, after the bookkeeper's account had been compromised but before the encryption ran, and the restored files still contained the attacker's footholds. The second attempt failed for a different reason: a corrupted index file the backup vendor had flagged six months earlier in a quarterly report that nobody had read.
Thursday — day 7. The third restoration attempt — from a backup five days older than the attacker's entry — finally produced a clean, mountable copy. The firm lost five days of billable work, time entries, document edits, and email. Two partners had been deep in discovery production on a major case; their week of redlines had to be re-done from scratch.
Day 11. The firm was operational again, on a rebuilt environment with new domain controllers, new backup architecture, enforced MFA, and an external SOC monitoring 24/7. Six weeks later, three clients had quietly moved their matters elsewhere. Three more left during the next renewal cycle. Insurance covered roughly 52 percent of the $2.4M total. The firm absorbed the rest.
What it cost
The bill, itemized
Direct remediation (forensics, IR firm, IT overtime)
$420,000 · DFIR engagement, 11 days of incident response, rebuild of 38 endpoints and 4 servers.
Downtime — lost billable hours over 11 days
$840,000 · Litigation calendar froze. Hearings postponed. ~75 attorneys × 11 working days × average daily billings.
Client churn — 6 clients departed over 12 months
$680,000 · Three immediately, three quietly during the next renewal cycle. Two were top-15 by revenue.
Cyber insurance premium increase (3-year impact)
$220,000 · Renewal premium jumped 240% YoY. Carrier required external SOC and quarterly attestation.
Legal + regulatory + bar notification
$185,000 · State bar notification, client-by-client disclosure letters, ethics counsel, breach coach.
Ransom paid
$0 · Restored from backups after 11 days. Ransom demand was $400,000 in BTC.
Total estimated
$2,345,000 · Cyber insurance covered $1.25M (~52%). Firm absorbed the remaining $1.1M directly.
What we did
EFROS-style response — what an engagement looks like
First 24 hours — containment. On engagement, the first thing we do is isolate the blast radius. Every endpoint and server gets either disconnected or quarantined under EDR. We stand up an out-of-band communication channel (so the partners can actually coordinate without using compromised email), and we walk the attacker's timeline backwards from the encryption event to the first foothold.
Days 2 to 4 — forensics and backup triage. We work in parallel: one team confirms which backup snapshots predate the compromise and are safe to restore; another team identifies every account the attacker touched and every system they reached. The question we answer for the owner: which backup is the latest one we are willing to bet the firm on. That answer is rarely the most recent.
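The triage logic itself is simple once forensics has pinned the first foothold; the hard part is trusting the timestamps. A minimal sketch of the decision in Python, with illustrative snapshot IDs and times (all hypothetical, loosely matching this story's timeline):

```python
from datetime import datetime

# Illustrative snapshot catalog (IDs and times are hypothetical).
snapshots = [
    ("snap-sun", datetime(2024, 4, 14, 23, 0)),
    ("snap-mon", datetime(2024, 4, 15, 23, 0)),
    ("snap-tue", datetime(2024, 4, 16, 23, 0)),  # ran after the phish
]

# First confirmed foothold, established by forensics:
# the Tuesday 9:14 AM credential phish in this story.
first_foothold = datetime(2024, 4, 16, 9, 14)

# A snapshot taken at or after the foothold may contain the attacker's
# persistence mechanisms even if the files are not yet encrypted.
# Only strictly older snapshots are candidates for a clean restore.
safe = [(sid, t) for sid, t in snapshots if t < first_foothold]

for sid, t in snapshots:
    print(f"{sid}  {t:%a %H:%M}  {'SAFE' if t < first_foothold else 'SUSPECT'}")

newest_safe = max(safe, key=lambda s: s[1], default=None)
print("restore candidate:", newest_safe[0] if newest_safe else "NONE")
```

The Tuesday-night snapshot in this story sorts as SUSPECT: it ran after the phish, so it carries the attacker's footholds even though the files inside are unencrypted. That is exactly why the firm's first restoration attempt failed.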
Days 4 to 8 — rebuild, not restore. For a ransomware event of this scale we do not restore to the existing environment. We rebuild domain controllers from clean media, force every account through a credential rotation, enforce MFA on every identity (not just enable — enforce, with no exceptions), and bring up the file server, practice management, and accounting systems on the new identity plane.
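What "enforce, not enable" means mechanically depends on the identity platform. As one illustration: in a Microsoft Entra ID tenant, enforcement typically takes the form of a conditional access policy that requires MFA for every user on every application, created here via the Microsoft Graph API. This is a hedged sketch, not a rollout plan; the token, its permissions, and the tenant setup are assumed:

```python
import requests

GRAPH_TOKEN = "..."  # assumed: a Graph token with Policy.ReadWrite.ConditionalAccess

# Conditional access policy: every user, every app, MFA required.
# state="enabled" is the enforcement step; report-only mode
# ("enabledForReportingButNotEnforced") is what "configured but
# not enforced" looks like in practice.
policy = {
    "displayName": "Require MFA for all users",
    "state": "enabled",
    "conditions": {
        "users": {"includeUsers": ["All"], "excludeUsers": []},  # no exceptions
        "applications": {"includeApplications": ["All"]},
        "clientAppTypes": ["all"],
    },
    "grantControls": {"operator": "OR", "builtInControls": ["mfa"]},
}

resp = requests.post(
    "https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies",
    headers={"Authorization": f"Bearer {GRAPH_TOKEN}"},
    json=policy,
    timeout=30,
)
resp.raise_for_status()
print("policy created:", resp.json().get("id"))
```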
Days 8 to 11 — operational handoff. We move the firm from emergency mode into ongoing managed detection and response. An external SOC begins 24/7 monitoring with documented escalation paths. Backups move to an architecture where credentials that reach production cannot reach backup storage — and we test a real restore on day 11.
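The architectural point in that backup step deserves a concrete shape. One common pattern is immutable object storage: backups are written to a bucket with Object Lock, so even a credential that can write new backups cannot delete or rewrite old ones. A sketch using boto3 against S3-compatible storage (bucket name hypothetical; us-east-1 assumed for brevity):

```python
import boto3

s3 = boto3.client("s3")  # us-east-1 assumed for brevity
BUCKET = "firm-backups-immutable"  # hypothetical bucket name

# Object Lock can only be switched on at creation: the bucket is born immutable.
s3.create_bucket(Bucket=BUCKET, ObjectLockEnabledForBucket=True)

# Default retention in COMPLIANCE mode: for 30 days no credential, including
# the account root, can delete or overwrite a backup object. The domain admin
# account that reaches production has nothing it can disable on the backup side.
s3.put_object_lock_configuration(
    Bucket=BUCKET,
    ObjectLockConfiguration={
        "ObjectLockEnabled": "Enabled",
        "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Days": 30}},
    },
)
print("bucket created with 30-day WORM retention:", BUCKET)
```

Compliance-mode retention is deliberately blunt: for those 30 days nobody can shorten it, which is exactly the property you want when the attacker holds your best credentials.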
Week 6 — bar and client notifications. We coordinate with breach counsel on state bar notification, client disclosure letters that comply with attorney professional-conduct requirements, and the insurance carrier's evidence package. Client calls are scripted with breach counsel so that what gets said is exactly what was approved to be said.
Ongoing. The firm moves to a quarterly cadence: tested restores from backup, phishing simulations against the staff most likely to be targeted, MFA attestation, and a board-grade summary the managing partner can hand to the insurance carrier at renewal.
What you should take from this
Five things to do this week
- 01
"Backups exist" is not the same as "backups work." The firm had a backup product, a backup schedule, and a backup vendor. None of those facts mattered until day 7 of recovery, when the third restoration attempt finally pulled a clean, recent copy.
- 02
A four-day quiet attacker is the normal case, not the exception. Ransomware crews routinely sit in a network for days mapping shares, identifying the most painful files, and waiting for Friday afternoon. By the time you see encryption, they have already seen everything.
- 03
Phishing reaches the bookkeeper for a reason. Bookkeepers, AP clerks, and intake staff are the highest-value initial targets — they routinely click links from people they do not know, and they have credentials that move money.
- 04
Cyber insurance pays when the controls you wrote on the application were real on the day of the incident. If you said MFA is enforced, MFA had better be enforced. Carriers re-read the application like a deposition.
- 05
Eleven days down at a 75-attorney firm is not eleven days of lost revenue — it is a court calendar that does not pause for you. Continuances stack up, opposing counsel pushes for sanctions, clients call asking why their hearing was moved.
The 60-second self-check
Three yes/no questions
If any answer is no — or any answer is "I think so" — you have the same exposure profile as the firm in this story.
1. Have you — in the last 30 days — actually restored a representative file from your backup system, end to end, on a clean machine?
Not 'verified the backup ran.' Not 'tested the backup software.' Actually restored a file you can open. (A minimal sketch of such a drill follows these three questions.)
2. If your bookkeeper or AP clerk clicked a link and entered their email password right now, would MFA stop the attacker from logging in from a different country?
If MFA is configured but not enforced, the answer is no.
3. Do you know — without asking IT — how many days of business records you would lose if you had to restore from your most recent reliable backup tonight?
If the answer is 'a few hours' but you have not tested it, the real answer is closer to 'I do not know.'
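For question 1, the drill does not need to be elaborate; it needs to be real. A minimal monthly sketch, assuming backups land in an object store like the bucket described above and each backup's SHA-256 was recorded in a manifest at write time (bucket, key, and hash are placeholders):

```python
import hashlib
import boto3

s3 = boto3.client("s3")
BUCKET = "firm-backups-immutable"        # hypothetical, as in the sketch above
KEY = "nightly/2024-04-15/files.tar.gz"  # hypothetical backup object
EXPECTED_SHA256 = "<hash recorded in the manifest when the backup was written>"

# Step 1: pull the object back, end to end, onto a clean machine.
s3.download_file(BUCKET, KEY, "restore-drill.tar.gz")

# Step 2: verify the restored bytes match what was originally written.
digest = hashlib.sha256()
with open("restore-drill.tar.gz", "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        digest.update(chunk)

if digest.hexdigest() != EXPECTED_SHA256:
    raise SystemExit("RESTORE DRILL FAILED: checksum mismatch")
print("restore drill passed: the backup is restorable and intact")
```

A fuller drill would also unpack the archive and open a sampled document, which is the "file you can open" standard the question actually sets.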
Related incident stories
Two more patterns owners ask about
Metal-fab manufacturer
BEC fraud — $185K wire stolen, insurance denied
The attacker watched the AP clerk for six weeks and waited for the right invoice. The insurance application said MFA. The carrier said show me where it was enforced.
Regional logistics firm
Supply-chain compromise — $8M ARR lost
The dispatch software vendor pushed a routine update containing a backdoor. Eight weeks later, customer contracts were on a dark-web forum and the largest shippers did not renew.
Names, locations, and identifying details changed. Numbers represent typical ranges from EFROS engagements; specific cases vary. Nothing on this page is legal advice.