Most general counsel don’t know it, but the guy who’s killing them is Robert Metcalfe, the founder of 3Com. Metcalfe’s law about the value—and complexity—of networks is the amplifying force behind the GC’s increasingly impossible conflict between risk, cost and quality. And for those who understand its implications, it suggests some radical but achievable changes in law department management.
Today’s GC needs to control cost. We’re all aware of that. For most, more than half of the outside counsel budget is spent on litigation, two-thirds of which is consumed by discovery. No surprise there either—discovery is ground zero for the cost control crisis faced by law department managers.
On the flip side of the cost equation, risk in discovery has simultaneously spiked. Despite a few cautionary tales imprinting themselves on the conscience of the industry, such as the malpractice charge against McDermott Will & Emery over the production of 3,900 privileged documents, most remain unaware of the increase in motions for discovery sanctions. Between 2004 and 2010, requests for sanctions, costs or rulings under Federal Rule of Evidence 502(b) rose an alarming 800 percent.
Also concerning is the mounting evidence that the average document review is woefully inaccurate. Analyses of the Text Retrieval Conference (TREC) 2009 Legal Track data repeatedly show that reviewers fail to identify 20 percent to 75 percent of relevant documents. Another study, conducted in connection with Verizon’s acquisition of MCI, revealed that independent review teams yielded only a 16 percent to 28 percent overlap in responsive documents. What we face today in discovery is a trifecta of GC nightmares: high costs, high risk and poor quality.
Why are all three dynamics heading into the red zone at the same time? It’s Metcalfe. He pointed out that the complexity of a network, measured as the number of unique interactions, grows roughly with the square of the number of unique nodes in the network.
In English: If you have 10 people working on a team charged with reviewing documents, there are 45 unique interactions to resolve issues, share lessons and ensure consistent interpretation of information. But Metcalfe tells us that when you have 100 people reviewing documents, there are 4,950 unique interactions. Complexity increases 10 times faster than team size.
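The arithmetic behind those figures is the standard pairwise-interaction count, n(n-1)/2. A minimal sketch (illustrative only, not from the article):

```python
def unique_interactions(team_size: int) -> int:
    """Unique pairwise interactions in a team of n reviewers:
    each of the n members can pair with (n - 1) others, and each
    pair is counted once, giving n * (n - 1) / 2."""
    return team_size * (team_size - 1) // 2

# A 10-person review team has 45 channels to keep consistent;
# a 100-person team has 4,950.
print(unique_interactions(10))   # 45
print(unique_interactions(100))  # 4950
```

Tenfold growth in the team produces more than a hundredfold growth in the channels that must stay consistent.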
As the magnitude of the discovery challenge has grown, instead of rethinking how the work gets done, we’ve simply replaced law firm associates with temp attorneys. The quality of discovery is suffering mightily as a result, but at least we’re saving money, right? Unfortunately, the savings we think we’re getting from cheaper reviewers are largely fictitious. While $50/hour may feel gratifying, we’re failing to account for other, more meaningful cost inputs.
To improve the total cost of review, one must pay attention to the volume of review hours and the amount of re-review performed by the supervising law firm. These inputs balloon as complexity increases. Because the firm bills at many times the contract reviewer’s rate, it needs to re-review only 15 percent of the work to double the overall review cost. Anecdotally, 30 percent re-review is not uncommon.
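A back-of-the-envelope sketch shows why a modest re-review rate can double total cost. The $50/hour contract-reviewer rate comes from the article; the $350/hour supervising-firm rate and the simple cost model are my own illustrative assumptions:

```python
def loaded_cost_per_review_hour(reviewer_rate: float,
                                firm_rate: float,
                                rereview_fraction: float) -> float:
    """Cost of one contract-reviewer hour plus the supervising
    firm's re-review of a fraction of that work at the firm's
    (much higher) hourly rate."""
    return reviewer_rate + rereview_fraction * firm_rate

base = loaded_cost_per_review_hour(50.0, 350.0, 0.0)     # $50: the "cheap" headline rate
loaded = loaded_cost_per_review_hour(50.0, 350.0, 0.15)  # $102.50: more than double
print(base, loaded)
```

At a 30 percent re-review rate under the same assumptions, the loaded cost reaches $155 per hour, more than triple the headline rate.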
It’s astounding how many in-house departments spending millions on discovery fool themselves into believing that they are getting a great deal because of lower reviewer rates. The relationship between the hourly reviewer rate and the overall cost of discovery is anything but linear—in some cases, it is inversely correlated.
So what should we do? Tackling risk and cost in complex discovery requires us to shift focus from who is doing the review to how the work gets done.
Unfortunately, just about everyone has learned to say “process,” “Six Sigma” and “Lean.” A genuine effort to get underneath the buzzwords to real process requires an investment of time on the buyer’s part.
Here are three questions designed to test whether a discovery partner is substituting buzzwords for capabilities:
- What is the statistical sampling methodology that drives the quality control process? Quality control is all about statistical confidence levels. You don’t want to over-sample (costly) or under-sample (risky). How you sample (by reviewer, by issue) is crucial. Pay attention to the timing and frequency with which sampling is done. A robust quality control process emphasizes continual learning throughout the review. If a provider samples 15 percent of all documents because “that’s industry standard,” raise the red flag.
- How does tight process management result in greater defensibility and inspire trust in lead counsel? The key to defensibility and efficiency is deploying replicable processes end-to-end. A trustworthy partner will be able to demonstrate robust interlocking processes throughout the review, and show how those processes translate into lead counsel re-reviewing less than 3 percent of the documents.
- How is knowledge management used to analyze information throughout the case? Extracting and organizing information during the first review should avoid substantial re-review downstream. A discerning buyer will demand that his review partner generate issue analyses, timelines, privilege logs and other analytic output in real time throughout the case.
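To illustrate the first question: the textbook sample-size formula shows how a statistically grounded sample compares with a flat 15 percent. The confidence level, margin of error, and review size below are my own illustrative assumptions, not figures from the article:

```python
import math

def sample_size(z: float, margin: float, p: float = 0.5) -> int:
    """Documents to sample for a given confidence level (z-score)
    and margin of error, using the conservative worst case p = 0.5
    for the proportion being estimated."""
    return math.ceil(z * z * p * (1 - p) / (margin * margin))

n = sample_size(z=1.96, margin=0.02)  # 95% confidence, +/-2% margin
print(n)                              # ~2,401 documents
review_size = 100_000                 # hypothetical review population
print(n / review_size)                # ~2.4%, far below a flat 15% sample
```

A finite-population correction would shrink the sample further for smaller reviews, which is why a one-size-fits-all percentage signals a provider that is not actually doing the statistics.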
Ultimately, the sensible answer to the discovery challenge is likely technological, with analytics and predictive coding replacing human effort and error. That’s the glorious and inevitable future. But as we work out the kinks in automated solutions, process capabilities—the how—are the only way to ride herd over large-scale, high-stakes reviews.
The International Data Corporation predicts that the amount of electronic data will increase by 45 percent per year for the next several years. Metcalfe is just getting started.