To analyze the reasons Lehman Brothers failed in September 2008, Jenner & Block mined the company's electronically stored information (ESI)--an estimated three petabytes of data, roughly 350 billion pages of documents--for relevant issues. That's 150 times more information than the Library of Congress holds.
The law firm used electronic review technology to pare down the huge collection. Still, the bill for the page-by-page linear (manual) review of the remaining 40 million pages by 70 contract attorneys at rates averaging $50 per hour came to nearly $6 million, according to Robert Byman, a partner at Jenner. Sifting through all the ESI at the same rates without the initial computer-aided review would have cost a staggering $52.5 billion.
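The $52.5 billion figure follows directly from the numbers Byman cites. A quick back-of-the-envelope check, using only the article's own inputs:

```python
# All inputs below are figures from the article; the extrapolation
# simply applies the observed per-page cost to the full collection.
reviewed_pages = 40_000_000        # pages actually reviewed manually
reviewed_cost = 6_000_000          # ~$6 million for that review
total_pages = 350_000_000_000      # estimated pages in the full ESI set

cost_per_page = reviewed_cost / reviewed_pages     # $0.15 per page
full_manual_cost = total_pages * cost_per_page     # ~$52.5 billion

print(f"${cost_per_page:.2f} per page")
print(f"${full_manual_cost / 1e9:.1f} billion to review everything manually")
```

At $0.15 per page, reviewing all 350 billion pages by hand would indeed cost $52.5 billion.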
Linear review remains one of the most time-consuming and costly parts of e-discovery. A project consisting of 500 gigabytes of data reviewed by 40 reviewers making 100 document decisions per hour and working 10 hours per day at a rate of $60 per hour may take two months and cost more than $1 million, according to LexisNexis Applied Discovery.
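The LexisNexis example can be modeled in a few lines. The document density (documents per gigabyte) below is a hypothetical assumption, since the article gives only the data volume; actual yields vary widely with file types.

```python
# Rough model of the LexisNexis Applied Discovery example.
DOCS_PER_GB = 3_500                 # assumed density, not from the article
data_gb = 500
reviewers = 40
decisions_per_hour = 100
hours_per_day = 10
rate_per_hour = 60                  # dollars

total_docs = data_gb * DOCS_PER_GB                             # 1,750,000
docs_per_day = reviewers * decisions_per_hour * hours_per_day  # 40,000/day
review_days = total_docs / docs_per_day                        # ~44 working days
total_cost = review_days * reviewers * hours_per_day * rate_per_hour

print(f"{review_days:.0f} working days (~2 months), ${total_cost:,.0f}")
```

Under that density assumption, the project takes about 44 working days--roughly two months--and costs just over $1 million, matching the article's figures.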
Jack Halprin, vice president of e-discovery and compliance at systems provider Autonomy, claims that computer-aided review can save 75 to 90 percent over similar review done manually.
Nevertheless, many lawyers accustomed to linear review think of it as the "gold standard," convinced an attorney's eyes will identify relevant documents and sort out privileged material that a computer will miss. But a recent study reveals that the gold standard is a bit tarnished.
The Electronic Discovery Institute, a non-profit research organization, completed a study of ESI that had been reviewed manually in response to a Department of Justice investigation of Verizon's acquisition of MCI. The cost of the initial review was more than $13 million, about $8.50 per document. A sample of 5,000 documents from the original collection was reviewed twice by computer and also by two human review teams--Teams A and B.
Both machine reviews agreed with the original human reviews about 83 percent of the time. Team A agreed with the original reviewers 76 percent of the time, and Team B agreed on about 72 percent of the documents. Team A agreed with Team B 70 percent of the time, reports Joseph Howie, a principal at Howie Consulting and director of metrics development and communications at the Electronic Discovery Institute.
Howie says this shows that machine review is more consistent. "Manual review of each record produces poor quality decisions if poor quality is defined to include the concept of replicability--the ability to take the same input through the same process and produce the same results," he says.
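The agreement rates in the study are simple pairwise percentages: the share of documents on which two reviews made the same responsive/non-responsive call. A minimal sketch of the computation, with invented sample decisions:

```python
# Pairwise agreement rate between two review teams' coding decisions.
# The decision lists below are invented sample data for illustration.
def agreement_rate(calls_a, calls_b):
    """Fraction of documents on which two reviews made the same call."""
    if len(calls_a) != len(calls_b):
        raise ValueError("reviews must cover the same documents")
    matches = sum(a == b for a, b in zip(calls_a, calls_b))
    return matches / len(calls_a)

team_a = ["R", "R", "N", "R", "N", "N", "R", "N", "N", "R"]  # R=responsive
team_b = ["R", "N", "N", "R", "N", "R", "R", "N", "N", "N"]  # N=non-responsive

print(f"{agreement_rate(team_a, team_b):.0%} agreement")  # 7 of 10 match
```

This is also what replicability means in Howie's sense: run the same documents through the same process twice, and the agreement rate of the two passes measures how repeatable the process is.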
Craig Carpenter, vice president of marketing and GC for systems provider Recommind, cites several reasons he believes human review is expensive yet often performs poorly. For starters, many documents collected for an e-discovery review are "false positives" (irrelevant, unresponsive or both), which racks up unnecessary costs if an attorney reviews all the documents.
Second, documents typically are not organized by topic. Computer software can cluster similar subject matter together to increase reviewer productivity.
Third, if documents aren't prioritized by importance and reviewers know little about a case's substance, they can miss key documents. Many computer systems rank records by importance. Records given the highest ranking can be assigned to the most experienced attorneys for review.
Finally, linear reviewing personnel can become prone to error after reviewing page after page under a tight schedule.
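The prioritization idea above can be sketched simply: score each document, sort by score, and route the top tier to the most experienced attorneys. The keyword weights and documents below are invented for illustration; real systems use far richer relevance models.

```python
# Illustrative priority ranking: score each document by weighted keyword
# hits, then route the top tier to the most experienced reviewers.
# Keywords, weights, and documents are all invented for this example.
KEYWORD_WEIGHTS = {"merger": 3.0, "repurchase": 2.0, "quarterly": 1.0}

def score(text):
    words = text.lower().split()
    return sum(w * words.count(k) for k, w in KEYWORD_WEIGHTS.items())

docs = {
    "doc1": "quarterly numbers attached",
    "doc2": "repurchase agreement for the merger",
    "doc3": "lunch on friday?",
}

ranked = sorted(docs, key=lambda d: score(docs[d]), reverse=True)
top_tier = ranked[: len(ranked) // 2 or 1]  # queue for senior attorneys
print("review order:", ranked, "| senior queue:", top_tier)
```

Even this toy version shows the payoff: the likely-hot document surfaces first instead of sitting at position 1.2 million in a chronological queue.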
Foster Gibbons, director of document review services at Integreon, a legal outsourcing firm that conducts linear privilege and responsiveness reviews, agrees that computer-aided reviews can result in "jaw-dropping reduction in volume and more efficient workflow," but emphasizes the value of lawyers prioritizing issues and safeguarding privileged documents.
He says the results of linear review depend on the experience of the review team, the direction the team receives, and a well-designed quality-control system.
"Counsel must provide clear instructions at the outset and resolve questions raised in the course of review," he says. "We know that a review team gains proficiency in stages during a review project, eventually reaching a 'steady state,' with decisions more consistently correct on first pass."
Art Crivella, co-founder and CEO of discovery provider Crivella West, says in some cases, extensive human review is unnecessary as long as privilege is protected. His company's technology focuses on isolating privileged material before relevancy searching.
"Relevancy searching makes the small case unaffordable to litigate," he says. "After you make your best efforts to reduce the trash and duplicates, just produce the rest. Why mount an army of reviewers--a slave ship--to go over this material?"
But most experts advise a combination of techniques with computer review supporting, but not replacing, manual review.
"It's important to note that humans are not being taken out of the process, only leveraged," Howie says. "Nobody that I know would suggest that some system will analyze and produce records with no guidance or input from lawyers."
A sound e-discovery methodology can support efforts to reach an agreement with opposing counsel in meet-and-confer negotiations to limit the documents that must be produced, which can be incorporated into a court order. Rule 26 of the Federal Rules of Civil Procedure permits courts to limit e-discovery expense.
In terms of the method used for e-discovery review, case law requires only demonstrating a "reasonable and defensible" method.
As Judge Paul Grimm of the Federal District Court for the District of Maryland held in Victor Stanley Inc. v. Creative Pipe Inc. I: "The implementation of the methodology selected should be tested for quality assurance; and the party selecting the methodology must be prepared to explain the rationale for the method chosen to the court, demonstrate that it is appropriate for the task, and show that it was properly implemented."
Carpenter says computer review is as defensible as human review because it is easy to document the methodologies and workflows with automated random sampling of documents for possible errors.
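The sampling Carpenter describes amounts to re-reviewing a random draw of coded documents and estimating the overall error rate with a margin of error. A minimal sketch under invented parameters (population size, sample size, and the simulated error rate are all assumptions), using a normal-approximation confidence interval:

```python
import math
import random

# Sketch of sampling-based quality control: re-review a random sample of
# machine-coded documents and estimate the overall error rate. Population
# size, sample size, and the simulated error rate are invented.
random.seed(7)
population = 100_000                 # documents coded by the system
sample_size = 400
true_error = 0.05                    # simulated; unknown in practice

sample_errors = sum(random.random() < true_error for _ in range(sample_size))
p_hat = sample_errors / sample_size
# 95% margin of error, normal approximation
moe = 1.96 * math.sqrt(p_hat * (1 - p_hat) / sample_size)

print(f"estimated error rate {p_hat:.1%} +/- {moe:.1%}")
```

Documenting the sample design and the resulting estimate is precisely what makes such a workflow easy to defend: the party can show the court a measured error rate rather than an assertion of care.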