

Is it Time for Human Analytics in Document Review?

Discovery Digest – Q4 2013

By Robert B. "Barry" Wiggins

Document review often accounts for 60 percent or more of the cost of litigation. And although the size and nature of document reviews can vary, the basic goal is to identify documents that meet predefined criteria as specified in a document request. That said, document review needn’t be just a “cost of doing business” in discovery; instead, a well-devised and managed review can be a tool that furthers case strategy. Good processes, training, team management and the right mix of people and technology can pay tremendous dividends.

Through the application of these factors, or “human analytics,” review teams can prioritize the information to be reviewed with more confidence than ever before and work smarter and faster. At the same time, documents that have a significant bearing on the case can be identified earlier, giving counsel the opportunity to understand the facts and formulate litigation strategies earlier.

The human analytics process

Process-driven document review is the foundation of human analytics, regardless of whether the review is performed by in-house corporate groups, outside counsel or review teams managed by third parties. Some basic issues are present in nearly all reviews: the extent to which technology will be used to identify and organize the documents, who will conduct the actual eyes-on review and the process surrounding that review.

As an initial matter, for any technology to be deployed effectively, actual users — the review team and counsel — must be involved in determining whether it is “right” for the case at hand. This holds true for keyword searching, clustering, near-dupe analysis, predictive coding, or some combination, as it is only through human judgment and scrutiny that the efficacy of the technology can be determined.

For example, in a matter in which keyword searching is to be used, the technology can rapidly identify “hits,” and reports can be generated to show the frequency with which each keyword appears. An experienced reviewer is needed to determine whether the reported information validates the words chosen and, if not, to suggest alternatives. Similarly, the game-changing power offered by predictive coding can only be unleashed once human review has validated the results of the technology. This teaming of technology with reviewer expertise can exploit the best attributes of each.
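The hit-frequency reporting described above can be sketched in a few lines. This is a minimal illustration only; the document corpus and keyword list are invented for the example, and a real review platform would operate on indexed text rather than raw strings.

```python
from collections import Counter

# Hypothetical corpus and keyword list -- both are assumptions for
# illustration, not data from any real matter or review platform.
documents = {
    "DOC-001": "The merger agreement was signed by both parties.",
    "DOC-002": "Please forward the agreement to outside counsel.",
    "DOC-003": "Lunch on Friday?",
}
keywords = ["agreement", "merger", "counsel"]

# Count how many documents each keyword "hits".
hit_counts = Counter()
for doc_id, text in documents.items():
    lowered = text.lower()
    for kw in keywords:
        if kw in lowered:
            hit_counts[kw] += 1

for kw in keywords:
    print(f"{kw}: {hit_counts[kw]} of {len(documents)} documents")
```

A report like this shows which terms are pulling in large portions of the corpus; it is then the experienced reviewer's judgment, not the counts themselves, that determines whether the terms are well chosen.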

Once the appropriate technology is in place, it is important to consider how the actual review can be structured to maximize the potential of human analytics. This structure should address how to train and build the review team, how the team and counsel will communicate and how to achieve desired levels of quality. It should also establish overall project oversight.

A successful review begins with collaborative planning between counsel and leadership of the review team. Indeed, before the team reviews a single document, counsel and review team leadership should collaborate on a number of issues, such as drafting the training memorandum and addressing the application of privilege in the review. Counsel might consider undertaking a “quick peek” — either manually or through the application of concept or text-based technologies — to identify documents that can be used to train the technology and to educate the review team. This effort provides more meaningful context for the review and helps reviewers spot key documents much sooner.

The review team can be built while planning is underway. With human analytics, the formation of the team should be guided by factors such as subject matter expertise, scalability, the transparency of the staffing process and the reviewers' skill with the technology being deployed. As the team is coming together, technology can be used to organize the data for review. By using technology to examine various attributes of the discovery materials, such as text metadata or patterns of communication between custodians, counsel can determine how to present the documents for review. Documents can then be assigned on a targeted basis, giving each group of reviewers the opportunity to quickly become subject matter experts on its topic and enhancing review efficiency and consistency.

Once the team has been formed, the training materials created through the collaborative planning process can be used to guide the team through its review. These should include a “teach-in” or training memorandum for the matter, with guidance on document analysis and coding requirements, as well as training on the designated review database. Once the review begins, communication becomes even more important: effective document review is not a static process.

As the review progresses, reviewers learn more about the case and the documents. Counsel can take advantage of this learning process through the application of human analytics, leveraging people and technology to identify key documents sooner, improve consistency and capture reviewer insights about trends and anomalies in the data that could be useful in building the case itself. This can be accomplished in a number of ways, including regularly prepared status reports, budget reports, updates to review guidelines and document release schedules. By staying informed in real time, counsel can adjust case tactics and strategy along the way and be better positioned to take proactive measures in the management of the case.

Quality management underlies the review structure, and many of the same technologies can be used to detect and correct document coding defects. For example, a party can analyze reviewer coding decisions with predictive coding tools, targeted searching, automated scripts that catch inconsistencies, concept clustering or near-dupe analysis. These tools permit reviewers to be much more targeted in their quality-control efforts, minimizing time spent while maximizing effectiveness.
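One simple form of the inconsistency check mentioned above is to flag near-duplicate documents that received different coding decisions. The sketch below assumes hypothetical document IDs, near-dupe group labels and coding values; in practice the group labels would come from a near-dupe analysis tool.

```python
from collections import defaultdict

# Hypothetical coding log: (document ID, near-dupe group, decision).
# All values are invented for illustration.
coding_log = [
    ("DOC-101", "G1", "responsive"),
    ("DOC-102", "G1", "responsive"),
    ("DOC-103", "G1", "not_responsive"),  # conflicts with its group
    ("DOC-201", "G2", "privileged"),
    ("DOC-202", "G2", "privileged"),
]

# Collect the distinct decisions applied within each near-dupe group.
groups = defaultdict(set)
for doc_id, group, decision in coding_log:
    groups[group].add(decision)

# Flag any group whose members were coded inconsistently.
flagged = [g for g, decisions in groups.items() if len(decisions) > 1]
print("Groups needing re-review:", flagged)
```

Routing only the flagged groups back for re-review is what makes the quality-control effort targeted rather than a second full pass.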

Statistical quality acceptance (SQA) procedures can also be used to determine, through sampling techniques, whether a particular set of documents is ready to be released for production. If a sample fails to meet the established confidence level, appropriate remediation can be conducted and SQA can be repeated until the batch meets the agreed-upon quality level. A solid SQA program reduces time spent on quality control without sacrificing overall quality.
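The accept-or-remediate decision described above can be illustrated with a small sampling sketch. The batch data, sample size and allowable defect count here are assumptions for the example; in a real SQA program those parameters would be derived from the agreed-upon confidence level and quality threshold.

```python
import random

def sqa_check(batch, sample_size, max_defects, seed=0):
    """Sample coded documents from a batch; accept if defects are few.

    batch: list of (doc_id, is_correctly_coded) pairs. The thresholds
    are illustrative, not a substitute for a statistically derived
    sampling plan.
    """
    rng = random.Random(seed)
    sample = rng.sample(batch, min(sample_size, len(batch)))
    defects = sum(1 for _, ok in sample if not ok)
    return defects <= max_defects

# Hypothetical batch: 100 documents, one of which was miscoded.
batch = [(f"DOC-{i:03d}", i != 50) for i in range(100)]
print(sqa_check(batch, sample_size=20, max_defects=1))
```

If the check fails, the batch is remediated and sampled again, which is exactly the repeat-until-acceptance loop the SQA procedure contemplates.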

These are challenging times for legal departments. Human analytics — the right people using the right tool in the right way — is a proven approach for confronting the challenges of increasing amounts of discoverable information and tightening legal budgets.

Reprinted with permission from the October 15, 2013 issue of Corporate Counsel. © 2013 ALM Media Properties, LLC.

As used in this document, “Deloitte” means Deloitte LLP and its subsidiaries. Please see www.deloitte.com/us/about for a detailed description of the legal structure of Deloitte LLP and its subsidiaries. Certain services may not be available to attest clients under the rules and regulations of public accounting.
