My fellow electronic discovery veterans and I have authored thousands of prescriptive articles offering advice on how best to conduct and improve an e-discovery review. Our industry often talks about 'best practices' as if they were non-intuitive, or so unique that only the truly gifted and inspired can attain such vaunted status. In truth, I believe the best e-discovery review practices are better characterized as the application of real-life lessons. They are not complicated, and focusing on a few of these lessons with a thoughtful, deliberate approach will produce a truly effective electronic discovery review.
Plan, and Then Plan Some More
The absolute number one lesson is to plan. I would advocate that in nearly every project, e-discovery or otherwise, doubling (yes, doubling) the planning effort results in returns that substantially exceed the additional investment. Take, for example, a large e-discovery effort involving 100 custodians, which can easily incur more than $5 million in review and e-discovery costs.
In my experience, teams on e-discovery projects of this size spend in the neighborhood of 100 hours in the planning phase. At a billing rate of $500 per hour, that is $50,000 in planning costs, which, astoundingly, is less than 1% of the project budget. Compared with other industries, where 10% is often the norm, this is a miserably small investment in such a critical activity. It is even more astonishing when you consider the level of coordination needed in large-scale discovery involving law firms, clients, service providers and others, all while synthesizing legal and technical requirements with a kitchen full of cooks. So how do we plan to plan?
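Before we do, here is the budget arithmetic above as a minimal sketch; the hourly rate, hour count and budget figure are the illustrative numbers from this example, not benchmarks.

# Back-of-envelope comparison of planning spend against total project cost,
# using the illustrative figures from the example above.

planning_hours = 100          # planning effort typically observed on a 100-custodian project
billing_rate = 500            # dollars per hour
project_budget = 5_000_000    # total review and e-discovery cost (lower bound)

planning_cost = planning_hours * billing_rate
planning_share = planning_cost / project_budget

print(f"Planning cost: ${planning_cost:,}")              # $50,000
print(f"Share of project budget: {planning_share:.1%}")  # 1.0% at the $5M floor, less as the budget grows
print(f"At a 10% norm, planning would be: ${0.10 * project_budget:,.0f}")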
Lesson Number One
First, use staff and service providers who are trained, certified project managers and who understand the nuances of a complex e-discovery review. Second, start early; project planning often gets shortchanged because there is a rush to begin. Lastly, formally establish a reasonable project management budget and communicate the plan to all parties involved so that realistic expectations are set up front.
Lesson Number Two
Determine objectives and think SMART: Specific, Measurable, Actionable, Relevant, and Timely. Every project starts with an objective, and the more clearly that objective is defined, the greater the likelihood the project will succeed. How many e-discovery reviews have started with no firm completion date, only to wind up late? Defining a project's objectives and goals at the outset, even if somewhat arbitrarily, is critical to setting appropriate expectations and ultimately delivering a successful review. SMART is a wonderfully useful test that every objective should satisfy.
Lesson Number Three
Determine objectives and think SMART. Wait, isn't this a repeat? Yes, it is. I cannot stress enough how simple yet important this is. Take, for example, an objective defining review accuracy (which, by the way, most e-discovery reviews do not formally consider but should).
The theoretical goal of any review is to be 100% accurate in determining responsive and privilege calls. Most of us know that attorneys are human and prone to the occasional error. So what is a reasonable, acceptable accuracy rate? Is it 99%, or one error out of every 100; 99.99%, or one error out of every 10,000; or will 90%, one error out of every 10, be acceptable? And what are we measuring? Must every redaction, responsive call and issue classification be correct, or are privilege claims the primary focus? I can tell you that a review striving for 99.99% accuracy looks (and costs) much different than one aiming for 90%. As you think about this, be SMART.
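To make those targets tangible, here is a minimal sketch, with a hypothetical review size chosen purely for illustration, that translates each accuracy rate into the number of erroneous calls it would tolerate.

# Translate target accuracy rates into the number of incorrect document calls
# each target would tolerate. The review size is a hypothetical figure for illustration.

review_size = 100_000  # hypothetical number of documents in the review

error_rates = {
    "90% accuracy": 0.10,       # one error in every 10 calls
    "99% accuracy": 0.01,       # one error in every 100 calls
    "99.99% accuracy": 0.0001,  # one error in every 10,000 calls
}

for label, error_rate in error_rates.items():
    tolerated = round(review_size * error_rate)
    print(f"{label}: up to {tolerated:,} erroneous calls out of {review_size:,} documents")

The difference between tolerating 10 errors and tolerating 10,000 in the same population is exactly why the two reviews look, and cost, so different.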
Lesson Number Four
Detailed, documented instructions for the review team are critical. By the time you check for quality and consistency at the end of the review, it is too late; the project has to be approached from the beginning with quality objectives in mind. For example, if the objective is 99.99% accuracy on responsive calls, is it reasonable to assume that two different people will review 1,000 documents and agree on every call but one? At a minimum, providing appropriate training and documentation will ensure that the review team has the basic foundation to make consistent document calls.
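As a rough illustration of why that level of agreement is demanding, the sketch below estimates how often two reviewers would disagree if each made independent calls at a given individual accuracy; the independence assumption and the accuracy figures are mine, chosen for illustration.

# Expected number of documents on which two reviewers disagree, assuming each
# reviewer independently makes the correct call with probability p.
# Simplified model for illustration; real reviewer errors are rarely independent.

def expected_disagreements(num_documents: int, per_reviewer_accuracy: float) -> float:
    p = per_reviewer_accuracy
    # The two reviewers disagree when exactly one of them is wrong: 2 * p * (1 - p).
    return num_documents * 2 * p * (1 - p)

docs = 1_000  # the 1,000-document comparison posed above

for accuracy in (0.90, 0.99, 0.9999):
    print(f"Per-reviewer accuracy {accuracy:.2%}: "
          f"~{expected_disagreements(docs, accuracy):.1f} expected disagreements")

Under this simplified model, even reviewers who are individually 99% accurate would be expected to disagree on roughly 20 of 1,000 documents, which is precisely why shared training and documentation matter.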
Lesson Number Five
Review environments vary, and an under-appreciated factor in effective reviews is the computing environment. For example, the Internet traffic generated by a large review team can equal, if not exceed, the traffic generated by the typical browsing activity of an entire firm or legal department. Will the IT infrastructure support it? What are the contingencies if an Internet disruption occurs? If an in-house application is being used, will it support a large number of concurrent users? Review environment requirements should address not only technology but also people. Happy, enthusiastic people do better work, so in all of the last-minute madness, don't forget to take care of the people who can fundamentally make or break your e-discovery review project.
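One way to pressure-test the infrastructure questions above is a back-of-envelope load estimate; every figure below (reviewer head count, review pace, average document size) is a hypothetical placeholder to be replaced with your project's actual numbers.

# Rough estimate of the sustained bandwidth a review team generates.
# All inputs are hypothetical placeholders; substitute your project's figures.

reviewers = 50                    # concurrent reviewers (hypothetical)
docs_per_reviewer_per_hour = 60   # review pace per person (hypothetical)
avg_document_mb = 1.5             # average size of a rendered document (hypothetical)

docs_per_hour = reviewers * docs_per_reviewer_per_hour
megabytes_per_hour = docs_per_hour * avg_document_mb
megabits_per_second = megabytes_per_hour * 8 / 3600  # sustained average, before peaks

print(f"Documents pulled per hour: {docs_per_hour:,}")
print(f"Sustained transfer: {megabytes_per_hour:,.0f} MB/hour (~{megabits_per_second:.1f} Mbps average)")

Even a modest team under these assumptions generates a steady load well beyond routine office browsing, which is why these questions belong in the plan rather than in the first week of review.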
Conclusion
Planning, SMART objectives, documented review instructions and a healthy, positive review environment are the basic building blocks of an effective e-discovery review. These practical, hard-earned lessons are not complicated, and they are easy to incorporate into your next review.
Allen Gurney is the Engagement Manager, Discovery Management, at Fios in Portland, OR, a firm that provides electronic discovery services. He is responsible for helping clients reduce the costs and risks of e-discovery. Prior to Fios, Gurney spent nine years at Thompson Coburn LLP in St. Louis, leading the firm's client technology services group.