When's the last time you stepped back from an upcoming document review project and said, “Hmm, I wonder if there's a better way to do this?” Analysis of the e-discovery process before it begins can make a project more efficient and cost-effective, but often, lawyers and document reviewers dive right in without a plan.
Instead, Gareth Evans, a partner at Gibson Dunn and co-chair of the firm's electronic discovery and information law practice, advocates using statistical analysis both to minimize the burden of discovery and to provide a benchmark by which lawyers can know whether any document review steps need to be redone.
Evans says that performing statistical analysis can answer a number of questions: “It's really important in terms of knowing what you're dealing with, what's the best process to use to cull and review the documents, how long it's going to take, and how much it's going to cost.” Especially as more and more lawyers are missing production deadlines imposed by courts and governmental investigators, finding an answer to these questions takes on monumental importance.
In a whitepaper, “Metrics that Matter,” Evans laid out six key metrics that all reviewers should take into account when analyzing a project.
Perhaps surprisingly, Evans says that the metric that trips up the most lawyers is actually the one that he deems the most important: recall.
“When lawyers are dealing with issues of recall, the initial inclination is that anything short of perfection would be unacceptable, which runs totally contrary to what search retrieval science tells us,” Evans said. “In reality, if you're able to get a recall level of between 70% and 80%, you're doing very, very well.”
In practice, he adds, “it's pretty rare that there's any actual formal testing of proposed search terms by recall and precision.” Most often, the two sides will instead come to a general, non-specific agreement on the number of terms that will be used.
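To make the two metrics concrete, here is a minimal sketch of how recall and precision are computed from a manually reviewed validation sample. The counts are hypothetical and are not drawn from Evans's whitepaper; they simply illustrate the arithmetic.

```python
# Recall and precision from a reviewed validation sample.
# All counts below are hypothetical, for illustration only.

def recall(true_positives: int, false_negatives: int) -> float:
    """Share of all responsive documents that the search actually found."""
    return true_positives / (true_positives + false_negatives)

def precision(true_positives: int, false_positives: int) -> float:
    """Share of retrieved documents that are actually responsive."""
    return true_positives / (true_positives + false_positives)

# Suppose sample review shows the search terms retrieved 750 responsive
# documents (true positives), missed 250 responsive documents (false
# negatives), and swept in 500 non-responsive documents (false positives).
print(f"recall:    {recall(750, 250):.0%}")     # 75% -- within the 70%-80% band Evans cites
print(f"precision: {precision(750, 500):.0%}")  # 60%
```

The example shows why perfection is the wrong target: a 75% recall figure, far from any intuitive notion of completeness, is what search retrieval science treats as a very good result.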
The two final metrics, confidence level and confidence interval, may be tough to obtain, but they are also the statistics that provide defensibility of the review process in front of the court or opposing counsel.
“It provides defensibility of the overall process, because you can back up your assertions that the search process went well with actual information about it, as well as how well your reviewers did,” Evans notes.
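As a rough illustration of how those two statistics are produced, the sketch below computes a normal-approximation confidence interval for the responsiveness rate estimated from a random sample of the document population. The sample figures are hypothetical, and real review validation protocols may use more exact interval methods.

```python
# Normal-approximation confidence interval for the proportion of
# responsive documents, estimated from a random sample.
import math

def confidence_interval(hits: int, sample_size: int, z: float = 1.96):
    """Return (low, high) bounds; z=1.96 gives a 95% confidence level."""
    p = hits / sample_size                               # point estimate
    margin = z * math.sqrt(p * (1 - p) / sample_size)    # margin of error
    return p - margin, p + margin

# E.g., 120 responsive documents found in a random sample of 1,000:
low, high = confidence_interval(120, 1000)
print(f"estimated responsiveness rate 12.0%, 95% CI: {low:.1%} to {high:.1%}")
```

Numbers like these are what let counsel tell a court not just "the search went well," but "we are 95% confident the true responsiveness rate falls within this range."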
Typically, Evans says, he institutes these metrics at the very beginning of the review process, discussing them with opposing counsel when the two sides meet to lay out ground rules. Evans says that he has not yet been forced to back up his document culling methods in court using the given analytics, but if he has to, he is confident in the methods.
Using analytics to streamline the process seems straightforward, but often, Evans says, attorneys working on the case look at the forest, but not the trees.
“Most attorneys are not familiar with [recall and precision] and how helpful they can be,” he says. “Usually, counsel are focused on the facts, theories and overall strategy of the case, and they don't have a focus or interest on the logistics of document search and review.”
When metrics such as recall and precision are introduced, it is often through third-party e-discovery vendors, Evans adds. Even some vendors, though, do not use statistical analysis in review, so learning the basics on your own can be crucial to increased efficiency.