The line separating reality from artificiality is more blurred than ever. The development of machine learning and artificial intelligence has produced exciting but troubling phenomena. Machines are now producing completely new forms of content and data, which can be leveraged to create sophisticated deepfakes that strikingly resemble actual people.
Until very recently, the e-discovery process of finding, preserving and analyzing electronic information to uncover facts centered on two broad types of information: user-generated (email, documents, chat messages, etc.) and system-generated (operating system data, application logs, etc.). Long-established rules and processes govern how these categories of electronic information are filtered and handled during discovery, ensuring that discovery, review and document productions focus only on the individuals, timeframes and activities relevant to the matter at hand. Even so, emerging data sources — such as Slack, WhatsApp, Microsoft Teams and other cloud platforms — have created numerous challenges across evidence preservation, collection, analysis, review and production.
Beyond the rapidly evolving complexities of emerging data sources, what about wholly new data that is neither user- nor system-generated, but AI-generated — whether entirely by algorithms or in tandem with human input? How will traditional tools and techniques need to adapt to handle data challenges that digital forensics specialists and lawyers have never before encountered?