See Credit Details Below
Why You Should Attend
From predictive analytics, language modeling, and biometric customer authentication tools to lightning-speed data transfer and autonomous vehicles, industries across the globe are increasingly adopting and expanding artificial intelligence solutions that present unique legal challenges. If you work at or counsel a technology company — or any company that builds or uses data-sharing, data-analytics, or artificial intelligence offerings to support or enhance its business, which describes most companies today — this program will provide guidance on the legal risks associated with these technologies and services, current best practices, and tips on how to anticipate and mitigate those risks.
What You Will Learn
- Hear current best practices for protecting and exploiting data while ensuring compliance with the law, including data privacy law
- Learn about the dynamic global regulatory landscape for AI products and services across a variety of sectors, including mobile, social media, healthcare and finance
- Understand current best practices for ensuring compliance with key regulations focused on AI
- Learn how to anticipate and eliminate bias in AI and data-centric business models
- Learn about mitigation strategies to address the unique legal risks and challenges presented by artificial intelligence
- Earn one hour of Elimination of Bias credit
Who Should Attend
Intellectual property, privacy, litigation, and corporate counsel focused on developing or implementing policies to mitigate risks associated with the exploitation of data, artificial intelligence, and related products and services should attend this program. The program will also benefit in-house counsel involved with: (i) data privacy compliance; (ii) assessment of data collection, sharing, and retention practices, as well as AI product deployment; (iii) litigation risk management in AI development and exploitation; and/or (iv) addressing bias in data analytics and AI-informed decision making.