Still basking in the glow of our 20th Anniversary celebrations, we marked another significant milestone this spring – CDS’ 7th annual Putting Insights Into Practice (PIIP) forum took place virtually on May 1-2, 2024.
Hundreds of eDiscovery practitioners joined us for fascinating discussions on short message data and mobile device collections, current case law, choosing the right tech at the right time, and new considerations in data governance.
With so many compelling, confounding data challenges facing the industry, it has never been more important to foster exploration, innovation, and collaboration, and to propel eDiscovery forward, together.
PIIP: Not Your Average Legal Data Management Conference
This year’s PIIP set a new bar. Our program launched with an electrifying, real-time reality check from Chuck Nice, comedian and co-host of StarTalk with Dr. Neil deGrasse Tyson. Grounded in scientific theory and infused with humor, Chuck’s keynote reminded us to trust in our humanity when confronting technological change.
“[AI] can’t look up and wonder . . . It cannot ask those great questions that we, as human beings, are able to. So, any position that requires true questioning, true contextual thinking, true accountability, true creativity, those will require a human being. And AI will be the technology that those human beings use to gain an advantage, to increase productivity, to bring added value to their clients. But what it won’t do is replace them.” – Chuck Nice
Check out a clip from Chuck’s talk here:
Expedition Generative AI
Many PIIP sessions and speakers touched on the potential of Generative AI, highlighting the outsized role this rapidly evolving technology is already playing in today’s dynamic dataverse. To help our audience better understand the many implications of Gen AI, we invited ‘Storyteller from the Future’ Karen Palmer to expose the complexities of combating bias in AI based on race, gender, and class.
Karen, an SXSW-award-winning Extended Reality (XR) filmmaker, shared examples of AI influencing unfair outcomes in the workplace and the justice system. She described three main sources of AI bias: AI systems are trained on data, and that data may be biased; algorithms are designed by humans, and those humans may be biased; and humans may introduce bias through algorithmic testing that is itself biased by design.
“So you may call it artificial intelligence. I call it artificial stupidity, because AI systems can basically reinforce existing societal biases creating a cycle of discrimination.” – Karen Palmer
Watch Karen describe her award-winning project, Perception iO:
Her prescription for eDiscovery practitioners: Each of us has a stake in the development of a more just, ethical, AI-empowered future. As you sharpen your Gen AI aptitude, hone your awareness of bias as well. We can be the heroes and heroines of this developing AI story.
These talking points from both of our keynote speakers inspired discussion among our six panels at PIIP 2024, which had plenty to say about the state of eDiscovery today. Read Part 2 of our PIIP 2024 recap for our top five takeaways.