- Responsible, by ClearOPS, 1/7/2025
Happy New Year! No predictions or look-backs to start the year; hate the machine.
What a year that was! Phew.
Welcome to the 2025 edition of Responsible, a newsletter about responsible AI business practices, mixed with my English humor and wit. I am Caroline: a lawyer, an avid cook, and a complete nerd when it comes to AI. I love it, especially GenAI. I also run a course on implementing AI governance, and I run an AI company called ClearOPS. There are a few spots left in the course if you want to join us on Friday!
I love to help, give advice, share stories, and strategize. Hence, this newsletter. I hope you enjoy it.
What I have for you this week:
All that content means a lot more discovery
AI is causing us to go backwards
Caroline’s weekly musings
Chef Maggie Recommends
AI Tool of the Week
My Three AI Things

Exasperated in-house counsel - DALL-E
What is the number one question on every in-house lawyer’s mind right this very second?
It is: “How much discovery is all this AI tooling creating, and how am I going to manage it?”
I remember when chats and SMS created this headache. Now we have video calls with AI notetakers and recordings; we have copilots creating document summaries and automated emails, including automated responses. I could go on and on.
Frankly, I am less concerned about the discovery than I am about just locating all this data.
Which brings me to my next point: has anyone realized that current AI tools are all about creating more data? Use the AI to feed the AI…
But the legal risk calculation is actually quite difficult, which is why every in-house lawyer is pondering it right now. An AI notetaker on Zoom is different from recording the call. Sure, Zoom has the ability to ask for consent in compliance with wiretapping laws, but the analysis goes a lot deeper. Now you have to consider privacy, confidentiality/contract law, customer content, cybersecurity, and data retention.
When Slack first got popular, I remember my team at Sailthru coming to me to ask for a review of its terms. When I asked what conversations Slack would have access to, and then read the terms, I actually blocked Slack as a vendor. I told my team that Slack's confidentiality, privacy, and cybersecurity provisions were unacceptable for the types of conversations allowed, such as those within HR. I heavily negotiated the terms and forced Slack to “level up” before we would do business with them.
But that was the tip of the iceberg.
Conversations on Slack were hard to keep track of and contained a lot of sensitive information. So many channels meant a lot of context that was hard to capture. Then we had to figure out how long to keep Slack conversations. What if we got sued? Were there legal obligations to retain the Slack messages, or should we delete them as soon as possible? But then certain employees would save Slack messages because they contained projects and information needed for their jobs. My point is that these tools can create more work, and that is happening with all these AI add-ons.
It makes my head hurt.

Snail Mail - DALL-E Image
Two things happened today that were unrelated but got me thinking that we may have to go backwards.
The first was a LinkedIn post about a person using an AI bot to scan for job listings; the bot would then apply using the person’s resume, enabling them to submit dozens of applications each day.
The second was going to a college counseling meeting and listening to the college admissions officers describe the application process.
When I read the LinkedIn post, I wanted to quip that AI is going to send us back to snail mail. The best preventative measure against stuffing inboxes with resumes is to put up roadblocks, i.e., end the ability to apply online through a web form or an applicant tracking system (ATS).
But when I listened to the admissions officers, I realized that colleges had already figured out that added friction prevents AI stuffing.
When a high school student applies to college, they have to submit a lot of different documents: grades, essays, test scores, letters of recommendation, etc.
To get a job, you just have to submit your resume and maybe a cover letter.
Perhaps AI will make applying for a job like applying for college. Imagine if, when you applied for a job, you had to include more than a piece of paper: a certification from a third party, a letter of recommendation, a custom essay about why the company is your ideal employer, plus your resume and cover letter. Not all of that can be produced by a bot, and that friction is what would make it harder to do the AI stuffing that is freezing up a company’s ability to evaluate candidates.
Or maybe we will just go back to the old-fashioned days of requiring that everything be sent via mail.