Responsible, about business

🎶 Do you remember the 21st night of September? 🎶

Hello, and welcome to Responsible, by ClearOPS, a newsletter about Responsible AI and other responsible business practices.

Here's what we have for you today:

  • What does Apple Intelligence mean for your privacy on your device?

  • There are some memories that are not worth recalling

  • Caroline's weekly musings

As multiple people on Twitter have already pointed out, Musk got it a little wrong. The use of ChatGPT on your phone will be optional. BUT, there are a lot of other AI features here, and it makes me wonder: how is Apple handling training?

First of all, I don't think AI models training on personal data is irresponsible, so long as you explain how it's done. To dig into this a little more, I visited Apple's Terms of Use and noted that I clearly went to the wrong spot AND that their last update date was 2009. Wait, what?

So there are no terms of service for Apple Intelligence yet, which I guess makes sense because it is not yet released? Since I cannot know exactly what they are doing with AI, there is not much to worry about at this point. I will have to get back to you in a future post.

As to Elon Musk's point, he wasn't all wrong. Apple is going to somehow make it easier to use ChatGPT on your phone without downloading the app and without a login. Since OpenAI can use your ChatGPT data for training (as it can with any free account), per its terms, this is where I think Elon Musk is really right. If it is even easier to use ChatGPT, Apple has its work cut out for it to make sure users know that OpenAI is training on their inputs. That makes the terms of service update even more important, and Apple needs to be very intentional about warning users if it wishes to maintain its reputation as the brand for privacy.

And another brand famous for its privacy, Microsoft, also had an AI misstep with its recently announced product, Recall. In case you didn't hear about it, Recall is part of Microsoft's Copilot+ PC launch. It takes screenshots of your work as you go and creates a searchable database of everything you have seen or done on your PC.

But they didn't quite get the security right, as this one researcher hilariously describes (it's long but worth it). If you are one of my customers, then you know that the Microsoft SSPA, the Supplier Security & Privacy Assurance Program, is a PITA. They really put their vendors through a serious audit, but apparently their internal controls are not as … rigorous? Or is this an AI-only thing for product releases?

That's not fully true, as I am not confident they have actually launched it yet, but it is fun to say. The problem I have with Microsoft is that they are clearly not following their own policies. It is irresponsible to impose requirements on your vendors that you do not uphold yourself! It breaks trust, and here, Microsoft broke my trust. I will not be running out to get one of these PCs, and I will not be using the Recall feature for a very long time. And I am more than A-okay with that, because I was not, and am not, excited for Recall on my computer.

P.S. If you think about it, Recall is really just a backup of all your data. So, in and of itself, it is supposed to be a security tool. The irony is real.
