“Responsible AI” is just a page on a website
Any company offering AI products should be providing users with information on how it is behaving responsibly and ethically. It's not optional. Users should know how their data is being used and be informed about how the AI works. That's literally the least a company can do. Fortunately for us, companies are doing more than that. The big question for me is whether they are doing it as performance or out of a deeper commitment to responsibility.
If you take a look at some of the big players in the game, it is clear they are doing something. Microsoft, Google, IBM, Accenture… I could go on… all have portions of their websites dedicated to this. In some cases, you really have to dig to find them. Selling products is more important than talking about how those products are being built and used responsibly. That shouldn't surprise us, unfortunately. It should also not come as a surprise that there is no standard way of doing this. Each company is defining it as they see fit and communicating it in their own way.
But hey, at least they're doing it. They have managed to tiptoe over an extremely low bar, given that there is very little oversight and hardly any case law pressing them to do more.
The thing is, that's just not enough.
More and more, we are seeing the real-time impact of AI on jobs, mental health, and the environment. It's easy now to go down a TikTok rabbit hole about a woman in AI psychosis, scroll through endless LinkedIn posts about people being fired as company leaders decide AI can do their jobs, or read about the impact of AI servers on nearby communities and how much water they use. We are watching negative impacts unfold in real time from products that many people don't understand, don't use, or don't want.
I've written before that I think there will be a secondary market for ethical AI, similar to the organic food movement or sustainable clothing brands. A number of AI users will vote with their dollars for these kinds of companies, just as they do in other parts of their consumer life. These companies will meet or beat any standard put in front of them, but I think they will also help set standards going forward.
While it hasn't happened as much in clothing, organic food has bled significantly into the mainstream market. Mostly this has been tied to health ads and weight loss, but there have been positive outcomes for everyone as a result. As Europe begins to put more constraints on fast fashion, we are seeing some shift toward ethics in the clothing market as well. It will be interesting to see if the AI market follows this trend as a secondary market develops.
I imagine these things may happen concurrently, considering how fast the AI market is moving and how quickly people are reacting to it. The backlash to the Vogue ad featuring an AI model, and to AI influencers at events such as Wimbledon, demonstrates how quickly people will lose trust in a brand they perceive as inauthentic. Social media is full of commentary on how using AI to write is "cheating," with thought pieces coming out daily offering suggestions on how to tell if AI was used in a post or article. If the market can continue to create more user-friendly products, the ones that build a sense of trust with customers will be the ones that win out.
This means that messaging around AI ethics and responsibility will have to move off the deeply buried webpages and into marketing. My hope is that it will push the conversation more toward consumer protections and drive companies to behave responsibly, not just say they behave responsibly. I, for one, am really looking forward to this happening, as I think it will increase transparency and help bring a more human element to how AI is positioned societally. Right now, the people spending money on AI are venture capitalists, not your average human. Even people who use these products don't truly understand how they work, and as they learn more, they will expect more.