I now hate a particular major bank. I also hate an online small appliance and electronics retailer. And a Fortune 100 company that also has an online store. I hate them because they don’t know me. Most probably their artificial intelligence (AI) has been anything but intelligent with me. With their new and horribly misguided “brains,” these companies have refused to accept my money. As far as I can tell anyway, they simply don’t want my business.
OK, no problem for me. It’s their loss, forever in all likelihood.
It’s hard to know for sure, in every case, why and how AI goes so horribly wrong. However, accumulating evidence suggests that these new technologies amplify human prejudices and biases rather than avoiding them.
Here are some ideas that could help solve or at least reduce these problems:
1. Use AI (and humans) as a check against out-of-control AI. How about an anti-bias and anti-prejudice intelligence engine that constantly pushes against racial, gender, religious, and other forms of AI-augmented discrimination (that happen to be illegal in many jurisdictions)?
2. If you’re going to embrace AI, embrace it, including directly on the mainframe and with people who understand and appreciate mainframe-based systems of record. If an AI strategy isn’t good enough for core business processes, then it isn’t a good strategy.
3. Just as you should never tolerate final, dictatorial human decisions with no effective means of appeal, the same is true of AI. Even the best implementations and models err. So what happens when (not if) your customers are caught in AI hell?
4. Regularly check to make sure experts can explain AI decisions. If they cannot explain them, then either they need to learn more or your new monster is misbehaving.
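To make idea 1 concrete, here is a minimal sketch of what an automated anti-bias check might look like: it compares approval rates across groups and flags any pair whose ratio falls below the "four-fifths" rule of thumb used in U.S. employment-discrimination analysis. The function names, threshold, and sample data are illustrative assumptions, not a description of any real product.

```python
def approval_rate(decisions):
    """Fraction of decisions that were approvals (True)."""
    return sum(decisions) / len(decisions)

def bias_alert(decisions_by_group, max_ratio_gap=0.8):
    """Flag group pairs whose approval-rate ratio falls below
    the four-fifths threshold (an illustrative default)."""
    rates = {g: approval_rate(d) for g, d in decisions_by_group.items()}
    alerts = []
    for g1, r1 in rates.items():
        for g2, r2 in rates.items():
            if r2 > 0 and r1 / r2 < max_ratio_gap:
                alerts.append((g1, g2, r1 / r2))
    return alerts

# Hypothetical decision logs for two groups of applicants.
decisions = {
    "group_a": [True, True, True, False],    # 75% approved
    "group_b": [True, False, False, False],  # 25% approved
}
print(bias_alert(decisions))  # group_b approved far less often -> alert
```

A real system would of course need statistically sound tests, legally appropriate group definitions, and human review of every alert; the point is only that such a watchdog can run continuously alongside the decision-making AI.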
Obviously, business and government use of AI is growing rapidly, and AI technologies show great promise; some are already protecting lives. That is precisely why it’s so important to imbue our new AI colleagues with strong ethics, so that they confront and combat humanity’s worst behaviors rather than amplify them.