“Our complex and increasingly interconnected world enjoys efficiency, but it also presents fragility. Technology can give with one hand and take with another unless we have the wisdom, patience, and compassion to use it wisely.”
“We must relearn how to be effective without silencing our hearts, and to teach those same lessons to our machines.”
The concern that AI, enriched with immense volumes of data, might manipulate society is as immediate as today's headlines. AI ethicist Nell Watson insists we have the tools to control it.
“Our minds must not become a battleground. The human mind must remain sovereign and sacrosanct. They say that sunlight is the best disinfectant and I’m inclined to agree.” Creating a culture of transparency around AI “helps us to understand those murky depths.”
Could humans really lose control of their future? Dark patterns abound, Watson said.
Tech companies already have patents for “shadowbans,” which isolate people in online conversations so they are only talking to bots.
Chillingly, the only human in the virtual room doesn’t know they’re alone, she said.
“They think they’re having a conversation with their peers. It’s just that all of their peers are bots. The community doesn’t actually exist.”
This technique could isolate segments of society and indoctrinate them without their knowledge, she said.
That’s just one looming threat of unfettered AI. Shaping it to serve humanity will be the challenge of the century, Watson said. “Artificial intelligence will be as disruptive in this century as electricity was in the last century.”
If anyone is up to the challenge, it’s Watson. She has a long-standing career in AI ethics, having led the development of many standards, certifications, and credentials in the field. She serves as an IEEE AI Ethics Maestro, one of only four people in the world to hold that distinction.
Keeping up with AI is an arms race, she cautions. But it can be won. The world community mobilized quickly to curb CFCs (chemicals that deplete ozone in the atmosphere). “AI is trickier … but with the right spirit, we can harness the global community to protect against the worst excesses,” she said.
The mission couldn’t be more urgent. Threats might include:
On-device content moderation: “You may go to a protest meeting and send a message, then see the double checkmark showing that the message has been read. And yet on the recipient’s phone, it never arrived.”
Opinion necromancy: “This raises the dead. Someone combs through a trove of internet data looking for something to pin on someone.” The internet is showing its age at 30. What was once an acceptable comment may threaten your career today. Even if you posted on Friendster anonymously, AI can use linguistic signatures to ID you, she said. “Big tech can track us. They can generate a voodoo-doll virtual version of you.” More ominously, past behavior helps them predict your future. “They know what you’re going to do before you do.”
Glass box: “You think it’s a private conversation, but somebody can drive by on the Internet and expose a private thought or funny comment to the whole world. Next thing you know, that relatively harmless thing you said has mobs of people outraged.”
False feedback: This is when a person’s opinion is subtly altered and fed back to them until they end up agreeing with the new opinion. “You can nudge them in a different direction … it’s becoming very manipulative.”
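The linguistic-signature identification Watson describes can be illustrated with a toy stylometry sketch. This is only a minimal illustration, not any real attribution system: it compares character-trigram frequency profiles by cosine similarity, and the author names and texts are invented for the example. Real systems use far richer features and far more data.

```python
from collections import Counter
import math

def trigram_profile(text: str) -> Counter:
    """Count overlapping character trigrams -- a crude 'linguistic signature'."""
    t = text.lower()
    return Counter(t[i:i + 3] for i in range(len(t) - 2))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two trigram count vectors."""
    dot = sum(a[k] * b[k] for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical known writing samples and one "anonymous" post.
known = {
    "alice": "I simply adore rainy afternoons, simply adore them.",
    "bob":   "Deadlines slip. Ship the build. Fix it later.",
}
anonymous = "Honestly, I adore long rainy afternoons so much."

# Attribute the anonymous text to the stylistically closest known author.
best = max(known, key=lambda a: cosine(trigram_profile(known[a]),
                                       trigram_profile(anonymous)))
print(best)  # → alice
```

Even this crude fingerprint matches the anonymous text to the author with overlapping phrasing, which is why anonymity offers weaker protection than people assume.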
Humanity faces “a ghost in the machine, pulling strings in a way that’s plausibly deniable … we feel haunted by a cyber poltergeist that’s gaslighting us, abusing perceptions of reality. I don’t think it’s going to be good for mental health,” Watson warned.
But it’s also a powerful tool to better society.
“All technology is a double-edged blade: it can be used to wound or, like a scalpel, to heal. It’s all about the intention behind that technology.”
“We are standing at a crossroads. Emerging technologies offer a fantastic vision of the future, one which … has the potential to bring all of us together, and to help us run the world in ways that are more fair as well as more efficient. That’s the future we can build together.”
The goal is to create rules that make it clear “who’s doing what with these systems, how they are operated, and for what purpose.”
The hallmarks of good AI: it sticks to ethical rules; it respects the differences between people; and its systems are transparent.
But to win an arms race, you must also have weapons as cutting-edge as your competitors’.
Tech companies jealously guard their source code and algorithms. But these can be tested with probes that reveal their purpose, she said.
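The kind of probing Watson alludes to can be sketched in miniature: without seeing a model's code, you compare its outputs on paired inputs that differ in a single field. Everything here is hypothetical — `black_box_score` is a stand-in for a real opaque API, and the fields and bias are invented for illustration.

```python
# Toy sketch: probing a black-box scoring function to see whether it
# secretly reacts to a sensitive attribute. The model internals stay
# hidden; we only compare outputs on paired inputs.

def black_box_score(applicant: dict) -> float:
    # Hypothetical opaque model; imagine this is a remote API call.
    score = 0.5 + 0.01 * applicant["years_experience"]
    if applicant["zip_code"].startswith("90"):  # hidden proxy bias
        score -= 0.2
    return score

def probe_attribute(base: dict, field: str, alt_value) -> float:
    """Return the output shift when only `field` changes."""
    variant = dict(base, **{field: alt_value})
    return black_box_score(variant) - black_box_score(base)

base = {"years_experience": 5, "zip_code": "10001"}
shift = probe_attribute(base, "zip_code", "90210")
print(f"score shift from zip change: {shift:+.2f}")  # → score shift from zip change: -0.20
```

A large, consistent shift on an attribute that should be irrelevant is evidence of what the system is really doing — no source code required.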
AI tools can now work from nothing more than a general screenshot, which protects the privacy of the underlying data.
Another tool is radioactive data. Developed by Facebook researchers, it subtly tweaks data to embed a signature. For instance, an AI model may be trained on a picture that’s slightly distorted. Even if that picture is only a small part of the training data, the mark can be tracked, and the AI identified as having trained on it.
“It’s just like taking a barium drink and seeing your guts being picked up on the X-ray,” Watson said.
This lets watchdog organizations monitor how data is gathered and used. After all, AI is scariest in stealth mode.
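The detection principle behind radioactive data can be shown with a toy sketch: shift released samples faintly along a secret signature direction, then test whether a dataset's mean aligns with that direction far above chance. This is only an illustration of the statistics, not Facebook's actual method (which marks images so that trained models inherit the signature), and the perturbation strength here is exaggerated for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

DIM = 512   # feature dimension of each sample
N = 500     # number of samples in the dataset

# A fixed, secret unit vector acts as the watermark signature.
signature = rng.standard_normal(DIM)
signature /= np.linalg.norm(signature)

def mark(samples: np.ndarray, strength: float = 0.5) -> np.ndarray:
    """Shift every sample along the secret signature direction.
    (Real radioactive data uses imperceptible shifts; exaggerated here.)"""
    return samples + strength * signature

def detect(samples: np.ndarray, threshold: float = 0.25) -> bool:
    """Marked data's mean aligns with the signature; for clean data the
    alignment is noise on the order of 1/sqrt(N), far below threshold."""
    return float(samples.mean(axis=0) @ signature) > threshold

clean = rng.standard_normal((N, DIM))
print(detect(clean), detect(mark(clean)))  # → False True
```

Only someone who knows the secret signature can run the test, which is what lets a watchdog prove — rather than merely suspect — that its released data ended up inside a system.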
No system is foolproof, she cautions. There’s always an armchair genius coming up with a new angle. Hence, the arms race.
But Watson is no doomsday prophet. Her deep research has convinced her that AI can be a force for good. But it’s going to take more than a village.
“We need to come together as a society to hold Big Tech accountable. We need to demand freedom of consciousness for ourselves and our children. Together with the force of our wills and a good heart filled with good faith, we can build a safe, fair, and just algorithmic society.
“All of us are smart enough and brave enough to step forward and lean in and ensure that this transpires.”
©Prophets of AI