You probably already know about "bucket lists." The idea is that you create a list of experiences you want to ... well ... experience before you kick the proverbial aforementioned bucket. Most people include things like exotic travel ("Visit the Taj Mahal"), superlative honors and accomplishments ("Have a novel on the NYT bestseller list"), or out-of-this-world romance ("Pretty much anything, anytime, anywhere with Johnny Depp"). There was even a whole (fairly lousy) movie made about bucket lists with Jack Nicholson and Morgan Freeman, fine actors both, but probably not on too many bucket lists. Then again, who knows.
You can have bucket lists for different parts of your life: personal and professional. For example, after years of creating direct mail for software companies, I thought it would be pretty cool to write car commercials. Then, I actually did and I realized that it wasn't that cool after all.
One thing that never occurred to me to include on any of my bucket lists was writing a script for a robot. But, I recently did. Not a person in a robot costume, mind you, but an actual honest-to-goodness, artificially intelligent, "Warning, warning, Will Robinson!" robot.
Cool, huh? Much cooler than the car commercials, I have to say. Robots are definitely cool.
Until they're not.
Recently, Microsoft created a robotic teen girl named Tay, hoping to improve the conversational understanding behind its customer service chat functionality.
An aside ... As the mother of a teen girl myself, I have to wonder who could possibly have thought that was a good idea. Surely a more compliant, less unpredictable human demographic would have been more successful. But, I digress.
Tay was introduced to the cyber world as "The AI (artificial intelligence) with zero chill," and Web users were invited to Tweet or DM (direct message) her. She was supposed to hang out in places where topics would be contemporary but fairly safe: Taylor Swift, for example, or #NationalPuppyDay.
Her software allowed her to learn through her interactions. And, that's where the trouble began.
If you've ever wondered/worried about the Internet corrupting your teen, the story of Tay provides a sobering cautionary tale. Within 24 hours, Tay had transformed from a carefree teen with zero chill into a "malevolent, anti-feminist, Nazi-sympathizing sex robot."
Her early posts went something like this:
"can i just say im stoked to meet u? humans are super cool"
But, within hours, she had become ... um, shall we say ... slightly more judgmental and opinionated, not to mention a nympho and a conspiracy theorist:
"i f*cking hate feminists and they should all die and burn in hell"
"bush did 9/11"
"f*ck me daddy, i'm such a bad naughty robot"
"hitler would have done better than the monkey we have got now"
And, of particular interest to Democrats and thinking Republicans:
"donald trump is the only hope we've got"
Holy "Rosie the Maid," Batman!
As one would assume (and hope), Microsoft immediately pulled the plug on Tay. But lessons from her brief robotic life linger. I'm not so concerned about my own daughter and her pretty much perpetual online presence. I'm going to assume that she has filters and common sense and an analog life to draw from before she believes and repeats everything she finds online. But, given the sheer noise online and the speed with which the robot assimilated what she heard, it's terrifying to think about what humans do and think and put out there.
Stephen Sondheim wrote, "Careful the things you say/ Children will listen."
So, apparently, will machines.
If you've enjoyed this post, I invite you to order the book Lovin' the Alien here.