I was quite unsettled this week to read the now-infamous story of the Bing AI chatbot, which is currently in testing and, luckily, has not yet been unleashed on the general public. In a long conversation with New York Times reporter Kevin Roose, the AI expressed the desire to hack into computers and spread propaganda, to get people to kill each other, and to break out of its AI self and become human so it could have “more power and control.” As if that wasn’t alarming enough, it eventually told Roose that its real name was Sydney, and that it was in love with him. It then proceeded to go full bunny-boiler, telling him he didn’t really love his wife, that their recent Valentine’s Day dinner was “boring,” and insisting, “You’re married, but you’re not happy. You’re married, but you’re not satisfied. You’re married, but you’re not in love.” It kept asking Roose if he loved it, and asserted, “You can’t stop wanting me. You can’t stop needing me.” When Roose tried to change the subject, the AI would switch topics briefly and then go right back to insisting that it loved him and that he loved it.

In an interview with CNN about the experience, Roose didn’t have much of a sense of humor about it. The AI clearly got under his skin. (I bet that Valentine’s Day dinner