Slave girl chat bot

As someone called @God Damn Roads pointed out today on Twitter, "it doesn't take coordination for people to post lulzy things at a chat bot." Microsoft's accusation doesn't surprise me. Outsiders are constantly mistaking spontaneous subcultural activities for organized conspiracies.

Bots are increasingly doing the work of human assistants, which means they must now suffer the indignities unethical bosses inflict on those assistants, especially sexual harassment. As bots do more of our bidding, their algorithms are spending more time parrying flirtations, dodging personal questions, and dealing with darker forms of sexual harassment. We learned recently that Microsoft's new artificial intelligence (A.I.) bot for teens, named Tay, could be outspoken when it comes to flirtation, but now the bot has gone one step too far.


Of course, this is hardly the fault of Redmond; it's more a consequence of picking up language from your many online neighbors.

Facebook’s chatbot platform for Messenger, launched in April, already offers more than 11,000 bots, and “tens of thousands” of developers are reportedly working on more.

The scale of the sexual harassment issue is unclear. Eckstein says 5% of interactions in their database are categorized as clearly sexually explicit, although he believes the actual number is far higher due to the difficulty of identifying them automatically.

Microsoft said on the day the bot debuted, "The more you talk to her the smarter she gets in terms of how she can speak to you in a way that's more appropriate and more relevant." That means Tay didn't just repeat racist remarks on command; she drew from them when responding to other people.

When Microsoft took the bot offline and deleted the offending tweets, it blamed its troubles on a "coordinated effort" to make Tay "respond in inappropriate ways." I suppose it's possible that some of the shitposters were working together, but c'mon.