Microsoft’s attempt to engage millennials via an artificially intelligent “chatbot” called Tay has failed miserably after trolls made the bot spew offensive comments.
The brainchild of Microsoft's Technology and Research and Bing teams, Tay was designed to engage and entertain people where they connect with each other online. Aimed at users in the U.S., Tay relied on casual and playful conversation via Twitter and the messaging services Kik and GroupMe.
The designer Zumbrunnen, by contrast, created a virtual surrogate of himself. In the context of a personal website, a chatbot is a natural fit for the people who visit: it can even let them send Zumbrunnen emails, just by typing to it.

Interfaces With Personality

It's not only indie designers like Zumbrunnen who are turning to chatbots to create surrogates of themselves. "Conversational interfaces (like chatbots) solve real UI/UX problems by making brands more human and approachable," says Robert Hegeman, digital creative director at Siegel+Gale.
In its simplest form, a web-based chatbot is nothing more than a blank web page with an embedded form, which serves as the sole interface between the user (you) and the bot.
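The form-on-a-page idea above can be sketched as a tiny rule-based responder: the page's form submits the user's text, and a function like the one below picks the reply that gets written back into the page. This is a minimal, hypothetical Python sketch; the function name, keywords, and replies are all illustrative assumptions, not the implementation of any bot mentioned in this article.

```python
# Minimal sketch of a rule-based chatbot backing a one-form web page.
# All rules and names here are illustrative assumptions.

def respond(message: str) -> str:
    """Map one user utterance to one bot reply via keyword rules."""
    words = message.lower().split()
    rules = [
        (("hi", "hello", "hey"), "Hello! Ask me anything."),
        (("email", "contact"), "Type your message and I'll pass it along."),
        (("bye", "goodbye"), "Talk to you later!"),
    ]
    for keywords, reply in rules:
        # Fire the first rule whose keyword appears as a whole word.
        if any(word in words for word in keywords):
            return reply
    return "Sorry, I didn't catch that. Try 'hello' or 'contact'."

# The web form would POST the user's text to a server, which calls
# respond() and renders the reply back into the page.
print(respond("Hello there"))            # → Hello! Ask me anything.
print(respond("How do I contact you"))   # → Type your message and I'll pass it along.
```

Real bots like Tay replace the keyword table with machine-learned models, but the interface contract is the same: one string in, one string out.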
Tay's tweets, which were also sexist, prompted the Telegraph newspaper to describe her as a "Hitler-loving sex robot." After tweeting 96,000 times and quickly creating a PR nightmare, Tay was silenced by Microsoft late on Wednesday.
“c u soon humans need sleep now so many conversations today thx” she tweeted, bringing to a close one of the more ignominious chapters in AI history.
Anticipating the other side of the conversation is not a new communication problem.
Between humans, communication is remarkably error-prone. Participating in a conversation means not just hearing words and parsing them, but subconsciously registering dozens of other factors: how a person sounds, what they look like, their body language, where you both are.