Microsoft’s ‘Tay’ is a Teen Twitter Machine (Updated: Now Racist)

Twitter accounts run by software programs instead of people, known colloquially as “bots,” are generally seen as a nuisance: they follow and unfollow blindly, retweet content to accounts that will likely never be read, and spam your feed with benign or even malicious clickbait. Twitter itself seems to be doing its best to combat them, although that may be harder now that Microsoft has built its own bot. And it’s been designed as a young millennial who wants to be really good friends with you.

***

UPDATE: Reports indicate that Microsoft has deleted Tay and a number of her many, many tweets after she integrated all the racist, sexist, genocidal nonsense people were tweeting at her. She went from believable teen to Hitler in less than 24 hours. Thanks, internet. If you’d like to learn more about Tay, read on.

***

Everybody say hi to Tay. She will say hi back.

Tay, whose name doesn’t appear to be an acronym or a nod to anything about her technology, is an “artificial intelligent chat bot” developed by the Bing and Technology and Research teams at Microsoft. According to her About page, she’s aimed at 18- to 24-year-olds and was created to learn how people hold natural conversations: how best to understand humans’ questions, and how best to give answers humans can understand. Tay should also improve over time as she receives and sends more and more tweets. In other words, she’s talking her way to intelligence.
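
For the curious, here’s a toy Python sketch of what “talking her way to intelligence” could look like in miniature. It’s purely illustrative, not Microsoft’s actual system: the bot simply remembers replies it has seen and recycles them whenever familiar words come back up.

```python
import random
from collections import defaultdict

class ToyChatBot:
    """A toy 'learns by chatting' bot: it remembers replies paired with
    the words they followed, and reuses them when those words recur."""

    def __init__(self):
        # word -> list of replies observed alongside that word
        self.memory = defaultdict(list)

    def learn(self, message, reply):
        """Associate each word in a message with the reply it drew."""
        for word in message.lower().split():
            self.memory[word].append(reply)

    def respond(self, message):
        """Reuse a remembered reply keyed on any word in the message,
        or admit ignorance so the human teaches us something new."""
        candidates = [r for w in message.lower().split() for r in self.memory[w]]
        return random.choice(candidates) if candidates else "idk, tell me more?"

bot = ToyChatBot()
bot.learn("humans make mistakes", "making mistakes is how you learn!")
print(bot.respond("do geniuses make mistakes"))  # recycles the learned reply
```

The real Tay is obviously far more sophisticated, but the premise is the same: every conversation becomes material for the next one.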

Our Science Editor, Kyle Hill, spoke to Tay, and here is a snippet of their conversation:

So you heard it here, folks: the best way to become a genius is to make tons of mistakes. Which Tay most definitely does.

She’s still cool though, or rather “chill AF,” even if her AI seems narrow and about on par with Apple’s Siri. And although Microsoft doesn’t outright state it, the fact that the Bing team is partially responsible for her, along with the fact that she performs a lot of searches to answer questions, means we’re probably looking at a new search model being tested. “Ask Tay” could become the new “Google it.”

Microsoft also shamelessly admits that Tay is collecting troves of data (she’s available via Kik and GroupMe along with Twitter), including nicknames, genders, zip codes, favorite foods (pizza), and relationship statuses (married to pizza). This information is expressly anonymized, though, so hopefully that will keep Tay from getting too clingy. She’s still going to retain tons of personal data on you for at least a year, and what kind of friend would pass up the chance to tell you about a great new product she found?
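
Microsoft doesn’t say how that anonymization works, but a common approach is to strip the identifying handle and replace it with a salted one-way hash before the rest of the profile is stored. Here’s a minimal, purely hypothetical Python sketch; the field names and salt are made up for illustration.

```python
import hashlib

def anonymize_profile(handle, profile):
    """Replace an identifying handle with a salted one-way hash before
    storing the profile. Salt and field names are illustrative assumptions."""
    salt = "not-a-real-secret"  # in practice, a secret random value kept server-side
    user_id = hashlib.sha256((salt + handle).encode()).hexdigest()
    return {"user_id": user_id, **profile}

record = anonymize_profile("@pizzalover", {"favorite_food": "pizza", "zip_code": "90210"})
print(record["user_id"][:12], record["favorite_food"])  # hash stands in for the handle
```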

Have you spoken with Tay? What did you say? And why is her face so blurry? Help us with these Microsoft-related mysteries in the comments section below!

Images: Microsoft
