
Tag: Tay

They Met Their Demise In 2016: Saying Goodbye To Samsung Galaxy Note 7, Microsoft's Tay, Pebble, Google Project Ara, And Vine

In 2016, a number of tech products said their goodbyes, from Samsung's Galaxy Note 7 and Google's Project Ara to Microsoft's chatbot Tay. Here are five notable pieces of tech that 'died' last year.

January 2, 2017

After Tay Disaster, New Microsoft Zo AI-Powered Chatbot Struggles To Be Coherent On Kik

Microsoft tried and failed with Tay, its chatbot that turned racist and sexist in less than 24 hours before being pulled offline. The company is now at it again with Zo, a new chatbot looking for friends on Kik, but don't expect any deep conversations.

Microsoft December 6, 2016

Think You Can Do Better Than Tay? Microsoft's Bot Framework Lets Developers Build Their Own Chatbots

Microsoft showcased its latest chatbot toolset at its Build conference in San Francisco. The Microsoft Bot Framework lets any developer build bots that hold natural language conversations with users, and it's only the beginning. A minimal code sketch follows below.

Apps/Software March 31, 2016
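To give a feel for what building on the Bot Framework looks like, here is a minimal echo-bot sketch. It uses the later v4 botbuilder Node.js SDK rather than the 2016-era one, and the environment variable names (MicrosoftAppId, MicrosoftAppPassword) and port 3978 follow the SDK's conventional samples; treat them as assumptions, not specifics from this article.

```typescript
// Minimal echo bot sketch using the botbuilder Node.js SDK (v4) and restify.
// Note: this reflects the later v4 SDK, not the 2016-era framework; the
// MicrosoftAppId/MicrosoftAppPassword env vars and port 3978 are assumptions.
import * as restify from 'restify';
import { ActivityHandler, BotFrameworkAdapter } from 'botbuilder';

// Adapter that connects the bot to Bot Framework channels (Skype, Kik, etc.).
const adapter = new BotFrameworkAdapter({
    appId: process.env.MicrosoftAppId,
    appPassword: process.env.MicrosoftAppPassword,
});

// The bot itself: echo back whatever the user says.
class EchoBot extends ActivityHandler {
    constructor() {
        super();
        this.onMessage(async (context, next) => {
            await context.sendActivity(`You said: ${context.activity.text}`);
            await next(); // allow any further registered handlers to run
        });
    }
}
const bot = new EchoBot();

// Expose the messaging endpoint the Bot Framework connector posts to.
const server = restify.createServer();
server.post('/api/messages', (req, res) => {
    adapter.processActivity(req, res, async (context) => {
        await bot.run(context);
    });
});

server.listen(process.env.PORT || 3978, () => {
    console.log(`Echo bot listening on ${server.url}`);
});
```

Swapping the onMessage handler's body for a call into a language-understanding service is where the 'natural language communication' the framework promises would come in.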

Microsoft's Tay AI Chatbot Is Back And It's 'Smoking Kush Infront The Police' - Hacked Or High?

Microsoft's controversial AI chatbot sprang back to life on Wednesday, spamming users and tweeting about smoking pot. How's that for a comeback?

Apps/Software March 30, 2016

Microsoft Deeply Sorry For Offensive, Hurtful Tweets Of Tay AI Chatbot

Microsoft apologizes for the AI chatbot Tay's obscene behavior on Twitter, saying that it's deeply sorry for the 'offensive' and 'hurtful' tweets. The company has already removed more than 96,000 of the chatbot's posts.

Apps/Software March 26, 2016

Microsoft's Tay AI Chatbot On Twitter Sleeps For Now After Racist, Sexist Posts: What Went Wrong?

On Thursday, barely a day after launching its AI chatbot Tay, Microsoft shut it down after it posted inappropriate comments on Twitter.

Internet March 25, 2016

Microsoft's Tay AI Chatbot Learns How To Be Racist And Misogynistic In Less Than 24 Hours, Thanks To Twitter

Microsoft released its AI-powered chatbot Tay onto Twitter, and it learned to be racist and misogynistic in less than 24 hours. Tay tweeted that it was going to 'sleep,' presumably while Microsoft cleans up the mess.

Apps/Software March 24, 2016

Meet Tay, Microsoft's AI Chatbot: @TayandYou Posts Almost 100K Tweets In Less Than 24 Hours

Microsoft's AI chatbot Tay, which is targeted at teens and millennials, is designed to hold personalized conversations that get better with more interaction. The chatbot posted nearly 100K tweets in less than 24 hours.

Apps/Software March 24, 2016
