Naughty chatting robot
It's been just over a year since Microsoft's Tay shuffled off this digital coil in a swirl of racist, sexist and homophobic vitriol. But like all younger siblings, Tay's successor may have picked up a few bad habits from her older sister.
One of the things I liked about Microsoft's approach was to not scrap the idea of an AI-powered chatbot but to take the experience of Tay and learn from it.
So, a little over a year after Tay vanished, we welcome Zo.
Microsoft said it was "deeply sorry for the unintended offensive and hurtful tweets from Tay" and would "look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values".
When Tay was briefly reactivated, however, she quickly became stuck in a repetitive loop, tweeting "You are too fast, please take a rest" several times a second.
While Zo's conversations haven't gotten nearly as ugly as Tay's, they highlight the difficulty of building a chatbot that can pass the Turing test.