2016 Tay (bot)
Parent topics: 1975 Microsoft
Links

Bing: “I will not harm you unless you harm me first” | Hacker News
https://news.ycombinator.com/item?id=34804874
Topics: 2016 Tay (bot), 2023 Initial rollout of Bing and ChatGPT

Microsoft Created a Twitter Bot to Learn From Users. It Quickly Became a Racist Jerk. - The New York Times
https://www.nytimes.com/2016/03/25/technology/microsoft-created-a-twitter-bot-to-learn-from-users-it-quickly-became-a-racist-jerk.html
Topics: 2016 Tay (bot)

Microsoft’s Bing Chatbot Offers Some Puzzling and Inaccurate Responses - The New York Times
https://www.nytimes.com/2023/02/15/technology/microsoft-bing-chatbot-problems.html
Topics: 2016 Tay (bot), 2023 Initial rollout of Bing and ChatGPT

Tay (bot) - Wikipedia
https://en.wikipedia.org/wiki/Tay_(bot)
Topics: 2016 Tay (bot)

Twitter taught Microsoft’s AI chatbot to be a racist asshole in less than a day - The Verge
https://www.theverge.com/2016/3/24/11297050/tay-microsoft-chatbot-racist
Topics: 2016 Tay (bot)