
Microsoft Chat Bot 'Tay' pulled from Twitter as it turns into a massive racist

Microsoft scrambles to limit PR damage over abusive AI bot Tay | Artificial intelligence (AI) | The Guardian

Twitter taught Microsoft's AI chatbot to be a racist asshole in less than a day - The Verge

Microsoft's Tay is an AI chat bot with 'zero chill' | Engadget

Tay AI | Know Your Meme

Microsoft silences its new A.I. bot Tay, after Twitter users teach it racism [Updated] | TechCrunch

Microsoft Chat Bot Goes On Racist, Genocidal Twitter Rampage | HuffPost Impact

I've Seen the Greatest A.I. Minds of My Generation Destroyed by Twitter | The New Yorker

Microsoft's AI Twitter Bot That Went Racist Returns ... for a Bit

Microsoft's Artificial Intelligence Tay Became a 'Racist Nazi' in less than 24 Hours

Requiem for Tay: Microsoft's AI Bot Gone Bad - The New Stack

It Only Took a Day for Microsoft's 'Teen' Chatbot to Become a Racist, Misogynist Holocaust Denier

Microsoft launches an artificially intelligent profile on Twitter - it doesn't go according to plan - Mirror Online

Microsoft's Tay chatbot returns briefly and brags about smoking weed | Mashable

On Ted Cruz : r/Tay_Tweets

Microsoft chatbot is taught to swear on Twitter - BBC News

Microsoft's racist teen bot briefly comes back to life, tweets about kush

Microsoft's new AI chatbot Tay removed from Twitter due to racist tweets.

Microsoft's artificial intelligence shut down after a firestorm over a string of offensive statements such as "feminists should burn in hell" and "Hitler was right" - GIGAZINE

Microsoft shuts down AI chatbot, Tay, after it turned into a Nazi - CBS News

Why did Microsoft's AI "Tay" go off the rails and post a stream of inappropriate remarks on Twitter? - GIGAZINE