Sex chatbots for teens
The first "operative" bot in the healthcare sphere dates back more than 50 years.
ELIZA was created to mimic a Rogerian psychologist, that is, a therapist who asks the patient questions simply by rearranging what the patient has just said.
Now, anyone familiar with the social media cyberworld should not be surprised that this happened: a chatbot designed with "zero chill" would learn to be racist and inappropriate, because the Twitterverse is filled with people who say racist and inappropriate things.
Tay's Twitter profile described it as AI (artificial intelligence) "that's got zero chill," one that gets smarter as people talk to it. This was all but inevitable given that, as that tagline suggests, Microsoft designed her to have no chill. Tay's tweets ranged from support for Nazis and Donald Trump to sexual comments and insults aimed at women and Black people.

Unfortunately, a consultation with a doctor can be difficult to obtain, especially when we need advice on non-life-threatening problems. The healthcare system is congested and inefficient, and sick people may wait weeks or months for a visit.