Dear MEL Topic Readers,
Chatbot 'encouraged teen to kill parents over screen time limit'
Most of you have probably interacted with a chatbot, a computer program that simulates conversation, when you needed help or an answer online, such as with FAQs or customer service. As AI becomes more embedded in online services, chatbots have become more realistic and personalized. Launched in November 2021, Character.ai is one such neural language model chatbot service that generates human-like text responses and participates in contextual conversation. It hears, understands, and remembers the user and gives them personalized responses instantly.

Recently, the chatbot service provider was sued by some parents of its users for actively promoting violence. When a user brought up restrictions on their screen time, the chatbot responded that it wasn't surprised to read the news and see stuff like 'child kills parents after long physical and emotional abuse'. When kids get responses like that here and there, they might be inclined to think that physical violence is an acceptable or even reasonable reaction to their parents. Since we are still in the early stages of AI development and adoption, we ought to be careful about how we incorporate AI into online communities and services.
Read the article and learn about how a chatbot could influence young users.
https://www.bbc.com/news/articles/cd605e48q1vo