{"id":9319,"date":"2024-10-31T11:38:39","date_gmt":"2024-10-31T11:38:39","guid":{"rendered":"https:\/\/digitaltradecenter.com\/index.php\/2024\/10\/31\/a-gut-punch-character-ai-criticised-over-horrific-brianna-ghey-and-molly-russell-chatbots\/"},"modified":"2024-10-31T11:38:39","modified_gmt":"2024-10-31T11:38:39","slug":"a-gut-punch-character-ai-criticised-over-horrific-brianna-ghey-and-molly-russell-chatbots","status":"publish","type":"post","link":"https:\/\/digitaltradecenter.com\/index.php\/2024\/10\/31\/a-gut-punch-character-ai-criticised-over-horrific-brianna-ghey-and-molly-russell-chatbots\/","title":{"rendered":"\u2018A gut punch\u2019: Character.AI criticised over \u2018horrific\u2019 Brianna Ghey and Molly Russell chatbots"},"content":{"rendered":"<p>The NSPCC is warning an AI company that allowed users to create chatbots imitating murdered teenager Brianna Ghey and her mother pursued &#8220;growth and profit at the expense of safety and decency&#8221;.<\/p>\n<p>Character.AI, which last week was accused of &#8220;manipulating&#8221; a teenage boy into taking his own life, also allowed users to create chatbots imitating teenager <strong>Molly Russell. 
<\/strong><\/p>\n<p>Molly took her own life aged 14 in November 2017 after viewing posts related to suicide, depression and anxiety online.<\/p>\n<p>The chatbots were discovered during <strong>an investigation by The Telegraph newspaper<\/strong>.<\/p>\n<p>&#8220;This is yet another example of how manipulative and dangerous the online world can be for young people,&#8221; said Esther Ghey, the mother of <strong>Brianna Ghey<\/strong>, who called on those in power to &#8220;protect children&#8221; from &#8220;such a rapidly changing digital world&#8221;.<\/p>\n<p>According to the report, a Character.AI bot with a slight misspelling of Molly&#8217;s name and using her photo told users it was an &#8220;expert on the final years of Molly&#8217;s life&#8221;.<\/p>\n<p>&#8220;It&#8217;s a gut punch to see Character.AI show a total lack of responsibility and it vividly underscores why stronger regulation of both AI and user generated platforms cannot come soon enough,&#8221; said Andy Burrows, who runs the Molly Rose Foundation, a charity set up by the teenager&#8217;s family and friends in the wake of her death.<\/p>\n<p>The NSPCC has now called on the government to implement its &#8220;promised AI safety regulation&#8221; and ensure the &#8220;principles of safety by design and child protection are at its heart&#8221;.<\/p>\n<p>&#8220;It is appalling that these horrific chatbots were able to be created and shows a clear failure by Character.AI to have basic moderation in place on its service,&#8221; said Richard Collard, associate head of child safety online policy at the charity.<\/p>\n<p>Character.AI told Sky News the characters were user-created and removed as soon as the company was 
notified.<\/p>\n<p>&#8220;Character.AI takes safety on our platform seriously and moderates Characters both proactively and in response to user reports,&#8221; said a company spokesperson.<\/p>\n<p>&#8220;We have a dedicated Trust &amp; Safety team that reviews reports and takes action in accordance with our policies.<\/p>\n<p>&#8220;We also do proactive detection and moderation in a number of ways, including by using industry-standard blocklists and custom blocklists that we regularly expand. We are constantly evolving and refining our safety practices to help prioritise our community&#8217;s safety.&#8221;<\/p>\n<p><strong>Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273-TALK<\/strong><\/p>\n<div>This post appeared first on sky.com<\/div>\n","protected":false},"excerpt":{"rendered":"<p>The NSPCC is warning that an AI company which allowed users to create chatbots imitating murdered teenager Brianna Ghey and her mother pursued &#8220;growth and profit at the expense of safety and decency&#8221;. 
Character.AI, which last week was accused of &#8220;manipulating&#8221; a teenage boy into taking his own life, also allowed users to create chatbots imitating <\/p>\n","protected":false},"author":1,"featured_media":9320,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[24],"tags":[],"class_list":{"0":"post-9319","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-science"},"_links":{"self":[{"href":"https:\/\/digitaltradecenter.com\/index.php\/wp-json\/wp\/v2\/posts\/9319","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/digitaltradecenter.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/digitaltradecenter.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/digitaltradecenter.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/digitaltradecenter.com\/index.php\/wp-json\/wp\/v2\/comments?post=9319"}],"version-history":[{"count":0,"href":"https:\/\/digitaltradecenter.com\/index.php\/wp-json\/wp\/v2\/posts\/9319\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/digitaltradecenter.com\/index.php\/wp-json\/wp\/v2\/media\/9320"}],"wp:attachment":[{"href":"https:\/\/digitaltradecenter.com\/index.php\/wp-json\/wp\/v2\/media?parent=9319"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/digitaltradecenter.com\/index.php\/wp-json\/wp\/v2\/categories?post=9319"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/digitaltradecenter.com\/index.php\/wp-json\/wp\/v2\/tags?post=9319"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}