Roblox, Discord, OpenAI and Google found new child safety group

MT HANNACH
2 Min Read

Roblox, Discord, OpenAI and Google are launching a nonprofit organization called ROOST, or Robust Open Online Safety Tools, which hopes to "build scalable, interoperable safety infrastructure suited for the AI era."

The organization plans to provide free, open-source safety tools that public and private organizations can use on their own platforms, with a particular emphasis on child safety to start. The press release announcing ROOST specifically calls out plans to offer "tools to detect, review, and report child sexual abuse material (CSAM)." Partner companies are providing funding for these tools, as well as the technical expertise to build them.

ROOST's operating theory is that widespread access to generative AI is rapidly changing the online landscape, making the need for "reliable and accessible safety infrastructure" all the more urgent. Rather than expect a smaller company or organization to build its own safety tools from scratch, ROOST wants to provide them for free.

Kids' online safety has been a pressing issue since the Children and Teens' Online Privacy Protection Act (COPPA) and the Kids Online Safety Act (KOSA) began making their way through Congress, even though both failed to pass the House. At least some of the companies involved in ROOST, Google and OpenAI in particular, have also already pledged to prevent AI tools from being used to generate CSAM.

The child safety problem is even more urgent for Roblox. As of 2020, two-thirds of all children between nine and 12 years old play Roblox, and the platform has historically struggled with child safety. Bloomberg Businessweek reported that the company had a "pedophile problem" in 2024, which prompted multiple policy changes and new restrictions on children's DMs. ROOST won't solve all of these problems, but it should make addressing them easier for any other organization or company in Roblox's position.

This article originally appeared on Engadget at https://www.engadget.com/big-tech/roblox-discord-openai-and-google-found-new-child-safety-group-194445241.html?src=rss
