

Age Checks to Access Chat, Studio Team Create, a


(This reply includes information from the Newsroom post, which covers the age ranges under “Can users chat with others outside of their age band if they are Trusted Connections?” in the FAQ.)

Thank you, Roblox, you will be fostering a safer community on Roblox… by killing those communities and forcing them off Roblox to less safe options. You have potentially started the doomsday clock for security groups, or for any group where players roleplay as lower ranks under community managers acting as high ranks.

Here are some base stats:

About 40% of Innovation Security Training Facility’s players are 13-17, and about 20% are younger. This experience is a requirement for ranking up in the group, and is run manually by community managers in single servers (with no splitting of ages between servers).

About 70% of Innovation Security’s community managers are 21+. About 25% are 18+ but not 21+.

Combining these two sets of numbers, and assuming a 100% take rate for willingly giving a video of your face to a random company: 70% of the community managers will be unable to interact with at least 60% of the community, since they can only interact with 18+ players. They also can’t interact with the remaining 5% of community managers who aren’t 18+. And at least 20% of the players are unreachable by any community manager, because the under-13 bands are inaccessible to all of them.
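To sanity-check that arithmetic, here is a minimal sketch in Python. The percentage splits are the ones quoted above; the compatibility matrix (which player bands each manager band can chat with) is my simplified assumption, not Roblox’s exact rules:

```python
# Back-of-the-envelope model of the stats above. The splits are the
# quoted percentages; the "reachable" matrix is a simplified ASSUMPTION,
# not Roblox's exact age-band rules.

players = {"<13": 0.20, "13-17": 0.40, "18+": 0.40}    # player split (quoted)
managers = {"21+": 0.70, "18-20": 0.25, "<18": 0.05}   # manager split (quoted)

# Assumed compatibility: which player bands each manager band can chat with.
reachable = {
    "21+": {"18+"},            # per the post: 21+ staff only reach 18+ players
    "18-20": {"18+", "13-17"}, # assumption: adjacent bands can still chat
    "<18": {"13-17"},          # assumption
}

for band, share in managers.items():
    covered = sum(players[p] for p in reachable[band])
    print(f"{band} managers ({share:.0%} of staff) reach {covered:.0%} of players")

# Which player bands no manager band can reach at all:
orphaned = [p for p in players
            if all(p not in bands for bands in reachable.values())]
print("unreachable by every manager:", orphaned)  # -> ['<13'], i.e. ~20%
```

Under those assumptions the output matches the claim: the 70% of staff who are 21+ reach only 40% of players, and the under-13 slice is reachable by nobody.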

By “interact”, I mostly mean sessions in the Training Facility, but also moderation. For me, it means I can’t see what 70% of players are saying, or tell them to stop what they are doing. Or collect evidence for action against others. Or handle appeals. I can only ban players with limited information and force them off-platform to Discord to continue. As a reminder, Discord is outside of Roblox’s control, has no global text filter, has no image approval system, and allows porn. I know of a Discord server, linked in a Roblox experience’s social media links, that has porn pinned in the main channel, and Roblox moderation would not take down the link.

While we could spend a lot of money rolling out a minigame-type system for ranking up, we would cease to be any different from other minigame experiences and would exit the lower-rank roleplay niche. That niche could just disappear entirely.

Moving on to Ultimate Mining Tycoon, we have a new problem: not as players, but as creators. Our demographics skew far more 18+ than normal, so we “only” lose the ability of 50% of our players to give us any realtime feedback on-platform. They think we are just ignoring them, when we aren’t the ones deciding to do so. We will need to force them off-platform instead, again to Discord. Which, again, has worse safety issues.

Oh, and what about friends whose age gap alternates between 2 and 3 years because of birthdays? I can be 2 years apart from someone, then jump to 3 years apart in May when my birthday passes; they unexpectedly lose contact with me on-platform, and then regain contact when their own birthday happens later in the year. If we aren’t on Discord and are relying on Roblox’s chat, we will be completely cut off for months on end.
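A minimal sketch of why the gap oscillates, assuming a hypothetical 2-year chat threshold and made-up birthdays (Roblox’s real rules use age bands rather than a raw year gap, but the whole-year arithmetic is the same):

```python
from datetime import date

def age(born: date, on: date) -> int:
    """Age in completed years on a given date."""
    return on.year - born.year - ((on.month, on.day) < (born.month, born.day))

# Hypothetical birthdays: mine falls in May, my friend's in October.
me, friend = date(2008, 5, 10), date(2010, 10, 20)

for check in [date(2026, 4, 1), date(2026, 6, 1), date(2026, 11, 1)]:
    gap = age(me, check) - age(friend, check)
    print(check, "gap:", gap, "-> can chat" if gap <= 2 else "-> cut off")
```

The gap reads 2 in April, jumps to 3 after my May birthday, and drops back to 2 after my friend’s October birthday: months of being cut off every single year, from nothing either of us did.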

That is, unless we find a game that violates the rules by implementing a custom chat system that doesn’t respect the age groupings. Those exist now, and more will appear. And maybe someone looking to redirect people off-platform will abuse this setup to profile users: with large enough servers and any sort of API that reveals whether two players can chat, you can work out age ranges. That just sounds like a “coloring problem” in graph theory that ChatGPT can figure out.

In summary:

Roblox is feeding the narrative that handing over minors’ personal information to random companies and individuals is a good idea (it isn’t: it will increase identity theft, fraud, and child endangerment).

Security and related roleplay groups will need to force people off-platform onto platforms that allow porn, or die out refusing to do so.

Malicious games will continue to exist, since custom chat systems like these can’t be automatically detected. I’m pretty sure this is what happened with Roblox getting banned in Turkey a while ago.

Funneling people off-platform will drastically increase to the point of being normalized. Again, that is outside of Roblox’s control, and arguably less safe.

Moderation and realtime feedback for games will drastically suffer, and the FAQ only mentions that they are “looking into solutions”.

This change should not be rolled out globally, and must not be unless we are ready for another massive chunk of Roblox to vanish or become unsafe. This is not a PR win: it increases child endangerment off-platform, kills communities in the process, and gives predators more Discord servers to exploit.
