Penelope Sokolowski was just 16 years old when she took her own life last February.
Her father, Jason, believes her suicide was the culmination of a grooming process that began on Roblox, the game platform beloved by kids — with some 170,000 users under the age of 13, according to company data from 2023.
“We kind of thought we were covering all the bases,” Jason told The Post, noting that his family had used a third-party app to monitor Penelope’s online activity.
Jason alleges that his only child was contacted by a predator on Roblox who coerced her into cutting his name into her chest and sending videos of herself bloodied from self-harm — and who, ultimately, sent Penelope down a spiral that culminated in her death.
The girl was 7 or 8 years old when she first signed up for Roblox, where players rove around online worlds and can chat with other users.
“I’d come in and sit in the room with her and see what she was doing, ask who those people were,” Jason said, recalling Penelope drawing an anime-style sketch for a friend she’d made on Roblox.
“As a dad I thought, oh, this is nice, she’s artistic, and she’s made artistic friends,” he added. “But I didn’t understand what Roblox was and its effect on her.”
The dad, who works in the film industry in Vancouver, British Columbia, separated from Penelope’s mother and moved out of the family home when the girl was 13.
He recalls how Penelope’s grades began to tumble and, when she was 14, he noticed scars from self-inflicted cuts on her arms, which she had been covering with bracelets and his oversized hockey jerseys.
Penelope confided that she had been recruited into a self-harm group via Roblox, but assured her father she had moved on.
But not long after her 16th birthday, she took her own life.
Later, when Jason opened up his daughter’s cell phone, he found what he describes as a “crime scene.”
According to the dad, there were messages spanning two years with a person who egged on her self-destruction. Jason believes Penelope met this person on Roblox and then began privately conversing with them over Discord — sometimes for hours.
In one exchange, Penelope sent a photo of her chest, offering to cut herself there but worrying she couldn’t go “too deep.” Minutes later, she followed up with an image of the predator’s Discord user name written across her chest in bloodied letters.
In other images, she had carved the numbers “764” into her body. Jason believes Penelope had been contacted by a member of 764, described by the FBI as a “violent online group” that targets minors and grooms them into committing egregious acts of self-harm and violence.
Members of 764 reportedly troll platforms like Roblox looking for victims they can persuade — via grooming or sextortion — into hurting themselves.
“They are grooming girls to do whatever it is they can get a girl to do, whether it’s nudes or cuts or gore or violence,” Jason said. “[Penelope] was brainwashed all the way through.”
Twenty years after Roblox was launched, many families are claiming the platform makes it too easy for predators to contact children.
Dozens of lawsuits accusing the Roblox Corporation of neglecting to protect minors have been consolidated into one federal case. The first hearing took place in the Northern District of California on January 31.
“Nowhere in this world is it normal for adults to speak to children unrelated to them, but that goes on [on Roblox],” Matt Dolman, an attorney for the plaintiffs, told The Post.
He alleges, on behalf of plaintiffs, that the company designed products in a way that allows “adults to speak to children unabated on the platform without [necessary] safety features to prevent that from happening.”
According to Dolman, the typical case starts with predators offering kids Robux — a form of in-game currency that can be purchased with real money.
Dolman’s firm is aware of at least 119 lawsuits filed against Roblox since 2025. The cases, which span 34 states, include a shocking array of predatory behavior.
Case summaries provided by Dolman’s firm detail how an autistic child was allegedly coerced into sending explicit photos. A girl sent videos of herself doing cartwheels shirtless. Kids like Penelope have sent predators videos of themselves cutting.
One predator is accused of disseminating explicit photos of a child in retribution for being blocked. Several allegedly threatened to kill the families of minors unless they sent sexual images.
A plaintiff family claims that a predator recorded himself having sex with their underage daughter, while sex toys have allegedly arrived at the homes of children. There are several accusations of minors being abducted, sometimes across state lines. One child was allegedly raped by five men.
“There is no making these kids whole again,” Dolman said.
According to Dolman, though initial contact takes place on Roblox, the predator often moves the chat to a third-party app. At least 51 lawsuits name Discord as a co-defendant, with Snapchat cited 20 times and Meta five.
Discord told The Post that the company is “deeply committed to safety” and “maintain[s] strong systems to prevent the spread of sexual exploitation and grooming.”
Snapchat and Meta did not reply to requests for comment.
In late 2025, Roblox rolled out additional safety features intended to prevent children from encountering predatory adults, including AI age-estimation software for all users who access communication features. Users are then sorted into age-based subgroups.
Roblox told The Post that the company limits chats for younger users, doesn’t allow user-to-user image sharing, and has filters designed to block the sharing of personal information.
“We are deeply troubled by any incident that endangers any user,” the company said. “We also understand that no system is perfect and that is why we are constantly working to further improve our safety tools and platform restrictions to ensure parents can trust us to help keep their children safe online.”
Dolman remains skeptical of the improvements. “Is it going to get better on the platform? Yes, [I] assume it can’t get worse,” he said. “But is it a safe platform? Absolutely not.”
A year after his daughter took her own life, Jason Sokolowski firmly points his finger at Big Tech.
“Social media companies are protecting these predators because they’re almost one and the same,” he said. “They’re working with no conscience or moral compass just for power or money. The platforms are the ones that could mitigate this overnight.”