The underage daughter of a former Meta safety researcher was bombarded with sexually explicit pictures and propositions within just days of creating her first Instagram account, according to bombshell testimony at a landmark New Mexico trial.
Arturo Béjar, who led a safety-focused team at Mark Zuckerberg’s company from 2009 to 2015, told jurors he decided to return as a consultant in 2019 after becoming alarmed at the sick messages his then-14-year-old daughter had received on Instagram, including “unsolicited penis pictures.”
“I was with her when she created the account,” said Béjar, who appeared to choke up several times. “I didn’t know that was going to bring predators to her door, people who attacked her to her door, people who would ask her to sell nude photos of herself when she was a teenager to her door.”
Béjar, who has become an outspoken Meta critic since leaving the company for good in 2021, is a key witness for New Mexico state attorneys, who say the tech giant turned a blind eye as predators and sex creeps targeted kids in order to protect Meta’s profits.
His testimony was set to continue Wednesday, with another round of questions from state attorneys followed by cross-examination by Meta’s defense team.
The online safety expert said Meta changed significantly during his second stint at the company as it scrambled to keep pace with budding rivals like Snap and TikTok. Béjar testified that safety researchers were often stonewalled by top brass, including Zuckerberg and Instagram chief Adam Mosseri, when trying to implement new features meant to protect kids.
Béjar said, for example, that Instagram lacked a feature allowing users to report why they were blocking a particular individual – which meant that he and his daughter had little recourse when trying to alert the company about predators.
“When my daughter got a message saying, ‘Do you want to have sex tonight?’ – first message she gets from a stranger – there was no option [to report],” Béjar said.
He also alleged that Meta was woefully understaffed to address the volume of safety violations it faced. He laid the blame on its use of algorithms, which, he said, helped creeps find potential victims more easily.
“The product is very good at connecting people with interests, and if your interest is little girls, it will be really good at connecting you with little girls,” he testified.
A Meta spokesperson strongly pushed back on Béjar’s testimony, calling the notion that safety suffered between his two stints “simply not true.”
“We’ve spent years doing research to better understand teens’ experiences on Instagram – both good and bad,” the spokesperson said. “We do this research to help us improve, and it’s informed many new features designed to provide safer, more positive experiences for teens.”
“These range from Teen Accounts, which automatically place teens into private accounts and the strictest message settings, to parental supervision tools and easier ways for teens to report and block,” the spokesperson added.
Béjar also alleged that Meta obscured the true extent of the harm and abuse occurring on its apps through “misleading” statistics.
One internal survey called the “Bad Experiences and Encounters Framework,” or BEEF, showed at one point that 16.3% of users had reported seeing nudity or sexual activity on Instagram in a week.
Meta instead published data in its Community Standards Enforcement Report, or CSER, saying that only 0.02% to 0.03% of total views on Instagram contained nudity or sexual activity – a figure that measures the share of content views rather than the share of users who encountered such material.
Meta said its CSER reports “outline the content we don’t allow on the platform” and have “been informed by experts over many years. These policies need to be precise to help us enforce them accurately and consistently at scale.”
The New Mexico trial is one of several legal headaches currently facing Meta, which is also defending itself in a separate California trial alleging it fueled teen social media addiction.
During opening arguments, Meta’s attorneys stressed that the company has taken numerous steps to protect young users from harm and is upfront with parents about the potential risks.
On the eve of the trial, Meta spokesman Andy Stone blasted New Mexico Attorney General Raul Torrez, alleging he led “an ethically compromised investigation into Meta that knowingly put real children at risk” after state investigators set up test accounts to probe the company’s safety standards.