Like other kids her age, 9-year-old Victoria signed up for Pinterest because she wasn’t allowed on TikTok. Her mother feared she might encounter dangerous content or individuals on the popular video-sharing app. Pinterest, meanwhile, seemed safe.
But while the third grader was “pinning” pictures of baby animals, craft ideas and nail art inspiration into her image “boards” on the site, grown men were pinning her.
Clips Victoria uploaded of herself to Pinterest, such as one in which she cheerfully turns a cartwheel, have been compiled by at least 50 users into their own boards with titles like “young girls,” as well as “Sexy little girls,” “hot,” “delicious,” and “guilty pleasures.” Those boards are filled with dozens, hundreds and sometimes thousands of photos and videos of children.
“I’m shocked and disgusted,” said Victoria’s mother, Nathalia, the one person the girl followed on the platform. “I thought Pinterest was a place to be creative and inspired.”
Nathalia asked that both she and her daughter be identified only by their first names for fear of attracting more unwanted attention to her daughter from adults online.
Aggregating individually innocuous images of minors into potentially sexually suggestive collections is a practice experts describe as awful but, in many cases, lawful, meaning platforms have no legal obligation to take action.
Yet Pinterest isn’t just allowing this to happen on its website — its recommendation engine is making it easy. The company is inadvertently curating this content for adults who seek it out and potentially exposing the girls to pedophiles, an NBC News investigation found.
Over a one-month period, NBC News created a Pinterest account and reviewed hundreds of girls’ Pinterest accounts as well as their followers’ pages, many of whom appeared to be men. (The majority of the girls’ profiles were algorithmically surfaced by Pinterest itself during the course of the reporting.) The review showed that the site’s recommendation engine serves up photos and videos of visibly underage girls, including toddlers, in large quantities to a user seeking out this type of imagery. The children are typically seen bending over, doing the splits, sticking their tongues out and dancing in their bedrooms while dressed in outfits like pajama shorts, bathing suits and leotards.
“This is material that is innocently posted and is now being used to drive sexual interest in children,” said Stephen Sauer of the nonprofit Canadian Centre for Child Protection, which works to stop child exploitation.
Sauer conducted an independent review of the site at the request of NBC News, using a similar methodology. Upon doing an initial search for images of kids, he said, his homepage “almost immediately” filled with images of children often dressed in similarly revealing attire, several of which had received sexually suggestive comments.
Spokesperson Crystal Espinosa said in a statement that Pinterest has “a strict zero-tolerance policy for any content that may exploit or endanger minors,” and noted that when sexually suggestive boards containing “otherwise innocuous or non-sexual images” are detected, the company immediately takes them down and bans the creators.
Pinterest removed pages whose links NBC News shared with the company as part of its requests for comment.
After NBC News asked about the sexually suggestive boards, Espinosa said Pinterest was planning to roll out a new feature next week enabling users to report boards, which is not currently possible.
The company said it is also adding more options for flagging individual profiles, which can at present only be reported for “Spam” and “Inappropriate cover image”; the new options will include the ability “to specifically call-out when content may involve a minor.” Pinterest said it will introduce new age-verification measures at a later date.
“Young girl fashion”
When Nathalia signed into Victoria’s Pinterest account in mid-February for the first time in a year, after being contacted by NBC News, her daughter’s inbox was brimming with messages including “cute ass” and “Mmmm.” The senders all looked to Nathalia like men, based on their usernames and profile pictures. At least three of Victoria’s nearly 400 followers had also uploaded on their own public profile pages now-deleted footage of erect penises.
Another user, with whom NBC News corresponded, and who pinned a video of Victoria doing a headstand in a board called “Young girl fashion,” was a 45-year-old Texas man. He joined Pinterest last year after he was released on parole from prison, where he served 27 years for attempted murder, criminal records show.
Shortly after NBC News contacted him via direct message, all three of his boards containing underage girls vanished from his profile.
Pinterest users can create “secret boards” to keep certain content private, but “you can’t completely hide your active account,” according to Pinterest’s Help Center. Users who identify themselves as teens when they sign up can set their accounts to receive direct messages from only users they follow.
NBC News’ investigation comes as lawmakers express growing concern about children’s safety on the internet. In a rare instance of bipartisanship in Washington, some Democrats and Republicans are coming together in a push for legislation to curb the sexual exploitation of minors online.
“We are in complete agreement that any user behavior that sexualizes children is wrong and must be prevented,” Espinosa said.
“We genuinely appreciate when any negative actions or content on the platform is brought to our attention, whether via machine learning, manual discovery from our trust & safety team, user reporting, or journalists. Billions of ideas are searched and saved every year on Pinterest and we recognize that this job will never be done.”
Pinterest launched more than a decade ago as a digital scrapbooking service with basic social networking features. The company went public in 2019 and has continued to unveil new creator-friendly tools to compete with TikTok and Instagram under pressure from shareholders for rapid growth.
Although Pinterest has a policy requiring users to be at least 13 years old, as with other social media sites, many children like Victoria appear to be on the 450 million-user platform. Some use their pages like TikTok accounts, sharing videos of themselves singing, doing hair tutorials and dancing. Pinterest’s mobile app even has a “Watch” tab of scrollable, algorithmically selected short videos from its users that was introduced in 2021 and functions like TikTok’s “For You” page.
As they’re algorithmically prodded into the spotlight, often attracting hundreds or thousands of followers despite only following a small number of people themselves, some girls are excited by the level of attention they’re receiving.
Erin Hahn, a Michigan mother of two, let her 12-year-old daughter join Pinterest last summer. Like Victoria’s mom, Hahn didn’t think twice about Pinterest at the time; many of her daughter’s friends were on the platform, too. But within a few months, Hahn noticed something odd: To her middle schooler’s delight, the girl had already amassed around 500 followers on the platform, where she shared posts including TikTok-style “get ready with me” videos of herself preparing for cheerleading practice.
When Hahn scrolled through her daughter’s list of followers, and those of her friends, she said she was alarmed to find that many of the users looked like men and had “disturbing” boards.
“It was gross and very obvious that they weren’t there for fashion,” Hahn said.
“Pinterest, for God’s sake”
While conducting research for this article, NBC News visited the Pinterest profiles of several young girls and then scrolled through 100 consecutive videos offered on the “Watch” tab. All but one featured little girls doing things such as dancing, often with the camera on the floor pointed upward. The feed was interspersed with ads from multiple major brands.
Meanwhile, Pinterest began recommending content categories that were full of images of minors, including “Kids bathing suits” and “Girls leg pic.”
“Our recommendation teams are aggressively investigating this type of content to ensure it’s only recommended in appropriate contexts,” said Espinosa, who noted that parents might search for “kids bathing suits” while planning a vacation.
“When we discover new issues, we update our detection systems. We’re constantly looking for what other guardrails we can implement.”
A few days after Victoria’s account was disabled, Pinterest still featured a thumbnail of her headstand video in the preview of a collection called “Gymnastics pictures” that it suggested to NBC News. Espinosa said profiles and boards are removed immediately upon account deletion and that this should not have happened.
Another recommended picture, a close-up of a young girl’s face, had first been posted five years before by her grandmother and was still getting sexual comments from users.
“Pinterest, for God’s sake,” sighed Hany Farid, a computer science professor at the University of California, Berkeley, who has researched social media algorithms’ promotion of harmful content. He has not studied Pinterest specifically, and suggested few experts in the field had because the website hasn’t been on many people’s radars as potentially problematic.
“Nobody’s surprised that the 4chans and the Reddits are awful. But Pinterest, really?”
Social media companies want to keep users engaged to maximize their ad exposure, said Farid. He believes these websites have the ability to prevent specific types of content from being artificially amplified.
In general, Farid added, social media platforms say: “‘Look, we want users to come back for more. So if you want to search for content about self-harm, or white supremacists, or kids in their underwear, then that’s what we’re gonna deliver.’”
The potential impact of these algorithms isn’t new: In 2017, a 14-year-old British girl, Molly Russell, took her own life after viewing a series of depression- and self-harm-related images on Pinterest among other platforms. A coroner’s inquest into her death found that many were shown to her via algorithmic recommendations, including “10 depression pins you might like.” Senior company executive Jud Hoffman admitted to the inquest in September 2022 that the platform was “not safe” when Russell used it. Espinosa said the platform “updated our self-harm policy to be even stricter” after Russell’s death.
In the U.S., internet platforms are shielded from liability for the user-generated content they host under Section 230 of the federal Communications Decency Act. But in Gonzalez v. Google, a case recently argued before the Supreme Court involving recruitment videos for the Islamic State terrorist group, the plaintiffs argued that such immunity does not extend to posts the platforms algorithmically recommend.
Ultimately, the Supreme Court seems likely to punt the issue back to Congress. One week after President Joe Biden’s Feb. 7 State of the Union speech, during which he called on Big Tech to prioritize children’s online safety over company profits, the Senate Judiciary Committee held a hearing to voice bipartisan support for legislation mandating the same. Sen. Josh Hawley, R-Mo., also introduced a bill to ban those under 16 from using social media, though it’s gained little traction.
Time to “step up”
Sauer, from the Canadian Centre for Child Protection, argued that even within the current legal protections, social media companies need to “step up,” rein in their algorithms and proactively police for problematic but noncriminal activity on their websites. He believes Pinterest should invest more heavily in human moderators because they can understand the nuances behind users’ behavior, like the difference between someone creating a board of their own children versus one full of random kids in bathing suits.
While Pinterest reports sexually exploitative content to the National Center for Missing & Exploited Children in the U.S. and has complied with takedown requests, it has for years had no clear way for users to report the types of boards identified in this article.
The company prohibits “the intentional misuse of content depicting minors engaging in non-sexualized activities, like modeling clothing or participating in athletics.” But its reporting options — which cover “Nudity or pornography” and “Privacy violation” — don’t necessarily apply to such content.
Espinosa said Pinterest uses artificial intelligence programs and human review for content moderation, and is “working on developing new features that make it easier for the community to flag accounts that may be using the platform for exploitative reasons combined with improved machine learning automation to detect the same.” She declined to share further details about the detection strategies “to prevent people from circumventing our systems.”
Despite Pinterest’s ban on porn, NBC News encountered pornographic content numerous times while reviewing the profiles of men who followed young girls. In one case, Pinterest added a warning label of “sensitive content” to a board containing porn and left it up for months. NBC News did not encounter any child sexual abuse material during the course of its review.
Other social media sites offer more choices for flagging problematic content. Instagram users can directly report content that “involves a child” or contains “sexual exploitation or solicitation.” On Facebook, posts can be reported for “sexual exploitation” and “child abuse,” which includes content that “sexualizes them.” TikTok has a “minor safety” reporting option with multiple subcategories.
YouTube faced sweeping advertiser boycotts over its own algorithmic curation of minors in 2019. It quelled public outrage by banning comments on videos featuring children while resisting calls to cut off algorithmic recommendations of such videos entirely. (The company does limit recommendations of videos containing minors specifically in potentially risky situations, YouTube said.)
On Twitter, which states that it has a “zero-tolerance child sexual exploitation policy,” accounts trading child sex abuse material have remained online for months under CEO Elon Musk’s tenure, according to a review by NBC News in January. Twitter did not respond to a request for comment.
Seara Adair, a mother and influencer advocating for online safety, has spent several years independently studying Pinterest and other platforms. By seemingly letting “predatory” users run wild, she said, Pinterest has fostered a culture of impunity. Some of those creating sexually suggestive boards filled with images of children are doing so using what appear to be their real names and photos in their Pinterest profiles.
Adair believes she intercepted a predator posing as a teen in an effort to groom her daughter, who is 12. A Pinterest user claiming to be a 14-year-old girl messaged her daughter in January out of the blue, tried to obtain additional contact information from the girl and then sent her a pinned video of two young girls kissing, writing, “I want to see us like thisss.” NBC News reviewed the exchange and could not verify the user’s identity.
“Pinterest is a mess,” Adair said. “Nobody’s paying attention to what goes on there because, well, it’s Pinterest.”