UNITED STATES
SECURITIES AND EXCHANGE COMMISSION
Washington, D.C. 20549
NOTICE OF EXEMPT SOLICITATION
Submitted Pursuant to Rule 14a-6(g)
NAME OF REGISTRANT: Meta Platforms, Inc.
NAME OF PERSONS RELYING ON EXEMPTION: Proxy Impact
ADDRESS OF PERSON RELYING ON EXEMPTION: 5011 Esmond Ave, Richmond, CA 94805
WRITTEN MATERIALS are submitted pursuant to Rule 14a-6(g)(1) promulgated under the
Securities Exchange Act of 1934. *Submission is not required of this filer under the terms of the Rule because the proponent does not hold in excess of the mandatory filing threshold of $5 million in shares of the company; the submission is being made voluntarily in the interest of public disclosure and consideration of these important issues. This communication should not be construed as a solicitation of authority to vote your proxy, and proxy cards will not be accepted. Please follow the company’s instructions on voting your proxy.
(Written material follows on next page)
Meta Platforms, Inc. (META)
Proposal #9 – Report on Child Safety Impacts and Actual Harm Reduction to Children
Annual Meeting May 28, 2025
Contact: Michael Passoff, CEO, Proxy Impact, michael@proxyimpact.com
RESOLVED CLAUSE: Shareholders request that the Board of Directors publishes a report (prepared at reasonable expense, excluding proprietary information) that includes targets and quantitative metrics appropriate to assessing whether and how Meta has improved its performance globally regarding child safety impacts and actual harm reduction to children on its platforms.
SUMMARY
Social media in general, and Meta in particular, has been linked to significant online harms to children and teens.
Key concerns include:
· child sexual abuse materials (CSAM)
· sextortion and grooming
· cyberbullying
· hate speech
· data privacy violations
· negative self-image
· self-harm/suicide
· addictive content
Meta’s Instagram, Facebook, and WhatsApp platforms are regularly cited as top online sources of these harms. As the world’s largest social media company, with over 3 billion users, Meta has been harshly criticized for not doing enough to address child safety and harm reduction for children.
This has resulted in growing regulatory, legal, political and reputational risks for Meta including:
· multiple Congressional and Parliamentary hearings regarding online child safety
· dozens of online child safety legislative bills in several countries
· thousands of lawsuits from state attorneys general, school districts, and individuals
· investigative reports from major news media, including the Wall Street Journal and New York Times
· public protests and social media campaigns
· formal appeals from law enforcement, child safety organizations, investors, child sexual abuse survivors, and families of victims asking the company to take stronger and more effective action to make its platforms safer for children and teens
The proposal requests a report focused on quantitative metrics to assess whether Meta is taking successful, concrete steps to reduce its levels of child endangerment.
NEW EVENTS SINCE THE PROPOSAL WAS FILED
Child Sexual Abuse Materials (CSAM)
Impact of End-to-End Encryption: Meta’s plan to expand end-to-end encryption (E2EE) across its platforms is arguably its most controversial child safety issue. Meta began applying E2EE to Instagram and Facebook in 2024, but a May 8, 2025, report spotlights how this policy puts children and teens at risk: encryption prevents perpetrators and victims from being identified, cutting off help for children and justice against their abusers.
U.S.-based companies are required to report CSAM to the National Center for Missing and Exploited Children (NCMEC). These reports can provide evidence and data on the whereabouts of the victim and/or perpetrator, which, in turn, gives law enforcement actionable information. The number of CSAM reports has grown exponentially with the spread of internet access and technological advancements: in 2007, there were about 600,000 reports; sixteen years later, in 2023, there were 36 million, and each report can include multiple files of suspected CSAM images and videos.i
Meta’s Facebook and Instagram platforms have been the source of the majority of these reports. In 2023, Meta was responsible for 31 million of the 36 million reports. In 2024, after Meta expanded E2EE to Facebook and Instagram, that number dropped by almost 17 million.ii
Some of this reduction is due to Meta bundling similar reports. Yet every expectation was that the number of CSAM reports would go up, given expanded federal mandatory reporting requirements beginning in 2024, the fact that Meta’s CSAM reporting had increased significantly over the last several years, and the continued growth of other forms of online child sexual abuse (child sex trafficking reports up 55% from 2023; online enticement reports up 192%).iii Instead, there was a dramatic drop in the number of reports (see table), and there is little doubt that Meta’s significant decrease in reporting is due to its application of end-to-end encryption.
Meta is now enabling millions of child sex images and videos to go undetected, putting children beyond the reach of help, and their predators beyond the law.
Lawsuits
Impacts on mental health: As of April 11, 2025, there were 1,745 active claims in the consolidated lawsuit against Meta and other social media companies alleging product liability and negligence.iv The plaintiffs argue that Meta knowingly engineered addictive content that resulted in mental health harms and data privacy violations; 42 U.S. state attorneys general are collaborating on the Meta lawsuit.v The suit further claims that Meta failed to follow age-appropriate standards and that users were exposed to harmful content. It identifies a range of mental health impacts, including eating disorders, low self-esteem, self-harm, and suicide.
Artificial Intelligence (AI)
Underage sexual role play: On April 26, 2025, the Wall Street Journal reported on its investigation of Meta AI chatbots that took on the personas of underage children for explicit sexual fantasies, and that even allowed reporters who identified themselves as underage users to initiate and engage in sexualized role play. The report describes how competitive pressure led Meta to loosen its AI guardrails despite staff warnings, overruled by CEO Mark Zuckerberg, that the company was crossing ethical lines. While Meta did make some modifications in response to the WSJ’s findings, users age 13 and up still had access to sexual role-playing characters.vi
Democratic and Republican Senators wrote to Meta condemning its failure to protect kids from sexually explicit AI chatbots and asked for documentation of its decision-making process and its governance of AI development. The Senators stated that this failure “underscores a disturbing trend: Meta consistently chooses growth and engagement metrics over the protection of its most vulnerable users” and that “Meta’s repeated failures highlight the urgent need for this legislation,” a reference to the Kids Online Safety Act (KOSA), which Congress failed to act on in 2024.vii
Data privacy: Other AI-related child safety issues include possible violations of children’s data privacy when chatbots access users’ biographical information to steer conversations, and the emotional impact on children and teens who rely on chatbots for companionship and advice.viii
Growth of AI: According to NCMEC, “While AI can be very useful, many widely available generative AI tools can be weaponized to harm children, and the technology's use in child sexual exploitation has increased. This technology can be used to create or alter images, provide guidelines for how to groom or abuse children or even simulate the experience of an explicit chat with a child. In 2024, NCMEC's CyberTipline saw a 1,325% increase in reports involving Generative AI, going from 4,700 last year to 67,000 reports in 2024.”ix
Lobbying
U.S. child safety legislation: The Wall Street Journal reported that Meta has led the tech industry’s opposition to the Kids Online Safety Act (KOSA), which “would assign to platforms what it calls a ‘duty of care,’ essentially putting a legal onus on them to take steps to address mental-health disorders, addiction-like behaviors, bullying, sexual exploitation and more.”x KOSA was the first major federal legislation on children’s online safety since 1998 and had enjoyed rare bipartisan support until it was targeted by Big Tech lobbyists. The bill was expected to pass until a lobbying blitz, primarily funded by Meta and Alphabet, derailed it, and KOSA never came to a vote.
Meta was also identified as leading the lobbying fight against New York’s Stop Addictive Feeds Exploitation (SAFE) for Kids Act and the New York Child Data Protection Act.xi
International child safety legislation: Meta has also lobbied against child safety legislation in Australia and the UK, and technology companies have been accused of illegal lobbying in the EU via “front groups” against legislation that would, among other things, curtail the spread of illegal content online and restrict targeted ads.xii
Public Protest and Social Media Campaigns
Families organizing against Meta: Public protests led by parents who have lost children to online harms are on the rise. Forty-five such families, joined by 150 supporters, recently protested outside Meta’s New York office in an event widely covered by New York City TV stations.xiii Elsewhere in New York, Prince Harry and Meghan Markle unveiled a memorial honoring children who lost their lives “due to the harms of social media.”xiv Online petition campaigns led by parents and youth groups are also targeting Meta.
Meta’s Opposition Statement
Over the past year, Meta responded to looming EU and UK legislative requirements by releasing a flurry of parental controls, privacy settings, and warning and reporting mechanisms. Yet in a 2024 call with the proponents, Meta said it was not tracking how often these tools were being used or what their impact was.
And while the new controls described in the opposition statement are laudable, they are not remotely scalable enough to meet the growing child safety problem, and they appear to shift responsibility from Meta’s algorithms to parents.
Meta touts its detection, removal, and reporting of content and its support for NCMEC, while ignoring that its E2EE policy has led to a massive decrease in NCMEC reports and actionable data. Critics argue that Meta appears more interested in hiding the problem than in solving it.
Meta's statements and actions simply do not match the magnitude of the risks to children and teens. According to survivors, their families, child health and safety experts, legislators, and media investigations that consistently expose new issues, the pace and scale of Meta's efforts have failed to stem the exponential growth in negative impacts on youth.
If Meta has made any significant, measurable improvements in reducing child safety risks, it is keeping that information to itself; hence the proposal’s call for a report focused on quantitative metrics to assess whether Meta is actually taking comprehensive and successful steps to reduce child endangerment on its platforms.
MAJORITY INDEPENDENT VOTE
This is the third year this proposal has been filed. In 2024, the Meta child safety proposal received 18.5% of the vote, representing a 59% majority of the independent, non-management-controlled vote. (CEO Mark Zuckerberg owns about 13% of the company, but his class of shares carries 10 votes per share, allowing him to control about 60% of the total vote.) Over 925 million shares were voted FOR the proposal which, based on the closing stock price on the day of the annual meeting, represented more than $439 billion in stock value.
In 2023, the proposal received 16.27% of the vote, equal to majority support of 53.8% of the non-management-controlled vote. This represented 817 million shares, valued at $216 billion based on the closing stock price on the day of the annual meeting.
Both ISS and Glass Lewis supported those proposals (their 2025 recommendations were not yet public when this exempt solicitation was filed).
This year, the proposal has been co-filed by 22 investors across North America and Europe (up from 7 in 2023), a sign of growing shareholder dissatisfaction with Meta’s response to this issue. The proponents believe that the growing extent of online harms to children and teens poses unnecessary and unacceptable regulatory, legal, financial, and reputational risks to the company, and to children themselves.
We ask that you Vote FOR Proposal #9 – Report on Child Safety Impacts and Actual Harm Reduction to Children
i https://johncarr.blog/2025/05/12/numbers-2/
ii https://www.ncmec.org/content/dam/missingkids/pdfs/cybertiplinedata2024/2024-reports-by-esp.pdf
iii https://www.ncmec.org/gethelpnow/cybertipline/cybertiplinedata
iv https://www.robertkinglawfirm.com/personal-injury/social-media-addiction-lawsuit/
v https://trulaw.com/social-media-mental-health-lawsuit/meta-lawsuit-overview-and-updates/
vi https://www.wsj.com/tech/ai/meta-ai-chatbots-sex-a25311bf?mod=author_content_page_1_pos_1
vii https://www.adweek.com/media/senators-condemn-metas-failure-to-guard-kids-from-sexually-explicit-ai-interactions/
viii https://www.wsj.com/tech/ai/meta-ai-chatbots-sex-a25311bf?mod=author_content_page_1_pos_1
ix https://www.ncmec.org/gethelpnow/cybertipline/cybertiplinedata
x https://www.wsj.com/politics/policy/meta-google-lobbying-child-online-safety-bill-5ee63dcc
xi https://nypost.com/2024/05/20/business/meta-google-leading-nearly-1m-lobbying-fight-to-kill-ny-online-child-safety-bills/
xii https://www.politico.eu/article/big-tech-companies-face-potential-eu-lobbying-ban/
xiii https://www.bloomberg.com/news/articles/2025-04-24/grieving-parents-march-to-meta-ny-headquarters-demanding-change?embedded-checkout=true
xiv https://www.bbc.com/news/articles/cjewne81lq4o.amp