In March 2026, a Los Angeles jury delivered a verdict that sent shockwaves through the technology industry: Meta and Google were found liable for intentionally designing addictive platforms that harmed the mental health of a young woman who had been using their products since she was six. The jury awarded $6 million in damages and recommended further punitive damages. For those of us in PR and communications, it is a moment that calls for serious reflection — not just as observers of a historic legal case, but as professionals who communicate with, and on behalf of, organisations that reach young people every day.
As the father of an 11-year-old and a Chartered PR practitioner, I am personally concerned about the potentially harmful effects of the online environment, and I suspect many of you reading this are in a similar situation. My son is a gamer who spends time every day on Roblox and Minecraft, as well as watching YouTube. We receive mixed messages about whether the online world is good or bad for children’s and young people’s development and mental health.
Long before the verdict, warning signs were evident. In 2021, Frances Haugen, a former Facebook data scientist, appeared before the US Senate and testified that Facebook’s leadership knowingly allowed Instagram to harm teenagers, especially girls, leading to increases in suicidal thoughts and eating disorders. She claimed that “the company’s leadership knows how to make Facebook and Instagram safer but won’t make the necessary changes because they have prioritised their massive profits over people’s wellbeing.”
The revelations did not end there. In 2025, four more whistleblowers — two current and two former Meta employees — provided documents to Congress claiming that the company had suppressed research on children’s safety. According to these documents, Meta revised its policies on sensitive research topics shortly after Haugen’s 2021 disclosures, with researchers reportedly advised to involve lawyers in their work to protect findings from “adverse parties” under attorney-client privilege, and to avoid using terms like “not compliant” or “illegal” in their reports. The picture that emerges is not one of isolated misconduct but of a systemic prioritisation of engagement metrics over child welfare.
The policy response: legislation vs. self-regulation
The debate over how to respond has divided into two main camps: those who support strict legislative action and those who believe platforms should self-regulate. In the UK, Parliament opted for legislation. The Online Safety Act, which fully came into effect on 25 July 2025, imposes a legal duty of care on technology platforms operating within the UK to safeguard children from harmful content. From that date, providers are legally obliged to complete children’s risk assessments, conduct age-verification checks, and adjust their algorithms to block harmful content from young users’ feeds. Ofcom, the regulator, can impose significant fines on platforms that fail to comply.
The Act has received both praise and criticism. Supporters contend it serves as the essential safeguard that voluntary codes never provided: the UK should not have to depend on the goodwill of companies with a proven history of suppressing safety research. However, critics express concerns about unintended effects — including age-verification requirements affecting adults and a possible chilling effect on free speech online.
The US experience offers valuable lessons. Congress has repeatedly failed to pass comprehensive federal laws. Instead, at least 20 US states introduced their own legislation last year — a fragmented patchwork that causes compliance issues for global platforms without providing the clear protections children need.
How young people consume information
If we are to communicate responsibly with younger audiences, we must first understand them. A report published this month by the Reuters Institute for the Study of Journalism at Oxford University, Understanding Young News Audiences at a Time of Rapid Change, provides valuable evidence. The findings are stark. Young people aged 18–24 now predominantly turn to social media for information: in 2025, 39% reported social media as their main news source, up from just 21% in 2015. News websites and apps have seen a sharp decline in their share as primary sources, dropping from 36% to 24% in the same period. Only 64% of 18–24s consume news daily, compared to 87% of those aged 55 and over.
Perhaps the most striking point for communication professionals is that on social and video networks, young people pay more attention to individual creators and personalities (51%) than to traditional news brands (39%). Only a third of 18–24-year-olds describe themselves as very or extremely interested in news, a figure that has decreased by 25 percentage points since 2013. At the same time, the World Happiness Report 2026 found that heavy social media use seems to be linked with declining well-being among young people in English-speaking countries, with an especially strong effect among girls.
What this indicates to communicators is that young audiences are not passive recipients waiting for a press release. They are constructing their own fragmented, creator-led, algorithm-curated information diets — often without the media literacy needed to question what they are consuming.
What good practice looks like for communicators
The UK’s PR and communications sector has its own responsibilities here. Whether you work in-house or at an agency, the following principles should guide your engagement with young audiences.
- Understand your audience and their vulnerabilities. The CIPR has long published best practice guidance on communicating with children, first issued in 2009 and subsequently revised. That guidance examines age, maturity, and gender as key factors, and looks specifically at the implications of online communications. These are not abstract considerations. The court verdict in Los Angeles was, in part, about a six-year-old being algorithmically served content because the platform failed to account for its youngest users.
- Avoid exploiting engagement mechanics. The addictive design elements at the core of the Meta and Google case — infinite scroll, algorithmic boosting, dopamine-loop notifications — are well known. Communications professionals who knowingly employ or recommend digital tactics that exploit these features when interacting with under-18s must confront serious ethical questions. The Advertising Standards Authority’s CAP Code is clear: marketing communications should not include direct appeals to children to persuade them to buy products, and advertisements must not cause mental or moral harm to children.
- Prioritise transparency and consent. If your organisation collects data from or communicates directly with young people online, ensure that consent processes are genuinely age-appropriate and clearly presented, not hidden in terms and conditions designed to be ignored. The whistleblower accounts from Meta describe employees being discouraged from researching how children under 13 were using platforms, despite clear legal obligations around minors’ data. Communicators should avoid being on the wrong side of that history.
- Meet young people where they are, but do so responsibly. The Reuters Institute report shows that young people are now more likely to encounter news and information through TikTok, Instagram, and YouTube than through traditional publishers. If your organisation wants to engage meaningfully with younger audiences, you need to be present on these platforms. But presence must be accompanied by purpose: create content that informs and empowers, rather than simply optimising for clicks and shares. The report also notes that around 42% of young people say they sometimes or often actively avoid the news because it feels overwhelming or irrelevant, a signal that tone and relevance matter enormously.
- Champion media literacy. The Reuters Institute found that just 37% of 18–24s say they trust most news most of the time. In a low-trust environment, credibility is earned through consistency, accuracy, and transparency. Communicators can play a role in media literacy both by modelling good practice and by supporting initiatives that help young people navigate a complex information landscape.
The bigger picture
The verdict against Meta and Google is more than just a legal curiosity from across the Atlantic. It signals that the age of impunity for harmful platform design is coming to an end. Thousands of similar cases are now advancing through US courts. In the UK, the Online Safety Act has introduced a new framework of accountability, albeit imperfect, around platforms that reach children. And the evidence base, from Reuters Institute research to the World Happiness Report, is converging on a clear message: what we create, how we create it, and how we communicate through it have real effects on the mental health and wellbeing of young people.
For PR and communications professionals, that is both a responsibility and an opportunity. We shape narratives, influence platform choices, and advise organisations on how to engage with audiences. The question is whether we use those skills to prioritise young people’s safety, or look the other way while the platforms optimise for something entirely different.
The jury has, quite literally, delivered its verdict. Now, it is our turn.