By John Nguyet Erni Two US jury verdicts against Meta and YouTube last month crystallise, but do not resolve, the promises and contradictions of our legal commitments to both freedom of expression and the protection of children online. The US verdicts resonate in Hong Kong, which follows international human rights standards, yet our public conversations about platforms often oscillate between moral panic and technological fatalism. We do not ask hard, unsettling questions.
In Los Angeles, jurors awarded US$6 million (HK$47 million) to a young woman, Kaley, who began using social media at six. She argued that Instagram and YouTube designed addictive features – infinite scroll, autoplay, constant nudges to stay online – that harmed her. She hated her body and thought about hurting herself. In New Mexico, another jury ordered Meta to pay US$375 million (HK$2.9 billion) for failing to keep children safe from predators, in violation of consumer protection laws. These are not censorship cases; the claim was that the platforms themselves were dangerous products, more akin to tobacco or opioids than to publishers. Australia chose a different path, banning social media for those under 16. Spain, Denmark, France, Malaysia and Indonesia are also considering age-based bans. The world is grappling with legal solutions.
Predictably, Big Tech cries foul, claiming violations of free speech. However, many laws protect expression while allowing proportionate restrictions to protect others’ rights. These include Article 19 of the International Covenant on Civil and Political Rights (ICCPR), which applies to Hong Kong through the Basic Law and the Hong Kong Bill of Rights. The United Nations Convention on the Rights of the Child (CRC), ratified by China, goes further, requiring primary consideration of the “best interests of the child” and protecting them from “all forms of… mental violence.” While Australia and others keep youths off social media, US juries ask a different question: If social media companies’ design choices fuel anxiety, self-harm and exposure to exploitation, are they still neutral conduits of speech? Or do they carry specific duties of care? Too often, large platforms hide behind free-speech arguments to avoid liability, whereas children using them have no say. Some worry that holding platforms accountable will chill speech, but the status quo already chills the speech and spirit of the young. These platforms track children, measure them, and nudge them for profit, shaping their values, desires, identities, and speech. Unregulated design can create its own chilling effect – not by censoring, but by moulding youths’ online world and their habits of mind. They compare themselves with others online and imagine who they might become. The juries saw that these companies know far more than their users about these risks. So, these juries shifted responsibility away from supposedly “weak” or “irresponsible” youths to the firms that profit from their pain.
For Hong Kong, it is tempting to read these cases as a morality tale about “Big Tech finally being punished.” Or to long for a simple answer like bans. But there is a harder question. We often worry about online lies and threats to social harmony, so appeals to “protection,” especially of children, can slide into arguments to control everyone’s speech. Why do our policy instincts gravitate toward regulating what we say – through content takedowns, offences and tighter control – rather than governing how platforms are designed and how their business models operate? Why do we rely on schools and parents to fix problems created by Big Tech’s recommendation algorithms, engagement metrics and data-driven profiling? And how do we genuinely centre children’s rights and voices, rather than merely managing political risk and public opinion?
The law does not ask us to choose between Article 19 and the CRC. Instead, it asks harder questions: Can we pass laws that target platforms’ amplification engines rather than opinions? Can we change the defaults rather than individual choices? Can we change profit structures rather than rely on teenage “self-discipline”? Can we see children as people with rights, not just victims of a toxic digital environment or future workers needing digital skills?
The juries in Los Angeles and New Mexico did not solve these dilemmas, but they made it harder to believe a comforting lie: that we can celebrate free speech, outsource our sociality to commercial platforms, and still keep our promise to protect our young. The real challenge for Hong Kong is whether we will ask the difficult questions now – about Big Tech’s power, our own regulatory choices, and the rights of children as real people, not just as symbols – before our courts, or our children, force those questions upon us. John Nguyet Erni is a chair professor and former dean of humanities at The Education University of Hong Kong.