Swipe, scroll, sue: How big tech got caught in the algorithm trap
At a glance
- In K.G.M. v. Meta Platforms et al., a landmark US judgment found in favour of a claimant who brought a claim against the social media giants Google (YouTube), Meta, TikTok and Snap.
- While this judgment is likely to be appealed, it highlights some of the underlying issues with social media, specifically the addictive nature of feeds designed to maximise user engagement.
- In light of global rulings about social media, South African lawmakers should consider regulatory measures to mitigate the adverse effects social media platforms have on teenagers and children.
The claimant’s legal representatives based their argument on the addictive features of the platforms, not their content, contending that the social media platforms used techniques similar to those employed by gambling machines to ensure children became addicted to using the platforms:
“These companies built machines designed to addict the brains of children, and they did it on purpose.”
While Google argued that YouTube is a responsibly built streaming platform and not a social media site, Meta relied on free speech implications and on section 230 of the US Communications Decency Act, which protects online service providers such as Meta and Google from liability arising from user-generated content.
The jury found that the social media platforms had been specifically designed to be addictive and harmful to teenagers and children, who were not warned of the dangers of these platforms. The platforms used algorithms that presented content designed to maximise engagement, including features that encouraged users to keep scrolling and interacting in exchange for rewards such as likes, comments and new content, mechanisms not dissimilar to those found in gambling. The claimant was not required to prove that all of her mental health issues were caused by the social media platforms, only that the platforms were a substantial contributor to those issues.
Both TikTok and Snap reached confidential settlements with the claimant prior to the trial. The jury awarded damages to the claimant of USD 6 million (apportioned between Google (30%) and Meta (70%)).
Other cases brought against Meta
In New Mexico, Meta was found liable for failing to protect children from child predators and was ordered to pay USD 375 million. Meta was found to have breached New Mexico consumer protection laws by misleading children and teenagers about the safety of its platforms and concealing evidence of the child sexual exploitation that occurs on them.
In 2025, the Gauteng Division of the High Court in Johannesburg ordered Meta to provide the identities of those linked to Instagram and WhatsApp accounts involved in distributing child pornography, permanently disable the offending WhatsApp channels and Instagram accounts, and shut down all accounts linked to the distribution of child pornography.
The K.G.M. v. Meta ruling has been compared to the cases previously brought against ‘Big Tobacco’, which claimed that information about the harms of cigarettes was intentionally concealed. Similarly, a large number of cases are likely to be brought against social media platforms, and these companies will need to reconsider the measures in place for children and teenagers who sign up to and use their platforms.
Global regulatory responses
Australia and Indonesia have already introduced social media bans for children and teenagers under the age of 16, and several other jurisdictions, including Denmark, the UK and the EU, are considering similar measures. In some countries, there are proposals to ban social media for children under the age of 16 unless there is parental consent or oversight.
While the conglomerates are likely to appeal the judgment, it demonstrates some of the underlying issues with social media, specifically the addictive nature of feeds designed to maximise user engagement.
Following these rulings, South African lawmakers should consider regulatory measures to mitigate the adverse effects that social media platforms have on teenagers and children. Social media platforms are also likely to introduce guidelines and amend their terms and conditions to help manage their liability in such cases. These steps may protect vulnerable members of the population, such as teenagers and children, from potential harm caused by the use of social media platforms.
The information and material published on this website is provided for general purposes only and does not constitute legal advice. We make every effort to ensure that the content is updated regularly and to offer the most current and accurate information. Please consult one of our lawyers on any specific legal problem or matter. We accept no responsibility for any loss or damage, whether direct or consequential, which may arise from reliance on the information contained in these pages. Please refer to our full terms and conditions. Copyright © 2026 Cliffe Dekker Hofmeyr. All rights reserved. For permission to reproduce an article or publication, please contact us cliffedekkerhofmeyr@cdhlegal.com.