Point/Counterpoint: Protecting Minnesota kids online could make internet, social media worse for everyone

From the column: "Banning automated sorting mechanisms and imposing verification requirements would do little to solve the problems affecting children online. They may, however, have unintended consequences for other vulnerable groups and risk user privacy."

(Editorial cartoon by Guy Parsons / Cagle Cartoons)

A bill to protect children online quickly advanced through the Minnesota Legislature this session. But the proposal likely would inconvenience internet users and promote needless data collection.

Digital ecosystems pose real and substantial risks to children, who must navigate threats such as obscene content and sexual exploitation. In Minnesota, state lawmakers attempted to tackle this problem by focusing on algorithms. The legislation they proposed (HF 3724 and SF 3933) would prevent social-media platforms from using algorithms to “target user-generated content at an account holder under the age of 18.”

While well-intended, these efforts would complicate all users’ experience on social media — without substantially improving safety.

From the column: "To their credit, Minnesota legislators made a number of amendments ... to reflect a bit more awareness of some of the potential unintended harms their algorithm ban might cause."

Rep. Kristin Robbins, the House bill’s sponsor, was prompted to take action after reading a series of anecdotal accounts about curated TikTok videos and their impact on adolescent mental health.

“Social-media algorithms” may sound ominous, but they are simply rules that help order content by relevance. Not every platform operates the same way, and there is no singular algorithm. Each social-media company sorts and prioritizes content differently, such that the metrics TikTok uses differ from those of Facebook or Instagram.

Fundamentally, the broad focus on sorting mechanisms is misguided. Algorithms, while imperfect, make it possible for social-media companies to sort through the millions of images, videos, and comments posted each day and show users what might be interesting to them.
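To make the idea concrete, here is a minimal, hypothetical sketch of relevance sorting in Python. The fields, weights, and scoring formula are illustrative assumptions only; no platform publishes its actual ranking logic, and real systems are vastly more complex.

    from dataclasses import dataclass, field

    @dataclass
    class Post:
        text: str
        likes: int
        age_hours: float
        topics: set[str] = field(default_factory=set)

    def relevance(post: Post, user_interests: set[str]) -> float:
        """Hypothetical score blending engagement, freshness, and topic overlap."""
        engagement = post.likes / (post.likes + 100)      # diminishing returns on likes
        freshness = 1.0 / (1.0 + post.age_hours / 24.0)   # newer posts score higher
        interest = len(post.topics & user_interests) / max(len(post.topics), 1)
        return 0.4 * engagement + 0.3 * freshness + 0.3 * interest

    def rank_feed(posts: list[Post], user_interests: set[str]) -> list[Post]:
        """Sort the day's flood of posts so the most relevant appear first."""
        return sorted(posts, key=lambda p: relevance(p, user_interests), reverse=True)

Each platform would plug in different signals and weights, which is part of why there is no singular algorithm to ban.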

Yet the proposed legislation doesn’t account for the nuance and utility of algorithms. It would cover any “electronic medium … that allows users to create, share, and view user-generated content.” Although Rep. Robbins may be trying to target companies like TikTok and Instagram, her bill’s inclusive language would implicate websites like LinkedIn, which is geared toward working professionals, not teens. Childproofing LinkedIn, among many other websites with predominantly adult users, is unlikely to yield significant gains for adolescent mental health.

Most importantly, the legislation would significantly burden the very segment of the population that lawmakers are trying to protect. The prohibition on algorithms simply means that Minnesota residents under 18 would need to sift through content themselves. Offensive content would still be there, just interleaved with other posts. In essence, social media would resemble a large pile of unsorted cards. Whereas a teenage user today can flag unwanted photos and videos, thereby “teaching” the algorithm that the user does not want to see such content, the bill would require all posts to be shown.
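Extending the hypothetical sketch above, the contrast the bill creates might look like this in code; the flag-and-filter mechanics here are assumptions for illustration, not a description of any real platform.

    def personalized_feed(posts: list[Post], interests: set[str],
                          flagged_topics: set[str]) -> list[Post]:
        """A user's flags 'teach' the ranker: posts on flagged topics are
        screened out before relevance sorting (reusing Post and rank_feed above)."""
        visible = [p for p in posts if not (p.topics & flagged_topics)]
        return rank_feed(visible, interests)

    def unranked_feed(posts: list[Post]) -> list[Post]:
        """What the bill would mandate for minors: the raw, unsorted pile,
        with flagged or unwanted content interleaved with everything else."""
        return list(posts)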

Furthermore, in order to determine whether someone is a Minnesota resident under the age of 18, social-media companies would be forced to collect a trove of personal information from all users. To comply with the bills, companies would need to confirm all users’ ages and locations. This poses significant privacy concerns, especially for human-rights activists, political dissidents, and journalists, who often rely upon anonymity to keep themselves safe. As noted by the Wall Street Journal, such measures would also disadvantage groups with less access to identification.

This isn’t the first bill of its kind. It joins a litany of state content-moderation bills introduced over the past few months by concerned legislators from both sides of the aisle.

After the Jan. 6 attack on the Capitol, social-media platforms tightened their moderation practices, to the dismay of conservatives, who saw the action as a form of censorship. In response, state legislators introduced bills restricting how platforms moderate content; such bills were signed into law in both Florida and Texas, where they are now being challenged in court on constitutional grounds. On the federal level, child-focused content-moderation bills like the EARN IT Act have garnered criticism from technology experts, who say such proposals would curtail legal speech and erode privacy.

Banning automated sorting mechanisms and imposing verification requirements would do little to solve the problems affecting children online. They may, however, have unintended consequences for other vulnerable groups and risk user privacy. Ultimately, the conversation surrounding child safety deserves greater and more thoughtful discussion — not quick fixes that would make the internet worse for all.

Rachel Chiu is a contributor for Young Voices (young-voices.com), a nonprofit talent agency and PR firm for writers under 35. Follow her on Twitter: @rachelhchiu. She wrote this exclusively for the News Tribune.
