
Point/Counterpoint: Protecting Minnesota kids online could make internet, social media worse for everyone

From the column: "Banning automated sorting mechanisms and imposing verification requirements would do little to solve the problems affecting children online. They may, however, have unintended

Editorial cartoon by Guy Parsons / Cagle Cartoons

A bill to protect children online quickly advanced through the Minnesota Legislature this session. But the proposal likely would inconvenience internet users and promote needless data collection.

Digital ecosystems pose real and substantial risks to children, who must navigate their way around threats such as obscene content and sexual exploitation. In Minnesota, state lawmakers attempted to tackle this problem by focusing on algorithms. The legislation they proposed (HF 3724 and S 3933) would prevent social-media platforms from using algorithms to “target user-generated content at an account holder under the age of 18.”

While well-intended, these efforts would complicate all users’ experience on social media — without substantially improving safety.

From the column: "To their credit, Minnesota legislators made a number of amendments ... to reflect a bit more awareness of some of the potential unintended harms their algorithm ban might cause."

Rep. Kristin Robbins, the House bill’s sponsor, was prompted to take action after reading a series of anecdotal accounts about curated TikTok videos and their impact on adolescent mental health.

“Social media algorithms” may sound ominous, but they are really just rules that help order content by relevance. Not every platform operates the same way, and there is no singular algorithm. Each social-media company sorts and prioritizes content differently, so the metrics TikTok uses differ from those of Facebook or Instagram.


Fundamentally, the broad focus on sorting mechanisms is misguided. Algorithms, while imperfect, make it possible for social-media companies to sort through the millions of images, videos, and comments posted each day and show users what might be interesting to them.
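To make that concrete, here is a purely illustrative sketch, in Python, of what “sorting by relevance” amounts to. The post fields and weights are invented for this column; no platform publishes its actual scoring rules.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author_followed: bool  # does the viewer follow the author?
    likes: int             # raw engagement signal
    topic_match: float     # 0.0-1.0 overlap with the viewer's inferred interests

def relevance(post: Post) -> float:
    # Toy scoring rule. Every platform weighs its own proprietary
    # signals, and each one weighs them differently.
    score = 10.0 * post.topic_match
    score += min(post.likes, 1000) / 100.0  # cap so raw virality can't dominate
    if post.author_followed:
        score += 5.0
    return score

def build_feed(posts: list[Post]) -> list[Post]:
    # At bottom, "the algorithm" is just a sort by a score like this one.
    return sorted(posts, key=relevance, reverse=True)
```

Strip out that final sort, and what remains is an unordered pile of posts, which is essentially what the bill would mandate for minors.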

Yet the proposed legislation doesn’t account for the nuance and utility of algorithms. It would cover any “electronic medium … that allows users to create, share, and view user-generated content.” Although Rep. Robbins may be trying to target companies like TikTok and Instagram, her bill’s inclusive language would implicate websites like LinkedIn, which is geared toward working professionals, not teens. Childproofing LinkedIn, among many other websites with predominantly adult users, is unlikely to yield significant gains for adolescent mental health.

Most importantly, the legislation would significantly burden the very segment of the population lawmakers are trying to protect. The prohibition on algorithms simply means that Minnesota residents under 18 would have to filter through content themselves. Offensive content would still be there, just mixed in with every other post; in essence, social media would resemble a large pile of unsorted cards. Today, a teenage user can flag unwanted photos and videos, thereby “teaching” the algorithm that she doesn’t want to see such content. Under the bill, every post would have to be shown.
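As a rough illustration of that feedback loop (again hypothetical; real systems use far more signals), the “teaching” can be as simple as down-weighting topics a user has already flagged:

```python
def adjusted_relevance(topic: str, base_score: float,
                       flagged_topics: set[str]) -> float:
    # Hypothetical feedback rule: topics the user has flagged are
    # pushed to the bottom of the feed rather than deleted outright.
    return base_score - 100.0 if topic in flagged_topics else base_score

# The user hides one diet-content video ...
flagged = {"extreme-dieting"}
# ... and similar posts now rank far below everything else.
print(adjusted_relevance("extreme-dieting", 12.5, flagged))  # -87.5
print(adjusted_relevance("cooking", 12.5, flagged))          # 12.5
```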

Furthermore, in order to determine whether someone is a Minnesota resident under the age of 18, social-media companies would be forced to collect a trove of personal information from all users. To comply with the bills, companies would need to confirm all users’ ages and locations. This poses significant privacy concerns, especially for human-rights activists, political dissidents, and journalists, who often rely upon anonymity to keep themselves safe. As noted by the Wall Street Journal, such measures would also disadvantage groups with less access to identification.

This isn’t the first bill of its kind. It joins a litany of state content moderation bills introduced over the past few months by concerned legislators from both sides of the aisle.

After the Jan. 6 attack on the Capitol, social-media platforms tightened their moderation practices, to the dismay of conservatives, who saw the action as a form of censorship. Bills responding to that perceived censorship were signed into law in both Florida and Texas, where they are now being challenged in court on constitutional grounds. At the federal level, child-focused content moderation bills like the EARN IT Act have garnered criticism from technology experts, who say such proposals would curtail legal free speech and erode privacy.

Banning automated sorting mechanisms and imposing verification requirements would do little to solve the problems affecting children online. They may, however, have unintended consequences for other vulnerable groups and risk user privacy. Ultimately, the conversation surrounding child safety deserves greater and more thoughtful discussion — not quick fixes that would make the internet worse for all.

Rachel Chiu is a contributor for Young Voices (young-voices.com), a nonprofit talent agency and PR firm for writers under 35. Follow her on Twitter: @rachelhchiu. She wrote this exclusively for the News Tribune.


