Point/Counterpoint: Clumsy regulation could create new privacy, 1st Amendment concerns

From the column: "To their credit, Minnesota legislators made a number of amendments ... to reflect a bit more awareness of some of the potential unintended harms their algorithm ban might cause."

(Editorial cartoon: Dick Wright / Cagle Cartoons)

Lawmakers responding to concerns about how digital content may harm some users have become increasingly fixated upon regulating the algorithms that organize online content — especially with respect to content targeted at children. Unfortunately, as is often the case with emerging technologies, most of the legislative proposals aimed at protecting kids from potentially harmful online content are not being crafted with much understanding of how the technology actually works.

One of these well-intentioned but clumsy attempts could become law in Minnesota in the coming days, causing a host of presumably unintended headaches for the state’s internet users.


The companion bills HF 3724 and SF 3933 are labeled as simply “prohibiting certain social-media algorithms that target children.” This concept of an algorithm ban has its own problems, made worse because the legislation defines social-media platforms and algorithms so broadly that it would apply not only to social-media sites but to any sufficiently large “browser-based or application-based interactive computer service, telephone network, or data network, that allows users to create, share, and view user-created content.”

From prioritizing which posts show up on a social-media feed, to targeted advertisements, to ranking what shows up when you type something into a search engine: all of it would have to be shut off for any site user identified as being under 18, as the legislation was initially written. The net effect, as described accurately by TechDirt’s Mike Masnick, would be that “websites can no longer be useful for teenagers.”

To their credit, Minnesota legislators made a number of amendments to both the House and Senate bills to reflect a bit more awareness of some of the potential unintended harms their algorithm ban might cause, such as carving out algorithms intended to prevent harmful content from being served to kids. Another amendment wisely specified that search engines and email are not covered — indeed, previous versions would have made even Google and Bing unusable. The House and Senate bills were each amended somewhat differently, and it is unclear which amendments will be incorporated in the omnibus bill sent to the governor.

Regardless of which version advances, fundamental problems remain that cannot easily be amended away. For instance, the legislation treats all minors the same: a 17-year-old would be banned from accessing the same services as a 4-year-old. It leaves parents no discretion to decide at what point their kids should be able to access the full functionality of the services being regulated.

Most actual social-media sites like Facebook or Twitter already have age verification and try to keep minors under a certain age from using their platforms. Since the Minnesota legislation applies far more broadly than these sites, however, it would likely, as NetChoice’s Jennifer Huddleston pointed out, force some services to track users’ age and identity when they had not previously done so, ironically creating new privacy concerns in the process.

But perhaps the most formidable obstacle facing not only Minnesota’s bill but any attempt to ban social-media algorithms is the First Amendment. As Stanford’s Daphne Keller explains, platforms’ legal rights in how they choose to display and target content toward users are not unlimited, but generally the U.S. Supreme Court “has set a low bar in defining rights of entities that aggregate third-party speech.” These algorithms are a complex way of prioritizing what order to present content to a given user, and such decisions are constitutionally protected.

None of this is to say that consumers and lawmakers cannot demand better transparency, or that better tools cannot be made available for parents to control their kids’ internet use. Many of the large online platforms already offer a number of tools to help parents exercise more control over what content their kids can see, how much, and how often; but some onus falls on the platforms themselves to make these tools more widely known and easier to use.

The blunt approach of simply banning algorithms from applying to minors is almost certainly a legal non-starter.

Josh Withrow is a fellow for technology and innovation policy at the R Street Institute (rstreet.org), a nonpartisan think tank in Washington, D.C. He wrote this exclusively for the News Tribune.
