Opinion editor's note: Editorials represent the opinions of the Star Tribune Editorial Board, which operates independently from the newsroom.

•••

Among things that fall under the category of "so good it hurts" must be Section 230 of the Communications Decency Act of 1996.

That's the part of the U.S. Code that essentially frees online content providers from liability for what their multitudes of users might post. It also gives the platforms legal protection for moderating content to the extent they choose.

When the legislation was passed, one of the motives was "to promote the continued development of the Internet and other interactive computer services and other interactive media."

Guess that worked.

Another was "to preserve the vibrant and competitive free market that presently exists for the Internet and other interactive computer services, unfettered by Federal or State regulation."

Hmm. Internet content is indeed a vibrant coat of many colors, but as long as we're making metaphors, we'll add that much of it matches the palette of various other human outputs. The off-putting experience of it all can drive the most constructive participants from the room, which means there isn't a truly competitive marketplace of ideas.

The primary goal of the Communications Decency Act, itself part of an overdue update of telecommunications law, was to cleanse the young internet of pornography. Most of the CDA was later felled by the courts, but Section 230 survived and brought us the online world we have today. Which is hardly wholesome, though that's the lesser concern now. The greater worry is how legal immunity allows misinformation and hatefulness to spread — and how to reach agreement on defining and addressing those things.

Political leaders of both parties are lining up to take swings at Section 230.

President Joe Biden would seemingly bring the entire piñata down. He told the New York Times Editorial Board during his campaign that Section 230 "should be revoked, immediately" to counter the spreading of falsehoods, though that now looks like an example of speaking loudly while holding a small stick.

Former President Donald Trump also wanted the law repealed, for the opposite reason — he thought content providers had too much power over what circulates and that they wielded it for political reasons.

House Republican staff members have developed a set of principles that aim to address the essence of Trump's allegation. One of their ideas is to remove immunity for large providers while keeping it in place for smaller and newer participants. Another is to revisit Section 230's application to Big Tech companies every five years.

Former President Barack Obama, in a speech last month at Stanford University, said he favors reform of Section 230 but not wholesale repeal. "So much of the conversation around disinformation is focused on what people post," he said. "The bigger issue is what content these platforms promote." He said the companies should allow regulators to scrutinize their algorithms and moderation, similar to what occurs with other industries' proprietary practices.

Minnesota's senior U.S. Sen. Amy Klobuchar also entered the fray with a 2021 bill to remove tech companies' immunity if their algorithms amplify health misinformation during public health emergencies. (The Department of Health and Human Services would define misinformation.) Klobuchar has also co-introduced the bipartisan NUDGE Act, which attempts to address the role of algorithms without changes to Section 230 immunity.

A little legal liability can go a long way in strengthening internal standards, as those of us who work in traditional media can attest. But dumping Section 230 seems unlikely, given the differences in philosophy behind the various Republican and Democratic efforts. There does seem to be enough overlap for careful reforms, and that's not a bad place to start. Unintended consequences are the reason the law is being revisited; they'd also be a risk in a repeal.