Helene & Milton - Political fallout | FEMA steps back

  • Thread starter: nycfan
  • Replies: 743
  • Views: 38K
  • Politics
The surge of misinformation we are experiencing as a society is less a failure of the internet itself than it is a failure to regulate the internet.

It’s a great microcosm to examine. When the internet first came out, people thought it would be liberating. For a while, it was. In the era where there were thousands and thousands of small websites, forums, chat rooms, etc., the flow of information was democratized in a way.

The rise of mass social media and Web 2.0 has exposed what happens when you don’t control private profit impulses. The same companies dominate advertising and an increasing number of people get their news exclusively from social media.

We see the dangers of having these massive pillars of our current society under private control. Look at Twitter. This lesson can be applied far beyond the internet.
I don't think this is an issue of regulation, as I don't think there is any way you could regulate what we have now out of existence unless you're prepared to greatly, greatly curtail the 1A.

This isn't a "private ownership issue," because the problem hasn't been created top down. This is a bottom-up issue: a significant percentage of our society has decided they prefer lies and falsehoods that agree with their preconceived notions - both creating and receiving them - over the truth, to the degree that they live in an "alternative reality" to the one actually occurring around us.

The problem of mass social media and Web 2.0 isn't that the internet isn't democratized due to private ownership of the main sites, the problem is that the creation and consumption of "news" is fully democratized and a large minority of the population are just blithering idiots.

The internet worked better in the early years precisely because it wasn't really available to everyone. It seems a simple truism, but the more widely available the internet has become and the more easily it has been made accessible, the worse it has gotten. And the reason is because so many people on the internet are simply stupid, stupid people.

The right-wing propaganda ecosystem - from the internet to TV to radio to books to email/texts - only succeeds because it is feeding a hungry audience. And unless you are willing to take significant steps to curtail the freedom to publish in all of those media, you're really not going to solve this problem.

The issue we face as a country is that (a) a whole lot of folks lack real critical thinking skills and (b) those folks default to answers that "feel right" in the face of a variety of information sources. Things were better in the past because technological limitations largely worked as a barrier to the development of a full-scale propaganda apparatus to appeal to these folks, and it is actually the democratization of information that has led to our current situation.

I have no idea how we solve the problem of misinformation and disinformation that we currently face. But there's almost no way we can put the media genie back in the bottle and the solution will have to be that somehow more folks learn to parse through the vast amounts of information that is available to them and learn to (a) identify reality-based sources of information and (b) choose to receive information from those sources even when they don't like what they're being told.
 
It’s not a matter of greatly curtailing the 1A, though. These social media companies have a responsibility not to publish misinformation. If their algorithms are directly responsible for people dying, which they have been, then the government needs to step in and regulate these companies. There is litigation around this issue currently. We’ll likely have to see what the outcome of that is before anything else can happen.

The other piece of the regulatory side is antitrust. This absolutely is a top-down issue when the largest sites are owned by massive companies like Google and Meta or by billionaires like Musk. You’re right that the genie can’t go back in the bottle. So these sites need to be regulated in the same way that the airwaves are (or used to be).
Antitrust isn't going to solve this; it wouldn't matter if X or Facebook were broken up into various companies, because the same misinformation would be on all of them.

As far as not publishing misinformation, one man's misinformation is another man's truth straight from the mouth of god. Outside of some fairly strict limits, you simply can't prevent folks from publishing whatever they believe to be worthy of publishing unless you can show they absolutely know it to be false.

Let's say that you put something like the Fairness Doctrine in place for all media. All that would mean is that you'd have to give roughly equal time to the misinformation as to accurate information, since all the Fairness Doctrine did was ensure that differing viewpoints were presented.

For better or for worse, the 1A largely protects a "right to lie" unless it can be shown to have been done intentionally by the publisher with the intent to harm others. And if you place the burden on social media site owners to be "publishers" to ensure the "accuracy" of everything posted on social media, you'll largely shut down social media (including places like IC and even this site) because the risk of letting all users generate content will overwhelm any potential profit reward.
 

Typically, men have worse color vision than women, and my color vision is particularly bad for a man. So what color is Mecklenburg County? Because in 2020, Biden got 66.7% of the vote in Mecklenburg County and Trump only got 31.6%.
 
Biden just called out Trump for his lies.
Harris put out a list of the 99 Republicans who voted against FEMA funding. I hope the list is shared in all 99 districts, especially with those being impacted by their representatives putting politics over the lives of the people they represent.
 


"... Johnson himself made claims that only hours before had been refuted by Republican House Rep. Chuck Edwards, who stood beside him at the Wednesday press conference.

... While speaking about the Biden-Harris administration, Johnson said the federal response to the disaster “took days.”

“I wonder what people in the path of … Helene would say about the fact that it took days for them to receive services that they desperately needed,” Johnson said.

Last Friday, while speaking at Asheville’s Mission Hospital, Sen. Tillis remarked on the online chatter that the federal response to the storm has been slow:

“We need to keep in mind that the federal government had assets in this area before we knew what kind of a weather event it was going to be. Did we have everybody? No. And the reason for that is it would have defied any sort of historical examples to think that we'd be experiencing the damage that we're experiencing today. But I have a lot of confidence. I want to thank the federal, uh, state and local agencies first among the first responders,” he said. ..."
 
Biden just now in response to media questions about disinformation generally and Trump specifically-

“Mr. President Trump, former President Trump, get a life man, help these people”

 
Maybe they should be shut down then.
That still does nothing regarding TV, radio, websites, email chains, etc.

You're not going to be able to solve this by shutting down certain media; there's just no practical solution there.
 
Depends on what you think is practical I guess. Listened to an interview with a mass tort lawyer today about the litigation involving social media algorithms. These sites can’t claim they have no responsibility for the content on their site while actively curating an algorithm that promotes dangerous content that has directly led to kids dying.
I'm guessing that lawyers are a long way from proving that the social media companies not only "promoted" content that led to kids dying, but did so knowing that it would, with either intent or extreme indifference toward that outcome.

If you think it's practical to roll back misinformation and disinformation on social media, tv, radio, internet & email and more...describe how you would do so in a way that doesn't violate the first amendment.
 
I don’t know, I’m not an expert or a lawyer. I just think sitting back and saying there is no practical solution to such a serious issue is stupid.

40 states have sued Meta over the addictive nature of their products for children. This is no different from the tobacco or opioid industry.

If the companies can’t figure out a way to have their product without promoting dangerous misinformation, then we can do without social media.

Might be a good topic for another thread if anyone has some articles/papers they’ve read on the subject.
I'm not a lawyer, either, but I know that if lawsuits around mis/disinformation were easy to win, then Fox News and Newsmax and most of the rest of the right-wing media ecosystem would have been sued into oblivion already. The fact that they're still out there shows such efforts are hard to pull off.

I'll be amazed if anything significant comes of the lawsuits against Meta. Most likely they reach a settlement where Meta "strengthens" its age-restriction procedures in a way that is still easily avoidable and, perhaps, pays a small fine.

I don't disagree that mis/disinformation is a major problem on social media. It's more that there's little way to solve it that would be (a) legal and (b) effective, short of essentially shutting it all down (which I don't think the government has the ability to do), because the problem isn't actually the companies...it's the users.
 