Russian interference & Iranian Interference | Musk & Russia chummy convos

  • Thread starter: theel4life
  • Replies: 161
  • Views: 3K
  • Politics
It's an interesting conundrum. The first step requires an attempt to craft federal legislation adequately addressing the problem. Next, it has to be enacted by Congress. Good luck with getting a majority to push that through the Senate. Assuming that could be accomplished, the final step is for courts to deal with challenges to the new law until there is a sufficient body of precedent to firmly establish the constitutionality of the statutory provision.
I'm not even sure how you'd craft a law or set of laws that would effectively deal with the issue.

A lot of disinformation comes from outside of the US. Even if you make it illegal, the actors aren't within US jurisdiction, won't be extradited, and therefore aren't really affected by any of our laws.

For those actors within the US, I think it gets really difficult really quickly. You can certainly make it illegal to intentionally create and distribute knowingly false information. (In fact, in many ways, that's already illegal.) The bigger issue will come from two questions:

1) What, if anything, do you do with folks who distribute such information unaware that it is false?
2) Do you place any of the burden on the media companies to prevent distribution?

I don't think you can do much about #1 unless you're prepared to prosecute every little old lady who shares an FB post unaware that it's false. If you can show that major players have a significant pattern of distributing such material, you could probably go after them, but you likely wouldn't get far with the little folks who end up sharing a lot of these posts.

For #2, you can certainly go in this direction, but you're going to put a major burden on social media sites. How do they determine what is false and what is not? If they aren't sure about certain issues or topics, are they still liable if things are distributed? Who ultimately determines which things are true and which are false under this law? (Because, if the government ends up being the party who makes the determination, then a Trump-like administration could use such laws/processes to enforce a Fox News-like "reality" upon social and other media.)

I think it's going to be insanely difficult to create laws that protect our media from disinformation but don't essentially shut down social media or make every person on social media legally responsible for things they post that are inaccurate out of ignorance. If such a system can be created, I would likely support it, but I think it will be so difficult as to be essentially impossible.

Also, another post mentioned AI-created fakes. While our laws need to be updated, I think those are going to be much easier to address at the root/creation level, at least for content created within the US. The problem there is going to be content created outside the US, where our laws don't reach, that gets shared to the US and amplified by unsuspecting folks thinking it's legit.
 
As this election has shown us, it has become the Wild Wild West for disinformation. We are truly in a Kafkaesque period and I don't foresee a solution as long as there are bad actors who are beyond the reach of corrective action.
 

Russia Could Stoke Unrest After U.S. Election, Officials Say

Foreign powers including Russia and Iran could move quickly right after the vote to undermine the democratic process, intelligence agencies warn.


I'm kind of at the point where I read the headline and think "is this really news?"
 

This is where I point out that it makes no sense to be upset with China or Russia. They are going to do what they do. As people have said, we've done it, so why not expect them to do it too?

Our anger should be focused on those entities within the US that allow or even encourage this. And that is where the "we do it too" argument fails. That rationale only works when pointing the finger at foreign governments.
 