  • The tools to manufacture content are more accessible, sure. But again, information has always been easy to manufacture. Consider a simple headline:

    [Group A] kills 5 [Group B] people in terrorist plot.

    I used no AI tools to generate it, yet I created it with minimal effort nonetheless. You would be right to question its veracity unless you recognized my authority.

    The content is not what matters; the person speaking it, and your relationship of trust with them, is. Evidence is only as good as the chain of custody leading back to its origin.

    Not only that, but a lot of people already avoid hard truths and seek to affirm their own belief system. It is soothing to believe the headline if you identify as a member of Group B, and painful if you identify as a member of Group A. That phenomenon does not change with AI.

    Our relationship with the truth is already extremely flawed. It has always been a mistake to treat information as true because it looks a certain way. Maybe a saturation of misinformation is the inoculation we need to finally break that habit and force ourselves to peg information to a verifiable origin (the reality we can experience personally, as we do with basic critical thinking). Or maybe nothing will change, because people don't actually want the truth; they just want to soothe themselves. I guess my point is that we are already in a very bad place with the truth, and there isn't much room for it to get worse.