Clarification:

Given that AI progress won't stop and will eventually make fiction indistinguishable from truth, it will become impossible to trust any article, or at best only some of them, and we won't be able to tell exactly what's true and what's fabricated.

So, what if people from different countries and regions exchanged contacts here and talked about what’s really happening in their countries, what laws are being passed, etc., and also shared their well-thought-out theories and thoughts?

If my idea works, why not wake as many people as possible up to the fact that only methods like this will be able to distinguish reality from falsehood in the future?

I’m also interested in your ideas, as I’m not much of an expert.

  • Naich@lemmings.world · 26 points · 1 day ago

    That is already the case with written news. How can you trust any of it, when anyone can make up anything and present it as fact? Just as you have to rely on the source and provenance of a news story, the same is now true of photos and videos. Photo altering has been going on for over 100 years.

      • SkyNTP@lemmy.ca · 8 points · edited · 1 day ago

        The tools to manufacture content are more accessible, sure. But again, information has always been easy to manufacture. Consider a simple headline:

        [Group A] kills 5 [Group B] people in terrorist plot.

        I used no AI tools to generate it, yet I created it with minimal effort. You would rightly question its veracity unless you recognized my authority.

        The content is not what matters. The person speaking it, and your relationship of trust with them, is. Evidence is only as good as the chain of custody leading back to its origin.

        Not only that, but a lot of people already avoid hard truths and seek to affirm their own belief system. It is soothing to believe the headline if you identify as a member of Group B, and painful if you identify as a member of Group A. That phenomenon does not change with AI.

        Our relationship with the truth is already extremely flawed. It has always been a giant mistake to treat information as the truth because it looks a certain way. Maybe a saturation of misinformation is the inoculation we need to finally break that habit and force ourselves to peg information to a verifiable origin (the reality we can experience personally, as we do with simple critical thinking skills). Or maybe nothing will change because people don’t actually want the truth, they just want to soothe themselves. I guess my point is we are already in a very bad place with the truth, and it seems like there isn’t much room for it to get any worse.