• @CarbonIceDragon
    18
    22 days ago

    Honestly, I don't think this is truly stupidity, at least beyond what is typical of everyone else, because intelligence is not the same thing as knowing facts. I don't think being uninformed is really the best word for it either. If someone tells you something is true that conflicts with what you already think is true, just accepting it isn't really the intelligent thing to do; verifying it would be. And since most people don't have the expertise to verify most things, the best one can usually do is look to those one trusts as a source of information. If the people backing up this new information are unknown to you, while the people you already trust assure you that what you already believe is true, you don't really have a good reason to abandon it. What I think this is, then, is that a huge fraction of the population puts their trust in the wrong people. Partly that's self-perpetuating bad luck: one's parents and family are likely to be the first people one trusts, so whoever they trust is likely to seem trustworthy to you as well, and if you are unlucky enough to be born into a group that trusts the wrong sources of information, there's a decent likelihood that you will too. And partly it's that those who wish to deceive for their own ends are likely to know all kinds of psychological tricks to make themselves seem more trustworthy, and will probably be more willing to use them to manipulate public opinion than an expert who just wants to share what they know.

    Or, for a TL;DR: I don't think counterproductive political opinions like this are a result of mass stupidity. I think they're proof that propaganda works, and that under the right set of circumstances, you or I or anyone could be made to fall for them.

    • @jjjalljs@ttrpg.network
      6
      22 days ago

      Yes, belief is social. What our in-group believes is way more important for what we believe and how we change our minds than one might think.

      Like, if someone is a flat-earther, changing their mind with facts and figures isn't going to be very effective. Their in-group believes otherwise. And when you come at them with contrary facts, the brain treats it much like a physical threat to its survival. For ancient, prehistoric humans, this might have been an advantage: the guy who didn't go along with the group got left for dead. Unfortunately, modern life is more complicated.

      If we want to make the world better, we should probably focus on breaking up shitty in-groups (e.g. Fox News, the GOP) and fostering groups that are worthwhile (I can't think of an unassailable group, which may indicate another problem).

      • @CarbonIceDragon
        7
        22 days ago

        An unassailable group seems impossible given that there are shitty people out there; if they join such a group, then merely being a member no longer means someone is trustworthy. Even the belief that one's group is an unassailable paragon seems problematic, because if one truly thinks one's group is unassailable, then any accusation of wrongdoing by an outsider against a member will get dismissed. You could end up with the situation some religious groups have with priests or others they see as inherently good and trustworthy, where, when an abusive person inevitably attains that status, allegations against them are dismissed and covered up.

        • @idiomaddict@feddit.de
          3
          22 days ago

          That is by far the most empathetic take on the Catholic Church I've ever seen. I grew up Catholic, and I'm not there yet, but I find it admirable that you are :)

    • @Aceticon@lemmy.world
      2
      21 days ago

      It's not about lacking the hardware for thinking through complex things. It's about thinking habits: not being methodical, not validating conclusions, and operating at a pure language level rather than at a very concrete and precise level of meaning (which is why you see people dispute scientific conclusions based on their own definitions of words). And it's about being emotional without the introspection needed to spot that and stop doing it: becoming wedded to the conclusions one reaches and taking it really badly when they're disproven, overestimating one's knowledge and being unable to reevaluate that estimate because it feels unpleasant to admit one might not know something, and wanting to feel one is winning the argument, hence digging into ever more illogical arguments and basically ignoring the full picture while trying to “win by attacking the words of the explanation”.

      In the old days, people would defer to authority on domains outside their expertise, which is something that was abused (for example, look at how experts were paid by tobacco companies to say their products were not dangerous to human health, or, for a more recent example, look at the field of Economics). So now we have the problem that a lot of people think they're as good as any expert even while not understanding the most basic of basics of that expert's domain. A quite common problem I see is people simply not knowing basic Statistics: assigning meaning and even motive to coincidences of random events, or misreading as causation something that could just as easily be correlation or even reverse causation.

      Most people don't have training in Analytics or Science, so it makes sense that they just apply their day-to-day way of thinking (which has no method and relies on “common sense”) to any and all subjects, including domains which are highly structured and can't be understood at face value, or which are heavily probabilistic, where you can't just apply the mental shortcuts of daily life (of the “if I throw a stone it will fall” kind) to draw conclusions.