Based on our empirical strategy, which uses a questionnaire typically employed in studies on politics and ideology (the Political Compass), we document robust evidence that ChatGPT presents a significant and sizeable political bias towards the left side of the political spectrum. In particular, the algorithm is biased towards the Democrats in the US, Lula in Brazil, and the Labour Party in the UK. Taken together, our main and robustness tests strongly indicate that the phenomenon is indeed a form of bias rather than a mechanical result of the algorithm.

  • Orion (awooo)
    1 year ago

    I feel like this is one of those cases where whatever you do it will end up disadvantaging at least one group. Is there a good solution? I don’t know.

    Being politically unbiased would mean sometimes being inaccurate, just to represent “both sides”, as we seem to have made basic facts about the world into political arguments. This fundamentally goes against the goal of making a chatbot that is truthful and useful (imagine if it declined to answer any question that had any sort of relation to politics).

    Also, as much as some centrists would want to disagree, centrism is itself a political position. If ChatGPT scored right in the middle of the Overton window, it would have the effect of pulling both sides toward the center, so it would still be influencing our politics.

    And yeah, “left wing” is very relative here: being a leftist myself, I find ChatGPT painfully liberal, with a corporate wash added on top of that.