Not an advertisement, I swear, but I honestly love language models; I specifically use ChatGPT and Gemini ALL the time.

I’m someone who loves research and learning. I always have questions on all sorts of things. Before GPT I would be googling things for hours each day, visiting forums, watching YouTube videos, etc.

Now with GPT I can ask so many questions on things I am curious about, and even ask follow-up questions or ask for more in-depth explanations.

I think I am also a bit ADHD, so my brain is always jumping around different topics, which makes me curious about specific things. My latest is insects, specifically wasps.

I absolutely hate bugs, and wasps are the worst. But I am now learning more and more about how important a lot of bugs actually are to the ecosystem. It’s really a great way to learn and engage with topics when you can ask follow-up questions or ask for more details on specific aspects.

TLDR: GPT has replaced 90% of the research I used to do through Google.

Important Note: Please be aware that language models can be inaccurate and prone to mistakes, so always verify the information against other sources if you need accuracy and not just general knowledge.

  • kubica@fedia.io
    4 days ago

    Be careful about all the lies it tells you; every time I double-check something there are made-up things. It is only useful for introductory keywords, but it starts to fill the gaps randomly when it doesn’t know about the topic.

    • darkstar@sh.itjust.works (OP)
      4 days ago

      Thankfully, for general knowledge questions it gets things correct. A lot of what I ask or am curious about is general knowledge, so it works really well for me.

      • x_cell@slrpnk.net
        10 hours ago

        It really doesn’t get general knowledge correct.

        AI will often confuse itself because its answers are probabilistic. So it is often right, but just as often it is hallucinating random BS, because AI doesn’t really know shit. It just detects patterns.

        I just saw a post on Reddit about GPT claiming Argentina is the second most populated country in South America (which is false; it’s Colombia), but since Argentina is usually in second place in most lists for the region, GPT tends to place Argentina there anyway.
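
        A toy sketch of what “probabilistic” means here, with made-up numbers rather than real model output - the model samples from likely-looking next tokens, so a frequent-but-wrong completion can win:

        ```python
        import random

        # Toy next-token distribution for "the second most populated country in
        # South America is ___". The probabilities are invented for illustration only.
        next_token_probs = {
            "Colombia": 0.40,   # the correct answer
            "Argentina": 0.35,  # wrong, but common in "top of the region" contexts
            "Peru": 0.15,
            "Venezuela": 0.10,
        }

        tokens = list(next_token_probs)
        weights = list(next_token_probs.values())

        # Sample several completions: the wrong answer shows up a large share of
        # the time, because the model picks what is likely-looking, not what is true.
        print(random.choices(tokens, weights=weights, k=10))
        ```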

      • kubica@fedia.io
        4 days ago

        My problem used to be that I started like that, and then I wanted to ask a bit more, and then another bit more, and that’s when things got messy. At some point I would usually end up annoyed that half of the conversation was based on a lie. Now I barely open the chat because of that.

          • theneverfox
            2 days ago

            Probably my #1 use case currently. I love passing in code and saying “parse all these values and build an error message if they’re empty” and such. It’s the kind of thing that’s good to have, but that I can rarely be bothered to write out line by line - it’s easy but horribly tedious.
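
            A rough sketch of the kind of code being described, with hypothetical field names just to make it concrete:

            ```python
            # Hypothetical example: check a batch of values and build one error
            # message naming any that are empty. Field names are made up.
            def validate_required(fields: dict[str, str]) -> str | None:
                """Return an error message listing empty fields, or None if all are filled."""
                missing = [name for name, value in fields.items() if not value.strip()]
                if missing:
                    return "Missing required values: " + ", ".join(missing)
                return None

            # Usage: two of the three values are blank, so both get named in the message.
            print(validate_required({"username": "darkstar", "email": "", "city": "  "}))
            ```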

      • Kichae@lemmy.ca
        3 days ago

        It gets the most popular answer. That is often in a different postal code from “right”.