• Rekall Incorporated@lemm.eeOP · 18 hours ago

    I don’t have any stats to back this up, but I wouldn’t be surprised if failure rates were higher back in the 90s and 2000s.

    We now have much more sophisticated validation technology, plus the benefit of industry, process, and operational maturity.

    Would be interesting to actually analyze the real-world dynamics around this.

    • GrindingGears@lemmy.ca · 14 hours ago

      Not very many people had a dedicated GPU in the 90s and 2000s. And there's no way the failure rate was higher; not even LimeWire could melt down the family PC back then. It sure gave it the college try, but the machine was usually fixable. The biggest failures, bar none, were hard drives or media drives.

      • TacoSocks@infosec.pub · 1 hour ago

        Dedicated GPUs were pretty common in the 2000s; they were required for most games, unlike the 90s, which were an unstandardized wild west. The failure rate had to be higher; I had three cards die with less than two years of use each in the 2000s. Cases back then had terrible airflow, and graphics demands jumped quickly.

    • tehWrapper@lemmy.world · 14 hours ago

      I am going to guess the number of cards made is also much higher than in the 90s and 2000s, since hardware tech is way more popular and used in way more places in the world. So maybe a lower failure percentage, but a higher total number of failures (rough numbers sketched below).

      But I have no idea…
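
      To illustrate that last point with purely hypothetical numbers (none of these figures come from the thread; they only show how a lower rate on a larger base can still mean more dead cards in absolute terms), here is a minimal sketch:

      ```python
      # Hypothetical illustration only: shipment volumes and failure rates are made up.
      def absolute_failures(units_shipped: int, failure_rate: float) -> float:
          """Expected number of failed cards = units shipped * per-unit failure rate."""
          return units_shipped * failure_rate

      # Guess for the 2000s: fewer cards shipped, higher failure rate.
      failures_then = absolute_failures(units_shipped=10_000_000, failure_rate=0.05)   # 500,000

      # Guess for today: far more cards shipped, lower failure rate.
      failures_now = absolute_failures(units_shipped=100_000_000, failure_rate=0.02)   # 2,000,000

      print(f"then: {failures_then:,.0f} failures, now: {failures_now:,.0f} failures")
      ```

      Even with the per-unit failure rate cut by more than half, the tenfold jump in volume makes the absolute number of failures four times larger in this made-up scenario.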