• Mossy Feathers (They/Them)
    6
    10 months ago

    The irony is that nowadays the monitors would be swapped. The “good PC” would have a CRT (because most CRTs nowadays are probably in enthusiast rigs), while the “bad PC” would have the common 1080p Dell IPS display.

    On a semi-related note, why are Dell’s IPS panel monitors so ridiculously common? VA and TN panels are a lot cheaper, so I’d think companies wanting to get the most bang for their buck would use those instead. Is it the fact that IPS panels have a decent horizontal viewing angle, so Mr. Micromanager can look over your shoulder and see what you’re doing more easily?

    • @w2tpmf@lemmy.world
      5
      10 months ago

      Dell produces monitors in much larger numbers, so most distributors like CDW will have them in every warehouse in the country. This makes it much easier to standardize equipment across a large organization when you can always order the exact same SKU for several years in a row.

    • funkajunk
      4
      10 months ago

      Where are you getting this information about CRTs from? I know they get used for old school emulation, but pretty sure for modern systems a high refresh rate and freesync/gsync is where it’s at.

      • Mossy Feathers (They/Them)
        6
        10 months ago

        People who are into older games tend to have a CRT + retro rig or a digital-to-analog converter, and a lot of older PC games legitimately look nicer on CRTs. Additionally, CRTs can have ludicrously high refresh rates and resolutions; don’t let the 4:3 aspect ratio fool you. High-end CRTs (specifically computer monitors, not TVs) tended to max out at 1600x1200 (vs 1920x1080), giving them a slightly larger vertical resolution at the cost of a lower horizontal resolution, with some going as high as 2048x1536, which is comparable to 1440p (yes, 1440p; CRT computer monitors were mostly progressive scan, not interlaced like TVs). The refresh rates on later CRTs tended to start at 75Hz (vs 60Hz on LCDs) and could max out at 200Hz on high-end monitors. You’d sacrifice resolution to do so, though I think you could mitigate some of that by using a BNC cable if your monitor supported it (though I doubt most rigs could run anything even close to 200fps without decreasing resolution). Finally, CRTs tend to have extremely low response times, very good color depth, and true blacks.
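        A quick sketch (Python, using only the resolutions quoted above) of why 2048x1536 is "comparable to 1440p" in raw pixel count; the mode labels are just for illustration:

        ```python
        # Pixel-count comparison for the display modes mentioned above.
        modes = {
            "1600x1200 (high-end CRT)": (1600, 1200),
            "1920x1080 (1080p LCD)": (1920, 1080),
            "2048x1536 (top-end CRT)": (2048, 1536),
            "2560x1440 (1440p LCD)": (2560, 1440),
        }

        for name, (w, h) in modes.items():
            print(f"{name}: {w * h:,} pixels")
        ```

        Running it shows 1600x1200 lands just under 1080p in total pixels (more vertical, less horizontal), and 2048x1536 comes in a bit under 1440p.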

        That said, CRTs are heavy, fragile, and nowadays expensive: before the pandemic you could get a high-end Sony Trinitron 20" PVM (professional video monitor) for like $300-$400, and shipping was more expensive than the monitor; nowadays you’re easily talking $1000 or more. Most LCD panels can beat CRTs in resolution and refresh rate nowadays (though even high-end LCD panels tend to struggle at beating CRT response time), and OLEDs outclass CRTs in almost every way.

        Edit: oh, another weakness of CRTs is that they can burn-in. That’s where the term originated. If you left an image on the screen too long, it’d burn into the display, causing it to persist even after the monitor was turned off and unplugged. Since no one’s making CRTs anymore, that means there’s a smaller and smaller pool of CRTs in good condition, which means they’ll get more expensive until someone decides it’s worth the money to start making the tubes again.

        Edit 2: that’s also why screensavers were a thing! Screensavers were there to stop you from accidentally burning in your monitor. I wonder why they haven’t made a comeback with OLEDs.

    • moosetwin
      3
      10 months ago

      I bet it’s that Dell already has a business relationship with a ton of companies, and the inertia is keeping them common.