- cross-posted to:
- hardware@lemmy.world
Okay, so the “S” models stood for “Super”, which was a slight step up from the base model. What are “D” models? “Duper”?
China-exclusive models, labeled as such to get around US export restrictions.
Dragon
Imagine paying $2K for a GPU and this shit happens
Scalpers have turned that into $6,000 for the units still available.
The stupidly minor gains you’d get from one of these cards over a 40-series aren’t even worth the time it would take to crack open your case, let alone 6 grand. World’s lost its goddamn mind.
Closer to $2,500 for most models
I doubt they got them for $2K
Nvidia has been a garbage company for a few generations now. They got to the top and sat up there enshittifying everything because they have a monopoly on the market. Don’t buy into this shit.
If you’ve got hardware, use it until it dies…
Fucking 1070 can still put out decent performance lol
Running a 1060 in my desktop. Still does absolutely fine. I got my buddy’s old 1080 Ti OC’d; just waiting to get the water-cooling kit put together, and then the 1060 will go in my media server to take over transcoding from the old 980.
I’m still using a VIC-II without any lag or drop in FPS
2070S here; it doesn’t work that well with AAA (or even AAAA games! /s) at 3440x1440 :S. But I can easily survive turning the graphics down.
3440x1440
Suffering from Success 🐸
Haha not so successful when your fps is total garbage
temporarily embarrassed millionaire vibes
For real. I’ve been rocking a 1070 for years, and the only games that don’t get decent performance are new-release open-world survival-sandbox titles that tend to suffer from a lack of optimization anyway.
I’m sure replacement units are in plentiful supply. Right?
Trust me bro!
Guess we’ll have to see how they handle this. Are they going to be good and do a full recall, or pull an Intel and do everything they can to avoid it?
It feels like things are so powerful and complex now that the failure rates of all these devices are much higher.
I don’t have any stats to back this up, but I wouldn’t be surprised if failure rates were higher back in the 90s and 2000s.
We have much more sophisticated validation technologies and the benefit of industry, process, and operational maturity.
Would be interesting to actually analyze the real-world dynamics around this.
Not very many people had a dedicated GPU in the 90s and 2000s. And there’s no way the failure rate was higher; not even LimeWire could melt down the family PC back then. It sure gave it the old college try, but it was usually fixable. The biggest failures, bar none, were hard drives or media drives.
Dedicated GPUs were pretty common in the 2000s; they were required for most games, unlike the 90s, which were an unstandardized wild west. The failure rate had to be higher; I had 3 cards die with less than 2 years’ use on each in the 2000s. Cases back then had terrible airflow, and graphics demands jumped quickly.
I was referring to PC components in general.
We all did; they used to cost like 60 bucks.
I’m going to guess the number of units made is also much higher than in the 90s and 2000s, since hardware tech is way more popular and used in way more places around the world. So maybe a lower failure percentage but a higher total count: 1% of 100 million units is still way more dead cards than 5% of 10 million.
But I have no idea…
You’re just short of needing a personal-sized nuclear reactor to power these damn things, so the logic follows that the failure rate is going to climb.
Lol