A couple of years back when I built my PC, I cheaped out on the GPU so that I could spend more on the CPU. At the time, I reasoned that the GPU I selected (a GTX 1060 6GB) was “adequate”, and the faster CPU would help with the other things I wanted to do, like school work, development, etc. So, short of moving to “enthusiast” gear, I went balls-out on the CPU and settled for that GPU, which left my build fairly lopsided.
I’d told myself that after I got a job I’d bump the GPU up (I was aiming at a 1080 Ti) to make the build make more sense, but that never eventuated… high-end GPUs became hard to get due to the crypto craze, and despite the fact I was making OK money, I never got around to replacing it until now.
In August, I had my annual review coming up at work, and with it came a promotion with a pretty nice raise… so with Sabriena’s encouragement, I figured I would jump aboard Nvidia’s upcoming GPU release. I’d wanted one of the Founders Edition 3080s, even staying up late on release day trying to snag one, but no luck.
After reading reviews, observing the issues some of the partner cards had, and generally overthinking it, I decided I’d watch for my preferred vendor listing the Strix 3080 and jump on it, in the hopes of getting in early enough on the preorders that I’d get one this year (there was some suggestion that some folks wouldn’t get theirs before next year!). That day came and went a few weeks back; the card can’t have been listed on the site for more than an hour before my order was placed and paid for. It finally showed up yesterday (thank fuck too, because I was concerned the delivery wouldn’t make it here before Monday!).
A moment of panic when I realized I wasn’t sure where my second Type 4 GPU cable was, but I managed to track down the sack of extra cables and get it all installed.
The verdict? It’s great! Getting it installed was a bit of an exercise: it’s basically a 3-slot card, so I had to move my capture card, and connecting the three 8-pin power cables was a minor challenge. At 1440p with everything cranked, RDR2 basically sits at about 90% on both CPU and GPU, which I think makes the build fairly balanced now (I don’t see another GPU upgrade in this machine’s future; it’ll need a full rebuild, I think). With everything maxed I sit at around 110~130FPS most of the time; occasionally it boosts higher, and much less frequently it’ll dip down to 80~90, which is still perfectly enjoyable. Team Fortress 2 is utterly bananas on it, no problems maxing out my 144Hz monitor on most maps (the new Halloween maps still give my machine a bit of a workout with everything maxed).
The only downside: I forgot how hot a decent gaming PC runs. The card puts off a significant amount of heat (I think it’s good for about 300W, and since nearly all of that ends up as heat, it’s quite the little heater sitting on my desk), so summer won’t be fun. I did find that dropping back the power target in MSI Afterburner cut the heat a lot with only a minor drop in performance, so that may be an option.
So what to do with the old card? Well, Duncan’s started playing more than just Minecraft (the primary target when we built his machine), and his budget build dogs a bit on some of the games he wants to play. Particularly with TF2: for some reason he can’t tolerate turning the graphics down to low, and at 1080p his 1050 Ti drops down to a terrible 25FPS… but my 1060 rocks it, so he’s got a pretty huge performance boost out of it too. We were going to give it to Sabriena (she has a 1060 3GB) and give her card to Duncan, but she doesn’t play anything except TF2, her card handles that fine, and since she’s been playing a lot of Animal Crossing instead, she decided to just give it straight to Duncan.
I have a few more things left on my Christmas list at PCCG (for example, the 750GB spinner in my desktop makes funny noises and pauses on reads from time to time; it desperately wants replacing), and I’ll bump Duncan’s machine up a tiny bit while I’m at it, but I’m pretty stoked with how things are going.
Update 2020-11-01: I spent some time yesterday arranging my fan profiles again, because it occurred to me I had previously configured the fans based on the very light heat output of my CPU (overclocked, but still rather cool as I didn’t go silly with it) and my comically undersized GPU. The heat output of the new GPU is a lot higher, and so the fans weren’t rolling on early enough, then the heat would boost them, resulting in this super-annoying up/down cycling of the noisiest fan (an EK Vardar for intake air under load).
My fan setup is fairly simple, and not at all balanced, by design. All my intake fans are filtered, and to mitigate dust I want maximum positive pressure. I have a single exhaust fan at the rear, one of the Be Quiet fans that came with the case. No idea how much air they move, but they do what it says on the box - they’re pretty bloody quiet even at full noise. So, first things first: figure out the level at which I can barely hear it, and set that level right around the 65C the CPU runs at under gaming load. Two of the intake fans are the same model, so they’re coupled with it. There’s a flat spot across the gaming temperature range, and then at 70C they go to 100% and stay there.
The CPU fans are a similar story, though I doubt they’ll be of much help here - the primary heat generator will be the GPU, and it’s dumping some of its hot air directly into the path of the CPU fan intakes, so moar fanz is probably not going to help things there. Still, I did the same thing: figure out the level just below where it’s audible, have it as loud as I can stand for normal gameplay, and super quiet if I’m not actually doing anything on the machine.
The last fan is the Vardar, and it’s by far the noisiest. It’s a high static pressure fan meant for water cooling; it’s not marketed as being quiet, and when fed the full 12V it makes a distinct low-pitched whirr. So once again: figure out where I can stomach it for general driving, but more importantly, introduce the same flat section to avoid pitch changes (it’s audible even at low speeds, and any change in pitch is immediately noticeable). At 75C core temps, it goes full noise, giving me a pretty good indicator that the machine’s cooling is losing the fight.
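If it helps to picture the shape of these curves, here’s a minimal Python sketch. The duty-cycle percentages are made up for illustration (the real curves live in the fan-control software, not in code); only the rough temperature breakpoints come from the setup described above.

```python
def fan_duty(temp_c: float) -> float:
    """Map core temperature (C) to fan duty cycle (%): quiet at idle,
    a ramp up to the 'barely audible' gaming level, a flat spot across
    the gaming temperature range so the pitch doesn't cycle up and
    down, then full noise once cooling is losing the fight.
    Duty percentages are illustrative, not the real profile."""
    if temp_c < 50:       # idle: barely spinning
        return 30.0
    if temp_c < 65:       # ramp towards the gaming level
        return 30.0 + (temp_c - 50.0) / 15.0 * 20.0
    if temp_c < 75:       # flat spot: constant pitch while gaming
        return 50.0
    return 100.0          # 75C+: full noise, the "losing" indicator


if __name__ == "__main__":
    for t in (45, 60, 70, 80):
        print(f"{t}C -> {fan_duty(t):.0f}%")
```

The flat section in the middle is the whole trick: small load swings around gaming temperatures don’t change the fan speed at all, so there’s no up/down pitch cycling to notice.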