Tuesday, November 22, 2016

Now That's More Like It

Don't know what you got till it's gone
Don't know what it is I did so wrong
Now I know what I got
It's just this song
And it ain't easy to get back
Takes so long
--Don't Know What You've Got (Till It's Gone), Cinderella


The adventure I've been on the past month began when the graphics card in the main PC went belly up. I'd been thinking that we might want to upgrade some of the components over the holidays --if the bonus gods were willing to smile on us, that is-- but I'd been thinking that a new graphics card would be #2 on the list behind an SSD.

But that idea got thrown in the trash heap on November 4th.

Yeah, like this. Only with fewer cows.
From quickmeme.com.

I've already covered my adventures dealing with the Intel integrated graphics for my 3rd Gen i7 system (spoilers: they weren't happy ones), so I knew I had to shell out for a new card. And yes, I warned The Boss just how much a card that would be a bit better than the old one, but not top of the line, would cost (~$200 US). So, with a budget in hand and potential specs aplenty, I sallied forth to do battle with the mighty graphics card market.

I used to be an NVidia fanboy from way back in the day, but I had some bad experiences with the GS series of NVidia cards in the late 2000s, so I'd switched to AMD's Radeon offerings when the current PC was built. I saw no need to change that, particularly since the AMD integrated graphics on the mini-Red's laptops ran rings around Intel's integrated graphics. If they can do that, I figured, then their dedicated cards will be good enough for me.

What I saw in my research only confirmed my suspicions, as I zeroed in on the RX470 as the potential card to have. The 4 GB option in particular hit that ~$200 price point, and I had a traditional HD monitor, so I had no need for either the 8 GB version or the RX480 cards. I also had a quirk of the system in that I only had a 6 pin power connector available for the graphics card, not an 8 pin, so that ended up limiting my selection to only a couple of cards.*

Namely, this one:

The Sapphire Radeon RX470 4GB. From pcworld.com.

Courtesy of my living close to Newegg's warehouse (it's only a state away), it arrived several days ahead of its original delivery date, which meant that I had an opportunity to install the sucker a lot sooner than I expected.

Nah, man. If I can figure out how to get a refrigerator
to fit in a small kitchen, I can buy the right sized
graphics card. From catplanet.org.

The only surprise I had during installation was that the power connection was on top of the damn card, not on the side, which meant that I had to get creative in terms of making sure the card fit around the case frame. Still, the installation and driver updates went fairly smoothly, and the card itself is quieter than the old HD7700 it replaced.

***

Over the past week I've had a chance to sit down and try a stable of games with the new card to see what sort of difference it made to the graphics settings.

Now, I don't have a game that's less than two years old (the newest are Wildstar, from 2014, and Mass Effect 3, from 2012), so the games I have don't really stress a graphics card like a current gen game would (such as, say, Witcher III or Black Desert Online). Still, this card ought to handle even those current gen games without much issue.

As for the games I own? Let's just say that one game in particular surprised me. A lot.

LOTRO experienced some lag when entering certain areas (such as around Emyn Lun in Mirkwood), as if the game were busy loading data from the LOTRO servers. Given that LOTRO is closing in on its 10th anniversary next year, I wasn't expecting the graphical lag that I got. But once that initial lag was over, the game ran smoothly.

I've checked online a bit, and discovered that I'm not the only person who has had these issues with LOTRO, and that they might actually be due to the game's architecture. I can't really say, but it is definitely the only game where I've experienced this issue.

But the graphics... Oh, yes. All of the little LOTRO graphics options are selected, and it makes a big difference in the game's backgrounds. Effects such as fog are much more realistic now, and background scenery is far more detailed. I can stand on the northern Dwarven outpost in Angmar and look down at Imlad Balcorath in the distance and see all of the details, something I couldn't see before without sacrificing framerates.

Not too surprisingly, the game that benefits the most from the new graphics card has been SWTOR. The graphics engine for SWTOR is a bit clunky --even Bioware admits that-- but with a 4 GB card the game finally shines. I can actually set the shadow detail on high and get good framerates; no blobby shadows for me anymore. I really need to get over to my own personal hell, Alderaan, and see how the game holds up now. That used to be the place where my old graphics card went to cry in a corner, so if the new one can handle that place, it can handle anything SWTOR throws at it.

Before the new card, those shadows would be blobs.
From mmorpg.com.


As for other games, the weirdest result I got was when loading Star Trek Online. It bitched that I didn't have the current graphics drivers, but then proceeded to load up the highest settings anyway. Something tells me that Cryptic Studios needs to update their graphics card driver database. Neverwinter and Wildstar looked better, but not overwhelmingly so, as did Age of Conan.

Now, if there were a way for your Guild Wars 2 toons to look more, well, lived in with higher graphics settings and not as pristine as they do...

***

Was it worth it to upgrade?

Well, since I had no real choice, yeah. But if you mean compared to the old card, then yes to that too. I believe that the bigger boost to my system, however, would come from replacing the HDD with an SSD. But that is an adventure for another time.

Besides, I've got other items to worry about for the next few months, such as university applications.

Oh yay.





*Why change out the power supply when I can find a card that works? Sure, it'd be nice to get a Sapphire Nitro RX470, but not if it meant spending an extra $50-100 on a power supply.

