
Finally upgrading GPU from GTX580 black Friday help

Even when my card reaches 70°C it only uses 40% fan speed. You'd think it would use more, but even in my case of poor airflow due to the case being too small, the card hasn't seen over 73°C!
 
I'm not sure where the high-temp feedback on the R9 390 cards is coming from, as mine is almost silent and runs cooler than my HD 7970s did... good upgrade.
 
It's all you read on review sites and it's all people say when recommending it!

Played Fallout 4 maxed out on everything at 1080p, and it does make an ugly-ish grey game look nice. 2½ hours and it stayed at around 70°C. Though it played perfectly smoothly, the GPU usage reading in MSI Afterburner was up and down non-stop on the graph.

Bought Shadow of Mordor last night, £6 for the Game of the Year edition with like 20 add-ons!
 
I've got my i7 950 at 4GHz on very low volts and it's extremely stable. During gaming the processor sometimes gets to about 67°C, which is more than acceptable, but on Prime it's like 85°C+.

I doubt I'll bother OC'ing the card as the gains are marginal.
 
I am very surprised how quiet it is compared to my GTX 580. Even after an hour of gaming I can't hear the fans, and it's at least 10°C cooler than my 580.

I was worried because every review out there says the 390 is hot and power hungry, so I assumed it would be mega loud trying to cool it.

More often than not it's exaggerated, hyperbolic nonsense from fanbois or shills masquerading as reviewers. It comes from a time when AMD, in their stupidity, used a totally inadequate reference cooler on the 290s. That was a real issue, but now some people just use the same rhetoric for clickbait.
They are good, cool, quiet and powerful cards; the best thing anyone can do is ignore YouTubers and look at the temperature and noise slides from established reviews. They don't actually run any louder or warmer than any other card.
 

I have left the card on defaults as well.

I like that while I'm on the desktop the 3 fans are off!

what volts are you using?
 
I've got 1.2V set in the BIOS, though monitoring through CPU-Z it's around 1.18V.

[CPU-Z screenshot]
 
I thought Freesync had more features enabled than G-Sync, like everything is still enabled with Freesync. G-Sync doesn't, but as a result can provide a higher refresh rate on some monitors.

I'm way late replying to this which is slightly off topic now anyway, but figured I would anyway :P

Freesync has the big advantage of being an open standard. It's a slight change to the DisplayPort spec and can be implemented on almost any existing display scaler chip via a firmware change. This means that Freesync monitors, like any normal monitor, tend to have multiple inputs etc, and support all the features they did before the firmware change.

The downside is that not all scaler chips are ideally suited to this, as most were designed before Freesync was a thing, so you do get some cases where the monitor maker has slapped a Freesync label on for some more sales but it only has, say, a 48-60Hz Freesync window, making it pointless. There are 1440p 144Hz monitors that, in Freesync mode, only go to about 90Hz, making them really 1440p 90Hz monitors.


GSync on the other hand, is proprietary. This has the advantage of being owned and overseen by nVidia. If monitor maker (x) wants to make a GSync monitor, nVidia have to sign off on it or they can't make it. nVidia care about their GSync brand, so if you propose to make a GSync monitor with a 48-60hz range, nVidia will laugh at you and you won't be making a monitor today.

It also uses a custom-designed nVidia scaler chip in place of the original scaler chip. This is partly how nVidia control it: we won't sell you scaler chips to put into crap monitors. This has a disadvantage too: nVidia designed this chip for GSync, and the first version had no facility for secondary inputs etc, although later versions do. This is why most GSync monitors have only one display input. It also adds cost.

Until Crimson, nVidia had much better handling of the situation if you went below your variable minimum, as they doubled the frame rate by displaying each frame twice, bringing you back into the range. Crimson has fixed this for AMD cards, but only if your Freesync range is wide enough (on, say, a 48-60Hz Freesync monitor, double 40FPS and you get 80, which is still out of range).
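That frame-multiplication idea can be sketched in a few lines of Python. This is purely illustrative, not AMD's or nVidia's actual driver logic; the function name and the simple multiply-until-it-fits loop are my own assumptions:

```python
def lfc_refresh(fps, min_hz, max_hz):
    """Return the refresh rate after frame multiplication, or None if
    no whole multiple of fps fits inside the [min_hz, max_hz] window.
    Illustrative sketch only, not real driver code."""
    if min_hz <= fps <= max_hz:
        return fps  # already inside the variable refresh range
    multiple = 2
    # Show each frame 2x, 3x, ... until the effective rate lands in range
    while fps * multiple <= max_hz:
        if fps * multiple >= min_hz:
            return fps * multiple
        multiple += 1
    return None  # window too narrow: compensation can't help

print(lfc_refresh(40, 48, 144))  # 80 -- wide window, doubling works
print(lfc_refresh(40, 48, 60))   # None -- 80 overshoots a 48-60Hz window
```

This also shows why compensation needs roughly max ≥ 2 × min: on a 48-60Hz window there is no multiple of 40 that fits, so frames below 48FPS can't be rescued.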

GSync effectively costs more for a much more polished solution. Freesync can be good but has an element of buyer beware about it as there are crap implementations.

Still, Apple manage to sell overpriced devices because they just work, so there is room in the market for both I think.
 
It's weird, as it's maxed out on all settings except distance shadows and plays extremely smoothly; by the Afterburner graph it should be unplayable :-/

What's driver cleaning?
 
If you've got a capable PSU (which you have), there's very little reason to go for the 970 over the 390 I think... unless you want specific Nvidia features.

You know, except the drivers... all I seem to hear now from my friends is AMD drivers this and AMD drivers that. Meanwhile I'm sat on Nvidia and the drivers just work.

That said, cracking card, and honestly I think "AMD = hot as the sun" is a big myth with a decent aftermarket cooler.
 