
Which GPU £280 budget

Status
Not open for further replies.
PAST being the keyword.

Last time I checked the 970 still has 3.5GB + 0.5GB, so it's not in the past, is it?

8GB for a 390 is far too much of an overkill, even for two cards. If you filled the 8GB with high-resolution textures, the combined processing of the GPUs, or GPU in this case, would not be able to handle it. It was a cheap selling ploy. Do you think Nvidia are stupid, releasing a top-of-the-range 980 Ti with 6GB of VRAM? No, they are sensible and not trying to fool people.

Shadow of Mordor with Ultra textures needs over 6GB of VRAM at 1080p, so 8GB is not overkill at all.

Exactly, old tech. Have you seen the opening credits for Metal Gear Solid V? Do you think they would put Nvidia there if they thought the game would be crap on the hardware? The game runs flawlessly on my 980 Ti at Ultra 4K with my G-Sync monitor, which basically gives you the illusion of gaining 20-30FPS.

Like I've said, the Maxwell GPU architecture of the 970/980 is the most advanced in existence. And with ports like Metal Gear Solid V, Mad Max, Assassin's Creed Unity, Tomb Raider: Definitive Edition... running amazingly on GeForce cards, obviously the developers have got a handle on the Maxwell architecture already.

MGS runs at 60fps on an Xbox One, so it's not exactly taxing for any mid-range GPU. They put Nvidia in the opening credits because of the GameWorks program, where Nvidia works with and funds/helps the devs, so is it surprising?


G-Sync costs more for more than just that reason; it actually performs better too. Maybe you're an AMD rep (what a stupid thing to say).

G-Sync monitors have a wide frequency range, but so do the better-specced FreeSync monitors. You obviously haven't used a FreeSync monitor and are just going by reviews and looking at what the very cheap monitors can do.

It's getting late and accusations have started so that's me finished.
 
The main problem is not the 3.5GB but the way the last 512MB is handled (very low memory bandwidth) - the OP said they want a three-year lifespan.

My GTX 660 has something similar, and so did the GTX 660 Ti, which meant that as time progressed the equivalent AMD cards like the HD 7870 and HD 7950 started to pull ahead, especially with AA enabled.

They need to consider that in another year or so Pascal will be out, and looking at the way HBM works, you are not going to have similar issues with fast and slow memory pools. So they will need to consider whether future driver revisions in the second and third years will still be written with the unusual memory configuration in mind.

We will need to wait and see TBH!!
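The slow-segment point can be put in rough numbers. A minimal back-of-the-envelope sketch in Python, assuming the widely reported figures of roughly 196GB/s for the 3.5GB pool and 28GB/s for the last 0.5GB (illustrative assumptions, not measurements of any particular card):

```python
# Back-of-the-envelope: effective bandwidth of the GTX 970's split
# memory pools. Bandwidth figures are the commonly quoted ones and
# are assumptions for illustration only.
FAST_GB, FAST_BW = 3.5, 196.0   # GB, GB/s - the fast pool
SLOW_GB, SLOW_BW = 0.5, 28.0    # GB, GB/s - the slow last 512MB

def effective_bw(used_gb):
    """Average bandwidth (GB/s) when a frame touches `used_gb` of VRAM."""
    fast = min(used_gb, FAST_GB)
    slow = max(used_gb - FAST_GB, 0.0)
    seconds = fast / FAST_BW + slow / SLOW_BW
    return used_gb / seconds

print(round(effective_bw(3.5)))  # all traffic stays in the fast pool
print(round(effective_bw(4.0)))  # spilling into the last 0.5GB
```

Once a frame spills into the slow segment, touching that last 0.5GB takes about as long as the whole first 3.5GB, so the average bandwidth drops sharply - which is why the card can age badly even though the full 4GB is addressable.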


The cost of G-Sync will reduce over time depending on how popular it is. If everyone had a weekend demo of it, they'd buy one on Monday! And I've just got a 60Hz one because of 4K! 144Hz must be beautiful...

Not at the rate adaptive-sync monitors will, since they are based on scalers made by multiple companies, it's part of a VESA standard, and ultimately, since Intel is now on board, they will probably drop in price more quickly. We are seeing the first £100 adaptive-sync monitors being released.

G-Sync uses an FPGA, which is a lower-volume and more expensive implementation, so it's going to cost more.
 
Maybe, just maybe, as the drivers for the R9 390 mature it could become a real star.

I wouldn't expect much more in the way of drivers maturing for a GPU that is already 2 years old. :)

Shadow of Mordor with Ultra textures needs over 6GB of VRAM at 1080p, so 8GB is not overkill at all.

Good job this thread isn't about a Fury then, isn't it. ;)

Opening poster, just get the MSI 390 while it is on TWO, seeing as it is cheaper than the 970 and you have said the bundled game with the 970 won't sway you.
 
TODAY ONLY
At an amazing price

Definitely a good deal, especially if you're going to overclock it anyway; snap one up before they all go.

YOUR BASKET
1 x PowerColor Radeon R9 390 8192MB GDDR5 PCI-Express Graphics Card (AXR9 390 8GBD5-PPDHE) £229.99
Total: £229.99 (includes shipping; ex. VAT).


That's a good deal. If the OP misses this he will be kicking himself. I'm half tempted myself.

Check out the review:
https://www.techpowerup.com/reviews/Powercolor/R9_390_PCS_Plus/22.html
 
Arkham Knight, "the way it's meant to be played". :p

You do know there is an EDIT button; no need to clock up posts.

Arkham Knight? I have heard stories of it being dodgy, but I played it almost maxed at 1080p on my old GTX 970 without a hiccup, and can play it at 4K Ultra with my GTX 980 Ti.
 
Andygully said:
I have a 980 Strix and when running games it is not silent at all, haha. When there's no load the fans don't spin, but the first time I used it to game I was surprised at how noisy it was
If it's as loud as this, then I guess it's normal. Check from 37 seconds onwards.

MSI and Palit are the quietest at load.

 
The main problem is not the 3.5GB but the way the last 512MB is handled (very low memory bandwidth) - the OP said they want a three-year lifespan.

My GTX 660 has something similar, and so did the GTX 660 Ti, which meant that as time progressed the equivalent AMD cards like the HD 7870 and HD 7950 started to pull ahead, especially with AA enabled.

You're right, it isn't the memory, but I think you and quite a few others are not reading what I've been saying. The 970 houses the most advanced GPU architecture on the planet, and it is future-proof.

They need to consider that in another year or so Pascal will be out, and looking at the way HBM works, you are not going to have similar issues with fast and slow memory pools. So they will need to consider whether future driver revisions in the second and third years will still be written with the unusual memory configuration in mind.

We will need to wait and see TBH!!

As far as I know a driver was released that solved the 3.5GB memory issue; it is now recognised as 4GB.


Not at the rate adaptive-sync monitors will, since they are based on scalers made by multiple companies, it's part of a VESA standard, and ultimately, since Intel is now on board, they will probably drop in price more quickly. We are seeing the first £100 adaptive-sync monitors being released.

G-Sync uses an FPGA, which is a lower-volume and more expensive implementation, so it's going to cost more.

G-Sync monitors perform notably better than FreeSync in every article I've read or video I've seen. I challenge you to find an independent review that says FreeSync is better than G-Sync.
 
My final post.

If I was the OP I'd go with the GTX 970; the free game, if you don't want it, will sell for £25 or so, and you can get this for £269.99 with a five-year warranty.

That card is awesome for 1080p. As for Shadow of Mordor, it must be the worst port in history. It would use at most 3GB of VRAM on a PS4 at 1080p, probably 2GB actually, and the PC version requires 6GB of VRAM? No way on earth could a 390 process 6GB of high-definition textures etc.; hell, even a 980 Ti would struggle. Absolutely ridiculous.
 
Let me start by thanking each and every one of you for all your help and advice.

I will continue to shop at Overclockers because of you.

This forum is a credit and my one-stop hardware home.

Team green, team red, whoever you follow.

Kudos, guys, and thank you.

Here's the card I went ahead and pulled the trigger on.

MSI R9 390 GAMING 8G

https://www.overclockers.co.uk/showproduct.php?prodid=GX-289-MS

And here's why.

MSI - everything I've heard is good; I'm a new customer, so let's try them and see.

TWIN FROZR V COOLER - I've read nothing but good things about this cooler.

8192MB GDDR5 - a couple of blobs more of the fast stuff can't really be a bad thing, can it?

Back plate - They not only look cool, I think they're very practical.

They help to keep these big heavy cards rigid, they help keep the card cool,

and help keep dust and other crud off of the circuits.

A change of vendor to perhaps give the little guy a break.

I'm like that, it's just my way, so shoot me.

I think Nvidia did a real number with the whole memory thing.

I'm just really surprised that the people who follow this company and hand over their hard-earned money

still endorse a product that is quite obviously flawed or sold using false marketing.

I don't believe it cripples the card, but I do think there's an underlying flaw that's hidden with a driver trick.

I've used and loved Nvidia products forever, but this time round I'm going to have to pass.

Shame on you team green.

Maybe next time round.

My new card should be here tomorrow, can't wait :)

You guys really deserve credit, thank you all so much for your time and patience.

If I can do the same for you, please ask.

Paul.
 

Yeah, but he is right.

8GB of memory isn't a bad thing; it's a way of making people go "Look, it has 8GB of memory, I will buy it" without knowing that it will never be used. It is as bad a situation as the 3.5GB mistake (which didn't reduce performance, and is fixed) - just a selling ploy and very misleading.

As for Shadow of Mordor, that worked flawlessly on my last GTX 970, so I don't know where this "6GB" came from; its recommended spec is a GTX 660.

I didn't want to post again, but after giving so much time to this thread, realising the guy had beef with Nvidia all along and would have got a 390 anyway really annoys me, imo. Last time I will give advice.
 
Yep, in the monitor, but I expect if you keep the graphics card for over two years you would be a lot happier with the 390, due to probably better DX12 support and 8GB.

Maybe in a CrossFire config. The way VRAM works is that your CPU fills your VRAM up with textures, lighting etc., then your GPU has to process all of that VRAM almost instantaneously, then it's rinse and repeat. A 390 or a 970 couldn't process 8GB of this information; my card probably couldn't, hence it is 6GB.

The PS4 for example has beautiful graphics for a console, yet it will use no more than 3GB of VRAM, and that VRAM is shared system memory, so nowhere near as fast as the fastest VRAM in this price range, which is the 970's at around 7000MHz effective.

EDIT: Correction: the PS4 has GDDR5 system RAM, lol, that's faster than we get! Still, shared graphics memory is not as effective as dedicated VRAM.
 