Poll: WOULD YOU BUY AN 8GB 290X @ £599

Would you buy one?

  • YES: 19 votes (6.7%)
  • NO: 266 votes (93.3%)
  • Total voters: 285
While I do want to go 4K, I think with the current gen we're forking out serious money for serious hardware to push 4K along (assuming you want max detail and such), whereas a year from now the next one or two GPU generations will take 4K in their stride.

A cool card. I'd get one if I had more money than sense, but unfortunately despite my lack of sense I've also got a lack of money! (Probably due to the lack of sense.)
 
You've got to be impressed at its default memory clock being 1400MHz rather than the factory 1250MHz on other cards. I see these added-RAM cards and think the memory probably won't co-operate with a half-decent clock, but obviously I'm wrong with this one.
 
Seriously? :confused:

Yes.

Reviewers have shown that to truly enjoy 4K a pair is ideal, but you might hit the 4GB VRAM limit in some games; with 8GB you won't have an issue.

If I was building a state-of-the-art 4K gaming PC I'd rather have three of these cards than, say, a pair of 295X2s. Three cards is proven to be the most efficient CrossFire setup, as the fourth card never adds a huge amount more performance, and with the 295X2 you still have the 4GB VRAM limit.
 
Hey Gibbo, my point being that even with two of the highest-end cards it's only *just* enough for today's games, and it makes no sense whatsoever even in the mid term. 8GB of RAM hardly future-proofs you for a 4K setup; in fact, in all the reviews RAM was rarely the limitation, it's more often the raw power of the GPUs available to date that's the limiting factor for 4K gaming.
 

You could use 3 or 4 GPUs. :)
 

Yeah I know, but then you're choice-limited. Everyone seems to fixate on the latest benchmark game, recently it's been BF4, but it's really not the only game out there. There are MANY other titles that people play that don't scale well at all in CrossFire or SLI but are still heavily GPU-dependent.
 

In honesty, one 290 on its own will manage 4K and two will be great, but if you want all the bells and whistles you will need three as a minimum, and of the 8GB variant. I would love another Titan and will see what the MM brings around June, which should see me settled for a couple of years, but for those that like AMD and are going 4K, there is plenty of choice :)
 
I think heat is a concern for me. These would want to be running 80°C tops under hours of load, as playing Rift at top settings canes even double-GPU setups. Plus yes, I will be one of the early 4K adopters, but not THIS early. I do have to have everything as soon as it exists, but at the moment I don't feel that 4K exists as a scalable platform. Once we have a standard connection that can offer 120Hz and 4K then I'll invest in it, but I need a monitor that will last 10 years. My Dell 30" is nearly 7 years old and I don't feel compelled to get into 4K until everything is standardised.
 
Dell 30-inchers are EPIC! Especially the early ones with very little lag. Used to love mine :cool:

Don't think I can let go of my ZR30w for a while yet either, I guess I'll just have to add the AA@1600p. Poor me :( :p
 
In honesty, one 290 on its own will manage 4K and two will be great, but if you want all the bells and whistles you will need three as a minimum, and of the 8GB variant. I would love another Titan and will see what the MM brings around June, which should see me settled for a couple of years, but for those that like AMD and are going 4K, there is plenty of choice :)


You're telling me that as an enthusiast you're going to be happy running lowered settings just to have that increase in resolution?

When I first got my 1440p monitor I had a 7950; it struggled in most things on high settings and in quite a few things on medium detail (I knew it would). Games maxed out at 1080p looked much better than lowered eye candy at 1440p, so my plan was to upgrade to another 7950/7970, and that's what I did. Games that scaled well looked awesome at 1440p; in games that didn't, I was stuck with lowered detail, which isn't ideal. Running outside the native resolution looks like ass, so that isn't really an option, and no matter what clever tricks I've seen modern displays try to alleviate this issue, it still looks like ass.
 
People consider max settings to be as far as the slider can go. TBH I don't like some of the graphics options games offer (motion blur :(), so whether I had all the grunt in the world or not, I still wouldn't run max settings.

Just because you prefer max graphics settings doesn't mean everyone else does. I personally can live without things like max AA and would happily sacrifice some options to run at a higher resolution or get better frame rates.

Gaming image quality is down to personal preference.

I will tell you as an enthusiast that I would be willing to sacrifice some settings to optimize frame rate and up resolution.
 