AMD Radeon R9 290X with Hawaii GPU pictured, has 512-bit 4GB Memory

mmm selective benchmarks

The fun continues - just release the reviews of the things and let us make our own choice.

It's not going to change anything.

We love selective benchmarks - if you look at the one I posted earlier, it shows a HD 7970 beating a Titan on minimums in BioShock.:D
 
"There’s not much more I can say at this point, except that I have several cards tested at 3840x2160, and R9 290X doesn’t just do well against the GeForce GTX 780…"

I think this statement means it does well in comparison to Titan as well.
 
"There’s not much more I can say at this point, except that I have several cards tested at 3840x2160, and R9 290X doesn’t just do well against the GeForce GTX 780…"

I think this statement means it does well in comparison to Titan as well.

Maybe - but on that game, at that resolution, and with "those" settings.
 
Sorry for being an obnoxious ******* here, but it seems some people are deliberately ignoring these very simple basics. How anyone can manage to miss such basic concepts that appear in a one-line quote is astonishing.

I did not think you were being obnoxious, you were just making a point.:)
 
Anandtech show it equal to the GTX 780 at sub-30 FPS in 4K Ultra BioShock Infinite, though one of the comments mentions the R9 290X was in Quiet Mode.
 
@techreport

The numbers show the R9 290X beating the GTX 780 in two games, BioShock Infinite and Tomb Raider, at a 3840x2160 resolution. In case you're wondering, that's the resolution of Asus' PQ321Q, one of the first desktop 4K monitors on the market. Here's everything AMD is making public at this juncture:

BioShock Infinite
Setting: 3840x2160 systemspec=ultra

AMD Radeon R9 290X "Quiet Mode" average frame rate: 44.25
NVIDIA GTX 780 average frame rate: 37.67

Tomb Raider Setting: 3840x2160 16xAF Tressfx=off aatype=fxaa texturequality=ultra shadowmode=on shadowresolution=high
ssaomode=ultra dofquality=ultra reflectionquality=high lod=ultra postprocess=on highprecisionrt=on tessellation=on

AMD Radeon R9 290X "Quiet Mode" average frame rate: 40.2
NVIDIA GTX 780 average frame rate: 34.5
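For anyone wondering how big those leads actually are, here's a quick sanity check of the margins implied by the figures quoted above. This is just arithmetic on AMD's own published averages, not independent measurements:

```python
# Average-FPS figures exactly as published by AMD (via techreport).
# (R9 290X "Quiet Mode", GTX 780) per game.
bench = {
    "BioShock Infinite": (44.25, 37.67),
    "Tomb Raider": (40.2, 34.5),
}

for game, (r9_290x, gtx_780) in bench.items():
    lead = (r9_290x / gtx_780 - 1) * 100  # percentage lead of the 290X
    print(f"{game}: R9 290X leads the GTX 780 by {lead:.1f}%")
```

So roughly a 16-17% lead in both titles - assuming, of course, that these cherry-picked numbers hold up in independent reviews.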
 
Interesting that AMD had the balls to challenge a 780 in BioShock Infinite with DDOF enabled. This is a setting (DDOF) that notoriously favours Nvidia cards. It offers zero image quality benefit but hurts both sides' fps considerably. Sadly, most review sites bench it. Let me tell you, the default Ultra preset is better and this setting favours AMD cards strongly.


Typo, I think, Tommy. Otherwise the two graphs he just posted contradict everything he says. :D
 
First post - been lurking for weeks as I peruse the posts in aid of picking out my new GFX card, but this thread has both made me laugh my arse off and annoyed me. Namely the nVidia fanboys strutting around and rubbing in the rumoured temps from a single benchmark that means sod-all in the grand scheme of things.

Thanks to them I'm now firmly in the AMD corner. I'm a sucker for a good underdog, and the Radeon cards are just better at what I require them for (£ per perf (I'm a skint bugger) and Lux renderer performance). But this thread was important, as if I ever have the cash a 290 looks like the card for me; I'm not doing 4k gaming, and it would be nice to have an AMD beast to brag about :P.

Now to the point I wanted to make: it's about Kaapstad and his comment on them picking a 780 because of its vram shortage in comparison. He's having a dig at AMD, but why wouldn't they pick a 780? It's the card in direct competition with them for the price range they might sell at, and the vram comparison is much better than using a Titan (a 33% advantage over the 780's 3GB, rather than a 50% disadvantage against the Titan's 6GB). So not only are they pointing out that you can get superior 4k performance, but you can get it for around the same price (yes, I know the prices aren't released yet, but there's no way in hell they'll price it in the Titan range).

Now, the fact that 4k comparisons are pointless right now means little; both companies are hitching their wagons to the 4k train, and there's no point in them picking a Titan because if you have no budget, the Titan will be king if you want 4k gaming (that 50% extra vram).

There is absolutely no reason for comparing a GTX 780 to a 290X @4k, as nobody who buys either will use them at that resolution. The only reason I can think of for AMD to do the above is to make the 290X look better (due to its extra vram) than the GTX 780 in a pointless exercise. If AMD are into pointless exercises, they should have used a Titan for the comparison.

Perhaps the smug nVidia fanboys should hold their tongues until legitimate benchmarks and temp figures come out, lest they end up having to consume tons of humble pie for rubbing it in for weeks on end. But of course they'll still have Batman smoke...

I may own some NVidia cards but I also have two HD 7970s, two HD 5970s and have four R9 290Xs on order.:D
 
@techreport


The numbers show the R9 290X beating the GTX 780 in two games, BioShock Infinite and Tomb Raider, at a 3840x2160 resolution. In case you're wondering, that's the resolution of Asus' PQ321Q, one of the first desktop 4K monitors on the market. Here's everything AMD is making public at this juncture:

BioShock Infinite
Setting: 3840x2160 systemspec=ultra

AMD Radeon R9 290X "Quiet Mode" average frame rate: 44.25
NVIDIA GTX 780 average frame rate: 37.67

Tomb Raider Setting: 3840x2160 16xAF Tressfx=off aatype=fxaa texturequality=ultra shadowmode=on shadowresolution=high
ssaomode=ultra dofquality=ultra reflectionquality=high lod=ultra postprocess=on highprecisionrt=on tessellation=on

AMD Radeon R9 290X "Quiet Mode" average frame rate: 40.2
NVIDIA GTX 780 average frame rate: 34.5

Why would they disable TressFX and enable DDOF? Methinks AMD are going easy on the 780 here.
 
Quiet mode???

Have to say, if the card has a quiet mode, I find that worrying.

In what way? I would guess it simply means you can enable "who cares about noise" mode and the card will run faster and louder. This is no different than enabling a custom fan profile to allow higher overclocks, though it is most certainly easier to do.

Personally I am waiting to see what an average OC is like on the custom cooled versions.
 
First post - been lurking for weeks as I peruse the posts in aid of picking out my new GFX card, but this thread has both made me laugh my arse off and annoyed me. Namely the nVidia fanboys strutting around and rubbing in the rumoured temps from a single benchmark that means sod-all in the grand scheme of things.

Thanks to them I'm now firmly in the AMD corner. I'm a sucker for a good underdog, and the Radeon cards are just better at what I require them for (£ per perf (I'm a skint bugger) and Lux renderer performance). But this thread was important, as if I ever have the cash a 290 looks like the card for me; I'm not doing 4k gaming, and it would be nice to have an AMD beast to brag about :P.

Now to the point I wanted to make: it's about Kaapstad and his comment on them picking a 780 because of its vram shortage in comparison. He's having a dig at AMD, but why wouldn't they pick a 780? It's the card in direct competition with them for the price range they might sell at, and the vram comparison is much better than using a Titan (a 33% advantage over the 780's 3GB, rather than a 50% disadvantage against the Titan's 6GB). So not only are they pointing out that you can get superior 4k performance, but you can get it for around the same price (yes, I know the prices aren't released yet, but there's no way in hell they'll price it in the Titan range).

Now, the fact that 4k comparisons are pointless right now means little; both companies are hitching their wagons to the 4k train, and there's no point in them picking a Titan because if you have no budget, the Titan will be king if you want 4k gaming (that 50% extra vram).

Perhaps the smug nVidia fanboys should hold their tongues until legitimate benchmarks and temp figures come out, lest they end up having to consume tons of humble pie for rubbing it in for weeks on end. But of course they'll still have Batman smoke...

Fair play for having the balls to say what a lot of us think. It's going to make you public enemy No. 1 for a while, though. :D
 