
AMD Polaris architecture – GCN 4.0

Exactly. I genuinely don't care who was first, who did this, who did that. I have a G-Sync monitor, like others, and others have Freesync monitors, and we all seem happy with them, so why does it really matter? I am certainly not out for point scoring, but if DM and others want to believe Freesync was first, that's cool with me :)

But of course Freesync was first. Freesync begins with an 'F' whereas G-Sync begins with a 'G'. That puts Freesync one whole letter in front of G-Sync. If Nvidia had been first then they would have called it E-Sync...surely.

It's not rocket science you know. :rolleyes:

:p
 
Nvidia also have more aggressive driver-level geometry LOD.

This is what I can't stand about using nVidia. I can see it when I'm aiming at the edge of my vision (I couldn't see it in BF4, but I could *DEFINITELY* see it in BF2, Crysis, and other games). I kept fiddling with every setting I could to figure out why I couldn't see my targets until it was too late from some of my favorite sniping spots.

When I finally figured it out, the performance and smoothness that nVidia offered for the money was abysmal, so I sold the card (560Ti), bought a 7870XT for a song, and never looked back at nVidia for my personal rig.

Their video playback quality is worse as well.

I really hope AMD's solution doesn't have the same impact on quality - probably not an issue for tessellation.
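
For anyone wondering what "driver-level geometry LOD" means in practice, here's a minimal sketch of distance-based LOD selection with a driver-side bias. The thresholds, bias value and function names are made up for illustration and have nothing to do with Nvidia's actual driver internals; the point is just that a more aggressive bias swaps in coarser models sooner, which is why distant targets can get harder to pick out:

[code]
# Minimal illustrative sketch of distance-based LOD selection with a
# driver-side bias. All numbers and names are hypothetical, not taken
# from any real driver.

LOD_THRESHOLDS = [50.0, 120.0, 300.0]  # metres: switch points LOD0 -> 1 -> 2 -> 3

def select_lod(distance_m, driver_bias=1.0):
    """Return the LOD index for an object at distance_m.
    driver_bias > 1.0 pushes objects into coarser LODs sooner."""
    biased = distance_m * driver_bias
    for lod, threshold in enumerate(LOD_THRESHOLDS):
        if biased < threshold:
            return lod
    return len(LOD_THRESHOLDS)  # past the last threshold: coarsest model

# A target at 110 m: LOD1 with a neutral bias, but LOD2 with an
# aggressive 1.3x bias, i.e. fewer polygons and harder to spot.
print(select_lod(110.0, driver_bias=1.0))  # 1
print(select_lod(110.0, driver_bias=1.3))  # 2
[/code]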
 
But of course Freesync was first. Freesync begins with an 'F' whereas G-Sync begins with a 'G'. That puts Freesync one whole letter in front of G-Sync. If Nvidia had been first then they would have called it E-Sync...surely.

It's not rocket science you know. :rolleyes:

:p

true dat, anyone who cant see it is a bit fick init
 
true dat, anyone who cant see it is a bit fick init

My kind of humour :D
 
The Sun is out, Polaris.

The future's getting brighter
Hip hip hip hooray
The future's getting brighter
So said AMD yesterday

The best future is bright shining star glinting wonderfully burning our retinas and melting our faces off with AMD
 
The future's getting brighter
Hip hip hip hooray
The future's getting brighter
So said AMD yesterday

The best future is bright shining star glinting wonderfully burning our retinas and melting our faces off with AMD

You're aware the 980Ti has higher power consumption than the 290X and Fury X, especially when overclocked, right? :)
 
As for the reference 980Ti's power requirement: since the 980Ti's considerably faster than the 290X and generally faster than Fiji, I wouldn't give a **** that it could use more juice.
 
You're aware the 980Ti has higher power consumption than the 290X and Fury X, especially when overclocked, right? :)

I'm sorry, there was no hidden meaning in my post. I'm impressed you managed to find some, and that you thought I was having a go at AMD :p

The future's more viscous with Durex.
 
No, just no!

Yes....

Also, I'll use the same website for my pictures, not results from different review websites like you did....

Anandtech results:

980ti overclocked (very common to overclock the 980ti, since it's such an overclocker's dream :))
[Image: Anandtech power consumption chart – overclocked 980ti]


980ti stock, Fury X stock:

[Image: Anandtech power consumption chart – stock 980ti vs stock Fury X]


Now remember, Fury X is a very poor overclocker, as we are constantly reminded on this forum.

Either way, AMD and NVIDIA are very close power-consumption-wise on their top-tier GPUs this generation, so the jokes about Polaris/sun output are just nonsense.

Not my picture below (note the typos!) but generally quite truthful.
[Image: power consumption comparison]
 
Well clearly you was wrong and if you are going to try and make someone look silly, you best get it right from the off or it backfires lol and whoever did that image needs to go back to school. Very poor effort - 1/10
 
Well clearly you was wrong

needs to go back to school.

I agree, that grammar is atrocious!

If you look carefully at the reputable benchmarks I linked from Anandtech, you can see that the overclocked 980ti has higher power consumption than the Fury X.

So no, I wasn't wrong about this.

Since the 980ti is an 'overclocker's dream', I feel it's very relevant to mention, as many people are overclocking these cards.
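
To put rough numbers on the overclocking point, here's a back-of-the-envelope sketch. The 250 W and 275 W figures are the public reference board powers for the 980 Ti and Fury X; the +20% clock / +6% voltage overclock is an assumed example, and P ∝ f·V² is only a crude approximation of the dynamic part of board power:

[code]
# Rough sketch of why an overclocked 980 Ti can out-draw a stock Fury X.
# Reference board powers are public figures; the overclock offsets are
# assumed examples, and P ~ f * V^2 only models dynamic power.

STOCK_980TI_W = 250.0  # reference GTX 980 Ti board power
STOCK_FURYX_W = 275.0  # reference Fury X typical board power

def overclocked_power(stock_w, clock_scale, voltage_scale):
    """Estimate board power after an overclock, assuming P scales with f * V^2."""
    return stock_w * clock_scale * voltage_scale ** 2

# Assumed example: +20% core clock with a +6% voltage bump.
oc_980ti = overclocked_power(STOCK_980TI_W, 1.20, 1.06)
print(f"overclocked 980 Ti ~{oc_980ti:.0f} W vs stock Fury X ~{STOCK_FURYX_W:.0f} W")
# -> roughly 337 W vs 275 W, consistent with the direction of the
#    Anandtech overclocked-power chart referenced above.
[/code]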
 
Well clearly you was wrong and if you are going to try and make someone look silly, you best get it right from the off or it backfires lol and whoever did that image needs to go back to school. Very poor effort - 1/10

Not really. He said OVERCLOCKED. You posted stock-speed specs. Does anyone run a 980ti at stock? I believe these 980tis only beat Fury Xs if they are massively overclocked, which puts them way past the wattage draw of a Fury X, which is the point Dave is trying to get at.

Since the 980ti is an 'overclocker's dream', I feel it's very relevant to mention, as many people are overclocking these cards.
 