Have AMD stopped competing?

Nvidia are unlikely to be as complacent as Intel have been; you can bet they watched the kicking Ryzen dished out last year, when everyone thought AMD could never make a competitive chip again, and took notes.
Will the next Nvidia chip just be a more optimised version of the current one? DX11 won't last forever, and Nvidia playing that card all the time won't help them; after all, Intel played the four-core card for long enough and look what happened.
 
The 290/290X were very competitive cards when all was said and done - again though, AMD kind of got screwed over by node-related issues, being stuck on 28nm for so long.

I think we'd have been a lot less critical at the time had we known it was all downhill for AMD/RTG from then on :p
 
DX11 won't last forever and Nvidia playing that card all the time won't help them

In the past nVidia have never jumped on tech like that before it's ready (tessellation and DX10, just to name two), and when they do jump on it they've so far always capitalised on it - there is no real reason to believe it will be any different this time (other than sour grapes from certain people).
 
In the past nVidia have never jumped on tech like that before it's ready (tessellation and DX10, just to name two), and when they do jump on it they've so far always capitalised on it - there is no real reason to believe it will be any different this time (other than sour grapes from certain people).

Nvidia has never contributed to improved image quality. They only work towards higher framerates, even at the expense of a severe reduction in image quality.
This works when convincing people to choose their products, because most people look at the numbers and nothing else.
 
Nvidia has never contributed to improved image quality. They only work towards higher framerates, even at the expense of a severe reduction in image quality.
This works when convincing people to choose their products, because most people look at the numbers and nothing else.
I've had a variety of both ATI/AMD and nV cards. I honestly can't say I've ever noticed this mysterious/mythical improved image quality that AMD are supposed to have. Maybe because I'm not analysing still frames but playing the games...? It's never been noticeable to me.
 
I've had a variety of both ATI/AMD and nV cards. I honestly can't say I've ever noticed this mysterious/mythical improved image quality that AMD are supposed to have. Maybe because I'm not analysing still frames but playing the games...? It's never been noticeable to me.

Tested cards: Radeon and GeForce, Counter-Strike Source, different maps:

[Three pairs of screenshots of the same scenes, Radeon vs GeForce]

On the first image: the Radeon provides more detail, more dust particles visible in the tunnel, and less washed-out colours;
On the second image: the trees on the Radeon image have more leaves;
On the third image: the trees on the Radeon image have more leaves.
 
Wish folk would stop looking just at frame rates; there are lots of important things in a card other than how many frames it can spit out or how much power it uses doing so.
What about image quality (although I doubt there is much between them), minimum frame rates, noise, size and the quality of the driver experience - some of which has hurt AMD in the past but isn't too much of a problem now. Well, apart from cost, but er, well, that's a whole different moan. :p
 
I'm struggling to see differences in those stills, and I don't think I'd notice them during play. I guess you're more critical of that kind of thing than I am. Nobody is wrong in this case; I don't personally see enough differences there to base my purchasing decision around them.
 
I've had a variety of both ATI/AMD and nV cards. I honestly can't say I've ever noticed this mysterious/mythical improved image quality that AMD are supposed to have. Maybe because I'm not analysing still frames but playing the games...? It's never been noticeable to me.

Probably because Nvidia have had the best image quality for the last decade and ATi/AMD have been playing catch-up ever since... and still are.

Current Vega imagery has worse IQ than the latest Nvidia hardware... so it seems ATi/AMD still have a way to go... especially in VR.
 
I'm struggling to see differences in those stills, and I don't think I'd notice them during play. I guess you're more critical of that kind of thing than I am. Nobody is wrong in this case; I don't personally see enough differences there to base my purchasing decision around them.

TBH... in those stills, the ones labelled Radeon show the best IQ.

I struggle to believe it though, because every video I have ever seen, every comparison I've done myself and every other comparison I've ever seen has shown Nvidia to have the lead on IQ... I wouldn't be surprised if those screenshots have been purposefully mislabelled.
 
Probably because Nvidia have had the best image quality for the last decade and ATi/AMD have been playing catch-up ever since... and still are.

Current Vega imagery has worse IQ than the latest Nvidia hardware... so it seems ATi/AMD still have a way to go... especially in VR.

TBH... in those stills, the ones labelled Radeon show the best IQ.

I struggle to believe it though, because every video I have ever seen, every comparison I've done myself and every other comparison I've ever seen has shown Nvidia to have the lead on IQ... I wouldn't be surprised if those screenshots have been purposefully mislabelled.

There are millions of threads all over the net saying the same as what I said:

Hi everyone, I'm currently in a nightmare of a struggle choosing between a GTX 750 Ti or an R7 250. Obviously the GTX 750 Ti is much better and I found a very good offer to buy it; in my country's prices it's $2400 while the R7 250 is $2100. The thing is, I would have no problem putting $10 more into the 750 Ti, but I've heard that Nvidia video cards produce washed-out colors and that AMD gfx cards make better, more vivid colors? Is this true to some extent, and if it is, is there any way to correct the colors on the Nvidia gfx cards? thx everyone

Hello,

The GTX 970 and the GTX 960 are crushing blacks; it's really bad.

The GTX 980 and R9 290 look best. I'm giving the edge to the R9 290, because the GTX 980 looks slightly washed out.


All the best!

http://www.tomshardware.co.uk/answers/id-3033664/amd-nvidia-image-diferences.html
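
For anyone who'd rather measure the "crushing blacks" complaint from that thread than eyeball it, a rough sketch along these lines (assuming a dark-scene screenshot saved as capture.png - a filename made up for this example) counts how much of the frame sits at or near pure black:

```python
# Rough sketch: estimate black crush from a screenshot.
# Assumes a capture of a dark scene saved as capture.png (hypothetical name).
import numpy as np
from PIL import Image

img = np.asarray(Image.open("capture.png").convert("L"))  # 8-bit luminance

total = img.size
at_black = np.count_nonzero(img <= 2)   # pixels at (or essentially at) pure black
shadows = np.count_nonzero(img <= 16)   # the darkest 16 levels, where shadow detail lives

print(f"Pixels at black:       {100 * at_black / total:.2f}%")
print(f"Pixels in levels 0-16: {100 * shadows / total:.2f}%")
# A card/driver that crushes blacks collapses much of the 0-16 range down to 0,
# so a high "at black" share relative to the 0-16 share is the tell-tale sign.
```

Run it on the same scene from each card and compare the two shares; the absolute numbers depend entirely on the scene.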
 
I know the screenshots were meant to show image quality differences, but why are the FPS so vastly different in the top-left corner? In every single one, the AMD card is showing 5-10x the FPS of the GeForce. Which makes me instantly suspicious that something is not quite right...
 
I know the screenshots were meant to show image quality differences, but why are the FPS so vastly different in the top-left corner? In every single one, the AMD card is showing 5-10x the FPS of the GeForce. Which makes me instantly suspicious that something is not quite right...

Because the Radeon is an HD 4000 series card - I can't remember, though, whether it was the HD 4670 or the HD 4890. And the GeForce is an 8500 GT.
 
I don't think I have ever seen anyone claiming Nvidia has better image quality - quite the opposite, although whether that is right or not is another matter.

In any modern generation, when using equivalent settings, there is pretty much no IQ difference - there might be rare exceptions, but in 99.99% of cases the image is close to identical pixel for pixel. There have been times when, for instance, nVidia's 5000 series took shortcuts with DX9 shaders, inducing banding, lower resolution and/or lower bit depth and/or dithering when rendering those shaders, and both vendors have previously used driver IQ hacks to try and cheat benchmarks.

For some reason there can be a very, very tiny but perceptible difference in vibrancy of the image displayed on the monitor between them in some setups - I'm not sure why. AMD looks like nVidia does with the digital vibrance up 1-2%, but largely without the loss of detail in close-to-saturated areas you get doing that. (Back in the day I'd have said it was due to the different brands of chip used for the video output display controller, but with modern GPUs and digital output I can't imagine it works like that.)
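
If anyone wants to put the "identical pixel for pixel" and vibrancy claims to an actual test rather than eyeballing it, a minimal sketch along these lines (assuming two captures of the same frame at the same settings, saved as radeon.png and geforce.png - filenames made up for this example) will quantify both:

```python
# Minimal sketch: compare two screenshots pixel for pixel and estimate vibrancy.
# Assumes two same-size captures of the same frame/settings saved as
# radeon.png and geforce.png (hypothetical filenames).
import numpy as np
from PIL import Image

a = np.asarray(Image.open("radeon.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("geforce.png").convert("RGB"), dtype=np.int16)

diff = np.abs(a - b)                                   # per-channel absolute difference
identical = np.mean(np.all(diff == 0, axis=-1)) * 100  # share of pixels that match exactly
print(f"Identical pixels: {identical:.2f}%")
print(f"Mean per-channel difference: {diff.mean():.3f} (0-255 scale)")

# Rough proxy for "vibrancy": mean saturation in HSV space.
for name, arr in (("radeon.png", a), ("geforce.png", b)):
    hsv = np.asarray(Image.fromarray(arr.astype(np.uint8)).convert("HSV"))
    print(f"{name}: mean saturation {hsv[..., 1].mean():.1f} / 255")
```

Mean saturation is only a crude stand-in for perceived vibrancy, but it's enough to tell whether two cards are actually producing different colours or whether it's just monitor settings.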
 
AMD do tend to last better than NVidia historically due to their driver updates helping all the GCN generations of cards.
That was only true for the Kepler generation (they were affected by their lesser VRAM and compute capabilities). Tesla, Fermi and Maxwell remain pretty solid.

AMD suffered the same Kepler situation with Fiji; all Fury cards have aged horribly in 2017 and beyond. A 980Ti is now miles ahead of the FuryX, for example:

Dozens of 2017 games where the FuryX is barely faster than an RX580, and way behind the 980Ti
https://www.reddit.com/r/Amd/comments/7ckjj5/furyx_aging_tremendously_bad_in_2017/
[GameGPU] 980Ti is 30% faster than FuryX in 2017 games
http://gamegpu.com/test-video-cards/podvedenie-itogov-po-graficheskim-resheniyam-2017-goda
[HardwareUnboxed] GTX 980Ti is Aging Considerably Better Than FuryX!
https://www.youtube.com/watch?v=IeUyvEVB984
[ComputerBase] GTX 1070 (~980Ti) is considerably ahead of the Fury X
https://www.computerbase.de/2017-12...marks_in_1920__1080_2560__1440_und_3840__2160


The 5870 was a masterpiece that blew the 280 out of the water; even nVidia's GTX 480 response could not match it. Add to that, the GTX 480 was heavily criticized for all sorts of reasons, yet nVidia still sold something like twice as many GTX 480s as AMD did HD 5870s.
The GTX 480 took the crown from the HD 5870, and even the slightly fixed GTX 580 easily took the crown from the HD 6970. This is what matters in the end.

480 GTX didn't retake any performance crown. Fermi was late, launched with revision A3, was hot, was exploding with its weak power delivery circuit, and as usual for Nvidia, was expensive. Complete ****.
Your own chart shows the GTX 480 7% faster than the HD 5870 at 1600p; at 1080p it was 10% faster.

[Image: perfrel_1920.gif - relative performance chart]


The GTX 480 also came with 1.5GB of VRAM compared to the 5870's 1GB; this difference very quickly became important even at 1080p.
Exactly. That's why it's still faster to this day in modern games:
https://www.youtube.com/watch?v=g41owucTjgs
Nvidia has never contributed to improved image quality. They only work towards higher framerates, even at the expense of a severe reduction in image quality.
GameWorks seems to disagree with you; NVIDIA GPUs have access to far more exclusive IQ effects than AMD GPUs.
I don't think I have ever seen anyone claiming Nvidia has better image quality - quite the opposite, although whether that is right or not is another matter.
Nope, AMD had bad AF quality with the HD 5000 and HD 6000 series. That was a major problem back in the day.
Right now there is not a single difference between AMD and NVIDIA IQ-wise.
 
Probably because Nvidia have had the best image quality for the last decade and ATi/AMD have been playing catch-up ever since... and still are.

Current Vega imagery has worse IQ than the latest Nvidia hardware... so it seems ATi/AMD still have a way to go... especially in VR.

LOL well, I guess since all your other posts in this thread are completely inaccurate and not based in any kind of reality, not really surprised that this one is too.

I take it you are just joking at this stage. Nobody can be that wrong all the time.
 
LOL well, I guess since all your other posts in this thread are completely inaccurate and not based in any kind of reality, not really surprised that this one is too.
His assessment is not wrong; again, people with NVIDIA GPUs have access to superior shadowing techniques (HFTS, VXAO and driver-activated HBAO), better hardware PhysX, better AA (TXAA) and arguably better AF (though AMD fixed most of its DX11-era AF problems).
 