
Poll: Do you think AMD will be able to compete with Nvidia again during the next few years?

  • Total voters: 213
  • Poll closed.
The article is from 10 years ago. Since then we have moved miles ahead.
Even that article was completely irrelevant when the 295X2 came out.

It's like you telling me right now that a 2-CPU system can only operate like those server ones from 2008, and that Infinity Fabric doesn't exist.

Well, the article is completely relevant because the technology with these PLX bridge chips is virtually the same. There is no Infinity Fabric connecting the Vegas.

This is HD 7990:


 
7990, don't remind me about that card. Virtually all the reviews said the same thing: it was such a quiet card. Yet the cards that went on sale were like hair dryers. It's like the review samples were all cherry-picked low-voltage cores which ran cool, while the actual production run chucked any old core into the card.
 

Still better than the nVidia Fermi fiasco.

 

So stop trying to shut down logical conversation when you don't personally know how this damn thing (Vega X2) works.
Because clearly you do not know.

You forgot to link to an article, I think, but it doesn't matter.
I know how Vega X2 works because AMD published how it works: exactly the same as all the other dual-GPU cards.


Stop looking for things that simply aren't there. For HPC workloads, CrossFire works perfectly well. All this talk about multi-chip GPUs is just a pipe dream in the short term. AMD's chief Radeon engineer said this won't happen any time soon. Don't listen to ridiculous YouTube videos from clueless AMD fanboys; just listen to what AMD itself says, and to the actual published peer-reviewed research papers from AMD and Nvidia.
 



A faulty card is hardly representative of how the majority of them went. In the case of the 7990, being hot and loud was the norm for the majority, as opposed to the cool and quiet review samples.

Mine hit the high 90°C range constantly and peaked at over 100°C on occasion.
 
Hardly a single card. Type "GTX 480 fail" into Google and see how many results come up...

Nvidia’s Fermi GTX480 is broken and unfixable https://semiaccurate.com/2010/02/17/nvidias-fermigtx480-broken-and-unfixable/

I am sure that when AMD presses nVidia with a superior architecture, nVidia will repeat these failures again.

You can also Google "7990 hot" and see tons of hits on the subject. A stark contrast to most of the reviews, which apparently tested something totally different to what went on sale.
 

Talking about hot... the 8600M was responsible for the MacBooks that used it burning out at some point.
Nvidia supplied Apple with GPUs that were out of the initial specs.

Also, the 7990 was a dual-chip solution.
The GTX 480 was a single chip... the one whose initial presentation was the mock-up with the wooden screws...
 
Aye, but some years later many were saying the 480 was a good card. Reviewers release dodgy reviews and the NV haters treat them as the bible, yelling "it's rubbish, too hot, blah blah blah". A few years later a different view is formed by those who care little about all the AMD vs NV nonsense.
I owned a 480 and thought it was great. I don't care what reviewer X or Y said when it was released. Half of the time they have their minds made up before even doing the review, probably :D (and this goes for AMD reviews too, I'm sure).
I'll be surprised if AMD presses NV again with a superior architecture. I think times are a little different now. It would be welcomed of course, and who knows, but NV have massive amounts they can throw at R&D if they want to.
 

The GTX 480 is a great card. On the other thread we're going to use it to make jerky.
 
AMD obliterates Nvidia in early Battlefield 5 benchmarks https://www.overclock3d.net/news/software/amd_obliterates_nvidia_in_early_battlefield_5_benchmarks/1

"During his testing, Battlefield 5 was played at Ultra settings using ASUS ROG Strix RX 580 and GTX 1060 graphics cards at resolutions of 1080p and 1440p. Under DirectX 11 the game ran at an average of 45FPS on Nvidia's GTX 1060 and 68FPS on AMD's RX 580. Minimum framerates also presented a larger gap, with minimum framerates dropping to 34FPS on Nvidia's GTX 1060 while AMD's RX 580 had a minimum of 56.

Moving into DirectX 12 the performance gap widened, with the average and minimum framerates of Nvidia's GTX 1060 dropping to 41 and 29 FPS respectively, while AMD's RX 580 maintained the same average framerate and achieved a higher minimum framerate of 59FPS. Under DirectX 12 AMD's RX 580 offered a minimum framerate that was 2x higher than its Nvidia counterpart."

So:
Battlefield 5 @ Ultra settings, 1440p and DX11
ASUS ROG Strix 580 - 68 FPS (average) / 56 FPS (minimum)
nVidia GTX 1060 - 45 FPS (average) / 34 FPS (minimum)

Battlefield 5 @ Ultra settings, 1440p and DX12
ASUS ROG Strix 580 - 68 FPS (average) / 59 FPS (minimum)
nVidia GTX 1060 - 41 FPS (average) / 29 FPS (minimum)
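For what it's worth, the quoted gaps are easy to sanity-check. A quick Python sketch using only the numbers from the article above:

```python
# Quoted Battlefield 5 results: (average FPS, minimum FPS) per card and API.
results = {
    "DX11": {"rx580": (68, 56), "gtx1060": (45, 34)},
    "DX12": {"rx580": (68, 59), "gtx1060": (41, 29)},
}

for api, cards in results.items():
    avg_ratio = cards["rx580"][0] / cards["gtx1060"][0]
    min_ratio = cards["rx580"][1] / cards["gtx1060"][1]
    print(f"{api}: RX 580 average is {avg_ratio:.2f}x, minimum is {min_ratio:.2f}x")
```

Under DX12 the minimum-framerate ratio is 59/29 ≈ 2.03, which is where the article's "2x higher" claim comes from; the DX11 gap is closer to 1.5x.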
Nvidia do this all the time when their new GPUs are on the horizon. They did it with Witcher 3 where Kepler was massively underperforming while Maxwell GPUs were getting stellar performance. Now that their new GPUs are releasing, Pascal is facing the chopping block. Once the 1180 releases it will steamroll the 1080ti and all previous gen nvidia cards and AMD as well. It's why I never buy older generation cards from Nvidia. AMD are better at supporting older cards than nvidia.
 
To be fair, on both systems I played this on it was far from smooth, with random dips for no reason. I still don't get why they're basing performance on an alpha; it's unoptimised, to say the least.
+1

No idea why people are getting all wet about results from some game that is not even in beta yet. lol.
 
That video looked far from 50-60 FPS, lol. It looked far more like 20 FPS with stutters.

Yes, there is severe stuttering. Who knows what runs in the background. By the way, the two sides of the screen are different - it isn't one and the same part of the game. WTH?

Certainly nothing to do with drivers.
The different detail level is related to an LoD system within the game engine, which is sensitive to exact distances, FoV, angles and other parameters. Nvidia have no control over that kind of thing from the driver level.

Unless the game is programmed with an explicit benchmarking tool that aims to render exact frames then any comparison is bound to be flawed.
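To illustrate why two capture runs can legitimately show different detail: distance-based LoD selection picks a mesh detail level per object, so slightly different camera positions can straddle a threshold. This is a toy sketch, not Frostbite's actual system; the thresholds and function names are made up, and real engines use screen-space error, FoV and per-asset settings, not just raw distance:

```python
import math

# Hypothetical distance thresholds (metres) -> detail level (0 = full detail).
LOD_THRESHOLDS = [(20.0, 0), (60.0, 1), (150.0, 2)]  # beyond the last -> level 3

def select_lod(camera, obj):
    """Pick a level of detail based on camera-to-object distance."""
    dist = math.dist(camera, obj)
    for limit, level in LOD_THRESHOLDS:
        if dist < limit:
            return level
    return 3

# Two near-identical camera positions can fall on opposite sides of a threshold:
print(select_lod((0, 0, 0), (0, 0, 59.5)))    # distance 59.5 -> level 1
print(select_lod((0, 0.5, 0), (0, 0, 60.2)))  # distance ~60.2 -> level 2
```

So unless both videos render the exact same frames from the exact same positions, a visible LoD difference proves nothing about the driver.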

It is the nVidia graphics drivers. They make the mess and render the images wrong.
You can't say the game engine somehow recognises it is being run on a GeForce and intentionally makes the image go wrong.

It is highly possible nVidia doesn't have the know-how and patents for the best image quality.
Hence all these results:







For members who don't understand the images: this is a test with circles where the goal is an ideal circle - any deviation means the graphics card renders the image with worse quality.

And Radeon:
https://www.ixbt.com/video3/images/rv870-quality/ssaa8x_2_5870.png

GeForce:
https://www.ixbt.com/video3/images/rv870-quality/ssaa8x_2_285.png

https://www.ixbt.com/video3/rv870-quality.shtml#p2
Research into the rendering quality of AMD Radeon HD 5000 video cards
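For those curious how such a deviation could even be quantified, here is a rough sketch of one way to score it, assuming you have sampled points along the rendered edge. This is my own illustration, not ixbt's actual methodology:

```python
import math

def circle_deviation(points, center, radius):
    """Root-mean-square deviation of sampled edge points from an ideal circle.
    0.0 means a perfect circle; larger values mean worse rendering quality."""
    errors = [math.dist(p, center) - radius for p in points]
    return math.sqrt(sum(e * e for e in errors) / len(errors))

# A perfect circle of radius 10, sampled at 360 angles, scores ~0.
ideal = [(10 * math.cos(math.radians(a)), 10 * math.sin(math.radians(a)))
         for a in range(360)]
print(circle_deviation(ideal, (0.0, 0.0), 10.0))

# A slightly squashed "circle" (what a poor resolve might produce) scores worse.
squashed = [(x, 0.9 * y) for x, y in ideal]
print(circle_deviation(squashed, (0.0, 0.0), 10.0))
```

The linked screenshots are effectively an eyeball version of this: the further the rendered edge wanders from the ideal circle, the worse the card's anti-aliased output.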
 
Is there any evidence of things not being drawn on an Nvidia card in PUBG?
I'm considering switching my GTX 980 SLI out for my Fury X. Initially I was thinking that if Nvidia aren't rendering everything, maybe I'm missing something useful; although now I'm thinking I don't want to switch to AMD if it's going to draw more foliage or whatever that people could hide in.
 