
Poll: Do you think AMD will be able to compete with Nvidia again during the next few years?

  • Total voters: 213
  • Poll closed.
Sadly I don't think it is going to happen anytime soon, IMHO. When Crytek did that, we then had a huge performance jump with affordable cards like the 8800GT, and still many gamers moaned and also pirated the game. After that you saw fewer and fewer companies really push the PC on a technical level like Crytek did, because Crytek didn't make enough money. Outside of AMD/Nvidia pushing some tech to sell new cards, it's more likely consoles will be pushing technical innovations to get around their limitations, and any big jumps in image quality are most likely going to come from new consoles like the PS5, as the potato CPUs in the current consoles are a big problem. An example is the streaming tech we saw in Skyrim, which meant a largish open world without loading screens.

Why do you think so many PC gamers are throwing money at Star Citizen? The devs have promised a title which will only run on PC and push what is possible, which I really hope it does. Whether that happens is another thing. Most AAA titles are literally console titles but looking a bit prettier and running at higher FPS on PC. If you looked at Crysis and compared it to console games, the consoles looked utterly meh. The same with Far Cry, HL2 or Unreal back in the day.

Edit!!

Then you have the whole early access crap on PC. ARK really pushed hardware and was not a bad-looking game, but most of it was down to utterly rubbish optimisation, especially with the early access fad (which is there to save money for the devs). I would argue most of the really intensive PC games nowadays are usually just poorly optimised, or running on old engines which are being strung along to save money. They don't look nearly as good as the hardware you are expected to throw at them would suggest.

CB2077 might be another game which could push things, but they got 30FPS at 4K on a single GTX 1080 Ti, and they want to run it on current consoles (sadly), so I expect they might have dialled down some detail to make it run better. The same happened with The Witcher 3, as the product we got looked worse than what was revealed earlier, which was a shame. Still a pretty game, but it could have been the next Crysis.

It's also another issue: if AMD does not compete, the competition will string out improvements based on financials, as you can see from what has happened with dGPUs under £300.

Star Citizen is becoming increasingly controversial because the vast amount of money it's bringing in has changed its scope massively, so it's taking a lot longer than the 2 years originally promised. Yet this is what its backers wanted: they voted to extend its scope and development time.
It's been more than 5 years and it's going to be another 2 before it gets to a beta stage. Getting the infrastructure and technology in place for a true-scale 3D universe is proving extremely difficult; it is the fact that they are doing it that keeps most of its backers on side, and they can see the progress. All of the infrastructure is getting to completion, and the features that depend on it are beginning to appear in the Alpha.

Visually it's gorgeous, and it cannot be run on lower-end hardware. An i3 with an RX 550 and 8GB of RAM? Forget it...... and that's how it should be. If you make games that fit lower-end hardware and then upscale them, you don't get the fidelity; it's compromised, and that's not what Star Citizen backers want.

A couple of recent vids that show off its fidelity; these scenes are rendered in-engine, and when you play the game it looks like this.



 
Voted no. I don't think AMD are even interested in making proper 1080 Ti gaming competitors, because they can't sell them no matter how good they are.

Oh, AMD are interested; they won't leave such large pies to be eaten by nVidia alone :D If we are going to 3840x2160 and 7680x4320 resolutions, we need hardware to push all those pixels. Even on next-gen consoles.
The 1080 Ti will be beaten and improved upon by AMD soon.

I agree mate. Most here want AMD to compete just so they can buy their Nvidia card cheaper.

Except me. I wouldn't be that hypocritical :rolleyes:
 
Panos is right tho: AMD need to justify their R&D spend. If they don't get a return on their investments, then they are not going to make those investments. AMD have been consistently losing market share and money since, and including, the HD 4870, which was a brilliant GPU. By rights that GPU should have cemented AMD as the GPU leaders; instead nVidia sold 2x as many GTX 260s, which were rubbish by comparison.

As a result AMD are now out of money; the last of it went on developing Ryzen, and luckily for them the CPU crowd are not so tribal. It's doing very well, even outselling Intel in some instances.

The GPU market as a whole wants AMD to compete so they can buy nVidia cheaper; that's unsustainable.
 
AMD won't be out of money for much longer. Give them 12 months and they will post very impressive profit figures; they now cover the debt payments and still posted $90,000,000 profit.
AMD are out of money because they paid twice for ATi, because of anti-competitive Intel during the Athlon64 era, and because of Bulldozer.
 
Even with Bulldozer, the CPU part of the company was profitable. The GPU part of the company has never been profitable, but they have been losing less money by constraining themselves to the mid-range. Fury X and Vega are compute cards rebranded for gaming; it's why they are brilliant compute cards and lacking in gaming.
 
Must add that some AMD-exclusive AIBs do the extra bit to provide better products too. Just got the V64 Nitro+ and was surprised that it had a bracket to hold the card straight. And that's a £500 card which is not as heavy as the Aorus 1080 Ti Xtreme, which bent from the first moment I put it in, taking on a nice permanent shape 10 months later.
 
I really hope AMD can compete, just can't see it happening.

I also hope Intel (spit) start to compete in the GPU arena; something needs to break Nvidia's stranglehold.

GPU prices are currently in the realm of madness at the medium to top end.
 
AMD really competes at the moment in all segments apart from the halo products like the Titan and 1080 Ti.

And AMD has some great drivers with great features atm, while NV is 20 years behind on that front, requiring 3rd-party tools just to put an overlay on the screen to see how your card is doing, or to overclock. Those things come by default with the AMD drivers these days.
 
So, we have better AMD drivers, a better Freesync experience, better image quality, and somehow we are still unhappy and buy GeForces?!

Even with Bulldozer their CPU part of the company was profitable, the GPU part of the company has never been profitable, but they have been losing less money just constraining on the mid range, Fury-X and Vega are compute cards rebranded for gaming, its why they are brilliant compute cards and lacking in gaming.

Does this mean the prices are still too low and need to be adjusted accordingly, so the division would finally start to post figures in the green?
 
Well, that is one opinion. Another would be:

Well, AMD have a prettier driver interface, a comparable variable refresh rate experience and very similar image quality; but saying all that, they are dearer, consume more power and were much later into the market.
 
Hold on a bit on the power consumption front. Until yesterday I had a GTX 1080 Ti Xtreme with no manual overclock, just the normal boost, and it was consuming 70W more than the V64 Nitro+ I have today. Don't take the feeble FE power consumption to make a case for it; take the factory-overclocked versions, some of which are bordering on the old mighty 295X2 on that front.
 
Not only that, but you can adjust the voltage and frequency and lower the power consumption virtually as much as you like.
So, we can delete the power consumption argument.
The prettier driver interface is just a bonus on top of higher stability (no crashes or blue screens) and more interesting features in general.
Freesync: an AMD card plus a Freesync monitor is still cheaper than a GeForce plus a G-Sync monitor.

As for time to market, it certainly doesn't matter if you buy a new card today. I would buy the card that appeared more recently, so its support might be prolonged further into the future.
 
Does this mean the prices are still too low and need to be adjusted accordingly, so the division would finally start to post figures in the green?

No, I think AMD make perhaps 10 or 15% margins on their GPUs (even the very expensive-to-make Vega), perhaps 40% on Polaris; that's not the problem.

The problem is that if you spend $1bn on R&D but only get $800m back on the resulting product, then you have lost money.

AMD need to sell a lot more GPUs.
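The arithmetic behind that point is just a break-even calculation. A minimal sketch, using entirely made-up figures (the R&D spend, selling price and margin below are illustrative assumptions, not real AMD financials):

```python
# Break-even sketch: R&D only pays off if enough units sell.
# All numbers below are invented for illustration, not real AMD financials.

def breakeven_units(rd_spend: float, price: float, margin: float) -> float:
    """Units needed so the gross profit per unit covers the R&D outlay."""
    profit_per_unit = price * margin
    return rd_spend / profit_per_unit

# e.g. $1bn of R&D, a $400 average selling price, 15% margin
units = breakeven_units(1_000_000_000, 400, 0.15)
print(f"{units:,.0f} units to break even")  # → 16,666,667 units
```

At a 40% margin the same sum breaks even at a quarter of the volume, which is the poster's point: thin margins make the required sales volume enormous.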
 
Replaced the 1080 Ti with the V64 because I can use the NU8000 with Freesync 2 now. A colleague is already coming tomorrow to pick up the KS7000.
A 55" 4K HDR Freesync 2 experience, for less than a 27" 4K G-Sync monitor on its own..... :P
 
On a side note to this, I heard nVidia are having to use a chip on the G-Sync modules to make them HDR, and said chip costs a lot of money; we are talking a few hundred $, $500 was the quote I got. Apparently it's all nVidia can do at the moment to make G-Sync HDR; it's why it has taken so long to happen and why it's only on $2000+ displays.

Anyone know anything more about that?
 
Freesync: an AMD card plus a Freesync monitor is still cheaper than a GeForce plus a G-Sync monitor.

Yeah, it is now again. It wasn't like that for a good few months though, was it, when the 64s were even more expensive than 1080 Tis; you could have gotten an Nvidia card + a G-Sync monitor for around/less than the price of a Vega + a Freesync one back then.
 
The RX Vega pricing compared to their NVIDIA counterparts is ridiculous though. As much as I would like to support AMD, it's not possible at these prices.
 