Poll: Do you think AMD will be able to compete with Nvidia again during the next few years?
AMD's site states a Typical Board Power (desktop) of 295 W for the Vega 64.

Nvidia states a TDP of 180 W for the 1080 and 250 W for the 1080 Ti (click 'View Full Specs').

So no, I won't back down on the power consumption front. You might have tweaked your card to be relatively power efficient, but don't try to kid us that they are all capable of it, as loads of people have not had much luck tweaking theirs. And that's without even mentioning that you put the Vega 64 up against the 1080 Ti... lol


Nvidia's TDP has nothing to do with actual power consumption; it is based on stock base core speeds.
The same applies to Intel's TDP, which is calculated at base core speeds.

Go and look at reviews for the difference in power consumption between, for example, the MSI Lightning or the Aorus Xtreme, or the SC2 and Kingpin, against the FE.
The difference is roughly 100 W, sometimes even more.
 
I'm not a fan of Polaris at all. Noisy, hot, absurdly power-hungry for the performance...

Anyway, today sub-£200 is not the same performance bracket as it was back then. Sub-£200 is ****-tier these days. You can't buy a decent mid-range card for less than £250-£350 (or LOL-Asus at £400). I seem to recall the 7850 was £150-ish? The next mid-range king was the 460 (nV), also at £150.

https://www.overclockers.co.uk/pc-components/graphics-cards/amd/radeon-rx-580

^Mid-range today is 2.3x the cost of that 7850.
Sorry, I wasn't talking about whether the cards offer good performance for the price (I'm well aware that the performance we get today for what we're paying is a joke); I was talking about how Nvidia don't really compete with AMD at the sub-£200 bracket, thanks in no small part to Nvidia pushing their cards up a tier with a price and model number to match, just like back then.
 
Yes, but not at the top end. I.e. 1170 and below they will have an equivalent, but 1180 and above not for a while, IMO. I hope I am wrong about the top end.
They would have competed better this gen had it not been for miners.

I have no idea how GDDR6 fares for mining, but I hope it is rubbish at it.
 
I agree, mate. Most here want AMD to compete just so they can buy their Nvidia card cheaper.

This.

Quite a few have admitted this plenty of times before, and even more people have said the same now that they have gone and locked themselves into a G-Sync monitor.

You just have to look at history: even when AMD had the better product, most people still went with Nvidia because of "brand"...

Too many people think that just because you make a good/better product, it will be an instant success. Sadly that is not the case, hence why companies spend millions, and in some cases billions, on their marketing. Apple and Samsung are the masters of this.


I doubt AMD will ever be competitive with Nvidia again, at least not at the very high end anyway. The only reason they ended up doing somewhat well with Vega was the crypto mining boom; take that away and I dread to think how many GPUs they would have sold :o
 
Gosh, are people really arguing over as insignificant a marker as power? It would be what, less than a £10 difference over the year? Who cares...

It has nothing to do with electricity cost; it is the heat production that needs to be expelled.
And performance per watt is a critical measurement of architecture efficiency. It is trivial to make a GPU faster by feeding it more volts and upping the frequency.

The most important aspect for the IHVs is datacenter use, where the power consumption cost is critical. A datacenter with 10,000 GPUs running at 100% 24/7 is a very different matter.
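To make both of those points concrete, here is a minimal Python sketch; the clocks, voltages, fleet size and electricity tariff below are purely hypothetical illustrations, not figures from this thread.

```python
# Minimal sketch, not from the thread: dynamic power scales roughly with
# frequency and the square of voltage, which is why "more volts + higher
# clocks" is an easy but inefficient way to buy performance.

def scaled_power(base_power_w, f_old_mhz, f_new_mhz, v_old, v_new):
    """Approximate dynamic power after a clock/voltage bump (P ~ f * V^2)."""
    return base_power_w * (f_new_mhz / f_old_mhz) * (v_new / v_old) ** 2

def datacenter_annual_cost(gpu_count, watts_per_gpu, price_per_kwh):
    """Electricity bill for a fleet of GPUs running at 100% load 24/7 for a year."""
    kwh_per_year = gpu_count * watts_per_gpu / 1000 * 24 * 365
    return kwh_per_year * price_per_kwh

# Hypothetical example: +10% clocks with +8% voltage on a 250 W card.
print(round(scaled_power(250, 1500, 1650, 1.00, 1.08)))    # ~321 W

# Hypothetical example: 10,000 GPUs at 300 W each, £0.10 per kWh.
print(round(datacenter_annual_cost(10_000, 300, 0.10)))    # ~£2,628,000 per year
```

Even a modest per-card difference multiplies into serious money at that scale, which is the datacenter angle mentioned above.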
 
Gosh, are people really arguing over as insignificant a marker as power? It would be what, less than a £10 difference over the year? Who cares...

It's even less. Unfortunately, for some this isn't an overclockers forum but a money-pinching forum. I cannot explain it otherwise.

There were always big, heated arguments, with some not grasping the simple maths, while many didn't have a clue how much they paid per kWh.
Like when the 290X came out against the 780 Ti and the Titan X. We ran the calculations and showed that if you picked the 780 Ti just because it was more efficient, you had to keep it for 15 years, running it 6 hours per day at 100%, to break even on the price difference. For the Titan X it was 44 years.

And yet even today people are still arguing.
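For anyone who wants to redo that kind of sum, here is a minimal sketch of the break-even maths; the price gap, wattage gap and tariff are hypothetical placeholders rather than the actual 290X/780 Ti figures.

```python
# Minimal sketch of the break-even calculation described above.
# All input numbers are hypothetical placeholders for illustration only.

def break_even_years(price_gap_gbp, watt_gap, hours_per_day, price_per_kwh):
    """Years of use before the electricity saved pays back a higher purchase price."""
    saving_per_year_gbp = watt_gap / 1000 * hours_per_day * 365 * price_per_kwh
    return price_gap_gbp / saving_per_year_gbp

# Hypothetical: the pricier card draws 80 W less, costs £150 more,
# electricity at £0.14/kWh, gaming 6 hours a day at full load.
print(round(break_even_years(150, 80, 6, 0.14), 1))  # ~6.1 years
```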
 
It has nothing to do with electricity cost; it is the heat production that needs to be expelled.
And performance per watt is a critical measurement of architecture efficiency. It is trivial to make a GPU faster by feeding it more volts and upping the frequency.

The most important aspect for the IHVs is datacenter use, where the power consumption cost is critical. A datacenter with 10,000 GPUs running at 100% 24/7 is a very different matter.

But that is done using different cards, designed specifically for that purpose.
 
If NV and AMD both made identically performing cards for both monitor AND VR at the same price, then for me it would come down to the free game pack-ins and PhysX. Whilst PhysX has largely failed due to being locked to NV, when supported it is better than any other added eye-candy feature, even HDR. Whilst no longer a huge issue, I still have the Arkham titles to play, and Borderlands: The Pre-Sequel I assume supports it. Ultimately, unless AMD ever manage to get around PhysX, then like for like they need to be better than NV or cheaper. IMO, being on a par ain't gonna cut it.
Right now, however, if I was building a gaming PC I would probably choose them for a CPU, and their APUs for mid-range gaming laptops are pretty good.
 
Why even try to compare a data centre running thousands of cards to the general populace of this forum, who use 1-4 cards? That makes power a pointless marker to compare for 99.99% of users here.

It just seems to be a stick that one group tries to bash another group with; it's petty.
 
Generally lower power means lower heat, and I can tell you now, sitting in my man cave in 30-degree heat, the less heat my PC throws out the better.
Not to mention, less heat generally means less noise. Like you, a few pennies a week on electricity is a drop in the bucket compared to all the other gear running, but I still want a cool-running PC if possible.

The thing is, you would have a point if AMD were currently £100 cheaper for a 1080-equivalent card and the only difference was a few extra watts, but this is not where AMD are right now.
 
It's even less. Unfortunately, for some this isn't an overclockers forum but a money-pinching forum. I cannot explain it otherwise.

There were always big, heated arguments, with some not grasping the simple maths, while many didn't have a clue how much they paid per kWh.
Like when the 290X came out against the 780 Ti and the Titan X. We ran the calculations and showed that if you picked the 780 Ti just because it was more efficient, you had to keep it for 15 years, running it 6 hours per day at 100%, to break even on the price difference. For the Titan X it was 44 years.

And yet even today people are still arguing.
Yeah, I get this feeling a lot. Arguments between AMD and Nvidia always seem to come down to AMD being better value for money. Nvidia is quicker, but AMD is cheaper. Nvidia runs cooler, but AMD is cheaper.
Maybe it's us; maybe this isn't an enthusiast forum about performance parts but a money-pinching forum about finding the best bang for buck.

There is part of me that suspects that if, next gen, things were reversed and Nvidia products were cheaper but AMD products were faster, cooler and more power efficient, then a lot of people's opinions on what was important would reverse too.
 
Yeah, I get this feeling a lot. Arguments between AMD and Nvidia always seem to come down to AMD being better value for money. Nvidia is quicker, but AMD is cheaper. Nvidia runs cooler, but AMD is cheaper.
Maybe it's us; maybe this isn't an enthusiast forum about performance parts but a money-pinching forum about finding the best bang for buck.

There is part of me that suspects that if, next gen, things were reversed and Nvidia products were cheaper but AMD products were faster, cooler and more power efficient, then a lot of people's opinions on what was important would reverse too.


But you can see it in the other discussion, where someone was asking, "If AMD had the better card, would you switch?"
The same people who said "no, I have G-Sync" are also the ones who advise others to get an Nvidia card even when they have FreeSync monitors, just in the discussion below!
 
as loads of people have not had much luck tweaking theirs.

Point out these loads of people.

They might not all be great overclockers, but every card can be tweaked to use less power and perform better with a few simple steps, because the default settings are rubbish.
 
Point out these loads of people.

They might not all be great overclockers, but every card can be tweaked to use less power and perform better with a few simple steps, because the default settings are rubbish.
Even with tweaking, many people just want to plug and play; Nvidia offer this, and with Ryzen, AMD offer it in the CPU department. We are a minority here; many just want to use default settings and be away.
 
Couldn't give a toss about electricity cost, however, less heat and less noise are nice bonuses.

Actually, I have the V64 Nitro+. I was gaming all day yesterday in a very warm room with closed windows (during the day there are thrips in the area, and I don't want to lose another monitor).
There was no fan noise from the card at its stock overclock settings (high 1500s). Nor is there today.
 
Nvidia's TDP has nothing to do with actual power consumption; it is based on stock base core speeds.
The same applies to Intel's TDP, which is calculated at base core speeds.

Go and look at reviews for the difference in power consumption between, for example, the MSI Lightning or the Aorus Xtreme, or the SC2 and Kingpin, against the FE.
The difference is roughly 100 W, sometimes even more.

+1. Nvidia put a 150-watt TDP rating on my GPU, and MSI put 6-pin + 8-pin plugs on it, which together with the slot is 300 watts of available power, and I can assure anyone reading this that its actual power consumption is easily over 200 watts.

Again, as Panos said, both Intel and Nvidia quote TDP measured at the base clock speed. For the 8700K that's 3.7 GHz, yet they all run at 4.3 GHz and use a lot more power than their TDP rating. My GPU has a base clock of about 1530 MHz; it actually runs at 1911 MHz out of the box.
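To make the connector arithmetic explicit, here is a minimal sketch using the standard PCI Express power ratings (75 W from the slot, 75 W from a 6-pin plug, 150 W from an 8-pin plug); the names are my own, not anything from the post.

```python
# Standard PCI Express power delivery limits (watts) per source.
CONNECTOR_RATING_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

def board_power_budget(sources):
    """Total power (W) a card can draw in-spec from the listed sources."""
    return sum(CONNECTOR_RATING_W[s] for s in sources)

# A card fed by the slot plus 6-pin and 8-pin plugs, as described above.
print(board_power_budget(["slot", "6-pin", "8-pin"]))  # 300 W
```

So the 300 W figure lines up once you count the 75 W the slot itself can supply on top of the two plugs.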
 
I don't think that's even remotely true.

I also think you have a very selective memory.

Back when AMD made decent cards, products like the 7850 were the only recommendation in their price bracket.

It's not the consumer's fault that AMD is where it is. You can blame Intel; you can blame mismanagement; you can blame partners like GloFo for not delivering.

Some of you guys really do live in fantasy land, though. At the moment, AMD does not have "equal or better products to nV at all points below the 1080 Ti." That statement is so far divorced from reality that it's remarried five times and had 15 kids.
Agreed. I remember recommending the 7970 over the 680, the 7950 over the 670, the 290/X over the 780 Ti, etc. AMD had the decent prices as well as the decent performance.
 
Not sure where some people are getting their figures from, but the reference Vega 64 uses a lot more power than any of the high-end reference Nvidia cards like the 1080 Ti FE, Titan Xp or Titan V.

Non-reference cards cannot be used for a fair comparison, as they are not what either Nvidia or AMD think the card should perform like.

Having said that, as zophiel said above, the cost of using either brand adds up to very little over a year.

Gosh, are people really arguing over as insignificant a marker as power? It would be what, less than a £10 difference over the year? Who cares...
 