RTX 4070 12GB, is it worth it?

I asked whether you watched the video, and I'm still waiting for a reply! Why do you think you only have to use the written word when you could watch the video? What didn't you understand about the video? Perhaps you should leave a comment on the video saying that you didn't understand the power usage section.
I take it you're not going to address a single one of the points I raised with you, then?

Or address the elephant in the room, you know, the one you've been struggling with all day: that your insistence on using "power usage" is what everyone has taken issue with, presumably because it's not only technically incorrect to describe it that way but also highly misleading.
 
How is this news sensationalism? It clearly states that not all prebuilt PCs are banned yet, but that there are regulations in place to limit the sale of prebuilt PCs that don't meet them. Did you even read the article that I linked?


Please, for the love of god, read your own links; you will see they confirm what I said.
And please, for the further love of god, read what I posted, which you quoted, as you are asking a question that was already answered!
 
Actually, you are the one trying to mislead people and avoid questions. I'm still waiting to know whether you watched the video.
I have raised the issue of both power usage and power efficiency; sorry that you don't grasp that they are both important. Rather than engaging in a constructive discussion, you are coming across as rude, IMO.

Power usage is important for a number of reasons, and I stated that the 3080 is more of a power hog than the Fermi 480. "Power usage" is both accurate and technically correct; you are the one trying to mislead people, and you can't understand the difference. Power usage is the amount of energy a card uses, not how productive or energy-efficient it is when using that power. E.g., the Nvidia 4070 is more efficient than the 3080 and also has a lower power usage than the 3080.

So when people state that they would rather have a GPU that uses more power, e.g. 300+ watts rather than nearer 200, there are consequences to that decision, e.g.:
1. More energy is used, so the GPU is costlier to run over time.
2. A more expensive PSU is needed, and perhaps a new one, which adds to the cost of using the GPU.
3. More cooling is needed, so a more expensive cooler on the GPU and perhaps air conditioning, e.g. in data centres.
4. More greenhouse gases are produced, and more damage to the environment.


Power efficiency (work done per unit of energy, not the total power usage) is a target when designing a new GPU because of the above factors. A GPU with better perf/watt can help with points 1 to 4 above if the power usage is reduced, but a 1,000 W GPU could in theory still have good perf/watt. AMD has set power usage ceilings and power efficiency targets; Nvidia hasn't been as focused on keeping power usage as low and energy efficiency as high as AMD, comparing the 3000 series to the 6000 series.
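
To put some numbers on the distinction, here's a rough back-of-the-envelope sketch in Python. All the figures in it (electricity price, hours of use, wattages, FPS) are invented for illustration, not measurements:

```python
# Back-of-the-envelope comparison of two hypothetical GPUs.
# All figures are assumptions for illustration only.

KWH_PRICE_GBP = 0.30        # assumed electricity price per kWh
HOURS_PER_YEAR = 4 * 365    # assumed 4 hours of gaming per day

def annual_running_cost(watts: float) -> float:
    """Energy used over a year (kWh) multiplied by the price per kWh."""
    kwh_per_year = watts / 1000 * HOURS_PER_YEAR
    return kwh_per_year * KWH_PRICE_GBP

def perf_per_watt(avg_fps: float, watts: float) -> float:
    """Work done per unit of power: here, average FPS per watt."""
    return avg_fps / watts

# Hypothetical cards: B draws more power and also happens to be less efficient.
for name, watts, fps in [("Card A", 200, 100), ("Card B", 320, 140)]:
    print(f"{name}: £{annual_running_cost(watts):.2f}/year, "
          f"{perf_per_watt(fps, watts):.3f} FPS per watt")

# Output:
#   Card A: £87.60/year, 0.500 FPS per watt
#   Card B: £140.16/year, 0.438 FPS per watt
# Power usage (watts drawn) and power efficiency (FPS per watt) are
# related but distinct metrics, which is the point being made above.
```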


"This clearly implies that AMD is aiming for a huge 50% performance-per-watt improvement for RDNA 3 GPUs versus the current generation of cards. AMD is currently the only manufacturer that has plenty of expertise in creating both mainstream graphics cards and processors, and as such, it’s able to utilize some of its CPU ideas in the creation of new GPUs — reducing bus widths and adding a large Infinity Cache comes to mind.

Thus far, it’s mostly been true that AMD is often more power-conservative than its competitors. As an example, Nvidia’s GeForce RTX 3090 Ti has a 450-watt TDP, while AMD’s flagship Radeon RX 6950 XT keeps things more reasonable at 335 watts. The next generation of GPUs for both manufacturers is still a subject of speculation, but it won’t be a surprise if AMD continues to keep things slightly more efficient — although perhaps sometimes at the expense of performance.

RDNA 3 GPUs are set to release later this year and instantly made to compete with Nvidia’s RTX 40-series graphics cards. Let’s hope that AMD’s power-efficiency focus will save many customers from buying a new power supply."


I hope that this helps with your comprehension, but I suggest that you leave a comment on the Hardware Unboxed review telling Steve that you do not understand his section about power usage.
 

Did you not take a look at California then? Or maybe ask Klaus at the WEF.
 

Please read the articles that I sent, as all the information is in there. You agree above that if you do read them, the information is in the links I sent, so how have I not given you access to the relevant information?
 
I'm not going to read that waffle, as you seemingly still don't understand that there's a difference between what you keep describing, "power usage and power efficiency", and performance per watt.

Edit: Oh, and the other reason I'm not going to read all that rubbish is that I suspect all you've done is describe what performance per watt is, when all you had to do was replace all that waffle with three words: performance per watt.

I have to ask, but are you Bencher's alt account or something? Because you're demonstrating the same lack of contrition, the same lack of humility, the same difficulty with language, and exhibiting the same traits. Was it that you embarrassed yourself so thoroughly yesterday that you decided to use an alt account?
 
You wrote, "That your insistence on using "power usage" is what everyone has taken issue with, presumably because it's not only technically incorrect to describe it that way but also highly misleading."

Sorry that you don't understand the simple concept that when you plug an electrical item into an electrical socket and switch it on, it uses electrical power.
 
Yet AMD managed to either cut or hold prices for its CPUs built on 5nm.


CPUs have a lot of margin, more than GPUs. On the high end, with something like a 7950X, half its price is pure profit for AMD. That's why, any time it's required, they can do massive price cuts: they will still never sell at a loss. I know people haven't paid much attention to it, because GPU prices exploded and kept the heat away from CPUs, but the truth is that CPUs from both Intel and AMD make more profit than GPUs do, quite a bit more actually.
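
As a purely hypothetical illustration of that point (the prices and costs below are invented, not AMD's actual figures), a high-margin part leaves plenty of room for cuts:

```python
# Hypothetical margin arithmetic; neither figure is AMD's real data.

def gross_margin(price: float, unit_cost: float) -> float:
    """Fraction of the selling price that is profit."""
    return (price - unit_cost) / price

launch_price = 699.0  # assumed launch price of a high-end CPU
unit_cost = 350.0     # assumed all-in cost ("half its price is pure profit")

print(f"Margin at launch: {gross_margin(launch_price, unit_cost):.0%}")
print(f"Margin after a $150 cut: {gross_margin(launch_price - 150, unit_cost):.0%}")
# Margin at launch: 50%
# Margin after a $150 cut: 36%
# Even a deep cut leaves the part selling well above cost, which is the
# flexibility described above.
```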
 

Not affecting Nvidia, as their gross margins are higher than AMD's or Intel's!
 
A TSMC 5nm wafer is $16,000, assuming Nvidia even pays full price and doesn't get a discount for bulk buying. With a 100% yield you're looking at around 248 dies; now let's say the yield is only 90%, then you're looking at a cost per die of about $71, or £57, so how we get from that to £600 or even £800 for a 4070 Ti is beyond me.
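
For anyone who wants to check the arithmetic, here it is spelled out in Python; the wafer price, die count, and yield are the figures above, plus an assumed USD-to-GBP rate:

```python
# The wafer arithmetic from the post, spelled out.
# Wafer price, die count, and yield come from the post; the exchange
# rate is an assumption.

WAFER_PRICE_USD = 16_000
DIES_PER_WAFER = 248   # candidate dies at 100% yield
YIELD = 0.90           # assumed fraction of usable dies
USD_TO_GBP = 0.80      # assumed exchange rate

good_dies = DIES_PER_WAFER * YIELD          # ~223 usable dies per wafer
usd_per_die = WAFER_PRICE_USD / good_dies   # ~$71.7
gbp_per_die = usd_per_die * USD_TO_GBP      # ~£57.3

print(f"{good_dies:.0f} good dies -> ${usd_per_die:.0f} (~£{gbp_per_die:.0f}) per die")
# 223 good dies -> $72 (~£57) per die
```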
 

You can't ignore development costs.

A game costs pennies to print onto a disc or host for download on a server. It still costs a fortune in man-hours to develop and design, hence why it costs ~£50.

I'm definitely not saying Nvidia aren't taking us to the cleaners, but thinking only about material costs is wrong.
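
A crude sketch of the amortisation point; every number below is made up just to show the shape of the calculation:

```python
# Per-unit cost = materials plus an amortised share of fixed development
# cost. Every number below is invented for illustration.

def unit_cost(materials: float, dev_cost: float, units_sold: float) -> float:
    """Materials/BOM per card plus development cost spread over all units."""
    return materials + dev_cost / units_sold

materials = 150.0            # assumed board, cooler, VRAM, die, etc.
dev_cost = 2_000_000_000.0   # assumed R&D spend for a GPU generation
units_sold = 10_000_000.0    # assumed cards sold across that generation

print(f"Per-unit cost: ${unit_cost(materials, dev_cost, units_sold):.0f}")
# Per-unit cost: $350 -- far above the bare die cost, though still well
# below an £800 retail price.
```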
 

But Nvidia has higher margins than Apple, so whatever the costs are, I don't think going from an RRP of $330 for the RTX 3060 12GB to $800 for the RTX 4070 Ti 12GB has anything to do with costs.
 