RTX 4070 12GB, is it Worth it?

It is good progress, just more on wattage than fps. Many people might not like that much of the progress comes by way of power consumption rather than fps, but in terms of frames per watt it has made good progress, which is the direction states like California are pushing computer hardware companies, as they are banning PCs that use too much energy according to certain criteria.

Your idea of progress, e.g. fps, differs from mine, fps per watt. There will be people who agree with your view and people who welcome the progress on performance per watt rather than fps. This review recognises the progress on power consumption: https://youtu.be/DNX6fSeYYT8?t=1106

Progress on watts per frame is still progress
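To put the two views side by side, here's a rough sketch of how you'd compare cards on raw fps versus fps per watt. All of the numbers below are made up purely for illustration:

```python
# Rough comparison of raw fps vs fps per watt between two hypothetical cards.
# All figures here are invented purely for illustration.
cards = {
    "last-gen card": {"avg_fps": 100, "board_power_w": 300},
    "new-gen card":  {"avg_fps": 105, "board_power_w": 200},
}

for name, c in cards.items():
    fps_per_watt = c["avg_fps"] / c["board_power_w"]
    print(f"{name}: {c['avg_fps']} fps at {c['board_power_w']} W "
          f"-> {fps_per_watt:.2f} fps/W")

# A small fps gain paired with a big power drop barely registers as "progress"
# on the raw fps view, but shows up as a ~58% efficiency improvement.
```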

It's mostly useless progress if only power efficiency improves, though. How does power efficiency help you play the latest games at 4K 120Hz in VR, for example?

As Jono said, big whoop.

The California stuff is a big red herring btw, I don't know why you are focusing on it, probably because it's the only example where a few hours of gaming per week could be seen as something to be bothered about in terms of power use.

Power use and temperatures just need to be within a sensible range (most of us have 750W or above PSUs). As long as we don't get stupid increases in power to generate performance improvements, most people are happy.
 
Bumping for posterity, now that news articles are cropping up with this :D

4060Ti also doesn't even look likely to get the full quota of AD106's shaders


Wow.... So Nvidia ain't quite done yet with their new normal.....

 
Only PCIe x8 as well, which looks weird.

Meh.... so long as it's PCIe 4, that's fine for a card like that.

It has 10% fewer shaders and 45% less memory bandwidth. I know it says 22 TFLOPS for the 4060 Ti vs 16.2 for the 3060 Ti, but that's a massive loss in memory bandwidth which the FP32 figure just doesn't account for. In reality, IMO, this is barely quicker than the card it's replacing, and sometimes it might even lose. I hope reviewers find where it does and ridicule it.
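Just taking the figures quoted above at face value (the bandwidth number is the 45% figure from the post, not an official spec), the mismatch is easy to see:

```python
# Quick sanity check of the figures quoted above (4060 Ti vs 3060 Ti).
new_tflops, old_tflops = 22.0, 16.2   # FP32 throughput, as quoted in the post
bandwidth_change = -0.45              # "45% less memory bandwidth", as quoted

tflops_gain = new_tflops / old_tflops - 1
print(f"FP32 gain:        {tflops_gain:+.0%}")      # roughly +36%
print(f"Bandwidth change: {bandwidth_change:+.0%}") # -45%

# In bandwidth-bound workloads (high resolutions, large textures) effective
# performance tracks memory bandwidth more than FP32 throughput, which is why
# the on-paper TFLOPS gain may not show up in real games.
```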
 

Yes, but normally that goes hand in hand with progress in price/performance as well, which is almost completely lacking in this gen.

No one can be truly that enthusiastic that it uses 100W less for the same performance at almost the same price. Big whoop.

Also, power has to come down naturally anyway, otherwise we'd have GPUs sucking 2000W by now (i.e. power usage improvement is a given really).
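For context, here is a back-of-the-envelope estimate of what a 100W saving actually amounts to. The gaming hours and electricity price are assumptions, purely for illustration:

```python
# Back-of-the-envelope: what does saving 100 W actually amount to?
# Gaming hours per week and electricity price are assumptions.
power_saving_w = 100
hours_per_week = 10      # assumed gaming time
price_per_kwh = 0.30     # assumed electricity price per kWh

kwh_per_year = power_saving_w / 1000 * hours_per_week * 52
saving_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year saved -> about {saving_per_year:.2f} per year")
# ~52 kWh, i.e. roughly 15 a year at these assumptions - hence "big whoop".
```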
California seems to be interested, and they didn't allow power usage to come down naturally: they wrote a report and passed a law about it.

"• Aggregate energy demand places gaming among the top plug loads in California, with gaming representing one-fifth of the state’s total miscellaneous residential energy use.
• Market structure changes could substantially affect statewide energy use; energy demand could rise by 114 percent by 2021 under intensified desktop gaming...
• Energy efficiency opportunities are substantial, about 50 percent on a per-system basis for personal computers"
 
If the next iteration of the Playstation starts supporting 3440x1440, and is powerful enough to run it (with RT/PT) at a decent framerate >100, I'll give serious consideration to buying that, instead of forking over a couple of grand for the 5090/6090.
 
California seems to be interested, and they didn't allow power usage to come down naturally: they wrote a report and passed a law about it.

"• Aggregate energy demand places gaming among the top plug loads in California, with gaming representing one-fifth of the state’s total miscellaneous residential energy use.
• Market structure changes could substantially affect statewide energy use; energy demand could rise by 114 percent by 2021 under intensified desktop gaming...
• Energy efficiency opportunities are substantial, about 50 percent on a per-system basis for personal computers"

No one is saying power usage improvement is bad though. It is good, and it is always necessary (like I said, otherwise we'd have GPUs pulling thousands of watts, or GPUs melting as soon as they are used).

It has to come with good price/performance too.
 
California seems to be interested, and they didn't allow power usage to come down naturally: they wrote a report and passed a law about it.

"• Aggregate energy demand places gaming among the top plug loads in California, with gaming representing one-fifth of the state’s total miscellaneous residential energy use.
• Market structure changes could substantially affect statewide energy use; energy demand could rise by 114 percent by 2021 under intensified desktop gaming...
• Energy efficiency opportunities are substantial, about 50 percent on a per-system basis for personal computers"
If all you care about is energy usage then the answer is simple: ban gaming, yes?
 
Badging the 4060 Ti as a 4070 and then pricing it at the 3080 level is easy for the seasoned buyer to see through. If you are comfortable parting with your money knowing this, that's one thing, but some punters will be oblivious to it. The soon-to-be-released cards unbelievably look even worse!

This is half the issue with fake environmentalism being indoctrinated into people. Slower products need upgrading sooner, so more energy needs to be expended to make them, i.e. more pollution. Plus, if you have to upgrade sooner and spend more money, you are not saving money.

PMSL, how people can't see the wood for the trees in all this is mind-boggling.
 
Personally I think Nvidia is only hurting themselves. There is likely no shortage of average consumers who are buying, but there will be plenty of people like myself who'd normally run out and buy something, who don't have specific budget constraints, but who won't buy something of poor value just because we can afford it.
The following article sums up why I do not think Nvidia is that bothered by low consumer GPU sales.


It's another craze, like crypto mining, that has hurt your ordinary gamer.
 
Except for the 4090. Only one worth buying just to get the frontier performance. Everyone else should hold if they can.
You are probably correct, that or a 7900 XTX, but for your mid-tier gamer who has to replace a faulty GPU, all there really is at a "reasonable" price is the 4070 for the new gen, or the 6800 XT/6950 XT for last gen.
 
People need to learn that the product naming schemes are almost meaningless. They can be whatever AMD or Nvidia decide they should be.

If you have a lot of money, you just buy the top-tier card, but if not you have to spend a little time reading product reviews.
 
Bumping for posterity, now that news articles are cropping up with this :D

4060Ti also doesn't even look likely to get the full quota of AD106's shaders


The more you pay the more you are gimped! :cry:


Meh.... so long as it's PCIe 4, that's fine for a card like that.

It has 10% fewer shaders and 45% less memory bandwidth. I know it says 22 TFLOPS for the 4060 Ti vs 16.2 for the 3060 Ti, but that's a massive loss in memory bandwidth which the FP32 figure just doesn't account for. In reality, IMO, this is barely quicker than the card it's replacing, and sometimes it might even lose. I hope reviewers find where it does and ridicule it.

Most people are still on PCI-E 3.0 systems though. The RX 6600 XT, which is also an x8 card, could lose a decent amount of performance in some circumstances.

This is an even faster dGPU, with only 8GB of VRAM, etc., so there could be many instances where the card runs out of VRAM before it runs out of grunt.
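For a rough idea of why the x8 link matters on older boards, here's a quick per-direction bandwidth estimate using the standard PCIe transfer rates (128b/130b encoding overhead included):

```python
# Rough per-direction link bandwidth for an x8 card on PCIe 3.0 vs 4.0.
def pcie_bandwidth_gb_s(transfer_rate_gt_s: float, lanes: int) -> float:
    # GT/s per lane * lanes, minus 128b/130b encoding overhead, bits -> bytes
    return transfer_rate_gt_s * lanes * 128 / 130 / 8

print(f"PCIe 3.0 x8: {pcie_bandwidth_gb_s(8.0, 8):.1f} GB/s")   # ~7.9 GB/s
print(f"PCIe 4.0 x8: {pcie_bandwidth_gb_s(16.0, 8):.1f} GB/s")  # ~15.8 GB/s

# An x8 card dropped into a PCIe 3.0 slot gets half the link bandwidth,
# which hurts most when an 8GB card has to spill data over the bus.
```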

Except for the 4090. Only one worth buying just to get the frontier performance. Everyone else should hold if they can.

To a degree. But in terms of percentage of the full die, the RTX 4090 uses 89% of the full chip, the RTX 3080 81% of the full chip, and the RTX 3090 98% of the full chip.
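Those percentages check out against the commonly quoted shader counts; take the counts below as assumptions rather than gospel:

```python
# Checking the "percentage of the full die" figures using commonly quoted
# shader (CUDA core) counts - treat these counts as assumptions.
full_dies = {"AD102": 18432, "GA102": 10752}
cards = {
    "RTX 4090": ("AD102", 16384),
    "RTX 3080": ("GA102", 8704),
    "RTX 3090": ("GA102", 10496),
}

for name, (die, shaders) in cards.items():
    pct = shaders / full_dies[die] * 100
    print(f"{name}: {shaders}/{full_dies[die]} shaders = {pct:.0f}% of {die}")
# RTX 4090 ~89%, RTX 3080 ~81%, RTX 3090 ~98% - matching the post above.
```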

The following article sums up why I do not think Nvidia is that bothered by low consumer GPU sales.


It's another craze, like crypto mining, that has hurt your ordinary gamer.

The problem with market capitalisation is that it's another speculative metric: it's based on predicted revenue, and hype can push share prices up. The problem is that if the actual revenue misses the mark by even a few percent, it all goes down the drain. Just look at what happened to Intel.

The tech market is hitting serious headwinds, so all the speculators are piling onto AI, but the issue is that so is EVERY large tech company. The excessive amount of money printing has led to insane amounts of money being pushed around the tech sector - this is what you need to blame for all of this. It was some of this money which enabled low interest rates for people to buy consumer goods. How interesting that the moment interest rates start going up, etc., we suddenly see collapsing sales everywhere.

Now you are slowly starting to see Quantitative Tightening become a thing, with increasing interest rates. So in the next few years, when the money tap starts getting turned down, it will be a double whammy for the tech sector:
1.) Consumers can borrow less to fund the massive increases in purchase prices
2.) Companies will find it harder to borrow

Things like the CHIPS Act might still prolong the availability of cheap credit, but even that is targeted towards certain things.

The tech sector has dined too long on cheap credit.
 