RTX 4070 12GB, is it Worth it?

In Cyberpunk 2077 there are parts where performance can get dicey even with DLSS Performance, so it meant turning down a whole lot of effects. In games which run better, the effects are less pronounced. It's why Nvidia is trying to copy TVs and add inserted frames, because I have the feeling that dGPUs will be lagging behind the software for the immediate future, especially in the sub-£500 area. What is quite hilarious, though, is that Nvidia's overpricing means an RTX 3050 is fighting an RX 6600/RX 6600 XT. Not even RT can save the RTX 3050!

It ain't perfect, but I would rather have it than not. I just ain't gonna pay silly money for it. If I can get cards at what I consider a sensible price then I will. If not, I won't. Simple as that.

At the end of the day we all know what Nvidia and AMD are about. They just want to maximise profits and will do so at every opportunity they can get. Neither is your friend, which is why I don't get why some love to defend AMD so much.
 
My big issue is that people are justifying huge price increases for it. If we look back 20 years, not only did we get decent performance improvements but also nice new features at the same time. Otherwise modern games would still look like Quake, and a dGPU would start at £2000.

RT should be something on top of the other improvements, the same with DLSS/FSR, Frame Generation, etc. Not something used to justify barely quicker performance at the same price after a few years. WRT the AMD defence crew - it's far worse with some of the people on the CPU side (Intel's tactics apparently aren't bad if AMD does them).
 
I wonder if it will be RDNA3.0 and use Infinity Cache?

It could be; it would be pointless just to make a larger version of the GPU in the PS5, which I think is RDNA2. RDNA3 is the only place to go, and RDNA3 APUs are coming out. IMO that Asus handheld has one, Phoenix.

 
It would mean that they could keep the same memory set-up, instead of having to add even more memory channels and GDDR6 chips (which would cost more money). Also, with TSMC having more spare volume on 5nm, Sony could be in a strong position to negotiate for volume. It would be even more interesting if the console were to use chiplets of some sort.
 
The way I see it is, as soon as AMD start doing it, which they have, I couldn't care less and have no loyalty. At that point I buy what I fancy and they have no sway over me. They used to have some sway in the past, plus they were actually competitive.
 
AMD did it last generation too. Just look at the pricing of the RX 6600, RX 6600 XT and RX 6700 XT in the UK, especially since they couldn't be bothered with selling RRP reference models in the UK (unlike Nvidia). Then look at the pricing of the RX 5600 XT, RX 5700 and RX 5700 XT. Then the sudden price bump when Zen 3 launched (and the motherboard pricing and segmentation of certain features on AM5). But at least, unlike Nvidia, AMD is still willing to drop prices after a while. Nvidia still seems to want to get rid of billions of USD of inventory at RRP or higher.
 
You can see, for example, that AMD has been stuck at 6 cores for the last 6-7 years, and the prices of those 6-core parts have also gone up. Yet the same people who criticised Intel for stagnation are silent now, even though AMD is doing worse, lol.
 
The problem is that whatever Nvidia don't sell at retail they can sell to OEM and client; AMD can't. AMD have to care about retail, Nvidia don't - they can chase their 70% margins elsewhere if we don't want to line their pockets to such an extent.

It will only matter to Nvidia if AMD start drilling into their retail market share, properly. The problem with AMD is that they are chasing the highest revenue they can: if they sell 1000 GPUs at $500 they gross $500,000; if they sell 1100 GPUs at $400 they gross $440,000.
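To make that trade-off concrete, here is a minimal sketch of the revenue comparison; the unit counts and prices are the hypothetical figures from the post, not real AMD sales data:

```python
# Hypothetical figures from the post above: fewer units at a higher
# price can gross more than slightly more units at a lower price.
def gross(units: int, price_usd: int) -> int:
    return units * price_usd

print(gross(1000, 500))  # 500000 -- 1000 GPUs at $500
print(gross(1100, 400))  # 440000 -- 1100 GPUs at $400, $60,000 less
```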
 
Well, I think that Nvidia is taking the p**** a bit here with the spec of the RTX 4070, if this spec ends up being correct:

I suppose we don't precisely know what impact the L2 cache will have, but I don't think it's a massive improvement.

These specs point towards 5888 shaders, the same as the RTX 3070. The only way the spec (on paper) is going to match the RTX 3080 is with a large increase in the core boost clock.

So it's not the typical increase in shaders that we've seen in previous generations. It looks like many of the higher-spec AD104 GPUs are going into laptops instead :(

I suppose the positives are that the TGP is 200W, so there's potentially lots of headroom for overclocking. A clock of 2600MHz would take the GPU to 30 TFLOPS; 2750MHz would result in 32 TFLOPS.
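For reference, a minimal sketch of where those TFLOPS figures come from, assuming the rumoured 5888 shaders and the usual convention of 2 FP32 ops per shader per clock (one FMA):

```python
# Peak FP32 throughput = shaders * 2 ops/clock (FMA) * clock rate.
def fp32_tflops(shaders: int, clock_mhz: float) -> float:
    return shaders * 2 * clock_mhz * 1e6 / 1e12

for clock_mhz in (2600, 2750):
    print(f"{clock_mhz} MHz -> {fp32_tflops(5888, clock_mhz):.1f} TFLOPS")
# 2600 MHz -> 30.6 TFLOPS
# 2750 MHz -> 32.4 TFLOPS
```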
 
AMD could, as they make a whole platform themselves, so it makes me wonder whether their dGPU releases are increasingly just a token effort to say they have something.
 
So for me I think the RTX 5000/RX 8000 series will be where things will hopefully get interesting.

By then Nvidia will still be selling cards based on Cyberpunk and all the new updates to it that will be Nvidia-"optimised", aka GameWorks-in-disguise games. Cyberpunk should not be used to measure which card is better any more, as it is clearly an Nvidia-"sponsored" game, and CD Projekt RED are happy to take the money and add all the things Nvidia wants to cripple other cards, making their latest and greatest $2k card play at 40fps at 4K (even before the path tracing update, so PT will be what, 24fps, and they will call it cinematic? :rolleyes:), and brag about it. I have lost all respect for CD Projekt RED; Nvidia, well, we know their games and are used to them doing these things.

Also by then there will be some new game that will cripple old cards and add more fake frames, and they will say look, DLSS3++++++, and oh yes, DLSSX Ultimate that will upscale from 360p to 8K and add 10 times more fake frames. Meh. They really have killed the goose that lays the golden eggs. Life is giving many of us more important things to worry about, and I hope by then some sense comes back into these companies that are all trying to take their customers for a ride.
 
We do know what impact the large L2 has.

The 4080 has about 5% fewer shaders than the 3080 Ti, yet the 4080 is 33% faster; but the 4080 also has 50% higher official boost clocks. I don't know what the difference is in reality, but if it's only 30% then the L2 makes no difference at all. In raster.

We shouldn't think of the 4000 series' huge L2 cache increase in the same way as AMD's Infinity Cache; they are not the same thing. Nvidia need that huge L2 cache for RT.

IMO the 4070 will be around 30% faster than the 3070, like the 3080 Ti vs the 4080; it's only the clock speed difference that matters.
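A rough back-of-the-envelope version of that argument, using the official boost clocks; the ~2475MHz figure for the 4070 is an assumption taken from the rumoured spec, not a confirmed number:

```python
# With near-equal shader counts, the paper uplift is just the
# boost-clock ratio; the 4080 vs 3080 Ti case suggests real games
# only realise part of it.
def clock_uplift(new_mhz: float, old_mhz: float) -> float:
    return new_mhz / old_mhz - 1

paper_4080 = clock_uplift(2505, 1665)  # ~50% (official boost clocks)
realised = 0.33 / paper_4080           # ~65% of the paper gain showed up

paper_4070 = clock_uplift(2475, 1725)  # assumed 4070 boost vs 3070's official
print(f"4070 estimate: {paper_4070 * realised:.0%} faster than the 3070")
# -> roughly 28%, in line with the ~30% guess above
```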
 
It doesn't even run well on Nvidia cards! :cry:

Personally, at the rate things are going, I might just get a console for newer titles, and have a laptop/older desktop for older games and indie titles.
 
I do wonder the same thing sometimes; on the other hand, I do believe that AMD just don't think they can take market share from Nvidia without a loss in revenue.
 
PS6 in 2027; is that long enough for Intel ARC™ Celestial to be in the running? Rumours are Battlemage 2024, Celestial 2026. Both TSMC, so don't expect the saviour of mainstream PC gaming to come from Intel then...

The hardware for a Steam Deck is surely fairly easy. Getting tons of games certified is way harder. Plus, is the Asus one going to run Windows, so MS want their cut, or are they going to use SteamOS?
The Asus handheld uses Windows 11.
 
Surely with laptops it makes more sense to bundle their dGPUs with their own CPUs? That tells me maybe the AMD CPU and GPU divisions need to communicate a bit better with each other. After all, think of the cost savings to an OEM. Although the RX 7600 series being 6nm might indicate they want to sell a lot of these?
 
It runs absolutely fantastically on Nvidia. Actually, if you turn RT off it runs incredibly well on AMD as well, unlike AMD-sponsored games, which usually run like complete crap on Nvidia, because AMD is holier-than-thou as we all know.
 
BUT it will run fastest on their 5090, at double the frame rate of the 4090 and 4x the frame rate of the 3090... except the frame rate will be 24fps, AKA the DLSSX cinematic ultimate version, on the 5090, and of course it won't run on the 4090/3090/2080 Ti/AMD cards/Intel cards etc etc...


Nvidia is the only company I know that loves to shoot itself in the foot when they are onto a good thing... Also don't forget Moore's Law is dead... because they say so... while ASML and the other companies that make semiconductor equipment would disagree and laugh at that statement. BUT if Nvidia says it, it must be true; the same went for Intel when they used to say stuff back in the day, till they were caught with their pants down too.

Mate, it's just pure greed now from these companies, and it's so obvious it's incredibly rude to their customers.
 
I am guided by price/performance and I have other hobbies too. I am quite happy to play older games, etc., so in the end they can keep their overpriced hardware. Companies are not charities, but neither are we as consumers.
 