NVIDIA 4000 Series

We need better quality games before we buy expensive hardware.
Actually, I will disagree with this.

More people need to buy better hardware; once the PC space has a bigger install base of machines with adequate specs, developers can put a bigger budget in.

Price is the biggest issue here.

A bit of both - yes, better games should entice people to upgrade, but we need better-value, performant hardware to do so.

Remember Crysis? The 8800GTX? The Xbox 360 launched in late 2005, with the PS3 following a year later. The Xbox 360 had the first unified-shader dGPU in production, and the PS3 had an Nvidia 7900-series-based GPU. The 8800GTX launched a year later and made the X1950XTX/7950GT look weak.

A year later we had the 8800GT/8800GTS, which had most of the performance of the 8800GTX at lower resolutions, just in time for Crysis. The 8800GT was 40% of the price of the 8800GTX ($240 IIRC). The HD3850 brought the performance of the X1950XTX/7950GT to a bargain-basement price ($179). The consoles cost $400~$500.

So all of that happened within two years of that console generation launching. The 8800GT/HD3850 cost almost half as much as a console and were faster.

If this had not happened, far fewer people would have bothered upgrading to run Crysis.

We are coming up on three years since the current-generation consoles launched. The best we will have from the new generation under £450 are the RTX 4060 Ti/RTX 4060/RX 7600 XT 8GB cards. The consoles start at just under £400. Enough said.
 

Yes, it's a bit of both.

But I was referring more to the utter crap state games are being released in at the moment.
 

It's been happening for a few years now, especially with the whole early-access model. But there also seems to be a ton of complacency on the part of Nvidia/AMD/Microsoft. In the past the inefficiencies of the PC platform were hidden by the big jumps in performance; now we mostly get poor improvements, so the problems are being exposed. The GTX 1070, for example, was 61% faster at QHD than a GTX 970, with twice the VRAM (there was a bit of a price bump too). Now we are arguing over whether just over 40% extra performance (RTX 4070) for 60% more money over an RTX 3060 Ti is decent value, and you only get a 50% bump in VRAM too. We might be another year away from a console refresh as well, which is going to compound things.
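To put rough numbers on that comparison (purely illustrative - the performance uplifts are the approximate figures quoted above, and the price bumps are assumptions rather than measured launch prices):

```python
# Rough performance-per-pound comparison of the two generational jumps above.
# The uplift figures come from the post; the price bumps are illustrative assumptions.

def value_ratio(perf_uplift: float, price_uplift: float) -> float:
    """Perf-per-pound of the newer card relative to the older one."""
    return (1 + perf_uplift) / (1 + price_uplift)

# GTX 970 -> GTX 1070: ~61% more performance for, say, ~20% more money (assumed bump).
print(f"GTX 1070 vs GTX 970:     {value_ratio(0.61, 0.20):.2f}x perf/£")  # ~1.34x - more frames per pound
# RTX 3060 Ti -> RTX 4070: ~40% more performance for ~60% more money.
print(f"RTX 4070 vs RTX 3060 Ti: {value_ratio(0.40, 0.60):.2f}x perf/£")  # ~0.88x - fewer frames per pound
```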

It shows you the different mentality of Nvidia/ATI back then.
 
Let me just repeat that statement.

We need better quality games before we buy expensive hardware.

Of course better graphics sells hardware and we all want to play visually stunning games, but we also need games that deliver a quality gaming experience.

This year a lot of games are launching which are either crap (Redfall) or unoptimized messes (Jedi Survivor and TLOU).
 
Well I finally received my RTX 4070 FE after a bit of a faff with delivery days and a digital code being "out of stock" thus delaying the original delivery date.

Anywhoo, wall of pictures incoming, including more than one instance of the "wall of misfits.." :cry:

Can't fault the initial unboxing "experience".. maybe that's where all the extra cost has gone?? Funnily enough my ARC A750 LE also had a custom exterior box, only it didn't cost what this did..

6bUKktsl.jpg

Awww, they didn't mess up and send me a 4090FE by accident.. :(

n9Yp8ePl.jpg

Oooooh, it's like Honey, I Shrunk the 4090..

pWk2rhjl.jpg

Not going to lie, it is a very nice, and very small, graphics card. I kind of also want an AMD-built RX 6xxx/7xxx now to add to the collection.

q30hwP4l.jpg

Meet the Great Great Great Great (Great) Grandparents...

VcnpmeOl.jpg

Sandwiched between two misfits who want to learn the ways of playing DX9 natively...

(Also, this shot kind of highlights what a bizarre design the BiFrost actually is.)

3PASyell.jpg

Finally, the wall of Misfits..

f43GHEcl.jpg

First impressions are that it is an RTX 4070: it looks like a 4070, smells (?) like a 4070 and performs like a 4070. That is to say it performs roughly around a desktop 3080 and allows me to test out the feature set of the 4000 series without needing the GDP of a small island nation to act as collateral for the finance payments (just needed a couple of the principalities instead..). It runs cool (mid 60s in my NZXT Flow with a front 240mm AIO cooling the CPU and fans at minimal speeds), quiet, sips power (~180W) and boosts to around 2800MHz in CP2077. So it's literally a 4070..

Overall first impressions are positive, if we ignore the elephant that has just entered the room, eaten all the food and defecated on the sofa... the price.

What I am most interested in is seeing how far it can be pushed whilst also playing around with undervolting / power limits. It would be cool to see its performance uplift over my Legion's full-power, overclocked 3080M (3070/3070 Ti desktop performance) in the various games I play.
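For anyone wanting to log that sort of thing while tuning, a minimal read-only sketch (assuming the nvidia-ml-py / pynvml bindings are installed) could look like this - it only samples the figures quoted above (power draw, power limit and core clock) and changes nothing:

```python
# Minimal sketch: sample GPU power draw and graphics clock while a game runs.
# Assumes the nvidia-ml-py (pynvml) bindings are installed; read-only, no limits are changed.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)          # first GPU in the system

try:
    limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000  # milliwatts -> watts
    for _ in range(10):                              # ten one-second samples
        power_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000
        clock_mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        print(f"{power_w:6.1f} W of {limit_w:.0f} W limit, core {clock_mhz} MHz")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```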

I have only really messed around with CP2077 atm and its quite interesting tbh. First off it can happily run full Ultra + Path Tracing at my 3440x1440 with DLSS on balanced at around 45FPS. Not amazing but playable. Turn on DLSS3.0 and it bumps it up to a pretty solid 60FPS. Input latency is noticeable, which surprised me as normally I am able to ignore it but I am assuming that is down to the FPS and input latency for said FPS not aligning - i.e. what I am seeing and feeling are different to what I would expect (60FPS visually but ~40FPS actual). The feeling does subside after a while which is likely the result of getting used to the difference, kind of like how I can go back and play D1 @ 30FPS on console and not really notice after the first few minutes.
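That see/feel mismatch roughly tracks the frame times involved. A quick back-of-the-envelope sketch using the figures above (and ignoring the extra queueing overhead frame generation adds):

```python
# Frame-time comparison for the FPS figures quoted above (illustrative only).
def frametime_ms(fps: float) -> float:
    """Milliseconds per frame at a given frame rate."""
    return 1000.0 / fps

figures = [
    ("Presented with frame gen", 60),   # what the screen shows
    ("Effective input feel", 40),       # roughly what the latency corresponds to
    ("DLSS Balanced, no frame gen", 45),
]
for label, fps in figures:
    print(f"{label:28s}: {fps:3d} FPS -> {frametime_ms(fps):5.1f} ms/frame")
# The eye sees ~16.7 ms updates, but inputs still land on a ~22-25 ms cadence,
# which is the gap between what is seen and what is felt.
```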

Onto Hogwarts next where my goal is to hit 7900XT* native performance (I have my numbers to compare) by using DLSS. I was getting reasonably close in terms of overall feel, if not raw FPS, with the 3080M so this 4070 should get much closer. Obviously it isn't as fast as a 7900XT but it will be interesting to see how close it can get both in terms of performance and visuals. :)
 
Well, I've run a bit of Hogwarts to test it out and I am suitably impressed. No, it isn't a 7900 XT with its brute-force approach (and 335W power usage) to 100+ FPS, but with a little bit of DLSS and Frame Gen it is technically outperforming it, in raw FPS at least.

I will be completely honest, I couldn't even tell that Frame Gen was on apart from the extra 20 FPS or so it netted me. No noticeable change to input latency (confirming that what I experienced in CP2077 was a result of using DLSS 3.0 from a much lower starting point - 45ish vs 80+) and no obvious visual artefacts, at least no more than the game exhibits anyway. All at ~135-150W as well, which was nice.

Visually I couldn't see any difference with this vs 7900XT Native and the A770 running XeSS Quality (same render resolution as DLSS Quality). Maybe there is something there if you pixel peep but I don't have a tendency to, whilst playing, screenshot a game, blow it up to 800X and, with adenoids, notice how the refinement on the funnel edges on the MK5's is slightly better than the MK4's (and yes that is a reference)... :D

Oh and no coil whine, thank **** for that!
 
Can't fault the initial unboxing "experience".. maybe that's where all the extra cost has gone??

One thing you can't fault Nvidia on is the presentation and quality of the FE cards (including the packaging) - pictures don't do justice at all to just how premium the materials and design look and feel in person.
 
I think they will have a 4090 Ti at some point. Like last gen, though, they seem to bring it out really late, too near the next-gen launch.
That's my main gripe with the x90 Ti. It launches way too late in the cycle, to the point where you are just 6 months away from the next-gen x70 card matching it in performance.

I am still not that sure about the probability of a 4090 Ti. As it stands, the 4090 is currently in a league of its own. There aren't even any leaks of it like last time. In Cyberpunk, it's almost 30 FPS ahead of the 4080, which in itself is slightly faster than the 7900 XTX. What I can see them doing is just calling it a Titan and charging 3-4 grand for it.
They are like Apple, wanting to skimp on hardware as much as possible to increase margins.
Isn't everyone doing that though? Why does the 7900 XTX cost almost the same as the 4080 when it does not have the RT performance, does not support DLSS and has an inferior encoder?

While I agree the mid-range value seems garbage this gen, it's clear AMD and Nvidia have the sales data indicating their upselling strategy is working. Otherwise they wouldn't release such bad cards.
 
The RX 7900 XTX is priced high because the RTX 4080 is priced like a joke. 16GB for over £1000 - you can get that much VRAM for under £500 in 2023.

That's only twice the VRAM of a £250 RX 480 from 2016. These so-called high-end cards are a joke. Both should be under £800.

Also the need to use upscaling from day one shows you how weak these cards are. Neither of them can justify their pricing. Consoles need upscaling because they use weak and cheap hardware. It's not really something to brag about on £1000+ cards.

The irony is that the RTX 4090 is, as you say, the one that looks OK.

But even that is technically a worse tier of chip than the RTX 3090 was.

Wouldn't surprise me if both are found to be price fixing in the future.
 
Six Royal Gala apples from M&S should be £1.50, like they were a couple of years ago; they are now £3.

Home energy is up 3x... gfx cards are in line with everything else. Even second-hand stuff costs more than it did a couple of years ago.

If you don't need or want it, then, like anything, just don't buy it. NV & AMD aren't reading your posts and about to drop the prices like you'd hope. Only when the impending recession fully lands will prices come down, and relatively the £££ will be the same as we'll all have less buying power. No company or business cares about those that can't afford it. There's always the cheap end of the range.

The only people needing these cards are those with high-res & high-refresh-rate panels to run. The easy way to make PC gaming cheap is to cut your cloth accordingly and run a resolution you can afford to drive. Or get a console for <£500 that has 16GB of RAM (even though only 10-12GB of it is directly accessible for graphics).
 
Also the need to use upscaling from day one shows you how weak these cards are.
That's a bit debatable. On one hand, it depends on how you see these cards and what resolution they should be running at. For instance, the 4070 is more like a 1440p card, where it does fine natively. The 4080 also does decently at 4K. RT will be, for obvious reasons, a problem without upscaling unless you're playing at lower resolutions.
On the other hand, if games are more demanding, it's not that easy to hit the 60fps mark. Case in point: Cyberpunk at 4K, even in pure rasterization, is a tough nut to crack, where you need a 4090 to be comfortably above 60.

cyberpunk-2077-3840-2160.png



Just like back when Crysis still had something to say about video cards :)

crysis_2560_1600.gif
 
Inflation is relative to a country/industry, and there is also deflation happening on day-to-day products (not to mention things like RAM and storage); it's just that people don't focus much on it...

Same goes for GPUs and, most importantly, we don't know how or IF the BOM of the cards was affected (which shouldn't be that significant anyway). R&D was already done previously and other costs would also go down.
 
Still need to pay for the R&D on the pipeline stuff; they didn't send all the R&D people home, and they all have inflation - pretty sure the whole world is suffering inflation. You can only kick the can so far down the road since 2008; we have to pay for QE, the pandemic, etc. somehow. The poor will pay most.

At the end of the day, pointing to other countries and industries etc. is futile; we are talking about the UK, or Europe if you like (all suffering high inflation), and we are talking about the GPU industry. It's pointless trying to bring other industries into it just to put the blame back solely on profiteering with GPUs.

The mining boom and scalping brought high GPU prices. Shareholders will be saying: nope, we'll be having that profit from now on. If cards were scalpable and 3x profits could be made from a £650 card, there ain't a shareholder or business that won't adjust their prices to stop that.

Everyone moaned they couldn't get one; many were jumping up and down when retailers put a perceived extra £100 on top of 3080s before it went mad. The same people were moaning that AIBs weren't priced the same as the FE. It got worse as cards were going for £2k when mining took over.

Now we have cards available because they are priced high enough to make them non-scalpable - and people are still moaning.

We've been as rich as we are going to be for a good long while. Cheap credit and the lowest interest rates in 150 years, for a good 10 years. Large US banks are failing now. Unfortunately, it's only going to get worse - much worse. The trouble with young people is that they think this is the norm - well, wait for the revisit of 70s inflation. Look at Argentina if you want to see inflation; that's at 100%, I think I saw.
 