NVIDIA 4000 Series

I'm all for game devs pushing boundaries, look at my signature.

But not when it's to promote ever more expensive hardware.
It will only be playable on 4090 for now. When the 6000 series launches in 4 years, you would likely be able to play it on x70 class GPUs. Those who paid through the nose for the 4090 did so precisely to get a glimpse of the future. I don't see an issue with value buyers having to wait 4 years to get it playable. You get what you pay for
 
Not just PC gaming. Consoles as well; that's why we get new consoles / upgraded consoles / different versions of consoles (Series S and Series X, etc.).
Consoles are not forward-looking at all. Case in point: I have a PS5 and I still cannot run the Arkham trilogy at 60 fps even though it has the hardware to do it. RDR2 is still locked at 30 fps on PS5. Rockstar will likely charge £70 for unlocking the frame rate. The older Assassin's Creed games are still locked at 30 fps.

Developers actually charge money to unlock frame rates or add visual options on PS5 in most cases. Unlike PC, where for instance Cyberpunk was running sub-60 fps on my 3080 Ti but the 4090 brought me to 100-plus fps.
 
Star Citizen was to be released back in 2014. Can I play the game maxed out on my 2014 PC?

It became a playable game in 2017, and it's come a long way since then, keeping up with modern graphics, so no...

It will only be playable on 4090 for now. When the 6000 series launches in 4 years, you would likely be able to play it on x70 class GPUs. Those who paid through the nose for the 4090 did so precisely to get a glimpse of the future. I don't see an issue with value buyers having to wait 4 years to get it playable. You get what you pay for

I play it with very high settings 1440P on an RTX 2070 Super.
 
It became a playable game in 2017, and it's come a long way since then, keeping up with modern graphics, so no...
Exactly. Does that mean the developers are forcing me to buy a new GPU? No, it's just that in order to get better graphics, I need better hardware. Why is it bad when Nvidia does it? I don't understand.
 
Exactly. Does that mean the developers are forcing me to buy a new GPU? No, it's just that in order to get better graphics, I need better hardware. Why is it bad when Nvidia does it? I don't understand.
It's just evolving with what's current. Constantly updating a 2017 game so it still looks good in the current year can't be a bad thing, and it's perfectly capable of running on today's midrange hardware while still looking good.

And it doesn't use any GPU-vendor-specific proprietary tech at all. It does have image upscaling; it's using TAA, like Unreal Engine. I'm a huge fan of Unreal Engine too, as they develop their own alternatives to things like RT that are 100% hardware agnostic.

If you're sharp-eyed you can see the tell-tale signs of TAA ghosting, or trailing on fast-moving objects, like the ships here...

 
Isn't that the point of PC gaming for a long time now? Remember Crysis? Adding truly next gen features which provide a glimpse of what is to come and which incentivise you to actually buy new hardware to experience it?

I remember it, because it also came with a massive increase in entry-level and mainstream dGPU performance, plus we had £100 Core 2 CPUs which could be massively overclocked to match more expensive ones, using relatively cheap motherboards. In the past there were huge jumps in mainstream performance and price/performance, unlike now.

The 8800GT was almost the same performance at 1680x1050 as an 8800GTX on max settings with AA:

1680x1050 was probably the 1920x1080 of today. The 8800GTX only started to pull ahead at higher resolutions, as it had more VRAM. The 8800GTX was double the performance of the previous fastest X1950XTX/7900GTX in games such as Prey and Stalker.

The upper mainstream king of 2005/2006 was the X1950 PRO 256MB:

It was the same $250 RRP as the 8800GT. Within two years, the 8800GT tripled performance in Stalker and doubled the VRAM. I had the slightly slower X1900GT (the X1950 PRO was the updated version).

The 8800GTX had an RRP of $599, and the 8800GT was $249, so it had almost the same performance as the fastest card at under half the price, a year later. People didn't need to wait 4 years at the mainstream to get the improvements.

$599 in 2006 would equate to around $900 today, or around £865 in the UK with VAT added at current exchange rates. That would place the 8800GT 512MB at around £360 adjusted for inflation and exchange rates. The slightly faster 8800GTS 512MB was $299. Then you had the cheaper HD3850, HD3870 and 9600GT all offering decent performance jumps at lower pricing tiers. I had both the 8800GTS 512MB and HD3870 512MB GDDR4 myself, and used a £100-£120ish E4300, overclocked from 1.8GHz to over 3GHz!

What do we have for £400 now? A rubbish RTX 4060 or RX 7600 XT? Even the RTX 3060 Ti (which I got) only looked good because Turing was such a rubbish price/performance improvement. It seems that since AMD Hawaii/Nvidia Maxwell things started slowing down. Even with Pascal the lower tiers were not brilliant, but they were still competent.

The reality is that people need to stop defending the blatant shrinkflation which is happening at the entry-level/mainstream tiers.

If PCMR wants to stop moaning about why things are not moving forward in PC graphics, it needs decent improvements in entry-level/mainstream hardware EVERY generation. You need the numbers, and consoles are increasingly going to be part of that equation because they are comparable to or better than a lot of average gaming PCs, and are essentially PCs themselves.
 

More Nvidia comedy: "the cheapest" 4050 GPU laptop. This is what $1000-1100 buys you now, almost double the price of what used to be a $600-$650 laptop. Also DDR4 RAM and 6GB VRAM, and check the insides. You can't make this stuff up. :rolleyes:... Junk...

This is where the 4000 series GPUs go when we don't buy them because they are overpriced.

What we need to do now is stop buying crap like this.

I hear the Phoenix APUs are launching soon and they are very good...
 
adding excessively pointless extras to games to bring your OP GPU to its knees so you have to buy a new one next release

That's like complaining about having realistic physics or sound in a game just because it's demanding.
I'm all for game devs pushing boundaries, look at my signature.

But not when it's to promote ever more expensive hardware.

It still needs a powerful CPU and GPU, plus it can cripple performance for no good reason in the most basic scenes, like the one below where I've got 56 fps at 1080p with a 2080 looking at... nothing. Or around 66 fps looking at a wall...




PS: They will add ray tracing later on, so better get your money ready, as Nvidia will be selling something a lot better, since AMD is not trying that much on this front. ;)
 
That's like complaining about having realistic physics or sound in a game just because it's demanding.


It still needs a powerful CPU and GPU, plus it can cripple performance for no good reason in the most basic scenes, like the one below where I've got 56 fps at 1080p with a 2080 looking at... nothing. Or around 66 fps looking at a wall...




PS: They will add ray tracing later on, so better get your money ready, as Nvidia will be selling something a lot better, since AMD is not trying that much on this front. ;)

Look at that multithreading though. Have you ever seen thread load spread as evenly as that outside of SC? :)

They have talked about RT because people have asked about it. They said it's very unlikely they will do full-scale RT, because the sheer scale of the game makes that almost impossible, but they are looking into how and where they can use it.

They also need to optimise their proprietary Gen12 renderer. With the current 3.18 patch it's in, but they have only just turned the debug code off; over the next few patches it'll get optimised so it runs how it should. There are a lot of performance oddities because it's running raw right now.
 
Look at that multithreading though. Have you ever seen thread load spread as evenly as that outside of SC? :)

They have talked about RT because people have asked about it. They said it's very unlikely they will do full-scale RT, because the sheer scale of the game makes that almost impossible, but they are looking into how and where they can use it.

They also need to optimise their proprietary Gen12 renderer. With the current 3.18 patch it's in, but they have only just turned the debug code off; over the next few patches it'll get optimised so it runs how it should. There are a lot of performance oddities because it's running raw right now.
Multithreading is not bad indeed, but in that first screenshot a 5800X3D is at 69% load with nothing much going on, for 57 fps on a 2080... that's roughly 3080/4070 Ti territory, and it's CPU-bound.
Anyway, my point was that each game has its own issues. Some instances have a high demand on resources which is accounted for through better graphics, while others (like TLOU) do not.

Star Citizen is different, 'cause at least they try. I'm waiting for their final result. :)
 
adding excessively pointless extras to games to bring your OP GPU to its knees so you have to buy a new one next release

What if some people want to spend £1500 on whatever is technologically possible?

Why do you think people spend similar amounts on phones? Very expensive TVs? Even expensive monitors?
 