• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Do AMD provide any benefit to the retail GPU segment?

This is the pivot I've seen in this thread a few times now, and it raises an important question.

Why does this discussion belong in this thread if you abandon trying to pin this on AMD?

Surely this is manufacturer-agnostic and deserves its own thread for speculating that developers find it a lot easier to optimise for their fixed-hardware consoles than the mess that is PC.

Who said I'm pinning it on AMD? :confused:

The only title where I would say AMD have played a part is FC 6, but let's not go down that road again.....

I have always said that the problems being seen, as confirmed by Alex/DF and many others in the know, are 100% down to how the games have been ported from consoles to PC, i.e. the games have been designed with console architecture in mind: a unified memory system and a fully functional DirectStorage implementation. Until PC sees similar treatment, or the games are optimised to work better with the traditional PC setup, people will have to keep paying ££££ to brute-force past issues inherent to the game on PC.
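For anyone who wants the unified-memory point made concrete, here's a rough back-of-envelope sketch in Python (all the sizes are hypothetical, just to show the shape of the problem):

```python
# Rough back-of-envelope sketch (hypothetical numbers) of why a console
# port can lean harder on PC memory than the console's unified pool suggests.

assets_gb = 10.0          # hypothetical working set of textures/geometry

# Console: one unified pool, the asset lives in it exactly once.
console_pool_gb = 16.0
print(f"Console unified pool: {assets_gb:.1f} / {console_pool_gb:.1f} GB")

# Naive PC port: assets are staged in system RAM, then copied to VRAM.
# If the port keeps the staging copy resident (common in quick ports),
# the same data is paid for twice across two separate pools.
pc_ram_used = assets_gb   # staging copy in system RAM
pc_vram_used = assets_gb  # GPU-resident copy in VRAM
print(f"PC (naive port): {pc_ram_used:.1f} GB RAM + {pc_vram_used:.1f} GB VRAM")
```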
 
Except it's not a hardware limitation, it's a game/dev issue.... A 7900 XTX should be performing on par with a 3080/3090, not being beaten by a 3070 :) May very likely be down to Nvidia.....

But using the above logic, it's not a game/dev fault at all :)

Surely it is a game/dev fault if they've coded it purely to take advantage of one brand's features/strengths and disregarded another's.
 
Surely it is a game/dev fault if they've coded it purely to take advantage of one brand's features/strengths and disregarded another's.

Kind of the point I am getting at :) :p

But in the case of CP 2077, Nvidia would no doubt have provided "guidance"... What I will say, though, is that CP 2077 has incredible performance given how dense and open the world is with path tracing, at least compared to Portal RTX (although that was done via Remix) etc., but I digress :D
 
Like I've said many times before, I've got no problem having more VRAM "if" it is actually being put to proper use, rather than just to avoid/brute-force past issues inherent to how the game has been ported. Again, see TLOU when it released and watch the DF video: a similar-spec PC to the PS5 runs it worse in a number of ways, but apparently that's not a fault with the game.... :cry:

People shouldn't just be pushing Nvidia to provide more VRAM but also pushing developers to do a better job of porting/optimising for PC hardware, otherwise keep enjoying having to spend ££££ just to avoid/brute-force past issues....

If you can't see or understand that, then you're part of the problem with these widely acknowledged broken PC titles.
You are writing it in a very wishful way, but you can see what the reality of this is in the current gaming market. Namely, publishers cut costs on testing and optimisation, and it is what it is later. Unless gamers stop buying such games (they've proven many times over that's not going to happen), there are only two options:
1. Complain on forums with many excuses but still buy such games - so nothing is going to change aside from never-ending arguments.
2. Suck it up and get a GPU with more VRAM to combat this problem. But again, gamers buy what's on the market and don't push back on the vendor, so it is what it is.
 
Kind of the point I am getting at :) :p

But in the case of CP 2077, Nvidia would no doubt have provided "guidance"... What I will say, though, is that CP 2077 has incredible performance given how dense and open the world is with path tracing, at least compared to Portal RTX (although that was done via Remix) etc., but I digress :D
Ah I see, my sarcasm detection was low with that one :p
 
You are writing it in a very wishful way, but you can see what the reality of this is in the current gaming market. Namely, publishers cut costs on testing and optimisation, and it is what it is later. Unless gamers stop buying such games (they've proven many times over that's not going to happen), there are only two options:
1. Complain on forums with many excuses but still buy such games - so nothing is going to change aside from never-ending arguments.
2. Suck it up and get a GPU with more VRAM to combat this problem. But again, gamers buy what's on the market and don't push back on the vendor, so it is what it is.

Oh yeah, of course nothing will change. Just look at how many times PC releases come out in a **** state and get panned as being awful on PC; then devs come out and say the usual "we hear you, we are also working hard to ensure PC gamers get a good experience blah blah blah....." but fast-forward a couple of years and the same dev/publisher releases a game in another **** state :cry:

If UE 5 etc. do start using more VRAM then there will be a point where even 16GB on the likes of RDNA 2 won't be enough, due to the poor optimisation of ports to PC, and they probably still won't have DirectStorage on PC :o In which case, 24GB will become the minimum, and that may even be cutting it close..... in which case, pay another ££££ to avoid/brute-force past the issue.
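To put some rough numbers on why 16GB can evaporate quickly, here's a quick bit of VRAM arithmetic (illustrative assumptions only: BC7 at 1 byte per texel plus ~33% for the mip chain, and it ignores frame buffers, geometry, BVHs etc., which all eat into the same budget):

```python
# Illustrative VRAM arithmetic -- not engine numbers, just the rough shape.
# BC7 block compression stores 1 byte per texel; a full mip chain adds ~33%.

def texture_mib(width, height, bytes_per_texel=1.0, mip_overhead=1.333):
    return width * height * bytes_per_texel * mip_overhead / (1024 ** 2)

per_4k_texture = texture_mib(4096, 4096)           # ~21.3 MiB each
budget_gib = 16
textures_that_fit = budget_gib * 1024 / per_4k_texture
print(f"One 4096x4096 BC7 texture + mips: {per_4k_texture:.1f} MiB")
print(f"Fitting in {budget_gib} GiB (textures alone): {textures_that_fit:.0f}")
```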

Ah I see, my sarcasm detection was low with that one :p

We should go back to when PhysX was in games and see whose fault it was then ;) :p :D
 
I think the 4090 has pretty much the perfect horsepower/VRAM ratio, and to an extent the 4080 too; it's just a shame they're so stingy lower down the stack!
Yeah. If the 4080 was ~£760, which is about 3080 pricing once you take inflation into account, then it'd actually be looking like a reasonable deal. As it stands, the 4070 Ti is not nearly as attractive at close to that price point (even if they did originally want to call it a 4080!).
 
Yeah. If the 4080 was ~£760, which is about 3080 pricing once you take inflation into account, then it'd actually be looking like a reasonable deal. As it stands, the 4070 Ti is not nearly as attractive at close to that price point (even if they did originally want to call it a 4080!).
I must admit I was half expecting Nvidia to price-drop the 4080 into the $999 bracket at the 4070 launch, as that would make it nice and neat for the three cards ($599/$799/$999) and probably really boost 4080 sales, but nope, we got nada.
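For what it's worth, the inflation claim above roughly checks out; here's the arithmetic, assuming the £649 UK FE launch price and ~17% cumulative UK inflation since 2020 (both ballpark figures, not official numbers):

```python
# Sanity-checking the "~£760 is 3080 money after inflation" claim.
# Assumptions: £649 UK FE launch price (Sept 2020) and roughly 17%
# cumulative UK inflation between 2020 and 2023 -- ballpark only.

launch_price = 649
cumulative_inflation = 0.17
adjusted = launch_price * (1 + cumulative_inflation)
print(f"Inflation-adjusted 3080 price: ~£{adjusted:.0f}")  # ~£759
```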
 
I think the 4090 has pretty much the perfect horsepower/VRAM ratio, and to an extent the 4080 too; it's just a shame they're so stingy lower down the stack!

With the way things are going, I think it is going to start ******** the bed at some point too, especially when the refresh consoles arrive. It's already barely enough for UE 5 demos.
 
With the way things are going, I think it is going to start ******** the bed at some point too, especially when the refresh consoles arrive. It's already barely enough for UE 5 demos.
Tbf I'm not expecting to run UE5 games on anything more than 'High' to maintain a decent amount of fps. They're still gonna look *awesome* tbf!
 
Tbf I'm not expecting to run UE5 games on anything more than 'High' to maintain a decent amount of fps. They're still gonna look *awesome* tbf!

It's why I'm hesitant to upgrade ATM, because if UE5 ends up needing more VRAM then an 8GB/12GB VRAM dGPU might have problems. OFC, the RTX 5000 series with 16GB of VRAM at the lower end will be sold as the mainstream champion! :p
 
Tbf I'm not expecting to run UE5 games on anything more than 'High' to maintain decent amount of fps. They're still gonna look *awesome* tbf!
All the small UE5.1+ demos I've seen live on my 4090 worked just fine really (1440p UW) without DLSS and the like, whilst looking much better than CP2077 does with PT (Lumen and Nanite active). Just way better textures and assets, and Nanite makes a huge difference with foliage etc. - visually it beats PT (lighting) in such situations for me. Though there's of course Lumen as well, which is close enough to PT visually.

The creators of these also said they aren't optimised at all; it's literally drag-and-drop of high-resolution assets straight from Blender, no changes made. Lumen also makes lighting trivial. It just works, very easily. That seems to be the way forward for the gaming industry, rather than faffing around with RTX/DXR or raster tricks - it seems very easy for devs, looks great and performs decently on a wide array of hardware (unlike PT).
 
It's why I'm hesitant to upgrade ATM, because if UE5 ends up needing more VRAM then an 8GB/12GB VRAM dGPU might have problems. OFC, the RTX 5000 series with 16GB of VRAM at the lower end will be sold as the mainstream champion! :p
I would expect that to be fixed with DirectStorage, which is also supposed to work very well with UE5.1+. That will take a lot of the optimisation work away from devs, allowing the engine to stream textures and assets in and out easily and seamlessly. In theory :)
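To illustrate the idea (and to be clear, this is a conceptual sketch, not the actual DirectStorage or UE5 API), streaming boils down to keeping a bounded VRAM budget and evicting the least recently used assets, something like:

```python
from collections import OrderedDict

class StreamingCache:
    """Toy LRU texture cache standing in for an engine's streaming pool."""

    def __init__(self, budget_mib):
        self.budget = budget_mib
        self.used = 0
        self.resident = OrderedDict()  # asset_id -> size_mib, in LRU order

    def request(self, asset_id, size_mib):
        if asset_id in self.resident:             # already in VRAM
            self.resident.move_to_end(asset_id)   # mark most recently used
            return
        # Evict least recently used assets until the new one fits.
        while self.resident and self.used + size_mib > self.budget:
            _, evicted_size = self.resident.popitem(last=False)
            self.used -= evicted_size
        # A real engine would queue an asynchronous disk->VRAM load here
        # (e.g. via DirectStorage) instead of loading synchronously.
        self.resident[asset_id] = size_mib
        self.used += size_mib

cache = StreamingCache(budget_mib=8192)        # pretend 8 GiB texture budget
for asset in ["rock_a", "tree_b", "rock_a", "cliff_c"]:
    cache.request(asset, size_mib=21)          # ~one 4K BC7 texture each
```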
 
Colorful confirms the 4060 Ti is 8GB

 
I would expect that to be fixed with DirectStorage, which is also supposed to work very well with UE5.1+. That will take a lot of the optimisation work away from devs, allowing the engine to stream textures and assets in and out easily and seamlessly. In theory :)

I just hope all the people who bought QLC DRAMless SSDs or SATA SSDs don't start complaining! :o

I always told people to make sure they spent the extra and got a decent SSD in the first place!

All the small UE5.1+ demos I've seen live on my 4090 worked just fine really (1440p UW) without DLSS and the like, whilst looking much better than CP2077 does with PT (Lumen and Nanite active). Just way better textures and assets, and Nanite makes a huge difference with foliage etc. - visually it beats PT (lighting) in such situations for me. Though there's of course Lumen as well, which is close enough to PT visually.

The creators of these also said they aren't optimised at all; it's literally drag-and-drop of high-resolution assets straight from Blender, no changes made. Lumen also makes lighting trivial. It just works, very easily. That seems to be the way forward for the gaming industry, rather than faffing around with RTX/DXR or raster tricks - it seems very easy for devs, looks great and performs decently on a wide array of hardware (unlike PT).
I do like the fact UE5 is trying to take a balanced approach and incorporate a mix of different technologies. Epic know very well they need to make it work on a range of PC hardware and consoles too. When Nvidia/AMD get too involved, it's mostly so they can sell more of their higher-end dGPUs, so you can end up with suboptimal results.
 
Colorful confirms the 4060 Ti is 8GB


So an RTX 3070 8GB for RTX 3070 8GB pricing?
 
You are writing it in a very wishful way, but you can see what the reality of this is in the current gaming market. Namely, publishers cut costs on testing and optimisation, and it is what it is later. Unless gamers stop buying such games (they've proven many times over that's not going to happen), there are only two options:
1. Complain on forums with many excuses but still buy such games - so nothing is going to change aside from never-ending arguments.
2. Suck it up and get a GPU with more VRAM to combat this problem. But again, gamers buy what's on the market and don't push back on the vendor, so it is what it is.

3. Wait 6-12 months before buying the game, when it's fixed, cheaper, and runs on cards that struggled in the beta.

You only need to buy a much more expensive graphics card if you want to pay MSRP to be a beta tester.
 
I just hope all the people who bought QLC DRAMless SSDs or SATA SSDs don't start complaining! :o

I always told people to make sure they spent the extra and got a decent SSD in the first place!
QLC DRAMless SSDs will be fine for gaming; QLC is an issue for write speeds, not read speeds.
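If anyone wants to sanity-check their drive before the DirectStorage era arrives, here's a quick-and-dirty sequential read test in Python (illustrative only: a proper benchmark would bypass the OS cache with direct I/O, and the file path is a hypothetical placeholder to swap for a large file of your own, e.g. a game archive):

```python
# Quick-and-dirty sequential read check. Compare the result against the
# drive's rated sequential read speed; run on a freshly booted system or a
# file bigger than RAM to reduce OS-cache skew.
import time

path = "some_large_file.bin"  # hypothetical path -- substitute your own
chunk = 8 * 1024 * 1024       # 8 MiB reads
total = 0
start = time.perf_counter()
with open(path, "rb") as f:
    while data := f.read(chunk):
        total += len(data)
elapsed = time.perf_counter() - start
print(f"{total / 1024**2:.0f} MiB in {elapsed:.2f}s "
      f"= {total / 1024**2 / elapsed:.0f} MiB/s")
```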
 