• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Do AMD provide any benefit to the retail GPU segment?

That explains it. I think it's a great-looking game; I play at 1440p as I can't run it at 4K, but it looks fantastic for its age and they have significantly improved the graphics over the years.

You can't compare a single-player game to a multiplayer game, especially a multiplayer game where everything is player-built.

One of the biggest issues is lighting: you have to restrict how good it looks, as it can severely impact FPS depending on the system. With Cyberpunk, the only good thing about the way it looks is the lighting; everything else is a bit meh in all honesty.

For him to compare a 500+ player game, which has to restrict lighting because that is what would hurt performance most, with a game that literally specialises in lighting (arguably the only good-looking thing about it) is just a ridiculous argument.

It is a good-looking game, yes. I was surprised researching its release date; I thought it was much newer than 10 years old. I've only been playing it on and off for a couple of years.

I like the servers that are friendly for a couple of weeks, the ones that let you build your little cave or small town... before it all kicks off and it's every man/woman for themselves. A good fun game; some of the constructions people build are massive.

I agree with you on Cyberpunk, I've said it myself: it has beautiful lighting, but that's everything in that game visually. It's Nvidia's showcase for all their RTX work; if not for that, yeah, it looks pretty naff.
 

Yeah, Rust is a blast, even better if you have a team as well.

But it never used to look that good; back in the day it looked significantly worse. I came back a few years later and had one of those "wait, did this game always look this good?" moments. In 4K, if you can run it, at the right time of day you can get some beautiful views for a game of that age.

100% agree with you there.
 
Well, there ^^^^ it is; everything you see there is settings. Your curves are here.

Not off topic; GPUs, CPUs, it's all relevant to the question. :)

WhEutpr.png
So it's just what you have in the BIOS? I'd just do it in the BIOS.
Asus have something similar in their software suite I believe (or used to, I've not installed it recently) but I didn't care for that either, I just wanted the fan control software.
I'm sure everyone always said to do tweaking in the BIOS. Not sure why that would be any different now that it's AMD software instead of Asus or whoever.
 

Yeah, and I agree with that, but I have used it for quick and dirty tuning to test in the desktop environment.

So it's useful, but you're right, it's not what one should use to do it properly. More than anything I appreciate the sentiment: it costs money and time, they didn't have to do it, and others don't.
It's like 1usmus; the free tools he spends his time making come from enthusiasm for the subject. He does this because of the type of person he is, someone I would quite like to have a drink with and talk nerd to, knowing that's all good; he's on my level.
So while AMD making these tools might not be ultimately useful, and AMD probably know this themselves, they still do it because they are that kind of people, so to me, that's all good.
 
Look at the VRAM in the OSD on this, 120Hz. Keep your eye on the system RAM as I approach the scrapyard: the VRAM is already full, it needs more, so where does it put it? System RAM. Keep your eye on it and watch the frame rate go from 100-120 down to 30.

RTX 2070 Super.
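
To put rough numbers on why it collapses like that: these are approximate datasheet figures rather than anything measured from the clip, but the bandwidth gap alone tells the story.

```python
# Back-of-envelope: why spilling textures out of VRAM hurts so much.
# Figures are approximate published specs, not measurements.

gddr6_bandwidth_gbs = 448.0   # RTX 2070 Super: 14 Gbps GDDR6 on a 256-bit bus
pcie3_x16_gbs       = 15.75   # PCIe 3.0 x16 theoretical peak, real world is lower

ratio = gddr6_bandwidth_gbs / pcie3_x16_gbs
print(f"Local VRAM is roughly {ratio:.0f}x faster than fetching over PCIe")
# -> roughly 28x. Any texture data that has to be pulled from system RAM
#    every frame stalls the GPU, hence the drop from ~120fps to ~30fps.
```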

A dev on Reddit has confirmed the practice, but he also explained it a bit more, so it's not just laziness.

As we know, UE is now dominant in the AAA industry, and the engine streams textures while you play.

The consoles have lots of hardware optimisations which make the overhead of loading textures really low, hence reduced or no stutters (they do of course also have a better VRAM-to-TFLOPS balance).

On PC this is a problem of course, but according to this dev, if they used system RAM more instead of VRAM (which is an option on PC) it would be even worse, unless VRAM is being saturated, in which case using system RAM would improve things.

He didn't really offer a solution though, other than that DirectStorage will improve things.

My take from what he said is that they choose the solution that works best for cards with decent VRAM, and it's the usual "upgrade" answer for those suffering VRAM starvation.

Of course time saving will still be part of this in my opinion, as it's quicker to port a game if you don't have to tinker with the memory allocation code.

--

Personally I wish UE would just go away, but sadly it's getting more common if anything. I have played low-end games using UE4 and they still had dodgy performance; it just seems a horrid engine.

A few other devs who responded also spoke about UE4, and they admitted the engine has practically no built-in optimisation. Most of these comments were on a thread about the new Star Wars game.
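
Just to illustrate the trade-off that dev is describing, here's a toy sketch. It is not how UE actually implements texture streaming and every name in it is made up: a streamer keeps the sharpest mips that fit a VRAM budget, and anything that doesn't fit either drops quality or spills into system RAM (the slow PCIe path that causes the stutter). DirectStorage mainly speeds up the disk-to-GPU path; it doesn't add VRAM.

```python
# Illustrative sketch of a texture streaming budget decision, NOT UE's actual code.
from dataclasses import dataclass

@dataclass
class Texture:
    name: str
    mip_sizes_mb: list  # size of each mip cut, sharpest (largest) first

def plan_streaming(textures, vram_budget_mb):
    """Greedily assign the sharpest mip that still fits the remaining budget."""
    plan, used = {}, 0.0
    for tex in textures:
        chosen = None
        for size in tex.mip_sizes_mb:          # try the sharpest mip first
            if used + size <= vram_budget_mb:
                chosen, used = size, used + size
                break
        if chosen is None:
            chosen = tex.mip_sizes_mb[-1]      # fall back: lowest mip, or spill
            used += chosen                     # over budget -> system RAM / stutter
        plan[tex.name] = chosen
    return plan, used

textures = [Texture("rock_albedo", [64, 16, 4]),
            Texture("rock_normal", [64, 16, 4]),
            Texture("skybox",      [128, 32, 8])]
print(plan_streaming(textures, vram_budget_mb=120))
# On a card with a bigger budget every texture keeps its top mip;
# on a starved card quality drops or data spills, which is exactly the complaint.
```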
 
He is not the first dev to say this and he will not be the last; I have been saying it for at least two years.

UE is now dominant in the industry because it's nothing short of brilliant. And it's not going to get any better; every other engine developer is going to want to emulate the technology.

Live texture streaming and object container streaming first appeared, to my knowledge, in 2016 in Star Citizen. It's the only way to get seamless transitions from space to the surface of planets, especially in a multiplayer environment where you have crewmates in the back of your ship doing their own thing, so you can't level-load.
Then Sony did it on the PS5 with Ratchet and Clank: again, seamless, no level loading.
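
CIG and Sony haven't published that code of course, so this is just a toy sketch of the general idea with made-up names: the world is split into containers that are loaded and unloaded around each player in the background, so there is never a level load.

```python
# Toy sketch of the object container streaming idea, NOT CIG's or Sony's code.
import math

class Container:
    def __init__(self, name, position):
        self.name, self.position, self.loaded = name, position, False

def stream(containers, player_pos, load_radius, unload_radius):
    """Mark containers for (un)loading by distance to the player.
    A real engine would enqueue background I/O jobs here, never block the frame."""
    for c in containers:
        dist = math.dist(c.position, player_pos)
        if not c.loaded and dist < load_radius:
            c.loaded = True          # async load kicks off as the player approaches
        elif c.loaded and dist > unload_radius:
            c.loaded = False         # async unload once the player is well clear

world = [Container("space_station", (0, 0)),
         Container("planet_orbit",  (900, 0)),
         Container("landing_zone",  (1000, 50))]

stream(world, player_pos=(950, 20), load_radius=200, unload_radius=400)
print([(c.name, c.loaded) for c in world])
# planet_orbit and landing_zone stream in around the player; the distant
# space_station stays unloaded, with no loading screen at any point.
```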

When the CIG Frankfurt office cracked this, after working on the tech for about two years, they quietly put out a little celebration video.


And in game in 2020.


This has been coming for years, and frankly PCMR dGPUs are being left behind by game consoles, and game developers have just had enough of being strangled by that. Even CIG, making a PC-exclusive game, are saying you're going to have to run proper hardware to play it, because they can't make the game they want to make for middling dGPUs. They do try to keep it running OK on 8GB hardware, but they have talked about how difficult a task that is; it's a lot of time and effort. It runs better with a 12GB, and certainly a 16GB, GPU and a fast NVMe.

I don't say this just because I want to hate on Nvidia; I have a project in UE5 that's on hold until I get a GPU with more VRAM, because 8GB just won't do it.
8GB isn't even low end these days; low end should be 12GB, mid range 16GB, higher end 20GB at least. There is no reason for Nvidia or AMD not to do that, as VRAM costs them peanuts; the only reason they would do this is planned obsolescence. The RTX 3070 and 3080 are exactly that, and as a PC enthusiast I do not take kindly to BS like that. Having to pay hundreds and hundreds of pounds for these things, I take it personally. I feel like I'm being manipulated and ripped off by some cynically disrespectful people.

We should all call them out on it and demand better, because right now PCMR is a pee-taking joke, and it's you and me they are taking the pee out of...
 
Just watched a bit of DF's review of the Star Wars game on the PS5; it seems the devs got a bit obsessed with RT and force it on in both modes. Without it, the resolution mode would probably manage a 30fps lock. I feel RT is ruining gaming :/.

But regardless, I would be embarrassed to release a game that tanks to 20fps; the industry is in a bad spell.

Personally I have little issue with loading screens; once we had SSDs they came down to acceptable durations.

UE4, I can imagine, is seen as amazing feature-wise by devs, but I do think it has severe problems, which erupt under VRAM starvation (Nvidia *cough*).
 

Isn't Star Citizen using an evolved version of CryEngine?
 

They started out using CryEngine, but it's not CryEngine anymore; it's so heavily modified there is practically nothing left of it, it's their own in-house engine now.
They haven't used the CryEngine logo on the splash screen for years.
-----------

In order for technology to progress someone has to take the lead. Developers are not waiting on PC hardware vendors anymore, which forces the vendors to catch up; if they don't, we will all be running game consoles.
 
Looks like AMD is joining Nvidia in pushing up the tiers:

FXhm0hI.jpeg

7kd3rj.jpg


Ironically, it looks like games consoles are pushing more of the innovation now, i.e. using SSDs properly.
 

That and more, yes.
 
Because if Nvidia end up allowing AIB partners to make 16GB versions of the RTX 4060/RTX 4060 Ti, that will be the end of AMD mainstream dGPU sales.

Are they? I'm glad of that. At the very least they should allow AIBs to double the VRAM if they so choose. My only concern is pricing: I'm OK with them charging a bit more for higher-capacity VRAM GPUs, but I don't want to see pricing go daft; there is no reason why they should cost a lot more.

To answer your point, yes.
 