That explains it. I think it's a great looking game; I play at 1440p as I can't run it in 4K, but it looks fantastic for its age, and they have significantly improved the graphics over the years.
You can't compare a single-player game to a multiplayer game, especially a multiplayer game where everything is player-built.
One of the biggest issues is lighting: you have to restrict how good the lights look because they can severely impact FPS depending on the system. With Cyberpunk, the only good thing about the way it looks is the lighting; everything else is a bit meh, in all honesty.
For him to compare a 500+ player game, which has to restrict lighting because that would have the biggest impact, with a game that literally specialises in lighting (you could argue it's the only good-looking thing about it) is just a ridiculous argument.
It is a good looking game, yes. I was surprised when researching its release date; I thought it was much newer than 10 years old. I've only been playing it on and off for a couple of years.
I like the servers that are friendly for a couple of weeks, the ones that allow you to build your little man cave or small town... before it all kicks off and it's every man/woman for themselves. A good fun game; some of the constructions people build are massive.
I agree with you on Cyberpunk, said it myself: it has beautiful lighting, but that's everything in that game visually. It's Nvidia's showcase for all their RTX work; if not for that, yeah, it looks pretty naff.
So it's just what you have in the BIOS? I'd just do it in the BIOS.
Asus have something similar in their software suite I believe (or used to, I've not installed it recently), but I didn't care for that either; I just wanted the fan control software.
I'm sure everyone always said to do tweaking in the BIOS. Not sure why that would be any different now that it's AMD software instead of Asus or whoever.
Well, there ^^^^ it is: everything you see there is settings. Your curves are here.
Not off topic: GPUs, CPUs, it's all relevant to the question.
A dev on reddit has confirmed the practice. But he also explained it a bit more, so it's not just laziness.
Look at the VRAM in the OSD on this, 120Hz. Keep your eye on the system RAM as I approach the scrapyard: the VRAM is already full, it needs more, so where does it put it? System RAM. Keep your eye on it and watch the frame rate go from 100-120 down to 30.
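If anyone wants to log the same thing outside an OSD, here's a minimal sketch using pynvml (this assumes an NVIDIA card and the nvidia-ml-py package; the one-second poll interval is just an arbitrary choice, not anything the game or an overlay actually uses):

```python
# Minimal sketch: poll dedicated VRAM use once a second so you can watch it
# fill up, the same thing the OSD in the video shows.
# Assumes an NVIDIA card and the nvidia-ml-py package (pip install nvidia-ml-py).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # values are in bytes
        print(f"VRAM used: {mem.used / 1024**3:.2f} / {mem.total / 1024**3:.2f} GiB")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```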
RTX 2070 Super.
He is not the first dev to say this, and he will not be the last; I have been saying it for at least 2 years.
A dev on reddit has confirmed the practice. But he also explained it a bit more, so it's not just laziness.
As we know, UE is now dominant in the AAA industry, and this engine utilises streaming of textures whilst playing.
The consoles have lots of hardware optimisations which make the overhead of loading textures really low, hence either reduced or no stutters (they do of course also have a better VRAM-to-TFLOPS balance).
On the PC this is a problem, of course, but according to this dev, if they used system RAM more instead of VRAM, which is an option on PC, it would be even worse; unless of course VRAM is being saturated, in which case using system RAM would improve things.
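To put some very rough numbers on why that spill-over hurts so much, here's a back-of-envelope sketch; the bandwidth figures are ballpark and the per-frame texture traffic is a made-up assumption, purely for illustration:

```python
# Back-of-envelope: time spent just moving texture data, from VRAM vs from
# system RAM over PCIe. All numbers are rough, illustrative assumptions.
VRAM_GBPS = 448.0       # GDDR6 on a 256-bit bus, roughly RTX 2070 Super class
PCIE3_X16_GBPS = 16.0   # practical ceiling when textures sit in system RAM

def transfer_ms(gb_per_frame: float, bandwidth_gbps: float) -> float:
    """Milliseconds needed just to move that much texture data each frame."""
    return gb_per_frame / bandwidth_gbps * 1000.0

gb_per_frame = 0.5  # assume the renderer touches ~0.5 GB of textures per frame
print(f"from VRAM:       {transfer_ms(gb_per_frame, VRAM_GBPS):.1f} ms")
print(f"from system RAM: {transfer_ms(gb_per_frame, PCIE3_X16_GBPS):.1f} ms")
# roughly 1 ms vs 31 ms per frame -- about the 120fps-to-30fps collapse described above
```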
He didn't really offer a solution, though, other than that DirectStorage will improve things.
The take, in my opinion, from what he said is that they choose to use a solution that works best for cards with decent VRAM, and it's the usual "upgrade" answer for those who have VRAM starvation.
Of course, time saving will still be a part of this in my opinion, as it's quicker to port a game if you don't have to tinker with the memory allocation code.
--
Personally I wish UE would just go away, but sadly it's getting more common if anything. I have played low-end games using UE4 and they still had dodgy performance; it just seems a horrid engine.
Also, a few other devs who responded spoke about UE4, and they admitted the engine has practically no built-in optimisation. Most of these comments were on a thread about the new Star Wars game.
He is not the first dev to say this, and he will not be the last; I have been saying it for at least 2 years.
UE is now dominant in the industry because it's nothing short of brilliant. And it's not going to get any better; every other engine developer is going to want to emulate the technology.
Live texture streaming and Object Container Streaming first appeared, to my knowledge, in Star Citizen in 2016. It's the only way to get seamless transitions from space to the surface of planets, especially in a multiplayer environment where you have crew mates in the back of your ship doing their own thing, so you can't level-load.
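As a toy illustration of the idea (none of this is CIG's actual code; the names, radii and timings are invented), container streaming boils down to loading and unloading chunks of the world asynchronously by distance instead of ever doing a level load:

```python
# Toy sketch of distance-based object container streaming. Containers (a planet
# surface tile, a station interior, your ship's cargo bay) stream in and out
# around the player without a loading screen. Everything here is illustrative.
import asyncio
from dataclasses import dataclass

@dataclass
class Container:
    name: str
    position: tuple[float, float, float]
    stream_radius: float   # start loading once the player is inside this distance
    loaded: bool = False

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

async def update_container(c: Container, player_pos):
    wanted = distance(player_pos, c.position) < c.stream_radius
    if wanted and not c.loaded:
        await asyncio.sleep(0.1)   # stand-in for async disk read + GPU upload
        c.loaded = True
        print(f"streamed in  {c.name}")
    elif not wanted and c.loaded:
        c.loaded = False
        print(f"streamed out {c.name}")

async def tick(containers, player_pos):
    # kick off the streaming work concurrently so the simulation never blocks on it
    await asyncio.gather(*(update_container(c, player_pos) for c in containers))

containers = [
    Container("planet_surface_tile", (0.0, 0.0, 0.0), 5_000.0),
    Container("orbital_station", (0.0, 0.0, 40_000.0), 8_000.0),
]
asyncio.run(tick(containers, player_pos=(0.0, 0.0, 1_000.0)))
```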
Then Sony, on the PS5, with Ratchet and Clank, again, seamless, no level loading.
When the CIG Frankfurt office cracked this, after working on the tech for about 2 years, they quietly put out a little celebration video.
And in game in 2020.
This has been coming for years, and frankly PCMR dGPUs are being left behind by game consoles, and game developers have just had enough of being strangled by that. Even CIG, making a PC-exclusive game, are saying you're going to have to run proper hardware to play their game, because they can't make the game they want to make for middling dGPUs. They do try to keep it running OK on 8GB hardware, but they have talked about how difficult a task that is; it's a lot of time and effort. It runs better with a 12GB and certainly a 16GB GPU and a fast NVMe.
I don't do it just because I want to hate on Nvidia; I have a project in UE5 that's on hold until I get a GPU with more VRAM, because 8GB just won't do it.
8GB isn't even low end these days; that's 12GB now, with mid range at 16GB and higher end at 20GB at least. There is no reason for Nvidia or AMD not to do that, as VRAM costs them peanuts; the only reason they would do this is planned obsolescence. The RTX 3070 and 3080 are exactly that, and as a PC enthusiast I do not take kindly to BS like that. Having to pay hundreds and hundreds of £ for these things, I take that personally. I feel like I'm being manipulated and ripped off by some cynically disrespectful people.
We should all call them out on it and demand better, because right now PCMR is a pee-taking joke, and it's you and me they are taking the pee out of...
Isn't Star Citizen using an evolved version of CryEngine?
They started out using CryEngine, but it's not CryEngine anymore; it's so heavily modified there is literally nothing left of it. It's their own in-house engine now.
They haven't used the CryEngine logo on the splash screen for years.
-----------
In order for technology to progress, someone has to take the lead. Developers are not waiting on PC hardware vendors anymore; they are forcing them to catch up, and if they don't, we will all be running game consoles.
Looks like AMD is joining in with Nvidia pushing up tiers:
AMD confirms mainstream Radeon 7000 GPU launch this quarter, RX 7950XT spotted in ROCm pull request - VideoCardz.com
Ironically, it looks like games consoles are pushing more of the innovation now, i.e., using SSDs properly.
That and more, yes.
I really hope AMD isn't stupid enough to replace the RX6700XT with essentially an overclocked RX6650XT with a bit more VRAM.
So do I...
Because if Nvidia end up allowing AIB partners to make 16GB versions of the RTX 4060/RTX 4060 Ti, that will be the end of AMD mainstream dGPU sales.