If next year's consoles do 4K/60...

I use DSR to run 3840x2160 in any game at 240Hz. So I can play Rocket League and Overwatch pin sharp at roughly 91 PPI (about the same as a 50-inch 4K panel) at 240fps, with no blur and no lag, and I'm about ten years ahead of consoles, bar ray tracing. My only downside is that the panel is a TN and not an OLED.
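
For reference, here is a rough sketch of the PPI arithmetic behind that comparison; the 48" and 50" diagonals below are just illustrative assumptions, not claims about any particular monitor:

import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    # Pixels per inch for a given resolution and diagonal size.
    return math.hypot(width_px, height_px) / diagonal_in

# 3840x2160 over a ~48-50" diagonal lands in the high-80s to low-90s PPI,
# which is the ballpark the post above is comparing against.
print(round(ppi(3840, 2160, 48), 1))  # ~91.8
print(round(ppi(3840, 2160, 50), 1))  # ~88.1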

You can get a minimum of 240fps at 4K in any game? Are you from the future?
 
If all you do is game then you're in the wrong forum.

I don't give a roo's poo about consoles. They could be the best gaming experience ever and I still wouldn't have one, as all you can do is game. Every time I've dabbled with a console it only lasted a month or two and then went. Boring.
I do so much more with a PC than you can do on a console, and it's always been at least a step ahead gaming-wise.

That's just my personal opinion.

Why? I overclock my CPU and 99% of my PC activity is gaming (mostly games that aren’t available on console).

I’ve used this (specifically Graphics card) forum for years to keep abreast of the GPU market.
 
I played Assassin's Creed at 4K on my PC with medium-to-high settings, and I also played it at 1440p with max settings. To be honest, I could barely see the difference; the higher resolution seemed to counter the lower graphics settings.

At 4K on a TV, the difference between medium and max settings isn't exactly night and day.

Then the console is probably a good option for you. Save some money and move on to the next-gen console :)
 
Resolution and frame rate do not automatically equal quality. Look at the differences in console vs PC graphics for things like missing artifacts, substantially less infill and detail, worse-quality shadows and models, and the multiplayer version being significantly worse than single-player in quality... to name a few.
 
I'm still using an R9 290 GPU and I have just upgraded my screen to a 49" 3840x1080. I can run a few games almost maxed out at 3840x1080, smoothly enough to play. I was impressed.
 
Resolution and frame rate do not automatically equal quality. Look at the differences in console vs PC graphics for things like missing artifacts, substantially less infill and detail, worse-quality shadows and models, and the multiplayer version being significantly worse than single-player in quality... to name a few.

That is true, but they matter up to the point where your screen's pixels become invisible and the blur goes. So framerate is important, and framerate simply removes blur. A lot of people think otherwise because they have never experienced a zero-blur image; you have to bypass the current generation and go back to the days of CRT. And sadly this means 240Hz as a minimum. I tested it: strobed 120Hz works as well as native 240Hz, but you lose sync technology to do that.
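
As a rough illustration of the blur argument on a sample-and-hold panel (the 2000 px/s panning speed is just an assumed example, not a measurement):

def persistence_ms(refresh_hz: float) -> float:
    # Sample-and-hold: each frame stays lit for the full refresh interval.
    return 1000.0 / refresh_hz

def blur_px(refresh_hz: float, motion_px_per_s: float) -> float:
    # Approximate smear width in pixels while eye-tracking a moving object.
    return motion_px_per_s * persistence_ms(refresh_hz) / 1000.0

# Assumed example: an object panning across the screen at 2000 px/s.
for hz in (60, 120, 240):
    print(f"{hz} Hz -> ~{blur_px(hz, 2000):.1f} px of smear")
# 60 Hz -> ~33.3 px, 120 Hz -> ~16.7 px, 240 Hz -> ~8.3 px
# Strobing a 120 Hz panel lights each frame for far less than the full
# refresh interval, which is why it can look as clear as native 240 Hz.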

And playing these Witchers and Cyberpunks BEFORE taking this step, like a lot of current console owners did, is the same as viewing any work of art in poor conditions. Why, you might as well go and see the Mona Lisa with myopia-strength milk-bottle glasses on, compared to looking at the same work of art up close with 20/20 vision.
 
Resolution and frame rate do not automatically equal quality. Look at the differences in console vs PC graphics for things like missing artifacts, substantially less infill and detail, worse-quality shadows and models, and the multiplayer version being significantly worse than single-player in quality... to name a few.

Unfortunately people seem to obsess so much over the shadows and textures that they forget about the game itself.

Most of the highly rated games of the last five years seem to be console titles. I've always gamed on my PC because I have a GTX 1080 and it's no slouch at 1440p, but I'd be a fool if I thought my PC was the amazing platform it once was.

As the gap in fidelity closes, the PC becomes harder to justify. We are all pretty aware of how minor the visual differences can be between high and ultra settings, even though they cost so many frames.
 
That is true, but they matter up to the point where your screen's pixels become invisible and the blur goes. So framerate is important, and framerate simply removes blur. A lot of people think otherwise because they have never experienced a zero-blur image; you have to bypass the current generation and go back to the days of CRT. And sadly this means 240Hz as a minimum. I tested it: strobed 120Hz works as well as native 240Hz, but you lose sync technology to do that.

And playing these Witchers and Cyberpunks BEFORE taking this step, like a lot of current console owners did, is the same as viewing any work of art in poor conditions. Why, you might as well go and see the Mona Lisa with myopia-strength milk-bottle glasses on, compared to looking at the same work of art up close with 20/20 vision.

What made The Witcher so well regarded? Was it the graphics, or was it the immersive storyline? I have friends who played it on both console and PC, and both of them chatted about the story. They didn't spend too long willy-waving over who saw it look its prettiest.

This is where the PC master race thing comes in. People get so defensive of their GPU and forget that developers tend to make games for people to enjoy. Not to benchmark.

There is also a really interesting dynamic on this forum where people get kudos for rocking slightly older GPUs (say, a GTX 980) for gaming, yet people who want to play the same games on a console get rinsed because they don't use a keyboard and mouse. It's silly.
 
The number of shortcuts being used in modern games is astounding; the biggest one is short draw-distance shadows.

I find it very jarring when I am driving a car or something in a game and I can see the shadows appearing 5 feet in front of my character.

So if we get 4K/60fps on next-gen consoles, we can be sure new shortcuts are coming to achieve it: lower level of detail, less tessellation, etc.
 
Higher settings, higher framerates, higher resolutions, etc. Same as always. Ray tracing is gonna be pushed really hard. 4K is essentially solved on PC; now we're waiting for one or two more jumps so the crazy early adopters can start messing around with 8K. I know that's my plan in 2-3 years.

The good thing about these new consoles is how much they will tank the prices of GPUs below the 2080 Ti (and even that card itself). Essentially it's gonna be a repeat of the last cycle, where 1080p performance was made dirt cheap, except now it's gonna be 4K. Bring it on!
 
I'd be guessing GTX 1080/RTX 2060 levels of performance. Enough to do 4K at a reasonable quality and framerate. But I'm still doubting ray tracing and 60Hz. Ray tracing may enhance parts of scenes, but not the full scene like RTX, just because of how much of a performance hog it is.

It’s reasonably powerful but it’s nothing special.

I love my PS4 Pro and Xbox One X for the exclusives, but other than that they don't get much use. And compared to them, my PC obviously performs and plays much nicer. I've enjoyed many games over the years because of it.
 
What made The Witcher so well regarded? Was it the graphics, or was it the immersive storyline? I have friends who played it on both console and PC, and both of them chatted about the story. They didn't spend too long willy-waving over who saw it look its prettiest.

But then you never see the game as intended, and you have a lesser experience because of being impatient and using lower settings. And for the GPU grunt required it seems nothing special; indeed, I get the feeling these guys struggle with optimizing their games.

So I couldn't say what makes it great yet; I will tell you once I get a GPU that can run the game at 4K 120fps. And I will wait, because under 4K is crap and so is under 120fps. I would rather go back and play the games from just before The Witcher 3, like The Witcher 2, at 4K 120fps. To me this is the smartest thing to do when you want the best experience from what are works of art.
 
Then how do you justify a £1000 GPU?

Does everyone suddenly demand 4k 120fps?

I don't really understand what route PC gaming will take. Cyberpunk will be amazing on PC, but if it's 4K/60 on a next-gen console, then why even bother with a PC?

The gap seems to be closing, and either consoles or things like Stadia could finally change the PC gaming world in the way people have been claiming it would for years.

Well, that's only half the story. Although it's rumored:
If it's true that MS will directly port their console games to Win10 via PowerShell (with no .exe, the same code used on the consoles), then Radeon will find it very easy to optimize for PC. Nvidia, not so much.
Another thing: if Sony and Microsoft are using the version of hybrid ray tracing that AMD patented, then things are going to be great for those who have AMD PC systems, at least for the games that get ported to PC.

Yeah, I will admit it's going to take a pretty big trickle-down effect to get console AAA titles on PC via PowerShell (except those that are Xbox exclusives), but the writing, IMO, is on the wall that AMD does have developer mind-share this time around.
--
To better answer your question though: high/ultra-high-end GPU users won't believe anything will change from what they are seeing now, and will only believe it (next-gen consoles doing 4K 60fps and ray tracing) when they see it.

Sure, the rest of us know that GPUs that cost more than $600 will greatly depreciate in perceived and stated value once next-gen consoles are released and we see them gaming at 4K 60fps. But that release date is still a long time from now.

This is why I've decided to put a price cap on GPUs moving forward. With next-gen consoles out, they won't hold the same value they hold now against current-gen consoles. Or to put it more bluntly, the leap in performance that MS and Sony will make with next-gen consoles is going to leave this gen of consoles literally in the dust, IMHO. And they will be able to ray trace on top of that.
 
to put it more bluntly, the leap in performance that MS and Sony will make with next-gen consoles is going to leave this gen literally in the dust, IMHO. And they will be able to ray trace on top of that.

Ray tracing and 4K would each need a whole new console generation on their own just to do the feature correctly and truly. Be honest here: 4K alone is 4x the performance of what they talked up for the current gen. So the next gen would be exactly like the last gen, just at 4K, aka 4x the performance!
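
To put a number on that "4x" point, a quick pixel-count comparison (this is purely arithmetic about pixel counts, not a measured performance figure):

# 4K (2160p) is exactly 4x the pixels of 1080p, so rendering the same
# visuals natively at 4K needs roughly 4x the shading/fill work.
res_1080p = 1920 * 1080   # 2,073,600 px
res_1440p = 2560 * 1440   # 3,686,400 px
res_4k    = 3840 * 2160   # 8,294,400 px

print(res_4k / res_1080p)  # 4.0
print(res_4k / res_1440p)  # 2.25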


But wait, you want ray tracing too? Ooooooh, very expensive. Then we have to weigh 4K against ray tracing, and that's even if people would tolerate the exact same visuals from the last gen just rendered at 4K. They would probably not accept that, though I personally would; I think it's something they just have to do. So where's the grunt coming from for all these things? Well, they are going to fake most of it, like the last generation; that's what they always did.


Vastly reduced ray tracing, just gimmicky things like mirror reflections etc., and checkerboarded 4K, which is fake. And the framerate will probably tank to between 30 and 40fps in ray-traced environments, because they will claim FreeSync or VRR makes this OK. People will lap this up; I would lap up water too if I had never tasted milk. Anyone thinking they are going to get some sort of cheap magic 4K ray-tracing box is deluding themselves; they will get what £600 of PC parts can do right now, which is not a lot with ray tracing, period.
 
Every single time there's a new console generation, people predict that we'll see [resolution]/60fps, and every time we end up with [resolution]/30fps for most "AAA" games, because publishers would much rather crank up the graphics than the framerate, as great graphics are a lot easier to sell to people than 60fps. It will be no different this time.
 