• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

AMD Zen 2 (Ryzen 3000) - *** NO COMPETITOR HINTING ***

Caporegime
Joined
18 Sep 2009
Posts
30,112
Location
Dormanstown.
I'll try again, with fewer words. Don't make me break out the crayons.

1080p is easy. It doesn't need optimising for. No future hardware will find 1080p "hard".

I'd say my Vega 64 is pretty old (and the performance it offered at launch was old too) and I'd have to agree.
I run 2560x1080 and my Vega 64 destroys everything, and I'm usually picky.

It's not good enough for what I want for 4K gaming, like.
 
Soldato
Joined
6 Aug 2009
Posts
7,071
2600K (with a huge clock, to be fair) to 2700X was a sidegrade (mild upgrade) for gaming (not saying I regret it in the slightest; it's been a wonderful rig). Amazing everywhere else, and the benchmarks look mighty impressive. Not... particularly skull-shattering on the gaming front.

If you have a productivity reason for 8c/16t, absolutely go for it.
If you really can't wait and have to buy something, sure, a good Crosshair VII (X470) will be a worthy companion for the new gen.
If you can wait, though, the new-gen boards have PCIe 4.0, (probably) more power phases, and will likely enable an additional trick or two in the new chips.

I'd hold if you can help it.

+1, exactly what I'm doing.
 
Soldato
Joined
22 Nov 2018
Posts
2,715
I'll try again, with fewer words. Don't make me break out the crayons.

1080p is easy. It doesn't need optimising for. No future hardware will find 1080p "hard".

You're only thinking about today's games. People expect a GPU to last longer than 6 months. Think about five years' time, when games have much better graphics, more detailed textures, better light/shadow effects, etc. Today's hardware will struggle at 1080p with ultra settings.
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
You're only thinking about today's games. People expect a GPU to last longer than 6 months. Think about five years' time, when games have much better graphics, more detailed textures, better light/shadow effects, etc. Today's hardware will struggle at 1080p with ultra settings.
He said "future hardware" and you countered with "today's hardware will struggle in future".

Not a very good retort.

This whole "1080p should be retired" argument has passed into the realms of being utterly stupid.
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
He said "future hardware" and you countered with "today's hardware will struggle in future".

Not a very good retort.

This whole "1080p should be retired" argument has passed into the realms of being utterly stupid.

1080p is a primitive resolution unless it's on a 5-inch phone screen. The amount of detail you miss with it, and the sheer pixel size, is shocking.
Don't know why it's been kept around for so long, but once you go to 2160p and up, you never want to go back to 1080p.
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
1080p is a primitive resolution unless it's on a 5-inch phone screen. The amount of detail you miss with it, and the sheer pixel size, is shocking.
Don't know why it's been kept around for so long, but once you go to 2160p and up, you never want to go back to 1080p.
You can't "keep" it or "lose" it.

The argument is akin to saying that if your car does 80 MPH at 60 MPG, then there is no need to drive at less than 80 MPH, and all speeds below 80 MPH should be retired.

It's just nonsensical. It's not how reality works.
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
You can't "keep" it or "lose" it.

The argument is akin to saying that if your car does 80 MPH at 60 MPG, then there is no need to drive at less than 80 MPH, and all speeds below 80 MPH should be retired.

It's just nonsensical. It's not how reality works.

Nope, it's just that someone prints big money using the old technology and cheap/crap screens. We shouldn't mislead people into thinking there's nothing better out there that would serve them well.
 
Soldato
Joined
26 Aug 2004
Posts
5,032
Location
South Wales
May as well throw away all your Blu-ray discs, and the UHD ones too if you have any, guys, since they'll be "primitive" when we get 8K TVs :rolleyes:

Hell, why even bother watching TV now, since it's so low-res compared to what's coming in the future? May as well just listen to the TV blindfolded.
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
May as well throw away all your Blu-ray discs, and the UHD ones too if you have any, guys, since they'll be "primitive" when we get 8K TVs :rolleyes:

Hell, why even bother watching TV now, since it's so low-res compared to what's coming in the future? May as well just listen to the TV blindfolded.

300 GB BDXL discs will be sufficient for 8K movies.
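
For what it's worth, a quick back-of-envelope check bears that out (the average bitrates below are assumptions for illustration, not from the post):

Code:
# Rough sketch: does a two-hour 8K film fit on a 300 GB BDXL disc?
# The average bitrates are assumed for illustration, not real specs.
DISC_GB = 300
runtime_s = 2 * 60 * 60  # two-hour film

for mbps in (100, 144, 200):  # plausible 8K HEVC average bitrates (assumed)
    size_gb = mbps * 1_000_000 * runtime_s / 8 / 1_000_000_000
    verdict = "fits" if size_gb <= DISC_GB else "too big"
    print(f"{mbps} Mbps over 2h -> ~{size_gb:.0f} GB ({verdict})")

Even at 200 Mbps average, a two-hour film comes out around 180 GB, comfortably under 300 GB.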
 
Soldato
Joined
11 Jun 2003
Posts
5,081
Location
Sheffield, UK
Urgh. We're even goalpost moving now.

The start of this little (1080p) discussion was someone pointing out some Dev notes or similar about (paraphrasing slightly) "1080p will no longer be optimised for".

That's.. "going forward we won't put effort into optimising drivers/hardware" (whichever it was) "for 1080p".

There was a comment about it. I replied suggesting 1080p being rather trivial for modern hardware, there was no need to spend effort optimising for it as... It's trivial for modern hardware.

Can everyone put their handbags away, stop taking each other way out of context, and perhaps try to read and follow the points being succinctly made, rather than taking any opportunity to sit and snipe about their own little agendas? :)
 
Soldato
Joined
11 Jun 2003
Posts
5,081
Location
Sheffield, UK
You can't "keep" it or "lose" it.

The argument is akin to saying that if your car does 80 MPH at 60 MPG, then there is no need to drive at less than 80 MPH, and all speeds below 80 MPH should be retired.

It's just nonsensical. It's not how reality works.

It's equally nonsensical to suggest that when cars, planes, etc. travel at whatever speed, we should still base transport optimisations on travel by horse.

We don't exclude horses. It's just assumed that the basics needed for horse travel (a flat road) are pretty well handled by now.

It's entirely how reality works.

Quit being quite so obtuse.

Should modern hardware be optimised for 800x600? No.

Why? Because it's entirely trivial for modern hardware. It'll easily do the workload many times over. What optimisation is needed? What effort is needed (remember this came from some notes about IN THE FUTURE...) to "optimise" for 1080p when hardware is clearly able to smash it with barely a thought?

Again, before it's twisted back elsewhere: you can't say "but xx% are still on 1080p!" and want 300fps 1080p optimisations in the same breath.
 
Last edited:
Caporegime
Joined
18 Oct 2002
Posts
33,188
Urgh. We're even goalpost moving now.

The start of this little (1080p) discussion was someone pointing out some Dev notes or similar about (paraphrasing slightly) "1080p will no longer be optimised for".

That's.. "going forward we won't put effort into optimising drivers/hardware" (whichever it was) "for 1080p".

These two statements aren't even close to the same, at all, and it's painfully silly to take it that way.

"Optimise for", from a dev, means: we want this game to run smoothly at our given target. If you make a game with all the same graphical settings and power requirements, it will run at a different framerate at 1080p than at 4K. If you optimise for 1080p, it means, say, going for a smooth 30 or 60fps on consoles at that resolution, but that might also mean the game sucks at 4K because it runs too slowly.

When a dev says they will optimise for something other than 4K, they are in absolutely no way saying they will ignore drivers or hardware. They are saying that if they are targeting 60fps on that title, their goal will be 60fps at 4K, not 60fps at 1080p.

Every single performance optimisation they make for 4K is also an optimisation at 1080p. If you make the game more efficient, that works everywhere, not at one random resolution. And if you think they'll ignore some kind of performance or IQ bug where the game runs fine at 4K but has a weird bug at 1080p, you're nuts; they absolutely will not do that.

They are simply saying their goalposts are moving to a different resolution; it's got absolutely nothing to do with optimising hardware or drivers.
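
As a rough sketch of that targeting idea (assuming, as a simplification, that GPU cost scales linearly with pixel count):

Code:
# Minimal sketch of "optimise for a resolution" as a frame-time budget.
# Assumes GPU cost scales roughly with pixel count - a simplification.
PIXELS = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "2160p": 3840 * 2160}

def frame_budget_ms(target_fps):
    """The whole frame has to fit inside this many milliseconds."""
    return 1000.0 / target_fps

# A game tuned to just hit ~16 ms per frame (60fps) at 1080p...
tuned_ms_1080p = 16.0
for res, px in PIXELS.items():
    est_ms = tuned_ms_1080p * px / PIXELS["1080p"]
    verdict = "hits" if est_ms <= frame_budget_ms(60) else "misses"
    print(f"{res}: ~{est_ms:.1f} ms/frame -> {verdict} the 60fps budget")

Tune the budget the other way round (so 2160p just hits 16 ms) and 1080p lands well under it automatically, which is the point above: an optimisation made for 4K is an optimisation at 1080p too.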
 
Joined
22 Feb 2019
Posts
1,189
Location
Guernsey
1080p is a primitive resolution unless it's on a 5-inch phone screen. The amount of detail you miss with it, and the sheer pixel size, is shocking.
Don't know why it's been kept around for so long, but once you go to 2160p and up, you never want to go back to 1080p.

Primitive resolution lol.

1080p is easy on the eyes and a very comfortable viewing experience.
4K, for many, is unusable without 150% to 200% scaling applied.
Personally, it took me ages to get used to 2K just because it was that much tighter at native scaling.
In my opinion, 4K is far better suited to large-screen TVs than to smaller PC monitors.
 
Soldato
Joined
11 Jun 2003
Posts
5,081
Location
Sheffield, UK
OK, then promoting it, working hard to implement it, limiting Full HD to legacy support, etc. policies ;)

OK, that's the original post I replied to.

The context there was of future hardware.

If a graphics card has the power under the hood to do 1080p@60fps (this is the "what most people use" that keeps being quoted from Steam surveys; people on a higher res are a smaller market, and people on 1080p@200/300fps are an even smaller one), what optimisation is required? From this point forward, from the lowest-end new cards onwards, ANYTHING can do 1080p@60fps. ALMOST anything will be able to do it with most detail turned on.

What optimisation is required?

Going "what and leave glaring performance issues unfixed?!". No. Obviously. That's a tangent though for sure.
Optimising, as in "teasing more out of the available hardware". Making it work doesn't mean optimising.

optimize
/ˈɒptɪmʌɪz/
verb
gerund or present participle: optimising
  1. make the best or most effective use of (a situation or resource).
    "we manage our time so that we optimize our productivity"
    • COMPUTING
      rearrange or rewrite (data, software, etc.) to improve efficiency of retrieval or processing.
What is there to optimise for? Given it's future work done by a graphics card vendor towards the usability of its products, at what point does a workload that uses 50-80% of the GPU need optimising for? What's the purpose? What improves? How is giving the GPU MORE idle time optimal?
Framerate doesn't improve - that's stuck on making the CPU faster to feed more API calls from the game engines.
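
To put hypothetical numbers on that (the frame times below are invented for illustration): each frame waits for whichever of the CPU and GPU is slower, so once the GPU's share drops below the CPU's, shrinking it further buys nothing.

Code:
# Invented per-frame costs to illustrate being CPU-bound at 1080p.
def fps(cpu_ms, gpu_ms):
    # The frame rate is set by whichever side is slower each frame.
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 5.0  # game logic + draw calls; roughly resolution-independent
for res, gpu_ms in (("2160p", 16.0), ("1440p", 7.1), ("1080p", 4.0)):
    print(f"{res}: GPU {gpu_ms} ms vs CPU {cpu_ms} ms -> ~{fps(cpu_ms, gpu_ms):.0f} fps")
# At 1080p the GPU finishes early and idles; only a faster CPU lifts the fps.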

There is NOTHING to optimise for. Modern GPUs are already vastly more capable, in pretty much every circumstance (outside the most bare-bones onboard graphics, which neither of us is referring to), than is necessary to run 1080p "optimally".


Now... the whole "1080p is crap and needs to die" mindset - I don't really give a toss either way. It's one of the only bits of 10+ year-old tech we're still using (I had a 2405FPW in... 2005, when the 16:10 vs 16:9 format wars were on; 16:10 lost) but... whatever, it doesn't affect me, so each to their own.
Understand the bit I was specifically talking about though.
I'll otherwise duck out of this particular whizzing contest as it's getting silly.
 
Last edited:
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
Also, remind us how expensive 4K/8K screens that do >100fps are, please?

1080p can't be "dead technology" when the "better" alternatives cost hundreds or thousands more. The cheapest 4K/100Hz screen I can see is £500-ish, and a very good 1080p screen is a lot less than that.

In before, "This is an enthusiast forum and they should retire 1080p because we don't use it anymore," lol.
 
Soldato
Joined
19 Apr 2012
Posts
5,190
I'll be staying at 1080p when I upgrade my mobo/CPU/RAM later this year. I'm only a casual gamer on PC now, so the extra outlay makes no sense to me. All my friends are on console, so I play that more, but I still find PC gaming better.
 
Caporegime
Joined
24 Dec 2005
Posts
40,065
Location
Autonomy
1080p is a primitive resolution unless it's on a 5-inch phone screen. The amount of detail you miss with it, and the sheer pixel size, is shocking.
Don't know why it's been kept around for so long, but once you go to 2160p and up, you never want to go back to 1080p.


I have a 3440x1440 ultrawide and would rather have that than 4K 16:9.

Oh, and 1080p on my 17" Dell laptop with a 1060 is fine for some mobile gaming.

Your post is nonsense.
 
Soldato
Joined
22 Nov 2018
Posts
2,715
Life was easier in the old days of CRT monitors, because I could game at the highest resolution my hardware could handle. Then, when more graphically demanding games were released and struggled, I could switch to a lower resolution to keep my framerates up. When I found myself at sub-1024x768 levels, it was time to upgrade my CPU and graphics card so I could play at high resolutions again.

These days, screens have a native resolution, so I was reluctant to go higher than 1080p because I'd be limited when my frame rates dropped. 1440p would be great, but in a few years, when my hardware starts to struggle, running 1080p on it would look blurry, so I'd have to upgrade my computer sooner than usual. Either that or keep two screens - a 1440p screen for new hardware, then switch to the 1080p screen when my hardware is getting old, then change back to the 1440p screen when I upgrade the GPU. Which is ridiculous. So I stuck with 1080p.
 