Gains with small bottlenecks

An extra 100 quid for like 2% gains in fps? That just ain't my style, cowboy.

AMD kind of screwed up the opportunity to sell me their expensive chips with that one. Seems like the i5 2500K of recent times. I think it'll hold up well.

If I had bought the 2600K instead of the 2500K, I wouldn't have bought the 3700X today... I'd have got at least another couple of years out of it.
 
I've actually tried and failed. In fact I'm not even sure it's a technical issue...
Believe it or not, the specs of the Sony X900F TV only list 30fps @ 1440p.
-When choosing 1440p in the Nvidia control panel, 60fps just isn't there any more.

I remember reading the RTINGS review before I bought it and thinking "nah, come on, it's gotta do 60fps, esp if it does it @ 4K."
And it was actually recommended as a PC/gaming TV, so I literally assumed it was a misprint in the specs.

The crazy thing is I have actually run 1440p @ 60fps several times. I have the benchmark screenshots from Assassin's Creed Origins and a few others to prove it.

Cracking TV... But a big, BIG £1000 **** up considering it's 99% used as a monitor.
 
Looks to me like you can do 1440p 120Hz if you upscale on the PC. Make sure it's an HDMI 2.0 capable cable (in other words, just decent quality). Set the output to 4K initially so that's the hardware protocol being used, then change the resolution. You might need to manually update the TV firmware, according to a quick Google.
 
I'm confused... To get the PC to upscale, it would have to receive a 1440p (or less-than-4K) capability signal from the TV? And we can't change res settings on TVs. You mean change to 1440p then choose 120Hz via the Nvidia control panel?
If I change res to 1440p, the same settings are available. Ie - nothing higher than 30Hz and no colour change options.
I've never seen 120Hz selectable... Didn't know it was possible.

I'm sure the cable is 2.0, otherwise surely I couldn't do 4K @ 60fps?

The TV is set to download updates automatically... In fact I think maybe that's why I lost 60Hz @ 1440p.
I remember about 5 months ago I turned the PC on and could not get any signal.
I changed the HDMI from port 2 to HDMI 1 and got a signal... Odd. Also got a signal in HDMI port 3 (ARC), where my 4K HDR Blu-ray player is connected.

After playing around, I figured out that if I put the PC HDMI back in #2 and changed
Display ~ Picture ~ Advanced settings ~ Video options ~ HDMI video range > Full... I could get my picture back. I thought it was on Full anyway.
I guess that's maybe when I lost 1440p/60Hz.

FYI - it looks like the TV is defaulting to "Limited", because when I scroll across to "Full" from either Low or Limited, the screen flashes on and off. I do get a signal on either now, though.
What does that even mean? Full/Limited?

Either way - 1440p still does not have a 60Hz option.
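For reference on the Full/Limited question: "Full" means PC-level RGB where the whole 0-255 range is used, while "Limited" means video levels where black sits at 16 and white at 235. If the PC and TV disagree you get crushed blacks or washed-out greys. A minimal sketch of the standard 8-bit mapping between the two (generic maths, nothing Sony-specific):

```python
def limited_to_full(y):
    """Map an 8-bit limited-range (video-level) value in [16, 235]
    to the full-range (PC-level) scale [0, 255]."""
    # Clip values outside the nominal video range first.
    y = max(16, min(235, y))
    # 219 code values (16..235) are stretched over 256 levels.
    return round((y - 16) * 255 / 219)

# Black and white points line up after conversion:
# limited_to_full(16) -> 0, limited_to_full(235) -> 255
```

This is why a range mismatch is visible: a "Limited" signal displayed as "Full" leaves blacks at code 16 (grey) instead of 0.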
 
*edit*
OK, fixed it... Kind of.

The above was wrong... I also remember playing with...
Settings ~ External inputs ~ HDMI signal format ("Enhance 4K signal format") ~ then choosing either "Standard format" or "Enhanced format" - "displays high quality HDMI signals on HDMI inputs 2 & 3. Select this option only if your device supports high-quality HDMI formats such as: 4K 60p 4:2:0 10bit, 4:4:4, 4:2:2."

Once I enable "Enhanced format", 1440p/60Hz becomes available again, as does the option to change colour depth, RGB/YCbCr colour format etc.
No 120hz though.
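As a sanity check on whether 1440p/120Hz should even fit down the cable, some back-of-envelope maths. This counts raw pixel data only - a real HDMI link also carries blanking intervals and line-coding overhead, so actual requirements are higher - but the relative comparison is the interesting part:

```python
def raw_gbps(width, height, hz, bits_per_pixel=24):
    """Approximate uncompressed video bandwidth in Gbit/s.
    Ignores blanking intervals and link-encoding overhead,
    so real HDMI requirements are somewhat higher."""
    return width * height * hz * bits_per_pixel / 1e9

uhd60  = raw_gbps(3840, 2160, 60)   # ~11.9 Gbit/s raw
qhd120 = raw_gbps(2560, 1440, 120)  # ~10.6 Gbit/s raw
qhd60  = raw_gbps(2560, 1440, 60)   # ~5.3 Gbit/s raw
```

So 1440p/120Hz needs slightly *less* raw bandwidth than 4K/60Hz - if the TV genuinely does 4K60 over HDMI 2.0, the missing 1440p modes look like an EDID/firmware limitation rather than a cable one.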

Should I play around with those BTW? I played around before but just left it on "use default colour settings" in the end.

So... Maybe I'll wait for an RTX 3080... Although the RTX 2080 at £390 is now tempting me.
 
Well, I thought the same thing and went from my 2500K to my 4770K. But actually, most of the games I've benchmarked perform BETTER with hyper-threading off. All bar one - Assassin's Creed Origins, which ends up with slightly better 1% lows. Some it does not seem to affect at all.
I've literally just been using it as an i5.
 
And you think B550 boards will support PCI Express 4.0?
Might want an NVMe drive one day.

And their VRMs will handle a 12C/16T?
B550 won't be a PCIe v4 chipset.
So far all information points to it sticking to PCIe v3, even in the uplink to the CPU.
With only four PCIe v3 lanes supplied for devices, in addition to those dozen-year-old v2 lanes.
https://www.tomshardware.com/news/amd-b550-chipset-specifications-motherboard-specs,40467.html
But the main PCIe x16 and M.2 slots connected to the CPU's lanes should get PCIe v4.
That would have been possible in the better pre-X570 boards if AMD had allowed mobo makers to enable it on boards they deemed fit for it.

The VRM isn't tied to the chipset in any way, and we can expect a major spread.
No doubt the worst boards will be made basically for handling only six cores under full load, with 8 cores already pushing toward the limits.
Some more expensive models could get VRMs good for higher-up CPUs, but the question is where those will be priced.
I wouldn't call the VRMs of even the most expensive B450 boards good for a heavily loaded 12 core.

For normal home users, PCIe v4 NVMe drives will be useful only if you're playing at copying data back and forth to fill the drive as fast as possible and then formatting it to start over.
 
Intel's SMT implementation is different from AMD's.
And it even comes with vulnerabilities (which made security-focused OpenBSD disable HT) because of Intel's couldn't-give-a-**** Swiss-cheese security design, which they haven't bothered to fix.
On Ryzen the situation is pretty much plus/minus zero. Some games lose a few percent or so, some gain about the same, with the upper limit somewhere between 5-10%.

Of course the games used will affect this heavily.
Games designed pretty much for two cores in that "four cores is high end" Intel stagnation era (Intel has been a big obstacle to the advance of game development) certainly can't benefit at all, and more likely lose some.
Though the performance loss can also be down to Microsoft doing **** work with their stupid scheduler.
While Android has for years managed moving loads between vastly different sized and powered cores (three different core sizes in the latest), the scheduler of Wintoys10 can't even put a single-thread-heavy load on the best cores of the CPU!
https://www.techpowerup.com/review/1usmus-custom-power-plan-for-ryzen-3000-zen-2-processors/2.html

And that seems to be continuing:
"In the near future, Microsoft will release the Windows 10 19H2 update (Windows 10 1909), which gives the OS scheduler the ability to prioritize threads. I tested a pre-release build of this version and didn't notice significant improvements. Quite often, the scheduler used a higher priority for background processes. I think you can imagine what will happen if Windows gives precedence to such a process, and not your currently running game."
But of course running some background DRM and user spying aka "telemetry" is more important than the foreground load the user is trying to run...
Though I guess we should already be glad if Microsoft manages to make a patch that doesn't break lots of things!
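For what it's worth, you can work around a dumb scheduler by pinning a process to chosen cores yourself - on Windows that's Task Manager's affinity setting or the SetProcessAffinityMask API. A minimal Linux-only sketch (the core numbers here are made up; you'd pick your CPU's best-binned cores):

```python
import os

def pin_to_cores(cores):
    """Restrict the current process to the given set of CPUs.
    Linux-only: uses sched_setaffinity; pid 0 means 'this process'."""
    os.sched_setaffinity(0, set(cores))
    # Return the affinity mask actually in effect.
    return os.sched_getaffinity(0)

# e.g. keep a latency-sensitive game or benchmark on cores 0 and 1:
# pin_to_cores([0, 1])
```

This is a blunt instrument compared to a scheduler that understands preferred cores, but it does stop a hot thread bouncing between CCXs.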
 
Wow, that's some useful info, cheers guys. Considering I'm building for perhaps the next 5 years or so - so I may even see a PCIe 5.0 card toward the end of its life - and factoring in that the boards are already nearing their limit for 8C CPUs (I imagine in 5 years I'll be on a 12C/24T CPU), I think X570 seems the only way to go for me.

However, about AMD's SMT not affecting performance... I swear I saw a few benchmarks from Gamers Nexus where the 3600 performed about 5-7% better with SMT off... Am I missing something? Perhaps it was the older-gen games.

I really do hope game developers start working with AMD to utilise all these cores they have brought to the table, and we see the end of the stagnating year-on-year CPU performance we've had from Intel.
 
The same goes for GPUs... I hope with AMD AND Intel competing for market share, Nvidia will be forced to go to the next level with the RTX 3000 series.
I remember the jump from my 970 to the 1080 was like a 100% performance uplift. Despite the mining situation having pushed card prices up, the value for money was still there.
 
Here's one newer comparison using a 3900X, with that overall +/- zero result:
https://youtu.be/uIWO-821vik?t=8m

Though most of those performance degradations are likely from Microsoft's garbage-level scheduler being dumb as a brick compared to the scheduler of Android.
(and Android is based on Linux, whose scheduler was quickly updated to understand the Zen uArch)

Well, I wouldn't be surprised if Intel has paid Microsoft to slow down the modernizing of the scheduler to keep AMD's performance down.
Intel is pretty much at Sith-master level in dirty tricks and has done a huge amount of harm to competition and consumers throughout the years:
http://jolt.law.harvard.edu/digest/intel-and-the-x86-architecture-a-legal-perspective
Something copied by Nvidia.

But the next-gen consoles should make game code itself fit Ryzens well.
With fixed hardware, developers always quickly start looking for ways to extract more of the hardware's performance potential.
Lots of games and their engines were made for that "4C is the best that can be expected from the bigger user base" Intel stagnation era.
Neither did the old consoles' tablet-level CPUs have the HT/SMT that the new ones do.


With people still willing to pay super-luxury prices for what's pretty much yesteryear's core/thread counts on a dead-end platform, I'm just getting afraid Intel will transfer that pricing to GPUs.
 
I went with the 3600, 32GB of 3200MHz Vengeance, an Asus X570-P and an RTX 2080 in the end.

Not a massive increase in fps for the £800, but after I've sold the delidded 4770K, mobo and super (low latency) DDR3, I will have only spent around £300... That's not too bad for an extra 30-40% fps and some future-proofing, is it?

Smallest upgrade I've ever done, but tbh I'm just happy to be with AMD again. Always liked their business ethics, and it's been a looooong time since the 2500K.
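Rough value-for-money maths on that upgrade (the £500 resale figure is my assumption, chosen to match the ~£300 net quoted above; the 35% uplift is just the midpoint of the 30-40% range):

```python
outlay = 800           # GBP spent on 3600, RAM, X570-P and RTX 2080
resale = 500           # GBP back for the 4770K, board and DDR3 (assumed)
net = outlay - resale  # ~GBP 300 net, as in the post

uplift_pct = 35        # midpoint of the quoted 30-40% fps gain
cost_per_pct = net / uplift_pct  # roughly GBP 8.6 per percent of extra fps
```

Not a headline-grabbing upgrade, but under a tenner per percent of fps is decent going when the old parts still sell well.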



Well, people will pay crazy money... IF it's the only thing available (does it matter that the 2080 Ti is £1200 when it's the only true 4K option?) and just to say they have "the best".
The 9900K is still the fastest for gaming, but at least with CPUs it seems many are smart enough to realise the extra 5-10% doesn't rule out the competition now, and I've FINALLY seen so many people siding with AMD, mainly because of their working to not release new chipsets each year. Siding as in... actually adopting the platform, rather than saying it and then just going out and buying the 10% faster Intel anyway... then moaning a year later about the insane price of the new Intel boards :D

-I think Intel know full well (esp after AMD's surprise innovations with Ryzen/Threadripper) that their pricing strategy will need to be VERY different against both AMD and Nvidia. They've got the financial muscle for quite aggressive pricing... Let's hope they use it. I just hope someone can release a sanely priced high-end product once again, or that end of the market will remain out of reach for most.
It only takes one courageous manufacturer to put an MSRP at a level where they can make a profit AND NOT gouge their customers to shake up the pricing.
E.g. AMD's 5700 XT (?). What a cracking card for the money... As someone said earlier, it's a shame there wasn't quite enough in it over a GTX 1080 to justify a purchase for me personally.
 