
JayZ review of the 5700XT Red Devil, don't OC, Undervolt.

Caporegime
OP
Joined
17 Mar 2012
Posts
47,954
Location
ARC-L1, Stanton System
Whilst I agree with you on the CPU part, Zen 1 had a lot of issues at the start: RAM, BIOS, sleeping/core parking, and some programs really didn't like Ryzen at all, hehe. Ryzen 2 launched with very few problems and even the BIOSes seemed ready; my 2700X (which my son now has) worked perfectly from the get-go, no real problems, and I kind of expected the 3000 series to be similar. But I've had voltage issues with really high spikes on an MSI mobo, touching 1.6v at stock settings, and 3 mobos have died on me: 1 MSI, 1 ASRock, 1 Asus. My Asus C8H is simply amazing now, and the latest BIOS really did sort out the problems I had.

Intel have been playing it safe for many years, same CPU after same CPU, and I'm pretty sure that without Ryzen the 8700K and 9700K would have been quad-core chips again, while Intel just keep patching security problem after problem. I'm sure Intel will bring out another 14nm++++ CPU which will push ahead of Ryzen again in IPC and gaming, though I very much doubt they will match the 12 and 16 core CPUs in core count. I know the rumoured slides that made the rounds were fake, but I'm pretty sure Intel will bring out a higher core count i9 CPU and a whole new socket.

RDNA is a start for AMD with graphics cards; AMD really needed to move on from GCN and even the Vega stuff. They needed to sort out power draw and make a new-ish architecture to expand their GPU offering, but I would still have liked to see AMD really push it with RDNA, going from all the marketing material and people's in-depth views and reviews of it. It does look promising and I'm sure one day I will go back to having an AMD GPU; it's been so long since I've had one in my main PC. My son has an RX 470 which I picked up at a car boot for 75 pounds just before the mining craze started, and it's been a good GPU for him.

But my main choice now for him is a custom 5700 XT at 439.99, either the Gigabyte 5700 XT or the Red Dragon, hence why I looked at this thread, or to go for the KFA2 2070 Super at 499.99.

Intel will be refreshing Coffee Lake until well into 2021; they are expanding 14nm production and scaling back 10nm production, while their roadmap is Sunny Cove 2019, Willow Cove 2020, Golden Cove 2021, all 10nm according to WikiChip.

Intel will release a 10-core Coffee Lake, which will be DOA against AMD's 12-core, let alone the 16-core AMD will release the day after.

Coffee Lake's ring bus architecture limits the CPU to 10 cores; it's one reason AMD invented a new way of cranking up core counts, a way that actually "scales".

They will release the 5GHz all-core KS to claw back some of the losses against Zen 2 from their IPC deficit, which is about 10%. Intel have a 10% IPC deficit to AMD... < it was fun writing that :D
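Back-of-envelope on why a 5GHz all-core KS mostly just cancels that IPC gap rather than pulling ahead. Very rough sketch, single-thread only, treating performance as clock x IPC and taking the ~10% figure above and the clocks discussed in this thread as assumptions:

[CODE]
# Very rough model: relative single-thread performance ~ clock * IPC.
# The ~10% IPC advantage and the clocks are the figures discussed in this
# thread, not measured data.
intel_clock = 5.0   # 9900KS all-core, GHz
amd_clock   = 4.5   # 3800X-ish boost, GHz
intel_ipc   = 1.0   # baseline
amd_ipc     = 1.1   # ~10% higher IPC

intel_score = intel_clock * intel_ipc   # 5.00
amd_score   = amd_clock * amd_ipc       # 4.95

print(f"Intel lead: {100 * (intel_score / amd_score - 1):.1f}%")  # ~1%
[/CODE]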

On the desktop, IMO Intel are screwed for the next couple of years, if not for good, because AMD are racing ahead at pace now. I don't think Intel care so much about the desktop now; they are losing, and losing badly, on server performance, performance per watt and price. They will put everything into getting 10nm working and ready for server because they don't need the clock speeds there, which 10nm isn't delivering anyway, even less than AMD's 7nm Zen 2. They need the yields ramped up, but even then, still with monolithic dies, they ain't catching AMD; the best they can do is reduce the total humiliation by Zen 2 EPYC.
 
Soldato
Joined
8 Jun 2018
Posts
2,827
every frame counts :) if you have a 144Hz display you want 144 fps :) sorry but it's true
Actually that's incorrect. Completely incorrect. Let me explain why.
1. Most monitors today come with LFC, so if you don't reach the target of 144 FPS for a 144Hz monitor, LFC kicks in to prevent screen tearing (rough sketch of how that works below).
2. If you're looking to reduce input lag, RTG created Radeon Anti-Lag (RAL). It does indeed reduce input latency, so you don't need to worry about having 1000 FPS in order to reduce or remove input latency from your peripherals.

So in either case RTG has it covered, lock, stock and barrel. With Nvidia you simply have to rely on brute power and IQ reduction techniques like DLSS to get the job done.
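Here's a rough sketch of the LFC idea from point 1, in Python. The 48-144Hz VRR window is just an example range, not any particular monitor's spec:

[CODE]
import math

def lfc_refresh(fps, vrr_min=48.0, vrr_max=144.0):
    """Panel refresh rate after Low Framerate Compensation (illustrative).

    When the frame rate drops below the monitor's minimum VRR refresh, the
    driver re-sends each frame 2x, 3x, ... so the panel keeps refreshing
    inside its VRR window instead of tearing or stuttering.
    """
    if fps >= vrr_min:
        return min(fps, vrr_max)           # already inside the VRR window
    multiplier = math.ceil(vrr_min / fps)  # how many times each frame is shown
    return fps * multiplier

for fps in (144, 100, 60, 40, 25):
    print(f"{fps:>3} fps -> panel runs at {lfc_refresh(fps):.0f}Hz")
[/CODE]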
 
Caporegime
OP
Joined
17 Mar 2012
Posts
47,954
Location
ARC-L1, Stanton System
Actually that's incorrect. Completely incorrect. Let me explain why.
1. Most monitors today come with LFC, so if you don't reach the target of 144 FPS for a 144Hz monitor, LFC kicks in to prevent screen tearing.
2. If you're looking to reduce input lag, RTG created Radeon Anti-Lag (RAL). It does indeed reduce input latency, so you don't need to worry about having 1000 FPS in order to reduce or remove input latency from your peripherals.

So in either case RTG has it covered, lock, stock and barrel. With Nvidia you simply have to rely on brute power and IQ reduction techniques like DLSS to get the job done.


Even DLSS doesn't work.

On the left, native 2160p. In the middle, 1640p with AMD's FidelityFX enhancement. On the right, 2160p with DLSS.

[image: LvqGzmE.png]
 
Soldato
Joined
6 Aug 2009
Posts
7,073
Even DLSS doesn't work.

On the left, native 2160p. In the middle, 1640p with AMD's FidelityFX enhancement. On the right, 2160p with DLSS.

[image: LvqGzmE.png]

Very interesting. Probably not that noticeable with a moving image? Surprised how good that FidelityFX looks for non-native. I presume it gets you higher frame rates? Does it introduce any latency penalty?
 
Caporegime
OP
Joined
17 Mar 2012
Posts
47,954
Location
ARC-L1, Stanton System
Very interesting. Probably not that noticeable with a moving image? Surprised how good that FidelityFX looks for non-native. I presume it gets you higher frame rates? Does it introduce any latency penalty?

Don't know about input latency; I don't see how it would add any. It introduces about a 2% performance penalty, but what you do is reduce the resolution, in this case to 78% of 4K, and use FidelityFX to get back to 4K image quality, perhaps even a little better by the look of that image.
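To put rough numbers on why that resolution drop buys so much performance: GPU load roughly tracks pixel count, which falls with the square of the per-axis scale. A quick sketch; the per-axis interpretation of the scale slider and the exact percentages are assumptions, games vary:

[CODE]
# Pixel counts at reduced render scale vs native 4K (illustrative).
NATIVE_W, NATIVE_H = 3840, 2160  # 2160p / 4K

for scale in (1.00, 0.78, 0.76):
    w, h = round(NATIVE_W * scale), round(NATIVE_H * scale)
    share = (w * h) / (NATIVE_W * NATIVE_H)
    print(f"{scale:.0%} per-axis scale -> {w}x{h}, "
          f"{share:.0%} of the pixels to shade")
[/CODE]

Around 40% fewer pixels to render, which is where the frame rate gain comes from, with the sharpening pass itself only costing that ~2%.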

DLSS just blurs the crap out of everything, like FXAA, another one of Nvidia's 'enhancement' brain farts.

Tim from Hardware Unboxed made that video, a very detailed review. I'm going offline for a while, but when I get back I'll try to find it.
 
Soldato
Joined
8 Jun 2018
Posts
2,827
Don't know about input latency; I don't see how it would add any. It introduces about a 2% performance penalty, but what you do is reduce the resolution, in this case to 78% of 4K, and use FidelityFX to get back to 4K image quality, perhaps even a little better by the look of that image.

DLSS just blurs the crap out of everything, like FXAA, another one of Nvidia's 'enhancement' brain farts.

Tim from Hardware Unboxed made that video, a very detailed review. I'm going offline for a while, but when I get back I'll try to find it.

There you go.
 
Soldato
Joined
14 Apr 2009
Posts
4,818
Location
Cheshire
Oh come on..... it's a fantastic competitor to the 2070S, and it's quicker than the 2070.

First iteration of RDNA, and with lots of room to grow more shaders. It's a flying start...
Agreed.

A lot of people aren't on the same journey as us and AMD. They don't care for potential; it's all just now, now, now.

I'm very likely to buy an AMD card when my 1080 Ti is retired; thankfully it's pretty clear I'm not doing that for a couple of years, as it's still a monster.
 
Associate
Joined
26 Nov 2015
Posts
1,481
Location
Derby
This is all very promising. I still want to see what AMD will show in the future. The 5700 XT is a really good product, especially those custom AIB variants.
 
Associate
Joined
19 Sep 2014
Posts
61
The judgement on the 2070S vs the 5700 XT will come once enough AIB cards have been sold and we see what price both are at, say by 1st December, before the Xmas holiday season.

Currently the 5700 XT is in the low 400s for AIB cards and the 2070S is 500 or so.
 
Associate
Joined
4 Jul 2016
Posts
275
Intel will be refreshing Coffee Lake until well into 2021; they are expanding 14nm production and scaling back 10nm production, while their roadmap is Sunny Cove 2019, Willow Cove 2020, Golden Cove 2021, all 10nm according to WikiChip.

Intel will release a 10-core Coffee Lake, which will be DOA against AMD's 12-core, let alone the 16-core AMD will release the day after.

Coffee Lake's ring bus architecture limits the CPU to 10 cores; it's one reason AMD invented a new way of cranking up core counts, a way that actually "scales".

They will release the 5GHz all-core KS to claw back some of the losses against Zen 2 from their IPC deficit, which is about 10%. Intel have a 10% IPC deficit to AMD... < it was fun writing that :D

On the desktop, IMO Intel are screwed for the next couple of years, if not for good, because AMD are racing ahead at pace now. I don't think Intel care so much about the desktop now; they are losing, and losing badly, on server performance, performance per watt and price. They will put everything into getting 10nm working and ready for server because they don't need the clock speeds there, which 10nm isn't delivering anyway, even less than AMD's 7nm Zen 2. They need the yields ramped up, but even then, still with monolithic dies, they ain't catching AMD; the best they can do is reduce the total humiliation by Zen 2 EPYC.

'Intel are screwed'... except in gaming, where they are still the best. And none of that 'the gap is close' / 'the difference is negligible' BS please, which normally comes next.
 
Caporegime
OP
Joined
17 Mar 2012
Posts
47,954
Location
ARC-L1, Stanton System
'Intel are screwed'... except in gaming, where they are still the best. And none of that 'the gap is close' / 'the difference is negligible' BS please, which normally comes next.

The gap is tiny, it's not BS, it's the truth - you don't have to have an IQ above a warm glass of water to see that.
On the other side of the coin, the gap in things outside of games is rather large - in the red side's favour.

Indeed, 9900K at 5GHz vs 3800X at 4.5GHz, it's 2 to 5% to the 9900K.

It's nothing...

9900K: 116 vs 108 :3800X
9900K: 188 vs 184 :3800X
9900K: 104 vs 100 :3800X
9900K: 115 vs 110 :3800X
9900K: 140 vs 123 :3800X
9900K: 162 vs 158 :3800X
9900K: 125 vs 120 :3800X
9900K: 107 vs 103 :3800X
9900K: 165 vs 159 :3800X
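Quick script to turn those frame rates into percentages, if anyone wants to check the size of the gap for themselves (using only the numbers above, 9900K first, 3800X second):

[CODE]
# Per-title percentage lead of the 9900K over the 3800X, from the numbers above.
results = [
    (116, 108), (188, 184), (104, 100), (115, 110), (140, 123),
    (162, 158), (125, 120), (107, 103), (165, 159),
]

deltas = [100 * (intel / amd - 1) for intel, amd in results]
for (intel, amd), d in zip(results, deltas):
    print(f"9900K {intel:>3} vs {amd:>3} 3800X -> +{d:.1f}%")
print(f"average lead: +{sum(deltas) / len(deltas):.1f}%")
[/CODE]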



 
Last edited:
Soldato
Joined
27 Mar 2010
Posts
3,069
Indeed, 9900K at 5GHz vs 3800X at 4.5GHz, it's 2 to 5% to the 9900K.

It's nothing...

9900K: 116 vs 108 :3800X
9900K: 188 vs 184 :3800X
9900K: 104 vs 100 :3800X
9900K: 115 vs 110 :3800X
9900K: 140 vs 123 :3800X
9900K: 162 vs 158 :3800X
9900K: 125 vs 120 :3800X
9900K: 107 vs 103 :3800X
9900K: 165 vs 159 :3800X





What difference would there be if the laptop GPU was swapped for a high-end GPU like a 2080 Ti?
 
Soldato
Joined
22 Apr 2016
Posts
3,448
Indeed, 9900K at 5GHz vs 3800X at 4.5GHz, it's 2 to 5% to the 9900K.

It's nothing...

9900K: 116 vs 108 :3800X
9900K: 188 vs 184 :3800X
9900K: 104 vs 100 :3800X
9900K: 115 vs 110 :3800X
9900K: 140 vs 123 :3800X
9900K: 162 vs 158 :3800X
9900K: 125 vs 120 :3800X
9900K: 107 vs 103 :3800X
9900K: 165 vs 159 :3800X




So Intel is still king then ;) as you've nicely confirmed! The best thing about the Ryzen 3000 series is that I can drop one into my X470 mobo, and I can't wait until either the 3900X becomes available, or even the 3950X. A few FPS won't matter now to most people, given its other benefits.

Sadly I don’t get so excited about Navi, at best AMD’s 7mn is competing with the mid range 2070S which is Nvidia ‘old’ 12 mn. AMD have fallen further behind and the moment Nvidia release their 7mn it’s practically game over.
 
Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.
It's all over the place because in FH4 it's a match in performance for a 2080 Ti, maybe that's it?

Damn those stupid all-over-the-place GPUs...........

BF5 & FH4 represent the next gen of Xbox games that we'll start seeing.
Especially those on the MS Store, which are going to come with all the console optimisations.
 