AMD Zen 2 (Ryzen 3000) - *** NO COMPETITOR HINTING ***

I can't believe people are arguing over framerates north of 144fps... who buys a 2080ti and does NOT use G-Sync??? I mean, seriously... as if any of this even really matters. Let's be honest, if you're north of 150fps on an adaptive sync screen you're getting the best experience anyhow, regardless of whether it's 160fps or 190fps.

And another thing: most people tend to frame cap their cards to the refresh rate of their monitors. I did with my 144Hz 1440p screen on my Vega 64, and now with my UW 1440p 100Hz screen I limit the fps too. This has a desirable side effect: if you're playing something old that doesn't take much GPU grunt, your GPU stays super quiet too :)

But in reality, who really cares if one CPU gives 190fps on a 2080ti while the other gives 180fps? It's not like that extra few FPS is even noticeable on your adaptive sync screen. Oh wait, pull out the CS:GO players' cards... they are an extreme minority, and as far as I'm aware anything north of 450fps in that game is golden, and as both chips achieve that with ease it's a NON ISSUE.
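
For anyone who wants to check the maths on that, the frame-time gap between those two numbers is tiny. A rough Python sketch, using just the 180fps/190fps example above:

    # Per-frame times for the 180fps vs 190fps example above
    t180 = 1000.0 / 180  # ~5.56 ms per frame
    t190 = 1000.0 / 190  # ~5.26 ms per frame
    print(t180 - t190)   # ~0.29 ms difference per frame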

Some people just tailor an argument to suit their needs and try to kick back with a smug look on their face, when in reality they are talking a load of old tosh and drivel, and in real-world scenarios most of what they harp on about is a non-issue.

Show me someone who has the cash to buy a 2080ti and doesn't own an adaptive sync screen, and I'll show you a moron with more money than sense.
 
Why did you ignore the 4.8GHz 8700K at 187 FPS then? Is it because it doesn't fit your narrative? How much faster do you think the 9900K, with one core at 5.0GHz, is going to be than the 4.8GHz 8700K at 187 FPS? Don't worry, I don't expect an answer since you ignored it the first time ;)

You're trying to imply there will be a negligible difference between a 6C 4.8GHz all-core clock CPU (I assume? no information provided) and an 8C 4.7GHz all-core clock one.

Again, what's the reasoning behind AMD using an RTX 2080? It only benefits them to do so, and it's disappointing.
 
BLS2K8G4D30AESBK seems to be ~£80. On paper it has lower specs than other similarly priced kits. I know you've had a good experience, but how is one supposed to know which kit will run at higher speeds or tighter timings? It's a nightmare. :(

You need to brush up on your Google Fu. :)

Many people have used these kits with great success, not just me; Micron E-die is topping the world speed records alongside Samsung B-die. A good balance between speed and timings is what works well for me, but it depends on your application and on how the 3xxx chips react to improved latency and sub-timings; they might not be anything like the 1xxx or 2xxx in those respects.

It's hardly a nightmare, it is all part of the overclocking game and is great fun. :)
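
If it helps when comparing kits on paper, a rough way to weigh speed against timings is first-word latency: CAS latency divided by half the data rate. A quick Python sketch, with the kit specs purely illustrative rather than taken from the posts above:

    # First-word latency in nanoseconds: CL cycles at the memory clock (half the data rate)
    def first_word_latency_ns(data_rate_mts, cas_latency):
        return cas_latency * 2000.0 / data_rate_mts

    # Illustrative kits, not measured results
    print(first_word_latency_ns(3000, 15))  # 10.0 ns
    print(first_word_latency_ns(3200, 14))  # 8.75 ns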
 
You're trying to imply there will be a negligible difference between a 6C 4.8GHz all-core clock CPU (I assume? no information provided) and an 8C 4.7GHz all-core clock one.

Again, what's the reasoning behind AMD using an RTX 2080? It only benefits them to do so, and it's disappointing.

Jesus, let it go already. The only thing you come into this thread to do is dance around waving that silly banner of "AMD are scared to show 2080ti results". It's boring and pathetic. As I stated above, both chips are MORE than capable of playing all current games at decent FPS when paired with the correct GPU, and for most people now running adaptive sync screens the difference between 180fps and 200fps is not even noticeable. So just give it a rest already: either discuss the damn Ryzen 3000 series chips like an adult or take your stupid tinfoil hat conspiracy theories elsewhere, we don't want any more of your drivel.
 
Will have to dig up the thread on the ASRock Z370 - had a forum member on here with a reply from ASRock (about their Taichi, I think it was) stating that adding 4 sticks lowered all sticks to the default speed. Will have to double check it and the X570 boards - was pretty surprised at ASRock's reply.

New BIOS and AGESA are much better; running 4 x 8GB 8 Pack sticks (32GB) at 3200 C14 with no problems.
 
You're trying to imply there will be a negligible difference between a 6C 4.8GHz all-core clock CPU (I assume? no information provided) and an 8C 4.7GHz all-core clock one.

Again, what's the reasoning behind AMD using an RTX 2080? It only benefits them to do so, and it's disappointing.

Nope, I asked you a question: why do 'you' think that the 9900K will steam ahead, as you put it? GTA V doesn't fully use an 8700K's 6c/12t, so a 9900K only really offers the extra boost clock speed; once you hit 6 cores in GTA V, extra cores add nothing, but IPC and clock speed will.

Let me ask you this: how relevant is 8 FPS, or 12 FPS, or 14 FPS in a game that is 6 years old, running at a resolution that is fast becoming outmoded, and is already hitting 170+ FPS?
I don't give a flying monkey myself, but I am sure there are niche edge-case people it matters to, and they will not rest until the injustice is rectified!
 
Nope, I asked you a question: why do 'you' think that the 9900K will steam ahead, as you put it? GTA V doesn't fully use an 8700K's 6c/12t, so a 9900K only really offers the extra boost clock speed; once you hit 6 cores in GTA V, extra cores add nothing, but IPC and clock speed will.

Let me ask you this: how relevant is 8 FPS, or 12 FPS, or 14 FPS in a game that is 6 years old, running at a resolution that is fast becoming outmoded, and is already hitting 170+ FPS?
I don't give a flying monkey myself, but I am sure there are niche edge-case people it matters to, and they will not rest until the injustice is rectified!

+1 to this. His mock outrage is hilariously sad; it's almost like he's trying to drum up support for his pathetic argument.
 
Let's be blunt: framerates north of 144fps are rather more about big numbers than actual performance, and frankly I'm very skeptical that the average person would even notice the difference in the first place.

You're literally talking about 2.78 ms of extra frame time at 144fps versus 240fps...

Compare that to a difference of 9.72 ms between 60fps and 144fps. AMD knows where the value lies here, because they literally showed off an "anti-lag" feature which, going by the slide, is going to be far more useful, but it's early days.
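
For anyone who wants to see where those figures come from, it's just the per-frame time difference. A quick Python sketch:

    # Per-frame time in milliseconds at a given framerate
    def frame_time_ms(fps):
        return 1000.0 / fps

    print(frame_time_ms(144) - frame_time_ms(240))  # ~2.78 ms saved per frame going 144 -> 240
    print(frame_time_ms(60) - frame_time_ms(144))   # ~9.72 ms saved per frame going 60 -> 144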
 