Intel 10th Gen Comet Lake thread

You realise that the difference between PCIe 3.0 x8 and x16 is 2% with a 2080 Ti... and x8 has 50% less bandwidth.

How much do you think you are going to get from PCIe 4.0?
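For reference, the bandwidth numbers behind this exchange are easy to reproduce. A back-of-the-envelope sketch using the standard per-lane line rates and 128b/130b encoding (one direction, ignoring packet-level overheads):

    # Usable PCIe link bandwidth: per-lane rate x line-code efficiency x lanes.
    def pcie_gb_per_s(rate_gt_s, lanes):
        encoding = 128 / 130                       # 128b/130b coding (Gen 3 and Gen 4)
        return rate_gt_s * encoding * lanes / 8    # gigatransfers/s -> GB/s

    for gen, rate, lanes in [(3, 8.0, 8), (3, 8.0, 16), (4, 16.0, 16)]:
        print(f"PCIe {gen}.0 x{lanes}: ~{pcie_gb_per_s(rate, lanes):.1f} GB/s")
    # PCIe 3.0 x8:  ~7.9 GB/s
    # PCIe 3.0 x16: ~15.8 GB/s
    # PCIe 4.0 x16: ~31.5 GB/s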
Did you even bother to watch the video from Hardware Unboxed?

The 2080 Ti scales between PCIe 3.0 x16 and x8 by 10% to 15% across the low and max FPS.

When you look at the 5700 XT, the higher the resolution, the larger the performance loss under PCIe 3.0. The 1% lows are significant too: some games seem to hit the same peak FPS, but the 1% lows are higher under PCIe 4.0.

No matter how you cut it, the bigger bandwidth will allow faster transfers between the GPU and the system, especially under heavier workloads.

Say the 3080 has twice the performance of the 5700 XT. How that translates into FPS I wouldn't want to guess, but if I have to, you'll be looking at around a 10-20% loss in performance on AAA titles on average at PCIe 3.0 vs 4.0.
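To make the extrapolation in that last paragraph explicit, the reasoning is that a card demanding roughly twice the throughput would see a proportionally larger penalty from the same link. A naive sketch of that logic - every input here is an assumption, not measured data:

    # Naive linear extrapolation: assume the FPS penalty scales with how much
    # data the card pushes over the link. All inputs are assumptions from the
    # post above, not benchmark results.
    penalty_5700xt = 0.05      # hypothetical 5700 XT loss, PCIe 3.0 vs 4.0
    demand_ratio = 2.0         # assumed 3080 : 5700 XT throughput demand
    print(f"Extrapolated 3080 penalty: ~{penalty_5700xt * demand_ratio:.0%}")
    # ~10%, the low end of the 10-20% guess above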
 

Learn the difference between "fast" and "wide". PCIe isn't speed, it's bandwidth. Using a 2080 Ti at x8 and x16 is irrelevant.

Using a slow 5700 XT is also irrelevant.

There is no reliable information out in the market today. That's a fact. We will see what happens with the 30xx.
 
FAST and WIDE are the same thing when you're talking about Gb/s - take some humble pie here and stop being a troll.

If a lesser 5700 XT can be limited by the total bandwidth of the PCIe 3.0 lanes, then a GPU that can compute at far greater speed and requires much more data throughput to memory and the CPU would also be limited.

I look forward to saying "told you so" in a couple of weeks' time.
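For what it's worth, the arithmetic behind the "fast vs wide" argument is straightforward: per-lane transfer rate and lane count multiply into total link bandwidth, so a faster, narrower link can match a slower, wider one for sustained transfers. A toy sketch using the standard line rates and 128b/130b encoding:

    # Total bandwidth = per-lane rate ("fast") x lane count ("wide"),
    # so PCIe 4.0 x8 and PCIe 3.0 x16 land on the same usable throughput.
    ENCODING = 128 / 130                       # 128b/130b line coding

    def usable_gb_per_s(rate_gt_s, lanes):
        return rate_gt_s * ENCODING * lanes / 8

    print(usable_gb_per_s(8.0, 16))    # Gen 3 x16: ~15.75 GB/s (slower, wider)
    print(usable_gb_per_s(16.0, 8))    # Gen 4 x8:  ~15.75 GB/s (faster, narrower)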
 
[attached image: gXjLtLs.png]

That's probably just another AMD driver bug.

The 2080 Ti was the first GPU to gain a slight performance benefit going from PCIe 3.0 x8 to x16. Gen 3 x16 is still plenty fast enough for the 3090.
 
OEM? I'm after retail, as a one-year warranty is terrible IMO.

CPU warranty, lol. So not needed; CPU failure rates are astronomically low. In my lifetime, I've only seen broken CPUs where someone has physically broken them, whether that be bending pins, spreading conductive TIM on the pins/capacitors on the back of the CPU, attempting a delid, etc.

That said, I only buy retail as well, as I refuse to run the risk of a retailer mass-testing OEM chips and selling the worst-clocking ones as new. Of course all retailers deny this (doh!), but it absolutely still happens, all around the world.
 
I am not convinced that Gen 3 x16 is "plenty fast enough". Nvidia have stated they want testing performed on PCIe 4.0 systems; it would seem odd to request this if it weren't to show a positive benefit.
 
Guess I was right after all...

"There's a couple of key points to make here. First of all, if you've gone through our game-by-game breakdown, you will not be surprised to see PCIe 4.0 makes very little overall difference compared to PCIe 3.0. Almost nothing, in fact – at 1440p, the average difference between the 3900XT with PCIe 4.0 and the same CPU with PCIe 3.0 is a mere 2%, or 2 FPS. At 4K, the difference is reduced to exactly nothing.

The other point to bring up is CPU bottlenecking. At 1080p, the 10900K is simply a faster gaming CPU than the 3900XT, performing 10% better on average despite its PCIe 3.0 limitation. Even at 1440p and 4K, it’s simply averaging higher frame rates than the 3900XT with PCIe 4.0."
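Worth spelling out the arithmetic implied by those figures: if a 2% gap corresponds to 2 FPS, the baseline 1440p average works out to about 100 FPS, so the difference is as small in absolute terms as in relative ones. Just arithmetic on the quoted numbers, nothing more:

    # If a 2% deficit equals 2 FPS, the implied baseline average is
    # 2 / 0.02 = 100 FPS at 1440p.
    gap_fps = 2.0
    gap_fraction = 0.02
    print(f"Implied 1440p average: ~{gap_fps / gap_fraction:.0f} FPS")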
 
Time to eat some humble pie yourself. Rename yourself to 'dont-know-much-about-pc-guy'.
 
I was wondering how long it'd be before this post, heh.
 
Give it until the 4000 series, pal.
You know, the series that'll arrive soon and be capable of using PCIe 4.0 within its lifespan, rather than what you've spent your money on, which is completely done and unupgradable.
 
To add some more context to this:

[attached chart: avg-1080-1.png - average FPS at 1080p]


lol
 
Anyone else still thinking about picking one of these up following the Zen 3 launch? I keep coming back to a 10700K + MSI cashback deal for gaming. The 5800X just looks on par with the i7 from the slides.

Does this make me a fanboi now?
 
I occasionally hover over the 10850K, but I'd need a pretty decent offer to jump - I'm still in two minds about just shoving an old Xeon in my current rig and waiting it out a bit longer.

I have specific software needs which will keep me with Intel anyway.
 
After the announcement yesterday I'm still going down the Intel route, just finalising my parts now :D Although I feel I'm just around the corner from getting a good deal -.-
 
I got my 10900K three weeks ago and I do wonder if I made the right decision; my PC is purely for gaming. I know it is still a great CPU, and I am wondering whether my all-core OC at 5.1 GHz will be on par with or better than the 5900X.
 
It'll be fine - your CPU will happily be playing games at a decent pace for the next five years or more.
 