GeForce GTX 1180/2080 Speculation thread

LOL, Nvidia inadvertently promotes AMD CPUs.
https://www.pcgamesn.com/amd-battlefield-5-hardware

the new Turing graphics architecture isn’t the be-all and end-all of real-time ray tracing. AMD’s own hardware might have something to say about the performance of games with all those billions of rays of light beaming around.
We spoke with DICE technical director, Christian Holmquist, who explained that in order to get the RTX-enabled version of Battlefield 5 running well it needed to “go wide” and utilise many more cores than its standard recommended spec.
To achieve the results it wanted with the ray traced version of the game, however, DICE has been targeting a beefy CPU with 12 threads, such as the mighty Ryzen 5 2600.

“What we have done with our DXR implementation is we go very wide on a lot of cores to offload that work,” explained Holmquist, “so we’re likely going to require a higher minimum or recommended spec for producing RT. And very wide is the best way for the consumer in that regard, with a four-core or six-core machine...12 hardware threads is what we kind of designed it for.”
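For a picture of what "going wide" means, here's a minimal sketch (hypothetical names, not DICE's actual code) of fanning per-ray CPU work out across however many hardware threads the machine has, rather than leaving one core at 100%:

```cpp
#include <algorithm>
#include <future>
#include <thread>
#include <vector>

struct Ray { float origin[3]; float dir[3]; };

// Stand-in for whatever per-ray CPU-side work the engine does
// (sorting, culling, building dispatch lists) before the GPU takes over.
void prepare(Ray& r) { /* ... */ }

// "Go wide": split one big batch of rays across all hardware threads.
void prepareRaysWide(std::vector<Ray>& rays) {
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    const size_t chunk = (rays.size() + workers - 1) / workers;

    std::vector<std::future<void>> jobs;
    for (unsigned w = 0; w < workers; ++w) {
        const size_t begin = w * chunk;
        const size_t end   = std::min(begin + chunk, rays.size());
        if (begin >= end) break;
        jobs.push_back(std::async(std::launch::async, [&rays, begin, end] {
            for (size_t i = begin; i < end; ++i) prepare(rays[i]);
        }));
    }
    for (auto& j : jobs) j.get(); // wait for every worker to finish
}
```

On a 6c/12t chip like the 2600, hardware_concurrency() reports 12, which lines up with the "12 hardware threads is what we kind of designed it for" quote.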

There you have it, ladies and gents: RTX is not what it's marketed to be.

And I'm quite sure that other developers are coming to the same conclusion (1080p 30-60 fps). You need to incorporate and use 12 or so hardware threads to get RT running. That's the problem: RTX can't do it alone. Better yet, there is no real proof that RTX is doing it at all, to be honest.


Interview with a Vampire: RTX
Interviewer: Developers say that you aren't doing ray tracing alone and are choking to even do 1080p smoothly. What do you say to those who've pre-ordered you?
RTX: That's an excellent question interviewer. Dah Da Da Da Da Da Da ho ho hoooooooo... Lo Lo Loooo Lo Lo Loooooo Lo Lo Looooo Lo Lo Lo Loooooooooooooooooo...AAIIEEEEEEEEEEEEEEE

And there you have it: ray tracing delegated back to the CPU, as it always has been.




-------------------------------------------------------------------
Video removed from YouTube?
Yes, it appears to have been removed for violating terms, i.e. not being legit.
 
But that's been my point all along: if you use real-time ray tracing, it will tank the framerate because the GPU grunt isn't there yet. And the only way they will get it to playable frame rates is basically by faking it. The important thing isn't the overall performance; it's that ray tracing is finally in consumer-level GPUs and game developers are using it. The performance will come down the line with future cards. That's why I didn't agree with you when you said it would be a disaster for Nvidia. It won't be; even if the cards themselves aren't a massive success, it will be another feather in Nvidia's cap.

I think we were talking at cross purposes. RTX will be a slow trickle... it's not as though on day one of launch there'll be half a dozen games that demand "full real-time ray tracing" across all aspects of gameplay. Were that the case, these cards would just crumble, unable to deliver anywhere near acceptable frame rates. So that clearly won't be the situation we're faced with.
 
LOL, Nvidia inadvertently promotes AMD CPUs.
https://www.pcgamesn.com/amd-battlefield-5-hardware




Posts like this make me scratch my head. I am not sure why people think everything is "AMD vs Nvidia" and has to be. I would happily run a Ryzen CPU if mine wasn't up to the task, and they are very good CPUs. It is a good article, and it explains that the more CPU cores you have, the better the performance will be, which is fantastic news. The olden days of having a 6c/12t chip with only one core sitting at 100% are hopefully long gone.

And as for that very childish end statement: "RT on the CPU like it has always been" is pathetic!

Edit:

A good article but a troll post.
 
Right now, Ryzen has established its brand as the more competitive CPU when it comes to multicore tasks. We also know that Intel is playing catch-up with their 9000 series, being released on the alleged Z390 chipset later this year.
Furthermore, the article mentioned the 2600 (not an Intel CPU). So you are incorrect on that nebulous claim of AMD vs Nvidia, as Nvidia doesn't make CPUs. That is a strawman you created just now. So, to further clarify: RT is helping AMD CPUs.

RT isn't something new, nor was it developed by Nvidia's RTX lineup, as you seem to allude. It goes as far back as the 1980s, when it was run on the CPU, albeit slowly. Besides, RT isn't an API limited to the GPU either.
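To be fair to the history: the heart of those '80s CPU tracers was just per-ray geometry, solved on whatever cores you had. A minimal sketch (illustrative, not from any particular renderer) of the classic ray/sphere intersection test they ran entirely on the CPU:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };
static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3  sub(Vec3 a, Vec3 b) { return {a.x-b.x, a.y-b.y, a.z-b.z}; }

// Classic Whitted-era test: solve |o + t*d - c|^2 = r^2 for t.
// Assumes d is normalised; returns the nearest hit in front of the ray.
bool intersectSphere(Vec3 o, Vec3 d, Vec3 c, float r, float& tHit) {
    Vec3 oc = sub(o, c);
    float b = dot(oc, d);
    float cc = dot(oc, oc) - r * r;
    float disc = b * b - cc;           // discriminant of the quadratic
    if (disc < 0.0f) return false;     // ray misses the sphere entirely
    float s = std::sqrt(disc);
    float t = -b - s;                  // nearer of the two roots
    if (t < 0.0f) t = -b + s;          // ray origin is inside the sphere
    if (t < 0.0f) return false;        // sphere is behind the ray
    tHit = t;
    return true;
}
```

Fire millions of those per frame and you can see why it was slow on an '80s CPU, and why dedicated hardware for exactly this test is the RTX pitch.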

The context of the article brings something to light here. What is it that the RTX is doing if DICE still needs more cores to get ray-traced reflections up to speed? Because the only thing demonstrated in BFV was reflections. Them telling us that they need more CPU powah (AKA hardware threads) to get RT working throws a lot of shade at the RTX brand as a whole.
 
I don't understand what he means by a new tier. The 2080Ti is the same tier of part as the 1080Ti, 980Ti and 780Ti. Same with the 2080, and I would bet money that the 2070 will be the same tier as the previous generations too, i.e. the 1070, 970, 770.

There isn't any new tier; they just increased the prices.

That Jay guy is pretty low down on the intelligence curve. Nice guy and all, and he is doing well, so good luck to him. He is the type of person that comes up with some wacky understanding and has that overly confident character to sell it to most average people. He should work in marketing.

They've changed the tiers? WTF. Or rather, they haven't released a Titan and have upped the prices. Never argue with an idiot; they'll beat you with experience every time.
 
Did you read the article you posted? And I went on your very first line of "LOL, Nvidia inadvertently promotes AMD CPUs.", which straight away points to you coming across as making an Nvidia vs AMD post... You don't need to be smart to work that out. Are you 12 years old or less?

As for the rest of your nonsensical post: RT is something that requires a lot of grunt to run, even more so in real time, and hence why we haven't really seen it being used before. As for your statement about it running on the CPU back in the '80s, sure it could/can, but why hasn't it been done? Think about that ;)
 
I like Jay, but I tend to take his articles, and many others, with a pinch of salt. I would also test the 2080Ti against a 1080Ti, the 2080 against a 1080, etc. I don't understand the need for 2080 vs 1080Ti, but it will show what the 2080 is capable of, I guess, for those on a 1080 or less.
 
And I went on your very first line of "LOL, Nvidia inadvertently promotes AMD CPUs.", which straight away points to you coming across as making an Nvidia vs AMD post... You don't need to be smart to work that out. Are you 12 years old or less?
No, it is not. And I have already replied to you to clarify that it is not.

As for the rest of your nonsensical post: RT is something that requires a lot of grunt to run, even more so in real time, and hence why we haven't really seen it being used before. As for your statement about it running on the CPU back in the '80s, sure it could/can, but why hasn't it been done? Think about that ;)
Interesting that you still understood the post. ;)
But the question still remains: what is it that RTX is doing versus what DICE is requiring from the CPU? The workload of ray-traced reflections in BFV will be shed at some point. However, as it stands right now, I don't see the RTX being what it's claimed to be if a developer is looking for a higher-core-count CPU.
 
You are asking me why RTX is making good use of CPU cores/threads? I would put that down to the tensor cores working independently of the CUDA cores, and the more threads and cores your CPU has, the easier it is for everything to work together. The old days of one core running at 100% and then being handed another workload would result in massive bottlenecking. If you want to act smart, you should really know this stuff, but crack on with your "That's the problem: RTX can't do it alone. Better yet, there is no real proof that RTX is doing it at all, to be honest." :rolleyes:

Too many people on here of late are making stuff up, and it is getting very annoying now.
 
Considering DXR is part of DirectX 12 and Nvidia's RTX technology is designed to accelerate it, is it not obvious?

I have been reading the OCUK forums for years, but the troll posting towards this latest card is ridiculous.

AMD shareholders club springs to mind!

Edit - I presume you know that part of DirectX 12 is designed to get the best out of multicore CPUs.
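For reference, the DXR part really is just a DirectX 12 feature query. A rough sketch (assumes the Windows 10 October 2018 SDK headers; illustrative, not from any shipping game) of how an engine asks whether ray tracing is available:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

// Ask D3D12 whether DXR is available. On a Turing/RTX card the driver
// reports hardware-accelerated Tier 1.0; otherwise the tier comes back
// as D3D12_RAYTRACING_TIER_NOT_SUPPORTED.
bool supportsDXR() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
        return false;

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;

    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```

The multicore point stands too: D3D12 lets each CPU thread record its own command lists, which is exactly the sort of thing DICE means by going wide.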
 
Too many people on here of late are making stuff up, and it is getting very annoying now.
But here's the thing: you are doing exactly what you accuse others of. There is nothing you've said that refutes the need to know why more hardware threads are needed when JHH said himself "it just works".
Well, according to DICE via the interview, no, it doesn't "just work". But all will be made clear once the specs are announced for running BFV with RT that only uses reflections.
Whether one will need an RTX with a minimum of a 2600, or can use their existing PC, is something I will keep my eye out for.

But thank you for helping me bring this to the surface, as it's something worth discussing. I originally thought that all you needed was an RTX. But now it's becoming clear that even with an RTX card, a CPU upgrade may be required.
:)

Considering DXR is part of DirectX 12 and Nvidia's RTX technology is designed to accelerate it, is it not obvious?
The convo is actually about the interview with DICE. I don't think that the article needs to demonstrate that the game is using DXR in order for it to be clear that DICE needs more hardware threads to get ray-traced reflections working.
:D
 
The whole GPU section has become unreadable. The same dull “it’s so expensive” line is tiresome, and it's from a few new members; do the school holidays have anything to do with it? I have a number of users on ignore. Not had to do that for a while.

Edit. To remain on topic, I think the cards are going to be great. I won’t be dipping in though. My 980Ti’s doing fine for WoW.
 
Are you kidding me, man, that Tom Petersen doesn't know the exact performance figures? LOL, not a hope.

Sandbagging? In a launch that's been plagued with bad press, huge price rises, rumours of poor performance, articles about low frame rates in ray-traced games, etc.? Sandbagging, haha. Nope.

You aren't applying any logic; you are dismissing what the Technical Marketing Director of Nvidia is saying because it doesn't suit your beliefs. All the leaks so far, combined with what Tom Petersen says, suggest strongly that the performance in standard games between the 1080Ti and the 2080 will be broadly similar.

Tensor cores and ray tracing cores have nothing to do with the figures that Tom Petersen mentioned. He was discussing the performance in standard games, games that haven't been developed to use either ray tracing or DLSS; you know, most of the games currently on the market.



So I guess he isn't sandbagging after all, and the figures he quotes are exactly where the performance will be on release.
Did he say he didn't know the exact figures or not? That was what I was referring to :D. He may know the exact figures but didn't want to share; who knows and who cares?
I don't think I was dismissing anything, but I don't believe every fake fact or word spoken either. He was reluctant to talk more about figures, regardless of whether he knew them or not. The facts will be in the actual performance, so why over-analyse all this stuff?
As I said, let's wait and see, eh?
We can discuss all we like, but we all know NV will dangle a big enough carrot, right? This time next year, half of those trying to rip NV a new one (upset over pricing) will own a 20 series, I reckon, and that'll probably include you, if you're interested in playing future games with a more future-proof card :D rather than some 2.5-year-old cards. I'm keeping the 1070 Ti and am tempted to cancel the 20 series card I ordered (but probably won't), but I won't be changing my mind even if I do; they'll be great.
And you're right, he may not be sandbagging, but I was opening your eyes to the possibility that he might be! Folks seem to be over-analysing everything and spinning it to the conclusion they want. I reckon Jensen is laughing his head off at the internet going mad over the speculation anyway. His 1180 joke at the start of the presentation tells me he loves a laugh and a wind-up!
More folks are being swayed, it seems. Haters need to get with the programme :D (only kidding).
 
But here's the thing: you are doing exactly what you accuse others of. There is nothing you've said in your diatribe that refutes the need to know why more hardware threads are needed when JHH said himself "it just works".
Well, according to DICE via the interview, no, it doesn't "just work". But all will be made clear once the specs are announced for running BFV with RT that only uses reflections.
Whether one will need an RTX with a minimum of a 2600, or can use their existing PC, is something I will keep my eye out for.
:)


The convo is actually about the interview with DICE. I don't think that the article needs to demonstrate that the game is using DXR in order for it to be clear that DICE needs more hardware threads to get ray-traced reflections working.
:D
Sigh. Way to ignore my explanation because you would rather carry on looking silly. Anyways, I am going out for a few beers in a bit, so I will let you do some swotting up, and hopefully when I am back tomorrow you will have realised why some games use more CPU cores than others, and then you might understand why RT is also calling on cores/threads for usage!
 
Fantastic plan, cheers!
 
But the question still remains: what is it that RTX is doing versus what DICE is requiring from the CPU? The workload of ray-traced reflections in BFV will be shed at some point. However, as it stands right now, I don't see the RTX being what it's claimed to be if a developer is looking for a higher-core-count CPU.

Just now, a game engine only has to process the field of view that the user can see on their screen, but with ray tracing, in order to get reflections of things happening behind you and around corners, the game engine also has to process those additional views. Wouldn't that account for the extra CPU load?
 
A game engine still processes the whole scene. It's only with traditional rasterisation, when it comes to rendering, that frustum culling is used to decide what is actually drawn (plus some kind of potential-visibility system so you aren't rendering things that are occluded, i.e. around a corner, unless they need to be). All the geometry data, etc. is still processed, as you need it for physics, and the texel data and so on can be looked up if needed, which will add some CPU overhead, but that is part and parcel of ray tracing.
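A quick sketch of the difference (standalone, illustrative names): a rasteriser can test each object against the six view-frustum planes and skip anything fully outside, whereas a ray tracer has to keep the whole scene available because a bounced ray can hit objects off-screen or behind the camera:

```cpp
// Six inward-facing frustum planes; nx*x + ny*y + nz*z + d >= 0 means
// "on the visible side" of that plane.
struct Plane { float nx, ny, nz, d; };

// Rasterisation-style culling: reject an object (bounding sphere) that is
// entirely outside any one plane. DXR reflections can't rely on this,
// since rays may need geometry the camera can't see.
bool sphereInFrustum(const Plane (&frustum)[6],
                     float cx, float cy, float cz, float radius) {
    for (const Plane& p : frustum) {
        float dist = p.nx * cx + p.ny * cy + p.nz * cz + p.d;
        if (dist < -radius) return false; // fully outside: safe to cull
    }
    return true; // inside or straddling: must be drawn
}
```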
 