GeForce GTX 1180/2080 Speculation thread

Off-screen 12 minutes of gameplay with RT on. Welcome back to cinematic 30 fps at 1080p :D

http://www.pcgameshardware.de/Grafi...ormance-in-Shadow-of-the-Tomb-Raider-1263244/

:eek: 35fps, sub 50s all the time.

Yet @Paladine dismissed the Hardware Unboxed video, when Steve said that with ray tracing on there looked to be a ~50% drop in the video's FPS.

I wonder, though: will Turing have better HDR + G-Sync without losing performance, or are we going to have slideshow gaming with ray tracing + HDR + G-Sync?
Which of course would result in the brigade saying that "there is no difference between High and Ultra settings. Turing is powerful enough."
Just as they are already saying "the cooler is good because it needs it", while not even the 295X2, the Titan Z or the GTX 690 had such over-engineered coolers.
And those were dual-GPU behemoths burning a LOT of power.
 
Are you really trying to tell me that he wasn't referring to HDMI, when he was talking about the free open standard and mentioning consoles, which don't have DisplayPort outputs on them?

And just to add, how sad are we that we just won't let things drop or say we are wrong when we are wrong? And yes, I know I have been, several times in the past.

Whether it's over HDMI or DP doesn't change the facts, Bru. VRR doesn't add anything to the royalties on either. There are no royalties for using DP, so VRR is free there; there is a fee for using HDMI, but that fee is the same whether you use VRR or not, so again no cost for VRR. Unless you are somehow suggesting that a console wouldn't have an HDMI port on it at all if it wasn't for VRR?

So his statement is correct: there is a free, open standard available that improves PCs.

And Bru, if I get things wrong, I have no problem admitting it and moving on, but when I know I am right, I will never back down. It's a flaw; I should just drop it, but I can't seem to help myself.

The meme "somebody on the internet is wrong!" is all me :p
 
Unfortunately I wrote IF. Personally I doubt it; however, it doesn't require Gimpworks to add ray tracing, and it's going to be coded with PS5 (Navi) hardware in mind as well.



Please pop over to the AMD website and have a read of the technical documentation for RadeonRays 1 & 2.
It has supported hardware ray tracing on CUDA & GCN since 2013, and AMD has provided pretty good detail on how to do development on CUDA cores.
I bet it won't take much for RadeonRays to support the ray tracing cores on the Turing cards either.
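To give a flavour of what that API looks like, here is a minimal sketch of a single intersection query, written from memory of the RadeonRays 2.x SDK headers (IntersectionApi, CreateMesh, QueryIntersection and friends); treat the exact signatures and the chosen backend as assumptions rather than copy-paste-ready code:

```cpp
// Minimal RadeonRays-style intersection query (sketch only; signatures are
// approximated from the 2.x SDK headers and may differ between versions).
#include "radeon_rays.h"

using namespace RadeonRays;

int main()
{
    // Run on the first OpenCL device found (e.g. a GCN GPU); the 2.x SDK
    // also exposes Vulkan and Embree backends.
    IntersectionApi::SetPlatform(DeviceInfo::kOpenCL);
    IntersectionApi* api = IntersectionApi::Create(0);

    // Scene: a single triangle.
    const float vertices[] = { 0.f, 0.f, 0.f,  1.f, 0.f, 0.f,  0.f, 1.f, 0.f };
    const int   indices[]  = { 0, 1, 2 };
    const int   faceverts  = 3;
    Shape* tri = api->CreateMesh(vertices, 3, 3 * sizeof(float),
                                 indices, 0, &faceverts, 1);
    api->AttachShape(tri);
    api->Commit();                       // builds the acceleration structure on the device

    // One ray aimed at the triangle.
    ray r(float3(0.2f, 0.2f, -1.f), float3(0.f, 0.f, 1.f));
    Buffer* rays = api->CreateBuffer(sizeof(ray), &r);
    Buffer* hits = api->CreateBuffer(sizeof(Intersection), nullptr);

    // Fire the query; results land in 'hits' as Intersection records
    // (shapeid/primid of the hit, or -1 on a miss), which can then be
    // read back with MapBuffer.
    api->QueryIntersection(rays, 1, hits, nullptr, nullptr);

    IntersectionApi::Delete(api);
    return 0;
}
```

The point being that the library hides which backend actually traces the rays, which is why the same code path could in principle target OpenCL on GCN, CUDA, or whatever else gets added.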

That shows the corporate ethics of each company.

I guess that does make or break the difference in corporate ethics. There's a reason we're still using rasterization on current hardware, just as there's a reason we rarely ever see AMD pushing technologies into new and upcoming games.
 
The yields/profit argument @Boomstick777 mentioned is valid, mate.
The Titan V came with a $3000 price tag and was barely faster than the GTX 1080 Ti FE, let alone the factory-overclocked GTX 1080 Ti monsters.
And a $3000 card won't sell as many units as an $800 card.

The same applies to Turing.
Just look at it: the chip IS HUGE. Look at the cooling on those VRMs!
The Vega 64 Nitro+ at 376W max power consumption (when truly overclocked all out) doesn't have such cooling on its VRMs.
Hell, not even my 295X2 had such VRM cooling, and that was a card that needed a 1000W gold-rated PSU to run.


True, this is the EVGA one, but no other card before had such cooling.

The Turing cards, at least the 12nm generation, are expensive to make because the chip is big, and yield gets worse as the chip gets bigger (fewer dies fit on a wafer, and each one is more likely to contain a defect).
GDDR6 RAM is cheap and doesn't cost ~$200 like the HBM used on the Vega 64.
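To put a rough number on the "big die = expensive" point, here is a toy calculation using the usual dies-per-wafer approximation and a simple Poisson yield model. Every figure in it (die areas, defect density, wafer cost) is an illustrative assumption, not an actual Nvidia or TSMC number:

```cpp
// Toy illustration: bigger dies mean fewer candidates per wafer AND lower yield,
// so the cost per good die climbs much faster than the area does.
// All inputs are assumed/illustrative values.
#include <cmath>
#include <cstdio>

int main()
{
    const double kPi            = 3.14159265358979;
    const double wafer_diameter = 300.0;   // mm (standard wafer)
    const double wafer_cost     = 6000.0;  // USD per processed wafer (assumed)
    const double defect_density = 0.1;     // defects per cm^2 (assumed)

    const double die_area_mm2[] = { 314.0, 754.0 };  // e.g. a GP104-ish vs a TU102-ish die
    const char*  label[]        = { "smaller die", "huge die" };

    for (int i = 0; i < 2; ++i)
    {
        const double a = die_area_mm2[i];
        const double r = wafer_diameter / 2.0;

        // Common dies-per-wafer approximation: gross area term minus edge loss.
        const double dies_per_wafer =
            kPi * r * r / a - kPi * wafer_diameter / std::sqrt(2.0 * a);

        // Poisson yield model: probability a die has zero defects = exp(-D * A).
        const double yield     = std::exp(-defect_density * (a / 100.0)); // A in cm^2
        const double good_dies = dies_per_wafer * yield;

        std::printf("%s (%.0f mm^2): %.0f candidates/wafer, %.0f%% yield, ~$%.0f per good die\n",
                    label[i], a, dies_per_wafer, yield * 100.0, wafer_cost / good_dies);
    }
    return 0;
}
```

With those assumed inputs the big die works out to roughly four times the cost per good chip, which is the whole point: the silicon itself gets disproportionately expensive as the die grows.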

LOL, in the video they say 2080 Ti but the chip says TU104; I thought all the Ti cards were TU102.
 
Another very important thing to wait and see: have they fixed DX12 with these cards?

If in next-gen titles such as BF5 it's still faster to turn DX12 off and use DX11, then I don't see the point in the upgrade.
 
I think we need to see it first; we're speculating atm. No doubt the next gen after this will do it better, but I wouldn't be surprised if NV have a fairly decent implementation already. It's done at the hardware level, after all.

Not if the Tomb Raider video is anything to go on, ~30FPS at 1080p on a 2080ti :(
 
Not if the Tomb Raider video is anything to go on, ~30FPS at 1080p on a 2080ti :(
Hmm, well, good on them for showing the FPS counter :).
Let's see.
I'm thinking we could do with an 11** series without the RTX stuff for now but with more CUDA cores :p. But without the technology being bought, it's unlikely to be implemented in games.
 
I guess that does make or break the difference in corporate ethics. There's a reason we're still using rasterization on current hardware, just as there's a reason we rarely ever see AMD pushing technologies into new and upcoming games.

Hardware async compute and DX12/Vulkan are things AMD has been trying to push for a long time now, but it cannot afford to splash millions per game like Nvidia does.
The latter does exactly that to keep games on DX11 and add Gimpworks, or at least to force DX11 support.

Look at some recent ports. All the AMD optimizations that exist on consoles were stripped from the same titles when they came to PC.
The result is that you need pretty beefy hardware to achieve the same visual quality and performance as the consoles.
Yet AMD-sponsored titles (FC5, for example) work perfectly regardless of whether the hardware is Nvidia or AMD.

Though there are titles like SW Battlefront 2 that have kept a lot of those optimizations. The Division 1 also (DX12 on the Vega 64 works better than on my old GTX 1080 Ti Xtreme; DX11 is a different story).
Have you seen how Battlefront 2 performs on Nvidia cards with DX12 or HDR activated? Appalling, even for the damn Titan Xp, which last year was considered above the 1080 Ti Xtreme.
A card that atm costs 3 times more than a Vega 64, and is fricking slower at 4K HDR + G-Sync!
Yet in other games at the same settings it is barely a handful of fps faster. And that is on a £3400 setup (27" 4K HDR G-Sync monitor + TXp) compared to a £900 setup (35" 4K HDR FreeSync monitor + V64 Red Devil).

True, it's not all in the same review and you need to connect the dots from two different reviews, but for heaven's sake -_-

People should have grabbed pitchforks against Nvidia a long time ago, not blamed AMD for not competing.
 
I'd like to see benches with RT turned off, because at this point (for me, anyway) RT is completely unusable. Sub-40 fps at 1080p isn't going to cut it. Let's see what the cards do with it off. As they showed zero benchmarks during the keynote and made no claims about its performance compared to Pascal outside of ray-traced slideshows, I am worried.
 
Sounds like you're just trying to convince yourself here. You jumped at the pre-order and told Nvidia you will buy anything they release, at any price, without seeing benchmarks or performance specs. Now pay the consequences. Enjoy your 22 fps ray tracing at over a grand lol.

This post is so full of salt but it got a good laugh out of me.

"Enjoy your 22fps Ray tracing at over a grand lol." :D :D :D
 
So a quick thought... if AMD have supported ray tracing for some time, but it never took off, does that mean that in these new games which come out you will be able to turn on ray tracing like on the Nvidia cards, or will it be "proprietary" like PhysX?

If it is open, regardless of NV's motivations behind it, could this not be good for AMD owners who have had the hardware but nothing to use it on?
 
Where's that man Gibbo when you need him lol, probably nursing the hangover after partying with Jensen last night :D.
 