Geforce GTX1180/2080 Speculation thread

Soldato
Joined
19 Jan 2010
Posts
4,806
https://www.nvidia.com/en-us/geforce/news/justice-online-geforce-rtx-ray-tracing-dlss/


Justice will use both DXR and DLSS. It looks like a very nice MMO and I hope to buy it if it comes to Steam.
This is all very nice, but why on earth anyone would want it now, when performance is so low even on the beastliest cards, is beyond me. In the scene with more than one person in it, the frame rate tanks to what looks like 15-20 fps at what looks like 1080p.

Does it look nice? Yes, it looks incredible. Is it worth the money vs performance return right now? No way!

I must admit, though, I do thank the early adopters; without you guys buying up the new stuff, the rest of us who wait for acceptable performance vs price wouldn't get a chance to buy into the tech, as it would become obsolete.
 
Soldato
Joined
13 Jun 2009
Posts
6,847
DLSS is basically "fake 4K", right? If so, it makes sense that it is designed to be used with DXR, considering the huge hit in frame rate DXR produces. Using 1080p or 1440p with DLSS + DXR instead of native 2160p, for example, would probably provide a nicer image overall without affecting frame rates too much.
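The arithmetic behind this is worth spelling out: shading (and ray tracing) cost scales roughly with the number of rendered pixels, so rendering internally at a lower resolution and upscaling to the output resolution cuts the per-frame work substantially. A minimal sketch of the pixel counts (the internal resolutions here are illustrative assumptions, not confirmed DLSS internals):

```python
# Rough pixel-count arithmetic: rendering internally at a lower
# resolution and upscaling means far fewer pixels are shaded
# (and ray traced) per frame than at native 4K.

def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)       # 8,294,400 pixels
internal_1440p = pixels(2560, 1440)  # 3,686,400 pixels
internal_1080p = pixels(1920, 1080)  # 2,073,600 pixels

print(native_4k / internal_1440p)  # 2.25, i.e. 2.25x fewer pixels to shade
print(native_4k / internal_1080p)  # 4.0, i.e. 4x fewer pixels to shade
```

So even before any upscaler quality considerations, a 1080p internal render gives DXR a quarter of the pixels to trace compared with native 2160p.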
 
Soldato
Joined
18 May 2010
Posts
22,376
Location
London
If DLSS comes to BFV and makes RT usable at my native 1440p resolution I just might buy a card.

Until then... I don't have a need to upgrade as RT is unusable at 1440p and my 1080 is currently delivering at my native resolution.
 
Caporegime
Joined
24 Dec 2005
Posts
40,065
Location
Autonomy
DLSS is basically "fake 4K", right? If so, it makes sense that it is designed to be used with DXR, considering the huge hit in frame rate DXR produces. Using 1080p or 1440p with DLSS + DXR instead of native 2160p, for example, would probably provide a nicer image overall without affecting frame rates too much.

Not ready to buy into promises.
 
Soldato
Joined
28 Sep 2014
Posts
3,437
Location
Scotland
This is all very nice, but why on earth anyone would want it now, when performance is so low even on the beastliest cards, is beyond me. In the scene with more than one person in it, the frame rate tanks to what looks like 15-20 fps at what looks like 1080p.

Does it look nice? Yes, it looks incredible. Is it worth the money vs performance return right now? No way!

I must admit, though, I do thank the early adopters; without you guys buying up the new stuff, the rest of us who wait for acceptable performance vs price wouldn't get a chance to buy into the tech, as it would become obsolete.

Your post made me really sad. Not sure if you're serious or not, but you're 36 years old? I wonder what graphics cards you used back in 1996 and 2004. I was a Voodoo 1 3D early adopter and will be an RTX 2080 ray tracing early adopter.

Back in 1996 I had an old PC running Windows 95 with the desktop resolution set at 1024x768, but the Voodoo 1 could only do 640x480 or 800x600. Tomb Raider on Glide at 640x480 ran at 30 fps, I think, though in some demanding places it tanked to 15 fps; still barely playable until I walked away from the area. Tomb Raider at 800x600 seemed slow, probably around 15 to 20 fps. Quake at 640x480 ran at probably 30 fps or less.

https://www.vogonswiki.com/index.php/3dfx_Benchmarks

Look at the 3dfx benchmarks at the link above and you will be surprised to see how badly 3dfx and Nvidia cards ran Quake 3. I remember upgrading to a Voodoo Banshee on Christmas Day 1998; it ran Quake 3 at 30 fps at 800x600 but was slow at 1024x768, less than 20 fps. In 2004 I played Far Cry on a GeForce 6800 GT at 1024x768 at around 90 fps on maxed Ultra settings, but in many demanding places the frame rate nosedived to 30 and even 15 fps, and that was with all the Far Cry patches too.

https://www.anandtech.com/show/1545/8

Was the Voodoo 1 worth the money vs performance back in 1996? Yes.
Was the Voodoo Banshee worth the money vs performance back in 1998? Yes.
Was the GeForce 6800 GT worth the money vs performance back in 2004? Yes.
Will the GeForce RTX 2080 be worth the money vs performance in 2018? Yes!


If you have a 1080 Ti, let's say you play Hitman 2 at 4K Ultra: are you happy with low performance at 47.2 fps, and less in lots of places?

https://www.techpowerup.com/reviews/Performance_Analysis/Hitman_2/4.html
 
Soldato
Joined
28 Sep 2014
Posts
3,437
Location
Scotland
lol keep what you got lol, 7nm coming very soon.

You can lol, but it's embarrassing to see the tiny 7nm 331mm2 Vega 20 chip consuming 300W get destroyed by the big 12nm 545mm2 energy-efficient Turing TU104 chip consuming just 70W in the AI ResNet benchmark; that's over 4 times less power than Vega 20. Imagine a 7nm shrink of the Turing TU104 chip consuming 35W, over 8 times less power than Vega 20.

“The 70W Tesla T4 with Turing Tensor Cores delivers more training performance than 300W Radeon Instinct MI60. And Tesla V100 can deliver 3.7x more training performance using Tensor Cores and mixed precision (FP16 compute / FP32 accumulate), allowing faster time to solution while converging neural networks to required levels of accuracy.” – NVIDIA

https://wccftech.com/amd-radeon-mi60-resnet-benchmarks-v100-tensor-not-used/
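For what it's worth, the power ratios in that post can be checked directly. Note the 35W figure is the poster's hypothetical (a node shrink halving power), not a measured number:

```python
# Power-ratio arithmetic from the post: 300 W Radeon Instinct MI60
# (Vega 20) vs the 70 W Tesla T4 (Turing TU104), plus a hypothetical
# 7nm TU104 shrink assumed to halve power (assumption, not a measurement).
vega20_w = 300.0
tu104_w = 70.0
shrunk_tu104_w = tu104_w / 2  # hypothetical 35 W

print(vega20_w / tu104_w)         # about 4.3x lower power for the T4
print(vega20_w / shrunk_tu104_w)  # about 8.6x for the hypothetical shrink
```

So "4 times less" and "8 times less" are slight roundings down of roughly 4.3x and 8.6x.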
 
Soldato
Joined
19 Jan 2010
Posts
4,806
Your post made me really sad. Not sure if you're serious or not, but you're 36 years old? I wonder what graphics cards you used back in 1996 and 2004. I was a Voodoo 1 3D early adopter and will be an RTX 2080 ray tracing early adopter.

Back in 1996 I had an old PC running Windows 95 with the desktop resolution set at 1024x768, but the Voodoo 1 could only do 640x480 or 800x600. Tomb Raider on Glide at 640x480 ran at 30 fps, I think, though in some demanding places it tanked to 15 fps; still barely playable until I walked away from the area. Tomb Raider at 800x600 seemed slow, probably around 15 to 20 fps. Quake at 640x480 ran at probably 30 fps or less.

https://www.vogonswiki.com/index.php/3dfx_Benchmarks

Look at the 3dfx benchmarks at the link above and you will be surprised to see how badly 3dfx and Nvidia cards ran Quake 3. I remember upgrading to a Voodoo Banshee on Christmas Day 1998; it ran Quake 3 at 30 fps at 800x600 but was slow at 1024x768, less than 20 fps. In 2004 I played Far Cry on a GeForce 6800 GT at 1024x768 at around 90 fps on maxed Ultra settings, but in many demanding places the frame rate nosedived to 30 and even 15 fps, and that was with all the Far Cry patches too.

https://www.anandtech.com/show/1545/8

Was the Voodoo 1 worth the money vs performance back in 1996? Yes.
Was the Voodoo Banshee worth the money vs performance back in 1998? Yes.
Was the GeForce 6800 GT worth the money vs performance back in 2004? Yes.
Will the GeForce RTX 2080 be worth the money vs performance in 2018? Yes!


If you have a 1080 Ti, let's say you play Hitman 2 at 4K Ultra: are you happy with low performance at 47.2 fps, and less in lots of places?

https://www.techpowerup.com/reviews/Performance_Analysis/Hitman_2/4.html
Hey dude. Yes, I'm serious. Back then we didn't really have a choice, did we? I remember playing Tomb Raider at low FPS, as well as Carmageddon, but it was just the way we played back then; as kids we didn't really care too much, and I guess didn't really understand what it meant to get better FPS.

Fast forward over 20 years and things are just way different. I'm now in a position where I can expect to play games at 4K, solid 60 fps, with most settings on Ultra or High. My 1080 Ti achieves this and I'm loving it. It just leaves a sour taste in my mouth that if I want to enjoy what this new card brings to the table, I'd have to drop back to 1080p and start all over again at 30-60 fps, and that's just not OK. I could afford to buy the 2080 Ti tomorrow, but the return I'd get on that card against my current 1080 Ti is simply not worth it; it's not even close!
 
Soldato
Joined
13 Jun 2009
Posts
6,847
Was the Voodoo 1 worth the money vs performance back in 1996? Yes.
Was the Voodoo Banshee worth the money vs performance back in 1998? Yes.
The original Voodoo card was ~$550 in today's money and apparently dropped by 33% after less than a year.

Was the GeForce 6800 GT worth the money vs performance back in 2004? Yes.
It was under £450 in today's money.

Will the GeForce RTX 2080 be worth the money vs performance in 2018? Yes!
Hmm no.

Strange how your post about "money v performance" doesn't mention money anywhere.
 
Soldato
Joined
28 Sep 2014
Posts
3,437
Location
Scotland
Hey dude. Yes, I'm serious. Back then we didn't really have a choice, did we? I remember playing Tomb Raider at low FPS, as well as Carmageddon, but it was just the way we played back then; as kids we didn't really care too much, and I guess didn't really understand what it meant to get better FPS.

Fast forward over 20 years and things are just way different. I'm now in a position where I can expect to play games at 4K, solid 60 fps, with most settings on Ultra or High. My 1080 Ti achieves this and I'm loving it. It just leaves a sour taste in my mouth that if I want to enjoy what this new card brings to the table, I'd have to drop back to 1080p and start all over again at 30-60 fps, and that's just not OK. I could afford to buy the 2080 Ti tomorrow, but the return I'd get on that card against my current 1080 Ti is simply not worth it; it's not even close!

Unfortunately things are still the same as in 1996; like you said, we didn't really have much choice!

I got my first PC free from college as part of a course back in 1994. A 14 inch monitor was very small even back in the day, and I upgraded to a 17 inch monitor in 1996. A few years later I felt 17 inches was too small for 800x600 games and considered a 21 inch monitor, but at 50kg it was really very heavy, so I didn't have much choice and bought a Sony Trinitron G420 19 inch monitor weighing 25kg, and loved the Trinitron screen. A few years passed and shops were selling plasma TVs for the first time; I loved them, but at £4000-8000+ they were very overpriced, way out of my price range, and I heard they were horrible to use with PCs and consoles.

I think I bought a 27 inch 720p LCD TV back in 2005, which I used to watch TV, play back DVDs and run my PC at 1280x1024 through DVI. It was fantastic, I loved it, and it could go up to 1600x900 with no flicker through a custom resolution. The LCD TV died in 2010 and I bought my current 32 inch Sharp 1080p HDTV, still going strong today; it can display 1440p, 4K, 8K and 16K through DSR. The GTX 470, my first card with an HDMI port, was powerful enough in 2010 to push GTA IV at 1080p at a playable 30-60 fps. But I found my DVD recorder was unusable and unwatchable on the 1080p HDTV, with grainy and pixelated image quality; when I played the same DVDs on the PC, I was relieved to see they looked much better with the image upscaled to 1080p in PowerDVD.

Back in 2015 there were GTX 980 Ti threads asking whether the GTX 980 Ti was capable of playing new games at 4K on maxed settings. Not at Ultra settings in new games, though old games could do 4K at a solid 60+ fps on Ultra, and the same thing is happening to the GTX 1080 Ti now with modern games like Hitman 2. If you want maxed settings then you will need to turn down to 1440p or 1080p, but seriously, 4K at low settings, with tiny unreadable text? Forget it, not worth it! It leaves a sour taste in my mouth. When I bought Crysis back in 2007 I found it was unplayable at 1600x900 at maxed settings, so I turned down to 1280x1024 at maxed settings to get it playable enough.

Years ago I was excited about Quantum Break, the first DirectX 12 game. I bought it back in 2016 and played it on a GTX 970: it ran great at 1080p Ultra settings in Act 1 but suffered bad stuttering in Act 2 or 3. Turning down settings did not help, though 720p helped a bit, and a GTX 1070 did not improve performance. Then Remedy threw in the towel on DirectX 12 after 2 patches and ported it to DirectX 11. I played Quantum Break again a few months ago and it still had horrible performance, so I went to Steam, read the Quantum Break message board, and found AMD owners still complaining about horrible DirectX 11 performance. I decided to risk £10 on the DirectX 11 version and was very pleasantly surprised to see it ran much smoother at 1080p Ultra, around 60-90 fps. Quantum Break DirectX 11 benchmarks show the GTX 1080 Ti can't run it at 4K Ultra at a solid 60 fps, only 30-50 fps. I absolutely loved Quantum Break after completing it, and hope to see Quantum Break 2 someday. I can't wait to play Control in 2019 with ray tracing. :D

Games in 2019 and beyond will see the 1080 Ti fall behind at 4K and get pushed down to lower resolutions, just like the 980 Ti and Fury X did, and you won't really have much choice but to upgrade to a 2080 Ti with the great Black Friday deals or wait for a 3080 Ti.


Take a look at the video and imagine a few years from now, when 8K HDTVs and monitors could become affordable: your 1080 Ti plays the old game Battlefield 4 at 8K with 200% scaling (16K internally), running barely playable at 20-25 fps. You won't want to go back to 4K at 30-60 fps, but you won't really have much choice except to upgrade to a GTX 4080 Ti to get a solid 60 fps at 16K. It's not worth it! I would happily play it at 1080p.
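To put that scenario in perspective, here is the pixel arithmetic, assuming "200% scaling" means doubling each axis, as DSR-style supersampling does:

```python
# Pixel counts for the supersampling scenario: an 8K output with
# 200% per-axis scaling renders internally at 16K.
def pixels(w, h):
    return w * h

uhd_4k = pixels(3840, 2160)
uhd_8k = pixels(7680, 4320)                 # 4x the pixels of 4K
internal_16k = pixels(7680 * 2, 4320 * 2)   # 200% scaling on each axis

print(uhd_8k // uhd_4k)        # 4
print(internal_16k // uhd_4k)  # 16, i.e. 16x the shading work of native 4K
```

Sixteen times the pixel load of native 4K explains why even strong cards would crawl in that setup.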


Maybe one day you'll want to play Half-Life 2 all over again at 4K, 8K or 16K, but I did not have a great experience: the menu was awfully tiny and the text unreadable on my 32 inch screen, and probably would be even on 55 inch screens. I can't play at those resolutions because the text is impossible to read and totally unusable, even though performance would be fantastic at 16K on a GTX 1070. It's not worth it! I'll happily play at a usable 1080p or 1440p with readable text. Maybe a 4K HDTV is not the right way to get the best experience; I think future 4K mobile projectors, VR headsets and AR glasses with up to 200 inch screens will give the best experience with Half-Life 2 at 4K, 8K and 16K.
 