Discussion in 'Graphics Cards' started by sunshinewelly, Aug 16, 2019.
Whatever is around when I actually NEED to upgrade. Probably won't be by next gen.
I will definitely NEED an upgrade for the 2020 games that I want to play. I just refuse to buy the 2000 series cards at the current prices.
Are there even any games supporting it? What happened to Wolfenstein, MechWarrior and Assetto Corsa, which NV advertised as ray tracing games but none of which got it?
Many in here even took up pitchforks when I wrote in this very discussion that you wouldn't see RT in Wolfenstein.
How about DLSS in games like Final Fantasy, where it only exists in the demo?
It should be close to 2080 Ti performance, and at £899 it will be "good value". The whole product stack has to move up a couple of tiers in price every generation these days. "Amazing" prices and quite good performance is the new mantra, didn't you see the memo?
that's fair enough.
The same thing happened with all graphics technologies: when first introduced, performance is horrible and uptake is low. Ray tracing is just like tessellation, PhysX, etc.
In a few years, when turning ray tracing on is performance-neutral, that's when we'll get mass uptake and every game will use it.
But you have to start somewhere; it's chicken and egg. Nvidia and AMD need revenue to keep doing R&D on ray tracing, and developers are waiting for better performance, but that only comes if people buy the GPUs. That's why the technology moves slowly: most people won't buy until it's mature, which means there is little revenue to improve performance and little motivation for developers to support it.
Lol. If they do that I will just stick with a PS5 for 4-5 years until I can actually get a decent upgrade on the PC. I ain't paying that kind of money for a 3rd/4th-tier GPU. £500 is the maximum.
One note: War Thunder's ray tracing is GPU-agnostic, as it doesn't use DX12 DXR, so it works on both AMD and Nvidia and on the DX11 API as well.
It also doesn't use the RT cores or Tensor cores found in the RTX series.
I don't think for a minute that fully ray traced games are going to happen on this generation of hardware, but I have a hard time taking the dismissiveness people have towards it seriously. The implementation Nvidia is doing is basically going to be the way it is done underneath, and for an early form of it the implementation in Quake 2 is very, very impressive once you look beyond the geometry limits inherent to the Quake 2 engine (which do not in any significant way benefit RTX performance). Static screenshots don't really do it justice: seeing in real time the little lighting details, as light interacts with the scene in a way traditional rendering just can't match, takes it to another level.
Though there are some approximations, cheats and optimisations in the Quake 2 RTX implementation, it is a fully featured ray tracing implementation that uses path tracing for all global illumination. There are no traditional techniques used at all: every light, reflection, refraction, shadow, etc. is done through ray tracing. This isn't just slapping some ray traced reflections onto an old engine and calling the job done.
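For anyone curious what "all lighting through path tracing" actually means underneath: every bounce is a Monte Carlo estimate of the rendering equation's hemisphere integral. Here's a toy Python sketch (my own illustration, nothing to do with the actual Quake 2 RTX code) that estimates the irradiance on a surface under a uniform sky, an integral with the known answer pi times the sky radiance:

```python
import math
import random

def sky_irradiance(n_samples, l_sky=1.0, seed=0):
    # Monte Carlo estimate of E = integral of L * cos(theta) over the
    # hemisphere -- the same kind of integral a path tracer estimates at
    # every bounce by firing rays. Analytic answer here: pi * l_sky.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        # For a uniform direction on the hemisphere, the z-component
        # (cos theta) is uniform in [0, 1]; the sampling pdf is 1/(2*pi).
        cos_theta = rng.random()
        total += l_sky * cos_theta * 2.0 * math.pi  # sample / pdf
    return total / n_samples

print(sky_irradiance(200_000))  # converges towards pi (~3.14)
```

A real path tracer does this recursively per pixel with actual ray-scene intersections, which is why noise and ray budget dominate the performance cost, and why the RTX denoising hardware matters so much.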
People are dismissive for one of several reasons: the price of RTX cards, the performance of RTX cards, lacklustre implementations in new games, or all three combined.
There were loads of people saying they couldn't see the difference 4K made back in 2014 when I got my first 4K monitor, and only in the past year or two have people changed their tune as better hardware that can run it has come out. Once hardware can do 4K at high FPS, you will start seeing more and more people upgrade their monitors to 4K, and they will magically see the difference then.
The difference for me is that I have always been happy to sacrifice FPS for image quality and haven't been bothered much by playing under 60fps in most games; only a few games that I play NEED it, like fighting or racing games. I don't play online much either. That meant I could live with the low FPS all the way back then, enjoy the image quality boost, and not need to use AA methods that are like Vaseline smeared all over the screen.
As I recall the only other early adopter here of 4K was Kaapstad
To me the sweet spot in most games is lowering settings one notch from Ultra to High and turning off AA, motion blur, depth of field, etc. This gets you much better image quality than 1440p could ever provide on Ultra settings. In my experience there have only been a handful of games where you can easily see the difference between High and Ultra while actually playing; you need to compare screenshots to see the minor differences. That's not true for all games, though: in The Outer Worlds, for example, I did see a clear difference when I recently played it, so I left the settings on Ultra for my playthrough.
Went on a bit of a tangent there.
My stance on RT is that I really like it. I'm just happy to wait for the 3000 series cards, when we'll get better hardware and more than a handful of games that implement it well. If Nvidia had a lot more RT games out there, specifically ones I wanted to play (none so far), then I might have caved.
I had my first 4K monitor in 2013. It was a 27-inch Samsung. The first game I loaded up in 4K was Guild Wars 2; I was blown away, it looked amazing, but it only ran at 20fps, so I went back to a 1080p screen until 2015, and from then it's been 4K all the way.
Well, at least you did not try to convince yourself that there is no difference in image quality, like a lot of people on here did back then.
That's weird, because I play a lot of games at 4K on my 1440p monitor and see a difference. I even play the new Terminator game at 5K Ultra because it can handle it using DSR.
Disappointing, as you get most of the performance hit of ray tracing but none of the little things, like surfaces catching bounced light, that traditional techniques struggle with and that make so much difference.
I agree, it is weird. But that is what a lot of people used to say to me a few years back.
I saw the difference as soon as I got it and could not go back to 1440p. Now that graphics cards have become so expensive, though, I will stay away from 8K until there are cards that can run it at 60fps. In other words, probably in 5-7 years' time. I fear that if I see it, I will struggle to go back again.
@TNA TBH it's older games that scale best with 8K. The newer ones, as they become incredibly post-process heavy, benefit a lot less from it. It's really only ray tracing that benefits a lot, because you get much less noise, though obviously 8K RT is so many years away it's not even funny. Maybe that will change next year with the next-gen console games, because overall asset quality and polygon counts are going to go up many times over, so 8K will make games shine more. Until then it's not too interesting, and I've noticed Unreal Engine games in particular plummet performance-wise at 8K, way more than other engines. I'm not sure what's going on there, but it's a drastic reduction.
I picked up a 4K monitor when they moved on from having 2x split panels needing 2x DP connections, an AOC U2868PQU, so I was a fairly early adopter. While games do look great, I pretty firmly returned to 1440p aside from a small number of genres, like space games and racing, where I'd use a controller (though I moved on from that monitor due to it having high latency, despite claims otherwise by AOC).
I find with 4K you far too often end up either skipping pixels to get across the screen fast enough or having to drag the mouse endlessly to travel larger distances, and neither is ideal. Up to 1440p you don't have to make a compromise in that regard, and I think that aspect in general will actually prevent it becoming a de facto standard in the way earlier jumps in resolution did. I also find, surprisingly, that a lot of everyday people just don't care about resolutions above about 1280x1024 or so; quite a few people I've built PCs for have insisted on moderate-resolution 24" displays over anything else, as they find them the easiest on their eyes.
Have you tried the ReShade RT plugin thing? I'm finding it to be pretty fantastic, both in Star Wars, where I can make the sabre light reflect off things, and in Terminator, where it's amazing for all the plasma rifle shots lighting up everything and reflecting off T-800s, and all the fires lighting everything up. It really makes the dark areas dark.
I've not had this problem with the games I've played.
You also had an issue with Deus Ex: MD, as I recall, and you were the only person on this forum with that issue.
The problems with DX: MD's mouse sensitivity scaling with framerate, and not scaling with ADS, were complained about elsewhere too; not sure why people here didn't find them more annoying.