Assetto Corsa Competizione Dumps NVIDIA RTX

Why do people without an RTX card keep trolling that the hardware to run it doesn't exist yet? I have happily been playing Metro Exodus this week at 4K Ultra on my OC'd 2080 at decent FPS (fine for me anyway: 40-45 FPS, RTX High / DLSS).

Because 40-45 isn't an acceptable level of FPS. People moaned like hell about consoles 'only' doing 60, yet suddenly it's perfectly fine for a £1.2k GPU to only do 60, lol. For the price of the card I want a lot more FPS than that.
 
If ACC wasn't an unoptimised mess in the first place, his words would mean more. As it stands, it's just incompetent devs deflecting blame.
 
Why do people without an RTX card keep trolling that the hardware to run it doesn't exist yet? I have happily been playing Metro Exodus this week at 4K Ultra on my OC'd 2080 at decent FPS (fine for me anyway: 40-45 FPS, RTX High / DLSS).

In this age of modern gaming, with very high refresh rate monitors and adaptive-sync tech, 40-45 is nowhere near acceptable. 100 FPS is now the 60 FPS equivalent to many, myself included. Sub-60 is horrendous to play on PC.
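The frame-time arithmetic behind that complaint is easy to check; the sketch below is just the standard milliseconds-per-frame conversion, not figures from any specific benchmark:

```python
def frame_time_ms(fps: float) -> float:
    """Per-frame render budget in milliseconds: 1000 ms divided by frames per second."""
    return 1000.0 / fps

# Compare the budgets at the rates argued about in this thread.
for fps in (30, 45, 60, 100, 144):
    print(f"{fps:>4} FPS -> {frame_time_ms(fps):5.1f} ms per frame")
```

Going from 45 to 60 FPS cuts each frame from roughly 22.2 ms to 16.7 ms, and 60 to 100 cuts it again to 10 ms, which is part of why high-refresh users perceive sub-60 as sluggish.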
 
Why do people without an RTX card keep trolling that the hardware to run it doesn't exist yet? I have happily been playing Metro Exodus this week at 4K Ultra on my OC'd 2080 at decent FPS (fine for me anyway: 40-45 FPS, RTX High / DLSS).

Sorry, but 40-45 FPS on a £600 GPU is unacceptable, and that is the main problem with RTX. In Metro the lighting effects don't really add that much to the overall experience, and in a lot of areas the original looks better.

Now, you say people without RTX cards keep trolling. Well, I own an MSI 2080 Ti Gaming Trio X and I've tried BFV and Metro with RTX on, and while in the right conditions it does look better, and it's nice tech, it's not always better and sometimes looks worse than plain shadows and lighting in those games. But what really gets me is that in Metro and BFV, with RTX on, a 5700 XT or 2060 etc. will run the game at the same graphics settings but without RTX and get a higher frame rate. That's a £400 GPU outperforming a 2080 Ti with RTX kneecapping it.

Lastly, the technology doesn't exist yet. RTX was shown with all-new RT cores alongside the CUDA cores, and if it worked the way the marketing showed at the start, ray tracing wouldn't have such an impact on the game, because the RT cores would handle it, not the main CUDA cores.

But £600 for 40 FPS? That's just insane.
 
This is no surprise to me. Many on their forum were against it from the start.
IMHO, they caved to pressure from their own pre-order customer base (those who bought it at a cheaper price before v1.0 was released), and to the obvious truths about RTX: the performance, alienating their core group, etc.
 
Why do people without an RTX card keep trolling that the hardware to run it doesn't exist yet? I have happily been playing Metro Exodus this week at 4K Ultra on my OC'd 2080 at decent FPS (fine for me anyway: 40-45 FPS, RTX High / DLSS).

The hardware to run it not only doesn't exist, but perhaps never will, as far as the nm race on silicon semiconductors goes.

It's been explained: ray tracing is a no-go for one very simple reason. It isn't worth it, because you drop efficiency to insanely low levels.

https://devblogs.microsoft.com/directx/announcing-microsoft-directx-raytracing/
"Appendix – Primers on rasterization, raytracing and DirectX Raytracing
Intro to Rasterization
Of all the rendering algorithms out there, by far the most widely used is rasterization. Rasterization has been around since the 90s and has since become the dominant rendering technique in video games. This is with good reason: it’s incredibly efficient and can produce high levels of visual realism.

Games rendered with rasterization can look and feel incredibly lifelike, because developers have gotten extremely good at making it look as if their worlds have light that acts in convincing way."
 
No surprise

In their own comments they say they don’t have time to implement it as the game has too many other issues they need to fix.

Just an indie developer getting too big for their boots too soon
 
One of the problems is that you can't dump traditional rendering yet, so you still need to do all the work of rendering the game without ray tracing. Then you have a choice: slap some token RTX features on top, like BFV did, which takes a big performance hit for nothing particularly amazing; maintain a full development branch optimised for ray tracing that would look amazing but that only a tiny number of people could actually use right now (i.e. 2080 Ti owners at 1080p, max); or do something in between that wouldn't take much work but would still take a huge performance hit, look nothing like what ray tracing can potentially do, and only be playable for those with a 2070 or higher at sub-1080p resolution.
 
If I were to guess, the added input lag would kill the game, IMO. I'm sure they would have to wait for the developers of Unreal Engine to fix it directly.

From my understanding, Kunos has a working RT version. It's just not being released due to the reasons provided.
 
Sorry, but 40-45 FPS on a £600 GPU is unacceptable, and that is the main problem with RTX. In Metro the lighting effects don't really add that much to the overall experience, and in a lot of areas the original looks better.

Now, you say people without RTX cards keep trolling. Well, I own an MSI 2080 Ti Gaming Trio X and I've tried BFV and Metro with RTX on, and while in the right conditions it does look better, and it's nice tech, it's not always better and sometimes looks worse than plain shadows and lighting in those games. But what really gets me is that in Metro and BFV, with RTX on, a 5700 XT or 2060 etc. will run the game at the same graphics settings but without RTX and get a higher frame rate. That's a £400 GPU outperforming a 2080 Ti with RTX kneecapping it.

Lastly, the technology doesn't exist yet. RTX was shown with all-new RT cores alongside the CUDA cores, and if it worked the way the marketing showed at the start, ray tracing wouldn't have such an impact on the game, because the RT cores would handle it, not the main CUDA cores.

But £600 for 40 FPS? That's just insane.

Personally, I would accept 40-45 FPS (and I'm a framerate junkie) if it meant having ray tracing at at least 1080p, even at the level of the implementation in Quake 2 RTX but with the scene complexity of a modern engine. Messing about with a custom map, I got Quake 2 looking like this, for instance (on a GTX 1070; Turing cards manage around 6x the framerate):

[screenshot: Quake 2 RTX custom map]

And that's with the limitations of a ~25-year-old game engine... imagine what the same implementation could do on a modern engine.
 
Adaptive-sync monitors (particularly with the LFC feature) make low FPS more acceptable, not less.

In objective terms, there's never been a better time to play at sub-60 fps.
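For context, LFC (Low Framerate Compensation) works by repeating each frame so the effective refresh rate stays inside the panel's variable-refresh window. A minimal sketch of that idea, assuming a hypothetical 48-144 Hz panel (real implementations vary by vendor):

```python
def lfc_refresh(fps: float, vrr_min: float = 48.0, vrr_max: float = 144.0) -> float:
    """Pick the smallest integer frame multiple that lands the refresh rate
    inside the panel's variable-refresh window [vrr_min, vrr_max]."""
    if fps >= vrr_min:
        return min(fps, vrr_max)  # native rate is already in range
    multiple = 1
    # Keep doubling/tripling while we're still below the window floor
    # and the next multiple would still fit under the ceiling.
    while fps * (multiple + 1) <= vrr_max and fps * multiple < vrr_min:
        multiple += 1
    return fps * multiple

print(lfc_refresh(40))  # 40 FPS: each frame shown twice, 80 Hz effective refresh
```

So a 40 FPS game stays inside the panel's variable-refresh range (no tearing or stutter from falling below the window floor), though it does nothing about the underlying 25 ms frame time.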

Don't agree; sub-60 still feels gash even with adaptive sync. Run the same game on two monitors side by side, one at 40-50 FPS and one at 100 FPS or higher, and the one running at 50 FPS looks lumpy and horrible in comparison. Adaptive sync won't change that drastically; it helps, but the experience still feels horrible by comparison. It'll feel like playing a console at 30 FPS.

I've got two pretty high-end gaming rigs in my house and four adaptive-sync monitors; I know exactly how games play at sub-60 and then over 100.

60 FPS is old hat now, as is 1080p.
 
Don't agree; sub-60 still feels gash even with adaptive sync. Run the same game on two monitors side by side, one at 40-50 FPS and one at 100 FPS or higher, and the one running at 50 FPS looks lumpy and horrible in comparison. Adaptive sync won't change that drastically; it helps, but the experience still feels horrible by comparison. It'll feel like playing a console at 30 FPS.

I've got two pretty high-end gaming rigs in my house and four adaptive-sync monitors; I know exactly how games play at sub-60 and then over 100.

60 FPS is old hat now, as is 1080p.

Yeah, mostly. Having adaptive sync means that if a game is running at, say, 70-80 FPS and you get the odd dip to 40-50, it's far more acceptable, whereas without it that can be hugely negative for the experience. But it doesn't magically fix low framerates and mean you can now play at ~40 FPS with a good experience.

Back in the day, though, you'd have to be rendering quite a bit above 60 FPS to get decent responsiveness and offset any tearing, etc., meaning even for single player I'd want ~100 FPS. These days, with G-Sync, I'm fairly happy playing most single-player games at a consistent 60 FPS, some exceptions aside, though I still like having 80+, ideally 100+, for multiplayer.
 
Don't agree; sub-60 still feels gash even with adaptive sync. Run the same game on two monitors side by side, one at 40-50 FPS and one at 100 FPS or higher, and the one running at 50 FPS looks lumpy and horrible in comparison. Adaptive sync won't change that drastically; it helps, but the experience still feels horrible by comparison. It'll feel like playing a console at 30 FPS.

I've got two pretty high-end gaming rigs in my house and four adaptive-sync monitors; I know exactly how games play at sub-60 and then over 100.

60 FPS is old hat now, as is 1080p.

Not according to Steam.
 