
With no dates for DLSS or ray-tracing enabled games, is being worried really anything more than a conspiracy theory?

There are already more games listed supporting RTC and DLSS than there are DX12 games.

RTX/RTC is quite easy and quick to implement compared with traditional rasterization methods. It's also more straightforward to get ray tracing running than to use (for want of a better term) "fake" rasterization methods for shadows/AO/lighting etc.
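To put the "RT is more straightforward than faking it" point in concrete terms, here's a toy CPU-side sketch (not real DXR/RTX code; the scene, numbers and function names are all made up for illustration). The ray-traced path is a single occlusion query against the scene, while the shadow-map path is reduced to its essential depth-compare step, the bit that needs a hand-tuned bias to avoid acne and peter-panning artifacts:

```cpp
// Toy contrast between a ray-traced shadow query and the core of shadow
// mapping. Illustrative only: one hard-coded sphere, made-up numbers.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Ray-traced approach: ask the scene directly "is anything between this
// point and the light?" One occlusion query, no extra passes, no bias hacks.
bool shadowRayHitsSphere(Vec3 origin, Vec3 dir, Vec3 center, float radius) {
    Vec3 oc = sub(origin, center);
    float b = 2.0f * dot(oc, dir);               // dir assumed normalized
    float c = dot(oc, oc) - radius * radius;
    float disc = b * b - 4.0f * c;
    if (disc < 0.0f) return false;
    float t = (-b - std::sqrt(disc)) / 2.0f;
    return t > 0.001f;                           // hit in front of the origin
}

// Rasterized approach (essence of shadow mapping): compare the fragment's
// depth from the light against a pre-rendered depth buffer, with a bias
// term that has to be hand-tuned per scene to avoid artifacts.
bool shadowMapOccluded(float fragDepthFromLight, float storedDepth, float bias) {
    return fragDepthFromLight - bias > storedDepth;
}

int main() {
    Vec3 point   = {0.0f, 0.0f, 0.0f};
    Vec3 toLight = {0.0f, 1.0f, 0.0f};           // normalized direction
    Vec3 blocker = {0.0f, 2.0f, 0.0f};           // sphere between point and light
    printf("ray-traced: %s\n",
           shadowRayHitsSphere(point, toLight, blocker, 0.5f) ? "in shadow" : "lit");
    printf("shadow map: %s\n",
           shadowMapOccluded(0.8f, 0.5f, 0.005f) ? "in shadow" : "lit");
    return 0;
}
```

The real difference in an engine is everything the shadow-map path drags in around that one compare: an extra render pass per light, resolution and filtering choices, and per-scene bias tuning that the ray query simply doesn't need.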
 
Gents, don't forget the first Tomb Raider was ported to PC by a third-party company (who took Nvidia money to strap on GameWorks) and not by the original makers of the console version... hence all the issues with Ryzen and AMD cards, which took Crystal almost 8 months of heavy patching to improve.

The second was much better because it was done by them, and as for the third, anyone who watched the Xbox One X HDR 4K version could say "to hell with the PCs".

lol. I mean the first of the new trilogy...
Hmmm, funny you say that. If it had issues with Ryzen CPUs and AMD cards, then AMD messed up big time, as it was the first game to use TressFX and needed patching to work on NVidia GPUs; it would just crash for NVidia users from the off. NVidia jumped in and helped get it sorted, but it certainly wasn't a GameWorks title.

Time and again you make stuff up without ever having a clue. Honestly, it is becoming tiresome reading and correcting you all the time.
 
There are already more games listed supporting RTC and DLSS than there are DX12 games.
There's a difference between ticking the checkbox and implementing the features well in a game, though.

Guess only time will tell how well these features work out in games. But even for people who are that interested in these features, the golden rule is always to avoid the 1st gen of any tech product (unless you have more money to blow than you care about), as 2nd gen products of the same tech almost always address most of the limitations and problems the 1st gen product had, and at the very least run much more efficiently, with better performance.

If people are that desperate to get RTX/RTC DLSS asap, at least be sensible and wait for actual in-depth reviews of how well they are implemented in real-world, in-game use. If people drop money on promised features based on expectation alone, without any real-world demonstration (under actual gaming conditions), they might as well be funding crowd-funding projects.
 
There's a difference between ticking the checkbox and implementing the features well in a game, though. [...]
That golden rule is quite simply false (and utterly stupid).

As I said previously, if everyone avoided 1st gen there would be no 2nd gen. You absolutely MUST have early adopters of 1st gen tech to make it a viable enough product to sink more R&D into creating future generations. If people had not bought the 1st gen iPhone in sufficient numbers, Apple would have pulled out of the smartphone market entirely; the same is true of the Toyota Prius and many other innovations throughout history.
 
Hmmm, funny you say that. If it had issues with Ryzen CPUs and AMD cards, then AMD messed up big time, as it was the first game to use TressFX... [...]

Was it the second that needed all the patching last year? I had a 1080 when I played the first for 30 minutes, but that was 2017 when I played it.
 
Was it the second that needed all the patching last year? I had a 1080 when I played the first for 30 minutes, but that was 2017 when I played it.
As far as I can remember, it was Tomb Raider 2013 that needed the patching. The second ran flawlessly for me, and I don't recall anyone having issues with it.
 
That golden rule is quite simply false (and utterly stupid). [...]
Those examples at least had real-world results, clear for all to see, to decide whether they were worth the investment (even if they were expensive). The RTX cards, on the other hand, do not, and they are already on shelves being sold on just a promise, with reviewers all up in arms stating there are no games for them to test these features on and provide results to their audience...
 
There's a difference between ticking the checkbox and implementing the features well in a game, though. [...]



So far basically every DX12 game has simply been a checkbox feature, with performance worse than the DX11 mode.

From what we have seen of DLSS and RTX, I don't think you can really call them checkbox features. DLSSx1 allows ~50% higher frame rates with nearly identical image quality, while DLSSx2 offers much better IQ than TAA with slightly better performance. RTX makes a massive difference to global illumination, shadow quality and dynamic lighting. The implementation of RTX is quite easy and follows industry standards from MS, so if AMD ever get back into competition they will also be able to take advantage of such games.
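For anyone wondering where that kind of frame-rate gain would come from, here's a quick back-of-envelope sketch of the "render low, upscale to native" idea. The render scale and base frame rate below are assumed numbers for illustration, not measured DLSS results:

```cpp
// Back-of-envelope for a DLSS-style upscaling gain. All inputs are
// assumptions, not benchmarks.
#include <cstdio>

int main() {
    double nativeW = 3840, nativeH = 2160;   // output resolution (4K)
    double scale   = 0.75;                   // assumed internal render scale
    double baseFps = 40.0;                   // assumed native-res frame rate

    double nativePixels   = nativeW * nativeH;
    double internalPixels = (nativeW * scale) * (nativeH * scale);

    // If shading cost were perfectly proportional to pixel count (it isn't,
    // exactly), frame rate would rise by the inverse of the pixel ratio,
    // minus whatever the upscaling pass itself costs.
    double idealFps = baseFps * nativePixels / internalPixels;
    printf("internal pixels: %.0f%% of native\n",
           100.0 * internalPixels / nativePixels);
    printf("ideal frame rate: %.1f fps (vs %.1f native)\n", idealFps, baseFps);
    return 0;
}
```

At a 0.75 render scale you only shade ~56% of the pixels, so even after paying for the upscale pass there's plenty of headroom for gains in the 50% ballpark, if the image reconstruction holds up.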
 
Those examples at least had real-world results, clear for all to see, to decide whether they were worth the investment... [...]
Which changes absolutely nothing, nor does it invalidate my previous reply.

Every single new innovative product (1st gen) is purchased on faith, because there is no prior benchmark or experience to go on. All of these products are sold on trust and nothing more. The launches of the Toyota Prius and the first iPhone were based on no different circumstances than the Nvidia RTX series or any other new 1st gen product; and as I said, if no one buys them in significant numbers, there will not be a 2nd gen. To argue otherwise is just a waste of air.
 
No, what I'm saying is that the reported performance issues are likely caused by the engine relying more on DX12, in that it requires a beefier CPU; you need a good all-round system, not a system that pairs a good graphics card with an old CPU and slow memory.

Regarding textures, the game now has much larger maps than the previous two games, so maybe that was a factor in the slightly downgraded textures.

It's also not strange that, with Crystal Dynamics probably being Square's best developer outside of their own Japanese talent, the higher-ups wanted them working on the Marvel licence: you know, that huge money-making licence that Disney are milking, and which is proving to be a big hit with Spider-Man on PS4.
There were many leaked reports saying how unhappy Square were with Tomb Raider sales (even though, realistically, they're pretty good on paper).
I'd imagine Square decided to offload this game to get it out quickly so they can close off this trilogy and shelve it for a while.

I can see their western studios all working on Marvel games for the next few years, especially if the CD game does well.

First off, DX12 reduces CPU load. Second, the game has no performance problems; there are technical issues with the engine, not performance-induced stuttering.

As for textures, the maps aren't appreciably larger in most places, and we're talking about character textures on the people who are usually just one character close up on screen, maybe two or three in a cutscene; the reduction in quality is noticeable and completely unjustifiable. Larger maps can take up more space, but individual textures are fairly small, and the most often seen ones, the ones in your face every second of the game such as your player character, are the one place you don't reduce texture quality.

If you're having memory issues, you keep character texture quality and reduce LOD, remove background clutter, and cut other things; you don't kill off the quality of the main thing you see on screen throughout the whole game. There is no justification for the massive reduction in quality of character textures. Performance and memory size are not reasons to cut back there, but reasons to cut everywhere else in order to maintain the quality of character textures.
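That budgeting argument can be made concrete with a toy sketch: drop mip levels on the lowest-priority assets first and only touch the hero character textures as a last resort. All names, sizes and priorities below are made-up illustrative numbers, not anything from the actual game:

```cpp
// Toy texture-budget allocator: cut low-priority assets first, hero last.
#include <algorithm>
#include <cstdio>
#include <vector>

struct Texture {
    const char* name;
    int priority;      // higher = more visible on screen, cut last
    double sizeMB;     // at full resolution
    int mipDrop = 0;   // each dropped mip level quarters the size
};

int main() {
    std::vector<Texture> textures = {
        {"hero character", 10,  64.0},
        {"weapon",          8,  32.0},
        {"terrain",         5, 128.0},
        {"distant props",   2,  96.0},
    };
    double budgetMB = 200.0;

    // Process lowest-priority textures first, dropping one mip at a time.
    std::sort(textures.begin(), textures.end(),
              [](const Texture& a, const Texture& b) { return a.priority < b.priority; });

    auto total = [&] {
        double t = 0;
        for (const auto& tex : textures)
            t += tex.sizeMB / (1 << (2 * tex.mipDrop));  // /4 per dropped mip
        return t;
    };

    for (auto& tex : textures) {
        while (total() > budgetMB && tex.mipDrop < 3)
            ++tex.mipDrop;
        if (total() <= budgetMB) break;
    }

    for (const auto& tex : textures)
        printf("%-15s dropped %d mip(s), now %6.1f MB\n", tex.name,
               tex.mipDrop, tex.sizeMB / (1 << (2 * tex.mipDrop)));
    printf("total: %.1f MB (budget %.1f MB)\n", total(), budgetMB);
    return 0;
}
```

Run it and the distant props and terrain lose resolution while the hero character stays at full quality, which is the point: a memory squeeze should never be the reason the thing filling your screen all game looks worse.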


As for DP, Rise of the Tomb Raider ran monumentally faster and smoother in DX12, and that holds for almost every DX12 game I've played. Battlefield 1 is pretty much the only game off the top of my head that might not run better in DX12, and I don't know if that is even true any more.

For Rise of the Tomb Raider at launch, DX11 was a stuttering, horrific mess of performance issues, and when DX12 was patched in, the game became incredibly smooth for me.
 
Hmmm, funny you say that. If it had issues with Ryzen CPUs and AMD cards, then AMD messed up big time, as it was the first game to use TressFX... [...]


https://www.eurogamer.net/articles/...b-raider-pc-players-plagued-by-geforce-issues

That's pretty funny, mate. First off, TressFX worked from the get-go for Nvidia users; the game crashed, though, and performance wasn't optimised. Nvidia themselves literally apologised for it: it was THEIR drivers causing the bulk of the issues. TressFX did not require a patch to work at all for Nvidia users; Nvidia required a patch to get their driver working more stably with the game as a whole.

They "jumped in and helped"... to get their own driver sorted. If you're going to accuse other people of making stuff up, it's best not to make stuff up while doing it.

Also, once the patch and driver were done so Nvidia could get the game working stably on their cards, TressFX wasn't much of a performance hit and had no problem running on Nvidia cards. It wasn't a huge performance hog, and it was at no stage prevented from working on Nvidia hardware; it worked on Nvidia hardware from second one for anyone playing. The game crashing wasn't down to TressFX but to the Nvidia driver.
 
That's pretty funny, mate. First off, TressFX worked from the get-go for Nvidia users... [...]
Why is it funny? I said the game needed patching (I never said because of TressFX), and NVidia did quickly jump in and fix it. Not sure if you are arguing with me for the sake of arguing. Perhaps try reading what people write and take it as it is, instead of twisting it to suit your agenda!

Edit:

And whilst we are here -
“We are aware of performance and stability issues with GeForce GPUs running Tomb Raider with maximum settings. Unfortunately, NVIDIA didn’t receive final game code until this past weekend which substantially decreased stability, image quality and performance over a build we were previously provided. We are working closely with Crystal Dynamics to address and resolve all game issues as quickly as possible.

Please be advised that these issues cannot be completely resolved by an NVIDIA driver. The developer will need to make code changes on their end to fix the issues on GeForce GPUs as well. As a result, we recommend you do not test Tomb Raider until all of the above issues have been resolved.

In the meantime, we would like to apologize to GeForce users that are not able to have a great experience playing Tomb Raider, as they have come to expect with all of their favorite PC games.”
https://www.rockpapershotgun.com/2013/03/07/nvidia-apologises-for-crummy-tomb-raider-performance/
Take from that what you will, and I am sure you will try to twist it, but they did get it sorted quickly, and it did need patching via the game!
 
Nvidia publicly requested TR source-code access and dev co-operation to get a working driver out, got it, and with the devs' help they released a working driver; then that bugged out AMD's TressFX, so the dev released alternative builds for each vendor. It's the only title I've known to have two concurrent builds, one for AMD and one for Nvidia; the devs weren't neglecting either vendor.

Then years later Nvidia slaughtered AMD for requesting GameWorks source-code access, insisting that game devs wouldn't give out source code due to trade secrets; now Nvidia have flipped again, asking game devs to supply source code for DLSS. :D
 
now Nvidia have flipped again, asking game devs to supply source code for DLSS. :D

Ummm, context is everything. NV have not requested source code to fix something that is broken; they have requested source code so they can implement DLSS directly into games for developers, in order to reduce the time and cost for developers to support it. Developers are perfectly entitled not to take up Nvidia's offer and to do it in-house if they are concerned about trade secrets. You are trying to create a scandal where one does not exist.
 
Why is it funny? I said the game needed patching (I never said because of TressFX), and NVidia did quickly jump in and fix it. [...]

First of all, this is what you said:

as it was the first game to use TressFX and needed patching to work on NVidia GPUs; it would just crash for NVidia users from the off.

So you stated that TressFX crashed for all Nvidia users and needed patching before it worked at all; that is a straight-up lie. There are literally quotes in that article from Nvidia users saying the performance sucked or the hair went crazy; none of them said it crashed if they enabled it. You simply made that up. You also implied, by saying "Nvidia jumped in", that they helped the dev fix this major issue with TressFX. You could, I guess, claim you meant the game as a whole crashed on Nvidia users' GPUs, though again there are quotes there from people playing the game, and the blame for the crashing was an unstable driver. You should be aware that the way you wrote it came off as saying TressFX failed to run without crashing for all Nvidia users until a patch was released. Accusing people of twisting your words when you write poorly is not a good tack to take. Regardless, either way you meant it, it was flat-out incorrect.

Games often get patched along with a driver to work WITH THAT DRIVER; that doesn't mean anything was broken in the game, it means Nvidia needed changes to work with their driver, which was unstable. This is also pretty standard for Nvidia: The Witcher 3 needed about six drivers before Nvidia's own users stopped experiencing crashes, while AMD's old driver and AMD's game driver both played the game without crashing from the get-go. The game was patched a few times as Nvidia sought to fix the crashing issue with their drivers, yet the game worked fine on AMD throughout.

Sometimes games need patching to help work around driver-specific issues, and this happens with both AMD and Nvidia, but implying that the game required patching to work at all is incorrect. The issue was with Nvidia: both the article I linked and the article you linked, which had the same basic title, were an apology from Nvidia to their fans. It was in fact CD who helped Nvidia get their dodgy driver working with the game, not Nvidia jumping in to help the dev as you implied. In my experience, I can't remember the last game where I had to install a new driver to make it work, while numerous games have crashing issues on both old and "game ready" drivers until Nvidia finally releases a stable one; again, look at The Witcher 3, a GameWorks title, yet Nvidia users were experiencing problems at launch again. I could also mention Nvidia driver extensions and games needing extra help to work with their driver, while AMD like to stick to the specs, which is why more games seem to just work from the get-go, as a result of game devs and AMD drivers working on the same page.

So no, I'm not just being argumentative. You said something patently false, that TressFX didn't work with Nvidia from the get-go, and you clearly implied that Nvidia helped the dev fix this issue (the one you made up).
 
First of all, this is what you said:
Look, check the post I replied to; I was explaining why it was an AMD-sponsored game. It was the first game to use TressFX (which I really liked) and it needed patching to work for NVidia users (the game was broken and required drivers and a patch from the devs). Not sure what else I can say that you can write a whole bunch of stuff about, but what I said is quite correct, and that is that!
 
Look, check the post I replied to; I was explaining why it was an AMD-sponsored game... [...]

That's okay, you're doubling down on your lie. It's pretty simple: you said it did not work without a patch, and that is a lie. It didn't work great, and it crashed for some people. It was patched to accommodate the Nvidia driver; the game worked just fine before the patch for AMD users, and Nvidia users were actively playing the game before the patch. Stating that it did not work and only crashed for all Nvidia users before the patch is a straight-up lie, nothing more or less.

Though by your standard, where a game crashing for anyone at launch means the game didn't work at all before a patch, did The Witcher 3 not work for Nvidia users until the 5th or 6th driver, about 2 weeks after launch?
 