
The thread which sometimes talks about RDNA2


Right, so your whole argument is based on CP2077 alone (78% positive on PC, mind you - some people seem to enjoy the RT graphics), we are NOT talking about bugs, just the graphics, Digital Foundry is discounted as biased, and again count me as biased despite not actually owning a GPU that can do RT. It's not a love for RT, it's a love for anything that can add graphical realism to games. I see you didn't really take this into account, just dismissed it as it doesn't fit your anti-NV narrative - as well as my opinion that anything, whether AMD or Nvidia, is what I want if it increases the fidelity.

So fully path-traced rendering as seen in movies isn't currently viable, but the steps towards it mean nothing until it arrives complete, as it runs on the £100k machines that Hollywood/CGI studios use. Brilliant.
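For context, "fully path traced" here means estimating the rendering equation by Monte Carlo sampling; in standard notation (this is the textbook Kajiya formulation, not anything specific to the cards being argued about):

```latex
% Outgoing radiance at point x in direction w_o: emitted light plus the
% integral over the hemisphere of incoming light, weighted by the
% surface BRDF f_r and the incidence angle.
L_o(x,\omega_o) = L_e(x,\omega_o)
  + \int_{\Omega} f_r(x,\omega_i,\omega_o)\, L_i(x,\omega_i)\,
    (\omega_i \cdot n)\, d\omega_i
```

Film renderers throw hundreds or thousands of ray samples per pixel at that integral; a real-time frame budget allows a handful at most, which is why game RT is limited to selected effects plus heavy denoising.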

:D
 
Noice strawman. Isn't CP2077 the most recent game that only works best on Ampere? Doesn't it offer all these RT effects you claim will revolutionize the world? Didn't you reference your influencer DF regarding its use of RT? Of course you did.

BTW, I don't need to stick my hand in a fire to know it's hot. I don't need to chew glass to know it will cut. And I don't need to own an Nvidia card to see that the poor level of RT it offers is of no interest to me. But since you concede that "real" RT is best left to the studios, movies, renders, etc., I take it that this level of RT in games is at best fake.
Merry Christmas
:D
 
Again, fantastic post. No, I never claimed it revolutionized the world, just improved it. DF is biased for acknowledging graphical enhancements that can make a game seem more real and visceral? OK.

If the RT implementation is poor on Nvidia, what the heck does that make AMD's attempt? That doesn't come into it, does it? Just the fact that the movie version of RT can't be implemented at the full movie-studio level, therefore making it worthless regardless of the increased graphical fidelity in games.

Come on, I genuinely need some other people in here to say whether what @EastCoastHandle is saying holds up in the overall PC graphics scene. You can be biased either way if necessary, but please just state facts, whether it be that RT won't be around in a couple of generations, or that Nvidia currently being ahead with new graphics technology advancements means nothing.
 
LOL, you come into an AMD-centric thread professing Nvidia as the better buy... get into a debate with me, which you lose, and now you want to feign that you are the victim? Now that's rich.

RT as a whole in games is poor, as it is not a standard that someone who buys a midrange GPU can enjoy without the performance penalty. Second, it's not true ray tracing, as the game is still rasterized. Third, all it does is help decrease development time for implementing lighting, shadows, reflections, ambient occlusion, etc., depending on how it's tweaked to improve overall performance and which effects the developer uses in the game, as all of the RT tricks are not always implemented at once. But like I told you before, it doesn't change the fact that it's still a rasterized game.
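To illustrate what "still rasterized" means in practice, here is a minimal sketch of a typical hybrid frame; every function name is an illustrative stub, not any real engine's API:

```python
# Minimal sketch of a hybrid rasterization + ray tracing frame.
# All functions are illustrative stubs, not a real engine API.

def rasterize(scene):
    # Classic pipeline: project triangles, fill a G-buffer.
    return {"color": "raster image", "depth": "...", "normals": "..."}

def trace_rays(gbuffer, scene, effect):
    # Rays are cast only for the effects the developer opted into
    # (reflections, shadows, AO, ...); everything else stays raster.
    return f"noisy {effect} buffer"

def denoise(buffers):
    # Sparse per-pixel ray counts force heavy denoising.
    return [b + " (denoised)" for b in buffers]

def render_frame(scene, effects=("reflections", "shadows")):
    gbuffer = rasterize(scene)                 # still a rasterized game
    rt = [trace_rays(gbuffer, scene, e) for e in effects]
    return (gbuffer["color"], denoise(rt))     # composite raster + RT

print(render_frame({"triangles": []}))
```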

As for how AMD decided to go about it, I agree with it, as RT isn't important enough to fully support. Since the game is still rasterized, AMD built RDNA 2 to reflect that. It's not hard to figure out.
:D
 
Well, this thread is pretty boring now with Nvidia fans dominating it. Half of it is like a broken record.

Can someone give me a shake next year when either card is available at a non-scalped price?
 
Again, fantastic post. No, I never claimed it revolutionized the world, just improved it. DF is biased for acknowledging graphical enhancements that can make a game seem more real and visceral? OK.

If the RT implementation is poor on Nvidia, what the heck does that make AMD's attempt? That doesn't come into it, does it? Just the fact that the movie version of RT can't be implemented at the full movie-studio level, therefore making it worthless regardless of the increased graphical fidelity in games.

Come on, I genuinely need some other people in here to say whether what @EastCoastHandle is saying holds up in the overall PC graphics scene. You can be biased either way if necessary, but please just state facts, whether it be that RT won't be around in a couple of generations, or that Nvidia currently being ahead with new graphics technology advancements means nothing.

You're wasting your time trying to argue with this, when the verdict on how good or bad RT is was reached and shut tight a long time ago - for some.
Personally I would pay that extra 50 credits to get the card that has better features when all else is equal. The difference in power consumption is small, but I'd still need to change my PSU because of it, whichever card I chose.

The narrative would change if AMD were in front, or comes out in front a few generations down the line. It's the same with every tech, regardless of the company that implements it.

However, at the end of the day, this is still good, as it allows the company with the lesser product to keep selling it and continue the competition down the line.
 
The thing is that we are currently in a transition period, when most people have either just moved from 1080p to 2160p or are starting to think about it, so it's the worst possible timing for enabling computationally heavy ray tracing and then trying to offset the frame-rate drop by faking image quality with alternative rendering techniques such as deep learning super-sampling.
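A rough sense of the numbers involved, assuming the commonly cited DLSS internal-resolution scales (treat the exact factors as approximate):

```python
# Rough pixel-count arithmetic for the 1080p -> 4K transition,
# and what DLSS-style upscaling renders internally.
# The per-axis scale factor below is the commonly cited approximation.

res_1080p = 1920 * 1080          # ~2.07 Mpix
res_4k    = 3840 * 2160          # ~8.29 Mpix
print(res_4k / res_1080p)        # 4.0 -> 4x the pixels to shade

# "Performance"-style upscaling renders at half resolution per axis,
# then reconstructs the 4K image, so the GPU shades ~1/4 of the pixels.
internal = (3840 // 2) * (2160 // 2)
print(internal == res_1080p)     # True: 4K "performance" ~ native 1080p
```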

If AMD blocked the implementation of ray tracing in games, it would be the best thing for the community. For now. For the next 5 to 10 years.
 
Not sure if serious.

You could say this about any feature - why don't you just block 4K resolution and have us all playing at 720p, then we could all run RT?

Best thing to do is enable it and let it develop over the years - those who want to run it can, and those who don't can turn it off.
 
The narrative would change if AMD were in front, or comes out in front a few generations down the line. It's the same with every tech, regardless of the company that implements it.

What if AMD never comes out in front? What if it catches up and even does better ray tracing than Nvidia, but by then all the reviewers tell you that the greatest thing a video card can do is some other feature, and Nvidia does that two times better than AMD? :)
What if this is the trick Nvidia is using to sell their cards? Add a feature (like ray tracing), then sponsor a ton of games to heavily use that feature, and sponsor media to promote it? How can any other manufacturer catch up, if by the time they do, Nvidia has already moved on to something else (for example, from heavy tessellation to heavy RT)?

Most people think Nvidia helps the industry with these tricks because the games are "more realistic" with heavy RT. But in fact they are only interested in selling their new cards; they don't care how realistic the game looks, or even whether it becomes an industry standard. When AMD gets close in RT performance, they will move on to the next thing. And heavy-RT games won't be common in the coming years outside Nvidia-sponsored titles; they may never be made at all if the performance cost is too big and there are other ways to make a game look "more realistic".
 
Not sure if serious.

Oh, very serious.

You could say this about any feature - why don't you just block 4K resolution and have us all playing at 720p, then we could all run RT?

Ray-tracing is about implementing real-time error correction for lighting calculations which, by the way, have been approximated pretty well for decades; it is the least useful feature today that you could imagine.

4K is the most needed, because it obviously improves both image quality and ergonomics, reducing the strain on people's eyes caused by low-quality, large-pixel 1080p or 720p displays.

Best thing to do is enable it and let it develop over the years - those who want to run it can, and those who don't can turn it off.

What goes around, comes around :D

Nvidia causes DX10.1 to be removed from Assassin's Creed? | Overclockers UK Forums
 
Here is Nvidia using the same trick with the same developer in The Witcher III:

https://www.extremetech.com/extreme...over-optimizations-to-the-witcher-3-wild-hunt

"Nvidia takes a 24% performance hit from having GameWorks enabled, compared with 47% for AMD. "
"
As you might expect, AMD and Nvidia have their own distinct takes on this problem and the reasons for it. AMD’s Richard Huddy told Ars Technica the following: “We’ve been working with CD Projeckt Red from the beginning. We’ve been giving them detailed feedback all the way through. Around two months before release, or thereabouts, the GameWorks code arrived with HairWorks, and it completely sabotaged our performance as far as we’re concerned. We were running well before that… it’s wrecked our performance, almost as if it was put in to achieve that goal.”
Nvidia’s PR Manager, Brian Burke, had a very different take on the situation in a statement provided to PC Perspective. “We are not asking game developers do anything unethical,” said Nvidia’s GameWorks’ Brian Burke. “GameWorks improves the visual quality of games running on GeForce for our customers. It does not impair performance on competing hardware. GameWorks source code is provided to developers that request it under license, but they can’t redistribute our source code to anyone who does not have a license...
"

How did that improve the gaming industry? The game came with a lot of bugs (like another game we all know :) ) and today no one is using HairWorks. But at the time it helped Nvidia sell a lot of cards, based on a trick promoted by everyone in the media.
 
Ray-tracing is about implementing real-time error correction for lighting calculations which, by the way, have been approximated pretty well for decades; it is the least useful feature today that you could imagine.
Well, if RT is completely useless, then what is all the fuss about? Let the Nvidia fanboys turn on their useless feature which eats up FPS.

I think AMD have done a pretty good job on RT for a first iteration - CP2077 was supposed to come out prior to the 3000/6000 release, so AMD would not have had any RT at all if everything had gone to plan with the game's release. You can probably argue that Nvidia held the release back, but the game is clearly a buggy mess, so I genuinely think they needed the extra time.

The RT performance of the top AMD models is up there with the 2080 Ti. If someone had said that would be the case 6 months ago, people would have been pretty pleased.
 
AMD 6000 series RT performance is good enough for every game that will come from consoles to PC in the next few years and use RT. It is not enough, and it will never be enough, for Nvidia-sponsored titles. Even if the next AMD cards are enough to play, let's say, Quake II or CP at 60 FPS / 4K ultra, by then Nvidia will sponsor other titles with even more RT for their next-gen cards.
And when they can't use RT to sell their cards any more, they will come up with something else and make everyone think that is the ultimate feature and they have to buy Nvidia next-gen cards. :)
 
I guess ECH likes to play on the lowest settings, considering RT is way more visible than going from low to ultra settings in most games.

amirite? :D

Also, bias? The king of bias calling others biased? Let me laugh some more. :D:D:D
 
I have been using Watch Dogs 3 with RT on at 4K, and it does make a visible difference. Strangely, adding it into CP2077 seems to just tank FPS, other than adding some reflections to water. Though to be fair, CP2077 sucks and I haven't played it enough to do real testing.

The main reason RT is usable now is DLSS 2.0, as 1.0 totally sucked. DLSS 2.0 actually is a useful feature compared to RT, and if the choice is RT on and DLSS on at 40 FPS, or just DLSS on at 80 FPS... well, the RT had better look amazing, and usually it just doesn't.
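The frame-time arithmetic behind that choice, using the 40 and 80 FPS figures above as the example scenario:

```python
# Frame-time view of the RT + DLSS trade-off described above,
# using the 40 fps and 80 fps figures as the example scenario.
ms = lambda fps: 1000.0 / fps
print(ms(80))            # 12.5 ms per frame: DLSS on, RT off
print(ms(40))            # 25.0 ms per frame: DLSS on, RT on
print(ms(40) - ms(80))   # 12.5 ms: what the RT effects cost per frame
```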

So if AMD get a decent DLSS equivalent, that will be a big bonus for AMD 6X00 owners. Though if it is as bad as DLSS 1.0, it will just suck. AMD's RT may actually be fine once they get their DLSS equivalent sorted.
 