• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

looks like RTX "on" just got a nice boost in BF5

Downgrade? They've got lots of optimisations coming in the next patch and after apparently. Good to see they're working closely with NVidia and will continue to do so.
Shame so many continue to want to hate RT and focus on any negatives... because the cards cost too much :D. Open your eyes to the fantastic tech that it is, and will be once it develops further.

Once it develops further and the price drops by about £1000, then it will be worth talking about and we'll see wide-scale support.
 
Christ, doesn't matter what nvidia do they get hate. Should change their name to Trumpvidia.
People just don't like corporate greed when it is well and truly shoved in their face. This hate extends to the people who don't mind that fact and still buy their products to enjoy them.

That, and Space Invaders.
 
Christ, doesn't matter what nvidia do they get hate.

Agreed, I'm reading more or less the same tripe with RTX as Mantle got dished on it.

Although there's a certain amount of karma dished on some RTX adopters that **** on Mantle getting a taste back, it's grim reading when you embraced both techs and going through all the pish from the bitter side of those that can't/don't want to adopt it and would rather all we got was stuck in boring high fps graphics.
 
You wanna say thanks to all those 2080ti owners, as they are the ones paying to see this tech improved on. I don't think anyone is under any illusions that this isn't early adopter tech. Soon I think we'll see ray tracing become more prevalent as GPUs begin to offer more grunt in that rendering space.

Great to see that we're already seeing ways to improve performance without massively gimping it.
 
Agreed, I'm reading more or less the same tripe with RTX as Mantle got dished on it.

Although there's a certain amount of karma dished on some RTX adopters that **** on Mantle getting a taste back, it's grim reading when you embraced both techs and going through all the pish from the bitter side of those that can't/don't want to adopt it and would rather all we got was stuck in boring high fps graphics.

Mantle was a proprietary standard designed exclusively around GCN architecture. AMD did eventually realise developers wouldn't touch it with a barge pole so moved it to open source, but since it was controlled by AMD and designed for GCN it quickly died, with a few of the better, more general concepts redesigned for Vulkan.

RTX is totally different since the DXR standard is part of DX, maintained by an independent third party, and is a high-level API that is not written to a particular architecture. AMD could write a driver right now that supports DXR, as they have all the pieces in place already with Radeon Rays; it would just be incredibly slow without hardware support.
 
I read but you are vile.

Edit:

Why do you care? You are one of those who has no intention of owning NVidia but the most outspoken when it comes to their techs.

You may have read but perhaps require someone to explain.

I have a 980Ti, 1080Ti, and a laptop with a 1070/G-Sync. I was ready to buy a 2080Ti as I have never thought the additional cost / poor cooler of the Titan was worth it. Luckily though I watched the launch. Smart people realised there was something very wrong with this series.

Still, look on the bright side, It's a good time to own a turkey ;)

Downgrade? They've got lots of optimisations coming in the next patch and after apparently. Good to see they're working closely with NVidia and will continue to do so.
Shame so many continue to want to hate RT and focus on any negatives... because the cards cost too much :D. Open your eyes to the fantastic tech that it is, and will be once it develops further.

There is no hate for RT. There seems to be no pro argument from the consumer for these cards, therefore anyone who is less than impressed with the 20 series must be full of hate?

Most people acknowledge that the 20 series is a good entry for developers, while providing a cheap professional card. I don't think the cards cost too much, indeed given the amount of silicon involved, these are fairly cheap. What you are seeing DICE do is remove some of the workload from the GPU to the CPU while also using only a subset of raytracing. It's not what the 20 series was sold as, "It just works".
 
Most people acknowledge that the 20 series is a good entry for developers, while providing a cheap professional card. I don't think the cards cost too much, indeed given the amount of silicon involved, these are fairly cheap. What you are seeing DICE do is remove some of the workload from the GPU to the CPU while also using only a subset of raytracing. It's not what the 20 series was sold as, "It just works".

Well said
 
There is no hate for RT. There seems to be no pro argument from the consumer for these cards, therefore anyone who is less than impressed with the 20 series must be full of hate?
But it's purely based on price IMO :). It's like a child saying "I don't want one anyway" or "It's rubbish anyway" when you tell them they can't have something, hence we have people continually pulling the new tech apart. I'm not paying x for an RT card, it's rubbish anyway. Improvements being made? Oh, it's flawed.
Technology often advances when something is introduced that people didn't even know they wanted. And as most game at lower than 4K resolutions, a more realistic experience is pretty high up on their list of requirements, so there is a pro-consumer argument.
Heck, even the internet was expensive and had imperfections when it first arrived. Imagine paying 90 pence or so for an hour's browsing. It was slow too :). But using it, and having it adopted by the masses, reduced the price and improved it.
 
Mantle was a proprietary standard designed exclusively around GCN architecture. AMD did eventually realise developers wouldn't touch it with a barge pole so moved it to open source, but since it was controlled by AMD and designed for GCN it quickly died, with a few of the better, more general concepts redesigned for Vulkan.

RTX is totally different since the DXR standard is part of DX, maintained by an independent third party, and is a high-level API that is not written to a particular architecture. AMD could write a driver right now that supports DXR, as they have all the pieces in place already with Radeon Rays; it would just be incredibly slow without hardware support.

You are far from the truth and the history.

Mantle was an open standard working with all architectures and operating systems.
It was NV's decision not to support it. Ironically, they did support DX12, which is a proprietary framework, when it was announced a few days after Mantle. And yet it still took Turing to fully support it, as neither Maxwell nor Pascal fully supported DX12.

Fast forward 5 years: Mantle was renamed to Vulkan, and Nvidia supports it, while it runs great on both AMD & Nvidia GPUs across all OSes.

DXR's (a proprietary framework) requirement is DX12.1 hardware compliance. AMD Vega complies with all the requirements, and it doesn't need Tensor Cores or ray tracing cores to do ray tracing effectively. Vega especially is proven to be very good at matrix computing (that's what ray tracing is) and doesn't need some "mythical" cores to achieve it.

Apparently AMD has made the decision not to support DXR until all of their product lineup does.
 
But it's purely based on price IMO :). It's like a child saying "I don't want one anyway" or "It's rubbish anyway" when you tell them they can't have something.

For some, the negativity will be based on price, as it's not unfair to expect a price/performance ratio not unlike previous gens. For those, though, if the product was worth it, saving up for a few more months would not be a problem. IMO ;) it's based on what was advertised/promised vs what was eventually delivered. The performance of raytracing with the 20 series is simply nowhere near where it needs to be for consumer cards.

Cards in the past (pre RTX) have been rubbished due to FPS. Rattles were thrown and prams destroyed when that little number did not go high enough. Yet with raytracing, that little number no longer matters ;)
 
For some, the negativity will be based on price, as it's not unfair to expect a price/performance ratio not unlike previous gens. For those, though, if the product was worth it, saving up for a few more months would not be a problem. IMO ;) it's based on what was advertised/promised vs what was eventually delivered. The performance of raytracing with the 20 series is simply nowhere near where it needs to be for consumer cards.

Cards in the past (pre RTX) have been rubbished due to FPS. Rattles were thrown and prams destroyed when that little number did not go high enough. Yet with raytracing, that little number no longer matters ;)
Because it's crucial moving forward, that's why I think. It's a hugely complex thing to implement and needs so much computing power, so obviously it has to be implemented scaled back initially until the tech improves, which will only happen through use and adoption. There have been other times where performance has suffered in the past, I believe. Tessellation for one :).

While the 2080 is more expensive than the 1070 Ti I had before, I get over 60 FPS in BFV with RT on low and ultra everything else. I was getting 73 from the 1070 Ti in Far Cry 5, and I'm not sure that was with everything on ultra either. BFV looks a lot nicer too. Personally I find that fairly impressive given what they're trying to do, and for a very first implementation tacked onto a game that wasn't even planned to have RT, which leads me on to another point: it's still very early days yet, even for the first generation of RT cards. Can NV have some of that Fine Wine AMD are allowed? Given how complex RT is, they do deserve it IMO.
 
Because it's crucial moving forward, that's why I think. It's a hugely complex thing to implement and needs so much computing power, so obviously it has to be implemented scaled back initially until the tech improves, which will only happen through use and adoption. There have been other times where performance has suffered in the past, I believe. Tessellation for one :).

While the 2080 is more expensive than the 1070 Ti I had before, I get over 60 FPS in BFV with RT on low and ultra everything else. I was getting 73 from the 1070 Ti in Far Cry 5. BFV looks a lot nicer too. Personally I find that fairly impressive given what they're trying to do, and for a very first implementation tacked onto a game that wasn't even planned to have RT, which leads me on to another point: it's still very early days yet, even for the first generation of RT cards. Can NV have some of that Fine Wine AMD are allowed? Given how complex RT is, they do deserve it IMO.

It's around 28 years since I played with raytracing, using Borland's Turbo C and a86. So I have an idea how complex it is :p

My point: don't sell a 4K card offering full raytracing and then only provide an implementation that uses a subset at 1080p/60. There would have been a lot less negativity if Nvidia had shown the true performance of the 20 series at launch, even with the design flaws.

Odd feeling of deja vu here. 970 anyone?
 
It's around 28 years since I played with raytracing, using Borland's Turbo C and a86. So I have an idea how complex it is :p

My point: don't sell a 4K card offering full raytracing and then only provide an implementation that uses a subset at 1080p/60. There would have been a lot less negativity if Nvidia had shown the true performance of the 20 series at launch, even with the design flaws.

Odd feeling of deja vu here. 970 anyone?
Reminds me of tessellation personally where the first games used it sparingly :).
4K is demanding anyway, so predicting that RT performance would not be great was a no-brainer. I learned a lesson when I moved to 1440P too soon. I still use 1440P today and won't go higher res than that. New graphics tech will always have an impact on frame rates initially, so the higher the res, the bigger the pain to suffer as the frame rate is likely to fall back to something less acceptable.
 
You are far from the truth and the history.

Mantle was an open standard working with all architectures and operating systems.
It was NV's decision not to support it. Ironically, they did support DX12, which is a proprietary framework, when it was announced a few days after Mantle. And yet it still took Turing to fully support it, as neither Maxwell nor Pascal fully supported DX12.

Fast forward 5 years: Mantle was renamed to Vulkan, and Nvidia supports it, while it runs great on both AMD & Nvidia GPUs across all OSes.

DXR's (a proprietary framework) requirement is DX12.1 hardware compliance. AMD Vega complies with all the requirements, and it doesn't need Tensor Cores or ray tracing cores to do ray tracing effectively. Vega especially is proven to be very good at matrix computing (that's what ray tracing is) and doesn't need some "mythical" cores to achieve it.

Apparently AMD has made the decision not to support DXR until all of their product lineup does.
Mantle was never an open standard; it only became 'open' when it was handed over to Khronos and became Vulkan. AMD stated that it would become 'open source', but while it was called Mantle it was proprietary tech for AMD users only.
 
Was interested to read about how people are finding the update as I'll give it a go later today, but all I got was the same crap as in all the other threads slamming the update. Boring...
 
You may have read but perhaps require someone to explain.

I have a 980Ti, 1080Ti, and a laptop with a 1070/G-Sync. I was ready to buy a 2080Ti as I have never thought the additional cost / poor cooler of the Titan was worth it. Luckily though I watched the launch. Smart people realised there was something very wrong with this series.

Still, look on the bright side, It's a good time to own a turkey ;)



There is no hate for RT. There seems to be no pro argument from the consumer for these cards, therefore anyone who is less than impressed with the 20 series must be full of hate?

Most people acknowledge that the 20 series is a good entry for developers, while providing a cheap professional card. I don't think the cards cost too much, indeed given the amount of silicon involved, these are fairly cheap. What you are seeing DICE do is remove some of the workload from the GPU to the CPU while also using only a subset of raytracing. It's not what the 20 series was sold as, "It just works".
You can own what you like, but it doesn't change the fact that you seem to dislike any RT performance upgrades, and you have shown this time and again with your posting, and are still doing it now. I don't care who has what and want all and sundry to have a great gaming experience, and techs that bring graphical gains are a must for me. RT is one of the things that WILL change gaming as we know it, and when the tech is in full flow we will be playing near movie-quality games. It has to start somewhere and I applaud NVidia for doing it. I never expected 200 fps, but all this negativity does stink.
 