Another RTX GPU Moan Thread

  • Thread starter: Guest
RTX is amazing for the industry because it's driving people over to AMD. Hopefully we'll have a more even playing field in a year or two, closer to 50/50 instead of a monopoly. Although some people, annoyed at the prices of RTX, bought a 1080 Ti instead, which defeats the object because you're still giving money to Nvidia.
 
They spend $600m per quarter on R&D across everything. The comments are probably a bit overblown, but people underestimate the cost and person-hours it takes to make a new GPU, especially one that's radically different to what came before it. Just ask AMD or Raja Koduri.

That's true, they do need to recoup all that money. Although in hindsight, I'm sure they've realised that they screwed up by trying to release real-time ray tracing in 2018. As truly amazing as real-time ray tracing is, it's just not yet possible to make cards with that technology at a realistic price.

Nvidia should have released two more generations of traditional rasterisation cards: an 1100 series, followed by a 1200 series in 2020. Then it should have been possible to produce RTX at a cheaper price in 2022.
 
I don't agree with that. RT is sort of like tessellation: it can make games look nicer, but a lot of gamers will still turn it off for raw performance. As we've also seen with BF5, the real-world implementation falls a long way short of the flashy tech demos. BF5's RT is most definitely not the biggest GPU tech advancement of the last 15 years. I'm also very confident RT will not be as popular as you say in 3 years' time. For a start, indie devs will probably never implement it, and 95+% of games are indie. How many games implemented DX12 3 years after DX12 launched? And DX12 had a bigger install base than RTX cards do.

You can't base a prediction of whether a new technology will fail or not on BF5 alone. It would be like looking at the first car and saying it's a failed invention, or looking at the first rocket and saying it's a failed invention.

Forget anything you see about RT in 2018. They said DirectX was a failure back when people were using 3dfx Glide and OpenGL. Now, in 2018, DirectX is huge.

I predict that in 10 years' time, rasterization won't be used in games anymore. RT will be the only option. There won't even be an 'off' button, because nothing else will be available. So obviously Intel integrated graphics will have to be RTX. Everything. We'll definitely see big changes in 6 years. Obviously by then, the hardware available will handle well over 100 FPS and the cards will be as cheap as today's rasterization cards.
 
As truly amazing as real-time ray tracing is, it's just not yet possible to make cards with that technology at a realistic price.

Nvidia should have released two more generations of traditional rasterisation cards: an 1100 series, followed by a 1200 series in 2020. Then it should have been possible to produce RTX at a cheaper price in 2022.

Sorry, but this is the garbage I was talking about. Where did you get the idea RTX is so expensive to produce? Why would it be so much more expensive than any previous generation?
 
Sorry, but this is the garbage I was talking about. Where did you get the idea RTX is so expensive to produce? Why would it be so much more expensive than any previous generation?

To do ray tracing properly, like it's done in CGI films, currently takes lots of GPUs. Condensing that sort of power into a single GPU will take time and money. Look how big the die is just for the performance we're seeing now, and it's not anywhere near full ray tracing.
 
Even the taxi driver last night told me that Nvidia had skimped on doing RTX. :D

I did not mention that I have a couple of 2080 Ti cards. :eek:
 
You can't base a prediction of whether a new technology will fail or not on BF5 alone. It would be like looking at the first car and saying it's a failed invention, or looking at the first rocket and saying it's a failed invention.

Forget anything you see about RT in 2018. They said DirectX was a failure back when people were using 3dfx Glide and OpenGL. Now, in 2018, DirectX is huge.

I predict that in 10 years' time, rasterization won't be used in games anymore. RT will be the only option. There won't even be an 'off' button, because nothing else will be available. So obviously Intel integrated graphics will have to be RTX. Everything. We'll definitely see big changes in 6 years. Obviously by then, the hardware available will handle well over 100 FPS and the cards will be as cheap as today's rasterization cards.

Given ray tracing is not a tech that can draw everything, sorry, but I don't think that's going to be the case in 10 years. You seem to think the industry moves way faster than it actually does, and that game developers only care about top-end hardware.
The only way you'd see a tech adopted as widely as you're saying is if "low-end old hardware" can utilise it.
There are still DX9 games being published in 2018. How old is DX11?
I reckon it's going to take 5 years for it to be available in maybe 5-10% of AAA titles at best, which will equate to about 1-2% of all titles. We will go from that to 100% of all titles in 10 years? And not just small bits of lighting effects, but all of the graphics being rendered by RT? I don't know what you're smoking, but that seems just barmy thinking.

I mean, why would you use RT to render, say, a side-view platformer or an anime-style game?

Or do you think everyone just wants to play games with realistic graphics emulating a movie? This is another part of the problem: first-person gamers thinking that's the entire industry.

So take, for example, BF5 at 60fps 1080p with small little tidbits of RT in use: how much of an increase in RT power would be needed to fully render every object in this game with RT? 20x? 50x? 100x?
Then, after that, hardware advancements have to reach the point where an integrated GPU in 10 years can do all that. Because if it cannot, there's absolutely no chance of what you're claiming happening.
 
I don't mean to be rude but that means you either don't understand what RT is, don't understand what PhysX is, or both.

PhysX is cool but it's not super game-changing; you can do similar stuff with Havok etc. RT is the biggest advancement in GPU tech in probably the last 15 years; IIRC hardware T&L was probably the last game-changer on this level. In 2-3 years from now every game coming out will have RT features. It will become a standard feature of all games, just like T&L, AA, AF, 3D acceleration, etc.


Ray tracing isn't the biggest advancement in GPU tech in 15 years, largely because ray tracing began DECADES ago. Ever since then, ray tracing has just been a question of when there would be enough power to utilise it, nothing more or less. Year 1, one frame every 6 hours; year 3, one frame every 3 hours; and so on, until we reach the point where one ray-traced frame can be done in 1/60th of a second, and then it will be used everywhere. Ray tracing is also really pretty simple, hence why it was easy to do decades ago, just slow. It's easy because it doesn't use tricks, so it's simple but exceptionally computationally heavy. Rasterisation started off simple and crap-looking and gets more complex every year, introducing more quality through tricks, shortcuts and workarounds to give you higher quality at a lower cost. Insane quality at insane cost was doable with ray tracing decades ago, and at some point it will become viable, but it's not an advance in GPU tech.

Essentially, ray tracing will become the norm when the image quality improvements with normal rasterisation become so complex and so difficult to push forward that they actually take more power than ray tracing to achieve gains, and ray tracing thus becomes 'cheaper'.


The strange thing here is that Nvidia has always had the ability to force their tech into games via developer relations and GameWorks. It's not about developer help; it's simply "we're paying you to put GameWorks in". It very much makes you think that DLSS has a major quality or functionality issue. If it 'just worked', and they are paying games to be GameWorks titles, it would be in there imo; that it's not indicates something on Nvidia's side preventing it being enabled/utilised until they fix it.

It's strange: upscaling an image was always frowned upon, but it still makes sense if it gets you a higher fps at full 4K output while stopping the screen, or a 'less good' algorithm in the Nvidia driver, from upscaling the image from say 1440p to 4K. Upscaling has its places and works well enough on consoles, but to paint it as an uber new image quality option, when really it's a lower-quality performance cheat, was, let's say, disingenuous. 4K DLSS will almost certainly look less good than native 4K, but if it looks better than 1440p and performs better than straight 4K then it can definitely work for people, as said. The real question is how good the image will actually be, because you suspect it will be a little blurry and washed out in reality. A compromise you accept just to get a solid 30fps on a console, but one I wouldn't make on a PC, especially sitting closer to the screen.


Back to RT: I've been saying it for years. It was the holy grail because 25 years ago the difference between rasterisation and ray tracing was night and day. Today the difference is drastically smaller due to the complexity of rasterisation. Even those BF5 reflections: in static shots they look nice, but in reality, while playing, they are something you barely notice, as you aren't just focusing on puddles and windows but playing the game. Far lower quality reflections give almost exactly the same feel when actually playing the game and cost so much less performance.

When the power is there to use it everywhere, fine; but while it's not, I'm in no way fussed about it.
 
Sorry, but this is the garbage I was talking about. Where did you get the idea RTX is so expensive to produce? Why would it be so much more expensive than any previous generation?

The cost of silicon wafers increased by 20% in the first half of this year, and a further 20% rise is due in 2019. TSMC will have to charge Nvidia at least 40% more for each batch of GPU dies in 2019 than they did at the end of 2017. Therefore, Nvidia will have to pass the full rise onto the consumer from the initial release of RTX, because it would be very bad for them if prices went up 20% six months after release.

The GTX 1080 Ti has a die size of 471 mm², so TSMC would have been able to cram about 180 dies onto a 300 mm silicon wafer. I don't know what the failure rate is, but if they get a 70% working yield, that leaves 126 working dies from a 16nm 300 mm wafer. A wafer costs about $7,500, so that's $59.52 per die.

Due to the RTX 2080 Ti's larger die size of 775 mm², they can only fit about 61 dies onto a 300 mm wafer. The newer 12nm wafer is also more expensive, and I can't find anything on TSMC's website, so let's give it a conservative figure of $9,000 per wafer. Assuming a similar failure rate, that means each RTX die costs about $210 to make. Then add on the 20% price increase and we arrive at $252.92.

I'm just guessing at the failure rate and the cost of a 12nm wafer, but the 2080 Ti die definitely costs at least twice as much to make as the 1080 Ti's.

I haven't even mentioned the cost of R&D to design such a highly advanced piece of tech.
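
Here's the same back-of-envelope maths in code form, using the guessed figures above. The dies-per-wafer counts, the 70% yield and the wafer prices are all assumptions, not official TSMC or Nvidia numbers, so treat the output as a rough sketch only:

```python
# Rough die-cost sketch using the guessed figures from the post above.
# Dies per wafer, yield and wafer prices are all assumptions, not official numbers.

def cost_per_good_die(wafer_price, dies_per_wafer, yield_rate):
    """Spread the wafer cost across the dies that actually work."""
    return wafer_price / (dies_per_wafer * yield_rate)

# GTX 1080 Ti (471 mm^2, 16nm): ~180 dies per wafer, 70% yield, ~$7,500 per wafer
gtx_1080_ti = cost_per_good_die(7500, 180, 0.70)   # ~$59.52 per working die

# RTX 2080 Ti (larger die, 12nm): ~61 dies per wafer, 70% yield, ~$9,000 per wafer
rtx_2080_ti = cost_per_good_die(9000, 61, 0.70)    # ~$210.77 per working die

# Add the expected ~20% wafer price rise for 2019
rtx_2080_ti_2019 = rtx_2080_ti * 1.20              # ~$252.93

print(f"1080 Ti die: ${gtx_1080_ti:.2f}")
print(f"2080 Ti die: ${rtx_2080_ti:.2f} now, ${rtx_2080_ti_2019:.2f} after the 2019 rise")
```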
 
Sorry, but this is the garbage I was talking about. Where did you get the idea RTX is so expensive to produce? Why would it be so much more expensive than any previous generation?

I agree with some of what you are saying. Nvidia could have released these cards at a cheaper price to try and promote the jump to ray tracing. They might have taken a small hit on their profits in the short term, but long term they would have made a killing and got a ton of kudos as well.

You are completely wrong, though, about how much these cards cost to make. They are much more expensive to make than previous generations. Just look at the die sizes; they are massive. Compare the 1080 Ti with the 2080: the 1080 Ti has 12 billion transistors on a 471 mm² die, while the 2080 has 13.6 billion transistors on a 545 mm² die. That's a big price increase right there. Now add in the more expensive RT and Tensor cores, the more expensive memory, etc., and these cards are a lot more complicated and cost a lot more to make.
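
If you want to sanity-check the die size point, there is a commonly used geometric approximation for how many candidate dies fit on a 300 mm wafer. It's not from anyone in this thread, and it ignores scribe lines and rectangular packing, so the counts are ballpark only and won't exactly match the figures quoted earlier:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Common approximation: wafer area divided by die area, minus a
    correction for the partial dies lost around the wafer edge."""
    radius = wafer_diameter_mm / 2
    return (math.pi * radius ** 2) / die_area_mm2 \
        - (math.pi * wafer_diameter_mm) / math.sqrt(2 * die_area_mm2)

print(round(dies_per_wafer(471)))  # GTX 1080 Ti (GP102, 471 mm^2) -> roughly 119 candidates
print(round(dies_per_wafer(545)))  # RTX 2080 (TU104, 545 mm^2)    -> roughly 101 candidates
```

Fewer candidate dies per wafer, before yield even enters the picture, is exactly why a bigger die costs more per chip.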
 
I agree with some of what you are saying. Nvidia could have released these cards at a cheaper price to try and promote the jump to ray tracing. They might have taken a small hit on their profits in the short term, but long term they would have made a killing and got a ton of kudos as well.

You are completely wrong, though, about how much these cards cost to make. They are much more expensive to make than previous generations. Just look at the die sizes; they are massive. Compare the 1080 Ti with the 2080: the 1080 Ti has 12 billion transistors on a 471 mm² die, while the 2080 has 13.6 billion transistors on a 545 mm² die. That's a big price increase right there. Now add in the more expensive RT and Tensor cores, the more expensive memory, etc., and these cards are a lot more complicated and cost a lot more to make.

That is absolutely mind-blowing, how many transistors are on such a small area. I would love to see one magnified to see what is actually going on in a processor. My inner nerd is out :)

Wow, you got me started. Watch this, people; we really should not moan, I suppose. This is magical lol

 
That is absolutely mind-blowing, how many transistors are on such a small area. I would love to see one magnified to see what is actually going on in a processor. My inner nerd is out :)

It's pretty amazing what they can do now as they push the limits of Moore's law.
 
That is absolutely mind-blowing, how many transistors are on such a small area. I would love to see one magnified to see what is actually going on in a processor. My inner nerd is out :)

Wow, you got me started. Watch this, people; we really should not moan, I suppose. This is magical lol


They are one of the, if not the, hardest things to make, purely because you are talking about transistors approaching atomic sizes, and everything has to be pretty much perfect to function as intended.

I have always found them fascinating. Just because of what they can do and how fast/accurately they can do it.

The only thing that fascinates me more is probably space.
 
So, I heard that Final Fantasy finally had DLSS available for it.

A friend of mine gives me his Steam deets so I can log in, download it and see what it's like.

SHOCKER! It won't ***** work because I have a QHD monitor and not a 4K screen; it will only work with a 4K panel.

NVIDIA! Final Fantasy? Who's to blame? If DLSS only works at 4K then that makes both RTX and DLSS a total BLOWOUT.

I'm very, very tempted to write them a snotty letter demanding that they refund my GPU purchase cost and let me send it back directly to them (even though I bought it from OCUK).

Honest to god, such a cluster fk clown show! No wonder their shares dropped 48%!

Not to mention the implementation: after looking it up online (while trying to figure out why I can't run DLSS on my QHD screen), I saw people using it with flickering pixels, blurry text and bad textures on some parts of models. Doesn't really seem to be what they sold....

This is my last NVIDIA GPU regardless of what happens.


I think some are setting the game to 4K, enabling DLSS and then switching down to 1440p with another setting. Not ideal though! It's not supposed to be 4K only, although I think that's where the largest benefit is. Hopefully other games will support 1440p too.
 