The fact it's been added into a game not initially designed with it in mind probably doesn't help. This is exactly the reason NV need to get it out there, however: see how it's implemented and improve it from all angles. It's a three-way effort, i.e. Nvidia hardware and drivers, Microsoft DirectX, and game devs implementing the use of it.

That reflection in the painting is completely unrealistic to me. For starters, the painting's surface appears matt. Then the only reflection is of the player boarding up the window, which vanishes. When they turn, you can still see plenty of light coming through the gaps, which should also reflect on the painting. Also, it's the only reflection - there are no others on the surface of the painting from the various other light sources. It's like it's been baked in, just in a different way. I thought the whole point of RT was that it calculated the path of light realistically for a scene?
Also, the reflectionless bridge just made me sad - a perfect example of why you would want RT, and it's missing. The water in the harbour just looked bad; I much preferred the traditionally rendered scene, even though it did not have any reflections.
It just feels rushed and rough around the edges to me.
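For anyone curious what "calculating the path of light for the scene" actually means in practice, here's a very rough Python sketch of Whitted-style shading - nothing to do with DICE's actual renderer, every name and number below is made up. The point is that every light source contributes to a surface point, and any glossiness comes from an actual traced reflection ray, so a lit gap in the boarded-up window would show up on the painting rather than one lone baked-in reflection:

```python
# Very rough Whitted-style shading sketch - illustration only.
import math
from dataclasses import dataclass

Vec = tuple[float, float, float]

def sub(a: Vec, b: Vec) -> Vec:
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def dot(a: Vec, b: Vec) -> float:
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def norm(a: Vec) -> Vec:
    l = math.sqrt(dot(a, a))
    return (a[0] / l, a[1] / l, a[2] / l)

def reflect(d: Vec, n: Vec) -> Vec:
    k = 2.0 * dot(d, n)
    return (d[0] - k * n[0], d[1] - k * n[1], d[2] - k * n[2])

@dataclass
class Light:
    pos: Vec
    intensity: float

def shade(point, normal, view_dir, lights, glossiness, radiance_along):
    """Brightness at a surface point: sum of ALL lights, plus whatever the
    scene returns along the mirrored direction if the surface is glossy."""
    total = 0.0
    for light in lights:
        to_light = norm(sub(light.pos, point))
        total += light.intensity * max(0.0, dot(normal, to_light))  # every light counts
    if glossiness > 0.0:
        mirrored = reflect(view_dir, normal)
        total += glossiness * radiance_along(point, mirrored)       # traced reflection
    return total

# Toy scene: two lights, plus a stand-in for "a bright gap in the boards"
# sitting along the mirrored direction.
lights = [Light((0.0, 3.0, 0.0), 0.6), Light((2.0, 1.0, -1.0), 0.4)]
bright_gap = lambda origin, direction: 0.8

print(shade((0, 0, 0), (0, 1, 0), norm((0, -1, 1)), lights, 0.0, bright_gap))  # matt painting
print(shade((0, 0, 0), (0, 1, 0), norm((0, -1, 1)), lights, 0.5, bright_gap))  # glossy painting
```

With glossiness at zero (a properly matt surface) the reflection term never fires at all, which is exactly why a single moving reflection on a matt painting reads as baked rather than traced.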
They should have showcased this tech on a slower-paced, single-player game. Witcher 3 / Hitman 2 spring to mind. Hitman has plenty of shiny surfaces; Witcher 3 could have showcased water reflections / gorgeous environment lighting.

Work being done today is going to help mature the technology for future game and hardware releases.

Indeed. Hopefully just in time for when the 7nm cards and Cyberpunk 2077 come out.

Yep, will get better RT for the money, but I'll be surprised if the £ will be good (7nm will probably be £££).
I am due an upgrade on my CPU, mobo and RAM anyway. Strongly considering going for Zen 2 (if it is as good as I am expecting) next year at some point.
Many may need to upgrade their CPUs too, given that the recommended minimum for RT is an 8700K or a 2700X. Not even sure my 1950X falls inside or outside that, to be honest. It has the core count at least. Seems to perform okay, but I've not found anything to compare it with.
Yeah, it's interesting why they've changed things around this time - probably driven by the technology/size of the die needed. I reckon we'll get a Titan at some point too. I have wondered if they'll do it with 7nm. If we do get a Titan in the coming months I'll be very surprised if we get the 30 series until at least early 2020. But that should give time to develop RT further from both the HW and SW side. Some are hoping for a 30 series sooner, but it kinda makes sense to take longer to do a lot more R&D.
I personally doubt 7nm will cost any more than current prices. Nvidia have already been told they are taking the mick with current prices. If I had to guess, a 3080 Ti will be $999 at the most and will be on a much smaller die, under 500 mm².

The only reason Nvidia have right now for releasing the 2080 Ti at the $1,199 price point is the huge 754 mm² die and RTX for the first time.
What they need to go back to is their old model: release a Titan first, where they can milk the people who cannot wait, then once sales dry up on those, release the 3080 Ti at a decent price of, say, $799.
Nope. Seems a bit odd. It mentions it works for the 200 series? Thought this was 20 series.

Black Ops uses all 11 GB on my mate's 1080 Ti and nearly all 8 GB on my card.
https://www.geforce.com/games-applications/pc-applications/design-garage
Does this work for you?
It starts invoking but doesn't fully load. No useful details of the failure in Event Viewer.
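If anyone else hits the same thing, one quick way to dig the crash record out rather than scrolling Event Viewer by hand - just a sketch, but wevtutil ships with Windows and the filter below keeps only Error-level entries from the Application log:

```python
# Pull the ten most recent Error-level entries from the Application log
# via wevtutil (bundled with Windows).
import subprocess

result = subprocess.run(
    ["wevtutil", "qe", "Application", "/c:10", "/rd:true", "/f:text",
     "/q:*[System[(Level=2)]]"],
    capture_output=True, text=True)
print(result.stdout or "No matching error records found.")
```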
They already released a Titan V earlier this year.

I am talking about RTX though. But say the Titan V was this generation's Titan: I am sure sales were pitiful compared to other Titans, due to hardly any gamers buying it. The price on that was bonkers. I think only a handful of people on this forum purchased it for gaming/benchmarking, whereas other Titans usually sell very well here.
https://www.geforce.com/games-applications/pc-applications/design-garage
According to the site it's a Raytracing demo. That doesn't run on my PC.
Yep, the size isn't helping - something many are forgetting. At, say, £800 they'd always be out of stock with a 5-month waiting list.
It's a bit of a shame they haven't put more memory on the 2080, I have to say. In BFV mine is using just under 6GB @ 1440p in DX12. Maybe DLSS, if/when it's used, doesn't require more memory, but seeing as even the 2080 is being sold as a 4K card, if they don't have some trickery I would imagine 8GB will be a limitation fairly soon at 4K. The 20 series uses less memory than the 10 series too, going by a few games I've checked.
Zen 2 should be good! While the 2700X is a much better value proposition than the 9900K, I can see some of the benefits the 9900K has over the 2700, according to a few charts anyway. Would be good if Zen 2 beats it in most things.

Yeah. The 2080 having 8GB was also something that put me off, to be honest. Had they put 11GB on it and priced it a bit cheaper I would have been on board, as I did want some new tech. But in the end I went for my Titan XP from the members market, which was nearly half the price, has 4GB more VRAM and is very close in performance. When playing Final Fantasy 15 I was seeing nearly the whole 12GB being used up! It was about 11.5GB used in many of the areas.

Memory use can vary. In some cases a game might use more just because it's available to be used, but might not necessarily need it. Definitely the more the better, however. I was a bit surprised to see the 2080 at nearly 6GB. That was with RT on too; didn't try it off. On a 10 series card I assume it would be closer to 7GB, given that a few games I checked showed about a 12% difference. Forza 4, the last game I checked, only used 4.2GB on the 2080 and 4.7GB with a 1070 Ti.
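For what it's worth, figures like the above are just what the driver reports as allocated, which (as said) isn't necessarily what the game needs. A rough way to grab the same numbers yourself - assuming nvidia-smi is on the PATH, which it is with the GeForce driver installed - is to poll it while the game runs and keep the peak:

```python
# Poll the driver's reported VRAM usage once a second and keep the peak.
# Note this is allocation across all processes, not what the game "needs".
import subprocess
import time

def used_vram_mib() -> int:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True).stdout
    return int(out.splitlines()[0].strip())

peak = 0
for _ in range(600):          # roughly ten minutes of samples
    peak = max(peak, used_vram_mib())
    time.sleep(1)
print(f"Peak VRAM reported: {peak} MiB")
```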

But Hitman already has detailed reflections built in to the game engine (which don't have the issues highlighted in the above video). It would be even less worth it.

Agreed - but how is that any different to Battlefield V's built-in reflections (some of which I prefer over RT)? Or any other decent 3D engine for that matter?
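On the "how is it different" point: the built-in reflections in most engines are typically screen-space (plus baked cubemaps), i.e. they re-use pixels that already made it into the rendered frame, so anything off-screen or occluded simply isn't available to reflect, whereas a traced reflection re-queries the scene itself. A toy sketch of that difference - everything here is made up for illustration, it's not any engine's real code:

```python
# Screen-space reflection vs traced reflection, reduced to a lookup.
frame_buffer = {(120, 80): "lamp glow"}                  # only what ended up on screen this frame
world = {"bridge": "grey stone", "lamp": "lamp glow"}    # the full scene

def ssr_reflection(screen_pos):
    # SSR: sample the already-rendered image; off-screen content just isn't there.
    return frame_buffer.get(screen_pos)

def traced_reflection(hit_object):
    # RT: intersect the actual scene, on-screen or not.
    return world.get(hit_object, "sky")

print(ssr_reflection((10, 10)))      # None - the bridge never made it on screen
print(traced_reflection("bridge"))   # 'grey stone' - a traced ray still finds it
```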
Agreed. 8GB even at 4K is enough for 99.9% of the games out there, to be honest. But as you say, more is better, and who knows, I may end up keeping this card longer than I expect, in which case the extra VRAM will come in handy on new games.
TXP is good. I wish I'd kept mine; I'd still have it now and would have waited for something better than the 2080 Ti. Sold it to use two 1070 Tis in SLI, then realised, especially with the 1950X, that SLI was ****.

I won't moan any more about the RTX series. I got my RMA RTX 2080 Ti refund today, and I have 2x Gigabyte GTX 1080 Ti Gaming OC 11GB cards waiting to be picked up after work ... bye bye RTX issues!
