NVIDIA ‘Ampere’ 8nm Graphics Cards

I hope we get a 40” version next year. Would be perfect.

I will likely only upgrade my current monitor to an OLED. Not impressed with the price-to-performance of monitors currently. It's also too much of a lottery getting one without issues. I got lucky with my current monitor on my second go.



I like glossy. Looks better. Happy to use curtains if needed :)

Having owned a number of OLEDs in the past, they are far from perfect and not free from issues either, just like any TV ;). It just depends what you are sensitive to and willing to put up with. Don't think I've ever seen a perfect OLED; don't think I ever will. Same in the monitor world. Just today's QC, unfortunately :(.

I think glossy gives the image better depth and finish; can't imagine how it would be close up as a desktop, though.
 
Path tracing techniques tend to scale well with scene complexity as far as things like mesh and texture detail go - what slows them down is additional bounce/ray counts, caustics, reflections to a degree, etc.
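Purely as an illustration of that scaling argument (a back-of-envelope sketch, not any real engine's numbers): with a BVH, per-ray traversal cost grows only logarithmically with triangle count, while the total ray budget is linear in samples and bounce depth.

```python
import math

def estimated_rays(width, height, samples_per_pixel, max_bounces):
    """Total rays cast per frame: one camera ray per sample plus one
    scatter ray per bounce (shadow rays ignored for simplicity)."""
    return width * height * samples_per_pixel * (1 + max_bounces)

def relative_traversal_cost(triangle_count):
    """Per-ray BVH traversal cost is roughly O(log2 n) in triangle count."""
    return math.log2(max(triangle_count, 2))

# 100x more triangles barely moves per-ray cost...
low_poly = relative_traversal_cost(100_000)
high_poly = relative_traversal_cost(10_000_000)
print(round(high_poly / low_poly, 2))  # ~1.4x

# ...but a few extra bounces doubles the ray budget outright.
shallow = estimated_rays(1920, 1080, 1, 2)
deep = estimated_rays(1920, 1080, 1, 5)
print(deep / shallow)  # 2.0x
```

So geometry is comparatively cheap, and it's the bounce/ray counts that eat the frame budget, which matches the Q2VKPT write-up quoted further down.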

Then they should have done a proper scene/level. Comparing it to the UE5 demo, to me that one looked far better than nVIDIA's and, if all they claim is true, more interesting.
 

As to that, I have no idea on the resources available, though you'd think nVidia would throw more effort behind it.

A better quote on what I was talking about before, though:

While it is true that Quake II is a relatively old game with rather low geometric complexity, the limiting factor of path tracing is not primarily raytracing or geometric complexity. In fact, the current prototype could trace many more rays without a notable change in frame rate. The computational cost of the techniques used in the Q2VKPT prototype mainly depend on the number of (indirect) light scattering computations and the number of light sources. Quake II was already designed with many light sources when it was first released, in that sense it is still quite a modern game. Also, the number of light scattering events does not depend on scene complexity. It is therefore thinkable that the techniques we use could well scale up to more recent games.

Why Quake II?
Since Quake II is open source and has a long-standing modding tradition, it is a great sandbox for putting academic research to the test in the real world. Particularly, the game has fast-paced action and is played competitively, setting high standards for the performance and robustness of any implemented rendering techniques. Finally, in some sense Quake II is to this day quite a modern game, since it already shipped with complex and artistic light design back when it was first released.

It isn't something that is very intuitive, because the game has relatively low geometric complexity and lacks material shaders (even transparency effects in the game are dodgy), but underneath, compared to most other games of that era, the engine is in many ways far ahead of its time.
 
In the CPU industry, 10 to 15 percent is the norm in the modern era. We rarely see a huge leap.

Depends on what you're doing. Having, all of a sudden, a 6/12, 8/16 or even a 12/24 CPU at a good offer, at the same price as the previous 4/8 ones, in apps that do know about extra cores, can be seen as a pretty big leap ahead. But if we're sticking strictly to gaming, then it should also be mentioned that the need to upgrade the CPU is far less pressing than the GPU - so big steps are made naturally as you finally decide to go from a very old part to a new and shiny one when it's finally required. :)
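The "apps that do know about extra cores" caveat is essentially Amdahl's law. A quick sketch (the parallel fractions below are made-up examples, not measurements of any particular app or game):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Upper bound on speedup when only part of the work parallelises."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A well-threaded renderer/encoder (say 95% parallel), 4 -> 12 cores:
print(round(amdahl_speedup(0.95, 12) / amdahl_speedup(0.95, 4), 2))  # ~2.2x

# A game loop that is only ~50% parallel sees far less from the same jump:
print(round(amdahl_speedup(0.50, 12) / amdahl_speedup(0.50, 4), 2))  # ~1.15x
```

Which is why tripling the core count at the same price is a big leap for content-creation workloads but a modest one for most games.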


Yup, I'm aware of your thoughts on the matter. While I am excited about what RT can do, I'm far more reserved about how it's used today - most likely due to lack of hardware power as well. But that DLSS 2.0 for sure looks great!

I think it's more of a picking-your-battles scenario, depending on the type of game you're in. The Vanishing of Ethan Carter Redux looks, to me, significantly better than Metro Exodus in the Taiga level (or whatever it's called), where the scenery is relatively the same. Of course, The Vanishing of Ethan Carter Redux also performs far better than Metro, even when Metro runs without RT.

Then, talking about multiple lights: when the whole low-level API thing was a hot topic, one of the touted benefits was that it should allow a greater number of shadow-casting lights per scene than the 4-5 usually used in DX11 before performance breaks down. That's another, cheaper thing that can improve the quality of an image, but it didn't quite take off even though DX12 is far more widely supported by hardware than RT.
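To illustrate why the count tops out around 4-5 lights (the millisecond costs here are hypothetical, just for the arithmetic): in a classic DX11-style renderer, each shadow-casting light re-renders scene geometry into its own shadow map, so cost grows linearly per light.

```python
def frame_time_ms(base_pass_ms, shadow_pass_ms, num_shadow_lights):
    """Base scene pass plus one depth-only shadow pass per casting light."""
    return base_pass_ms + shadow_pass_ms * num_shadow_lights

BUDGET_60FPS_MS = 16.7
base, per_light = 8.0, 2.0  # made-up example costs

# 4 shadow-casting lights fit the 60 fps budget; a 5th blows it.
print(frame_time_ms(base, per_light, 4))  # 16.0 ms
print(frame_time_ms(base, per_light, 5))  # 18.0 ms
```

The low-level API argument was that cheaper draw-call submission and multi-threaded command recording would shrink the per-light overhead enough to raise that ceiling.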

People are saying that the UE5 demo is just a demo and doesn't have all the other assets of a game, but the Tomb Raider games have plenty of scenes just like that (walking around, "doing nothing"), so it is part of a game. nVIDIA's demo, not so much. And also about picking your battles: having better assets, such as the ones from the Heretic, Blacksmith or Book of the Dead demos, can in my opinion contribute more to photorealism than allocating those hardware resources to RT. :)

Anyway, it will be interesting to see what the future holds, and hopefully nVIDIA will manage to convince developers to use RT to a far greater and better extent than GPU PhysX.
 

The latest ones have 120Hz BFI, which can be beneficial on SDR content, including games.

Also, they've moved on now to the point where burn-in doesn't seem to be the issue it was on 2015-2017 sets.
 

Yes, I agree with you there. Thanks to AMD, we have seen mainstream CPUs come with more physical cores and hyperthreading. I mean, look at the 10600K and 10700K: if AMD had not brought Ryzen, they would probably still have been 4 cores and 4 cores/8 threads, respectively. I am waiting for AMD to bring a Ryzen chip with 12 cores and 24 threads at a lower price bracket, x600X/x700X. That would be a buy lasting a good 6-8 years as a gaming PC.
 
The 3900x will drop in price when the 4000 series is released, then you will get your wish. :)
 
Looking forward to these GPUs. If the DLSS improvement is big, combine a 3080 or something with a Ryzen 4000 chip and you've got a lot of capability!

Could probably push 240Hz minimum in esports titles.
 
Yea, good time to build a new PC; hopefully we won't be let down by performance and price figures, since most of these anticipated releases are based on rumours.
 
For gaming it'll definitely be worth waiting for the 4000 series, which is what I'm doing. Pretty pointless switching from a 5GHz Intel chip right now, even if you've got a Sandy Bridge!! :eek: :D
Hmm, I think an overclocked 2600K might still be OK for a lot of games, but a 2500K will really start to show its age in modern titles, especially when paired with a high-end modern card.
 