It will also support the DXR API, according to Nvidia.
Predominantly, all ray tracing will work only on Nvidia's proprietary tech, including GameWorks.
You are just making stuff up. The demos released have been using Microsoft DXR.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Nvidia have no desire to defeat open source; they are big proponents of it. Nvidia are pushing OpenGL and Vulkan far more than AMD is.
With ray tracing, Nvidia has already provided an open-source materials library, MDL, which is used by the likes of Adobe and Unreal.
RTX is being supported in open source projects like Blender.
I've seen a lot of posters quoting G-Sync module component costs like these, but I don't believe they're even close to accurate. Where are these figures from? They don't make sense.
That's the new G-Sync module that's in the HDR 144Hz 4K monitors, not the one that's in any of the current non-HDR G-Sync monitors. The price of the new module came from a teardown that PCPer did. You can look up the part at online distributors; it costs $2,600 and you have to buy a minimum of three. They estimated what it costs Nvidia to buy one based on that.
The FPGA used on the HDR G-Sync module has a street price of $2,600. The rest of the parts on the module cost $250-300, as it's a bit more beefed up than the normal G-Sync module, which costs $250. Do the maths.
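A minimal sketch of that maths, using the street prices quoted above (assumed retail figures; Nvidia's actual volume pricing isn't public and would be lower):

```python
# Rough bill-of-materials estimate for the HDR G-Sync module, based on the
# street prices quoted in the post above (assumptions, not Nvidia's real costs).
fpga_street_price = 2600                        # FPGA single-unit street price, USD
other_parts_low, other_parts_high = 250, 300    # rest of the module, USD

print(f"HDR module estimate: ${fpga_street_price + other_parts_low}"
      f" - ${fpga_street_price + other_parts_high}")   # roughly $2,850 - $2,900
```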
So still no VRR support on the RTX cards?
That makes more sense now. Cheers.
My mistake, I thought we were talking about regular modules.
If there was going to be VRR support from Nvidia, do you believe they would have spent all that money creating their own HDR G-Sync module?
Or their own TV lineup?
They could offer both. Though it seems mental to keep pushing G-Sync, I think the focus group might have enough influence to keep people buying G-Sync. All the rhetoric would have to be is that G-Sync is better because AMD don't have £3,000 graphics cards. That point seems to be all that matters.
Nvidia are five years behind on implementing an open standard. Maybe they could just add FreeSync monitor support on the refreshed GTX cards?
Why is this thread talking about monitors?
Nah. NV have said numerous times, when asked about this, that they are selling "exclusive" products and want their customers to feel exclusive.
Can you imagine using Freesync alongside consoles, AMD and Intel? Where is the exclusivity on that?
Ultra HD 4K = 8.3 megapixels x 400 rays/pixel x 60 FPS = 199 GigaRays/second, and some estimates say up to 10x that figure is needed for noise reduction.
So only 10 GigaRays/second is going to be extremely weak.
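A quick sketch of that arithmetic; the 400 rays/pixel figure and the 10x denoising factor are the assumptions from the post above, not published numbers:

```python
# Back-of-the-envelope ray budget for native 4K at 60 FPS,
# using the assumed 400 rays/pixel and a 10x allowance for denoising.
pixels_4k = 3840 * 2160        # ~8.3 megapixels
rays_per_pixel = 400           # assumed sample count
fps = 60

rays_per_second = pixels_4k * rays_per_pixel * fps
print(f"{rays_per_second / 1e9:.0f} GigaRays/s")          # ~199 GigaRays/s
print(f"{rays_per_second * 10 / 1e9:.0f} GigaRays/s with 10x denoising headroom")
```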
How many RT cores will there be, if CUDA cores go up to 4,608 and Tensor cores up to 576?
I want playability in a game. Fancy graphics is all well and good but if the game is gash then no point.
Dialogue aside, indeed we digress into a completely different matter; I will steer the discussion back to the subject.
Given what we know of the top-spec Quadro RTX, achieving the above would require 200 cards, each costing $10,000.
+ the NVLink licences.
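A rough sketch of where that 200-card figure comes from, assuming the ~10 GigaRays/s Nvidia quotes for the top Quadro RTX and the ray budget worked out above:

```python
# Brute-force cost estimate for the ~1,990 GigaRays/s target from the earlier post.
# Assumptions: ~10 GigaRays/s and ~$10,000 per top-spec Quadro RTX card.
target_gigarays = 199 * 10     # GigaRays/s (199 GigaRays/s x 10 for denoising)
gigarays_per_card = 10
card_price = 10_000            # USD

cards_needed = -(-target_gigarays // gigarays_per_card)   # ceiling division -> 199
print(f"{cards_needed} cards, roughly ${cards_needed * card_price:,}")  # ~$1,990,000
```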
Amen to that. I want playability in a game. Fancy graphics is all well and good but if the game is gash then no point.
Cheers Panos.
The first GeForce 256: "GeForce 256 was marketed as 'the world's first GPU, or Graphics Processing Unit', a term Nvidia defined at the time as 'a single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second.'"
How many can the 1080 Ti do compared to this?