So what does this hint at?
Who knows. The only thing we know from earlier this year was that AMD promised a driver update adding HDMI 2.1 on all RX cards.
But could be RT, or something completely different.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
lol, someone posted this on another forum regarding these new cards.
These are NOT my own words; I merely copy-pasted the below from elsewhere!
How about this wild theory.
NVIDIA hit a max with their CUDA cores and also a max with clocks. Meaning, they simply can't put any more of them on a chip and get more performance, and clocks topped out at that magic 2GHz on current processes.
One little hint of that is the Titan V, which, with its amazing 5120 CUDA cores and ultra-fast HBM2 memory, is -barely- able to beat the Titan Xp in games.
So the company scratched their heads... what can we give gamers to make them buy a next generation? We can't just sell Pascals forever... because we can't give them more performance; it has hit a ceiling.
The answer was... well, ray-tracing.
But this wasn't meant for consumer cards. Actually, that "10 years in development" was VERY LIKELY aimed at the media industry, giving ray-tracing acceleration to content creators in order to sell incredibly expensive Quadro chips.
But when there was no alternative for the consumer chips... BAM! Harvested Quadro chips are now sold as the cut-down RTX 2070/2080/2080 Ti.
I have a strong feeling that the new and amazing RTX chips will not beat their Pascal equivalents at all (maybe by 10% if lucky, and only at 4K), and that is because they will clock LOWER than Pascal, due to the very big, complex (and power-hungry, and hot) chip.
An even crazier, wilder theory:
- Even so, yields are very low for these (as they are originally huge professional chips), and their availability will be weak for the entire next year.
- This pre-order stunt was done to grab as much cash as possible from these defective chips before the truth comes out, because at those prices a mere 10% performance uplift will be a very tough pill to swallow.
- They had no choice and HAD TO RELEASE SOMETHING, because with the fall of the mining craze the market is inundated with second-hand Maxwells, Pascals and everything else, meaning very few people (relatively) would buy NEW ones.
- They know something about AMD: possibly Navi or Vega 2 WILL be much faster than expected, and it will beat both Pascal and Turing in raw horsepower, in all existing games.
Ray-tracing is supposed to be the saviour of this gen, and that's why they pushed the marketing so hard for it, inventing crappy, meaningless performance values (giga-rays, tensor-FLOPS) instead of bragging about how much faster they are in all games run at 4K, like they did for the previous 3-4 gens.
History will prove this theory right... or wrong.
Guess we'll find out soon enough.
Titan V can easily beat a Titan Xp in gaming, and on average it's a better overclocker too, by a small margin.
HBM2 is not great for gaming and really needs a big overclock (over 1000 on the memory) to get the best out of the Titan V, hence the switch to GDDR6 on Turing.
Below is a good example of the Titan V really showing its muscle on demanding software compared to Pascal.
There’s no denying that NVIDIA TITAN V smashes GeForce GTX 1080 Ti FE gaming performance at 4K and 1440p. We experienced performance differences up to 40%, most were around the 30% mark, with varying games below or above that average.
I don't need to *think*; many developers are already ON THE RECORD stating they will support RTX - go do some reading instead of whining on OCUK.
You think developers are going to add significant implementations of RT to their games right now, when the consoles don't have the functionality, AMD cards don't have the functionality, and only the most ludicrously expensive card of Nvidia's new expensive range has the hardware for it?
Can I ask, what are YOU smoking?
So like I said, implementation of it is going to be reserved to a bit better shadows and reflections in some games, and that is literally it in the near term. Very far from your total hyperbole of 'Golden egg for 3D rendering' which is just laughable when applied to these cards.
I could see devs willing to add it due to Nvidia's market share dominance, but due to the pricing of the RTX cards they aren't going to sell anywhere near as well as Maxwell and Pascal.
For ray tracing to come off, the devs have got to find value in it. For the game devs to find value in it, it has to be widely supported in hardware, and by that I mean consoles, AMD and Nvidia hardware. I expect AMD will bring something big to consoles before PCs, as that's where they get to sell in high volume. Then RT will take off.
My theory is very simple. In two years' time, with the release of the next-gen consoles, Nvidia know they cannot compete at this 'low' price point. You can see what can be squeezed out of a highly optimised sealed box in the PS4 Pro; the PS5 will most likely deliver true 4K 60fps with amazing graphics.
So Nvidia are repositioning their business towards luxury high-end, high-price-point products.
I had to use a double layer of tin foil for my hat, but I still believe some of this may be true.
It really is, and that's all this thread consists of. Talk is cheap!
Maybe AMD have given up on the high end, as they have already seen the writing on the wall? The adoption rate of consoles is great.
To be fair, doesn't that particular benchmark use Async compute to a large extent? Volta excels here compared to Pascal. Would explain the large gains, but your point is still valid. There's still enough of a gain to be considered tangible.
I don't know why people keep saying that. My OC'd Titan V is ~30% faster than my OC'd Xp in real-world games. I don't consider 30% faster at this level to be "barely" faster.