
Geforce GTX1180/2080 Speculation thread

Seriously though, when can we expect some benchmarks? Are they going to hold back drivers even when people already have the cards?

I think what they are planning is to release game benchmark drivers after the 30-day return policy period has expired, so that those shocked by the tame increase in performance over Pascal can't return them at that point.
 
lol someone posted this on another forum regarding these new cards

These are NOT my own words, I merely copy-pasted the below from elsewhere!

How about this wild theory.

NVIDIA hit a max with their CUDA cores and also a max with clocks. Meaning, they simply can't put any more of them on a chip and get more performance, and clocks topped out at that magic 2GHz on current processes.
One little hint of that: the Titan V, with its amazing 5120 CUDA cores and ultra-fast HBM2 memory, is -barely- able to beat the Titan Xp in games.

So the company scratched their heads... what can we give gamers to make them buy a next generation? We can't just sell Pascals forever... and we can't give them more performance, it has hit a ceiling.

The answer was... well, ray tracing.
But this wasn't meant for consumer cards. That "10 years in development" was VERY LIKELY for the media industry - giving ray-tracing acceleration to content creators, to sell incredibly expensive Quadro chips.

But when there was no alternative for the consumer chips... BAM! Harvested Quadro chips now sold as cut-down RTX 2070/80/80 Ti.

I have a strong feeling that the new and amazing RTX chips will not beat their Pascal equivalents at all (maybe by 10% if lucky, and only at 4K), and that is because they will clock LOWER than Pascal, due to the very big and complex chip (power hungry and hot, too).

Even more crazy wild theory:
- Even so, yields are very low for these (as they are originally huge, professional chips), and their availability will be weak for the entire next year.
- This pre-order stunt was done to grab as much cash as possible on these defective chips before the truth comes out, because those prices will be a very tough pill to swallow with just a 10% performance uplift.
- They had no choice and HAD TO RELEASE SOMETHING, because with the fall of the mining craze the market is inundated with 2nd-hand Maxwells, Pascals and everything else, meaning very few people (relatively) would buy NEW ones.
- They know something about AMD: possibly Navi or Vega 2 WILL be much faster than expected, and it will beat both Pascal and Turing in raw horsepower in all existing games.

Ray tracing is supposed to be the saviour of this gen, and that's why they pushed the marketing so hard for it, inventing crappy, meaningless performance values (Jiga-rays, tensor-flops) instead of bragging about how much faster they are in all games run at 4K, like they did with the previous 3-4 gens.

History will prove this theory right.. or wrong.
Guess we'll find out soon enough.
 
I'm guessing you just ignored what the Tomb Raider devs said about it?

Sorry, but with the greatest respect to the devs - it looks like a great game - I am done believing anything devs say about things.... I have been stung far too many times (Aliens: Colonial Marines, Star Citizen, and so many more), and also been told that future driver/firmware updates will make everything OK (Zidoo H6 Pro media box).

So no... if I see something being demoed as an advert for a product, THAT is what I am going to go on until the promised updates are out and working. If NV didn't want people to judge the performance of their card on Tomb Raider, they should have demonstrated something other than Tomb Raider.
 
lol someone posted this on another forum regarding these new cards

How about this wild theory.

SNIP.

Navi/Vega 2 part is not happening. There is always this wishful thinking regarding AMD's next GPU. I would love them to come out with something extremely competitive but Nvidia hasn't bolted on the RT gimmick because of them.
 
These threads are like the Brexit argument....:D

At least there aren't many racists here raving against EU immigrants who, when you tell them "hey mate, I am an EU citizen", dismissively insist they weren't referring to me.... :rolleyes:
 
Pic not working @Brun
Sorry. Checked it in preview and it looked ok, maybe cos I was logged into their forum where it was an attachment. Anyhow...

[attached image: ACCRTX.png]
 
Sorry but with the greatest respect to the devs - it looks like a great game - I am done believing anything devs say about things....

SNIP.

I'm sceptical, but I've had a lot more great gaming experiences than No Man's Sky-style letdowns, so I'm happy to take them at their word. To be honest I'm not even especially interested in RT, although I think it will add a lot to certain games. I'm more interested in that TechRadar report about 4K performance. Hope it's true.
 
There had better be some monster IPC gains per CUDA core, or the RTX lineup does not look impressive. Especially given that, with a little tweaking, the Titan V is 30-50% faster than the 1080 Ti already.

[attached image: M9QNlEc.jpg]
 
I'm skeptical but have had a lot more great gaming experiences than No Man's Sky's so happy to take them at their word.

SNIP.

For ray tracing to come off, the devs have got to find value in it. For the game devs to find value in it, it has to be widely supported in hardware - and by that I mean consoles, AMD and Nvidia hardware. I expect AMD will bring something big to consoles before PCs, as that's where they get to sell in high volume. Then RT will take off.
 
lol someone posted this on another forum regarding these new cards

How about this wild theory.

SNIP.

I didn't realise that the Sunday Sport now had a PC tech page... maybe it's because I just look at the pictures.
 
There better be some monster IPC gains per CUDA core, or the RTX lineup does not look impressive.

SNIP.

In raw FP32 performance, it seems odd that anyone would choose an RTX 2080 over a 1080 Ti, which (ray tracing aside, and I suspect the 2080 won't do that well anyway) would smash a 2080. The only interesting card in the lineup is the 2080 Ti, and only in raw performance terms. It is very expensive, though. As someone who has finally given up on SLI, it looks like it will perform better than my 1070s in SLI ever could.
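As a rough sanity check on the raw-FP32 point above, peak throughput can be estimated with the usual formula (TFLOPS ≈ CUDA cores × 2 ops per FMA × boost clock). This is only a ballpark sketch using Nvidia's published reference core counts and boost clocks; real-world clocks and actual game performance vary:

```python
# Rough peak FP32 throughput: cores * 2 (an FMA counts as 2 ops) * clock in GHz.
# Core counts / boost clocks below are Nvidia reference specs, not measured values.
def peak_tflops(cuda_cores: int, boost_ghz: float) -> float:
    return cuda_cores * 2 * boost_ghz / 1000.0

cards = {
    "GTX 1080 Ti": (3584, 1.582),
    "RTX 2080":    (2944, 1.710),
    "RTX 2080 Ti": (4352, 1.545),
}

for name, (cores, clock) in cards.items():
    print(f"{name}: {peak_tflops(cores, clock):.1f} TFLOPS")
```

On those reference numbers the 1080 Ti does come out ahead of the 2080 in peak FP32 (roughly 11.3 vs 10.1 TFLOPS), with the 2080 Ti (around 13.4) the only clear step up - consistent with the post's argument, if not quite a "smashing".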
 
lol someone posted this on another forum regarding these new cards

How about this wild theory.

SNIP.
I think there could be some truth in this.
 