GeForce GTX 1180/2080 Speculation thread

After two years, Nvidia has finally beaten AMD's cards at every tier. Well played Nvidia, better late than never xD

RTX 2080 Ti > GTX 1080 Ti
RTX 2080 > VEGA 64 > GTX 1080
GTX 2070 > VEGA 56 > GTX 1070
GTX 2060 > RX 580 > GTX 1060
GTX 2050 > RX 570 > GTX 1050 Ti

Need me some of these blinkers
 
To be fair to Gibbo, his opinion serves his own interests, given that he still has previous-gen cards to flog.

Sort of like Jensen from NVIDIA saying "we won't get new cards for a long time" a few months ago, when he knew full well that the new cards were merely a quarter of a year away. They are just protecting their current interests, business-wise.

Exactly.

Also, if Gibbo is saying Pascal card prices are going up, that means his stock is running low and he won't need to drop the price to clear out the remainder. Believe me, if he has not sold them by the time Turing hits his warehouse in quantity, he will either get some kind of rebate or lower the price in a one-time special to get rid of the remainder.

Tomorrow is going to be interesting. Does anyone know if benchmarks come out, or do we find out when the NDA on those expires?

I will base my decision on either a 2070 or a second-hand 1080 Ti, depending on how bad the 2070 is. Don't fancy going for a 2080 Ti knowing 7nm is not far away. Though I do think the 2080 Ti will be a nice card. It should beat the $3k Titan V :D
 
NVIDIA RTX 2070 Specs Leaked – 2304 Cores, 8GB GDDR6 at ~$400

https://wccftech.com/nvidia-rtx-2070-specs-leaked-2304-cores-8gb-gddr6-at-400/

Videocardz's sources confirmed the RTX 2070 will have 2304 CUDA cores and 8GB GDDR6, but performance is not at GTX 1080 Ti level - just ~8% faster than the GTX 1080. When overclocked it could reach GTX 1080 Ti performance.

There is a huge price gap between the $400 RTX 2070 and the $700 RTX 2080, so a chip at GTX 1080 Ti performance level would be an RTX 2070 Ti, which does not exist yet and probably won't until 2019.

Guess I will get the RTX 2080, so I will have to wait and see.

Only 8% faster than a 1080? Hmm, don't like the sound of that - it only puts it about 25% faster than the existing 1070.

So that's 384 more cores than the current 1070 and only 25% faster. Yet the 2080 is rumoured to have 512 more cores and the 2080 Ti is rumoured to have 1024 more cores.
Sounds like different AIBs are being told different core counts so Nvidia can tell who leaks the info.
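
For anyone wanting to sanity-check the maths, here's a rough sketch - the Pascal counts are the known figures, the Turing deltas are just the rumours above, and linear scaling with cores is a naive assumption:

```python
# Quick sanity check of the rumoured core counts vs Pascal.
# Pascal counts are the known figures; the Turing deltas are the
# rumours from this thread, and linear scaling is a naive assumption.
pascal = {"GTX 1070": 1920, "GTX 1080": 2560, "GTX 1080 Ti": 3584}
rumoured_extra = {"RTX 2070": 384, "RTX 2080": 512, "RTX 2080 Ti": 1024}

for (old, cores), (new, extra) in zip(pascal.items(), rumoured_extra.items()):
    uplift = extra / cores * 100
    print(f"{new}: {cores + extra} cores, +{uplift:.0f}% cores over the {old}")

# RTX 2070: 2304 cores, +20% over the GTX 1070 -- so ~25% faster
# would actually be slightly better than linear core scaling.
```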

Bottom line is we will know tomorrow
 
Not sure if the lack of HDMI 2.1 is a major issue - I don't believe VRR is a mandatory part of the spec.
hdmi.org isn't clear on this point, but reading the Q&A on 'Enhanced Refresh Rate features' suggests that products may have a combination of the features, depending on the manufacturer's implementation.
This does suck and I look forward to the day we have standard VRR widely available on graphics cards & monitors - but I'm not expecting this anytime soon. :(

This is the third time I've heard this from different people, but I cannot find any verification on it at all.
 
The 2070 isn't even the TU104 chip, remember. It's the one down from that - TU106 - the die class that used to power the GTX xx60 cards.

I really don't think that will be the case, seeing as it is supposedly an RTX rather than a GTX, so it should be a salvaged TU104 chip.
 
With all the pics of AIB cards and no Founders Edition, this looks like a rebrand rather than new cards - although the 970 did launch with AIB cards from the off.
 
It will be interesting indeed tomorrow! I hope we get benchmarks and all. If I'm sensible I'll get the 2070 - can't make any promises though.

After the last few pages of comments I gather a lot of people think the 2070 will be a disappointing waste of time. I'm hoping for the best, though, and I won't be surprised if a high base/boost clock 2070 matches a stock 1080 Ti while still being cheaper and with newer technology.

Frustratingly, if they copy the 10-series launch, they will only launch the high-end 2080/2080 Ti tomorrow and the 2070 will follow a month or so later.
 
I think that is what is happening, isn't it? The 2080 and 2080 Ti go live at some point tomorrow evening and the 2070 comes out in September.
 
True. We are just months away from 7nm (AMD at least is already five months in), so upgrading to a chip in the mould of the GTX 480/8800 GTX is kinda daft.
Because the new RTX 2080/Ti are just that: this generation's GTX 480 and 8800 GTX. A huge chip that will run hot, the last of its node before the next, very expensive to make and buy, with no direct competitor (so high prices), and it will be replaced in 12 months by a far better 7nm chip - one that runs much cooler, is faster, and has a direct competitor to keep pricing low.

But memories are short on this very forum. People have forgotten what history taught most of us with the 8800 GTX and GTX 480, and are repeating the same mistake.

FYI, inflation-adjusted, the 8800 GTX cost $1,000 in today's money, and it was quickly superseded by a better and far cheaper GPU. That is why Nvidia is pushing the Ti right now as well: sell as much as they can get away with and pull a fast one.
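
For what it's worth, a quick back-of-envelope on that inflation claim - launch MSRPs from memory, and the CPI multiplier is only approximate:

```python
# Back-of-envelope inflation adjustment. Launch MSRPs are from
# memory and the CPI multiplier is a rough 2006->2018 figure.
CPI_MULTIPLIER = 1.24  # approximate US CPI, 2006 -> 2018

launch_prices = {"8800 GTX (2006)": 599, "8800 Ultra (2007)": 829}
for card, usd in launch_prices.items():
    print(f"{card}: ${usd} then, roughly ${usd * CPI_MULTIPLIER:.0f} today")

# ~$743 for the GTX and ~$1028 for the Ultra -- so the $1,000
# figure fits the Ultra better than the vanilla 8800 GTX.
```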

Yeah - I'm sitting this one out for this reason. Will wait for the 7nm cards - my 1080 is doing fine at 1440p anyway.
 
ON HDMI 2.1, VRR IS MANDATORY. FULL STOP. Anyone who says differently is confusing it with DisplayPort adaptive sync, which is optional.

https://www.overclock3d.net/news/gp..._radeon_rx_gpus_with_a_future_driver_update/1
This announcement came on the same day that Nvidia revealed their 120Hz 65-inch "Big Format Gaming Displays", giant G-Sync HDR monitors that could be described as a range of large G-Sync smart TVs if it wasn't for the fact that it doesn't support TV channels or inputs. While Nvidia's BFGD range is an impressive technological feat, they are built with mostly proprietary technology with a price tag that is likely to be astronomically high. Right now Nvidia has not committed to supporting HDMI 2.1, whose VRR tech is a mandatory portion of the standard.

And if anyone reads the whole article, AMD said all RX cards will support full HDMI 2.1.
That means current RX Vega cards (including the APU) will only need the upcoming Adrenalin drivers to fully support HDMI 2.1.

If Nvidia supports VRR, it means the death of G-Sync and of locking their customers into their tech.

Plain and simple: if NV were willing to support VRR, they wouldn't have developed an $800+ HDR G-Sync module (~$500 for the FPGA + ~$300 for the rest of the board), nor their own lineup of TVs.
 
Those will have to be very special TVs to have any chance in the market... To me it's crazy that Nvidia are even attempting to make a play for TVs.
 
Yeah - I'd heard the 'VRR is mandatory' claim in various places as well - my comments are based on my interpretation of the Q&A here: https://www.hdmi.org/manufacturer/hdmi_2_1/ - specifically:

Enhanced Refresh Rate Features

Q: Are these primarily for gaming applications?
A: Certain aspects are better suited for gaming, but it depends on how the manufacturers implement the features. For example, for better gaming, Variable Refresh Rate (VRR) that synchs up source and display with continually changing refresh rate, and Quick Frame Transport (QFT) that allows frames to transmit faster from the source, both allow for smoother, no-lag, and no screen tearing gaming experiences.

Q: How is video or movie viewing any better?
A: When you switch between sources and their content sometimes there is a lag or dead screen while devices change resolutions, refresh rates or TV viewing modes; but Quick Media Switching (QMS) switches and sets those automatically and very quickly so viewing is uninterrupted and smooth.

Q: Can products have a combination of these features?
A: Yes, but it depends on each manufacturer’s implementation, so it is necessary to carefully check their specifications and marketing materials.

I found an older version of the HDMI Adopters Agreement and it has the concept of HDMI Features and minimum required functionality - that is, the minimum set of features that a product must support - so the key question, assuming that concept still stands, is whether VRR is optional or mandatory.
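
To put the open question another way, here's a toy sketch: the feature names are real HDMI 2.1 features, but both "mandatory" sets below are made up, because that's exactly the unknown.

```python
# Toy model of the open question: a product complies if it implements
# every feature in the spec's mandatory set -- and whether VRR sits in
# that set is exactly what we don't know. Feature names are real
# HDMI 2.1 features; both "mandatory" sets here are hypothetical.
def is_compliant(device_features, mandatory):
    """True if the device implements every mandatory feature."""
    return mandatory <= device_features  # subset test

gpu_port = {"QMS", "QFT", "eARC"}  # hypothetical port without VRR

print(is_compliant(gpu_port, mandatory={"eARC"}))          # True if VRR is optional
print(is_compliant(gpu_port, mandatory={"eARC", "VRR"}))   # False if VRR is mandatory
```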

So - to be honest - it's all speculation until we see products or the actual spec :) Still, it wouldn't surprise me to see Nvidia use their influence to ensure it was an optional part of the spec :mad:
 
VRR makes a lot of sense for AMD if you consider that they are building the next-gen PlayStation GPU... which will most likely be plugged into a modern TV rather than a monitor.
 
Keep your eyes wide open is all I can say when it comes to these new cards.

Try to ignore the naming game Nvidia is playing, and look carefully at the reviews, the architecture and the specs of the cards.

Nvidia has NEVER released an 80 Ti card alongside the 80 card, so if that rumour comes true, don't just assume the new 80 Ti is guaranteed to be a full-fat chip.

If the new 80 Ti and 80 use the same mid-range chip, as the 1070 and 1080 did, rather than splitting like the 1080 vs the 1080 Ti (big chip), then what Nvidia has done is essentially just a name change: pushing the naming up a tier (again) and relabelling chips that were supposed to be the 70/80 as the 80/80 Ti.
 
The market is offering a free, open standard that improves PCs. Seems silly for anyone not to use it, especially since the consoles are such strong competition.
 
Exactly. People should be clever and consider why Nvidia is putting out the 80 Ti right now.
If you watch the AdoredTV video with the RTX speculation - which was spot on - he hints at why Nvidia is rushing this generation out of the door while using the small chip rather than the big one. And I am not referring to money-grabbing, but to AMD already being half a year ahead at 7nm.

Also, the big chip is HUGE! The power it needs is beyond what the 12nm process can offer for mainstream consumption.
At 7nm it is more manageable, but that is a year or more away.
That's assuming NV can get their orders placed for 2019, as AMD and Apple seem to have fully booked TSMC's 7nm production for 2019 - AMD already uses TSMC for Zen 2, Navi (GPU & consoles) and Vega 20.
 