First RTX 2070 review

Vega 64 is a strange one for me.
I see it in Joker's COD 4 benchmarks doing well, going toe to toe with the 2080, which is supposed to be at or above 1080 Ti performance.
Then I see Vega 64 in other charts only managing 1080 performance.

:confused:
 
I think the 2070 could be a great option: 1080 Ti performance for under £400... oh wait, no, that was last generation. Still, 1080 performance for a little under £600, amazing! Imagine being able to get that for the last two years :(
 
On day 1 of the product launch, certain cards will be far cheaper than £600, but it will literally be a one-day launch special price and ONLY on certain SKUs, so if you see one cheap and think that's not so bad, BUY IT. ;)
 
Not taking a jab at you Gibbo, but "Just buy it" (by Tom's Hardware) seems to be quickly becoming a slogan for the RTX cards, like the "Just do it" slogan for Nike :D
 
It's sad that a GTX 970/R9 390 segment card is now over £500 to buy. I expect the 2060 to be £300+ the way things are going. Enjoy paying through the nose, fanboys.
 
Everyone moans when the prices go up after a day or so. In short, it's not going to get cheaper after they go online; more likely prices will go up.
 
Yeah, appreciate you making it clear for people and giving them a heads-up. But no doubt people will still scream "retailer price gouging", despite you probably not even getting enough supply to meet demand.
 
Prices will be very low on some 2070s at launch, which will surprise a few. My advice is to grab one, as the margins will be very tight, and the prices may increase a few days later if later shipments don't come in a little cheaper.
 
but of course that is not strictly fair, as exchange rates play a massive factor and the GBP is far weaker now than it was when the 1080 Ti launched.

You are so wrong it is unbelievable. The USD was at $1.21 to £1 in March 2017 and it's now over $1.30. Next excuse, please :rolleyes:
 
Nope, when we put the 1080 Ti up for pre-order, which was before Brexit, the rate was around 1.50.
I remember it well because we pre-sold most of the stock; by the time the stock actually landed, the rate had plummeted to around 1.20, and on some lines we shipped at a loss, as we honour back orders irrespective of changes in landed cost, and the big rate swing resulted in us losing money on some lines.
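
To put some rough numbers on that swing (a sketch only: the $550 USD unit cost below is hypothetical, while the 1.50 and 1.20 rates are the ones quoted above):

```python
# Sketch of how a GBP/USD swing changes the landed cost of pre-sold stock.
# The $550 USD unit cost is hypothetical; the 1.50 and 1.20 rates are the
# ones quoted above for pre-order time vs when the stock actually landed.

usd_unit_cost = 550.0

for gbp_usd in (1.50, 1.20):
    landed_gbp = usd_unit_cost / gbp_usd
    print(f"£1 = ${gbp_usd:.2f} -> landed cost ≈ £{landed_gbp:.0f}")
```

Against stock pre-sold at a price set on the 1.50 rate, that is roughly a 25% rise in GBP cost, which is easily more than the margin on a graphics card.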
 
The GTX 1080 Ti was launched in March 2017, not in 2016. The USD rate was depressed for most of that year and only started hitting $1.32+ in Dec '17, which is where it is today.
 
My error, I am mixing up the 1080 and the 1080 Ti on the exchange rate then.

On the 1080 Ti, the reason is that its USD cost increased several months ago when memory prices rocketed, and it simply never came back down to the launch pricing in USD, as the memory is still more expensive.
 
It's mainly down to people benchmarking with the stock profiles on Vega cards. Performance is terrible at stock settings because the cards throttle so much. If you spend a couple of minutes changing from stock to a custom profile and undervolt a little, Vega cards perform much, much better. That's why @Poneros was complaining in his earlier post. The Asus ROG Strix they used in the review suffers more than other Vega cards from throttling, due to bad thermal pads.

Vega 64 easily outperforms a 1080 with a few simple tweaks, but at stock settings it struggles to even match the 1080.
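
For anyone wondering why a small undervolt makes such a difference, here is a back-of-the-envelope sketch using the standard CMOS dynamic-power approximation (P ∝ C·V²·f). The voltages below are illustrative only, not recommended settings for any particular card:

```python
# Why undervolting reduces throttling, to a first approximation:
# dynamic power scales with the square of voltage at a given clock
# (P ~ C * V^2 * f). The voltages here are illustrative, not tuned
# settings for any specific Vega card.

def relative_power(v_new: float, v_old: float) -> float:
    """Dynamic power relative to stock at the same clock."""
    return (v_new / v_old) ** 2

stock_v = 1.20   # hypothetical stock voltage (V)
tuned_v = 1.05   # hypothetical undervolted setting (V)

print(f"Power at the same clock: {relative_power(tuned_v, stock_v):.0%} of stock")
```

Roughly 23% less power at the same clock is headroom the card can spend holding its boost clock instead of throttling against its power and temperature limits.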
 
Sorry, but that is on AMD then, for not making the stock settings better. We are probably prepared to tinker to get every last drop out of our hardware (it is the nature of these forums), but a product should be judged on how 95% of users will use it, IMO, and that is by taking it out of the box and running it.

Sure, chat about overclocking etc. as a fun little extra, but ultimately the cards are only warrantied to do exactly what they say on the box, and that is how they should be judged.
 
Please don't be insulting. And you've been quick to defend the 1080 Ti at every opportunity, a card you own, I believe :). I believe my posts have been based on the evidence out there about the potential.
Did you not notice the underlining of the word "could"? I am only making a suggestion based on a "marketing" image on Nvidia's blog. It does, however, show 51 FPS vs the 1080 Ti's 41 FPS at 4K in one scenario with DLSS, roughly a 24% uplift. One has to assume it's true at this time and that the potential exists.
NV are of course putting a lot of faith in third parties to help get the performance.
Edit: As requested:
https://blogs.nvidia.com/blog/2018/10/12/deep-learning-turing-graphics/

I haven't insulted you; I have just pointed out that you have been living in denial since these cards were announced. When there was no mention of performance in normal games during the launch event, some people started worrying that the performance might not be great, but you were like, no, the performance will be amazing. Then, when the interview with Tom Petersen was released where he said what the performance was going to be, you had a ton of excuses for why that might not be the actual performance and why it could be much better. Then the reviews finally came out, and now it's, oh, when DLSS comes out in games these cards are going to rock. You, and others, have also said that these cards will perform much better once the drivers mature. I will come back to these two points later.

One game and one tech demo do not a card make. If this were an AMD card, you would laugh at me if I recommended purchasing one based on potential future performance from a marketing slide and one game.

Which leads on to driver maturity. Why are people hoping for a big boost in performance from drivers? Isn't this why people buy Nvidia? On countless driver threads people are always saying that they go Nvidia because they get 100% of the performance from the start. I am sure there will be the usual driver increases, a few per cent here and there in some games, especially new games. But those waiting for a miracle driver boost in normal games will be waiting a long time.

Now, DLSS. Before the reviews were released, DLSS was the one thing I was most excited about with the Turing cards. I defended DLSS in a discussion with @FoxEye and @bru. Myself and Bru both had high hopes, and FoxEye, well, was being FoxEye. But since the reviews, and with the extra info that has since come to light, I am beginning to think that FoxEye was right and myself and Bru were wrong.

I was really looking forward to the reviews because I wanted to see what DLSS was capable of. Not one game using DLSS in any review?? Does that not raise any red flags for you? Well, it did for me; I started having doubts. Then, when Hardware Unboxed did their analysis of everything they knew about DLSS, I had even more doubts: they concluded that running a game using DLSS is much the same as running it at 1800p. And of course there were other red flags, like why reviewers were not allowed to test it against other forms of AA. If the performance comes from actually running at a reduced resolution, just one where it's really hard to tell the difference from real 4K, is it a real performance boost? For people who want to run at the highest settings, DLSS then amounts to the same thing as turning down the settings that make very little visual difference in order to gain performance.
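
To put that 1800p comparison into numbers, a back-of-the-envelope sketch, assuming shading cost scales roughly linearly with the number of pixels rendered:

```python
# Pixel counts: 1800p vs native 4K. Assumes shading cost scales roughly
# linearly with pixels rendered, which is only a first-order approximation.

native_4k = 3840 * 2160   # 8,294,400 pixels
res_1800p = 3200 * 1800   # 5,760,000 pixels

ratio = res_1800p / native_4k
print(f"1800p renders {ratio:.0%} of the pixels of native 4K")        # ~69%
print(f"Upper bound on the speedup if pixel-bound: {1 / ratio:.2f}x") # ~1.44x
```

The 51 vs 41 FPS figure quoted earlier is about a 1.24x uplift, which sits comfortably inside that envelope, consistent with DLSS behaving like a lower internal resolution plus some reconstruction overhead.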

And it still worries me that there are no games using DLSS yet.

Oh, and you are wrong about one thing too: for DLSS, Nvidia aren't relying on third parties at all. The performance will be entirely down to Nvidia; developers simply decide whether they want it in their game or not. If you turn on DLSS in a game, you will be using Nvidia's algorithms, developed on their supercomputer and run on the Tensor cores on the card.
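
As a purely conceptual sketch of the render-low-then-reconstruct idea described above: the actual DLSS network is proprietary and runs on the Tensor cores, so the naive upscale below is only a stand-in for that learned step.

```python
import numpy as np

# Toy illustration of the render-low / reconstruct-high idea behind DLSS.
# The real reconstruction is a network trained offline by Nvidia and run
# on the Tensor cores; the nearest-neighbour upscale here is just a
# stand-in for that learned step.

def render_frame(height: int, width: int) -> np.ndarray:
    """Stand-in for the game's renderer: a random RGB frame."""
    return np.random.rand(height, width, 3)

def reconstruct(frame: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbour upscale, standing in for the learned model."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

low_res = render_frame(1080, 1920)   # render fewer pixels per frame
output = reconstruct(low_res, 2)     # "reconstruct" to the display resolution
print(output.shape)                  # (2160, 3840, 3)
```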

As for me defending my 1080 Ti? LOL, bizarre. In threads asking for advice between the 1080 Ti and the 2080, I am recommending the 1080 Ti based on what we actually know; you are recommending spending extra on the 2080 based on unknown future performance in two techs that are only going to be used in a very small number of games.
 
That's hilarious. AMD fans were saying the same BS when Vega was released: the drivers will mature, the card will improve to 1080 Ti levels... absolute nonsense. Selective memory from AMD fans.

However, in this case the card will improve with DLSS and drivers. We'll get the RTX cores turned on and a Windows update to enable them. There are physical cores on these cards that haven't been activated yet, rather than software driver promises that never materialise.
