
NVIDIA ‘Ampere’ 8nm Graphics Cards

If the 3090 is so powerful, why would anyone need SLI? And why have Nvidia stopped SLI for the lower models? So: gimp the lower cards with less RAM, end SLI for all but the flagship card, and charge the earth for the privilege.
I'm waiting for the kick-ass grandchild of Hawaii and Tahiti before spending anything. The power budget and the ludicrously big cooler on Ampere suggest Jensen is shook.

Plenty of renderers take advantage of multiple GPUs; Octane Render and Redshift are a couple of examples. Time is money: if you can make that money back over a few projects, it makes sense.
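
To put rough numbers on the "time is money" point, a quick back-of-the-envelope break-even sketch; the card price, hours saved per project and hourly rate below are all made up, purely for illustration:

Code:
# Back-of-the-envelope payback estimate for a rendering GPU.
# Every number here is an illustrative assumption, not a real price or benchmark.
gpu_cost = 1400.0             # hypothetical card price (GBP)
hours_saved_per_project = 6   # assumed render time saved per project
hourly_rate = 40.0            # assumed billable rate (GBP/hour)

saving_per_project = hours_saved_per_project * hourly_rate
projects_to_break_even = gpu_cost / saving_per_project

print(f"Saving per project: £{saving_per_project:.0f}")         # £240
print(f"Projects to break even: {projects_to_break_even:.1f}")  # 5.8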
 
Someone's got a 3090

:p
 
The only saving grace for this will be if DLSS 3.0 is something that can be enabled across all games (i.e. a driver-level implementation) without the need for devs to code for it. DLSS 2.0 is really quite impressive and the performance jump is appreciable; the problem is that devs have to implement it.
 
The sad part I've noticed is that the lower end has completely stalled in terms of improvements. If you got a 480 early, you're still at that level overall in that price range. The mid-range is seeing some action, and the high end keeps getting more performance (albeit at a higher price), but the low end is just dead. Same tech, same prices, refreshed yearly.

Essentially it means that people wanting to spend less than $300 get to see nothing happen in their price range.

Surely this can't continue for another architectural generation? :(

$130 and 30-40% faster than the 1650; for the low end that's actually really good.

https://www.hardwaretimes.com/amd-l...00-30-40-faster-than-the-nvidia-gtx-1650-129/
 
I was wondering if this might be the thing that gets Nvidia 'over the hump', so to speak, in terms of performance increase. So the 3080 might, say, do 20% more than the 2080, but with DLSS 3.0 that could reliably be pushed to 60% or something like that. It's somewhat fudging the numbers, but when it comes to marketing I don't see why Nvidia wouldn't do it if they could. Remember how they opened the 2018 reveal by going on about the 2000 series being 10 times faster than the 1000 series? That was of course referring to RTX, but they hardly rushed to clarify the minimal traditional performance increase.
 
Of course it will. DLSS 3 is one of the things the marketing will be focused on, and how the price will be justified: "4K image quality for 1440p performance", only £1,500... I hope reviewers scrutinize that, because Nvidia have been known to compress textures, losing detail, to gain performance. They need to be comparing it against AMD's native image quality.
 
Whilst I am optimistic about DLSS, I do really want to see a full independent analysis of the technology which will, as you say, compare against AMD's native output and also compare across 1080p / 1440p / 4K. If the image is made worse by DLSS, then it becomes a question of whether the loss in quality is worth the performance increase. It is worth bearing in mind that the performance gained from DLSS might then be traded for more bells and whistles in the quality settings, such that you end up with better-quality graphics all things considered.
 
Right, DLSS is a great thing, take nothing away from it. The only thing that bothers me is if it's used to say that running native 1440p for 4K gaming is the new normal.

I run Radeon Image Sharpening, and it's vastly superior to the equivalent feature Nvidia had on my Pascal GPU. It's not quite DLSS 2, but comparisons have been made and you'd have to have the differences pointed out to you to see them. I run it in every game, all the time; it's just on globally and that's it. If I turn it off now to run native, everything looks blurry and washed out. I don't consider my 1440p games to be 4K now, because they aren't.
And this really is all DLSS is: complementary image enhancement, not a replacement for native resolution.
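
For what it's worth, the headline gains from rendering at 1440p and presenting at 4K mostly fall out of plain pixel arithmetic. A tiny sketch (this is just resolution maths, not a model of how DLSS or RIS actually reconstruct the image):

Code:
# Pixel-count comparison behind "render at 1440p, present at 4K".
res_1440p = 2560 * 1440   # 3,686,400 pixels
res_4k    = 3840 * 2160   # 8,294,400 pixels

print(f"4K has {res_4k / res_1440p:.2f}x the pixels of 1440p")               # 2.25x
print(f"Natively shaded fraction when upscaling: {res_1440p / res_4k:.0%}")  # 44%
# The other ~56% of the output pixels are inferred by the upscaler,
# which is where the big headline FPS numbers come from.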
 
You know you can buy a GPU on finance too.
Phones are fashion too I guess, but :/ to anyone financing a 3090.

edit: What I mean is I don't think it's as accepted. I'd certainly lol at paying interest on a 3090; I do at phones too, but sheeple will be sheeple. I've paid cash for my phones, at under £300 a time >_< a £1k iPhone/Samsung means nothing to me.
 
People get finance on things a lot cheaper than a flagship GPU. Amazon will offer a payment plan on stuff in the low hundreds. I think if you need to finance a GPU for anything other than a business, you shouldn't buy it in the first place. It's not exactly a necessary purchase if you're just gaming.
 
I am comfortable money-wise at my age, but found my credit score dropping because I had not bought anything on credit for a decade or more; I do not need to, and I have what I need. If something died and needed replacing, it was normally a cash purchase anyway.

So now, if a store accepts PayPal, I use PayPal Credit and pay it off in one go after four months. It helps maintain and grow my credit rating again, and the money is better off sitting in my bank for those four months.

Now they keep raising my limit, even though they've made no interest off me and never will. lol.
 
You talk with a serious amount of conviction about something that is a complete guess and fairy story at best!

Really?

So when the 3080 cooler leaked, where was the 3090? The 3080 cooler leaked on the 6th of June; the 3090 rumours began over two months after that, and the 3090 was not confirmed until about a week ago.

To take it all in one post: yes, the 2080 Ti came out very shortly after the 2080. However, it had to. The leap was nowhere near what it had been in the past (for good reason: it was a rushed, last-minute job), so the 2080, ray tracing aside, was only about as fast as what we already had. What high-end buyer would have bought it unless there was a 2080 Ti?

Historically, though, Nvidia did not do it like that for generations: 670, 680, Titan, then later the 780. Then 970, 980, later the Ti. 1070, 1080, later the Ti. And so on.
 

It's not that easy to just knock up a new product if you weren't planning it. I'm not sure I buy that line of thinking.
 
It's purely because, if they doubled it up to 20GB, it'd be too close to the 3090 and would steal sales from it.

It looks like the 3090 is only about 15-20% faster than the 3080 but costs 75% more. The additional VRAM is the only real justification for this.
That's exactly it. I wonder if 10GB would be exceeded at 1440p?

Must admit this is an ass move.
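
Taking the quoted figures at face value (and they are still rumours), a quick sketch of what they do to value for money:

Code:
# Rough FPS-per-pound comparison using the rumoured numbers quoted above.
perf_uplift  = 0.175   # assume ~17.5% faster than the 3080 (midpoint of 15-20%)
price_uplift = 0.75    # assume 75% more expensive, as quoted

value_vs_3080 = (1 + perf_uplift) / (1 + price_uplift)
print(f"FPS per pound relative to the 3080: {value_vs_3080:.0%}")  # ~67%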
 