
NVIDIA ‘Ampere’ 8nm Graphics Cards

Yeah, so you might not want to use ARK as an example of RT looking bad - it was announced that it would have DLSS, but from what I can find they never delivered on that, and RT was never even planned. The ARK devs are notorious over-promisers and under-deliverers. So if you're seeing RT "graininess" in ARK or ARK screenshots, it's something else.

Control I got about half an hour into and then just decided "Meh", so I have little idea what it looks like. I did play Wolfenstein Youngblood with RT and DLSS activated. Can't say I noticed any grainy weirdness there. Either way I'm not sure it's right to dismiss RTX as "grainy" on the strength of one game.

I didn't. I showed you Ark, even though I haven't played it, Control was worse and Metro does it too.

I bought Wolfenstein YB but it was so stupid I turned it off. Might try that, because I am sure that was before I got an RTX card. BTW, BFV does it too.
 
Are you just saying 'minimum 850w', or has this been reported somewhere?

If that's the case, the new 850w PSU I bought in March could be up for sale.

Whoah, slow down man. It is "recommended" that you use an 850w PSU, just like when Nvidia recommended a 750w PSU for the GTX 480. It doesn't mean you need one. Let's face it, anyone who knows how PSUs work knows it's not a good idea to pull more than 50% of a PSU's rated power (for efficiency, heat and noise), but many would get away with a 650w on this if they had the connectors for it.

However, do bear in mind that this is about the 3080, not the 3090. That could be a whole different animal.
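That 50% loading rule of thumb is easy to sanity-check with rough numbers. A minimal sketch, assuming an illustrative ~320 W GPU and ~250 W for the rest of the system (both figures are assumptions for the example, not official specs):

```python
# Rough PSU headroom sketch with illustrative numbers.
# The idea: keep sustained draw at or below ~50% of the PSU's rating
# for the best efficiency, heat and noise behaviour.

def psu_load(gpu_w, rest_of_system_w, psu_rating_w):
    """Return total system draw and its share of the PSU rating."""
    total = gpu_w + rest_of_system_w
    return total, total / psu_rating_w

total, share = psu_load(320, 250, 850)
print(f"{total} W total, {share:.0%} of an 850 W unit")  # 570 W, 67%

total, share = psu_load(320, 250, 650)
print(f"{total} W total, {share:.0%} of a 650 W unit")   # 570 W, 88%
```

Even with these pessimistic numbers, an 850 W unit sits well under its rating, and a 650 W unit works too - it just loses the 50% comfort margin.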
 
It won't be another Fermi. Firstly, it looks like it has a decent cooler. Secondly, it will be really, really fast, not just 8% faster than the 5870 with FSAA cranked. It also isn't delayed, and there is plenty of 20-series stock on the shelves (unlike Fermi, where there were no 200-series Nvidia cards left for 11 months before Fermi launched).

It probably will use tons of power, but it should run cool and it will be fast. So whoever was looking to buy these won't care anyway.
 
I didn't. I showed you Ark, even though I haven't played it, Control was worse and Metro does it too.

Right, but you showed it to me as an example of poor RTX performance when it doesn't even have RT!
That's all I'm saying here. And if you're seeing what you think of as "RT noise" in non-RT games, then I question whether it's RT causing it?

It doesn't really matter I guess, it will improve with time, some games will use it, others won't, like any graphics tech.
 
That's in large part why the prices are much higher now - if people buy every 2 years then they need to pay twice as much to get same annual revenues (if margin is the same).
Only if the majority upgraded every single time new cards were released. I don't think that's actually the case.
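The quoted revenue argument does work out arithmetically, at least under its own assumptions. A toy sketch with hypothetical prices and buyer counts (none of these figures are real):

```python
# Sketch of the upgrade-cadence revenue argument (hypothetical numbers).
# If the average owner upgrades half as often, the price must double
# to keep annual revenue flat (assuming margin and buyer count hold).

def annual_revenue(price, years_between_upgrades, buyers):
    """Revenue per year from `buyers` each paying `price` every N years."""
    return price * buyers / years_between_upgrades

yearly   = annual_revenue(500, 1, 1000)    # upgrade every year at $500
biennial = annual_revenue(1000, 2, 1000)   # every 2 years at $1000
print(yearly, biennial)  # 500000.0 500000.0
```

Of course, as the reply points out, the model only holds if most owners actually upgraded every cycle in the first place.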
 
Depends on your definition of delayed, we used to get GPU upgrades every 6 to 12 months. :)

Used to, being past tense. We've had AMD and Vega (lol), we've had a good mid-range Navi (but nothing else), and Nvidia have had issues with Samsung, then Covid, then everything else. Even if we had got Ampere as planned two years ago I highly doubt they would have moved any faster. That is what happens when you are winning, unless you want them to do an Intel and release 5% faster products at higher prices? Maybe in the past we were spoiled, but that seems to be over. There were two years between the 1080 and 2080 too, dude, so maybe your maths is a little off and you are going too far back?

Right, but you showed it to me as an example of poor RTX performance when it doesn't even have RT!
That's all I'm saying here. And if you're seeing what you think of as "RT noise" in non-RT games, then I question whether it's RT causing it?

It doesn't really matter I guess, it will improve with time, some games will use it, others won't, like any graphics tech.

I wasn't talking about performance dude. That was OK. Frame rates were great at 1080p with the 2070S.

I haven't seen RT noise in any game not using RT either, so I am not sure where you got that from. I said I tried W YB before I got RT (so I played it without it) and it was fine.
 
If it says 850w on the box then you can guarantee they are just being ultra-cautious to cover themselves, and I bet lower-rated, high-quality gold PSUs (650w & 750w) will be fine.

If you take a 3090 at 350-400w, that only leaves 250w for the rest of the system before maxing out a 650w PSU.

Cutting it fine if you also have a 3950x in the system or a 10900K or whatever Intel's latest and greatest nuclear furnace is called :P
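The headroom arithmetic above can be sketched quickly (the 350-400 W figures are the rumoured draw quoted in this thread, not confirmed specs):

```python
# What a 650 W PSU leaves for the rest of the box at the rumoured
# 3090 draw. Purely illustrative arithmetic.
psu_w = 650
for gpu_w in (350, 400):
    rest_w = psu_w - gpu_w
    print(f"GPU at {gpu_w} W leaves {rest_w} W for CPU, board, RAM, drives, fans")
```

At the top of that range, 250 W has to cover everything else, which is why a high-TDP CPU makes it tight.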
 
What is the point of this connector vs the existing 2 x 6/8-pin design?
If it was a new more compact design I could understand it.

Nvidia can make it look like their Ampere cards only need 2x6 pin in the marketing material.

The difference in perception of something so simple could be huge in the minds of consumers.

We see 3x8 pin, as shown recently on an AIB version, and automatically think hot & loud.

But we see what looks like 2x6 pin and think nothing of it :P
 
If you take a 3090 at 350-400w, that only leaves 250w for the rest of the system before maxing out a 650w PSU.

Cutting it fine if you also have a 3950x in the system or a 10900K or whatever Intel's latest and greatest nuclear furnace is called :p

The 3090 is going to be a whole different animal to the 3080. The 3080 is going to be smaller, restrained and so on. The 3090 is going to be the no-limits "We have to win to maintain the pricing structure" card.

That said, whoever spends £1400+ on a GPU won't care about having to possibly buy a new PSU. That's small change isn't it?

I run a 1550w Platimax in my TR 1920x/2070 water rig. Do I need it? No. However, when I bought it, it was in mint condition and I paid £70. I do have lots of gear in there (four AC computers that control everything, pumps, loads of fans, a 7" touch panel, etc.) but it is still probably twice what I need. Which is good, because it never makes a sound.
 
Russia testing old power supplies suitable for Ampere overclocking, apparently the new cable can be made to fit
 
There were two years between the 1080 and 2080 too dude, so maybe your maths is a little off and you are going too far back?

Errr... I know. :confused: How am I off? We were discussing Fermi; you said it was delayed. Fermi was a loooooooooooong-ass time ago (March 2010), and I said it depends on the definition of delayed.

I consider two years per release to be a while, and using the Intel CPU thing as a benchmark is hardly the same, since they released 5% faster products at the same prices. ;)

2600K - $317 - Q1 '11
3770K - $342 - Q2 '12
4770K - $350 - Q2 '13
4790K - $350 - Q2 '14
6700K - $339 - Q3 '15
7700K - $339 - Q1 '17
 
The 3090 is going to be a whole different animal to the 3080. The 3080 is going to be smaller, restrained and so on. The 3090 is going to be the no-limits "We have to win to maintain the pricing structure" card.

That said, whoever spends £1400+ on a GPU won't care about having to possibly buy a new PSU. That's small change isn't it?

I run a 1550w Platimax in my TR 1920x/2070 water rig. Do I need it? No. However, when I bought it, it was in mint condition and I paid £70. I do have lots of gear in there (four AC computers that control everything, pumps, loads of fans, a 7" touch panel, etc.) but it is still probably twice what I need. Which is good, because it never makes a sound.

Well, if the leaks/rumours turn out to be true, the 3080 isn't far behind the 3090 in terms of power consumption, using 320w.

Overclocking that would likely send the power rocketing, assuming it has any headroom.

We will have to wait and see what is released but I suspect Jensen will gloss over wattage if it's as bad as they say and we will have to wait on reviews to tell us the facts :P
 
Errr... I know. :confused: How am I off? We were discussing Fermi; you said it was delayed. Fermi was a loooooooooooong-ass time ago (March 2010), and I said it depends on the definition of delayed.

I consider two years per release to be a while, and using the Intel CPU thing as a benchmark is hardly the same, since they released 5% faster products at the same prices. ;)

2600K - $317 - Q1 '11
3770K - $342 - Q2 '12
4770K - $350 - Q2 '13
4790K - $350 - Q2 '14
6700K - $339 - Q3 '15
7700K - $339 - Q1 '17

Those CPUs crept up in price quite dramatically. Especially the i5. In fact, at one point they were asking £185 for an i3. Then along came Ryzen and they were slapped back to reality.

Then again, even if they were the same price forever it wouldn't matter, would it? Who in their right mind would change their CPU, and quite probably their board, and lose a load of money for 5%? By the time you waited for 20% you were waiting years anyway.

If you are going to declare a product is Fermi you have to take everything into account, dude. Like, you know, the 580 and the 590 and all of that, which were hailed as excellent cards.

I've said it over and over, no one will care about the power use if it's very fast.
 
Those CPUs crept up in price quite dramatically.

You are thinking of the $:£ exchange rate, not the MSRPs. I posted them in the spoiler tags for you.

Then again, even if they were the same price forever it wouldn't matter, would it? Who in their right mind would change their CPU, and quite probably their board, and lose a load of money for 5%? By the time you waited for 20% you were waiting years anyway.

You used it as a comparison, not me, I said it was hardly the same. ;)

Then along came Ryzen and they were slapped back to reality.

Thank goodness for AMD, huh? :D
 
I wasn't talking about performance dude. That was OK. Frame rates were great at 1080p with the 2070S.

I wasn't talking about performance either. I'm not sure why you're bringing that up now; I was saying that raytracing effects are likely to improve in future.

I haven't seen RT noise in any game not using RT either, so I am not sure where you got that from. I said I tried W YB before I got RT (so I played it without it) and it was fine.

But you did post a screenshot of an example of what you think of as RT noise, from a game that hasn't even got RT support.
If that 'noise' in the ARK screenshot is the sort of thing that leads you to believe RT is no good at present, then perhaps it's not the raytraced effects you're noticing *at all*.
 
So what's the 3080 likely to need? Guess my 650w PSU might not cut it... an annoying time to have to buy a new PSU if that's the case, since all the ones I've seen have gone above their usual price.

It's times like this when you should always consider Mr AMD. Unless you have a new G-Sync display there is no reason to stay chained to Nvidia, assuming AMD offer you similar performance (hint: theirs won't be as power hungry; the past two gens have been addressing that). ;)
 