
Blackwell GPUs

So we're in September now... What's the latest surrounding the 5090/5080 release date? Is it still looking likely to be next year now?
There's an outside chance we get a 5080/5090 announcement in the next fortnight and a launch in November. But I think at this point, the suggestion of a January announcement and March launch is probably the best case scenario. Rumours swirl of manufacturing delays prompted by the AI side.
 
There's an outside chance we get a 5080/5090 announcement in the next fortnight and a launch in November. But I think at this point, the suggestion of a January announcement and March launch is probably the best case scenario. Rumours swirl of manufacturing delays prompted by the AI side.

March onwards would suit me perfectly.
 

5080 will be 10% faster than 4090.

5090 600W TDP :eek:
5080 400W TDP :eek:
 
Don't care when they're released or at what price.

I just want an end to this tedious nonsense of everything selling out in the first 3 seconds of launch (looking at you, Founders Editions) and then having to jump through hoops every time there's a resupply.

There should be no excuses for stock issues now.

Even though the 5090 is likely to be going on £2000, I think you'd get a good chunk of that back: after 2.5 years I reckon you'd lose less than £1 a day. Not terrible value, so if you can get one on day one... it's tempting.

You could even sell it after 18 months and make even more back...
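Quick sanity check on the maths (a rough sketch; the resale prices below are pure guesses, not market data):

```python
# Depreciation-per-day sketch for the scenario above. The purchase
# price is the rumoured ballpark from this thread; the resale values
# are pure guesses for illustration.
purchase_gbp = 2000.0
scenarios = [(18, 1500.0), (30, 1200.0)]  # (months held, assumed resale price in GBP)

for months, resale in scenarios:
    days = months * 30.4  # average days per month
    loss_per_day = (purchase_gbp - resale) / days
    print(f"Sell after {months} months at £{resale:.0f}: ~£{loss_per_day:.2f}/day")
```

Both scenarios come out under £1 a day, so the claim holds as long as resale prices stay anywhere near those guesses.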
 

5080 will be 10% faster than 4090.

5090 600W TDP :eek:
5080 400W TDP :eek:

The 5080 power consumption isn't any higher than Intel's CPUs, what's the problem? :D
 

5080 will be 10% faster than 4090.

5090 600W TDP :eek:
5080 400W TDP :eek:
One of these with an Intel CPU and you're looking at almost 1000W; that's an insane amount of heat to deal with, especially in summer. They should advertise it as the Sauna Edition with a free water scooper and some rocks.
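For fun, a back-of-the-envelope tally of the sauna build (every figure here is a rumoured or assumed peak, not a measurement):

```python
# Back-of-the-envelope full-load wall draw for the "Sauna Edition" build.
# Every figure is a rumoured or assumed peak, not a measurement.
components_w = {
    "5090 (rumoured TDP)": 600,
    "High-end Intel CPU (peak)": 250,
    "Board/RAM/fans/storage": 75,
    "PSU losses (~10%)": 90,
}
print(f"Estimated full-load wall draw: ~{sum(components_w.values())} W")  # ~1015 W
```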
 
One of these with an Intel CPU and you're looking at almost 1000W; that's an insane amount of heat to deal with, especially in summer. They should advertise it as the Sauna Edition with a free water scooper and some rocks.

My 4090 and 7950X are already closing in on 1000W, though that's power at the wall and includes the monitor, speakers and a router. The room has new AC though, so it will be fine in summer.
 
600W is fine. Besides, it won't always be running at full load, depending on the game and whether you let the frame rate run uncapped. My 4090 draws more than my 3090 did under full load, but if I run at my usual cap of 4K 120/144fps and run the same games I did previously, I'm actually drawing less power on my 4090 to maintain the same frame rate, as it's only running at 50 to 70% load rather than the 100% load the 3090 needed, plus it runs cooler because of it. If you want ridiculously fast performance then it's going to take a decent amount of power; nothing comes for free.
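As a toy illustration of why the faster card can draw less at the same cap (the wattages and load figures are assumptions, and real power scaling isn't actually linear):

```python
# Toy model: board power roughly tracks GPU load at a fixed fps cap.
# Wattages and load figures below are illustrative assumptions, and
# real power does not scale perfectly linearly with utilisation.
cards = {
    "3090": {"board_power_w": 350, "load_at_cap": 1.00},  # ran flat out at the cap
    "4090": {"board_power_w": 450, "load_at_cap": 0.60},  # ~50-70% load at the same cap
}
for name, c in cards.items():
    est_w = c["board_power_w"] * c["load_at_cap"]
    print(f"{name}: ~{est_w:.0f} W at a 4K 120 fps cap ({c['load_at_cap']:.0%} load)")
```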
 

5080 will be 10% faster than 4090.

5090 600W TDP :eek:
5080 400W TDP :eek:



4090 = 100% performance at 450W TDP

5080 = 110% performance at 400W TDP

Using TPU's aggregate 4K data: 4090 = 152fps.

Therefore:

4090 = 152 / 450 = 0.34 fps per watt
5080 = 152*1.1 / 400 = 0.42 fps per watt

That would make Blackwell roughly 24% more efficient than Ada Lovelace. So the 5080 must be a significantly larger GPU than the 4080 - it must have significantly more cores.

This isn't an ideal comparison - it would be better if both GPUs were at the same power draw - but it's what this data allows.
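The same calculation as a sketch, for anyone who wants to rerun it with different rumours:

```python
# Reproducing the fps-per-watt comparison above. 152 fps is TPU's
# aggregate 4K average for the 4090; the 5080 figures (+10% performance,
# 400 W TDP) are rumours, not confirmed specs.
fps_4090, tdp_4090 = 152.0, 450.0
fps_5080, tdp_5080 = fps_4090 * 1.10, 400.0

eff_4090 = fps_4090 / tdp_4090  # ~0.34 fps per watt
eff_5080 = fps_5080 / tdp_5080  # ~0.42 fps per watt
print(f"4090: {eff_4090:.2f} fps/W, 5080: {eff_5080:.2f} fps/W, "
      f"gain: {eff_5080 / eff_4090 - 1:.0%}")  # ~24%
```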
 
My 4090 and 7950X are already closing in on 1000W, though that's power at the wall and includes the monitor, speakers and a router. The room has new AC though, so it will be fine in summer.
Not everybody has an air conditioner during summer. My computer heats the room up considerably when gaming, even with both windows open, and it's usually only pulling around 350W for the GPU.
 
Even with DLSS Performance at 4K, which is equivalent to DLSS Quality at 3440x1440 IIRC, the performance is still poor on a 4090 in those 2 titles.

Optimisation issues are when fps drops for no good reason, VRAM is hogged for no good reason, or the CPU is not being properly utilised. Black Myth has optimisation issues, as highlighted in DF/Alex's video, same with Star Wars Outlaws; still better optimised than the likes of TLOU on release day, but that doesn't excuse them.
I have Black Myth, and it runs on high settings at 3K resolution with DLSS on a 4060 mobile (gaming laptop) with a Ryzen 7 7000H-series at around 80fps. No hardware RT, granted, but that's still well optimised in my book, given how well it runs considering how much weaker that machine is compared to my PC. And I have definitely not seen anything even close to 60fps (it's way higher) on my 4090 at 1440p UW resolution with the highest possible in-game settings, though I've not seen the whole game yet. I expected it to run worse.
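On the resolution equivalence mentioned above, a quick check using the standard DLSS scale factors (Quality is about 66.7% per axis, Performance is 50%):

```python
# Internal render resolutions for the comparison above, using the
# standard DLSS scale factors (Quality ~66.7% per axis, Performance 50%).
def internal_res(width, height, scale):
    return int(width * scale), int(height * scale)

perf_4k = internal_res(3840, 2160, 0.5)       # (1920, 1080)
quality_uw = internal_res(3440, 1440, 2 / 3)  # (2293, 960)

for label, (w, h) in [("4K Performance", perf_4k), ("3440x1440 Quality", quality_uw)]:
    print(f"{label}: {w}x{h} = {w * h:,} px")
# ~2.07M vs ~2.20M pixels - within about 6%, so close enough to call equivalent.
```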
 
It can fit 2 frames for the price of 1. But only if you have a 5000 series ;)
Current FG latency increase can already be felt relatively easily in faster titles with lower input fps. Add more and it'll quickly become unplayable. That said, there's already software that can do even 3x, so it's easy to test as is - it's not a great feeling, quality aside.
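A rough illustration of why low input fps makes it so noticeable: interpolation has to hold back the next real frame before it can generate anything in between, so the added delay is at least one real frame-time, however many frames get inserted (a simplification; real pipelines add more overhead on top):

```python
# Crude model: interpolation-based frame generation must buffer the
# next real frame before generating in-between frames, so it adds at
# least one real frame-time of latency regardless of how many frames
# are generated. (A simplification; real pipelines add further overhead.)
for input_fps in (30, 60, 120):
    frame_time_ms = 1000 / input_fps
    print(f"{input_fps} fps input: ~{frame_time_ms:.1f} ms minimum added latency")
```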
 
Current FG latency increase can already be felt relatively easily in faster titles with lower input fps. Add more and it'll quickly become unplayable. That said, there's already software that can do even 3x, so it's easy to test as is - it's not a great feeling, quality aside.

Not with magic Nvidia tech. They will make it work so it is 2 for the price of 1. The cost to us? Need to buy a 5000 series card :p

Honestly though, I thought it was obvious my comment was tongue in cheek. Who bloody knows what we will get or how it will work.
 
Not everybody has an air conditioner during summer. My computer heats the room up considerably when gaming, even with both windows open, and it's usually only pulling around 350W for the GPU.
An air conditioner uses power to move heat out of the flat, so with one you pay twice over: once for the power the GPU burns into the room, then again for the power the aircon uses to pump that heat outside. With 300W average for the GPU in current games, 100W for the CPU and another 100W for things like the monitor, we already have a space heater, and in a mild UK summer it's often unbearable, which stops me from gaming in a well-insulated flat. Now imagine a 600W card with all the other stuff and we have almost a 1kW space heater in summer - my room would shoot up to 40C easily, I suspect (27C+ is currently normal with all windows open and 20C outside). There's no way I'd ever buy such a GPU, no matter how fast, without installing aircon first.
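Rough numbers for that scenario (component wattages as described above; the aircon efficiency is a typical assumption, not a measurement):

```python
# Rough steady-state heat load for the hypothetical 600W-card setup.
# Component wattages are the post's estimates; COP = 3 is a typical
# split-unit air conditioner assumption, not a measured value.
heat_w = {"GPU (rumoured 600W card)": 600, "CPU": 100, "monitor/other": 100}
total_heat = sum(heat_w.values())  # 800 W dumped into the room
cop = 3.0                          # aircon coefficient of performance
aircon_w = total_heat / cop        # electricity needed to pump the heat out
print(f"Heat into room: {total_heat} W")
print(f"Aircon draw to remove it: ~{aircon_w:.0f} W")
print(f"Total from the wall: ~{total_heat + aircon_w:.0f} W")
```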
 
Not with magic Nvidia tech. They will make it work so it is 2 for the price of 1. The cost to us? Need to buy a 5000 series card :p

Honestly though, I thought it was obvious my comment was tongue in cheek. Who bloody knows what we will get or how it will work.
I know you are not very serious about it, but enough people here are, and it seemed like a good enough time to give my thoughts on such "improvements". :) In other words, not interested. And considering AI is number 1 for Nvidia these days, I suspect they will show more AI stuff.
 
Current FG latency increase can already be felt relatively easily in faster titles with lower input fps. Add more and it'll quickly become unplayable. That said, there's already software that can do even 3x, so it's easy to test as is - it's not a great feeling, quality aside.


There's software currently being worked on that can generate 4 frames for every 1 real frame. I'll never use it though; current Nvidia and AMD frame gen is not appealing to me. Yes, there is the latency, but the latency is not too bad if you're only generating one fake frame for each real frame; the real issue for me is the smearing/blurring that frame gen creates.

Being a gamer in 2024 consists of: TAA blurring, DLAA smearing, frame gen smearing, DLSS/FSR low internal resolution, tons of post-processing junk like film grain, and then the cherry on top is developers using PS3-quality textures while demanding you have 16GB of VRAM.

Combine all of these "features" together and you get a modern AAAA $100 game
 