NVIDIA RTX 50 SERIES - Technical/General Discussion

We will see, I guess. I think launch supply was bad because they had issues and finished the cards late. They were all rushed out just before launch, which could mean we'll see steadier supply soon. They should be relatively happy with the MSRP, so they do want to sell them. There isn't really huge demand from non-gamers.
The 3000 series was different: because of all the supply issues after launch, the lower MSRP meant the AIBs didn't really want to make or sell the cheap cards. Huge demand from miners who outbid gamers.
 
This is the first thing I am planning to do! Pretty good result so far. Same performance as stock at lower power is a must for this gen. Have you monitored the idle power usage? My 4090 goes to like 4W, which is crazy, and I have seen these 5090s at like 50-60W idle.
Not looked at idle yet, I'll have a nose though.
My 5080 is sitting at 5-6W at idle.
Anyone quoting 50-60W at idle must mean entire-system power, not just the GPU.

Edit: oddly, TechPowerUp show the 5080 as low at the start of their test, then a big spike for a few seconds, then it settles down to the same level as their multi-monitor test, and they quote an average of 20W. I've left it sitting on the desktop while I'm typing this on my phone, and after 10 minutes it hasn't deviated from 5-7W, so it looks to me like their system "did something" with the GPU while it was running the test.
Loading and unloading a web browser with YouTube, it briefly spikes to 14W and then goes to 7-9W. Running a video (MP4) it's at 9-11W - TechPowerUp state 21W.

I managed to get it to spike to 50W by loading and unloading multiple browser windows and an MP4 at the same time.

2nd edit: I've managed to get it to idle at 20W by having everything loaded in the background - Epic store, Steam store, browser minimised, etc. I don't normally have them running on boot.
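For anyone wanting to check their own idle numbers rather than eyeball an overlay, here's a minimal sketch that polls board power via nvidia-smi (assumes nvidia-smi is on PATH; the query flags are standard, but the helper names are just illustrative):

```python
import subprocess
import time

def parse_power(line: str) -> float:
    """Parse one nvidia-smi power reading, e.g. '5.87' -> 5.87 watts."""
    return float(line.strip())

def sample_idle_power(samples: int = 10, interval: float = 1.0) -> list:
    """Poll GPU board power draw and return a list of readings in watts."""
    readings = []
    for _ in range(samples):
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=power.draw",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        readings.append(parse_power(out.splitlines()[0]))
        time.sleep(interval)
    return readings
```

Ten one-second samples while sat at the desktop should make it obvious whether you're in the 5-7W camp or the card genuinely idles high, and it sidesteps anyone confusing GPU draw with entire-system draw.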
 
I think the Gamerock looks great! However, it has small fans so will probably run hot.
I actually ended up cancelling. It turns out Palit will not warranty the card if you tamper with it, and reinstalling the original cooler doesn't count. I just can't risk that. So I'm either getting an FE and forgoing liquid cooling this time, or waiting for a different model that would allow RMAs to come down to a reasonable price.
 
I actually ended up cancelling. It turns out Palit will not warranty the card if you tamper with it, and reinstalling the original cooler doesn't count. I just can't risk that. So I'm either getting an FE and forgoing liquid cooling this time, or waiting for a different model that would allow RMAs to come down to a reasonable price.
That's annoying. Asus don't void your warranty for putting a WB on, only if you damage it... JayzTwoCents mentioned it in the recent Astral LC video and I found a few articles this morning confirming that. I've gone for the TUF and an Alphacool block. Both on pre-order.
 
The worst launch I can remember. I managed to get a 3080 card back in the day with less hassle!

At this point of price scalping (not just by scalpers either - retail places are applying demand-surge pricing too, for sure) I'm probably just waiting now and will see how the 9070 XT stacks up. If it's a bit less (perf and price) then I'll probably take one if there is stock.

The 5080 is a disappointment on many levels, and the 16GB, especially on a card going for anywhere between 1K and 1.4K, is a let-down for a card that might need to last another 4 years.
 
I actually ended up cancelling. It turns out Palit will not warranty the card if you tamper with it, and reinstalling the original cooler doesn't count. I just can't risk that. So I'm either getting an FE and forgoing liquid cooling this time, or waiting for a different model that would allow RMAs to come down to a reasonable price.
There is hope for an FE waterblock. It will be interesting, but as long as something launches it will be possible. Bykski/Obelik have one in development, and I think der8auer under the Thermal Grizzly brand is also looking at it.
 
The amount of time people here spend going on about not having enough VRAM. Yet in every recently released game I have played, I never have issues.

STALKER 2 - No issues
Silent Hill 2 - No issues
God of War Ragnarok - No issues
Final Fantasy 7 Rebirth - No issues
Kingdom Come Deliverance 2 - No issues

All at 4K too with my measly 12GB.

I was even thinking of upgrading to a 5070 Ti but will skip it. 12GB will be fine for me, as it will take me about a year or maybe two just to complete the games above :cry:

Edit - Fixed typo. Not sure why it said you at the start of the sentence. Lol.
99% of people who claim cards don't have enough VRAM don't realise that many games will allocate as much VRAM as a card has; it just won't all be used.
I never had any VRAM issues while gaming at 4K on a 3080, yet apparently my 5080 has
"nowhere near enough VRAM for 4K and is only a 1440p card at a push"
according to some reviewers, lmao.

One reviewer showed that it can get 130fps in Kingdom Come 2 at Ultra settings at 4K... and called it poor performance. WTAF are these guys smoking?
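On the allocation-vs-use point: nvidia-smi's memory.used counter reports allocated VRAM, not what a game actively touches each frame, which is exactly why big numbers get misread. A small sketch that prints the allocation per GPU (assumes nvidia-smi on PATH; function names are made up for illustration):

```python
import subprocess

def parse_mem(csv_line: str) -> tuple:
    """Parse one 'used, total' CSV line in MiB, e.g. '9216, 12288'."""
    used, total = (int(v.strip()) for v in csv_line.split(","))
    return used, total

def vram_report() -> None:
    """Print allocated vs total VRAM for each GPU in the system."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    for line in out.splitlines():
        used, total = parse_mem(line)
        # 'memory.used' is what has been *allocated*, not what the game
        # actively touches each frame, so a near-full number alone does
        # not prove the card is short of memory.
        print(f"{used} / {total} MiB allocated ({100 * used / total:.0f}%)")
```

Stutter and texture pop-in under load are better evidence of a real shortfall than a near-full allocation figure on its own.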
 
Yup, VRAM 'needs' beyond 16GB are totally overhyped/overplayed. Yet to hear of anyone who actually struggles with this in a way that's 'game breaking'.
 
99% of people who claim cards don't have enough VRAM don't realise that many games will allocate as much VRAM as a card has; it just won't all be used.
I never had any VRAM issues while gaming at 4K on a 3080, yet apparently my 5080 has
"nowhere near enough VRAM for 4K and is only a 1440p card at a push"
according to some reviewers, lmao.

One reviewer showed that it can get 130fps in Kingdom Come 2 at Ultra settings at 4K... and called it poor performance. WTAF are these guys smoking?
What are we considering VRAM issues? I've had to drop textures down to medium or even low to reduce stuttering on my 3080. It's also definitely more susceptible to any memory leaks that games have, as the smaller pool fills up faster. Now, you can argue these are game faults and I'm not going to disagree, but they're not going to go away either.
 
Managed to get the card fired back up. Been having a laff with it. I thought I would never say this, but I actually like the 5080. I have always hated the xx80 series.

Hated the 1080, loved the 1080 Ti. Hated both the 2080 and 2080 Ti. Hated the 3080 but loved the 6800 XT and 3080 Ti. Hated the 4080 but loved the 4070 Ti Super instead. And strangely, this is the first xx80 I have liked in a few years. I think the perception of the card comes down to these two versions: there is the stock 5080, which is only marginally faster than the 4080/Super and costs a bomb, or there is the overclocked version, which still costs a bomb.

I mean, you still get the same 16GB VRAM and the price will still leave a bitter taste in your mouth, but I am fine with it. How fast is the overclocked version compared to stock? Well, there is a lot of overclocking headroom on both the core and the memory. How fast is fast? I would love to tell you that it is faster than a 4090, but it's not. It does start to close the gap fast, though. In gaming (I don't use frame gen) it's not that far off what I got from my 4090, and benching mirrors those findings.

Of course this comes down to whether you are willing to overclock or not, and while I believe most 5080s will clock well, I wouldn't want to make a sweeping statement that all 5080s clock super well. So if you just want an easy, no-fuss overclock, go +250 on the core and +1000 on the memory. Feeling adventurous? Go +350 on the core and +1500 on the memory. And if you are really part of the IDGAF crew, go +400 and above on the core and +2000 on the memory.

I mean, of course people have already made up their minds on this card, but I would suggest keeping an open mind. If the 5090 is out of your budget and you can't find a 4090 at a reasonable price, give the 5080 a go - if you don't like it, you can always return it. As much as this may be an unpopular opinion, I like it (the overclocked version).

So in 3DMark I get 30899 with my 4090. And with the 5080... 29958...
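Those three tiers, written down as a small lookup (these are the offsets suggested above for a 5080, not guaranteed-safe values for every card - validate with a stress test):

```python
# The three overclock tiers described above, as (core_offset_mhz,
# mem_offset_mhz) pairs. These are one 5080 owner's suggestions; every
# card bins differently, so step up gradually and stress test each tier.
OC_TIERS = {
    "easy":        (250, 1000),
    "adventurous": (350, 1500),
    "idgaf":       (400, 2000),
}

def pick_tier(name: str) -> tuple:
    """Look up the (core, mem) offsets for a named tier."""
    try:
        return OC_TIERS[name.lower()]
    except KeyError:
        raise ValueError(f"unknown tier {name!r}; pick from {sorted(OC_TIERS)}")
```

Whatever tool actually applies the offsets, moving up one tier at a time and backing off at the first artefact or crash is the low-drama route.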
 
Managed to get the card fired back up. Been having a laff with it. I thought I would never say this, but I actually like the 5080. I have always hated the xx80 series.

Hated the 1080, loved the 1080 Ti. Hated both the 2080 and 2080 Ti. Hated the 3080 but loved the 6800 XT and 3080 Ti. Hated the 4080 but loved the 4070 Ti Super instead. And strangely, this is the first xx80 I have liked in a few years. I think the perception of the card comes down to these two versions: there is the stock 5080, which is only marginally faster than the 4080/Super and costs a bomb, or there is the overclocked version, which still costs a bomb.

I mean, you still get the same 16GB VRAM and the price will still leave a bitter taste in your mouth, but I am fine with it. How fast is the overclocked version compared to stock? Well, there is a lot of overclocking headroom on both the core and the memory. How fast is fast? I would love to tell you that it is faster than a 4090, but it's not. It does start to close the gap fast, though. In gaming (I don't use frame gen) it's not that far off what I got from my 4090, and benching mirrors those findings.

Of course this comes down to whether you are willing to overclock or not, and while I believe most 5080s will clock well, I wouldn't want to make a sweeping statement that all 5080s clock super well. So if you just want an easy, no-fuss overclock, go +250 on the core and +1000 on the memory. Feeling adventurous? Go +350 on the core and +1500 on the memory. And if you are really part of the IDGAF crew, go +400 and above on the core and +2000 on the memory.

I mean, of course people have already made up their minds on this card, but I would suggest keeping an open mind. If the 5090 is out of your budget and you can't find a 4090 at a reasonable price, give the 5080 a go - if you don't like it, you can always return it. As much as this may be an unpopular opinion, I like it (the overclocked version).
You were having the black screen issue, right? How did you get around that? I'm worried I might have an issue, and my motherboard doesn't support onboard graphics. I'd have to get a PCIe 2.0 GPU and riser to get into the BIOS if it bricked, as the second PCIe slot is blocked by water cooling.
 
You were having the black screen issue, right? How did you get around that? I'm worried I might have an issue, and my motherboard doesn't support onboard graphics. I'd have to get a PCIe 2.0 GPU and riser to get into the BIOS if it bricked, as the second PCIe slot is blocked by water cooling.
Yes, for a while I couldn't get video output, but I have a spare graphics card so I slotted that in. Completely nuked the Nvidia install. Disconnected the PC from the internet so Windows couldn't redownload a driver, as you need 572.16 to recognise and fire up 5xxx series cards. Used the command line to reset all the clocks, as my 5080 got stuck at 2287MHz, DDU'd everything, put my 5080 back in and installed 572.16 fresh.

Why would you need to get into the BIOS? The PCIe 5.0 bug? A few peeps have had this.
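The clock-reset part of that recovery can be scripted; here's a sketch assuming nvidia-smi is on PATH and an elevated prompt (DDU and the 572.16 install remain manual steps, and the function names are just illustrative):

```python
import subprocess

def build_reset_commands() -> list:
    """Commands that clear stuck clock locks (long forms of -rgc/-rmc)."""
    return [
        ["nvidia-smi", "--reset-gpu-clocks"],
        ["nvidia-smi", "--reset-memory-clocks"],
    ]

def reset_clocks(dry_run: bool = True) -> None:
    """Run the clock resets; a real run needs an elevated/admin prompt."""
    for cmd in build_reset_commands():
        if dry_run:
            print("would run:", " ".join(cmd))
        else:
            subprocess.run(cmd, check=True)
    # After this: disconnect from the internet, run DDU, reseat the
    # card, then install 572.16 fresh.
```

Running it with dry_run=True first just prints the commands, which is handy when you're doing this half-blind on a spare card.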
 
99% of people who claim cards don't have enough VRAM don't realise that many games will allocate as much VRAM as a card has; it just won't all be used.
I never had any VRAM issues while gaming at 4K on a 3080, yet apparently my 5080 has
"nowhere near enough VRAM for 4K and is only a 1440p card at a push"
according to some reviewers, lmao.

One reviewer showed that it can get 130fps in Kingdom Come 2 at Ultra settings at 4K... and called it poor performance. WTAF are these guys smoking?
You didn't play Far Cry 6 on your 3080 then? I can tell you it was a stuttering mess on my old 3080 at 4K when it bounced around 10GB VRAM usage, and was unplayable, and that was a 2021 title. It's only going to get worse as GPU performance increases and higher-quality textures are used.
 