*** NVIDIA GEFORCE RTX 3080 SERIES STOCK SITUATION - NO COMPETITOR DISCUSSION ***

Associate
Joined
2 Dec 2020
Posts
68
Alright guys, thanks. Maybe I'll stream at 936, we'll see. I tried OBS once and it just looked too complicated, and I thought screw this and used Streamlabs OBS instead - it just looks more simple.

Edit: also, when using Streamlabs OBS it doesn't give me an option of 936, it's either 1080p or 720p. Also, there are two things - base canvas resolution and output scaled resolution - which I have both set to 1080p as I have no idea what they mean.

To elaborate on this - WARNING: long nerd post

When others talk about streaming at sub-1080p to accommodate users with slower internet connections, what they're actually referring to is the bitrate you've chosen in output. 6000kbps is still 6000kbps (as far as someone's internet connection is concerned) regardless of whether that's 720p or 1080p. People generally consider 2500-3500kbps to be more accessible bitrates, until you have enough consistency as an affiliate to get transcoding, at which point you can jump to 6000kbps (or the non-advertised 'unofficial' cap of 8000kbps including audio, which has been available for two years with no public acknowledgment by Twitch).

Even then, 8000kbps isn't, uh. It's not a lot. Even on x264 slow (for streamers running two computers) that's still not enough ones and zeroes going around for good 1080p 60fps streams. I load up a big streamer and I see artifacting and blocking all over the place (because spoiler: Twitch is primarily about the person, and people are generally more forgiving of bad video than they are of bad audio, despite how much I go on in this post lmao). Even at 8000kbps, video spods would recommend streaming at 720p 60fps or 1080p 30fps. When you're drawing fewer pixels per second (whether that's via lower frame rate, lower resolution, or both) there are more bits that can be dedicated to each individual pixel, so at 720p, especially for fast-moving games at 60fps, the whole image will look smoother. 1080p would mostly be recommended if you were streaming something like programming, writing, art, video editing or Dungeons & Dragons, where there's not a lot of motion but text might want finer detail. 936 is often floated as a 'happy medium', though that's hotly debated; at least 936 divides evenly by 8, and resolutions that line up with the encoder's block sizes are a little sharper than those that don't, because H.264 works on 16x16 macroblocks (split into smaller 8x8 and 4x4 blocks).
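
(If you want to sanity-check that arithmetic yourself, here's a rough Python sketch of the bits-per-pixel sums. The 1664x936 width for '936p' is my assumption for a 16:9 frame; the 8000kbps figure is the Twitch cap discussed above.)

```python
# Back-of-the-envelope: how many bits does each pixel get per frame
# at a fixed bitrate? Fewer pixels per second = more bits per pixel.

RESOLUTIONS = {
    "1280x720": (1280, 720),
    "1664x936": (1664, 936),   # assumed 16:9 width for '936p'
    "1920x1080": (1920, 1080),
}

def bits_per_pixel(width: int, height: int, fps: int, bitrate_kbps: int) -> float:
    """Average bits available per pixel per frame at a given bitrate."""
    return (bitrate_kbps * 1000) / (width * height * fps)

for name, (w, h) in RESOLUTIONS.items():
    for fps in (30, 60):
        bpp = bits_per_pixel(w, h, fps, 8000)
        print(f"{name} @ {fps}fps, 8000kbps -> {bpp:.3f} bits/pixel")
```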

What's more, a default view of Twitch (not fullscreen, not theatre mode, chat open, follow list open) on a 1080p monitor (at least, on Vivaldi, a Chromium-based browser) only shows the video at 732p (jumping to 805p on a 1440p monitor), so there's limited benefit to streaming above 720p. Additionally, streamers who play games at 1440p or 4K will have their inputs scale cleanly down to 720p, because 720 divides evenly into both 1440 and 2160, so the encoder has to make fewer approximations - it just goes 'sweet, I can copy every second or third pixel' respectively, which can improve quality an extra pinch. This is why I have an admittedly oddball stream resolution of 1536x864 - I technically have a bit less bitrate efficiency than 720p, but within my layout the extra screenspace is relatively low-motion and my 1440p game inputs scale down evenly into that 1280x720 window. EDIT: 4K resolutions do scale down evenly into 1080p, but you're still playing at the crummy end of bitrate efficiency at 1080p unless you run 30fps.
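
(Here's the same divisibility logic as a quick sketch - nothing OBS-specific, just the maths behind 'scales cleanly' and the mod-16 point from earlier:)

```python
# Does the game resolution divide into the stream resolution by a
# whole number? If yes, the downscale is a clean pixel-skip.

def scales_cleanly(src: tuple[int, int], dst: tuple[int, int]) -> bool:
    """True if the source divides into the destination evenly on both axes."""
    return src[0] % dst[0] == 0 and src[1] % dst[1] == 0

pairs = [
    ((2560, 1440), (1280, 720)),   # 1440p -> 720p: factor 2, clean
    ((3840, 2160), (1280, 720)),   # 4K -> 720p: factor 3, clean
    ((3840, 2160), (1920, 1080)),  # 4K -> 1080p: factor 2, clean
    ((2560, 1440), (1920, 1080)),  # 1440p -> 1080p: factor 1.33, NOT clean
]
for src, dst in pairs:
    print(src, "->", dst, "clean" if scales_cleanly(src, dst) else "not clean")

# And the block-alignment check: 1536x864 is mod-16 clean, 1664x936 isn't.
for w, h in [(1280, 720), (1536, 864), (1664, 936)]:
    print(f"{w}x{h} mod-16 clean: {w % 16 == 0 and h % 16 == 0}")
```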

For others who just have their game fullscreened and their other stuff (metrics / facecam etc.) on top, it's thankfully less complicated. The 'Base (Canvas) Resolution' should generally be set to your desktop resolution (again, for beginners) and your 'Output (Scaled) Resolution' should be the resolution you're outputting to Twitch. If you want something other than 1080p or 720p for that, you just need to click in the box and type it out manually - that's assuming that Streamlabs works the same as 'normal' OBS, mind.

That's unless you want to record locally at full resolution with a higher bitrate and stream at a lower one, in which case you're rescaling output in the 'Streaming' tab of the Output menu - or you're doing some Very Niche Nerdy Crap like me with two instances of OBS, or the Source Record plugin (only on normal OBS).
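
(For the terminally curious: the 'record big, stream small' idea isn't OBS-exclusive. Here's a hedged sketch of the same thing with ffmpeg driven from Python - the file names, CRF value and stream key are placeholders, and the 6000kbps figure just mirrors the Twitch cap above:)

```python
# One input, two outputs: a full-resolution local recording plus a
# rescaled 720p stream. In ffmpeg, output options apply to the output
# file that follows them, so each output gets its own encoder settings.
import subprocess

cmd = [
    "ffmpeg", "-i", "capture.mkv",
    # Output 1: local recording at full resolution, high-quality CRF.
    "-c:v", "libx264", "-preset", "slow", "-crf", "18",
    "-c:a", "copy",
    "recording.mkv",
    # Output 2: rescaled 720p stream at a Twitch-friendly bitrate.
    "-vf", "scale=1280:720",
    "-c:v", "libx264", "-preset", "veryfast",
    "-b:v", "6000k", "-maxrate", "6000k", "-bufsize", "12000k",
    "-c:a", "aac", "-b:a", "160k",
    "-f", "flv", "rtmp://live.twitch.tv/app/YOUR_STREAM_KEY",  # placeholder key
]
subprocess.run(cmd, check=True)
```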

I, uh, I hope I haven't caused more problems than I've solved with this around-the-houses explanation.

WARNING 2: SHIFTING INTO MAXIMUM OVERDRIVE TURBO NERD ABOUT THE FUTURE OF ENCODERS AND I MIGHT BE WRONG ON A COUPLE THINGS

Twitch seems dead set on not increasing the bitrate cap, which is really, really low compared to YouTube's streaming cap - that's, iirc, somewhere around 50,000kbps for 4K video, and admittedly some Video Nerds would still thumb their noses at YouTube's recommendations. 50,000 would be around what I use to locally record 1080p60 via quantization parameters, and even then I know some Video Nerds who are into archiving footage who feel that's nothing. YouTube also allows you, on a technical level, to stream video encoded in H.265 / HEVC (High Efficiency Video Coding), which gets more bang for your buck, so 8000kbps H.265 video looks a lot better than 8000kbps H.264 video (I converted a ton of H.264 to H.265 using CQP settings to maintain the video quality and shrunk the footage on my internal storage drive by 42%!). Buuuuut HEVC is the subject of some absolutely ridiculous licensing and patents, which means you are possibly open to being sued by its patent holders for broadcasting HEVC-encoded content without a licence - which is also why it isn't available as a preset in OBS. To my knowledge, Netflix uses HEVC, but of course they've paid the licence fee as a distributor. Because YouTube transcodes everything you send them, streaming HEVC would place you as a distributor and consequently liable. As a result, Twitch are running away from HEVC as fast as possible.
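
(A sketch of the kind of H.264-to-H.265 shrink I mentioned - I used NVENC CQP, but libx265's CRF mode is the nearest software equivalent; the file names and CRF value here are placeholders, not my exact settings:)

```python
# Re-encode an H.264 file to H.265 at roughly constant quality.
# CRF ~22 on libx265 typically lands well below the H.264 file size.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "gameplay_h264.mp4",
    "-c:v", "libx265", "-preset", "medium", "-crf", "22",
    "-c:a", "copy",       # leave the audio track untouched
    "-tag:v", "hvc1",     # helps some players recognise HEVC in MP4
    "gameplay_h265.mp4",
], check=True)
```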

They are instead looking at AV1, an upcoming open-source video coding format which is being developed to compete with HEVC without its absurd licensing. One of Twitch's own employees has a demonstration video up which shows AV1-encoded 1440p 120fps video running at 8,000kbps, and squeezing that much quality out of what's a 'low' bitrate by H.264 standards is the stuff of my wildest dreams - you don't need to spend money on more server space if you're just compressing better video into the same space!
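
(If you want to poke at AV1 efficiency yourself today, ffmpeg's SVT-AV1 encoder will do it offline - a sketch, assuming a reasonably recent ffmpeg build with libsvtav1; the preset/CRF values and file names are placeholders, and expect it to run well below real time:)

```python
# Offline AV1 encode via SVT-AV1. Preset runs 0 (slowest, best) to 13
# (fastest); CRF ~35 is a common starting point for streaming-like quality.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "clip_1440p60.mkv",
    "-c:v", "libsvtav1", "-preset", "8", "-crf", "35",
    "-c:a", "libopus", "-b:a", "128k",
    "output_av1.mkv",
], check=True)
```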

And this is where we circle back to GPUs. AV1 video can be decoded in software reasonably well, but higher-quality videos (e.g. 8K resolution) can make even monster CPUs suffer, whereas 30XX-series Nvidia cards can decode two 8K AV1 streams smooth as butter. Encoding, on the other hand, is what we're waiting for: AV1 encoding isn't there yet. The Twitch video was encoded slower than real time (so it might take, say, 1 minute 30 seconds to encode 1 minute of AV1 video), and you can't stream if you can't encode in real time like you currently can with x264 and H.264 NVENC. It's possible that in four or five years' time the eventual 50XX series will have real-time AV1 encoding on NVENC, but nothing's guaranteed. It's a chicken-and-egg scenario: you won't get better encoding until there's demand for it, and there won't be demand for encoding until there's AV1 content to decode - and decode support is now available on the 30XX (since launch) and AMD's 6000 series (via a recent GPU driver update). AV1 on NVENC is an inevitability - Nvidia have a post talking up AV1. AMD will likely be a bit further behind; they don't really target the Creativity Dabbler market (the way Nvidia has Studio Drivers on consumer-grade cards) and AMF encoding on their GPUs is pretty rough.
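
(The 'can I encode in real time?' question is easy to test yourself: encode a clip of known length and compare wall-clock time against clip time. A rough sketch - the 60-second duration, encoder settings and file name are assumptions for illustration:)

```python
# Time an encode of a fixed-length clip; anything below 1.0x real time
# means the encoder can't keep up with a live stream at these settings.
import subprocess
import time

CLIP = "test_clip.mkv"
CLIP_SECONDS = 60.0  # known duration of the test clip

start = time.monotonic()
subprocess.run([
    "ffmpeg", "-y", "-i", CLIP,
    "-c:v", "libsvtav1", "-preset", "6", "-crf", "35",
    "-an", "-f", "null", "-",   # discard output; we only care about speed
], check=True)
elapsed = time.monotonic() - start

speed = CLIP_SECONDS / elapsed
print(f"Encoded at {speed:.2f}x real time ->",
      "streamable" if speed >= 1.0 else "too slow to stream")
```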

But when that happens, that's pretty much going to be quality issues on Twitch solved forever, and I won't have to write out a post like the first half of this one ever again. Damn my weird backseating perfectionism!
 
Last edited:
Associate
Joined
2 Dec 2020
Posts
68
How many years will the 3080 keep doing 4K 144Hz in games, or 1440p 144Hz?

How long's a piece of string? Depends what you're playing.

For modern graphics-intensive games - I mean, your AAA tentpole blockbuster releases that are built for PC and console alike - even going back a few years, 4K 144Hz is just not going to happen. Even maintaining 1440p 120fps on a 3080 is a rarity now for these games. Games like Jedi Fallen Order or Yakuza 7 at max can scratch that but regularly dip below, and that assumes you aren't multitasking. For the higher end of framerates in those games you'll have to stick to 1080p or take a chunky settings hit. It's only older games or competitive fast-paced titles made to run quickly on relatively low hardware over long lifespans (CSGO, League, Overwatch) that you'll see hit those 120/144/240Hz marks consistently - Overwatch hits 400fps at 4K low settings and averages 155 at 4K ultra.

If you mainly play single-player or big-budget titles, I imagine you'll find yourself returning to 60fps pretty soon unless you're willing to take a settings hit - I just pulled up a Necromunda benchmark and the card is maxing itself out to hit 60fps at 4K max settings, and if you throw RTX into the mix, well, be prepared to play at 30fps at 4K.

EDIT: I'm aware you asked about 144Hz but my frame of reference is mostly 120Hz stuff because of weird little things about video encoding regularly needing evenly divisible figures so that's the number I've familiarised myself with, sorry!
 
Last edited:
Associate
Joined
9 Apr 2021
Posts
280
To elaborate on this - WARNING: long nerd post [...]
Thanks, I've been using Streamlabs OBS now instead of Twitch Studio as it works properly. Changed my settings and my stream is good. My monitor is 4K 144Hz, so my desktop res I assume is 4K. Been streaming some Apex recently, but I play that at 1440p instead of 4K as 144fps is more consistent. I have my base canvas set to 1080p and my output for the stream is 1080p 60fps, although I've lowered it to 936p 60fps to take stress off my PC. I haven't tried other games yet, as I play those at 4K - for example Fortnite and Warzone.
 
Associate
Joined
9 Apr 2021
Posts
280
How long's a piece of string? Depends what you're playing. [...]
I don't mind single-player games being at 4K 60fps, because more fps is not really needed for those types of games (e.g. Cyberpunk, Far Cry 6, Assassin's Creed Valhalla). I'm more concerned about battle royale or PvP games like Fortnite, Apex Legends, Call of Duty etc.
 
Associate
Joined
13 Oct 2020
Posts
472
Location
Plymouth
@martind Nice to see you still loitering on these forums. How's the new card performing?
Hi mate, three words: bloody love it! I had an EVGA 1080 Ti before this and that was no slouch, but the 3080 is in another league! Working my way through various games at the min and have been no less impressed. Really hoping you get yours soon - hang in there, all the best! :p Can't keep away from the forum; addiction is a terrible thing!
 
Soldato
Joined
21 Jan 2010
Posts
3,517
There have been multiple drops today. I got mine in the first drop and now it's looking very promising!!

Since the big Chinese crackdown on crypto, stock has improved for all cards. Lots of posts recently about people getting various FE cards.

Those conspiracy theories about direct selling to large mining operations may have been at least partially true.

Not having a go at miners here, before someone steams into me!
 
Associate
Joined
8 May 2021
Posts
20
Location
England
Since the big Chinese crackdown on crypto, stock has improved for all cards. [...]
I also think there were a lot of people stealing the products to sell themselves, along with selling to the miners.
 
Associate
Joined
27 Nov 2020
Posts
58
Noticed a 3080 Zotac on resell for £1,200 today, which is the lowest price I've seen for a while.

Seems like the situation is improving, which is good news indeed.

Another reminder for those who are tempted: don't buy scalped cards at this time, so we encourage further price drops overall. Hopefully we can get below £1k before September.
 
Associate
Joined
29 Mar 2021
Posts
43
Location
Dublin
Noticed a 3080 Zotac on resell for £1,200 today, which is the lowest price I've seen for a while. [...]


Hopefully this trend continues and it actually becomes possible to get hold of cards for a sane price.
 
Associate
Joined
10 Oct 2020
Posts
380
Noticed a 3080 Zotac on resell for £1,200 today, which is the lowest price I've seen for a while. [...]

Indeed, it's improving. I don't mind paying £1,000 for a 3080, but not a penny above.
 
Soldato
Joined
10 Apr 2011
Posts
3,741
Location
London
So how long do we think before these cards are back to retail prices and availability - 2-3 months?

Fortunately I have managed to get a 3080 for myself, but I have a lot of friends who are keen on the 3060 Ti.
 