
The RT Related Games, Benchmarks, Software, Etc Thread.

Status
Not open for further replies.
But yes, the main things impacting the latency are the lack of Reflex, low FPS, and not having dedicated/premium servers. What latency was being recorded with the free tier on your system?
Did not have the overlay on to see it and I would need to wait another couple of hours in a queue to retest :cry:

I did mess about with LAN streaming and, to be honest, 1440p 120Hz did feel pretty good. I tried 2160p, but the old 3080 probably can't encode 4K at 120Hz, so the cap was about 60fps - it did not feel good!

As long as you live close enough to a datacentre and can stream at 120fps, it is probably not bad actually!

Some Moonlight stats:
[Moonlight stats screenshots: 0QJm2eX.jpg, dih9Dld.jpg]
 

Yeah, that's the biggest thing impacting latency - there's a big difference between a 120Hz and a 60Hz stream.

Did you try with Reflex on and off? Also, I'd be curious to see what Nvidia's overlay shows for render latency, although it may not work with the stream.

I imagine Nvidia probably has some other things happening in the background to help mitigate the latency.
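A quick way to see why the stream's frame rate matters so much: each streamed frame has to be rendered, encoded, transmitted, and decoded, and the frame interval alone nearly doubles going from 120Hz to 60Hz. A rough sketch, where the per-stage costs are illustrative round numbers (not measured figures from this thread):

```python
# Illustrative latency-budget comparison for a 60Hz vs 120Hz stream.
# Only the frame interval changes with refresh rate here; the encode,
# network and decode costs are made-up round numbers for comparison.
def stream_latency_ms(fps, encode_ms=5.0, network_ms=5.0, decode_ms=3.0):
    frame_interval_ms = 1000.0 / fps
    return frame_interval_ms + encode_ms + network_ms + decode_ms

print(f"60Hz:  ~{stream_latency_ms(60):.1f} ms")   # ~29.7 ms
print(f"120Hz: ~{stream_latency_ms(120):.1f} ms")  # ~21.3 ms
```

Whatever the true per-stage numbers are, the ~8ms frame-interval gap between 60Hz and 120Hz carries straight through to the total.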
 
Quite interesting too @stooeh - the 4080s used for GeForce NOW actually have 24GB of VRAM!

Right. So now, once in a blue moon when you run out of your 10GB of VRAM, you can pay £18 to get 24GB of VRAM.

The question is, how many months of it can you get with the money you saved by going for a 10GB 3080 instead of a 24GB 3090?
 

If my maths is right... essentially 3 years and 6 months from the saving of not going for a 3090 over a 3080 at MSRP :p :cry: :D Money well spent :p
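That figure roughly checks out, assuming UK Founders Edition MSRPs of about £649 (3080) and £1,399 (3090) - the prices are my assumption, not stated in the thread - and the £18/month tier:

```python
# Sanity check of the "months of cloud gaming from the 3080 saving" maths.
# Assumed figures: UK FE MSRPs of £649 (3080) and £1,399 (3090),
# and an £18/month premium-tier subscription.
msrp_3080 = 649
msrp_3090 = 1399
monthly_fee = 18

saving = msrp_3090 - msrp_3080   # £750 kept in pocket
months = saving / monthly_fee    # how many months the saving covers

print(f"Saving: £{saving}, covers {months:.1f} months (~{months / 12:.1f} years)")
# Saving: £750, covers 41.7 months (~3.5 years)
```

About 42 months, i.e. the 3 years 6 months quoted above.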



Honestly, if the game library improves, I'd be pretty tempted to just pay for this in the meantime and even forgo buying a 50xx (if silly priced too), as it is that good IMO. The thing I like the most is how my PC isn't used at all - zero fan mode FTW :D

This guy's YouTube has some pretty good side-by-side comparisons with his local 3080:


@stooeh

 


Yeah, but can you mine on your 4080? Bet you can't do that :cry:

In all honesty though, the above shows why the 3080 FE was the much better option if price-to-performance matters to one at all.

For those to whom price-to-performance never mattered, they upgraded to a 4090 and will upgrade to a 5090 too.
 

Wouldn't be surprised if someone has found a way to do it :cry: I'm considering spinning up a VM in our cloud at work just to assign some beastly power and put games on it. I'm sure no one would notice the increase in billing... :p

Exactly - if you're buying the halo card, all this "future proofing" nonsense is just silly, nothing more than trying to justify the extra cost. Can't blame Nvidia for what they did this time round; the 4090 is a lot more justifiable (and if they wish, they can do a 4080 Ti...). But as you said, all those saying the 4090 will be a keeper for 3+ years are just deluding themselves :D If you can afford to jump from a 3090 to a 4090, chances are you're going to make the jump to a 5090 too.
 

If Nvidia don't release until late 2025, then by default the 4090 will have lasted them 3 years :cry: :cry: :cry: :cry: Future proofing right there.
 

Turns out, you can even sometimes get 48GB VRAM! :cry:


Nvidia uses the L40 for their 4080-tier GPU. L40s only consume 300 watts and are designed to be used in racks.
As others have implied, actual 4080s are not used - rather, data-center variants of the GPUs. They have more VRAM because they are designed to be virtualized (into multiple vGPUs). Within GFN, if you land on a rig type ending in 60 (such as 3060), that's a vGPU which is one half of the physical GPU.
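The partitioning described above could be sketched like this. The 48GB board total matches the L40 mentioned earlier; the idea that a lower rig tier sees a fixed fraction of the board is taken from the "tiers ending in 60 are half a physical GPU" comment, and the helper name is purely illustrative:

```python
# Hypothetical sketch of how a physical data-center board might be
# carved into GFN rig tiers via vGPU. Figures are assumptions based
# on the post above, not Nvidia's actual partitioning scheme.
PHYSICAL_VRAM_GB = 48  # e.g. an L40 board

def vgpu_vram_gb(fraction_of_board):
    """VRAM a virtualized slice of the board would see."""
    return PHYSICAL_VRAM_GB * fraction_of_board

print(vgpu_vram_gb(1.0))  # full board: 48.0 GB (the "48GB VRAM" sighting)
print(vgpu_vram_gb(0.5))  # half board: 24.0 GB (a "...60"-tier rig)
```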
 