• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Geforce now ultimate - most impressive!

Here's a comparison I made of the image quality of GFN Ultimate compared to my RTX 3080 gaming PC in a demanding stress test (a game with lots of vegetation):


Just to note, I love GFN, and the image quality is much better than that of other cloud gaming services, but there is still work to do, as you can see particularly in the zoomed-in shots.

Thanks for this. I'm playing that now, and it's made me a bit happier to stay with my 3080fe!!
 
Thank you and I'm glad you found it useful.

I find it odd how some games look better than others. Alan Wake 2 with pathtracing looks extremely close to a local PC to my eyes (despite having lots of vegetation) and Cyberpunk is pretty close too, but Avatar, Baldur's Gate 3 and the Talos Principle 2 have much more obvious issues with blur and compression artefacts.
 
So I finally got access to try this and bought a month of Ultimate to try out. Now, to be clear, I used to have a 3090 and recently upgraded to a 7900 XTX.

After the 20 minutes I've played on one game I play a lot (ETS2/ATS) and one quick competitive game of Century, I love it, even on my poor net connection right now (35MB/s with 5% packet loss).

It was running 3440x1440 at 60/120fps with no issues (set custom) and I didn't feel like I could see any differences. I think the balanced/competitive settings do have downgrades, but I'd be very tempted to use GeForce Now full time.

I mostly play old games or story games nowadays and I hate a noisy room (I think even 200rpm 180mm fans are noisy).

So if I could run a NUC and game for the next 10 years, that'd be pretty awesome.

I'd be very tempted to sell my gaming PC and buy a low-end PC, or just sell the 7900 XTX and buy an old 2060 (in case my net goes down).
 
It's quite impressive and better than expected. I can see Nvidia working on some sort of AI that would run client-side to increase the image quality of the received stream and compensate for the fact it isn't running locally.
 
Ignoring the fact I am useless at the game, here is an example of the streaming quality:


Bear in mind YouTube does reduce the quality a bit, but for the most part it's not that much different from native with DLSS.

For the best quality it's important to make sure you are streaming with AV1, which it will show in the sidebar.
 
Decided to give GeForce Now a crack after re-aligning my entire setup at home.

Initial impressions weren't great with the free tier, tbh. The wait time was, well... I gave up after 45 mins, having only moved up 80 places from ~230. The one time I did actually get past the queue stage, it timed out due to the slow-as-molasses sync via Microsoft Game Pass. I am not going to fully blame this on GeForce Now, as I have seen similarly slow times when using the Game Pass streaming app itself.

Upgrading to the Ultimate tier, though, is a completely different experience. Over WiFi on my MacBook, Sniper Elite 5 had a little bit of stutter, especially in the audio, but that soon cleared up and it was quite honestly a great experience (well, ignoring the trackpad, and only running at 1080p as the bizarre resolution of the Mac is not an option).

I need to get it set up via Ethernet and on my 1440p UW G-Sync monitor to really give it a proper test.

The main test will be a direct comparison between the Ultimate tier (4080) and Parsec into a Win 11 VM on my server running 16 Zen 2 threads and a 4070. More powerful hardware vs theoretically better latency... it will be very interesting to see the difference, tbh.

EDIT: Completely forgot to mention the connection I am running on. I am on symmetrical 900Mb/s fibre with a WiFi 6E router.
 
Still on WiFi but now got the Mac hooked up to my 1440p UW monitor.

Once I changed servers manually to one that didn't show any packet loss in the connection test (EU Northwest instead of the EU West it auto-chose?!), it was waaaaaaay better. Still somewhat blocky in certain scenes, but I will reserve judgement until I test it via Ethernet (that is a whole dongle scenario and I am too tired to test that out right now).

Also I could just run the game natively on the Ally + 3080 XG Mobile but where is the fun in that?! :p

Quick shot of it in "action":


 
For some reason I have to use a UDP VPN (Proton is free) to connect to the GeForce Now servers; otherwise the latency is much higher (50-60ms with spikes over 100ms). With the VPN, the latency is consistently 17-18ms.

Apparently, some ISPs don't route traffic efficiently to Nvidia's servers.

Proton's free VPN seems to suffer packet loss above around 20Mb/s (the server defaults to EU Northeast), which limits me to 1600x900 resolution (60 FPS), but it does work smoothly playing like this.

I probably won't pay for GeForce Now until I change to an ISP that provides playable latency on their servers.
 
Which ISP are you with so I can avoid them? :cry:
 
Hopefully over time, Nvidia will add more server locations (like Birmingham, Manchester, maybe Glasgow and/or Edinburgh as well).

I doubt the latency is good in Scotland, if the closest server is London...
 
Variable Refresh Rate Support Comes to NVIDIA’s GeForce Now Cloud Streaming Service



by Matthew Connatser on March 7, 2024 7:00 PM EST

Today NVIDIA has brought variable refresh rate support to its GeForce Now cloud gaming service. The company initially promised variable refresh support on GeForce Now back in early January during CES, and has seemingly waited so that it could launch alongside GeForce Now Day Passes, which are also now available.


Variable refresh rate (VRR) technologies, including NVIDIA's own G-Sync, have been around for about a decade now, and allow a monitor to synchronize its refresh rate to the instantaneous framerate of a game. This synchronization prevents screen tearing, which occurs when parts of two or more frames are present on a display at the same time. Without a VRR technology, gamers either have to tolerate the visual incongruity of screen tearing or enable V-Sync, which solves screen tearing by locking the framerate to the refresh rate (or a fraction thereof). VRR became popular because V-Sync added latency and could depress framerates, as it effectively acts as a framerate limiter.
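The latency point above can be shown with some back-of-the-envelope arithmetic. This is a purely illustrative Python sketch (my own simplification, not how NVIDIA implements anything): with V-Sync, a finished frame waits for the next fixed refresh boundary, so effective frame times are quantized to multiples of the refresh period, while with VRR the display simply refreshes when the frame is ready.

```python
import math

REFRESH_HZ = 60
REFRESH_MS = 1000 / REFRESH_HZ  # one 60 Hz refresh window, ~16.67 ms

def vsync_display_time(render_ms):
    """With V-Sync, a frame waits for the next fixed refresh boundary,
    so display times are quantized to multiples of the refresh period."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def vrr_display_time(render_ms):
    """With VRR, the display refreshes as soon as the frame is ready
    (within the panel's supported refresh range)."""
    return render_ms

# A frame that takes 17 ms to render just misses the 60 Hz window:
print(vsync_display_time(17))  # ~33.33 ms: it waits a full extra refresh
print(vrr_display_time(17))    # 17 ms: shown immediately
```

This is also why V-Sync acts like a framerate limiter: a game averaging 17 ms per frame gets pinned to every other refresh (~30 FPS effective) instead of running at ~59 FPS.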


Dubbed "Cloud G-Sync," the feature promises not only a screen-tearing-free experience on GeForce Now, but also lower latency thanks to "varying the stream rate to the client, driving down total latency on Reflex-enabled games." Prior to VRR's debut on GeForce Now, users either had to enable V-Sync in-game, enable a stream-level V-Sync setting that had the benefit of not locking the game framerate, or accept screen tearing. GeForce Now Ultimate members will also be able to pair VRR with Reflex-powered 60 FPS and 120 FPS streaming modes.



According to NVIDIA’s technical documentation, variable refresh rate support on GeForce Now can work with both Mac and Windows PCs hooked up to a VRR-capable monitor. This includes G-Sync monitors on Windows, as well as VESA AdaptiveSync/FreeSync monitors, HDMI 2.1 VRR displays, and even Apple ProMotion displays, such as the panels built into their recent MacBook Pro laptops. The biggest compatibility hurdle at this time is actually on the GPU side of matters; Windows machines need an NVIDIA GPU to use VRR with GeForce Now. Intel and AMD GPUs are "not supported at this time."



Although G-Sync originally came out in 2013 and GeForce Now has been available since 2015, the two never intersected until now. It's not clear why NVIDIA waited so long to bring G-Sync to GeForce Now; the company's original announcement merely states that "newly improved cloud G-SYNC technology goes even further," implying that it wasn't possible before but not exactly explaining why.


Does not work on AMD and Intel GPUs in Windows ("not supported at this time"... AKA waiting for backlash before maybe adding some minor support), BUT works fine on Apple iMacs, even ones with the AMD Radeon Pro cards that gamers will not own... :rolleyes:
 
Classic Nvidia!
 

More Nvidia comedy.... Not April 1st yet but they are trying hard early...




Still want to rent your GPUs from this company? :cry:
 
I'm not sure why this is such a big issue and why you're getting so offended by it. If you don't like it, simply don't pay for it?

Do you expect them to allow people to use their tech/server centres/4080s for free? And is it really surprising that they don't want their competition using their tech? Perhaps AMD and Intel or other cloud streaming services can come up with their own solutions to provide competition and thus force Nvidia to lower its prices and/or expand its support?







Some more comparisons from one of our members on here:



 
@Purgatory

Can you not discuss the points or answer the questions raised, rather than going off on one and resorting to childish remarks (something you have given out about before)?

Keep to the topic and answer these questions:

Do you expect them to allow people to use their tech/server centres/4080s for free? And is it really surprising that they don't want their competition using their tech? Perhaps AMD and Intel or other cloud streaming services can come up with their own solutions to provide competition and thus force Nvidia to lower its prices and/or expand its support?

What exactly do you "want" to see Nvidia do?
 