
G-Sync Pulsar

Nvidia's next evolution to keep it the gold standard for VRR?

[image]

Not quite fully related to the above, but finally something which shows what sets G-Sync apart from adaptive sync/FreeSync! This was always the main advantage of the G-Sync module over non-module monitors. It's not really required for OLED, due to how OLED handles motion and its instantaneous pixel response, but for LCD screens it was/is quite the game changer.


[image]
 
Ah, more stuff most people still can't tell the difference with.

No issues with more tech, but really, who is this for? What problem does it resolve, if any?

Is this Nvidia inventing a problem that doesn't exist and selling a solution to it?

For people who value crystal-clear motion clarity and want VRR too. Nvidia are first to market with this, and it will probably cater to those who miss the CRT days (although OLED is great for motion, it still isn't quite on par with CRT).

BFI can generally make frametime spikes more apparent and thus reduce the feeling of smoothness, not to mention it sometimes causes ghosting/halo-like outer edges, so I imagine the G-Sync module, along with VRR + variable overdrive, is aiming to resolve some of these issues.

Probably won't be for me, since BFI usually results in a dimmer image/lower brightness, but perhaps Nvidia's method may make this less of an issue too.
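For anyone curious about the scale of that brightness hit, here's a rough back-of-the-envelope sum (the numbers are my own illustrative assumptions, not anything from Nvidia):

```python
# Back-of-the-envelope BFI brightness maths. All numbers are my own
# illustrative assumptions, not G-Sync Pulsar specs.

def effective_nits(peak_nits: float, duty_cycle: float) -> float:
    """Average luminance when the backlight is lit for duty_cycle of each frame."""
    return peak_nits * duty_cycle

peak = 400.0   # assumed panel peak brightness, nits
duty = 0.25    # assumed strobe duty cycle (backlight lit 25% of each refresh)

print(effective_nits(peak, duty))  # 100.0 nits perceived, a 4x drop
print(peak / duty)                 # 1600.0 nits peak needed to keep 400 perceived
```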
 
There's value in anything if you make it up, too.

I will question the value of this when testing in a variety of games.

Maybe this will be noticeable in Valorant and not WoW, who knows.

Many people already don't see value in many of the things available on PC, even when they have access to them, because those things don't resolve anything for them.

I ask again, what problem does this provide a solution for?

Good gaming monitors just don't have these issues unless you introduce them with in-game settings or mods.

Made up?

Why do people buy high refresh rate monitors and want high fps? Usually because they want a smoother image, better motion, less lag.

What would you rather have?

[image]


If you want that improvement, then you pay up, as you would for anything "improved/better".


This is going to be more beneficial for LCD monitors than OLED, but there is still a use case even here, and yes, it is going to benefit competitive shooters the most.

If you're questioning what the point of this is then no offence, but it clearly isn't aimed at someone like you, and it sounds like you're better off making do with consoles if you don't care about high fps, motion and so on.
 
Even on a 175Hz QD-OLED screen (arguably one of the best for motion clarity), I still see "blur" in moving images like this:


This is where BFI/G-Sync Pulsar would make the motion even crisper.
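For context on why even a fast panel blurs when you track motion, the usual sample-and-hold persistence sum looks like this (speeds and refresh rates are just example figures):

```python
# Standard sample-and-hold persistence maths: while your eye tracks a moving
# object, the static frame smears across your retina for the full refresh.
# Speeds/refresh rates below are example assumptions.

def blur_px(speed_px_s: float, persistence_ms: float) -> float:
    """Perceived blur width in pixels for eye-tracked motion."""
    return speed_px_s * persistence_ms / 1000.0

speed = 1920.0  # object crossing a 1920px-wide screen in one second

for hz in (60, 175, 360):
    persistence = 1000.0 / hz  # sample-and-hold: lit for the whole frame, ms
    print(f"{hz}Hz: {blur_px(speed, persistence):.1f}px of smear")

# Strobing/BFI at an assumed 25% duty cycle cuts persistence (and smear) ~4x:
print(f"175Hz strobed: {blur_px(speed, (1000.0 / 175) * 0.25):.1f}px")
```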
 
So the solution essentially is already there.

So again, what problem does this solve if the solution is already available?

What games, and what types of games, warrant the value of this?

This absolutely reeks of a problem that Nvidia is trying to fabricate and sell you a solution for.

By the way, the main game I play 70% of the time is World of Warcraft.

Where's the solution exactly? To my knowledge there is no monitor where BFI works with VRR (without issues).

Did you even read the post and what it looks to "improve"?
 
What, and where, is the known issue this module is supposed to fix?

Giving me a fringe scenario, where you really have to dig to have this issue, is looking for a problem that doesn't exist and letting Nvidia sell you a solution to a problem people just aren't having.

Are you deliberately being obtuse here? Or trolling? If trolling, maybe read the stickied post at the top of this sub-forum.

The use case, what it seeks to "improve" and fix, is literally right there in the article and in my post to you.

Here is another article:



And a good article in general on tech:

 
Not liking what I say isn't trolling.

You keep saying "what is it meant to fix/improve"; it is there in black and white in the articles, with images showing exactly what it seeks to fix and improve... So it seems like you just don't like what it is seeking to fix/improve? Is it because Nvidia are first to the market again?
 
You have explained what they are.

I asked what problem or issue.

Who has this issue? Pros? Or average Joes on this forum?

I've never seen anyone complain in actual games, "oh no, not enough motion clarity on my screen".

So I ask again: what issue, for whom, and what does it solve? Which games does this actually solve a problem for, given no one has complained about one?

Some games even have innate blur baked in.

Even competitive esports players never complain.

:rolleyes:

Maybe stick to 30 fps with no VRR then, as clearly you don't get who this is aimed at, or appreciate such technology advancements.

It's literally the equivalent of saying "what's the point of more than 60Hz/60fps", "what's the point of VRR", "what's the point of self-emissive pixels".
 
Sort of, probably not far off it. NV and now AMD are on board with the Fake Frames; it's just the beginning.

IMO, Fake Frames need extremely high fps. They've possibly worked out how to match, or get closer to, the Hz/fps for a cleaner output image, and as the Fake Frame technique evolves, the fps are going to increase.

NV's possibly got the solution.

This is quite different to DLSS 3/FG. This seeks to primarily improve motion clarity through different methods.

 
Having read about this further in comments and articles, it seems it's really just for hardcore gamers with LCDs and the like.

[OLED master race represent]

Yeah, it's far more beneficial for them, but OLED can benefit too.

TFTcentral comments on when they used BFI on older oled screens:

We saw this available on some older OLED TV’s including the LG CX for instance in 2020 where it could be used at 60Hz and 120Hz, and we were very impressed by the motion blur reduction benefits.

They'll have reviews coming soon, so we'll get a good insight into whether and how it is beneficial for OLED.
 
You're giving something on paper.

Give something that people in games have actually said is an issue.

If we are talking about FPS games, then I've never seen Shroud or TenZ ever complain, saying "you know, the motion clarity is lacking".

So if pros never bring it up, what do average joes notice?

And you're ignoring the listed benefits, as well as what it will improve, shown via image examples/comparisons. You're basically saying motion clarity is perfect as it is, when it's not, especially for people who prefer LCD monitors for whatever reason, for whom this tech will be a massive advantage.

It's understandable that you can't relate to the problem, as you don't seem to play the types of games that need it.

The technology will benefit everyone in the applicable genre, especially those who can't afford 240hz+ monitors.

Running at 360Hz+ will feel like butter, but if you slow it all down you're still getting ghosting, stutter and blur, not to mention how your own eyes process images, and you'd only realise it when it stopped happening, i.e. when the tech is turned on.

It's like trying to explain 165Hz to a 60Hz gamer, or 360Hz to a 165Hz gamer. You have to experience it.

Said it perfectly.

Let's turn this around: perhaps you can tell us why it won't solve anything, or what it won't resolve? TFTcentral etc. (experts in the monitor field) seem pretty convinced by the benefits of BFI (whilst having VRR work...).
 
For those who don't quite understand what this is, I highly advise reading Blur Busters' articles on motion clarity and so on...

 
So essentially, if I'm reading it right, this newer module enables the use of ULMB2 + G-Sync at the same time. Not something I'm personally excited for or even need, but newer additions to the tech aren't a bad thing.

Yup exactly.

The tech has its place and it's not something I'm personally excited for "now" but maybe in the future.
 
It may read like that, but once it's implemented and people actually get to use it, they'll notice the difference in motion clarity.

The question is how much this is going to cost the consumer, and with it being Nvidia, potentially a lot.

Part of the cost of the G-Sync module was mostly attributed to monitor manufacturers having to change the chassis design in order to accommodate it, so the extra cost is not all directly Nvidia tax (although obviously there is an extra cost associated with this, it comes from the monitor manufacturers too).

But alas, when you're first to market with key technology, you can dictate the price. Of course, VESA will no doubt come along to have BFI working with VRR, and then AMD will jump on it with their FreeSync Premium 3 or whatever the branding will be, but as per history, you know there will be some drawbacks/issues. And of course, we could be waiting years for that to happen, so it seems Nvidia could have the market to themselves for the foreseeable future.
 
Lol, so Nvidia invented a problem and is now selling a solution for it.

Just seen this; incredibly short-sighted viewpoint. So Nvidia invented motion blur issues now?

Serious question, are you reading any of these articles by "experts" in the field?
 
I'm sure it will be an open standard that will benefit the whole gaming community. He said with a glint in his eye.

It will eventually come, no doubt about that; the question is when, and what the issues/drawbacks will be. It depends entirely on the monitor manufacturers too, i.e. the monitor itself needs to support BFI/ULMB, not to mention the monitor/panel will also need to meet requirements such as minimum brightness. Supposedly G-Sync Pulsar itself could have a wider rollout in the future, so existing displays with the G-Sync module (where the firmware is updateable) might see this, but again, your monitor will first need to support BFI/ULMB too.
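To illustrate why strobing and VRR have historically not mixed: if the strobe pulse stays a fixed length while the frame interval varies, brightness swings with fps. A naive sketch of the obvious compensation follows; this is just my guess at the general idea, not how the module actually works:

```python
# Naive sketch: scale the strobe pulse with the (varying) frame time so the
# duty cycle, and therefore perceived brightness, stays constant under VRR.
# My guess at the general idea, not how the G-Sync Pulsar module does it.

def pulse_ms(frame_ms: float, target_duty: float = 0.25) -> float:
    """Strobe pulse length keeping lit-time/frame-time constant."""
    return frame_ms * target_duty

for fps in (144, 100, 75):  # example fps swings under VRR
    frame = 1000.0 / fps
    print(f"{fps}fps -> {pulse_ms(frame):.2f}ms pulse")
```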
 
If my monitors graphics are any sharper, my eyes will bleed!

Plus this is all complete ** now; the chance of noticing this new mode over an almost-instant OLED won't happen. Besides, most of these competitive gamers will be cheating anyway haha

Yeah, like when getting 130+ fps on my 175Hz QD-OLED screen: motion looks so clean, but I can still see some "blur" (and this isn't TAA/DLSS, motion blur, etc.). As said, not a huge selling point for me now, but in 2 years when I look to upgrade, this could sway me over a monitor that doesn't have this.
 
Loads of people said the same thing when it came to 60Hz vs 120Hz; most of them have crawled back into their corner since higher refresh panels became the norm and they gained actual experience of the difference :s

There are a lot of people "happy" enough with adaptive sync (G-Sync Compatible), but personally I notice quite a difference with variable overdrive - I suspect a lot of the people "happy" with G-Sync Compatible have not actually experienced G-Sync with the module. (Though to be fair, I have a 43" 4K 60Hz display with G-Sync Compatible in the 48-60Hz range, which is actually pretty good and I have no real complaints with it, but I know from experience it could be better.)

Variable overdrive is something very few people realise is actually a massive benefit when your fps is fluctuating on LCD displays. OLED has solved that now, but before, people either didn't know about it or disregarded it as a strength (not something I realised until later on, as it was quite poorly advertised tbh). E.g. my old IPS FreeSync 2 Premium 144Hz was a right pain, as I had to change the overdrive response depending on what game I was playing: +2 was great as long as fps was 100+, but if playing something where fps was <75, I had to change to maybe +1 or even 0, otherwise inverse ghosting was noticeable.
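In effect, the module automates that manual fiddling: pick the drive strength from the live frame time instead of one fixed setting. A toy sketch of the idea, with made-up thresholds mirroring the +2/+1/0 switching above (not real firmware values):

```python
# Toy version of variable overdrive: choose drive strength from the current
# frame time rather than a fixed setting. Thresholds are made up to mirror
# the +2 / +1 / 0 switching described above, not real firmware values.

def overdrive_level(frame_ms: float) -> int:
    if frame_ms <= 10.0:    # ~100+ fps: aggressive overdrive is safe
        return 2
    if frame_ms <= 13.3:    # ~75-100 fps: back it off a notch
        return 1
    return 0                # <75 fps: strong overdrive shows inverse ghosting

for fps in (144, 90, 60):
    print(f"{fps}fps -> overdrive +{overdrive_level(1000.0 / fps)}")
```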
 
I mean, the talk of different refresh rates is not my argument or the like.

What I asked was: what problem does this tech solve, since it's never been brought up as a problem?

Refresh rate is just that; I've not discussed the difference between two points of refresh rates. That's something you can take up with the others.

When I said fps, if you needed clarity, I was referring to FPS games, and none of the top FPS players have ever said they are lacking motion clarity (and they do typically run at the highest fps they can).

So hence my question: who, and how many people, have this issue?

If it's not been brought up then it's Nvidia creating a problem and then selling a solution for it.

Your argument is the equivalent of saying refresh rates don't matter, though... IIRC, a 175Hz OLED is roughly equivalent to a 245Hz LCD in terms of motion clarity. BFI seeks to improve motion clarity even further.
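The rough reasoning behind that equivalence, with illustrative numbers rather than measurements: the LCD's slower pixel response adds smear on top of the frame persistence, while OLED's response is near zero:

```python
# Illustrative numbers only, not measurements: approximating total visible
# smear as frame persistence plus pixel response time.

def smear_ms(hz: float, response_ms: float) -> float:
    return 1000.0 / hz + response_ms

print(f"175Hz OLED: {smear_ms(175, 0.1):.2f}ms")  # ~0.1ms assumed response
print(f"245Hz LCD:  {smear_ms(245, 2.0):.2f}ms")  # ~2ms assumed effective GtG
# -> ~5.81ms vs ~6.08ms, i.e. roughly comparable motion clarity
```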

OLED does resolve this to some extent but as shown with the links above, it can always be better.

Ultimately, LCD needs to die, but as shown, monitor manufacturers still want to milk it, so perhaps point the blame towards them. You are basically saying that Nvidia created the "motion blur" problem, which, as evidenced, is clearly not the case.

You are also, for whatever reason, completely ignoring the articles that explain this tech and its benefits.
 