G-Sync Pulsar

So essentially, if I'm reading it right, this newer module enables the use of ULMB2 + G-Sync at the same time. Not something I'm personally excited for or even need, but new additions to the tech aren't a bad thing.

Yup exactly.

The tech has its place, and it's not something I'm personally excited for "now", but maybe in the future.
 
Only has its place for those 1337 kids chasing ultra-high framez for thousands and thousands of £ in prize money.

It may read like that, but once it's implemented and people actually get to use it, they'll notice the difference in motion clarity.

The question is how much this is going to cost the consumer, and with it being Nvidia, potentially a lot.
 

Part of the cost of using the gsync module was mostly down to monitor manufacturers having to change the chassis design to accommodate it, so the extra cost isn't all directly Nvidia tax (obviously there is an extra cost associated with the module itself, but some of it comes from the monitor manufacturers too).

But alas, when you're first to market with key technology, you can dictate the price. Of course, VESA will no doubt come along with BFI working alongside VRR, and then AMD will jump on it with FreeSync Premium 3 or whatever the branding will be, but as per history, you know there will be some drawbacks/issues, and we could be waiting years for that to happen, so it seems like Nvidia could have the market to themselves for the foreseeable future.
 
Lol, so Nvidia invented a problem and is now selling a solution for it.

Just seen this, an incredibly short-sighted viewpoint. So Nvidia invented motion blur now?

Serious question, are you reading any of these articles by "experts" in the field?
 
I'm sure it will be an open standard that will benefit the whole gaming community, he said with a glint in his eye.

It will eventually come, no doubt about that; the question is when, and what the issues/drawbacks will be. It also depends on the monitor manufacturers, i.e. the monitor itself needs to support BFI/ULMB, not to mention the panel will need to meet requirements such as minimum brightness. Supposedly G-Sync Pulsar itself could have a wider rollout in the future, so existing displays with the G-Sync module (where the firmware is updateable) might see it, but again, your monitor will first need to support BFI/ULMB too.
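To give a rough sense of why the minimum-brightness requirement matters, here's a back-of-the-envelope sketch (the helper function and all the figures are purely illustrative assumptions, not anything from Nvidia's Pulsar spec): strobing/BFI only lights the panel for a fraction of each refresh, so perceived brightness drops roughly with that duty cycle while persistence (and therefore motion blur) shrinks with it.

```python
# Back-of-the-envelope sketch of the strobing brightness trade-off.
# Numbers are illustrative only, not Nvidia's actual Pulsar spec.
def strobed_output(peak_nits: float, refresh_hz: float, pulse_ms: float):
    """Return (perceived_nits, persistence_ms) for a simple strobed backlight."""
    frame_ms = 1000.0 / refresh_hz            # duration of one refresh cycle
    duty_cycle = pulse_ms / frame_ms          # fraction of the frame the backlight is lit
    perceived_nits = peak_nits * duty_cycle   # average brightness the eye integrates
    return perceived_nits, pulse_ms           # persistence ~ pulse width -> less motion blur

# Hypothetical 400-nit panel at 360 Hz with a 1 ms strobe pulse:
nits, mprt = strobed_output(peak_nits=400, refresh_hz=360, pulse_ms=1.0)
print(f"~{nits:.0f} nits perceived, ~{mprt:.1f} ms persistence")
# -> roughly 144 nits from a 400-nit panel, which is why the panel needs brightness headroom
```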
 
For those who actually want to learn the benefits, there's a good discussion in here about BFI etc.

If my monitor's graphics get any sharper, my eyes will bleed!

Plus this is all complete ** now; the chance of noticing this new mode over an almost-instant OLED won't happen. Besides, most of these competitive gamers will be cheating anyway, haha.
 
You have explained what they are.

I asked what problem or issue it solves.

Who has this issue? Pros? Or average Joes on this forum?

I've never seen anyone complain in actual games, "oh noes, not enough motion clarity on my screen".

So I ask again: what issue, for whom, and what does it solve? What games does this actually solve a problem in, when no one has complained about it?

Some games even have innate blur baked in.

Even competitive esports players never complain.

Loads of people said the same thing when it came to 60Hz vs 120Hz; most of them have crawled back into their corner since higher refresh panels became the norm and they'd actually experienced the difference :s

There are a lot of people "happy" enough with adaptive sync (G-Sync Compatible), but personally I notice quite a difference from having variable overdrive - I suspect a lot of the people "happy" with G-Sync Compatible have not actually experienced G-Sync with the module. (Though to be fair, I have a 43" 4K 60Hz display with G-Sync Compatible in the 48-60Hz range which is actually pretty good and I have no real complaints about it, but I know from experience it could be better.)
 
Made up?

Why do people buy high refresh rate monitors and want high fps? Usually because they want a smoother image, better motion, less lag.

What would you rather have?

[attached image]


If you want that improvement, then you pay up, as you would for anything "improved/better".


This is going to be more beneficial for LCD monitors than OLED, but there is still a use case even here, and yes, it is primarily going to benefit competitive shooters the most.

If you're questioning what the point of this is, then no offence, but it clearly isn't aimed at someone like you, and it sounds like you're better off making do with consoles if you don't care about high fps, motion and so on.
I just turn off motion blur in the settings. ;)
 
Yeah, like when getting 130+ fps on my 175Hz QD-OLED screen, motion looks so clean but I can still see some "blur" (and this isn't TAA/DLSS, motion blur, etc.). As said, not a huge selling point for me now, but in two years when I look to upgrade, this could sway me towards a monitor that has it over one that doesn't.
 
Variable overdrive is something very few realise is actually a massive benefit when your fps is fluctuating on LCD displays. OLED has solved that now, but before, people didn't know about it or disregarded it as a strength (not something I realised until later on, as it was quite poorly advertised tbh). For example, my old 144Hz FreeSync 2 Premium IPS was a right pain as I had to change the overdrive response depending on what game I was playing, e.g. +2 was great as long as fps was 100+, but if playing something where fps was <75, I had to change to maybe +1 or even 0, otherwise inverse ghosting was noticeable.
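As a toy sketch of what that manual juggling looks like (and what the module's variable overdrive effectively automates per refresh), here is roughly the rule described above; the thresholds and settings are just the anecdotal ones from that one panel, not anything universal:

```python
# Toy version of the manual juggling described above: overdrive +2 was fine
# at high fps but showed inverse ghosting at low fps. Hardware variable
# overdrive adjusts this continuously; the thresholds here are anecdotal.
def pick_overdrive(fps: float) -> int:
    if fps >= 100:
        return 2   # fast frame times: aggressive overdrive without visible overshoot
    if fps >= 75:
        return 1   # middle ground (assumed; the post only gives the two extremes)
    return 0       # slow frame times: back off to avoid inverse ghosting

for fps in (144, 90, 60):
    print(f"{fps} fps -> overdrive +{pick_overdrive(fps)}")
```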
 
I mean, the talk of different refresh rates is not my argument or anything like it.

What I asked was: what problem does this tech solve, since it's never been brought up as a problem?

Refresh rate is just that; I've not discussed the difference between two refresh rates, that's something you can take up with the others.

When I said FPS, if you needed clarity, I'm referring to FPS games, and none of the top FPS players have ever said they are lacking motion clarity (and they typically do run at the highest fps they can).

So hence my question: who has this issue, and how many people?

If it's not been brought up, then it's Nvidia creating a problem and then selling a solution for it.
 
No, it is the same kind of argument - not many people "had a problem with it" before higher refresh rates were a thing. Same with ray tracing in a way, actually - a lot of people saying we are fine with current graphics.
 
Your argument is the equivalent of saying refresh rates don't matter though... IIRC, a 175Hz OLED is roughly equivalent to a 245Hz LCD display in terms of motion clarity (rough numbers sketched below). BFI seeks to improve motion clarity even further.

OLED does resolve this to some extent but as shown with the links above, it can always be better.

Ultimately, LCD needs to die, but as shown, monitor manufacturers still want to milk it, so perhaps point the blame towards them. You are basically saying that Nvidia created the "motion blur" problem, which, as evidenced, is clearly not the case.

You are also completely ignoring articles that explain this tech and the benefits for whatever reason.
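For anyone who wants the rough maths behind that equivalence, here's a simplified persistence comparison (sample-and-hold persistence is taken as the frame time, LCD pixel response is ignored, and the figures are illustrative rather than measured):

```python
# Simplified persistence comparison: on a sample-and-hold display each frame
# stays on screen for the whole refresh, and perceived motion blur scales
# with that persistence. Ignores LCD pixel response; figures are illustrative.
def persistence_ms(refresh_hz: float, strobe_pulse_ms: float | None = None) -> float:
    """How long each frame is visible, in milliseconds."""
    if strobe_pulse_ms is not None:
        return strobe_pulse_ms      # strobed/BFI: only the backlight pulse is visible
    return 1000.0 / refresh_hz      # sample-and-hold: the full frame time

examples = {
    "175 Hz OLED (sample-and-hold)": persistence_ms(175),
    "245 Hz LCD (sample-and-hold)": persistence_ms(245),
    "175 Hz with a 1 ms strobe (BFI-style)": persistence_ms(175, strobe_pulse_ms=1.0),
}
for label, ms in examples.items():
    print(f"{label}: ~{ms:.1f} ms")
# ~5.7 ms vs ~4.1 ms vs ~1.0 ms: strobing cuts persistence far below what
# raising the refresh rate alone achieves.
```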
 
It's unfortunate that the new G-Sync is limited to certain panel types and monitors only.

I don't know how I feel about going back to the mid-2000s, where a single hardware purchase could give you a significant advantage in multiplayer games.

But these advantages in monitors aren't going anywhere; MSI just announced a monitor with built-in hacks that shows you where enemies are and tracks game data that the gamer normally doesn't see. And developers can't do **** about it because it's not possible to detect it.

 
So hence my question: who has this issue, and how many people?

If it's not been brought up, then it's Nvidia creating a problem and then selling a solution for it.
It’s a very well documented and regularly discussed issue. Obviously if you don’t game in that space then you won’t be aware of it.

Optimum Tech has a couple of good videos on it, as he plays a lot of competitive FPS games.
 
You're on the casual forum, mate. If you really want to know, go ask your question here:
https://forums.blurbusters.com/
 
ULMB never did anything for me, but I can understand the utility and where this new feature will appeal to some. It'll likely be niche but if people are happy to pay for it then more choice is always a good thing.

I personally didn't notice a difference going from a monitor with a G-Sync module to a FreeSync one, but then the latter (the first Odyssey G7) either has variable overdrive or is very well tuned across the refresh rate range (I'm sure HWU mentioned this in their review of it), so it could be that.
 