Are Nvidia Going to Support FreeSync?

... and probably something to do with the absolute drivel and nonsensical style of posting you've taken to on these forums. None of what you've typed in the last few posts has made any sense.

+1

What an utter load of tosh in this thread. Well played, Shankly. Time to go to sleep if you've been on the Christmas sauce this early.
 
Ahhh fair enough, and I have absolutely no interaction with anyone when it comes to my typing.

One day you're going to be my father-in-law.

Childish antics aside.

I just love to read Shanks' posts, and then counter the nonsense.
 
I wish you'd stop saying "PhysX-only title" and "losing out", because it's factually incorrect, given that the vast majority of PhysX-only titles (wow, I hate that term) look exactly the same and have the same physics effects.

A PhysX-only title is a game that only supports GPU PhysX, without some sort of hacking to force it onto the CPU.

BL1 and 2 are PhysX-only titles... Metro 2033 had the choice between PhysX and physics, so it wasn't a PhysX-only title.

So if the option for switching between the two isn't in the game's settings, then it's a PhysX-only title.
 

Metro 2033 only has one physics engine, that being PhysX, from what I can tell. It is by definition a PhysX-only game. You've just made up your own definition, and you're mixing up PhysX and physics as if they're the same thing.

What you mean by "PhysX only" is a game which has GPU-accelerated physics via PhysX. Those are in the vast minority.
 

Metro 2033 used both CPU physics and GPU PhysX.
On AMD I wasn't losing out: if I chose to run CPU physics I could still enjoy the game, albeit with less performance.

Are my eyes playing tricks on me? Another subject Shankly is talking about that he has no understanding of!?

Bah, Bah, Bah

"PhysX only" is a game that will only run GPU PhysX, like BL1 and 2.
 

You don't get it.
It's all PhysX.

The stuff on the CPU isn't meant to be the same as the stuff on the GPU, so you wouldn't be getting the same "experience" by your own definition; somewhat ironically, the one with less performance would be the one running hardware-accelerated physics via PhysX.
Basically, what I'm getting at is that you don't actually understand what you're arguing about. You don't understand what PhysX is.

Unless Metro 2033 allows AMD GPU owners to run the hardware-accelerated effects on the CPU. Which isn't exactly a great idea. But that'd still make it a PhysX-only title by the true definition.

Also, why can't you enjoy a game if you can't see papers flying or capes flowing differently? I've never found hardware-accelerated physics to be *that* special. Full of potential, sure, but never a game changer.
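To put some substance behind "it's all PhysX": in the SDK itself, the CPU path is the default and the GPU path is an opt-in acceleration of the same scene. Here's a rough sketch against the modern PhysX 4 API (the function name, thread count and fallback handling are my own; Metro and Borderlands shipped on much older PhysX 2.x/3.x, so this shows the principle, not their actual code):

```cpp
// Sketch: one scene description, two execution paths.
#include <PxPhysicsAPI.h>
#include <cudamanager/PxCudaContextManager.h>

using namespace physx;

PxScene* createScene(PxPhysics& physics, PxFoundation& foundation, bool tryGpu)
{
    PxSceneDesc desc(physics.getTolerancesScale());
    desc.gravity      = PxVec3(0.0f, -9.81f, 0.0f);
    desc.filterShader = PxDefaultSimulationFilterShader;
    // A CPU dispatcher is always required; with nothing else set,
    // the whole simulation runs on these worker threads.
    desc.cpuDispatcher = PxDefaultCpuDispatcherCreate(4);

    if (tryGpu)
    {
        // The GPU path needs a CUDA context, i.e. an Nvidia GPU.
        PxCudaContextManagerDesc cudaDesc;
        PxCudaContextManager* cuda = PxCreateCudaContextManager(foundation, cudaDesc);
        if (cuda && cuda->contextIsValid())
        {
            desc.cudaContextManager = cuda;
            desc.flags             |= PxSceneFlag::eENABLE_GPU_DYNAMICS;
            desc.broadPhaseType     = PxBroadPhaseType::eGPU;
        }
        // If CUDA init fails (e.g. an AMD-only box), we simply fall
        // through and the identical scene simulates on the CPU.
    }

    return physics.createScene(desc);
}
```

Same effects either way; the CUDA path just buys throughput. Which is exactly the "less performance, same game" trade-off being argued about here.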
 
Indeed, hardware costs are likely the same between the two. Nvidia may charge a license fee, but licensing fees are typically small anyway (e.g. $10 a pop, with big discounts for early adopters). Nvidia are always in the position where the G-Sync license fee can be dropped. On the flip side of the coin, just because there is no license fee for FreeSync doesn't mean that manufacturers won't charge a price premium.

I doubt there will really be any noticeable price differences once everything has settled. If there is a systematic price difference with no difference in features or quality, then Nvidia will likely start supporting Adaptive-Sync if they lose market share because of it.

People corrected you before about this, yet you are back posting the same drivel. FreeSync is AMD's method of connecting to an Adaptive-Sync monitor.

G-Sync monitors cost more because Nvidia has to supply hardware specific to that monitor. It's not a license fee. There will always be a price difference between Adaptive-Sync and G-Sync monitors unless Nvidia supply the G-Sync module for free. They are a business; they aren't going to do that.

You clearly know nothing about G-Sync/FreeSync or Adaptive-Sync, so please stop posting such rubbish. Mind you, there are lots of other posters in this thread who don't have a clue either.
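For what it's worth, the module-versus-scaler split described above is invisible to the game itself: the application-facing path is vendor-agnostic. A rough illustration using the generic Windows route that arrived later (DXGI 1.5, well after this thread; a sketch only, not any particular game's code):

```cpp
// Sketch: app-side check for variable-refresh-friendly presentation on Windows.
// The vendor-specific plumbing (FreeSync over Adaptive-Sync, or the G-Sync
// module) lives entirely between the driver and the monitor.
#include <dxgi1_5.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

bool SupportsTearingPresent()
{
    ComPtr<IDXGIFactory5> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return false; // DXGI 1.5 not available on this OS.

    BOOL allowTearing = FALSE;
    if (FAILED(factory->CheckFeatureSupport(DXGI_FEATURE_PRESENT_ALLOW_TEARING,
                                            &allowTearing, sizeof(allowTearing))))
        return false;

    return allowTearing == TRUE;
}

// If this returns true, the swap chain is created with
// DXGI_SWAP_CHAIN_FLAG_ALLOW_TEARING and frames go out via
// Present(0, DXGI_PRESENT_ALLOW_TEARING), letting the driver
// and monitor vary the refresh rate underneath.
```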
 
Metro 2033 used both CPU physics and GPU PhysX.
On AMD I wasn't losing out: if I chose to run CPU physics I could still enjoy the game, albeit with less performance.

Bah, Bah, Bah

"PhysX only" is a game that will only run GPU PhysX, like BL1 and 2.

Please stop, Shankly; you are sounding more and more stupid with each post. Take off the AMD glasses and think about what you are saying before posting again.
 
Trying to justify things a little: I am sure that if nVidia contacted AMD and said "can we use Mantle please?", they would hand it over in an instant (joking, of course, for those that might miss it). nVidia can by all means adopt and support Adaptive-Sync if they so wish, because AMD don't own it; AMD just sent a text to VESA and asked for A-Sync to be integrated into the next DP standard.

I wonder if AMD have ever contacted nVidia and asked "Can we use that please?", and would nVidia allow them? The thing with nVidia is they do tend to innovate and lead from the front, whereas AMD do tend to play catch-up!
 
You don't get it.
It's all PhysX.

The stuff on the CPU isn't meant to be the same as the stuff on the GPU, so you wouldn't be getting the same "experience" by your own definition; somewhat ironically, the one with less performance would be the one running hardware-accelerated physics via PhysX.
Basically, what I'm getting at is that you don't actually understand what you're arguing about. You don't understand what PhysX is.

Unless Metro 2033 allows AMD GPU owners to run the hardware-accelerated effects on the CPU. Which isn't exactly a great idea.

But it is the same thing; the only difference to running it on the GPU is to gain performance.

Would you like to watch me running Batman with PhysX on high on the CPU?
Batman isn't a PhysX-only title; at least it gives you the option to use it on the CPU.

https://www.youtube.com/watch?v=YX_r0Ex5cNg

Run the same test with PhysX on the GPU and you get the same end result; the only difference is your performance will be much higher.
 
Ooh, another **** and moan thread. GG, GPU forum!

I really do wonder what the point of this subsection is sometimes: nothing but petty one-upmanship, thinly veiled insults (which some people somehow manage to get away with on a consistent basis) and fanboys doing no end of nut-swinging from their respective card company's gonads.

Seriously, wise the **** up. :rolleyes:
 

I give up. You must be trolling.
Or you simply don't understand what PhysX is.

You're not really *meant* to run the hardware-accelerated stuff on the CPU, because it results in poor performance, which is the reason why we have hardware-accelerated physics running on GPUs (well, Nvidia GPUs).

It's all through PhysX though!
 

If we're not meant to run it, then why did they give us the option? Why not just have it like BL1 & 2 and completely disable it?

The latter.
He's no troll, he's genuinely stupid.

He's told us before, AMD lowers your IQ.

All you can do is post digs and insults at people! At least Mart is having a discussion.
 
If we're not meant to run it, then why did they give us the option? Why not just have it like BL1 & 2 and completely disable it?

You have the option to put your multiplier to 60. Doesn't mean you should.
It's just an option; it doesn't work that well, and performance takes a nosedive. And most games do disable it. I remember running it in Alice: Madness Returns and performance would just tank at certain monsters.

But by definition, these titles are all PhysX-only.
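Which is presumably all that the BL1/2-style lockout amounts to: a capability probe at startup, so the GPU-only effects never get offered on hardware that can't run them. A hypothetical sketch of such a gate (my own code, using the CUDA runtime API; no claim any of these games actually do it this way):

```cpp
// Sketch: decide whether to expose a "GPU PhysX" option at all.
#include <cuda_runtime.h>
#include <cstdio>

int main()
{
    int deviceCount = 0;
    // cudaGetDeviceCount fails or reports zero when no CUDA-capable
    // (i.e. Nvidia) GPU is present -- e.g. on an AMD-only system.
    if (cudaGetDeviceCount(&deviceCount) != cudaSuccess || deviceCount == 0)
    {
        std::puts("No CUDA device: hide the GPU PhysX setting entirely.");
        return 0;
    }
    std::puts("CUDA device found: expose the GPU PhysX setting.");
    return 0;
}
```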
 
I wonder if AMD have ever contacted nVidia and asked "Can we use that please?", and would nVidia allow them? The thing with nVidia is they do tend to innovate and lead from the front, whereas AMD do tend to play catch-up!

Nvidia publicly offered to share PhysX with ATI/AMD back in 2008; AFAIK ATI/AMD never picked up the phone.

EDIT: Reading it back, it's pretty funny that it's basically the same offer/press release that AMD would make about Mantle years later, lol.
 