
Poll: **The AMD VEGA Thread**

On or off the hype train?

  • (off) Train has derailed

    Votes: 207 39.2%
  • (on) Overcrowding, standing room only

    Votes: 100 18.9%
  • (never ever got on) Chinese escalator

    Votes: 221 41.9%

  • Total voters
    528
Status
Not open for further replies.
Couldn't disagree more with him about adaptive sync tech though. For instance, if you were playing on a 4K 60Hz monitor without some form of adaptive sync, you'd have to drop some settings back to hold 60fps. With some kind of adaptive sync tech, small variations around 60fps (i.e. dipping into the low 50s or so) are reasonably acceptable for offline games, allowing you to turn the quality up more.

And 30fps is just plain nasty - no way I'd find playing at a locked 30fps acceptable.
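The argument above can be sketched numerically. A minimal toy model, assuming perfect vsync and an illustrative 19 ms frame time (not a benchmark): with a fixed 60 Hz refresh, a frame that misses the ~16.7 ms deadline is held for a whole extra refresh, while adaptive sync displays it as soon as it's ready.

```python
import math

# Toy model of frame delivery on a 60 Hz panel; numbers are illustrative.
REFRESH_MS = 1000 / 60  # ~16.67 ms per refresh

def displayed_interval(render_ms, adaptive):
    """How long the frame effectively occupies the screen, in ms."""
    if adaptive:
        # Adaptive sync: the panel refreshes when the frame is ready
        # (within its supported range), so display time ~= render time.
        return render_ms
    # Fixed vsync: a frame that misses a refresh deadline waits for
    # the next one, so it is held for a whole number of refreshes.
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

# A frame taking 19 ms (a dip into the low-50s fps range):
print(round(1000 / displayed_interval(19.0, adaptive=False)))  # 30 (vsync drops to 30fps)
print(round(1000 / displayed_interval(19.0, adaptive=True)))   # 53 (adaptive stays near target)
```

So under fixed vsync a small dip below 60fps costs a whole refresh, which is why those dips feel so much worse without adaptive sync.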
 
I don't care for G-Sync/FreeSync either, because you have to lock yourself to one vendor. I'll care as soon as that restriction is lifted; till then, I lived without it for the years before it was around, and I'll survive without it now. Locking card and monitor together is absurd.
 

I am afraid you can blame Nvidia and them alone for this. AMD are supporting the open standard, and Nvidia are part of the group that makes these standards. They chose not to join in and instead push their closed G-Sync. It's daft, but until it's worth doing profit-wise they won't support adaptive sync like AMD do with FreeSync. Sucks, but it is what it is.
 
I am afraid you can blame Nvidia and them alone for this

and Nvidia are part of the group that make these standards.

Well, from what I understand there wasn't much interest in pushing adaptive sync tech when Nvidia was trying to get it introduced; it was only after they went away and made their own version that VESA suddenly started to do anything.
 
Probably because they wanted an open standard, not a locked-down one lol :D
No, but seriously, there is nothing stopping Nvidia also supporting it. If G-Sync is superior, people will still buy the panels. Simplez. And people with adaptive sync panels wouldn't have to feel like AMD is the only option for the tech.
 
Can Nvidia even support it on current cards? I know it's not a requirement, but is the hardware there and it just needs enabling via drivers, or does the card need building with it in mind?
 
Here is what GV100 does (according to Nvidia's announcement).

If the top-end chip doesn't do a uniform memory architecture, and has no hardware for virtualised memory (the HBCC in Vega), then I doubt you will see a card from Nvidia achieving these feats any time soon.

You're aware that GP100 already has unified memory and could do the same things as the HBCC in Vega? Not sure whether you could add an SSD to it, but at least the gaming stuff would be easy for them to add if they wanted. At the moment it's only accessible with CUDA, because big Pascal isn't used for gaming. The smaller Pascals probably don't support it, but maybe Volta will?

Pascal unified memory is what you're searching for.
https://images.nvidia.com/content/pdf/tesla/whitepaper/pascal-architecture-whitepaper.pdf
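Conceptually, what unified memory and the HBCC buy you can be sketched with a toy model (names, sizes and the access trace below are invented for illustration; no real driver works this simply): local VRAM acts as a page cache over a larger memory pool, with pages migrated in on demand and cold pages evicted.

```python
from collections import OrderedDict

class ToyPageCache:
    """Toy model of VRAM as an LRU page cache over a bigger pool,
    the rough idea behind Vega's HBCC and CUDA unified memory.
    Purely illustrative; not real driver behaviour."""
    def __init__(self, vram_pages):
        self.vram_pages = vram_pages
        self.resident = OrderedDict()  # page id -> resident flag, LRU order
        self.faults = 0

    def access(self, page):
        if page in self.resident:
            self.resident.move_to_end(page)       # hot page stays resident
            return
        self.faults += 1                          # "migrate" page into VRAM
        if len(self.resident) >= self.vram_pages:
            self.resident.popitem(last=False)     # evict least-recently-used
        self.resident[page] = True

# Working set of 8 pages on a "card" with only 4 pages of VRAM:
cache = ToyPageCache(vram_pages=4)
for page in [0, 1, 2, 3, 0, 1, 4, 0]:
    cache.access(page)
print(cache.faults)  # 5: pages 0-3 fault once each, then page 4 evicts page 2
```

The point being that a game's working set can exceed physical VRAM as long as the hot pages fit, which is what the HBCC marketing is about.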
 
I still jump between the two.
That is strange, I get the absolute opposite... Win 10 DX12 runs like crap, Win 7 DX11 runs butter smooth, with a 1080 and online.

Well, I just installed it last night and my god, what a positive difference it made to be back on Win 7 for now. BF1 was butter smooth with no issues; haven't had that in a long while.
 
Am I the only one here who doesn't care about G-Sync, FreeSync, HDR, or 4K?

I don't game on a sofa far from my screen; I just hate that type of experience.
I have used HDR for a decade already in PC gaming; it is not new like Sony and co are trying to make out. It's been around since the early 80s.
G-Sync and FreeSync are only useful if your framerate differs from a multiple of your refresh rate. The question is why people are playing in such a way? I play at either 30fps or 60fps on a 60Hz screen.
I think pixel count is a false economy: it does wonders for slowing GPUs down, but isn't that great for improving image quality. As resolutions get higher and higher, you hit a point of diminishing returns. I only got a 1440p monitor for desktop real estate, and the fact I game at 1440p is only for that reason now. I have noticed no visual improvement over my previous 1050p resolution; however, things like lighting effects, SGSSAA and tessellation *do* make a meaningful difference to visuals. I always prefer lower resolution with max graphics settings over higher resolution with things turned down.

So to me, buying a slower, hotter, more power-hungry card just so I can use FreeSync sounds barmy.
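The "framerate vs refresh multiple" point can be illustrated with a small sketch (a simplified model assuming perfect vsync and exact frame times, with no buffering effects): frame rates that divide evenly into 60 Hz hold every frame for the same number of refresh cycles, while something like 45 fps alternates holds and judders.

```python
from fractions import Fraction

REFRESH_HZ = 60  # fixed-refresh panel

def hold_pattern(fps, frames=6):
    """Refresh cycles each frame is held for under perfect vsync.
    Exact arithmetic via Fraction avoids floating-point tie-breaks."""
    frame_time = Fraction(1, fps)
    refresh = Fraction(1, REFRESH_HZ)
    pattern, next_refresh, t = [], Fraction(0), Fraction(0)
    for _ in range(frames):
        t += frame_time          # when this frame finishes rendering
        cycles = 0
        while next_refresh < t:  # advance to the refresh that shows it
            next_refresh += refresh
            cycles += 1
        pattern.append(cycles)
    return pattern

print(hold_pattern(60))  # [1, 1, 1, 1, 1, 1] -> even pacing
print(hold_pattern(30))  # [2, 2, 2, 2, 2, 2] -> even pacing
print(hold_pattern(45))  # [2, 1, 1, 2, 1, 1] -> uneven holds, i.e. judder
```

A locked 30 or 60 fps on a 60 Hz panel paces evenly, which is exactly the scenario where adaptive sync adds the least.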

Count yourself lucky that you do not see the difference in resolution. I see a very big difference and have on two occasions tried to go back to 1080p and 1440p and could not do it. I see a massive difference in graphics from 1080p to 2160p; it is obviously smaller with 1440p, yet I still clearly see it. To describe the difference, it is like putting your glasses on. My eyesight is not 20/20, but good enough that I do not need them to drive legally, so I do not bother, as I cannot stand contacts and do not enjoy having glasses on. But when I do put my glasses on, suddenly everything becomes sharper and clearer. That is what going from 1440p to 2160p looks like to me: so much sharper and better. So a false economy for you maybe, but not for everyone ;)

To be fair, though, I have noticed that in some games there is a much bigger difference than in others. Take Tekken 7, for example: I do not see much difference in the way it looks at 4K versus 1080p playing on my TV, yet a big difference in the amount of GPU horsepower needed. Yet in a game like FIFA 17 or Witcher 3 I see a huge difference.
 
Can Nvidia even support it on current cards? I know it's not a requirement, but is the hardware there and it just needs enabling via drivers, or does the card need building with it in mind?

Pascal cards onwards "can" definitely support it - not sure about before that.

I could have sworn Maxwell was capable too, because of that rumour Nvidia was going to release a driver to support adaptive sync. But it never happened.

If Nvidia announce at SIGGRAPH that they'll be supporting "FreeSync", they'd entirely kill Vega without having to spend R&D on Volta.

So many people would instantly buy a NoVideo card that day, myself included, and I've been holding off for nearly a year because of my monitor.

They certainly don't need the module either, as the laptops using G-Sync do not have one in them. They take advantage of eDP, also an open standard.
 
It's unlikely to happen overnight; they'll have contracts and obligations to fulfil with G-Sync first. They'd need to phase it in, at best over a longer period.

I don't see how that would hinder adaptive sync support. Knowing Nvidia, they will just brand adaptive sync support as the lesser of the two so they can keep their premium cash cow. Keep the current G-Sync owners happy, since they have the best tech (at least they think so), and keep the adaptive sync monitor owners happy that they now have access to Nvidia hardware. Winner winner, chicken dinner.
 

They have productised the capability and licensed it; supporting adaptive sync erodes the merit of that licence and what it brings. Yes, of course they could do it, but they'd tie it to a new product, not a driver update on existing products.
 