Poll: ** The AMD VEGA Thread **

On or off the hype train?

  • (off) Train has derailed

    Votes: 207 39.2%
  • (on) Overcrowding, standing room only

    Votes: 100 18.9%
  • (never ever got on) Chinese escalator

    Votes: 221 41.9%

  • Total voters
    528
I don't care for G-sync/Freesync either. I don't care because you have to lock yourself to one vendor. I'll care as soon as that restriction is lifted; till then, I lived without it for the years before it was around and I'll survive without it now. Locking card and monitor together is absurd.

This isn't actually true. Nvidia developed G-sync to be a one-brand tech; AMD's response was to push for changes to what was at the time an incoming update to the DisplayPort standard, and that update now includes Adaptive-Sync. In that respect the timing couldn't have been better for AMD: if Nvidia hadn't announced G-sync when they did, it would have been too late for the current version of DisplayPort to get tweaked as it did. Freesync works through the industry-standard DisplayPort, which is available for anyone to use. AMD use it for Freesync and Intel use it too, but Nvidia don't, because their version, G-sync, locks anyone who buys a G-sync monitor into the GeForce ecosystem. Because of the hefty premium gamers pay for G-sync monitors, owners feel the need to keep buying GTX graphics cards so that they won't have wasted their money.
Freesync is not entirely free, but it is only a small mark-up compared to what is often hundreds of pounds for the G-sync equivalent.
For me, what makes G-sync a hard sell is that a monitor fully decked out with the right Freesync features is every bit as good as a G-sync monitor.

Don't Nvidia support adaptive sync in their G-sync laptops?

I was just thinking about adding this to my post. I remember reading some articles about this a couple of years ago, and at the time, yes, mobile G-sync used the industry-standard adaptive-sync instead of the module that desktop G-sync monitors use.

That alone speaks volumes about Nvidia's claim that G-sync's the superior tech.
 
Freesync is not entirely free, but it is only a small mark-up compared to what is often hundreds of pounds for the G-sync equivalent.
For me, what makes G-sync a hard sell is that a monitor fully decked out with the right Freesync features is every bit as good as a G-sync monitor.

What needs to happen is monitor spec parity between the two. Why isn't there a ***** to the wall ROG Freesync MG348Q that's identical to the PG348Q bar the sync tech?

That does put off a few people for some high-end monitors, especially since LFC is so darn important on Freesync, but many of those Freesync panels are capped at 75Hz.
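
To put the LFC point in concrete terms: when the framerate drops below the bottom of the VRR window, the driver repeats each frame enough times that the effective refresh lands back inside the window. Here's a minimal Python sketch of that idea; the "smallest multiplier that fits" policy is an assumed simplification, and real driver logic is more involved.

```python
# Minimal sketch of Low Framerate Compensation (LFC): below the VRR window,
# scan each frame out multiple times so the panel's refresh stays in range.
def lfc_refresh(fps: float, vrr_min: float, vrr_max: float) -> float:
    """Refresh rate the panel would actually run at for a given framerate."""
    if fps >= vrr_min:
        return min(fps, vrr_max)          # in (or above) the window: 1:1, capped
    for mult in range(2, 10):             # below it: repeat each frame N times
        if vrr_min <= fps * mult <= vrr_max:
            return fps * mult
    return vrr_min                        # far out of range: clamp to the floor

# A typical 75Hz Freesync panel with a 30-75 window:
print(lfc_refresh(25, 30, 75))   # 50.0 -- 25fps shown as 2 scans per frame
print(lfc_refresh(60, 30, 75))   # 60.0 -- matched 1:1
```

This is also why LFC needs the window's max to be at least roughly double its min: a 30-75 range just qualifies, while the common 48-75 panels miss out.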
 
Am I the only one here who doesn't care about G-sync, Freesync, HDR or 4K?

I don't game on a sofa far from my screen; I just hate that type of experience.
I have used HDR for a decade already in PC gaming; it is not new like Sony and co are trying to make out. It's been around since the early 80s.
G-sync and Freesync are only useful if your framerate doesn't divide evenly into your refresh rate. The question is, why are people playing in such a way? I play at either 30fps or 60fps on a 60Hz screen.
I think pixel count is a false economy: it does wonders for slowing GPUs down, but it's not that great for improving image quality. As resolutions get higher and higher you hit a point of diminishing returns. I only got a 1440p monitor for desktop real estate, and the fact I game at 1440p is only for that reason now. I have noticed no visual improvement over my previous 1050p resolution, whereas things like lighting effects, SGSSAA and tessellation "do" make a meaningful difference to visuals. I always prefer lower resolution with max graphics settings over higher resolution with things turned down.

So to me, buying a slower, hotter, more power-hungry card just so I can use Freesync sounds barmy.
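
The "framerate vs refresh rate" point above is easy to put numbers on. Below is a simplified Python model (an illustration only: it assumes vsync on and perfectly constant render times). At 45fps on a 60Hz panel, frames stay on screen for an uneven mix of ~33ms and ~17ms, which is exactly the judder adaptive sync removes by scanning out whenever a frame is ready; at 30 or 60fps the cadence is even, hence the habit described above.

```python
# Simplified model of frame pacing with vsync on a fixed-refresh panel.
# Assumes constant render times; real frame times vary.
import math

def vsync_display_times(fps: float, hz: float, frames: int = 6):
    """How long each frame stays on screen (ms) when locked to a fixed refresh."""
    interval = 1000 / hz                # one refresh period, 16.67ms at 60Hz
    times = []
    t_ready = 0.0                       # when a rendered frame is finished
    t_shown = 0.0                       # when the previous frame hit the screen
    for _ in range(frames):
        t_ready += 1000 / fps
        # frame waits for the next scanout (epsilon guards float rounding)
        next_scan = math.ceil(t_ready / interval - 1e-9) * interval
        times.append(round(next_scan - t_shown, 1))
        t_shown = next_scan
    return times

print(vsync_display_times(45, 60))  # [33.3, 16.7, 16.7, 33.3, ...] -> judder
print(vsync_display_times(60, 60))  # [16.7, 16.7, ...] -> even cadence
```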

It won't be the same HDR tech, as that's only just come to TVs and monitors. And it has to be 10-bit HDR, none of this half-assed fake crap.
Freesync and G-sync are useful, full stop. I went without and gamed without it for years, then I finally tried it and it was amazing. It's like going from 60Hz to 144Hz while maintaining over 100fps: you notice the difference in smoothness for sure. And you don't feel FPS variation as much, if at all; I don't notice drops like I used to.

4K does improve image quality. Heck, even 1440p does. I've got a 1080p second monitor at the side here, and at the desktop it absolutely is not as good or as sharp as my 1440p. Same in games. Max settings is the false economy; that's what slows GPUs down.
 
1080 performance at 1080 prices will be what I expect and that will do nicely for those with Freesync screens. :)

I'll skip this gen if it's 1080 performance at 1080 prices; my Fury does alright at 1440p. I'll buy one if the price/performance is right.

That'll be fine for the WC model, but if the blower version is at 1080 prices with the WC approaching Ti pricing, I won't be buying. Nvidia's next range can't be that far away, and there's no reason why we should have to pay so high a price; what it's cost them is down to them, not us. I would have preferred a GDDR5X model 6 months ago rather than an HBM2 model whose memory is slower than last gen's HBM.

Mine has a bit of BLB but nothing to worry over; it's in the 4 corners but not bad. My Dell IPS was horrendous for it.

As was the Asus IPS Dominator I got from OCUK. It was the first IPS monitor I'd ever bought and the BLB was horrendous: all 4 corners, with 2 in particular that almost covered their entire quarter in bleed. It was so horrible I almost said sod it and bought another TN. Luckily enough, Old gamer had a great UW Freesync monitor that he put on the Members Market, so I grabbed that, and I'm glad I did even if I don't go with Vega this time.
 
What needs to happen is monitor spec parity between the two. Why isn't there a ***** to the wall ROG Freesync MG348Q that's identical to the PG348Q bar the sync tech?

You're right, they do need to get parity between the sync types. I'm not sure why the Nvidia G-sync models have the faster panels; maybe the G-sync modules are pushing a bigger overclock by default. My Freesync's capped at 75Hz, but the working range is 30-75 with LFC, and for playing games it feels great. I'm hoping my next upgrade will be to a UW4K model with 100Hz+ and an adaptive sync tech, but they're still several years away, so for now I have the best IPS UW can offer. I briefly had a 144Hz Freesync ultrawide that used a VA panel, but the 2560x1080 resolution let it down. I imagine we can't be far off seeing UW1440 versions of that, and if that becomes available with Freesync 2 specs it'd make a great choice.

That does put off a few people for some high-end monitors, especially since LFC is so darn important on Freesync, but many of those Freesync panels are capped at 75Hz.

That is the most notable difference with a lot of comparable monitors. Another example is the 35" Acer VA ultrawide I mentioned earlier: the Freesync model's at 144Hz while the G-sync one's at 200Hz.
 
Mine's capped at 75Hz, but the working range is 30-75 with LFC, and for playing games it feels great. I'm hoping my next upgrade will be to a UW4K model with 100Hz+ and an adaptive sync tech, but they're still several years away, so for now I have the best IPS UW can offer. I briefly had a 144Hz Freesync ultrawide that used a VA panel, but the 2560x1080 resolution let it down. I imagine we can't be far off seeing UW1440 versions of that, and if that becomes available with Freesync 2 specs it'd make a great choice.

Yeah, I've never been happy with VA panels after having a proper IPS one that's been calibrated. Never mind the lower resolutions.

All the monitors now still use DP 1.2. I'm hoping 1.3 or 1.4 ones show up soon, as that'll give us the bandwidth for 144Hz at 3440x1440 without issue or colour-depth compromises.

I'm surprised we've heard nothing about Freesync 2 since its announcement, and no monitors featuring it.
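
The DP 1.2 bandwidth point above checks out on the back of an envelope. A rough Python calculation, where the ~20% blanking overhead is a ballpark assumption (real CVT-R2 timings vary per monitor); the link rates are the published effective figures after 8b/10b encoding:

```python
# Rough check: does 3440x1440 @ 144Hz fit in a DisplayPort link?
def needed_gbps(w: int, h: int, hz: int, bpc: int = 8, blanking: float = 1.2) -> float:
    """Approximate video bandwidth in Gbit/s (RGB, no chroma subsampling)."""
    return w * h * hz * (bpc * 3) * blanking / 1e9

DP12_HBR2 = 17.28   # Gbit/s usable (21.6 raw, minus 8b/10b overhead)
DP13_HBR3 = 25.92   # Gbit/s usable (32.4 raw); same link rate for DP 1.4

need = needed_gbps(3440, 1440, 144)
print(f"need ~{need:.1f} Gbit/s")            # ~20.5 Gbit/s
print("fits DP 1.2:", need <= DP12_HBR2)     # False -> hence the ~100Hz caps
print("fits DP 1.3+:", need <= DP13_HBR3)    # True

# the colour-depth angle: 10-bit only just squeezes into DP 1.3 at 144Hz
print(needed_gbps(3440, 1440, 144, bpc=10) <= DP13_HBR3)   # True, barely
```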
 
Been hearing about hardware z-sorting/culling and occlusion detection for over a decade now. No one ever uses it; everyone just uses dPVS to cull entire models on the CPU. I don't know if it's just too hard to use or just too much of a performance hit.

So...don't expect much performance to come from this rehashed feature.
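
For context on the CPU-side approach mentioned above: the common trick is a coarse software depth buffer that a handful of big occluders get rasterised into, after which whole models are rejected if their screen-space bounds sit entirely behind it. A toy Python sketch of that idea; the 16x16 buffer and all names here are invented for illustration, and real middleware such as dPVS is far more elaborate.

```python
# Toy CPU occlusion culling: splat big occluders into a coarse depth
# buffer, then cull any model whose bounding rect is fully behind it.
import numpy as np

RES = 16
depth = np.full((RES, RES), np.inf)          # coarse depth buffer, empty = inf

def splat_occluder(x0, y0, x1, y1, z):
    """Rasterise an occluder's rect at its nearest depth (conservative)."""
    depth[y0:y1, x0:x1] = np.minimum(depth[y0:y1, x0:x1], z)

def is_occluded(x0, y0, x1, y1, z_min):
    """Cull only if every covered cell is nearer than the model's closest
    point -- conservative, so visible models are never wrongly culled."""
    return bool((depth[y0:y1, x0:x1] < z_min).all())

splat_occluder(2, 2, 12, 12, z=5.0)          # a wall at depth 5
print(is_occluded(4, 4, 8, 8, z_min=9.0))    # True: fully behind the wall
print(is_occluded(4, 4, 8, 8, z_min=3.0))    # False: in front of it
print(is_occluded(0, 0, 4, 4, z_min=9.0))    # False: pokes past the wall's edge
```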
 
I am sure it will be golden, bro. 1080 performance at 1080 prices will be what I expect, and that will do nicely for those with Freesync screens. Hopefully they have done some sandbagging with the FE and the RX will be Ti performance :)
It's not exactly 1080 prices, as I have to buy a new PSU to run the thing.
 
You're aware that GP100 already has unified memory and could do the same things as the HBCC in Vega? Not sure whether you could add an SSD to it, but at least the gaming stuff would be easy for them to add if they wanted. At the moment it's only accessible with CUDA, because big Pascal isn't used for gaming. The smaller Pascals probably don't support it, but maybe Volta?

Pascal unified memory is what you're searching for.
https://images.nvidia.com/content/pdf/tesla/whitepaper/pascal-architecture-whitepaper.pdf

Actually I was not aware of that and I'm quite glad to find out (as a dev I can appreciate how much this simplifies things), so thanks for the link!

From reading your reference, it seems like they are at the stage AMD used to be at before Vega. They have a 49-bit address space and a page fault can fetch data from DRAM, but they don't seem to have an interface to generic PCIe devices (e.g. to do the SSD trick). I'm sure whatever comes after Volta will have the ability to directly interact with any device. I was thinking they could even slot a Thunderbolt interface onto the card, making it possible to attach an external SSD so you could modify the amount of available cache by swapping out drives. But I'm getting quite off-topic now.
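
For anyone following along, the mechanism under discussion (Pascal unified memory and Vega's HBCC alike) boils down to demand paging: keep a small working set of pages resident in fast memory and fault the rest in from a bigger, slower pool on first touch. Here's a toy Python model of that idea; the page size, residency limit and LRU eviction are illustrative assumptions, not either vendor's actual policy.

```python
# Toy demand-paging model: a small "VRAM" page cache over a larger backing
# store, faulting pages in on first touch and evicting least-recently-used.
from collections import OrderedDict

PAGE = 4096            # bytes per page (illustrative)
RESIDENT_PAGES = 4     # deliberately tiny "VRAM" so evictions happen

class PagedBuffer:
    def __init__(self, backing: bytearray):
        self.backing = backing               # the big, slow pool (DRAM/SSD)
        self.resident = OrderedDict()        # page index -> bytes, in LRU order
        self.faults = 0

    def read(self, addr: int) -> int:
        page = addr // PAGE
        if page not in self.resident:        # page fault: bring the page in
            self.faults += 1
            if len(self.resident) >= RESIDENT_PAGES:
                self.resident.popitem(last=False)      # evict the LRU page
            start = page * PAGE
            self.resident[page] = self.backing[start:start + PAGE]
        self.resident.move_to_end(page)      # mark page most-recently-used
        return self.resident[page][addr % PAGE]

buf = PagedBuffer(bytearray(64 * PAGE))      # dataset 16x the "VRAM" size
for addr in range(0, 64 * PAGE, PAGE):       # one streaming pass over it all
    buf.read(addr)
print(buf.faults)                            # 64: every page faulted in once
```

The gaming pitch for HBCC is exactly this: only the pages actually touched need to live in HBM2, so the card can address a dataset much larger than its on-board memory.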

The thing is, once again AMD have pulled a rabbit out of their hat with a pro feature in Vega which nobody expected. It's a real pity that due to ****** software AMD's cards seem to be lagging when in fact they're one step ahead. I mean, forget about the gaming drivers and being bad in DX11 and just look at pure compute: Nvidia's CUDA software essentially locked that market in, when in fact AMD cards have had the TFLOPs and the superior, simplified programming model the whole time. It's just that they had buggy or no compiler support, libraries and tools (OpenCL, Caffe, etc.).

I really hope they invest whatever money is coming their way from Ryzen and the crypto sales boom in R&D across both hardware and software.
 
W00t, not long left now. I really hope they improve price for performance, otherwise it will be another damp squib show for AMD. They need another Ryzen-like showing!
 
What needs to happen is monitor spec parity between the two. Why isn't there a ***** to the wall ROG Freesync MG348Q that's identical to the PG348Q bar the sync tech?

That does put off a few people for some high-end monitors, especially since LFC is so darn important on Freesync, but many of those Freesync panels are capped at 75Hz.

Mmm!

 
Yeah, I've never been happy with VA panels after having a proper IPS one that's been calibrated. Never mind the lower resolutions.

All the monitors now still use DP 1.2. I'm hoping 1.3 or 1.4 ones show up soon, as that'll give us the bandwidth for 144Hz at 3440x1440 without issue or colour-depth compromises.

I'm surprised we've heard nothing about Freesync 2 since its announcement, and no monitors featuring it.
Well, Freesync is the GPU division's thing and they seem slow on actually telling us much lately. It is a shame, but a lot of AMD GPU tech does seem to follow that same pattern of announcing said tech and then going quiet :(
 