• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Poll: ** The AMD VEGA Thread **

On or off the hype train?

  • (off) Train has derailed: 207 votes (39.2%)
  • (on) Overcrowding, standing room only: 100 votes (18.9%)
  • (never ever got on) Chinese escalator: 221 votes (41.9%)

  Total voters: 528
Status
Not open for further replies.
Same old rubbish about drivers, even though that has been proven wrong by AMD themselves.

Yeah mate, whatever. I suggest you start reading the tech sites which showed the features not working and voltage scaling not working.
Consider the evidence and think about it logically - the only evidence we have at the moment is:

1) Vega is no better than Fury at the same clock speed.
2) Vega can clock higher than Fury.
3) Vega has huge power consumption and runs very hot as a result.

Is it really logical to assume that a couple of months of driver development will bring big enough increases to performance and big enough decreases to power consumption to make Vega better than a 1080?

Your first point actually points to the driver issues. Vega has tons of upgrades even over Polaris, and yet it is still not faster than Fury... because it is basically running on the same modified Fury drivers, as I said a few times.
Tiled rendering, one of Maxwell's main energy-saving features, is inactive in Vega.
AMD's AVFS (adaptive voltage and frequency scaling) is inactive in Vega, hence the huge drops in power consumption when undervolting (Gamersnexus dropped from 1200mV to 1100mV and saw 87W less power consumption).
No, it won't be better on power consumption/performance than the 1080; it will probably be worse than the 1080 Ti, but not by this much.

Also take into consideration that every new architecture needs its drivers to mature. Look at Ryzen: it started as a "bad for games" CPU, and within a few months it gained double-digit FPS in many games.
 
This is so sad it's almost comical. The Vega hype train and RTG have derailed so badly that even NVIDIA is promoting them at their own events. This is most definitely the Bulldozer of GPUs.
 
Tiled rendering, one of Maxwell's main energy-saving features, is inactive in Vega.

I don't think this is going to be the saviour you and some others think. nVidia brute-force it; AMD went for a more elegant approach, but getting anything like the same gains from it appears to require the application developer to actively do something, rather than it being something that can be indiscriminately turned fully on in the driver.
 
So 1080 performance with FreeSync and the possibility (however small) that it will improve with driver updates? Priced well, I'm happy with that. PSU requirements and heat output are the issues for me, but not a deal breaker.

R290 may finally go...
 
Why do certain people think that AMD are still using "Fiji drivers" anyhow?

Since when do you not have time/resources to develop your driver in tandem with developing your silicon?

Do people believe that AMD is so short-staffed that they can only afford to work on the driver a month before release? And that until then they've been scraping by with an old driver for a different architecture/product?

Seems utterly amazing to me that people believe this.
 
I don't actually find much time for gaming these days. So although I am looking at 1080s at the moment, the truth is I should only be looking at GPUs in the 1070 class.

But I need a monitor upgrade to 1440p (144Hz G-Sync/FreeSync) - is a 1070 REALLY a satisfactory card at 1440p?

Maybe not everything on ultra, but I generally like almost-ultra, with maybe AA settings turned down to something like 2x MSAA.
 
I'm using a 1070 with 1440p 144Hz G-Sync. There are 1-2 demanding games I don't play, so I'm not sure how it holds up in those, but everything else does fine. The only game I really had to turn settings down in was Deus Ex: Mankind Divided, and that is for more complicated reasons, though performance is part of it.

That said, with the next round of games I'll probably be looking to upgrade - I think some of the next-generation titles will probably relegate the 1070 to 1080p unless you want to turn settings down.
 
Yeah mate, whatever. I suggest you start reading the tech sites which showed the features not working and voltage scaling not working.


Your first point actually points to the driver issues. Vega has tons of upgrades even over Polaris, and yet it is still not faster than Fury... because it is basically running on the same modified Fury drivers, as I said a few times.
Where is the evidence that Vega is running Fiji drivers? If Vega really is running Fiji drivers then it must be an essentially identical architecture, so why expect any performance difference at the same clocks? And why expect that in a few weeks they can magic up a fully working Vega driver when they have had at least 18 months to get one working? And why are the actual file dates of the Vega driver newer than those of the latest Radeon drivers?


Tiled rendering, one of Maxwell's main energy-saving features, is inactive in Vega.

It's not necessarily the main energy-saving feature of Maxwell. Nvidia have not released any official metrics to know that. Maxwell has a large number of improvements that can all contribute energy savings on a similar scale to TBR.

Secondly, there is absolutely zero evidence that TBR is not working on Vega. The one known application that potentially detects this uses a hack that is only known to work with Maxwell and Pascal GPUs; it is not known whether it would work on any other GPU. There could be numerous reasons why it is not detecting artifacts of a TBR, or Vega may not be using a TBR path for that specific piece of code. And another option is, as Roff points out, that AMD's TBR mode requires explicit coding within the application, much like its new geometry shaders. The fact is there is no evidence either way that a new driver will magically switch on a missing feature.

AMD's AVFS (adaptive voltage and frequency scaling) is inactive in Vega, hence the huge drops in power consumption when undervolting (Gamersnexus dropped from 1200mV to 1100mV and saw 87W less power consumption).
No, it won't be better on power consumption/performance than the 1080; it will probably be worse than the 1080 Ti, but not by this much.

You can also undervolt Pascal and Polaris cards to get massive energy savings. The voltages are set at those levels largely due to yield issues and the like. Undervolting Vega and seeing a power drop means nothing.
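For context on why undervolting drops power on any card: to first order, dynamic power scales with voltage squared times frequency, so the 1200mV to 1100mV drop mentioned above cuts dynamic power by roughly 16% before any frequency or leakage effects. A minimal sketch of that first-order model (an illustrative textbook approximation, not AMD's actual power management):

```python
# Rough dynamic-power model: P_dyn is proportional to C * V^2 * f.
# Illustrative only - real GPUs also have static leakage, so this is
# an approximation of how much an undervolt alone can save.

def dynamic_power_ratio(v_new_mv, v_old_mv, f_new=1.0, f_old=1.0):
    """Ratio of dynamic power after a voltage/frequency change."""
    return (v_new_mv / v_old_mv) ** 2 * (f_new / f_old)

# The 1200mV -> 1100mV undervolt from the Gamersnexus test:
ratio = dynamic_power_ratio(1100, 1200)
print(f"dynamic power falls to {ratio:.1%} of the original")  # ~84%
```

Note this applies equally to Pascal and Polaris, which is the point being made: a power drop from undervolting does not by itself prove any feature is inactive.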

Also take into consideration that every new architecture needs its drivers to mature. Look at Ryzen: it started as a "bad for games" CPU, and within a few months it gained double-digit FPS in many games.


AMD have had 18 months or more to create Vega drivers; an extra month or two is not going to change things dramatically.
 
I need to stop looking at the 1080s as I'm getting tempted to buy one.

Me too. I wonder how good/bad the cooling and noise is on that £450 OcUK Value brand 1080. I was going to buy a 1070. I even ordered one (from somewhere else because it was £350 there), but surprise surprise they don't actually have any and delivery is "maybe some time in August, possibly" and price isn't actually the £350 shown (it's £450 now). So that's not looking good. If the cooling and noise is decent on that OcUK 1080, I might well get one. I want a new toy now and Vega has clearly failed to cut the mustard. £450 is a lot of money for a graphics card, but I don't really need to care any more. Maybe I should just do it.
 
And another option is, as Roff points out, that AMD's TBR mode requires explicit coding within the application, much like its new geometry shaders.

AFAIK from what AMD have said there are basically 3 modes for TBR: the traditional method; a "compatibility" deferred mode, which I think can be forced on per application at driver level but doesn't have the full performance benefits; and the fully enabled TBR mode, which requires the application developer to implement draw-stream binning, etc.
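To make those three modes concrete, here is a purely hypothetical sketch of the decision logic being described. The names and structure are mine for illustration only, not AMD's driver API:

```python
from enum import Enum

class TbrMode(Enum):
    TRADITIONAL = 0     # classic immediate-mode rasterisation, always available
    COMPATIBILITY = 1   # deferred mode, forceable per-app at driver level
    FULL = 2            # full draw-stream binning, needs app-side support

def select_mode(driver_forced: bool, app_implements_dsbr: bool) -> TbrMode:
    """Hypothetical selection mirroring the description above: the full
    benefits only arrive when the application opts in; the driver can
    only force the partial compatibility path."""
    if app_implements_dsbr:
        return TbrMode.FULL
    if driver_forced:
        return TbrMode.COMPATIBILITY
    return TbrMode.TRADITIONAL
```

Under this reading, a generic driver update could only ever flip games into the compatibility path, which is why the "drivers will fix it" argument has limits.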
 
Me too. I wonder how good/bad the cooling and noise is on that £450 OcUK Value brand 1080. I was going to buy a 1070. I even ordered one (from somewhere else because it was £350 there), but surprise surprise they don't actually have any and delivery is "maybe some time in August, possibly" and price isn't actually the £350 shown (it's £450 now). So that's not looking good. If the cooling and noise is decent on that OcUK 1080, I might well get one. I want a new toy now and Vega has clearly failed to cut the mustard. £450 is a lot of money for a graphics card, but I don't really need to care any more. Maybe I should just do it.

My issue is dumping money, especially at these inflated prices, on an Nvidia product in the twilight of its release cycle. Surely Volta is now only 6-9 months away.

Give me a good deal and I might be tempted.
 
It's called desperation :p

Good to know who the AMD faithful are tho. I prefer my information from impartial sources.

Anyway, the whole point of masking the PCs was obviously so you wouldn't know which is which. Not that you wouldn't know it was Vega against a 1080. The idea that you'd have an unknown AMD card vs an unknown nVidia card, and set up a demonstration which told users literally nothing... what the heck would be the point?

Comparing G-Sync and FreeSync, presumably. What else could AMD do when Vega cards are maybe as good as a 1080 but use more than twice as much power, generate far more heat, are noisier and can't sustain advertised clocks unless undervolted?
 
Your first point actually points to the driver issues. Vega has tons of upgrades even over Polaris, and yet it is still not faster than Fury... because it is basically running on the same modified Fury drivers, as I said a few times.

Saying it doesn't make it true, even if you say it a few times. You're arguing that AMD would release a product after spending well over a year failing to make any drivers for it, but that they only need another couple of weeks to finish those drivers... and that it works with drivers for a different GPU anyway.

Tiled rendering, one of Maxwell's main energy-saving features, is inactive in Vega.

Speculation. All we know is that software written specifically to show TBR working on one specific GPU doesn't show it working on a completely different GPU.

AMD's AVFS (adaptive voltage and frequency scaling) is inactive in Vega, hence the huge drops in power consumption when undervolting (Gamersnexus dropped from 1200mV to 1100mV and saw 87W less power consumption). No, it won't be better on power consumption/performance than the 1080; it will probably be worse than the 1080 Ti, but not by this much.

Tests clearly show automatic frequency scaling is active, since Vega fails to hold 1600MHz at stock. It's automatically switching between different GPU frequencies - that is automatic frequency scaling.

Gamersnexus' tests showed that, even with just a handful of pieces of software tested, they found one that required a minimum of 1120mV to the GPU to be stable. So even if Vega did automatically undervolt, it probably wouldn't do so very much, if at all, below 1200mV.

Also take into consideration that every new architecture needs its drivers to mature. Look at Ryzen: it started as a "bad for games" CPU, and within a few months it gained double-digit FPS in many games.

If AMD aren't able to make properly working drivers before releasing a product, they need to fix that. "This new kit isn't very good but you should buy it anyway and hope that the poor performance is due to bad drivers" isn't a convincing sales argument.
 
I guess a bit of blind optimism, but it's not as if AMD's last few cards haven't benefited quite reasonably from newer driver releases.
But are you prepared to wait months/years after a card's initial release to reap such driver benefits? Me, call me crazy, but I would like performance out of the box.
 
[Images: three benchmark charts showing Vega results in professional workloads]


Doing pretty good in the pro segment, but oh no, them drivers can't do squat for it.
 
It's hardly completely useless right now though. For me, I think buying anything new is just asking for trouble these days.
Computer news sites almost seem as bad as mainstream media for trying to shoehorn drama into things.
 
AMD Radeon RX Vega Compared Against a GeForce GTX 1080 in Budapest – Almost Similar Performance in Battlefield 1, Launches in 2 Weeks

Anyways, moving on, there was one system that faced a little hiccup and was performing worse. There's no way to tell which system that was, but we know that a single GeForce GTX 1080 does over 60 FPS (average) at 4K resolution with Ultra settings. The tested resolution was not as high as 4K, so it's possible the GTX 1080 was performing even better. Plus, AMD not publicly showing any FPS counter and capping the max FPS with syncing suggests there was possibly a problem with the Radeon RX Vega system against the GeForce part.

AMD reps told the public that the AMD system has a $300 US price difference, which implies two things. First, AMD puts the average price of a FreeSync monitor at $200 US cheaper than a comparable G-Sync monitor. Then, if we take that $200 out of the $300 AMD quoted, it means the Radeon RX Vega could be as much as $100 US cheaper than the GeForce GTX 1080 at launch, which should be a good deal. But they haven't said anything beyond that: no performance numbers in other titles, no power consumption figures and, most importantly, no word on what clocks Vega was running at, which seems a little sad.

All this tells us is that AMD is leveraging their new graphics card using a technology that's much older. It almost seems like Vega will be great only if you plan to purchase a FreeSync monitor, because otherwise the features touted for the RX Vega cards aren't doing much to help it in the performance sector against a rival card that is more than a year old and available in several custom flavors that further boost performance. We hope that AMD shares more details of the Radeon RX Vega cards in the coming weeks.

http://wccftech.com/amd-radeon-rx-vega-gtx-1080-battlefield-1-comparison/
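The pricing arithmetic in the quoted article is simple but worth laying out, since it is the only hard number AMD gave. Figures are the article's, not confirmed prices:

```python
# AMD's quoted $300 system-price gap, minus the assumed FreeSync vs
# G-Sync monitor saving, leaves the implied GPU price gap.

total_bundle_gap = 300   # AMD's quoted system difference (USD)
monitor_gap = 200        # article's assumed FreeSync vs G-Sync saving (USD)

implied_gpu_gap = total_bundle_gap - monitor_gap
print(implied_gpu_gap)   # 100 -> RX Vega up to ~$100 cheaper than a GTX 1080
```

The conclusion is only as good as the $200 monitor-gap assumption; a smaller monitor gap would mean a larger GPU discount, and vice versa.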


 