
Poll: ** The AMD VEGA Thread **

On or off the hype train?

  • (off) Train has derailed

    Votes: 207 39.2%
  • (on) Overcrowding, standing room only

    Votes: 100 18.9%
  • (never ever got on) Chinese escalator

    Votes: 221 41.9%

  • Total voters
    528
Status
Not open for further replies.
The wait is killing me, why has nobody invented a coma drug you can take to sleep for a week or whatever lol.



What you're saying isn't factually incorrect, but context is important. Firstly, it only applies to Polaris (RX 400/500) cards; downclocking on Hawaii/Fiji/Vega will reduce Ether hashrates. Secondly, it's generally more profitable to "dual mine" when mining Ethereum, i.e. mining a second altcoin (such as Decred) at the same time, and if you're doing that you will lose performance by downclocking an RX 400/500.
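To put rough numbers on that trade-off, here's a minimal Python sketch. Every figure in it (hashrates, power draws, payout rates) is invented purely for illustration, not a real benchmark; plug in your own measurements before drawing conclusions:

```python
# Illustrative comparison of two mining strategies on a hypothetical card.
# All numbers below are made up for the example, not measured values.

def daily_profit(hashrates_mh, usd_per_mh_day, watts, usd_per_kwh=0.12):
    """Daily revenue from each (coin -> MH/s) minus 24h electricity cost."""
    revenue = sum(hashrates_mh[coin] * usd_per_mh_day[coin] for coin in hashrates_mh)
    power_cost = watts / 1000 * 24 * usd_per_kwh
    return revenue - power_cost

# Hypothetical payout rates (USD per MH/s per day).
rates = {"eth": 0.10, "dcr": 0.002}

# Strategy A: core downclocked, ETH-only (the Polaris-style behaviour).
downclocked_solo = daily_profit({"eth": 29.0}, rates, watts=95)

# Strategy B: full clocks, dual mining ETH + DCR at the same time.
full_clock_dual = daily_profit({"eth": 29.0, "dcr": 900.0}, rates, watts=140)

print(f"downclocked ETH-only: ${downclocked_solo:.2f}/day")
print(f"full-clock dual mine: ${full_clock_dual:.2f}/day")
```

With these made-up inputs the dual-mine strategy comes out ahead despite the higher power bill, which is the point the post is making.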



I remember being so hyped when I got a 50MHz overclock for the first time. Obviously it was a bigger % of the original clockspeed back then, how times change :D
Can't say I have noticed any difference mining Ubiq/Sia. I'll try it tonight.
 
Anyone else worried that Vega isn't even out yet and we're already comparing it with the 1080? NV will release Volta and this will become another 480 situation, where it's decent value but a gen behind the top end. I think it's great for the mid-market consumer, but it's not what I'm after. (Assuming the rumours are right.)

AMD are a generation behind Nvidia already, however there's a lot of technology built into Vega that, once enabled, could potentially give it the boost it needs to stay in contention. Games running async compute can give it anywhere from 10% to 20% uplift, good use of low-level APIs will eliminate any CPU threading issues (this will only get better over time), and rapid packed math also looks promising. Of course, all of this requires the co-operation of game developers, but if AMD can get this tech built into the major engines (Unity, Frostbite, UE and id Tech 6) then things will start to look different.


If I had to make a semi-educated guess, in the best-case scenario AMD's Vega 64 could see a 70% boost in performance if games were properly coded for the hardware. IMO low-level APIs are a game changer; you only have to look at how well the new Xbox performs, and that's hampered by the use of Jaguar CPU cores.
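As a rough sanity check, independent uplifts compound multiplicatively, so a best-case figure in that ballpark isn't crazy. Here's a quick sketch; the individual percentages are purely hypothetical, chosen only to show how the compounding works:

```python
from functools import reduce

def compound_uplift(gains):
    """Combine independent fractional gains multiplicatively,
    e.g. [0.20, 0.15, 0.25] -> a single overall speed-up factor."""
    return reduce(lambda acc, g: acc * (1 + g), gains, 1.0)

# Hypothetical per-feature gains: async compute, low-level API
# threading wins, and rapid packed math (FP16). Not measured numbers.
gains = [0.20, 0.15, 0.25]
total = compound_uplift(gains)
print(f"combined uplift: {total - 1:.0%}")
```

Three modest gains of 20%, 15% and 25% already multiply out to a combined uplift of roughly 72%, which is why a "70% in the best case" guess doesn't require any single feature to be a miracle.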
 
The wait is killing me, why has nobody invented a coma drug you can take to sleep for a week or whatever lol.

Coma drugs do exist. Ever heard the term "medically induced coma"? I don't think the NHS will hook you up on the basis of waiting for a GPU lol. Maybe in Mexico, who knows.
 
Nobody should jump on any train until reviews are in, unless they have money to gamble and lose.

The idea that AMD are still sandbagging is getting more and more absurd by the day.

There's also zero evidence that Vega will work better with Ryzen at this time.

Unfortunately some people try to find reasons for greatness where none exists. The additional optimising for Vega on Ryzen started out as no more than someone asking "Will AMD CPUs and GPUs work better together than in a mix?". Over the following weeks, theories were posted, which then became facts when reposted by the next man in the Delusions-R-Us chain. It's nothing new, except now we have websites that exist by spinning anything that can be made to sound like a possibility into facts, making it worse than ever.
 
You think running a GPU at 100 percent load for months on end doesn't stress it? Especially when it's stuffed in amongst a load of other cards and wasn't designed for that kind of life.

Heat cycles are only one aspect of wear and tear on electronics.


End this mindless cruelty against gpus! Remember, a gpu is not just for Christmas, it’s for life (or until electromigration gets it).

I can see a RSPCG forming soon to combat these cruel miners. ;)
 
AMD are a generation behind Nvidia already, however there's a lot of technology built into Vega that, once enabled, could potentially give it the boost it needs to stay in contention. Games running async compute can give it anywhere from 10% to 20% uplift, good use of low-level APIs will eliminate any CPU threading issues (this will only get better over time), and rapid packed math also looks promising. Of course, all of this requires the co-operation of game developers, but if AMD can get this tech built into the major engines (Unity, Frostbite, UE and id Tech 6) then things will start to look different.


If I had to make a semi-educated guess, in the best-case scenario AMD's Vega 64 could see a 70% boost in performance if games were properly coded for the hardware. IMO low-level APIs are a game changer; you only have to look at how well the new Xbox performs, and that's hampered by the use of Jaguar CPU cores.

There are many different use cases for which Vega is a good option or a poor option.

But I am running a 970. Having upgraded from a 3570K to a 1700 back in April, my 9-year-old monitor that was poor as a gaming monitor is now terrible, as the platform upgrade has pushed the PC on and the biggest bottleneck is now the fps the monitor can display.

In BF1, in order to stop the horrendous jank and tearing I have to cap fps in game to 60. (And it's still not great)

Now in my scenario it's either a £500 1080 or a Vega 64. BOTH, if you want to argue, are a generation behind. At least Vega is new, at the beginning of its support life cycle, and has more than just performance going for it. It has next-gen hardware such as HBCC, FP16 and DSBR.

So what to do is the question? Spend £500 on a 2016 GPU in the 1080, so at least with a G-Sync monitor I'm in a position to upgrade to Volta, or go with a 2017 GPU in Vega and have all the benefits of the architecture, but then be a generation in performance behind once Volta hits (and stuck with a FreeSync monitor)?

If the 1080 had dropped in price as it should have by now it wouldn't be so hard a choice.

The mistake I made was not upgrading to a 1070 when they dropped to £330 (Back in Sep-Oct 16 I think).

I think the reason I didn't pull the trigger was that I didn't think the 1070 was up to 1440p gaming, which is the monitor resolution I want to upgrade to.

Goes some way to prove that to get the most for your money, it's sometimes best to buy when the GPU has just been released.

---

The same can definitely be said about those buying 1080 Tis NOW: spending top dollar on a DX11-class GPU with a 2016 feature set. Volta will be here soon, so what's true for Vega is true for the 1080 Ti.
 
The wait is killing me, why has nobody invented a coma drug you can take to sleep for a week or whatever lol.
...snip...
Me too. Luckily I have a stag do away this weekend and I'm off Monday, so I should be back in the land of the living just in time for the long, long-awaited reviews :).
 
So, bearing in mind how a company like AMD works and the GPU development cycles, does the delay in Vega have any knock-on effect for Navi?

Will the engineering team working on Navi have been continuing their work as an autonomous group, therefore leaving Navi on track for its original release window?
 
So, bearing in mind how a company like AMD works and the GPU development cycles, does the delay in Vega have any knock-on effect for Navi?

Will the engineering team working on Navi have been continuing their work as an autonomous group, therefore leaving Navi on track for its original release window?

I'm sure I saw an AMD slide saying RTG now has two teams working on alternate product releases, so the other team, working on Navi, should still be aiming for their original timetable, but no one knows if there have been any delays on that project. Plus AMD may decide they have to get a certain amount of life and sales out of Vega.
 
Surely this is now the longest 'pre-launch' in history? I can't remember anything ever taking this long to see the light of day in the GPU world.
 