Poll: ** The AMD VEGA Thread **

On or off the hype train?

  • (off) Train has derailed
    Votes: 207 (39.2%)
  • (on) Overcrowding, standing room only
    Votes: 100 (18.9%)
  • (never ever got on) Chinese escalator
    Votes: 221 (41.9%)

  • Total voters: 528
But at least gamers could buy them even if stock was low... I can see mining becoming a really big problem in the future; we might only be experiencing the beginning of the mining s**tstorm.

Anyone that really wanted one of those limited edition cards for their rig will be fugged lol...
Yup. There's no guarantee that mining will go away this year, next year, this decade, or ever...

I'm feeling really pessimistic about PC gaming going forwards. It will just be the domain of people who can afford to spend £1000s every year.
 
I suspect the future of computing is more specific hardware for more specific purposes, rather than things with a large amount of generality, much like Google's TPUs.

Although I assume MCM designs will have to be completely mainstream first, otherwise this approach would likely be far too expensive, as you'd need to design so many different chips, each in several sizes too.
Agreed. Nvidia is already facing heavy competition in deep learning from dedicated hardware offering much more performance per watt.

But for HPC use, architectures like Volta are still very powerful, and I suspect Nvidia already has plans for dedicated deep learning chips built around its Tensor cores.
 
Yup. There's no guarantee that mining will go away this year, next year, this decade, or ever...

I'm feeling really pessimistic about PC gaming going forwards. It will just be the domain of people who can afford to spend £1000s every year.

And consequently the graphics quality of games will stagnate if this lasts for years, nobody can afford the new cards, and no one finds a solution to the "problem"; there's not really any point in making nicer games if nobody can buy the more advanced cards... On the upside, they'll have to concentrate more on the gameplay and the story line.
 
This is the VEGA thread; why am I reading about what the mighty Nvidia is and isn't good at?

I come in here to read up on the latest Vega information or leaks, not Nvidia.
So stay on topic, thanks ;)
 
Did the Fiji Nano not match or exceed the R9 Fury in performance, and even match the Fury X in a lot of scenarios? All of this whilst having a lower TDP. The Vega Nano should be the same story; I'd expect it to be the full Vega 64, just with more aggressive undervolting and clocking to hit the optimal point on the perf/Watt curve.

Yes. The Fury Nano, if watercooled and not restricted by its tiny cooler, is a full-blown Fury X under there; it overclocks to 1100/550 but no more, as it's heavily power restricted.
The Fury X can do 1190/600 with +48mV, or 1237/625 with +96mV, on the AIO if the AMD UEFI BIOS is used (the stock BIOS struggles above 1075). I haven't tried the Fury X under custom watercooling, only the Nano (you can view my benchmarks in the other threads here).
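
The reason a Nano-style card can keep so much of the big chip's performance in a smaller power budget is that dynamic power scales roughly with voltage squared times clock, while performance scales roughly with clock alone. A toy Python model of that trade-off (the baseline wattage, voltages and clocks below are made-up illustrative numbers, not measured Fiji or Vega figures):

```python
# Toy perf/Watt model: dynamic power ~ C * V^2 * f, performance ~ f.
# Baseline figures are illustrative placeholders, not real card measurements.

def power_w(volts, mhz, base_v=1.20, base_mhz=1050, base_w=275):
    """Scale a made-up baseline board power by (V/V0)^2 * (f/f0)."""
    return base_w * (volts / base_v) ** 2 * (mhz / base_mhz)

for volts, mhz in [(1.20, 1050), (1.10, 1000), (1.00, 950)]:
    watts = power_w(volts, mhz)
    print(f"{mhz} MHz @ {volts:.2f} V -> ~{watts:.0f} W, perf/W ~ {mhz / watts:.2f} MHz/W")
```

Even with these invented numbers, the last point keeps about 90% of the clock for roughly two-thirds of the power, which is the sort of curve-riding an undervolted "Nano" bin would be doing.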
 
Did they do that because ASIC algorithms lowered the buy-in price for mining currencies, meaning more miners would bring down the value of the currency? I don't know much about mining, but I'm interested in why they would have done that; there must be some logical reason.
ASICs became about Chinese factories being able to manufacture designs faster and more cheaply than anyone else. The equipment and the use of ASICs become centralised in China (which also has some of the cheapest power in the world). That's kind of OK in a way, because China moving value into dollars is a large part of Bitcoin and why it's valued. However, having all the mining centred around ASIC production is also a possible negative; it's supposed to be a global currency equally accessible by all.

The use of GPUs allows a more complex protocol, I think; ETH handles contracts and escrow-type situations. I think the ideal crypto would be CPU-only, which is how Bitcoin originally was.

The news with ETH is that it may not use GPUs beyond this year. I'm not sure, but there is another system based around value rather than mining, reliant on account holders operating a voluntary network (it pays some of them). It doesn't then use the GPU in a competitive way to ensure security; instead it leans on the richest holders having the most interest in honest confirmation of trades, as in the sketch below.
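
That "richest holders confirm trades" idea is basically stake-weighted validation. A minimal Python sketch of how a validator might be chosen in proportion to holdings (the account names and balances are made up for illustration; real proof-of-stake schemes add slashing, randomness beacons and much more):

```python
import random

# Hypothetical account balances: the "stake" each holder has at risk.
stakes = {"alice": 500.0, "bob": 120.0, "carol": 40.0}

def pick_validator(stakes, rng=random):
    """Pick an account with probability proportional to its stake, so the
    holders with the most value at risk confirm the most blocks."""
    total = sum(stakes.values())
    threshold = rng.uniform(0, total)
    running = 0.0
    for account, stake in stakes.items():
        running += stake
        if threshold <= running:
            return account
    return account  # floating-point edge case: fall back to the last account

print(pick_validator(stakes))  # "alice" roughly 500/660 of the time
```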

If the 480 wasn't being used for ETH, wouldn't the biggest competitor to Vega be a crossfire of 480s, or is that really not effective? Maybe that's a reasonable prediction for 2018, but I imagine they'll refine Vega enough to keep it ahead of the tide.

I can see mining becoming a really big problem in the future; we might only be experiencing the beginning of the mining ****storm
Free markets tend towards efficiency. ASICs cut out the waste in the BTC process. ETH has found some purpose in a different direction, but they likely won't continue with a more expensive transaction process than they have to.
Or another coin will be faster, cheaper and simpler, and maybe replace ETH.

I think AI swamping GPU supply is more likely. It's new, it's possibly very useful and unique, and industries with trillions in revenue can justify diverting GPU chips to AI if it returns gains to their own revenue.
 
Oh yeah, no, I understood what you meant; there will always be those who won't change brand (maybe one day) because Intel is all they've ever heard of for the past decade.

It wasn't really about you. There are some who inject the worst possible anti-AMD spin into everything I say and would start arguing that I'd claimed AMD won't sell anything; other people then mistake their claims for what I actually said, and it pointlessly goes around for three pages until I drop it, even though I never claimed anything of the sort. So I was just pre-empting that.
 
And consequently the graphics quality of games will stagnate if it lasts for years and nobody can afford the new cards, and no one finds a solution to this "problem", not really any point making nicer games if nobody can buy the more advanced cards... On the upside they will have to concentrate more on the gameplay and the story line.

Have you not been paying attention for the last 5 years? Graphics have stagnated; you can thank consoles for it. This is the reason why 4-5 year old GPUs can still push mostly high settings. The only reason most of /us/ (the enthusiasts) are upgrading is that we're not doing 1080p60 anymore; we're pushing higher resolutions and higher refresh rates, which is "artificially" increasing the power demand. By that I mean the games haven't really changed, it's /us/ who are forcing the change.
 
Have you not been paying attention for the last 5 years? Graphics have stagnated; you can thank consoles for it. This is the reason why 4-5 year old GPUs can still push mostly high settings. The only reason most of /us/ (the enthusiasts) are upgrading is that we're not doing 1080p60 anymore; we're pushing higher resolutions and higher refresh rates, which is "artificially" increasing the power demand. By that I mean the games haven't really changed, it's /us/ who are forcing the change.

Consoles, and the fact that Nvidia and Intel have had such dominance that competition isn't there. Gamers haven't helped matters either, accepting whatever is put on offer at whatever inflated price the companies settle on. I'm expecting my new 1080 to arrive today, but if it weren't for Elite in VR being too demanding for an R9 390 I wouldn't have seen much reason to upgrade at all for the foreseeable future.
 
Have you not been paying attention for the last 5 years? Graphics have stagnated; you can thank consoles for it. This is the reason why 4-5 year old GPUs can still push mostly high settings. The only reason most of /us/ (the enthusiasts) are upgrading is that we're not doing 1080p60 anymore; we're pushing higher resolutions and higher refresh rates, which is "artificially" increasing the power demand. By that I mean the games haven't really changed, it's /us/ who are forcing the change.
Does make you wonder how much performance is locked up in products; Nvidia dropped a driver with 3x the performance for a Titan in whatever program it was.
 
Does make you wonder how much performance is locked up in products; Nvidia dropped a driver with 3x the performance for a Titan in whatever program it was.
That was just Nvidia disabling that feature/performance on the card to protect their 'pro' card sales. AMD forced their hand into enabling features they'd disabled.
 
Does make you wonder how much performance is locked up in products; Nvidia dropped a driver with 3x the performance for a Titan in whatever program it was.

Well, here is the rub :D Kepler, for instance, supports a second DMA engine in the design (and some of the professional cards actually have it), and with minimal changes (in the hardware design, not something they could implement via a driver fix) it could have supported a full async configuration with multiple command queues for graphics and compute plus memory ops, etc., running simultaneously with limited need for context switching or pre-emption. Though on the flip side, by the time anything actually takes advantage of those kinds of features, Kepler would have been long history on the performance front anyway.
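
For a feel of why extra copy (DMA) queues matter, here's a toy timeline model in Python; it isn't real GPU code, and the per-frame stage timings are invented, but it shows how overlapping uploads/downloads with compute collapses the per-frame cost to the slowest stage instead of the sum of all three:

```python
# Toy pipelining model: one queue doing everything vs. copy engines running
# alongside compute. All timings are made-up milliseconds, not measurements.

upload, compute, download = 2.0, 5.0, 2.0   # hypothetical per-frame stage costs (ms)
frames = 100

# Serial: a single queue runs upload -> compute -> download for each frame in turn.
serial_total = frames * (upload + compute + download)

# Overlapped: with dedicated upload/download DMA engines, the copies for the next
# and previous frames run under the current frame's compute, so the steady-state
# cost per frame is just the slowest stage, plus a pipeline fill/drain at the ends.
bottleneck = max(upload, compute, download)
overlapped_total = upload + frames * bottleneck + download

print(f"serial:     {serial_total:.0f} ms")      # 900 ms with these numbers
print(f"overlapped: {overlapped_total:.0f} ms")  # 504 ms with these numbers
```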
 