When is Vega 2 out?

Associate · Joined 7 Feb 2017 · Posts: 77
So I heard somewhere that Vega 2 (Navi) is being worked on by the same team that worked on Ryzen. That's promising considering how poor the Vega 64 is. I realise that all the buzz is around the 2080 Ti right now, but I want my GFX card for 3D work, and the current Vega 64 smashes the 1080 Ti when rendering in Blender.

So, any idea on the wait for Vega 2, or should I just not go there? I mean, a year on, a Vega 64 still underperforms in games to the level of a 1070 (in some instances).
 
Sorry guys for the confusion, but I don't own a Vega 64. I'm looking to buy a new graphics card and am trying to choose. From the current crop, when working in Blender, I was surprised to learn that the Vega 64 was nearly twice as fast as a 1080 Ti at rendering. However, after watching some videos by experts, reviewers, and owners of the card, it was explained that even today the drivers are still the cause of it lagging in FPS in modern games. So whilst I primarily want my GFX card for work, the work is game development, and as such the card should be great for gaming too.

With Vega 2/Navi/12 or whatever the next consumer Radeon iteration ends up being called, is it worth waiting for, or should I just buy the best now? Like most people, I don't want to drop £700 on a GFX card only to find three months later that the price has fallen to £400 because the latest thing has just been released.
 
Something else bothers me about the Vega 64 too: its power consumption. Under load, playing Battlefield 1, according to AnandTech a Vega 64 (on air) pulls 459 W, with the 1080 Ti coming in at 379 W. That's a difference of 80 W. 80 W over 12 hours is just under a kilowatt-hour. If a kWh of electricity costs 25p, and I run my card full pelt for 12 hours a day, then (rounding that up to 1 kWh a day) this works out at £1.75 per week. Translate that to 52 weeks per year and it's £91. I'd expect to keep my card for 2 years, so over the life of the card it would have cost me an extra £182 to run over a 1080 Ti.
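For anyone who wants to sanity-check the sums, here's a rough sketch in Python. The wattages are AnandTech's Battlefield 1 system figures; the 25p/kWh tariff and 12 h/day duty cycle are my own assumptions, not measurements:

```python
# Rough running-cost comparison: Vega 64 vs GTX 1080 Ti.
# Power figures are AnandTech's whole-system Battlefield 1 load numbers;
# the tariff and hours-per-day are assumptions for the sake of the example.

vega64_watts = 459       # whole-system draw, Vega 64 (air cooled)
gtx1080ti_watts = 379    # whole-system draw, GTX 1080 Ti
rate_per_kwh = 0.25      # assumed tariff, £ per kWh
hours_per_day = 12       # assumed full-load rendering time per day

extra_kwh_per_day = (vega64_watts - gtx1080ti_watts) / 1000 * hours_per_day
weekly = extra_kwh_per_day * 7 * rate_per_kwh
yearly = weekly * 52

print(f"extra energy per day: {extra_kwh_per_day:.2f} kWh")  # 0.96 kWh
print(f"extra cost per week:  £{weekly:.2f}")                # £1.68
print(f"extra cost per year:  £{yearly:.2f}")                # £87.36
print(f"over two years:       £{2 * yearly:.2f}")            # £174.72
# The figures in the post round the 0.96 kWh/day up to 1 kWh,
# which gives the slightly higher £1.75 / £91 / £182.
```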

And whilst graphics card prices go down, electricity prices are always going up!

Now this might seem unfair; after all, my computer will run idle at points throughout the day. But my PC is always on, so I've accounted for that. And no, I wouldn't play a game for a straight 12 hours a day, 365 days per year, but I do consider 12 hours of rendering time to be realistic. And I've not accounted for the fact that I would render at night, whilst asleep, using cheaper electricity on Economy 7.

I just wanted to show the stark reality of what an extra 80 W of power actually costs when your parents aren't paying the bills.
 
@peterwalkley, from your three schools of thought I'm not sure I fit any. You see, I've just bought a 1700X because OCUK had them on sale for a silly £150. Before this offer I was considering the first-gen Threadripper, I think it was the 1920X, for £325 or something like that. But I want to get into game development, and I didn't feel I was yet at the level where my ability demanded that number of cores and threads. And then the 1700X came up at £150 and decided it all for me.

It's the same with the GFX card. Sure, I could splash out on a 2080 Ti, but I'm not really at the level where I'd be taking advantage of its processing power, so I'm better off waiting until I am. By which time the price will have dropped, because something else will have been released.

So my school of thought is to buy a new 1080 Ti in October, after the 2080 Ti has been released, and hope the prices drop further. Then, when I want more performance, buy a second, used 1080 Ti and render using SLI. Once they're holding me back, I'd probably look at getting the latest 3080 Ti in 2020/21.
 
@Panos. I appreciate that your electricity supplier and specific variant of GFX card will affect costs and power usage; I was just drawing a broad example. The point being that if you save £200 buying a Vega 64 over a 1080 Ti now, in the end you still wind up paying that £200.

So am I right in understanding that, like AMD CPUs, the GPUs require a lot more voltage to eke out 2% more performance? Hence why super-overclocked cards like the Asus Strix are so power hungry?
 
Yeah, well why is it that the convention seems to be that you buy an nVidia card and it just works, but you buy a Radeon card and you have to tweak loads of settings to even make it work correctly? I mean, surely if the likes of Asus cannot build a card and set it up properly from the outset, what chance do us laymen have?
 
@Panos. You say my understanding is wrong, but I always thought that overclockers who go for a world record cool with liquid nitrogen because of the heat generated by the voltage they have to push through the CPU to get that extra 100 MHz.

@jigger. Unfortunately an APU doesn't have the core count I require for 3D work.

@Bloot, I just watched your video and Battlefield looks great. What PSU are you running?
 
@Nasher, hey, I'm no fanboi. I'll get whichever provides the best value. From a gaming perspective I'm running a 2K 2013 Asus PB278Q monitor with a max refresh rate of 75 Hz, so I don't care about the difference between 100 and 104 FPS. But I do care about render times.

I had a Radeon 7970. A beast of a card. Today it has too little VRAM, but I had to get rid of it because the default renderer in 3ds Max wouldn't let me use its power to render scenes. I had to switch to a GTX 780 Ti so I could get at those CUDA cores.
 
@ubersonic. My rate is 16.5p per kWh, which is 8.5p below what I stated. That works out at £1.155 per week, £60.06 per year, so £120.12 over two years. But you can doubt every inch of the equation. For example, on Monday I might run the rig at full pelt for just 8 hours, or maybe I run it for 14. Perhaps in two years' time electricity has gone up a further 2p per kWh (I wouldn't put it past them), and whilst I use the computer way too much, it's unlikely that it will be used for 730 consecutive days over two years.
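Plugging the lower tariff into the same sketch as before (same assumed 12 h/day, and rounding the 0.96 kWh/day up to 1 kWh, as in my original example) gives:

```python
# Same sums at the 16.5p/kWh tariff, keeping the round 1 kWh/day figure
# from the original example (the exact value is 0.96 kWh/day).
rate_per_kwh = 0.165
weekly = 1.0 * 7 * rate_per_kwh
print(f"£{weekly:.3f}/week, £{52 * weekly:.2f}/year, £{104 * weekly:.2f} over two years")
# -> £1.155/week, £60.06/year, £120.12 over two years
```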

My point was to show an extreme example, not to court controversy.

And when "over £1k" is an underestimate of what my PC cost once you factor in the water cooling, GFX water blocks, Seasonic Prime PSU, etc., you really do want to start looking at where you can save.

@LePhuronn. It wasn't until I questioned the community that it even occurred to me to consider a Radeon card. But whilst the example was in Blender, I'm actually using 3ds Max. On my previous Intel build I had to lose my Asus ROG 7970, since the default renderer wouldn't let me use the processing power of the GFX card, so I had to switch to the 780 Ti and its compatible CUDA cores.

I'm not an expert, and I think £600 is a chunk of change; £1,200 seems crazy if I look at a 2080 Ti. So fielding the community's opinions makes sense, and not just for me, but for those who may come to this thread in the future with a like-minded question. Sure, Google is great for finding things, but it's no use unless you know what to look for. The information from this community has helped me narrow my focus, and I'll look to post my findings in this thread.
 