
When is Vega 2 out?

Yeah, well why is it that the convention seems to be that you buy an Nvidia card and it just works, but you buy a Radeon card and you have to tweak loads of settings to even make it work correctly? I mean, surely if the likes of Asus can't build a card and set it up straight from the outset, what chance do laymen like us have?

Took me about 15 mins.

You can do everything inside the driver GUI pretty easily, because unlike Nvidia's, AMD's drivers don't look like they belong in Windows 98.
 
@Nasher hey, I'm no fanboy. I'll get whichever provides the best value. From a gaming perspective I'm running a 2K 2013 Asus PB278Q monitor with a max refresh rate of 75 Hz, so I don't care about the difference between 100 and 104 FPS. But I do care about render times.

I had a Radeon 7970. A beast of a card. Today it has too little VRAM, but back then I had to get rid of it because the default renderer in 3DS Max wouldn't let me use its power to render scenes. I had to switch to a GTX 780 Ti so I could get at those CUDA cores.
 
Something else bothers me about the Vega 64 too: its power consumption. Under load, playing Battlefield 1, a Vega 64 (on air) draws 459 W according to AnandTech, with the 1080 Ti coming in at 379 W. That's a difference of 80 W. 80 W over 12 hours is just under a kilowatt-hour. If a kWh of electricity costs 25p, and I run my card full pelt for 12 hours a day, then that's about £1.75 per week. Over 52 weeks that's £91 a year. I'd expect to keep my card for two years, so over the life of the card it would have cost me an extra £182 to run compared with a 1080 Ti.
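For anyone who wants to plug in their own tariff and usage, the arithmetic above boils down to one line. This is just a sketch of the calculation in the post (the function name is my own); the 80 W delta and 25p/kWh rate are the figures quoted above.

```python
def extra_cost_gbp(extra_watts, hours_per_day, pence_per_kwh, days):
    """Extra electricity cost, in pounds, of a card drawing extra_watts more."""
    kwh = extra_watts / 1000 * hours_per_day * days  # total extra energy used
    return kwh * pence_per_kwh / 100                 # pence -> pounds

# 80 W extra, 12 h/day, 25p per kWh, two years (~730 days)
print(round(extra_cost_gbp(80, 12, 25, 730), 2))  # 175.2
```

The exact figure comes out at £175.20; the £182 in the post comes from rounding the daily draw up to a full kWh.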

An extra 80 W of power consumption isn't something that should really concern anyone running a computer worth over £1k. Everything you said after the first sentence is a textbook description of why power consumption is practically irrelevant in high-end computing...

*EDIT*

Also if you're paying 25p per kWh then you need to shoot the family member who agreed to that (even if it's yourself) as that's double the UK average.
 
@jiw2033 : let's break it down, shall we?

You say the Vega 64's rendering prowess is attractive to you as your significant use for it will be rendering.
You express concern about power draw when gaming, the community here have told you how to tune Vega 64 to alleviate your concerns.

So what's the problem? It'll blitz your work requirements, you can reduce the power consumption, sounds like there's no problem to be had.
 
@ubersonic My rate is 16.5p per kWh, which is 8.5p below what I stated. That works out at about £1.16 per week, £60.06 per year, so £120.12 over two years. But you can doubt every inch of the equation. For example, on Monday I might run the rig at full pelt for just 8 hours; or maybe I run it for 14. Perhaps in two years' time electricity has gone up a further 2p per kWh (I wouldn't put it past them), and whilst I use the computer way too much, it's unlikely it will be used for 730 consecutive days over two years.

My point was to show an extreme example. Not court controversy.

And whilst my PC being over £1k is an underestimate once you factor in the water cooling, GPU water blocks, Seasonic Prime PSU, etc., you really do want to start looking at where you can save.

@LePhuronn It wasn't until I questioned the community that it even occurred to me to consider a Radeon card. But whilst the example was in Blender, I'm actually using 3DS Max. On my previous Intel build I had to lose my Asus ROG 7970, since the default renderer wouldn't let me use the processing power of the graphics card; I had to switch to the 780 Ti so I could use its CUDA cores.

I'm not an expert, and I think £600 is a chunk of change; £1200 seems crazy if I look at a 2080 Ti. So fielding the community's opinions makes sense, and not just for me, but for those who may come to this thread in the future with a like-minded question. Sure, Google is great for finding things, but it's useless unless you know what to look for. The information from this community has helped me narrow my focus, and I'll look to include my findings in this thread.
 

I'm not a 3DS Max user, so this is not a recommendation, but are you aware of this:

https://pro.radeon.com/en/software/prorender/3ds-max/
 
According to this site, Nvidia have nothing on the market apart from the Titan Xp that is faster in this test. I would assume the Titan V is also faster, but both of those cards are way out of your price range. Vega looks a steal when you see this chart. The RX 580 is not too far behind the 1080 Ti either, so AMD must have pretty good support going on in this app with ProRender.

https://techgage.com/article/performance-testing-amds-radeon-prorender-in-autodesk-3ds-max/
 