
Poll: ** The AMD VEGA Thread **

On or off the hype train?

  • (off) Train has derailed

    Votes: 207 39.2%
  • (on) Overcrowding, standing room only

    Votes: 100 18.9%
  • (never ever got on) Chinese escalator

    Votes: 221 41.9%

  • Total voters
    528
Status
Not open for further replies.
Associate
Joined
8 Jul 2013
Posts
2,089
Location
Middle age travellers site
The meter is a Sartano, and it's not the 1080 Ti that's the big power draw; CPU-only stress tests take most of the power. I'll have a look and see if I missed something when I plugged it in. I don't think I have, but I hope I'm wrong. The power draw for something like the screen fits, though, and the efficiency is also around 90-92%, which fits the platinum PSU I'm using.

What is your cpu voltage @ ?
 
Soldato
Joined
10 Oct 2012
Posts
4,455
Location
Denmark
What is your cpu voltage @ ?
With the 4.4 GHz overclock I have to push around 1.28 V; anything less and it gets unstable in some applications. I just removed the OC to do some testing, and running OCCT Linpack I pull 400 watts from the wall. Remove the 40 watt router (measured) and the 30 watt monitor (again measured) and we are around 330 watts, which is way more than it should be. It makes no sense to me.
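For what it's worth, the wall-reading arithmetic can be sanity-checked with a quick sketch. The 92% efficiency figure is just the platinum-PSU estimate mentioned earlier in the thread, not a measured value:

```python
# Rough sanity check of the wall-power numbers quoted above.
wall_total = 400          # W measured at the wall during OCCT Linpack
router = 40               # W, measured separately
monitor = 30              # W, measured separately
psu_efficiency = 0.92     # assumed, typical for an 80 Plus Platinum unit at this load

pc_ac_draw = wall_total - router - monitor    # AC power going into the PSU alone
pc_dc_draw = pc_ac_draw * psu_efficiency      # power actually delivered to components

print(f"PC AC draw:  {pc_ac_draw} W")         # 330 W at the wall for the PC itself
print(f"PC DC draw: ~{pc_dc_draw:.0f} W")     # ~304 W reaching the components
```

So even after subtracting the router and monitor, roughly 300 W would have to be going into the CPU, board and drives for the reading to be right, which is why the number looks suspicious.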
 
Soldato
Joined
30 Dec 2011
Posts
5,547
Location
Belfast
Member when Vega was supposed to be a 225W GPU, and dual GPU was 300W? I member.

Such a long time ago now.



Are we so sure it won't be, or at least be very close to that?

Let's look at the facts we do know
  • Vega FE pulls on average 275 Watt with 16GB HBM2 VRAM.
  • Vega RX will have half the VRAM of FE, which will reduce overall power consumption.
  • Vega RX by this fact alone should consume less power than Vega FE.
Now for my own assumptions :)

As we can see, without any other power-saving changes, Vega RX could be ~250 W. Assuming (as rumoured) there are other power-saving features not enabled in FE, that could bring RX power consumption down to the 225 W target posted above. Having said that, it's probably best to wait until the Vega RX specs are released before drawing any conclusions. ;)
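As a rough sketch of the arithmetic above (the per-stack HBM2 power figure is an assumed, illustrative number, not a measured spec):

```python
# Back-of-envelope estimate of Vega RX power from the Vega FE figure.
vega_fe_power = 275      # W, average reported draw for Vega FE (16 GB HBM2)
hbm2_stack_power = 25    # W per 8 GB HBM2 stack -- assumed for illustration only

# Vega RX is expected to drop one stack (16 GB -> 8 GB):
vega_rx_estimate = vega_fe_power - hbm2_stack_power

print(f"Vega RX estimate: ~{vega_rx_estimate} W")  # ~250 W before any other savings
```

Any additional power-saving features not enabled on FE would then have to account for the remaining ~25 W down to the 225 W target.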
 
Last edited:
Soldato
Joined
28 Sep 2014
Posts
3,465
Location
Scotland
Am I the only one here who doesn't care about G-Sync, FreeSync, HDR or 4K?

I don't game on a sofa far from my screen; I just hate that type of experience.
I have already used HDR in PC gaming for a decade; it is not new, whatever Sony and co. are trying to make out. It's been around since the early 80s.
G-Sync and FreeSync are only useful if your framerate differs from a multiple of your refresh rate. The question is why people are playing that way in the first place. I play at either 30 fps or 60 fps on a 60 Hz screen.
I think pixel count is a false economy: it does wonders for slowing GPUs down, but is not that great for improving image quality. As resolutions get higher and higher, you hit a point of diminishing returns. I only got a 1440p monitor for desktop real estate, and the fact I game at 1440p is only for that reason now. I have noticed no visual improvement over my previous 1050p resolution, whereas things like lighting effects, SGSSAA, tessellation etc. *do* make a meaningful difference to visuals. I always prefer lower resolution with maxed graphics settings over higher resolution with things turned down.

So to me, buying a slower, hotter, more power-hungry card just so I can use FreeSync sounds barmy.

I don't care about G-Sync and FreeSync, so I am very excited about HDMI 2.1 Game Mode VRR.

Manufacturers worldwide ship about 120 million PC monitors (1080p, 1440p, 4K and 4K HDR) annually, compared to about 350 million HDTVs (1080p, 4K and 4K HDR). Not all of those 120 million PC monitors support VESA Adaptive Sync, AMD FreeSync, AMD FreeSync 2 or Nvidia G-Sync. I checked a price search engine and found 4209 monitor brands and models, but only 241 (about 5.7%) support VESA Adaptive Sync, 185 (4.4%) support AMD FreeSync, 57 (1.4%) support Nvidia G-Sync, and just 3 Samsung monitors support AMD FreeSync 2. So probably only a small fraction of the PC monitors shipped worldwide support any form of adaptive sync.
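Taking the model counts above at face value, the shares work out like this (the counts are the ones quoted from the price search engine; only the percentages are recomputed):

```python
# Recompute the adaptive-sync support shares from the quoted model counts.
total_models = 4209

counts = {
    "VESA Adaptive Sync": 241,
    "AMD FreeSync": 185,
    "Nvidia G-Sync": 57,
    "AMD FreeSync 2": 3,
}

for name, n in counts.items():
    share = 100 * n / total_models
    print(f"{name}: {n}/{total_models} = {share:.1f}%")
```

That gives roughly 5.7%, 4.4%, 1.4% and 0.1% of listed models respectively, which is a share of models on sale, not of units actually shipped.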

Years ago, both AMD and Nvidia probably met the HDMI founders (Hitachi, Panasonic, Philips, Silicon Image, Sony, Thomson, RCA and Toshiba) to offer FreeSync or G-Sync as the standard for the next-generation HDMI 2.1, but the founders were not interested because it would cost far too much to implement. Instead they decided to create their own version, called Game Mode VRR, and both AMD and Nvidia had no choice but to support Game Mode VRR alongside FreeSync 2 and G-Sync. Over the next few years, all PC monitors and HDTVs will ship with HDMI 2.1 ports supporting Game Mode VRR.

4K resolution on a 28-inch monitor is a massive waste; people cannot see a big difference compared to 1080p. The size at which people with either excellent or poor vision can see a huge visual improvement from 4K is around 50 inches, whether a 4K HDTV or a 4K monitor.
 
Soldato
Joined
13 Mar 2008
Posts
9,638
Location
Ireland
Linus has more Vega, and a Threadripper system. The AMD lady in the back was at the PDXLan event as well.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,687
You know, you should learn to speak for yourself only. Your "statement" certainly doesn't fit my personal experience, and I'm sure I'm not the only one who feels this way.

I have the AOC 28" 4K monitor and can definitely tell the difference. That said, I wouldn't generally recommend 4K on a monitor smaller than the upper-30-inch range.
 
Associate
Joined
2 Jun 2016
Posts
2,382
Location
UK
Are we so sure it won't be, or at least be very close to that?

Let's look at the facts we do know
  • Vega FE pulls on average 275 Watt with 16GB HBM2 VRAM.
I think you're forgetting that when the FE is running at 1600 MHz it uses substantially more power than 275 Watts.
 
Associate
Joined
8 Jul 2013
Posts
2,089
Location
Middle age travellers site
With the 4.4 GHz overclock I have to push around 1.28 V; anything less and it gets unstable in some applications. I just removed the OC to do some testing, and running OCCT Linpack I pull 400 watts from the wall. Remove the 40 watt router (measured) and the 30 watt monitor (again measured) and we are around 330 watts, which is way more than it should be. It makes no sense to me.

Try another meter
 