
Poll: ** The AMD VEGA Thread **

On or off the hype train?

  • (off) Train has derailed

    Votes: 207 39.2%
  • (on) Overcrowding, standing room only

    Votes: 100 18.9%
  • (never ever got on) Chinese escalator

    Votes: 221 41.9%

  • Total voters
    528
Ahhh Polaris, the 1080 killer :D

That was the issue, though: random posters on forums with no history of ever leaking anything would say something, then Wccftech would republish it as gospel truth, and then people would believe it. Despite all this, people keep doing it! :confused:

Edit!!

Remember the "beast mode" RX 480? I don't think there are even "beast mode" RX 580s! :p
 
Also, as mentioned, it all depends on what AMD delivers this weekend. I already have a 1440p 144Hz FreeSync monitor, but if the GPU disappoints I'm moving back to green entirely.

Can't be ***** sitting on a sub-par monitor and GPU; honestly, I've waited long enough. Since the Polaris rumours. :|


I'm sat here waiting with bated breath too. I'm currently running a 1060 with my FreeSync monitor, so I'd like to go back to AMD and FreeSync, but if the price doesn't sit right I won't, because I'm not having problems using the 1060 on this monitor. I can put 600 towards a GPU at the moment, but I'm not paying for performance that doesn't match the price; so if Vega's a wash I'll either stick with the 1060 until Volta or look out for a 1080 or 1080 Ti deal. As my mum would say, the money's burning a hole in my pocket. :D
 
Ahhh Polaris, the 1080 killer :D

But I kind of agree: LFC will help enormously in games. Just a damn shame there's been zero news on decent UW FreeSync 2 monitors either, but that's AMD for you lately!
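
For anyone unfamiliar, LFC (Low Framerate Compensation) kicks in when the frame rate drops below the monitor's minimum VRR refresh: the driver shows each frame two or more times so the panel still runs inside its supported range. A rough sketch of the idea; the 48-144Hz window and the selection loop are illustrative, not AMD's actual driver logic:

```python
# Sketch of LFC frame-multiplication (illustrative, not AMD's driver code).

def lfc_refresh(fps: float, vrr_min: float = 48.0, vrr_max: float = 144.0) -> tuple[float, int]:
    """Return (panel_refresh_hz, times_each_frame_is_shown)."""
    if fps >= vrr_min:
        return min(fps, vrr_max), 1          # in range: refresh tracks fps directly
    # Below the VRR floor: repeat each frame n times so n*fps lands in range.
    n = 2
    while n * fps < vrr_min:
        n += 1
    return min(n * fps, vrr_max), n

for fps in (100, 60, 40, 25):
    hz, n = lfc_refresh(fps)
    print(f"{fps} fps -> panel at {hz:.0f} Hz, each frame shown {n}x")
```

So at 40fps on a 48-144Hz panel, the driver would run the screen at 80Hz and show each frame twice, keeping the motion smooth instead of falling back to vsync stutter.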



Damn right! I had a G-Sync 144Hz monitor (PG278Q) and SLI 980 Tis at the time. Sold the monitor and GPUs (a week before Pascal was announced); got the MG278Q and sat on a 4670 until Polaris. Then it was a paper launch, then waiting for the product launch (months on a damn 4670), then I got a Fury X.

That died, and the friend I sold my 980 Tis to is lending me one until Vega hits.
It's been months again, and I've already spent a small fortune on biltong for the chap, to sate his hunger and as a massive thank you.

So now AMD's time is up; SIGGRAPH had better be awesome or it's a bust. That ASUS PG348Q is looking mighty fine. My only concerns are possible backlight bleed, and whether that silly three-prong stand will be an issue on my IKEA desk :p

A new job means no need for Skylake-X or Threadripper, hence I'm looking at ultrawide monitors again. I want a great combo that'll last me ages. Fingers crossed for AMD, however.
 
Realtime. That's an important inflection point for this kind of work. Nvidia makes a lot of its graphics card money from the professional market, and AMD would obviously like a chunk of it.

You should expand the comments section under that post to see people's reactions. As far as compute is concerned, Vega is an awesome compute chip and does things that even Volta doesn't. It's too bad AMD doesn't have the budget to split development of gaming and pro cards. Right now, the HBCC and even HBM are "too soon" for gaming cards...
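
For context, the HBCC's pitch is that the GPU treats its local HBM as a page cache over a much larger address space (system RAM, even NVMe), fetching pages on demand instead of needing the whole working set resident in VRAM. A toy sketch of that caching idea, assuming simple LRU eviction; this is conceptual only, not AMD's implementation:

```python
# Toy model of VRAM-as-a-page-cache (conceptual, not AMD's HBCC internals).
from collections import OrderedDict

class PageCache:
    def __init__(self, vram_pages: int):
        self.capacity = vram_pages
        self.resident = OrderedDict()          # page id -> data, in LRU order

    def access(self, page: int) -> str:
        if page in self.resident:
            self.resident.move_to_end(page)    # hit: refresh LRU position
            return "hit"
        if len(self.resident) >= self.capacity:
            self.resident.popitem(last=False)  # miss: evict coldest page
        self.resident[page] = f"page-{page}"   # "fetch" from system memory
        return "miss"

cache = PageCache(vram_pages=4)
for p in [0, 1, 2, 3, 0, 4, 1]:                # working set bigger than "VRAM"
    print(p, cache.access(p))
```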
 
Do we know what Volta doesn't do yet?
 

Again, they basically don't have the fps numbers, but it stands true that G-Sync is overpriced, and the message they're trying to get across is that if you want a great experience you can go down the AMD route and save your cash. If only this were happening at 4K, though. The thing is, the majority are gaming below even the 1440p they used, so the message holds: G-Sync is a rip-off, and if the prices are to be believed, then so is Vega. Will this filter down to the lower tiers, where FreeSync is a bargain and, if it weren't for the miners, most AMD cards would be too? It looks like AMD are trying to appeal to the mid-to-low market with their high-tier card, and on the bigger scale to get at Nvidia's overall brand. It's what they need to do, as Nvidia's brand as a whole keeps them in the driving seat at all tiers.
 
Am I the only one here who doesn't care about G-Sync, FreeSync, HDR or 4K?

I don't game on a sofa far from my screen; I just hate that type of experience.
I have used HDR for a decade already in PC gaming. It is not new, whatever Sony and co are trying to make out; it's been around since the early 80s.
G-Sync and FreeSync are only useful if your frame rate doesn't divide evenly into your refresh rate (sketched just below). The question is why people are playing that way; I play at either 30fps or 60fps on a 60Hz screen.
I think pixel count is a false economy: it does wonders for slowing GPUs down, but not much for improving image quality. As resolutions get higher and higher you hit a point of diminishing returns. I only got a 1440p monitor for desktop real estate, and the fact I game at 1440p is only for that reason now. I have noticed no visual improvement over my previous 1050p resolution, whereas things like lighting effects, SGSSAA and tessellation *do* make a meaningful difference to visuals. I always prefer a lower resolution with max graphics settings over a higher resolution with things turned down.

So to me, buying a slower, hotter, more power-hungry card just so I can use FreeSync sounds barmy.
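
To make that mismatch concrete: on a fixed-refresh screen with vsync, any frame rate that doesn't divide the refresh evenly gets snapped to refresh boundaries, so some frames persist for one refresh and others for two. A quick sketch with illustrative numbers (45fps on a 60Hz panel), which is exactly the case adaptive sync smooths out:

```python
# Why 45 fps judders on a fixed 60 Hz display (illustrative numbers only).
import math

REFRESH_HZ = 60.0
FPS = 45.0                       # not a divisor of 60

scan = 1000.0 / REFRESH_HZ       # ~16.7 ms between refreshes
frame = 1000.0 / FPS             # ~22.2 ms between rendered frames

# With vsync, each frame is held until the next refresh boundary, so the
# on-screen frame times become an uneven mix of 16.7 ms and 33.3 ms.
shown = [math.ceil((i + 1) * frame / scan - 1e-9) * scan for i in range(8)]
deltas = [round(b - a, 1) for a, b in zip(shown, shown[1:])]

print("fixed 60 Hz on-screen frame times (ms):", deltas)
print(f"with VRR the panel just waits: {frame:.1f} ms, every frame")
```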
 

Paper launch? The RX 480s were available in large numbers from day one.
 

Quite simply, you have not been paying attention. You have not been using HDR; as far as I know, no monitor supported it until now, or it has only just become available. You would not be buying a slower, hotter card just to use FreeSync, as you could instead buy a faster, cooler card and pay more for G-Sync. G-Sync and FreeSync are also not what you think they are. It's early on a Saturday morning, so I suggest you Google some more before you type up a rant that's so far from the truth. Maybe you should sit next to a 1440p monitor and a 4K one, as I have sat next to both. It's not mind-blowing on good screens, but there is a difference in clarity. Whether it's worth it is up to you.

Possibly this is what got you confused.

PCs have actually been claiming to deliver HDR gaming for the best part of 12 years. But this was only emulated HDR, not the true HDR we’re seeing today that delivers genuinely expanded brightness and colour performance.

http://www.trustedreviews.com/opinion/what-is-hdr-gaming-2946693
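
The distinction is worth spelling out: "emulated" HDR means the engine renders with a wide internal luminance range and then tone-maps it down into the display's standard 0-1 range, crushing the bright end. A minimal sketch using the classic Reinhard operator, which is one common textbook choice, not what any particular game ships:

```python
# "Emulated HDR" in one line: compress an unbounded scene luminance into the
# [0, 1) range an SDR display can show. True HDR output skips this crush and
# sends the wider range to a display that can actually reproduce it.

def reinhard(luminance: float) -> float:
    """Classic Reinhard tone-mapping operator."""
    return luminance / (1.0 + luminance)

for hdr in (0.5, 1.0, 4.0, 16.0, 100.0):
    print(f"scene luminance {hdr:6.1f} -> SDR value {reinhard(hdr):.3f}")
```

Note how 16x and 100x the reference brightness land at 0.941 and 0.990: almost indistinguishable on an SDR panel, which is the highlight detail true HDR is meant to recover.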
 

The main benefit of PC gaming over console gaming is choice. People choose to have bigger, higher-resolution screens that run at 120+Hz, and therefore pay more for the graphics card to drive them.
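
To put rough numbers on that choice: the pixel throughput a GPU must sustain scales with both resolution and refresh rate. A back-of-envelope sketch, treating pixels per second as a crude proxy for GPU load (a simplification; real cost depends on the game):

```python
# Pixels per second for common monitor choices, relative to 1080p60.
modes = {
    "1080p @ 60 Hz":  (1920, 1080, 60),
    "1440p @ 144 Hz": (2560, 1440, 144),
    "4K @ 60 Hz":     (3840, 2160, 60),
}
base = 1920 * 1080 * 60
for name, (w, h, hz) in modes.items():
    px_per_s = w * h * hz
    print(f"{name}: {px_per_s / 1e6:7.0f} Mpx/s ({px_per_s / base:.1f}x 1080p60)")
```

A 1440p 144Hz screen asks for roughly 4.3x the pixel throughput of 1080p60, which is exactly why the monitor choice drives the GPU budget.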
 