
Poll: ** The AMD VEGA Thread **

On or off the hype train?

  • (off) Train has derailed: 207 votes (39.2%)
  • (on) Overcrowding, standing room only: 100 votes (18.9%)
  • (never ever got on) Chinese escalator: 221 votes (41.9%)
  • Total voters: 528
Hmm the recent games announced at E3 make buying a top GPU look worth it now.

Come on Vega be good!

My only worry is timing. Is this AMD's product to compete against the Nvidia 1000 series or will this hold up against Volta?

Because releasing basically in early August, we are only gonna be 4-6 months off Nvidia's next gen.
 
Anthem looks pretty wicked. Looks like Crysis meets Horizon Zero Dawn.
 
I've found with my 7970, 285 and 290 setups that the cards drop to low clocks if all three monitors are identical.
Are your monitors similar, or are you running different resolutions/refresh rates, mate?
No luck there. 1920x1200 60Hz, 1920x1080 60Hz, 2560x1440 144Hz combo.
At the moment I am saved by connecting the first two to the motherboard and the gaming monitor directly to the video card.

I understand that more monitors means more load, but the jump from 30C idle to 55C idle by just connecting another screen is uncalled for.
 
According to AMD, it's competitive with the 1080 Ti. Volta could be a lot faster.
They will prob need a sharp refresh to get near that.
 
Actually, if you pay a visit to the Volta thread, at best it will be the same perf as Pascal, if not worse. And no FP16...
 
This also has me wondering whether I should be spending what'll obviously be 500+ on Vega.
I've yet to notice Free-sync being missing but that's probably because I'm not trying to run any AAA games at the moment.
I'll know by the time RX Vega lands.
 
Been holding off buying a second hand 1080 as I really wanted to see what Vega was like. But on the other hand I am comfortably green and have been since I sold my 5870 back in 2012 (I think).

My issue is being locked in to the ecosystem by virtue of buying a Freesync/Gsync monitor.

If I bought a second hand 1080 I could upgrade to Volta if I needed; if I go Vega and Freesync I am locked out and then have to wait for Navi... which could be two-plus years away going by AMD's recent history.

And to add, I'm desperate for a new monitor now. Since upgrading to a Ryzen 1700 my issues with tearing on my monitor have increased. I imagine my fps has gone up and my 2008 monitor just can't cope. It's worst in BF1. I have to put adaptive vsync on in the Nvidia control panel and limit my fps to 60 just to make it playable, but it still tears and stutters.

It wasn't this bad when I had my 3570K.
 
FFS dude you've been on the fence for *months*!! Just make a damn choice!! :D

I did, I'm all-in Green now. Gsync is the bomb imo.
 
So in one post you criticise AMD for wanting developers to leave data management and dataset switching to GPU drivers, and say that it's useless for games; then in your next post you say that it is useful and that games already do it.
I take it English isn't your native language? Let me try and explain again. I haven't criticized AMD at all, just pointed out some features and facts about their HBCC technology. HBCC has been designed fundamentally for HPC use, not gaming. This is an area where a process could be accessing datasets many times larger than VRAM, so it makes a lot of sense to have a feature that accelerates memory IO operations.

Currently most games use far smaller datasets, and typically they fit within the VRAM limits of consumer-level cards. Enabling super-high-resolution textures and gaming at 4K will push VRAM closer to the limits of mid-range GPUs, which is where the 11/12GB of the 1080 Ti/Titans etc. come in handy for the most extreme games. There are a few games that have been designed to use extremely high-resolution textures that require real-time streaming of texture data from system RAM. These are games like GTA, where you don't need the high-resolution texture and model data for a building that is so far away it will take minutes of gameplay to get there, but once you get closer the new assets need to be streamed in.

Nowhere have I said HBCC is useful for gaming in the current market. In fact I have said quite the opposite: most games will have data that fits within VRAM, and it's already possible to stream in larger datasets using current technology under most scenarios. In the future HBCC could be leveraged in games, but it will require widespread adoption and standardization.

Why are you upset by AMD wanting to do it in drivers rather than having developers do it?
Upset? Think you need a dictionary:

upset /ˈəpset/
noun (plural: upsets)
1. a state of being unhappy, disappointed, or worried.


I don't care how it is done, but I expect it is much easier for developers to do this, and they will have a much better idea of resource management within their game. It also means the game developers wouldn't have to wait for AMD's driver team to catch up.

From the quick search I've done, Carmack's MegaTexture appears to contradict what you're trying to say.


This paper, published by id Software, describes the real-time streaming and decompression used for MegaTexture. It details a lot of compression technologies, but section 7 gives an overview of the MegaTexture process, where a super-large texture is streamed from hard disk into the game in real time:
http://mrelusive.com/publications/papers/Real-Time-Texture-Streaming-&-Decompression.pdf
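
For illustration, here's a minimal sketch of the sort of application-side "data management and dataset switching" described above: the game itself streams high-detail assets in as the player gets close and evicts them again once they're far away. Everything in it (AssetStreamer, loadHighDetail, the distance thresholds) is made up for the example; it's not taken from any real engine, nor from AMD's HBCC.

Code:
// Illustrative only: hypothetical application-managed asset streaming,
// i.e. the kind of dataset switching a game can already do itself
// without driver help. Names and thresholds are invented for this sketch.
#include <cmath>
#include <string>
#include <unordered_map>

struct Vec3 { float x, y, z; };

static float dist(const Vec3& a, const Vec3& b) {
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

struct Asset {
    Vec3 position{};
    bool highDetailResident = false;   // is the full-res texture/model data in VRAM?
};

class AssetStreamer {
public:
    void registerAsset(const std::string& name, Vec3 pos) {
        assets_[name] = Asset{pos};
    }

    // Called once per frame with the player/camera position.
    void update(const Vec3& player) {
        for (auto& entry : assets_) {
            Asset& a = entry.second;
            const float d = dist(player, a.position);
            if (d < kStreamInRadius && !a.highDetailResident) {
                loadHighDetail(a);     // would be an async disk/system-RAM read in a real engine
            } else if (d > kEvictRadius && a.highDetailResident) {
                evictHighDetail(a);    // free VRAM for assets the player is approaching
            }
        }
    }

private:
    static constexpr float kStreamInRadius = 200.0f;  // illustrative distances
    static constexpr float kEvictRadius    = 300.0f;  // hysteresis so assets don't thrash

    void loadHighDetail(Asset& a)  { a.highDetailResident = true;  }
    void evictHighDetail(Asset& a) { a.highDetailResident = false; }

    std::unordered_map<std::string, Asset> assets_;
};

int main() {
    AssetStreamer streamer;
    streamer.registerAsset("distant_building", Vec3{500.0f, 0.0f, 120.0f});
    streamer.update(Vec3{490.0f, 0.0f, 118.0f});  // player is close, so high-detail data is streamed in
    return 0;
}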
 
I don't think HBCC is going to be a selling point as far as gaming goes -- however, I don't see devs having the extra option of using it as a negative.
 
LOL. Oh, you want to go there? Okay then. There is a difference between reading what someone has written and understanding what they wrote, because you read what I wrote but you did not understand. Let me help you understand.

First and foremost, go reread the post you quoted (#1035) and show me where I explicitly wrote "HBCC".
I wrote
AMD for wanting developers to leave data management and dataset switching

Secondly, did you not state that games are already streaming textures in and out of VRAM? Heck, it's right there at the bottom of the post I am quoting.
Is this not a type of data management and dataset switching?
Do you think that "data management and dataset switching" is a bad feature in games? Because the way I understood your post, you don't seem to think it is.
Maybe I am missing something, but is that not what AMD is proposing to do with GPU drivers and HBCC?

Either the process of "dataset switching and data management" is useless or it is useful in games. So which one is it?


If you read that definition you posted, you would have seen the word "disappointed", which coincidentally slots right in. You clearly didn't understand the definition that you just posted.

Why are you disappointed (upset) by AMD wanting to do it in drivers rather than having developers do it?

As for whether or not English is my first language. https://www.youtube.com/watch?v=_n5E7feJHw0
 