AMD Talks Graphic Industry’s Trends – Radeon HD 7970 Still The King of $299 Price Segment

It's only a matter of time before the limit is breached, but even so I'd still expect 2GB to be plenty for a single screen at 1080p for a while. With the consoles having lots of usable memory, though, it will happen; it's just a question of when.
 
You've got to love how this whole "not enough memory" argument ties in with the upcoming consoles.
So if the new consoles are going to be using 3GB of RAM just for graphics operations, that only leaves 5GB for the operating system and the games to actually run in. Plus, of course, in the case of the Xbox One: Skype, Kinect and whatever else they have running all the time.
 
+1

This advert is talking about a single-card setup, and there is no way a single card could drive settings that would require 3GB of VRAM. So what they are really advocating is CrossFire setups for 2560x1440... and in their own advert they then admit there is a glaring issue with their own drivers for CrossFire.

:D

This isn't true, and I wish people would stop spreading it.

Memory consumption and GPU power aren't really linked.

Memory intensive doesn't mean GPU intensive, and most stuff that fills up memory doesn't really make that much of a difference to FPS except for lots of AA and things like ambient occlusion.

As for this advert, it's clear what they're angling at, but the way they've portrayed it is misleading.

2GB is an issue, but only as a result of it being on a 256-bit bus. The bus is the actual issue; 2GB is secondary to that, as you wouldn't typically find 2GB on anything but a 256-bit bus currently.
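Some rough numbers on why bus width matters so much: peak bandwidth is just bus width times effective memory data rate. A quick back-of-envelope sketch in Python (the clock figures here are illustrative, not exact specs of any particular card):

```python
def peak_bandwidth_gbs(bus_width_bits, effective_clock_gtps):
    """Peak memory bandwidth in GB/s: bytes per transfer * transfers per second."""
    return bus_width_bits / 8 * effective_clock_gtps

# Illustrative figures for GDDR5 at a 6 GT/s effective data rate:
print(peak_bandwidth_gbs(256, 6.0))  # 256-bit bus -> 192.0 GB/s
print(peak_bandwidth_gbs(384, 6.0))  # 384-bit bus -> 288.0 GB/s
```

Same memory chips, 50% more bandwidth just from the wider bus, which is the kind of headroom higher resolutions eat into.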
 
I tend to agree here with AMD.

Remember that PC gaming at the moment is dictated by console gaming. 80% of games that come out now are ports of their console equivalents, and of course the other 20% are forced to work within those constraints too, due to the available hardware. Yes, you can add certain features to make a game look prettier on PC hardware, but essentially graphics engines are built to work within the limited hardware of the consoles.

With the new consoles around the corner utilising 8GB of GDDR5 (shared between graphics and system memory), graphics engines are now going to be designed to make use of that memory, so 2GB simply isn't going to be enough. AMD know this and so do Nvidia. I would hold off buying any new graphics card until the new consoles come out, if you already own a GTX 480 or above, or a 7850 or above on the AMD side.

I also think that AMD will drop their new cards either just before the consoles come out or just after, with 6GB of VRAM I reckon, because they will know exactly what's being developed on both consoles and exactly what will be needed to play those games in glorious HD and above.
 
This isn't true, and I wish people would stop spreading it.

Memory consumption and GPU power aren't really linked.

Memory intensive doesn't mean GPU intensive, and most stuff that fills up memory doesn't really make that much of a difference to FPS except for lots of AA and things like ambient occlusion.

As for this advert, it's clear what they're angling at, but the way they've portrayed it is misleading.

2GB is an issue, but only as a result of it being on a 256-bit bus. The bus is the actual issue; 2GB is secondary to that, as you wouldn't typically find 2GB on anything but a 256-bit bus currently.

I'd like to know how you would go about increasing the memory load without increasing the GPU load. If you up the settings in game, you decrease your FPS by increasing the load on the GPU, and in turn the VRAM usage goes up. You do this until the settings are maxed out or you reach an unplayable FPS, and more often than not you reach an unplayable FPS before you hit your VRAM wall.

The two might not be linked directly, but for the sake of gaming they're two sides of the same coin: one cannot be raised without the other, so you need more than a single card to hit that wall. The memory bus is perfectly fine for 1080p; it starts to show its faults when asked to run 1600p+.

Obviously some games break that rule, modded Skyrim being one of the fan favourites for the "need more than 2GB" BS.
 
Very high resolution textures would fill up VRAM without much of an impact to FPS.

That's one example, and generally speaking, the type of settings that pull down GPU performance are computationally "expensive" ones, where the memory footprint isn't increased so much, but the GPU is being taxed.

So one can actually be raised without the other, it's that simple. In addition to that, there's no reason why having more available GPU power will increase VRAM usage either.
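To put a number on the texture point: one uncompressed high-res texture eats a chunk of VRAM while costing the shader cores almost nothing extra to sample. A rough sketch in Python, assuming uncompressed RGBA8 textures with a full mipmap chain (real games use compressed formats, so treat this as an upper bound):

```python
def texture_vram_mb(width, height, bytes_per_texel=4, mipmaps=True):
    """Approximate VRAM footprint of one texture in MB.
    A full mipmap chain adds roughly one third on top of the base level."""
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mipmaps else base
    return total / (1024 * 1024)

print(round(texture_vram_mb(2048, 2048), 1))  # ~21.3 MB
print(round(texture_vram_mb(4096, 4096), 1))  # ~85.3 MB
```

So swapping a game's texture pack from 2K to 4K roughly quadruples the footprint per texture without adding any meaningful shader work, which is exactly the "VRAM up, FPS barely moves" case.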
 
Very high resolution textures would fill up VRAM without much of an impact to FPS.

Modded Skyrim then :D

That's one example, and generally speaking, the type of settings that pull down GPU performance are computationally "expensive" ones, where the memory footprint isn't increased so much, but the GPU is being taxed.

Upping ingame settings to "max"

So one can actually be raised without the other, it's that simple. In addition to that, there's no reason why having more available GPU power will increase VRAM usage either.

The fan favourite (modded Skyrim) is an add-on and not needed to play the game; you make a conscious decision to add the textures/mods, so it shouldn't be used in an argument about VRAM usage, imho... as it's a gamer's choice.

The other is turning in-game settings up to 11: you either turn the game into a slideshow by hitting a VRAM wall, or because you don't have enough GPU grunt to churn those settings. Your example is the latter; although not memory heavy, it still increases the memory load, albeit a tiny amount compared to the likes of AA, whilst having a similar effect on the framerate.
 
Welcome back Spoffle. :p

Thanks :D

Modded Skyrim then :D

That is an example of course.



Upping ingame settings to "max"

"Max settings" doesn't necessarily equate to a large memory foot print.

Most GPU demanding settings aren't particularly memory intensive.

What is memory intensive is assets.


The fan favourite (modding Skyrim) is an addon and not needed to play the game, you make a conscious decisions to add the textures/mods and shouldn't be used in an argument towards VRAM usage imho....as it's a gamers choice.

It's irrelevant. The issue here is a lack of understanding, rather than "because Skyrim".

It can be used as an argument perfectly fine, though I wasn't myself, but it shows that VRAM usage isn't dictated by available GPU power.

One of the biggest issues with games is poor textures when played on high-res displays, so larger-resolution textures would offer better quality whilst only increasing the VRAM footprint and not much else.

The other is turning in-game settings up to 11: you either turn the game into a slideshow by hitting a VRAM wall, or because you don't have enough GPU grunt to churn those settings. Your example is the latter; although not memory heavy, it still increases the memory load, albeit a tiny amount compared to the likes of AA, whilst having a similar effect on the framerate.

It's not. Turning the settings to 11 won't necessarily load up the VRAM if all those settings aren't particularly VRAM heavy.

As for AA, one of the biggest impacts of AA is available memory bandwidth, and then available ROPs, so again it isn't really a case of available GPU performance dictating memory usage.
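On the AA point, here's roughly why MSAA leans on memory bandwidth: the colour and depth buffers are stored per sample, so 4x MSAA roughly quadruples the render-target footprint that has to be read and written every frame. A quick Python sketch, assuming uncompressed 32-bit colour and depth buffers (real GPUs compress these, so this is an upper bound):

```python
def msaa_rt_mb(width, height, samples, color_bytes=4, depth_bytes=4):
    """Approximate size of the MSAA colour + depth render targets in MB."""
    per_pixel = (color_bytes + depth_bytes) * samples
    return width * height * per_pixel / (1024 * 1024)

print(round(msaa_rt_mb(2560, 1440, 1)))  # no AA at 1440p: ~28 MB
print(round(msaa_rt_mb(2560, 1440, 4)))  # 4x MSAA at 1440p: ~112 MB
```

And that extra footprint isn't just sitting there like a texture; it's traffic on the bus every single frame, which is why AA scales with bandwidth and ROPs rather than raw shader power.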
 
It'd be nice if AMD hinted at a new product. This release of the Titan, GTX 780, GTX 770 & GTX 760 just feels... disappointing.

Titan - Great card. Simply far too expensive

Gtx 780 - Good card but overpriced. This should have come in at £400-£425

GTX 770 - It matches a year-old 7970GHz card at the same price point, minus the games. Show me progress!?

GTX 760 - A castrated GTX 670. Costs less but performs worse. Doh! :S

The above is my opinion. While it's good to see Nvidia at last improving on price/performance with the GTX 680/GTX 770, it doesn't really feel like we've moved on much in the last year and a half; we're just treading water. I think Nvidia should have priced all the cards more aggressively and made AMD sweat. Something like:

GTX Titan - £550
GTX 780 - £400
GTX 770 - £299
GTX 760 - £180

Think the above is a fair price for each.
 
VRAM is used as a cache, games dump stuff in there just in case they need it, not just what they NEED need.

We went over this in a thread not long ago about VRAM where people posted their usages.

Still wouldn't buy a card with <3GB. :D
 
^^ This.

From an unbiased perspective, Nvidia have a tight lineup now: from bottom to top, a card for every segment. Although AMD have cards with similar performance for less money, they have no answer to the GTX 780 or Titan; it doesn't have to beat either, but the same ballpark at less cost would make things more competitive. New cards would generate more interest. Hopefully AMD announce something soon..

The 7990 is more than an answer to either, regardless of it being a dual gpu solution.
 
To be fair, with all the console involvement AMD has now, and the game partnerships to optimise for their GPUs, why do they need to announce or rush anything? They already have the money in the bag, just waiting to cash it in. They're probably slowly developing their new GPU lineup to perform best in the top games released late this year, next year and for the foreseeable future.

My only concern is, how much will they charge us as a customer for this optimisation and performance boost?
 
The 7990 is more than an answer to either, regardless of it being a dual gpu solution.

A dual GPU isn't an answer to a monster single GPU..

The 780 / Titan are in their own performance bracket right now. That is where AMD need a decent card; it'll force Nvidia to price more competitively and spark some more interest. Funny adverts are great and everything, but now AMD need to execute. I think the baiting may only stir up the beehive within Nvidia. The GTX 880 is going to be a monster I tells ya: GPGPU compute, monster gaming performance, the whole enchilada. AMD may yet regret the taunting lol :p They need to back all the talk up with new cards..
 