
Which graphics card for my system - 770 v 7970

As I linked above, this has been refuted, with a company saying they are using up to 6GB on a game.

Dead link.

Could the developer monitor what RAM is being used for what though?

6GB RAM could be the total RAM the system is using, not specifically for the one game.


http://www.thesixthaxis.com/2013/07/28/ps4-ram-rumours-absolutely-false-according-to-developer/

It literally only says "it's false", and that's it.


There are other links saying 5GB for games (5GB + 1GB for the OS would be 6GB).

We still don't really know anything, but saying 4.5GB for video RAM is flipping ludicrous (although you didn't say that).
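None of these figures are confirmed - they're just the rumoured numbers being thrown around in the linked articles. For what it's worth, here's the simple arithmetic behind each rumour (purely illustrative):

```python
# Rumoured PS4 memory splits from the thread - none of these are official specs.
TOTAL_RAM_GB = 8.0

rumoured_game_allocations = {
    "6GB-for-games rumour": 6.0,
    "5GB games + 1GB OS rumour": 5.0,
    "4.5GB guaranteed rumour": 4.5,
}

for name, games_gb in rumoured_game_allocations.items():
    reserved_gb = TOTAL_RAM_GB - games_gb
    print(f"{name}: {games_gb}GB for games, {reserved_gb}GB left over for OS/system")
```

Note how the 4.5GB rumour implies a 3.5GB reservation, which is exactly the gap people are arguing about below.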
 
As I linked above, this has been refuted, with a company saying they are using up to 6GB on a game.

I think the main point here is that some people took it to mean the RAM available is for graphics only - therefore an 8GB PS4 means you need a PC graphics card with 8GB of vRAM! :rolleyes:

Blazin
 
So if developers only have access to 4.5GB for GPU, system and general use, what is the other 3.5GB doing? (It's not just for show, lol.) I have my doubts... but I don't have an explanation either!

I still believe 2GB+ for VRAM...

The link above is dead, can we have it reposted please?
 
So if developers only have access to 4.5GB for GPU, system and general use, what is the other 3.5GB doing? (It's not just for show, lol.) I have my doubts... but I don't have an explanation either!

I still believe 2GB+ for VRAM...

The link above is dead, can we have it reposted please?

I fixed the link as well. Stupid copy-and-paste on a Galaxy Tab gave it a double http.
 
It literally only says "it's false", and that's it.

We still don't really know anything, but saying 4.5GB for video RAM is flipping ludicrous.


So if developers only have access to 4.5GB for GPU, system and general use, what is the other 3.5GB doing? (It's not just for show, lol.) I have my doubts... but I don't have an explanation either!

I still believe 2GB+ for VRAM...

:rolleyes:
 
I stand corrected. However, I don't think people should be too worried about 'next gen' games being unplayable on their PCs because they only have a 2GB GTX 770. The Xbox One GPU is comparable to an HD 7790 (source) while the PS4 is more comparable to a 7870 (see previous source again).

It isn't the power of the GPU alone that dictates how well the new consoles will perform. Unified RAM, architecture and bandwidth all have a significant impact, so it is incredibly simplistic to state the new consoles are only 7790/7870-level GPUs "so they are already behind PC spec".

People here seem to confuse the idea of purchasing an HD 7970 vs a GTX 770 for future proofing as meaning it will be able to play games the 770 will not. The GTX 770 will not suddenly become obsolete; it will just possibly have to reduce eye candy further than you would with a 7970.

None of this is a certainty, but that is what future proofing is all about. If you have two graphics cards with identical prices but one has more VRAM and higher bandwidth, you are taking a chance when purchasing the lower-specified card. Anyone who says the increased VRAM capacity or the increased bandwidth (or both) will never be a factor is ignoring the fact that evidence already exists showing this very scenario:

Modded Skyrim with hi-res textures
Increased MSAA
Hi-res gaming

Yes, I understand that these are limited scenarios, but they cannot be denied as facts. The evidence is there that as texture resolution or graphics demands increase, the VRAM capacity or bandwidth requirements also increase. History shows that games become more, not less, demanding.
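To give a rough sense of why resolution and MSAA eat into VRAM, here's a minimal back-of-the-envelope sketch. It only counts the colour and depth buffers - it deliberately ignores textures, extra render targets, compression and driver overhead, which are what actually dominate in modded Skyrim-type scenarios - so treat it as an illustration of the scaling, not real numbers:

```python
def framebuffer_mb(width, height, msaa_samples=1,
                   color_bytes=4, depth_bytes=4):
    """Rough colour + depth buffer size in MiB for one framebuffer.

    Ignores textures, additional render targets and driver overhead -
    illustrative only.
    """
    pixels = width * height * msaa_samples
    return pixels * (color_bytes + depth_bytes) / (1024 ** 2)

# 1080p with no AA vs 1440p with 4x MSAA: the buffers alone grow ~7x.
print(framebuffer_mb(1920, 1080))                   # ~15.8 MiB
print(framebuffer_mb(2560, 1440, msaa_samples=4))   # 112.5 MiB
```

The point isn't the absolute figures, it's that the memory cost multiplies with both pixel count and sample count - exactly the "limited scenarios" listed above.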

By all means recommend a card with less bandwidth and VRAM, but don't tell people that the lesser VRAM or bandwidth will absolutely never be an issue, because you cannot say that with absolute certainty. Ironically, I do believe it most likely will not be a factor, but there is a possibility, and that is the pertinent point. This isn't an Nvidia vs AMD debate; it is a pure future-proofing argument considering the identical price of these GPUs.
 
Everyone talks about future-proofing your GPU. OMG, you have to get the card with 3GB or 4GB so you're safe? 70% will change GPUs twice a year if not more. There's only a small number of gamers who stick with the same card for years. I mean damn... I've been through 6 GPUs alone in the past year :p
 
Everyone talks about future-proofing your GPU. OMG, you have to get the card with 3GB or 4GB so you're safe? 70% will change GPUs twice a year if not more. There's only a small number of gamers who stick with the same card for years. I mean damn... I've been through 6 GPUs alone in the past year :p

Exactly, and it's usually the people who go for the top-end cards who change more often too.
 
Everyone talks about future-proofing your GPU. OMG, you have to get the card with 3GB or 4GB so you're safe? 70% will change GPUs twice a year if not more. There's only a small number of gamers who stick with the same card for years. I mean damn... I've been through 6 GPUs alone in the past year :p

I agree. When I bought my system recently, I was leaning towards getting a GTX 770 (purely because I was sick of the ATI drivers I'd had for years). However, after reading up and trawling through countless benchmarks, I decided to save for an extra couple of months and get the GTX 780, which I now have overclocked to perform on par with the Titan. This was my way of future proofing, as I don't plan on getting a new PC for another 3 or 4 years (apart from minor upgrades - hard drives etc), and I am fairly certain there won't be many games that bring it to its knees, so for me I saw it as an investment.

How many PC games have come out since Crysis (2007) that have really pushed systems? Maybe 10-20? In 6 years.

If you are adamant on future-proofing your system and don't need it straight away, save a little longer and get a high-end card (I know this isn't financially plausible for some people, but this is a performance/gaming PC forum).

Blazin
 
Everyone talks about future-proofing your GPU. OMG, you have to get the card with 3GB or 4GB so you're safe? 70% will change GPUs twice a year if not more. There's only a small number of gamers who stick with the same card for years. I mean damn... I've been through 6 GPUs alone in the past year :p
That's what "everyone" is talking about here.

For example, if someone has to spend £80 over a GTX 770 2GB to get the 4GB version, then that is ridiculous; but getting a graphics card with equal GPU grunt, an extra 1GB of VRAM, a wider memory bus and higher memory bandwidth at no extra cost - that's a different story...

If one has to get an Nvidia card with longevity in mind, I don't really think anything below the GTX 780 is worth considering, but of course the difficult (or impossible) part is finding one brand new in the £400-£450 price range, rather than £500+.

And as for your example of going through 6 GPUs in the past year (because of the upgrade itch, I assume :D)... let's be honest, none of them is an 8800 GTX or Radeon 9800 by today's standards :p And actually there are an increasing number of people holding onto their cards longer, because many people don't upgrade unless there is at least a 75%+ increase in performance. This used to take 2 years/generations on average, but nowadays it's taking 3 or more.
 
You're saying I'm ridiculous for getting a 770 2GB. Now that is the fanboy comment of the day.
Read properly... I said spending £80 for 2GB of extra VRAM for the sake of "future proofing" is ridiculous... I didn't even know you had a 770 2GB until you said it in this post...

Why are people so jumpy, wasting no time at all before calling someone a fanboy? :confused:
 
Read properly... I said spending £80 for 2GB of extra VRAM for the sake of "future proofing" is ridiculous... I didn't even know you had a 770 2GB until you said it in this post...

Why are people so jumpy, wasting no time at all before calling someone a fanboy? :confused:

F5
 
Read properly... I said spending £80 for 2GB of extra VRAM for the sake of "future proofing" is ridiculous... I didn't even know you had a 770 2GB until you said it in this post...

Why are people so jumpy, wasting no time at all before calling someone a fanboy? :confused:

Then my original comment stands. Why would you say such a thing then? I don't understand some of you here. You're looking for trouble.
 