
Nvidia Ampere might launch as GeForce GTX 2070 and 2080 on April 12th

2018? You mean 2014?

It's a 5-year-old game, 4 years on PC.

Compare a 1997 game to 2007's Crysis. Then compare 2007's Crysis to a 2018 game. The worst thing is Crytek hid settings away in the config files, since they would overtax cards of the era. Crysis with manually added settings and mods shouldn't look so good for a 2007 game. We not only have massively faster cards, but DX11, DX12 and Vulkan. We also now have CPUs which utterly destroy the Core 2 CPUs of that era - that interactable environment ran on just a dual core. Look at how the vehicles were modelled so you could selectively destroy parts to slow them down, huts took damage, etc.
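To give an idea of what that hiding meant in practice: players dropped a custom config file next to the game to force on effects the menus didn't expose. A minimal sketch of such an autoexec.cfg, with cvar names recalled from the tweak guides of the era - treat it as illustrative rather than an exact recipe:

```
-- illustrative Crysis autoexec.cfg; cvar names recalled from period tweak guides
-- unlock the full console
con_restricted = 0
-- parallax occlusion mapping on terrain and surfaces
r_UsePOM = 1
-- volumetric sun shafts
r_sunshafts = 1
-- film-style colour grading
r_colorgrading = 1
-- FFT-driven ocean wave simulation
e_water_ocean_fft = 1
```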

There were other aspects too - remember the snow levels? They had different AI models for the humans and the aliens.

The aliens flew - a lot of modern games use walking aliens, since they use the same AI models as the human NPCs.

Another modded Crysis playthrough from 2016:

https://www.youtube.com/watch?v=PECqrhxhpTA


This is DX9 with a few DX10 features.

But go back even before Crysis: Far Cry pushed graphics forward too, and was open world.

Both games had complex environmental models, e.g. for wind, which could actually damage the foliage, etc.

Even before those games, you had titles like Unreal, which pushed graphics and sound forward quite a bit (the Crysis of its day); Red Faction, where the environments were fully destructible; Half-Life 2, which had impressive facial animation tech and pushed graphics forward; and games like PlanetSide, where you had massive online battles.

All PC exclusives too, and we all know why.

Edit!!

Look at the controversy over The Witcher 3 and Watch Dogs. Both games were graphically downgraded from the initial demo footage, especially Watch Dogs. It was probably because both companies were worried whether the cards of the next few years could handle the added effects.

Crytek OTOH didn't care, and they probably knew the 8800GT etc. were coming, so felt confident releasing it. That was a $250 card which came close to a $650 one from the previous generation in the space of around 12 months!

Did the GTX1060 come close to a GTX980Ti/Titan X (Maxwell), or an RX480 come close to a Fury X? Nope.

Now, if the GTX2060 does indeed equal a GTX1080 (or even a GTX1070) as was mentioned before, then at least it means games like Metro: Exodus and Cyberpunk 2077 can push things a bit more than normal.

People say that PC gamers don't care, but look at how much money PC gamers are throwing at Star Citizen in the faint hope it will push uber graphics and the technical side of things far further than a console.

$175 million says PC gamers do care.
 
LOL nope! You obviously haven't played many old Unreal Engine 2 & 3 games from 10-13 years ago, like Splinter Cell: Chaos Theory, BioShock, BioShock 2, Stranglehold and others. They all have horribly low-resolution textures.

Snip

I'm glad you countered his argument so concisely, as that statement was one of the bigger loads of crap I'd read on the internet all week... and I've read Trump's tweets.

That's in the 3DMark 06 ballpark in terms of graphics quality.

I meant 2018, because it is still praised now.

And what open world game from 10 years ago has better graphics? You can't compare it to a linear corridor first person shooter.

GTA V lets you do a ridiculous amount of stuff.

3DMark 06 was merely rendering a tech demo. There wasn't a huge game running there that you could play.
 
Crysis at 4K - a 2007 game.
Crysis on hardware of the time wouldn't have looked like that.

Nobody could run Crysis at 4k in 2007.

Nobody could run Crysis at Ultra settings in 2007 :p

The only thing that proves is that Crysis was built in a way that it could look better over time. And it was expensive to do this... Crytek subsequently stopped putting in such overkill amounts of dev time citing "piracy" on the PC platform as their reason. See Crysis 2...

But again, if you want to compare, let's have some screenshots of Crysis on 2007 hardware.
 
These games are played on relatively mediocre configurations. GTA V has a level of graphics maybe worse than what the Unreal engines provided 10 years ago.

PUBG isn't even optimised properly.

The decline is in the enthusiast, cutting-edge gaming market.

Your statement that GTA V has a level of graphics worse than ten years ago is just total BS. Who could take your argument seriously with comments like that?

The decline is in the cutting-edge gaming market? LOL. What, you mean all those people buying 4K monitors, VR headsets and record numbers of 1080Tis and Titan cards?

Show proof of this decline. It's all in your head.
I love to walk around in a beautiful environment, just wandering in Crysis, Crysis 2, Crysis 3.
Not even fighting the enemies, just walking around and enjoying all the details and realism.

Something I miss so much now :(

You and Cat keep going on and on about Crysis. Like no game looks as good these days. That's pure and utter crap from both of you.

Also, you can easily mod most games to look way better than Crysis ever did, and also cripple your system.
 
Nobody could run Crysis at Ultra settings in 2007

At the end of 2007 I had 8800GT SLI - they didn't do too badly (with some SLI profile hacking). IIRC I had the 7950GX2 when the benchmark was released, but I can't remember how well it ran now.

In 2008 I had GTX260 SLI - then we were starting to talk about playing at some of the higher quality levels with playable framerates, albeit in the 40-50s.

EDIT: Found a random 8500GT (heavily overclocked) SLI result from some dabbling I did:

[screenshot: Crysis benchmark result from heavily overclocked 8500GT SLI]


Was probably similar to the 7950GX2 results heh.

(EDIT: Bear in mind that this would have been the kind of performance with the very bleeding-edge GPUs - most people would have been lucky to get much over 1fps with those settings back in most of 2007.)
 
Not sure what I will do when Ampere launches.

The XX80 will likely be a fraction faster than the 1080 Ti but noticeably slower than a Titan V.

There's little point in getting anything until the Titan and Ti turn up towards the end of the year. Even for 1080 users, the 2080 isn't going to offer enough of an upgrade for the price hike which will no doubt occur. I can see, with the current mining issues, the 2080 starting at £700 and going to £900 for the Asus-tax version.
 
Even before those games, you had titles like Unreal, which pushed graphics and sound forward quite a bit (the Crysis of its day); Red Faction, where the environments were fully destructible; Half-Life 2, which had impressive facial animation tech and pushed graphics forward; and games like PlanetSide, where you had massive online battles.

The reason there were stand-out games back then is that very few games looked good at all. Graphics were poor. Nowadays, games all look good. Improvements in graphics are becoming more subtle. It's simply a state of diminishing returns.

You also mention that you can mod Crysis and change the config to make it look even better.

Hello? You can mod modern games and make them look way better too - photorealistic, if you want. And they will also bring your computer to a crawl.

Lastly, Crysis was designed by Crytek to sell its game engine. They weren't really worried about how well it ran, more about how well it looked, so that game developers would use it - that's where they were getting their money. Whereas other game developers have to make sure their games will run on as many PCs as possible.

Did the GTX1060 come close to a GTX980Ti/Titan X (Maxwell)?


The 1060 would be the equivalent of an 8600 GTS, not an 8800 GT.
 
Crysis on hardware of the time wouldn't have looked like that.

Nobody could run Crysis at 4k in 2007.

Nobody could run Crysis at Ultra settings in 2007 :p

The only thing that proves is that Crysis was built in a way that it could look better over time. And it was expensive to do this... Crytek subsequently stopped putting in such overkill amounts of dev time citing "piracy" on the PC platform as their reason. See Crysis 2...

But again, if you want to compare, let's have some screenshots of Crysis on 2007 hardware.

I am sure Crytek had configurations on which they developed the game. It's impossible that they wrote the code without anything to run it on :eek:
 
I am sure Crytek had configurations on which they developed the game. It's impossible that they wrote the code without anything to run it on :eek:
Not sure what that comment means tbh. It was well-documented at the time that the V.high/Ultra settings were not meant to be playable on GPUs of the time.

There are other settings besides Ultra, and other resolutions besides 4k (which nobody back then was using), so I'm not sure what your point is.
 
Not sure what that comment means tbh. It was well-documented at the time that the V.high/Ultra settings were not meant to be playable on GPUs of the time.

There are other settings besides Ultra, and other resolutions besides 4k (which nobody back then was using), so I'm not sure what your point is.

But they had, and probably still have, powerful workstations, and on those the game ran well.
 
Crysis on hardware of the time wouldn't have looked like that.

Nobody could run Crysis at 4k in 2007.

Nobody could run Crysis at Ultra settings in 2007 :p

The only thing that proves is that Crysis was built in a way that it could look better over time. And it was expensive to do this... Crytek subsequently stopped putting in such overkill amounts of dev time citing "piracy" on the PC platform as their reason. See Crysis 2...

But again, if you want to compare, let's have some screenshots of Crysis on 2007 hardware.

Well, 4K monitors didn't really exist at the time!! :p

It was more common to have something like 1680x1050 or 1920x1080 on higher-end LCD monitors of the day. I had a 1680x1050 one, which cost £300 or something like that. Monitors have definitely gotten cheaper over time.

Plus, I think you forget that the $250 8800GT matched the performance of the $650 8800GTX in the game, for most intents and purposes. That would be the equivalent of a GTX1060 or RX480 matching a GTX980Ti at launch. Emm, yeah, that never happened.

So imagine if that list on Steam had a GTX980Ti-level GTX1060 instead of a GTX970-level one. That is why a game like Crysis could come out in 2007.

I had an 8800GTS 512MB (£260-ish in today's money), and it was fine at higher settings under DX9 after the drivers caught up - it was things like the one real DX10 effect (motion blur, IIRC) that could cause issues, and that first video on newer hardware at 4K is on non-modded settings. Even SLI 8800GT cards would be just over £500 in today's money, and would be the equivalent of a pair of GTX980Ti cards in 2007.

Regarding videos, most of the videos made at the time were quite low-res and low-bitrate, so it's a bit hard to compare, but running it on an HD4830 and an 8800GTS 512MB it was certainly still quite pretty looking, and nothing really wowed me for years after that.

I had non-gaming friends and family members who went wow at the graphics and the interactivity of the environment back then - it truly was a big leap forward (yes, I know about Red Faction).

Crysis 2 actually downgraded textures so it could run on consoles. MaldoHD fixed that via his mod, but Crysis had some of the best texture quality for years, as consoles had limited RAM and CPU power. The major improvements Crysis 3 had over Crysis were better lighting, improved AA and the use of tessellation.

But it wasn't as open-world, and again, the AI models: Crysis had different models for the flying aliens - many games nowadays CBA and use one model for both human and alien NPCs. That included Crysis 2. Then you had the destructible environments, detailed damage models for things like vehicles and buildings, and weather-induced damage. Crysis wasn't just the graphics but the attention to detail in the rest of the world, and how, for an FPS, you could really use different play styles to get to an objective. The nanosuit was pretty fun!! :p

Edit!!


There is a performance hit due to FRAPS, but that is on Very High (mostly).

Anyway, I have to accept that these jumps in graphics quality, and even jumps in graphics performance (unless you spend loads), are not really going to happen. It's much more profitable for devs and hardware companies to milk things. I mean, I still hold out hope that there will be a PC game which does push things, and that this will come hand in hand with a good performance bump.

I hope Metro: Exodus warrants a good GTX2000 series:

https://www.pcgamer.com/metro-exodus-will-feature-a-year-long-journey-across-the-russian-wilderness/
https://wccftech.com/metro-exodus-details-4k-xb1x-hdr/

Cyberpunk 2077 is likely to be demoed in the next few months:

https://pcgamesn.com/cyberpunk-2077/cyberpunk-2077-e3-trailer

So if it's released in the next 12 months, that would need better cards too.
 
I think this will be the biggest graphics card release ever. The pent up demand for Pascal's replacement is massive.

No doubt it will be good at mining as well, potentially making Pascal only marginally profitable like Maxwell is right now.
 
If I were Nvidia, with some of the things I've been reading, I'd try to move mining onto the Ampere professional cards only :D
 
If I were Nvidia, with some of the things I've been reading, I'd try to move mining onto the Ampere professional cards only :D

But then you'd want to divert production capacity to those cards. Nvidia are in the business of making money for its shareholders, right?

Also, it doesn't quite work: RX Vega is sold out, but the Vega Frontier Edition generally wasn't. Stock of the Titan Xp is also easy to come by.
 
I think this will be the biggest graphics card release ever. The pent up demand for Pascal's replacement is massive.

No doubt it will be good at mining as well, potentially making Pascal only marginally profitable like Maxwell is right now.

Well, Metro: Exodus and Cyberpunk 2077 are going to be big titles, so a nice performance bump would certainly be a good thing!!
 
Ten years ago I had an 8800GT and a 1680x1050 monitor. And CoD4 ran brilliantly at around 90FPS with everything on. That 8800GT was £170. What new £170 card will run a new game at 90+FPS at 1920x1080 with everything on? There isn't one.
 
Nvidia are slowly increasing the gap between card performance tiers by lowering the number of cores in mid-range cards vs the 'full' version.
I'm ignoring the new tier of ultra cards, e.g. the 1070 Ti/1080 Ti.

Back in the GTX5XX generation:
GTX560 - 66%
GTX560TI - 75%
GTX570 - 94%
GTX580 - 100%

Now:
GTX1060 - 50%
GTX1070 - 75%
GTX1080 - 100%

Add to that, the mid-range cards aren't getting any cheaper even before the memory and mining price hikes. I expect anything other than the 2080 will remain under-specified unless AMD get their act together and launch a well-priced volume card. (The core counts behind those percentages are sketched below.)
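For anyone who wants to check those percentages, they follow straight from the CUDA core counts of each card. A minimal sketch in Python, with the core counts quoted from memory of the launch specs (verify them before relying on this):

```python
# Mid-range core counts as a share of the 'full' consumer chip per generation.
# CUDA core counts are from memory of the launch specs - double-check them.
cores = {
    "GTX560": 336, "GTX560Ti": 384, "GTX570": 480, "GTX580": 512,
    "GTX1060": 1280, "GTX1070": 1920, "GTX1080": 2560,
}

lineups = {
    "GTX580": ["GTX560", "GTX560Ti", "GTX570", "GTX580"],
    "GTX1080": ["GTX1060", "GTX1070", "GTX1080"],
}

for flagship, cards in lineups.items():
    for card in cards:
        share = 100 * cores[card] / cores[flagship]
        print(f"{card}: {share:.0f}% of the {flagship}'s cores")
```

Run it and you get 66/75/94/100% for the 5XX series versus 50/75/100% for Pascal, which is exactly the widening gap described above.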
 
Ten years ago I had an 8800GT and a 1680x1050 monitor. And CoD4 ran brilliantly at around 90FPS with everything on. That 8800GT was £170. What new £170 card will run a new game at 90+FPS at 1920x1080 with everything on? There isn't one.
That doesn't take into account inflation. I suppose a 1060 3GB is perhaps a decent shout at 1080p, but I still suspect you'll be lowering settings.
 
Ten years ago I had an 8800GT and a 1680x1050 monitor. And CoD4 ran brilliantly at around 90FPS with everything on. That 8800GT was £170. What new £170 card will run a new game at 90+FPS at 1920x1080 with everything on? There isn't one.

Before or after cryptocurrency inflation?

As mentioned, the GTX 1060 3GB and RX 480 4GB were star 1080p performers for anything but the most intensive games - Destiny 2, Overwatch etc. all at 60+ fps.

10 years ago, I believe the £ was at an all-time high against the $, which was unsustainable. 10 years of inflation at even a low 2% also compounds to about 22% total, as the quick calculation below the link shows. Most people's incomes should have kept up.

http://www.xe.com/currencycharts/?from=GBP&to=USD&view=10Y
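To sanity-check that compounding claim, here is the arithmetic as a small Python snippet (the 2% rate and the £170 card price are simply the figures quoted in the posts above):

```python
# Compound inflation: 2% a year for 10 years.
rate, years = 0.02, 10
multiplier = (1 + rate) ** years
print(f"Total inflation: {multiplier - 1:.1%}")  # ~21.9%, i.e. roughly 22%

# What 2007's £170 8800GT would cost in today's money at that rate.
print(f"£170 in 2007 is roughly £{170 * multiplier:.0f} now")
```

So the £170 8800GT works out to roughly £207 in today's money.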
 