• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

If next year's consoles do 4k/60..

Well, yes. You can't just pluck arbitrary figures out of the air. Why 'for them' :confused:

Is this like wccftech now where we can just make stuff up :p

If he's not counting R&D, marketing etc., just the material cost to manufacture, I reckon a 2080 Ti probably costs a couple of hundred quid. I am definitely making that up, mind, haha.

I say "for them" because I think Nvidia are starting to overprice their products, so if someone told me they make a greedy profit I'd probably nod and agree (with zero evidence of course :D)
 

[image: OBLcayd.png — Nvidia income and net margin chart]

That is income and net margins after expenditures have been deducted, up to the start of 2019, and their margins are even higher now IIRC.
 
Also, a normal PC would not have been powerful enough either - The Witcher 3 in its finished form was still very hard to run even with the recently launched Maxwell GPUs. Remember, we should have had a new node by then, but TSMC 20nm was a failure, so designs probably had to be backported onto 28nm.
The major change that happened was vis-à-vis lighting, and that's only because what they had initially was unworkable for the whole game. That's different from just downgrading. There's no doubt that we could have had a much more advanced game if consoles were stronger, but that's always the case. I don't think, looking at those pictures (which, btw, aren't even from a heavily modded PC version), you can say 'there's little difference between that and what's on consoles'.

There's just no way to look at these pics and think the differences are minor. No chance!

[images: iGCvJ0.png, iaARKq.png, jW5bJb.png, iar5zP.png — PC vs console comparison screenshots]

And certainly not if you take the E3 version to be radically different from release; it's just not.


Since CDPR was hardly likely to give a frank and full public statement about exactly why and which parts of the game were downgraded, I tend to defer to the following article:

http://whatifgaming.com/developer-i...-from-2013-list-of-all-features-taken-out-why

It seems The Witcher 3 graphics controversy never ends, and it is for good reason. Recently it has been quite apparent to a lot of people with footage showing up all over the web of newer PC build gameplay demos supposedly running on ULTRA on the PC (Poland preview event) which pales in comparison to the 2013 gameplay trailer. We contacted our insider who provided us information on The Division downgrade and apparent delay into 2016 months ago – who managed to connect us with someone in the know-how at CD Projekt RED last night to further explain the situation and set the record straight.

Here is simply what they had to say in regards to the whole thing (we advise everyone to take it with a grain of salt, despite having vetted their identity ourselves):

2013 was a tough time for CD Projekt RED simply because we were trying to create an entire bulk of the game on the older DirectX 9 renderer that we had in place for The Witcher 2. Most of the assets were created during the time we were creating our DX11 solution render pipeline to bring the next-generation experience to everyone. A lot of the footage including the debut gameplay trailer was done when the consoles were not even out and we only had an idea of the specifications of the system. This landed itself into problem territory when we realized the next-generation systems could not simply meet our graphical output to the desirable level of quality that we needed. There were several options: build three different builds or consolidate to the nearest denominator, which is what we did. We took the specifications of the lowest performing throughput system which I don’t care to mention here at all to avoid that discussion, and worked our way up from there. As almost a 250 man team, we sequentially had to take out/turn down a lot of features not just from our NVIDIA GameWorks pipeline but our normal game solution scripts as well – these include the following:

  • Level of horizon detail (essentially the draw distance had to be completely tuned down to tax the consoles less)
  • Volume based translucency
  • Ambient occlusion and foliage density / tree count
  • Flexible water simulation / tessellation (we resorted to a scripted texture effect, similar to most games, rather than physics-based simulation)
  • Ground/building tessellation
  • Forward lit soft particles (this is the fire, smoke, fog that you would encounter while going through thick terrain into open space)
  • Real-time reflections in the water are completely off and replaced with a cheaper render solution estimator (this is a primary reason blood splatter was also removed from water)
We just did not have the manpower, budget or the console power to produce the vision we intended before the consoles were released to create a more visually stunning game of higher fidelity like 2013 assets. The PCs themselves had more than enough power to achieve this vision, almost certainly. But working on the game across 3 platforms did not make it feasible to keep features included that could potentially break the game as we kept building around it. All the 2013 trailers were actually in-game footage (not prerendered or vertical slices) but essentially just not an entirely finished world running on a high-end PC at the time.

When questioned as to why CD Projekt RED's community managers have insisted that "there will be no downgrade" and that there has not been one (hardly a smart answer to anyone with a pair of eyes):

In game development you simply just don’t explain it like this. It isn’t something a developer ever wants to admit to because it would make us look bad even if it is plain as day. It would make us seem like we’re incapable and that next-gen is not as next-gen as people would think. The team would rather focus on the positives than admit to any faults, negatives, or that the final product is not the vision they intended politically speaking (because the game still looks good but not 2013 good). As for the PC version, it looks just like the console versions just with a higher resolution and a lower-form of HairWorks in effect.

Again, because I cannot reveal the identity of this developer for obvious reasons, I can tell anyone out there interested to take this whole thing with a grain of salt. But likewise, take any talks of “nothing has been downgraded” since 2013 with the same amount of salt as well from CD Projekt RED.
 

Half of those things are moddable, or were available even in the ini at release. The others aren't, but things like SSR I'd put under lighting as well. It's hard to trust that guy, and even if you do, it's unclear whether he's talking about the Ultra preset (which indeed isn't a great deal different from consoles) or whether he includes the hidden presets that were available in the ini as well. This is where the downgrade talk comes from, which was more of a concept trailer than actual gameplay: https://www.youtube.com/watch?v=8Z8qJZZf4Wo
Edit: It's actually funny, I went to Gamersyde to get a better look at the uncompressed trailer and it's crazy how much worse it looks than the end product. The initial reveal has a very Witcher 2-esque look to it. I think a lot of people remember it as better than it was just because it's hidden away at lower resolutions. The screenshots in particular are very revealing. Check it out (and compare to the screens I posted before):
https://www.gamersyde.com/news_images_of_the_witcher_3-13837_en.html

E.g.:
[images: iyOZp9.png, image_the_witcher_3_wild_hunt-22370-2651_0001.jpg]
(end edit)


But, with all that being said, I don't care whether it was downgraded or not. I happily concede that point from the start. I'm more interested in why you think the differences vs consoles are minor when you get:

- >2x the NPC count
- >8x the shadow map resolution
- >5x the grass density & variety
- >8x the cascade shadow distance
- >4x the texture resolution
- >5x the foliage distance & detail
- Vastly better fur & Geralt's hair
- Vastly better AO
- Still improved water tessellation & detail
- Vastly better AF
- >10x the decals
- >10x the mesh LOD distance & rendering

To say nothing of the fact that you can choose your own resolution and fps target, plus the wondrous joys of ReShade (even better AO, GI, SSR etc.) and all the other goodies that come with PC - though, fine, those are more universal, so I'll not count them.

But how can all those be minor? I just don't get it, unless you need a totally different game in order for the difference to count.
 

TBF, at launch, even with the pared-back features, the game ran badly, especially with any GameWorks effects switched on. It depends on what the CDPR chap means by running OK on PC - was it quad-SLI GTX 980s?

But missing out on 20nm really didn't help - both Maxwell and Fiji were meant to be 20nm designs AFAIK, but were backported to 28nm.
 
iPhones and Samsungs cost well over £1,000.
I have to wonder whether this will be the console that also costs £1,000?

Samsung and Apple also have cheaper models to hit the entry and mid-range market segments.

Consoles don't have that luxury; they ship one model that needs to appeal to everyone.
 

Apart from that rumour that there's a second next-gen Xbox in the works that would be cheaper, for streaming games and the like - so you never know. Also, phones last what, 18 months for the top-of-the-line ones, and even the "cheap" ones are £600; as has been proven, people are stupid when it comes to money these past few years.

As for price, I'll still be amazed if it's not £599 or more, unless MS (and Sony for that matter) are going to take a HUGE hit on each one sold.
 
Well, I'm almost certain the 3070 should be decently faster than the new consoles; I still wouldn't use one for 4K though. Might get one for 1440p so it can max games out to look their best at high fps.
As long as it can hit 60fps in most games at 4K I'll be happy. Can always use DLSS in more demanding games. They just need to make sure they price it right.
 
IF Nvidia run true to form, a 3070 should match a 2080 Ti, and as long as they don't gimp the RT cores, it should surpass it in ray tracing, being 2nd-gen RT cores. Will be interesting to see the difference, tbh.
 
Consoles can come knocking on my door when they are capable of VR at high-end PC levels. Until then, I'm not interested.

The last gen was supposed to be 4K capable. They barely were, using upscaling and managing a paltry 30fps. Now they're touting 8K. As if that won't be a pure upscale rather than a rendering resolution. I'm betting they'll barely hit 4K/60fps in a bunch of graphically-crippled games.

Also remember consoles will be stuck again with previous gen PC hardware for the next 5-7 years. Unless cloud gaming becomes a big thing after so many failed attempts and kills it off.
 
The thing is, MS haven't said 8K AT 120fps, or even 4K AT 120fps; it's just idiots across the internet (and on here) who have seen them mentioned and instantly expect it, because it's gotta be more powerful than the PC mustard race!

Also missing the HUGE fact that not many TVs out there can actually do 4K at 120Hz, or that most people won't own one any time soon, let alone an 8K set at any refresh rate!
 
You should be happy these consoles are coming out. It will benefit PC gamers a lot: a new baseline and SSDs mean we'll get much better games being ported to PC. It will unfortunately take a few years until they make games only for the new consoles and not for the PS4 and Pro as well.
 
1440p @ 120Hz would still be pretty decent as an option. In fact, you might find you can choose between 4K/60 and 1440p/120Hz.

Thing is, there are competitive multiplayer games like Overwatch that would play perfectly well at 4K/120Hz.
 
To be honest, I don't care much for modern gaming. Sure, they can make the visuals better, but that's about it. Game design is still pretty much the same as it's always been since Wolfenstein and Doom: you run around, collect and shoot stuff. It's become dull as dishwater.

I want new experiences. Which is why the last (and only) console I ever bought was the Wii (excluding 8-bit consoles and computers from the late 70s and 80s). And now we finally have decent VR. It's different and exciting. Playing endless first person shooters with just better graphics on a flat panel...boring.
 
That's IF your TV supports it. Remember, the Xbox is aimed at the living room; yes, a lot of people play them on monitors, but the vast majority will buy a 65-inch 4K TV that only does 60Hz for under £500 rather than a 4K 120Hz monitor, the cheapest of which, according to PCPartPicker, is over £1k!
 