Wolfenstein 2 Vulkan benchmarks:

Why don't you actually set the video to 1080p and freeze it and have a look instead of making excuses.
Have you played the game? If you have, you know the time of day can differ according to map progression. It's obvious in the last scene that the weather is different; there are no excuses here, just facts. It's even mentioned in the comments.

He fixed the issue by testing one of the training scenarios, which have fixed beginnings. You can see it in this video: when the time of day is matched, both look the same:

https://www.youtube.com/watch?v=qN550loG-hY
 
I've played it an awful lot actually and the weather doesn't even change lol. Even in that scene it's exactly the same :P . To get different weather you have to select the map where it's at night or the blizzard is on, etc. This guy is playing multiplayer and I can assure you it never changes. Even if it did, why would it make the graphics instantly turn to dog **** ? First time I've ever heard of weather effects having an effect on all these things dynamically.
 
More Wolfenstein 2 benches:

HBCC: minimal fps gain
GPU Culling: minimal fps gain
Deferred Rendering: hurts fps

GPU test: at 1080p the 1080 Ti is 2% faster than Vega 64, at 1440p it is 10% faster, and at 2160p it is 22% faster. I would say this is an obvious CPU limitation: at 1080p the GTX 1080 Ti is only 13% faster than the regular 1080, and that grows to almost 30% at 2160p, which is the expected lead.
https://www.hardwareluxx.de/index.p...ein-2-the-new-colossus-im-benchmark-test.html
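A rough way to read those numbers: if the faster card's lead shrinks as the resolution drops, the frame rate is being capped by something other than the GPU. A minimal sketch in Python, with made-up fps figures purely for illustration (not taken from the hardwareluxx review), showing how the percentage leads quoted above would be computed:

```python
# Illustrative fps figures only (hypothetical; not taken from the linked review).
fps = {
    "1080p": {"GTX 1080 Ti": 153.0, "Vega 64": 150.0, "GTX 1080": 135.0},
    "1440p": {"GTX 1080 Ti": 121.0, "Vega 64": 110.0, "GTX 1080": 100.0},
    "2160p": {"GTX 1080 Ti":  72.0, "Vega 64":  59.0, "GTX 1080":  56.0},
}

def lead(res, faster, slower):
    """Percentage lead of one card over another at a given resolution."""
    return (fps[res][faster] / fps[res][slower] - 1.0) * 100.0

for res in fps:
    print(f"{res}: 1080 Ti vs Vega 64 {lead(res, 'GTX 1080 Ti', 'Vega 64'):+.0f}%, "
          f"1080 Ti vs 1080 {lead(res, 'GTX 1080 Ti', 'GTX 1080'):+.0f}%")

# If the Ti's lead over Vega collapses at 1080p while its lead over the GTX 1080
# at 2160p matches the cards' usual spread, the 1080p results are being capped
# by the CPU rather than the GPU.
```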
 
Last video, Fury X vs 980 Ti. Go to 3:27. The pipes aren't as sharp on the Nvidia card; they lose far more detail the further away they are compared to the Fury. The textures at the front are bad. The rocks in the back have a lot of detail missing and look low quality. The shadows aren't as sharp. The bump mapping isn't as good, and there's even a cloud missing at the back.

The video is riddled with the same. Points for anyone who can spot the differences.

Freeze the last video as you said at 3m 20 and check the purple blocks along with other things for a good lol. AMD looks so much better. Spotted other moments, but none so clear as day as that. The Nvidia side looks like it has some blur going on, whereas the AMD side has all the lines popping as they should be.
 
Funny how this is the video he chose to prove that the image quality thing is a myth. The worst possible choice :P
 
:D:D:D:D:D

Yea, as I said earlier in the thread, my mate has a 5930 running with a 290X on a 4K FreeSync screen, and a 1080 Ti with a 7820K running on an Acer Predator 4K screen. He is seriously considering a Vega 64 because he can see them running side by side every day and the AMD side looks so much better. FPS-wise the GTX 1080 Ti is obviously killing it, but on image quality he's not happy at all seeing what is happening on the other screen. It's truly not even close. His G-Sync monitor is a higher class as well, so there is no excuse there.

We asked his 2 kids to choose what they thought looked better and they picked the AMD side, and they know nothing of AMD v Nvidia. It's been said for years, but until you see a side by side you always think it could be this or that; having seen it side by side, to me AMD have a clear advantage.
 
Unbelievable, I post 4 videos with identical IQ and you guys pick one scene where the lighting is different because of a time-of-day change and base all your useless subjective opinions on it! Talk about straw grasping at its finest. Anyway, you guys go on ahead with your dark fantasies while the rest of the world actually enjoys identical image quality on BOTH vendors.
 
Your opinion ain't that great though, as you posted a video where it was easy to see the difference. I will watch the other videos later, but I wanted to see what Lokken had seen and it was easy to spot. For you they both are the same though.
 
He credits the original video owner in the description, but doesn't thoroughly explain the process. I know his channel and I know his methods; other newcomers don't, so they assume things.
Cheers. I'll give him the benefit of the doubt and put it down to amateur hour rather than an axe to grind.

:D:D:D:D:D

Yea, as I said earlier in the thread, my mate has a 5930 running with a 290X on a 4K FreeSync screen, and a 1080 Ti with a 7820K running on an Acer Predator 4K screen. He is seriously considering a Vega 64 because he can see them running side by side every day and the AMD side looks so much better. FPS-wise the GTX 1080 Ti is obviously killing it, but on image quality he's not happy at all seeing what is happening on the other screen. It's truly not even close. His G-Sync monitor is a higher class as well, so there is no excuse there.

I'd say he needs to look at his settings, control panel and monitor. A lot of monitors (and TVs) have really poor settings out of the box.
 
He has, and it needs to be seen to be believed. He's not your usual PC noob either and has been building them for over 20 years. He got me into PCs :D:D:D:D:D. It really is not settings, as we checked; the Nvidia card/monitor just can't replicate the AMD side for whatever reason. It's not that the Nvidia side does not look good, it just does not look as good as the AMD side. It's not like his 4K FreeSync monitor was on the expensive side, as it was under £400. To me it looks like what Nvidia has always looked like, and that's washed out. Like I say, it's enough for him to think about a downgrade.

An Acer Predator is a decent G-Sync monitor as well, so it should be doing pretty decently out of the box. The FreeSync monitor is not tweaked either. Anyhow, this has been said for many a year, and there has to be some truth to it, as I can't remember it being said much the other way round.

It's not something I would say is gospel, but I tend to believe my own eyes, and I have seen this first hand and do so every few weeks when I am through for a gaming night. Screenshots and videos might not even show it, but in real life, side by side, it's hard to deny. So far I have watched 2 of the videos and the AMD side looks better though.
 
An Acer Predator is a decent G-Sync monitor as well, so it should be doing pretty decently out of the box.

Nah, the AUO panels used in many of these high refresh G-Sync monitors (assuming it is one) do need calibrating out of the box - they look bleached and washed out otherwise - dunno what they were thinking, as they look fine once you get the gamma set up properly.

EDIT: Regarding Battlefront - we've had that one before - the dynamic environments mean things never look the same twice.
 
at 3m 20 and check the purple blocks along with other things. AMD looks so much better.

+1

I have to agree that it does look noticeably better. Before I sold my 7970, I tested it on one or two games and in Fallout 4 I thought it appeared more immersive - almost a greater sense of depth.

I like Nvidia's IQ, but for me ATi/AMD provide a look that I prefer over team green. Sometimes the difference is minor, even questionable, whereas in other instances it's quite noticeable.
 
Nah, the AUO panels used in many of these high refresh G-Sync monitors (assuming it is one) do need calibrating out of the box - they look bleached and washed out otherwise - dunno what they were thinking, as they look fine once you get the gamma set up properly.

It's not high refresh; both are 4K 60 Hz. Could just need more refining, as it's not even close tbh.

You have to ask yourself why Nvidia can't get this right. It's been said for years by many people yet they won't or can't change it. This ain't a new thing and you would think they would do something about it.
 
Knew we had a thread on that Battlefront video before - it isn't an AMD versus nVidia thing; as I said at the time, I've seen it look like both examples on different nVidia cards at different times - it is either some kind of bug or dynamic weather or whatever. It was Lokken86 who posted it last time.
 
We have had a few threads over the years and it's usually about Nvidia looking inferior. These threads would not be popping up if there was no fuel for the fire; Nvidia are doing something wrong for people's eyes to keep noticing this. If it was as easy as just using AMD's colour settings, I am sure they would have done so by now. The thing is, if I had never gamed on an ATI/AMD GPU I would not be bothered in the slightest, as it's not like the Nvidia rendering is bad; it's just that AMD seems to be better.
 
Over the years I've had extensive experience of both GPU vendors, and outside of things like the colour precision issue with the GeForce FX series and the odd driver cheat from both sides back in the day, there has largely been no difference between them in terms of rendering detail/fidelity, etc., except where caused by bugs specific to certain games.

Side by side, AMD generally looks more vibrant, but it's barely perceptible; the colours on nVidia somehow look more drab - I can only assume it is some difference with the video output chip(s) used - but you can get the same thing on nVidia by nudging the digital vibrance up by 2% - though I'm not a fan of playing with that, as it tends to reduce colour accuracy if you change it much.

Beyond that any difference is almost certainly user error or some other problem.

EDIT: Just had another thought I wish I'd had at the time to test - the Frostbite engine has an old, old bug where visual quality settings don't apply properly, especially if you are changing them a lot for benchmarking - it will say high or whatever for a setting but render ultra, or vice versa, etc.
 