Nvidia Launches "AMD Has Issues" Marketing Offensive ahead of Hawaii Launch

This is not news to me but it may be to some.


Nvidia Launches "AMD Has Issues" Marketing Offensive ahead of Hawaii Launch


Recently I read an article on one of the popular technology websites which drew a comparison between multi-monitor gaming and 4K displays, joining the fray in claiming that Nvidia offers a superior 4K experience over AMD. Our experience, however, was exactly the opposite.

First, a bit of background: here at BSN we have more than five years of experience working with 4K equipment, including displays. As a video enthusiast and 4K gaming proponent I have to state that while gaming on a 4K panel is more immersive than AMD's Eyefinity or Nvidia (3D) Surround, the biggest problem lies within the displays themselves.
...
When running on GeForce cards (tested with GTX 580 / 590 / 680 / Titan), games would show a vertical tear (not a horizontal one), regardless of whether VSync was on or off. Furthermore, Nvidia cards would display the taskbar across only 50% of the screen, and the wallpaper was doubled.

Our test setup - we used 7970 boards from PowerColor and GTX Titan cards from Nvidia.

On the other hand, AMD Radeon hardware (tested on the HD 7970) worked simply - just select Eyefinity mode and voila: the taskbar filled the screen, there was a single wallpaper and, most importantly, no vertical tear in the middle.
The reason for these issues was simple - the display required two Dual-Link DVI cables, and Nvidia consumer cards had problems running in a dual-display setup. The story was exactly the opposite on the Quadro cards, as witnessed after running the Quadro K5000 - no tearing whatsoever, and a unified display (single wallpaper). After we worked with Nvidia to explain the exact nature of the problems, the company released a driver that fixed all of the issues encountered.

At the time, AMD offered the better gaming experience, as you can read in our detailed 4K Shootout article. Bear in mind that we tested scaling across single, dual and triple-GPU configurations on both AMD and Nvidia.

...
And this brings us to the most important part of the whole story. Why would someone write an article mentioning or criticizing the 4K gaming experience a week ahead of AMD's launch of 4K-optimized drivers and the Hawaii GPU architecture in (you've guessed it) Hawaii?

Source

Full article here; it's a very good read.

http://www.brightsideofnews.com/new...rketing-offensive-ahead-of-hawaii-launch.aspx


 
Yes, the title was actually a post from someone else which I mistakenly thought was the headline of the article. I will edit it now to the correct headline. Thanks for pointing it out.
 
You are forgetting this prototype driver addresses the frame pacing problem at resolutions of 2560x1440 and below.

It does nothing about the Eyefinity and multi-monitor problems mentioned in Ryan's article.

There is no dodgy journalism here. Read the article and you'll see.

And this brings us to the most important part of the whole story. Why would someone write an article mentioning or criticizing the 4K gaming experience a week ahead of AMD's launch of 4K-optimized drivers and the Hawaii GPU architecture in (you've guessed it) Hawaii?

Ryan admitted on oc.net he'd been holding on to the info for a couple of months. Timing stinks.
 
Tommy has a point. In that 4K article he's using a frame pacing driver and it appears to be working. I thought he didn't have access to a frame pacing driver that fixed anything higher than 1600p? The plot thickens. So if he had a working driver back in April, why bring this out now while not using that previous driver? Especially as the final fix for 4K comes in a week or so.
 
Ryan said:
Actually, funny you bring that up. To be 100% up front with you, PC Perspective had the entire gamut of testing hardware BEFORE NVIDIA did and in fact NVIDIA was asking US how WE were doing things in 2012 before they created FCAT. We bought the very first Datapath DVI-DL capture card in the US, on our own. We bought the DVI splitter on our own. All the hardware was purchased out of OUR POCKET. Thanks for allowing me to clarify that.

postaltwinkie said:
Wait....One second....

The part I made bold.....

This video was posted by PCPer on May 9th of this year.....

You say you were doing this before Nvidia, yet here is Nvidia on your show talking about how they had been doing it for a "bunch of years", and he then specifically says they created FCAT a "couple" of years ago. This statement puts FCAT creation back to about ~2011....at least a full year before the date you are giving in your above quote.

Nvidia is on record saying they had FCAT prior to 2012, so if that is the case, why would they, or how would they, be asking you how you were doing it?

Ryan said:
NVIDIA is more talking about developing a theory of testing it. All I can tell you is that I had the capability to capture full quality before them... For example: http://screencast.com/t/iIMOLSCATT

malventano said:
(another editor at PCPer)

I seriously doubt Ryan (or any reviewer for that matter) is going to paste emails between them and any reviewed hardware company in a public forum.

It's pretty obvious at this point that those who have some form of allegiance are going to continue spraying until something sticks. My own issue with AMD is that they are basically relying on the review sites to discover the issues and not fixing the issues until they have been revealed and published. I personally don't like being a beta tester on retail hardware that I've purchased. That bugs me.

For the timeline thing, whatever NVidia may have been doing *internally* "for years" is irrelevant. If they were feeding us tools that far back, we wouldn't have to buy our own capture gear, or color individual frames for the published content, just to break the story in the first place.

postaltwinkie said:
No one is asking, or expecting, to see e-mails back and forth. Even the one he provided was a bit shocking to see get pushed out to a public forum like this.

As for the timeline: it is relevant. Ryan stated you guys were doing something before Nvidia was, and that Nvidia came to you for help, when it is clear Nvidia was doing it before you, before anyone. The claim itself calls into question the integrity of the rest of PCPer's content.

Ryan interviewed Nvidia, and the video went live in May, then today he is on here saying PCPer pretty much invented the technique.

Come on, you have to see an issue in that.

postalTwinkie said:
As a private company, they don't have to disclose their financials, you are correct in that. In the spirit of integrity and honesty in journalism, if they are being sponsored by one side or the other, they should disclose that. If they aren't, a simple "We aren't" should suffice, and people should give them some credit and take them at their word on that matter.

That being said, it still doesn't clear up the other major questions about their review and the timeline presented by Ryan in this thread. Considering the overwhelming evidence that today's claim of PCPer doing this sort of thing before Nvidia isn't true, I think that is the bigger issue: if they aren't honest in that facet, what else are they slanting?

This never got a reply. Wonder why. ;)
 
Not directly related, but I was reading on Guru3D. lol, best graph ever. I <3 Crossfire

[attached image: Guru3D graph]

:D

You missed out the best part.

I do want to note that what you see here is actually hard to see on the screen

Basically admitting that without FCAT, you can't really tell. Lol GG.

EDIT

Oh, and I'll bite.

I <3 Nvidia Drivers

[attached image]

:p
 
Well obviously it's hard to read. Looks like he rigged it up to his Richter scale by mistake. May as well have just said it couldn't compete in the bench.

To be fair, the Guru3D article was fair and balanced. It was well known that the frame pacing fix was not yet included, and they made that fairly clear, saying they will re-test when the new drivers are available.
 
So are people with 7-series cards still experiencing frame pacing issues, or is this now just isolated to 4K? The benchmarks in the Guru3D article are obviously at 4K, so I thought it was at least a little bit relevant. Microstuttering is an issue on both sides of the fence anyway; it's never been fully rectified.

No issues up to 1600p. Only Eyefinity/DX9/OpenGL users have the supposed problem at the moment. One thing I've noticed is I don't seem to get microstutter in DX9 games; I don't know why. Maybe it's because they're not very demanding.

Tommy's quote sums it up well: a 4K monitor with a beta frame pacing driver AMD sent PCPer.

The problem being, Ryan has already had access to the prototype driver which helps address the frame metering issue, but I don't see any mention of or reference to it in this latest article.

Just one example of dodgy journalism:



Ryan (Delboy) is misdirecting his audience with 'if it does require hardware' despite first-hand experience that AMD is working on a software fix; he knew that back in April:



'Frame Rating: High End GPUs Benchmarked at 4K Resolutions'

http://www.pcper.com/reviews/Graphi...marked-4K-Resolutions/Battlefield-3-999-Level

Which yet again begs the question: why release a report that contradicts his earlier report this close to the 25th of September, when he has clearly lied to his target audience?



That article was printed back on April 30, 2013, using the following graphics drivers:

AMD: 13.5 beta
AMD: Frame Pacing Prototype 2 (HD 7990)

So clearly AMD had sent PCPer a prototype driver that fixes 4K. However, for some reason, rather than use that again or wait a week for the final version to be released, they did this article using a different driver. At least that's as far as I understand it.

The funny thing is, a page back we have PCPer saying they created FCAT in 2012. Before that, Nvidia were on PCPer (see the video last page) saying they created FCAT years ago, circa 2011. It brings into question a lot of what PCPer say, as someone is clearly lying.
 
Nvidia developed a testing suite well before 2011, even. "FCAT" is the part of that suite which takes the raw data and analyses it; despite the name, it doesn't actually do the (hardware) data capture itself (EDIT: roughly 5 minutes and 40 seconds into the interview, Nvidia touches on this). Not sure at what point it became known as FCAT.

PCPer came up with the hardware side of capturing the data from the source themselves, and at some point, while trying to find a meaningful way to represent that raw data, got into contact with Nvidia, who opened up some of the tools they were using to PCPer, as well as refining their own systems based on what PCPer was doing.
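For anyone wondering what the analysis side actually spits out, here is a rough sketch (plain Python, purely illustrative; it is not Nvidia's FCAT or PCPer's toolchain, and the numbers are made up) of how per-frame display times pulled from a capture get turned into the frame-pacing metrics everyone keeps arguing about:

Code:
# Illustrative only - not FCAT. Shows how per-frame display durations
# (milliseconds, e.g. extracted from a captured video's overlay bars)
# become the usual frame-pacing numbers.
from statistics import mean

def frame_pacing_report(frame_times_ms):
    frames = sorted(frame_times_ms)
    avg = mean(frame_times_ms)
    p99 = frames[int(0.99 * (len(frames) - 1))]  # 99th-percentile frame time
    # Microstutter shows up as big swings between consecutive frames,
    # even when the average FPS looks perfectly healthy.
    swings = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return {
        "avg_fps": round(1000.0 / avg, 1),
        "99th_pct_frame_time_ms": round(p99, 1),
        "avg_frame_to_frame_swing_ms": round(mean(swings), 1),
    }

# A run that averages ~60 FPS but alternates short/long frames -
# the classic multi-GPU microstutter pattern discussed in this thread.
print(frame_pacing_report([8.3, 25.0] * 50))

The point being that two cards can post identical average FPS while one swings wildly from frame to frame, which is exactly why "hard to see on the screen" is not the same as "not there".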

Well, in that thread Ryan is claiming it's PCPer who created it, long before Nvidia.
 
Well, looks like the BSN article is not so full of rubbish after all. TechReport is pretty much confirming what some of it says. :rolleyes:

So the problem is fixed (aside from frame pacing, which is coming soon) by using DisplayPort. Jesus. More mud-slinging crap.

Ryan tells me he was working on this story behind the scenes for a while, talking to both AMD and Nvidia about problems they each had with 4K monitors. You can imagine what happened when these two fierce competitors caught wind of the CrossFire problems.

For its part, Nvidia called together several of us in the press last week, got us set up to use FCAT with 4K monitors, and pointed us toward some specific issues with their competition. One of the big issues Nvidia emphasized in this context is how Radeons using dual HDMI outputs to drive a 4K display can exhibit vertical tearing right smack in the middle of the screen, where the two tiles meet, because they're not being refreshed in sync. This problem is easy to spot in operation.

GeForces don't do this. Fortunately, you can avoid this problem on Radeons simply by using a single DisplayPort cable and putting the monitor into DisplayPort MST mode. The display is still treated as two tiles, but the two DP streams use the same timing source, and this vertical tearing effect is eliminated.

I figure if you drop thousands of dollars on a 4K gaming setup, you can spring for the best cable config. So one of Nvidia's main points just doesn't resonate with me.


And you've gotta say, it's quite the aggressive move, working to highlight problems with 4K displays just days ahead of your rival's big launch event for a next-gen GPU. I had to take some time to confirm that the Eyefinity/4K issues were truly different from the known issues with CrossFire on a single monitor before deciding to post anything.

Full article
http://techreport.com/blog/25399/here-why-the-crossfire-eyefinity-4k-story-matters

So Nvidia calls everyone together apart from PCPer... yeah. Not sure I buy that one little bit.

The Nvidia-biased mods (no doubt Alatar) over at oc.net had the thread title changed and the thread moved to a dead section of the forum. Yet we have another tech site pretty much confirming everything the BSN article was saying. :mad: :confused:

It's scary how deep the wormhole goes. :p
 