
** The Official Nvidia GeForce 'Pascal' Thread - for general gossip and discussions **

One thing before reading the lines below: I do think it was a genuine Polaris chip that Raja held up, and I do think it was a genuine Polaris chip inside the closed system running the demo. No trickery involved.


But to say that it physically could not have been anything else is just laughable; with some time and a 3D printer I reckon I could make a GPU chip that would be indistinguishable from a few metres away, stick it on a board and you would never know.
The power meters showing the numbers, well, they could be hooked up to anything if the cables just went down behind the table it was all sitting on. Get a couple of normal cards to run the two demos and just tell everyone in the room what they want to hear.

Just to be clear again: I am not saying that is what did happen, I'm just pointing out that it is physically possible to fool a room full of people if you really wanted to, so stating that it couldn't possibly be anything other than what they said it was is just daft.

He was talking about the chip used in the Battlefront demo: that it couldn't have been anything but a 14nm chip, given its TDP of between 30 and 40 watts. What other GPU can run that game at 60fps and hover around 35 watts?
 

I'm taking it as a given that, like any power monitor demo, it will almost certainly be hooked up in a way that you can physically see a standard power monitor device plugged in, see the power cable go from there to the PSU, see the case connected to the PC and see all the cables.

This is pretty much standard for such a show, i.e. I've seen such displays before and it's incredibly easy to lay them out in a way that the only power is coming directly through that single cable.

As such it would be physically impossible to produce that level of performance at that power level any other way. The demos I've seen like these, and every piece of logic, suggest that card could only have been a 14/16nm GPU; unless it was an Nvidia 1030gt, I'm going to go with it being Polaris.

When articles by respected journalists say it was a demo that showed the power difference, I presume it means they SHOWED the power difference, which includes the above: showing the system is connected to that power monitor and showing there are no other cables involved. A good journalist will walk around such a display and confirm for themselves that it isn't connected to some other computer hidden out of sight.

Faking such a display, which is entirely standard, would require the cable to the monitor to be connected to a completely different system, and that would be easy to spot. It's a detail I wouldn't expect to be explained in an article because there is no need, though if there was something suspicious I would expect it to be mentioned, which in this case no one did.
 
You're forgetting the fact that there were expert independent journalists and other industry experts present.

If you believe that they would be fooled by a 3D-printed GPU, then you're probably one of those people who believe the Earth is flat and that the Moon landings were faked too; in other words, there's no hope for you.

So you are both saying that it couldn't possibly have been faked by whatever means, if they had wanted to.

Again I will state that I don't think it was faked, I'm just saying that it is physically possible for it to have been.
 

Assuming it was faked is assuming the idiocy of all the press who went there and made sure there was nothing fishy about the cable layouts, and most of them reported on it.

 
I never said it was faked, I just said that if they had wanted to, it is physically possible.

Yet again I will state that I do not think it was faked.
 

It isn't physically possible within the confines of the demo that was shown. As with my post above, power demos, which are fairly standard at such shows, almost always make the effort to ensure there aren't extra power cables and don't leave anything you've suggested as even a slight possibility.

If AMD did a power demo over a webcam showing a desk, monitor and a bunch of cables with no journalists present, that is one thing. But at a trade show, on a desk where dozens of journalists can just look around the table and see what is hooked up to what, no, it actually isn't possible.
 
Hey, anything is possible and, like Bru, I don't believe it was a fake, but that doesn't mean it's beyond the bounds of possibility, and with AMD needing to appease shareholders it is even more plausible that it was a fake. It wouldn't be the first time AMD or Nvidia have misled investors.
 

You're floundering. Nvidia DID show a fake; they've done it twice, and with no debt they had no reason to lie... so equating lying/faking with shares is clearly ridiculous.

Second, it is not more plausible that it was a fake, full stop; it can't be, because we KNOW it was real. Dear lord, can you really keep up this pretence that you are unbiased while peddling this stuff? There is literally a video showing the computers ON THE DESK behind the monitors, with the power monitors and cables running directly to those computers on the desk, for the precise reason I suggested. This is standard: supply a power demonstration, then verify it by providing access to see the cables and confirm the screen is connected to that computer, which is connected to that power monitor.

There is no reason to fake it; it's been proven beyond a doubt. You weren't there, but everyone that was there says it was real. You literally claimed, incorrectly, that AMD did NOT show any cores to anyone, and someone provided you with a direct link to an Anandtech article stating they did. That isn't the first time that has been linked to. Multiple journalists say they have physically seen the cores; multiple journalists were in attendance, saw the power demo and would have physically confirmed those computers were connected in such a way it can't be faked.

It was no fake; stop coming up with reasons it might be fake, or specious reasoning for why they might want to fake it.

We know, as fact, that Polaris 14nm GPUs were both demoed and shown to more than enough people to verify what they were seeing. Beyond any doubt it was not fake.

There are more than enough close-up images of the 'Pascal' shown by Nvidia; they confirm beyond any reasonable doubt that they were Maxwell GM204 MXM modules. There is no reasoning around how AMD may have faked it, or how Nvidia may not have faked it.
 
“What higher node? Intel 22nm? There is no TSMC, UMC or Samsung/GF FinFet process above 16/14.”
IMG's first test silicon was on basic 28nm to test out the new architecture and build a reference design, so drivers and software could be worked on ready for the final products. The volume products will not be on 28nm but are expected on TSMC FinFET 16/14 and later 10nm.

First silicon and dev kits don't have to match the final hardware; they only need to be code-compatible.



“Also, aside from FinFet, if we ignore that ... how exactly do you test a massively more dense chip design on a process that's massively less dense? LOL.”
You either do what IMG did and run it at much lower clock speeds so it doesn't overheat, but then it doesn't run at full speed, or you stick liquid cooling on. Another option, as you are only making limited numbers, is to build extra-big chips, which can be a reason why you don't want to show the silicon, as it doesn't represent how the finished version will look. I was once shown a test silicon chip and board held together with clamps. Those are two examples of when you wouldn't want first silicon to get shown. It's perfectly reasonable from a development point of view, but the internet and press tend to take it the wrong way, so early silicon gets hidden away.



“If they go to partners in April, then that's probably the first good test silicon they'll have.”
You cannot ship test silicon to partners as soon as you have it. You need to work on the drivers and software and, more importantly, do all the testing of the chip. All that takes a good six months or longer, so partners have to wait at least that long. Otherwise you are sending your partners a useless lump of silicon, or worse if it has a major fault.



“Chance of Nvidia having 100% yield on any chip they've ever made, none, there is no chance of that. If they had a single wafer of chips back, they would have a dozen or more dead chips perfect for parading around on stage that have zero other use....”
Since when? Have you got any evidence to back that up? My experience with GPU companies is not like that. OK, it does happen that the first silicon chips fail, but when dealing with limited numbers of first test chips you can have a 100% yield. It doesn't happen every time, but on the other hand first-run silicon should not produce dozens of dead chips unless you did something majorly wrong. I do sometimes wonder: have you ever been inside a GPU company? Have you ever seen the R&D departments and how they work?
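For anyone who wants to put rough numbers on either side of the yield argument, the usual back-of-envelope is the Poisson yield model, Y = exp(-D * A). Below is a minimal sketch in Python; the wafer size, die area and defect density are purely illustrative assumptions, not figures from this thread or from either company.

    import math

    # Back-of-envelope Poisson yield model: Y = exp(-D * A).
    # Every number below is an illustrative assumption, not a real figure.
    wafer_diameter_mm = 300          # standard wafer size
    die_area_mm2 = 230.0             # assumed mid-size GPU die
    defect_density_per_cm2 = 0.3     # assumed early-process defect density

    # Crude gross-die count (assume ~10% of wafer area lost at the edges)
    wafer_area_mm2 = math.pi * (wafer_diameter_mm / 2) ** 2
    gross_dies = int(wafer_area_mm2 * 0.9 / die_area_mm2)

    die_area_cm2 = die_area_mm2 / 100.0
    yield_fraction = math.exp(-defect_density_per_cm2 * die_area_cm2)
    dead_dies = gross_dies * (1.0 - yield_fraction)

    print(f"gross dies/wafer: {gross_dies}, "
          f"yield: {yield_fraction:.0%}, "
          f"expected dead dies: {dead_dies:.0f}")

With those made-up numbers roughly half the dies on a wafer are dead, which is where the "plenty of dud chips to wave around" intuition comes from; equally, if you only package and test a handful of parts from an early lot, it is quite possible for every one of them to work, so neither position above is unreasonable.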



“If they have silicon back they have spare silicon to stick on such a card.”
Yet there are so many examples of companies having silicon but not showing it to the public for various reasons. Now I don't always agree with the reasons, but companies do have silicon and keep it hidden. IMG did the same thing; they never held up one product and said it was another, but they kept silicon hidden for six months. AMD have done the same thing and kept silicon hidden. What if the first silicon they have isn't suitable to be shown to the public? What if the first chips are extra big, as one example?
 
You cannot ship test silicon to partners as soon as you have it. You need to work on the drivers and software and, more importantly, do all the testing of the chip. All that takes a good six months or longer, so partners have to wait at least that long. Otherwise you are sending your partners a useless lump of silicon, or worse if it has a major fault.

These days nVidia has full hardware emulation - you should see the size of the Kepler/Maxwell rigs :D - so much of that is done; 90% of the driver work that used to start when they first got silicon is already done. It's more about verifying it actually works with real hardware like it does with simulated ideal hardware.
 
Yet there are so many examples of companies having silicon but not showing it to the public for various reasons.
*snip*
AMD have done the same thing and kept silicon hidden. What if the first silicon they have isn't suitable to be shown to the public? What if the first chips are extra big, as one example?

The thing with this is that he went up on stage and proclaimed that the thing he was showing was something else, when in fact it was a part that was made in the middle of last year.

Now it could be plausible that it was early silicon, but that is unlikely, seeing how the die's dimensions and board layout are identical to MXM Maxwell.

But here are the facts.

AMD showed a system with very low power usage for the graphical performance being demonstrated.

AMD showed chips behind closed doors, as stated by reputable tech websites.

Nvidia showed a car module with what they claimed were two Pascal chips, yet none of their official photography shows the side with the MXM modules.

It has been mentioned numerous times that TSMC's FF+ is a few months to half a year behind Samsung's LPP process. This does not mean that Pascal is delayed, just that TSMC couldn't start manufacturing chips in volume as early. This means that for Nvidia there has been less time between fabrication and testing compared with AMD and Samsung.

Now people might pipe up about mobile chips running on FF+ and LPP already, but they are very small compared to a GPU die and have considerably fewer transistors. This reduces their complexity to manufacture and decreases their ramp-up time.

In the end, Nvidia will have some chips for testing, but nowhere near enough for them to be outside of labs on show, and more than likely none that are performant enough for demonstration purposes due to early silicon, etc.

Yet numerous people still have to act like children.
 
Nvidia and AMD reading this Thread

 
Any actual Pascal information across these 52 pages? Or just the usual trash and petty bickering?

There is very little actual information - the only kind-of-solid info really is that nVidia are gearing up Pascal development (as exhibited with automotive technology, etc.), HBM(2) availability is a mixed story but Samsung is a little ahead of schedule, and TSMC started delivering product tapeouts for their major customers towards the end of last year and recently started volume production of the process that Pascal will most likely be built on.

There is also fairly solid information that nVidia is planning on using Volta for the high-end "big" compute parts, with a roadmap for rapidly ramping up throughout 2017 and delivering to commercial customers in 2018 - consumer access to Volta won't come until later. Commercial compute application of Pascal seems to be restricted to mobile through to medium-level HPC, and not the big supercomputers.
 
Updated the post a little - there is very little solid information at all, but a certain amount of minimum/maximum range/timetable can be inferred from what other companies are doing, i.e. automotive products, other TSMC clients, etc.
 


People try to convince each other that their theories are better. Pointless. We will see what the state of the two manufacturers is when the cards come out.
 