
And the Cypress shader count is...

This guy has some issues and it's really starting to tire me.

Perhaps he put his 8800GTX in upside down or something when he failed graduation and can't get over it.

If Nvidia is as dead as he is making out, then I don't see how they could have survived this long as a business.

I like competition, it keeps the gfx card market healthy; if one side is down and out, then the competitor's prices will not budge at the same rate.

I have no doubt these cards will be great, but there's no point posting an article without some solid proof or a source. Like Yamahahahahaha said, 'Believe it when I see it.'

Charlie Demerjian is the worst "journalist" out there. He really is like a 10-year-old school kid with a grudge, and he's meant to be in his late 30s or something.

I lose respect for anyone who posts a quote or links to any of his articles. They're not articles, just the personal rants of some fat, immature, mentally ill yank.
Yeah, he guesses right sometimes, but the vast majority of the time he's wrong, and I can link to countless articles to prove this.
There's even a blog dedicated to the FUD this man writes: charlie-Demerjian-Is-A-Douchebag.

Oh, and here's an article about him on Despicable Media Watch.

It's disgusting he gets paid to write this stuff. Literally anyone here could do his job better. But it's all about page views. So in future, can people stop linking to Demerjian's rants, if only to help stop this moron getting the page views he does not deserve.
 
:rolleyes:

Why don't you just go read another thread? Are you forced to read this thread? :rolleyes: I'm more fed up with stupid Nvidia fanboys making comments about this guy.

If what he says is total **** then why worry? :rolleyes:

What is sad is grown men starting a blog about a guy they don't even know. Why? Because he has a grudge against Nvidia... oh, how very awful, he must be hanged at once :rolleyes:

You sound like a very angry Nvidia supporter, maybe you should get a reality check.
 
You've just got to take him with a pinch of salt; some things he gets right and some he doesn't.
He was spot on about the mobile chips, so he does get quite a few things right. I just read what he says, and if it's true, fair enough; if not, who really cares.
 
I honestly think MR.B wouldn't have mentioned Charlie at all if he was ranting about another company.

That blog, too, is worse than Charlie himself. I mean, if they think what he does is so bad, why do they do the same thing about him? :rolleyes:

We all know Charlie is an idiot, but you don't have to read what he posts or react in such a way.

You're giving him the power if you react to what he says. If no one reacted or responded, I'm sure he'd soon go away, and if he didn't, so what, people would just be ignoring him.
 
Charlie Demerjian is the worst "journalist" out there. He really is like a 10-year-old school kid with a grudge, and he's meant to be in his late 30s or something.

He's actually one of the best journos I've seen online, because most "journos" online simply grab news links and post them up as their own stories, or get sent press releases by every manufacturer and post them up as news. That's not journalism, that's posting links and pasting press releases.

Charlie's posts are long, and for people with no technical knowledge they might seem boring and long-winded, but what he does, what EVERY good journalist should do, is justify his reasoning. He makes a "guess", a very good one, and he explains EXACTLY why he came to his conclusions. There's no grey area, there's no miscommunication, there's no claiming something without sources and evidence. Every single post on the Nvidia faulty-products debacle had verified links to verified information released by several companies that backed up what he claimed. Almost every claim he made has since turned out to be true.

If you want to come on here and claim he's a poor journalist and insist most of what he claims is incorrect, give us an example of something he claimed that isn't true. Or is it really just that you love poor ickle Nvidia, or that you simply didn't understand most of his stories?

When did being thorough, providing proof, backing up his stories and actually investigating things make you a bad journalist? Too many people think simply regurgitating whatever press releases they are sent is what journalism is; it's not.
 
I doubt that very much. With no option but to buy AMD, gamers wouldn't have any choice; they'd either buy the overpriced high end card or the overpriced mid range card, and with no choice their profits would increase one way or the other.

If you think otherwise then you're just deluding yourself or you're totally clueless, and with the comment about Nvidia being irrelevant I definitely know which way I'm leaning.

This is where you're COMPLETELY WRONG. You're ignoring the very simple choice, in fact you're excluding it: people can and DO choose not to buy a card. Did every person on earth run out and get a 260GTX at £300 when it was launched, easily spanking the last-gen cards and easily surpassing the 3XXX series' performance? No, because not everyone will spend £300. It's very simple: 1,000 times more people will buy a card priced at £200 or less than a card priced at £300.

Most people simply won't upgrade if it costs too much, and high end sales have never, ever been the money makers for either company. They used to make all their profits off £30-80 cards, with the higher end cards making very little. AMD have managed to extend their profitability from the £30 cards right up to their £200 cards.


Again I'll point out, you've been proven wrong already. It's not very difficult: the 4870 was competitive with the £300-330 260GTX on its release, and AMD knew that price; it wasn't a secret, since the 4870 came out after it. Given your theory that competition is what sets prices, AMD would have launched their card at or just under the 260GTX's price. They could have undercut it by £20, but for some magical reason they decided to undercut a similarly performing card by £130. Keep in mind they could have sold them at £280 and, when Nvidia dropped prices, kept dropping theirs. Why didn't they? Because they get more sales and make FAR more money selling them cheaper.

Nvidia had NOTHING to do with AMD pricing their card at under £200. It's very clear they had nothing to do with it, because no company would purposefully sell a card £100 cheaper than it could have if, as you claim, it would sell the same number of units anyway due to the lack of other options. By your logic they could have sold the exact same number of cards and made an extra £100 profit on each and every sale.

Of course the sales figures don't back you up, and the fact that AMD sold a ridiculous number of 4XXX series cards entirely due to their price should be completely ignored :rolleyes:

Keep in mind things like the Steam surveys, where a 6600GT would be in, say, 40% of people's computers while 6800GTs were in less than 1-2%. Nvidia made a lot more cash out of the 6600GT than they did out of 6800s of any level. That's how the industry has always worked.
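To put rough numbers on that volume-versus-margin point, here's a quick back-of-the-envelope Python sketch. The prices, costs and buyer counts are made-up figures purely for illustration, not real sales data:

# Made-up figures purely for illustration, not real sales or cost data.
high_end   = {"price": 300, "unit_cost": 200, "buyers": 10_000}   # fewer buyers, bigger margin
mainstream = {"price": 200, "unit_cost": 150, "buyers": 100_000}  # far more buyers, smaller margin

def total_profit(card):
    # profit per card times number of cards sold
    return (card["price"] - card["unit_cost"]) * card["buyers"]

print(total_profit(high_end))    # 1,000,000
print(total_profit(mainstream))  # 5,000,000

Even with half the margin per card, the cheaper part comes out miles ahead on total profit once it sells several times as many units, which is exactly the point about the 4870's sub-£200 price and the 6600GT versus 6800GT split.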
 
What Intel says and what actually happens are totally different things... demonstrations (by Intel) of simulated Larrabee performance are distinctly underwhelming against this generation's performance, let alone future generations'... and Intel's approach to drivers will get them nowhere in the gaming world, even at the low/mainstream end...

Sorry, I don't see Intel as a threat in the gaming market to either ATI or nVidia unless their entire attitude and methodology shifts dramatically.

You keep saying their approach to driver making will get them nowhere. They haven't had a high end gaming card, or a midrange or low end one for that matter, to release drivers for. Their drivers aren't great, yet they work fine. As many people have pointed out to you, they have the money to, should they want to (and I can't see why), hire everyone on Nvidia's staff to do their drivers. They have the resources to hire every great programmer on earth to do their drivers. They haven't to date, because why would they need an Nvidia/ATi-style driver team when they don't have products for it to work on? When they do, they'll get the right people in; not sure how that's hard to understand.

I've been saying this for a year, and what have Intel gone and done in the past couple of months? That's right, bought up a bunch of small programming firms that all focus on multithreading. So in fact they are buying up experienced programmers in the multithreaded-performance area, just as anyone could have guessed they would.

So they ARE buying up programmers left, right and centre as their cards approach being ready for release.

As for what their simulations and so on have shown to date, errm, so? They showed a simulation of the performance of, as I understand it, the current working generation, likely the generation of Larrabee behind what they will eventually release. There's also little sense in assuming you will come in and dominate the top end performance segment, which is why they quite smartly seem to be aiming for the low/midrange segment, a much easier goal to attain and also, unsurprisingly, the likely kind of cores they would at a later date be integrating onto a CPU die.

AMD and Intel will both be adding GPUs onto the CPU die in 1-2 years. So, smartly, Intel wants to make low end graphics their focus for now, so they have some experience and some brand recognition (in terms of graphics and gaming performance), so that people actually consider an Intel GPU on die a worthwhile addition.

The thing is, your whole argument against Larrabee seems to be based around the seeming incompetence of their driver team, which never was and never needed to be a great driver team. You seem unwilling to accept the idea that Intel will add people, more people and better people, to this team. They already have now, so maybe it's time to go with the idea that they might have great drivers, plus TWIMTBP is going to be utterly and absolutely blown out of the water by Intel investing their time, people and money in helping game devs from now on. Which means Nvidia will likely lose all advantage over ATi from their programming help and investment, and that advantage will swing Intel's way in a huge way in the coming years.
 
I personally don't care who the guy likes or hates; I know very little about him. But I must say I've yet to read another reporter that isn't up Nvidia's or ATI's backside. This guy speaks in plain English for us noobs, and he also gives an honest opinion.

All I ask for :)
 
You keep saying their approach to driver making will get them nowhere... When they do, they'll get the right people in; not sure how that's hard to understand.

It's not hard to understand... what you don't seem to understand is that they could hire the entire nVidia driver team, throw all their money at them and still fail - because of the entire mentality, direction and structure they would have to work with at Intel... they would have to give that team complete freedom to work outside of Intel for it to succeed.

Intel like to make very plain, very functional, fire-and-forget drivers - which in itself is not a bad thing, and generally works much better than the bloatware many companies throw out that takes 20 updates to fix a single bug while introducing 20 more along the way... but it doesn't work for the gaming market, where people like lots of features and regular updates... until this approach from Intel changes they will not succeed, and it's like changing the direction of a big ship...

How Intel will measure up to nVidia's TWIMTBP is hard to predict, but they don't tend to have the flexibility to work with the varied demands of different studios...

For all that, I do hope Intel does succeed, even though I don't see it happening without a major shift in how they work - it would be nice to see some new competition and innovation in the market; things have got a bit stale lately.
 
I don't understand why there's a Charlie love-in going on.
He's consistently wrong, just look at that blog...

It's not really journalism if you just make loads of random statements and, more often than not, get it wrong.
Although, since most people read red-top newspapers, I'm not surprised that this level of 'journalism' is considered meritorious.

Anyway, back to ATI... Still nothing official or interesting leaked.
 
Charlie is a hack - I love the way he goes on massive rants about things no one cares about... half the time he's just regurgitating things he's been told in a very naive and idealistic manner... reminds me of the guy from GoldenEye.
 

It's obvious that most of what he complains about, you of all people wouldn't care about, but a lot of what he goes on about is nVidia's ethics, which plenty of people on here are interested in.

For Charlie to be a hack, I would guess that you should also be counted as one, to be honest.
 
But we weren't talking about me; I'm not sure why you feel the need to attack me personally at every chance you get.

It's not an attack, it's an observation. As for every chance I get, that would mean every post I make in relation to you is an attack, which isn't true.

As for 'we weren't talking about me', you made a comment about someone being a hack, and a few people around here think you're also a hack. It was only fair. ;)
 