The first "proper" Kepler news Fri 17th Feb?

It's powered by a micro-blackhole which sucks in dust and spews out small amounts of gamma radiation, which is then harnessed to self-sustain the GPU and LEDs - thus requiring no Molex connectors whatsoever.
 
No, afraid not :(

The first is a repost of this chart, which was discussed earlier in the thread (see here for my take on it). I can't believe that people are still reposting the "6.0Gbps" figure for "memory speed". Surely this should be "6.0GHz quad pumped"... 6Gbps would make no sense whatsoever, as the bandwidth of the GTX580 was 192.4GB/s.

The second link seems to be based on the other chart that has been doing the rounds for a few weeks (see here).
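
For reference, that 192.4GB/s figure works out from the per-pin rate and bus width; here's a quick Python sanity check (GTX580 specs assumed from the usual public listings, not from either chart):

```python
# GTX580: 1002MHz memory clock, quad-pumped GDDR5, 384-bit bus (assumed specs).
memory_clock_mhz = 1002
pumping_factor = 4              # GDDR5 moves 4 bits per pin per clock
bus_width_bits = 384

per_pin_gbps = memory_clock_mhz * pumping_factor / 1000    # ~4.0 Gbps per pin
bandwidth_gb_s = per_pin_gbps * bus_width_bits / 8         # bits -> bytes

print(f"{per_pin_gbps:.1f} Gbps/pin -> {bandwidth_gb_s:.1f} GB/s")  # 4.0 -> 192.4
```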

6Gbps is the internal speed of the memory per pin..... 32 pins = 192GB/s (ish, actually Gbits but who cares) bandwidth.

Memory speed can be listed in a variety of different ways, bandwidth is bandwidth though and memory speed is memory speed.
 
6Gbps is the internal speed of the memory per pin..... 32 pins = 192GB/s (ish, actually Gbits but who cares) bandwidth.

Well since it's a factor of eight difference, I'd say it's pretty important :confused:

Besides, when have you ever seen GPU memory bandwidth quoted in terms of bandwidth per pin?! Not to mention that the chart says "memory speed", not "bandwidth".

Are you saying that you believe the chart is genuine?
 
Well since it's a factor of eight difference, I'd say it's pretty important :confused:

Besides, when have you ever seen GPU memory bandwidth quoted in terms of bandwidth per pin?! Not to mention that the chart says "memory speed", not "bandwidth".

Are you saying that you believe the chart is genuine?

Again I'll say it: the SPEED of the memory is rated at 6Gbps PER PIN, and GDDR5 chips have 32 data pins, which makes the speed 192Gb/s per chip. That gives a chip on a 32-bit bus 24GB/s; on a 256-bit bus, that gives 192GB/s.

With a fairly standard 256-bit bus, the /8 (bits to bytes) and the x8 (eight chips) cancel out when you convert to GB/s, and on a bigger bus you can still rather easily convert it directly, so that's why I don't care ;)
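
If it helps, here's that arithmetic as a quick sketch in Python (purely my own illustration of the numbers above, nothing more):

```python
# Per-pin data rate -> per-chip and total bandwidth for GDDR5 at 6Gbps.
per_pin_gbps = 6        # rated per-pin data rate
pins_per_chip = 32      # a GDDR5 chip has a 32-bit data interface
chips = 8               # 8 x 32-bit chips = 256-bit bus

per_chip_gbit = per_pin_gbps * pins_per_chip    # 192 Gb/s per chip
per_chip_gbyte = per_chip_gbit / 8              # 24 GB/s per chip
total_gbyte = per_chip_gbyte * chips            # 192 GB/s across the bus

print(per_chip_gbit, per_chip_gbyte, total_gbyte)   # 192 24.0 192.0
```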

You misunderstand: bandwidth is bandwidth (192GB/s) and memory speed is memory speed (6Gbps); that IS the memory speed. Memory speed is often simplified to MHz for consumers, but as standard it's talked about as bandwidth per pin, so the answer is yes, I frequently see memory rated by its per-pin bandwidth and not its MHz.

http://www.techpowerup.com/111029/Hynix_Introduces_World_s_First_40_nm_Class_2_Gb_GDDR5_DRAM.html

A press release and news story on an enthusiast website that doesn't mention MHz/GHz once, and only mentions the bandwidth.

Samsung don't make 7GHz GDDR5, they make 7Gbps GDDR5.

I always find it funny: if I simplify things to make a post shorter on a public forum which isn't chock full of scientists, I get you and xsistor breathing down my neck over being incorrect.... and if I post a long explanation, everyone complains about long posts.

http://www.anandtech.com/show/2556

Plenty of mention of Gbps there, and MHz. Frankly, Gbps is becoming more standard in reviews because it makes more sense.

900MHz GDDR3 and 900MHz GDDR5 look the same to your average user. Listed in bandwidth form you have, say, 1.8Gbps GDDR3 vs 3.6Gbps GDDR5. It's a vastly better way to represent the speed, and in the future, with memory types changing, the only thing you ever want to know is the Gbps; clock speed is irrelevant, as different types of memory make the comparison worthless.
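
Here's a rough sketch of that comparison in Python (the multipliers are the usual conventions I'm assuming: GDDR3 double data rate, GDDR5 quad-pumped):

```python
# Effective per-pin data rate from the headline memory clock.
TRANSFERS_PER_CLOCK = {"GDDR3": 2, "GDDR5": 4}  # assumed conventional multipliers

def per_pin_gbps(clock_mhz: float, mem_type: str) -> float:
    """Per-pin data rate in Gbps for a given memory clock and type."""
    return clock_mhz * TRANSFERS_PER_CLOCK[mem_type] / 1000

print(per_pin_gbps(900, "GDDR3"))  # 1.8 -- same 900MHz clock...
print(per_pin_gbps(900, "GDDR5"))  # 3.6 -- ...twice the data per pin
```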
 
Again I'll say it: the SPEED of the memory is rated at 6Gbps PER PIN, and GDDR5 chips have 32 data pins, which makes the speed 192Gb/s per chip. That gives a chip on a 32-bit bus 24GB/s; on a 256-bit bus, that gives 192GB/s.

With a fairly standard 256-bit bus, the /8 (bits to bytes) and the x8 (eight chips) cancel out when you convert to GB/s, and on a bigger bus you can still rather easily convert it directly, so that's why I don't care ;)

You misunderstand: bandwidth is bandwidth (192GB/s) and memory speed is memory speed (6Gbps); that IS the memory speed. Memory speed is often simplified to MHz for consumers, but as standard it's talked about as bandwidth per pin, so the answer is yes, I frequently see memory rated by its per-pin bandwidth and not its MHz.


This isn't true.

The term bandwidth comes from the field of analogue electronics, particularly the areas of RF circuits and (analogue) signal processing. Bandwidth is defined as the upper cut-off frequency, f_u, for a baseband signal or, for a bandpass signal, the upper cut-off frequency minus the lower cut-off frequency, f_u - f_L. (For example, a classic telephone channel passing roughly 300Hz to 3.4kHz has a bandwidth of about 3.1kHz.)

In this context, bandwidth actually meant "the width of the frequency band".

But because bandwidth came to be intimately related to data transfer rate (or the analogue equivalent of it), when the mathematician/theoretical engineer Claude Shannon rigorously formulated his "Information Theory", and with the rise of digital electronics, the term "bandwidth" came to be used as a metaphor for a raw measure related to maximum digital channel capacity.

The term "speed" does not exist in an engineer or mathematician or physicist's vocabulary. It is too vague a term. Not even our lowly mechanical engineering brethren use it anymore ( :D :D :D I kid, I kid, Duff Man). so even in mechanics the term velocity is preferred because "speed" doesn't tell you much. And in electrical engineering we use terms like data transfer rate, or channel capacity. No engineer or physicist I've ever met has used the term "speed" when referring to data transfer rate, channel capacity, GT/s, Clock rate, FLOPS or whatever else. To do so, I suspect, would be equivalent to if a microbiologist used the term "germs" to refer to microorganisms.

Now, as I said before, the term "bandwidth" in digital systems (like memory) is a metaphor and does not retain the original meaning (i.e. "width of a frequency band used as a communication channel"). It can be used in place of data transfer rate. To say that 6Gbps is not bandwidth and then claim that 192GB/s is, is both unscientific and wrong. Either can be used as "digital bandwidth", because digital bandwidth is literally the maximum theoretical transfer rate (normally ignoring corrections for losses, etc.).

So the term as used in computing is imprecise. It's like any figure of speech, and in this instance it harkens back to the original meaning of the term in analogue electronics. To get pedantic over it is a pointless argument.

There are more examples. People in the computing field often like to take technical terms from electrical & electronics engineering and use them as metaphors. Don't get too hung up on these. A good instance is the term "impedance mismatch", which has a very specific formal meaning in the subfield of transmission line analysis within engineering electromagnetics. But computing folk use it to mean something else entirely.

Here's a bit more historical and technical insight:
During the heyday of Amplitude Modulation (AM) and Frequency Modulation (FM), when radio frequency electronics was the cat's whiskers, the term bandwidth gave some indication as to how much "information" you could fit through a channel. While this measure was important, it did not by itself give a solid indication of exactly how much that information was. Because things like sampling rate and aliasing effects could degrade an analogue signal without obliterating its information content, the term was used more to give an indication of quality. But the degradation of quality was intimately related to the fact that the channel did not have the capacity for the data sent through it. (To be perfectly honest, no channel can fully encode an analogue signal: converting an arbitrary analogue signal, with all its imperfections, to a digital one without any loss would require infinite memory, and infinite energy. The latter limit is called the Bekenstein bound in theoretical physics. The concept originated in black-hole thermodynamics but has important implications for electronics engineering and computer science.)

When Claude Shannon developed information theory, these concepts became more formal. And as digital systems were developed and used over analogue channels, techniques such as PSK, FSK, ASK etc. were used to send (modulate) DIGITAL data through what was originally an ANALOGUE channel (carrier) -- the same techniques originally used to send analogue data. Simultaneously, because of the limits Shannon proved on the maximum transfer rate over a noisy line (Gaussian white noise), encoding schemes were also used. And communications signals went from analogue, to binary (1 baud = 1 bps), to multi-level signals carrying more than one bit per baud.
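
The limit in question is the Shannon-Hartley law, C = B log2(1 + S/N). A minimal sketch in Python (illustrative numbers only, they're not from the thread):

```python
from math import log2

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity of a Gaussian-noise channel, in bits/s."""
    return bandwidth_hz * log2(1 + snr_linear)

# The ~3.1kHz telephone channel from earlier, at a 30dB SNR (linear SNR = 1000):
print(round(shannon_capacity_bps(3100, 1000)))  # ~30899 bits/s
```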

All these contributed to bandwidth being used to imply max data transfer rate or channel capacity in a sense. Eventually, even when digital systems were not using an analogue channel for transfer, the term bandwidth began to be applied in the digital sense, to give an indication as to how much digital data you could send on a digital channel.
 
It's quite hilarious that you managed to cut out the part of my post you quoted that points out that if I ever simplify anything you'll jump on it like the pedant you are, while saying in your own post that getting pedantic over it is pointless, when you've just typed a few hundred words being pedantic.

Also you are wrong, again. You are incapable, completely, of seeing the difference between a technical term and what someone NAMES ANYTHING THEY LIKE.

It doesn't matter what you deem to be bandwidth; what matters is what the companies who make memory mean when they use the term, as in the press release I linked to, nothing more or less.

Samsung and all other memory makers rate their memory by bandwidth. You could call them and ask them to use a more technical term if that would please you, but that doesn't change the fact that they use bandwidth to mean EXACTLY what I said.

If I had claimed that what I said was the technical definition of bandwidth, I would have been wrong. I didn't; I used the term memory makers use to rate their memory, and used the name THEY designated for it. They call it bandwidth; they could have called it blargon 48, or boobies, but they didn't, and it doesn't matter. 1,000,000 people could all refer to different things as bandwidth; it's JUST a word, and words have different meanings in different contexts and to different people.

All over the world, in every industry, people use terms that are technically incorrect, but they merely use them as a relative name for something they want to specify. Please get past it, as it's boring.

It's like your little TDP problem: it does NOT matter what the technical definition of TDP actually is, what matters is what Intel, Nvidia and AMD ACTUALLY mean by TDP, and it's utterly and obviously provable that all three companies MEAN very, very different things by the rating. Your designation of TDP doesn't matter, nor does a technical specification; it's JUST a word that companies CHOOSE to use to mean whatever they want.
 
I cut that out cos I didn't want to make a long post longer with a quote.

But you're just being ridiculous now. So you invented a term and it should matter?

"Also you are wrong, again. You are incapable, completely, of seeing the difference between a technical term and what someone NAMES ANYTHING THEY LIKE."

So you're saying you're still right because they can name it what they want? The fact of the matter is that the companies DO use technical terms. The engineers use technical terms. It doesn't matter if some "tech site" got it wrong. As all your info comes from blogs, tech sites and your rear end, I fully expect you to get things wrong thanks to your ignorance.

These are technical terms, and you have failed to grasp them and made a mistake trying to correct someone else. I schooled you; now simmer down and go home. Intel and NVIDIA use the term TDP in the same way, fully understanding what it means. It's not my fault, or theirs, if ignorants on the internet (particularly you) get it wrong. You should go do something more your speed. Like the humanities, perhaps.
 