Ivy Bridge to use 3D 22nm transistors

That's why AMD doesn't have to compete on the CPU (they still will) to "win": in the very things Intel themselves are talking up their CPUs for, the GPU is massively involved, and an AMD CPU + GPU will beat anything Intel's got in most of those areas. AMD are doing great no matter the process, as frankly the game is moving from CPU performance to overall performance in what 99% of people do day in, day out.

Have to disagree there. The vast majority of computers in the world are either business desktops/laptops or home desktops/laptops, and these don't need decent graphics built in. Only a very tiny proportion have discrete graphics cards, i.e. gaming machines and high-end workstations.
 
Have to disagree there. The vast majority of computers in the world are either business desktops/laptops or home desktops/laptops, and these don't need decent graphics built in. Only a very tiny proportion have discrete graphics cards, i.e. gaming machines and high-end workstations.

The vast majority of these computers don't actually need anything more powerful than an old dual core, because modern IGPs help offload video and Flash decoding, which is the most intensive thing most people run.

However, the poor abilities of the older Intel IGPs are the main reason netbooks failed. The IGPs Intel bundled with most of them were simply not up to the task of decoding video and Flash well, and put too much load on the CPU.

In fact, look at the success of the iPad and iPad 2, which have very weak processors. However, they not only have a decent built-in decoder, but the graphics ability of such devices is also rapidly increasing.

Casual gaming on mobile devices is also becoming more and more of a money maker for companies, and graphics ability is getting more important as a result.

Intel is also investing a lot of money into improving its IGPs, and into GPGPU too. Quick Sync is an example of this.
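
Quick Sync is exposed through common transcoding tools; below is a minimal sketch of driving it from Python, assuming an ffmpeg build with the QSV encoder on the PATH (the helper name and file names are made up for illustration):

[code]
# Minimal sketch: H.264 encode on the iGPU via Intel Quick Sync.
# Assumes an ffmpeg build with QSV support is installed and on the PATH.
import subprocess

def qsv_encode(src: str, dst: str, bitrate: str = "5M") -> None:
    """Transcode src to H.264 using the iGPU's fixed-function encoder."""
    subprocess.run(
        [
            "ffmpeg",
            "-i", src,           # input file
            "-c:v", "h264_qsv",  # Quick Sync H.264 encoder (runs on the iGPU)
            "-b:v", bitrate,     # target video bitrate
            dst,
        ],
        check=True,              # raise if ffmpeg exits with an error
    )

qsv_encode("holiday.mkv", "holiday.mp4")
[/code]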
 
Have to disagree there. The vast majority of computers in the world are either business desktops/laptops or home desktops/laptops, and these don't need decent graphics built in. Only a very tiny proportion have discrete graphics cards, i.e. gaming machines and high-end workstations.

Where did I say discrete graphics? Business will benefit MORE from integrated graphics than home users will.

As I said, most home users need half, if that, of the power they buy. People want high-def videos and crap games to run well, and adverts tell them product X or Y will give them an unmatched hi-def experience, so they'll upgrade eventually even though in a lot of situations there won't be any real gain. Heck, half the time the performance improvement people feel (and they generally do smeg all with their computers) is just from not running a three-year-old install of Windows.

Integrated GPUs are becoming VERY good, and GPU-accelerated software, which until recently was always done exclusively on the CPU, is taking off in a pretty big way.
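
To make that concrete, here's a minimal GPGPU sketch using OpenCL via the pyopencl package (the package choice and the toy kernel are my own illustration, not anything from the thread), offloading an array operation that would traditionally run on the CPU:

[code]
# Vector add on whatever OpenCL device is available (IGP or discrete GPU).
import numpy as np
import pyopencl as cl

a = np.random.rand(1_000_000).astype(np.float32)
b = np.random.rand(1_000_000).astype(np.float32)

ctx = cl.create_some_context()   # pick an available OpenCL device
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags

a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

prg = cl.Program(ctx, """
__kernel void add(__global const float *a,
                  __global const float *b,
                  __global float *out) {
    int i = get_global_id(0);
    out[i] = a[i] + b[i];
}
""").build()

prg.add(queue, a.shape, None, a_buf, b_buf, out_buf)  # run on the device
out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)                  # copy result back
assert np.allclose(out, a + b)
[/code]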

Business users benefit too: presentations in high def, running various Adobe products, video encoding (Intel's Quick Sync offers magnitudes faster performance, and plenty of businesses could benefit).


As for FinFET, that is essentially what Intel have done; they've just called it 3D rather than FinFET. AMD apparently made FinFET chips a couple of years back (new process tech is often tested literally years before retail products ship, and new processes usually start out as SRAM chips, as those are very basic in general).

I guess Intel are just playing on the whole "3D" craze going around at the moment, though. You can see from the picture that most of the transistor hasn't gone 3D, and the transistors aren't being stacked, so calling it 3D is mostly BS.
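
For what it's worth, the fin does buy real gate area even if nothing is stacked: the gate wraps three sides of the channel, so effective width per fin is roughly 2 x fin height + fin width. A back-of-envelope sketch with made-up fin dimensions (not Intel's published figures):

[code]
# Rough comparison of planar vs tri-gate effective gate width.
# Dimensions are illustrative assumptions, not Intel's numbers.
fin_height_nm = 34.0
fin_width_nm = 8.0

planar_width_nm = fin_width_nm                       # gate controls top only
trigate_width_nm = 2 * fin_height_nm + fin_width_nm  # top + two sidewalls

print(f"planar effective width:   {planar_width_nm:.0f} nm")
print(f"tri-gate effective width: {trigate_width_nm:.0f} nm")
print(f"drive-current headroom:   ~{trigate_width_nm / planar_width_nm:.1f}x")
[/code]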


Anyway, back to the whole APU thing: in general, being seen as fast in one area is just a good thing, full stop; it doesn't matter if it's desktop/laptop and non-business. AMD have a shedload of market share to gain if they have competitive products in every area, as they are miles behind everywhere.

You're forgetting the main point, though: Intel themselves are pushing towards APUs, marketing APUs and showing what overall system performance their APUs can provide. Also don't forget that the single biggest performance increase from their old chips to Sandy Bridge was Quick Sync, which is done on the GPU and is strictly an APU feature. So that's Intel, who put billions into R&D, putting massive effort into their APU.

Intel are going APU in every single market, as are AMD, in which case, yeah, I'm pretty sure APU performance will be the most important thing.
 
I read somewhere that Nvidia are actually losing a lot of money and sales to Intel's new integrated graphics, and with AMD producing both CPUs and GPUs, it's left Nvidia to lick its wounds. These new 22nm chips have even better graphics in them, don't they? And am I right in thinking that they are also going to be socket 1155, with current boards getting a BIOS flash upgrade to run them? I hope so, as I only just upgraded to 1155; if they bring out an entirely new socket you have to spend more on, it will be a slap in the face! >.<
 
That's an interesting article, but I have to say it's a bit of a shameless marketing gimmick by Intel.

As much as I respect the engineers at Intel, these naming conventions are the kind you'd expect to be thrown out by the marketing department, and are meant to dazzle the non-technical public by adding in a bit of confusion. Multi-gate field-effect transistors, and indeed multi-source/drain devices and so on (GAAFET, MIGFET, MuGFET, etc.), have been around for a while in microelectronics.

true "3-dimensional" transistors, where u have complex nonlinear interconnects in all directions are unlikely to become feasible until we move deeper into nanoelectronics and quantum tech, and switch from silicon to carbon, probably.
 
Xsistor, ever considered running Folding@home on your rig when it's not in use? A couple of the folding guys have SR-2s; you could do a lot of research on that bad boy, and you'd be a welcome addition to the OcUK team. Check out the link in my sig if you are interested :)
 
That's an interesting article, but I have to say it's a bit of a shameless marketing gimmick by Intel.

As much as I respect the engineers at Intel, these naming conventions are the kind you'd expect to be thrown out by the marketing department, and are meant to dazzle the non-technical public by adding in a bit of confusion. Multi-gate field-effect transistors, and indeed multi-source/drain devices and so on (GAAFET, MIGFET, MuGFET, etc.), have been around for a while in microelectronics.

true "3-dimensional" transistors, where u have complex nonlinear interconnects in all directions are unlikely to become feasible until we move deeper into nanoelectronics and quantum tech, and switch from silicon to carbon, probably.

AMD is releasing Bulldozer soon, so Intel are of course trying to dampen any interest in it.
 
I read somewhere that Nvidia are actually losing a lot of money and sales to Intel's new integrated graphics, and with AMD producing both CPUs and GPUs, it's left Nvidia to lick its wounds. These new 22nm chips have even better graphics in them, don't they? And am I right in thinking that they are also going to be socket 1155, with current boards getting a BIOS flash upgrade to run them? I hope so, as I only just upgraded to 1155; if they bring out an entirely new socket you have to spend more on, it will be a slap in the face! >.<

As far as I know, Ivy is gonna be a whole new socket. I'm still on S775 myself, so I'm waiting until Ivy is released before upgrading...
 
As far as I know, Ivy is gonna be a whole new socket. I'm still on S775 myself, so I'm waiting until Ivy is released before upgrading...

It is on the current 1155 socket, but it is unknown whether there will be any drawbacks to not using it on a 7-series chipset.

I also like that the H77 allows CPU overclocking.
 
My digital circuit design lecturer at uni was recently talking about FinFETs and other such advances in MOS technology. Very interesting news, and good to see Intel really advancing the industry with this.
 
Looks like pretty cool technology, although it seems most of the improvements will be in low-power devices. It would put Intel in a much better position against ARM-based products.

The following image is from the Anandtech article on the new transistors:

http://www.anandtech.com/show/4313/...nm-3d-trigate-transistors-shipping-in-2h-2011

[power.jpg: chart comparing the 22nm tri-gate transistors against 32nm planar transistors across operating voltages]


At higher voltages, it seems the improvement over a 32nm planar transistor starts to diminish. AFAIK, all the current Sandy Bridge processors have a VID over 1V. The planar transistors Intel use ATM are produced on a 32nm bulk process.
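
That voltage dependence matters because dynamic switching power scales roughly as C x V^2 x f, so gains at low voltage pay off quadratically in power. A toy illustration with made-up numbers:

[code]
# Dynamic power ~ C * V^2 * f. All values below are assumptions.
def dynamic_power_w(cap_f: float, volts: float, freq_hz: float) -> float:
    return cap_f * volts**2 * freq_hz

C = 1e-9   # effective switched capacitance in farads (assumed)
f = 3.0e9  # 3 GHz clock

for v in (1.1, 1.0, 0.9, 0.8):
    print(f"Vdd={v:.1f} V -> {dynamic_power_w(C, v, f):.2f} W")
[/code]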

It would be interesting to see the advantage when compared to planar transistors produced on a 32nm SOI process, which AFAIK is what AMD is using.

Anyway, hopefully the H67 chipset supports Ivy Bridge, as it would mean a better upgrade path for my computer.
 
My digital circuit design lecturer at uni was recently talking about FinFETs and other such advances in MOS technology. Very interesting news, and good to see Intel really advancing the industry with this.

Yeah, and you don't get people running around calling FinFETs "3D transistors" in the way Intel is calling their tri-gates. It's basically one more gate channel.
http://www.nd.edu/~gsnider/EE666/666_05/QZhang_FinFET.ppt
 