Optical computing and its application within GPUs

First of all, here's an article on optical computing/photonics:

http://www.extremetech.com/extreme/...-speed-of-light-optical-computer-on-your-desk

Optalysys, a UK technology company, says it’s on-target to demonstrate a novel optical computer, which performs calculations at the speed of light, in January 2015. If all goes to plan, Optalysys says its tech — which is really unlike anything you’ve ever heard of before — can put an exascale supercomputer on your desk by 2020.

When we talk about optical computing, we’re actually referring to a fairly large number of different and competing technologies. At its most basic, optical computing refers to computing that uses light instead of electricity. When we’ve previously written about optical computing, we were usually referring to chips and computers that have replaced their internal wiring with optical waveguides, and some kind of optical transistor that is controlled by photons instead of electrons. There are also optoelectronic devices, which use a mix of the two (usually optical interconnects and electronic transistors).


It goes something like this. You start with a low-power laser. This laser is then directed through a massive liquid crystal grid, which works in much the same way as a liquid crystal display: by applying electricity to each “pixel,” the laser light passing through it is modulated. Complex calculations would turn hundreds or thousands of these pixels on or off. After the laser has passed through this grid, the beam is picked up by a receiver. By analyzing the beam’s diffraction pattern using Fourier optics, operations such as matrix multiplication and Fourier transforms can be carried out, which is how the device performs complex maths. You can also have multiple pixel grids in sequence or in parallel, significantly boosting the complexity and parallelism of the optical computer. There’s a little more technical info on the Optalysys website, but not much.
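
To make the Fourier optics bit a little more concrete, here's a rough software sketch of the underlying physics: in the far field, the diffraction pattern of a coherent beam passing through a pixel grid is (to a good approximation) the 2D Fourier transform of that grid, so the optics effectively do the transform in a single pass. The grid size and pixel values below are made up for illustration; this is not Optalysys's actual design, just the general idea.

Code:
import numpy as np

# Toy model: the liquid-crystal grid modulates the beam, and free-space
# propagation / a lens produces the far-field pattern, which is roughly the
# 2D Fourier transform of the grid. Grid size and values are arbitrary.
N = 512
grid = np.random.rand(N, N)           # each "pixel" modulating the beam

far_field = np.fft.fftshift(np.fft.fft2(grid))   # the transform the optics do in one pass
intensity = np.abs(far_field) ** 2               # what the receiver/camera actually measures

print(intensity.shape, intensity.max())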

Moving away from the technical nitty-gritty, Optalysys’s optical computer is exciting for two main reasons: It consumes very little power, and there’s essentially no limit on how parallel you can make it. There’s no direct analogy to transistor-based logic, but you could almost think of every liquid-crystal pixel as a tiny processing core (or at least a tiny transistor). In a normal computer chip, while there is some parallelism, most things happen very sequentially, with each core (and each transistor) working mostly in serial. In an Optalysys optical computer, the laser beam hits every single pixel at the same time; it essentially performs hundreds or thousands (or millions?) of small computations in parallel, at the speed of light.
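
A crude software analogy for the "every pixel at once" point (nothing to do with the real optics, just to show the difference between visiting pixels one at a time and operating on the whole grid in one go):

Code:
import time
import numpy as np

N = 1024
grid = np.random.rand(N, N)
mask = np.random.rand(N, N)

# "Electronic" style: a core visits each pixel in turn.
t0 = time.perf_counter()
out_serial = np.empty_like(grid)
for i in range(N):
    for j in range(N):
        out_serial[i, j] = grid[i, j] * mask[i, j]
serial_time = time.perf_counter() - t0

# "Optical" style: one operation applied to every pixel at the same time.
t0 = time.perf_counter()
out_parallel = grid * mask
parallel_time = time.perf_counter() - t0

print(f"per-pixel loop: {serial_time:.3f}s, whole grid at once: {parallel_time:.5f}s")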

Optalysys thinks its Optical Solver could scale up to 17.1 exaflops by 2020. This seems like a very bold statement for an entirely novel and untested method of computing — but given how conventional computing has mostly stagnated by this point, I hope the folks at Optalysys can follow through.

Here's a paper on GPUs with chip stacking utilising optical interconnects:

http://plaza.ufl.edu/nil/root/Research/GPU-Icnt/files/PACT12-camera-ready-To-Press.pdf

3D stacking technology provides low latency and high bandwidth cross-layer communication in a compact form. With significant bandwidth demand in GPU, it is anticipated that power consumption will reach a point at which electrical interconnect and memory subsystem design will become infeasible. On the contrary, optically connected GPU shader cores and memory interface in 3D-stacked multi-layer chip seems to be an attractive alternative.

I would quote more bits but PDFs have weird word wrapping, which makes it a chore to re-arrange. However, it seems to suggest that with an optical interconnect between the GPU and RAM, stuttering from heavy swapping may be a thing of the past. It is a very interesting peek into what may be coming in the next few years, and it's good to know the problems we have now are being worked on and have answers! :cool:
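
To put some very rough numbers on the swapping point (the bandwidth figures below are my own assumptions for illustration, not taken from the paper), here's a back-of-envelope look at how long a mid-frame swap stalls things for different link speeds:

Code:
# Back-of-envelope only: assumed link bandwidths, not figures from the paper.
swap_bytes = 512 * 1024**2   # say 512 MB of textures paged in mid-frame

links = {
    "PCIe 3.0 x16 (~16 GB/s)": 16e9,
    "on-card GDDR5 bus (~200 GB/s)": 200e9,
    "hypothetical stacked optical link (~1 TB/s)": 1e12,
}

for name, bandwidth in links.items():
    stall_ms = swap_bytes / bandwidth * 1000
    print(f"{name}: ~{stall_ms:.2f} ms stall")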
 