baudrunner's space: Qubits and the list of 500
"Philosophy to Science - Quark to Cosmos. Musings on the Fundamental Nature of reality"


Monday, January 21, 2008

Qubits and the list of 500

FLOPS - floating point operations per second
teraFLOPS - trillions of floating point operations per second
petaFLOPS - quadrillions of floating point operations per second
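To get a rough sense of these scales, here is a quick bit of arithmetic (illustrative only, not a benchmark) relating the performance figures discussed in this post:

```python
# Quick scale check for the performance figures quoted in this post
# (illustrative arithmetic only, not a benchmark).

TERA = 10**12   # teraFLOPS: trillions of operations per second
PETA = 10**15   # petaFLOPS: quadrillions of operations per second

blue_gene_l = 280.6 * TERA      # Blue Gene/L's sustained performance
riken_target = 10 * PETA        # the planned 2011 Japanese machine

# How many Blue Gene/L-class machines equal the planned 10 petaFLOPS target?
ratio = riken_target / blue_gene_l
print(f"{ratio:.1f}x Blue Gene/L")   # roughly 35.6x
```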

The 29th TOP500 list made the news again recently, and if you're following the supercomputing trend this is a notable edition, showing the largest turnover among list entries in the history of the TOP500 project. Once again IBM took the top spot with Blue Gene/L, a speed demon with a peak performance of 367 teraFLOPS and a sustained computational performance of 280.6 teraFLOPS. Cray, in the business longer than anyone else, took the next two spots with the Cray XT4/XT3 at 101.7 teraFLOPS and the Cray Red Storm system at 101.4 teraFLOPS.

It will take a radical change in architectural approach to displace Blue Gene/L. IBM is represented in the top 50 with 46% of the systems and 49% of the performance. Among its specialized tasks, Blue Gene/L also serves as "a computational science research machine for evaluating advanced computer architectures". It is designed to scale to virtually any size with no degradation in performance. Blue Gene/L is not quite ready for the desktop, though: it currently has a footprint of about 2,500 square feet and draws about 1.5 megawatts of power. I can wait.

Japan's NEC Corporation held the number one spot for a long time with its Earth Simulator. Now, in a project expected to cost up to one and a half billion dollars, the Japanese Ministry of Science and Technology has commissioned the Riken Research Institute to build the world's fastest supercomputer by 2011, in an effort to supplant IBM's dominance in the competitive world of supercomputing.

Supercomputing is about the whole package, not just the fastest central processing unit. In 2002, IBM's cutting-edge research teams demonstrated a nanotech-based data storage density of a trillion bits per square inch, enough to store 25 million printed textbook pages on a surface the size of a postage stamp. Constraints on data transfer rates, sustained throughput, and a system's ability to delegate operations on data keep that research out of products for now; the current market offering is IBM's System Storage DCS9550, a storage solution for the high-performance computing (HPC) user.

But the leaders of the pack are all digital computers. Answers to problems on digital computers are essentially mathematical solutions, says Alexey Andreev, a venture capitalist invested in D-Wave Systems, a Canadian start-up spun off from the University of British Columbia's Computing Lab. D-Wave promises a commercially viable quantum computer by 2008. According to Andreev, answers from programs run on a quantum computer come instead in the form of a physical simulation.

Instead of occupying one of the two states of a digital computer's bit - either a 1 or a 0 - the quantum bit, or qubit, can exist in a superposition of both states at once, a condition that is neither purely 1 nor purely 0. D-Wave founder and Chief Technology Officer Geordie Rose says of the quantum computer, "We view these machines as probability distribution generators. We want to build an actual physical embodiment of a hard math problem." He envisions the quantum computer as a co-processor to existing digital computers, capable of quickly solving problems relating to the natural world that digital computers cannot readily solve on their own, or could solve only over impractically long timescales.
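Rose's "probability distribution generator" description can be illustrated with a toy single-qubit simulation in Python - a sketch of the textbook measurement rule, not a model of D-Wave's actual hardware:

```python
import math
import random

# A qubit state is a pair of amplitudes (a, b) with |a|^2 + |b|^2 = 1,
# written a|0> + b|1>. Measuring it yields 0 with probability |a|^2
# and 1 with probability |b|^2 - the machine behaves as a sampler.

def measure(a, b):
    """Sample one measurement outcome from the qubit state a|0> + b|1>."""
    return 0 if random.random() < abs(a) ** 2 else 1

# Equal superposition: a = b = 1/sqrt(2), so 0 and 1 are equally likely.
a = b = 1 / math.sqrt(2)
samples = [measure(a, b) for _ in range(10_000)]
print(sum(samples) / len(samples))  # close to 0.5
```

Each measurement collapses the superposition to an ordinary bit; only the distribution over many runs reveals the underlying state.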

To that end, the company has constructed Orion, the world's first commercial quantum computer. Orion simulates quantum mechanical behaviour while itself exploiting quantum mechanical principles, which gives D-Wave every right to call it a quantum computer. Though still a few short years from maturity, the potential capabilities of a massively scaled-up Orion are enormous, perhaps leading to real-time simulation of protein reactions involving hundreds of thousands of atoms. By comparison, the Riken Research Institute's planned 10 petaFLOPS machine would need to work for 24 hours to complete the calculations required for a 20 nanosecond simulation involving a million atoms. A demonstrable Orion already exists. The 10 petaFLOPS computer does not.
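That Riken figure implies a staggering operation count. A back-of-envelope check (assuming the machine sustains its 10 petaFLOPS peak for the full 24 hours, which real workloads never achieve):

```python
# Back-of-envelope for the quoted Riken figure: a 10 petaFLOPS machine
# running for 24 hours to cover a 20 ns simulation of a million atoms.
# (Assumes sustained peak throughput - real workloads run well below peak.)

PETA = 10**15
flops = 10 * PETA                    # planned peak: 10 petaFLOPS
seconds = 24 * 3600                  # the quoted 24-hour run
total_ops = flops * seconds          # total floating point operations

atoms = 1_000_000
sim_ns = 20
ops_per_atom_ns = total_ops / (atoms * sim_ns)  # work per atom per simulated ns

print(f"{total_ops:.2e} ops total, {ops_per_atom_ns:.2e} per atom per ns")
```

Even at peak, that is on the order of 10^20 operations for 20 simulated nanoseconds - which is why a machine that embodies the physics directly is such an attractive prospect.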

There is hope yet for a desktop supercomputer of sorts. Uzi Vishkin, a pioneer of parallel computing, has created the generic operating-system algorithms to complement a parallel 64-microprocessor array on a circuit board the size of a shoebox, which will no doubt one day soon find its way into a home computer. Igor, are you listening?
