NASA Supercomputer Columbia

NASA just announced their newest supercomputer, built in collaboration with SGI and Intel – a Linux cluster.

It’s fast. 42.7 teraflops, measured on 16 of its nodes. If you want to know what that means, “flops” stands for “floating point operations per second” – floating point being a way of encoding numbers with fractional precision. Running at 42.7 teraflops means this little baby can do

42,700,000,000,000 floating point operations in just 1 second. (over 42 trillion)
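Just to spell out the arithmetic, here’s a quick Python sketch. The desktop comparison number is my own ballpark assumption for a 2004-era machine, not something from NASA’s announcement:

```python
# Back-of-the-envelope: what 42.7 teraflops means in raw operations.
# The desktop figure below is an assumption for illustration only,
# not a number from NASA's announcement.

COLUMBIA_FLOPS = 42.7e12   # 42.7 teraflops = 42.7 trillion operations per second
DESKTOP_FLOPS = 6e9        # assume a desktop manages roughly 6 gigaflops

print(f"Operations per second: {COLUMBIA_FLOPS:,.0f}")   # 42,700,000,000,000

# How long would that assumed desktop need to match one second of Columbia's work?
seconds = COLUMBIA_FLOPS / DESKTOP_FLOPS
print(f"A {DESKTOP_FLOPS / 1e9:.0f} GFLOPS desktop would need ~{seconds / 3600:.1f} hours")
```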

And to think, I’m thinking about getting a faster computer so I can play Doom 3…

The system is a combination of 20 nodes, each with 512 processors. To make it truly beautiful – this monstrosity of interconnections manifests itself as one single Linux image.
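For scale, here’s what those counts work out to, using only the figures mentioned in this post (the per-processor rate is a rough derived estimate, not an official spec):

```python
# Totals implied by the figures in this post; the per-processor number is a
# rough derived estimate, not an official specification.

NODES_TOTAL = 20          # nodes in the full system
NODES_BENCHMARKED = 16    # nodes used for the 42.7 teraflop run
CPUS_PER_NODE = 512
BENCHMARK_FLOPS = 42.7e12

total_cpus = NODES_TOTAL * CPUS_PER_NODE
benchmark_cpus = NODES_BENCHMARKED * CPUS_PER_NODE
per_cpu_gflops = BENCHMARK_FLOPS / benchmark_cpus / 1e9

print(f"Total processors:        {total_cpus:,}")                    # 10,240
print(f"Processors in benchmark: {benchmark_cpus:,}")                # 8,192
print(f"Rough per-CPU rate:      {per_cpu_gflops:.1f} gigaflops")    # ~5.2
```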

It outperforms Japan’s Earth Simulator, which held the record at 37 teraflops.

IBM is currently working on a system called Blue Gene, which should be installed at Lawrence Livermore National Laboratory sometime in 2005 – it is supposed to achieve 360 teraflops.

You can check out what it looks like at the NASA site – there are some very nice photos.

Also, read their press release if you like.

SGI also has some images available.

  • zechariah

    Could you tell me more about the supercomputer?

  • Zechariah

    Will you please tell me about the supercomputers that you use and what you use them for?

  • Well, I don’t use any supercomputer. Today’s supercomputers are tomorrow’s home computers, too.

    But I do know that usually supercomputers are used to model complex simulations. For example, advances in nuclear weaponry are usually modeled by computer simulation these days instead of with actual detonations. Weather is also a highly complex thing to model, and supercomputers are often used for it.

    They are also used to test various theories in physics, and oftentimes controversially so. Modeling cannot replace actual experimentation.

    I’m sure the US Government is using supercomputers for all its lovely spying and data mining, too.

    Lately, supercomputers are often built as clusters of slower computers, like PCs running Linux, all hooked together to produce a much more powerful machine in aggregate (there’s a toy sketch of that idea below). This is a relatively new development; supercomputers used to be built as very expensive, monolithic systems.
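If you want a feel for the “aggregate” part, here’s a toy sketch of splitting one job across several workers and adding up the pieces. It uses Python processes on a single machine, so treat it as an illustration of the cluster idea, not how a real system like Columbia is actually programmed:

```python
# Toy illustration of the cluster idea: split one big job across many slower
# workers and combine the results. A real cluster does this across machines
# (e.g. with MPI); here we fake it with processes on a single box.
from multiprocessing import Pool

def partial_sum(bounds):
    """One 'node' sums the squares in its own slice of the work."""
    start, stop = bounds
    return sum(i * i for i in range(start, stop))

def cluster_sum(n, workers=8):
    """Split range(n) into chunks, farm them out, and aggregate the answers."""
    chunk = n // workers
    slices = [(w * chunk, (w + 1) * chunk if w < workers - 1 else n)
              for w in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, slices))

if __name__ == "__main__":
    n = 10_000_000
    print(cluster_sum(n))                 # same answer as the serial version...
    print(sum(i * i for i in range(n)))   # ...just computed in parallel pieces
```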