
From the ABC and the ENIAC to the Titan supercomputer, we now have computing power we could only have dreamed of before.

The ENIAC, the first general-purpose electronic computer, was born of the need to calculate ballistics tables, including those for guns fired sideways from aircraft. That requirement led to the construction of a massive machine whose vacuum tubes pulsed at 100,000 cycles per second. Its computational speed was estimated at 1,000 multiplications per second, which would allow a single projectile trajectory to be calculated in about 100 seconds. That estimate, laid out in the memo John W. Mauchly wrote proposing the calculating machine, turned out to be three times too high, which shows the enthusiasm he had for the design of this new computer: one that would revolutionize the world and pave the way for the next designer to take the idea and actually build a working machine.

Electronic calculating machines predate the ENIAC, though. By 1937 a design for a calculating machine had been created, and in 1939 the first working prototype was constructed. That prototype could add and subtract sixteen-digit binary numbers. Not bad for 1939. The completed machine, finished in 1942, was named the ABC, for Atanasoff-Berry Computer. Its control console sat on top of the machine, with the main body, containing two hundred and ten vacuum tubes, underneath. Input and output were via punched cards, and spinning drums that stored electrical charge on their surfaces served as memory, an early form of storage in the history of computing.
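
To give a sense of what a single firing-table entry involves, here is a minimal Python sketch that numerically integrates a projectile's flight with a simple velocity-dependent drag term. The constants and the drag model are illustrative assumptions, not the actual ballistics equations behind the ENIAC's tables; a real table needed thousands of such trajectories under varying conditions.

import math

# A toy illustration of the kind of work one firing-table entry involves:
# numerically integrating a projectile trajectory with simple air drag.
# The drag coefficient and step size below are illustrative assumptions,
# not the equations or values the ENIAC actually used.
def trajectory_range(v0, angle_deg, drag_k=0.0001, dt=0.01, g=9.81):
    """Return the horizontal range (m) of a projectile fired at v0 m/s."""
    angle = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(angle), v0 * math.sin(angle)
    while y >= 0.0:
        speed = math.hypot(vx, vy)
        # Acceleration: gravity plus a drag force opposing the velocity.
        ax = -drag_k * speed * vx
        ay = -g - drag_k * speed * vy
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
    return x

# One hypothetical table entry: a shell fired at 500 m/s and 45 degrees.
print(round(trajectory_range(500, 45)), "metres")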

These rotating drums were twelve inches long and six inches in diameter, and they stored data on condensers set into their surfaces. Each drum could hold thirty fifty-bit binary numbers, each the equivalent of a fifteen-digit decimal number. Another thirty vacuum tubes helped maintain the charges in the condensers, which would otherwise drain away over time. The ABC, the lesser ancestor of the ENIAC, could only add about thirty of these fifty-bit numbers each second, while the ENIAC could add 5,000 ten-digit decimal numbers every second. That shows the leap in computing power from one machine to the next. Of course, a modern computer could perform any of the calculations the ENIAC handled in an instant, but the machine's power was revolutionary at the time. The ENIAC consisted of 17,468 vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors and around 5 million hand-soldered joints, making it a truly massive machine. But in these days of shrinking hardware, we are still building massive computers. Titan, billed as the world's first open-science supercomputer, fills a giant room yet is built from consumer-level chips: 16-core AMD Opteron CPUs paired with NVIDIA Tesla K20 GPU accelerators. With 299,008 CPU cores in total and 18,688 GPU chips to complement them, the machine has a theoretical peak performance of around 20 petaflops. This is a level of performance that the creators of the ENIAC could only have dreamed of.
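
As a quick sanity check on the figures quoted above, the short sketch below redoes the arithmetic: thirty fifty-bit numbers per drum, the rough equivalence of fifty bits to fifteen decimal digits, and Titan's core count of 18,688 nodes times sixteen cores. The one-CPU-plus-one-GPU-per-node layout is how Titan is usually described; everything else here is plain arithmetic, not a specification from any manual.

import math

# ABC drum memory: 30 numbers of 50 bits each per drum.
abc_bits_per_drum = 30 * 50
print("ABC drum capacity:", abc_bits_per_drum, "bits")

# A 50-bit binary number covers roughly 15 decimal digits.
print("Decimal digits in 50 bits:", round(50 * math.log10(2)))

# Titan: 18,688 nodes, each with a 16-core Opteron CPU and one GPU.
nodes = 18_688
print("CPU cores:", nodes * 16)   # 299,008 cores in total
print("GPU chips:", nodes)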

With 710 terabytes of system RAM, it can handle massive amounts of data, and the whole system takes up 4,352 square feet of floor space. This goes to show that consumer-level hardware can be used to build a powerful supercomputer quite easily; that is the point of massively parallel computing: using an array of inexpensive chips to perform calculations as one large cluster. The future of computing looks rather like that described in science fiction, with computers that take up whole buildings being used to solve problems. Computers spent decades becoming smaller, and now that the parts are more energy-efficient, they are being combined into massive machines that once again take up whole rooms but have enormous processing power. That is the key to constructing the supercomputers of the next decade: cheap parts working in parallel, so that clustering can perform calculations previous computers would find far too time-consuming. Predicting the weather long in advance, running materials-science simulations and testing nuclear weapons in virtual reality, with all blast effects simulated by the computer, are some of the benefits to mankind. Real nuclear tests do the environment no good at all; if they can instead be run in a computer with scalable supercomputer clustering, scientists can simulate the blast effects down to the molecular level, sparing the environment and allowing a test to be repeated easily and far more safely than a real atomic test. Soon we will be able to simulate almost anything we wish, and that will be a great help to scientists all over the world.
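
To make the idea of an array of inexpensive chips working as one cluster concrete, here is a small Python sketch of an embarrassingly parallel job, a Monte Carlo estimate of pi, split across a pool of worker processes. It is only a single-machine toy: a system like Titan spreads the chunks across thousands of nodes, typically with MPI, but the principle of dividing up independent work is the same. The worker count and sample sizes are arbitrary choices for illustration.

import random
from multiprocessing import Pool

# A toy model of massively parallel computing: split one big job into
# independent chunks and let a pool of workers crunch them at once.
def count_hits(samples):
    """Monte Carlo: count random points landing inside the unit circle."""
    rng = random.Random()
    return sum(1 for _ in range(samples)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)

if __name__ == "__main__":
    workers, samples_per_worker = 8, 250_000
    with Pool(workers) as pool:
        hits = sum(pool.map(count_hits, [samples_per_worker] * workers))
    total = workers * samples_per_worker
    print("Estimate of pi:", 4 * hits / total)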

I just wonder if we could ever make virtual reality just like real life. Then we could create video games that are indistinguishable from reality. I mean, Far Cry 2 from 2008 has a 50 square kilometre area with birds and zebras running around and realistic fire and rain. How realistic can a computer simulation get? The Titan supercomputer also runs a Linux operating system that natively supports the clustering needed to build such a massively parallel machine. Of course, you would not build such a powerful machine and then run Windows on it. That would be sacrilege; the gods of supercomputing would be angered and strike down the sysadmins where they stood.
