04 September, 2009

A 'Bit' of computing history...

"Reboot for UK's oldest computer"


As the title suggests, Britain's oldest computer --the Harwell-- is being sent to the National Museum of Computing at Bletchley Park, where it will be restored to working condition.

Originally, the Harwell was built by three people at the Atomic Energy Research Establishment to crunch numbers for atomic research.

BBC News interviewed Dick Barnes (one of the three who built the system), who said the research was --officially at least-- for civilian nuclear power projects, though it had connections to the nuclear weapons programme.

Though not the first computer built in the UK, it had one of the longest service lives, being in active use from 1951 until 1973. The later years were spent as a teaching system dubbed "WITCH" (Wolverhampton Instrument for Teaching Computation from Harwell). After that it went on display at the Birmingham Science Museum, before being put into storage at Birmingham City Council Museums' Collection Centre.

The Harwell had a number of predecessors: the ACE, or "Automatic Computing Engine" (parts of which are on display in London's Science Museum); the Electronic Delay Storage Automatic Calculator (EDSAC), which was broken up; and Manchester's Small-Scale Experimental Machine (SSEM), nicknamed "Baby", which has been rebuilt, though not from original parts.

What distinguished the Harwell from similar computer systems at the time was its ability to hold data in an internal store. Unlike its predecessor, the Colossus, the Harwell was relay-based, with around 900 Dekatron gas-filled tubes, each holding a single decimal digit in memory --a role loosely comparable to modern RAM. Roughly 900 digits (on the order of a few hundred bytes) doesn't even come close to the memory sizes we use today, which is an indicator of how far IT has developed since then.
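
As a quick back-of-the-envelope check (my own arithmetic, not a figure from the article), here is roughly what 900 single-digit Dekatron stores work out to in modern units:

    import math

    # Each Dekatron tube holds one decimal digit (10 possible states),
    # which carries log2(10) ~= 3.32 bits of information.
    BITS_PER_DIGIT = math.log2(10)

    dekatrons = 900                      # single-digit stores in the Harwell
    total_bits = dekatrons * BITS_PER_DIGIT
    total_bytes = total_bits / 8

    print(f"{dekatrons} digits ~= {total_bits:.0f} bits ~= {total_bytes:.0f} bytes")
    # -> 900 digits ~= 2990 bits ~= 374 bytes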

The Harwell sported a single memory array used as temporary storage, while relying on paper tape for both input and program storage. Machines like this are now considered 1st generation (vacuum-tube machines, roughly 1940-1956), from a time well before transistors and ICs (Integrated Circuits) became mainstream. Transistor-based computers are considered 2nd generation (1956-1963) and IC-based computers 3rd generation (1964-1971).

We are currently using 4th generation computers (microprocessors, 1971-present). IBM introduced its Personal Computer for the average joe in 1981, and Apple followed in 1984 with the Macintosh.

While modern personal computers generally come with at least 2 GigaBytes of RAM as standard [2^30 bytes x 2], the Harwell sported a grand total of a few hundred bytes (roughly 0.0000004 GigaBytes), which was still a respectable store in the vacuum-tube era. I actually remember a time in the 80s when 30 MegaByte harddrives were top-of-the-line :P
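
To put that in perspective, here is the same comparison worked out in code; the 374-byte figure is just my estimate from the 900 digits above:

    harwell_bytes = 374                  # estimated working store of the Harwell
    modern_ram_bytes = 2 * 2**30         # 2 GigaBytes (2 x 2^30 bytes)

    ratio = modern_ram_bytes / harwell_bytes
    print(f"2 GB of RAM holds about {ratio:,.0f} times as much as the Harwell's store")
    # -> about 5,742,203 times as much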
A little sidenote on 'Bits' and 'Bytes':

In computer storage, sizes are reported using unit prefixes: Kilo, Mega, Giga and so on.

And just recently, Apple's Mac OS X switched to decimal (base-10) units to better represent the storage capacity as printed on the drive itself. A lot of Linux distributions did the same thing (including my distro of choice, Slackware).

But in computer storage, there are two main ways of interpreting those prefixes.

Storage calculation is traditionally based on the binary system (powers of two): 1 GigaByte = 1024 MegaBytes, or 1,073,741,824 (2^30) Bytes to be exact.

While transfer calculation is based on the decimal (metric/SI) system (powers of ten): 1 GigaBit = 1000 MegaBits.
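
A small sketch of the difference between the two conventions (the helper names are my own, purely for illustration):

    def decimal_gigabytes(num_bytes):
        """Decimal (SI) GigaBytes: 1 GB = 1,000,000,000 bytes."""
        return num_bytes / 1000**3

    def binary_gigabytes(num_bytes):
        """Binary GigaBytes: 1 GB = 1,073,741,824 (2^30) bytes."""
        return num_bytes / 1024**3

    size = 2**30  # 1,073,741,824 bytes
    print(f"{size} bytes = {binary_gigabytes(size):.2f} binary GB "
          f"= {decimal_gigabytes(size):.3f} decimal GB")
    # -> 1073741824 bytes = 1.00 binary GB = 1.074 decimal GB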

This is also one of the reasons why the capacity printed on a drive differs from what your operating system reports. Have you bought a 1 TeraByte harddisk drive lately, only to realize after formatting it that it does not hold as much as specified?

Basically, OEMs use decimal (transfer-style) calculation for storage amounts, and not binary (storage-style) calculation as they arguably should. And few of them spell this out on their packaging or in their manuals, because it's not required by law.
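
Here is that mismatch in numbers, assuming a drive advertised as exactly 1 TeraByte (10^12 bytes as the OEM counts it):

    advertised_bytes = 10**12            # "1 TeraByte" as the OEM counts it
    binary_gb = advertised_bytes / 1024**3

    print(f"1 TB (decimal) = {binary_gb:.1f} GB (binary)")
    # -> 1 TB (decimal) = 931.3 GB (binary)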

OEMs, when confronted about this issue, explain it this way:
"...if we WERE obligated to print this information on our products, it wouldn't be economical to manufacture drives at all...", which I believe to be utter FUD.

Another reason why harddrives don't always provide the specified amount is that a filesystem requires a certain amount of disk space to store its metadata, and this reserved space (usually around 5-10%) is subtracted from the capacity available to the user.
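
Putting both effects together, a rough and purely illustrative estimate of what you actually get to use might look like this; the 7% overhead is just an assumed value in the middle of that 5-10% range:

    advertised_bytes = 10**12            # a "1 TB" drive, counted in decimal
    fs_overhead = 0.07                   # assumed metadata/reserved space (5-10% range)

    usable_bytes = advertised_bytes * (1 - fs_overhead)
    usable_binary_gb = usable_bytes / 1024**3

    print(f"Usable space: ~{usable_binary_gb:.0f} binary GB of the advertised 1 TB")
    # -> Usable space: ~866 binary GB of the advertised 1 TB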
So, in conclusion, this is a perfect example of how mind-bogglingly fast computer development has progressed over the last decades.

Links:
http://news.bbc.co.uk/ - article about the rebuild.
http://news.bbc.co.uk/ - video about the rebuild.
