Computer History 7/17/19

By Richard Bleil

There are advantages to getting older. Some of my historical knowledge is kind of fun to play with, especially around “experts”. In a previous post, I discussed the difference between the web and the internet. I kind of enjoy asking computer scientists what the difference is, and invariably the younger ones will insist that they are the same thing. I’ll be honest; I’m not sure if I enjoy doing this to remind the “experts” that there is always more to learn, or if it’s just hubris. Either way, it’s fun to see them struggle with what seems to be such a simple question.

The short answer, by the way, is that the “internet” is the physical wiring and computer hubs that connect computers world-wide, while the “web” is the graphical interface software that allows users to navigate the internet easily. The earliest internet had only a few major functions: FTP allowed for sharing files from one location to another (like downloading something from the web today), telnet allowed a user to log on to another computer as if they were physically connected to it, and communication tools like email and chat allowed for rapid communication. The internet was originally designed for government, military, and academic work. And, no, it wasn’t invented by Al Gore, although he had more of an influence on it than many people realize. In the 1980s, as a senator from Tennessee, he promoted legislation in support of the ARPANET, which allowed for greater public access to the internet.

Today, I had MS Word download an update. In the description, all it said was, “Bug free.” Riiiiiiiiiiiight.

Well, we’ll see, but the word “bug” kind of got me thinking. A lot of people understand the vernacular and its meaning, but I’m not sure how many people know its genesis.

Most people remember seeing pictures of the computers of the fifties. These were only owned by large companies, as they would often occupy an entire warehouse for maybe 8 kB of memory. These computers used vacuum tubes rather than transistors. The tubes basically acted like switches that flipped “on” when current was present and “off” when it wasn’t. Here, “on” and “off” were the equivalent of the modern 1 and 0, but the tubes took a lot of power and generated a lot of heat. In these large warehouses, insects (especially cockroaches) were hard to keep out, and they were attracted to the heat of the electronics. The cockroaches would crawl into the large computer cabinets, and if they stepped wrong, they would short out the wrong wires and cause the entire computer system to crash. In those days, the computer scientists would have to search for the short, quite literally “de-bugging” the system.

Terminology is a lot of fun. Computer language is binary, meaning the “gates” (or transistors) have only two possible values, 0 or 1, or “off” and “on” if you prefer. Each value of “0” or “1” is referred to as a “bit”. The ASCII (American Standard Code for Information Interchange) code defines a letter, number, or symbol for each combination of eight bits. Eight bits are known as a “byte”. What is less well known is that the tongue-in-cheek term “nibble” refers to four bits. So four bits is a nibble, and two nibbles is a byte.
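As a quick sketch (my own illustration in Python, not something from the original post), here is how those terms line up for a single character:

```python
# A minimal sketch of bits, nibbles, bytes, and ASCII using Python built-ins.
letter = "A"

code = ord(letter)           # ASCII code for "A" is 65
bits = format(code, "08b")   # the same value written as eight bits (one byte)

high_nibble = bits[:4]       # first four bits: one nibble
low_nibble = bits[4:]        # last four bits: the other nibble

print(code)                      # 65
print(bits)                      # 01000001
print(high_nibble, low_nibble)   # 0100 0001
```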

Now, since each bit is binary (only two possible values), four bits give 16 possible combinations (2 × 2 × 2 × 2, or 2 to the fourth power). A byte gives 256 combinations, but 16 (one nibble) is a convenient number to work with in tables, so ASCII tables often have 16 rows and 16 columns.
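And again as a small illustrative sketch (the row-and-column lookup here is just my assumption about how such a 16-by-16 table would be laid out):

```python
# Counting combinations, and locating a character in a 16 x 16 ASCII table.
print(2 ** 4)    # 16 combinations in a nibble
print(2 ** 8)    # 256 combinations in a byte

code = ord("A")              # 65
row, col = divmod(code, 16)  # split the byte into its two nibbles
print(row, col)              # row 4, column 1 of a 16 x 16 table
```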

In the early days of silicon-based computers, Bill Gates and Steve Jobs had some of the earliest machines. Early on, these computers were programmed by physically flipping switches; to get one letter, eight switches were required. There used to be clubs where computer enthusiasts would meet regularly, and the attendees would try to outdo each other by showing off their newest advancement. For example, somebody developed the first keyboard, where instead of flipping eight switches, only one button had to be depressed, but to depress the button it was necessary to insult its family. (Sorry, it was like a joke…just smaller.) Somebody else had their box close to a speaker, realized that the frequency of the CPU would cause static in the speaker at various pitches, and developed the first computer-generated song.

These developments were shared openly and freely, and entrepreneurs like Steve and Bill then incorporated these advances into their systems. I find it interesting when people accuse Bill Gates, in particular, of “stealing” from Steve Jobs, when in fact that was simply the culture, and both of them stole ideas from these gatherings.
