Wednesday, December 29, 2010

A Brief History - 6 - The Origins of Objects

"A Brief History of Code", by Rudedog Hawkins

PART 6: Object Trek: The Origins of Objects


When the first computers began to “think” in the 1950s, engineers realized that their artificial brains were not really thinking at all.  Only those who did not understand how the machines worked perceived that a computer could think for itself.  The engineers knew all along that the CPU executed a sequence of pre-written, pre-determined instructions, and they did little to disabuse the public of that notion.  The mystery added mystique to the practitioners of the trade and kept nosy busybodies away.  The fact that cutting-edge development was in the interest of national security helped a little bit, too.

As much as engineers wanted to create an artificial mind that they could converse with, it was simply beyond the understanding and technology of the day.  The CPU was far too slow compared to the human brain.  The BRAIN.  That was it!  They had a real model to imitate.

The human brain reacts to inputs and creates outputs.  For example, the brain can hear music through the ears, which also stimulates pleasure centers that release hormones and put us in a good mood.  Engineers realized that a computer could do the same thing, just on a smaller scale.  They recognized that their computers had a very limited number of inputs compared to the human brain.

Engineers also recognized the most critical differences.  The brain had dedicated areas for various types of processing, and it seemed to have the capacity to process several inputs simultaneously, each of which could produce an independent output.  Gee, we could walk and talk at the same time!  Sometimes the obvious is hidden because it is in plain sight.  The dream to invent artificial intelligence now had a direction: imitate life.  Didn’t they do that trying to invent the airplane?

It was not until the mid-to-late 1950s that engineers began to think of more creative ways to model and imitate the human brain in a computer program.  A computer program can be thought of as a list of computer instructions.  They began to write “lists” of code that performed specialized tasks, just like the separate areas of the brain.  One of the oldest high-level computer languages, one that is still in use today, was designed at this time to work in exactly this fashion.  The language was called LISP, short for LISt Processor.

A program was written to interpret these instruction lists, and the first formal interpreter for a higher-level language was born.  Engineers already had a standardized syntax for writing binary code from lists of instructions: Assembly Language.  Since everyone wanted to get in on the ground floor of high-level languages, higher-level standard syntaxes were conceived and published as industry standards.  They wanted a syntax that could be used on more than one type of CPU, provided you had the proper interpreter.  Another language introduced at this time was Fortran, short for FORmula TRANslator.
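
To make the idea concrete, here is a minimal sketch of such an interpreter, written in Python purely for illustration.  The three-instruction set is hypothetical, not anything the early engineers actually used; the point is that the "program" is just a list of data that another program walks and executes.

def interpret(instructions):
    # Walk a list of (opcode, argument) pairs and execute each one.
    stack = []
    for op, arg in instructions:
        if op == "push":            # place a value on the stack
            stack.append(arg)
        elif op == "add":           # pop two values, push their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "print":         # output the top of the stack
            print(stack[-1])
        else:
            raise ValueError("unknown instruction: " + op)

# The "program" is itself just data: a list the interpreter can walk.
program = [("push", 2), ("push", 3), ("add", None), ("print", None)]
interpret(program)                  # prints 5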

During this period, computer hardware and software development was done primarily by engineers in the scientific and research fields.  The focus was on turnaround time for solving long and complex equations.  Engineers wanted to enter their actual formulas and let the CPU interpret and solve them, instead of writing computer code to solve one specific equation.  But interpreted code was notorious for its slow execution compared to native binary instructions.
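
As a rough, hypothetical illustration of that trade-off in a modern language (Python here, since no code survives in this post): a formula interpreted from text every time it is needed runs noticeably slower than a dedicated routine hand-written for that one equation.

import timeit

formula = "3*x**2 + 2*x + 1"        # the formula, entered as text

def dedicated(x):
    # Hand-written code for this one specific equation.
    return 3*x**2 + 2*x + 1

# Interpreting the text each time versus calling the dedicated routine.
t_interp = timeit.timeit(lambda: eval(formula, {"x": 5.0}), number=100_000)
t_direct = timeit.timeit(lambda: dedicated(5.0), number=100_000)
print("interpreted:", round(t_interp, 3), "s")
print("dedicated:  ", round(t_direct, 3), "s")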

Some engineers still wanted to imitate the multitude of inputs available to the human brain: inputs which, when stimulated, caused specific sections of the brain to activate and process the stimulus.  They wanted to give an AI the ability to see, hear, feel, and eventually think and reason.  But the engineers of the day fell short of imitating the human brain.  They soon realized that the human eye did not produce a single vision signal; rather, human vision comprised millions of sub-signals, each generated by one of the separate rods and cones that make up the human retina.

Truly imitating life required far more computer capacity than what was available.  Engineers needed to split the functionality of one list into multiple similar lists.  So they gave their instruction lists the ability to be identified by arbitrary symbols.  The original list became a roadmap for how sub-lists of that type should operate and behave, and the roadmap could be supplied with data to define how an individual sub-list would behave.  These were the first “objects” to be used in computer programming.  These roadmap definitions and the resulting objects were not very different from the class definitions and instance objects that we use today.
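
In modern terms, the roadmap is a class and each configured sub-list is an instance.  Here is a minimal sketch of the same idea in today's notation (Python, with hypothetical names chosen to echo the brain-model analogy):

class BrainRegion:
    # The "roadmap": one definition of how every sub-list of this
    # type should operate and behave.
    def __init__(self, name, stimulus_type):
        # Data supplied at creation defines this particular sub-list.
        self.name = name
        self.stimulus_type = stimulus_type

    def process(self, signal):
        return self.name + " handled a " + self.stimulus_type \
               + " stimulus: " + signal

# The resulting "objects": separate instances stamped from one roadmap,
# each identified by its own symbol and configured with its own data.
vision = BrainRegion("visual cortex", "light")
hearing = BrainRegion("auditory cortex", "sound")
print(vision.process("sunrise"))
print(hearing.process("music"))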

They were also stuck on figuring out all of the pieces and parts needed to make up an actual model of the human brain.  Modern medicine could not even tell them all of the details.  They did realize that countless specialized pieces and parts were needed.  But what, exactly?  And so progress toward realizing an artificial brain was stymied by the limits of available science and technology throughout the ’60s, ’70s, and even most of the ’80s.

Work with objects did not stop, however.  It proliferated.  Engineers continued to improve the computer languages and software theory that used objects.  Although the concept of software objects was becoming better understood, objects were still expensive to use in the ’60s and ’70s because of how much memory they consumed.  Then along came Large Scale Integration (LSI) of computer chips, which made the microprocessor possible.

It was soon recognized that the pace of hardware improvement would quickly render the memory issue moot.  That is exactly what occurred in 1981 with the introduction of the microprocessor-based IBM PC.  Although software objects were quickly moving away from novelty and into everyday reality as more people discovered them, most still saw them as curiosities.  The term “geek” and the phrase “computer nerd” were born.  Their best friends were computers.


Rudy  =8^D
