Wednesday, December 29, 2010

A Brief History - 4 - The Library from the Edge of Tomorrow

PART 4: Object Trek: The Library from the Edge of Tomorrow

The Microsoft computer language development packages underwent a major change that took the form of the .NET Framework.  The sheer volume of code in the release felt like a software Library of Congress being released by the U.S. government.  The .NET Framework represented a “5th Order” computer language design.  “4th Order” computer languages had been around for a couple of decades.  These languages could quickly compile source code into a custom intermediate form, a fourth form.  This custom file could then be executed by a custom Interpreter much more quickly than a program running from the original source code alone.  This custom intermediate file frequently meant that everything worked better if it could fit inside one custom binary assembly.

The “5th Order” designs were meant to eradicate the beast that had long stood in the developer’s way: cross-platform and mixed-language program development.  While a “5th Order” design functions like a “4th Order” design in almost every way, the difference lies in the intermediate format of the code that is generated.  The .NET Framework introduced a standardized Intermediate Language, IL, as the output of the compilation process.

The compilers for several languages were completely rewritten to generate this new standardized IL.  A standardized IL meant that only a runtime Interpreter had to be written for each of the various manufacturers’ CPUs on the market.  More importantly, a standardized IL meant that the cross-platform and mixed-language problems of the past could be eliminated.  You no longer found it easier to fit everything into one assembly.  It didn’t matter anymore.  Re-usable code took on a new meaning and an added dimension.

The .NET Framework also introduced another, even more significant, change.  Since all of the languages compiled to a standard IL, all of the languages would need to be compatible with some type of hardware design.  A brilliant design decision was made: abstraction.  The target hardware was separated, or abstracted, from the languages, creating a virtual programming environment that did not care what the actual hardware configuration was when the program was actually executed.

The details of implementing and executing the code on specific hardware would be left to the Compiler and the IL Interpreter, which was also written by Microsoft.  The .NET Framework uses an IL Interpreter known as the Common Language Runtime, or CLR for short.

From a developer’s perspective, all hardware resources were now just another piece of abstract software.  This even included actual hardware like disk drives, mice and memory.  This is a critical feature that many new and old-time developers have trouble fully understanding.  How the hardware works can be regarded as a “don’t care” condition under most circumstances.

An implementation of managed memory was introduced with the CLR for the .NET Framework.  In the past, developers had to overly concern themselves with hardware-specific issues, such as which memory addresses were consumed by their program, by other programs, by the OS, and by their own data.  This issue had been partially resolved with the introduction of multi-tasking operating systems, which used virtual memory.

The existing implementation of virtual memory, which allowed the developer the freedom not to worry about other programs, still found itself too dependent upon the hardware platform where the program was being executed.  While the developer was granted the freedom to assume that all available memory was his, he still had to worry about how much memory was available and where.

The developer also had to worry about re-using available memory for data storage when writing more complex programs.  The developer had to clear out used memory before it could be re-used.  Sometimes the developer had to relocate large chunks of data stored in memory to make room for even more data.  The developer could not always store data in a continuous segment of memory and had to resort to breaking it up into pieces, with the resulting management problems of keeping track of it as it grew and shrank dynamically.  Compounding the problem were third-party software libraries that didn’t seem to care how much memory they consumed, nor where.

Managing memory became just as much a part of the development task as actually designing and implementing the software’s features.  It had long been recognized that managing memory slowed development, but what could you do?  Managing memory was just as much a part of the development process as air friction is a part of an airplane being airborne.  It was a fact and a part of life.

Managed memory pretty much freed the developer from those worries.  Only when the application software interacted with hardware resources would the developer need to worry about managing memory.  Even then, it amounted to pretty much calling a subroutine known as Dispose to let the CLR know that the application was finished interacting with a given hardware resource for now.  The same procedure was used to interact with memory, disk files, graphics, audio, etc.  The developer was free to focus on the application’s features, and not on actually implementing the application on a specific platform.
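As a minimal C# sketch of the Dispose idea described above (the file name here is just an example), the “using” statement even automates the call for you:

```csharp
using System;
using System.IO;

class DisposeDemo
{
    static void Main()
    {
        // The using statement guarantees Dispose() is called when the
        // block exits, even if an exception is thrown, handing the
        // underlying file handle back to the operating system.
        using (StreamWriter writer = new StreamWriter("example.txt"))
        {
            writer.WriteLine("Hello from managed code");
        }   // writer.Dispose() runs here automatically.

        // The equivalent manual pattern the text describes:
        StreamReader reader = new StreamReader("example.txt");
        try
        {
            Console.WriteLine(reader.ReadLine());
        }
        finally
        {
            reader.Dispose();   // tell the CLR we are done with this resource
        }
    }
}
```

Everything else, such as the memory backing those objects, is reclaimed by the CLR’s garbage collector with no call needed at all.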

Microsoft’s Visual Studio software uses the .NET Framework to provide developers with a sophisticated and powerful environment in which to flex their creative muscles.  This Integrated Development Environment, IDE, is targeted at the PC platform.  As of this writing, Microsoft has never introduced a version for the Mac platform.  The .NET Framework is now in its 4th generation.


Rudy  =8^D
