History Of Increasing The Level Of Abstraction

When developing software one deals with levels of abstraction, ranging from the real world, where the problem represents the highest level of abstraction, to machine language, which represents the solution at the lowest level of abstraction. Between the highest and lowest levels, one should develop software in as many levels of abstraction as the problem demands.

The history of software development is the history of raising the level of abstraction at each step. In the early days of computing, programmers worked at the lowest level of abstraction, feeding the computer binary instructions that corresponded directly to native CPU instructions. The main challenge programmers faced in those early days was the efficient use of the very limited amount of memory available.

Later, an important software innovation took place: Assembly language was developed under the pressure of an increasing number of new and larger applications in the real world. Assembly language is based on abstractions designed to allow programmers to replace the 0s and 1s of native computer instructions with mnemonics. An assembler was used to translate the mnemonics into the 0s and 1s corresponding to the native processor instructions, allowing the programmer to concentrate more on the problem than on error-prone programming details. Mnemonics were a higher level of abstraction. Writing code was less time-consuming and programs were less prone to error. The increased level of abstraction was followed by an increase in productivity, software quality, and longevity.

The next big jump in using abstraction in code writing was the advent of the third-generation languages (3GLs). In these languages, the machine instructions needed to execute a certain operation, such as PRINT, were grouped into named macro instructions. Code would be written using these macro instructions, and a translator would translate the macros into the sequence of 0s and 1s now called machine code. The high-level constructs used to write code allowed programmers to not be limited by the hardware's capabilities. If new hardware with a different instruction set was introduced, new translators were created to take the changes into consideration and generate code targeted to the new hardware. The ability to adjust code to different machines was referred to as portability.

The new set of tools programmers could use increased the number of domains and applications for which computer solutions were possible. So code was developed for each new and different problem. System vendors began to use 3GLs instead of assembly languages, even to define operating systems services. While 3GLs raised the level of abstraction of the programming environment, operating systems raised the level of abstraction of the computing platform.

Structured programming gave a boost to the use of control abstractions, such as loops and if-then statements, that were incorporated into high-level programming languages. The control structures allowed programmers to abstract out the conditions that affect the flow of execution. In the structured programming paradigm, software was developed by first making an inventory of the tasks that needed to be achieved. Then, each task was decomposed into smaller tasks until the level of a programming language statement was reached. During all the phases of analysis and design in structured programming, the focus was on how to refine, step by step, the tasks that needed to be accomplished.
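The top-down decomposition described above can be sketched with a small, hypothetical example: an overall task (summarizing a list of sales figures) is refined into smaller tasks until each one is a plain sequence of statements built from looping and if-then control abstractions. The task and function names are invented for illustration.

```python
# A hypothetical example of top-down, structured decomposition:
# the task "summarize sales" is refined into smaller tasks until
# each is expressed directly with control-structure statements.

def summarize_sales(sales):
    total = compute_total(sales)
    count_high = count_large_sales(sales, threshold=100)
    return total, count_high

def compute_total(sales):
    total = 0
    for amount in sales:        # looping control abstraction
        total += amount
    return total

def count_large_sales(sales, threshold):
    count = 0
    for amount in sales:
        if amount > threshold:  # if-then control abstraction
            count += 1
    return count

print(summarize_sales([50, 120, 80, 300]))  # (550, 2)
```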

Later, abstract data types were introduced into programming languages, allowing programmers to deal with data in an abstract way, without taking into consideration the specific form in which the data were represented. Abstract data types hide the specific implementation of the data structure, as that implementation is considered a low-level detail. Programmers did not have to deal with this low level of detail; instead, they manipulated the abstract data types through their abstract operations.
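As a minimal sketch of this idea, consider a hypothetical stack abstract data type: clients use only the operations push, pop, and peek, while the underlying list-based representation stays hidden and could be swapped out without affecting callers.

```python
class Stack:
    """An abstract data type: clients see push/pop/peek,
    not the underlying representation (here, a Python list)."""

    def __init__(self):
        self._items = []  # implementation detail, hidden from clients

    def push(self, item):
        self._items.append(item)

    def pop(self):
        return self._items.pop()

    def peek(self):
        return self._items[-1]

    def is_empty(self):
        return not self._items

s = Stack()
s.push(1)
s.push(2)
print(s.pop())   # 2
print(s.peek())  # 1
```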

The demand for developing more complex systems, and in shorter time, made necessary a new, revolutionary way of looking at software development: the object-oriented paradigm. According to this new paradigm, the basic building block is an object. The object-oriented approach tries to manage complexity by abstracting out knowledge from the problem domain and encapsulating it into objects. Designing software means identifying objects from the problem domain and providing them with specific responsibilities. Objects dialog with each other in order to make use of each other's capabilities. Functionality is achieved through dialog among objects.

In the object oriented paradigm, the analysis and design processes start with a more abstract focus. The main focus is to identify which operations need to be accomplished and who would accomplish these operations. The corresponding responsibilities are to be distributed to objects. Objects are to be provided with the necessary data and behavior in order to play the particular role they are assigned to. Each object knows its responsibilities and it is an active player. Rarely are objects created to stand by themselves outside any collaboration with other objects.
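Such a dialog can be sketched with a hypothetical pair of collaborators: an Order object fulfills its responsibility of placing itself by collaborating with an Inventory object, which is responsible for knowing stock levels, rather than doing everything on its own. All the names here are invented for illustration.

```python
class Inventory:
    """Responsible for knowing and reserving stock."""
    def __init__(self, stock):
        self._stock = stock

    def reserve(self, item, quantity):
        if self._stock.get(item, 0) >= quantity:
            self._stock[item] -= quantity
            return True
        return False

class Order:
    """Responsible for placing itself; delegates stock
    checks to the Inventory it collaborates with."""
    def __init__(self, item, quantity):
        self.item = item
        self.quantity = quantity

    def place(self, inventory):
        # functionality achieved through dialog among objects
        return inventory.reserve(self.item, self.quantity)

inventory = Inventory({"widget": 5})
order = Order("widget", 3)
print(order.place(inventory))  # True
print(order.place(inventory))  # False (only 2 widgets left)
```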

One of the most important recent achievements, representing a great breakthrough in software development, is what we refer to as design patterns. Design patterns are descriptions of communicating objects and classes that are customized to solve a general design problem in a particular context [GHJ95]. As a collaboration, a pattern provides a set of abstractions whose structure and behavior work together to carry out some useful function [BRJ99]. Patterns present recurring solutions to software design problems that occur in real-world application development. Design patterns abstract the collaboration between objects in a particular context and can be reused again and again. The use of design patterns in software engineering moved the level of abstraction higher, closer to the problem level and away from the machine language level.
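To make this concrete, here is a minimal sketch of one classic pattern from [GHJ95], Observer: a subject notifies its registered observers of events without depending on their concrete classes, so the same collaboration can be reused in many contexts. The class names are invented for illustration.

```python
class Subject:
    """Observer pattern: the subject notifies registered observers
    of events without knowing their concrete types."""
    def __init__(self):
        self._observers = []

    def attach(self, observer):
        self._observers.append(observer)

    def notify(self, event):
        for observer in self._observers:
            observer.update(event)

class Logger:
    """One possible observer: records every event it receives."""
    def __init__(self):
        self.events = []

    def update(self, event):
        self.events.append(event)

subject = Subject()
logger = Logger()
subject.attach(logger)
subject.notify("state changed")
print(logger.events)  # ['state changed']
```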

For many years, software development companies have developed applications in a number of languages and operating systems. Isolated islands of applications developed in different programming environments and operating systems make it difficult to achieve a high level of integration that is demanded by the age of the Internet. In order to be competitive, companies are now forced to look for ways of building communication bridges between these isolated islands.

The Object Management Group (OMG) was created in 1989 to develop, adopt, and promote standards for the development and deployment of applications in distributed heterogeneous environments [Vin97], [VD98]. OMG's response to this challenging problem was CORBA (Common Object Request Broker Architecture). CORBA enables natural interoperability regardless of platform, operating system, programming language, and even network hardware and software. With CORBA, systems developed in different implementation languages and operating systems do not have to be rewritten in order to communicate. By raising the level of abstraction above implementation languages and operating systems, CORBA made a tangible contribution to the longevity of software.

Another important event that significantly influenced the software engineering world was the use of visual modeling tools. Modeling is a well-known engineering discipline, as it helps one to understand reality. Models of complex systems are built because it is difficult to understand such systems in their entirety. Models are needed to express the structure and the behavior of complex systems, and using them makes it possible to visualize and control a system's architecture.

In the early 90s, there were several modeling languages in use by the software engineering community. The best-known methodologies were Booch's, Jacobson's Object-Oriented Software Engineering, and Rumbaugh's Object Modeling Technique. Other important methods were Fusion [CAB94], Shlaer-Mellor [ShM88], and Coad-Yourdon [CY91]. All these methods had strengths and weaknesses. An important event occurred in the mid-90s, when Booch, Jacobson, and Rumbaugh began adopting ideas from each other, which led to the creation of the Unified Modeling Language or, as it is best known, the UML. UML is a standard language for visualizing, specifying, constructing, and documenting object-oriented systems [BRJ99].

UML uses a set of graphical symbols to abstract things and their relationships in a problem domain. Several types of diagrams are created to show different aspects of the problem. Models created using UML are semantically richer than those expressed in any current object-oriented language, and they can be expressed independently of any programming language. When a UML model is translated into a particular programming language, there is a loss of information. A UML model is easy to read and interpret, as it is expressed largely in plain English. In this way, formal models raise the programming abstraction level above the 3GLs in a profound way.

In 2002, OMG introduced a new and very promising approach to software development referred to as the Model Driven Architecture approach, known as the MDA. MDA is about using modeling languages as programming languages [Fra03].

Most commonly, software models are considered to be design tools, while code written in programming languages is considered a development artifact. In most software development teams, the role of the designer is quite separate from the role of the developer. A direct consequence of this separation is that design models are often informal and are used by developers only as guidelines for software development. This separation of roles is a common source of the discrepancies that exist between design models and code.

MDA aims to narrow the gap between the designer and the developer by making it possible to produce models that can be compiled and executed in the same environment. Models are therefore not only a design artifact, but an integral part of the software production process. The MDA approach to software development is based on building platform-independent models (PIMs) that can later be mapped to platform-specific models (PSMs), which take into consideration specific implementation issues such as platforms, middleware, etc. The platform-specific models are then used by code generators to automatically create the implementation code.
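The pipeline described above can be sketched with a deliberately toy example: a platform-independent model of an entity is mapped to a platform-specific model (here, a hypothetical mapping to SQL column types) and then fed to a code generator. The model format and type mappings are invented for illustration, not taken from any MDA tool.

```python
# Toy sketch of the MDA pipeline: PIM -> PSM -> generated code.

# Platform-independent model: names abstract types, no platform detail.
pim = {"entity": "Customer", "fields": {"id": "Integer", "name": "String"}}

def pim_to_psm(pim):
    """Map platform-independent types onto a specific platform
    (here, hypothetical SQL column types)."""
    sql_types = {"Integer": "INT", "String": "VARCHAR(255)"}
    return {
        "table": pim["entity"].lower(),
        "columns": {name: sql_types[t] for name, t in pim["fields"].items()},
    }

def generate_code(psm):
    """Generate implementation code (a CREATE TABLE statement) from the PSM."""
    cols = ", ".join(f"{n} {t}" for n, t in psm["columns"].items())
    return f"CREATE TABLE {psm['table']} ({cols});"

psm = pim_to_psm(pim)
print(generate_code(psm))
# CREATE TABLE customer (id INT, name VARCHAR(255));
```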

The MDA approach has already been applied to a variety of computing technologies, such as Sun's J2EE and EJB, Microsoft's .NET, XML, and OMG's CORBA, and an increasing number of software companies are adopting it. Although it is quite early to evaluate the impact MDA is having on the software industry, the future looks promising. By narrowing the gap between designers and developers, MDA considerably raises the level of abstraction in software development.
