Decoding Software Architecture: The Essential Role of Architectural Decision Records (ADRs)
Understanding the Importance of Choices in Software Design
Creating high-quality software is a challenging task. Surprisingly, crafting or maintaining even low-quality software is no walk in the park either. With every line of code we write, we make choices. Some of these choices are of little consequence, while others are critically important.
Although we can adjust some decisions with relative ease, there are others that are difficult to alter. Probably the best-known definition of software architecture says, “Architecture is the stuff that’s hard to change later.”
So, the decisions we make regarding our software architecture tend to stick with us because they are difficult to change. However, before we decided to pursue a specific path, we had thoroughly considered and discussed it, meaning we had valid reasons to choose that direction, right? Now, if we are considering a change in course, do we still remember why we made those initial decisions? Were there alternative options that we evaluated before settling on our chosen path?
The Second Law of Software Architecture
In their book Fundamentals of Software Architecture, Neal Ford and Mark Richards define the Second Law of Software Architecture as "Why is more important than how."
Indeed, it is essential to strive for well-informed decisions. Understanding the rationale behind our choices allows us to reevaluate them when needed and retain valuable knowledge. Consequently, thorough documentation becomes imperative to preserve this knowledge for future reference.
BM Insight: For the sake of my sanity, I will also mention the First Law of Software Architecture: “Everything in software architecture is a trade-off.”
Architectural Decision Records: An Insight
One effective way of documenting architecture decisions is through Architectural Decision Records (ADRs).
In general, the ADR concept rests on two notions: the Architecturally Significant Requirement (ASR) and the Architectural Decision (AD).
An AD represents a software design choice addressing either a functional or non-functional requirement of significant architectural importance. An ASR is a requirement that has a discernible and quantifiable influence on the software system’s architecture and its quality. The ADR serves as a detailed account of a single AD, capturing its rationale. The comprehensive collection of ADRs, which are developed and kept for the duration of a project, forms its decision log.
Challenges of Architecturally Significant Requirements
Defining ASRs can be a daunting task. These requirements often lack precise expression, making them initially challenging to identify. They might be concealed within broader requirements and subject to varying interpretations depending on the context.
Moreover, architecturally significant requirements are subjective and dynamic, adding to their complexity.
While other requirements could share some of these characteristics, what sets ASRs apart is their unique importance and impact on the overall system’s architecture. This significance makes dealing with these requirements particularly intricate and demanding.
Key Components of an ADR
ADRs typically include the following information for one architectural decision:
Title: A clear and concise title that reflects the essence of the decision.
Context: The situation or problem that prompted the need for a decision.
Decision: The specific choice made to address the given context or problem.
Rationale: The reasoning behind the decision, including the benefits and drawbacks considered.
Consequences: The expected outcomes and potential impacts of the decision on the system and development process.
Status: Indicates whether the decision is proposed, accepted, rejected, or superseded.
Date: The date when the decision was made or recorded.
Author: The individual or team responsible for the decision and its documentation.
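The components above can be assembled into a lightweight, reviewable file. The following sketch shows what a filled-in record might look like; the project, numbering, and decision are invented purely for illustration:

```markdown
# ADR-012: Use PostgreSQL as the primary datastore

## Status
Accepted (supersedes ADR-007)

## Context
The service needs transactional guarantees and ad-hoc reporting queries;
the team has limited operational experience with NoSQL stores.

## Decision
We will use PostgreSQL for all persistent service data.

## Rationale
Strong consistency and mature tooling outweigh the scaling flexibility a
document store would offer at our current load.

## Consequences
Schema migrations become part of every release; horizontal scaling will
require read replicas or sharding if load grows.

## Date / Author
2023-06-01, Platform team
```

Stored next to the code, such a file is cheap to review in a pull request and survives team turnover.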
Preserving the Integrity of ADRs
Once made, an ADR typically is not altered. If a decision is changed, a new ADR is created to supersede the old one, preserving the history of architectural decisions.
It is good practice to store ADRs in source control together with the code/project.
A Practical Glimpse: ADR Example from Neal and Mark’s Book
Finally, Neal and Mark's book contains a fully worked example of an ADR.
BrightMarbles’ Commitment to Excellence
At BrightMarbles, we strive to uphold the highest standards, continually integrating best practices such as the use of ADRs. Our rich experience, shaped by years of dedication, is currently being channelled into the Raven project. This endeavour is not just an investment, but a showcase of our commitment to efficient, reliable, and innovative software design.
BM Insight: Software architecture is the umbrella term for the various approaches, models, methods, and programming languages that make up a comprehensive software development project. In our daily coding errands, we not only follow current industry trends but also eagerly study how we got here. One of our previous posts, The Genesis of Object-Oriented Programming, is a thorough analysis of the development of the eponymous programming model. As pointed out before, we need to know why to become versatile conceptual software engineers.
In Conclusion: The Vitality of Architectural Decisions and ADRs
In the multifaceted realm of software architecture, our choices, and the rationale behind them, are pivotal. These choices, whether easily modifiable or steadfast, have lasting repercussions on our software’s performance and maintainability. Architectural Decision Records emerge as the beacon, providing clarity, rationale, and a historical account of our architectural choices. They not only fortify our architectural framework but also ensure that the wisdom behind each decision remains accessible for current and future team members.
As highlighted by BrightMarbles' relentless pursuit of excellence and best practices – like ADRs – one thing is evident: the intricate dance of decision-making in software design isn't merely about choosing a path but about understanding, documenting, and justifying the why behind it. As developers and architects, embracing tools like ADRs helps us navigate the complex waters of software design with precision, clarity, and foresight.
About the Author:
Nenad Stojkovic is a seasoned software architect and a tech excellence officer at BrightMarbles, and a PhD student at the Faculty of Sciences in Novi Sad. He’s an all-round language enthusiast and learner, whose scope of knowledge extends beyond programming languages, to English and German. Music is the fourth language he speaks, and while his guitar gently weeps, Nenad travels the world, admiring the amazing works of architecture.
The Genesis of Object-Oriented Programming (OOP): A Historical Overview
1. Introduction
Welcome to “The Genesis of Object-Oriented Programming (OOP): A Historical Overview,” where we delve into the origins and evolution of one of the most influential paradigms in modern software development. In this blog, we’ll explore the key milestones and groundbreaking contributions that shaped the landscape of object-oriented programming. From the early concepts of Simula to the birth of Smalltalk and the transformative emergence of C++, we’ll trace the lineage of OOP and uncover the remarkable insights from visionary pioneers. Join us on this captivating journey through time as we unravel the fascinating history behind OOP.
2. The Triad of Programming Paradigms
The renowned software engineer, Robert C. Martin, affectionately known as ‘Uncle Bob’, once postulated that there are three fundamental programming paradigms that form the bedrock of software development: Structured programming, Functional programming, and Object-oriented programming. As articulated in his seminal book, “Clean Architecture: A Craftsman’s Guide to Software Structure and Design,” [3] these paradigms were conceived during a transformative decade – the latter half of the 1950s and the early half of the 1960s.
According to Uncle Bob, these three paradigms are likely to be the only ones we will ever unveil. His argument pivots on the notion that each paradigm introduces constraints, providing guidelines on what we should refrain from doing. Furthermore, any emergent paradigm will similarly enforce new limitations. One of the keystone arguments bolstering his belief that the existing paradigms are the only ones we will ever stumble upon rests on the historical record. Quite simply, we’ve seen no new paradigm materialize in the past half-century.
Before delving into the fascinating historical evolution of the Object-oriented programming paradigm, it’s worth taking a moment to revisit the landscape of software development prior to its advent. Doing so will allow us to better understand the factors that instigated the genesis of Object-oriented programming and catalyzed its subsequent rise to prominence.
2.a Structured Programming: Algorithms and Control Flow
Structured programming pivots around crafting a suite of procedures to tackle a specific problem. Once the optimal procedures, or algorithms, have been established, the next course of action involves determining the most efficient way to store data. Notably, this ideology is embodied in the title of Swiss computer scientist Niklaus Wirth's book, "Algorithms + Data Structures = Programs."
Essentially, structured programming is the art of software development utilizing control flow elements such as conditionals (if-then-else) and loops (for and while), code block structures, and coroutines. The code block structure allows for multiple instructions to be interpreted as a singular entity.
Structured programming rose to fame through “The Structured Program Theorem,” commonly referred to as “The Böhm-Jacopini theorem”, and the influential work of Edsger Dijkstra. The theorem, published in 1966 by Corrado Böhm and Giuseppe Jacopini, posited that all programming tasks can be logically executed using only elementary structures: sequence, selection, and iteration.
In 1968, Dijkstra popularized the term "structured programming" and argued vehemently against the misuse of 'goto' instructions, which he blamed for the creation of "spaghetti code". His research, published as an open letter in Communications of the ACM, asserted that the quality of programs declined as the frequency of 'goto' usage increased. Initially titled "A Case Against the GO TO Statement", the letter stirred a passionate debate about the 'goto' instruction after being renamed by editor Niklaus Wirth to "Go To Statement Considered Harmful."
Programming languages that employed structured programming principles include ALGOL and, later, Ada. To encapsulate the essence of structured programming, Robert C. Martin stated, "Structured programming imposes limitations on the direct control-flow transfer." [3]
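The three elementary structures of the Böhm-Jacopini theorem can be shown in a few lines. Python is used here purely for illustration; the function and data are invented for the example:

```python
# Sequence, selection, and iteration suffice to express any computation,
# per the Böhm-Jacopini theorem. A classic toy example: classify exam scores.

def grade(scores):
    results = []                      # sequence: statements run one after another
    for s in scores:                  # iteration: a loop instead of 'goto'
        if s >= 50:                   # selection: if-then-else
            results.append("pass")
        else:
            results.append("fail")
    return results

print(grade([80, 35, 50]))            # ['pass', 'fail', 'pass']
```

Note that no arbitrary jumps are needed: control flow enters each block at the top and leaves at the bottom, which is exactly the discipline Dijkstra argued for.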
2.b Functional Programming: Harnessing the Power of Mathematical Functions
Functional programming is grounded in the principles of lambda calculus, an abstraction introduced by Alonzo Church in 1936. Central to lambda calculus, and consequently functional programming, is the immutability of symbol values. The first functional language, LISP, was developed by John McCarthy at the Massachusetts Institute of Technology in the 1950s. LISP functions were defined using Church's lambda notation, extended with the label notation to enable recursive function calls.
In the functional programming paradigm, functions are treated as first-class entities. That is, they can be named, passed as arguments to other functions, or returned as values from functions.
As Eric Normand points out, functional programming is a "programming paradigm characterized by the use of mathematical functions and avoidance of side-effects." [12] Side-effects refer to any behavior of a function beyond its return value, while pure functions rely solely on their arguments and produce no side-effects. This ensures that for any given set of input arguments, a pure function will consistently return the same value. Functional programming thus strives for purity but recognizes the necessity of side-effects and impure functions in certain scenarios.
Functional code is generally categorized into three divisions: actions, calculations, and data. Some of the most prominent functional languages include Common Lisp, Erlang, Clojure, Haskell, and F#. Robert C. Martin summarizes functional programming as: “Functional programming imposes limitations on assignments.” [3]
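The distinction between pure and impure functions, and the first-class status of functions, can be sketched briefly. Python stands in for a functional language here; all names and values are invented for illustration:

```python
# A pure function: the result depends only on its arguments, no side-effects,
# so the same inputs always yield the same output.
def add_interest(balance, rate):
    return balance * (1 + rate)

# An impure counterpart: it mutates shared state (an observable side-effect
# beyond the return value).
balances = {"alice": 100.0}
def add_interest_impure(name, rate):
    balances[name] *= (1 + rate)

# Functions are first-class entities: they can be passed as arguments
# and returned as values.
def apply_twice(f, x):
    return f(f(x))

print(add_interest(100.0, 0.05))          # same inputs, same result
print(apply_twice(lambda x: x + 1, 40))   # 42
```

The pure version is trivially testable and safe to run in any order; the impure one depends on, and silently changes, state outside itself.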
2.c Object-Oriented Programming (OOP): Modular Units for Simplified Software Maintenance
Object-oriented programming, or OOP, is characterized by predefined programming modular units (objects, classes, subclasses, etc.) designed to speed up the programming process and simplify software maintenance. [1]
Alan Kay, the creator of the OOP concept, pinpointed two key motivators behind the development of OOP: the quest for superior module structures for complex systems (which would encompass detail hiding) and the desire to devise more flexible versions of assignments. A programming language is considered object-oriented if it supports data and class abstraction, while certain languages are considered fully object-oriented if they also support inheritance and polymorphism. [15]
At the heart of object-oriented programming lie four basic concepts: encapsulation, inheritance, abstraction, and polymorphism.
Encapsulation, also known as data hiding, is the practice of an object concealing its internal structure and data from the rest of the system. It permits access to its fields exclusively via its defined methods. The data within an object represents instance fields, and the operations performed on this data are referred to as methods or functions. It’s an unwritten rule in OOP that objects should never directly access the fields of other objects; instead, this should be accomplished through the methods of those objects.
Inheritance allows a class to acquire the attributes and behaviors of another class (referred to as the base class). Moreover, inheritance can specify the interface implemented by a specific class.
Abstraction aims to hide unnecessary details to lower complexity.
Polymorphism, derived from the Greek words ‘poly‘ (many) and ‘morphe‘ (shape), enables the same method call to behave differently depending on the object type upon which it is invoked. Polymorphism can be achieved through method overloading and overriding, as well as operator overloading.
In the realm of object-oriented programming, programs are constructed from objects that communicate with each other. An object is an instance of a class, while a class delineates the type of the object. Some prominent object-oriented programming languages include Java, C++, C#, Python, Ruby, PHP, Smalltalk, Objective-C, Swift, Scala, and Kotlin.
Robert C. Martin distills the concept of object-oriented programming to this: "Object-oriented programming imposes limitations on the indirect control transfer." [3]
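The four basic concepts above fit into one compact sketch. Python is used for illustration; the classes and the rounded value of pi are invented for the example:

```python
from abc import ABC, abstractmethod

# Abstraction: Shape exposes only what callers need - an area() operation.
class Shape(ABC):
    @abstractmethod
    def area(self):
        ...

# Encapsulation: the radius is kept in a non-public field and is only
# reached through the object's own methods.
class Circle(Shape):
    def __init__(self, radius):
        self._radius = radius
    def area(self):
        return 3.14159 * self._radius ** 2

# Inheritance: Square acquires Shape's interface and must implement area().
class Square(Shape):
    def __init__(self, side):
        self._side = side
    def area(self):
        return self._side ** 2

# Polymorphism: the same method call behaves differently depending on
# the type of the object it is invoked on.
for shape in (Circle(1), Square(2)):
    print(type(shape).__name__, shape.area())
```

The loop at the end never asks which concrete class it holds; dispatching on the object's type is exactly the polymorphism described above.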
Genealogy of programming languages
3. Simula: The Precursor to Object-Oriented Programming
Simula 1 and Simula 67, developed by Norwegian computer scientists Ole-Johan Dahl and Kristen Nygaard, are recognized as the first object-oriented programming languages. Belonging to the ALGOL language family, Simula was designed as a superset of ALGOL 60, chosen for its support of block structures, robust programming security, and a touch of European pride [9].
An ALGOL program, represented by a block of code, encapsulated not just a sequence of operations on data, but also the structure of the data itself. Simula took this concept a step further by introducing the notions of encapsulation and inheritance. The first iteration of Simula was a specialized simulation language, while Simula 67 marked its evolution into a general-purpose language.
In the late 1950s, Kristen Nygaard worked on Monte Carlo simulations at the Norwegian Defense Research Establishment (NDRE). At the time, these simulations were performed manually by employees, although some were run on early computers. Nygaard identified the need for a programming language that could describe these systems at a higher level. Thus, Simula was conceived to streamline the formal description of system interfaces and the handling of discrete events, such as state changes. The name Simula itself is short for “Simulation Language.”
During Simula’s early development, its creators envisioned a simulation system model composed of various stations, each with its own queues of customers. Customers were accessible at all stations, which meant each station could “borrow” a user from a queue, modify its variables, and then redirect it to another station’s queue. Stations could autonomously create or delete customers and were governed by the program itself. This iteration, sometimes referred to as Simula 0, never reached the implementation stage but set the foundation for the object-oriented languages to come.
Dahl, Nygaard and Simula taken from [4]
3.1 Simula 1: Pioneering Object-Oriented Design and Memory Management
Recognizing that implementing the language as a preprocessor for ALGOL would constrain their system of stations and customers, because of ALGOL's LIFO (Last In, First Out) stack of procedure calls, Dahl and Nygaard sought fundamental changes to the ALGOL language in 1963. They developed a memory allocation scheme based on a two-dimensional free-area list and moved the stack frame to the heap, enabling it to persist beyond the completion of a procedure call. Its local variables thus became instance fields, and the functions that operated on this data became methods.
In the summer of 1963, Dahl and Nygaard encountered the use of pointers in high-level languages through Bernard Hausner, one of the creators of the SIMSCRIPT language, who had joined their institute, the Norwegian Computing Center (NCC). They initially considered using process sets, a process search mechanism, and an expression selector as the sole means of process identification, but instead opted for a process pointer.
The creators of Simula aimed for the following: "The program security of the SIMULA language had to be of the same level as in ALGOL 60; any faulty program must either be rejected by the compiler (desired), or by checking during the execution period (if inevitable), or its behavior must be comprehensible through judgement based entirely on the language semantics, regardless of its implementation" [17].
The primary security aspect was ensuring that access to data was controlled by the compiler. For processes interacting only through non-local data, the problem was solved by the standard ALGOL access rules. However, there was also a need to access the contents of an object from outside that object: in a model containing several customers, an active customer might need access to data belonging to another customer.
The secondary security aspect concerned the deallocation of memory. For simplicity and efficiency of implementation, the creators wanted memory deallocation to be explicit, for example via a 'delete' operation or a destructor executed at the end of a process's lifecycle. However, this operation could only be made secure if at most one pointer could refer to a given process at any moment. Since such a strict rule would significantly limit flexibility during programming, Nygaard and Dahl decided to use reference counting on each object, an idea they credited to J. Weizenbaum, more precisely to his 1962 study "Knotted List Structures." They also added a garbage collector as a last resort. To prevent a potentially dangerous situation in which memory would fill up with unneeded data, the creators insisted that procedures and ordinary subblocks must self-destruct upon completion, as is the case in ALGOL 60.
The last security aspect concerned the rule that a variable's value is undefined after its declaration. With reference variables now in the language, the "undefined" value had to be given a concrete representation. Dahl and Nygaard solved this problem by assigning a neutral initial value to each variable.
In the previous language design version – Simula 0 – the stations were the active parts while the customers were passive. In the new design, the concepts of stations and customers were replaced by a single concept called the "process": a process could be either an active or a passive part of the simulation. A process could serve as a record-like data structure as well as a block of code executed as a coroutine. The lifetime of a process was not limited the way it would have been on ALGOL's LIFO stack, and the existing objects were tracked via non-standard pointers. The system included a built-in mechanism for controlling the execution of active processes. Dahl designed a scheme that automatically "returned" memory no longer in use. The garbage collection scheme was based on reference counting and free lists, together with a mechanism for finding interconnected cycles of objects whose memory could be freed.
The idea behind the process concept was to enable the decomposition of discrete event systems into components that could be defined separately. Basically, "every process had two aspects: it carried the data and executed the actions." [16] The description of a process was called an activity declaration. An activity was the class of processes described by that declaration, and a process was one instance of an activity.
A discrete event system was a set of processes whose actions and interactions made up the system's behavior. Processes would join or leave the system as a result of actions within the system itself. Process pointers referenced the memory location containing the process's data, as well as information about its current point of execution.
element John;
activity developer (isexperienced, thumbs);
Boolean isexperienced; integer thumbs;
begin - - - end;
John := new developer (false, 10);
An example of code in Simula
The value of the John variable is a reference to a developer process. If at some point no other value is assigned to John, and no other pointer refers to that process, the memory used by the process is freed. The symbol new distinguishes the different uses of activity identifiers. Several elements may refer to the same activity. A process can be in one of four states: active, suspended, passive, and terminated. As the simulation unfolds, the states of the processes change.
Only after this design version of Simula did Dahl and Nygaard begin implementing the language. According to Krogdahl – the author of "The Birth of Simula" – the implementation lasted from April 1964 to January 1965. Once it was complete, the language was used in the development of several different projects, during which Dahl and Nygaard considered which language improvements could turn Simula into a general-purpose programming language.
3.2 Record Handling: Pioneering Data Structures and Reference Types
In 1966, the influential British computer scientist Tony Hoare published a study, “Record Handling,” where he presented a comprehensive approach to managing data structures in general-purpose programming languages. Hoare’s work was motivated by the need to address the issues surrounding structured data, which had been driving the development of specialized programming languages. Hoare drew parallels between our understanding and modeling of the real world and our computer simulations of it. Each “object of interest” in the real world, as well as in computer-generated models, must be represented by a certain computing unit that can be manipulated by programs. Hoare termed these units “records.”
Each of these real-world or simulated objects, represented by a record, possesses one or more attributes that describe it (e.g., name, surname, year of birth); each attribute corresponds to a field in the record. Hoare also introduced the concept of a "record class," thereby defining the fundamental concepts of objects, records, attributes, and record classes.
A record class comprises a class name and fields that are associated with records of that class. The fields are constituted by names and types.
record class bankloan(integer loan number, principal; real rate of interest);
record class repayment(integer loan number, amount);
Example of class record from Hoare’s work [8]
The example above shows the declaration of two record classes: ‘bankloan,’ which consists of two integer-type attributes and one real-type attribute, and ‘repayment,’ which has two integer-type attributes (loan number and amount).
Hoare also proposed the idea of a "reference type," which is essentially a pointer to a specific record in memory. Variables and function parameters can be of this type, and such variables can be manipulated the same way as other variables; their values can be assigned or modified. Notably, a reference variable cannot change the record class it points to: the record class to which the reference may point is fixed at the point of its declaration.
Hoare further demonstrates the declaration of two reference-type variables: 'master,' a reference to the 'bankloan' record class, and 'transaction,' a reference to the 'repayment' record class.
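Hoare's records and references map naturally onto modern constructs. A rough Python analogue of the 'bankloan'/'repayment' declarations above might look as follows; the field names follow Hoare's example, while the concrete values are invented for illustration:

```python
from dataclasses import dataclass
from typing import Optional

# A record class bundles named, typed fields - Hoare's 'bankloan' and 'repayment'.
@dataclass
class BankLoan:
    loan_number: int
    principal: int
    rate_of_interest: float

@dataclass
class Repayment:
    loan_number: int
    amount: int

# A reference-type variable points to a record of one fixed class; Optional
# models Hoare's 'null' - the reference may point at no record at all.
master: Optional[BankLoan] = BankLoan(1001, 5000, 0.04)
transaction: Optional[Repayment] = None   # the 'null' reference

print(master.principal)   # 5000
```

The type annotations mirror Hoare's compile-time security: a checker can verify that 'master' only ever refers to a BankLoan record, while Optional makes the possibility of 'null' explicit.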
Pictogram from Hoare’s work [8]
Hoare's work also illustrated the concept of 'null' fields in a pictogram. Hoare had introduced 'null' in the ALGOL W language in 1965. In retrospect, he has referred to the 'null' pointer as his "billion-dollar mistake" and has expressed regret for introducing it. [18] Hoare proposed that the 'null' pointer be represented by an integer value outside the address space allocated for record storage.
Hoare also proposed the concept of a record subclass, an idea that would later form the basis for process subclasses in Simula 67. Hoare demonstrated that total security could be achieved in constructions where the reference type is known during compile time, and that a certain level of referential flexibility during execution time could be achieved via the record subclass concept.
The creators of Simula decided to merge the concept of a process with the new concept of object self-initialization in a new version of Simula. In this new context, they chose to replace the term ‘process’ with ‘object,’ reflecting the new focus on object-oriented programming.
3.3 Simula 67: Pioneering Object-Oriented Programming and Inheritance
The “Simula 67” version was a major step forward in the evolution of the Simula language, incorporating the ideas and improvements suggested by Tony Hoare and integrating them into Simula’s pre-existing framework. The concept of “record class” from Hoare’s work was adopted into Simula and called “object class”.
Starting from 1978, the authors emphasized that all class declarations should be prefixed, and they introduced a primitive external prefix object that provides "detach" and "resume" as local procedures. This idea of prefixing and class merging was a critical step towards the modern concept of inheritance: it allowed defining classes that were primarily meant to be used as prefixes.
The prefix classes were created with the intention of expressing some concepts of special-purpose Simula 1, available to programmers as modules exclusively meant for prefixing (and hence, for inheritance). This was a significant change in how programmers approached the design and structure of their code. The notion of these prefix classes became one of the foundational ideas of object-oriented programming, and Simula 67 was the first programming language to implement these concepts.
The idea of prefixing allowed for the creation of more complex and organized code structures, and it offered the ability to reuse code in a more efficient and systematic way. This, in turn, led to the development of more complex software systems. Furthermore, the integration of these concepts into Simula made the language more flexible and powerful, enabling it to handle a wider range of programming tasks.
4. Smalltalk: Crafting the Purest Object-Oriented Language and Emphasizing Dynamic Interpretation
Smalltalk is a seminal programming language that was developed in the 1970s at the Xerox Palo Alto Research Center (PARC). The language was created by a team of computer scientists, including key contributors like Alan Kay, Dan Ingalls, and Adele Goldberg.
Alan Kay, a key figure in Smalltalk’s development, introduced the term “object-oriented”. He envisaged a programming paradigm that encapsulates data and methods within objects and emphasizes dynamic message passing among these objects. Interestingly, Kay did not initially regard inheritance and polymorphism as fundamental aspects of object-oriented programming.
Smalltalk’s foundational concepts were influenced by several sources. The concept of classes, fundamental to code organization, was adopted from Simula 67. The LOGO programming language inspired turtle graphics, which uses a virtual cursor (the ‘turtle’) to draw vector graphics in a Cartesian coordinate system. Lastly, the interactive, direct-manipulation interfaces in Smalltalk were motivated by Ivan Sutherland’s pioneering Sketchpad program.
Sketchpad, developed in 1963 as part of Sutherland's Ph.D. thesis at the Massachusetts Institute of Technology, was an early computer graphics program. Sutherland was later awarded the prestigious Turing Award, in 1988, for his influential work.
A distinguishing characteristic of Smalltalk is its pure object orientation: everything in Smalltalk is an object, including class instances, simple data types, and even blocks of code (closures). This is in stark contrast to languages like Java or C++, where primitive data types are not treated as objects. Smalltalk didn’t adopt subclasses until the release of the Smalltalk-76 version.
The language's first iteration, Smalltalk-72, was developed from 1971 to 1975. Unlike many other languages, Smalltalk was dynamically typed and interpreted: its code was not compiled into machine language but interpreted by another program (much as Java bytecode is interpreted by the Java Virtual Machine). While Smalltalk-72 borrowed the concepts of classes and message passing from Simula, it had no fixed syntax, allowing its developers to build their own parsers for the token streams associated with object instances. However, this made the code nearly incomprehensible to anyone other than its creators.
After implementing and working with Smalltalk-72, the development team identified issues with performance and scalability. This led to significant upgrades in the subsequent version of the language, Smalltalk-76:
The syntax was made more rigid, hence becoming fixed.
Inheritance and subclasses were introduced, emulating advancements made in Simula. Inheritance allowed for the expansion of class specifications (for example, Integer as a subclass of Number) as well as the expansion of implementations (such as Dictionary as a subclass of Set). [10]
5. Evolution of C++: Merging Efficiency with Code Organization
Bjarne Stroustrup, a Danish computer scientist, pioneered C++, a robust programming language that built upon the C language. Initially known as “C with classes,” C++ was Stroustrup’s answer to effectively merge the efficient and flexible programming style of C with a sophisticated code organization methodology, similar to Simula.
C originated in 1972 as a procedural, general-purpose language that supported structured programming, recursion, and a static type system. Seeking to extend these capabilities, Stroustrup began his work on "C with classes" in 1979, drawing inspiration from various languages. He contemplated Modula-2, Ada, Smalltalk, Mesa, and Clu as possible alternatives to C and as potential wellsprings of new features. In the end, he decided to extend C, infusing elements from these languages into C++. Simula inspired the concept of classes; ALGOL 68 contributed operator overloading and more flexible variable declarations, while BCPL motivated the inclusion of // comments. In 1983, "C with classes" was renamed C++, and its first version was released in 1985.
During his studies at the University of Aarhus and the University of Cambridge, Stroustrup found Simula incredibly helpful. It facilitated the construction of programs from smaller code units and classes, simplifying development and debugging. Moreover, extensive type checking ensured that the error rate did not grow in proportion to the program’s size. However, as the number of classes grew, Simula’s implementation became problematic, leading to long compile times and poor program performance, primarily due to runtime type checks and memory management overheads.
During his work at Bell Laboratories, Stroustrup grappled with the task of dissecting the UNIX kernel for distribution over a local area network. The challenge lay in modeling the complex system and facilitating communication between its components. To solve this, he devised Cpre, a preprocessor that extended C with Simula-like classes, in October 1979. By early 1980, the first versions of “C with classes” were released, aiming to retain the usability of C while addressing its shortcomings in code organization. However, in a bid to avoid performance issues, “C with classes” skipped runtime type checking, making it no safer than C. Nevertheless, it introduced a number of enhancements to the C language, including:
Classes and derived classes
Public and private access control
Constructors and destructors
Call and return functions that allowed implicit function calls
Friend classes
Type checking and function argument conversion
The year 1981 marked a pivotal moment for C++ with the introduction of groundbreaking functionalities: inline functions, default arguments, and allocation operator overloading. These additions expanded the language’s capabilities, paving the way for its remarkable growth.
Derived classes, a hallmark of C++, drew inspiration from Simula’s prefix classes and shared resemblances with Smalltalk’s subclasses. By integrating concepts from both derived and base classes, C++ provided a powerful framework for class inheritance. Notably, “C with classes” lacked support for virtual functions. To address this, Bjarne Stroustrup turned to the protection mechanisms of the Cambridge CAP computer, using them as a springboard for implementing protection in C++. The class became the fundamental unit of protection, confining access to fields and functions within the class declaration.
In its default state, C++ granted private access rights to all fields and methods. Access could be extended by declaring functions in the public part of the class or designating specific functions or classes as friend functions or friend classes. Initially, only classes could be labeled as friends, enabling access to all functions within the friend class. However, subsequent enhancements allowed for selective granting of friend status to specific functions. Additionally, C++ supported both public and private inheritance. Public inheritance allowed derived classes to access the interface implemented by a base class, while private inheritance treated the base class as an implementation detail, limiting access to public members unless explicitly implemented in the derived class’s interface.
The success of “C with classes” can be attributed to its design philosophy: it enabled better organization of large-scale programs without sacrificing efficiency or demanding drastic cultural changes within organizations. Nevertheless, its success remained moderate, partly because it offered a relatively small set of new features over C and was implemented with preprocessor technology. Recognizing the need for further progress, Stroustrup set out to design an extended version of the language, initially referred to as C84. However, concerns arose that people might start calling the new language simply C, causing confusion. Stroustrup eventually settled on the name C++, a choice driven by its brevity, its versatile interpretations, and its avoidance of the “adjective + C” pattern. The name was suggested by Rick Mascitti and adopted in 1983. [14]
During the renaming process, several crucial updates propelled C++ forward:
Virtual functions
Function name and operator overloading
References
Constants
Free-store (heap) memory management under user control
Improved type checking
Removal of call and return functions
Virtual functions, inspired by Simula, revolutionized the language by enabling dynamic binding and indirect function calls. Each class carries a table of pointers to its virtual functions, so a call made through a base-class object can be resolved at run time to the right implementation.
References played a crucial role in supporting operator overloading in C++. While C passed function arguments by value, resulting in redundant object copies during function calls, references offered a more efficient alternative. Stroustrup’s experience eliminating errors in ALGOL 68 programs affirmed the value of references, which remain bound to their objects after initialization. Because C++ supported both pointers and references, no additional operations were needed to differentiate operations on a reference from operations on the referenced object, setting it apart from Simula and ALGOL 68.
The introduction of the new and delete operators in C++ marked a significant departure from the traditional malloc and free functions. Inspired by Simula, the new operator streamlined memory allocation by assigning heap memory and invoking the constructor in a single step. The delete operator was its crucial counterpart: since C++ has no garbage collector, it lets the programmer invoke the destructor and release memory explicitly. [14]
With these monumental advancements, C++ emerged as a powerhouse programming language, empowering developers to build efficient, well-organized systems.
6. Conclusion
As we conclude our historical overview of the genesis of Object-Oriented Programming, we can’t help but marvel at the immense impact it has had on the world of software development. From the visionary ideas of Ole-Johan Dahl and Kristen Nygaard with Simula to the groundbreaking work of Alan Kay and Adele Goldberg with Smalltalk, and the subsequent evolution of C++ by Bjarne Stroustrup, the foundations were laid for a paradigm that revolutionized the way we think about structuring and organizing code.
Object-Oriented Programming’s ability to encapsulate data and behavior within objects, foster code reusability through inheritance and polymorphism, and provide a modular approach to software design has proven instrumental in tackling the complexities of modern-day programming challenges. The principles and concepts born out of this rich history continue to shape the development of cutting-edge applications, frameworks, and systems.
As we move forward in the ever-evolving world of programming, it is essential to pay homage to the remarkable pioneers who shaped Object-Oriented Programming and acknowledge their invaluable contributions. By understanding the historical context and evolution of OOP, we gain a deeper appreciation for the power and versatility it offers as a programming paradigm.
We hope this historical overview has provided you with valuable insights and sparked your curiosity to explore further into the fascinating world of Object-Oriented Programming. The journey we’ve embarked upon today is a testament to the human drive for innovation and the transformative nature of ideas. So, let’s continue to build upon the legacy of the past and embrace the endless possibilities that lie ahead in the realm of Object-Oriented Programming.
7. References:
[1] “Object-Oriented Programming,” Britannica (2021)
[5] “Go to Statement Considered Harmful,” E. Dijkstra (1968)
[6] “The Birth of Simula,” Stein Krogdahl (2003)
[7] “The Early History of Smalltalk,” Alan C. Kay (1993)
[8] “Record Handling,” C.A.R. Hoare (1965)
[9] “The Birth of Object Orientation: The Simula Languages,” Ole-Johan Dahl (2001)
[10] “The Past, The Present, and The Future of Smalltalk,” Peter Deutsch (1989)
[11] “Structured Programming,” O. J. Dahl, E. W. Dijkstra, C. A. R. Hoare (1972)
[12] “Discovering Simplicity,” Eric Normand (2021)
[13] “Sketchpad: A Man-Machine Graphical Communication System,” Ivan Sutherland (1963)
[14] “A History of C++: 1979–1991,” Bjarne Stroustrup (1991)
[15] “Java 2, Volume I – Fundamentals,” Cay S. Horstmann, Gary Cornell (2010)
[16] “Simula – an ALGOL-Based Simulation Language,” Ole-Johan Dahl and K. Nygaard (1966)
[17] “The Development of the Simula Languages,” Ole-Johan Dahl and K. Nygaard (1981)
[18] “Null References: The Billion Dollar Mistake,” C. A. R. Hoare (2009)
About the Author:
Nenad Stojkovic is a seasoned software architect and a tech excellence officer at BrightMarbles, and a PhD student at the Faculty of Sciences in Novi Sad. He’s an all-round language enthusiast and learner, whose scope of knowledge extends beyond programming languages, to English and German. Music is the fourth language he speaks, and while his guitar gently weeps, Nenad travels the world, admiring the amazing works of architecture.