

A Brief History of Technology Evolution

The only limit to our realization of tomorrow will be our doubts of today.

- Franklin D. Roosevelt

The future is not something we enter. The future is something we create.

- Leonard I. Sweet

Technology is cool, but you've got to use it as opposed to letting it use you.

- Prince
Leaders who study the history of computing and communication technologies understand that staying ahead of the next evolutionary curve confers a significant competitive advantage. These technologies are tools whose mastery by competent people can position any organization for long-term success. The evolution of technology is a constant force that disrupts both people and organizations; it is the natural order of progress. In this note, I take a trip down memory lane to revisit key evolutionary shifts in computing and communication and their impacts on society. The objective is for leaders to draw insights from these shifts and use them to navigate the changes still to come.
Here are the major evolutionary waves in computing and communication technology that have shaped the world and propelled our advancement:

1940s: The Mainframe Era

The concept of the modern computer took shape with the introduction of the Turing machine. In the mid-1930s, Alan Turing, a brilliant British mathematician, formulated a theoretical model of a computing machine that used a set of rules (instructions) to manipulate symbols on a tape (storage or memory), laying the groundwork for subsequent advancements. By the mid-1940s, large room-sized computing machines inspired by this model emerged. These early precursors to modern mainframes included the Harvard Mark I and ENIAC in the USA and Colossus in the UK. Built from electromechanical, vacuum-tube, and electronic components, these machines lacked the time-sharing capabilities of later mainframes.
In the early 1950s, early commercial mainframes such as the UNIVAC I and IBM 701 ushered in a new era with improved processing capabilities. They were programmed with punch cards and, later in the decade, with high-level languages such as FORTRAN and COBOL. In 1964, IBM introduced its System/360 series, representing a new generation of mainframes with enhanced computing power, efficient resource allocation, and advanced time-sharing capabilities. Industries such as energy and finance continue to rely on mainframes in the 2020s, using languages such as Ada, FORTRAN, and COBOL to power critical operations.

1960s: The Minicomputer Era

Primarily because of their high cost, only large institutions in academia, banking, energy, the military, and government could afford mainframes. The mid-1960s saw the emergence of minicomputers, pioneered by Digital Equipment Corporation, representing the first steps toward the democratization of computing. While not as powerful as mainframes, minicomputers offered greater agility, and their compact size and lower cost made them attractive options. Built with transistors and, later, running the UNIX operating system, they were programmed using traditional mainframe languages alongside newer languages like Pascal and BASIC.
The early 1970s saw the emergence of new languages such as C, followed in the 1980s by object-oriented languages like C++. These languages played a pivotal role in making minicomputers and subsequent machines more widespread. The quest for smaller, more agile computing intensified, culminating in the mid-1970s with the release of the Altair 8800 by MITS (Micro Instrumentation and Telemetry Systems), offered as a build-it-yourself kit. BASIC was among the early programming languages for the Altair 8800, sparking widespread adoption and the formation of computer clubs.

1980s: The Personal Computer Era

The late 1970s and early 1980s witnessed the advent of true personal computers, including the Apple II, the IBM PC, and compatibles from companies like Compaq, HP, and Dell. Built on integrated circuits from Intel and others, these personal computers (PCs) revolutionized computing by democratizing access. They followed the von Neumann architecture, featuring a central processing unit (CPU), registers, cache, memory, and storage, an instruction set for programming, and input/output ports. These machines could be networked on a peer-to-peer basis through daisy-chain connections and newly defined networking protocols. They could also connect to telephone networks through their ports, although telephony was still in its first generation (1G), offering only voice services at the time.
Popular operating systems included command-line systems such as MS-DOS (Microsoft Disk Operating System) and graphical user interface (GUI) systems like Mac OS and Windows, with Linux arriving in the early 1990s. PC programming languages ranged from C and C++ to Visual Basic to traditional languages such as COBOL and FORTRAN. This era saw the delineation between RISC (Reduced Instruction Set Computer) designs, which favor small sets of simple, fast instructions, and CISC (Complex Instruction Set Computer) designs, which offer larger sets of more complex instructions. It also witnessed the widespread adoption of client-server architectures for resource-sharing and database applications.

1990s: The Internet Era

In 1969, the United States Department of Defense's Advanced Research Projects Agency (ARPA, later DARPA) funded the creation of the Advanced Research Projects Agency Network (ARPANET), the precursor to today's Internet. ARPANET initially comprised four nodes, starting at UCLA and expanding to the Stanford Research Institute, UC Santa Barbara, and the University of Utah. In subsequent years it grew to include universities such as UC Berkeley, MIT, Carnegie Mellon, Harvard, and Case Western Reserve. The first email was sent in the early 1970s. With the development of network protocols like TCP/IP, PC modems, the World Wide Web, and web browsers, the Internet was democratized in the early 1990s. During that period, telephony entered its second generation (2G), offering voice and text capabilities. With the advent of Voice over IP, telephone costs decreased, information became more accessible, and peer-to-peer communication increased. The late 1990s witnessed the rise in popularity of programming languages like Java from Sun Microsystems and Python.

2000s: Cloud and Mobile Era

Cloud computing gained prominence in the mid-2000s with the introduction of Amazon Web Services in 2006, followed by Microsoft Azure in 2010, Google Cloud, and Alibaba Cloud in China. Many of the underlying features of cloud computing, however, have roots in the mainframe era, including time-sharing, resource sharing, and the virtualization of computing and communication resources through virtual machines (VMs). Concepts like containers and microservices are more recent additions. The cloud is essentially an implementation of grid or utility computing, in which resources such as compute, storage, networking, databases, and software applications are accessed over the internet on a pay-as-you-use basis. This shift has significantly reduced the cost of computing, networking, and storage, eliminating the need to purchase large servers outright.
The launch of Apple's first iPhone in 2007, coupled with significant advances in networking and telephony, revolutionized access to information. Third-generation (3G) and fourth-generation (4G) mobile services, offering voice, text, and video, enhanced the communication experience and accelerated changes in business models across industries.

2010s: Data Analytics and Cybersecurity

By 2010, an estimated 1.6 billion smartphones had been sold. These smartphones, along with social media platforms like MySpace, Facebook, and YouTube, as well as Internet of Things (IoT) devices (industrial machines and home devices connected to the Internet), collectively generated more than 62 zettabytes of data by 2020 (a zettabyte is 10^21 bytes). This massive influx of data created fertile ground for data analytics and the emergence of machine learning (ML), a branch of artificial intelligence. ML algorithms, which are primarily statistical, enable machines to learn from data, identify patterns, and make predictions with relatively good accuracy. Popular ML algorithms include linear regression, decision trees, support vector machines (SVMs), K-nearest neighbors (KNN), and ensemble methods like random forests.
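To make this concrete, here is a rough sketch of that learn-from-data workflow in Python, fitting a random forest on a synthetic dataset with the scikit-learn library. The library choice, dataset, and parameters are illustrative assumptions on my part, not specifics from any platform mentioned above.

    # Minimal sketch of the train/predict workflow described above,
    # using a random forest (an ensemble of decision trees) from scikit-learn.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    # Synthetic data standing in for real business data (illustrative only).
    X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

    # Fit the model on historical data, then predict on unseen data.
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)
    predictions = model.predict(X_test)
    print("Held-out accuracy:", accuracy_score(y_test, predictions))
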
The proliferation of internet-connected devices and machinery also gave rise to cybersecurity concerns, with threats such as data breaches, ransomware, zero-day exploits, and other network-related crimes becoming nightmares for IT managers.
Web 3.0, the next generation of the internet, focuses on network decentralization (distributed computing) and on the question of who controls and owns data. The concept emerged around the same time blockchain technology was introduced through Satoshi Nakamoto's famous 2008 whitepaper on Bitcoin. Cryptocurrencies, smart contracts, distributed computing, and the decentralization of data are expected to shape our lives for years to come.
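For the curious, the core bookkeeping idea behind a blockchain can be sketched in a few lines of Python: each block records the hash of the previous block, so any tampering breaks the chain. The block contents below are made-up placeholders, and real blockchains add consensus mechanisms on top of this.

    # Minimal sketch of hash-linked blocks; contents are illustrative placeholders.
    import hashlib
    import json

    def block_hash(block: dict) -> str:
        # Hash a block's contents deterministically with SHA-256.
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    chain = [{"index": 0, "data": "genesis", "prev_hash": "0" * 64}]
    for i, data in enumerate(["tx: A pays B", "tx: B pays C"], start=1):
        chain.append({"index": i, "data": data, "prev_hash": block_hash(chain[-1])})

    # Verification: recompute each hash and compare with the stored link.
    valid = all(chain[i]["prev_hash"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))
    print("chain valid:", valid)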

2020s: Generative AI and Robotics

Despite the growing sophistication of cybercriminals, the computing and communication industry remained resolute in strengthening network protocols, educating users, and patching vulnerable software. Meanwhile, machine learning continued to make inroads and gain wider acceptance, with models trained through supervised, unsupervised, or reinforcement learning.
ML eventually evolved into deep learning (DL), which trains multi-layer artificial neural networks on large amounts of data so they can make accurate predictions by learning complex patterns, representations, and relationships in that data. Convolutional neural networks (CNNs), recurrent neural networks (RNNs), and long short-term memory (LSTM) networks are among the most popular DL architectures. Frameworks such as TensorFlow, PyTorch, and Caffe are commonly used to build, train, and deploy these models, with Python emerging as the predominant programming language for model development; other languages like R, C++, Java, and Julia also find use among ML developers. GPUs (graphics processing units) are primarily used for training models, while CPUs often handle inference, that is, applying the trained model to new data.
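As a minimal sketch of that train-then-infer split, assuming PyTorch and random stand-in data (the tiny network and numbers are illustrative, not from any real project), the workflow looks roughly like this:

    # Minimal PyTorch sketch of the train-then-infer loop described above.
    import torch
    import torch.nn as nn

    # A tiny feed-forward network (a real CNN/RNN/LSTM would replace this).
    model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
    loss_fn = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    # Random stand-in data: 256 examples, 10 features, 2 classes.
    X = torch.randn(256, 10)
    y = torch.randint(0, 2, (256,))

    # Training: typically done on a GPU when one is available.
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model, X, y = model.to(device), X.to(device), y.to(device)
    for epoch in range(20):
        optimizer.zero_grad()
        loss = loss_fn(model(X), y)   # forward pass and loss
        loss.backward()               # backpropagation
        optimizer.step()              # weight update

    # Inference: apply the trained model to a new example (often on CPU).
    with torch.no_grad():
        new_example = torch.randn(1, 10).to(device)
        prediction = model(new_example).argmax(dim=1)
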
In late 2022, the world at large was introduced to generative AI (GenAI) through OpenAI's ChatGPT. An earlier family of generative models, generative adversarial networks (GANs), operates with two subnetworks, a generator and a discriminator, locked in an iterative contest: the generator creates new data while the discriminator tries to distinguish it from real data, and the loop continues until the generated data is hard to tell apart from the real thing. By 2024, GenAI models trained on real data from multiple sources and modalities (text, speech, images, audio, and video) could generate near-perfect outputs across these domains, leading to the emergence of multimodal AI.
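That generator-versus-discriminator contest can be sketched in a few lines of PyTorch. The tiny networks and the stand-in "real" data below are illustrative assumptions; production GANs are far larger, but the adversarial loop has the same shape:

    # Schematic PyTorch sketch of the adversarial loop described above.
    import torch
    import torch.nn as nn

    latent_dim, data_dim = 16, 8
    generator = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, data_dim))
    discriminator = nn.Sequential(nn.Linear(data_dim, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())
    bce = nn.BCELoss()
    g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
    d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

    for step in range(1000):
        real = torch.randn(64, data_dim)              # stand-in for real training data
        fake = generator(torch.randn(64, latent_dim)) # generator proposes new data

        # Discriminator: learn to score real data high and generated data low.
        d_loss = bce(discriminator(real), torch.ones(64, 1)) + \
                 bce(discriminator(fake.detach()), torch.zeros(64, 1))
        d_opt.zero_grad(); d_loss.backward(); d_opt.step()

        # Generator: learn to produce data the discriminator scores as real.
        g_loss = bce(discriminator(fake), torch.ones(64, 1))
        g_opt.zero_grad(); g_loss.backward(); g_opt.step()
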
The journey to GenAI began with the 2017 publication by Google researchers of the paper "Attention Is All You Need," which introduced the transformer, a new DL architecture. Transformers convert text into sequences of tokens and use a self-attention mechanism to weigh how strongly each token should attend to every other token as the sequence passes through the layers of the network. Initially applied to natural language processing (NLP), transformers have since expanded into computer vision and other domains, laying the groundwork for advances in image and audio generation. Leading examples include the Generative Pre-trained Transformer (GPT) from OpenAI and Bidirectional Encoder Representations from Transformers (BERT) from Google.
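At the heart of each transformer layer is scaled dot-product self-attention. Here is a minimal sketch of that computation; the sequence length, embedding size, and random token embeddings are illustrative assumptions, since a real model learns its embeddings and projection weights from data:

    # Minimal sketch of scaled dot-product self-attention, the core of a transformer layer.
    import torch
    import torch.nn.functional as F

    seq_len, d_model = 5, 64                 # 5 tokens, 64-dimensional embeddings
    tokens = torch.randn(seq_len, d_model)   # placeholder token embeddings

    # Learned projections produce queries, keys, and values for each token.
    W_q, W_k, W_v = (torch.nn.Linear(d_model, d_model) for _ in range(3))
    Q, K, V = W_q(tokens), W_k(tokens), W_v(tokens)

    # Each token scores every other token; the scores become attention weights.
    scores = Q @ K.transpose(0, 1) / (d_model ** 0.5)   # shape (seq_len, seq_len)
    weights = F.softmax(scores, dim=-1)                 # attention weights per token
    output = weights @ V                                # weighted mix of value vectors
    print(weights.shape, output.shape)

Stacking such attention layers with feed-forward layers, and learning the projection weights from data, is what gives models like GPT and BERT their power.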

2030s: Artificial General Intelligence (AGI) and the Metaverse

Artificial intelligence (AI) is poised to impact human civilization across all domains more profoundly than earlier waves such as smartphones, cloud computing, social media, and big data, and GenAI represents just the beginning of this transformation. While current GenAI models operate within specific contexts and predefined parameters, the next step in AI evolution is Artificial General Intelligence (AGI). AGI aims to create models capable of acquiring and applying knowledge at a level similar to or surpassing human capabilities. Such agents would be self-directed and able to learn new skills in new contexts autonomously.
NVIDIA's CEO has predicted that by 2030, AI agents could outperform most humans by as much as 10 percent on certain aptitude tests, such as the bar exam or logic tests. Full-fledged AGI would be capable of performing tasks akin to human cognition without relying solely on pre-trained data. Achieving it, however, will require a convergence of advanced technical models with biology and psychology to emulate human sensibilities, consciousness, and contextual understanding. That convergence would lead to what is often termed the "singularity," a pivotal moment in human civilization.
The development of virtual reality (VR), augmented reality (AR), and mixed reality (MR), coupled with advancements in next-generation networking communication like 5G or 6G, will synergize with GenAI and AGI to diagnose and remotely resolve complex problems. In the next decade, these technologies, combined with AI, will revolutionize every industry, including biotechnology, healthcare, energy, and business productivity.
Biotechnology, in particular, will benefit from the integration of AI, giving rise to generative biology (GenBio). GenBio will accelerate the discovery of new molecules, biochemical processes, and AI-based biomedical advances. While it presents opportunities for progress, GenBio will also require robust frameworks for governance, data privacy, and risk management. Public and private institutions must be proactive in putting these safeguards in place to navigate the potential unintended consequences of AI's exponential growth. Leaders must navigate this technological super-cycle with foresight and preparedness for the profound transformations it will bring.
The evolution of technology represents a continuum of interconnected advancements, from mainframes to smartphones and their associated software. These transformations have brought about significant positive changes in human civilization, yet the benefits of each technological evolution have not been equally distributed across society. Artificial intelligence stands as the current pinnacle of this evolution, poised to impact every facet of our lives and businesses. Leaders must therefore approach the implementation of AI-based practices with intention, prepared and committed not only to harness the potential benefits but also to address and mitigate any adverse side effects that may arise.
Until we meet again, stay current on the future of technology.
Fal Diabaté
Managing Partner, Barra Advisory Group

