And onto your lap and desk…
The impact of innovation on markets, and in turn our daily lives, cannot be overstated. Markets and economies commonly pivot on the perturbations of innovation, at times very abruptly, but more often with subtlety.
Long before we enjoyed common commercial access to it, the Internet was shaping aspects of the economy and society, but its pervasiveness was not as acutely felt in daily household life. However, at a time when the affordability of home/personal computing met the proliferation of Microsoft’s Windows 95 and Windows 98 operating systems in the 1990s, the right conditions existed for rapid uptake of internet access, largely propelled by the World Wide Web. The resulting impact on computing and daily life laid the groundwork for innovations we see as common today: online communities, search engines, eCommerce, and the personalization of these experiences.
Okay, but what about that “chip on my shoulder?” I am getting there. The principal technology vector that led to the proliferation of both home and organizational computing arrived via what we commonly refer to as the “PC” or the “Desktop.” This familiar configuration of computing access is founded on the Intel 8086 processor, dating back to 1978. Built on a Complex Instruction Set Computing (CISC) architecture, these computers were designed to operate with interchangeable parts and peripherals, allowing a versatile set of uses in either the home or office setting. We, collectively, enjoyed a multi-decades-long ride on Moore’s Law that saw increases in speed, reductions in size (think laptops), and increased affordability for all of the important components in the personal computer. We generally refer to Central Processing Units as “microchips” due to the miniaturization inherent in Moore’s Law. Thus, leading up to the turn of the century, your relationship to computing was largely that your device sat on your desk or on your lap.
Although the idea of portability has been with us for as long as personal computing itself, the convergence of several key features – telephony, network connectivity, onboard sensors (particularly for location and image capture), and touch interfaces – created a new phenomenon that has been truly transformative. Increasingly, relationships to computing and information are forged because computing is wherever you are, accessible through your “phone.”
“Okay, and that chip on my shoulder?” The phone as a descriptive metaphor, born of the momentum and service provenance of the cell phone infrastructure, is my cute way of acknowledging that our idea of multitasking with a traditional phone is to hold the receiver up to our ears with our shoulders while keeping our hands free to do other things. So, guilty as charged for using a bad pun as a lead-in. Yet, the strength of the metaphor – your access to computing is via a “phone” – is noteworthy.
Mobile computing, via cell phones, tablets, wearables, and the proximity of IoT (Internet of Things) connectivity, has been driven by an entirely different set of computing principles, down to the microchip level. Our mobile devices are governed by fundamentally different resource needs, dictated largely by the precious resource of battery life. At the heart of this revolution are ARM (Advanced RISC Machines) processors. Reduced Instruction Set Computing (RISC) is a simplified counterpart to CISC that focuses on doing more with less (which is the entire premise of mobile computing). Specifically, lower power draw and other simplifications are well-suited to the mobile space.
My Lap and Desk?
Also shortly after the turn of the century, and to broaden market share, Apple made the decision to leave its long-standing RISC-based PowerPC processors in order to harness the possibilities of its Unix core and to capture the economies-of-scale benefits of the x86 Intel CISC architecture that was pervasive due to the proliferation of PCs.
Mobile computing has changed all that. Starting late last year, Apple began the transition back to a home-grown ARM-based architecture, with its first offering, the M1 processor, available in laptops and some desktop models. There are many advantages and disadvantages in the RISC vs. CISC equation, but that is fodder for another post. What is remarkable about this development is that Microsoft has indicated a desire to move to its own chips for both its tablet products and data centers. The trend is catching on.
Whereas mobile computing was once a supplement to computing access, it is now driving computing even at the hardware level. Beyond mobility, we are entering an era where the easy gains of Moore’s Law are drying up and, absent some other paradigmatic shift or engineering breakthrough, computing improvements are likely to come from smaller arrays of cooperative, distributed computing rather than monolithic approaches. Apple’s move with the M1 proves that innovation still matters. This shift will be solidified by the same economies of scale that propelled the x86 architecture in the Personal Computer age. Regardless of how, or whether, Intel can muster a response, it is likely that this ship has sailed. Moreover, as with many innovations, the impact will scarcely be felt by the consumer, as things will “just work.” What has changed is that “Personal Computer” is now the more appropriate moniker for your mobile device, along with the overarching convergence of computing.
In parting, this gives occasion to reflect on what these trends mean for business education. That, too, leaves ample room for future discussion.