The Cycles of Computing
A brief history of how we use computers
The Booms and Busts of Silicon #
Tech has always moved in cycles of innovation. In the 1980s, we saw the boom of the personal computer (think Microsoft 📎 ¹ and Apple 🍎), whereas the ’90s and ’00s saw the rise and fall of internet companies (from Amazon 📦 and Google 🔎 to Pets.com 🐶).
Since it’s harder to categorize the past decade so soon, I’ll offer my personal take: I think the 2010s belonged to the surge of cloud computing ☁️ and the power of economies of scale.
Visionaries at places like AWS noticed that every business would want to use servers at industrial scale, but might not have the resources to do so. Through the power of virtualization, large companies could remotely rent out their physical servers to smaller businesses, and in turn, everyone could lease a piece of a large server for a fraction of the price ².
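To make that value proposition concrete, here’s a minimal sketch of the idea in Python. All the names here (PhysicalServer, rent_slice, the pricing rule) are made up for illustration; no real cloud provider prices things this simply.

```python
from dataclasses import dataclass

# Toy model of virtualization economics: one physical server,
# many tenants renting fractional "virtual" slices of it.
# Every name here is illustrative, not any real cloud API.

@dataclass
class Slice:
    tenant: str
    cpus: int
    ram_gb: int

class PhysicalServer:
    def __init__(self, cpus: int, ram_gb: int, monthly_cost: float):
        self.cpus, self.ram_gb = cpus, ram_gb
        self.monthly_cost = monthly_cost
        self.slices: list[Slice] = []

    def _used(self) -> tuple[int, int]:
        return (sum(s.cpus for s in self.slices),
                sum(s.ram_gb for s in self.slices))

    def rent_slice(self, tenant: str, cpus: int, ram_gb: int) -> float:
        """Allocate a virtual slice and return its monthly price:
        a pro-rated share of the physical machine's cost."""
        used_cpus, used_ram = self._used()
        if used_cpus + cpus > self.cpus or used_ram + ram_gb > self.ram_gb:
            raise ValueError("not enough capacity left on this server")
        self.slices.append(Slice(tenant, cpus, ram_gb))
        # Price by the larger of the two resource fractions consumed.
        fraction = max(cpus / self.cpus, ram_gb / self.ram_gb)
        return self.monthly_cost * fraction

server = PhysicalServer(cpus=64, ram_gb=512, monthly_cost=2000.0)
print(server.rent_slice("small-biz-a", cpus=4, ram_gb=16))   # 125.0
print(server.rent_slice("small-biz-b", cpus=8, ram_gb=64))   # 250.0
```

Pricing by the larger of the two resource fractions is just one plausible rule; the point is that many tenants can share a $2,000/month machine without any of them paying anywhere near $2,000.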
A Sharing Economy #
Conceptually, this is awfully similar to “timesharing”. Although timeshares are now notorious for real estate scams 🏚️, older computers and mainframes worked on a similar schedule.
Since these mainframes would often cost hundreds of thousands of dollars, people tried to maximize their use by running programs at full capacity, 24/7. Crafty young programmers ³ ran their programs at the crack of dawn, so they had a chance to use a portion of the expensive equipment outside of business hours 🕐.
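As a rough illustration of that schedule, here’s a toy off-hours allocator in Python. It’s a sketch under cartoonish assumptions (whole-hour slots, a fixed 9-to-5 business day, names like schedule_off_hours invented for the example); real timesharing systems sliced time far more finely and with real priority schemes.

```python
# Toy sketch of mainframe-era timesharing: the machine is too
# expensive to sit idle, so the off-hours slots get handed out
# to whoever queued up first. Purely illustrative.

BUSINESS_HOURS = range(9, 17)  # hours reserved for paying daytime work

def schedule_off_hours(jobs: list[str]) -> dict[int, str]:
    """Assign queued jobs, in order, to the free hours of a 24-hour day."""
    free_hours = [h for h in range(24) if h not in BUSINESS_HOURS]
    return dict(zip(free_hours, jobs))

queue = ["basic-interpreter-build", "student-fortran-run", "chess-program"]
for hour, job in sorted(schedule_off_hours(queue).items()):
    print(f"{hour:02d}:00  {job}")   # the earliest risers get 00:00-02:00
```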
The difference between the age of mainframes and the cloud is that computers are no longer prohibitively expensive. In fact, almost everyone in the developed world seems to be connected through multiple computers and smart devices that are millions of times faster than a classic mainframe at a fraction of the cost. But cloud computing still offers a great value proposition in convenience, and often even in cost.
On the other hand, we see millions, if not billions, of devices around the world sitting underutilized for most of the day. The average American with an iPhone 🤳, iPad 📱, Apple Watch ⌚, and MacBook 💻 couldn’t possibly be using all of them at the same time. So for the 2020s and 2030s, I think we’ll see the lines between client- and server-side machines blur and blend together. That’s why, for now, I’m hedging my bets on “Web3” and the quiet revenge of personal computing.
Notes #
2. Virtualization basically means you are making a physical computer look like multiple virtual computers. Maybe a more tangible example: imagine you have a room to rent, and instead of letting one person rent it, you put up some dividers and let multiple people rent it. You can also optimize this by taking boarders with opposite schedules, e.g. a person with a night shift and another with a day shift.
3. Bill Gates seemed to have a knack for getting “free mainframe time” as well: https://en.wikipedia.org/wiki/Bill_Gates#Early_life_and_education