Peak Laptop

Moore’s law observes that the number of transistors on a chip doubles roughly every two years. Experts argue over whether we’ll be able to keep up this exponential pace for much longer. There are years where we seem to fall short, only for another massive architecture improvement to deliver outsized gains.
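
As a back-of-the-envelope illustration of what that doubling rate implies, here’s a minimal Python sketch; the 1971 Intel 4004 starting point is a well-known anchor, and treating the two-year doubling as exact is my simplification:

```python
# Rough sketch of Moore's law: transistor counts doubling every ~2 years.
def transistors(years_elapsed, start=2_300, doubling_period=2):
    """Project a transistor count after `years_elapsed` years of doubling."""
    return start * 2 ** (years_elapsed / doubling_period)

# The Intel 4004 (1971) had ~2,300 transistors; 40 years of doubling
# every 2 years projects a roughly million-fold increase.
print(f"{transistors(40):,.0f}")  # -> 2,411,724,800 (~2.4 billion)
```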

We used to see Moore’s law reflected in our personal computers at home, but that progress has largely stalled over the past 5-7 years.

The processing improvement from my 2005 PowerPC to my 2010 Xeon was astounding: close to a 5x or 10x gain in overall performance. But lately, progress has hit a wall. The dual-core i7 in my 2012 MacBook Air isn’t much slower than the Skylake i7 in my new MacBook, a machine that’s almost 5 years newer. So what gives?

Internet Broadband Growth Is Slowing

Average bandwidth hasn’t made major strides in the US since the introduction of 720p HD video in the mid-2000s. While 1080p has been available for quite some time, most service providers still don’t deliver a connection that makes streaming full-HD content a quality experience.

While 4K HDR offers a superior viewing experience, I just don’t see consumer demand pushing ISPs to make major infrastructure investments anytime soon. Most of us would love a 4K TV and a 1Gbps connection for streaming UHD video, but we aren’t going to pay $3-4k for a TV when we can get a solid one for less than $500, or pay for gigabit when a 100 Mbps connection costs $50/mo. Unfortunately, that’s “good enough” for the average consumer.
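
To put rough numbers on “good enough”, here’s a quick sketch; the per-quality bitrates are commonly cited streaming recommendations and are my assumption, not figures from this piece:

```python
# Approximate bandwidth needed to stream at each quality level (Mbps).
# These are commonly cited streaming recommendations (assumption).
RECOMMENDED_MBPS = {
    "720p HD": 3,
    "1080p Full HD": 5,
    "4K UHD": 25,
}

connection_mbps = 100  # a typical ~$50/mo. plan

for quality, needed in RECOMMENDED_MBPS.items():
    streams = connection_mbps // needed
    print(f"{quality}: ~{needed} Mbps per stream -> {streams} simultaneous streams")
```

Even a single 4K stream fits comfortably within 100 Mbps, which is exactly why few households feel any pull toward a 1Gbps upgrade.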

So how does this relate to slower processor innovation in laptops? With so little 4K content being streamed, there just isn’t much need for top-of-the-line processors and GPUs.

Cloud Computing

A decade ago, it was common to have 15 or more applications running in your dock: Microsoft Word & Excel, Photoshop, iTunes, messaging apps, Safari, etc… You needed the extra performance on your machine to keep up with all of those applications.

Now, virtually all of the day-to-day applications have moved to the cloud, either via the browser or via apps on your phone.

  • Word and Excel => Google Docs
  • iTunes => Spotify
  • AIM, ICQ => WhatsApp, Facebook Messenger, etc…
  • Photoshop => Adobe Creative Cloud
  • etc…

Even things like video rendering and massive data computation are now often outsourced to cloud servers.

With cloud computing taking on most of the heavy lifting in terms of raw processing power, we have seen a movement in personal devices towards “thin clients”:

A thin client (sometimes also called a lean, zero or slim client) is a computer or a computer program that depends heavily on another computer (its server) to fulfill its computational roles.

Chromebooks are an extreme example of a thin client - in addition to fairly basic processors, they sport limited local storage, eschewing large local drives in favor of Google Drive storage. The benefit is that you can often buy a very usable computer for $200-300.
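
To make the thin-client pattern concrete, here’s a minimal sketch in Python; the rendering service URL and API shape are hypothetical, purely for illustration:

```python
# Thin-client sketch: the local machine only sends a request and reads
# the result; all heavy computation happens on the server.
import json
from urllib.request import Request, urlopen

def render_video_remotely(project_id: str) -> dict:
    """Ask a (hypothetical) cloud rendering service to do the heavy lifting."""
    req = Request(
        "https://render.example.com/v1/jobs",  # hypothetical endpoint
        data=json.dumps({"project": project_id}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urlopen(req) as resp:  # the client just waits
        return json.load(resp)  # the server did all the work

# Serializing a request and parsing a response is cheap, which is why
# a $200-300 Chromebook with a basic processor is enough.
```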

Design

The final piece is portability and design.

In my opinion, the real innovation over the past 5-7 years has shifted from packing processing power into bulky, heavy cases to offering practical features and beautiful design in a lightweight product that’s a joy to use and carry around all day.

We’ve seen this with portability (MacBook Air) and with keyboard / input innovation like the new MacBook Pro’s Touch Bar.

The reason I upgrade my device these days isn’t the processing power, but the improved portability, better screen resolution and better battery life. The question, given those improvements, is really just “is it worth it?”

Strangely, over the past 5-7 years, the answer has been no. This has triggered something of a “vicious cycle” in the personal computer industry - the lack of organic need / demand for improvements has meant less attention, less research & development, less competition, and therefore less innovation.

Where are we headed?

The next paradigm shift is going to be in computing applications - the consumer application has largely exhausted its exponential improvements and is settling into more linear gains. The new computing applications gaining steam now are in GPU / AI / ML computation, as seen with Nvidia’s automotive “self-driving” chips and Google’s Tensor Processing Unit.

I think we’ll also eventually see a surge in nanotechnology that allows for human-implanted computational devices that diagnose and enhance our physiological and neurological systems. We’re just beginning to make progress there, so we’re probably 20-30 years away from any commercial applications going mainstream, but I’m excited about the space.
