Why you should take Apple Vision Pro seriously

Content Copyright © 2023 Bloor. All Rights Reserved.
Also posted on: Bloor blogs


Apple just announced Vision Pro and changed the game. To explain why, and what I mean, you need to get beyond the $3,499 price tag, the limited battery life, and the ergonomics of wearing something akin to a wrap-around virtual reality headset all the time, and I need to give you some historical context. I’ve been saying for a while that 2023 is a pivotal year in the tech space, and Apple’s announcement was the next shoe dropping. I believe what’s happening is the start of a new phase of the Internet and another turning point in the way we interact with technology. Computing and communications have always evolved, but look back at the birth of the computer, the birth of the Internet, and the point when we all became publishers, carrying access to all recorded history, the latest news, and all available information on a device many of us could afford and put in our pockets, and you see distinct phases and points of change.

Third Industrial Revolution – birth of computing and Web 1.0

The World Economic Forum suggests the third industrial revolution, a time of electronics, IT and automated production, started in 1969. It runs from the era of the mainframe and the increasing use of computers in business, through the start of the web, and into the 2000s. In 1965, Gordon Moore, the co-founder of Fairchild Semiconductor and Intel, made the observation that the number of transistors in an integrated circuit (IC) doubles about every two years. Here we are in 2023 and, incredibly, Moore’s law still holds. Alongside that steady progression towards lower-price, higher-performance chips, this period saw new technology innovations arriving one after the other at intervals of three or more years: the mainframe, then the minicomputer, then IBM transforming the Personal Computer market in 1981, the (analogue) car/mobile phone, client/server technology, the start of the web, the take-off of e-commerce, and cell phones becoming ubiquitous. At each shift, new companies formed that eventually became well-known brands, while others failed to make the transition and died. Web 1.0 is what we now call the first incarnation of the Internet and Tim Berners-Lee’s World Wide Web, from around 1991 to the late 2000s. During that period, companies started to create static websites, search engines like AltaVista, Yahoo and Ask Jeeves appeared, the dot-com bubble boomed and burst, and consumers as well as corporates started to use email.
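Moore’s observation is easy to sanity-check with a few lines of arithmetic. The sketch below (Python, purely illustrative; the starting figure of roughly 2,300 transistors for Intel’s 4004 in 1971 is my assumption, not from this article) projects a transistor count forward assuming a fixed two-year doubling period:

```python
# Illustrative only: project Moore's-law growth in transistor counts.
# The starting point (~2,300 transistors, Intel 4004, 1971) is an
# assumption for illustration.

def projected_transistors(start_count: int, start_year: int, year: int,
                          doubling_period_years: float = 2.0) -> int:
    """Project a transistor count forward, assuming it doubles
    every `doubling_period_years` years."""
    doublings = (year - start_year) / doubling_period_years
    return round(start_count * 2 ** doublings)

# 26 doublings between 1971 and 2023 gives roughly 1.5e11 transistors,
# which is the right order of magnitude for today's largest chips.
print(projected_transistors(2300, 1971, 2023))
```

Fifty-two years of doubling every two years lands in the hundreds of billions of transistors, which is why the claim that Moore’s law “still holds” in 2023 is so remarkable.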

For the transition from the Third to the Fourth Industrial Revolution, the WEF doesn’t give a year. I will make the case for it being 2007. That was the pivotal year. Let me quote from Klaus Schwab’s explanation from 2016 in the middle of the second wave:

“There are three reasons why today’s transformations represent not merely a prolongation of the Third Industrial Revolution but rather the arrival of a Fourth and distinct one: velocity, scope, and systems impact. The speed of current breakthroughs has no historical precedent. When compared with previous industrial revolutions, the Fourth is evolving at an exponential rather than a linear pace. Moreover, it is disrupting almost every industry in every country.”

Fourth Industrial Revolution – Web 2.0

Velocity, scope, and systems impact. That’s what became visible in 2007, with what we now call Web 2.0. The term Web 2.0 was actually coined by Darcy DiNucci back in 1999 and popularised by O’Reilly Media‘s conference in 2004, when the tools were just becoming available or were still in development. The web got interactive, characterised by user-generated content, ease of use, a more participatory culture and interoperability. Then three major technology shifts happened simultaneously during 2007. Social media started to be harnessed by businesses rather than just geeks, developers, and enthusiasts. At the same time, we began to use the term Cloud Computing regularly, and the idea of Software as a Service and web apps became mainstream. The iPhone was launched in January 2007, but initially Steve Jobs announced it as a closed system so that outside developers couldn’t break it. By September 2007, however, he did a U-turn, opened the platform to developers and created the App Store. That created the mobile computing industry, and suddenly we were saying “there’s an app for that“. We had near-ubiquitous access to the Internet in the palm of our hands, and everyone could be a publisher.

What else happened in 2007? We all thought Mark Zuckerberg was crazy to turn down $1Bn from Yahoo when Facebook was only one-tenth the size of MySpace at the time. In September 2007, the same month as the iPhone U-turn, Zuckerberg also opened up his platform to developers. What else happened? A company called Netflix, which ran an e-commerce platform renting DVDs by post and had tried and failed to sell itself to Blockbuster, pivoted and started streaming content instead. What else happened? Twitter had started as an experiment in 2006, passed a million users in March 2007 when it collaborated with the SXSW event that year, and suddenly Jack Dorsey realised he had something; he formally incorporated it as a company in April 2007. From this point on we were dealing with multiple, simultaneous new technologies: big data, analytics, 3D printing, nanotechnology, the Internet of Things, 4G, 5G and more. Velocity, scope, and systems impact.

This fourth industrial revolution saw the rise of blogging, Google, Amazon, Wikipedia, WordPress, LinkedIn, Skype, Facebook, YouTube, Twitter, Tumblr, WhatsApp, Pinterest, Instagram, Snapchat and TikTok. Things going viral. Memes. We needed to deal with a new element in marketing – influencers. We had completely new digital ways to collaborate. All of this contributed to a complete change in the way people research products and buy them. Business is done differently with Social Selling.

First, we had the static web, then it got interactive. It’s just about to change again.

The Third Wave of the Internet – fifth industrial revolution? Web3? Web 3.0?

To quote a classic song from the 60s (and lose all of the millennials and Gen Z?)

“There’s something happening here

But what it is ain’t exactly clear”

I believe 2023 is the start of a third incarnation of the Internet which will become an even bigger step change than the two previous versions/revolutions. Will we eventually start describing it as the fifth industrial revolution? Tim Berners-Lee talked of the Semantic Web, sometimes known as Web 3.0. He says the goal of the Semantic Web is to make Internet data machine-readable so that our daily lives will be handled by machines talking to machines. We can see evidence of that now. We have artificial intelligence, machine learning, natural language processing, and voice recognition starting to do a significant amount of that. And then suddenly ChatGPT, released by OpenAI in November 2022, got to a million users in 5 days! From January to now, 6 months and counting, it’s been the hottest topic of debate, with multiple news items in your feed every day. Is AI a friend or foe? What jobs will it displace? How can it be used for good? Look how it can be used by criminals. The EU is debating regulation this week. It will be the largest part of discussions through this week’s London Tech Week. AI isn’t a new topic. The first paper on it was published by Alan Turing in 1950, but the forms of AI and ML amplified by generative AI and Large Language Models are ingredients in the “something” that’s happening.

Web 3.0 is not to be confused with Web3, the term coined by Ethereum co-founder Gavin Wood in 2014 for an iteration of the World Wide Web that incorporates concepts such as decentralisation, blockchain technologies, and token-based economics. I see how decentralised finance (DeFi) and the added trust of blockchain distributed-ledger solutions will have a role to play in the new virtual marketplaces we’ll be operating in as this change happens. New frictionless payment processes will be needed. IoT and Industry 4.0 are key enablers, intelligently linking and analysing data for fast and confident decisions within our manufacturing and operating processes. Digital Twin technology will come into play to test and then accelerate a firm’s innovation.

The Web 2.0 world needed near-ubiquitous Wi-Fi and 4G access to make it work. This next level of computing needs a step up again. We need the Cloud as an enabler, but now we need it at the Edge. Sensors, IoT devices, smartphones, tablets, and new headsets need to be supported in real time while providing more and more data to be captured and analysed. We’ll have bandwidth limitations, latency issues, and network disruptions that will need to be addressed by 5G, 6G, and Edge Computing. We’ll need to act on insights closer to the data source. We’ll need to enable apps, orchestration, and APIs, eventually blurring the line between device, network, edge, and cloud. Gartner suggests that by 2025, 75% of enterprise data will be processed at the edge, compared to only 10% today. I’m not sure about those numbers, but I believe the trend.

What about the Metaverse? Previously I’ve recommended you get real about it. By getting real, I mean it doesn’t exist yet: we don’t have interoperable, persistent virtual worlds. We do have metaverse-like experiences happening using virtual reality (VR) headsets on gaming platforms like Roblox, Decentraland, HyperVerse and others, and those experiences have value. Meta, Microsoft, Google, AWS and a lot of big brands are investing. People are talking about VR, AR (augmented reality), and XR (extended reality). Then Apple announced their headset and vision last week. I’m not sure what we were expecting, but it took a lot of us by surprise.

Why you should take Apple Vision Pro seriously

The Apple Vision Pro is a new style of device, but it combines a lot of things we already use and understand. Apple isn’t using the XR/AR/VR acronyms and jargon, but simplifying the message to mixed reality, making it easier for the user to understand. Their approach isn’t a “metaverse-like experience” on a gaming platform, but a new paradigm: spatial computing. The headset looks similar to the VR headsets that precede it, but instead of gaming controllers (on a gaming platform), it uses gestures, eye tracking, and voice commands. We aren’t being asked to code a new experience (on a gaming platform); instead, spatial computing gives us access to the apps and interfaces we already know and use day in, day out, overlaid on the real world, which we can still interact with.

I realise that the $3,499 price tag means that developers and well-to-do early adopters will make up most of the queues at the Apple Stores next year. However, rather than positioning this as the next consumer device like an iPhone, iPad or Apple Watch, think of it more like the next generation of personal computing. Way back in November 1986, I bought my first Personal Computer: an IBM XT with a colour screen. Its list price was £3,486; I got it at a 40% discount because I was working for IBM at the time. Back then either of those price points would buy you a nice car. My friends thought I was crazy to spend so much on home computing.

The other thing I want to remind you of is the iPhone. It was announced in January 2007, as we’ve discussed, but what was the most popular smartphone in the UK in Q4 2009? It was the BlackBerry, used by enterprise users for their email and by school kids because BBM gave them cheap messaging. It took at least three years for the iPhone to get established. Apple Vision Pro and its follow-on product family will take the same sort of time.

So Web 1.0 was static and one-dimensional, and we used the keyboard and mouse to interact with it. Web 2.0 got interactive and two-dimensional, adding multitouch screens, virtual keyboards, and facial recognition. Web 3.0 goes three-dimensional and takes us into the era of spatial computing, adding gestures, eye tracking, and voice.

These new 3D spatial computing experiences, not forgetting the metaverse-like experiences, add to the AI explosion we are all discussing and the Edge Computing we will need to enable and support them. Add in the other technologies mentioned here or coming soon, and I believe these things overlap and combine into an emerging Third Wave of the Internet starting this year, 2023. It changes the interface, changes the user experience, and changes the way we collaborate in a mixed-reality, hybrid world. To the keyboard, mouse, trackpad, and multitouch screen we’ve been clicking and swiping for decades, we are adding voice, gestures, vision, and eye tracking. We’ll need to rethink our business models, learn, and relearn, without losing the important lessons of the 1.0 and 2.0 eras.