During the Vision Pro reveal, Tim Cook announced that “augmented reality is a profound technology,” proclaiming that AR can blend “digital content with the real world and unlock experiences like nothing we’ve ever seen.” The screen behind him then turns black, and we are led on a 360-degree tour of the Vision Pro itself, examining its sleek design and its unique, yet familiar, features. We’ve seen this before, maybe in Ready Player One or The Matrix.
“It feels familiar,” Tim chimes in, “yet it’s entirely new.”
Around the three-minute mark in the video, Cook echoes a line his predecessor, Steve Jobs, delivered during the iPhone reveal back in 2007, reminding us of the legacy-building journey Apple has been on for over three decades:
“so in the same way Mac introduced us to personal computing, and iPhone introduced us to mobile computing, Apple Vision Pro will introduce us to spatial computing.”
Spatial Computing? What Happened to VR? AR?
Some journalists have noted that Apple intentionally avoided overemphasizing the terms “VR” or “AR” in its announcement. Some have called it clearly an AR device with “VR features,” while others have said the opposite, arguing that the Vision Pro is clearly a VR device with AR features. We might not know what the Vision Pro is for quite some time, as Apple is hoping that developers will help imagine its future. But for now we have Tim Cook’s vision of spatial computing, which will once again deliver a “revolutionary product.”
Spatial computing refers to a set of technologies that allows humans to interact with computers in three-dimensional space. It combines technologies such as computer vision, AI, and advanced modeling to create an interface where digital objects exist alongside physical ones: a combination of the material with the virtual, a blend of what “feels familiar” yet “is entirely new.” It’s not just the hardware that might look familiar to you; it’s our sense of space that Apple hopes to evoke.
So how can space feel both familiar and new? As the promotional video suggests, users will still be able to view a high-resolution representation of what’s in front of them, so FaceTime with Vision Pro is described as “more natural,” allowing you to hear people “as if they are right in front of you.” But Vision Pro also enhances your surroundings, transforming your space to create “beautiful environments that extend beyond your room” and allowing your movie screen to feel “a hundred feet wide.” Straddling both realms seems strategic: for the skeptic, Vision Pro integrates seamlessly within one’s environment; for the tech junkie, it enhances and improves one’s experience. Pick your poison.
Most importantly, Vision Pro co-opts a language we typically reserve for material spaces. Our favorite apps are now “in our space,” and they have “dimensions which react to light and cast shadows.” You can even experience the real world while using Vision Pro, as the promotional video notes:
“because you see the world around you, you can glance at a notification and even connect to your Mac.”
Notice how “the world” here is reduced to other interactions with tech, such as taking a “glance at a notification” or connecting to one’s Mac. Experiencing space seems consistently mediated by Mac products, which probably isn’t far off for many of us. This is epitomized in the promotional video where we see a father enjoying his children playing while gazing through a Vision Pro, a possibly familiar, yet new, version of parenting.
Apple’s segue into spatial computing should remind us how space is conceptualized in the first place. Space is socially produced; it both shapes and is shaped by our experiences and perceptions. All social relations become real and concrete once they are spatially inscribed: there is no unspatialized social reality. Spaces, then, are not stable locations but are in constant flux. Consider a train or plane ride, spaces that have felt slightly different since the iPhone conquered our attention. Before 2007, these were spaces where we would grapple with awkward eye contact or have the occasional conversation with a stranger, all while managing our emotions, or the boredom of it all, in the process. Now our phones help us avoid these interactions, creating a more isolating space and sparing us the anxiety that comes from feeling present at all times. These spaces might look the same, but the experience has changed.
Spaces have a feeling, and what is lived and experienced in a space might differ from what is conceived or imagined about it. In The Production of Space, sociologist Henri Lefebvre outlines the tension between our sensual experience of space (the lived) and the intellectual, abstract vision of space (the conceived), which exists mostly in our heads and has the potential to reduce our lived space:
“Like all social practice, spatial practice is lived directly before it is conceptualized, but the speculative primacy of conceived over the lived causes practice to disappear along with life, and so does very little justice to the “unconscious” level of lived experience per se.”
It is this unconscious, sensual experience of space that cannot contend with the more abstract, conceived space. But I think we often recognize when conceived space is being dictated to us in a way that seems inaccurate, which is why many of us cringe when we see the father staring at his children through the Vision Pro: there’s something authentic, maybe even indescribable, that happens during those social relations that a clunky headset might ignore or flatten.
Apple believes it has considered the limitations of Vision Pro’s virtual environment, such as the nausea and isolation induced by virtual reality. These issues have potentially been addressed thanks to the new technologies Apple has implemented within Vision Pro (the company filed 5,000 patents over the past five years). Given that much of this technology is relatively new in the XR space, it may take time before studies emerge on the effects of long-term usage. Still, the experience is supposed to feel so crisp and fluid that TechCrunch writer Matthew Panzarino called it “The Platonic ideal of an XR headset.”
But one thing is for sure: the more we interact within the virtual, the more it transforms our embodied space. The New York subway will never return to its pre-2007 days, nor will our family dinner tables, company meetings, or public classrooms, and depending on your view, maybe that’s a good thing. But it often seems that companies like Apple want the virtual to intertwine with “actual” space to such a degree that it is no longer useful to think of them as distinct spaces. Referencing the familiar and the novel in the same breath, and describing our virtual spaces as if they were material ones, are persuasive rhetorical techniques for doing exactly that.
Do we want a seamless interplay between the virtual and the real as imagined by Apple? I don’t know. Honestly, I can’t even promise you I won’t eventually own a Vision Pro; I remember telling myself I would never own an iPhone, only to buy one in 2009. By then, it almost seemed as if being in the world was naturally linked to owning a smartphone: I couldn’t walk down the street without bumping into a person staring into their screen. Now Cook promises the same kind of inevitability with the Vision Pro, but it’s important to remember that socially constructed spaces and our sense of embodiment will inevitably shift. The two naturally affect each other.
“Space is not a thing,” Lefebvre argued, “but is a set of relations between things.” Are we ready to build relations with the Vision Pro? Are we ready to build new spaces with spatial computing? Apple seems to think so.