What I learned from the Apple Store’s 30-minute Vision Pro demo

Seeing is believing? Despite some awe-inspiring moments, the $3,500 headset is a big lift for retail.

Kyle Orland – Feb 7, 2024 9:46 pm UTC

[Photo: These mounted displays near the entrance let visitors touch, but not use, a Vision Pro. Credit: Kyle Orland]

For decades now, potential Apple customers have been able to wander into any Apple Store and get some instant eyes-on and hands-on experience with most of the company’s products. The Apple Vision Pro is an exception to this simple process; the “mixed-reality curious” need to book ahead for a guided, half-hour Vision Pro experience led by an Apple Store employee.

As a long-time veteran of both trade show and retail virtual-reality demos, I was interested to see how Apple would sell the concept of “spatial computing” to members of the public, many of whom have minimal experience with existing VR systems. And as someone who’s been following news and hands-on reports of the Vision Pro’s unique features for months now, I was eager to get a brief glimpse into what all the fuss was about without plunking down at least $3,499 for a unit of my own.

After going through the guided Vision Pro demo at a nearby Apple Store this week, I came away with mixed feelings about how Apple is positioning its new computer interface to the public. While the short demo contained some definite “oh, wow” moments, it didn’t tell a cohesive story pitching the device as Apple’s next big general-use computing platform.

Editor’s Note: This article precedes a series of in-depth looks at the Apple Vision Pro. We started with the Apple Store experience because that’s where most people will encounter the device first. Over the coming days, Samuel Axon will be covering long-term usage and experience with the device, as he has spent more than a week living and working in the Vision Pro. Stay tuned.

Setup snafus

After arriving a few minutes early for my morning appointment in a sparsely attended Apple Store, I was told to wait by a display of Vision Pro units set on a table near the front. These headsets were secured tightly to their stands, meaning I couldn’t try a unit on or even hold it in my hands while I waited. But I could fondle the Vision Pro’s various buttons and straps while getting a closer look at the hardware (and at a few promotional videos running on nearby iPads).

[Photo gallery: Two Vision Pro headsets let you see it from multiple angles at once. Nearby iPads let you scroll through videos and information about the Vision Pro. The outward-facing display is very subtle in person. Without an appointment, you can feel the headstrap with your hands but not with your skull. To Apple’s credit, it did not even try to hide the external battery in these store displays. Credit: Kyle Orland]

After a few minutes, an Apple Store employee, whom we’ll call Craig, walked over and said with genuine enthusiasm that he was “super excited” to show off the Vision Pro. He guided me to another table, where I sat in a low-backed swivel chair across from another customer who looked a little zoned out as he ran through his own Vision Pro demo.

Craig told me that the Vision Pro was the first time Apple Store employees like him had gotten early hands-on access to a new Apple device well before the public, in order to facilitate the training needed to guide these in-store demos. He said that interest had been steady for the first few days of demos and that, after some initial problems, the store now mostly managed to stay on schedule.

Unfortunately, some of those demo kinks were still present. First, Craig had trouble tracking down the dedicated iPhone used to scan my face and determine the precise Vision Pro light seal fit for my head. After consulting with a fellow employee, they decided to have me download the Apple Store app and use a QR code to reach the face-scanning tool on my own iPhone. (I was a bit surprised this fit scanning hadn’t been offered as part of the process when I signed up for my appointment days earlier.)

It took three full attempts, scanning my face from four angles, before the app managed to spit out the code that Craig needed to send my fit information to the back room. Craig told me that the store had 38 different light seals and 900 corrective lens options sitting back there, ready to be swapped in to ensure maximum comfort for each specific demo.

[Photos: Sorry, I think I ordered the edamame… Shhh… the Vision Pro is napping. Credit: Kyle Orland]

After a short wait, another employee brought my demo unit out on a round wooden platter that made me feel like I was at a Japanese restaurant. The platter was artistically arranged, from the Solo Knit Band and fuzzy front cover to the gently coiled cord leading to the battery pack sitting in the center. (I never even touched or really noticed the battery pack for the rest of the demo.)

At this point, Craig told me that he would be able to see everything I saw in the Vision Pro, which would stream directly to his iPad. Unfortunately, getting that wireless connection to work took a good five minutes of tapping and tinkering, including removing the Vision Pro’s external battery cord several times.

Once everything was set, Craig gave me a brief primer on the glances and thumb/forefinger taps I would use to select, move, and zoom in on things in the visionOS interface. “You’re gonna pretend like you’re pulling on a piece of string and then releasing,” he said by way of analogy. “The faster you go, the faster it will scroll, so be mindful of that. Nice and gentle, nice and easy, and things will go smoothly for you.”

Fifteen minutes after my appointed start time, I was finally ready to don the Vision Pro.

A scripted experience

After putting the headset on, my first impression was how heavy and pinchy the Vision Pro was on the bridge of my nose. Thankfully, Craig quickly explained how to tighten the fit with a dial behind my right ear, which helped immediately and immensely. After that, it only took a minute or two to run through some quick calibration of the impressively snappy eye and hand tracking. (“Keep your head nice and still as you do this,” Craig warned me during the process.)

[Photo: Imagine this but with an Apple Store in the background. Credit: Kyle Orland]

As we dove into the demo proper, it quickly became clear that Craig was reading from a prepared script on his iPhone. This was a bit disappointing, as the genuine enthusiasm he had shown in our earlier, informal chat gave way to a dry monotone when delivering obvious marketing lines. “With Apple Vision Pro, you can experience your entire photo library in a brand new way,” he droned. “Right here, we have some beautiful shots, right from iPhone.”

Craig soldiered through the script as I glanced at a few prepared photos and panoramas. “Here we have a beautiful panorama, but we’re going to experience it in a whole new way… as if you were in the exact spot in which it was taken,” Craig said. Then we switched to some spatial photos and videos of a happy family celebrating a birthday and blowing bubbles in the backyard. The actors in the video felt a little stilted, but the sense of three-dimensional “presence” in the high-fidelity video was impressive.

After that, Craig informed me that “with spatial computing, your apps can exist anywhere in your space.” He asked me to turn the digital crown to replace my view of the store around me with a virtual environment of mountains bathed in cool blue twilight. Craig’s script seemed tuned for newcomers who might be freaked out by not seeing the “real world” anymore. “Remember, you’re always in control,” Craig assured me. “You can change it at any time.”

From inside the environment, Craig’s disembodied voice guided me as I opened a few flat app windows, placing them around my space and resizing them as I liked. Rather than letting these sell themselves, though, Craig pointed out how webpages are “super beautiful [and] easy to navigate” on Vision Pro. “As you can also see… text is super sharp, super easy to read. The pictures on the website look stunning.” Craig also really wanted me to know that “over one million iPhone/iPad apps” will work like this on the Vision Pro on day one.

Kyle Orland has been the Senior Gaming Editor at Ars Technica since 2012, writing primarily about the business, tech, and culture behind video games. He has journalism and computer science degrees from the University of Maryland. He once wrote a whole book about Minesweeper.

Promoted Comments

rmohns: And this is why you shouldn’t expect non-actors to work from a script. No matter how enthusiastic someone is, put a script in front of them and they lose all spontaneity and joy.

I’m a little disappointed in Apple that they expect their employees to read a script. A checklist of features to show? Sure, why not. But dialogue? For a company that prides itself on the user experience and human factors, that’s a big thing to miss. February 7, 2024 at 10:07 pm

lithven: My one thought on this is that demos, by design, are always better than reality. Specifically, I’m saying this with regard to the 180-degree 8K video that you indicated was the highlight for you. When such things are generated for a demo like this, they are designed to show the capability of the device, not necessarily to be a realistic representation of what other media will look like on the same device. If you’re demoing a piece of equipment, you can tune everything from the motion (to avoid blur), to the color of the video for the displays used, to the amount of contrast in the scene.

This isn’t to tear the company or the device down; everyone with any sort of demo does the same thing. I just find it interesting that the one thing you spoke most highly of would be the thing I would take with the largest grain of salt. February 7, 2024 at 10:11 pm

TimeWinder: "I’m left, having finished the article, still not sure what the Apple Vision actually brings." Having been using one fairly regularly now for a few days — software.

I DO NOT mean apps; if you asked me "Why shouldn’t I buy a Vision Pro right now?", my first answer would be "Apple and Unity managed to catastrophically bungle the availability of software at launch." There may actually be "more than 600 apps available at launch," but it sure doesn’t look that way while browsing the single-page (and sometimes single-item) App Store categories. A lot of it’s repetitive: every sport/entertainment company/niche apparently feels the need to have its own custom cinema app even though they’re all basically the same. And a lot of it’s just dumb; there’s a "game" in there that’s just a ball bouncing in AR space. So lack of software would come first on my list of negatives right now, ahead of cost, comfort, the possibility it might mess up your hair or eyebrows, and apparently the lack of porn.

That software selection will improve, but Apple’s tools are new, a little buggy, and abstracted to the point of being extremely hard to learn, even for existing iOS devs. Unity’s are new, a lot buggy, and inaccessible to most of the hobby/indie devs that make up the bulk of Apple platform developers because of the $2,000/person/year barrier to entry. So I think it’s going to be slow, and "I’ll wait until there are actually a decent number of apps to buy" seems a quite reasonable position for someone to take.

But it’s hard to complain about visionOS itself. A lot of the hyperbolic statements in the demo aren’t actually all that hyperbolic. Things are clearer, sharper, and easier to read than any AR/VR system I’ve ever used–even ones with "on paper" better specs–and I’ve used many of them. Most of the extensive set of "flat" iPad apps actually work well, and can "cover" a bit for not having a decent software selection; being able to scatter them around you in space makes for an upgraded experience even though they’re not actually any better than their on-tablet forms.

A lot of folks (including me) complained about the eye tracking and gestures being uncomfortable. That lasts about two days; at this point, interacting with this thing is completely natural, fast, and easy. As with the iPhone and the Apple Watch, there are some new metaphors that you need to learn, and some old ones that you can’t do. Your eyes aren’t a mouse, and a lot of that early adjustment is learning that. It does some things better and some things worse, but on balance, it’s an Apple product: it’s designed for what it is, and it’s good at it. I’m flinging things about with wild abandon at this point; I look like one of those Hollywood hackers with a half dozen windows appearing and disappearing in a flash onscreen.

[Edit: Hint to new users: Move your eyes, NOT your head. It feels wrong at first, but not actually moving your neck muscles (or not as much) dramatically improves the comfort and wear-duration of the headset.]

There’s a lot of whining about not having controllers, but for most day-to-day non-game stuff you don’t need them, and not having those extra parts around (with their separate batteries, usually) is a win. For games, it supports any Bluetooth gamepad that works on iOS just fine (effectively all of them), although there are still a few rough edges there: Steam Link, for example, can’t see my controller, even though it works fine in the app on iPad and in every other app on the Vision Pro. No idea.

The Mac Virtual Display is either wonderful (if the virtual screen is better than your real one(s)) or an OK gimmick (if not). It has some annoying quirks, but it works well enough for actual use at full resolution and any reasonable size. As a dev, I’ve been using it to avoid having to take the headset off and on while testing VP apps. It’s a tradeoff — I gain the ability to transition between device and code trivially, but I lose the multi-monitor setup that I use when developing flatter apps. I’m hoping for some improvement here, but it’s enough for now.

Overall, I’d say that Apple released it a few months before the software environment could support it, and they made a massive mistake in trusting Unity (though I suppose they couldn’t have known at the time), but this is going to be a stable, well-liked platform sooner rather than later. There’s nothing broken here that can’t be fixed, and plenty that works well enough to use now. February 7, 2024 at 10:53 pm

brlljotic7232: That script is so cheesy. If you’re gonna have people help demo the device, why have them talk to customers like cold machines? Is that really how they expect to sell their $3,500 device?

I think no human being has ever uttered the sequence of words "right from iPhone" outside of an Apple ad.

Well, I’ll give credit to my Apple Store employee who ran my demo this past Saturday here in NYC: he was really smooth and not reading from a script, and he didn’t read any of those corny lines. But I was forthright with him; I told him from the start that I was unlikely to purchase a first-generation model of anything, especially something so different, which I very well might not find much use for after the sense of novelty passes.

I also asked plenty of questions about AirPlaying my laptop screen to the AVP to use it as a large, movable monitor. They did not have a demo module set up for that, which I agree is an oversight, but they are probably promoting the aspects that they believe will be the most popular with the largest number of people.
I don’t know whether I could (honestly) wear this or any headset for 8 hours to work from my laptop or desktop.

I will say that the entertainment aspect is stunning. I didn’t expect the verisimilitude and real sense of depth and height which the spatial video (360 degree) allowed. I actually felt a sense of ‘movement’ as if I were flying over a broad river in one of the spatial video landscape scenarios. My body responded as if I were moving.
Oh – and the audio is great, even without AirPods or AirPods Pro. Very nice.

I’ll be interested to see whatever short films or music videos people may create expressly for the AVP. There was a clip of Alicia Keys singing up close and filmed in spatial, and there was almost the sense that I was actually in front of her in that lounge.
But I’m not going to drop $3.5K + tax on something even as fascinating as this. I think the device has great potential as a platform and as an entertainment device, but I don’t know that I would actually want to use it very much until it is lighter. February 8, 2024 at 7:16 am
