Apple Vision Pro (AVP) is coming on February 2nd. Are you getting one?


By Jim Teece, JimTeece.com

Apple announced the Apple Vision Pro in June. A company that has pivoted several times over the last 20 years with the iPod, iPhone, and iPad is now causing a stir with its first spatial computing device.

The price is $3,500, and that is the first thing people tell me when I ask them about it.

I remember when the original Macintosh launched. I bought one. I was a fan of the Lisa but couldn’t afford the $7,500. The $2,499 Macintosh was the same thing but affordable. I bought it on my first credit card, which Apple conveniently set up as well. That major purchase (I paid $1,250 for my first used car that same year – a 1972 Smurf blue Plymouth Duster) changed my life. It took me years to pay it off, but I have no regrets. I have built a career on that investment.

I googled how much $2,499 would cost today, and with inflation Google says it’s over $9,604. That makes the AVP sound like a bargain now.

So now I’m looking at the future again. I’m looking at the opportunity to learn and experience a new platform.

Spatial computing is an emerging set of technologies that brings the digital and physical worlds together, letting people interact with computers in three-dimensional space in more seamless and immersive ways. It encompasses virtual reality (VR) and augmented reality (AR), as well as the related concepts of mixed reality (MR) and extended reality (XR).

Here’s how spatial computing compares to AR and VR:

**Virtual Reality (VR)**: A fully immersive experience that blocks out the real environment and replaces it with a simulated one, isolating the user from the physical world.

**Augmented Reality (AR)**: Digital content, such as virtual screens, 3D models, or interactive graphics, is overlaid on the real world while the physical environment remains visible.

**Spatial Computing**: An umbrella term that includes VR, AR, and MR. Spatial computing enables more immersive and physically interactive experiences, drawing on technologies like computer vision, sensor fusion, spatial mapping, haptic feedback, machine learning, edge computing, and robotics. Virtual objects can be placed and manipulated in a way that corresponds to the physical world, so interactions feel realistic and natural.

AR and VR are subsets of spatial computing; the broader field covers every technology that enables interactive experiences in three-dimensional space.

We have also been working on software for the new device. I’m writing this before the launch, and Apple has us under a strict NDA until then, so I’ll be able to tell you more about it in the next issue.

One challenge is deciding what to write without having seen the device in person or seen what others are writing. Another is figuring out the right mix of AR and VR to bring into your app.
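For the developers reading this, the public visionOS SDK frames that choice roughly like so: a plain window is the conservative default, and an ImmersiveSpace can run in a mixed style (your room stays visible, AR-leaning) or a full style (the room is replaced, VR-leaning). Here’s a minimal sketch based on the documented SwiftUI APIs; the app, view, and space names are placeholders of mine, not anything from our actual project.

```swift
import SwiftUI
import RealityKit

@main
struct MixDemoApp: App {                      // placeholder name
    // .mixed keeps the real room visible (AR-leaning);
    // .full replaces it entirely (VR-leaning).
    @State private var style: ImmersionStyle = .mixed

    var body: some Scene {
        // A regular 2D window: the safe starting point for any app.
        WindowGroup {
            ContentView()
        }

        // An immersive space the user can step into from that window.
        ImmersiveSpace(id: "demoSpace") {
            ImmersiveView()
        }
        .immersionStyle(selection: $style, in: .mixed, .full)
    }
}

struct ContentView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Button("Enter the experience") {
            Task { _ = await openImmersiveSpace(id: "demoSpace") }
        }
    }
}

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            // Placeholder 3D content; a real app would load a scene here.
            let sphere = ModelEntity(mesh: .generateSphere(radius: 0.2))
            sphere.position = [0, 1.5, -1]   // about eye height, a meter away
            content.add(sphere)
        }
    }
}
```

My guess is most apps will keep people in the window and save the full style for things like movies and games, but that’s exactly the mix we’re all still trying to figure out.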

Apple Vision Pro has cameras that watch your eyes. When your eyes land on an object on the screen, it highlights. I’m not sure how it works with a lazy eye or blindness in one eye, but it sounds cool.
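From what I’ve seen in the public developer documentation, apps never get the raw eye-tracking data; the system privately draws the highlight on whatever you’re looking at, and a custom view opts into that with a hover effect. A tiny sketch (the view and its contents are made up for illustration):

```swift
import SwiftUI

struct GazeHighlightDemo: View {
    var body: some View {
        // Standard controls get the gaze highlight for free; a custom
        // interactive view opts in with .hoverEffect. The app is never told
        // where the eyes are pointing, only when the user pinches.
        RoundedRectangle(cornerRadius: 16)
            .fill(.blue.opacity(0.6))
            .frame(width: 240, height: 90)
            .hoverEffect(.highlight)
            .onTapGesture {
                print("Looked at and pinched")
            }
    }
}
```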

Apple Vision Pro has cameras that watch your surroundings. It can bring the room into your experience or alert you when someone walks into the room. I’m not sure how sensitive this is or how much peripheral vision you get. Will the cat set it off and take you out of your immersive experience? Will you get startled or freaked out when someone walks into your space?

Apple Vision Pro lets you control apps with gestures like a pinch of your fingers or a twist of your wrist. There is no controller, just your fingers.
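My understanding from the SDK is that those pinches arrive in your app as ordinary SwiftUI gestures targeted at 3D content, so you mostly reuse what you already know from iOS. Here’s a hedged sketch of tapping and dragging a RealityKit object; the cube and names are placeholders, and a wrist twist would be handled similarly with RotateGesture3D.

```swift
import SwiftUI
import RealityKit

struct PinchDemoView: View {
    var body: some View {
        RealityView { content in
            // A placeholder cube for the gestures below to act on.
            let cube = ModelEntity(mesh: .generateBox(size: 0.3))
            cube.position = [0, 1.2, -0.8]
            cube.components.set(InputTargetComponent())    // accepts input
            cube.generateCollisionShapes(recursive: false) // needed for hit tests
            content.add(cube)
        }
        // Look at the cube and pinch: the indirect "tap".
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    print("Pinched: \(value.entity)")
                }
        )
        // Pinch and move your hand to drag the cube around the room.
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    value.entity.position = value.convert(
                        value.location3D,
                        from: .local,
                        to: value.entity.parent!
                    )
                }
        )
    }
}
```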

Apple Vision Pro will run over a million apps for iPhone or iPad without modification. Wow!

Apple Vision Pro will allow you to insert prescription or reader lenses. Yes, they cost extra, but it seems that the challenge of individual optical needs can be handled.

Apple Vision Pro can work in tandem with your computer or standalone. The computer that drives AVP is in the device itself. This blows me away. How heavy is it? How hot will it get? Imagine duct-taping your iPhone to your forehead. Is that what this will be like?

What apps will we use on it? 

Is this the way we will watch movies? Can we sync-start a film, or do Dena and I just watch separate shows?

Is this the way we work? It offers virtually unlimited desktop space and virtual monitors. That is perfect for me. I live in a cluttered office. Will my work environment mimic that?

Will it let me focus more? If I work with it on, and it shuts off the real world, can I focus?

Will this be a great way to Zoom? Or are all the cool features for FaceTime only?

Will I be able to wear it for hours and hours of focused work? I read that the battery only lasts 2 hours. Maybe that’s a good stopping point. 

It’s been weird developing for one without actually having one or knowing what the actual experience will be like. 

You know I’ll be up at 4:30 am on the 19th, like a fanboy trying to buy season tickets to my college football team, ready when the store opens at 5:00 am. It’s not whether I’ll buy one; it’s how many I will buy.

I’m sure I have that old Apple credit card around here somewhere. 
