Apple Vision Pro: Revolutionizing Spatial Computing and Digital Interactions
Apple recently unveiled one of its most highly anticipated products ever: the Apple Vision Pro, which introduces a new way to interact with digital content that the company calls spatial computing. Instead of tapping or clicking on a display, users look through a display, merging their digital and physical lives for the first time ever.
The product itself features a modular design held together by magnets. It's composed of a glass and aluminum enclosure, a breathable headband, and a soft fabric seal that blocks out light. The part housing most of the technology is the glass and aluminum enclosure. On top is a Digital Crown, similar to those found on the Apple Watch and AirPods Max, which dials the virtual environment in or out and re-centers any virtual screens when pressed.
There's also a shutter button used to quickly capture photos and video. Inside the device are 23 sensors, including four inward-facing IR cameras and LED illuminators that precisely track your eye movement. There are also two main cameras, four downward cameras, six microphones, two external IR illuminators, two side cameras, a LiDAR Scanner, and a TrueDepth camera to track your hands and environment. The information from these sensors is fed into a new R1 chip, which helps the system reduce lag by streaming new images to the display in less than 12 milliseconds, about eight times faster than the blink of an eye, preventing motion sickness for most users. There's also a computer-grade M2 chip, the same one found in the MacBook Air, which powers visionOS, the device's algorithms, and its graphics. The two internal displays responsible for delivering a photorealistic environment are each the size of a postage stamp, yet contain 11.5 million pixels.
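To put that 12-millisecond figure in perspective, the arithmetic behind the blink comparison is simple. A quick illustrative check (the roughly 100 ms blink duration is a common estimate, not an Apple figure):

```swift
// Rough latency arithmetic for the figures quoted above (illustrative only).
// A human blink is commonly estimated at around 100 ms; Apple's "eight times
// faster than a blink" framing is consistent with that ballpark.
let photonToPhotonMs = 12.0          // claimed R1 pipeline latency, in ms
let blinkMs = photonToPhotonMs * 8   // implied blink duration: 96 ms

print("Pipeline latency: \(photonToPhotonMs) ms")
print("Implied blink duration: \(blinkMs) ms (~0.1 s, a common estimate)")
```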
This is made possible by a new display technology called Micro-OLED, which fits 64 pixels in the space of a single iPhone pixel, for a total of 23 million pixels across both displays, more pixels per eye than a 4K TV. On top of each display is a three-element lens designed to magnify the panels and wrap them around you, resulting in a display that's everywhere you look.
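The pixel counts are easy to sanity-check. A short illustrative calculation using the figures quoted above:

```swift
// Sanity-checking the pixel counts quoted above (illustrative arithmetic).
let pixelsPerEye = 11_500_000        // per internal display
let totalPixels = pixelsPerEye * 2   // 23,000,000 across both displays

let fourKPixels = 3_840 * 2_160      // a 4K UHD TV: 8,294,400 pixels

print("Total pixels: \(totalPixels)")    // 23,000,000
print("4K TV pixels: \(fourKPixels)")    // 8,294,400
print("Each eye vs. 4K: \(Double(pixelsPerEye) / Double(fourKPixels))x")
// Each 11.5-megapixel panel carries roughly 1.4x the pixels of a 4K TV,
// and 64 pixels per iPhone pixel implies an 8x8 grid in the same area.
```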
For those who need prescription lenses, Apple is partnering with ZEISS to deliver custom inserts. Since Apple arranged all the sensors around the perimeter of the device, there was space in the center for an outward-facing display, allowing Apple to solve one of the biggest problems with computational headsets: isolation.
Instead of being disconnected from the people around you, Vision Pro signals where your attention is directed. If you're in an immersive environment, onlookers see an abstract graphic on the outward display. But if your attention turns to someone in your real environment, the graphic fades out to reveal your eyes, shown as a live feed of your eye position, movement, and blinks. This allows natural, spontaneous interactions that aren't possible on any other headset. It's accomplished with a curved lenticular display, the first of its kind, which is essentially a digital version of a lenticular print: it reveals different images depending on the viewing angle, so multiple people looking from different angles are all shown the correct perspective of your face.
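To make the lenticular idea concrete, here is a minimal toy model: several pre-rendered perspectives sit behind a lens array, and the viewer's angle determines which one is visible. All names and numbers below are illustrative stand-ins, not Apple's implementation.

```swift
// Toy model of a lenticular display: several pre-rendered perspectives sit
// behind a lens array, and the viewing angle determines which one you see.
// Names and numbers here are illustrative, not Apple's implementation.
struct LenticularPanel {
    let views: [String]          // pre-rendered perspectives, left to right
    let fieldOfViewDegrees: Double

    // Map a viewer's horizontal angle (0 = dead center) to a rendered view.
    func view(forAngle degrees: Double) -> String {
        let half = fieldOfViewDegrees / 2
        let clamped = min(max(degrees, -half), half)
        // Normalize the angle to 0...1, then pick the matching view slot.
        let t = (clamped + half) / fieldOfViewDegrees
        let index = min(Int(t * Double(views.count)), views.count - 1)
        return views[index]
    }
}

let panel = LenticularPanel(
    views: ["face-from-left", "face-from-center", "face-from-right"],
    fieldOfViewDegrees: 90
)
print(panel.view(forAngle: -30))  // face-from-left
print(panel.view(forAngle: 0))    // face-from-center
print(panel.view(forAngle: 35))   // face-from-right
```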
The straps attached to the enclosure are made from a flexible polyurethane similar to an Apple Watch band, and each contains a built-in audio pod. The dual drivers are equipped with a new spatial audio system that uses audio ray tracing to analyze the features and materials of your room, allowing sound to be placed in your surroundings more accurately than ever. On the Head Band, you'll find the Fit Dial, which tightens or loosens the device.
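Audio ray tracing builds on a classic acoustics idea: tracing reflection paths off room surfaces to work out when and where reflected sound arrives. Below is a minimal sketch of a first-order reflection using the image-source method, with made-up geometry; it illustrates the principle only, not Apple's system.

```swift
// Toy first-order reflection, the basic building block of acoustic ray
// tracing: mirror the source across a wall and compute the extra delay the
// reflected path adds. Purely illustrative; not Apple's implementation.
import Foundation

let speedOfSound = 343.0             // m/s, in air at room temperature

func delaySeconds(distance: Double) -> Double { distance / speedOfSound }

// Listener 2 m from a virtual source, with a wall 3 m behind the listener.
let directPath = 2.0
let reflectedPath = 2.0 + 2 * 3.0    // image-source: path folds at the wall

let extraDelayMs = (delaySeconds(distance: reflectedPath)
                  - delaySeconds(distance: directPath)) * 1000
print(String(format: "Reflection arrives %.1f ms after the direct sound",
             extraDelayMs))          // ~17.5 ms
```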
"WEBVTTKind: captionsLanguage: enApple recently revealed one of their mosthighly-anticipated products ever.The Apple Vision Pro, which introduces a newway to interact with digital content thatthe company calls spacial computing.Instead of tapping or clicking on a display,users look through a display.Merging their digital and physical life forthe first time ever.The product itself features a modular design,held together by magnets.It’s comprised of a glass and aluminum enclosure,a breathable head band, and a soft fabricseal to block out light.The part housing most of the technology isthe glass and aluminum enclosure.On top, is a digital crown similar to thosefound on Apple Watch and AirPods Max, whichdials the virtual environment in or out, andre-centers any virtual screens when pressed.There’s also a shutter button which is usedto quickly take photos or video.Inside the device are 23 sensors.Including four inward IR cameras and inwardLED illuminators to precisely track your eyemovement.And two main cameras, four downward cameras,six microphones, two external IR illuminators,two side cameras, a LiDAR Scanner, and a TrueDepthcamera to track your hands and environment.The information from these sensors is fedinto a new R1 chip.Which helps the system reduce lag by streamingnew images to the display in less than 12milliseconds.Or eight times faster than the blink of aneye.Preventing motion sickness for most users.There’s also a computer-grade M2 chip, thesame one found in MacBook Air, which powersthe Vision operating system, algorithms, andgraphics.The two internal displays responsible fordelivering a photorealistic environment arethe size of a postage stamp.Containing 11.5 million pixels each.This is made possible by a new display technologycalled Micro-OLED.Which fits 64 pixels in the space of a singleiPhone pixel.For a total of 23 million pixels across bothdisplays.Which more than a 4K TV for each eye.On top of each display is a three elementlens designed to magnify the panels and wrapthem around you.Resulting in a display that’s everywhereyou look.For those who need prescription lenses, Appleis partnering with ZIESS to deliver custominserts.Since Apple arranged all the sensors aroundthe perimeter of the device, there was spacein the center for an outward-facing display.Allowing Apple to solve one of the biggestproblems with computational headsets; isolation.Instead of being disconnected from those aroundyou, Vision Pro shares where your attention’sat.If you’re in an immersive environment, onlookerswill see this graphic.But if your attention turns to someone inyour real environment, the graphic will fadeout to reveal your eyes.Displaying a live feed of your eye position,movement, and blinks.Allowing for natural, spontaneous interactionsnot possible on any other headset.This is accomplished with a curved lenticulardisplay, the first of its kind.Which is essentially a digital version ofa lenticular print that reveals differentimages depending on the viewing angle.So multiple people looking at different anglesare all shown the correct perspective of yourface.The straps attached to the enclosure are madefrom a similar flexible polyurethane as anApple Watch band, and contain a built-in audiopod on either side.The dual drivers are equip with a new spacialaudio system, which utilizes audio raytracingto analyze the features and materials of yourroom, allowing sound to be placed in yoursurroundings more accurately than ever.On the Head Band, you’ll find the Fit Dialwhich tightens or loosens the 
But if you'd prefer to be somewhere else, just choose an immersive environment and turn the Digital Crown to dial in your new space. Here you can use familiar apps like Safari, Messages, and Notes, and if someone approaches, you'll see them subtly fade into view. You can type with a virtual keyboard or connect a physical Bluetooth keyboard and trackpad. And if you have a MacBook, just open it and watch its contents seamlessly transition into Apple Vision Pro, giving you a huge, private, portable 4K display.

But what about using FaceTime when half your face is covered? Apple created a solution called Personas: by scanning your face, a machine learning algorithm creates a 3D model of you and displays it to the other people on the call. In the Photos app, selecting a photo or video automatically dims the room, and panoramas taken on iPhone can expand and wrap around you, giving you the same life-size experience as when they were captured. Perhaps one of the most compelling use cases of Apple Vision Pro is watching 3D movies, since you can play them on theater-sized screens and experience 3D effects in a way never possible before.

All of these technologies and capabilities come at a cost of $3,500, and the device will be available in the US only starting in early 2024. That's a high price, but it needs context. The original Macintosh, released in 1984, cost $2,500, which is about $7,300 in today's dollars, twice as much as Apple Vision Pro. And the jump in technology from the Apple Lisa to the Macintosh was arguably not as dramatic as the jump from today's Mac to Apple Vision Pro. Also consider the Varjo XR-3 headset at $6,500, which doesn't even have the technology and hardware to match Apple Vision Pro's performance.
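As a final sanity check on that price comparison, the arithmetic works out as follows (illustrative only; the $7,300 figure is the article's own inflation adjustment, not computed here):

```swift
// Checking the price comparison above (illustrative arithmetic; the $7,300
// figure is the article's inflation adjustment, not computed here).
let macintosh1984 = 2_500.0
let macintoshToday = 7_300.0          // 1984 dollars adjusted to today
let visionPro = 3_500.0

print(macintoshToday / visionPro)     // ~2.09 -> "twice as much"
print(macintoshToday / macintosh1984) // ~2.9x cumulative inflation multiplier
```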