Apple Glass Concept Shows Us All the AR Goodness We Need
It’s also a good way to see how these glasses would work. The device is called Apple Glass here and looks very much like a pair of regular glasses. It has an advanced LiDAR scanner on the left side of the frame, plus a light sensor, microphone and proximity sensor on the other side (top and front). In the middle, right above the nose, sits the dual-core G1 CPU, and the lenses are dual transparent Retina screens. The designer envisions a so-called rOS, inspired by iOS.
There are Home Cards available, basically a sort of widget that shows useful info. You can navigate with air swipes, and the interface adapts to the ambient light around you. The video shows AR navigation for map orientation, plus real-world object detection, of iPhones for example. Calls are taken with the aid of AirPods Pro. We don’t get specs, materials used for the lenses and frames, battery capacity, resolution and all that. It’s more of a proof of concept… which was done in PowerPoint, by the way.
I actually dig the interface, but I’m not so sure about swiping in mid-air to navigate. Swiping on the frame is also an option, but it sounds equally annoying. Voice commands are probably the way to go, or maybe even eye movements/gestures. Also, remember the theory that the Apple Glasses will be bundled with a smart ring you’ll wear on your finger to control the UI. I would love to see THAT rendered.
via wccftech