I have just returned home to Spain from Microsoft Build 2015 in San Francisco, so now is a good time to digest all the announcements from the conference, the private sessions and the focus groups. Microsoft Build is a very intense event: two and a half days packed with sessions, usually with about 8 running in parallel every hour.
This was the third Build in a row I have been able to attend. There was plenty of interesting news, like Project Astoria and Project Islandwood, which allow developers to port their Java and Objective-C apps to Windows 10.
But there was a king of the hill: HoloLens, Microsoft's awesome new augmented reality headset. Some of us went to Build hoping for a chance to return home with a HoloLens. And at one point during the keynote,
Alex Kipman, the mind behind Kinect and HoloLens, said they had a hundred devices there at Build. The audience went wild instantly and started screaming, and then Alex clarified that the units were only for testing during Build, not to give away to every attendee. You can see and hear the exact moment in this
keynote video (2:51:37).
They really did bring a hundred HoloLenses to Build. You could register on a website to be one of the few people to get hands-on with them and develop with the SDK in the "Holographic Academy".
In groups of 60, with one mentor for every two attendees and a PC and a HoloLens for each of us, they took us into a room, had us lock away our electronic devices, and let us play with a HoloLens for three hours, using Unity and Visual Studio and testing our code directly on the new device.
Best of all, although they didn't let us take photos or videos, at the end they told us we had not signed any NDA: we could, and were even encouraged to, talk about everything we did in the Holographic Academy and share our impressions of HoloLens. So here we go!
During the three hours of the Holographic Academy, the HoloLens team guided us through a series of simple tutorials designed to cover all the HoloLens capabilities while building a 3D environment with Unity and Visual Studio:
- Initial setup and deployment.
- Hand interaction.
- Head interaction.
- Voice interaction.
- Positional sound.
- Real-world collisions.
Initial setup and deployment
When you sit down to work with a device like HoloLens, packed with power and innovation and still in "preview" mode, you expect problems; the experience could easily be anything but simple. This time, though, it was simple, easy and quite straightforward. The procedure to deploy a Unity scene to HoloLens was extremely simple, even for someone with as little Unity experience as myself.
First of all, we needed to configure the pupillary distance in the HoloLens hardware. This is a common step with VR headsets to get the best image and avoid dizziness, vertigo or nausea after using the device for a while. The pupillary distance is simply the distance in millimeters between the centers of your pupils.
Deployment itself was also easy: you build the scene in Unity, open the generated project in Visual Studio and connect the HoloLens to the PC over USB. Then, in the Visual Studio Debug menu, select "Start without debugging" and that's it. Disconnect the USB cable, put the HoloLens on your head, and you can see the scene you created.
As a curiosity, every time you finish using the HoloLens you need to go to the local web server it exposes (127.0.0.1:100800) and kill the app from the app management tab. I suppose this is something they will improve, but that is how it works right now.
Hand interaction
You can interact with the virtual world with your hand, using a simple click gesture: raise your index finger perpendicular to your thumb, forming a reversed L with your right hand, then bring the index finger down to touch the thumb to indicate a click. They call this gesture the "air tap", and it is all we need to interact with the virtual world: select objects, drag and drop them, or trigger actions on them.
We built a demo consisting of a notepad with two spheres floating above it: you looked at a sphere and made an air tap on it, and it dropped, rolled off the notepad and fell away into infinity (we had not yet added recognition of the real-world surfaces).
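We could not keep our code, but here is a rough sketch of what that demo could look like in Unity C#. I am assuming the GestureRecognizer/TappedEvent API that later shipped in public Unity HoloLens builds; the namespace and event signature may differ from what we actually used at the Academy, and the class name is mine. The idea is simply that an air tap enables gravity on whatever the head ray is hitting.

```csharp
using UnityEngine;
using UnityEngine.VR.WSA.Input;   // holographic input namespace in the Unity builds of that era

public class AirTapDropper : MonoBehaviour
{
    private GestureRecognizer recognizer;

    void Start()
    {
        recognizer = new GestureRecognizer();
        recognizer.SetRecognizableGestures(GestureSettings.Tap);
        recognizer.TappedEvent += OnAirTap;
        recognizer.StartCapturingGestures();
    }

    // The tap event hands us the source kind, the tap count and the head ray.
    private void OnAirTap(InteractionSourceKind source, int tapCount, Ray headRay)
    {
        RaycastHit hit;
        if (Physics.Raycast(headRay, out hit))
        {
            Rigidbody body = hit.collider.GetComponent<Rigidbody>();
            if (body != null)
            {
                // The "drop" is just physics: let gravity pull the sphere off the notepad.
                body.isKinematic = false;
                body.useGravity = true;
            }
        }
    }

    void OnDestroy()
    {
        recognizer.TappedEvent -= OnAirTap;
        recognizer.StopCapturingGestures();
    }
}
```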
Head interaction
Not with the eyes directly: at first I thought HoloLens would track the eyes, but it actually tracks the head. It automatically calculates the direction vector of where your head is pointing and exposes APIs to read those values and check which elements of the virtual world the vector collides with. That let us place an object (a small circle) right at the point where our head vector hit the virtual object, acting as a pointer that made selecting objects easier. This interaction is called "gaze".
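Implementing a gaze cursor boils down to a physics raycast from the head. This is just a minimal sketch in plain Unity C# (the class and field names are mine, not from the tutorial): the main camera represents the head, and the cursor object is placed wherever the ray hits.

```csharp
using UnityEngine;

public class GazeCursor : MonoBehaviour
{
    public GameObject cursor;           // the small circle used as a pointer
    public float maxGazeDistance = 10f;

    void Update()
    {
        // The head position and look direction are exposed through the main camera.
        Transform head = Camera.main.transform;

        RaycastHit hit;
        if (Physics.Raycast(head.position, head.forward, out hit, maxGazeDistance))
        {
            // Place the cursor on the hologram we are looking at, oriented to its surface.
            cursor.SetActive(true);
            cursor.transform.position = hit.point;
            cursor.transform.rotation = Quaternion.LookRotation(hit.normal);
        }
        else
        {
            cursor.SetActive(false);
        }
    }
}
```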
Voice interaction
Last, but not least, comes what is in my opinion one of the most powerful interactions of all. As with the other two, its implementation is almost suspiciously simple: a single line of code ties a phrase to the execution of a command on an object. In my case I defined the following two phrases (a sketch follows the list):
- "I Love Holograms" that executed a command to reset the virtual world to start from scratch.
- "Drop Sphere" that ran a previously defined command DropSphere on the sphere that had selected with the view.
In a room with soft background music and about 80 people between participants and tutors, the voice recognition system did not fail even once, and that was with my English... well, I do not exactly have a native accent. I have to say that, unlike the gestures, speech recognition had a small lag of a couple of seconds between saying the command and HoloLens executing the associated action. Another point to polish a little more for the final version.
Positional sound
When talking about HoloLens, people always mention its ability to display holograms in the real world. But what about sound? Well, believe me, it is at least as spectacular as the graphics capabilities.
The HoloLens has two small red speakers that never cover your ears; they sit just above them and seem to use a vibration effect on the cartilage that transmits the sound perfectly. If you take the HoloLens off and hold the speakers close, you can hardly hear anything; put it back on and the sound comes through perfectly, as long as you are in the proper position. I say proper position because the environmental sound can be defined in Unity as coming from a focal point or as emitted by an object, and in either case the sound (its volume and its perceived position) varies depending on where we are. To try this out we built a little game of hide and seek in Unity, something like Marco Polo: an object plays a song at its position, gets randomly repositioned around us, and we had to find it using only the sound. Once found, its position changed randomly again. It was really amazing how well it worked.
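In Unity terms, this kind of positional sound is basically a fully 3D AudioSource handed to the spatializer. A minimal sketch of the hide-and-seek object might look like this, assuming the HRTF spatializer plugin is enabled in the project's audio settings (the class and method names here are mine, not from the tutorial):

```csharp
using UnityEngine;

// Attach to the hidden object: it plays a looping song from its position
// and jumps to a new random spot once the player finds it.
[RequireComponent(typeof(AudioSource))]
public class HideAndSeekSound : MonoBehaviour
{
    public float radius = 3f;   // how far around the player the object can hide
    private AudioSource source;

    void Start()
    {
        source = GetComponent<AudioSource>();
        source.spatialBlend = 1.0f;   // fully 3D: volume and direction depend on our position
        source.spatialize = true;     // hand the source to the spatializer plugin
        source.loop = true;
        source.Play();
        Relocate();
    }

    // Call this when the player finds the object (e.g. gazes at it and air taps).
    public void OnFound()
    {
        Relocate();
    }

    private void Relocate()
    {
        // Pick a random point on a circle around the player's head, at eye height.
        Vector2 p = Random.insideUnitCircle * radius;
        Vector3 head = Camera.main.transform.position;
        transform.position = new Vector3(head.x + p.x, head.y, head.z + p.y);
    }
}
```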
Real-world collisions
So far, everything sounds awesome. But when the spheres in the notepad demo fell to the ground, they disappeared without colliding with any real objects. To fix that, we only needed to add an object from the HoloLens kit to our Unity scene. This object let us render the mesh of the real world as a wireframe, render it opaque, or hide it, and once it was in place all objects automatically collided with the real world. What's more, the mesh is not static: it is constantly rebuilt, so you can throw an object in one direction, place an obstacle in its way (a chair, for example), and the virtual object will collide with the chair. It was really impressive; you can immediately start imagining a hundred games mixing holograms and reality.
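With the kit's spatial-mapping object providing colliders for the room mesh, throwing things at the real world needs nothing beyond ordinary Unity physics. A small sketch of the idea (the names are mine, not from the kit):

```csharp
using UnityEngine;

// Assuming the kit's spatial-mapping object is in the scene and provides colliders
// for the room mesh, any normal Rigidbody will bounce off real walls, floors and chairs.
public class BallThrower : MonoBehaviour
{
    public GameObject ballPrefab;   // a sphere prefab with a Rigidbody and a SphereCollider
    public float throwForce = 5f;

    // Call this from an air-tap or voice command handler.
    public void Throw()
    {
        Transform head = Camera.main.transform;

        // Spawn the ball half a meter in front of the head and push it forward.
        GameObject ball = (GameObject)Instantiate(
            ballPrefab, head.position + head.forward * 0.5f, Quaternion.identity);
        ball.GetComponent<Rigidbody>().AddForce(head.forward * throwForce, ForceMode.Impulse);
    }
}
```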
The Good parts
Three hours were enough for 7 very simple demos covering all aspects of HoloLens: gestures, head tracking, voice, collisions and sound. What surprised me most was how easy it was to develop with all this technology. Many things, like collisions with the real world™, positional sound, or creating a scene and making it visible, HoloLens gives to you automagically™.
The HoloLens team decided to map Unity units 1:1 to the real world, with 1 unit being 1 meter, so creating and positioning objects in the real world felt very intuitive.
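Thanks to that 1:1 mapping, "two meters in front of the user" is literally the number 2 in code. A trivial sketch (the class name is mine):

```csharp
using UnityEngine;

// Places this hologram two real-world meters in front of the user at startup.
public class PlaceTwoMetersAway : MonoBehaviour
{
    void Start()
    {
        Transform head = Camera.main.transform;
        transform.position = head.position + head.forward * 2.0f;  // 2 units = 2 meters
    }
}
```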
Microsoft's organization was also great. We were summoned to the 5th floor of a hotel about 5 minutes from the Moscone Center and asked to leave all our electronics, including our Band, in a locker. They lined us up in pairs, walked us into the room and introduced us to our mentor, with whom we spent the following hours; he even ate with us and took care of everything we needed. All of this surrounded by big security guards: I counted 3 in the main room and another 2 by the elevators. Each time we finished an example we were awarded an achievement and everyone applauded. Gamified, friendly and personal: a great way to make us feel comfortable and enjoy the experience.
What needs improvement so far
I am not saying "bad", and not because I am a fanboy, but because I do not think anything could really be considered bad. The development experience needs a little polish, so that we no longer have to stop apps manually before deploying a new one, and so that we can debug our code.
They also need to reduce the latency of the voice commands a little, if possible. Right now it has the same latency as Cortana, so it is not a showstopper in itself; it's just that the other gestures are so immediate that the delay in voice is noticeable, where I expected instantaneous results.
Viewport size. Picture a 24" monitor with you sitting about 40 cm in front of it: that is roughly the area HoloLens renders. Perhaps it is a hardware constraint, perhaps performance; the truth is I don't know, and we were told it is something they are working on. If you get close to an object, the holograms get cut off at the edges of the viewport, which somewhat spoils the experience of virtual objects blending with real-world ones.
The head strap is still a bit weak and gives the impression that if you turn your head or lean over too much, the HoloLens could fall off. Improving it would be welcome, because you often tend to lean in to look behind an object and have to hold back for fear of dropping the HoloLens.
The future
The future is amazing. I know because, from the minute I put the HoloLens on, I knew I was already in it. The HoloLens is something new and surprising, and the professional applications in medicine, mechanics, sports or leisure are so easy to imagine that I personally believe it will revolutionize the world. Now we can only hope Microsoft decides it wants our money and creates an early adopters program so developers can get to work with it.
Here are some photos of a HoloLens they had on display in a showcase and let us take pictures with, selfies included, to prove that yes, I was there, and that this was not just a science fiction dream, even though it feels like one!
And here is a short official video that Microsoft posted on YouTube; at second 42 you can see the table where I worked. At that moment I must have been playing with the HoloLens down the hallway, LOL.