
Microsoft's HoloLens 2: a $3,500 mixed reality headset for the factory



I am in a tiny room in a basement somewhere in Microsoft's Redmond, Washington headquarters, wearing an early version of the HoloLens 2 headset. In front of me is a very real ATV, which is missing a bolt. Not quite at the corner of my vision – but definitely off to the side – I see a glowing indicator pointing to a bucket of the right bolts.

Back at the ATV, a holographic set of instructions hovers above it, telling me what to do and pointing to the exact spot where the bolt needs to go. After a couple of minutes, I've successfully fixed the thing – guided by holograms.

This kind of demo is fast becoming commonplace for tech journalists like myself. But if you read the previous description closely, you'll find that there are three key pieces of technical innovation hidden in plain sight.

Here they are: I saw a hologram off to the side because the field of view in which holograms can appear is much larger than before. I bent down and did not worry about an awkward headset shifting around because it was better balanced on my head. I pushed a button just by pushing a button because I did not need to learn a complicated gesture to operate the HoloLens 2.

These three things may not seem all that remarkable to you, but that's precisely the point. Microsoft needed to make the HoloLens feel much more natural if it really plans to get people to use it, and it has.

There's one more unremarkably remarkable thing: even though it was just a demo, I was playing the part of a worker, because that's who the HoloLens 2 is exclusively designed for – workers, not consumers.


The Microsoft HoloLens 2 is available for preorder today for $3,500, and it's expected to ship later this year. However, Microsoft has decided that it is only going to sell to corporate customers who want to deploy the headset to their employees. As of right now, Microsoft is not even announcing a developer kit version of the HoloLens 2.

Compared to the HoloLens we first saw four years ago, the second version is better in almost every important way. It's more comfortable, it has a much larger field of view, and it's better able to detect real physical objects in the room. It features new components like the Azure Kinect sensor, an ARM processor, eye-tracking sensors, and a completely different display system.

It has a couple of speakers, the visor flips up, and it can see what your hands are doing more accurately than before. There's an 8-megapixel front-facing camera for video conferencing, it's capable of full six-degree-of-freedom tracking, and it uses USB-C to charge. It is, in short, chock-full of new technologies. But after four years, that should not be a surprise.

The biggest complaint about the first HoloLens was simple: you only saw the holograms in a relatively small box right in front of you. Turn your head even a little, and they would disappear from your field of view. Worse, their edges would clip out of existence even when you were staring right at them. It was like looking at a digital world through a tiny rectangle.

The HoloLens 2 has a field of view that's twice as large as before. It does not quite fill your entire field of vision – there's still clipping – but it's big enough now that you no longer feel constantly annoyed by the letterboxing. Microsoft says that each eye has the equivalent of a 2K display in front of it, but it's better to think of that as a metaphor than a precise spec. The exact spec is that it has a "holographic density of 47 pixels per degree," which means that the pixel density is high enough to allow you to read 8-point font.
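For a rough sense of how those numbers fit together, here is a back-of-the-envelope calculation. It assumes "2K" means roughly 2,000 horizontal pixels per eye, which is my assumption, not a published spec.

    # Rough sanity check: angular width = pixel count / pixel density.
    pixels_per_degree = 47        # Microsoft's stated holographic density
    horizontal_pixels = 2000      # assumed "2K" horizontal resolution per eye
    fov_degrees = horizontal_pixels / pixels_per_degree
    print(round(fov_degrees))     # ~43 degrees of horizontal field of view per eye

That squares with a display that is much wider than the original HoloLens but still far from filling your vision.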

Typically, when a tech product gets better specs like these, it happens through the sheer force of technical iteration: faster processors, bigger batteries, more RAM, and so on. But that strategy would not have worked for the display on the HoloLens 2. It needed to get lighter, not heavier.

Lasers and mirrors

Laser-based displays have become the thing to do for computers on your face. Intel's Vaunt project used lasers, and the North Focals smart glasses do, too. Although Microsoft is using some of the same basic components, it has taken them in a different direction and gone much further in developing what they can do.

The lasers in the HoloLens 2 shine into a set of mirrors, which oscillate as fast as 54,000 cycles per second so the reflected light can paint a display. These two pieces together form the basis of a microelectromechanical system (MEMS) display. This is a tricky thing to do, but the really tricky part for a MEMS display is getting the image that it paints into your eyeball.

One solution that companies like North have used is a holographic film on the lens to reflect the image directly into your retina. That has many drawbacks: a tiny display and low resolution, for two. But the really problematic part is simply getting the display aligned just right with your eye.

Microsoft did not want any of those problems, so it turned to the same thing it used in the first HoloLens: waveguides. They're the pieces of glass in front of your eyes that are carefully etched so that they can reflect the holograms. The waveguides on the HoloLens 2 are lighter now because Microsoft is using two sandwiched glass plates instead of three.

When you put the whole system together – the lasers, mirrors, and waveguides – you get a brighter display with a wider field of view that does not have to be precisely targeted to your eyes to work. Zulfi Alam, general manager for Optics Engineering at Microsoft, contends that Microsoft is way ahead with this system and that waveguides are definitely the way to go for mixed reality. "There's no competition for the next two or three years that can come close to this level of fidelity in waveguides," he argues.

Do you want a broader field of view? Simple. Just increase the angle of the mirrors that reflect the laser light. A wider angle means a bigger image.
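The geometry behind that claim is just the law of reflection: tilting a mirror by some angle swings the reflected beam by twice that angle. The numbers below are purely illustrative, not HoloLens specs.

    # A mirror tilted by theta deflects the reflected beam by 2*theta, so the
    # optical sweep is twice the mechanical scan amplitude.
    mirror_tilt_deg = 10.0                         # hypothetical mechanical tilt of +/-10 degrees
    optical_sweep_deg = 2 * (2 * mirror_tilt_deg)  # +/-20 degrees of beam sweep
    print(optical_sweep_deg)                       # 40.0: widen the tilt and the painted image widens with it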

Do you want brighter images? Simple again. Lasers, not to put too fine a point on it, have light to spare. Of course, you have to deal with the fact that waveguides lose a ton of light, but the displays I saw were set to 500 nits and looked pretty bright to me. Microsoft thinks it could be much brighter in the final version, depending on the power draw.

Do you want to see the holograms without getting specifically fitted for your headset? Simple yet again. The waveguide does not require any specific fitting or measurement. You can just put the headset on and get going. It can also sit far enough in front of your eyes to allow you to wear any glasses you need comfortably.

Simple, simple, simple, right? In truth, it's devilishly complex. Microsoft had to create a completely new etching system for the waveguides. It had to figure out how to direct light to the right place in the waveguides, almost photon by photon. "We are simulating every photon that comes from the laser," Alam says. The light from the lasers is not just reflected; it's split apart into multiple colors and through multiple "pupils" in the display system and then "reconstituted" into the right spot on the waveguides. "Each photon is calculated where it's expected to go," says Alam. That takes a ton of computing power, so Microsoft had to develop custom silicon to do all the calculations on where the photons would go.

And although alignment is much easier with the waveguide, that doesn't mean it's perfect. That's why there are two tiny cameras on the nose bridge, directed at your eyeballs. They allow the HoloLens 2 to automatically measure the distance between your pupils and adjust the image accordingly. Those cameras also allow the HoloLens 2 to vertically adjust the image if it gets tilted or if your eyes are not perfectly even. (They are not. Sorry.)
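As a minimal sketch of what that adjustment could look like, assume the nose-bridge cameras report an interpupillary distance (IPD) in millimeters; the helper below is illustrative, not Microsoft's actual code.

    # Offset each eye's virtual camera by half the measured IPD so stereo
    # holograms converge at the right depth for this particular wearer.
    def eye_camera_offsets(measured_ipd_mm: float) -> tuple[float, float]:
        half = measured_ipd_mm / 2.0
        return (-half, +half)              # left eye, right eye, in mm from the center line

    print(eye_camera_offsets(63.0))        # (-31.5, 31.5) for a roughly average adult IPD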

A kind of free benefit from those cameras is that they can also scan your eyes to log you into the HoloLens 2 securely. It runs Windows, after all, and therefore it supports Windows Hello.



A MEMS mirror under a high-speed camera.
GIF: Microsoft

Then there's power: lasers, oscillating mirrors, and custom chips to handle the computing for all of that must chew through battery. But Alam tells us that even with all of this, the system still manages to require less power than the alternatives. The mirrors oscillate in resonance, so it takes less energy to move them, sort of like they're the fastest metronomes ever. Lasers are also less lossy than LEDs, and custom silicon can be optimized for its specific task.

"Our evolution is towards a form factor that is truly a glasses," says Alam, "and all these are significant steps in this journey. "

All that tech is impressive for sure, but I do not want to oversell the image quality. What I was using was not a finished product. I did see a little halo around some of the holograms, and they sometimes jumped around a bit. Most of the features based on the nose bridge eye scanners had not been turned on yet, either. Still, compared to the first HoloLens, what I saw crossed over the line from "cool demo I'd use for 20 minutes and then be annoyed" to "I could see people using this for a few hours if the software was really useful."

But if you're going to use a headset for a few hours, it needs to be comfortable enough to leave on in the first place.


Alex Kipman, technical fellow – AI and Mixed Reality, Microsoft

Comfort zone

Here's how you put on the HoloLens 2: you put it on like a baseball cap, twist a knob on the back to tighten the headband, and then you'll start seeing holograms. The end.

It's much less fiddly than the last HoloLens or any other face-mounted display I've ever tried. Because of all the work on the display system, you can skip the extra "fuss with the position to make sure you can see the image" step. The body of the thing is simpler too. It's a single band that's held with minimal pressure on the back of your head and on your forehead. (There's an optional top strap if you need it.)

All of that is nice, but it's pointless if the headset is uncomfortable to wear. And although I never wore it for more than a 20-minute stint, I think it will hold up for longer periods.

Microsoft has a "human factors" lab where it loves to show off its collection of dummy human heads and high-speed cameras. Carl Ledbetter, senior designer for the Microsoft Device Design Team, walked me through the prototypes and materials Microsoft tried on the way to the final product. He explained how Microsoft experimented with different designs and materials, ultimately landing on carbon fiber to save weight.



"The reality is [we have to] fit kids, adults, men, women, and different ethnicities around the world." Everybody's head is different, "he says. Microsoft has a database of about 600 heads that tracks the shape of the cranium, the eye's depth, the size and relative position of the nose bridge, and other variations. Ledbetter's team added sensors to people's neck to measure muscle strain, to make sure the center of gravity was right.

The result is that the HoloLens 2 has a more forgiving and flexible fit. It simply does a better job of accommodating basic, physical human realities. You can flip the visor up so it's out of your field of view, letting you make eye contact without removing the headset. The memory foam pad that rests on your forehead is removable and cleanable, and the thermals have been completely redesigned so heat is piped away from your head.

All of that really helps, but the most important thing Microsoft did was move the center of gravity to right behind your ears instead of up by your eyes. The HoloLens 2 is not really much lighter than the original HoloLens. It feels lighter, though, because it's more naturally balanced on your head. That balance makes a huge difference. The weight of it is less noticeable, and it should put less strain on your neck.
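To see why balance matters more than raw weight, here is an illustrative torque comparison. The lever-arm distances are made up; only the idea (torque equals force times distance from the neck's pivot) is the point.

    # Same headset mass, different center of gravity: the closer the mass sits to
    # the head's pivot point, the less torque your neck has to fight.
    mass_kg = 0.566                    # the HoloLens 2 weighs roughly 566 g
    g = 9.81                           # gravitational acceleration, m/s^2
    front_heavy_lever_m = 0.08         # hypothetical CoG 8 cm in front of the pivot
    balanced_lever_m = 0.02            # hypothetical CoG 2 cm from the pivot
    print(mass_kg * g * front_heavy_lever_m)   # ~0.44 N*m of neck torque
    print(mass_kg * g * balanced_lever_m)      # ~0.11 N*m, about a quarter of the strain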

Ledbetter's team shifted that weight by literally moving the heaviest parts: the main processor and battery are now located in a module that sits on the back of the headset, with wires inside the headband running up to the display board and components in the front. That processor, by the way, is an ARM-based Qualcomm Snapdragon 850, and that's important because it addresses another basic human reality: we hate it when the battery dies, and we hate plugging stuff in. An ARM processor means the headset can have a smaller battery.

The original HoloLens ran on an Intel processor, and it ran Windows. Since then, Microsoft has done a ton of work to get Windows working well on ARM. Those efforts are slowly coming to fruition on laptops, but Intel is still the order of the day on those machines, where raw speed is usually more important to users than battery life. In general, there's a tension with Intel: it's not delivering the lower-power chips that mobile devices require. Intel reportedly even had to lobby Microsoft to keep the Surface Go on its chips.

So what about the HoloLens 2? Alex Kipman, the person responsible for the whole HoloLens project, says that "ARM rules in battery-operated devices," so the ARM decision was fairly easy. When I point out that there are plenty of Windows laptops running on batteries with Intel chips, he gets blunter:

"When it comes to battery life, it's hard to find a product that's not running ARM today. Intel does not even have an SoC [system on chip] right now for these types of products that run on battery. They had one, the previous version [of the HoloLens] had Cherry Trail, which they discontinued. That decision is a no-brainer."


For workers, not consumers

The HoloLens 2 is only being sold to corporations, not consumers. It's designed for what Kipman calls "first-line workers": people in auto shops, on factory floors, in operating rooms, and out in the field fixing stuff. It's designed for people who work with their hands and find it hard to integrate a computer or smartphone into their daily work. Kipman wants to replace the grease-stained Windows 2000 computer in the corner of the workroom. It's pretty much the same decision Google made for Google Glass.

"If you think about 7 billion people in the world, people like you and I-knowledge workers are by far the minority," he replies. To him, the workers who will use this are "maybe people who are fixing our jet propulsion engine. Maybe they are people who are in some retail space. Maybe they're the doctors who work for you in the operating room. "

He continues, saying that it's for "people who have been in some way neglected or have not had access to technology [in their hands-on jobs] because the PCs, tablets, and phones don't really lend themselves to those experiences."

Fair enough. It's completely in keeping with Microsoft's new focus on serving corporate and enterprise needs instead of trying to crank out hit consumer products. That was one of my takeaways when I interviewed CEO Satya Nadella last year, and it holds true today. As I wrote then, it's "a different kind of Microsoft than what we're used to thinking of. It's a little less flashy, yes, but it has the benefit of being much more likely to succeed."

Furthermore, Kipman argues, even the HoloLens 2 is not good enough to be a real mass-market consumer technology product.

Why is it not a consumer product? "This is the best, high watermark of what can be achieved in mixed reality, and I'm here to tell you that it's still not a consumer product," he says, then continues:

"It's not as immersive as you want it to be. It's more than twice as immersive as the previous one, [but it's] still not immersive enough for that consumer off the street to go use it. It's still not comfortable enough … I would say that until these things are way more immersive than the most immersive product, way more comfortable than the most comfortable product, and at or under $1,000, I think people are kidding themselves thinking that these products are ready."

Kipman says that Microsoft has not participated in the consumer hype cycle for these types of products. "We were not the company that hyped VR. We are certainly not the company that hyped AR. And since we've merged the two into the mixed reality and AI efforts, we have not hyped either."

This is not exactly true. We've seen a lot of demos from Microsoft showing off games – including Minecraft – and other consumer applications for the HoloLens. So this move to the enterprise market is absolutely a pivot.

But it's a pivot that's part and parcel with Microsoft's larger corporate strategy. And just because it is no longer positioned as a consumer product does not mean that it's not a significant product – one that Microsoft appears to be committed to and develops software for.


A better interface on your face

The first HoloLens required users to learn awkward gestures with names like "Air Tap" and "Bloom." You had to make these very specific hand gestures because that's all the first HoloLens' sensors could detect and understand.

The HoloLens 2 can detect and understand much more because of a new array of sensors for reading the room called the Azure Kinect. "Kinect" because that's the brand for Microsoft's cameras that can scan rooms; "Azure" because it seems that everything the company does today is somehow connected to its cloud service, and as a further sign that this is a business product, not an Xbox add-on.

"HoloLens 1 is just one big mesh. It's like dropping a blanket over the real world, "Kipman says. "With HoloLens 2, we go from spatial mapping to semantic understanding of spaces.

I cannot speak to how well the Azure Kinect is actually able to identify objects – Microsoft did not demo any of that for us – but it theoretically works because the Azure Kinect sees the room at a higher resolution and because it is hooked up to cloud services that help it figure out what things are.

There's one aspect where I can definitively say the higher fidelity is real: the headset is able to identify my hands and what they are doing much more easily. It can track up to 25 points of articulation on both hands in space, which means you should not need to use the Air Tap gesture to interact with holograms anymore.



Resizing a hologram with natural gestures. Footage does not show the actual field of view.
Image: Microsoft

In one demo, I was pacing around a room looking at the various holograms that were set up on tables. As I reached my hands in, a box appeared around each one with little grab handles on the edges and corners. I could just reach in and grab the whole box to move the hologram around. I could grab one edge to rotate it, or two to resize it. When there was a button, I could stick my finger out and push it. I doubt that it's accurate enough to, say, type on a virtual QWERTY keyboard, but it's a big step up over the first generation nonetheless.
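A minimal sketch of the kind of logic that articulated hand tracking enables: treat the hand as "grabbing" when the thumb and index fingertips nearly touch. The joint names, coordinates, and threshold here are illustrative; this is not the HoloLens API.

    import math

    def is_pinching(joints: dict, threshold_m: float = 0.02) -> bool:
        # A simple grab test: thumb tip and index tip within ~2 cm of each other.
        return math.dist(joints["thumb_tip"], joints["index_tip"]) < threshold_m

    # Hypothetical tracked fingertip positions in meters (x, y, z).
    joints = {"thumb_tip": (0.10, 0.02, 0.50), "index_tip": (0.11, 0.02, 0.50)}
    print(is_pinching(joints))   # True: the fingertips are about 1 cm apart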

Eye tracking also comes into play in how you interact with holograms. The HoloLens 2 can detect where you're looking and use that information as a kind of user interface. There were demos where I just stared at a little bubble to make it pop into holographic fireworks, but the most useful one was an auto-scroller: the further down the page I looked, the faster the words scrolled, and it stopped when I looked back up.
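A hedged sketch of how that auto-scroller might work: map how far the gaze sits below the middle of the text panel to a scroll speed, and stop when the gaze moves back up. The coordinate convention, units, and speed cap are all assumptions for illustration.

    def scroll_speed(gaze_y: float, panel_top: float, panel_bottom: float,
                     max_speed: float = 200.0) -> float:
        # y grows downward here; looking at or above the panel's midpoint stops scrolling.
        midpoint = (panel_top + panel_bottom) / 2.0
        if gaze_y <= midpoint:
            return 0.0
        depth = (gaze_y - midpoint) / (panel_bottom - midpoint)
        return max_speed * min(depth, 1.0)   # scroll faster the lower you look, up to a cap

    print(scroll_speed(gaze_y=0.9, panel_top=0.0, panel_bottom=1.0))   # 160.0 (pixels per second)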

I did not see the full top-level user interface, so I do not know if that's changing. But one thing absolutely is not: it still runs Windows. It utilizes the shared code in Windows OneCore, which means you will not get a traditional Windows desktop shell, but you will be able to run any Universal Windows app on it.

Chaitanya Sareen, principal group program manager for Microsoft Mixed Reality, explains that the team is trying to make interacting with holograms feel natural. Sareen calls this "instinctual interaction" as opposed to "intuitive," because it can piggyback off what we already do with real objects in the world. "Is anyone born saying 'There's going to be a close button [in the upper corner of a window]'? No," he says. "A lot of interfaces we use are learned."

Sareen is still thinking through some of the details of what the user interface will be, but the goal is to use many of the natural gestures you picked up as a toddler instead of making you learn a whole new interface language.

Microsoft is also making new software tools available to developers. One of the most important, Dynamics 365 Guides, will be a mixed reality app with templates for creating instructions for repairing real-world things like that ATV. Other tools depend on Microsoft's cloud services. One is Azure Remote Rendering, which lets the HoloLens offload some of its rendering work to the cloud. It exists because the HoloLens 2 can only store and render a limited amount of detail locally for something like a 3D model of an engine. With Remote Rendering, some of that detail can come in real time from the cloud, so it can display potentially limitless levels of detail, allowing you to model and interact with the smallest parts of a holographic machine.
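Conceptually, the decision that remote rendering automates looks something like the sketch below: render on-device when a model fits a local budget, otherwise stream the heavy detail from a cloud GPU. The triangle budget and function are purely illustrative, not the Azure Remote Rendering API.

    LOCAL_TRIANGLE_BUDGET = 500_000    # assumed on-device ceiling, for illustration only

    def plan_render(model_triangles: int) -> str:
        # Small models stay local; detailed CAD models get rendered remotely and streamed.
        if model_triangles <= LOCAL_TRIANGLE_BUDGET:
            return "render locally on the headset"
        return "render in the cloud and stream the frames to the headset"

    print(plan_render(80_000_000))     # a full engine model far exceeds the local budget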

Finally, there is Azure Spatial Anchors, which lets you pin holograms to real places in the world. At the basic level, it's not all that different from what Apple and Google are already doing in augmented reality: letting multiple devices see and interact with the same virtual object. Microsoft's ambitions are much grander, though: it wants to create the infrastructure for a "global scale" set of holograms, and it's building tools that let developers use that infrastructure across platforms, including iOS and Android.

Solving that requires more than just GPS location and object recognition. Kipman talks a lot about distinguishing between identically boring conference rooms that sit in the same spot on different floors. Tracking objects in space using optics is famously difficult: walk in a circle around a building, and your position will drift, so the computer will not put your ending point at your starting point. Microsoft is a little fuzzy about how far it has really gotten in solving these problems, but it's actively working on them.
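A back-of-the-envelope illustration of why that drift matters: even a small relative tracking error compounds over distance. Both numbers below are invented for the example.

    walked_distance_m = 400.0      # one lap around a large building
    relative_drift = 0.01          # hypothetical 1 percent accumulated tracking error
    print(walked_distance_m * relative_drift)   # 4.0 m gap between where you started and where tracking thinks you are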


Alex Kipman believes we are at the precipice of the "third era of computing." First came PCs with their open architectures, second came phones with their walled-garden app stores, and now he hopes mixed reality headsets will swing the pendulum back toward openness, because Microsoft intends to keep the HoloLens open. The HoloLens works with Microsoft's cloud services, but it also works with other ecosystems. Kipman says the HoloLens and Azure are "loosely coupled, but tightly aligned."

I could quibble with his summary of the history of computing and point out that there is also quite a history of underdogs calling for openness, but the bigger point stands: Microsoft thinks that mixed reality is going to be a big deal.

Understanding Microsoft's plans now requires wading through a lot more corporate jargon than it used to. With the HoloLens 2 specifically, expect a lot of discussion about "time-to-value" (how fast a user can do something useful after getting a device from an employer) and the "intelligent edge" (devices with their own computing power that are still connected to the cloud).

There is a cognitive dissonance for regular consumers in all of that talk. Kipman's protestations to the contrary, there is plenty of hype around the HoloLens 2. It's just directed at corporations now. Some of it is well-deserved. I think that the HoloLens 2 is a technical marvel. Just because it is not being sold as a consumer device does not mean that it is not a very important piece of technology, something that could change our concept of what a computer should look like.

But we're used to consumer electronics companies doing their best to put such technical marvels on store shelves, translating that hype into gadgets in our pockets and on our heads.

For the HoloLens 2, the hype is not about personal technology. It's just business.


