Dress for Success: Talking Headsets and Haptic Suits with Skarred Ghost Antony Vitillo

XR for Business | July 22, 2019 | 00:48:22

Show Notes

From headsets to haptic suits, there are going to be a lot of accessories and apparel in XR to choose from, including some that expand senses you didn’t even know were XR-compatible. Antony Vitillo – AKA Skarred Ghost – drops in to discuss different devices with Alan, their use cases, and what companies should consider when they go shopping for some.

Alan: Today’s guest is Antony Vitillo, better known as Skarred Ghost. Antony is an XR consultant and author of an amazing blog called The Ghost Howls. He also runs a consulting company called New Technology Walkers, where they develop VR solutions and advise companies about how best to use VR and AR. Antony recently traveled to the fourth annual VIVE Ecosystem Conference — VEC — in Shenzhen, China. If you’re not already following Tony, you can learn a lot by connecting with him on LinkedIn and subscribing to his newsletter at skarredghost.com. Tony, welcome to the show.

Antony: Hello, Alan! Thanks for this opportunity.

Alan: It’s so great to have you on the show. I’ve had the wonderful opportunity to speak with you many times, and we are both very, very passionate about virtual and augmented reality. I want to just thank you for taking the time to be on the show.

Antony: I’m very happy to be with you. I’m happy to speak to you live, after so many messages we’ve exchanged on LinkedIn. Super happy to be here.

Alan: Let’s just dive right in, and we’re going to try to bring as much value as we can to the listeners today. We have a lot to go through; we’re going to go through all of the different hardware aspects involved in Virtual/Mixed/Augmented Reality — XR — and it’s not just the headsets or headphones. You’ve got things like haptic suits, haptic gloves. You’ve got touch-sensitive stimulators. You’ve got VR headsets, AR headsets. You’ve got mobile phone-based AR, eye-tracking devices, taste experiments, hot and cold devices, thermal devices, and then tracking systems for motion capture, and of course, treadmills for omni-directional walking. So there’s a lot to unpack here. Let’s start with something crazy: haptic suits. Let’s talk about haptic suits, and why and where these would be used in industry.

Antony: I’m very interested in these suits, because they offer the promise of letting you use your full body in VR. So, finally, you can be there with all of your body. You know, my first startup was about full-body VR, but using Kinect. So, a different approach, but I’m a big fan of having the possibility to kick objects, to move your body in every possible way, and see your full self replicated in VR.

The advantage of using the haptic suits over other approaches, like the one they used with Kinect, is that you don’t only have your full body — your full movement — in VR, but you can also feel sensations. You can have haptic feedback. So you can [feel] hot, cold. You can feel pain and whatever. It’s really full immersion; a bit like we have seen in the Ready Player One movie. Wade Watts wore that expensive suit to fully be inside the Oasis. This is why I think they’re very interesting, because they can really enhance your visual experience — your sense of presence, as they like to say.

Alan: So, let’s unpack this for a second, Tony. What would be some of the practical use cases of this? I can see one in military training, where you’re in a virtual world, you’re in a hostile environment, and maybe something explodes behind you — a piece of shrapnel hits you — and maybe it vibrates. Maybe explain some other instances where this could be used in enterprise.

Antony: I think about different possibilities. Like, for instance, I was talking some time ago with some psychologists, and this can be interesting for rehabilitation. How do you perceive your body? You can see yourself as an avatar that is a bit too fat, too skinny, that lacks some parts of the body, or such. So, rehabilitate yourself psychologically. But it can also be used for rehabilitation of your body. So if I can track all the movements, then medics who have to check patients that have problems with their back, with their legs, whatever, can really observe them when they’re moving. For instance, you talk about the military, but I think lots of industries may have an interest in evaluating all of the body’s movements during training. So, if they’re training for particular movements — with the whole body — I think the haptic suits are the only real possibility.

Alan: I was actually reading an article the other day — it was more of a scientific paper — talking about privacy, and the fact that, with haptic suits and these headsets, we’re actually able to collect insane amounts of data. We can not only collect data around your height — because we know how high you are from the floor — but your gait, how you walk, your movements, what you’re looking at, what you’re experiencing, your heart rate. There are so many physiological aspects we’re able to collect incredible amounts of data on, and so, by collecting this data, all sorts of trainers will have unprecedented levels of data around the person they’re studying. It’s funny, because for years and years, we’ve studied people’s movements, but we’ve never had anything this accurate. It’s really exciting.

Antony: Yeah. It’s all exciting for that kind of input. But what I also like are certain kinds of suits — for instance, we can name the Teslasuit. That is a very complete device that should be available, maybe next year, and it can also provide feedback to the user. So for instance, I saw this video; people wearing the Teslasuit were able to feel hot, to feel cold, and also to feel pain. You mentioned, for instance, the training of the military; you can really feel the pain of having been shot. If you are a firefighter, maybe if you don’t extinguish the fire fast enough, you can really feel the heat on your body. I think that can also be managed to–

Alan: It’d be amazing for firefighters or paramedics or people who are in emergency situations, where training for these things is almost impossible. You can’t really train for every scenario in the real world; in virtual reality — with these suits — you can now train in a really realistic way, for things that are very rare. I love that.

Antony: Yeah, because the great thing about VR is that you can perform this training, realistically simulating the situation the person will be in. The more realistic the simulation is, the more the person who has to solve the problem will be prepared to solve it. For instance, if you really have to extinguish a fire, and in the virtual simulation you’ve already felt the cold — maybe also the humidity, whatever — of that moment, then when you’re in front of the real fire and you really feel the heat, you will already be prepared. This is what I think will also be very important about these suits. Of course there will be products tailored to enterprises; there are no products ready for consumers. But since we’re talking about using this stuff for industries, for enterprises, I think that it will be a very important application.

Alan: I agree. And just to talk about some brands that are out there now — you mentioned the Teslasuit. Are there any other ones that you know of that are out there right now?

Antony: Well, I know a few names. Another one that I want to mention, that I have tried two times — one also at the VEC — is bHaptics. That is a Korean company, and the thing that is interesting is that they are making a modular suit. So, you can buy the module for the face, for the chest, for the arms, and for the legs. So, you can buy only the pieces that you need, and it provides mostly vibration feedback. But it’s good, because it can be localized. I tried a paintball game, and I really could feel the vibration at the exact point I was shot — in the chest, or also in the face; it was very strange when I got the vibration in my face, because I’d been headshot-ed by my opponent — and it also works with various devices, and now it works also with the VIVE Focus, so you don’t need a PC, it doesn’t have cables; you just have the headset on your head and the suit on your body. This can be very, very–

Alan: It’s interesting that you mentioned the VIVE Focus, because you were just in China at the VIVE Ecosystem Conference, and one of the things — I actually interviewed the president of HTC VIVE, Alvin Wang Graylin, and he’s actually gonna be on a different episode of the podcast (so if you’re listening, you can look up that podcast as well) — one of the things they mentioned is the launch of the VIVE Focus Plus, which I believe you had a chance to take a look at. And one of the things they mentioned was this ability to take up to 40 headsets at once and synchronize them in an up-to-900,000-square-foot space, so that you could do very large-scale trainings. And now that you mention combining that bHaptics suit with it, I think this is going to be a very, very powerful tool for enterprises.

Antony: Yeah, of course, I was there; I listened to all the presentations by HTC. I saw other companies working with them — like, for instance, Modal VR. Maybe let’s talk later about that here on this podcast. I think that the strength of HTC now is the services that they’re offering for enterprises: this ability to configure more headsets at once, and also to create these kinds of setups so you can have multiple players in those spaces. And yeah, both for entertainment — you can play in large spaces — but [also] for serious applications. You can have serious games, or direct training with different teams training together. And everything without a cable, because the Focus Plus is completely standalone. I think that can be really important. I have one here on my desk, and I think it’s an interesting device. Yes, I think that companies may evaluate the use of this device, especially when there are multiple people involved, and the company doesn’t want to buy lots of PCs and have lots of cables, stuff like–

Alan: It really democratizes it, and makes it easy for businesses to get involved. I think, to be honest, if you look at the roadmap of VR in general, everything kind of launched and kicked off to the public in 2016. But I think businesses are really going to start embracing these standalone headsets. While we’re talking about VR headsets, let’s unpack some of the other ones that are becoming more prevalent in industry. We’ve got the VIVE Focus Plus. Oculus Quest is coming out. The VIVE Cosmos, the Pimax 8K, the Varjo, the Pico… so, which of those have you tried, and where do you see them fitting in to different enterprises?

Antony: I think that it’s great that this year, we are going to have lots of interesting devices. It’s great to see the industry growing, but it’s also important for companies to start understanding what their needs are, and so, what the devices are that can fit them. The Oculus Quest is a great device that is coming — very polished by Oculus, and it will be quite cheap, only $400. There is also the Oculus Go, that is already on the market; it’s just a viewer of 360 content. One other thing that is important is that companies also understand the business licensing. The VIVE Focus Plus is quite expensive — it costs $800 — but it has clear business licensing, business services, assistance, there is a kiosk mode, and other things that are fundamental for a company. The Oculus Quest — currently — when it launches, most probably will be mostly a consumer device. So it will probably be a bit better than the Focus for what concerns the comfort and the controllers — it is more ergonomic. But it’s not clear if it will have a business license and business services from Day 1. There are rumors about a business version of the Quest coming next month, but it is important that companies understand whether there is business licensing or not, and what it offers. This is the first thing that is important to say.

Alan: I agree with you, Tony; I think it’s really important to unpack that just a little bit, because Oculus and HTC both realized that the path to mass consumer adoption of this technology is actually through enterprise applications. We saw that very early with mobile phones, with the BlackBerry being a very powerful business tool, and then becoming a consumer tool after that. But really, what’s going to happen with these — I think Oculus’ official stance is that they want to market towards the enterprise, but all of their advertising is towards the consumer. So there’s this kind of disconnect, and with Oculus being owned by Facebook, I really don’t know that they have the experience…

Antony: The interest.

Alan: Yeah, and it doesn’t seem like they have that. But I mean, we’ll see. With the Oculus Quest, we’ll see what kind of services they provide. But definitely, HTC is really far ahead with the services and being able to do that. So I want to just shift focus for one second and — for those of you who are listening who don’t really know about the VR headsets and the differences — there are kind of two types: there’s Three Degrees of Freedom, and Six Degrees of Freedom. Three degrees means you can look up, down, left, right, and you’re in the space, but you can’t move around. Then, Six Degrees allows you to look up, down, left, right, but also move in those [dimensions], and the newer headsets will allow you to move. It makes a huge difference in the amount of immersion, but also the things you can do; being able to have controllers, or see your hands in virtual reality, and connect and move things around. That’s Six Degrees of Freedom.

So, you have something like the Oculus Go, which is a $200 headset, which is perfect for 360 videos; if you want to do some basic training for people, and they can interact using Gaze-type controls. But you can’t move around, and that’s kind of the difference between the two. Then you have tethered and untethered, meaning connected to a big computer or a backpack computer, or standalone, meaning the entire computer device is on the headset. So just for those of you who are new to this, that’s the difference between the different headsets. Is there anything else you want to add to that before we move on?

Antony: No, I think that was a precise list.

Alan: There’s another headset — there are two out there. One’s called the Pimax, which is an 8K headset; it’s like wearing a giant scuba mask. It’s massive! And the view is beautiful. I mean, it looks really gorgeous. But there’s no way anybody in the public is going to wear this on their head, because it is like wearing two massive cell phones strapped to your head on a diagonal. But then, there’s another one called Varjo, out of — I believe, is it Sweden? Or Finland?

Antony: Varjo is from Finland.

Alan: Finland, sorry. And these guys, they’re selling the most expensive VR headset on the market. Their claim to fame is having a really, really wide field of view and very, very high-quality optics and resolution. That thing is — I think — $7,000.

Antony: I remember something around $6,000, maybe–

Alan: $6,000.

Antony: — but it was super, super expensive.

Alan: It seems super expensive, but let’s break it down for a second. About five years ago, companies would spend millions and millions of dollars building a virtual reality CAVE, so that people could start working in three dimensions. If you fast forward to now, $6,000 for a headset allows your company to design in virtual reality, and have meetings in virtual reality while designing; if one headset saves one flight, that one flight has paid for the headset. So, in enterprise, this is not a lot of money. In consumer, I can’t imagine anybody is gonna buy one of these headsets. But, in the market they’re going after, it’s really valuable for design companies. One of the other interviews we did was with Elizabeth Baron, who headed up the VR division of Ford, and every single car that they make has to be viewed in virtual reality by all the senior executives, and they have a big meeting where they’re all around the world, meeting about the car and seeing it in different lighting conditions and all this, and what they need is the best possible quality. I think for that reason, these high-end headsets are going to work.

Speaking of high-end headsets, let’s touch on some of the AR — augmented reality, or mixed reality — headsets. You’ve got a ton coming out now; you’ve got the Hololens, Hololens 2, Magic Leap, nreal, Realmax, Vuzix, North glasses, Epson MOVERIO, Google Glass. So, let’s start at the top; let’s talk about the Hololens. What are your thoughts on that?

Antony: My thought is that Microsoft is doing a great job, because it created the AR market with the first Hololens, and now, after some years, it has created the new Hololens 2, which is a big improvement over the previous one. I have not had the pleasure of trying it yet, but from what I have read, the greatest improvement is that it is much smaller and more usable. If you have ever tried to use a Hololens One for a training experience, it was really a pain, because the only interaction was through the “air tap” gesture; it was like a click with the index finger and the thumb.

Alan: Oh my goodness, it was impossible. Basically, you had to put this headset on people, and explain this weird clicking, like… “it’s like a mouse, only you gotta stick your finger up and point it out,” and anybody who’s ever tried this knows what I’m talking about. Anybody over 40 had a real hard time trying to figure this out; anybody under 20 picked it up instantly. But the ability to get people working on this immediately was difficult. You posted something yesterday with the Hololens 2 and the interactions; maybe talk about the new interactions that Hololens 2 brings.

Antony: Well, the first interaction model that went along with the Hololens One was a disaster, because there was only one interaction, and the system wasn’t able to adapt it well — I had a lot of experience with that. As you said, the Hololens 2 brings some more natural interactions with both hands, and so basically, you don’t have to teach how things work. You just use it like in real life. You have to scroll things? Just scroll them with your hands, moving the hands from down to up. If you have to click buttons, just put a finger on the button and press it. It’s hard for me to explain, because there’s no explanation needed. You just do what you think is intuitive to do. The system can detect all your hands, all your fingers, and so everything just works. And it’s not only the hands. It’s also the eyes. There is a demo by Microsoft where you are reading a text. And when your eyes are at the end of the text, the system detects that you have read everything, and automatically scrolls the text so you can continue reading it.

Alan: Hold on a sec. So basically, because there’s eye tracking, and because the system knows exactly where you’re looking at all times, it can know when you’re at the end of a sentence, and move it up for you? Think about that for a second; the world is going to move to spatial computing, and… let’s just talk about the difference between VR and AR for a second, because I think we skipped past that. With virtual reality, you put on a headset and it transports you to another world. All of these headsets are gonna start to have eye tracking and all of these things. But in augmented reality, or mixed reality, you’re actually seeing your real world, with data painted on top of it. The ability to look at something and instantly have the information in context to that, immediately in front of you — and now with eye tracking, it knows exactly what you’re looking at, and can bring up information, and know when you’re finished reading it and get it out of the way. So this is really, really important, fundamental; eye tracking is going to be in every single pair of glasses.

Antony: Yeah. Sorry if I interrupt you, but something has come to my mind. Some companies have asked us to make a system for the Hololens — so, in AR — so that a worker — like, a maintenance worker — could do something to repair machinery with their hands, and at the same time, see the manual in augmented reality in front of them. The great advantage of using eye tracking — for instance, the solution provided by Microsoft — is that the worker can have their hands in the machine, performing their work, and at the same time, with their eyes, look at the manual, which will automatically scroll the instructions for how to repair the machine. So this is something that can be very important for maintenance, in my opinion.

Alan: I agree, and in fact, there are some studies by Boeing — they are using this technology right now — and they’re seeing a 25 to 45 percent decrease in the amount of time that it takes a worker to complete a task. Now, think about that; 25 percent faster. That alone is incredible, but the real kicker comes in the fact that they have near-zero error.

Antony: Wow!

Alan: So, by putting these instructions up in front of them, they’re seeing near-zero errors. And there are companies out there like Upskill — they’re going to be on the podcast as well — and there are some other companies out there that are really starting to take digital manuals and put them into a heads-up display, so you’re completely hands-free, and you have the information when you need it, in context with what you need immediately. And that is a really powerful tool that… well, very few enterprises are working on now, but I think it’s going to explode in 2019.

Antony: In the end, it will completely disrupt the maintenance sector, in my opinion. In maybe five years, all the maintenance operations that we know now will be completely changed by AR and MR.

Alan: I agree. So, we’ve talked about the VR headsets — for training, for simulations, for design. We’ve talked about augmented or mixed reality headsets — the Hololens, Hololens 2. Magic Leap is kind of a Hololens competitor; they really went after the consumer market, and they’re actually gonna be selling through AT&T stores — starting this week, I think. And I think Microsoft really has a firm grasp on the enterprise side of this, and I think they’ve got a really good head start, because 1. They have great relationships with all the enterprise clients already. 2. They’re building services into their Azure cloud, so that’s really exciting. And now, everything’s gonna just work with your current BIM systems (if you’re in construction), or your CAD diagrams. I think one of the things that came up at the Hololens 2 launch — that I think is going to be revolutionary — is a program called Spatial. What were your thoughts on that?

Antony: Well, Spatial. I think that it’s one of the best collaboration tools, from what I’ve been able to see. It’s not the only one, because there are other ones for VR; there are some custom solutions. For instance, there’s one from Nvidia in VR that is also very, very good (also very, very expensive). It is important, because it lets the workers of a company who are maybe in offices in different parts of the world — maybe some are in Beijing, some are in New York, some are at home — meet in augmented reality. They can discuss ideas, they can work on 3D models together. For instance, to refine a prototype. They can meet as if they were in the same space, seeing each other, interacting, talking. Imagine how great this is, because it can save lots and lots of money for companies.

I was talking with the CEO of a Chinese company that is working on another collaboration tool in VR, called XCOL. She explained to me that for certain kinds of companies that produce objects — so, not people like me that create software, which can be exchanged easily by just sharing a folder, maybe on Dropbox — people that have to create concrete objects — real objects — maybe they have to share prototypes made with clay, wood, or whatever. Creating these prototypes, sharing them by sending these packages all over the world, then having a meeting: it’s all a waste of time and money — big money, it’s not just a hundred dollars. So, there’s the possibility to meet in only one virtual space, talk together, and modify objects together. So instead of 3D printing the object we’re working on — I don’t know, a remote control for a new TV — we can see the 3D model of the remote control in front of us, and we can discuss it, and decide to change it together. All with zero cost, because that 3D model is just a digital object, so it can be modified on the fly. We can take pictures. We can see multimedia elements together. At the end of the day, there is a modified version of this object that is okay for all of us, and without spending money. The savings for companies are really huge, and that’s why there are lots of companies working on these kinds of solutions, and Spatial can be one of the best, also because it works with the Hololens.

Alan: Yeah, and one of the things that has come out of these types of collaboration tools — and I think it’s important to note — is that they have to be completely interchangeable with all the different XR technologies. So, if you break down XR — or extended reality, or whatever you want to call it — into its individual components: you’ve got the real world, and then you’ve got the sliding scale of immersion, where you have augmented reality — or overlays of computer graphics on top of the real world. Then you’ve got mixed reality, meaning overlays of computer graphics on top of the real world in context, so it knows that’s a table or a chair, and it builds the experience around the objects that are in your real world. And then you have virtual reality, which completely hijacks your whole world. If you look at the scale of these, one of the things that I think is gonna be an important factor in mass consumer adoption — but also in businesses — is leveraging the power of the mobile phone. Mobile phone-based augmented reality has only scratched the surface; it’s been around for five or six years, and companies are only starting to scratch the surface.

One of the things I saw which is really cool is a company called Placenote, where you hold up your phone, and you can leave a note for somebody in 3D space. So, say you’re working on a house, and you want to leave some note saying, “don’t forget to move this thing here,” or you can leave notes for your housekeeper, or your Airbnb host, or whatever. There’s some incredible things that can be done with the mobile phones. If we take away the glasses for a minute, let’s see what we can do with the mobile phones that are in everybody’s pocket. Because by the end of 2019, there will be about 2 billion devices that have powerful AR built right into them, and they’re in everybody’s pocket. So let’s unpack some of the things that we can do with those phones.

Antony: Well, I think that — as you said — augmented reality that runs on the phone is great, because every one of us has a phone. The classic example is what was done with Pokémon Go: there were lots of people running like zombies through the cities, hunting for Pokémon. So we can see how powerful augmented reality on the phone can be. I think the tablet, with its wider screen, can be even more important for AR than the phone. I’ve seen some examples — I’ve tried some examples — where I could see augmented reality through the tablet, and it’s great, because you can see a wider space that’s augmented. Some applications come to mind: I’ve seen interesting things — again — one in maintenance. For instance, there was this company — I don’t remember the name — where I could look at my car that is not working. I just open it, I see the engine. I take my tablet. I find my engine. There is a worker that sees what my phone sees, and can write on my screen, can send me instructions to fix what is not working. So what I’m trying to say is that I can see, through the augmentations on my phone, how I can fix my car if it is broken. I’m not a mechanic, so it could be important and interesting for every one of us, and especially in certain sectors. Well, maintenance is important. Another experiment that I’ve seen that was quite original was using the phone to see MRI scans of the body. So, there were doctors sharing these MRI scans, and by moving the phone, the doctor was able to see a particular slice of the scan, and so analyze better, in a more natural way, what could be the problem for the patient.

Alan: That’s incredible. And I think that’s a really great use case. We don’t touch too much on the health care use cases, but they’re so vast. I mean, we could build a podcast just around the health care uses of this; everything from MRI scans in augmented reality; to mixed reality for aiming surgical tools to make sure you’re accurate; to virtual reality simulators to treat PTSD and other things like this. The applications in the medical industry are literally endless.

Antony: Can I just add one thing that I think is fundamental for your listeners? It’s that, even if it’s more dedicated to consumers, what is huge now in 2019 with mobile phones in AR is advertising. Because there are lots of experiments on how to use advertisements with AR filters; we all know Snapchat is doing great things with Snap filters and such. You can try, for instance, makeup on your face directly with an AR filter. You can try on some glasses. There is also a great campaign by Burger King that makes you burn the ads of their competitors in augmented reality. And I think that it is important for companies to know that ads in AR are really outperforming standard advertising services.

Alan: You’re absolutely right. That Burger King one: basically, you take your phone, you point it at a competitor’s ad, and it catches on fire and gives you a free Whopper, and then allows you to post the video of you burning down their competitor’s poster or billboard or anything. In the first week, they gave away 50,000 Whoppers. Imagine the earned media that they got around the world from that. I know when I posted it on my LinkedIn, it got over 100,000 views, just on my LinkedIn. So, you can imagine the amount of eyeballs that Burger King got from this, and it probably only cost them maybe — I don’t know — $50-60 thousand to build that application.

Antony: Yeah, probably.

Alan: Incredible. I don’t know how much it cost, but it was far less than the revenue that they brought in from it, and the marketing. So, you’re right, absolutely: marketing. If you look at Snapchat alone — Snapchat filters — over 400,000 filters have been built on Snapchat. This is not them specifically, but people building them, and brands are starting to jump on board. Nike’s done a bunch of stuff with LeBron and with Michael Jordan. So, Snapchat is leading the way in mobile phone-based augmented reality marketing and advertising. So, you really nailed it on that one. And there are companies like Admix — Samuel Huber has been a guest on this podcast as well; he’s creating programmatic augmented reality advertising, so that brands can now scale their advertising using his platform, on Instagram and Facebook, and now Snapchat. So, it’s really an exciting time for advertising, as well.

Antony: Yeah. It’s a new world, and I feel that it’s important, also, to jump in now, while there is not much competition, maybe? For companies, it’s a great opportunity to start to make advertising in a new and more effective way.

Alan: I agree. I look at mobile phone-based AR as the training wheels for where the world’s going. In the next five years, the devices like Hololens and Magic Leap and these really industrial glasses that are being used for enterprise, I think, are going to end up on the faces of everybody, because you just won’t be able to compete anymore. If I wear a pair of glasses, and it gives me all the information I need — real-time and in context with the world around me — and you don’t, good luck trying to compete with me in my job. So I think we’re going to end up having these glasses that give us superpowers on a daily basis. Some of the other things that we didn’t touch on yet: haptic gloves. We talked about haptic suits and wearing a full suit for haptics, but what about just something like a pair of gloves that allows me to reach out and grab something, and feel that it’s there? The feeling of touch and seeing your hands in virtual spaces is absolutely incredible.

Antony: Well, it’s fantastic. It’s a new field. It’s something that is not consumer-ready, but there are some enterprise solutions that are really interesting. One that got very popular on social media and such lately is the HaptX gloves. The device is really good because — while it is incredibly cumbersome and expensive — it has a haptic engine that is really sophisticated, so you can really feel the sensations of touch all over your hands, with a lot of precision, with higher resolution; so you don’t just feel a vibration on one finger; you feel the vibration at a single point on your finger, for instance. There is a demo with these gloves where there is rain in virtual reality. You can really feel the drops of the rain falling on your hand.

Alan: Oh wow!

Antony: Yeah, it’s amazing.

Alan: The HaptX ones also have force feedback. So, I reach out and grab a can; it feels like a can in my hand.

Antony: It’s a completely realistic haptic sensation. You can feel objects in your hand as if they were there. You can feel how heavy they are and such. It’s really something that I would really want to have, but–

Alan: Me too!

Antony: — but I don’t need them now.

Alan: Tony, I tried the Ultrahaptics. It’s just a little finger sensor; it looks like a blood monitor — like a pulse oximeter — on your fingers. I tried them at CES, and I reached out and moved some blocks or whatever, and I could feel the haptic feedback. But the second part: they told me to stand by a fire, and then reach my hand into the fire. When I reached my hand into the fire, the finger haptics buzzed on my fingers, and scared the living crap out of me. I jumped back about three feet, and I must have looked like a complete idiot, because there’s nothing there. But it scared me in a way that would be an incredible training tool. I mean, here’s something like McDonald’s: “don’t reach your hand into the deep fryer, because this is what happens.” Better to train somebody in virtual reality than to train them when they actually could burn themselves. That really… it’s stuck in my head, like, I touched a fire; it burned me. And even though I know it wasn’t real, it felt so real.

Antony: Yeah, it’s fantastic. You mentioned Ultrahaptics. It’s completely the opposite approach from the HaptX gloves, because the HaptX glove is a really cumbersome device on your hands, but gives you every kind of possible sensation, while Ultrahaptics leaves you with your bare hands. So this is really like the future. You don’t need any gloves, but there is a device that showers ultrasound waves onto your fingers and can give you the sense of touch. But you can’t have all the sensations that a glove can give you. Gloves can be a great help for training, because you can really make people have the sensation of having an object in the hand. So, every kind of training that requires tools — without that kind of powerful glove, it’s just like… I don’t know how to explain it; if you play a game in virtual reality with the standard controllers, like when you have a sword, or you have a gun, everything seems fake, because you don’t have the sense of weight. You don’t have the real sensation of the things in your hand. When you use these kinds of gloves, you can really feel the weight. You can feel the recoil of the weapon, and this means that, for every kind of training that requires a tool, from the most precise ones to the more hardcore ones — like a gun — I think that gloves can easily improve the training experience.

Alan: I agree 100 percent. Something else that I want to bring up, because it’s underestimated how important this is gonna be — and the last thing that I want to touch on — is scent devices. Being able to provide people with a realistic scent… there are two: there’s VAQSO, and then FEELREAL, which has a Kickstarter on right now. I’ve actually had a chance to try VAQSO; I put on the headset, and it had this little device underneath my nose that gave me scents, and the scents were programmed into the experience. So for example, I reached out and grabbed a cup of coffee. I smelled the cup of coffee — brought it close to my face — and as I brought it close to my face, the scent was released, and I could smell coffee. Then I did the same thing with a chocolate bar. And then the last part of the demo, they told me, “now smell the girl.” I thought they were punking me — I thought it was gonna be on some television show. I looked to my left, and there was a Japanese anime character, and you have to lean in and smell her. I had smelled the coffee and the chocolate, and then when I leaned over, she smelled like perfume. I will never forget that.

Scent is one of the most incredible ways for our brain to remember things, and I think there are industrial applications where scent is really, really important. Maybe you’re down in a mine that has sulfur, and you want to give people that realistic experience of being there before they get there, because some people maybe can’t take going into a mine shaft with a sulfur smell. Maybe they throw up immediately, and it’s better to know that before you spend thousands of dollars training them and sending them down there. So, I think there’s a really interesting opportunity to use these scent machines. What are your thoughts?

Antony: Well, I’m a huge fan of all the research that doesn’t just go in the immediate direction of improving optics and audio, but also tries to improve these other sensations, like scent and taste. I think that… our sense of smell is one of the most ancient ones; it is wired in a particular way in the brain, so it is connected with lots of regions. Why have we just been ignoring it, not including it in our headsets? It’s actually one of the most important senses that we have. So it is important that we implement that in VR, if we want the simulation to be realistic. What VAQSO is doing is great. Also, FEELREAL, which will also be able to provide the sense of humidity in front of your face. So not only the sense of smell, but also a bit of simulation of how the air can be. It’s great, not only for entertainment; it can also be good for marketing, because you can associate a VR marketing campaign with pleasant scents that can make the user remember the brand better.

Alan: Oh my goodness — you know what? This could be used for real estate. You go into a real estate showing, and you smell cookies baking in the oven. Maybe you’re selling cannabis, and you want the cannabis smell. Think about it: it can be used for literally anything, and to market anything. When you create a powerful smell interaction, your brain doesn’t easily forget that. I think it’s really something that nobody has really fully embraced yet. I’m really excited to do some experiments on that.

Antony: Yeah, we should do that together! Imagine that, as you say, you’re selling a house, and when a customer enters the house, maybe there is someone that is cooking, and they can really smell the food or whatever. Maybe out on the balconies, it could have the smell of the sea, of the grass. I think that’s something that sticks in their head, and will make them continue to think about the house. It can create a connection.

Alan: I agree.

Antony: That would be great.

Alan: Man, that would be amazing. I think we found our new business model! [laughs]

Antony: Yeah! [laughs]

Alan: Tony, I really want to thank you so much for joining us on the podcast today. I’m going to ask you one final question: what do you see for the future of XR, as it pertains to business?

Antony: Well, the sure thing that I can see as a trend is that it will be used more and more. I can see that also among my customers; maybe someone, some years ago, just said, “ah, VR. I’ve heard about it, but I’m not interested,” and now they’re coming back because their competitors started using VR. So, it’s something that will be fundamental for business in the next [few] years. It is fundamental because it makes companies spend [less] money, and makes their jobs [work] in more effective ways. I’ve seen lots of articles; I’ve seen lots of videos about the efficiency that companies are gaining. For instance, HTC always talks about Bell, which can now prototype a helicopter in months, and not in years, as before, thanks to VR. I see this great trend. Regarding the technology, what I can envision is that this kind of virtual reality is going to become more and more realistic. As we’ve discussed, that will happen slowly, with the addition of new senses. But visuals are going to become very realistic. We are seeing the Varjo headset, which has the resolution of the human eye; we can go even further than that. Audio is already well emulated. With the gloves, we are going to improve; now, the devices are very expensive, but in a few years, it will be better. The sense of smell and taste will come later, but I’m sure that they will come, anyway. So, I envision lots of great things coming — lots of great hardware coming — and I’m so happy to be here, now.
