The top of a wind turbine a hundred stories up from the ground is not the best place to be making mistakes, but making mistakes and learning from them is the whole point of on-the-job training. That’s why VR Vision Inc helps companies produce XR training modules, so trainees can make mistakes in a safe, controlled environment. COO Lorne Fade drops by to talk about it.
Alan: Today’s guest is Lorne Fade, co-founder of VR Vision. Lorne is a serial entrepreneur who has built several businesses over the last 15 years. He’s had the pleasure of working with some of the world’s largest Fortune 500 brands and award-winning marketing agencies all across North America and Europe. His previous agency, Academic Ads, was acquired, and he went on to found VR Vision Inc. As co-founder and COO of VR Vision, he leads a virtual and augmented reality startup that’s enhancing immersive training outcomes for some of the world’s largest brands using VR, AR, and AI technologies. He’s also the founder of Reality Well, a healthcare technology platform built to improve the quality of life for those living in long-term care facilities. You can learn more about VR Vision by visiting vrvisiongroup.com. Lorne, welcome to the show.
Lorne: Thanks for having me, Alan. Thanks.
Alan: My absolute pleasure, man. We’ve known each other for quite some time through the VR/AR Association in Toronto, and we shared some booth space together, and it’s always great to see what you guys are working on. I know the last time we saw each other, you were showing me an automotive manufacturing facility in virtual reality and how you were using that. So let’s dive in there. Let’s talk about how you guys are using VR and 360 video to make better training.
Lorne: Yeah, that’s one of our bigger use cases with Toyota, where we’re currently training about 10,000 employees using 360 video in immersive training scenarios in VR. And it works really well for eliminating risk and providing a safe environment with zero harm. And it’s totally immersive. So the employees that are getting trained in VR have no distractions; they can’t be on their phone or anything. It was really simple the way we did it. We just storyboarded various scenarios with Toyota on various processes, on safety concerns on their assembly lines, or processes that were mundane and replicable. And then we went out and filmed with a stereoscopic 3D camera, so when they put on the headset they feel like they’re there, fully in 3D. And we mapped out, I guess, about two-to-three-minute scenarios on various parts of their assembly lines, filmed it all in full 3D, then ported it over to VR and added some overlays, some voiceovers, some touch points and interactivity, so that the employees could be trained in a completely immersive environment. Nothing like this, to my knowledge, has ever been done before. So it’s really cool to have this type of opportunity to work on a project like that.
Alan: So how are they measuring success? For example, STRIVR is doing 360 video with Walmart, and their key performance indicators are training times – how long it takes to train – and they’re also testing retention rates. What are the KPIs that you and Toyota decided on to measure that?
Lorne: Yes, great question. We developed an in-house analytics engine for tracking where the user is looking and the various touch points of the training scenarios. And every user that uses the platform gets their own login, so we track each user, their effectiveness, and how well they’re being trained with the scenarios. And then within the scenarios, there’ll be, let’s say, about 20 interactive touch points for various risks, or hazards, or processes that the employee needs to learn. At the end of the scenario, they’ll get a breakdown, or a test results screen, that gets pushed to Toyota’s LMS on the backend so they can see how the employee performed. But also within the headset, the user gets to see how they performed and gets to learn again the various things that they might have missed throughout the course of the module.
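For readers who want a concrete picture of what this kind of gaze-and-hotspot tracking can look like, here is a minimal sketch in Python. The hotspot names, tolerance, and payload fields are hypothetical assumptions for illustration only – this is not VR Vision’s actual engine or Toyota’s LMS schema.

```python
import math

# Hypothetical hotspot positions in a 360 video, as (yaw, pitch) in degrees.
HOTSPOTS = {
    "missing_hard_hat": (35.0, -5.0),
    "object_in_walkway": (-60.0, -25.0),
}

GAZE_TOLERANCE_DEG = 10.0  # how close the gaze must come to count as "seen"

def angular_distance(yaw_a, pitch_a, yaw_b, pitch_b):
    """Great-circle angle (degrees) between two view directions."""
    ya, pa, yb, pb = map(math.radians, (yaw_a, pitch_a, yaw_b, pitch_b))
    cos_angle = (math.sin(pa) * math.sin(pb) +
                 math.cos(pa) * math.cos(pb) * math.cos(ya - yb))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))

def hotspots_seen(gaze_samples):
    """gaze_samples: list of (timestamp, yaw, pitch) tuples from the headset."""
    seen = set()
    for _, yaw, pitch in gaze_samples:
        for name, (h_yaw, h_pitch) in HOTSPOTS.items():
            if angular_distance(yaw, pitch, h_yaw, h_pitch) <= GAZE_TOLERANCE_DEG:
                seen.add(name)
    return seen

def build_result(user_id, module_id, gaze_samples):
    """Package a per-user result, the kind of record a backend might push to an LMS."""
    seen = hotspots_seen(gaze_samples)
    return {
        "user": user_id,
        "module": module_id,
        "hotspots_found": sorted(seen),
        "hotspots_missed": sorted(set(HOTSPOTS) - seen),
        "score": len(seen) / len(HOTSPOTS),
    }
```

The per-user login Lorne mentions maps naturally onto the `user_id` field here: each session produces one result record that can be stored, reviewed in the headset, and forwarded to a learning management system.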
Alan: So it’s really giving the employees the opportunity to learn through making mistakes, which is funny, because our whole lives in school we learn not to make mistakes: you get an F, and that means fail, and you’re screwed, and you can’t go into university, and it’s beaten into us never to make mistakes. But in the real world, we make mistakes every day, and we learn from them, and we move on. And this is even better, because it’s not the real world. You’re able to make mistakes in the privacy of your own headset, without feeling embarrassed.
Lorne: And it saves a ton of money for Toyota overall. Basically, instead of having an employee on a live assembly line making those mistakes, where they would have to shut down production – which could be super costly over time for the plant itself – this way they’re able to train in a risk-free environment without shutting down production. So when they’re ready to hit the assembly line, for whatever processes they’re tasked with, they’ll be way ahead of the game, they’ll make fewer mistakes, and it saves a ton of money for Toyota overall.
Alan: So how are you measuring that specifically, are you measuring training times?
Lorne: Yeah, we’re measuring training times. We’re measuring efficacy for the employees. And then when we put them on the live line, we get to compare and contrast, based on their test results, how many mistakes they’re making on the live line. Now, we’re not treating our training scenarios as the be-all and end-all, because Toyota has a number of other training LMSs and dojos that they’re using for training the employees, but they were seeing an improvement overall with the employees that had done the VR training.
Alan: That’s really interesting. In your analytics, you mentioned that you’re pushing it to their LMS. How difficult was that, going from one company to another? I would assume there are different ways of working.
Lorne: The biggest challenge there was working with their IT, because they had a pretty strict regimen for their firewall. And accessing it – it’s a very tight network. A lot of restrictions, a lot of hoops we had to jump through. So it took a couple of months of working with their IT team to be able to pass data through from the headsets, have the headsets themselves connect seamlessly to their network, and make sure they were all on the same MAC address. It’s actually outside of my technical scope – I’d have to ask our IT guy internally here. But basically, once we figured out how to pass through their network, it was seamless.
Alan: What about things like device management? Because if you’re going to train 10,000 employees, how many devices is that?
Lorne: That’s definitely a big concern that enterprise groups need to be aware of. We’re seeing brands like HTC and Oculus start to catch up with business solutions that are going to offer enterprise device management. We kind of hacked it from the get-go, because it wasn’t available yet. There’s a great company you can look up called 42 Gears that provides a mobile device management solution for any devices being programmed with Android backends. That allows us to see all the devices on the network, push updates to them, and manage them remotely. And then we went a step further and developed a mobile management application for tablets and cell phones, so that a practitioner or a trainer who’s managing the training scenarios can manage which modules they’re placing the user into, and see where they’re at within the training program.
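To illustrate the kind of fleet view Lorne describes – seeing devices, pushing modules, checking status – here is a minimal, hypothetical sketch. It is not the 42 Gears product, its API, or VR Vision’s management app; the device fields and version numbers are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Headset:
    device_id: str
    battery_pct: int
    app_version: str
    assigned_module: str | None = None  # which training module the trainee is placed into

LATEST_VERSION = "1.4.2"  # hypothetical current build of the training app

def fleet_report(devices: list[Headset]) -> dict:
    """Summarize which headsets need attention before a training session."""
    return {
        "needs_update": [d.device_id for d in devices if d.app_version != LATEST_VERSION],
        "low_battery": [d.device_id for d in devices if d.battery_pct < 20],
        "idle": [d.device_id for d in devices if d.assigned_module is None],
    }

def assign_module(device: Headset, module_id: str) -> None:
    """Place a trainee's headset into a specific training module."""
    device.assigned_module = module_id

if __name__ == "__main__":
    fleet = [
        Headset("quest-001", battery_pct=85, app_version="1.4.2"),
        Headset("quest-002", battery_pct=15, app_version="1.4.1"),
    ]
    assign_module(fleet[0], "stamping-line-safety")
    print(fleet_report(fleet))
```

A real deployment would hang this kind of logic off an MDM backend rather than an in-memory list, but the trainer-facing workflow – check the fleet, then place a user into a module – is the same shape.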
Alan: Now, is that done from a tablet or a phone or something?
Lorne: Yeah, yeah, it can be done from either a tablet or a phone. Anything Android or iOS based.
Alan: When you’re making the scenario – so, for example, take us back to the beginning. You meet with Toyota. They say, "Hey, this is great. We want to do a trial." What is the lead time from that first meeting to deployment to 10,000 employees? Is that like a year, or two years? What’s that look like?
Lorne: I think the development timeline was about six months of back and forth to storyboard out all the various modules. We started with a proof of concept with one simple module to see how effective it would be. They loved the 3D, they loved the immersiveness of it. So we moved forward with five modules, and the filming and the whole post-production process took about a year overall for all five modules. And now we’re in talks to scale that through more facilities throughout North America. Per module, it really doesn’t take that long. We have a 360 production crew that goes on site and films – that takes about one or two days – and then we take it back and post-produce it with the various touch points and voiceovers. That whole process for one module takes anywhere between three to four weeks overall. I guess the back and forth that took the longest was working with IT and figuring out some of the other complexities, like pushing updates to their LMS, things like that.
Alan: I would think also just the simple procurement process. [laughs]
Lorne: Yeah. Oh, that too. They’re very–.
Alan: Take longer than everything.
Lorne: Yeah. Yeah.
Alan: Standard across all enterprises, yeah. Here’s a note to people listening: if you’re working in the C-suite of a large enterprise, perhaps consider figuring out a way to work with startups more efficiently, through streamlined procurement processes, because it really is onerous for a startup trying to innovate on technology while running the gauntlet that is procurement.
Lorne: [laughs] And then keep your overhead going, and runway.
Alan: Exactly. Part of the reason we started XR Ignite was to really be that – for those of you who don’t know, XR Ignite is our community hub and connector – so our goal with XR Ignite is to be the connector between startup studios and developers and corporate clients, and be that conduit for conversations back and forth: what the corporates are looking for – and you mentioned some of them: safety, security, networking, device management, LMS integrations – and then bringing that knowledge over to startups and saying, "OK, what do startups need to do business with corporates?" And that’s streamlined procurement processes, faster payments, and more streamlined communications. So I think it’s a time and a place where we need to really bring everybody together. That’s what we decided to do with XR Ignite.
Let’s talk about the actual experiences, because I’ve tried one, and it was really interesting. You put on the headset, and it was really cool because I’ve never been to a car factory, where they build car parts and doors and things. I was in there and there’s this woman stamping giant pieces of aluminum, doing her job. And then you have to look for anomalies. You have to look for things on the ground, or whether she’s wearing a hard hat, or whatever it is. Did they provide you those things, or did you look at the space and go, what if we put a banana peel over here, or…?
Lorne: We basically work with them on the storyboard to provide the highest risk items that would be the biggest safety concerns for the employees. Like not wearing proper PPEs, walking in the laneways where they shouldn’t be walking. Just not using proper safety gear or leaving things in the wrong places. And then we went a step further and added our own flair, if you will.
Alan: I love it. Now, were they accepting of adding your own flair to that? Because sometimes this stuff can be really dry and boring.
Lorne: The basic secret sauce, though, that we provided: we developed this for standalone VR headsets, and a lot of the standalone VR headsets really max out at 4K resolution, whereas we’re filming in 8K resolution. We wanted to push the best quality that we could for the experience, so it was completely immersive and exciting. It had replicability and it was scalable. So on our backend, on the post-processing side of things, we did some optimizations with the 360 video to make it appear closer to 6K instead of 4K in the headsets, reduced some of the screen-door effect – really just optimized the visual aesthetic of it so that when they’re playing it in the headset, it appears as good as possible for the experience.
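As a rough way to see why an 8K source still pays off on a 4K-class headset, here is a back-of-the-envelope pixels-per-degree comparison. The panel resolution and field of view below are assumed, illustrative numbers, not any particular device’s specification.

```python
# Rough pixels-per-degree comparison between a 360 source and a headset panel.
# All numbers are illustrative assumptions, not a specific device's spec sheet.

SOURCE_WIDTH_PX = 7680     # an "8K" equirectangular frame spans the full 360 degrees
PANEL_WIDTH_PX = 1920      # assumed per-eye horizontal resolution
PANEL_FOV_DEG = 100        # assumed horizontal field of view

source_ppd = SOURCE_WIDTH_PX / 360          # ~21.3 px/deg captured in the footage
panel_ppd = PANEL_WIDTH_PX / PANEL_FOV_DEG  # ~19.2 px/deg the display can show

print(f"source: {source_ppd:.1f} px/deg, panel: {panel_ppd:.1f} px/deg")
# The source carries more detail per degree than the panel can show, which is
# why supersampling and careful downscaling in post can reduce aliasing and
# soften the perceived screen-door effect.
```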
Alan: I can attest to that. It really was a clear situation. It was–
Lorne: It’s like watching a 3D movie. [laughs]
Alan: It wasn’t even like a 3D movie. It was like I was in the factory. By the time I put the headphones on and the headset, a couple of minutes in, I was right there on the factory floor watching this process of stamping these things out. I’ll never forget it, because I felt like I was right there, watching it. And I got a few of the things wrong, but…
Lorne: I think that’s the true value of VR: being able to replicate any type of scenario from the real world, but in a safe, controlled environment. And I think this works really well for enterprises whose training is potentially harmful, carries a high risk versus reward, is expensive to run on site, or is dangerous for the people being trained. There’s another scenario we’re working on right now with a wind turbine manufacturer. They’re developing maintenance technician training, and it carries a high risk to go up to the top of those wind turbines and work on them with a tether. They’d rather have these employees trained in a dojo, in a safe, controlled VR environment, before sending them up a hundred stories high to the top of a wind turbine.
Alan: You know, that seems to make sense. I went to a talk the other night, and there was a gentleman who’s making nuclear reactor training for the nuclear reactors here in Ontario. One of the scenarios is the CANDU reactor, which is a huge reactor. It’s maybe 30 feet high and it’s got all these little tubes. And in real life, you can’t walk in front of the tubes, because they emit radiation – there’s just this invisible beam of radiation. So if you walk in front of the beam, well, you’re–
Lorne: Chernobyl.
Alan: Well, no, you’re just going to have a paid vacation. But one of the things that they showed is, how it’s managed today is, they literally have a piece of tape on the floor. They have duct tape on the floor saying, “Don’t walk within these duct tape lines.”
Lorne: Oh, jeez.
Alan: Those are the safety protocols in a nuclear reactor. So being able to recreate that with a HoloLens – which is what they used – and visibly recreate what that beam of radiation looks like, you get a visual representation. It’s not something that people do every day – it’s very, very rare that they have to go in there – but when they do have to go in there, they have this visual representation of these beams of radiation coming out. And I think that’s a little bit better than some duct tape on the floor.
Lorne: Yeah, I think nuclear reactor training is one of the better use cases for creating a safe controlled environment versus a live test bed.
Alan: You would think, yeah. You know, we don’t really want to go down that road. You talked about wind turbines. That’s another big, big area because I mean, clean power is becoming huge and wind turbines, they’re– I don’t know if you’ve ever been in one.
Lorne: No.
Alan: But I have, in VR. I’ve been in a wind turbine. I climbed up the ladder on the inside. I got inside. I looked at the motor. I stood on top of one, all in VR. And I’m good with that. I don’t necessarily need to do that in real life.
Lorne: I’ve definitely been in one in VR. I haven’t been in a real one. [laughs]
Alan: It’s pretty awesome. And there’s so many things that can be done with this. Let’s talk about the cost to deploy something like this. For example, a company comes along – XYZ company – and they say, "Hey, we saw what you’re doing, or we heard the podcast, this company is doing this. We make widgets and here’s our machine factory. We want to start doing safety training in VR." What does that typically look like, as far as roll-out, your measurements of success, and the costs as well?
Lorne: The costs have actually come down with the standalone headsets, because there’s less graphical work that needs to be done. It’s really linear overall. Basically, there are two ways that we develop these training applications internally here at VR Vision. There’s 360 video that’s ported into VR scenarios – that’s filming in any type of real-world environment. Typically, the 360 video form factor is going to be cheaper and more cost effective than creating a CGI-based environment, which is the other way that we develop training applications. For the 360 side of things, per module, we charge anywhere from 15 to 20 thousand dollars, but you also need a platform to interact with those 360 videos. So we start with a base layer, for anywhere from five to seven thousand dollars, for a platform that’s built out – it’s kind of like the menu selection screen of Netflix, if you will. And then once you’re in that platform, you can select the various modules or training outcomes that a business may want to use. Each training outcome is anywhere from 10 to 20 thousand dollars, with interactivity and voiceovers, fully optimized. It really depends on the length of the training outcome. These are averaging about three minutes long, but if you have a longer one, it will take more post-production, which would be more costly.
For a CGI-based environment, those costs can vary a lot more. It really depends on the scope and breadth of the application. The ones that we’ve developed fall into the 40 to 50 thousand dollar range, for basically a three-to-five-minute CGI-based training scenario. We did one for a fire safety drill for a company down in Texas called Alchemy Systems, and it was basically a replicated version of their factory, one-to-one, in a CGI-based environment. And it trained the users that worked in the factory how to find the fire exit and what to do in case of an emergency.
Alan: So how did you get the factory to one-to-one scale? I mean, obviously, they have the measurements of the factory. Do you just import that into a CAD modeling program, or… how did that work?
Lorne: Yeah, they had FBX files for a lot of their factory. And the other way we did it was using LiDAR – we basically went on the floor and scanned the whole factory. It was a pretty boxy, rectangular-shaped factory, so it was pretty easy to do. We just scanned the length and the size of it, and then ported it over into a virtual environment.
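A minimal sketch of how a boxy floor plan like the one Lorne describes could be reduced from scanned points to overall dimensions for blocking out a virtual environment. The data and function are hypothetical; a real LiDAR pipeline would also involve registering the scans and meshing them.

```python
# Derive overall dimensions of a roughly rectangular factory from a point
# cloud (e.g., LiDAR scan exported as x, y, z coordinates in metres).
# Purely illustrative; not VR Vision's pipeline.

def bounding_box(points):
    """Return the axis-aligned extents of a list of (x, y, z) points."""
    xs, ys, zs = zip(*points)
    return {
        "length_m": max(xs) - min(xs),
        "width_m": max(ys) - min(ys),
        "height_m": max(zs) - min(zs),
    }

if __name__ == "__main__":
    # A handful of fake scan points standing in for millions of real ones.
    sample = [(0.0, 0.0, 0.0), (82.5, 0.3, 0.1), (81.9, 40.2, 0.0), (1.1, 39.8, 9.7)]
    print(bounding_box(sample))  # ~82.5 m x ~40.2 m x ~9.7 m shell to block out in-engine
```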
Alan: Well, that’s easy.
Lorne: It sounds easy, but there’s a lot of technical expertise involved, but…
Alan: If I had asked you the same question three years ago, it probably wouldn’t have been that easy.
Lorne: Yeah. Yeah.
Alan: One of the things that we’ve been seeing as a repetition on this show, is that these technologies are getting better, faster, cheaper every day. There’s more talent coming out that know how to use these technologies. But I think one of the key takeaways is that, this isn’t something that you should be looking at five years down the road. This is something that people are utilizing now and getting dramatic results. So let’s talk about some of the results that your clients are getting.
Lorne: They’re resolving conflicts that can arise in a workplace scenario – that’s one of the biggest ones, just avoiding those risks and avoiding downtime for various training scenarios. They’re getting a lot of assessments post-training. With our analytics engine, we’re tracking where the users are looking, so we’re seeing where problems may arise or where things are being missed. Let’s say they’re missing an easily overlooked thing, like handling a box or flipping a switch properly, and we see, after training 10,000 employees, that maybe half of them are missing this one simple thing. Now we know that this training outcome needs to be pushed a little bit heavier for those employees, so they can reduce the problems with whatever that specific process is.
Alan: Or maybe the process itself is flawed.
Lorne: Or maybe that as well. Yes.
Alan: We never want to talk about that. But let’s be honest, sometimes things were done just because they were always done that way. And now this can shed a light on certain processes that are maybe antiquated or out-of-date.
Lorne: Something that helped us optimize our training programs was learning from the employee feedback, and then getting multiple iterations of our training programs in place, so that the frontline employees can help optimize training elements to maximize effectiveness.
Alan: So maybe unpack that a little bit.
Lorne: So basically, with the post-training assessments, we did a lot of surveys with the employees to see how effective they were finding it. We had some training modules that were rated much higher than others, so we could go back to the ones that were rated low and figure out, "Well, maybe this was too hard for the employee to learn various elements of the training protocols," so we can make it a little bit easier for them to find whatever the risks or safety concerns were in the training scenario.
Alan: So now in that case, you have to go and refilm this, if it’s 360 video, for example.
Lorne: Yes, we’d have to re-storyboard it from the ground up for 360 video. For CG, it’s just a matter of tweaking things in-house.
Alan: I think therein lies the exact cost-benefit analysis of 360 versus CG, because if you’re filming in 360 video, it’s 15 to 20k to film each one of these modules, and in CGI you’re looking at 40 to 50k. The difference being, if something needs to change, you have to go re-record that – that’s another 20k. In CG, if you need to change something, you can change it on the fly. And one of the things that I love about computer graphics is that you can reconfigure the warehouse. You can add elements in real time. You can add things in. So there is that benefit of–
Lorne: Future proofing.
Alan: Yeah, future proofing that. But it’s not always necessary and it’s not always warranted. So when do you decide which one to use over another?
Lorne: There are also factors to consider, like multiplatform support – having VR/AR functionality, but also being able to push those exact scenarios to the web. In case there’s not a VR headset available, being able to have a 360 video on the web for the user to learn in a dojo or LMS environment doubles the effectiveness and accessibility of the training programs as well.
Alan: What devices are you pushing to now, and what does that look like? Let’s take 360 and then we’ll move into CG, for example, because the headsets are changing daily. We’ve taken a completely device-agnostic approach, because who knows what the next big thing is gonna be. So how do you future-proof the content to be available across such a broad range? What does that look like, and what devices does that go to?
Lorne: We’ve kind of transitioned away from PC-powered VR. We think that a lot of the future is going to be based around standalone devices. As computers get smaller, faster, and more portable, people are just going to want to get away from the cumbersome setups of sensors and move toward easily portable and scalable devices. Things like the Oculus Quest and Oculus Go make it really easy for adoption, and then you see the Vive Focus and the Focus Plus, which work equally well. They’re much more portable and scalable for businesses to adopt, whereas two, three years ago these devices didn’t exist. So it’s hard to predict where things are going to be in another two years, based on how fast the industry is moving.
Lorne: From the backend side of things, for programming, something to be aware of when developing these – CG-based, especially – is that there’s a lot of downsampling of various graphics, because the standalone devices simply can’t push the same amount of power and graphic quality that the PC-powered devices can. So a lot of the time we have to really dumb down or filter down the polygon counts, just to make sure that the standalone devices can still push a decent-looking scenario without being overloaded, so as not to cause frame-rate issues and nausea.
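One way to picture the constraint Lorne is describing: the refresh rate fixes the per-frame time budget, and the geometry has to fit inside it. Below is a small sketch with assumed numbers – the triangle throughput and overhead figures are illustrative, not published limits for any headset.

```python
# Frame-budget sanity check for a standalone headset. All budgets here are
# illustrative assumptions, not measured or published device limits.

REFRESH_HZ = 72                         # a common standalone refresh rate
FRAME_BUDGET_MS = 1000 / REFRESH_HZ     # ~13.9 ms to render each frame

TRIANGLES_PER_MS = 30_000               # assumed throughput of the mobile GPU

def fits_in_budget(scene_triangles: int, overhead_ms: float = 4.0) -> bool:
    """Rough check: does the triangle count leave room for everything else
    (draw-call overhead, post effects, compositor) inside one frame?"""
    geometry_ms = scene_triangles / TRIANGLES_PER_MS
    return geometry_ms + overhead_ms <= FRAME_BUDGET_MS

print(round(FRAME_BUDGET_MS, 1))    # ~13.9 ms per frame
print(fits_in_budget(250_000))      # True: ~8.3 ms geometry + 4 ms overhead fits
print(fits_in_budget(500_000))      # False: ~16.7 ms of geometry already blows the budget
```

Dropping frames on a headset is worse than on a monitor because it translates directly into judder and nausea, which is why polygon counts get cut until the scene comfortably fits the budget.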
Alan: Very interesting.
Lorne: It’s definitely something that developers should be aware of, or businesses looking to adopt the technology.
Alan: What’s the biggest challenge that you’ve found in the adoption of this technology?
Lorne: Tracking issues have been one of the biggest hiccups for us. Before the Focus Plus came out, we were really stoked that standalone VR was finally here, and we ported over a lot of our platforms to the Focus. Then we ran into a wall with tracking issues – the controllers would lose tracking when you put them behind your head, for example, simply because the headset only had front-facing cameras. The Oculus Quest has helped a little bit with that, because it has four cameras on the front and they’re kind of like fisheye lenses, so they track a little bit better to the front, to the sides, above you, and below you. But still, you’re going to lose tracking if you have to put your hand behind your back for whatever reason. So that’s something that’s been a challenge for us in developing some training scenarios.
Alan: I think the hardware itself is growing by leaps and bounds. They’ve made really, really big strides in getting it all into one unit without having to have a computer. And one of the biggest challenges with VR has always been just getting it to work. You set it up, and then all of a sudden you’ve got 30 Windows updates, and then another Steam update, and by the time you’re ready to go, there’s an hour gone. Your training time is missed.
Lorne: Yeah. Definitely something to be aware of. I think we’re going to see a lot of advancements in technology in both consumer markets, as well as industrial and commercial applications. Something that we’ve been really excited about is, we’ve just been testing the RealWear AR headset.
Alan: They raised 80 million.
Lorne: Yeah, they raised a ton of money, and they’re really competing with the HoloLens. Well, it’s not really competing, in a sense, because the HoloLens is more for a static environment, whereas the RealWear is more for on-the-job, task-based, ruggedized training. And I think there’s gonna be a lot of potential for hardware – mixed reality based hardware – in the future. I think they’re going to combine a lot of AR and VR for ruggedized use in the field. I think that’s where the immersive training side of things will move towards, although it is hard to predict.
Alan: I got to go to PTC’s LiveWorx in Boston and I tried the RealWear headset. Basically, what it is, is a little articulating arm that mounts to your construction hat, and it’s like pulling down a screen in front of you. Like imagine pulling up your phone, right? But you pull up a little screen and it’s like having a 9- or 10-inch tablet that’s maybe a foot away from your face, in one eye.
Lorne: Interesting.
Alan: But it’s ruggedized. It’s waterproof, it’s bombproof. It’s like this big rubber arm. Now, the issue with it – and they’re going to address this, I’m sure, in subsequent versions – is that finding that little sweet spot of getting it right in front of your eye in the right spot is kind of finicky; you kind of have to wiggle it. And then once you get it, it’s usually fine. But I put it on, and they have this thing called Expert Capture. What that means is, you can use the camera on this thing to capture– let’s say, for example, I’m an expert. I go up to a machine – in the case I tried, it was a tractor – I look at the tractor and I say, OK. I hit record, and I record how to replace the air filter. Then I hit stop. Now, that’s recorded forever, and it can be pushed out to every headset. So what I do is I put on the glasses, and it walks me through step by step. A little video said, "Here, go here, pull off this cover, replace the thing, put the cover back, make sure the switch is turned." And that was it. And I replaced an air filter on a tractor. I’ve never touched that before. I’d never even been on a tractor before. But that little heads-up display gave me all the information I needed, in real time.
Lorne: So do you think you could do that on a real world tractor now that you’ve learned it in the headset?
Alan: Oh my God, yes. I’ve done it. So it’s in my head. Obviously, I don’t know the model of tractor. So it would vary by model. But if you put me in front of that model tractor and said change the air filter, I go to the back of the tractor, I climb up, I pull the air filter out. I know exactly where it is. Yeah, I did it.
Lorne: It’s amazing.
Alan: It’s not something that you told me about or I learned on YouTube. I did it. I did it in real life with my hands. And I think being able to train people in VR is one thing, where you need a completely virtual and safe environment, but taking elements of that – the 360 video, or just the information you need at the time you need it – into the real world is really important. That’s why I think RealWear is a really excellent, elegant solution, although it is very low tech, if you think about it.
Lorne: Yeah, I think being able to use your hands in the real world – just that hands-on element – creates much better retention for learning overall, versus a scenario where you’re using controllers. You’re still learning, but being able to get your hands dirty, if you will – I think that, maybe even more than VR, may help learning retention. So it’s interesting to see where the space goes in the next couple of years.
Alan: Yeah. There’s a trial we’re going to run: an excavator VR experience made by a Toronto company called Career Labs. The first thing you do, you learn how to start it and what all the controls do, and then you drive it. You go grab some rocks and put them in a dump truck. So we’re going to put my daughters, who are 11 and 15, in the scenario for an hour each, and then we’re gonna take them out onto an excavator and see if it translates from an hour in VR to being able to operate a real excavator.
Lorne: That’s great.
Alan: Well, we’ll see.
Lorne: See how the results are.
Alan: It’ll either be awesome or they’ll destroy a couple hundred thousand dollar excavator.
Lorne: [laughs] Let’s hope not.
Alan: I hope not. I have confidence in the VR training.
Lorne: [laughs]
Alan: So what’s next for you guys? You’re expanding, you’re growing, you have a new office in Toronto. What’s next?
Lorne: I guess I’d like to touch on Reality Well, because that’s a subsidiary brand that we’re launching. We actually just launched the website, and we’re doing a bunch of PR right now for it. It’s basically a platform built for standalone VR – for the Vive Focus or Oculus Quest – with a health care focus, for measuring improvement in quality of life. We’re really focused on retirement homes, hospice centers, places like that for the elderly. We want to help with cognitive thinking, memory retention, and improving mobility, as well as just adding entertainment and improving mood for people that are otherwise bedridden or just bored out of their minds.
The platform itself is fully contained, with three sections. The first section is CG-based environments that are playful and fun, with animals and interactivity; they’re just meant to be light and fun for the users to explore. There are things like winter scenes, beaches, forests – very vibrant colors, all CG-based. The second part of the platform is real-world 360 videos and photos that we’re slowly procuring in 8K stereoscopic 3D – the highest quality that we can really develop for – and it’s all our own content. It’s places like landmarks all around the world, bucket-list items. I’m actually going to Italy in two weeks to film more content there as well. That’s a great way for the users to visit places that they may not get a chance to visit in their lifetime. The last part of the platform is minigames – they’re called exer-games, or serious games, in the healthcare community. We’re working with the University of Waterloo to validate these games, to help with things like mobility and memory retention. Some of the games are like rock-balancing games. There’s a music game – it’s kind of like Beat Saber, but you’re on a beach and there are beach balls coming at you instead of the Beat Saber blocks. It’s a lot of fun; they really enjoy it so far. We’re developing more games for that as well. There’s a fishing game that we’ve almost finished, and there’s gonna be a farming game as well.
Alan: So let me get this straight. You’re hitting beach balls on the beach. Is it things like, [hums jitterbug tune]? Is it like big band swing music? Clearly not techno music like Beat Saber.
Lorne: No, no, it’s not techno. It’s more classical – chill, laid back, relaxing type of music. This is definitely aimed at a different crowd than the Beat Saber crowd.
Alan: Not going to have the Skrillex remix?
Lorne: No, no dubstep here. It’s to help increase their mood and just overall entertainment. So it’s–
Alan: Are you collecting data about these people as well?
Lorne: Yes. Yes.
Alan: For the health care providers, so that they can help with– because I can imagine there’s some depression and there’s some loneliness, so…
Lorne: Yeah, there are analytics across our whole platform, and there’s a rating system as well for a lot of the experiences. After they’ve tested out each one, they can rate it on a scale of 1 to 10, so we can try and drill down on what they like the most. Right now we have pilots in about six different health care facilities, and we’re gauging and measuring to see which types of scenarios and environments they like the best. So far, they seem to love animals. We filmed at the Toronto Zoo, and that’s one of the favorite 360 video experiences that we’ve shown them so far. And you know what it is? It’s the 3D. When you’re filming in stereoscopic 3D – let’s say you’re looking at a horse ranch – you almost want to reach out and touch the horse’s head, because it feels like it’s right there in front of you. So it’s really amazing what we’re able to do with the technology nowadays.
Alan: It’s really fantastic, being able to provide such a wonderful service to seniors who may or may not be able to get out, or maybe their memory is failing. And it’s just, it’s wonderful.
Lorne: Yeah, it’s definitely heartwarming. And I really hope that it helps. And we can grow this to provide it to as many facilities as possible, because I think this could be super beneficial for a lot of people. You know what it is? It’s like bucket list items. If I’m 80 years old and I can’t travel anymore and I never got to go to Machu Picchu, bring me a headset and give me a 3D video or tour of Machu Picchu, so I can feel like I’m there. To me, that is truly amazing. And that’s what we’re trying to provide.
Alan: That’s wonderful. So that leads me to my last question. What is one problem in the world you want to see solved using XR technologies?
Lorne: I think the most impactful thing that XR technology can do is train people who save lives – people in roles like firefighters or police officers, in high-risk scenarios – the army is definitely a huge one as well – any type of role that carries a really high element of risk in real-world scenarios and has the potential to save lives. I think that is where I’d like to see the technology used the most. If we could leverage the technology to mitigate risk in those risky environments, and at the end of the day this technology is used to save lives, I think that would be a beautiful thing to use the technology for.