Ask someone with enough experience in 360 filmmaking (like Alan), and they’ll tell you — it hasn’t always been a user-friendly undertaking. From exporting to editing, making great 360 content can be a chore. Insta360 Marketing Director Michael Shabun visits the podcast to explain how their products make the process seamless for all 360 filmmakers.
Alan: Welcome to the XR for Business Podcast with your host, Alan Smithson. Today’s guest is Michael Shabun from Insta360. He’s the marketing director of Insta360 and leads North American marketing strategy, partnerships, and communication efforts. Michael specializes in helping overseas brands build their presence in North America. Prior to joining Insta360, Michael led the Business Development Team for DJI — that’s the crazy drone company in North America — where he was instrumental in moving the company into the public spotlight through a series of strategic partnerships with entertainment, sports, and enterprise verticals. If you want to learn more about Insta360 and the awesome cameras and platform that they’ve built, you can visit insta360.com.
Michael, welcome to the show, my friend.
Michael: Thank you so much for having me, Alan.
Alan: All right. Tell us what Insta360 is, and how you got involved with it.
Michael: It’s been quite a ride the last couple of years. Insta360 actually started off as a very tiny company in the dorm room of our founder, JK Liu. What he wanted to do was create a product that was simple and easy to use and had 360-degree capabilities, and he didn’t really see an all-in-one product on the market like that at the time, four years ago. And so he created the hardware and wrote the software to make 360 truly a consumer product. In those four years, Insta360 has grown to become the global leader in 360-degree cameras, whether it’s on the consumer, prosumer, or professional side. We now have 11 products in market today that range from tiny little portable cameras that are fun for social media, all the way through to cinematic cameras that now shoot 11K. We run the gamut in terms of the cameras we have on the market and who we cater them to. And at the end of the day, it’s really all about the user experience. How do you create a powerful 360 camera tool, but also give it the ease of use of a consumer product, so you don’t have to spend too much time in post and all those things?
Alan: Insta360 in my mind really stands out above the crowd. For you guys to be the number one 360 camera company is saying a lot, because there have been a lot of entrants into this market. Samsung; Nokia entered with their OZO; Jaunt — which recently just got sold to Verizon — they had their Jaunt One camera. There have been a ton of companies trying to come to market with a 360 camera. There was even the Bubble Cam out of Toronto. But where you guys, in my opinion, have really made a big impact — and I love this about it — is two things. One, ease of use. I can take a photo, it stitches on my phone. I can take a video, it stitches on my phone. But then, the user experience on my phone is absolutely spectacular. I can create a tiny planet, I can create an animation, I can post it directly to all my social media platforms, instantly. And that’s where I think some of the other larger companies have failed. They’ve created amazing hardware, but they failed on the delivery of the actual experience, from the hardware to the software, out to how people actually want to use it. Where did you guys come up with the idea of the stabilization? Because this is key to VR. One of the key things about the Insta360 platform is that I can run down the street with a 360 camera, and the software automatically stabilizes it. That’s pretty badass.
Michael: I completely agree. I think stabilization is what brought the technology full circle, and what started enabling it to be widely used by not just VR content creators, but also by traditional filmmakers who just want special angles or to capture certain moments in a much easier way. And just backing up for a second: what most people don’t see us as, and what Insta360 truly is, is a software company. That’s what you were talking about as being the difference in creating a product that from a hardware standpoint is powerful, but from a software standpoint is easy enough to use that it’s not going to hold you back or create these lengthy post-production delivery timelines. And that’s something that really tripped up the whole industry several years ago: the hardware was there with some of these companies, but when it came to actually delivering this content, or editing it, or sharing it across any platform, that’s where all the roadblocks were. So what we decided was, we need to blur the lines. We need to make 360 content creation just as simple and easy as traditional 2D flat capture. And a big part of doing that was adding cinema-grade stabilization. With a 360 camera, it’s a little bit different than stabilizing a traditional flat camera, because you’re essentially capturing everything. And since you’re capturing everything, the way the stabilization works and the algorithm that you need to program are a little bit different. Not getting into too many technical details — which my engineers, I’m sure, would love to chat about — but with our One X camera, which is the world’s most popular consumer-grade 360 camera, it’s actually doing two things at the stabilization level. The first is we’re packing insanely powerful gyroscopes directly into the cameras.
Alan: That makes sense. I couldn’t figure it out. I was like, why is this so damn good? That makes sense.
Michael: There are two types of stabilization happening in this small little camera. There’s the stabilization at the capture level, which is essentially just using the gyro and its data to stabilize in capture. And the second is through a post-processing stabilization algorithm that we call FlowState. FlowState is something that we came out with in the last generation of our camera, but we’ve been constantly refining it, perfecting it, and making it an even better, more dynamic stabilization algorithm. So with FlowState, you’re stabilizing again during post. That’s something that’s available in our app, whether you’re using an Android or an iOS device, or if you want to stabilize through our desktop software. You’re essentially getting cinema-grade stabilization at the touch of a couple of buttons, right from your phone. And that’s something that’s created a really smooth and dynamic workflow for all of our users. You don’t need to be a super well-trained videographer, and you don’t need to buy or rent tons of stabilization gear to stabilize your camera. All you need is the actual device, our invisible selfie stick or whatever you’re mounting it on, and the willpower to actually do it. So now we’re seeing all of these folks in the action sports industry, in the filmmaking space, incorporating the One X into their workflows, whether it’s specifically around 360 capture for VR, or just adding special angles and pulling off really unique shots.
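For listeners curious what gyro-assisted 360 stabilization looks like in practice, here is a minimal sketch of the general idea: re-projecting each equirectangular frame through the camera orientation recorded by the gyro. It is written in Python with NumPy and SciPy as an illustration only, not Insta360’s FlowState implementation, and the function and variable names are hypothetical.

```python
# Sketch only: gyro-based 360 stabilization by re-projecting an equirectangular
# frame through the inverse of the camera's recorded orientation. Not FlowState.
import numpy as np
from scipy.spatial.transform import Rotation

def stabilize_frame(frame, orientation):
    """frame: H x W x 3 equirectangular image; orientation: scipy Rotation
    giving the camera's pose from the gyro for this frame."""
    h, w = frame.shape[:2]

    # Longitude/latitude grid of the stabilized (output) image.
    lon = (np.arange(w) + 0.5) / w * 2 * np.pi - np.pi
    lat = np.pi / 2 - (np.arange(h) + 0.5) / h * np.pi
    lon, lat = np.meshgrid(lon, lat)

    # Output pixels as unit direction vectors on the sphere (world frame).
    dirs = np.stack([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)], axis=-1)

    # Map each stabilized direction back into the shaky camera frame
    # (assuming 'orientation' rotates camera axes into world axes).
    src = orientation.inv().apply(dirs.reshape(-1, 3)).reshape(h, w, 3)

    # Back to equirectangular pixel coordinates in the source frame.
    src_lon = np.arctan2(src[..., 1], src[..., 0])
    src_lat = np.arcsin(np.clip(src[..., 2], -1.0, 1.0))
    x = ((src_lon + np.pi) / (2 * np.pi) * w).astype(int) % w
    y = ((np.pi / 2 - src_lat) / np.pi * h).astype(int) % h

    # Nearest-neighbour sampling keeps the sketch short; a real pipeline would
    # interpolate and smooth the orientation track across frames.
    return frame[y, x]

# Hypothetical usage: undo a 20-degree roll reported by the gyro.
frame = np.zeros((960, 1920, 3), dtype=np.uint8)
roll = Rotation.from_euler("x", 20, degrees=True)
steady = stabilize_frame(frame, roll)
```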
Alan: Like the flying one where you can attach a camera to little plane wings and fly it through a scene.
Michael: Exactly.
Alan: That’s so badass.
Michael: And that shows that at the end of the day, we’re a company that inspires creativity. We want people to be able to achieve their visions without necessarily needing to spend all this money or go through all this planning. We just want to make it easier for them to capture really great and amazing and unique shots. And what you were talking about with our accessories: we’re continuing to make accessories for our cameras that allow you to create special shots in moments that you wouldn’t be able to capture simply with a normal camera. With the Drifter, you’re seeing that creativity come to a peak, where we’re taking essentially a drone shot and a cable cam shot, combining them together with a $50 accessory, and allowing people to throw their camera at a hundred frames per second and create these stunning shots. It always kind of leaves people’s jaws dropping, because when they see the footage, they’re always wondering, how did you get the shot? And then we’ll show them the behind-the-scenes. I love seeing the reactions on folks’ faces when they see it.
Alan: For sure. It’s just awesome. You guys sent us a Nano a few years ago, and my wife carries it with her everywhere. It plugs into her iPhone, and she must’ve taken thousands of photos with this thing. Every meeting we do, we get everybody in a circle and take a picture. She shows them the inside-out view, the tiny planets, and then basically in two seconds sends it to everybody in the meeting and they post it to all their socials. It is the greatest icebreaker ever for going into a company, because it’s instant, it’s great. Beyond that, you’re able to create completely new 2D videos from the 360 perspective. So you’re able to capture in 360, and everybody thinks, oh, if I capture in 360, I’ve got to look at it in VR, which is cool. But I think being able to take this and make a tiny planet out of it, or make a reverse where it kind of spins from tiny planet into the scene. And the crazy thing is, you guys have built all of this into the software, so with the touch of a button I can see six different animations, pick the one I want, pick the starting point, the endpoint, and boom. Now I’ve got a social media post in seconds. So here’s my advice: anybody who is in marketing, whether it’s marketing peanuts or marketing space shuttles, you should definitely go buy one of these cameras. Now let’s talk about your camera line. But I want to say one more thing before I let you go on about the different cameras, because they do serve different purposes. Go back four years: we were using hand stitching and manual stabilization, costing tens of thousands of dollars per minute of finished footage, and now your tools do this on my phone. That’s where we’re at.
Michael: It’s incredible how quickly the technology has evolved, and I just have to give a shout-out to the Insta360 engineers; they work tirelessly day and night to constantly improve the app and the algorithm, and they’re some of the most brilliant people I’ve ever had the opportunity to work with. And it’s truly incredible, the type of innovation and technology that’s coming out of China as a whole right now, across the board.
Alan: It really is. Your team– I mean, look, we’ve worked with other 360 camera manufacturers over the years, and we’ve always used your camera to show them, like, “Hey, guys, this is what your thing needs to do.” And I don’t know if they just don’t understand that or whatever, but they’re just, “We build hardware.” OK. Well, yeah, but hardware’s useless without software. So I think that what you guys have done is just truly spectacular. So let’s talk about the different types of products. We’ll talk about consumer and then professional and enterprise, because I think it’s important to talk about this. You guys just recently sent me the Evo, so let’s talk about the Evo first, because it’s pretty incredible.
Michael: Sure. So the Evo is one of our newer products. It came out a little less than a year ago; I believe it was last April. The Evo is a consumer-sized camera. It fits in the palm of your hand and it’s a convertible, so it converts from 180-degree 3D to full 360 degrees. And the great part about this is how simply it goes from mode to mode. One challenge that we started with was, when you’re flipping it from 180 degrees to 360, how do you account for the calibration? How does the camera know that it’s going from this mode to the other mode? And how simple is it going to be for the end-user to actually go through that process? And so what we did was we–
Alan: How easy is it? As a user, I got it out of the box and I pressed a little button on the side. I slid the lock, I opened it up, slid another lock, and it locked into place at 180, meaning the two cameras were pointing in one direction, and took a picture. Then I pressed the button again, moved the lock, folded it over, locked it, and took another picture in 360. Literally, it could not be any simpler. And of course, nobody reads the manual, myself included. So it was literally that easy. You took all the guesswork out of it.
Michael: And that’s the exact goal. We don’t want users to even think about that. We just want them to switch from mode to mode and not have to be concerned about, “is this going to work perfectly in either mode that I’m working in?” That was, I would say, the hardest part in building the camera from a development standpoint. And we had some great partners along the way that helped us co-develop and beta test this, including our friends over at Oculus, who are really making a strong push for 180 3D these days for their headsets.
Alan: Yeah. And the 180 is interesting, because I have to also call out your packaging of this product. When I got the box, first of all, it’s a beautiful small box, the size of a consumer electronics box. You flip it over and there’s a lenticular image on the back of the box of some people at a birthday party. But it’s fully 3D, and it just has this depth and feeling to it that’s absolutely incredible. And that was a photo taken using the 180 mode. This is really incredible marketing. When you open the box, the camera’s inside, but there’s also a VR viewer, like a little Google Cardboard kind of thing, a little plastic thing that you slap on your phone. So I took a picture in 180 and I was like, okay, what is this 3D all about? I put it into VR, and I couldn’t believe it. It just brought the photo fully to life in three dimensions. It was just really, really cool. So how are people using this?
Michael: So it’s actually interesting how lenticulars have been around for, what, 30 years now, from those old baseball cards.
Alan: Lenticular business cards.
Michael: Yeah. And–
Alan: We had lenticular business cards a decade ago. [laughs]
Michael: And so it’s so interesting to see this material that’s been around for so long. But we’re now re-integrating it into a completely new technology that’s coming out. So the point of the lenticular back case was that– and we actually have to send you one of these, we have a phone case that goes right over your iPhone. It’s a clear phone case. It goes actually on the front of the screen. And once you put it on, it turns your iPhone display into a lenticular display. And so when you shoot 180 degrees 3D photos or videos, you can push two buttons inside of your Evo app and it’ll convert it to a mode that will actually play your video that you’ve shot in 180 3D as a holographic video.
Alan: [explosion sound] That was my mind exploding.
Michael: [laughs]
Alan: So basically what you’re saying is I can take a 3D 180-degree video, and then play it back on my phone with this case. It’ll– I guess I assume it’s like a lenticular case that allows me to see it in 3D.
Michael: That’s exactly it.
Alan: That’s badass. I definitely want that! That’s awesome.
Michael: So we have one on the way for you, and we’re excited to get your feedback on it. That’s one really cool, niche, unique way of using a 180 3D camera. But overall, I would say there’s this mega push happening right now from some of the biggest tech companies in the world. You see Facebook with Oculus. You see Adobe. You see Google. All the tech giants right now are making this huge push for 180. And as a result of that, we decided to work with them to create the Evo. We’ve done countless workshops now, and we did some really great introductions with the YouTube VR Creator Lab, where we actually saw creators get a choice between an Evo and some of these more high-end, expensive 180 3D solutions. And once we had a chance to intro the Evo and talk about what it can do and how it can make their content capture easier, we saw jaws drop and we saw people actually taking the $400 camera over the $2,500 camera. It was incredible to see, because one challenge — especially with YouTube content creators — is that when they first started dabbling in 360 or 180 3D, there were a lot of roadblocks along the way. Those roadblocks ranged from simple things like not getting a live view of what you’re recording, to connectivity issues, having to be too far away from the camera and then not seeing the image that you’re recording. And then when it came time for actual post-production and delivery, they were having to go through this whole long workflow where you had to stitch your footage. You had to–
Alan: I remember those days. Those days sucked!
Michael: And so after they’ve gone through this whole workflow of stitching and editing and exporting and–
Alan: Stabilizing.
Michael: Yeah. And there’s like ten steps. And then when they finally get a chance to view what their export looks like in a headset, and there are two or three things that they need to change because they couldn’t see it through the pipeline along the way, they gave up and said, “I don’t want to do this anymore.”
Alan: Yeah, 360’s too hard.
Michael: Yeah. “I’ve spent a week just exporting this video, and when I finally put it into my Oculus or whatever I’m viewing it in, it doesn’t look good.” And they don’t want to spend the time to go back and do it again. So what we thought was, “Well, how do we make every step of the production and post-production process easier for these folks?” Once we showed them, “Hey, you can actually connect your Evo through Wi-Fi to an Oculus Go headset or to an HTC Vive, and you can adjust your exposure and your camera settings live, in-headset, and see what you’re actually recording,” it was a huge game-changer. And then we showed them that, after you record, you can stream whatever you just shot directly into your headset, so you don’t need to go home, put it on your computer, and go through the whole process. Because otherwise, if you need to reshoot something, you’ve already lost your location, you’ve lost your talent, you’ve lost everything that you had on production day, and you simply can’t go back and do it without more budget. This has revolutionized the industry for them, because now the pipeline for delivery is much shorter and you’re not getting as many headaches throughout the entire production process. So we’re really listening to our end-users and learning about what their problems are, and we’re trying to solve each problem every step of the way.
Alan: And it shows, it really does. It shows in a number of ways. It shows in the product line. We’ve talked about the Evo, which is 180 switching to 360. But you also have the Nano and the Nano S, which are basically cameras that snap onto a phone, onto an iPhone, and let you shoot right from it. Then you’ve got the One X, Go, and One, which are kind of the action cameras that can be used separately. And then I would assume the software works with the One X on both iPhone and Android.
Michael: Yeah. Yeah. It’s completely non-dependent on which phone you have. We support most models of newer phones; the workflow and the user experience are a little bit different between Android and iOS, just in terms of how both operating systems interact with the products, but the experience in the app is the exact same. Where we started, the Nano and the Nano S, are obviously our older product lines. But what we proved with those is that you can have this small phone attachment and you can simply and easily capture, edit, and share photos and videos in 360 to social media without ever having to pull the microSD card out of your camera. So showing that you can actually upload 360 videos and photos faster, in some cases, than you would from a traditional camera, or even edit from your cell phone, is a huge milestone, not just for our company but for the whole industry. And cameras like the Nano S had some really unique features, in that you can live chat with somebody, like FaceTime in 360, and you can give the person on the other end of the camera full control of your 360 camera, so they can actually pan and scan around your 360 without having to have one themselves. So we’ve had these really unique features that we’ve come out with that just aren’t achievable with a normal camera.
Alan: It’s incredible. Absolutely. And I mean, if I wasn’t a user of this, I wouldn’t know what you’re talking about. But I’ve been using these cameras. The first time I met Max from your team was at the UploadVR party, maybe four years ago now, when they launched their studio in LA. And you guys had the Insta360 Pro there, the very first Insta360 Pro. So let’s move away from the consumer side. I would recommend that anybody who’s in marketing or sales and wants to create, capture, and develop great content, for kind of the lower-level content, use the Go, the Evo, the One X, the Nano, all of those. But when you want to go the next level up and you want to put something in VR and you want something to be future-proofed, you guys have the Pro, the Pro 2, and the Titan. So walk me through that, because the first time I saw it, it was in a little low-light showing within Upload’s studio. It was dark, and I put the headset on, and in real time I was seeing three-dimensional, stereoscopic 360. It was just incredible. And now you guys have made it even better. The Pro is what, 8K?
Michael: Yes, the Pro shoots up to 8K.
Alan: And the Titan is 11K. To put in perspective, the headsets are still displaying at 4K. So you guys are future-proofing people’s content as well with this.
Michael: And that’s the goal. The challenge in our industry is definitely on the headset and the viewing side. The technology just isn’t quite keeping up with the camera tech. But we’ve gone above and beyond and undercut the system a little bit. And I know I’m jumping forward, but I think this is important to make that distinction. There are three inherent challenges from a professional VR content capture perspective. The first is obviously production, and we can get into that. The second is post-production. And the third is delivery and viewing. So there are basically problems at every step of the way, right? On the viewing side, what we did about a year ago was we came out with a technology called Crystal View. Crystal View allows us to play 8K videos — or higher, now — on non-8K devices, and that’s available on iOS and Android devices, your smartphones, your tablets, and headsets like the Oculus Go and HTC Vive. What this does is use a technique that Google made popular some years ago, something called dynamic rendering. What it’s doing is packing as many pixels as possible into your immediate field of view and diluting everything else that you’re not looking at. And if you whip your head side to side and you’re looking at other perspectives in your 360 headset, it’s doing this live, so you’re not seeing any lag, you’re not seeing any latency. That’s basically what we call our version of playing higher-res content on non-8K, 10K, and 11K devices. So you’re actually getting higher resolution, even when the device itself is maxed out at 4K.
Alan: So you’re basically using head pose to render almost like foveated rendering.
Michael: It’s– that’s exactly it.
Alan: Wow. So for people to understand what that means: in order to render the full 360 scene, think of how much data has to be put into a headset to make everything super crystal clear behind you, which you’re never looking at until you turn around. So you’re basically saying, most people’s eyes — well, everybody’s eyes — only see the middle 5 percent of what you’re looking at at any given time. So I would assume that as we progress with these headsets to have eye-tracking, your Crystal View will actually get more refined into that five degrees, rather than whatever it is now, maybe 50 degrees, I would think. That would be even more so as we get eye-tracking involved. Is that correct?
Michael: That’s it. It’s also about how you think about a 360 video, and I know this is probably trivial knowledge for you, since you’ve been using our tech for so long. But for everybody listening who’s new to 360: we don’t look at resolutions the same way as with a normal flat camera. When you’re watching a 4K TV, everything is in front of you. All 4K of pixels are right there. When you’re looking at a 360 video, at any moment in time, depending on what your field of view is, you might be looking at a quarter or a third of the total 8K or 11K, or however big the resolution of the video you’re watching is. In the past, everything else was still being rendered out at that same resolution. And so we thought, well, why would you need everything else rendered at the same resolution when you’re only looking at a certain portion at a given time? So we’re basically just maximizing the pixels and allowing whatever is in your field of view to be played at an even higher resolution, instead of balancing it across the whole 360 video, most of which you’re not looking at.
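To put rough numbers on that, here is a quick back-of-the-envelope sketch in Python. It is my own approximation, not Insta360’s math: with a roughly 90-by-90-degree viewport, you only ever see about a quarter of an equirectangular frame’s width and half its height at once, so most of an 8K or 11K frame sits outside your field of view at any moment.

```python
# Rough approximation (not Insta360's numbers): how much of an equirectangular
# 360 frame falls inside a typical headset viewport at any instant.
def pixels_in_view(width, height, h_fov_deg=90, v_fov_deg=90):
    """Slice of the frame covered by the viewport: h_fov/360 of the columns
    times v_fov/180 of the rows (a crude equirectangular approximation)."""
    cols = width * h_fov_deg / 360.0
    rows = height * v_fov_deg / 180.0
    return cols, rows, (cols * rows) / (width * height)

# Assumed frame sizes: "8K" as 7680 x 3840 and "11K" as roughly 10560 x 5280.
for label, (w, h) in {"8K": (7680, 3840), "11K": (10560, 5280)}.items():
    cols, rows, frac = pixels_in_view(w, h)
    print(f"{label}: about {cols:.0f} x {rows:.0f} px in view "
          f"({frac:.1%} of the full frame)")
# 8K: about 1920 x 1920 px in view (12.5% of the full frame)
# 11K: about 2640 x 2640 px in view (12.5% of the full frame)
```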
Alan: Absolutely. So you’re able to shoot with these cameras in super high res. Future-proofing is built in, so something you shoot today on the Titan is going to be as relevant five years from now as it is today, by the time the headsets actually catch up to this. If they ever catch up, and by then you’ll be shooting in 600K or something.
Michael: [laughs] Yeah. That’s assuming that we’re gonna stop at 11K. If you look at the recent trend of Insta360 cameras, the resolution’s only getting higher, the bit rate is getting higher. We’re now able to export videos in ProRes format from a 360 camera, which, even two, three, four years ago, was unfathomable.
Alan: [chuckles] It was a pipe dream, literally.
Michael: Yeah. So the idea is, with everything that we’re doing, whether it’s on the pro side or the consumer side, we’re standardizing it to what folks in the traditional film industry are used to. Our goal is to make our products as easy to use and as powerful as possible. We know that folks in professional, traditional film are so busy that it’s often difficult to take on these new tools and figure out the workflows. So with the things that we’ve done on the software side, we’re very much open to sharing our tech with these traditional programs. For example, when the Pro 2 came out last August, we partnered with Adobe to enable something in Premiere that basically allows you to directly import your unstitched 360 files at 8K into Premiere, and trim out everything that you don’t want to export. So just backing up for a second: we used to be in the days of having to stitch everything ourselves, like you pointed out.
Alan: And let me remind everybody how much that really sucked. Imagine taking six or twelve camera angles. And every time that those images cross, you had to physically go and blur them out. It really sucked. Anyway, go on.
Michael: And you had to have the skill to be able to do that. It’s not like anybody can pick up these unstitched files and just seamlessly put them together. It actually required a lot of knowledge and skill, so it was limited to only a few people. That’s one of the reasons, in addition to just being time-consuming, it was a drain on your budget. Nobody wanted to do it. So what we came out with, and a lot of other companies have this now, is auto-stitching. You’re looking at the camera, you’re looking at the live view, and it’s already being stitched in front of you. So that was kind of the revolution on that side. But then we took it one step further. Let’s say you’re shooting an hour of 8K content and you’re on a MacBook Pro like I am. That can take you overnight; at times it could take you a couple of days to stitch that out in full resolution. So even though you’re not sitting there in front of your computer stitching it yourself, it’s still time-consuming. It still takes a lot of time to get that done. So what we decided was, OK, why don’t we partner with Adobe, and why don’t we give people the chance, before they even export, to trim out everything that they don’t care about? If you only want five minutes of your one hour, or if you want three minutes or 10 minutes or however much you want, you can now export only the parts that you care about in full resolution. And we did this in two ways. We actually enabled the Pro 2 to record proxies simultaneously, as it’s recording the full-resolution videos. In doing so, you can now import the proxies directly into Premiere, and then you can trim and edit based off that.
Alan: Ah, so you don’t have to bring in the whole giant file. You can just trim and edit it, it’ll– Oh, man.
Michael: But it will also bring in the whole giant file as well. So if you’re doing something like motion graphics and you need to be frame-specific, you can toggle between the full-res and the proxy. Everything’s there for you. It’s all about the user’s individual preference and how you want to edit, how you want to export, and how you want to do all of your work. We’re very flexible, and we’re not forcing people into using our software, per se.
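Here is a tiny sketch of why that proxy-plus-trim workflow saves so much time (hypothetical numbers and names, not the actual Pro 2 or Premiere integration): trims chosen on the lightweight proxy map one-to-one onto the full-resolution recording, so only the selected spans ever get stitched and exported at full 8K.

```python
# Sketch only: trims made on a proxy map straight onto the full-res recording
# (same timecode and frame rate), so only the kept spans are stitched at 8K.
def frames_to_stitch(trims_sec, fps=30):
    """trims_sec: list of (in_point, out_point) pairs in seconds, chosen on
    the proxy; returns how many full-res frames actually need stitching."""
    return sum(int((out - in_) * fps) for in_, out in trims_sec)

total_sec = 60 * 60                          # an hour of raw 8K footage
keep = [(120.0, 300.0), (1800.0, 1920.0)]    # the two spans we actually want
full = total_sec * 30
kept = frames_to_stitch(keep)
print(f"stitching {kept} of {full} frames ({kept / full:.1%})")
# stitching 9000 of 108000 frames (8.3%)
```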
Alan: Wow, it’s really incredible. I got to ask you a question. So I’m looking at the Insta360 Titan website for a second. It says “8× Micro Four Thirds Sensors.” What the hell does that mean?
Michael: So the Titan is an incredible piece of technology, because we took what we learned from the Pro and the Pro 2, and we applied it to something that’s even higher resolution. One thing that the Pro and the Pro 2 are somewhat limited in is low-light shooting.
Alan: Ah, OK.
Michael: We took that learning and built in higher-resolution, Micro Four Thirds sensors, which enable you to shoot in low light. You won’t see a lot of pixelation, you won’t see a lot of blues or purples in the blacks. That makes it the ultimate low-light 360 camera. We’ve taken this thing everywhere and tested it, and we’ve had our partners test it in low light. And the feedback that we’ve gotten from it has been absolutely remarkable.
Alan: Yeah, I can believe it. I can’t wait to see stuff shot on this. And it’s interesting, because my first interview this morning was with Michael Mansouri from Radiant Images. They’ve been pioneering 360 cameras and that sort of thing since the very beginning, and I believe they’re one of your resellers as well. The cameras that they put together two years ago were– I wouldn’t say the equivalent of the Titan, because they’re not, but they were basically hacking this together using full digital SLR cameras in a custom-mounted rig, whereas you guys have it all in a unibody construction that lets you just pull it out, film with it, put it away, put your SD cards in. I even think there are hot-swappable batteries in these, aren’t there?
Michael: You can even have the Titan on house power, and then you don’t need to have a battery in it at all. So, yeah. And Michael’s a very good friend. We’ve worked together quite a bit and we do some great projects together. We actually just did the world’s first 8K livestream in 360 into a dome, about a month and a half ago. That’s a project we’ve been working on to promote our new 8K live stitching software. And if you think about that, we’re barely just now getting 8K TV sets, and 8K headsets are still a year or two off.
Alan: Minimum.
Michael: The fact that we’re able to take an 8K 360 video, stream it over 5G — or even a normal connection — into a remote venue that’s nine miles away, and give people the same immersive experience that they’re getting at a live concert, with off-the-shelf camera technology, is extraordinary.
Alan: That’s amazing. So let’s talk about some use cases, because we’ve talked about the technology quite a bit. What are people using this for in business, in enterprise? What are the best use cases that you’ve seen, other than entertainment, which is the obvious one, man?
Michael: I can talk about this kind of stuff all day long. There are so many incredible uses coming out for it.
Alan: [laughs] We’ve got 15 minutes, so go!
Michael: Oh my goodness. Well, just in the past couple of years, if you look at production and pre-production, there’s a growing number of use cases for 360 there. Take location scouting and what it used to be: you would have a person go out to a film location that’s being considered, and they would take hundreds of shots of every crevice in each place, and they would try to put that together and tell the story for a DP or a director of why this location is perfect for the shot. And even after they take the pictures, the DP and the director and the whole film crew will have to go to this location, sit there, and look at it. But now what we’re doing is taking a 360 video and essentially just giving them a walkthrough of that place. So it’s making it more efficient. And we’re looking beyond just location scouting; there are also uses in production design. We partnered with this great company called Matterport, and I’m sure you’re familiar with them. They do virtual tours and 3D models of homes and other locations. The idea with Matterport is that they had this giant camera that cost around five to six thousand dollars and it would shoot 360, but it was more complicated to use than one of our off-the-shelf small One X cameras that cost $400. So we took their software and combined it with our hardware, and we allowed people to create virtual tours with just an off-the-shelf camera that’s super easy to use. And we’re now applying this to location scouting and production design. With production design, they would take even more photos and measurements. They would have to go and measure from door to door, from floor to ceiling, every single measurement. After the location had been selected, they’d have to actually go in and spend days there measuring everything out, creating floor plans and designs. Now, with 360 technology and something called photogrammetry, you’re able to do that with a couple of clicks of a button. So it’s really making the pre-production process more efficient and affordable for everyone.
Alan: When you say photogrammetry — I’m quite familiar with that — how is your camera doing photogrammetry? Does it have a built-in depth sensor? Because I know the Matterport was a combination of RGB and depth sensors that captured almost like a depth map, so you could actually see everything in three dimensions. I think they had centimeter accuracy.
Michael: It’s mostly our partner software that’s enabling that. We’re taking the imagery, we’re feeding it into their software, and then their software is analyzing the data and using their photogrammetry algorithm to get all the measurement data. And it’s coming down to centimeter accuracy, which is quite incredible. There are a number of these software packages available now. I’ve been using Matterport, and I know they’re working on some cool photogrammetry stuff. One in particular that’s really cool is Cupix; they’re more of an industrial software that’s being used by folks in the construction space, in project development, engineering, things like that.
Alan: Very cool.
Michael: But beyond that, we’re seeing some really great additional use cases. Let’s take training, for example. Our partners over at Disney are starting to use 360 to train the drivers that give tours at their parks or on their other properties. They’re putting them in a VR headset, and they’re putting them in the golf cart that they use to drive around, and they’re showing them how you can talk and drive at the same time, which is actually pretty hard to do. They’re putting them in the virtual experience beforehand, letting people come in and get fully trained on this stuff in the virtual space, before they go out and take people into a real golf cart and drive them around. It’s a big safety concern. And training goes across the board. There are people getting trained in supermarkets with 360. There are people in jails, about to reintegrate back into society, being trained on specific jobs. So this is kind of a universal use case that’s emerging right now. Oculus has been working on a whole “VR for Good” campaign that we’ve partnered with them on. They’re going out there and telling the stories of people around the world, whether it’s war-torn areas, where you’re actually putting viewers into that space, or certain societies and people telling the stories of their cultures. 360 is a really big, growing use case for that as well.
We worked with a psychologist in New York City who is using exposure therapy to help treat people with PTSD, phobias, and other traumatic events in their lives. And the really great part is how you see the treatment evolving. He used to just build virtual experiences using game engines like Unity and Unreal, and it wasn’t realistic enough. It was basically like a cartoon, with almost gamified people that you were interacting with, and it wasn’t working. So the psychologist started taking real 360 footage. If he had a person that was scared of flying, or driving over bridges, or even walking down stairs, he would put himself in these environments, shoot the whole thing in 360, and then put his patients in those same experiences over and over and over again until they finally felt at peace with whatever phobia or traumatic disorder they had. And he’s seen a huge success rate with this type of treatment over the past two years.
Alan: That’s incredible. To think about it, you’re able to do this with such an inexpensive piece of technology. Even if you’re buying the Titan, which may be overkill for some of these things, but how much is the Titan? Let’s talk prices. How much is that thing?
Michael: The Titan is $15,000, US.
Alan: So $15,000. And how much would you think it would cost to create a 3D model of some of these things, in the hundred-thousand-dollar range?
Michael: Well, in the past it may have cost quite a bit, but today you can do it with a $400 camera and $10 a month with Matterport.
Alan: No, I know. But what I’m saying is, if you wanted to model it out and create a 3D model of it, it would still be hundreds of thousands. But with a camera that’s 400 bucks, you can now do this. Literally, the first thing I do when I go into companies is say, “Here, this camera here, you’ve got to go buy one of these. Even if you’re not going to do it in-house, at least have it for your marketing.” It’s just simple.
Michael: It’s a no-brainer. It absolutely is. There’s a different product for everybody, depending on what you want to do. Obviously, some people might not have the need for a Titan and might be great with a One X or a Nano or a smaller product. But others need more. For example, we’re working with Madison Square Garden right now. I’m not sure if you’re aware, but they’re building their Sphere in Las Vegas, which is going to be the world’s biggest LED screen at 24K; it’s going to be wrap-around, and the whole venue seats about 18,000 people. And they’re actually using two Titans to capture all the footage for that venue.
Alan: Amazing. What is one problem in the world that you want to see solved using XR technologies?
Michael: One big issue that we’re having right now is in the journalism space, in the newsgathering space. There’s so many different ways of telling your story. And right now, with everything that’s happening politically around the world, we just need real news and we need to see what’s actually going on. With 360, you can’t get any more real. And we’re seeing the journalism space being a huge beneficiary of 360 technology, whether it’s live broadcasting from disaster events like CNN has been doing, or just telling immersive stories that really hit home with whatever you’re trying to get across. I think that’s the future of all news and stories and coverage that’s going to be shared across the world.