Verizon's XR development lead, TJ Vitolo, dreams of a day when he can download an entire TV series in an instant, or visualize information about the entire world through AR glasses, even while sitting in a connectivity dead zone by the beach. In his position, he's able to work toward making that dream a reality by developing the technology that will make 5G possible.
Alan: Welcome to the XR for Business Podcast with your host, Alan Smithson. Today, I've got an amazing guest, TJ Vitolo. He is the director and head of XR Technology Development at Verizon, where he leads the commercial strategy and product execution behind Verizon's VR, AR, and 360 organization, Envrmnt. Recently, TJ and his team launched AR Designer, the world's first streaming-based AR toolkit that allows brands and developers to quickly and easily create augmented reality experiences with no technical expertise. You can visit Verizon.com or envrmnt.com. I want to welcome TJ to the show. Welcome.
TJ: Hey, thanks for having me, Alan.
Alan: Oh, it's my absolute pleasure. I'm so excited to have you on the show. All the things you guys are doing -- from working with the accessibility team at Cornell Tech, to your acquisition of RYOT, to working with the Sacramento Kings and Yahoo! News -- there is so much going on at Verizon. Do you want to give us a high-level summary of what you do, and what the plan is at Verizon for introducing 5G and XR?
TJ: It's quite dynamic here. You know, the VR space is ever evolving, and there are teams that do a number of things within VR here. But specifically, you mentioned RYOT. Between our team and RYOT, we cover both the content and creative end of XR -- that's RYOT -- and the technology side of virtual reality, which my team manages. So really, my team is focused on building tools and enablers, systems, and platforms on the 5G network -- sort of the underlying side of XR -- to help accelerate and grow the adoption of the technology. On the other side, RYOT is all about the product and the creative storytelling around VR, which really brings these things to life for people.
Alan: So you've got both the technical side and the creative. And this is something that I've been harping on with customers as well, and the industry at large: this industry is no longer about just making products. You look at the VC investments and they're investing in platforms and products, but you still need people to create the content. And I think you guys have found that balance with RYOT. What do you see as the future of how we create this content? Is it going to be user-generated versus studio content, or a mixture of both?
TJ: It's going to be a mixture of both. User generation is quite difficult today. One of the products you mentioned that we launched was AR Designer, and really the foundation for that was to put the power of augmented reality and virtual reality into the hands of even the most common user of technology. We built this platform initially with schoolteachers in mind -- not that they're simpletons by any means, but the fact of the matter is they're teaching students, young children, and they've got to have a very effective, efficient way to do that. So when we were building this tool, we baselined on children as the audience and schoolteachers as the users of the tool, to produce something that's really effective. So I think you're going to see, as VR/AR becomes more ubiquitous, that access is going to be much greater and more in the hands of users. At the end of the day, there's always going to be a community outside of the UGC community producing content, and I think those are the folks who are going to synthesize really compelling, powerful stories for users to grow that adoption. So you're seeing a lot where UGC sort of leads the way to broader, more institutional creation of content, but it could very much see a [inaudible].
Alan: So walk me through this platform that you guys have built. Was it in the market already, or-- walk us through that.
TJ: It was in the market. We pulled back on that platform specifically because we had changed the strategy of our team. Initially, I was brought into the organization around commercialization for Envrmnt -- which is Verizon's XR organization -- and we wanted to generate revenue off of the XR ecosystem. And there's a fair amount of money out there to be made. But at the end of the day, as we started to launch our commercial products, we also started to build up and prepare for our 5G launch strategy. And the task of my engineering team was to go down a few levels in the technology stack and start building platform enablers into the 5G network that will drive the adoption and growth of AR/VR. We still actually use that tool today -- we've got over 10,000 users internally at Verizon who use it across our training organizations, our HR organizations, our network operations organizations. So it's been very successful. There is still a plan to commercialize it in the future, but the idea was that we wanted to pin it against our 5G launch, to show what 5G can do for the XR space. That's where I'm super excited: what 5G specifically does for XR technology moving forward.
Alan: Absolutely. One of the videos that I watched of you was a retail demo, where you took a phone -- just a regular phone with 5G -- and you pointed it at some products on the shelf, and it not only recognized one product and gave you that standard AR image recognition with some overlay information, but it recognized all the products at once. And I thought that was a great way of showing how 5G will enable so much more than the simple AR we're used to now on our phones. And then as that moves to glasses, you'll be able to say, "I'm on a keto diet," and walk down the aisles, and anything that's keto will show up in green. I think that's where it's going. And that demo was really incredible.
TJ: Thank you. Yeah, I think AR, from a mobile standpoint, has been put in this bubble because of 4G, and that's one example of what 5G is going to do for AR: it's going to make it highly functional, highly useful, and a lot more entertaining. Computer vision and graphics rendering are the two fundamental underlying technologies behind virtual reality and augmented reality. And what we did there is expand the capability of computer vision by offloading what typically is done on a mobile device over the 5G network -- so extreme bandwidth, extremely low latency -- to a network node that sits within our network and is very high-powered from a processing standpoint. It allows us to offload all of that computer vision work and provide a response back in real time. This is something only possible over 5G, and only possible with our network edge. Fundamentally, it blows the current limitations of augmented reality out of the water.
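To make the offload idea concrete, here is a minimal sketch of the client side: a camera frame is posted to an edge recognition service and a list of detections comes back within a tight latency budget. The endpoint URL and response format are hypothetical assumptions, and a real deployment would likely keep a persistent connection rather than issuing one HTTP request per frame.

```python
# Minimal sketch of offloading per-frame computer vision to an edge node.
# The endpoint URL and response schema are hypothetical; a real deployment
# would use the operator's own edge APIs and likely a persistent connection.
import time
import requests

EDGE_ENDPOINT = "https://edge.example.com/v1/recognize"  # hypothetical

def recognize_frame(jpeg_bytes: bytes) -> list[dict]:
    """Send one camera frame to the edge node and return detected products."""
    start = time.perf_counter()
    resp = requests.post(
        EDGE_ENDPOINT,
        data=jpeg_bytes,
        headers={"Content-Type": "image/jpeg"},
        timeout=0.2,  # tight round-trip budget, only plausible on a low-latency link
    )
    resp.raise_for_status()
    detections = resp.json()["detections"]  # e.g. [{"label": ..., "bbox": [...]}, ...]
    print(f"round trip: {(time.perf_counter() - start) * 1000:.1f} ms")
    return detections

# Usage, with a frame already encoded as JPEG bytes:
# for product in recognize_frame(frame_bytes):
#     draw_overlay(product["label"], product["bbox"])  # hypothetical render step
```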
Alan: You're taking all the compute power off the device and putting it into the edge. You've been recognized at the Edge Awards already, winning Best Contribution to Edge Computing for R&D and Greatest Commercial Potential for an Edge Concept. So you guys are clearly leading the way on this. One of the things I saw last week in Wired was a startup building a new chip -- an artificial intelligence chip the size of an iPad. Rather than trying to make chips smaller and smaller like everybody else, these guys went the opposite way and made a huge chip that can do trillions and trillions of calculations. Obviously you can't put a chip the size of an iPad in your phone, but having the ability to offload to the cloud -- to have the processing power when you need it, where you need it -- is incredibly powerful, not only for rendering, but also for capturing the data that's around you. A lot of people don't realize that as much data as you're pulling down from the cloud for graphics processing and all that, you're also capturing point cloud data using RGB cameras -- and phones are starting to get infrared depth sensors now. So being able to capture that data, send it to the cloud, and make sense of it, all within milliseconds, is really going to be a game changer for VR and AR.
TJ: It's a massive amount of data, too. If you look at all those different sensors on those devices, it's crazy. And if you look at the future of volumetric video -- I believe Microsoft came out and said their studio does two terabytes a minute of data capture.
Alan: Yeah, the Metastage.
TJ: With a handful of cameras capturing texture, depth, and other sensor data. But you're right, the thing that's going to close the gap and put really powerful technology in your hands is an extremely low-latency, high-bandwidth network connected to very high-powered, scalable compute. And it's not just AR/VR, right? It's a lot of things, although I'm focused on AR/VR. We are now going to be putting supercomputers in everybody's hands.
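For a rough sense of scale on that capture figure, here is a quick back-of-the-envelope calculation; the 5G peak speed used for comparison is only an illustrative assumption.

```python
# Back-of-the-envelope math for the "two terabytes a minute" capture figure.
# 5G throughput varies widely by band and deployment; the peak below is an
# illustrative assumption, not a measured number.
capture_bytes_per_min = 2e12                              # 2 TB per minute
capture_gbps = capture_bytes_per_min * 8 / 60 / 1e9
print(f"raw capture rate: {capture_gbps:.0f} Gbps")       # ~267 Gbps

assumed_5g_peak_gbps = 2.0                                # illustrative mmWave-class peak
print(f"~{capture_gbps / assumed_5g_peak_gbps:.0f}x an assumed {assumed_5g_peak_gbps} Gbps 5G link")
```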
Alan: So, what 5G XR use case-- your focus is in XR. If you take 5G and edge computing to the nth degree, you've got autonomous vehicles, you've got drones -- there's all sorts of applications. But let's focus on 5G and XR for a second. What use cases do you guys at Verizon see happening first? We're already seeing it in enterprise, where they're using heads-up displays to help field and service workers, machine operators, and factory workers repair things, with see-what-I-see support and all of these types of things. But what's your roadmap for the next 10 years, let's say?
TJ: Yeah, it's an interesting question. Our organization is fundamentally working on the platform and services that will enable very thin, lightweight augmented reality or mixed reality glasses. So I think that's one big step: to move away from the clunky form factor to something that's super sleek and super powerful. How can I have a pair of standard Ray-Bans look and act like a HoloLens times 50?
Alan: [laughs] Oh my god, that's a huge quote. [laughs] Think about that: "How do I make a pair of Ray-Bans look and act like a HoloLens times 50?" Oh man.
TJ: That's the platform that we're building, right? That's the vision. Now, once you do that, the world is sort of your oyster in terms of what the use cases are. And enterprise is definitely the first entry point into that, because we will go through this evolutionary process with glasses hardware. It's not just the compute -- and we're solving the compute problems -- but you have to solve the display problem, and you have to solve a couple of other things. Ultimately, at the end of the day, you reduce battery size and battery power by reducing compute on that device, and then, through that step-by-step process, you get there. But in the meantime, you're going to get that adoption in the enterprise space. And so we look at -- from a use case standpoint within our enterprise organization -- things like worker safety, and obviously things around efficiency and improvement of workers within their environments, specifically in the industrial space right now, which seems to be where a lot of the opportunity sits, at least from the companies that have been coming to us, interested in the space. There's only so much that you can do to [inaudible] certain verticals within a market to adopt a technology, and a lot of them are more forward thinking than others. So we start there.
Alan: It's interesting that you say that, because some of the industries that you'd think would be the least technical -- mining, for example, which hasn't changed in 100 years -- were among the first to jump on this technology, because they can use it so quickly and so easily in manufacturing. Old-school businesses that you wouldn't think would be technologically advanced are making these leaps and bounds now; it's amazing to watch.
TJ: It is amazing, and it's an amazing cultural thing to watch, in my view. You'd think that some of these industries are so advanced -- there's so much money -- but they have old-school practices. And then you look at other ones that have been forced to innovate, change their culture, and adopt these nascent technologies, and you scratch your head and say, "Wow, that's really interesting." It kind of throws you off guard. But, you know, that's where you have to go. You have to go where the people have a sense of urgency and demand around it, and then you make it happen on that front.
Alan: Interesting. I guess what I'm trying to get at is, what are the 5G XR use cases that Verizon thinks -- or you think -- will make the best use of the new networks, of the new 5G capabilities?
TJ: We're looking at a few, with the underlying premise that you're trying to merge the physical and digital worlds together. Retail is a very big area for us, both front office and back office -- or consumer-focused and then back office. If we're looking at the consumer front for a second, we're looking at a really interesting use case that I think most people can relate to. I go into a retail store, and I'm always on my phone looking at ratings, reviews, pricing information, and other things with respect to the physical products that are on the shelf. I spend a lot more time on my phone than I actually do perusing and browsing the stuff on the shelf. So really what we want to do is merge that physical and digital divide with a pair of mixed reality glasses, so that as you're walking down that store -- using a 5G-powered headset -- you're literally taking in all the information within your field of view about a set of products and services. Now I'm standing in front of a set of consumer electronics devices, and I want to know which ones are the best rated, which one has the best value. All the stuff that's typically online, I can now have instantly overlaid on top of those products, whether it be makeup or electronics or even clothing, and then take that off the rack and go purchase it. So that's one of the retail experiences. Another has a safety component to it -- the one that you saw us demo. I have a family, and I spend an inordinate amount of time at the grocery store looking at the backs of boxes to see if they contain certain allergens. Now I can tell my glasses to filter for products that are gluten-free, [inaudible], or kosher, whatever the case might be, and instantly everything within my field of view will light up based on those requirements.
Alan: Do you think that will be driven by computer vision picking up the boxes, or driven by companies -- grocery stores like Whole Foods, for example -- submitting planograms, so it knows what store you're in and where the boxes are? Or will it be a combination of both?
TJ: I think it'll be a combination of both, because you've seen it in the QR space: you've seen stores do it themselves, and you've seen third parties do it themselves. Most of that information -- those databases -- is actually publicly available, so third parties can fairly easily construct that software. But I think it'll depend on the training and the learning of that object recognition -- a lot of that imagery is publicly available -- blended with the publicly available information for those products.
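As a toy illustration of that blend, the sketch below joins products recognized in the camera's view against product metadata (from a planogram feed or a public database) and filters them by dietary attributes. All of the product names and attributes shown are invented.

```python
# A minimal sketch of the "light up everything gluten-free in view" idea:
# labels recognized by computer vision are joined against product metadata
# and filtered by the shopper's dietary requirements. Data is invented.
DETECTED_IN_VIEW = ["oat_cereal_500g", "wheat_crackers", "corn_tortillas"]

PRODUCT_DB = {  # stand-in for a planogram feed or public product database
    "oat_cereal_500g": {"attributes": {"gluten_free", "kosher"}},
    "wheat_crackers":  {"attributes": {"kosher"}},
    "corn_tortillas":  {"attributes": {"gluten_free"}},
}

def highlight(detected: list[str], required: set[str]) -> list[str]:
    """Return the detected products that satisfy every required attribute."""
    return [
        sku for sku in detected
        if required <= PRODUCT_DB.get(sku, {}).get("attributes", set())
    ]

print(highlight(DETECTED_IN_VIEW, {"gluten_free"}))
# ['oat_cereal_500g', 'corn_tortillas']  -> these get the green AR overlay
```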
Alan: I know Google and Amazon and pretty much everybody is working on computer vision for products. I want to point my phone at a pair of shoes and say, "What are those shoes?" We've got a device in our hands right now that is pretty powerful and can do a lot of things. What are some of the things that we can do right now with our phones that you're seeing emerge as killer use cases for this technology?
TJ: That's a great question. Going back to the chokehold that existing networks place on augmented reality, there are a couple of areas where I see a big amount of potential for a mobile device, and I think a lot of that fits around markets with growing economies. You look at APAC and other areas around the world where they don't have access to certain types of [inaudible] medical facilities. One of the things that I saw that was really interesting is how you could use a mobile phone and computer vision to help diagnose patients -- using computer vision and artificial intelligence to look for signs that you wouldn't necessarily be trained to spot or have access to. One of the interesting use cases I saw was supporting a phlebotomist out in the field, where they're using a phone to detect veins so they don't mispuncture a vein in the arm. They can do it right the first time and limit the opportunity for infection.
Alan: I think that's the AccuVein system, isn't it?
TJ: Mm-hm, mm-hm.
Alan: Yeah, it's really great.
TJ: And I think that's transformative on a global basis -- any time you use technology to add intelligence and give access to underserved or underprivileged markets that just don't otherwise have that ability. So we look at that space, too. We talked about accessibility a little bit; we look very much into that space also, as a way to improve things in general. And I think we share a very similar feeling: it's about improving quality of life. It's not about introducing one more thing, one more piece of noise into the environment. How can we help each other sort through daily life, whether it's from a medical standpoint or being inundated with information? How do we make those things simple?
Alan: Absolutely. And as more and more people move into urbanized areas, there's that culture shift from living in the country to living in a city, and-- offline, we were talking about taking our kids camping and stuff. My 11-year-old daughter made this huge billboard poster and put it by the fire. It said "No cell phones by the campfire." I think we're really getting to the point where the technology is pervasive. It's everywhere we are. My kids sit on the couch and watch TV with their phones in their hands. Sometimes they've got an iPad and a phone. It's nuts. I don't even know how they focus. The other day, my daughter was watching a show, and she had the show in a small window and a game related to the show in the big window. So she had picture-in-picture, but the show wasn't the dominant part of it. And I thought that was really interesting, how youth are starting to use these technologies. A lot of work has gone into delivering people entertainment content: Netflix is using AI algorithms to recommend better movies to watch, and Amazon is using algorithms to help you purchase better. What I think we need to do is harness those technologies and give kids better ways to learn, and I think these technologies can really catapult that. What are your thoughts around VR, AR, and AI in education and training?
TJ: Yeah, we touched on this a little bit, and I think the impact is absolutely [inaudible] that space, specifically with the convergence of people, either domestically or internationally, from an education standpoint. One of the things that you might have seen is that our team produces a virtual reality platform called Operation Convergent Response. The idea behind that was to bring together a number of people with different backgrounds and skill sets into a single virtual environment -- a war room, essentially -- so that they can support a natural disaster response. So if there was an earthquake, you bring in an earthquake expert, someone who's an expert in fire, someone who's an expert in weather -- whatever the case might be -- to help quickly triage a situation, without bringing them into a physical location. That's immense for education, too. You can have highly customized education environments, where you bring in specialists in different areas, all under one umbrella. Take VR/AR: for someone interested in VR/AR, you could bring in an expert in virtual reality, an expert in augmented reality, an expert in computer vision, and coalesce them all together for students with very specific interests in that area. And with virtual reality, you can create a 100,000 square foot space in a 1,000 square foot space. You can put 75 TVs on the wall where you'd only have room for one--
Alan: Yeah, it's great.
TJ: --you can literally create the most dynamic environment that works for those students. And touching on artificial intelligence for a second, one of the most amazing things that comes out of virtual reality training -- and also in the medical space and other areas -- is the ability for analytics platforms to look at every single piece of interaction that's going on in that space. What does that yield at the end of the day? A very efficient and effective way to help you understand where you're improving and where you're falling behind. It's amazing, because in a typical education environment you've got to rely on a teacher -- or a boss, as the case may be -- to provide that feedback, and you can't do that for 30 students in a classroom. But if you've got an analytics platform that's watching each one of those individual students on a case-by-case basis, you can then produce a custom report on the areas where they should be improving going forward. And the amount of advancement we can make just because of that is, I think, absolutely massive.
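A minimal sketch of that per-student analytics idea might look like the following, where raw attention events from a training session are rolled up into a simple report. The event schema and thresholds are invented for illustration.

```python
# Sketch: aggregate raw interaction events from a VR training session into a
# per-student report. Event schema and attention thresholds are invented.
from collections import defaultdict

events = [  # (student_id, area_of_interest, seconds_of_attention)
    ("s01", "exit_route", 4.2),
    ("s01", "cash_register", 11.0),
    ("s02", "exit_route", 0.8),
    ("s02", "cash_register", 9.5),
]

REQUIRED_ATTENTION = {"exit_route": 3.0, "cash_register": 5.0}  # seconds, assumed

totals: dict[str, dict[str, float]] = defaultdict(lambda: defaultdict(float))
for student, aoi, seconds in events:
    totals[student][aoi] += seconds

for student, per_aoi in totals.items():
    gaps = [aoi for aoi, need in REQUIRED_ATTENTION.items() if per_aoi[aoi] < need]
    print(student, "needs work on:", gaps or "nothing, on track")
```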
Alan: You nailed it. By changing the way we teach and using exponential technologies, it's not even a 10x improvement -- it's a 100x improvement, a 1,000x improvement, because we're not even scratching the surface of things like how people learn or when they learn. Maybe I learn better from 10 AM to 11 AM -- that's my maximum capacity, so maybe all the hard stuff I learn then, and maybe somebody else learns best in the evening. Maybe by using galvanic response or measuring your heart rate or your biometrics, you can deliver content that is highly personalized, highly contextualized, and delivered at the time of maximum absorption or retention. We're already seeing across-the-board 20 to 100 percent improvements in retention rates in training. One of the things that you guys did was use VR for hostage and robbery training. How did that come about?
TJ: It was an interesting one. It was actually during our early commercial stages; we were looking at different verticals and areas where we could use the technology to improve awareness of things that happen in our retail environments. It wasn't specifically around hostages -- more like store robberies, which don't happen often. But obviously, when they do, how do you train someone in a traditional environment to handle a robbery situation, especially with someone being held at gunpoint? So we used 360 video and actors, and we pulled together a very immersive experience around both an armed robbery and identifying risks for theft, by immersing reps in a 360 world. We monitored different biometric signals, and we also used an analytics platform that allowed us to do gaze tracking and other things, so you saw exactly where they were looking, what they were interested in, whether they were looking in the right spots or the wrong spots. We were able to produce a relatively detailed report based on that, and what that allowed us to do was understand, as much as you can in a replicated environment, what the normal response is for a retail rep when these situations occur. And then it allowed us to tailor a training program around that, which helped better prepare them for the situation. So we used it as a way to gather analytics, which yielded a ton of information for us that we could never capture otherwise -- even if you did this in a real-life scenario, brought in some actors and a rep and put them in it, you wouldn't be able to understand where they were looking, how they were acting, how they were responding, without this type of analytics in a virtual environment. Because of that, we felt we were able to compose a training module that was much more advanced than what was in the market.
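The "were they looking in the right spots?" check can be illustrated with a bit of vector math: compare the headset's gaze ray against the direction to an area of interest and test the angle between them. The positions and the angular threshold below are made up for illustration, not taken from Verizon's platform.

```python
# Simplified gaze check: is the gaze ray pointed within some angular
# threshold of a target (area of interest)? Positions/threshold are invented.
import math

def looking_at(gaze_origin, gaze_dir, target_pos, max_angle_deg=10.0) -> bool:
    """True if the angle between the gaze ray and the direction to the target
    is under the threshold."""
    to_target = [t - o for t, o in zip(target_pos, gaze_origin)]
    dot = sum(g * t for g, t in zip(gaze_dir, to_target))
    norm = math.dist(gaze_dir, (0, 0, 0)) * math.dist(to_target, (0, 0, 0))
    if norm == 0:
        return False
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= max_angle_deg

# Example: trainee at the register, checking whether they glanced at the exit.
print(looking_at(gaze_origin=(0, 1.6, 0), gaze_dir=(0, 0, 1), target_pos=(0.2, 1.5, 4.0)))
```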
Alan: How do you measure the success of that? What are the KPIs that you guys were looking for?
TJ: That's a great question. Unfortunately, we provided a lot of the technical support and background for that, while our learning and development team managed the KPIs. But I think one of the key KPIs we used was sort of a post-mortem experience, where we placed them back into that experience after their training to see where there were improvements -- specifically using the gaze data and the biometric data to see how they responded in that situation. I don't know what the specific performance improvements were, but I know they were significant relative to the baseline for typical training. Unfortunately, I don't have that data, but the methodology was to re-immerse them in that situation with the tools they now have -- again, case by case -- and see where they were looking, how they were acting, how they were responding, and so on.
Alan: I want to shift away from training, because there's so much to unpack here about the long term of this technology. 5G is really going to enable a lot, and one of the things we have to overcome is the challenges. What are some of the challenges that you see -- or Verizon as a company sees -- standing in the way of the broad adoption of 5G-powered XR, beyond cheaper headsets and 5G coverage? What do you see as the broader challenges around adoption?
TJ: I think adoption depends on establishing a community that understands the technology, in order to build on it. And that is the other half of our 5G Labs. I own the development side; we also have multiple 5G Labs spread across the country which provide developers and enterprises access to this technology so they can understand it. So education is hugely critical. We are very immersed in the technical side of things -- we've won those Edge awards, very technical work -- and the work we do, while deeply technical, has to be translated to a common level. I think our biggest hurdle and challenge is making that consumable to the end user, and we are solving that through our 5G Labs. We are providing developers with training and access to 5G networks, and providing enterprises with a full view into the capability. So as much as we're having this conversation here: any CTO or CIO or even CEO listening, visit our 5G Labs. You'll get a full view of what we're doing and how it can apply to your business. There are so many companies across so many verticals, and it really helps them understand -- and inspires -- what they could do with the technology.
Alan: It's interesting that you guys created these 5G Labs with the purpose of showing what the technology can do. One of the reasons we started XR Ignite was the same thing. We kept seeing all these amazing startups coming up with crazy, amazing ideas on how to use XR and AI, but they were missing the business acumen to take those technologies and bring them to a commercialized state. We formed XR Ignite to help bridge that gap between corporations and startups, and it sounds like you guys are doing the same. Sounds like there's an interesting fit there. We'll talk offline and see if we can collaborate.
TJ: Yeah, absolutely. I think there's great opportunity for anyone looking at and interested in it.
Alan: So what is the most important thing that businesses can start to do now to leverage the power of XR, and AI, and 5G? What can they start doing immediately to reap the benefits?
TJ: Yeah, I think what any company should do -- just like we do here -- is assign someone from the strategy side of the business, or bring someone in, to do their own objective (and to some extent subjective) collection of information. There's so much data out there, so much information out there. The stuff that I'm talking about is very topical, in the sense that this is stuff you and I have access to -- and we're very immersed in this space, so obviously we know it relatively well -- but anyone in any space could find the same information. I work at a very deep technical level, which obviously doesn't make much sense to businesses until we translate it into a business function at the end of the day. But one or two people in an organization can compile a really compelling amount of the data and information that exists out there, to help transform businesses with this technology. So I think it's just that first step that you need to take: AR, VR -- what is this about? What does this mean to my business? And I think they'll be surprised.
Alan: We actually started MetaVRse as a consulting firm to help businesses understand how to use this technology, because our long-term vision was always education. And so we've kind of morphed into consulting on how to use this technology specifically for education, training, learning modules, and stuff like that. But with XR Ignite, it's pretty broad. There are so many companies out there building such great tech, and everybody is chomping at the bit to start leveraging 5G. I had a meeting a couple of weeks ago and got to see the new Samsung Galaxy S10 5G phone. It was the first time I'd ever held a 5G phone in my hand, and it kind of felt like this new renaissance of technology is coming -- and it's going to come as a tidal wave, because people have been working on it for years. Verizon's probably been working on this for a decade. So it's coming, and it's going to come really fast for people. What can consumers do to prepare for that? Having a 5G phone is great -- it's faster, whatever -- but really, what can consumers expect from it?
TJ: If you look at where we were with 4G -- the scale and ubiquity of 4G -- we're in that same phase now, and not just in scale, but also in the advancement of the technology. Just to preface this: one of the things that's interesting is that Apple didn't release a 4G iPhone until about a year and a half after the network launched. I think part of that is because there's a ramp-up to this technology; as it gets exposed and becomes more accessible to developers, they can start building experiences on top of it that really validate the value proposition of 5G. As it stands right now, the biggest value proposition of 5G is simply the ability to download an immense amount of content in a very short amount of time. I want to download an entire season of The Sopranos through HBO Go, or whatever it might be, right before I hop on a plane -- a split-second decision. I don't have to wait 10, 15, 20 minutes to do that. I can literally do it in 30 seconds or less.
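The arithmetic behind "an entire season in 30 seconds or less" is straightforward; the season size and link speeds below are illustrative assumptions, not measurements.

```python
# Rough download-time arithmetic. Season size and link speeds are assumed
# values for illustration only.
season_gb = 8.0                      # assumed HD season size in gigabytes
links_gbps = {"typical 4G LTE": 0.05, "good 5G": 1.0, "mmWave 5G peak": 2.0}

for name, gbps in links_gbps.items():
    seconds = season_gb * 8 / gbps   # GB -> gigabits, divided by gigabits/sec
    print(f"{name:>16}: {seconds:,.0f} s")
# typical 4G LTE: 1,280 s (~21 min);  good 5G: 64 s;  mmWave 5G peak: 32 s
```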
Alan: That's insane.
TJ: It really makes on demand "on demand," right? It used to be that you had to plan some of this stuff. One of the great things -- I've got a music service, and unfortunately where I am, down at the shore on the beach, I don't have access to that network, because I'm pretty remote. So I want to be able to download an entire '90s alternative collection of 10,000 songs in... a minute? Less than that?
Alan: OK, I've got to just... what the hell, man? I was a DJ for 20 years, and I remember going to the record store to buy a record. Then I had a collection of books of CDs, and I remember when my daughter was about 10 and I gave her my CD collection, she looked at me and said, "What do I need that for? It all fits in my iPod." And now we're talking about downloading an entire genre of music in seconds.
TJ: It's quite crazy. And again, that's just scratching the surface of 5G. Beyond that, there are a lot of other things that we foresee with the implementation of the edge network that we talked about, and with providing access to developers. One of the things I did want to touch on, from our last piece a minute ago, is that the XR space is still nascent enough that if you're a business, there may not be -- and likely isn't -- an off-the-shelf solution for your business today. So I'd encourage businesses to go out there, understand the technology, and build their own solutions, because half of our practice, going back to our commercial period, was a professional services organization, and we were all ears. Companies would come to us with challenges, we would listen, and then we would solve those problems for those businesses. One of the benefits of being Verizon -- as massive a company as we are -- is that we've got so much diversity: retail, network operations, sales. We can hear these challenges from all of these organizations and build solutions that no developer would ever really think to build. So I'd encourage businesses to go out there and not just look for answers to their problems, but work with folks who can solve those problems.
Alan: That was our entire business model: create some solutions, sit and listen to customers, figure out what they wanted, build it for them, and then develop those into products. Literally, that was our entire business model -- just listen! And that would create products, which it has.
TJ: It's not like SAP, right? With SAP, I know I need some sort of platform, I go to SAP, I get out-of-the-box software and just customize it, and that's the answer for my business. That's not the state of XR, and if you can jump on that as a business, I think you'll have a major upper hand, like we do here. I'll give you one example. We have a team that's responsible for going out and working with communities, to help them understand where we're going to be putting 5G cell sites. One of the things they do is hire a number of graphic artists to go out and draw what a cell site would look like in a community -- they do it in Photoshop and other tools -- and then they bring it to the community board meetings. San Francisco would be a case: hey, these are the sites that we want to build, and this is what the tower is going to look like on this building or on this light pole, whatever the case might be. And it cost them, on average, just in this one section of the US, a $5 million a year contract for graphic artists. So what they asked us to do was solve this problem with augmented reality. Can we take the poles that they would draw, import them as 3D models, drop them into the environment, capture a photo with the camera, and submit that to the town planning meeting? We said, absolutely. The Verizon rep goes out there, they go to a corner, they pick the pole that they want, they change the color of the pole to match the environment, they set it right on the plane of the ground, and then they capture a picture and upload it for the board planning meeting. And literally, that application probably took a month to develop and is saving our business $5 million a year just in this one area. That was not something we'd ever imagined building, but we took that cost out of the business.
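The placement step TJ describes -- setting the pole model on the plane of the ground -- boils down to a ray-plane intersection, which a production app would get from its AR framework's hit-test API. The sketch below is a simplified stand-in with invented numbers, not Verizon's implementation.

```python
# Simplified illustration of placing a model on a detected ground plane:
# intersect a camera ray with the horizontal plane y = ground_y.
def place_on_ground(cam_pos, ray_dir, ground_y=0.0):
    """Return the world-space point where the ray hits the plane y = ground_y."""
    ox, oy, oz = cam_pos
    dx, dy, dz = ray_dir
    if abs(dy) < 1e-6:
        return None                      # ray parallel to the ground, no hit
    t = (ground_y - oy) / dy
    if t <= 0:
        return None                      # plane is behind the camera
    return (ox + t * dx, oy + t * dy, oz + t * dz)

# Camera held at 1.5 m, aimed slightly downward at the street corner.
print(place_on_ground(cam_pos=(0.0, 1.5, 0.0), ray_dir=(0.0, -0.3, 1.0)))
# -> (0.0, 0.0, 5.0): anchor the pole model here, then capture the composite photo.
```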
Alan: Amazing. Some of the cost savings that I've heard on different interviews on this podcast have been insane.
TJ: It's an insane amount of money, for relatively simple implementation of the technology.
Alan: So with that, last question; what problem in the world do you want to see solved using XR technologies?
TJ: I want to see people's lives become simpler. Everything that we build, from a technology standpoint, is very complex in nature, but I think the fundamental idea is to make people's lives simpler and safer. I think we've come to a point where technology inundates us so much that there are companies out there now creating artificial digital clones of ourselves to manage the information overload we have coming in -- we provide them with a set of rules and responsibilities that they can take on our behalf. And I don't think that should be necessary, if we build in a way that filters out the noise and makes everything [inaudible]. And I think that's blending the digital and physical. I don't want to be constantly distracted by my cell phone when I'm camping, because there's something interesting that I just saw out in the wild -- was that a deer? Whatever it might be. With glasses, I think that information becomes so passive that we no longer have to be distracted by our device, and we can tune things in and out as needed. My view at the end of the day is that we strive for simplicity in our lives, and I think that's where we want to go with this technology.
Alan: Oh, I love that. Striving for simplicity in our lives, using the most advanced technologies in the world. I don't know a better way to wrap that up. Thank you so much, and thank you, everyone, for listening. If you want to learn more about the work that TJ and his team are doing at Verizon, you can visit verizon.com, envrmnt.com, or verizon5glabs.com. I think those are the three places where everybody can get as much information as they want. Thank you again, TJ. This has been amazing.
TJ: Thanks, Alan. I look forward to talking to you soon.