IKEA might be best known for its affordable furniture, cartoon building instructions, and hard-to-pronounce product names, but that's not all it's about. They're also exploring how they can improve lives with XR technology, as Martin Enthed explains.
Alan: Hey everyone, I'm Alan Smithson. Today we're speaking with Martin Enthed, digital manager at IKEA Communications, who's also part of the IKEA Digital Lab, looking five to seven years out into the future of how we bring retail to the masses. Martin is also part of the Khronos Group, an organization working on the open standards for the spatial web and 3D worlds. All that and more, coming up next on the XR for Business podcast. Martin, welcome to the show.
Martin: Thank you, Alan. And thank you for having me here.
Alan: It's my absolute pleasure. This interview has been long overdue. You've had to get a ton of approvals and everything. So we're very, very lucky and honored to have you on the show. And thank you so much for joining us. Let's get into it. Maybe you can describe how you got to where you are, and the role that you're at with IKEA.
Martin: I started 13 years ago now, in 2007, and I was hired to try to bring computer graphics into volume production -- instead of just doing a few models or images a year, doing large volumes. I was building those back-end tools, coding, setting up standards and everything, up to 2011. And then they hired me to do all development for that company, IKEA Communications. I was the IT manager and development manager for that all the way up until two years ago, when I became digital manager there. Then I headed up what's called IKEA Digital Lab, that you mentioned. Now I'm working mainly with that, looking into the spatially aware 3D future.
Alan: So how is IKEA using these tools now? Because I think it's a big shock when you explained to me how, in the magazine that we get, some of the photos aren't real photographs -- they're renders.
Martin: That story has been told a few times. But to keep it short, it really started reaching volume around 2012. So it took from 2007 to 2012. In 2011-2012 we did about 10 to 12,000 high-res images a year, and I would say maybe 1,500 of them were 3D. In the last five, six years we have been doing about 50,000 high-res images a year, and about 35,000 of them are 3D -- mainly the product images and those things you find there. And then, of course, a lot of kitchen brochures and such are 3D. If you take a kitchen brochure from our stores and look through it, you will see a lot of 3D. If you take the IKEA catalog, then it's much, much less, because most of the time we also do video sessions in those, and that's so much easier to do in a real set. But it's a lot of 3D. That's the offline rendering stuff that's in huge production right now.
Alan: So that's kind of pervasive now. So when you're looking at the kitchen catalogue, most of those renders are in 3D. It's funny, because Helen Papagiannis -- author of Augmented Human -- has this game, "Augmented Reality Or Real?" And I've gone through the magazine, and I can't tell. I really can't tell what's real and what's 3D. So kudos to you guys for making it realistic. So we render something offline and we get the best quality of everything. What about real-time rendering? I know a couple of years ago you guys experimented with VR and also the IKEA Place app, and real-time rendering of spatial objects. What's on the roadmap there?
Martin: The exploration internally started already in 2010, when we made some things that were running in a browser: we sent off a small file that told how the scene was set up, rendered it with an offline renderer, and sent it back again in 10 seconds. And it worked, it worked nicely. The problem is that we have so many customers on our webpage, so we can't really have a high-end computer waiting for a user like that. So that didn't really fly. Then in 2012, the IKEA catalog app had AR in it. So we used the catalog as a marker. I think we were one of the first big companies in the world using marker-based AR. It was only five pages total, in the first one. And then more was added the next year, and the next year, and the next year. So all the way up to -- we did the AR stuff with ARKit in 2017, if I don't say wrong -- all the way up to there it was marker-based. And then the Apple guys came with the brilliant ARKit, and it worked beautifully. And we could switch, so we didn't need the marker. And then we also made the standalone app, IKEA Place, I think half a year later or something like that. Then came the same in the Android version, with ARCore. That's how we've done AR so far, when it comes to those things. And then just recently now you might have read that we have made a new thing with the new Apple iPad, with the LiDAR stuff, adding functionality that wasn't possible before.
Alan: I haven't read about that.
Martin: Okay. [chuckles]
Alan: We just ordered one of the new iPad Pros. For those of you who are new to this, the iPad Pro 2020 edition comes with a LiDAR scanner, which allows you to point-cloud map your world and embed 3D objects in a very contextualized way.
Martin: Yep.
Alan: So how are you guys using that?
Martin: The way you are saying, actually. So I can't say much, as it is not that public yet. Not many people have got their hands on it yet, but as soon as you do, then try it. It's at least as promising as the Tango was back in the day. But now it's on a more powerful device and it can do more. So it looks really promising. We've been waiting for something like this for a long time.
Alan: So something like this -- just to give people an understanding of how they could use it -- if the iPad scans a room and sees that you have a desk, you could put a lamp on that desk. And even if the desk had multiple levels, it would understand, "Oh, I can put something on different levels" and that sort of thing. Are you able to then cast shadows onto stuff like that? How far are you guys taking this?
Martin: I will say it depends on how detailed we can make the dimensions. We haven't tried that yet. But it would be possible to do occlusion, that's at least theoretically possible.
Alan: Super exciting. I keep seeing these videos pop up of people doing occlusion in their living rooms and stuff like-- man! Our iPad can't come fast enough.
Martin: [chuckles]
Alan: You are a part of the Digital Lab looking out into the future. So you're also part of the Khronos Group, which is an organization working on the open standards of the 3D Web.
Martin: Yeah.
Alan: Maybe you can talk about the vision of the future -- you've been doing this since 2007; you've seen 3D morph into a tool that's accessible to everybody. If you'd asked me five years ago, "Will 3D be on everybody's phone?" I probably would have said, "Not really. Here's a Tango phone and good luck." But every phone in the world now has the ability to show 3D, and now AR. You must be looking at head-worn devices as well.
Martin: Yes, definitely. The thing is that we truly believe that interaction with our customers will increasingly be spatial or 3D-based. That's also, in a way, needed for us to reach the many people that we want to reach. That's the whole idea now: trying to increase the number of people we are reaching from, I would say, a little bit under a billion, to maybe three times that in a few years. And it's hard to do that with physical spaces only. So that's one of the things that we need to do. And if you want to interact with us from your home and understand the things that you are able to buy, then it has to be spatial in some way.
Alan: So what you're saying is you want to bring the Billy bookcase to everyone?
Martin: Yeah. [laughs]
Alan: [laughs]
Martin: I think we almost have done that. It's selling a lot. [laughs] The thing is that IKEA's vision is actually to create a better everyday life for the many people. The vision doesn't say anything about furniture. We do it by making well-designed furniture for people with thin wallets. That's the whole idea. But in the vision, there's nothing about furniture. It's a lot about the many people. So going back to 3D and computer graphics: to be able to do this, we have to handle a lot of 3D data. And moving 3D data between different applications has always been hard. Back in 2007 and -12 and -14 and -15, and it's still hard, because nobody has really cared about the problem. That's my point of view, at least.
Alan: I would have to second that. I actually have a slide in one of our decks, and it says "3D is a pain in the assets."
Martin: [laughs] Yeah, yeah. You can move the polygons okay. You can get UV coordinates over. But anything that has to do with-- yeah, maybe the diffuse texture you can get over. But anything that has to do with how the surface reacts to light -- whatever you want to call that, a shader, or a material, or a BRDF, whatever your vocabulary is -- that has never been standardized. And I've been asking for it all the way since we talked the first time at SIGGRAPH in 2013. And I think two years ago something happened, and people were starting to talk about standard surface. The MDL guys from Nvidia started talking about things. Dassault came with the Enterprise PBR. And people are talking about it now and there are movements, but it's moving a little bit slow.
Alan: And what's Enterprise PBR?
Martin: Standard Surface, MDL, and Enterprise PBR -- if I say it right now -- are different ways of describing, in a generic way, a material definition. Standard Surface and Enterprise PBR -- both of them -- are, in my opinion, what I call core über shaders. Each is one way of describing a shader for a material. MDL, Nvidia's thing, is a little bit more complicated than that. But those are at least candidates for being generic ways of describing materials. Standard Surface is part of MaterialX, which you might have heard of.
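To make "a generic way of describing a material" concrete, here is a minimal sketch of what an engine-neutral material description looks like -- in the same spirit as the über-shader models Martin mentions, though using glTF 2.0's pbrMetallicRoughness model rather than Standard Surface or Enterprise PBR. The name and values are made up for illustration.

```typescript
// A minimal, illustrative glTF 2.0 material using the core
// pbrMetallicRoughness model. The name and numbers are made up;
// a real asset would usually also reference texture maps for
// base color, roughness, normals, and so on.
const material = {
  name: "painted_birch",
  pbrMetallicRoughness: {
    baseColorFactor: [0.85, 0.82, 0.76, 1.0], // linear RGBA
    metallicFactor: 0.0,                      // dielectric (non-metal) surface
    roughnessFactor: 0.65,                    // fairly matte highlight
  },
};

console.log(JSON.stringify(material, null, 2));
```

Because that handful of parameters has an agreed-upon meaning, the same description should look recognizably the same in any conforming viewer or engine -- which is exactly the portability problem the über-shader efforts are trying to solve.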
Alan: So take somebody who -- let's say, for example -- is a local furniture store. They want to create a 3D version of their catalog. The problems that I see are: one, where do you get the 3D models made? Two, how do you make them? Three, how do you store them? Four, how do you make changes to them? Five, what are the standard formats, and are they going to work on all--
Martin: Yeah, and that Standard Surface, and MDL, and Enterprise PBR, and all that is a little bit over your head. You won't get any of that.
Alan: Exactly. So how do we get to a point where I can -- as a marketing manager of a company -- start to use 3D as a daily tool for me, without having to take 10 years of education to learn what a file format is, and how to apply a mesh onto-- This is-- it's crazy, and it shouldn't be. That would be like saying to you, "Hey, Martin, I'm going to send you a picture by a JPEG, and you got to put the RGB colors back into it."
Martin: Yep. They come in a sidecar. That's how it feels.
Alan: [laughs] What is--? Who thought of that? This is madness.
Martin: Yeah. That's one of the reasons -- if we go back to Khronos -- why we went in, a few years ago now, to be part of Khronos. And actually, since a month or so ago, I'm part of the Khronos board. It's because of this, because they had a format in there called glTF, in the 3D Formats Working Group. And glTF is an open standard: a fairly well-defined, physically based format that you can use for real-time graphics on your website, or on a phone if it's an Android phone. That's one way of doing it. And the equivalent on an iOS phone is Apple's version of USD, which Pixar defined. So you have two formats, actually, that work on phones.
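As a concrete illustration of the glTF path on the web, here's a minimal sketch of loading a glTF product model into a browser scene with three.js and its GLTFLoader. The model path and scene setup are hypothetical placeholders, not anything IKEA-specific.

```typescript
// Minimal sketch: loading a glTF 2.0 model into a web scene with three.js.
// "models/bookcase.glb" is a placeholder path; any glTF/GLB asset works the same way.
import * as THREE from "three";
import { GLTFLoader } from "three/examples/jsm/loaders/GLTFLoader.js";

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(
  50, window.innerWidth / window.innerHeight, 0.1, 100);
camera.position.set(0, 1.2, 2.5);

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// Simple lighting so the PBR materials in the glTF have something to react to.
scene.add(new THREE.HemisphereLight(0xffffff, 0x444444, 1.0));

new GLTFLoader().load("models/bookcase.glb", (gltf) => {
  scene.add(gltf.scene); // the file carries geometry, PBR materials, and textures
  renderer.setAnimationLoop(() => renderer.render(scene, camera));
});
```

On iOS, the rough web equivalent is AR Quick Look: a link to a .usdz file (an anchor tag with rel="ar") opens the model in Apple's built-in viewer instead.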
Alan: So what happened to the Open Web XR standard, then? If now we have two formats, that's not really standard.
Martin: glTF is standard, and USD is standard.
Alan: Not if you need a different format on an iPhone.
Martin: But that you have to take up with the people who make the phones. [laughs]
Alan: Got it. So basically, what happens -- and I want to make this clear, because this is a problem in our world -- is that everybody moves towards the standard, so that everybody can say, "Okay, glTF's the standard, we're all gonna wrap around this." And then Apple comes along and says, "No, here's a new format."
Martin: But it's better with two than with the 14 or so that I had four years ago.
Alan: True.
Martin: So it's a movement in the right direction.
Alan: And I've seen some USDZ converters come out, so that you can convert between glTF and USDZ and so on. So there are workarounds. But truly, that's still like VHS and Beta.
Martin: Yeah, I agree.
Alan: Yeah. OK. I want to make sure that I'm clear on this, because I'm still learning, and this stuff is evolving daily. You're on the cutting edge of this. By joining the Khronos Group board now, is this-- did Apple walk away from this or did they just--?
Martin: I think Apple is still part of the board in Khronos. So they are part of that. Whether they want to collaborate on this or not, that's really up to them. But glTF and the version of USDZ that Apple is using are fairly equivalent.
Alan: And actually -- we did some research -- there are about 140 3D model formats out there. There are, like you said, 10 to 14 that people actually use. But there are a lot of 3D formats. I don't know how many years it took for the JPEG to become the gold standard of images. But even if you look at that, you've got PNGs and JPEGs, and there were 100 different formats. Then one just became the easy one to send.
Martin: And then it took a few years or so for a JPEG to look the same in different browsers or different phones or anything -- just making the color come up correctly. So it takes time. And the nice thing about Khronos -- now I'm in there, so of course I'm a little bit partial in this -- is that it's a group where anyone can join and everybody has an equal vote. That doesn't mean, though, that it moves fast. I think it moves fast now -- at least the 3D formats -- but at least everybody has an equal vote and everybody has a say. And I think that's good. And it's not only this; it's vision processing, it's neural networks, it's parallel computing, it's virtual reality and 3D graphics. So it's many of the things that I'm super interested in, in the same organization.
Alan: It's funny you said that. I was just going to ask you, which of those things excites you the most?
Martin: All of it together. That's what makes the magic. So 3D models, of course, are the Lego pieces that we build everything on. But the neural network stuff and everything is needed to be able to do things automatically. That's also why I'm looking for those kinds of people right now, to add to the lab.
Alan: So let's put our futurist hats on. Let's look out ten years. I want to buy some furniture for my house. What does that look like? Or if I want to interact with IKEA in general? Maybe it's not even furniture at this point. What does it look like when I, as a consumer, interact with the IKEA brand?
Martin: It's very hard to look 10 years out, but there will be some glasses there, from somebody. I don't know who will make the first ones -- the ones that we then look back on and say, "That was where it all started," when it comes to glasses with stereo and AR and VR in the same device, of course. And if that's there, then there could be so many things happening. And then how people are using them will be the hard thing. Is this something you put on in the morning, take off in the evening, and never want to have off? Like people today can't even move without a smartphone. Ten years ago, people were actually using phones to be phones, for talking. [chuckles] Now--
Alan: My phone actually is the worst phone. It does everything but.
Martin: Yeah, so for the people who are now raised with smartphones, I wonder if they think of them as phones you mainly talk on, or if it's a texting tool, or a blogging tool, or whatever it is. It's something else.
Alan: I have two kids, 11 and 15, and they actually use their phones very, very differently. We've kind of given them carte blanche access to a phone. My 15-year-old uses Snapchat to communicate with her friends, and they send little tiny pictures, maybe a picture of the corner of their head. They're really kind of disposable photographs, I would assume, because they're really not flattering, in my opinion. But that's how they communicate: they make a funny face and they communicate through photos, rather than typing. And then my other daughter has been using this app called House Party, where they have a bunch of people and it's kind of like a collaboration thing, and they all get together and they talk. So watching even kids that are just a few years apart interact with devices -- it's very interesting to see.
Martin: Now, one thing that I actually think will happen is that-- if we want to do really augmented things in the future, we have to start looking at other things than just the visual. And that's also why I'm in the lab. We are looking at sound. We are looking at simulation of sound, and how sound changes when things are added or taken away, and all those things. We are looking at the compassionate side of AR. Let's say that you and me would like to build a space together. We have a workplace together. And I would like you to see how this workplace looks from my point of view. Let's say I happen to be a foot taller than you, or I happen to be colorblind. How would that space look from my point of view? You can feel and see that -- that's the compassion thing.
Alan: Ooh, I like that. See through another's eyes.
Martin: Yeah. And then, of course, sharing it, so you can be more than one person in the same space. AR is mainly single-user today. And I think that's a shame, because you have to look over someone's shoulder at the same device. Why should you? Being able to share the same space -- I think we made it public a few years ago, or three years ago. We did this with the HoloLens in 2017, I think. We made multi-user interaction with the first HoloLens. And it works great. It's so much fun to be able to do it together, instead of doing it single-user and saving and sending it over and all that.
Alan: Have you seen-- there's a company called Croquet -- that I was just introduced to -- that does serverless, multiplayer, real-time collaborative web AR.
Martin: Yeah, there are a lot of solutions doing this, and the base of it is making some kind of anchor. I think Microsoft is calling them "world anchors" and I think Google has called them "VPS points" -- everybody has a solution for it today, or is building one. And there are a lot of different companies who have solutions and claim that they can do this, but there is no open solution for it, so that I can just register an AR experience -- if you will -- anywhere in the world, and you can go there and see it. That's not really there, and I don't know if it ever will be, but I think it will come.
Alan: Oh, I think so. I've interviewed enough people on this podcast working on it. So I'm assuming that will be a thing where you can anchor it.
Martin: Yeah. You have to have an open standard for anchoring. [chuckles] And for where you store them. That's the problem. Having a proprietary version is no problem at all. But then you will have 25 of them. [chuckles]
Alan: Yeah. This is the problem. It's interesting. We're building a guide right now to XR collaboration tools. How many XR collaboration tools do you think there are?
Martin: Counting low, 60.
Alan: There's 85 that we found.
Martin: I've seen 10 or 12, but it's normally a lot more than you see. So I would guess so.
Alan: So we're at 85. And the ones that are actually in operation, there's probably about 60. So you're bang on there.
Martin: And I would guess that more than half of them -- or even 80 percent of them -- are based on one or two of the game engines that are out there.
Alan: Actually, I would say all of them are based on Unity or Unreal.
Martin: That's what I've found, too.
Alan: Speaking of game engines, what other work is being done in the real time game rendering or web based rendering? Because right now you've got IKEA, the app. But when do you anticipate this will move into web? Does it have to happen? Does it matter?
Martin: We already have -- call them editors, or configurators, or whatever you want to call them -- on the web right now. You can plan your own kitchen and everything. And we have had that all the way since 2002, in different versions. So we have that kind of real-time graphics, and have had it for a very long time. Those are already web-based.
Alan: Is that a proprietary engine, or what is that running on?
Martin: I think the ones that we have right now are-- I think the kitchen planner is a bought system, from somebody I don't remember the name of now. And then some of the configurators are things that we have built ourselves internally, based on some JavaScript engine. I guess we do as most other companies do. We try to build our own when it's needed, building on top of things.
Alan: So looking into the future, we'll have some sort of glasses. Hopefully there will be multi-user experiences, because that really is exciting. And I love the idea of being able to see a space through someone else's eyes. I love that idea. You really have been in this thing from the beginning. And IKEA is more than just a furniture store, like you mentioned earlier. What are some of the environmental initiatives? Because I think this is really important for people to understand: IKEA's doing a lot more than just making furniture and selling it.
Martin: I'm not working in this area, so you're putting me a little bit on the spot here. But fine. [chuckles] I think I know some of it -- at least the public stuff. I think it's in Japan now that we are trying to rent out furniture, so not selling it directly. We are, of course, looking into how to build furniture out of recycled material; we've done that for a very long time. And, well, we should be -- I think we're almost all in place to be -- fully self-sufficient with electricity for everything we do, from windmills and solar cells. And we are also selling solar panels for roofs. I think it started in the UK -- I'm not sure exactly -- and I think it's in Sweden and so on now, expanding slowly into different countries.
Alan: That is absolutely incredible -- the fact that a company so large has put a mission out there to make people's lives better. And looking at that holistically, I think, is the key to how we move forward, especially in these crazy times that we're experiencing now. I think a lot more consumer engagement is going to move online. Obviously, it's already moved. None of us can leave our houses right now. People still need furniture. They still need decorations. They still need to live their lives. So how do you feel this whole thing has contributed to the speed at which people are adopting online purchasing?
Martin: I think that in the present time, everybody is learning to become digital, even if you haven't wanted to before or haven't needed to before. So both in the workplace and privately, people are trying out buying groceries online, and they do everything online now. And IKEA is present in almost all countries in the world. That means that we have customers and co-workers and subcontractors and so on all over the world, affected by what's going on right now. I think this, in a way, is changing a lot of things. Let's see, when we come out on the other side, where that has taken us. Normally, crises pull us all together and bring out the best in us, hopefully.
Alan: It seems like -- especially in this industry -- when this happened, everybody rallied together to say, what can we do to help? And I want to just give a shout-out to everybody who's not just sitting on the couch watching Netflix, although there's a lot of that as well.
Martin: [chuckles]
Alan: But for everybody out there who's putting the effort in, getting up every day, getting dressed -- I know it's hard when you don't have to go anywhere -- and working to make this world a better place: we will get through this as a community. And I want to mention one more thing. Are you guys doing anything on smell? Because, man, wouldn't it be great to smell those meatballs?
Martin: [chuckles] To be honest, yes. Yes, we do. We have the tactile things, and in that, we have actually pushed in touch and smell and taste. Maybe not taste, but at least smell. We actually sell a lot of products on smell, especially scented candles. But there are other things, too, that actually sell on smell. That's really far out, but if we could have that in one way or another, that would be great. Touch is very hard, too.
Alan: I've tried a bunch of haptic gloves so far. Have you had any really good experiences with haptic gloves?
Martin: Mmm, nope. [laughs] The thing is that grabbing something, maybe you could do. But that's not really what people are after when it comes to us. They want to feel the surface, how it flows over your fingers. That means that you have to stimulate the sensors that you have underneath the skin, because that's actually where the feeling is. And you have to reproduce those vibrations in the right way. You can actually record the vibrations correctly, but you can't play them back. It's like having a fully fledged HDR camera on one end -- because you can capture what a human finger feels -- but then playing it back with an old-time fax machine. That's how it feels right now, when it comes to how a surface feels. Not grabbing things -- that you can, in a way, do. But that's not really what I'm after.
Alan: No, I totally get it. If we're looking 10 years out -- if you want to get crazy -- brain-computer interfaces are kind of the next phase of this. You should be able to encode what the feeling in your fingertips is like. But that's-- phew, we're getting into some crazy territory then.
Martin: The human brain is actually super interesting. The human brain is almost like a 3D reconstruction engine. It builds a three-dimensional world for you, based on so little data, and builds that up while you are looking around and doing things. You're sitting in front of a keyboard right now?
Alan: Yes.
Martin: Okay. If you look at the H key in the middle there, it's the H key and half the G key and half the J key -- that's what you see at arm's length in full resolution, 60 pixels per degree, retina resolution. That's the only thing you see. As soon as you go away from the H key out to the J and K keys, you are down to a tenth of that resolution. And if you go further out, you actually don't see color in a good way. So everything else is something made up by your brain, because you've seen it before and it fills in the data. It's just that center spot that we have to figure out how to give a new image to, following your eye movements or something, in high resolution. The rest doesn't really need high resolution. If you can solve that, then we can do almost anything in AR. The problem now is that we don't know where you're looking. [chuckles] So we have to have high-res everywhere.
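To put rough numbers on that, here's a back-of-the-envelope sketch of the pixel budget with and without foveation. The field of view, fovea size, and pixel densities below are illustrative assumptions, not measurements of any real display or of the eye.

```typescript
// Back-of-the-envelope pixel budget: uniform "retina everywhere" rendering
// versus a small high-res fovea with a low-res periphery.
// All numbers below are illustrative assumptions.
const fovDeg = 100;        // assumed horizontal and vertical field of view, degrees
const fovealPpd = 60;      // "retina" density in the center, pixels per degree
const peripheralPpd = 10;  // assumed acceptable density outside the fovea
const foveaDeg = 5;        // assumed size of the high-res window, degrees

// Render everything at full density:
const uniformPixels = (fovDeg * fovealPpd) ** 2;

// Render a small inset at full density and the rest at low density:
const foveatedPixels =
  (foveaDeg * fovealPpd) ** 2 +
  ((fovDeg * peripheralPpd) ** 2 - (foveaDeg * peripheralPpd) ** 2);

console.log(`uniform:  ~${(uniformPixels / 1e6).toFixed(1)} Mpx per eye per frame`);    // ~36.0
console.log(`foveated: ~${(foveatedPixels / 1e6).toFixed(1)} Mpx per eye per frame`);   // ~1.1
console.log(`savings:  ~${(uniformPixels / foveatedPixels).toFixed(0)}x fewer pixels`); // ~33x
```

The exact numbers matter less than the shape of the result: once the renderer knows where the eye is pointed, the work drops by more than an order of magnitude, which is why eye tracking keeps coming up as the key enabler.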
Alan: Yeah, I think this is gonna be resolved. We're starting to see the headsets come out. The HTC Vive Pro has eye tracking -- or the new one has eye tracking. The Pico Neo 2, I believe, has an eye tracking option. I would assume Facebook will too, because they bought an eye tracking company, and Apple bought SMI.
Martin: Eye tracking is the key to all of this.
Alan: It's coming. And I think one of the things that-- obviously you've tried the Varjo?
Martin: Yep. Varjo, yeah.
Alan: They took a really nice approach with fixed foveated rendering, without eye tracking. But as they start to figure out eye tracking, in addition to what they've got, I think they've got a really nice solution for the high end. But again, it's VR and it's a giant headset that has to plug into a supercomputer. So I think the more practical use case is figuring out how phones and tablets can bring this experience. And I wonder if we can use the-- we actually tried this as an experiment: putting a virtual mirror, an AR mirror, on a wall, and then using the back-facing camera to project the mirror, and the front-facing camera to project your face onto the virtual mirror. But we tried this a couple of years ago and it didn't work. The phone actually shut itself off. So I think it was a processing problem, rather than that you couldn't do it. It just didn't do it. [laughs] The phone went boom. So the processing power is getting better, and if you add eye tracking, it should be interesting.
Martin: And the processing power we'll get in our hands for the same amount of money in 10 years will probably be a thousand times more than today. If you look back, that's the development we've had over the last 10 years. So let's see what we can do with a thousand times more power.
Alan: What is one problem in the world that you want to see solved using XR technologies?
Martin: Yeah. The possibility to reach the many, with any kind of information. I think I have to expand that a little bit. The reason why I think getting 3D into the hands of the many will make things easier is that, if an image says more than a thousand words, then a three-dimensional thing or a three-dimensional space says even more, for a human. So I think that will make it much easier to explain whatever it is to anyone. And that will also grow the common knowledge in the world about things and how they work. And knowledge almost always makes us smarter and better together. It's an information engine for any use. Does that make sense?
Alan: That makes perfect sense to me. And you know from the work we're doing and the stuff we've been doing, education is a very big part of why we do this. And your answer is wonderful. Thank you. If an image is a thousand words, a 3D space is a thousand images.
Martin: At least.
Alan: At least. Because it's exponential. So maybe it's a thousand times a thousand. Are there any last things you want to say? Where can people find more information about the work you guys are doing? Do you have a website that's specific to this?
Martin: I think if people are interested in looking into possible work opportunities, then our website is fairly easy to find. I have some public talks -- you can search for "IKEA, VR, and Meatballs" if you want to see some talks I did a few years ago; they're still valid. And I'm fairly easy to find on LinkedIn, so people can find me there.
Alan: I'm actually going to look that one up. "IKEA, VR and Meatballs." And one of the things that you guys made a while ago -- and I don't know who made it -- but it was only available in Toronto for a little bit. It was-- you could make pancakes in an IKEA kitchen in VR.
Martin: It's still, I think, available on Steam.
Alan: It was so awesome. You could run around and make pancakes. So if you have a Steam account and a VR headset, you can start to make IKEA pancakes.
Martin: I think it only works for the HTC Vive, but it's still OK. It's still out there.
Alan: Martin, thank you so much for taking the time. And I really appreciate it.
Martin: Thank you.