Behind the Rack
Behind the Rack is the AV industry podcast where Pro AV, broadcast, and live production come to life. Hosted by Hugo Chevrette, Franco Caruso, and Vincent Simoneau, each episode dives into the latest audio, video, and broadcast technology trends, while sharing real-world experiences from integrators, tech specialists, and industry leaders.
From new product launches and system integration tips to AV industry trends, audio solutions, and video workflows, the hosts keep the conversations casual, practical, and easy to follow. Expect honest insights, a few rants, and plenty of useful takeaways that go beyond what you’ll find in spec sheets or trade show booths.
Whether you’re an AV integrator, dealer, broadcast engineer, or just passionate about pro audio and video technology, Behind the Rack helps you stay connected, learn something new, and see what’s really happening behind the rack.
Ep.9 - Immersive Tech Explained: XR, VR, AR and the Future of AV
Immersive technology is rapidly evolving — and it’s starting to intersect with the world of professional AV in new and interesting ways.
In this episode of Behind the Rack, we’re joined by Daniel Grozdavov from Imagine360 to explore the world of Extended Reality (XR) and how technologies like Virtual Reality (VR) and Augmented Reality (AR) are reshaping how organizations communicate, train, and create experiences.
Together, we discuss what immersive technology really means beyond the buzzwords, how these environments are built, and where XR may intersect with the ProAV ecosystem.
Topics include:
• Understanding XR, VR, and AR
• How immersive experiences are designed
• Applications in storytelling, education, and experiential environments
• The role AV infrastructure plays in immersive deployments
• The future of immersive technology
Whether you work in ProAV, broadcast, content creation, or experiential design, this episode explores how immersive technology is beginning to influence the next generation of AV experiences.
The Walmart Photosynth Origin
SPEAKER_04My first Photosynth, actually, I was in the Walmart. We were about to hike the Grand Canyon. It was like 10 p.m. and I was on the toilet when I got the app. So the first Photosynth is in the Walmart bathroom. That's how it started, to be honest. You see my knees, it's part of the thing. That was my first Photosynth.
SPEAKER_09Yeah, did you ever imagine, Daniel, that your journey would take you from a Walmart to a bathroom?
SPEAKER_04No, to the whole world. Yeah, to the whole world. Meta has that now, the neural wristband. You can set it up in minutes. You get used to it: you start by doing this, and then you just think of doing that and it clicks. Jesus. So eventually you could just be scrolling, you could be doom scrolling like this, you know, or just thinking about it.
SPEAKER_03Yeah, just sitting there doom scrolling. At least before, you'd burn some calories, you know, up and down, like this. Now you're just literally there, doom scrolling up and down. Yeah. So I didn't buy it.
SPEAKER_04So it's almost like a gigapixel panorama at the time. And there was a guy there who was like, this is really cool. I said maybe we can sell this to places around San José, Costa Rica. You went to the mall, tried to sell it to them, but it's Costa Rica, and a mall in 2012 having virtual tours and immersive shopping was too advanced.
SPEAKER_07Yeah, the market was not ready. Exactly.
SPEAKER_06Daniel, it is so cool to have you here. I've been looking forward to having you on the podcast for a while now. You're taking this to the next level. Us on the pro live system integration side, the broadcast side, we talk a little bit about immersive experiences, and there are a lot of immersive experiences out there in terms of installs, especially in live events. We're talking about the Taylor Swift tour and stuff like that. But you're actually putting it into action; you're doing a lot of cool stuff. So why don't we start off with you talking a little bit about yourself and how it all started?
From Road Trip App To 360 Obsession
SPEAKER_04How it all started, yeah. Well, my background is in physics. I did biophysics as a specialization, did some research on carbon nanotubes and atomic force microscopy in Berlin, Germany, at the technical university. I also worked with photosynthetic bacteria, extracting their proteins to model them for solar cells so we could get better efficiency, you know, copying nature in engineering. After I graduated, I went on a trip from Montreal to LA. We did a three-week road trip, and my dad had this app called Free App of the Day. At the time, the iPhone was just coming out, everyone was making apps, and I was scrolling through it while we were road tripping, and I see this app called Photosynth. I thought, oh, this looks interesting. It was a Microsoft Research Labs pilot app, and it allowed you, using the gyroscope, to rotate your phone and take photos, and it stitched them together. And I was like, this is great for our trip. So I started using it in the Grand Canyon and Sequoia National Park. It wasn't the best stitching, because your hand is moving, it's not like a perfect spherical camera, but it was enough to get all these cool panoramas. You could share them online, and people could look and zoom in and see everything. So that's how I got into 360s. It was just this random app. Then I went to Mongolia and used it all over there, went to Costa Rica, and then I found on the Microsoft Research Labs website that they had launched a desktop version of the same software. Instead of just using your phone, you could now import your DSLR photos and create higher quality media, you know, with nice glass. I think my first one was 40 pictures all around, because it was an 18 millimeter lens, so it took a lot of photos.
But then when I got the full panorama, you could zoom in and read text down to the finest detail, because it was so high resolution. So it's almost like a gigapixel panorama at the time. And there was a guy there who was like, this is really cool. I said maybe we can sell this to places around San José, Costa Rica. We went to the mall and tried to sell it to them. We cloned their site and inserted panoramas of the stores to show them what it could look like. But it's Costa Rica, and a mall in 2012 having virtual tours and immersive shopping was too advanced.
Early Sales Wins In Hotels And Resorts
SPEAKER_07Yeah, the market was not ready.
SPEAKER_04No, exactly. So then we gave up on that, and I just kept working with what I had. I made a demo with just what I had and went to the biggest hotel there, the Gran Hotel Costa Rica. I said, I do virtual tours, I could make one for you, here are some examples. I think they gave me $2,000 US and said let's do it. I did it in about a day, and then the same day I went next door to try to sell it to that hotel and said, they're buying it right now. They said, no, we can't let them be the only one. So they bought a tour as well.
SPEAKER_07And that one was 2,500 that time?
Building Credibility With Museums And Government
SPEAKER_04Yeah, something like that, around there. So I laid all the cash on my bed, like, look, I have all this money. And then I had to move to Venezuela with my then girlfriend. There were all these resorts in the tourist town on the Island of Margarita, and I went from resort to resort, walking around with my iPad, showing them the work, and there was a huge interest. Google Street View was still taking off; there were Google Street View photographers, and then virtual tours that are more custom-based. I think I got three months for free at a resort, plus payment, food and drink included the whole time. Another one put me up in the penthouse. They paid for my flights back as well. It's like I had, I don't know, a gold thing I could offer them, and there was no one walking around doing that in those areas. When I came back here, I went to the 1234 de la Montagne; it was MC Mario's club. I just walked in, somehow met him, spoke to him, and we did a tour of that place. So that was my first Canadian contract, maybe in 2014, roughly. And then I just started selling it to museums. We got the Pointe-à-Callière Museum. That's 10 years ago now. They were just digging up the foundations of the city of Montreal, the ruins of Fort Ville-Marie. A friend of mine was an archaeologist working there, and he wanted to document it so we'd have it as a record. So we did a drone 360 virtual tour and all that, and those few projects were enough to get us more and more clients. We even did the Museum of Civilization and the Museum of War in Ottawa, a full virtual tour of the entire exhibit. Those older projects with more reputable organizations gave us bigger and bigger contracts. Now we've done stuff with the Government of Canada, the Ottawa Senators, all the banks. We've done Place Victoria with Groupe Mach, maybe you know them.
They were redoing their whole lobby, so we virtualized it in VR, and they wanted to save, I think, a $20 million contract. Fasken Martineau were thinking of changing their lease to a different building, and they said, we need to keep them. So we made a virtual reality experience to show the architect's rendering of the lobby and how it would look, because they wanted a more modern look and feel. At their big presentation with the board, they had our VR to show them what it would look like, and they closed the deal. Wow. I mean, they had other elements that helped them, but the VR was one of those. So anyway, it just kept evolving. Like I said earlier, Palmer Luckey made a Kickstarter project with the Oculus, the initial Oculus headset, and it was enough to get, I think, a million dollars. A friend got a first prototype off Kickstarter, so I got to try the earliest DK1, Development Kit 1.
SPEAKER_08The one that looked like a Kleenex box.
SPEAKER_04Yeah, I should have brought it. I have it in my office; I forgot to bring it. Yeah, it's a big box like this. It made you nauseous because the tracking was late, so every time you turned your head, it was just slightly off. Version two was a lot better, and then, as you know, Facebook bought it for two billion dollars, and then we got the first Oculus Rift and so on. It was very gaming intensive, and we were never in the gaming space, because we believed it was for capturing reality, capturing moments that you could then revisit forever, and for archaeology, for museums to document what they were doing, and obviously live concerts, experiences like that. Tourism was a big one. We often tried to sell our VR to Tourisme Montréal, but again, it was a little too early. They didn't have the hardware and the back end to support all this content. Yeah, but that's kind of how we got started. As we got bigger, I had a larger and larger team. We had an office on Casgrain, next to Ubisoft in those big industrial buildings. We were there for about eight years, until the pandemic. Then in the pandemic we had to close our offices, because we had a co-working space and a production studio as well, and at the end it was just me working from there. I think we were paying about three grand a month, so I said we need to just cut our losses on this, and we went fully virtual. And then during the pandemic, just to quickly say, we pivoted to virtual events, because that's what everybody was doing. I created, I think for Square Enix, they were here, Square Enix game studios, they had the fifth anniversary of the Hitman series. To celebrate, because they couldn't do it in person, we created a metaverse world in the Hitman mansion.
You know, in Hitman there was a mansion, so the employees could have a party in their own mansion that they designed. Everybody had these costumes, the CEO was there giving a speech, there was a dance party, and it was kind of a good way to make some cash with virtual events. That's amazing. And now we're just continuing to grow. Most of the team is fully remote, so we meet virtually in a virtual office space. We have channels to talk on, like a Slack-like experience, but in this virtual world. And yeah, they're all over the world. I have a team in Ukraine, in India, and one in France as well. And sometimes I just outsource to various people.
Oculus Beginnings And Non-Gaming Vision
SPEAKER_06That's impressive. Wow. That initial conversation, I'd love to hear what that initial pitch was like with your first hotel. Because I think the hardest part of getting a project is: how do you talk about something that they're not even aware exists? What was that conversation like?
SPEAKER_04Well, the iPad helped, because I didn't have my Spanish yet. My Spanish is great now because I spent so much time in these countries, but at the time I was like, "tour virtual," kind of trying to say it in Spanish and showing them the content. I think she called her manager, and then he saw there was something interesting. Luckily, the owners were more anglophone. I think I was using, what was it, there was like a book publishing thing on the iPad, if you wanted to have your PDFs. I forget the name, but to get the virtual tour on there, I had to publish it as an ebook and load it on the iPad so you could look around in 360. There was no other way to get it on there so I could actually show it to clients. Now it's a lot easier, but at the time I had to hack the iPad just to be able to show this stuff. I forget the ebook format; it was something with Apple's own proprietary ebooks. But yeah, the iPad was key, because it's a technology that, unless you see it or try it, like you said, people don't understand the benefits. Especially with the gyroscope, you could turn it and look around. That was another cool thing, having the gyroscope, to see, wow, it's interactive and I can look around. So you're not talking about ROI here.
SPEAKER_06Like, the initial conversation had nothing to do with ROI. It was just basically, would you like to be the first one? And here's how you could elevate, I guess, the experience.
SPEAKER_04Yeah, I wasn't thinking of it from the business side, how it would impact their profits and all that. I just figured, you have a hotel, people want to see the rooms and the common spaces, so let's show it off and you can put it on your website. I never followed up with the Gran Hotel Costa Rica, but I got to go in the presidential suite. President Kennedy stayed there. They still have his room all set up, so that was a big deal at the time. I was like, wow, it's Kennedy's room. Now we've done so many crazy things. We've traveled, actually, I should mention, doing VR. We worked with World Vision and their team. We traveled to India, Rwanda, we went to Nepal. We've also gone to Haiti, Morocco, Switzerland, Belgium. I mentioned Colombia, Panama, Costa Rica, and then all over the States as well. So we've done projects all over, and really, just seeing the world and being paid to travel to these unique places was a big plus.
SPEAKER_08Wow. That's incredible. And you know, I have an anecdote for everything. Back in 2012, when the iPhone 4 came out, I was in France, and I bought it there, and I did use Photosynth. I basically took pictures of my whole trip with this app. So yeah, I remember, I had it too. Okay, awesome. Yeah, that's cool.
SPEAKER_04So most people are like, what? I've never heard of Photosynth. You're the first one I've met.
Pandemic Pivot To Virtual Events
SPEAKER_08What I liked about the app is that you could see where the next picture should be taken. There were squares all around you, and it would guide you. Yeah, you could see the thing build up while you were using it.
SPEAKER_04And you didn't even have to take the photo. Once you start, the moment you get to the edge it just takes a photo, so you're just moving your phone and it fills it out automatically.
SPEAKER_08And bear in mind, in 2012 I really looked stupid on those archaeological sites, doing this.
SPEAKER_04Yeah, and if people walk into your shot, it ruins it, because then there's someone in the shot cut in half. Yeah. My first Photosynth, actually, I was in the Walmart. We were about to hike the Grand Canyon, it was like 10 p.m., and I was on the toilet when I got the app. So the first Photosynth is in a Walmart bathroom. That's how it started, to be honest. I still have it. You see my knees, it's part of the thing. That was my first Photosynth.
SPEAKER_09Did you ever imagine, Daniel, that your journey would take you from a toilet to a bathroom?
SPEAKER_04No, to the whole world, yeah. To the whole world. You know, if I had just not opened that app and kept going, maybe I'd never have found it.
SPEAKER_06Yeah, that's crazy.
SPEAKER_08You would be in science, probably, or something like that.
SPEAKER_04Yeah, exactly.
SPEAKER_06That app really was the beginning.
SPEAKER_04Yeah, when I got back from the trip, I was supposed to do my master's at UdeM in a photosynthesis solar cell research lab, and then that trip derailed the whole thing.
SPEAKER_08And funny enough, linking those things: he was about to do a master's in photosynthesis, but he ended up using an app called Photosynth.
SPEAKER_04Oh my god, I never thought of that. Wow. That's true. So yeah, photosynthesis, but in different ways. Wow, that's crazy.
SPEAKER_06Just for everybody out there who's not aware, can you give us a bit of the difference between them? Because I have to read this quote from your website, from Imagine360, your company: "We create immersive VR, AR, and MR experiences that captivate audiences, elevate brand storytelling, and drive measurable results so you can focus on what matters most, which is growing your business." And I love that, by the way. Thank you. Can you tell the audience a little bit about the difference between XR, VR, and AR?
SPEAKER_04Yeah, it is a hot topic, and even between professionals we disagree on some of the terms. But my definition would be: XR is the umbrella term for AR, VR, and MR. It's not its own thing, it's just extended reality, everything that AR, VR, and MR combine. Virtual reality you've seen; it's when you put on a headset and you're watching something that you can look around in, in 360.
SPEAKER_08This is the most popular and the most well-known.
Selling The Vision Without ROI Jargon
SPEAKER_04Exactly. And within that, there's a distinction with 360 video. A camera like this will produce a 360 video that you can look around in, but you could do that on your computer too. On YouTube there are 360 videos; you could watch this on YouTube and have the same experience. But for VR professionals, we would say that's not virtual reality, that's just a 360 video. It's virtual reality when you shoot with a camera like the bigger one that has six lenses in stereo pairs, and each of your eyes gets one. It's like if you had one of these for each eye, and then you're getting a stereoscopic image. The feeling of immersion is a lot greater because now you can see the depth in the scene, and when things come close to you, you really feel like it's in your face. So the level of immersion is a lot greater when you have stereoscopy. 360 stereoscopic video is considered virtual reality on the live-action capture side. There's also the game engine world, everything that's Unreal Engine or Unity, where you can actually move your head around in the scene and grab objects; that too is virtual reality. So you have CGI-based 3D environments that are virtual reality, and video-based 360 stereoscopic. There's also volumetric video, which is another interesting way to experience virtual reality. You've maybe seen that sports are starting to do this now: they have a large array of cameras filming video from every angle, all those frames are fed into an AI neural rendering engine, and then you can replay the video, move around within the scene, and watch from any angle.
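As a rough illustration of why those stereo pairs create the depth Daniel describes: each eye sees the same point shifted sideways by a disparity that grows as the point gets closer, which is why nearby objects "pop" in stereoscopic 360 video. This is the standard pinhole-camera disparity formula, not anything from the episode, and the numbers are purely illustrative.

```python
# Toy sketch: horizontal pixel shift (disparity) between left/right eye views.
# baseline_m is the eye spacing (~64 mm average interpupillary distance);
# focal_px is a hypothetical virtual-camera focal length in pixels.

def disparity_pixels(depth_m, baseline_m=0.064, focal_px=1000.0):
    """Disparity shrinks as depth grows: disparity = focal * baseline / depth."""
    return focal_px * baseline_m / depth_m

# A point half a metre away shifts ~20x more between the eyes than one
# ten metres away, which the brain reads as "right in your face".
near = disparity_pixels(0.5)   # 128.0 px
far = disparity_pixels(10.0)   # 6.4 px
```

A flat (monoscopic) 360 video feeds both eyes the identical frame, so this disparity is zero everywhere and the depth cue disappears.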
SPEAKER_06So that's what we saw at the Olympics that I was talking about. I didn't know what it was, but that's volumetric video.
SPEAKER_04So back in the day, you could have a semi-spherical array around a person and do sort of a bullet-time experience, like in The Matrix, you know, when he goes like this. But that was just a bunch of cameras taking photos and then switching from camera to camera. It freezes the moment; there's no actual live action. Now we have the technology and the processors to put all that data together. I think Intel had a very large sound stage like that; it was like the biggest one in the world. They had a whole big fight scene with horses, and all the cameras filming. You can then pause it and see the scene from any angle, like a movie scene, but fully immersive. So 360 video, we call that 3DoF, and 3DoF means three degrees of freedom, because you can rotate your head in those three degrees, whereas 6DoF is six degrees of freedom: you can also move forward, backward, up, and down in the scene.
SPEAKER_08Because I've seen those terms before, and I was wondering, okay.
Defining XR, VR, AR, MR And 3DoF vs 6DoF
SPEAKER_04Yeah, so in 3DoF you can look around, and that's what a 360 video camera would produce: you can only look around. Whereas in a game engine or a fully volumetric video scene, you can move within it and have six degrees of freedom. So those are the virtual reality terms. Augmented reality, the way I would consider it, is when you have Pokémon Go, for example. You open your phone and you see a Pokémon on the ground, or maybe you're using the IKEA app to see furniture in your home. Sometimes there's a LiDAR, but mostly it's just using the camera to see the space. It uses SLAM, simultaneous localization and mapping, technology. So it's localizing the camera in space and mapping surfaces like the wall, a table, the ground. Knowing those, it can place objects in the space around you, but you're seeing it through your phone. That's augmented reality, because it's augmenting your reality. Now, if you take that same concept of having 3D models in the space around you, but you put it on your eyes, and each lens is showing you a 3D view, meaning your left eye sees the Pokémon slightly from here and the right eye from here, now the Pokémon seems really 3D, stereoscopic, and you're not looking at it through a phone; it's just overlaid on your eyes. That we would call mixed reality, because it's now mixing 3D content with your reality, and there's nothing in the way.
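The 3DoF versus 6DoF distinction above boils down to what the headset tracks. A minimal sketch of that idea (the class names and field layout here are just illustrative, not any real XR SDK's API):

```python
# 3DoF tracks only head rotation; 6DoF adds position (translation).
from dataclasses import dataclass

@dataclass
class Pose3DoF:
    yaw: float    # degrees; look left/right
    pitch: float  # look up/down
    roll: float   # tilt head

@dataclass
class Pose6DoF(Pose3DoF):
    x: float = 0.0  # metres; lean/step sideways
    y: float = 0.0  # crouch/stand
    z: float = 0.0  # move forward/back

# A 360 video player applies only the rotation to the view, so leaning
# forward changes nothing. A game engine or volumetric scene also applies
# the translation, so leaning forward actually moves you through the scene.
look_only = Pose3DoF(yaw=90.0, pitch=0.0, roll=0.0)
walk = Pose6DoF(yaw=90.0, pitch=0.0, roll=0.0, x=0.3, z=1.2)
```

Real runtimes (OpenXR, for instance) represent the rotation as a quaternion rather than Euler angles, but the split between orientation-only and orientation-plus-position poses is the same.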
SPEAKER_06Interesting.
SPEAKER_04So the Oculus Quest headset has virtual reality, but if you tap it, it removes the virtual and you see through the two stereo cameras in the front. Starting with the Quest 2, they had a mixed reality pass-through mode, but the cameras were black and white, so it wasn't very immersive, because you're in a black and white version of your living room or whatever. Now it has color pass-through at 60 frames per second, a high refresh rate. As you move, each eye is seeing the left and right cameras, so you have stereoscopy, and you can overlay models in your space. So it really feels like, what's that movie, Minority Report? You know, with all the screens. You could do that, and you have hand tracking. So that's mixed reality, and all three together comprise XR. And maybe Meta's glasses, the latest version, would be considered more augmented reality; you can have information here. But anything that's just a heads-up display on your glasses is not truly augmented reality; it's just a heads-up display.
SPEAKER_06Okay, so that's not even called mixed reality? Augmented reality or something?
SPEAKER_04No, I wouldn't call that mixed reality, because you're just seeing a chat, the time, an AI prompt. But glasses like Magic Leap, maybe you've heard of Magic Leap? They were a small startup, but they got billions in funding because they promised a lot of cool tech. They had a nice demo that sold a lot for them. Basically, they have a whole other level of technology, which I think Meta is currently working on with Project Orion, it's called. And I know we're going deep here, but I love going deep; I'm really passionate about this. I never got to experience it fully. The closest headset to that was the Microsoft HoloLens. Maybe you've heard of the HoloLens; it also did well for a while. I think some people are still using it on the enterprise side, but not as much. Instead of having a screen with lenses that you're seeing information on, like video, brought into your eyes through a lens, here you have a glass that has a waveguide, and the glass can simulate the photons of light the same way they would actually come off, let's say, this water bottle. Light is hitting here, and it's not video, it's just light coming off the bottle. It goes into each eye, and my brain has to process that light to see a 3D image of it. So this headset will pretend there's a Pokémon here: instead of just showing me pixels of the Pokémon on a screen, it will simulate, for each eye, the light that would be coming off of that 3D model as if it was there, and it's my brain that's then interpreting that light to produce the image. So it creates a much more tangible 3D immersion, because it feels like you're really seeing it with your naked eye.
And if I were to focus my eyes on the microphone, because the light coming from that object is simulated, when I focus on this, that 3D element will become blurry in the background. You can focus on it here or there; it's physically in your space, and your eyes can adjust to it.
SPEAKER_06Sorry, stupid question: is that a little bit like ray tracing on a computer?
Volumetric Video And Waveguides
SPEAKER_04It's potentially like that; it's tracing the rays that would come from there. But instead of calculating just how the light would reflect off it, it generates the full thing. You could have ray tracing added on for better lighting, but it's essentially simulating the photons that would be coming off surfaces. So imagine a car, and you could see it as if it was there, versus just seeing video on a screen. If you have good enough screens, you can remove the visible pixels, like a Retina screen, but this is even better, I think. So they're still working on that: how to make it smaller, higher resolution, with better contrast and saturation. But waveguides will eventually be in the Meta glasses, I think.
SPEAKER_08So that's called waveguides. And what was the name of that company, HoloLeap?
SPEAKER_04Sorry, HoloLens and Magic Leap. Magic Leap, okay. They still have the Magic Leap 2, but it never took off. It's expensive, too; I think it was like $3,000, and there wasn't enough demand to develop for it, so not enough developers. It was a catch-22, kind of. So the real question is: is it all hype, or is it actually deployable? It was hype at the time. Everybody said it's just hype, you know, it's not going to happen. But I think with advances in technology, with AI to improve frame rates and resolution, we'll be able to have something small and powerful enough, maybe with a battery or a processor that's off-device and just streaming to the glasses.
SPEAKER_06And that's something that you're you're trying to foresee to help your business and help other businesses, right?
SPEAKER_04I mean, mixed reality, yes, because it's more accessible. This waveguide-based mixed reality, I think, is still years away, so we're not building around that. More commercial applications, enterprise. We're still doing some hotels and resorts, but we've moved to industrial applications. I've done a lot of mining work as well. I've been all over the United States and Canada in gold mines, iron, graphite. And what does that do?
SPEAKER_10Yeah, what's the application for that?
SPEAKER_04Training? No, it's not training. So let's say you're an investor and you want to invest in a mine. My client has, like, the LinkedIn of the mining world. If a mine wants to be on it, they go to this platform, it's called Verify, and you can see a whole map of the world. You can see mines wherever they are; they have their pitch deck, and who the leaders and owners are. You can then see 3D samples of cross-sections of the drilling, so you can see the deposits, the different densities of deposits, how much there is, all overlaid. They do that just from the data they extract, but then they'll send me, or people who do VR, to fly a drone, get 360 panoramas all over the site, and go into the tunnels underground. There I need to bring lights, because it's usually pretty dark. Oh yeah. But they want to see the quality of the equipment, how well it's made, so that instead of the investor having to travel there to inspect the mine, they can maybe make a decision to invest from wherever they are in the world, just from this virtualized experience. They can see how the mine is developing, what buildings they have, where they do the crushing, the production, every stage of the mine.
SPEAKER_06Yeah. So is it the chicken or the egg? Is it you coming up with these types of applications? Are you seeing demand for it? Are people coming to you and saying, is there something you can help with in this vertical or industry? How are you navigating that?
What’s Hype, What’s Deployable
SPEAKER_04Yeah, so sometimes we will create something new and then try to propose it to a client, but we're not building any IP. We're not trying to create products for now; at least, I've been working on some ideas for apps. But for the most part, clients will find us, or we'll meet somewhere at a networking event: you know, we have this challenge, we need to do this. Then we'll work backwards from that to develop it. Maybe it's drones in 360, or augmented reality, or mixed reality, or a web-based 360 solution, or sometimes just a virtual tour is enough.
SPEAKER_10What would you say is the trend right now? What does everybody want?
SPEAKER_04The trend, I think, is moving away from big goggles that you put on your head. Meta, like I said, has laid off a lot of people; they saw it wasn't growing as fast as they wanted. Mobile is always a big one because it's in your hand. Anything with your phone is easy, everyone has a phone. You can just put up a QR code, I scan it, and now I have an augmented reality experience. You can take photos with it and share it more easily. So that's one thing going towards AR. And 360 video is still pretty strong. Virtual tours are steady, I would say, for hotels and resorts. I mentioned augmented reality with the phone, but there are also a lot of glasses that have come out. The Meta Ray-Ban glasses don't have true AR, but there are companies like Xreal, I forget the other ones, that have pretty compact glasses. You can put up a giant virtual screen and either stream from your computer, so you could work with three screens, or have 360 videos playing. They're like compact VR glasses with an augmented reality component, and there's not enough content for them yet. But if you have an application, at least the hardware is there to do augmented reality with compact glasses.
SPEAKER_06Yeah, I see the Ray-Bans all the time, all over the place.
SPEAKER_04Yeah, I'm about to buy the latest ones. Because instead of having to reach for your phone, you can just see it right here, which is handy. And live streaming, taking a photo, or recording from your glasses: I could be streaming live right now to people from my point of view. That'd be sweet.
SPEAKER_08Apparently, Zuckerberg was in a trial last week, and his security agents came in wearing the glasses, the latest version of the Ray-Bans, and the judge had to say, no one is taking live video here, please.
SPEAKER_04Yeah, they had to take them off, take out the camera, because it was a hearing about privacy law, and there they are with the glasses, which are like anti-privacy.
SPEAKER_07I just have one question to wrap up. I know, seriously.
Trying The Apple Vision Pro
SPEAKER_08Because we didn't talk about what is, for me, kind of the elephant in the room, because I almost bought one. I want to know what you think about it, and let's not go for 12 minutes on this, but what do you think about the Apple headset?
SPEAKER_04Right. Sorry, the water went down the wrong way.
SPEAKER_08We can fill the time, I mean, because I almost bought it.
SPEAKER_06I recently went to the Apple Store a couple of weeks ago. I was going to put it on, but I had my two kids running around, and I knew that as soon as I put it on, I was going to lose them. Even though I know I can see through it, whether you want to call that mixed reality or augmented reality.
SPEAKER_04Yeah, so I've only tried it once. I went to the Apple Store and got the demo. The headset is amazing, definitely one of the best headsets you can get. Like their phones, it has the XDR display, so super rich colors, high contrast, very bright. I think it's about 16K of resolution, so you need very high resolution cameras to capture content for it. I couldn't see any pixels, no screen door effect, where it's like you're watching through a screen door. It was comfortable. I tried the eye tracking: you just look around and click with your thumb, and eventually I think they'll have a little thing on your wrist, so you could just think of clicking without having to actually move your hand, and it'll click. Meta has that, it's the neural band; you can set it up in minutes. You start by doing the gesture, and then you just think of doing it and it clicks. Oh Jesus. So eventually you could be doomscrolling just like this, or just by thinking about it.
SPEAKER_03Yeah, just sitting there doomscrolling. At least before you'd burn some calories, you know, up and down like this. Now you're literally just there doomscrolling in your head.
SPEAKER_04Yeah. So I didn't buy it, because the Meta Quest headset, now it's $300 for the 3S, is much more accessible. Maybe there are some subsidies, that's why it's cheaper. I think version two was being released; I was tempted to wait for that, and then I heard they're killing off the whole project, so there might not be much moving forward. I think it was just too expensive, and not enough people bought it, so there weren't enough users to justify apps. The same problem again: not enough developers were building for it. There were some videos they made, like a Titanic sort of remake, I don't know if you heard about that. They had the largest set ever built, because in 180 VR, with these huge cameras recording, you can't do the same kind of cinema tricks, so they had to build these large sets. So they made one very good immersive video with that. Another was about free soloing with Alex Honnold, a free solo 180 VR experience, and they did something with F1 and Brad Pitt, and one nature scene as well. And there was a store where you could make your own apps and push your own content through, but without enough people watching, you're not going to make enough money to return your investment. So I think they also saw that Meta and Samsung and other companies are doing these augmented reality glasses, and from what I heard, they're ditching the headset to go towards AR glasses.
Content Strategy And Device Constraints
SPEAKER_10So, Daniel, are you making content or coming up with ideas based on the technology that's being released? You keep mentioning this was killed or this was introduced, and then "we were making videos for it." What's your process? Are you looking at what companies are investing in and about to release and making content around that, or is the content just adapted to all these devices?
SPEAKER_04Most of the content is adaptable. Obviously, AR would be hard to put in VR unless we just reuse the 3D assets and build a whole scene around them. 360 video we can just film, and it mostly runs on any device, even on YouTube and on your phone. So we won't really look at what the companies are doing, except when we're trying to create a very high-tech, cutting-edge, innovative project. For example, recently I had a pharmaceutical company that wanted to create a VR video of what it's like to make medication: how a pill is made in the lab, how it's shipped around the world and goes to the pharmacy, and then the client walks in and picks up the box. They wanted you to fly through the production facility, through worldwide shipping and distribution, and into a pharmacy. In the end, we used AI to generate most of the scenes because their budget was quite low. But the CEO said, I want these specific glasses; it has to be on these glasses. He didn't want a Meta Quest or an Apple headset, they're too big, and at their trade shows or in the pharmacies, they didn't want to have to put that big thing on people. So that was a constraint: use these very thin glasses. And with those glasses, even if you moved your head, the video would stay with you; you couldn't look around within the scene. So then I had to do research. I think it was Samsung that had very small glasses, the Elite something, which didn't do very well but are still available for purchase. So we said the closest thing we have is this, and we had to build our whole project around these glasses.
SPEAKER_08So sometimes you'll have a specific order to fit a device, but you're not doing that all the time. Mostly it's producing content.
A 12K Pipeline And The Grocery Store Of The Future
SPEAKER_04In the end, the video production was the same as we'd have done for anything else, but here we had to find specific hardware to fit. So to answer your question, most of the time our clients will come to us with, you know, we need to film this thing, and for the most part our cameras haven't changed much in a long time. It's still pretty much the same, just better resolution and higher quality, more cinematic sensors and lenses. The biggest thing that has changed the industry now is the Blackmagic URSA Cine Immersive, right? The 12K camera that has upped the game, because now you have 12K per eye at 90 frames per second, and Apple collaborated with Blackmagic to create a new section in DaVinci Resolve, a whole immersive suite. It's the first color grading workflow where, instead of grading each eye's video separately, they're essentially combined together. Wow. I forget the name of the format, but you work on one eye, and the second video, instead of being a mirror with all the same pixels, only contains the extra pixels that differ from the left eye. So it optimizes the video: less bandwidth, less processing power needed to work on it, because it's just encoding the difference between the eyes instead of two full videos for each. The whole pipeline is very nice. I've worked with some sample footage from the immersive camera, and it ran really well when I created the first 12K test for that client, actually. Another client I had, for example, just launched last week at a trade show about the future of the grocery store. Grocery store people have their own conferences where they show innovation in shelving and how to display products, and this one's theme was the future of grocery shopping.
And so the marketing agency came to one of my clients, who then came to me because I've been doing this so long. They wanted to film this: you're in the grocery store with your cart, and imagine you had augmented or mixed reality glasses. As you walk, you see your shopping list as a heads-up display, plus the titles of each aisle. When you're looking at a section, you can see this is a healthy choice, this is less healthy, this one is a newly released product. You take an item, and as you add it to your cart, it automatically updates your totals. Now, this was just a proof of concept, it wasn't really happening. So we had to film it; we actually filmed with that camera, the Insta360. I was going to use the 180 setup, but they wanted a more immersive look. So we went to a Metro store at 7 a.m. before it opened, filmed the aisles, filmed putting the products in the cart, and all the interactivity was done in Blender. We added menus and interactive elements, so when it feels like you're clicking and adding stuff to your cart, it was all just a sequenced video. In the end, you got an experience of what it would be like shopping in the future. And like, your wife calls and says, can you get some milk? And then you get a pop-up and it automatically adds it to your shopping list.
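The per-eye difference encoding Daniel described a moment ago, storing one full eye plus only the pixels that differ, can be sketched in a few lines. This is a conceptual Python illustration of the idea, not the actual format Apple and Blackmagic use:

```python
import numpy as np

def encode_stereo(left: np.ndarray, right: np.ndarray):
    """Store the left eye in full and only the right-minus-left difference.

    When the two eyes are mostly identical, the delta is mostly zeros,
    which is the intuition behind difference-based stereo formats.
    """
    delta = right.astype(np.int16) - left.astype(np.int16)
    return left, delta

def decode_right(left: np.ndarray, delta: np.ndarray) -> np.ndarray:
    """Reconstruct the right eye exactly from the left eye plus the delta."""
    return (left.astype(np.int16) + delta).clip(0, 255).astype(np.uint8)
```

A real codec would go on to entropy-code the near-zero delta, which is where the bandwidth and processing savings come from.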
SPEAKER_10My wife would buy those glasses for me tomorrow.
SPEAKER_03Yeah, so it's no more, honey, what was the thing you needed?
SPEAKER_10Or, what aisle is that in? My wife knows: if I'm going to the grocery store, I'll literally tell her, please keep your phone on you.
SPEAKER_06Yeah, and it's no longer don't forget the list, it's don't forget your glasses. Don't forget your glasses, don't forget your phone.
SPEAKER_04Yeah, those glasses are going to change the world. Mark Zuckerberg has said he thinks the phone will eventually go away, because, yes, you can take photos with a phone, but you could just have it here on your face. Most of the time, you don't need to reach into your pocket just to see a message, reply, or ask a question.
SPEAKER_08Yeah. Okay. My next sports glasses are the Oakleys. That's on my list.
SPEAKER_04The Meta and Oakley ones, yeah.
SPEAKER_08I'm gonna buy them just before the season. I'm an avid cyclist, and these glasses are pretty cool. Just like the Meta Ray-Ban version one, you can film, you can ask questions to the AI, but they're sports glasses. I'm an avid eater too, so I need them for the food.
SPEAKER_04For your groceries, I guess.
SPEAKER_08Grocery glasses.
SPEAKER_06I don't need glasses to follow my my belly.
SPEAKER_04But what was cool is that after we filmed the actual scene, we had a 3D artist create the interface and the mock-ups, your shopping list. On the cart there was a heads-up display, so you could see what you have and what's left to get. As you move the cart, this virtual interface stays physically on the cart, and it was all rendered in stereoscopic 3D. So when you have the goggles on, those menus are really floating in front of you in 3D in the aisles. It was a cool project. That would be considered MR, right? Mixed reality. The real-life application of it is mixed reality, yeah, exactly. We did it in VR to show what it would look like if you had the glasses, which don't exist yet. But yeah, it was mixed reality.
VR In Live Production
SPEAKER_10This is such a good subject, because, talking about what we do day in and day out here at SFM, we cover everything from video capture with brands like Blackmagic Design to post-production with Avid and DaVinci Resolve, and Genelec monitors for post-production as well. We touch the immersive world, but not directly. So it's interesting to hear how immersive has made it out into all these different industries and how all our brands indirectly touch it. I think that's the really cool link between all these things.
SPEAKER_06You're forgetting about Christie, because Christie does projection mapping and stuff like that as well.
SPEAKER_10We didn't even brief Daniel to mention Blackmagic; it's just organically a player in the business, absolutely. A small anecdote: a couple of years ago, I visited a live production house, and they were explaining to me how they had started using virtual reality, and this might have been five, six years ago, with their lighting designers to sell concepts to big artists. So you'd have, let's say, a big country artist roll in and say, guys, this is what I want my arena tour to look like; what are some of your ideas? Now, obviously, the rendering software is capable of showing a really good look: you input your lights, the beams, the intensities, and you get an idea. But how does it feel for the crowd? That's when they would bring in a VR artist or a VR firm to render the entire arena and what it would look like with those lights and that audio. That was their trick for selling these large tours and productions: using immersive technology. So, my question to you, Daniel, is: are you seeing more of that? Less of that? What are you seeing in the live production and live event space?
SPEAKER_04Yeah, so one of the biggest players in that live concert space, maybe you know them, is PixMob. They came to us to film a Taylor Swift show. They wanted to show what it looks like in real life with VR. At the time, I guess they didn't have that capability in-house.
SPEAKER_10Q was a big Swiftie, by the way.
SPEAKER_04Yeah, okay, nice. In the end, we didn't get the contract; they had a team in California that ended up doing it, but we pitched on it. They wanted to film an actual concert in VR, not a rendering, so they could sell it to other clients. You know those bracelets that sync up and make all these colors. So that's a good example of how you can pre-visualize before the show: make your plans for the light sequences, put them in VR, and see if they look good in a full stadium. On the live side, I haven't done much, because it's tricky in terms of bandwidth and streaming. The venue needs good infrastructure if you have, I don't know, five 8K cameras streaming in stereoscopic 3D. Live is a tough thing; it's a different world. We've done it sometimes, but the company didn't have enough promotion for the live event, so there were maybe fifty people watching, when you'd want a thousand, or ten thousand. It's hard to get people to put on a headset at that exact moment and watch live. You can stream to YouTube and have it live there. The coolest thing I've seen is iHeartRadio working with Meta to create these intimate live concerts. They'll have Imagine Dragons or Doja Cat in their own studio, where they film a live, intimate show with a real crowd, and I think they're using the Canon R5 Cs with the 180 VR lens Canon released, the 5.2mm dual fisheye. They have four or five of those, and they'll place one in the drummer's seat, one in the crowd, one right up front by the singer.
And normally, if it were live, you could switch cameras as the viewer: okay, I want camera one, camera two, and you could move around the stage. But the way they're releasing it, they just cut from camera one to camera two, so you have a linear video experience showing the best moments from each. But we've done some where, in theory, like in an Oculus app, it's an interactive app where all the feeds come in and the user can choose the camera they want to see.
SPEAKER_08Like a Meta Horizon kind of thing, yeah, or its own custom app.
Live Streaming In VR
SPEAKER_04It could be like an Imagine360 live concert app. At the bottom you have camera A, B, C, D, all the streams feed in, and you selectively choose which one you want to see. This camera, the Insta360 Pro, does have live streaming capability; it has an Ethernet port coming straight out. The only thing is it can only stitch 4K internally, or else you need a private stitching server to stitch it into 8K and then go online, and there's a $20K license for that. But you could have five of those, in either 180 or 360 VR, and stream like that. And I'm sure now, with the Blackmagic URSA, you'll have even better setups for live.
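Why live VR is "tricky in terms of bandwidth" is easy to see with a back-of-the-envelope calculation. The bits-per-pixel factor below is a rough assumption for an efficient H.265-class encode, not a measured figure:

```python
def stream_mbps(width: int, height: int, fps: int,
                bits_per_pixel: float = 0.08) -> float:
    """Rough compressed-bitrate estimate in megabits per second.

    bits_per_pixel ~0.08 is a ballpark for a modern encoder; real
    encoders vary a lot with content, motion, and settings.
    """
    return width * height * fps * bits_per_pixel / 1e6

# One 8K (7680x4320) feed at 30 fps comes out around 80 Mbps.
one_feed = stream_mbps(7680, 4320, 30)
# Five stereo cameras: roughly double per camera, times five feeds.
rig_total = 5 * 2 * one_feed
```

At roughly 800 Mbps before protocol overhead, a five-camera stereo rig can saturate a typical venue uplink, which is why the infrastructure question comes first.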
SPEAKER_06Very cool. Is that a good price point too?
SPEAKER_08The 12K versus a lot of the others. We did talk a little bit about the gear you use to capture, but as you know, my background is post-production. I'm curious about the tools; you mentioned Resolve, but what kind of tools are you using? Without going into too much detail, let's say you have footage coming from the production cameras. What are the steps? Is it similar to standard post-production: you derush, then do an offline edit, then an online edit, then final stitching and color grading and so on?
SPEAKER_04Yeah, it's pretty much like that. The only extra software I've had to use is some plugins. I've used Boris FX Continuum; it has a nice suite for Resolve for putting in titles and reframing in 3D VR. So Boris FX Continuum is really good. And Mistika, Mistika VR, is the one we use. There's also Mistika Boutique, and they have a whole different line for post, but Mistika VR is specifically for stitching virtual reality video.
SPEAKER_08Oh, I didn't know that. I know about Mistika; it's a finishing software that was used years ago, but I didn't know they had that.
SPEAKER_04Yeah, they have a VR one exclusively. Wow, and they have presets for all the cameras: the Titan, the Ozo, the GoPro rigs, all of these cameras have a preset, aligned perfectly. It's almost one-click to calibrate the stereo mode. Most of the cameras have their own stitching software that comes with them; the Insta360 Pro has one, GoPro has their own, the Kandao QooCam as well. They're good, but they're not pro-level. With Mistika, you can really fine-tune, and it uses your GPU more effectively too, all the cores, to get a faster render. And you can create keyframes between different lenses. So if somebody walks through the scene and comes really close: with the other software, you can only apply one stitching solve for the entire video, and that's what's used throughout. Whereas here you can fine-tune the stitch throughout the video with keyframes. When someone gets closer, you say, I just want to use this lens's video more for this part, and when they move away, you can blend back to the other lenses. The fine control on the stitching is really good. So I use Mistika when it's a big project. If it's a quick one where it doesn't really matter, or all the subjects are far away, we don't need to go into Mistika; we can just use the native stitcher, which has an AI mode now. You know optical flow, for interpolating video? It's also used for stitching, when things don't match up perfectly: the optical flow will create new pixels to make the seams merge better. And then there's the AI version, which uses machine learning to optimize the stitch further.
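The keyframed stitching Daniel describes, favoring one lens while a subject is close to it and blending back afterwards, boils down to a time-varying crossfade in the overlap region. A toy sketch of that idea (a real solver like Mistika works on seams and geometry, not whole frames):

```python
import numpy as np

def blend_weight(t: float, keyframes: list[tuple[float, float]]) -> float:
    """Linearly interpolate a per-lens blend weight between stitch keyframes.

    keyframes: (time_seconds, weight) pairs sorted by time; weight 1.0
    means 'use this lens fully in the overlap region' at that moment.
    """
    times, weights = zip(*keyframes)
    return float(np.interp(t, times, weights))

def blend_overlap(lens_a: np.ndarray, lens_b: np.ndarray, t: float,
                  keyframes: list[tuple[float, float]]) -> np.ndarray:
    """Crossfade the overlap region of two lens feeds at time t."""
    w = blend_weight(t, keyframes)
    return (w * lens_a + (1.0 - w) * lens_b).astype(lens_a.dtype)
```

Here a keyframe pair like `[(0.0, 0.0), (2.0, 1.0)]` would ramp lens A in over two seconds as a subject approaches it.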
SPEAKER_08And the the editing is done in the yeah.
Post Tools: Mistika, Boris FX, Adobe, Resolve
SPEAKER_04So for the editing, once I export from Mistika or those tools, I'll get an equirectangular video. Uh huh. It's 2:1, covering 360 by 180 degrees. If it's stereo, then it's two of those stacked, so it's pretty much a square: the 2:1 becomes 1:1. That's usually in H.265, or if you want the best quality, we'll export in ProRes, bring that into DaVinci Resolve, and work on it like a normal video. Then we'll use Fusion to put in effects and animate things, or, if you put a 3D model into the virtual reality scene, we'll have to go through Fusion, and that's where Boris FX is handy as well. Because if you put in a straight element, say a line of text, and just slap it onto the video, if it's long enough it will become warped and curved. You need it to be 360-aware so that it looks straight in the headset; that's where Boris comes into play. The native tools in DaVinci can do some of it, but you have to use the nodes, and it gets messy if you have a lot of stuff; I find it a bit slower to go that route. On the post side, Premiere has also been quite good.
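The equirectangular layout Daniel describes maps longitude and latitude linearly onto the frame, which is why mono 360 is 2:1 and stacking two eyes gives roughly a square. A minimal sketch of that geometry:

```python
def equirect_size(width: int, stereo: bool) -> tuple[int, int]:
    """Mono 360 equirect is 2:1 (e.g. 8192x4096); top/bottom stereo
    stacks two 2:1 eyes into a roughly square frame (8192x8192)."""
    h = width // 2
    return (width, 2 * h) if stereo else (width, h)

def direction_to_pixel(yaw_deg: float, pitch_deg: float,
                       width: int) -> tuple[int, int]:
    """Map a viewing direction to pixel coordinates in one mono 2:1 eye.

    yaw in [-180, 180), pitch in [-90, 90]; straight ahead (0, 0)
    lands at the center of the frame. Longitude and latitude map
    linearly to x and y, which is also why straight overlays warp.
    """
    h = width // 2
    x = int((yaw_deg + 180.0) / 360.0 * width) % width
    y = int((90.0 - pitch_deg) / 180.0 * h)
    return x, min(y, h - 1)
```

That linear mapping is exactly why a straight title slapped onto the flat frame curves in the headset: a straight line in view space is not a straight line in longitude/latitude space.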
SPEAKER_08Yeah, because I remember a few years ago people were using Premiere.
SPEAKER_04So I know a guy personally, Chris Bobotis. He launched a set of tools for working on 360 media in post-production, a plug-in for Adobe, and I think it was called Mettle, M-E-T-T-L-E. It was so good that Adobe bought his company, and now Adobe does that natively. When you're working on 360 footage in Adobe, there's an immersive tab, and that's essentially his plugin embedded directly into the software. So Adobe has good native tools that are easy to use, but the color science is a little off, and there's been crashing; some people have had to roll back to previous versions for stability, and it's not as fun.
SPEAKER_08Whereas Resolve is your go-to right now.
SPEAKER_04Okay, and unlimited resolution is really nice too. You can work with any resolution: 16K, 20K, 32K if you're rendering for the Sphere in Las Vegas. Wow, that's unreal. But you need a beefy machine. My GPU has 12 gigs of VRAM, and even at 8K it's filling those 12 gigs. I think I need to go to 24 or 32 now to keep up. Wow.
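The memory pressure Daniel mentions follows directly from uncompressed frame sizes once footage is decoded for grading. A quick estimate (16-bit RGB is an assumption here; Resolve's internal precision and caching differ):

```python
def frame_bytes(width: int, height: int,
                channels: int = 3, bytes_per_sample: int = 2) -> int:
    """Size of one uncompressed frame, here 16-bit-per-channel RGB."""
    return width * height * channels * bytes_per_sample

# One stereo 8K equirect frame (8192x8192 stacked eyes):
gib_per_frame = frame_bytes(8192, 8192) / 2**30  # ~0.375 GiB
```

A few frames in flight plus node caches and working buffers fills a 12 GB card quickly, which is why 8K timelines already push it.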
SPEAKER_06So the big theme here today was video, but we talk a lot about audio as well. You're the audio guy. Exactly. So tell us: I know right before the podcast you were telling us you'd forgotten to bring a microphone, but you do a bit of spatial audio. How predominant is that when you're selling a solution, or is it not even a thing? Because on our side it is a thing, especially in museums and installs, surround sound and things like that. What's the most immersive one called? 22.2, yeah, exactly.
Spatial Audio Realities And Standards
SPEAKER_04Yeah, so most clients we work with are in enterprise, tourism boards, or museums. They might know about surround sound from headphones or a movie theater, but they don't think about it in virtual reality. Okay. So it's rare that they'll come to us and say, we need surround sound for this. We often pitch it as a bonus: if you want the most immersive sound, we'll use it. Our cameras have four microphones; most of them now have four to six microphones built in. So you can capture in stereo or four-channel ambisonic, AmbiX, and export directly. The Insta360 Pro has that built in. When you're streaming live, you can also stream spatial audio, but you need the right player that can decode it, or it just defaults to stereo, right? And there's been a lack of standardization between platforms: Meta, YouTube, other players like DeoVR, Apple's ecosystem. Each uses different muxing to package the audio back with the video, so you need the right export, and if you do one, it may work here but not there. So you have different versions for each, and spatial audio can easily get messy. If you know the exact hardware you're going to distribute on, it's fine. But if you have multiple outlets, you'll need a different version for each. So sometimes we just go stereo and say it's good enough like that, and many clients are happy; it still sounds good. It's just that when you turn your head, the stereo follows your head, it won't follow the scene. If there's a lot of action going on and you can't really identify a specific source for the audio, that's not a problem.
But if you're in a quiet room in virtual reality and a person talking to you walks around you, you'd expect the sound to track, and usually it does. But you also want it to react when you're turning your head, and this is where binaural comes into play. Yeah. Essentially, because the headset only has stereo speakers, stereo headphones, it takes the four-channel ambisonic audio and continuously mixes it down to a binaural mix. When you rotate your head, it recomputes the directions and produces a new mix constantly, so you get this binaural effect.
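What the player does on head rotation can be shown with first-order ambisonics. In AmbiX ordering (W, Y, Z, X, with X pointing forward and Y left; the convention here is an assumption of this sketch), a yaw rotation only mixes the X and Y channels, and a toy stereo decode stands in for the real HRTF-based binaural stage:

```python
import math

def rotate_yaw(w: float, y: float, z: float, x: float,
               yaw_rad: float):
    """Rotate a first-order AmbiX sound field about the vertical axis
    to compensate for the listener's head yaw. W (omni) and Z (up)
    are unaffected; only the horizontal X/Y pair mixes."""
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    x_r = c * x + s * y
    y_r = -s * x + c * y
    return w, y_r, z, x_r

def decode_stereo(w: float, y: float):
    """Toy cardioid stereo decode; real headsets apply HRTFs instead."""
    return 0.5 * (w + y), 0.5 * (w - y)
```

For a source straight ahead (x = 1, y = 0), turning the head 90 degrees to the left moves all the energy into the right ear, which is exactly the scene-locked behavior plain head-locked stereo cannot give you.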
SPEAKER_06Interesting.
SPEAKER_04Yeah, so we do it for bigger projects, and we've even worked with people who are dedicated to spatial audio. We'll record the scene with the audio from the live capture, but then sound effects, voiceover, narration, a door closing, all these extras have to be brought in and mixed spatially to create a richer experience. And the workflow: we can do all that in Resolve with plugins.
SPEAKER_06Very interesting. Thinking about that: a few episodes ago we did AV trends for 2026, and we talked about UC and collaboration, how the hybrid model is changing meetings in corporate spaces. And you had mentioned that nobody's really thinking about the far-end person, the one who's not in the boardroom with everybody else but at home; nothing has really elevated that experience. I remember a time when you were playing around, you were streaming it, or I think you just put the video up on Facebook, but you were actually in a virtual boardroom, testing it. That was the Meta stuff, right? The Facebook meeting thing, Meta Horizon.
SPEAKER_04Yeah, Meta Horizon. Horizon Workrooms, to be specific. Because they have Horizon Worlds, the gaming thing where you can create your own worlds, and Workrooms was the collaboration-focused one.
SPEAKER_08And is this what you mentioned earlier in the podcast? You said that you guys work remotely, but when you have meetings, you have virtual meetings in VR. Is that the solution you're using?
Virtual Workrooms And Remote Collaboration
SPEAKER_04So we have two. We have a virtual office that's 3D, but it runs on your computer, like a video game, and then we have a virtual reality version of it where you put on a headset. If it's a quick meeting, maybe we won't put on headsets; we'll just meet there. There's a virtual boardroom, and like in an RPG, you have to walk your character over and sit at the desk, and you see the people who are late coming in. It's really funny, because in a Zoom meeting you're just watching a screen of squares; people just appear, you don't see them arrive. Whereas here you see the person load into the lobby and walk across the room, and you're like: you're late.
SPEAKER_08Where have you been? It's funny. Is this a custom-made solution that you designed yourselves?
SPEAKER_04No, there's an app called Spot. You can create multiple rooms, put up your own custom wall art, and decorate it. Let's say we're avatars in the boardroom: I can have my laptop and share my screen onto the virtual screen in the room. And when you activate your webcam, it removes the background of your house and, based on where you're sitting, puts the room behind you, so the video looks like you're standing in that boardroom. Oh my gosh. Which is kind of cool. Yeah, Spot is nice. We'll also use Spatial, spatial.io. They're fully cross-platform: mobile, desktop, and virtual reality. So you could open your phone and use the app to walk around the event or the meeting, while I'm on my headset. That's really nice for collaboration. And brands have been building really nice immersive spaces in there. I think Walmart had a Black History Month experience where they worked with Black artists to create art and an exhibit, with records, so you could see their music and the art they had made. Pirelli made a really nice one showing how they design their tires; there was a merch store, their philosophy, collaborations with photographers, all in there. So these immersive spaces are getting pretty big. Roblox and VRChat have millions of users now, and the metaverse is growing steadily. Meta, though, is killing off Horizon Workrooms, unfortunately. I think they wanted the future of the workspace to be in VR, but they had a hard time getting executives to put on headsets and meet virtually when they could just have everyone come into the office. So that was the big thing.
So it's a bit sad they're doing that, but there will be other solutions around; it's just not Meta's focus anymore. They're focusing on the gaming side, and Horizon Worlds is moving to mobile only. For a while you'll still be able to access it in a VR headset and connect with people on mobile, but they're seeing way more uptake from mobile users because anyone can access it directly. So eventually there will be no more virtual reality in Horizon. Workrooms was really good at the time, though. During COVID, one of my business partners was in Toronto. We met three times a week, an hour to two hours each, in the headset. I'd put it on, there'd be a chair like this, and he'd suddenly teleport into it. It didn't look quite like him, it was an avatar obviously, but his hands would move when he talked, and he could share his screen. We had a whiteboard; you'd hold your controller upside down, write on your desk, and it would appear on the wall, and you could collaborate on it. You could add images, take notes, share screens. And seeing him like that, face to face, when I turn my head the audio is spatialized, so you really feel like there's someone in the room next to you. Even though we were far apart physically, it felt like we had been meeting in person two or three times a week. It was a nice experience.
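The spatialized audio described here comes down, at its simplest, to level and timing differences between the two ears as a source moves around your head. A minimal constant-power panning sketch illustrates the level part; this is an illustration only, not the actual Workrooms audio engine, which uses full head-related transfer functions (HRTFs).

```python
import math

def pan_gains(azimuth_deg: float) -> tuple[float, float]:
    """Constant-power stereo gains for a source at azimuth_deg
    (-90 = hard left, 0 = center, +90 = hard right)."""
    theta = (azimuth_deg + 90.0) / 180.0 * math.pi / 2.0  # map to [0, pi/2]
    return math.cos(theta), math.sin(theta)  # (left gain, right gain)

# A centered source gets equal gains of ~0.707, so total power stays constant.
left, right = pan_gains(0.0)
print(round(left, 3), round(right, 3))  # 0.707 0.707
```

Turning your head changes the azimuth, the gains shift, and the source appears to stay fixed in the room, which is what makes it feel like someone is sitting next to you.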
SPEAKER_06But for the non-geeky people, I guess the biggest roadblock was putting on something big physically and getting used to it.
Live Events, Concerts, And VR Viewing
SPEAKER_04You know, I'm used to it, so I could spend hours in virtual reality, and then you have a red ring around your face, but I don't get tired. Some people need a break after 15 to 20 minutes, because you have to work your way up to longer sessions. Get your VR legs, as they say. But in Horizon Workrooms, what was cool is I've had meetings with clients where it was my first meeting with them. I'd say, come to our virtual office, we'll talk about the project and answer your questions. So I'd give them a link to my workroom. I'd have my headset on, my laptop projecting a presentation to the screen, ready to go. They don't have a VR headset, so they would just get a Zoom link, and there's a screen inside the virtual office. I would see them from their webcam on a big screen, and they would see me as my avatar sitting there, presenting.
SPEAKER_02Cool.
SPEAKER_04And it was pretty impressive. Some clients I won over just because they wanted a VR experience. I'm selling a virtual reality experience in virtual reality, so it's more cohesive than if I'm just on Google Meet or something. Meta has been working on something called Codec Avatars, and Apple now has its own version too. You just take your phone, go like this, scan your face, and it creates a pretty high-resolution, lifelike version of your face. And with the eye tracking in the Meta Quest Pro, you can go like this; it has a camera down here that tracks your mouth as well. It never got to Horizon Workrooms, but the idea was you'd have a one-to-one avatar face for each of your employees or the people you're meeting with. They'd have hand tracking and facial expressions, and it would look just like them. So if you'd never met them in real life and eventually did, you'd recognize them right away. It was very high fidelity, and I think you've seen it on Apple: when you FaceTime with someone wearing the headset, you see their eyes through it. So hopefully one day we'll have avatars for everyone, and you can just have virtual meetings like that. Maybe your makeup isn't right or you had a bad night, but in your avatar you always look your best.
SPEAKER_06Well, we gotta wrap it up soon, but a question for all three of you. We've talked so much about the immersive experience; on the pro AV side, do you think the tech is practical for some of the pro stuff?
Barriers, Accessibility, And New Audiences
SPEAKER_10Or do you think there are still barriers? Well, you know, going back to my example about the lighting designer who was using it to sell concepts, I think it has a really important part to play that's not being utilized as much as it should be. I think we'll see a lot more of that going forward as it becomes a lot more cost-effective. Before, you needed a lot of rendering power and a lot of headsets; now, a lot of people have a headset readily available at home. So I think we'll continue to see it more in the sales process and on the preparation and organization side of things. And then there's seeing it as, like you said, an add-on: a concert will obviously have a live audience, but they could maybe sell it as a secondary revenue stream by offering it virtually. To his point, we still need to make sure the infrastructure is okay and that all the different people are communicating, so everyone's in sync and ready to have that secondary production available. Because now you not only have to worry about the primary production, the live performance, you've got to worry about the secondary one too, right? It adds a whole other layer of complexity. But I think with time, technology, and extremely intelligent people like Daniel, it's going to become a lot more popular. So yeah, I think there is a space for it, and it's something that'll be built upon. I'd love to see more of it. Just hearing Daniel speak today, there are so many things I didn't know about, or had only heard of, that I want to try. I think that in itself is also a barrier to entry, right? There's so much unknown, and it's very complicated.
But when you have someone like Daniel who understands it and brings it to the masses, that changes. We need more people like Daniel to be able to implement it.
SPEAKER_04Thank you for that.
SPEAKER_10Yeah, no, absolutely.
SPEAKER_04I hope to, yeah. I'm an evangelist for VR. When I go to shows, I always put it on people, and I must have shown VR for the first time to thousands of people across all the events. I don't know if you've ever tried Richie's Plank Experience. You put on a headset, walk into a virtual elevator, press the button, go up to the top floor, and then there's a piece of wood you have to walk out on. At trade shows, instead of showing the clients' work, we just do this, because it draws people in and makes a long lineup. So we've given a lot of people their first virtual reality experience. And I think the other cool thing is concerts. I recently watched the Coldplay show. When you're at a show, sometimes your seats are in the back, or you're too far away. I have a friend who's very short; when she goes to a show, everyone's standing up and she can't see anything. Sometimes you have a bad angle, or you just can't see. With virtual reality, I was at the piano with, what's the Coldplay singer's name? Chris Martin. So he's playing, he's looking at the camera, and I'm right there, and then instantly I'm next to the drummer watching him play.
SPEAKER_06So that's cool.
SPEAKER_04Even if it was on TV, they could have multiple cameras showing those angles. Right.
SPEAKER_10So you're making it accessible.
SPEAKER_04But there it's on a flat screen, and you're not really seeing it. Here it's 3D, and I'm right in there. So in some ways, concerts and these live events can give every user, maybe someone who's handicapped, or all the way in the back, or sitting where they can't see, an experience they otherwise couldn't reach. And you can't put a thousand people on stage either, or a hundred thousand. So that's really nice.
Streaming High-Fidelity Scans With Hyperscape
SPEAKER_08To Daniel's point, I'm also a believer that augmented reality, or let's call it XR, has been coming for the past five to ten years. We've been talking about it, and now everyone's talking about AI, but the technology to consume VR and the technology to capture it is getting very democratized. You can get a VR headset for 300 bucks. So I think it's a matter of time, and with those glasses from Meta, soon you'll be able to have real glasses with an overlay on top of the world. That's what's coming, that's for sure.
The Metaverse, Digital Twins, And Next Steps
SPEAKER_04Yeah, and to your point, the technology is improving so much. Back in the day, you had to set up a private server to stream content. Now Meta, for example, released a new app called Hyperscape; that's one you should check out. Can you pass me that? I might as well show it off. Here is the Meta Quest 3, everyone. It has cameras over here, and this here is a LIDAR: it sends out laser light, and when it bounces back, it measures the time of flight to estimate how far away everything is. So with this app, you put it on, you say scan space, and you see the room around you. I wish I could do a live demo here. When you see the room, a mesh starts to appear, and as you move your head around, the mesh goes over all the objects around you. It says, okay, walk around the whole room, and it puts a mesh on everything. That's stage one. Then it tells you to go back and look at everything more closely to remove the mesh. So you move in closer, and as you capture the details of each part of the scene, it removes the mesh, showing you that you've now scanned that area in detail. And it's all augmented reality again: the headset can see the room, so it uses SLAM technology to know where it is in the space, but it kind of hurts your neck because you have to look everywhere like this. So I could scan this space with just this headset. Normally you'd need a DSLR or a phone; now you can just wear the headset, and as you're scanning, you're seeing how much is left. In the last phase, it asks you to find the corners of the room and the walls to figure out the exact layout, and you'll see the grid appear. Then all of that is uploaded from the headset to Meta's servers. It takes a few hours to process, and it creates what's called a Gaussian Splat. Maybe you've heard of that.
It's like the new photogrammetry: an AI-powered neural rendering technique. And the model is so high quality. One of the scenes I put on, I had scanned the day before in a castle in Italy, in the upstairs part of the castle. When I put on the headset, for a second I thought I was actually back there, because what I was seeing was so real it messed with my mind. I'm like, what? I'm upstairs. I had to remind myself it's not real, because the lighting in Gaussian splatting is a very high-quality reproduction of the real space. But the headset couldn't run it; the amount of polygons and pixels is too heavy for a mobile chip. This is a Snapdragon XR chipset, it can't run it locally. So Meta's servers are rendering what you're seeing and streaming it to the headset in real time. When you're watching these scenes, you no longer need a beefy NVIDIA card in here just to run them; normally that's what you would need to see the scene at that quality level. With faster streaming formats that are more compressed but still good quality, we can do real-time streaming of these massive data sets. As technology evolves, it's opening new possibilities. Before, you'd have to have a computer, tethered wirelessly or by cable, to see that kind of data. With Hyperscape, you can have extremely realistic scans. I've done some castles, and I did events too. So maybe someone at an event, to show the space with the lighting, could just put on a headset, walk around, scan it, and then you have a super lifelike, one-to-one digital twin of the space. For now it's locked behind Meta's platform, but you could do it on your own with your own cameras.
But that'd be one way to create a space, and then you can cut it up and place other objects within it to design a better set, or collaborate with a designer to improve it based on that understanding. So Hyperscape, check it out. Yeah, I will check it out, that's for sure. You can visit Gordon Ramsay's kitchen, and Chance the Rapper's studio as well, and you can walk around, go like this, and see under his keyboard and everything. Everything is as it is in reality. That's true virtual reality, six degrees of freedom.
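The time-of-flight principle described for the headset's LIDAR (send a laser pulse, time the round trip, halve it) reduces to one line of arithmetic. A minimal sketch, with an illustrative pulse time rather than anything from a real headset SDK:

```python
# Time-of-flight ranging: a LIDAR pulse travels to a surface and back,
# so distance = (speed of light * round-trip time) / 2.
C = 299_792_458.0  # speed of light in m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Estimate distance to a surface from a round-trip pulse time in seconds."""
    return C * round_trip_s / 2.0

# A surface about 3 m away returns the pulse in roughly 20 nanoseconds.
print(round(tof_distance_m(20e-9), 3))  # 2.998
```

Sampling many such distances across the camera's field of view is what lets the headset build the depth mesh that Hyperscape drapes over the room.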
SPEAKER_06That's crazy. I'm a huge proponent, a huge evangelist for VR too, and I've always enjoyed it. We discussed a little bit of the roadblocks, but I still think XR is the future. For people who still face some of those roadblocks, the message today is that immersive technology, whether it's projection mapping or spatial audio or VR or XR, is the way to go. We're always thinking about how people can elevate the experience, whether it's in a boardroom, at a live concert, or even visiting a mine for investors. It could be anything, but I think we're at the stage now, with AI and machine learning and faster processing, where immersive experiences are the way to go. So thank you so much. My pleasure, guys.
SPEAKER_04Thanks for having me here.
SPEAKER_06It's always good to blow all of our minds. Passionate people, yeah. Thank you. And thanks for pushing the technology forward, because it's guys like you that really help get everybody buying up all kinds of toys and stuff like that.
SPEAKER_04Yeah, I'm hoping to work on a new app, so we'll have our own proprietary app, maybe a collaborative thing or some kind of gaming, puzzle-hunting, multiplayer thing. For now it's all been client work and producing content, but we want to create something of our own that we could sell and earn passive income from, almost like a video game, but more interactive. Like this one app called Brink VR: they went around the world scanning real places, Antelope Canyon, the Grand Canyon, some forest scenes, in super high-resolution photogrammetry, and then you can put it on, meet with other people, and explore the world from your living room. You can walk around and feel like you're in the space, a little more immersive than 360 videos. So that's one way to create apps; it could be a meditation app, a training app, stuff like that is something we have on the horizon.
Future Plans And Closing Reflections
SPEAKER_06That's amazing. Well, we'd love to have you on the podcast again, and maybe we'll do the podcast in VR, or virtually.
SPEAKER_04I could technically scan this room without you guys, and then you could take any guest from anywhere, pop them in here as an avatar, and they could be having a virtual podcast in this room.
SPEAKER_06That's amazing. Yeah. Well, anyway, we'll definitely have to pick your brain another time. There's just too much to unpack today. Way too much. Yeah.
unknownYeah.
SPEAKER_04Oh, yeah. These are my stereo glasses. I don't use them as much anymore, but when I want to quickly check the 3D of a video, instead of rendering it out and putting on the headset to see how the 3D looks, I can just switch my screen to anaglyph, put these on, and see in real time how it looks.
SPEAKER_03Perception.
SPEAKER_04Yeah, it's as good as checking with the stereo off, and then I can adjust it and see how it looks. It's amazing. These are actually still very handy for production. I also had a Panasonic 3D TV with active 3D, and there I could just do side by side; the glasses would put the images together for a real-time 3D check.
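The anaglyph trick those glasses rely on, merging a stereo pair so the red lens sees only the left-eye view and the cyan lens only the right-eye view, is simple to reproduce. A minimal NumPy sketch, assuming two same-sized RGB frames rather than any particular editing tool's output:

```python
import numpy as np

def anaglyph(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Merge a stereo pair into one red/cyan anaglyph frame.

    Takes the red channel from the left-eye image and the green/blue
    channels from the right-eye image, so red/cyan glasses route each
    eye its own view. Arrays are (H, W, 3) uint8 RGB.
    """
    out = right.copy()             # keep right eye's green and blue
    out[..., 0] = left[..., 0]     # overwrite red with the left eye's red
    return out

# Tiny synthetic stereo pair: left carries red, right carries green/blue.
left = np.zeros((2, 2, 3), dtype=np.uint8); left[..., 0] = 200
right = np.zeros((2, 2, 3), dtype=np.uint8); right[..., 1:] = 150
frame = anaglyph(left, right)
print(frame[0, 0].tolist())  # [200, 150, 150]
```

This is why the check works in real time: compositing is a per-pixel channel swap, far cheaper than rendering out a full stereo video and loading it into a headset.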
SPEAKER_06Very cool.
SPEAKER_04All right, thanks again, Daniel. Thank you for having me. Appreciate it, man.
SPEAKER_06Pleasure.