Welcome to Conversations in Connectivity. I'm Ryan Carlson, your host, and today we're talking about a technology that gets us a few steps closer to the holodeck from Star Trek, or the Construct from The Matrix, which, if you remember, was that virtual environment used by Morpheus and Neo to learn kung fu and to control and alter simulations: "Guns. We'll need more guns." These were virtual environments that could have a real impact on the people within the physical world as well. So before we get futuristic virtual worlds that mirror and simulate the real world, we're going to need to work out the technology that expands how we visually represent and interact with a digital twin to effect change in reality. We have tens of thousands of individual products today that have bi-directional relationships between physical devices and digital representations, including factories, airplanes, individual condition monitoring solutions, and even my garage door. The challenge we face is finding an easier way to integrate the entirety of these disparate ecosystems of devices that don't natively talk to one another. Now, initially, businesses are seeking to improve efficiency, monitor safety, or even simulate or recreate events that occurred within these ecosystems of independent systems by interacting with their digital twin. Just like I want to know in the middle of the night whether my garage door is open or not: I look at the icon on the app, and if it's closed, I can go back to sleep knowing my door is closed. Science fiction has been predicting this future state since well before the technology even existed. It's told in stories where somebody gets trapped in a virtual simulation while the rest of the crew is outside the simulation, and they're all in grave peril. Of course, it requires the hero to find the backdoor within the virtual simulation as a way to recreate a version of the starship's controls that can then affect the physical world.
We've all seen this one before. Well, maybe we haven't, but I have, in multiple iterations. We may not have neural interfaces or fully immersive holographic environments, but we do have examples of very intricate, detailed digital twins and augmented reality applications. Now, I've seen a number of command and control centers at large manufacturing facilities where it feels a bit more like mission control at NASA, filled with monitors displaying digital representations of plant status. But usually it's a 2D image with color-coded status icons, and it's helpful if you're a bird or you're really familiar with the facility. We are now starting to see the use of high-fidelity 3D images that allow digital twins to be manipulated from multiple angles, with multiple data points visually superimposed on a separate augmented reality layer. The data streams from these various sensors that don't talk to one another can still centrally report in: from the PLCs, IoT sensors, equipment status alarms, and granular condition monitoring subsystems. And that's really exciting. As we discussed in episode one about the digitization of workforce knowledge, sometimes we need to see all of the different contextual data points to really understand why something's happening. Generating these large visual 3D representations of a physical space is rapidly evolving. For years we've been able to look at photos and have 3D designers sit down and create a digital representation of trade show booths; home builders and architectural firms have been selling their ideas this way for decades now: "Ah, this is what your new home would look like." But that's not what we're talking about right now. The technology we're going to discuss is how we can digitally recreate a physical space that already exists, and create it in 3D in a matter of minutes to hours, depending on the size of the facility.
So in this interview, we've got Brittany Schramm from Matterport, a company that specializes in 3D image capture software and technology, and that created some of the highest-fidelity scans of an industrial facility that I have ever seen. They showed me some of the demos, and I was really blown away. Digital twins and augmented reality have gone from novel curiosities into something a bit more mainstream. We're finding that it is becoming increasingly necessary to combine data from physical assets into applications and IT infrastructure in order to improve operations. Service and maintenance teams armed with better contextualized data can produce business-critical insights that help companies actually stand out from everyone else, because it's not just people in a room guessing about what they think is wrong. And product teams can now monitor how equipment is being used and can confidently say, "This is how people use our equipment." It's about making decisions based on what's actually happening in the real world, and exploring the power of new technologies like machine learning and AI to quickly analyze digital simulations and arrive at those answers. And now, on to the interview.

Ryan:
Matterport, you are doing 3D scans using both a hardware and a software technology, is what I understand, right?

Brittany:
Yes, we have both sides. Matterport is a 3D data platform. We are leading and revolutionizing the digitizing of the built world. We have the largest spatial data set in the world by far, and we make it very easy for customers to capture their spaces and turn that physical asset into a digital asset that they can build upon, that they can manage, that they can collaborate in, that they can take and integrate into other solutions as well. So how do we do that? It starts with capturing that space, and we have our own hardware: our Pro 2 and Pro 3 cameras, and the Pro 3 just came out. We also have our professional services, a white-glove, turnkey capture service team that comes out and will capture the space for our customers as well. And then we're also camera agnostic, in a sense: we're compatible with Leica, Insta360, Ricoh, a whole variety of different hardware providers, so we can empower customers to capture their own spaces. We're also compatible with the phone in your pocket, Android and iOS.

Ryan:
So as long as they have the lidar in their phone. So why we're even having this conversation: yesterday at the keynote at the IoT in Oil and Gas conference, we had the VP of Amazon, or AWS, IoT up there, and there was this video, and they just mentioned Matterport. And then there was this video-game-like 3D rendering, or actually it was more like Google Earth than anything else, and it was this fly-through of a plant with all sorts of sensor readings superimposed over the top, like a big augmented reality experience. And I was blown away, because I've been part of projects with remote condition monitoring where we were limited to a stupid 2D floor plan with little status lights: red, green, yellow, whatever it might be. And the idea of being able to send someone into the field and say, that particular set of interfaces, that's where we're having that problem. And if you turn, you'll see a little modal showing that the temperature is high, and you look over to your right, and maybe it's a vibration sensor that's showing some anomalous readings. So it's not just a discrete "there's an alarm"; it helps someone have the same contextual sense as if they'd been working that job for years. So tell me, how is it that people are deploying these 3D walkthroughs and superimposing the sensor data on top? What are some examples?

Brittany:
Absolutely, and it's great meeting you here at the IoT in Oil and Gas conference. We are an AWS sponsor, so we are working very closely with the digital twin team, the IoT team, and specifically TwinMaker. Matterport has been a launch partner of TwinMaker since November of 2021, and we have a very close partnership, working very collaboratively together. Matterport makes it very easy to capture spaces, like I said, but then, leveraging our SDK and our APIs, our customers can integrate and build on Matterport. So when you're talking about that data or visualization, say they want to understand piping, or see workflows, or do predictive maintenance scenarios, they can leverage Matterport's SDK and APIs and integrate Matterport into the solutions and workflows those customers are using today. What you saw was INVISTA, who is working through their analytics, integrating sensor data and IoT data, and really empowering their Matterport digital twin in a whole new way.

Ryan:
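The integration described above, attaching live sensor readings to fixed positions inside a 3D scan and color-coding them for an overlay, can be sketched conceptually. This is a hypothetical illustration only, not Matterport's actual SDK: the `SpaceTag` class, coordinates, and thresholds are all invented for the example.

```python
from dataclasses import dataclass


@dataclass
class SpaceTag:
    """A hypothetical data tag anchored at a 3D position inside a scanned space."""
    label: str
    position: tuple  # (x, y, z) in meters, relative to the scan origin
    reading: float
    high_threshold: float

    def status(self) -> str:
        """Color-code the tag the way a plant-status overlay would."""
        return "red" if self.reading > self.high_threshold else "green"


# Two sensors superimposed on the digital twin of a plant floor.
tags = [
    SpaceTag("inlet temperature (C)", (4.2, 1.5, 0.8), reading=92.0, high_threshold=85.0),
    SpaceTag("pump vibration (mm/s)", (6.0, 0.9, 2.1), reading=2.3, high_threshold=4.5),
]

# Surface only the anomalous readings, with their locations in the space,
# so a field worker can be directed to the exact spot in the walkthrough.
alerts = [(t.label, t.position) for t in tags if t.status() == "red"]
print(alerts)  # only the temperature tag exceeds its threshold
```

In a real deployment, the anchor positions would come from the scan itself and the readings from the plant's PLCs or IoT sensors; the point here is only the shape of the idea: position plus reading plus threshold yields a located, color-coded alert.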
So where might I have seen some of Matterport's previous work? I've heard that real estate and some other industry verticals outside of oil and gas are where some of that early work was applied.

Brittany:
So residential real estate has been an incredible market for Matterport and a place we provide value. In commercial real estate we have several use cases, and in manufacturing. We have use cases across insurance: how are folks insuring their homes, and how are we working with the insurance industry as well? That's been a vertical for us. Retail is another vertical, as well as travel and hospitality. Think of the large hotel brands out there, think of the large retailers: how are they leveraging the phone in their pocket to merchandise, to collaborate, to report back to their corporate offices? How do they communicate merchandising standards or brand standards to their stores, and how are they then able to empower their stores to more efficiently meet those standards, share back, communicate back, and keep that feedback loop going with the corporate office as well as with the field teams?

Ryan:
So, the use of this 3D spatial stuff. I think every one of us has seen how it's been applied in making movies, or how it's used in video game capture; there's all of this use of 3D. But I'm hearing that this isn't creating something that didn't exist; it's going in and capturing an actual space. So would I be taking this hardware, not the Pro 2 but the Pro 3, because it's got one more P, and going through a space while it just captures, like, 360 data? What does this process look like? Because I saw in the demo how that video was applied, but I'm curious how the process works.

Brittany:
Absolutely, that's a great question. Like I said, it's through a hardware device, through a capture services team, or through the phone in your pocket. Once you've captured that physical space, Matterport's Cortex AI, which is our proprietary technology, our neural network vision pipeline, then creates the 3D data platform. It takes all of the different layers and stitches them together. The different layers being: it's not just photo imagery, it's not just the mesh, it's not just the point cloud. It's how we uniquely combine and stitch these together to create that 3D data platform. And that is really the power of Matterport, and how it is leveraged and used: being able to have dimensional accuracy in a space, to understand how tall that ceiling is, how wide that furniture is, that couch. Then you can get into analytics scenarios, to start to understand spaces a bit more.

Ryan:
So I keep
thinking about how this tech could or would be applied. I think about some of these comic book movies where you've got the billionaire guy with all these holograms, overlaying this idea of a virtual reality over a real reality. So, in a more practical sense: if we've got a factory with a whole bunch of stuff, and there was an accident or an explosion, would I be able to import the scene and then do a comparative analysis of before and after? And then, if I'm taking sensing data, you get to almost recreate the scene. I mean, is this just a figment of my mind, or?

Brittany:
We do have some features within our platform that we have recently come out with, one of them being layers. Like you're talking about, looking at more of a time-travel concept: what did this space look like the day before, or a year before, especially when you're doing more frequent scanning and capturing and maintenance scanning. And to your point, looking at a before-and-after scenario as well. That's really common in insurance, insuring spaces: I've captured my home, and then there was a fire, so being able to provide that "before" to insurance companies as well. That's a very big use case. And then I think what you're also talking about is this: we're really great at capturing physical, real space and creating that 3D data platform, but then it's really in the power of the customer. Where do they want to take that, and what do they want to build into it or on it? Do they want to layer in virtual reality? Do they want to stage a room, for a real estate example? They can do virtual staging using our SDK, using our platform, and they can layer the real physical space with virtual assets to achieve other scenarios. Maybe you want to do predictive maintenance in a facility, for industrial IoT use cases: how are you taking that real physical space, running it through the solutions the customer may have, where their data is stored, and then how are they able to analyze that within that real space? So it's combining the real with the future or the historical.

Ryan:
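The time-travel and before-and-after ideas above can also be sketched conceptually. This is a hypothetical illustration of comparing two dated scans of the same space, not a real Matterport feature or API; the snapshot structure and measurement names are invented for the example.

```python
# Hypothetical sketch: diff two dated "layers" (scans) of the same space.
# Each snapshot maps a named element of the space to a measured value.
before = {"warehouse ceiling height (m)": 6.1, "aisle width (m)": 2.4, "rack count": 12}
after = {"warehouse ceiling height (m)": 6.1, "aisle width (m)": 1.8, "rack count": 10}


def diff_layers(old: dict, new: dict) -> dict:
    """Report every element whose measurement changed between two scans."""
    return {key: (old[key], new[key])
            for key in old
            if key in new and old[key] != new[key]}


changes = diff_layers(before, after)
print(changes)  # aisle width and rack count changed; ceiling height did not
```

An insurer or safety investigator would run this kind of comparison over far richer data (point clouds, imagery), but the workflow is the same: capture frequently, keep the dated layers, and diff them when something happens.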
This is the digital twin. Like, it is the closest fidelity to a digital twin that I've ever seen, and that's what really drew me to it: oh, Matterport, I need to learn more about this. Because digital twins have been such a pale comparison; they've just been the data representation. Think of a world in which we are losing the people in the workforce who hold all the institutional knowledge: a lot of factory settings, manufacturing, the rail industry. These are people who've been in the industry, whose families have been in the industry, and no one's replacing them, right? And so what they see is very different from a group looking at all these discrete data points. This jibes so well with what David Armstrong was saying: that we can't just look at a single data point, because it might be what someone did further down a production line, a change they made that is skewing the programmatic results. Why did this thing happen? It's more complex than that. And to layer on this augmented reality or this digital twin, I could see this being one of those opportunities to take the person who's a couple years out from retirement and sit them down. Do you use virtual reality goggles or anything to look at this stuff?

Brittany:
Yeah, people definitely use it all the time. I think that's also a use case people leverage the goggles for: training and enablement. So how are they training their employees before they actually get to a facility or a specific location, so that they can be efficient when they're there and understand all the hazards and the machinery? That's definitely one.

Ryan:
For example, that fidelity gap, I think, is where we lose a lot in translation.

Brittany:
Exactly. And it's through our Cortex AI, leveraging lidar, leveraging our Pro 3 camera, that we're able to have such photorealism, and to be the leader in having a 3D data platform that is photorealistic but also lives in the cloud. You can run it on your computer and your computer's not going to crash, despite the immense amount of data that's in there. So it's very special.

Ryan:
So how long does it take for someone to say, I would like to build a prototype? I've got this back mechanical room, or I've got a small bit of factory floor. Is this hours, days, weeks, months? From someone saying "let's do it" and bringing people on site, to having some sort of model they can look at and hand off to maybe a software provider to tag with data points: what does that typically look like?

Brittany:
And the customer can tag it with data points directly, but absolutely, there are other partners we work with if they want to integrate and build and make something more robust. It depends on the size of the space, but it's very efficient. I can take a Pro 3 camera and do a 360 of a room in a matter of 30 seconds, if that. I captured my entire two-bedroom, two-bath house in, I think, a minute and 30 seconds to two minutes. So it's very efficient. Uploading it to the cloud, obviously, if you have a very large facility, thousands of square feet, can take longer, and that's really where the professional capture services team can come in and help capture it with ease, accuracy, and efficiency.

Ryan:
But what I'm hearing is that this isn't a lengthy process to create this digital twin. I didn't know, right? Like, do we have to set it up, and there's an exposure, and then it spins around, and then...

Brittany:
And you can do it with your phone. We have an app in the iOS store, and you can capture on your iPhone or on Android. So you can download the app right now and capture your room or your house in less than a minute.

Ryan:
All right, I think we all have our homework. We're going to download the Matterport app, take our favorite room, and turn it into a digital twin. I'm looking forward to playing with it. Thank you so much for sharing this time and sharing your expertise about Matterport.

Brittany:
Thank you so much. It was great chatting with you. Thanks.

Ryan:
This episode of Conversations in Connectivity is brought to you by Soracom, a global connectivity service provider that believes the fastest way to cost savings and scale is when customers are in full control of their connectivity operations. Experience your own self-service, pay-as-you-go global connectivity plan, without a contract, today at soracom.io. Signing up for an operator account takes less than a minute. You can even test with a virtual SIM and experience things like VPN or VPC peering with just a couple of clicks. It's pretty amazing: what normally might take three months to a year, you can have done on your own, self-service, through your own operator console, in minutes. So if you're using cellular, and maybe even satellite, you can now have a platform that works for you, and you can do it on your own. But if you've got questions, go ahead and reach out: check out soracom.io, go to Contact Us, and let them know that you learned about this on the podcast.