Video: Matterport ShopTalk Webinar #17: iPhone 12 Pro LiDAR support; Capture on Android and the latest 360 cameras: the Ricoh Theta SC2 and Insta360 ONE X2. Video courtesy of Matterport YouTube Channel | Program first aired live on 10 March 2021

Transcript: Matterport Webinar iPhone 12 Pro LiDAR; Ricoh Theta SC2, Insta360 ONE X2

Speakers:

1. Amir Frank, Matterport Marketing Content Manager
2. Kirk Stromberg, Matterport Senior Director, Product Management

Amir:
Welcome everybody. Thanks for joining us. I see participants coming in. That's great. We'll give you a couple of seconds there with the music to get your audio worked out and get settled in. Here we are. Let's go ahead and just get it started. Again, thank you so much for joining us today. Today we are talking all about these new releases that we've got coming, and who better than Kirk Stromberg from Product Management to speak about that. All right, let's get going here. We are going to cover a lot of stuff today and then open it up at the end for Q&A, so please stay tuned for that. The questions panel is at the bottom of your screen; you can open that up and type in your questions at any time so we can address them. I'll try to man the slides and the questions simultaneously as Kirk talks about iPhone 12 Pro LiDAR support, Capture on Android, supported 360 cameras, and one of my favorites, backing up scan jobs. Very, very excited to learn about that. It's going to be great. With that, Kirk, thanks again for joining us once again at ShopTalk.

Kirk:
Thanks, Amir. Glad to be here, glad to see everybody virtually. Yes, I know that topic is one of your favorites, Amir. You've done some awesome videos and webinars and articles on that topic. Let's jump right into it. If we go to the next slide: most folks are probably aware, we released an update to iOS Capture on the App Store last month [February 2021] that supports the LiDAR sensor in the iPhone 12 Pro and the Pro Max, as well as the iPad Pro 2020, which was released in the earlier part of last year [2020]. These are devices that have active depth sensors on them.

Kirk:
For users that are familiar with Matterport for iPhone, using your iPhone or your iPad to scan a space, the difference here is that instead of us synthesizing the depth data for the complete view, we now have an active depth sensor on those devices. That means you get a more accurate representation of the space, and that tends to lead to a lot better performance in terms of aligning scan to scan: you'll see fewer errors and the fidelity of the model is better. Capture 4.1.1 is out on the iOS App Store right now. We'll go through a little screenshot in the next part. All the other devices that don't have LiDAR continue to operate just as they worked before with Matterport for iPhone: Cortex AI synthesizes the depth information.

Kirk:
Just as normal, we're going to continue to do beta releases. If anybody wants to stay on or participate in a beta release of Capture, go to matterport.com/beta. For iOS, we use an application called TestFlight, which is essentially Apple's beta app store. Basically, you'll install TestFlight, then follow a link from matterport.com/beta, and that'll get you into the beta automatically. You'll be updating your Capture app via TestFlight instead of the App Store. For folks that haven't seen it before, what you'll see is very familiar if you've done Matterport for iPhone before.

Kirk:
Basically, you're aiming at a dot, you're rotating, trying to rotate around the camera itself to minimize parallax, and you hold on the dot, complete it, and move to the next dot in the rotation. In the second screenshot you'll see the little arrows: the field of view of the LiDAR sensor is narrower than the ultra-wide camera's. We're using the ultra-wide camera to get the widest imagery and the HDR we get from the iPhone, while the LiDAR field of view is smaller. You'll see that dot pattern in the second panel; that shows you where the LiDAR sensor is actually covering. Everything outside of that, Cortex AI will take care of as you continue rotating around.

Kirk:
There's one thing that's a little bit different from regular Matterport for iPhone, in that we want to cover that last dot one more time. That helps us synchronize all the data between the imagery and the depth data. Then you proceed as normal: you go to your next scan location, you continue scanning through the space, and basically it's business as usual. One thing I will note: it is possible to tilt and rotate the device to try to get better LiDAR coverage, but there is a little bit of a risk there, in that we want to minimize parallax artifacts, and that's where you really want to keep the iPhone in exactly the same spot and rotate around that center.

Kirk:
In general, the coverage of the LiDAR sensor in the mid range is sufficient to get a really good model. You can do that, but you do run some risk of imaging artifacts, because we have to stitch all of this together. In general, you're looking for the best imagery and the best depth data that you can get. I'm looking at a couple of Q&A things; I'm going to pop in and answer one or two of them as they're coming up here. The question is, have we restated the expected precision for this scanning method? Not yet, because we need to go through and see a lot more models. Obviously, the accuracy for measurement depends on what you're measuring, where, and how many scans are in the sequence of what you're measuring.

Kirk:
We're working on a revamp of that in terms of guidance, in terms of what you can expect accuracy-wise. It also depends on the mechanism you're measuring with: in Workshop, in Showcase, or with some of the other assets. I will plug a lot of Amir's work here on Matterport Academy. That site has a whole bunch of awesome videos that explain how to use our products. There are details in terms of getting started and best practices for scanning with Matterport for iPhone. Also note that within Capture, in the Help section, there's a little video that you'll see on first use that reminds you of this. One of the key things to getting a good model with Matterport for iPhone is somewhat counter-intuitive: when any of us take a pano with a phone, we typically rotate around ourselves.

Kirk:
One of the key things with Matterport for iPhone is, you really want to try to rotate around the phone. That's a little bit unnatural. So if you have a monopod, or if you want to get a lightweight collapsible monopod, it's a really good reminder and it helps you learn how to rotate best and minimize stitching artifacts as you rotate through the space. Especially if you're doing a larger job, the monopod definitely helps, just because you're going through a lot of space and doing lots of rotations. As I noted before, in general, if you have an iPhone 12 Pro or Pro Max or an iPad Pro 2020 and you are LiDAR-capable, you'll get higher fidelity and a better success rate than if you're doing regular Matterport for iPhone without LiDAR. Without LiDAR, it will be faster to rotate because we're not gathering two sets of data at the same time. But in general, when quality matters, you want to use LiDAR if you can.

Kirk:
One question that had come up before is, are the schematic floor plans and the MatterPak available when using LiDAR? Unfortunately, the answer is not yet. We're discussing policy changes. We know that folks want to be able to use that data, especially for 3D modeling and other purposes. Stand by for more updates on that. One question here is, can I get a LiDAR scan that is more like a complete scan? That's a really good question. Thank you, Matt, for that. With Matterport for iPhone, we have two modes, simple and complete.

Kirk:
Simple is one rotation around. Complete is two rotations around. In simple, you're pointing the phone directly, well, mostly horizontal, slightly tilted down, and you capture that rotation. In complete, you do two rotations, one up and one down, and they overlap in the middle. Then you end up with much more coverage of the space: you'll see more of the ceiling, more of the floor. As you move scan locations, in general you're going to be covering those areas from a different scan location. But the question that Matt had is, am I going to be able to get a complete scan with LiDAR? Yes, we are looking at adjusting those modes and making them work for LiDAR. You should be able to get a complete scan as well.

Kirk:
The reason folks want to do this is, I just get more coverage; I get a more complete scan of the space. It is longer, it does take more time, but if that's what you need to do, then we want to make that available for you. A question from Kevin: are you able to transition to exterior scanning with LiDAR in the sun, where the Pro2 camera cannot? Can you use the Pro2 indoors and the iPhone outdoors? The answer is yes, and yes. The frequency that the iOS LiDAR devices use is not as susceptible to sunlight as the near-infrared in the Pro2 cameras. Apple advertises a range of about five meters. In our tests outdoors in full sun, you definitely can scan and get depth data in instances where you could not with the Pro2.

Kirk:
Obviously, if you have a non-LiDAR device, Cortex will synthesize that data for you, so you have multiple options in terms of scanning in full sun. Just like we do with the other cameras, we support mixed camera modes. You could be scanning with one camera in a certain area, then switch over to a different camera in the same project, the same job. They should all align and work with each other. Two questions here. I know I'm reading some questions; I want to make sure we're hitting the iPhone LiDAR questions real quick here. We'll catch a bunch of them at the end, but one I want to touch on here, from Peter, is: how do you handle a room with mirrors when you're standing there with an iPhone?

Kirk:
Yeah, this is a challenge, because you are directly involved in the shot there. If you are directly in front of a mirror and there's no way to get slightly off to the side, then one option is to scan through the space and then hide that location afterwards, so that you're getting the data, but you don't have a spot where the visitor in the tour would actually land and see you. Obviously, in some rooms, if there's a large series of mirrors, or a lot of reflective surfaces like windows when it's dark outside, or a mirrored wall... right now we don't have the magic to remove you from the shot.

Kirk:
The last question here on LiDAR is from Don: does the app automatically use LiDAR if available? If you're using a LiDAR device, when you go into your scan modes, the default is LiDAR on. You can choose to do non-LiDAR, but it uses LiDAR by default. If you choose not to, then you'll go a little bit faster, but you won't have the depth data there. We will catch the rest of the Q&A questions around LiDAR at the end. We're going to go through the rest of our sections real quick, then we'll wrap up with some more Q&A.

Kirk:
All right. Switching gears a little bit, let's talk Capture on Android. As folks may or may not know, we've had Capture on Android in open beta for a while now; Android Capture has been available in the Play Store for a while. Betas on Android operate a little bit differently than on the iOS App Store. There's no TestFlight equivalent; there's just an open pre-release channel through the Play Store. If you search for Matterport on the Play Store now, you'll see it there in beta. Version 1.0 right now is focused on connected cameras. One of the most requested features, of course, is: when will I be able to scan with just my Android phone or my Android tablet? That's coming in our next major version, as Matterport for Android.

Kirk:
Right now we're working with all of the Pro cameras (the Pro2, the Pro2 Lite, the original Pro) and the 360 cameras; we'll talk about all six 360 cameras that we support there. The BLK360 is not yet supported; our first priority for the next release is Matterport for Android, the scan-with-your-device smartphone mode. In terms of what you need on an Android device to use Capture with your connected cameras: you need to be on Android 8 or above, and note that Google is starting to seed versions of Android 12, the next upcoming major release. We haven't tested that yet. If you're on the bleeding edge and you've got one of those developer previews, there's a pretty good chance that it's not going to work right, or there's going to be some issue.

Kirk:
We haven't fully tested that, so stand by. We'll wait until that's getting into a much more public beta realm, and we'll make sure we get explicit support for Android 12. Now here's the key thing across Capture and connected cameras, and even with smartphone capturing on iOS: a subtle thing in terms of performance is the amount of RAM in your device. What we're doing here is computationally intensive. With larger spaces, it tends to eat up a bunch of RAM keeping track of all the different scan locations and the imagery and the surfaces and artifacts there. Right now we've restricted this to devices with three gigabytes of RAM or more. That covers a lot of devices out there in the Android world right now, but there are a tremendous number of models.

Kirk:
Obviously, some of the more affordable models are down at the lower end of that range. If for some reason you're going through the Play Store and it says that your device is not supported, that is one of the key reasons why; we see people asking, "Why isn't my device supported?" and that's one of the key things. We want to make sure that the performance that you're seeing is good. When you get into lower-RAM devices, things start to get really sluggish, and you run the danger of the operating system killing Capture, and we don't want that to happen. Number two, which is a little bit more subtle, is that we support devices that are approved and certified by Google.

Kirk:
Some of those words may or may not be familiar. But in general: an Amazon Fire tablet, say, uses Android, but it's a fork of Android. It's a different version; it's not certified by Google. Right now, we don't run on the Amazon Fire tablets. Similarly, with very popular devices like the Huawei series: up until last year, we worked on those devices because they had Google services on board and they were certified by Google. But given the situation with the US government and Huawei right now, more recent devices from Huawei, like the P40 Pro, don't have Google services on board. Right now Capture doesn't work on something like the Huawei P40 Pro.

Kirk:
Again, that would be a reason why you might go to the Play Store and not see it available for your device. The one that's the most subtle and the weirdest to figure out as a user, if you go in and you can't see Capture for your device, is that we work on 64-bit architecture devices. This means the most modern CPUs and GPUs. This is not advertised well in the Play Store; there are some sites where you can go and find the specs. But if for some reason you think your device should be supported and you're not seeing it, or you're having an issue, then you can write to us about the Capture beta via matterport.com/beta, our beta site, and we'll try to investigate that for you. But in general, you should be able to see it in the Play Store and go from there.

Kirk:
Obviously, we're in the Play Store; we're not in other Android app stores. There are manufacturer app stores, and there are a bunch of different app stores in China. It's a little bit more of a complicated world versus the iOS realm. Right now, we're not in the major ones for users that are in mainland China. Obviously, sometimes people will VPN out to get access to stuff like that, but if you're in China proper and not VPNing out, you're not going to have access to that. One exciting thing with Android Capture is that we're localizing the application in a wider set of languages to be more friendly to users in other countries. We're expanding to 12 total languages within the app. This is part of our broader effort at Matterport to be more global, more attuned to local conditions and native languages. You'll see these languages coming to iOS Capture soon as well. Generally, we want to make sure that your capturing experience, and your users' or customers' viewing experiences, are as native as they possibly can be.

Kirk:
There are a couple of differences between Android Capture and iOS Capture. They're not huge, but one thing that is quite convenient, and this will be coming to iOS Capture as well, is that the Cortex download is already included in the application when you install it from the Play Store. For users that are familiar with iOS: if you use a 360 camera, or you're converting a 360 view to 3D with a Pro camera, or you've used Matterport for iPhone, you know that when you go to do that the very first time, we have an extra download, where we download the Cortex AI engine. This is the artificial intelligence that helps us convert plain visual imagery into 3D depth data. It's not photogrammetry, where you use a whole bunch of different photos to do that; it's our trained AI, based on the data that we've seen in prior models.

Kirk:
This is more convenient because you can just start scanning with a 360 camera right off the bat. You no longer need an account to try it with a 360 camera. This'll come to iOS as well, because folks want to try smartphone capture with Matterport for iPhone without making an account first. We're relaxing that requirement and bundling it in directly. Luckily, both the Play Store and the iOS App Store are smart enough to recognize that if you are a user on a prior version, you already have the Cortex engine installed, and if it is the latest version of the engine, they won't install it again for you. So there are no extra downloads, which is really nice. Do note, on the very first launch of Capture, we do need to phone home to get a security certificate for that engine.

Kirk:
Most people are obviously online when they're installing the app, and most of them launch it right afterwards, so there shouldn't be an issue. But if you happen to see that, that's why. You'll also notice that when you're scanning and moving from scan location to scan location, if there's an error, you'll see a little more detail in the error messages on your display. We're doing this so we're reporting more specific, more detailed messages, both in our analytics and also for you: if you have to talk to customer support or you're reporting something with a screenshot, the screenshot will have a little more detail than just "there was a camera error" or something like that. One thing that's a little bit different between the two operating systems is that we have a little bit more flexibility on Android to do certain things.

Kirk:
For instance, when you're in your capture jobs, you can tap on connecting to a camera and we'll launch the camera selection directly within Capture, let you scroll around, find the right camera that you want, and connect to it without ever leaving Capture. iOS has security limitations on what developers can call, so it's a little bit more awkward there, where you have to switch over to the [inaudible 00:19:15] panel. We're going to talk about exporting and importing scan jobs in a little bit here. Android has an early beta version of importing scan jobs that you have archived or exported previously; we'll talk about that in a little more detail. One possible [inaudible 00:19:34] on Android that we've seen from the beta is that more modern versions of Android are a little bit more aggressive about trying to find an internet connection via Wi-Fi, which is totally understandable.

Kirk:
But obviously, in our world, we are connecting via Wi-Fi to a connected camera, and those cameras don't have an internet connection on them. So sometimes, when you connect to that camera, you will see a pop-up and Android will ask, "There's no internet connection. Do you want to maintain this connection?" If you ignore that, or if you say no, then Android is going to drop it and go searching for a better connection that has internet. If that happens, you won't see your connected camera, so just watch for that. It's usually just the first time; after that, it will remember that you like this camera and you're okay with it not having an internet connection. But that's just a common thing that trips folks up.

Kirk:
Steve asks, "Will Android Capture work on the new trip tech Pro tablet?" I'm not familiar with that tablet. This is one of the benefits and the challenges of the Android world: there are just so many models out there. I'd say for Steve, the recommendation is to go to the Play Store and search for Matterport Capture. If you see Capture, then in theory it has passed all those filters: is it running the right operating system version or above, is it the right architecture, is it certified, and so forth. If you don't see it there, then it probably is not passing one of those checks. And obviously, if it is passing, go ahead and download it and give it a try.

Kirk:
We're in open beta now. We expect to be releasing the production version, 4.1.0, hopefully next week. We're going to continue the same pattern that we use with iOS Capture: we'll have production releases and we'll have beta releases. We'll continue the pre-release beta channel, and you can stay on that channel and continue to get updates before they migrate over to production. In general, the cycle, for folks that haven't experienced this before, is that we almost always have an open beta period on any new release, because the diversity of users and cameras and things that people are doing out there is very, very rich and complex.

Kirk:
We want to make sure that there's nothing going wrong. The other key thing is we want to get feedback, too. Especially with new features, we want to get your thoughts: do you like the way that we're doing it? Do you not like it? Do you have other suggestions, other needs? We're really trying to cultivate more of an open communication, an open discussion, for these new features as they come out, which is why you'll start to see more features with a beta tag within the app itself. We'll talk about one of those in just a sec. Our lineup is expanding, which is awesome and exciting. Let's just walk our way down here to make sure we're all set in terms of what is supported by what.

Kirk:
For Android and iOS, obviously, we support all the Matterport Pro cameras. On iOS we support the BLK360, which is a LiDAR sensor that is really used in higher-precision, larger scenarios. Right now, Android 1.0 does not support the BLK360, and we don't have timing on that yet. Like I mentioned before, our next priority is making sure we can get scanning with your Android device directly in place; the BLK is coming up behind that. If we look at the 360 camera lineup here, we have three cameras from Ricoh and three from Insta360. As we walk through this: on Android, the current open beta supports all of them. On iOS, the next release, 4.1.2, which is in TestFlight right now and going out to the App Store next week, supports all six of them.

Kirk:
The key missing item on iOS is the [inaudible 00:23:31], and the release that we're going out with next week adds support for it. You'll have essentially all six supported on both platforms. The Insta360 ONE R is a modular system; there are three different lens modules that you can have in that camera. We support the 360 module. We don't support the 4K module or the 1-inch sensor module, which are more traditional action cameras; they're not 360 cameras, so it doesn't make sense for Matterport to be doing anything other than 360 there. The ONE X2 is the latest slim, pocketable form factor from Insta360. It's got a nice display on there that does a preview of your shot.

Kirk:
The Insta360 ONE X is also still available, but in more limited quantities; the ONE X2 is obviously the successor to the ONE X. All three of those are supported. The Ricoh Theta Z1 is the highest tier, the best image quality across the 360 cameras, the one that most folks use in professional environments. The Ricoh Theta V has been around for quite some time, and it's been the workhorse of the 360 cameras. It is increasingly out of stock; obviously, we continue to support these cameras even if they go end-of-life from the manufacturer. The Ricoh Theta SC2 is a more affordable version: same image sensor, a little bit slower. It's got a little display on it and it comes in four colors. It's the most affordable of the group in terms of the lineup here.

Kirk:
Beyond those, there are a lot of other 360 cameras out there. The way this works with Matterport is we try to use what's called the Open Spherical Camera (OSC) spec, a spec that was spearheaded by Google back in the earlier Google 3D days. Cameras that speak that language, we can talk to. What we've found, though, is that we generally need to do custom work with every single 360 camera; unfortunately, it's not sufficient that a camera just speaks the OSC spec. We would love that to be the case. There are other cameras that do support that protocol, but we've found that we have to do additional custom integration work with them. We're always looking out for the next good set of 360 cameras to work with.
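[Editor's note: the OSC spec Kirk mentions is an HTTP/JSON protocol, so talking to a spec-compliant camera looks roughly like the sketch below. This is not Matterport code, just a minimal illustration. The endpoint paths and command names follow the published OSC spec; the camera IP address is a common default and any given camera model may differ.]

```python
import json
import urllib.request

# Many OSC cameras host their API at this address on their own Wi-Fi
# network (an assumption -- check the manufacturer's documentation).
OSC_BASE = "http://192.168.1.1"

def build_osc_command(name, parameters=None):
    """Build the JSON body for POST /osc/commands/execute per the OSC spec."""
    body = {"name": name}
    if parameters:
        body["parameters"] = parameters
    return json.dumps(body).encode("utf-8")

def execute(name, parameters=None):
    """Send a command to the camera (requires being joined to its Wi-Fi)."""
    req = urllib.request.Request(
        OSC_BASE + "/osc/commands/execute",
        data=build_osc_command(name, parameters),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.loads(resp.read())

# For example, execute("camera.takePicture") triggers a still capture on
# spec-compliant cameras, and GET /osc/info reports model capabilities.
```

As Kirk notes, speaking this protocol is necessary but not sufficient: apps like Capture still end up doing per-camera integration work on top of it.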

Kirk:
Obviously, in terms of scanning with your handheld device, with your phone or your tablet: we've got Matterport for iPhone on iOS, and coming in Q2 will be Android support for smartphone capture, Matterport for Android. As soon as we have something that's ready for a public beta, we'll make sure everybody knows about it, and it'll be available in the same pattern, the pre-release beta channel on the Play Store. While we're here, I see a couple of questions coming in. Let's see: any plans to support the GoPro MAX? The GoPro is a good example: we would love to support the GoPro devices. Unfortunately, they dropped their developer program back in 2017. Garmin is another one. The Garmin VIRB 360 is a really good, powerful camera, but they don't support the Open Spherical Camera spec, and the GoPro folks don't have an SDK either.

Kirk:
Without a software development kit, there's no way for us to actually integrate with it. We would literally have to try to hack it, and that's not a really sustainable plan. One of the things we've been looking at for a while, outside of direct connectivity with Capture, is being able to import 360s taken from other cameras. As development on that goes forward, if we can get to the point where we can get some public beta experiments out, we'll let everybody know. Similar question from Tom: for the Ricoh Z1, do we know anything about future versions? I don't. Frankly, if I did, I probably wouldn't be allowed to say anything, but I flat out don't. In general, the same deal is going to apply: we'll continue supporting cameras as we go forward. Even if they're discontinued by the manufacturer, we can still support them for some time.

Kirk:
Let's move on to our last topic here: backups and exports and imports of scan jobs. If you haven't seen them, Amir has done some awesome videos and articles on how to do this with third-party software, like [inaudible 00:28:04], cabling your device to a computer and doing full device backups to get all of that stuff off. Or, if you're on the Mac operating system, doing what is now the equivalent of an iTunes sync to back the complete device up. Obviously, the reasons you would do this are safety, just in case your device gets stolen. If you've uploaded stuff to the cloud, then we have that. But if something happens, the device is broken, or there's some glitch before you've been able to do that, you don't want to have a loss.

Kirk:
Other folks have been doing this across teams, for managing and moving scans across systems and devices. One of the downsides, though, is that our data structure inside Capture was never really designed to be user-friendly. The job folders, if you've ever looked at them, are really long alphanumeric strings that, in general, you don't want to be modifying at all, because Capture expects them to be exactly what they are. One of the exciting things is that both versions of Capture now, 4.1.1 and 4.1.2 on iOS and Android, include the ability to export jobs directly from Capture. One of the key reasons why this is a good thing is if you're looking to say, "Hey, I've got an older scan job on this device and I want to move it off. I want to make space on my device, but I want to make sure that I can bring it back if I need to." Or, "I need to transfer it somewhere else for safekeeping."

Kirk:
One of the reasons why it's better to do an individual scan job backup within Capture, using those features, is a subtlety that we're seeing folks are not aware of. If I have job number one, and I duplicate it so there's now job number two, and I continue doing some work in there, and then I take the job number two folder from inside Capture's data structures using these third-party tools and copy it off, it will not include some of the data that's involved in that job, because on iOS we were not duplicating that data into the new job; we're trying to save space. One thing we're seeing in our support tickets is folks that have done these manual backups of just a single job or a couple of jobs, and then they try to restore them somewhere else and they're missing data. This is why: it's a subtlety that we weren't exposing to everybody.

Kirk:
This is generally not an issue with a whole-device backup, because with the whole-device backup you're catching all the jobs. Basically, our best practice recommendation going forward is: if you want to export or archive and bring a specific scan job or scan jobs off of Capture, use the facility within the application. That will make sure that if I go off and say, "Okay, I'm going to archive or export a particular job," it will go look for all the data that it needs and collect the whole of that together. It's going to make a zip file for you, and it's going to give you the friendly name of the job; you're not going to see something like F4E9EB-whatever. Inside the zip file, you will see those folders. It's a bad idea to edit those folders, because if you do so and you bring it back, then Capture is most likely not going to understand what's going on; those are not meant to be edited.
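[Editor's note: to make the "don't touch those folders" point concrete, here is a small sketch of inspecting and restoring an exported job archive without renaming anything inside it. This is not Matterport code; the function names and the sample folder IDs are illustrative, and the actual archive layout may differ.]

```python
import zipfile
from pathlib import Path

def list_job_folders(zip_source):
    """Return the top-level folder names inside an exported job archive.

    Inspect only: the internal folder names (long alphanumeric IDs) must
    not be renamed or edited, or Capture may not recognize the job when
    it is imported back.
    """
    with zipfile.ZipFile(zip_source) as zf:
        return sorted({name.split("/")[0] for name in zf.namelist() if "/" in name})

def extract_job(zip_source, dest):
    """Unpack an archive intact, preserving the original folder names."""
    dest = Path(dest)
    dest.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_source) as zf:
        zf.extractall(dest)  # no renaming: Capture expects the exact IDs
    return dest
```

The friendly zip name is for you; everything inside it belongs to Capture.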

Kirk:
For full device backups, Amir's videos up on Matterport Academy are awesome. Most of the time, the most efficient thing to do is use those third-party tools to say, "Okay, great, I'm going to take the entire jobs folder and pull it off," or, "I'm doing an entire device backup." Here is an example of what you'll see in the current beta feature for 4.1.2. If you go into the Help and Support menu within Capture, down at the bottom below the repair option, you're going to see Export Jobs. When you tap on this, you'll be given a list of the jobs that you currently have in Capture on that device.

Kirk:
Let's say I'm going to go export this [inaudible 00:32:07] pool shot, and I'll select it. Then Capture is going to go off and crunch and make sure it's got all the data. It's going to slap that together into a zip archive with one single folder. Then it's going to prompt you with the standard iOS share screen. You can do a couple of things here. You could save that locally to a different folder on that device, depending on the applications that you have installed. In this particular instance, I have Google Drive and Dropbox installed, and so I can go ahead and choose to upload this to my Google Drive folder. Obviously these jobs tend to be fairly big, but in this case I want to have it up in the cloud in my own personal space, and so I'll do that here.
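The export step, collecting one job folder into a single friendly-named zip, might be sketched like this. The folder layout, the job name, and the friendly name are all invented for the demo:

```python
# Minimal sketch of an export: pack one job folder into a zip whose
# filename is the job's friendly name (names invented for the demo).
import pathlib
import tempfile
import zipfile

def export_job(job_dir: pathlib.Path, out_dir: pathlib.Path,
               friendly_name: str) -> pathlib.Path:
    """Collect a job folder into a single friendly-named zip archive."""
    zip_path = out_dir / f"{friendly_name}.zip"
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for item in sorted(job_dir.rglob("*")):
            # Keep the job folder itself as the top-level entry.
            zf.write(item, item.relative_to(job_dir.parent))
    return zip_path

tmp = pathlib.Path(tempfile.mkdtemp())
job = tmp / "F4E9EB"
job.mkdir()
(job / "scan.dat").write_bytes(b"depth data")

archive = export_job(job, tmp, "Pool Shoot")
with zipfile.ZipFile(archive) as zf:
    names = zf.namelist()
print(archive.name, names)
```

Note how the reader-facing name ("Pool Shoot.zip") wraps the internal folder name (F4E9EB), matching what Kirk describes seeing inside the zip.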

Kirk:
On iOS and on Android, the modern versions both have the equivalent of AirDrop; on Android it's called Nearby Share. You can select it and essentially send the archive over to another device. This is super convenient if you're just saying, "Look, I'm going to get this onto my computer, I just want to get it to a different spot." I can easily hit AirDrop, [inaudible 00:33:15] computer, and it's all set. Then if you're doing this to make space, after you've archived the job and sent it off somewhere else, you go back into Capture and say, "Okay, this old model from 2019, within Capture I'm going to delete that model." That will also make sure that the deletion of that model is cleaned up properly. Now you'll have an archive of that model in a file off somewhere else, and you've cleaned up your device locally.

Kirk:
On iOS, we don't have the import path back within Capture yet; that's coming soon. The methods that Amir outlined in the Matterport Academy videos will work: you unzip that file, it's got the job folders, you copy those job folders back over to the jobs directory on your device, you reboot, and then you should be good to go. But in the next version of Capture, you shouldn't even need to do that; it'll have the import facility, and I'll show you that on Android on the next slide. On Android, similarly, you go into the help and support menu. The export job flow is very similar, almost identical, with slight differences due to just Androidness. Before we get into the flow, I'll call out that you see Create Backup and Restore From Backup here on Android.
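The manual restore path, unzip and copy the job folders back, could look roughly like this. All paths are invented for the demo; on a real device the jobs directory location differs:

```python
# Sketch of the manual restore: unpack an exported archive and move its
# job folders back into the jobs directory (paths invented for the demo).
import pathlib
import shutil
import tempfile
import zipfile

def import_job(archive: pathlib.Path, jobs_dir: pathlib.Path) -> None:
    """Unpack an exported job archive back into the jobs folder."""
    with tempfile.TemporaryDirectory() as scratch:
        with zipfile.ZipFile(archive) as zf:
            zf.extractall(scratch)
        for folder in pathlib.Path(scratch).iterdir():
            shutil.move(str(folder), str(jobs_dir / folder.name))

tmp = pathlib.Path(tempfile.mkdtemp())
jobs_dir = tmp / "jobs"
jobs_dir.mkdir()
archive = tmp / "Pool Shoot.zip"
with zipfile.ZipFile(archive, "w") as zf:
    zf.writestr("F4E9EB/scan.dat", "depth data")

import_job(archive, jobs_dir)
print((jobs_dir / "F4E9EB" / "scan.dat").read_text())  # depth data
```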

Kirk:
This is something that we're experimenting with too, which is to basically say: take my entire jobs folder, make a zip archive of the whole thing, and then send it out. This is convenient if you're trying to take a snapshot of the whole folder. There is one downside, which is if you have a lot of jobs: say I've already used 30 gigabytes of data in my jobs folder, then I need at least another 30 to store the copy that we're going to compress and move off the device. Either approach is fine when you've got smaller sets of jobs. If your device is getting really, really full, then you won't be able to do a full backup in this manner, and the method that Amir has shown in the Matterport Academy videos is the preferred one: hook the device up with a cable to a laptop or computer, and use the third-party tools to pull all that stuff off.
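The space constraint Kirk describes can be checked with a quick sketch, assuming (as hedged here) that already-compressed scan imagery gains little from zipping, so the archive needs roughly the folder's size again in free storage:

```python
# Rough pre-flight check: is there enough free space to hold a full
# in-place zip of the jobs folder? (Assumes near-zero compression gain.)
import pathlib
import shutil
import tempfile

def can_full_backup(jobs_dir: str) -> bool:
    """True if free space covers roughly one more copy of the jobs folder."""
    root = pathlib.Path(jobs_dir)
    used = sum(f.stat().st_size for f in root.rglob("*") if f.is_file())
    free = shutil.disk_usage(jobs_dir).free
    return free >= used

# Demo with a tiny throwaway folder standing in for the jobs directory.
tmp = pathlib.Path(tempfile.mkdtemp())
(tmp / "job.dat").write_bytes(b"x" * 1024)
ok = can_full_backup(str(tmp))
print(ok)
```

With Kirk's 30 GB example, the check fails unless another roughly 30 GB is free, which is exactly when the cable-plus-laptop method becomes the better option.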

Kirk:
But for import, this is a preview of what you'll see on iOS too, with a slightly different UI. These are early beta kinds of steps. Basically, you can say, "Okay, great. In the previous flow, I exported my model, took it as a zip file, and stuck it off onto a computer. Maybe I got a new device, or maybe later on I decided to work with that model again." In this case on Android, you pull that zip file into the directory you see over here on the lower right, the Matterport directory in your documents folder, where there's an archive folder. If you pull it in there, then when you tap on Import Jobs inside Capture, it will give you a list of the archives that it sees in that directory. You can then select one and say, "Okay, great, I'm going to go ahead and import this." Capture will unpack that and bring it back in. When you come back to your directory of jobs, you'll see it there.

Kirk:
Again, what we're trying to do here is give you a more user-friendly way to go through these operations that you're normally going through: give you ways that are a little bit more reliable to make space on the device, get archives of your stuff, and keep it in a safe spot. One of the things that will be coming in the future is cloud restore, where you'd be able to say, "I'm going to do this, but I want to be able to just bring things back from the cloud, because that data's up there already." We have to do a bunch more work in terms of data structures and other things, because the current model that we have is not super conducive to making that work well.

Amir Frank:
Yeah, that was great. That was a lot of stuff, but super informative and very, very useful information. We've got some more questions that we'll get to in just a bit. I just wanted to go through a couple of things so you guys can find the information you want after we say goodbye today. If you go to matterport.com and check out the Resources tab, the first thing there is Support. That'll get you to the support hub, which basically is the hub for everything support related: frequently asked questions, a link to the Help Center, which has hundreds and hundreds of articles all about backing up, the cameras we support, and all this stuff that we're talking about here today.

Amir Frank:
If you are in the US or Canada, and maybe even Mexico, the phone number to reach support is 408-805-3347. But regardless of where you are, if you go to that support page and scroll all the way down to the bottom, the correct phone number for your location will be displayed. You can reach us by phone, or you can contact us by emailing support@matterport.com. If you go back to that support page, there's also a little help bubble in the bottom right corner of the screen where you can chat with support. We try to make it as easy as possible to reach support, and those are the ways to do it.

Amir Frank:
As always, I have to mention that it's super, super important that the contact information in your profile is up to date. We send not only newsletters and things like that, but also sometimes emails that are specific to your account. Those get sent to the email address associated with your account, and you will not get them unless that information is up to date. So please do make sure it's current. Finally, stay connected with everything. We're very active on Facebook. If you go to facebook.com/matterport, you get to our Facebook page, where we post all the latest and greatest; every time a beta makes it out into GA, it's posted there so you'll know about it.

Amir Frank:
As well as, if you've got these cool spaces that you've been scanning and you'd like to show them off in the Matterport gallery, this is where you go to get that done: go.matterport.com/nominate-your-space. Just let us know what space you'd like to add and we'll have a look and check it out. With that, I think we are... that's the end of the presentation. So, thanks. Like I said, a lot of information, but we do have some time left for Q and A. I'm just going to go ahead and stop sharing. Yeah, there was a question regarding how to handle rooms with mirrors when you're scanning with the iPhone. Great question.

Amir Frank:
We're huge advocates of using the right tool for the right job. If you walk through the property and you see that this room has a ton of mirrors in it, and you would not really be able to capture it from just one location, you need a few more scan positions and what have you. You come to the realization that it's not going to be possible to have a scan position without you in it if you use the iPhone. In which case, something like a 360 camera for those areas would probably be a better tool, because you can just stick it on a little stick, much more inconspicuous than even the Pro2, and you can walk around the corner and trigger it. Using the right tool for the right job is super important. That's why we support so many cameras.

Kirk:
There were a couple from the export topic I was going to hit. Can I just edit the scans on the computer, that is, export a job, open the files, and modify them on the computer? Not a good idea. Capture isn't designed to expect somebody editing those things. In general, they're not really designed to be user-friendly. The imagery and depth components are not things that you'd pull up in Photoshop to edit. The file structure and all the raw data around it are really not meant to be user-edited. Sorry, that's probably not a great idea. There was a question around whether, if you bring an exported job back to your device, you can add scans to it. Yes, you can add scans, you can edit, you can mark it up, and you could upload it again to the Matterport cloud, no problems at all. That's part of what this was designed to do: make it easy to bring that back if you need to.

Amir Frank:
Alex asked, "Am I missing something about local rendering of the capture?" Something about updating the app, is that automatic? Did you cover anything about local rendering? I don't think we do anything like that. I'm not sure what he's referring to.

Kirk:
I'm not sure what the question is, but in general, for everybody: both app stores, the iOS and Android app stores, are generally set to auto-update, so the application will generally update itself. You can turn this off if you decide you want more control over that. If, Alex, you're referring to what we're doing on Android, where the Cortex engine is now part of the application itself, that's a convenience for everybody so that you don't have the separate download. Basically, on Android that's coming as part of the application, and this will be coming to iOS as well. That was just a convenience to make it easier and more streamlined as you get started.

Amir Frank:
If we use the iPhone LiDAR on outside scans and the Pro2 inside, does that mean that we cannot order MatterPaks or floor plans, or can we separate them somehow? You would have to separate them, yes. I would always lean towards starting inside just to get a good solid foundation; if you've got the Pro2, start with that. Once you've done the entire inside, create a duplicate of that model in Capture, and then you can continue your way outside with something like the iPhone with LiDAR, because the iPhone's LiDAR is better suited for outdoors; it doesn't conflict with the sun's infrared as much. Then you have both models: the complete one that you can show off, and the other one that you can use for ordering floor plans. Because if even a single scan position was captured using a phone or a 360 camera, at this time that will basically mean that you will not be able to use that model to order floor plans or MatterPaks and things like that.

Kirk:
There were a couple of iPhone-related questions here. One question was around whether there are mounts or motorized mounts for rotation. In general, this is related to what I was talking about before: when you're scanning with your iPhone, a monopod is a handy mechanism to keep you centered, or keep the phone centered, rather, as you go through that. There are actual mounts and motor mounts that folks have used to rotate their cameras. There's no known motor mount right now that Capture could control, to basically say, mount my iPhone on this and then have it do the rotation. There are a couple of other providers, like InsideMaps, that do things like that, where there's a motor mount for your iPhone and they're taking the pano imagery as well.

Kirk:
For larger spaces, that kind of mechanism would certainly be handy. That's an area that we keep looking at, in terms of whether other parties are going to be providing those kinds of things, or whether that's something that we should look at providing. As we get a sense of what kinds of spaces folks are capturing with their iPhone, what size they are, how large they are, and whether this is a repeated thing versus more of a one-off, we'll try to evaluate, and I'm sure this world will evolve to make more convenient things like that available.

Amir Frank:
Question from Greg: "When shooting exterior 360 captures with a 360 camera, are there any options for a timer or a way to shoot one lens at a time?" Definitely something that I know you've mentioned in the past, Kirk, that we're looking into, so you can shoot the one hemisphere, walk around to the opposite side, shoot the opposite hemisphere, and then it just stitches those together. Great question. Any progress?

Kirk:
Unfortunately, that is still in the queue. For the broader picture: when you're using a 360 camera, there's nowhere to hide, unless you're really small and you go right beneath the tripod, which is probably unsafe. The notion here is that the Ricoh Theta cameras have a plugin architecture, and there are a bunch of apps that folks have done with timers and things like that. What we would like to try to do is bring the ability to do that with all the 360 cameras.

Kirk:
What Amir was talking about was: shoot one side, then walk around and shoot the other side, removing the operator from the shot in between. That's still in our queue behind other features that we're working on, so I don't have an update for you. I know that's not super satisfying, but that's where we are. We know that's important.

Amir:
Really, the question is: what range will the LiDAR work at outside?

Kirk:
For the iPhone 12 Pro and Pro Max, Apple advertises five meters, and that's what we've seen outside in full sun and so forth. Again, given the frequency of the LiDAR and the nature of it, it's less susceptible to near-infrared, unlike the Pro2 cameras with their structured-light sensors. Note that the data density will get pretty sparse out there. Even though it technically goes out that far, your data density, and the accuracy of it, start to get spotty. In general, we recommend tightening up the scans a little bit. If it's really important to get 3D data outdoors in that area, I would try to go a little bit denser.
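The density falloff Kirk mentions follows from simple geometry: with a fixed angular spacing between LiDAR samples, the gap between neighboring points grows linearly with range. The 1-degree figure below is purely illustrative, not Apple's specification:

```python
# Back-of-the-envelope: the same beam pattern spreads over a larger
# area at range, so sample spacing grows linearly with distance.
import math

def sample_spacing_cm(distance_m: float, angular_spacing_deg: float = 1.0) -> float:
    """Approximate gap between adjacent samples on a flat surface."""
    return 100 * distance_m * math.radians(angular_spacing_deg)

for d in (1, 3, 5):
    print(f"{d} m: ~{sample_spacing_cm(d):.1f} cm between samples")
```

At the 5 m limit the gaps are five times what they are at 1 m, which is why tighter scan spacing helps when outdoor 3D data matters.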

Kirk:
If you're trying to do a really large space... I think the other part of this question is, can I do large spaces with this? You can try to push the distance as much as you can, but if you start to see misalignments, where the puck lands in the wrong spot on the mini-map, then that's where things are starting to get dicey, and you have to start not moving so far between scans.

Amir:
Yeah. We had a question from Edwin, who asked, "How can one enhance the quality, saturation, color curves, levels, things like that, of a space post-shoot?" That is unfortunately something that Matterport doesn't yet offer. However, there are third-party tools, namely something like MPEmbed. If you go to mpembed.com, I think even their free tool offers that level of enhancement. It's a really, really wonderful tool. We are working a lot on our SDKs and APIs in order to make that kind of thing more possible.

Kirk:
Two related questions from Richard. This relates to the number of images and number of scans in Capture. He was relaying that he was using a Theta V and getting up around 700 to 900 scans with the iPad crashing. This is an important area, and it's related to that RAM topic I was talking about before. The full answer around what the limits are is: it depends, and it depends on a couple of things.

Kirk:
First, the device that you're using and its capabilities: how modern is it, in terms of an iPad or Android device? Obviously, storage is important, just because, hey, I need to store this stuff somewhere. The most subtle thing, and the thing that's usually the killer, is how much RAM is on the device. That's the thing that is not well advertised, and it's subtle.

Kirk:
It also depends on what camera you're using. The size limitations... Interestingly, you can go further with the Pro camera than you can with the 360 cameras, simply because we're doing more computational work with the 360 cameras than we are with the Pro cameras. I'm assuming Richard probably has a fairly modern iPad, because where he's seen these crashes, up around 700 to 900 scans, is, from what I know of the limits right now, probably where you're going to start hitting those problems on a really modern iPad device. On lower-tier devices, that effective limit will come down. On an old device with two gigabytes of RAM, with 360 cameras, you may only get 200 scans before you start crashing.

Kirk:
One of the things we're going to try to do is publish guidelines around this, to say, "Okay, here are the effective limits in terms of where you're going to run into trouble," because once you crash, obviously that's terrible. Once you're in that state, really the next thing to do is to start another scan job that overlaps the prior one, and then submit that to Support for stitching. More to come there in terms of guidance and limitations. Just remember the rule of thumb: if you're shopping for a new device that you're going to use, try to maximize the RAM as much as possible. It saves you time and it saves you hassle operationally, so it's worth it.

Amir Frank:
Eli asks, "If I'm unsatisfied with the quality of my Matterport scan, can I recapture some scans and then upload that and have it be incorporated into the scan?" It's not that the new scans are going to be incorporated. Let me back up a second. Yes, you can go back to the site. You can capture new scans into the same model. You can make a duplicate of that model in Capture, erase a couple of scans, and recapture that area, for example. This happens a lot if somebody has just renovated a kitchen: you don't want to re-scan the entire house. You just go back, you erase the scan positions around the kitchen, start from a familiar area, someplace that the system can identify as having not changed since the previous scans, and then scan your way into the new area that has been changed. You should not have a problem.

Amir Frank:
However, when you upload, it's not going to replace the scan or the model that has previously been processed. The way it works is like baking a cake. If you've heard me talk about this before, I'll say it one more time. You've got your raw ingredients in Capture. You take those raw ingredients, your eggs and flour and milk and whatnot, and you throw them into the oven, and they get processed into a cake. At that point, you can throw icing on that cake in the form of Mattertags and highlight reels, but you can't change the ingredients that are in that baked cake.

Amir Frank:
What you can do is alter your raw ingredients and bake a new cake. That's how that works. You will end up with a new model that will have a new and different link. If you have been sharing this and have previously published the link, the new one will be different, and those links will have to be updated. Let's say we've got time for one or two more; we're about a minute short of the hour.

Kirk:
Yeah. There are a couple of questions similarly asking about the capture range for the Theta SC2, or differences between the 360 cameras in terms of range, accuracy, and so forth. In general, there are two ranges. There's the range of what I'm capturing in depth data: what depth data is being synthesized from the image if I capture in a really large space. One thing to watch is that Cortex will try to generate depth data from a lot of stuff, so you can get a very large range from it. But the question is, is that data really useful or accurate for you?

Kirk:
Again, if you're in a large space trying to increase your scan range... If you're talking about the Wi-Fi range, the 360 cameras sometimes don't have the strongest Wi-Fi signal. Sometimes you'll notice, in spaces where I'm scanning, if I'm a little bit further from the camera, I have to stick my iPad up just behind the corner to get a little bit of a boost to get the thing to trigger. If that's the question, it's similar with all the other 360 cameras; it's par for the course with them. There's nothing particularly different about the SC2 as far as I'm aware.

Amir Frank:
Yeah. The last question came from Jay, who asked about the floor plan accuracy he should expect from the Ricoh Theta Z1, the Z1 being the top-of-the-line 360 camera that we're currently supporting, with its massive sensor. Just because it has better image quality and a good signal-to-noise ratio unfortunately does not translate to a more accurate floor plan. In fact, right now, we're still not supporting floor plans when a 360 camera is used to capture the imagery. You are looking at accuracy of within 4 to 8%. It just depends a little bit.
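The quoted 4 to 8% band converts to absolute error with quick arithmetic. Conveniently, a wall's length in meters times the percentage gives the error in centimeters directly (length_m x pct/100 in meters is length_m x pct in cm):

```python
# Convert a percentage accuracy band into absolute error for a wall.
def error_band_cm(length_m: float, low_pct: float = 4.0, high_pct: float = 8.0):
    """Absolute measurement error range, in cm, for a wall length_m meters long."""
    return (length_m * low_pct, length_m * high_pct)

print(error_band_cm(5.0))  # a 5 m wall could be off by 20 to 40 cm
```

That scale of error is why tighter accuracy is wanted before standards-grade floor plans are offered from 360-camera captures.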

Amir Frank:
We feel like we have a little bit more work to do to bring that down before we will be comfortable with floor plans, because those generally need to follow a standard, the RICS standard. To get there, a little bit more work has to be done, but we're on our way. Okay, with that, we'll say goodbye. Thank you so much, Kirk. That was absolutely amazing, a lot of really, really great information. Can't wait for more stuff like that to come out and to see it progressing.

Kirk:
It's cool. Thank you. And thanks for all the questions, everybody. This was great. Lots of good info.

Amir Frank:
Yeah. Great questions; really appreciate the participation and your attendance today. So, thanks very much. With that, have a great rest of the day. Take care, everybody.

Kirk:
Bye, everybody.