Why is 3D scanning things so damn fun?
The best part is that it can be done right from your iPhone (12 or newer).
Once you get some practice in, you can scan things, clean them up on your laptop, and then sell them on 3D asset sites like TurboSquid for passive income.
Additionally, with augmented reality on the horizon (e.g., Amazon projecting furniture into your room, or Snapchat filters with funny dancing characters on your countertop), 3D scanning will open up a new avenue of content creation.
On brand with what I usually cover in this newsletter, 3D scanning on your phone is in its early stages, so don’t expect perfection from these scans. They can come out wonky sometimes.
In the future, consumer-grade 3D scanning will help speed up the process of creating 3D environments for video games, virtual experiences, and just about anything else that requires complex models to be placed and edited.
Without further delay, let’s go through the apps that let you scan objects (and your friends) from your phone.
Luma AI
Luma AI is a great app that uses NeRFs to create 3D models. NeRF stands for Neural Radiance Fields. Simply put, it takes a bunch of pictures and turns them into a 3D model using machine learning (hence “neural,” aka artificial intelligence). The free app guides you through scanning; I especially like how well it captures reflections. Other scanning apps ‘bake in’ the reflections, causing reflections to not… reflect. Luma is great at reflections: they move with the camera’s perspective, just as they do in real life.
Above is an image from a 3D scan of my co-worker, Sean. Notice that the light shines through his hair in the top left. All the lighting acts as you would expect in the real world. Everything in the screenshot above is a 3D model. It’s still slightly janky, but just like every other piece of cutting-edge tech we talk about here, this is the worst it will ever be. Tech only gets better.
The unfortunate part about Luma is that it doesn’t let you easily export the models to other 3D software. To export, you need to download a GitHub codebase and use the command line to load the model into other software. I haven’t learned how to do that, and I hardly have the appetite to learn right now. Command lines scare me lol.
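For the curious, here’s a toy sketch of the idea behind a NeRF’s rendering step: sample points along a camera ray, ask a learned “radiance field” for a density and color at each point, and blend them front to back. The `radiance_field` function below is a hard-coded stand-in (a red sphere) for the neural network a real NeRF trains on your photos; this is illustrative only, not Luma’s actual code.

```python
import numpy as np

# Hypothetical stand-in for the trained neural network. A real NeRF learns
# this mapping (3D point -> density + color) from your captured photos;
# here it's just a hard-coded opaque red sphere so the sketch runs on its own.
def radiance_field(points, view_dir):
    dist = np.linalg.norm(points, axis=-1)
    density = np.where(dist < 1.0, 10.0, 0.0)           # opaque inside the unit sphere
    color = np.tile([0.8, 0.2, 0.2], (len(points), 1))  # flat red everywhere
    return density, color

def render_ray(origin, direction, near=0.0, far=4.0, n_samples=64):
    """Volume rendering: sample the ray, query the field, alpha-composite."""
    t = np.linspace(near, far, n_samples)
    points = origin + t[:, None] * direction
    density, color = radiance_field(points, direction)
    delta = np.diff(t, append=far)                      # spacing between samples
    alpha = 1.0 - np.exp(-density * delta)              # opacity of each sample
    # Transmittance: how much light survives past the samples in front
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))
    weights = trans * alpha
    return (weights[:, None] * color).sum(axis=0)       # final pixel color

# A ray shot straight through the sphere should come out (nearly) solid red.
pixel = render_ray(np.array([0.0, 0.0, -3.0]), np.array([0.0, 0.0, 1.0]))
print(pixel)
```

Do this for every pixel from every camera angle, and you get images like Luma’s, where reflections and lighting shift with your viewpoint instead of being baked into a static texture.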
Polycam
Polycam is an alternative that does a decent job at scanning and lets you export the 3D models easily. The downside is that it doesn’t capture reflections the way Luma does. Items that absorb light are your best use case here (people, clothes, fire hydrants, art canvases, etc.).
Once you scan some things in, you can use the in-app AR filter to project them into the real world.
I’ve also exported 3D models to powerful 3D software like Blender. If you have the skill to create a complex 3D environment in Blender or a game engine like Unreal Engine, it would benefit you greatly to set aside 2–3 hours to 3D scan real-world objects and test what you can do with them in 3D software.