My question about virtual reality has always been, “But what is it for?” I finally have an answer: Gaussian Splatting. We’ve always tried to capture our past, whether through physical photographs, VHS tapes, or every picture you have stored in the cloud, but we’ve been limited to viewing our personal histories in flat media, usually from behind a screen, and always from a single angle. Gaussian Splatting changes that. This technology lets you create volumetric 3D models of objects, people, or spaces, so instead of a picture of your child’s favorite toy, you can have a realistic scan of it that you can examine from every angle; instead of a snapshot of Thanksgiving dinner, you can have a photorealistic diorama of the dining room that you can walk around.
What is Gaussian Splatting?
Gaussian Splatting is a technological newborn. It was introduced in a 2023 research paper by Bernhard Kerbl, Georgios Kopanas, Thomas Leimkühler, and George Drettakis. The paper details a rendering technique that builds 3D models out of millions of semi-transparent blobs called “Gaussians” instead of the solid triangles used in traditional computer graphics. Once calculated, the Gaussians are “Splatted” onto a 2D plane by your computer, arranged and layered based on how they should look from any viewpoint within the Splat. Because the blobs are semi-transparent, they don’t block each other. They blend together like brushstrokes in a painting.
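If you want a feel for that blending, here's a toy sketch of the core idea: for each pixel, the renderer walks through the blobs that cover it from nearest to farthest, and each blob contributes color in proportion to its opacity and to how much light is still getting through. (The numbers here are made up purely for illustration.)

```python
def composite(splats):
    """splats: list of (color, alpha) pairs sorted nearest-first along one pixel's ray."""
    color = 0.0
    transmittance = 1.0  # how much light still passes through to this point
    for c, a in splats:
        color += transmittance * a * c   # each blob tints the pixel...
        transmittance *= (1.0 - a)       # ...and dims everything behind it
    return color

# Three overlapping half-transparent blobs blend instead of occluding:
pixel = composite([(1.0, 0.5), (0.5, 0.5), (0.0, 0.5)])  # -> 0.625
```

A fully opaque blob (alpha of 1.0) would zero out the transmittance and hide everything behind it, which is exactly how solid surfaces emerge from a cloud of soft ones.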
Another bonus: Splatting provides a much higher level of detail for its file size compared to traditional methods of scanning. Older scans work on the geometric principle of stretching a virtual skin made of triangles over an object. For a detailed scan, that could mean billions of triangles, resulting in PC-choking file sizes. Splatting is based on mathematical probability rather than rigid geometry. Instead of a solid edge, each “blob” is a tiny cloud that tells the computer how likely a color is to exist in that spot. It only stores the position, color, and transparency of millions of relevant areas in space, as well as how they should reflect light from different angles. The result is files that are big compared to Word documents, but not so huge that you can’t work with them on a phone.
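To make "only stores" concrete, here's a rough sketch of what one blob's record looks like. The exact field set varies by app, and real files also store view-dependent color terms, so treat this as an illustration rather than a spec:

```python
from dataclasses import dataclass

@dataclass
class Splat:
    """One semi-transparent blob. Fields are illustrative; real formats vary."""
    position: tuple   # (x, y, z) center of the blob
    scale: tuple      # how far the cloud spreads along each axis
    rotation: tuple   # quaternion (w, x, y, z) orienting the blob
    opacity: float    # 0.0 fully transparent .. 1.0 fully opaque
    color: tuple      # base RGB; real files add view-dependent coefficients

blob = Splat(
    position=(0.0, 1.2, -0.4),
    scale=(0.05, 0.05, 0.02),
    rotation=(1.0, 0.0, 0.0, 0.0),
    opacity=0.8,
    color=(0.9, 0.2, 0.2),
)
```

A handful of numbers per blob, times a few million blobs, is still far less data than a triangle mesh dense enough to capture the same detail.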
Gaussian Splatting quickly went from theory to practice, and now Splats can be created and rendered with only a decent smartphone, making it more accessible than older methods that sometimes required laser scanners or specialized equipment.
Why you should start Splatting
3D scanning is already in use professionally in things like mapping real estate for virtual tours and creating photorealistic assets for video games, but Gaussian Splatting is accessible enough that anyone can future-proof their nostalgia.
Splatting gives your future self (or your kids) the ability to “visit” your current life with a level of realism that’s breathtaking. It lets you digitally “bottle” the exact layout and volume of a moment in time and preserve it. If your parents had this, you’d be able to walk around your childhood bedroom, or check out every angle and detail of the first car you ever bought.
“Digital preservation” and “3D modeling” sound clinical, but the results of Gaussian Splats are anything but sterile. While photography captures a single angle of light in a room, Gaussian Splats capture the behavior of light from all angles, so the result isn’t what the past looks like, but what the past feels like. It’s hard to describe, but capturing the quality of light on an object or location puts you in touch with it in a way you didn’t think possible. That combined with the haziness of Splats and your own memories adds up to an ethereal, dreamlike experience that isn’t like anything else. (I like Splats a lot.)
How to get started Splatting
The barrier to entry for Splatting is just a little time to figure out how it works. You don’t need a specialized LiDAR scanner or an overpowered PC, just a relatively recent smartphone. Here’s how to get started:
Pick an app: Though the technology is new, a few apps are making it very user-friendly. Here are the two I’ve tried:
- Scaniverse: Excellent for iPhone users, Scaniverse is free, and it processes Splats entirely on your device in only a minute or two.
- Luma 3D Capture: Available on both Android and iPhone, Luma is great for beginners, with a scanning process that walks you through creating your first Splat.
Make a capture: Here are some things to think about when making your capture.
- Before you start scanning locations or bigger objects, pick something small and simple so you get the concepts down. But not pets: Your subject has to remain perfectly still through the process. (Make an exception for your child. They won’t hold still enough, but having even a blurry model of your kid is vital for future you.)
- Place your subject in an evenly lit room with enough space to walk all the way around it.
- Hit record and walk in a slow, steady circle around your object, keeping your camera pointed at its center.
- Do two passes: one from a high angle looking down, another from a low angle looking up.
- Gaussian Splats hate uniformity. They struggle with plain white walls, so think in terms of textures. Also, avoid clear glass and mirrors, which confuse the depth calculations.
Have a banana: Now that you’ve captured your Splat, take a break so the computer can do its thing. How long it will take depends on the app you’re using, your phone, and how detailed your scan is. Scaniverse processes Splats right on your phone. For something simple like the guitar below, it took about two minutes of rendering on an iPhone 17 Pro. Luma 3D Capture processes files in the cloud, so how long it takes depends on how many people are in front of you in the queue. It might be a couple minutes. It might be a couple hours—the app sends an alert when your image is finished cooking. The video below took several hours.
Enjoy your creation: Once the math is finished mathing, you can view your creation right on your smartphone screen or computer. Pinch to zoom, drag to rotate, and marvel at how perfectly the scan captured the vibe of the object or space.
Share your creation: These apps give you a couple of easy ways to share your volumetric memory:
- Video: You can plot a camera path through your Splat to export a smooth, 2D “fly-through” video. Below is my first scan on YouTube using Scaniverse (it’s sloppy; I was new), and my second try with Luma.
- Web link: Both apps can generate a simple web link you can text to friends or family. When they tap it, it opens an interactive 3D viewer in their browser, with no special apps, accounts, or heavy downloads required.
How to step inside your Splats
Viewing a 3D scan on your phone or PC is kind of cool, but you can’t really understand how mind-blowing these things are until you check them out in a virtual reality device, where you can physically walk around that Thanksgiving table or lean in to inspect the texture on the couch. Here is how you can do it on the two biggest headsets right now.
Apple Vision Pro
The powerful Apple Vision Pro was built for this. Apple includes “Spatial Scenes” right in the OS, which gives a slight 3D pop to 2D photos. You can take that a little further with apps like Splat Studio, which generates a deeper 3D scene from a 2D photo and lets you tweak settings to improve it, or with Spatial Media Toolkit, which turns 2D videos into stereoscopic 3D. But the final boss is viewing full Splats you made yourself with apps like Luma 3D Capture or Polycam.
If you follow the steps above, you should be able to export the Splat file you created (.ply or .spz) right from your phone to your Vision Pro and step inside the Splat or walk around the object you scanned. You can also check out Splats other users have uploaded.
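If you're curious what's inside those files, a .ply is just a point cloud with extra per-point properties, and its header is plain text you can inspect yourself. Here's a small sketch using only Python's standard library; the demo file it writes is heavily simplified (real Splat exports carry many more properties, and the exact list depends on the app):

```python
def read_ply_header(path):
    """Return (vertex_count, property_names) from a PLY file's text header."""
    props, count = [], 0
    with open(path, "rb") as f:
        assert f.readline().strip() == b"ply", "not a PLY file"
        for raw in f:
            line = raw.decode("ascii", "replace").strip()
            if line.startswith("element vertex"):
                count = int(line.split()[-1])       # how many points/blobs
            elif line.startswith("property"):
                props.append(line.split()[-1])      # keep just the property name
            elif line == "end_header":
                break
    return count, props

# Write a tiny simplified example file to demonstrate:
demo = (b"ply\nformat ascii 1.0\nelement vertex 2\n"
        b"property float x\nproperty float y\nproperty float z\n"
        b"property float opacity\nend_header\n0 0 0 1\n1 1 1 0.5\n")
with open("demo.ply", "wb") as f:
    f.write(demo)

count, props = read_ply_header("demo.ply")  # count == 2, props == ["x", "y", "z", "opacity"]
```

Point it at a Splat you exported from Scaniverse or Luma and you'll see the blob positions, opacities, and color coefficients laid out property by property.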
Meta Quest 3 and 3S
Meta has embraced the Gaussian Splat revolution. Apps like AirVis (also on the Vision Pro) let you check out Splats you made on your phone, and there are even 4D Splats available on the Quest (more on that below). Meta is also taking the first steps toward cutting out the middleman of your phone altogether. Hyperscape Capture is a still-in-beta app that uses the Quest’s existing cameras to scan your room, then save a 3D version of your space. Meta promises that soon you’ll be able to send a link to a friend with a headset so they can “come visit.”
The future of 4D Splatting
As hyped as I am for Gaussian Splatting, the technology is in its “version 1.0 era.” Capturing a decent Splat takes time and patience and requires the subject to stay absolutely still, and the result isn’t always perfect, but the technology is evolving fast enough that the next thing is already emerging. The cutting Gaussian edge is 4D Splatting—the fourth dimension is time. 4D Splats are 3D volumetric videos, moving scenes you can view from any point inside or outside the scene. Unlike stereoscopic 3D movies that lock you to a single viewpoint, these are true holograms. At least they are inside a VR rig.
The technology is already in use commercially, most notably in A$AP Rocky’s music video “Helicopter,” in which performers were captured by 56 cameras and the footage converted to 4D Splats, letting the director use any angle, including otherwise impossible camera movements. Check it out:
And there are some 4D Splats you can check out in your headset too. Quest 3 app Gracia has a few volumetric videos that are very impressive. Gracia lets you stream or download 4D Splats of people, and place them anywhere you like in augmented reality. Then you can hit “play” and look at them from any angle, or even move all the way around them. To see what I mean, check out this video I made showing my view from within a Quest 3 headset, of singer Amy May performing a song on my front lawn (with a cameo from my no-doubt confused neighbor).
You probably don’t have an array of 20 or so GoPros to create content like Gracia’s, but there are some experimental tools out there for consumers to create 4D Splats. KIRI Engine uses Apple’s open-source ML-Sharp tool to turn a standard single-lens video into a 4D Splat. It doesn’t create an AI-aided approximation of stereoscopic 3D like Splat Studio; it converts each individual frame into a separate Splat. It’s too technical for me to really mess with, and the depth is guesswork rather than true 3D capture, but I would be surprised if a way of taking volumetric video with only a few smartphone angles weren’t in the works somewhere.
Gaussian Splats are as much of a revelation as I imagine instantly developing snapshots were in the 1960s. Like early Polaroids, the process is a bit of a pain, and the results are sometimes grainy, “dreamy,” and reminiscent of pointillism, but the emotional impact of a new way of seeing the past more than makes up for it. So get started Splatting now; your future self will thank you.