Image Credit: Karran Pandey, University of Toronto
What if artists could paint with reality? This SIGGRAPH 2025 Real-Time Live! demo introduces an interactive system that lets artists paint in real time using 3D Gaussian splat brushes captured from the real world. Built on the 3D Gaussian Splatting (3DGS) representation, this approach turns photorealistic scan data into an expressive creative medium — combining geometry, appearance, and depth in a single brush stroke. The result is a fast, intuitive way to remix reality and create rich 3D content … live.
SIGGRAPH: How does your system enable artists to paint with 3D Gaussian splat brushes in real time, and how does this differ from traditional 2D or volumetric painting tools?
Karran Pandey (KP): Our system builds a creative interface directly on top of the 3D Gaussian Splatting (3DGS) representation, which provides a photorealistic, particle-based foundation where appearance and geometry are baked directly into the particles. While 3DGS enables visual realism, our system enables direct, real-time interaction with these particles through painting.
- Compared to 2D painting tools: Traditional texture-painting software stamps 2D image fragments or stencils onto a mesh surface. Our system stamps integrated texture and geometry, allowing artists to paint physical volume rather than a flat surface layer.
- Compared to mesh-based or volumetric tools: In tools such as ZBrush, stamping complex geometry is topologically challenging because it requires managing mesh connectivity and UV maps. Because 3DGS is Lagrangian and lacks fixed topology, our system supports spatial manipulation and remixing of complex, real-world content without those constraints.
- Real-time performance: Core painting functionality — including spline fitting, brush placement, and nonrigid deformation — runs interactively for brushes composed of hundreds of thousands of splats representing rich, photorealistic structures.
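The particle-based representation described above can be sketched as a minimal brush data structure: because each Gaussian bundles position, orientation, scale, color, and opacity, "stamping" reduces to copying and rigidly transforming particle arrays. The names (`SplatBrush`, `place_stamp`) and array layout here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

class SplatBrush:
    """Toy splat brush: flat per-particle arrays (hypothetical layout)."""
    def __init__(self, means, rotations, scales, colors, opacities):
        self.means = np.asarray(means, dtype=float)          # (N, 3) centers
        self.rotations = np.asarray(rotations, dtype=float)  # (N, 3, 3) covariance frames
        self.scales = np.asarray(scales, dtype=float)        # (N, 3) per-axis extents
        self.colors = np.asarray(colors, dtype=float)        # (N, 3) RGB
        self.opacities = np.asarray(opacities, dtype=float)  # (N,)

def place_stamp(brush, hit_point, frame):
    """Rigidly transform all splats of a brush into a surface frame.

    `frame` is a 3x3 rotation whose columns are (tangent, bitangent, normal).
    Because appearance and geometry travel with the particles, copying and
    transforming the arrays is the entire stamping operation.
    """
    frame = np.asarray(frame, dtype=float)
    new_means = brush.means @ frame.T + hit_point
    new_rotations = frame[None] @ brush.rotations  # rotate each covariance frame
    return SplatBrush(new_means, new_rotations, brush.scales.copy(),
                      brush.colors.copy(), brush.opacities.copy())
```

Note that no topology or UV bookkeeping appears anywhere: the Lagrangian representation is what makes stamping this cheap.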
SIGGRAPH: Can you explain the process of sampling volumetric fragments from real-world Gaussian splat captures and how they are used in your interactive painting workflow?
KP: The process includes the following steps:
- 3D capture: Real-world scenes or objects are captured into a 3DGS representation optimized from a small set of multiview images and corresponding camera information.
- Sampling the brush: Artists “pick up” volumetric fragments of captured reality using either an existing 3D segmenter or a custom screen-space bounding box tool. This supports iterative refinement to isolate a subset of Gaussians as a brush stamp.
- Interactive workflow: Once selected, the brush is assigned an orientation frame, including normal and tangent directions. During painting, the system detects surface hits to determine stamp placement, enabling rapid remixing of captured imagery.
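Assuming a simple pinhole camera, the screen-space selection and orientation-frame steps above might look like the following sketch. `select_in_box` and `brush_frame` are hypothetical helpers standing in for the system's segmenter and frame assignment, not its actual code.

```python
import numpy as np

def project(points, K, R, t):
    """Project world points (N, 3) through intrinsics K and pose (R, t)."""
    cam = points @ R.T + t          # world -> camera
    uv_h = cam @ K.T                # camera -> homogeneous image coords
    return uv_h[:, :2] / uv_h[:, 2:3], cam[:, 2]

def select_in_box(means, K, R, t, box):
    """Indices of splats whose projected centers land inside a 2D box.

    box = (u_min, v_min, u_max, v_max); depth must be positive (in front of
    the camera). Iterating with refined boxes isolates a brush stamp.
    """
    uv, depth = project(means, K, R, t)
    u_min, v_min, u_max, v_max = box
    inside = ((uv[:, 0] >= u_min) & (uv[:, 0] <= u_max) &
              (uv[:, 1] >= v_min) & (uv[:, 1] <= v_max) & (depth > 0))
    return np.nonzero(inside)[0]

def brush_frame(means):
    """Estimate a tangent/normal frame for the selected splats via PCA:
    the least-varying direction approximates the surface normal."""
    centered = means - means.mean(axis=0)
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    tangent, bitangent, normal = Vt  # rows ordered by descending variance
    return np.stack([tangent, bitangent, normal], axis=1)
```
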
SIGGRAPH: How does your tool handle deformation and diffusion inpainting along painted strokes to create seamless transitions between splats?
KP: To ensure strokes appear organic rather than rigid or disconnected, the system uses a two-step blending process:
- Nonrigid deformation: The stamp is deformed to follow the curvature of the painted spline. Each Gaussian is tracked in a moving frame along the spline, so the 3D content conforms naturally to the stroke path.
- Diffusion inpainting: For complex overlaps where geometry alone is insufficient, the system applies automatic inpainting. Overlap regions are rendered from virtual cameras and processed using a Stable Diffusion XL (SDXL) inpainting model to adjust Gaussian opacity and features for seamless transitions.
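The nonrigid step can be sketched as mapping each splat's local coordinates into a moving frame along the painted stroke. This toy version, assumed here for illustration, uses a polyline spline, finite-difference tangents, and a fixed world up vector; the actual system's frame tracking is more robust.

```python
import numpy as np

def spline_frames(pts):
    """Tangent/normal/binormal frames along a polyline via finite
    differences; a stand-in for per-splat spline-frame tracking."""
    tangents = np.gradient(pts, axis=0)
    tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)
    up = np.array([0.0, 0.0, 1.0])  # assumes the stroke is not vertical
    normals = np.cross(up, tangents)
    normals /= np.linalg.norm(normals, axis=1, keepdims=True)
    binormals = np.cross(tangents, normals)
    return tangents, normals, binormals

def deform_along_spline(local_means, pts):
    """Bend stamp splats along the spline: each splat's local x coordinate
    picks an arc-length position on the curve, while y and z become offsets
    in that position's frame."""
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    arclen = np.concatenate([[0.0], np.cumsum(seg)])
    _, normals, binormals = spline_frames(pts)
    out = np.empty_like(local_means)
    for i, (x, y, z) in enumerate(local_means):
        j = np.searchsorted(arclen, x).clip(0, len(pts) - 1)
        out[i] = pts[j] + y * normals[j] + z * binormals[j]
    return out
```

A production version would interpolate between samples and parallel-transport the frame to avoid twisting, but the core idea is the same change of coordinates.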
SIGGRAPH: What advantages do 3D Gaussian splat brushes offer for achieving realistic textures, depth, or volumetric effects compared to other brush techniques?
KP: The primary advantage is effortless realism through real-time interaction.
- Real-time, high-quality results: Because the particles are optimized from real-world scans, they capture visual richness and depth that are traditionally difficult and time-consuming to model by hand. Combining 3DGS rendering with interactive modeling enables photorealistic results at interactive speeds.
- Coherent remixing of reality: The seamless painting workflow, along with creative controls such as controlled randomness, allows artists to create organic foliage and realistic structures — including roads, walls, and rail tracks — that appear naturally weathered. This supports rapid 3D prototyping and ideation.
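As one illustration of controlled randomness, each stamp placement can be given a random spin about the surface normal and a random scale so repeated stamps read as organic rather than cloned. The helper name and parameter ranges below are assumptions, not the system's actual controls.

```python
import numpy as np

def jittered_frame(frame, rng, max_spin=np.pi, scale_range=(0.8, 1.2)):
    """Randomly spin a stamp frame about its normal (local z axis) and
    draw a random uniform scale. `frame` has columns (tangent, bitangent,
    normal); the returned frame is still orthonormal."""
    theta = rng.uniform(-max_spin, max_spin)
    c, s = np.cos(theta), np.sin(theta)
    spin = np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])
    scale = rng.uniform(*scale_range)
    return frame @ spin, scale
```

Seeding the generator keeps the randomness reproducible, which matters when an artist wants to iterate on a stroke without the stamps reshuffling.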
SIGGRAPH: How might artists or creators integrate this system into existing digital art pipelines or interactive content creation?
KP: During our pilot study, animation and visual effects professionals identified several integration opportunities:
- Rapid 3D prototyping: Concept artists can quickly build new worlds with explicit control, offering a strong alternative to prompt-only workflows.
- Modifying photogrammetry: The tool addresses common pain points in editing raw scans, allowing artists to extend, repair, or modify captured environments using a playful, brush-based interface.
- Animation and stop-motion: Participants suggested integrating a timeline to keyframe the painting process, using brush strokes to represent motion over time for cinematic production.
SIGGRAPH: What advice would you give to someone interested in submitting a project to the SIGGRAPH 2026 Real-Time Live! program?
KP: Based on our experience, strong submissions focus on three core pillars:
- Visually compelling results: The demo should read as high quality immediately. Photorealistic results that are uncommon in real-time contexts consistently stand out.
- Fluid and smooth interaction: Real-Time Live! emphasizes the live experience. Interactions should feel seamless, with heavy computation handled in the background.
- A story built around new capabilities: Tell a clear, engaging story that demonstrates something previously impossible — such as the ability to literally pick up and paint with reality itself.
This demo showcases a new way to create — painting with captured reality in real time, without meshes or textures. If this inspired you, there’s still time to submit your own work to SIGGRAPH 2026 Real-Time Live!.

Karran Pandey is a fourth-year CS Ph.D. student at the University of Toronto advised by Karan Singh. His research focuses on interactive tools and representations for visual creation, editing, and exploration. He generally works with artistic graphical representations (such as sketches, 3D models, images, and videos), and enjoys designing interactive graphics workflows for creative tasks that are convenient, intuitive, and fun.



