A Talk with 2017 ACM Fellow, Steve Seitz

9 January 2018 | ACM SIGGRAPH, Awards, Graphics, Industry Leaders, Research


By Melanie A. Farmer

Steven Seitz’s work in teleportation and virtual reality gives people the ability to feel as though they are somewhere they are not, essentially transporting them to a physical experience through VR technology, novel camera technologies, and Google’s powerful data centers and algorithms. Seitz, a professor of computer science at the University of Washington (UW) in Seattle, also leads the teleportation group at Google.

Seitz is part of the recently announced class of ACM Fellows, 54 members in total, whose expansive expertise has made a significant impact globally and on the way we live and work. These new Fellows join an elite group of researchers and academics that represents less than one percent of ACM’s overall membership.

“I was floored that many of the colleagues I most admire would take the time and effort to nominate me for this honor,” says Seitz.

The 2017 Fellows have been cited for numerous contributions in areas including artificial intelligence, big data, computer architecture, computer graphics, high-performance computing, human-computer interaction, sensor networks, and wireless networking. ACM will formally recognize Seitz as a new ACM Fellow at its annual awards banquet, to be held in San Francisco on June 23, 2018.

Seitz, who began working with Google in 2010, has spent his career focusing on problems in computer graphics and computer vision. His research aims to capture the structure, appearance, and behavior of the real world in digital imagery; more recently, he has focused on advances in virtual reality, augmented reality, and teleportation.

“We are focused on inventing ways to capture and transmit the world’s places, people, and events,” he notes of his work at UW. “Imagine watching an NBA game projected holographically onto your table top or communicating with remote family, friends, and colleagues as if you were physically all together, or even preserving your life’s important moments in a way that you can literally step back into them later.”

To explore these possibilities and other research problems in VR and AR, Seitz and UW colleagues Ira Kemelmacher-Shlizerman and Brian Curless have launched a new interdisciplinary center at UW. The UW Reality Lab, unveiled on Jan. 8, is being funded by Facebook, Oculus, Google and Huawei, and aims to advance the state of the art in virtual and augmented reality by developing new technologies and applications, educating the next generation of researchers and technologists, and supporting robust collaborations with industry.

After receiving his B.A. in computer science and mathematics from UC Berkeley and his Ph.D. in computer science from the University of Wisconsin–Madison, Seitz conducted research at Microsoft, working in the tech giant’s Vision Technology Group and Interactive Visual Media Group. His work with collaborators Noah Snavely and Rick Szeliski formed the basis of Microsoft’s Photosynth, one of the early photography apps that enabled users to create realistic 3D views of objects and locations from their still photos. Prior to joining the faculty at UW in 2000, Seitz was an assistant professor, and later an adjunct assistant professor, at the Robotics Institute at Carnegie Mellon University.

For Seitz, the old adage “right place, right time” plays a key role in how the next big research idea materializes.

“I’m a big fan of timing, where suddenly, due to one breakthrough, another becomes possible,” he adds. “A good example,” he explains, “is Photo Tourism, a project that would not have been possible even six months prior. It built on two new breakthroughs that had just arrived at the same time—Internet photo sharing, as in Flickr, and [Google research scientist] David Lowe’s SIFT work. This kind of serendipitous timing motivates many of my projects at UW and Google.”

At Google, Seitz has been involved in developing the imagery experience in recent versions of Google Maps. He has enjoyed seeing his work roll out now in several VR technologies, including Google Jump, Cardboard Camera, and VR mode in YouTube.
