Imagine being able to graffiti in the Louvre — on famous works of art. With augmented reality, anyone with a mobile device can do so with zero repercussions.
With the submissions deadline for SIGGRAPH 2019’s Appy Hour quickly approaching, we reminisced about some of the great apps on display in the 2018 showcase. Two apps in last year’s program came from one vibrantly creative pioneer in mobile augmented reality: Yosun Chang. Here, Yosun explains her two projects, where the ideas came from, and what she thinks about the future of mobile AR.
SIGGRAPH: Two of your apps were included in the SIGGRAPH 2018 Appy Hour. What does each do, and what makes them unique?
Yosun Chang (YC): When you make too much stuff, the hardest thing tends to be selecting what to show; you don’t want to overwhelm. I ultimately decided on the two app platforms to submit based on SIGGRAPH 2018’s “connected things” theme and on differing extremes of audience types, from indie to enterprise. I also wanted to represent areas of AR that weren’t as well explored.*
- AR Interfaces for IoT showed working examples of AR interfaces for controlling IoT objects. It seemed like a good fit for enterprise use and for people looking for very practical uses of AR. The idea came to me in the bathtub at the Sydney Harbour Hilton after giving a talk at JSConf Down Under 2012, right after a 16-hour flight from SF. I really wanted to reach over and dim the lights — but wouldn’t it be great if the year were 2032 instead, and my bionic eyes could load an AR interface for me to air-slide the dimmer to just the right brightness? Precision and device-free action at a distance. It took a few years for machine vision to catch up to the point where a hacker like me could hack out a prototype and get customers.
- WallText is actually several different apps I built involving AR text messaging. I’ve outlined each of them below.
Wall Secret lets you post secrets on walls and other surfaces in the real world. It’s PostSecret graffiti meets Instagram meets AR. It’s a bit whimsical — I felt it would appeal to both visual and wordy storytellers. Originally, Wall Secret was an experiment with SDF (signed distance field) fonts in augmented reality that turned out to look really good, and even better with LUT filters. I then turned that tech discovery into something I could use to fulfill my whimsical dream of posting secret messages anywhere in the world for others (or myself) to find later.
ArtformAR lets you annotate specific parts of an artwork; those annotations can later be discovered and extended by other discussion participants. This grew out of my own experience as a lonely traveler, visiting art museums in new cities and wishing to discuss the art when no one around me wanted to talk. What if it were possible to hold discussions across time on the fabric of a painting?
Faked.cam lets you augment fake chat screenshots on your friend’s phone. This was an idea I joked about with a friend who tried to help HoloYummy get its first paying customer. Wouldn’t it be silly if something like that became a killer app?
*One segment of AR apps that I easily ruled out was projective 3D-model apps — they’re the most common thing you’d see in the field, and I wanted to present forms of augmented reality that hadn’t been seen as much yet.
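The SDF text rendering Chang mentions behind Wall Secret boils down to one trick: store each glyph as a distance field and resolve the edge with a smoothstep, so text stays crisp at any AR viewing distance. A minimal sketch of that edge function in Python — the function name and the `edge`/`softness` parameters are illustrative assumptions, not code from Wall Secret, where this would live in a shader:

```python
def sdf_alpha(distance, edge=0.5, softness=0.05):
    """Map a sampled signed-distance-field value (0..1, with 0.5 at the
    glyph edge) to an opacity, using smoothstep for antialiased edges.

    Values well above `edge` are inside the glyph (opaque); values well
    below it are outside (transparent); the narrow `softness` band in
    between blends smoothly, which is what keeps SDF text crisp when
    scaled or viewed at an angle."""
    # Normalize distance into the [edge - softness, edge + softness] band
    t = (distance - (edge - softness)) / (2 * softness)
    t = max(0.0, min(1.0, t))       # clamp to [0, 1]
    return t * t * (3 - 2 * t)      # smoothstep interpolation

# Deep inside the glyph -> opaque; far outside -> fully transparent
print(sdf_alpha(1.0))  # 1.0
print(sdf_alpha(0.0))  # 0.0
```

In a real app the same math runs per-pixel in a fragment shader (e.g. GLSL’s built-in `smoothstep`), and an LUT color filter is then just a lookup of the resulting color in a small color-grading table.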
SIGGRAPH: Can you share a bit about your background? How did you come to this point?
YC: I love making things, and often come up with ideas that are too detailed or complicated to explain until I build them. And they haunt me if I don’t make them. So, I have to build fast. I learned everything I needed to build the things I wanted to build.
Like the Mad Hatter, I usually come up with a thousand ideas before breakfast. An entrepreneurial technologist at heart, I narrow the list down by market potential driven by firsthand insight into utility — and by making sure the concept uses enough next-gen technology to seem like magic while still being possible and something I can build, myself, right now! In the late ‘90s, when I first started as a prolific professional, the tools I used were Flash and Visual Studio; these days, it’s various computer vision solutions/AR SDKs and Unity (plus your server side of choice; mine is LAMP).
Between then and now, in the pursuit of building stuff, I became something of a generalist with too many specializations: from being an Autodesk 3D Studio Max Professional, to picking up OpenCV, Unity C#, CoreML, and dozens of other languages and frameworks (and my own), to being too good at hacking together minimum viable products, to winning all the big hackathons out there. (I also dabbled in an M.D.-Ph.D. that turned into a triple major in Physics-Philosophy-Bioengineering and then a B.S.-M.S. in Material Physics — and then a complete escape from academia and becoming a Shakespearean theatre director in a virtual world!)
My first mobile AR app was a ZBrush-inspired mess I called ClayAR in 2010, built using Unity 3.5 x Qualcomm Augmented Reality (now Vuforia). The goal was to combine multi-touch and smartphone features, like the gyroscope, with computer-vision AR tracking on a marker to make organic-form 3D modeling more intuitive, but I soon added so many features that I was forgetting what I’d already done — it turned, horrifically, into Maya.
I realized a few years ago that modern app development boils down to the prompt: How do you make complex software elegant? That’s the theme guiding the app platforms I build these days.
SIGGRAPH: It’s just the beginning of the New Year. How do you foresee mobile AR moving forward in 2019?
YC: AR used to be a cool “I’m gonna steal your phone for that” magic trick to show off, but I see mobile AR becoming common enough that most people would just expect it on their phones and in the everyday apps they use. I see AR, and by extension machine-learning computer vision, as a feature that an existing app may include, if it makes sense. The technology has been around for a long time, and it’s time that the tool gets properly used.
SIGGRAPH: What’s next for you in the realm of mobile and/or AR?
YC: I’m experimenting with methods that let artists create augmented reality filters and experiences without needing to learn tools beyond what they already know.
It’s inspired by an old hack I made in 2012 that lets you color in a coloring book to texture a 3D model. I started thinking about different utility forms based on an isomorphic tech stack: what if, instead of just coloring the paper, you could also fold it into an origami creature? That became PlayGAMI, born of a collaboration with a 3D artist and an origami guru for World Maker Faire NYC in September.
This then grew into a bigger-picture project: sur.faced.io. What if you could draw anything on any medium — paper, Photoshop, a Bamboo Slate, whatever — and instantly turn it into an AR filter? Come beta test!
Submissions to the SIGGRAPH 2019 Appy Hour close on 12 February 2019. Learn more at s2019.SIGGRAPH.org.
Yosun Chang has been building 3D software (and hacks) since 1998. She has built showcase applications for most mainstream depth-sensor devices; clients include Intel (Perceptual Computing/RealSense), Microsoft, Google (Project Tango + Glass), Mozilla, and more. Having taken the uncommon step of “always building something new every day” (for the last 10-plus years), with far more than 10,000 hours of experience making apps, she hacks together remarkably polished MVPs in hackathon settings, winning grand prizes at TechCrunch Disrupt Hackathon (twice), AT&T IoT Hackathon, Warner Brothers Shape, Microsoft Build, Intel Perceptual Computing Challenge (three times), and more. In short: she turns emerging tech into award-winning applications. She analogizes the products in her startup studio to the well-crafted wooden sculptures, hand-made furniture, and bespoke mansion a master woodworker would build for her own home and daily use.