Tracing Life From Above

30 October 2025 | Art, Conferences

Image credit: Yuki Wong © 2025 ACM SIGGRAPH

In an age defined by data and digital perception, how we see the natural world is increasingly shaped by technology. “AI in the Sky” explores this evolving relationship through the story of Encephalartos woodii — a rare cycad species believed to be extinct in the wild — and the bigger question of how AI might help us perceive what often remains unseen.

Presented at SIGGRAPH 2025, this Art Gallery installation invites reflection on what it means when technology becomes both observer and interpreter. We caught up with Dr. Laura Cinti to discuss her journey and what it meant to share this work at SIGGRAPH.

SIGGRAPH: What first sparked the idea to pair the story of an extinct cycad with AI and drone technology? Was there a moment when the concept “clicked”?

Dr. Laura Cinti (LC): At the start of this project, I was researching cycads and their now-fragile existence. These ancient plants are living reminders of a bygone era and survivors of several mass extinction events, including the one that wiped out the dinosaurs around 66 million years ago. I was surprised to learn that despite their resilience, one of the greatest threats to their survival today is poaching — the reason some species, such as Encephalartos woodii (E. woodii), central to this project, are secured in iron cages and fitted with alarms in botanical gardens.

What drew me to E. woodii is that it exists only as cloned males, all propagated from a single plant discovered in South Africa’s oNgoye Forest more than a century ago. No female has ever been found, and without one, sexual reproduction is impossible. E. woodii’s survival depends on human care and propagation.

As with many of my projects, this one began by exploring a fragmented archive, which in the case of E. woodii included colonial expeditions, botanical correspondence, and experimental attempts to recreate the species. While working through these materials, I came across an interview with ecologist Dr. Debbie Jewitt, who spoke of using drones to monitor wildlife populations, such as crocodiles, in South Africa. Almost in passing, she mentioned the possibility of applying similar methods to endangered plants. That single comment struck me, and I immediately contacted her. Together, we began to explore how drones and remote sensing might offer a new perspective in the search for E. woodii in the place it was last found — the oNgoye Forest. That was the point where the project “clicked.”

As earlier searches had been carried out on foot, using drones to survey and map the oNgoye Forest was a new approach. We included multispectral imaging to detect cycads based on their unique spectral signatures and, later, AI models trained to recognize cycad shapes within the dense forest canopy.
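Multispectral detection of vegetation typically works by combining spectral bands into an index that separates plant matter from soil and shadow. One widely used example is NDVI (Normalized Difference Vegetation Index); the interview does not state which indices or bands the team actually used, so the following is only an illustrative sketch of the general idea:

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Values near +1 indicate dense, healthy vegetation; values near 0
    indicate bare ground or rock. `eps` guards against division by zero."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)

# Toy 2x2 reflectance bands: top row vegetated (high near-infrared),
# bottom row bare soil (NIR roughly equal to red)
nir = np.array([[0.8, 0.7], [0.3, 0.3]])
red = np.array([[0.1, 0.2], [0.3, 0.3]])
index = ndvi(nir, red)
```

A species-level "spectral signature" goes further than a single index, comparing reflectance across many bands against a reference profile, but the thresholding logic is similar.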

For me, pairing a reproductively extinct plant with aerial AI observation was more than a practical decision; it also carried a conceptual symmetry. The drone’s distant vantage mirrored the plant’s estrangement from natural cycles of renewal, both existing in mediated states dependent on human or technological intervention. What began as an ecological investigation gradually became an inquiry into perception itself — how technologies of vision shape what we recognize as living, and how absence might be rendered visible through machine attention.

SIGGRAPH: What challenges did you face translating this idea from a scientific or ecological premise into a gallery experience people could feel?

LC: One of the main challenges was finding a way to convey our data-driven processes within a visual narrative framework. A lot of thinking went into how these processes come together as a story of our unfolding search: how multispectral data captures features of the forest invisible to the human eye, the role of synthetic images in AI training, and the creation of a data-driven narrative that explores new ways of seeing. The outputs and processes are often abstract, which can create a fragmented experience of searching through the forest. I wanted to approach this through a dynamic visual story that merges human and machine perspectives, bringing together mosaic maps, multispectral data, and AI detection to tell the story — first from the AI’s point of view, and then through a human voice. To me, it is a kind of weaving together of technological insight, ecological investigation, and artistic speculation, using the datasets as the foundation.

SIGGRAPH: The project uses drone imaging and AI-driven analysis to search for a missing biological counterpart. Can you share how you approached the technical necessity of teaching AI to interpret and navigate natural environments?

LC: I spent a lot of time analyzing mosaic maps to look for cycads. It is a tedious task, and the more terrain we cover, the bigger the task becomes. The idea of integrating AI was initially intended to speed up the search process by automating detection. Working with my long-term collaborator Dr. Howard Boland, who had previously trained models commercially using convolutional neural networks (CNNs) for highly effective real-time object detection, we aimed to build a fine-tuned model to detect cycads over larger areas, with the ultimate goal of identifying them from a live feed.
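Running a CNN detector over large orthomosaic maps, as described here, generally means tiling the mosaic into overlapping patches and running inference on each. The interview gives no implementation details, so this is a minimal sketch of the tiling step only, with illustrative tile size and stride:

```python
import numpy as np

def tile(mosaic, size=64, stride=48):
    """Cut an orthomosaic into overlapping square patches for CNN inference.
    Overlap (stride < size) ensures an object straddling a tile boundary
    appears whole in at least one patch. Returns the stacked patches and
    their (y, x) offsets for mapping detections back onto the mosaic."""
    H, W, _ = mosaic.shape
    patches, offsets = [], []
    for y in range(0, H - size + 1, stride):
        for x in range(0, W - size + 1, stride):
            patches.append(mosaic[y:y + size, x:x + size])
            offsets.append((y, x))
    return np.stack(patches), offsets

mosaic = np.zeros((160, 160, 3), dtype=np.uint8)  # stand-in mosaic map
patches, offsets = tile(mosaic)
```

Each patch would then be fed to the fine-tuned detector, and any hits translated back to mosaic coordinates via the stored offsets.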

The first challenge I faced was constructing a dataset appropriate for AI training. I began collecting images of cycads, but most of these were captured from the ground rather than from above. To get closer to the perspective we needed, we used drone photography to capture cycads in botanical and private gardens, but this still gave me a limited amount of data. To expand the dataset, we began generating synthetic images of cycads and merging them into different forest environments. This gave the model enough variety to start “seeing” cycads across various contexts and to recognize them from the air in ways that otherwise would have been impossible.
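Generating synthetic training images by merging object imagery into varied backgrounds is a standard augmentation technique, often done by alpha-compositing a foreground sprite into a scene and recording its bounding box as the training label. The project's actual pipeline is not described, so the function and parameters below are purely illustrative:

```python
import numpy as np

def composite(background, sprite, alpha, rng):
    """Paste an RGB sprite into a background at a random position using
    an alpha mask, and return the image plus the sprite's bounding box
    (x_min, y_min, x_max, y_max) for use as a detection training label."""
    H, W, _ = background.shape
    h, w, _ = sprite.shape
    y = int(rng.integers(0, H - h + 1))
    x = int(rng.integers(0, W - w + 1))
    out = background.astype(float).copy()
    region = out[y:y + h, x:x + w]
    out[y:y + h, x:x + w] = alpha[..., None] * sprite + (1 - alpha[..., None]) * region
    return out.astype(np.uint8), (x, y, x + w, y + h)

rng = np.random.default_rng(0)
forest = rng.integers(0, 255, (256, 256, 3), dtype=np.uint8)  # stand-in forest tile
cycad = np.full((40, 40, 3), 90, dtype=np.uint8)              # stand-in cycad sprite
mask = np.ones((40, 40))                                      # fully opaque alpha mask
img, box = composite(forest, cycad, mask, rng)
```

Varying the background, position, scale, and alpha edges across many such composites gives the model the visual variety needed to recognize the object in contexts it was never photographed in.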

This process served both a technical role in the search and later became part of the creative storytelling. Beyond improving detection accuracy across varied environments, it became a method of exploring how AI interprets and navigates natural forms. The outputs — training datasets, synthetic images, and detection visualizations — were subsequently reframed as visual narratives, revealing the AI’s “perception” of the world. I wanted the AI to do more than just assist; I felt it was important that it acted as a perceptual partner, not only speeding up the search for E. woodii but also helping me explore how life itself can be visualized, interpreted, and imagined through technological mediation.

SIGGRAPH: Were there any surprising moments where the AI revealed patterns or forms you didn’t anticipate?

LC: Yes, and these moments were quite compelling. Occasionally, the AI identified patterns that superficially resembled cycads, but over time these detections took on a more poetic significance for me. They revealed the dual nature of machine vision — its ability to extend perception beyond human-defined boundaries while also projecting imaginative interpretations onto complex environments.

There is still much work to be done in this area, such as ground-truthing with the maps, refining detections, and exploring how the growing array of technological tools opens new and exciting possibilities. The AI works surprisingly well in private and botanical gardens, where conditions are more controlled, but detection is more challenging in forest environments. We have also explored LiDAR and first-person view drones to determine whether they can further add value to our toolkit.

What has been more troubling is that, from both the ground and the air, we are finding far fewer cycads than expected in the areas of the forest we have been surveying. This potentially highlights the scale of their removal from the wild and the broader threat of cycad extinction in South Africa. Perhaps the most surprising and sobering revelation has been — and continues to be — that the disappearance of these plants is happening largely out of sight.

SIGGRAPH: SIGGRAPH brings together technologists, artists, and researchers. What’s been the most meaningful part of presenting this piece in that environment?

LC: I have exhibited in a variety of settings, but what makes SIGGRAPH unique is its audience and the breadth of the conference itself. It brings together art, science, engineering, and critical thinking about the world — exploring new tools, new ways of seeing, and the cross-pollination of ideas across disciplines. In many ways, it offers a glimpse of what tomorrow looks like — a perspective I think about often in my own work.

“AI in the Sky” sits at the intersection of ecological investigation, science, engineering, creative approaches to technology, and artistic speculation. Presenting it at SIGGRAPH allows me to share the project within a multidisciplinary context, reflecting its layered nature. I am deeply grateful for the opportunity to share my work in an environment that has always been an inspiration to me and continues to be a place where art, science, and technology meet.

“AI in the Sky” is more than a search for a lost species — it’s a look into how technology reshapes our understanding of nature. Save the date for SIGGRAPH 2026 in Los Angeles, 19-23 July, where new works in the Art Gallery will continue exploring the evolving intersections of art, science, and technology.


Laura Cinti is a research-based artist whose practice intersects science, technology, and visual storytelling through experimentation and field research. Often engaging with plants and their entanglements with technology, her work transforms research processes into artworks that merge ecological investigation with artistic speculation. Her artworks have been exhibited internationally.

“AI in the Sky” has been presented at TEDAI (2025) and featured in media outlets such as The Times, BBC, The Independent, National Geographic, Smithsonian Magazine, and BBC’s Have I Got News for You.
