Image credit: Shutterstock with Photoshop edit
As artificial intelligence continues to influence film and animation, questions of authorship, ethics, and creative control have moved from theory to reality. We spoke with Nikki Tomaino, a researcher focused on AI governance in film and animation, about how thoughtful oversight can protect what makes creative work human and how clear governance can help studios innovate responsibly while keeping quality and originality at the forefront.
SIGGRAPH: You’ve worked at major studios and now focus on AI governance. What inspired your shift from creative production to studying AI ethics and regulation?
Nikki Tomaino (NT): I shifted toward AI governance because the questions that will shape 3D production are already here; it is now my primary research focus. My production background keeps me centered on authorship, credit, clean data, and the quality bar that animation is known for. Artists make that quality. AI is a tool, not the author. It is a safe bet that many studios are already experimenting with AI, often quietly.
If studios have an appetite to ethically build internal foundation models using licensed catalogs, artists should help design the tools, define evaluation criteria, sit on model audits, judge efficacy against show-ready quality, and advise on dataset choices for fine-tunes or LoRAs.
I study AI governance from a production lens. I want artists at the table. If we refuse the tools, we lose the chance to shape them. My focus is simple: Clear data rights, clear credits, and practical rules teams can use.
In practice, that means written license paths for any dataset under consideration, an explicit consent and attribution policy for artist work, and lightweight review gates that keep pipelines accountable without slowing delivery.
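To make that concrete, here is a minimal, purely illustrative sketch of what such a lightweight review gate might check before a dataset moves forward. The DatasetRecord fields and the passes_review_gate function are hypothetical stand-ins invented for this example, not any studio's actual schema:

```python
from dataclasses import dataclass, field

# Hypothetical provenance record for one dataset under consideration.
# Field names are illustrative, not a real studio schema.
@dataclass
class DatasetRecord:
    name: str
    license_path: str            # written license on file, e.g. a contract reference
    artists_consented: bool      # explicit consent documented for artist work
    attribution_policy: str      # how contributing artists are credited
    sources: list[str] = field(default_factory=list)  # traceable inputs

def passes_review_gate(record: DatasetRecord) -> list[str]:
    """Return blocking issues; an empty list means the dataset may proceed."""
    issues = []
    if not record.license_path:
        issues.append("no written license path on file")
    if not record.artists_consented:
        issues.append("artist consent not documented")
    if not record.attribution_policy:
        issues.append("no attribution policy")
    if not record.sources:
        issues.append("inputs are not traceable")
    return issues

# Example: a dataset with a missing consent flag is blocked, not shipped.
record = DatasetRecord(
    name="licensed-catalog-2024",
    license_path="contracts/catalog-2024.pdf",
    artists_consented=False,
    attribution_policy="credit roll + internal registry",
    sources=["studio scan library", "licensed shader pack"],
)
print(passes_review_gate(record))  # ['artist consent not documented']
```

The point of a gate this small is that it can sit in a pipeline without slowing delivery while still making consent, licensing, and traceability visible at a glance.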
SIGGRAPH: Many artists and creators are grappling with the impact of AI on ownership, authorship, and identity. From your experience, what are the ethical concerns you hear most often, and which do you think deserve more serious attention?
NT: The two concerns I hear most are theft and replacement.
On theft: People worry AI is just taking from other artists. If you have worked in 3D production, you likely owned a single discipline or a narrow slice across a few. You were building shared parts of a larger whole. Asset stores, scan libraries, motion capture, and shader packs are standard. That is collaboration when the terms are clear. On the question of ethics, I do not hand out rulings; what counts as ethical is personal. My filter is simple: Did people consent, are the licenses valid, and can we trace what went into the dataset? If those answers are visible, there is something concrete to evaluate. Reasonable people will still disagree, and that is OK.
On replacement: The fear is rational. Budgets reward speed. Management pressure is real. Power is uneven between studios and individual artists. And we have history here, for example, the outsourcing waves that hollowed out junior ladders. AI can compress tasks, mask who did the work, and push credit further from the artist. If review is rushed, quality drops while accountability gets fuzzy. That is why people worry, and I share that concern.
SIGGRAPH: What insights from your research or training have helped you better understand how governance can influence innovation?
NT: Governance helps innovation when it protects what makes film and animation valuable — human originality. If a studio treats AI like a shortcut without clear rules, the work drifts toward the derivative, and directors will walk when they cannot see real differentiation. Good governance does the opposite. It creates safe room to invent.
I am not prescribing one workflow. I treat all AI output as raw material that passes through an artist’s hands, give artists veto power, and judge tools on whether they support intent and save time. Make inputs visible: what data was used, what is cleared, and what the tool is not for. If a studio explores in-house models on licensed catalogs, include artists in evaluation and dataset choices so the craft bar stays high. Human authorship first, clear inputs, light checks — that is the mix that keeps quality strong and invites real invention.
SIGGRAPH: You’ve talked about the importance of building responsible workflows from the ground up. What does that mean to you in practical terms — where should teams start?
NT: Right now, I recommend starting with discovery. Research means learning how the current models are built, what data and choices shaped them, where they help, and where they fail creative originality. Gather evidence first, not endorsements. Learn in public; do not bless a pipeline. Ask artists and supervisors where AI might help or hurt. Run small, time-boxed studies with a clear stop rule. Nothing ships. Measure assist value in human terms: time saved, iteration quality, and how much creative control stays with the artist. Track what data went in and any open risks. Decide what worked, what failed, and what to try next or stop doing. Only then write the first workflow, keep humans in charge, name an owner, and review on a regular cadence.
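As a rough illustration of what a time-boxed study with a clear stop rule could record, here is a hypothetical sketch. The PilotStudy structure, the metric names, and the thresholds are invented placeholders for the example, not a recommended standard:

```python
from dataclasses import dataclass

# Hypothetical record of one time-boxed assist study; metric names and
# the stop-rule thresholds are illustrative, not prescriptive.
@dataclass
class PilotStudy:
    tool: str
    weeks_allotted: int
    hours_saved_per_shot: float      # assist value in human terms
    iteration_quality_delta: float   # artist-judged change, -1.0 to 1.0
    artist_control_retained: float   # fraction of creative decisions kept by artists
    data_inputs_logged: bool         # provenance tracked for everything that went in

    def stop(self) -> bool:
        """Clear stop rule: halt if the tool erodes quality, control, or traceability."""
        return (
            self.iteration_quality_delta < 0
            or self.artist_control_retained < 0.9
            or not self.data_inputs_logged
        )

study = PilotStudy(
    tool="layout-assist-prototype",
    weeks_allotted=4,
    hours_saved_per_shot=1.5,
    iteration_quality_delta=0.1,
    artist_control_retained=0.85,
    data_inputs_logged=True,
)
print(study.stop())  # True: creative control dipped below the bar, so the study halts
```

Writing the stop rule down before the study begins is the design choice that matters; it keeps the decision to continue or halt from drifting toward whoever is most enthusiastic about the tool.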
SIGGRAPH: In your view, what’s often missing from public conversations about AI and creativity — especially from those who haven’t worked inside production pipelines?
NT: Two loud takes seem to dominate the public conversation: AI will take all the jobs, and AI will flood everything with slop. Parts of that can happen, but people outside production miss how films are actually made. Production is a team sport where tools assist and artists decide. Integration costs are real because a model has to fit asset naming, color, security, and schedules, or it creates rework. Rights are not abstract. Datasets, scans, textures, and audio come with consent and provenance attached. Quality has gates, with dailies and QC keeping the bar high, so raw model output is not the finish line.
Jobs do change, and ladders matter. If we do not plan for juniors, credits, and training, the squeeze gets worse. My take: AI will be part of our industry for the long run, and I still want to make great films, so I am adapting with care. That means measuring assist value in human terms, such as time saved and creative control kept with artists; bringing artists into evaluations; and making inputs visible so choices can be reviewed.
SIGGRAPH: Looking ahead, what gives you optimism about the future of AI ethics and governance, and where do you see the biggest opportunities for progress?
NT: My take is cautious, not rosy. I am adapting to protect the craft, not to hype a tool. What gives me optimism is that the same things that make great films also create pressure for better governance: Directors want differentiation, studios want reputational safety, and audiences notice quality.
The biggest opportunities are practical ones inside production. Start with culture, not the tool. Give artists room to say where AI helps and where it hurts, and note that small crews can turn assists into an advantage, beating larger studios that chase only the bottom line.
Progress for me looks like studios choosing to learn in public, inviting artists into the decisions, and valuing originality as a core goal. I am not naive about the politics. I am here because I want to have a say in how this unfolds.
As the conversation around AI in film and animation continues to evolve, one thing remains clear: The future of creativity depends on balance. Governance is not a barrier to innovation but a framework that protects the people and processes behind every frame. By giving artists a voice in how these tools are developed and used, the industry can ensure that technology amplifies — rather than replaces — human imagination.

Nikki Tomaino is a Human-Centered AI Product Builder with over 20 years of experience leading creative teams, building pipelines, and developing innovative tools at studios like Blue Sky, Epic Games, and Spire. She recently earned her AI Product Management certification through IBM, along with badges in Neural Networks and Deep Learning, and completed several advanced courses in AI governance, regulation, and ethics. She is especially focused on helping build products that protect IP and align with emerging governance standards in a landscape that is complex and far from solved.
A few years ago, Nikki co-founded a studio during one of the most uncertain moments in the industry. The studio built tools to solve real creative production problems. While she has not formally held an AI PM title, she has led the kind of work the role requires: she developed a patent-pending AI application, managed end-to-end production pipelines, and drove strategy through every stage of delivery. Her focus now is on AI governance and product-level decision-making, and she continues to study how neural models are built and trained to know the “hows” that follow the “whys.” This experience reinforced what has always driven her career: curiosity, creative collaboration, and the belief that innovation only matters if it protects creators and strengthens the communities we build for.