Ray Tracing Will be Everywhere in 2020

8 October 2019

Not just PCs, but consoles and mobile devices will show the rays

Photo by Davide Cantelli on Unsplash

Ray tracing, a long-held goal for computer graphics, has been used in the movies, CAD design, and visualization for decades. Its physically accurate, beautiful images have always come at a price, namely lots of computing time and power. But Moore’s Law and clever algorithm developers never seem to rest, and so the constraints that kept ray tracing, and the long-sought-after dream of real-time ray tracing, out of reach have been steadily reduced.
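To see where all that computing time goes, consider the core operation every ray tracer repeats millions of times per frame: testing a ray against scene geometry. The minimal Python sketch below (an illustration only, not any vendor’s implementation) shows a single ray-sphere intersection test; multiply it by one or more rays per pixel, per bounce, per frame, and the real-time challenge becomes obvious.

```python
# Minimal ray-sphere intersection test, the basic building block of a
# ray tracer. Solves the quadratic for t where the ray o + t*d meets
# the sphere |p - center| = radius. Illustrative sketch only.
import math

def hit_sphere(origin, direction, center, radius):
    """Return the nearest positive hit distance t, or None on a miss."""
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)  # nearer of the two roots
    return t if t > 0 else None

# A ray from the origin looking down -z at a unit sphere 5 units away:
print(hit_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # 4.0
```

A real-time renderer cannot afford to run this test against every object for every ray, which is why modern RTRT hardware and software lean on acceleration structures (such as bounding volume hierarchies) to cull most of these tests.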

Demonstrations of real-time ray tracing were made by Intel in 2009, when the company was developing its Larrabee project, but they required a rack of servers. Then, in late 2017, Nvidia showed real-time ray tracing using several AIBs. Shortly thereafter, I predicted we’d have ray tracing in PCs by 2023. Hmmmm.

SIGGRAPH 2018 marked the breakout year for real-time ray tracing when Nvidia and Adshir demonstrated real-time ray tracing (RTRT) — in a PC and a mobile device, respectively — something we didn’t expect to see for several more years.

Recently, Morgan McGuire, an Nvidia researcher, predicted that by 2023 we will see the first AAA games that require an RTRT-capable GPU.

Since the fall of 2018, Nvidia has been the catalyst and promoter of RTRT, leading, pulling, and in some cases pushing game developers, as well as professional 3D modeling and rendering software suppliers, to take advantage of the PC hardware. Those software developers have then pushed AMD to give them RTRT on the consoles that are expected this holiday season. And, Intel, which will be introducing a PC AIB next year, is promoting games that don’t need special hardware and can make use of a powerful CPU.

Console, too

AMD’s console processors — variations of its popular APUs found in entry-level notebooks — are expected to also make use of the CPU and have a modicum of GPU support for RTRT, combined with some software tricks of the type hinted at by Sony at E3.

Microsoft and Sony have both indicated, perhaps a bit obliquely, that their new consoles will offer ray tracing support. The games will be there, re-ported from the PC versions, and probably a couple of exclusive titles. Imagine how great the shiny faceplate of Master Chief’s helmet will look.

Sony’s “Red Dead Redemption” will get reflective lakes and rivers, saloon windows, and the glint of the hero’s teeth.

Consoles have more third-person adventure games, which will make better use of ray tracing because the player will be able to stop and see the shadows, reflections, and other effects.

Nintendo may be the odd man out; it has shown no interest in, or commitment to, ray tracing for the upcoming Switch.

However, the immersive impact of ray tracing on big HDR TV screens is going to be sensational and will increase the demand for such games on all platforms: PC, mobile, and console.

And Mobile Devices

While Nvidia was in the big tent at SIGGRAPH 2018, wowing the world with its big-iron, massive new RTRT chip, tucked a few blocks away in a modest hotel was Israel-based Adshir, which was showing the first RTRT scenes in AR on a tablet.

Adshir’s dinosaurs will crawl onto your table

Adshir’s is a total software solution that uses Intel’s Embree ray tracing kernels and libraries, though, the company says, it is neither limited nor contractually obliged to use Embree; it chose Embree because it thought it the most popular. Embree has been used primarily on large HPC machines for scientific visualization, where renders take hours to days, so the fact that Adshir could get it to run on a mobile device in real time is quite a compliment to the efficiency of Intel’s software.

At SIGGRAPH 2019, Adshir showed new versions that could run on a smartphone, too. As more consumer-class (i.e., less conspicuous) AR smart glasses appear in the market (there are five now), ray tracing will come with them. And that, in turn, will drive the demand and the expectation for ray tracing.

What About Streaming Gaming?

Streaming gaming may benefit the most from ray tracing because the cloud will have the latest, most powerful AIBs and be able to devote maximum power to them. If you want to play a ray-traced game but don’t have the budget for a new AIB and the games, cloud gaming is going to delight you. You’ll get all the effects, interactively, and at a fraction of the cost.

Metro Exodus, released in mid-February 2019, delivered some of the most expansive ray tracing support of any game yet

Streaming gaming will need something to get users’ and potential users’ attention to try the service, and ray tracing could be just the thing.

Who Else, What Else Will Bring Ray Tracing to Us?

In addition to the PC suppliers like AMD, Intel, and Nvidia, the console supplier AMD, and the mobile supplier Adshir, two other firms are offering IC IP for ray tracing: Imagination Technologies and SiliconArts. IP solutions take a long time to show up in products, so don’t expect any big announcements from either of these companies, or their customers, for two years or more.

Say “Thank You” to Nvidia

So, the entire CG community owes Nvidia a big “thank you.” If Nvidia hadn’t committed itself to the goal of RTRT, and accepted the challenge of building and selling the biggest consumer ASIC ever (754 mm²), and convinced the leading game companies to add ray tracing to their games, it would still be an academic discussion with a few interesting demos.

Now, it’s a thing…a thing every new game and device has to have.

If you want to learn more about it, there are two excellent new books you can read: My book will give you a background and overview of the technology, and Nvidia’s book will give you tools and examples for how to employ and deploy ray tracing.

Jon Peddie Research

Jon Peddie Research is a technically oriented marketing, research, and management consulting firm. Based in Tiburon, California, JPR provides specialized services to companies in high-tech fields including graphics hardware development, multimedia for professional applications and consumer electronics, entertainment technology, high-end computing, and Internet access product development.
