The Current State Of PTEX And UVLess Workflows For Realtime Rendering
By Neil Blevins
Created On: May 21st 2015

As I have discussed before, there's more than one way to go about applying textures to your 3d models. But the most popular method is bitmaps applied to your model using UVs. Lots of reasons for this, but one of them is the industry push for GPU / Realtime Rendering. In this article, I hope to shed some light on how the desire for Realtime Rendering is shaping the way we texture our models.

History Of Pattern Placement

Original "computers" were specialized machines that had a single pre-programmed purpose. The machine was created for a specific task, and if you wanted to perform a different task, you needed a brand new machine. Then came stored programs and eventually the "personal computer", where the hardware was more generalized and you could write "software" to do many different tasks on the same hardware. You could in fact program a computer to perform new tasks that weren't even thought of when the hardware was first created. That started a period of great flexibility. Then came the growth of the computer graphics industry, and with it the desire to apply textures to 3d models...
So here we are in the present: we have procedurals, we have UVs, we have Projections, and we have PTEX. But for the most part, the most commonly used technique is still UVing objects and painting the bitmaps using either 2d or 3d paint techniques. Why is this technique still the most popular? Weren't people excited when UVless techniques started showing up? Don't UVless techniques help the artist achieve great results with less work? Why are we still using UVs then? While many causes could be pointed to, I feel the strongest pull comes from the lure of Realtime Rendering.

Video Cards Favor The UV Workflow

The holy grail in the games industry is fast frame rates. The faster the frame rate, the smoother the visuals, the better the gaming experience. This is even more important now that we are entering the world of Virtual Reality and Augmented Reality with devices like the Oculus Rift and HoloLens, where high frame rates aren't only desirable but actually necessary, or else people become motion sick.

Realtime graphics is the realm of the video card. In 2009 we started seeing all this promise from "GPU rendering", and the first few GPU renderers could perform some rendering functions at incredible speed. A good example is when mental images' iray first became available for 3dsmax: it was super fast. But the speed came with a cost, as you could only use a small portion of the standard 3dsmax features. Even today, with V-Ray RT and the more modern iray, we have the same problem. They're compatible with more features than they were in 2009, but still incompatible with a large number of them (See this VrayRT supported features chart). This caused a lot of frustration, since features people were used to relying on were suddenly unavailable.

While CPUs allow for a lot of flexibility, speed requires less flexibility and a stronger focus on single-purpose hardware. To achieve the highest speed on a video card, many base functions are part of the hardware itself. The video card expects your 3d software to give it the data in the way the video card wants, and deviating from that means you don't get the fast frame rates.

Video card technology was for the most part driven by the needs of the videogame industry, as it was the biggest customer. Now mobile devices have a huge say with the hardware manufacturers, as do, potentially, the VR and AR fields. But these markets have a lot more in common with the gaming industry than with film when it comes to technique and performance requirements. Since gaming and related fields are the largest market for videocards, and UVing is the most common way of texturing stuff in videogames, video cards are created specifically to speed up that particular workflow, to the detriment of other techniques.

PTEX On Video Cards

Part of the reason techniques like PTEX (despite their advantages over UVs) have had trouble gaining ground is that the segment of the industry currently most interested in accelerating techniques like PTEX on video cards (ie. the film market) is too small. The video card manufacturers aren't likely to improve the PTEX workflow on their hardware unless there's a lot of demand from their main customers (videogames, mobile). And the videogame industry overall hasn't been pushing the issue for a number of reasons...
Here's an article from 2012 by Sebastian Sylvan called Casting a Critical Eye on GPU PTex that contains a lot of useful information: it lays out many of the technical issues that would need resolving for PTEX to work well on video cards, and the comments section has a good discussion with the Mudbox team (who made a PTEX implementation for hardware), who feel Sebastian didn't give PTEX a fair shake.

FX Guide posted an interesting article called UDIM UV Mapping in May of 2014, discussing the advantages and disadvantages of UVs and PTEX. The article was very pro UVs, but two months later it was followed up by another article called PTEX, the other side of texturing, which got into more detail on the advantages of PTEX, and the advances Disney hopes to see in the area.

It's worth noting that the hardware manufacturers themselves have done some research into making PTEX a viable option on hardware. Here's an article from 2013 showing that Nvidia has made a PTEX implementation, at least at the R&D stage: Eliminating Texture Waste: Borderless Ptex. AMD as well: Radeon HD 7900 Series Graphics Real-Time Demos.

Asset Authoring "Realtime" vs Game Engine Realtime

One other distinction worth making is between Asset Authoring "Realtime" and Game Engine Realtime. These two areas have different requirements.
Perhaps UVless workflows, including PTEX, can work fine for the asset creation stage, just not for the game engine stage. The idea would be to use UVless techniques while you're making the asset, but then bake the result to textures assigned to Automatic UVs when you use the asset in the final product. Some sort of Automatic UVing exists in most asset authoring applications. You get UVs faster because you don't have to manually lay them out. But the disadvantage is that the UV layout tends to be messier, making it difficult to paint the results in a pure 2d paint program, and artifacts arise at the UV island borders, which are far more numerous and may not be placed in ideal spots unless hand edited.
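To make the baking idea concrete, here's a minimal sketch of the packing step, where each face's individually-painted texture (as PTEX allows, with per-face resolutions) is laid out in a single atlas and each face gets a UV rectangle. This is purely illustrative pseudocode in Python, not any real PTEX or engine API; the grid layout and padding values are my own assumptions, and real automatic UV tools use far smarter packing.

```python
# Hypothetical sketch: pack per-face square textures into one atlas and
# emit a UV rectangle per face, the way a "bake PTEX to Automatic UVs"
# step might. A simple uniform grid stands in for a real chart packer.
import math

def pack_faces_into_atlas(face_resolutions, padding=2):
    """face_resolutions: square texel size per face (PTEX-style, can vary).
    Returns (atlas_size, uv_rects), each rect as (u0, v0, u1, v1) in
    normalized [0,1] atlas space."""
    n = len(face_resolutions)
    cell = max(face_resolutions) + 2 * padding  # one cell fits any face
    cols = math.ceil(math.sqrt(n))              # near-square grid
    rows = math.ceil(n / cols)
    atlas_w, atlas_h = cols * cell, rows * cell

    uv_rects = []
    for i, res in enumerate(face_resolutions):
        col, row = i % cols, i // cols
        x0 = col * cell + padding  # padded border so texture filtering
        y0 = row * cell + padding  # doesn't bleed between islands
        uv_rects.append((x0 / atlas_w, y0 / atlas_h,
                         (x0 + res) / atlas_w, (y0 + res) / atlas_h))
    return (atlas_w, atlas_h), uv_rects

# Four faces painted at different per-face resolutions:
size, rects = pack_faces_into_atlas([64, 64, 128, 32])
print(size)      # atlas dimensions in texels
print(rects[2])  # UV rectangle for the 128x128 face
```

Note the tradeoff the article describes falls out of even this toy version: every face becomes its own UV island, so the number of island borders (and potential filtering seams) is much higher than in a hand-made layout.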

Here are some examples of using UVLess techniques at the Asset Creation stage...

As you can see, this is a really complex issue, with a lot of moving parts controlled by a lot of different groups, from customers to software companies to hardware companies to entire industries. Trying to get all of these things to align is a really tough job and takes a lot of time. Realtime and interactive rendering have major advantages. And UVless workflows really help the artist spend more time on the art and less time on the technical. But right now these two things don't work as well together as we'd like. My hope is that eventually we will be able to have our cake and eat it too. But to have that, we need a push from all of the artists and technical folk in all of the graphics related industries. What we'd need to see...
If we want realtime at both the asset and shot/game stage of the pipeline, and the ability to use PTEX in both, we're going to need these changes to occur. And the only way to get that to happen is with your help. Can we have flexibility AND speed in the future? My hope is yes.

This site is ©2015 by Neil Blevins, All rights are reserved.