The Current State Of PTEX And
UVLess Workflows
By Neil Blevins
Created On: May 21st 2015
As I have discussed
before, there's more than one way to go about applying textures to
3d models. But the most popular method is bitmaps applied to your model
using UVs. There are lots of reasons for this, but one of them is the industry
push for GPU / Realtime Rendering. In this
article, I hope to shed some light
on how the desire for Realtime Rendering is shaping the way we texture our 3d models.
History Of Pattern Placement
Original "computers" were specialized machines that had a
single pre-programmed purpose. The machine was created for a specific
task, and if you wanted to perform a different task, you needed a brand
new machine. Then we saw the birth of punch cards and the "personal
computer", where the hardware was
more generalized and you could write "software" to do many different
with the same hardware. You could in fact program a computer to perform
new tasks that weren't even thought of when the hardware was first
created. That started a period of great flexibility. Then started the
growth of the computer graphics industry, and the start of the desire
to apply textures to 3d models...
- Procedurals: The first
way of applying textures to a surface was using procedural patterns.
Memory was at a premium, and procedurals made sense since they took
up little memory.
- Bitmaps and UVs: As
memory increased and techniques
expanded, another popular texturing method appeared: a bitmap was
wrapped onto your surface using an atlas, or what we later referred to
as UVs. Using
bitmaps allowed for a larger variety of patterns than you could easily
achieve with procedurals, and they were also frequently faster to render
and easier to anti-alias. Nurbs surfaces and
patches, whose UVs were locked to their topology, took over the film
industry. But UVs for polygons (which were very popular in the video
game industry) allowed the artist to make their own atlas. Film
eventually moved in the same direction when subdivision surfaces
replaced nurbs and patches.
- Projections: Then there
were projections: the idea of taking bitmaps, but rather than
tying them to an atlas, projecting the bitmap onto your surface from
a projector source. This had the advantage of not requiring time
consuming UV setup.
And while the most common use was matte painting, projections were also used
frequently to texture assets.
- 3D Paint: Around this era
we also saw
the first stabs at 3d paint programs, where you could paint directly on
your 3d surface. The results of your 3d paint strokes would get baked
to 2d UVs; saving the paint in a 3d format of some sort never really caught on.
- PTEX: As time went on,
and our 3d scenes became more and more
complex with bigger meshes and more objects, one thing was clear:
setting up good UVs on objects was a very time consuming task, and not
a lot of fun. Disney threw another contender into the mix in 2010: PTEX.
The idea of
PTEX is that each face gets its own uv space. You either use a 3d paint
program or bake a procedural texture onto your surface, and write the result
to a PTEX file. The big advantage: no UVing necessary. People were
super excited at the possibility that UVing was on the way out.
So now here we are at the present: we have procedurals, we have UVs, we
have projections and we have PTEX. But for the most part,
the most commonly used technique is still UVing objects and painting
bitmaps using either 2d or 3d paint techniques. Why is this technique
still the most popular? Weren't people excited when UVless techniques
started showing up? Don't UVless techniques help the artist achieve great
results with less work? Why are we still using UVs then? While many
causes could be pointed to, I feel the strongest pull comes from the
lure of Realtime Rendering.
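To make the "each face gets its own uv space" idea concrete, here's a minimal sketch in Python. The class and method names are my own illustration, not the real Ptex API: each face stores its own small texel grid at its own resolution, and lookups take a face id plus a local (u, v) in 0..1 instead of coordinates into one shared atlas.

```python
# Minimal sketch of the PTEX idea: every face owns its own small
# texture, addressed by (face_id, u, v) instead of one shared UV atlas.
# Names here are illustrative, not the real Ptex API.

class PerFaceTexture:
    def __init__(self):
        self.faces = {}  # face_id -> (width, height, texels)

    def add_face(self, face_id, width, height, fill=0.0):
        # Each face can have its own resolution, e.g. 4x4 or 256x256.
        texels = [[fill] * width for _ in range(height)]
        self.faces[face_id] = (width, height, texels)

    def paint(self, face_id, u, v, value):
        w, h, texels = self.faces[face_id]
        x = min(int(u * w), w - 1)
        y = min(int(v * h), h - 1)
        texels[y][x] = value

    def lookup(self, face_id, u, v):
        # Nearest-neighbor lookup; real Ptex also filters across face
        # borders, which is one of the hard parts to do on a GPU.
        w, h, texels = self.faces[face_id]
        x = min(int(u * w), w - 1)
        y = min(int(v * h), h - 1)
        return texels[y][x]

tex = PerFaceTexture()
tex.add_face(0, 4, 4)      # a small, flat face gets a tiny 4x4 grid
tex.add_face(1, 256, 256)  # a detailed face gets 256x256
tex.paint(0, 0.9, 0.1, 1.0)
print(tex.lookup(0, 0.9, 0.1))  # -> 1.0
```

The per-face resolution is the appeal: texel density goes where the detail is, with no atlas layout step.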
Video Cards Favor The UV Workflow
The holy grail in the games industry
is fast frame rates. The faster the frame
rate, the smoother the visuals, the better the gaming experience. This
is even more important now
that we are entering the world of Virtual Reality and Augmented Reality
with the Oculus, HoloLens, etc, where high frame rates aren't only
desirable but are actually necessary, or else people become nauseous.
Realtime graphics is the realm of the video card. In 2009 we started
seeing all this promise from "GPU rendering", and
the first few GPU renderers
could perform some rendering functions at incredible speed. A good
example is when mental
images' iray first
became available for 3dsmax: it was super fast. But the speed came with
a cost, you could only use
a small portion of the standard 3dsmax features. Even today, with V-Ray RT
and the more modern
iray, we have the same problem. They're compatible with more stuff than
they were in 2009, but
still incompatible with a large number of features (see this VrayRT
supported features chart). This caused a lot of frustration, since
features people were used to using were suddenly unavailable.
While CPUs allow for a lot of flexibility, speed on a video card comes
from being less flexible and from a stronger focus on single purpose
hardware. To achieve the highest speed on a video card, many base
functions are part of the hardware itself. The video card expects your
3d software to give it the data in the way the video card wants, and
deviating from that means you don't get the fast frame rates.
Video card technology was for the most part driven by the needs
of the videogame industry, as they were their biggest customer. Now
mobile devices have a huge
say with the hardware manufacturers, as well as potentially the VR and
AR field. But these markets have a lot more in common with the gaming
industry than with film when it comes to technique and performance
requirements. Since gaming and related fields are the largest market
for videocards, and UVing is the
most common way of texturing stuff in
videogames, video cards are created specifically to speed up that
particular workflow, to the detriment of other techniques.
PTEX On Video Cards
Part of the reason techniques like PTEX
(despite their advantages over UVs) have had trouble gaining ground is
that the segment of the
industry currently most interested in seeing techniques like PTEX
accelerated on video
cards is too small (ie. the film market). The video card manufacturers
aren't likely to
improve the PTEX workflow on their hardware unless there's
a lot of demand from their main customers (videogames, mobile). And
the videogame industry overall hasn't been pushing the issue for a
number of reasons...
- They have built so much of their pipelines and expertise around
UVs; a big change now would be difficult in terms of software and pipeline.
- The old chicken and egg problem. Video cards aren't optimized to
allow for the PTEX workflow, so the video game folk don't even consider
it an option.
Since they haven't experienced the advantages, they don't ask for it.
And so the card manufacturers don't put it on
the video cards, hence video cards aren't optimized to allow for the
PTEX workflow.
- Lackluster authoring software support.
- Technical hurdles that would need to be addressed to see PTEX work in
games (quad workflows, memory overhead from border filtering, etc).
These technical hurdles are often given as the main reason for
PTEX's slow adoption; however, the people I've spoken to feel these
issues are all solvable if the will is there.
Here's an article from 2012 by Sebastian Sylvan called Casting
a Critical Eye on GPU PTex that contains a lot of
useful information, both showing many of the
technical issues that would need resolving to see PTEX work well on
video cards, and in the comments section a good discussion with the
Mudbox team (who made a PTEX implementation for hardware), who
feel Sebastian didn't give PTEX a fair shake.
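The memory overhead from border filtering is easy to quantify with a back-of-the-envelope calculation (my own illustration, not a figure from any of the articles): if every NxN per-face tile needs a one-texel border duplicated from its neighbors so the hardware can filter seamlessly, small tiles pay a large relative cost.

```python
# Relative memory cost of adding a filtering border around an NxN
# per-face tile: the padded tile is (N + 2*border)^2 texels, so the
# overhead fraction shrinks as tiles get bigger.
def border_overhead(n, border=1):
    padded = (n + 2 * border) ** 2
    return (padded - n * n) / (n * n)

for n in (4, 16, 64, 256):
    print(f"{n:>3}x{n:<3} tile: +{border_overhead(n):.0%} memory for borders")
# A 4x4 tile pays +125%, while a 256x256 tile pays under 2%.
```

Since PTEX meshes tend to have many tiny faces, this is exactly the regime where the overhead bites, which is why it keeps coming up as a hurdle for GPU implementations.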
FX Guide posted an interesting article called UDIM UV Mapping
in May of 2014, discussing the advantages / disadvantages of UVs and
PTEX. The article was very pro UVs, but then 2 months later it was
followed up by another article called PTEX, the
other side of texturing, which got into more detail on the
advantages of PTEX, and the advances Disney hopes to see in the area.
To note, some research has indeed happened in making PTEX a
viable option on hardware, by the hardware manufacturers themselves.
Here's an article from 2013 showing that
Nvidia has actually made a
ptex implementation, at least in the R&D stage: Eliminating
Texture Waste: Borderless Ptex. AMD as well: Radeon
HD 7900 Series Graphics Real-Time Demos.
Asset Authoring "Realtime" vs
Game Engine Realtime
One other note should be made, to distinguish between
Asset Authoring "Realtime" and Game Engine Realtime. These two
different areas have different requirements.
Perhaps UVless workflows
including PTEX may work fine for the asset creation stage, just not for
the game engine stage. The idea would be to use UVless techniques while
you're making the asset, but
then bake the result to textures assigned to Automatic UVs while you're
using the asset in the final product. Some sort of Automatic UVing
exists in most asset authoring applications. The advantage is automatic
UVs are faster because you don't have to manually lay them out. But the
disadvantage is that UV layout tends to be messier, making
it difficult to paint the results in a pure 2d paint program, and
issues arise with
artifacts at the UV island borders,
which are far more numerous and may not be placed in the ideal spot
unless hand edited.
- Asset Authoring
realtime would be "I'm inside my 3d paint application painting a PTEX
file in realtime".
While this needs to update fast, it may not require 60 frames a second
to give the user the results they need. In fact, it may be better to
not call this "realtime" at all, but simply "interactive". Even getting
results back in 1-2 seconds may give an asset creation artist the
feedback they need in some circumstances.
- But for the final product, the videogame engine itself (which uses
the assets), smooth and high
frame rates are important. Real realtime is necessary.
Here are some examples of using UVLess techniques at the Asset Creation
stage:
- Projections in Mari / Substance:
The painting application Mari allows you
to use projections (triplanar) and then bake the result to UVs. The
same with Substance Designer / Painter, a newcomer on the scene for 3d
games. Since the results of the projections are baked to UVs, you can
use the results efficiently in a game engine on current videocards. But
the disadvantage is that once the result is
baked, if you want to change your projection, you have to rebake, which
may require manual work and computer time. And of course if you do
manual UVs to bake to, it takes awhile to UV, or if you use Auto UVs,
it has the disadvantages already outlined above. So this workflow
removes some of the advantages of going UVless.
- PTEX in Mudbox: Autodesk's
Mudbox can display and paint PTEX in realtime on
your model while painting, but the way it does that is by creating a
texture applied to your mesh using a sort of automatic uvset that mimics how a PTEX
file works, as opposed to directly displaying a PTEX file.
- Interactive Workflow, Bake At
The Last Minute: You could also imagine a workflow where you
work UVless for the entire
asset creation stage, and get back interactive feedback (like using
V-Ray RT for example inside 3dsmax or Maya), and then your last stage
before sending the model to the game engine is a texture baking stage
baking the textures to Automatic UVs. This could potentially save the
asset artist a ton of time (since they don't have to UV and can use
stuff like projections to quickly apply complex library materials to an
object), and still give you realtime results in your
game engine. The only disadvantage here is that if you make an asset
change, you have to wait for a rebake before you get the results in
your game engine. And of course the inherent Auto UV issues.
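The triplanar projection technique mentioned above (project a bitmap down each axis and blend by how much the surface normal faces that axis) can be sketched as follows. This is a minimal numpy version under my own naming; checker() stands in for any bitmap lookup, and real implementations in Mari or Substance also handle tiling, blend sharpness controls, and so on.

```python
import numpy as np

# Minimal triplanar projection sketch: sample a pattern along the X, Y
# and Z axes and blend by how much the surface normal faces each axis.

def checker(a, b, scale=4.0):
    # Stand-in for a bitmap lookup: a simple 2d checker pattern.
    return (np.floor(a * scale) + np.floor(b * scale)) % 2.0

def triplanar(position, normal, sharpness=4.0):
    # Blend weights from the normal, sharpened so the three
    # projections don't smear together at glancing angles.
    w = np.abs(normal) ** sharpness
    w = w / w.sum()
    x, y, z = position
    # Project along each axis by dropping that axis's coordinate.
    sample_x = checker(y, z)  # projection along X
    sample_y = checker(x, z)  # projection along Y
    sample_z = checker(x, y)  # projection along Z
    return w[0] * sample_x + w[1] * sample_y + w[2] * sample_z

# A point on a surface facing straight up only sees the Z projection:
p = np.array([0.3, 0.6, 0.0])
n = np.array([0.0, 0.0, 1.0])
print(triplanar(p, n))
```

Because the projection is a pure function of position and normal, no UV layout is needed while authoring; baking is just evaluating this function at each texel of whatever UV set (manual or automatic) the game engine ends up using.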
As you can see, this is a really complex issue, with a lot of moving
parts that are controlled by a lot of different groups, from customers
to software companies to hardware companies to entire industries.
Trying to get all of these things to align is a really tough job and
takes a lot of time. Realtime and interactive rendering has major advantages.
And UVless workflows really help the artist spend more time on the art
and less time on the technical. But right now these two things don't
work as well together as we'd like. My hope is that eventually we will
be able to have our cake and eat it too. But to have
that, we need a push from all of the artists and technical folk in all
of the graphics
related industries. What we'd need to see...
If we want realtime at both the asset and shot/game stage of the
pipeline, and the ability to use PTEX in both, we're going to need
these changes to occur. And the only way to get that to happen is with
your help. Can we
have flexibility AND speed in the future? My hope is yes.
- All industries (film, games, etc) ask the video card manufacturers
to make the necessary changes to promote the PTEX workflow (or an
equivalent), from the
hardware to the SDKs.
- More research into resolving the technical hurdles that slow down
PTEX, or else we'll have to keep PTEX purely as an asset creation tool.
- Better integration of PTEX into 3d authoring software like
3dsmax, maya, mari, etc
- More focus on other UVless workflows in 3d authoring software
This site is ©2015 by Neil Blevins.
All rights are reserved.