Perfect normal maps for Unity (and other programs)

... and how to figure out what's wrong with your content creation pipeline.


Let's say you have an amazing high-poly model that an artist has been working on, and you want to put it into the game. The artist showed screenshots of the model in their modeling program and they looked amazing. Even after baking normal maps for the low-poly version, it still looks great.


Behold the greatness of this classic normal map test model!

Then the artist puts it in the game, and it looks ... a little off.


Shouldn't these faces be flat?

The surfaces that should be flat still look slightly rounded, some even look dented, and worse, strange streaks run across them. Sound familiar? Some of you have probably run into this, thought "well, I guess that's as good as it gets," and moved on. Maybe a bit of grunge got added to hide the worst spots.

But something really is wrong with your content creation pipeline. It's not always obvious, but the problem can be solved!*

* Assuming you are in a position to make changes to the content pipeline in some cases.

Foreword


This article is about how to export perfect normal maps from other applications for use in Unity 5.3 and later. It is not about how to author high-quality normal maps; there are plenty of other tutorials on that, so I won't cover it here.



Tangent space basis


The problem lies in how tangent space normal maps work, and in particular how the tangent space (or tangent basis) works. I won't go too deep into the technical details of what they do; I, like other authors, have already written about this elsewhere. In short, the tangent basis defines what the directions stored in a normal map actually mean: which direction in the map is "forward", "up", or "right" on the surface. Roughly speaking, that orientation comes from the texture's UV coordinates and the vertex normals.
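To make that a bit more concrete, here is a minimal pixel shader sketch (the texture, UV and basis vector names are placeholders, not any particular engine's code) of what the basis is actually used for when a normal map is sampled:

// T, B, N: interpolated world-space tangent, bitangent and vertex normal for this pixel.
float3 vNt = tex2D(_NormalMap, uv).xyz * 2.0 - 1.0;       // unpack from [0,1] to [-1,1]
float3x3 tangentToWorld = float3x3(T, B, N);              // the tangent basis, rows = axes
float3 worldNormal = normalize(mul(vNt, tangentToWorld)); // map "right"/"up"/"out" onto the surface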

More importantly, the program used to bake the normal map from the high-poly mesh to the low-poly mesh must calculate its tangent space exactly the same way as the program in which the normal map will be used. Since different programs can calculate tangent space differently, a normal map baked for a mesh will not necessarily work the same in every application. That is true even between applications using the "same" tangent space!

Know your enemy


A flip due to an obvious mistake


The most common mistake is the "OpenGL vs. Direct3D" problem, also referred to as Y+ and Y- normal maps. I include it here mostly for completeness: most people who have run into it already know about it, or at least have learned how to work around it. A normal map generated in one orientation is essentially useless in an application expecting the other.


A normal map with Direct3D orientation in Unity, which expects OpenGL orientation.

You can spot this by lighting that looks inverted, or seems to come from the wrong side, but only for some light directions. Some edges or faces may look completely fine until you move the light source.

This problem is easy enough to deal with. Most normal map baking applications have a nice big button or menu to switch between OpenGL/Direct3D, or a Y orientation option. Find out which orientation the target application expects and bake for it: Unity expects OpenGL (Y+), Unreal expects Direct3D (Y-). You can also tell by looking at the normal map texture itself: if the shapes in the green channel look lit from above, it's OpenGL orientation; if from below, it's Direct3D.
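If re-baking is not an option, the conversion itself is trivial: flipping the green channel turns a Direct3D (Y-) map into an OpenGL (Y+) map and vice versa. A rough sketch of doing it at sample time (placeholder names; most tools can also apply this to the texture itself):

// Convert a Direct3D-oriented (Y-) sample to OpenGL (Y+) by flipping the green channel.
float4 packedNormal = tex2D(_NormalMap, uv);
packedNormal.g = 1.0 - packedNormal.g;        // flip the map's Y axis
float3 vNt = packedNormal.xyz * 2.0 - 1.0;    // then unpack as usual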

Oddities with sRGB


Another common mistake is leaving the texture marked as an sRGB texture. That changes how the GPU samples it, converting the color values from sRGB space (the space people usually pick colors in) to linear color space. Normal maps do not want that. If you set the texture type to "Normal map" in Unity, the engine disables this automatically, but you can also disable the "sRGB (Color Texture)" option manually and the texture will work as a normal map, at least in Unity 2017.1 and newer.


Normal map with Default texture type and sRGB enabled.

You can recognize this problem by normal maps appearing strongly biased in one direction. On this model almost every face looks as if it is pointing toward the light source. It won't always be that obvious: on a more detailed normal map everything will respond to lighting almost correctly but be skewed slightly toward one angle, and sometimes it just looks as if the entire surface of the mesh is warped.
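To see why, remember that a flat normal is stored as 128/255, roughly 0.5, which unpacks to 0. If the sampler first applies the sRGB-to-linear conversion, that 0.5 becomes about 0.21 and unpacks to about -0.57, so every texel gets pushed in the same direction. A rough illustration of the arithmetic (using the standard sRGB decode formula for values above its linear toe):

// Why sRGB sampling breaks normal maps: a "flat" texel no longer unpacks to zero.
float flatEncoded = 0.5;                                      // ~128/255, a flat normal component
float asLinear    = pow((flatEncoded + 0.055) / 1.055, 2.4);  // ~0.214 after sRGB-to-linear conversion
float unpackedOk  = flatEncoded * 2.0 - 1.0;                  //  0.0, correct (sRGB off)
float unpackedBad = asLinear * 2.0 - 1.0;                     // ~-0.57, heavily skewed (sRGB on)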

Too different to be normal


Another common problem is a mismatch between the vertex normals of the mesh used for baking and the in-game asset. As with tangents, these must match exactly for everything to work. This often happens when artists export meshes without proper vertex normals, leaving every application that opens the mesh to calculate them itself. There is no single "correct" way to do that, it all depends on the mesh, so make sure you (or your artists) export the model with its final normals!


A normal map baked on a smoothed mesh, displayed on a mesh with hard edges


A normal map baked on a mesh with hard edges, displayed on a smoothed mesh

This is another problem that is easy to spot. If the model should have hard edges but the shading bends away from those edges, the normal map was baked with smooth normals and is being rendered on a mesh with hard edges. If the shading bends toward an edge, the map was baked with a hard edge and is being rendered on a smooth one. In practice a single model can show both cases, since individual edges can be marked smooth or hard, and even given different degrees of smoothness. But they are pretty much guaranteed to be wrong if you let another application guess how it should be. The model above is an extreme case; often it is far less obvious, because large parts of the model can look completely fine. In general, though, if a few edges look particularly strange, it is probably a vertex normal mismatch.

MikkTSpace vs Autodesk


A less obvious problem is that normal maps baked in 3ds Max or Maya will always be wrong in Unity. They will also be wrong in Unreal, and in almost any other program except the one they were baked in. 3ds Max normal maps do not work in Maya, and vice versa! Each of these applications calculates tangent space in its own unique way that matches no other application. Viewed in isolation they look perfect, so if you are generating normal maps to be used inside those programs, by all means bake them there. But almost no other program can bake correct normal maps for these applications, so it can seem like you have no choice.

The only real solution is to not bake normal maps with these applications if you intend to use them anywhere else. If you need a free tool, use xNormal; it is an industry standard. If you already use Substance somewhere in your content creation pipeline, use that.


Normal map baked in 3ds Max, used in Unity

The easiest way to notice this problem is that flat faces show slight dents or bulges. In some cases it can be very subtle, especially on organic shapes, which is why problems of this kind often go unnoticed.

Many non-Autodesk applications have converged on a tangent space calculation called MikkTSpace. xNormal and Substance, as well as Blender, Unity, Unreal, Godot, Marmoset, Houdini, Modo and several others, either use MikkTSpace by default or at least support it in some form, which greatly improves compatibility between them. In fact, over the past ten years it has effectively become the industry standard wherever tangent space normal maps are used. Presumably 3ds Max will add support at some point, though it is unclear whether the developers plan to allow baking normal maps with MikkTSpace or only to support viewing/rendering with it.

Export Meshes and Substance Painter


Another problem with 3ds Max and Maya is that their mesh exporters have an option to export with "tangents and binormals". These are not proper MikkTSpace tangents. Some applications force-recalculate correct MikkTSpace tangents on import by default (for example xNormal, Unity and Unreal), but other applications use a mesh's tangents "as is" if they exist and only compute MikkTSpace when the mesh has none. Substance Painter, for example, uses existing mesh tangents as they are, which means Painter produces normal maps very similar to those baked in 3ds Max! Similar, but not quite the same, so if you import them back into 3ds Max they will not look as good as maps baked in Max itself.

MikkTSpace vs MikkTSpace?


The next problem is that there is more than one version of MikkTSpace! Because it wouldn't be a standard if it didn't somehow manage to cause confusion. MikkTSpace defines how the per-vertex data should be stored in the mesh and how that data is used for rendering. All MikkTSpace-enabled applications compute the same per-vertex mesh data. But there are two different ways of using that data when rendering. MikkTSpace defines how the tangent vector and sign are calculated for each vertex from the UV orientation; when the tangent space is actually used, however, the bitangent is reconstructed either per vertex or per pixel. That choice must also match between baking and rendering.
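In shader terms the two options look roughly like this (a hedged sketch with placeholder names, not any engine's actual code; both follow bitangent = fSign * cross(vN, tangent), they just evaluate it at different stages):

// Option A: per-vertex bitangent (Unity's built-in pipeline, xNormal's default).
// Built in the vertex shader from the mesh tangent and normal, then interpolated as a third vector.
float3x3 TangentToWorldPerVertex(float3 normalWS, float4 tangentWS)
{
    float3 bitangentWS = tangentWS.w * cross(normalWS, tangentWS.xyz);
    return float3x3(tangentWS.xyz, bitangentWS, normalWS);   // rows: T, B, N
}

// Option B: per-pixel / per-fragment bitangent (Unreal, Blender, Unity's HDRP).
// Only the tangent (with its sign in w) and the normal are interpolated; the bitangent
// is rebuilt here from the unnormalized interpolated vectors, as mikktspace.h describes.
float3 ApplyNormalMapPerPixel(float3 vNt, float3 vN, float4 vT)
{
    float3 vB = vT.w * cross(vN, vT.xyz);
    return normalize(vNt.x * vT.xyz + vNt.y * vB + vNt.z * vN);
}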

xNormal and Substance both have options to bake normal maps either way. They are called "compute tangent space per fragment" in Substance and "compute binormal in the pixel shader" in xNormal. You should not enable them if you are baking for Unity's built-in render pipeline. However, it may be worth doing for the SRPs, depending on which one you choose!


xNormal "pixel shader binormals" normal map with Unity's built-in forward renderer

Visually this looks similar to using a completely different tangent space, but it is usually far less pronounced. Some faces may look as if the normal is slightly skewed rather than showing a big dent. The fix is simple: leave the option off when baking for Unity's built-in render pipeline, and turn it on when baking for Unreal or for Unity's HDRP.

Many sources claim that Blender and Unity are identical, but that has never actually been true! Blender, like Unity, uses MikkTSpace and OpenGL orientation, but like Unreal it computes the binormal per fragment.

The good news is that the LWRP, URP and Shader Graph shaders currently use per-vertex binormals, while the built-in HDRP shaders use per-pixel ones. A future Shader Graph update (7.2.0) will switch to per-pixel computation, and some time after that the built-in URP shaders will follow. The built-in render pipeline can also be changed by writing your own shaders, if you want to go that route.

Triangulate, triangulate, triangulate!


The last problem is triangulation. If you export meshes without triangulating them first, there is no guarantee that other applications will triangulate the model the same way, and that also breaks normal maps! So triangulate your meshes, either through the export options or with a triangulate modifier.


Substance Painter, Unity, 3ds Max (quads)

This looks very similar to a completely wrong tangent space, as if the normal map had been baked in 3ds Max. But generally the errors are a bit sharper, almost creased, rather than smooth concavities. On this mesh an X shape is visible on the outer faces. This problem can be difficult and annoying when several people work on the model in different applications along the content pipeline, because it is often useful to keep the mesh un-triangulated and preserve nice quads. In that case you either have to live with the inconvenience, or make the last tool in the pipeline before the model is exported to the game also the one that generates the mesh used for baking the normal map. Like the tangent space and the vertex normals (which are part of the tangent space), the triangulation must also match exactly.

Checklist for perfect normal maps in Unity


So, here are the rules for generating tangent space normal maps for low-poly meshes from the original high-poly models for use in Unity:

  • Do not bake normal maps with 3ds Max, Maya, or any other application that does not have a MikkTSpace tangent option.
  • Export meshes as triangles, with normals, and without tangents.
  • Choose OpenGL or X+Y+Z+ as the normal map orientation when baking (and when creating the project in Substance Painter).
  • Leave the "per pixel" / "per fragment" bitangent options off when baking for the built-in render pipeline (forward or deferred) or LWRP.
  • Turn the "per pixel" / "per fragment" bitangent options on when baking for HDRP, and soon for URP.
  • Leave Unity's model import options at their defaults, with Normals set to "Import" and Tangents set to "Calculate Mikktspace".

That's it! Perfectly baked normal maps every time!


Hooray! The perfect normal map!


... but what's with those reflections? Ahem ...

Okay, that last one we will just ignore. It is as "perfect" as you can get with compressed, 8-bit-per-channel texture formats. A dithered normal map can look a little better than banding, but unfortunately not many tools offer that option. Alternatively you can use 16-bit normal maps. But that is a topic for another article.

Learn more about MikkTSpace.


The semi-official site mikktspace.com explains the key benefits of MikkTSpace well. In particular, it holds up both when tangents are recalculated and when tangent data is transferred between programs that process meshes differently, while keeping the results consistent. The text on the site is a copy of the now-defunct Blender wiki page written by Morten S. Mikkelsen himself. Unfortunately it only mentions the per-pixel bitangent version, not the per-vertex version that Unity and xNormal use by default. Since it originated as Blender documentation, and Blender uses per-pixel bitangents, it makes sense that only that version is mentioned.

Here are the comments about these two options from the original mikktspace.h file:

For normal maps it is sufficient to use the following simplified version of the bitangent, which is generated at pixel/vertex level.

bitangent = fSign * cross(vN, tangent);

To avoid visual errors (distortions / unwanted hard edges in lighting) when using sampled normal maps, the normal map sampler must use the exact inverse of the pixel shader transformation.

The most efficient transformation we can possibly do in the pixel shader is achieved by using, directly, the "unnormalized" interpolated tangent, bitangent and vertex normal: vT, vB and vN.

// pixel shader (fast transform out)

vNout = normalize( vNt.x * vT + vNt.y * vB + vNt.z * vN );

where vNt is the tangent space normal. The normal map sampler must likewise use the interpolated and "unnormalized" tangent, bitangent and vertex normal to be compliant with the pixel shader.

Should you choose to reconstruct the bitangent in the pixel shader instead of the vertex shader, as explained earlier, then be sure to do this in the normal map sampler also.

So in the original example the bitangent is reconstructed in the vertex shader and passed to the pixel shader as one more interpolated value. That is how Unity's standard shaders currently handle normal maps; the pixel shader version is mentioned more as an aside than as the "main" example. In addition, since the Unity 5.3 implementation (and xNormal's?) was written by Morten S. Mikkelsen himself, and both use per-vertex bitangents, this lends more weight to the theory. Then again, Blender uses per-pixel bitangents and Morten was involved there too. Most likely it simply came down to whichever option required the smallest changes to the existing Unity shaders, and for real-time graphics neither method was considered significantly better than the other.

The advantage of the per-pixel approach is that it removes one of the interpolated values at the cost of a little extra ALU work (pixel shader math). Instead of three float3 values (tangent, bitangent, and normal), only a float4 and a float3 are needed (tangent with sign, and normal). On modern GPUs this is slightly faster, and it is definitely faster when run on a CPU, as it may be for non-real-time renderers. From that point of view it makes sense that most of the industry chose to reconstruct the bitangent per pixel rather than per vertex.
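In terms of vertex-to-pixel interpolators the trade-off looks something like this (illustrative struct names, not any engine's actual structs):

// Per-vertex bitangent: three float3 interpolators (9 scalars).
struct V2F_PerVertexBitangent
{
    float3 tangentWS   : TEXCOORD1;
    float3 bitangentWS : TEXCOORD2;
    float3 normalWS    : TEXCOORD3;
};

// Per-pixel bitangent: one float4 and one float3 (7 scalars);
// the bitangent costs one cross product and a multiply in the pixel shader instead.
struct V2F_PerPixelBitangent
{
    float4 tangentWS : TEXCOORD1;   // xyz = tangent, w = sign
    float3 normalWS  : TEXCOORD2;
};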

It's funny that when I wrote this article, Morten promoted the transition in Unity SRP to pixel by pixel tangents to two points! It’s just a bad luck that Unity’s built-in rendering methods have a legacy of inconvenient choices.



Tangent space in different applications


Although this article focuses mainly on Unity, since that engine has become mainstream today, I wanted the survey to be more thorough, so I put together a list of various applications and how each of them handles tangent space.

Unity (5.3+)


Orientation: OpenGL Y+
Tangent space: MikkTSpace
MikkTSpace bitangent: per vertex or per pixel*

Notes: by default the engine overrides a mesh's tangents on import and recalculates them with MikkTSpace. For advanced users there are options to use the mesh's own tangents, but that does not guarantee compatibility with normal maps baked in other tools, especially with the built-in shaders; the shaders can, however, be replaced entirely with your own. HDRP's built-in shaders use per-pixel bitangents. At the time of writing, the URP and Shader Graph shaders were also planned to switch to per-pixel bitangents. Prior to Unity 5.3 the engine used Lengyel tangents, which can still be selected with the Legacy Tangents option in the mesh importer, but the built-in shaders no longer support that tangent space.

Update: before URP 7.2.0, normal map handling in Shader Graph was completely broken! It used neither MikkTSpace nor even the Lengyel tangent space correctly, and this cannot be fixed from inside Shader Graph. Upgrade to 7.2.0 if you can.

Unreal engine 4


Orientation: Direct3D Y-
Tangent space: MikkTSpace
MikkTSpace bitangent: per pixel
Notes: by default the engine overrides a mesh's tangents on import and recalculates them with MikkTSpace. For advanced users there are options to use the mesh's own tangents, but that does not guarantee compatibility with normal maps baked in other tools.

Godot


Orientation: OpenGL Y+
Tangent space: MikkTSpace
MikkTSpace bitangent: per vertex
Notes: if a mesh has no tangent data, Godot can compute MikkTSpace tangents on import. The engine documentation suggests exporting meshes with tangents, but since most tools do not export proper MikkTSpace tangents, I would ignore that advice. Even if the original mesh tangents are used, there is no guarantee normal maps will look right.

Playcanvas


Orientation: OpenGL Y+
Tangent space: Lengyel / Schüler*
Bitangent: per vertex / derivative-based
Notes: offers several ways of handling tangent space; its code-generated tangents are not MikkTSpace. There is a per-vertex bitangent path with a "fastTbn" option that skips normalizing the tangent space vectors in the fragment shader (which is what we want, since it matches the per-vertex MikkTSpace implementation), and a derivative-based TBN borrowed from Christian Schüler that ignores mesh tangents entirely and computes the tangent space in the pixel shader. The derivative-based path is now the default, and as far as I know there are no tools that bake exactly matching normal maps for it. It may be possible to import meshes with existing MikkTSpace normals and tangent data and then use the "fastTbn" material option to match Unity's built-in render pipeline, but I believe that has to be done in code rather than through the editor interface.
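For reference, the derivative-based approach (from Christian Schüler's "Normal Mapping Without Precomputed Tangents") builds the whole tangent frame in the pixel shader from screen-space derivatives, roughly like this (a sketch, not PlayCanvas's actual implementation):

// Cotangent frame from screen-space derivatives (after Schüler); no mesh tangents needed.
// N = interpolated vertex normal, p = world-space position, uv = texture coordinates.
float3x3 CotangentFrame(float3 N, float3 p, float2 uv)
{
    float3 dp1 = ddx(p);
    float3 dp2 = ddy(p);
    float2 duv1 = ddx(uv);
    float2 duv2 = ddy(uv);

    float3 dp2perp = cross(dp2, N);
    float3 dp1perp = cross(N, dp1);
    float3 T = dp2perp * duv1.x + dp1perp * duv2.x;
    float3 B = dp2perp * duv1.y + dp1perp * duv2.y;

    float invmax = rsqrt(max(dot(T, T), dot(B, B)));
    return float3x3(T * invmax, B * invmax, N);
}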

Filament


Orientation: OpenGL Y+
Tangent space: MikkTSpace
MikkTSpace bitangent: per vertex
Notes: the engine may switch to per-pixel bitangents in the future, depending on performance.

3ds Max


Orientation: Direct3D Y-*
Tangent space: its own
Mesh export options: export FBX files with Triangulate turned on and Tangents and Binormals turned off. Normals should always be exported. Split per-vertex Normals can be left disabled.
Notes: by default 3ds Max assumes normal maps use Direct3D orientation, but it has options to flip the X and Y directions when a normal map texture is used in a material. Baking always uses Direct3D orientation. MikkTSpace support is supposedly coming in a future version.

Maya


Orientation: OpenGL Y+
Tangent space: its own
Mesh export options: export FBX files with Triangulate turned on and Tangents and Binormals turned off. Normals should always be exported. Split per-vertex Normals can be left disabled.
Notes: some people have told me that Maya already has partial MikkTSpace support, but I don't know to what extent.

Blender (2.57+)


Orientation: OpenGL Y+
Tangent space: MikkTSpace
MikkTSpace bitangent: per pixel
Mesh export options: export FBX files with Smoothing set to "Normals Only". You can export with or without Tangent Space if the target application also uses MikkTSpace, such as Unity or Unreal.
Notes: Blender's FBX exporter has no triangulation option, so apply a Triangulate modifier first. Normal maps baked in Blender will not be 100% compatible with Unity, but if you flip the green channel in an external application or in a custom material, they work perfectly in Unreal.

xNormal (3.17.5+)


Orientation: configurable
Tangent space: MikkTSpace*
MikkTSpace bitangent: configurable (per vertex or per pixel)
Normal map baking options: there are options to flip all three axes of the normal map on export; Unity needs X+Y+Z+. There is also a choice of per-vertex or per-pixel (i.e. per-fragment) bitangents. By default xNormal uses per-vertex bitangents, but this can be changed in the Plugins Manager menu (the power plug icon in the lower left corner) by selecting "Tangent basis calculator", then "Mikk - TSpace", and clicking Configure. A pop-up window appears with a single option, "Compute binormal in the pixel shader". It must be off for Unity and on for Unreal.
Notes: although xNormal itself only supports MikkTSpace, its plugin system theoretically allows any tangent space. The only tangent space plugin I know of was made to bake the Unity 4 tangent space and is now useless, so realistically the application supports only MikkTSpace.

Substance painter


Orientation: configurable (OpenGL or Direct3D)
Tangent space: MikkTSpace
MikkTSpace bitangent: configurable (per vertex or per pixel)
Normal map baking options: can bake in OpenGL or Direct3D orientation, and has a per-vertex or per-pixel (per-fragment) bitangent option controlled by the "Compute tangent space per fragment" setting. When exporting for Unity's built-in render pipeline, choose OpenGL and disable "Compute tangent space per fragment" in both the project and export settings; when exporting for Unreal or Unity's HDRP, do the opposite.
Notes: Substance Painter uses the imported mesh's tangent data by default if it is present. Unless the mesh comes from Blender, that is not what you want, and there is no option to turn it off, so export meshes without tangents before importing them into Substance Painter. The Substance documentation used to claim, incorrectly, that Blender's bitangent computation matches Unity's, but that error has since been fixed!

Marmoset Toolbag 3


Orientation: configurable
Tangent space: configurable
MikkTSpace bitangent: per pixel
Notes: the program appears to have options for viewing and baking normal maps in almost any configuration in existence, including presets for the 3ds Max and Maya tangent spaces! However, its "Mikk / xNormal" tangent space option uses per-pixel bitangents and this cannot be changed, so Marmoset Toolbag 3 cannot correctly preview or bake normal maps for Unity's built-in render pipeline. That may change in a future version, as the developers are aware of the problem.

3DCoat


Orientation: configurable
Tangent space: configurable
MikkTSpace bitangent: ?
Notes: has many options for viewing and baking normal maps. It can match Blender and UE4 perfectly well, so it supports per-pixel bitangents, but I don't know whether its Unity preset uses per-vertex bitangents.

Houdini (16.0.514+)


Orientation: configurable
Tangent space: configurable*
MikkTSpace bitangent: per pixel
Notes: supports both its own tangent basis and MikkTSpace, for viewing and baking, but only the per-pixel variant. It does not appear to use MikkTSpace for rendering by default, but starting with Houdini 17.5 meshes can be modified with built-in nodes to use MikkTSpace tangents.

Modo (10+?)


Orientation: configurable
Tangent space: configurable
MikkTSpace bitangent: configurable (per vertex or per pixel)*
Notes: it appears to support rendering and baking in different tangent spaces, including a per-pixel bitangent option for export, but I have not tested this.

Knald


Orientation: configurable
Tangent space: MikkTSpace
MikkTSpace bitangent: per pixel
Notes: uses MikkTSpace by default and overrides the tangents of imported meshes, but there is an option to use a mesh's existing tangents if needed. The normal map orientation can be flipped with a project-wide setting.



Finally


One important note: if the low-poly model you are baking onto is just a flat plane (for example, for a tiling texture), then none of the above matters and you can bake with any tool ... as long as you keep everything in the correct orientation, or flip the corresponding texture channels afterwards.

Also, most of the problems listed in this article are only noticeable on hard-surface models. Organic shapes can hide the subtler differences, such as the difference between per-vertex and per-pixel bitangents.
