
I used node-based textures on a mesh that I exported to FBX.

I then imported it into Unreal Engine 4, but it didn't import any textures, even though I clicked the Upload Textures option in Unreal Engine.

Is there something I'm doing wrong, or is it Blender?


2 Answers


This is a frequently asked question.

TL;DR

Most file formats simply don't support exporting textures, let alone full-blown material definitions or other application-specific features. Historically, most Blender importers/exporters also didn't fully support node-based materials.

Manually reproduce your materials in the target environment using the available textures.

Long story

Materials are too implementation-specific and tightly tied to the rendering system they belong to, or the software they were created with. For the most part you can't import/export material definitions between applications. You can't even get Blender Internal materials to work with the Cycles renderer, or vice versa, and both were created within Blender; exchanging materials between completely different applications is even less feasible.

For this reason no exchange file format can, or even tries to, import or export material properties, be it 3DS, FBX, Collada, STL, OBJ or any other. These are mostly mesh-only, geometry-centric file formats concerned with porting mesh-based object shapes, and sometimes animation, armatures, and basic color properties (like MTL files); never full, complex material definitions.

Many of these interchange file formats originate from the fixed-function GPU pipeline era, when materials were a lot more limited, and there are simply no data structures in their specifications to accommodate that type of data, let alone all possible types of properties, settings or exotic combinations of maps. Even if there were, there are far too many different rendering techniques, built for a variety of purposes and responding to distinct requirements (like realism or responsiveness), to correctly map properties, parameters or particular features between all of them.

For example, most real-time renderers (like game engines) are optimized for speed or responsiveness and need some form of explicit "backface culling" option, because rasterization relies heavily on discarding invisible geometry (facing away from the point of view) for performance reasons. In 'offline' renderers like Cycles (concerned with realism over speed) it is instead an optional node, because for raytracers it is irrelevant which way a face points; all geometry is always taken into account. On the other hand, glass and transparent shaders "just work" with great refraction in raytracers, yet in EEVEE you need to fiddle with blending modes, transparency settings, screen-space refraction, and reflection probes, none of which exist in Cycles, because representing object interactions like reflections or refraction is expensive and complex for rasterizers.
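In Blender these real-time options live directly on the material datablock and have no counterpart in most exchange formats. A minimal bpy sketch for illustration (property names from the 2.8/2.9-series EEVEE; check them against your Blender version):

    import bpy

    # Rasterizer-oriented options with no equivalent in mesh-centric exchange formats
    mat = bpy.context.object.active_material   # assumes the active object has a material
    mat.use_backface_culling = True            # discard faces pointing away from the camera
    mat.blend_method = 'BLEND'                 # EEVEE alpha blending mode, ignored by Cycles
    mat.use_screen_refraction = True           # screen-space refraction, EEVEE only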

Certainly never expect proper export of any procedurally generated textures (like Noise or Voronoi), image textures using "parametric" texture coordinates (like Object or Generated), or anything running through other nodes before the final shader (like Color Ramps or color adjustments); these are always calculated at render time by the engine (like Cycles, or EEVEE for display purposes) and can't be exported elsewhere. Keep in mind that pre-2.8 most Blender exporters did not even support Cycles node-based materials well, so even image textures used in Cycles node trees often cannot be expected to export correctly.

As of the 2.9+ series some work has been done, and the more common image-based texture maps may sometimes be correctly preserved (like diffuse, specular and glossiness for the increasingly popular "PBR workflows") when simple image textures are connected directly to Principled BSDF shader nodes, but even this shouldn't be relied upon. The manual states which limited subset of node setups is supported.
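The export-friendly setup is just an Image Texture node plugged straight into a Principled BSDF input, with nothing in between. A minimal bpy sketch of building it (the material name and image path are hypothetical):

    import bpy

    # Create a material with the node setup exporters are most likely to understand:
    # an Image Texture connected directly to the Principled BSDF, no intermediate nodes.
    mat = bpy.data.materials.new("ExportFriendly")
    mat.use_nodes = True
    nodes = mat.node_tree.nodes
    links = mat.node_tree.links

    principled = nodes["Principled BSDF"]                         # created by use_nodes = True
    tex = nodes.new("ShaderNodeTexImage")
    tex.image = bpy.data.images.load("//textures/basecolor.png")  # hypothetical path

    # Direct connection, no Color Ramps, Mappings or procedural nodes in between
    links.new(tex.outputs["Color"], principled.inputs["Base Color"])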

The same applies to many other features specific to applications or render engines, including but not limited to:

  • Lighting parameters, shadow settings, lamps, camera properties
  • Physics simulations (fluids, cloths, soft bodies)
  • Texture specific options (clamping, clipping, tiling, color adjustments)
  • Generated texture coordinates (anything not mapped explicitly by a UV map or created by unwrapping a mesh) like Object or Generated texture coordinates, or anything mapped by a Vector Mapping or vector manipulation nodes
  • Material options (backface culling, shadows, visibility, transparency and blending modes)
  • Material, shader, texture, light or camera properties based animations
  • Particle systems, smoke or fire simulations
  • Volumetric data, modifier parameters, shape morphing, and animated, generative, shape-altering or deformation-based animations (like shape keys, or the Ocean, Build and Geometry Nodes modifiers)
  • Bezier curve parameter based animations (like bevels, extrudes, trims, and "along path" effects)
  • Visibility options (like hide/unhide, wireframe, shadow, etc)
  • Any other parametric or otherwise "live generated" data, which for the most part can't be imported or exported either (with a few exceptions)

Even if Blender did support exporting any of these features, the exchange file format would have to support "holding" these types of information. On top of that, there would also have to be feature parity at the receiving end: the application reading the exported model would have to support both reading that information and correctly mapping it to similar or equivalent features in the importing environment, which may or may not be available.

Add to that all the complexity of ensuring an exchanged model looks similar on both ends, without loss of information, as an end user would expect when opening their work in different software. Multiply this by the virtually infinite number of possible combinations of importer and exporter software, and ensuring any single feature works acceptably in all of them becomes unrealistic.

Exceptions

One notable exception to all this is the glTF file format, which as of version 2.0 does support some material definitions based on a metallic-roughness shading model in its specs. The glTF-Blender-IO addon, created by the Khronos Group themselves, supports exporting Principled BSDF node based materials. Even then some restrictions apply: the material must only contain plain image-texture-based inputs connected directly to a Principled BSDF shader, without any transformations or interference from other nodes or textures. As of 2.9+ the Blender manual states which setups are supported by glTF 2.0, which include Principled BSDF and Shadeless (Unlit).

The core material system in glTF supports a metal/rough PBR workflow with the following channels of information:

  • Base Color
  • Metallic
  • Roughness
  • Baked Ambient Occlusion
  • Normal Map (tangent space, +Y up)
  • Emissive
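A minimal export sketch with bpy (operator and parameter names as found in the bundled glTF add-on; treat the exact keywords as assumptions and verify them against your Blender version):

    import bpy

    # Export to binary glTF; Principled BSDF materials with image textures plugged
    # straight into their sockets are converted to the metal/rough channels above.
    bpy.ops.export_scene.gltf(
        filepath="/tmp/model.glb",     # hypothetical output path
        export_format='GLB',           # single binary file with embedded textures
        export_materials='EXPORT',     # write material definitions
    )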

As of Blender 3.4+ (commit a99a62231e04) the OBJ importer has been improved to support the OBJ PBR MTL extension, an extension to the original legacy .mtl file format that tries to cover modern PBR materials.

Beware that this is an extension of the original format: not all applications can read it, and some may write non-standard or non-compliant files which may fail to import correctly.
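For illustration, a hypothetical .mtl entry using some of the commonly proposed PBR extension keys alongside the legacy ones (the key names follow the widespread PBR MTL extension proposal; exact support varies between applications):

    # Hypothetical material mixing legacy and PBR-extension MTL keys
    newmtl PaintedMetal
    Kd 0.8 0.8 0.8          # legacy diffuse color, still read by old importers
    Pm 1.0                  # metallic (PBR extension)
    Pr 0.35                 # roughness (PBR extension)
    map_Kd basecolor.png    # base color texture
    map_Pr roughness.png    # roughness texture (PBR extension)
    norm normal.png         # tangent-space normal map, instead of the misused "bump"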

There are other caveats and limitations, and this should not be relied upon; some materials may fail to import or export correctly. As stated in the task Import/Export: Wavefront Obj Mtl Overhaul (Improved Compatibility and PBR Extension Support):

As a note to artists: This should greatly improve importing in general but there are a great many MTLs that are fundamentally broken. Detecting these issues automatically is rather difficult and requires human intervention by editing the ASCII file. If you experience any of the following issues it is likely the fault of the file and not the importer:

  • Normal map is not being loaded:
    • Solution: Replace "bump", "map_Bump", "disp", "map_Disp" with "norm" or import with "Bump/Displacement is Normal Map" enabled [a scripted version of this rename is sketched after this list]
    • Explanation: The specification does not detail how to load normal maps, only greyscale bump maps. As a result many obj files misuse the bump/displacement attribute to specify normal maps... This is really bad, bump/disp can no longer be trusted. Many files/packages however have opted to extend MTL with a proper normal map property. This updated importer/exporter makes the decision to switch to using this specifier to attempt to undo some of the damage to the ecosystem.
  • Alpha mapped textures are not being loaded correctly (color):
    • Solution: Add "-imfchan m" after every "map_d" attribute which is not a greyscale texture or import with "Alpha from Diffuse Texture" enabled
    • Explanation: The obj specification provides the "-imfchan" texture option to sample scalar information from a specific image channel. The initial version of the importer/exporter update currently only supports "m" which stands for the term used in compositing "matte"
  • Everything is invisible:
    • Solution: For every "d" option flip (newvalue = 1.0 - value) every number (or just set it to 1.0) or import with "Invert Alpha" enabled
    • Explanation: The specification defines "d: 1.0" to be fully opaque and "d: 0.0" to be fully transparent... Some people don't read the manual.
  • Legacy materials don't have glass:
  • Materials that are supposed to be shiny are diffuse and vice versa:
    • Solution: For every "Ns" flip the value (newvalue = 1000.0 - value) or import with "Invert Specular Exponent" enabled
    • Explanation: The specification defines "Ns" to be the specular exponent, 0 is diffuse, 1000 is shiny (blender stops at 900). Some hand-written files might again... have this reversed. The formula we used flips the specular exponent. (The manual formula described doesn't do a propper exponent conversion so this is not technically correct but neither is the exporter that wrote the file.)
  • Glass appears as black:
    • Solution: Edit all glass materials to have "Kd" set to the value of "Tf" and set "Tf" to "1.0 1.0 1.0" or import with "Transmittance Compatibility" enabled
    • Explanation: Some other software doesn't apply diffuse color as the color of glass and instead uses transmittance as the glass color.
  • Every material has subsurface scattering:
    • Solution: Delete all non-glass entries of "Tf" or import with "Transmittance Compatibility" enabled
    • Explanation: "Tf" defines a surfaces ability to let light through, often this is used for glass absorption, other times software will define this as translucency. So in the case of a solid surface the importer deems this to be subsurface translucency, this is a settlement between software that uses Tf as absorption and Tf as translucency. Some software (unwisely) writes transmittance of "1.0 1.0 1.0" for solid materials.
  • Glass materials are transparent:
    • Solution: For every glass material find "d" replace it with "Tr", flip (newvalue = 1.0 - value) for every number
    • Explanation: Many applications don't distinguish between "dissolve" (d) and "transmission" (Tr). Dissolve is available in every shading model (as per-spec), transmission is a component of glass.
  • Transparent materials are made of glass (rare):
    • Solution: For every "Tr" replace it with "d" option flip (newvalue = 1.0 - value) every number (or just delete "Tr")
    • Explanation: Same as above just the other way around. This really should never happen but with the state of files available... you never know.
  • Random materials are glass/metallic/missing reflections:
    • Solution: Set "illum" to 2 or import with "Basic Illumination" enabled
    • Explanation: The importer sees any opaque illumination model without fresnel to likely be metallic, anything with refraction is transmission, specular is disabled in illumination models below 2. Some files have very incorrect illumination models.
  • Color tint is missing from textures on import:
    • Solution: Requires manual setup: unplug the texture, copy the "default value" of the socket (should have been set to the multiplier), create a multiply node with the value and texture as needed.
    • Explanation: Currently texture modulation is not yet supported. This requires extra complicated setups behind the scenes to get working correctly for importing and exporting outside of the Principled BSDF wrangler. This should be a more rare use case for older files where texture memory was scarce and had to be re-used with tinting. If your file needs this however, sorry! You will have to set it up manually.
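For the first item above (the bump/disp → norm rename), a minimal scripted version of the manual edit, assuming a hypothetical model.mtl sitting next to the script:

    import re
    from pathlib import Path

    # Rename misused bump/displacement entries to "norm" so the importer treats
    # them as tangent-space normal maps, as described in the first fix above.
    mtl = Path("model.mtl")                      # hypothetical input file
    text = mtl.read_text()
    text = re.sub(r"^(map_Bump|map_Disp|bump|disp)\b", "norm",
                  text, flags=re.MULTILINE | re.IGNORECASE)
    Path("model_fixed.mtl").write_text(text)     # hypothetical output file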

Another note to artists: Exporter limitations (Questions/Answers):

  • Question: Why isn't my custom shader exporting?
    • Answer: The exporter only exports the Principled BSDF
  • Question: Why isn't my node graph texture exporting?
    • Answer: The exporter only exports image textures which are directly plugged into an acceptable output socket. No procedurals/blending is handled. Consider authoring your textures in a workflow which is baked to an image.
  • Question: Why isn't my subsurface+glass+alpha material exporting with every property?
    • Answer: Not every property can be exported together; the importer/exporter uses context switching to encode properties in the most compatible way possible, so you should try to author your materials with only one of these properties. For best results subsurface should also always be "1.0" and the base color texture should go in "subsurface color" for subsurface materials.
  • Question: I have the texture plugged directly into the socket, why isn't it working?
    • Answer: Exporting this as a texture is likely not supported; the following section should address that.

How to solve it then?

All you can, and should, import/export are UV coordinates along with your mesh, so you can correctly apply your textures in the "target application", game engine or receiving rendering environment where you'll be importing and displaying your model, be it Unreal, Unity, Gamekit, sculpting toolkits, other modelling or animation suites, external rendering software, VR environments, web viewers or whatnot; or even Blender itself when importing models from elsewhere. UV maps are generally preserved correctly by most exchange file formats by default, save for complex setups with more than one UV map per mesh.

There you should spend some time recreating your materials from scratch with the provided textures and available maps. Yes, it requires some backtracking and will take some time, but with practice it gets quicker. Use whatever native editor is available locally, save presets, and build reusable libraries when available, to reduce the amount of repetition.

What you can do in some situations is use Blender to bake textures that you can later use in the final destination or receiving application to create materials there, not inside Blender.

Baking is the process of pre-calculating shading and storing it in a static image texture that may incorporate several optional channels like diffuse, glossy, indirect lighting, bump maps, normal maps, light-maps, among others.

This is often a requirement for high performance mediums or low power platforms, like browser based or mobile gaming, where available resources are limited or unknown, and speed takes precedence over graphic fidelity.

This may improve perceived lighting quality or "realism" at the expense of dynamism, as certain properties of materials and textures become static, as if "painted onto the surface", like shadows or reflections. It is also an adequate method for exporting native procedural textures and materials to other applications; by baking only the relevant passes you can leave out the "static" shadows or reflections.
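A rough sketch of what a bake looks like from the Python side (Cycles only; names, resolution and paths are hypothetical, the object needs a UV map, and the Image Texture node must be the active node so the bake has a target):

    import bpy

    scene = bpy.context.scene
    scene.render.engine = 'CYCLES'               # baking requires Cycles

    obj = bpy.context.object                     # object to bake, must be selected
    img = bpy.data.images.new("baked_diffuse", 1024, 1024)

    # Add an Image Texture node and make it active so the bake writes into it
    mat = obj.active_material
    tex_node = mat.node_tree.nodes.new("ShaderNodeTexImage")
    tex_node.image = img
    mat.node_tree.nodes.active = tex_node

    # Bake only the diffuse color pass (no shadows or lighting baked in)
    bpy.ops.object.bake(type='DIFFUSE', pass_filter={'COLOR'})

    img.filepath_raw = "//baked_diffuse.png"     # hypothetical output path
    img.file_format = 'PNG'
    img.save()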

Future Developments

PBR is not a standard, however, and not every implementation works the same way or interprets the same parameters similarly. The Principled BSDF was written according to a model designed by an established industry leader (Disney) for their own internal use, and game engines like Unreal Engine® and texture painting applications like Substance Painter® have also adopted it.

As the PBR workflow gains popularity and traction, it is possible that more applications will "join the movement" and implement it in the future. If a common standard can be agreed upon, more importers/exporters may improve compatibility with an increasing number of material properties.

They may either read directly from a Blend file (like Unreal does with a bespoke importer), use some other hypothetical new file format or an extension to an existing one (like OBJ), or be forced to implement new standards, as the current ones still lack the data structures to correctly describe them. At this point it is pure speculation, and no known plans exist.

Some efforts are being planned to provide node group assets set up in such a way that they use supported shaders under the hood and are prepared for export to supported file formats using standardized maps and sockets. This is currently in a preliminary planning phase; see Bundle shader node group assets for compatibility #113145.


Comments:

  • I have never used Unreal or any other engine, I have no clue; you will have to ask about it in a dedicated community I'm afraid. I hear some engines do not allow multiple materials per mesh, so you would have to break it apart into several objects. You can still apply the materials in Blender, for visual fidelity in the viewport, and if the importer supports that it's less work you have to redo in the engine; just don't expect to use the same exact material definitions. – Commented Jul 11, 2016 at 12:44
  • It's worth noting that a lot of applications/formats care about assigned material slots (i.e. which verts have which material assigned), but not the actual material that is assigned, as they need these for assigning their own materials. – Sazerac, Nov 24, 2017 at 1:38
  • @Duarte Farrajota Ramos there are countless export scripts that export Blender meshes with their assigned materials – phil123, Jan 1, 2018 at 13:26
  • @phil123 You should post that as an answer below, with links to said scripts and a description of the workflow used, along with some screenshots of the results – Commented Jan 1, 2018 at 20:28
  • FBX supports textures. So the bit about it not being supported by the file format is just plain wrong. – Commented Aug 28, 2019 at 1:26

When exporting to .fbx you can't export materials: you'll get the surface color, but not other settings such as glossiness and reflection. Textures, however, can be exported, but you have to use Blender Internal (it does not work with Cycles) and make sure you don't use nodes. Apply the textures, then in the FBX export settings scroll to the bottom, set Path Mode to Copy, and check the box-like icon next to it; the textures should now be embedded in the exported FBX file.
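The same settings can also be driven from Python; a minimal sketch (parameter names as found in Blender's bundled FBX exporter, output path hypothetical):

    import bpy

    # Export FBX with textures copied and packed into the .fbx file itself
    bpy.ops.export_scene.fbx(
        filepath="/tmp/model.fbx",   # hypothetical output path
        path_mode='COPY',            # copy textures alongside the exported file
        embed_textures=True,         # pack the copied textures into the .fbx
    )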

PS: I didn't test the new Blender 2.79.

(Screenshot: Blender FBX export settings to embed textures)

