Most file formats just don't support exporting textures, let alone full-blown material definitions or other application-specific features. Historically, most Blender importers/exporters also didn't fully support node based materials.

For this reason no exchange file format you use can, or even tries to, import or export material properties, be it 3DS, FBX, Collada, STL, OBJ or any other. These are mostly mesh-only, geometry-centric file formats concerned with porting mesh-based object shapes, and sometimes animation, armatures, and basic color properties (like MTL files); never full, complex material definitions.
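
To see how little such a format can carry, here is a roughly complete legacy .mtl material definition: a handful of fixed colors and scalars, an illumination model index, and paths to a few image maps (the material and file names here are made up):

```
newmtl PaintedMetal
Ka 1.000 1.000 1.000        # ambient color
Kd 0.800 0.200 0.200        # diffuse color
Ks 0.500 0.500 0.500        # specular color
Ns 96.0                     # specular exponent
d 1.0                       # opacity
illum 2                     # fixed-function illumination model
map_Kd painted_metal_diffuse.png
map_Bump painted_metal_normal.png
```

There is simply nowhere in this format to put a node graph, a procedural texture or a custom shader.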

Many of these transitory file formats originate from the fixed GPU pipeline era of materials, when materials were a lot more limited, and there are simply no data structures in their specifications to accommodate that type of data, let alone all possible types of properties, settings or exotic combinations of maps. Even if there were, there are way too many different rendering techniques, built for a variety of purposes and responding to distinct requirements (like realism or responsiveness), to be able to correctly map properties, parameters or particular features between all of them easily.

For example, most real time rendering systems (like game engines) are optimized for speed or responsiveness, and need some form of explicit "backface culling" option, because rasterization relies heavily on being able to discard invisible geometry (that is facing away from the point of view) for performance reasons. Yet in 'offline renderers' like Cycles (concerned with realism over speed) it is just an optional node, because for raytracers it is irrelevant which direction a face points, since both sides are always taken into account. On the other hand, glass and transparent shaders "just work" with great refraction in raytracers, yet in EEVEE you need to muck about with blending modes, transparency settings, screen space refraction and reflection probes, all nonexistent in Cycles, because representing object interactions like reflections or refraction is expensive and complex for rasterizers.
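
As a small illustration of how renderer-specific these options are, here are a few per-material EEVEE settings as exposed in Blender's Python API (Blender 2.8x up to the 4.1-era EEVEE; the material name is hypothetical). No common exchange format has a slot for any of them:

```python
import bpy

mat = bpy.data.materials["Glass"]  # hypothetical material name

mat.use_backface_culling = True    # rasterizer-only optimization
mat.blend_method = 'BLEND'         # EEVEE alpha blending mode
mat.shadow_method = 'HASHED'       # how EEVEE renders this material's shadows
mat.use_screen_refraction = True   # EEVEE screen space refraction
```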

Certainly never expect proper export of any procedurally generated textures (like Noise or Voronoi), image textures using "parametric" texture coordinates (like Object or Generated), or anything running through other nodes before the final shader (like Color Ramps or color adjustments); these are always calculated at render time by the engine (like Cycles, or EEVEE for display purposes), and can't be exported elsewhere. Keep in mind that pre Blender 2.8x most Blender exporters did not even support Cycles node based materials well, so even image textures used in Cycles node trees are often not expected to export correctly.

As of the 2.9+ series some work has been done, and the more common image based texture maps, like diffuse, specular or glossiness, may sometimes be correctly preserved for the increasingly popular "PBR workflows", when simple image textures are directly connected to Principled BSDF shader nodes, but even this shouldn't be relied upon. The manual states which limited subset of node setups is supported.
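
If you want a rough idea of which of your materials stand a chance, a small inspection script along these lines can flag node setups that go beyond the plain Image Texture to Principled BSDF pattern exporters tend to understand. This is only a sketch of the idea, not a reproduction of any particular exporter's actual rules:

```python
import bpy

for mat in bpy.data.materials:
    if not mat.use_nodes:
        continue
    principled = [n for n in mat.node_tree.nodes
                  if n.type == 'BSDF_PRINCIPLED']
    if not principled:
        print(f"{mat.name}: no Principled BSDF, unlikely to export at all")
        continue
    for node in principled:
        for sock in node.inputs:
            for link in sock.links:
                # Anything other than a direct Image Texture connection
                # (Color Ramps, Mix nodes, procedurals...) is likely lost.
                if link.from_node.type != 'TEX_IMAGE':
                    print(f"{mat.name}: input '{sock.name}' is fed by a "
                          f"{link.from_node.type} node, probably won't export")
```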

The same applies to many other features specific to applications or render engines, including but not limited to:

  • Lighting parameters, shadow settings, lamps, camera properties
  • Physics simulations (fluids, cloths, soft bodies)
  • Texture specific options (clamping, clipping, tiling, color adjustments)
  • Generated texture coordinates (anything not mapped explicitly by a UV map or created by unwrapping a mesh), like Object or Generated texture coordinates, or anything mapped through a Vector Mapping or other vector manipulation nodes
  • Material options (backface culling, shadows, visibility, transparency and blending modes)
  • Animations of material, shader, texture, light or camera properties
  • Particle systems, physics simulations (fluids, cloths, smoke or fire sims)
  • Volumetric data, modifier parameters, shape morphing, and generative, shape-altering or deformation-based animations (like shape keys, or the Ocean, Build and Geometry Nodes modifiers)
  • Bezier curve parameter based animations (like bevels, extrudes, trims, and "along path" effects)
  • Visibility options (like hide/unhide, wireframe, shadow, etc)
  • And any other parametric or otherwise "live generated" data

For the most part none of these can be imported or exported (with a few exceptions).

Even if Blender did support exporting any of these features, the exchange file format would have to support "holding" these types of information. On top of that, there would also have to be feature parity at the receiving end. That means the application reading the exported model would have to support both reading that information and correctly mapping said data to similar or equivalent features in the importing environment, which may or may not be available.

That adds all the complexities of ensuring an exchanged model looks similar on both ends, without loss of information, as an end user would expect when opening their work in a different software. Multiply this by the virtually infinite number of possible combinations of importer and exporter software, and ensuring any one single feature works acceptably in all of them.

Some efforts are currently being planned to have node group assets set up in such a way that they use supported shaders under the hood, and are prepared for exporting to supported file formats using standardized maps and sockets. This is currently in preliminary planning phases; see Bundle shader node group assets for compatibility #113145.

As of Blender 3.4+ (commit a99a62231e04) the OBJ importer has been improved to support the OBJ PBR MTL extension, an extension to the original legacy .mtl file format that tries to cover modern PBR materials.
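
For reference, the extension just adds extra PBR keys on top of the legacy ones, roughly like this (values and texture file names are placeholders):

```
newmtl PaintedMetal
Kd 0.800 0.200 0.200   # legacy diffuse color still applies
# PBR extension keys, not part of the original spec:
Pr 0.35                # roughness
Pm 1.0                 # metallic
Pc 0.1                 # clearcoat thickness
Pcr 0.05               # clearcoat roughness
Ke 0.0 0.0 0.0         # emissive color
map_Pr roughness.png
map_Pm metallic.png
norm normal.png        # normal map
```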

There are other caveats and limitations, and this should not be relied upon; some materials may fail to import or export correctly, as stated in the task Import/Export: Wavefront Obj Mtl Overhaul (Improved Compatibility and PBR Extension Support).

All you can, and should, import/export is the UV coordinates along with your mesh, so you can correctly apply your textures in the "target application", game engine or receiving rendering environment where you'll be importing and displaying your model, be it Unreal, Unity, Gamekit, sculpting toolkits, other modelling or animation suites, external rendering software, VR environments, web viewers or whatnot; or even Blender itself when importing models from elsewhere. UV maps are generally correctly preserved by most exchange file formats by default, save for complex setups with more than one UV map per mesh.
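
Since UV maps are effectively the one thing you can rely on, a quick pre-export sanity check like this sketch (using the standard bpy API) can catch meshes with missing or extra UV layers before you ship the file:

```python
import bpy

for obj in bpy.data.objects:
    if obj.type != 'MESH':
        continue
    uvs = obj.data.uv_layers
    if len(uvs) == 0:
        print(f"{obj.name}: no UV map, textures can't be mapped after export")
    elif len(uvs) > 1:
        print(f"{obj.name}: has {len(uvs)} UV maps, some formats keep only one")
```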

There you should spend some time recreating your materials from scratch with the provided textures and available maps. Yes, it requires some backtracking and will take some time, but with practice it'll get quicker. Use whatever native editor is available locally, and save presets or build reusable libraries when possible, to reduce the amount of repetition.

Baking is the process of pre-calculating shading and storing it in a static image texture that may incorporate several optional channels like diffuse, glossy, indirect lighting, bump maps, normal maps, light-maps, among others.

This is often a requirement for high performance media or low powered platforms, like browser based or mobile gaming, where available resources are limited or unknown, and speed takes precedence over graphic fidelity.

This may improve perceived lighting quality or "realism" at the expense of dynamism, as certain properties of materials and textures become static, as if "painted onto the surface", like shadows or reflections. It is also an adequate method for exporting native procedural textures and materials to other applications, if baked without the "static" shadows or reflections.
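
A minimal baking sketch in Cycles could look like the following, assuming a UV-unwrapped mesh object whose material uses nodes (the object, image and file names are hypothetical). Baking only the diffuse color pass leaves lighting, shadows and reflections out of the resulting texture:

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'             # baking is a Cycles feature

obj = bpy.data.objects["Model"]            # hypothetical object name
bpy.context.view_layer.objects.active = obj
obj.select_set(True)

# The bake target is the active Image Texture node of the material.
img = bpy.data.images.new("Model_diffuse", width=2048, height=2048)
nodes = obj.active_material.node_tree.nodes
bake_node = nodes.new('ShaderNodeTexImage')
bake_node.image = img
nodes.active = bake_node

# Bake only the diffuse color, skipping direct/indirect lighting.
bpy.ops.object.bake(type='DIFFUSE', pass_filter={'COLOR'})

img.filepath_raw = "//Model_diffuse.png"   # path relative to the .blend
img.file_format = 'PNG'
img.save()
```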

As the PBR workflow gains popularity and traction it is possible that more applications "join the movement" and implement it in the future. If a common standard can be agreed upon, more importers/exporters may improve compatibility with an increasing number of material properties.

They may either read directly from a Blend file (like Unreal does with a bespoke importer), use some other hypothetical new file format or an extension to an existing one (like OBJ), or be forced to implement new standards, as the current ones still lack data structures to correctly describe them. At this point it is pure speculation, and no known plans have been made.
