Most file formats just don't support exporting textures, let alone full blown material definitions or other application specific features. Also, historically, most Blender importers/exporters didn't fully support node based materials.
For this reason no exchange file format you use can, or even tries to, import or export material properties, be it 3DS, FBX, Collada, STL, OBJ or any other. These are mostly mesh-only, geometry-centric file formats concerned with porting mesh based object shapes, and sometimes animation, armature, and basic shading or color properties (like MTL files); never full complex material definitions.
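The classic .mtl format makes this limitation concrete: it only carries a handful of fixed-pipeline values and image map paths. A toy parser (purely illustrative, not Blender's actual importer) shows the kind of data such a file holds:

```python
# Parse a minimal classic .mtl material definition to show how little it
# carries: just fixed-pipeline scalars and image map paths; no nodes, no
# procedural textures, no shader graphs.
def parse_mtl(text):
    materials = {}
    current = None
    for line in text.splitlines():
        parts = line.split()
        if not parts or parts[0].startswith('#'):
            continue  # skip blank lines and comments
        key, values = parts[0], parts[1:]
        if key == 'newmtl':
            current = materials.setdefault(values[0], {})
        elif current is not None:
            # Texture statements (map_Kd etc.) hold a path; the rest are numbers
            current[key] = values[0] if key.startswith('map_') else [float(v) for v in values]
    return materials

sample = """
newmtl Glossy
Kd 0.8 0.1 0.1
Ks 0.5 0.5 0.5
Ns 96.0
map_Kd diffuse.png
"""

mats = parse_mtl(sample)
print(mats['Glossy']['Kd'])      # [0.8, 0.1, 0.1]
print(mats['Glossy']['map_Kd'])  # diffuse.png
```

Everything a legacy MTL can say about a material fits in that flat key/value scheme, which is why node trees simply have nowhere to go.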
Many of these exchange file formats originate from the fixed GPU pipeline era of materials, and there are simply no data structures in their specifications to accommodate that type of data, let alone all possible types of properties, settings or exotic combinations of maps. Even if there were, there are way too many different rendering techniques, built for a variety of purposes and responding to distinct requirements (like speed or responsiveness for real time rendering engines or games, or realism for physically based 'offline renders'), each using its own set of parameters and particular ways of interpreting specific properties, to be able to correctly map properties, parameters or particular features between them easily.
For example, most real time rendering systems (like game engines) are optimized for speed or responsiveness, and need some form of explicit "backface culling" option, because rasterization relies heavily on being able to discard invisible geometry (that is, geometry facing away from the point of view) for performance reasons. Yet in 'offline renderers' like Cycles (concerned with realism over speed) it is just integrated as an optional node, because for raytracers it is irrelevant which direction a face points; surfaces are always taken into account. On the other hand, glass and transparent shaders "just work" with great refraction in raytracers, yet in EEVEE you need to muck about with blending modes, transparency settings, screen space refraction, and reflection probes (nonexistent in Cycles), because representing object interactions like reflections or refraction is expensive and complex for rasterizers.
Certainly never expect proper export of any procedurally generated textures (like Noise or Voronoi), image textures using "parametric" texture coordinates (like Object or Generated), or textures running through any other nodes before the final shader (like Color Ramps or color adjustments); these are always calculated at render time by the engine (like Cycles, or EEVEE for viewport display purposes), and can't be exported elsewhere. Have in mind that pre-2.8 most Blender exporters did not support Cycles node based materials well, so even image textures used in Cycles node trees are often not expected to export correctly.
As of the 2.9+ series some work has been done, and common image based texture maps (like diffuse, specular or glossiness) may sometimes be correctly preserved for the increasingly popular "PBR workflows", when simple image textures are directly connected to Principled BSDF shader nodes, but even this shouldn't be relied upon.
The manual states which limited subset of node setups is supported.
The same applies to many other features specific to certain applications or render engines. The following, among others, can't for the most part be imported or exported (with a few exceptions):
- Lighting parameters, shadow settings, lamps, camera properties
- Physics simulations (fluids, cloths, soft bodies)
- Texture specific options (clamping, clipping, tiling, color adjustments)
- Generated texture coordinates (anything not mapped explicitly by a UV map or created by unwrapping a mesh) like Object or Generated texture coordinates, or anything mapped by a Vector Mapping or vector manipulation nodes
- Material options (backface culling, shadows, visibility, transparency and blending modes)
- Material, shader, texture, light or camera property based animations
- Particle systems, physics simulations (fluids, cloths, smoke or fire sims)
- Volumetric data, modifier parameters, shape morphing; animated, generative, shape altering or deformation based animations (like Shape Keys, or the Ocean, Build and Geometry Nodes modifiers)
- Bezier curve parameter based animations (like bevels, extrudes, trims, and "along path" effects)
- Visibility options (like hide/unhide, wireframe, shadow, etc)
- Any other parametric or otherwise "live generated" data
Even if Blender did support exporting any of these features, the exchange file format would have to support "holding" these types of information. On top of that, there would also have to be feature parity at the receiving end. That means the application reading the exported model would have to support both reading that information and correctly mapping said data to similar or equivalent features in the importing environment, which may or may not be available.
That adds all the complexities of ensuring the exchanged model looks similar on both ends, without loss of information, as an end user would expect when opening their work in different software.
As of Blender 3.4+ (commit a99a62231e04) the OBJ importer has been improved to support the OBJ PBR MTL extension, an extension to the original legacy .mtl file format that tries to cover modern PBR materials.
There are other caveats and limitations, and this should not be relied upon; some materials may still fail to import or export correctly, as stated in the task Import/Export: Wavefront Obj Mtl Overhaul (Improved Compatibility and PBR Extension Support).
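For illustration, an .mtl file using the PBR extension might look like the fragment below. The keys shown (Pr for roughness, Pm for metallic, Ke for emission, and their map_ variants) come from the extension proposal; which of them a given importer actually honors varies between applications:

```
newmtl PaintedMetal
# Classic fixed-pipeline keys, understood by legacy importers
Kd 0.8 0.8 0.8
map_Kd base_color.png
# PBR extension keys, ignored by importers that don't support them
Pr 0.35
Pm 1.0
Ke 0.0 0.0 0.0
map_Pr roughness.png
map_Pm metallic.png
norm normal.png
```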
All you can, and should, import/export are UV coordinates with your mesh, so you can correctly apply your textures at the "target application", game engine or receiving rendering environment where you'll be importing and displaying your model, be it Unreal, Unity, Gamekit, sculpting toolkits, other modelling or animation suites, external rendering software, VR environments, Web Viewers or whatnot; or even Blender itself when importing models from elsewhere. UV maps are generally correctly preserved by most exchange file formats by default, save for complex setups with more than one UV map per mesh.
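As a sketch of what that looks like from Blender's side, the snippet below exports the selected objects to OBJ keeping UVs and basic materials. It must be run inside Blender's own Python console (bpy is not available standalone), and the operator option names are from the 3.x+ exporter, so they may differ in your version:

```python
import bpy  # only available inside Blender

# Export selected objects to OBJ, keeping UV coordinates so textures can be
# reapplied in the target application. Option names may vary by version.
bpy.ops.wm.obj_export(
    filepath="//model.obj",       # '//' is relative to the .blend file
    export_selected_objects=True,
    export_uv=True,               # preserve UV maps
    export_materials=True,        # write a basic .mtl alongside the .obj
    export_pbr_extensions=True,   # 3.4+: also write PBR MTL extension keys
)
```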
There you should spend some time recreating your materials from scratch with the provided textures and available maps. Yes, it requires some backtracking and will take some time, but with practice it'll get quicker. Use whatever native editor is available, save presets and build reusable material libraries when possible, to reduce the amount of repetition.
Baking is the process of pre-calculating shading and storing it in a static image texture that may incorporate several optional channels like diffuse, glossy, indirect lighting, bump maps, normal maps, light-maps, among others.
This is often a requirement for high performance mediums or low power platforms, like browser based or mobile gaming, where available resources are limited or unknown, and speed takes precedence over graphic fidelity.
This may improve perceived lighting quality or "realism" at the expense of dynamism, as certain properties of materials and textures become static, as if "painted onto the surface", like shadows or reflections. It is also an adequate method for exporting native procedural textures and materials to other applications, so long as you avoid baking in "static" shadows or reflections.
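A minimal bake along those lines might look like the sketch below, run inside Blender's Python console with Cycles. It assumes the active object's material already has an Image Texture node selected as the bake target, with an image assigned; property names are from recent 3.x versions and may differ in yours:

```python
import bpy  # only available inside Blender

# Bake the diffuse color of the active object into its target image (Cycles).
scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# Bake only the color pass, leaving out direct/indirect lighting so no
# "static" shadows end up painted into the texture.
scene.render.bake.use_pass_direct = False
scene.render.bake.use_pass_indirect = False
scene.render.bake.use_pass_color = True

bpy.ops.object.bake(type='DIFFUSE')
```

The baked image can then be saved and wired into a plain Principled BSDF setup, or exported with the mesh like any other image texture.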
As the PBR workflow gains popularity and traction, it is possible that more applications "join the movement" and implement it in the future. If a common standard can be agreed upon, more importers/exporters may improve compatibility with an increasing number of material properties.
They may either read directly from a Blend file (like Unreal does with a bespoke importer), use some other hypothetical new file format or an extension to an existing one (like OBJ), or be forced to implement new standards, as the current ones still lack data structures to correctly describe them. At this point it is pure speculation, and no known plans exist.