
I'm brand new to the Blender world (and 3D in general). I started diving in about a week ago, with the ultimate goal of making assets for a video game project I plan to start next year with some friends.

My question is: what is the de facto workflow for creating meshes, applying maps, and handling texturing and lighting effects for game art? I'm especially hazy about where and how you create the various maps (like normal and height maps) and add textures (i.e. wood, steel, etc.)... Is that typically all done in Blender and brought directly into your game engine, or is that part of the process handled in Unity/Unreal?

I've seen some great tutorials online, but they usually handle only one part of the puzzle and don't explain how the various pieces integrate. The more detail you can provide, the better! I'm a longtime Photoshop user, but if you explain the process from the GIMP perspective, I'm sure I can figure out how to translate it to PS.

Specifically:

  • Do applied Blender shaders work in game engines or are they only for rendering directly from Blender?
  • If so, do lighting effects (like emissive materials and specularity) work in an equivalent way in other game engines?
  • Can you stack multiple maps (i.e. diffuse + specular) and then import the targeted asset in another program?
  • When you unwrap a mesh, can you import it into Photoshop? Is there a "best practice" for doing so? Any tutorials you'd recommend?

TL;DR

What is the standard game asset workflow for Blender? For example:

high-poly mesh in Blender >
bake to low-poly mesh >
UV unwrap >
digital paint and maps in Photoshop >
apply maps, shaders and lighting in game engine (Unity/Unreal/whatevs)

3 Comments

  • Blender is a tool (like a hammer). You can use it to create 3D models, animations, materials, textures, and more. There is no standard workflow; you have to develop your own personal process, and it depends strongly on the other tools you are using (especially the game engine of your choice).
    – Monster
    Commented Nov 2, 2015 at 6:20
  • I'll be using Unity. My biggest question is whether you can do textures, maps, and shaders in Blender and port those over to your game engine, or if it has to be done in-engine. For instance, I read elsewhere that game-engine lighting works very differently from something raytraced like Cycles. Granted, I'm sure there isn't a "this-is-the-ONLY-way-to-do-it" type of workflow, but perhaps there are some fairly common ones? Commented Nov 2, 2015 at 16:20
  • In that case you use Blender as an asset-creation tool (like GIMP). That has nothing to do with the built-in Blender Game Engine (it will not create any assets for you). For details you should check the relevant Unity resources (tutorials, manuals, forums, etc.).
    – Monster
    Commented Nov 5, 2015 at 8:05

3 Answers


There are many ways. Sometimes I use this one:

  1. Model the low-poly mesh first.
  2. UV unwrap it.
  3. Texture it.
  4. In the Properties panel, go to the Object Data tab, then "UV Maps", and add a second UV map by clicking the plus button. Click it to make it active, then unwrap the low-poly model a second time; the first UV map was laid out for texturing, so this second one gives you a clean, non-overlapping layout to bake onto.
  5. Create a new image texture and bake the textures and normal maps into it (see the scripted sketch after this list).
  6. Remove all materials and the second UV map.
  7. Duplicate the low-poly mesh.
  8. Add more detail to the duplicate so it becomes the high-poly model.
  9. Bake the normal map of the high-poly mesh onto the low-poly one.
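For anyone who wants to automate steps 4-5, here is a minimal sketch using Blender's Python API (bpy). It assumes the low-poly object is active and already has a node-based material; the UV map name, image name, size, and output path are illustrative assumptions, not required values.

    import bpy

    obj = bpy.context.active_object          # low-poly mesh, assumed active
    mesh = obj.data

    # Step 4: add a second UV map, make it active, and re-unwrap.
    bake_uv = mesh.uv_layers.new(name="BakeUV")   # name is an assumption
    mesh.uv_layers.active = bake_uv

    bpy.ops.object.mode_set(mode='EDIT')
    bpy.ops.mesh.select_all(action='SELECT')
    bpy.ops.uv.smart_project(angle_limit=1.15)    # quick non-overlapping layout
    bpy.ops.object.mode_set(mode='OBJECT')

    # Step 5: create an image to bake into and make it the bake target
    # (baking writes to the active Image Texture node of the material).
    img = bpy.data.images.new("BakedNormal", width=2048, height=2048)
    mat = obj.active_material
    tex_node = mat.node_tree.nodes.new('ShaderNodeTexImage')
    tex_node.image = img
    mat.node_tree.nodes.active = tex_node

    bpy.context.scene.render.engine = 'CYCLES'    # baking is a Cycles feature
    bpy.ops.object.bake(type='NORMAL')

    img.filepath_raw = "//baked_normal.png"       # path relative to the .blend
    img.file_format = 'PNG'
    img.save()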

There is no single workflow, or even a most common de facto workflow, for game assets, because the workflow varies a lot based on what the asset will be used for in a game. You might want to follow tutorials by Grant Abbitt, who specializes in game assets for beginners, to get a general feel for some of the various approaches.

The answers to the specific questions all pretty much amount to "it depends on the game engine". Specifically, game engines usually require Blender to export assets in certain file formats the engine can import. What survives the trip depends on what the file format can represent, and is limited by both the Blender exporter and the game engine's importer.
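As a concrete (and hedged) example, the whole export step can be a single call to Blender's Python API; the file path is a placeholder, and the options shown are common choices when targeting Unity, which imports FBX natively, not mandatory ones:

    import bpy

    # Export the currently selected objects as FBX for the game engine.
    bpy.ops.export_scene.fbx(
        filepath="//assets/crate.fbx",        # placeholder path
        use_selection=True,                   # only the selected objects
        apply_scale_options='FBX_SCALE_ALL',  # common fix for Unity scale issues
    )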

The workflow in general depends on the use the asset will receive. High-end "hero" assets in games designed for high-end consoles may involve everything from motion capture to photogrammetry to textures imported from programs like Substance Designer, requiring extensive use of third-party tools in the pipeline before Blender, as well as heavy manipulation inside it.

At the other extreme, low-end "background" assets might consist of simple low-polygon-count meshes, with texture details provided by maps (possibly baked from high-polygon versions) and very simple shaders.


Meshes are imported. Textures are just images that don't even need any special import process, just "save". Materials are not imported, but created for the specific game/rendering engine.

Your specific questions first, then your TL;DR:

Do applied Blender shaders work in game engines or are they only for rendering directly from Blender?

Depending on what you mean by "applied", materials/shaders are specific to the rendering engine. There's not really any "export material". There are commonalities in how things are going to be handled, but if you haven't built, say, transmission into your game shader, you're not going to get transmission.
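The nearest thing to an "export material" is a standardized PBR description like glTF's: Blender's glTF exporter reads the Principled BSDF inputs (base color, metallic, roughness, emission) and writes them out as parameters, which the engine then rebuilds with its own shader code. A hedged sketch; the path is an assumption:

    import bpy

    # GLB packs mesh, textures, and PBR material parameters into one file;
    # the importing engine still supplies its own shaders to interpret them.
    bpy.ops.export_scene.gltf(
        filepath="//assets/crate.glb",   # placeholder path
        export_format='GLB',
        use_selection=True,
    )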

If so, do lighting effects (like emissive materials and specularity) work in an equivalent way in other game engines?

No, not necessarily. Again, these are bits of materials, and your game shader decides how to handle them. If you don't build emission into your game shader, you don't get emission. (But emission, as Eevee does it, is the simplest shader there is.) There are multiple ways to handle specularity, and you decide how to handle it when you make your game shader(s).
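For illustration: on the Blender side, an emissive material is just two node inputs, and whether it actually glows in-game is entirely up to the engine's shader. A sketch assuming Blender 4.x input names (earlier versions call the color input "Emission"); the material name is hypothetical:

    import bpy

    mat = bpy.data.materials.new("LampGlow")     # hypothetical material name
    mat.use_nodes = True
    bsdf = mat.node_tree.nodes["Principled BSDF"]
    bsdf.inputs["Emission Color"].default_value = (1.0, 0.8, 0.4, 1.0)
    bsdf.inputs["Emission Strength"].default_value = 5.0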

Can you stack multiple maps (i.e. diffuse + specular) and then import the targeted asset in another program?

Same theme: you can import textures, whether those are diffuse color maps, specular color maps, roughness/gloss/specular-power maps, environment maps, or even IOR maps (for Fresnel). But "import" here makes it sound more complicated than it is: textures are just image files; you create them and then save them. You can create your own game shader that uses any number of kinds of textures, but if you don't have a game shader that uses those kinds of textures, then there's no reason to save those images. Textures are merely parameters for your game-engine materials, and if your game-engine material doesn't read those parameters....

When you unwrap a mesh, can you import it into photoshop? Is there a "best practice" to doing so? Any tutorials you'd recommend?

While I don't have PS, you generally cannot import 3D models into 2D image-editing apps. You can export a UV layout -- which is just a picture with the outlines of your UV map's edges, just an image, not the weird and magical thing beginners sometimes take it for -- but painting on a 2D image with that as a guide is often not the best way to do things. There are other apps, most popularly Substance Painter, which will import 3D meshes, with their UV maps, and let you paint in 3D. This is similar to Blender's texture painting, except SP is a lot better (more powerful, cleaner). There are free SP clones, but I can't speak to their quality.
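Exporting that UV guide from Blender is one operator call. A sketch using bpy, where the path, size, and opacity are assumptions (match the size to your target texture):

    import bpy

    bpy.ops.object.mode_set(mode='EDIT')   # the operator expects Edit Mode
    bpy.ops.uv.export_layout(
        filepath="//uv_guide.png",         # placeholder path
        size=(2048, 2048),                 # match your texture resolution
        opacity=0.25,                      # faint face fill; wireframe stays visible
    )
    bpy.ops.object.mode_set(mode='OBJECT')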

What is the standard game asset workflow for Blender?

high-poly mesh in Blender > bake to low-poly mesh > UV unwrap > digital paint and maps in Photoshop > apply maps, shaders and lighting in game engine (Unity/Unreal/whatevs)

That is a perfectly reasonable workflow. We might add "rig" in there, which would come before or after the UV unwrap (either is reasonable, there are reasons for both, and you can always iterate). A reasonably professional workflow is going to involve Substance Painter more than Photoshop. By "apply maps" etc., what is really meant is to design a material, which, yes, is game-engine specific. A professional team will have both a shader coder and artists who make maps, and the shader coder will likely end up creating a standard material and defining which maps the artists need to create. Realistically, a game is going to have a limited number of material templates to choose from -- possibly only one, roughly comparable to Blender's Principled BSDF.

But perhaps it would be better to say that "apply shaders" etc. is the first step, or runs in parallel with making the high poly. You can't really make your textures without knowing how they're going to be used by your game engine. In some cases you can make a reasonable guess (a diffuse is a diffuse is a diffuse), but do you need a map of tangent rotation? Do you need an AO map? Do you need both an AO map and a cavity map? Those aren't questions you can answer until you know what your shader(s) are doing.

I said that is a reasonable workflow because it is not the only potential workflow. Sculpt, retopo, bake high-to-low is not the only way to make meshes and textures; you'll see different workflows behind Call of Duty, Overwatch, and Genshin Impact. The reality is, you adopt a workflow that suits your desired output, and the desired output differs from game to game.

