$\begingroup$

I'm trying to add objects (feathers would be a good example, or similar to How to Tweak Dragon Scales Scaling from a Particle Emitter) to a very organic shape, and I'm trying to keep my workflow as procedural as possible so I can go back and easily change things (density/count in particular). Since I need a semi-random but "uniform" (read: not clumpy) distribution, it looks like geometry nodes are the only feasible option.

Now, I had this mostly working with a particle system, using the mesh tangent as the base object alignment. But... clumpy. So I'm trying to use geometry nodes with Poisson distributed points instead, which works well as far as not clumping, but now I can't figure out how to apply the proper rotation to the instances. How do I replicate the rotation that I achieved with particles?

(Basically, this, which has no answers. 😭)

I'd be interested in both:

  • How to achieve this only using geometry nodes.
  • How to automatically generate an appropriate vector-field texture.

If it helps, the logic to do this for particles is in psys_interpolate_face, specifically, the logic that computes utan (u⃑) and vtan (v⃑). (Don't worry about phase; I'm not sure I actually need anything but u⃑ for my immediate project, and I'm pretty sure that once I can get u⃑ and v⃑, I can figure out on my own how to deal with the phase.)


I can't share my "real" model, but here's a quick-and-dirty example (using particles to show desired alignment):

sample

The UV unwrap for this looks sort of like a bow tie. Note how the pyramids are not just pointing down, but are describing an arc that is more extreme toward the back (left, in this image) of the sphere and reverses at the mid-line in the front of the sphere (not visible in this image).

The goal is for the "top" of the pyramids to always point away from the sphere (i.e. in the direction of the mesh normal), while the long "tip" points in the direction that corresponds to the texture u⃑.

$\endgroup$

3 Answers

$\begingroup$

This seems to work for me...

Texture-based Approach

Generating the Texture

First, we need a texture that encodes the u⃑ tangent vector. Assign a material to the object for which the tangent texture will be generated and go to the "Shading" workspace to edit the material's nodes. We'll need the following:

  • Input ⏵ Tangent
  • Texture ⏵ Image Texture
  • Converter ⏵ Map Range
  • Shader ⏵ Diffuse BSDF
  • Output ⏵ Material Output

The "Material Output" node should already be present; any other existing nodes can be deleted. Note that the "Image Texture" node is only present so that baking knows what to target; it does not need to be connected. However, its Color Space should be changed to Non-Color. Be sure to select the intended target image in this node to avoid a confusing error; if you don't already have an image, you'll need to create one.

The other nodes should be connected as follows: Tangent:Tangent to Map Range:Vector, Map Range:Vector to Diffuse:Color, and Diffuse:BSDF to Material Output:Surface. Also, in the Map Range node, the input range should be (-1,-1,-1) to (1,1,1), and the output range should be the default (0,0,0) to (1,1,1); this converts the signed tangent vector components to unsigned color components. The Tangent node also needs its mode changed from the default Radial to UV.
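For reference, the value mapping the Map Range node performs here is just a linear repack of signed vector components into the 0–1 color range, reversed later on the Geometry Nodes side. A sketch in plain Python (function names are mine, not anything from Blender):

```python
# The Map Range node's job in this setup: tangent components live in
# [-1, 1], but a baked color channel lives in [0, 1], so we shift and
# scale on the way in, and undo it on the way out.

def vector_to_color(v):
    """Map each component from [-1, 1] to [0, 1] (baking direction)."""
    return tuple((c + 1.0) / 2.0 for c in v)

def color_to_vector(c):
    """Map each component from [0, 1] back to [-1, 1] (instancing direction)."""
    return tuple(2.0 * x - 1.0 for x in c)
```

So, for example, a tangent pointing along +X, (1, 0, 0), bakes to the color (1.0, 0.5, 0.5); the mid-gray 0.5 is what "zero" looks like in the baked image.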

Your node graph should look something like this:

At this point, the viewport should be giving a reasonable preview. Here's what it looks like for the sample project in the question:

Now we need to "bake" this. In the Properties panel, select "Render Properties" (television icon). Set the engine to Cycles. If you have a supported GPU, you can set the Device to GPU Compute.

Scroll down to Bake and expand the section. Note the "Bake" button at the top of the section; we'll come back to that shortly. Set the Bake Type to Diffuse. Under Influence, turn off Direct and Indirect (we don't want lighting being applied!). Make sure the Output Target is Image Textures. Go back and hit the Bake button and wait a while. When it's done, switch back to the UV Editing workspace and save the image.

Here's what we got for the sample project:

(For some reason, probably related to color spaces, the version seen here is significantly darker than it appears in Blender, or even in the preview above. The gradients are still visible, so don't be alarmed if you see a similar difference in lightness.)

Orienting the Instances

To add the instances, add a Geometry Nodes modifier and switch to the Geometry Nodes workspace to edit it. We'll need the following nodes:

  • Input ⏵ Group Input (should already be present)
  • Output ⏵ Group Output (should already be present)
  • Point ⏵ Distribute Points on Faces
  • Instances ⏵ Instance on Points
  • Input ⏵ Object Info
  • Texture ⏵ Image Texture
  • Utilities ⏵ Map Range
  • Utilities ⏵ Align Euler to Vector

Connect the Group Input's Geometry output to the Distribute Points' Mesh input. We're going to assume the objective is semi-uniform spacing, so change the Distribute Points' distribution from Random to Poisson Disk, then adjust the Distance Min and Density Max to taste. (Unless the goal is a small number of sparse instances, I typically find it necessary to crank the Density Max to around 10k-50k. Don't worry, this only controls the number of candidate points; the Distance Min will reduce this number significantly. Be warned, however, that going much over 100k will leave Blender spending many CPU cycles deciding which of the candidate points to keep.)

If using a vertex group for instance density, connect the Distribute Points' Density Factor to an empty socket on the Group Input. This will cause a new line to appear in the object's Modifier Properties. The offset-cross icon (Input Attribute Toggle) toggles this between a constant value and various attributes, such as vertex group weights; clicking the input field allows a vertex group to be selected.
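As an aside, the relationship between Density Max (many candidates) and Distance Min (aggressive thinning) can be illustrated with a naive dart-throwing sketch. This is only an illustration of the idea, not Blender's actual Poisson-disk implementation:

```python
import math
import random

def poisson_thin(candidates, min_dist):
    """Greedily keep candidate points, rejecting any candidate that
    falls closer than min_dist to an already-kept point."""
    kept = []
    for p in candidates:
        if all(math.dist(p, q) >= min_dist for q in kept):
            kept.append(p)
    return kept

# Lots of candidates in the unit square (the "Density Max" knob)...
random.seed(1)
candidates = [(random.random(), random.random()) for _ in range(5000)]

# ...thinned down hard by the minimum-distance constraint.
points = poisson_thin(candidates, 0.05)
```

The kept set is a small fraction of the candidates, which is why a huge Density Max doesn't produce a huge instance count, and why the thinning step gets expensive as candidate counts grow.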

Connect the Distribute Points' Points output to the Points input of Instance on Points. In the Object Info node, select the object to be used as the instances, and connect the Geometry port to the Instance on Points's Instance port. Connect the Instance on Points's Instances to the Group Output's Geometry.

At this point (no pun intended) our instance objects should be visible, but incorrectly oriented, and possibly with an unreasonable scale. The last can be adjusted in the Instance on Points node.

To get the rotation, first ensure that the correct image is selected in the Image Texture node. Connect this node's Vector input to a new output socket on the Group Input node, then, in the Modifier Properties panel, connect that input to the object's UV map (see the notes on instance density for detailed instructions). Connect the Color output of the Image Texture node to the Vector input of the Map Range node (which first needs its mode changed from Float to Vector). The input range should be the default (0,0,0) to (1,1,1), and the output range should be (-1,-1,-1) to (1,1,1). This should be familiar; we're reversing what we did to go from vector space to color space during baking. Connect the Vector output of the Map Range node to the Vector input of the Align Euler to Vector node, connect the Rotation output of the Distribute Points on Faces node to the Rotation input of the Align Euler to Vector node, and finally connect the Rotation output of the Align Euler to Vector node to the Rotation input of the Instance on Points node.

Your node graph should now look something like this:

However, our instances may not be correctly aligned. Fixing this is a matter of trial and error fiddling with the input axis of the Align Euler to Vector node. This is dependent on how your UV map is oriented and how you want your instances to be oriented. One trick that will help is to temporarily bypass the Align Euler to Vector node and connect the Rotation output of the Distribute Points node to the Rotation input of the Instance on Points node, then rotate the instance object to point along the normal vector. (Don't forget to apply the rotation after fiddling!)

A Geometry ⏵ Join Geometry node can be used if the input surface is to be retained. Some math and/or random nodes, and possibly another Map Range node, can be used to adjust the instance scale and/or add some random variation. Hooking these up is left as an exercise for the reader.

Here's my final result, using the test project:

Note that I wasn't trying to match the original scale or density, as those are easily tweaked. The objective was rather to match the orientation, which this seems to achieve satisfactorily. Note also that this render was done with a Join Geometry node that is absent in the downloadable .blend version.

$\endgroup$
  • $\begingroup$ Well solved! ...you did not need our help ;-) +1 $\endgroup$
    – quellenform
    Commented Sep 2, 2022 at 0:58
  • $\begingroup$ btw: I don't know what's wrong, but some text is not displayed correctly. What characters did you use there? i.sstatic.net/ZlIzT.png $\endgroup$
    – quellenform
    Commented Sep 2, 2022 at 1:02
  • $\begingroup$ @quellenform, I wouldn't say vklidu didn't help, I definitely copied bits of that setup... and I wish that answer hadn't been deleted. Also, that's just a right-pointing triangle, used as a menu separator, because arrows (→) line up horridly for me, and I'm not a fan of the ASCII -> representation 🙂. $\endgroup$
    – Matthew
    Commented Sep 2, 2022 at 1:02
$\begingroup$

As you can see from the comments, you don't want to show us a screenshot or your blend file, so unfortunately the answer is a bit short and theoretical:

Add the modifier Geometry Nodes, get your mesh and scales with Object Info, pre-transform the scales in local space with Transform, use Distribute Points on Faces and transfer the generated rotation to the Instance on Points node.

enter image description here

$\endgroup$
  • $\begingroup$ This aligns the instances to the normals, but not the tangents. $\endgroup$
    – Matthew
    Commented Sep 2, 2022 at 0:31
  • $\begingroup$ @Matthew Correct. Because here the tangents are completely missing. I would have liked to help you more, but it was hard to get concrete information out of you, so I had to leave the field to others ;-) $\endgroup$
    – quellenform
    Commented Sep 2, 2022 at 0:52
$\begingroup$

So... I'm probably late for this one, but since I haven't seen this answer elsewhere, I'll give it a shot. I won't deal with orientation here, first because others have already answered that, and second because the main issue, as far as I see it, is getting the tangent inside Geometry Nodes.

So this answer solves the issue of getting the tangent exclusively through Geometry Nodes, as far as possible. The only thing that you need is a valid UVMap attribute.

A bit of theory

UV as Binormal and Tangent

Tangent and binormal, as defined through Mikktspace, are basically the V and U directions converted to world space. For any given point, the direction from that point's position to the position (both in world space) of the closest point in the V direction (in UV space) is the tangent direction. Do the same thing for the U direction and you have the binormal.

Also: given any two of normal, binormal, and tangent, the third will be the cross product of the other two.
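That last relationship is quick to sketch in plain Python (no Blender API; this assumes the two inputs are unit length and perpendicular, and note the sign of the result depends on your handedness convention):

```python
# Cross product of two perpendicular unit vectors yields the third
# axis of the normal/tangent/binormal frame.

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

normal   = (0.0, 0.0, 1.0)
tangent  = (1.0, 0.0, 0.0)
binormal = cross(normal, tangent)   # completes the frame
```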

Sampling the UVs

That being said, a first setup that will actually get the Tangent is this one:

enter image description here

Here the UVs are sampled to get the closest position in world space following the V direction, and I take the difference to get the direction. This solves most cases, but at some borders, where advancing in the V direction doesn't sample anything, we may need to sample the other directions as well. In that case, cascade the checks for the directions with Switch nodes, and take a cross product with the normal when checking against the U direction (because that is technically the binormal).
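The "step in V, see where you land in world space" idea can be made concrete with the standard per-triangle UV derivative: a triangle's UV coordinates define a linear map from UV space to positions, and dP/dV falls out of solving that map. A minimal sketch under that assumption (helper names are mine, not Blender's):

```python
# Direction of a +V step in world space, from one triangle's
# linear UV -> position mapping.

def sub(a, b):   return tuple(x - y for x, y in zip(a, b))
def add(a, b):   return tuple(x + y for x, y in zip(a, b))
def scale(a, s): return tuple(x * s for x in a)

def dP_dV(p0, p1, p2, uv0, uv1, uv2):
    """World-space direction corresponding to a unit step in +V."""
    e1, e2 = sub(p1, p0), sub(p2, p0)           # position edges
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]  # UV edges
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
    det = du1 * dv2 - du2 * dv1                  # UV-space area factor
    # dP/dV = (-du2 * e1 + du1 * e2) / det
    return scale(add(scale(e1, -du2), scale(e2, du1)), 1.0 / det)
```

For an axis-aligned triangle whose UVs match its XY positions, this returns the +Y axis, as you'd expect; normalizing the result gives the direction the instances should align to.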

Cascade checks

And this is what things align to:

Instance alignment

This one is actually easier to read in the file, so here is the example file:

$\endgroup$
