This seems to work for me...
Texture-based Approach
Generating the Texture
First, we need a texture that encodes the u⃑ tangent vector. Assign a material to the object for which the tangent texture will be generated and go to the "Shading" workspace to edit the material's nodes. We'll need the following:
- Input ⏵ Tangent
- Texture ⏵ Image Texture
- Converter ⏵ Map Range
- Shader ⏵ Diffuse BSDF
- Output ⏵ Material Output
The "Material Output" node should already be present. Any other existing nodes can be deleted. Note that the "Image Texture" node is only present so that baking will know what to target; it does not need to be connected, however the Color Space should be changed to Non-Color. Be sure to select the intended image target in this node to avoid a confusing error. If you don't already have an image, you'll need to create one.
The other nodes should be connected as follows: Tangent:Tangent to Map Range:Vector, Map Range:Vector to Diffuse:Color, and Diffuse:BSDF to Material Output:Surface. The Map Range node first needs its Data Type changed from Float to Vector; its input range should be (-1,-1,-1) to (1,1,1), and its output range should be the default (0,0,0) to (1,1,1). This is needed to convert the signed tangent vectors to unsigned color components. The Tangent node also needs its mode changed from the default Radial to UV.
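The Map Range step is just a per-component affine remap. Here's a plain-Python sketch of the same math (`encode_tangent` is a hypothetical helper name; Blender does this per pixel inside the shader):

```python
def encode_tangent(t):
    """Pack a signed tangent, components in [-1, 1], into the
    unsigned [0, 1] range usable as color components."""
    return tuple((c + 1.0) / 2.0 for c in t)

# A tangent pointing along +X encodes to a reddish color:
print(encode_tangent((1.0, 0.0, 0.0)))  # (1.0, 0.5, 0.5)
```

We'll apply the exact inverse of this remap later, when reading the baked texture back in Geometry Nodes.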
Your node graph should look something like this:
At this point, the viewport should be giving a reasonable preview. Here's what it looks like for the sample project in the question:
Now we need to "bake" this. In the Properties panel, select "Render Properties" (television icon). Set the engine to Cycles. If you have a supported GPU, you can set the Device to GPU Compute.
Scroll down to Bake and expand the section. Note the "Bake" button at the top of the section; we'll come back to that shortly. Set the Bake Type to Diffuse. Under Influence, turn off Direct and Indirect (we don't want lighting being applied!). Make sure the Output Target is Image Textures. Go back and hit the Bake button and wait a while. When it's done, switch back to the UV Editing workspace and save the image.
Here's what we got for the sample project:
(For some reason, probably related to color spaces, the version seen here is significantly darker than it appears in Blender, or even in the preview above. The gradients are still visible, so don't be alarmed if you see a similar difference in lightness.)
Orienting the Instances
To add the instances, add a Geometry Nodes modifier and switch to the Geometry Nodes workspace to edit it. We'll need the following nodes:
- Input ⏵ Group Input (should already be present)
- Output ⏵ Group Output (should already be present)
- Point ⏵ Distribute Points on Faces
- Instances ⏵ Instance on Points
- Input ⏵ Object Info
- Texture ⏵ Image Texture
- Utilities ⏵ Map Range
- Utilities ⏵ Align Euler to Vector
Connect the Group Input's Geometry output to the Distribute Points' Mesh input. We're going to assume the objective is semi-uniform spacing, so change the Distribute Points' distribution from Random to Poisson Disk, then adjust the Distance Min and Density Max to taste. (Unless the goal is a small number of sparse instances, I typically find it necessary to crank the Density Max to around 10k-50k. Don't worry, this only controls the number of candidate points; the Distance Min will reduce this number significantly. Be warned, however, that going over about 100k will make Blender spend many CPU cycles deciding which of the many candidate points to keep.)

If using a vertex group for instance density, connect the Distribute Points' Density Factor to an empty output port slot on the Group Input. This will cause a new line to appear in the Modifier Properties for the object. The offset-cross icon (Input Attribute Toggle) toggles this between a constant value and various attributes such as vertex group weights, and clicking the input field allows a vertex group to be selected.
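For intuition about why Density Max only sets an upper bound, here's a naive dart-throwing sketch of Poisson-disk thinning (illustrative only; Blender's actual algorithm differs, and the helper name is made up):

```python
import random

def poisson_disk_thin(candidates, dist_min):
    """Greedily keep candidate points, rejecting any point closer
    than dist_min to an already-kept point (naive dart throwing)."""
    kept = []
    d2 = dist_min * dist_min
    for p in candidates:
        if all((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 >= d2 for q in kept):
            kept.append(p)
    return kept

random.seed(0)
cands = [(random.random(), random.random()) for _ in range(2000)]
kept = poisson_disk_thin(cands, 0.05)
# Thousands of candidates collapse to a much smaller kept set.
```

More candidates mainly fill the minimum-distance budget more evenly, which is why very large Density Max values cost CPU time for diminishing returns.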
Connect the Distribute Points' Points output to the Points input of Instance on Points. In the Object Info node, select the object to be used as the instances, and connect the Geometry port to the Instance on Points's Instance port. Connect the Instance on Points's Instances to the Group Output's Geometry.
At this point (no pun intended) our instance objects should be visible, but incorrectly oriented, and possibly with an unreasonable scale. The last can be adjusted in the Instance on Points node.
To get the orientation, first ensure that the correct image is selected in the Image Texture node. Connect this node's Vector input to a new output port on the Group Input node, then, in the Modifier Properties panel, set this input to the object's UV map (see the notes on instance density above for detailed instructions). Connect the Color output of the Image Texture node to the Vector input of the Map Range node (which first needs its Data Type changed from Float to Vector). The input range should be the default (0,0,0) to (1,1,1), and the output range should be (-1,-1,-1) to (1,1,1). This should look familiar; we're reversing the vector-to-color conversion we applied during baking. Connect the Vector output of the Map Range node to the Vector input of the Align Euler to Vector node, and the Rotation output of the Distribute Points on Faces node to the Rotation input of the Align Euler to Vector node.
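The color-to-vector remap is the exact inverse of the bake-time encoding. As a sketch (`color_to_tangent` is a hypothetical helper; the renormalization is an extra safeguard not present in the node graph, included because 8-bit storage slightly perturbs vector length):

```python
def color_to_tangent(rgb):
    """Remap each color channel from [0, 1] back to [-1, 1],
    then renormalize to recover a unit tangent vector."""
    v = [2.0 * c - 1.0 for c in rgb]
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

# A pixel storing a tangent pointing along +U:
print(color_to_tangent((1.0, 0.5, 0.5)))  # (1.0, 0.0, 0.0)
```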
Your node graph should now look something like this:
However, our instances may not be correctly aligned. Fixing this is a matter of trial-and-error fiddling with the input axis of the Align Euler to Vector node; the right choice depends on how your UV map is oriented and how you want your instances to be oriented. One trick that helps is to temporarily bypass the Align Euler to Vector node, connect the Rotation output of the Distribute Points node directly to the Rotation input of the Instance on Points node, and rotate the instance object to point along the normal vector. (Don't forget to apply the rotation after fiddling!)
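Conceptually, Align Euler to Vector constructs a rotation whose chosen axis follows the input vector. A rough sketch of that construction (illustrative only; Blender outputs Euler angles and handles the degenerate case where the target is parallel to the up direction, which this sketch does not):

```python
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    n = sum(c * c for c in v) ** 0.5
    return tuple(c / n for c in v)

def align_x_to(target, up=(0.0, 0.0, 1.0)):
    """Build an orthonormal basis whose X axis points along target,
    similar in spirit to Align Euler to Vector with Axis = X."""
    x = normalize(target)
    y = normalize(cross(up, x))
    z = cross(x, y)
    return x, y, z
```

Picking a different input axis in the node amounts to deciding which of these basis vectors follows the baked tangent, which is why the wrong choice leaves instances rotated 90° or 180° from where you want them.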
A Geometry ⏵ Join Geometry node can be used if the input surface is to be retained. Some math and/or Random Value nodes, and possibly another Map Range node, can be used to adjust the instance scale and/or add some random variation. Hooking these up is left as an exercise for the reader.
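As a hint for that exercise, the scale-jitter math is just another range remap. A sketch in plain Python (hypothetical helper name; the 0.8-1.2 range is an arbitrary example):

```python
import random

def map_range(x, from_min, from_max, to_min, to_max):
    """Same math as the Map Range node: remap x linearly
    from [from_min, from_max] to [to_min, to_max]."""
    return to_min + (x - from_min) * (to_max - to_min) / (from_max - from_min)

random.seed(42)
# Jitter each instance's scale between 0.8x and 1.2x:
scales = [map_range(random.random(), 0.0, 1.0, 0.8, 1.2) for _ in range(5)]
```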
Here's my final result, using the test project:
Note that I wasn't trying to match the original scale or density, as those are easily tweaked. The objective was rather to match the orientation, which this seems to achieve satisfactorily. Note also that this render was done with a Join Geometry node that is absent in the downloadable .blend version.