
I am attempting to simulate structured light 3D scanning within Blender (v2.78c), but have encountered an unexpected alignment issue that I have not been able to resolve thus far.

I have isolated the issue in a simple blend file which has:

  • One simple plane surface to scan
  • One camera
    • 2001x1080 resolution (also retested with 2048x1080)
    • 49.134° FOV
  • One spotlight
    • 49.134° FOV
    • same position and orientation as the camera
    • distance of 10.44, which lies on the plane
    • constant falloff
    • blend of 0

The assumption was that the rendered outputs from the camera would capture single lines corresponding exactly to the projected images; however, the center of the captured lines appears to be consistently offset.

The simple blend zip file also contains three images to be projected, and their rendered captures; each image is black with a single-pixel vertical line at x=500, x=2000 and x=1500. Here are the captured lines, zoomed in:

[Image: zoomed-in view of the captured lines]

The gray column to the left of the center of the captured line is consistently darker than the gray column to its right, meaning that once sub-pixel accuracy is calculated, the column is always computed as lying slightly to the right of the actual column. The expected output is gray columns of equal intensity on either side of the captured column, given that the camera and spotlight share the same location, orientation and FOV.
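For reference, here is a minimal sketch of the kind of sub-pixel estimate described above (the question does not include the actual code, so the function name and the centroid method are assumptions): the line center is taken as the intensity-weighted centroid of the captured column and its two gray neighbours.

```python
def subpixel_center(left, center, right, x):
    """Estimate the sub-pixel x position of a 1-pixel-wide line whose
    captured intensities are (left, center, right) around column x."""
    total = left + center + right
    # Intensity-weighted centroid over the three columns x-1, x, x+1.
    return ((x - 1) * left + x * center + (x + 1) * right) / total

# Equal gray neighbours give exactly x ...
print(subpixel_center(40, 255, 40, 1500))   # -> 1500.0
# ... but a brighter right neighbour pulls the estimate rightward,
# which is the kind of offset observed in the renders.
print(subpixel_center(30, 255, 50, 1500))   # slightly greater than 1500
```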

If anyone could shed some light (pun intended) on the possible cause and/or resolution of this issue, it would be most appreciated.

EDIT: Here are the captures using a camera resolution of 2048x1080; unfortunately the issue persists in an updated blend file.

[Image: captured lines at 2048x1080 resolution]


1 Answer


I suspect that this is due to rounding. The image coordinate naturally varies from 0.0 to 1.0, and your image width (2001) is a tricky number to represent in binary (all values are held in binary in a computer), so there will inevitably be a rounding error, however small. Ideally you would expect anything below 0.5 to round down and anything at or above 0.5 to round up but, due to the limited precision, values are not held exactly - so a value very close to, but below, 0.5 might actually be stored as slightly higher and round up when it should round down. This would normally be balanced by opposite rounding at the other end of the range (i.e. values slightly less than 0.0 being rounded up), but since image coordinates are limited to the range 0.0 - 1.0 this never occurs (they are never less than 0), so the rounding is slightly biased to one side - i.e. 0.0 to, say, 0.49999 rounds down while 0.49999 to 1.0 rounds up. That extra 0.00001 causes the discrepancy.
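The representability part of this argument can be checked directly in Python (a sketch of the binary-representation point only, not of Blender's internals): `fractions.Fraction` applied to a float recovers the exact binary value actually stored, so comparing it to the true rational shows whether a coordinate like x/2001 is held exactly.

```python
from fractions import Fraction

# Fraction(float) recovers the exact binary value the machine stores,
# so we can test whether a normalised image coordinate is held exactly.

# x = 1500 at width 2001: 1500/2001 reduces to 500/667, whose
# denominator has an odd factor, so it cannot be stored exactly.
u = 1500 / 2001
print(Fraction(u) == Fraction(1500, 2001))   # False: a tiny error is stored

# x = 1536 at width 2048: 1536/2048 reduces to 3/4, which is exact
# in binary, so no error is introduced.
v = 1536 / 2048
print(Fraction(v) == Fraction(1536, 2048))   # True: stored exactly
```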

Note that I don't have any evidence for this - it is just my thinking on how values are held internally and how rounding occurs. Written as an answer rather than a comment because it is too long for a comment.

A good test would be to use an image width of 2048 (a power of 2) and check whether each side is shaded the same (i.e. pixel 1024 should match pixel 1025 precisely).
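The expected symmetry at a power-of-two width can be sketched numerically (using zero-based columns, so the answer's one-based pixels 1024/1025 become 1023/1024): the centers of the two middle columns map to normalised coordinates that are exactly representable in binary and exactly symmetric about 0.5.

```python
# At width 2048, the two middle columns (zero-based 1023 and 1024)
# have centers at (x + 0.5) / width. Both values have a power-of-two
# denominator, so they are stored exactly and sit symmetrically
# about u = 0.5 - no rounding bias can creep in.
width = 2048
left  = (1023 + 0.5) / width   # 0.499755859375, exact
right = (1024 + 0.5) / width   # 0.500244140625, exact
print(0.5 - left == right - 0.5)   # -> True
```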

  • Thank you for the feedback; rounding had occurred to us as a possibility, but we found that the same issue happens at other resolutions as well. I have updated the original post with a retest at resolution 2048x1080, where unfortunately the issue persists. Commented Sep 4, 2017 at 1:34
  • @EvanDekker What happens if you rotate the camera and/or the plane? Does that give any clues as to what's going on? E.g. does rotating the plane 180 degrees reverse the pattern? Does rotating the camera 180 degrees reverse it? Does rotating both? This could identify whether the issue relates to camera space, object space or world space. Commented Sep 4, 2017 at 14:11
