I have added a brief addendum to the end.
I have been getting into procedural vector displacement with shader nodes in Cycles. The other day I created a model and tried to animate it rotating through space (without using any vector transformations in the shader itself), only to find that the rotation messed with the texture coordinates I was using. This is odd to me, since I was using Generated coordinates transformed to match Object coordinates. I figured the Generated coordinates were in local space and should let me rotate the source mesh however I want, with the displacement following that transformation. Yet for some reason the global transformation changed the shape of the local displacement and distorted the object.
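To be concrete about what I mean by "Generated coordinates transformed to match Object coordinates": here's a rough Python sketch of the mapping I'm doing with nodes (not the actual shader; the bounding-box numbers assume a default 2×2×2 cube centered at the origin):

```python
def generated_to_object(gen, bbox_min=(-1.0, -1.0, -1.0), size=(2.0, 2.0, 2.0)):
    """Map a Generated coordinate (0..1 per axis, spanning the object's
    bounding box) into Object (local) space: scale by the bounding-box
    size and offset by its minimum corner."""
    return tuple(g * s + m for g, s, m in zip(gen, size, bbox_min))

# Top-center of the bounding box lands at local (0, 0, 1):
print(generated_to_object((0.5, 0.5, 1.0)))  # -> (0.0, 0.0, 1.0)
```

In the node tree this is just a Vector Math multiply and add after the Texture Coordinate node's Generated output.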
After trying a few things like using Object coordinates and finding no change in the results whatsoever, I did some research and found absolutely nothing about this issue. I thought maybe I had just messed up my math and was referencing global coordinates somewhere, and happened to make the same mistake in all three of my test projects. To be sure, I copied CGMatter's asteroid tutorial exactly as he had done it, in as close to the same build of Blender as I could get. Same problem. The result looked good, but simply rotating the model upside down turned it back into the shape of the source mesh (the project starts with a subdivided cube and ends with a very different-looking asteroid).
Next, I tried completely reinstalling Blender. This was after updating a few days before, so I wanted to make really sure nothing was broken on my end. That didn't work either. At this point, either it's a bug or I'm dumb. I don't have the confidence to say it's a bug, so I would appreciate it if someone could look over a quick project I created that shows the same problem and educate me in the ways of the Texture Coordinate node. If I did interpret its function correctly, and this is a bug, I will submit a report.
In this project, I'm doing the old cube-squeezed-into-a-pyramid displacement. If the source mesh is rotated about X or Y, it seems to invert some or all of the local coordinates (yes, if the object is upside down the world-space result should be flipped, but the local coordinates themselves should not change!). If the source is rotated about Z, the base rotates in the opposite direction from the top. The strangest part to me is that if I visualize the coordinates by plugging them into the shader's color output, they are mapped correctly, but the displacement math doesn't appear to be using the same transformation.
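For reference, the cube-to-pyramid displacement amounts to something like this (a plain-Python sketch of the node math, not the actual shader; I'm assuming local coordinates in [-1, 1] per axis):

```python
def pyramid_displace(p):
    """Squeeze a [-1,1]^3 cube into a pyramid: shrink X and Y toward
    the axis as local Z goes from -1 (base) to +1 (apex).
    This reads only local coordinates, so it should be completely
    independent of the object's orientation in the scene."""
    x, y, z = p
    t = (z + 1.0) / 2.0   # 0 at the base, 1 at the top, in LOCAL space
    scale = 1.0 - t       # full width at the base, zero at the apex
    return (x * scale, y * scale, z)

print(pyramid_displace((1.0, 1.0, 1.0)))   # apex corner collapses -> (0.0, 0.0, 1.0)
print(pyramid_displace((1.0, 1.0, -1.0)))  # base corner unchanged -> (1.0, 1.0, -1.0)
```

Nothing in that function knows about world space, which is why the orientation dependence surprises me.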
Here is an image of the node setup:
Again, I could be missing something obvious, but I feel like this should work in local object space no matter the orientation of the source mesh since the transformation to local space will always give back the same position vectors. The top of the cube should always be squished even if it's facing downwards because it's always locally the top.
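In other words, my mental model is this (a rough Python sketch; the rotation happens in world space, and the stored local vertex positions never change):

```python
import math

def rotate_z(p, angle):
    """World-space rotation about Z, i.e. the object-level transform.
    This should not affect what the shader reads as Object coordinates."""
    x, y, z = p
    c, s = math.cos(angle), math.sin(angle)
    return (c * x - s * y, s * x + c * y, z)

local = (1.0, 0.0, 1.0)           # stored vertex position (Object space)
world = rotate_z(local, math.pi)  # object rotated 180 degrees in the scene

# The shader's Object/Generated coordinate should still evaluate to
# `local` after the rotation, so displacement computed from it should
# give the exact same shape, just carried along by the rotation.
```

If that model is right, the displaced shape should be rigid under any object rotation, which is not what I'm seeing.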
Anyways, sorry for the VERY long post, I just have no clue what is going on and wanted to add as much context as possible. Here is a brief video of me messing with the object and the exact project file from the video:
One last time, thank you to anyone who made it through this!
QUICK ADDENDUM: I wasn't very clear about which part of this is the problem. It's not that I have to cycle through render modes in the viewport to get things to update after a global transform; if that were all, I wouldn't have any problem. But when I do the final render, the coordinates update every frame and turn the local displacement into that weird half-local, half-global displacement. My first inclination is that there is a bug in the way Blender handles the vector displacement.