As this question and its answers point out, projecting a sphere onto a flat picture plane with perspective projection may result in an ellipse, not a circle. My understanding is that the only way a sphere can project as a circle is when its center aligns perfectly with the center of vision; in other words, when the viewer's eye or the camera is oriented so that the sphere sits exactly at the center of the rendered image. But in virtually every 3D program out there, the image of a sphere is always a circle, no matter where the sphere appears in the image: left, right, top, or bottom. The only time a sphere looks stretched is when it extends beyond the bounds of the image.
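To make the geometry concrete, here is a small Python sketch I put together to convince myself of the claim (the function names are my own, and the setup assumes a pinhole camera at the origin projecting onto the plane z = 1). It samples the sphere's silhouette circle, perspective-projects it, and measures the outline's half-extents along the radial image direction (away from the image center) and the tangential direction. When the two extents are equal the outline is a circle; off axis, the radial extent comes out larger, i.e. an ellipse:

```python
import math

def _cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def _normalize(a):
    n = math.sqrt(sum(x*x for x in a))
    return tuple(x/n for x in a)

def silhouette_extents(center, radius, n=3600):
    """Perspective-project the silhouette of a sphere (pinhole camera at
    the origin, image plane z = 1) and return the half-extents of the
    projected outline along the radial and tangential image directions."""
    d = math.sqrt(sum(c*c for c in center))
    # The silhouette is a circle of radius rs, centered ds along the
    # eye-to-sphere direction (standard tangent-cone geometry).
    ds = (d*d - radius*radius) / d
    rs = radius * math.sqrt(d*d - radius*radius) / d
    w = _normalize(center)
    helper = (0.0, 0.0, 1.0) if abs(w[2]) < 0.9 else (1.0, 0.0, 0.0)
    u = _normalize(_cross(w, helper))
    v = _cross(w, u)
    # Image-space radial/tangential unit directions at the projected center.
    px, py = center[0]/center[2], center[1]/center[2]
    er = (1.0, 0.0) if px == 0 and py == 0 else _normalize((px, py))
    et = (-er[1], er[0])
    rad, tang = [], []
    for i in range(n):
        t = 2*math.pi*i/n
        p = tuple(ds*w[k] + rs*(math.cos(t)*u[k] + math.sin(t)*v[k])
                  for k in range(3))
        q = (p[0]/p[2], p[1]/p[2])      # perspective divide onto z = 1
        rad.append(q[0]*er[0] + q[1]*er[1])
        tang.append(q[0]*et[0] + q[1]*et[1])
    return ((max(rad) - min(rad))/2, (max(tang) - min(tang))/2)
```

Running this with a sphere dead ahead, e.g. `silhouette_extents((0, 0, 5), 1)`, gives two equal extents (a circle), while an off-axis sphere such as `silhouette_extents((3, 0, 4), 1)` gives a radial extent noticeably larger than the tangential one, confirming the elliptical stretching described above.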
So my question is: how does 3D graphics software adjust for this distortion (such as a sphere being projected as an ellipse)? What are the possible solutions for minimizing distortion while you're working on a scene (for example, in a viewport while placing objects, modeling, simulating, etc.) as opposed to when you render (in which case factors such as the camera's sensor size, focal length, etc. are taken into consideration)? Is it possible that 3D software doesn't actually project the image onto a picture plane, but onto some sort of picture sphere instead, which I figure would cause no distortion? And does the way 3D graphics software adjusts for distortion relate to how real-life cameras adjust?