You need to render them back to front relative to the camera for transparency to work correctly; see the painter's algorithm and depth buffering.
The rationale behind this is how depth tests work. In the usual case, OpenGL tests each fragment's depth; a fragment that passes the depth test is written to the framebuffer, overwriting whatever fragment was previously written at that position. In other words, when your scene contains only opaque objects, the depth buffer alone is enough to sort your rendered objects, because it keeps only the fragments nearest to the camera.
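To make that mechanism concrete, here is a minimal software sketch of a depth-tested write for a single framebuffer position (the `Pixel` struct and `depth_test_write` names are illustrative, not OpenGL API):

```cpp
#include <limits>

// Illustrative single-position "framebuffer" cell: a color and a depth value.
struct Pixel {
    int color = 0;                                        // stored color
    float depth = std::numeric_limits<float>::infinity(); // cleared depth
};

// A GL_LESS-style depth test: the incoming fragment is kept only if it is
// nearer than what is already stored; otherwise it is discarded. Whatever
// order fragments arrive in, the nearest one wins, which is all the sorting
// an opaque-only scene needs.
inline void depth_test_write(Pixel& px, int color, float depth) {
    if (depth < px.depth) {
        px.color = color;
        px.depth = depth;
    }
}
```

Submitting a far fragment then a near one, or the reverse, leaves the same (near) color in the cell.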
When transparency comes into play, that is no longer the case, and you need more information than just the nearest fragment. The depth test will still let near fragments overwrite far ones in the framebuffer, even though transparency requires contributions from the fragments behind them.
glBlendFunc()
only selects the factors of a fixed blending equation (and, unlike the rest of the fixed-function pipeline, blending remains fixed-function state even with shaders); it has nothing to do with the order fragments are written and gives no control over the depth buffer. The blending equation is applied to whatever survives the depth test, in whatever order the geometry was submitted. So the only practical option is to draw far objects first, then near ones.
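You can see why submission order matters by evaluating the common `glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA)` equation by hand; the helper below is a sketch of that fixed equation, not OpenGL API:

```cpp
struct RGBA { float r, g, b, a; };

// The classic alpha-blending equation configured by
// glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA):
//   out = src * src.a + dst * (1 - src.a)
// It is evaluated once per incoming fragment, with no regard for depth order.
RGBA blend_over(const RGBA& src, const RGBA& dst) {
    float ia = 1.0f - src.a;
    return { src.r * src.a + dst.r * ia,
             src.g * src.a + dst.g * ia,
             src.b * src.a + dst.b * ia,
             src.a * src.a + dst.a * ia };
}
```

Blending a half-transparent red fragment and then a half-transparent green one over a blue background produces a different color than blending them in the opposite order, which is exactly why the far-to-near draw order matters.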
Order-independent transparency techniques do exist, but they are particularly hard to implement.
As a side note, opaque objects (unlike transparent ones) should be drawn front to back to avoid overdraw.
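Both orderings can be implemented with a single distance sort whose comparator flips between the transparent and opaque passes. A minimal sketch, assuming you sort objects by the distance from their centers to the camera (names here are illustrative):

```cpp
#include <algorithm>
#include <vector>

struct Vec3 { float x, y, z; };

// Squared distance is enough for ordering and avoids a sqrt per object.
inline float dist2(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return dx * dx + dy * dy + dz * dz;
}

// Transparent pass: farthest first (painter's algorithm).
void sort_back_to_front(std::vector<Vec3>& centers, const Vec3& cam) {
    std::sort(centers.begin(), centers.end(),
              [&](const Vec3& a, const Vec3& b) {
                  return dist2(a, cam) > dist2(b, cam);
              });
}

// Opaque pass: nearest first, so farther fragments fail the depth test
// early and their shading work is skipped.
void sort_front_to_back(std::vector<Vec3>& centers, const Vec3& cam) {
    std::sort(centers.begin(), centers.end(),
              [&](const Vec3& a, const Vec3& b) {
                  return dist2(a, cam) < dist2(b, cam);
              });
}
```

Note that sorting by object center is an approximation: intersecting or mutually overlapping transparent geometry can still blend in the wrong order, which is the case the order-independent transparency techniques mentioned above are designed to handle.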