In my current project, I use a Render Layers node and its Z output to obtain a depth image of the scene. My question is about what distance this output actually gives me: is it the distance from the nearest object point to the camera plane? This is clear for an orthographic projection, where all rays run in parallel, but what does this Z value mean with a perspective projection, as in real cameras? Does Blender already compute the perpendicular distance to the camera plane, as shown in this image?
Or is it the distance along the ray?
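To make the distinction concrete (my own reasoning, assuming a simple pinhole camera, not anything taken from the Blender documentation): if z is the perpendicular distance to the camera plane and θ is the angle between a pixel's viewing ray and the optical axis, the distance along the ray would be d = z / cos(θ), so the two values agree only at the image centre and differ more and more towards the image edges.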
Another question on this topic: does anybody know an alternative way to obtain depth data in Blender, other than using the render layer? (Unfortunately, this method does not work in background mode.)
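For context, this is roughly what my current setup looks like as a script. It is only a minimal sketch assuming the depth is read back through a Viewer node (API names as in Blender 2.8+; older versions call the output socket "Z" instead of "Depth"); the last step, reading the Viewer Node pixels, is the part that fails when rendering with --background:

    import bpy

    scene = bpy.context.scene
    scene.use_nodes = True
    tree = scene.node_tree
    tree.nodes.clear()

    # Wire the Render Layers depth output into a Viewer node
    rl = tree.nodes.new(type="CompositorNodeRLayers")
    viewer = tree.nodes.new(type="CompositorNodeViewer")
    viewer.use_alpha = False

    # The socket is named "Depth" in recent versions, "Z" in older ones
    depth_socket = rl.outputs.get("Depth") or rl.outputs.get("Z")
    tree.links.new(depth_socket, viewer.inputs["Image"])

    bpy.ops.render.render()

    # The Viewer node result is exposed as the "Viewer Node" image;
    # pixels is a flat RGBA float list with the depth replicated in R, G and B.
    # This read-back is the step that does not work in background mode.
    depth = bpy.data.images["Viewer Node"].pixels[:]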



