Robots are becoming smarter every day with the integration of computer vision systems. It is now highly desirable for a robot to have a natural, human-like vision system in order to interact with real-world events. From this imaging and vision perspective, we propose an efficient method for determining the absolute viewpoint of any desired image location. We use a self-calibration scheme and a humanoid vision mechanism based on stereo cameras to find the region of convergence of an object, from which a mathematical model can measure the object's distance. By comparing the positions of different objects, the relative distances between objects can also be determined. Our results show that the human-like eye-tracking mechanism employed here makes it possible to obtain a realistic view of the image for 3D point positioning.
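The abstract does not state the mathematical model explicitly; a minimal sketch of the standard stereo-triangulation relation such a system typically relies on is given below. The focal length and baseline values, and the function names, are illustrative assumptions, not the authors' actual model.

```python
# Standard stereo triangulation: depth Z = f * B / d, where f is the
# focal length (in pixels), B is the baseline between the two cameras,
# and d is the horizontal disparity of the object between the two images.

def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Distance (metres) of a point, from its stereo disparity (pixels)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

def relative_depth(focal_px: float, baseline_m: float,
                   disp_a: float, disp_b: float) -> float:
    """Depth difference between two objects seen by the same stereo pair."""
    return (depth_from_disparity(focal_px, baseline_m, disp_a)
            - depth_from_disparity(focal_px, baseline_m, disp_b))

if __name__ == "__main__":
    f, B = 700.0, 0.06  # assumed values: 700 px focal length, 6 cm baseline
    print(depth_from_disparity(f, B, 21.0))  # prints 2.0 (metres)
```

Comparing the disparities of two objects, as in `relative_depth`, corresponds to the abstract's comparison of object positions to obtain relative distance.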