Robot navigation is an old problem that periodically receives renewed attention as new sensors are developed. One of the most important capabilities in robot navigation is determining the state of the robot itself: its position, velocity, acceleration, attitude angles, and so on. The combination of GPS (Global Positioning System) and IMU (Inertial Measurement Unit) has successfully demonstrated that all of these can be estimated. However, such a combined GPS/IMU system has an inherent drawback: if GPS fails, the whole system stops working because the IMU estimates drift over time. GPS can be jammed in wartime, and its signal may be unavailable in valleys or in indoor environments. New sensors are therefore needed to overcome this drawback. Vision, as a relatively new sensor, has been used to provide other navigation capabilities, such as path planning, object tracking, and obstacle avoidance. How to use vision to measure the state of the robot remains a challenging topic. Rather than the traditional approach of measuring the robot's state from optical flow, we propose a new method that uses a calibrated camera in a structured environment to estimate vanishing points from the parallel lines in the environment, and then estimates the state of the robot using properties of projective geometry. The method can be characterized as a camera calibration problem in which the intrinsic parameters are constant while the extrinsic parameters change and must be calibrated. Using the proposed method, we successfully extract the state of the robot from a video sequence simulating an aerial robot navigating an urban environment. The simulation results verify the validity of the proposed method.
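The core geometric step mentioned above, estimating a vanishing point as the common intersection of the image projections of parallel 3D lines, can be sketched in homogeneous coordinates, where both the line through two points and the intersection of two lines are cross products. This is a minimal illustration, not the paper's implementation; the image points below are synthetic values chosen only so the two lines meet at a known point:

```python
def cross(a, b):
    """Cross product of two 3-vectors (homogeneous coordinates)."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def line_through(p, q):
    """Homogeneous image line through two pixel points (x, y)."""
    return cross((p[0], p[1], 1.0), (q[0], q[1], 1.0))

def vanishing_point(l1, l2):
    """Intersection of two image lines, normalized back to pixel coordinates."""
    v = cross(l1, l2)
    return (v[0] / v[2], v[1] / v[2])

# Synthetic example: two image lines that are projections of parallel
# 3D lines; both pass through (600, -100), which is their vanishing point.
l1 = line_through((100, 400), (350, 150))
l2 = line_through((200, 700), (400, 300))

vp = vanishing_point(l1, l2)
print(vp)  # → (600.0, -100.0)
```

With a calibrated camera, the vanishing point of a world direction also constrains the camera attitude: the direction in camera coordinates is, up to scale, the intrinsic matrix inverse applied to the vanishing point, which is how the extrinsic parameters can be recovered while the intrinsics stay fixed.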