To date, most autonomous micro air vehicles (MAVs) operate in a controlled environment, where the location and attitude of the aircraft are measured with an infrared (IR) tracking system. If MAVs are ever to leave the laboratory, their flight control needs to become autonomous and based on on-board image and attitude sensors. To address this need, several groups are developing monocular and binocular image-based navigation systems. One of the challenges of these systems is the need for precise calibration in order to determine the vehicle's position and attitude through the solution of an inverse problem. Body schemas are a biologically inspired approach that emulates the plasticity of the animal brain, which learns nonlinear mappings between body configurations, i.e., generalized coordinates, and the resulting sensory outputs. The advantages of body schemas have long been recognized in the cognitive robotics literature, where they have motivated studies of human-robot interaction based on artificial neural networks; however, little effort has been made so far to develop avian-inspired flight control strategies utilizing body and image schemas. This paper presents a numerical experiment in controlling the trajectory of a miniature rotorcraft during landing maneuvers using the notion of body and image schemas. More specifically, we demonstrate how trajectory planning can be executed in the image space using a gradient-based maximum-seeking algorithm applied to a pseudo-potential. It is demonstrated that a neural-gas type artificial neural network (ANN), trained with a Hebbian-type learning algorithm, can be effective in learning a mapping between the rotorcraft's position/attitude and the output of its vision sensors. Numerical simulations of the landing performance, including the resulting landing errors, are presented using an experimentally validated rotorcraft model.
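
The image-space planning idea above can be sketched in a few lines: gradient ascent on a pseudo-potential that peaks at the landing target's image location generates a sequence of image-space waypoints. The quadratic potential, step size, and target coordinates below are illustrative assumptions, not the potential used in the paper.

```python
import numpy as np

def phi(p, p_goal):
    """Pseudo-potential: maximal at the landing target's image point.
    (Quadratic form assumed here purely for illustration.)"""
    return -np.sum((p - p_goal) ** 2)

def grad_phi(p, p_goal):
    """Analytic gradient of the assumed quadratic pseudo-potential."""
    return -2.0 * (p - p_goal)

def plan_image_trajectory(p0, p_goal, step=0.1, tol=1e-3, max_iter=500):
    """Gradient-based maximum seeking: ascend phi until the gradient
    is small, recording the image-space waypoints along the way."""
    p = np.asarray(p0, dtype=float)
    traj = [p.copy()]
    for _ in range(max_iter):
        g = grad_phi(p, p_goal)
        if np.linalg.norm(g) < tol:
            break
        p = p + step * g
        traj.append(p.copy())
    return np.array(traj)

# Hypothetical example: steer from image point (120, 40) to a pad at (64, 64).
traj = plan_image_trajectory([120.0, 40.0], np.array([64.0, 64.0]))
```

In a closed-loop setting, each waypoint would be handed to the attitude/position controller; here the sketch only shows the planning step.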
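
The body-schema learning step can likewise be sketched: a neural-gas network with rank-based neighborhood cooperation and a Hebbian-type associative update learns a map from pose (generalized coordinates) to vision-sensor output. The stand-in sensor model `f`, the network size, and the annealing schedules below are assumptions for illustration, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(pose):
    """Stand-in 'vision sensor': a smooth nonlinear pose-to-image map
    (hypothetical; replaces the rotorcraft's actual camera model)."""
    return np.array([np.sin(pose[0]) + pose[1], np.cos(pose[1]) * pose[0]])

n_units = 40
w_in = rng.uniform(-1, 1, (n_units, 2))   # unit prototypes in pose space
w_out = np.zeros((n_units, 2))            # associated sensor outputs

n_steps = 5000
for t in range(n_steps):
    eps = 0.5 * (0.01 / 0.5) ** (t / n_steps)   # annealed learning rate
    lam = 10.0 * (0.5 / 10.0) ** (t / n_steps)  # annealed neighborhood range
    x = rng.uniform(-1, 1, 2)                   # sampled pose
    y = f(x)                                    # observed sensor output
    # Neural-gas rank of each unit by distance to the sampled pose.
    ranks = np.argsort(np.argsort(np.linalg.norm(w_in - x, axis=1)))
    h = np.exp(-ranks / lam)[:, None]           # rank-based neighborhood
    w_in += eps * h * (x - w_in)                # move prototypes toward pose
    w_out += eps * h * (y - w_out)              # Hebbian-type associative update

def predict(pose):
    """Recall: the output stored at the best-matching unit."""
    return w_out[np.argmin(np.linalg.norm(w_in - np.asarray(pose), axis=1))]
```

After training, `predict` approximates the pose-to-sensor map without any explicit camera calibration, which is the point of the body-schema approach.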