Sunday, November 7, 2010

Spy Bird UAV

Remotely Operated Quad rotor Aerial Vehicle and Camera Unit Using a Fly-The-Camera Perspective
Sonu Shekhar


ABSTRACT
This paper presents a mission-centric approach to controlling the optical axis of a video camera mounted on a camera manipulator and fixed to a quad-rotor remotely operated vehicle. A four-DOF quad-rotor UAV model is combined with a two-DOF camera kinematic model to create a single system that provides full six-DOF actuation of the camera view. The proposed work exploits the fact that all signals are described in the camera frame. The closed-loop controller is designed based on a Lyapunov-type analysis, and the tracking result is shown to be Globally Uniformly Ultimately Bounded (GUUB). Computer simulation results are provided to demonstrate the suggested controller. [1]
Keywords
DC brushless motor, remote control, manual control, visual camera, quad rotor, PID controller, VTOL, UAV, MATLAB Simulink.
1. INTRODUCTION
The typical scenario for using a quad-rotor helicopter (or any aerial vehicle) as a video camera platform is to mount the camera on a positioner that is controlled independently of the vehicle. When the navigation or surveillance tasks become complicated, two people may be required to achieve the camera-targeting objective: a pilot to navigate the UAV and a camera operator. An important underlying action that makes this scenario feasible is that the camera operator must compensate for the motions of the UAV that disturb the camera targeting; uncompensated platform motion on the camera axis can cause loss of targeting, whereas a scenario where the camera positioner is used to compensate for the platform motion can maintain the camera view. The potential shortcomings of this typical operational scenario can be summarized as: i) multiple skilled technicians are typically required, ii) the camera operator must compensate for the actions of the pilot, and iii) it is not intuitive for a camera operator to split the camera-targeting task between actions of the camera positioner controlled by the operator and commands to the pilot. The problem of providing an intuitive interface with which an operator can move a camera positioner to make a video camera follow a target image appears in many settings, and the difficulty of moving a system that follows a subject with a video camera has been addressed in recent work. A different perspective on this same basic camera-targeting problem has been presented in which the camera platform, a quad-rotor UAV, and the camera positioning unit are considered to be a single robotic unit. That work is built upon here to show the design of a velocity controller for the combined quad rotor-camera system that works from operator commands generated in the camera field of view to move both elements. The paper is organized as follows. In Section 3, a kinematic and dynamic model of the quad rotor is presented.
The kinematics for a three-link camera positioner are developed; however, only two links are used in any one positioning scenario. The case of this positioner used in a two-link tilt-roll configuration to look forward is carried through the control design and simulation. [2]

2. LITERATURE REVIEW
2.1. QUADROTOR MODEL
A. Underactuated Quad-Rotor Aerial Vehicle Model. The elements of the quad-rotor unmanned aerial vehicle model are shown in Figure 2. The quad-rotor body-fixed frame, F, is chosen to coincide with the center of gravity, which implies that the vehicle has a diagonal inertia matrix. The kinematic and dynamic model of a quad rotor is expressed as follows [1], [2]:
ṗ_if^i = R_f^i(Θ) v_if^f                                          (1)
Θ̇_if^i = T_f^i(Θ) ω_if^f                                          (2)
Ṙ_f^i = R_f^i S(ω_if^f)                                           (3)
m v̇_if^f = F_f^f − m S(ω_if^f) v_if^f + N_1(v_if^f) + G(R_f^i)    (4)
In this model v_if^f(t) = [vx, vy, vz]^T ∈ R^3 denotes the linear velocity of the quad-rotor body-fixed frame, F, with respect to the earth-fixed inertial frame, I, expressed in the body-fixed frame, F, and ω_if^f(t) = [ωx, ωy, ωz]^T ∈ R^3 denotes the angular velocity of the body-fixed frame, F, with respect to the inertial frame, I, expressed in the body-fixed frame, F. Equations (1)-(3) represent the kinematics of the quad rotor: ṗ_if^i(t) in (1) is the velocity of the quad rotor in the inertial frame, and Θ̇_if^i(t) in (2) is the angular velocity ω_if^f(t) transformed by the matrix T_f^i(Θ). The position and angle signals p_if^i(t), Θ_if^i(t), ω_if^f(t), and v_if^f(t) are assumed to be measurable. Equation (2) reflects the modeling assumption that the angular velocity of the quad rotor is commanded directly in lieu of modeling the angular dynamics; that is, ω_if^f(t) is considered a system input. The dynamics of the translational velocity are given in (4), which contains the gravitational term G(R_f^i) = mg(R_f^i)^T E3 ∈ R^3, where g ∈ R denotes gravitational acceleration, E3 = [0, 0, 1]^T denotes the unit vector in the coordinates of the inertial frame, m ∈ R is the known mass of the quad rotor, N_1(v_if^f) ∈ R^3 represents a bounded function, e.g., aerodynamic damping force, and S(·) ∈ R^{3×3} is the general form of the skew-symmetric matrix [6]. The quad rotor inherently has six degrees of freedom; however, it has only four control inputs: one translational




Figure 1: Quad rotor with a Pan-Tilt-Roll Camera Positioner

force along the z-axis and three angular velocities. The vector F_f^f(t) ∈ R^3 refers to the quad-rotor translational forces but in reality represents the single translational force, created by summing the forces generated by the four rotors, expressed as

F_f^f = B1 [0, 0, u1]^T

where B1 = I3 is a configuration matrix (actuator dynamics are beyond the scope of this design) and u1(t) ∈ R is the collective thrust input.

Figure 2. Movement of the quad rotor

A quad rotor has four motors located at the front, rear, left, and right ends of a cross frame. The quad rotor is controlled by changing the speed of rotation of each motor. The front and rear rotors rotate in a counter-clockwise direction while the left and right rotors rotate in a clockwise direction to balance the torque created by the spinning rotors. The relative speed of the left and right rotors is varied to control the roll rate of the UAV. Increasing the speed of the left motor by the same amount that the speed of the right motor is decreased keeps the total thrust provided by the four rotors approximately the same; in addition, the total torque created by these two rotors remains constant. Similarly, the pitch rate is controlled by varying the relative speed of the front and rear rotors. The yaw rate is controlled by varying the relative speed of the clockwise (left and right) and counter-clockwise (front and rear) rotors. The collective thrust is controlled by varying the speed of all the rotors simultaneously. [1]-[4]
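The rotor-speed mixing described above can be sketched in code. This is an illustrative sketch, not the paper's implementation; the function name, the normalized command inputs, and the unit mixing gains are assumptions.

```python
import numpy as np

def mix_rotor_speeds(thrust, roll, pitch, yaw, base=0.0):
    """Map collective thrust and roll/pitch/yaw commands to the four
    rotor speeds [front, rear, left, right]. Front/rear spin CCW,
    left/right spin CW, so yaw is obtained by speeding up one pair
    while slowing the other; roll and pitch come from differential
    speeds within a pair, leaving the total thrust unchanged."""
    front = base + thrust - pitch + yaw
    rear  = base + thrust + pitch + yaw
    left  = base + thrust + roll  - yaw
    right = base + thrust - roll  - yaw
    return np.array([front, rear, left, right])
```

Note that for a pure roll command the left/right speeds change by equal and opposite amounts, so the sum of the four speeds (and hence the approximate total thrust) is preserved, exactly as the text describes.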
2.2. CAMERA POSITIONER KINEMATICS

As stated, the quad rotor can thrust in the z-direction, but it cannot thrust in the x- or y-directions. Since the quad-rotor helicopter is underactuated in two of its translational velocities, a two-actuator camera is added to achieve six-degree-of-freedom (DOF) control in the camera frame. A tilt-roll camera is added to the front of the helicopter, as seen in Figure 1. With the new camera frame, there are now three rotations and three translations, a total of six DOF, to actuate. To control any of the DOF, either the camera must move, the UAV must move, or both.


2.2.1. TILT-ROLL CAMERA ON FRONT OF UAV
The rotation matrix between the UAV frame and the camera frame shown in Figure 1 is:


        ⎡  sinθt cosθr    sinθt sinθr     cosθt ⎤
R_c^f = ⎢  sinθr          cosθr           0     ⎥
        ⎣ −cosθt cosθr    cosθt sinθr    −sinθt ⎦



Since only two of the angles vary, the Jacobian can be redefined as

             ⎡ 0   cosθt ⎤
J_c(front) = ⎢ 1   0     ⎥
             ⎣ 0   sinθt ⎦
and finally
ω_fc^f = J_c(front) Θ̇_c,   where Θ_c = [θt, θr]^T ∈ R^2,

which facilitates the calculation of the angles of the camera. [2]
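The reduced Jacobian and the camera angular-velocity relation above can be written out directly. This is a small sketch with illustrative function names; the angle appearing in the Jacobian is taken to be the tilt angle θt, which is an assumption about the source's unsubscripted θ.

```python
import numpy as np

def camera_jacobian_front(theta_t):
    """Reduced Jacobian J_c(front) for the 2-link tilt-roll camera,
    mapping [tilt_rate, roll_rate] to a 3-vector angular velocity."""
    return np.array([[0.0, np.cos(theta_t)],
                     [1.0, 0.0],
                     [0.0, np.sin(theta_t)]])

def camera_angular_velocity(theta_t, tilt_rate, roll_rate):
    """omega_fc^f = J_c(front) @ [theta_t_dot, theta_r_dot]."""
    return camera_jacobian_front(theta_t) @ np.array([tilt_rate, roll_rate])
```

At θt = 0, a pure tilt rate maps to rotation about the second body axis and a pure roll rate to rotation about the first, matching the columns of the Jacobian.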

2.2.2. VISUAL SENSOR
Typically, the visual sensor consists of a camera and an image processing block. In the simulation, the object was defined as 3×1 vectors of earth-referenced coordinates for each point using the 'polyhedral' command in MATLAB. To characterize the object, four feature points were selected, and the camera was modelled using the positions and orientations of the camera and the object (xc, xo). The image processing block models the imaging geometry; the details of perspective projection can be found in many computer vision texts. To develop the visual sensor model, the frames are first defined: the helicopter frame is Rh, the camera frame is Rc, and the object frame is Ro, as shown in Figure 3. [6]
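The perspective-projection step performed by the image processing block can be illustrated with a minimal pinhole-camera sketch. The function name, the unit focal length, and the pose convention (R, t mapping world coordinates into the camera frame) are assumptions for illustration, not the paper's exact model.

```python
import numpy as np

def project_points(points_w, R_cw, t_cw, f=1.0):
    """Project (N,3) world-frame feature points onto the image plane.
    R_cw, t_cw transform world coordinates into the camera frame;
    f is an assumed focal length (unit by default)."""
    pts_c = (R_cw @ points_w.T).T + t_cw   # world -> camera frame
    # perspective division: image coordinates on the plane z = f
    return f * pts_c[:, :2] / pts_c[:, 2:3]
```

For four feature points, stacking the four projected pairs gives the 8-dimensional image measurement typically used in image-based visual servoing.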




Figure 3. The axes of the camera, object and helicopter

3. CONTROL OF HELICOPTER

Figure 4. The control system


The helicopter controller has four input commands, U1, U2, U3, and U4. U1 represents the translation along the z axis. U2 represents the rotation around the y axis (roll angle). U3 represents the rotation around the x axis (pitch angle). Finally, U4 represents the rotation around the z axis (yaw angle). In this study, Proportional-Derivative (PD) controllers are designed to control the helicopter, because the control algorithm can be derived from the helicopter model and renders the system exponentially stable [6].

3.1. ALTITUDE CONTROL
For the altitude control of the helicopter, Eq. (1) is used:

U1 = mg/(cosθ cosφ) + m kd(ż* − ż)/(cosθ cosφ)    (1)

where ż* is the reference linear velocity along the z axis, which is the third component of the helicopter reference velocity vector vh*.
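Equation (1) can be sketched as a small function. The function name and the default gain are illustrative placeholders, not values from the study.

```python
import numpy as np

def altitude_thrust(m, g, theta, phi, vz_ref, vz, kd=1.0):
    """Altitude input U1: gravity compensation plus a derivative term
    on the vertical-velocity error, divided by cos(theta)*cos(phi) to
    compensate for the thrust lost to vehicle tilt."""
    return m * (g + kd * (vz_ref - vz)) / (np.cos(theta) * np.cos(phi))
```

In level hover with zero velocity error the input reduces to mg, the thrust needed to balance gravity; any tilt increases U1 so that the vertical thrust component stays the same.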


3.2. TRANSLATION CONTROL
It is necessary to control the pitch and roll angles in order to control translation along the x and y axes. Therefore, for translation along the x axis, a reference pitch angle and pitch angular rate (θ*, θ̇*) are required. In the same way, for translation along the y axis, a reference roll angle and roll angular rate (φ*, φ̇*) are required. While the angular rates are determined from the vh* vector (4th and 5th components), the angles are determined as follows:

φ* = arcsin[kdy(ẏ* − ẏ)]
θ* = arcsin[kdx(ẋ − ẋ*)]

U2 = kpφ(φ* − φ) − kdφ φ̇    (2)
U3 = kpθ(θ* − θ) − kdθ θ̇    (3)

3.3. YAW CONTROL
The desired input signal for the yaw control of the helicopter is given in Eq. (4):

U4 = kdψ(ψ̇* − ψ̇)    (4)
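Equations (2)-(4) can be collected into one sketch: proportional error plus rate damping for roll and pitch, and a pure rate error for yaw. The function name and the gain values are illustrative placeholders, not the gains used in the study.

```python
def attitude_inputs(phi_ref, phi, phi_rate,
                    theta_ref, theta, theta_rate,
                    psi_rate_ref, psi_rate,
                    kp_phi=1.0, kd_phi=0.5,
                    kp_theta=1.0, kd_theta=0.5, kd_psi=0.5):
    """PD attitude inputs: U2 (roll) and U3 (pitch) combine an angle
    error term with rate damping; U4 (yaw) acts on the rate error."""
    u2 = kp_phi * (phi_ref - phi) - kd_phi * phi_rate
    u3 = kp_theta * (theta_ref - theta) - kd_theta * theta_rate
    u4 = kd_psi * (psi_rate_ref - psi_rate)
    return u2, u3, u4
```

At equilibrium (all references met, all rates zero) every input vanishes, so the controller commands no corrective torque.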


Figure 5. Linear and angular velocities of the helicopter


4. EXPERIMENT


A custom-designed experimental test stand, shown in Figure 6, is used to perform secure experiments. The test stand allows the helicopter to perform yaw motions freely, allows up to 2 meters of altitude, as well as up to ±20° of roll and pitch motion. The experimental system consists of a model quad-rotor helicopter, a test stand, a pan/tilt/zoom camera, a catadioptric camera to be used in future research, and an IMU sensor. A desktop computer with a Core2Quad 2.40 GHz processor and 3 GB of RAM, running Windows XP and fitted with a dual frame-grabber, was used. Algorithms were developed in C++ using the Matrox Imaging Library 8.0 [20]. A Sony pan/tilt/zoom camera is directed at a stationary ground target. Captured images are processed with a feature-extraction routine to determine the black blob features in the scene. Black blobs of various sizes were selected for simplicity and real-time performance. To show the effectiveness of the proposed algorithms, a yaw-motion experiment under visual-servo control was performed. The helicopter starts at a 70-degree yaw angle, and the goal is to reach a 110-degree yaw angle under visual servo control. The Euler angles of the helicopter during the experiment are presented in Figure 6. The helicopter reaches the desired yaw values while the roll and pitch angles are kept at zero degrees during the motion.
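The black-blob feature-extraction step described above can be illustrated with a minimal NumPy stand-in (not the commercial imaging-library routine used in the experiment); the threshold value and function name are assumptions.

```python
import numpy as np

def dark_blob_centroid(gray, threshold=50):
    """Threshold a grayscale image for dark ('black blob') pixels and
    return their centroid as (row, col), or None if no pixel is dark."""
    mask = gray < threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()
```

A real pipeline would additionally label connected components and filter blobs by size, but the centroid of the dark pixels is the essential measurement fed to the visual-servo loop.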
Figure 6. The Euler angles of the helicopter during the experiment.

The linear and angular velocities during the experiment are presented in Figure 6. The desired angular velocity ωz, which is related to the yaw motion, approaches zero as the helicopter approaches the desired yaw angle.


Figure 7. Results of the yaw control experiment.

5. PROPOSED TECHNIQUES

The proposed techniques in our UAV are listed below. It has four rotors driving four propellers, which makes it more stable. It carries one vision camera, which can provide a usable image up to approximately 100 m. It can attain a height of up to approximately 100 ft. It uses brushless DC motors rated at 3000 rpm, which are very light in weight.

6. CONCLUSION
This paper suggests a novel fly-the-camera approach to designing a nonlinear controller for an underactuated quad-rotor aerial vehicle that complements the quad-rotor motion with two additional camera axes to produce a fully actuated camera-targeting platform. The approach fuses the often separate tasks of vehicle navigation and camera targeting into a single task, where the pilot sees and flies the system as though riding on the camera optical axis. The controller was shown to provide position and angle tracking in the form of a Globally Uniformly Ultimately Bounded (GUUB) result. Visual information has been used solely for the control of the vehicle, with feature estimation, image-based control, and helicopter controller blocks. Various simulations in MATLAB and experiments performed on a model helicopter show that the approach is successful. As future work, we plan to experimentally validate the simulation results with stationary and non-stationary objects in the control loop with more advanced motions.



7. REFERENCES

[1] N. Guenard, T. Hamel, and R. Mahony, "A Practical Visual Servo Control for an Unmanned Aerial Vehicle," IEEE Transactions on Robotics, vol. 24, no. 2, April 2008.

[2] A. E. Neff, D. Lee, V. K. Chitrakaran, D. M. Dawson, and T. C. Burg, "Velocity Control for a Quad-Rotor UAV Fly-By-Camera Interface," Clemson University.

[3] "Control of a Remotely Operated Quad-Rotor Aerial Vehicle and Camera Unit Using a Fly-The-Camera Perspective," Proceedings of the 46th IEEE Conference on Decision and Control, New Orleans, LA, USA, Dec. 12-14, 2007.

[4] "Build a Micro Air Vehicle (MAV)," thesis topic: analysis and design of an on-board electronic system for a quad-rotor micro aerial vehicle.

[5] E. Altug, J. P. Ostrowski, and C. J. Taylor, "Quadrotor Control Using Dual Camera Visual Feedback," GRASP Lab, University of Pennsylvania, Philadelphia, PA 19104, USA, September 14-19, 2003.

[6] Z. Ceren and E. Altug, "Vision-based Servo Control of a Quadrotor Air Vehicle."

[7] M. Yasir Amir and V. Abbass, "Modelling of Quadrotor Helicopter Dynamics," Department of Electronic and Power Engineering, National University of Sciences and Technology, Karachi, Pakistan.

[8] J. F. Roberts, T. S. Stirling, J.-C. Zufferey, and D. Floreano, "Quadrotor Using Minimal Sensing For Autonomous Indoor Flight," École Polytechnique Fédérale de Lausanne (EPFL), Lausanne, 1015, Switzerland; pre-print version, published at MAV07 (EMAV2007).

[9] T. S. Aye, P. T. Tun, Z. M. Naing, and Y. M. Myint, "Development of Unmanned Aerial Vehicle Manual Control System," World Academy of Science, Engineering and Technology, vol. 42, 2008.




8. ACKNOWLEDGEMENT
The authors gratefully acknowledge the authorities of Dr. M.G.R. College & Research Institute, Chennai, India, for the facilities provided to carry out this research work. Many lives and destinies are ruined by the lack of proper guidance, direction, and opportunity; it is in this respect that I feel I am in a much better position today, thanks to the continuous motivation and focus provided by my parents and teachers. The completion of this project was a tedious job and required care and support at all stages, and I would like to highlight the role played by several individuals in it.
I am eternally grateful to the honorable Mrs. K. Sujata for providing us the opportunity and infrastructure to complete the project as a partial fulfillment of the B.Tech degree. I am very thankful to L. Ramesh, Head of the Department of EEE, for his kind support and faith in us.
