`Minneapolis, Minnesota, USA, June 14-16, 2006
`
`WeC01.3
`
Vision-Based Tracking and Motion Estimation for Moving Targets Using Small UAVs
Vladimir N. Dobrokhodov, Isaac I. Kaminer, Member, IEEE, Kevin D. Jones, and Reza Ghabcheloo
`
`
Abstract— This paper addresses the development of a vision-based target tracking system for a small unmanned air vehicle. The algorithm performs autonomous tracking of a moving target while simultaneously estimating the GPS coordinates of the target. A low-cost, off-the-shelf system is utilized, built around a modified radio-controlled aircraft airframe, gas engine and servos. Tracking is enabled by a low-cost, miniature pan-tilt gimbal. The control algorithm provides rapid and sustained target acquisition and tracking capability. A target position estimator was designed and shown to provide reasonable targeting accuracy. The impact of target loss events on the control and estimation algorithms is analyzed in detail.
`
I. INTRODUCTION

The past two decades have witnessed a remarkable increase in the utilization of unmanned air vehicles (UAVs) both in the US and abroad. While many of the large UAV systems are quite capable, their cost is also very high. Consequently there is much interest in the development of small, low-cost platforms which can perform some of the tasks normally assigned to larger UAVs, for example vision-based target tracking.
This paper addresses the development of a vision-based target tracking and position estimation system for a small UAV. This work is an extension of the results reported in [1]; therefore, most of the details addressing the hardware design and software implementation have been omitted here. In this paper the case of a moving target is studied.
`The platform used to test the system is a modified RC
`aircraft with a miniature pan-tilt gimbaled camera built using
`COTS components (see Fig. 1). In a typical operational
`scenario, the system operator may select a target of interest
`using a joystick that steers the onboard camera. Once a
`target is selected, the UAV and the camera automatically
`track the target and provide an estimate of its position,
`velocity and heading. The target can be either stationary or
`moving.
`
Manuscript received September 23, 2005. This work was supported by the U.S. Government under grants from USSOCOM and CDTEMS.
Vladimir N. Dobrokhodov is with the Mechanical and Astronautical Engineering Department, Naval Postgraduate School, Monterey, CA 93943 USA (phone: 831-656-7714; fax: 831-656-2313; e-mail: vldobr@nps.edu).
Isaac I. Kaminer is with the Mechanical and Astronautical Engineering Department, Naval Postgraduate School, Monterey, CA 93943 USA (e-mail: kaminer@nps.edu).
Kevin D. Jones is with the Mechanical and Astronautical Engineering Department, Naval Postgraduate School, Monterey, CA 93943 USA (e-mail: jones@nps.edu).
Reza Ghabcheloo is with the Instituto de Sistemas e Robotica (ISR) at Instituto Superior Tecnico (IST), Lisbon, Portugal (e-mail: reza@isr.ist.utl.pt).
`
`To keep the airborne system cost low, much of the
`expensive equipment is left on the ground. The video is
`transmitted to the ground, where it is processed in real time.
`The centroid of the target in the camera frame is identified
`by an image processing algorithm and is used to drive the
`integrated UAV/gimbal control algorithm, which in turn
`steers the UAV and the gimbal to keep the target in the
`center of the camera frame.
Reliance on inexpensive off-the-shelf equipment, as well as communication interrupts due to RFI, resulted in frequent loss of tracking by the system. Therefore, the key technical challenge was to design control and motion estimation algorithms that were robust in the presence of loss-of-tracking events. The design and analysis of both the control and, particularly, the motion estimation algorithms have borrowed heavily from the theory of systems with brief instabilities [2] and of linear parametrically varying (LPV) systems [3]. In this paper the target-loss events were modeled as brief instabilities.
`
`Fig. 1. Modified Telemaster UAV.
The paper is organized as follows. The design of the UAV control algorithm is discussed in Section II. The development of the target motion estimator is included in Section III. The results of flight experiments with moving targets are discussed in Section IV. The paper ends with some concluding remarks.
`
`II. CONTROL SYSTEM DEVELOPMENT
Consider Fig. 2. Let $\rho$ denote the range from the UAV to the target, $\vec{V}_g$ the UAV ground speed, $\vec{\lambda}_g$ the line-of-sight (LOS) vector, and $\vec{\lambda}_p$ the vector perpendicular to $\vec{\lambda}_g$. Furthermore, let $\varepsilon$ denote the angle between the LOS vector and the camera heading, $\lambda$ the LOS angle, $\psi$ the UAV
`
`1-4244-0210-7/06/$20.00 ©2006 IEEE
`
`1428
`
`Yuneec Exhibit 1019 Page 1
`
`
`
heading, $\psi_h$ the gimbal pan angle, and $\eta$ the angle between the $\vec{V}_g$ and $\vec{\lambda}_p$ vectors. In addition, suppose the target is moving with constant speed $V_t$ and heading $\psi_t$, as shown in Fig. 2.
Fig. 2. Moving target tracking for the control law (4).

From Fig. 2 it can be shown that the tracking kinematics for a moving target are given by

  \dot{\rho} = V_g \sin\eta - V_t \sin(\psi - \psi_t - \eta)
  \dot{\eta} = \frac{V_g \cos\eta - V_t \cos(\psi - \psi_t - \eta)}{\rho} - \dot{\psi}          (1)
  \dot{\varepsilon} = \frac{V_g \cos\eta - V_t \cos(\psi - \psi_t - \eta)}{\rho} - \dot{\psi} - \dot{\psi}_h

The control objective is to drive $\varepsilon$ and $\eta$ to zero using the UAV turn rate $\dot{\psi}$ and the pan rate $\dot{\psi}_h$ as control inputs. To this end the following control law is proposed:

  \dot{\psi} = k_1 \eta + \frac{V_g}{\rho_d}\cos\eta,
  \dot{\psi}_h = k_2 \varepsilon - \dot{\psi},          (2)

where $\rho_d$ denotes a desired horizontal range to the target, to be selected by the operator. Define $\rho_e = \tfrac{1}{\rho} - \tfrac{1}{\rho_d}$, so that $\tfrac{1}{\rho} = \rho_e + \tfrac{1}{\rho_d}$. Then it can be shown that the feedback system consisting of (1) and (2) is given by (3):

  \dot{\rho}_e = -V_g\left(\rho_e + \tfrac{1}{\rho_d}\right)^2 \sin\eta + d
  \dot{\eta} = -k_1 \eta + V_g \rho_e \cos\eta + d          (3)
  \dot{\varepsilon} = -k_2 \varepsilon + V_g\left(\rho_e + \tfrac{1}{\rho_d}\right)\cos\eta + d,

where $|d| \le V_t$ represents a bounded disturbance due to the target motion. It can be shown that this system is uniformly ultimately bounded. The proof can be found in [4].
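As a quick way to build intuition for the closed-loop behavior of the kinematics (1) under the control law (2), the following minimal Python sketch integrates the equations for a constant-velocity target. The speeds, gains, desired range, time step and initial conditions are illustrative assumptions only, not values used in the flight system.

```python
import numpy as np

def simulate(Vg=25.0, Vt=5.0, rho_d=200.0, k1=0.5, k2=1.0, dt=0.02, T=120.0):
    """Euler integration of the range/bearing kinematics (1) under the control law (2)."""
    rho, eta, eps = 600.0, 0.8, 0.3          # initial range [m] and angular errors [rad]
    psi, psi_t = 0.0, np.deg2rad(45.0)       # UAV heading and (constant) target heading
    hist = []
    for _ in range(int(T / dt)):
        psi_dot = k1 * eta + (Vg / rho_d) * np.cos(eta)          # UAV turn-rate command
        psi_h_dot = k2 * eps - psi_dot                           # gimbal pan-rate command
        lam_dot = (Vg * np.cos(eta) - Vt * np.cos(psi - psi_t - eta)) / rho
        rho_dot = Vg * np.sin(eta) - Vt * np.sin(psi - psi_t - eta)
        rho += dt * rho_dot
        eta += dt * (lam_dot - psi_dot)
        eps += dt * (lam_dot - psi_dot - psi_h_dot)
        psi += dt * psi_dot
        hist.append((rho, eta, eps))
    return np.array(hist)

out = simulate()
print("final range %.1f m, eta %.3f rad, eps %.3f rad" % tuple(out[-1]))
```

With these illustrative values the range settles near the desired standoff distance while the angular errors remain small, which is the qualitative behavior described for the full nonlinear simulation below.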
Results of a full-scale nonlinear simulation (Fig. 4) show that the control law performs remarkably well when tracking a moving target while using information obtained from the onboard camera and the UAV velocity available from onboard GPS. Note that in the presence of target loss events the control system maintains the last turn rate command generated during target lock.

Fig. 4. UAV motion versus target motion.

III. RANGE ESTIMATION

The standard approach to range estimation using a single camera involves triangulating between two consecutive points along the UAV path. At each point the stored measurements include the UAV position and the LOS angle of the onboard camera. Care must be taken that the baseline (the distance between these points) is sufficiently large to guarantee low dilution of precision (DOP). Clearly, for a UAV tracking a target along a circular path this approach will result in a large wait time between measurements.

In this paper we assume that the UAV's altitude above the target is known and use it as an additional measurement. To obtain this measurement we use the filter developed in [1] to get the target's latitude and longitude. The target's altitude is then obtained from a geo-referenced database made available by the Perspective View Nascent Technologies (PVNT) [5] software package by providing it with the target's estimated latitude and longitude.

Consider Fig. 5, which depicts an aircraft equipped with a gimbaled optical camera pointing at the moving ground target. Let {I} denote an inertial reference frame, {B} a body-fixed frame that moves with the UAV, and {C} a gimbaled-camera frame that coincides with the body-frame origin and rotates with respect to {B}.

Fig. 5. UAV-Target relative kinematics.
`
`1429
`
`Yuneec Exhibit 1019 Page 2
`
`
`
Suppose that the target's inertial velocity ($V_{tg}$) and heading ($\psi_{tg}$) are fixed. Following the notation introduced in [2], let $p_c = [x_c\;\; y_c\;\; z_c]^T$ denote the relative position of the center of {C} with respect to the target, and let ${}^{I}_{C}R$, ${}^{I}_{B}R$ and ${}^{B}_{C}R$ denote the coordinate transformations from {C} to {I}, from {B} to {I} and from {C} to {B}, respectively. These transformations are available onboard; they result from the IMU attitude measurements and the positional pan/tilt feedback of the gimbaled camera.

From these definitions it follows that

  p_{tg} = p + {}^{I}_{C}R\, p_c \quad\Longleftrightarrow\quad {}^{I}_{C}R\, p_c = -p + p_{tg}.          (4)

Differentiating (4) and introducing $V = \tfrac{d}{dt} p$ and $V_{tg} = \tfrac{d}{dt} p_{tg}$, the assumption of constant speed of the target provides the following process equations:

  \frac{d}{dt}\big({}^{I}_{C}R\, p_c\big) = -V + V_{tg}
  \frac{d}{dt} V_{tg} = 0.          (5)

Here, measurements of the camera and its gimbal angles contribute to $p_c$ in the first equation. In order to introduce these measurements into the process model, we assume that the camera readings are obtained using a simple pinhole camera model of the form

  \begin{bmatrix} u \\ v \end{bmatrix} = \frac{f}{x_c}\begin{bmatrix} y_c \\ z_c \end{bmatrix}.          (6)

In this equation $f$ is the focal length of the camera and $[u\;\; v]^T$ are the coordinates of the centroid of the target image in the camera frame. Since the onboard camera is gimbaled (directly controlled through pan and tilt angular commands), the target is always located in front of the camera's image plane, i.e. $x_c > 0$. As discussed above, in addition to the measurements (6) we use the UAV's altitude above the target:

  z = -x_b \sin\theta + y_b \sin\varphi\cos\theta + z_b \cos\varphi\cos\theta,          (7)

where $\varphi, \theta$ are the roll and pitch Euler angles that determine the orientation of the camera with respect to {I}. This equation is a linear combination of the third row of the corresponding rotation matrix and of $p_c$ resolved in {B}: $p_b = [x_b\;\; y_b\;\; z_b]^T = {}^{B}_{C}R\, p_c$.

Let $y = [u\;\; v\;\; z]^T = g_{\varphi\theta}(p_c)$; then

  g_{\varphi\theta}(p_c) = \begin{bmatrix} f\, y_c / x_c \\ f\, z_c / x_c \\ -x_c\sin\theta + y_c\sin\varphi\cos\theta + z_c\cos\varphi\cos\theta \end{bmatrix}.

Therefore, the process model considered in this paper takes the following form:

  \frac{d}{dt}\big({}^{I}_{C}R\, p_c\big) = -V + V_{tg}
  \frac{d}{dt} V_{tg} = 0          (8)
  y_m = g_{\varphi\theta}(p_c) + w_y,

where $y_m$ denotes the camera and altitude measurements corrupted by the process noise $w_y$.
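For concreteness, the following minimal Python sketch evaluates the measurement map $y = g_{\varphi\theta}(p_c)$ given by (6) and (7) for an assumed relative position and gimbal attitude; the focal length, position and angles used here are arbitrary illustrative values, not parameters of the actual camera.

```python
import numpy as np

def g_measurement(p_c, f, phi, theta):
    """Pinhole image of the target centroid plus altitude above the target, eq. (6)-(7)."""
    x_c, y_c, z_c = p_c
    assert x_c > 0, "gimbaled camera keeps the target in front of the image plane"
    u = f * y_c / x_c
    v = f * z_c / x_c
    z = (-x_c * np.sin(theta)
         + y_c * np.sin(phi) * np.cos(theta)
         + z_c * np.cos(phi) * np.cos(theta))
    return np.array([u, v, z])

# Hypothetical geometry: target 150 m ahead, 20 m to the side, 80 m below the camera.
y_meas = g_measurement(np.array([150.0, 20.0, 80.0]), f=0.025, phi=0.05, theta=-0.1)
print(y_meas)   # [u, v] in the image plane and the altitude-above-target measurement
```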
`
The practical problem now consists of determining the relative position and velocity of the moving target with respect to the UAV using IMU, GPS and tracking-camera measurements, complemented by the altitude above the target provided by PVNT. In [2], a general structure of a nonlinear filter with guaranteed stability and performance characteristics that solves this problem in the presence of measurement noise was proposed, while in [3] these results were extended to include out-of-frame events typical of vision-based applications.

During numerous flight tests [1] the image tracking software (see Section IV) lost target track on a regular basis. This prompted the following question: can the filtering solution maintain stability in the presence of target loss events? In fact, the ideas presented in [2] and [3] are used in this paper to derive a nonlinear filter that tracks a moving target using the process model (8) in the presence of out-of-frame events.

Following the development in [3], define an out-of-frame event as a binary signal $s: [0,\infty) \to \{0,1\}$ with

  s(t) = 0 if an out-of-frame event occurs at time $t$, and $s(t) = 1$ if the camera tracks the target at time $t$.

For a given binary signal $s$ and $t > \tau > 0$, let $T_s(\tau,t)$ denote the length of time in the interval $(\tau,t)$ during which $s = 0$. Then, formally,

  T_s(\tau,t) := \int_{\tau}^{t} \big(1 - s(l)\big)\, dl.

The signal $s$ is said to have brief target loss events if

  T_s(\tau,t) \le T_0 + \alpha\,(t - \tau) \quad \text{for all } t \ge \tau \ge 0,

for some $T_0 \ge 0$ and $\alpha \in [0,1]$.
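As an illustration of the brief target-loss condition above, the following minimal Python sketch computes $T_s(\tau,t)$ for a sampled tracking flag and checks the bound $T_s(\tau,t) \le T_0 + \alpha(t-\tau)$ over all sampled intervals. The sampling period, the flag sequence, and the values of $T_0$ and $\alpha$ are assumptions chosen purely for illustration.

```python
import numpy as np

def loss_time(s, dt):
    """Cumulative out-of-frame time T_s(0, k*dt) for a sampled binary signal s."""
    return np.cumsum(1 - np.asarray(s, dtype=float)) * dt

def has_brief_loss_events(s, dt, T0, alpha):
    """Check T_s(tau, t) <= T0 + alpha*(t - tau) over all sampled intervals (tau, t)."""
    T = np.concatenate(([0.0], loss_time(s, dt)))     # T[k] = T_s(0, k*dt)
    t = dt * np.arange(len(T))
    for i in range(len(T)):
        for j in range(i + 1, len(T)):
            if T[j] - T[i] > T0 + alpha * (t[j] - t[i]):
                return False
    return True

# Hypothetical 10 Hz tracking flag: 1 = target in frame, 0 = out of frame.
dt = 0.1
s = np.r_[np.ones(40), np.zeros(15), np.ones(45)]      # a single 1.5 s dropout
print(has_brief_loss_events(s, dt, T0=2.0, alpha=0.45))
```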
`
`1430
`
`Yuneec Exhibit 1019 Page 3
`
`
`
Next, consider that the orientation of the camera frame installed onboard the UAV is constrained by a compact set

  \Lambda_c := \{ (\varphi, \theta) : |\varphi| \le \varphi_{\max},\; |\theta| \le \theta_{\max} \},          (9)

and that the relative position of the UAV relative to the target, expressed in the camera frame, is constrained to be in

  P_c = \{ p_c = [x_c, y_c, z_c]^T : x_{\min} \le x_c \le x_{\max},\; y_{\min} \le y_c \le y_{\max},\; z_{\min} \le z_c \le z_{\max} \}.          (10)

Notice that the yaw angle $\psi$ is not limited; $x_{\min}, \ldots, z_{\max}$ are chosen according to the geometry of the mission and the relative vehicle dynamics.

The filter is designed to provide estimates $\hat{p}_c$ of $p_c$ bounded by

  \hat{P}_c = \{ \hat{p}_c = [\hat{x}_c, \hat{y}_c, \hat{z}_c]^T : x_{\min} - dx \le \hat{x}_c \le x_{\max} + dx,\; y_{\min} - dy \le \hat{y}_c \le y_{\max} + dy,\; z_{\min} - dz \le \hat{z}_c \le z_{\max} + dz \},          (11)

where $dx$, $dy$ and $dz$ are positive numbers, and $dx < x_{\min}$.

The nonlinear filter used in this paper is given by (12) (see also Fig. 6):

  \frac{d}{dt}\big({}^{I}_{C}R\, \hat{p}_c\big) = -V_m + \hat{V}_{tg} + s\, K_1\, {}^{I}_{C}R\, H^{-1}(\hat{p}_c)\, \big(g_{\varphi\theta}(\hat{p}_c) - y_m\big)
  \frac{d}{dt} \hat{V}_{tg} = s\, K_2\, {}^{I}_{C}R\, H^{-1}(\hat{p}_c)\, \big(g_{\varphi\theta}(\hat{p}_c) - y_m\big),          (12)

where $s$ defines the out-of-frame event and $H(p_c)$ is the Jacobian of the nonlinear transformation $g_{\varphi\theta}(p_c)$ with respect to $p_c$:

  H(p_c) = \begin{bmatrix} -f y_c / x_c^2 & f / x_c & 0 \\ -f z_c / x_c^2 & 0 & f / x_c \\ -\sin\theta & \sin\varphi\cos\theta & \cos\varphi\cos\theta \end{bmatrix}.          (13)

It is easy to check that $\det(H) = f^2 z / x_c^3$, and therefore $H(p_c)$ is always invertible for all admissible values of $p_c$, $\varphi$, $\theta$, except if the altitude $z = 0$.

The filtering solution (12) extends the results proposed in [2] to include out-of-frame events. Theorem 1 in [3] can be used to prove regional stability of the filter (12) for the process model (8), with the regions $P_c$ and $\hat{P}_c$ given by (10) and (11), in the presence of brief out-of-frame events characterized by the parameters $T_0$ and $\alpha$. The proof follows directly from the one used in [3] and is therefore omitted.

Figure 6 shows the implementation of the filter (12). When an out-of-frame event occurs, the filter integrates the velocity measurements to obtain an estimate of the relative position (dead reckoning). When target tracking is reestablished, the integrators are reinitialized based on the real-time imagery.

Fig. 6. Implementation of filter (12).

Next, the entire system including the control law (2) and the filter (12) was tested in a full-scale 6DOF nonlinear simulation in the presence of wind and measurement noise. The scenario used for the simulation assumed identification of a moving target and the start of target tracking 2.5 s after the beginning of flight, followed by initialization of the position estimation filters at 26 s, when the object of interest is at 50° starboard. During the interval between 2.5 and 26 seconds the UAV experiences the transient of the control law that brings the UAV to a circular motion around the moving target. The target is moving with a constant ground speed of 14 m/s and a heading of 45°. Based on the analysis of real measurements from numerous flight experiments, the following sensor noise was applied in the simulation: camera noise for both channels with 0° mean and 2.5° variance, and measurements of altitude above the target with 0 m mean and 20 m variance (here we assumed the worst-case scenario when only GPS measurements are available and the target is moving on flat ground at a known altitude MSL).

The results of this simulation for the ideal case when no out-of-frame events occur ($\alpha = 0$) are presented next. Figure 7 shows 3D and plane projections of the target and UAV trajectories and the projection of the estimated target position obtained with the filter (12). The filter is initialized with the horizontal coordinates of the UAV but with the altitude of the target. Analysis shows that, except for a very short convergence interval, the estimated target position closely follows the true motion of the target. Figure 8 presents the filtering results for the position, speed and heading estimation errors. It can be seen that in the ideal scenario with $\alpha = 0$ the convergence time for the positional error does not exceed 5.5 seconds (Fig. 8a shows convergence to 10 m) and 11 seconds for both speed and heading (Fig. 8b shows convergence to 5 m/s and 5°).

Analysis of the same experiment with a variable target loss parameter $\alpha$ is presented in Fig. 9. The metrics used to evaluate the performance of the filter as $\alpha$ increases were chosen to be the speed-of-convergence parameters. Specifically, these were defined as the first time instant past which the estimate stays within 10% of the true value. Here Pconv represents the position metric and Vconv the velocity metric. The analysis shows that the filter exhibits stable convergence times for both position and velocity estimates in the presence of out-of-frame events characterized by $\alpha$ as high as 0.45 (the target is lost 45% of the time). The positional convergence time Pconv for the nonlinear filter (NLF) reported earlier in [1] is also included in Fig. 9. It is evident that the filter (12) outperforms the NLF for the entire range of values of $\alpha$ considered.
`
`Fig. 7. 3D and 2D projections of relative motion.
`
Fig. 9. Convergence time vs. variable α.
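To make the structure of the estimator concrete, the sketch below implements a simple discrete-time version of the filter (12): the measurement map $g_{\varphi\theta}$, its Jacobian $H$, and one Euler update step gated by the tracking flag $s$, so that the filter dead-reckons when $s = 0$. The gains, focal length, time step, and the way the rotation ${}^{I}_{C}R$ and the measured UAV velocity $V_m$ are supplied are illustrative assumptions rather than values from the paper.

```python
import numpy as np

def g_meas(p_c, f, phi, theta):
    """Measurement map y = [u, v, z]: pinhole image of the centroid plus altitude above target."""
    x, y, z = p_c
    return np.array([f * y / x,
                     f * z / x,
                     -x * np.sin(theta) + y * np.sin(phi) * np.cos(theta) + z * np.cos(phi) * np.cos(theta)])

def H_jac(p_c, f, phi, theta):
    """Jacobian of g_meas with respect to p_c; its determinant vanishes only at zero altitude."""
    x, y, z = p_c
    return np.array([[-f * y / x**2, f / x, 0.0],
                     [-f * z / x**2, 0.0, f / x],
                     [-np.sin(theta), np.sin(phi) * np.cos(theta), np.cos(phi) * np.cos(theta)]])

def filter_step(q_hat, Vtg_hat, V_m, y_m, s, R_IC, f, phi, theta, K1, K2, dt):
    """One Euler step of the filter (12).

    q_hat : current estimate of R_IC @ p_c (relative position resolved in {I})
    s     : 1 when the camera tracks the target, 0 during an out-of-frame event.
    When s = 0 the correction terms vanish and the filter dead-reckons on -V_m + Vtg_hat.
    """
    p_c_hat = R_IC.T @ q_hat                      # estimate resolved in the camera frame
    innov = g_meas(p_c_hat, f, phi, theta) - y_m  # residual g(p_c_hat) - y_m
    G = R_IC @ np.linalg.inv(H_jac(p_c_hat, f, phi, theta))
    q_dot = -V_m + Vtg_hat + s * (K1 @ (G @ innov))
    Vtg_dot = s * (K2 @ (G @ innov))
    return q_hat + dt * q_dot, Vtg_hat + dt * Vtg_dot

# Illustrative single step with made-up numbers.
R_IC = np.eye(3)                                  # camera frame aligned with {I} for simplicity
p_c_true = np.array([120.0, 10.0, 60.0])          # target 120 m ahead and 60 m below (x_c > 0)
f, phi, theta = 1.0, 0.0, 0.0
y_m = g_meas(p_c_true, f, phi, theta)
q_hat, Vtg_hat = filter_step(np.array([100.0, 0.0, 50.0]), np.zeros(3),
                             V_m=np.array([20.0, 0.0, 0.0]), y_m=y_m, s=1,
                             R_IC=R_IC, f=f, phi=phi, theta=theta,
                             K1=-0.5 * np.eye(3), K2=-0.05 * np.eye(3), dt=0.1)
```

In an actual implementation these inputs would come from the autopilot's IMU/GPS and the gimbal feedback at each sample, and the gains would be chosen to satisfy the stability conditions referenced above.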
`
`IV. FLIGHT TEST RESULTS
The flight test setup used to test the filter (12) is almost identical to the one described in [1] and is shown in Fig. 10. A customized Senior Telemaster model aircraft was used to house the gimbaled camera, wireless video and serial links, as well as a Piccolo autopilot [6] with its dedicated control link. The image obtained by the onboard camera was broadcast on a 2.4 GHz link and processed on the ground by the off-the-shelf PerceptiVU image processing software [7].
`
a. Position error.
b. Velocity and heading errors.
Fig. 8. Convergence results for filter (12).
`
`Fig. 10. Flight test setup.
PerceptiVU allows the user to select and lock onto a target displayed on the ground station screen. In the configuration used in this experiment, PerceptiVU provides the coordinates of the centroid of the target selected by the user. These coordinates were then employed by the control and filtering algorithms introduced in the previous sections, which were implemented on the NPS ground station (GS).
Multiple flight tests of the complete system were conducted in February-May and August-September of 2005. This time, rather than being fixed, the target (a white minivan) was moving alongside the runway with a fixed speed of 4-
`
`1432
`
`Yuneec Exhibit 1019 Page 5
`
`
`
5 m/s and a heading of 296° (parallel to the runway); see Fig. 11. In order to evaluate the system performance, the position, direction and speed of the target were continuously tracked by a GPS receiver.
`
Fig. 11. An example of visual tracking.
Results of the tracking are summarized in Fig. 12 and Fig. 13. For the sake of comparison, they show the results of two estimation algorithms. Figure 12 includes a 3D plot of the UAV trajectory at the top as well as the estimates of the target position at the bottom. The UAV trajectory is color coded to display the time intervals where the target track was lost. Due to the low speed of the target, the control law maintains a circular motion with a turn radius of about 200 m and a slowly moving center, as predicted by the analysis presented in Section II.
`
Fig. 12. Flight test result of tracking a moving target.
Figure 13 shows the range estimation errors and Fig. 14 the velocity estimation errors. Superimposed on the position estimation error plot is the time history of the tracking loss events. As can be seen from Fig. 13, the filter (12) performs significantly better than the NLF, while the velocity estimation error obtained with the filter (12) does not exceed 0.5 m/s.
`
`Fig. 13. Flight test range estimation errors for two algorithms.
`
`Fig. 14. Flight test velocity estimation error.
`
`V. CONCLUSIONS
A system capable of tracking a moving target and estimating its position and velocity was developed. Straightforward nonlinear analysis was used to motivate a simple control system for integrated control of a UAV and of an onboard gimbaled camera. The control system was shown to perform well both in nonlinear simulation and in flight tests. Furthermore, a nonlinear LPV filter for target motion estimation was introduced. The filter performance was analyzed in the presence of target loss events, and it was shown that the filter exhibited graceful degradation of performance in the presence of these events. The flight test results for a moving target supported this conclusion. Future work will address improving the performance of the target tracking and motion estimation algorithms by decreasing convergence times, reducing the occurrence of target loss events, and mitigating their impact on the filter performance.
`
`REFERENCES
[1] Whang I. H., Dobrokhodov V. N., Kaminer I. I., Jones K. D., "On Vision-Based Target Tracking and Range Estimation for Small UAVs," Proceedings of the AIAA Guidance, Navigation, and Control Conference, San Francisco, CA, August 15-18, 2005.
[2] Oliveira P., Pascoal A., Kaminer I. I., "A Nonlinear Vision Based Tracking System for Coordinated Control of Marine Vehicles," Proceedings of the 10th Mediterranean Conference on Control and Automation (MED 2002), Lisbon, Portugal, July 9-12, 2002.
[3] Hespanha J., Yakimenko O., Kaminer I., Pascoal A., "Linear Parametrically Varying Systems with Brief Instabilities: An Application to Integrated Vision/IMU Navigation," IEEE Transactions on Aerospace and Electronic Systems, July 2004.
[4] V. Dobrokhodov, I. Kaminer, K. Jones and R. Ghabcheloo, "Vision Based Tracking and Motion Estimation for Moving Targets using Small UAVs," NPS Internal Report, January 2006.
[5] Baer W., "UAV Target Mensuration Experiment Using Synthetic Images from High Resolution Terrain Databases at Camp Roberts," 72nd MORSS, 10-12 June 2004, NPS, Monterey, CA, WG 25 T&E, http://www.trac.nps.navy.mil/PVNT/
[6] Piccolo/Piccolo Plus autopilots - highly integrated autopilots for small UAVs, Cloud Cap Technology, Inc., http://cloudcaptech.com/.
[7] The PerceptiVU Target Tracking Software manual, PerceptiVU Inc., www.PerceptiVU.com.
`
`1433
`
`Yuneec Exhibit 1019 Page 6