US007970507B2

(12) United States Patent
     Fregene et al.

(10) Patent No.:     US 7,970,507 B2
(45) Date of Patent: Jun. 28, 2011
`
`(54) METHOD AND SYSTEM FOR AUTONOMOUS
`TRACKING OF A MOBILE TARGET BY AN
`UNMANNED AERIAL VEHICLE
`
`(75)
`
`Inventors: Kingsley 0 C Fregene, Andover, MN
`(US); Peter Lommel, St. Cloud, MN
`(US)
`
`International
`(73) Assignee: Honeywell
`Morristown, NJ (US)
`
`Inc. ,
`
( * ) Notice: Subject to any disclaimer, the term of this patent is
              extended or adjusted under 35 U.S.C. 154(b) by 805 days.
`
(21) Appl. No.: 12/018,669

(22) Filed: Jan. 23, 2008

(65) Prior Publication Data

     US 2009/0187299 A1    Jul. 23, 2009
`
(51) Int. Cl.
     G01C 22/00 (2006.01)
(52) U.S. Cl. ........ 701/23; 701/14; 701/206; 342/36; 340/948; 340/961
(58) Field of Classification Search ........... 701/14, 701/16, 23, 29,
     35, 117, 120, 301, 206; 340/961, 945, 948; 342/29, 33, 34, 35, 36
     See application file for complete search history.
`
`(56)
`
`References Cited
`
`6, 910. 657
`7, 307. 579
`7, 437. 225
`2003/0212478
`2007/0093945
`2007/0250260
`2007/0268364
`2009/0015674
`
`U. S. PATENT DOCUMENTS
`B2
`6/2005 Schneider
`B2
`12/2007 Rees et al.
`B 1 " 10/2008 Rathinam
`. . . . . . . . . .
`Al" 1 1/2003 Rios . . . , . . . . . . . . . . . . , . ,
`Al
`4/2007 Grzywna et al.
`10/2007 Ariyur et al
`A 1
`1 1/2007 Neff et al.
`A 1
`A 1 "
`I/2009 Alley et al.
`OTHER PUBLICATIONS
`
`. . . 701!14
`. . . . , 701/2
`
`. . . 348/144
`
`Jusuk Lee etl al, Strategies of Path-Planning
`for a UAV to Track a
`Ground Vehicle, AINS, Menlo Park, CA, Jun. 2003.
`
`~ cited by examiner
`
Primary Examiner: Gertrude Arthur Jeanglaud
(74) Attorney, Agent, or Firm: Fogg & Powers LLC

(57)                    ABSTRACT

This invention provides a system and method for autonomously tracking a
moving target from unmanned aerial vehicles (UAVs) with a variety of
airframe and sensor payload capabilities so that the target remains within
the vehicle's sensor field of view regardless of the specific target motion
patterns. The invention uses information about target location, UAV platform
type and states, sensor payload capability, and ratio of target-to-UAV speeds
to select from a suite of sub-algorithms, each of which generates desired
platform positions (in the form of waypoints) and/or sensor orientation
commands to keep the target in view.

20 Claims, 6 Drawing Sheets
`
[Representative drawing: UAV 112 with sensor 114, target 116, and remote operator 118]
[Drawing Sheet 1 of 6]
`
`
`
[Drawing Sheet 2 of 6]
`
`
`
[Drawing Sheet 3 of 6: FIG. 3 flow chart: START; identify the vertices and center of the sensor footprint (302); identify the coordinates of the midpoints for each side of the sensor footprint; transform each coordinate into an inertial coordinate; scale the inertial coordinates; determine if the target is close to leaving the field of view of the sensor; cause unmanned aerial vehicle to fly in a track mode that keeps the target in the field of view of the sensor; END]
`
`
`
[Drawing Sheet 4 of 6: FIG. 4 and FIG. 5; label "Direction of frame motion"]
`
`
`
[Drawing Sheet 5 of 6]
`
`
`
[Drawing Sheet 6 of 6]
`
`
`
`METHOD AND SYSTEM FOR AUTONOMOUS
`TRACKING OF A MOBILE TARGET BY AN
`UNMANNED AERIAL VEHICLE
`
GOVERNMENT LICENSE RIGHTS

The U.S. Government may have certain rights in the present invention as
provided for by the terms of Contract No. FA8650-04-C-7142 with the Defense
Advanced Research Projects Agency.
`
BACKGROUND TECHNOLOGY

Unmanned aerial vehicles (UAVs) are remotely piloted or self-piloted aircraft
that can carry cameras, sensors, communications equipment, or other payloads.
They have been used in a reconnaissance and intelligence-gathering role for
many years. More recently, UAVs have been developed for the purpose of
surveillance and target tracking.

Autonomous surveillance and target tracking performed by UAVs in either
military or civilian environments is becoming an important aspect of
intelligence-gathering. Typically, when a target is being tracked from aerial
vehicles (e.g., a UAV), human operators must closely monitor imagery streamed
from the aircraft to assess target behavior and ensure that the target
continues to be in view.
`
`SUMMARY
`
This invention provides a system and method for autonomously tracking a
moving target from UAVs with a variety of airframe and sensor payload
capabilities so that the target remains within the vehicle's sensor field of
view regardless of the specific target motion patterns. The invention uses
information about target location, UAV platform type and states, sensor
payload capability, and ratio of target-to-UAV speeds to select from a suite
of sub-algorithms, each of which generates desired platform positions (in the
form of waypoints) and/or sensor orientation commands to keep the target in
view.
`
BRIEF DESCRIPTION OF THE DRAWINGS

Features of the present invention will become apparent to those skilled in
the art from the following description with reference to the drawings.
Understanding that the drawings depict only typical embodiments of the
invention and are not therefore to be considered limiting in scope, the
invention will be described with additional specificity and detail through
the use of the accompanying drawings, in which:
FIG. 1 is a schematic diagram depicting a system for aerial tracking of a
ground vehicle according to one embodiment of the invention.
FIG. 2 is a simplified block diagram of an entity arranged to implement
aspects of the exemplary embodiment.
FIG. 3 is a flow chart depicting functions that can be carried out in
accordance with the exemplary embodiment.
FIG. 4 is a diagram depicting an example of a sensor footprint.
FIG. 5 is a diagram of a footprint that illustrates variables needed to
determine how close target 116 is from leaving sensor 114's field of view.
FIG. 6 is a diagram depicting SIN tracking.
FIG. 7 is an illustration depicting the parameters for adjusting the
loitering orbit when target 116 is in motion.
FIG. 8 depicts a forward-looking sensor footprint that has been normalized.

`2
`DETAILED DESCRIPTION
`
In the following detailed description, embodiments are described in
sufficient detail to enable those skilled in the art to practice the
invention. It is to be understood that other embodiments may be utilized
without departing from the scope of the present invention. The following
detailed description is, therefore, not to be taken in a limiting sense.
FIG. 1 is a simplified diagram depicting a system 100 for automatically
tracking a target from a UAV. As shown in FIG. 1, the system includes (1) a
UAV 112 equipped with (2) at least one sensor 114, (3) a target 116 and (4) a
remote operator 118. It should be understood that while remote operator 118
is shown in FIG. 1, remote operator 118 may be many miles away from UAV 112
and target 116.
The UAV 112 can either be a hover-capable aerial vehicle or a fixed-wing
aerial vehicle. Sensor 114 may be any device capable of imaging a target,
such as a camera or radar. Target 116 may be anything being monitored by UAV
112. For example, target 116 may be a ground-based vehicle, an air-based
vehicle, a roadway, or a person. To acquire a target, UAV 112 typically sends
images from sensor 114 to remote operator 118. Remote operator 118 then
defines an area in the image as the target, and sends the target to UAV 112.
Remote operator 118 may be any device capable of communicating with UAV 112.
In addition, remote operator 118 may be configured to remotely control UAV
112. Remote operator 118 may be a device such as a desktop computer, laptop,
or a personal data assistant ("PDA"), for example.
Aspects of the present invention may be carried out by UAV 112 and/or remote
operator 118 (or any other entity capable of controlling UAV 112). FIG. 2
depicts functional components that may be included in UAV 112 and/or remote
operator 118 to carry out various aspects of the invention. As shown in FIG.
2, the components include a communication interface 200, a processing unit
202, and data storage 204, all of which may be coupled together by a system
bus, network, or other mechanism 210.
Communication interface 200 comprises a mechanism for communicating over an
air interface, so as to facilitate communication between UAV 112 and remote
operator 118. Further, communication interface 200 may include one or more
antennas to facilitate air interface communication.
Processing unit 202 comprises one or more general purpose processors (e.g.,
INTEL microprocessors) and/or one or more special purpose processors (e.g.,
digital signal processors). Data storage 204, in turn, comprises one or more
volatile and/or non-volatile storage mechanisms, such as memory and/or
disc-drive storage for instance, which may be integrated in whole or in part
with processing unit 202.
As shown, data storage 204 includes program logic 206 and reference data 208.
Program logic 206 comprises one or more logic modules (applications), and
preferably includes machine language instructions executable by processing
unit 202 to carry out various functions described herein, such as (1)
identifying the coordinates of the footprint of sensor 114, (2) determining
whether the target is close to leaving the field of view of the sensor, and
(3) causing UAV 112 to fly in a track mode that keeps target 116 in the field
of view of sensor 114. Reference data 208, in turn, may include data such as
imaging data acquired by sensor 114.
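For illustration only, the division of program logic 206 into the three
functions described above might look like the following Python sketch. The
class and method names (TrackerLogic, identify_footprint, target_near_edge,
command_track_mode) are hypothetical and are not taken from the patent; a
real implementation would run on processing unit 202.

from dataclasses import dataclass

@dataclass
class SensorImage:
    # Placeholder for imaging data acquired by sensor 114 (reference data 208).
    timestamp: float
    pixels: bytes

class TrackerLogic:
    # Hypothetical sketch of program logic 206: three logic modules that
    # (1) identify the sensor-footprint coordinates, (2) decide whether the
    # target is close to leaving the field of view, and (3) select a track mode.

    def identify_footprint(self, uav_state, sensor_state):
        # FIG. 3: return footprint vertices, midpoints and center.
        raise NotImplementedError

    def target_near_edge(self, footprint, target_position):
        # Return True when the target is close to leaving the field of view.
        raise NotImplementedError

    def command_track_mode(self, target_speed, uav_speed):
        # Select SFO, SIN or ORB and return waypoint commands for UAV 112.
        raise NotImplementedError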
FIG. 3 is a flow chart depicting identifying various coordinates in a sensor
footprint used to autonomously track target 116. In particular, FIG. 3
depicts (1) identifying the coordinates of the footprint of sensor 114, (2)
determining whether the target is close to leaving the field of view of the
sensor, and
`
(3) causing UAV 112 to fly in a track mode that keeps target
`116 in the field of view of sensor 114.
As shown in FIG. 3, at step 302, UAV 112 identifies the coordinates of the
vertices and center of the footprint (i.e., the viewing window) of sensor
114. Examples of sensor footprints are depicted in FIG. 4. As shown in FIG.
4, UAV 112 is equipped with forward and side looking sensors. Forward looking
sensor footprint 402 includes vertices (a, b, c, d). The center of footprint
402 is identified as (i). Side-looking sensor footprint 404 includes vertices
(e, f, g, h). The center of side-looking sensor footprint 404 is identified
as (j).
FIG. 8 depicts a forward-looking sensor footprint that has been normalized
(i.e., displayed as a rectangle). As shown in FIG. 8, the footprint includes
vertices (a, b, c, d), center (i), midpoints (ad_c, ab_c, bc_c, dc_c), and
angles (α_h/2, α_v/2), where α_h and α_v are the horizontal and vertical
field of view angles for sensor 114.
Returning to FIG. 3, the coordinates of the vertices and center of the sensor
footprint may be computed using the following data:
[α_h, α_v], the horizontal and vertical field of view for sensor 114;
[θ, φ, ψ], the attitude angles of UAV 112, where θ is the pitch, φ is the
roll, and ψ is the yaw. In this example, climb requires a positive pitch, the
right wing down is a positive roll, and clockwise from the top of the vehicle
is a positive yaw;
[θ_s, φ_s, ψ_s], the attitude angles of sensor 114, where θ_s is the pitch,
φ_s is the roll, and ψ_s is the yaw. In this example, pitch is measured
between 0 and 90 degrees from straight down, so the camera lookdown angle is
[90 - θ_s]; the roll angle is positive right and the yaw angle is positive in
the clockwise direction. Consequently, a forward facing sensor 114 has
ψ_s = 0, while a left-pointing camera has ψ_s = -90 degrees; and
[N, E, h], the position coordinates of UAV 112, where N = north, E = east,
and h = height from some reference point (such as UTM northings, eastings and
altitude).
The local coordinates of the vertices and center of the footprint are
identified as follows:
`
a = [tan(α_h/2), tan(α_v/2), 1]
b = [tan(α_h/2), -tan(α_v/2), 1]
c = [-tan(α_h/2), -tan(α_v/2), 1]
d = [-tan(α_h/2), tan(α_v/2), 1]
i = [0, 0, 1]

At step 304, the local coordinates of the midpoints for each side of the
sensor footprint are identified as follows:

ab_c = [tan(α_h/2), 0, 1]
bc_c = [0, -tan(α_v/2), 1]
dc_c = [-tan(α_h/2), 0, 1]
ad_c = [0, tan(α_v/2), 1]
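As a rough illustration of steps 302 and 304, the following Python sketch
builds these normalized (unit third component) corner and midpoint
coordinates from the two field-of-view angles. The function name and the
particular corner ordering are assumptions made for the example, not values
given in the patent.

import math

def normalized_footprint(alpha_h, alpha_v):
    # Corner, midpoint and center vectors of a normalized sensor footprint,
    # given the horizontal and vertical fields of view in radians.
    th = math.tan(alpha_h / 2.0)   # half-extent in the horizontal direction
    tv = math.tan(alpha_v / 2.0)   # half-extent in the vertical direction
    corners = {
        "a": ( th,  tv, 1.0),
        "b": ( th, -tv, 1.0),
        "c": (-th, -tv, 1.0),
        "d": (-th,  tv, 1.0),
    }
    midpoints = {
        "ab_c": ( th, 0.0, 1.0),
        "bc_c": (0.0, -tv, 1.0),
        "dc_c": (-th, 0.0, 1.0),
        "ad_c": (0.0,  tv, 1.0),
    }
    center = {"i": (0.0, 0.0, 1.0)}
    return corners, midpoints, center

# Example: a sensor with a 60 by 45 degree field of view.
corners, midpoints, center = normalized_footprint(math.radians(60), math.radians(45))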
`
At step 306, the radii of the ellipse that circumscribes the frame are
identified as follows:

r_h = tan(α_h/2)
r_v = tan(α_v/2)

where r_h is the radius in the horizontal direction and r_v is the radius in
the vertical direction. The smaller of these two radii corresponds to the
length of the semi-minor axis of the ellipse.
At step 308, each coordinate is transformed to inertial coordinates by
multiplying the coordinate by the pitch-roll-yaw rotation matrices [R] and
[R_s], where

[R] = [R(θ)] [R(φ)] [R(ψ)]; and
[R_s] = [R(θ_s)] [R(φ_s)] [R(ψ_s)].

Thus,

A = a [R] [R_s]
B = b [R] [R_s]
C = c [R] [R_s]
D = d [R] [R_s]
I = i [R] [R_s]
AB_c = ab_c [R] [R_s]
BC_c = bc_c [R] [R_s]
DC_c = dc_c [R] [R_s]
AD_c = ad_c [R] [R_s]

Rotational matrices are well known in the art, and are not described in
detail here.
At step 310, the scaled coordinates of the sensor footprint are computed by
scaling the inertial coordinates by the height (h) that UAV 112 is flying
above the ground (if target 116 is a ground target), or the height that UAV
112 is flying above target 116 (if target 116 is not necessarily on the
ground). The footprint is calculated as follows:
`
A_s = A × h / A(3)
B_s = B × h / B(3)
C_s = C × h / C(3)
D_s = D × h / D(3)
I_s = I × h / I(3)
AB_cs = AB_c × h / AB_c(3)
BC_cs = BC_c × h / BC_c(3)
DC_cs = DC_c × h / DC_c(3)
AD_cs = AD_c × h / AD_c(3)

where X(3) denotes the third component of the corresponding inertial
coordinate X.
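A minimal sketch of steps 308 and 310, assuming NumPy and standard
single-axis rotation matrices (pitch about the y axis, roll about the x axis,
yaw about the z axis). These axis conventions, and the function names, are
assumptions; the patent only specifies that each point is multiplied by [R]
and [R_s] and then scaled by h over its third component.

import numpy as np

def rot_x(a):   # roll
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def rot_y(a):   # pitch
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def rot_z(a):   # yaw
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def ground_footprint(local_points, uav_att, sensor_att, h):
    # Rotate each normalized footprint point through the UAV and sensor
    # attitudes, then scale it to the ground using the height h.
    # uav_att and sensor_att are (pitch, roll, yaw) tuples in radians.
    R  = rot_y(uav_att[0]) @ rot_x(uav_att[1]) @ rot_z(uav_att[2])
    Rs = rot_y(sensor_att[0]) @ rot_x(sensor_att[1]) @ rot_z(sensor_att[2])
    scaled = {}
    for name, p in local_points.items():
        q = np.asarray(p, dtype=float) @ R @ Rs   # inertial coordinates
        scaled[name] = q * (h / q[2])             # scale by h over the third component
    return scaled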
`
`50
`
`60
`
`65
`
`Yuneec Exhibit 1008 Page 9
`
`
`
`US 7, 970, 507 B2
`
Similarly, the length of the semi-minor axis may be scaled to account for the
height UAV 112 is above target 116:

r_SM = min(r_h, r_v) × h / I(3)
`
After computing the various coordinates of sensor 114's sensor footprint, at
step 310, the target's (1) position relative to the center of the camera
footprint (r_target) on the ground, (2) the distance from the center of the
camera footprint to the side (e.g., AB_cs, BC_cs, DC_cs or AD_cs) of the
footprint that is closest to the target (r_edge), and (3) the distance from
the center of the frame to the target (r_t) are used to determine how close
target 116 is from leaving sensor 114's field of view. These positions are
illustrated in FIG. 5, which is a diagram of a footprint that illustrates the
variables needed to determine how close target 116 is from leaving sensor
114's field of view. As shown in FIG. 5, the frame includes a center point, a
target, r_target, r_t, r_edge, and the direction of the frame of motion.
In order to determine whether the target is about to leave the sensor's field
of view, the values of r_target and r_edge are first calculated. r_target is
calculated by using the following equation:

r_target = ⟨e_t, e_edge⟩

where e_t and e_edge are unit vectors along a line from the target to the
center of the footprint and from the mid-point of the closest side to the
center, respectively. That is, e_t is the unit vector along r_t, while e_edge
is the unit vector along r_edge.
r_edge is calculated using the following equation:

r_edge = argmin { |r_ABcs - r_target|, |r_BCcs - r_target|, |r_DCcs - r_target|, |r_ADcs - r_target| }

where r_ABcs, r_BCcs, r_DCcs and r_ADcs run from the center of the frame to
the corresponding sides of the frame.
By calculating these values over time, UAV 112 (or remote operator 118) can
determine if and when target 116 will leave the field of view of sensor 114.
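A sketch of this edge-proximity test, under the assumption that r_target is
the target's offset from the footprint center, r_edge runs from the center to
the midpoint of the closest side, and the target is flagged as about to leave
when its projection along the edge direction approaches the edge distance.
The threshold value is an assumption and would be tuned in practice.

import numpy as np

def near_edge(target_pos, center, side_midpoints, threshold=0.9):
    # target_pos, center and the side midpoints (e.g. AB_cs, BC_cs, DC_cs,
    # AD_cs) are 2-D ground coordinates. Returns the closest side and a flag
    # that is True when the target has covered `threshold` of the distance to
    # that side.
    target_pos = np.asarray(target_pos, dtype=float)
    center = np.asarray(center, dtype=float)
    r_target = target_pos - center
    # Pick the side whose midpoint is closest to the target (the argmin above).
    closest = min(side_midpoints,
                  key=lambda k: np.linalg.norm(np.asarray(side_midpoints[k], float) - target_pos))
    r_edge = np.asarray(side_midpoints[closest], dtype=float) - center
    e_edge = r_edge / np.linalg.norm(r_edge)     # unit vector toward the closest side
    progress = float(np.dot(r_target, e_edge))   # target offset along that direction
    return closest, progress >= threshold * np.linalg.norm(r_edge)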
At step 312, the location and speed of target 116 are used to cause UAV 112
to fly in different track modes, depending on how fast target 116 is moving
relative to the speed UAV 112 is capable of flying. These modes include, all
at suitable altitude and stand-off ranges, (1) a straight following of the
target (SFO), (2) flying orbits around the target (ORB), or (3) doing
S-shapes (i.e., sinusoids) (SIN). This enables UAV 112 to maintain target 116
in the sensor footprint of sensor 114. UAV 112 flies in these modes by
receiving waypoint commands and flying to the waypoints.
To determine which tracking mode UAV 112 should use, the ratio (α) of the
speed of UAV 112 (v_a) to the speed of target 116 (v_t) is identified:
`
α = v_a / v_t
`
If α is around 1, UAV 112 and target 116 are traveling at similar speeds, and
UAV 112 should employ SFO tracking. If α is greater than 1 (i.e., UAV 112 is
travelling faster than target 116), UAV 112 can slow down to match the speed
of target 116 and maintain SFO tracking. However, if UAV 112 is unable to
travel at such a slow speed (because it would stall), UAV 112 should employ
either SIN or ORB tracking. In that case, UAV 112 should first employ SIN
tracking. If UAV 112 would be unable to maintain the target in its sensor
footprint using SIN tracking (i.e., the required amplitude is too large),
then UAV 112 should employ ORB tracking. The value of α that triggers UAV 112
to engage in a different mode is aircraft specific, as different aircraft
have different properties (such as being a hovercraft vs. a fixed-wing
aircraft). UAV 112 may automatically switch track modes depending on the
value of α or may be instructed by remote operator 118 to enter a different
track mode depending on the value of α.
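The mode-selection logic described above might be summarized as in the
following sketch. The near-unity tolerance on α, the use of the stall speed
as the slow-flight limit, the amplitude test, and the stationary-target guard
are assumptions standing in for the aircraft-specific values the text refers
to.

def select_track_mode(v_uav, v_target, v_stall, required_sin_amplitude, max_sin_amplitude):
    # Pick SFO, SIN or ORB from the speed ratio alpha = v_uav / v_target.
    if v_target <= 0.0:
        return "ORB"                               # stationary target: loiter overhead
    alpha = v_uav / v_target
    if abs(alpha - 1.0) < 0.1 or v_target >= v_stall:
        # Similar speeds, or the UAV can slow down to match without stalling.
        return "SFO"
    if required_sin_amplitude <= max_sin_amplitude:
        return "SIN"                               # weave (sinusoid) to absorb the excess speed
    return "ORB"                                   # otherwise orbit the target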
SFO Tracking
If UAV 112 is flying in the SFO track mode, the waypoint (WPT_SFO) is
calculated as follows:

WPT_SFO = tgtPosn - I_s

where tgtPosn is the target's position in the inertial frame, and I_s
(calculated above) is the offset between the aircraft and the center of the
footprint.
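A direct transcription of the SFO waypoint, assuming tgtPosn and the
footprint-center offset I_s are available as arrays in the inertial frame;
the example values below are made up for illustration.

import numpy as np

def sfo_waypoint(tgt_posn, I_s):
    # WPT_SFO = tgtPosn - I_s: position the UAV so the footprint center sits on the target.
    return np.asarray(tgt_posn, dtype=float) - np.asarray(I_s, dtype=float)

# Hypothetical example in (north, east, height) coordinates.
wpt_sfo = sfo_waypoint([500.0, 120.0, 0.0], [35.0, -10.0, 0.0])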
SIN Tracking
FIG. 6 is an illustration depicting the variables used to calculate waypoints
for SIN tracking. As shown in FIG. 6, UAV 112 is traveling at a fixed
airspeed v_a and is tracking target 116. Target 116 is moving along a
straight line with a speed v_t. In addition, a pair of aligned coordinate
systems is shown. One system (x_t, y_t) is fixed to the target, and the other
(x_s, y_s) is fixed to the beginning of the sinusoid. In order for UAV 112 to
maintain track of target 116, the distance (D) that UAV 112 travels in one
period (T) is equal to the distance traveled by the target in the same
direction:

D = v_t T
`
The period is a free parameter that may be selected as desired. However,
longer periods are more desirable because a longer period decreases the size
of the amplitude (A), making it easier for UAV 112 to track target 116. The
amplitude is determined from the direction of the footprint displacement and
the distance from the target to the mid-point of the closest side of the
footprint, as a function of r_edge and r_target (calculated above) and a
parameter k_m that is used to tune the sensitivity of the algorithm. k_m is
greater than zero and less than or equal to one and is a multiplicative
factor based on the distance from the target to the closest side of the
footprint. Generally,
`
k_m = r_target / r_edge

although other values may be used.
(x_r, y_r) is the desired position of UAV 112 in the (x_s, y_s) coordinate
system:

y_r = A sin((2π/D) x_r)

where A' = 2πA/D, and
`
dx_r/dt = v_a / sqrt(1 + A'^2 cos^2((2π/D) x_r))

The waypoints (WPT_SIN) for UAV 112 when it is operating in SIN track mode
are calculated as follows:

WPT_SIN = [R(ψ_tgt)] [x_r, y_r]^T

where ψ_tgt is the target's heading and [R(ψ_tgt)] is the rotation matrix
that transforms x and y back into the inertial frame.
`
`~1
`
`A
`
`~
`
`fp p)0
`r
`otherwise
`
`a
`
`rb =
`
`ip =
`
`r: if w pb&0
`otherwise
`rs
`(IVPToRB — rgrPosnl
`I I ( Vt P To RB — tg t Po snl ) I
`
`The equation of the ellipse is:
`
`B MB=1
`
`M = YY
`
`tPa Pb]
`
`n is the scalar multiplier such that trav lies on the ellipse, tr
`may be computed by plugging x=nw
`the equation
`into
`x Mx=l. Thus:
`
`-continued
`
`go=
`
`lp
`
`(2' p
`1+ Ancost(
`
`V
`
`The waypoints (WPTB~) for UAV 112 when it is operating
`in SIN track mode are calculated as follows:
`
`rl gp1
`dt — ls
`WPTsttt = [R(ltttstl])
`
`(tlt, , ) is the target's heading and [R(t)l, , ) J is the
`where
`rotation matrix that transforms x and y back into the inertial
`frame.
`ORB Tracking
`ORB tracking enables UAV 112 to loiter over target 116.
`The orbiting track waypoints (WPT~») are given in N, E, h
`(north, east, height), and are calculated for a UAV having
`inertial coordinates N. and E. as follows:
`
`(VPTORB =
`
`E,
`h
`N,
`N, +rasirbg Et+ racosl3 h
`N, +ra
`E,
`h
`N, + r, sirb0 E, — r, cosl3 h
`E, — r,
`h
`N,
`N, — r, sirb0 E, — r, cosfd h
`N, — r,
`E,
`h
`N, — r, sirb0 E, +r, cos13 h
`
`where (N„E, ) is the position of target 116, r, =tliN, '+E, ,
`and )3 is an angle dependent on how many waypoints you want
`should be relative to the
`to produce, and where waypoints
`target. For example, if there are 12 waypoints for UAV 112 to
`visit,
`
`35
`
`40
`
`45
`
`5o
`
`that the first row of the WPTo» is
`It should be understood
`the first waypoint visited by UAV 112.
`not necessarily
`Instead, a cost function may be utilized
`the
`that determines
`order in which waypoints are visited by UAV 112. The cost
`function uses the range to all the waypoints and the required
`change in heading needed for UAV 112 to get to each way-
`point.
`the position of the orbital waypoints relative to
`In addition,
`frame of motion may be adjusted
`the target's
`in order to
`achieve acceptable
`tracking performance
`at smaller speed ss
`ratios (higher target speeds). This results in "stretching" UAV
`112' s orbit ahead of target 116 and can reduce the amount that
`UAV 112 falls behind.
`FIG. 7 is an illustration depicting the parameters for adjust-
`ing the loitering orbit when target 116 is in motion. As shown
`in FIG. 6, the seven parameters
`include radii r, -r4, diameters
`d, and d„angle 0. which is the angle behveen the motion of
`the target and d, . The shape of the orbit is adjusted by sepa-
`rating the circular orbit into four arcs (quarter-circles). Each
`arc is described by an elliptic section, constrained at their 65
`connection points. This is achieved by specifying
`the length
`of the four "halfaxes" of the elliptic sections. The circular
`
`6o
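Under the reconstruction above, the per-waypoint scale factor α might be
computed as follows. The code assumes two-dimensional (N, E) vectors and
takes the tangent vector t along the target velocity with n as its left
perpendicular; those conventions, like the reconstruction itself, are
assumptions rather than details given in the patent.

import numpy as np

def orbit_scale_factor(wpt, tgt_posn, tgt_vel, theta, r1, r2, r3, r4):
    # Scale factor alpha such that alpha * w lies on the piecewise ellipse,
    # where w is the unit vector from the target to the nominal waypoint.
    w = np.asarray(wpt, dtype=float) - np.asarray(tgt_posn, dtype=float)
    w = w / np.linalg.norm(w)
    t = np.asarray(tgt_vel, dtype=float) / np.linalg.norm(tgt_vel)
    n = np.array([-t[1], t[0]])                    # left perpendicular of t
    p_a = np.cos(theta) * t + np.sin(theta) * n    # rotated principal axes
    p_b = -np.sin(theta) * t + np.cos(theta) * n
    r_a = r1 if np.dot(w, p_a) > 0 else r3         # half-axis length facing w along p_a
    r_b = r2 if np.dot(w, p_b) > 0 else r4         # half-axis length facing w along p_b
    Y = np.column_stack((p_a / r_a, p_b / r_b))
    M = Y @ Y.T                                    # ellipse matrix: x^T M x = 1
    return 1.0 / np.sqrt(w @ M @ w)                # alpha = 1 / sqrt(w^T M w)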
`
Height Adjustment
Regardless of whether UAV 112 is using SFO, SIN or ORB track modes, the
height of UAV 112 may be adjusted to maintain target track. Height adjustment
may be done using a gradient-descent law to minimize the cost function
`
J = r_edge / |r_target|

with respect to h. As noted above, the scaled footprint coordinates, and
hence the terms of this cost function, vary with the height h through the
factor h/I(3). Then,

dh/dt = -γ ∂J/∂h

where γ is the gain of the gradient scheme.
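A numerical sketch of this height-adjustment law. The finite-difference
gradient, the step limits, and the exact form of the cost function are
assumptions; the patent only specifies a gradient-descent adjustment of h
with gain γ.

def adjust_height(h, cost_of_height, gain, dh=1.0, h_min=30.0, h_max=500.0):
    # One gradient-descent step on the height command: h <- h - gain * dJ/dh.
    # cost_of_height(h) returns the tracking cost J at height h (for example a
    # ratio built from r_edge and |r_target|); dh, h_min and h_max are
    # illustrative choices used to keep the command well behaved.
    grad = (cost_of_height(h + dh) - cost_of_height(h - dh)) / (2.0 * dh)
    h_new = h - gain * grad
    return min(max(h_new, h_min), h_max)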
The present invention may be embodied in other specific forms without
departing from its essential characteristics. The described embodiments are
to be considered in all respects only as illustrative and not restrictive.
The scope of the invention is therefore indicated by the appended claims
rather than by the foregoing description. All changes that come within the
meaning and range of equivalency of the claims are to be embraced within
their scope.
`
What is claimed is:
1. A method comprising:
identifying at least one coordinate of a footprint of a sensor tracking a
target, wherein the sensor is equipped on an unmanned aerial vehicle;
determining if the target will leave a field of view of the sensor; and
employing a track mode selected from a plurality of track modes based on a
first speed of the target and a second speed of the unmanned aerial vehicle;
causing the unmanned aerial vehicle to fly in the employed track mode to keep
the target in the field of view of the sensor.
2. The method of claim 1, wherein the track mode comprises a straight
following of the target.
3. The method of claim 1, wherein the track mode comprises flying in the
shape of a sinusoid.
4. The method of claim 1, wherein the track mode comprises flying orbits
around the target.
5. The method of claim 4, wherein the orbits are adjustable.
6. The method of claim 1, wherein the track mode comprises flying to at least
one waypoint.
7. The method of claim 5, wherein the orbits are adjusted if the first speed
of the target increases.
8. The method of claim 1, wherein employing the track mode selected from the
plurality of track modes is further based on at least one limitation of the
unmanned aerial vehicle.
`
9. A system comprising:
a communication interface;
a processing unit;
data storage; and
program logic stored in the data storage and executable by the processing
unit to (i) identify at least one coordinate of a footprint of a sensor
tracking a target, wherein the sensor is equipped on an unmanned aerial
vehicle; (ii) determine if the target will leave a field of view of the
sensor; (iii) employ a track mode selected from a plurality of track modes
based on a first speed of the target and a second speed of the unmanned
aerial vehicle; and (iv) cause the unmanned aerial vehicle to fly in the
employed track mode to maintain the target in the field of view of the
sensor.
10. The system of claim 9, wherein the track mode comprises a straight
following of the target.
11. The system of claim 9, wherein the track mode comprises flying in the
shape of a sinusoid.
12. The system of claim 9, wherein the track mode comprises flying orbits
around the target.
13. The system of claim 12, wherein the orbits are adjustable.
14. The system of claim 9, wherein the track mode comprises flying to at
least one waypoint.
15. The system of claim 13, wherein the program logic is further executable
to adjust the orbits if the first speed of the target increases.
16. The system of claim 9, wherein the program logic is further arranged to
employ the track mode selected from the plurality of track modes based on at
least one limitation of the unmanned aerial vehicle.
17. A method comprising:
determining a position of a target relative to the center of a sensor
footprint;
determining the distance from the center of the sensor footprint to a side of
the sensor footprint that is closest to the target;
determining a first speed of the target;
determining a second speed of an unmanned aerial vehicle;
employing a track mode selected from a plurality of track modes based on the
first speed of the target and the second speed of the unmanned aerial
vehicle; and
causing the unmanned aerial vehicle to fly in the employed track mode to
maintain the target within a field of view of the sensor.
18. The method of claim 17, further comprising causing the unmanned aerial
vehicle to fly at a height in which the target will remain in the field of
view of the sensor.
19. The method of claim 17, further comprising causing the unmanned aerial
vehicle to fly to at least one waypoint.
20. The method of claim 17, wherein employing the track mode selected from
the plurality of track modes is further based on the stall speed of the
unmanned aerial vehicle.