`
PATENT

UNITED STATES PROVISIONAL PATENT APPLICATION

FOR

DIGITAL TETHERING FOR TRACKING WITH
AUTONOMOUS AERIAL ROBOT
`
`APPLICANTS:
`
ASHLEY A. GILMORE
`
`DAVID L. DEWEY
`
PREPARED BY:

SOKOLOFF, TAYLOR & ZAFMAN, LLP
1279 OAKMEAD PARKWAY

SUNNYVALE, CA 94085-4040
(503) 439-8778
`
EFS FILED
`
Yuneec Exhibit 1012 Page 1
`
`
`
DIGITAL TETHERING FOR TRACKING WITH
AUTONOMOUS AERIAL ROBOT

FIELD
`
[0001]    Embodiments described are related generally to unmanned aircraft, and embodiments described are more particularly related to a tracking aerial robot.
`
COPYRIGHT NOTICE/PERMISSION
`
[0002]    Portions of the disclosure of this patent document can contain material that is subject to copyright protection. The copyright owner has no objection to the reproduction by anyone of the patent document or the patent disclosure as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever. The copyright notice applies to all data as described below, and in the accompanying drawings hereto, as well as to any software described below: Copyright © 2013, Gilmore Labs, LLC. All Rights Reserved.
`
BACKGROUND
`
[0003]    Aircraft are currently used to film a variety of sporting events. However, the cost of using aircraft is very high. Additionally, there are practical limitations on the types and angles of camera shot that can be accomplished with traditional aircraft filming. There are currently RF (radio frequency) aircraft available, but the limitations on flight control and signal delays make the traditional use of such craft unfit for filming certain sporting events.
`
BRIEF DESCRIPTION OF THE DRAWINGS
`
[0004]    The following description includes discussion of figures having illustrations given by way of example of implementations of embodiments described. The drawings should be understood by way of example, and not by way of limitation. As used herein, references to one or more "embodiments" are to be understood as describing a particular feature, structure, or characteristic included in at least one implementation. Thus, phrases such as "in one embodiment" or "in an alternate embodiment" appearing herein describe various embodiments and implementations, and do not necessarily all refer to the same embodiment. However, they are also not necessarily mutually exclusive.
`
Docket No.: 9587P001Z
`
`
`
`
[0005]    Figure 1 is a block diagram of an embodiment of an aerial robot that maintains a position A with respect to a target.
`
[0006]    Figure 2 is a block diagram of an embodiment of an aerial robot that maintains a position A with respect to a target while the target is in motion.
`
[0007]    Figure 3 is a block diagram of an embodiment of a system having an aerial robot that tracks a target via a beacon.
`
[0008]    Figure 4 is a block diagram of an embodiment of a system having an aerial robot that tracks a target via identifying an indicator.
`
[0009]    Figure 5 is a block diagram of an embodiment of an aerial robot including one or more features for detecting its position and tracking a target.
`
[0010]    Figure 6 is a block diagram of an embodiment of a beacon including one or more features for enabling an aerial robot to track a target.
`
[0011]    Figure 7 is a flow diagram of an embodiment of target tracking with a digital tether.
`
[0012]    Descriptions of certain details and embodiments follow, including a description of the figures, which can depict some or all of the embodiments described below, as well as discussing other potential embodiments or implementations of the inventive concepts presented herein.
`
DETAILED DESCRIPTION
`
[0013]    Mavbot Shadow is a photography/video platform that tracks a moving target. Position from the target can be set to a certain angle, distance, and height from it, or a latitude, longitude, and altitude offset. In one embodiment, tracking the position of the target is via a "beacon" attached to the target. In one embodiment, the beacon contains a GPS module, microprocessor, radio or other communications link, and optional accelerometers, gyros, magnetometer, barometer, and other sensors to complement the GPS position data.
`
[0014]    Mavbot Shadow can also be less reliant on GPS for positioning. For example, distance to target can be determined by radio signal strength or propagation delay (or that of a non-radio signal such as light or ultrasound); altitude above target by barometric pressure difference; and direction to target by using one or more directional antennas (or sensors) to receive signal.
`
[0015]    Mavbot Shadow may use optical flow and other image processing technologies to determine the location of the moving object in the video stream, and use feedback to move the robot in such a way that the object maintains the same size and position in the video stream, thus complementing the data from the beacon or supplanting it entirely.
`
[0016]    In one embodiment, Mavbot Shadow includes some or all of the following features.
`
[0017]    1. Setting position
`
[0018]    The user can adjust the position offsets of Shadow relative to the target. Altitude above target, distance from target, and angle from target can be set. Camera pointing offsets may also be an option, although the camera can point at the target by default. Offsets may be set by adjusting height, angle, and distance controls, by flying the robot manually to the desired position, and/or by moving the target to the desired location relative to the flying robot, then using a button or other control to command the system to record those particular offsets. When offsets are set, Shadow records the three-axis position offsets. In one embodiment, they are recorded by the beacon, and added to robot position commands before sending over the radio link. In an alternative embodiment, they could also be recorded by the robot and added to the received positions.
`
[0019]    2. Following
`
[0020]    Once offsets are recorded, Shadow can maintain the same offsets from the target as the target moves in space. This works by the target reading position data from its GPS module, adding the x, y, and z offsets, and sending commands through the radio for the robot to fly to the calculated position. When tracking and/or positioning mechanisms other than GPS are used, the offset calculations can be obtained from whatever sensor or module provides the information.
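The following step described above can be sketched minimally as below. The function name and the use of a shared local (x, y, z) frame are illustrative assumptions, not part of this disclosure.

```python
# Illustrative sketch: the beacon's GPS position plus the recorded
# three-axis offsets yields the position command sent to the robot.
# Names and the shared local (x, y, z) frame are assumptions.
def desired_robot_position(beacon_pos, offsets):
    """beacon_pos and offsets are (x, y, z) tuples in the same local frame."""
    return tuple(p + o for p, o in zip(beacon_pos, offsets))
```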
`
[0021]    3. Camera pointing
`
[0022]    During the following sequence, the robot also keeps the camera pointed at the target. Thus, even if the target turns faster or accelerates, decelerates, ascends, or descends faster than the robot's ability to do so, the camera can still remain pointed at the target to record the target's action. To point the camera at the target, the camera mount may only need a tilt axis; the entire body of the robot can yaw to face the target, since vertical-takeoff aircraft can fly in any direction, not only the direction they are pointed. However, a roll axis could also prevent tilting of the camera image during sideways robot acceleration.
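The yaw-plus-tilt geometry described above can be sketched as follows. This is a minimal illustration; the function and parameter names are assumptions, and sign conventions would depend on the actual mount.

```python
# Illustrative sketch (not from the source): given the target's position
# relative to the robot, compute the yaw the robot body should face and
# the tilt the single-axis camera mount needs to point at the target.
import math

def camera_angles(dx: float, dy: float, dz: float):
    """dx, dy: horizontal offsets to the target (m); dz: target height
    minus robot height (negative when the target is below the robot).
    Returns (yaw_deg, tilt_deg) with assumed sign conventions."""
    yaw = math.degrees(math.atan2(dy, dx))      # whole body yaws to face target
    horiz = math.hypot(dx, dy)                  # horizontal distance to target
    tilt = math.degrees(math.atan2(dz, horiz))  # mount tilts up/down only
    return yaw, tilt
```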
`
[0023]    4. Offset angle transformation
`
[0024]    The offset angle of the robot from the target can be relative to compass direction from target, for example: the robot is set at an offset of 45 degrees (Northeast). Whichever direction the target moves, the robot will try to maintain the set distance away, at an angle 45 degrees from the target.
`
[0025]    The offset angle can also be subject to transformation depending on the target's velocity vector, for example: the robot is set at an offset of 45 degrees and the target heads 315 degrees (northwest). The robot is thus to the target's left. If the target gradually turns west (while in motion) until the direction of travel becomes due west (270 degrees), the robot can correspondingly gradually transform its offset angle by the same amount, from 45 degrees to 0 degrees. Thus, the robot will still be at the left-hand side of the target. The target's course change can be computed from the change in the course given by a GPS module, or simply by comparing successive GPS coordinates. A smoothing filter can be applied to smooth out any sudden course changes. Adding magnetometer and/or gyros to the beacon can eliminate the requirement that the target be in motion to effect these offset angle corrections. Instead it could be based on the direction the target faces instead of successive GPS locations.
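The transformation described above can be sketched as a simple rotation of the offset angle by the target's course change. This is an illustrative sketch only; a real implementation would also apply the smoothing filter mentioned above.

```python
# Illustrative sketch: rotate the robot's offset angle by the same amount
# the target's course has changed, keeping the robot on the same side of
# the target. Function and parameter names are assumptions.
def transform_offset(offset_deg: float, initial_course_deg: float,
                     current_course_deg: float) -> float:
    """All angles are compass degrees; result is normalized to [0, 360)."""
    course_change = (current_course_deg - initial_course_deg) % 360.0
    return (offset_deg + course_change) % 360.0
```

With the example from the text, an offset of 45 degrees and a course change from 315 to 270 degrees yields a new offset of 0 degrees.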
`
[0026]    If a sport or other activity involves many or very sudden direction changes, the algorithm can turn off the angle transformation when measured course changes are too rapid, to avoid the need for high speed maneuvering, and avoid video choppiness. Thus, the angle transformation can automatically be restricted if there are more than a threshold of changes within a time period. The threshold of changes can be a threshold number of changes, or a minimum threshold of angle changes (e.g., do not track angle if the angle change does not reach a certain angle) within a given time period. Alternatively, the offset angle transformation could be turned off manually as the need arises.
`
[0027]    5. Stopping
`
[0028]    In one embodiment, the beacon includes a stop button or other control that, when activated, will cause the beacon to signal the robot via radio command to stop and hover in position. This may be necessary if the user is finished and no longer wishes the robot to follow, if the robot is about to hit an obstacle, or if any unexpected situation, such as erroneous GPS data, arises.
`
[0029]    6. Synchronization
`
[0030]    Two GPS modules will give slightly different readings under different conditions (such as weather, current satellite conditions, obstructions, interference, and differences in the GPS modules themselves). The difference in GPS module readings can cause a difference in reported position between the beacon's GPS and the robot's GPS that will often remain approximately the same over the short period of time that the user is filming. However, the differences change over longer periods of time as conditions change. In one embodiment, the system includes a feature to allow the user to synchronize GPS or other location detection modules before use. One way to perform synchronization would be to place the beacon and robot together and press a button that causes the software to record the difference in reported position between the two GPS modules. The difference in position can then be taken into account during subsequent following to allow the robot to point the camera to and follow the target more accurately.
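The synchronization step above can be sketched as recording a per-axis bias while the two units are co-located, then applying it to later readings. The names and the shared local x/y/z frame are illustrative assumptions.

```python
# Illustrative sketch of GPS synchronization: with beacon and robot
# placed together, record the per-axis difference between the two
# modules' readings, then apply it to subsequent beacon readings.
def synchronize(beacon_reading, robot_reading):
    """Record the bias (robot minus beacon) while the units are co-located."""
    return tuple(r - b for r, b in zip(robot_reading, beacon_reading))

def correct(beacon_reading, bias):
    """Apply the recorded bias so both modules agree during following."""
    return tuple(b + d for b, d in zip(beacon_reading, bias))
```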
`
[0031]    7. Forward Prediction
`
[0032]    Inexpensive GPS modules have a delay in processing so that the coordinates read may not be the current coordinates, but instead the coordinates from a short time prior (for example 0.5 seconds before). Lag can also be introduced elsewhere in the system, such as processing delay, radio communications delay, and FMU (Flight Management Unit) delay. This will result in the robot position and camera pointing lagging the actual position of the target. In one embodiment, the system can at least partially mitigate the lag or delays in the system by projecting the last known velocity vector of the beacon and adding it to the last known position to arrive at an estimated current or future position of the beacon. Several of the latest known velocity vectors can be used for a better prediction. Similarly, changes in velocity can be estimated by adding last known acceleration vectors to last known velocity vectors.
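The projection described above can be sketched as constant-acceleration extrapolation over the measured lag. This is a minimal illustration under assumed names; a fuller implementation might average several recent velocity vectors as the text suggests.

```python
# Illustrative sketch of forward prediction: project the last known
# velocity (and optionally acceleration) over the system lag to estimate
# where the beacon is now, given readings that are lag_s seconds old.
def predict_position(last_pos, last_vel, lag_s, last_acc=(0.0, 0.0, 0.0)):
    """All arguments are (x, y, z) tuples except lag_s (seconds)."""
    return tuple(p + v * lag_s + 0.5 * a * lag_s ** 2
                 for p, v, a in zip(last_pos, last_vel, last_acc))
```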
`
[0033]    8. Beacon Inertial Measurement
`
[0034]    To reduce the delay in position readings from the GPS, and to fill in gaps in GPS reception and correct short-term GPS errors, inertial measurement can be used. The beacon can contain an IMU (Inertial Measurement Unit) consisting of 3-axis accelerometers, gyros, and magnetometer. Combined with a compass declination lookup table for all locations on Earth, the magnetometer can keep track of beacon orientation over time. Gyro data can be integrated to compute instantaneous orientation data, and orientation and accelerometer data can be integrated to compute velocity, which can be integrated to compute position. This resulting position is relative, but can be used to estimate absolute position by integrating over the time period corresponding to the specific lag in GPS data, then adding the relative position to the absolute position from the GPS to get an estimate of the absolute position at the current instant.
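The double integration described above can be sketched in one dimension as follows. This is a deliberately simplified illustration: orientation handling, sensor bias, and drift correction are all omitted, and the names are assumptions.

```python
# Illustrative 1-D dead-reckoning sketch: integrate acceleration samples
# to velocity, then velocity to relative position, as described above.
def dead_reckon(accels, dt, v0=0.0):
    """accels: acceleration samples (m/s^2) taken every dt seconds.
    Returns (velocity, relative_position) after the sample run."""
    v, x = v0, 0.0
    for a in accels:
        v += a * dt   # integrate acceleration -> velocity
        x += v * dt   # integrate velocity -> relative position
    return v, x
```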
`
`
`
`
[0035]    9. Camera pointing correction
`
[0036]    Inexpensive GPS modules tend to drift over time, and as a result the offset of the robot relative to the target may not always be exactly as originally set. However, it's possible to still get very good video despite this offset variation, as long as the camera is still pointed exactly at the target (and especially if the camera has an adjustable zoom lens). Unfortunately the GPS drift will also affect the pointing of the camera. There are various ways to correct for this. In one embodiment, the system can use an infrared or other light source on the target or on the beacon that can be detected by the robot's camera, or by a secondary camera or detector specifically sensitive to infrared or a specific light wavelength or color. The robot's computer (or other computer in the system) would average the position of the light source over time, and gradually apply compensation offsets to the commanded camera pointing angles such that this average position would tend towards the center of view. Intensity information could also be used to compensate for distance drift.
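The averaging-and-nudging idea above can be sketched as follows. The class and parameter names are assumptions, as is the linear gain; a real system would tune the gain and sign conventions to the mount.

```python
# Illustrative sketch of the drift-correction idea: keep a running
# average of where the light source appears in frame (with (0, 0) the
# center of view), and return small compensation offsets that pull the
# commanded camera angles so the average tends toward the center.
class DriftCorrector:
    def __init__(self, gain=0.1):
        self.avg_x = 0.0  # running average of light position in frame
        self.avg_y = 0.0
        self.n = 0
        self.gain = gain  # assumed linear gain from frame units to degrees

    def update(self, light_x, light_y):
        """Feed one detection; returns (pan_offset, tilt_offset) to add
        to the commanded camera pointing angles."""
        self.n += 1
        self.avg_x += (light_x - self.avg_x) / self.n  # incremental mean
        self.avg_y += (light_y - self.avg_y) / self.n
        return -self.gain * self.avg_x, -self.gain * self.avg_y
```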
`
[0037]    In one embodiment, the system can calculate optical flow of the video image, and segregate pixels in fairly uniform motion or motion counter to the target/robot system (the background) from pixels not in motion or counter to the motion of the other pixels (the target). From the average position of the target pixels, the target's position relative to camera pointing could be inferred, and thus the drift compensation camera offsets could be applied. Another method would be simply to track the position of an object (such as a colored ball or sticker) or insignia attached to the target or beacon, using common computer vision object recognition techniques. Another method would be to simply track the position of a color on the target or beacon that is not commonly found in the background (for example, a skier wearing a red coat on a white snow background). With one or a combination of the above techniques, and fast enough image/video processing, the need for a beacon to track an object may disappear altogether, and optical tracking may be sufficient.
`
[0038]    10. Robot Inertial Measurement and Position Control
`
[0039]    The robot may similarly use inertial measurement combined with GPS to estimate its current position precisely and ensure it flies to the exact position commanded by the beacon. The error between the actual position and commanded position can also be calculated, which can drive a PID loop that will keep said error at a minimum. Nested PID loops may be used, consisting of acceleration and velocity control. For example, when the position error is higher, requested velocity toward the target increases, and when the error between requested and actual velocity is higher, acceleration toward the requested velocity increases.
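The nested control structure above can be sketched with only the proportional terms, which already exhibit the behavior described: larger position error requests larger velocity, and larger velocity error requests larger acceleration. Names and gains are assumptions; a full PID would add integral and derivative terms.

```python
# Illustrative sketch of nested control loops (1-D, proportional only):
# the outer loop turns position error into a requested velocity, and the
# inner loop turns velocity error into a requested acceleration.
def position_loop(commanded_pos, actual_pos, kp_pos=1.0):
    """Outer loop: position error -> requested velocity."""
    return kp_pos * (commanded_pos - actual_pos)

def velocity_loop(requested_vel, actual_vel, kp_vel=2.0):
    """Inner loop: velocity error -> requested acceleration."""
    return kp_vel * (requested_vel - actual_vel)
```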
`
[0040]    11. Altitude and terrain following
`
[0041]    Because inexpensive GPS modules may not indicate altitude to high precision, alternate methods of altitude measurement may be needed. Barometers provide highly accurate air pressure measurement and can determine relative altitude to within tens of centimeters. Air pressure continually changes with weather. In one embodiment, both the beacon and the robot can include barometers, and both the barometer in the beacon and the barometer in the robot would be affected equally by these changes, so relative altitude would still be known. The robot could also be programmed to follow the target at a set altitude above the terrain by use of a sonar or laser range finder mounted on the robot and directed downwards.
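The pressure-difference idea can be sketched as below, using the rough near-sea-level figure of about 8.3 meters of altitude per hectopascal of pressure change. That constant and the names are assumptions for illustration; a real system would use a proper barometric formula.

```python
# Illustrative sketch: convert the pressure difference between the
# beacon's and robot's barometers to relative altitude. Weather-driven
# pressure changes affect both barometers equally, so the difference
# still yields relative altitude.
def relative_altitude(beacon_hpa: float, robot_hpa: float) -> float:
    """Approximate meters the robot is above the beacon."""
    METERS_PER_HPA = 8.3  # rough near-sea-level value (assumed constant)
    return (beacon_hpa - robot_hpa) * METERS_PER_HPA
```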
`
[0042]    12. Safety features
`
[0043]    To prevent flyaways and crashes, in one embodiment, the system is programmed with or otherwise includes one or more of the following features.
`
[0044]    A. A "panic" button or control that commands the robot to stop and enter a static hover when activated. The hover position would be held static by the FMU based on GPS position and/or inertial measurement, as described above.
`
[0045]    B. The system could also automatically enter the hover mode or auto land under the following conditions: GPS fix lost, too few GPS satellites for accurate fix, GPS module's reported accuracy low, radio communication lost, distance between robot and beacon too great, robot altitude unusually high or low (as identified by preset thresholds), robot at unsafe proximity to target (indicated by sensor(s) and configuration), robot battery low, and other conditions that make continuing unsafe or inadvisable.
`
[0046]    C. All radio commands exchanged between robot and beacon contain a checksum to prevent corrupted radio link data from commanding the robot incorrectly.
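The checksum idea can be sketched as below with a simple additive checksum. This is an illustration only; the disclosure does not specify the algorithm, and a real radio link would more likely use a CRC.

```python
# Illustrative sketch: append a one-byte additive checksum to each radio
# command so corrupted packets can be detected and dropped. The specific
# checksum algorithm is an assumption, not from the source.
def add_checksum(payload: bytes) -> bytes:
    return payload + bytes([sum(payload) % 256])

def verify(packet: bytes) -> bool:
    """True if the packet's trailing byte matches its payload checksum."""
    return len(packet) >= 1 and sum(packet[:-1]) % 256 == packet[-1]
```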
`
[0047]    D. As the user may elect to have the robot follow from behind or otherwise out of sight, the robot will signal the beacon to produce an alert (such as sound, vibration, flashing, and possibly a display indicating the specific error condition that occurred) during an error condition. Thus, the user will not continue on oblivious to the fact that the robot has entered a static hover (such a signal will of course fail in the event of a radio link loss, but the signal can be sent repeatedly in the hopes of at least one packet getting through).
`
`
`
`
[0048]    E. Kill switch to completely stop all robot motors in case of an imminent crash or FMU malfunction. The use of a kill switch would inevitably cause the robot to crash, but without the added danger of props spinning at high velocity. This could also save props and motors from damage.
`
`iiiiicw}
`
`The feiiewing pmvides exaiiipie pseudo code for one example impiementatioii that
`
`uses :1 GPS at both :1 beaeeii and an aerial robot.
`
Beacon Pseudocode for Following (with GPS only):

loop {
    read GPS data;
    if (new GPS data) {
        update beacon_gps_location;
        desired_robot_position = beacon_gps_location + 3D_offsets;
        send_radio_command(keep camera pointed to: beacon_gps_location);
        send_radio_command(move to: desired_robot_position);
    }
}

Robot Pseudocode for Beacon Following (with GPS only):

loop {
    read GPS data;
    if (new GPS data) {
        update robot_gps_location;
    }
    read radio data;
    if (radio command keep camera pointed to) {
        camera_point_target = received position;
    }
    if (radio command move to) {
        nav_target_location = received position;
    }
    point camera to: camera_point_target;
    update nav PID loops;
}

Beacon Pseudocode for Setting Position Offsets:

loop while (in fixed hover mode) {
    read GPS data;
    if (new GPS data) {
        update beacon_gps_location;
    }
    if (angle, height, or distance controls adjusted) {
        desired_robot_position += adjustment;
        send_radio_command(hover at: desired_robot_position);
    }
    read radio data;
    if (receive_radio_telemetry: robot_position) {
        3D_offsets = robot_position - beacon_gps_location;
    }
}
save offsets;
enter following mode;

Robot Pseudocode for Setting Position Offsets:

loop {
    read GPS data;
    if (new GPS data) {
        update robot_gps_location;
    }
    receive radio data;
    if (radio command hover at) {
        hover_target_position = received position;
    }
    update hover PID loops;
    send radio telemetry: robot_gps_position;
}
`
[0050]    While general descriptions of embodiments are provided above, additional examples are now provided with respect to the drawings.
`
[0051]    Figure 1 is a block diagram of an embodiment of an aerial robot that maintains a position A with respect to a target. The robot can be set at an offset of a certain distance in front of or behind the target, to one side or another or directly in line with the target, and/or at a certain altitude with respect to the target.
`
[0052]    Figure 2 is a block diagram of an embodiment of an aerial robot that maintains a position A with respect to a target while the target is in motion. In response to movement of the target, the aerial robot computes and executes flight path B to maintain or approximately maintain offset position A with respect to the target. The movement of the target can be in any direction, and the robot will track the target. The target can also change direction, as discussed above, and the robot can track the target to maintain position with its movement in any direction, and following changes of direction.
`
[0053]    Figure 3 is a block diagram of an embodiment of a system having an aerial robot that tracks a target via a beacon. In one embodiment, a beacon is placed on the target, and the aerial robot follows the target via tracking the beacon. The beacon can include a position unit to determine its location, and a communication unit to exchange communication with the aerial robot. In one embodiment, the beacon sends its position information to the aerial robot to allow the robot to autonomously change its flight path to track the target.
`
[0054]    In one embodiment, the aerial robot includes a position unit to determine its position and calculate flight path information based on its present location and the changing location of the beacon. In one embodiment, a tracking unit performs the calculations to allow the aerial robot to track the beacon based on its own position information and the position information of the beacon. The aerial robot can also include a communication unit to exchange information with the beacon, such as receiving its position information. The aerial robot also includes a flight management unit (FMU) to control its flight in accordance with the calculation made by the tracking unit.
`
`
`
`
[0055]    Figure 4 is a block diagram of an embodiment of a system having an aerial robot that tracks a target via identifying an indicator. In one embodiment, rather than have a beacon that transmits information for the aerial robot to track, the target can simply include a target indicator. The target indicator can be a color, a pattern, or other marker on the target. Alternatively, the indicator can simply be the target as it moves across frame in captured video. Thus, the target does not necessarily need to be previously tagged with a beacon, but can be acquired in video and then tracked. For example, the aerial robot can autonomously pick up the target, or be manually controlled with a signal indicating a target identified by a user based on streamed video.
`
[0056]    In such a system, the aerial robot also includes an FMU, tracking unit, and position unit. The aerial robot also includes a target identifier unit, such as a video tracking system, which can include video, infrared, laser, and/or other systems. The tracking unit computes flight path information based on the video tracking system(s) indicating the movement of the target.
`
[0057]    Figure 5 is a block diagram of an embodiment of an aerial robot including one or more features for detecting its position and tracking a target. An aerial robot can include one or more of the following features, and does not necessarily include all features indicated. Different embodiments of the aerial robot can include different combinations of features.
`
[0058]    The aerial robot includes an FMU, which includes one or more flight controllers, as are known in the industry. The flight controllers enable the aerial robot to perform vertical maneuvering, as well as x and y movement, and pitch and yaw movements.
`
[0059]    The tracking unit can include one or more of an RF unit to emit and/or receive radio frequency signals, one or more lasers or other light frequency signals and/or signal sensors, and/or other type(s) of sensor (e.g., a barometric pressure sensor). The tracking unit includes a movement calculator to compute flight path information for the FMU. In one embodiment, the movement calculator includes one or more mechanisms to perform predictive calculations to estimate the location the robot should move to. Some forms of tracking hardware will result in a delay between the movement of the target and the movement of the robot to track the target. By predicting the movement of the target, the robot can compensate for the delay.
`
[0060]    In one embodiment, the robot compensates for delay by receiving instantaneous velocity information from the target (beacon), and using that information to calculate its flight path. The robot can adjust the calculations based on subsequently received position information. For example, the beacon can send position information and velocity information. Based on the velocity information, the aerial robot can compute a flight path, and move toward where it should be to maintain the digital tether, based on the calculations. When the beacon sends updated position and velocity information, the tracking unit can compute an adjustment based on the actual position information, and compute flight path information based on both the new velocity information, and a comparison between the previous calculations and the new (actual) position information of the target.
`
[0061]    In an alternative embodiment, the tracking unit computes flight path information based on comparisons of subsequently-received position information. Such information can include signals received from the beacon itself in an embodiment where the target sends position information. In an embodiment where the aerial robot tracks the target without information sent by the target (e.g., video tracking), the received position information is information retrieved from its tracking sensors.
`
[0062]    The position unit can include one or more GPS units, video recognition units, altitude sensors, and/or inertial measurement units (IMU). In one embodiment, the aerial robot includes multiple GPS units to provide higher accuracy in its position determinations. The video recognition unit can be in accordance with what is described above and/or any other similar system known in the art. The altitude sensor can be or include one or more of the tracking unit sensors, such as sonar (RF signal), laser altitude guidance (laser), barometer (other sensor), or other sensor. The IMU can similarly utilize or include one or more accelerometers, magnetometers, and/or gyroscopes.
`
[0063]    In one embodiment, tracking is based in part on sensors in the aerial robot (e.g., visual lock, heat seeking, other sensors) and partly on sensors in a beacon (e.g., GPS, IMU, barometer, or other units). In such an embodiment, processing of the movement and flight path calculations can be distributed between the aerial robot and the beacon.
`
[0064]    The aerial robot can include a communication unit to enable communication with a beacon, and/or a user controller, an override system, or other device that can provide information useful in computing the flight path. In one embodiment, the aerial robot includes an obstacle detection unit, which enables the robot to detect and avoid obstacles. In one embodiment, the obstacle detection unit includes or leverages similar or the same type of sensors in use in the tracking unit.
`
`
`
`
[0065]    Figure 6 is a block diagram of an embodiment of a beacon including one or more features for enabling an aerial robot to track a target. A beacon can include one or more of the following features, and does not necessarily include all features indicated. Different embodiments of the beacon can include different combinations of features. Different versions of the beacon may be trackable by the same aerial robot.
`
[0066]    The beacon includes a communication unit to send information about its location to a tracking aerial robot. In one embodiment, the beacon includes an aerial robot control unit to provide commands or controls to an aerial robot. Such controls can include initialization controls, stop controls, and/or controls that change an offset of the digital tether (e.g., perspective, position, angle).
`
[0067]    Figure 7 is a flow diagram of an embodiment of target tracking with a digital tether. The beacon and the aerial robot are initialized, and the system establishes a digital tether. The digital tether includes three dimensional offset(s) between the aerial robot and the target, in accordance with any description above. In one embodiment, a reference angle is set between the robot and the target. In one embodiment, a perspective is set for video capture or other image capture. The aerial robot can train the video capture or other image capture or camera to track the target in response to movement by the target. The video capture can also be moved in response to movement by the aerial robot. The video capture perspective can be independent (e.g., via a separately controllable swivel mount) of movement of the robot.
`
[0068]    The aerial robot follows the target movement to maintain essentially the same three dimensional offset. It will be understood that some drift can occur within acceptable tolerances. The tolerances can be preset for the system. At any given instant, there may be inconsistency in the relative position of the robot with respect to the target, but on average the robot can track the target at the set offset(s).
`
[0069]    While the robot is tracking the target, three sub-processes can occur. In a first sub-process, the robot monitors for a change to an offset. If there is no change, the robot continues to follow the target at the set relative position or set fixed offset. There can be a change determined by the robot itself in response to a collision detection event, or a command received from the beacon or another source. When a change to an offset is detected, the robot can calculate and/or receive new offset information, and adjust its flight path and/or adjust the perspective of the
`