US009367067B2

(12) United States Patent
Gilmore et al.

(10) Patent No.: US 9,367,067 B2
(45) Date of Patent: Jun. 14, 2016
(54) DIGITAL TETHERING FOR TRACKING WITH AUTONOMOUS AERIAL ROBOT

(56) References Cited

U.S. PATENT DOCUMENTS
(71) Applicants: Ashley A Gilmore, Salem, OR (US); David L Dewey, Port Ludlow, WA (US)

(72) Inventors: Ashley A Gilmore, Salem, OR (US); David L Dewey, Port Ludlow, WA (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 34 days.
(21) Appl. No.: 14/214,123

(22) Filed: Mar. 14, 2014

(65) Prior Publication Data
US 2015/0205301 A1        Jul. 23, 2015

Related U.S. Application Data

(60) Provisional application No. 61/800,201, filed on Mar. 15, 2013.
(51) Int. Cl.
G05D 1/08 (2006.01)
G01S 19/13 (2010.01)
G05D 1/10 (2006.01)
G05D 1/12 (2006.01)

(52) U.S. Cl.
CPC .............. G05D 1/0808 (2013.01); G01S 19/13 (2013.01); G05D 1/101 (2013.01); G05D 1/12 (2013.01)

(58) Field of Classification Search
CPC: G05D 1/0212; G05D 1/0094; G05D 1/0808; G05D 1/12
See application file for complete search history.
8,423,204 B2 *       4/2013   Lee et al. ............................ 701/2
8,718,838 B2 *       5/2014   Kokkeby et al. .................... 701/3
9,019,376 B2 *       4/2015   Lee et al. ............................ 348/144
9,026,272 B2 *       5/2015   Kokkeby et al. .................... 701/3
2008/0054158 A1 *    3/2008   Ariyur et al. ....................... 250/203.1
2009/0157233 A1 *    6/2009   Kokkeby .................. G01S 3/7864
                                                                  701/3
2010/0042269 A1 *    2/2010   Kokkeby et al. .................... 701/3
2010/0250022 A1 *    9/2010   Hines et al. ........................ 701/2
2012/0154579 A1 *    6/2012   Hampapur et al. .................. 348/143
2012/0287274 A1 *   11/2012   Bevirt ................................ 348/144
2012/0307042 A1 *   12/2012   Lee et al. ........................... 348/114
2013/0176423 A1 *    7/2013   Rischmuller et al. ................ 348/114
OTHER PUBLICATIONS

B. Yun, K. Peng and B. Chen, "Enhancement of GPS Signals for Automatic Control of a UAV Helicopter System," in Proc. 2007 IEEE International Conference on Control and Automation, China, Jun. 2007, pp. 1185-1189.*

* cited by examiner

Primary Examiner - Thomas G Black
Assistant Examiner - Peter D Nolan
(74) Attorney, Agent, or Firm - Vincent Anderson
(57) ABSTRACT

An aerial device automatically maintains a relative position with respect to a target. The aerial device can set a relative multi-dimensional position with respect to the target. The target can have an indicator (e.g., a visual marker for image capture tracking, or a radio indicator for tracking via signaling) that the aerial device reads. The aerial device can automatically adjust its flight path in response to movement of the target as indicated by the indicator. Thus, the aerial device can maintain a digital tether, moving with the target to maintain substantially the same relative position with respect to the target, tracking the target in multiple dimensions.

20 Claims, 9 Drawing Sheets
[Cover drawing: target 110 and aerial robot 120 with camera 122, separated by X, Y, and Z offsets]

Yuneec Exhibit 1009 Page 1

U.S. Patent        Jun. 14, 2016        Sheet 1 of 9        US 9,367,067 B2
FIG. 1: beacon 112 attached to target 110; aerial robot 120 with camera 122, offset from the target along X, Y, and Z axes.
FIG. 2: aerial robot positions 222 and 224 maintaining the tether as the target moves from target location 212 to target location 214.
FIG. 3: system 300 with aerial robot 320 (position unit 322, tracking unit 326, communication unit 324) linked over radio to beacon (target) 310 (position unit 312, communication unit 314).
FIG. 4: aerial robot 420 (position unit 422, tracking unit 426, target identifier unit 424) tracking target identifier 410.
FIG. 5: aerial robot 500 with position unit 520 (video recognition unit 524, altitude sensor 526), FMU 510 (flight controller 512), tracking unit 530 (RF unit 532, sensor 536, movement calculator 538), obstacle detection unit 550, communication unit 540 (RF transceiver 542), and image capture unit 560 (control unit 562).
FIG. 6: beacon 600 with position unit 610 (altitude sensor 614), communication unit 620 (RF transceiver 622), and robot controller unit 630 (initialization 632, tether control 634).
BEACON PSEUDOCODE - TRACKING 700

loop {
    read GPS data;
    if (new GPS data) {
        update beacon_gps_location;
        desired_robot_position = beacon_gps_location + 3D_offsets;
        send_radio_command(keep camera pointed to: beacon_gps_location);
        send_radio_command(move to: desired_robot_position);
    }
}

FIG. 7
ROBOT PSEUDOCODE - TRACKING 800

loop {
    read GPS data;
    if (new GPS data) {
        update robot_gps_location;
    }
    read radio data;
    if (radio command keep camera pointed to) {
        camera_point_target = received position;
    }
    if (radio command move to) {
        nav_target_location = received position;
    }
    point camera to: camera_point_target;
    update nav PID loops;
}

FIG. 8
BEACON PSEUDOCODE - SETTING OFFSETS 900

loop while (in fixed hover mode) {
    read GPS data;
    if (new GPS data) {
        update beacon_gps_location;
    }
    if (angle, height, or distance controls adjusted) {
        desired_robot_position += adjustment;
        send_radio_command(hover at: desired_robot_position);
    }
    read radio data;
    if (receive_radio_telemetry: robot_position) {
        3D_offsets = robot_position - beacon_gps_location;
    }
}
save offsets;
enter following mode;

FIG. 9
ROBOT PSEUDOCODE - SETTING OFFSETS 1000

loop {
    read GPS data;
    if (new GPS data) {
        update robot_gps_location;
    }
    receive radio data;
    if (radio command hover at) {
        hover_target_position = received position;
    }
    update hover PID loops;
    send radio telemetry: robot_gps_position;
}

FIG. 10
DIGITAL TETHERING 1100

INITIALIZE BEACON 1102
INITIALIZE AERIAL ROBOT 1104
ESTABLISH DIGITAL TETHER 1106
SET OFFSET(S) (THREE DIMENSIONAL) 1108
SET REFERENCE ANGLE BETWEEN AERIAL ROBOT AND TARGET 1110
SET VIDEO CAPTURE/CAMERA PERSPECTIVE 1112
TRAIN VIDEO CAPTURE/CAMERA TO TRACK TARGET IN RESPONSE TO MOVEMENT 1114
DETERMINE IF TARGET HAS MOVED 1116 (NO: return to 1116)
DETERMINE IF EXCEPTION TO FOLLOWING TARGET MOVEMENT WITH CURRENT DIGITAL TETHER 1120
    (YES: IDENTIFY EXCEPTION TYPE 1126; CALCULATE AERIAL ROBOT MOVEMENT WITH RESPECT TO TARGET AND EXCEPTION 1128)
    (NO: FOLLOW TARGET MOVEMENT WITH AERIAL ROBOT TO MAINTAIN DIGITAL TETHER 1124)

FIG. 11A
MODIFYING DIGITAL TETHER 1130

IDENTIFY EXCEPTION TYPE 1126
MODIFY DIGITAL TETHER OFFSET(S) 1132
CALCULATE AND/OR RECEIVE NEW OFFSET(S) 1134
ADJUST FLIGHT AND/OR VIDEO CAPTURE/CAMERA FOR NEW OFFSET(S) 1136
CALCULATE AERIAL ROBOT MOVEMENT WITH RESPECT TO TARGET AND EXCEPTION 1128

FIG. 11B
IDENTIFY EXCEPTION TYPE 1126
OVERRIDE DIGITAL TETHER 1142
DETERMINE TO STOP OR SHUT DOWN, BASED ON OVERRIDE RECEIVED 1144
CALCULATE AERIAL ROBOT MOVEMENT WITH RESPECT TO TARGET AND EXCEPTION 1128

FIG. 11C
AVOIDING OBSTACLE 1150

IDENTIFY EXCEPTION TYPE 1126
DETECT OBSTACLE IN FLIGHT PATH REQUIRED TO MAINTAIN DIGITAL TETHER 1152
COMPUTE ADJUSTED FLIGHT PATH TO AVOID OBSTACLE 1154
MOVE AERIAL ROBOT BASED ON ADJUSTED FLIGHT PATH 1156
DETERMINE WHETHER TO MAINTAIN NEW DIGITAL TETHER OR RECOVER ORIGINAL DIGITAL TETHER 1158
    (YES: SET NEW OFFSET(S) 1162)
    (NO: MANEUVER TO PREVIOUS OFFSET(S) 1164)
CALCULATE AERIAL ROBOT MOVEMENT WITH RESPECT TO TARGET AND EXCEPTION 1128

FIG. 11D
DIGITAL TETHERING FOR TRACKING WITH AUTONOMOUS AERIAL ROBOT

RELATED APPLICATIONS

This application is a nonprovisional based on and claims the benefit of priority of U.S. Provisional Application No. 61/800,201, filed Mar. 15, 2013. The provisional application is hereby incorporated by reference.

FIELD

Embodiments described are related generally to unmanned aircraft, and embodiments described are more particularly related to a tracking aerial device.

COPYRIGHT NOTICE/PERMISSION

Portions of the disclosure of this patent document can contain material that is subject to copyright protection. The copyright owner has no objection to the reproduction by anyone of the patent document or the patent disclosure as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever. The copyright notice applies to all data as described below, and in the accompanying drawings hereto, as well as to any software described below: Copyright © 2014, Gilmore Labs, LLC. All Rights Reserved.

BACKGROUND

Aircraft are currently used to film a variety of sporting events. However, the cost of using aircraft is very high. Additionally, there are practical limitations on the types and angles of camera shots that can be accomplished with traditional aircraft filming. There are currently RF (radio frequency) aircraft available, but the limitations on flight control and signal delays make the traditional use of such aircraft either difficult or unfit for filming certain sporting events. Traditional use of such aircraft required multiple individuals (e.g., a driver and a camera controller) to coordinate to simultaneously fly the aircraft and capture images.

BRIEF DESCRIPTION OF THE DRAWINGS

The following description includes discussion of figures having illustrations given by way of example of implementations of embodiments described. The drawings should be understood by way of example, and not by way of limitation. As used herein, references to one or more "embodiments" are to be understood as describing a particular feature, structure, or characteristic included in at least one implementation. Thus, phrases such as "in one embodiment" or "in an alternate embodiment" appearing herein describe various embodiments and implementations, and do not necessarily all refer to the same embodiment. However, they are also not necessarily mutually exclusive.

FIG. 1 is a block diagram of an embodiment of an aerial robot that maintains a position A with respect to a target.

FIG. 2 is a block diagram of an embodiment of an aerial robot that maintains a position A with respect to a target while the target is in motion.

FIG. 3 is a block diagram of an embodiment of a system having an aerial robot that tracks a target via a beacon.

FIG. 4 is a block diagram of an embodiment of a system having an aerial robot that tracks a target via identifying an indicator.

FIG. 5 is a block diagram of an embodiment of an aerial robot including one or more features for detecting its position and tracking a target.

FIG. 6 is a block diagram of an embodiment of a beacon including one or more features for enabling an aerial robot to track a target.

FIG. 7 is a representation of an embodiment of pseudocode for a beacon to provide tracking information to an aerial robot.

FIG. 8 is a representation of an embodiment of pseudocode for an aerial robot to track a target via a beacon.

FIG. 9 is a representation of an embodiment of pseudocode for a beacon to establish a digital tether with an aerial robot.

FIG. 10 is a representation of an embodiment of pseudocode for an aerial robot to establish a digital tether with a beacon.

FIG. 11A is a flow diagram of an embodiment of a process for digital tethering.

FIG. 11B is a flow diagram of an embodiment of a process for modifying a digital tether.

FIG. 11C is a flow diagram of an embodiment of a process for overriding a digital tether.

FIG. 11D is a flow diagram of an embodiment of a process for avoiding an obstacle while being digitally tethered to a moving target.

Descriptions of certain details and embodiments follow, including a description of the figures, which can depict some or all of the embodiments described below, as well as discussing other potential embodiments or implementations of the inventive concepts presented herein.

DETAILED DESCRIPTION

As described herein, an aerial device automatically maintains a relative position with respect to a target. Maintaining the relative position with respect to a target can be referred to as a digital tether. The electronic platform of the aerial device can cause the device to move with the target to maintain substantially the same relative position with respect to the target. The position can be a three dimensional (3D) position and/or an offset with respect to angle to the target. Tracking the target can include tracking in the three dimensions of space as well as other dimensions (e.g., velocity, angle). The aerial device first sets a relative position with respect to the target to determine what relative position to maintain. The position can include a two dimensional distance component (x and y distance components) as well as an altitude component (z distance component). In one embodiment, the aerial device maintains an x and y position as the digital tether, but not necessarily an altitude component. In one embodiment, the aerial device also has an angle component, which indicates an angle of observance. The angle can be relative to a direction of movement, relative to a compass heading, and/or relative to an indicator or identifier of the target. In one embodiment, the angle is part of the digital tether. The target can have an indicator (e.g., a visual marker for image capture tracking, or a radio indicator for tracking via signaling) that the aerial device monitors. The aerial device can automatically adjust its flight path in response to movement of the target as indicated by the indicator.

It will be understood that the device can either be referred to as an "aerial device" or an "aerial robot." The expression "aerial robot" merely points out the fact that at least some of the operations occur automatically, without specific user intervention during the course of operation of the device. The autonomy of the operations typically results from configuring and/or programming the device to identify and react to certain
signals and/or sensor inputs. Thus, a user does not necessarily "fly" the aerial device once it is triggered to track a target. Instead, the configuration/programming of the device allows it to automatically track the target. It will be understood that the aerial device includes an aerial platform, which is a hardware platform including the controlling mechanisms and processing devices that allow automatic operation of the device (e.g., processor, sensors). The hardware platform interfaces with and/or is part of the hardware on the aerial device that enables flight and motor control devices (e.g., motors, controllers for the motors). The hardware platform could be a module that is connected to an aerial device. For purposes of example, the autonomous aerial device is typically referred to herein as an aerial robot.
In one embodiment, the aerial device is an image capture platform that tracks a moving target. For example, the aerial device can include photography and/or video equipment to monitor the target. The image capture equipment can include still photograph capture devices, lenses, video capture devices, or other image capture equipment. In one embodiment, the aerial device also or alternatively includes audio monitoring equipment, such as a microphone. In one embodiment, the aerial device can include other data capture sensors, such as infrared sensors, or other sensors.
In one embodiment, the position of the aerial device from the target can be set to a certain angle, distance, and height from the target, or latitude, longitude, and altitude offset with respect to the target. In one embodiment, the aerial device tracks the position of the target via a "beacon" attached to the target. In one embodiment, the beacon contains a GPS (global positioning system) module, microprocessor, radio or other communications link, and optional accelerometers, gyroscopes, magnetometer, barometer, or other sensors to complement the GPS position data. In one embodiment, the aerial device obtains initial position information from a GPS device, and tracks the target by radio signal strength or propagation delay for a target that generates a radio frequency beacon signal. In one embodiment, the aerial device obtains initial position information via an ultrasound beacon signal, which can be used for propagation delay and/or signal strength distance measurement. In one embodiment, the aerial device tracks the target at least in part via light or ultrasound sensing. In one embodiment, the aerial robot maintains altitude above the target at least in part by measuring barometric pressure difference. In one embodiment, the aerial device detects direction to the target by using one or more directional antennas (or sensors) to receive a signal from the target.
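The barometric approach above can be sketched briefly. This is an illustrative Python sketch, not part of the patent disclosure: it applies the standard international barometric formula to the two pressure readings, and all function names and parameter values are assumptions.

```python
# Estimate the robot's height above the target from the pressure difference
# between two barometers (robot-side and target-side). Hypothetical sketch.

def pressure_altitude_m(p_hpa, p0_hpa=1013.25):
    """Pressure altitude in meters via the international barometric formula
    (a reasonable approximation in the lower troposphere)."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))

def altitude_above_target(p_robot_hpa, p_target_hpa):
    """Height of the robot's barometer above the target's barometer."""
    return pressure_altitude_m(p_robot_hpa) - pressure_altitude_m(p_target_hpa)

height = altitude_above_target(1000.0, 1013.25)  # roughly 110 m
```

In practice both barometers drift with weather, so a shared reference pressure (or periodic re-zeroing at the target) would keep the relative altitude estimate stable.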
In one embodiment, the aerial device includes optical flow processing or other image processing components to perform image recognition to track the target. The target can be or include an identifying shape and/or color that can be identified by the image processing technology used on the aerial device. Thus, the aerial device can determine the location of the moving object (the target) in a captured video stream, and use feedback to move the aerial device to maintain the target with a consistent size and position in the video stream. In one embodiment, such video processing techniques can be used to complement data sent to the aerial device from the target. In one embodiment, such video processing techniques can be used to eliminate the need for the target to send beacon data to the aerial device.
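The feedback idea described above (keep the target at a consistent position and size in the frame) can be sketched as a simple proportional controller. This Python sketch is illustrative only; the gains, frame dimensions, and function names are assumptions, not the patent's implementation.

```python
# Convert pixel errors in the detected target's center and apparent size
# into velocity commands. Hypothetical proportional-control sketch.

def tracking_command(center_px, area_px, frame_center=(320, 240),
                     target_area=5000.0, k_xy=0.01, k_fwd=0.0001):
    ex = center_px[0] - frame_center[0]   # >0: target right of frame center
    ey = center_px[1] - frame_center[1]   # >0: target below frame center
    ea = target_area - area_px            # >0: target too small (too far)
    return {
        "strafe": k_xy * ex,     # slide right to re-center horizontally
        "climb": -k_xy * ey,     # descend if the target sits low in frame
        "forward": k_fwd * ea,   # advance if the target has shrunk
    }

cmd = tracking_command((340, 240), 4000.0)
```

A real tracker would add derivative/integral terms and rate limits, but the structure (pixel error in, velocity command out) is the core of the feedback loop.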
It will be understood that an autonomously tracking aerial device or aerial robot can be at risk for flyaways and/or crashes. The aerial device can include one or more features to reduce the risk of flyaways and crashes. It will be understood that completely eliminating the risk may not be possible in all cases. However, certain features can significantly reduce injury and/or damage due to flight errors by an autonomous device.
In one embodiment, the aerial device is governed by a "panic" button or override switch controlled by a user or administrator of the device. The aerial platform hardware can include an interrupt and logic to override any other flight operations. For example, the override can cause the aerial device to stop flying and enter a static hover when activated. In one embodiment, the static hover is governed by GPS positioning information. In one embodiment, the aerial device maintains a static hover based on inertial measurement sensors on the aerial platform. For example, the flight management of the device can attempt to keep the aerial device in a position that reduces all inertial measurements as close to zero as possible.
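The inertial-hover idea above (drive all measured inertial values toward zero) can be sketched minimally. This Python sketch is a hypothetical illustration, not the patent's flight code; the proportional gain and names are assumptions.

```python
# Command per-axis corrections that oppose any measured acceleration,
# nudging the vehicle back toward a motionless hover. Illustrative sketch.

def hover_corrections(accel_xyz, k_p=0.5):
    """Proportional corrections opposing the measured accelerations."""
    return tuple(-k_p * a for a in accel_xyz)

cmd = hover_corrections((0.2, -0.4, 0.0))
```

A flight controller would run this inside a full PID loop with sensor filtering; the sketch shows only the sign convention of the correction.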
In one embodiment, the aerial platform supports the use of a kill switch to completely stop all motors on the aerial platform. A kill switch can be initiated via some type of signaling provided to the aerial device by a user or administrator. The use of a kill switch in the case of an imminent crash or flight management unit (FMU) malfunction would inevitably cause the device to crash, but can reduce the damage associated with the crash. For example, stopping all motor function can reduce propeller speed prior to a crash, which is expected to reduce the danger associated with high velocity propeller spinning. The kill switch crashing can actually reduce propeller and motor damage in the event of a crash.
In one embodiment, the aerial device is programmed to automatically enter a static hover and/or to automatically land under any of a variety of error conditions. For example, the aerial device can enter a static hover if it loses a beacon signal from the target and/or cannot identify the target from image processing. Thus, the aerial device can enter a static hover in cases of interrupted radio communication, too great a distance between the target and the aerial device, the aerial device at an unsafe proximity to the target (as indicated by sensor(s) and/or configuration), low robot battery, or other conditions that make continuing to fly unsafe and/or inadvisable. In another example, the aerial device can perform an automatic landing in cases of losing a GPS signal, not being able to obtain an accurate GPS fix (e.g., because of identifying too few GPS satellite signals), or identifying that the position module's accuracy is below a threshold. It will be understood that these conditions could also result in the aerial device entering a static hover. It will also be understood that any of the other conditions identified above as causing a static hover could also, or alternatively, cause the aerial device to automatically land. In one embodiment, upon detection of an error, the aerial device can first attempt to enter a static hover, and if unsuccessful in maintaining a static hover, automatically land.
In one embodiment, any error condition can cause the aerial device to generate an alert. For example, if the aerial platform determines that there is an error on the platform, it can initiate a sound, vibration, flashing light, an error transmission that shows up on a display device, or other alert. Such an alert can be sent from the aerial device to a beacon device or control device for the aerial device. Thus, in cases where a user elects to have the aerial device follow from behind or otherwise out of sight, the aerial device can indicate an error to the user, and for example, indicate that it has engaged in a static hover or automatic landing. It will be understood that if the error condition is a loss of radio transmission, such a signal may be unable to get through. However, repeatedly sending such a signal anyway can increase the possibility that at least one attempt to indicate the error will be detectable to the user. Loss of radio transmission could also be an error
condition in itself, which can be displayed or otherwise indicated on a user's handheld or body mounted device.
In one embodiment, the aerial device tracks a target device based on radio signal exchanges with the target. Such radio signal exchanges can include a checksum to prevent corrupted radio link data from causing incorrect commands. Thus, the system can be configured to accept only data with a valid checksum and/or commands with a valid checksum by the receiving device.
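The checksum validation described above can be sketched as follows. This Python sketch is illustrative only: the framing, the CRC-32 choice, and the function names are assumptions for demonstration, not the patent's actual radio protocol.

```python
import struct
import zlib

# Append a checksum to outgoing radio commands and reject any frame whose
# checksum does not match on receipt. Hypothetical framing sketch.

def frame_command(payload: bytes) -> bytes:
    """Append a CRC-32 checksum to an outgoing command payload."""
    return payload + struct.pack(">I", zlib.crc32(payload))

def accept_command(frame: bytes):
    """Return the payload only if its checksum is valid, else None."""
    payload, received = frame[:-4], struct.unpack(">I", frame[-4:])[0]
    return payload if zlib.crc32(payload) == received else None

frame = frame_command(b"move to: 47.6062,-122.3321,30")
```

A corrupted frame (any byte flipped in transit) fails the check and is dropped rather than acted on, which is the behavior the passage describes.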
Most examples provided herein refer to the offset or the digital tether in terms of the three dimensions of distance (two horizontal dimensions and a vertical dimension). However, it will be understood that the offsets or digital tether can be understood as multidimensional. In addition to the three distance dimensions, the aerial robot can track velocity and/or acceleration and/or angle with respect to the target. Any one of these, or all of them, could be considered additional dimensions. Thus the aerial robot can track in multiple dimensions, three distance dimensions as well as other dimensions. For example, in one embodiment, the aerial robot can be programmed for a preset path, tracking a specific flight path with respect to a race course or event course. Thus, the three dimensions of distance could be set for each part of the flight path in certain implementations. The aerial robot can then dynamically track or respond to controls for velocity and/or acceleration along the flight path. Thus, the aerial robot can track a target in multiple dimensions.
FIG. 1 is a block diagram of an embodiment of an aerial device or robot that maintains a position A with respect to a target. System 100 includes aerial robot 120, which has a digital tether with target 110. Target 110 is illustrated as a person, but could be another device or vehicle, an animal, or other object to track. Vector A or position A represents an offset between aerial robot 120 and target 110. As illustrated, vector A can include X, Y, and/or Z components, representing two horizontal axes and a vertical axis. The distance along any of the axes can be any distance so long as vector A is within range of the monitoring and tracking capabilities of aerial robot 120. Thus, vector A should be within range of a radio signal if a radio beacon is used, or within range of optical processing if optical processing is used. In one embodiment, vector A is bounded within programming, based on tracking technology used.
Aerial robot 120 can be set at an offset of a certain distance in front of or behind target 110, to one side or the other, and at a certain altitude with respect to the target. In one embodiment, aerial robot 120 is set directly inline with target 110 along one of the axes. It will be understood that by setting the robot inline with the target, one or more of the other axes will be eliminated. In one embodiment, aerial robot 120 can also set an angular rotation or angle R with respect to target 110. Angle R refers to how aerial robot 120 is "pointed" at the target.
In one embodiment, target 110 is a user of aerial robot 120. In one embodiment, the user has a controller that allows certain interaction with the robot. Thus, the user can set vector A, and/or adjust the position offsets of the digital tethering. While the expression digital tethering is used, "shadowing" is another expression that can be used to refer to the tracking of aerial robot 120 at a relatively constant vector A. It will be understood that in practice vector A may only be fixed with respect to expected tolerances. Thus, aerial robot 120 can maintain substantially the same relative position with respect to target 110, while still varying slightly. The tolerances can include positional tolerances, as well as delay/timing tolerances. For example, initial movements and/or acceleration by target 110 can cause a delay in the tracking of aerial robot 120.
Over time, on average, aerial robot 120 is expected to track target 110 at substantially the same distance or offset(s).
In one embodiment, aerial robot 120 includes image capture hardware, such as a video recording device or camera (either still image camera, video camera, or both), as represented by camera 122. Camera 122 can monitor target 110, and can move relative to target 110 at least partially independently of aerial robot 120. Camera 122 represents both the image capture hardware and motion control hardware of camera 122. In one embodiment, camera 122 is fixed on aerial robot 120, and will thus monitor target 110 based on vector A of the aerial robot to target 110, and angular position R of the robot relative to the target. Camera 122 can be fixed at a vertical angle down from horizontal (planar with respect to the ground), to enable the camera to monitor the target. In one embodiment, the vertical angle of camera 122 with respect to aerial robot 120 is adjustable. In one embodiment, the vertical angle and the rotational angle of camera 122 are adjustable. Thus, at least one of pitch or yaw of camera 122 can be adjusted with respect to the target, at least partially independently of adjusting a position of the aerial robot.
In one embodiment, target 110 includes beacon 112. Beacon 112 represents a device or mechanism at the target that can send commands and/or exchange data with aerial robot 120. It will be understood that a "controller" that enables manual control of aerial robot 120 and/or the sending of commands to the aerial robot can exist separately from, in addition to, or in place of having such controls at beacon 112. Thus, in one embodiment, beacon 112 is a relatively simple signal generating device, while in another embodiment, beacon 112 is part of or connected to a controller at target 110. In one embodiment, aerial robot 120 sends data captured by camera 122 to beacon 112 or a controller.
The initial positioning of aerial robot 120 with respect to target 110 can set the digital tether for the aerial robot. As mentioned above, vector A represents the digital tether, and can include offsets in X, Y, and/or Z. Thus, system 100 can set a relative three dimensional position of aerial robot 120 with respect to target 110. The user can set the one or more offsets by adjusting height, angle, and distance controls of aerial robot 120, by flying the robot manually to the desired position, and/or by moving the target to the desired location relative to the aerial robot. In one embodiment, the user can set a button or other control to command the system to record the particular offsets. When offsets are set, aerial robot 120 can record the three axis position offsets.
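The offset-recording step above amounts to simple vector arithmetic, sketched here in Python. This is an illustrative sketch, not the patent's implementation; positions are local east/north/up coordinates in meters, and all names are hypothetical.

```python
# Record the tether offsets from the current robot/target positions, then
# reuse them to compute where the robot should be after the target moves.

def record_offsets(robot_pos, target_pos):
    """Record the offset vector A to maintain (robot minus target)."""
    return tuple(r - t for r, t in zip(robot_pos, target_pos))

def desired_robot_position(target_pos, offsets):
    """Apply the recorded offsets to the target's new position."""
    return tuple(t + o for t, o in zip(target_pos, offsets))

offsets = record_offsets((2.0, -3.0, 10.0), (0.0, 0.0, 0.0))
goal = desired_robot_position((0.0, 5.0, 0.0), offsets)  # -> (2.0, 2.0, 10.0)
```

The same arithmetic works whether the robot or the beacon stores the offsets; only the device performing the addition changes.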
In one embodiment, beacon 112 records the offsets. In one embodiment, beacon 112 sends commands and flight information for aerial robot 120 based on movement of target 110. Thus, beacon 112 can include processing hardware that generates position data for aerial robot 120. In such an implementation, beacon 112 can store the offsets for vector A, and factor the offsets into calculations of position data to send to aerial robot 120. In one embodiment, the beacon records offsets for aerial robot 120 and adds the offsets to its own position information, which it then sends to aerial robot 120 over a radio link. In one embodiment, aerial robot 120 records the offsets and adds them to position information sent by beacon 112.
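When the beacon adds metric offsets to its own GPS fix, a coordinate conversion is involved. The Python sketch below is a hedged illustration, not the patent's method: it uses a flat-earth approximation that is reasonable over short tether distances, and all names are hypothetical.

```python
import math

# Add recorded east/north/up offsets (meters) to the beacon's GPS fix to
# produce the goal position radioed to the robot. Illustrative sketch.

EARTH_RADIUS_M = 6371000.0

def beacon_goal(lat_deg, lon_deg, alt_m, east_m, north_m, up_m):
    """Beacon position plus metric offsets, as latitude/longitude/altitude."""
    dlat = math.degrees(north_m / EARTH_RADIUS_M)
    dlon = math.degrees(east_m / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg))))
    return lat_deg + dlat, lon_deg + dlon, alt_m + up_m

goal = beacon_goal(45.0, -122.0, 100.0, 0.0, 1000.0, 10.0)
```

Over the tens of meters typical of a tether, the flat-earth error is negligible; longer offsets would call for a proper geodetic library.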
In one embodiment, angle R can be relative to compass direction from target 110. For example, aerial robot 120 can be set at an offset of 45 degrees (northeast). Whichever direction target 110 moves, aerial robot 120 can try to maintain the set distance A away from the target, and maintain an angle of 45 degrees from the target. It will be understood that the offset angle can also be subject to transformation depending on the target's velocity vector. For example, if aerial robot 120 is set
at an offset of 45 degrees and target 110 is heading 315 degrees (northwest), the robot will be to the target's left. If target 110 gradually turns west while in motion until the direction of travel becomes due west (270 degrees), aerial robot 120 can correspondingly gradually transform its offset angle R by the same amount, from 45 degrees to 0 degrees. Thus, the robot will still be at the left-hand side of the target.
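The angle transformation in the example above can be sketched as follows. This Python sketch is illustrative only; the function name is hypothetical, and it simply rotates the offset angle by the target's signed course change, as the passage describes.

```python
# Rotate the tether's offset angle by the target's course change so the
# robot stays on the same side of the target. Illustrative sketch.

def transformed_offset_angle(offset_deg, old_heading_deg, new_heading_deg):
    """New offset angle after the target's heading changes."""
    change = (new_heading_deg - old_heading_deg) % 360.0
    if change > 180.0:
        change -= 360.0              # take the shortest signed turn
    return (offset_deg + change) % 360.0

# The text's example: 45-degree offset, target turns from 315 to 270 degrees.
angle = transformed_offset_angle(45.0, 315.0, 270.0)  # -> 0.0
```

Applying the heading change incrementally each update, rather than in one jump, would produce the gradual rotation the passage describes.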
In one embodiment, the target's course change can be computed from the change in the course given by a GPS module, or simply by comparing successive GPS coordinates. Whether position information is calculated by a controller at target 110 and sent to aerial robot