`
`(12) United States Patent
`Zang
`
(10) Patent No.: US 9,164,506 B1
(45) Date of Patent: Oct. 20, 2015
`
`(54) SYSTEMS AND METHODS FOR TARGET
`TRACKING
`
`(71) Applicant: SZ DJI TECHNOLOGY Co., Ltd,
`Shenzhen (CN)
`
`(72) Inventor: Bo Zang, Shenzhen (CN)
`
`(73) Assignee: SZ DJI TECHNOLOGY CO., LTD,
`Shenzhen (CN)
`
(*) Notice: Subject to any disclaimer, the term of this
            patent is extended or adjusted under 35
            U.S.C. 154(b) by 0 days.
`
`(21) Appl. No.: 14/471,954
`
(22) Filed: Aug. 28, 2014
`
`Related U.S. Application Data
`
(63) Continuation of application No. PCT/CN2014/083315,
     filed on Jul. 30, 2014.

(51) Int. Cl.
     G05D 1/12          (2006.01)
     G05D 1/00          (2006.01)
`(52) U.S. Cl.
`CPC ............ G05D 1/0038 (2013.01); G05D 1/0094
`(2013.01); G05D 1/12 (2013.01)
`(58) Field of Classification Search
`CPC ............................... G05D 1/0094; G05D 1/12
`USPC ........ 701/2, 11; 348/113, 114, 144, 169, 170,
`348/171, 172, 211.99
`See application file for complete search history.
`
`(56) (cid:9)
`
`References Cited
`
`U.S. PATENT DOCUMENTS
`
`7,970,507 B2* (cid:9)
`2008/0054158 Al* (cid:9)
`2009/0157233 Al * (cid:9)
`
`6/2011 Fregene et al . ................. 701/23
`3/2008 Ariyur et al ................ 250/203.1
`6/2009 Kokkeby et al ................... 701/3
`
`................. 701/23
`
`7/2009 Fregene et al . (cid:9)
`2009/0187299 Al * (cid:9)
`1/2010 Bodin etal.
`20 10/0004802 Al (cid:9)
`....................... 701/2
`9/2010 Hines et al . (cid:9)
`2010/0250022 Al * (cid:9)
`2011/0141287 Al* (cid:9)
`6/2011 Dunkel et al . (cid:9)
`................ 348/169
`2011/0304737 Al* (cid:9) 12/2011 Evans et al .................... 348/169
`2012/0143808 Al * (cid:9)
`6/2012 Karins et al ..................... 706/46
`2012/0154579 Al* (cid:9)
`6/2012 Hampapur et al. (cid:9)
`........... 348/143
`20 12/0200703 Al (cid:9)
`8/2012 Nadir et al.
`2012/0287274 Al* (cid:9) 11/2012 Bevirt (cid:9)
`........................... 348/144
`2012/0307042 Al* (cid:9) 12/2012 Lee et al . (cid:9)
`...................... 348/114
`(Continued)
`
        FOREIGN PATENT DOCUMENTS

  CN      102809969 A          12/2012
  CN      103149939 A           6/2013
  WO      WO 2010/089738 A2     8/2010
`
`OTHER PUBLICATIONS
`
`Rafi, et al. Autonomous target following by unmanned aerial
`vehicles. In Proceedings of the SPIE, May 2006.
`(Continued)
`
Primary Examiner - Thomas G Black
Assistant Examiner - Peter D Nolan
(74) Attorney, Agent, or Firm - Wilson Sonsini Goodrich &
Rosati
`
(57)                    ABSTRACT

The present invention provides systems, methods, and
devices related to target tracking by UAVs. The UAV may be
configured to receive target information from a control ter-
minal related to a target to be tracked by an imaging device
coupled to the UAV. The target information may be used by
the UAV to automatically track the target so as to maintain
predetermined position and/or size of the target within one or
more images captured by the imaging device. The control
terminal may be configured to display images from the imag-
ing device as well as allowing user input related to the target
information.
`
`30 Claims, 18 Drawing Sheets
`
`
`
US 9,164,506 B1
Page 2

(56)                References Cited

           U.S. PATENT DOCUMENTS

  2013/0085643 A1*   4/2013  Mathews ...................... 701/49
  2013/0176423 A1*   7/2013  Rischmuller et al. .......... 348/114
  2014/0049643 A1*   2/2014  Segerstrom et al. ........... 348/144

              OTHER PUBLICATIONS

International search report and written opinion dated May 6, 2015 for
PCT/CN2014/083315.

* cited by examiner
`
`
`
FIG. 1 (Sheet 1 of 18)
`
`
`
FIG. 2 (Sheet 2 of 18)
`
`
`
FIG. 3 (Sheet 3 of 18)
`
`
FIG. 4 (Sheet 4 of 18): flowchart of method 400 - obtain target information (402); identify target based on target information (404); detect deviation of target from predetermined position and/or size; generate commands for UAV, carrier and/or imaging device to substantially correct the deviation.
`
`
FIG. 5 (Sheet 5 of 18)
`
`
`
FIG. 6 (Sheet 6 of 18): image coordinate diagram 600 with origin (0, 0), image dimensions (W, H), points P (u, v) 602 and Po (uo, vo) 604, and axes X 608, Y 606, Z 610.
`
`
FIG. 7 (Sheet 7 of 18)
`
`
FIG. 8 (Sheet 8 of 18): flowchart of method 800 - receive user navigation commands and target information (802); control movable object according to navigation commands (804); adjust movable object, carrier and/or imaging device to track target according to target information (806).
`
`
`
FIG. 9 (Sheet 9 of 18): flowchart of method 900 - receive navigation input (902); receive tracking input (904); generate navigation commands based on navigation input (906); generate target information based on tracking input (908); provide navigation commands and target information (910).
`
`
`
FIG. 10 (Sheet 10 of 18): flowchart of method 1000 - display images captured by movable object (1002); receive user selection of a target (1004); generate target information based on user selection (1006); provide target information to the movable object (1008).
`
`
FIG. 11 (Sheet 11 of 18): flowchart of method 1100 - receive images captured by UAV (1102); receive tracking data (1104); display images with tracking data (1106).
`
`
`
FIG. 12 (Sheet 12 of 18)
`
`
`
FIGS. 13A, 13B, and 13C (Sheet 13 of 18)
`
`
`
FIG. 14 (Sheet 14 of 18)
`
`
`
FIG. 15 (Sheet 15 of 18)
`
`
`
FIG. 16 (Sheet 16 of 18)
`
`
`
FIG. 17 (Sheet 17 of 18)
`
`
`
FIG. 18 (Sheet 18 of 18)
`
`
`
`1
`SYSTEMS AND METHODS FOR TARGET
`TRACKING
`
`CROSS-REFERENCE
`
`This application is a continuation application of Interna-
`tional Application No. PCT/CN2014/083315, filed on Jul. 30,
`2014, the content of which is hereby incorporated by refer-
`ence in its entirety.
`
`BACKGROUND OF THE INVENTION
`
`Aerial vehicles such as unmanned aerial vehicles (UAVs)
`can be used for performing surveillance, reconnaissance, and
`exploration tasks for military and civilian applications. Such
`aerial vehicles may carry a payload configured to perform a
`specific function such as capturing images of surrounding
`environment.
`In some instances, it may be desirable for aerial vehicles to
`track a specific target. For small-sized aerial vehicles, such
`tracking is traditionally achieved via control commands from
`a user-operated remote control terminal or device. Such
`manual tracking control may become difficult in certain cir-
`cumstances, such as when the movable object or target is
`moving quickly or when the movable object is at least par-
`tially blocked from view of the user. Furthermore, the atten-
`tion necessary for such manual tracking typically requires a
dedicated user that controls a camera onboard the aerial
vehicle, separate from a user that controls the navigation of the
`aerial vehicle, thereby increasing the cost for aerial photog-
`raphy and other applications of the aerial vehicles.
`
`SUMMARY OF THE INVENTION
`
`In some instances, it may be desirable for aerial vehicles to
`track a specific target. Thus, a need exists for improved UAV
tracking methods and systems that provide automatic or
`semi-automatic tracking of a target, thereby relieving opera-
`tors of the aerial vehicles of manually tracking the targets.
`The present invention provides systems, methods, and
`devices related to target tracking by UAVs. The UAV may be
`configured to receive target information from a control ter-
`minal related to a target to be tracked by an imaging device
`coupled to the UAV. The target information may be used by
`the UAV to automatically track the target so as to maintain
`predetermined position and/or size of the target within one or
`more images captured by the imaging device. Any description
`of tracking may include visual tracking by the imaging
`device. The control terminal may be configured to display
`images from the imaging device as well as allowing user input
`related to the target information.
`According to an aspect of the present invention, a method
`for controlling an unmanned aerial vehicle (UAV) is pro-
`vided. The method comprises: receiving, from a remote user,
`one or more navigation commands to move the UAV along a
`flight path; receiving, from the remote user, target informa-
`tion of a target to be tracked by an imaging device on the UAV;
`and tracking the target according to the target information by
`automatically adjusting at least one of the UAV or the imaging
`device while the UAV moves along the flight path according
`to the one or more navigation commands from the remote
`user.
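By way of illustration only, the following Python sketch shows one way such a control flow could be organized on the vehicle side. The interfaces (receiver, flight_controller, tracker) and their method names are hypothetical placeholders, not part of the claimed system.

    # Illustrative sketch: track a target while following user navigation
    # commands. All collaborating objects are hypothetical stubs.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class TargetInformation:
        expected_u: float = 0.5     # expected horizontal position (fraction of image width)
        expected_v: float = 0.5     # expected vertical position (fraction of image height)
        expected_size: float = 0.2  # expected size (fraction of image area)

    def tracking_loop(receiver, flight_controller, tracker) -> None:
        """Fly the user-commanded path while autonomously correcting any
        deviation of the target from its expected position/size."""
        target_info: Optional[TargetInformation] = None
        while True:
            nav_cmd = receiver.poll_navigation_command()    # hypothetical API
            if nav_cmd is not None:
                flight_controller.apply(nav_cmd)             # move along the flight path
            new_info = receiver.poll_target_information()    # hypothetical API
            if new_info is not None:
                target_info = new_info
            if target_info is not None:
                correction = tracker.compute_correction(target_info)
                flight_controller.apply_correction(correction)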
`According to another aspect of the present invention, an
`unmanned aerial vehicle (UAV) with tracking capabilities is
`provided. The UAV comprises: one or more receivers, indi-
`vidually or collectively, configured to receive from a remote
user (1) one or more navigation commands to move the UAV
`along a flight path, and (2) target information of a target to be
`tracked by an imaging device on the UAV; and one or more
`processors, individually or collectively, configured to track
`the target according to the target information by automati-
cally adjusting at least one of the UAV or the imaging device
`while the UAV moves along the flight path according to the
`one or more navigation commands from the remote user.
According to another aspect of the present invention, a
system for controlling an unmanned aerial vehicle (UAV) is
provided. The system comprises: one or more receivers, indi-
`vidually or collectively, configured to receive from a remote
`user (1) one or more navigation commands to move the UAV
`along a flight path, and (2) target information of a target to be
`tracked by an imaging device on the UAV; and one or more
processors, individually or collectively, configured to track
`the target according to the target information by automati-
`cally adjusting at least one of the UAV or the imaging device
`while the UAV moves along the flight path according to the
`one or more navigation commands from the remote user.
`In some embodiments, the imaging device includes a cam-
`era or a camcorder.
`In some embodiments, the one or more navigation com-
mands are adapted to control a speed, position, or attitude of
`the UAV.
`In some embodiments, the target is substantially stationary
`relative to a reference object.
`In some embodiments, the target is moving relative to a
`reference object.
`In some embodiments, the target information includes ini-
tial target information.
`In some embodiments, the initial target information
`includes an initial position or an initial size of the target
`within an image captured by the imaging device.
`In some embodiments, the target information includes tar-
get type information.
`In some embodiments, tracking the target according to the
`target information further includes identifying, based on the
`target type information, the target to track from within one or
`more images captured by the imaging device using an image
recognition algorithm.
`In some embodiments, the target type information includes
`color, texture, or pattern information.
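As a concrete, non-authoritative illustration of identifying a target from target type information such as color, the sketch below applies OpenCV color thresholding to a single frame. The use of OpenCV, the HSV range, and the largest-region heuristic are assumptions made for the example; the patent does not specify a particular image recognition algorithm.

    # Hedged example: locate a candidate target by color in one frame.
    # Assumes OpenCV 4 and NumPy; the HSV range below is arbitrary.
    import cv2
    import numpy as np

    def find_target_by_color(frame_bgr, hsv_low=(100, 120, 70), hsv_high=(130, 255, 255)):
        """Return the (u, v, width, height) bounding box of the largest region
        matching the color range, or None if nothing matches."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, np.array(hsv_low), np.array(hsv_high))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        largest = max(contours, key=cv2.contourArea)
        return cv2.boundingRect(largest)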
`In some embodiments, the target information includes
`expected target information.
`In some embodiments, the expected target information
`includes an expected position or an expected size of the target
`within an image captured by the imaging device.
`In some embodiments, the expected size of the target is the
`same as an initial size of the target.
`In some embodiments, the expected position of the target is
`the same as an initial position of the target.
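For illustration, the initial, type, and expected components of the target information described above could be grouped as in the sketch below; the field names and coordinate conventions are assumptions, and the fallback of expected values to initial values simply mirrors the embodiments in which they coincide.

    # Illustrative layout for target information; all names are assumptions.
    from dataclasses import dataclass
    from typing import Optional, Tuple

    @dataclass
    class TargetInfo:
        initial_position: Optional[Tuple[int, int]] = None   # (u, v) in pixels
        initial_size: Optional[Tuple[int, int]] = None        # (width, height) in pixels
        target_type: Optional[dict] = None                    # e.g. color/texture/pattern descriptors
        expected_position: Optional[Tuple[int, int]] = None
        expected_size: Optional[Tuple[int, int]] = None

        def resolved_expected(self):
            """Expected values default to the initial ones when not set."""
            return (self.expected_position or self.initial_position,
                    self.expected_size or self.initial_size)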
`In some embodiments, tracking the target according to the
`target information includes maintaining, within a predeter-
`mined degree of tolerance, the expected position, or the
expected size of the target within one or more images cap-
`tured by the imaging device.
`In some embodiments, the imaging device is coupled to the
`UAV via a carrier configured to permit the imaging device to
`move relative to the UAV.
`In some embodiments, the carrier is configured to permit
`the imaging device to rotate around at least two axes relative
`to the UAV.
`In some embodiments, tracking the target according to the
`target information includes automatically adjusting at least
one of the UAV, the carrier, or the imaging device while the
`UAV moves along the flight path according to the one or more
`navigation commands from the remote user.
`
`In some embodiments, the target information includes
`expected target information and tracking the target according
`to the target information comprises: determining current tar-
`get information of the target based on one or more images
`captured by the imaging device; detecting a deviation of the
`current target information from the expected target informa-
`tion; and calculating an adjustment to the UAV, the carrier, or
`the imaging device so as to substantially correct the deviation.
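A minimal sketch of that determine/detect step is given below, assuming the current and expected target information are both expressed as bounding boxes in image coordinates; the tolerance value and helper names are invented for the example.

    # Hedged sketch: compare the current target bounding box with the expected
    # one and report the deviation to be corrected.
    from dataclasses import dataclass

    @dataclass
    class BoundingBox:
        u: float   # top-left x (pixels)
        v: float   # top-left y (pixels)
        w: float   # width (pixels)
        h: float   # height (pixels)

        def center(self):
            return (self.u + self.w / 2.0, self.v + self.h / 2.0)

    def detect_deviation(current: BoundingBox, expected: BoundingBox, tol: float = 5.0):
        """Return (du, dv, d_area): offset of the target's center and change in
        apparent area, each zeroed when within the tolerance."""
        cu, cv = current.center()
        eu, ev = expected.center()
        du, dv = cu - eu, cv - ev
        d_area = current.w * current.h - expected.w * expected.h
        du = 0.0 if abs(du) <= tol else du
        dv = 0.0 if abs(dv) <= tol else dv
        return du, dv, d_area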
`In some embodiments, the deviation is related to a change
`in position of the target and the calculated adjustment is
`related to an angular velocity for the UAV.
`In some embodiments, the angular velocity is relative to a
`yaw axis of the UAV.
`In some embodiments, the angular velocity is relative to a
`pitch axis of the UAV.
`In some embodiments, the deviation is related to a change
`in position of the target and the calculated adjustment is
`related to an angular velocity for the imaging device relative
`to the UAV.
`In some embodiments, the calculated adjustment is used to
`generate control signals for the carrier so as to cause the
`imaging device to move relative to the UAV.
`In some embodiments, the deviation is related to a change
`in size of the target and the adjustment is related to a linear
`velocity for the UAV.
`In some embodiments, the deviation is related to a change
`in size of the target and the adjustment is related to one or
`more parameters of the imaging device.
`In some embodiments, the one or more parameters of the
`imaging device include focal length, zoom, or focus.
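The proportional mapping sketched below illustrates how a positional deviation could be translated into angular velocities about the yaw and pitch axes, and a size deviation into a linear velocity or a zoom change. The gains and the simple proportional form are assumptions for the example, not the claimed control law.

    # Hedged sketch: map image-space deviations to candidate adjustments.
    # Sign convention assumed: positive du means the target sits to the right
    # of its expected position; positive dv means below it.
    def compute_adjustments(du, dv, d_area, k_yaw=0.002, k_pitch=0.002, k_size=0.0001):
        """Return proportional corrections; all gains are illustrative only."""
        return {
            "yaw_rate": k_yaw * du,                # angular velocity about the yaw axis
            "pitch_rate": k_pitch * dv,            # angular velocity about the pitch axis
            "forward_velocity": -k_size * d_area,  # close distance if the target shrank
            "zoom_step": -k_size * d_area,         # or adjust the imaging device instead
        }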
`In some embodiments, the calculated adjustment is limited
`to a predetermined range.
`In some embodiments, the predetermined range corre-
`sponds to a predetermined range of control lever amount of a
`control system.
`In some embodiments, the control system includes a flight
`control system for the UAV or a control system for the carrier.
`In some embodiments, a warning signal is provided if the
`calculated adjustment falls outside the predetermined range.
`In some embodiments, tracking the target comprises com-
`paring the calculated adjustment to a predetermined maxi-
`mum threshold value and providing the predetermined maxi-
`mum threshold value if the calculated adjustment exceeds the
`predetermined maximum threshold value.
`In some embodiments, the predetermined maximum
`threshold value includes a maximum angular velocity or a
`maximum linear velocity for the UAV or the imaging device.
`In some embodiments, tracking the target comprises com-
`paring the calculated adjustment to a predetermined mini-
`mum threshold value and providing the predetermined mini-
`mum threshold value if the calculated adjustment is less than
`the predetermined minimum threshold value.
`In some embodiments, the predetermined minimum
`threshold value includes a minimum angular velocity or a
`minimum linear velocity for the UAV or the imaging device.
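A simple way to realize the limiting behavior described above, substituting the threshold when the computed value falls outside the permitted range and raising a warning, is sketched below; the numeric limits are placeholders.

    # Hedged sketch: keep an adjustment inside a predetermined range and warn
    # when clamping occurs. The limits here are placeholders.
    import warnings

    def limit_adjustment(value, minimum=0.0, maximum=1.0):
        """Return 'value' clamped to [minimum, maximum], warning on clamping."""
        if value > maximum:
            warnings.warn(f"adjustment {value} exceeds maximum {maximum}; using maximum")
            return maximum
        if value < minimum:
            warnings.warn(f"adjustment {value} is below minimum {minimum}; using minimum")
            return minimum
        return value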
`In some embodiments, the target information is received
`from a remote control device accessible to the remote user.
`In some embodiments, the one or more navigation com-
`mands are received from the same remote control device.
`In some embodiments, the one or more navigation com-
`mands are received from a different remote control device.
`In some embodiments, the remote control device is con-
`figured to receive user input from a touchscreen, joystick,
`keyboard, mouse, or stylus.
`In some embodiments, the remote control device is con-
`figured to receive user input from a wearable device.
`
`5
`
`4
`In some embodiments, the remote control device is con-
`figured to: receive one or more images captured by the imag-
`ing device from the UAV; display the one or more images;
`receive a user selection of a target from within a displayed
`image; generate the target information of the target based on
`the user selection of the target; and transmit the target infor-
`mation to the UAV.
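For illustration, the remote-control-device side of this exchange could be organized as in the sketch below; the video_link, display, input_device, and uplink objects are hypothetical interfaces, and the dictionary keys are invented field names.

    # Hedged sketch of the control terminal: show frames from the UAV, accept a
    # user selection, and send the derived target information back to the UAV.
    def control_terminal_loop(video_link, display, input_device, uplink):
        for frame in video_link.frames():              # images received from the UAV
            display.show(frame)                         # display in (near) real time
            selection = input_device.poll_selection()   # e.g. a touched or circled area
            if selection is not None:
                target_info = {
                    "initial_position": selection.center,
                    "initial_size": selection.size,
                }
                uplink.send_target_information(target_info)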
`In some embodiments, the remote control device is further
`configured to generate the one or more navigation commands
based on user input and to transmit the one or more navigation
`commands to the UAV.
`In some embodiments, the remote control device is further
`configured to receive tracking information related to the tar-
`get and to display the one or more images with the tracking
information.
`According to an aspect of the present invention, an
`unmanned aerial vehicle (UAV) with tracking capabilities is
`provided. The UAV comprises: one or more receivers, indi-
`vidually or collectively, configured to receive, from a remote
user, user-specified target information of a target to be tracked
`by an imaging device on the UAV, the user-specified target
`information including a predetermined position or a prede-
`termined size of the target within an image captured by the
`imaging device, the imaging device coupled to the UAV via a
carrier configured to permit the imaging device to move rela-
`tive to the UAV; and one or more processors, individually or
`collectively, configured to: detect a deviation from the prede-
`termined position or the predetermined size of the target
`based on one or more images captured by the imaging device;
and generate commands to automatically adjust the UAV, the
carrier, or the imaging device so as to substantially correct the
`detected deviation from the predetermined position or the
`predetermined size of the target.
`According to another aspect of the present invention, a
system for controlling an unmanned aerial vehicle (UAV) is
`provided. The system comprises: one or more receivers, indi-
`vidually or collectively, configured to receive, from a remote
`user, user-specified target information of a target to be tracked
`by an imaging device on the UAV, the user-specified target
information including a predetermined position or a prede-
`termined size of the target within an image captured by the
`imaging device, the imaging device coupled to the UAV via a
`carrier configured to permit the imaging device to move rela-
`tive to the UAV; and one or more processors, individually or
collectively, configured to: detect a deviation from the prede-
`termined position or the predetermined size of the target
`based on one or more images captured by the imaging device;
`and generate commands to automatically adjust the UAV, the
carrier, or the imaging device so as to substantially correct the
detected deviation from the predetermined position or the
`predetermined size of the target.
`According to another aspect of the present invention, a
`method for controlling an unmanned aerial vehicle (UAV) is
`provided. The method comprises: receiving, from a remote
user, user-specified target information of a target to be tracked
by an imaging device on the UAV, the user-specified target
`information including a predetermined position or predeter-
`mined size of the target within an image captured by the
`imaging device, the imaging device coupled to the UAV via a
carrier configured to permit the imaging device to move rela-
`tive to the UAV; detecting, by a processor onboard the UAV, a
`deviation from the predetermined position or the predeter-
`mined size of the target based on one or more images captured
`by the imaging device; and automatically adjusting the UAV,
the carrier, or the imaging device so as to substantially correct
`the detected deviation from the predetermined position or the
`predetermined size of the target.
`
`In some embodiments, the imaging device includes a cam-
`era or a camcorder.
`In some embodiments, the method further comprises
`receiving, from the remote user, one or more commands
adapted to control a speed, position, orientation, or attitude of
`the UAV.
`In some embodiments, the method further comprises
`receiving, from the remote user, one or more commands
adapted to control a speed, position, orientation, or attitude of
`the carrier.
`In some embodiments, the method further comprises
`receiving, from the remote user, one or more commands
`adapted to control one or more operational parameters of the
`imaging device.
`In some embodiments, the one or more operational param-
`eters of the imaging device include focal length, zoom level,
`imaging mode, image resolution, focus, depth of field, expo-
`sure, lens speed, or field of view.
`In some embodiments, the carrier is configured to permit
the imaging device to rotate around at least one axis relative
`to the UAV.
`In some embodiments, the carrier is configured to permit
`the imaging device to rotate around at least two axes relative
`to the UAV.
`In some embodiments, the target information of the target
`further includes target type information.
`In some embodiments, the target type information includes
`a color or texture of the target.
`In some embodiments, the predetermined position of the
`target includes an initial position or an expected position of
`the target.
`In some embodiments, the predetermined size of the target
`includes an initial size or an expected size of the target.
`In some embodiments, detecting the deviation from the
`predetermined position or the predetermined size of the target
`comprises comparing a position or size of the target within the
`one or more images captured by the imaging device with the
`predetermined position or predetermined size, respectively.
`In some embodiments, adjusting the UAV, the carrier, or
`the imaging device comprises calculating an adjustment to
`the UAV, the carrier, or the imaging device so as to substan-
`tially correct the deviation.
`In some embodiments, the deviation is related to a change
`in position of the target and the adjustment is related to an
`angular velocity for the UAV.
`In some embodiments, the angular velocity is relative to a
`yaw axis of the UAV.
`In some embodiments, the angular velocity is relative to a
`pitch axis of the UAV.
`In some embodiments, the deviation is related to a change
`in position of the target and the adjustment is related to an
`angular velocity for the imaging device relative to the UAV.
`In some embodiments, the adjustment is used to generate
`control signals for the carrier so as to cause the imaging
`device to move relative to the UAV.
`In some embodiments, the angular velocity is relative to a
`yaw axis of the imaging device.
`In some embodiments, the angular velocity is relative to a
`pitch axis of the imaging device.
`In some embodiments, the deviation is related to a change
`in size of the target and the adjustment is related to a linear
`velocity for the UAV.
`In some embodiments, the deviation is related to a change
`in size of the target and the adjustment is related to one or
`more parameters of the imaging device.
`
`5 (cid:9)
`
`10
`
`30 (cid:9)
`
`6
`In some embodiments, the one or more parameters of the
`imaging device include focal length, zoom level, imaging
`mode, image resolution, focus, depth of field, exposure, lens
`speed, or field of view.
`In some embodiments, the calculated adjustment is limited
`to a predetermined range.
`In some embodiments, the predetermined range corre-
`sponds to a predetermined range of control lever amount of a
`control system.
`In some embodiments, the control system includes a navi-
`gation control system for the UAV or a control system for the
`carrier.
`In some embodiments, the method further comprises pro-
viding a warning signal if the adjustment falls outside the
`predetermined range.
`In some embodiments, the warning signal is used to pro-
`vide an audio or visual signal.
`In some embodiments, the warning signal is used to pro-
vide a kinetic signal.
`In some embodiments, the method further comprises trans-
`mitting, in substantially real-time, images captured by the
`imaging device to a remote user device accessible to the
`remote user.
`In some embodiments, the remote user device comprises a
`display for displaying the images captured by the imaging
`device.
`In some embodiments, the remote user device comprises
`an input device for providing the target information.
`In some embodiments, the input device includes a touch-
`screen, joystick, keyboard, mouse, or stylus.
`In some embodiments, the input device includes a wear-
`able device.
`In some embodiments, the target information is provided
based on the transmitted images.
`In some embodiments, the method further comprises pro-
`viding, in substantially real-time, tracking information of the
`target to the remote user device.
`In some embodiments, the remote user device is configured
to: receive a user selection of the target from within one or
`more images displayed on the remote user device; and gen-
`erate the target information of the target based on the user
`selection of the target.
`According to another aspect of the present invention, a
method for controlling an unmanned aerial vehicle (UAV) is
`provided. The method comprises: displaying, via a display,
`one or more images captured by an imaging device coupled to
`the UAV in substantially real-time; receiving, via an input
`device, a user selection of a target from within at least one of
the one or more images being displayed in substantially real-
`time; generating target information of the target based at least
`in part on the user selection of the target; and providing the
target information to the UAV so as to allow the UAV to autono-
`mously track the target according to the target information.
`According to another aspect of the present invention, a
`system for controlling an unmanned aerial vehicle (UAV) is
`provided. The system comprises: a display configured to dis-
`play one or more images captured by an imaging device
`coupled to the UAV; an input device configured to receive a
user selection of a target from within at least one of the one or
`more images being displayed on the display; one or more
`processors, individually or collectively, configured to gener-
`ate target information of the target based at least in part on the
`user selection of the target; and a transmitter configured to
provide the target information to the UAV so as to allow the
UAV to autonomously track the target according to the target
`information.
`
`25
`
`55 (cid:9)
`
`Yuneec Exhibit 1001 Page 23
`
`(cid:9)
`
`
`US 9,164,506 B1
`
`7
`According to another aspect of the present invention, an
`apparatus for controlling an unmanned aerial vehicle (UAV)
`is provided. The apparatus comprises: a display configured to
`display one or more images captured by an imaging device
`coupled to the UAV; an input device configured to receive a
`user selection of a target from within at least one of the one or
`more images being displayed on the display; one or more
`processors, individually or collectively, configured to gener-
`ate target information of the target based at least in part on the
`user selection of the target; and a transmitter configured to
`provide the target information to the UAV so as to allow the
`UAV autonomously track the target according to the target
`information.
`In some embodiments, the target information includes ini-
`tial target information.
`In some embodiments, the initial target information
`includes an initial position or an initial size of the target
`within an image captured by the imaging device.
`In some embodiments, the initial target information is gen-
`erated based on the user selection of the target.
`In some embodiments, the target information includes tar-
`get type information.
`In some embodiments, the target type information includes
`color, texture, or pattern information.
`In some embodiments, the target type information is gen-
`erated based on the user selection of the target.
`In some embodiments, the target information includes
`expected target information.
`In some embodiments, the expected target information is
`generated based on the user selection of the target.
`In some embodiments, the expected target information
`includes an expected position or an expected size of the target
`within an image captured by the imaging device.
`In some embodiments, the target information does not
`include expected target information.
`In some embodiments, the input device includes a touch-
screen, joystick, keyboard, mouse, stylus, or wearable device.
`In some embodiments, the user selection of the target is
`achieved by a user selecting an area of the at least one of the
`one or more images being displayed on the display, the
`selected area corresponding to the target.
`In some embodiments, the user selection of the target is
`achieved by a user directly touching an area of the at least one
`of the one or more images being displayed on the display, the
`touched area corresponding to the target.
`In some embodiments, the user selects the area using a
`stylus, mouse, keyboard, or a wearable device.
`In some embodiments, selecting the area includes touch-
`ing, swiping, circling, or clicking in the area.
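As an illustration of turning such a gesture into target information, the helper below reduces the points of a touched, swiped, or circled area to a bounding box; the coordinate convention and return format are assumptions.

    # Hedged sketch: convert the points of a selection gesture into a bounding
    # box (u, v, width, height) usable as initial target information.
    def selection_to_bounding_box(points):
        """'points' is an iterable of (x, y) screen coordinates from the gesture."""
        xs = [p[0] for p in points]
        ys = [p[1] for p in points]
        u, v = min(xs), min(ys)
        return (u, v, max(xs) - u, max(ys) - v)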
`In some embodiments, the one or more processors, indi-
`vidually or collectively, are further configured to display, on
`the display, the selected target with a selection indicator in
`response to the user selection of the target, the selection
`indicator indicating that the target has been selected by the
`user.
`In some embodiments, the one or more processors, indi-
`vidually or collectively, are further configured to receive
`tracking information related to the target and, based on the
`tracking information, display the selected target with a track-
`ing indicator within one or m