`
`Petition for Inter Partes Review
`
`UNITED STATES PATENT AND TRADEMARK OFFICE
`
`_______________
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`
`_____________
`
`Yuneec International Co. LTD. and Yuneec USA Inc.,
`
`Petitioners
`
`v.
`
`SZ DJI Technology Co., LTD.
`
`Patent Owner
`
`Patent No. 9,164,506
`
`Issue Date: October 20, 2015
`
`_______________
`
`Inter Partes Review No. IPR2017-00123
`
`____________________________________________________________
`
`
`
`
`Inter Partes Review of USP 9,164,506
`
`
`Attorney Docket No.: 748710000004
`
`DECLARATION OF JONATHAN D. ROGERS
`
I, Jonathan D. Rogers, make this declaration in connection with the proceeding identified above.
`
I. INTRODUCTION

1. I have been retained by counsel for Yuneec International Co. LTD. and Yuneec USA Inc. (collectively “Petitioners” or “Yuneec”) as a technical expert in connection with the proceeding identified above. I submit this declaration in support of Yuneec’s Petition for Inter Partes Review of United States Patent No. 9,164,506 (“the ’506 patent”).
`
`2.
`
`I am being paid at an hourly rate for my work on this matter. I
`
`have no personal or financial stake or interest in the outcome of the present
`
`proceeding.
`
`II. QUALIFICATIONS
`
`3.
`
`I am an Assistant Professor in the School of Mechanical
`
`Engineering at the Georgia Institute of Technology (Georgia Tech). At Georgia
`
`Tech, I am the founder and director of the Intelligent Robotics and Emergent
`
`Automation Lab (iREAL), which is a cutting-edge robotics laboratory affiliated
`
`with Georgia Tech’s Institute for Robotics and Intelligent Machines.
`
`4.
`
`I received my M.S. and Ph.D. degrees in Aerospace Engineering
`
`from Georgia Tech in 2007 and 2009 respectively and a B.S. degree in Physics
`1
`
`
`Yuneec Exhibit 1002 Page 2
`
`
`
`Inter Partes Review of USP 9,164,506
`
`from Georgetown University in 2006. Prior to my appointment at Georgia Tech, I
`
`Attorney Docket No.: 748710000004
`
`was an Assistant Professor in the Aerospace Engineering Department at Texas
`
`A&M from 2011-2013.
`
`5.
`
`To date, I have authored or co-authored more than 50 journal and
`
`conference papers in the areas of aerospace robotics, flight control, and unmanned
`
`vehicles. My research has been funded by a variety of sources including the Army
`
`Research Office, Defense Advanced Research Projects Agency (DARPA), Air
`
`Force Research Lab, Army Research Lab (ARL), National Science Foundation
`
`(NSF), and NASA, among others. Industry sponsors have included BAE Systems,
`
`General Dynamics, and SAIC. Over the past five years, this government and
`
`industry support has resulted in over $2.5M in external funding provided to my lab
`
`to support our research activities. Graduate students who have studied under my
`
`mentorship currently work at a variety of aerospace companies or labs around the
`
`country including Boeing, L3 Unmanned Systems, the Air Force Flight Test
`
`Center, and Stanford University.
`
`6.
`
`Recently, I was the recipient of the NSF CAREER Award (2016),
`
`the Lockheed Martin Inspirational Young Faculty Award (2016), and the Army
`
`Research Office Young Investigator Award (2012). I currently serve on the
`
`Editorial Board of the IMechE Journal of Aerospace Engineering.
`
`
`
`2
`
`Yuneec Exhibit 1002 Page 3
`
`
`
`Inter Partes Review of USP 9,164,506
`
`
`Attorney Docket No.: 748710000004
`
`7.
`
`My research activities are primarily focused on aerial robotics.
`
`This field encompasses unmanned-vehicle design; guidance, navigation, and
`
`control; and flight dynamics. My research activities seek to uncover, explore,
`
`develop, and prototype novel unmanned vehicles and control algorithms in order to
`
`create a new class of aerial robots that serve in a variety of missions. In some
`
`cases, our research group focuses on design and construction of novel vehicle
`
`configurations. An example is an unmanned vehicle that exhibits hybrid
`
`locomotion – i.e., a single vehicle that can travel efficiently in air, on ground, or
`
`underwater. In other cases, we develop novel control algorithms for existing air
`
`vehicles. As an example of this line of research, our group recently developed a
`
`control algorithm that can land autonomous helicopters when the engine fails.
`
`After developing this control algorithm in simulation, we successfully flight tested
`
`it using one of our lab’s custom helicopter unmanned aerial vehicles (“UAVs”).
`
`8.
`
`While my lab has strong expertise in modeling and simulation,
`
`one area in which we are particularly experienced is in vehicle prototyping and
`
`flight testing. As a result, I have substantial knowledge regarding design of UAV
`
`hardware and flight operations. Currently, my lab operates a variety of UAVs,
`
`including two autonomous quadrotor vehicles and several helicopter UAVs of
`
`various scales. Some of these vehicles are custom built, while others are
`
`commercially-available platforms that we have modified for research purposes. In
`3
`
`
`Yuneec Exhibit 1002 Page 4
`
`
`
`Inter Partes Review of USP 9,164,506
`
`addition to the vehicles themselves, I have extensive knowledge of user control
`
`Attorney Docket No.: 748710000004
`
`interfaces (or “ground stations”) obtained through flight operations over the course
`
`of many research projects. These ground station interfaces include both
`
`commercially-available models (such as MavLink) and custom-built interfaces for
`
`our autopilots.
`
`9.
`
`I have led a number of research projects, including in the field of
`
`UAVs. For instance, I have recently led a project focused on control algorithms
`
`for autonomous UAV tracking of multiple ground targets. This research,
`
`documented in a paper in the Journal of Aerospace Information Systems,
`
`developed a new control algorithm that allows a UAV to autonomously track
`
`multiple moving ground targets simultaneously. (See N. Miller, J. Rogers,
`
`“Simultaneous Tracking of Multiple Ground Targets from a Multirotor UAV,”
`
`Journal of Aerospace Information Systems, Vol. 12, No. 3, 2015, pp. 345-364,
`
`Ex. 1014.) Through the use of an advanced optimal-control method called model
`
`predictive control, the UAV automatically adjusts its position and height to
`
`maintain all targets within its field of view at all times. We developed two
`
`separate control laws: one for a UAV with a gimbaled camera (that allows
`
`rotational motion of the camera along two axes), and one for the case of a fixed
`
`camera mounted rigidly to the aircraft. This research was a significant
`
`advancement over prior work in this area in that, rather than considering only a
`4
`
`
`Yuneec Exhibit 1002 Page 5
`
`
`
`Inter Partes Review of USP 9,164,506
`
`single target, our control laws were developed to simultaneously track multiple
`
`Attorney Docket No.: 748710000004
`
`targets. This research was also the subject of a Masters thesis documenting these
`
`novel control algorithms.1
`
`10.
`
`Through the course of my teaching and research activities I can be
`
`considered as an expert in a number of areas, including UAV control algorithms,
`
`development and use of user interfaces for UAVs, and image/video processing
`
`algorithms. In the area of UAV control algorithms, I have published numerous
`
`scientific papers describing novel optimal tracking and control algorithms (see
`
`above as examples), with external research funding provided by NASA and ARL.
`
`I have furthermore gained expertise in UAV user interfaces through both my
`
`extensive use of commercial interfaces and our construction of custom user
`
`interfaces for our lab’s UAVs. Finally, in the area of image/video processing, I
`
`have worked on several projects involving image-based navigation and flight
`
`control for autonomous vehicles. Through this work I have become very familiar
`
`with common feature recognition algorithms (such as SURF) and image-based
`
`navigation algorithms (such as SLAM). I recently authored a paper involving
`
`image-based control of a guided munition. (F. Fresconi, J. Rogers, “Flight Control
`
`
`
` 1
`
` N. Miller, J. Rogers, “Simultaneous Tracking of Multiple Ground Targets from a
`Single Multirotor UAV,” AIAA Atmospheric Flight Mechanics Conference,
`Atlanta, GA, June 16-20, 2014. (Ex. 1015.)
`5
`
`
`Yuneec Exhibit 1002 Page 6
`
`
`
`Inter Partes Review of USP 9,164,506
`
`of a Small Diameter Spin-Stabilized Projectile Using Imager Feedback,” Journal of
`
`Attorney Docket No.: 748710000004
`
`Guidance, Control, and Dynamics, Vol. 38, No. 2, 2015, pp. 181-191, Ex. 1016.)
`
`III. MATERIALS CONSIDERED
`
`11.
`
`In preparing this declaration, I have reviewed, among other things,
`
`the ’506 patent and its file history, U.S. Patent No. 9,367,067 to Gilmore et al.
`
`(including the provisional application, application no. 61/800,201), U.S. Patent
`
`Publication No. US2012/0287274 A1 to Bevirt (including the provisional
`
`application, application No. 61/476,767), U.S. Patent No. 7,970,507 to Fregene et
`
`al., and U.S. Patent Publication No. US2015/0350614 A1 to Meier et al. (including
`
`the provisional application, application no. 62/007,311).
`
`12.
`
`I have also reviewed Sections 2141 and 2143 of the Manual of
`
`Patent Examining Procedure.
`
`IV. DEFINITIONS AND STANDARDS
`
`13.
`
`I have been informed and understand that claims are construed
`
`from the perspective of one of ordinary skill in the art at the time of the claimed
`
`invention, and that during inter partes review, claims are to be given their broadest
`
`reasonable construction consistent with the specification.
`
`14.
`
`I have also been informed and understand that the subject matter
`
`of a patent claim is obvious if the differences between the subject matter of the
`
`claim and the prior art are such that the subject matter as a whole would have been
`6
`
`
`Yuneec Exhibit 1002 Page 7
`
`
`
`Inter Partes Review of USP 9,164,506
`
`obvious at the time the invention was made to a person having ordinary skill in the
`
`Attorney Docket No.: 748710000004
`
`art to which the subject matter pertains. I have also been informed that the
`
`framework for determining obviousness involves considering the following
`
`factors: (i) the scope and content of the prior art; (ii) the differences between the
`
`prior art and the claimed subject matter; (iii) the level of ordinary skill in the art;
`
`and (iv) any objective evidence of non-obviousness. I understand that the claimed
`
`subject matter would have been obvious to one of ordinary skill in the art if, for
`
`example, it results from the combination of known elements according to known
`
`methods to yield predictable results, the simple substitution of one known element
`
`for another to obtain predictable results, use of a known technique to improve
`
`similar devices in the same way or applying a known technique to a known device
`
`ready for improvement to yield predictable results. I have also been informed that
`
`the analysis of obviousness may include recourse to logic, knowledge, judgment
`
`and common sense available to the person of ordinary skill in the art that does not
`
`necessarily require explication in any reference.
`
`15.
`
`In my opinion, a person of ordinary skill in the art pertaining to
`
`the ’506 patent at the relevant date discussed below would have at least a
`
`Bachelor’s degree in aerospace or electrical engineering and approximately five
`
`years of industry related experience to UAVs, including relevant experience in
`
`UAV control algorithms, development and use of user interfaces for UAVs and
`7
`
`
`Yuneec Exhibit 1002 Page 8
`
`
`
`Inter Partes Review of USP 9,164,506
`
`image/video-processing algorithms. I reach this opinion based on a number of
`
`Attorney Docket No.: 748710000004
`
`factors, including the sophistication of the technology and the types of problems
`
`encountered in this art, such as discussed in the “Background of the Technology”
`
`section below.
`
`16.
`
`I have been informed that the relevant date for considering the
`
`patentability of the claims of the ’506 patent is July 30, 2014, which is the earliest
`
`filing date to which the claims are entitled. I have not analyzed whether the claims
`
`of the ’506 patent are entitled to this filing date, but I have analyzed obviousness as
`
`of that date. I may refer to this time frame as the “relevant date” or the “relevant
`
`time frame.” Based on my education and experience in the field of Aerospace
`
`Engineering set forth above, I believe I am more than qualified to provide opinions
`
`about how one of ordinary skill in the art by the relevant date in 2014 would have
`
`interpreted and understood the ’506 patent, its claims, and the prior art discussed
`
`below.
`
`17.
`
`I set forth a few examples of the kinds of skills one of ordinary
`
`skill would have at the relevant date, without intending to list every such skill.
`
`Such a person would have understood UAV control algorithms, development and
`
`use of user interfaces for UAVs, and image/video processing algorithms.
`
`
`
`8
`
`Yuneec Exhibit 1002 Page 9
`
`
`
`Inter Partes Review of USP 9,164,506
`
`V. BACKGROUND OF THE TECHNOLOGY
`
`Attorney Docket No.: 748710000004
`
`18.
`
`I understand that the obviousness inquiry requires consideration of
`
`common knowledge. By 2014, autonomous target tracking via unmanned aerial
`
`vehicles was well known. The first autonomous target tracking algorithms (and
`
`supporting technologies) were developed by U.S. military contractors for
`
`implementation on large fixed-wing UAVs controlled via satellite link, such as for
`
`example the RQ-4 Global Hawk. These UAVs were originally designed to be
`
`controlled by two human operators – one to fly the aircraft, and one to track the
`
`ground target via manual control of the onboard gimbal (which points the camera
`
`with respect to the aircraft). The requirement for two operators to be fully engaged
`
`in flying the aircraft and manually tracking a target proved to be heavily
`
`burdensome and reduced the number of aircraft that could be deployed (and thus
`
`targets that could be tracked) at any one time. To address this, researchers in the
`
`defense community developed autonomous target tracking algorithms primarily
`
`designed for fixed-wing UAVs that allowed for fully autonomous flight control
`
`and target tracking. These algorithms determine a proper aircraft flight path and
`
`camera pointing angles based on recognition of the target over a sequence of
`
`images, thereby eliminating the need for a human operator to manually manipulate
`
`the aircraft and camera controls. Several patents and publications describe
`
`
`
`9
`
`Yuneec Exhibit 1002 Page 10
`
`
`
`Inter Partes Review of USP 9,164,506
`
`example implementations of these autonomous control algorithms, including for
`
`Attorney Docket No.: 748710000004
`
`military systems.2
`
`19.
`
`The integrated technologies required to build UAVs with
`
`autonomous tracking capability cover three distinct areas: tracking control
`
`algorithms, command and control user interfaces, and image processing. The first
`
`of these, tracking control algorithms, addresses the actual feedback control
`
`methods used by the vehicle to maintain target position and/or size in the camera
`
`field of view throughout flight. These algorithms operate as follows. At a single
`
`control cycle, the target is identified in one or more digital images using an image
`
`recognition algorithm. Based on the size, location, and/or orientation of the target
`
`in the image, control inputs are generated for the UAV, camera gimbal mount, or
`
`both to drive the target toward a desired location and/or size within the image.
`
`This process is then repeated at a regular update rate (for instance, 10 Hz).
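By way of illustration only, the following Python sketch captures the control cycle just described. It is not taken from the ’506 patent or from any reference cited herein; the interfaces (grab_frame, detect_target, send_commands), the gain, and the desired values are hypothetical placeholders.

```python
# Illustrative sketch of the generic tracking control cycle described above.
# All interfaces are injected and hypothetical: grab_frame() stands in for a
# camera, detect_target() for an image recognition algorithm, and
# send_commands() for the UAV/gimbal command interface.
import time
from typing import Callable, Optional, Tuple

Detection = Tuple[float, float, float]   # target (u, v, size) in the image

def run_tracking_loop(grab_frame: Callable[[], object],
                      detect_target: Callable[[object], Optional[Detection]],
                      send_commands: Callable[[float, float, float], None],
                      desired: Detection = (320.0, 240.0, 100.0),
                      rate_hz: float = 10.0,
                      kp: float = 0.005) -> None:
    u0, v0, s0 = desired
    while True:
        frame = grab_frame()               # 1. capture the current image
        detection = detect_target(frame)   # 2. identify the target in the image
        if detection is not None:
            u, v, s = detection
            # 3./4. control inputs proportional to the deviation between the
            # current and desired target location/size within the image
            send_commands(kp * (u0 - u), kp * (v0 - v), kp * (s0 - s))
        time.sleep(1.0 / rate_hz)          # 5. repeat at the update rate (e.g., 10 Hz)
```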
`
`20.
`
`Numerous different target tracking algorithms have been
`
`developed prior to 2014. For fixed-wing aircraft, the autonomous tracking
`
`problem becomes rather complex due to the minimum speed limitation of fixed-
`
`wing vehicles. From a control algorithms standpoint, this speed limitation makes
`
`the problem somewhat interesting and nontrivial to solve, as the vehicle must
`
`
`
` 2
`
`
`
` U.S. Patent 7,970,507 (Fregene et al., “Fregene,” Ex. 1008); U.S. Patent
`10
`
`Yuneec Exhibit 1002 Page 11
`
`
`
`Inter Partes Review of USP 9,164,506
`
`ensure that the target remains visible even if the target velocity is less than the
`
`Attorney Docket No.: 748710000004
`
`minimum linear speed of the UAV. This necessitates the use of periodic flight
`
`patterns (such as the “weave” pattern proposed by Kokkeby) that must be
`
`generated by the control algorithm in real time. Due to the interesting nature of
`
`this control problem and its wide applicability to fixed-wing aircraft, numerous
`
`authors have proposed solutions to the fixed-wing target tracking problem
`
`including Rafi et al.3, Dobrokhodov et al.4, Regina et al.5, Fregene, and Kokkeby,
`
`among others.
`
`21.
`
`When considering rotary-wing vehicles, the tracking problem
`
`becomes noticeably simpler due to the absence of a minimum speed limitation.
`
`Example autonomous tracking algorithms for rotorcraft vehicles are provided by
`
`
`
`2009/0157233 (Kokkeby et al., “Kokkeby,” Ex. 1017).
`3 F. Rafi, S. Khan, K. Shafiq, M. Shah, “Autonomous Target Following by
`Unmanned Aerial Vehicles,” Proceedings of the SPIE 6230, Unmanned Systems
`Technology VIII, 623010, May 9, 2006. (Ex. 1018.)
`
` 4
`
` V. Dobrokhodov, I. Kaminer, K. Jones, R. Ghabcheloo, “Vision-Based Tracking
`and Motion Estimation for Moving Targets Using Small UAVs,” 2006 American
`Control Conference, Minneapolis, MN, June 2006. (Ex. 1019.)
`
` 5
`
` N. Regina, M. Zanzi, “Fixed-Wing UAV Guidance Law for Surface-Target
`Tracking and Overflight,” 2012 IEEE Aerospace Conference, Piscataway, NJ,
`March 2012. (Ex. 1020.)
`
`
`
`11
`
`Yuneec Exhibit 1002 Page 12
`
`
`
`Inter Partes Review of USP 9,164,506
`
`Gomez-Balderas et al.6 and Kim and Shim7, among others. In light of the
`
`Attorney Docket No.: 748710000004
`
`extensive prior work in target tracking algorithms over the past two decades prior
`
`to 2014, this area is considered to be quite well-explored and there are a variety of
`
`common algorithms that are typically employed on commercially-available UAVs.
`
`In particular, GPS-based target tracking algorithms are often used in commercially-
`
`available UAVs due to the relative simplicity of obtaining target position
`
`information (as compared to vision-based tracking). In one approach, GPS-based
`
`tracking algorithms work by computing the current relative position offset between
`
`the UAV and target by comparing the GPS positions of each. Then, an error signal
`
`is computed as the difference between the current relative position offset, and a
`
`desired relative position offset provided by the user. The UAV control inputs are
`
`then adjusted so as to drive this error to zero. Some examples of UAV target
`
`tracking controllers that use GPS feedback are provided by Kokkeby and Gilmore.
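The GPS-based approach described above can be sketched as follows. This is an illustration only, not drawn from Kokkeby, Gilmore, or the ’506 patent; the flat-earth offset approximation, the proportional gain, and all names are simplifying assumptions of my own.

```python
# Illustrative sketch (hypothetical names; flat-earth approximation) of the
# GPS-based tracking approach described above: compare UAV and target GPS
# fixes, form an error against the user's desired offset, and command a
# velocity that drives that error to zero.
import math

EARTH_RADIUS_M = 6371000.0

def gps_offset_m(uav_fix, target_fix):
    """(north, east) offset in meters from the UAV to the target, given
    (latitude, longitude) fixes in degrees; valid for small separations."""
    lat0, lon0 = uav_fix
    lat1, lon1 = target_fix
    north = math.radians(lat1 - lat0) * EARTH_RADIUS_M
    east = math.radians(lon1 - lon0) * EARTH_RADIUS_M * math.cos(math.radians(lat0))
    return north, east

def gps_tracking_step(uav_fix, target_fix, desired_offset, kp=0.5):
    """One control cycle: error = current offset minus desired offset;
    the returned (north, east) velocity command drives the error to zero."""
    cur_n, cur_e = gps_offset_m(uav_fix, target_fix)
    err_n = cur_n - desired_offset[0]
    err_e = cur_e - desired_offset[1]
    return kp * err_n, kp * err_e   # proportional velocity command, m/s

# Example: maintain a standoff with the target 10 m north of the UAV
vn, ve = gps_tracking_step((37.0000, -122.0000), (37.0001, -122.0000), (10.0, 0.0))
```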
`
`22.
`
`By 2014, a multitude of user-interface (UI) products were
`
`available to support operation of UAVs. The main purpose of these UIs is to
`
`
`
` 6
`
` J.-E. Gomez-Balderas, G. Flores, L.-R. Garcia Carrillo, R. Lozano, “Tracking a
`Ground Moving Target with a Quadrotor Using Switching Control,” International
`Conference on Unmanned Aircraft Systems (ICUAS 2012), Philadelphia, PA, June
`2012. (Ex. 1021.)
`
`
`
`12
`
`Yuneec Exhibit 1002 Page 13
`
`
`
`Inter Partes Review of USP 9,164,506
`
`provide a way to efficiently task and monitor UAVs, even by inexperienced pilots
`
`Attorney Docket No.: 748710000004
`
`or operators. By 2014, UIs were capable of communicating wirelessly with the
`
`subject aircraft through so-called telemetry. Depending on the desired range or
`
`reliability required, wireless communication is implemented via WiFi (2.4 GHz),
`
`radio frequency (900 MHz), or even satellite link. For UAVs equipped with
`
`onboard video, UIs also usually include the ability to view a real-time video feed
`
`from the aircraft. In some cases, this video feed may be used to select one or more
`
`targets for autonomous tracking by the UAV control system. Numerous
`
`commercially-available UIs have been developed that support a broad range of
`
`aircraft models.
`
`23.
`
`A final technology area relevant here is real-time image
`
`processing. Any autonomous tracking control algorithm must be able to identify,
`
`in real time, the target position with respect to the aircraft to facilitate real-time
`
`control. One convenient method for doing this that does not require any additional
`
`sensors (beyond the onboard camera used for tracking) is through the use of feature
`
`or object recognition. The basic premise of feature recognition is that once a
`
`certain pattern of pixels is identified by the user as the “target,” this pattern can be
`
`
`
` 7
`
` J. W. Kim, D. Shim, “A Vision-based Target Tracking Control System of a
`Quadrotor by using a Tablet Computer,” International Conference on Unmanned
`Aircraft Systems (ICUAS 2013), Atlanta, GA, May 2013. (Ex. 1022.)
`13
`
`
`Yuneec Exhibit 1002 Page 14
`
`
`
`Inter Partes Review of USP 9,164,506
`
`identified and located in subsequent images. However, as the target moves with
`
`Attorney Docket No.: 748710000004
`
`respect to the aircraft, this pattern can become distorted due to changes in the
`
`relative position and orientation of the target with respect to the aircraft (so-called
`
`pose changes). To address this, feature-recognition algorithms have been
`
`developed that can heavily mitigate the effects of pose or illumination changes.
`
`This is referred to in the computer vision field as scale- and orientation-invariance.
`
`Since the early 2000’s computer vision, and feature recognition in particular, has
`
`become a highly active area of research. Historically, feature recognition has been
`
`known as a particularly burdensome computational process. Due to extensive
`
`research in this area a suite of readily-available algorithms is now available that
`
`can perform object recognition in real time even on low cost embedded computers.
`
`Some examples of common feature recognition algorithms include the Scale
`
`Invariant Feature Transform (SIFT)8, the Speeded-Up Robust Features (SURF)
`
`transform,9 and the Histogram of Oriented Gradients (HOG) method.10 Open
`
`
`
` 8
`
` D. Lowe, “Object Recognition from Local Scale-Invariant Features,” Proceedings
`of the 1999 International Conference on Computer Vision, pp. 1150-1157.
`(Ex. 1023.)
`
` 9
`
` H. Bay, A. Ess, T. Tuytelaars, L. Van Gool, “SURF: Speeded Up Robust
`Features,” Computer Vision and Image Understanding (CVIU), Vol. 110, No. 3,
`pp. 346–359, 2008. (Ex. 1024.)
`
`
`
`14
`
`Yuneec Exhibit 1002 Page 15
`
`
`
`Inter Partes Review of USP 9,164,506
`
`source implementations of many of these algorithms are available for commercial
`
`Attorney Docket No.: 748710000004
`
`use as part of the well-known OpenCV software package.11
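As an illustration of the feature-recognition step described above, the following sketch locates a user-selected target patch in a new frame using OpenCV. ORB is used here simply because it ships with the stock OpenCV package; SIFT and SURF follow the same detect/describe/match pattern. The function name and the centroid heuristic are hypothetical, not taken from any cited reference.

```python
# Illustrative sketch of feature-based target recognition with OpenCV.
# locate_target() and the centroid heuristic are hypothetical; ORB is a
# scale- and rotation-invariant detector included in stock OpenCV.
import cv2
import numpy as np

def locate_target(target_patch, frame, min_matches=10):
    """Estimate the (u, v) image position of a user-selected target patch
    in a new frame, or return None if too few features match."""
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(target_patch, None)  # features in the target
    kp2, des2 = orb.detectAndCompute(frame, None)         # features in the frame
    if des1 is None or des2 is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    if len(matches) < min_matches:
        return None
    # Centroid of the best-matched keypoints approximates the target position
    pts = np.float32([kp2[m.trainIdx].pt for m in matches[:min_matches]])
    return pts.mean(axis=0)   # (u, v) estimate fed to the tracking controller
```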
`
`24.
`
`Another well-known aspect of UAV target tracking systems
`
`involves the dynamic allocation of decision-making and control between the user
`
`and the autonomous vehicle. Target tracking is a complex process involving
`
`numerous decisions that have to be made repeatedly at a very high rate. These
`
`decisions and control computations can be divided between the user and computer
`
`processors onboard the UAV in an arbitrary number of ways. Accordingly, there
`
`are varying levels of control that a human operator can exert over the tracking
`
`process – from manual control (in which the operator physically steers the vehicle),
`
`to high level flight path guidance (in which the user provides a suggested vehicle
`
`trajectory), to complete autonomy for the UAV (in which the user allows the UAV
`
`to make all decisions). In the early 2000’s, researchers in the UAV field realized
`
`that the degree of control that human operators want, or are capable of executing,
`
`is highly dependent on the workload of the operator, the type of user interface
`
`
`
`10 N. Dalal, B. Triggs, “Histograms of Oriented Gradients for Human Detection,”
`Proceedings of the 2005 Conference on Computer Vision and Pattern Recognition,
`pp. 886-893. (Ex. 1025.)
`11 Itseez Developer Team, “OpenCV: Open Source Computer Vision,”
`http://opencv.org/.
`
`
`
`15
`
`Yuneec Exhibit 1002 Page 16
`
`
`
`Inter Partes Review of USP 9,164,506
`
`available, and the complexity of the environment through which the UAV is flying,
`
`Attorney Docket No.: 748710000004
`
`among other factors. Because at least some of these factors may change over the
`
`course of a mission, the right approach for many systems is to implement “adaptive
`
`autonomy,” in which control over the tracking process may shift in a dynamic way
`
`from the user to the UAV and back again as operator workload or external
`
`parameters change. These adaptive autonomy methods were well-known in the
`
`robotics field as of 2014 and have been the subject of numerous research papers in
`
`the domain of human-robot interface. For example, a thorough review of adaptive
`
`autonomy methods, also sometimes called dynamic function allocation, as of 2001
`
`is provided in the review paper by Kaber et al.12
`
`VI. THE ’506 PATENT
`
`25.
`
`The ’506 patent is directed to systems and methods for
`
`autonomous tracking of a moving target by an unmanned aerial vehicle. (See, e.g.,
`
`Abstract.) The patent is specifically related to a UAV that is equipped with an
`
`imaging device. (See, e.g., FIG. 1.) The imaging device (e.g., a video camera) is
`
`assumed to either be fixed rigidly to the UAV, or mounted via a gimbal, such that
`
`the camera has freedom of movement with respect to the aircraft body, such as
`
`
`
`12 D. Kaber, J. Riley, K.-W. Tan, M. Endsley, “On the Design of Adaptive
`Automation for Complex Systems,” International Journal of Cognitive
`Ergonomics, Vol. 5, No. 1, 2001, pp. 37-57. (Ex. 1026.)
`16
`
`
`Yuneec Exhibit 1002 Page 17
`
`
`
`Inter Partes Review of USP 9,164,506
`
`rotational degrees of freedom. (Id.) The UAV is operated by a user who specifies
`
`Attorney Docket No.: 748710000004
`
`the target to be tracked via a user interface (UI) or some type of ground-based
`
`control station that transmits this target data to the UAV. The UAV receives the
`
`target information and executes autonomous flight in which the target is tracked.
`
`(See, e.g., 1:35-2:19.) The ’506 patent describes that “[a]n active target may be
`
`configured to transmit information about the target, such as the target’s GPS
`
`location, to the movable object.” (12:30-33.) I understand that the Patent Owner
`
`has stated that the “target information may also include ‘the target’s GPS
`
`location,’” and cited the passage at Col. 12, lines 30-33 of the ’506 patent.
`
`(Ex. 1011 at 3.)
`
`26.
`
`FIG. 6 (below, left image) from the ’506 patent illustrates a
`
`desired target position (u0, v0) and a current target position (u, v) (below, left
`
`image) in the camera image. FIG. 7 (below, right image) illustrates a desired target
`
`size (s0 or s1) and actual target size (s). The differences between the actual image
`
`position and/or size, and the desired position and/or size forms the error value from
`
`which control inputs are then computed.
`
`
`
`17
`
`Yuneec Exhibit 1002 Page 18
`
`
`
`Inter Partes Review of USP 9,164,506
`
`
`Attorney Docket No.: 748710000004
`
`
`
`
`
`27.
`
`At regular control update intervals, the deviation between the
`
`desired and actual image parameters described above are computed, and the
`
`aircraft computes control inputs to modify its flight path and/or gimbal
`
`configuration in an attempt to minimize these deviations. (See, e.g., 24:46-63.)
`
`The control inputs take the form of pitch and roll angular velocity commands to the
`
`UAV. (Id.) These act to change the UAV orientation, and also to induce
`
`translational velocity of the aircraft that results in a reduction in error of the
`
`imaging parameters. (Id.) Alternatively, or in addition, control inputs may also
`
`take the form of angular velocity commands to the gimbal which result in a
`
`reduction in error of the imaging parameters. (Id.)
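For illustration only, the following sketch shows a simple proportional mapping from image-space deviations to angular-velocity commands of the general kind described above. This is not the control law of the ’506 patent; the gains, axis assignments, and sign conventions are all assumed.

```python
# Illustrative sketch only (assumed gains, axes, and signs): map deviations
# between current (u, v, s) and desired (u0, v0, s0) image parameters to
# angular-velocity commands for the UAV and gimbal, as described above.
def compute_rate_commands(u, v, s, u0=320.0, v0=240.0, s0=100.0,
                          k_roll=0.004, k_pitch=0.01, k_gimbal=0.004):
    """Return (roll_rate, pitch_rate, gimbal_pitch_rate) in rad/s."""
    err_u = u0 - u        # horizontal deviation in the image
    err_v = v0 - v        # vertical deviation in the image
    err_s = s0 - s        # size deviation (a proxy for range to target)
    roll_rate = k_roll * err_u            # roll induces lateral translation
    pitch_rate = k_pitch * err_s          # pitch induces fore/aft translation
    gimbal_pitch_rate = k_gimbal * err_v  # gimbal absorbs vertical image error
    return roll_rate, pitch_rate, gimbal_pitch_rate

# Example: target right of image center and smaller than desired
rates = compute_rate_commands(u=400.0, v=240.0, s=80.0)
```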
`
`28.
`
`The resulting system of the ’506 patent purportedly allows the
`
`UAV to perform target tracking in a fully autonomous fashion, with the user
`
`providing only target information. The UAV may then capture, store, and/or
`
`transmit streaming video of the target to the user throughout the autonomous
`
`
`
`18
`
`Yuneec Exhibit 1002 Page 19
`
`
`
`Inter Partes Review of USP 9,164,506
`
`tracking session. An overall notional tracking process is depicted in FIG. 18
`
`Attorney Docket No.: 748710000004
`
`(shown below) from the ’506 patent. (See also 51:4-37.) In FIG. 18, the user
`
`wishes to be tracked while, for instance, running. The user first selects
`
`himself/herself via the user interface as the target to be tracked. Once this
`
`information is received by the UAV, the aircraft then purportedly autonomously
`
`tracks the user while the user is running, keeping the user inside the camera field of
`
`view at all times.
`
`
`
`29.
`
`According to the specification, a target can be explicitly identified
`
`as a set of pixels on the streamed image via the user interface (for instance, by
`
`circling the pixels with a stylus or finger). (See, e.g., 37:18-24.) Alternatively, the
`
`user can provide the UAV with target information in the form of a more general
`
`
`
`19
`
`Yuneec Exhibit 1002 Page 20
`
`
`
`Inter Partes Review of USP 9,164,506
`
`description of the target – i.e., target color, pattern, or type information. (See
`
`Attorney Docket No.: 748710000004
`
`37:41-54.) Image processing hardware onboard the UAV, notionally using some
`
`type of automatic target recognition (ATR) algorithm, can then attempt to identify
`
`the target within the image. Once the target is identified, errors between the
`
`current and desired target position and/or size can be determined for control
`
`computation. The determination of whether to adjust the UAV, gimbal, or both,
`
`and the magnitude of these adjustments, may be a function of certain constraints.
`
`These constraints may include the configuration or settings of the UAV and the
`
`camera gimbal. (15:1-48.) For example, an adjustment that involves rotation
`
`around two axes may be achieved solely by a corresponding rotation of the UAV
`
`around two axes if the camera is fixed to the UAV, according to the ’506 patent.
`
`(15:9-14.) The constraints may include the navigation path of the UAV. (15:49-
`
`59.) The ’506 patent refers to an example of a predetermined navigation path.
`
`(Id.) These constraints may include maximum and minimum limits for rotation
`
`angles, angular speed values, or other parameters. (15:60-16:2.)
`
`30.
`
`The ’506 patent discloses that it may be desirable to limit the
`
`angular velocity commands to the UAV or gimbal to maximum amounts (a term
`
`referred to in engineering as “saturation”). (See 15:60-16:15.) The ’506
`
`specification states that when control commands are computed, they may be
`
`compared to saturation limits, and if they exceed saturation limits, the maximum
`20
`
`
`Yuneec Exhibit 1002 Page 21
`
`
`
`Inter Partes Review of USP 9,164,506
`
`allowable control commands will be given instead. (Id.) If this occurs, the
`
`Attorney Docket No.: 748710000004
`
`specification of the ’506 patent states that a “warning” may be provided to the user
`
`via the UI. Note that these saturation limits are not limited to vehicle and gimbal
`
`motion commands, but may also extend to camera parameters such as zoom, field
`
`of view, etc.
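A minimal sketch of such saturation logic follows, for illustration only; the clamping rule and the warning flag are generic, and the names are hypothetical rather than taken from the ’506 patent.

```python
# Illustrative sketch (hypothetical names) of command saturation: commands
# exceeding the limit are clamped to the maximum allowable value, and a flag
# indicates that a warning should be shown to the user via the UI.
def saturate(command: float, limit: float):
    """Clamp command to [-limit, +limit]; report whether clamping occurred."""
    clamped = max(-limit, min(limit, command))
    return clamped, clamped != command

# Example: a computed 2.0 rad/s yaw-rate command against a 1.5 rad/s limit
cmd, warn = saturate(2.0, 1.5)   # cmd == 1.5, warn == True -> warn the user
```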
`
`31.
`
`The ’506 specification also discloses that the UI has the capability
`
`to receive images or a video stream from the UAV, potentially in real-time.
`
`(16:24-27.) During initialization, the target may be selected by, for instance,
`
`circling the target with a finger (on a touchpad display), or selecting f