US 20090157233A1

(19) United States
(12) Patent Application Publication     (10) Pub. No.: US 2009/0157233 A1
     Kokkeby et al.                     (43) Pub. Date: Jun. 18, 2009

(54) SYSTEM AND METHODS FOR AUTONOMOUS TRACKING AND SURVEILLANCE

(76) Inventors: Kristen L. Kokkeby, Corona, CA (US); Robert P. Lutter,
     Tacoma, WA (US); Michael L. Munoz, Tacoma, WA (US); Frederick W.
     Cathey, Seattle, WA (US); David J. Hilliard, Shoreline, WA (US);
     Trevor L. Olson, Seattle, WA (US)

Correspondence Address:
KLEIN, O'NEILL & SINGH, LLP
43 CORPORATE PARK, SUITE 204
IRVINE, CA 92606 (US)

(21) Appl. No.: 11/956,711

(22) Filed: Dec. 14, 2007

Publication Classification

(51) Int. Cl.
     G05D 1/00 (2006.01)
(52) U.S. Cl. ..... 701/3

(57) ABSTRACT

A system and methods for autonomously tracking and simultaneously providing surveillance of a target from air vehicles. In one embodiment the system receives inputs from outside sources, creates tracks, identifies the targets and generates flight plans for unmanned air vehicles (UAVs) and camera controls for surveillance of the targets. The system uses predictive algorithms and aircraft control laws. The system comprises a plurality of modules configured to accomplish these tasks. One embodiment comprises an automatic target recognition (ATR) module configured to receive video information, process the video information, and produce ATR information including target information. The embodiment further comprises a multi-sensor integrator (MSI) module configured to receive the ATR information, an air vehicle state input and a target state input, process the inputs and produce track information for the target. The embodiment further comprises a target module configured to receive the track information, process the track information, and produce predicted future state target information. The embodiment further comprises an ownship module configured to receive the track information, process the track information, and produce predicted future state air vehicle information. The embodiment further comprises a planner module configured to receive the predicted future state target information and the predicted future state air vehicle information and generate travel path information including flight and camera steering commands for the air vehicle.

[Representative front-page drawing: process flow of FIG. 7]

RECEIVE AIRCRAFT VIDEOS (S118)
RECEIVE AIRCRAFT STATE AND TARGET STATE (S120)
INTEGRATE TARGET AND AIRCRAFT DATA (S122)
GENERATE TARGET FILE (S124)
PREDICT UAV AND TARGET POSITIONS (S126)
PREDICT FUTURE UAV STATES AND REVISE UAV PLAN, WHEN NEEDED (S128)
VERIFY PLAN VALIDITY (S130)
GENERATE CAMERA CONTROL AND UAV NAVIGATION COMMANDS (S132)

[Drawing Sheet 2 of 6: FIG. 2, a loiter circle around a stationary target; FIG. 3, a weave plan behind a moving target 72, 80; FIG. 4, a chase plan behind a fast moving target 72]

[Drawing Sheet 3 of 6: FIG. 5, a plot illustrating smoothing of noisy tracking data]

SYSTEM AND METHODS FOR AUTONOMOUS TRACKING AND SURVEILLANCE

BACKGROUND

[0001] 1. Technical Field

[0002] The present disclosure relates to control of unmanned air vehicles (UAVs), tracking of moving targets and surveillance of areas, stationary targets and moving targets.

[0003] 2. Description of Related Art

[0004] Aerial surveillance and tracking includes the use of unmanned air vehicles. Currently human operators remotely control UAVs. The operators must steer both the UAV and the camera/surveillance payload in order to maintain tracking and positive identification of a moving target. Positive identification may require no interruptions or obstructions in visual observation of the target. This practice is labor intensive, and therefore expensive. Usually two operators track a single target, enabling one operator to control flight and the other operator to control camera pointing, focus, zoom, etc. And in military applications involving high value targets, such as known terrorists, usually two UAVs are dedicated to the target, thus requiring four operators. Remotely controlling UAVs with human operators is also prone to loss of positive identification due to bad vehicle position or bad camera angle. Current methods also do not adequately support real time collection of target attribute data. In addition, the operators must pay special attention to no fly zones, restricted airspace and obstructions, further increasing the difficulty of maintaining an uninterrupted track.

SUMMARY

[0005] The embodiments of the present system and methods for autonomous tracking and surveillance have several features, no single one of which is solely responsible for their desirable attributes. Without limiting the scope of the present embodiments as expressed by the claims that follow, their more prominent features now will be discussed briefly. After considering this discussion, and particularly after reading the section entitled "Detailed Description", one will understand how the features of the present embodiments provide advantages, which include a reduction in the number of human operators needed to operate the system, which in turn translates into cost savings, a reduction in the likelihood that tracked targets will be lost, a decrease in the risk that UAVs will be lost due to crashes/collisions, and a decrease in the risk that UAVs will enter no fly zones.

[0006] One aspect of the present system and methods for autonomous tracking and surveillance includes the realization that current systems for tracking and surveillance are heavily dependent upon human operators. This dependence upon humans is costly, and subject to losses of target/track data due to bad vehicle position or bad camera angle. Human error is frequently to blame for these losses. Accordingly, a system and methods for automating surveillance, targeting and tracking functions would save costs and reduce errors.

[0007] One embodiment of the present system for autonomously tracking a target from an air vehicle comprises an automatic target recognition (ATR) module configured to receive video information, process the video information, and produce ATR information including target information. The system further comprises a multi-sensor integrator (MSI) module configured to receive the ATR information, an air vehicle state input and a target state input, process the inputs and produce track information for the target. The system further comprises a target module configured to receive the track information, process the track information, and produce predicted future state target information. The system further comprises an ownship module configured to receive the track information, process the track information, and produce predicted future state air vehicle information. The system further comprises a planner module configured to receive the predicted future state target information and the predicted future state air vehicle information and generate travel path information including flight and camera steering commands for the air vehicle.

[0008] One embodiment of the present methods of autonomously tracking a target from an airborne vehicle comprises the steps of receiving video information input to an automatic target recognition (ATR) module, processing the video information, and producing ATR information. The method further comprises the steps of receiving the ATR information, air vehicle state information and target state information as inputs to a multi-sensor integrator (MSI), processing the inputs and producing track information. The method further comprises the steps of receiving the track information as an input to a target module, processing the track information, predicting a future state of the target and producing target information. The method further comprises the steps of receiving the track information as an input to an ownship module, processing the track information, predicting a future state of the air vehicle and producing ownship information. The method further comprises the steps of receiving the target information and the ownship information as inputs to a planner module and generating a travel path for the air vehicle.

[0009] Another embodiment of the present system for autonomously tracking a target from an air vehicle comprises means for receiving video information, processing the video information, and producing automatic target recognition (ATR) information including target information. The system further comprises means for receiving the ATR information, an air vehicle state input and a target state input, processing the inputs and producing track information for the target. The system further comprises means for receiving the track information, processing the track information, and producing predicted future state target information. The system further comprises means for receiving the track information, processing the track information, and producing predicted future state air vehicle information. The system further comprises means for receiving the predicted future state target information and the predicted future state air vehicle information and generating travel path information including flight and camera steering commands for the air vehicle.

[0010] The features, functions, and advantages of the present embodiments can be achieved independently in various embodiments, or may be combined in yet other embodiments.

BRIEF DESCRIPTION OF THE DRAWINGS

[0011] The embodiments of the present system and methods for autonomous tracking and surveillance now will be discussed in detail with an emphasis on highlighting the advantageous features. These embodiments depict the novel and non-obvious system and methods shown in the accompanying drawings, which are for illustrative purposes only. These drawings include the following figures, in which like numerals indicate like parts:

[0012] FIG. 1 is a functional block diagram of one embodiment of the present system and methods for autonomous tracking and surveillance;

[0013] FIG. 2 is a schematic view of a loiter circle according to one embodiment of the present system and methods for autonomous tracking and surveillance;

[0014] FIG. 3 is a schematic view of a weave plan according to one embodiment of the present system and methods for autonomous tracking and surveillance;

[0015] FIG. 4 is a schematic view of a chase plan according to one embodiment of the present system and methods for autonomous tracking and surveillance;

[0016] FIG. 5 is a schematic view of a method of smoothing noisy tracking data according to one embodiment;

[0017] FIG. 6 is a schematic view of a systematic search pattern in which the UAV remains on one side of a border while capturing visual images across the border according to one embodiment;

[0018] FIG. 7 is a process flow diagram for autonomously tracking a target using a UAV according to one embodiment; and

[0019] FIG. 8 is a schematic view of one embodiment of the present system including a UAV and a ground station.

DETAILED DESCRIPTION

[0020] Embodiments of the present system and methods for autonomous tracking and surveillance are configured to enable an unmanned air vehicle (UAV) to continuously observe stationary and track moving targets while maintaining a low risk that the surveillance asset will be discovered. The targets may be ground-based, airborne and/or seaborne. The targets may be fixed structures, such as buildings, and may even be subsurface. The automated UAVs may also conduct general surveillance of an area, such as for defense of a base or fleet, and for monitoring roadsides for improvised explosive devices (IEDs) to protect ground-based convoys. The present system may be applied in both military and civilian environments. For example, the military may use the system to surveil or observe hostile areas in search of military targets, or a police department may use the system to track fleeing suspects.

[0021] The system accepts target data and UAV data (and may accept other data, such as obstruction data and/or "blue force" data from the UAV or a ground station). The system then determines the best navigation route to maintain an advantageous slant range to the target for high quality camera imaging and a low probability of intercept (LPOI). The system then computes trajectories/flight paths to reduce the likelihood of discovery of the UAV (also referred to herein as "ownship"). The system may incorporate numerous tracking and maneuver techniques, including weaves, orbits, escapes, and lead/lag pursuit course estimations. The system also controls the camera aboard the UAV to maintain uninterrupted visual contact with the target. The system is adapted to control both navigation and camera functions simultaneously.

[0022] Because the present system is automated it drastically reduces the workload of any operator(s) monitoring the system. The system thus enables tracking of high value moving targets while reducing the likelihood of a loss of positive identification (interruption in target viewing) during target tracking. The operator can "fly the camera," because he or she is relieved of the dual duty of navigating the UAV and maintaining the desired pointing of the camera. The operator is thus able to focus on stalking targets, scanning borders, looking for IEDs, etc. The system also enables a single operator to track multiple moving targets simultaneously, increasing the probability of engaging a high value target after an external attack or a base intrusion incident. Because one operator working at a single location, such as an Insitu Multiple UAV Software Environment (IMUSE) station, may track multiple targets, the present system reduces the logistical footprint necessary for target tracking. The present system also allows an operator to control multiple UAVs to track maritime targets. It can establish a visual identification area around deep sea and littoral fleets to monitor, track and identify small or large moving objects.

[0023] In one embodiment, a system (also referred to herein as a "Stalker system") and associated methods provide automatic generation of UAV and camera steering controls for target following. The Stalker system itself may be implemented as software executable code, specialized application specific integrated circuits (ASICs), or a combination thereof, where some functions are implemented in hardware and others in executable code. In a high-level sense, the Stalker system can operate as a finite state machine where the states are steps in a plan to achieve a certain desired trajectory. The Stalker system accepts target and UAV state updates, and when engaged may be queried for UAV and camera commands. FIG. 1, which is described in detail below, illustrates this process. Each UAV command query checks for a planning state transition and may output a new UAV steering command depending upon the selected mode.

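As an illustration of the finite state machine behavior just described, the following minimal Python sketch shows a query-driven controller that holds UAV and target state updates and transitions out of a startup state once both have arrived. The class and method names and the stubbed transition logic are assumptions made for illustration; they are not the patent's implementation (the actual goal selection is discussed at paragraph [0038] below).

    # Minimal sketch of the query-driven state machine of [0023].
    # All names and logic are illustrative assumptions.
    from enum import Enum, auto

    class Goal(Enum):
        STARTUP = auto()
        LOITER = auto()   # goal state 64
        WEAVE = auto()    # goal state 66
        CHASE = auto()    # goal state 68

    class StalkerFSM:
        def __init__(self):
            self.goal = Goal.STARTUP
            self.uav_state = None
            self.target_state = None

        def update_uav(self, state):     # UAV state input 30
            self.uav_state = state

        def update_target(self, state):  # target state input 32
            self.target_state = state

        def on_command_query(self):
            """Each query checks for a planning state transition and
            may return a new steering command for the selected mode."""
            if self.uav_state is None or self.target_state is None:
                return None  # still in startup: both inputs required
            self.goal = self._select_goal()
            return self._steering_command_for(self.goal)

        def _select_goal(self):
            # Speed-based transition logic per [0038]; stubbed here.
            return Goal.LOITER

        def _steering_command_for(self, goal):
            return {"mode": goal.name}  # placeholder steering command

A caller would invoke update_uav() and update_target() as state reports arrive, then poll on_command_query() for steering commands.
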
[0024] Embodiments of the Stalker system support at least four main functions. One function is generating UAV and camera positions and orientations for stalking a cooperative moving target. A cooperative moving target is one that actively publishes its own geodetic position, as is typical of friendly forces. Another function is generating UAV and camera position and orientation commands for stalking a non-cooperative moving target, whether the tracking is autonomous, or by an operator using a camera joystick. A non-cooperative moving target is one whose position must be observed through the use of electronic sensors and operator inputs, as is typical of hostile forces. Another function is generating UAV and camera position and orientation commands for automatic camera and position calibration to reduce target location errors. Another function is generating UAV and camera position and orientation commands for stalking a roadside or a search area, and generating subsequent commands to revisit targets if targets of interest are detected in those specified areas.

[0025] One goal of the Stalker system is to establish and maintain a range to target between preset minimum and maximum values. These values are specified to provide a large number of pixels on the target, while maintaining noise and visual signatures that the target is not likely to detect. Another goal of the Stalker system is to maintain an uninterrupted line of sight to the target, taking care to avoid obstructing viewing angles with the wing and fuselage.

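As a concrete reading of the range goal above, the short sketch below tests a slant range against the preset minimum and maximum values; the function names and the straight-line range formula over a local Cartesian frame are assumptions for illustration only.

    import math

    def slant_range(uav_pos, tgt_pos):
        """Straight-line UAV-to-target distance. Positions are
        (x, y, z) in a local level frame, meters (assumed)."""
        return math.dist(uav_pos, tgt_pos)

    def range_ok(uav_pos, tgt_pos, r_min, r_max):
        """True when the range lies between the preset minimum
        (pixels on target) and maximum (detectability) bounds."""
        return r_min <= slant_range(uav_pos, tgt_pos) <= r_max
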
[0026] In embodiments of the present system, a UAV (not shown) includes at least one video camera, which may be a digital camera. For simplicity the singular form of camera will be used throughout, although those of ordinary skill in the art will appreciate that the UAV may include more than one camera. The UAV further includes a plurality of sensors. A first subset of the sensors detects various states of the UAV, while a second subset of the sensors detects various states of the target. The detected states may include, but are not limited to position, orientation, heading, speed, acceleration and other kinematic states, size, type and/or class of the target, and other states. A video signal generated by the camera and signals generated by the sensors are transmitted to hardware components that use the signals to visualize and track the target. FIG. 1 illustrates some of these components. Some or all of the components illustrated in FIG. 1 could be located on the UAV or they could be located at one or more ground stations. The components could also be split between the UAV and one or more ground stations. FIG. 8 illustrates an example embodiment of the present system including a UAV 74 and a ground station 134.

[0027] As used in this disclosure, the terms "component", "module", "system" and the like are intended to refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution. For example, a component may be, but is not limited to being, a process running on a processor, a processor, a hardware component, an object, an executable, a thread of execution, a program, and/or a computing system. Also, these components can execute from various computer readable media having various data structures stored thereon. Computer executable components (or code) can be stored, for example, on computer readable media including, but not limited to, an ASIC (application specific integrated circuit), CD (compact disc), DVD (digital video disk), ROM (read only memory), floppy disk, hard disk, EEPROM (electrically erasable programmable read only memory) and memory stick in accordance with the claimed subject matter.

[0028] With reference to FIG. 1, one embodiment of the present system 20 includes an automatic target recognition (ATR) module 22 and a multi-sensor integration (MSI) module 24. As used herein, the term module may include any combination of hardware, firmware, and software to implement the functions described. The ATR module 22 receives a video signal 26 from the UAV (not shown). The ATR module 22 includes instructions to analyze the video signal 26 and generates an output 28 that it sends to the MSI module 24. In addition to the ATR output 28, the MSI module 24 also receives a UAV state signal 30 and a target state signal 32. The signals 30, 32 are generated by the sensors described above, and may also be generated by other sources observing the UAV and/or the target, such as ground-based observers, radar, satellites, etc. All of these signals include information about the states of the UAV and the target, which may include position, orientation, heading, speed, acceleration and/or other kinematic states, size, type and/or class of the target, and other states.

[0029] The MSI module 24 receives inputs 28, 30, 32 described above and processes the data therein to produce an output 34. The MSI module output 34 is referred to herein as track information or a track file. The track file 34 includes not only information regarding the kinematics of the UAV and the target, but also estimates of the accuracy of the data in the track file 34, and also target identification data, such as the size, class, and/or type of the target, whether the target is cooperative or non-cooperative, etc. Those of ordinary skill in the art will appreciate that the track file may or may not be stored in memory for subsequent retrieval. The word "file" is used broadly herein and does not imply that the process of producing the track file 34 includes an additional step of storing the file in memory.

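The track file contents enumerated above suggest a simple record layout. The field names and types below are assumptions inferred from this paragraph, not a format disclosed in the patent.

    from dataclasses import dataclass

    @dataclass
    class TrackFile:
        """Illustrative track file (output 34 of the MSI module)."""
        # Kinematics of the UAV and the target
        uav_position: tuple        # (x, y, z), meters (assumed frame)
        uav_velocity: tuple
        target_position: tuple
        target_velocity: tuple
        # Estimates of the accuracy of the data above
        position_sigma_m: float
        velocity_sigma_mps: float
        # Target identification data
        target_size: str = "unknown"
        target_class: str = "unknown"
        cooperative: bool = False
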
[0030] The MSI module 24 sends the track file 34 to a target module 36 and an ownship module 38. The target module 36 processes the data in the track file 34 relating to the current state of the target, and compares (gates) this data to previous predictions made regarding the current state of the target. The target module 36 uses all available data and comparisons between past predictions and current states, and makes further predictions about future states of the target. Gating in target module 36 produces an output 40 that it sends to a planner module 42.

[0031] Ownship module 38 processes the data in the track file 34 relating to the current state of the UAV, and compares (gates) this data to previous predictions (not shown) made regarding the current state of the UAV. Discrepancies in the predicted state of the UAV versus its current state may be due to, for example, winds blowing the UAV off its intended course. The ownship module 38 uses all available data and comparisons between past predictions and current states, and makes further predictions about future states of the UAV. Gating in ownship module 38 then produces an output 44 that it sends to the planner module 42.

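A minimal sketch of the predict-then-gate cycle described for the target and ownship modules follows, assuming a constant-velocity prediction model and a simple per-axis distance gate; the actual models and gate statistics used by the modules are not specified in the patent.

    def predict_cv(position, velocity, dt):
        """Constant-velocity prediction of a future state (assumed)."""
        return tuple(p + v * dt for p, v in zip(position, velocity))

    def gate(predicted, measured, sigma, n_sigmas=3.0):
        """Accept the measurement if it falls within n_sigmas of the
        prediction in every axis; discrepancies (e.g. wind pushing the
        UAV off course) show up as gate failures."""
        return all(abs(m - p) <= n_sigmas * sigma
                   for m, p in zip(measured, predicted))
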
[0032] The planner module 42 combines the target module input 40 and the ownship module input 44 with additional data provided by a legs module 46, a weave corridor module 48, a loiter circle module 50, a region search module 52, a command module 54 and a camera module 56. The functions of each of these modules are described in detail below. Based on the various inputs, the planner module 42 builds a model for predicting future UAV states given its current state and the currently active command. The planner module 42 uses the model to predict future UAV states at certain critical times, and to establish goals, which in turn produce predicted UAV and camera positions. The planner 42 also combines all data to produce commands for course corrections and/or pattern adjustments for the UAV. These adjustments are described below with respect to three top-level goal states for the UAV. The present system 20 uses all of the functions described above extensively in stalking both cooperative and non-cooperative targets.

[0033] With continued reference to FIG. 1, the legs module 46 predicts a long-term flight path for the UAV. In support of the long-term predictions, the legs module 46 also predicts short-term flight legs that together make up the long-term flight path. The legs module 46 communicates its predictions to the planner module 42 to aid the planner module 42 in creating UAV commands to control the flight of the UAV.

[0034] With continued reference to FIG. 1, in certain embodiments the command module 54 includes data regarding the UAV mission environment. This data may include, for example, topographical terrain maps, locations of international borders, locations of obstructions and other data. The data may also include the locations and kinematics of other aircraft in the vicinity. By accessing the data in the command module 54, the present system 20 can command the UAV to maintain an uninterrupted track on a target while avoiding collisions/crashes and crossing into no fly zones. The command module 54 also validates UAV commands to ensure that the UAV is capable of executing the commands to achieve the desired flight path. For example, if a UAV command indicates that the UAV should execute a very tight turn that is beyond the UAV's physical limits, the validity function of the command module 54 will reject the command as being impossible for the UAV to execute.

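One plausible form of such a validity check, sketched under assumed limits: a coordinated turn at speed v and bank angle phi has radius v^2 / (g tan phi), so a commanded turn radius smaller than the radius achievable at the UAV's maximum bank cannot be flown and would be rejected.

    import math

    G = 9.81  # m/s^2

    def min_turn_radius(speed_mps, max_bank_rad):
        """Smallest coordinated-turn radius at the given speed and the
        airframe's maximum bank angle: r = v**2 / (g * tan(phi))."""
        return speed_mps ** 2 / (G * math.tan(max_bank_rad))

    def validate_turn(cmd_radius_m, speed_mps,
                      max_bank_rad=math.radians(30)):
        """Reject a commanded turn tighter than the UAV can physically
        fly (cf. the command module 54 validity function); the 30-degree
        bank limit is an assumed placeholder."""
        return cmd_radius_m >= min_turn_radius(speed_mps, max_bank_rad)
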
[0035] With continued reference to FIG. 1, the present system 20 further comprises a camera module 56 and a camera commander module 58. The camera module 56 predicts future camera imaging characteristics, such as pointing, focus and zoom. The camera module 56 communicates with the planner module 42 and generates outputs for the camera commander module 58. The camera commander module 58 generates commands 60 for the camera, such as where to point and how to focus and zoom. Together the camera module 56 and the camera commander module 58, in conjunction with the planner module 42, automatically control camera functions in order to obtain an uninterrupted and high quality image of the target.

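For illustration, the pointing portion of a camera command might reduce to pan and tilt angles from the UAV to the target, as in this assumed sketch using an east-north-up local frame; the frame convention and names are not from the patent.

    import math

    def camera_pointing(uav_pos, tgt_pos):
        """Pan (azimuth) and tilt angles from UAV to target.
        Positions are (east, north, up) in meters (assumed frame)."""
        de = tgt_pos[0] - uav_pos[0]
        dn = tgt_pos[1] - uav_pos[1]
        du = tgt_pos[2] - uav_pos[2]
        pan = math.atan2(de, dn)                   # radians from north
        tilt = math.atan2(du, math.hypot(de, dn))  # negative: target below
        return pan, tilt
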
[0036] In certain embodiments the camera and/or sensors may provide additional information beyond that generally provided by traditional visual surveillance. For example, the camera/sensors may provide three-dimensional visual representations of the target. These three-dimensional views are enhanced by multi-aspect viewing of the target in accordance with the loiter, weave, and chase surveillance patterns described below. The camera/sensors may further provide thermal signature information, infrared signature information, color information, etc. for the target. All information collected by the camera/sensors may be provided to an ATR/Trainer module 62 (FIG. 1), described below, for use in future target identifications. Multiple aspect coverage of the target enables the automatic target recognition function of the present system 20, described below, to recognize geometric aspects of the target that are not available in two-dimensional or single aspect imagery, drastically decreasing the time necessary for the present system 20 to recognize the target.

[0037] While in cooperative and non-cooperative stalking modes, and prior to receiving either the UAV state input 30 or the target state input 32, the Stalker system 20 is in a startup state. Once the system 20 has received both the UAV state input 30 and the target state input 32, the system 20 is queried for a steering command and/or a camera command. The system 20 then transitions from startup to a top-level goal state. These top-level goal states include loiter 64, weave 66, and chase 68, each of which is illustrated in FIGS. 2-4, respectively. Those of ordinary skill in the art will appreciate that additional top-level goal states may be provided depending upon the state of the target.

[0038] Each top-level goal state corresponds to a dynamically generated plan to attain a desired UAV trajectory for advantageous imaging quality while controlling visual and audio signatures of the UAV. Each top-level goal state is also intended to prevent over flight of the target, which could cause the target to detect the UAV. Consistent with these objectives, then, at least target speed and UAV speed determine the top-level goal. For example, if target speed is zero or near zero, the goal may be to loiter in a circle 70, as illustrated in FIG. 2. The loiter path 70 may encircle the target 72, or it may be somewhere in the vicinity of the target 72. Further, the loiter path 70 need not be a circle, but could be some other shape. If target speed is not near zero and is less than UAV speed, the goal may be to weave back and forth behind the target 72, as illustrated in FIG. 3. If target speed is high, the goal may be to chase the target 72, as illustrated in FIG. 4. The top-level goal changes dynamically as the target 72 accelerates, decelerates, stops and starts.

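The speed-based selection described above reduces to a simple rule. The thresholds in the sketch below are invented placeholders, since the patent says only "zero or near zero", "less than UAV speed", and "high".

    def select_goal(target_speed, uav_speed,
                    near_zero=1.0, high_frac=0.9):
        """Map target and UAV speed (m/s, assumed) to a top-level goal.
        Threshold values are illustrative, not from the patent."""
        if target_speed <= near_zero:
            return "LOITER"                    # circle 70, FIG. 2
        if target_speed < high_frac * uav_speed:
            return "WEAVE"                     # weave behind target, FIG. 3
        return "CHASE"                         # FIG. 4

Re-evaluating this rule on every state update reproduces the dynamic goal changes described as the target accelerates, decelerates, stops and starts.
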
[0039] Corresponding to each top-level goal are goal-specific planning states, or steps to achieve the top-level goal. These steps are mapped to steering commands that are scheduled to be sent to the UAV at specified times. Planning a UAV trajectory involves reasoning in both space and time and predicting how the UAV will respond to commands. Therefore, accurately planning a UAV trajectory preferably includes an estimate of the command time latency and a model of how the UAV will maneuver when it executes the command.

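One way to honor the command time latency estimate mentioned above is to propagate the UAV state through the latency interval before planning against it, as in this assumed constant-velocity sketch; the real maneuver model is not specified.

    def state_at_execution(position, velocity, latency_s):
        """Predict where the UAV will be when a command actually takes
        effect, using an assumed constant-velocity maneuver model and
        an estimated command time latency."""
        return tuple(p + v * latency_s for p, v in zip(position, velocity))
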
[0040] When loitering, each UAV maneuver is executed pursuant to commands generated by the planner module 42 in conjunction with the loiter circle module 50 (FIG. 1). The loiter circle module 50 makes predictions regarding the future state of the UAV, which the planner module 42 uses to generate loiter commands for the UAV. In the case of a circular loiter path 70 (FIG. 2), a loiter command has three parts: a turn center (a co-altitude geodetic location), a turn radius, and a turn direction (clockwise or counter-clockwise as viewed from above). Thus, when the system 20 determines that the UAV should loiter, as when the target 72 is stopped, the planner 42 and the loiter circle module 50 generate at least one loiter point for the UAV. The loiter point(s) is/are sent to the hardware that controls the UAV's movement along with camera pointing commands.

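The three-part loiter command described above maps naturally onto a small record; the field names are illustrative assumptions.

    from dataclasses import dataclass

    @dataclass
    class LoiterCommand:
        """The three parts of a circular loiter command per [0040]."""
        turn_center: tuple   # co-altitude geodetic location (lat, lon, alt)
        turn_radius_m: float
        clockwise: bool      # turn direction as viewed from above
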
`if the aircraft
`[0041]
`In one embodiment,
`the
`is outside
`loiter circle 70 then it executes a loiter command
`commanded
`as follows. With reference to FIG. 2, the UAV 74 makes an
`initial turn 76 so that its direction of flight is tangential
`to the
`loiter circle 70 and is compatible with the commanded
`turn
`direction. The UAV 74 then flies straight to the tangent point
`7S. Upon reaching
`the tangent point 7S the UAV 74 flies
`around the loiter circle 70 until commanded
`to do otherwise.
`Each of these UAV maneuvers are executed pursuant
`to com-
`mands generated by the planner module 42 in conjunction
`with the loiter circle module 50 (FIG. 1).
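The entry maneuver above hinges on finding the tangent point from a position outside the circle. The 2-D, flat-earth sketch below is an assumed illustration of that geometry; the sign convention relating the chosen tangent to the commanded turn direction is also an assumption.

    import math

    def tangent_point(p, c, r, clockwise):
        """Point where a straight path from p first touches the loiter
        circle (center c, radius r). In the right triangle formed by
        the center, the tangent point and the aircraft, the angle at
        the center satisfies cos(beta) = r / d."""
        dx, dy = p[0] - c[0], p[1] - c[1]
        d = math.hypot(dx, dy)
        if d <= r:
            raise ValueError("aircraft is not outside the loiter circle")
        beta = math.acos(r / d)       # angle between C->P and C->T
        base = math.atan2(dy, dx)
        ang = base + (beta if clockwise else -beta)
        return (c[0] + r * math.cos(ang), c[1] + r * math.sin(ang))
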
[0042] When the loiter path 70 encircles the target 72, the loiter plan advantageously provides full 360° imaging of the target 72. Images captured and other sensor readings taken from such 360° sweeps can advantageously provide full geometric data regarding the target 72 to the ATR module 22. In one embodiment the ATR/Trainer module 62 (FIG. 1) automatically logs the target data and attempts to identify the target. If the target cannot be identified, then the ATR/Trainer module 62 classifies the target as a new entity and records the data. This data may be shared system wide, including continuous dissemination to other UAVs in the field. The present system 20 thus rapidly increases its knowledge base as UAVs in the field gather more and more data about new targets and share that data with other UAVs in the field.

[0043] With reference to FIG. 3, when commanded to execute a weave plan 66 the UAV 74 moves back and forth across the target's path of travel