(19) United States
(12) Patent Application Publication    Meier et al.
(10) Pub. No.: US 2015/0350614 A1
(43) Pub. Date: Dec. 3, 2015

(54) APPARATUS AND METHODS FOR TRACKING USING AERIAL VIDEO

(71) Applicant: BRAIN CORPORATION, San Diego, CA (US)

(72) Inventors: Philip Meier, San Diego, CA (US); Heathcliff Hatcher, San Diego, CA (US); Marius Buibas, San Diego, CA (US)

(21) Appl. No.: 14/332,322

(22) Filed: Jul. 15, 2014

Related U.S. Application Data

(60) Provisional application No. 62/007,311, filed on Jun. 3, 2014.

Publication Classification

(51) Int. Cl.
    H04N 7/18 (2006.01)
    H04N 5/28 (2006.01)
    G06K 9/00 (2006.01)
    H04N 5/232 (2006.01)
(52) U.S. Cl.
    CPC: H04N 7/185 (2013.01); H04N 5/23296 (2013.01); H04N 5/28 (2013.01); G06K 9/0063 (2013.01); G06K 9/00369 (2013.01)

(57) ABSTRACT

In some implementations, a camera may be disposed on an autonomous aerial platform. A user may operate a smart wearable device adapted to configure and/or operate video data acquisition by the camera. The camera may be configured to produce a time stamp and/or a video snippet based on receipt of an indication of interest from the user. The aerial platform may comprise a controller configured to navigate a target trajectory space. In some implementations, a data acquisition system may enable the user to obtain video footage of the user performing an action from the platform circling around the user.

[Front-page drawing: refs. 1100, 1102, 1104, 1110]
`
`
`
[Drawing sheet 1 of 20: FIG. 1 (ref. 100); FIG. 2 (ref. 202)]
`
`
`
[Drawing sheet 2 of 20: FIG. 3A (refs. 300, 304, 306', 315, 316)]
`
`
`
[Drawing sheet 3 of 20: FIG. 3B (refs. 306, 315, 330, 338, 340e, 344)]
`
`
`
[Drawing sheet 4 of 20: FIG. 3C (refs. 315, 360, 368, 370, 372)]
`
`
`
[Drawing sheet 5 of 20: FIGS. 4A-4D (refs. 400, 420)]
`
`
`
[Drawing sheet 6 of 20: FIGS. 5A-5D (refs. 500, 502, 504, 506, 508, 510, 512, 514, 516, 518, 540, 542, 544, 546, 548, 566)]
`
`
`
[Drawing sheet 7 of 20: FIG. 5E (refs. 564, 570, 572, 574, 576); FIG. 6: block diagram of computerized apparatus 600 comprising 612 Storage, 614 Memory, 616 Processing, 618 Mechanical, 620 Sensory, 622 Electrical, 624 Power, 626 Comms interface]
`
`
`
[Drawing sheet 8 of 20: FIG. 7A (refs. 715, 727); FIG. 7B (refs. 734, 736, 738)]
`
`
`
[Drawing sheet 9 of 20: FIG. 8A; FIG. 8B]
`
`
`
[Drawing sheet 10 of 20: FIG. 9A (refs. 904, 908, 942, 944)]
`
`
`
[Drawing sheet 11 of 20: FIG. 9B]
`
`
`
[Drawing sheet 12 of 20: FIGS. 10A, 10B, 10C (refs. 1006, 1010, 1012, 1016, 1024)]
`
`
`
[Drawing sheet 13 of 20: FIG. 11A (refs. 1100, 1102, 1110)]
`
`
`
[Drawing sheet 14 of 20: FIG. 11B; FIG. 12 (refs. 1208, 1210, 1214)]
`
`
`
[Drawing sheet 15 of 20: FIG. 13A (refs. 1300, 1302); FIG. 13B (refs. 1320, 1326)]
`
`
`
[Drawing sheet 16 of 20: FIG. 13C (ref. 1340); FIG. 13D (ref. 1360)]
`
`
`
[Drawing sheet 17 of 20: FIG. 14A, flowchart: Start; 1402 Determine a state parameter while navigating a trajectory; 1404 decision (NO branch); 1406 Populate the state area with the trajectory; Continue]
`
`
`
[Drawing sheet 18 of 20: FIG. 14B, flowchart: Start; 1412 Navigate a target trajectory; 1414 Receive an indication of interest; 1416 Produce a time stamp associated with the indication; Continue]
`
`
`
[Drawing sheet 19 of 20: FIG. 14C, flowchart: Start; 1442 Track an SOI while navigating a target trajectory; 1444 Acquire video of SOI; 1446 decision (NO branch); 1448 Produce a time stamp; 1450 Store acquired historical video portion; 1452 Acquire and store subsequent video portion; Continue]
`
`
`
[Drawing sheet 20 of 20: FIG. 14D, flowchart 1460: Start; 1462 Configure UAV; 1464 Indicate SOI; 1466 Confirm SOI quality; 1468 Observe video produced during trajectory navigation by UAV; 1470 Provide "Awesome" indication; Continue]
`
`
`
`
`APPARATUS AND METHODS FOR
`TRACKING USING AERIAL VIDEO
`
`CROSS-REFERENCE TO RELATED
`APPLICATIONS
`
[0001] This application claims the priority benefit of U.S. Provisional Patent Application Ser. No. 62/007,311 filed on Jun. 3, 2014 and entitled "APPARATUS AND METHODS FOR TRACKING USING AERIAL VIDEO"; and is related to co-owned and co-pending U.S. patent application Ser. No. ___, Client Reference BC201415A, Attorney Docket No. 021672-0433333, filed on Jul. 15, 2014 herewith, and entitled "APPARATUS AND METHODS FOR AERIAL VIDEO ACQUISITION", U.S. patent application Ser. No. ___, Client Reference BC201414A, Attorney Docket No. 021672-0433332, filed on Jul. 15, 2014 herewith, and entitled "APPARATUS AND METHODS FOR CONTEXT BASED VIDEO DATA COMPRESSION", U.S. patent application Ser. No. 13/601,721 filed on Aug. 31, 2012 and entitled "APPARATUS AND METHODS FOR CONTROLLING ATTENTION OF A ROBOT", and U.S. patent application Ser. No. 13/601,827 filed Aug. 31, 2012 and entitled "APPARATUS AND METHODS FOR ROBOTIC LEARNING", each of the foregoing being incorporated herein by reference in its entirety.
`
`COPYRIGHT
`
[0002] A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent files or records, but otherwise reserves all copyright rights whatsoever.
`
`BACKGROUND
`
`[0003]
`1. Field of the Disclosure
`[0004] The present disclosure relates to apparatus and
`methods for tracking human subjects, and/or other moving
`and/or static objects using aerial video data.
`2. Description of Related Art
`[0005]
`[0006] Aerial unmanned vehicles may be used for collect(cid:173)
`ing live video data. A progrannning and/or two way commu(cid:173)
`nication between the remote vehicle and a user may be
`employed in order to control video collection. Users engaged
`in attention consuming activities (e.g., surfing, biking, skate(cid:173)
`boarding, and/or other activities) may not be able to control
`remote devices with sufficient speed and/or accuracy using
`conventional remote control devices and/or pre-programmed
`trajectories.
`
`SUMMARY
`
[0007] One aspect of the disclosure relates to a video acquisition system. The video acquisition system may include one or more of a mobile camera apparatus, a controller, an interface, and/or other components. The mobile camera apparatus may be configured to obtain video of a subject of interest. The controller may be configured to navigate the apparatus along a trajectory configured in accordance with a position of the subject of interest. The interface may be configured to detect an indication from a wearable device and to produce an event signal. The event signal may be configured to enable access to a portion of the video at a time associated with the detection of the indication.
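By way of non-limiting illustration, the event signal may be reduced to a time stamp that selects a window of previously acquired video. The Python sketch below shows one way such a mapping could be realized; the names EventSignal, pre_s, and post_s are illustrative assumptions rather than an API prescribed by this disclosure.

    from dataclasses import dataclass

    @dataclass
    class EventSignal:
        """Event produced responsive to an indication from the wearable device."""
        t_event: float  # detection time, in seconds since start of acquisition

    def snippet_window(event: EventSignal, pre_s: float = 10.0, post_s: float = 20.0):
        """Map an event time to the portion of video it enables access to.

        The window spans pre_s seconds before and post_s seconds after the
        indication, clamped at the start of the recording.
        """
        start = max(0.0, event.t_event - pre_s)
        end = event.t_event + post_s
        return (start, end)

    # Example: an indication detected 42 s into a flight selects video 32 s-62 s.
    print(snippet_window(EventSignal(t_event=42.0)))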
[0008] These and other features and characteristics of the present technology, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the invention. As used in the specification and in the claims, the singular form of "a", "an", and "the" include plural referents unless the context clearly dictates otherwise.
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
[0009] FIG. 1 is a graphical illustration depicting an autonomous aerial device configured to follow a person using real time video, according to some implementations.
`[0010]
`FIG. 2 is an illustration depicting exemplary trajec(cid:173)
`tory configuration useful for video collection by an autono(cid:173)
`mous aerial vehicle of, e.g., FIG. 1, according to some imple(cid:173)
`mentations.
`[0011]
`FIG. 3A is an illustration depicting exemplary tra(cid:173)
`jectories of the autonomous aerial vehicle of FIG. 1 during
`tracking of a subject of interest (SOl), according to some
`implementations.
`[0012]
`FIG. 3B is an illustration depicting exemplary tra(cid:173)
`jectories of the autonomous aerial vehicle of FIG. 3A in
`presence of obstacles during tracking of an SOl, according to
`some implementations.
`[0013]
`FIG. 3C is an illustration depicting exemplary tra(cid:173)
`jectory of the autonomous aerial vehicle of FIG. 3A config(cid:173)
`ured to populate allowable state space in presence of
`obstacles during tracking ofan SOI, according to some imple(cid:173)
`mentations.
`[0014]
`FIGS. 4A-4D illustrate various exemplary devices
`useful for communicating with autonomous aerial vehicles
`(e.g., of FIGS. 1, 3A, 8A) during tracking and/or video col(cid:173)
`lection, according to some implementations.
`[0015]
`FIG. 5A depicts configuration of uniform length
`video snippets obtained based on user indication for use with
`video acquisition by the aerial vehicle ofFIG. 1, according to
`some implementations.
`[0016]
`FIG. 5B depicts configuration of non-uniform
`length video snippets obtained based on user indication for
`use with video acquisition by the aerial vehicle of FIG. 1,
`according to some implementations.
`[0017]
`FIG. 5C depicts configuring pre/post event duration
`of a video snippet for use with video acquisition by the aerial
`vehicle of FIG. 1, according to some implementations.
`[0018]
`FIG. 5D depicts multiple snippets produce respon(cid:173)
`sive to multiple proximate indications ofuser interest for use
`with video acquisition by the aerial vehicle ofFIG. 1, accord(cid:173)
`ing to some implementations.
[0019] FIG. 5E depicts storing of video snippets in an array based on detection of one or more events provided to the aerial vehicle of FIG. 1 during video acquisition, according to some implementations.
`[0020]
`FIG. 6 is a functional block diagram illustrating a
`computerized apparatus for implementing, inter alia, track-
`
`Yuneec Exhibit 1010 Page 22
`
`
`
`US 2015/0350614 Al
`
`Dec. 3,2015
`
`2
`
`ing, video acquisition and storage, motion and/or distance
`determination methodology in accordance with one or more
`implementations.
`[0021]
`FIG. 7A is an illustration depicting exemplary use of
`a quad-rotor DAV for tracking a person carrying a wearable
`device, according to some implementations.
`[0022]
`FIG. 7B is a block diagram illustrating sensor com(cid:173)
`ponents ofan DAV configured for tracking an SOl, according
`to some implementations.
[0023] FIG. 8A is an illustration depicting exemplary use of a quad-rotor UAV for tracking a bicyclist, according to some implementations.
`[0024]
`FIG. 8B is a block diagram illustrating a smart grip
`interface to the DAV of FIG. 8A, according to some imple(cid:173)
`mentations.
`[0025]
`FIG. 9A is a graphical illustration depicting tracking
`ofa moving subject of interest using moving camera, accord(cid:173)
`ing to some implementations.
`[0026]
`FIG. 9B is a graphical illustration depicting adjust(cid:173)
`ment ofthe DAV camera orientation when tracking a station(cid:173)
`ary subject of interest, according to some implementations.
[0027] FIGS. 10A, 10B, and 10C illustrate use of an umbrella UAV for tracking an SOI, according to some implementations.
`[0028]
`FIGS. llA-11B illustrate use of a vehicle-docked
`umbrella DAV for tracking a SOl, according to some imple(cid:173)
`mentations.
`[0029]
`FIG. 12 is a functional block diagram illustrating a
`cloud server repository, according to some implementations.
`[0030]
`FIG. BA is a graphical illustration depicting an
`aerial platform comprising a camera, according to some
`implementations.
[0031] FIG. 13B is a graphical illustration depicting a system configured to manipulate a camera, according to some implementations.
`[0032]
`FIG. 13C is a plot depicting state space parameters
`useful for trajectory navigation by, e.g., the apparatus of FIG.
`13A, according to some implementations.
[0033] FIG. 13D is a graphical illustration depicting a mobile camera apparatus, according to some implementations.
[0034] FIG. 14A is a logical flow diagram illustrating a generalized method for trajectory control useful when acquiring video from a mobile camera device, in accordance with some implementations.
`[0035]
`FIG. 14B is a logical flow diagram illustrating a
`method for producing a time stamp based on an indication of
`interest, in accordance with some implementations.
`[0036]
`FIG. 14C is a logical flow diagram illustrating a
`method for producing a video snippet based on an indication
`of interest, in accordance with some implementations.
[0037] FIG. 14D is a logical flow diagram illustrating a generalized method for operating a smart wearable device, in accordance with some implementations.
`[0038] All Figures disclosed herein are © Copyright 2014
`Brain Corporation. All rights reserved.
`
`DETAILED DESCRIPTION
`
[0039] Implementations of the present technology will now be described in detail with reference to the drawings, which are provided as illustrative examples so as to enable those skilled in the art to practice the technology. Notably, the figures and examples below are not meant to limit the scope of the present disclosure to a single implementation, but other implementations are possible by way of interchange of, or combination with, some or all of the described or illustrated elements. Wherever convenient, the same reference numbers will be used throughout the drawings to refer to same or like parts.
[0040] Where certain elements of these implementations can be partially or fully implemented using known components, only those portions of such known components that are necessary for an understanding of the present technology will be described, and detailed descriptions of other portions of such known components will be omitted so as not to obscure the disclosure.
`[0041]
`In the present specification, an implementation
`showing a singular component should not be considered lim(cid:173)
`iting; rather, the disclosure is intended to encompass other
`implementations including a plurality of the same compo(cid:173)
`nent, and vice-versa, unless explicitly stated otherwise
`herein.
[0042] Further, the present disclosure encompasses present and future known equivalents to the components referred to herein by way of illustration.
[0043] As used herein, the term "bus" is meant generally to denote all types of interconnection or communication architecture that is used to access the synaptic and neuron memory. The "bus" may be optical, wireless, infrared, and/or another type of communication medium. The exact topology of the bus could be, for example, standard "bus", hierarchical bus, network-on-chip, address-event-representation (AER) connection, and/or other type of communication topology used for accessing, e.g., different memories in a pulse-based system.
[0044] As used herein, the terms "computer", "computing device", and "computerized device" may include one or more of personal computers (PCs) and/or minicomputers (e.g., desktop, laptop, and/or other PCs), mainframe computers, workstations, servers, personal digital assistants (PDAs), handheld computers, embedded computers, programmable logic devices, personal communicators, tablet computers, portable navigation aids, J2ME equipped devices, cellular telephones, smart phones, personal integrated communication and/or entertainment devices, and/or any other device capable of executing a set of instructions and processing an incoming data signal.
[0045] As used herein, the term "computer program" or "software" may include any sequence of human and/or machine cognizable steps which perform a function. Such program may be rendered in a programming language and/or environment including one or more of C/C++, C#, Fortran, COBOL, MATLAB™, PASCAL, Python, assembly language, markup languages (e.g., HTML, SGML, XML, VoXML), object-oriented environments (e.g., Common Object Request Broker Architecture (CORBA)), Java™ (e.g., J2ME, Java Beans), Binary Runtime Environment (e.g., BREW), and/or other programming languages and/or environments.
[0046] As used herein, the terms "connection", "link", "transmission channel", "delay line", "wireless" may include a causal link between any two or more entities (whether physical or logical/virtual), which may enable information exchange between the entities.
[0047] As used herein, the term "memory" may include an integrated circuit and/or other storage device adapted for storing digital data. By way of non-limiting example, memory may include one or more of ROM, PROM, EEPROM, DRAM, Mobile DRAM, SDRAM, DDR/2 SDRAM, EDO/FPMS, RLDRAM, SRAM, "flash" memory (e.g., NAND/NOR), memristor memory, PSRAM, and/or other types of memory.
[0048] As used herein, the terms "integrated circuit", "chip", and "IC" are meant to refer to an electronic circuit manufactured by the patterned diffusion of trace elements into the surface of a thin substrate of semiconductor material. By way of non-limiting example, integrated circuits may include field programmable gate arrays (e.g., FPGAs), a programmable logic device (PLD), reconfigurable computer fabrics (RCFs), application-specific integrated circuits (ASICs), and/or other types of integrated circuits.
[0049] As used herein, the terms "microprocessor" and "digital processor" are meant generally to include digital processing devices. By way of non-limiting example, digital processing devices may include one or more of digital signal processors (DSPs), reduced instruction set computers (RISC), general-purpose (CISC) processors, microprocessors, gate arrays (e.g., field programmable gate arrays (FPGAs)), PLDs, reconfigurable computer fabrics (RCFs), array processors, secure microprocessors, application-specific integrated circuits (ASICs), and/or other digital processing devices. Such digital processors may be contained on a single unitary IC die, or distributed across multiple components.
[0050] As used herein, the term "network interface" refers to any signal, data, and/or software interface with a component, network, and/or process. By way of non-limiting example, a network interface may include one or more of FireWire (e.g., FW400, FW800, etc.), USB (e.g., USB2), Ethernet (e.g., 10/100, 10/100/1000 (Gigabit Ethernet), 10-Gig-E, etc.), MoCA, Coaxsys (e.g., TVnet™), radio frequency tuner (e.g., in-band or OOB, cable modem, etc.), Wi-Fi (802.11), WiMAX (802.16), PAN (e.g., 802.15), cellular (e.g., 3G, LTE/LTE-A/TD-LTE, GSM, etc.), IrDA families, and/or other network interfaces.
`[0051] As used herein, the terms "node", "neuron", and
`"neuronal node" are meant to refer, without limitation, to a
`network unit (e.g., a spiking neuron and a set of synapses
`configured to provide input signals to the neuron) having
`parameters that are subject to adaptation in accordance with a
`model.
[0052] As used herein, the terms "state" and "node state" are meant generally to denote a full (or partial) set of dynamic variables used to describe node state.
[0053] As used herein, the terms "synaptic channel", "connection", "link", "transmission channel", "delay line", and "communications channel" include a link between any two or more entities (whether physical (wired or wireless), or logical/virtual) which enables information exchange between the entities, and may be characterized by one or more variables affecting the information exchange.
[0054] As used herein, the term "Wi-Fi" includes one or more of IEEE-Std. 802.11, variants of IEEE-Std. 802.11, standards related to IEEE-Std. 802.11 (e.g., 802.11 a/b/g/n/s/v), and/or other wireless standards.
[0055] As used herein, the term "wireless" means any wireless signal, data, communication, and/or other wireless interface. By way of non-limiting example, a wireless interface may include one or more of Wi-Fi, Bluetooth, 3G (3GPP/3GPP2), HSDPA/HSUPA, TDMA, CDMA (e.g., IS-95A, WCDMA, etc.), FHSS, DSSS, GSM, PAN/802.15, WiMAX (802.16), 802.20, narrowband/FDMA, OFDM, PCS/DCS, LTE/LTE-A/TD-LTE, analog cellular, CDPD, satellite systems, millimeter wave or microwave systems, acoustic, infrared (i.e., IrDA), and/or other wireless interfaces.
`[0056]
`It may be desirable to utilize autonomous aerial
`vehicles for video data collection. A video collection system
`comprising an aerial (e.g., gliding, flying and/or hovering)
`vehicle equipped with a video camera and a control interface
`may enable a user to start, stop, and modify a video collection
`task (e.g., circle around an object, such as a person and/or a
`vehicle), as well as to indicate to the vehicle which instances
`in the video may be of greater interest than others and worth
`watching later. The control interface apparatus may comprise
`a button (hardware and/or virtual) that may cause generation
`of an indication of interest associated with the instance of
`interest to the user. The indication ofinterest may be commu(cid:173)
`nicated to a video acquisition apparatus (e.g., the aerial
`vehicle).
`[0057]
`In one or more implementations, the video collec(cid:173)
`tion system may comprise a multi-rotor Unmanned Aerial
`Vehicle (DAV), e.g., such as illustrated and described with
`respect to FIGS. 1, 7A, 8A, lOA-lIB, below. In some imple(cid:173)
`mentation, the interface apparatus may comprise a wearable
`apparatus such as, for example, a smart watch (e.g., ToqTM), a
`clicker, smart glasses, a pendant, a key fob, and/or other
`mobile communications device (e.g., a phone, a tablet). In
`one or more implementations, the interface may comprise
`smart hand grip-like sports equipment, e.g., a smart bike
`handlebar described below with respect to FIG. 8B, a smart
`glove, a smart ski pole, a smart helmet, a smart show, and/or
`other computerized user device.
[0058] The interface apparatus may communicate with the UAV via a wireless communication channel (e.g., radio frequency, infrared, light, acoustic, and/or a combination thereof and/or other modalities).
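By way of non-limiting illustration, such an indication may be carried as a small datagram from the wearable interface to the UAV. The sketch below assumes a JSON payload over UDP; the message fields, device address, and port are illustrative assumptions only.

    import json
    import socket
    import time

    UAV_ADDR = ("192.168.1.10", 9000)  # assumed address of the aerial vehicle

    def send_indication(kind: str = "interest") -> None:
        """Transmit an indication (e.g., of interest) from the wearable interface."""
        msg = {
            "device_id": "wearable-01",  # identifies the interface apparatus
            "kind": kind,                # e.g., "interest", "start", "stop"
            "t_wall": time.time(),       # wall-clock time of the button press
        }
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.sendto(json.dumps(msg).encode("utf-8"), UAV_ADDR)
        sock.close()

    send_indication()  # e.g., responsive to a button press on a smart watch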
[0059] By way of an illustration, a sports enthusiast may utilize the proposed video collection system to record footage of herself surfing, skiing, running, biking, and/or performing other activities. In some implementations, a home owner may use the system to collect footage of the leaves in the roof's gutter, roof conditions, survey not easily accessible portions of property (e.g., up/down a slope from the house), and/or for other needs. A soccer coach may use the system to collect footage of all the plays preceding a goal.
[0060] Prior to flight (also referred to as "pre-flight") the user may configure flight trajectory parameters of the UAV (e.g., altitude, distance, rotational velocity, and/or other parameters), configure recording settings (e.g., 10 seconds before, 20 seconds after the indication of interest), and the direction and/or parameters of rotation after a pause (e.g., clockwise, counter-clockwise, alternating, speed). In one or more implementations, the user may load an operational profile (e.g., comprising the tracking parameters, target trajectory settings, video acquisition parameters, and/or environment metadata). As used herein, the term video acquisition may be used to describe operations comprising capture (e.g., transduction of light into an electrical signal) and buffering (e.g., retaining digital samples after an analog to digital conversion). Various buffer sizes and/or topologies (e.g., double, triple buffering) may be used in different systems, with common applicable characteristics: buffers fill up; for a given buffer size, a higher data rate may be achieved for a shorter clip duration. Buffering operation may comprise producing information related to acquisition parameters, duration, data rate, time of occurrence, and/or other information related to the video.
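The buffering behavior described above (a fixed-size buffer that fills up, trading data rate against retained clip duration) may be sketched as a ring buffer of time-stamped frames. The class and parameter names below are illustrative assumptions, not a prescribed implementation.

    from collections import deque

    class FrameRingBuffer:
        """Fixed-capacity buffer of (timestamp, frame) pairs.

        Capacity is frame_rate_hz * horizon_s frames: for a given buffer
        size, a higher data rate yields a shorter retained clip duration.
        """
        def __init__(self, frame_rate_hz: float, horizon_s: float):
            self.capacity = int(frame_rate_hz * horizon_s)
            self.frames = deque(maxlen=self.capacity)  # oldest frames drop off

        def capture(self, t: float, frame) -> None:
            self.frames.append((t, frame))

        def window(self, t0: float, t1: float):
            """Return buffered frames with timestamps in [t0, t1]."""
            return [(t, f) for (t, f) in self.frames if t0 <= t <= t1]

    # Example: 30 Hz video with a 30 s horizon retains at most 900 frames.
    buf = FrameRingBuffer(frame_rate_hz=30.0, horizon_s=30.0)
    for i in range(1200):
        buf.capture(t=i / 30.0, frame=None)  # frame payload omitted in sketch
    print(len(buf.frames))  # 900: the earliest 10 s have been overwritten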
`
`
[0061] The term video storage may be used to describe operations comprising persistent storing of acquired video (e.g., on flash, magnetic, and/or other medium). Storing operations may be characterized by storage medium capacity greatly exceeding the buffer size. In some implementations, the storage medium does not get depleted by subsequent capture events in a way that would hinder resolution of the capture process for, e.g., 0.005 second to 500 second clips. Storage may be performed using a local storage device (e.g., an SD card) and/or on a remote storage apparatus (e.g., a cloud server).
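Continuing the buffering sketch above, persistent storage may amount to draining the buffered window associated with an indication to the local medium together with its metadata. The directory layout and naming scheme below are illustrative assumptions.

    import json
    import os

    def store_snippet(buf, t0: float, t1: float, root: str = "/media/sdcard") -> str:
        """Persist the buffered window [t0, t1] and its metadata.

        Storage capacity is assumed to greatly exceed the buffer size, so
        stored clips do not hinder subsequent capture events.
        """
        clip = buf.window(t0, t1)  # buf as in the FrameRingBuffer sketch
        path = os.path.join(root, "snippet_%010.3f_%010.3f" % (t0, t1))
        os.makedirs(path, exist_ok=True)
        with open(os.path.join(path, "meta.json"), "w") as fh:
            json.dump({"t_start": t0, "t_end": t1, "n_frames": len(clip)}, fh)
        # Frame payloads would be encoded and written here (omitted); a
        # remote (cloud) upload could mirror the same layout.
        return path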
[0062] The pre-flight configuration may be performed using a dedicated interface apparatus and/or using another computerized user interface (UI) device. In some implementations, the user may employ a portable device (e.g., a smartphone running an app), a computer (e.g., using a browser interface), a wearable device (e.g., pressing a button on a smart watch and/or clicker remote and/or a mode button on a smart hand-grip), and/or other user interface means.
[0063] The user may utilize the interface apparatus for flight initiation, selection of a subject of interest (SOI) (e.g., tracking target), calibration, and/or operation of the UAV data collection. In some implementations the term SOI may be used to refer to a tracked object, a person, a vehicle, an animal, and/or other object and/or feature (e.g., a plume of smoke, extent of fire, wave, an atmospheric cloud, and/or other feature). The SOI may be selected using video streamed to a portable device (e.g., smartphone) from the UAV, may be detected using a wearable controller carried by the SOI and configured to broadcast the owner's intent to be tracked, and/or other selection methods. In some implementations, a user may utilize a remote attention indication methodology described in, e.g., co-owned and co-pending U.S. patent application Ser. No. 13/601,721 filed on Aug. 31, 2012 and entitled "APPARATUS AND METHODS FOR CONTROLLING ATTENTION OF A ROBOT", incorporated supra. As described in above-referenced application No. '721, attention of the UAV may be manipulated by use of a spot-light device illuminating a subject of interest. A sensor device disposed on the UAV may be used to detect the signal (e.g., visible light, infrared light) reflected by the illuminated area requiring attention. The attention guidance may be aided by way of an additional indication (e.g., sound, radio wave, and/or other) transmitted by an agent (e.g., a user) to the UAV indicating that the SOI has been illuminated. Responsive to detection of the additional indication, the UAV may initiate a search for the signal reflected by the illuminated area requiring its attention. Responsive to detecting the illuminated area, the UAV may associate one or more objects within the area as the SOI for subsequent tracking and/or video acquisition. Such an approach may be utilized, e.g., to indicate an SOI disposed in hard to reach areas (e.g., undersides of bridges/overpasses, windows in buildings, and/or other areas).
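The spot-light attention handoff described above may be summarized as a short control-flow sketch. The function names below are placeholders interpreting the sequence as recited here, not an API defined by either this or the '721 disclosure.

    class StubUAV:
        """Trivial stand-in used only to exercise the control flow."""
        def wait_for_indication(self, timeout_s): return True
        def search_reflected_signal(self): return "illuminated-area"
        def associate_objects(self, area): return {"soi": area}

    def acquire_soi(uav, timeout_s: float = 10.0):
        """Associate an SOI following a spot-light illumination handoff."""
        # Wait for the additional indication (e.g., sound, radio wave) that
        # reports the SOI has been illuminated by the spot-light device.
        if not uav.wait_for_indication(timeout_s):
            return None
        # Search the sensor field for the signal (e.g., visible or infrared
        # light) reflected by the illuminated area requiring attention.
        area = uav.search_reflected_signal()
        if area is None:
            return None
        # Associate one or more objects within the area as the SOI for
        # subsequent tracking and/or video acquisition.
        return uav.associate_objects(area)

    print(acquire_soi(StubUAV()))  # {'soi': 'illuminated-area'}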
`[0064]
`FIG. 1 illustrates use ofan autonomous aerial device
`configured to follow a subject ofinterest (SOl) using real time
`video, according to some implementations. The autonomous
`aerial device 100 of FIG. 1 may comprise a multi-rotor UAV
`(e.g., DJI Phantom, Draganflyer X6, Aibot X6, Parrot ASR
`Drone®, Hex) comprising a plurality ofpropellers 110 and a
`sensor component 104. Although methodology ofthe present
`disclosure is illustrated using rotor UAV devices it will be
`recognized by those skilled in the arts that methodologies
`described herein may be utilized with other devices such
`remote controlled planes, gliders, kites, balloons, blimps,
`
`model rockets, hybrids thereof, and/or practically any other
`aerial vehicles weighting less than 25 kg and with dimensions
`selected from the range between 0.5 m to 3 m.
`[0065]
`In one or more implementations, the sensor compo(cid:173)
`nent 104 may comprise one or more cameras configured to
`provide video information related to the person 106. The
`video information may comprise for example multiple
`streams of frames received from a plurality of cameras dis(cid:173)
`posed separate from one another. Individual cameras may
`comprise an image sensor (e.g., charge-coupled device
`(CCD), CMOS device, and/or an active-pixel sensor (APS),
`photodiode arrays, and/or other sensors). In one or more
`implementations, the stream offrames may comprise a pixel
`stream downloaded from a file. An example of such a file may
`include a stream of two-dimensional matrices of red green
`blueRGBvalues (e.g., refreshed at a 12 Hz, 30Hz, 60Hz, 120
`Hz, 250 Hz, 1000 Hz and/or other suitable rate). It will be
`appreciated by those skilled in the art when given this disclo(cid:173)
`sure that the above-referenced image parameters are merely
`exemplary, and many other image representations (e.g., bit(cid:173)
`luminance-chrominance (YUV, YCbCr), cyan-ma(cid:173)
`map,
`genta-yellow and key (CMYK), grayscale, and/or other
`image representations) are equally applicable to and useful
`with the various aspects of the present disclosure. Further(cid:173)
`more, data frames corresponding to other (non-visual) signal
`modalities such as sonograms, infrared (IR), lidar, radar or
`tomography images may be equally compatible with the pro(cid:173)
`cessing methodology of the disclosure, or yet other configu(cid:173)
`rations.
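As a non-limiting illustration of the frame-stream representation above, a stream of two-dimensional RGB matrices refreshed at a fixed rate may be modeled as a generator of time-stamped arrays; NumPy is assumed here for the pixel matrices.

    import numpy as np

    def rgb_frame_stream(width: int, height: int, rate_hz: float, n_frames: int):
        """Yield (timestamp, frame) pairs; each frame is an HxWx3 RGB matrix.

        rate_hz may be, e.g., 12, 30, 60, 120, 250, or 1000 Hz; other image
        representations (YUV, CMYK, grayscale) change only the last axis.
        """
        dt = 1.0 / rate_hz
        for i in range(n_frames):
            frame = np.zeros((height, width, 3), dtype=np.uint8)  # placeholder pixels
            yield (i * dt, frame)

    for t, frame in rgb_frame_stream(640, 480, rate_hz=30.0, n_frames=3):
        print(t, frame.shape)  # 0.0 (480, 640, 3) ...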
[0066] The device 100 may be configured to move around the person 106 along, e.g., a circular trajectory denoted by arrow 102 in FIG. 1. The sensor component 104 may comprise one or more of Global Positioning System (GPS) receiver, proximity sensor, inertial sensors, long-base and/or short-base wireless posit