US006359647B1

(12) United States Patent
     Sengupta et al.

(10) Patent No.: US 6,359,647 B1
(45) Date of Patent: Mar. 19, 2002

(54) AUTOMATED CAMERA HANDOFF SYSTEM FOR FIGURE TRACKING IN A MULTIPLE CAMERA SYSTEM

(75) Inventors: Soumitra Sengupta, Stamford, CT (US); Damian Lyons, Putnam Valley, NY (US); Thomas Murphy, Manchester, NH (US); Daniel Reese, Landisville, PA (US)

(73) Assignee: Philips Electronics North America Corporation, New York, NY (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.

(21) Appl. No.: 09/131,243
(22) Filed: Aug. 7, 1998

(51) Int. Cl.7 .......................... H04N 7/18
(52) U.S. Cl. .......................... 348/154; 348/143; 348/153; 348/159; 348/169
(58) Field of Search .................. 348/143, 152, 153, 154, 159, 169; 382/103

(56) References Cited

U.S. PATENT DOCUMENTS
4,511,886 A * 4/1985  Rodriguez ............. 340/534
5,164,827 A * 11/1992 Paff .................. 348/143
5,699,444 A * 12/1997 Palm .................. 382/106
5,729,471 A * 3/1998  Jain et al. ........... 348/13
5,745,126 A * 4/1998  Jain et al. ........... 348/42
6,002,995 A * 12/1999 Suzuki et al. ......... 702/188

FOREIGN PATENT DOCUMENTS
EP 0529317 A1 * 3/1993 .......... H04N/7/18
EP 0714081 A1   5/1996 .......... G08B/13/196
JP 08011071 A * 1/1996 .......... B25J/3/00
WO WO97/04428   2/1997 .......... G08B/13/196

* cited by examiner

Primary Examiner-Vu Le

(57) ABSTRACT

The invention provides for the automation of a multiple camera system based upon the location of a target object in a displayed camera image. The preferred system provides a nearly continuous display of a figure as the figure moves about throughout multiple cameras' potential fields of view. When the figure approaches the bounds of a selected camera's field of view, the system determines which other camera's potential field of view contains the figure, and adjusts that other camera's actual field of view to contain the figure. When the figure is at the bounds of the selected camera's field of view, the system automatically selects the other camera. The system also contains predictive location determination algorithms. By assessing the movement of the figure, the system selects and adjusts the next camera based upon the predicted subsequent location of the figure.

18 Claims, 9 Drawing Sheets

[Front-page drawing: overview of the camera handoff system for a secured area, showing cameras 102 and 103, sensors 111 and 112, the camera handoff system 120, and elements 140, 142, 144 (reproduced as FIG. 1).]
IPR2021-00921
Apple EX1014 Page 1
U.S. Patent    Mar. 19, 2002    Sheet 1 of 9    US 6,359,647 B1

[Drawing: FIG. 1 — block diagram of the camera handoff system 120: cameras 102, 103 viewing the secured area 111/112/120/103, switch 135, controller 130, location determinator 140 with figure tracking system 144 and predictor 142, field of view determinator 150, secured area database 160, and screen 180.]

FIG. 1
U.S. Patent    Mar. 19, 2002    Sheet 2 of 9    US 6,359,647 B1

[Drawing: FIG. 2 — floor plan of the secured area, showing cameras 101 through 106 and the path segments P1 through P5 of a person traversing the area.]

FIG. 2
U.S. Patent    Mar. 19, 2002    Sheet 3 of 9    US 6,359,647 B1

[Drawing: FIG. 3a — potential field of view polygon of camera 102, bounded by vertices 221 through 229, with camera location 220.]

FIG. 3a
U.S. Patent    Mar. 19, 2002    Sheet 4 of 9    US 6,359,647 B1

[Drawing: FIG. 3b — field of view polygon of camera 103, bounded by vertices 230 through 239, with camera location 230 and view angle 203.]

FIG. 3b
U.S. Patent    Mar. 19, 2002    Sheet 5 of 9    US 6,359,647 B1

[Drawing: FIG. 3c — field of view polygon of camera 104, comprising vertices 240 through 256, with the omitted actual field of view vertices 264-265 shown.]

FIG. 3c
U.S. Patent    Mar. 19, 2002    Sheet 6 of 9    US 6,359,647 B1

[Drawing: FIG. 4 — three dimensional representation of the secured area and the field of view polyhedron of camera 104, represented by vertices 441 through 462, including the views through portals 480 and 481.]

FIG. 4
U.S. Patent    Mar. 19, 2002    Sheet 7 of 9    US 6,359,647 B1

[Drawing: FIGS. 5a and 5b — image from camera 502 containing figure 511, before and after the camera is adjusted to center the figure; also shown are the line of sight 580, bounding rays 581 and 582, and view angle 585 of camera 502.]

FIG. 5a    FIG. 5b
U.S. Patent    Mar. 19, 2002    Sheet 8 of 9    US 6,359,647 B1

[Drawing: FIG. 6a — flowchart of the target location determination process:]
  610: Get camera ID, and position of figure in image.
  615: Determine LOS direction from camera to target.
  Range branch:
  624: Determine distance R, from camera to target, along LOS.
  628: Target location = point P on LOS at a distance R from location of camera.
  Triangulate branch:
  632: LOS1 = LOS; CAM1 = camera ID.
  634: Get camera ID (CAM2) and position of figure in another image.
  636: Determine LOS2, direction from CAM2 to target.
  638: Target location P = intersection of LOS1 from CAM1 and LOS2 from CAM2.
  640: Process (filter, predict, etc.) target position P.

FIG. 6a
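The two location-determination branches of FIG. 6a (ranging along a single line of sight, and triangulation from two lines of sight) can be sketched as follows. This is an illustrative reconstruction, not code from the patent; the vector conventions (Cartesian coordinates, direction vectors normalized internally) are assumptions.

```python
import numpy as np

def locate_by_range(cam_pos, los_dir, r):
    """Blocks 615-628: target P lies on the LOS at distance R from the camera."""
    d = np.asarray(los_dir, dtype=float)
    return np.asarray(cam_pos, dtype=float) + r * d / np.linalg.norm(d)

def locate_by_triangulation(cam1_pos, los1_dir, cam2_pos, los2_dir):
    """Blocks 632-638: target P is the intersection of LOS1 and LOS2
    (for skew 3-D lines, the midpoint of the closest points).
    Assumes the two lines of sight are not parallel."""
    p1, d1 = np.asarray(cam1_pos, float), np.asarray(los1_dir, float)
    p2, d2 = np.asarray(cam2_pos, float), np.asarray(los2_dir, float)
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    # Solve for t1, t2 minimizing |(p1 + t1*d1) - (p2 + t2*d2)|
    a = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(p2 - p1) @ d1, (p2 - p1) @ d2])
    t1, t2 = np.linalg.solve(a, b)
    return ((p1 + t1 * d1) + (p2 + t2 * d2)) / 2
```

As the flowchart notes, the triangulation branch requires the target to be visible from a second camera; the ranging branch needs only one camera plus a distance estimate.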
U.S. Patent    Mar. 19, 2002    Sheet 9 of 9    US 6,359,647 B1

[Drawing: FIG. 6b — flowchart of the camera handoff process:]
  650: Get coordinates (X, Y, Z) of target location P.
  652: For each camera i
  680: Remote alarm.
  690: User selects alternate camera.
  692: Mark the camera associated with this alarm.
  694: Get target coordinates (X, Y, Z) associated with this alarm.
  660: Select a marked camera.
  664: Determine line of sight from camera position to X, Y, Z.
  668: Adjust camera to this line of sight.
  670: Update figure tracking system.
  675: Return.

FIG. 6b
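Block 664 of FIG. 6b determines the line of sight from a camera's position to the target coordinates (X, Y, Z). A minimal sketch follows; the pan/tilt angle conventions (pan measured in the floor plane, tilt as elevation, both in degrees) are assumptions, as the patent does not specify them.

```python
import math

def line_of_sight(cam_pos, target):
    """Block 664: pan/tilt angles from camera position to target (X, Y, Z)."""
    dx = target[0] - cam_pos[0]
    dy = target[1] - cam_pos[1]
    dz = target[2] - cam_pos[2]
    pan = math.degrees(math.atan2(dy, dx))                    # bearing in the floor plane
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))   # elevation above the floor
    return pan, tilt
```

Block 668 would then drive the camera's pan/tilt actuators to these angles, and block 670 would hand the new line of sight to the figure tracking system.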
AUTOMATED CAMERA HANDOFF SYSTEM FOR FIGURE TRACKING IN A MULTIPLE CAMERA SYSTEM

BACKGROUND OF THE INVENTION

1. Field of the Invention

This invention relates to a system for controlling multiple video cameras. This invention allows for an automated camera handoff for selecting and directing cameras within a multi-camera system, as might be used in a security system or a multi-camera broadcasting system. The automation is provided by tracking a figure within the image from an individual camera, coupled with an area representation of the fields of view of each of the other cameras.

2. Description of Related Art

Security systems for airports, casinos, and the like typically employ a multitude of cameras that provide images of selected areas to a control station. The images from each of these cameras, or a subset of these cameras, are displayed on one or more monitors at the control station. The operator of the control station is provided an ability to select any one of the cameras for a display of its image on a primary monitor, and, if the camera is adjustable, to control the camera's field of view. Such control systems are also utilized for selecting from among multiple cameras at an event being broadcast, for example, multiple cameras at a sports arena, or studio.

The selection and control of the cameras is typically accomplished by controlling a bank of switches, or by selecting from amongst a list of cameras on a computer terminal. To view a particular area, the operator selects the camera associated with that area. If the camera is adjustable, the operator subsequently adjusts the selected camera's field of view by adjusting its rotation about a vertical axis (pan) or horizontal axis (tilt), or its magnification (zoom). The entire span of an adjustable camera's span of view is termed herein as the camera's potential field of view, whereas the view resulting from the particular pan, tilt, and zoom settings is termed the camera's actual field of view.

Image processing algorithms are available which allow for the identification of a particular pattern, or figure, within an image, and the identification of any subsequent movement of that figure. Coupled with a security control system, such image processing algorithms allow for the automated adjustment of a camera so as to keep the figure in the center of the camera's actual field of view. When the figure travels beyond the potential field of view of the camera, the operator selects another camera whose potential field of view contains the figure at its new location, adjusts the camera, identifies the figure in the camera's actual field of view, and thereafter continues the automated tracking until the figure exits that camera's potential field of view.

In the conventional camera selection scenario, the operator must be familiar with the layout of the secured area, as well as the correspondence between the displayed image and this layout. That is, for example, if a figure is seen exiting through one of several doorways, the operator must be able to quickly determine to which other area that particular doorway leads, and must further determine which camera includes that other area.

SUMMARY OF THE INVENTION

It is an object of this invention to provide for the automation of a multiple camera system, so as to provide for a figure tracking capability. The preferred multi-camera system will allow for the near continuous display of a figure as the figure moves about throughout the multiple cameras' potential fields of view.

The approximate physical location of a figure is determined from the displayed image, the identification of the figure within this image by the figure tracking system, and a knowledge of the location and actual field of view of the camera which is producing the displayed image. If the figure exits a selected camera's field of view, another camera containing the figure within its field of view is selected. The bounds of each camera's potential field of view are contained in the system. The system determines which cameras' potential fields of view contain the figure by determining whether the figure's determined physical location lies within the bounds of each camera's field of view.

In a preferred embodiment, when the figure approaches the bounds of the selected camera's potential field of view, the system determines which other camera's potential field of view contains the figure, then adjusts that other camera's actual field of view to contain the figure. When the figure is at the bounds of the selected camera's field of view, the system automatically selects the other camera and communicates the appropriate information to the figure tracking process to continue the tracking of the figure using this other camera.

In a further embodiment of the invention, the system also contains predictive location determination algorithms. By assessing the movement of the figure, the selection and adjustment of the next camera can be effected based upon the predicted subsequent location of the figure. Such predictive techniques are effective for tracking a figure in a secured area in which the cameras' fields of view are not necessarily overlapping, and also for selecting from among multiple cameras containing the figure in their potential field of view.

By associating the displayed image to the physical locale of the secured area, the operator need not determine the potential egress points from each camera's field of view, nor need the operator know which camera or cameras cover a given area, nor which areas are adjacent each other.

In another embodiment, the selection of a target is also automated. Security systems often automatically select a camera associated with an alarm, for the presentation of a view of the alarmed area to the operator. By associating a target point with each alarm, for example the entry way of a door having an alarm, the system can automatically select and adjust the camera associated with the alarm to contain that target point, and identify the target as those portions of the image which exhibit movement. Thereafter, the system will track the target, as discussed above.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 illustrates an example multi-camera security system in accordance with this invention.

FIG. 2 illustrates an example graphic representation of a secured area with a multi-camera security system, in accordance with this invention.

FIGS. 3a, 3b and 3c illustrate example field of view polygons associated with cameras in a multi-camera security system, in accordance with this invention.

FIG. 4 illustrates an example three dimensional representation of a secured area and a camera's field of view polyhedron, in accordance with this invention.

FIGS. 5a, 5b and 5c illustrate an example of the association between a figure in an image from a camera and the
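The handoff rule summarized above — keep the selected camera while the figure's determined physical location lies within its potential field of view, otherwise select a camera that does contain it — can be sketched as follows. This is an illustrative sketch, not the patent's implementation; the `Camera` class with rectangular bounds is a hypothetical stand-in for the field-of-view polygons described later.

```python
from dataclasses import dataclass

@dataclass
class Camera:
    """Hypothetical stand-in: a rectangular potential field of view.
    The patent models potential fields of view as polygons or polyhedra."""
    xmin: float
    ymin: float
    xmax: float
    ymax: float

    def contains(self, p):
        x, y = p
        return self.xmin <= x <= self.xmax and self.ymin <= y <= self.ymax

def select_camera(cameras, selected, location):
    """Keep the selected camera while it contains the figure's physical
    location; otherwise hand off to a camera that does contain it."""
    if cameras[selected].contains(location):
        return selected
    for name, cam in cameras.items():
        if cam.contains(location):
            return name
    return selected  # no camera covers the location; keep the current view
```

In the patent's terms, the containment test runs against each camera's potential field of view, and the winning camera's actual field of view is then adjusted to contain the figure.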
physical representation of the secured area, in accordance with this invention.

FIGS. 6a and 6b illustrate example flowcharts for the automated camera handoff process in accordance with this invention.

DESCRIPTION OF THE PREFERRED EMBODIMENTS

FIG. 1 illustrates a multi-camera security system. The system comprises video cameras 101, 102, 103 and 104-106 (shown in FIG. 2). Cameras 101 and 102 are shown as adjustable, pan/tilt/zoom, cameras. The cameras 101, 102, 103 provide an input to a camera handoff system 120; the connections between the cameras 101, 102, 103 and the camera handoff system 120 may be direct or remote, for example, via a telephone connection. In accordance with this invention, the camera handoff system 120 includes a controller 130, a location determinator 140, and a field of view determinator 150. The controller 130 effects the control of the cameras 101, 102, 103 based on inputs from the sensors 111, 112, the operator station 170, and the location determinator 140 and field of view determinator 150.

An operator controls the security system via an operator's station 170, and controller 130. The operator typically selects from options presented on a screen 180 to select one of the cameras 101, 102, 103, and controls the selected camera to change its line of sight, via pan and tilt adjustments, or magnification factor, via zoom adjustments. The image from the selected camera's field of view is presented to the operator for viewing via the switch 135.

The optional alarm sensors 111, 112 provide for automatic camera selection when an alarm condition is sensed. Each alarm sensor has one or more cameras associated with it; when the alarm is activated, an associated camera is selected and adjusted to a predefined line of sight and the view is displayed on the screen 180 for the operator's further assessment and subsequent security actions.

The field of view determinator 150 determines the field of view of each camera based upon its location and orientation. Non-adjustable camera 103 has a fixed field of view, whereas the adjustable cameras 101, 102 each have varying fields of view, depending upon the current pan, tilt, and zoom settings of the camera. To facilitate the determination of each camera's field of view, the camera handoff system 120 includes a database 160 that describes the secured area and the location of each camera. The database 160 may include a graphic representation of the secured area, for example, a floor plan as shown in FIG. 2. The floor plan is created and entered in the control system when the security system is installed, using for example Computer Aided Design (CAD) techniques well known to one skilled in the art. Each wall and obstruction is shown, as well as the location of each of the cameras 101-106.

The location determinator 140 determines the location of an object within a selected camera's field of view. Based upon the object's location within the image from the selected camera, and the camera's physical location and orientation within the secured area, the location determinator 140 determines the object's physical location within the secured area. The controller 130 determines which cameras' fields of view include the object's physical location and selects the appropriate camera when the object traverses from one camera's field of view to another camera's field of view. The switching from one camera to another is termed a camera handoff.

In a preferred embodiment, the camera handoff is further automated via the use of the figure tracking system 144 within the location determinator 140. In FIG. 2, line segments P1 through P5 represent the path of a person (not shown) traversing the secured areas. The operator of the security system, upon detecting the figure of the person in the image of camera 105, identifies the figure to the figure tracking system 144, typically by outlining the figure on a copy of the image from camera 105 on the video screen 180. Alternatively, automated means can be employed to identify moving objects in an image that conform to a particular target profile, such as size, shape, speed, etc. Camera 105 is initially adjusted to capture the figure, and the figure tracking techniques continually monitor and report the location of the figure in the image produced from camera 105. The figure tracking system 144 associates the characteristics of the selected area, such as color combinations and patterns, to the identified target. Thereafter, the figure tracking system 144 determines the subsequent location of this same characteristic pattern, corresponding to the movement of the identified target as it moves about the camera's field of view. Manual figure tracking by the operator may be used in addition to, or in lieu of, the automated figure tracking system 144. In a busy scene, the operator may be better able to distinguish the target. In a manual figure tracking mode, the operator uses a mouse or other suitable input device to point to the target as it traverses the image on the display 180.

If camera 105 is adjustable, the controller 130 adjusts camera 105 to maintain the target figure in the center of the image from camera 105. That is, camera 105's line of sight and actual field of view will be adjusted to continue to contain the figure as the person moves along path P1 within camera 105's potential field of view. Soon after the person progresses along path P2, the person will no longer be within camera 105's potential field of view.

In accordance with this invention, based upon the determined location of the person and the determined field of view of each camera, the controller 130 selects camera 106 when the person enters camera 106's potential field of view. In a preferred embodiment that includes a figure tracking system 144, the figure tracking techniques will subsequently be applied to continue to track the figure in the image from camera 106. Similarly, the system in accordance with this invention will select camera 103, then camera 102, then camera 104, and then camera 102 again, as the person proceeds along the P3-P4-P5 path.

To effect this automatic selection of cameras, the camera handoff system 120 includes a representation of each camera's location and potential field of view, relative to each other. For consistency, the camera locations are provided relative to the site plan of the secured area that is contained in the secured area database 160. Associated with each camera is a polygon or polyhedron, outlining each camera's potential field of view. FIG. 3a illustrates the polygon associated with camera 102. FIG. 3b illustrates the polygon associated with camera 103. Camera 102 is a camera having an adjustable field of view, and thus can view any area within a full 360 degree arc, provided that it is not blocked by an obstruction. Camera 103 is a camera with a fixed field of view, as represented by the limited view angle 203. Camera 102's potential field of view is the polygon bounded by vertices 221 through 229. Camera 103's field of view is the polygon bounded by vertices 230-239. As shown, the field of view polygon can include details such as the ability to see through passages in obstructions, such as shown by the vertices 238 and 239 in FIG. 3b. Also associated with each camera is the location of the camera, shown for example as 220, 230, 240 in FIGS. 3a, 3b, 3c. The polygon
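The containment test underlying the handoff — whether a physical location lies within a camera's potential-field-of-view polygon — can be sketched with a standard even-odd (ray-casting) test. This is an illustrative sketch, assuming each polygon (such as the vertex lists of FIGS. 3a-3c) is stored as an ordered list of (x, y) vertices in the site plan coordinate system.

```python
def point_in_fov_polygon(point, vertices):
    """Even-odd test: count crossings of a horizontal ray from `point`
    with the polygon's edges; an odd count means the point is inside."""
    x, y = point
    inside = False
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge spans the ray's y level
            # x-coordinate where this edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```

The same even-odd idea extends to the three dimensional polyhedra of FIG. 4, at the increased computational cost the text mentions.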
representing the field of view of camera 104 is shown in FIG. 3c, comprising vertices 240 through 256. As shown in FIG. 3c, the field of view polygon can omit details, as shown by the use of vertices 244-245, omitting the actual field of view vertices 264-265. The level of detail of the polygons is relatively arbitrary; typically, one would provide the detail necessary to cover the maximum surveillance area within the secured area. If one area is coverable by multiple cameras, the need is minimal for identifying the fact that a particular camera can also view that area by viewing through a doorway. Conversely, if the only view of an area is through such a doorway, the encoding of the polygon to include this otherwise uncovered area may be worthwhile. Similarly, although an unobstructed view of a camera is infinite, polygon bounds can be defined to merely include the area of interest, as shown for example in FIG. 3c, where the bounds 249-250 and 253-254 are drawn just beyond the perimeter of the area being secured.

The site map may also be represented as a three dimensional model, as shown in FIG. 4. In a three dimensional model, the cameras' fields of view are represented by polyhedra, to include the three-dimensional nature of a camera's field of view. The polyhedron associated with camera 104 is shown in FIG. 4, and is represented by the vertices 441 through 462. As discussed above, the detail of the polyhedron model is dependent upon the level of precision desired. For example, vertices 449 through 454 model the view through the portal 480 as a wedge shaped area, whereas vertices 455 through 462 model the view through the portal 481 as a block shaped area. Three dimensional modeling will provide for greater flexibility and accuracy in the determination of the actual location of the target, but at increased computational costs. For ease of understanding, two dimensional modeling will be discussed hereafter. The techniques employed are equally applicable to three dimensional site maps, as would be evident to one skilled in the art.

The coordinate system utilized for encoding the camera locations and orientations can be any convenient form. Actual dimensions, relative to a reference such as the floor plan, may be used; or, scaled dimensions, such as screen coordinates, may be used. Techniques for converting from one coordinate system to another are well known to one skilled in the art, and different coordinate systems may be utilized as required. Combinations of three dimensional modeling and two dimensional modeling may also be employed, wherein for example, the cameras at each floor of a multistoried building are represented by a two dimensional plan, and each of these two dimensional plans has a third, elevation, dimension associated with it. In this manner, the computationally complex process of associating an image to a physical locale can operate in the two dimensional representation, and the third dimension need only be processed when the target enters an elevator or stairway.

FIGS. 5a-5c demonstrate the association of a figure in an image to a target in the physical coordinate system, in accordance with this invention. An image 510 from camera 502 (shown in FIG. 5c), containing a figure 511, is shown in FIG. 5a. As discussed above, figure tracking processes are available that determine a figure's location within an image and allow a camera control system to adjust camera 502's line of sight so as to center the figure in the image, as shown in FIG. 5b. The controller 130 in accordance with this invention will maintain the camera 502's actual line of sight, in terms of the physical site plan, for subsequent processing. If the camera is not adjustable, the line of sight from the camera to the figure is determined by the angular distance the figure is offset from the center of the image. By adjusting the camera to center the figure, a greater degree of accuracy can be achieved in resolving the actual line of sight to the figure. With either an adjustable or non-adjustable camera, the direction of the target from the camera, in relation to the physical site plan, can thus be determined. For ease of understanding, the line of sight is used herein as the straight line between the camera and the target in the physical coordinate site plan, independent of whether the camera is adjusted to effect this line of sight.

FIG. 5c illustrates the physical representation of a secured area, as well as the location of camera 502, the line of sight 580 to the target, and the camera's actual field of view, as bounded by rays 581 and 582 about an angle of view 585. To determine the precise location of the target along the line of sight 580, two alternative techniques can be employed: triangulation and ranging. In triangulation, if the target is along the line of sight of another camera, the intersection of the lines of sight will determine the target's actual location along these lines of sight. This triangulation method, however, requires that the target lie within the field of view of two or more cameras. Alternatively, with auto-focus techniques being readily available, the target's distance (range) from the camera can be determined by the setting of the focus adjustment to bring the target into focus. Because the distance of the focal point of the camera is directly correlated to the adjustment of the focus control on the camera, the amount of focus control applied to bring the target into focus will provide sufficient information to estimate the distance of the target from the location of the camera, provided that the correlation between focus control and focal distance is known. Any number of known techniques can be employed for modeling the correlation between focus control and focal distance. Alternatively, the camera itself may contain the ability to report the focal distance, directly, to the camera handoff system. Or, the focal distance information may be provided based upon independent means, such as radar or sonar ranging means associated with each camera.

In the preferred embodiment, the correlation between focus control and focal distance is modeled as a polynomial, associating the angular rotation x of the focus control to the focal distance R as follows:

R = a0 + a1*x + a2*x^2 + . . . + an*x^n

The degree n of the polynomial determines the overall accuracy of the range estimate. In a relatively simple system, a two degree polynomial (n=2) will be sufficient; in the preferred embodiment, a four degree polynomial (n=4) is found to provide highly accurate results. The coefficients a0 through an are determined empirically. At least n+1 measurements are taken, adjusting the focus x of the camera to focus upon an item placed at each of n+1 distances from the camera. Conventional least squares curve fitting techniques are applied to this set of measurements to determine the curve coefficients a0 through an. These measurements and curve fitting techniques can be applied to each camera, to determine the particular polynomial coefficients for each camera; or, a single set of polynomial coefficients can be applied to all cameras having the same auto-focus mechanism. In a preferred embodiment, the common single set of coefficients are provided as the default parameters for each camera, with a capability of subsequently modifying these coefficients via camera specific measurements, as required.

If the camera is not adjustable, or fixed focus, alternative techniques can also be employed to estimate the range of the target from the camera. For example, if the target to be
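The focus-to-range calibration described above — a least squares fit of R = a0 + a1*x + ... + an*x^n to at least n+1 measurements — can be sketched with a standard polynomial fit. The calibration values below are illustrative (generated from an assumed quadratic relationship), not data from the patent.

```python
import numpy as np

# Calibration measurements: focus-control rotation x recorded while focused
# on items placed at known distances R (illustrative values, assumption).
x_meas = np.linspace(0.0, 2.5, 6)
r_meas = 1.0 + 0.5 * x_meas + 0.2 * x_meas**2

# Least squares fit of the degree-4 polynomial of the preferred embodiment
# (n = 4 requires at least n + 1 = 5 measurement points).
coeffs = np.polyfit(x_meas, r_meas, deg=4)

def range_from_focus(x):
    """Estimate the target's distance R from the focus-control rotation x."""
    return np.polyval(coeffs, x)
```

Per the text, one such coefficient set can be fit per camera, or a single set can serve as the default for all cameras sharing the same auto-focus mechanism.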
`
`

`

`
`
`US 6,359,647 Bl
`
`7
`8
tracked can be expected to be of a given average physical size, the size of the figure of the target in the image can be used to estimate the distance, using the conventional square law correlation between image size and distance. Similarly, if the camera's line of sight is set at an angle to the surface of the secured area, the vertical location of the figure in the displayed image will be correlated to the distance from the camera. These and other techniques are well known in the art for estimating an object's distance, or range, from a camera.

Given the estimated distance from the camera, and the camera's position and line of sight, the target location P, in the site plan coordinate system, corresponding to the figure location in the displayed image from the camera, can be determined. Given the target location P, the cameras within whose fields of view the location P lies can be determined. This is because the cameras' fields of view are modeled in this same coordinate system. Additionally, the cameras whose fields of view are in proximity to the location P can also be determined.

At option, each of the cameras including the target point ...

... a person, comprising arbitrarily moving appendages and relatively unsharp edges, is difficult to determine absolutely. Data smoothing techniques can be applied so as to minimize the jitter in the predictive location Q, whether determined using a linear or non-linear model. These and other techniques of motion estimation and location prediction are well known to those skilled in the art.

Given a predicted location Q, in the site map coordinate system, the cameras containing the point Q within their potential fields of view can be determined. If the predicted location Q lies outside the limits of the current camera's potential field of view, an alternative camera, containing location Q in its field of view, is selected and adjusted so as to provide the target in its actual field of view. The system need not wait until the predicted location is no longer within the current camera's field of view; if the predicted location Q is approaching the bounds of the selected camera's field of view, but well within the bounds of another camera's field of view, the other camera can be selected. Similarly, the distance from each camera can be utilized in this selection process. As is common in the art, a weightin...
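The range-estimation techniques described above (a fitted polynomial over auto-focus settings with coefficients a0 through an, and the square-law correlation between apparent figure size and distance) can be sketched as follows. This is an illustrative sketch only, not the patent's implementation; the function names, the pinhole-style size model, and all calibration values are hypothetical.

```python
def range_from_focus(focus_setting, coefficients):
    """Distance from the auto-focus setting via a fitted polynomial
    a0 + a1*f + a2*f**2 + ... + an*f**n, with coefficients a0..an
    obtained by curve fitting per-camera calibration measurements."""
    distance = 0.0
    for n, a_n in enumerate(coefficients):
        distance += a_n * focus_setting ** n
    return distance

def range_from_figure_size(figure_height_px, assumed_height_m, focal_px):
    """Distance from apparent figure size, assuming the target has a
    given average physical size: apparent linear size falls off as
    1/distance (apparent area as 1/distance**2, the 'square law')."""
    return assumed_height_m * focal_px / figure_height_px
```

Under these hypothetical numbers, halving the figure's apparent height doubles the estimated range, e.g. `range_from_figure_size(100, 1.8, 1000)` gives 18.0 and `range_from_figure_size(50, 1.8, 1000)` gives 36.0.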
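The handoff decision described above, in which the system switches cameras before the predicted location Q leaves the current camera's potential field of view, can be sketched as below. This is a simplified illustration, not the patent's method: the circular potential-field-of-view model, the fixed hand-off margin, and the nearest-distance weighting are all assumptions made for this example.

```python
import math

class Camera:
    """A camera on the site-plan coordinate system with a circular
    potential field of view (an assumed simplification)."""
    def __init__(self, name, x, y, reach):
        self.name = name
        self.x, self.y = x, y      # position on the site plan
        self.reach = reach         # radius of the potential field of view

    def distance_to(self, qx, qy):
        return math.hypot(qx - self.x, qy - self.y)

    def can_cover(self, qx, qy):
        """True if point Q lies within this camera's potential field of view."""
        return self.distance_to(qx, qy) <= self.reach

def select_camera(cameras, current, qx, qy, margin=0.9):
    """Keep the current camera while Q is well within its bounds; once Q
    approaches the edge (beyond the margin), hand off to a camera that
    covers Q, weighted here simply by nearest distance."""
    if current.distance_to(qx, qy) <= margin * current.reach:
        return current
    candidates = [c for c in cameras if c.can_cover(qx, qy)]
    if not candidates:
        return None                # Q outside every potential field of view
    return min(candidates, key=lambda c: c.distance_to(qx, qy))
```

With two hypothetical cameras A at (0, 0) and B at (20, 0), each with reach 10, a target predicted at (15, 0) is handed off from A to B even though the exact choice of weighting (here, nearest distance) is left open in the text.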
