`
(12) United States Patent
Ahiska et al.
(10) Patent No.: US 7,990,422 B2
(45) Date of Patent: Aug. 2, 2011
`
(54) AUTOMATICALLY EXPANDING THE ZOOM CAPABILITY OF A WIDE ANGLE VIDEO CAMERA
`
`(75)
`
`Inventors: Bartu Ahiska, Guildford (GB); Mark
`Kenneth Davey, Bromley (GB); Ahmet
`-
`-
`1 nl:
`Ems cetm’
`ara (TR)
`(73) Assignee: Grandeye, Ltd., Guildford, Surrey (GB)
( * ) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b).
`
4,821,209 A    4/1989   Hempel et al.
4,992,866 A    2/1991   Morgan
5,027,287 A    6/1991   Artigalas et al.
5,164,827 A *  11/1992  Paff .......................... 348/143
5,311,305 A    5/1994   Mahadevan et al.
`(Continued)
FOREIGN PATENT DOCUMENTS
EP    1 341 383 A2    9/2003
`
`(21) Appl. No.: 11/184,720
`
`(Continued)
`
`(22)
`
`(65)
`
`Filed:
`
`Jul. 19, 2005
`.
`.
`.
`Prlor Publlcatlon Data
`US 2006/0056056 A1
`Mar. 16, 2006
`
OTHER PUBLICATIONS
Comaniciu, D., Ramesh, V., and Meer, P., "Real-Time Tracking of Non-Rigid Objects Using Mean-shift," IEEE Computer Vision and Pattern Recognition, vol. II, 2000, pp. 142-149.
`
Related U.S. Application Data
`
`(Continued)
`
Primary Examiner - Jason Chan
Assistant Examiner - Joel Fosselman
(74) Attorney, Agent, or Firm - Robert Groover; Storm LLP
`(57)
`ABSTRACT
`.
`.
`.
`.
`A system for automatically expanding the zoom capability of
`a wide-angle Video camera using images from multiple cam-
`era locations. One preferred embodiment achieves this using
`images from the wide-angle Video camera that are analyzed to
`identify regions of interest (ROI). Pan-Tilt-Zoom (PTZ) con-
`trols are then sent to aim slave cameras toward the R01.
`Processing circuitry is then used to replace the ROI from the
`wide-angle images with the higher-resolution images from
`one of the slave cameras. In addition, motion-detecting soft-
`ware can be utilized to automatically detect, track, and/or
`ZOOIII in 011 mOVing objects
`
`6 Claims, 9 Drawing Sheets
`
[Representative front-page figure: a master wide-angle video camera (wide-angle optical system 101, image sensor, image processing circuitry, output and control circuitry) exchanges video and control with a base station, and sends PTZ control to, and receives position feedback and digital video signals from, slave PTZ video cameras, each with optical zoom capability (optical zoom system 107).]
`APPL-1007 / Page 1 of 21
`Apple v. Corephotonics
`
(60) Provisional application No. 60/589,104, filed on Jul. 19, 2004.

(51) Int. Cl.
    H04N 5/225    (2006.01)
    H04N 5/232    (2006.01)
(52) U.S. Cl. ........................ 348/218.1; 348/211.3
(58) Field of Classification Search ............. 348/218.1, 345/428
    See application file for complete search history.
(56)    References Cited
    U.S. PATENT DOCUMENTS
`
`g’ggg’ggg :
`4,326,218 A
`4,549,208 A
`4,667,236 A
`
`113;? 55:;cechowsky
`4/1982 Coutta et al.
`10/1985 Kamejima et al.
`5/1987 Dresdner
`
`
`
`
`US 7,990,422 B2
`
`Page 2
`
U.S. PATENT DOCUMENTS
5,359,363 A    10/1994  Kuban et al.
5,365,597 A    11/1994  Holeva
5,384,588 A    1/1995   Martin et al.
5,394,209 A    2/1995   Stiepel et al.
5,396,284 A    3/1995   Freeman
RE34,989 E     7/1995   Struhs et al.
5,434,617 A    7/1995   Bianchi
5,495,292 A    2/1996   Zhang
5,530,650 A    6/1996   Biferno et al.
5,539,483 A    7/1996   Nalwa
5,563,650 A    10/1996  Poelstra
5,589,901 A    12/1996  Means
5,610,391 A    3/1997   Ringlien
5,627,616 A    5/1997   Sergeant et al.
5,654,750 A    8/1997   Weil et al.
5,666,157 A    9/1997   Aviv
5,684,937 A    11/1997  Oxaal
6,049,281 A    4/2000   Osterweil
6,147,709 A    11/2000  Martin et al.
6,215,519 B1   4/2001   Nayar et al.
6,243,099 B1   6/2001   Oxaal
6,344,852 B1   2/2002   Zhu
6,509,926 B1   1/2003   Mills et al.
6,724,421 B1   4/2004   Glatt
6,757,434 B2   6/2004   Miled et al.
6,763,068 B2   7/2004   Oktem
6,853,809 B2 * 2/2005   Pelletier .................... 396/85
2002/0063711 A1 * 5/2002   Park et al. ................ 345/428
2003/0210329 A1 * 11/2003  Aagaard et al. ............. 348/157
2005/0018045 A1 * 1/2005   Thomas et al. .............. 348/169
2006/0197839 A1 * 9/2006   Senior et al. .............. 348/159

FOREIGN PATENT DOCUMENTS
WO    WO 02/062056 A1    8/2002
`
OTHER PUBLICATIONS
Y. Yardimci, I. Yilmaz, A. E. Cetin, "Correlation Tracking Based on Wavelet Domain Information," Proceedings of SPIE vol. 5204, San Diego, Aug. 5-7, 2003.
A. M. Bagci, Y. Yardimci, A. E. Cetin, "Moving Object Detection Using Adaptive Subband Decomposition and Fractional Lower-Order Statistics in Video Sequences," Signal Processing, 82 (12): 1941-1947, Dec. 2002.
C. Stauffer, W. Grimson, "Adaptive Background Mixture Models for Real-Time Tracking," Proc. IEEE CS Conf. on Computer Vision and Pattern Recognition, vol. 2, 1999, pp. 246-252.
"A System for Video Surveillance and Monitoring," in Proc. American Nuclear Society (ANS) Eighth International Topical Meeting on Robotics and Remote Systems, Pittsburgh, PA, Apr. 25-29, 1999, by Collins, Lipton and Kanade.
Aube, 12th International Conference on Automatic Fire Detection, 2001.
X. Zhou, R. Collins, T. Kanade, and P. Metes, "A Master-Slave System to Acquire Biometric Imagery of Humans at Distance", ACM International Workshop on Video Surveillance, Nov. 2003.
`
2003/0128756 A1    7/2003   Oktem
`
`* cited by examiner
`
`
U.S. Patent    Aug. 2, 2011    Sheet 1 of 9    US 7,990,422 B2

[FIG. 1A: A master wide-angle video camera (wide-angle optical system 101, image sensor, image processing circuitry, output circuitry, control circuitry; elements 102 and 105 also labeled) sends video to and receives control from a base station, and sends PTZ control to and receives position feedback from a slave PTZ video camera (with optical zoom capability, optical zoom system 107), which returns a digital video signal.]

FIG. 1A
`
`
`
U.S. Patent    Aug. 2, 2011    Sheet 2 of 9    US 7,990,422 B2

[FIG. 1B: A master wide-angle video camera (wide-angle optical system 101) sends PTZ control to, and receives position feedback and a digital video signal from, each of two slave PTZ video cameras (each with optical zoom capability, optical zoom system 107).]

FIG. 1B
`
`
`
U.S. Patent    Aug. 2, 2011    Sheet 3 of 9    US 7,990,422 B2

[FIG. 2: A master wide-angle video camera (wide-angle optical system 201, image sensor, image processing circuitry, output circuitry, control circuitry, and an analog-to-digital convertor; element 212 also labeled) sends video to and receives control from a base station; the slave PTZ video camera (with optical zoom capability, optical zoom system 207) returns an analog video signal and position feedback and receives PTZ control.]

FIG. 2
`
`
U.S. Patent    Aug. 2, 2011    Sheet 4 of 9    US 7,990,422 B2

[FIG. 3: A master wide-angle video camera (wide-angle optical system 301, image sensor 302, image processing circuitry, control circuitry, and analog conversion and formatting circuitry feeding an analog video output; elements 304, 305, and 306 also labeled) exchanges video and control with a base station; the slave PTZ video camera (with optical zoom capability, optical zoom system 307) returns an analog video signal and position feedback and receives PTZ control.]

FIG. 3
`
`
U.S. Patent    Aug. 2, 2011    Sheet 5 of 9    US 7,990,422 B2

[FIG. 4: A master wide-angle video camera (wide-angle optical system 401, image sensor, image processing circuitry, control circuitry, and multiplexing, compression and formatting circuitry; elements 403 and 407 also labeled) sends digital video to and receives control from a base station; the slave PTZ video camera (with optical zoom capability) returns a digital video signal and position feedback and receives PTZ control from an optical zoom system interface.]

FIG. 4
`
`
`
U.S. Patent    Aug. 2, 2011    Sheet 6 of 9    US 7,990,422 B2

[FIG. 5: A master wide-angle video camera (wide-angle optical system 501, image sensor 502, image processing circuitry, control circuitry, multiplexing, compression and formatting circuitry 505, and analog-to-digital conversion circuitry 506) sends digital video to and receives control from a base station; the slave PTZ video camera (with optical zoom capability, optical zoom system 507) returns an analog video signal and position feedback and receives PTZ control.]

FIG. 5
`
`
`
U.S. Patent    Aug. 2, 2011    Sheet 7 of 9    US 7,990,422 B2

[FIG. 6: A master wide-angle video camera (wide-angle optical system 601, image sensor 602, image processing circuitry, output circuitry, control circuitry) sends video to and receives control from a base station; the slave PTZ video camera (with optical zoom capability, optical zoom system 607) receives PTZ control, returns position feedback, and sends its video directly to the base station.]

FIG. 6
`
`
`
U.S. Patent    Aug. 2, 2011    Sheet 8 of 9    US 7,990,422 B2

[FIG. 7 (flowchart, steps 702-708): determine the salient points in the image of the master camera; determine the salient points in the image of the slave camera; match the salient points using local color histogram comparison; update the pixels of the image In of the master camera using the pixels of Jn; output image.]

FIG. 7
`
`
`
U.S. Patent    Aug. 2, 2011    Sheet 9 of 9    US 7,990,422 B2

[FIG. 8 (flowchart, steps 802-814): capturing a distorted wide-angle video image using the wide-angle master camera; defining a RoI in the master camera view; transmitting estimated PTZ commands to the slave camera from the master camera to achieve approximate view matching with the RoI; reverse transforming the digital output of the slave camera in the master camera; comparing the distorted slave image with the distorted wide-angle image to determine matching accuracy and transmitting adjustment PTZ control signals to the slave camera until the accuracy is met; replacing the perspective corrected master camera view by the adjusted slave camera view to achieve an expanded zoom function.]

FIG. 8
`
`
`
`AUTOMATICALLY EXPANDING THE ZOOM
`CAPABILITY OF A WIDE-ANGLE VIDEO
`CAMERA
`
`CROSS-REFERENCE TO OTHER
`APPLICATIONS
`
This application claims priority from provisional U.S. patent application 60/589,104 filed Jul. 19, 2004, which is
`hereby incorporated by reference.
`
`BACKGROUND AND SUMMARY OF THE
`INVENTION
`
`1. Field of the Invention
`
`The present inventions relate to video monitoring systems,
`and more specifically, to automatically expanding the zoom
`capability of a wide-angle video camera.
`2. Background
`Real-time video surveillance systems have become
`increasingly popular in security monitoring applications. A
`new class of cameras replaces the mechanical Pan-Tilt-Zoom
`(PTZ) functions with a wide-angle optical system and image
processing, as discussed in U.S. patent application Ser. No.
`10/837,019 entitled “Method of Simultaneously Displaying
`Multiple Views for Video Surveillance,” which is hereby
`incorporated by reference. This class of cameras is further
discussed in U.S. patent application Ser. No. 10/837,325
`entitled “Multiple View Processing in Wide-Angle Video
`Camera,” which is hereby incorporated by reference. This
`type of camera monitors a wide field of view and selects
`regions from it to transmit to a base station; in this way it
`emulates the behavior of a mechanical PTZ camera. The
`
`wide-angle optics introduces distortion into the captured
`image, and processing algorithms are used to correct the
`distortion and convert it to a view that has the same perspec-
`tive as a mechanical PTZ camera.
`
The U.S. patent application Ser. No. 10/837,326 entitled,
`“Multiple Object Processing in Wide-Angle Video Camera”
`by Yavuz Ahiska, which is hereby incorporated by reference,
`describes a way to correct the distorted view captured by a
`wide-angle camera. This camera, even using this distortion-
`correction process, only has limited capabilities to zoom into
`a region of interest. The camera can also be a programmable
one as described in U.S. patent application Ser. No. 10/837,
`325, entitled “Multiple View Processing in Wide-Angle
`Video Camera,” containing programmable embedded micro-
`processors.
There exists a conflict between a video camera's field of view and the effective resolution of its image. Wide-angle
`lenses rarely offer any significant optical zoom, and similarly,
`video cameras with a high zoom capability have restricted
`fields of view (especially when their magnification is
`increased).
`A solution to monitoring a wide-angle area while being
`able to capture regions at a higher detail is to utilize multiple
cameras at differing locations. The U.S. Pat. No. 6,724,421,
`which is hereby incorporated by reference, and the public
`domain document, “A Master-Slave System to Acquire Bio-
`metric Imagery of Humans at Distance,” by X. Zhou et al,
`which is hereby incorporated by reference, describe systems
`using multiple cameras to monitor a wide-angle area. In these
`systems, a separate base station unit controls the two cameras
`monitoring the scene. In addition, these systems do not try to
`expand the zoom function of the master camera.
The U.S. Pat. No. 6,147,709, which is hereby incorporated by reference, describes a method and apparatus for overlaying a high-resolution image onto a hemispherical interactive
`image captured by a camera by matching at least three points
`between the high-resolution image and the perspective cor-
`rected image. A major drawback with this process is that it
`makes comparisons in the perspective corrected domain.
`Moving regions in a video corresponding to persons or
`moving objects, together with tracked objects which may no
`longer be moving, and their local neighborhoods in the video
`define Regions of Interest (RoI) because persons, moving
`and/or tracked objects, etc. are important in security monitor-
`ing applications. In order to provide real-time alarms for
`dangerous events, RoI should be tracked and zoomed for
`closer inspection. Conventional Closed Circuit Television
`(CCTV) systems, which only capture recorded video for later
`analysis, cannot provide automatic alarm and event triggers
`without delay.
A wide field of view camera that can both monitor a wide-angle scene, while also being able to simultaneously and
`automatically capture regions of interest at a greater magni-
`fication is very desirable in surveillance systems. For
`example, a high-resolution image could make the difference
`in positively identifying a criminal committing an offense or
`the detail surrounding an unattended suitcase. Therefore, it is
`very important to provide a high-resolution view of a person
`in a surveillance application.
`Wide-angle surveillance is necessary in many CCTV
`applications. Cameras such as dome cameras and cameras
`with fisheye or peripheral lenses can produce wide-angle
`video. A major weakness of wide-angle surveillance cameras
`and systems is that they either do not have the capability to
`zoom into a RoI or are limited in their zooming capability.
`The system can also have a computer program comprising
`a machine-readable medium having computer executable
`program instructions thereon for executing the moving object
`detection and object tracking algorithms fully in the program-
mable camera device as described in U.S. patent application
`Ser. No. 10/924,279, entitled “Tracking Moving Objects in
`Video Using Wavelet Domain Information,” by A. E. Cetin
`and Y. Ahiska, which is hereby incorporated by reference.
`Automatic moving-object detection and object tracking capa-
`bility of the wide field of view camera can define a RoI in the
`wide-angle scene monitored by the camera containing the
`object in question. As this RoI will be of interest in many
`security applications, the region can be tracked by the elec-
`tronic PTZ capability of the master camera.
`There is a present demand for a system that can both
`monitor a wide area, while also being able to simultaneously
`and automatically capture regions of interest at a higher reso-
`lution.
`
`Automatically Expanding the Zoom Capability of a Wide-
`Angle Video Camera
The present innovations include a new approach that achieves the ability to monitor a wide-angle area while being able to capture regions of higher detail.
`In one example embodiment, a wide-angle, master camera,
`such as a dome camera or a camera with a fish-eye or periph-
`eral lens, preferably with substantially no zoom capabilities,
`is used to capture images and automatically identify RoI, e.g.
`motion detecting and/or object tracking. In this embodiment,
`at least one other camera, preferably with expanded zoom
`capabilities relative to the master camera, can be used to zoom
`into the identified RoI. The views from the cameras other than
`
`10
`
`15
`
`20
`
`25
`
`30
`
`35
`
`40
`
`45
`
`50
`
`55
`
`60
`
`the master camera can be used for several purposes including,
`but not limited to, input into the master camera or output to a
`base station.
`
`65
`
`In another example embodiment, control circuitry sends
`PTZ controls to one or more slave cameras based in at least
`
`partial dependence on the wide-angle images captured by the
`master camera. Among other things, these controls can be
`used to aim the slave camera towards the RoI and/or zoom the
`slave camera onto the RoI.
`
`In another class of embodiments, the output of a slave
`camera is compared to the images captured by the master
camera and PTZ controls are sent to one or more slave cameras based in at least partial dependence on the comparison.
`Output images from the slave cameras can then be used for
`several purposes including, but not limited to, comparing
`them to RoI from the master camera, outputting them to a base
`station, or overlaying them onto other images.
In one example of this embodiment, after the slave camera has
`moved in accordance with the PTZ controls, the output from
`the slave camera can be compared to the images from the
`master camera to generate a new set of PTZ controls. This
`process can be, but does not have to be, used to match the
`output images from the slave camera to the RoI identified in
`the output images from the master camera. This process can
`be, but does not have to be, an iterative process that can be
`repeated to yield any level of desired matching accuracy.
There are multiple methods for implementing this synchronization including, but not limited to, image-processing techniques to match views, calibration procedures, or position analysis of feedback from the slave camera.
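The iterative matching process described above can be sketched as follows. This is a minimal illustration, not the patented method: capture_slave, send_ptz, and the pixel-difference metric are hypothetical stand-ins for the camera interfaces and comparison left open by the text, and a real system would derive the PTZ correction from the measured offset rather than a fixed nudge.

```python
import numpy as np

def roi_difference(master_roi, slave_img):
    """Mean absolute pixel difference between the RoI cut from the
    master image and the (cropped) slave image. Hypothetical metric."""
    h, w = master_roi.shape[:2]
    slave_small = slave_img[:h, :w]  # stand-in for a real resample
    return float(np.mean(np.abs(master_roi.astype(float) - slave_small.astype(float))))

def match_views(master_roi, capture_slave, send_ptz, tol=2.0, max_iters=10):
    """Iteratively adjust the slave camera until its view matches the RoI.

    capture_slave() returns the slave's current frame; send_ptz(dp, dt, dz)
    applies a pan/tilt/zoom correction. Both are hypothetical callbacks.
    """
    err = float("inf")
    for _ in range(max_iters):
        slave_img = capture_slave()
        err = roi_difference(master_roi, slave_img)
        if err <= tol:
            return True, err            # desired matching accuracy reached
        # A real system would estimate the offset (e.g. by correlation)
        # and send a proportional correction; this fixed step is a stub.
        send_ptz(0.1, 0.1, 0.0)
    return False, err
```

Each pass compares the current slave frame to the master's RoI and, if the match is not yet good enough, issues another PTZ adjustment, mirroring the repeat-until-accuracy loop in the text.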
`In another example embodiment, the images from the slave
`camera can be used to replace, correct, inset, or overlay some
`or all of the images from the master camera. The composite
`images can be used for several purposes including, but not
`limited to, recording them, outputting them to a base station,
`and/or using them to generate PTZ controls.
In another embodiment, several slave cameras, preferably
`monitoring different regions, can be used and the perspective-
`corrected view of the master camera can be altered in at least
`
`partial dependence on the adjusted views of at least one of
`these slave cameras.
`
`In another embodiment, motion-detecting software can be
`utilized to define the RoI as moving regions in the video
`corresponding to, but not limited to, persons, moving objects,
`tracked objects which may no longer be moving, and/or their
`local neighborhoods in the video.
`These and other embodiments of the present innovations
`are described more fully below.
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
`The disclosed inventions will be described with reference
`
`to the accompanying drawings, which show important
sample embodiments of the invention and which are incorpo-
`rated in the specification hereof by reference, wherein:
`FIG. 1A shows a diagram of a camera system consistent
`with a preferred embodiment of the present invention.
`FIG. 1B shows a diagram of a camera system consistent
`with a preferred embodiment of the present invention.
`FIG. 2 shows a diagram of a camera system consistent with
`a preferred embodiment of the present invention.
`FIG. 3 shows a diagram of a camera system consistent with
`a preferred embodiment of the present invention.
`FIG. 4 shows a diagram of a camera system consistent with
`a preferred embodiment of the present invention.
`FIG. 5 shows a diagram of a camera system consistent with
`a preferred embodiment of the present invention.
`FIG. 6 shows a diagram of a camera system consistent with
`a preferred embodiment of the present invention.
`FIG. 7 shows a flowchart implementing process steps con-
`sistent with the preferred embodiment of the present inven-
`tion.
`
`10
`
`15
`
`20
`
`25
`
`30
`
`35
`
`40
`
`45
`
`50
`
`55
`
`60
`
`65
`
`4
`
`FIG. 8 shows a flowchart implementing process steps con-
`sistent with the preferred embodiment of the present inven-
`tion.
`
`DETAILED DESCRIPTION OF THE PREFERRED
`EMBODIMENTS
`
`The numerous innovative teachings of the present applica-
`tion will be described with particular reference to the pres-
`ently preferred embodiment (by way of example, and not of
`limitation).
`Before the present innovations, the systems available only
`had limited zoom capabilities, but using a slave PTZ camera
`controlled from the master camera can expand this electronic-
`zooming capability to get even higher resolution images of
`the RoI.
`
Unlike U.S. Pat. No. 6,147,709, the methods and systems disclosed below make their comparisons in the wide-angle distorted domain within the master camera, as opposed to the perspective corrected domain, to generate PTZ com-
`mands for controlling a slave PTZ video camera. This can be
`an iterative process, and yields the desired matching accuracy
`given enough steps.
`The control system within the master camera that performs
`view matching between a master, wide-angle video camera
`and a slave PTZ camera allows the master camera to acquire
`detailed images of required areas from the slave camera.
`There are multiple methods for implementing this synchro-
`nization namely using image-processing techniques to match
`views and/or by a calibration procedure. Position feedback
from the slave camera can also be a useful element of information for accurate view matching. The U.S. Pat. No. 6,509,
`926 entitled “Surveillance Apparatus for Camera Surveil-
`lance System,” which is hereby incorporated by reference,
`discusses a system for generating the azimuth and elevation
`angles of a camera and lens. Carrying out this process in the
`distorted domain allows the comparison to be made without
losing anything in terms of quality. The similarity of images or vectors x and y can be measured in many different ways; the best-known is the Euclidean distance, ||x-y||, but one can also use ||g(x)-g(y)||, where g is an appropriate function representing the distortion.
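The two measures can be illustrated with a short sketch. The distortion function g below is a toy radial compression chosen purely for demonstration; it is an assumption, not the distortion model of the patent.

```python
import numpy as np

def euclidean(x, y):
    # The best-known measure: ||x - y||
    return float(np.linalg.norm(np.asarray(x, float) - np.asarray(y, float)))

def distorted_distance(x, y, g):
    # ||g(x) - g(y)||, where g models the wide-angle distortion.
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    return float(np.linalg.norm(g(x) - g(y)))

# Toy "distortion": compress magnitudes, keep signs (illustrative only).
g = lambda v: np.sqrt(np.abs(v)) * np.sign(v)
```

With a real distortion model for g, this lets two views be compared directly in the distorted domain, as the text proposes.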
`In a preferred embodiment, the master wide-angle camera
`has the capability of sending PTZ control signals to the slave
`PTZ camera to zoom into the RoI in an automatic manner by
`implementing the motion-detection and/or object tracking
`algorithms on the current wide field of view image. In an
`example embodiment, the slave is commanded to go to a set
`angle, which can be described as a PTZ control although it is
`not the standard widespread PTZ interface. A control system
`resident in the master camera can perform view matching
`between the master, wide-angle video camera and the slave
`PTZ camera. One preferred embodiment uses image-process-
`ing techniques to match the views of the master and slave
`cameras, allowing detailed images of the RoI to be acquired.
`In one class of embodiments, the master and slave cameras
`are calibrated and PTZ controls can be sent to the slave
`
`camera based at least partially on these calibrations. If cali-
`bration between the master and slave cameras is insufficient
`
`alone, image registration or matching can be carried out either
`using the corrected images of the scene or using the raw
`wide-angle images captured by the master camera.
The following is one possible example of the calibration process between the master and slave cameras. It provides a switching mode in which the master camera's output can be switched to the slave camera's output; the switch can be based on a predefined zoom point, where the slave camera's position
`can then be lined up with the master camera’s selected view
`and, if slave tracking is being used, the slave camera can be
`used to follow an object being tracked by motion tracking.
`The master camera’s wide-angle view can be divided into
`smaller regions and, using image processing, these regions
`can be zoomed in on. By tracking these smaller views, the
master camera is acting as a virtual camera, or VCAM. As mentioned earlier, the zoomed-in VCAM views have
`smoothed edges and blurred details. The slave camera views
`are needed to capture the level of detail required in most
`surveillance applications.
`Manual Mode:
`
The video of Head A (in one preferred embodiment this refers to the view of the output from the master camera) is switched to the video output of the slave camera on pressing
`the enter key. Once switched, keyboard control of the slave
`camera is provided. The slave camera can then be manually
`calibrated to aim at the same object as the master camera.
`Pressing escape returns the video and keyboard control to the
`master camera. In another example embodiment, there can be
`two analogue outputs from the master camera, each referred
`to as a Head. A monitor can view the output from a Head.
`There can be BNC connectors at the back of the master
camera, labeled A and B so that a monitor can be connected to
`either Head A or Head B.
`Zoom Switch Mode:
`
`While controlling any VCAM on Head A, if the field of
`view goes beyond 25 degrees, the video is switched to the
`slave camera and it is moved to the same position as the
master camera's VCAM. An option is provided for pre-posi-
`tioning. If this option is turned on, the slave camera will be
`moved to the position of the master camera’s VCAM at a
`zoom level of 30 degrees and is repositioned until the zoom
`level reaches 25 degrees at which point the video is switched
`to the slave camera. Once switched, keyboard control of the
`slave camera is provided. Pressing escape returns the video
`and keyboard control to the master camera.
`Slave Tracking Mode:
Whenever motion tracking is triggered, Head A is switched
`to the slave camera and the slave camera is moved to the
`
`position of the motion rectangle being tracked and is updated
as the motion rectangle moves. If the motion rectangle stops moving or slows down, the slave camera is zoomed in, and it will zoom out again if the motion rectangle moves faster again. If
`the user moves the joystick, control of the slave camera will
`be given to them. If after 5 seconds of no activity from the
`keyboard the camera is still tracking, it will return the slave
camera to tracking the motion rectangle. If tracking has ended, control and video will be returned to the master camera.
`
`Design
Three additional classes were required in order to implement the master/slave features: slavecalibration, slavemanager, and serialout. The code described here is enabled via the abspos setting within the adome.ini.
`Slavecalibration:
`
`This class is responsible for calibrating the master camera
`with the slave camera and translating between the master
`camera spherical coordinates and slave camera spherical
`coordinates.
`
The calibration phase is run by positioning the slave camera at a reference point. Its video output is then displayed on Head A while a VCAM is positioned at the same reference point (but in the coordinate system of the master camera). The user then has control of positioning of the master camera's VCAM and should line up the VCAM to match the image from the slave
camera. Once matched, the user would press enter and the current position of the VCAM would be read. The difference
in pan between the reference point and the new position of the
`VCAM is stored for later use by the translation function. For
`the calibration phase the slavecalibration class collaborates
`with the Menus class and it is possible to use more than one
`reference point if required without changing the Menus class
`(see GenerateCalibrationPresets function of the slavecalibra-
`tion class).
The second responsibility of the slavecalibration class is to provide a translation of the master camera's coordinates to the
`slave camera’s coordinates for the slavemanager class. This is
`done in the TranslateToSlaveCoordinates function by firstly
`assuming that the point being viewed is a distance of 5 meters
`away. The spherical coordinates are then translated into Car-
`tesian coordinates. A rotation in the z-axis by the difference in
`pan that was measured during the calibration phase is then
`made. A translation in x and z coordinates is then made. This
`
`translation is accounting for the physical distance between
`the two cameras (including their difference in height). The
`mounting kit will ensure that the distance between the two
cameras is constant along the x-axis. As the height of the slave
`cameras can be different from one another the z-axis transla-
`
`tion depends on which slave camera is connected. The final
`stage is to convert the translated and rotated Cartesian coor-
`dinates back into spherical coordinates.
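The translation steps described above can be sketched as follows, under stated assumptions: the viewed point is taken to be 5 meters away, the pan offset comes from calibration, and dx/dz stand for the fixed x-axis separation and per-slave height difference. The particular spherical convention (z up, tilt measured from the horizontal) is an assumption, and the function name merely echoes TranslateToSlaveCoordinates for readability.

```python
import math

def translate_to_slave(pan, tilt, pan_offset, dx, dz, r=5.0):
    """Sketch of the spherical -> Cartesian -> rotate -> translate ->
    spherical pipeline; angles in radians, distances in meters."""
    # Spherical -> Cartesian (z up), assuming the point is r meters away.
    x = r * math.cos(tilt) * math.cos(pan)
    y = r * math.cos(tilt) * math.sin(pan)
    z = r * math.sin(tilt)
    # Rotate about the z-axis by the pan difference measured at calibration.
    xr = x * math.cos(pan_offset) - y * math.sin(pan_offset)
    yr = x * math.sin(pan_offset) + y * math.cos(pan_offset)
    # Translate by the physical offset between the cameras (x and height).
    xt, yt, zt = xr + dx, yr, z + dz
    # Cartesian -> spherical for the slave camera.
    slave_pan = math.atan2(yt, xt)
    slave_tilt = math.atan2(zt, math.hypot(xt, yt))
    return slave_pan, slave_tilt
```

With a zero pan offset and zero physical offset the translation is the identity, which is a quick sanity check on the conventions chosen here.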
`Slavemanager:
`The slavemanager class is responsible for checking the
`zoom level for when to switch to the slave camera, switching
`the video to the slave camera, positioning the slave camera
`and dealing with timeouts from no keyboard activity.
`The ProcessSlaveMode function is called once per frame.
`If the zoom switch is enabled it will check the zoom level of
the active VCAM on Head A and, if it is under 25, it will switch
`
`to the slave camera and position it by calling the SetSlave-
Mode function (described below). If prepositioning is enabled, it will also position the slave camera, but not switch to it, when the zoom level is between 30 and 25. This is done
`
`by calling the SwitchSlaveToNearestPreset function (de-
`scribed below). A timer is managed by the class in order to
`deal with timeouts from slave mode and slave tracking mode.
`This timer is checked in this function and the appropriate
`mode entered after a timeout. The timer is reset by calling the
`SlaveMoved function (this is done by serialOut and serial
`described below).
`The SetSlaveMode function switches the video to the slave
`
`camera and positions it. The switch to the slave camera video
is done by setting a bit of the register controlling the CPLD via
`an i2c write. The positioning is carried out by reading the
`current position of the active VCAM, translating the coordi-
`nates by calling the TranslateToSlaveCoordinates function of
`the slavecalibration class and passing it to the output queue
`for the serialout class to deal with (described below).
The SwitchSlaveToNearestPreset function takes the master camera's spherical coordinates, translates them using the TranslateToSlaveCoordinates function of the slavecalibration class, and passes the result to the output queue for the serialout class to deal with (described below). This is used by the prepositioning and
`by the MotionTracker class for slave tracking (described
`below).
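The per-frame decision logic of ProcessSlaveMode, with the 25- and 30-degree thresholds given above, can be sketched like this. The function and its string return values are invented for illustration; the real code calls SetSlaveMode and SwitchSlaveToNearestPreset rather than returning labels.

```python
def process_slave_mode(zoom_level, zoom_switch_enabled, prepositioning_enabled):
    """Per-frame decision sketch. Returns 'switch' (switch video to the
    slave and position it), 'preposition' (position the slave but keep
    the master's video), or 'none'."""
    if not zoom_switch_enabled:
        return 'none'
    if zoom_level < 25:
        return 'switch'          # would call SetSlaveMode
    if prepositioning_enabled and 25 <= zoom_level <= 30:
        return 'preposition'     # would call SwitchSlaveToNearestPreset
    return 'none'
```

The zoom level here is the field of view in degrees, so "under 25" means the user has zoomed in past the point the master camera can serve well.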
`Serialout:
`
`The serialout class is responsible for sending commands to
`the slave camera via RS485. The serialout class runs a sepa-
`rate thread, which blocks on the output queue until a com-
`mand is added to the queue. Once a command is added to the
`queue it calls the appropriate send function on the serial class.
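The serialout pattern, a dedicated thread that blocks on an output queue and forwards each queued command, can be sketched as follows. This is a Python stand-in: the send function substitutes for the RS485 serial class, and the None sentinel used for clean shutdown is an addition of this sketch, not part of the described design.

```python
import queue
import threading

class SerialOut:
    """Worker thread that blocks on an output queue and forwards each
    command to a send function (standing in for the serial class)."""

    def __init__(self, send_fn):
        self.queue = queue.Queue()
        self._send = send_fn
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def _run(self):
        while True:
            cmd = self.queue.get()      # blocks until a command is added
            if cmd is None:             # sentinel: stop cleanly (sketch only)
                break
            self._send(cmd)

    def stop(self):
        self.queue.put(None)
        self._thread.join()
```

Producers (such as the slavemanager logic) simply put commands on the queue; because Queue is thread-safe and FIFO, commands reach the slave camera in the order they were issued without the callers ever blocking on the serial link.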
In addition to the new classes described above, some changes have been made to existing classes. The key changes are described below:
`Serial:
`
`The serial class that deals with keyboard input has the
`addition of a passthrough mode which is enabled when in
`slave mode, slave tracking mode or slave calibration mode. In
`the passthrough mode all commands received are passed out
`of the second serial port (the one connected to the slave
`do