(12) United States Patent
Milinusic

(10) Patent No.: US 7,106,333 B1
(45) Date of Patent: Sep. 12, 2006

(54) SURVEILLANCE SYSTEM

(75) Inventor: Tomislav F. Milinusic, Decatur, GA (US)

(73) Assignee: VistaScape Security Systems Corp., Atlanta, GA (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 507 days.

(21) Appl. No.: 10/079,639

(60) Related U.S. Application Data: Provisional application No. 60/317,635, filed on Sep. 6, 2001; provisional application No. 60/269,434, filed on Feb. 16, 2001; and provisional application No. 60/269,676, filed on Feb. 16, 2001.

(51) Int. Cl. G06T 13/00 (2006.01)

(52) U.S. Cl. ........ 345/474; 345/473; 348/152; 348/153; 348/154; 348/155

(58) Field of Classification Search ........ 345/474, 473; 348/143, 159, 154, 152, 153, 155; 382/153
    See application file for complete search history.

(56) References Cited

    U.S. PATENT DOCUMENTS
    6,069,655 A * 5/2000 Seeley et al. ........ 348/154

* cited by examiner

Primary Examiner: Kimbinh T. Nguyen
(74) Attorney, Agent, or Firm: Morris, Manning & Martin, LLP

(57) ABSTRACT

A system is provided for collecting surveillance data from one or more sensor units and incorporating the surveillance data into a surveillance database. The system is configured to retrieve surveillance data from the surveillance database and perform predetermined analytical functions on the data. The system is also configured to present surveillance data and the results of data analysis in one or more predetermined formats.

21 Claims, 4 Drawing Sheets
`
[Representative drawing (FIG. 4): DA units 472, 474 and 476, cameras with lenses 458 and 459, gimbals and gimbal controller, master controller with memory, temperature detector, network, surveillance server 210, database and surveillance client 240]
RPX Exhibit 1003
RPX v. MD Security

Page 1 of 9
`

`
U.S. Patent
Sep. 12, 2006
Sheet 1 of 4
US 7,106,333 B1

[FIG. 1: block diagram of surveillance system 100, showing sensor system 102 (video and digital cameras, GPS, laser, position, temperature and audio/sound detectors), network server, processing system (recording system and database) and command and control system (3-D visualization, video streaming, reports, system management and data retrieval)]

Page 2 of 9
`

`
U.S. Patent
Sep. 12, 2006
Sheet 2 of 4
US 7,106,333 B1

[FIG. 2: block diagram of a further embodiment of surveillance system 100, showing sensor units 250, 260 and 270, surveillance server 210 and surveillance client 240 connected to a network]

Page 3 of 9
`

`
U.S. Patent
Sep. 12, 2006
Sheet 3 of 4
US 7,106,333 B1

[FIG. 3: block diagram of an embodiment of surveillance server 210, showing the graphics processor, I/O processor and local interface]

Page 4 of 9

`
U.S. Patent
Sep. 12, 2006
Sheet 4 of 4
US 7,106,333 B1

[FIG. 4: block diagram of a further embodiment of surveillance system 100, showing DA units, gimbal controller, master controller, surveillance server and surveillance client]

Page 5 of 9

`
US 7,106,333 B1

SURVEILLANCE SYSTEM

CLAIM OF PRIORITY

This application claims priority to co-pending U.S. provisional application entitled "SCANNING CAMERA AND SURVEILLANCE SYSTEM," having Ser. No. 60/269,434, and filed Feb. 16, 2001; U.S. provisional application entitled "SURVEILLANCE CAMERA SYSTEM," having Ser. No. 60/269,676, and filed on Feb. 16, 2001; and U.S. provisional application entitled "SURVEILLANCE SYSTEM," having Ser. No. 60/317,635, and filed on Sep. 6, 2001, the disclosures of which are all entirely incorporated herein by reference.

TECHNICAL FIELD

The present invention is generally related to a surveillance system and, more particularly, to a system for collection, analysis and distribution of surveillance data.

BACKGROUND OF THE INVENTION

Systems designed to monitor predetermined areas, places or objects are known. These systems often incorporate video cameras that provide a continuous feed of video data that is either displayed in real time on a display device and/or recorded to a recording device, such as a video tape recorder. While these systems provide for capture and recordation of video data depicting the conditions and/or occurrences within the monitored area, they do not provide a means of easily determining when and where an occurrence or condition has taken place. Nor do they provide for any means of analyzing the information depicted by the video data.

Further, as video data requires substantial recording media space for storage, it is common for video data to be recorded and archived for only a very limited period of time. Thus, once the period of archiving has expired, the video data is either recorded over or otherwise erased from the recording media. Further, known systems do not provide for any type of analysis of video data that would allow for a determination of, for example, how long an intruder has been in a monitored area; whether the intruder is alone; how the intruder got into the monitored area; where the intruder has previously been; what the intentions of the intruder might be; or where the intruder may be going next.

SUMMARY OF THE INVENTION

The present invention provides a system for collecting and distributing surveillance data collected via one or more sensor units. Briefly described, in architecture, one embodiment of the system can be implemented as follows. Memory is provided. A surveillance database is provided that is stored on the memory. The surveillance database includes surveillance data collected by a surveillance sensor unit. A surveillance server is provided that is associated with the memory and is configured to receive surveillance data from a surveillance sensor unit that is configured to detect predetermined conditions and to generate surveillance data representative of the detected conditions.

Other features and advantages of the present invention will become apparent to one with skill in the art upon examination of the following drawings and detailed description. It is intended that all such additional features and advantages be included herein within the scope of the present invention.

BRIEF DESCRIPTION OF THE DRAWINGS

The invention can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present invention. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.

FIG. 1 is a block diagram illustrating a surveillance system 100;
FIG. 2 is a block diagram further illustrating the structure of surveillance system 100;
FIG. 3 is a block diagram illustrating an embodiment of surveillance server 210; and
FIG. 4 is a block diagram illustrating a further embodiment of surveillance system 100.

DETAILED DESCRIPTION

FIG. 1 is a block diagram representative of an embodiment of a surveillance system 100. The surveillance system 100 is structured to include a sensor system 102, a processing system 104, a network server 106 and a command and control system 112.

Sensor system 102 may include any type of detection or sensing device. Sensor system 102 may include one or more detection or sensing devices. Some examples of detection/sensing devices are: cameras, such as video or digital cameras; position sensors, such as global satellite positioning system (GPS) compliant receivers or transceivers, laser measurement devices and triangulation based positioning systems; radar, temperature detectors and the like. Further examples of detection/sensing devices include audio devices responsive to sound. These devices may be configured to capture audio data. The detection devices of sensor system 102 may be configured to capture and record captured data or to capture and transmit captured data to an intended receiving system or device. This captured data may be transmitted along with position data, such as ground coordinate data, as well as time data that may also be generated by the detection devices of the sensor system 102.

Processing system 104 includes systems for receiving, compiling and storing data received from sensor system 102. It includes processing unit 108 and database unit 110. Processing system 104 is also configured to retrieve data and distribute it according to input from command and control system 112.

Network server 106 may be configured to receive data from sensor system 102. It may also be configured to distribute data from processing system 104 in accordance with instructions/commands received from command and control system 112.

Command and control system 112 is configured to provide for control and management of surveillance system 100. Command and control system 112 may be configured to initiate retrieval of data from processing system 104 and to present data as, for example, representative 3-D visualizations based upon data received from processing system 104. It may also provide for presentation of video or audio data in a streaming format. Further, it may be configured to generate predetermined reports.

FIG. 2 is a block diagram illustrating a further embodiment of a surveillance system 100 according to the present invention. The surveillance system 100 may include a surveillance server 210 that is connected to a network 230. Surveillance server 210 is associated with a database 220. A surveillance client 240 is provided and is connected to the network 230. A sensor unit 250, a sensor unit 260 and a sensor unit 270 are also provided. Each of sensor units 250, 260 and 270 are connected to the network 230. Each of the

Page 6 of 9
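The record flow just described, sensor units generating timestamped, position-tagged data that server 210 incorporates into database 220, can be sketched as follows. The class and method names here are illustrative assumptions for the sketch, not terms used by the patent.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import List, Optional, Tuple

@dataclass
class SurveillanceRecord:
    """One unit of surveillance data as described for database 220:
    captured data plus optional position data and time data."""
    sensor_id: int                                  # e.g. 250, 260 or 270
    data: bytes                                     # frame, audio clip or reading
    position: Optional[Tuple[float, float, float]]  # ground coordinates (x, y, z)
    timestamp: datetime                             # time of collection

class SurveillanceDatabase:
    """Minimal stand-in for database 220: stores records and answers the
    simple when/where queries the background section says plain tape
    archives cannot."""
    def __init__(self) -> None:
        self._records: List[SurveillanceRecord] = []

    def incorporate(self, record: SurveillanceRecord) -> None:
        self._records.append(record)

    def records_from(self, sensor_id: int) -> List[SurveillanceRecord]:
        return [r for r in self._records if r.sensor_id == sensor_id]

# A sensor unit reports a detection; the server incorporates it.
db = SurveillanceDatabase()
db.incorporate(SurveillanceRecord(
    sensor_id=250,
    data=b"<frame>",
    position=(12.0, 34.0, 0.0),
    timestamp=datetime(2001, 9, 6, tzinfo=timezone.utc),
))
print(len(db.records_from(250)))  # 1
```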
`
`

`
US 7,106,333 B1

sensor units 250, 260 and 270 are configured to collect surveillance data. More particularly, the sensor units are configured to detect predetermined conditions or occurrences and generate surveillance data representative of the detected conditions or occurrences.

Database 220 may be stored on a memory device that is directly connected to the surveillance server 210 as shown. Alternatively, database 220 may be stored on a memory device that is connected to the network 230 and accessible to the surveillance server 210 via network 230. Database 220 may be configured to include surveillance data received from, for example, sensor units 250, 260 and/or 270. Surveillance data may include video data, still image data, audio data, position or location data, radar data, temperature data, as well as time data representative of, for example, the time at which surveillance data was collected by a respective sensor unit.

Network 230 may be a wide area network (WAN), such as, for example, the Internet, or a local area network (LAN). Each of the sensor units 250, 260 or 270 may be connected to the network 230 via an interface (not shown), such as a wireless or wired interface. Some examples of suitable wireless interfaces include, but are not limited to, radio frequency (RF) wireless interfaces or infrared (IR) interfaces. Other suitable interfaces may include data acquisition units (DA units) such as those described in co-pending U.S. patent application entitled "DATA ACQUISITION SYSTEM," filed on Mar. 13, 2001 and accorded Ser. No. 09/805,229, the disclosure of which is hereby incorporated herein in its entirety.

Surveillance client 240 may be implemented, for example, as a general-purpose computer or personal computer. Further, it may be implemented as a personal digital assistant (PDA) such as the Palm Pilot. Surveillance client 240 is preferably configured to allow a user to retrieve surveillance data or specified reports by issuing a request to surveillance server 210. Surveillance client 240 may also be configured to control or adjust specified sensor units via issuing requests to surveillance server 210 that are then transmitted to the specified sensor unit.

Sensor units 250, 260 and 270 are configured to collect surveillance data by detecting predetermined conditions or occurrences and generating and outputting surveillance data representative of the detected conditions or occurrences. Surveillance data may be transmitted to, for example, the surveillance server 210 via the network 230. The sensor units 250, 260 and 270 may be, for example, cameras, such as, for example, a digital camera or video camera configured to be responsive to, for example, the visible light spectrum or infrared radiation (IR). Further, sensor units 250, 260 and 270 may also be configured as position sensing devices, such as, for example, a global positioning satellite (GPS) receiver or GPS transceiver; a radar receiver, sonar receiver, temperature detector, motion detector and/or distance detection devices. They may also be audio detection devices, such as microphones or the like, that are capable of capturing audio/sound.

FIG. 3 is a block diagram of an embodiment of a surveillance server 210 according to the present invention. Surveillance server 210 is preferably configured to receive surveillance data from the various sensor units 250, 260 and 270 (FIG. 2) and to incorporate collected surveillance data into the database 220 (FIG. 2). It is also preferably configured to retrieve and distribute surveillance data to a requesting surveillance client. It may also be configured to analyze and/or distribute surveillance data to a surveillance client based upon predetermined distribution criteria. Further, surveillance server 210 may be configured to determine such things as how long a detected occurrence or condition has existed, whether there are other similar occurrences or conditions that exist, as well as what preceded the detected occurrence or condition. It may also be configured to predict future conditions or occurrences based upon detected conditions or occurrences. The surveillance server 210 may be configured to generate and display a three dimensional model of an area under monitor based upon the data stored in database 220. This model can then be used to analyze detected conditions or occurrences within the monitored area.

In this embodiment, surveillance server 210 includes a central processing unit 360 and storage memory 365 for storing data 368 and/or software 367. An input/output (I/O) processor 375 is provided for interfacing with associated input and output devices. A local interface 370 is provided for transferring data between the CPU 360, memory 365 and/or I/O processor 375. A graphics processor 385 is provided for processing graphical data. Associated input and output devices may include keyboard device 320, mouse/pointing device 326 and/or a network 130.

CPU 360 is preferably configured to operate in accordance with software 367 stored on memory 365. CPU 360 is preferably configured to control the operation of server 210 so that surveillance data may be received from the various sensor units 250, 260 and 270 (FIG. 2) and incorporated into the surveillance database 220 (FIG. 2). It is also preferably configured to retrieve and distribute surveillance data to a requesting surveillance client 240 or based upon predetermined distribution criteria. Further, it may also be configured to determine duration of detected occurrences and preceding conditions or occurrences. It may also be configured to predict future conditions or occurrences based upon detected conditions or occurrences represented by surveillance data stored in the surveillance database 220.

The processor 385 and/or CPU 360 of the present invention can be implemented in hardware, software, firmware, or a combination thereof. In the preferred embodiment(s), the processor 385 is implemented in software or firmware that is stored in a memory and that is executed by a suitable instruction execution system. If implemented in hardware, as in an alternative embodiment, the processor 385 and/or CPU 360 can be implemented with any or a combination of the following technologies, which are all well known in the art: a discrete logic circuit(s) having logic gates for implementing logic functions upon data signals, an application specific integrated circuit having appropriate logic gates, a programmable gate array(s) (PGA), a fully programmable gate array (FPGA), etc. Processor 385 may be implemented as a general-purpose processor, such as, for example, the Intel® Pentium™ IV central processing unit. Further, processor 385 may be implemented as a graphics processor or a digital signal processor (DSP).

The processor 385 may be configured to incorporate or otherwise carry out the functions of CPU 360. CPU 360 may also be configured to incorporate or otherwise carry out the functions of processor 385.

The software 367 comprises a listing of executable instructions for implementing logical functions, and can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or

Page 7 of 9
`
`

`
US 7,106,333 B1

device and execute the instructions. In the context of this document, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The computer-readable medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. More specific examples (a nonexhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic) having one or more wires, a portable computer diskette (magnetic), a random access memory (RAM) (magnetic), a read-only memory (ROM) (magnetic), an erasable programmable read-only memory (EPROM or Flash memory) (magnetic), an optical fiber (optical), and a portable compact disc read-only memory (CDROM) (optical). Note that the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.

FIG. 4 is a diagram illustrating a further embodiment of system 100 in which sensor units 250 and 260 are cameras and sensor unit 270 includes a temperature detection device. Sensor unit 250 is configured as a visual spectrum sensitive camera 451 and an infrared radiation (IR) sensitive camera 452. The cameras 451 and 452 each preferably incorporate wide-angle optics (lens 458 and 459) to allow for viewing and/or capture of a wide field of view. The IR camera 452 includes an imager 456 that is preferably sensitive to IR. The visual spectrum camera 451 includes an imager 457 that is preferably sensitive to the visible light spectrum.

Sensor unit 260 is configured as an IR sensitive camera 461. The camera 461 preferably incorporates telephoto optics to allow for close-up monitoring and/or capture of an area, or objects within an area, from a greater distance. The IR camera 461 includes an imager 466 that is preferably sensitive to IR. It will be recognized that sensor unit 260 may also be configured as a visual spectrum sensitive camera. Similarly, it may be configured to include both IR and visual spectrum cameras.

Sensor unit 270 is configured as a temperature detection device. Sensor unit 270 may include a thermometer as well as smoke or carbon monoxide detection sensors.

In this example, imagers 456, 457 and 466 are preferably photo multiplier tubes (PMT). However, other types of imagers may also be used depending on the particular application at hand, including, but not limited to, charged coupled device (CCD) imagers or complementary metal oxide (CMOS) imagers.

Sensor units 250 and 260 are preferably configured to monitor a predetermined area. The cameras 451, 452 and 461 are configured to capture an image of the area and objects within the area and to generate and output image data representative of the area/objects. Image capture may be set to occur at predetermined times or upon the occurrence of predetermined occurrences, such as the detection of movement within the area being monitored by the sensor units 250 or 260. Sensor units 250 and 260 may be configured so as to be associated with a position-sensing device (PSD) that determines the position of, for example, the sensor unit, or an object or occurrence within the area being monitored by the sensor unit. The PSD will generate position data representative of the determined position of the object or occurrence.

Suitable PSD's may include global satellite positioning (GPS) receivers or transceivers, laser distance detection systems or position detection systems that use multiple sensor units of known location to calculate the location of the detected change/movement via triangulation techniques. Further, suitable PSD devices include those disclosed and described in co-pending U.S. patent application entitled "AN IMMERSIVE CAMERA SYSTEM," filed on Apr. 18, 2001 and accorded Ser. No. 09/837,916; and co-pending U.S. patent application entitled "A SCANNING CAMERA SYSTEM," filed on Apr. 18, 2001 and accorded Ser. No. 09/837,915, the disclosures of which are both hereby incorporated herein in their entirety.

Each of the sensor units 250, 260 and 270 may be configured to include one or more detection devices. Detection devices may be of the same type or different types. For example, sensor unit 250 may be configured to include a digital camera sensitive to IR and a camera sensitive to the visible light spectrum. It may also be configured to include a position sensing device for detecting the position of a detected occurrence or condition.

Image data generated and output by the camera units 250 and 260 may include position data representative of the position of the camera, the position of the area and/or the position of an object or objects within the area, as well as detected changes within the area. Position data may be generated by a position-sensing device (PSD) associated with the sensor unit 250 or 260.

Surveillance data is preferably output from the cameras 451, 452 and 461 and transmitted to data acquisition units (DA) 472, 474 and 476 that are provided for each camera 451, 452 and 461, respectively. In turn, surveillance data is transferred over the network 130 to surveillance server 210, which in turn causes the surveillance data to be incorporated into database 220.

Sensor units 250 and 260 may be supported and positioned by associated gimbals 453 and 463, respectively. One gimbal is preferably provided for each camera 451, 452 and 461. Alternatively, one gimbal may be provided for each sensor unit 250 and 260. In FIG. 4, gimbal 453 is associated with sensor unit 250 and gimbal 463 is associated with sensor unit 260. Each gimbal 453 and 463 is preferably mounted to a support device of some type, such as, for example, a tripod, concrete wall, building or other structure capable of providing support. Each gimbal 453 and 463 is adjustable about two axes of rotation (X-axis and Y-axis) and is preferably responsive to a control signal from a control device such as gimbal controller 485. By controlling the gimbal, the position of the sensor unit 250 or 260 may be moved about the x-axis and y-axis.

Surveillance data may include pixel data representative of the image captured by the camera. This pixel data may be stored into database 220. The database 220 may be configured to include pixel data representative of the captured image, as well as position data representative of the position (x, y and z) of the area/object represented by the pixel data. Additionally, the database 220 may be configured to include a time stamp indicative of the time at which the pixel data was captured, stored and/or changed. This time data may be generated by, for example, the sensor unit 250 or 260, or via master controller 480. It may also be generated by surveillance server 210.

The database 220 may be configured to include reference data representative of, for example, a base image representative of a predetermined view of the area being monitored. This predetermined view might be, for example, an image of the area in a typical state. For example, where the area is that of a warehouse interior area, the base image might be an image of the warehouse interior during non-business hours when no personnel are present and no activities are taking place (i.e., no changes in the area are occurring).

Page 8 of 9
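The base-image comparison described above, capturing on detected change against a stored reference view, can be sketched as a per-pixel difference. The grayscale grids and the fixed threshold are illustrative assumptions for the sketch; the patent does not specify a detection algorithm.

```python
def detect_changes(base, current, threshold=30):
    """Compare a captured frame against the stored base image and return
    the pixel positions that differ by more than `threshold` -- a minimal
    stand-in for the detect-and-capture-on-change behaviour described
    above. Frames are same-size 2-D grids of grayscale values (plain
    lists here; a real system would use camera frames)."""
    changed = []
    for y, (base_row, cur_row) in enumerate(zip(base, current)):
        for x, (b, c) in enumerate(zip(base_row, cur_row)):
            if abs(b - c) > threshold:
                changed.append((x, y))
    return changed

# Base image of the quiet warehouse vs. a frame with two changed pixels.
base = [[10, 10, 10], [10, 10, 10]]
current = [[10, 200, 10], [10, 10, 90]]
print(detect_changes(base, current))  # [(1, 0), (2, 1)]
```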
`
`

`
US 7,106,333 B1

As an example of the operation of the present invention, consider the following. The sensor unit 250 is configured to monitor a predetermined area, such as, for example, a railroad-switching yard. The sensor unit 250 is further configured to detect any changes in the area and capture an image of the changes within the area. These changes will typically represent movement of objects within the area being monitored. Once these changes are detected, image data representing an image of the area/objects are output via the DA unit 474 and subsequently recorded to the database 220.

Additionally, the location of the detected changes/movements is determined by sensor unit 250. This may be done via, for example, a laser distance detection system or via triangulation techniques wherein multiple sensor units of known location are used to calculate the location of the detected change/movement. In one embodiment, master controller 480 is configured to carry out calculations for determining the position of the detected change/movement in the monitored railroad yard based upon input from relevant position sensing devices (not shown) associated with the sensor unit 250.

Once the location of the change/movement has been determined, telephoto camera 461 may be engaged to "zoom-in" on the detected changes to obtain a closer view of the changes/movements at the determined location. Camera 461 may also be configured to capture an image of the area/objects at the location of the detected changes within the monitored railroad yard and to output image data representative of the area/objects. Subsequently, this image data can be recorded to the database 220, along with position data indicative of the location of the detected changes and time data representative of the time of the image capture of the changes.
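The triangulation step above, where multiple sensor units of known location locate a detected change, can be sketched in two dimensions. The bearing-intersection formulation below is one illustrative way to do it and is not spelled out in the patent; it assumes flat 2-D geometry and non-parallel bearings.

```python
import math

def triangulate(p1, bearing1, p2, bearing2):
    """Locate a detected change from two sensor units at known positions
    p1 and p2, each reporting a bearing (radians from the x-axis) to the
    target. Solves p1 + t1*d1 = p2 + t2*d2 for the intersection point."""
    x1, y1 = p1
    x2, y2 = p2
    d1 = (math.cos(bearing1), math.sin(bearing1))
    d2 = (math.cos(bearing2), math.sin(bearing2))
    denom = d1[0] * d2[1] - d1[1] * d2[0]  # 2-D cross product d1 x d2
    if abs(denom) < 1e-9:
        raise ValueError("bearings are parallel; position is indeterminate")
    t1 = ((x2 - x1) * d2[1] - (y2 - y1) * d2[0]) / denom
    return (x1 + t1 * d1[0], y1 + t1 * d1[1])

# Two sensors at known yard positions both sight the same moving object.
x, y = triangulate((0.0, 0.0), math.pi / 4, (10.0, 0.0), 3 * math.pi / 4)
print(round(x, 6), round(y, 6))  # 5.0 5.0
```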
It should be emphasized that the above-described embodiments of the present invention, particularly any "preferred" embodiments, are merely possible examples of implementations, merely set forth for a clear understanding of the principles of the invention. Many variations and modifications may be made to the above-described embodiment(s) of the invention without departing substantially from the spirit and principles of the invention. All such modifications and variations are intended to be included herein within the scope of the present invention and protected by the following claims.

Therefore, having thus described the invention, at least the following is claimed:

1. A surveillance management system for controlling at least one position-controllable surveillance device in response to processed surveillance data, comprising:
a sensor system including the at least one position-controllable surveillance device and configured to detect predetermined conditions and generate surveillance data in response thereto, said surveillance data including position data;
a processing system configured to receive said surveillance data and incorporate said surveillance data into a surveillance database;
a control and command system operative to retrieve predetermined position data from said surveillance data in said surveillance database and to generate a position control signal in accordance with said position data; and
a position-controllable surveillance device responsive to said control signal for adjusting the position of the surveillance device.
`
`
2. The system of claim 1, wherein said control and command system is further configured to generate and output reports based upon said surveillance data.
3. The system of claim 1, wherein said control and command system is further configured to distribute said surveillance data over a network.
4. The system of claim 1, wherein said control and command system is further configured to generate graphical representations for display on a display device, based upon said surveillance data.
5. The system of claim 1, wherein said sensor system comprises a sensor unit.
6. The system of claim 5, wherein said sensor unit is configured to detect predetermined conditions and to generate surveillance data representative of the detected conditions.
7. The system of claim 6, wherein said surveillance data comprises data indicative of the time said conditions were detected.
8. The system of claim 6, wherein said surveillance data comprises data indicative of the location of said detected conditions.
9. A surveillance management system for providing a position control signal usable by a position-controllable surveillance device, comprising:
a memory;
a surveillance database stored on said memory;
said surveillance database operative for storing surveillance data collected by a surveillance sensor unit, said surveillance data including position data; and
a surveillance server associated with said memory and configured to receive surveillance data including said position data from a surveillance sensor unit configured to detect predetermined conditions, to generate surveillance data representative of the detected conditions, and to generate a position control signal for utilization by said position-controllable surveillance device.
10. The system of claim 9, wherein said surveillance server is further configured to incorporate surveillance data received from said surveillance sensor unit into said surveillance database.
11. The system of claim 10, wherein said surveillance data comprises data indicative of the time said predetermined conditions were detected.
12. The system of claim 11, wherein said surveillance data comprises data indicative of the location where said predetermined conditions were detected.
13. The system of claim 12, wherein said surveillance data comprises data representative of said detected conditions.
14. The system of claim 12, wherein said surveillance data comprises video data representative of said detected conditions.
15. The system of claim 9, wherein said surveillance sensor unit comprises a detection device.
16. The system of claim 9, wherein said surveillance sensor unit comprises a plurality of detection devices.
17. The system of claim 15, wherein said detection device comprises a camera.
18. The system of claim 17, wherein said camera is responsive to the visible light spectrum.
19. The system of claim 17, wherein said camera is responsive to infrared radiation (IR).
20. The system of claim 17, wherein said camera comprises a video camera.
21. The system of claim 15, wherein said detection device comprises a position detection device.

* * * * *
Page 9 of 9
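The closed loop recited in claim 1 (retrieve stored position data, generate a position control signal, adjust the device) can be sketched as follows. The pan/tilt representation and the proportional-gain form of the correction are illustrative assumptions; the claim does not specify how the control signal is computed.

```python
def position_control_signal(current_pan_tilt, target_pan_tilt, gain=0.5):
    """Generate a pan/tilt adjustment in the spirit of claim 1's control
    and command system: compare the device's current orientation with the
    position recorded for a detected condition and emit a proportional
    correction toward it."""
    pan, tilt = current_pan_tilt
    target_pan, target_tilt = target_pan_tilt
    return (gain * (target_pan - pan), gain * (target_tilt - tilt))

# Device at (10, 0) degrees; a detected condition was recorded at (30, 20).
print(position_control_signal((10.0, 0.0), (30.0, 20.0)))  # (10.0, 10.0)
```

Applying the returned correction repeatedly walks the gimbal toward the recorded position, which is the adjustment behavior the final claim element attributes to the position-controllable surveillance device.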
