Lagassey

(10) Patent No.: US 7,348,895 B2
(45) Date of Patent: Mar. 25, 2008

US007348895B2

(54) ADVANCED AUTOMOBILE ACCIDENT DETECTION, DATA RECORDATION AND REPORTING SYSTEM

(76) Inventor: Paul J. Lagassey, P.O. Box 643207, Vero Beach, FL (US) 32964

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 210 days.

(21) Appl. No.: 11/267,732

(22) Filed: Nov. 3, 2005

(65) Prior Publication Data
US 2006/0092043 A1    May 4, 2006

Related U.S. Application Data
(60) Provisional application No. 60/522,749, filed on Nov. 3, 2004.

(51) Int. Cl.
G08G 1/095 (2006.01)
(52) U.S. Cl. ............. 340/907; 340/937; 340/909; 701/1
(58) Field of Classification Search ............. 340/907
See application file for complete search history.

(56) References Cited

U.S. PATENT DOCUMENTS
5,025,324 A    6/1991  Hashimoto
5,056,056 A    10/1991 Gustin
5,353,023 A    10/1994 Mitsugi
5,446,659 A    8/1995  Yamawaki
5,539,398 A    7/1996  Hall et al.
5,677,684 A    10/1997 McArthur
5,689,442 A *  11/1997 Swanson et al. ............ 380/241
5,699,056 A    12/1997 Yoshida
5,717,391 A    2/1998  Rodriguez
5,734,337 A    3/1998  Kupersmit
5,784,007 A *  7/1998  Pepper ............ 340/933
5,845,240 A    12/1998 Fielder
5,890,079 A    3/1999  Levine
5,938,717 A    8/1999  Dunne et al.
5,943,428 A    8/1999  Seri et al.
5,948,026 A    9/1999  Beemer et al.
5,948,038 A    9/1999  Daly et al.

(Continued)
OTHER PUBLICATIONS

Whitney and Pisano, "AutoAlert: Automated Acoustic Detection of Incidents," IDEA Project Final Report, Contract ITS-19, IDEA Program, Transportation Research Board, National Research Council, Dec. 26, 1995.

(Continued)
`
`Primary Examiner Jeffery Hofsass
`Assistant Examiner Kerri McNally
`(74) Attorney, Agent, or Firm—Milde & Hoffberg LLP
`
(57) ABSTRACT

A system for monitoring a location to detect and report a vehicular incident, comprising a transducer for detecting acoustic waves at the location, and having an audio output; a processor for determining a probable occurrence or impending occurrence of a vehicular incident, based at least upon said audio output; an imaging system for capturing images of the location, and having an image output; a buffer, receiving said image output, and storing at least a portion of said images commencing at or before said determination; and a communication link, for selectively communicating said portion of said images stored in said buffer with a remote location and at least information identifying the location, wherein information stored in said buffer is preserved at least until an acknowledgement of receipt is received representing successful transmission through said communication link with the remote location.
`
`43 Claims, 6 Drawing Sheets
`
U.S. PATENT DOCUMENTS

5,990,801 A    11/1999 Kyouno et al.
6,009,356 A    12/1999 Monroe
6,072,806 A *  6/2000  Khouri et al. ............ 370/465
6,075,466 A    6/2000  Cohen et al.
6,087,960 A    7/2000  Kyouno et al.
6,088,635 A *  7/2000  Cox et al. ............ 701/19
6,091,956 A    7/2000  Hollenberg
6,100,819 A *  8/2000  White ............ 340/933
6,111,523 A    8/2000  Mee
6,133,854 A *  10/2000 Yee et al. ............ 340/907
6,141,611 A    10/2000 Mackey et al.
6,154,658 A    11/2000 Caci
6,163,338 A    12/2000 Johnson et al.
6,211,907 B1   4/2001  Scaman et al.
6,226,389 B1 * 5/2001  Lemelson et al. ............ 382/104
6,252,544 B1   6/2001  Hoffberg
6,281,792 B1   8/2001  Lerg et al.
6,288,643 B1   9/2001  Lerg et al.
6,304,816 B1 * 10/2001 Berstis ............ 701/117
6,314,364 B1   11/2001 Nakamura
6,324,450 B1   11/2001 Iwama
6,339,370 B1   1/2002  Ruhl et al.
6,353,169 B1 * 3/2002  Juszkiewicz et al. ............ 84/600
6,366,219 B1 * 4/2002  Hoummady ............ 340/907
6,389,340 B1   5/2002  Rayner
6,392,692 B1   5/2002  Monroe
6,401,027 B1   6/2002  Xu et al.
6,404,352 B1   6/2002  Ichikawa et al.
6,427,113 B1 * 7/2002  Rahman ............ 701/117
6,429,812 B1   8/2002  Hoffberg
6,449,540 B1   9/2002  Rayner
6,466,260 B1 * 10/2002 Hatae et al. ............ 348/149
6,472,982 B2   10/2002 Eida et al.
6,542,077 B2   4/2003  Joao
6,573,831 B2   6/2003  Ikeda et al.
6,573,929 B1 * 6/2003  Glier et al. ............ 348/149
6,574,538 B2   6/2003  Sasaki
6,574,548 B2   6/2003  DeKock et al.
6,580,373 B1   6/2003  Ohashi
6,600,417 B2   7/2003  Lerg et al.
6,617,981 B2 * 9/2003  Basinger ............ 340/909
6,630,884 B1   10/2003 Shanmugham
6,633,238 B2   10/2003 Lemelson et al.
6,647,270 B1   11/2003 Himmelstein
6,684,137 B2   1/2004  Takagi et al.
6,690,294 B1   2/2004  Zierden
6,718,239 B2   4/2004  Rayner
6,760,061 B1 * 7/2004  Glier et al. ............ 348/149
6,781,523 B2 * 8/2004  Matsui et al. ............ 340/910
6,961,079 B2 * 11/2005 Kaylor et al. ............ 348/149
7,046,273 B2 * 5/2006  Suzuki ............ 348/157
2001/0005804 A1    6/2001  Rayner
2001/0040897 A1 *  11/2001 Hamlin ............ 370/466
2002/0008619 A1    1/2002  Lerg et al.
2002/0008637 A1    1/2002  Lemelson et al.
2002/0121969 A1    9/2002  Joao
2002/0147982 A1    10/2002 Naidoo et al.
2002/0163579 A1    11/2002 Patel et al.
2002/0170685 A1    11/2002 Welk et al.
2002/0193938 A1    12/2002 DeKock et al.
2003/0011684 A1    1/2003  Narayanaswami et al.
2003/0016143 A1    1/2003  Ghazarian ............ 340/901
2003/0041329 A1    2/2003  Bassett
2003/0053536 A1    3/2003  Ebram
2003/0062997 A1    4/2003  Naidoo et al.
2003/0067542 A1    4/2003  Monroe
2003/0080878 A1 *  5/2003  Kirmuss ............ 340/936
2003/0081121 A1    5/2003  Kirmuss
2003/0081122 A1    5/2003  Kirmuss
2003/0081127 A1    5/2003  Kirmuss
2003/0081128 A1    5/2003  Kirmuss
2003/0081934 A1    5/2003  Kirmuss
2003/0081935 A1    5/2003  Kirmuss
2003/0095043 A1 *  5/2003  Butzer et al. ............ 340/539.13
2003/0125853 A1    7/2003  Takagi et al.
2003/0214405 A1    11/2003 Lerg et al.
2003/0222981 A1    12/2003 Kisak et al.
2003/0225516 A1    12/2003 DeKock et al.
2004/0022416 A1    2/2004  Lemelson et al.
2004/0222904 A1 *  11/2004 Ciolli ............ 340/937
2006/0261979 A1 *  11/2006 Draaijer et al. ............ 340/937

OTHER PUBLICATIONS

Veeraraghavan, Masoud, Papanikolopoulos, "Vision-based Monitoring of Intersections", Artificial Intelligence, Vision, and Robotics Lab, Department of Computer Science and Engineering, University of Minnesota, in Proc. IEEE 5th International Conference on Intelligent Transportation Systems, pp. 7-12, Singapore, Sep. 2002.*
Whitney, et al. (TASC, Inc., Reading, MA); "AutoAlert: Automated Acoustic Detection of Incidents", IDEA Project Final Report, Contract ITS-19, IDEA Program, Transportation . . . .
Navaneethakrishnan; "Automated Accident Detection in Intersections Via Digital Audio Signal Processing" (Thesis, Mississippi State University, Dec. 2003).
ndsu.nodak.edu/ndsu/ugpti/MPC Pubs/html/MPC01-122.
stat-www.berkeley.edu/users/kwon/papers/inc detection.pdf.
www-users.cs.umn.edu/~masoud/publications/harini-intersection-itsc-2002.pdf.
Skabardonis; "The I-80 Experiment: Real-Time Algorithms for Travel Time Estimates and Incident Detection."
ieeexplore.ieee.org/xpl/tocresult.jsp?isNumber=14013.
Karim, et al., "Fast Automatic Incident Detection on Urban and Rural Freeways Using the Wavelet Energy Algorithm", Journal of Transportation Engineering, ASCE, vol. 129, No. 1.
Karim, et al., "Comparison of Fuzzy Wavelet Radial Basis Function Neural Network Freeway Incident Detection Model with California Algorithm", Journal of Transportation . . . .
Chien-Hua Hsiao, et al., "Application of Fuzzy Logic and Neural Networks to Automatically Detect Freeway Traffic Incidents", Journal of Transportation Engineering . . . .
Stubbs, et al., "A real-time collision warning system for intersections", in Proc. ITS America 13th Annual Meeting, Minneapolis, MN, May 2003.
Martin, et al., "Incident Detection Algorithm Evaluation" (University of Utah, Prepared for Utah Dept. of Transportation), Mar. 2001.
Dastidar, et al., "Wavelet-Clustering-Neural Network Model for Freeway Incident Detection", Computer-Aided Civil and Infrastructure Engineering 18 (5).

* cited by examiner
`
`
`
`
`
`
Fig. 3 (Sheet 3 of 6):
Control Unit Active and Receiving Incoming Audio and Video Signals (51)
Store At Least Video Data and Other Desired Accident-Related Data in a Circular Buffer (52), while comparing incoming audio signals for a match to the stored acoustic signature of a qualifying sound to determine if a qualifying sound is present (No: keep buffering; Yes: proceed)
Stop Overwriting and Preserve Data in Circular Buffer (54)
Continue to Save at Least Subsequent Video Data and Other Desired Accident-Related Data (55)
Initiate Contact With Monitoring Center (75)
Contact Established? (76) (No: retry; Yes: proceed)
Transmit Location Data and At Least One Image to Monitoring Center (77)
Continue Saving Desired Accident-Related Data (78)
Time Limit Passed, Memory Limit Been Reached, or Termination Signal Received? (79) (No: continue saving; Yes: proceed)
Stop Saving Accident-Related Data (80)
Transmit Accident-Related Data (81)
Successful Transmission Verified? (82) (No: retransmit; Yes: proceed)
End Transmission (85)
Flush and Reuse Allocated Memory (90)
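For illustration only, the pre-event buffering behavior of Fig. 3 (steps 52, 54, 55 and 90) might be sketched as follows. This is a minimal Python sketch, not the disclosed implementation; the class name, frame rate and buffer length are hypothetical choices.

    from collections import deque

    class PreEventBuffer:
        """Sketch of the Fig. 3 circular buffer: frames and other
        accident-related data are overwritten continuously until a
        qualifying sound is detected, after which buffered data are
        preserved until transmission is verified."""

        def __init__(self, seconds_before=30, fps=15):
            # Hypothetical sizing: keep the last seconds_before seconds.
            self.ring = deque(maxlen=seconds_before * fps)
            self.preserved = None  # becomes a plain list once overwriting stops

        def push(self, frame):
            if self.preserved is not None:
                # Step 55: keep saving subsequent data after the trigger.
                self.preserved.append(frame)
            else:
                # Step 52: circular buffering; oldest frames are overwritten.
                self.ring.append(frame)

        def stop_overwriting(self):
            # Step 54: freeze the pre-event history so it cannot be lost.
            if self.preserved is None:
                self.preserved = list(self.ring)

        def flush(self):
            # Step 90: reuse the memory only after transmission is verified.
            self.preserved = None
            self.ring.clear()

In this sketch, stop_overwriting() would be called when the acoustic match succeeds, and flush() only after the transmission of step 81 has been verified at step 82.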
`
`
`
Fig. 4 (Sheet 4 of 6):
START
Receive Audio and Video Signals, Time and Location Data (50)
Compare Audio Signals for Match to Stored Acoustic Signature (51) (No: keep monitoring; Yes: branch on the type of match)
Preliminary Sound (54):
    Commence Saving Accident-Related Data (60)
    Continue Saving and Analyze Subsequent Audio Signals for Match to Qualifying Sound (61)
    Qualifying Sound Detected Within First Predetermined Time? (No: Transmit Stored Data to Remote Location (Optional) (69), then return to monitoring; Yes: proceed as for a qualifying sound)
Qualifying Sound (55):
    Commence/Continue Saving of Accident-Related Data (70)
    Initiate Contact With Monitoring Center (75)
    Contact Established? (76) (No: retry; Yes: proceed)
    Transmit Location Data and At Least One Image to Monitoring Center (77)
    Continue Saving of Accident-Related Data (78)
    Second Predetermined Time Passed, Storage Capacity Been Reached, or Termination Signal Received? (79) (No: continue saving; Yes: proceed)
    Stop Saving Accident-Related Data (80)
    Transmit or Upload Accident-Related Data (81)
    Successful Transmission or Upload Verified? (82) (No: retry; Yes: proceed)
    Flush Buffer (90)
    Reinitialize (Optional) (99)
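The two-stage trigger of Fig. 4, in which a preliminary sound starts provisional recording and a qualifying sound must follow within a first predetermined time, could be sketched roughly as follows. The helper callables (classify, save, contact_center) and the numeric time limits are assumptions for illustration only, not part of the disclosure.

    import time

    # Hypothetical values; the patent only calls these the "first" and
    # "second" predetermined times.
    FIRST_PREDETERMINED_TIME = 10.0    # seconds to confirm a preliminary sound

    def monitor(audio_blocks, classify, save, contact_center):
        """Sketch of the Fig. 4 trigger logic. audio_blocks yields audio
        blocks; classify labels a block as 'qualifying', 'preliminary' or
        None; save stores accident-related data; contact_center notifies
        the monitoring center."""
        pending_since = None  # time of an unconfirmed preliminary sound
        for block in audio_blocks:
            label = classify(block)          # step 51: compare to signatures
            if label == 'qualifying':
                save(block)                  # step 70
                contact_center()             # steps 75-77
                pending_since = None
            elif label == 'preliminary':
                pending_since = pending_since or time.monotonic()
                save(block)                  # step 60: start saving provisionally
            elif pending_since is not None:
                save(block)                  # step 61: keep saving while waiting
                if time.monotonic() - pending_since > FIRST_PREDETERMINED_TIME:
                    # No qualifying sound arrived in time; the optional upload
                    # of step 69 is omitted here, and monitoring resumes.
                    pending_since = None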
`
`
`
Fig. 5 (Sheet 5 of 6)
`
`
`
Fig. 6 (Sheet 6 of 6):
Detect acoustic waves at the location (301)
Analyze conditions at location (302)
Determine a likely occurrence or an imminent occurrence of a vehicular accident or other incident (303)
Optionally determine compliance with traffic control regulations (304)
Capture initial images of the location along with audio, timecode, state of traffic signal, GPS code, optionally polling a plurality of cameras (305)
Store data, starting no later than the determination of likely occurrence (306)
Optionally communicate location and at least one image (307)
Continue to capture a stream of images of the location along with audio, timecode, state of traffic signal, GPS code until a cease condition (308)
Optionally use sensor data to model location (309)
Optionally communicate to or from traffic signal control device (310)
Establish communication pathway and communicate the stored images and incident-related data to a remote location (311)
Verification of successful communication? (312) (No: retry and/or try alternate communication pathway (313); Yes: proceed)
Preserve stored information at least until verification of successful communication, and then delete (314)
Receive and display information from a plurality of locations at a remote monitoring center on a map (315)
Route information to an available live agent at a remote monitoring center; coordinate multiple communications (316)
Preserve information in a forensically reliable record (317)
Communicate from remote monitoring center to location with audio communications, to control and program traffic signal control device, control and program the components of the system, and to activate visual alert (318)
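Steps 311 through 314 of Fig. 6 (transmit, verify, retry or fall back to an alternate pathway, and delete only after verification) might be illustrated by a sketch along these lines; the pathway callables and the attempt limit are hypothetical and not part of the disclosure.

    def deliver(record, pathways, max_attempts=3):
        """Sketch of Fig. 6 steps 311-314: try each communication pathway
        in turn and preserve the record until delivery is positively
        acknowledged. pathways is a list of callables that return True on
        a verified transfer."""
        for attempt in range(max_attempts):
            for send in pathways:              # step 313: alternate pathways
                try:
                    if send(record):           # steps 311-312: send and verify
                        record.clear()         # step 314: delete only after success
                        return True
                except IOError:
                    continue                   # a failed link counts as a failed try
        return False                           # data stays preserved for a later retry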
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`ADVANCED AUTOMOBILE ACCIDENT
`DETECTION, DATA RECORDATION AND
`REPORTING SYSTEM
`
`CROSS REFERENCE TO RELATED
`APPLICATION
`
`The present application claims benefit of priority from
`U.S. Provisional Patent Application 60/522,749 filed Nov. 3,
`2004.
`
`
`BACKGROUND OF THE INVENTION
`
The invention generally relates to an automobile accident detection and data recordation and reporting system, and in particular to a system which detects accidents based on a set of characteristic sounds or other cues.

Traffic accidents cause significant costs in terms of direct loss, consequential loss, and societal loss due to obstruction of the roadway in the aftermath of an accident. Another issue is the allocation of direct costs; for example, when more than one vehicle is involved, the vehicle at fault is generally held liable for the damages.

It is possible to monitor locations that are likely places for accidents to occur; however, without intelligence, this process may be inefficient and unproductive. Likewise, without immediate and efficient communication of the information obtained, the benefits of the monitoring are quite limited.

Since cellular telephone technology has become so widely adopted, the most common means by which motor vehicle accidents are reported to agencies in the U.S. is through cellular telephones. However, this is not always reliable or immediate if the victims are unable to use their cellular phones or if there are no witnesses with cellular phones to report the accident, and it fails to produce an actual record of the accident which can later be used as evidence.

Automobile accident detection systems are common in the art. Upon the occurrence of an automobile accident, it may be desirable to obtain video images and sounds of the accident and to record the time of the accident and the status of the traffic lights at the time the accident occurred. This information can then be sent to a remote location where emergency crews can be dispatched and the information further examined and forwarded to authorities in order to determine fault and liability.

A number of prior art techniques are available for predicting the occurrence of an accident. Some of these require an extended period of time for an automated system to analyze the data, and thus any report generated is substantially delayed. In others, the accuracy of the system depends on environmental conditions, such as lighting or time of day. Therefore, in order to provide an immediate and reliable response to a predicted occurrence of an accident, such techniques are suboptimal.

For example, Japanese Patent Application No. 8-162911 entitled "Motor Vehicle Accident Monitoring Device" ("the Japanese reference"), expressly incorporated herein by reference in its entirety, discloses a system for monitoring traffic accidents including a plurality of microphones and video cameras disposed at an intersection. Collision sounds are distinguished from among the typical sounds at an intersection. The source of the collision sounds is determined by comparing the time differences of the sounds received by each of the microphones. Image data from the cameras is recorded upon the occurrence of the collision. However, the Japanese reference discloses a system that is constantly photographing the accident scene, thereby wasting video resources.

U.S. Pat. No. 6,141,611 issued to Mackey et al. entitled "Mobile Vehicle Accident Data System" ("the Mackey reference"), expressly incorporated herein by reference in its entirety, discloses an on-board vehicle accident detection system including one or more video cameras that continuously record events occurring at a given scene. Camera images of the scene are digitally stored after compression. An accident detector on board the vehicle determines if an accident has occurred, and if so, the stored images are transmitted to a remote site for observation. However, the Mackey reference places the video cameras on board the vehicles themselves, increasing the likelihood that the cameras would become damaged during an accident and thereby rendering them impractical for accident-recording systems. Further, the on-board cameras' image-capturing ability is severely limited by the constraints of the vehicles themselves. Additionally, the Mackey reference discloses a system that determines if an accident is present by the sudden acceleration or deceleration of the vehicle, without the use of fixed microphones. Because the invention claimed by Mackey is on board the vehicle, it does nothing to solve the problem of recording an accident involving vehicles which are not so equipped. Equipping every vehicle with this system is impractical and therefore not feasible.

U.S. Pat. No. 6,111,523 issued to Mee entitled "Method and Apparatus for Photographing Traffic in an Intersection", expressly incorporated herein by reference in its entirety, describes a system for taking photographs of vehicles at a traffic intersection by triggering a video camera to capture images, wherein the triggering mechanism of the video camera is based upon certain vehicle parameters, including the speed of the vehicle prior to its entrance into the traffic intersection.

U.S. Pat. No. 6,088,635 issued to Cox et al. entitled "Railroad Vehicle Accident Video Recorder", expressly incorporated herein by reference in its entirety, discloses a system for monitoring the status of a railroad vehicle prior to a potential accident. The system employs a video camera mounted within the railroad car that continuously views the status of a given scene, and continuously stores the images of the scene. Like Mackey, it is impractical and therefore not feasible to equip every vehicle with this system.

U.S. Pat. No. 5,717,391 issued to Rodriguez entitled "Traffic Event Recording Method and Apparatus", expressly incorporated herein by reference in its entirety, describes a system for determining the condition of a traffic light and includes an audio sensor which monitors sound at all times. Sound detected above a certain decibel level triggers the recordation of sounds, the time of day and the status of the traffic lights. However, Rodriguez fails to disclose video cameras or any image-capturing means.

U.S. Pat. No. 5,677,684 issued to McArthur entitled "Emergency Vehicle Sound-Actuated Traffic Controller", expressly incorporated herein by reference in its entirety, describes a traffic controller system utilizing sound detection means connected to a control box which contains a switching mechanism that, in a first orientation, allows normal operation of traffic light control and, in a second orientation, upon the detection of an approaching siren, sets all traffic signals at an intersection to red to prohibit the entrance into the intersection of additional vehicles.

U.S. Pat. No. 5,539,398 issued to Hall et al. entitled "GPS-based Traffic Control Preemption System", expressly incorporated herein by reference in its entirety, discloses a system for determining if a vehicle issuing a preemption
`request to an emergency vehicle or police car is within an
`allowed approach of a traffic intersection, utilizing a GPS
`system.
U.S. Pat. No. 6,690,294 issued to Zierden entitled "System and method for detecting and identifying traffic law violators and issuing citations", expressly incorporated herein by reference, discloses a mobile or stationary traffic monitoring system for detecting violations of speed limits or other traffic laws by vehicle operators and issuing citations to an operator and/or vehicle owner suspected of a violation, using a digital camera to capture images of the operator and/or the vehicle, transmitting the captured images and other relevant data to an analysis center where the images and data are analyzed to determine whether to issue a citation and, if so, to issue the citation or take other appropriate law enforcement measures. The system captures images of a vehicle and/or vehicle operator suspected of a traffic violation, determines the time and geographic location of the suspected violation, transmits the images and other data to an analysis center, issues citations to violators and derives revenue therefrom.
U.S. Pat. No. 5,938,717 to Dunne et al., expressly incorporated herein by reference, discloses a traffic control system that automatically captures an image of a vehicle and speed information associated with the vehicle and stores the image and information on a hard disk drive. The system uses a laser gun to determine whether a vehicle is speeding. The hard drive is later connected to a base station computer which is, in turn, connected to a LAN at which the information from the hard drive is compared with databases containing data such as vehicle registration information and the like. The system automatically prints a speeding citation and an envelope for mailing to the registered owner of the vehicle.
U.S. Pat. No. 5,734,337 to Kupersmit, expressly incorporated herein by reference, discloses a stationary traffic control method and system for determining the speed of a vehicle by generating two images of a moving vehicle and calculating the vehicle speed by determining the distance traveled by the vehicle and the time interval between the two images. The system is capable of automatically looking up vehicle ownership information and issuing citations to the owner of a vehicle determined to be speeding.
U.S. Pat. No. 5,948,038 to Daly et al., expressly incorporated herein by reference, discloses a method for processing traffic violation citations. The method includes the steps of determining whether a vehicle is violating a traffic law, recording an image of the vehicle committing the violation, recording deployment data corresponding to the violation, matching the vehicle information with vehicle registration information to identify the owner, and providing a traffic violation citation with an image of the vehicle and the identity of the registered owner of the vehicle.
The I-95 Corridor Coalition, Surveillance Requirements/Technology, Ch. 4, Technology Assessment, expressly incorporated herein by reference, describes a number of different technologies suitable for incident detection. For example, AutoAlert: Automated Acoustic Detection of Traffic Incidents was an IVHS-IDEA project which uses military acoustic sensor technologies, e.g., AT&T IVHS NET-2000™. The AutoAlert system monitors background traffic noise and compares it with the acoustic signatures of previously recorded accidents and incidents for detection. See David A. Whitney and Joseph J. Pisano (TASC, Inc., Reading, Mass.), "AutoAlert: Automated Acoustic Detection of Incidents", IDEA Project Final Report, Contract ITS-19, IDEA Program, Transportation Research Board, National Research Council, Dec. 26, 1995, expressly incorporated herein by reference. The AutoAlert system employs algorithms which provide rapid incident detection and high reliability by applying statistical models, including Hidden Markov Models (HMM) and Canonical Variates Analysis (CVA). These are used to analyze both short-term and time-varying signals that characterize incidents.
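By way of a simplified illustration only (and not the AutoAlert algorithm itself, which relies on HMM and CVA models), comparing live traffic audio against stored incident signatures might look roughly like the following sketch; the spectral representation, cosine-similarity test and threshold are assumptions.

    import numpy as np

    def spectral_signature(block, n_fft=1024):
        """Unit-normalized average magnitude spectrum of an audio block
        (1-D float array)."""
        spec = np.abs(np.fft.rfft(block, n=n_fft))
        return spec / (np.linalg.norm(spec) + 1e-12)

    def matches_signature(block, stored_signatures, threshold=0.9):
        """Return True if the block's spectrum correlates strongly with any
        stored incident signature. stored_signatures is assumed to be a list
        of unit-normalized spectra of the same length; this cosine test is an
        illustrative stand-in for the statistical models used by AutoAlert."""
        live = spectral_signature(block)
        return any(float(np.dot(live, sig)) > threshold for sig in stored_signatures)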
The Smart Call Box project (in San Diego, Calif.) evaluated the use of the existing motorist aid call box system for other traffic management strategies. The system tests the conversion of existing cellular-based call boxes to multifunctional IVHS system components, to transmit the data necessary for traffic monitoring, incident detection, hazardous weather detection, changeable message sign control, and CCTV control.

In 1992 the French Toll Motorway Companies Union initiated testing of an Automatic Incident Detection (AID) technique proposed by the French National Institute for Research on Transportation and Security (INRETS). The technique consists of utilizing computers to analyze video images received by television cameras placed along the roadway. A "mask" frames the significant part of the image, which typically is a three- or four-lane roadway and the emergency shoulder. The computer processes five pictures a second, compares them two at a time, and analyzes them looking for points that have moved between two successive pictures. These points are treated as objects moving along the roadway. If a moving object stops and remains stopped within the mask for over 15 seconds, the computer considers this an anomaly and sets off an alarm. In 1993, as part of the European MELYSSA project, the AREA Company conducted a full-scale test over an urban section of the A43 motorway located east of Lyons. The roadway was equipped with 16 cameras on 10 meter masts or bridges, with focal distances varying from 16 to 100 km, and fields of detection oscillating between 150 and 600 meters.
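The stopped-object rule described above (five pictures per second, alarm when an object stays stopped inside the mask for more than 15 seconds) might be sketched as follows; the per-object tracking is simplified here to a whole-mask motion test, and the difference threshold is a hypothetical value.

    import numpy as np

    FPS = 5                    # the system processes five pictures a second
    STOP_LIMIT_S = 15          # alarm after 15 seconds without motion
    MOTION_THRESHOLD = 12.0    # hypothetical mean-absolute-difference threshold

    def detect_stopped_object(frames, mask):
        """Sketch of the INRETS stopped-object rule. frames yields grayscale
        images as 2-D uint8 arrays; mask is a boolean array selecting the
        monitored lanes and shoulder. Returns True when something that had
        been moving stays still inside the mask for more than 15 seconds."""
        prev = None
        still_frames = 0
        seen_motion = False
        for frame in frames:
            roi = frame.astype(np.float32)[mask]
            if prev is not None:
                moved = np.mean(np.abs(roi - prev)) > MOTION_THRESHOLD
                if moved:
                    seen_motion = True
                    still_frames = 0
                elif seen_motion:
                    still_frames += 1
                    if still_frames > STOP_LIMIT_S * FPS:
                        return True    # anomaly: stopped object inside the mask
            prev = roi
        return False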
Image Processing and Automatic Computer Traffic Surveillance (IMPACTS) is a computer system for automatic traffic surveillance and incident detection using output from CCTV cameras. The algorithm utilized by the IMPACTS system takes a different approach from most other image processing techniques that have been applied to traffic monitoring. Road space and how it is being utilized by traffic is considered, instead of identifying individual vehicles. This leads to a qualitative description of how the road, within a CCTV image, is occupied in terms of regions of empty road or moving or stationary traffic.

The Paris London Evaluation of Integrated ATT and DRIVE Experimental Systems (PLEIADES) is part of the DRIVE Research Programme. The Automatic Traffic Surveillance (ATS) system has been installed into Maidstone Traffic Control Center and provides information on four separate CCTV images. This information will be used both in the Control Center and passed on to the Traffic Information Center via the PLEIADES Information Controller (PIC) and data communications link. Instead of remote PCs there is a duplicate display of the Engineer's workstation that is shown in the Control Office on a single computer monitor. The ATS system communicates data at regular intervals to the PIC. Any alarms that get raised or cleared during normal processing are communicated to the PIC as they occur. The PIC uses the information received to display a concise picture of a variety of information about the highway region. The ATS system uses video from CCTV cameras taken from the existing Control Office Camera Multiplex matrix, while not interfering with its normal operation. When a camera is taken under manual control, the processing of the data for that image is suspended until the camera is returned to its preset position.
Navaneethakrishnan Balraj, "Automated Accident Detection in Intersections Via Digital Audio Signal Processing" (Thesis, Mississippi State University, December 2003), expressly incorporated herein by reference, discusses, inter alia, feature extraction from audio signals for accident detection. The basic idea of feature extraction is to represent the important and unique characteristics of each signal in the form of a feature vector, which can be further classified as crash or non-crash using a statistical classifier or a neural network. Others have tried using wavelet and cepstral transforms to extract features from audio signals such as speech signals. S. Kadambe, G. F. Boudreaux-Bartels, "Application of the wavelet transform for pitch detection of speech signals," IEEE Trans. on Information Theory, vol. 38, no. 2, part 2, pp. 917-924, 1992; C. Harlow and Y. Wang, "Automated Accident Detection," Proc. Transportation Research Board 80th Annual Meeting, pp. 90-93, 2001. Kadambe et al. developed a pitch detector using a wavelet transform. One of the main properties of the dyadic wavelet transform is that it is linear and shift-invariant. Another important property of the dyadic wavelet transform is that its coefficients have local maxima at a particular time when the signal has sharp changes or discontinuities. These two important properties of the dyadic wavelet transform help to extract the unique features of a particular audio signal. Kadambe et al. made a comparison of the results obtained from using dyadic wavelet transforms, autocorrelation, and cepstral transforms. The investigation showed that the dyadic wavelet transform pitch detector gave 100% accurate results. One reason for the difference in the results was that the other two methods assume stationarity within the signal and measure the average period, whereas the dyadic wavelet transform takes into account the non-stationarities in the signal. Hence, the dyadic wavelet transform method would be the best to extract features when the signals are non-stationary. Harlow et al. developed an algorithm to detect traffic accidents at intersections, using an audio signal as the input to the system. The algorithm uses the Real Cepstral Transform (RCT) as a method to extract features. The signals recorded at intersections include brake, pile drive, construction and normal traffic sounds. These signals are segmented into three-second sections. Each of these three-second segmented signals is analyzed using RCT. RCT is a method where the signal is windowed for every 100 msec using a Hamming window with an overlap of 50 msec. Thus, for a given three-second signal, there will be almost 60 segments of 100 msec duration each. RCT is applied to each of these segments, and the first 12 coefficients are used as the features. The features obtained using the RCT are then classified as "crash" or "non-crash" using a neural network.
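The RCT procedure described above (100 msec Hamming windows with 50 msec overlap, keeping the first 12 cepstral coefficients per window) can be written out as a short sketch; the sample rate is an assumption, not taken from the text.

    import numpy as np

    def rct_features(signal, fs=8000, win_s=0.100, hop_s=0.050, n_coeffs=12):
        """Real Cepstral Transform features as described above: 100 ms
        Hamming windows with 50 ms overlap, keeping the first 12 cepstral
        coefficients per window. fs (the sample rate) is an assumption."""
        win = int(win_s * fs)
        hop = int(hop_s * fs)
        window = np.hamming(win)
        feats = []
        for start in range(0, len(signal) - win + 1, hop):
            frame = signal[start:start + win] * window
            spectrum = np.abs(np.fft.rfft(frame))
            # Real cepstrum: inverse FFT of the log magnitude spectrum.
            cepstrum = np.fft.irfft(np.log(spectrum + 1e-12))
            feats.append(cepstrum[:n_coeffs])
        return np.array(feats)   # shape: (num_windows, 12), ready for a classifier

For a three-second signal this yields roughly 60 windows, matching the segment count mentioned above; each 12-coefficient row would then be labeled "crash" or "non-crash" by the classifier.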
Balraj's experimental results showed that among the three different statistical classifiers investigated, maximum likelihood and nearest neighbor performed best, although this had high computational costs. Haar, Daubechies, and Coiflets provided the best classification accuracies for a two-class system. Among the five different feature extraction methods analyzed on the basis of overall accuracy, RCT performed best. The second-generation wavelet method, the lifting scheme, was also investigated; it proved computationally efficient when compared to the DWT. Thus, it was concluded that the optimum design for an automated system would be a wavelet-based feature extractor with a maximum likelihood classifier, and that the DWT or the lifting scheme would be preferred for a real-time system.
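A wavelet-based feature extractor of the kind favored in this conclusion might be sketched as follows, using the PyWavelets package; the choice of wavelet, decomposition level and log-energy summary are illustrative assumptions, not Balraj's exact design.

    import numpy as np
    import pywt  # PyWavelets

    def dwt_energy_features(segment, wavelet='db4', level=4):
        """Decompose an audio segment with a discrete wavelet transform and
        summarize each sub-band by its log energy, producing a small feature
        vector suitable for a maximum-likelihood classifier."""
        coeffs = pywt.wavedec(segment, wavelet, level=level)
        return np.array([np.log(np.sum(c ** 2) + 1e-12) for c in coeffs])

These per-band energies could then be fed to a maximum-likelihood (for example, Gaussian) classifier trained on crash and non-crash examples.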
`
In any and/or all of the embodiments described herein, the systems, equipment systems, subsystems, devices, components, and/or appliances, of and/or utilized in any of the respective embodiments, can include and/or can utilize the teachings and/or the subject matter of the following U.S. Patents, the subject matter and teachings of which are hereby incorporated by reference herein and form a part of the disclosure of this patent application: U.S. Pat. No. 6,009,356 (Monroe, Dec. 28, 1999); U.S. Pat. No. 5,890,079 (Beemer, II, et al., Sep. 7, 1999); U.S. Pat. No. 5,845,240 (Fielder, Dec. 1, 1998); U.S. Pat. No. 5,948,026 (Levine, Mar. 30, 1999); U.S. Pat. No. 5,446,659 (Yamawaki, Aug. 29, 1995); U.S. Pat. No. 5,056,056