US 20070152804A1

(19) United States
(12) Patent Application Publication        (10) Pub. No.: US 2007/0152804 A1
     Breed et al.                          (43) Pub. Date: Jul. 5, 2007
`
`{541
`
`(75)
`
`ACCIDENT AVOTDANCE SYSTEMS AND
`METHODS
`
`Inventors: David 5. Breed. Miami Beach. lI'l.
`{US}: Wilbur E. DuVall. Reeds Spring.
`MD (US): Wendell C. Johnson.
`Kanoohe. I-ll (US)
`
Correspondence Address:
BRIAN ROFFE, ESQ
11 SUNRISE PLAZA, SUITE 303
VALLEY STREAM, NY 11580-6111 (US)
`
(73) Assignee: INTELLIGENT TECHNOLOGIES
     INTERNATIONAL, INC., Denville, NJ
     (US)
`
(21) Appl. No.: 11/681,817

(22) Filed: Mar. 5, 2007
`
(63) Related U.S. Application Data

     Continuation-in-part of application No. 11/034,325,
     filed on Jan. 12, 2005, now Pat. No. 7,202,776, which
     is a continuation-in-part of application No. 10/822,445,
     filed on Apr. 12, 2004, now Pat. No. 7,085,637, which
     is a continuation-in-part of application No. 10/118,858,
     filed on Apr. 9, 2002, now Pat. No. 6,720,920, which
     is a continuation-in-part of application No. 09/177,041,
     filed on Oct. 22, 1998, now Pat. No. 6,370,475.
     Said application No. 10/118,858 is a continuation-in-
     part of application No. 09/679,317, filed on Oct. 4,
     2000, now Pat. No. 6,405,132, which is a continuation-
     in-part of application No. 09/523,559, filed on Mar. 10,
     2000, now abandoned.
     Said application No. 10/118,858 is a continuation-in-
     part of application No. 09/909,466, filed on Jul. 19,
     2001, now Pat. No. 6,526,352.
     Said application No. 10/822,445 is a continuation-in-
     part of application No. 10/216,633, filed on Aug. 9,
     2002, now Pat. No. 6,768,944.
     Continuation-in-part of application No. 11/461,619,
     filed on Aug. 1, 2006, which is a continuation-in-part
     of application No. 10/822,445, filed on Apr. 12, 2004,
     now Pat. No. 7,085,637, and which is a continuation-
     in-part of application No. 10/028,386, filed on Dec. 21,
     2001.
     Continuation-in-part of application No. 11/464,385,
     filed on Aug. 14, 2006, which is a continuation-in-part
     of application No. 11/028,386, filed on Jan. 3, 2005,
     now Pat. No. 7,110,880.

(60) Provisional application No. 60/062,729, filed on Oct.
     22, 1997. Provisional application No. 60/123,882,
     filed on Mar. 11, 1999. Provisional application No.
     60/711,452, filed on Aug. 25, 2005. Provisional appli-
     cation No. 60/711,352, filed on Aug. 25, 2005.
`
Publication Classification

(51) Int. Cl.
     B60Q 1/00    (2006.01)
     G08G 1/16    (2006.01)

(52) U.S. Cl. ................ 340/435; 701/301

(57) ABSTRACT
`
Accident avoidance system for a host vehicle includes a
global positioning system residing on the host vehicle for
determining the host vehicle's location as the host vehicle
travels based on signals received from one or more satellites,
a map database having digital maps corresponding to an area
including the location of the host vehicle as determined by
the global positioning system, a vehicle-to-vehicle commu-
nication system residing on the host vehicle operative for
receiving signals including location information acquired by
global positioning systems residing on other vehicles
directly from the other vehicles indicating the locations of
the other vehicles, and a navigation system including a
display residing on the host vehicle for displaying images to
an occupant of the host vehicle showing the digital maps and
indications of the locations of the host vehicle and the other
vehicles on the digital maps. The navigation system also
updates the images shown on the navigation system display
to reflect changes in the locations of the host vehicle and the
other vehicles.
`
`OWNER EX. 2024, page 1
`
`
`

`

Patent Application Publication    Jul. 5, 2007  Sheet 1 of 20    US 2007/0152804 A1

[Drawing, labeled "Prior Art": vehicles and a BASE STATION.]
`
`

`

Patent Application Publication    Jul. 5, 2007  Sheet 2 of 20    US 2007/0152804 A1

[Drawings, labeled "Prior Art": two BASE STATION blocks, a GPS & DGPS PROCESSING SYSTEM, and a DRIVER WARNING SYSTEM.]
`
`

`

Patent Application Publication    Jul. 5, 2007  Sheet 3 of 20    US 2007/0152804 A1

[Fig. 5: block diagram. Peripheral blocks (GPS Receiver, DGPS Receiver, Inter-Vehicle Communication, Infrastructure Communication, Cameras, Radar, Laser Radar, Warning Light/Sound, Map Database, Brake Servo, Steering Servo, Throttle Servo, Velocity Sensor, Accelerometers, Gyroscopes, Display, Memory, MIR/RFID, Weather Sensors, Vehicle Diagnostics, Stoplight Sensor, Accurate Clock, Controls) connect to a Central Processor & Circuits block performing: GPS Ranging; DGPS Corrections; Image Analysis; Radar Analysis; Laser Radar Scanning Control and Analysis of Received Information; Warning Message Generation; Map Communication; Vehicle Control; Inertial Navigation System Calibrations and Control; Display Control; Precise Positioning Calculations; Road Condition Predictions; and Other Functions.]
`
`

`

Patent Application Publication    Jul. 5, 2007  Sheet 4 of 20    US 2007/0152804 A1

[Fig. 5A and Fig. 6: drawings; Fig. 6 includes a Bus Interface block.]
`
`

`

Patent Application Publication    Jul. 5, 2007  Sheet 5 of 20    US 2007/0152804 A1

[Drawing.]
`
`

`

Patent Application Publication    Jul. 5, 2007  Sheet 6 of 20    US 2007/0152804 A1

[Drawing: neural network diagram with numbered INPUT nodes 1 through 10 and LAYER 1.]
`
`

`

Patent Application Publication    Jul. 5, 2007  Sheet 7 of 20    US 2007/0152804 A1

[Drawing.]
`
`

`

Patent Application Publication    Jul. 5, 2007  Sheet 8 of 20    US 2007/0152804 A1

[Fig. 12a: flowchart. 130: Determine Absolute Position of Vehicle. 132: Data on Edges of Roadways, Yellow Lines and Stoplights. 136: Compare Absolute Position to Edges of Roadway. Decision: Is Absolute Position of Vehicle Approaching Close to Edge of Roadway? If yes, 140: Sound Alarm and/or Guide Vehicle to Shoulder.]

[Fig. 12b: flowchart. 130: Determine Absolute Position of Vehicle. 132: Data on Edges of Roadways, Yellow Lines and Stoplights. Compare Absolute Position to Position of Yellow Lines. Decision: Is Absolute Position of Vehicle Approaching Close to a Yellow Line? If yes, 140: Sound Alarm and/or Guide Vehicle away from Yellow Line or to Shoulder.]
`
`

`

Patent Application Publication    Jul. 5, 2007  Sheet 9 of 20    US 2007/0152804 A1

[Fig. 12c: flowchart. 130: Determine Absolute Position of Vehicle. 132: Data on Edges of Roadways, Yellow Lines and Stoplights. 150: Compare Absolute Position to Edges of Roadway and Position of Stoplight. 154: Determine Color of Stoplight. Decision: Is Absolute Position of Vehicle Approaching Close to a Red Stoplight? If yes, 140: Sound Alarm and/or Guide Vehicle to Shoulder.]

[Fig. 13: drawing.]
`
`

`

Patent Application Publication    Jul. 5, 2007  Sheet 10 of 20    US 2007/0152804 A1

[Fig. 14: drawing.]
`
`

`

Patent Application Publication    Jul. 5, 2007  Sheet 11 of 20    US 2007/0152804 A1

[Fig. 15: drawing.]
`
`

`

Patent Application Publication    Jul. 5, 2007  Sheet 12 of 20    US 2007/0152804 A1

[Drawing.]
`
`

`

Patent Application Publication    Jul. 5, 2007  Sheet 13 of 20    US 2007/0152804 A1

[Drawing: Left "EYE" and Right "EYE".]
`
`

`

Patent Application Publication    Jul. 5, 2007  Sheet 14 of 20    US 2007/0152804 A1

[Fig. 17A: drawing with a Linear Array camera and a Data Acquisition Module.]
`
`

`

Patent Application Publication    Jul. 5, 2007  Sheet 15 of 20    US 2007/0152804 A1

[Drawings including Fig. 20: blocks labeled Data on Transmitter, Transmission, Transmitter Data, Processor, and Positioning Determining Device.]
`
`

`

Patent Application Publication    Jul. 5, 2007  Sheet 16 of 20    US 2007/0152804 A1

[Drawing.]
`
`

`

Patent Application Publication    Jul. 5, 2007  Sheet 17 of 20    US 2007/0152804 A1

[Drawing.]
`
`

`

Patent Application Publication    Jul. 5, 2007  Sheet 18 of 20    US 2007/0152804 A1

[Drawing.]
`
`

`

Patent Application Publication    Jul. 5, 2007  Sheet 19 of 20    US 2007/0152804 A1

[Fig. 25: flowchart. Direct Laser Beam into Environment Around Vehicle. Determine Location of Vehicle on Map. Define Scanning Field of Laser Beam Based on Vehicle's Position and Map. Receive Reflections of Laser Beam Indicative of Presence of Object in Path of Laser Beam. Range Gate Reflections to Narrow Distance Range from Which Reflections Are Processed. Identify/Ascertain the Identity of Objects in Range of Reflections. 112: Assess Potential for Collision/Consequences of Potential Collision with Object. 114: Effect Countermeasure if Collision Is Likely.]
`
`

`

Patent Application Publication    Jul. 5, 2007  Sheet 20 of 20    US 2007/0152804 A1

[Flowchart: Generate Information from Source(s); Direct Source to Vehicle Transmission; Direct Information to Vehicle Using Network; Gather Information at Data Storage Facility. Blocks: Providers; Central Data Storage and Processing Facility; Traffic Control Devices; Emergency Response Facility; Internet Content.]
`
`

`

US 2007/0152804 A1                                                Jul. 5, 2007
`
ACCIDENT AVOIDANCE SYSTEMS AND METHODS
`
[0001] This application is:

[0002] 1. a continuation-in-part (CIP) of U.S. patent application Ser. No. 11/034,325 filed Jan. 12, 2005, which is a CIP of U.S. patent application Ser. No. 10/822,445 filed Apr. 12, 2004, now U.S. Pat. No. 7,085,637, which is:

[0003] A) a CIP of U.S. patent application Ser. No. 10/118,858 filed Apr. 9, 2002, now U.S. Pat. No. 6,720,920, which is:

[0004] 1) a CIP of U.S. patent application Ser. No. 09/177,041 filed Oct. 22, 1998, now U.S. Pat. No. 6,370,475, which claims priority under 35 U.S.C. §119(e) of U.S. provisional patent application Ser. No. 60/062,729 filed Oct. 22, 1997;

[0005] 2) a CIP of U.S. patent application Ser. No. 09/679,317 filed Oct. 4, 2000, now U.S. Pat. No. 6,405,132, which is a CIP of U.S. patent application Ser. No. 09/523,559 filed Mar. 10, 2000, now abandoned, which claims priority under 35 U.S.C. §119(e) of U.S. provisional patent application Ser. No. 60/123,882 filed Mar. 11, 1999; and

[0006] 3) a CIP of U.S. patent application Ser. No. 09/909,466 filed Jul. 19, 2001, now U.S. Pat. No. 6,526,352; and

[0007] B) a CIP of U.S. patent application Ser. No. 10/216,633 filed Aug. 9, 2002, now U.S. Pat. No. 6,768,944; and

[0008] 2. a CIP of U.S. patent application Ser. No. 11/461,619 filed Aug. 1, 2006, which claims priority under 35 U.S.C. §119(e) of U.S. provisional patent application Ser. No. 60/711,452 filed Aug. 25, 2005 and is:

[0009] A) a CIP of U.S. patent application Ser. No. 10/822,445 filed Apr. 12, 2004, now U.S. Pat. No. 7,085,637, the history of which is set forth above; and

[0010] B) a CIP of U.S. patent application Ser. No. 11/028,386 filed Jan. 3, 2005, now U.S. Pat. No. 7,110,880; and

[0011] 3. a CIP of U.S. patent application Ser. No. 11/464,385 filed Aug. 14, 2006, which claims priority under 35 U.S.C. §119(e) of U.S. provisional patent application Ser. No. 60/711,452 filed Aug. 25, 2005 and is a CIP of U.S. patent application Ser. No. 11/028,386 filed Jan. 3, 2005, now U.S. Pat. No. 7,110,880, the history of which is set forth above.

[0012] This application is related to U.S. patent application Ser. No. 11/562,730 filed Nov. 22, 2006 on the grounds that they contain common subject matter.

[0013] All of the above applications are incorporated by reference herein.
`
FIELD OF THE INVENTION

[0014] The present invention relates generally to accident avoidance or elimination systems for vehicles.
`
BACKGROUND OF THE INVENTION

[0015] A detailed discussion of background information is set forth in parent applications, U.S. patent application Ser. Nos. 09/679,317, 10/822,445 and 11/034,325, all of which are incorporated by reference herein. Some more pertinent background is set forth below. All of the patents, patent applications, technical papers and other references mentioned below and in the parent applications are incorporated herein by reference in their entirety. No admission is made that any or all of these references are prior art and, indeed, it is contemplated that they may not be available as prior art when interpreting 35 U.S.C. § 102 in consideration of the claims of the present application.
`
[0016] "Pattern recognition" as used herein will generally mean any system which processes a signal that is generated by an object (e.g., representative of a pattern of returned or received impulses, waves or other physical property specific to and/or characteristic of and/or representative of that object) or is modified by interacting with an object, in order to determine to which one of a set of classes the object belongs. Such a system might determine only that the object is or is not a member of one specified class, or it might attempt to assign the object to one of a larger set of specified classes, or find that it is not a member of any of the classes in the set. The signals processed are generally a series of electrical signals coming from transducers that are sensitive to acoustic (ultrasonic) or electromagnetic radiation (e.g., visible light, infrared radiation, capacitance or electric and/or magnetic fields), although other sources of information are frequently included. Pattern recognition systems generally involve the creation of a set of rules that permit the pattern to be recognized. These rules can be created by fuzzy logic systems, statistical correlations, or through sensor fusion methodologies as well as by trained pattern recognition systems such as neural networks, combination neural networks, cellular neural networks or support vector machines.
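The class-assignment behavior described above, assigning the object to the best-matching class or concluding it belongs to none of them, can be sketched as follows. This is an illustrative sketch only: the template patterns, the normalized-correlation matching rule and the acceptance threshold are assumptions for the example, not taken from the disclosure.

```python
import math

# Hypothetical class templates: each class is represented by a stored
# reference pattern of returned-pulse amplitudes (values are illustrative).
TEMPLATES = {
    "car":        [0.9, 0.7, 0.4, 0.2],
    "pedestrian": [0.2, 0.8, 0.9, 0.3],
    "tree":       [0.1, 0.2, 0.6, 1.0],
}

def _similarity(a, b):
    """Normalized correlation between two equal-length signals (1.0 = identical shape)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def classify(signal, threshold=0.95):
    """Assign `signal` to the best-matching class, or return None when no
    template is similar enough (the object is not a member of any class in the set)."""
    best_class, best_score = None, 0.0
    for name, template in TEMPLATES.items():
        score = _similarity(signal, template)
        if score > best_score:
            best_class, best_score = name, score
    return best_class if best_score >= threshold else None
```

A trained system would learn the templates (or a more general decision rule) from examples instead of hard-coding them, but the input/output contract is the same.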
`
[0017] "Neural network" as used herein, unless stated otherwise, will generally mean a single neural network, a combination neural network, a cellular neural network, a support vector machine or any combinations thereof. For the purposes herein, a "neural network" is defined to include all such learning systems including cellular neural networks, support vector machines and other kernel-based learning systems and methods, cellular automata and all other pattern recognition methods and systems that learn. A "combination neural network" as used herein will generally apply to any combination of two or more neural networks as most broadly defined that are either connected together or that analyze all or a portion of the input data.
`
[0018] A "combination neural network" as used herein will generally apply to any combination of two or more neural networks that are either connected together or that analyze all or a portion of the input data. A combination neural network can be used to divide up tasks in solving a particular object sensing and identification problem. For example, one neural network can be used to identify an object occupying a space at the side of an automobile and a second neural network can be used to determine the position of the object or its location with respect to the vehicle, for example, in the blind spot. In another case, one neural network can be used merely to determine whether the data is similar to data upon which a main neural network has been trained or whether there is something significantly different about this data and therefore that the data should not be analyzed. Combination neural networks can sometimes be implemented as cellular neural networks. What has been described above is generally referred to as modular neural networks with and without feedback. Actually, the feedback does not have to be from the output to the input of the same neural network; the feedback from a downstream neural network could be input to an upstream neural network, for example. The neural networks can be combined in other ways, for example in a voting situation. Sometimes the data upon which the system is trained is sufficiently complex or imprecise that different views of the data will give different results. For example, a subset of transducers may be used to train one neural network and another subset to train a second neural network, etc. The decision can then be based on a voting of the parallel neural networks, sometimes known as an ensemble neural network. In the past, neural networks have usually only been used in the form of a single neural network algorithm for identifying the occupancy state of the space near an automobile.
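The two combination arrangements described above, majority voting among parallel networks and gating unfamiliar data before it reaches a main network, can be sketched as follows. The callables stand in for trained networks; this is an illustrative sketch, not the disclosed implementation.

```python
from collections import Counter

def ensemble_vote(classifiers, sample):
    """Majority vote over parallel classifiers (the 'ensemble neural network'
    arrangement above). Each classifier is any callable returning a label."""
    votes = Counter(clf(sample) for clf in classifiers)
    label, _count = votes.most_common(1)[0]
    return label

def gated_classify(gate, main_classifier, sample):
    """Modular arrangement above: a first network decides whether the data
    resembles the main network's training data; unfamiliar data is not analyzed."""
    if not gate(sample):
        return None  # data too different from the training set: do not analyze
    return main_classifier(sample)
```

For example, with three stand-in classifiers where two report "car" and one reports "truck", `ensemble_vote` returns "car", and `gated_classify` returns None whenever the gate rejects the sample.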
`
[0019] A trainable or a trained pattern recognition system as used herein generally means a pattern recognition system that is taught to recognize various patterns constituted within the signals by subjecting the system to a variety of examples. The most successful such system is the neural network used either singly or as a combination of neural networks. Thus, to generate the pattern recognition algorithm, test data is first obtained which constitutes a plurality of sets of returned waves, or wave patterns, or other information radiated or obtained from an object (or from the space in which the object will be situated in the passenger compartment, i.e., the space above the seat) and an indication of the identity of that object. A number of different objects are tested to obtain the unique patterns from each object. As such, the algorithm is generated, and stored in a computer processor, and can later be applied to provide the identity of an object based on the wave pattern being received during use by a receiver connected to the processor and other information. For the purposes here, the identity of an object sometimes applies to not only the object itself but also to its location and/or orientation and velocity in the vicinity of the vehicle. For example, a vehicle that is stopped but pointing at the side of the host vehicle is different from the same vehicle that is approaching at such a velocity as to impact the host vehicle. Not all pattern recognition systems are trained systems and not all trained systems are neural networks. Other pattern recognition systems are based on fuzzy logic, sensor fusion, Kalman filters, correlation as well as linear and non-linear regression. Still other pattern recognition systems are hybrids of more than one system such as neural-fuzzy systems.
`
[0020] A pattern recognition algorithm will thus generally mean an algorithm applying or obtained using any type of pattern recognition system, e.g., a neural network, sensor fusion, fuzzy logic, etc.
`
[0021] To "identify" as used herein will generally mean to determine that the object belongs to a particular set or class. The class may be one containing, for example, all motorcycles, one containing all trees, or all trees in the path of the host vehicle, depending on the purpose of the system.
`
[0022] To "ascertain the identity of" as used herein with reference to an object will generally mean to determine the type or nature of the object (obtain information as to what the object is), i.e., that the object is a car, a car on a collision course with the host vehicle, a truck, a tree, a pedestrian, a deer, etc.
`
[0023] A "rear seat" of a vehicle as used herein will generally mean any seat behind the front seat on which a driver sits. Thus, in minivans or other large vehicles where there are more than two rows of seats, each row of seats behind the driver is considered a rear seat and thus there may be more than one "rear seat" in such vehicles. The space behind the front seat includes any number of such rear seats as well as any trunk spaces or other rear areas such as are present in station wagons.
`
[0024] In the description herein on anticipatory sensing, the term "approaching" when used in connection with the mention of an object or vehicle approaching another will usually mean the relative motion of the object toward the vehicle having the anticipatory sensor system. Thus, in a side impact with a tree, the tree will be considered as approaching the side of the vehicle and impacting the vehicle. In other words, the coordinate system used in general will be a coordinate system residing in the target vehicle. The "target" vehicle is the vehicle that is being impacted. This convention permits a general description to cover all of the cases, such as where (i) a moving vehicle impacts into the side of a stationary vehicle, (ii) both vehicles are moving when they impact, or (iii) a vehicle is moving sideways into a stationary vehicle, tree or wall.
`
[0025] "Vehicle" as used herein includes any container that is movable either under its own power or using power from another vehicle. It includes, but is not limited to, automobiles, trucks, railroad cars, ships, airplanes, trailers, shipping containers, barges, etc. The word "container" will frequently be used interchangeably with vehicle; however, a container will generally mean that part of a vehicle that is separate from, and in some cases may exist separately and away from, the source of motive power. Thus, a shipping container may exist in a shipping yard and a trailer may be parked in a parking lot without the tractor. The passenger compartment or a trunk of an automobile, on the other hand, are compartments of a container that generally only exists attached to the vehicle chassis, which also has an associated engine for moving the vehicle. Note that a container can have one or a plurality of compartments.
`
[0026] "Transducer" or "transceiver" as used herein will generally mean the combination of a transmitter and a receiver. In some cases, the same device will serve both as the transmitter and receiver while in others two separate devices adjacent to each other will be used. In some cases, a transmitter is not used and in such cases transducer will mean only a receiver. Transducers include, for example, capacitive, inductive, ultrasonic, electromagnetic (antenna, CCD, CMOS arrays, laser, radar transmitter, terahertz transmitter and receiver, focal plane array, pin or avalanche diode, etc.), electric field, and weight measuring or sensing devices. In some cases, a transducer will be a single pixel either acting alone, or in a linear array or an array of some other appropriate shape. In some cases, a transducer may comprise two parts such as the plates of a capacitor or the antennas of an electric field sensor. Sometimes, one antenna or plate will communicate with several other antennas or plates and thus for the purposes herein, a transducer will be broadly defined to refer, in most cases, to any one of the plates of a capacitor or antennas of a field sensor, and in some other cases a pair of such plates or antennas will comprise a transducer, as determined by the context in which the term is used.
`
[0027] A "wave sensor" or "wave transducer" is generally any device which senses either ultrasonic or electromagnetic waves. An electromagnetic wave sensor, for example, includes devices that sense any portion of the electromagnetic spectrum from ultraviolet down to a few hertz. The most commonly used kinds of electromagnetic wave sensors include CCD and CMOS arrays for sensing visible and/or infrared waves, millimeter wave and microwave radar, and capacitive or electric and/or magnetic field monitoring sensors that rely on the dielectric constant of the object occupying a space but also rely on the time variation of the field, expressed by waves as defined below, to determine a change in state.
`
[0028] A "CCD" will be defined to include all devices, including CMOS arrays, APS arrays, QWIP arrays or equivalent, artificial retinas and particularly HDRC arrays, which are capable of converting light frequencies, including infrared, visible and ultraviolet, into electrical signals. The particular CCD array used for many of the applications disclosed herein is implemented on a single chip that is less than two centimeters on a side. Data from the CCD array is digitized and sent serially to an electronic circuit (at times designated 120 herein) containing a microprocessor for analysis of the digitized data. In order to minimize the amount of data that needs to be stored, initial processing of the image data takes place as it is being received from the CCD array, as discussed in more detail above. In some cases, some image processing can take place on the chip such as described in a Kage et al. artificial retina article referenced in the parent applications.
`
[0029] An "occupant protection apparatus" is any device, apparatus, system or component which is actuatable or deployable, or includes a component which is actuatable or deployable, for the purpose of attempting to reduce injury to the occupant in the event of a crash, rollover or other potentially injurious event involving a vehicle.
`
[0030] Inertial measurement unit (IMU), inertial navigation system (INS) and inertial reference unit (IRU) will in general be used interchangeably to mean a device having a plurality of accelerometers and a plurality of gyroscopes, generally within the same package. Usually such a device will contain 3 accelerometers and 3 gyroscopes. In some cases, a distinction will be made whereby the INS relates to an IMU or an IRU plus additional sensors and software such as a GPS, speedometer, odometer or other sensors, plus optimizing software which may be based on a Kalman filter.
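The INS-plus-GPS arrangement described above can be sketched with a one-axis dead-reckoning step and a blending step. A production INS would use a full Kalman filter as the paragraph notes; the scalar `gps_weight` below is an assumed stand-in for the Kalman gain, and the single-axis state is a simplification for illustration.

```python
def dead_reckon(position, velocity, accel, dt):
    """Propagate the inertial state between GPS fixes by integrating the
    accelerometer output (one axis shown for brevity)."""
    velocity += accel * dt
    position += velocity * dt
    return position, velocity

def fuse_position(ins_position, gps_position, gps_weight=0.2):
    """One complementary-filter correction step: the dead-reckoned position
    drifts slowly, the GPS fix is noisy but unbiased, so blend the two.
    The fixed weight stands in for a Kalman gain (an assumption here)."""
    return (1.0 - gps_weight) * ins_position + gps_weight * gps_position
```

Between GPS updates the system runs only `dead_reckon`; each time a fix arrives, `fuse_position` pulls the drifting inertial estimate back toward it.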
`
[0031] A precise positioning system or PPS is a system based on some information, usually of a physical nature, in the infrastructure that determines the precise location of a vehicle independently of a GPS-based system or the IMU. Such a system is employed as a vehicle is traveling and passes a particular location. A PPS can make use of various technologies including radar, laser radar, terahertz radar, RFID tags located in the infrastructure, and MIR transmitters and receivers. Such locations are identified on a map database resident within the vehicle. In one case, for example, the map database contains data from a terahertz radar continuous scan of the environment to the side of a vehicle from a device located on a vehicle and pointed 45 degrees up relative to the horizontal plane. The map database contains the exact location of the vehicle that corresponds to the scan. Another vehicle can then determine its location by comparing its scan data with that stored in the map database; when there is a match, the vehicle knows its location. Of course, many other technologies can be used to accomplish a similar result.
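The scan-matching idea in this paragraph, comparing a live scan signature against signatures stored in the map database and adopting the surveyed location of a near-exact match, can be sketched as follows. The data layout (a list of signature/location pairs) and the acceptance threshold are assumptions made for the example, not details from the disclosure.

```python
def match_location(scan, scan_map):
    """Compare a live side-scan signature against signatures stored in the
    map database; on a sufficiently close match, return the precise surveyed
    location recorded with that signature, otherwise None.

    `scan_map` is a list of (signature, (latitude, longitude)) pairs, a
    simplified stand-in for the map database described above."""
    best_loc, best_err = None, float("inf")
    for signature, location in scan_map:
        # sum-of-squared-differences between the two range profiles
        err = sum((a - b) ** 2 for a, b in zip(scan, signature))
        if err < best_err:
            best_loc, best_err = location, err
    # accept only a near-exact match; otherwise the position stays unknown
    return best_loc if best_err < 0.01 else None
```

A real system would index the candidate signatures by the approximate GPS position so that only nearby map entries are compared.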
`
[0032] Unless stated otherwise, laser radar, lidar and ladar will be considered equivalent herein. In all cases, they represent a projected laser beam, which can be in the visual part of the electromagnetic spectrum but generally will be in the infrared part of the electromagnetic spectrum and usually in the near infrared wavelengths. The projected laser beam can emanate from the optics as a nearly parallel beam or as a beam that diverges at any desired angle from less than one degree to ten or more degrees depending on the application. A particular implementation may use a laser beam that at one time diverges at an angle less than one degree and at another time may diverge at several degrees using adjustable optics. The laser beam can have a diameter as it leaves the vehicle ranging from less than a millimeter to several centimeters. The above represent typical or representative ranges of dimensions, but this invention is not limited by these ranges.
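The beam geometry described above implies a simple relation between exit diameter, full divergence angle and spot size at range: diameter at range R is approximately the exit diameter plus 2·R·tan(θ/2). A small sketch (the specific numbers used below are illustrative, not from the text):

```python
import math

def beam_diameter(exit_diameter_m, divergence_deg, range_m):
    """Full beam diameter at a given range for a beam leaving the vehicle
    with the stated exit diameter and full divergence angle."""
    half_angle = math.radians(divergence_deg / 2.0)
    return exit_diameter_m + 2.0 * range_m * math.tan(half_angle)
```

For example, a 5 mm beam with a 1 degree full divergence grows to roughly 1.75 m across at 100 m, which is why the adjustable optics mentioned above matter for long-range object discrimination.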
`
OBJECTS AND SUMMARY OF THE INVENTION

[0033] It is an object of the present invention to provide new and improved methods and systems for preventing accidents between vehicles.
`
[0034] In order to achieve this object and others, an accident avoidance system for a host vehicle in accordance with the invention includes a global positioning system residing on the host vehicle for determining the host vehicle's location as the host vehicle travels based on signals received from one or more satellites, a map database comprising digital maps corresponding to an area comprising the location of the host vehicle as determined by the global positioning system, a vehicle-to-vehicle communication system residing on the host vehicle operative for receiving signals comprising location information acquired by global positioning systems residing on other vehicles directly from the other vehicles indicating the locations of the other vehicles, and a navigation system comprising a display residing on the host vehicle for displaying images to an occupant of the host vehicle showing the digital maps and indications of the locations of the host vehicle and the other vehicles on the digital maps, and for updating the images shown on the navigation system display to reflect changes in the locations of the host vehicle and the other vehicles.
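On the display side, the system summarized above reduces to merging the host vehicle's own GPS fix with the positions reported directly by other vehicles over the vehicle-to-vehicle link. A minimal sketch, with an assumed message format (the `vehicle_id`/`lat`/`lon` fields are illustrative, not from the disclosure):

```python
def update_display(host_fix, v2v_messages):
    """Collect the positions to draw over the digital map: the host
    vehicle's GPS fix plus the locations reported directly by other
    vehicles over the vehicle-to-vehicle link. Returns a mapping from
    vehicle identifier to (lat, lon); re-running it with fresh inputs
    reflects the changed locations on the display."""
    positions = {"host": host_fix}
    for msg in v2v_messages:
        positions[msg["vehicle_id"]] = (msg["lat"], msg["lon"])
    return positions
```

The navigation system would call this each update cycle and redraw the map markers from the returned mapping.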

[0035] The signals received by the vehicle-to-vehicle
communication system may be indications of the velocities
and directions of travel of the other vehicles. The global
positioning system may include a differential correction
signal receiver operable for receiving global positioning
differential correction signals from a global positioning
augmentation system. Accordingly, the global positioning
system may use the global positioning differential correction
signals in the determination of the location of the host
vehicle.
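As a rough illustration of the differential correction in paragraph [0035]: an augmentation system at a surveyed site observes the error in its own GPS fix and broadcasts it, and receivers subtract that error from their fixes. Real DGPS corrects pseudoranges per satellite (e.g. via RTCM messages); this simplified, assumed sketch corrects the final position instead.

```python
def apply_differential_correction(raw_fix, correction):
    """Illustrative position-domain DGPS correction.

    raw_fix:    (lat_deg, lon_deg) from the host's GPS receiver.
    correction: (dlat_deg, dlon_deg) error the augmentation system
                observed at its surveyed reference station.
    Returns the corrected (lat_deg, lon_deg) fix.
    """
    lat, lon = raw_fix
    dlat, dlon = correction
    # Both receivers see nearly the same error sources (ionosphere,
    # ephemeris), so subtracting the station's error improves the fix.
    return (lat - dlat, lon - dlon)

# Station's fix is biased by (+1e-5, -2e-5) degrees; host applies it.
corrected = apply_differential_correction((40.68151, -74.02352), (1e-5, -2e-5))
print(corrected)  # approximately (40.68150, -74.02350)
```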

OWNER EX. 2024, page 24

US 2007/0152804 A1

Jul. 5, 2007

[0036] In one embodiment, a processor determines that a
potential collision is likely to occur between the host vehicle
and one of the other vehicles, and a warning system alerts an
operator of the host vehicle of the potential collision. The
processor can reside on the host vehicle or apart from the
host vehicle.
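The patent does not specify the processor's collision test; one common, minimal approach consistent with paragraph [0036] is a constant-velocity closest-point-of-approach check on the V2V position and velocity reports. The function and threshold names below are assumptions for illustration.

```python
def time_to_closest_approach(p_host, v_host, p_other, v_other):
    """Constant-velocity closest-point-of-approach computation.
    Positions in meters (local flat frame), velocities in m/s.
    Returns (time_s, miss_distance_m)."""
    rx, ry = p_other[0] - p_host[0], p_other[1] - p_host[1]
    vx, vy = v_other[0] - v_host[0], v_other[1] - v_host[1]
    vv = vx * vx + vy * vy
    if vv == 0.0:  # no relative motion: separation is constant
        return 0.0, (rx * rx + ry * ry) ** 0.5
    # Minimize |r + v*t| over t >= 0.
    t = max(0.0, -(rx * vx + ry * vy) / vv)
    dx, dy = rx + vx * t, ry + vy * t
    return t, (dx * dx + dy * dy) ** 0.5

def collision_warning(p_host, v_host, p_other, v_other,
                      threshold_m=3.0, horizon_s=10.0):
    """Flag a likely collision if the vehicles pass within threshold_m
    of each other within the next horizon_s seconds."""
    t, miss = time_to_closest_approach(p_host, v_host, p_other, v_other)
    return t <= horizon_s and miss <= threshold_m

# Two vehicles converging on a right-angle intersection at 10 m/s each:
print(collision_warning((0, 0), (10, 0), (50, -50), (0, 10)))  # True
```

Because the check needs only the positions, velocities, and directions exchanged over the V2V link, it could equally run on the host vehicle or at a remote site, matching the last sentence of the paragraph.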

[0037] When the host vehicle has an assigned corridor of
travel, a processor can determine that the host vehicle will
potentially exit its assigned corridor of travel and alert an
operator of the host vehicle of the potential exit from the
assigned corridor of travel via a warning system. The
warning system may generate an audible warning and/or
cause the navigation system display to indicate an alarm.
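One simple way (assumed here, not taken from the specification) to detect a potential corridor exit as in paragraph [0037] is to compute the host's cross-track distance from the corridor centerline and warn when it exceeds the corridor's half-width:

```python
def cross_track_distance(p, a, b):
    """Distance (m) from point p to centerline segment a-b, all in a
    local flat (east, north) frame in meters."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg2 = dx * dx + dy * dy
    # Parameter of the closest point on the segment, clamped to [0, 1].
    t = 0.0 if seg2 == 0 else max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg2))
    cx, cy = ax + t * dx, ay + t * dy
    return ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5

def corridor_alert(p, centerline_a, centerline_b, half_width_m):
    # Trigger the warning system before the host crosses the edge.
    return cross_track_distance(p, centerline_a, centerline_b) > half_width_m

# Corridor centered on y = 0, 1.8 m half-width:
print(corridor_alert((10.0, 1.2), (0, 0), (100, 0), 1.8))  # False: inside
print(corridor_alert((10.0, 2.5), (0, 0), (100, 0), 1.8))  # True: drifting out
```

A production system would project ahead along the host's velocity vector so the audible or displayed alarm fires before, not at, the corridor edge.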

[0038] The map database may include one or more digital
maps indicating at least one edge and surface shape of a
roadway on which the host vehicle is traveling, indicating an
elevation of a roadway on which the host vehicle is traveling,
indicating an edge of a roadway on which the host vehicle
is traveling and/or indicating a character of land beyond the
edge of the roadway.
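A record layout covering the map attributes paragraph [0038] enumerates might look like the following sketch. The field names and types are purely illustrative assumptions; the patent does not define a database schema.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class RoadSegment:
    """Hypothetical map-database record for one roadway segment."""
    centerline: List[Tuple[float, float]]  # (lat, lon) shape points
    left_edge: List[Tuple[float, float]]   # roadway edge geometry
    right_edge: List[Tuple[float, float]]
    elevation_m: List[float]               # elevation along the centerline
    surface_shape: str                     # e.g. "crowned", "banked"
    land_beyond_edge: str                  # e.g. "shoulder", "ditch", "cliff"

seg = RoadSegment(
    centerline=[(40.0000, -74.0000), (40.0010, -74.0000)],
    left_edge=[(40.0000, -74.00005), (40.0010, -74.00005)],
    right_edge=[(40.0000, -73.99995), (40.0010, -73.99995)],
    elevation_m=[12.0, 12.4],
    surface_shape="crowned",
    land_beyond_edge="ditch",
)
print(seg.land_beyond_edge)  # ditch
```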

[0039] When the host vehicle includes a radar system, and
while the host vehicle is traveling, the radar system can
acquire a location of an object in the area comprising the
location of the host vehicle and the navigation system is
operative for indicating the location of the object on a map
shown on the navigation system display.
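To place a radar return on the map as described in paragraph [0039], the detection's range and bearing (relative to the host's heading) must be converted into map coordinates. The sketch below assumes a flat-earth approximation, which is reasonable over radar ranges of a few hundred meters; it is an illustrative method, not the patent's.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, flat-earth approximation

def radar_to_map(host_lat, host_lon, heading_deg, range_m, bearing_deg):
    """Convert a radar detection at (range_m, bearing_deg relative to the
    host's heading) into (lat, lon) for display on the digital map."""
    # Absolute bearing of the object, clockwise from true north.
    abs_bearing = math.radians(heading_deg + bearing_deg)
    north_m = range_m * math.cos(abs_bearing)
    east_m = range_m * math.sin(abs_bearing)
    lat = host_lat + math.degrees(north_m / EARTH_RADIUS_M)
    lon = host_lon + math.degrees(
        east_m / (EARTH_RADIUS_M * math.cos(math.radians(host_lat))))
    return lat, lon

# Object 100 m dead ahead of a north-facing host at (40 N, 74 W):
lat, lon = radar_to_map(40.0, -74.0, heading_deg=0.0, range_m=100.0,
                        bearing_deg=0.0)
print(round(lat - 40.0, 6), round(lon + 74.0, 6))
```

The resulting (lat, lon) can then be handed to the navigation display exactly like a V2V position report.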

[0040] When the host vehicle includes a data acquisition
system, and while the host vehicle is traveling, the data
acquisition system may receive a digital map transmitted
from a site remote from the host vehicle and the navigation
system displays the received digital map on the navigation
system display. Additionally or alternatively, the data acqui-
sition system receives weather condition information per-
taining to the area where the host vehicle is located trans-
mitted from a site remote from the host vehicle and the
navigation system indicates the weather condition information
on the navigation system display.
