PATENT APPLICATION FILE WRAPPER -- FACE (Form PTO-436A, Rev. 8/92)

CLASS: 340

APPLICANT: TOMOYUKI NISHIO, KAWASAKI, JAPAN

**CONTINUING DATA*********************
VERIFIED    THIS APPLN IS A CON. OF 08/097,178

**FOREIGN/PCT APPLICATIONS************
VERIFIED    JAPAN  229,201/92  08/04/92

Foreign priority claimed; 35 USC 119 conditions met

STATE OR COUNTRY: JPX
SHEETS DRWGS.: 7

KANESAKA AND TAKEUCHI
727 23RD STREET SOUTH
ARLINGTON VA 22202

BRENT A. SWARTHOUT
PRIMARY EXAMINER
GROUP 2600

WARNING: The information disclosed herein may be restricted. Unauthorized
disclosure may be prohibited by the United States Code Title 35, Sections 122,
181 and 368. Possession outside the U.S. Patent & Trademark Office is
restricted to authorized employees and contractors only.

(FACE)

1

Mercedes-Benz USA, LLC, Petitioner - Ex. 1005
PATENT APPLICATION FILE WRAPPER -- FACE (Form PTO-436, rev. 10-78, U.S. DEPT.
OF COMM. - Pat. & TM Office)

FILING DATE: 07/27/93
CLASS: 340
SUBCLASS: 903
GROUP ART UNIT: 2617

APPLICANT: TOMOYUKI NISHIO, KAWASAKI-SHI, JAPAN

**CONTINUING DATA*********************
VERIFIED

**FOREIGN/PCT APPLICATIONS************
VERIFIED    JAPAN  229,201/92  08/04/92

STATE OR COUNTRY: JPX
SHEETS DRWGS.: 7

727 TWENTY-THIRD STREET SOUTH
ARLINGTON, VA 22202

VEHICLE CRASH-PREDICTIVE AND EVASIVE OPERATION SYSTEM BY NEURAL NETWORKS

Assistant Examiner / Sheets Drwg. / Print Fig. / ISSUE BATCH NUMBER / Primary
Examiner / Label Area

WARNING: The information disclosed herein may be restricted. Unauthorized
disclosure may be prohibited by the United States Code Title 35, Sections 122,
181 and 368. Possession outside the U.S. Patent & Trademark Office is
restricted to authorized employees and contractors only.

(FACE)

2
BAR CODE LABEL                                        U.S. PATENT APPLICATION

SERIAL NUMBER:  08/375,249
FILING DATE:    01/19/95
CLASS:          340
GROUP ART UNIT: 2617

APPLICANT: TOMOYUKI NISHIO, KAWASAKI, JAPAN

**CONTINUING DATA*********************
VERIFIED    THIS APPLN IS A CON OF 08/097,178  07/27/93  ABN

**FOREIGN/PCT APPLICATIONS************
VERIFIED    JAPAN  229,201/92  08/04/92

STATE OR COUNTRY:    JPX
SHEETS DRAWING:      7
TOTAL CLAIMS:        4
INDEPENDENT CLAIMS:  1
FILING FEE RECEIVED: $730.00
ATTORNEY DOCKET NO.: K-1518

ADDRESS: MANABU KANESAKA
         KANESAKA AND TAKEUCHI
         727 23RD STREET SOUTH
         ARLINGTON VA 22202

TITLE: VEHICLE CRASH PREDICTIVE AND EVASIVE OPERATION SYSTEM BY NEURAL
NETWORKS

This is to certify that annexed hereto is a true copy from the records of the
United States Patent and Trademark Office of the application which is
identified above.
By authority of the
COMMISSIONER OF PATENTS AND TRADEMARKS

Date                              Certifying Officer

3
BAR CODE LABEL                                        U.S. PATENT APPLICATION

SERIAL NUMBER:  08/097,178
FILING DATE:    07/27/93
CLASS:          340
GROUP ART UNIT: 2617

APPLICANT: TOMOYUKI NISHIO, KAWASAKI-SHI, JAPAN

**CONTINUING DATA*********************
VERIFIED

**FOREIGN/PCT APPLICATIONS************
VERIFIED    JAPAN  229,201/92  08/04/92

STATE OR COUNTRY:    JPX
SHEETS DRAWING:      7
TOTAL CLAIMS:        7
INDEPENDENT CLAIMS:  1
FILING FEE RECEIVED: $710.00
ATTORNEY DOCKET NO.: K1398

ADDRESS: KANESAKA AND TAKEUCHI
         727 TWENTY-THIRD STREET SOUTH
         ARLINGTON, VA 22202

TITLE: VEHICLE CRASH PREDICTIVE AND EVASIVE OPERATION SYSTEM BY NEURAL
NETWORKS

This is to certify that annexed hereto is a true copy from the records of the
United States Patent and Trademark Office of the application which is
identified above.
By authority of the
COMMISSIONER OF PATENTS AND TRADEMARKS

Date                              Certifying Officer

4
APPROVED FOR LICENSE

INITIALS

Date Received or Mailed

CONTENTS (continued): entries 11. through 32. are blank.

(FRONT)

5
APPROVED FOR LICENSE

CONTENTS

Date Received or Mailed          Date Entered or Counted

[Handwritten docket entries, largely illegible in this copy.]

(FRONT)

6
United States Patent  [19]                                        US005541590A

Nishio                              [11]  Patent Number:              5,541,590
                                    [45]  Date of Patent:         *Jul. 30, 1996

[54] VEHICLE CRASH PREDICTIVE AND EVASIVE OPERATION SYSTEM BY NEURAL NETWORKS

[75] Inventor:  Tomoyuki Nishio, Kawasaki, Japan

[73] Assignee:  Takata Corporation, Tokyo, Japan

[*]  Notice:    The term of this patent shall not extend beyond the expiration
                date of Pat. No. 5,377,108.

[21] Appl. No.: 375,249

[22] Filed:     Jan. 19, 1995

                      Related U.S. Application Data

[63] Continuation of Ser. No. 97,178, Jul. 27, 1993, abandoned.

[30]                Foreign Application Priority Data

     Aug. 4, 1992  [JP]  Japan ................................... 4-229201

[51] Int. Cl.6 ................................................... G08G 1/16
[52] U.S. Cl. ............. 340/903; 340/435; 348/148; 364/424.04; 395/23;
                           395/905
[58] Field of Search ...... 340/435, 995, 903, 905; 348/170, 113, 148, 149;
                           364/424.01, 424.04, 424.05; 395/905, 22, 11, 21,
                           23; 382/104, 157

[56]                        References Cited

                        U.S. PATENT DOCUMENTS

     5,130,563    7/1992   Nabet et al. ......................... 364/807
     5,161,014   11/1992   Pearson et al. ........................ 395/11
     5,161,632   11/1992   Asayama .............................. 340/435
     5,162,997   11/1992   Takahashi .......................... 364/424.1
     5,189,619    2/1993   Adachi et al. ........................ 340/903
     5,200,898    4/1993   Yuhara et al. ..................... 364/431.04
     5,214,744    5/1993   Schweizer et al. ..................... 364/807
     5,249,157    9/1993   Taylor ............................... 340/903
     5,270,708   12/1993   Kamishima ............................ 340/905
     5,282,134    1/1994   Gioutsos et al. ................... 364/424.01
     5,285,523    2/1994   Takahashi ......................... 364/424.01
     5,377,108   12/1994   Nishio ............................ 364/424.05
     5,434,927    7/1995   Brady et al. ......................... 348/148

                      FOREIGN PATENT DOCUMENTS

      0358628    3/1990   European Pat. Off.
      2554612    5/1985   France
      3837054    6/1989   Germany
      4001493    7/1991   Germany
     04008639    1/1992   Japan
      9002985    3/1990   WIPO

                         OTHER PUBLICATIONS

Rumelhart et al., "Parallel Distributed Processing", vol. 1, pp. 161, 162,
copyrighted 1986.

Primary Examiner-Brent A. Swarthout
Attorney, Agent, or Firm-Kanesaka & Takeuchi

[57]                            ABSTRACT

A system for predicting and evading crash of a vehicle includes an image
pick-up device mounted on the vehicle for picking up images of actual
ever-changing views while the vehicle is running, to produce actual image
data; a crash predicting device associated with said image pick-up device,
said crash predicting device being successively supplied with the actual image
data for predicting occurrence of crash between the vehicle and potentially
dangerous objects on the roadway, to produce an operational signal when there
is a possibility of crash; and a safety drive ensuring device connected to
said crash predicting device for actuating, in response to the operational
signal, an occupant protecting mechanism which is operatively connected
thereto and equipped in the vehicle. The crash predicting device includes a
neural network which is previously trained with training data to predict the
possibility of crash, the training data representing ever-changing views
previously picked up by said image pick-up device during driving of the
vehicle for causing actual crash.

4 Claims, 7 Drawing Sheets

[Representative figure: the crash predicting circuit 60 receives image data
through input interface 22 and drives, through output interface 40, the safety
drive ensuring arrangement 50; labelled outputs run to the steering wheel, to
the throttle valve (throttle actuator 52) and to the brake (brake actuator
53).]

7
U.S. Patent          Jul. 30, 1996          Sheet 1 of 7          5,541,590

[FIG. 1 (PRIOR ART): block diagram of a conventional crash predicting and
evading system. An image pick-up device, a speed sensor 23 and a steering gear
ratio sensor 24 feed an input I/F 22; a signal processor 30 (CPU 31, ROM 32,
RAM 33 on data bus 34) drives, through an output I/F 40, the safety drive
ensuring arrangement 50, whose steering, throttle and brake actuators (51, 52,
53) run to the steering wheel, the throttle valve and the brake.]

8
U.S. Patent          Jul. 30, 1996          Sheet 2 of 7          5,541,590

[FIG. 2 (PRIOR ART): a processing element receives inputs x1, ..., xi weighted
by w1, ..., wi, forms the sum X = Σ xi·wi, and emits the output y = f(X).]

[FIG. 3 (PRIOR ART): graph of the sigmoid transfer function
f(X) = 1 / (1 + exp(-X)), which rises from 0 toward 1.0 as X increases.]

9
U.S. Patent          Jul. 30, 1996          Sheet 3 of 7          5,541,590

[FIG. 4: block diagram of the system of the first embodiment. An image pick-up
device 21 mounted on the automobile 10 feeds an input I/F 22; the crash
predicting circuit 60 drives, through an output I/F 40, the safety drive
ensuring arrangement 50, whose steering actuator 51, throttle actuator 52 and
brake actuator 53 run to the steering wheel, the throttle valve and the
brake.]

10
U.S. Patent          Jul. 30, 1996          Sheet 4 of 7          5,541,590

[FIG. 5(a): the crash predicting circuit 60 realized as a neural network; an
input layer 61 of processing elements 61-1, 61-2, ..., 61-n receives data from
input I/F 22, and the network output (element 63) goes to the safety drive
ensuring arrangement 50.]

[FIG. 5(b): a picked-up image divided by grid lines into the two-dimensional
array of data elements supplied to the input layer.]

11
U.S. Patent          Jul. 30, 1996          Sheet 5 of 7          5,541,590

[FIGS. 6(a) and 6(b): example views picked up at different time instants while
driving an experimental vehicle, used as training image data for the neural
network.]

12
U.S. Patent          Jul. 30, 1996          Sheet 6 of 7          5,541,590

[FIG. 7: an example image (80c) obtained while driving a utility vehicle.]

[FIG. 8: another example image (80d) obtained while driving a utility vehicle.]

13
U.S. Patent          Jul. 30, 1996          Sheet 7 of 7          5,541,590

[FIG. 9: the neural network of the second embodiment; input-layer processing
elements 61-1, 61-2, ..., 61-n (layer 61) receive data from input I/F 22, an
intermediate layer 65 follows, and the output goes to the safety drive
ensuring arrangement 50.]

14
5,541,590

VEHICLE CRASH PREDICTIVE AND EVASIVE OPERATION SYSTEM BY NEURAL NETWORKS

This application is a continuation of application Ser. No. 08/097,178, filed
Jul. 27, 1993, now abandoned.

BACKGROUND OF THE INVENTION
This invention generally relates to a system for predicting and evading crash
of a vehicle, in case of

In driving a car, a driver unconsciously senses various conditions through the
objects in view and, as the case may be, he must take an action to evade any
possible crash or collision. However, drivers will often be panicked in an
emergency. Such a panicked driver may not properly handle the vehicle.
Besides, a response delay to stimuli, in varying degrees, is inherent to human
beings, so that it is physically impossible in some cases to evade crash or
danger. In this respect, various techniques have been developed to evade
collision by mounting on a vehicle a system for determining the possibility of
crash in a mechanical or electrical manner before it happens. Accidents could
be reduced if drivers had an automatic system or the like warning of potential
collision situations.

An automobile collision avoidance radar is typically used as this automatic
system. Such an automobile collision avoidance radar is disclosed in, for
example, M. Kiyoto and A. Tachibana, Nissan Technical Review: Automobile
Collision-Avoidance Radar, Vol. 18, Dec. 1982, which is incorporated by
reference herein in its entirety. The radar disclosed comprises a small radar
radiation element and antennas installed at the front end of a vehicle. A
transmitter transmits microwaves through the radiation element towards the
headway. The microwaves backscatter from a leading vehicle or any other
objects as echo returns. The echo returns are received by a receiver through
the antennas and supplied to a signal processor. The signal processor carries
out a signal processing operation to calculate a relative velocity and a
relative distance between the object and the vehicle. The relative velocity
and the relative distance are compared with predetermined values,
respectively, to determine if the vehicle is going to collide with the object.
A high possibility of collision results in activation of a proper safety
system or systems.

However, the above-mentioned radar system has a disadvantage of faulty
operation or malfunction, especially when the vehicle implementing this system
passes by a sharp curve in a road. The radar essentially detects objects in
front of the vehicle on which it is mounted. The system thus tends to
incorrectly identify objects alongside the road, such as the roadside, guard
rails or even an automobile correctly running in the adjacent lane.

An intelligent vehicle has also been proposed that comprises an image
processing system for cruise and traction controls. The views ahead of the
vehicle are successively picked up as image patterns. These image patterns are
subjected to pattern matching with predetermined reference patterns. The
reference patterns are classified into categories associated with possible
driving conditions. For example, three categories are defined for straight
running, right turn and left turn. When a matching result indicates the
presence of potentially dangerous objects in the picked-up image, the steering
wheel and the brake system are automatically operated through a particular
mechanism to avoid or evade a crash into that object.

The image processing system of the type described is useful in normal driving
conditions, where the pattern matching can be effectively made between the
image patterns successively picked up and the reference patterns for safety
driving control. However, image patterns representing various conditions on
the roadway should be stored in advance in the intelligent vehicle as the
reference patterns. Vehicle orientation at the initiation of a crash varies
greatly, so that huge numbers of reference patterns are required for positive
operation. This means that only a time-consuming calculation will result in a
correct matching of the patterns, which is not suitable for evading an
unexpected crash.

It is, of course, possible to increase the operational speed of the pattern
matching by using a large image processor. However, such a processor is
generally complex in structure and relatively expensive, so that it is
difficult to apply it as on-vehicle equipment. In addition, an on-vehicle
image processor, if achieved, would perform its function sufficiently only in
limited applications, such as a supplemental navigation system during normal
cruising.
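The Background above describes the radar check only in words: the measured
relative distance and relative velocity are compared with predetermined values
and, when they suggest danger, a safety system is activated. The fragment
below is only an illustrative sketch of such a threshold test, not code from
the patent or from the cited Nissan reference; the function name and the
threshold figures are arbitrary assumptions.

    # Illustrative sketch of a radar-style threshold test (not from the patent).
    # The threshold values below are arbitrary example figures.
    def radar_collision_check(rel_distance_m: float,
                              closing_speed_mps: float,
                              min_distance_m: float = 30.0,
                              max_closing_speed_mps: float = 15.0) -> bool:
        """Return True when the echo-return measurements suggest a likely collision."""
        if closing_speed_mps <= 0.0:
            return False                      # object is holding distance or pulling away
        too_close = rel_distance_m < min_distance_m
        closing_fast = closing_speed_mps > max_closing_speed_mps
        return too_close or closing_fast

    # Example: 20 m ahead and closing at 18 m/s would arm the safety system.
    print(radar_collision_check(20.0, 18.0))   # True
    print(radar_collision_check(80.0, 5.0))    # False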
SUMMARY OF THE INVENTION

An object of the present invention is to provide a system for predicting and
evading crash of a vehicle using neural networks.

Another object of the present invention is to provide a system capable of
training neural networks by means of collected image data representing scenes
along the moving direction of a vehicle until the vehicle collides with
something.

It is yet another object of the present invention to provide a system for
predicting crash through a matching operation between data obtained while
driving a vehicle and data learned by neural networks. It is still another
object of the present invention to provide a system for evading crash of a
vehicle using neural networks to actuate a vehicle safety system for
protecting an occupant.

In order to achieve the above-mentioned objects, the present invention
provides a system for predicting and evading crash of a vehicle comprising: an
image pick-up device mounted on the vehicle for picking up images of
ever-changing views while the vehicle is running, to produce image data; a
crash predicting circuit associated with the image pick-up device, the crash
predicting circuit being successively supplied with the image data for
predicting occurrence of crash between the vehicle and potentially dangerous
objects on the roadway, to produce an operational signal when there is a
possibility of crash; and a safety driving ensuring device connected to the
crash predicting circuit for actuating, in response to the operational signal,
an occupant protecting mechanism which is operatively connected thereto and
equipped in the vehicle; wherein the crash predicting circuit comprises a
neural network which is previously trained with training data to predict the
possibility of crash, the training data representing ever-changing views
previously picked up from the image pick-up device during driving of the
vehicle and just after actual crash.

The neural network comprises at least an input layer and an output layer, and
the training data are supplied to the input layer while the output layer is
supplied with, as teacher data, flags representing expected and unexpected
crash, respectively, of the vehicle. In addition, the neural network may
comprise a two-dimensional self-organizing competitive learning layer as an
intermediate layer.
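The specification contains no program code; purely as an illustration of the
data flow summarized above, the sketch below feeds successive digitized frames
from the image pick-up device to a previously trained crash predicting network
and actuates the safety drive ensuring device when an operational signal is
produced. The predict_crash interface, the frame representation and the 0.5
decision threshold are assumptions made here for the example, not features
recited in the claims.

    # Illustrative sketch of the claimed data flow, not an implementation from
    # the patent. The prediction interface and threshold are assumptions.
    from typing import Callable, Iterable, Sequence

    def crash_evasion_loop(frames: Iterable[Sequence[float]],
                           predict_crash: Callable[[Sequence[float]], float],
                           actuate_safety_devices: Callable[[], None],
                           threshold: float = 0.5) -> None:
        """Feed each digitized frame to the trained network; actuate on danger."""
        for frame in frames:
            crash_probability = predict_crash(frame)    # neural network output
            if crash_probability >= threshold:          # "operational signal"
                actuate_safety_devices()                # steering / throttle / brake

    # Example wiring with stand-in components.
    dummy_frames = [[0.0] * 64, [0.9] * 64]             # two fake digitized frames
    crash_evasion_loop(dummy_frames,
                       predict_crash=lambda f: sum(f) / len(f),
                       actuate_safety_devices=lambda: print("evasive action"))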
15
5,541,590

Other advantages and features of the present invention will be described in
detail in the following preferred embodiments thereof.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a block diagram of a conventional system for predicting and evading
crash of a vehicle;

FIG. 2 is a schematic view showing a processing element in a typical neural
network;

FIG. 3 is a graphical representation of a sigmoid function used as a transfer
function for training neural networks;

FIG. 4 is a block diagram of a system for predicting and evading crash of a
vehicle using neural networks according to the first embodiment of the present
invention;

FIG. 5(a) is a schematic structural diagram of the crash predicting circuit of
FIG. 4 realized by a neural network of three layers;

FIG. 5(b) shows an example of an input layer consisting of a two-dimensional
array of processing elements of the neural network shown in FIG. 5(a);

FIGS. 6(a) and 6(b) are exemplified views picked up, as the training image
data supplied to the neural network, at different time instants during driving
of an experimental vehicle;

FIG. 7 is a view showing an example of image data obtained during driving of a
utility vehicle;

FIG. 8 is a view showing another example of image data obtained during driving
of a utility vehicle; and

FIG. 9 is a block diagram of a system for predicting and evading crash using
neural networks according to the second embodiment of the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

A conventional system for predicting and evading crash of a vehicle is
described first to facilitate an understanding of the present invention.
Throughout the following detailed description, similar reference numerals
refer to similar elements in all figures of the drawing.

In the following description, the term "crash" is used in a wider sense that
relates to all unexpected traffic accidents. Accidents other than crash
include a turnover or fall of a vehicle, with which the phenomenon of "crash"
is associated to some degree, so that the term crash is used as a cause of
traffic accidents.

As shown in FIG. 1, an image pick-up device 21 is mounted at a front portion
of an automobile 10 to pick up ever-changing images as analog image data. This
image pick-up device 21 is any one of suitable devices such as a
charge-coupled-device (CCD) camera. The image data are subject to sampling
over a sampling range ΔT at a predetermined sampling interval Δt. The image
data are collected up to crash. In this event, the image pick-up range of the
image pick-up device 21 corresponds to a field of view observed through naked
eyes.

The image pick-up device 21 is connected to an input interface 22. The analog
image data obtained by the image pick-up device 21 are supplied to the input
interface 22. The input interface 22 serves as an analog-to-digital converter
for converting the analog image data into digital image data. More
particularly, the picked-up images are digitized by dividing them into tiny
pixels (data elements) isolated by grids. It is preferable to eliminate noises
and distortions at this stage. The input interface 22 is also connected to a
speed sensor 23, a steering gear ratio sensor 24 and a signal processor 30.
The speed sensor 23 supplies velocity data to the signal processor 30 through
the input interface 22. The velocity data represent an actual velocity of the
automobile 10 at the time instant when the image pick-up device 21 picks up an
image of a view. Likewise, the steering gear ratio sensor 24 supplies steering
gear ratio data to the signal processor 30 through the input interface 22. The
steering gear ratio data represent an actual steering gear ratio of the
automobile 10.

The signal processor 30 comprises a central processing unit (CPU) 31, a
read-only memory (ROM) 32 and a random-access memory (RAM) 33. CPU 31, ROM 32
and RAM 33 are operatively connected to each other through a data bus 34. To
evade potentially dangerous objects, CPU 31 carries out a calculation
operation in response to the image, velocity and steering gear ratio data
given through the input interface 22. CPU 31 performs proper functions
according to programs stored in ROM 32 and RAM 33. The output of the signal
processor 30 is transmitted through an output interface 40. ROM 32 stores a
table of numerical values required for the calculation. It also stores a table
representing operational amounts for a safety drive ensuring arrangement 50.
On the other hand, RAM 33 stores programs for use in calculating an optimum
operational amount for the safety drive ensuring arrangement 50. A program for
this purpose is disclosed in, for example, Teruo Yatabe, Automation Technique:
Intelligent Vehicle, pages 22-28.

The signal processor 30 first determines, according to the picked-up image
data, whether there is a space available on the roadway to pass through. When
there is enough space to pass through and a potentially dangerous object is
present on the roadway, the signal processor 30 calculates the optimum
operational amount for the safety drive ensuring arrangement 50 to operate the
same. In FIG. 1, the safety drive ensuring arrangement 50 consists of a
steering actuator 51, a throttle actuator 52 and a brake actuator 53. If the
signal processor 30 determines that it is necessary to operate these
actuators, it produces a steering gear ratio command, a set velocity command
and a brake operation command. The steering actuator 51, the throttle actuator
52 and the brake actuator 53 are operated, depending on the condition, in
response to the steering gear ratio command, the set velocity command and the
brake operation command, respectively. The actuators are for use in actuating
an occupant protecting mechanism such as a brake device. Operation of these
actuators is described now.

The steering actuator 51 is a hydraulic actuator for use in rotating the
steering wheel (not shown) in an emergency. In this event, the steering wheel
is automatically rotated according to the steering gear ratio and rotational
direction indicated by the steering gear ratio command. The operational amount
of the steering or hydraulic actuator can be controlled in a well-known manner
through a servo valve and a hydraulic pump, both of which are not shown in the
figure.

The throttle actuator 52 acts to adjust the opening amount of a throttle valve
(not shown) to decrease speed while evading objects or the like.

The brake actuator 53 performs a function to gradually decrease the speed of
the vehicle in response to the brake operation command. The brake actuator 53
is also capable of achieving a sudden brake operation, if necessary.

As mentioned above, CPU 31 carries out its operation with the tables and
programs stored in ROM 32 and RAM 33, respectively, calculating for all
picked-up image data.
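The control cycle just described can be summarized, for concreteness only, in
the following sketch. It is not code from the patent: the computation of the
three commands is deliberately left abstract, because the specification defers
that step to the look-up tables held in ROM 32 and to the program cited from
Yatabe.

    # Sketch of one pass of signal processor 30 over a picked-up image (FIG. 1).
    # This only makes the described data flow concrete; the command computation
    # is hidden behind compute_commands.
    from dataclasses import dataclass
    from typing import Callable, Optional, Sequence

    @dataclass
    class ActuatorCommands:
        steering_gear_ratio: float   # for steering actuator 51 (to steering wheel)
        set_velocity: float          # for throttle actuator 52 (to throttle valve)
        brake: float                 # for brake actuator 53 (0 = none, 1 = full)

    def control_cycle(image: Sequence[float],
                      velocity: float,
                      steering_gear_ratio: float,
                      space_available: bool,
                      object_present: bool,
                      compute_commands: Callable[..., ActuatorCommands]
                      ) -> Optional[ActuatorCommands]:
        """Issue actuator commands only when a dangerous object can be evaded."""
        if space_available and object_present:
            # Optimum operational amounts for the safety drive ensuring
            # arrangement 50, e.g. taken from look-up tables.
            return compute_commands(image, velocity, steering_gear_ratio)
        return None   # nothing for the actuators to do on this cycle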
16
5,541,590

The conventional system is thus disadvantageous in that the calculation
operation requires a relatively long time interval, as mentioned in the
preamble of the instant specification.

On the contrary, a system according to the present invention uses image data
representing ever-changing views picked up from a vehicle until it suffers
from an accident. These image data are used for training a neural network
implemented in the present system. After completion of the training, the
neural network is implemented in a utility vehicle and serves as a decision
making circuit for starting safety driving arrangements to evade a crash which
otherwise would certainly happen. The neural network predicts crash and evades
it by properly starting an automatic steering system or a brake system.

A well-known neural network is described first to facilitate an understanding
of the present invention, following which preferred embodiments of the present
invention will be described with reference to the drawing.

A neural network is a technological discipline concerned with information
processing systems and is still in a development stage. Such an artificial
neural network structure is based on our present understanding of biological
nervous systems. The artificial neural network is a parallel, distributed
information processing structure consisting of processing elements
interconnected by unidirectional signal channels called connections. Each
processing element has a single output connection that branches into as many
collateral connections as desired.

A basic function of the processing elements is described below.

As shown in FIG. 2, each processing element can receive any number of incoming
functions while it has a single output connection that can fan out to form
multiple output connections. Thus the artificial neural network is far simpler
than the networks in a human brain. Each of the input data x1, x2, ..., xi is
multiplied by its corresponding weight coefficient w1, w2, ..., wi,
respectively, and the processing element sums the weighted inputs and passes
the result through a nonlinearity. Each processing element is characterized by
an internal threshold, or offset, and by the type of nonlinearity, and
processes a predetermined transfer function to produce an output f(X)
corresponding to the sum X = Σ xi·wi. In FIG. 2, xi represents an output of an
i-th processing element in an (s-1)-th layer and wi represents a connection
strength, or weight, from the (s-1)-th layer to the s-th layer. The output
f(X) represents the energy condition of each processing element. Though neural
networks come in a variety of forms, they can generally be classified into
feedforward and recurrent classes. In the latter, the output of each
processing element is fed back to other processing elements via weights. As
described above, the network has an energy, or an energy function, that will
finally reach a minimum. In other words, the network is considered to have
converged and stabilized when the outputs no longer change on successive
iterations. The means used to stabilize the network depends on the algorithm
used.
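The processing element of FIG. 2 and the sigmoid transfer function of FIG. 3
can be restated directly in code. The short sketch below computes the weighted
sum X = Σ xi·wi plus an internal offset and passes it through
f(X) = 1 / (1 + exp(-X)); the example inputs, weights and offset are
arbitrary.

    # One processing element of FIG. 2 with the sigmoid transfer function of
    # FIG. 3. Example values are illustrative only.
    import math
    from typing import Sequence

    def sigmoid(x: float) -> float:
        """f(X) = 1 / (1 + exp(-X)); bounded in (0, 1) with a positive derivative."""
        return 1.0 / (1.0 + math.exp(-x))

    def processing_element(inputs: Sequence[float],
                           weights: Sequence[float],
                           offset: float = 0.0) -> float:
        """Weighted sum X = sum(xi * wi) plus an internal offset, then f(X)."""
        x = sum(xi * wi for xi, wi in zip(inputs, weights)) + offset
        return sigmoid(x)

    # Example: two inputs fanned into a single element.
    print(processing_element([0.5, -1.0], [0.8, 0.3]))   # about 0.52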
The back propagation neural network is one of the most important and common
neural network architectures, and it is the one applied in the present
invention. In this embodiment, the neural network is used to determine if
there is a possibility of crash. When the neural network detects the
possibility of crash, it supplies an operational command to a safety ensuring
unit in a manner described below. As is well known in the art, the back
propagation neural network is a hierarchical design consisting of fully
interconnected layers of processing elements. More particularly, the network
architecture comprises at least an input layer and an output layer. The
network architecture may further comprise an additional layer, or N hidden
layers, between the input layer and the output layer, where N represents an
integer that is equal to or larger than zero. Each layer consists of one or
more processing elements that are connected by links with variable weights.
The net is trained by initially selecting small random weights and internal
thresholds and then presenting all training data repeatedly. Weights are
adjusted after every trial, using information specifying the correct result,
until the weights converge to acceptable values. The neural network is thus
trained to automatically generate and produce a desired output for an unknown
input.

Basic learning operation of the back propagation neural network is as follows.
First, input values are supplied to the neural network as the training data to
produce output values, each of which is compared with a correct or desired
output value (teacher data) to obtain information indicating the difference
between the actual and desired outputs. The neural network adjusts the weights
to reduce the difference between them. More particularly, the difference can
be represented by the well-known mean square error. During the training
operation, the network adjusts all weights to minimize a cost function equal
to the mean square error. Adjustment of the weights is achieved by back
propagation, transferring the error from the output layer to the input layer.
This process is continued until the network reaches a satisfactory level of
performance. The neural network trained in the above-mentioned manner can
produce output data based on the input data even for an unknown input pattern.

The generalized delta rule, derived with the steepest descent method, may be
used to optimize the learning procedure, which involves the presentation of a
set of pairs of input and output patterns. The system first uses the input
data to produce its own output data and then compares this with the desired
output. If there is no difference, no learning takes place; otherwise the
weights are changed to reduce the difference. As a result it becomes possible
to converge the network after a relatively short cycle of training.
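The training procedure described above can be made concrete with a minimal
sketch. A single sigmoid output element is used so that the gradient of the
mean square error stays short; the small random initial weights, the learning
rate, the epoch count and the toy training pairs are arbitrary choices made
for this illustration and are not taken from the patent.

    # Minimal back propagation sketch matching the procedure described above:
    # feed each training input forward, compare the output with the teacher
    # data, and adjust the weights down the gradient of the squared error
    # (generalized delta rule / steepest descent).
    import math
    import random
    from typing import List, Sequence, Tuple

    def sigmoid(x: float) -> float:
        return 1.0 / (1.0 + math.exp(-x))

    def train(pairs: Sequence[Tuple[Sequence[float], float]],
              n_inputs: int,
              learning_rate: float = 0.5,
              epochs: int = 2000) -> Tuple[List[float], float]:
        """Repeatedly present all training pairs and descend the error gradient."""
        weights = [random.uniform(-0.05, 0.05) for _ in range(n_inputs)]  # small random weights
        bias = 0.0                                                        # internal threshold
        for _ in range(epochs):
            for inputs, teacher in pairs:
                x = sum(w * xi for w, xi in zip(weights, inputs)) + bias
                y = sigmoid(x)                          # forward pass
                delta = (y - teacher) * y * (1.0 - y)   # dE/dX for E = 0.5 * (y - teacher) ** 2
                for j, xi in enumerate(inputs):         # propagate the error back to each weight
                    weights[j] -= learning_rate * delta * xi
                bias -= learning_rate * delta
        return weights, bias

    # Toy example: learn a crash / no-crash flag from two made-up image features.
    data = [([0.9, 0.8], 1.0), ([0.1, 0.2], 0.0)]
    print(train(data, n_inputs=2))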
To train the net weights, input data (training data) are successively supplied
to the processing elements in the input layer. Each processing element is
fully connected to the processing elements in the next layer, where a
predetermined calculation operation is carried out. In other words, the
training input is fed through to the output. At the output layer the error is
found using, for example, a sigmoid function and is propagated back to modify
the weight on a connection. The goal is to minimize the error, so the weights
are repeatedly adjusted and updated until the network reaches a satisfactory
level of performance. A graphical representation of sigmoid functions is shown
in FIG. 3.

In this embodiment a sigmoid function as shown in FIG. 3 is applied as the
transfer function for the network. The sigmoid function is a bounded,
differentiable, real function that is defined for all real input values and
that has a positive derivative everywhere. The central portion of the sigmoid
(whether it is near 0 or displaced) is assumed to be roughly linear. With the
sigmoid function it becomes possible to establish effective neural network
models.

As a sigmoid function parameter in each layer, a y-directional scale and
