`Nishio
`
`US005541590A
[11] Patent Number: 5,541,590
[45] Date of Patent: *Jul. 30, 1996
`
[54] VEHICLE CRASH PREDICTIVE AND
EVASIVE OPERATION SYSTEM BY NEURAL
NETWORKS
`
`[75]
`
`Inventor: Tomoyuki Nishio, Kawasaki, Japan
`
`[73] Assignee: Takata Corporation, Tokyo, Japan
`
[*] Notice: The term of this patent shall not extend
beyond the expiration date of Pat. No.
5,377,108.
`
FOREIGN PATENT DOCUMENTS

0358628   3/1990  European Pat. Off.
2554612   5/1985  France
3837054   6/1989  Germany
4001493   7/1991  Germany
04008639  1/1992  Japan
9002985   3/1990  WIPO
[21] Appl. No.: 375,249
[22] Filed: Jan. 19, 1995

Related U.S. Application Data
[63] Continuation of Ser. No. 97,178, Jul. 27, 1993, abandoned.

[30] Foreign Application Priority Data
Aug. 4, 1992 [JP] Japan .................................... 4-229201

[51] Int. Cl.6 ....................................................... G08G 1/16
[52] U.S. Cl. .......................... 340/903; 340/435; 348/148; 364/424.04; 395/23; 395/905
[58] Field of Search ..................................... 340/435, 995, 903, 905; 348/170, 113, 148, 149; 364/424.01, 424.04, 424.05; 395/905, 22, 11, 21, 23; 382/104, 157

[56] References Cited

U.S. PATENT DOCUMENTS

5,130,563   7/1992  Nabet et al. .......................... 364/807
5,161,014  11/1992  Pearson et al. ........................ 395/11
5,161,632  11/1992  Asayama ............................. 340/435
5,162,997  11/1992  Takahashi ......................... 364/424.1
5,189,619   2/1993  Adachi et al. ........................ 340/903
5,200,898   4/1993  Yuhara et al. .................... 364/431.04
5,214,744   5/1993  Schweizer et al. .................... 364/807
5,249,157   9/1993  Taylor .............................. 340/903
5,270,708  12/1993  Kamishima .......................... 340/905
5,282,134   1/1994  Gioutsos et al. ................. 364/424.01
5,285,523   2/1994  Takahashi ....................... 364/424.01
5,377,108  12/1994  Nishio .......................... 364/424.05
5,434,927   7/1995  Brady et al. ........................ 348/148

OTHER PUBLICATIONS

Rumelhart et al., "Parallel Distributed Processing", Vol. 1, pp. 161-162, 1986.

Primary Examiner-Brent A. Swarthout
Attorney, Agent, or Firm-Kanesaka & Takeuchi

[57] ABSTRACT
`
A system for predicting and evading crash of a vehicle
includes an image pick-up device mounted on the vehicle for
picking up images of actual ever-changing views while the
vehicle is running to produce actual image data; a crash
predicting device associated with said image pick-up device,
said crash predicting device being successively supplied
with the actual image data for predicting occurrence of crash
between the vehicle and potentially dangerous objects on the
roadway to produce an operational signal when there is a
possibility of crash; and a safety drive ensuring device
connected to said crash predicting device for actuating, in
response to the operational signal, an occupant protecting
mechanism which is operatively connected thereto and
equipped in the vehicle. The crash predicting device
includes a neural network which is previously trained with
training data to predict the possibility of crash, the training
data representing ever-changing views previously picked up
from said image pick-up device during driving of the
vehicle leading up to actual crash.
`
`4 Claims, 7 Drawing Sheets
`
[Front-page figure: block diagram showing a crash predicting circuit (60), output I/F (40), and a safety drive ensuring arrangement (50) with connections (51-53) to the steering wheel, throttle valve and brake.]

IPR2015-00262 - Ex. 1104
Toyota Motor Corp., Petitioner
`
`
`
`0
`\C
`01
`~ -...
`~
`01
`-...
`01
`
`....:1
`
`~
`
`g,
`('!) a
`i:T'
`00
`
`~
`\C
`\C
`~
`
`9
`tN
`~
`
`= ~
`~ ;-
`~
`•
`rJ.J.
`d •
`
`'--------------------------------------------------T-'
`I
`I
`I
`I
`I
`I
`:
`:
`i
`:
`r-------------------------------------------------------,
`TO STEERING WHEEL
`
`50
`/ (
`
`ACTUATOR
`
`BRAKE
`
`TO BRAKE
`
`TO THROTTLE VALVE
`
`I
`I
`I
`I
`
`I
`33
`
`321--.-t
`
`RAM
`
`----
`
`I
`I
`I
`I
`I
`1
`1 ...---
`r-----
`
`INPUT 1/F
`
`22
`
`53
`
`52
`
`51
`
`0
`
`' --3
`
`40
`
`OUTPUT 1/F
`
`PRIOR ART
`
`1
`
`F I G
`
`24
`
`23
`
`l
`
`I
`34 I
`
`: 31 ~ I cPU I
`
`L ____________ J
`,-
`
`SPEED I I STEERING GEAR
`
`RAT I 0 SENSOR
`
`SENSOR
`
`2
`
`
`
U.S. Patent        Jul. 30, 1996        Sheet 2 of 7        5,541,590

[FIG. 2 (PRIOR ART): a processing element receiving inputs xi, forming the weighted sum X = Σ xi·wi, and producing the output y = f(X).]

[FIG. 3 (PRIOR ART): graph of the sigmoid transfer function y = f(X) = 1/(1 + exp(-X)), bounded between 0 and 1.0.]
`
`
`
U.S. Patent        Jul. 30, 1996        Sheet 3 of 7        5,541,590

[FIG. 4: block diagram of the first embodiment, showing image pick-up device 21 on vehicle 10, input I/F 22, crash predicting circuit 60, output I/F 40, and safety drive ensuring arrangement 50 with steering, throttle and brake actuators (51-53) connected to the steering wheel, throttle valve and brake.]
`
`
`
U.S. Patent        Jul. 30, 1996        Sheet 4 of 7        5,541,590

[FIG. 5(a): three-layer neural network of the crash predicting circuit 60: input layer 61 with processing elements 61-1, 61-2, ..., 61-n fed FROM 22, and an output layer 63 connected TO 50.]

[FIG. 5(b): input layer formed as a two-dimensional array of processing elements 61-1 ... 61-n mapped onto the picked-up image.]
`
`
`
U.S. Patent        Jul. 30, 1996        Sheet 5 of 7        5,541,590

[FIG. 6(a): training image, a station wagon 80a coming across the center line.]

[FIG. 6(b): training image, an automobile 80b appearing from a blind corner.]
`
`
`
U.S. Patent        Jul. 30, 1996        Sheet 6 of 7        5,541,590

[FIG. 7: example image 80c obtained during driving a utility vehicle.]

[FIG. 8: another example image 80d obtained during driving a utility vehicle.]
`
`
`
U.S. Patent        Jul. 30, 1996        Sheet 7 of 7        5,541,590

[FIG. 9: neural network of the second embodiment: input layer 61 (elements 61-1, 61-2, ..., 61-n) fed FROM 22, an intermediate layer 65, and an output TO 50.]
`
`
`
`5,541,590
`
`1
`VEHICLE CRASH PREDICTIVE AND
`EVASIVE OPERATION SYSTEM BY NEURAL
`NETWORKS
`
`This application is a continuation of application Ser. No. 5
`08/097,178, filed Sep. 27, 1993, now abandoned.
`
BACKGROUND OF THE INVENTION

This invention generally relates to a system for predicting and evading crash of a vehicle.

In driving a car, a driver unconsciously senses various conditions through the objects in view and, as the case may be, he must take an action to evade any possible crash or collision. However, drivers will often be panicked in an emergency. Such a panicked driver may not properly handle the vehicle. Besides, a response delay to stimuli in varying degrees is inherent to human beings, so that it is physically impossible in some cases to evade crash or danger. In this respect, various techniques have been developed to evade collision by mounting on a vehicle a system for determining the possibility of crash in a mechanical or electrical manner before it happens. Accidents could be reduced if drivers had an automatic system or the like warning of potential collision situations.

An automobile collision avoidance radar is typically used as this automatic system. Such an automobile collision avoidance radar is disclosed in, for example, M. Kiyoto and A. Tachibana, Nissan Technical Review: Automobile Collision-Avoidance Radar, Vol. 18, Dec. 1982, which is incorporated by reference herein in its entirety. The radar disclosed comprises a small radar radiation element and antennas installed at the front end of a vehicle. A transmitter transmits microwaves through the radiation element towards the headway. The microwaves backscatter from a leading vehicle or any other objects as echo returns. The echo returns are received by a receiver through the antennas and supplied to a signal processor. The signal processor carries out a signal processing operation to calculate a relative velocity and a relative distance between the object and the vehicle. The relative velocity and the relative distance are compared with predetermined values, respectively, to determine if the vehicle is going to collide with the object. A high possibility of collision results in activation of a proper safety system or systems.

However, the above mentioned radar system has a disadvantage of faulty operation or malfunctions, especially when the vehicle implementing this system passes by a sharp curve in a road. The radar essentially detects objects in front of the vehicle on which it is mounted. The system thus tends to incorrectly identify objects alongside the road such as a roadside, guard rails or even an automobile correctly running on the adjacent lane.

An intelligent vehicle has also been proposed that comprises an image processing system for cruise and traction controls. The views ahead of the vehicle are successively picked up as image patterns. These image patterns are subjected to pattern matching with predetermined reference patterns. The reference patterns are classified into some categories associated with possible driving conditions. For example, three categories are defined for straight running, right turn and left turn. When a matching result indicates the presence of potentially dangerous objects in the picked up image, a steering wheel and a brake system are automatically operated through a particular mechanism to avoid or evade crash to that object.

The image processing system of the type described is useful in normal driving conditions where the pattern matching can be effectively made between the image patterns successively picked up and the reference patterns for safety driving control. However, image patterns representing various conditions on the roadway should be stored previously in the intelligent vehicle as the reference patterns. Vehicle orientation at initiation of crash varies greatly, so that huge numbers of reference patterns are required for positive operation. This means that only a time-consuming calculation will result in a correct matching of the patterns, which is not suitable for evading an unexpected crash.

It is, of course, possible to increase the operational speed of the pattern matching by using a large image processor. However, such a processor is generally complex in structure and relatively expensive, so that it is difficult to apply the same as on-vehicle equipment. In addition, on-vehicle image processors, if achieved, will perform their function sufficiently only in limited applications such as a supplemental navigation system during normal cruising.

SUMMARY OF THE INVENTION
`
An object of the present invention is to provide a system for predicting and evading crash of a vehicle using neural networks.

Another object of the present invention is to provide a system capable of training neural networks by means of collected image data representing scenes along the moving direction of a vehicle until the vehicle collides with something.

It is yet another object of the present invention to provide a system for predicting crash through a matching operation between data obtained on driving a vehicle and data learned by neural networks. It is still another object of the present invention to provide a system for evading crash of a vehicle using neural networks to actuate a vehicle safety system for protecting an occupant.

In order to achieve the above mentioned objects, the present invention is provided with a system for predicting and evading crash of a vehicle comprising: an image pick-up device mounted on the vehicle for picking up images of ever-changing views while the vehicle is running to produce image data; a crash predicting circuit associated with the image pick-up device, the crash predicting circuit being successively supplied with the image data for predicting occurrence of crash between the vehicle and potentially dangerous objects on the roadway to produce an operational signal when there is a possibility of crash; and a safety driving ensuring device connected to the crash predicting circuit for actuating, in response to the operational signal, an occupant protecting mechanism which is operatively connected thereto and equipped in the vehicle; wherein the crash predicting circuit comprises a neural network which is previously trained with training data to predict the possibility of crash, the training data representing ever-changing views previously picked up from the image pick-up device during driving of the vehicle and just after actual crash.

The neural network comprises at least an input layer and an output layer, and the training data are supplied to the input layer while the output layer is supplied with, as teacher data, flags representing expected and unexpected crash, respectively, of the vehicle. In addition, the neural network may comprise a two-dimensional self-organizing competitive learning layer as an intermediate layer.
`
`9
`
`
`
`5,541,590
`
`3
`Other advantages and features of the present invention
`will be described in detail in the following preferred
`embodiments thereof.
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
`FIG. 1 is a block diagram of a conventional system for
`predicting and evading crash of a vehicle;
FIG. 2 is a schematic view showing a processing element
in a typical neural network;
`FIG. 3 is a graphical representation of a sigmoid function
`used as a transfer function for training neural networks;
`FIG. 4 is a block diagram of a system for predicting and
`evading crash of a vehicle using neural networks according
`to the first embodiment of the present invention;
`FIG. 5(a) is a schematic structural diagram of a crash
`predicting circuit in FIG. 4 realized by a neural network of
`three layers;
`FIG. 5(b) shows an example of an input layer consisting
`of a two-dimensional array of processing elements of the
`neural network shown in FIG. 5(a);
`FIGS. 6(a) and 6(b) are exemplified views picked up, as
`the training image data supplied to the neural network, at
`different time instances during driving an experimental
`vehicle;
`FIG. 7 is a view showing an example of an image data
`obtained during driving a utility vehicle;
`FIG. 8 is a view showing another example of an image
`data obtained during driving a utility vehicle; and
`FIG. 9 is a block diagram of a system for predicting and
`evading crash using neural networks according to the second
`embodiment of the present invention.
`
`DETAILED DESCRIPTION OF THE
`PREFERRED EMBODIMENTS
`
`30
`
`A conventional system for predicting and evading crash of
`a vehicle is described first to facilitate an understanding of
`the present invention. Throughout the following detailed 40
`description, similar reference numerals refer to similar ele(cid:173)
`ments in all figures of the drawing.
`In the following description, the term "crash" is used in a
`wider sense that relates to all unexpected traffic accidents.
`Accidents other than crash include a turnover or fall of a
`vehicle, with which the phenomenon of "crash" is associated
`in some degrees, so that the term crash is used as a cause of
`traffic accidents.
As shown in FIG. 1, an image pick-up device 21 is mounted at a front portion of an automobile 10 to pick up ever-changing images as analog image data. This image pick-up device 21 is any one of suitable devices such as a charge-coupled-device (CCD) camera. The image data are subject to sampling for a sampling range ΔT at a predetermined sampling interval Δt. The image data are collected up to crash. In this event, the image pick-up range of the image pick-up device 21 corresponds to a field of view observed through naked eyes.

The image pick-up device 21 is connected to an input interface 22. The analog image data obtained by the image pick-up device 21 are supplied to the input interface 22. The input interface 22 serves as an analog-to-digital converter for converting the analog image data into digital image data. More particularly, the picked up images are digitized by dividing the same into tiny pixels (data elements) isolated by grids. It is preferable to eliminate noises and distortions at this stage. The input interface 22 is also
connected to a speed sensor 23, a steering gear ratio sensor 24 and a signal processor 30. The speed sensor 23 supplies velocity data to the signal processor 30 through the input interface 22. The velocity data represent an actual velocity of the automobile 10 at the time instant when the image pick-up device 21 picks up an image of a view. Likewise, the steering gear ratio sensor 24 supplies steering gear ratio data to the signal processor 30 through the input interface 22. The steering gear ratio data represent an actual steering gear ratio of the automobile 10.

The signal processor 30 comprises a central processing unit (CPU) 31, a read-only memory (ROM) 32 and a random-access memory (RAM) 33. CPU 31, ROM 32 and RAM 33 are operatively connected to each other through a data bus 34. To evade potentially dangerous objects, CPU 31 carries out calculation operations in response to the image, velocity and steering gear ratio data given through the input interface 22. CPU 31 performs proper functions according to programs stored in ROM 32 and RAM 33. The outputs of the signal processor 30 are transmitted through an output interface 40. ROM 32 stores a table relating to numerical values required for the calculation. It also stores a table representing operational amounts for a safety drive ensuring arrangement 50. On the other hand, RAM 33 stores programs for use in calculating an optimum operational amount for the safety drive ensuring arrangement 50. A program for this purpose is disclosed in, for example, Teruo Yatabe, Automation Technique: Intelligent Vehicle, pages 22-28.

The signal processor 30 first determines, according to the picked up image data, whether there is a space available on the roadway to pass through. When there is enough space to pass through and a potentially dangerous object is present on the roadway, the signal processor 30 calculates an optimum operational amount for the safety drive ensuring arrangement 50 to operate the same. In FIG. 1, the safety drive ensuring arrangement 50 consists of a steering actuator 51, a throttle actuator 52 and a brake actuator 53. If the signal processor 30 determines that it is necessary to operate these actuators, it produces a steering gear ratio command, a set velocity command, and a brake operation command. The steering actuator 51, the throttle actuator 52 and the brake actuator 53 are operated depending on the condition in response to the steering gear ratio command, the set velocity command and the brake operation command, respectively. The actuators are for use in actuating an occupant protecting mechanism such as a brake device. Operation of these actuators is described now.

The steering actuator 51 is a hydraulic actuator for use in rotating the steering wheel (not shown) in an emergency. In this event, the steering wheel is automatically rotated according to the steering gear ratio and rotational direction indicated by the steering gear ratio command. The operational amount of the steering or hydraulic actuator can be controlled in a well-known manner through a servo valve and a hydraulic pump, both of which are not shown in the figure.

The throttle actuator 52 acts to adjust the opening amount of a throttle valve (not shown) to decrease speed while evading objects or so on.

The brake actuator 53 performs a function to gradually decrease the speed of a vehicle in response to the brake operational command. The brake actuator 53 is also capable of achieving sudden brake operation, if necessary.

As mentioned above, CPU 31 carries out its operation with the tables and programs stored in ROM 32 and RAM 33, respectively, calculating for all picked up image data.
`
`10
`
`
`
`5,541,590
`
`30
`
`5
The conventional system is thus disadvantageous in that the calculation operation requires a relatively long time interval, as mentioned in the preamble of the instant specification.

On the contrary, a system according to the present invention uses image data representing ever-changing views picked up from a vehicle until it suffers from an accident. These image data are used for training a neural network implemented in the present system. After completion of the training, the neural network is implemented in a utility vehicle and serves as a decision making circuit for starting safety driving arrangements to evade crash which otherwise will certainly happen. The neural network predicts crash and evades the same by properly starting an automatic steering system or a brake system.
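The decision-making role described above can be sketched in miniature: a trained network maps one sampled image to a crash-possibility score, and a threshold decides whether to produce the operational signal. This is our own illustration, not code from the patent; the function names, weights and threshold are all hypothetical.

```python
import math

def crash_score(image, weights):
    """Stand-in for the trained network's forward pass:
    weighted sum of the image data elements through a sigmoid."""
    total = sum(x * w for x, w in zip(image, weights))
    return 1.0 / (1.0 + math.exp(-total))  # score in (0, 1)

def operational_signal(image, weights, threshold=0.5):
    """True when the safety driving arrangements should be started."""
    return crash_score(image, weights) > threshold

# Usage: a tiny 4-element "image" and illustrative trained weights.
frame = [0.9, 0.8, 0.7, 0.9]
w = [1.0, 1.0, 1.0, 1.0]
if operational_signal(frame, w):
    print("actuate steering / throttle / brake")
```

In the actual system the signal would drive the actuators 51-53 rather than print a message.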
A well-known neural network is described first to facilitate an understanding of the present invention, following which preferred embodiments of the present invention will be described with reference to the drawing.

A neural network is a technological discipline concerned with information processing systems, which is still in a development stage. Such an artificial neural network structure is based on our present understanding of biological nervous systems. The artificial neural network is a parallel, distributed information processing structure consisting of processing elements interconnected by unidirectional signal channels called connections. Each processing element has a single output connection that branches into as many collateral connections as desired.

A basic function of the processing elements is described below.

As shown in FIG. 2, each processing element can receive any number of incoming functions while it has a single output connection that can fan out to form multiple output connections. Thus the artificial neural network is by far simpler than the networks in a human brain. Each of the input data x1, x2, ..., xi is multiplied by its corresponding weight coefficient w1, w2, ..., wi, respectively, and the processing element sums the weighted inputs and passes the result through a nonlinearity. Each processing element is characterized by an internal threshold or offset and by the type of nonlinearity, and processes a predetermined transfer function to produce an output f(X) corresponding to the sum (X = Σ xi·wi). In FIG. 2, xi represents an output of an i-th processing element in an (s-1)-th layer and wi represents a connection strength or the weight from the (s-1)-th layer to the s-th layer. The output f(X) represents the energy condition of each processing element. Though neural networks come in a variety of forms, they can be generally classified into feedforward and recurrent classes. In the latter, the output of each processing element is fed back to other processing elements via weights. As described above, the network has an energy or an energy function that will finally be at a minimum. In other words, the network is considered to have converged and stabilized when outputs no longer change on successive iterations. The means to stabilize the network depends on the algorithm used.
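The weighted-sum-and-nonlinearity behavior of a single processing element can be sketched as follows; this is illustrative only (the variable names are ours), using the sigmoid of FIG. 3 as the nonlinearity:

```python
import math

def processing_element(inputs, weights, offset=0.0):
    """One processing element: form X = sum(xi * wi), add the
    internal offset, and pass X through the nonlinearity f."""
    X = sum(x * w for x, w in zip(inputs, weights)) + offset
    return 1.0 / (1.0 + math.exp(-X))  # f(X) = 1 / (1 + exp(-X))

# A single output that "fans out": the same value y would feed
# every connected element of the next layer.
y = processing_element([0.5, -1.0, 2.0], [0.8, 0.2, 0.1])
```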
The back propagation neural network is one of the most important and common neural network architectures, and it is applied to the present invention. In this embodiment, the neural network is used to determine if there is a possibility of crash. When the neural network detects the possibility of crash, it supplies an operational command to a safety ensuring unit in a manner described below. As is well known in the art, the back propagation neural network is a hierarchical design consisting of fully interconnected layers of processing elements. More particularly, the network architecture comprises at least an input layer and an output layer. The network architecture may further comprise N additional hidden layers between the input layer and the output layer, where N represents an integer that is equal to or larger than zero. Each layer consists of one or more processing elements that are connected by links with variable weights. The net is trained by initially selecting small random weights and internal thresholds and then presenting all training data repeatedly. Weights are adjusted after every trial using information specifying the correct result until the weights converge to acceptable values. The neural network is thus trained to automatically generate and produce a desired output for an unknown input.

The basic learning operation of the back propagation neural network is as follows. First, input values are supplied to the neural network as the training data to produce output values, each of which is compared with a correct or desired output value (teacher data) to obtain information indicating the difference between the actual and desired outputs. The neural network adjusts the weights to reduce the difference between them. More particularly, the difference can be represented by a well-known mean square error. During the training operation, the network adjusts all weights to minimize a cost function equal to the mean square error. Adjustment of the weights is achieved by means of back propagation, transferring the error from the output layer to the input layer. This process is continued until the network reaches a satisfactory level of performance. The neural network trained in the above mentioned manner can produce output data based on the input data even for an unknown input pattern.
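The training cycle described above (forward pass, mean square error, backward weight adjustment) can be sketched in miniature. This toy implementation with one hidden layer is our own illustration, not code from the patent; the OR function stands in for the teacher-data flags.

```python
import math, random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(1)
# 2 inputs + 1 bias term -> 2 hidden elements -> 1 output element.
w_ih = [[random.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(2)]
w_ho = [random.uniform(-0.5, 0.5) for _ in range(3)]
lr = 0.5

# Training pairs: input pattern and teacher datum (a flag).
samples = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0),
           ([1.0, 0.0], 1.0), ([1.0, 1.0], 1.0)]

def forward(x):
    xb = x + [1.0]                       # append bias input
    h = [sigmoid(sum(a * w for a, w in zip(xb, w_ih[j]))) for j in range(2)]
    hb = h + [1.0]                       # append bias for the output layer
    o = sigmoid(sum(a * w for a, w in zip(hb, w_ho)))
    return xb, hb, o

for _ in range(3000):                    # present all training data repeatedly
    for x, t in samples:
        xb, hb, o = forward(x)
        d_o = (o - t) * o * (1 - o)      # output error term (MSE gradient)
        for j in range(2):               # propagate the error back to input-side weights
            d_h = d_o * w_ho[j] * hb[j] * (1 - hb[j])
            for i in range(3):
                w_ih[j][i] -= lr * d_h * xb[i]
        for j in range(3):
            w_ho[j] -= lr * d_o * hb[j]
```

After training, `forward` produces an output near the teacher flag even though the weights started as small random values, which is the convergence behavior the text describes.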
The generalized delta rule derived with the steepest descent may be used to optimize the learning procedure that involves the presentation of a set of pairs of input and output patterns. The system first uses the input data to produce its own output data and then compares this with the desired output. If there is no difference, no learning takes place; otherwise the weights are changed to reduce the difference. As a result it becomes possible to converge the network after a relatively short cycle of training.

To train the net weights, input data (training data) are successively supplied to the processing elements in the input layer. Each processing element is fully connected to the processing elements in the next layer, where a predetermined calculation operation is carried out. In other words, the training input is fed through to the output. At the output layer the error is found using, for example, a sigmoid function, and is propagated back to modify the weight on a connection. The goal is to minimize the error, so the weights are repeatedly adjusted and updated until the network reaches a satisfactory level of performance. A graphical representation of sigmoid functions is shown in FIG. 3.
In this embodiment a sigmoid function as shown in FIG. 3 is applied as the transfer function for the network. The sigmoid function is a bounded differentiable real function that is defined for all real input values and that has a positive derivative everywhere. The central portion of the sigmoid (whether it is near 0 or displaced) is assumed to be roughly linear. With the sigmoid function it becomes possible to establish effective neural network models.

As sigmoid function parameters in each layer, a y-directional scale and a y-coordinate offset are defined. The y-directional scale is defined for each layer to exhibit exponential variation. This results in improved convergence efficiency of the network.
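One possible reading of these per-layer parameters is a sigmoid scaled and shifted along the y-axis. The patent gives no formula for this, so the following is only our interpretation, with an assumed power-of-two scale progression standing in for "exponential variation":

```python
import math

def scaled_sigmoid(x, y_scale=1.0, y_offset=0.0):
    """Sigmoid with a y-directional scale and a y-coordinate offset,
    as per-layer transfer-function parameters (interpretation only)."""
    return y_scale / (1.0 + math.exp(-x)) + y_offset

# Assumed illustration of exponential variation across three layers:
# layer k uses a y-scale of 2**k.
layer_scales = [2.0 ** k for k in range(3)]
```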
`It is readily understood that other functions may be used
as the transfer function. For example, in a sinusoidal function a differential coefficient for the input sum in each processing element is within a range equal to that for the original function. To use the sinusoidal function results in extremely high convergence of training, though the hardware for implementing the network may be rather complex in structure.
An embodiment of the present invention is described with reference to FIGS. 4 through 9.

FIG. 4 is a block diagram of a system for predicting and evading crash of a vehicle using neural networks according to the first embodiment of the present invention. The system in FIG. 4 is similar in structure and operation to that illustrated in FIG. 1 other than a crash predicting circuit 60. Description of the similar components will thus be omitted to avoid redundancy. FIG. 5 is a schematic structural diagram of the crash predicting circuit 60 illustrated in FIG. 4, formed by a neural network of three layers.

The crash predicting circuit 60 in this embodiment is implemented by a neural network architecture of a hierarchical design with three layers as shown in FIG. 5(a). The input layer 61 consists of n processing elements 61-1 through 61-n arranged in parallel in a one-dimensional linear form. Each processing element in the input layer 61 is fully connected to the processing elements in a hidden layer 62 of the network. The hidden layer 62 is connected to an output layer 63 of a single processing element to produce an operational command described below. FIG. 5(b) shows an input layer consisting of a two-dimensional array of processing elements. In this event, the image data are supplied to the input layer as a two-dimensional data matrix of n divisions. Basically, the input and the hidden layers can have any geometrical form desired. With the two-dimensional array, the processing elements of each layer may share the same transfer function and be updated together. At any rate, it should be considered that each processing element is fully interconnected to the processing elements in the next layer, though only a part of these connections are shown in FIG. 5(a) to avoid complexity.
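The correspondence between the picked-up image and the input layer of FIG. 5(b) can be sketched as follows. The 4x4 grid is hypothetical: the patent does not fix n or the division scheme.

```python
# Hypothetical mapping of a digitized image onto the input layer:
# the image is divided by grids into n data elements, and each
# element feeds one processing element of the input layer 61.

def image_to_input_layer(pixels, rows, cols):
    """Flatten a rows x cols pixel grid into the n = rows * cols
    values supplied, in parallel, to elements 61-1 .. 61-n."""
    assert len(pixels) == rows and all(len(r) == cols for r in pixels)
    return [p for row in pixels for p in row]

image = [[0.1, 0.2, 0.3, 0.4],
         [0.5, 0.6, 0.7, 0.8],
         [0.9, 1.0, 0.9, 0.8],
         [0.7, 0.6, 0.5, 0.4]]
inputs = image_to_input_layer(image, 4, 4)   # n = 16 data elements
```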
Referring now to FIG. 6 in addition to FIG. 5, illustrated are views picked up as the image data for use in training the neural network. The image pick-up device 21 picks up ever-changing images as analog image data, as described above in conjunction with the conventional system. This image pick-up device 21 is also any one of suitable devices such as a CCD camera. The image pick-up operation is carried out while the vehicle is running at higher than a predetermined speed. The image data are subject to sampling for a sampling range ΔT at a predetermined sampling interval Δt. The image data are collected before and just after pseudo crash. The image pick-up range of the image pick-up device 21 corresponds to a field of view observed through naked eyes. A view shown in FIG. 6(a) is picked up when a station wagon (estate car) 80a on the opposite lane comes across the center line. A view shown in FIG. 6(b) is picked up when an automobile 80b suddenly appears from a blind corner of a cross-street. These ever-changing images are collected as the training data for the neural network.

The image data effectively used for the crash evasive purpose are those which allow continuous recognition of the ever-changing views before and just after pseudo crash. In this respect, the image pick-up device 21 picks up the images of a vehicle or other obstructions from a relatively short distance. In addition, the picked up images preferably are distinct reflections of the outside views.
The data elements constituting one image are simultaneously supplied to the input layer 61 in parallel. In other words, each data element is supplied to the respective processing element of the input layer 61. The digital image data may be normalized before being supplied to the input layer 61 to increase the data processing speed. However, each processing element of the input layer 61 essentially receives the data element obtained by dividing the image data previously. The data elements are subjected to feature extraction when supplied to the hidden layer 62.
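The optional normalization step mentioned above might look like this. The patent does not specify the normalization formula, so min-max scaling into [0, 1] is assumed here purely for illustration:

```python
def normalize(data_elements):
    """Min-max scale pixel data elements into [0, 1] before they
    are supplied to the input layer (assumed scheme)."""
    lo, hi = min(data_elements), max(data_elements)
    if hi == lo:                      # flat image: avoid division by zero
        return [0.0 for _ in data_elements]
    return [(x - lo) / (hi - lo) for x in data_elements]

scaled = normalize([12, 64, 255, 0])  # brightest pixel -> 1.0, darkest -> 0.0
```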
In typical image processing, feature extraction is carried out according to any one of various methods of pattern recognition to clearly identify shapes, forms or configura-