Europäisches Patentamt
European Patent Office
Office européen des brevets

Publication number: 0 582 236 A1

EUROPEAN PATENT APPLICATION

Application number: 93112302.0
Date of filing: 30.07.93
Int. Cl.5: B60R 1/00, B60R 21/00, B60K 28/00, G05D 1/02

Priority: 04.08.92 JP 229201/92
Date of publication of application: 09.02.94 Bulletin 94/06
Designated Contracting States: DE GB

Applicant: TAKATA CORPORATION, 4-30, Roppongi 1-chome, Minato-ku, Tokyo 106 (JP)

Inventor: Nishio, Tomoyuki, 663-28, Ozenji, Asou-ku, Kawasaki-shi, Kanagawa-ken (JP)

Representative: Heim, Hans-Karl, Dipl.-Ing. et al, Weber & Heim Patentanwälte, Hofbrunnstrasse 36, D-81479 München (DE)

Vehicle crash predictive and evasive operation system by neural networks.

A system for predicting and evading a crash of a vehicle (10) comprising image pick-up means (21) mounted on the vehicle for picking up images of actual ever-changing views while the vehicle is running to produce actual image data; crash predicting means (60) associated with said image pick-up means (21), said crash predicting means (60) being successively supplied with the actual image data for predicting the occurrence of a crash between the vehicle and potentially dangerous objects on the roadway to produce an operational signal when there is a possibility of crash; and safety drive ensuring means (50) connected to said crash predicting means for actuating, in response to the operational signal, an occupant protecting mechanism (51, 52, 53) which is operatively connected thereto and equipped in the vehicle; wherein said crash predicting means (60) comprises a neural network which is previously trained with training data to predict the possibility of crash, the training data representing ever-changing views previously picked up by said image pick-up means (21) during driving of the vehicle and just after an actual crash.

[Front-page drawing: Fig. 4, a block diagram of the system showing the input interface (22), the crash predicting circuit (60) and the output interface (40) mounted on the vehicle (10).]

IPR2015-00261 - Ex. 1104
Toyota Motor Corp., Petitioner

Background of the Invention

This invention generally relates to a system for predicting and evading a crash of a vehicle which would otherwise certainly happen.

A driver has an unconscious and immediate sense of various conditions through the objects in view and, as the case may be, he must take action to evade any possible crash or collision. However, drivers will often panic in an emergency beyond their senses. Such a panicked driver may sometimes be the last one who can cope with the emergency to ensure the active safety of the vehicle. Besides, a response delay to stimuli in varying degrees is inherent to human beings, so that it is impossible in some cases to evade a crash or danger by physical considerations. In this respect, various techniques have been developed to evade collision by mounting on a vehicle a system for determining the possibility of a crash in a mechanical or electrical manner before it happens. Accidents could be reduced if drivers had an automatic system or the like warning of potential collision situations.

An automobile collision avoidance radar is typically used as this automatic system. Such an automobile collision avoidance radar is disclosed in, for example, M. Kiyoto and A. Tachibana, Nissan Technical Review: Automobile Collision-Avoidance Radar, Vol. 18, Dec. 1982, which is incorporated by reference herein in its entirety. The radar disclosed comprises a small radar radiation element and antennas installed at the front end of a vehicle. A transmitter transmits microwaves through the radiation element towards the headway. The microwaves backscatter from a leading vehicle or any other objects as echo returns. The echo returns are received by a receiver through the antennas and supplied to a signal processor. The signal processor carries out a signal processing operation to calculate a relative velocity and a relative distance between the object and the vehicle. The relative velocity and the relative distance are compared with predetermined values, respectively, to determine if the vehicle is going to collide with the object. A high possibility of collision results in activation of a proper safety system or systems.

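As an illustration of such a threshold comparison (the reference above does not give its actual decision logic, so the function name and threshold values below are assumptions), the decision could be sketched as follows:

    # Illustrative sketch only, not the disclosed radar algorithm: compare the
    # measured relative distance and relative (closing) velocity against
    # predetermined values to decide whether a collision is likely.
    def collision_possible(relative_distance_m: float,
                           closing_velocity_mps: float,
                           distance_threshold_m: float = 30.0,
                           velocity_threshold_mps: float = 10.0) -> bool:
        """True when the headway is short and the object is closing fast."""
        return (relative_distance_m < distance_threshold_m
                and closing_velocity_mps > velocity_threshold_mps)

    # Example: 20 m headway, closing at 15 m/s -> a safety system is activated.
    assert collision_possible(20.0, 15.0)
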
However, the above-mentioned radar system has the disadvantage of faulty operation or malfunctions, especially when the vehicle implementing this system passes along a sharp curve in a road. The radar essentially detects objects in front of the vehicle on which it is mounted. The system thus tends to incorrectly identify objects alongside the road, such as the roadside, guard rails or even an automobile correctly running on the adjacent lane.

An intelligent vehicle has also been proposed that comprises an image processing system for cruise and traction controls. Ever-changing views spreading ahead of the vehicle are successively picked up as image patterns. These image patterns are subjected to pattern matching with predetermined reference patterns. The reference patterns are classified into some categories associated with possible driving conditions. For example, three categories are defined for straight running, right turn and left turn. When a matching result indicates the presence of potentially dangerous objects in the picked-up image, a steering wheel and a brake system are automatically operated through a particular mechanism to avoid or evade a crash into that object.

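Purely as an illustrative sketch of this kind of reference-pattern matching (the categories, the distance measure and every name below are assumptions, not details of the proposal described above):

    import numpy as np

    # Sketch: match a picked-up image pattern against stored reference patterns
    # grouped by driving-condition category, using a simple L2 distance.
    def classify_view(image_pattern, reference_patterns):
        best_category, best_distance = None, float("inf")
        for category, patterns in reference_patterns.items():
            for ref in patterns:
                distance = float(np.linalg.norm(image_pattern - ref))
                if distance < best_distance:
                    best_category, best_distance = category, distance
        return best_category

    # Example with tiny 4x4 "image patterns" and three categories.
    rng = np.random.default_rng(0)
    refs = {name: [rng.random((4, 4))] for name in
            ("straight_running", "right_turn", "left_turn")}
    print(classify_view(rng.random((4, 4)), refs))

The nested loop over every stored reference pattern is what makes this approach slow as the number of patterns grows, which is the drawback discussed next.
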
The image processing system of the type described is useful in normal driving conditions, where the pattern matching can be effectively made between the image patterns successively picked up and the reference patterns for safety driving control. However, image patterns representing various conditions on the roadway must be stored previously in the intelligent vehicle as the reference patterns. Vehicle orientation at the initiation of a crash varies greatly, so that huge numbers of reference patterns are required for positive operation. This means that only a time-consuming calculation will result in a correct matching of the patterns, which is not suitable for evading an unexpected crash.

It is, of course, possible to increase the operational speed of the pattern matching by using a large dedicated image processor. However, such a dedicated processor is generally complex in structure and relatively expensive, so that it is difficult to apply it as on-vehicle equipment. In addition, on-vehicle image processors, if achieved, will perform their function sufficiently only in limited applications such as a supplemental navigation system during normal cruising.

Summary of the Invention

An object of the present invention is to provide a system for predicting and evading a crash of a vehicle using neural networks.

Another object of the present invention is to provide a system capable of training neural networks by collecting image data representing ever-changing vistas along the travel direction of a vehicle until the vehicle collides with something.

It is yet another object of the present invention to provide a system for predicting a crash through a matching operation between data obtained while driving a vehicle and data learned by neural networks.

It is still another object of the present invention to provide a system for evading a crash of a vehicle using neural networks to actuate a vehicle safety system for protecting an occupant.

In order to achieve the above-mentioned objects, the present invention provides a system for predicting and evading a crash of a vehicle comprising: an image pick-up device mounted on the vehicle for picking up images of ever-changing views while the vehicle is running to produce image data; a crash predicting circuit associated with the image pick-up device, the crash predicting circuit being successively supplied with the image data for predicting the occurrence of a crash between the vehicle and potentially dangerous objects on the roadway to produce an operational signal when there is a possibility of crash; and a safety driving ensuring device connected to the crash predicting circuit for actuating, in response to the operational signal, an occupant protecting mechanism which is operatively connected thereto and equipped in the vehicle; wherein the crash predicting circuit comprises a neural network which is previously trained with training data to predict the possibility of crash, the training data representing ever-changing views previously picked up by the image pick-up device during driving of the vehicle and just after an actual crash.

The neural network comprises at least an input layer and an output layer, and the training data are supplied to the input layer while the output layer is supplied with, as teacher data, flags representing expected and unexpected crash, respectively, of the vehicle. In addition, the neural network may comprise a two-dimensional self-organizing competitive learning layer as an intermediate layer.

Other advantages and features of the present invention will be described in detail in the following preferred embodiments thereof.

Brief Description of the Drawings

Fig. 1 is a block diagram of a conventional system for predicting and evading a crash of a vehicle;
Fig. 2 is a schematic view showing a processing element in a typical neural network;
Fig. 3 is a graphical representation of a sigmoid function used as a transfer function for training neural networks;
Fig. 4 is a block diagram of a system for predicting and evading a crash of a vehicle using neural networks according to the first embodiment of the present invention;
Fig. 5(a) is a schematic structural diagram of a crash predicting circuit in Fig. 4 realized by a neural network of three layers;
Fig. 5(b) shows an example of an input layer consisting of a two-dimensional array of processing elements of the neural network shown in Fig. 5(a);
Figs. 6(a) and 6(b) are exemplified views picked up, as the training image data supplied to the neural network, at different time instances while driving an experimental vehicle;
Fig. 7 is a view showing an example of image data obtained while driving a utility vehicle;
Fig. 8 is a view showing another example of image data obtained while driving a utility vehicle; and
Fig. 9 is a block diagram of a system for predicting and evading a crash using neural networks according to the second embodiment of the present invention.

Detailed Description of the Preferred Embodiments

A conventional system for predicting and evading a crash of a vehicle is described first to facilitate an understanding of the present invention. Throughout the following detailed description, similar reference numerals refer to similar elements in all figures of the drawing.

In the following description, the term "crash" is used in a wide sense that relates to all unexpected traffic accidents. Accidents other than a crash include a turnover or fall of a vehicle, with which the phenomenon of "crash" is associated to some degree; hence the use of the term "crash" as a cause of traffic accidents.

As shown in Fig. 1, an image pick-up device 21 is mounted at a front portion of an automobile 10 to pick up ever-changing images as analog image data. This image pick-up device 21 is any one of suitable devices such as a charge-coupled-device (CCD) camera. The image data are subject to sampling for a sampling range ΔT during a predetermined sampling period Δt. The image data are collected up to the crash. In this event, the image pick-up range of the image pick-up device 21 corresponds to a field of view observed through naked eyes. The image pick-up device 21 is connected to an input interface 22. The analog image data obtained by the image pick-up device 21 are supplied to the input interface 22. The input interface 22 serves as an analog-to-digital converter for converting the analog image data into digital image data. More particularly, the picked-up images are digitized by dividing them into tiny pixels (data elements) isolated by grids. It is preferable to eliminate noise and distortions at this stage. The input interface 22 is also connected to a speed sensor 23, a steering gear ratio sensor 24 and a signal processor 30. The speed sensor 23 supplies velocity data to the signal processor 30 through the input interface 22. The velocity data represent an actual velocity of the automobile 10 at the time instant when the image pick-up device 21 picks up an image of a view. Likewise, the steering gear ratio sensor 24 supplies steering gear ratio data to the signal processor 30 through the input interface 22. The steering gear ratio data represent an actual steering gear ratio of the automobile 10.

The signal processor 30 comprises a central processing unit (CPU) 31, a read-only memory (ROM) 32 and a random-access memory (RAM) 33. CPU 31, ROM 32 and RAM 33 are operatively connected to each other through a data bus 34. To evade potentially dangerous objects, CPU 31 carries out a calculation operation in response to the image, velocity and steering gear ratio data given through the input interface 22. CPU 31 performs proper functions according to programs stored in ROM 32 and RAM 33. The output of the signal processor 30 is transmitted through an output interface 40. ROM 32 stores a table relating to numerical values required for the calculation. It also stores a table representing operational amounts for a safety drive ensuring arrangement 50. On the other hand, RAM 33 stores programs for use in calculating an optimum operational amount for the safety drive ensuring arrangement 50. A program for this purpose is disclosed in, for example, Teruo Yatabe, Automation Technique: Intelligent Vehicle, pages 22-28.

The signal processor 30 first determines, according to the picked-up image data, whether there is a space available on the roadway to pass through. When there is enough space to pass through and a potentially dangerous object is present on the roadway, the signal processor 30 calculates an optimum operational amount for the safety drive ensuring arrangement 50 to operate the same. In Fig. 1, the safety drive ensuring arrangement 50 consists of a steering actuator 51, a throttle actuator 52 and a brake actuator 53. If the signal processor 30 determines that it is necessary to operate these actuators, it produces a steering gear ratio command, a set velocity command and a brake operation command. The steering actuator 51, the throttle actuator 52 and the brake actuator 53 are operated, depending on the condition, in response to the steering gear ratio command, the set velocity command and the brake operation command, respectively.

The actuators are for use in actuating an occupant protecting mechanism such as a brake device. Operation of these actuators is described now.

The steering actuator 51 is a hydraulic actuator for use in rotating a steering wheel (not shown) in an emergency. In this event, the steering wheel is automatically rotated according to the steering gear ratio and rotational direction indicated by the steering gear ratio command. The operational amount of the steering or hydraulic actuator can be controlled in a well-known manner through a servo valve and a hydraulic pump, both of which are not shown in the figure.

The throttle actuator 52 acts to adjust the opening amount of a throttle valve (not shown) to decrease speed while evading objects or the like.

The brake actuator 53 performs the function of gradually decreasing the speed of the vehicle in response to the brake operation command. The brake actuator 53 is also capable of achieving a sudden brake operation, if necessary.

As mentioned above, CPU 31 carries out its operation with the tables and programs stored in ROM 32 and RAM 33, respectively, for every picked-up image data. The conventional system is thus disadvantageous in that the calculation operation requires a relatively long time interval, as mentioned in the preamble of the instant specification.

On the contrary, a system according to the present invention uses image data representing ever-changing views picked up from a vehicle until it suffers an accident. These image data are used for training a neural network implemented in the present system. After completion of the training, the neural network is implemented in a utility vehicle and serves as a decision making circuit for starting safety driving arrangements to evade a crash which otherwise would certainly happen. The neural network predicts a crash and evades the same by properly starting an automatic steering system or a brake system.

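Purely as an illustration of this runtime role (the patent gives no code; the function names, the 0.5 threshold and the example weights below are assumptions), the trained network could be queried for each picked-up view roughly as follows:

    import numpy as np

    # Illustrative sketch: a trained feedforward net maps one digitized view to
    # a crash possibility; when it is high enough, the operational command that
    # starts the safety driving arrangements would be issued.
    def predict_crash_possibility(weights_hidden, weights_out, frame):
        sigmoid = lambda s: 1.0 / (1.0 + np.exp(-s))
        hidden = sigmoid(frame.ravel() @ weights_hidden)
        return float(sigmoid(hidden @ weights_out))

    def operational_signal(possibility, threshold=0.5):
        """True -> actuate the safety drive ensuring arrangement."""
        return possibility > threshold

    # Example with random (untrained) weights and a dummy 8x8 view.
    rng = np.random.default_rng(1)
    frame = rng.random((8, 8))
    w_h, w_o = rng.normal(size=(64, 16)), rng.normal(size=16)
    print(operational_signal(predict_crash_possibility(w_h, w_o, frame)))
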
A well-known neural network is described first to facilitate an understanding of the present invention, following which preferred embodiments of the present invention will be described with reference to the drawing.

A neural network is a technological discipline concerned with information processing systems, which has been developed and is still in its development stage. Such an artificial neural network structure is based on our present understanding of biological nervous systems. The artificial neural network is a parallel, distributed information processing structure consisting of processing elements interconnected by unidirectional signal channels called connections. Each processing element has a single output connection that branches into as many collateral connections as desired.

A basic function of the processing elements is described below.

As shown in Fig. 2, each processing element can receive any number of incoming connections while it has a single output connection that can be fanned out into copies to form multiple output connections. Thus the artificial neural network is by far simpler than the networks in a human brain. Each of the input data x1, x2, ..., xi is multiplied by its corresponding weight coefficient w1, w2, ..., wi,
respectively, and the processing element sums the weighted inputs and passes the result through a nonlinearity. Each processing element is characterized by an internal threshold or offset and by the type of nonlinearity, and processes a predetermined transfer function to produce an output f(X) corresponding to the sum (X = Σ xi · wi). In Fig. 2, xi represents an output of an i-th processing element in an (s-1)-th layer and wi represents a connection strength, or the weight, from the (s-1)-th layer to the s-th layer. The output f(X) represents the energy condition of each processing element. Though neural networks come in a variety of forms, they can be generally classified into feedforward and recurrent classes. In the latter, the output of each processing element is fed back to other processing elements via weights. As described above, the network has an energy, or an energy function, associated with it that will finally reach a minimum. In other words, the network is considered to have converged and stabilized when the outputs no longer change on successive iterations. The means to stabilize the network depends on the algorithm used.

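A minimal sketch of the weighted sum and transfer function just described (the three-input size and the use of a sigmoid nonlinearity here are assumptions made only for the example):

    import numpy as np

    # One processing element: the weighted sum X = sum_i(x_i * w_i) plus an
    # internal threshold (offset) is passed through a nonlinearity f to give f(X).
    def processing_element(x, w, offset=0.0,
                           f=lambda s: 1.0 / (1.0 + np.exp(-s))):
        X = float(np.dot(x, w)) + offset
        return f(X)

    # Example: three inputs x_i and their weight coefficients w_i.
    x = np.array([0.2, 0.7, 0.1])
    w = np.array([0.5, -0.3, 0.8])
    print(processing_element(x, w))   # the element's output f(X)
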
The back propagation neural network is one of the most important and common neural network architectures, and it is applied to the present invention. In this embodiment, the neural network is used to determine if there is a possibility of crash. When the neural network detects the possibility of crash, it supplies an operational command to a safety ensuring unit in a manner described below.

As well known in the art, the back propagation neural network is a hierarchical design consisting of fully interconnected layers of processing elements. More particularly, the network architecture comprises at least an input layer and an output layer. The network architecture may further comprise an additional layer or N hidden layers between the input layer and the output layer, where N represents an integer equal to or larger than zero. Each layer consists of one or more processing elements that are connected by links with variable weights. The net is trained by initially selecting small random weights and internal thresholds and then presenting all training data repeatedly. Weights are adjusted after every trial using information specifying the correct result until the weights converge to an acceptable value. The neural network is thus trained to automatically generate and produce a desired output for an unknown input.

Basic learning operation of the back propagation neural network is as follows. First, input values are supplied to the neural network as the training data to produce output values, each of which is compared with a correct or desired output value (teacher data) to obtain information indicating the difference between the actual and desired outputs. The neural network adjusts the weights to reduce
the difference between them. More particularly, the difference can be represented by a well-known mean square error. During the training operation, the network adjusts all weights to minimize a cost function equal to the mean square error. Adjustment of the weights is achieved by back propagating the error from the output layer to the input layer. This process is continued until the network reaches a satisfactory level of performance. The neural network trained in the above-mentioned manner can produce output data based on the input data even for an unknown input pattern.

The generalized delta rule derived with the steepest descent may be used to optimize the learning procedure that involves the presentation of a set of pairs of input and output patterns. The system first uses the input data to produce its own output data and then compares this with the desired output. If there is no difference, no learning takes place; otherwise the weights are changed to reduce the difference. As a result, it becomes possible to converge the network after a relatively short cycle of training.

To train the net, the weights on connections are first initialised randomly and input data (training data) are successively supplied to the processing elements in the input layer. Each processing element is fully connected to the other processing elements in the next layer, where a predetermined calculation operation is carried out. In other words, the training input is fed through to the output. At the output layer the error is found using, for example, a sigmoid function and is propagated back to modify the weight on a connection. The goal is to minimize the error, so that the weights are repeatedly adjusted and updated until the network reaches a satisfactory level of performance. A graphical representation of sigmoid functions is shown in Fig. 3.

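A minimal sketch of this training loop for a small feedforward net follows; the layer sizes, learning rate, iteration limit and stopping tolerance are assumptions made for the example, not values taken from the patent:

    import numpy as np

    # Back propagation sketch: forward pass, mean square error, deltas
    # (generalized delta rule) and weight updates, repeated until the error
    # reaches a satisfactory level.
    rng = np.random.default_rng(0)
    sigmoid = lambda s: 1.0 / (1.0 + np.exp(-s))

    x = rng.random((8, 16))                              # 8 training views, 16 data elements each
    t = rng.integers(0, 2, size=(8, 1)).astype(float)    # teacher flags ("1" = crash)

    w1 = rng.normal(scale=0.1, size=(16, 6))             # input -> hidden weights
    w2 = rng.normal(scale=0.1, size=(6, 1))              # hidden -> output weights

    for epoch in range(5000):
        h = sigmoid(x @ w1)                              # hidden layer outputs
        y = sigmoid(h @ w2)                              # output layer outputs
        error = y - t
        if np.mean(error ** 2) < 1e-3:                   # satisfactory performance
            break
        delta_out = error * y * (1.0 - y)                # output-layer delta
        delta_hid = (delta_out @ w2.T) * h * (1.0 - h)   # error propagated back
        w2 -= 0.5 * h.T @ delta_out                      # 0.5 = assumed learning rate
        w1 -= 0.5 * x.T @ delta_hid

With random data the loop only illustrates the mechanics; in the system described here the inputs would be the digitized view data and the targets the teacher flags.
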
In this embodiment, a sigmoid function as shown in Fig. 3 is applied as the transfer function for the network. The sigmoid function is a bounded differentiable real function that is defined for all real input values and that has a positive derivative everywhere. The central portion of the sigmoid (whether it is near 0 or displaced) is assumed to be roughly linear. With the sigmoid function it becomes possible to establish effective neural network models.

As sigmoid function parameters in each layer, a y-directional scale and a y-coordinate offset are defined. The y-directional scale is defined for each layer to exhibit exponential variation. This results in improved convergence efficiency of the network.

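One common way to write such a per-layer scaled and offset sigmoid (an assumed parametrization; the patent does not state the exact formula) is, in the notation of the sum X defined above,

    f_l(X) = a_l / (1 + exp(-X)) + b_l

where a_l is the y-directional scale of layer l and b_l is its y-coordinate offset.
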
It is readily understood that other functions may be used as the transfer function. For example, in a sinusoidal function the differential coefficient for the input sum in each processing element is within a range equal to that for the original function. Use of the sinusoidal function results in extremely high convergence of training, though the hardware for implementing the network may be rather complex in structure.

An embodiment of the present invention is described with reference to Figs. 4 through 9.

Fig. 4 is a block diagram of a system for predicting and evading a crash of a vehicle using neural networks according to the first embodiment of the present invention. The system in Fig. 4 is similar in structure and operation to that illustrated in Fig. 1 other than a crash predicting circuit 60. Description of the similar components will thus be omitted to avoid redundancy. Fig. 5 is a schematic structural diagram of the crash predicting circuit 60 illustrated in Fig. 4, realized by a neural network of three layers.

The crash predicting circuit 60 in this embodiment is implemented by a neural network architecture of a hierarchical design with three layers, as shown in Fig. 5(a). The input layer 61 consists of n processing elements 61-1 through 61-n arranged in parallel in a one-dimensional linear form. Each processing element in the input layer 61 is fully connected in series to the processing elements in a hidden layer 62 of the network. The hidden layer 62 is connected to an output layer 63 of a single processing element to produce an operational command described below. Fig. 5(b) shows an input layer consisting of a two-dimensional array of processing elements. In this event, the image data are supplied to the input layer as a two-dimensional data matrix of n divisions. Basically, the input and the hidden layers can have any geometrical form desired. With the two-dimensional array, the processing elements of each layer may share the same transfer function and be updated together. At any rate, it should be considered that each processing element is fully interconnected to the other processing elements in the next layer, though only a part of these connections are shown in Fig. 5(a) to avoid complexity.

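As a rough sketch of this n-input, single-output topology (the hidden-layer size, the class name and the numpy representation are assumptions made for illustration):

    import numpy as np

    # Sketch of the crash predicting circuit topology: n processing elements in
    # the input layer (61), one fully connected hidden layer (62) and a single
    # output element (63) producing the operational command.
    class CrashPredictingCircuitSketch:
        def __init__(self, n_inputs, n_hidden=32, seed=0):
            rng = np.random.default_rng(seed)
            self.w_in_hidden = rng.normal(scale=0.1, size=(n_inputs, n_hidden))
            self.w_hidden_out = rng.normal(scale=0.1, size=(n_hidden, 1))

        @staticmethod
        def _sigmoid(s):
            return 1.0 / (1.0 + np.exp(-s))

        def forward(self, data_elements):
            """data_elements: the n divisions of one digitized view, flattened."""
            hidden = self._sigmoid(data_elements @ self.w_in_hidden)
            return float(self._sigmoid(hidden @ self.w_hidden_out))

    # A 16x16 view divided into n = 256 data elements (the two-dimensional
    # input of Fig. 5(b), flattened before it reaches the input layer).
    circuit = CrashPredictingCircuitSketch(n_inputs=256)
    print(circuit.forward(np.random.default_rng(1).random(256)))
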
Referring now to Fig. 6 in addition to Fig. 5, illustrated are views picked up as the image data for use in training the neural network. The image pick-up device 21 picks up ever-changing images as analog image data, as described above in conjunction with the conventional system. This image pick-up device 21 is also any one of suitable devices such as a CCD camera. The image pick-up operation is carried out while the vehicle is running at a higher speed than a predetermined one. The image data are subject to sampling for a sampling range ΔT during a predetermined sampling period Δt. The image data are collected before and just after a pseudo crash. The image pick-up range of the image pick-up device 21 corresponds to a field of view observed through naked eyes. The view shown in Fig. 6(a) is picked up when a station wagon (estate car) 80a on the opposite lane comes across the center line. The view shown in Fig. 6(b) is picked up when an automobile 80b suddenly appears from a blind corner of a cross-street. These ever-changing images are collected as the training data for the neural network.

The image data effectively used for the crash evasive purpose are those which allow continuous recognition of the ever-changing views before and just after the pseudo crash. In this respect, the image pick-up device 21 picks up the images of a vehicle or other obstructions located at a relatively short headway. In addition, the picked-up images preferably are distinct reflections of the outside views.

The data elements constituting one image are simultaneously supplied to the input layer 61 in parallel. In other words, each data element is supplied to the respective processing element of the input layer 61. The digital image data may be normalized before being supplied to the input layer 61 to increase the data processing speed. However, each processing element of the input layer 61 essentially receives the data element obtained by dividing the image data as described previously. The data elements are subjected to feature extraction when supplied to the hidden layer 62.

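For illustration, dividing one digitized view into its n data elements and normalizing them before they are fed to the input layer might look like the following sketch (the 8x8 grid, the averaging and the 0-to-1 normalization are assumptions):

    import numpy as np

    # Sketch: divide a digitized view into a grid of n data elements and
    # normalize them to the range 0..1 before they reach the input layer 61.
    def to_data_elements(image, grid=8):
        h, w = image.shape
        bh, bw = h // grid, w // grid
        elements = np.array([image[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw].mean()
                             for r in range(grid) for c in range(grid)])
        span = elements.max() - elements.min()
        return (elements - elements.min()) / span if span else elements

    frame = np.random.default_rng(2).random((64, 64))   # one digitized view
    print(to_data_elements(frame).shape)                # (64,), i.e. n = 64 elements
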
In typical image processing, feature extraction is carried out according to any one of various methods of pattern recognition to clearly identify shapes, forms or configurations of images. The feature-extracted data are quantized to facilitate subsequent calculations. In this event, a separate analytical procedure is used for region partitioning or for extraction of configuration strokes. In other words, a particular program is necessary for each unit operation such as region partitioning, feature extraction, vectorization and so on. Compared with this, the prediction system according to the present invention requires no program based on each operation or procedure, because a unique algorithm is established on completion of network training. This single algorithm allows the necessary functions to be performed without using separate algorithms or programs.

In a preferred embodiment, the feature extraction is directed to the configuration of an object defining the driving lanes, such as shoulders, curbs, guard rails or the center line. The feature may also be extracted on regions such as carriageways. The neural network learns these configurations and regions during the training process. This process is continued until the network reaches a satisfactory level of performance. The neural network is thus trained while carrying out feature extraction on the input image. Weights are adjusted after every trial on the quantized image data, so that the latest training data are weighted according to the latest result of adjustment and then supplied to the hidden layer 62. In addition, the neural network can be trained with image data including an object at time-varying positions. In this event, any one of suitable methods may be used for digital image processing.

In the present embodiment, each digital data set indicative of the ever-changing view at a certain sampling time instance is divided into n data elements, n representing a positive integer which is equal in number to the processing elements in the input layer 61. In other words, the series of time-sequential data is picked up as continuous n data elements to be supplied in parallel to the n by m processing elements in the input layer 61 as the training data. At the same time, an operational signal is supplied to the output layer 63 of the network as teacher data. The operational signal may be a logic "1" representing a crash of the automobile 10 after the elapse of a predetermined time interval from the sampling time instant corresponding to the image data just having been supplied to the input layer 61.

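A sketch of how (input, teacher) pairs could be assembled from this description (the frame source, the look-ahead interval and the helper names are assumptions):

    import numpy as np

    # Sketch: build training pairs from a recorded sequence of digitized views.
    # The teacher flag is logic "1" when the (pseudo) crash occurs within a
    # predetermined interval after the sampled frame.
    def make_training_pairs(frames, crash_index, lookahead=5):
        """frames: list of n-element data vectors; crash_index: crash frame."""
        pairs = []
        for i, elements in enumerate(frames):
            teacher = 1.0 if 0 <= crash_index - i <= lookahead else 0.0
            pairs.append((np.asarray(elements, dtype=float), teacher))
        return pairs

    # Example: 20 recorded views of 64 data elements each, pseudo crash at frame 17.
    rng = np.random.default_rng(3)
    sequence = [rng.random(64) for _ in range(20)]
    pairs = make_training_pairs(sequence, crash_index=17)
    print(sum(flag for _, flag in pairs))   # frames flagged as leading to the crash
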
In the same manner, the picked-up image data and their corresponding teacher data are successively supplied to the crash predicting circuit 60. The crash predicting circuit 60 is continuously trained until the network reaches a satisfactory level of performance. After completion of training, the network is capable of matching the picked-up image with the possibility of crash.
