`
`SAMSUNG EXHIBIT 1001
`Samsung v. Image Processing Techs.
`
`
`
US 6,959,293 B2
`
`Page 2
`
`OTHER PUBLICATIONS
`
Pierre-François Rüedi, “Motion Detection Silicon Retina Based on Event Correlations”, 1996 IEEE Proceedings of MicroNeuro ’96, pp. 23–29.
Revue Trimestrielle Des «Techniques de l’Ingénieur», “Instantanés Technique”, Techniques De l’Ingénieur, Mars 1997—No. 5 (40F), ISSN 0994-0758.
Les Professionnels de l’Informatique En Entreprise Magazine, “Objectif Sécurité Des Réseaux”, No. 24, Janvier 1997.
Electronique International Hebdo, 5 Décembre 1996—No. 245, “Premier . . . oeil”, Françoise Grusvelet (with translation).
`Nabeel Al Adsani, “For Immediate Release The Generic
`Visual Perception Processor”, Oct. 10, 1997, p. 1.
Colin Johnson, “Vision Chip’s Circuitry Has Its Eye Out For You”, http://192.215.107.74/Wire/news/1997/09/0913vision.html, pp. 1–3.
“British firm has eye on the future”, The Japan Times, Business & Technology, Tuesday, Nov. 18, 1997, 4th Edition.
`
Inside the Pentagon’s Inside Missile Defense, an exclusive biweekly report on U.S. missile defense programs, procurement and policymaking, “Missile Technology”, vol. 3, No. 16—Aug. 13, 1997, p. 5.
`
Electronique, “Le Mécanisme de la Vision Humaine Dans Le Silicium”, Electronique Le Mensuel Des Ingénieurs De Conception, No. 68, Mars 1997, ISSN 1157-1151 (with translation).
“Elektronik Revue” ER, Eine Elsevier-Thomas-Publikation, Jahrgang 8, März 1997, Nr. 3, ISSN 0939-1134.
“Un Processeur de Perception Visuelle”, Le Haut-Parleur, 25F, Des solutions électroniques pour tous, No. 1856, 15 janvier 1997 (with translation).
“Réaliser Un Décodeur Pour TV Numérique”, Electronique, Le Mensuel Des Ingénieurs De Conception, No. 66, Janvier 1997.
`
Groupe Revenu Français, Air & Cosmos Aviation International, “Un Calculateur De Perception Visuelle”, Hebdomadaire, vendredi 6 décembre 1996, 34e Année, No. 1590, 22F.
`
Kenichi Yamada, et al., “Image Understanding Based on Edge Histogram Method for Rear-End Collision Avoidance System”, Vehicle Navigation & Information Systems Conference Proceedings (1994), pp. 445–450, published Aug. 31, 1994; XP 000641348.
`
`* cited by examiner
`
`
`
`
U.S. Patent          Oct. 25, 2005          Sheets 1–31          US 6,959,293 B2

[Drawing sheets 1 through 31. Figure labels legible in the scanned sheets: FIG. 5, FIG. 6, FIG. 7, FIG. 8, FIG. 10, FIG. 11, FIG. 13b, FIG. 16, FIG. 18, FIG. 21, FIG. 22, FIG. 28, FIG. 31 (suffix illegible) and FIG. 33. The remaining content of these sheets is drawing graphics.]
`
`
`US 6,959,293 B2
`
`1
`METHOD AND DEVICE FOR AUTOMATIC
`VISUAL PERCEPTION
`
`BACKGROUND OF THE INVENTION
`
The invention relates generally to methods and devices
for automatic visual perception, and more particularly to
methods and devices for processing image signals using one
or more self-adapting histogram calculation units capable of
implementing anticipation and learning modes. Such a
device can be termed an electronic spatio-temporal neuron;
it is particularly useful for image processing, but may also
be used to process any other signals, such as sound signals.
Image processing methods and devices are already
known that enable real-time recognition, localization
and/or extraction of objects meeting certain criteria within
their context. The selection criteria can be extremely
varied; they may relate to speed, shape, color . . . or a
combination of these criteria. These methods and devices
can be used to facilitate the acquisition of a scene or of a
phenomenon by an observer, or to control an automatism on
the basis of the information thus extracted. Such methods and
devices are described, for example, in the publications
FR-2.611063 and WO-98/05002.
`
Certain of these methods and devices implement a spatial
and temporal processing unit that, upon receiving a video-
type signal S(PI), produces a number of parameters for each
pixel: for instance speed V, direction DL, a time constant
CO and a binary enabling parameter VL, in addition to the
delayed video signal VR and the various frame, line and
pixel synchronization signals gathered under the
denomination F.
`
In such devices, the importance of constituting histograms
of these parameters and using them in a visual perception
processor to acquire, manipulate and process statistical
information has already been outlined.
The purpose of such a visual perception processor
includes outputting a signal S’(t) that carries, for each pixel,
a significant piece of information about the result obtained
when applying recognition or selection criteria. These
criteria are predefined or are prepared by the image
processing methods and devices themselves. Such a method
and such a device, in particular, are disclosed in the patent
application WO-98/05002, already mentioned, which is
incorporated herein by reference.
`
It is therefore desirable to provide an improved visual
perception processor and methods providing, in preferred
embodiments, self-adapting, anticipation and learning
functions.
`
`SUMMARY OF THE INVENTION
`
`This invention provides visual perception devices and
`methods for detecting automatically an event occurring in a
`space with respect to at least one parameter.
According to the invention, a perception device comprises
a control unit, a data bus, a time coincidences bus and at
least one histogram calculation unit for processing the
parameter.
`The present invention also covers the features that will be
`put in evidence by the following description and that will
`have to be considered either independently or in technical
`combinations:
`
the device comprises, in order to process a number of
parameters, a number of histogram calculation units
organized into a matrix; the histogram calculation units
process data aijT associated with pixels forming together a
multidimensional space (i, j) evolving with the course of
time and represented at a succession of instants (T), wherein
the said data reaches the said calculation unit in the form of
a digital signal DATA(A) made of a succession aijT of binary
numbers of n bits associated with synchronization signals
enabling to define the given instant (T) of the space and the
position (i, j) of the pixel in this space, with which the signal
aijT received at a given instant (t) is associated, and
comprises:
`
`an analysis memory comprising a memory with
`addresses, each associated with possible values of the num-
`bers of n bits of the signal DATA(A) and whose writing
`process is controlled by a signal <<WRITE>>,
`a classifier comprising a memory intended for receiving a
`selection criterion C of the parameter DATA(A), receiving
`the signal DATA(A) at the input and that outputs a binary
`output signal whose value depends on the result of the
`comparison of the signal DATA(A) with the selection cri-
`terion C,
`a time coincidences unit receiving the output signal from
`the classifier and, from outside the histogram calculation
`unit, individual binary enabling signals affecting parameters
`other than DATA(A), wherein the said time coincidences
`unit outputs a positive global enabling signal when all the
`individual time coincidences signals are valid,
`a test unit,
`an analysis output unit,
`an address multiplexer,
`an incrementation enabling unit,
wherein the counter of each address in the memory
corresponds to the value d of aijt at a given instant, and is
incremented by one unit when the time coincidences unit
outputs a positive global enabling signal,
the unit intended for calculating and storing statistical
data processes, after receiving the data aijt corresponding to
the space at an instant T, the content of the memory in order
to update its own memories,
the memory is deleted before the beginning of each frame
for a space at an instant T by an initialization signal
<<INIT>>.
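The accumulation behaviour described above can be pictured in software. The sketch below is illustrative only and is not the patented hardware; the names `HistogramUnit`, `init_frame` and `write` are invented for the example:

```python
class HistogramUnit:
    """Toy model of the analysis memory: one counter per possible
    n-bit value of the signal DATA(A)."""

    def __init__(self, n_bits):
        self.memory = [0] * (1 << n_bits)

    def init_frame(self):
        # INIT: the memory is cleared before the beginning of each frame.
        self.memory = [0] * len(self.memory)

    def write(self, data_a, global_enable):
        # The counter at address DATA(A) is incremented by one unit only
        # when the time coincidences unit outputs a positive global
        # enabling signal.
        if global_enable:
            self.memory[data_a] += 1

unit = HistogramUnit(n_bits=8)
unit.init_frame()
for value, enabled in [(12, True), (12, True), (40, False), (12, True)]:
    unit.write(value, enabled)
print(unit.memory[12], unit.memory[40])  # 3 0
```

After a full frame, `memory` holds the statistical distribution of the enabled values, which the analysis output unit can then summarize.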
`
`the memory of the classifier is an addressable memory
`enabling real time updating of the selection criterion C and
`having one data input DATA IN, an address command
`ADDRESS and a writing command WR, receiving on its
`input the output from the analysis memory and a signal END
`on its writing command,
it also comprises a data input multiplexer with two inputs
and one output, receiving on one of its inputs a counting
signal COUNTER and on its other input the succession of
data aijt, connected to the address command of the memory
of the classifier, and an operator OR controlling the address
multiplexer and receiving on its inputs an initialization
signal INIT and the end signal END.
the space (i, j) is two-dimensional and the signal
DATA(A) is associated with the pixels of a succession of
images.
`it comprises means for anticipating the value of the
`classification criterion C.
`
the means for anticipating the value of the classification
criterion C comprise memories intended for containing the
values of statistical parameters relating to two successive
frames T0 and T1.
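One plausible reading of this anticipation can be sketched as follows. This is a sketch under assumed semantics, not the patented circuit: the displacement of a statistical parameter such as the average position POSMOY between the two successive frames is used to offset the classification criterion for the next frame. The function name and the window representation are invented for the example:

```python
def anticipated_criterion(lo, hi, posmoy_t0, posmoy_t1):
    # Displacement of the mean position between two successive frames
    # (the drawings compute A as the difference of POSMOY1 and POSMOY0).
    delta = posmoy_t1 - posmoy_t0
    # Shift the classification window by the observed displacement so
    # the criterion anticipates where the values will fall next frame.
    return lo + delta, hi + delta

print(anticipated_criterion(10, 20, posmoy_t0=14, posmoy_t1=17))  # (13, 23)
```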
`
`
`
`
the statistical parameters are the average values of the
enabled data aijt.
the analysis output register constitutes and stores in its
memory at least one of the following values: the minimum
‘MIN’, the maximum ‘MAX’, the maximum number of
pixels for which the signal Vijt has a particular value
‘RMAX’, the corresponding particular value ‘POSRMAX’,
and the total number of enabled pixels ‘NBPTS’.
the statistical comparison parameter used by the classifier
is RMAX/2.
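These registers can be pictured as simple reductions over a completed histogram. The sketch below is illustrative only; `analysis_outputs` and the sample histogram are invented names, not part of the patent:

```python
def analysis_outputs(hist):
    """Derive the statistical registers named above from a histogram,
    where hist[v] is the number of enabled pixels with value v."""
    nonzero = [v for v, c in enumerate(hist) if c > 0]
    MIN = min(nonzero)            # smallest value actually present
    MAX = max(nonzero)            # largest value actually present
    NBPTS = sum(hist)             # total number of enabled pixels
    RMAX = max(hist)              # largest counter value
    POSRMAX = hist.index(RMAX)    # value at which that maximum occurs
    return MIN, MAX, NBPTS, RMAX, POSRMAX

hist = [0, 3, 7, 2, 0, 1]
print(analysis_outputs(hist))  # (1, 5, 13, 7, 2)
```

The classifier can then, for example, retain only the values whose counter exceeds RMAX/2.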
`
it comprises a controlled multiplexer capable of receiving
at input several statistical parameters, wherein the nature of
the comparison made by the classifier depends on the
command of the said multiplexer.
it comprises a learning multiplexer intended for receiving
an external command signal and producing an operation
according to a learning mode, in which the registers of the
classifier and of the time coincidences unit are deleted when
starting to process a frame, and in which the analysis output
register supplies values typical of the sequence of each of
these registers.
`the memory of the classifier includes a set of independent
`registers D, each comprising one input, one output and one
`writing command, wherein the number of these registers D
`is equal to the number n of bits of the numbers of the
`succession Vii, and that it comprises a decoder enabling to
`output a command signal corresponding to the related input
`value (address) and a multiplexer controlled by this input
`value, thus enabling to read the chosen register.
`it comprises multiplexers, each of them being associated
`with the input of each register and combinatory modules
`connecting the registers to one another, wherein the said
`multiplexers enable to choose between sequential writing
`and a writing mode common to all the registers connected
`together by the combinatory modules.
`the combinatory modules comprise a morphological
`expansion operator including a three-input logic unit ‘OR’,
`whereby the first input receives the output signal of the
`‘Q’-order register, the second is connected to the output of
`a two-input
`logic unit ‘AND’ receiving respectively the
`output signal of the ‘Q+1’-order register and a positive
`expansion signal, the third is connected to the output of a
`two-input logic unit ‘AND’ receiving respectively the output
`signal of the ‘Q—1’-order register and a negative expansion
`signal.
the combinatory modules comprise a morphological
erosion operator including a three-input logic unit ‘AND’,
whereby the first input receives the output signal of the
‘Q’-order register, the second is connected to the output of
a four-input logic unit ‘AND’ with one inverted input, which
receives respectively the output signal of the ‘Q’-order
register, the output signal of the ‘Q−1’-order register, the
output signal of the ‘Q+1’-order register and a positive
erosion signal, and the third is connected to the output of a
four-input logic unit ‘AND’ with one inverted input, which
receives respectively the output signal of the ‘Q’-order
register, the output signal of the ‘Q−1’-order register, the
output signal of the ‘Q+1’-order register and a negative
erosion signal.
`each combinatory module comprises a multiplexer asso-
`ciating a morphological expansion operator and a morpho-
`logical erosion operator.
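A rough software analogue of these combinatory modules can be sketched as follows. This is a sketch under assumed border behaviour (out-of-range neighbours are treated as absent for expansion and as set for erosion), not the register-level circuit; the function names are invented:

```python
def expand(bits, positive=True, negative=True):
    # Three-input OR per register Q: its own value, OR the 'Q+1'
    # register gated by a positive expansion signal, OR the 'Q-1'
    # register gated by a negative expansion signal.
    n = len(bits)
    return [
        int(bits[q]
            or (positive and q + 1 < n and bits[q + 1])
            or (negative and q - 1 >= 0 and bits[q - 1]))
        for q in range(n)
    ]

def erode(bits):
    # Erosion sketch: register Q stays set only if both neighbours
    # are also set.
    n = len(bits)
    return [
        int(bits[q]
            and (q - 1 < 0 or bits[q - 1])
            and (q + 1 >= n or bits[q + 1]))
        for q in range(n)
    ]

print(expand([0, 1, 0, 0]))  # [1, 1, 1, 0]
print(erode([1, 1, 1, 0]))   # [1, 1, 0, 0]
```

Selecting between the two operators per module, as the text describes, amounts to a multiplexer choosing which of these outputs is written back.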
`The invention relates to an automatic visual perception
`method of an event occurring in a space with respect to at
`least one parameter. This method includes digitalizing the
parameter and applying it as an input to a histogram
calculation unit in order to get a representative histogram of the
`parameter and to infer the desired result.
The invention also relates to a method for analyzing a
parameter representative of an event in an electronic device,
comprising a histogram calculation over data aijt associated
with pixels forming together a multidimensional space (i, j)
evolving with the course of time and represented at a
succession of instants (T), wherein the said data reaches the
said calculation unit in the form of a digital signal DATA(A)
made of a succession aijt of binary numbers of n bits
associated with synchronization signals enabling to define
the given instant (T) of the space and the position (i, j) of the
pixel in this space, with which the signal aijt received at a
given instant (T) is associated, and wherein:

to each data aijt is associated a classification binary signal
whose value depends on the result of the comparison
between the signal DATA(A) and the selection criterion C,
a statistical distribution of the data aijt is made for a given
instant for which a global enabling signal is positive, the said
global enabling signal being made of a set of individual time
coincidences signals, each one corresponding to a parameter
DATA(A), DATA(B), . . . , DATA(E), resulting from the
comparison between a time coincidences criterion R and the
classification signal, and being positive.
`Reference to the remaining portions of the specification,
`including the drawings and claims, will realize other features
`and advantages of the present invention. Further features
`and advantages of the present invention, as well as the
`structure and operation of various embodiments of the
`present invention, are described in detail below with respect
`to the accompanying drawings. In the drawings, like refer-
`ence numbers indicate identical or functionally similar ele-
`ments.
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
`The invention will be described more in detail with
`
`reference to the appended drawings in which:
`FIG. 1 is a representation of the histogram calculation unit
according to the invention, in its context;
`FIG. 2 is a representation of the input video signal,
`processed by the device and the method of the invention and
`of the control signals generated by a sequencer;
`FIG. 3 is a diagram representing a passive histogram
`calculation unit;
`FIG. 4 is a diagram representing a self-adapting histogram
`calculation unit according to the invention with the antici-
`pation and learning functionalities;
`FIG. 5 is a diagram representing signals processed by the
`calculation unit of FIG. 4;
`FIG. 6 is the flow chart of the software controlling the
`calculation unit of FIG. 4 in master mode;
`FIG. 7 is the flow chart of the software controlling the
`calculation unit of FIG. 4 in slave mode;
`FIG. 8 is the flow chart of the insertion software of the
`curve zone;
`FIG. 9 is the flow chart of the initialisation software
`
`(generation of the command ‘INIT’);
`FIG. 10 is the flow chart of the statistical calculation
`
`software (use of the command ‘WRITE’);
`FIG. 11 is a flow chart of processing end (use of the
`command ‘END’);
`FIG. 12 is a representation of the elements of the histo-
`gram calculation unit with a self-adapting functionality
`according to one embodiment of the present invention;
`
`
`
`
FIGS. 13a and 13d are representations of an enabling
counter fitted with several adapting modules according to
alternate embodiments of the present invention;
FIGS. 13b and 13c are representations of statistical dis-
tributions of a parameter and classification criteria;
`FIG. 14 is a representation of the elements of histogram
`calculation unit producing POSMOY values according to
`one embodiment of the present invention;
`FIG. 15a is a diagram representing the elements of a
`self-adapting histogram calculation unit with anticipation
`according to a first embodiment;
`FIG. 15b is a diagram representing the elements of a
`self-adapting histogram calculation unit with anticipation
`according to an alternate embodiment;
`FIG. 16 is a diagram of the classifier memory according
`to one embodiment of the present invention;
`FIG. 17 is a diagram representing the elements of the
`self-adapting histogram calculation unit with anticipation
according to an alternate embodiment;
`FIG. 18 is a detailed representation of the classifier
`memory with a bit-operated elementary calculation automa-
`ton according to one embodiment of the present invention;
`FIG. 19 is a representation of an elementary anticipation
`calculation automaton according to one embodiment of the
`present invention;
`FIG. 20 is a schematic representation of the anticipation
`process according to one embodiment of the present inven-
`tion;
`FIG. 21 is the flow chart of the anticipation implementa-
`tion software according to one embodiment of the present
`invention;
`FIG. 22 is a representation of the time coincidences unit
`according to one embodiment of the present invention;
`FIG. 23 is a flow chart representation of a field program-
`mable gate array (FPGA) used as a time coincidences unit
`according to one embodiment of the present invention;
`FIG. 24 is the register-based representation, limited to one
`row of the system, of FIG. 23;
`FIG. 25 is a representation of the elements of a histogram
`calculation unit with a learning functionality according to
`one embodiment of the present invention;
`FIG. 26 is a schematic representation of axis selection
`circuitry according to one embodiment of the present inven-
`tion;
`FIG. 27 illustrates various axes selectable by the circuitry
`of FIG. 26;
`FIG. 28 is a schematic representation of a statistical
`visualisation device according to one embodiment of the
`present invention;
`FIG. 29 is an example of the result obtained using the
`visualisation produced by the device of FIG. 28;
`FIG. 30 is the representation of an implementation of a
`number of histogram calculation units according to one
`embodiment of the present invention;
`FIG. 31a is the representation of the use of a single
`programmable histogram calculation unit with a multiplexer
`enabling the calculation unit to process any of a number of
`parameters according to one embodiment of the present
`invention;
`FIG. 31b is a representation of a histogram calculation
`unit called as well an electronic spatio-temporal neuron;
`FIG. 32 represents a set of histogram calculation units
`with programmable input control in their context of usage
`
`10
`
`15
`
`20
`
`25
`
`30
`
`35
`
`40
`
`45
`
`50
`
`55
`
`60
`
`65
`
`6
`thereby constituting a functional entity according to one
`embodiment of the present invention;
`FIG. 33 is a synthetic representation of a functional unit
`with an associated signal generator according to one
`embodiment of the present invention;
`FIG. 34 corresponds to FIG. 32 in the case of a two-source
`acquisition;
`FIG. 35 corresponds to FIG. 33 in the case of a binocular
`acquisition;
`FIG. 36 is a schematic representation of a signal generator
`fitted with controlled optics according to one embodiment of
`the present invention;
`FIG. 37 shows the case of a three-source acquisition
`according to one embodiment of the present invention;
`FIG. 38 is a representation of the application management
`interface (API) according to one embodiment of the present
`invention;
`FIG. 39 illustrates a system for processing signals in the
`sound perception domain according to one embodiment of
`the present invention; and
`FIG. 40 is a simplified representation of a device accord-
`ing to an embodiment of the present invention.
`
`DESCRIPTION OF THE SPECIFIC
`EMBODIMENTS
`
`The invention can be subject to numerous embodiments.
`The information processed can be of various natures and
`represent multiple data or parameters. However,
`its first
`application is image processing, whereby the said images
`make up the space considered. This space in one embodi-
`ment is two-dimensional. The following detailed description
`corresponds to this particular embodiment.
`The histogram calculation unit 1 of the invention is
`represented in its context by FIGS. 1 and 2.
`This histogram calculation unit 1 is part of a perception
`unit 13 that receives and processes a signal S(t) or S(PI). The
`histogram calculation unit processes and generates time
`coincidences information S’(t) on a bus 111. More precisely,
`FIG. 1 represents several associated histogram calculation
units 1A, 1B, . . . , 1E in the same perception unit. In one
`embodiment, perception unit 13 is a visual perception unit
`that processes various signals relating to a visual scene or
`scenes. In other embodiments, the perception unit 13 pro-
`cesses signals related to the desired perception parameters,
`for example, sound parameters. The following will discuss
`the invention with respect to the visual perception domain,
`although it will be apparent that other perception domains
`may be implemented.
`A sequencer 9 generates, out of the synchronisation
`signals ST, SL, CLOCK, sequence signals INIT, WRITE and
`COUNTER that control the histogram calculation unit.
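By way of illustration only, the sequencer's cycle can be pictured as below. The real sequencer is hardware paced by ST, SL and CLOCK; `sequencer_frame` and its event tuples are invented for the sketch:

```python
def sequencer_frame(n_rows, n_cols):
    """Toy sequencer cycle: emit INIT once per frame, then a WRITE
    together with the running COUNTER value for every pixel clock."""
    yield ("INIT", None)
    counter = 0
    for _ in range(n_rows):        # paced by SL in the real device
        for _ in range(n_cols):    # paced by CLOCK in the real device
            yield ("WRITE", counter)
            counter += 1

events = list(sequencer_frame(2, 3))
print(events[0], len(events))  # ('INIT', None) 7
```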
As represented on FIG. 1, the input signals of the
sequencer 9 (S(t), SL, ST, CLOCK) may come from a signal
`generator assembly 2 comprising a camera 22 or a signal
`generator assembly 3 comprising a CMOS imaging device
`32. It will be apparent that input signals can be supplied by
`any signal generation mechanism.
`When the input signals come from an assembly 2 com-
`prising a camera,
`this assembly imposes frame and line
`synchronisation signals so that the histogram calculation
`unit and its sequencer operate in a slave mode or synchro-
`nisation slave mode. FIG. 7 illustrates a flow chart repre-
`senting software for controlling the histogram calculation
`unit and sequencer in a slave mode.
`
`
`
`
Conversely, when these signals come from an assembly 3
comprising a CMOS imaging device, the sequencer 9
operates in a master mode and itself generates the
synchronisation signals. FIG. 6 illustrates a flow chart
representing software for controlling the histogram
calculation unit and sequencer in a master mode.
`More precisely, the assembly 2 enables acquisition of data
`from a scene 21 by a camera 22. The camera 22 produces a
`signal S(PI) whose configuration, of the type represented on
`FIG. 2, will be described in detail below.
The electronic control unit 23 of the camera 22 then
provides the signal S(t) resulting from the extraction of
S(PI), the ST and SL synchronisation signals, and the
CLOCK signal originating from a phase-locked loop, which
are used by the histogram calculation unit.
In the case of an assembly 3 comprising a CMOS imaging
device, this imaging device 32 is used for the acquisition of
data of the scene 31. It supplies S(t) and is driven by a
synchronisation unit 33 that produces the frame synchroni-
sation signals ST and the line synchronisation signals SL, as
well as the CLOCK signal used by the CMOS imaging
device 32 and by the other elements of the visual perception
unit 13.
The histogram calculation units 1 are advantageously
coordinated with a spatial processing unit 6 and a temporal
processing unit 5 and with a delay line 7, which have been
described in FR-2.611063 and WO-98/05002, the contents
of which are each hereby incorporated by reference in their
entirety for all purposes. The spatial and temporal process-
ing units 5 and 6 correspond to the device referred to as 11
in the patent application mentioned. It receives the signal
S(PI) and generates parameters V (speed), DL (direction),
each corresp