United States Patent [19]
Tomitaka et al.

[11] Patent Number: 5,546,125
[45] Date of Patent: Aug. 13, 1996

[54] VIDEO SIGNAL FOLLOW-UP PROCESSING SYSTEM

[75] Inventors: Tadafusa Tomitaka, Chiba; Tsuneo Sekiya, Tokyo, both of Japan

[73] Assignee: Sony Corporation, Tokyo, Japan

[21] Appl. No.: 268,125

[22] Filed: Jul. 6, 1994

[30] Foreign Application Priority Data
Jul. 14, 1993 [JP] Japan .................. 5-196954

[51] Int. Cl.6 ..................
[52] U.S. Cl. .................. 348/169; 348/30; 348/32
[58] Field of Search .................. 348/169-172, 348/135-137, 149-157, 649-652, 214, 32, 30; H04N 7/18

[56] References Cited

U.S. PATENT DOCUMENTS
4,364,089 12/1982 Woolfson .................. 348/169
4,583,186 4/1986 Davis et al. .................. 364/526
4,718,089 1/1988 Hayashi et al. .................. 382/17
5,164,825 11/1992 Kobayashi et al. .................. 348/441
5,333,070 7/1994 Ichikawa .................. 348/652
5,347,371 9/1994 Nishimura et al. .................. 348/228
5,412,487 5/1995 Nishimura et al. .................. 358/452
5,416,848 5/1995 Young .................. 382/191
5,430,809 7/1995 Tomitaka .................. 348/169
5,473,369 12/1995 Abe .................. 348/169
[two additional cited U.S. patents are illegible in this scan]

FOREIGN PATENT DOCUMENTS
0578508 1/1994 European Pat. Off.

Primary Examiner—Tommy P. Chin
Assistant Examiner—Vu Le
Attorney, Agent, or Firm—William S. Frommer; Alvin Sinderbrand

[57] ABSTRACT

A video signal follow-up processing system for adaptively tracking the movement of a subject. A detection feature pattern is formed through acquisition of brightness and hue frequency characteristic data based on pixel information in a detection measurement frame; a similarity calculation method which can distinguish a reference measurement frame from other areas on the screen is selected; and the position of the detection measurement frame whose feature pattern has the highest similarity with the standard feature pattern obtained from the reference measurement frame is determined, in order to change and control the image projected on the display screen based on the positional information of the detection measurement frame, so that the video signal follow-up processing system can adaptively track the movement of the subject.

39 Claims, 13 Drawing Sheets
[Front-page drawing: FIG. 1, block diagram of the video camera system VCS (lens block 1; signal separation/automatic gain control circuit 5; A/D 6; digital camera processing 7; D/A 8; low-pass filter 26; decimation circuit 27; saturation/hue detection circuit 14; address generator 17; image memory 15; gate 18; hue histogram generator 19; brightness histogram generator 20; follow-up signal processor 16; panning drive motor 12A; tilting drive motor 12B)]
Page 1 of 25
SAMSUNG EXHIBIT 1007
Samsung v. Image Processing Techs.
`

[Sheet 1 of 13, FIG. 1: block diagram of the first embodiment; drawing content illegible in this scan]
[Sheet 2 of 13, FIGS. 2 and 3: the HLS color coordinate system, and the reference measurement frame on the display screen PIC]
[Sheet 3 of 13, FIG. 4: standard hue frequency characteristics HueStd(i) over hue angle 0-359; FIG. 5: standard brightness frequency characteristics YStd(i) over brightness level 0-255]
[Sheet 4 of 13, FIGS. 6A-6C: frequency over sort value, before and after the membership function filter]
[Sheet 5 of 13, FIGS. 7A/7B: hue frequency over sort values 0-7 for the N-th and (N+1)-th fields without filtering; FIGS. 8A/8B: the same frequencies with the membership function filter applied]
[Sheet 6 of 13, FIG. 9: flowchart of automatic follow-up processing procedure RT1]

SP1: Initialize FN = 0; the user releases the "REC PAUSE" button and starts recording.
SP2: Store, as the standard feature pattern, the frequency characteristic data for the brightness and hue signals in reference measurement frame FMXR at the center of the screen.
SP3: Scan detection measurement frame FMXD over the entire display screen, and take in, as the detected feature pattern, the frequency characteristic data for the brightness and hue signals in each measurement frame FMXD.
Calculate the similarity between the standard feature pattern and the detected feature pattern by executing the process of RT2 to determine evaluation value JZ(x, y).
Make the position of the measurement frame with the smallest evaluation value JZ(x, y) the new position of the subject.
Control the panning drive motor and the tilting drive motor so that the new position of the subject (x, y) comes to the center of the screen.
FN = FN + 1; update the standard frequency characteristic data of the brightness and hue signals in the reference measurement frame as the standard feature pattern; then repeat from the scanning step.
[Sheet 7 of 13, FIG. 10: the detection measurement frame scanned over the display screen]
[Sheet 8 of 13, FIG. 11: detection hue frequency characteristics Hue(x, y)(i) over hue angle 0-359; FIG. 12: detection brightness frequency characteristics Y(x, y)(i) over brightness level 0-255]
[Sheet 9 of 13, FIG. 13: flowchart of similarity calculation processing routine RT2]

SP11: Decision step (on the YES branch, proceed to SP15; otherwise to SP12).
SP12: Calculate the similarity between the standard feature pattern and the detection feature pattern by all nine methods to obtain evaluation values JN(x, y) (N = 1, 2, ..., 9).
SP13: Select JZ(x, y) with the largest (second minimum value)/(first minimum value) ratio among the evaluation values JN(x, y) (N = 1, 2, ..., 9).
SP15: Calculate the similarity between the standard feature pattern and the detection feature pattern by the selected method.
RETURN (SP14).
[Sheet 10 of 13, FIG. 14: table of the nine similarity calculation methods. Crossing the element of the distance vector (Y, Hue, Hue+Y) with the distance calculation method (Euclidean distance calculation method: histogram Euclidean distance; Hamming distance calculation method: histogram Hamming distance; integral distance (area distance) calculation method: integral histogram distance) yields the nine evaluation values J1 to J9.]

[FIGS. 15A/15B: example patterns for a specific similarity calculation. Detection feature pattern Hue(i) = [0, 0, 0, 5, 3, 0, 0, ...]; standard feature pattern HueStd(i) = [0, 0, 0, 3, 4, 0, 0, ...].]

SAMSUNG EXHIBIT 1007
Page 11 of 25
[Sheet 11 of 13, FIG. 16: schematic perspective view of distance data found on the display screen]
[Sheet 12 of 13, FIG. 17: block diagram of the second embodiment; drawing content illegible in this scan]
[Sheet 13 of 13, FIG. 18: block diagram of the third embodiment; drawing content illegible in this scan]
5,546,125

VIDEO SIGNAL FOLLOW-UP PROCESSING SYSTEM
`
BACKGROUND OF THE INVENTION

1. Field of the Invention

This invention relates to a video signal follow-up processing system, and more particularly to an improvement of one which can take an optimal image by automatically following changes of a subject in the field of view of a video camera.
2. Description of the Related Art

As an apparatus for tracking changes of position of a subject in the field of view, there is a subject tracking apparatus in a video camera which automatically tracks the subject according to its movement. First, for a subject tracking device in a conventional video camera, an automatic subject tracking method has been disclosed in Japanese Patent Application No. 207107/1992, filed on Jul. 10, 1992, which stores a peak value of high-frequency components of brightness signals in a measurement frame, and automatically tracks its movement as characteristics of the subject.
Secondly, an automatic subject tracking method has been disclosed in Japanese Patent Application No. 322652/1992, filed on Nov. 7, 1992, which performs automatic tracking by forming a motion vector through matching of representative points for brightness signals of front and back fields in a measurement frame, and by assuming the motion vector to be the movement of the subject.
The first automatic tracking method basically utilizes signals at their peak, so that it is vulnerable to noise. Therefore, automatic tracking may not be attained in a shooting environment with low illuminance. Moreover, since in principle it extracts high-frequency components, automatic tracking may not be attained for a subject with low contrast.

Furthermore, in the second automatic tracking method, it is difficult to determine whether the calculated motion vector is caused by movement of the hand holding the video camera or by movement of the subject, so that a malfunction may arise in practical use.
`
SUMMARY OF THE INVENTION

In view of the foregoing, an object of this invention is to provide a video signal follow-up processing system which can easily and surely perform automatic follow-up operation for movement of a subject by stably and effectively extracting features of the subject on a screen.
The foregoing object and other objects of the invention have been achieved by the provision of a video signal follow-up processing system, comprising: pixel information forming means (1, 5, 6, 7, 14, 26, 27) for forming pixel information constituting a display screen PIC based on pickup signals obtained through a lens block 1; reference measurement frame setting means (16, 17, 15, SP2) for setting a reference measurement frame FMXR with a predetermined size at a predetermined position on the display screen PIC; detection measurement frame setting means (16, 17, 15, SP3) for setting a detection measurement frame (FMXD) with a predetermined size on the display screen PIC; standard frequency characteristic data forming means (19, 20, 16, SP2) for forming standard frequency characteristic data YStd(i) and HueStd(i) for a brightness level and/or hue angle, based on brightness and/or hue information on an image in the reference measurement frame FMXR; detection frequency characteristic data forming means (19, 20, 16, SP3) for forming detection frequency characteristic data Y(x, y)(i) and Hue(x, y)(i) for a brightness level and/or hue angle, based on brightness and/or hue information on an image in the detection measurement frame (FMXD); similarity calculating means (16, SP12) for calculating the similarity of standard frequency characteristic data and/or detection frequency characteristic data by several methods; effective similarity calculation method selection means (16, SP13) for selecting the most effective similarity data on the screen among a plurality of similarities obtained from the similarity calculation; detection measurement frame determination means (16, SP5, SP15) for determining a detection measurement frame with a higher similarity by using the similarity calculation method selected by the effective similarity calculation method selection means; and pixel information modification control means (16, SP6) for controlling the pixel information forming means (1, 5, 6, 7) so as to match the position of the pixel information in the determined detection measurement frame with the position of the reference measurement frame.
The pixel information in the standard measurement frame FMXR on the subject is converted into the standard frequency characteristic data YStd(i) and HueStd(i) for the brightness level and/or hue angle by the standard frequency characteristic data forming means (19, 20, 16, SP2). The pixel information in the detection measurement frame (FMXD) is converted into the detection frequency characteristic data Y(x, y)(i) and Hue(x, y)(i) for the brightness level or hue angle by the detection frequency characteristic data forming means (19, 20, 16, SP3). The similarity is found by several methods for the standard frequency characteristic data and the detection frequency characteristic data in the similarity calculation means (16, SP12, SP15). The most effective similarity calculation method for that image is determined in the effective similarity calculation method selection means (16, SP12). The detection frequency characteristic data which has the highest similarity on the screen, found by that calculation method, is determined by the detection measurement frame determination means (16, SP13, SP15). The drive of the pixel information forming means (1, 5, 6, 7) is controlled by the pixel information modification control means (16, SP6, SP16) so that the determined pixel information in the detection measurement frame is contained in the reference measurement frame.

Thus, as follow-up control can be performed to always contain the subject in the standard frame on the display screen, and the features of an image are arranged to be represented by frequency characteristic data for that purpose, it is possible to attain a video signal follow-up system that can be constructed in a relatively simple manner and that can surely follow up and operate.
As described above, according to this invention, it is possible to easily attain a video signal follow-up processing system that, when a subject moves in the field-of-view image, adaptively responds to such change so that the image projected on the display screen can surely follow the subject, by selecting the similarity calculation method best suited within the screen based on the frequency characteristic data for brightness and hue information in a predetermined measurement frame, and by measuring the position of a detection measurement frame with a higher similarity through use of such method to control the position of the subject projected on the display screen.

The nature, principle and utility of the invention will become more apparent from the following detailed description when read in conjunction with the accompanying drawings, in which like parts are designated by like reference numerals or characters.
`
BRIEF DESCRIPTION OF THE DRAWINGS

In the accompanying drawings:

FIG. 1 is a block diagram illustrating a first embodiment of the video signal follow-up processing system according to this invention;
FIG. 2 is a schematic diagram explaining an HLS color coordinate system representing a visual stimulus;
FIG. 3 is a schematic diagram explaining a reference measurement frame FMXR;
FIG. 4 is a characteristic curvilinear diagram showing standard hue frequency characteristics obtained from the reference measurement frame FMXR shown in FIG. 3;
FIG. 5 is a characteristic curvilinear diagram showing standard brightness frequency characteristics obtained from the reference measurement frame FMXR in FIG. 3;
FIG. 6 is a schematic diagram showing a membership function filter used for finding the frequency characteristics;
FIGS. 7A and 7B are characteristic curvilinear diagrams showing effects of noise in generating the frequency characteristics;
FIGS. 8A and 8B are characteristic curvilinear diagrams showing the frequency characteristics when the membership filter is used;
FIG. 9 is a flowchart showing an automatic follow-up processing procedure;
FIG. 10 is a schematic diagram showing a detection measurement frame;
FIG. 11 is a characteristic curvilinear diagram showing detection hue frequency characteristics;
FIG. 12 is a characteristic curvilinear diagram showing detection brightness frequency characteristics;
FIG. 13 is a flowchart showing a similarity calculation processing routine;
FIG. 14 is a table listing nine types of similarity calculation methods;
FIGS. 15A and 15B are characteristic curvilinear diagrams explaining a specific similarity calculation method;
FIG. 16 is a schematic perspective view showing distance data found on the display screen;
FIG. 17 is a block diagram showing a second embodiment; and
FIG. 18 is a block diagram showing a third embodiment.
`
DETAILED DESCRIPTION OF THE EMBODIMENT

Preferred embodiments of this invention will be described with reference to the accompanying drawings.

1 First embodiment

(1-1) Overall arrangement

Referring to FIG. 1, VCS generally shows a video camera system. An image pickup light LA from a subject is image-formed on an image pickup element 4, which consists of solid-state image pickup elements such as charge-coupled devices (CCDs), through a lens 2 and an iris 3 in a lens block 1, and then image pickup output signal S1, which represents an image of a field of view including an image of the subject, is provided to a signal separation/automatic gain control circuit 5.
The signal separation/automatic gain control circuit 5 sample-holds the image pickup output signal S1, and simultaneously controls gain by a control signal from an auto iris "AE" (not shown) so that the image pickup output signal S1 has a predetermined gain. The signal separation/automatic gain control circuit 5 then supplies the image pickup output signal S2 obtained in this manner to a digital camera processing circuit 7 through an analog-to-digital conversion circuit 6.
The digital camera processing circuit 7 forms brightness signal "Y" and chroma signal "C" based on the image pickup output signal S2, and sends out the brightness signal Y and the chroma signal C as video output signal S3 through a digital-to-analog conversion circuit 8.
In addition, the digital camera processing circuit 7 supplies the brightness signal Y and two color difference signals "R-Y" and "B-Y" to a follow-up control circuit 11 as subject follow-up detection signal S4. Based on the subject follow-up detection signal S4, the follow-up control circuit 11 generates a follow-up control signal S5 for a panning drive motor 12A and a tilting drive motor 12B provided for the lens block 1.
The follow-up control circuit 11 sequentially passes the brightness signal Y and the color difference signals R-Y and B-Y through a low-pass filter 26 and a decimation circuit 27 to supply them to a saturation/hue detection circuit 14, so that hue signal HUE and saturation signal SAT are formed. They are stored in an image memory 15 together with the brightness signal Y as subject follow-up control image data S10 for each pixel.
The decimation circuit 27 is arranged to process the brightness signal Y and the color difference signals R-Y and B-Y by thinning them out, sampling them once every several pixels, so that the amount of data to be stored in the image memory 15 is reduced and the circuit configuration can be simplified.
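As a minimal sketch of this thinning step; the decimation factor of 4 is an assumption for illustration, since the text says only "every several pixels":

```python
def decimate(samples, step=4):
    """Keep one sample every `step` pixels of a scan line, reducing
    the data written to the image memory."""
    return samples[::step]

line = list(range(16))
small = decimate(line)        # every fourth sample survives
```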
The saturation/hue detection circuit 14 is arranged to form the hue signal HUE and the saturation signal SAT by rectangular coordinate/curvilinear coordinate conversion of the color difference signals R-Y and B-Y, so that, in the configuration of the later stage, the subject can be recognized based on the visual stimulus which human beings can perceive, by using the brightness signal Y, the hue signal HUE, and the saturation signal SAT.
In this connection, the visual stimulus which a human being can perceive is expressed by color coordinates of the HLS system, having an "L" axis and an SH plane rectangular thereto, as shown in FIG. 2.

The L axis represents lightness and corresponds to the brightness signal Y. The "SH" plane is expressed by curvilinear coordinates rectangular to the L axis. In the SH plane, "S" represents saturation and is expressed by a distance from the L axis; "H" represents hue and is expressed by an angle assuming the direction of the color difference signal R-Y to be 0°.
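The conversion performed by the saturation/hue detection circuit can be sketched from this geometry (saturation as distance from the L axis, hue as angle from the R-Y direction); a minimal illustration, not the circuit's actual implementation:

```python
import math

def saturation_hue(r_y, b_y):
    """Rectangular-to-curvilinear conversion of the color difference
    signals (R-Y, B-Y), as the saturation/hue detection circuit does."""
    # Saturation: distance of the point (R-Y, B-Y) from the L axis.
    sat = math.hypot(r_y, b_y)
    # Hue: angle of that point, with 0 degrees along the R-Y direction,
    # normalised into the 0-359 range used by the hue histograms.
    hue = math.degrees(math.atan2(b_y, r_y)) % 360.0
    return sat, hue

sat, hue = saturation_hue(1.0, 0.0)   # a pure R-Y signal sits at hue 0
```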
When the light source becomes brighter, all colors of a solid in the HLS system become white, and simultaneously the SH plane rises up along the L axis; at that moment, the saturation S decreases. On the other hand, when the light source becomes darker, all colors become black, and simultaneously the color coordinates of the SH plane go down along the L axis; at that moment, the saturation S also decreases.
Based on such characteristics of the HLS color coordinate system, the saturation S and the brightness Y are susceptible to the lightness of the light source, so that it cannot be said that they are best suited as parameters representing features of a subject. On the contrary, the hue H, representing an inherent feature quantity of the subject, is not susceptible to the lightness of the light source.

Nevertheless, if the color of the subject is near the L axis, that is, if it is white, black, or gray, the signal of the hue H becomes meaningless as information. In the worst case, it is found that an image with a low S/N ratio may have various vectors for the hue H even though it is white.
By utilizing such characteristics of the HLS color coordinate system, the follow-up control circuit 11 extracts features of the subject and, when the features vary, drives the panning drive motor 12A and the tilting drive motor 12B to follow them, so that an image signal accommodating and following the movement of the subject is obtained as video signal S3.
That is, block specifying signal S11 is supplied from a follow-up signal processing circuit 16, consisting of a microprocessor, to an address generation circuit 17 so that, as shown in FIG. 3, the pixel information stored in the image memory 15, which constitutes subject follow-up control data S10, is read by address signal S12, which divides a display screen PIC substantially formed in the image memory 15 into blocks consisting of small areas AR of predetermined size based on x-y rectangular coordinates (x, y).

Thus, each pixel's data in the image memory 15 constituting the display screen PIC is read by small area AR, to be processed as image information of one block for each small area AR.
In this embodiment, the display screen PIC is divided into 16 small areas AR in the "x" and "y" directions, respectively. Thus, by specifying coordinates x = i, y = j of rectangular coordinates (x, y) with regard to the 16×16 (= 256) small areas AR, it is possible to read out the image information I of the specified small area AR.
Of the image information I (x = i, y = j) thus read out for each small area AR from the image memory 15, the component of the hue signal HUE is supplied to a hue histogram generation circuit 19 through the gate circuit 18, while the component of the brightness signal Y is directly supplied to a brightness histogram generation circuit 20.
As shown in FIG. 4, the hue histogram generation circuit 19 determines the hue frequency characteristic HueStd(i), representing the number of pixels with hue angles in a range of 0° to 359° for the hue of pixels in a measurement frame FMX set on the display screen PIC, and sends it as a hue histogram signal S13 to the follow-up signal processing circuit 16.

Thus, the hue histogram generation circuit 19 converts the hue features of an image in the measurement frame FMX into a hue feature pattern represented by the hue frequency characteristics HueStd(i), and supplies it to the follow-up signal processing circuit 16.
Similarly, as shown in FIG. 5, the brightness histogram generation circuit 20 determines the brightness frequency characteristics YStd(i), representing the number of pixels with a brightness level in a range of 0 to 255 based on the brightness signal Y for the pixels in a measurement frame FMX, and supplies it as brightness histogram signal S14 to the follow-up signal processing circuit 16.

Thus, the brightness histogram generation circuit 20 converts the brightness features of an image in the measurement frame FMX into a brightness feature pattern represented by the brightness frequency characteristics YStd(i), to supply to the follow-up signal processing circuit 16.
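The two feature patterns can be sketched as plain frequency counts; this minimal version omits the membership-function filtering and noise gating described next:

```python
def hue_histogram(hues):
    """Count, for each hue angle 0-359, how many pixels in the
    measurement frame have that hue: the feature pattern HueStd(i)."""
    hist = [0] * 360
    for h in hues:
        hist[int(h) % 360] += 1
    return hist

def brightness_histogram(levels):
    """Count pixels per brightness level 0-255: the pattern YStd(i)."""
    hist = [0] * 256
    for y in levels:
        hist[max(0, min(255, int(y)))] += 1
    return hist

pattern = hue_histogram([10, 10, 200])   # two pixels at hue 10, one at 200
```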
By comparing these components with threshold values representing sort values (that is, hue angles 0° to 359°, and brightness levels 0 to 255), the hue histogram generation circuit 19 and the brightness histogram generation circuit 20 are arranged to sort the components of the hue signal HUE and the components of the brightness signal Y into hue angles and brightness levels to be counted as generation frequency values. To ameliorate the effect of noise during this sorting processing, as shown in FIGS. 6A to 6C, the hue histogram generation circuit 19 and the brightness histogram generation circuit 20 are arranged to pass the frequency data D1 of each sort value through a membership function filter MFF, sending out the filter output data D2 obtained at its output terminal as the hue histogram signal S13 and the brightness histogram signal S14, respectively.
This is to avoid the situation in which, in practice, when the values of the components of the hue signal HUE and the components of the brightness signal Y are close to the threshold values corresponding to each sort value, the sort values into which they are counted become uncertain depending on the presence or absence of noise. As a solution, when the frequency is counted, the data are converted into a form to which ambiguity is introduced by a membership function.
For example, for the components of the hue signal HUE, there may arise a case where, as shown in FIGS. 7A and 7B, while the hue frequency characteristic Hue(i) for a hue sort value HUE = 4 is Hue(i) = 5 in the N-th field, the hue frequency characteristic Hue(i) for a hue sort value HUE = 3 is determined to be Hue(i) = 5 in the (N+1)-th field because of noise, although the image is the same.

In this case, the hue frequency characteristics Hue(i) obtained by using the filter output data D2, which is the frequency data D1 after passing through the membership function filter MFF, can make the hue frequency distribution substantially the same regardless of the presence or absence of noise, as shown in FIGS. 8A and 8B in correspondence with FIGS. 7A and 7B.
Thus, when the frequency characteristics are obtained, it is possible to effectively suppress the effect of noise contained in the components of the hue signal HUE and the brightness signal Y.
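A sketch of such a membership-function filter; the triangular weights are an assumed shape, since the figure defining MFF is not legible in this scan. Each count is spread partially into neighbouring sort values, so a one-bin shift caused by noise changes the filtered pattern only slightly:

```python
def membership_filter(freq, kernel=(0.5, 1.0, 0.5)):
    """Pass frequency data D1 through a membership-function filter MFF
    by spreading each sort value's count into its neighbours."""
    n, half = len(freq), len(kernel) // 2
    out = [0.0] * n
    for i, f in enumerate(freq):
        for k, w in enumerate(kernel):
            j = i + k - half
            if 0 <= j < n:          # counts falling off the ends are dropped
                out[j] += w * f
    return out

noisy_a = [0, 0, 0, 0, 5, 0, 0, 0]   # the count lands in bin 4 ...
noisy_b = [0, 0, 0, 5, 0, 0, 0, 0]   # ... or in bin 3, due to noise
a, b = membership_filter(noisy_a), membership_filter(noisy_b)
```

After filtering, both patterns put weight on bins 3 and 4, so a distance measure between them is far smaller than between the raw, disjoint histograms.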
A hue noise gate signal formation circuit 25 of comparator construction is provided for the gate circuit 18. It is arranged not to input the hue signal HUE of a pixel to the hue histogram generation circuit 19, by comparing the hue signal HUE read out from the image memory 15 for each pixel with the noise determination signal S15 sent out from the follow-up signal processing circuit 16, and supplying to the gate circuit 18 a gate signal S16 which causes the gate circuit 18 to close when the hue signal HUE is at a predetermined level or less.
When the hue signal HUE detected at the saturation/hue detection circuit 14 is close to the L axis (shown in FIG. 2), there is a possibility that the hue signal HUE has no meaning as information, since its low saturation leaves it buried in noise. Such a meaningless hue signal HUE is removed in the gate circuit 18.
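A sketch of the gating idea. Here the decision is modeled directly on saturation, which is the stated rationale (hue near the L axis has low saturation); the actual circuit compares the hue signal against the noise determination signal S15, so this is an assumption for illustration:

```python
def gate_hue(hue, sat, noise_level):
    """Model of gate circuit 18: exclude a pixel's hue from the hue
    histogram when its saturation is at the noise level or less,
    returning None for a gated-out pixel."""
    return None if sat <= noise_level else hue

# Only the well-saturated pixel contributes its hue to the histogram.
kept = [h for h in (gate_hue(120, s, 10) for s in (3, 50, 8)) if h is not None]
```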
(1-2) Operation of automatic follow-up processing

With the above arrangement, by performing the automatic follow-up processing procedure RT1 shown in FIG. 9, the follow-up signal processing circuit 16 forms a brightness detected feature pattern and a hue detected feature pattern at the brightness histogram generation circuit 20 and the hue histogram generation circuit 19 based on the brightness signal Y and the hue signal HUE for each pixel taken into the image memory 15. Thereby, they are compared with an image portion in a reference measurement frame so that the panning and tilting operation of the lens block 1 is adaptively controlled to always move the position of a detected measurement frame having an image with the highest similarity to the signal of the reference measurement frame.
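One pass of this control behavior might be sketched as follows; `similarity` and `pan_tilt_toward` are hypothetical stand-ins for routine RT2 and the motor drive, and `frames` maps each candidate frame position to its feature pattern, so this is a hedged outline rather than the circuit's actual procedure:

```python
def follow_up_step(standard_pattern, frames, similarity, pan_tilt_toward):
    """One pass of procedure RT1: find the detection-frame position
    whose feature pattern best matches the standard pattern (smallest
    evaluation value) and steer it toward the screen centre."""
    best_pos = min(frames,
                   key=lambda pos: similarity(standard_pattern, frames[pos]))
    pan_tilt_toward(best_pos)
    return best_pos

def l1(p, q):
    # A simple histogram distance standing in for the selected method.
    return sum(abs(a - b) for a, b in zip(p, q))

moves = []
pos = follow_up_step([1, 0], {(0, 0): [0, 1], (5, 5): [1, 0]}, l1, moves.append)
```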
That is, when the follow-up signal processing circuit 16 enters the automatic follow-up processing procedure RT1, it initializes the frame number FN to FN = 0 in step SP1, and simultaneously waits for the user to release the recording pause state through operation of the recording pause button RECPAUSE.

In this state, if the user releases the recording pause, the follow-up signal processing circuit 16 proceeds to step SP2 to perform the processing described for FIGS. 3 to 5, in which the reference measurement frame FMXR at the center of the screen is specified by the address generation circuit 17, and the brightness signal Y and the hue signal HUE corresponding to the pixels in the reference measurement frame FMXR are sent to the brightness histogram generation circuit 20 and the hue histogram generation circuit 19, so as to take in the brightness histogram signal S14 and the hue histogram signal S13, having the standard brightness frequency characteristics YStd(i) (FIG. 5) and the standard hue frequency characteristics HueStd(i) (FIG. 4), as the standard feature pattern.
Then, the follow-up signal processing circuit 16 proceeds to step SP3, as shown in FIG. 10, to scan the position of the detection measurement frame FMXD by the address generation circuit 17, thereby extracting the pixel information on the display screen PIC for each position of the detection measurement frame FMXD.
In this embodiment, the detection measurement frame FMXD consists of 4×4 small areas AR, similar to the standard detection frame FMXR. The address generation circuit 17 scans the address of the small area at the upper left corner of the detection measurement frame FMXD by sequentially specifying positions from left to right and from top to bottom.
As a result, the detection measurement frame FMXD scans by sequentially shifting addresses, such as (x, y) = (0, 0), (1, 0), ..., (12, 0), (0, 1), (1, 1), ..., (12, 1), ..., (0, 12), (1, 12), ..., (12, 12). During such scanning, the follow-up
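The address sequence described above can be sketched as a sliding-window scan; the grid and frame sizes follow the embodiment (16×16 small areas, 4×4 measurement frame):

```python
def scan_positions(grid=16, frame=4):
    """Upper-left addresses visited when a frame x frame measurement
    window is slid over a grid x grid screen of small areas, left to
    right and top to bottom: (0,0), (1,0), ..., (12,0), (0,1), ...,
    (12,12) for the embodiment's sizes."""
    last = grid - frame           # 12 for the 16x16 grid and 4x4 frame
    return [(x, y) for y in range(last + 1) for x in range(last + 1)]

positions = scan_positions()      # 13 x 13 = 169 candidate positions
```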
