US006160579A

United States Patent [19]
Shiraiwa et al.

[11] Patent Number: 6,160,579
[45] Date of Patent: *Dec. 12, 2000

[54] IMAGE PROCESSING APPARATUS AND METHOD

[75] Inventors: Yoshinobu Shiraiwa, Machida; Yoshiro Udagawa, Saitama-ken; Kenji Takahashi and Eiichiro Ikeda, both of Kawasaki; Yumiko Hidaka, Inagi, all of Japan

[73] Assignee: Canon Kabushiki Kaisha, Japan

[*] Notice: This patent issued on a continued prosecution application filed under 37 CFR 1.53(d), and is subject to the twenty year patent term provisions of 35 U.S.C. 154(a)(2).

[21] Appl. No.: 08/689,054

[22] Filed: Jul. 30, 1996

[30] Foreign Application Priority Data
    Aug. 1, 1995  [JP] Japan ................ 7-196677
    Aug. 11, 1995 [JP] Japan ................ 7-205886
    Aug. 23, 1995 [JP] Japan ................ 7-214552

[51] Int. Cl. ............................... H04N 9/04
[52] U.S. Cl. ........................ 348/224; 348/228
[58] Field of Search ..... 348/222, 223, 224, 225, 228, 229, 230, 231

[56] References Cited

U.S. PATENT DOCUMENTS
4,901,152   2/1990  Hieda et al. ............ 348/228
5,111,289   5/1992  Lucas ................... 348/148
5,295,001   3/1994  Takahashi ............... 358/482
5,347,371   9/1994  Nishimura ............... 348/228
5,406,391   4/1995  Takahashi ............... 358/482
5,424,774   6/1995  Takayama et al. ......... 348/222
5,448,292   9/1995  Matsui .................. 348/225
5,488,414   1/1996  Hirasawa et al. ......... 348/207
5,796,428   8/1998  Matsumoto ............... 348/231

FOREIGN PATENT DOCUMENTS
0258673   3/1988  European Pat. Off.
0363988   4/1990  European Pat. Off.
0502369   9/1992  European Pat. Off.
5-183789  7/1993  Japan
7-131796  5/1995  Japan
2182821   5/1987  United Kingdom

Primary Examiner: Bryan Tung
Attorney, Agent, or Firm: Fitzpatrick, Cella, Harper & Scinto
[57] ABSTRACT

In converting image sensing data into image data, a plurality of image sensing data meeting a predetermined condition are processed as a group of image sensing data. An image reproduction parameter is obtained from this image sensing data group, and each image sensing data of the image sensing data group is converted into image data by using the image reproduction parameter. Accordingly, an image reproduction parameter for obtaining an optimum reproduced image can be accurately set from the image sensing data group. Also, since a reproduction luminance level (range) is determined from the image sensing data group, the correlation between the luminances of image planes is not lost. This allows an easy comparison of reproduced images and prevents the boundaries of luminances from becoming unnatural when the reproduced images are synthesized.

8 Claims, 15 Drawing Sheets
`
`
[Representative drawing (corresponding to FIG. 3): a color image sensing unit and an image sensing data memory feed a color image reproduction processor 30, which comprises an image sensing data holding unit, an image reproduction parameter determining unit, an image reproduction processing unit, an image sensing data selecting unit, and an image sensing data selection designating unit, and which outputs to a reproduced image data memory.]
[Drawing Sheet 1 of 15: FIG. 1 (Prior Art) and FIG. 2 (Prior Art): block diagrams of configurations for performing color temperature correction.]

[Drawing Sheet 2 of 15: FIG. 3: block diagram of the image reproducing apparatus of the first embodiment (color image sensing unit, image sensing data memory, color image reproduction processor, color image reproducing display, and reproduced image data memory).]

[Drawing Sheet 3 of 15]
FIG. 4: flow chart of the image processing procedure:
START
S1: Designate image sensing data to be selected.
S2: Read out the selected image sensing data from the image sensing data stored in the image sensing data memory.
S3: Hold the group of selected image sensing data.
S4: Calculate or correct the image reproduction parameter by using the held image sensing data group.
S5: Perform image reproduction processing by using the determined image reproduction parameter.
END
[Drawing Sheet 4 of 15: FIG. 5: view for explaining an example of a synthetic image.]
[Drawing Sheet 5 of 15: FIG. 6: view for explaining composition information of an object; FIG. 7: view for explaining image sensing data having a time continuity.]
[Drawing Sheet 6 of 15]
FIG. 8: flow chart of image sensing data grouping processing:
START
S11: Load the selection condition of image sensing data.
Load image sensing data.
Obtain color temperature information.
Does the color temperature information of the image sensing data meet the selection condition of image sensing data? If so, send the image sensing data to the image sensing data holding unit; if not, skip the data.
Are all image sensing data stored in the image sensing data memory loaded? If YES, end; otherwise repeat.
[Drawing Sheet 7 of 15: FIG. 9: example of a chromaticity diagram.]
[Drawing Sheet 8 of 15: FIG. 10: range of the correlated color temperature of image sensing data represented by (R/G, B/G) signal values; the R/G axis is marked at (R/G)o.]
[Drawing Sheet 9 of 15]
FIG. 11: flow chart of image sensing data grouping processing when a composition is designated as the grouping condition:
S21: Load the selection condition of image sensing data.
S22: Load image sensing data.
S23: Perform simplified image reproduction.
S24: Can a determination from the user be expected?
If YES: S25: Display the simple reproduced image on a monitor; S26: Execute image selection determination by the user.
If NO: S27: Form edge image data; S28: Check the spatial correlation between edges in the image; select image sensing data with a high spatial correlation between edges.
S30: Send the selected image sensing data to the image sensing data holding unit.
[Drawing Sheet 10 of 15]
FIG. 12: flow chart of grouping image sensing data by using information appended to the image sensing data:
Load the selection condition of image sensing data.
Load image sensing data.
Is information meeting the selection condition appended?
Does the appended information meet the selection condition? If so, send the image sensing data to the image sensing data holding unit.
Are all image sensing data stored in the image sensing data memory read out? If not, repeat; otherwise end.
[Drawing Sheet 11 of 15: FIG. 13: schematic block diagram of the overall configuration of the color image reproducing apparatus according to the second embodiment, including a complementary color-pure color conversion function determining unit.]
[Drawing Sheet 12 of 15: FIG. 14: schematic block diagram of the overall configuration of the color image reproducing apparatus according to the third embodiment.]
[Drawing Sheet 13 of 15: FIG. 15: schematic block diagram of the overall configuration of the color image reproducing apparatus according to the fourth embodiment (photographing unit, light source detecting unit, WB coefficient determining unit, complementary color-pure color converting unit, gamma correcting unit, and conversion function storage and determining/selecting units).]
[Drawing Sheet 14 of 15: FIG. 16: block diagram of the image processing apparatus according to the fifth embodiment (image data input and a color image reproduction processor).]
[Drawing Sheet 15 of 15]
FIG. 17: flow chart of the operation of the color image reproduction processor shown in FIG. 16:
Is an image position indicated? If NO, obtain reproduced image data by executing normal white balance processing.
S102: Is image data input?
If YES, determine the image reproduction parameter so that the image sensing data in the indicated position is converted into the input image data.
If NO, determine the image reproduction parameter so that the image sensing data in the indicated position is converted into reproduced image data indicating white.
Obtain reproduced image data by executing color balance processing by using the obtained image reproduction parameter.
IMAGE PROCESSING APPARATUS AND METHOD
BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to an image processing apparatus and method and, more particularly, to an image processing apparatus and method of adjusting the hue of an input image signal.

The present invention also relates to an image reproducing method and apparatus and, more particularly, to an image reproducing method and apparatus for converting an image sensing signal obtained from an image sensing device, such as an image sensing tube or a CCD, into a visualizable image signal, e.g., an NTSC-RGB signal.

2. Description of the Related Art

In a television camera using an image sensing device such as a CCD, some image reproduction parameters are generally determined from image sensing data during image reproduction processing, in order to constantly obtain images which apparently give the same impression or to obtain as faithful reproduced images as possible regardless of deterioration with time of the image sensing device or a color filter and changes in an illuminating light source. The image sensing data is two-dimensional digital image data formed from an image signal obtained by photographing an object by using an image sensing device.
Examples of the image reproduction parameters are a color temperature and a reproduction luminance level. The image reproduction parameters are used to correct the color temperature or set the reproduction luminance level.

More specifically, the correction of the color temperature is to adjust a so-called white balance so that an object which is supposed to look white looks white. Generally, this color temperature correction is performed on the basis of image sensing data. That is, data of an object which is supposed to look white is extracted from image sensing data, and a white balance coefficient as one image reproduction parameter is determined on the basis of the extracted data. In the white balance adjustment, a plurality of color component signals constituting an output image signal from an image sensing device are amplified in accordance with the white balance coefficient. Consequently, the signal levels of the color components constituting the image signal of the object which is supposed to look white are so adjusted as to be equal to each other.
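As a rough sketch of the adjustment just described (illustrative only; the array layout and the choice of green as the reference channel are assumptions, not part of this disclosure), the white balance coefficient can be pictured as per-channel gains computed from image sensing data of a region judged to be white:

```python
import numpy as np

def white_balance_gains(white_region):
    """Per-channel gains from pixels of an object that is supposed to look white.

    white_region: array of shape (N, 3) holding (R, G, B) image sensing data
    extracted from the region judged to be white.
    """
    means = white_region.reshape(-1, 3).mean(axis=0)  # average R, G, B of the region
    return means[1] / means                           # equalize the channels to G

def apply_white_balance(image, gains):
    """Amplify each color component signal by its white balance coefficient."""
    balanced = image.astype(np.float64) * gains
    return np.clip(balanced, 0, 255)

# A bluish cast: the "white" samples are not equal, so the gains compensate.
white_patch = np.array([[180.0, 200.0, 220.0]] * 16)
gains = white_balance_gains(white_patch)
image = np.random.default_rng(0).uniform(0, 255, size=(4, 4, 3))
print(gains)                       # roughly [1.11, 1.00, 0.91]
print(apply_white_balance(image, gains).mean())
```

Multiplying every pixel by these gains equalizes the signal levels of the color components for the white object, which is the effect described above.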
The setting of the reproduction luminance level is done by calculating a luminance distribution from image sensing data and setting an optimum reproduction luminance level (range). The parameter is adjusted such that a reproduced image is obtained within this range, and the image is reproduced.
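The luminance-range setting can be sketched in the same illustrative spirit (the percentile choice and the simple luminance estimate below are mine, not taken from this disclosure):

```python
import numpy as np

def reproduction_luminance_range(sensing_data, low_pct=1.0, high_pct=99.0):
    """Derive a reproduction luminance level (range) from the luminance distribution."""
    luminance = sensing_data.mean(axis=-1)             # crude luminance per pixel
    low, high = np.percentile(luminance, [low_pct, high_pct])
    return low, high

def reproduce_within_range(sensing_data, luminance_range, out_max=255.0):
    """Map the image sensing data so the chosen range fills the output scale."""
    low, high = luminance_range
    scaled = (sensing_data - low) / max(high - low, 1e-6) * out_max
    return np.clip(scaled, 0.0, out_max)

data = np.random.default_rng(1).normal(120.0, 30.0, size=(8, 8, 3))
lum_range = reproduction_luminance_range(data)
print(lum_range, reproduce_within_range(data, lum_range).max())   # chosen range, clipped maximum
```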
FIGS. 1 and 2 are block diagrams showing configurations for performing the color temperature correction.

Referring to FIG. 1, complementary color data (consisting of color component signals of magenta Ma, green Gr, yellow Ye, and cyan Cy) obtained by an image sensing unit 1 is supplied to a complementary color-pure color converting unit 11. The complementary color data is converted into pure color data (consisting of color component signals of red R, green G, and blue B) in the converting unit 11. The white balance of the pure color data obtained by the complementary color-pure color converting unit 11 is adjusted by a white balance (WB) adjusting unit 12 in the subsequent stage, and the gamma of the data is corrected by a gamma correcting unit 4.
In the configuration shown in FIG. 1 as above, the WB adjusting unit 12 is arranged subsequently to the complementary color-pure color converting unit 11, and the color temperature correction is done by performing the white balance adjustment for the pure color data (R, G, B) after complementary colors are converted into pure colors. This configuration is advantageous in that the color temperature correction can be relatively easily performed because the gain of the pure color data (R, G, B) can be directly adjusted.

In the configuration shown in FIG. 2, on the other hand, a WB adjusting unit 2 adjusts the white balance of complementary color data (Ma, Gr, Ye, Cy) obtained by an image sensing unit 1. Thereafter, a complementary color-pure color converting unit 3 performs complementary color-pure color conversion to obtain pure color data (R, G, B). This configuration has the advantage that a luminance signal with a higher resolution than that obtained in the configuration shown in FIG. 1 can be easily obtained.
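The difference between the two orderings can be made concrete with a short sketch; the 3x4 conversion matrix below is a made-up placeholder (a real matrix depends on the color filter), and only the position of the white balance step matters here:

```python
import numpy as np

# Placeholder complementary color-pure color conversion matrix (illustrative values only).
COMP_TO_PURE = np.array([
    #   Ma    Gr    Ye    Cy
    [ 0.5, -0.5,  0.5, -0.5],   # R
    [-0.5,  0.5,  0.5,  0.5],   # G
    [ 0.5,  0.5, -0.5,  0.5],   # B
])

def fig1_order(comp_data, wb_gains_rgb):
    """FIG. 1 ordering: convert (Ma, Gr, Ye, Cy) to (R, G, B) first, then adjust white balance."""
    pure = comp_data @ COMP_TO_PURE.T
    return pure * wb_gains_rgb              # gains act directly on the pure color data

def fig2_order(comp_data, wb_gains_comp):
    """FIG. 2 ordering: adjust white balance on the complementary color data, then convert."""
    balanced = comp_data * wb_gains_comp    # gains act on (Ma, Gr, Ye, Cy)
    return balanced @ COMP_TO_PURE.T

pixel = np.array([[100.0, 120.0, 110.0, 90.0]])          # one pixel of complementary color data
print(fig1_order(pixel, np.array([1.1, 1.0, 0.9])))
print(fig2_order(pixel, np.array([1.0, 1.05, 0.95, 1.0])))
```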
The method of adjusting the hue of an image by adjusting the white balance is effective when many objects which are supposed to look white exist in an image signal obtained from an image sensing device. However, in specific instances no such object which is supposed to look white exists in an image signal, or only a very few such objects exist in an image signal. In these instances, therefore, it is in principle impossible to adjust the hue by adjusting the white balance. In such instances, the general approach is to average the image sensing data of one image plane for each color component and adjust the white balance by using the average. However, a color indicated by the obtained average is not necessarily white (the color of a light source), and so the white balance cannot be accurately adjusted.

That is, the white balance coefficient cannot be accurately set if it is determined from image sensing data in order to obtain an optimum reproduced image.
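The averaging approach mentioned above corresponds to a tiny sketch like the following (again only an illustration; the reference channel is assumed to be green), and the same sketch shows why it fails when the scene average is not the color of the light source:

```python
import numpy as np

def gray_world_gains(sensing_data):
    """White balance gains from the per-component average of one image plane."""
    avg = sensing_data.reshape(-1, 3).mean(axis=0)   # average R, G, B over the whole plane
    return avg[1] / avg                              # equalize the averages to the G level

# A uniformly reddish scene: the average is not the light source color,
# so these gains over-correct the image toward cyan.
scene = np.full((16, 16, 3), [160.0, 110.0, 100.0])
print(gray_world_gains(scene))
```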
Also, in the setting of the reproduction luminance level, if the reproduction luminance level (range) is determined for each image plane, the correlation between the luminances of a plurality of image planes is lost. This makes the comparison of reproduced images difficult, or the connection of luminances becomes unnatural when the reproduced images are synthesized.

For example, the above inconveniences are significant when an object which is originally, desirably photographed as one image plane is divisionally photographed because the photographing area is small, and one image plane is formed by synthesizing the image sensing data of the obtained image planes.

That is, in the method of obtaining an image reproduction parameter for each image sensing data of one image plane, it is impossible to obtain a reproduced image which is used when information between a plurality of images is extracted by comparing and analyzing the images, such as when physical property information is obtained from luminance information. Also, if the reflectance of an object spatially and gradually changes, individual image sensing data obtained by divisionally photographing the object have different luminance levels (ranges). If images are reproduced by independently optimizing these image sensing data, the correlation between luminances originally corresponding to the respective image sensing areas is lost in the reproduced images. Accordingly, if one image is formed by synthesizing these images taken in the respective image sensing areas, an unnatural synthetic image in which the correlation between luminances is lost results.
The hue of an image is adjusted by adjusting the white balance as follows. An object which is supposed to look white under a certain photographing light source is photographed. The amplification factor of each of a plurality of color component signals constituting an image signal obtained from the image sensing device is so adjusted that the white object accurately looks white when the image signal is reproduced. That is, it can be considered that the white balance adjustment is performed to compensate for changes in the light source during photography.

Commonly, the white balance adjustment described above is a principal means for compensating for changes in the light source during photography. A white balance coefficient used in this white balance adjustment is obtained on the basis of information of the light source during photography.

Of a plurality of different image reproduction parameters used in image reproduction, some parameters are preferably obtained on the basis of information of the light source during photography, like the image reproduction parameter (white balance coefficient) used in the white balance adjustment. An example is a complementary color-pure color conversion matrix used to convert an image signal obtained by using a complementary color filter into a pure color signal.
The complementary color-pure color conversion matrix is determined by the spectral transmittance characteristic of a complementary color filter. Usually, the spectral transmittance characteristic of a complementary color filter is not ideal. The influence of this difference from the ideal characteristic changes in accordance with the characteristics of the light source during photography. That is, a complementary color-pure color conversion matrix optimally selected under a certain photographing light source gives an optimum complementary color-pure color conversion result under this light source. However, this matrix does not give suitable conversion results for all light sources.

When a photographing light source changes, therefore, it is desirable to change the complementary color-pure color conversion matrix in accordance with the light source. Also, the above two image reproduction parameters, i.e., the white balance coefficient and the complementary color-pure color conversion matrix, are related to each other under a certain photographing light source. Accordingly, it is undesirable to determine these parameters individually.

Generally, however, the complementary color-pure color conversion is performed by using a semi-fixed complementary color-pure color conversion matrix which is optimally set under a certain photographing light source. If the photographing light source changes, therefore, the influence of the difference of the spectral transmittance characteristic of a complementary color filter from the ideal characteristic increases. Also, a contradiction sometimes occurs between the white balance coefficient and the complementary color-pure color conversion matrix having the correlation. Consequently, no complementary color-pure color conversion can be properly performed, and this makes faithful reproduction of an image difficult.
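One way to make the consistency point concrete is a sketch in which both parameters are looked up from a single estimate of the photographing light source; the color temperatures, matrices, and coefficients below are placeholders of mine, not values from this disclosure:

```python
import numpy as np

BASE = np.array([                      # placeholder complementary color-pure color matrix
    [ 0.5, -0.5,  0.5, -0.5],
    [-0.5,  0.5,  0.5,  0.5],
    [ 0.5,  0.5, -0.5,  0.5],
])

# Hypothetical per-light-source tables keyed by correlated color temperature (kelvin).
CONVERSION_MATRICES = {3000: BASE * 0.98, 5500: BASE, 7500: BASE * 1.03}
WB_COEFFICIENTS = {
    3000: np.array([0.80, 1.00, 1.30]),
    5500: np.array([1.00, 1.00, 1.00]),
    7500: np.array([1.20, 1.00, 0.85]),
}

def parameters_for_light_source(estimated_kelvin):
    """Pick both parameters from one light source estimate so they cannot contradict."""
    nearest = min(CONVERSION_MATRICES, key=lambda k: abs(k - estimated_kelvin))
    return CONVERSION_MATRICES[nearest], WB_COEFFICIENTS[nearest]

matrix, wb = parameters_for_light_source(5200)
print(matrix.shape, wb)                # both taken from the 5500 K entry
```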
SUMMARY OF THE INVENTION

The present invention has been made to individually or collectively solve the above conventional problems, and has as its object to provide an image processing apparatus and method capable of accurately setting, from a group of image sensing data, an image reproduction parameter for obtaining an optimum reproduced image.

To achieve the above object, one preferred embodiment of the present invention discloses an image processing apparatus for converting image sensing data obtained by
image sensing means into a visualizable image signal by using an image reproduction parameter, comprising storage means for storing the image sensing data, designating means for designating a condition of selection of the image sensing data, selecting means for selecting image sensing data meeting the selection condition designated by the designating means from the image sensing data stored in the storage means, holding means for holding the image sensing data selected by the selecting means, setting means for setting the image reproduction parameter on the basis of the image sensing data held by the holding means, and converting means for converting the image sensing data held by the holding means into the image signal by using the image reproduction parameter set by the setting means.
The present invention has been made to individually or collectively solve the above conventional problems, and has as its object to provide an image processing apparatus and method capable of converting an image sensing signal into an image signal by using a group of more accurate image reproduction parameters meeting the condition of a photographing light source.

To achieve the above object, one preferred embodiment of the present invention discloses an image processing apparatus for converting image sensing data obtained by image sensing means into a visualizable image signal by using a plurality of different image reproduction parameters, comprising setting means for setting at least one of the different image reproduction parameters, and converting means for converting the image sensing data into the image signal by using the image reproduction parameter set by the setting means, wherein the setting means sets at least one parameter on the basis of another one of the different image reproduction parameters.
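Setting one parameter on the basis of another can be pictured with a small sketch (the tables and the nearest-match rule are hypothetical, not the disclosed method): the conversion matrix is chosen from the white balance coefficient that has already been determined.

```python
import numpy as np

def matrix_from_wb_coefficient(wb_gains, matrices_by_kelvin, wb_by_kelvin):
    """Set the conversion matrix on the basis of another parameter (the WB coefficient).

    The light source is inferred as the table entry whose WB coefficient is closest
    to the coefficient already determined; both tables are illustrative placeholders.
    """
    nearest = min(wb_by_kelvin, key=lambda k: np.linalg.norm(wb_by_kelvin[k] - wb_gains))
    return matrices_by_kelvin[nearest]

matrices = {3000: np.eye(3) * 0.98, 5500: np.eye(3), 7500: np.eye(3) * 1.03}
wb_table = {
    3000: np.array([0.80, 1.00, 1.30]),
    5500: np.array([1.00, 1.00, 1.00]),
    7500: np.array([1.20, 1.00, 0.85]),
}
print(matrix_from_wb_coefficient(np.array([1.15, 1.0, 0.9]), matrices, wb_table))
```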
The present invention has been made to individually or collectively solve the above conventional problems, and has as its object to provide an image processing apparatus and method capable of adjusting a color balance even when no object which is supposed to look white exists or only a few such objects exist.

To achieve the above object, one preferred embodiment of the present invention discloses an image processing apparatus comprising first input means for inputting an image signal; second input means for inputting position information indicating an arbitrary position of an image and image data in the position; extracting means for extracting the image data in the position corresponding to the position information from the image signal input from the first input means; setting means for setting an image processing parameter on the basis of the image data extracted by the extracting means and the input image data from the second input means; and processing means for processing the input image signal from the first input means by using the image processing parameter set by the setting means.
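A minimal sketch of that arrangement, under assumptions of my own about the data layout: the processing parameter is chosen so that the data extracted at the indicated position reproduces as the image data supplied with it, or as white when no image data is given.

```python
import numpy as np

def balance_gains_from_position(sensing_image, position, target_rgb=None):
    """Gains chosen so the pixel at `position` reproduces as `target_rgb`.

    With no target supplied, the indicated pixel is simply mapped to white,
    which reduces to ordinary white balance adjustment.
    """
    y, x = position
    source = sensing_image[y, x].astype(np.float64)
    if target_rgb is None:
        target_rgb = np.full(3, source.max())            # "white": equalize the components
    return np.asarray(target_rgb, dtype=np.float64) / np.maximum(source, 1e-6)

image = np.random.default_rng(2).uniform(10.0, 240.0, size=(6, 6, 3))
print(balance_gains_from_position(image, (2, 3)))                              # white balance
print(balance_gains_from_position(image, (2, 3), target_rgb=[200, 180, 160]))  # color balance
```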
Other features and advantages of the present invention will be apparent from the following description taken in conjunction with the accompanying drawings, in which like reference characters designate the same or similar parts throughout the figures thereof.
BRIEF DESCRIPTION OF THE DRAWINGS

FIGS. 1 and 2 are block diagrams showing configurations for performing color temperature correction;
FIG. 3 is a block diagram showing the configuration of an image reproducing apparatus according to the first embodiment of the present invention;
FIG. 4 is a flow chart showing the procedure of image processing done by the image processing apparatus of the first embodiment;
FIG. 5 is a view for explaining an example of a synthetic image;
FIG. 6 is a view for explaining composition information of an object;
FIG. 7 is a view for explaining image sensing data having a time continuity;
FIG. 8 is a flow chart showing image sensing data grouping processing;
FIG. 9 shows an example of a chromaticity diagram;
FIG. 10 is a view showing the range of a correlated color temperature of image sensing data represented by (R/G, B/G) signal values;
FIG. 11 is a flow chart showing image sensing data grouping processing when a composition is designated as the condition of grouping;
FIG. 12 is a flow chart showing processing of grouping image sensing data by using information appended to the image sensing data;
FIG. 13 is a schematic block diagram showing the overall configuration of a color image reproducing apparatus according to the second embodiment;
FIG. 14 is a schematic block diagram showing the overall configuration of a color image reproducing apparatus according to the third embodiment;
FIG. 15 is a schematic block diagram showing the overall configuration of a color image reproducing apparatus according to the fourth embodiment;
FIG. 16 is a block diagram showing the configuration of an image processing apparatus according to the fifth embodiment; and
FIG. 17 is a flow chart showing the operation of a color image reproduction processor shown in FIG. 16.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Embodiments of the present invention will be described below with reference to the accompanying drawings.

<First Embodiment>

Structure

FIG. 3 is a block diagram showing an embodiment of an image reproducing apparatus according to the present invention.

A color image sensing unit 10 such as a digital camera senses the image of an object and outputs the image sensing data of the object to an image sensing data memory 20. The image sensing data memory 20 stores the image sensing data supplied from the color image sensing unit 10.

A color image reproduction processor 30 performs predetermined image reproduction processing for the image sensing data stored in the image sensing data memory 20. For example, the color image reproduction processor 30 converts the image sensing data into digital NTSC-RGB data and outputs the digital data to a color image reproducing display 40 and a reproduced image data memory 50.

The color image reproducing display 40 includes a color video card and a monitor. The color image reproducing display 40 receives an output color image signal from the color image reproduction processor 30 or reads out a color image signal from the reproduced image data memory 50 and displays the signal as a color image on the monitor.

The reproduced image data memory 50 stores the image data reproduced by the color image reproduction processor 30. The image data stored in the reproduced image data memory 50 is supplied to and displayed by the color image reproducing display 40 as needed.
More specifically, the color image reproduction processor 30 comprises an image sensing data holding unit 31, an image sensing data selecting unit 33, an image sensing data selection designating unit 34, an image reproduction parameter determining unit 32, an image reproduction processing unit 35, and a control unit 36. The image sensing data holding unit 31 holds the digital image sensing data from the image sensing data memory 20. The image sensing data selecting unit 33 selects image sensing data from the image sensing data memory 20 and outputs the selected data to the image sensing data holding unit 31. The image sensing data selection designating unit 34 designates the condition by which the image sensing data selecting unit 33 selects image sensing data. The image reproduction parameter determining unit 32 determines an image reproduction parameter by using the image sensing data held by the image sensing data holding unit 31. The image reproduction processing unit 35 reproduces an image of the image sensing data held by the image sensing data holding unit 31 by using the image reproduction parameter determined by the image reproduction parameter determining unit 32. The control unit 36 controls these units of the color image reproduction processor 30.

The control unit 36 is constituted by, e.g., a one-chip microcontroller (MPU) and executes various processes (to be described later) in accordance with programs previously stored in an internal ROM 36a. An internal RAM 36b of the MPU is used as a work memory of the MPU.

Examples of the image reproduction parameter determined by the image reproduction parameter determining unit 32 are parameters indicating a color temperature, a white balance coefficient, a color component gain, white point information, black point information, a gamma coefficient, a gradation characteristic, a gradation conversion curve, a gradation conversion lookup table, a knee point, a dynamic range, a color gamut, light source information, a color coordinate conversion matrix coefficient, a spatial frequency characteristic, a black balance coefficient, an S/N ratio, an auto-correlation coefficient, a Wiener spectrum, an intensity (density) distribution, and a luminance distribution, and parameters obtained directly or indirectly from these pieces of information.
The image sensing data selection designating unit 34 can designate the condition of selection in order that, of the plurality of image sensing data stored in the image sensing data memory 20, image sensing data meeting a predetermined condition be processed as a group of image sensing data. The designated condition is given as a condition related to color temperature information or the composition of an object obtained from image sensing data, or as a predetermined condition pertaining to the photographing time, the photographing mode, the photographing place, the EV value, the aperture value, the object distance, the shutter speed, the use/nonuse of an electronic flash, or the use/nonuse of an optical low-pass filter, each of which is information appended to image sensing data.
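To picture how such a designated condition might look, here is a small sketch; the dictionary keys and the five-minute window are hypothetical stand-ins for the appended information listed above, not fields defined by this disclosure.

```python
from datetime import datetime, timedelta

def make_selection_condition(reference):
    """Build a grouping predicate from information appended to image sensing data.

    `reference` and each candidate are dicts with hypothetical keys such as
    'time', 'mode', and 'flash'; real appended data would come from the camera.
    """
    def condition(appended):
        return (
            abs(appended["time"] - reference["time"]) <= timedelta(minutes=5)
            and appended["mode"] == reference["mode"]
            and appended["flash"] == reference["flash"]
        )
    return condition

ref = {"time": datetime(1995, 8, 1, 10, 0), "mode": "landscape", "flash": False}
cond = make_selection_condition(ref)
print(cond({"time": datetime(1995, 8, 1, 10, 3), "mode": "landscape", "flash": False}))  # True
print(cond({"time": datetime(1995, 8, 1, 12, 0), "mode": "landscape", "flash": False}))  # False
```

Image sensing data satisfying the predicate are treated as one group in the processing described next.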
Image Reproduction Processing

FIG. 4 is a flow chart showing the procedure of image processing done by the image reproducing apparatus of this embodiment. FIG. 4 shows processing executed by the color image reproduction processor 30. Assume that a plurality of image sensing data obtained by the color image sensing unit 10 are already stored in the image sensing data memory 20.

When the processing shown in FIG. 4 starts, the control unit 36 for controlling the processing checks the contents of image sensing data selection designated (in step S1) by the image sensing data selection designating unit 34 and sends
the contents of the designated selection to the image sensing data selecting unit 33. On the basis of the contents of the selection designated by the image sensing data selection designating unit 34, the image sensing data selecting unit 33 selects image sensing data meeting the contents of the designated selection from the image sensing data stored in the image sensing data memory 20 and outputs the selected data to the image sensing data holding unit 31 (step S2). If the image sensing data meeting the contents of the designated selection is data of less than one image plane, this processing is not executed. If the selected image sensing data is data of one image plane, the image sensing data is converted into image data by ordinary image reproduction processing. If there is no image sensing data to be selected, information indicating this is output, and the processing is terminated.

The image sensing data holding unit 31 holds the group of image sensing data selectively output from the image sensing data memory 20 (step S3).

The image reproduction parameter determining unit 32 analyzes the image sensing data group held by the image sensing data holding unit 31, obtains, e.g., light source information (e.g., the color temperature or the chromaticity of the light source) when the image sensing data are obtained, which is necessary in white balance adjustment, and determines an image reproduction parameter on the basis of the light source information (step S4). That is, by using the plurality of image sensing data selectively output as a group of image sensing data from the image sensing data memory 20 and held in the image sensing data holding unit 31, the image reproduction parameter determining unit 32 determines an image reproduction parameter and sends the parameter to the image reproduction processing unit 35.

By using the image reproduction parameter thus determined, the image reproduction processing unit 35 performs image reproduction processing by which the image sensing data group held by the image sensing data holding unit 31 is converted into a group of image data (step S5).
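Steps S1 to S5 can be tied together in one compact sketch; the grouping condition, the pooled white balance estimate, and the data layout are simplifications of mine rather than the disclosed processing.

```python
import numpy as np

def reproduce_group(sensing_memory, appended_info, condition):
    """Sketch of the FIG. 4 flow: select, hold, determine one parameter, convert the group."""
    # S1/S2: select the image sensing data that meet the designated condition.
    group = [d for d, info in zip(sensing_memory, appended_info) if condition(info)]
    if len(group) < 1:
        return []                                        # nothing to select
    # S3: hold the selected data as one group.
    held = [d.astype(np.float64) for d in group]
    # S4: determine one image reproduction parameter from the whole group
    # (here: white balance gains from the pooled per-channel averages).
    pooled = np.concatenate([d.reshape(-1, 3) for d in held], axis=0).mean(axis=0)
    gains = pooled[1] / pooled
    # S5: convert every datum of the group with the same parameter.
    return [np.clip(d * gains, 0, 255) for d in held]

memory = [np.random.default_rng(i).uniform(0, 255, (4, 4, 3)) for i in range(4)]
info = [{"scene": "A"}, {"scene": "A"}, {"scene": "B"}, {"scene": "A"}]
reproduced = reproduce_group(memory, info, lambda m: m["scene"] == "A")
print(len(reproduced))   # 3 images converted with one shared parameter
```

Because the single parameter is derived from the pooled group, the converted images keep a common luminance and color reference, which is the effect the embodiment aims at.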
Selection Conditions of Image Sensing Data

The conditions under
