Hashima et al.

[11] Patent Number: 5,521,843
[45] Date of Patent: May 28, 1996
`
[54] SYSTEM FOR AND METHOD OF RECOGNIZING AND TRACKING TARGET MARK

[75] Inventors: Masayoshi Hashima; Fumi Hasegawa; Keiju Okabayashi; Ichiro Watanabe; Shinji Kanda; Naoyuki Sawasaki; Yuichi Murase, all of Kawasaki, Japan

[73] Assignee: Fujitsu Limited, Kawasaki, Japan
[21] Appl. No.: 119,228
[22] PCT Filed: Jan. 29, 1993
[86] PCT No.: PCT/JP93/00107
     § 371 Date: Sep. 28, 1993
     § 102(e) Date: Sep. 28, 1993
[87] PCT Pub. No.: WO93/15376
     PCT Pub. Date: Aug. 5, 1993
[30] Foreign Application Priority Data

Jan. 30, 1992  [JP]  Japan .................................. 4-015557
Jun. 26, 1992  [JP]  Japan .................................. 4-193457
Aug. 18, 1992  [JP]  Japan .................................. 4-219029
Oct. 29, 1992  [JP]  Japan .................................. 4-291628
Nov. 17, 1992  [JP]  Japan .................................. 4-307015
`
[51] Int. Cl.6 .................................................... G01S 15/06
[52] U.S. Cl. .................... 364/516; 340/815.54; 382/103
[58] Field of Search ............................. 364/167.01, 559, 364/516; 340/815.54, 815.57, 815.68; 382/10, 14, 18, 25, 51, 65; 250/203 CT; 33/293; 348/94
`
[56] References Cited

U.S. PATENT DOCUMENTS

5,008,804   4/1991  Gordon et al. ................... 364/167.01
5,207,003   5/1993  Yamada et al. ......................... 33/293

FOREIGN PATENT DOCUMENTS

61-126406   6/1986  Japan
62-54107    3/1987  Japan
62-54108    3/1987  Japan
62-185106   8/1987  Japan
62-185105   8/1987  Japan
63-75508    4/1988  Japan
3-131710    6/1991  Japan
4-119481    4/1992  Japan
`
Primary Examiner—Emanuel T. Voeltz
Assistant Examiner—Thomas Peeso
Attorney, Agent, or Firm—Armstrong, Westerman, Hattori, McLeland & Naughton

[57] ABSTRACT
`
A system for and a method of recognizing and tracking a target mark with a video camera is disclosed. The system includes a target mark (10) disposed on an object (1) and composed of a black circle and a white triangle mounted centrally on the black circle and three-dimensionally shifted from the black circle, a video camera (20) for imaging the target mark (10), a robot (30) supporting the video camera (20) and movable in directions with six degrees of freedom, an image processor (40) for processing image data of the target mark which is produced by the video camera (20), a shift calculating unit (50) for detecting a shift of the target mark (10) from projected histogram information of the target mark (10) which is produced by the image processor (40), and a robot controller (60) for controlling movement of the robot depending on the shift to enable the video camera (20) to track the target mark (10). The system is capable of tracking the target mark (10) attached to the object (1) on a real-time basis. Mark recognizing apparatus capable of accurately recognizing target marks of other shapes is also disclosed.
`
4,281,342   7/1981  Ueda et al. ............................. 348/94
4,297,725  10/1981  Shimizu et al. ................ 250/203 CT
`
`14 Claims, 47 Drawing Sheets
`
[Front-page figure: block diagram with target mark (10), video camera (20), robot (30), image processor, robot controller, and shift calculating unit]

Page 1 of 68

SAMSUNG EXHIBIT 1006
Samsung v. Image Processing Techs.
`
`
`
U.S. Patent    May 28, 1996    Sheet 1 of 47    5,521,843

[FIG. 1: overall block diagram of the tracking system]

[Sheet 2 of 47: FIGS. 2-4]
`
`
`
[Sheet 3 of 47: FIG. 5, mark detection flowchart: read original image (S1); convert to binary image (S2); label binary image (S3); separate the image of group number n; compare its area with the mark area; determine X- and Y-projected histograms; detect peaks; is the number of peaks in the X- and Y-projected histograms 2? (S10); if not, advance to the next group number n, and if all groups fail, mark detection failed (S12); otherwise mark detection successful (S14)]
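The one-dimensional histogram test that FIG. 5 applies to each labeled group can be sketched compactly. A hedged Python illustration (the helper names and the plateau-insensitive peak rule are assumptions chosen for illustration, not the patent's implementation):

```python
def projected_histograms(binary):
    """X- and Y-projected histograms of a binary image: one bin per
    column (sum down each column) and one bin per row."""
    hist_x = [sum(col) for col in zip(*binary)]
    hist_y = [sum(row) for row in binary]
    return hist_x, hist_y

def count_peaks(hist):
    """Count interior local maxima; a flat plateau counts once."""
    peaks, rising = 0, False
    for i in range(1, len(hist)):
        if hist[i] > hist[i - 1]:
            rising = True
        elif hist[i] < hist[i - 1]:
            if rising:
                peaks += 1
            rising = False
    return peaks

def looks_like_mark(binary, expected_peaks=2):
    """Accept a labeled group if both projections show the expected
    two-peak signature (circle split by the central triangle)."""
    hx, hy = projected_histograms(binary)
    return count_peaks(hx) == expected_peaks and count_peaks(hy) == expected_peaks
```

Because both the projections and the peak count are single one-dimensional passes, this test is far cheaper than two-dimensional pattern matching, which is the point the specification makes about real-time tracking.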
`
`
`
[Sheet 4 of 47: FIG. 6, X- and Y-projected histograms of the mark image]
[Sheet 5 of 47: FIGS. 7-8, labeled image groups 1-8]
[Sheets 6-7 of 47: FIGS. 9-10]
[Sheet 8 of 47: FIGS. 11-12, X-projected histograms]
`
`
`
[Sheet 9 of 47: FIGS. 13-14, target mark with auxiliary origin marks M2 and M4 in a 512-pixel image]
[Sheet 10 of 47: FIGS. 15-16, mark images 10A and 10B in the camera coordinate system]
`
`
`
[Sheet 11 of 47: FIG. 17]
[Sheet 12 of 47: FIGS. 18-19, histogram coordinates Xb1, Xp1, Xp2, Xb2, Xpc, Xbc]
[Sheet 13 of 47: FIGS. 20-23]
`
`
`
[Sheet 14 of 47: FIGS. 24(A)-24(B), 25, 26]
`
`
`
[Sheet 15 of 47: FIG. 27, tracking flowchart: recognize mark (S21); set window (S22); read original image (S23); convert to binary image (S24); determine X- and Y-projected histograms (S25); confirm mark (S26); does mark exist? (S27); detect positional shift (S28); generate speed command value (S29); output speed command value (S30); determine window position and size for next measurement (S31); repeat until object gripped (S32)]
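Step S31 of FIG. 27 (determine window position and size for the next measurement) can be illustrated as a bounding box of the mark grown by a margin; the margin, frame size, and lost-mark fallback below are illustrative assumptions, not the patent's rule:

```python
def next_window(hist_x, hist_y, margin=8, width=512, height=512):
    """Window (x0, y0, x1, y1) for the next measurement: the mark's
    bounding box in the current projections, grown by a margin and
    clipped to the frame."""
    xs = [i for i, v in enumerate(hist_x) if v > 0]
    ys = [i for i, v in enumerate(hist_y) if v > 0]
    if not xs or not ys:
        # mark lost: fall back to searching the full frame
        return (0, 0, width - 1, height - 1)
    return (max(xs[0] - margin, 0), max(ys[0] - margin, 0),
            min(xs[-1] + margin, width - 1), min(ys[-1] + margin, height - 1))
```

Restricting the next read to this window keeps the per-cycle pixel count roughly proportional to the mark size rather than the frame size.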
`
`
`
[Sheet 16 of 47: FIGS. 28(A)-28(B)]
[Sheet 17 of 47: FIGS. 29-31; FIG. 31 plots Ez [pixels] against z [mm]]
[Sheet 18 of 47: FIGS. 32-33; Ex [pixels] vs. Dx [mm], and Ax [pixels/mm] vs. z [mm]]
`
`
`
[Sheet 19 of 47: FIG. 34, flowchart for converting image shifts to actual shifts: determine distance between mark and camera from area of mark, z = f(Ez) (S41); determine shift from given position in z direction (S42); determine inclinations A in x, y, roll, pitch, yaw directions (S43); determine actual shifts in x, y, roll, pitch, yaw directions (S44)]
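FIG. 34's conversion can be sketched with assumed calibration curves standing in for the measured relationships of FIGS. 31 and 33; the constants and functional forms below are hypothetical, chosen only so that z = f(Ez) and D = E / A(z) become concrete:

```python
import math

# Hypothetical calibration: the mark's image area Ez [pixels] falls off
# with the square of the distance z [mm], and the image scale Ax
# [pixels/mm] falls off linearly with z.
K_AREA = 4.0e6    # assumed: Ez = K_AREA / z**2
K_SCALE = 400.0   # assumed: Ax = K_SCALE / z

def distance_from_area(ez):
    """Step S41: z = f(Ez), inverting the assumed area calibration."""
    return math.sqrt(K_AREA / ez)

def actual_shift(ex, z):
    """Steps S43-S44: actual shift Dx [mm] from image shift Ex [pixels]
    via the inclination Ax [pixels/mm] at distance z."""
    ax = K_SCALE / z
    return ex / ax
```

The same pattern (look up the inclination at the current z, then divide the on-screen shift by it) applies per axis for x, y, roll, pitch, and yaw.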
`
`
`
[Sheets 20-21 of 47: FIGS. 35-38, calibration graphs: z [mm] (100-300) vs. Ez [pixels] (up to 50000), and Ey [pixels] vs. z [mm]]
`
`
`
[Sheet 22 of 47: FIG. 39, control flowchart of the first embodiment: input mark image (S51); detect shift on screen (S52); convert to shift in actual coordinates (S53); generate speed command value in new coordinate system (S54); output speed command value (S55); repeat until object gripped (S56)]
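For the "generate speed command value" step, a common choice in visual servoing (not spelled out at this level in the flowchart, so this is an assumption) is a saturated proportional command per axis; the gain and limit are illustrative:

```python
def speed_command(shift, gain=0.5, v_max=20.0):
    """Proportional speed command for one axis, saturated at +/- v_max.
    The shift is the detected deviation in actual coordinates; gain and
    v_max are illustrative tuning values, not from the patent."""
    v = gain * shift
    return max(-v_max, min(v_max, v))
```

Issuing a velocity rather than a position target lets the robot controller integrate fresh shift measurements every cycle, which is what makes the feedback loop real-time.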
`
`
`
[Sheet 23 of 47: FIGS. 40 and 42, coordinate transformations (Dpitch)]
`
`
`
[Sheet 24 of 47: FIG. 41, control flowchart of the second embodiment: input mark image (S61); detect shift on screen (S62); convert to shift in actual coordinates (S63); convert to shift in camera coordinate system (S64); generate speed command value in camera coordinate system (S65); output speed command value (S66); repeat until object gripped (S67)]
`
`
`
[Sheets 25-27 of 47: FIGS. 43-44, 45(A)-45(C), 46-47]
`
`
`
[Sheets 28-31 of 47: FIGS. 48-50, attitude patterns of the triangle classified by center-of-gravity position, maximum histogram length, and position]
`
`
`
`
`
`
[Sheets 32-35 of 47: FIGS. 51-54, hardware for producing projected histogram data: frame memories, histogram memory, window determining unit, synchronizing separator, D/A converter, clamping unit, comparator, and monitor display]
`
`
`
[Sheet 36 of 47: FIG. 55, processor with main memory, image memory, and camera interface]
[Sheets 37-40 of 47: FIGS. 56-62]
`
`
`
[Sheet 41 of 47: FIG. 63, four-point detection flowchart: input image (S71); band-convert to binary image (S72); label binary image and label histogram (S73); select mark; produce X- and Y-projected histograms; calculate X and Y addresses (weighted means of X- and Y-projected histograms); repeat until four points extracted; calculate distance and attitude (S78)]
`
`
`
[Sheets 42-43 of 47: FIGS. 64(A)-64(B), 65, 66(A)-66(B), 68(A)-68(B)]
`
`
`
`
[Sheet 44 of 47: FIG. 67, small-circle detection flowchart: input image (S81); band-convert to binary image (S82); label binary image (S83); select mark (S84); produce X- and Y-projected histograms; calculate distributed weights A of the X- and Y-projected histograms; calculate weighted mean of small circle using B (S86); extract two peaks of X- and Y-projected histograms (S87); set mask around small circle (S88); calculate distributed weights B of X- and Y-projected histograms of reversed small circle (S89); calculate weighted mean of small circle using A+B; repeat until all circles extracted; calculate distance and attitude]
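The weighted means used in FIG. 67 locate a small circle's center to sub-pixel precision from its projected histogram. A minimal one-axis sketch (the function name is illustrative):

```python
def weighted_mean(hist):
    """Sub-pixel coordinate of a blob along one axis: the mean of the
    bin indices of its projected histogram, weighted by bin counts."""
    total = sum(hist)
    if total == 0:
        raise ValueError("empty histogram")
    return sum(i * v for i, v in enumerate(hist)) / total
```

Because every pixel of the blob contributes, the result is not quantized to whole pixels, which is what allows the distance and attitude calculation to work from positions finer than the pixel grid.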
`
`
`
[Sheet 45 of 47: FIG. 69]
`
`
`
[Sheet 46 of 47: FIG. 70, mark selection flowchart: input image (S101); band-convert to binary image (S102); label binary image (S103); select two marks with 1st and 2nd largest numbers of pixels (S104); apply three-dimensional mark detecting process (S105); if unsuccessful (S106), select four marks and apply four-circle detecting process (S109); calculate distance and attitude]
`
`
`
[Sheet 47 of 47: final drawing sheet]
`
`
`
`
`
SYSTEM FOR AND METHOD OF RECOGNIZING AND TRACKING TARGET MARK

TECHNICAL FIELD

The present invention relates to a system for and a method of recognizing and tracking a target mark using a video camera, and more particularly to a system for and a method of recognizing and tracking a target mark for detecting the position and attitude of the target mark by processing an image of the target mark produced by a video camera, detecting a shift of the position of the target mark from a predetermined position, and controlling the position and attitude of a processing mechanism based on the detected shift.

BACKGROUND ART

To have a robot grip a moving object by itself or dock a spacecraft with another spacecraft, it is necessary to recognize and track a target mark on the moving object or the spacecraft using a video camera.

There has heretofore been known a process of measuring the position and attitude of an object by producing an image of a target mark on the object with a video camera, and processing the data of the produced image to determine the position and attitude of the object. The process may be used in an application for gripping the object with a robot hand. In such an application, the video camera is mounted on a robot, which tracks the target mark based on position and attitude data of the target mark which are produced by the video camera, for gripping the object with the robot hand.

The conventional process takes its time until the position and attitude of the object are recognized by processing the image data of the target mark. It has been impossible for the prior process to effect a real-time data feedback to the robot and also difficult to track the object.

Another process which effects pattern matching on images to track an object is time-consuming as it requires lots of calculations in a two-dimensional space.

According to still another process of tracking an object, movement of the object is grasped, and the position of the moving object is predicted. This process cannot simply be applied to movement of an ordinary object because the process is based on the fact that the object makes regular movements.

Before a target mark is recognized, it is necessary to extract a desired mark from an image which either contains another object or objects or has a lot of noises. To meet such a requirement, the conventional processes compare the area of the mark or extract features by way of pattern matching.

The area comparison procedure determines as a desired mark an extracted image having substantially the same area as the desired mark. It is virtually impossible, however, to extract a desired mark from an image which either contains an object of almost the same size around the mark or has a lot of noises. The area comparison procedure thus finds use in a limited range of applications.

The feature extraction procedure based on pattern matching needs a large expenditure of time for searching an image memory, and hence its processing time is long.

To measure the position and attitude of an object in a three-dimensional space, there is employed a triangular or rectangular target mark representing the positional relationship between three or four points. If such a target mark is attached to a certain plane of an object, then the position and attitude of the object can be measured from the positional relationship of the image of the target mark in an image space. In the measurement, calculations based on projective geometry are effected on the coordinates of image points that are projected from the object space of the target mark onto the image plane of a camera. When the position or attitude of the object changes, the relationship between image points on the target mark also changes. Therefore, it is possible to calculate the position and attitude of the object in the three-dimensional space based on the change in the relationship between image points on the target mark.

Since a conventional measuring system using target marks calculates the position and attitude of an object based on the coordinates of image points that are extracted from a triangular or rectangular target mark image, the measuring accuracy tends to vary depending on the attitude of the target mark with respect to the camera. Specifically, when image data containing a directional component is obtained from each image point on an image plane to describe a certain plane of the object to which a target mark is attached, a reference distance with respect to each image point varies, resulting in a lack of stability with respect to the measuring accuracy for the position and attitude.

Conventional calculations of a position using a target mark require that the plane of the target mark be at a certain angle to the plane of an image, and hence need many more calculation parameters than if the camera faces the target mark head on. Therefore, the calculations in measuring the position are complex, and the measuring accuracy is lowered.

When a mark in the form of four points is converted into an image by an imaging means, the four points are shown as having a certain area on the image, making it impossible to accurately determine the positions of the points in the image data. Accordingly, the positions of the points on the image cannot be determined in terms of subpixels. Since the distance up to the object and the attitude of the object are calculated based on the inaccurate positions of the points in the image data, the distance up to the object and the attitude of the object cannot be measured with accuracy.

DISCLOSURE OF THE INVENTION

In view of the above problems of the conventional systems and processes, it is a first object of the present invention to provide a system for and a method of recognizing and tracking a target mark on a real-time basis using a video camera.

A second object of the present invention is to provide a system for and a method of recognizing and tracking a target mark so as to be capable of extracting a desired target mark quickly and reliably.

A third object of the present invention is to provide a system for and a method of recognizing and tracking a target mark while eliminating measuring error variations due to the positional relationship between the target mark and a camera.

A fourth object of the present invention is to provide a system for and a method of recognizing and tracking a target mark to measure the distance up to and the attitude of an object simply with high accuracy.

To achieve the above objects, there is provided in accordance with the present invention a target mark tracking system for tracking a target mark with a video camera, comprising a target mark disposed on an object and com-
`
`
`
`
posed of a black circle and a white triangle mounted centrally on the black circle and three-dimensionally shifted from the black circle, a video camera for imaging the target mark, a moving mechanism supporting the video camera and movable in directions with six degrees of freedom, image processing means for processing image data of the target mark which is produced by the video camera, shift detecting means for detecting a shift of the target mark from projected histogram information of the target mark which is produced by the image processing means, and moving mechanism control means for controlling movement of the moving mechanism depending on the shift to enable the video camera to track the target mark.

The target mark is composed of the black circle and the white triangle mounted centrally on the black circle and three-dimensionally shifted from the black circle, and is mounted on the object. The video camera is mounted on the moving mechanism which is movable in the directions with six degrees of freedom and images the target mark. The image processing means processes the image data of the target mark which is produced by the video camera. The shift detecting means detects a shift of the target mark from projected histogram information thereof. The moving mechanism control means controls movement of the moving mechanism depending on the shift to enable the video camera to track the target mark.

As described above, the target mark composed of the three-dimensionally shifted white triangle disposed centrally on the black circle is imaged by the video camera, and a shift in each of the coordinate axis directions of the target mark is determined from the image data. The shifts can be determined from projected histograms of the image data, which are calculated in a one-dimensional domain. Therefore, the calculations of the projected histograms are very simple and small in amount, and the shifts of the target mark can be determined by a high-speed processing. As a consequence, real-time data can be fed back to the moving mechanism depending on the determined shifts, making it possible to enable the video camera to track the target mark on a real-time basis.

According to the present invention, there is also provided a visual target mark tracking control system for imaging a target mark with a video camera and processing image data of the target mark produced by the video camera to hold the video camera in a predetermined positional relationship to the target mark at all times, comprising image change detecting means for detecting a change from a target position and attitude for the target mark in an image of the target mark produced by the video camera, actual change detecting means for detecting a relative actual change from the predetermined positional relationship between the video camera and the target mark, and relating means for experimentally shifting the predetermined positional relationship between the video camera and the target mark, and relating values which are detected by the image change detecting means and the actual change detecting means when the predetermined positional relationship is experimentally shifted, to each other.

The relating means experimentally shifts the predetermined positional relationship between the video camera and the target mark, and relates a value which is detected by the image change detecting means, i.e., a change from a target position and attitude for the target mark on its image, and a value which is detected by the actual change detecting means, i.e., a relative actual change from the predetermined relationship between the video camera and the target mark, to each other.

By thus relating the changes, the change in the image of the target mark can quickly be converted into the relative actual change from the predetermined relationship between the video camera and the target mark. Thus, even when the distance between the target mark and the video camera is greatly varied, the moving object can stably be tracked without a reduction in the response.

According to the present invention, there is also provided a target mark attitude detecting method of detecting the attitude of a target mark to detect the attitude of an object about the direction of a camera based on an image produced by the camera of a target mark which is composed of at least a triangle of a particular shape, comprising the steps of determining projected histograms in X and Y directions of the image of the triangle of the target mark, determining the positions of the centers of gravity in the X and Y directions of the image of the triangle of the target mark in the projected histograms, determining maximum histogram values and X- and Y-axis values in the projected histograms, determining which of classified and preset attitude patterns the attitude of the triangle of the target mark belongs to based on the positions of the centers of gravity, the maximum histogram values, the X- and Y-axis values, and known geometrical data of the target mark, and calculating the attitude of the triangle of the target mark in the determined attitude pattern about the direction of the camera.

Attitudes for the triangle of the target mark are classified into attitude patterns. Then, it is determined which of the classified and preset attitude patterns the attitude of the triangle of the target mark belongs to based on the positions of the centers of gravity, the maximum histogram values, the X- and Y-axis values, and known geometrical data of the target mark. The attitude of the triangle of the target mark in the determined attitude pattern about the direction of the camera is then calculated. The rolling interval of the target mark can properly and simply be grasped.

According to the present invention, there is further provided a method of detecting a target mark, comprising the steps of converting an original image to binary images, grouping the binary images into images with joined pixels, determining X- and Y-projected histograms of the grouped images, counting extreme values of the X- and Y-projected histograms of the grouped images, and comparing the counted extreme values with predetermined extreme values of X- and Y-histograms of a target mark to determine whether the grouped images represent the target mark.

According to the present invention, there is also provided an apparatus for measuring the position and attitude of an object based on an image of a target mark, comprising a target mark disposed on a particular flat surface of an object and composed of a circle and a central point thereof, a camera for imaging the target mark to generate an image of the circle and the central point thereof, feature extracting means for extracting feature points required to measure the position and attitude of the object, from the image of the target mark, and calculating means for calculating the position and attitude of the target mark in an object space according to projective geometrical calculations on the feature points.

According to the present invention, there is also provided a distance and attitude measuring apparatus for measuring the distance up to and the attitude of an object, comprising four disk-shaped marks disposed on an object and having respective centers of gravity positioned in one plane, at least one of the disk-shaped marks having a radius different from the radii of the other disk-shaped marks, imaging means for
`
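The center-of-gravity calculation for the disk-shaped marks can be illustrated with a simple two-pass connected-component labeling over a binary image. This Python sketch (4-connectivity, plain lists, illustrative names) is an assumption-laden stand-in for the apparatus, not its actual hardware:

```python
def centers_of_gravity(binary):
    """Centers of gravity (x, y) of the 4-connected white regions of a
    binary image, via two-pass labeling with union-find."""
    h, w = len(binary), len(binary[0])
    label = [[0] * w for _ in range(h)]
    parent, nxt = {}, 0

    def find(a):
        while parent[a] != a:
            a = parent[a]
        return a

    # First pass: provisional labels, merging when up and left disagree.
    for y in range(h):
        for x in range(w):
            if not binary[y][x]:
                continue
            up = label[y - 1][x] if y else 0
            left = label[y][x - 1] if x else 0
            if up and left:
                label[y][x] = find(up)
                parent[find(left)] = find(up)
            elif up or left:
                label[y][x] = find(up or left)
            else:
                nxt += 1
                parent[nxt] = nxt
                label[y][x] = nxt

    # Second pass: accumulate coordinate sums per root label.
    sums = {}
    for y in range(h):
        for x in range(w):
            if label[y][x]:
                r = find(label[y][x])
                sx, sy, n = sums.get(r, (0, 0, 0))
                sums[r] = (sx + x, sy + y, n + 1)
    return [(sx / n, sy / n) for sx, sy, n in sums.values()]
```

With the four centers in hand, the distance and attitude follow from solving the four-point perspective problem described above.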
`
`
`
imaging the disk-shaped marks, center-of-gravity calculating means for calculating the positions of the centers of gravity of the respective disk-shaped marks based on image data of the four disk-shaped marks which are outputted by the imaging means, and calculating means for solving a four-point perspective problem to calculate the distance up to and the attitude of the object based on the positions of the centers of gravity calculated by the center-of-gravity calculating means.

BRIEF DESCRIPTION OF THE DRAWINGS

FIGS. 28(A) and 28(B) are diagrams showing the positional relationship between a video camera and a target mark;
FIGS. 29(A) and 29(B) are views showing movement of the video camera when a shift is actually measured;
FIG. 30 is a diagram showing the shift that is actually measured;
FIG. 31 is a graph showing a z-Ez relationship;
FIG. 32 is a graph showing an Ex-Dx relationship;
FIG. 33 is a graph showing a z-Ax relationship;
FIG. 34 is a flowchart of a sequence for converting shifts E* on an image quickly to actual shifts D*;
FIG. 35 is a diagram of coordinates indicative of components of the actual shift;
FIG. 36 is a graph showing a z-Ez relationship;
FIG. 37 is a graph showing an Ey-Dy relationship;
FIG. 38 is a graph showing a z-Ay relationship;
FIG. 39 is a flowchart of a control sequence according to a first embodiment;
FIG. 40 is a diagram illustrative of a coordinate transformation;
FIG. 41 is a flowchart of a control sequence according to a second embodiment;
FIG. 42 is a diagram illustrative of a coordinate transformation;
FIG. 43 is a diagram showing a projected histogram indicating that a triangle is largely displaced from the center of a circle in the image of a target mark;
FIG. 44 is a diagram showing a projected histogram illustrative of the manner in which the center of the circle in the image of the target mark is determined;
FIGS. 45(A), 45(B), and 45(C) are diagrams showing the manner in which projected histograms of an image of only a triangle are determined from actually measured projected histograms of a target mark composed of a triangle and a circle;
FIG. 46 is a diagram showing the reference shape and position of the triangle of the target mark;
FIG. 47 is a diagram showing the position of the center of gravity of the triangle of the target mark and the maximum values of the histograms and their positions;
FIG. 48 is a diagram showing a former half group of attitude patterns produced when the roll angle of the triangle is progressively increased;
FIG. 49 is a diagram showing a latter half group of attitude patterns produced when the roll angle of the triangle is progressively increased;
FIG. 50 is a diagram of attitude patterns classified according to the position of the center of gravity, the maximum values of the histograms, and their positions;
FIGS. 51(A) and 51(B) are hardware arrangements for producing projected histogram data;
FIG. 52 is a diagram illustrative