`
(12) Japanese Unexamined Patent Application Publication (A)
(11) Japanese Unexamined Patent Application Publication Number H06-309138
(43) Publication date: 04 Nov 1994
(51) Int. Cl.5: G06F 3/14, 3/03, 3/033
ID codes / JPO filing No.: 360 D 7165-5B; 380 L 7165-5B; 360 C 7165-5B
Request for examination: None; Number of claims: 1; Online (Total of 9 pages)
(21) Application number: H5-99135
(22) Date of application: 26 Apr 1993
`
`
`
`
`
`
`
`
`
`
`
`
`
(71) Applicant: 000003078, Toshiba Corp., 72 Horikawa-cho, Saiwai-ku,
Kawasaki-shi, Kanagawa-ken
(72) Inventor: SHINJI NARUTAKA, c/o Toshiba Fuchu Plant, 1 Toshiba-cho,
Fuchu-shi, Tokyo-to
(74) Agent: KAZUHIDE MIYOSHI, patent attorney (and 3 others)
`
`
`
`
(54) (TITLE OF THE INVENTION) METHOD FOR CONTROLLING A
SCREEN USING A TOUCH PANEL
`
`(57) [Abstract]
`[Object] The present invention realizes screen scrolling
`using a screen display that is simpler, faster, and more
`intuitive, without using icons or the like, thereby reducing
`operator fatigue and avoiding control lag, etc.
`[Configuration] A determination is made as to whether or
`not a display screen 8 of a CRT 1 is being operated by
`being touched on the basis of touch location detection
`information output by a touch panel 2, and if it is
`determined that a touch location where the display screen 8
`has been touched has moved, the display screen 8 is
`scrolled on the basis of changes to the location detection
`information at a predetermined timing.
`
`
`
`Microsoft Ex. 1006
`Microsoft v. Philips - IPR2018-00025
`Page 1 of 19
`
`
`
`[Claims]
[Claim 1] A method for controlling a screen using a
touch panel that scrolls a display screen on the basis of
touch location detection information of the touch
panel that is disposed in front of the display screen,
wherein a determination is made as to whether or not
the display screen is being operated by being touched
on the basis of touch location detection information
output by the touch panel, and if it is determined that a
touch location where the display screen has been
touched has moved, the display screen is scrolled on
the basis of changes to the location detection
information at a predetermined timing.
`[Detailed Description of the Invention]
`[0001]
`[Industrial Field of Application] The present invention
`relates to a method for controlling a screen using a
`touch panel that controls a screen of a CRT or the like,
`the touch panel being provided in front of the CRT,
and being optical, resistive, capacitive, ultrasonic,
`piezoelectric, or the like.
`[0002]
`[Prior Art] It is common in control systems that
`control various types of plants, etc., to display icons or
`the like for scrolling on screens of CRTs that display
`various types of control information and the like, and
`when an operator touches the icons or the like with a
`finger or a pen, the touch is sensed by an optical,
resistive, capacitive, ultrasonic, piezoelectric, or
`other type of touch panel disposed in front of the CRT
`to scroll the screen of the CRT.
`[0003]
`[Problem to be Solved by the Invention] However, in
`such conventional methods for controlling screens
`using touch panels, the icons or the like for scrolling
`have to be displayed on the CRT screen, creating the
`problem of reducing the useful display area of the
`screen.
`[0004] Moreover, because the operator has to aim for
`a very small area on the screen when operating the
`icons, etc., displayed on the CRT, touch input
`operations must be done with great care, creating the
`problem of concomitant increased operator fatigue.
`[0005] Further, when the operator wishes to scroll the
`screen only by a particular amount, he or she must
`continuously touch the icon for a corresponding
`amount of time, preventing rapid input and creating
`control lags, etc.
`[0006] The present invention was devised in light of
`these circumstances, and has as an object to provide a
`method for controlling a screen using a touch panel
`whereby screen scrolling using a screen display that is
`simpler, faster, and more intuitive is realized, without
`
`using icons or the like, thereby reducing operator
`fatigue and avoiding control lag, etc.
`[0007]
`[Means for Solving the Problem] To attain this object,
`the present invention is a method for controlling a
`screen using a touch panel that scrolls a display screen
`on the basis of touch location detection information of
`the touch panel that is disposed in front of the display
`screen, wherein a determination is made as to whether
`or not the display screen is being operated by being
`touched on the basis of touch location detection
`information output by the touch panel, and if it is
`determined that a touch location where the display
`screen has been touched has moved, the display screen
`is scrolled on the basis of changes to the location
`detection information at a predetermined timing.
`[0008]
`[Action] In the above configuration, a determination is
`made as to whether or not the display screen is being
`operated by being touched on the basis of touch
`location detection information output by the touch
`panel, and if it is determined that a touch location
`where the display screen has been touched has moved,
the display screen is scrolled on the basis of changes
to the location detection information at a
predetermined timing, thereby providing a method for
`controlling a screen using a touch panel whereby
`screen scrolling using a screen display that is simpler,
`faster, and more intuitive is realized, without using
`icons or the like, thereby reducing operator fatigue and
`avoiding control lag, etc.
`[0009]
`[Examples] FIG. 1 is a block diagram showing one
`example of a control system to which one embodiment
`of a method for controlling a screen using a touch
`panel according to the present invention is applied.
[0010] A control system shown in the drawing is
provided with a CRT 1, a touch panel 2, two interface
circuits (interfaces) 3 and 4, a CPU 5, and a bus 6. The
touch panel 2 detects when the control system is
operated by an operator touching a screen 8 on the
CRT 1 using a finger 7 (see FIG. 3), and the screen 8
displayed on the CRT 1 is scrolled by the CPU 5 in
accordance with the direction and amount of
movement of the finger 7.
[0011] The CRT 1 is provided with a vacuum tube or
the like having sufficient display capacity to display
states of a plant or the like, for example, that is
controlled by the control system in question. When a
display signal is output by the interface circuit 4, this
is read and displayed to the screen.
`[0012] The touch panel 2 is constituted by an optical,
resistive, capacitive, ultrasonic, or other type of
`panel. When the screen 8 on the CRT 1 is touched by
`a finger 7 or the like of the operator, the touch panel 2
`
`
`
`
`
`
`(3)
`
`JP H6-309138 A
`
`detects this and generates touch location data, which is
`supplied to the interface circuit 3.
`[0013] The interface circuit 3 receives the touch
`location data output by the touch panel 2, converts it
`to touch location data that is in a pre-set format, and
`sends it to the bus 6 for supply to the CPU 5.
`[0014] The CPU 5 reads process signals and the like
`which are output by the plant or the like which is
`being controlled to create control screen data that is
`supplied to the interface circuit 4 via the bus 6. When
`the touch location data is output by the interface
`circuit 3, the CPU 5 reads this touch location data via
`the bus 6 and computes the direction and amount of
`scrolling of the screen on the basis of changes in the
`touch location data. The CPU 5 then creates control
`screen data which has been scrolled on the basis of the
computation results and supplies this control screen
data to the interface circuit 4 via the bus 6.
`[0015] The interface circuit 4 reads via the bus 6 the
`control screen data that has been output by the CPU 5,
`converts the control screen data to an image signal and
`supplies this to the CRT 1 for screen display.
`[0016] Next, operation of the present embodiment is
`described with reference to the flowchart shown in
`FIG. 2.
`[0017] First, the CPU 5 monitors output by the
`interface circuit 3, imports information indicating
`whether or not the screen 8 on the CRT 1 has been
`touched by the finger 7 of an operator or the like, and
`waits until part of the screen 8 on the CRT 1 has been
`touched (step ST1).
`[0018] When the touch panel 2 detects that the screen
`8 on the CRT 1 has been touched by the finger 7 of the
`operator or the like as shown in FIG. 3 and the touch
`location data is output by the interface circuit 3, the
`CPU 5 imports the touch location data and checks
`whether or not the touch location data is continuously
`output by the interface circuit 3 for a predetermined
`fixed amount of time (step ST2).
`[0019] If the touch location data is not output for the
`fixed amount of time by the interface circuit 3, the
`CPU 5 determines that the operator has input an
`instruction other than scroll input, and performs a
`touch input process in accordance with an icon or the
`like at the coordinates indicated by that touch location
`data.
`[0020] If the touch location data is output for the fixed
`amount of time or longer by the interface circuit 3
`(step ST2), however, the CPU 5 determines that the
`operator has given a scroll instruction and stores the
`touch location data as starting coordinates (X1, Y1)
(step ST3), and waits until touch location data is no
longer output by the interface circuit 3 (step ST4).
`
`[0021] If the finger 7 of the operator that is touching
`the CRT 1 is moved without moving off of the screen
`8, the CPU 5 reads and stores every change in the
`values of the touch location data output by the touch
`panel 2 as candidate ending coordinates (X3, Y3).
`[0022] The touch panel 2 then detects when the finger
`7 of the operator is lifted off of the CRT 1, after which
`no touch location data is output by the interface circuit
`3. The CPU 5 then stores the last candidate ending
`coordinates (X3, Y3) as official ending coordinates
`(X2, Y2) (step ST5).
`[0023] The CPU 5 then calculates a movement amount
`(ΔX, ΔY) by making the computation indicated by the
`following equations on the basis of the starting
`coordinates (X1, Y1) and the ending coordinates (X2,
`Y2) as shown in FIG. 4 (step ST6).
`[0024] ΔX=X2-X1 …(1)
`ΔY=Y2-Y1 …(2)
`Next, the CPU 5 cuts out only those parts of the
`current screen 8 displayed to the CRT 1 which have
`moved by the movement amount (ΔX, ΔY) within an
`entire control screen 10 (see FIG. 3) in the memory,
`sends those parts to the bus 6 as control screen data for
`supply to the interface circuit 4, causing the screen 8
`corresponding to the control screen data to be
`displayed to the CRT 1 (step ST7).
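By way of illustration only (not part of the published text), the procedure of steps ST1 to ST7 can be sketched in Python. The sampled-touch representation, the `hold_samples` threshold standing in for the fixed amount of time of step ST2, and the grid model of the entire control screen 10 are all assumptions of the example.

```python
def scroll_on_release(touches, hold_samples=3):
    """Sketch of FIG. 2 (steps ST1-ST7): a touch held for a fixed time
    is treated as a scroll gesture; on release, the movement amount is
    returned.

    `touches` is a list of (x, y) samples recorded while the finger is
    down, ending when the finger is lifted. A touch shorter than
    `hold_samples` is treated as ordinary (icon) input and returns None.
    """
    if len(touches) < hold_samples:
        return None                   # ST2 failed: ordinary touch input
    x1, y1 = touches[0]               # ST3: starting coordinates (X1, Y1)
    x2, y2 = touches[-1]              # ST5: last candidate becomes (X2, Y2)
    return (x2 - x1, y2 - y1)         # ST6: (dX, dY), equations (1) and (2)


def crop_viewport(full_screen, origin, delta, view_w, view_h):
    """ST7: cut out of the entire control screen the region shifted by
    (dX, dY) from the current viewport origin, and return it together
    with the new origin."""
    ox, oy = origin[0] + delta[0], origin[1] + delta[1]
    view = [row[ox:ox + view_w] for row in full_screen[oy:oy + view_h]]
    return view, (ox, oy)
```

In this model the CRT shows only a `view_w` by `view_h` window into the larger control screen 10 held in memory, which matches the cut-out operation described in paragraph [0024].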
`[0025] In the present embodiment, the touch panel 2
detects when the screen 8 on the CRT 1 is operated by
`being touched by the finger 7 or the like of the
`operator and the screen 8 displayed to the CRT 1 is
`scrolled by the CPU 5 in accordance with the direction
`and amount of movement of the finger 7, and therefore
`scrolling of the screen 8 using screen display is
`realized that is simpler, faster, and more intuitive,
`without using icons or the like, thereby reducing
`operator fatigue and avoiding control lag, etc.
`[0026] In the present embodiment the screen 8 on the
`CRT 1 is scrolled when the finger 7 of the operator is
`lifted off of the screen 8 on the CRT 1, but it is also
`possible to continuously scroll the screen 8 on the
`CRT 1 when the finger 7 of the operator touches the
`screen 8 on the CRT 1 and is moved over the screen 8
`without being lifted off of the screen 8.
`[0027] In this case, the CPU 5 of the control system
performs scrolling control according to the procedure
shown in the flowchart in FIG. 5.
`[0028] First, the CPU 5 monitors output by the
`interface circuit 3, imports information indicating
`whether or not the screen 8 of the CRT 1 has been
`touched by the finger 7 of an operator or the like, and
`waits until part of the screen 8 of the CRT 1 has been
`touched (step ST11).
`[0029] When the touch panel 2 detects that the screen
`8 of the CRT 1 has been touched by the finger 7 of the
`
`
`
`
`
`
`(4)
`
`JP H6-309138 A
`
`operator or the like and the touch location data is
`output by the interface circuit 3, the CPU 5 imports
`the touch location data and checks whether or not the
`touch location data is continuously output by the
`interface circuit 3 for a predetermined fixed amount of
`time (step ST12).
`[0030] If the touch location data is not output for the
`fixed amount of time by the interface circuit 3, the
`CPU 5 determines that the operator has input an
`instruction other than scroll input, and performs a
`touch input process in accordance with an icon or the
`like at the coordinates indicated by that touch location
`data.
`[0031] If the touch location data is output for the fixed
`amount of time or longer by the interface circuit 3,
`however, the CPU 5 determines that the operator has
`given a scroll instruction and stores the touch location
`data as starting coordinates (X1, Y1) (step ST13), and
`checks whether or not the touch location data has been
`output by the interface circuit 3 (step ST14).
[0032] If the touch location data is continuously
output by the interface circuit 3, the CPU 5 checks
whether or not the values of the touch location data
output by the interface circuit 3 have changed (step
ST15), and if the values have changed, the CPU 5
determines that the finger 7 of the operator touching
the CRT 1 has been moved without being lifted off of
the screen 8 and the values of the touch location data
output by the touch panel 2 have changed, and stores
new touch location data output by the interface circuit
3 as the ending coordinates (X2, Y2) (step ST16).
`[0033] The CPU 5 then calculates a movement amount
`(ΔX, ΔY) by making the computation indicated by the
`following equations on the basis of the starting
`coordinates (X1, Y1) and the ending coordinates (X2,
`Y2) (step ST17).
`[0034] ΔX=X2-X1 …(3)
`ΔY=Y2-Y1 …(4)
`Next, the CPU 5 cuts out only those parts of the
`current screen 8 displayed to the CRT 1 which have
`moved by the movement amount (ΔX, ΔY) within an
`entire control screen 10 in the memory, sends those
parts to the bus 6 as control screen data for supply to
the interface circuit 4, causing the screen 8
corresponding to the control screen data to be
displayed to the CRT 1 (step ST18).
`[0035] Thereafter, the CPU 5 stores the ending
`coordinates (X2, Y2) as new starting coordinates (X1,
`Y1) and returns to the aforementioned touch ending
`judgment process.
[0036] The touch panel 2 detects when the operator
lifts his or her finger 7 from the CRT 1. The CPU 5
`repeats the aforementioned screen scrolling process
`(steps ST14 to ST18) until the touch location data is
`
`no longer output by the interface circuit 3, and stops
`the scrolling process when the finger 7 of the operator
`is lifted off of the CRT 1 (step ST14).
`[0037] When the screen 8 on the CRT 1 is operated by
`being touched by the finger 7 of the operator in this
`manner, the screen of the CRT 1 can be continuously
`scrolled, thereby making it possible to improve ease of
`use.
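As an illustrative sketch only (again not part of the published text), the continuous-scrolling loop of steps ST11 to ST18 and the update of paragraph [0035] might look as follows; the sample-stream representation and the hold threshold are assumptions of the example.

```python
def continuous_scroll(samples, hold_samples=3):
    """Sketch of FIG. 5 (steps ST11-ST18): while the finger stays on
    the panel, emit a scroll delta each time the touch location
    changes, then make the latest point the new starting point
    (paragraph [0035]).

    `samples` is the stream of (x, y) touch samples; the list ends when
    the finger is lifted (step ST14 exit). Returns the list of (dX, dY)
    deltas emitted, or None for an ordinary short touch (step ST12).
    """
    if len(samples) < hold_samples:
        return None                          # ST12 failed: icon input
    deltas = []
    x1, y1 = samples[0]                      # ST13: starting coordinates
    for x2, y2 in samples[1:]:               # ST14: data still output
        if (x2, y2) != (x1, y1):             # ST15: values have changed
            deltas.append((x2 - x1, y2 - y1))  # ST16-ST18: scroll step
            x1, y1 = x2, y2                  # [0035]: new starting point
    return deltas
```

Each emitted delta corresponds to one pass through steps ST16 to ST18, so the screen follows the finger continuously instead of jumping once on release.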
`[0038] Moreover, in the present embodiment, an
optical, resistive, capacitive, ultrasonic, or other type
`of touch panel can be used as the touch panel 2, but it
`is also possible to use a piezoelectric touch panel.
[0039] In this case, the CPU 5 of the control system
performs scrolling control according to the procedure
shown in the flowchart in FIG. 6.
`[0040] First, the CPU 5 monitors output by the
`interface circuit 3, imports information indicating
`whether or not the screen 8 of the CRT 1 has been
`touched by the finger 7 of an operator or the like, and
`waits until part of the screen 8 of the CRT 1 has been
`touched (step ST21).
`[0041] The touch panel 2 detects when the screen 8 on
`the CRT 1 is touched by the finger 7 of the operator or
`the like. Once the touch location data is output by the
`interface circuit 3 together with touch pressure data,
`the CPU 5 receives this touch location data and the
`touch pressure data, and checks whether or not the
`value of the touch pressure data is greater than or
`equal to a fixed pressure value (step ST22).
`[0042] If the value of the touch pressure output by the
`interface circuit 3 is not greater than or equal to the
`fixed pressure value, the CPU 5 determines that the
`operator has input an instruction other than scroll
input, and performs a touch input process in
accordance with an icon or the like at the coordinates
indicated by that touch location data.
`[0043] If the value of the touch pressure data output
`by the interface circuit 3 is greater than or equal to the
`fixed pressure value, the CPU 5 determines that the
`operator has given a scroll instruction, stores the touch
`location data output by the interface circuit 3 as the
starting coordinates (X1, Y1) (step ST23) and waits
`until the touch location data is no longer output by the
`interface circuit 3 (step ST24).
`[0044] If the finger 7 of the operator that is touching
the CRT 1 is moved without moving off of the screen
`8, the CPU 5 reads and stores every change in the
`values of the touch location data output by the touch
`panel 2 as candidate ending coordinates (X3, Y3).
`[0045] The touch panel 2 then detects when the finger
`7 of the operator is lifted off of the CRT 1, after which
`no touch location data is output by the interface circuit
`3. The CPU 5 then stores the last candidate ending
`
`
`
`
`
`
`(5)
`
`JP H6-309138 A
`
`coordinates (X3, Y3) as official ending coordinates
`(X2, Y2) (step ST25).
`[0046] The CPU 5 then calculates a movement amount
`(ΔX, ΔY) by making the computation indicated by the
`following equations on the basis of the starting
`coordinates (X1, Y1) and the ending coordinates (X2,
`Y2) (step ST26).
`[0047] ΔX=X2-X1 …(5)
`ΔY=Y2-Y1 …(6)
`Next, the CPU 5 cuts out only those parts of the
`current screen 8 displayed to the CRT 1 which have
`moved by the movement amount (ΔX, ΔY) within an
`entire control screen 10 in the memory, sends those
parts to the bus 6 as control screen data for supply to
the interface circuit 4, causing the screen 8
corresponding to the control screen data to be
displayed to the CRT 1 (step ST27).
`[0048] Thus, even when using a piezoelectric touch
`panel, the present invention realizes scrolling of the
`screen 8 using a screen display that is simpler, faster,
`and more intuitive, without using icons or the like,
`thereby making it possible to reduce operator fatigue
`and avoid control lag, etc.
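The pressure test of step ST22 reduces to a single threshold comparison. The following sketch is illustrative only; the threshold value is an assumption, as the text does not specify the fixed pressure value.

```python
def classify_touch(pressure, threshold=0.5):
    """Sketch of step ST22 for a piezoelectric touch panel: a touch at
    or above the fixed pressure value is treated as a scroll
    instruction; a lighter touch is ordinary icon input (step ST23 vs.
    the icon-handling branch of paragraph [0042]).

    `threshold` is an illustrative stand-in for the fixed pressure
    value; the published text gives no numeric value.
    """
    return "scroll" if pressure >= threshold else "icon"
```

This is the only point at which the piezoelectric variant differs from the timing-based variants: the scroll/icon decision is made from pressure rather than from touch duration.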
`[0049] Moreover, in the aforedescribed embodiments,
`the entire screen 8 on the CRT 1 is scrolled, but it is
`also possible to divide the screen 8 of the CRT 1 into a
`scrolling region 15 and a non-scrolling region 16 as
`shown in FIG. 7 and put ordinary touch-key icons 17
`and scrolling icons 18, etc., in the non-scrolling region
`16.
`[0050] Doing so allows operators unfamiliar with the
`scrolling control provided by the present invention as
`described above, i.e., operators familiar with ordinary
`conventional operations, to scroll the screen 8.
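The division into a scrolling region 15 and a non-scrolling region 16 described above can be sketched as a simple hit test; the split ratio is an illustrative assumption, as the text does not specify the regions' dimensions.

```python
def route_touch(x, y, screen_w, screen_h, split=0.8):
    """Sketch of FIG. 7 / paragraphs [0049]-[0050]: the screen is
    divided into a scrolling region 15 and a non-scrolling region 16
    that holds ordinary touch-key icons 17 and scrolling icons 18.

    `split` (the fraction of the width given to the scrolling region)
    is an illustrative assumption.
    """
    if 0 <= x < screen_w * split and 0 <= y < screen_h:
        return "scrolling region"       # drag gestures scroll directly
    return "non-scrolling region"       # icon-based conventional input
```

A touch routed to the non-scrolling region would then be handled by the ordinary icon-based scrolling familiar to operators of conventional systems.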
[0051] Furthermore, in the embodiments described
above, the entire display screen is scrolled in
accordance with how much the touch location on the
screen 8 of the CRT 1 is moved while touched, but it
is also possible to turn the page and display a next
page screen 8b or a previous page screen when a
lower part of a current screen 8a is touched and the
touch location is moved left or right, as shown in
FIG. 8.
[0052] In this case, when the lower part of the screen
8a is touched and the touch location is moved right,
the next page screen 8b is displayed, and when the
lower part of the screen 8a is touched and the touch
location is moved left, the previous page screen is
displayed.

[FIG. 8]
`[0053] A page-turning process can thus be performed
`smoothly when a screen with text or the like is
`displayed on the CRT 1.
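The page-turning behavior of paragraphs [0051] and [0052] can be sketched as follows; the size of the lower strip and the minimum movement distance are illustrative assumptions not specified in the text.

```python
def page_turn(start, end, page, screen_h, strip=0.2, min_move=10):
    """Sketch of FIG. 8 / paragraphs [0051]-[0052]: if the lower part
    of the screen is touched and the touch location moves right, the
    next page is shown; if it moves left, the previous page is shown.

    `strip` (fraction of screen height counted as the lower part) and
    `min_move` (minimum horizontal movement) are illustrative
    assumptions. Coordinates grow downward and rightward.
    """
    (x1, y1), (x2, y2) = start, end
    in_lower = y1 >= screen_h * (1 - strip)   # touch began in lower strip
    if in_lower and x2 - x1 >= min_move:
        return page + 1                        # next page screen 8b
    if in_lower and x1 - x2 >= min_move:
        return max(0, page - 1)                # previous page screen
    return page                                # no page turn
```

Touches outside the lower strip, or movements shorter than the threshold, leave the current page displayed, so ordinary scrolling input elsewhere on the screen is unaffected.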
`[0054]
`[Effects of the Invention] With the present invention
`as described above, screen scrolling using a screen
`display is realized that is simpler, faster, and more
`intuitive, without using icons or the like, thereby
`making it possible to reduce operator fatigue and
`avoid control lag, etc.
`[Brief Description of the Drawings]
`[FIG. 1] is a block diagram showing one example of a
`control system to which one embodiment of a method
`for controlling a screen using a touch panel according
`to the present invention is applied.
`[FIG. 2] is a flowchart showing an example of a
`scrolling operation in the control system shown in
`FIG. 1.
`[FIG. 3] is a schematic view showing an example of a
`scrolling operation in the control system shown in
`FIG. 1.
[FIG. 4] is a schematic view showing an example of
`calculating a scrolling amount during a scrolling
`operation in the control system shown in FIG. 1.
`[FIG. 5] is a flowchart showing another embodiment
`of a method for controlling a screen using a touch
`panel according to the present invention.
`[FIG. 6] is a flowchart showing another embodiment
`of a method for controlling a screen using a touch
`panel according to the present invention.
`[FIG. 7] is a schematic view showing another
`embodiment of a method for controlling a screen using
`a touch panel according to the present invention.
`[FIG. 8] is a schematic view showing another
`embodiment of a method for controlling a screen using
`a touch panel according to the present invention.
`[Explanation of the Reference Numerals]
`1 CRT
`2 Touch panel
`3, 4 Interface circuits
`5 CPU
`6 Bus
`
`
`
`
`
`
`(6)
`
`JP H6-309138 A
`
[FIG. 1] [FIG. 2] [FIG. 7]

FIG. 2 flowchart: Wait until input on touch panel → Pressed for a
fixed amount of time? → Save coordinates (x1, y1) of point A that has
been input → Wait until finger is released → Use last saved
coordinates (x2, y2) as point B → Calculate amount of movement from
point A to point B → Scroll the screen
`
`
`
`
`
`
`
`
`
`
`
`
`(7)
`
`JP H6-309138 A
`
[FIG. 3] [FIG. 4]

FIG. 4 annotations: CRT screen. The vertical axis is indicated by y
and the horizontal axis by x for the coordinates; the maxima of x and
y are xmax and ymax, respectively. A (x1, y1): the point first touched
by a finger or other input medium. B (x2, y2): the point where the
finger or other input medium is lifted off of the screen. Reference
point.
`
`
`
`
`
`
`(8)
`
`JP H6-309138 A
`
[FIG. 5]

FIG. 5 flowchart: Wait until input on touch panel → Pressed for a
fixed amount of time? → Save coordinates (x1, y1) of point A that has
been input → Finger lifted off? → Finger moved? → Point B (x2, y2),
saved as the coordinates of a newly-input point → Calculate amount of
movement from point A to point B → Scroll screen
`
`
`
`
`
`
`
[FIG. 6]

FIG. 6 flowchart: Wait until input on touch panel → Pressed for a
fixed amount of time? → Save coordinates (x1, y1) of point A that has
been input → Wait until finger lifted off → Use last saved coordinates
(x2, y2) as point B → Calculate amount of movement from point A to
point B → Scroll screen
`
`
`
`
`
`
`
`
Certificate of Translation and Declaration of Marc Adler

I, Marc Adler, a translator fluent in the English and Japanese
languages, on behalf of TransPerfect Translations International Inc.,
do solemnly and sincerely declare that the following is, to the best
of my knowledge and belief, a true and correct translation of the
document listed below in a form that best reflects the intention and
meaning of the original text.

The translated document is designated as:
Japanese Unexamined Patent Application Publication Number H06-309138

Based on my review of this document, I have determined that the
publication date of the above document was November 4, 1994. The
inventor listed on the face of the document is Shinji Narutaka.

TransPerfect reference number: TPT869700
File name: JPHO,309138A

I hereby declare that all of the statements made herein of my own
knowledge are true and that all statements made on information and
belief are believed to be true; and further that these statements were
made with knowledge that willful false statements and the like so made
are punishable by fine or imprisonment, or both, under Section 1001 of
Title 18 of the United States Code.

Marc Adler
`
Sworn to before me [date illegible in source]

[Signature, Notary Public]
[Stamp, Notary Public]
`
`
`
`
`
`
`
`
`
[Remaining pages: original Japanese-language text of JP H6-309138 A,
of which the foregoing is the translation.]