US005426732A

Boies et al.

[11] Patent Number: 5,426,732
[45] Date of Patent: Jun. 20, 1995

[54] METHOD AND APPARATUS FOR USER CONTROL BY DERIVING NEXT STATES OF A PROCESS FROM A CURRENT STATE AND BY PROVIDING A VISUAL PRESENTATION OF THE DERIVED NEXT STATES

[75] Inventors: Stephen J. Boies, Mahopac; Liam D. Comerford, Carmel; John D. Gould, Yorktown Heights; Susan L. Spraragen, Ossining; Jacob P. Ukelson, Bronx, all of N.Y.

[73] Assignee: International Business Machines Corporation, Armonk, N.Y.

[21] Appl. No.: 320,891

[22] Filed: Oct. 5, 1994

Related U.S. Application Data

[63] Continuation of Ser. No. 870,503, Apr. 17, 1992.

[51] Int. Cl. ................................. G06F 19/00
[52] U.S. Cl. ................................. 395/161
[58] Field of Search ............... 395/155, 156, 159, 160, 161; 364/188, 190, 474.22, 474.27, 578; 345/121

[56] References Cited

U.S. PATENT DOCUMENTS
3,911,215  10/1975  Hurst et al. ..................... 178/18
4,085,438   4/1978  Butler ........................... 364/107
4,220,815   9/1980  Gibson et al. .................... 178/18
4,479,197  10/1984  Haag et al. ...................... 364/900
4,649,499   3/1987  Sutton et al. .................... 364/518
4,763,356   8/1988  Day, Jr. et al. .................. 379/368
4,802,116   1/1989  Ward et al. ...................... 364/900
4,803,039   2/1989  Impink, Jr. et al. ............... 364/188
4,815,014   3/1989  Lipner et al. .................... 364/188
4,823,283   4/1989  Diehn et al. ..................... 364/518
4,831,548   5/1989  Matoba et al. .................... 364/188
4,833,592   5/1989  Yamanaka ......................... 364/138
4,847,604   7/1989  Doyle ............................ 345/180
4,853,498   8/1989  Meadows et al. ................... 178/19
4,868,912   9/1989  Doering .......................... 345/175
4,873,623  10/1989  Lane et al. ...................... 364/188
4,903,012   2/1990  Ohuchi ........................... 345/178
4,910,504   3/1990  Eriksson ......................... 345/174
4,914,624   4/1990  Dunthorn ......................... 395/275
4,929,934   5/1990  Ueda et al. ...................... 345/174
4,929,935   5/1990  Rysavy et al. .................... 345/178
4,931,018   6/1990  Herbst et al. .................... 434/234
(List continued on next page.)
OTHER PUBLICATIONS

Townsend et al., "Distribution Management Systems", 3rd International Conference on Power Systems Monitoring and Control (1991), pp. 103-108.
Feddema et al., "Adaptive Image Feature Prediction and Control for Visual Tracking with a Moving Camera", 1990 IEEE International Conference on Systems, Man, and Cybernetics, pp. 20-24.
Krotkov et al., "An Agile Stereo Camera System for Flexible Image Acquisition", IEEE Journal of Robotics and Automation, vol. 4, No. 1 (Feb. 1988), pp. 108-113.
Primary Examiner-Mark K. Zimmerman
Assistant Examiner-N. Kenneth Burraston
Attorney, Agent, or Firm-Perman & Green
[57] ABSTRACT

A user interface includes a process model unit (34) for predicting one or more allowable next states, from a current state of a process, and a display processing unit (26) for deriving, for each of the allowable next states, a representation of the allowable next state. The display processing unit has an output coupled to a display screen (30) for displaying each of the representations (30b-30g) in conjunction with a representation (30a) of a current state of the process. The user interface further includes an actuator control unit (22) that is coupled to an input mechanism whereby a user selects one of the displayed representations of one of the allowable next states. The motor control unit controls the process to cause it to enter a new current state that corresponds to the selected derived representation. In one embodiment, the display screen has a touchscreen capability whereby the user selects one of the representations by physically touching the display screen within an area associated with a selected one of the derived allowable states.

37 Claims, 6 Drawing Sheets
U.S. PATENT DOCUMENTS

4,954,967   9/1990  Takahashi ........................ 345/173
4,957,690   9/1990  Fennern .......................... 364/188
4,967,190  10/1990  Fujisaki et al. .................. 345/112
4,980,646  12/1990  Zemel ............................ 324/716
4,988,982   1/1991  Rayner et al. .................... 345/173
5,003,505   3/1991  McClelland ....................... 345/173
5,016,008   5/1991  Gruaz et al. ..................... 341/33
5,027,279   6/1991  Gottlieb et al. .................. 364/188
5,038,142   8/1991  Flowers et al. ................... 341/34
5,051,912   9/1991  Johanson et al. .................. 364/474.22
5,053,758  10/1991  Cornett et al. ................... 345/174
5,055,840  10/1991  Bartlett ......................... 341/31
5,117,285   5/1992  Nelson et al. .................... 348/20
5,121,318   6/1992  Lipner et al. .................... 364/188
5,121,319   6/1992  Fath et al. ...................... 364/188
5,123,088   6/1992  Kasahara et al. .................. 395/159
5,185,628   2/1993  Wilson et al. .................... 364/188
5,191,645   3/1993  Carlucci et al. .................. 395/159
5,202,726   4/1993  McCulley et al. .................. 364/188
5,208,903   5/1993  Curry ............................ 395/131
5,224,053   6/1993  Cook ............................. 364/188
5,274,574  12/1993  Tsujido et al. ................... 364/551.02
5,276,789   1/1994  Besaw et al. ..................... 395/140
5,309,369   5/1994  Kamiguchi et al. ................. 364/188
[Drawing Sheets 1-6]

Sheet 1 of 6 (FIG. 1): Block diagram of the process control system 10, showing the camera 12 with its pan and tilt control motors and the tilt, pan controller, the motor control unit 22 (motor commands, switch closures), the video signal and frame capture path to the current image 24a, the process model unit 34 (increments), the display processing unit 26, the display buffer 28 and display refresh, the display screen 30 with the current image and the derived pan, tilt, and zoom images, and the touch screen decoding unit 32 (touch screen signals).

Sheets 2 through 4 of 6 (FIGS. 2a, 2b, and 3): The operator display screen 30, showing the current image and the derived pan, tilt, and zoom next-state images at a first instant in time (FIG. 2a) and after selection of one of the derived next states (FIG. 2b), and a detail of the display processing unit showing the current image, the scaled current image, and the derived pan, tilt, and zoom transformations driven by the current increments (FIG. 3).

Sheet 5 of 6 (FIGS. 4 and 5): Detail of the process model unit 34, showing the controller with its counters, the current process model, the allowed states, and the pan, tilt, and zoom increments provided to the display processing unit 26 and the motor control unit 22 (FIG. 4); and detail of the motor control unit 22, showing the controller that converts switch closures and increments into motor commands (FIG. 5).

Sheet 6 of 6 (FIG. 6): Flow chart of the method: initialize the camera 12, the display screen 30, and the process model 52 (Block A); test for an operator switch closure; on selection of the current state, accept the current state; otherwise generate increments for the motor control unit 22, drive the camera 12 to the next state, update the process model 52 and the increments, and update the images of the current and derived next states.
METHOD AND APPARATUS FOR USER CONTROL BY DERIVING NEXT STATES OF A PROCESS FROM A CURRENT STATE AND BY PROVIDING A VISUAL PRESENTATION OF THE DERIVED NEXT STATES
This is a continuation of copending application Serial Number 07/870,503, filed on Apr. 17, 1992.

FIELD OF THE INVENTION

This invention relates generally to process control systems and, in particular, to process control systems that provide a user interface for displaying information related to a controlled process and that receive input from a user of the system.

BACKGROUND OF THE INVENTION

A process control system may include a digital data processor that includes actuators for controlling a process, and a graphical display device for displaying process-related information to an operator. The system may also include one or more operator input devices for receiving input from the operator of the system. The display may graphically illustrate a current state of the process being controlled, and may also provide a list of one or more state variables for control by the operator. Examples include, but are not limited to, an environmental control system for a large building, a control system for a pharmaceutical manufacturing process, a petrochemical refining process, a control system for an automated manufacturing operation, and a control system for imaging or photographic devices.

In general, in known types of systems the operator is restricted to attempting to reach an alternate state by trial and error manipulation of some system variables. By example, in an environmental control system the operator may be enabled to vary the hot air flow to a selected zone of a building. However, the use of trial and error methods may not always provide an optimum environmental solution.

Furthermore, the use of preprogrammed, or accessed, next system states may not always provide an optimum solution to a particular process control problem.

For either the trial and error approach, or the preprogrammed approach, the operator may not be sufficiently aware of a consequence of a particular next state upon the controlled system so as to make an informed decision as to an appropriate next state, based on a current state of the system.

Known systems typically provide graphical representations of objects, and provide means for implementing "controls" on these objects to obtain a desired behavior. As an example, in U.S. Pat. No. 4,649,499, issued Mar. 10, 1987, entitled "Touchscreen Two-Dimensional Emulation of Three-Dimensional Objects", J. Sutton et al. described the use of a touchscreen to emulate three-dimensional objects on a two-dimensional computer screen. The system is employed to emulate the operation of a desktop rotary card file and a hand held calculator. In the calculator embodiment an image of a calculator is displayed and an operator touches the displayed "keys" to operate the calculator. What is not disclosed is the generation and display of derived (potential) next states of a process, in conjunction with a current state of the process, or the ability to enable a user to select one
of a plurality of derived next states to become a next current state.

It is thus an object of this invention to provide a process control system user interface that derives one or more next states of a system based upon a current state of the system and upon a computational model of the system, and which furthermore provides a visual display of the one or more derived next states in conjunction with a visual display of the current state.

It is a further object of this invention to provide a process control system user interface that derives one or more next states of a system based upon a current state of the system and upon a computational model of the system, which provides a visual display of the one or more derived next states and a visual display of the current state, and which drives the system to a next state in response to the operator selecting one of the one or more displayed derived next states.

SUMMARY OF THE INVENTION

The foregoing and other problems are overcome and the objects of the invention are realized by a method for controlling a process, and by a user interface that is operated in accordance with the invention. The method includes the steps of (a) presenting a representation of a current state of a process; (b) deriving, from the current state of the process, one or more possible alternative states; (c) presenting a representation of the one or more derived possible alternative states; and, in response to a user selecting one of the representations of the one or more derived alternative states, (d) controlling the process to assume a new current state that corresponds to the derived alternative state that is associated with the selected representation.

In response to the user selecting the representation of the current state of the process, the method includes a step of accepting the current state.
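A minimal Python sketch of steps (a) through (d) is given below; the process, display, and selection objects, and their method names, are hypothetical stand-ins chosen for illustration and are not elements named in the disclosure.

# Sketch of the claimed method steps; all objects here are assumed, illustrative interfaces.
def run_user_interface(process, display, get_user_selection):
    """Loop: present the current state and derived alternatives, then act on the user's choice."""
    while True:
        current = process.current_state()                   # (a) present a representation of the current state
        alternatives = process.derive_next_states(current)  # (b) derive possible alternative states (label -> state)
        display.show(current, alternatives)                 # (c) present representations of the derived states
        choice = get_user_selection()                       # a label of an alternative, or "current"
        if choice == "current":
            return process.accept(current)                  # accepting the current state ends the loop
        process.goto(alternatives[choice])                  # (d) drive the process to the selected derived state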
An aspect of the invention relates to the perception of a controlled process from the point of view of the user. In accordance with the invention, the user is presented with an array of possibilities, each of which represents either the current state of the process or one or more possible next states that are derived from the current state. Each possibility may be graphically or otherwise presented to the user. The user selects from amongst the presented possibilities in order to cause the representation of the current state to more closely approximate a desired next state or a desired end state. The system and method of the invention operates to support the user by selecting actions required to bring about the convergence of the current state and the desired next or end state. The system and method also operates to refresh the array of possibilities, based on previous selections by the user and/or as a function of time. As a result, the total cognitive burden of the user is reduced to choosing amongst displayed alternatives until a satisfactory condition is achieved.

Preferably, the representations of the current state and the representations of the alternative states that are derived from the current state are arrayed on the display screen in a manner that supports the user's physical intuition concerning the process. In the context of this invention, "derived" refers to computer-generated predicted transformations. In an exemplary embodiment, the process is the aiming of a camera to obtain a self-portrait of the user. For this case, derived camera tilt images are disposed along a y-axis of the screen, derived camera pan images are disposed along an x-axis of the
screen, and derived camera zoom images are disposed along an inferred z-axis, with a physically larger derived zoom-in image being positioned to appear "nearer" to the operator.

A user interface that is taught by the invention includes a process model unit, for predicting one or more allowable next states from a current state of a process, and a display processing unit for deriving, for each of the allowable next states, a representation of the allowable next state. The display processing unit has an output coupled to a display screen for displaying each of the representations in conjunction with a single representation of the current state of the process. The user interface further includes a process controller that is coupled to an input mechanism whereby a user selects one of the representations of one of the allowable next states. The process controller controls mechanisms embodying the controlled process in a manner that causes the process to enter a new current state that corresponds to the selected derived representation.

By example, the display screen has touchscreen capability whereby the user selects one of the representations by physically touching the display screen within an area associated with a selected one of the derived allowable states or the current state.

In the example above, selection of the current state has the effect of storing the portrait image. In general, selection of the current state has effects which are characteristic of the particular process being controlled. These effects could, for example, include rendering the representation of the user in greater detail, or changing the representation from black and white to color.

BRIEF DESCRIPTION OF THE DRAWING

The above set forth and other features of the invention are made more apparent in the ensuing Detailed Description of the Invention when read in conjunction with the attached Drawing, wherein:

FIG. 1 is a block diagram of an exemplary process control system that is constructed and operated in accordance with the invention;

FIGS. 2a and 2b illustrate an operator display screen, for the exemplary process control system of FIG. 1, having a graphical display of a current system state and also a plurality of derived possible next states, wherein FIG. 2a shows the current state and a plurality of derived next states at a first instant in time, and wherein FIG. 2b shows the result of the selection of one of the derived next states of FIG. 2a;

FIG. 3 is a block diagram that shows in greater detail the display processing unit of FIG. 1;

FIG. 4 is a block diagram which shows in greater detail the process model unit of FIG. 1;

FIG. 5 is a block diagram which shows in greater detail the motor control unit of FIG. 1; and

FIG. 6 is a flow chart that illustrates a method of the invention.

DETAILED DESCRIPTION OF THE INVENTION

FIG. 1 is a block diagram of an exemplary process control system 10 that is constructed and operated in accordance with the invention. In FIG. 1 the process controls the pointing direction and the size of the field of view of a camera 12. However, and as will become apparent below, the illustrated embodiment is not intended to be read in a limiting sense upon the practice of the invention.
Camera 12 is gimbal-mounted and has an associated tilt control motor 14 for rotating the camera 12 around an x-axis, a pan control motor 16 for rotating the camera 12 around a y-axis, and a lens 12a zoom control motor 18 for providing an apparent translation along a z-axis. The zoom control motor 18 enables camera 12 zoom-in and zoom-out operations. Motors 14, 16, and 18 receive electrical command signals from a tilt, pan, and zoom controller 20, which in turn receives higher level motor commands from a motor control unit 22. By example, a given motor command generated by the motor control unit 22 may be one to change the pan angle by +3°. In response, the controller 20 interprets the motor command and generates an appropriate pan motor 16 actuation signal so as to pan the camera 12 by the specified amount.

The operation of the tilt, pan, zoom controller 20, and the attached motors 14, 16, and 18, is well known, and the operation of same will not be described in further detail.
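As an illustration only, this command path can be sketched as follows; the Motor class and its step_degrees method are assumptions introduced for the example and do not correspond to named elements of the disclosure.

# Sketch of a high-level motor command being turned into a motor actuation; all names are illustrative.
class Motor:
    def __init__(self, name):
        self.name = name
        self.angle = 0.0

    def step_degrees(self, delta):
        # Stand-in for the electrical actuation signal sent to the physical motor.
        self.angle += delta

class TiltPanZoomController:
    """Interprets high-level motor commands and actuates the addressed motor."""
    def __init__(self):
        self.motors = {"tilt": Motor("tilt"), "pan": Motor("pan"), "zoom": Motor("zoom")}

    def execute(self, axis, delta_degrees):
        self.motors[axis].step_degrees(delta_degrees)

controller = TiltPanZoomController()
controller.execute("pan", +3.0)   # the example command: change the pan angle by +3 degrees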
Camera 12 provides a video output, or video signal 12b, to a frame capture buffer 24, also referred to in the art as a frame grabber. Frame capture buffer 24 includes a memory for storing digital information representing at least one full image frame that is generated by the camera 12. If the video signal 12b is provided in analog form, the buffer 24 also includes an analog to digital converter for converting, prior to storage, the analog video signal to a digital signal. The frame capture buffer 24 provides the stored digital information as a current image 24a to a display processing unit 26.

In accordance with an aspect of the invention the display processing unit 26 operates generally to derive visual representations of one or more possible next process states from a current state of the process. For the illustrated embodiment, the display processing unit 26 derives a plurality of next camera image states from the current image 24a, and provides at an output the current image and the one or more derived camera images. This information is stored in a conventional display buffer 28, from where it is provided to a display screen 30 having graphical capability.

For the illustrated embodiment, and as is also shown in FIG. 2, the display screen 30 displays the current camera image state in a region 30a, a derived pan right state in a region 30b, a derived pan left state in a region 30c, a derived zoom-out state in a region 30d, a derived zoom-in state in a region 30e, a derived tilt down state in a region 30f, and a derived tilt up state in a region 30g. Displayed states 30b-30g are all derived from the current image state 30a, and display to the operator a result of the execution of the associated action. That is, each derived state 30b-30g displays what the current image state 30a will become if the associated derived state is selected by the operator for execution.

In the context of this invention, "derived" refers to computer-generated predicted transformations.

In this embodiment of the invention the display screen 30 has touchscreen capabilities. That is, each of the areas 30a-30g is defined and operated to have switching capabilities so that, in response to the operator physically touching one of the areas 30a-30g, touchscreen signals 31 are generated. The touchscreen signals 31 include x-y coordinate information that specify a location on the display screen 30 that was touched by the operator. The touchscreen signals 31 are provided to a touchscreen decoding unit 32. In the touchscreen decoding unit 32 the screen coordinates associated with
each area 30a-30g are predetermined to encompass one of a plurality of specific touchscreen areas so that, in response to the touchscreen signals 31, the touchscreen control unit is enabled to identify which of the areas 30a-30g was touched by the operator. That is, the touchscreen decoding unit 32 converts the touch screen signals 31 into a specific switch closure signal. For the illustrated embodiment the display screen 30 has seven (software generated) independent momentary-type switches associated therewith, corresponding to the areas 30a-30g.
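This decoding step can be sketched as a hit test of the reported x-y coordinates against the seven areas; the rectangle coordinates below are invented for illustration, and only the idea of converting a touch into one of seven switch closures comes from the text.

# Sketch of touch decoding; the area rectangles are an assumed layout, not taken from the patent.
AREAS = {   # region id -> (x0, y0, x1, y1) in screen pixels
    "30a_current":   (240, 180, 400, 300),
    "30b_pan_right": (420, 180, 580, 300),
    "30c_pan_left":  ( 60, 180, 220, 300),
    "30g_tilt_up":   (240,  40, 400, 160),
    "30f_tilt_down": (240, 320, 400, 440),
    "30d_zoom_out":  ( 60,  40, 220, 160),
    "30e_zoom_in":   (420, 320, 620, 460),   # drawn larger, consistent with the zoom-in area appearing "nearer"
}

def decode_touch(x, y):
    """Return the switch closure (region id) for a touch, or None if no area was hit."""
    for region, (x0, y0, x1, y1) in AREAS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return region
    return None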
In other unillustrated embodiments of the invention the touch screen capability may be replaced by, for example, a keyboard-driven cursor and/or a pointing device (mouse) cursor, whereby the operator is enabled to specify to the system 10 one of the areas 30a-30g. The specific implementation of the operator input function is thus not germane to the understanding of, or the operation of, the invention. That is, other methods for selection, such as voice recognition, light pen, etc., may occur to those skilled in the art.

Reference is now made to FIGS. 2a and 2b for illustrating the display screen 30 for the exemplary process control system of FIG. 1. In this example the subject that is imaged by the camera 12 is the operator of the system 10, as is done in a self-portrait booth (kiosk). As a result, the system 10 may also include a printer or other suitable device for producing a hard copy of the camera image, after the camera 12 is correctly positioned with respect to the subject being imaged. Other means may be provided for storing and/or for transmitting the image for later use.

This self-portrait application particularly emphasizes the utility of the invention. Conventional self-portrait techniques suffer from problems resulting from large differences in height between subjects and a lack of a suitable feedback mechanism for determining where the camera is aimed. These problems have become more apparent in that there is a developing class of computer applications wherein self-portraits or pictures of the participants are an integral part of the application. These applications include, but are not limited to, security systems, multi-media applications, and teleconferencing systems.

In accordance with the invention the area 30a displays the current image of the subject. Areas 30b-30g each display an image that is derived from the current image 30a, and present to the operator an image of a next possible state of the current image.

Preferably, the current and derived images are arrayed on the display screen 30 in a manner that supports the user's physical intuition concerning the process, in this case the movement of the camera. That is, the derived tilt areas 30g and 30f are disposed along a y-axis of the screen, the derived pan areas 30b and 30c are disposed along an x-axis of the screen, and the derived zoom areas 30d and 30e are disposed along an inferred z-axis, with the physically larger derived zoom-in image area 30e being positioned to appear "nearer" to the operator.

In operation, the operator touches or otherwise selects one of the areas 30b-30g. In response, the camera 12 is repositioned accordingly, and the current image 30a reflects the result of the repositioning operation. Furthermore, each of the derived images displayed in the areas 30b-30g is updated in accordance with the new current image, and also at regular intervals of time. For example, the derived pan right image area 30b is updated to reflect the result of a pan right operation, based on the current updated image 30a. This process continues until the current image 30a meets a criteria applied by the operator. At this point, the operator selects the current image area 30a, and the current image 30a is stored and/or is printed out.

In FIG. 2a the operator selects the derived pan right image 30b to better center the image presented in the current image area 30a. In response, the camera 12 is repositioned by a current pan right increment, resulting in the capture and display of a new current image in the area 30a in FIG. 2b. The images presented in areas 30b-30g are all derived from the new current image, and present possible alternative next states for the imaging process.

Further in accordance with the invention each of the derived next states is quantized. That is, each next state is derived so as to be within a predetermined distance from the current state. For the example of FIG. 2, the pan right and pan left derived next states are a predetermined angular displacement about the y-axis from the current state, and the derived tilt down and the tilt up next states are a predetermined angular displacement about the x-axis. Similarly, the derived zoom-out and zoom-in next states are a predetermined linear distance along the z-axis.
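A sketch of this quantization follows, assuming an illustrative CameraState record and illustrative increment values; the disclosure requires only that the increments be predetermined.

# Sketch of quantized next-state derivation; the dataclass and numeric increments are assumptions.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class CameraState:
    pan_deg: float
    tilt_deg: float
    zoom: float

PAN_INC_DEG = 5.0
TILT_INC_DEG = 5.0
ZOOM_INC = 0.25

def derive_next_states(current: CameraState) -> dict:
    """Return the six quantized next states, each a fixed increment from the current state."""
    return {
        "pan_right": replace(current, pan_deg=current.pan_deg + PAN_INC_DEG),
        "pan_left":  replace(current, pan_deg=current.pan_deg - PAN_INC_DEG),
        "tilt_up":   replace(current, tilt_deg=current.tilt_deg + TILT_INC_DEG),
        "tilt_down": replace(current, tilt_deg=current.tilt_deg - TILT_INC_DEG),
        "zoom_in":   replace(current, zoom=current.zoom + ZOOM_INC),
        "zoom_out":  replace(current, zoom=current.zoom - ZOOM_INC),
    }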
Referring again to FIG. 1, the information relating to the derived next state quantization factors is maintained by a process model unit 34. The process model unit 34 is coupled to the display processing unit 26, as is also seen in FIG. 3, for providing quantization factors (current increments) thereto. The process model unit 34 is also coupled to the motor control unit 22 for providing the quantization factors thereto.

In general, the process model unit 34 is responsible for predicting the allowed next states, for providing the information required by the display processing unit 26 to derive the appearance of those states, and for providing the information required by the motor control unit 22 to position the camera to achieve a state having the appearance of a selected one of the derived states.

As seen in FIG. 3, the display processing unit 26 receives the current image 24a. The current image 24a is applied to a scaling block 40 wherein the current image 24a is reduced to a size compatible with the current image area 30a (FIG. 2). The output of the image scaling block 40 is the scaled current image 40a. The scaled current image 40a is applied to the display buffer 28 for display within the current image area 30a, and is also applied to an image pan transformation block 42, an image tilt transformation block 44, and an image zoom transformation block 46. Blocks 42, 44, and 46 each apply a predetermined image transformation function to the scaled current image 40a, in accordance with an associated current pan increment, a current tilt increment, and a current zoom increment, respectively. The current increments are received from the process model unit 34. The operation of blocks 42, 44, and 46 generates the derived next states of the scaled current image 40a for display in the areas 30b-30g of the display screen 30.

By example, image pan transformation block 42 horizontally shifts the scaled current image 40a by a number of pixels that correspond to an angular displacement that is specified by the current pan increment. The image tilt transformation block 44 vertically shifts the scaled current image 40a by a number of pixels that correspond to an angular displacement that is specified by the current tilt increment.
The image zoom transformation block 46 applies both a pixel expansion and contraction operation to the scaled current image 40a to generate the derived zoom-in state and the derived zoom-out state, respectively. The amount of pixel expansion and contraction is specified by the current zoom increment signal.
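A small NumPy sketch of the kinds of transformations attributed to blocks 42, 44, and 46 follows; the zero fill at the vacated edges, the factor-of-two zoom, and the single-channel, even-dimensioned image are illustrative assumptions, not details taken from the disclosure.

# Sketch of pan, tilt, and zoom previews on a 2-D grayscale image; parameters are illustrative.
import numpy as np

def pan_shift(image, pixels):
    """Horizontally shift the scaled current image; vacated columns are filled with zeros."""
    shifted = np.zeros_like(image)
    if pixels > 0:
        shifted[:, pixels:] = image[:, :-pixels]
    elif pixels < 0:
        shifted[:, :pixels] = image[:, -pixels:]
    else:
        shifted[:] = image
    return shifted

def tilt_shift(image, pixels):
    """Vertically shift the image by reusing the horizontal shift on its transpose."""
    return pan_shift(image.T, pixels).T

def zoom_in(image):
    """Expand the central quarter of the image by pixel replication (a 2x zoom-in preview)."""
    h, w = image.shape
    crop = image[h // 4 : h // 4 + h // 2, w // 4 : w // 4 + w // 2]
    return np.repeat(np.repeat(crop, 2, axis=0), 2, axis=1)

def zoom_out(image):
    """Contract the image to half size and pad it back to the original size with zeros."""
    small = image[::2, ::2]
    padded = np.zeros_like(image)
    top = (image.shape[0] - small.shape[0]) // 2
    left = (image.shape[1] - small.shape[1]) // 2
    padded[top:top + small.shape[0], left:left + small.shape[1]] = small
    return padded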
FIG. 4 is a block diagram that illustrates the process model unit 34. The process model unit 34 includes a controller 50 that receives switch closure information 32a from the touchscreen decoding unit 32. The switch closure information 32a is representative of the operator touching the display screen 30 in one of the areas 30a-30g. Controller 50 includes a plurality of counters 50a for counting a number of switch closures for each of the regions 30b-30g. The use of the counters 50a enables different increments to be employed as a function of a number of switch closures for a particular touchscreen switch. For example, for the first three pan left and pan right switch closures a first pan increment resolution may be employed (5°), and for succeeding pan left and pan right switch closures a second, finer increment resolution may be employed (2°). A further predetermined number of depressions of an individual image state touchscreen switch may be employed to reset the increment to the larger value, as repeated depressions may indicate that the user is attempting to make a large change in image aspect or position.
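A sketch of such counter-based increment selection is given below, using the 5-degree and 2-degree pan resolutions and the three-closure threshold from the example above; the reset threshold is an assumed value.

# Sketch of counters like 50a selecting a coarse or fine increment; the reset count is an assumption.
COARSE_DEG = 5.0    # first pan increment resolution
FINE_DEG = 2.0      # finer resolution after the first three closures
COARSE_AFTER = 3    # closures served at the coarse resolution
RESET_AFTER = 10    # assumed press count after which the increment resets to the coarse value

class IncrementSelector:
    """Tracks switch closures per region and returns the increment to apply."""
    def __init__(self):
        self.counts = {}   # region id (e.g. "pan_right") -> number of closures

    def increment_for(self, region):
        count = self.counts.get(region, 0) + 1
        if count > RESET_AFTER:
            count = 1                      # repeated presses suggest a large change; start over coarse
        self.counts[region] = count
        return COARSE_DEG if count <= COARSE_AFTER else FINE_DEG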
A depression of the switch associated with the current image area 30a

FIG. 1. The switch closure for the current image state area 30a may be employed by the controller 60 to reset the camera 12 to a predetermined initial orientation.

FIG. 6 is a flow chart that illustrates the operation of the method of the invention for the embodiment of FIGS. 1-5. At Block A the camera 12 position and process model 52 are initialized. The display screen 30 is thus also initialized with a current image state 30a and a corresponding plurality of derived image states 30b-30g. At Block B a determination is made if the operator has caused a switch closure. If NO, a determination is made at Block C if a timer 36 (FIG. 1) has generated a timing pulse. Timer 36 is provided to cause the frame capture buffer 24 to capture a new image at regular intervals so as to update the representations of the current image, and the derived images. This enables the current image area 30a and the derived image areas 30b-30g to accurately reflect changes in position and facial expression of the operator. If NO at Block C, the method continues at Block B.
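The loop of FIG. 6 can be sketched as follows, with hypothetical camera, screen, model, touch, and frame grabber objects standing in for the units of FIG. 1; all method names are placeholders, not elements of the disclosure.

# Sketch of the FIG. 6 loop (Blocks A-H); every object and method name here is an assumption.
import time

def run(camera, screen, model, touch, frame_grabber, period_s=1.0):
    camera.initialize()                                               # Block A: initialize camera,
    screen.initialize()                                               #          display screen,
    model.initialize()                                                #          and process model
    last_refresh = time.monotonic()
    while True:
        closure = touch.poll()                                        # Block B: any switch closure?
        if closure is None:
            if time.monotonic() - last_refresh >= period_s:           # Block C: timer pulse?
                screen.update(frame_grabber.capture(), model.derived_states())
                last_refresh = time.monotonic()
            continue
        if closure == "current":                                      # Block D: current state selected?
            return screen.accept_current()                            # Block E: accept the current state
        increments = model.increments_for(closure)                    # Block F: generate increments
        camera.drive(increments)                                      # Block G: drive camera to the next state
        model.update(closure)                                         # Block H: update process model and increments
        screen.update(frame_grabber.capture(), model.derived_states())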
If YES at Block B, a determination is made at Block D if the operator has made a switch closure for a desired one of the derived states 30b-30g, or for the current image state 30a. If the switch closure is for the current state 30a, the YES branch is taken and the method continues at Block E where the current image state 30a is accepted. That is, the current image may be stored, and/or converted to a tangible form, and/or