US 20070176906A1

(12) Patent Application Publication    (10) Pub. No.: US 2007/0176906 A1
     WARREN                            (43) Pub. Date: Aug. 2, 2007

(54) PROXIMITY SENSOR AND METHOD FOR INDICATING EXTENDED INTERFACE RESULTS

(75) Inventor: Andrew I. WARREN, San Jose, CA (US)

Correspondence Address:
INGRASSIA, FISHER & LORENZ, P.C.
7150 E. CAMELBACK ROAD, SUITE 325
SCOTTSDALE, AZ 85251

(73) Assignee: SYNAPTICS INCORPORATED, Santa Clara, CA (US)

(21) Appl. No.: 11/613,063

(22) Filed: Dec. 19, 2006

Related U.S. Application Data

(60) Provisional application No. 60/764,406, filed on Feb. 1, 2006.

Publication Classification

(51) Int. Cl.
     G06F 3/041  (2006.01)
(52) U.S. Cl. ..................................... 345/173

(57) ABSTRACT
`
A proximity sensor device and method is provided that facilitates improved system usability. Specifically, the proximity sensor device and method provide a user with the ability to easily cause different results in an electronic system using a proximity sensor device as a user interface. For example, it can be used to facilitate user interface navigation, such as dragging and scrolling. As another example, it can be used to facilitate value adjustments, such as changing a device parameter. In general, the proximity sensor device is adapted to distinguish between different object combination motions, determine relative temporal relationships between those motions, and generate user interface results responsive to the motions. This allows a user to selectively generate different results using the motion of two different object combinations.
`
[Representative drawing (FIG. 5 state diagram): states 501, 502, 504; legend: 0 = no object presence, 1 = first object combination motion, 2 = second object combination motion]

Petitioner Samsung Ex-1007, 0001
`
[Drawing Sheet 1 of 7: FIG. 1]
`
`
`
[Drawing Sheet 2 of 7: FIGS. 2 and 3]
`
`
`
[Drawing Sheet 3 of 7: FIGS. 4 and 5; FIG. 5 legend: 0 = no object presence, 1 = first object combination motion, 2 = second object combination motion]
`
`
`
[Drawing Sheet 4 of 7: FIGS. 6 and 7]
`
`
`
[Drawing Sheet 5 of 7: FIGS. 8 and 9]
`
`
`
[Drawing Sheet 6 of 7: FIGS. 10 and 11]
`
`
`
[Drawing Sheet 7 of 7: FIGS. 12 and 13]
`
`
`
PROXIMITY SENSOR AND METHOD FOR INDICATING EXTENDED INTERFACE RESULTS

PRIORITY DATA

[0001] This application claims priority of U.S. Provisional Patent Application Ser. No. 60/764,406, filed on Feb. 1, 2006, which is hereby incorporated herein by reference.
`
FIELD OF THE INVENTION

[0002] This invention generally relates to electronic devices, and more specifically relates to proximity sensor devices and using a touch sensor device for producing user interface inputs.
`
BACKGROUND OF THE INVENTION

[0003] Proximity sensor devices (also commonly called touch pads or touch sensor devices) are widely used in a variety of electronic systems. A proximity sensor device typically includes a sensing region, often demarked by a surface, which uses capacitive, resistive, inductive, optical, acoustic and/or other technology to determine the presence, location and/or motion of one or more fingers, styli, and/or other objects. The proximity sensor device, together with finger(s) and/or other object(s), can be used to provide an input to the electronic system. For example, proximity sensor devices are used as input devices for larger computing systems, such as those found integral within notebook computers or peripheral to desktop computers. Proximity sensor devices are also used in smaller systems, including handheld systems such as personal digital assistants (PDAs), remote controls, and communication systems such as wireless telephones and text messaging systems. Increasingly, proximity sensor devices are used in media systems, such as CD, DVD, MP3, video or other media recorders or players.

[0004] Many electronic devices include a user interface, or UI, and an input device for interacting with the UI (e.g., interface navigation). A typical UI includes a screen for displaying graphical and/or textual elements. The increasing use of this type of UI has led to a rising demand for proximity sensor devices as pointing devices. In these applications the proximity sensor device can function as a value adjustment device, cursor control device, selection device, scrolling device, graphics/character/handwriting input device, menu navigation device, gaming input device, button input device, keyboard and/or other input device.

[0005] One issue with past touch sensor devices has been enabling dragging, scrolling, and similar functions with gestures. Specifically, many users cite difficulty in using touch sensor devices for "dragging". In general, "dragging" comprises continued selection, optionally with motion. For example, dragging occurs when an icon is selected and moved using a mouse. Another example is when a portion of text is selected and highlighted. A third example is when a scrollbar thumb on a scrollbar is selected and moved to scroll through text. In all three examples, dragging is accomplished with continued selection (e.g., pressing a button) combined with motion (e.g., cursor motion). Continued selection with zero motion, often referred to as a "press gesture," may be viewed either as a special case of the drag gesture or as a distinct gesture.

[0006] With a mouse, dragging is simple: One moves the cursor to a start point, presses and holds a mouse button, then moves the cursor to an end point, optionally "rowing" (lifting the mouse when it reaches the edge of the mouse pad and setting it back down away from the edge) to drag for long distances, then releases the mouse button to stop dragging. With a traditional touch sensor device, dragging is much more awkward, particularly for dragging long distances. Dragging for long distances is typically more awkward on touch sensor devices because it can require "rowing", e.g., lifting the finger when the edge of the touch sensor is reached to reposition the finger on the touch sensor. Specifically, while some previous techniques have facilitated the use of two fingers to initiate dragging, dragging ends when both the fingers are removed, and they have failed to provide any mechanism for maintaining dragging selection without cursor movement. Thus, in these and other systems maintaining dragging with a touch sensor device requires simultaneously pressing another input device (e.g., button) while moving the cursor with the touch sensor device. Pressing a button while moving the cursor using the touch sensor device can be difficult for some users.

[0007] The motion in a dragging action often consists of a straight line from a start point to an end point, but some uses of dragging involve other kinds of motions. An effective dragging gesture for touch sensor devices must accommodate all these usage patterns. Some prior art solutions, such as edge motion, help with simple linear drags but are less helpful with the kinds of drag motions used, for example, when operating a scroll bar.

[0008] Some current techniques facilitate touch sensor device dragging without requiring input to other buttons. For example, the current market standard is a gesture called "tap-and-a-half" dragging. To utilize tap-and-a-half dragging, once the user has ensured that the cursor or other indicator is at a desired start point, the user lifts any finger that is on the sensitive surface of the touch sensor device, taps once, and quickly places the finger back down on the sensitive surface. This gesture activates dragging. The user then moves the cursor by moving the finger to an end point and then lifts the finger to stop dragging. Typically the same finger is used for the entire dragging motion, but different fingers or objects may be used for different portions of the gesture.

[0009] While the use of the basic tap-and-a-half gesture to initiate dragging is an improvement, its efficiency in facilitating dragging over long distances is limited. Again, when dragging for long distances the user can be required to "row", e.g., lift the finger when the edge of the touch sensor is reached to reposition the finger on the touch sensor. When the user lifts the finger to "row" the cursor, the selection will be lost and the drag will end, and typically must be restarted with another tap-and-a-half gesture, greatly complicating the gestures required to perform a long distance drag. Many solutions have been used and proposed to enhance the tap-and-a-half gesture for long distance drags. For example, U.S. Pat. No. 5,880,411 discloses locking drags, extended drags, and edge motion. However, all of these solutions, and indeed the tap-and-a-half gesture itself, have the disadvantage that performing them involves distinctly different and complicated hand and finger actions than are used with a mouse, hence making dragging difficult for users familiar with mice.

[0010] Thus, while many different techniques have been used to facilitate dragging, there remains a continuing need for improvements in device usability. Particularly, there is a continuing need for improved techniques for facilitating dragging with proximity sensor devices.
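The tap-and-a-half activation sequence described in paragraph [0008] (lift, tap once, quickly re-touch) can be sketched as a small timed recognizer. This is an illustrative sketch only: the `TAP_MAX_S` and `RETOUCH_MAX_S` thresholds and the state names are assumptions, not values from the application.

```python
from dataclasses import dataclass

# Hypothetical timing thresholds (not specified in the application).
TAP_MAX_S = 0.2      # finger must lift within this time to count as a tap
RETOUCH_MAX_S = 0.3  # finger must return within this time after the tap


@dataclass
class TapAndAHalf:
    """Recognizes tap followed by a quick re-touch, which activates dragging."""
    state: str = "idle"  # idle -> tapped -> dragging
    last_down: float = 0.0
    last_up: float = 0.0

    def touch_down(self, t: float) -> None:
        if self.state == "tapped" and t - self.last_up <= RETOUCH_MAX_S:
            self.state = "dragging"  # tap-and-a-half complete
        else:
            self.state = "idle"
        self.last_down = t

    def touch_up(self, t: float) -> None:
        if self.state == "idle" and t - self.last_down <= TAP_MAX_S:
            self.state = "tapped"    # a quick tap; wait for the re-touch
        else:
            self.state = "idle"      # lifting in any other state ends the gesture
        self.last_up = t


g = TapAndAHalf()
g.touch_down(0.00); g.touch_up(0.10)  # quick tap
g.touch_down(0.25)                    # re-touch within threshold: drag active
```

Note that this sketch also exhibits the limitation the paragraph above criticizes: any `touch_up` while `state == "dragging"` drops back to `idle`, ending the drag.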
`
BRIEF SUMMARY OF THE INVENTION

[0011] The present invention provides a proximity sensor device and method that facilitates improved system usability. Specifically, the proximity sensor device and method provide a user with the ability to easily cause different results in an electronic system using a proximity sensor device as a user interface. For example, it can be used to facilitate user interface navigation, such as dragging and scrolling. As another example, it can be used to facilitate value adjustments, such as changing a device parameter. In general, the proximity sensor device is adapted to distinguish between different object combination motions, determine relative temporal relationships between those motions, and generate user interface results responsive to the motions. Specifically, the proximity sensor device is adapted to indicate a first result responsive to detected motion of the first object combination, indicate a second result responsive to detected motion of the second object combination, the second result different from the first result, and indicate a third result responsive to detected motion of the first object combination following the detected motion of the second object combination, the third result different from the first result and the second result. This allows a user to selectively generate different results using the motion of two different object combinations.
[0012] In one specific embodiment, the proximity sensor device is implemented to facilitate continued cursor movement with selection, commonly referred to as "dragging," using motion of different object combinations. For example, the proximity sensor device is implemented to indicate selection with cursor movement responsive to detected motion of two adjacent objects across the sensing region, indicate selection without cursor movement responsive to detected motion of one object across the sensing region when the detected motion of one object across the sensing region followed the detected motion of two adjacent objects across the sensing region without an intervening termination event, and indicate further selection with cursor movement responsive to detected motion of two adjacent objects across the sensing region when the detected motion of two adjacent objects across the sensing region followed the detected motion of one object across the sensing region that followed the detected motion of the adjacent objects across the sensing region. This facilitates use of the proximity sensor device by a user to indicate results such as extended dragging, and is particularly useful for indicating continuing adjustments, for example, to facilitate dragging an object over a large distance or scrolling through a large document. This allows a user to continue to drag an object without requiring the user to perform more complex gestures on the proximity sensor device or activate extra control buttons.
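The embodiment in paragraph [0012] can be sketched as a tiny state machine driven by the object combination detected in each motion sample, using the numbering from the FIG. 5 legend (0 = no object, 1 = first object combination in motion, 2 = second object combination in motion). This is a minimal sketch of the behavior described above, not the claimed implementation; the class and result strings are illustrative.

```python
# Inputs follow the FIG. 5 legend: 0 = no object present,
# 1 = first object combination (e.g., one finger) in motion,
# 2 = second object combination (e.g., two adjacent fingers) in motion.
class ExtendedDrag:
    def __init__(self) -> None:
        self.dragging = False  # True once two-adjacent-object motion starts a drag

    def step(self, combo: int) -> str:
        """Return the indicated result for one detected motion sample."""
        if combo == 2:
            self.dragging = True
            return "selection with cursor movement"     # drag, or further drag
        if combo == 1 and self.dragging:
            # One-object motion following two-object motion, with no
            # intervening termination event: selection is held.
            return "selection without cursor movement"
        self.dragging = False                           # termination event
        return "no result"


fsm = ExtendedDrag()
results = [fsm.step(c) for c in [2, 1, 2, 0]]
```

The sample sequence `[2, 1, 2, 0]` models the extended drag: two-finger drag, one-finger hold (e.g., while "rowing"), resumed two-finger drag, then lift-off terminating the gesture.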
`
BRIEF DESCRIPTION OF DRAWINGS

[0013] The preferred exemplary embodiment of the present invention will hereinafter be described in conjunction with the appended drawings, where like designations denote like elements, and:

[0014] FIG. 1 is a block diagram of an exemplary system that includes a proximity sensor device in accordance with an embodiment of the invention;

[0015] FIGS. 2-4 are side views of exemplary object combinations in the sensing area of a proximity sensor device in accordance with embodiments of the invention;

[0016] FIG. 5 is a state diagram of a proximity sensor device process in accordance with embodiments of the invention;

[0017] FIGS. 6, 8, 10 and 12 are schematic views of a proximity sensor device with object combination motions in accordance with embodiments of the invention; and

[0018] FIGS. 7, 9, 11 and 13 are schematic views of results on a program interface in accordance with embodiments of the invention.
`
DETAILED DESCRIPTION OF THE INVENTION

[0019] The following detailed description is merely exemplary in nature and is not intended to limit the invention or the application and uses of the invention. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, brief summary or the following detailed description.

[0020] The present invention provides a proximity sensor device and method that facilitates improved system usability. Specifically, the proximity sensor device and method provide a user with the ability to easily cause different results in an electronic system using a proximity sensor device as a user interface. For example, it can be used to facilitate user interface navigation, such as dragging and scrolling.
[0021] To cause selective results the proximity sensor device is adapted to distinguish between different object combination motions, determine relative temporal relationships between those motions, and generate user interface results responsive to the motions. Specifically, the proximity sensor device is adapted to indicate a first result responsive to detected motion of the first object combination, indicate a second result responsive to detected motion of the second object combination, the second result different from the first result, and indicate a third result responsive to detected motion of the first object combination following the detected motion of the second object combination, the third result different from the first result and the second result. This allows a user to selectively generate different results using the motion of two different object combinations.
[0022] Turning now to the drawing figures, FIG. 1 is a block diagram of an exemplary electronic system 100 that is coupled to a proximity sensor device 116. Electronic system 100 is meant to represent any type of personal computer, portable computer, workstation, personal digital assistant, video game player, communication device (including wireless phones and messaging devices), media device, including recorders and players (including televisions, cable boxes, music players, and video players), or other device capable of accepting input from a user and of processing information. Accordingly, the various embodiments of system 100 may include any type of processor, memory or display. Additionally, the elements of system 100 may communicate via a bus, network or other wired or wireless interconnection. The proximity sensor device 116 can be connected to the system 100 through any type of interface or connection, including I2C, SPI, PS/2, Universal Serial Bus (USB), Bluetooth, RF, IrDA, or any other type of wired or wireless connection, to list several non-limiting examples.
`
`
[0023] Proximity sensor device 116 includes a processor 119 and a sensing region 118. Proximity sensor device 116 is sensitive to the position of a stylus 114, finger and/or other input object within the sensing region 118. "Sensing region" 118 as used herein is intended to broadly encompass any space above, around, in and/or near the proximity sensor device 116 wherein the sensor of the touchpad is able to detect a position of the object. In a conventional embodiment, sensing region 118 extends from the surface of the sensor in one or more directions for a distance into space until signal-to-noise ratios prevent object detection. This distance may be on the order of less than a millimeter, millimeters, centimeters, or more, and may vary significantly with the type of position sensing technology used and the accuracy desired. Accordingly, the planarity, size, shape and exact locations of the particular sensing regions 118 will vary widely from embodiment to embodiment.
[0024] In operation, proximity sensor device 116 suitably detects a position of stylus 114, finger or other input object within sensing region 118, and using processor 119, provides electrical or electronic indicia of the position to the electronic system 100. The system 100 appropriately processes the indicia to accept inputs from the user, to move a cursor or other object on a display, or for any other purpose.
[0025] The proximity sensor device 116 can use a variety of techniques for detecting the presence of an object. As several non-limiting examples, the proximity sensor device 116 can use capacitive, resistive, inductive, surface acoustic wave, or optical techniques. In a common capacitive implementation of a touch sensor device a voltage is typically applied to create an electric field across a sensing surface. A capacitive proximity sensor device 116 would then detect the position of an object by detecting changes in capacitance caused by the changes in the electric field due to the object. Likewise, in a common resistive implementation a flexible top layer and a bottom layer are separated by insulating elements, and a voltage gradient is created across the layers. Pressing the flexible top layer creates electrical contact between the top layer and bottom layer. The resistive proximity sensor device 116 would then detect the position of the object by detecting the voltage output due to changes in resistance caused by the contact of the object. In an inductive implementation, the sensor might pick up loop currents induced by a resonating coil or pair of coils, and use some combination of the magnitude, phase and/or frequency to determine distance, orientation or position. In all of these cases the proximity sensor device 116 detects the presence of the object and delivers position information to the system 100. Examples of the types of technologies that can be used to implement the various embodiments of the invention can be found in U.S. Pat. Nos. 5,543,591, 6,259,234 and 5,815,091, each assigned to Synaptics Inc.
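For the resistive implementation mentioned above, the position readout reduces to a voltage-divider calculation: the measured voltage tapped from the driven layer is proportional to the contact position along that axis. The sketch below illustrates that general principle only; the function name, voltages, and dimensions are illustrative assumptions, not details from the application.

```python
# Illustrative resistive-touch readout: a reference voltage is driven across
# one layer, and pressing the flexible top layer taps the resulting gradient,
# so the measured voltage divides in proportion to the contact position.
def resistive_position(v_measured: float, v_ref: float,
                       axis_length_mm: float) -> float:
    """Map a tapped voltage to a contact coordinate along the driven axis."""
    ratio = v_measured / v_ref  # 0.0 at one edge, 1.0 at the other
    return ratio * axis_length_mm


# A contact at mid-span taps half the reference voltage.
x_mm = resistive_position(v_measured=1.65, v_ref=3.3, axis_length_mm=80.0)
```

In a real 4-wire design the gradient is driven alternately along each axis so both coordinates can be read with the same divider principle.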
[0026] Proximity sensor device 116 includes a sensor (not shown) that utilizes any combination of sensing technology to implement one or more sensing regions. For example, the sensor of proximity sensor device 116 can use arrays of capacitive sensor electrodes to support any number of sensing regions. As another example, the sensor can use capacitive sensing technology in combination with resistive sensing technology to support the same sensing region or different sensing regions. Depending on the sensing technique used for detecting object motion, the size and shape of the sensing region, the desired performance, the expected operating conditions, and the like, proximity sensor device 116 can be implemented in a variety of different ways. The sensing technology can also vary in the type of information provided, such as to provide "one-dimensional" position information (e.g., along a sensing region) as a scalar, "two-dimensional" position information (e.g., horizontal/vertical axes, angular/radial, or any other axes that span the two dimensions) as a combination of values, and the like.
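One common way to derive the "one-dimensional" scalar position mentioned above from an array of capacitive sensor electrodes is a weighted centroid of the per-electrode signal deltas; two orthogonal profiles then yield a two-dimensional position. This is an assumption about a typical implementation, not a method taken from the application.

```python
# Weighted-centroid position estimate from a per-electrode capacitance
# profile (illustrative; real controllers add baselining and thresholds).
def centroid(profile: list[float]) -> float:
    """Scalar position, in units of electrode pitch, along one sensing axis."""
    total = sum(profile)
    return sum(i * c for i, c in enumerate(profile)) / total


x_profile = [0.0, 1.0, 4.0, 1.0, 0.0]  # finger centered over electrode 2
y_profile = [0.0, 2.0, 2.0, 0.0]       # finger between electrodes 1 and 2
pos_2d = (centroid(x_profile), centroid(y_profile))
```

The centroid interpolates between electrodes, which is why such sensors report positions finer than the electrode spacing.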
[0027] The processor 119, sometimes referred to as a proximity sensor processor or touch sensor controller, is coupled to the sensor and the electronic system 100. In general, the processor 119 receives electrical signals from the sensor, processes the electrical signals, and communicates with the electronic system. The processor 119 can perform a variety of processes on the signals received from the sensor to implement the proximity sensor device 116. For example, the processor 119 can select or connect individual sensor electrodes, detect presence/proximity, calculate position or motion information, and report a position or motion when a threshold is reached, and/or interpret and wait for a valid tap/stroke/character/button/gesture sequence before reporting it to the electronic system 100, or indicating it to the user. The processor 119 can also determine when certain types or combinations of object motions occur proximate the sensor. For example, the processor 119 can distinguish between motion of a first object combination (e.g., one finger, a relatively small object, etc.) and motion of a second object combination (e.g., two adjacent fingers, a relatively large object, etc.) proximate the sensing region, and can generate the appropriate indication in response to that motion. Additionally, the processor can distinguish the temporal relationship between motions of object combinations. For example, it can determine when motion of the first object combination has followed motion of the second object combination, and provide a different result responsive to the motions and their temporal relationship.
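One simple heuristic a processor could use to distinguish the object combinations described above is counting contiguous contact runs in a thresholded sensor profile: one run suggests the first object combination, two runs the second. The application leaves the detection method open, so this sketch is an illustrative assumption, with a made-up threshold.

```python
# Count contiguous above-threshold runs in a one-axis sensor profile
# (illustrative heuristic for distinguishing one finger from two).
def count_contacts(profile: list[float], threshold: float = 0.5) -> int:
    runs, inside = 0, False
    for c in profile:
        if c > threshold and not inside:
            runs, inside = runs + 1, True   # entered a new contact run
        elif c <= threshold:
            inside = False                  # left the current run
    return runs


one_finger = [0.0, 0.9, 1.0, 0.8, 0.0, 0.0, 0.0]
two_fingers = [0.0, 0.9, 0.1, 0.0, 0.8, 1.0, 0.0]
```

In practice two tightly adjacent fingers can merge into one wide run, which is why paragraph [0036] below also lists object size and pressure as distinguishing parameters.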
[0028] In this specification, the term "processor" is defined to include one or more processing elements that are adapted to perform the recited operations. Thus, the processor 119 can comprise all or part of one or more integrated circuits, firmware code, and/or software code that receive electrical signals from the sensor and communicate with the electronic system 100. In some embodiments, the elements that comprise the processor 119 would be located with or near the sensor. In other embodiments, some elements of the processor 119 would be with the sensor and other elements of the processor 119 would reside on or near the electronic system 100. In this embodiment minimal processing could be performed near the sensor, with the majority of the processing performed on the electronic system 100.
[0029] Furthermore, the processor 119 can be physically separate from the part of the electronic system that it communicates with, or the processor 119 can be implemented integrally with that part of the electronic system. For example, the processor 119 can reside at least partially on a processor performing other functions for the electronic system aside from implementing the proximity sensor device 116.
`
[0030] Again, as the term is used in this application, the term "electronic system" broadly refers to any type of device that communicates with proximity sensor device 116. The electronic system 100 could thus comprise any type of device or devices in which a touch sensor device can be implemented or to which it can be coupled. The proximity sensor device could be implemented as part of the electronic system 100,
`
`
or coupled to the electronic system using any suitable technique. As non-limiting examples the electronic system 100 could thus comprise any type of computing device, media player, communication device, or another input device (such as another touch sensor device or keypad). In some cases the electronic system 100 is itself a peripheral to a larger system. For example, the electronic system 100 could be a data input or output device, such as a remote control or display device, that communicates with a computer or media system (e.g., remote control for television) using a suitable wired or wireless technique. It should also be noted that the various elements (processor, memory, etc.) of the electronic system 100 could be implemented as part of an overall system, as part of the touch sensor device, or as a combination thereof. Additionally, the electronic system 100 could be a host or a slave to the proximity sensor device 116.

[0031] In the illustrated embodiment the proximity sensor device 116 is implemented with buttons 120. The buttons 120 can be implemented to provide additional input functionality to the proximity sensor device 116. For example, the buttons 120 can be used to facilitate selection of items using the proximity sensor device 116. Of course, this is just one example of how additional input functionality can be added to the proximity sensor device 116, and in other implementations the proximity sensor device 116 could include alternate or additional input devices, such as physical or virtual switches, or additional proximity sensing regions. Conversely, the proximity sensor device 116 can be implemented with no additional input devices.

[0032] It should be noted that although the various embodiments described herein are referred to as "proximity sensor devices", "touch sensor devices", "proximity sensors", or "touch pads", these terms as used herein are intended to encompass not only conventional proximity sensor devices, but also a broad range of equivalent devices that are capable of detecting the position of one or more fingers, pointers, styli and/or other objects. Such devices may include, without limitation, touch screens, touch pads, touch tablets, biometric authentication devices, handwriting or character recognition devices, and the like. Similarly, the terms "position" or "object position" as used herein are intended to broadly encompass absolute and relative positional information, and also other types of spatial-domain information such as velocity, acceleration, and the like, including measurement of motion in one or more directions. Various forms of positional information may also include time history components, as in the case of gesture recognition and the like. Accordingly, proximity sensor devices can appropriately detect more than the mere presence or absence of an object and may encompass a broad range of equivalents.

[0033] In the embodiments of the present invention, the proximity sensor device 116 is adapted to provide the ability for a user to easily cause different results in an electronic system using a proximity sensor device 116 as part of a user interface. For example, it can be used to facilitate user interface navigation, such as cursor control, dragging and scrolling. As another example, it can be used to facilitate value adjustments, such as changing a device parameter. To cause selective results the proximity sensor device 116 is adapted to distinguish between different object combination motions, determine relative temporal relationships between those motions, and generate user interface results responsive to the motions. This allows a user to selectively generate different results using the motion of two different object combinations.

[0034] In one specific embodiment, the proximity sensor device 116 is implemented to facilitate continued cursor movement with selection, a type of "dragging," using motion of different object combinations. For example, the proximity sensor device 116 can be implemented to indicate selection with cursor movement (e.g., dragging) responsive to detected motion of two adjacent objects across the sensing region, indicate selection without cursor movement responsive to detected motion of one object across the sensing region when the detected motion of one object across the sensing region followed the detected motion of two adjacent objects across the sensing region without an intervening termination event, and indicate further selection with cursor movement responsive to detected motion of two adjacent objects across the sensing region when the detected motion of two adjacent objects across the sensing region followed the detected motion of one object across the sensing region that followed the detected motion of the adjacent objects across the sensing region. This facilitates use of the proximity sensor device 116 by a user to indicate results such as extended dragging over long distances. Thus, the proximity sensor device 116 allows a user to continue to drag an object without requiring the user to perform more complex gestures on the proximity sensor device or activate extra control buttons.
[0035] It should also be understood that while the embodiments of the invention are described herein in the context of a fully functioning proximity sensor device, the mechanisms of the present invention are capable of being distributed as a program product in a variety of forms. For example, the mechanisms of the present invention can be implemented and distributed as a proximity sensor program on a computer-readable signal bearing media. Additionally, the embodiments of the present invention apply equally regardless of the particular type of signal bearing media used to carry out the distribution. Examples of signal bearing media include recordable media such as memory cards, optical and magnetic disks, and hard drives.
[0036] As described above, in the embodiments of the invention the proximity sensor device is adapted to distinguish between different object combination motions, determine relative temporal relationships between those motions, and generate user interface results responsive to the motions. The different object combinations can be distinguished based on a variety of different parameters, such as object type, object size, object proximity, pressure on the sensing region, and the number of objects proximate the sensing region, to list several non-limiting examples.
[0037] As one specific example, the proximity sensor device is adapted to distinguish the number of objects proximate the sensing region. Turning now to FIGS. 2-4, side views of exemplary object combinations are illustrated. Specifically, FIGS. 2-4 illustrate an embodiment where the proximity sensor device is adapted to distinguish between the number of fingers or other objects proximate the sensing region. In FIG. 2, the first object combination 202 comprises one finger 204 proximate the sensing region 200. In FIG. 3, a second object combination 212 comprises two fingers 214 and 216 proximate the sensing region 200. Finally, in FIG. 4, the third object combination 222 comprises three fingers 224, 226 and 228. In this embodiment, the object position detector detects the position of the fingers proximate the sensing region, and determines the number of fingers present. Thus, the object position detector can determine if one, two or more fingers are proximate the touch sensor and generate a result responsive to the number of fingers and the motions of the fingers. Of course, the system could also be adapted to distinguish between any other quantities of objects, such as between three and four fingers, etc. It should be noted that such a system allows the user to easily change the object combination presented to the proximity sensor by selectively placing and lifting