US007631811B1

(12) United States Patent                                (10) Patent No.:     US 7,631,811 B1
     Brown                                               (45) Date of Patent:      Dec. 15, 2009

(54) OPTICAL HEADSET USER INTERFACE

(75) Inventor: William Owen Brown, Santa Cruz, CA (US)

(73) Assignee: Plantronics, Inc., Santa Cruz, CA (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 103 days.

(21) Appl. No.: 11/906,803

(22) Filed: Oct. 4, 2007

(51) Int. Cl.
     G06K 7/14 (2006.01)
(52) U.S. Cl. ................ 235/454; 379/428.02; 455/73
(58) Field of Classification Search ........ 235/454; 379/428.02
     See application file for complete search history.

(56) References Cited

     U.S. PATENT DOCUMENTS
     6,980,673    B2*  12/2005  Funahashi ................ 382/124
     2001/0017934 A1*   8/2001  Paloniemi et al. ......... 382/107
     2007/0274530 A1*  11/2007  Buil et al. ............... 381/74
     2008/0130910 A1    6/2008  Jobling et al.
     2008/0284734 A1*  11/2008  Visser .................... 345/166

     OTHER PUBLICATIONS
     Gregory, Peter; Doria, Tom; Stegh, Chris; Su, Jim; SIP Communications For Dummies, Avaya Custom Edition, 2006, Wiley Publishing, Inc., Hoboken, NJ, USA.

* cited by examiner

Primary Examiner—Daniel A Hess
Assistant Examiner—Laura Gudorf
(74) Attorney, Agent, or Firm—Intellectual Property Law Office of Thomas Chuang

(57) ABSTRACT

A headset includes a finger pad on an exterior of the headset on which a finger of a headset wearer is placed. The headset includes an optical line scanner which scans the finger pad and outputs a series of successive images of the finger placed on the finger pad. A headset processor processes the output of the optical line scanner to detect relative motion of the finger on the finger pad or detect tapping of the finger on the finger pad.

20 Claims, 5 Drawing Sheets
[Drawing Sheets 1-5]

FIG. 1 (Sheet 1 of 5): headset 2.

FIG. 2 (Sheet 2 of 5): block diagram of headset 2 components: Processor 10, Power Source, Memory, Microphone, Speaker, and User Interface 18 (Line Scanner 20 with Light Source 22 and Optical Sensor 26; Input Keys 6; Finger Pad 4), coupled via bus 50.

FIG. 3 (Sheet 3 of 5): side view of the headset showing the internal line scanner arrangement.

FIGS. 4A-4D (Sheet 4 of 5): "Line Scanner for Scrolling"; fingerprint portions 32 and 34.

FIGS. 5A-5C (Sheet 5 of 5): "Line Scanner for Selecting".
OPTICAL HEADSET USER INTERFACE

BACKGROUND OF THE INVENTION

Recent developments in the telecommunications industries have produced telecommunications devices with increased capabilities. As a result, the complexity of interacting with these devices has increased. Headsets are now capable of doing more than being simple peripherals to legacy phones. For example, the headsets may control navigation through menus or files.

However, headset form factors do not lend themselves well to traditional user interface technologies like keypads and displays, which are suited for complex man-machine interface interactions. For example, the available space on the headset housing is limited. In the prior art, headset user interfaces typically consist of a small number of multifunction buttons and a multifunction visual indicator. This limited user interface makes access to more complex features and capabilities difficult and non-intuitive, particularly when the headset is being worn. Visual indicators have limited use while the headset is being worn. Multifunction buttons are non-intuitive and awkward to use.

As headsets become more "intelligent", they offer advanced features and functionality. With increased features and functionality, these headsets require more complex user interfaces. However, the limited physical size of headset housings makes it desirable to minimize the number of or the required size of the headset user interface mechanisms.

As a result, there is a need for improved methods and apparatuses for headset user interface input mechanisms.

BRIEF DESCRIPTION OF THE DRAWINGS

The present invention will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements.

FIG. 1 illustrates a headset capable of receiving user inputs utilizing an optical line scanner.

FIG. 2 illustrates a simplified block diagram of the components of the headset shown in FIG. 1.

FIG. 3 illustrates a side view of a headset showing the internal arrangement of a line scanner system.

FIGS. 4A-4D illustrate sample operation of a line scanner to detect user input scrolling as the user "wipes" his finger.

FIGS. 5A-5C illustrate sample operation of a line scanner to detect user input tapping.

DESCRIPTION OF SPECIFIC EMBODIMENTS

Methods and apparatuses for a headset user interface are disclosed. The following description is presented to enable any person skilled in the art to make and use the invention. Descriptions of specific embodiments and applications are provided only as examples, and various modifications will be readily apparent to those skilled in the art. The general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the invention. Thus, the present invention is to be accorded the widest scope encompassing numerous alternatives, modifications and equivalents consistent with the principles and features disclosed herein. For purposes of clarity, details relating to technical material that is known in the technical fields related to the invention have not been described in detail so as not to unnecessarily obscure the present invention.

This invention relates generally to the field of headset user interfaces and specifically to the field of headset user interface input mechanisms. In one example, this description describes a method and apparatus for a headset with an optical line scanner on a lightweight headset, where the optical line scanner detects finger movements, such as tapping, sliding forward and sliding backward, to be translated into various inputs, such as volume up and down, menu scrolling, and other headset user interface options known in the art.

As a user moves his or her finger across the line scanner, relative motion of the fingerprint ridges and valleys is scanned. In the same manner that an optical mouse interprets the changing images to detect movement of the mouse, motion of the user finger is determined. However, for a headset user interface, only one axis is needed for scrolling, allowing the possibility of using a line scanner. For tapping, an algorithm is used to determine the amount of light being received by the optoelectronic sensor. In a further example, to reduce false triggers, such as due to hair falling in front of the sensor, secondary mechanisms are used, such as overlaying a transparent touch sensor (for example, a capacitance sensor) on the line scanner pad.
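For illustration only, and not as part of the patent text, the sketch below shows one plausible way the light-quantity tap check and the transparent capacitive overlay described above could be combined so that hair or debris falling across the sensor does not produce a false trigger. The function names, thresholds, and the use of Python are assumptions made for this example.

```python
# Illustrative sketch only: combining the light-quantity tap check with a
# capacitive-overlay confirmation to reject false triggers. All names and
# threshold values are hypothetical.

def pad_state(line_scan, dark_level=20, lit_level=120):
    """Classify one line scan as 'covered', 'uncovered', or 'ambiguous'
    from the average light reaching the optoelectronic sensor."""
    mean_intensity = sum(line_scan) / len(line_scan)
    if mean_intensity <= dark_level:
        return "covered"      # little light returned: something rests on the pad
    if mean_intensity >= lit_level:
        return "uncovered"    # pad is clear
    return "ambiguous"

def is_valid_touch(optical_state, capacitive_touch):
    """Report a touch only when the optical sensor sees the pad covered AND the
    transparent capacitive overlay confirms skin contact."""
    return optical_state == "covered" and capacitive_touch
```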
In one example of the invention, a headset includes a microphone, a speaker, and a finger pad on an exterior of the headset on which a finger of a headset wearer is placed. The headset includes an optical line scanner which scans the finger pad and outputs a series of successive images of the finger placed on the finger pad. A headset processor processes the output of the optical line scanner to detect relative motion of the finger on the finger pad or detect tapping of the finger on the finger pad. The optical line scanner may include a light source, an optical guide for forming a line of light from the light source, an imaging sensor, and a lens for directing the line of light reflected from the finger pad onto the imaging sensor.

In one example of the invention, a headset includes a finger receiving means for placement of a user finger, and an optical line scanning means for scanning the finger receiving means on a headset housing exterior and providing an output of successive images of the finger receiving means. The processing means processes the output of successive images on the finger pad to determine a relative movement of a user finger across the finger pad or to determine a tapping of the user finger on the finger pad. The processing means modifies a headset control operation responsive to the relative movement of the user finger or the tapping.

In one example of the invention, a method for receiving user input at a headset includes providing a transparent finger pad on a headset housing for receiving a user finger, providing an optical line scanner disposed within the headset housing, and scanning the transparent finger pad with the optical line scanner to output a series of successive images. The method further includes processing the successive images to determine a relative movement of a user finger across the finger pad, and modifying a headset control operation responsive to the relative movement of the user finger.

In one example of the invention, a headset includes a finger receiving means for placement of a user finger, and an optical scanning means for scanning the finger receiving means on a headset housing exterior and providing a sequence of electrical signals associated with successive line scan images of the finger receiving means. The headset further includes a processing means for processing the sequence of electrical signals to determine a relative bi-directional movement of a user finger across the finger pad along a single axis or determine a tapping of the user finger on the finger pad. The processing means modifies a headset control operation responsive to the relative movement of the user finger or the tapping.
In one example of the invention, a method for receiving user input at a headset includes providing a transparent finger pad on a headset housing for receiving a user finger, providing an optical line scanner disposed within the headset housing, and scanning the transparent finger pad with the optical line scanner to output a sequence of electrical signals associated with successive line scan images. The sequence of electrical signals is processed to determine a relative movement of a user finger across the finger pad. The headset control operation is modified responsive to the relative movement of the user finger.

FIG. 1 illustrates a headset 2 capable of receiving user inputs utilizing an optical line scanner. Headset 2 includes a narrow finger pad 4 serving as a scanning surface on which a user finger is placed and scanned by the user "wiping" his finger across the scanning surface. During optical scanning, the user slides his or her finger across the scanning surface, whereby the line scanner images the finger line by line as it is slid across the scanning surface.

FIG. 2 illustrates a simplified block diagram of the components of the headset 2 shown in FIG. 1. The headset 2 includes a processor 10 operably coupled via a bus 50 to a memory 12, a microphone 14, power source 11, and user interface 18. User interface 18 includes a line scanner 20 and, optionally, one or more input buttons or keys 6. In one example, line scanner 20 includes a light source 22, lens 24, and optical sensor 26. Optical sensor 26 is, for example, a charge coupled device (CCD) such as a CMOS square pixel array. The CCD is an array of light sensitive diodes which generate an electrical signal in response to light which hits a particular pixel. Line scanner 20 may also include a processor for processing scan data. Alternatively, line scanner 20 may utilize processor 10 to process scan data. Line scanner 20 may also include memory separate from memory 12 for storing scan data or firmware/software executable to operate line scanner 20 and process scan data. The firmware/software may include a user input identifier application for analyzing scanned finger motion data to determine user input at the finger pad 4. Alternatively, line scanner 20 may utilize memory 12 for such purposes. The line scanner 20 is properly aligned and integrated with finger pad 4 within the headset housing. In a further example, line scanner 20 is replaced with an alternative optical scanner. Examples of optical scanners include, without limitation, image sensors, planar scanners, CMOS sensors, contact image sensors, or other optical systems such as used by optical mouse devices.
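As an illustration only, the hypothetical sketch below expresses the FIG. 2 component arrangement as a simple configuration structure. The reference numerals follow the figure and the surrounding text; the field names, types, and defaults are assumptions made for the example, not the patent's implementation.

```python
# Hypothetical sketch of the FIG. 2 block diagram as a configuration structure.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class LineScanner:                       # line scanner 20
    light_source: str = "LED 22"
    lens: str = "lens 24"
    optical_sensor: str = "CMOS pixel array 26"
    dedicated_memory: Optional[str] = None     # may instead share headset memory 12
    dedicated_processor: Optional[str] = None  # may instead use headset processor 10

@dataclass
class UserInterface:                     # user interface 18
    line_scanner: LineScanner = field(default_factory=LineScanner)
    input_keys: List[str] = field(default_factory=lambda: ["key 6"])
    finger_pad: str = "finger pad 4"

@dataclass
class Headset:                           # headset 2; components coupled via bus 50
    processor: str = "SoC 10"
    memory: str = "memory 12"
    microphone: str = "microphone 14"
    power_source: str = "power source 11"
    user_interface: UserInterface = field(default_factory=UserInterface)
```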
`set, the forward or backward motion is translated to a pre-
`Memory 12 mayinclude a variety of memories, and in one
`defined user input, such as scrolling through a menu or
`example includes SDRM, ROM,flash memory, or a combi-
`volumeincrease or decrease. User tapping or double tapping
`nation thereof. Memory 12 may further include separate
`is translated, for example, to a user selected command.
`memory structures or a single integrated memory structure. In
`The directional motion of a finger on finger pad 4 along a
`one example, memory 12 may be usedto store passwords,
`single axis (e.g., X or Y) or the presenceofa finger on finger
`network and telecommunications programs, and/or an oper-
`pad 4 is detected optically by optical sensor 26 by directly
`ating system (OS).
`imaging, as an array of pixels, the various particular ridges
`Processor 10, using executable code and applications
`and valleys of the user fingerprint placed on the finger pad 4.
`stored in memory, performs the necessary functions associ-
`The particular features of the fingerprint are illuminated by
`ated with headset operation described herein. Processor 10
`the light source 22. The use of optical sensors to detect direc-
`allows for processing data,
`in particular managing data
`tion and degree ofmovement along an X-Y coordinate system
`between user interface 18 and operation of headset 2 func-
`is described in U.S. Pat. No. 6,233,368 issued May 15, 2001,
`tions. In one example, processor 10 is a high performance,
`entitled “CMOSDigital Optical Navigation Chip”, which is
`highly integrated, and highly flexible system-on-chip (SOC),
`hereby incorporated by referencefor all purposes.
`including signal processing functionality such as echo can-
`The motion ofa finger on finger pad 4 is detected by optical
`cellation/reduction and gain control in another example. Pro-
`sensor 26 by comparing a newly captured image with a pre-
`cessor 10 may include a variety of processors (e.g., digital
`viously captured imageto ascertain the direction and amount
`signal processors), with conventional CPUsbeing applicable.
`ofmovement. The newly captured imageand previously cap-
`
`40
`
`25
`
`30
`
`35
`
`45
`
`50
`
`55
`
`60
`
`65
`
`8
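For illustration only, the short sketch below shows one way the preceding step could be organized in firmware: the previously captured line scan is retained in memory, and each newly captured scan is compared against it to accumulate the finger's displacement. The readout and comparison callables are hypothetical stand-ins, not functions defined by the patent.

```python
# Illustrative sketch: keep the previously captured scan in memory and compare
# each newly captured scan against it. read_line_scan() and estimate_shift()
# are hypothetical placeholders for the sensor readout and comparison steps.

def monitor_finger_pad(read_line_scan, estimate_shift):
    previous = read_line_scan()
    position = 0
    while True:
        current = read_line_scan()                      # newly captured image
        position += estimate_shift(previous, current)   # compare with previous image
        previous = current                              # retain previous image in memory
        yield position
```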
For example, referring to FIGS. 4A-4D, sample operation of line scanner 20 to detect user input scrolling as the user "wipes" his finger in a direction 30 is illustrated. FIG. 4A illustrates a fingerprint 28 corresponding to a user finger placed on finger pad 4. Referring to FIG. 4C, a fingerprint portion 32 scanned by line scanner 20 is captured. At an immediate point in time thereafter, as the user finger is wiped in direction 30, the fingerprint 28 is at a second position on finger pad 4, as shown in FIG. 4B. As a result, a different fingerprint portion 34 scanned by line scanner 20 is captured, as shown in FIG. 4D. The image of fingerprint portion 32 is compared to the image of fingerprint portion 34 to ascertain the direction of movement of the user finger across finger pad 4. For example, the ridges and valleys of the fingerprint line scans may be pattern matched and aligned to determine the direction of movement.
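The sketch below is one plausible way to "pattern match and align" two successive line scans, such as fingerprint portions 32 and 34: slide one scan against the other, keep the offset with the smallest mismatch, and take the sign of that offset as the scroll direction. It is an assumption-laden illustration, not the patent's implementation; the function names, the sum-of-absolute-differences criterion, and the search range are choices made for this example.

```python
# Illustrative sketch: align two successive fingerprint line scans and infer
# scroll direction from the shift that best matches the ridge/valley pattern.

def best_shift(prev_scan, new_scan, max_shift=8):
    """Return the pixel shift in [-max_shift, max_shift] that minimizes the
    mean absolute difference over the overlapping parts of the two scans."""
    best, best_err = 0, float("inf")
    n = len(prev_scan)
    for shift in range(-max_shift, max_shift + 1):
        overlap = [
            abs(prev_scan[i] - new_scan[i + shift])
            for i in range(max(0, -shift), min(n, n - shift))
        ]
        err = sum(overlap) / len(overlap)
        if err < best_err:
            best, best_err = shift, err
    return best

def scroll_direction(prev_scan, new_scan):
    """Map the estimated shift along the single scan axis to a scroll gesture."""
    shift = best_shift(prev_scan, new_scan)
    if shift > 0:
        return "forward"
    if shift < 0:
        return "backward"
    return "none"
```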
For example, referring to FIGS. 5A-5C, sample operation of line scanner 20 to detect user input tapping is illustrated. In this example, line scanner 20 detects user tapping by determining whether the user has placed his finger across finger pad 4 and removed it immediately thereafter. For example, at a first time shown in FIG. 5A, the finger pad 4 does not have a finger placed upon it. At a second time shown in FIG. 5B, the user has placed his finger upon finger pad 4, which is scanned by line scanner 20. At a third time illustrated in FIG. 5C, the finger pad 4 is once again clear as the user has removed his finger. In one example, the quantity of light detected by the sensor is used to determine a tap. In a further example, the time period for which the user finger is placed on finger pad 4 to indicate a user tap is empirically determined. Double tapping is detected, for example, by detecting user tapping twice within a predefined time period.
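For illustration only, the sketch below shows one way the clear-covered-clear sequence of FIGS. 5A-5C could be turned into tap and double-tap events from a stream of presence samples. The dwell and double-tap windows are placeholder values; as the text notes, such periods would be determined empirically.

```python
# Hypothetical sketch: tap / double-tap classification from (timestamp,
# finger_present) samples. Thresholds are illustrative placeholders.

MAX_TAP_DWELL_S = 0.35       # finger must lift within this time to count as a tap
DOUBLE_TAP_WINDOW_S = 0.5    # second tap must follow the first within this time

class TapDetector:
    def __init__(self):
        self.touch_start = None
        self.last_tap_time = None

    def update(self, t, finger_present):
        """Feed one sample; returns 'tap', 'double_tap', or None.
        A double tap is reported on the second qualifying tap."""
        if finger_present and self.touch_start is None:
            self.touch_start = t                       # FIG. 5B: finger placed on pad
            return None
        if not finger_present and self.touch_start is not None:
            dwell = t - self.touch_start               # FIG. 5C: finger removed
            self.touch_start = None
            if dwell <= MAX_TAP_DWELL_S:
                if (self.last_tap_time is not None
                        and t - self.last_tap_time <= DOUBLE_TAP_WINDOW_S):
                    self.last_tap_time = None
                    return "double_tap"
                self.last_tap_time = t
                return "tap"
        return None
```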
The information developed by optical sensor 26 regarding the motion of the user finger on the finger pad 4 is relayed to the headset processor 10, which translates the information to correspond to user input actions at the headset. The headset processor 10 then implements the desired input action. For example, such desired input actions may include volume control, power control, call answer, call terminate, item select, next item, and previous item, or other actions typically performed at a headset device.
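As an illustration of this translation step, the sketch below maps a detected gesture and the headset's current operational state to an action. The state names, gesture labels, and table entries are assumptions made for the example, not a mapping defined by the patent.

```python
# Illustrative sketch: translating line-scanner gestures into headset actions
# depending on the current operational state. Entries are hypothetical.

ACTION_MAP = {
    ("idle",    "forward"):    "volume_up",
    ("idle",    "backward"):   "volume_down",
    ("ringing", "tap"):        "call_answer",
    ("in_call", "double_tap"): "call_terminate",
    ("menu",    "forward"):    "next_item",
    ("menu",    "backward"):   "previous_item",
    ("menu",    "tap"):        "item_select",
}

def translate_gesture(state, gesture):
    """Return the headset action for this state/gesture pair, or None if the
    combination is not assigned to any action."""
    return ACTION_MAP.get((state, gesture))
```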
In a further example, line scanner 20 is used to authenticate the identity of the headset user by scanning the fingerprint of the user and comparing it to a previously stored authorized fingerprint. During optical scanning, the user slides his or her finger across the scanning surface, whereby the line scanner images the finger line by line as it is slid across the scanning surface. In this manner, the fingerprint of the user is generated. In this example, the headset memory includes previously stored fingerprint data corresponding to validated users, a feature identifier application for analyzing scanned fingerprint data, and a fingerprint match application for comparing the analyzed fingerprint scan data to previously stored fingerprint data. In one example, headset user authentication is required prior to allowing the user to operate the headset. In this example, the line scanner 20 serves the dual function of being a user interface input device and an authentication device.
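For illustration only, the sketch below assembles successive line scans into a fingerprint image and compares it against previously stored data. A real fingerprint match application would extract and compare fingerprint features; the crude normalized-difference score used here is only a stand-in, and all names and the threshold are assumptions for the example.

```python
# Hypothetical sketch: build a fingerprint image from successive line scans and
# compare it to stored templates. The similarity measure is a simplistic
# placeholder for a real fingerprint matcher.

def assemble_fingerprint(line_scans):
    """Stack the line scans captured during the finger swipe into a 2-D image
    (a list of rows of pixel intensities)."""
    return [list(scan) for scan in line_scans]

def match_score(candidate, stored):
    """Crude similarity in [0, 1]; 1.0 means identical images of equal size."""
    if len(candidate) != len(stored) or not candidate:
        return 0.0
    total, diff = 0, 0
    for row_c, row_s in zip(candidate, stored):
        for a, b in zip(row_c, row_s):
            total += 255
            diff += abs(a - b)
    return 1.0 - diff / total if total else 0.0

def authenticate(candidate, stored_templates, threshold=0.9):
    """Allow headset operation only if the scanned fingerprint is close enough
    to one of the previously stored, validated fingerprints."""
    return any(match_score(candidate, t) >= threshold for t in stored_templates)
```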
The various examples described above are provided by way of illustration only and should not be construed to limit the invention. Based on the above discussion and illustrations, those skilled in the art will readily recognize that various modifications and changes may be made to the present invention without strictly following the exemplary embodiments and applications illustrated and described herein. For example, the methods and systems described herein may be applied to other body worn devices in addition to headsets. Furthermore, the functionality associated with any blocks described above may be centralized or distributed. It is also understood that one or more blocks of the headset may be performed by hardware, firmware or software, or some combination thereof. Such modifications and changes do not depart from the true spirit and scope of the present invention that is set forth in the following claims.

While the exemplary embodiments of the present invention are described and illustrated herein, it will be appreciated that they are merely illustrative and that modifications can be made to these embodiments without departing from the spirit and scope of the invention. Thus, the scope of the invention is intended to be defined only in terms of the following claims as may be amended, with each claim being expressly incorporated into this Description of Specific Embodiments as an embodiment of the invention.
What is claimed is:

1. A headset comprising:
a microphone;
a speaker;
a finger pad on an exterior of the headset on which a finger of a headset wearer is placed;
an optical line scanner, wherein the optical line scanner scans the finger pad and outputs a series of successive images of the finger placed on the finger pad; and
a processor, wherein the processor processes the series of successive images to detect relative motion of the finger on the finger pad or detect tapping of the finger on the finger pad.

2. The headset of claim 1, wherein the optical line scanner comprises:
a light source;
an optical guide for forming a line of light from the light source;
an imaging sensor; and
a lens for directing the line of light reflected from the finger pad onto the imaging sensor.

3. The headset of claim 2, wherein the light source comprises a light emitting diode.

4. The headset of claim 2, wherein the imaging sensor comprises an integrated circuit sensor.

5. The headset of claim 1, wherein the optical line scanner detects relative motion of the finger on the finger pad along a single axis.

6. The headset of claim 1, wherein the finger pad comprises a glass planar surface.

7. A headset comprising:
a finger receiving means for placement of a user finger;
an optical line scanning means for scanning the finger receiving means on a headset housing exterior and providing an output of successive images of the finger receiving means;
a capacitive sensing means overlaid on the finger receiving means for sensing placement of a user finger to detect a false trigger of the optical line scanning means; and
a processing means for processing the output of successive images of the finger receiving means to determine a relative movement of a user finger across the finger receiving means or determine a tapping of the user finger on the finger receiving means, wherein the processing means modifies a headset control operation responsive to the relative movement of the user finger or the tapping;
a first transducer means for receiving a user speech signal; and
a second transducer means for outputting an audio signal.
8. The headset of claim 7, wherein the headset control operation comprises one or more selected from the following group: volume control, power control, call answer, call terminate, item select, next item, and previous item.

9. A method for receiving user input at a headset comprising:
providing a transparent finger pad on a headset housing for receiving a user finger;
providing an optical line scanner disposed within the headset housing;
scanning the transparent finger pad with the optical line scanner to output a series of successive images;
processing the series of successive images to determine a relative movement of a user finger across the transparent finger pad; and
modifying a headset control operation responsive to the relative movement of the user finger.

10. The method of claim 9, wherein processing the successive images to determine a relative movement of a user finger across the finger pad comprises determining whether the user finger is moving in a first direction along an axis or in a second direction opposite the first direction along the axis.

11. The method of claim 9, further comprising processing the successive images to determine a tap of the user finger on the finger pad.

12. The method of claim 11, wherein processing the successive images to determine a tap of the user finger on the finger pad comprises measuring a quantity of light received at the optical line scanner.

13. The method of claim 9, wherein processing the successive images comprises identifying and comparing fingerprint ridges and valleys.

14. A headset comprising:
a finger receiving means for placement of a user finger;
an optical scanning means for scanning the finger receiving means on a headset housing exterior and providing a sequence of electrical signals associated with successive line scan images of the finger receiving means;
a capacitive sensing means overlaid on the finger receiving means for sensing placement of a user finger to detect a false trigger of the optical line scanning means; and
a processing means for processing the sequence of electrical signals to determine a relative bi-directional movement of a user finger across the finger receiving means along a single axis or determine a tapping of the user finger on the finger receiving means, wherein the processing means modifies a headset control operation responsive to the relative bi-directional movement of the user finger or the tapping;
a first transducer means for receiving user speech; and
a second transducer means for outputting an audio signal.

15. The headset of claim 14, wherein the headset control operation comprises one or more selected from the following group: volume control, power control, call answer, call terminate, item select, next item, and previous item.

16. A method for receiving user input at a headset comprising:
providing a transparent finger pad on a headset housing for receiving a user finger;
providing an optical line scanner disposed within the headset housing;
scanning the transparent finger pad with the optical line scanner to output a sequence of electrical signals associated with successive line scan images;
processing the sequence of electrical signals to determine a relative movement of a user finger across the transparent finger pad; and
modifying a headset control operation responsive to the relative movement of the user finger.

17. The method of claim 16, wherein processing the sequence of electrical signals comprises pattern matching fingerprint ridges and valleys.

18. The method of claim 16, wherein processing the sequence of electrical signals to determine a relative movement of a user finger across the finger pad comprises determining whether the user finger is moving in a first direction along an axis or in a second direction opposite the first direction along the axis.

19. The method of claim 16, further comprising processing the sequence of electrical signals to determine a tap of the user finger on the finger pad.

20. The method of claim 19, wherein processing the sequence of electrical signals to determine a tap of the user finger on the finger pad comprises measuring a quantity of light received at the optical line scanner.

* * * * *
