Glynn

US005181181A

[11] Patent Number: 5,181,181
[45] Date of Patent: Jan. 19, 1993

[54] COMPUTER APPARATUS INPUT DEVICE FOR THREE-DIMENSIONAL INFORMATION
[75] Inventor: Brian J. Glynn, Merritt Island, Fla.
[73] Assignee: Triton Technologies, Inc., Burke, Va.
[21] Appl. No.: 588,824
[22] Filed: Sep. 27, 1990
[51] Int. Cl.5 ................ G01P 7/00; G01P 15/00; G09G 1/00
[52] U.S. Cl. ................ 364/566; 340/710; 364/709.11
[58] Field of Search ................ 364/560, 453, 709.11, 364/565, 566; 340/710, 706; 178/18-20; 73/505
[56] References Cited
U.S. PATENT DOCUMENTS
3,541,541  11/1970  Engelbart ............ 178/18 X
3,835,464  9/1974  Rider ............ 340/710
3,892,963  7/1975  Hawley et al. ............ 250/231.16
3,898,445  8/1975  MacLeod et al. ............ 356/141
3,987,685  10/1976  Opocensky ............ 340/710
4,241,409  12/1980  Nolf ............ 340/705
4,390,873  6/1983  Kirsch ............ 340/710
4,409,479  10/1983  Sprague et al. ............ 33/1 M
4,578,674  3/1986  Baker et al. ............ 340/710
4,608,641  8/1986  Snell ............ 364/453 X
4,617,634  10/1986  Izumida et al. ............ 364/453 X
4,682,159  7/1987  Davison ............ 340/706
4,688,184  8/1987  Taniguti et al. ............ 364/560
4,695,831  9/1987  Shinn ............ 340/707
4,754,268  6/1988  Mori ............ 340/710
4,766,423  8/1988  Ono et al. ............ 340/709
4,787,051  11/1988  Olson ............ 364/518
4,812,829  3/1989  Ebina et al. ............ 340/709
4,835,528  5/1989  Flinchbaugh ............ 340/709
4,839,838  6/1989  LaBiche et al. ............ 364/709.11
4,887,230  12/1989  Noguchi et al. ............ 364/560
4,916,650  4/1990  Oikawa ............ 364/453 X
4,922,444  5/1990  Baba ............ 364/566
5,001,647  3/1991  Rapiejko et al. ............ 364/453
Primary Examiner-Thomas G. Black
Assistant Examiner-Michael Zanelli
Attorney, Agent, or Firm-Burns, Doane, Swecker & Mathis

[57] ABSTRACT
A mouse which senses six degrees of motion arising from movement of the mouse within three dimensions. A hand-held device includes three accelerometers for sensing linear translation along three axes of a Cartesian coordinate system and three angular rate sensors for sensing angular rotation about the three axes. Signals produced by the sensors are processed to permit the acceleration, velocity and relative position and attitude of the device to be conveyed to a computer. Thus, a person may interact with a computer with six degrees of motion in three-dimensional space. Computer interface ports and unique address identification ensure proper communication with the computer regardless of the orientation of the mouse.

21 Claims, 7 Drawing Sheets
[Front-page drawing: block diagram showing rate sensors (roll, pitch and yaw rates), accelerometers (X, Y and Z acceleration), analog to digital signal processing, numeric integration, and transceivers 7-12 coupled to a computer.]

Align EX1021
Align v. 3Shape
IPR2022-00145
U.S. Patent    Jan. 19, 1993    Sheet 1 of 7    5,181,181

[Sheet 1: FIGS. 1(a), 1(b), 1(c) and 1(d), top, front, side and perspective views of the input device, with elements 4, 7 and 12 indicated.]
[Sheet 2: FIGS. 2(a), 2(b) and 2(c), cut-away views showing the motion sensing assembly, with elements 3, 14 and 19 and the X, Y and Z axes indicated.]
[Sheet 3: FIG. 3, system block diagram showing the motion sensing assembly (rate sensors 14-16 sensing roll, pitch and yaw rates; accelerometers 17-19 sensing X, Y and Z acceleration), analog to digital signal processing, numeric integration and vector processing, transceivers 7-12, wireless interface 21, computer transceiver 22, computer 23, and push-buttons 4, 5 and 6.]
[Sheet 4: FIG. 4, block diagram showing the motion sensing assembly 2, push-buttons 4, 5 and 6, sensor interface 31, analog to digital converter 32, FIFO buffer memory 33, processing element 34, dual-ported memory 35, computer interface control 36, unique address register 37, and transceivers 7-12.]
[Sheet 5: FIG. 5, top-level process flow diagram relating the human operator, sensor and control signals, commands, reset or hold conditions, initial conditions, instantaneous 3D information, and formatted 3D motion and position data (velocity, position, attitude) conveyed to the computer; FIG. 6, process flow diagram of the sensor interface, relating 3D motion signals, sensor control signals and instantaneous 3D information.]
[Sheet 6: FIG. 7, process flow diagram of the velocity and position calculations, relating initial conditions, attitude, reset or hold conditions, translational acceleration, rotational velocity, instantaneous 3D information and 3D motion and position data; FIG. 8, tangential velocity Vxs arising from rotation Rzs of the input device; FIG. 9, translation and rotation of the device coordinate system (Xs, Ys, Zs) from initial coordinates (Xi, Yi, Zi).]
[Sheet 7: FIG. 10, process flow diagram of push-button status detection, command decoding and validation, unique address, reset or hold conditions, and the external interface conveying validated commands and formatted 3D motion and position data (velocity, position, attitude) to the computer.]
COMPUTER APPARATUS INPUT DEVICE FOR
THREE-DIMENSIONAL INFORMATION

BACKGROUND OF THE INVENTION
The present invention relates to a computer peripheral input device used by an operator to control cursor position or the like. More particularly, the invention is directed to an apparatus and method for inputting position, attitude and motion data to a computer in, for example, terms of three-dimensional spatial coordinates over time.
Most computer systems with which humans must interact include a 'cursor' which indicates current position on a computer display. The cursor position may indicate where data may be input, such as in the case of textual information, or where one may manipulate an object which is represented graphically by the computer and depicted on a display. Manipulation of the cursor on a computer display may also be used by an operator to select or change modes of computer operation.
Early means of cursor control were centered around the use of position control keys on the computer keyboard. These control keys were later augmented by other devices such as the light pen, graphics tablet, joystick, and the track ball. Other developments utilized a device called a 'mouse' to allow an operator to directly manipulate the cursor position by moving a small, hand-held device across a flat surface.
The first embodiment of the mouse detected rotation of a ball which protrudes from the bottom of the device to control cursor position. As the operator moves the mouse on a two-dimensional surface, sensors in the mouse detect rotation of the ball along two mutually perpendicular axes. Examples of such devices are illustrated in U.S. Pat. Nos. 3,541,541; 3,835,464; 3,892,963; 3,987,685; and 4,390,873. U.S. Pat. No. 4,409,479 discloses a further development which utilizes optical sensing techniques to detect mouse motion without moving parts.
These mouse devices all detect and convey motion within two dimensions. However, with the increased use of computers in the definition, representation, and manipulation of information in terms of three-dimensional space, attempts have been made to devise techniques that allow for the definition of positional coordinates in three dimensions. Advances of computer graphical display and software technology in the area of three-dimensional representation of information have made desirable the capability of an operator to input and manipulate or control not merely three-dimensional position information, but also three-dimensional motion and attitude information. This is particularly true in the case of modeling, simulation, and animation of objects that are represented in either two or three-dimensional space.
U.S. Pat. No. 3,898,445, issued to MacLeod et al., relates to a means for determining position of a target in three dimensions by measuring the time it takes for a number of light beams to sweep between reference points and comparing that with the time required for the beams to sweep between the reference points and the target.
U.S. Pat. No. 4,766,423, issued to Ono et al., and U.S. Pat. No. 4,812,829, issued to Ebina et al., disclose display and control devices and methods for controlling a cursor in a three-dimensional space by moving the cursor as if to maneuver it by use of joystick and throttle
type devices to alter the direction and velocity of a cursor.
U.S. Pat. No. 4,835,528, issued to Flinchbaugh, illustrates a cursor control system which utilizes movement of a mouse device upon a two-dimensional surface that contains logically defined regions within which movement of the mouse is interpreted by the system to mean movement in three-dimensional space.
None of the three-dimensional input devices noted above has the intuitive simplicity that a mouse has for interacting in two dimensions. A three-dimensional input device disclosed in U.S. Pat. No. 4,787,051 to Olson utilizes inertial acceleration sensors to permit an operator to input three-dimensional spatial position. However, the disclosed device does not consider input of either motion or 'orientation', henceforth referred to herein as 'attitude'.
To determine changes in position within a plane, the Olson system senses translation from a first accelerometer. Rotation is obtained from the difference in translational acceleration sensed by the first accelerometer and a second accelerometer. This technique mandates precision mounting of the accelerometer pair, as well as processing to decouple translation from rotation and to determine both the rotational axis and rate. Thus, the device disclosed by Olson requires extensive processing by the computer which may render the device incompatible with lower-end computing devices. Additionally, this technique would require highly sensitive accelerometers to obtain low rotational rates.
In the Olson device, analog integrator circuits are used for the first stage of integration required to obtain displacement from acceleration. These integrator circuits have limits in dynamic range and require periodic resets to zero to eliminate inherent errors of offset and drift. The Olson device also constrains the use of the remote device to a limited range of orientation with respect to the computer in order to permit proper communication. Furthermore, no provision is made in the Olson system to prevent interference from signals originating from nearby input devices which control nearby computer work stations.
Accordingly, one of the objects of the present invention is to provide a new and improved device that overcomes the aforementioned shortcomings of previous computer input techniques.
Additionally, a primary object of the present invention is to provide a means for an operator to input to a computer information which allows the computer to directly ascertain position, motion and attitude of the input device in terms of three-dimensional spatial coordinates.
It is another object of the present invention to provide a new and improved apparatus and method for controlling movement of a cursor, represented on a computer display in terms of three-dimensional spatial coordinates.
It is yet another object of the present invention to provide an apparatus and method for providing as input to a computer position, motion, and attitude with a device which is intuitively simple to operate by the natural motion of a person, for the purpose of conveying a change in position or attitude, in a particular case, or a change in the state of variables, in the general case.
SUMMARY OF THE INVENTION
In accordance with one embodiment of the present invention, a computer input device apparatus detects and conveys to a computer device positional information within six degrees of motion: linear translation along, and angular rotation about, each of three mutually perpendicular axes of a Cartesian coordinate system. The preferred embodiment of the present invention includes three accelerometers, each mounted with its primary axis oriented to one of three orthogonally oriented, or mutually perpendicular, axes. In this way, each accelerometer independently detects translational acceleration along one of the primary axes of a three-dimensional Cartesian coordinate system. Each accelerometer produces a signal which is directly proportional to the acceleration imparted along its respective primary axis.
Additionally, the preferred embodiment includes three orthogonally oriented rate sensors, each mounted in such a way that each primary axis is parallel or collinearly oriented with each of the respective aforementioned accelerometer primary axes. Each rate sensor independently detects angular rotational rates about an individual axis of the Cartesian coordinate system and produces a signal which is directly proportional to the angular momentum, or angular rotational displacement rate, which is detected as the rate sensor rotates about its respective primary axis.
Additional aspects of the preferred embodiment of the present invention include techniques for producing velocity and position vectors of the input device. These vectors, each consisting of a scalar magnitude and a direction in terms of a three-dimensional Cartesian coordinate system, are continually updated to reflect the instantaneous input device velocity as well as its position and attitude, relative to an arbitrary stationary initial position and attitude.
The velocity vector is determined from the geometric sum of the acceleration integrated once over time, as indicated by signals produced by the three accelerometers. The velocity vector may be expressed in terms of a velocity at which the input device is moving and a direction of motion. The direction of motion can be defined by two angles indicating the direction the device is moving relative to the input device coordinate system.
The relative position vector is determined from the geometric sum of the accelerations integrated twice over time, as indicated by signals produced by the three accelerometers. The relative position vector may be expressed in terms of a linear displacement from the arbitrary initial origin 'zero coordinates' of the input device and a direction, defined by two angles, indicating the position of the device relative to the initial position.
The relative attitude may be determined from the rate of change in rotational displacements integrated once over time as indicated by signals produced by the three rate sensors. The relative attitude is expressed in terms of angular displacements from the arbitrary initial attitude 'zero angles', normalized to a maximum rotation of 360° (2π radians) about each of the three axes of the input device, indicating the attitude of the device relative to the initial attitude.
Additional aspects of the preferred embodiment of the present invention include push-button switches on the input device which may be actuated by the operator to convey to a computer a desire to change an operating state. Examples of operating state changes include providing a new input device initial position or attitude, or indicating a new computer cursor position, or changing a mode of operation of the computer and/or the input device.
Additional aspects of the preferred embodiment of the present invention include a wireless interface between the input device and a computer. The wireless interface permits attributes of the input device motion and push-button actuation to be communicated to a computer in a manner that allows for motion to be unencumbered by an attached cable. Preferably, the interface is bi-directional, allowing the computer to control certain aspects of the input device operational states. In an additional embodiment of the input device, the interface between the device and a computer may be a cable.
BRIEF DESCRIPTION OF THE DRAWINGS
The aforementioned and other objects, features, and advantages of the present invention will be apparent to the skilled artisan from the following detailed description when read in light of the accompanying drawings, wherein:
FIG. 1(a) is a top view of an input device in accordance with the present invention;
FIG. 1(b) is a front view of the input device of FIG. 1(a);
FIG. 1(c) is a side view of the input device of FIG. 1(a);
FIG. 1(d) is a perspective view of the input device of FIG. 1(a);
FIGS. 2(a)-2(c) are cut-away top, front and side views, respectively, of the device of FIG. 1, illustrating the motion sensing assembly and ancillary components area of the input device;
FIG. 3 is a block diagram of the major functional elements of the preferred embodiment of the present invention, including a wireless interface to a computer;
FIG. 4 is a block diagram showing additional detail on major processing elements of the preferred embodiment of the present invention;
FIG. 5 is a process flow diagram illustrating the functions used by the preferred embodiment in processing input device motion, push-button status and external computer interface;
FIG. 6 is a process flow diagram illustrating motion sensing assembly interface functions, and further illustrating details of the sensor interface and data conversion process of FIG. 5;
FIG. 7 is a process flow diagram showing velocity and position calculation functions, and illustrating further details of the three-dimensional computation process of FIG. 5;
FIG. 8 depicts a tangential velocity arising from rotation of the input device;
FIG. 9 depicts translation and rotation from initial coordinates of a coordinate system centered on the input device; and
FIG. 10 is a process flow diagram illustrating the functions used in processing changes in status of push-buttons, and in providing external interface to a computer, and illustrating further details of the external interface process of FIG. 5.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
Referring to FIGS. 1(a) through 1(d), a hand-held computer input device or 'mouse' 1 is illustrated which includes a plurality of interface ports 7-12. As shown, each external face of the mouse 1 is provided with an interface port. Each of these interface ports may provide communication with the computer through known wireless communication techniques, such as infrared or radio techniques. Thus, regardless of the orientation of the mouse 1 with respect to the computer work station, at least one interface port will be directed generally at the computer.
During operation, the interface ports are operated simultaneously to transmit or receive signals. Since at least one of the interface ports generally faces the computer at all times, communication with the computer is permitted regardless of the orientation of the mouse. As will be described below in greater detail, the system may include apparatus for avoiding interference from nearby devices when operated in a relatively crowded environment. Alternatively, communication with the computer may be established through an interface cable 13.
The mouse 1 may also include a plurality of push-buttons 4, 5 and 6 for additional interaction of the operator with the host computer system. The push-buttons may be used, for example, to provide special command signals to the computer. These command signals are transmitted through the interface ports 7-12 or the cable interface 13, to the computer.
Preferably, the push-buttons are positioned on an angled surface of the mouse 1 to allow easy actuation by an operator from at least two different orientations of the mouse device. Alternatively, multiple sets of push-buttons may be provided to permit easy actuation. Individual push-button sets may be arranged, for example, on opposed faces of the mouse 1 and connected in parallel to permit actuation of command signals from any one of the multiple push-button sets. Additionally, push-buttons which detect and convey a range of pressure applied to them, as opposed to merely conveying a change between two states, may be provided in an alternative embodiment.
The push-buttons can be operated to perform a number of functions. For example, actuation of a push-button may command the system to reset the zero reference point from which the relative position and attitude of the mouse 1 is measured. Additionally, the position and attitude attributes of the mouse 1 may be held constant, despite movement, while a push-button is depressed. The push-buttons may also function to indicate a desire by the operator to indicate a point, line, surface, volume or motion in space, or to select a menu item, cursor position or particular attitude.
Numerous other examples of the function of push-buttons 4, 5, and 6 could be given, and although no additional examples are noted here, additional functions of the push-buttons will be apparent to the skilled artisan. Furthermore, additional embodiments of the mouse 1 may include more or fewer than three push-buttons. A motion sensing assembly 2 and an ancillary components area 3 are indicated by phantom dashed lines in FIGS. 1(a)-1(c). In an alternative embodiment of the present invention, components included in the ancillary components area 3, as described herein, may be contained within a separate chassis remote from the mouse 1. In such an embodiment, the mouse 1 would contain only the motion sensing assembly 2.
Turning now to FIGS. 2(a)-2(c), the motion sensing assembly 2 includes conventional rotational rate sensors 14, 15 and 16, and accelerometers 17, 18 and 19 aligned on three orthogonal axes. Rotational rate sensor 14 and accelerometer 17 are aligned on a first axis; rotational rate sensor 15 and accelerometer 18 are aligned on a second axis perpendicular to the first axis; and rotational rate sensor 16 and accelerometer 19 are aligned on a third axis perpendicular to both the first axis and the second axis.
By mounting the rotational rate sensors 14, 15 and 16 and the accelerometers 17, 18 and 19 in the foregoing manner, the motion sensing assembly 2 simultaneously senses all six degrees of motion: translation along, and rotation about, three mutually perpendicular axes of a Cartesian coordinate system. Translational accelerations are sensed by the accelerometers 17, 18 and 19, which detect acceleration along the X, Y and Z axes, respectively, and produce an analog signal representative of the acceleration. Similarly, rotational angular rates are detected by the rotational rate sensors 14, 15 and 16, which sense rotational rates about the X, Y and Z axes, respectively, and produce an analog signal representative of the rotational rate.
The motion sensing assembly 2 is referred to numerous times below. For simplification, the general term 'sensors' shall be used herein to describe the motion sensing assembly 2 or its individual components described above.
FIG. 3 is a system block diagram which depicts major functional elements included in the motion sensing assembly 2 and ancillary components area 3 of the mouse 1 of FIG. 1, and the wireless interface 21 between the mouse transceivers 7-12 and a transceiver 22 associated with a computer 23. Sensor signals from the individual components of the motion sensing assembly 2 are provided to circuits in the ancillary components area 3 which perform processing of the sensor signals. Information regarding the sensor signals is then provided to mouse transceivers 7-12 for communication to the computer transceiver 22. Functions performed within the ancillary components area 3, labeled analog to digital signal processing, numeric integration, and vector processing, will be further described below.
FIG. 4 is a block diagram of the preferred embodiment of the mouse 1 of FIG. 1. Sensor interface circuits 31 provide electrical power to the components of the motion sensing assembly 2 and bias and control circuits to ensure proper operation of the sensor components. Additionally, sensor interface circuits 31 preferably include amplification circuitry for amplifying sensor signals from the motion sensing assembly 2.
Amplified sensor signals are provided to analog-to-digital converter 32, which converts the amplified analog sensor signals into corresponding quantized digital signals. The analog-to-digital converter 32 preferably includes a single channel which multiplexes sequentially through the respective sensor signals. Alternatively, the analog-to-digital converter may include six channels, with individual channels dedicated to individual sensor signals.
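As an illustration of the single-channel arrangement just described, the sketch below (hypothetical Python, not from the patent; the channel order, full-scale value and 12-bit resolution are assumptions) multiplexes one converter sequentially through the six sensor signals:

```python
# One multiplexer sweep of the single-channel analog-to-digital conversion:
# each of the six amplified sensor signals is read and quantized in turn.
CHANNELS = ("roll", "pitch", "yaw", "ax", "ay", "az")

def digitize(analog_value, full_scale=5.0, bits=12):
    """Quantize one amplified analog sample to an integer code,
    clamped to the converter's range."""
    code = round((analog_value / full_scale) * (2 ** bits - 1))
    return max(0, min(2 ** bits - 1, code))

def convert_frame(read_channel):
    """Sweep the multiplexer once; read_channel is a callable
    returning the current analog value for a named channel."""
    return {name: digitize(read_channel(name)) for name in CHANNELS}

frame = convert_frame(lambda name: 2.5)  # mid-scale on every channel
```

The six-channel alternative would simply run six such converters in parallel, one per sensor signal.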
The digitized sensor signals from the analog-to-digital converter 32 are stored in a first-in-first-out (FIFO) buffer memory 33 for further processing by processing element 34. The FIFO buffer memory 33 is preferably organized as a single memory stack into which each of the respective digitized sensor signals is stored in sequential order. The FIFO buffer memory 33 may alternatively be organized with six separate memory areas corresponding to the six sensor signals. In other words, a separate FIFO stack is provided for digitized sensor signals from each of the six motion sensing components of the motion sensing assembly 2. By providing the FIFO buffer memory 33, the mouse 1 compensates for varying processing speeds in processing element 34 and eliminates the need for elaborate synchronization schemes controlling the operation between the 'front-end' sensor interface 31 and analog-to-digital converter 32, and the 'back-end' processing element 34 and computer interface control 36.
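The decoupling role of the FIFO buffer can be sketched as follows (hypothetical Python, not from the patent; the six-memory-area alternative is shown, with all names assumed):

```python
from collections import deque

# Six separate FIFO stacks, one per motion sensing component: the 'front end'
# appends digitized samples at its own rate, and the 'back end' drains
# whatever has accumulated whenever it is ready, with no handshaking.
SENSORS = ("roll", "pitch", "yaw", "ax", "ay", "az")
fifo = {name: deque() for name in SENSORS}

def front_end_store(name, sample):
    """Called at the analog-to-digital converter's sampling rate."""
    fifo[name].append(sample)

def back_end_read(name):
    """Called whenever the processing element is ready; returns all
    samples buffered since the last read, oldest first."""
    samples = []
    while fifo[name]:
        samples.append(fifo[name].popleft())
    return samples
```

Because the buffer absorbs timing differences, the front end never waits on the back end, which is the synchronization-free behavior the passage describes.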
The processing element 34 is preferably a conventional microprocessor having a suitably programmed read-only memory. The processing element 34 may also be a suitably programmed conventional digital signal processor. Briefly, the processing element 34 periodically reads and numerically integrates the digitized sensor signals from the FIFO buffer memory 33. Three-dimensional motion, position and attitude values are continually computed in a known manner based on the most recent results of the numerical integration of the sensor data, the current zero position and attitude information, and the current processing configuration established by push-button or computer commands. Information processed by the processing element 34 is stored in dual-ported memory 35 upon completion of processing by known techniques such as direct memory access. This information may then be sent via interface ports 7-12 to the computer by a computer interface control 36.
Numerical integration is performed continually on the digital acceleration and rate values provided by the sensors. Numeric integration of digital values allows for a significant dynamic range which is limited only by the amount of memory and processing time allocated to the task. In addition to numerical integration, the processing element 34 computes and applies correction values which compensate for the effects of gravity and the translational effects of rotation on the linear sensors, i.e., both gravitational accelerations and tangential velocities. Additionally, by establishing a threshold level for motion signals above which the signals are attributable to operator movements of the mouse 1, the processing element 34 effectively reduces or eliminates errors which might be induced by sensor drift, earth rotational effects and low level noise signals that may be present when an operator is not moving the mouse.
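A minimal sketch of the thresholding just described (hypothetical Python, not from the patent; the threshold value and units are assumptions):

```python
# Samples whose magnitude stays below the threshold are treated as sensor
# drift, earth-rate effects or low-level noise and contribute nothing to the
# integration; only samples above the threshold are attributed to operator
# movement of the mouse.
MOTION_THRESHOLD = 0.05  # assumed units of the digitized sensor signal

def filter_motion(sample):
    """Return the sample if it is attributable to operator movement,
    otherwise zero it out."""
    return sample if abs(sample) >= MOTION_THRESHOLD else 0.0

readings = [0.01, -0.02, 0.30, 0.04, -0.12]
filtered = [filter_motion(r) for r in readings]
```

Only the larger samples survive the filter, so a stationary mouse accumulates no spurious velocity or position.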
The processing element 34 responds to external commands by changing the processing configuration of the calculation processes. For example, in one processing mode the sensor value for any axis, or axes, will be ignored upon computation of the three-dimensional motion and position values. These external commands would normally be initiated by an operator actuating one of the three push-button switches 4, 5, or 6. The computer interface control 36 detects actuation of the push-buttons 4, 5 and 6 and sends corresponding command data to the computer. Upon receipt of the command data, the computer returns command data to the computer interface control 36 which, in turn, provides command data to the processing element 34 to change the processing configuration. Another external command initiated from the push