(12) Patent Application Publication (10) Pub. No.: US 2010/0231509 A1
Boillot et al. (43) Pub. Date: Sep. 16, 2010
`
(54) STERILE NETWORKED INTERFACE FOR MEDICAL SYSTEMS

(76) Inventors: Marc Boillot, Plantation, FL (US); Martin Roche, Fort Lauderdale, FL (US)

Correspondence Address:
Marc Boillot
9110 NW 11th Court
Plantation, FL 33322 (US)
`
(21) Appl. No.: 12/723,486

(22) Filed: Mar. 12, 2010
`
`Related U.S. Application Data
`
`(60) Provisional application No. 61/159,793, filed on Mar.
`12, 2009.
`
`Publication Classification
`
`(51)
`
`Int. Cl.
`(2006.01)
`AGIB 17/00
`(2006.01)
`GO6F 3/033
`(2006.01)
`G09G 5/00
`(52) US. CV. ec seeeeeecnees 345/156; 606/1; 715/863
`
`(57)
`
`ABSTRACT
`
One embodiment of a sterile networked interface system is provided comprising a hand-held surgical tool and a data processing system. The surgical tool includes a sensor for sensing a physical variable related to the surgery, a wireless communication unit to transmit the physical variable to the data processing system, and a battery for powering the hand-held surgical tool. The surgical tool sends the physical variable and orientation information responsive to a touchless gesture control and predetermined orientation of the surgical tool. Other embodiments are disclosed.
`
`139
`
`140
`
` Processor
`
`Wireless Unit
`
`
`
`151
`
`152
`
`Sensing Unit
`‘
`
`1414
`
`Wireless Unit
`
`153
`
`132
`
`134
`
`
`(Ge. Processor
`
`QOepo}U]SSO|YONOL
`
`UserInterface
`
`136
`
`Comm unit
`
`
`
`Memory
`
`138
`
`
`
`137
`Battery
`
`
`>
`
`«
`
`Sterile Field
`
`User Interface
`
`154
`
`142
`
`<4
`
`Non-Sterile Field
`
`
`
`Patent Application Publication
`
`Sep. 16,2010 Sheet 1 of 6
`
`US 2010/0231509 Al
`
`©-
`~
`
`120
`
`122
`
`FIG.1B
`
`
`
`Patent Application Publication
`
`Sep. 16,2010 Sheet 2 of 6
`
`US 2010/0231509 Al
`
`OSI
`
`Or|
`
`
`
`
`
`Pjal49[J9}S-uON
`
`Ol‘SIA
`
`
`
`PlelyS|WOIS
`
`
`
`ZEtAsayeg
`
`Touchless Interface
`
`JOSS8901q
`
`Josuas
`
`yupBursuas
`
`UUMSSO[OJ1M
`
`
`
`S0BLa]UJ8SF)
`
`
`
`
`
`
`
`
`Patent Application Publication
`
`Sep. 16,2010 Sheet 3 of 6
`
`US 2010/0231509 Al
`
`VecticalRangeCalibration
`
`
`
`HorizontalRangeCalibration
`
`
`FIG.2B
`
`FIG.2A
`
`
`
`Patent Application Publication
`
`Sep. 16,2010 Sheet 4 of 6
`
`US 2010/0231509 Al
`
`322
`
`SPUBLULUOD 9910,
`
`oD
`
`©L
`
`e
`
`:
`
`Pause
`
`co
`8
`
`Tt
`O
`
`©o
`
`Dm
`c
`=
`oOSen
`oO
`w
`
`3
`
`o
`
`303 m
`Scroll Center/
`
`.©L
`
`L
`
`
`
`Patent Application Publication
`
`Sep. 16,2010 Sheet 5 of 6
`
`US 2010/0231509 Al
`
`
`
`$]0.1]U04a1n]1Se5
`
`€Ov
`
`ay‘Sis
`
`cOPr
`
`VvSls
`
`3)©o33o5oO.a
`
`<o2.
`
`vOV
`
`a}eCiaen
`
`edebuesiq
`
`$NO-WWOOZ
`
`JE]US0@)
`
`84
`
`paEs©
`
`
`
`
`
`
`Patent Application Publication
`
`Sep. 16,2010 Sheet 6 of 6
`
`US 2010/0231509 Al
`
`SPURLUWOD Bd10A
`
`motion
`
`Circular
`
`
`
`imageRotation
`
`wf
`Oo
`ike)
`
`FIG.5
`
`
`
`
`STERILE NETWORKED INTERFACE FOR
`MEDICAL SYSTEMS
`
`CROSS-REFERENCE TO RELATED
`APPLICATION
`
[0001] This application claims the priority benefit of U.S. Provisional Patent Application No. 61/159,793, filed Mar. 12, 2009, the entire contents of which are hereby incorporated by reference. This application also incorporates by reference the following Utility Applications:
`
`FIELD
`
[0002] The present embodiments of the invention generally relate to the field of hospital systems, more particularly medical device data processing and control.
`
`BACKGROUND
`
[0003] As data becomes more readily available in hospitals and clinics, doctors and patients have more information to process. Computer systems and medical devices provide an interface which allows them to retrieve, interpret and display the information. In the operating room environment, computer systems are generally outside the surgical field and operated by a technician. Electronic surgical tools are providing the surgeon with new means for performing surgery. Although the medical devices may communicate with the computer system, there still lacks an intuitive user interface which allows the surgeon to retrieve information.
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
[0004] The features of the embodiments of the invention, which are believed to be novel, are set forth with particularity in the appended claims. Embodiments of the invention, together with further objects and advantages thereof, may best be understood by reference to the following description, taken in conjunction with the accompanying drawings, in the several figures of which like reference numerals identify like elements, and in which:
`[0005]
`FIG. 1A is an exemplary illustration of an operating
`room system configured for touchless user interface control
`according to one embodiment;
`[0006]
`FIG. 1B is an exemplary illustration of a sensory
`device for processing touchless movements and gesture con-
`trol according to one embodiment;
`[0007]
`FIG. 1C is an exemplary illustration of a sterile
`networkedinterface system according to one embodiment;
`[0008]
`FIGS. 2A and 2B are exemplary illustrations for
`user interface calibration according to one embodiment;
`[0009]
`FIG. 3A is an exemplary depiction for touchless
`user interface scrolling according to one embodiment;
`[0010]
`FIGS. 4A and 4B are exemplary depictions for
`touchless user interface gesture controls according to one
`embodiment; and
`[0011]
`FIG. 5is an exemplary depiction for touchless user
`interface image analysis and feedback according to one
`embodiment.
`
`DETAILED DESCRIPTION
`
[0012] While the specification concludes with claims defining the features of the invention that are regarded as novel, it is believed that the invention will be better understood from a consideration of the following description in conjunction with the drawing figures, in which like reference numerals are carried forward.
`
[0013] As required, detailed embodiments of the present invention are disclosed herein; however, it is to be understood that the disclosed embodiments are merely exemplary of the invention, which can be embodied in various forms. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present invention in virtually any appropriately detailed structure. Further, the terms and phrases used herein are not intended to be limiting but rather to provide an understandable description.
[0014] The terms "a" or "an", as used herein, are defined as one or more than one. The term "plurality", as used herein, is defined as two or more than two. The term "another", as used herein, is defined as at least a second or more. The terms "including" and/or "having", as used herein, are defined as comprising (i.e., open language). The term "coupled", as used herein, is defined as connected, although not necessarily directly, and not necessarily mechanically. The terms "program", "software application", and the like, as used herein, are defined as a sequence of instructions designed for execution on a computer system. A program, computer program, or software application may include a subroutine, a function, a procedure, an object method, an object implementation, an executable application, a source code, an object code, a shared library/dynamic load library and/or other sequence of instructions designed for execution on a computer system or embedded electronic device.
`
[0015] FIG. 1A shows an exemplary sterile networked interface in an operating room environment comprising a touchless sensing device 100 and a data processing system 104 display. The sensing device 100 provides the surgeon a sterile user interface for interacting with and viewing operational data, processing medical information, and viewing surgical tool functionality. It may be positioned within reach, 1-2 feet away. It can also be placed farther out of the sterile field. As will be described ahead, the touchless sensing device 100 in various embodiments permits the surgeon the ability to scroll, select, rotate and save data via touchless gestures, and in certain embodiments, update operational tool parameters in accordance with a predetermined work flow. Although shown as a separate unit, the touchless sensing device 100 can be configured peripheral to the display or positioned above it. One example of an arrangement is disclosed in U.S. patent application Ser. No. 11/683,416, the entire contents of which are hereby incorporated by reference. The touchless sensing device 100 can comprise infrared sensors, ultrasonic sensors, camera elements, or a combination thereof as therein specified.
`
[0016] FIG. 1B is an exemplary illustration of the components of the touchless sensing device 100 according to one embodiment. The sensing device 100 can include a transmitter 112, three or more receivers 122 (at the corners), a Digital Signal Processor (DSP) 110 to process sensory information from the receivers 122, a communications module (e.g., Bluetooth, ZigBee, or other IEEE protocol), a memory 120, and one or more analog-to-digital converters and digital-to-analog converters 118. Operation of the touchless sensing unit in various configurations for achieving touchless sensing is disclosed in the following U.S. patent applications, all of which are hereby incorporated by reference in their entirety: 11/559,295; 11/559,325; 11/562,404; 12/146,445.
[0017] The touchless sensing device 100 in the configuration shown includes an optical camera element 117 to image an anatomical feature in the sterile field, an ultrasonic transducer 112 to measure a distance to the anatomical feature that is associated with the physical variable, and an accelerometer 121 for identifying an axial tilt and movement. This information can be used by the data processing system in accordance with touchless user input to report proper use and orientation of the surgical tool. As one example, the processor configures one or more of its operational parameters responsive to analyzing the anatomical feature of interest and determining a proper working angle from the orientation.
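For illustration only, the working-angle check can be sketched as follows in Python; the function names, the 45 degree target, and the 5 degree tolerance are assumptions of this description and are not recited in the disclosure:

    import math

    def axial_tilt_deg(ax: float, ay: float, az: float) -> float:
        # Tilt of the tool's long (z) axis from vertical, in degrees,
        # from a static 3-axis accelerometer sample in units of g.
        g = math.sqrt(ax * ax + ay * ay + az * az)   # about 1.0 g at rest
        cos_tilt = max(-1.0, min(1.0, az / g))       # guard the acos domain
        return math.degrees(math.acos(cos_tilt))

    def at_working_angle(tilt_deg: float, target_deg: float = 45.0,
                         tolerance_deg: float = 5.0) -> bool:
        # True when the tool is held at the predetermined orientation.
        return abs(tilt_deg - target_deg) <= tolerance_deg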
`[0018]
`FIG. 1C is a diagrammatic illustration of an exem-
`plary sterile networked interface system. The sterile net-
`worked interface system comprises a hand-held surgical tool
`139 used within a sterile field during surgery, a touchless
`interface 140, and a data processing system 104 outside the
`sterile field and wirelessly coupled to the surgical tool. The
`surgical tool comprises a sensor 130 for sensing a physical
`variable relatedto the surgery, a wireless communication unit
`134 to transmit the physical variable, and a battery 139 for
`powering the hand-held surgicaltool. It is not limited to these
`features and mayinclude other components of a surgical tool,
`for example, a processor 132 a user interface 116 and asso-
`ciated memory 138. It can also includea drill, a saw,a rotor,
`a stator or other mechanical hardware. The data processing
`system 150 receives the physical variable and orientation
`information reported from andrelated to the hand-held sur-
`gical tool during the surgery.
[0019] The surgical tool 139 communicates a physical variable associated with the surgical procedure to the data processing system 104 responsive to the sensing unit 110 detecting a touchless gesture control and predetermined orientation of the surgical tool 139. The sensing unit 141 (see processor 114 of FIG. 1B) detects touchless gestures of re-centering, accelerated movements, left-right movements, up-down movements, and zoom in and out movements. Touchless finger pointing and hand gestures control aspects of a user interface presented through the data processing system 104. Aspects of touchless sensing are disclosed in issued U.S. Pat. No. 7,620,316, which is incorporated by reference in its entirety.
[0020] In another configuration, the hand-held surgical tool 139 includes the touchless sensing device as the user interface component 136 thereon for use within a sterile field during surgery. The hand-held surgical tool 139 comprises the sensor 130 for sensing a physical variable related to the surgery, the wireless communication unit 134 to transmit the physical variable, and the battery 137 for powering the hand-held surgical tool. The touchless sensing unit on the hand-held surgical tool identifies a location and movement of a finger or hand gesture, and the processor 132 communicates the physical variable from the hand-held surgical tool 139 to the data processing system 150 responsive to the touchless gesture control and predetermined orientation of the surgical tool. The hand-held surgical tool 139 sends the physical variable and orientation information to the data processing system 150 outside the sterile field that is wirelessly coupled to the surgical tool, and that provides operative directives back to the surgical tool for performing a work flow procedure of the surgery based on the physical variable and orientation.
`
[0021] The touchless user interface 150 comprises a sensing unit 141 for identifying a location and movement of a finger or hand gesture; a processor to operate display information for touchless sensing, touchless image scrolling, selection, and saving, touchless gesture controls, and touchless image rotation; and a communications interface 142 for sending the location and movement to a display of the data processing system for controlling a user interface of the hand-held surgical tool. The data processing system 150 can include a processor 151, a memory 152, a wireless unit 153 and a user interface 154. The touchless sensing device 100 provides 1) display calibration for touchless sensing, 2) touchless thumbnail scrolling, 3) touchless gesture controls, 4) touchless image rotation, 5) a small vowel recognition vocabulary for enhanced interface control, and 6) pointing to image features with drag & drop labeling.
[0022] One aspect of range detection and positioning determination as described below is disclosed in issued U.S. Pat. No. 7,414,705, which is incorporated by reference in its entirety. As one particular example, the sensor comprises a pulse shaper for producing a pulse shaped signal, the pulse shaped signal intended for reflecting off an anatomical feature to produce a reflected signal, wherein at least one portion of the pulse shaped signal is at least one among a frequency modulated region, constant frequency region, phase modulated region, and a chirp region; a phase detector for receiving and identifying a relative phase of the reflected signal with respect to a previously received reflected signal; and a processor operatively coupled to the pulse shaper for receiving the reflected signal, tracking a location and a movement of the hand-held surgical tool from an estimated arrival time of the reflected signal and the relative phase, and providing the physical variable to a user interface control in accordance with the location and the movement of the hand-held surgical tool. The processor estimates the location of the first object from a frequency modulated region of the reflected signal, and a velocity of the first object from the relative phase from a continuous frequency region of the reflected signal, and responsive to a touchless gesture control and predetermined orientation of the surgical tool communicates the physical variable from the hand-held surgical tool to the data processing system.
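For illustration only, the range and velocity arithmetic behind such a pulse-echo arrangement can be sketched as follows; the nominal 343 m/s sound speed and all names are assumptions of this description:

    import math

    SPEED_OF_SOUND_M_S = 343.0   # nominal airborne ultrasound speed (assumed)

    def echo_range_m(time_of_flight_s: float) -> float:
        # Round-trip arrival time of the reflected pulse to one-way range.
        return SPEED_OF_SOUND_M_S * time_of_flight_s / 2.0

    def radial_velocity_m_s(delta_phase_rad: float, carrier_hz: float,
                            pulse_interval_s: float) -> float:
        # Velocity from the relative phase between successive reflected
        # pulses in the constant-frequency region: a 2*pi phase advance
        # corresponds to one wavelength of round-trip (half a wavelength
        # of one-way) path change per pulse interval.
        wavelength_m = SPEED_OF_SOUND_M_S / carrier_hz
        displacement_m = (delta_phase_rad / (2.0 * math.pi)) * (wavelength_m / 2.0)
        return displacement_m / pulse_interval_s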
[0023] FIGS. 2A and 2B show exemplary calibration steps for setting up the touchless sensing device 100. The device is calibrated to the display based on the finger range of motion. That is, the dimensions of the touchless sensing space are mapped to the display. For instance, a 6 in left and 6 in right motion (12 inch horizontal span) is mapped to the 30-50 inch wide display range. A 4 in up and 4 in down motion (8 inch vertical span) is mapped to the 20-30 inch display height, since the screen is rectangular and vertical arm movement may require more lifting (fatiguing) motion. The touchless sensing device 100 can also be used in conjunction with a hand-held device such as a wireless pointer in addition to the previously shown hand-held drill. As illustrated, the surgeon pauses the finger/hand, then moves the finger/hand left and right for the desired full range of arm/hand motion. Sensors can also be included in a surgical glove to provide further gesture functionality. The dual bars open during the jitter motion to visually identify the mapped horizontal boundaries. Similarly, during height calibration, dual bars are shown opening during an up-down jitter motion to visually identify the mapped vertical boundaries.
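For illustration only, a minimal sketch of this linear mapping follows; the 1920 and 1080 pixel dimensions are assumed for the example and are not recited in the disclosure:

    def make_axis_map(span_in: float, screen_px: int):
        # Returns a function mapping a centered hand offset in inches
        # (negative on one side, positive on the other) to a pixel coordinate.
        def to_px(offset_in: float) -> int:
            half = span_in / 2.0
            offset = max(-half, min(half, offset_in))   # clamp to calibration
            return int(round((offset + half) / span_in * (screen_px - 1)))
        return to_px

    # Spans from the calibration above; pixel dimensions are assumed.
    x_map = make_axis_map(12.0, 1920)   # 6 in left .. 6 in right across the width
    y_map = make_axis_map(8.0, 1080)    # 4 in down .. 4 in up over the height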
`[0024]
`In another embodiment, the touchless sensing unit
`can be configured for touchless interface control and range
`
`
`
`US 2010/0231509 Al
`
`Sep. 16, 2010
`
`detection. In either configuration the sensor identifies a loca-
`tion and movementof a finger or hand gesture, and the com-
`munications interface sends the location and movement to a
`display device of the data processing system for controlling
`one or more parameters of the surgical tool 139 by wayofthe
`touchless user interface 150. The touchless sensing unit com-
`municates a physical variable captured during a surgical pro-
`cedure from the surgical tool to the data processing system for
`presentation in the display responsive to a touchless gesture
`control and predeterminedorientation of the surgical tool. As
`one example, the user pauses the surgical tool 139 at a certain
`angle and with the other hand points to the screen. The accel-
`erometeridentifies an axial tilt and movement. The processor
`references the orientation of the hand-held surgical tool with
`respect to a coordinate system established by the anatomical
`feature position, tilt and movement. The interface isolates and
`displays a pointof interest, for example, an anatomical fea-
`ture of the patient according to an orientation of the surgical
`tool. The surgical tool can capture an image of and a distance
`to the anatomical feature which is reported on the display.
`Surgical feedback provided via the touchless user interface
`can then be usedto set one or more parameters of the surgical
`tool 139 for updating the work flow. The sensorcan be opera-
`tively coupledto the surgical tool. In another embodimentthe
`sensor is operatively coupled to the data processing system
`apart from the tool. As another example, the processor 114
`references the orientation of a hand-held surgical tool with
`respect to the anatomicalfeature, distance, tilt and movement.
`The data processing system 150 provides operative directives
`to the surgical tool in accordance with a predetermined work
`flow that takes into accountthe physical variable.
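For illustration only, the gating of a report on a coincident gesture and predetermined orientation can be sketched as follows, reusing at_working_angle from the earlier sketch; the ToolState structure and field names are hypothetical:

    from dataclasses import dataclass

    @dataclass
    class ToolState:
        physical_variable: float    # e.g., a sensed distance or force
        tilt_deg: float             # axial tilt from the accelerometer
        gesture_detected: bool      # from the touchless sensing unit

    def report_if_ready(state: ToolState, send) -> bool:
        # Transmit only when a touchless gesture coincides with the
        # predetermined orientation; returns True when a report was sent.
        if state.gesture_detected and at_working_angle(state.tilt_deg):
            send({"value": state.physical_variable, "tilt_deg": state.tilt_deg})
            return True
        return False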
`[0025]
`FIGS. 3A and 3B showsa thumbnail scrolling user
`interface on the screen 300. The screen can include a variable
`
`or fixed numberof thumbnail images 302 on the top row. To
`refresh more images, surgeon can re-center (1) the finger once
`its pointedto thefarleft or right ofreach. As another example,
`the system can have 10 thumbnail images 302 in a buffer, but
`only show 5 at a time, wherethe entire 10 images are mapped
`to the surgeon’s full horizontal hand motion to permit scroll-
`ing of the 10 images. The active image(the one pointed too)
`can have a colored box outline for example. That image can be
`actively displayed in the lowerleft corner 304 as the surgeon
`fingers across (scrolls 2) thumbnail images. The surgeon can
`disengage (3) via retracted finger movement or voice com-
`mand.
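For illustration only, the mapping of the full hand span onto a 10 image buffer with 5 visible thumbnails can be sketched as follows; the function name and the normalized input are assumptions of this description:

    def thumbnail_window(finger_x_norm: float, buffered: int = 10,
                         visible: int = 5):
        # Map the surgeon's full horizontal hand range (0.0 .. 1.0) onto
        # the buffered thumbnails: the active index sweeps all 10 images
        # while only 5 are shown, the window sliding to keep it in view.
        x = max(0.0, min(1.0, finger_x_norm))
        active = min(buffered - 1, int(x * buffered))
        first = max(0, min(active - visible // 2, buffered - visible))
        return first, active   # first visible index, highlighted index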
`
[0026] One example method of thumbnail scrolling with five (5) thumbnail images fixed on the display includes method steps where the surgeon raises and centers the finger/hand and pauses for a brief moment (e.g., 1 second) in front of the touchless sensing device 100 to indicate readiness; a border 302 will be displayed indicating touchless control acquired. The surgeon moves the finger/hand left or right to scroll through images. An actively selected image will be outlined by a border box and will enlarge in the (e.g., left main) display area. The surgeon retracts the finger/hand to disengage touchless control, which leaves the active image enlarged. Alternatively, the surgeon can say a word to disengage (exit) touchless control via voice recognition applications. One example of voice recognition combined with touchless sensing is disclosed in U.S. patent application Ser. No. 12/120,654, the entire contents of which are hereby incorporated by reference.
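For illustration only, the engage and disengage behavior can be sketched as a small state machine; the 30 Hz sensing update rate, and hence the 30 frame pause, are assumed values:

    from enum import Enum, auto

    class Control(Enum):
        IDLE = auto()       # touchless control not engaged
        ENGAGED = auto()    # border displayed; hand motion scrolls images

    PAUSE_FRAMES = 30       # about 1 s at an assumed 30 Hz update rate

    def step(state: Control, held_frames: int,
             centered_and_still: bool, disengage: bool):
        # One sensor-frame update: a ~1 s centered pause engages control;
        # a retraction or a spoken exit word (disengage) releases it.
        if state is Control.IDLE:
            held_frames = held_frames + 1 if centered_and_still else 0
            if held_frames >= PAUSE_FRAMES:
                return Control.ENGAGED, 0
            return Control.IDLE, held_frames
        if disengage:
            return Control.IDLE, 0
        return Control.ENGAGED, 0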
[0027] FIGS. 4A and 4B show exemplary touchless gesture controls for the user interface on the screen 400. To select an image the surgeon can perform a brief up/down (~1 inch) finger motion. In an alternate configuration, the surgeon can do brief right->re-center movements to scroll to the right, and left->re-center movements to scroll left, so the hand can remain sufficiently centered. A simple voice command can be used to start touchless gesture controls, or to override touchless control for image thumbnail scrolling as shown. The 3D location at which the surgeon re-centers the finger/hand can establish the reference zoom plane. The surgeon can slowly move the finger forward (in) to zoom in. Then, the surgeon can slowly move the finger back to zoom out, back to the reference zoom plane.
[0028] One example method of gesture control comprises steps where the surgeon jitters the finger/hand up and down to select the image. The thumbnail border turns green and flashes to indicate a waiting state. The surgeon then re-centers and pauses the finger/hand in front of the touchless sensing device 100 to acquire gesture control (e.g., the thumbnail border then stops flashing and turns solid green, indicating ready). Re-centering is also the motion required if the surgeon previously disengaged iPoint control. As one example, the surgeon speaks a voice command to start touchless navigation/zoom, and can then move up/down/left/right to navigate the image in conjunction with an inward pointing movement to zoom in on the image. (Zoom-out is permitted after zoom-in.)
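For illustration only, the reference zoom plane behavior can be sketched as follows; the zoom gain and maximum are assumed values, not recited in the disclosure:

    def zoom_factor(finger_z_in: float, ref_z_in: float,
                    gain_per_inch: float = 0.25, max_zoom: float = 4.0) -> float:
        # finger_z_in is the finger's distance from the sensing device; the
        # reference plane ref_z_in is captured where the surgeon re-centered.
        # Moving forward (closer than the plane) zooms in; moving back
        # returns toward the 1.0 zoom of the reference plane, never below it.
        z = 1.0 + gain_per_inch * (ref_z_in - finger_z_in)
        return max(1.0, min(max_zoom, z))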
[0029] Means for operation of the touchless sensing unit for gesture control and user interfaces are disclosed in the following U.S. patent applications, all of which are hereby incorporated by reference in their entirety: 11/562,413; 11/566,137; 11/566,148; 11/566,156; 11/683,410; 11/683,412; 11/683,413; 11/683,415 and 11/683,416.
[0030] An accelerated retracting finger movement (or voice command) can disable (exit) touchless control, thereby temporarily locking the image at the zoom level and position just prior to the accelerated retracted finger movement. This releases touchless control and permits the surgeon to continue the medical procedure. In order to enable and re-engage touchless control, the surgeon can center and pause the finger/hand in front of the iPoint again. A thumb motion on the same hand can also be performed for control (an action similar to mimicking a thumb trigger). The 3D location at which the surgeon re-centers the finger/hand establishes the reference zoom plane. The surgeon can slowly move the finger forward (in) to zoom in. Then, the surgeon can slowly move the finger back to zoom out, back to the reference zoom plane.
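For illustration only, the accelerated retraction that locks the image can be detected with a simple speed threshold; the 20 inches per second value is an assumption of this description:

    RETRACT_SPEED_IN_S = 20.0   # assumed threshold, inches per second

    def is_accelerated_retraction(z_prev_in: float, z_now_in: float,
                                  dt_s: float) -> bool:
        # Distances are measured from the sensing device, so a rapidly
        # growing distance is a fast pull-back; detecting it locks the
        # image and releases touchless control.
        return (z_now_in - z_prev_in) / dt_s > RETRACT_SPEED_IN_S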
[0031] An accelerated retracting finger movement can disable touchless control, thereby temporarily locking the image at the zoom level and position just prior to the accelerated retracted finger movement. This releases touchless control and permits the surgeon to continue the medical procedure. (A thumb motion on the same hand can alternatively signal the image lock instead of the accelerated retracting movement.) To enable and re-engage touchless control, the surgeon can center and pause the finger/hand in front of the sensing device 100 again. Voice commands can also be used in conjunction with drag and drop labeling. The surgeon can select labels from a user interface 306 which can then be dragged to anatomical features on the image 304.
`[0032]
`FIG. 5 showsan exemplary touchless user interface
`application for image rotation. Uponselecting an image 502,
`then displayed in a larger window 504,the surgeon can speak
`a voice command 506 such as “rotate”, to commence touch-
`less rotation controls. The surgeon can then perform touch-
`less clockwise and counter clockwise finger motionsto rotate
`the image. The touchless sensing device 100 translates the
`
`
`
`US 2010/0231509 Al
`
`Sep. 16, 2010
`
`touchless sensing,
`touchless imagescrolling, selection, and saving;
`touchless gesture controls, and
`touchless imagerotation; and
`a communications interface for sending the location and
`movementto a display of the data processing system for
`controlling a user interface of the hand-held surgical
`tool.
`3. The sterile networked interface of claim 2, where the
`processordetects touchless gestures of re-centering, acceler-
`ated movements,left-right movements, up-down movements,
`and zoom in and out movements.
`4. The sterile networked interface of claim 1 where the
`sensor includes
`
`finger motions to imagetranslationsthatrotate the image. The
`ipoint can identify a finger (or hand) pauseto stop rotation and
`lock to the current rotation to permit surgeon toretract hand.
`Rotation controls include voice recognition, circular finger
`motions, and forward andretracting hand motions.
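For illustration only, the circular finger motion can be converted to an image rotation by accumulating the angle swept about the gesture center; the function name and wrap convention are assumptions of this description:

    import math

    def rotation_step_deg(x_prev: float, y_prev: float,
                          x_now: float, y_now: float,
                          cx: float, cy: float) -> float:
        # Angle swept by the fingertip about the gesture center (cx, cy);
        # positive for counter-clockwise circular motion. Summing the
        # steps while the finger circles gives the rotation to apply.
        a_prev = math.atan2(y_prev - cy, x_prev - cx)
        a_now = math.atan2(y_now - cy, x_now - cx)
        step = math.degrees(a_now - a_prev)
        return (step + 180.0) % 360.0 - 180.0   # wrap to (-180, 180]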
[0033] Means for operation of the touchless sensing unit to detect scrolling, gestures and rotations for controlling user interfaces are disclosed in the following U.S. patent applications, all of which are hereby incorporated by reference in their entirety: 11/839,323; 11/844,329; 11/850,634; 11/850,637; 11/930,014; 11/936,777; 11/936,778; 12/050,790; 12/099,662 and 12/120,654.
[0034] The illustrations of embodiments described herein are intended to provide a general understanding of the structure of various embodiments, and they are not intended to serve as a complete description of all the elements and features of apparatus and systems that might make use of the structures described herein. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. Other embodiments may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. Figures are also merely representational and may not be drawn to scale. Certain proportions thereof may be exaggerated, while others may be minimized. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.

[0035] Such embodiments of the inventive subject matter may be referred to herein, individually and/or collectively, by the term "invention" merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed. Thus, although specific embodiments have been illustrated and described herein, it should be appreciated that any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.

What is claimed is:
1. A sterile networked interface system comprising:
a hand-held surgical tool used within a sterile field during surgery, the surgical tool comprising:
a sensor for sensing a physical variable related to the surgery;
a wireless communication unit to transmit the physical variable; and
a battery for powering the hand-held surgical tool;
a data processing system outside the sterile field wirelessly coupled to the surgical tool for receiving the physical variable and orientation information reported from and related to the hand-held surgical tool during the surgery; and
a touchless interface that responsive to a touchless gesture control and predetermined orientation of the surgical tool communicates the physical variable from the hand-held surgical tool to the data processing system.
2. The sterile networked interface of claim 1 where the touchless user interface comprises:
a sensing unit for identifying a location and movement of a finger or hand gesture;
a processor to operate display information for:
touchless sensing,
touchless image scrolling, selection, and saving;
touchless gesture controls, and
touchless image rotation; and
a communications interface for sending the location and movement to a display of the data processing system for controlling a user interface of the hand-held surgical tool.
3. The sterile networked interface of claim 2, where the processor detects touchless gestures of re-centering, accelerated movements, left-right movements, up-down movements, and zoom in and out movements.
4. The sterile networked interface of claim 1 where the sensor includes
an optical camera element to image an anatomical feature in the sterile field;
an ultrasonic transducer to measure a distance to the anatomical feature that is associated with the physical variable;
and, an accelerometer for identifying an axial tilt and movement; where the processor references the orientation of the hand-held surgical tool with respect to the anatomical feature, distance, tilt and movement.
5. The sterile networked interface of claim 3 where the data processing system provides operative directives to the surgical tool in accordance with a predetermined work flow that takes into account the physical variable.
6. A hand-held surgical tool suitable for use within a sterile field during surgery, the surgical tool comprising:
a sensor for sensing a physical variable related to the surgery;
a wireless communication unit to transmit the physical variable;
a battery for powering the hand-held surgical tool;
a touchless sensing unit on the hand-held surgical tool for identifying a location and movement of a finger or hand gesture; and
a processor to communicate the physical variable from the hand-held surgical tool to a data processing system responsive to the touchless gesture control and predetermined orientation of the surgical tool.
7. The hand-held surgical tool of claim 6 sends the physical variable and orientation information to the data processing system outside the sterile field that is wirelessly coupled to the surgical tool, and that provides operative directives back to the surgical tool for performing a work flow procedure of the surgery based on the physical variable and orientation.
8. The hand-held surgical tool of claim 6, where the sensor includes
an optical camera element to image an anatomical feature in the sterile field;
an ultrasonic transducer to measure a position of the anatomical feature relative to the hand-held surgical tool that is associated with the physical variable;
and, an accelerometer for identifying an axial tilt and movement; where the processor references the orientation of the hand-held surgical tool with respect to a coordinate system established by the anatomical feature position, tilt and movement.
`
`
9. The hand-held surgical tool of claim 6, where the sensor comprises:
a pulse shaper for producing a pulse shaped signal, the pulse shaped signal intended for reflecting off an anatomical feature to produce a reflected signal, wherein at least one portion of the pulse shaped signal is at least one among a frequency modulated region, constant frequency region, phase modulated region, and a chirp region;
a phase detector for receiving and identifying a relative phase of the reflected signal with respect to a previously received reflected signal; and
a processor operatively coupled to the pulse shaper for receiving the reflected signal, tracking a location and a movement of the hand-held surgical tool from an estimated arrival time of the reflected signal and the relative phase, and providing the physical variable to a user interface control in accordance with the location and the movement of the hand-held surgical tool,
wherein the processor estimates the location of the first object from a frequency modulated region of the reflected signal, and a velocity of the first object from the relative phase from a continuous frequency region of the reflected signal and responsive to a touchless gesture control and predetermined orientation of the surgical tool communicates the physical variable from the hand-held surgical tool to the data processing system.
... system for controlling one or more parameters of a surgical tool by way of a touchless user interface, where the touchless sensing unit communicates a physical variable captured during a surgical procedure from the surgical tool to the data processing system for presentation in the display responsive to a touchless gesture control of the finger or hand movement and predetermined orientation of the surgical tool.
12. The touchless sensing unit of claim 11, where the sensor is operatively coupled to the data processing system.
13. The touchless sensing unit of claim 11, where the sensor is operatively coupled to the surgical tool.
14. The touchless sensing unit of claim 11, comprising:
a wireless communication unit to transmit the physical variable; and
a battery for powering the hand-held surgical tool.
15. The touchless sensing unit of claim 11, where the sensor includes
an optical camera element to image an anatomical feature in the sterile field;
an ultrasonic transducer to measure a position of the anatomical feature relative to the hand-held surgical tool that is associated with the physical variable