Exploiting Proprioception in
Virtual-Environment Interaction

by

Mark Raymond Mine

A dissertation submitted to the faculty of the University of North Carolina at Chapel Hill in partial fulfillment of the requirements for the degree of Doctor of Philosophy in the Department of Computer Science.

Chapel Hill

1997

Approved by:

_____________________________
Dr. Frederick P. Brooks Jr., Advisor

_____________________________
Dr. Henry Fuchs

_____________________________
Dr. Gary Bishop, Reader

_____________________________
Dr. Anselmo Lastra

_____________________________
Dr. John Tector, Reader

_____________________________
Dr. Carlo H. Sequin
SCEA Ex. 1032 Page 1
© 1997
Mark Raymond Mine
ALL RIGHTS RESERVED
ABSTRACT

Mark Raymond Mine
Exploiting Proprioception in Virtual-Environment Interaction
(Under the direction of Frederick P. Brooks, Jr.)

Manipulation in immersive virtual environments is difficult partly because users must do without the haptic contact with real objects they rely on in the real world to orient themselves and the objects they are manipulating. To compensate for this lack, I propose exploiting the one real object every user has in a virtual environment, his body. I present a unified framework for virtual-environment interaction based on proprioception, a person's sense of the position and orientation of his body and limbs. I describe three forms of body-relative interaction:

• Direct manipulation—ways to use body sense to help control manipulation
• Physical mnemonics—ways to store/recall information relative to the body
• Gestural actions—ways to use body-relative actions to issue commands

Automatic scaling is a way to bring objects instantly within reach so that users can manipulate them using proprioceptive cues. Several novel virtual interaction techniques based upon automatic scaling and our proposed framework of proprioception allow a user to interact with a virtual world intuitively, efficiently, precisely, and lazily.

Two formal user studies evaluate key aspects of body-relative interaction. The virtual docking study compares the manipulation of objects co-located with one's hand and the manipulation of objects at a distance. The widget interaction experiment explores the differences between interacting with a widget held in one's hand and interacting with a widget floating in space.

Lessons learned from the integration of body-relative techniques into a real-world system, the Chapel Hill Immersive Modeling Program (CHIMP), are presented and discussed.
ACKNOWLEDGMENTS

Thanks to

Frederick P. Brooks Jr., Gary Bishop, Henry Fuchs, Anselmo Lastra, John Tector, and Carlo H. Sequin for serving on my doctoral dissertation committee;

Frederick P. Brooks Jr., my advisor, for his insights, inspiration, and for making it all so clear;

Gary Bishop for many fruitful years of collaboration and for not minding too much that my dissertation didn't have wires and accelerometers coming out of it;

Henry Fuchs for the inspiration of his boundless energy, enthusiasm, and love of knowledge;

Anselmo Lastra for his kindness, advice, and for keeping Pixel-Planes 5 alive long enough for me to graduate;

Carlo Sequin for asking the hard questions, and for helping me to keep it simple;

John Tector for the many wonderful conversations about architecture and design;

Warren Robinett for leading me to the University of North Carolina;

Rick Zobel for paving the way for my investigations into immersive design;

Robert Zeleznik for his invaluable contributions to this work;

Linda Houseman, Dave Harrison, Todd Gaul, and Peggy Wetzel for all of their help during the years;

Hans Weber and Greg Welch for being such good friends and for the meetings of the IHBI at the TOTH1;

Erik Erikson for Vinimini, G2, Speed Racer, and for keeping it fun;

Eliza Graves for the laughter and the smiles;

My parents for all they have done for me through the years;

Dylan for the incredible joy he has brought to my life;

Baby X for the many wonderful years to come;

and most importantly,

Sandra for her unwavering love, support, faith, and devotion, and for, more than anyone else, making it all possible.

Financial support for this work came from the following agencies: Defense Advanced Research Projects Agency, Link Foundation, and Lockheed Missiles and Space, Inc. (indirect DARPA).

1Institute for Half-Baked Ideas at the Top of the Hill
TABLE OF CONTENTS

                                                                      Page
LIST OF TABLES ...................................................... x
LIST OF FIGURES ..................................................... xi
LIST OF ABBREVIATIONS ............................................... xiii

Chapter
I.    Introduction .................................................. 1
      1.1   The Research ............................................ 1
      1.2   The Challenge ........................................... 1
      1.3   The Attack .............................................. 2
      1.4   A Complication .......................................... 3
      1.5   A Proposal .............................................. 5
      1.6   Overview ................................................ 6
II.   Related Work .................................................. 8
      2.1   3-DoF and 6-DoF Object Manipulation Using 2D Input ...... 8
      2.2   Object Manipulation Using Higher-Dimensional Input ...... 11
      2.3   Two-handed Interaction .................................. 14
            2.3.1   Example Techniques .............................. 14
            2.3.2   Theoretical and Experimental Results ............ 19
      2.4   Manipulating Objects Using Gesture and Voice ............ 22
      2.5   Systems for Interactive Design .......................... 24
            2.5.1   Working Through-the-window ...................... 25
            2.5.2   Working Immersed ................................ 30
III.  Body-Relative Interaction Techniques .......................... 33
      3.1   Working Within Arm's Reach .............................. 33
      3.2   Sample Interaction Techniques ........................... 36
            3.2.1   Direct Manipulation ............................. 36
                    3.2.1.1   Scaled-World Grab for Manipulation .... 36
                    3.2.1.2   Scaled-World Grab for Locomotion ...... 38
            3.2.2   Physical Mnemonics .............................. 38
                    3.2.2.1   Pull-Down Menus ....................... 38
                    3.2.2.2   Hand-Held Widgets ..................... 39
                    3.2.2.3   FOV-Relative Mode Switching ........... 41
            3.3.3   Gestural Actions ................................ 41
                    3.3.3.1   Head-Butt Zoom ........................ 41
                    3.3.3.2   Look-at Menus ......................... 43
                    3.3.3.3   Two-Handed Flying ..................... 43
                    3.3.3.4   Over-the-Shoulder Deletion ............ 44
IV.   User Study 1—Virtual Object Docking ........................... 46
      4.1   Introduction ............................................ 46
      4.2   Hypotheses .............................................. 47
      4.3   The Experiment .......................................... 47
            4.3.1   Subjects ........................................ 47
            4.3.2   Experimental Platform ........................... 47
            4.3.3   The Task ........................................ 48
            4.3.4   Experimental Conditions ......................... 49
            4.3.5   Experimental Procedure .......................... 50
      4.4   Results ................................................. 51
      4.5   Questionnaire Results ................................... 54
      4.6   Discussion .............................................. 56
      4.7   Conclusion .............................................. 56
V.    User Study 2—Proprioception and Virtual Widget Interaction .... 58
      5.1   Introduction ............................................ 58
      5.2   Hypotheses .............................................. 59
      5.3   The Experiment .......................................... 59
            5.3.1   Subjects ........................................ 59
            5.3.2   Experimental Platform ........................... 60
            5.3.3   The Task ........................................ 61
            5.3.4   Experimental Procedure .......................... 62
      5.4   Results ................................................. 63
      5.5   Questionnaire Results ................................... 64
      5.6   Discussion .............................................. 65
      5.7   Conclusions ............................................. 66
VI.   CHIMP—The Chapel Hill Immersive Modeling Program .............. 68
      6.1   CHIMP Overview .......................................... 68
      6.2   Managing Modes .......................................... 73
            6.2.1   Rotary Tool Chooser ............................. 74
            6.2.2   Two-Dimensional Control Panels .................. 75
            6.2.3   Look-at Menus ................................... 77
            6.2.4   Pull-Down Menus ................................. 77
      6.3   Object Selection ........................................ 78
      6.4   Object Manipulation ..................................... 81
      6.5   Object Generation ....................................... 82
      6.6   Constrained Object Manipulation ......................... 83
            6.6.1   Co-Located Widgets .............................. 83
            6.6.2   Hand-Held Widgets ............................... 85
      6.7   Numeric Input in a Virtual World ........................ 86
VII.  Final Words ................................................... 90
      7.1   Conclusions ............................................. 90
      7.2   Contributions ........................................... 91
      7.3   Future Work ............................................. 93
            Localized Haptic Feedback ............................... 93
A.    A Review of the State-of-the-Art of Computer-Aided Modeling ... 95
      A.1   Introduction ............................................ 95
      A.2   Modeling Techniques and Paradigms ....................... 96
            A.2.1   Input for a Three-Dimensional Task .............. 96
                    A.2.1.1   Numeric Input ......................... 96
                    A.2.1.2   Relative Input ........................ 96
                    A.2.1.3   2D Interactive Input .................. 97
            A.2.2   Output of a Three-Dimensional Space ............. 98
                    A.2.2.1   Format of the Modeling View ........... 99
                    A.2.2.2   Three-Dimensional Visualization:
                              Separate or Integrated ................ 100
                    A.2.2.3   Three-Dimensional Visualization:
                              Static or Interactive ................. 101
                    A.2.2.4   Complications of Two-Dimensional Output 101
      A.3   Modeling System Capability Comparison ................... 102
      A.4   Modeler Reviews Overview ................................ 105
      A.5   Archicad ................................................ 107
            A.5.1   Overview ........................................ 107
            A.5.2   Model Creation .................................. 108
            A.5.3   Model Modification .............................. 108
            A.5.4   Model Interaction/Visualization ................. 109
            A.5.5   Manual .......................................... 109
            A.5.6   Comments/Impressions ............................ 110
      A.6   AutoCAD ................................................. 111
            A.6.1   Overview ........................................ 111
            A.6.2   Model Creation .................................. 112
            A.6.3   Model Modification .............................. 113
            A.6.4   Model Interaction/Visualization ................. 114
            A.6.5   Manual .......................................... 114
            A.6.6   Comments/Impressions ............................ 114
      A.7   DesignWorkshop .......................................... 115
            A.7.1   Overview ........................................ 115
            A.7.2   Model Creation .................................. 116
            A.7.3   Model Modification .............................. 117
            A.7.4   Model Interaction/Visualization ................. 117
            A.7.5   Manual .......................................... 118
            A.7.6   Comments/Impressions ............................ 118
      A.8   Designer's Workbench .................................... 119
            A.8.1   Overview ........................................ 119
            A.8.2   Model Creation .................................. 120
            A.8.3   Model Modification .............................. 121
            A.8.4   Model Interaction/Visualization ................. 122
            A.8.5   Manual .......................................... 122
            A.8.6   Comments/Impressions ............................ 122
      A.9   Form-Z .................................................. 123
            A.9.1   Overview ........................................ 123
            A.9.2   Model Creation .................................. 124
            A.9.3   Model Modification .............................. 125
            A.9.4   Model Interaction/Visualization ................. 125
            A.9.5   Manual .......................................... 126
            A.9.6   Comments/Impressions ............................ 126
      A.10  IGRIP ................................................... 128
            A.10.1  Overview ........................................ 128
            A.10.2  Model Creation .................................. 130
            A.10.3  Model Modification .............................. 130
            A.10.4  Model Interaction/Visualization ................. 131
            A.10.5  Manual .......................................... 131
            A.10.6  Comments/Impressions ............................ 132
      A.11  Minicad+4 ............................................... 133
            A.11.1  Overview ........................................ 133
            A.11.2  Model Creation .................................. 134
            A.11.3  Model Modification .............................. 135
            A.11.4  Model Interaction/Visualization ................. 136
            A.11.5  Manual .......................................... 136
            A.11.6  Comments/Impressions ............................ 136
      A.12  MultiGen ................................................ 138
            A.12.1  Overview ........................................ 138
            A.12.2  Model Creation .................................. 139
            A.12.3  Model Modification .............................. 140
            A.12.4  Model Interaction/Visualization ................. 140
            A.12.5  Manual .......................................... 141
            A.12.6  Comments/Impressions ............................ 141
      A.13  Sculpt 3D ............................................... 143
            A.13.1  Overview ........................................ 143
            A.13.2  Model Creation .................................. 144
            A.13.3  Model Modification .............................. 145
            A.13.4  Model Interaction/Visualization ................. 145
            A.13.5  Manual .......................................... 146
            A.13.6  Comments/Impressions ............................ 146
      A.14  Upfront ................................................. 148
            A.14.1  Overview ........................................ 148
            A.14.2  Model Creation .................................. 149
            A.14.3  Model Modification .............................. 150
            A.14.4  Model Interaction/Visualization ................. 150
            A.14.5  Manual .......................................... 151
            A.14.6  Comments/Impressions ............................ 151
      A.15  WalkThrough ............................................. 153
            A.15.1  Overview ........................................ 153
            A.15.2  Model Creation .................................. 154
            A.15.3  Model Modification .............................. 154
            A.15.4  Model Interaction/Visualization ................. 155
            A.15.5  Manual .......................................... 155
            A.15.6  Comments/Impressions ............................ 156
B.    References .................................................... 157
LIST OF TABLES

Table 1.1:  Successful virtual-world application domains ............ 3
Table 2.1:  Interactive design systems input/output comparison ...... 25
Table 4.1:  Mean time of trial completion by experimental condition . 52
Table 4.2:  Mean questionnaire results by technique ................. 54
Table 4.3:  F statistic and significance by questionnaire category .. 54
Table 4.4:  Co-located vs. fixed-offset, F statistic and
            significance by questionnaire category .................. 55
Table 4.5:  Co-located vs. variable-offset, F statistic and
            significance by questionnaire category .................. 55
Table 4.6:  Fixed-offset vs. variable-offset, F statistic and
            significance by questionnaire category .................. 55
Table 5.1:  Mean positional accuracy by experimental condition ...... 63
Table 5.2:  Mean questionnaire results by technique ................. 65
Table 5.3:  F statistic and significance by questionnaire category .. 65
Table 6.1:  CHIMP system overview ................................... 70
Table 6.2:  CHIMP's hand-held widgets ............................... 71
Table 6.3:  CHIMP's control panels .................................. 72
Table A.1:  Modeling packages reviewed .............................. 95
Table A.2:  Modeling system capability comparison ................... 103
Table A.3:  Modeling system paradigms ............................... 106
LIST OF FIGURES

Figure 2.1:  Nielson's triad cursor ................................. 8
Figure 2.2:  Constrained geometric transformation using widgets ..... 10
Figure 2.3:  Using object associations .............................. 11
Figure 2.4:  The Rockin' Mouse ...................................... 12
Figure 2.5:  Zhai et al.'s framework for the study of
             multi-degree-of-freedom manipulation schemes ........... 13
Figure 2.6:  Layers in a toolglass system ........................... 14
Figure 2.7:  Using toolglasses, two hands, and transparency in T3 ... 15
Figure 2.8:  Marking Menus interaction .............................. 16
Figure 2.9:  Using two hands and props in Netra ..................... 17
Figure 2.10: Object manipulation and spline editing using
             Fitzmaurice et al.'s graspable user interface .......... 18
Figure 2.11: The Responsive Workbench ............................... 19
Figure 2.12: Guiard's handwriting experiment ........................ 20
Figure 2.13: Buxton and Meyer's two-handed input experiment ......... 21
Figure 2.14: Kabbash et al.'s two-hand connect-the-dots experiment .. 22
Figure 2.15: VIDEODESK two-handed interaction ....................... 23
Figure 2.16: Using a gesture to move a group in GEdit ............... 23
Figure 2.17: Schmandt's stereoscopic display ........................ 26
Figure 2.18: University of Alberta's JDCAD system ................... 27
Figure 2.19: Using T junctions to infer object placement in SKETCH .. 29
Figure 2.20: UNC's nanoWorkbench .................................... 30
Figure 2.21: University of Virginia's World-In-Miniature ............ 31
Figure 3.1:  Automatic scaling of the world when the user grabs
             and releases an object ................................. 34
Figure 3.2:  Vectors used in determining automatic scaling factor ... 35
Figure 3.3:  Using a pull-down menu ................................. 39
Figure 3.4:  Using a hand-held widget ............................... 40
Figure 3.5:  Selecting a region for closer inspection ............... 42
Figure 3.6:  Look-at menu ........................................... 43
Figure 3.7:  Two-handed flying ...................................... 44
Figure 3.8:  Over-the-shoulder deletion ............................. 45
Figure 4.1:  Experimental conditions for the docking test ........... 49
Figure 4.3:  Mean docking times by technique ........................ 53
Figure 5.1:  Widget test objects .................................... 60
Figure 5.2:  Widget test experimental conditions .................... 62
Figure 5.3:  Mean positional accuracies by technique ................ 64
Figure 6.1:  Using the CHIMP system ................................. 69
Figure 6.2:  CHIMP's primary and secondary input devices ............ 69
Figure 6.3:  Rotary tool chooser .................................... 74
Figure 6.4:  Interacting with a control panel using a laser beam .... 76
Figure 6.5:  Interacting with a control panel using occlusion
             selection .............................................. 76
Figure 6.6:  CHIMP's look-at menus .................................. 77
Figure 6.7:  Occlusion selection, first person point of view ........ 79
Figure 6.8:  Occlusion selection, third person point of view ........ 79
Figure 6.9:  Spotlight selection, third person point of view ........ 80
Figure 6.10: Spotlight selection, first person point of view ........ 80
Figure 6.11: First and second generation constrained manipulation
             widgets ................................................ 84
Figure 6.13: Constrained manipulation mode selection based upon
             hand separation ........................................ 86
Figure 6.14: Numeric input using the Arithma Addiator ............... 87
Figure 6.15: Linear interactive numbers ............................. 87
Figure 6.16: Rotary interactive numbers ............................. 88
Figure A.1:  Orthogonal-view system ................................. 100
Figure A.2a: Perspective projection ambiguity ....................... 101
Figure A.2b: Three orthogonal views of the object in Figure A.2a .... 101
Figure A.3:  Archicad interface ..................................... 107
Figure A.4:  AutoCAD interface ...................................... 111
Figure A.5:  DesignWorkshop interface ............................... 115
Figure A.6:  Designer's Workbench interface ......................... 119
Figure A.7:  Form-Z interface ....................................... 123
Figure A.8:  IGRIP interface ........................................ 128
Figure A.9:  Minicad+ interface ..................................... 133
Figure A.10: 2D vs. 3D objects ...................................... 134
Figure A.11: MultiGen interface ..................................... 138
Figure A.12: Sculpt 3D interface .................................... 143
Figure A.13: Upfront interface ...................................... 148
Figure A.14: WalkThrough interface .................................. 153
LIST OF ABBREVIATIONS

1D       one-dimensional
2D       two-dimensional
3D       three-dimensional
ANOVA    analysis of variance
CAD      computer-aided design
CHIMP    Chapel Hill Immersive Modeling Program
DoF      degree of freedom
GUI      graphical user interface
HMD      head-mounted display
K        kilo
MANOVA   multivariate analysis of variance
UI       user interface
UNC      University of North Carolina
VE       virtual environment
VR       virtual reality
WIM      world in miniature
Chapter 1

Introduction

1.1 The Research

The goal of my research is a better understanding of what it means to work in a virtual world. The focus is the characterization of the benefits and limitations of this new medium. The hope is that improved understanding will lead to more effective virtual-environment interaction techniques: techniques that minimize user energy and make it possible to perform real-world work in a virtual world.

To motivate my research I have chosen the driving problem of three-dimensional (3D) modeling for architectural design, for several reasons. First, to evaluate the benefits and limitations of working in a virtual world fairly, it is important to concentrate on real-world tasks and not toy problems; architectural models are complex models that are difficult to build. Second, if we are to realize any benefits from working in a virtual world, it is important to focus on tasks that will truly profit from being in an immersive environment; the shapes and spaces inside architectural models, more so than mechanical models, are just as important as their external form.

1.2 The Challenge

The architectural design of three-dimensional spaces is inherently a difficult task. Even given the ultimate design system, in which thoughts magically become material, an architect would still encounter many difficulties in solving a typical design problem, with its myriad trade-offs and constraints.

In the real world, these inherent difficulties are compounded by incidental difficulties: problems that are the result of the chosen medium of expression and not inherent in the design problem itself. The duration and flexibility of the design cycle are highly sensitive to the amount of time required to represent and modify designs in the chosen medium. This is clearly true of sketches, of formal drawings, and of scale models—all media used for the expression of architectural designs.

The choice of the computer as a design medium has greatly simplified and sped up many aspects of the architectural design process. Just the ease of copying and erasing is one big plus. Many of the gains, however, are restricted to the transformation of existing design data or aspects (structural, mechanical, electrical), such as redrawing a single design from many views or managing large databases of materials and parts. The specification of original data is still a time-consuming and difficult task (half a man-year for a 30K-element model [Brooks, 1994]).

It is my belief that many of the shortcomings of the computer as a medium for the design of three-dimensional spaces are the result of the limitations of existing two-dimensional (2D) interfaces. Two-dimensional displays inherently inject ambiguity into the interpretation of displayed information [Gregory, 1973]. The use of two-dimensional input devices, such as the mouse or the data tablet, precludes the direct specification of three-dimensional positions, orientations, and extents. Designers are forced to look and work through small windows onto their virtual world. They tend to limit themselves to views along the principal axes plus a few other classical view directions. Indeed, despite many years of research and development, few computer-aided design programs approach the flexibility of the cocktail napkin or the architect's "trash"2 as a design tool.

1.3 The Attack

The underlying thesis motivating this research is that architects can design three-dimensional spaces more easily in an immersive environment than they can by modeling through-the-window, using conventional workstation inputs and displays. I believe this to be true for several reasons.

In an immersive environment one can directly perceive and manipulate three-dimensional objects instead of interacting with abstract interface elements. Users can harness interaction skills learned in the real world. This helps to make the computer interface truly transparent and allows users to work more directly with the objects of design.

2Tracing paper used by architects, which can be placed on top of existing drawings to try out new design ideas quickly without having to redraw the entire design.

In a virtual world one turns his head to look at something. Contrast this with the frustration of setting one's viewpoint in a 3D through-the-window application. Dynamic viewpoint change, whether immersive or through the window, gives better space perception.

Finally, using a head-mounted display, one becomes immersed within the virtual space, within which one intuitively changes viewpoint. Not only does this make it easier to understand the shapes and spaces being created, it means that controls and information can now be distributed about the user instead of being packed into a small window.
1.4 A Complication

Working in a virtual world is not without its own set of incidental difficulties. Indeed, though promising results have been demonstrated in several key application domains (Table 1.1), the number of successful virtual-environment applications remains small, with even fewer applications having gone beyond the research laboratory. Why?

Table 1.1: Successful virtual-world application domains.

  Domain                                Example Applications
  ------------------------------------  ------------------------------------------
  "Being There", experience for the     Phobia treatment: [Rothbaum, et al., 1995]
  sake of experience                    Aesthetics: [Davies and Harrison, 1996]
                                        Entertainment: [Pausch, et al., 1996]
  Training and practice of different    Surgery: [Hunter, et al., 1993]
  skills                                Military: [Macedonia, et al., 1994]
                                        Maintenance: [Wilson, et al., 1995]
                                        Wayfinding: [Witmer, et al., 1995]
  Visualization of unrealized or        Architecture: [Brooks, 1986]
  unseeable objects                     Fluid Flow: [Bryson and Levit, 1992]
                                        Nano-surfaces: [Taylor, et al., 1993]
  Design                                3D models: [Butterworth, et al., 1992]
                                        Cityscapes: [Mapes and Moshell, 1995]

Besides the well-known technological limitations such as system latency and display resolution, several less obvious factors complicate the task of virtual-object manipulation and hamper the development of real-world virtual-environment applications.

Many of these successes fall within the realm of spatial visualization. The applications exploit the intuitive view specification (via head tracking) offered by VR systems but make little use of direct virtual-object manipulation. Why is it difficult to do much more than look around in a virtual world?

1) The precise manipulation of virtual objects is hard. Although immersion, head-tracked view specification, and six-DoF hand tracking facilitate the coarse manipulation of virtual objects, the precise manipulation of virtual objects is complicated by:

• Lack of haptic feedback: Humans depend on haptic feedback and physical constraints for precise interaction in the real world; the lack of physical work surfaces to align against and rest on limits precision and exacerbates fatigue. Though there is considerable ongoing research in the area of active haptic feedback [Durlach and Mavor, 1995], general-purpose haptic-feedback devices that do not restrict the mobility of the user are not yet practical or available.

• Limited input information: Most virtual-environment systems accept position and orientation (pose) data on the user's head and (if lucky) two hands. One also typically has a button or glove to provide signal/event information. This suffices for specifying simple 6-DoF motion and placement. In the real world, we do this and much more:

  a) Object modification, usually with tools.
  b) Directing the cooperation of helping hands, by spoken commands ("Put that there").
  c) Measuring.
  d) Annotating objects with text.

  In contrast, today in most VR systems:

  a) Tool selection is difficult.
  b) Voice-command technology is marginally effective.
  c) Measuring tools are rarely available.
  d) Alphanumeric input is difficult.

• Limited precision: The lack of haptic and acoustic feedback, inaccurate tracking systems, and the whole-hand input typical of current VR systems restrict users to the coarse manipulation of virtual objects. Fine-grained manipulations are extremely difficult using this "boxing-glove" style interface. Shumin Zhai of the University of Toronto, for example, has demonstrated that users' task-completion times were slower in a 3D docking task when using a 3D input device that excluded the use of the fingers (vs. a similar device that utilized the fingers) [Zhai, et al., 1996].

2) Virtual environments lack a unifying framework for interaction, such as the desktop metaphor used in conventional through-the-window computer applications. Without haptics, neither real-world nor desktop-computer interaction metaphors are adequate in a virtual environment. Knowledge of how to manipulate objects or controls can no longer be "stored in the world" [Norman, 1988], with the physical constraints of the devices giving the user clues as to their use (e.g. a dial can only be rotated about its axis).
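
Absent those physical constraints, a virtual control has to impose its one legal degree of freedom in software, typically by projecting the user's free 6-DoF hand motion onto that single degree of freedom. The dial example can be sketched as follows; this is a hypothetical illustration (function and parameter names are mine, not code from any system discussed here):

```python
import math

def dial_angle(hand_pos, dial_center, dial_axis):
    """Reduce a free 3D hand position to a single dial rotation angle.

    The hand position is projected onto the plane perpendicular to the
    dial's (unit-length) axis, and only the angle in that plane drives
    the control, so the dial behaves as if it could rotate only about
    its axis no matter how the hand moves.
    """
    def sub(a, b):
        return [a[i] - b[i] for i in range(3)]

    def dot(a, b):
        return sum(a[i] * b[i] for i in range(3))

    def cross(a, b):
        return [a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0]]

    # Vector from dial center to hand, with its along-axis component removed.
    v = sub(hand_pos, dial_center)
    p = sub(v, [dot(v, dial_axis) * c for c in dial_axis])

    # Build an in-plane reference frame (u, w) perpendicular to the axis.
    ref = [1.0, 0.0, 0.0] if abs(dial_axis[0]) < 0.9 else [0.0, 1.0, 0.0]
    u = sub(ref, [dot(ref, dial_axis) * c for c in dial_axis])
    n = math.sqrt(dot(u, u))
    u = [c / n for c in u]
    w = cross(dial_axis, u)

    # Angle of the projected hand position within the dial's plane.
    return math.atan2(dot(p, w), dot(p, u))
```

The same projection pattern serves other one-degree-of-freedom controls: a slider projects the hand onto a line, a knob with detents quantizes the resulting angle.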

The desktop metaphor further breaks down when the user is inside the user interface. Interface controls and displays must move with the user as he moves through the environment and be made easy to locate and reach. The differences between working in a conventional computer environment and working immersed are analogous to the differences between a craftsman at a workbench and one moving about a worksite wearing a toolbelt. His toolbelt had better be large and filled with powerful tools.