Durward et al.

US005659691A

[11] Patent Number: 5,659,691
[45] Date of Patent: Aug. 19, 1997
`
`[54] VIRTUAL REALITY NETWORK WITH
`SELECTIVE DISTRIBUTION AND
`UPDATING OF DATA TO REDUCE
`BANDWIDTH REQUIREMENTS
`
[75] Inventors: James Durward; Jonathan Levine;
Michael Nemeth; Jerry Prettegiani;
Ian T. Tweedie, all of Calgary, Canada
`
5,307,456    4/1994   MacKay ..................... 395/154
5,310,349    5/1994   Daniels et al.
5,322,441    6/1994   Lewis et al.
5,381,158    1/1995   Takahara et al.
5,469,511   11/1995   Lewis et al.
5,495,576    2/1996   Ritchey
5,588,139   12/1996   Lanier et al. .............. 395/500
`
[73] Assignee: Virtual Universe Corporation,
Calgary, Canada

FOREIGN PATENT DOCUMENTS

0 479 422 A2   4/1992   European Pat. Off.
WO 94/17860    8/1994   WIPO
`
[21] Appl. No.: 125,950
[22] Filed: Sep. 23, 1993

[51] Int. Cl.6 .................. G06F 3/14; G06F 15/163; G06F 3/16
[52] U.S. Cl. .................. 395/329; 395/615; 395/200.34; 395/978
[58] Field of Search .................. 395/600, 154, 155, 119, 152, 162, 326, 329, 511, 501, 502, 526, 339, 200.09, 335, 978; 364/578, 514 A

[56] References Cited

U.S. PATENT DOCUMENTS
`
3,983,474    9/1976   Kuipers .................... 324/43 R
4,017,858    4/1977   Kuipers .................... 343/100 R
4,406,532    9/1983   Howlett .................... 354/114
4,479,195   10/1984   Herr et al. ................ 364/900
4,540,850    9/1985   Herr et al. ................ 179/2 DP
4,542,291    9/1985   Zimmerman .................. 250/231 R
4,640,989    2/1987   Riner et al. ............... 379/94
4,710,870   12/1987   Blackwell et al. ........... 364/200
4,714,989   12/1987   Billings ................... 364/200
4,734,934    3/1988   Boggs ...................... 379/202
4,757,714    7/1988   Purdy et al. ............... 73/597
4,796,293    1/1989   Blinken et al. ............. 379/202
4,937,444    6/1990   Zimmerman .................. 250/231.1
4,945,305    7/1990   Blood ...................... 324/207.17
4,984,179    1/1991   Waldern .................... 364/514
4,988,981    1/1991   Zimmerman et al. ........... 340/709
5,001,628    3/1991   Johnson et al. ............. 364/200
5,003,300    3/1991   Wells ...................... 340/705
5,021,976    6/1991   Wexelblat et al. ........... 364/521
5,275,565    1/1994   Moncrief ................... 434/29
`
`OTHER PUBLICATIONS
`
“Europe Is Bursting With Virtual Reality Ideas”, Computergram International, Jan. 1993.
`
`(List continued on next page.)
`
Primary Examiner-Thomas G. Black
Assistant Examiner-Jack M. Choules
Attorney, Agent, or Firm-Townsend and Townsend and Crew LLP
`
[57] ABSTRACT
`
A virtual reality system has a database for defining one or more three-dimensional virtual spaces. A communication unit establishes a communication between the database and a user, and a data communication unit communicates data from the database to the user so that the user’s computer may display a portion of a selected virtual space on the user’s head mounted display. The communications unit also receives data corresponding to the position, orientation, and/or movement of the user relative to a reference point and uses the data to define a virtual being within the virtual space, wherein the position, orientation, and/or movements of the virtual being are correlated to the received data. Preferably, the data communicated to the user corresponds to the portion of the virtual space viewed from the perspective of the virtual being. To reduce the amount of data communicated between the computer and each user, visual and sound priority spaces may be defined within the portion of the virtual space data communicated to the user, and elements within selected priority spaces may be updated in priority over other priority spaces.
`
`15 Claims, 4 Drawing Sheets
`
[Representative drawing: visual relevant spaces and priority spaces]
`
`
`
`
`
`OTHER PUBLICATIONS
`
“Virtual Audio Finally Sounds Like Music to the Ears”, Electronic Engineering Times, Oct. 1992.
Scarborough, E., “Enhancement of Audio Localization Cue Synthesis by Adding Environmental and Visual Clues”, NTIS, Dec. 1992.
“W Industries Makes Virtual Reality a Reality at £20,000”, Computergram International, Mar. 1991.
Michael Snoswell, Overview of Cyberterm, a Cyberspace Protocol Implementation, from the World Wide Web at http://www.cs.uidaho.edu/lal/cyberspace/VR/docs/Snoswell.Cyberterm, Jul. 1992.
Kamae, T., “Development of a public facsimile communication system using storage and conversion techniques”, IEEE National Telecommunications Conference, Houston, TX (30 Nov.-4 Dec. 1980), pp. 19.4.1 through 19.4.5.
CHI’92 Conference Proceedings, ACM Conference on Human Factors in Computing Systems, May 1992, Monterey, CA, pp. 329-334, Codella, C., et al., “Interactive Simulation in a Multi-Person Virtual World.”
Machine Design, vol. 62, No. 24, Nov. 1990, Cleveland, US, pp. 40-41, “3D Sound Points Pilots Towards The Enemy.”
Proceedings 1990 Symposium on Interactive 3D Graphics, Mar. 1990, Utah, USA, pp. 35-36, Blanchard, C., et al., “Reality Built For Two: A Virtual Reality Tool.”
`
`
`
U.S. Patent          Aug. 19, 1997          Sheet 1 of 4          5,659,691

[FIG. 1: diagram of the virtual reality network, including the central control unit]
`
`
`
U.S. Patent          Aug. 19, 1997          Sheet 2 of 4          5,659,691

[FIG. 2: block diagram of central control unit 14, showing the input and output telephone interfaces, position/control data receiver, sound data receiver, perspective monitor, internal timer, update communication control, sound control unit, and database]

[FIG. 3: partitioning of the database into identified virtual spaces (CAD, game, task, ..., virtual space N)]
`
`
`
U.S. Patent          Aug. 19, 1997          Sheet 3 of 4          5,659,691

[FIG. 5: visual relevant spaces and priority spaces]

[FIG. 6: sound relevant spaces and sound priority spaces]
`
`
`
U.S. Patent          Aug. 19, 1997          Sheet 4 of 4          5,659,691

[FIG. 7: flowchart of network operation: log on to central control unit; establish audio connection; select virtual space; establish user/objects within virtual space; download current state to user; receive update data from users; update virtual space; process data in priority spaces; process data in relevant spaces; end session]
`
`
`
`VIRTUAL REALITY NETWORK WITH
`SELECTIVE DISTRIBUTION AND
`UPDATING OF DATA TO REDUCE
`BANDWIDTH REQUIREMENTS
BACKGROUND OF THE INVENTION

This invention relates to virtual reality systems and, more particularly, to a virtual reality network wherein multiple users at remote locations may telephone a central communications center and participate in a virtual reality experience.

Virtual reality systems are computer controlled systems which simulate artificial worlds and which allow users to experience and interact with the artificial worlds as if the users actually existed within them. Examples of virtual reality systems and components are disclosed in U.S. Pat. Nos. 4,542,291; 4,017,858; 4,945,305; 3,983,474; 4,406,532; 5,003,300; 4,984,179; 4,988,981; and 4,757,714; all of which are incorporated herein by reference. The typical virtual reality system includes a computer, a head-mounted display for displaying an artificial world to the user, and instrumentation for sensing the position and orientation of the user with respect to the computer or some other reference point. The artificial world is defined within the computer’s database. Instrumentation data is communicated to the computer, and the computer creates a virtual being within the artificial world which emulates the position, orientation, and movements of the user. The computer then communicates graphical data to the head-mounted display, which then displays the artificial world from the perspective of the virtual being. By gesturing in an appropriate manner, the user may interact with virtual objects within the artificial world as if they were real. For example, the user may drive an artificial automobile, throw an artificial ball, etc.

Although virtual reality has proven to be an exciting new technology, it is also a very expensive one. Most virtual reality hardware is located at universities and government agencies, although some virtual reality arcades have been built in major shopping centers located in large cities for playing a few basic games. Still, access to sophisticated virtual reality systems has been extremely limited and is often not available to the general public without great inconvenience.
`
SUMMARY OF THE INVENTION

The present invention is directed to a virtual reality system wherein multiple users located at different remote physical locations may communicate with the system via conventional dialup telephone lines and may perform independent and/or interactive/collaborative tasks within the system, aided by inter-user audio communications.

In one embodiment of the present invention, the virtual reality system has a central database for defining one or more three-dimensional virtual spaces. A communication unit establishes a communication between the database and a user, and a data communication unit communicates data from the database to the user so that the user’s computer may display a portion of a selected virtual space on the user’s head mounted display. The communications unit also receives data corresponding to the position, orientation, and/or movement of the user relative to a reference point and uses the data to define a virtual being within the virtual space, wherein the position, orientation, and/or movements of the virtual being are correlated to the received data. Preferably, the data communicated to the user corresponds to the portion of the virtual space viewed from the perspective of the virtual being.
`
The system defines other virtual beings within the database in response to position, orientation, and/or movement data received from other users, and the portions of the virtual space communicated to the other users may correspond to the perspectives of their associated virtual beings. The system periodically updates the database and communicates the updated portions of the virtual space to the users to reflect changes in the position of moving objects within the virtual space. To further reduce the amount of data communicated between the computer and each user, priority spaces may be defined within the portion of the virtual space data communicated to the user, and elements within selected priority spaces may be updated in priority over other priority spaces.

The system also supports audio communication with the users. Sounds may be defined within the virtual space, and data correlated to the sounds may be communicated to the users. Additionally, the communications unit may receive sounds from each user and then communicate the sounds to the other users to facilitate verbal or other audio communication among the users. The sounds may be assigned origins within the virtual space, and a sound control unit may then send data to each user for simulating the origin of the sound within the virtual space. The sounds may be assigned to sound priority spaces so that the amplitudes of sounds assigned to a particular priority space may be varied relative to the amplitudes of sounds in other priority spaces.
`
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a diagram of a particular embodiment of a virtual reality network according to the present invention;

FIG. 2 is a block diagram of a particular embodiment of the central control unit shown in FIG. 1;

FIG. 3 is a block diagram illustrating how the database shown in FIG. 2 is partitioned into multiple virtual spaces;

FIG. 4 is a diagram illustrating a particular embodiment of a virtual space according to the present invention;

FIG. 5 is a diagram illustrating the concepts of visual relevant spaces and priority spaces within the virtual space;

FIG. 6 is a diagram illustrating the concepts of sound relevant spaces and sound priority spaces within the virtual space; and

FIG. 7 is a flowchart showing operation of the virtual reality network according to the present invention.
`
`
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

FIG. 1 is a diagram illustrating a particular embodiment of a virtual reality network 10 according to the present invention. Network 10 includes a central control unit 14 for communicating with a plurality of users, e.g., users 18 and 22, through a public telephone system represented by dialup telephone lines 26 and 30 coupled to telephones 34 and 38, respectively. Although telephone lines 26 and 30 have been shown as single lines, each may comprise multiple lines wherein one or more lines may be used for virtual object or virtual space data and other lines may be used for audio data. Furthermore, the present invention is not limited to telephone communications. Any data transmission network may suffice. For example, network 10 may comprise high speed digital communication lines, cable broadcasting communication lines, etc. The number of users supported by network 10 is not limited to the two shown. Any number of users, even thousands, may be supported.

Typically, user 18 is equipped with a computer 42, a head-mounted display 46, earphones 50, a microphone 52, a
`
head position sensor 53, and an instrumented garment 54. User 22 is ordinarily equipped in the same manner. Computer 42 may include a keyboard 43 for entering control information. Computer 42 may also include a monitor 58 for displaying control information or the virtual space viewed by user 18, but it should be understood that the primary display of the virtual space to user 18 is preferably accomplished by head-mounted display 46. Alternatively, head-mounted display 46 may be substituted in some applications by a stand-alone display unit which realistically displays the virtual space to the user. Earphones 50 receive sounds associated with the displayed virtual space from central control unit 14 via computer 42, and microphone 52 communicates sounds from user 18 via computer 42 to central control unit 14 which, in turn, merges the sounds received into the virtual space. Head position sensor 53 senses the position and/or orientation of the user’s head relative to computer 42 or some other reference point and communicates the positional data to computer 42 which, in turn, communicates the data to central control unit 14. Instrumented garment 54 is shown as a glove in this embodiment, but other instrumented garments such as shirts, pants, or full-body suits may be used as well. Instrumented garment 54 typically senses the position, orientation, and/or flexure of the associated body part relative to computer 42 or some other reference point and communicates the data via computer 42 to central control unit 14. Central control unit 14 uses the data from head position sensor 53 and instrumented garment 54 to define a virtual being within the virtual space. The virtual being may take the form of another human being, an animal, machine, tool, inanimate object, etc., all of which may be visible or invisible within the virtual space. The position, orientation, and/or flexure data from the sensors may be used to emulate the same position, orientation, and/or flexure of the defined virtual being, or else the data may be used to control some other action. For example, if the user is defined as a virtual automobile within the virtual space, then the position of the user’s hand may be used to control acceleration whereas flexure of one or more of the user’s fingers may be used to steer the automobile. Additionally, the data may be used to define multiple virtual beings. For example, data from head position sensor 53 may be used to define an aircraft carrier, and data from instrumented garment 54 may be used to define an aircraft. Although head-mounted display 46, earphones 50, microphone 52, head position sensor 53 and instrumented glove 54 are shown attached to computer 42 through wires, any or all of these elements may be controlled by radio frequency or other wireless technologies.
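The sensor-to-being mapping described above can be sketched in code. This is an illustrative sketch only, not the patent's implementation; the function and field names are hypothetical, and the automobile mapping values are assumptions chosen to mirror the hand-controls-acceleration, finger-flexure-steers example:

```python
# Illustrative sketch: glove/head-sensor data is either echoed onto a
# humanoid virtual being or remapped onto vehicle controls, depending on
# the type of virtual being the user has chosen.

def map_sensor_data(being_type, hand_position, finger_flexure):
    """Translate instrumented-garment data into virtual-being state."""
    if being_type == "humanoid":
        # Emulate the same position and flexure on the virtual being.
        return {"hand_position": hand_position, "finger_flexure": finger_flexure}
    if being_type == "automobile":
        # The patent's example: hand position controls acceleration,
        # finger flexure steers.
        return {
            "acceleration": hand_position[1],       # vertical hand position -> throttle
            "steering_angle": finger_flexure * 90,  # flexure 0..1 -> 0..90 degrees
        }
    raise ValueError(f"unknown being type: {being_type}")
```

The same raw data thus drives very different virtual beings, which is what allows one set of instrumentation to animate a person, an automobile, or even an aircraft carrier.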
In the preferred embodiment, head-mounted display 46 displays the portion of the virtual space viewed from the perspective of the virtual being defined for user 18 together with all other defined virtual beings and objects within its field of vision. Each user may talk to and interact with other virtual beings and objects in the virtual space as the user desires, subject to constraints noted below when discussing the concepts of relevant spaces.
FIG. 2 is a block diagram of a particular embodiment of central control unit 14. Central control unit 14 includes a processor 100, a database memory 104 for storing virtual space data, an input telephone interface 108 for receiving data from the users, an output telephone interface 112 for communicating data to the users, a position/control data receiver 116 for receiving position, motion and control data from the users, a sound data receiver 120 for receiving sound information from the users, a position/control data transmitter 124 for communicating position, motion and control data to the users, a sound data transmitter 128 for communicating sound data to the users, a perspective monitor 132 for monitoring the visual perspectives of the virtual beings defined in database 104, a timer 136, a database update unit 140 for updating database 104 with data received from the users (and other program controlled changes), a virtual object control unit 144 for controlling virtual objects defined in database 104, an update communication control unit 148 for controlling the communication of updated data to the users, and a sound control unit 152 for processing sound data.
`
Data communicated by output telephone interface 112 to the users may comprise any data defining the virtual space together with the appropriate control information. For example, the data may comprise graphics data for rendering the virtual space, position/motion data for implementing moving objects within the virtual space, sound data, etc. In the preferred embodiment, however, graphics data per se is not communicated on an interactive basis. Instead, each user’s computer has a copy of the entire virtual space (e.g., background, objects and primitives), and the data defining the virtual space communicated to the users comprises only position, motion, control, and sound data. After initial position, motion, control and sound data is communicated to the users, only changes in the position, motion, control and sound data are communicated thereafter. This dramatically reduces bandwidth requirements and allows the system to operate with many concurrent users without sacrificing real-time realism.
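The delta-only scheme described above can be illustrated with a short sketch. The state representation and field names here are assumptions for illustration; the patent does not specify a wire format:

```python
# Illustrative sketch: each client already holds the full virtual space,
# so after the initial download the server transmits only the entries of
# the state that changed since the last broadcast.

def compute_delta(previous, current):
    """Return only the entries of `current` that differ from `previous`."""
    return {key: value for key, value in current.items()
            if previous.get(key) != value}

# Full state is sent once; afterwards only changes cross the wire.
initial = {"ball_pos": (0, 0), "door_open": False, "voice_level": 0.0}
updated = {"ball_pos": (1, 0), "door_open": False, "voice_level": 0.0}
delta = compute_delta(initial, updated)  # only the moved ball is transmitted
```

Sending the small delta dictionary instead of re-rendering or re-sending the scene is what keeps per-user bandwidth low as the user count grows.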
In the preferred embodiment, database 104 stores data for multiple virtual spaces. FIG. 3 is a block diagram showing one possible embodiment of database 104. Database 104 may contain a CAD virtual space 160 which allows users to engage in computer aided design, a game virtual space 164 which allows users to play a game, a task virtual space 168 which allows users to manipulate virtual objects to perform a particular task, and other virtual spaces. Central control unit 14 allows the user to interactively communicate with the virtual space either alone or in collaboration with other users. Each virtual space includes identification information 170, 172, 174, etc. so that users may specify which virtual space they intend to interact with.
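A minimal sketch of such a partitioned database, keyed by identification information, might look as follows. The identifier strings and space names are hypothetical stand-ins for the patent's identification information 170, 172, 174:

```python
# Illustrative sketch: database 104 partitioned into virtual spaces,
# each carrying identification information a user can request by.

DATABASE = {
    "ID1": {"name": "CAD virtual space"},
    "ID2": {"name": "game virtual space"},
    "ID3": {"name": "task virtual space"},
}

def select_virtual_space(space_id):
    """Look up the virtual space a user asked to interact with."""
    try:
        return DATABASE[space_id]
    except KeyError:
        raise KeyError(f"no virtual space registered under {space_id!r}")
```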
FIG. 4 is a diagram of a particular embodiment of a virtual space 169. Virtual space 169 may include a plurality of virtual beings such as virtual beings 182, 183, and 184, sound origins 186, 188, 190, and 192, a movable virtual object 194, other virtual objects 196, 197 and 198, and graphics primitives 199A-F. Graphics primitives 199A-F may be used by any user to create further virtual objects if desired. To provide maximum flexibility and to facilitate communication with the users, each virtual being, and hence each user, is assigned a visual relevant space which determines which data defining the virtual space may be perceived by the user. In the context of the preferred embodiment, visual relevant spaces determine which state changes are communicated to (or perceivable by) the users. Of course, in other embodiments the visual relevant space may include all the graphical information encompassed by the boundaries of the visual relevant space. FIG. 5 is a diagram showing how the concepts of visual relevant spaces are applied to virtual beings 182 and 184 of FIG. 4. Virtual being 182 is assigned a visual relevant space 200, and virtual being 184 is assigned a visual relevant space 204. Virtual beings 182 and 184 may view only those elements (objects or state changes) which are disposed within their visual relevant spaces. For example, elements 194 and 196 and/or their motion may be visible to both virtual beings 182 and
`
184; element 197 and/or its motion may be visible only to virtual being 184; element 183 and/or its motion may be visible only to virtual being 182; and element 198 and/or its motion may be visible to neither virtual being 182 nor virtual being 184. In the preferred embodiment, which communicates only position, control and sound data to the users, those elements outside of a visual relevant space may be visible to the user, but any real-time or program controlled position/motion associated with the element is not processed for that user, so that the element appears stationary in a fixed position, or else the element moves in accordance with a fixed script. The visual relevant space may be fixed as shown for virtual being 182. Alternatively, the user’s visual relevant space may be defined by the field of view of the virtual being and areas in close proximity to it (as with virtual being 184), in which case the visual relevant space may move about the virtual space as the perspective or position of the virtual being changes. Visual relevant spaces need not be contiguous and need not have a direct spatial relationship to the virtual space. For example, a visual relevant space may include a virtual object without the accompanying background.
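The filtering role of a visual relevant space can be sketched for the simplest case, a fixed circular region like space 200. This is an illustrative assumption; as noted above, a relevant space need not be contiguous or even spatial:

```python
# Illustrative sketch: a state change for an element is forwarded to a user
# only when the element lies inside that user's visual relevant space.
# Elements outside it stay frozen (or follow a fixed script) on that
# user's display.

import math

def in_relevant_space(element_pos, space_center, space_radius):
    """True if the element falls inside a circular relevant space."""
    return math.dist(element_pos, space_center) <= space_radius

def filter_updates(updates, space_center, space_radius):
    """Keep only the position updates this user may perceive."""
    return {name: pos for name, pos in updates.items()
            if in_relevant_space(pos, space_center, space_radius)}
```

Run per user, this is the point where the same global state change fans out to some users but not others.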
Each visual relevant space may be further subdivided into one or more visual priority spaces. For example, visual relevant space 200 may include visual priority spaces 206, 208, and 210, and visual relevant space 204 may include visual priority spaces 212, 214, and 216. In the preferred embodiment, visual priority spaces may be used to determine the update frequency of elements located within them. For example, the position and orientation of elements 183, 194 and 196 may be updated very frequently (e.g., at 30 Hz), whereas the position and orientation of element 197 may be updated less frequently (e.g., at 1 Hz). Alternatively, visual priority spaces closer to the user may be updated more frequently than other visual priority spaces. This reduces the amount of data that must be communicated to each user while maintaining realism of important elements. Since many virtual objects are designed to move about the virtual space, they may cross into different priority spaces over time and be processed accordingly. The change from one priority space to another may be continuous or discrete as desired.
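The 30 Hz versus 1 Hz example above can be sketched as a per-priority update schedule. The 30 Hz master loop and the frame-counting scheme are assumptions of this sketch, not details from the patent:

```python
# Illustrative sketch: elements in a high-priority space are re-sent every
# frame of a 30 Hz loop, while low-priority elements are re-sent once per
# second, matching the patent's 30 Hz / 1 Hz example.

UPDATE_RATES_HZ = {"high": 30, "low": 1}
MASTER_RATE_HZ = 30

def due_for_update(priority, frame_number):
    """True when an element of the given priority should be re-sent
    on this frame of the master loop."""
    interval = MASTER_RATE_HZ // UPDATE_RATES_HZ[priority]  # frames between updates
    return frame_number % interval == 0
```

A moving object that crosses from one priority space into another simply picks up the new rate on the next frame, which is the discrete form of the transition mentioned above.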
Relevant spaces also may be defined for sound data. FIG. 6 is a diagram illustrating the concept of sound relevant spaces. Virtual being 182 has a sound relevant space 230 associated with it, and virtual being 184 has a sound relevant space 234 (indicated by a dashed line) associated with it. Only sound sources disposed within a virtual being’s sound relevant space (and/or changes in the sounds) may be perceived by that being, and hence the corresponding user. In this case, sound sources 186, 188, 190, and 192 and/or their changes may be perceived by both virtual beings 182 and 184. It should be noted that sound sources associated with an element which cannot be visually perceived by a virtual being may nevertheless be heard by the virtual being. Such may be the case with element 183 associated with sound source 192. If so, while only virtual being 182 may see the element, both virtual beings 182 and 184 may hear it. It should also be noted that sound relevant spaces, like visual relevant spaces, need not be contiguous and need not have a direct spatial relationship to the virtual space.
Sound priority spaces may also be defined for each sound relevant space. As shown in FIG. 6, sound relevant space 230 includes sound priority spaces 238, 242, and 246, and sound relevant space 234 includes sound priority spaces 252, 256, and 260. In this embodiment, the amplitude of the sound source perceived by the virtual being depends upon which sound priority space the sound source is located in.
`
Thus, virtual being 182 may perceive sound source 186 louder than sound source 190, and sound source 190 louder than sound sources 188 and 192. Similarly, virtual being 184 may perceive sound source 188 louder than sound sources 186 and 190, and sound sources 186 and 190 louder than sound source 192. Sound priority spaces also may be defined to set the update frequency of the sound data. Since sound origins may move along with virtual objects to which they are attached, a given sound may cross into different sound priority spaces over time and be processed accordingly. The change from one sound priority space to another may be continuous or discrete as desired.
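The amplitude ordering described above can be sketched as a gain table keyed by sound priority space. The gain values themselves are illustrative assumptions; the patent only requires that inner priority spaces play relatively louder:

```python
# Illustrative sketch: a sound source's perceived loudness for one listener
# is scaled by the gain of the sound priority space it currently occupies
# (1 = innermost/loudest, 3 = outermost/quietest).

PRIORITY_GAIN = {1: 1.0, 2: 0.5, 3: 0.1}

def perceived_amplitude(source_amplitude, priority_space):
    """Scale a source's amplitude by its sound priority space."""
    return source_amplitude * PRIORITY_GAIN[priority_space]
```

Because the table is per listener, the same source (e.g., source 186) can land in different priority spaces for beings 182 and 184 and so be heard at different levels by each.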
Input telephone interface unit 108 (FIG. 2) receives input data from the users through the telephone network and communicates control data and positional data such as that from the users’ head position sensors and instrumented garments (e.g., position, orientation and/or movement) to position/control data receiver 116 through a communication path 156. Sound data received from the users (e.g., from their microphones) is communicated to sound data receiver 120 through a communication path 160. Position/control data receiver 116 and sound data receiver 120 communicate with processor 100 through communication paths 161 and 163, respectively. Processor 100 maps the position, orientation and/or movement data from each user to corresponding virtual beings within the requested virtual space in database 104. Control data received from the users may be used to establish the telephonic communication with central control unit 14, to specify a desired virtual space, to specify the type of virtual being the user wishes to assume (and possibly how the positional data is to be mapped to the selected virtual being), to create virtual objects (using graphics primitives 199A-F), to specify visual and sound relevant spaces and their corresponding priority spaces, etc. Sound data received from the users may be associated with the virtual beings defined for those users or assigned in some other manner. For example, sound data from one of the users may be assigned to a virtual public address system for announcing the beginning of a race in which that user and other users compete.
Processor 100 updates database 104 with the received position, motion, control, and sound data, determines which user is to receive which data according to the relevant and priority spaces defined for that user, communicates position, motion and control information to position/control data transmitter 124 through a communication path 162, and communicates sound data to sound data transmitter 128 through a communication path 164. Position/control data transmitter 124 and sound data transmitter 128 communicate with output telephone interface 112 through respective communication paths 168 and 172 for sending the data to the users.
Perspective monitor 132 monitors the defined field of view of each virtual being to determine the visual state change data to be communicated to the users. As noted above, in the preferred embodiment, each user has a copy of the selected virtual space in his or her computer, and processor 100 periodically sends only the positional and sound data assigned to points within the user’s relevant space or field of view to the user, so that the user’s computer may update the images viewed and sounds heard with the new positional and sound data. To further reduce the amount of data communicated to the users, which updated data is sent to the user at a particular time may be determined by the priority space in which the object or sound is located. Thus, data for updating objects or sounds in one priority space may be communicated thirty times per second, whereas data for
`
`
`
`updating objects or sounds in another priority space may be
`communicated once per second.
In another embodiment of the invention, processor 100 may communicate all graphical data associated with the relevant space or field of view of the virtual being to the corresponding user and then instruct update communication control unit 148 to send updated data as appropriate. For example, processor 100 may use the positional data from the user’s head position sensor to determine the position of the head of the virtual being defined for that user and communicate the graphical data for that portion of the relevant space to the user. As the user moves about, processor 100 receives new positional data, and database update unit 140 uses that data to update the position (and hence the field of view) of the corresponding virtual being in database 104. Perspective monitor 132 detects the occurrence of a selected event and then instructs update communication control unit 148 to communicate the graphical data for the updated field of view to the user. The event which triggers the communication of the updated data to the users may be the passage of a selected time interval measured by timer 136. Alternatively, perspective monitor 132 may instruct update communication control unit 148 to send the updated data when the position of the user’s head changes by a selected amount, or it may rely upon the occurrence of some other variance in the data received from the user.
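The two triggering events described above (a timer interval elapsing, or the head position changing by a selected amount) can be combined in a short sketch. The particular interval and threshold values are illustrative assumptions:

```python
# Illustrative sketch: updated field-of-view data is sent either when the
# timer interval has elapsed or when head movement exceeds a threshold,
# mirroring the trigger choices described for perspective monitor 132.

TIME_INTERVAL = 1.0 / 30.0   # seconds between timed updates (assumed)
MOVE_THRESHOLD = 0.05        # head displacement that forces an update (assumed)

def should_send_update(elapsed, head_delta):
    """Trigger on timer expiry or on significant head movement."""
    return elapsed >= TIME_INTERVAL or head_delta > MOVE_THRESHOLD
```

The movement trigger keeps the display responsive to quick head turns, while the timer bounds how stale a stationary view can become.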
Virtual object control unit 144 defines or maintains the virtual objects within database 104 and assigns the position, orientation, and/or movement data received from the user to the virtual objects. For example, data designating flexure and position of the user’s legs, arms, fingers, etc. may be assigned to the virtual being’s legs, arms, fingers, etc. so that the virtual being may emulate the gestures of the user for running, kicking, catching virtual balls, painting, writing, etc. Of course, as noted above, the virtual being defined or maintained by virtual object control unit 144 need not be humanoid, and it may be specified by the user using primitives 199A-F in any desired combination. The position, orientation, and/or movement data received from a user may be assigned to one or more visible or invisible objects as the imagination allows. Virtual object control unit 144 also may define or maintain program-generated and controlled virtual objects within the virtual space. For example, virtual object control unit 144 may define or maintain a virtual volleyball or a virtual satellite which moves in accordance with program-defined constraints such as range of motion, speed, gravitational forces, etc.
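A program-controlled object of the kind just described can be sketched as a small stepping function. The speed cap and gravity constant are illustrative constraint values, not figures from the patent:

```python
# Illustrative sketch: a program-controlled virtual object (e.g., the
# volleyball) advanced under program-defined constraints: a gravitational
# force and a maximum speed.

MAX_SPEED = 5.0   # program-defined speed limit (assumed)
GRAVITY = -9.8    # program-defined gravitational acceleration (assumed)

def step_object(position, velocity, dt):
    """Advance a program-controlled object by one time step."""
    vx, vy = velocity
    vy += GRAVITY * dt                     # apply the gravitational constraint
    speed = (vx * vx + vy * vy) ** 0.5
    if speed > MAX_SPEED:                  # enforce the speed constraint
        vx, vy = vx * MAX_SPEED / speed, vy * MAX_SPEED / speed
    x, y = position
    return (x + vx * dt, y + vy * dt), (vx, vy)
```

Such objects live entirely in the central database and are broadcast to users through the same relevant-space and priority-space machinery as user-driven beings.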
Sound control unit 152 defines the origin and nature of sounds within the virtual space and assigns the sound data received from the users to them. The simplest example of the assignment of sounds is the assignment of a user’s voice to the voice of the corresponding virtual being. The voice of the virtual being may track the user’s voice exactly, or else sound control unit 152 may vary the pitch or timbre of the voice accordingly. The user’s voice also could be changed to emulate the chosen virtual being. For example, the user’s voice could be changed to emulate a frog or jet aircraft, with the amplitude of the user’s voice being used to control the amplitude of the virtual sound. Sound control unit 152 also may assign prog