PRESENCE
Teleoperators and Virtual Environments

Volume 10, Number 1
February 2001
ISSN 1054-7460

Editor-in-Chief
Nathaniel I. Durlach
Virtual Environment and Teleoperator Research Consortium
Research Laboratory of Electronics, MIT

Managing Editor
Rebecca Lee Garett
MIT

Senior Editors
Woodrow Barfield, Virginia Polytechnic Institute
Gary Bishop, University of North Carolina
Larry Hodges, Georgia Institute of Technology
John M. Hollerbach, University of Utah
Randy Pausch, Carnegie Mellon University
Thomas B. Sheridan, MIT
Mel Slater, University College London
Kay Stanney, University of Central Florida
Elizabeth Wenzel, NASA Ames Research Center
David Zeltzer, Fraunhofer Center for Research in Computer Graphics
Michael Zyda, Naval Postgraduate School

Editorial Board
Bernard Adelstein, NASA Ames Research Center
Terry Allard, NASA Ames Research Center
Walter A. Aviles, Teneo Computing, LLC
Frank Biocca, Michigan State University
Jens Blauert, Ruhr-University Bochum, Germany
Grigore Burdea, Rutgers University
H. Steven Colburn, Boston University
Rudy Darken, Naval Postgraduate School
Paul Dizio, Brandeis University
Stephen Ellis, NASA Ames Research Center
Scott S. Fisher, Telepresence Research
Thomas A. Furness III, Human Interface Technology Laboratory, University of Washington
Richard M. Held, MIT
Kenneth O. Johnson, Johns Hopkins University
Lynette A. Jones, MIT
James R. Lackner, Brandeis University
Jaron Lanier, Columbia University
Susan Lederman, Queen's University, Canada
Jack M. Loomis, University of California
Michael Macedonia, STRICOM
Michael Moshell, University of Central Florida
Michael Naimark, Interval Research Corporation
Albert "Skip" Rizzo, University of Southern California
Warren Robinett, University of North Carolina
Jannick Rolland, University of Central Florida
Roy Ruddle, University of Leeds
Karun Shimoga, CMU Robotics Institute
Barbara Shinn-Cunningham, Boston University
Gurminder Singh, National University of Singapore
Robert J. Stone, VR Solutions/Virtual Presence Ltd., UK
Martin Stytz, Air Force Institute of Technology
Susumu Tachi, University of Tokyo
James Templeman, Naval Research Laboratory
Geb Thomas, University of Iowa
Colin Ware, University of New Hampshire
Richard C. Waters, Mitsubishi Electric Research Laboratory
Suzanne Weghorst, University of Washington
Janet Weisenberger, Ohio State University
Robert B. Welch, NASA Ames Research Center
Thomas F. von Wiegand, MIT

Editorial Address
Editor-in-Chief, PRESENCE
MIT, 77 Massachusetts Avenue, Room 36-709
Cambridge, MA 02139
presence@mit.edu

Individuals wishing to submit manuscripts should follow the guidelines provided in this issue.

Abstracting and Indexing
PRESENCE: Teleoperators and Virtual Environments is included in Computer Abstracts, Ergonomics Abstracts, Information Science Abstracts, International Aerospace Abstracts, Multi-Index to Cyber-Space, Virtual, and Artificial Reality, CompuMath Citation Index, Current Contents/Engineering Computing and Technology, Research Alert, and SciSearch.

Sponsorship
PRESENCE is indebted to the following organizations for their generous support:
Air Force Office of Scientific Research
Department of Biomedical Engineering, Boston University
Human Interface Technology Laboratory at the University of Washington
Office of Naval Research
Research Laboratory of Electronics, MIT
Visual Computing Department, Hewlett Packard Laboratories

Copyright Information
Permission to photocopy articles for internal or personal use, or the internal or personal use of specific clients, is granted by the copyright owner for users registered with the Copyright Clearance Center (CCC) Transactional Reporting Service, provided that the fee of $8.00 per article-copy is paid directly to CCC, 222 Rosewood Drive, Danvers, MA 01923. The fee code for users of the Transactional Reporting Service is 1054-7460/01 $8.00. For those organizations that have been granted a photocopy license with CCC, a separate system of payment has been arranged. Address all other inquiries to Subsidiary Rights Manager, MIT Press Journals, Five Cambridge Center, Cambridge, MA 02142-1407; e-mail journals-rights@mit.edu.

Business Offices and Subscription Rates
PRESENCE: Teleoperators and Virtual Environments is published bimonthly (February, April, June, August, October, December) by The MIT Press, Cambridge, MA 02142-1407. Subscriptions and address changes should be addressed to MIT Press Journals, Five Cambridge Center, Cambridge, MA 02142-1407; (617) 253-2889; fax (617) 577-1545; e-mail journals-orders@mit.edu. An electronic, full-text version of PRESENCE is available from the MIT Press. Subscriptions are on a volume-year basis. Subscription rates: Electronic only: Individuals $45.00, Students/retired $43.00, Institutions $342.00; Canadians add the 7% GST. Print and Electronic: Individuals $80.00, Students/retired $48.00, Institutions $380.00. Outside the U.S. and Canada add $30.00 for postage and handling. Current issues are $15.00. Back-issue rates: Individuals $32.00, Institutions $64.00. Outside the U.S. and Canada add $5.00 per issue for postage and handling. Canadians add 7% GST. Claims may be e-mailed to journals-claims@mit.edu. Claims for missing issues will be honored free of charge if made within three months after the publication date of the issue. Prices are subject to change without notice. http://mitpress.mit.edu/PRES

Postmaster
Send address changes to PRESENCE: Teleoperators and Virtual Environments, Five Cambridge Center, Cambridge, MA 02142-1407. Periodicals postage paid at Boston, MA, and at additional post offices.

Advertising and Mailing List Rental
Inquiries may be addressed to the Marketing Dept., MIT Press Journals, Five Cambridge Center, Cambridge, MA 02142-1407; (617) 253-2866; fax (617) 258-5028; e-mail journals-info@mit.edu.

PRESENCE: Teleoperators and Virtual Environments is available on microfilm from University Microfilms, Inc., 300 N. Zeeb Road, Ann Arbor, MI 48106.

© 2001 by the Massachusetts Institute of Technology.
PRESENCE: TELEOPERATORS AND VIRTUAL ENVIRONMENTS

VOLUME 10, NUMBER 1, FEBRUARY 2001

ARTICLES

iii   Editorial Notes

iv    Guest Editors' Introduction: VRST'99 Special Issue

1     High-Performance Wide-Area Optical Tracking: The HiBall Tracking System
      Greg Welch, Gary Bishop, Leandra Vicci, Stephen Brumback, Kurtis Keller, and D'nardo Colucci

22    GNU/MAVERIK: A Microkernel for Large-Scale Virtual Environments
      Roger Hubbold, Jon Cook, Martin Keates, Simon Gibson, Toby Howard, Alan Murta, Adrian West, and Steve Pettifer

35    Patterns of Network and User Activity in an Inhabited Television Event
      Chris Greenhalgh, Steve Benford, and Mike Craven

51    Components for Distributed Virtual Environments
      Manuel Oliveira, Jon Crowcroft, and Mel Slater

62    An Adaptive Multiresolution Method for Progressive Model Transmission
      Danny To, Rynson W. H. Lau, and Mark Green

75    Testbed Evaluation of Virtual Environment Interaction Techniques
      Doug A. Bowman, Donald B. Johnson, and Larry F. Hodges

96    An Introduction to 3-D User Interface Design
      Doug A. Bowman, Ernst Kruijff, Joseph J. LaViola, Jr., and Ivan Poupyrev

109   An Overview of the COVEN Platform
      Emmanuel Frécon, Gareth Smith, Anthony Steed, Marten Stenius, and Olov Stahl

FORUM

WHAT'S HAPPENING
High-Performance Wide-Area Optical Tracking
The HiBall Tracking System

Greg Welch
welch@cs.unc.edu

Gary Bishop
gb@cs.unc.edu

Leandra Vicci
vicci@cs.unc.edu

Stephen Brumback
brumback@cs.unc.edu

Kurtis Keller
keller@cs.unc.edu
Department of Computer Science
University of North Carolina at Chapel Hill

D'nardo Colucci
colucci@virtual-reality.com
Alternate Realities Corporation

Abstract

Since the early 1980s, the Tracker Project at the University of North Carolina at Chapel Hill has been working on wide-area head tracking for virtual and augmented environments. Our long-term goal has been to achieve the high performance required for accurate visual simulation throughout our entire laboratory, beyond into the hallways, and eventually even outdoors.

In this article, we present results and a complete description of our most recent electro-optical system, the HiBall Tracking System. In particular, we discuss motivation for the geometric configuration and describe the novel optical, mechanical, electronic, and algorithmic aspects that enable unprecedented speed, resolution, accuracy, robustness, and flexibility.
`
1 Introduction
`
Systems for head tracking for interactive computer graphics have been explored for more than thirty years (Sutherland, 1968). As illustrated in figure 1, the authors have been working on the problem for more than twenty years (Azuma, 1993, 1995; Azuma & Bishop, 1994a, 1994b; Azuma & Ward, 1991; Bishop, 1984; Gottschalk & Hughes, 1993; UNC Tracker Project, 2000; Wang, 1990; Wang et al., 1990; Ward, Azuma, Bennett, Gottschalk, & Fuchs, 1992; Welch, 1995, 1996; Welch & Bishop, 1997; Welch et al., 1999). From the beginning, our efforts have been targeted at wide-area applications in particular. This focus was originally motivated by applications for which we believed that actually walking around the environment would be superior to virtually "flying." For example, we wanted to interact with room-filling virtual molecular models, and to naturally explore life-sized virtual architectural models. Today, we believe that a wide-area system with high performance everywhere in our laboratory provides increased flexibility for all of our graphics, vision, and interaction research.
`
1.1 Previous Work
`
`
In the early 1960s, Ivan Sutherland implemented both mechanical and ultrasonic (carrier phase) head-tracking systems as part of his pioneering work in virtual environments. He describes these systems in his seminal paper "A Head-Mounted Three Dimensional Display" (Sutherland, 1968).
Figure 1. [Figure not reproduced; panel labels: Self-Tracker, original system (SIGGRAPH 91), initial wide-area system, simpler LED panels and off-line calibration, SCAAT and autocalibration, the HiBall system.]
`
In the ensuing years, commercial and research teams have explored mechanical, magnetic, acoustic, inertial, and optical technologies. Complete surveys include Bhatnagar (1993); Burdea & Coiffet (1994); Meyer, Applewhite, & Biocca (1992); and Mulder (1994a, 1994b, 1998). Commercial magnetic tracking systems, for example (Ascension, 2000; Polhemus, 2000), have enjoyed popularity as a result of a small user-worn component and relative ease of use. Recently, inertial hybrid systems (Foxlin, Harrington, & Pfeifer, 1998; Intersense, 2000) have been gaining popularity for similar reasons, with the added benefit of reduced high-frequency noise and direct measurements of derivatives.

An early example of an optical system for tracking or motion capture is the Twinkle Box by Burton (Burton, 1973; Burton & Sutherland, 1974). This system measured the positions of user-worn flashing lights with optical sensors mounted in the environment behind rotating slotted disks. The Selspot system (Woltring, 1974) used fixed, camera-like, photodiode sensors and target-mounted infrared light-emitting diodes that could be tracked in a one-cubic-meter volume. Beyond the HiBall Tracking System, examples of current optical tracking and motion-capture systems include the FlashPoint and Pixsys systems by Image Guided Technologies (IGT, 2000), the laserBIRD system by Ascension Technology (Ascension, 2000), and the CODA Motion Capture System by B & L Engineering (BL, 2000). These systems employ analog optical-sensor systems to achieve relatively high sample rates for a moderate number of targets. Digital cameras (two-dimensional, image-forming optical devices) are used in motion-capture systems such as the HiRes 3D Motion Capture System by the Motion Analysis Corporation (Kadaba & Stine, 2000; MAC, 2000) to track a relatively large number of targets, albeit at a relatively low rate because of the need for 2-D image processing.
`
1.2 Previous Work at UNC-Chapel Hill
`
As part of his 1984 dissertation on Self-Tracker, Bishop put forward the idea of outward-looking tracking systems based on user-mounted sensors that estimate user pose¹ by observing landmarks in the environment (Bishop, 1984).

1. We use the word pose to indicate both position and orientation (six degrees of freedom).
`
Figure 2.
`
He described two kinds of landmarks: high signal-to-noise-ratio beacons such as light-emitting diodes (LEDs) and low signal-to-noise-ratio landmarks such as naturally occurring features. Bishop designed and demonstrated custom VLSI chips (figure 2) that combined image sensing and processing on a single chip (Bishop & Fuchs, 1984). The idea was to combine multiple instances of these chips into an outward-looking cluster that estimated cluster motion by observing natural features in the unmodified environment. Integrating the resulting motion to estimate pose is prone to accumulating error, so further development required a complementary system based on easily detectable landmarks (LEDs) at known locations. This LED-based system was the subject of a 1990 dissertation by Jih-Fang Wang (Wang, 1990).

In 1991, we demonstrated a working, scalable, electro-optical head-tracking system in the Tomorrow's Realities gallery at that year's ACM SIGGRAPH conference (Wang et al., 1990; Wang, Chi, & Fuchs, 1990; Ward et al., 1992). The system (figure 3) used four head-worn lateral-effect photodiodes that looked upward at a regular array of infrared LEDs installed in precisely machined ceiling panels. A user-worn backpack contained electronics that digitized and communicated the photo-coordinates of the sighted LEDs. Photogrammetric techniques were used to compute a user's head pose using the known LED positions and the corresponding measured photo-coordinates from each LEPD sensor (Azuma & Ward, 1991). The system was ground-breaking in that it was unaffected by ferromagnetic and conductive materials in the environment, and the working volume of the system was determined solely by the number of ceiling panels. (See figure 3, top.)

Figure 3.
`
`
`
Figure 4.

1.3 The HiBall Tracking System
`
In this article, we describe a new and vastly improved version of the 1991 system. We call the new system the HiBall Tracking System. Thanks to significant improvements in hardware and software, this HiBall system offers unprecedented speed, resolution, accuracy, robustness, and flexibility. The bulky and heavy sensors and backpack of the previous system have been replaced by a small HiBall unit (figure 4, bottom). In addition, the precisely machined LED ceiling panels of the previous system have been replaced by looser-tolerance panels that are relatively inexpensive to make and simple to install (figure 4, top; figure 10). Finally, we are using an unusual Kalman-filter-based algorithm that generates very accurate pose estimates at a high rate with low latency, and that simultaneously self-calibrates the system.

As a result of these improvements, the HiBall Tracking System can generate more than 2,000 pose estimates per second, with less than 1 ms of latency, better than 0.5 mm and 0.03 deg. of absolute error and noise, everywhere in a 4.5 m x 8.5 m room (with more than two meters of height variation). The area can be expanded by adding more panels, or by using checkerboard configurations that spread panels over a larger area. The weight of the user-worn HiBall is approximately 300 grams, making it lighter than one optical sensor in the 1991 system. Multiple HiBall units can be daisy-chained together for head or hand tracking, pose-aware input devices, or precise 3-D point digitization throughout the entire working volume.
`
2 Design Considerations
`
In all of the optical systems we have developed (see section 1.2), we have chosen what we call an inside-looking-out configuration, in which the optical sensors are on the (moving) user and the landmarks (for instance, the LEDs) are fixed in the laboratory. The corresponding outside-looking-in alternative would be to place the landmarks on the user and to fix the optical sensors in the laboratory. (One can think about similar outside-in and inside-out distinctions for acoustic and magnetic technologies.) The two configurations are depicted in figure 5.
There are some disadvantages to the inside-looking-out approach. For small or medium-sized working volumes, mounting the sensors on the user is more challenging than mounting them in the environment. It is difficult to make user-worn sensor packaging small, and communication from the moving sensors to the rest of the system is more complex. In contrast, there are fewer mechanical considerations when mounting sensors in the environment for an outside-looking-in configuration. Because landmarks can be relatively simple, small, and cheap, they can often be located in numerous places on the user, and communication from the user to the rest of the system can be relatively simple or even unnecessary. This is particularly attractive for full-body motion capture (BL, 2000; MAC, 2000).
However, there are some significant advantages to the inside-looking-out approach for head tracking.
Figure 5. [Figure not reproduced. Outside-Looking-In: lab-mounted (fixed) optical sensor, head-mounted landmarks. Inside-Looking-Out: lab-mounted (fixed) landmarks, head-mounted sensor.]
`
By operating with sensors on the user rather than in the environment, the system can be scaled indefinitely. The system can evolve from using dense active landmarks to fewer, lower signal-to-noise-ratio, passive, and some day natural features for a Self-Tracker that operates entirely without explicit landmark infrastructure (Bishop, 1984; Bishop & Fuchs, 1984; Welch, 1995).

The inside-looking-out configuration is also motivated by a desire to maximize sensitivity to changes in user pose. In particular, a significant problem with an outside-looking-in configuration is that only position estimates can be made directly, and so orientation must be inferred from position estimates of multiple fixed landmarks. The result is that orientation sensitivity is a function of both the distance to the landmarks from the sensor and the baseline between the landmarks on the user. In particular, as the distance to the user increases or the baseline between the landmarks decreases, the sensitivity goes down. For sufficient orientation sensitivity, one would likely need a baseline that is considerably larger than the user's head. This would be undesirable from an ergonomic standpoint and could actually restrict the user's motion.
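To make the scaling concrete, the following is a back-of-the-envelope sketch (not from the article; the sensor resolution, distance, and baseline values are illustrative assumptions) of how the smallest resolvable rotation grows with the distance-to-baseline ratio in an outside-looking-in arrangement:

```python
import math

def orientation_error_outside_in(angular_resolution_rad, distance_m, baseline_m):
    """First-order orientation error when orientation must be inferred from the
    measured positions of two head-mounted landmarks a fixed baseline apart,
    viewed by a fixed sensor at the given distance (outside-looking-in)."""
    # Each landmark position is uncertain by roughly distance * angular resolution.
    position_error_m = distance_m * angular_resolution_rad
    # A rotation of theta displaces the landmark pair by roughly baseline * theta,
    # so the smallest resolvable rotation is on the order of position_error / baseline.
    return position_error_m / baseline_m

# Illustrative numbers (assumptions): a sensor resolving one arc minute,
# landmarks 3 m away, and a head-sized 15 cm baseline.
one_arc_min = math.radians(1.0 / 60.0)
error_rad = orientation_error_outside_in(one_arc_min, distance_m=3.0, baseline_m=0.15)
print(f"resolvable rotation ~ {math.degrees(error_rad):.2f} deg")
```

Under this simple model, doubling the distance to the user or halving the landmark baseline doubles the orientation error, which is why a head-sized baseline quickly becomes inadequate at room scale.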
`
With respect to translation, the change in measured photo-coordinates is the same for an environment-mounted (fixed) sensor and user-mounted (moving) landmark as it is for a user-mounted sensor and an environment-mounted landmark. In other words, the translation and corresponding sensitivity are the same in either case.
`
3 System Overview
`
The HiBall Tracking System consists of three main components (figure 6). An outward-looking sensing unit we call the HiBall is fixed to each user to be tracked. The HiBall unit observes a subsystem of fixed-location infrared LEDs we call the Ceiling.²

2. At the present time, the LEDs are in fact entirely located in the ceiling of our laboratory (hence the subsystem name Ceiling), but LEDs could as well be located on walls or other fixed locations.
`
`
`
Figure 6. [Diagram not reproduced; labeled component: Ceiling-HiBall Interface Board (CIB).]
`
Communication and synchronization between the host computer and these subsystems is coordinated by the Ceiling-HiBall Interface Board (CIB). In section 4, we describe these components in more detail.

Each HiBall observes LEDs through multiple sensor-lens views that are distributed over a large solid angle. LEDs are sequentially flashed (one at a time) such that they are seen via a diverse set of views for each HiBall. Initial acquisition is performed using a brute-force search through LED space, but, once initial lock is made, the selection of LEDs to flash is tailored to the views of the active HiBall units. Pose estimates are maintained using a Kalman-filter-based prediction-correction approach known as single-constraint-at-a-time (SCAAT) tracking. This technique has been extended to provide self-calibration of the ceiling, concurrent with HiBall tracking. In section 5, we describe the methods we employ, including the initial acquisition process and the SCAAT approach to pose estimation, with the autocalibration extension.

4 System Components

4.1 The HiBall

The original electro-optical tracker (figure 3, bottom) used independently housed lateral-effect photodiode units (LEPDs) attached to a lightweight tubular framework. As it turns out, the mechanical framework would flex (distort) during use, contributing to estimation errors. In part to address this problem, the HiBall sensor unit was designed as a single, rigid, hollow ball having dodecahedral symmetry, with lenses in the upper six faces and LEPDs on the insides of the opposing six lower faces (figure 7). This immediately gives six primary "camera" views uniformly spaced by 57 deg. The views efficiently share the same internal air space and are rigid with respect to each other. In addition, light entering any lens sufficiently off-axis can be seen by a neighboring LEPD, giving rise to five secondary views through the top or central lens, and three secondary views
`
through the five other lenses. Overall, this provides 26 fields of view that are used to sense widely separated groups of LEDs in the environment. Although the extra views complicate the initialization of the Kalman filter as described in section 5.5, they turn out to be of great benefit during steady-state tracking by effectively increasing the overall HiBall field of view without sacrificing optical-sensor resolution.

Figure 7.

Figure 8.
The lenses are simple plano-convex fixed-focus lenses. Infrared (IR) filtering is provided by fabricating the lenses themselves from RG-780 Schott glass filter material, which is opaque to better than 0.001% for all visible wavelengths and transmissive to better than 99% for IR wavelengths longer than 830 nm. The long-wavelength filtering limit is provided by the DLS-4 LEPD silicon photodetector (UDT Sensors, Inc.), with peak responsivity at 950 nm but essentially blind above 1150 nm.
The LEPDs themselves are not imaging devices; rather, they detect the centroid of the luminous flux incident on the detector. The x-position of the centroid determines the ratio of two output currents, and the y-position determines the ratio of two other output currents. The total output currents of the two pairs are commensurate and are proportional to the total incident flux. Consequently, focus is not an issue, so the simple fixed-focus lenses work well over a range of LED distances from about half a meter to infinity. The LEPDs and associated electronic components are mounted on a custom rigid-flex printed circuit board (figure 8). This arrangement makes efficient use of the internal HiBall volume while maintaining isolation between analog and digital circuitry, and increasing reliability by alleviating the need for intercomponent mechanical connectors.
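As a rough illustration of this readout (the difference-over-sum convention, axis signs, and normalization are assumptions; the article only states that each coordinate is determined by the ratio of a pair of currents), the photo-coordinates and total flux might be computed as:

```python
def lepd_photo_coordinates(ix_a, ix_b, iy_a, iy_b):
    """Centroid photo-coordinates from the four photocurrents of one LEPD.

    Uses the conventional difference-over-sum form for a lateral-effect
    photodiode; axis orientation and scaling are illustrative assumptions.
    """
    sum_x = ix_a + ix_b
    sum_y = iy_a + iy_b
    if sum_x <= 0.0 or sum_y <= 0.0:
        raise ValueError("no measurable flux on the detector")
    x = (ix_b - ix_a) / sum_x      # normalized x position in [-1, 1]
    y = (iy_b - iy_a) / sum_y      # normalized y position in [-1, 1]
    flux = 0.5 * (sum_x + sum_y)   # proportional to the total incident flux
    return x, y, flux
```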
Figure 9 shows the physical arrangement of the folded electronics in the HiBall. Each LEPD has four transimpedance amplifiers (shown together as one "Amp" in figure 9), the analog outputs of which are multiplexed with those of the other LEPDs, then sampled, held, and converted by four 16-bit Delta-Sigma analog-to-digital (A/D) converters. Multiple samples are integrated via an accumulator. The digitized LEPD data are organized into packets for communication back to the CIB. The packets also contain information to assist in error detection. The communication protocol is simple, and, while presently implemented by wire, the modulation scheme is amenable to a wireless implementation. The present wired implementation allows multiple HiBall units to be daisy-chained, so a single cable can support a user with multiple HiBall units.
`
Figure 9. [Diagram not reproduced; block labels: 6 sensors and amplifiers, analog, digital, connector, base and controller, A/D converter, accumulator.]
`
4.2 The Ceiling

As presently implemented, the infrared LEDs are packaged in 61 cm square panels to fit a standard false-ceiling grid (figure 10, top). Each panel uses five printed circuit boards: a main controller board and four identical transverse-mounted strips (bottom). Each strip is populated with eight LEDs for a total of 32 LEDs per panel. We mount the assembly on top of a metal panel such that the LEDs protrude through 32 corresponding holes. The design results in a ceiling with a rectangular LED pattern with periods of 7.6 cm and 15.2 cm. This spacing is used for the initial estimates of the LED positions in the lab; then, during normal operation, the SCAAT algorithm continually refines the LED position estimates (section 5.4). The SCAAT autocalibration not only relaxes design and installation constraints, but provides greater precision in the face of initial and ongoing uncertainty in the ceiling structure.

We currently have enough panels to cover an area approximately 5.5 m by 8.5 m with a total of approximately 3,000 LEDs.³ The panels are daisy-chained to each other, and panel-selection encoding is position (rather than device) dependent. Operational commands are presented to the first panel of the daisy chain.

3. The area is actually L-shaped; a small storage room occupies one corner.
`
Figure 10.
`
At each panel, if the panel-select code is zero, the controller decodes and executes the operation; otherwise, it decrements the panel-select code and passes it along to the next panel (controller). Upon decoding, a particular LED is selected and energized. The LED brightness (power) is selectable for automatic gain control as described in section 5.2.
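The position-dependent addressing can be pictured with a short sketch (a model of the behavior just described, not the panel firmware; the controller interface and names are assumptions):

```python
def forward_command(panel_controllers, panel_select, operation):
    """Deliver an operation to one panel in the daisy chain.

    Each controller inspects the panel-select code: zero means "execute here"
    (for example, select and energize one of the panel's 32 LEDs); otherwise
    the code is decremented and passed to the next panel in the chain.
    """
    for controller in panel_controllers:
        if panel_select == 0:
            controller.execute(operation)
            return
        panel_select -= 1
    raise ValueError("panel-select code exceeds the length of the daisy chain")
```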
We currently use Siemens SFH-487P GaAs LEDs, which provide both a wide-angle radiation pattern and high peak power, emitting at a center wavelength of 880 nm in the near IR. These devices can be pulsed up to 2.0 Amps for a maximum duration of 200 µs with a 1:50 (on:off) duty cycle. Although the current ceiling architecture allows flashing of only one LED at a time, LEDs may be flashed in any sequence. As such, no single LED can be flashed too long or too frequently. We include both hardware and software protection to prevent this.
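A sketch of the kind of software protection implied here (the bookkeeping and interface are assumptions, not the authors' implementation): before a flash is issued, check the 200 µs pulse-width limit and the 1:50 on:off duty cycle for that LED.

```python
MAX_PULSE_S = 200e-6          # maximum rated flash duration
MIN_OFF_TO_ON_RATIO = 50      # 1:50 (on:off) duty cycle

_last_flash = {}              # LED id -> (end_time_s, pulse_length_s)

def may_flash(led_id, now_s, pulse_s):
    """Return True if flashing this LED now respects its pulse and duty-cycle limits."""
    if pulse_s > MAX_PULSE_S:
        return False
    previous = _last_flash.get(led_id)
    if previous is not None:
        end_s, prev_pulse_s = previous
        # The LED must have been off for at least 50x its previous on-time.
        if now_s - end_s < MIN_OFF_TO_ON_RATIO * prev_pulse_s:
            return False
    _last_flash[led_id] = (now_s + pulse_s, pulse_s)
    return True
```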
`
4.3 The Ceiling-HiBall Interface Board

The Ceiling-HiBall Interface Board (CIB) (figure 11) provides communication and synchronization between a host personal computer, the HiBall (section 4.1), and the ceiling (section 4.2).
Figure 11.
`
The CIB has four ceiling ports allowing interleaving of ceiling panels for up to four simultaneous LED flashes and/or higher ceiling bandwidth. (The ceiling bandwidth is inherently limited by LED power restrictions as described in section 4.2, but this can be increased by spatially multiplexing the ceiling panels.) The CIB has two tether interfaces that can communicate with up to four daisy-chained HiBall units. The full-duplex communication with the HiBall units uses a modulation scheme (BPSK) allowing future wireless operation. The interface from the CIB to the host PC is the stable IEEE 1284C extended parallel port (EPP) standard.

The CIB comprises analog drive and receive components as well as digital logic components. The digital components implement store and forward in both directions and synchronize the timing of the LED "on" interval within the HiBall dark-light-dark intervals (section 5.2). The protocol supports full-duplex flow control. The data are arranged into packets that incorporate error detection.
`
5 Methods

5.1 Bench-Top (Offline) HiBall Calibration

After each HiBall is assembled, we perform an offline calibration procedure to determine the correspondence between image-plane coordinates and rays in space. This involves more than just determining the view transform for each of the 26 views. Nonlinearities in the silicon sensor and distortions in the lens (such as spherical aberration) cause significant deviations from a simple pinhole camera model. We dealt with all of these issues through the use of a two-part camera model.
The first part is a standard pinhole camera represented by a 3 x 4 matrix. The second part is a table mapping real image-plane coordinates to ideal image-plane coordinates.

Both parts of the camera model are determined using a calibration procedure that relies on a goniometer (an angular positioning system) of our own design. This device consists of two servo motors mounted together such that one motor provides rotation about the vertical axis while the second motor provides rotation about an axis orthogonal to vertical. An important characteristic of the goniometer is that the rotational axes of the two motors intersect at a point at the center of the HiBall optical sphere; this point is defined as the origin of the HiBall. (It is this origin that provides the reference for the HiBall state during runtime as described in section 5.3.) The rotational positioning motors were rated to provide twenty arc-second precision; we further calibrated them to six arc seconds using a laboratory-grade theodolite (an angle-measuring system).

To determine the mapping between sensor image-plane coordinates and three-space rays, we use a single LED mounted at a fixed location in the laboratory such that it is centered in the view directly out of the top lens of the HiBall. This ray defines the z or up axis for the HiBall coordinate system. We sample other rays by rotating the goniometer motors under computer control. We sample each view with rays spaced about every six minutes of arc throughout the field of view. We repeat each measurement 100 times to reduce the effects of noise on the individual measurements and to estimate the standard deviation of the measurements.
`
Given the tables of approximately 2,500 measurements for each of the 26 views, we first determine a 3 x 4 view matrix using standard linear least-squares techniques. Then, we determine the deviation of each measured point from that predicted by the ideal linear model. These deviations are resampled into a 25 x 25 grid indexed by sensor-plane coordinates using a simple scan-conversion procedure and averaging. Given a measurement from a sensor at runtime (section 5.2), we convert it to an "ideal" measurement by subtracting a deviation bilinearly interpolated from the nearest four entries in the table.
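A minimal sketch of this runtime correction step (the grid extent in sensor coordinates and the array layout are assumptions; only the 25 x 25 table and the bilinear interpolation are taken from the text):

```python
import numpy as np

def to_ideal(measured_xy, deviation_grid, sensor_min=-1.0, sensor_max=1.0):
    """Convert a real sensor-plane measurement to 'ideal' pinhole coordinates by
    subtracting a bilinearly interpolated deviation from a 25 x 25 lookup table.

    deviation_grid: array of shape (25, 25, 2) holding (dx, dy) at each grid node.
    The grid extent (sensor_min..sensor_max) is an illustrative assumption.
    """
    n = deviation_grid.shape[0]
    # Map sensor coordinates to continuous grid indices.
    u, v = ((np.asarray(measured_xy, dtype=float) - sensor_min)
            / (sensor_max - sensor_min)) * (n - 1)
    i0 = int(np.clip(np.floor(u), 0, n - 2))
    j0 = int(np.clip(np.floor(v), 0, n - 2))
    fu, fv = u - i0, v - j0
    # Bilinear blend of the four nearest table entries.
    d = ((1 - fu) * (1 - fv) * deviation_grid[i0, j0]
         + fu * (1 - fv) * deviation_grid[i0 + 1, j0]
         + (1 - fu) * fv * deviation_grid[i0, j0 + 1]
         + fu * fv * deviation_grid[i0 + 1, j0 + 1])
    return np.asarray(measured_xy, dtype=float) - d
```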
`
5.2 Online HiBall Measurements

Upon receiving a command from the CIB (section 4.3), which is synchronized with a CIB command to the ceiling, the HiBall selects the specified LEPD and performs three measurements: one before the LED flashes, one during the LED flash, and one after the LED flash. Known as "dark-light-dark," this technique is used to subtract out DC bias, low-frequency noise, and background light from the LED signal. We then convert the measured sensor coordinates to "ideal" coordinates using the calibration tables described in section 5.1.
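As an illustration of the idea (the exact combination of the two dark readings is an assumption; the article does not specify it), the background can be estimated from the bracketing dark samples and removed before the currents are turned into photo-coordinates:

```python
def dark_light_dark(dark_before, light, dark_after):
    """Estimate the LED-only signal from a dark-light-dark measurement triple.

    Each argument is a tuple of the four LEPD channel readings taken before,
    during, and after the LED flash. Averaging the two dark readings (an
    assumption) removes DC bias, slow drift, and background light.
    """
    return tuple(l - 0.5 * (db + da)
                 for db, l, da in zip(dark_before, light, dark_after))
```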
In addition, during runtime we attempt to maximize the signal-to-noise ratio of the measurement with an automatic gain-control scheme. For each LED, we store a target signal-strength factor. We compute the LED current and number of integrations (of successive accumulated A/D samples) by dividing this strength factor by the square of the distance to the LED, estimated from the current position estimate. After a reading, we look at the strength of the actual measurement. If it is larger than expected, we reduce the gain.
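A rough sketch of the gain computation as described (how the resulting gain is apportioned between LED current and integration count is not specified here, and the feedback step size is an assumption):

```python
def led_gain(target_strength, distance_m):
    """Requested gain for one LED sighting: the stored per-LED target strength
    factor divided by the squared distance from the current pose estimate."""
    return target_strength / (distance_m ** 2)

def update_target_strength(target_strength, expected, measured, step=0.9):
    """Illustrative feedback: if the reading comes back stronger than expected,
    back the stored factor off; the step size is an arbitrary assumption."""
    return target_strength * step if measured > expected else target_strength
```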
