The Brain Opera Technology: New Instruments and Gestural Sensors for Musical Interaction and Performance

Joseph A. Paradiso

MIT Media Laboratory E15-325
Cambridge MA 02139 USA

joep@media.mit.edu
Tel: +1-617-253-8988
Fax: +1-617-258-7168

Keywords: Brain Opera, human-computer interface, multimodal input devices, electronic music interfaces, interactive music systems, percussion interfaces, interactive dance, electric field sensing, capacitive sensing
ABSTRACT

This paper describes the array of new musical instruments and interactive installations developed for the Brain Opera, a large, touring multimedia production, where the audience first explores a set of musical modes at a variety of novel, interactive stations before experiencing them in an actual performance. Most of the Brain Opera's installations were intended for the general public, employing different gestural measurements and mappings that allow an untrained audience to intuitively interact with music and graphics at various levels of complexity. Another set of instruments was designed for a trio of trained musicians, who used more deliberate technique to perform the composed music. This paper outlines the hardware and sensor systems behind these devices: the electric field sensors of the Gesture Wall and Sensor Chair, the smart piezoelectric touchpads of the Rhythm Tree, the instrumented springs in Harmonic Driving, the pressure-sensitive touch screens of the Melody Easels, and the multimodal Digital Baton, containing a tactile interface, inertial sensors, and a precise optical tracker. Also discussed is a set of controllers developed for the Brain Opera but not currently touring with the production, including the Magic Carpet (immersive body sensing with a smart floor and Doppler radar) and an 8-channel MIDI-controlled sonar rangefinder.
Version 2.0, November 1998; To appear in the Journal of New Music Research
Figure 1: Overhead view of the Brain Opera Lobby truss structure, being assembled before a Tokyo run in November 1996. All electronics are mounted atop the truss, leaving only the interactive interfaces (such as the Rhythm Tree bags at lower right) visible to the participants.

1) Introduction
New sensing technologies and the steadily increasing power of embedded computation, PCs, and workstations have recently enabled sophisticated, large-scale experiments in interactive music to be conducted with the general public. Although most (e.g., Ulyate 1998) have been one-time occasions, the Brain Opera is the largest touring participatory electronic musical installation yet constructed. The interactive section alone, termed the "Mind Forest" or "Lobby" (named after the Juilliard Theater's Marble Lobby, where it opened in July 1996 at the first Lincoln Center Festival), is composed of 29 separate installations, run by an array of circa 40 networked PCs and workstations. Figure 1 shows an overhead view of the Lobby electronics being deployed atop the supporting truss structure, indicating the large physical scale. During a Brain Opera run, these interactive stations are open to the general public, who wander through them freely, in any desired order. The stations are of 5 types, each creating a different experience and exploiting various gestural sensing and multimedia mapping modalities, as described in the following section. Some of these stations allow manipulation of sound structures, others acquire voice samples from the users, and others enable parametric manipulation of various Brain Opera themes. After about an hour of Lobby experience, the audience is conducted into a theater space, where a trio of musicians performs the entire Brain Opera composition on a set of "hyperinstruments" (Machover 1991), similar in style and technology to those previously experienced in the Lobby.
The Brain Opera, conceived and directed by Tod Machover, was designed and constructed by a highly interdisciplinary team at the MIT Media Laboratory during an intensive effort from the fall of 1995 through the summer of 1996. A major artistic goal of this project was to integrate diverse, often unconnected control inputs and sound sources from the different Lobby participants into a coherent artistic experience that is "more than the sum of its parts", inspired by the way our minds congeal fragmented experiences into rational thought (Machover 1996). This congealing process was anticipated to culminate in the Brain Opera performance, where the diverse themes and activities experienced in the Lobby were actively sculpted into a succinct musical piece. Such analogies to brains and the thought process, particularly as interpreted by artificial intelligence pioneer Marvin Minsky (Minsky 1988), drove much of the initial Brain Opera inspiration, from the use of uncorrelated, essentially stochastic audience input (emulating neural stimulation) to the somewhat "biological" appearance of the physical set. More generally, the Brain Opera attempts to make a strong statement about actively involving non-specialized audiences in artistic environments, confronting many questions about interactive music to which ready answers are currently lacking (Machover 1996).
The overall design of the Brain Opera as an interactive installation is described in (Orth 1997), and its artistic motivation and goals have been discussed in many articles; e.g., (Machover 1996), (Rothstein 1996), (Wilkinson 1997). This paper, in contrast, concentrates on the many different instruments and interactive stations developed for this project, describing their technical design, sensor architectures, and functional performance. The Brain Opera is by no means a fixed or purely experimental installation; it had to operate in many real-world environments (having already appeared at 7 international venues) and function with people of all sizes, ages, cultures, and experience levels. As a result, the interface technologies were chosen for their intuitiveness, overall robustness, and lack of sensitivity to changing background conditions, noise, and clutter. This tended to rule out computation-intensive approaches, such as computer vision (e.g., Wren et al. 1997), which, although improving in performance, would be unable to function adequately in the very dense and dynamic Brain Opera environment.
Most of the Brain Opera's software is run on IBM PCs under Windows 95 or NT using ROGUS (Denckla and Pelletier 1996), a C++ MIDI utility library developed for this project, although some of the performance instruments are based around Apple Macintoshes running vintage code written in Hyperlisp (Chung 1988). Most of the musical interfaces described in this paper were designed to communicate via MIDI. In order to limit data rates, continuous controllers were polled; i.e., an interface waits for a poll command (in this case, a MIDI program change directed to the appropriate channel), then responds with its latest data. All of the custom-designed interfaces employed a 68HC11-based circuit as their MIDI engine, incorporated as either a "Fish" for electric-field and capacitive sensing (Paradiso and Gershenfeld 1997) or a "FishBrain" card. The latter is essentially a Fish without the capacitive sensing electronics: a 68HC11 with 4 raw analog inputs, 4 adjustable (gain/bias) analog inputs, 4 digital inputs, 8 digital outputs, and MIDI plus RS-232 input/output. The FishBrain is used as a general-purpose MIDI interface to analog sensors. With minor modification, the same embedded 68HC11 code runs on nearly all the Brain Opera devices.
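To make the polled-controller scheme concrete, the following is a minimal sketch of the firmware loop it implies: idle until a MIDI program change arrives on the interface's channel, then answer with the latest sensor values as control changes. The routines midi_read_byte(), midi_write_byte(), and read_sensor() are hypothetical stand-ins for the 68HC11's UART and A/D access, and the controller numbering is an assumption, not the actual Fish firmware.

    /* Sketch of a polled continuous controller: respond to a MIDI
       program change on our channel with the latest sensor readings. */
    #include <stdint.h>

    extern uint8_t midi_read_byte(void);        /* blocking UART read  */
    extern void    midi_write_byte(uint8_t b);  /* blocking UART write */
    extern uint8_t read_sensor(uint8_t ch);     /* latest 7-bit sample */

    #define MY_CHANNEL  2        /* assumed channel for this interface */
    #define NUM_SENSORS 4

    void poll_loop(void)
    {
        for (;;) {
            uint8_t status = midi_read_byte();
            /* Program change on our channel (0xC0 | channel) = poll. */
            if (status == (0xC0 | MY_CHANNEL)) {
                (void)midi_read_byte();      /* discard program number */
                for (uint8_t i = 0; i < NUM_SENSORS; i++) {
                    midi_write_byte(0xB0 | MY_CHANNEL);  /* ctl change */
                    midi_write_byte(i);                  /* ctl number */
                    midi_write_byte(read_sensor(i) & 0x7F);
                }
            }
        }
    }

Polling in this direction keeps the MIDI lines quiet until the host actually wants data, which is why it limits data rates on a bus shared by many continuous sensors.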
2) The Lobby Installations

Figure 2: Schematic of the Speaking and Singing Trees

The simplest and most plentiful Lobby stations were the Speaking Trees. As depicted in Fig. 2 and shown in Fig. 3, these interfaces feature a dedicated PC, a pair of headphones, a microphone, a 10" color LCD screen, and a modified ProPoint mouse from Interlink Electronics (http://www.interlinkelec.com/). The ProPoint is a handheld device that allows the thumb to navigate the cursor by adjusting the center of pressure atop a fingertip-sized, force-sensitive resistor array; the "clicks" are still determined by a pushbutton (mounted for forefinger access). In order to accommodate the "organic" look of the Brain Opera, the ProPoint circuit cards were removed from their dull plastic housings and potted into a somewhat elastic, leaf-shaped mold.

Figure 3: Layout and photograph of a Speaking Tree

As seen in Fig. 3, these components were all mounted onto an adjustable-height polypropylene frame that fits over the head, nicely encapsulating the user in a somewhat private and isolated environment. A simple 17" x 23" switchmat is mounted under the carpet beneath each Speaking Tree. When an occupant steps under the tree, the switchmat closes, bridging a set of polling lines on the PC serial port. When this event is detected, a MacroMind Director sequence starts up, featuring video clips of Marvin Minsky, whose Society of Mind (Minsky 1985) inspired the libretto and overall concept of the Brain Opera. Throughout the dialog, the image of Minsky asks the user several questions; the answers are recorded and indexed on the host PC, then subsequently transferred over the network to a bank of samplers in the theater for playback during following performances. There are a total of 15 Speaking Trees in the Brain Opera, 12 of which run different Director sequences. Although the dialog with Minsky can be interesting and amusing, it is only one simple application of the facilities available at each Tree; several other, more engaging experiences are now being developed. More detail on the Speaking Trees can be found in (Orth 1997).
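The switchmat detection described above amounts to looping a serial-port output line back through the mat to an input line. The sketch below illustrates one plausible wiring (RTS bridged to CTS when the mat closes), using POSIX ioctl calls for brevity; the Brain Opera PCs actually ran Windows, and the specific line pairing is an assumption, so treat this only as an illustration of the idea.

    /* Sketch: detect a closed switchmat that bridges RTS to CTS. */
    #include <fcntl.h>
    #include <sys/ioctl.h>
    #include <termios.h>
    #include <unistd.h>

    int mat_occupied(int fd)
    {
        int bits = TIOCM_RTS;
        ioctl(fd, TIOCMBIS, &bits);      /* assert RTS                */
        ioctl(fd, TIOCMGET, &bits);      /* read modem-control lines  */
        return (bits & TIOCM_CTS) != 0;  /* CTS high => mat closed    */
    }

    int main(void)
    {
        int fd = open("/dev/ttyS0", O_RDWR | O_NOCTTY);
        while (fd >= 0) {
            if (mat_occupied(fd)) {
                /* start the Director sequence here */
            }
            usleep(100000);              /* poll at roughly 10 Hz     */
        }
        return 0;
    }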
Similar in construction are the Singing Trees, schematically shown at right in Fig. 2. Lacking a tactile interface, they respond solely to the singing voice, which they analyze into 10 dynamic features. These parameters drive an algorithmic composition engine, which effectively resynthesizes the participant's voice on a Kurzweil K2500 synthesizer. The Singing Trees look for consistency in the singing voice at a single pitch; the longer the pitch is held, the more tonal and "pleasing" the resynthesis becomes. The derived quality factors are also used to drive an animation playing on the LCD screen (Daniel 1996); as the pitch is held longer, the animation propagates forward and becomes more engaging (e.g., a ballerina appears and begins to dance, as in Fig. 2). When the voice falters, the animation rewinds into a set of simpler images. The audio and video feedback on the singing voice has proven quite effective; the tonal and visual rewards encourage even poor amateurs to try for a reasonable tone. There are 3 Singing Trees in the Brain Opera, each running different image sequences. More details on the Singing Tree design and synthesis/analysis algorithms are given in (Oliver, Yu, and Metois 1997) and (Oliver 1997).
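The actual ten voice features and their mappings are detailed in (Oliver 1997); purely as an illustration of the reward dynamic described above, the sketch below shows one hypothetical feature of this kind, a running pitch-stability measure that climbs while a single pitch is held and collapses when the voice falters, which is exactly the behavior that drives the animation forward or rewinds it.

    /* Hypothetical pitch-stability feature: 0 = unstable, 1 = long-held. */
    #include <math.h>

    static double held_pitch = 0.0;   /* pitch currently being held (Hz) */
    static double consonance = 0.0;   /* "pleasingness" control, 0..1    */

    double update_consonance(double pitch_hz)   /* per-frame estimate */
    {
        /* within ~half a semitone (3%) counts as "the same pitch" */
        if (held_pitch > 0.0 && fabs(pitch_hz / held_pitch - 1.0) < 0.03) {
            consonance += 0.02 * (1.0 - consonance);  /* slow reward     */
        } else {
            held_pitch = pitch_hz;                    /* new target      */
            consonance *= 0.5;                        /* animation rewinds */
        }
        return consonance;
    }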
Figure 4: A Melody Easel (left) and Harmonic Driving joystick (right) in action

Another relatively straightforward interface used in the Brain Opera is the Melody Easel. These are 21" computer monitors, recess-mounted into a hanging "table" such that their screens are horizontal and embedded in the tabletop (see Fig. 4, left). These monitors, however, are equipped with pressure-sensitive touchscreens (the IntelliTouch from ELO TouchSystems), which can deliver 11 bits of position accuracy and circa 5 bits of pressure information at 10 msec updates. Users manipulate a parametric sequence running one of the Brain Opera themes by moving a finger across the screen; the synthesized voices (generated on a Kurzweil K2500 sampler and a Korg Prophecy synthesizer) respond to position, pressure, and velocity. A video sequence, playing on the monitor, is likewise perturbed by the finger position and pressure, using various realtime video processing algorithms (Dodge 1997). There are 3 Melody Easels in the Brain Opera. Each uses a pair of PCs (one for music and another for video), and runs different musical voicings and visuals.

Figure 5: Data from a pressure-sensing Melody Easel touchscreen

Fig. 5 shows data from an IntelliTouch screen used in the Brain Opera for a finger tracing a circle and an "X". The position and pressure data are shown vs. time at left, and as a raster (x vs. y) at right, with the pressure values determining the radius of the overplotted circles (higher pressure = wider circles); the pressure goes to zero when the finger is lifted off the glass. The IntelliTouch uses surface acoustic waves propagating through the touchscreen glass to determine the finger coordinates; the timing of the acoustic absorption peak gives position, and the amount of signal absorbed by the finger determines pressure (Kent 1997). Unfortunately, it does not work properly when more than one finger is in contact with the screen, but users quickly adapted to this drawback and had no difficulty using this somewhat ubiquitous interface. As the analog signals produced by the screen always give a response (the packaged digital controller produces data only when a single finger is in contact), it is possible to modify the touchscreen so it detects and responds to multifinger input, albeit with some ambiguity, but still usefully for simple musical interpretation.
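As a concrete picture of how position, pressure, and velocity can drive the music, the following sketch maps normalized touch samples onto MIDI controllers. The controller assignments (pan, expression, modulation, plus one arbitrary controller for the vertical axis) and the midi_cc() helper are hypothetical; the actual Melody Easel mappings (Dodge 1997) are considerably richer.

    /* Sketch: map touchscreen samples onto MIDI continuous controllers. */
    #include <math.h>
    #include <stdint.h>

    extern void midi_cc(uint8_t chan, uint8_t ctrl, uint8_t val);

    void easel_update(double x, double y, double pressure,  /* 0..1 each */
                      double dt)                            /* seconds   */
    {
        static double px = 0.0, py = 0.0;
        double speed = hypot(x - px, y - py) / dt;    /* screen units/s  */
        px = x; py = y;

        midi_cc(0, 10, (uint8_t)(x * 127));           /* pan             */
        midi_cc(0, 20, (uint8_t)(y * 127));           /* hypothetical    */
        midi_cc(0, 11, (uint8_t)(pressure * 127));    /* expression      */
        double m = speed > 1.0 ? 1.0 : speed;         /* clamp velocity  */
        midi_cc(0, 1, (uint8_t)(m * 127));            /* modulation      */
    }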
Figure 6: The Harmonic Driving joystick and sensor data

At the Harmonic Driving stations, the user "drives" an animated vehicle shaped like a note through a graphical and musical landscape. Rather than using a conventional steering wheel or commercial joystick, which would hint too heavily of a common arcade experience, the user controls the experience with a novel interface made from a large (2" diameter, 15" long), bendable spring (Figs. 4 and 6), which has an entirely different feel, better suited to the "jovial" mood of the music and the nonphotorealistically rendered graphics (Robertson 1997). Musical parameters are selected both graphically (by steering onto different "roads" or hitting musical objects) and continuously (the joystick parameters themselves are mapped directly onto musical effects). The two-axis bending angles are measured using capacitive sensing to detect the relative displacement between the spring's coils at its midpoint. As shown in Fig. 6, four pickups are mounted outside the coils at 90° intervals. These are made from small sections of the insulated copper conductors from heavy-gauge electrical cable, tied and epoxied to the spring. A transmit electrode (broadcasting a 50 kHz sinewave), similarly fashioned, is wound completely around the coil above the pickups. As the spring bends, the pickups move closer together on one side (further apart on the other), and the capacitive coupling between transmitter and receivers changes accordingly. Shielded cables run from these electrodes to a nearby preamplifier, then to a "Fish" electric field sensing circuit (Paradiso and Gershenfeld 1997), which digitizes the four proximity signals into 7-bit MIDI values. The spring is electrically grounded to prevent extraneous coupling, and provided that hands are kept away from the pickups (as encouraged by the mounting geometry), the capacitive proximity measurement is quite accurate (Paradiso 1997a).
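Since opposing pickups move in opposite directions as the spring bends, each bend axis can be recovered by differencing the opposing pair, which also cancels common-mode drift; this is the "difference between opposing pickups" plotted in Fig. 6. A minimal sketch, with a hypothetical scale factor in place of real calibration:

    /* Sketch: two bend axes from the four 7-bit capacitive pickups. */
    typedef struct { double bend_x, bend_y; } Bend;

    Bend spring_bend(int north, int south, int east, int west)
    {
        Bend b;
        const double k = 1.0 / 127.0;     /* hypothetical scaling        */
        b.bend_y = k * (north - south);   /* + = bent toward north pickup */
        b.bend_x = k * (east - west);     /* + = bent toward east pickup  */
        return b;
    }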
The spring's twist is also measured, with a potentiometer that rotates through the relative angle between the top and bottom of the spring. The presence of a seated participant is detected when a light beam pointed across the chair is interrupted, at which point the experience is started; when the occupant leaves the seat, the software is automatically reset. The potentiometer and photodetector signals are also digitized by the Fish. Fig. 6 shows the resulting bend (difference between opposing pickups), twist, and occupancy signals for an occupant moving into the seat and putting the joystick through all degrees of freedom: first bending it around in a circle, then twisting it clockwise and counterclockwise, and finally bending it in the horizontal then vertical axes before leaving. There are 3 Harmonic Driving units in the Brain Opera, all running the same experience; each uses a PC for music (generated by an E-Mu Morpheus synthesizer) and an IBM RS/6000 workstation for graphics. The hub of the joystick is embedded with an array of 8 LEDs, which flash under MIDI control.
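The occupancy logic reduces to a two-state machine around the light beam: start on the beam's first interruption, reset when it is restored. A minimal sketch, with beam_broken() standing in for the digitized photodetector channel:

    /* Sketch: seat-occupancy state machine for Harmonic Driving. */
    #include <stdbool.h>

    extern bool beam_broken(void);
    extern void start_experience(void);
    extern void reset_experience(void);

    void occupancy_poll(void)
    {
        static bool seated = false;
        bool now = beam_broken();
        if (now && !seated)  start_experience();   /* rider sat down */
        if (!now && seated)  reset_experience();   /* rider left     */
        seated = now;
    }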
Figure 7: Rhythm Tree string configuration (left) and pad schematic (right)

Figure 8: Photographs of a Rhythm Pad, showing the top view (left) of unpotted circuitry with electronics, PVDF pickup, and LED vs. the bottom (right) with PIC microprocessor and bus connections

The Rhythm Tree is the world's largest electronic percussion rig, composed of 320 smart drumpads organized into 10 "strings" of 32 pads each. As schematically depicted in Fig. 7, the pads in each string are daisy-chained along a 19.2 kbaud RS-485 serial bus, like a line of "Christmas bulbs". Each pad contains a 16-MHz PIC 16C71 microprocessor from Microchip Technology (see http://www.microchip.com/), which monitors the bus, processes data from a piezoelectric (PVDF) sensor strip (Paradiso 1996), and controls the illumination of a large, on-pad LED. As seen in Fig. 8, all electronics, the PVDF pickup, and the LED are potted in a compliant, translucent urethane (Orth 1997), which is struck by the hand when the pad is played. The PIC digitizes the pad signal into 8 bits at 50 kHz. When a peak is detected over threshold, a valid strike is assumed, and the PIC extracts a set of features from a subsequent 0-15 msec remotely programmable interval of the pickup signal. These parameters include the polarity of the initial PVDF signal peak, the number of significant zero crossings detected, and the net integrated signal amplitude (producing 14 bits of velocity information).

Figure 9: Pickup response to different types of pad strikes

The initial significant peak polarity yields a very reliable discrimination between top and side impacts, as illustrated for many separate hits superimposed at left in Fig. 9 (top hits start negative, whereas side hits start positive); this arises from the PVDF strip initially going into compression or expansion. As shown at right in Fig. 9, the number of zero crossings detected over the sampling interval can discriminate between a sharp, elastic hit (where the pad exhibits strong mechanical ringdown) and a dull slap with the hand remaining on the pad (a heavily damped response). Because this parameter depends more on the pad construction and strike kinematics, it is less consistent for the player unless a longer sampling interval, and hence excessive latency, is introduced.
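The following sketch pulls the three features together as a PIC-style loop might compute them over the post-trigger window: first significant peak polarity (top vs. side), significant zero crossings (ringdown vs. damped slap), and integrated amplitude clamped to 14 bits (velocity). Samples are assumed 8-bit and centered at 128; the threshold and window length are representative values, not the deployed ones.

    /* Sketch: per-hit feature extraction from the PVDF pickup window. */
    #include <stdint.h>
    #include <stdlib.h>

    typedef struct {
        int      polarity;   /* -1: first peak negative (top), +1: side */
        int      crossings;  /* significant zero crossings in window    */
        uint16_t velocity;   /* integrated |signal|, up to 14 bits      */
    } HitFeatures;

    HitFeatures extract_hit(const uint8_t *buf, int n)  /* n = window  */
    {
        HitFeatures f = { 0, 0, 0 };
        const int thresh = 10;            /* "significant" excursion    */
        int prev_sign = 0;

        for (int i = 0; i < n; i++) {
            int s = (int)buf[i] - 128;    /* signed sample              */
            if (f.polarity == 0 && abs(s) > thresh)
                f.polarity = (s < 0) ? -1 : +1;  /* first big peak      */
            int sign = (s > thresh) ? 1 : (s < -thresh) ? -1 : 0;
            if (sign != 0 && prev_sign != 0 && sign != prev_sign)
                f.crossings++;            /* only above-threshold flips */
            if (sign != 0) prev_sign = sign;
            uint32_t v = f.velocity + (uint32_t)abs(s);
            f.velocity = (v > 0x3FFF) ? 0x3FFF : (uint16_t)v; /* 14 bits */
        }
        return f;
    }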
Because they are all on a shared bus, every connected rhythm pad must have a unique ID. This can be accomplished in three ways for these pads. Each pad has a 100 Ω resistor in series between a line in the bus input and the corresponding line in the bus output; this signal also goes to one of the PIC A/D inputs. The concentrator sets this bus line to 5 Volts at the first pad, and the terminator holds this line to ground at the last pad; thus this sampled voltage, read after the PIC powers up and the lines stabilize, is proportional to the position of the pad along the chain (the resistors form a divider ladder). Although this has promise, when put into practice, problems crept in from voltage stability and component tolerance. Another A/D input pin of the PIC is fed by a solid-state white noise generator (Simonton 1973), enabling the pads to access truly random numbers and thus statistically accumulate different IDs (Smith 1998). This works, but loses information on the physical position of each pad, which the music software desires. The technique currently used is brute-force: each PIC on a string has a unique ID (running 0-31) programmed into its PROM.
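The divider-ladder idea reduces to a simple conversion: with 5 V at the concentrator end and ground at the terminator, the sampled voltage falls linearly along the chain, so the A/D reading directly indexes the pad's position. A sketch of that conversion (rounding to the nearest tap), bearing in mind the tolerance problems noted above:

    /* Sketch: derive a pad's chain position from the ladder voltage. */
    #include <stdint.h>

    #define NUM_PADS 32

    uint8_t ladder_id(uint8_t adc)       /* 8-bit A/D reading, 255 = 5 V */
    {
        /* Position 0 sits nearest the 5 V end, position 31 nearest
           ground; round the fraction of full scale to the nearest tap. */
        uint16_t id = ((uint16_t)(255 - adc) * (NUM_PADS - 1) + 127) / 255;
        return (uint8_t)id;
    }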
The drumpads employ a very efficient round-robin polling scheme that transfers data to the bus concentrator with minimal delay. After a drumpad has been struck and has hit data ready to send, it waits for a poll-setup message from the bus concentrator, with a base pad address as its argument. Starting at this base, the pads on a string are sequentially addressed after each subsequent serial bit sent from the concentrator. Each of these transmitted bits advances a counter in all pads; when this counter value matches the assigned ID of a pad with data waiting, that pad seizes the bus and responds with its data. The concentrator then transmits another poll-setup message commanding the pads to set their counters to the next pad in the sequence, and continues polling as before. As each pad is independently addressed, this scheme returns the hit data after a bounded, deterministic interval. Addressing sequential pads with a single transmitted bit (rather than a full address) entails minimal readout delay.
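Seen from the pad side, the scheme needs only a mirrored counter and two event handlers, as in the sketch below. The message framing is schematic (the actual RS-485 protocol bytes are not described here), but the counter logic follows the text: a poll-setup resynchronizes every pad, each subsequent poll bit advances the counter, and the matching pad with pending data seizes the bus.

    /* Pad-side sketch of the round-robin polling scheme. */
    #include <stdbool.h>
    #include <stdint.h>

    static uint8_t my_id;        /* 0-31, programmed into PROM          */
    static uint8_t counter;      /* mirrored in every pad on the string */
    static bool    hit_pending;

    extern void send_hit_data(void);  /* seize bus, transmit features   */

    void on_poll_setup(uint8_t base_addr)
    {
        counter = base_addr;          /* all pads resynchronize here    */
    }

    void on_poll_bit(void)            /* one serial bit = next pad      */
    {
        if (counter == my_id && hit_pending) {
            send_hit_data();          /* bounded, deterministic slot    */
            hit_pending = false;
        }
        counter++;
    }

Because every pad advances in lockstep, the concentrator never has to transmit full addresses during the scan, which is where the minimal readout delay comes from.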
The hit parameters are thereby passed to the concentrator, where they are formatted into a MIDI stream and routed through a MIDI merger (grouping up to 8 strings) to the host computer, which then triggers synthesized sounds and illuminates the pad LEDs according to a simple pre-defined mapping scheme with two sounds on each pad: one for top and another for side impacts (Back 1997). In order to facilitate easy testing using a commercial drum synthesizer, the pad number and high velocity byte are sent as a MIDI note, followed by the hit polarity, ringdown count, and low velocity byte sent as a pitch bend command. The drumpads generally respond within a 15 msec interval (much of which is due to the integration time rather than network latency); a bit slow for a performance instrument, but adequate for their application in the Brain Opera Lobby.
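A sketch of that packing follows: the note-on carries the pad number and the upper 7 velocity bits, and a trailing pitch-bend message carries the low velocity bits plus the polarity and ringdown features. The exact bit layout within the pitch-bend data bytes is an assumption for illustration; only the note/pitch-bend split is stated in the text.

    /* Sketch: concentrator-side MIDI packing of one drumpad hit. */
    #include <stdint.h>

    extern void midi3(uint8_t status, uint8_t d1, uint8_t d2);

    void send_hit(uint8_t string_chan, uint8_t pad,    /* pad: 0-31     */
                  uint16_t vel14, int polarity, uint8_t ringdown)
    {
        midi3(0x90 | string_chan, pad & 0x7F,
              (vel14 >> 7) & 0x7F);                    /* high 7 bits   */
        /* Pitch bend: low 7 velocity bits in d1; polarity flag and
           ringdown count packed into d2 (hypothetical layout). */
        uint8_t d2 = ((polarity > 0) ? 0x40 : 0x00) | (ringdown & 0x3F);
        midi3(0xE0 | string_chan, vel14 & 0x7F, d2);
    }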
The pad's LED intensity is controlled by the PIC via duty-cycle modulation. Normally, the PIC is set in a mode that automatically illuminates the LED upon hit detection with an initial brilliance proportional to the detected velocity, then dims it exponentially. The LED can also be directly controlled or triggered over the bus (hence via MIDI); this mode is exploited to send occasional "waves" of illumination across the strings.
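A minimal sketch of that behavior, with illustrative (not deployed) rates: brightness is set from the hit velocity, rendered as a PWM duty cycle, and decayed by a fixed fraction each PWM frame, which yields the exponential dimming.

    /* Sketch: velocity-triggered LED with exponential PWM decay. */
    #include <stdint.h>

    extern void led_set(int on);

    static uint16_t brightness;        /* 0-255 duty-cycle target      */

    void led_on_hit(uint8_t velocity)  /* high velocity byte, 0-127    */
    {
        brightness = (uint16_t)velocity << 1;  /* initial brilliance   */
    }

    void led_tick(uint8_t phase)       /* called every PWM step, 0-255 */
    {
        led_set(phase < brightness);   /* duty cycle ~ brightness      */
        if (phase == 0 && brightness > 0)
            brightness -= (brightness >> 5) + 1;  /* exponential decay */
    }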
Nearly all pad parameters (trigger thresholds, integration times, LED modes, etc.) are programmable over the bus. A Visual Basic program has been written to allow these parameters to be rapidly adjusted for individual pads and groups of pads; a data file is then produced, which is downloaded to the pads by the music software. This parameter list is continually sent out over the bus in order to reprogram any pad processors that reset through power interruptions; a valuable feature, in that the RJ-11 connectors that mate to the pads are occasionally pulled and momentarily break contact (the initial solid, linear structure of the Rhythm Tree design gave way to a less expensive mounting scheme using foam-filled bags, which have several mechanical disadvantages).

Figure 10: A typical audience encounter with a Rhythm Tree Bag

Despite these drawbacks, the Rhythm Tree system successfully survives long periods of kinetic abuse (e.g., Fig. 10), as expected for such an installation in open, public venues.

All pads also have a common bus line connected to the remaining PIC A/D input; each PIC can adjust the voltage on this line. Although it is currently unused, it was installed to enable fast collective communication and computation of a common function for other purposes, such as direct sound synthesis. Even though the PIC chips, having only 64 bytes of data memory and 1K of PROM, provide a restrictive development environment, all PIC code was written entirely in C (http://www.ccsinfo.com/picc.html).
Figure 11: Gesture Wall in action (left) and schematic (right)

The last installation in the Lobby is the Gesture Wall, which uses transmit-mode electric field sensors (Paradiso and Gershenfeld 1997) to measure the position and movement of a user's hands and body in front of a projection screen. The device is diagrammed at right in Fig. 11 and shown in operation at left; there are five Gesture Walls in the Brain Opera. A brass transmitter plate atop the floor is driven by a low-frequency sinusoidal signal (ranging 50-100 kHz; each Gesture Wall is tuned to a different frequency to avoid crosstalk) at very low voltage (2-20 Volts p-p), far removed from any FCC or health restrictions. When a performer steps atop the transmitter platform, this signal couples through the performer's shoes into their body. A set of four pickup antennas, mounted on goosenecks around the perimeter of the screen, are narrowly tuned through synchronous demodulation (Paradiso and Gershenfeld 1997) to receive this transmit signal and reject out-of-band background. The amplitude of these received signals, which corresponds to the strength of capacitive coupling (hence proximity) to the body, is detected and routed through log amplifiers to approximate a linear range-to-voltage relationship, then digitized to 8 bits and output via MIDI to a PC running ROGUS. An LED potted in each sensor "bud" is driven with the detected signal, thus glowing with increasing intensity as the performer's body approaches (see Fig. 11, left); these LEDs can also be directly controlled through MIDI to illuminate in any desired fashion.
When transmitting into the body through the feet, the complex impedance of the shoe sole has a large effect on the coupling efficiency, hence on signal strength. This depends not only on the size and thickness of the sole, but most heavily on the sole material and composition; it thus varies widely from one shoe to the next and needs to be compensated. We solved this problem by having the player put a hand flat on a calibration plate after first stepping on the transmit electrode, measuring the current flowing into the body through the shoe sole. During the calibration interval, which requires well under a second, the transmitter's output voltage is servo'ed to drive this current to a fixed reference, thereby removing the effect of shoe impedance and making everybody a more-or-less identically efficient transmitter. A small second-order nonlinearity was introduced into this loop to compensate for the series capacitance of the insulated calibration plate. After calibration, all players can access the full range of response that the Gesture Wall is capable of delivering.

Figure 12: Distribution of Gesture Wall transmitter drive voltage, corresponding to shoe impedance, for a sample of 700 Brain Opera attendees

Fig. 12 shows the distribution of transmitter drive voltages (corresponding to shoe impedances via the calibration servo loop) for all users of one Gesture Wall during a 1-week Brain Opera run in Lisbon, illustrating the associated spread in coupling efficiency caused by different shoes. As seen in the fitted curve, the data can be approximated by a gamma distribution (Hoel 1971).
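A minimal sketch of the calibration loop described above: each iteration nudges the drive voltage toward the level that makes the measured body current hit the fixed reference, clamped to the stated 2-20 V p-p range. The loop gain and the form of the small second-order correction term are assumptions for illustration; the paper states only that such a quadratic term compensates the calibration plate's series capacitance.

    /* Sketch: one iteration of the shoe-impedance calibration servo. */
    double calibrate_drive(double drive_v,    /* current drive voltage  */
                           double i_meas,     /* measured body current  */
                           double i_ref)      /* fixed reference        */
    {
        double err  = i_ref - i_meas;
        double gain = 0.5;                    /* illustrative loop gain */
        double v = drive_v + gain * err
                 + 0.05 * err * err           /* small 2nd-order term,  */
                   * (err > 0 ? 1.0 : -1.0);  /* hypothetical form      */
        if (v < 2.0)  v = 2.0;                /* 2-20 V p-p drive range */
        if (v > 20.0) v = 20.0;
        return v;
    }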
An attached PC determines the position of a hand in the plane of the receivers, and its distance from this plane, by linear combinations of the four sensor signals. The weighting factors are determined by a least-squares fit to data taken with a hand placed at 9 predetermined locations in the sensor plane. This is done only once, after the Gesture Wall is first set up.

Figure 13: Reconstructed Gesture Wall position for a hand tracing the indicated patterns

The resulting data can track hand position very well, as seen in Fig. 13, which shows coordinates derived from the calibrated Gesture Wall sensors for a hand drawing in the air. As shown in the lower diagram, the hand first tracks a spiral, then a square, then both diagonals, with the "pen" changing every time the hand is pulled back. The individual shown in Fig. 11 is using the Gesture Wall in this "tracking" mode, with one hand forward and body back. More generally, people will introduce two hands at once, if not their entire body. In this case, the above algorithm produces "averaged" coordinates, which reflect the body's bulk position and articulation. Although the least-squares calibration was performed with an average-sized person, the tracking accuracy will also vary somewhat with the size of the participant. The mappings chosen for the Gesture Wall were nonetheless selected to give adequate audio/visual response for a wide range of body types and postures. We have subsequently developed another device (Strickon and Paradiso 1998), based on a low-cost scanning laser rangefinder, that can determine accurate positions of multiple hands in a plane, independent of body size or posture; although this was inspired by Gesture Wall applications, it is not currently in the Brain Opera.
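The position estimate itself is just an affine combination of the four amplitudes, and the one-time setup fit reduces to solving a small normal-equations system over the 9 calibration points. The sketch below shows both steps for one coordinate axis; it is a generic least-squares formulation consistent with the description above, not the actual Gesture Wall code (the plain Gauss-Jordan solver assumes well-conditioned calibration data).

    /* Sketch: per-axis linear model for Gesture Wall hand position. */
    typedef struct { double w[4]; double bias; } Axis;

    /* Runtime: coordinate = weighted sum of the 4 sensor amplitudes. */
    double axis_eval(const Axis *a, const double s[4])
    {
        double v = a->bias;
        for (int i = 0; i < 4; i++)
            v += a->w[i] * s[i];
        return v;
    }

    /* One-time setup: least-squares fit of weights + bias to the 9
       calibration points (s = sensor readings, target = true coord). */
    void fit_axis(const double s[9][4], const double target[9], Axis *out)
    {
        double A[5][6] = {{0}};           /* normal equations, augmented */
        for (int k = 0; k < 9; k++) {
            double row[5] = { s[k][0], s[k][1], s[k][2], s[k][3], 1.0 };
            for (int i = 0; i < 5; i++) {
                for (int j = 0; j < 5; j++)
                    A[i][j] += row[i] * row[j];
                A[i][5] += row[i] * target[k];
            }
        }
        for (int i = 0; i < 5; i++) {     /* Gauss-Jordan, no pivoting  */
            double p = A[i][i];
            for (int j = i; j < 6; j++) A[i][j] /= p;
            for (int r = 0; r < 5; r++)
                if (r != i) {
                    double f = A[r][i];
                    for (int j = i; j < 6; j++) A[r][j] -= f * A[i][j];
                }
        }
        for (int i = 0; i < 4; i++) out->w[i] = A[i][5];
        out->bias = A[4][5];
    }

Separate Axis structures for x, y, and distance-from-plane, each fit against the same 9 calibration readings, give the full 3-component estimate; with two hands or a whole body in the field, the same weights naturally return the "averaged" coordinates noted above.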
The "back end" of each Gesture Wall consists of a pair of PCs (one running the music and sensor analysis code and another running the graphics code), a Kurzweil K2500 synthesizer, and a video projector. The musical mappings (Machover 1998) played sequences that would increase in amplitude as the body approached the sensor plane (starting at silence with the player away from the sensors) and change pitch range as the player moved their hands/body vertically (favoring low notes with hands near the lower sensors, high notes with hands near the upper sensors), while changing the instrument timbre and panning as the hands/body were moved from right to left. The visual mappings (Dodge 1997), (Smith et al. 1998) created perturbations to a video sequence (Daniel 1996) when a player approached the sensors, with effects centered at the perceived hand/body position.
3) The Performance Instruments

Figure 14: The Digital Baton and Gesture Tree in performance

The segment of the Brain Opera that was actually performed before the audience was played by a trio, each using a different custom-designed electronic instrument. One of them, the "Gesture Tree" (visible at right in Fig. 14), is a simple hybrid of two of the interactive instruments described in the last section. A bag of Rhythm Tree pads enables percussive and tactile triggering of sounds, which are then modified in various ways by waving an arm around the four Gesture Wall sensors mounted above (the performer is standing on a transmitter plate). Although this instrument is usually played with bare hands, some performers have used metal drumsticks, to which the Gesture Wall pickups will also respond.

Figure 15: The Brain Opera's Sensor Chair; diagram (left) and in performance (right)

Another of the instruments, the Sensor Chair (Fig. 15), is based solely on transmit-mode electric field sensing. It is very similar to the Gesture Wall, except that here the performer sits on a chair with a transmit electrode affixed to the seat, providing excellent electrical coupling into the performer's body. Since the differences in shoe coupling (Fig. 12) are no longer an issue here (everybody couples nearly as well through the seat; clothing differences have only a minor effect), there is no need for a calibrator pickup. A linear software calibration, run roughly once a day for each chair performer, is sufficient to enable good, repeatable gesture response. A halogen bulb embedded in th