(19) World Intellectual Property Organization
International Bureau

(10) International Publication Number: WO 2010/062479 A1

(43) International Publication Date: 3 June 2010 (03.06.2010)

(51) International Patent Classification: G02B 27/02 (2006.01)

(21) International Application Number: PCT/US2009/059887

(22) International Filing Date: 7 October 2009 (07.10.2009)

(25) Filing Language: English

(26) Publication Language: English

(30) Priority Data:
61/110,591          2 November 2008 (02.11.2008)    US
61/142,347          3 January 2009 (03.01.2009)     US
PCT/US2009/000217   6 April 2009 (06.04.2009)       US
PCT/US2009/000218   6 April 2009 (06.04.2009)       US
61/171,168          21 April 2009 (21.04.2009)      US
61/173,700          29 April 2009 (29.04.2009)      US
61/180,101          20 May 2009 (20.05.2009)        US
61/180,982          26 May 2009 (26.05.2009)        US
61/230,744          3 August 2009 (03.08.2009)      US
61/232,426          8 August 2009 (08.08.2009)      US

(72) Inventor; and (71) Applicant: CHAUM, David [US/US]; 14652 Sutton St., Sherman Oaks, CA 91403 (US).

(74) Agent: PARCHER, Tiffany A.; Christie, Parker & Hale, LLP, P.O. Box 7068, Pasadena, CA 91109-7068 (US).

(81) Designated States (unless otherwise indicated, for every kind of national protection available): AE, AG, AL, AM, AO, AT, AU, AZ, BA, BB, BG, BH, BR, BW, BY, BZ, CA, CH, CL, CN, CO, CR, CU, CZ, DE, DK, DM, DO, DZ, EC, EE, EG, ES, FI, GB, GD, GE, GH, GM, GT, HN, HR, HU, ID, IL, IN, IS, JP, KE, KG, KM, KN, KP, KR, KZ, LA, LC, LK, LR, LS, LT, LU, LY, MA, MD, ME, MG, MK, MN, MW, MX, MY, MZ, NA, NG, NI, NO, NZ, OM, PE, PG, PH, PL, PT, RO, RS, RU, SC, SD, SE, SG, SK, SL, SM, ST, SV, SY, TJ, TM, TN, TR, TT, TZ, UA, UG, US, UZ, VC, VN, ZA, ZM, ZW.

(84) Designated States (unless otherwise indicated, for every kind of regional protection available): ARIPO (BW, GH, GM, KE, LS, MW, MZ, NA, SD, SL, SZ, TZ, UG, ZM, ZW), Eurasian (AM, AZ, BY, KG, KZ, MD, RU, TJ, TM), European (AT, BE, BG, CH, CY, CZ, DE, DK, EE, ES, FI, FR, GB, GR, HR, HU, IE, IS, IT, LT, LU, LV, MC, MK, MT, NL, NO, PL, PT, RO, SE, SI, SK, SM, TR), OAPI (BF, BJ, CF, CG, CI, CM, GA, GN, GQ, GW, ML, MR, NE, SN, TD, TG).

Published: with international search report (Art. 21(3))

(54) Title: SYSTEM AND APPARATUS FOR EYEGLASS APPLIANCE PLATFORM

[Cover sheet figure: Figure 2]

(57) Abstract: The present invention relates to a personal multimedia electronic device, and more particularly to a head-worn device such as an eyeglass frame having a plurality of interactive electrical/optical components. In one embodiment, a personal multimedia electronic device includes an eyeglass frame having a side arm and an optic frame; an output device for delivering an output to the wearer; an input device for obtaining an input; and a processor comprising a set of programming instructions for controlling the input device and the output device. The output device is supported by the eyeglass frame and is selected from the group consisting of a speaker, a bone conduction transmitter, an image projector, and a tactile actuator. The input device is supported by the eyeglass frame and is selected from the group consisting of an audio sensor, a tactile sensor, a bone conduction sensor, an image sensor, a body sensor, an environmental sensor, a global positioning system receiver, and an eye tracker. In one embodiment, the processor applies a user interface logic that determines a state of the eyeglass device and determines the output in response to the input and the state.
`
`
`SYSTEM AND APPARATUS FOR EYEGLASS APPLIANCE PLATFORM
`
CROSS-REFERENCE TO RELATED APPLICATION(S)

[0001] This application claims priority to PCT Application Nos. PCT/US2009/002174, entitled “Proximal Image Projection System,” filed April 6, 2009 and PCT/US2009/002182, entitled “Proximal Image Projection System,” filed April 6, 2009, the entire contents of which are incorporated by reference herein.

[0002] This application claims priority to and the benefit of U.S. Provisional Application Nos. 61/110,591, entitled “Foveated Spectacle Projection Without Moving Parts,” filed November 2, 2008; 61/142,347, entitled “Directed Viewing Waveguide Systems,” filed January 3, 2009; 61/169,708, entitled “Holographic Combiner Production Systems,” filed April 15, 2009; 61/171,168, entitled “Proximal Optic Curvature Correction System,” filed April 21, 2009; 61/173,700, entitled “Proximal Optic Structures and Steerable Mirror Based Projection Systems Therefore,” filed April 29, 2009; 61/180,101, entitled “Adjustable Proximal Optic Support,” filed May 20, 2009; 61/180,982, entitled “Projection of Images into the Eye Using Proximal Redirectors,” filed May 26, 2009; 61/230,744, entitled “Soft-Launch-Location and Transmissive Proximal Optic Projection Systems,” filed August 3, 2009; and 61/232,426, entitled “Soft-Launch-Location and Transmissive Proximal Optic Projection Systems,” filed August 8, 2009, the entire contents of all of which are incorporated by reference herein.
`
`
FIELD OF THE INVENTION

[0003] The present invention relates to a personal multimedia electronic device, and more particularly to a head-worn device such as an eyeglass frame having a plurality of interactive electrical/optical components.
`
`BACKGROUND OF THE INVENTION
`
`
[0004] Portable electronic devices have become increasingly popular among consumers and are now available for a wide variety of applications. Portable electronic devices include cellular phones, MP3 or other music players, cameras, global positioning system (GPS) receivers, laptop computers, personal digital assistants (such as the iPhone, Blackberry, and others), and others. These devices have enabled consumers to access, store, and share electronic information while away from a desktop computer. Consumers are able to send emails and text messages, browse the Internet, take and upload photographs, receive traffic alerts and directions, and use other useful applications while away from the home or office. Additionally, consumers have begun to expect and rely on this mobile capability as these portable electronic devices become more available and affordable.
`
`
[0005] However, the increasing use of these various devices has some disadvantages. First, many people find that they need to carry multiple different devices with them throughout the day in order to have access to all of the applications that they want to use, such as, for example, a compact digital camera, an MP3 player, a cellular phone, and an automotive GPS unit. Each of these different devices has its own operating instructions and operating system, must be properly charged, and may require a particular signal or accessory. Another disadvantage is the distraction of using these devices while driving, as people may drive recklessly when attempting to locate and use one of these devices. Thus, there is still a need for these many different applications and portable electronic devices to be consolidated and made easier to use.
[0006] At the same time that these various applications are being developed and offered
`to consumers, optical imaging systems are improving, complex optical displays are being
`developed, and many electrical/optical components such as sensors, processors, and other
`devices are becoming more capable and more compact. The present invention utilizes these
`new technologies and creates a new portable electronic device that consolidates and
`facilitates many of the capabilities of prior devices.
`
`SUMMARY OF THE INVENTION
`
[0007] The present invention relates to a personal multimedia electronic device, and more particularly to a head-worn device such as an eyeglass frame having a plurality of interactive electrical/optical components. In one embodiment, a personal multimedia electronic device includes an eyeglass frame with electrical/optical components mounted in the eyeglass frame. The electrical/optical components mounted in the eyeglass frame can include input devices such as touch sensors and microphones, which enable the user to input instructions or content to the device. The electrical/optical components can also include output devices such as audio speakers and image projectors, which enable the eyeglass device to display content or provide information to the wearer. The electrical/optical components can also include environmental sensors, such as cameras or other monitors or sensors, and communications devices such as a wireless antenna for transmitting or receiving content (e.g., using Bluetooth) and/or power. Additionally, the electrical/optical components include a computer processor and memory device, which store content and programming instructions. In use, the user inputs instructions to the eyeglass device, such as by touching a touch sensor mounted on the side arm of the eyeglass frame or speaking a command, and the eyeglass device responds with the requested information or content, such as displaying incoming email on the image projector, displaying a map and providing driving instructions via the speaker, taking a photograph with a camera, and/or many other applications.
[0008] In one embodiment, a multimedia eyeglass device includes an eyeglass frame having a side arm and an optic frame; an output device for delivering an output to the wearer;
an input device for obtaining an input; and a processor comprising a set of programming instructions for controlling the input device and the output device. The output device is supported by the eyeglass frame and is selected from the group consisting of a speaker, a bone conduction transmitter, an image projector, and a tactile actuator. The input device is supported by the eyeglass frame and is selected from the group consisting of an audio sensor, a tactile sensor, a bone conduction sensor, an image sensor, a body sensor, an environmental sensor, a global positioning system receiver, and an eye tracker. In one embodiment, the processor applies a user interface logic that determines a state of the eyeglass device and determines the output in response to the input and the state.
[0009] In one embodiment, a head-worn multimedia device includes a frame comprising a side arm and an optic frame; an audio transducer supported by the frame; a tactile sensor supported by the frame; a processor comprising a set of programming instructions for receiving and transmitting information via the audio transducer and the tactile sensor; a memory device for storing such information and instructions; and a power supply electrically coupled to the audio transducer, the tactile sensor, the processor, and the memory device.
[0010] In an embodiment, a method for controlling a multimedia eyeglass device includes providing an eyeglass device. The eyeglass device includes an output device for delivering information to the wearer, the output device being selected from the group consisting of a speaker, a bone conduction transmitter, an image projector, and a tactile actuator; an input device for obtaining information, the input device being selected from the group consisting of an audio sensor, a tactile sensor, a bone conduction sensor, an image sensor, a body sensor, an environmental sensor, a global positioning system receiver, and an eye tracker; and a processor comprising a set of programming instructions for controlling the input device and the output device. The method also includes providing an input by the input device; determining a state of the output device, the input device, and the processor; accessing the programming instructions to select a response based on the input and the state; and providing the response by the output device.
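For illustration only, the following Python sketch (not from the patent) shows one way the control flow of paragraph [0010] could be organized: an input event and the current device state index into a table of programming instructions, which selects the response and the output device that delivers it. All device names, states, and actions here are hypothetical.

    # Hypothetical sketch of the input -> state -> response selection loop.
    from dataclasses import dataclass

    @dataclass
    class Event:
        source: str   # e.g. "touch_sensor", "audio_sensor"
        value: str    # e.g. "double_tap", "read email"

    # Response table: (state, input source, input value) -> (output device, action)
    RESPONSES = {
        ("idle", "touch_sensor", "double_tap"): ("image_projector", "show_menu"),
        ("idle", "audio_sensor", "read email"): ("image_projector", "show_email"),
        ("navigating", "audio_sensor", "next turn"): ("speaker", "speak_direction"),
    }

    def handle_event(state, event):
        """Select a response based on both the current state and the input."""
        return RESPONSES.get((state, event.source, event.value),
                             ("speaker", "error_tone"))

    # Example: a double tap while idle brings up the menu on the projector.
    device, action = handle_event("idle", Event("touch_sensor", "double_tap"))
    print(device, action)  # image_projector show_menu

The key design point, consistent with the flow charts of Figures 11A-11D, is that the same input can produce different outputs depending on the state the device is in.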
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
[0011] Figure 1A is a side elevational view of an electronic eyeglass device according to an embodiment of the invention, in an unfolded position.

[0012] Figure 1B is a side elevational view of a side arm of an eyeglass device according to another embodiment of the invention.

[0013] Figure 1C is a front elevational view of an electronic eyeglass device according to another embodiment of the invention, in an unfolded position.

[0014] Figure 2 is a front view of an electronic eyeglass device according to an embodiment of the invention, in a folded position.

[0015] Figure 3 is a front view of an electronic eyeglass device according to an embodiment of the invention, in a folded position.

[0016] Figure 4 is a front view of an electronic eyeglass device according to an embodiment of the invention, in a folded position.

[0017] Figure 5A is a front view of an electronic eyeglass device according to an embodiment of the invention, in a folded position.

[0018] Figure 5B is a side view of the device of Figure 5A, in an unfolded position.

[0019] Figure 5C is a top view of the device of Figure 5A, in an unfolded position.

[0020] Figure 6A is a partial top view of an electronic eyeglass device according to an embodiment of the invention.

[0021] Figure 6B is a partial front view of the device of Figure 6A.

[0022] Figure 6C is a cross-sectional view of an optic lens according to an embodiment of the invention.

[0023] Figure 6D is a partial front view of an eyeglass device according to another embodiment of the invention.

[0024] Figure 6E is a side view of the eyeglass device of Figure 6D.

[0025] Figure 6F is a partial top view of the eyeglass device of Figure 6D.

[0026] Figure 7A is a partial top view of an electronic eyeglass device according to an embodiment of the invention.

[0027] Figure 7B is a partial top view of an electronic eyeglass device according to another embodiment of the invention.

[0028] Figure 7C is a partial top view of an electronic eyeglass device according to another embodiment of the invention.

[0029] Figure 7D is a partial front view of an electronic eyeglass device according to an embodiment of the invention.

[0030] Figure 8A is a partial side view of a side arm of an electronic eyeglass device according to an embodiment of the invention.

[0031] Figure 8B is a schematic view of a coil according to the embodiment of Figure 8A.

[0032] Figure 8C is a partial side view of the device of Figure 8A with a boot, according to an embodiment of the invention.

[0033] Figure 8D is a cross-sectional view of the device of Figure 8C, taken along the line 8D-8D.

[0034] Figure 8E is a front view of an electronic eyeglass device according to an embodiment of the invention.

[0035] Figure 8F is a top view of a storage case according to an embodiment of the invention.

[0036] Figure 8G is a top view of an electronic eyeglass device according to an embodiment of the invention, with a lanyard.

[0037] Figure 8H is a top view of an electronic eyeglass device according to another embodiment of the invention, with a lanyard.

[0038] Figure 9A is a side view of a side arm of an electronic eyeglass device according to an embodiment of the invention.

[0039] Figure 9B is a side view of an electronic eyeglass device with a replacement side arm, according to an embodiment of the invention.

[0040] Figure 9C is a close-up view of a hinge connection according to the embodiment of Figure 9B.

[0041] Figure 10A is a side view of an attachment unit for an electronic eyeglass device according to an embodiment of the invention.

[0042] Figure 10B is a side view of a traditional eyeglass frame, for use with the attachment unit of Figure 10A.

[0043] Figure 10C is a side view of an attachment unit according to an embodiment of the invention.

[0044] Figure 10D is a cross-sectional view of a side arm and attachment unit according to an embodiment of the invention.

[0045] Figure 11A is a flow chart of a control system according to an embodiment of the invention.

[0046] Figure 11B is a flow chart of a control system according to another embodiment of the invention.

[0047] Figure 11C is a flow chart of a control system according to another embodiment of the invention.

[0048] Figure 11D is a flow chart of a control system according to another embodiment of the invention.

[0049] Figure 12 is a block diagram of various components according to an exemplary embodiment of the invention.

[0050] Figure 13 is a block diagram of a control system according to an exemplary embodiment of the invention.

[0051] Figure 14A is a block diagram of a dual transducer system according to an embodiment of the invention.

[0052] Figure 14B is a block diagram of a dual transducer system according to an embodiment of the invention.

[0053] Figure 15A is a front view of a folded eyeglass frame according to an embodiment of the invention.

[0054] Figure 15B is a side view of an unfolded eyeglass frame according to an embodiment of the invention.

[0055] Figure 15C is a bottom view of an unfolded eyeglass frame according to an embodiment of the invention.

[0056] Figure 16 is a partial horizontal cross-sectional view of an eyeglass frame with a clamp, according to an embodiment of the invention.

[0057] Figure 17A is a partial side view of an adjustable eyeglass frame according to an embodiment of the invention.

[0058] Figure 17B is a partial side view of an adjustable eyeglass frame according to an embodiment of the invention.

[0059] Figure 17C is a partial side view of an adjustable eyeglass frame according to an embodiment of the invention.

[0060] Figure 17D is a partial horizontal cross-sectional view of an adjustable eyeglass frame according to an embodiment of the invention.

[0061] Figure 18A is a partial vertical cross-sectional view of an adjustable eyeglass frame according to an embodiment of the invention.

[0062] Figure 18B is a partial side view of an adjustable eyeglass frame according to an embodiment of the invention.

[0063] Figure 18C is a partial cross-sectional view of the adjustable eyeglass frame of Figure 18A taken along line Y-Y.

[0064] Figure 18D is a partial cross-sectional view of the adjustable eyeglass frame of Figure 18A taken along line Z-Z.
`
`DETAILED DESCRIPTION OF THE INVENTION
`
[0065] The present invention relates to a personal multimedia electronic device, and more particularly to a head-worn device such as an eyeglass frame having a plurality of interactive electrical/optical components. In one embodiment, a personal multimedia electronic device includes an eyeglass frame with electrical/optical components mounted in the eyeglass frame. The electrical/optical components mounted in the eyeglass frame can include input devices such as touch sensors and microphones, which enable the user to input instructions or content to the device. The electrical/optical components can also include output devices such as audio speakers and image projectors, which enable the eyeglass device to display content or provide information to the wearer. The electrical/optical components can also include environmental sensors, such as cameras or other monitors or sensors, and communications devices such as a wireless antenna for transmitting or receiving content (e.g., using Bluetooth) and/or power. Additionally, the electrical/optical components include a computer processor and memory device, which store content and programming instructions. In use, the user inputs instructions to the eyeglass device, such as by touching a touch sensor mounted on the side arm of the eyeglass frame or speaking a command, and the eyeglass device responds with the requested information or content, such as displaying incoming email on the
`image projector, displaying a map and providing driving instructions via the speaker, taking a
`photograph with a camera, and/or many other applications.
[0066] This integrated, electronic eyeglass device consolidates many different functionalities into one compact, efficient, and easy-to-use device. The eyeglass device can be constructed according to different user preferences, so that it includes the electrical/optical components that are necessary for the user's desired applications. Different components such as cameras, projectors, speakers, microphones, temperature sensors, Bluetooth connections, GPS receivers, heart rate monitors, radios, music players, batteries, and other components can be selected as desired to provide applications such as videos, music, email, texting, maps, web browsing, health monitoring, weather updates, phone calls, and others. All of these components and applications can be controlled by the user through touch sensors, audio commands, and other sensors through which the wearer gives instructions to the eyeglass device. The inventor has discovered that this integrated, multimedia head-worn device can be created with advanced optical projections, compact electrical/optical components, and a control system controlling these components.
[0067] An embodiment of the invention is shown in Figures 1A-1C. Figure 1A shows a head-worn electronic device 10 including an eyeglass frame 12. The eyeglass frame 12 includes first and second temples or side arms 14 (only one of which is visible in the side view of Figure 1A) and first and second optic frames 16 (only one of which is visible in the side view of Figure 1A). The optic frame 16 may be referred to in the industry as the "eye" of the eyeglass frame. The side arms 14 are connected to the optic frame 16 by a hinge 29. Each optic frame 16 supports an optic 18 (see Figure 1C), which may be a lens or glass or mirror or other type of reflective or refractive element. The frame 12 also includes a nose bridge 20 which connects the two optic frames 16, and two nose pads 22 that are mounted on the optic frames and that rest on either side of the wearer's nose. The two optic frames 16 and nose bridge 20 make up the front face 17 of the frame 12. Each side arm 14 includes an elbow 24 where the arm curves or bends to form an ear hook 26 which rests behind the wearer's ear.
`
[0068] As shown in Figures 1A-1C, the eyeglass frame 12 includes various electrical and/or optical components 30a, 30b, 30c, etc. supported by the frame 12 and powered by electricity and/or light. The components 30 can be MEMS (microelectromechanical systems). In Figure 1A, the electrical/optical components 30 are supported by the side arm 14. The electrical/optical components 30 may be mounted within the side arm 14, under the top-most layer of the side arm, such as under a top plastic cover layer. Alternatively or in addition, the components 30 may be mounted to the side arm 14 by adhesive, or by printing the electrical/optical components onto a substrate on the side arm 14, or by any other suitable method. The components 30 can be spaced out along the side arm 14 as necessary depending on their size and function. In Figure 1B, electrical/optical components 30 are shown
supported on the wing 28 of the side arm 14', and they may be located as necessary according to their size and function. In Figure 1C, the electrical/optical components 30 are supported by the two optic frames 16 and the nose bridge 20. The necessary conductors 27 such as wires or circuit board traces are integrated into the frame 12 to connect and power the various electrical/optical components 30 at their various locations on the frame. An antenna 25 can also be connected to one or more components 30.
[0069] The components of the frame 12 can take on various sizes and shapes. For
`example, an alternate side arm 14’, shown in Figure 1B, includes a wing 28 that extends down
`below the hinge 29 and increases the area of the side arm 14'. The larger side arm 14' can
`support more electrical/optical components 30 and/or can allow the components 30 to be
`spaced apart. In other embodiments the side arm 14 and/or optic frame 16 may have other
`shapes and sizes, including different diameters, thicknesses, lengths, and curvatures.
[0070] Particular locations on the eyeglass frame 12 have been discovered to be especially advantageous for certain electrical/optical components. A few examples will be discussed. In Figure 2, an embodiment is shown in which an eyeglass frame 212 includes electrical/optical components 232 mounted on the nose pads 222 of the eyeglass frame 212. In one embodiment, the electrical/optical components 232 mounted on the nose pads 222 are bone conduction devices that transmit audio signals to the wearer by vibration transmitted directly to the wearer's skull. Bone conduction devices transmit sound to the wearer's inner ear through the bones of the skull. The bone conduction device includes an electromechanical transducer that converts an electrical signal into mechanical vibration, which is conducted to the ear through the skull. In addition to transmitting sound through vibration to the user, the bone conduction device can also record the user's voice by receiving the vibrations that travel through the wearer's skull from the wearer's voice.
[0071] Thus, in one embodiment, the electrical/optical components 232 include bone conduction transducers that transmit and receive vibrations to transmit and receive sound to and from the wearer. These bone conduction devices may be mounted anywhere on the frame 212 that contacts the wearer's skull, or anywhere that they can transmit vibrations through another element (such as a pad or plate) to the user's skull. In the embodiment of Figure 2, the devices are mounted on the nose pads 222 and directly contact the bone at the base of the wearer's nose. The inventor has discovered that this location works well for transmitting sound to the wearer as well as receiving the vibrations from the wearer's voice. Bone conduction devices operate most effectively when they contact the user with some pressure, so that the vibrations can be transmitted to and from the skull. The nose pads provide some pressure against the bone conduction devices, pressing them against the user's nose, due to the weight of the eyeglass devices sitting on the nose pads. At this location, the bone conduction devices can transmit sound to the user and can pick up the user's voice,
without picking up as much background noise as a standard microphone, since the user's voice is coming directly through the skull.
[0072] The eyeglass frame 212 can transmit sounds such as alerts, directions, or music to the wearer through the electrical/optical components 232 and can also receive instructions and commands from the user through the same electrical/optical components 232. In other embodiments, the electrical/optical components 232 mounted on the nose pads 222 may be devices other than bone conduction devices. For example, in one embodiment these components 232 are standard microphones, used to pick up the user's voice as it is spoken through the air, rather than through the skull. Two components 232 are shown in Figure 2, such as for stereo sound, but in other embodiments only one is provided.
[0073] Turning to Figure 14A, an embodiment of a dual transducer input system is shown in block diagram. Figure 14A shows two input devices 1473a, 1473b. In one embodiment, device 1473a is a bone conduction sensor that detects sound transmitted through the user's skull, and device 1473b is a microphone that detects sound transmitted through the air. The bone conduction sensor 1473a can detect the user's voice, which will transmit through the skull, and the microphone 1473b can detect other types of noises that do not transmit well through the skull, such as background noises or other noises made by the user (claps, whistles, hisses, clicks, etc.). Each of these devices passes the signal through an amplifier 1474a, 1474b, as necessary, and then to an analog-to-digital converter 1475a, 1475b. This converter converts the analog signal from the devices 1473 into a digital signal, and then passes it to a digital signal processor ("DSP") 1477. The DSP processes the signal according to program 1478, and optionally stores the signal in a memory device 1476.
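As a rough illustration of this signal chain, the following Python sketch (not from the patent) simulates the two channels being amplified, digitized, and handed to a DSP routine that stores them. The sample rate, gains, bit depth, and signal content are assumptions for illustration only.

    # Hypothetical simulation of Figure 14A's dual-input chain using numpy.
    import numpy as np

    def adc(signal, full_scale=1.0, bits=16):
        """Analog-to-digital converter: clip, then quantize to signed ints."""
        clipped = np.clip(signal / full_scale, -1.0, 1.0)
        return np.round(clipped * (2 ** (bits - 1) - 1)).astype(np.int16)

    # Simulated analog inputs: 1473a (bone conduction) and 1473b (air mic).
    t = np.linspace(0, 1, 8000)
    bone_in = 0.2 * np.sin(2 * np.pi * 150 * t)   # voice energy via the skull
    air_in = 0.05 * np.random.randn(t.size)       # ambient noise via the air

    # Amplifiers 1474a/1474b feed converters 1475a/1475b.
    bone_digital = adc(3.0 * bone_in)
    air_digital = adc(3.0 * air_in)

    memory = {}  # stands in for memory device 1476

    def dsp(bone, air):
        """DSP 1477: here it simply stores both channels for later use."""
        memory["bone"] = bone
        memory["air"] = air

    dsp(bone_digital, air_digital)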
[0074] The DSP can perform various types of digital signal processing according to the particular devices, signals, programming and selected parameters being used. For example, when device 1473a is a bone conduction sensor, the sensor 1473a detects the wearer's voice as it is transmitted through the wearer's skull. However, the user's voice may sound different if it is transmitted through air versus through the skull. For example, a voice may have a different frequency response as heard through the skull than would be picked up by a microphone through the air. Thus, in one embodiment, the DSP adjusts the signal to accommodate this difference. For example, the DSP may adjust the frequency response of the voice, so that the voice will sound as if it had been detected through the air, even though it was actually detected through the skull. The DSP can also combine signals from multiple devices into one output audio stream. For example, the DSP can combine the user's voice as picked up by the bone conduction sensor 1473a with sounds from the environment picked up by the microphone 1473b. The DSP combines these audio signals to produce a combined audio signal.
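One plausible form of such processing, sketched below in Python (not from the patent), is to reshape the spectrum of the bone-conducted voice so it resembles an air-conducted voice and then blend in the microphone channel. The equalization curve, sample rate, and blend level are made-up illustrations, not measured data.

    # Hypothetical equalize-and-mix processing for DSP 1477.
    import numpy as np

    FS = 8000  # sample rate in Hz (assumed)

    def equalize(bone, fs=FS):
        """Boost highs, which the skull attenuates relative to air."""
        spectrum = np.fft.rfft(bone)
        freqs = np.fft.rfftfreq(bone.size, d=1.0 / fs)
        gain = 1.0 + freqs / 2000.0   # illustrative high-frequency tilt
        return np.fft.irfft(spectrum * gain, n=bone.size)

    def combine(bone, air, background_level=0.3, fs=FS):
        """Equalized voice plus an adjustable amount of ambient sound."""
        return equalize(bone, fs) + background_level * air

    # Example: mostly voice, with a little environment blended in behind it.
    t = np.linspace(0, 1, FS)
    bone = 0.2 * np.sin(2 * np.pi * 150 * t)
    air = 0.05 * np.random.randn(FS)
    out = combine(bone, air, background_level=0.2)

The background_level parameter corresponds to the user-adjustable amount of background noise described in the next paragraph.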
[0075] In another embodiment, the DSP combines different aspects of speech from the microphone 1473b and from the bone conduction sensor 1473a. For example, at different
times during a conversation, one of these sensors may pick up better quality sound than the other, or may pick up different components of sound. The DSP merges the two signals, using each one to compensate for the other, and blending them together to enhance the audio signal. As an example, the DSP may blend in some outside or background noise behind the user's voice. In one embodiment, the user can adjust the amount of background noise, turning it up or down.
`
[0076] In another embodiment, the DSP creates a model of the user's speech, built from data collected from the user's voice. The DSP can then process the signals from the two sensors 1473a, 1473b to create an output signal based on the model of the user's speech. As one example of such processing, sounds from the environment can be distinguished as to whether they are from the user's speech or not, and then those from the speech can be used in the process of enhancing the speech. As explained with respect to Figure 14B, a related process can take place in reverse, to provide sounds to the user.
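As one simplified illustration of distinguishing speech from environment (not the patent's method), the bone channel, which carries almost nothing but the wearer's voice, can serve as a voice-activity reference that gates how much of the air channel is kept. The frame length, threshold, and ducking factor below are assumptions.

    # Hypothetical voice-activity gating using the bone channel as reference.
    import numpy as np

    def gate_by_voice_activity(bone, air, frame=256, threshold=0.01):
        """Keep the air channel mostly where the bone channel shows speech."""
        out = np.zeros_like(air)
        for start in range(0, air.size - frame + 1, frame):
            seg = slice(start, start + frame)
            speaking = np.sqrt(np.mean(bone[seg] ** 2)) > threshold  # frame RMS
            out[seg] = air[seg] if speaking else 0.2 * air[seg]      # duck noise
        return out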
[0077] Figure 14B shows a dual transducer output system, for providing output to the wearer. The DSP 1477 creates a digital signal, such as an audio or video signal, based on instructions from the program 1478 and/or content stored in memory 1476. The DSP 1477 may create the signal and store it in the memory 1476. The DSP may divide the signal into two signals, one for sending to output device 1479a and another for sending to output device 1479b. For example, device 1479a can be a bone conduction transducer, and device 1479b can be an audio speaker. In such a case, the DSP divides the audio signal into a first component that is transmitted through the skull by the bone conduction transducer 1479a, and a second component that is transmitted through the air by the speaker 1479b. The signals pass through digital-to-analog converters 1475c, 1475d, and then optionally through amplifiers 1474a, 1474b, and finally to the output devices 1479a, 1479b. The two signals may be related to each other, such that when they are both transmitted by the output devices 1479a, 1479b, the user hears a combined audio signal.
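One way such a division could work, sketched below (not specified in the patent), is a crossover filter: low frequencies go to the bone conduction transducer and the remainder goes to the speaker, so the two components sum back to the original signal. The 1 kHz crossover and filter order are assumptions.

    # Hypothetical crossover split of one audio signal across two outputs.
    import numpy as np
    from scipy.signal import butter, lfilter

    FS = 8000         # sample rate in Hz (assumed)
    CROSSOVER = 1000  # crossover frequency in Hz (assumed)

    def split_for_outputs(audio, fs=FS, fc=CROSSOVER):
        """Return (bone_component, speaker_component) summing to the input."""
        b, a = butter(4, fc / (fs / 2), btype="low")
        low = lfilter(b, a, audio)   # component for bone transducer 1479a
        high = audio - low           # remainder for air speaker 1479b
        return low, high

    t = np.linspace(0, 1, FS)
    audio = np.sin(2 * np.pi * 200 * t) + 0.5 * np.sin(2 * np.pi * 3000 * t)
    bone_out, speaker_out = split_for_outputs(audio)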
[0078] In still another embodiment, where multiple bone conduction transducers are used, such as output device 1479a and input device 1473a, one device may in effect listen to the other, and they may be connected to the same or cooperating DSPs. In other words, the sound sent into the skull by one transducer is picked up by another transducer. The DSP 1477 can then adjust the sound, such as its intensity or frequency response, so that it is transmitted with improved and more consistent results. In some examples users can adjust the frequency response characteristics for various types of listening.
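A minimal sketch of such a closed loop follows (not from the patent): the level actually sensed by the second transducer is compared against a target, and the output gain is nudged toward it each block. The target level, smoothing rate, and simulated transmission loss are illustrative assumptions.

    # Hypothetical feedback adjustment of bone conduction output level.
    import numpy as np

    def update_gain(gain, picked_up, target_rms=0.1, rate=0.2):
        """One feedback step: raise gain if the sensed level is below target."""
        sensed = np.sqrt(np.mean(picked_up ** 2))
        if sensed > 0:
            correction = target_rms / sensed
            gain *= (1 - rate) + rate * correction  # smoothed adjustment
        return gain

    # Each block is played at the current gain, sensed by the second
    # transducer (simulated here as a fixed attenuation), and gain adapts.
    gain = 1.0
    block = 0.05 * np.sin(2 * np.pi * 200 * np.linspace(0, 0.1, 800))
    for _ in range(20):
        played = gain * block
        picked_up = 0.3 * played   # stand-in for skull transmission loss
        gain = update_gain(gain, picked_up)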
[0079] In another example embodiment, the sound picked up from the environment can be what may be called "cancelled" and/or "masked" in effect for the user by being sent in by bone conduction. For instance, low-frequency s