(12) Patent Application Publication (10) Pub. No.: US 2015/0348554 A1
(43) Pub. Date: Dec. 3, 2015
Orr et al.
`
(54) INTELLIGENT ASSISTANT FOR HOME AUTOMATION

(52) U.S. Cl.
CPC .................................... G10L 17/22 (2013.01)
`
`(71) Applicant: Apple Inc., Cupertino, CA (US)
`
`(72) Inventors: Ryan M. Orr, Cupertino, CA (US);
`Garett R. Nell, Redmond, WA (US);
`Benjamin L. Brumbaugh, San Jose, CA
`(US)
(21) Appl. No.: 14/503,105
(22) Filed: Sep. 30, 2014

Related U.S. Application Data

(60) Provisional application No. 62/005,893, filed on May 30, 2014.
`Publication Classification
`
(51) Int. Cl.
G10L 17/22 (2006.01)
`
(57) ABSTRACT
This relates to systems and processes for using a virtual assistant to control electronic devices. In one example process, a user can speak an input in natural language form to a user device to control one or more electronic devices. The user device can transmit the user speech to a server to be converted into a textual representation. The server can identify the one or more electronic devices and appropriate commands to be performed by the one or more electronic devices based on the textual representation. The identified one or more devices and commands to be performed can be transmitted back to the user device, which can forward the commands to the appropriate one or more electronic devices for execution. In response to receiving the commands, the one or more electronic devices can perform the commands and transmit their current states to the user device.
`
[Front-page drawing: System 100 — Server System 110 containing I/O Interface to Client 122, Virtual Assistant Server 114 (Processing Modules 118, Data & Models 120), and I/O Interface to External Services 116; External Services 124; Electronic Devices 128, 130, and 132]
`
`
`
`
[Sheet 1 of 15: drawing not reproduced]
`
`
`
[Sheet 2 of 15: drawing not reproduced]
`
`
`
[Sheet 3 of 15: drawing not reproduced]
`
`
`
[Sheet 4 of 15: drawing not reproduced]
`
`
`
[Sheet 5 of 15: drawing not reproduced]
`
`
`
[Sheet 6 of 15: flowchart drawing (rotated text not legibly reproduced)]
`
`
`
[Sheet 7 of 15: flowchart drawing (rotated text not legibly reproduced)]
`
`
`
[Sheet 8 of 15: drawing not reproduced]
`
`
`
[Sheet 9 of 15: FIG. 9 flowchart of process 900 (rotated text not legibly reproduced)]
`
`
`
`
`
`
`
`
`
`
`
[Sheet 10 of 15: drawing not reproduced]
`
`
`
[Sheet 11 of 15: drawing (rotated text not legibly reproduced)]
`
`
`
[Sheet 12 of 15: drawing (rotated text not legibly reproduced)]
`
`
`
[Sheet 13 of 15: drawing (rotated text not legibly reproduced)]
`
`
`
[Sheet 14 of 15: drawing not reproduced]
`
`
`
[Sheet 15 of 15: drawing (rotated text not legibly reproduced)]
`
`
`
`
INTELLIGENT ASSISTANT FOR HOME AUTOMATION
`
`
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application claims priority from U.S. Provisional Ser. No. 62/005,893, filed on May 30, 2014, entitled "INTELLIGENT ASSISTANT FOR HOME AUTOMATION," which is hereby incorporated by reference in its entirety for all purposes.
`
FIELD
[0002] This relates generally to natural language processing and, more specifically, to the use of a virtual assistant with natural language processing to control electronic devices.
`
BACKGROUND
[0003] Home electronic devices that can be controlled remotely using software applications running on a computing device, such as a mobile phone, tablet computer, laptop computer, desktop computer, or the like, have become increasingly popular. For example, numerous manufacturers create light bulbs that can be controlled by a software application running on a mobile phone to adjust the brightness and/or color of the bulb. Other devices, such as door locks, thermostats, and the like, having similar controls are also available.
[0004] While these devices can provide users with a greater level of control and convenience, it can become exceedingly difficult to manage these devices as the number of remotely controlled devices and the number of types of remotely controlled devices in the home increase. For example, a typical home can include 40-50 light bulbs placed throughout the various rooms of the home. Using conventional software applications, each light bulb is given a unique identifier, and a user attempting to control one of these devices must select the appropriate identifier from a list of available devices within a graphical user interface. Remembering the correct identifier for a particular light bulb and finding that identifier from a list of 40-50 identifiers can be a difficult and time-consuming process. To add to the difficulty of managing and controlling a large number of remotely controlled devices, different manufacturers typically provide different software applications that must be used to control their respective devices. As a result, a user must locate and open one software application to turn on/off their light bulbs, and must then locate and open another software application to set the temperature of their thermostat.
`
SUMMARY
[0005] Systems and processes for using a virtual assistant to control electronic devices are provided. In one example process, a user can speak an input in natural language form to a user device to control one or more electronic devices. The user device can transmit the user speech to a server to be converted into a textual representation. The server can identify the one or more electronic devices and appropriate commands to be performed by the one or more electronic devices based on the textual representation. The identified one or more devices and commands to be performed can be transmitted back to the user device, which can forward the commands to the appropriate one or more electronic devices for execution. In response to receiving the commands, the one or more electronic devices can perform the commands and transmit their current states to the user device.
`
BRIEF DESCRIPTION OF THE DRAWINGS
[0006] FIG. 1 illustrates an exemplary environment in which a virtual assistant can be used to control electronic devices according to various examples.
[0007] FIG. 2 illustrates an exemplary environment in which a virtual assistant can be used to remotely control electronic devices according to various examples.
[0008] FIG. 3 illustrates an exemplary user device according to various examples.
[0009] FIG. 4 shows a visual representation of multiple entries used to store information associated with electronic devices according to various examples.
[0010] FIG. 5 illustrates an exemplary process for controlling electronic devices using a virtual assistant implemented using a client-server model according to various examples.
[0011] FIG. 6 illustrates an exemplary process for remotely controlling electronic devices using a virtual assistant implemented using a client-server model according to various examples.
[0012] FIG. 7 illustrates an exemplary process for controlling electronic devices using a virtual assistant on a standalone user device according to various examples.
[0013] FIG. 8 illustrates an exemplary process for storing the states of electronic devices as a configuration according to various examples.
[0014] FIG. 9 illustrates an exemplary process for setting the states of electronic devices using a previously stored configuration according to various examples.
[0015] FIG. 10 illustrates a functional block diagram of an electronic device configured to control electronic devices according to various examples.
[0016] FIG. 11 illustrates a functional block diagram of an electronic device configured to store the states of electronic devices as a configuration according to various examples.
[0017] FIG. 12 illustrates a functional block diagram of an electronic device configured to set the states of electronic devices based on a stored configuration according to various examples.
[0018] FIG. 13 illustrates a functional block diagram of an electronic device configured to control electronic devices according to various examples.
[0019] FIG. 14 illustrates a functional block diagram of an electronic device configured to store the states of electronic devices as a configuration according to various examples.
[0020] FIG. 15 illustrates a functional block diagram of an electronic device configured to set the states of electronic devices based on a stored configuration according to various examples.
`
DETAILED DESCRIPTION
[0021] In the following description of examples, reference is made to the accompanying drawings in which it is shown by way of illustration specific examples that can be practiced. It is to be understood that other examples can be used and structural changes can be made without departing from the scope of the various examples.
[0022] Intelligent automated assistants (or virtual assistants) provide an intuitive interface between users and electronic devices. These assistants can allow users to interact with devices or systems using natural language in spoken and/or text forms. For example, a user can access the services of an electronic device by providing a spoken user input in natural language form to a virtual assistant associated with the electronic device. The virtual assistant can perform natural language processing on the spoken user input to infer the user's intent and operationalize the user's intent into tasks. The tasks can then be performed by executing one or more functions of the electronic device, and a relevant output can be returned to the user in natural language form.
[0023] This relates to systems and processes for using a virtual assistant to control electronic devices. In one example process, a user can speak an input in natural language form to a user device to control one or more electronic devices. The user device can transmit the user speech to a server to be converted into a textual representation. The server can identify the one or more electronic devices and appropriate commands to be performed by the one or more electronic devices based on the textual representation. The identified one or more devices and commands to be performed can be transmitted back to the user device, which can forward the commands to the appropriate one or more electronic devices for execution. In response to receiving the commands, the one or more electronic devices can perform the commands and transmit their current states to the user device.
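The example process above (speech transmitted for text conversion, device and command identification on the server, commands forwarded for execution, and states reported back) can be sketched as a few Python functions. Every name here — speech_to_text, identify_commands, Lamp, handle_utterance — and the simple keyword matching are illustrative stand-ins, not the implementation disclosed in this application:

```python
# A minimal sketch of the utterance-to-command flow, assuming a trivial
# in-memory device registry and keyword-based command identification.

def speech_to_text(audio):
    # Stand-in for the server's speech-to-text conversion.
    return audio.lower().strip()

def identify_commands(text, device_registry):
    """Match device names appearing in the textual representation
    and pick a command for each matched device."""
    commands = []
    for name in device_registry:
        if name in text:
            action = "on" if "on" in text.split() else "off"
            commands.append((name, action))
    return commands

class Lamp:
    """Toy electronic device with a binary state."""
    def __init__(self):
        self.state = "off"
    def perform(self, action):
        self.state = action   # execute the received command
        return self.state     # report current state back

def handle_utterance(audio, registry):
    text = speech_to_text(audio)                  # server: convert speech
    commands = identify_commands(text, registry)  # server: devices + commands
    states = {}
    for name, action in commands:                 # user device forwards commands
        states[name] = registry[name].perform(action)
    return states                                 # devices report their states

registry = {"hallway light": Lamp()}
print(handle_utterance("Turn ON the hallway light", registry))
# → {'hallway light': 'on'}
```

A real system would replace the keyword matching with the natural language processing described in the surrounding paragraphs.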
`
System Overview
[0024] FIG. 1 illustrates exemplary system 100 for implementing a virtual assistant to control electronic devices according to various examples. The terms "virtual assistant," "digital assistant," "intelligent automated assistant," or "automatic digital assistant" can refer to any information processing system that interprets natural language input in spoken and/or textual form to infer user intent, and performs actions based on the inferred user intent. For example, to act on an inferred user intent, the system can perform one or more of the following: identifying a task flow with steps and parameters designed to accomplish the inferred user intent; inputting specific requirements from the inferred user intent into the task flow; executing the task flow by invoking programs, methods, services, APIs, or the like; and generating output responses to the user in an audible (e.g., speech) and/or visual form.
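The four actions just listed — identify a task flow, fill in its parameters, execute it, and generate a response — can be illustrated with a toy dispatch table. The intent names and handlers below are invented for this sketch and are not part of the disclosed system:

```python
# Illustrative task-flow dispatch: each intent maps to a callable that,
# given parameters extracted from the user input, performs the task and
# returns the user-facing output.

TASK_FLOWS = {
    "set_reminder": lambda p: f"Reminder set for {p['time']}: {p['subject']}",
    "answer_location": lambda p: f"You are in {p['place']}",
}

def act_on_intent(intent, params):
    flow = TASK_FLOWS[intent]  # 1. identify the task flow for the intent
    return flow(params)        # 2-3. input parameters and execute the flow
                               # 4. the returned string is the output response

print(act_on_intent("set_reminder", {"time": "4 p.m.", "subject": "call Mom"}))
# → Reminder set for 4 p.m.: call Mom
```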
[0025] A virtual assistant can be capable of accepting a user request at least partially in the form of a natural language command, request, statement, narrative, and/or inquiry. Typically, the user request seeks either an informational answer or performance of a task by the virtual assistant. A satisfactory response to the user request can include provision of the requested informational answer, performance of the requested task, or a combination of the two. For example, a user can ask the virtual assistant a question, such as "Where am I right now?" Based on the user's current location, the virtual assistant can answer, "You are in Central Park." The user can also request the performance of a task, for example, "Please remind me to call Mom at 4 p.m. today." In response, the virtual assistant can acknowledge the request and then create an appropriate reminder item in the user's electronic schedule. During the performance of a requested task, the virtual assistant can sometimes interact with the user in a continuous dialogue involving multiple exchanges of information over an extended period of time. There are numerous other ways of interacting with a virtual assistant to request information or performance of various tasks. In addition to providing verbal responses and taking programmed actions, the virtual assistant can also provide responses in other visual or audio forms (e.g., as text, alerts, music, videos, animations, etc.).
[0026] An example of a virtual assistant is described in Applicants' U.S. Utility application Ser. No. 12/987,982 for "Intelligent Automated Assistant," filed Jan. 10, 2011, the entire disclosure of which is incorporated herein by reference.
[0027] As shown in FIG. 1, in some examples, a virtual assistant can be implemented according to a client-server model. The virtual assistant can include a client-side portion executed on a user device 102, and a server-side portion executed on a server system 110. User device 102 can include any electronic device, such as a mobile phone, tablet computer, portable media player, desktop computer, laptop computer, PDA, television, television set-top box, wearable electronic device, or the like, and can communicate with server system 110 through one or more networks 108, which can include the Internet, an intranet, or any other wired or wireless public or private network. The client-side portion executed on user device 102 can provide client-side functionalities, such as user-facing input and output processing and communications with server system 110. Server system 110 can provide server-side functionalities for any number of clients residing on a respective user device 102.
[0028] Server system 110 can include one or more virtual assistant servers 114 that can include a client-facing I/O interface 122, one or more processing modules 118, data and model storage 120, and an I/O interface to external services 116. The client-facing I/O interface 122 can facilitate the client-facing input and output processing for virtual assistant server 114. The one or more processing modules 118 can utilize data and model storage 120 to determine the user's intent based on natural language input, and perform task execution based on inferred user intent. Additionally, data and model storage 120 can store a unique identifier, a state, a type, a location, and any other relevant information associated with one or more of the electronic devices (e.g., electronic devices 128, 130, and 132) capable of being controlled by user device 102 and/or server system 110. In some examples, virtual assistant server 114 can communicate with external services 124, such as telephony services, calendar services, information services, messaging services, navigation services, and the like, through network(s) 108 for task completion or information acquisition. The I/O interface to external services 116 can facilitate such communications.
[0029] Server system 110 can be implemented on one or more standalone data processing devices or a distributed network of computers. In some examples, server system 110 can employ various virtual devices and/or services of third party service providers (e.g., third-party cloud service providers) to provide the underlying computing resources and/or infrastructure resources of server system 110.
[0030] User device 102 can be further coupled to electronic devices 128, 130, and 132 via one or more networks 126. Electronic devices 128, 130, and 132 can include any type of remotely controlled electronic device, such as a light bulb (e.g., having a binary ON/OFF state, numerical dimmable state, color state, etc.), garage door (e.g., having a binary OPEN/CLOSED state), door lock (e.g., having a binary LOCKED/UNLOCKED state), thermostat (e.g., having one or more numerical temperature states, such as a high temperature, low temperature, time-based temperatures, etc.), electrical outlet (e.g., having a binary ON/OFF state), switch (e.g., having a binary ON/OFF state), or the like. Network(s) 126 can include a WiFi network or any other wired or wireless public or private local network. Additionally or alternatively, user device 102 can be coupled to communicate directly with electronic devices 128, 130, or 132 using, for example, Bluetooth, BTLE, line of sight, peer-to-peer, or another radio-based or other wireless communication. Thus, in the illustrated example, user device 102 can be located near electronic devices 128, 130, and 132, such that it can communicate with them directly or over the same local network. For example, user device 102 and electronic devices 128, 130, and 132 can be located within the same home or building, and network(s) 126 can include the home or building's WiFi network. As discussed in greater detail below with respect to FIGS. 5, 8, and 9, user device 102 can issue commands to control any of electronic devices 128, 130, and 132 in response to a natural language spoken input provided by a user to user device 102.
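The binary and numerical state models just described can be sketched as small classes. These classes and their names are illustrative only, assuming a clamped numerical range for the thermostat; they are not an API from the disclosure:

```python
# Sketch of per-device state models: binary states (ON/OFF, OPEN/CLOSED,
# LOCKED/UNLOCKED) versus numerical states (thermostat temperatures).

class BinaryDevice:
    """Device whose state is one of a fixed pair of values."""
    def __init__(self, states=("OFF", "ON")):
        self.states = states
        self.state = states[0]
    def set_state(self, state):
        if state not in self.states:
            raise ValueError(f"unsupported state: {state}")
        self.state = state

class Thermostat:
    """Device with a numerical state, clamped to a configured range."""
    def __init__(self, low=60, high=80):
        self.low, self.high = low, high
        self.temperature = low
    def set_state(self, temperature):
        # clamp the requested temperature into [low, high]
        self.temperature = max(self.low, min(self.high, temperature))

bulb = BinaryDevice()                               # ON/OFF light bulb
bulb.set_state("ON")
lock = BinaryDevice(states=("LOCKED", "UNLOCKED"))  # door lock
thermostat = Thermostat()
thermostat.set_state(72)
```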
[0031] While only three electronic devices 128, 130, and 132 are shown, it should be appreciated that system 100 can include any number of electronic devices. Additionally, although the functionality of the virtual assistant is shown in FIG. 1 as including both a client-side portion and a server-side portion, in some examples, the functions of the assistant can be implemented as a standalone application installed on a user device. Moreover, the division of functionalities between the client and server portions of the virtual assistant can vary in different examples. For instance, in some examples, the client executed on user device 102 can be a thin-client that provides only user-facing input and output processing functions, and delegates all other functionalities of the virtual assistant to a backend server.
[0032] FIG. 2 illustrates another exemplary system 200 for implementing a virtual assistant to remotely control electronic devices according to various examples. Similar to system 100, system 200 can include user device 102, server system 110, and external services 124 communicatively coupled together by network(s) 108. However, in contrast to system 100, user device 102 may not be coupled to electronic devices 128, 130, and 132. Instead, system 200 can include a second user device 134 coupled to communicate with user device 102 and/or server system 110 via network(s) 108 and coupled to communicate with electronic devices 128, 130, and 132 via network(s) 126. This configuration can represent a situation in which the user and user device 102 are located remotely from electronic devices 128, 130, and 132 (e.g., the user and user device 102 are at the user's office, while electronic devices 128, 130, and 132 are at the user's home).
[0033] Second user device 134 can include any type of electronic device, such as a mobile phone, tablet computer, portable media player, desktop computer, laptop computer, PDA, television, television set-top box, wearable electronic device, or the like, and can be configured to receive commands from user device 102 and/or server system 110 and to issue commands to electronic devices 128, 130, and 132. As discussed in greater detail below with respect to FIG. 6, second user device 134 can issue commands to control any of electronic devices 128, 130, and 132 in response to a natural language spoken input provided by a user to user device 102.
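The relaying role described for the second user device — receive a command from the remote user device or server, forward it to a local appliance, and return the appliance's reported state — can be sketched as follows. The class and method names are hypothetical stand-ins, not part of the disclosure:

```python
# Minimal relay sketch: a device on the home network forwards commands
# it receives to the target appliance and returns the reported state.

class GarageDoor:
    """Toy appliance with a binary OPEN/CLOSED state."""
    def __init__(self):
        self.state = "CLOSED"
    def execute(self, command):
        self.state = command
        return self.state  # report the resulting state

class RelayDevice:
    """Plays the role of a second user device on the local network."""
    def __init__(self, local_devices):
        self.local_devices = local_devices
    def receive(self, device_id, command):
        # forward the command and return the device's reported state
        return self.local_devices[device_id].execute(command)

relay = RelayDevice({"garage": GarageDoor()})
print(relay.receive("garage", "OPEN"))
# → OPEN
```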
`
User Device
[0034] FIG. 3 is a block diagram of a user device 102 (or second user device 134) according to various examples. As shown, user device 102 can include a memory interface 302, one or more processors 304, and a peripherals interface 306. The various components in user device 102 can be coupled together by one or more communication buses or signal lines. User device 102 can further include various sensors, subsystems, and peripheral devices that are coupled to the peripherals interface 306. The sensors, subsystems, and peripheral devices gather information and/or facilitate various functionalities of user device 102.
[0035] For example, user device 102 can include a motion sensor 310, a light sensor 312, and a proximity sensor 314 coupled to peripherals interface 306 to facilitate orientation, light, and proximity sensing functions. One or more other sensors 316, such as a positioning system (e.g., a GPS receiver), a temperature sensor, a biometric sensor, a gyroscope, a compass, an accelerometer, and the like, are also connected to peripherals interface 306 to facilitate related functionalities.
[0036] In some examples, a camera subsystem 320 and an optical sensor 322 can be utilized to facilitate camera functions, such as taking photographs and recording video clips. Communication functions can be facilitated through one or more wired and/or wireless communication subsystems 324, which can include various communication ports, radio frequency receivers and transmitters, and/or optical (e.g., infrared) receivers and transmitters. An audio subsystem 326 can be coupled to speakers 328 and a microphone 330 to facilitate voice-enabled functions, such as voice recognition, voice replication, digital recording, and telephony functions.
[0037] In some examples, user device 102 can further include an I/O subsystem 340 coupled to peripherals interface 306. I/O subsystem 340 can include a touch screen controller 342 and/or other input controller(s) 344. Touch screen controller 342 can be coupled to a touch screen 346. Touch screen 346 and the touch screen controller 342 can, for example, detect contact and movement or break thereof using any of a plurality of touch sensitivity technologies, such as capacitive, resistive, infrared, and surface acoustic wave technologies, proximity sensor arrays, and the like. Other input controller(s) 344 can be coupled to other input/control devices 348, such as one or more buttons, rocker switches, a thumb-wheel, an infrared port, a USB port, and/or a pointer device such as a stylus.
[0038] In some examples, user device 102 can further include a memory interface 302 coupled to memory 350. Memory 350 can include any electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, a portable computer diskette (magnetic), a random access memory (RAM) (magnetic), a read-only memory (ROM) (magnetic), an erasable programmable read-only memory (EPROM) (magnetic), a portable optical disc such as CD, CD-R, CD-RW, DVD, DVD-R, or DVD-RW, or flash memory such as compact flash cards, secured digital cards, USB memory devices, memory sticks, and the like. In some examples, a non-transitory computer-readable storage medium of memory 350 can be used to store instructions (e.g., for performing some or all of processes 500, 600, 700, 800, or 900, described below) for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device, and execute the instructions. In other examples, the instructions (e.g., for performing processes 500, 600, 700, 800, or 900, described below) can be stored on a non-transitory computer-readable storage medium of server system 110, or can be divided between the non-transitory computer-readable storage medium of memory 350 and the non-transitory computer-readable storage medium of server system 110. In the context of this document, a "non-transitory computer-readable storage medium" can be any medium that can contain or store the program for use by or in connection with the instruction execution system, apparatus, or device.
[0039] In some examples, the memory 350 can store an operating system 352, a communication module 354, a graphical user interface module 356, a sensor processing module 358, a phone module 360, and applications 362. Operating system 352 can include instructions for handling basic system services and for performing hardware dependent tasks. Communication module 354 can facilitate communicating with one or more additional devices, one or more computers, and/or one or more servers. Graphical user interface module 356 can facilitate graphic user interface processing. Sensor processing module 358 can facilitate sensor-related processing and functions. Phone module 360 can facilitate phone-related processes and functions. Application module 362 can facilitate various functionalities of user applications, such as electronic-messaging, web browsing, media processing, navigation, imaging, and/or other processes and functions.
[0040] Memory 350 can also store client-side virtual assistant instructions (e.g., in a virtual assistant client module 364) and various user data 366 (e.g., user-specific vocabulary data, preference data, and/or other data, such as the user's electronic address book, to-do lists, shopping lists, etc.) to provide the client-side functionalities of the virtual assistant.
[0041] In various examples, virtual assistant client module 364 can be capable of accepting voice input (e.g., speech input), text input, touch input, and/or gestural input through various user interfaces (e.g., I/O subsystem 340, audio subsystem 326, or the like) of user device 102. Virtual assistant client module 364 can also be capable of providing output in audio (e.g., speech output), visual, and/or tactile forms. For example, output can be provided as voice, sound, alerts, text messages, menus, graphics, videos, animations, vibrations, and/or combinations of two or more of the above. During operation, virtual assistant client module 364 can communicate with the virtual assistant server using communication subsystem 324.
[0042] In some examples, virtual assistant client module 364 can utilize the various sensors, subsystems, and peripheral devices to gather additional information from the surrounding environment of user device 102 to establish a context associated with a user, the current user interaction, and/or the current user input. In some examples, virtual assistant client module 364 can provide the contextual information or a subset thereof with the user input to the virtual assistant server to help infer the user's intent. The virtual assistant can also use the contextual information to determine how to prepare and deliver outputs to the user.
[0043] In some examples, the contextual information that accompanies the user input can include sensor information, such as lighting, ambient noise, ambient temperature, images or videos of the surrounding environment, distance to another object, and the like. The contextual information can further include information associated with the physical state of user device 102 (e.g., device orientation, device location, device temperature, power level, speed, acceleration, motion patterns, cellular signal strength, etc.) or the software state of user device 102 (e.g., running processes, installed programs, past and present network activities, background services, error logs, resources usage, etc.). Any of these types of contextual information can be provided to the virtual assistant server 114 as contextual information associated with a user input.
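A context payload of the kind just described — sensor readings, physical device state, and software state bundled alongside a user input — might be structured as a simple nested mapping. The field names below are invented for illustration and do not come from the disclosure:

```python
# Sketch of a contextual-information payload accompanying a user input.

def build_context(sensors, physical, software):
    """Bundle the three categories of contextual information."""
    return {
        "sensor": sensors,     # e.g. lighting, ambient noise, temperature
        "physical": physical,  # e.g. orientation, location, power level
        "software": software,  # e.g. running processes, network activity
    }

context = build_context(
    sensors={"ambient_light_lux": 120, "ambient_noise_db": 35},
    physical={"orientation": "portrait", "battery_pct": 80},
    software={"foreground_app": "assistant"},
)
```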
[0044] In some examples, virtual assistant client module 364 can selectively provide information (e.g., user data 366) stored on user device 102 in response to requests from the virtual assistant server 114. Virtual assistant client module 364 can also elicit additional input from the user via a natural language dialogue or other user interfaces upon request by virtual assistant server 114. Virtual assistant client module 364 can pass the additional input to virtual assistant server 114 to help virtual assistant server 114 in intent inference and/or fulfillment of the user's intent expressed in the user request.
[0045] Memory 350 can further store electronic device data 370 that can include a unique identifier, a state, a type, a location, and any other relevant information associated with one or more of the electronic devices capable of being controlled by user device 102 and/or server system 110 (e.g., electronic devices 128, 130, and 132). FIG. 4 shows a visual representation of entries that can be stored in electronic device data 370 for seven different electronic devices. As shown, each entry includes a unique name, type, and state of the electronic device. Data and model storage 120 of virtual assistant server 114 can include similar or identical entries for the electronic devices that can be maintained separately from that of electronic device data 370 of memory 350.
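Entries of the kind FIG. 4 depicts — a unique name paired with a type and a current state — can be sketched as a mapping. The sample device names and states below are invented for illustration; they are not the entries shown in the figure:

```python
# Sketch of electronic device data entries keyed by unique device name,
# each holding a type and a current state (binary or numerical).

device_data = {
    "kitchen light": {"type": "light bulb", "state": "OFF"},
    "front door":    {"type": "door lock",  "state": "LOCKED"},
    "thermostat":    {"type": "thermostat", "state": 68},
}

def update_state(data, name, state):
    """Record a device's newly reported state in the local entries.
    (A daemon could also mirror this update to the server's copy.)"""
    data[name]["state"] = state
    return data[name]

print(update_state(device_data, "kitchen light", "ON"))
# → {'type': 'light bulb', 'state': 'ON'}
```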
[0046] Referring back to FIG. 3, memory 350 can further include instructions (e.g., in daemon module 368) for creating and updating entries for electronic devices in electronic device data 370, communicating with the electronic devices of system 100, and for communicating with server system 110. For example, to add an electronic device to system 100, a software application associated with the electronic device can communicate with processor(s) 304 executing daemon module 368 to provide user device 102 with a unique name, type, state, location, and the like, of the electronic device. The software application can allow the user to enter the unique name in any desired manner. For example, a dropdown box with common names and/or a freeform text field can be provided in the application to allow a user to name a particular device. The type, state, and/or location of the electronic device can be predetermined or determined by the software application through communication with the electronic device. Processor(s) 304 executing daemon module 368 can store this information as an entry in electronic device data 370 and can also transmit this information to server system 110 to be stored in data and models storage 120. Additionally, when executed by processor(s) 304, daemon module 368 can receive commands that are to be provided to electronic devices 128, 130, and 132 from server system 110 via