(12) Patent Application Publication    (10) Pub. No.: US 2010/0094500 A1
(43) Pub. Date: Apr. 15, 2010
`
(54) TELEMATICS TERMINAL AND METHOD FOR CONTROLLING VEHICLE USING THE SAME
`
(76) Inventor: Seung-Hee JIN, Seoul (KR)

Correspondence Address:
BIRCH STEWART KOLASCH & BIRCH
PO BOX 747
FALLS CHURCH, VA 22040-0747 (US)

(21) Appl. No.: 12/404,148

(22) Filed: Mar. 13, 2009

(30) Foreign Application Priority Data

Oct. 14, 2008 (KR) ........................ 10-2008-0100784

Publication Classification

(51) Int. Cl.
     G06F 7/00 (2006.01)

(52) U.S. Cl. ............................................... 701/29; 701/1
`
(57) ABSTRACT
A device and method for controlling a vehicle with a telematics terminal installed in or on the vehicle. The method includes a) receiving coordinates of a geo-fence area by the telematics terminal; b) determining whether or not the vehicle is located within the geo-fence area or is on a route that will intercept the geo-fence area; and c) if the vehicle is determined to have entered into the geo-fence area, controlling the vehicle to meet predetermined drive requirements previously set for the geo-fence area.
`
[Representative drawing: the flowchart of FIG. 3 (steps S201-S210), reproduced and described below.]
`
`
`
`
`
`
`
[FIG. 1: Block diagram of the telematics terminal 100, showing the wireless communication unit 110 (broadcast receiving module 111, mobile communication module 112, wireless Internet module 113, short-range communication module 114), position-location module 120, A/V input unit 130 (camera 131, microphone 132), user input unit 140, sensing unit 150, output unit 160 (display 161, audio output module 162, alarm 163, haptic module 164), memory 170, interface unit 180, controller 190 (multimedia module 191, air bag controller 192, emergency battery controller 193), and power supply unit 200.]
`
`
`
`
`
[FIG. 2: Flowchart of a vehicle control method (step labels S102-S110 legible): determining whether the vehicle has entered the geo-fence area; if so, notifying the entrance of the vehicle into the geo-fence area and guiding the drive requirements set for the geo-fence area; sensing drive conditions of the vehicle; determining whether the drive conditions of the vehicle conform to the guided drive requirements; if not, guiding control of the vehicle to conform the drive conditions of the vehicle to the guided drive requirements, and controlling the vehicle to conform the drive conditions to the guided drive requirements; END.]
`
`
`
`
`
[FIG. 3: Flowchart of a vehicle control method according to another embodiment: S201 receiving event occurrence-related information from a server; S202 variably setting a geo-fence area by the information received from the server; S203 displaying the set geo-fence area; S204 determining whether the vehicle has entered the geo-fence area; if YES, S205 notifying the entrance of the vehicle into the geo-fence area; S206 guiding the drive requirements set for the geo-fence area; S207 sensing drive conditions of the vehicle; S208 determining whether the drive conditions of the vehicle conform to the guided drive requirements; if NO, S209 guiding control of the vehicle to conform the drive conditions of the vehicle to the guided drive requirements; S210 controlling the vehicle by a control signal received from the server; END.]
`
`
`
`
`
[FIGS. 4A and 4B: Exemplary navigation-screen views of a variably set geo-fence area displayed on a map; legible screen text includes a "WARNING OZONE AREA" label and "100 m".]
`
`
`
`
[FIG. 4C: Exemplary navigation-screen view of a variably set geo-fence area; legible screen text includes a "WARNING OZONE AREA" label.]
`
`
`
`
`
[FIGS. 5A and 5B: Exemplary views of geo-fence areas variably set based on road data; legible screen text includes a "WARNING OZONE AREA" label.]
`
`
`
[FIGS. 5C and 5D: Exemplary navigation-screen views of geo-fence areas variably set based on road data; legible screen text includes "GANGNAM-GU", a set-zone area label, TPEG traffic information, "05:24" and "234 km".]
`
`
`
`
TELEMATICS TERMINAL AND METHOD FOR CONTROLLING VEHICLE USING THE SAME
`
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] Pursuant to 35 U.S.C. § 119(a), this application claims the benefit of an earlier filing date and right of priority to Korean Application No. 10-2008-0100784, filed on Oct. 14, 2008, the contents of which are hereby incorporated by reference herein in their entirety.
`
BACKGROUND OF THE INVENTION
[0002] 1. Field of the Invention
[0003] The present invention relates to a telematics terminal capable of controlling a vehicle, and to a method for controlling a vehicle using the same.
[0004] 2. Description of the Background Art
[0005] The term telematics is a compound of telecommunications and informatics, and is also known as Information and Communications Technology (ICT). More specifically, telematics is the science of sending, receiving and storing information via telecommunication devices.
[0006] More recently, telematics has been applied specifically to the use of Global Positioning System (GPS) technology integrated with computers and mobile communications technology in automotive navigation systems.
[0007] Vehicle telematics may be applied to various fields such as remote diagnostics for vehicles, diagnostics for in-vehicle electric and mechanical components, vehicle controls, communications between a call center and a vehicle or between vehicles equipped with telematics terminals, intelligent transportation systems, and the interface between a user and a vehicle.
[0008] As discovered by the present inventors, telematics may also be used to control moving objects (including vehicles) by using a telematics terminal installed in the vehicle.
`
SUMMARY OF THE INVENTION
[0009] The present invention allows a vehicle within a variably set geo-fence area to be driven under prescribed requirements upon the occurrence of an event, thereby enhancing convenience and safety while driving.
[0010] To achieve these and other advantages and in accordance with the purposes of the present invention, as embodied and broadly described herein, there is provided a method for controlling a vehicle by a telematics terminal. The method includes variably setting a geo-fence area upon the occurrence of an event; determining whether or not the vehicle has entered the geo-fence area; and, if the vehicle is determined to have entered the geo-fence area, controlling the vehicle to meet the drive requirements set for the geo-fence area.
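As a rough illustration of this flow, the sketch below tests whether the vehicle position lies inside a geo-fence and, if so, hands the area's drive requirements to a control callback. It assumes a circular geo-fence given as a center point and a radius, which is only one possible representation; the function and parameter names are invented for this sketch and are not taken from the publication.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def vehicle_in_geofence(vehicle_pos, fence_center, fence_radius_m):
    """Determine whether the vehicle has entered the (circular) geo-fence area."""
    return haversine_m(*vehicle_pos, *fence_center) <= fence_radius_m

def control_for_geofence(vehicle_pos, fence_center, fence_radius_m,
                         drive_requirements, apply_control):
    """If the vehicle is inside the geo-fence area, apply the drive
    requirements previously set for that area via the supplied callback."""
    if vehicle_in_geofence(vehicle_pos, fence_center, fence_radius_m):
        apply_control(drive_requirements)
        return True
    return False

# Example: a vehicle roughly 45 m from the fence center is inside a 200 m fence.
print(vehicle_in_geofence((37.4979, 127.0276), (37.4983, 127.0276), 200.0))
```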
[0011] Here, the events may include at least one of a traffic accident, an increase of traffic volume, a decrease of traffic volume, environmental pollution, a natural disaster, and a weather change.
[0012] The method may also include notifying that the vehicle has entered the geo-fence area.
[0013] The method may also include receiving information related to the event occurrence from a server, and variably setting the geo-fence area by using the received information related to the event occurrence.
[0014] The method may also include receiving information related to a geo-fence area variably set by a server, and setting the geo-fence area from the received information.
[0015] The method may also include variably setting the geo-fence area by considering information on the roads on which the vehicle can be driven if the event has occurred.
[0016] The method may also include notifying the drive requirements set for the geo-fence area.
[0017] The method may also include sensing drive conditions of the vehicle; determining whether or not the sensed drive conditions of the vehicle meet the notified drive requirements; and, if the drive conditions do not meet the drive requirements, controlling the vehicle such that the drive conditions of the vehicle meet the drive requirements.
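A minimal sketch of this sense-compare-control loop follows, assuming the drive conditions and requirements are plain dictionaries and that sensing, guiding and actuation are supplied as callbacks. The field names (speed_kph, windows_open, and so on) are illustrative assumptions only; the guide step reflects the related guiding feature described in the next paragraph.

```python
# Illustrative conformance check for two example requirements; the field
# names are assumptions for this sketch, not terms from the publication.
def conditions_meet_requirements(conditions, requirements):
    """Return True when every stated requirement is satisfied by the sensed conditions."""
    if "max_speed_kph" in requirements and conditions["speed_kph"] > requirements["max_speed_kph"]:
        return False
    if requirements.get("windows_closed") and conditions["windows_open"]:
        return False
    return True

def enforce_requirements(sense, guide, control, requirements):
    """Sense the drive conditions, compare them with the requirements and,
    on a mismatch, first guide the driver and then control the vehicle."""
    conditions = sense()
    if not conditions_meet_requirements(conditions, requirements):
        guide(requirements)
        control(conditions, requirements)

# Example with stubbed callbacks:
enforce_requirements(
    sense=lambda: {"speed_kph": 72.0, "windows_open": True},
    guide=lambda req: print("Please meet:", req),
    control=lambda cond, req: print("Controlling vehicle toward", req),
    requirements={"max_speed_kph": 60.0, "windows_closed": True},
)
```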
[0018] The method may also include, if the drive conditions of the vehicle do not meet the drive requirements, guiding a control of the vehicle such that the drive conditions of the vehicle meet the drive requirements.
[0019] The drive requirements may relate to at least one of an opened or closed state of a window of the vehicle, a maximum or minimum speed of the vehicle, a distance between the vehicle and an object positioned in front of or behind it, a lit status of a lamp of the vehicle, a drive gear of the vehicle, a drive type of the vehicle, the opening/closing or locking of a door of the vehicle, and driving on or detouring around a specific road.
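The enumeration in [0019] suggests a simple container type for such requirements. The sketch below is one assumed encoding, with field names invented for illustration; an unset field simply means the geo-fence imposes no requirement of that kind.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class DriveRequirements:
    """Illustrative container for the drive requirements listed in [0019]."""
    windows_closed: Optional[bool] = None              # opened/closed state of the windows
    max_speed_kph: Optional[float] = None              # maximum speed inside the area
    min_speed_kph: Optional[float] = None              # minimum speed inside the area
    min_following_distance_m: Optional[float] = None   # distance to the object in front or behind
    headlamps_on: Optional[bool] = None                # lit status of a lamp
    required_gear: Optional[str] = None                # drive gear, e.g. a low gear
    doors_locked: Optional[bool] = None                # locking of the doors
    detour_road_ids: Optional[Tuple[str, ...]] = None  # roads to detour around

# Example: an environmental-pollution event might require closed windows
# and a reduced maximum speed inside the geo-fence area.
ozone_requirements = DriveRequirements(windows_closed=True, max_speed_kph=60.0)
```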
[0020] The method may also include controlling the vehicle by a control signal received from a server.
[0021] The method may also include displaying the set geo-fence area.
[0022] To achieve these and other advantages and in accordance with the purposes of the present invention, as embodied and broadly described herein, there is also provided a telematics terminal. The telematics terminal includes a position-location module configured to recognize the location of a vehicle; a wireless communication unit configured to receive information related to an event occurrence from a server; and a controller configured to variably set a geo-fence area based on the event occurrence, determine whether or not the vehicle has entered the geo-fence area, and control the vehicle to meet the drive requirements set for the geo-fence area.
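Structurally, the terminal of [0022] can be pictured as a controller wired to a position-location module, a communication unit and an output unit. The sketch below is one assumed arrangement; the module interfaces (distance_to_m, notify) and the layout of the event dictionary are invented for illustration, not taken from the publication.

```python
class GeoFenceController:
    """Assumed controller-level wiring for the terminal described in [0022]."""

    def __init__(self, position_module, comm_unit, output_unit):
        self.position_module = position_module  # recognizes the vehicle location
        self.comm_unit = comm_unit              # receives event information from a server
        self.output_unit = output_unit          # notifies the driver
        self.fence = None                       # (center, radius_m, requirements) once set

    def on_event(self, event):
        """Variably set a geo-fence area based on a received event occurrence."""
        self.fence = (event["center"], event["radius_m"], event["requirements"])
        self.output_unit.notify("Geo-fence area set for event: " + event["kind"])

    def tick(self, apply_control):
        """Periodic step: determine entry into the geo-fence area and control the vehicle."""
        if self.fence is None:
            return
        center, radius_m, requirements = self.fence
        if self.position_module.distance_to_m(center) <= radius_m:
            self.output_unit.notify("Vehicle has entered the geo-fence area")
            apply_control(requirements)
```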
[0023] Here, the events may include at least one of a traffic accident, an increase of traffic volume, a decrease of traffic volume, environmental pollution, a natural disaster, and a weather change.
[0024] The telematics terminal may also include an output unit configured to notify that the vehicle has entered the geo-fence area.
[0025] The controller can variably set the geo-fence area by considering information on the roads on which the vehicle can be driven if the event has occurred.
[0026] The telematics terminal may also include a controller configured to notify the drive requirements set for the geo-fence area.
[0027] The telematics terminal may also include a sensing means configured to sense drive conditions of the vehicle, wherein, if the drive conditions of the vehicle sensed by the sensing means do not meet the guided drive requirements, the controller performs a control of the vehicle such that the drive conditions meet the drive requirements.
[0028] The foregoing and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.
`
`
`
`
`
BRIEF DESCRIPTION OF THE DRAWINGS
[0029] The above and other aspects, features, and advantages of the present invention will become more apparent upon consideration of the following description of preferred embodiments, taken in conjunction with the accompanying drawing figures.
[0030] In the drawings:
[0031] FIG. 1 is a schematic block diagram of a telematics terminal according to one embodiment of the present invention;
[0032] FIG. 2 is a flowchart showing a method for controlling a vehicle using a telematics terminal according to one embodiment of the present invention;
[0033] FIG. 3 is a flowchart showing a method for controlling a vehicle using a telematics terminal according to another embodiment of the present invention;
[0034] FIGS. 4A, 4B and 4C are exemplary views each showing a state in which geo-fence areas are variably set by a telematics terminal according to another embodiment of the present invention; and
[0035] FIGS. 5A, 5B, 5C and 5D are exemplary views each showing a state in which geo-fence areas are variably set by a telematics terminal based on road data serving as a drive reference, according to another embodiment of the present invention.
`
DETAILED DESCRIPTION OF THE INVENTION
[0036] FIG. 1 is a block diagram showing an exemplary telematics terminal according to one embodiment of the present invention, configured to execute one or more of the methods described below. For the various methods described below, the telematics terminal may be composed of more or fewer components than those shown in FIG. 1.
[0037] The telematics terminal 100 includes a wireless communication unit 110, a position-location module 120, an audio/video (A/V) input unit 130, a user input unit 140, a sensing unit 150, an output unit 160, a memory 170, an interface unit 180, a controller 190, a power supply unit 200, and so on.
[0038] Hereinafter, these components will be explained in more detail.
[0039] The wireless communication unit 110 may include one or more modules configured to enable wireless communication between the telematics terminal 100 and a wireless communication system, or between the telematics terminal 100 and a network in which the telematics terminal 100 is located. For instance, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and so on.
[0040] The broadcast receiving module 111 may be configured to receive broadcasting signals and/or broadcasting-related information from an external broadcasting management server through broadcasting channels.
[0041] The broadcasting channels may include satellite channels and terrestrial channels. The broadcasting management server may be a server that generates and transmits broadcasting signals and/or broadcasting-related information, or a server that receives previously generated broadcasting signals and/or broadcasting-related information and transmits them to the telematics terminal 100. The broadcasting signals may include not only TV, radio and data broadcasting signals, but also broadcasting signals in which data broadcasting signals are coupled to TV or radio broadcasting signals.
[0042] The broadcasting-related information may be information relating to broadcasting channels, broadcasting programs or a broadcasting service provider. The broadcasting-related information may be provided through a mobile communication network, in which case it may be received by the mobile communication module 112.
[0043] The broadcasting-related information may be implemented in various forms, such as an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB), or an Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).
[0044] The broadcast receiving module 111 may receive digital broadcasting signals by using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). The broadcast receiving module 111 may also be configured to be adapted not only to the aforementioned digital broadcasting systems, but also to any other broadcasting system.
[0045] Broadcasting signals and/or broadcasting-related information received through the broadcast receiving module 111 may be stored in the memory 170.
[0046] The mobile communication module 112 transmits wireless signals to, and receives wireless signals from, at least one of a base station, an external terminal, and a server on a mobile communication network. The wireless signals may include voice call signals, video call signals, or various types of data according to the transmission and reception of text or multimedia messages.
[0047] The wireless Internet module 113 is a module for wireless Internet access, and may be internally or externally mounted to the telematics terminal 100. Wireless Internet techniques may include Wireless LAN (Wi-Fi), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and so on.
[0048] The short-range communication module 114 is a module for short-range communication. Short-range communication techniques may include Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and so on.
[0049] The position-location module 120 is a module for obtaining the position of the telematics terminal 100; a Global Positioning System (GPS) module is a representative example.
[0050] The GPS module receives signals from one or more GPS satellites. With three or more satellites, the GPS module applies a triangulation method to the calculated distances, thereby obtaining position information. The GPS module may further apply map matching, dead reckoning, and the like to the position information obtained by the triangulation method, thereby enhancing the precision of the calculated position information.
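For intuition, the flat-plane sketch below shows how a position can be recovered from ranges to three known points, which is the idea behind the triangulation (trilateration) step mentioned in [0050]. An actual GPS fix is computed in three dimensions from pseudoranges and also solves for the receiver clock bias, so this two-dimensional example is only illustrative.

```python
# Minimal 2D trilateration example: subtracting the circle equations pairwise
# yields two linear equations A.[x, y] = b, which are solved directly.

def trilaterate_2d(p1, r1, p2, r2, p3, r3):
    """Return (x, y) from three anchor points p_i = (x_i, y_i) and ranges r_i."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x2), 2 * (y3 - y2)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a11 * a22 - a12 * a21
    if abs(det) < 1e-12:
        raise ValueError("anchor points are collinear")
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y

# Example: anchors at (0,0), (10,0), (0,10) with ranges measured to the point (3, 4).
print(trilaterate_2d((0, 0), 5.0, (10, 0), 65 ** 0.5, (0, 10), 45 ** 0.5))
```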
[0051] The position-location module 120 may obtain position information of the telematics terminal 100 by using not only the GPS module, but also various other techniques such as cell tower signals, wireless Internet signals, and a Bluetooth sensor. Such combined techniques are referred to as a hybrid positioning system.
`
`
`
`
`
[0052] Referring to FIG. 1, the A/V input unit 130 serves to input audio or video signals, and may include a camera 131, a microphone 132, and so on. The camera 131 processes image frames, such as still pictures or video obtained by an image sensor, in a capturing mode. The processed image frames may then be displayed on the display 161.
[0053] The image frames processed by the camera 131 may be stored in the memory 170, or may be transmitted to the outside through the wireless communication unit 110. Two or more cameras 131 may be provided according to the usage environment.
[0054] Further, the microphone 132 receives an external audio signal while the device is in a particular mode, such as a phone call mode, a recording mode or a voice recognition mode. The received audio signal is then processed and converted into digital data. The microphone 132 may also include assorted noise-removing algorithms to remove noise generated in the course of receiving the external audio signal.
[0055] The user input unit 140 generates input data responsive to a user's manipulations of the telematics terminal. The user input unit 140 may be implemented as a keypad, a dome switch, a touch pad (e.g., static pressure/capacitance), a jog wheel or a jog switch. The user input unit 140 may also be implemented as a steering wheel, an acceleration pedal, a brake pedal, a gear shift of a vehicle, and so on.
[0056] The sensing unit 150 may be configured to sense a current status of the vehicle or of the telematics terminal 100, such as the presence or absence of user contact with the telematics terminal 100, the opening or closing of a vehicle door or window, whether or not a passenger has fastened a safety belt, the manipulated statuses of a steering wheel, an acceleration pedal, a brake pedal, a gear shift, and the like, the temperature inside or outside the vehicle, the presence or absence and the degree of a crash of the vehicle with an object, the distance between the vehicle and an object, the status of components mounted to the vehicle, the lit status or brightness of a lamp mounted inside or outside the vehicle, and whether or not a passenger is seated. The sensing unit 150 then generates a sensing signal for controlling an operation of the telematics terminal 100 or the vehicle. For instance, the sensing unit 150 may sense the opened status of a vehicle door, or a user's seated status by using the pressure applied to a seat. The sensing unit 150 may also sense whether power has been supplied from the power supply unit 200, or whether the interface unit 180 has been coupled to an external device or a vehicle component. The sensing unit 150 may include a proximity sensor 151.
[0057] The output unit 160 serves to generate video, audio, or tactile outputs, and may include the display 161, an audio output module 162, an alarm 163, a haptic module 164, etc.
[0058] The display 161 displays information processed by the telematics terminal 100. For instance, when the telematics terminal 100 is in a route guidance mode, the display 161 displays a User Interface (UI) or Graphic User Interface (GUI) relating to the route guidance. When the telematics terminal 100 is in a video call mode or an image capturing mode, the display 161 displays captured or received images, or the corresponding UI or GUI.
[0059] The display 161 may include at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor-Liquid Crystal Display (TFT-LCD), an Organic Light-Emitting Diode (OLED) display, a flexible display, and a 3D display.
[0060] Some of the above displays may be configured as transparent or transmissive displays. These may be referred to as transparent displays, a representative example being the Transparent OLED (TOLED).
[0061] The display 161 may be implemented as a Head-Up Display (HUD). The display 161 may be mounted to a front glass of the vehicle, or to a door window; here, the display 161 may be of a transparent or transmissive type.
[0062] Two or more displays 161 may be provided according to the configuration of the telematics terminal 100.
[0063] When the display 161 and a sensor for sensing a touch operation (hereinafter referred to as a touch sensor) are layered with each other, the display 161 may serve as an input device as well as an output device. The touch sensor may be implemented as a touch film, a touch sheet, a touch pad, and so on.
[0064] The touch sensor may be configured to convert changes of pressure applied to a specific portion of the display 161, or changes of capacitance occurring at a specific portion of the display 161, into electric input signals. The touch sensor may be configured to sense not only a touch position and a touch area, but also a touch pressure.
[0065] Once touch inputs are sensed by the touch sensor, corresponding signals are transmitted to a touch controller. The touch controller processes the signals and then transmits the corresponding data to the controller 190. Accordingly, the controller 190 can determine the touched position on the display 161.
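The sensor-to-controller chain in [0064]-[0065] can be sketched as a small pipeline: raw readings are reduced to touch data, which is then handed to the main controller. The grid layout, threshold and callback below are assumptions made only for this illustration and do not describe any particular touch-controller hardware.

```python
# Illustrative pipeline: a grid of capacitance-change readings is reduced to
# (row, col, pressure) touch data and forwarded to a controller callback.

def decode_touch(readings, threshold=0.5):
    """Return the strongest cell above the touch threshold, or None if untouched."""
    best = None
    for r, row in enumerate(readings):
        for c, value in enumerate(row):
            if value >= threshold and (best is None or value > best[2]):
                best = (r, c, value)
    return best

def touch_controller(readings, on_touch):
    """Process raw sensor signals and forward touch data to the main controller."""
    touch = decode_touch(readings)
    if touch is not None:
        on_touch(touch)

# Example: a 3x3 reading grid with a touch near the centre.
touch_controller([[0.0, 0.1, 0.0],
                  [0.2, 0.9, 0.1],
                  [0.0, 0.1, 0.0]],
                 on_touch=lambda t: print("touch at", t))
```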
[0066] Referring to FIG. 1, the proximity sensor 151 may be arranged at an inner region of the telematics terminal covered by the touch screen, or near the touch screen. The proximity sensor is a sensor that senses the presence or absence of an object approaching, or disposed near, a surface to be sensed, by using an electric field or infrared rays without mechanical contact. The proximity sensor has a longer lifespan and a higher degree of utilization than a contact sensor.
[0067] The proximity sensor may include a transmissive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitance proximity sensor, a magnetic proximity sensor, an infrared proximity sensor, and so on. When the touch screen is of a capacitance type, the proximity of a pointer to the touch screen is sensed by changes of an electric field. In this case, the touch screen (touch sensor) may be categorized as a proximity sensor.
[0068] Hereinafter, a status in which the pointer is positioned close to the touch screen without contact will be referred to as a proximity touch, whereas a status in which the pointer substantially comes into contact with the touch screen will be referred to as a contact touch. A pointer in the proximity-touch status is positioned so as to be perpendicular to the touch screen.
[0069] The proximity sensor senses a proximity touch, and proximity touch patterns (e.g., distance, direction, speed, time, position, moving status, etc.). Information relating to the sensed proximity touch and the sensed proximity touch patterns may be output onto the touch screen.
[0070] The audio output module 162 may output audio data received from the wireless communication unit 110 or stored in the memory 170, in a call-receiving mode, a call-placing mode, a recording mode, a voice recognition mode, a broadcast reception mode, a route guidance mode, and so on. The audio output module 162 may output audio signals relating to functions performed in the telematics terminal 100, e.g., a call signal reception sound, a message reception sound, a route guidance voice, and so on. The audio output module 162 may include a receiver, a speaker, a buzzer, and so on.
[0071] The alarm 163 outputs signals notifying the occurrence of events in the telematics terminal 100. Events occurring in the telematics terminal 100 may include call signal reception, message reception, touch input, problems with components mounted to the vehicle, abnormal opening or closing of a vehicle door, window, trunk, hood, and the like (e.g., opening without a key, opening without a pass code, or opening inside or outside a predetermined time), and so on. The alarm 163 may output not only video or audio signals, but also other types of signals, such as signals notifying the occurrence of an event in a vibrating manner. Since the video or audio signals may be output through the display 161 or the audio output module 162, the display 161 and the audio output module 162 may be regarded as parts of the alarm 163.
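As a hedged example of one such event check, the sketch below flags a door opening as abnormal when it happens without the key or pass code, or outside an allowed time window. The field names and the time window are assumptions made for illustration, not details from the publication.

```python
from datetime import time

def is_abnormal_door_opening(event, allowed_from=time(6, 0), allowed_until=time(22, 0)):
    """Return True when a door-open event should trigger the alarm."""
    if not event.get("key_present") and not event.get("pass_code_ok"):
        return True  # opened without a key or pass code
    # otherwise, abnormal only if opened outside the allowed time window
    return not (allowed_from <= event["opened_at"] <= allowed_until)

# Example: opening at 02:30 without a key or pass code triggers the alarm.
print(is_abnormal_door_opening({"key_present": False, "pass_code_ok": False,
                                "opened_at": time(2, 30)}))
```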
[0072] The haptic module 164 generates various tactile effects. A representative example of the tactile effects generated by the haptic module 164 is vibration. Vibration generated by the haptic module 164 may have a controllable intensity, a controllable pattern, and so on. For instance, different vibrations may be output in a synthesized manner or in a sequential manner.
[0073] The haptic module 164 may generate various tactile effects including not only vibration, but also an arrangement of pins moving vertically with respect to the skin surface contacting the haptic module 164, an air injection force or air suction force through an injection hole or a suction hole, a touch on the skin surface, the presence or absence of contact with an electrode, effects from a stimulus such as an electrostatic force, and the reproduction of a cold or hot feeling using a heat-absorbing or heat-emitting device.
[0074] The haptic module 164 may be configured to transmit tactile effects through a user's direct contact, or through a user's muscular sense using a finger or a hand. Two or more haptic modules 164 may be provided according to the configuration of the telematics terminal 100. The haptic module 164 may be provided at a portion that a user frequently contacts; for instance, at a steering wheel, a gear shift, a seat, and so on.
[0075] The memory 170 may store programs for operating the controller 190, and may temporarily store input/output data (e.g., music, still images, moving images, map data, and so on). The memory 170 may store data relating to the various patterns of vibration and sound output when touches are input onto the touch screen.
[0076] The memory 170 may be implemented using any type or combination of suitable memory or storage devices, including a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (e.g., SD or XD memory), random access memory (RAM), static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, a magnetic or optical disk, or another similar memory or data storage device. The telematics terminal 100 may also operate in association with web storage that performs the storage function of the memory 170 over the Internet.
[0077] The interface unit 180 interfaces the telematics terminal 100 with all external devices connected to the telematics terminal 100. The interface unit 180 receives data or power from an external device and transmits it to each component inside the telematics terminal 100, or transmits data from inside the telematics terminal 100 to an external device. The interface unit 180 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module to the telematics terminal 100, an audio Input/Output (I/O) port, a video Input/Output (I/O) port, an earphone port, and so on.
[0078] The interface unit 180 may be implemented in the form of a Controller Area Network (CAN), a Local Interconnect Network (LIN), FlexRay, Media Oriented Systems Transport (MOST), etc.
[0079] A recognition module may be implemented as a chip that stores various kinds of information for identifying an authorization right for the telematics terminal 100, and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and so on. A device having the recognition module (hereinafter referred to as an identification device) may be implemented as a smart card, and accordingly may be connected to the telematics terminal 100 through a port. The identification device may also be implemented in the form of a vehicle key.
[0080] The controller 190 controls the overall operation of the telematics terminal 100. For instance, the controller 190 performs the control and processing relating to data communication, video calls, route guidance, vehicle control, etc. The controller 190 may include a multimedia module 191 configured to play multimedia, an air bag controller 192 configured to control an air bag mounted to the vehicle, an emergency battery controller 193 configured to control an emergency battery mounted to the vehicle, and so on. The multimedia module 191, the air bag controller 192, and the emergency battery controller 193 may be implemented inside the controller 190, or may be implemented separately from the controller 190. The controller 190 may be referred to as a Telematics Control Unit (TCU).
[0081] The controller 190 may perform a pattern recognition process to recognize handwriting inputs or picture inputs on the touch screen as text or images, respectively.
[0082] The power supply unit 200 supplies the power required by the various components under the control of the controller 190. The provided power may be internal power, external power, or a combination thereof.
[0083] The power supply unit 200 may be implemented as a battery mounted to the vehicle, or as a battery independently mounted to the telematics terminal 100.
[0084] In addition, the above various embodiments may be implemented in a computer-readable medium using, for example, computer software, hardware, or some combination thereof.
[0085] For a hardware implementation, the embodiments described above may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digit