`
(19) World Intellectual Property Organization
International Bureau

(10) International Publication Number: WO 2008/053945 A1
(43) International Publication Date: 8 May 2008 (08.05.2008)
`
`
`(51) International Patent Classification:
G01C 21/36 (2006.01)
G01C 21/34 (2006.01)
`
(74) Agent: KAWAI, Makoto; thono Bldg, 7-10, Kandamitoshirocho, Chiyoda-ku, Tokyo 101-0053 (JP).
`
`(21) International Application Number:
`PCT/JP2007/071272
`
`(22) International Filing Date: 25 October 2007 (25.10.2007)
`
(25) Filing Language: English

(26) Publication Language: English
`
(30) Priority Data:
2006-296660    31 October 2006 (31.10.2006)    JP
`
`(71) Applicant (for all designated States except US): AISIN
`AW CO., LTD. [JP/JP]; 10, Takane, Fujii-cho, Anjo-shi,
Aichi 444-1192 (JP).
`
(81) Designated States (unless otherwise indicated, for every kind of national protection available): AE, AG, AL, AM, AT, AU, AZ, BA, BB, BG, BH, BR, BW, BY, BZ, CA, CH, CN, CO, CR, CU, CZ, DE, DK, DM, DO, DZ, EC, EE, EG, ES, FI, GB, GD, GE, GH, GM, GT, HN, HR, HU, ID, IL, IN, IS, KE, KG, KM, KN, KP, KR, KZ, LA, LC, LK, LR, LS, LT, LU, LY, MA, MD, ME, MG, MK, MN, MW, MX, MY, MZ, NA, NG, NI, NO, NZ, OM, PG, PH, PL, PT, RO, RS, RU, SC, SD, SE, SG, SK, SL, SM, SV, SY, TJ, TM, TN, TR, TT, TZ, UA, UG, US, UZ, VC, VN, ZA, ZM, ZW.
`
`(84)
`
`Designated States (unless otherwise indicated, for every
`kind f regionalprotection available): ARIPO (BW, GH,
`GM, KE, LS, MW, MZ, NA, SD, SL, SZ, TZ, UG, ZM,
`ZW), Eurasian (AM, AZ, BY, KG, KZ, MD, RU, TJ, TM),
`European (AT,BE, BG, CH, CY, CZ, DE, DK, EE, ES, FI,
`FR, GB, GR, HU, IE, IS, IT, LT, LU, LV,MC, MT, NL, PL,
`PT, Ro, SE, SI, SK, TR), OAPI 03F,BJ, CF, CG, CI, CM,
`GA, GN, GQ, GW, ML, MR, NE, SN, TD, TG).
(72) Inventors; and
(75) Inventors/Applicants (for US only): OONISHI, Shino [JP/JP]; c/o AISIN AW CO., LTD., 6-18, Harayama, Oka-cho, Okazaki-shi, Aichi, 444-8564 (JP). NAKAYAMA, Takaaki [JP/JP]; c/o AISIN AW CO., LTD., 6-18, Harayama, Oka-cho, Okazaki-shi, Aichi, 444-8564 (JP).

Published: with international search report; before the expiration of the time limit for amending the claims and to be republished in the event of receipt of amendments
`
`
`(54) Title: ROUTE GUIDANCE SYSTEM AND PROGRAM
`
`
[Front-page drawing: lane list Ls11 showing lanes m1 to m8, with a bridge and lane k13 indicated]
`
(57) Abstract: It is possible to display a lane guide map that takes into consideration connections between each of the lanes (m1 to m8) while preventing a reduction in the visibility of the lane guide map. The invention includes a current position detecting unit; a lane list setting processing means that produces a lane list (Ls11) that takes into consideration connections between each of the lanes for lane groups (Lk11 to Lk13) in road links that are present in a lane list display section; a rendering range determination processing means that determines whether or not the number of lanes in the lane list (Ls11) is larger than the number of lanes that has been set in a display unit (35); and a display range adjustment processing means that selects predetermined lanes in the lane list (Ls11) and displays only the selected lanes. Lanes that have a low display necessity can be removed.
`
`
`
`
`DESCRIPTION
`
`ROUTE GUIDANCE SYSTEM AND PROGRAM
`
`TECHNICAL FIELD
`
`The present invention relates to route guidance systems and programs.
`
`BACKGROUND ART
`
`10
`
`15
`
`20
`
`25
`
`30
`
`Conventionally, in navigation apparatuses, when a driver inputs a destination
`
`and sets search conditions, route search processing is carried out based on the search
`
`conditions, and based on map data, a route from an origin, which is indicated by the
`
`position of the guided vehicle, to a destination, is retrieved.
`
`In addition, the route that
`
`has been retrieved, that is, the retrieved route, is displayed along with the position of
`
`the guided vehicle on a map screen that is formed in a display unit, and guidance
`
`about the retrieved route, that is, route guidance, is carried out. Thereby, it is possible
`
`for the driver to drive the vehicle along the displayed retrieved route.
`
`However, when passing through an intersection on the retrieved route, in the
`
`case in which the road that forms the retrieved route has a plurality of lanes, a lane
`
`list, which serves as a lane guide map, is displayed in a predetermined screen that is
`
`formed in the display unit (refer, for example, to Patent Document 1).
`
`FIG. 1 is a drawing that shows an example of a display of a conventional lane
`
`list, and FIG. 2 is a drawing that shows an example of a display of a lane list that
`
`takes into account the connections between each of the lanes.
`
In this case, a section within a range that is a predetermined distance ahead of the position of the guided vehicle is set as a lane list display section, and a lane list Ls1 is formed for each of the intersections in the lane list display section that have a traffic signal. Reference numeral r1 denotes a road, Lk1 denotes the lane group that contains the road link from the position of the guided vehicle to the "Umedashinmichi" intersection, Lk2 denotes the lane group that contains the road link from the "Umedashinmichi" intersection to the "Oebashi Minamizume" intersection, Lk3 denotes the lane group that contains the road link from the "Oebashi Minamizume" intersection to the "Yodoyabashi Kitazume" intersection, and Lk4 denotes the lane group that contains the road link from the "Yodoyabashi Kitazume" intersection to the "Yodoyabashi" intersection.
`
Lane group Lk1 includes lanes k1 to k5, lane group Lk2 includes lanes k11 to k18, lane group Lk3 includes lanes k21 to k28, and lane group Lk4 includes lanes k31 to k38, and in each of the lane groups Lk1 to Lk4, the lanes k2, k11, k21, and k31 are displayed as recommended lanes, in which travel is recommended, for each road link.

In addition, in the lane list Ls1, traffic sections that indicate the forward direction are determined for each of the lanes k1 to k5, k11 to k18, k21 to k28, and k31 to k38, and for each traffic section, arrows are appended that show the traffic direction, that is, the exit direction, at each of the intersections.

Specifically, the lanes k1 and k31 are left turn lanes, and arrows are appended that show that these lanes are left turn lanes. The lanes k2, k11, and k32 are left turn and through traffic lanes, and arrows are appended that show that these lanes are left turn and through traffic lanes. The lanes k3, k4, k12 to k16, k21 to k28, and k33 to k36 are through traffic lanes, and arrows are appended that show that these lanes are through traffic lanes. Lanes k5 and k37 are right turn and through traffic lanes, and arrows are appended that show that these lanes are right turn and through traffic lanes. Lanes k17, k18, and k38 are right turn lanes, and arrows are appended that show that these lanes are right turn lanes.
`
In addition, because lanes k2, k11, k21, and k31 are recommended lanes,
`
`they are shown by making the color of the background that surrounds the arrows
`
`different so that they can be distinguished from the other lanes.
`
`Therefore, on the retrieved route, in the case in which a left turn is made at
`
`the "Yodoyabashi" intersection after the vehicle has traveled forward along the road
`
r1 up to the "Yodoyabashi" intersection, lane guidance is carried out so that the vehicle will pass through, in order, lanes k2, k11, k21, and k31. Note that the "Yodoyabashi" intersection is on the retrieved route and is a guided intersection at which guidance about the vehicle turning left, right, or travelling forward, is provided.
`
However, as shown in FIG. 2, in the actual road r1, because lane k2 and lane k14 are connected, lane k11 and lane k23 are connected, and lane k21 and lane k31 are connected, when the guided vehicle attempts to travel along lanes k2, k11, k21, and k31, which are the recommended lanes, the guided vehicle travels along lane k2 in the road link from the position of the guided vehicle to the "Umedashinmichi" intersection, enters lane k14 at the "Umedashinmichi" intersection, moves from lane k14 to lane k11 in the road link between the "Umedashinmichi" intersection and the "Oebashi Minamizume" intersection, enters lane k23 at the "Oebashi Minamizume" intersection, moves from lane k23 to k21 in the road link between the "Oebashi Minamizume" intersection and the "Yodoyabashi Kitazume" intersection, enters the lane k31 at the "Yodoyabashi Kitazume" intersection, travels along lane k31 in the road link between the "Yodoyabashi Kitazume" intersection and the "Yodoyabashi" intersection, and turns left at the "Yodoyabashi" intersection.
`
`Thus, it is possible to consider displaying the lane list by taking into
`
consideration the connections between each of the lanes between the lane groups Lk1 to Lk4 for each of the road links such that it is possible for the driver to recognize whether it is necessary to move between lanes in each of the road links, and how the
`
`movement between lanes in each of the road links needs to be carried out, and such
`
`that, at each of the intersections such as those that are shown in FIG. 2, the road links
`
`at which an intersection is entered, that is, an entrance road, and a road at which an
`
`intersection is exited, that is, an exit road, are connected.
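To make the idea of a lane list that preserves inter-lane connections concrete, the following sketch (an illustration only, not part of the patent disclosure) models lanes that record which lane of the next lane group they feed into, so that a chain such as k2 to k14, k11 to k23, and k21 to k31 can be represented across road links; all class and field names are assumptions.

```python
# Illustrative sketch only: a lane entry that records its connection to the next lane group.
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class Lane:
    lane_id: str                       # e.g. "k2"
    directions: List[str]              # exit directions at the intersection, e.g. ["left", "straight"]
    recommended: bool = False          # True for the recommended lane of the road link
    connects_to: Optional[str] = None  # lane in the next lane group this lane is connected to

@dataclass
class LaneGroup:
    group_id: str                      # e.g. "Lk1"
    lanes: List[Lane] = field(default_factory=list)

def connected_chain(start_id: str, lanes_by_id: Dict[str, Lane]) -> List[str]:
    """Follow the recorded connections from one lane (e.g. k2 -> k14)."""
    chain, current = [], lanes_by_id.get(start_id)
    while current is not None:
        chain.append(current.lane_id)
        current = lanes_by_id.get(current.connects_to) if current.connects_to else None
    return chain
```

Under these assumptions, a lane list that takes the connections into account would populate connects_to from the lane data of adjacent road links along the retrieved route.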
`
`[Patent Document 1] Japanese Patent Application Publication No. JP-A-2005-214630
`
`10
`
`15
`
`DISCLOSURE OF THE INVENTION
`
`20
`
`[Problems to be Solved by the Invention]
`
`25
`
`30
`
`However, in conventional navigation apparatuses, because of, for example,
`
`restrictions on the dimensions of the display unit, the maximum number of lanes that
`
`can be displayed together is arbitrarily determined, and for example, in the case in
`
`which a lane list such as the one that is shown in FIG. 2 is displayed, there are 10
`
lanes arranged between the leftmost lanes k21 and k31 and the rightmost lane k18 among all of the lanes k1 to k5, k11 to k18, k21 to k28, and k31 to k38, and the size of the lane list exceeds the maximum rendering range of the lane display unit.

Thus, when rendering each of the lanes k1 to k5, k11 to k18, k21 to k28, and k31 to k38, reducing the size, that is, the width in the transverse direction, can be
`
`considered, but in this case, the visibility of the lane list is reduced.
`
`It is an object of the present invention to provide a route guidance system and
`
`programs that solve the problems of the conventional navigation apparatus, can
`
`display lane guide maps that take into consideration the connections between each of
`
`the lanes, and prevent the visibility of the lane guide maps from being reduced.
`
`
`
`
`[Means for Solving the Problem]
`
`In order to solve the problems described above, the route guidance system of
`
`the present invention includes a current position detecting unit that detects the current
`
`position of the vehicle as the position of the guided vehicle; a lane list setting
`
`processing means that produces a lane list that takes into account the connections
`
between each of the lanes for a lane group of a road link that is present in a lane list
`
`display section that has been set ahead of the position of the guided vehicle; a
`
`rendering range determination processing means that determines whether or not the
`
`number of lanes in the lane list is larger than the number of lanes that has been set in
`
`the display unit; and a display range adjustment processing means that selects
`
`predetermined lanes in the lane list and displays only the selected lanes in the case in
`
`which the number of lanes in the lane list is larger than the number of lanes that has
`
`been set in the display unit.
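As a rough illustration of the processing chain just described (not the actual implementation), the sketch below checks whether the lane list is wider than the number of lanes the display unit can render and, if so, keeps only a selected window of lanes; the names and the specific selection rule (keeping lanes around the recommended lane) are assumptions.

```python
from typing import List

def adjust_display_range(lane_ids: List[str],
                         recommended_id: str,
                         max_display_lanes: int) -> List[str]:
    """Rendering range determination + display range adjustment (illustrative sketch).

    If the lane list holds more lanes than the display unit can show, keep a
    window of lanes around the recommended lane and drop lanes with a low
    display necessity at the edges.
    """
    # Rendering range determination: is the lane list wider than the display?
    if len(lane_ids) <= max_display_lanes:
        return lane_ids  # everything fits; display the lane list as-is

    # Display range adjustment: keep a window centred on the recommended lane.
    centre = lane_ids.index(recommended_id)
    start = max(0, min(centre - max_display_lanes // 2,
                       len(lane_ids) - max_display_lanes))
    return lane_ids[start:start + max_display_lanes]

# Example: a 10-lane list shown on a display that fits 6 lanes.
lanes = [f"k{i}" for i in range(21, 29)] + ["k17", "k18"]
print(adjust_display_range(lanes, recommended_id="k21", max_display_lanes=6))
```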
`
`[Effects of the Invention]
`
`10
`
`15
`
`According to the present invention, in the case in which the number of lanes in
`
`a lane list that takes into consideration the connections between each of the lanes is
`
`larger than the number of lanes that has been set in the display unit, predetermined
`
`20
`
`lanes in the lane list are selected, and only the selected lanes are displayed. Thus,
`
`lanes that have a low display necessity can be removed from the display range.
`
`Therefore, it is possible to display reliably the main portion of a lane list that takes
`
`into account the connections between each of the lanes.
`
`In addition, because the size of each of the rendered lanes does not need to be
`
`25
`
`made small, it is possible to prevent the visibility of the lane list from being reduced.
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
`[FIG. 1]
`
`30
`
FIG. 1 is a drawing that shows an example of the display of a conventional
`
`lane list.
`
`[FIG. 2]
`
`FIG. 2 is a drawing that shows an example of the display of a lane list that
`
`takes into account the connections between each of the lanes.
`
`
`
`
`[FIG. 3]
`
`FIG. 3 is a drawing that shows the navigation system in a first embodiment
`
`of the present invention.
`
`[FIG. 4]
`
`FIG. 4 is a flowchart that shows the operation of the lane display processing
`
`means in the first embodiment of the present invention.
`
`[FIG. 5]
`
`FIG. 5 is a drawing that shows an example of a lane list that takes into account
`
`the connections between lanes in the first embodiment of the present invention.
`
`10
`
`[FIG. 6]
`
`FIG. 6 is a first drawing that shows the recommended lane position
`
`determining process method in the first embodiment of the present invention.
`
`[FIG. 7]
`
`FIG. 7 is a second drawing that shows the recommended lane position
`
`15
`
`determining process method in the first embodiment of the present invention.
`
`[FIG. 8]
`
`FIG. 8 is a third drawing that shows the recommended lane position
`
`determining process method in the first embodiment of the present invention.
`
`[FIG. 9]
`
`20
`
`FIG. 9 is a fourth drawing that shows the recommended lane position
`
`determining process method in the first embodiment of the present invention.
`
`[FIG. 10]
`
`FIG. 10 is a fifth drawing that shows the recommended lane position
`
`determining process method in the first embodiment of the present invention.
`
`25
`
`[FIG. 11]
`
`FIG. 11 is a sixth drawing that shows the recommended lane position
`
`determining process method in the first embodiment of the present invention.
`
`[FIG. 12]
`
`FIG. 12 is a seventh drawing that shows the recommended lane position
`
`30
`
`determining process method in the first embodiment of the present invention.
`
`[FIG. 13]
`
FIG. 13 is an eighth drawing that shows the recommended lane position
`
`determining process method in the first embodiment of the present invention.
`
`[FIG. 14]
`
`
`
`
`FIG. 14 is a ninth drawing that shows the recommended lane position
`
`determining process method in the first embodiment of the present invention.
`
`[FIG. 15]
`
`FIG. 15 is a first drawing that shows an example of the display of the lane list
`
`5
`
`in the first embodiment of the present invention.
`
`[FIG. 16]
`
`FIG. 16 is a second drawing that shows an example of the display of the lane
list in the first embodiment of the present invention.
`
`[FIG. 17]
`
`10
`
`FIG. 17 is a flowchart that shows the operation of the lane display processing
`
`means in a second embodiment of the present invention.
`
`[FIG. 18]
`
`FIG. 18 is a drawing that shows an example of a display of the lane list that
`
`takes into account the connections between lanes in the second embodiment of the
`
`1 5
`
`present invention.
`
`[FIG. 19]
`
`FIG. 19 is a first drawing that shows an example of the display of the lane list
`
`in the second embodiment of the present invention.
`
`[FIG. 20]
`
`20
`
`FIG. 20 is a second drawing that shows an example of the display of the lane
`
`list in the second embodiment of the present invention.
`
`[FIG. 21]
`
`FIG. 21 is a third drawing that shows an example of the display of the lane
`
`list in the second embodiment of the present invention.
`
`25
`
`30
`
[Brief Explanation of the Symbols]

10   automatic transmission control unit
14   navigation apparatus
15   GPS sensor
51   information center
63   network
Ls1, Ls11   lane lists
Lk1 - 4, Lk11 - 13   lane groups
`
`
`
`
`BEST MODES FOR CARRYING OUT THE INVENTION
`
`Below, embodiments of the present invention will be explained in detail with
`
`reference to the figures. Note that a navigation apparatus that serves as a route
`
`guidance system will be explained.
`
`FIG. 3 is a drawing that shows the navigation system in a first embodiment of
`
`the present invention.
`
`In the figure, reference numeral 10 denotes an automatic transmission control
`
`unit, and this automatic transmission control unit 10 carries out control of the
`
`10
`
`automatic transmission.
`
`In addition, reference numeral 14 denotes a data terminal, for
`
`example, a navigation apparatus, which is a mounted apparatus that is installed on a
`
`vehicle, 63 denotes a network, and 51 denotes an information center that serves as an
`
`information provider. The navigation system is structured by the automatic
`
`transmission control unit 10, the navigation apparatus 14, the network 63, the
`
`15
`
`information center 51 and the like.
`
`20
`
`25
`
`30
`
`The navigation apparatus 14 is provided with a GPS sensor 15 that serves as a
`
`current position detecting unit that detects the current position of the vehicle as the
`
`position of the guided vehicle; a data recording unit 16 that serves as an information
`
`recording unit in which, in addition to map data, various types of information are
`
`recorded; a navigation processing unit 17 that carries out various types of arithmetic
`
`processing such as navigation processing; a direction sensor 18 that serves as a
`
`direction detecting unit that detects the direction of the vehicle as the direction of the
`
`guided vehicle; an operating unit 34 that serves as a first input unit, and the driver,
`
`who is the operator, makes prescribed inputs by operating the same; a display unit 35
`
`that serves as a first output unit for carrying out various types of display by using
`
`images that are rendered on a screen (not illustrated) and providing notifications to the
`
`driver; an audio input unit 36 that serves as a second input unit, and the driver makes
`
`prescribed inputs by using audio; an audio output unit 37 that serves as a second
`
`output unit for carrying out various types of display by using audio and providing
`
`notifications to the driver; and a communication unit 38 that serves as a transceiving
`
`unit that functions as a communication terminal; and the GPS sensor 15, the data
`
`recording unit 16, the direction sensor 18, the operating unit 34, the display unit 35,
`
`the audio input unit 36, the audio output unit 37, and the communication unit 38 are
`
`connected to the navigation processing unit 17.
`
`
`
`
In addition, the following are connected to the navigation processing unit 17: a
`
`frontward monitoring apparatus 48 that monitors the area in the front of the vehicle; a
`
`back camera (a rearward monitor camera) 49 that serves as an image capturing
`
apparatus that photographs the area to the rear of the vehicle and serves as a rearward
`
`direction monitoring apparatus; an accelerator pedal sensor 42 that serves as an engine
`
`load detecting unit, which detects the operation of the accelerator pedal (not
`
`illustrated) by the driver according to the accelerator pedal open angle; a brake pedal
`
`sensor 43 that serves as a braking detecting unit, which detects the operation of the
`
`brake pedal (not illustrated) by the driver according to the brake pedal depression
`
`amount; and a vehicle speed sensor 44 that serves as a vehicle speed detecting unit
`
that detects the vehicle speed.
`
`Note that the accelerator pedal sensor 42, the brake pedal sensor 43 and the
`
`like structure the operation detecting unit for detecting the operation of the vehicle by
`
`the driver. Specifically, the accelerator pedal sensor 42 structures an acceleration
`
`operation detecting unit that detects an operation in which the driver intends to
`
`accelerate the vehicle, and the brake pedal sensor 43 structures a deceleration
`
`operation detecting unit that detects an operation in which the driver intends to
`
`decelerate the vehicle.
`
The GPS sensor 15 detects the current position of the vehicle on the surface of the earth, along with the time.
`
`The data recording unit 16 is provided with a map database (not illustrated)
`
`that includes map data files, and the map data is recorded in this map database. This
`
`map data includes feature data related to features on the road, in addition to including
`
`intersection data that is related to intersections (branches), node data that is related to
`
`nodes, road data that is related to road links, search data that is prepared for retrieval,
`
`and facility data that is related to facilities.
`
`The features are indicators that are disposed or formed on the road for
`
`providing various types of information for travel and carrying out various types of
`
`guidance for travel to the driver. These indicators include, for example, traffic display
`
`lines, road signs, crosswalks, manholes, signals and the like. The lane marks include,
`
`for example, stop lines for stopping a vehicle, vehicle traffic boundary lines that
`
`divide each of the lanes, section lines that indicate parking spaces, and the like, and
`
`the road signs include traffic section signs that indicate the forward direction in each
`
`of the lanes by using an arrow and guidance signs such as "stop", and the like, that
`
`
`
`
`provide warnings about temporary stopping locations.
`
`In addition, the feature data
`
`includes, for example, position data that indicates the position of each of the features
`
`by using coordinates and image information that shows each of the features by using
`
`images. Note that a temporary stopping location includes an entrance location from a
`
`non-prefectural road to a prefectural road, rail crossings, intersections at which red
`
`signals are flashing, and the like.
`
`In addition, for each of the road links, the road data that is related to lanes
`
`includes lane data, which serves as lane information, consisting, for example, of the
`
`number of lanes, a lane number that is appended to each of the lanes of the road, the
`
`positions of the lanes, and the exit direction at an intersection for each of the lanes.
`
`The data for outputting predetermined information by using the audio output unit 37
`
`is also recorded in the data recording unit 16.
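The lane information described above (number of lanes, a lane number appended to each lane, lane positions, and the exit direction at the intersection for each lane) could be held per road link roughly as follows; this is only a hedged sketch, and the field names and coordinate encoding are not taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class LaneData:
    lane_number: int                   # number appended to the lane within the road link
    position: Tuple[float, float]      # representative coordinates of the lane (assumed encoding)
    exit_directions: List[str]         # exit direction(s) at the intersection, e.g. ["straight"]

@dataclass
class RoadLinkLaneInfo:
    link_id: int
    number_of_lanes: int
    lanes: List[LaneData]

# Example record for a three-lane road link (values are placeholders).
link = RoadLinkLaneInfo(
    link_id=1001,
    number_of_lanes=3,
    lanes=[
        LaneData(1, (135.50, 34.69), ["left", "straight"]),
        LaneData(2, (135.50, 34.69), ["straight"]),
        LaneData(3, (135.50, 34.69), ["right"]),
    ],
)
```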
`
`Furthermore, a statistics database (not illustrated) that includes a statistics data
`
file and a travel history database (not illustrated) that includes a travel history data file
`
`are formed in the data recording unit 16, and the statistical data is recorded in the
`
`statistics data file as cumulative data and travel history data is recorded in the travel
`
`history data file as cumulative data.
`
`Additionally, in order to record various types of data, in addition to being
`
`provided with a disk (not illustrated) such as a hard disk, a CD, a DVD, or an optical
`
disk or the like, the data recording unit 16 is provided with a head (not illustrated) such
`
`as a read/write head for reading and writing the various types of data.
`
`In addition, it is
`
`possible to use a memory card or the like in the data recording unit 16. Note that the
`
`external storage apparatus is structured by the above disk, memory card, and the like.
`
`In the present embodiment, the map database, the statistics database, the travel
`
`history database and the like are produced in the data recording unit 16. However, it
`
`is possible to produce the map database, statistics database, and the travel history
`
`database and the like at the information center 51.
`
`In addition, the navigation processing unit 17 is provided, for example, with a
`
`CPU 31 that serves as a control apparatus for carrying out overall control of the
`
`navigation apparatus 14 and that serves as an arithmetic apparatus; a RAM 32 that is
`
`used as a working memory when the CPU 31 carries out various types of arithmetic
`
`processing; a ROM 33 on which, in addition to control programs, various types of
`
`programs for retrieving a route to a destination, carrying out route guidance, and the
`
`like, are recorded; and a flash memory (not illustrated) and the like that is used in
`
`10
`
`15
`
`20
`
`25
`
`30
`
`
`
`WO 2008/053945
`
`PCT/JP2007/071272
`
`10
`
`order to record the various types of data, programs, and the like. Note that the
`
`internal memory apparatus is structured, for example, by the RAM 32, ROM 33, and
`
`the flash memory.
`
`It is possible to use a keyboard or a mouse or the like (not illustrated) as an
`
`operation unit 34 that is installed separately from the display unit 35.
`
`In addition, as
`
`an operating unit 34, it is possible to use a touch panel that enables the carrying out of
`
`prescribed input operations by touching or clicking an image operating unit such as
`
`various types of keys, switches, or buttons, or the like, that are displayed as an image
`
on the screen that is formed by the display unit 35.
`
`A display is used as the display unit 35, and in the various types of screen that
`
are produced on the display unit 35, the current position of the vehicle can be
`
`displayed as the position of the guided vehicle, the direction of the vehicle can be
`
`displayed as the direction of the guided vehicle, the maps, retrieved routes, the
`
`guidance information along the retrieved routes, traffic information and the like can
`
`be displayed, and the distance to the next intersection in the retrieved route and the
`
`forward direction at the next intersection can be displayed.
`
`In addition, the audio input unit 36 is structured by a microphone or the like
`
`(not illustrated), and can input necessary information by voice. Furthermore, the
`
`audio output unit 37 is provided with a voice synthesizing apparatus (not illustrated)
`
`10
`
`15
`
`20
`
`and a speaker, and the retrieved routes, the guidance information, the traffic
`
`information and the like are output from the audio output unit 37, for example, by a
`
`voice that has been synthesized by the voice synthesizing apparatus.
`
The communication unit 38 is provided with a beacon receiver (not illustrated)
`
`in order to receive various types of information such as the present traffic information,
`
`25
`
`30
`
`general information and the like that has been broadcast from a road traffic
`
`information center and an FM receiver (not illustrated) or the like in order to receive
`
the above information as an FM multiplex broadcast via an FM broadcast station (not
`
`illustrated). Additionally, in addition to data such as the map data, statistics data,
`
`travel history data, and the like that are received from the information center 51, the
`
`communication unit 38 can receive various types of information such as traffic
`
`information, general information, and the like, via the network 63.
`
`Thus, the information center 51 is provided, for example, with a server 53, a
`
`communication unit 57 that is connected to the server 53, and a database (DB) 58 that
`
`serves as an information recording unit, and the server 53 is provided, for example,
`
`
`
`
`with a CPU 54 that serves as a control apparatus and an arithmetic apparatus, a RAM
`
`55, and a ROM 56. In addition, data that is similar to the various types of data that is
`
`stored in the data recording unit 16 is recorded in the database 58.
`
`Note that the navigation system, the navigation processing unit 17, the CPUs
`
31 and 54, the server 53 and the like function as a computer singly or in a
`
`combination of two or more, and carry out arithmetic processing based on the various
`
types of programs, data, and the like. In addition, the recording medium is structured
`
`by the data recording unit 16, RAMs 32 and 55, ROMs 33 and 56, the database 58, a
`
`flash memory, and the like.
`
`In addition, it is possible to use an MPU or the like as the
`
arithmetic apparatus instead of the CPUs 31 and 54.
`
`Next, the basic operation of a navigation system having the structure that has
`
`been described above will be explained.
`
`First, when the operating unit 34 is operated by the driver and the navigation
`
`apparatus 14 is activated, the navigation initiation processing means (not illustrated)
`
`of the CPU 31 carries out navigation initiation processing, and the position of the
`
`guided vehicle that has been detected by the GPS sensor 15 and the direction of the
`
`guided vehicle that has been detected by the direction sensor 18 are read, and the
`
`various types of data are initialized. Next, a matching processing means (not
`
illustrated) in the CPU 31 carries out matching processing, and the position of the
`
`guided vehicle is specified by determining whether the guided vehicle is positioned on
`
`any road links based on the locus of the position of the guided vehicle that has been
`
`read, the contours and arrangement of each of the road links that form the road in the
`
`vicinity of the position of the guided vehicle, and the like.
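As a simplified illustration of such matching processing (an assumption, not the patent's algorithm), one common approach is to snap the detected position to the nearest candidate road link; the geometry below is a minimal sketch with hypothetical data and names.

```python
from typing import List, Tuple

Point = Tuple[float, float]

def _project_onto_segment(p: Point, a: Point, b: Point) -> Tuple[float, Point]:
    """Return the squared distance from p to segment a-b and the closest point."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    t = 0.0 if seg_len2 == 0 else max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
    cx, cy = ax + t * dx, ay + t * dy
    return (px - cx) ** 2 + (py - cy) ** 2, (cx, cy)

def match_to_road_link(position: Point,
                       links: List[Tuple[int, List[Point]]]) -> Tuple[int, Point]:
    """Pick the road link whose geometry is closest to the detected position."""
    best = None
    for link_id, shape in links:
        for a, b in zip(shape, shape[1:]):
            d2, snapped = _project_onto_segment(position, a, b)
            if best is None or d2 < best[0]:
                best = (d2, link_id, snapped)
    return best[1], best[2]
```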
`
`In addition, in the present embodiment, the matching processing means further
`
`specifies the position of the guided vehicle based on the positions of features that are
`
`photographed objects photographed by the back camera 49.
`
`Thereby, an image recognition processing means (not illustrated) in the CPU
`
`31 carries out image recognition processing, the image data is read from the back
`
`camera 49, and the features in the image, which consists of the image data, are
`
`recognized.
`
`In addition, a distance calculation processing means (not illustrated) in
`
`the CPU 31 carries out distance calculation processing, and calculates the actual
`
`distance from the back camera 49 to a feature based on the position of the feature in
`
`an image.
`
`In addition, a guided vehicle position specification processing means in the
`
`matching processing means carries out guided vehicle position specification
`
`10
`
`15
`
`20
`
`25
`
`30
`
`
`
`WO 2008/053945
`
`PCT/JPZOO7/071272
`
`12
`
`processing, reads the distance, acquires the coordinates of the feature by reading the
`
`feature data from the data recording unit 16, and specifies the position of the guided
`
`vehicle based on the coordinates and the distance thereof.
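One way to picture this guided vehicle position specification step (a hedged sketch, not the disclosed implementation) is to offset the known feature coordinates by the measured camera-to-feature distance along the vehicle's direction of travel; the names, the flat-plane geometry, and the heading convention are assumptions.

```python
import math
from typing import Tuple

def specify_vehicle_position(feature_xy: Tuple[float, float],
                             distance_m: float,
                             heading_deg: float,
                             camera_offset_m: float = 0.0) -> Tuple[float, float]:
    """Estimate the vehicle position from a recognized feature behind the vehicle.

    feature_xy      : coordinates of the feature read from the feature data
    distance_m      : distance from the back camera to the feature, from image processing
    heading_deg     : direction of the guided vehicle (0 degrees = +y axis, assumed convention)
    camera_offset_m : distance from the camera to the vehicle reference point (assumed)
    """
    heading = math.radians(heading_deg)
    # The feature lies behind the vehicle, so move forward from it along the heading.
    forward = distance_m + camera_offset_m
    x = feature_xy[0] + forward * math.sin(heading)
    y = feature_xy[1] + forward * math.cos(heading)
    return x, y
```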
`
`In addition, the travel lane specification processing means (not illustrated) in
`
`the CPU 31 carries out travel lane specification processing, and similarly, specifies
`
`the position of the guided vehicle by referring to features that have been recognized
`
`based on the image data, and feature data and lane data that have been read from the
`
`data recording unit 16, and specifies the travel lane along which the vehicle is
`
`travelling based on the position of the specified guided vehicle.
`
`Note that the travel lane specification processing means reads the sensor
`
`output of the terrestrial magnetism sensor (not illustrated), and based on this sensor
`
`output, determines whether or not there are any detected objects that consist of a
`
`ferromagnetic material, such as a manhole, in a predetermined lane on the road, and
`
`can specify the travel lane based on the results of this determination. Furthermore, the
`
`position of the guided vehicle can be detected with high precision by using a high
`
`precision GPS sensor, and based on the detected results, it is possible to determine the
`
`lane in which the vehicle is travelling. In addition, when necessary, it is possible to
`
`specify the travel lane by carrying out image processing on the image data of the lane
`
`marks, and simultaneously, combining the sensor output of the terrestrial magnetism
`
`10
`
`15
`
`20
`
`sensor and the position of the guided vehicle.
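A very rough way to picture travel lane specification (an assumption-laden sketch, not the patent's method) is to compare the lateral position derived from the specified vehicle position and recognized lane marks against the lateral extent of each lane recorded in the lane data.

```python
from typing import List, Optional

def specify_travel_lane(lateral_offset_m: float,
                        lane_widths_m: List[float]) -> Optional[int]:
    """Return the 1-based lane number whose lateral band contains the vehicle.

    lateral_offset_m : offset of the specified vehicle position from the left
                       road edge (e.g. derived from recognized lane marks)
    lane_widths_m    : width of each lane of the road link, from the lane data
    """
    left_edge = 0.0
    for lane_number, width in enumerate(lane_widths_m, start=1):
        if left_edge <= lateral_offset_m < left_edge + width:
            return lane_number
        left_edge += width
    return None  # outside the carriageway; leave the travel lane unspecified

# Example: a vehicle 5.2 m from the left edge of three 3.5 m lanes is in lane 2.
print(specify_travel_lane(5.2, [3.5, 3.5, 3.5]))
```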
`
`Next, the basic information acquisition processing means (not illustrated) in
`
`the CPU 31 carries out basic information acquisition processing, and either reads out
`
`and acquires the map data from the data recording unit 16, or receives and acquires
`
`the map data from the information center 51 via the communication unit 38. Note that
`
`in the case in which the map data is acquired, for example, from the information
`
`center 51, the basic information acquisition processing means downloads the received
`
`map information to flash memory.
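The basic information acquisition step described above (read the map data from the local data recording unit, otherwise receive it from the information center via the communication unit and keep a copy) might look roughly like the following; the function and storage names are illustrative assumptions, not the patent's interfaces.

```python
from typing import Optional

def acquire_map_data(area_id: str, local_store: dict, center_client) -> Optional[bytes]:
    """Basic information acquisition (illustrative sketch).

    Prefer the map data recorded locally; if it is not present, download it
    from the information center and keep a copy (standing in for downloading
    the received map information to flash memory).
    """
    data = local_store.get(area_id)
    if data is not None:
        return data
    data = center_client.download(area_id)    # hypothetical network call via the communication unit
    if data is not None:
        local_store[area_id] = data           # cache the downloaded map information
    return data
```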
`
`In addition, the display processing means (not illustrated) in the CPU 31
`
`carries out display processing, and forms various types of screens in the display unit
`
`35. For example, the map display processing means in the display processing means
`
`carries out map display processing, and forms a map screen in the display of the
`
`display unit 35, dis