I, Noah Oskow, hereby declare under penalty of perjury that the following statements are made based on personal knowledge and are true and correct:

1. I am fluent in both Japanese and English and have worked as a professional translator since 2012.

2. My education experience includes two years at Sophia University in Tokyo and a BA in East Asian Languages and Cultures: Japanese from the University of Kansas.

3. I am a native English speaker.

4. I have been speaking, reading, and writing Japanese for 16 years.

5. I lived and worked in Japan for over six years.

6. My translation experience has covered a wide variety of topics, including many patent translations.

7. I prepared the attached translation, which is, to the best of my knowledge, a true and correct translation of Japanese Patent JP07280583.

I understand that willful false statements and the like are punishable by fine or imprisonment, or both, under 18 U.S.C. Section 1001.

Date: 10/05/2018

________________________
Noah Oskow
Translator

(19) Japanese Patent Office (JP)

(12) Laid-Open Patent Publication (A)

(11) Patent application publication number: Laid-Open No. Hei 7-280583
(43) Publication date: October 27, 1995

(51) Int.Cl.6: G01C 21/00; G08G 1/005; G08G 1/0969; //G01S 5/02; 5/14
Identification symbol / office reference number: Z; Z    FI / technical display place: (blank)

(21) Application number: Japanese Patent Application No. 6-99360
(22) Filing date: April 13, 1994
Request for examination: not yet requested    Number of claims: 2    FD (10 pages in total)

(71) Applicant: 591261509  Ekos Research Co., Ltd., 2-19-12 Sotokanda, Chiyoda-ku, Tokyo
(72) Inventor: Seiichi Suzuki, c/o Ekos Research Co., Ltd., 2-19-12 Sotokanda, Chiyoda-ku, Tokyo
(72) Inventor: Toshihiro Mori, c/o Ekos Research Co., Ltd., 2-19-12 Sotokanda, Chiyoda-ku, Tokyo
(74) Agent: Takashi Kawai, patent attorney (and one other)

(54) [Title of the Invention] Portable navigation device

(57) [Summary]

[Purpose] To provide a portable navigation device which can facilitate direction recognition.

[Constitution] The map drawing unit 28 draws the map drawing data 243, the character data 244, and other data of the map information storage unit 24 on the display 12 as a map screen that coincides with the actual orientation, in accordance with the azimuth of the device detected by the azimuth sensor 40. When the direction of the apparatus coincides with the azimuth of the destination or of the recommended route, the guidance information generating section 20 causes the speaker 13 to output the guidance sound and causes the display 12 to display an arrow indicating the direction of the destination, etc.

【Range of Claims】

【Claim 1】A portable navigation device comprising: guidance object information storage means in which at least one of position information of a destination and route information to the destination is stored as a guidance object; current position detection means for detecting the current position of the apparatus main body; guidance target azimuth judging means for judging the azimuth, with respect to the current position detected by the current position detection means, of the guidance object stored in the guidance object information storage means; main body direction detection means for detecting the azimuth in which the apparatus main body faces; and guidance sound output means for outputting a guidance sound in accordance with the azimuth of the apparatus main body detected by the main body direction detection means and the azimuth of the guidance object judged by the guidance target azimuth judging means.

【Claim 2】A portable navigation device comprising: map information storage means that stores map information; display means that is fixed to the apparatus main body and that outputs image information; and map drawing means for drawing the map information stored in the map information storage means on the display means so as to coincide with the actual orientation, on the basis of the azimuth of the apparatus main body detected by the main body direction detection means.

【DETAILED DESCRIPTION OF THE INVENTION】

【0001】

【Industrial field of application】The present invention relates to a portable navigation device.

【0002】

【Conventional technology】A navigation device shows a person who is unfamiliar with the local geography the current position, the direction of the destination, and the route to the destination, and in recent years portable navigation devices have been developed for people who move about on foot. A portable navigation device draws on its display a map of the area around the current position, a mark indicating the current position, and the route to the destination. Guidance on the current position and on the direction to take toward the destination is also output as voice. For example, the map displayed on the display is drawn with north at the top of the screen. By viewing this map, the carrier (guided person) of the portable navigation device can recognize, relative to the direction in which he or she is actually facing, the direction in which to go.

【0003】

【Problem to be solved by the invention】With a conventional portable navigation device, even when, for example, a building serving as a landmark lies diagonally to the right on the map drawn on the display, it is sometimes not easy to recognize where that building lies in the scene one is actually looking at. In such a case, the guided person has to change the direction he or she is facing and compare the actual buildings and surroundings with the buildings and the like on the map, or has to carry a compass and align the actual orientation with the orientation of the drawn map. In this way, it is sometimes difficult for the guided person to relate the direction of the target and the direction to go to the actual road situation.

【0004】Therefore, an object of the present invention is to provide a portable navigation device which can facilitate direction recognition.

【0005】

【Means for solving the problem】According to a first aspect of the present invention, there is provided a portable navigation device comprising: guidance object information storage means in which at least one of position information of a destination and route information to the destination is stored as a guidance object; current position detection means for detecting the current position of the apparatus main body; guidance target azimuth judging means for judging the azimuth of the guidance object stored in the guidance object information storage means with respect to the current position detected by the current position detection means; main body direction detection means for detecting the azimuth in which the apparatus main body faces; and guidance sound output means for outputting a guidance sound in accordance with the azimuth of the apparatus main body detected by the main body direction detection means and the azimuth of the guidance object judged by the guidance target azimuth judging means. According to a second aspect of the present invention, there is provided a portable navigation device comprising: map information storage means for storing map information; display means fixed to the apparatus main body for outputting image information; and map drawing means for drawing the map information stored in said map information storage means on said display means so as to coincide with the actual orientation, based on the azimuth of said apparatus main body detected by said main body direction detection means.

【0006】

【Action】

In the portable navigation device according to the first aspect, the guidance target azimuth judging means judges the azimuth of the guidance object stored in the guidance object information storage means with respect to the current position detected by the current position detection means. The guidance sound output means then outputs a guidance sound in accordance with the azimuth of the apparatus main body detected by the main body direction detection means and the azimuth of the guidance object. In the portable navigation device according to the second aspect of the invention, the map drawing means draws the map information stored in the map information storage means on the display means so that it coincides with the actual orientation, based on the azimuth of the apparatus main body detected by the main body direction detection means.

【0007】

【Example】Hereinafter, an embodiment of the portable navigation device of the present invention will be described in detail with reference to FIGS. 1 to 12. FIG. 1 shows the appearance of the portable navigation device 10 according to this embodiment. The portable navigation device 10 is light and small enough to be placed on the palm of the hand, and has a display 12 on which guidance information, such as a map of the area around the current position, is displayed, a speaker 13 from which guidance voice and guide sounds such as a "beep" are output, a power switch, and input keys 14 for performing various operations. Guidance voice can also be output from an earphone 16.

【0008】FIG. 2 shows the configuration of the portable navigation device 10. The portable navigation device 10 includes a guidance information generating unit 20 for generating various kinds of guidance information for the carrier of the device, a map information storage unit 24 in which map information is stored, and a current position measuring unit 26. Further, the portable navigation device 10 includes a map drawing section 28 for drawing a map, a data input section 30 for inputting various data via the input keys 14, and a guidance sound signal output section 32 for outputting guidance sound.

【0009】The map information storage unit 24 stores, as map information, road data 241, intersection data 242, map drawing data 243, character data 244, and other data 245 in which photograph information on characteristic points, information on various areas such as hotels and tourist guides, and voice data for voice guidance are stored. The map information storage unit 24 supplies each kind of map information to the guidance information generating unit 20 and the map drawing unit 28. As the storage medium of the map information storage unit 24, for example, a CD-ROM (Read Only Memory), an IC card, a magneto-optical disk, or a magnetic disk is used.

【0010】Here, the road data 241 includes, as data necessary for route guidance, the width of each road, the length of the road, the coordinate position (longitude, latitude) of each point between the starting point and the ending point, and the intersection numbers at the starting point and ending point of the road. The map drawing data 243 is data for drawing rivers and water systems, mountain topography, buildings, routes, roads, etc. on the display 12, and each item of data has absolute coordinates specified by longitude and latitude. The map drawing data 243 is hierarchized by map scale, and the data of the lowest layer is detailed data for drawing a map at a scale of 1/10,000, including narrow alleys, shop names, and the like.

【0011】As shown in FIG. 1, the character data 244 is used for displaying on the map the abbreviated names of features that serve as landmarks, such as department stores and schools, and place names (hereinafter referred to as "place names, etc."). The character data 244 comprises character string data corresponding to each place name, etc. and dictionary data composed of the font data of all characters to be displayed on the map. Each item of character string data includes coordinate data indicating the absolute coordinates (latitude and longitude) of the center point of the character string, that is, the intersection point of the diagonals of the rectangle determined by the height and length of the character string, and code data for specifying the font data (an abbreviation is one character). Each item of character string data is linked to the map drawing data 243 by its coordinate data.

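The structure described in 【0011】 lends itself to a small data-model sketch. The following Python fragment is only an illustration of how character string data linked to the map drawing data by absolute coordinates might be represented; the class and field names are assumptions, not terms from the patent.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class CharacterString:
    """One place-name record, loosely modelled on the character data 244."""
    text: str             # abbreviated place name, e.g. a school or store
    center_lat: float     # absolute latitude of the string's center point
    center_lon: float     # absolute longitude of the string's center point
    font_codes: List[int] # code data selecting glyphs in the font dictionary

# Hypothetical font dictionary: code -> glyph data (placeholder strings here).
font_dictionary: Dict[int, str] = {0x5B66: "glyph-school", 0x5E97: "glyph-store"}

# The coordinate data is what links each string to the map drawing data 243:
# both are expressed in the same absolute (longitude, latitude) frame.
example = CharacterString("School", 35.7023, 139.7745, [0x5B66])
print(example.text, example.center_lat, example.center_lon)
```
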
【0012】The current position measuring unit 26 is connected to the GPS (Global Positioning System) receiver 34, the beacon receiver 36, the distance sensor 38, and the direction sensor 40. The GPS receiver 34 receives radio waves from GPS satellites, and the current position measuring unit 26 calculates the absolute position of the portable navigation device 10 based on the data received by the GPS receiver 34. The beacon receiver 36, on the other hand, receives position information from beacons placed along the road. The azimuth sensor 40 has a geomagnetic sensor 401 that detects the orientation of the portable navigation device 10 by detecting geomagnetism, and a gyro sensor 402, such as a gas rate gyro or a fiber-optic gyro, that detects rotational angular velocity.

【0013】Here, two types of sensors are used because the geomagnetic sensor 401 may detect the magnetism of the portable navigation device 10 itself or the magnetic field of an iron structure such as a bridge, so that its direction detection may be erroneous in some cases. Accordingly, in the azimuth sensor 40, azimuth detection is performed with the gyro sensor 402, which is not normally influenced by external magnetic fields, and the error of the detected value is corrected on the basis of the detection value of the geomagnetic sensor 401, whereby accurate azimuth detection is achieved.

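【0013】 says that the azimuth is taken from the gyro sensor 402 and its error is corrected using the geomagnetic sensor 401, but it gives no formula. The sketch below shows one conventional way such a correction could be realized (a complementary filter); the blending weight `alpha` and the function name are assumptions for illustration only, not the patent's own algorithm.

```python
def fuse_heading(gyro_heading_deg: float,
                 gyro_rate_deg_s: float,
                 dt_s: float,
                 magnetic_heading_deg: float,
                 alpha: float = 0.98) -> float:
    """Propagate the heading with the gyro, then nudge it toward the
    geomagnetic reading to cancel slow gyro drift (complementary filter).
    Illustrative assumption only."""
    predicted = (gyro_heading_deg + gyro_rate_deg_s * dt_s) % 360.0
    # Smallest signed difference between the two headings, in (-180, 180].
    error = (magnetic_heading_deg - predicted + 180.0) % 360.0 - 180.0
    return (predicted + (1.0 - alpha) * error) % 360.0

# Example: gyro says we are turning at 10 deg/s; the compass reads 92 deg.
print(fuse_heading(gyro_heading_deg=90.0, gyro_rate_deg_s=10.0,
                   dt_s=0.1, magnetic_heading_deg=92.0))
```
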
【0014】The azimuth data detected by the direction sensor 40 is supplied not only to the current position measuring unit 26 but also to the guidance information generating unit 20 and the map drawing unit 28. The distance sensor 38 detects, for example, the acceleration of the portable navigation device 10 and obtains the moving distance by integrating it twice. The current position measuring unit 26 can measure the position with the GPS receiver 34 and the beacon receiver 36; however, in a place where reception from a GPS satellite or a beacon is impossible, the current position measuring unit 26 calculates the absolute position by dead reckoning navigation using the distance sensor 38 and the direction sensor 40.

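【0014】 states that the moving distance is obtained by integrating the detected acceleration twice and that, where GPS and beacon reception are unavailable, the absolute position is calculated by dead reckoning from that distance and the detected azimuth. A minimal sketch of those two steps, under a flat-earth small-displacement assumption, might look as follows; the function names and constants are not from the patent.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean earth radius, assumed for the flat-earth step

def distance_from_acceleration(accels_m_s2, dt_s):
    """Double-integrate sampled forward acceleration to get distance travelled.
    (Illustrative only; a real device would also have to handle drift and noise.)"""
    velocity = 0.0
    distance = 0.0
    for a in accels_m_s2:
        velocity += a * dt_s          # first integration: acceleration -> velocity
        distance += velocity * dt_s   # second integration: velocity -> distance
    return distance

def dead_reckon(lat_deg, lon_deg, heading_deg, distance_m):
    """Advance a (lat, lon) position by distance_m along heading_deg
    (0 deg = north, 90 deg = east), small-displacement approximation."""
    heading = math.radians(heading_deg)
    d_north = distance_m * math.cos(heading)
    d_east = distance_m * math.sin(heading)
    new_lat = lat_deg + math.degrees(d_north / EARTH_RADIUS_M)
    new_lon = lon_deg + math.degrees(
        d_east / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg))))
    return new_lat, new_lon

# Walk for 3 s at a constant 0.5 m/s^2 of forward acceleration, heading due east.
d = distance_from_acceleration([0.5] * 30, dt_s=0.1)
print(dead_reckon(35.70, 139.77, heading_deg=90.0, distance_m=d))
```
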
【0015】The map drawing unit 28 includes a drawing CPU (central processing unit) that performs various processes for drawing the map, the recommended route, and arrows indicating the direction of the route and the destination, and a drawing data RAM (random access memory) in which various data read from the map information storage unit 24 are stored. The map drawing unit 28 rotates the map around the current position and renders it on the display 12 so that the orientation of the drawn map coincides with the actual orientation. That is, the coordinate data in the map drawing data 243 and the character data 244 are converted in accordance with the current position measured by the current position measuring section 26 and the azimuth detected by the direction sensor 40, and the map and characters after coordinate conversion are drawn on the display 12. The drawing data RAM of the map drawing unit 28 includes storage areas for storing the coordinate-converted map drawing data 243 and character data 244, other drawing data displayed on the map, for example, the arrow indicating the current position, and drawing data such as commands.

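【0015】 describes coordinate-converting the map drawing data 243 and the character data 244 so that the map is rotated about the current position and its orientation matches the real world. The patent does not give the transform itself; the sketch below is a standard 2-D rotation that could serve that purpose, with the screen frame and sign convention assumed for illustration.

```python
import math

def rotate_about_current_position(points_xy, current_xy, device_heading_deg):
    """Rotate map points (already projected to a flat x/y frame, x east, y north)
    about the current position so that the device's heading points 'up'.
    Assumed convention: heading 0 = north; rotating by +heading maps the
    heading direction onto the +y (screen-up) axis."""
    theta = math.radians(device_heading_deg)
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    cx, cy = current_xy
    rotated = []
    for x, y in points_xy:
        dx, dy = x - cx, y - cy
        # Standard 2-D rotation; with this sign choice a point lying in the
        # heading direction ends up straight above the current position.
        rx = dx * cos_t - dy * sin_t
        ry = dx * sin_t + dy * cos_t
        rotated.append((cx + rx, cy + ry))
    return rotated

# A point 100 m due east of the user; if the user faces east (90 deg),
# it should appear straight "up" on the rotated map.
print(rotate_about_current_position([(100.0, 0.0)], (0.0, 0.0), 90.0))
```
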
【0016】FIG. 3 schematically shows the storage areas of the respective drawing data in the drawing data RAM. The storage areas of the respective drawing data are linked to each other by the coordinate data of the respective drawing data 243, 244, etc., and have a layer structure as shown in FIG. 3. That is, the drawing data RAM has a map layer 50 and a character layer 52 as storage areas in which the coordinate-converted map drawing data 243 and character data 244 are respectively stored. It also has a current position layer 54 for storing the drawing data of the arrow indicating the position and orientation of the portable navigation device 10, and a command layer 56 for storing the drawing data of various commands. By overlapping and developing the data of the layers 50, 52, 54, and 56 on the bit map memory of the display 12, the screen shown in FIG. 1, for example, is displayed. As the display 12, a liquid crystal display, a plasma display, or the like is used.

【0017】The data input unit 30 is for the user to input destinations (arrival points) and various operation instructions to the portable navigation device 10. In the present embodiment, the data input unit 30 mainly comprises the plurality of input keys 14 shown in FIG. 1 and a touch panel operated by touching the display screen of the display 12, but a keyboard, a mouse, a light pen, a joystick, a voice recognition device, or the like may also be used. The guidance voice signal output unit 32 synthesizes predetermined voice and guidance tones according to commands from the guidance information generating unit 20 and supplies the signal to the voice output terminal 42. For example, guidance voice such as "Please turn to the right at the next intersection" and guidance sounds such as "beep-beep-beep" are output from the speaker 13 or the earphone 16 connected to the voice output terminal 42.

【0018】The guidance information generating unit 20 includes a CPU that performs the processing for outputting various guidance information to the carrier, a ROM (read-only memory) in which a predetermined program is stored, and a navigation RAM for storing input data and the processing results of the CPU. The guidance information generating unit 20 searches for a recommended route to the destination, based on the program stored in the ROM, from the current position measured by the current position measuring unit 26, the destination input to the data input unit 30, and the road data 241 and intersection data 242 of the map information storage unit 24. It also determines, from the searched route data and the current position measured by the current position measuring unit 26, the direction in which the carrier should travel along the recommended route.

【0019】The guidance information generating unit 20 also determines the direction of the destination from the coordinate data of the input destination. Then, by supplying the various kinds of guidance information resulting from these judgments to the map drawing unit 28 and the guidance audio signal output unit 32, it can display an arrow on the display 12 or output from the speaker 13 a guidance voice such as "Turn right at the next intersection." Based on the detection value of the azimuth sensor 40, the guidance information generating section 20 judges the direction of the portable navigation device 10 with respect to the direction of the recommended route, that is, the traveling direction of the road constituting the route, and with respect to the direction of the destination, and instructs the guidance sound signal output unit 32 and the map drawing unit 28 to output the guidance sound, display the arrow, etc. in accordance with the difference in direction. It should be noted that the data of the searched route consists of the road data and intersection data leading to the destination; these are stored in the navigation RAM together with the destination data, are read out by the map drawing unit 28, and are drawn on the map of the display 12.

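【0019】 (and step 21 of the detailed flow below) has the guidance information generating unit 20 judge the azimuth of the destination from its coordinates and the current position, and compare it with the azimuth detected by the azimuth sensor 40. One conventional way to compute and compare such bearings is sketched below; the great-circle formula and the function names are assumptions, not taken from the patent.

```python
import math

def bearing_to_destination(cur_lat, cur_lon, dest_lat, dest_lon):
    """Initial bearing (degrees clockwise from north) from the current
    position to the destination, using the standard great-circle formula."""
    phi1, phi2 = math.radians(cur_lat), math.radians(dest_lat)
    dlon = math.radians(dest_lon - cur_lon)
    s = math.sin(dlon) * math.cos(phi2)
    c = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(s, c)) % 360.0

def heading_difference(device_heading_deg, target_bearing_deg):
    """Smallest absolute angle between the device heading and the target bearing."""
    diff = (target_bearing_deg - device_heading_deg + 180.0) % 360.0 - 180.0
    return abs(diff)

b = bearing_to_destination(35.70, 139.77, 35.71, 139.78)
print(round(b, 1), round(heading_difference(30.0, b), 1))
```
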
【0020】Next, the operation of the embodiment configured as described above will be described.

(1) Outline of operation

In the portable navigation device 10, the map on the display 12 is rendered so as to rotate about the current position on the displayed map in accordance with changes in the orientation of the portable navigation device 10, so that the orientation on the map coincides with the actual orientation. Guidance by the portable navigation device 10 is performed in three ways: guidance that displays only the map and the current position on the display 12, guidance that leads the carrier along the searched recommended route, and guidance that informs only of the direction of the destination. Hereinafter, the operation of the portable navigation device 10 in each kind of guidance will be described.

【0021】FIG. 4 shows a change in the orientation of the carrier m on an actual road, and FIG. 5 shows the corresponding change in the screen drawn on the display 12 when guidance is given with only the map and the current position displayed. As shown in FIG. 4, when the carrier m turns from orientation A to orientation B, that is, when the orientation of the portable navigation device 10 changes from A to B, the screen shown in FIG. 5 (A) is rewritten to the screen shown in FIG. 5 (B). That is, the map is rotated about the current position by the angle through which the portable navigation device 10 has turned, and is redrawn. In both screens (A) and (B), the arrow C indicating north on the display 12 coincides with the actual north direction. Further, the arrow indicating the position and orientation of the portable navigation device 10 always points in the same direction (toward the upper side of the screen) with respect to the display 12. It should be noted that, although usually only abbreviated characters such as those shown in the figure are displayed on the display 12, by changing the mode with a predetermined input key 14 it is possible to display formal names, place names (addresses), and the like.

【0022】FIG. 6 shows the change of the screen of the display 12 in the case of guiding the carrier along the recommended route. When the carrier moves along the recommended route K and changes orientation as indicated by the arrow D in FIG. 6 (A) so as to match the direction of the recommended route K, the map is rotated and a large arrow is displayed on the screen as shown in FIG. 6 (B), and a guidance sound such as "beep-beep-beep" is output from the speaker 13 or the earphone 16, for example. By watching this arrow or listening to the guidance sound, the carrier can easily recognize in which direction the recommended route lies in the landscape actually being viewed.

【0023】FIG. 7 shows the change in the screen of the display 12 in the case of guidance informing only of the direction of the destination. When traveling on the basis of the direction of the destination n rather than along a specific route, the carrier changes his or her direction as indicated by the arrow E; when the direction of the destination n and the direction of the portable navigation device 10 coincide, a large arrow is displayed and the guidance sound is output, together with the rotated display of the map, in the same way as before. FIG. 8 shows the positional relationship between the display 12 and the carrier m. FIGS. 5 to 7 show the display changes when the display 12 and the carrier are in the positional relationship shown in FIG. 8. Characters (abbreviations) on the map are usually drawn in a fixed orientation with respect to the display 12, as shown by (A) to (C) in FIG. 8; that is, they are always drawn at the same angle as seen from the carrier m.

【0024】FIG. 9 shows the change of the screen of the display 12 when the carrier m rotates the portable navigation device 10 on the palm of the hand. When the carrier changes the character drawing mode by pressing a predetermined input key 14, the portable navigation device 10 rotates the characters together with the map when the map is rotated and drawn. That is, as shown in (A) to (C) of FIG. 9, when the carrier m rotates the portable navigation device 10 on the palm of the hand, the characters on the map are always rendered with a constant orientation with respect to the drawn map. In this case, the carrier sees the characters always facing the same direction relative to himself or herself. For example, when walking along a road and checking the direction of the destination or the like by turning the portable navigation device 10, without changing one's own direction, and relying on a guide sound such as "beep-beep-beep", the characters remain easy for the carrier m to read.

【0025】(2) Detailed operation

FIGS. 10 to 12 show the flow of the operation of the portable navigation device 10. First, the guidance information generating unit 20 displays on the display 12 a message asking whether the user has a destination, and if the user inputs that there is a destination (step 1; Y), the guidance information generating unit 20 sets the destination (step 2). That is, a list of candidate destination place names, or the fifty-character Japanese syllabary for entering a name, is displayed on the display 12. Then, when the carrier selects a place name from the destination list, or inputs the place name (address) of the destination to the data input unit 30 by choosing characters from the syllabary, the guidance information generating unit 20 stores the destination data in the navigation RAM and sets the destination (step 2).

【0026】Next, the guidance information generating unit 20 displays on the display 12 a message asking whether or not guidance to the destination along a recommended route is required, and if the carrier inputs that it is required (step 3), it searches for a route to the destination set in step 2 (step 4). That is, a route from the current position to the destination is searched for on the basis of the road data 241 and the intersection data 242 in the map information storage unit 24, the measured values of the current position measuring unit 26, and the set destination data (step 4). It should be noted that the current position may also be entered by the carrier via the data input unit 30, in the same way as the destination. When the route search to the destination is completed, a message is displayed on the display 12 asking whether the orientation of the characters displayed on the map should be drawn in the mode shown in FIG. 8 or in the mode shown in FIG. 9, and the mode selected by the carrier is set (step 5).

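【0026】 says only that a route from the current position to the destination is searched for on the basis of the road data 241 and the intersection data 242; no particular algorithm is named. Purely as an illustration, a shortest-path search such as Dijkstra's algorithm over a graph whose nodes are intersection numbers and whose edge weights are road lengths could look like the following; the data layout and names are assumptions.

```python
import heapq

def search_route(roads, start, goal):
    """Dijkstra shortest path over intersection numbers.
    roads: iterable of (intersection_a, intersection_b, length_m) tuples,
    loosely modelling road data keyed by its start/end intersection numbers."""
    graph = {}
    for a, b, length in roads:
        graph.setdefault(a, []).append((b, length))
        graph.setdefault(b, []).append((a, length))
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, length in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (cost + length, neighbor, path + [neighbor]))
    return None

# Tiny hypothetical network: intersections 1..4.
roads = [(1, 2, 120.0), (2, 3, 80.0), (1, 4, 60.0), (4, 3, 200.0)]
print(search_route(roads, start=1, goal=3))   # -> (200.0, [1, 2, 3])
```
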
【0027】Next, the guidance information generating unit 20 obtains the change in the azimuth of the portable navigation device 10 within a predetermined time from the detection value of the direction sensor 40 (step 6). If the change in direction is equal to or greater than a predetermined angle, for example 5° or 10° (step 6; Y), the map drawing unit 28 coordinate-transforms the map drawing data 243 read from the map information storage unit 24 so that it is rotated in accordance with the orientation detected by the azimuth sensor 40 and drawn on the display 12 (step 7). The coordinate-converted map drawing data 243 is stored in the map layer 50 (FIG. 3) of the drawing data RAM of the map drawing unit 28.

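Steps 6 and 7 in 【0027】 re-rotate and redraw the map only when the heading has changed by at least a predetermined angle (5° or 10° are given as examples) within a predetermined time. The small sketch below illustrates that gating idea; the class name and the exact bookkeeping are assumptions, not taken from the patent.

```python
class MapRedrawGate:
    """Trigger a rotated redraw only when the heading has drifted far enough
    from the heading used for the last redraw (cf. steps 6-7)."""

    def __init__(self, threshold_deg: float = 5.0):
        self.threshold_deg = threshold_deg
        self.last_drawn_heading = None

    def should_redraw(self, heading_deg: float) -> bool:
        if self.last_drawn_heading is None:
            self.last_drawn_heading = heading_deg
            return True
        change = abs((heading_deg - self.last_drawn_heading + 180.0) % 360.0 - 180.0)
        if change >= self.threshold_deg:
            self.last_drawn_heading = heading_deg
            return True
        return False

gate = MapRedrawGate(threshold_deg=5.0)
print([gate.should_redraw(h) for h in (0.0, 2.0, 6.0, 7.0, 359.0)])
# -> [True, False, True, False, True]
```
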
【0028】Next, the map drawing unit 28 performs coordinate conversion of the character data 244 in accordance with the drawing mode set in step 5 (step 8). That is, when the mode of drawing characters at a fixed angle with respect to the display 12 is selected, as shown in FIG. 8, only the coordinate data in each item of character string data of the character data 244 is converted. On the other hand, when the mode of rotating the character orientation together with the map is selected, as shown in FIG. 9, the conversion processing is performed not only on the coordinate data of the character data 244 but also on the font data of each character. The map drawing unit 28 stores the coordinate-converted character data 244 in the character layer 52 of the drawing data RAM.

【0029】Then, together with the data of the current position layer 54 and the command layer 56, the coordinate-converted map drawing data 243 and character data 244 are superimposed and drawn on the display 12 (step 9). As a result, a map screen such as that shown in FIG. 1 is displayed on the display 12. When it is input in step 1 that there is no destination (N), or when it is input that route guidance is not required (N), a map is drawn on the basis of the coordinate-converted drawing data stored in each of the layers 50 to 56 (step 9). That is, the map on the display 12 is not rotationally displayed.

【0030】Next, the guidance information generating unit 20 judges whether there is route data in the navigation RAM (step 10). When there is route data (step 10; Y) and the predetermined input key 14 for obtaining the output of the arrow and the guidance sound is pressed by the carrier (step 11; Y), it is judged, from the measurement data of the current position measuring unit 26 and the route data stored in the navigation RAM, whether or not the current position is on the searched route (step 12). When the current position is on the route (step 12; Y), the guidance information generating section 20 determines, from the intersection data and road data constituting the route data and the detection data of the direction sensor 40, whether or not the direction of the portable navigation device 10 is the direction of the route (step 13).

【0031】If the orientation of the portable navigation device 10 matches the direction of the route (step 13; Y), the map drawing unit 28 displays a large arrow on the display 12 as shown in FIG. 6 (B) (step 14), and the guidance voice signal output unit 32 causes the speaker 13 or the earphone 16 to output a guidance sound such as "beep-beep-beep" at a large volume (step 15). Then, it is determined whether the carrier has, for example, touched the command (FIG. 3) for canceling guidance on the display 12 and thereby instructed the device to stop guidance (step 16). If cancellation of guidance has not been instructed (step 16; N), the routine returns to step 5; if it has been instructed, the process is ended.

【0032】If it is determined in step 13 that the azimuth of the route and the azimuth of the device do not coincide (N), the guidance information generating unit 20 determines whether the angular difference between the direction of the route and the direction of the device is within a certain angle, for example 10° (step 17). If it is determined to be within that angle (step 17; Y), the guidance tone is output at a small volume (step 18). If it is judged not to be within the fixed angle (step 17; N), the routine returns to step 16. In this way, by changing the volume of the guidance tone output when the azimuth of the apparatus approaches the azimuth of the route (step 17; Y) and when it coincides with it (step 13; Y), it becomes easy to know the direction of the route. If it is determined in step 11 that the predetermined input key 14 has not been pressed (N), the process proceeds to step 16. If it is determined in step 12 that the current position is not on the route (N), a message notifying the carrier that he or she is off the route is displayed on the display 12, or a sound is output from the speaker 13 (step 19), and the process proceeds to step 16.

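Steps 13 to 18 in 【0031】 and 【0032】 distinguish three cases: the device heading coincides with the route direction (large arrow and loud guidance tone), it is within a certain angle such as 10° (quiet tone), or it is farther off (no tone). A condensed illustration of that decision is sketched below; the 2° tolerance used to treat headings as "coincident" is an assumption, since the patent does not specify one.

```python
def guidance_tone_volume(angle_difference_deg: float,
                         coincide_tolerance_deg: float = 2.0,
                         near_threshold_deg: float = 10.0) -> str:
    """Map the angular difference between device heading and route direction
    to a guidance-tone volume, following the loud/quiet/silent pattern of
    steps 13-18. The 2-degree 'coincident' tolerance is an assumption."""
    if angle_difference_deg <= coincide_tolerance_deg:
        return "loud"    # steps 14/15: large arrow and loud "beep-beep-beep"
    if angle_difference_deg <= near_threshold_deg:
        return "quiet"   # steps 17/18: within about 10 degrees, quiet tone
    return "silent"      # otherwise no tone; the carrier keeps turning

print([guidance_tone_volume(d) for d in (0.5, 7.0, 45.0)])
# -> ['loud', 'quiet', 'silent']
```
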
【0033】On the other hand, if there is no route data in step 10 (N), the guidance information generating section 20 judges whether there is destination data in the navigation RAM (step 20). When there is destination data (step 20; Y), the guidance information generating unit 20 calculates the azimuth of the destination from the coordinate data included in the destination data and the measurement data of the current position measuring unit 26 (step 21). Then, it is judged whether or not the azimuth of the destination coincides with the azimuth of the apparatus given by the detected value of the direction sensor 40 (step 22). If they coincide (step 22; Y), a large arrow is displayed on the display 12 as shown in FIG. 7 (B) (step 23). Then, the guidance tone is output at a large volume (step 24), and the process procee

