
DECLARATION OF ACCURACY
I, David Baldwin, declare the following:

1. I am over 18 years of age and competent to make this declaration.

2. I am a qualified Japanese to English translator.

3. I have translated the attached document identified as JPH09-311625.

4. I affirm that the translated text has been translated and edited to the best of my ability and knowledge to accurately reflect the content, meaning, and style of the original text and constitutes in every respect a correct and true translation of the original document.

5. I declare that all statements made herein of my knowledge are true, and that all statements made on information and belief are believed to be true, and that these statements were made with the knowledge that willful false statements and the like so made are punishable by fine or imprisonment, or both, under Section 1001 of Title 18 of the United States Code.

I hereby certify under penalty of perjury under the laws of the United States of America that the foregoing is true and correct. Dated and signed on August 8, 2019.

_____________________________
(Translator’s Signature)

David Baldwin
_____________________________
(Translator’s Printed Name)
(19) Japan Patent Office (JP)
(12) Unexamined Patent Application Publication (A)
(11) Patent Application Publication No. H9-311625
(43) Publication Date: Dec. 2, 1997 (Heisei 9)

(51) Int. Cl.6      Ident. Code    Internal Ref. No.    FI / Technical Indication
     G09B 29/00                                         G09B 29/00 A
     G01C 21/00                                         G01C 21/00 B
     G08G 1/0969                                        G08G 1/0969

Examination Request: Not Made    FD    No. of Claims: 13    (16 Pages Total)

(21) Filing No.     H8-149740
(22) Filing Date    May 22, 1996 (Heisei 8)
(71) Applicant      000002185
                    Sony Corp.
                    6-7-35 Kita Shinagawa, Shinagawa-ku, Tokyo
(72) Inventor       Kiyokazu Ikeda
                    c/o Sony Corp., 6-7-35 Kita Shinagawa, Shinagawa-ku, Tokyo
(74) Agent          Atsuo Waki, Patent Attorney (and One Other)
(54) [Title] DISPLAY DEVICE, MAP DISPLAY DEVICE, DISPLAY METHOD, AND MAP DISPLAY METHOD

(57) [Abstract]
[Problem] To realize display that is easy to view and use for a user, display with a large information amount, and display that is interesting.
[Resolution Means] By detecting a posture and/or movement state of a main body of a display device and changing a display content to match conditions such as a posture or movement of the main body of the device, display matching an actual orientation, display according to a posture of the device, switching between three-dimensional display and flat display, and the like are realized.

[Scope of Patent Claims]
[Claim 1] A display device, comprising:
a state detection means of detecting a posture state and/or a movement state of a main body of the display device;
a display means; and
a display control means that can display a predetermined image on the display means and, based on detection information from the state detection means, change a display state of the displayed image.
[Claim 2] A map display device, comprising:
a state detection means of detecting a posture state and/or a movement state of a main body of the map display device;
a map-information storage means of storing map information;
a display means; and
a display control means that can display a map image based on map information read from the map-information storage means on the display means and, based on detection information from the state detection means, change a display aspect or a map display region of the displayed map image.
[Claim 3] The map display device of claim 2, wherein the state detection means is provided with an incline sensor that detects an incline state of the main body of the map display device.
[Claim 4] The map display device of claim 2, wherein the state detection means is provided with an orientation sensor that detects an actual orientation.
[Claim 5] The map display device of claim 2, wherein the state detection means is provided with a movement-state sensor that detects a movement direction and/or a movement amount when the main body of the map display device is moved.
[Claim 6] The map display device of claim 2, wherein the map image displayed on the display means can be set with an absolute direction or a relative direction, and the display control means changes a display state of the map image according to the posture state of the main body of the map display device detected by the state detection means so the set direction is substantially continually an upward direction in terms of gravity of the display means.
[Claim 7] The map display device of claim 2, wherein the display control means changes a display state of the map image according to an orientation state of the main body of the map display device detected by the state detection means so an orientation of the map image displayed on the display means substantially continually matches an actual orientation.
[Claim 8] The map display device of claim 2, wherein the display control means changes a display state of the map image according to the movement state of the main body of the map display device detected by the state detection means so the region displayed as the map image on the display means undergoes scrolling movement.
[Claim 9] The map display device of claim 2, wherein the display control means changes a display state of the map image according to an incline state of the main body of the map display device detected by the state detection means so the map image displayed on the display means switches between a two-dimensional image and a three-dimensional image.
[Claim 10] The map display device of claim 2, further comprising:
a current-position detection means; wherein
the display control means is configured to be able to synthesize and display on the display means the map image based on the map information read from the map-information storage means and a position-presenting image based on current-position information detected by the current-position detection means and, based on the detection information from the state detection means, change the display aspect or the map display region of the displayed map image.
[Claim 11] A display method of detecting a posture state and/or a movement state of a main body of a device having a display unit and changing a display state of an image displayed on the display unit according to the detected posture state and/or movement state of the main body of the device.
[Claim 12] A map display method of detecting a posture state and/or a movement state of a main body of a device having a display unit and, when displaying a map image based on predetermined map information on the display unit, changing a display aspect or a map display region of the map image according to the detected posture state and/or movement state of the main body of the device.
[Claim 13] The map display method of claim 12, wherein an incline posture of the main body of the device having the display unit is detected and the map image displayed on the display unit is switched between a two-dimensional image and a three-dimensional image according to a detected incline state.
[Detailed Description of the Invention]
[0001]
[Field] The present invention relates to a display device and a display method that change a display state according to a state of a main body of the display device and particularly relates to a map-image display device and a map-image display method.
[0002]
[Conventional Art] In recent years, devices that display a map image on a display have become widespread, as seen in, for example, navigation systems. Many of these store map information on a medium such as a CD-ROM; read map information of a required region, a vicinity of a current position, or the like from the CD-ROM; and display a map image based on this read map information.
[0003]
[Problem to be Solved by the Invention] Now, in such conventional electronic map display, the displayed map image itself is displayed with a specified direction of a screen as up, regardless of a posture or the like of a main body of this display device. As such, a user needs to view the map image by mentally aligning cardinal directions of the map image with actual cardinal directions. Moreover, from a flat map image, it is difficult to visualize different regions, and realizing map display with higher added value is in demand.
[0004]
[Means for Solving the Problem] In view of such problems, the present invention has as an object to provide a display device, a display method, a map-display device, and a map-display method that can realize display that is easy to view and use for a user and display that can impart added value and is interesting.
[0005] As such, the display device is provided with a state detection means of detecting a posture state and/or a movement state of a main body of the display device; a display means; and a display control means that can display a predetermined image on the display means and, based on detection information from the state detection means, change a display state of the displayed image. The display method detects a posture state and/or a movement state of a main body of a device having a display unit and changes a display state of an image displayed on the display unit according to the detected posture state and/or movement state of the main body of the device.
[0006] Furthermore, the map display device is provided with a map-information storage means of storing map information, a display control means being able to display a map image based on map information read from the map-information storage means on a display means and, based on detection information from a state detection means, change a display aspect or a map display region of the displayed map image. The map display method detects a posture state and/or a movement state of a main body of a device having a display unit and, when displaying a map image based on predetermined map information on the display unit, changes a display aspect or a map display region of the map image according to the detected posture state and/or movement state of the main body of the device.
[0007] That is, in the present invention, by changing a display content to match conditions such as a posture or movement of the main body of the device, display matching an actual orientation, display according to a posture of the device, switching between three-dimensional display and flat display, and the like are realized.
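The control concept summarized in [0007] can be sketched as follows for illustration only; the class name, fields, mode labels, and angle thresholds are assumptions made for this sketch and are not taken from the publication.

    # Hypothetical sketch of [0007]: the display content is selected from the
    # detected posture/movement state of the main body. Thresholds are assumed.
    from dataclasses import dataclass

    @dataclass
    class BodyState:
        incline_deg: float   # 0 = lying horizontal, 90 = held vertically
        moved: bool          # True if the main body was moved since the last update

    def choose_display(state: BodyState) -> str:
        """Pick a display aspect from the detected state of the main body."""
        if state.moved:
            return "scroll the displayed map region in the movement direction"
        if state.incline_deg >= 70.0:
            return "three-dimensional (3D) image"
        if state.incline_deg >= 20.0:
            return "bird's-eye image"
        return "flat (two-dimensional) map image"

    print(choose_display(BodyState(incline_deg=80.0, moved=False)))   # 3D image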
[0008]
[Embodiments of the Invention] An embodiment of the present invention is described below by using as an example an electronic map device that displays a map image. The description is given in the following order.
1. Configuration of Electronic Map Device
2. Posture and Movement Detection by Sensors
3. Map Display Operations in Display-Orientation-Designated Mode
4. Map Display Operations in Actual-Orientation-Reflecting Mode
5. Map Display Operations in Vicinity-Map Display Mode
6. Map Display Operations in Virtual Display Mode
7. Map Display Operations in Navigation Display Mode
8. Composite Operation of Various Modes
[0009] 1. Configuration of Electronic Map Device
FIG. 1 illustrates a block diagram of the electronic map device of the present example, and FIG. 2 illustrates an appearance example of the electronic map device. As illustrated in FIG. 2, an electronic map device 1 is formed, for example, in a notebook shape of an extent that enables the display to be portable and has a display unit 2, which is a liquid-crystal display or the like, formed in an upper face. Moreover, used as a recording medium of electronic map data is, for example, a CD-ROM such as is used in a normal navigation system, and an insertion portion 3 for inserting this CD-ROM is provided.
[0010] Furthermore, various controllers 4 are formed for user control. As the controllers 4, it is sufficient to provide those of forms necessary for various controls, such as pressable keys and jog dials. Of course, these may be of other forms, such as slide switches or rotating knobs. Necessary controls include a power on/off control; a mode-setting control; an operation for selecting a map region to display; scrolling and zooming of screen display; a control of requesting, for example, presentation of various information; and the like, and any forms may be adopted as long as these controls can be performed. Moreover, a configuration may be such that control devices such as a mouse and a keyboard can be connected and used.
[0011] An internal configuration of the electronic map device 1 is as illustrated in FIG. 1, and a CPU 10 is provided as a part that performs overall operation control. Moreover, a RAM 11 is prepared as a work region used in operations such as control/computation by the CPU 10, and a ROM 12 is provided as a region for holding an operation program or the like.
[0012] A CD-ROM 20 inserted from the insertion portion 3 illustrated in FIG. 2 is loaded in a CD-ROM driver 14. The CD-ROM driver 14 is a part that performs a reproduction operation of the CD-ROM 20 based on CPU 10 control. The CD-ROM 20 loaded in the CD-ROM driver 14 is recorded with map information and additional information such as names of map locations and building height information. Information reproduced from the CD-ROM 20 by the CD-ROM driver 14 is taken in by the RAM 11 and subjected to necessary processing.
[0013] Furthermore, map-image information that is reproduced from the CD-ROM 20 and used for display is taken in by a map-image memory 15. The CPU 10 generates image data to display based on the map-image information and various additional information read from the CD-ROM 20 and deploys this to the map-image memory 15. A necessary portion of the image data held by the map-image memory 15 is then sent to a display driver 13, and map display of a certain region is executed on the display unit 2.
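As a rough illustration of the data flow of [0012] and [0013] (reproduction from the CD-ROM 20, deployment to the map-image memory 15, and transfer of only the needed portion toward the display driver 13), a sketch follows; every function, class, and size in it is an assumption for illustration, not part of the described device.

    # Hypothetical sketch of the [0012]-[0013] data flow. Names and the 256 x 256
    # tile size are assumptions for illustration only.
    def reproduce_from_cdrom(region_id):
        # Stand-in for the CD-ROM driver 14 reproducing map information of a region.
        return [[(region_id, x, y) for x in range(256)] for y in range(256)]

    class MapImageMemory:
        """Stand-in for the map-image memory 15 holding deployed image data."""
        def __init__(self):
            self.rows = []

        def deploy(self, rows):
            self.rows = rows

        def crop(self, top, left, height, width):
            # Only the portion needed for the current view is handed on for display.
            return [row[left:left + width] for row in self.rows[top:top + height]]

    memory = MapImageMemory()
    memory.deploy(reproduce_from_cdrom("region-A"))
    visible = memory.crop(0, 0, 120, 160)
    print(len(visible), len(visible[0]))   # 120 160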
[0014] Furthermore, in the present example, the displayed map image is not limited to simply a normal map image recorded on the CD-ROM 20, and as described below, a bird’s-eye image and a three-dimensional image (3D image) can be displayed. While image data of the bird’s-eye image and the 3D image may of course be recorded in advance on the CD-ROM 20 and a read bird’s-eye image or 3D image be supplied as-is to the display driver 13 and displayed, to conserve a recording capacity of the CD-ROM 20, it is favorable to only record building heights, building and facility types, and the like as additional information for generating the bird’s-eye image and the 3D image, using the map information and the additional information to generate the bird’s-eye image and the 3D image by image synthesis processing. An image synthesis unit 16 is provided to perform this processing, and this unit can virtually generate bird’s-eye images and 3D images of various regions (or images centered around various points on the map) based on CPU 10 control.
[0015] Furthermore, the present example has not only a map-display function but also a function similar to a navigation system. That is, map display centered around a current position is performed automatically to enable travel guidance to be executed. From this necessity for current-position detection, a GPS receiver 18 is provided. The GPS receiver 18 is a part for obtaining current-position information by a so-called GPS (global positioning system) and detects position information (latitude/longitude), absolute-orientation information, and velocity information based on a reception signal from a satellite. This information is supplied to the CPU 10.
[0016] A sensor unit 17 is provided with sensors necessary to detect a posture state and a movement state of a main body of the electronic map device 1. Detection information from the sensor unit 17 is supplied to the CPU 10. Data and control signals are transmitted between the above units via a bus 19. Moreover, various control information from the control unit 4 illustrated in FIG. 2 is input to the CPU 10.
[0017] Based on the mode-setting control and a display-region-designating control from the control unit 4, the detection information from the sensor unit 17, detection information from the GPS receiver 18, and the operation program stored in the ROM 12, the CPU 10 controls the reproduction operation by the CD-ROM driver 14, the synthesis processing by the image synthesis unit 16, write/read operations of the map-image memory 15, and a display operation by the display driver 13. By this, map display sought by the user is executed on the display unit 2.
[0018] 2. Posture and Movement Detection by Sensors
Here, posture and movement detection of the main body of the electronic map device 1 by the sensor unit 17 is described. The sensor unit 17 is provided with an incline sensor function of detecting an incline state of the main body, an orientation sensor function of detecting an absolute orientation (north, south, east, and west), and a movement sensor function of detecting movement of the main body (movement direction and movement amount) and is equipped with various sensors necessary for these functions.
[0019] In the description, first, as illustrated in FIG. 2, directions of up, down, left, and right relative to a screen of the display unit 2 of the main body of the electronic map device 1 are referred to as screen-up, screen-down, screen-right, and screen-left so as to be distinguished from up and down in terms of earth’s gravity and left and right in terms of absolute orientation.
[0020] FIG. 3 illustrates incline-state examples of the main body of the electronic map device 1. (a), (b), and (c) in FIG. 3 illustrate the main body of the electronic map device 1 rotated in a screen-up/screen-down direction around an axis from screen-left to screen-right. That is, in (a) in FIG. 3, the electronic map device 1 is substantially horizontal, and (b) and (c) in FIG. 3 illustrate postures of a screen-up portion being lifted upward in terms of gravity. Although not illustrated, a posture of a screen-down portion being lifted upward in terms of gravity is of course also possible. Moreover, (a), (d), and (e) in FIG. 3 illustrate the main body of the electronic map device 1 rotated in a screen-left/screen-right direction around an axis from screen-up to screen-down. That is, (d) and (e) in FIG. 3 illustrate postures of a screen-left portion being lifted upward in terms of gravity from the horizontal state in (a) in FIG. 3. Although not illustrated, a posture of a screen-right portion being lifted upward in terms of gravity is also possible.
[0021] As the sensor that detects these incline states, it is favorable to biaxially form in the sensor unit 17 an incline sensor made using, for example, a mercury switch or equip in the sensor unit a gravity sensor that detects the direction of gravity. The posture changes arise according to how the user holds the electronic map device 1, how the user places the device (horizontally, on a desk or the like, or vertically, leaning up against something), and the like.
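For illustration, the biaxial incline detection of [0021] could be realized with a three-axis gravity (acceleration) sensor roughly as sketched below; the axis convention, function names, and the use of an accelerometer in place of mercury switches are assumptions for this sketch.

    # Hypothetical sketch of biaxial incline detection from a gravity vector.
    # Axis convention (assumed): x = screen-right, y = screen-up, z = out of screen.
    import math

    def incline_from_gravity(gx, gy, gz):
        """Return (pitch, roll) in degrees from a measured gravity vector.

        pitch: rotation about the screen-left/right axis, as in FIG. 3 (a)-(c)
        roll:  rotation about the screen-up/down axis, as in FIG. 3 (a), (d), (e)
        """
        pitch = math.degrees(math.atan2(gy, math.sqrt(gx * gx + gz * gz)))
        roll = math.degrees(math.atan2(gx, math.sqrt(gy * gy + gz * gz)))
        return pitch, roll

    # Device lying flat on a desk: gravity along -z, so no incline is detected.
    print(incline_from_gravity(0.0, 0.0, -9.8))   # (0.0, 0.0)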
[0022] Next, FIG. 4 illustrates relationships between the main body of the electronic map device 1 and an absolute orientation. (a) in FIG. 4 is a state wherein the screen-up portion faces the absolute orientation north, (b) in FIG. 4 is a state wherein the screen-left portion faces north, and (c) in FIG. 4 is a state wherein a portion between the screen-down portion and the screen-left portion faces north. As in the above for example, the present example also detects an orientation posture of the electronic map device 1 in terms of absolute orientation. As such, the sensor unit 17 is equipped with an orientation sensor such as an electronic compass. Conceivable as an actual example is, for example, adopting a magnetic-field sensor. Moreover, orientation information of a travel direction obtained by the GPS receiver 18 may be used as the orientation sensor.
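The absolute-orientation detection of [0022] could, for illustration, be derived from a two-axis magnetic-field sensor held level as sketched below; the axis convention and names are assumptions for this sketch.

    # Hypothetical sketch of heading detection with a level two-axis magnetometer.
    # Assumed convention: mx along screen-up, my along screen-right, 0 deg = north.
    import math

    def heading_of_screen_up(mx, my):
        """Return the compass bearing of the screen-up direction in degrees."""
        return math.degrees(math.atan2(my, mx)) % 360.0

    def nearest_cardinal(bearing_deg):
        return ["north", "east", "south", "west"][int((bearing_deg + 45.0) // 90.0) % 4]

    # FIG. 4 (a): screen-up faces north. FIG. 4 (b): screen-left faces north,
    # which means screen-up faces east.
    print(nearest_cardinal(heading_of_screen_up(1.0, 0.0)))   # north
    print(nearest_cardinal(heading_of_screen_up(0.0, 1.0)))   # east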
[0023] FIG. 5 illustrates movement conditions of the main body of the electronic map device 1. The arrow in (a) in FIG. 5 illustrates, for example, a state wherein the user holds the electronic map device 1 horizontally in their hand and moves the device in front of their body so as to draw a circle, and the arrows in (b) in FIG. 5 illustrate a state wherein the device is moved in a certain direction, screen-up, -down, -left, or -right. In the present example, when the electronic map device 1 is moved in this manner, the movement direction and the movement amount are detected in terms of screen-up, -down, -left, and -right. As such, the sensor unit 17 is equipped with at least one sensor that can detect movement, such as an acceleration sensor, a velocity sensor, or a magnetic-field sensor. Note that the movement conditions may include, in addition to the movement of (a) and (b) in FIG. 5 in a state of the electronic map device 1 being horizontal relative to the earth’s surface, the movement of (a) and (b) in FIG. 5 in a state of the device being vertical or diagonal relative to the earth’s surface.
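For illustration, the movement detection of [0023] can be approximated by integrating acceleration samples along the screen axes into a movement direction and amount, as sketched below; the sample rate, names, and the naive double integration are assumptions for this sketch.

    # Hypothetical sketch: integrate (ax, ay) acceleration samples, taken along the
    # screen-right and screen-up axes, into a displacement used to scroll the map.
    def movement_from_acceleration(samples, dt):
        """samples: iterable of (ax, ay) in m/s^2; returns (dx, dy) displacement in m."""
        vx = vy = dx = dy = 0.0
        for ax, ay in samples:
            vx += ax * dt
            vy += ay * dt
            dx += vx * dt
            dy += vy * dt
        return dx, dy

    # A short push toward screen-right followed by braking: the net displacement
    # of about 0.5 m would scroll the displayed region toward screen-right.
    push = [(2.0, 0.0)] * 10 + [(-2.0, 0.0)] * 10   # 20 samples at dt = 0.05 s
    print(movement_from_acceleration(push, dt=0.05))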
[0024] In the present example, the sensor unit 17 detects inclines, movements, and directions in terms of absolute orientation of the main body of the electronic map device such as those illustrated in FIG. 3, FIG. 4, and FIG. 5 above and is equipped with sensors necessary for such. Sensor types and a sensor count may be any that enable detection of these inclines/movements/orientations. Moreover, as described below, the CPU 10 executes predetermined display output processing based on detected incline/movement/orientation conditions.
[0025] Note that FIG. 6 illustrates an example of using the electronic map device 1 by installing the device on a predetermined stand 30. Here, the stand 30 is formed with a mounting portion 32 via a shank 31, and a mounting mechanism that is not illustrated on a back face of the electronic map device 1 is coupled to the mounting portion 32. Moreover, by the shank 31, the electronic map device 1 can incline/rotate in any direction in a state of being installed to the stand 30. With such a mechanism, it is also conceivable to provide a mechanical position sensor on a mounting portion 32 and shank 31 side and detect posture conditions of the electronic map device 1 by operations thereof.
[0026] 3. Map Display Operations in Display-Orientation-Designated Mode
Display operations based on incline/movement/orientation detection of the electronic map device 1 are sequentially described below as display aspects in display operation modes. Note that a display-orientation-designated mode, an actual-orientation-reflecting mode, a vicinity-map display mode, a virtual display mode, and a navigation display mode can be set as the display operation modes of the present example, the operations in each mode described below being executed by the user selecting the mode by controlling the control unit 4.
[0027] First, map display operations in the display-orientation-designated mode are described using FIG. 7, FIG. 8, and FIG. 9. The display-orientation-designated mode is a display operation mode wherein a direction designated by the user is continually up in the map image regardless of the posture of the electronic map device 1 and is displayed matching an upward direction in terms of gravity of the electronic map device 1. Note that in the description, up, down, left, and right in the displayed map image, bird’s-eye image, or 3D image are referred to as map-up, map-down, map-left, and map-right, which are distinct from screen-up, screen-down, screen-right, and screen-left described above; up and down in terms of the earth’s gravity; and left and right in terms of absolute orientation.
[0028] That is, in the display-orientation-designated mode, the designated direction is made to be map-up regardless of the posture of the electronic map device 1 (regardless of whether screen-up, screen-down, screen-right, or screen-left is up in terms of gravity) and display is performed so map-up and up in terms of gravity match. Directions able to be designated by the user are made to be absolute orientations such as north, south, east, and west as well as a current travel direction, a direction the user is facing, and the like. Moreover, when setting is not performed by the user, automatic setting such as making, for example, north be map-up as a reference direction may be performed.
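The behavior of [0028] and FIG. 8 can be sketched, for illustration, as a rotation applied to the drawn map so that the designated bearing points toward whichever screen edge is currently up in terms of gravity; the roll convention, the quantization into four stages, and the names below are assumptions for this sketch.

    # Hypothetical sketch of the display-orientation-designated mode of [0028].
    def gravity_up_rotation(roll_deg, stages=4):
        """Quantize the body roll into the screen direction that is up in terms of
        gravity, returned as extra image rotation in degrees (0 = screen-up is up,
        90 = screen-left is up, 270 = screen-right is up)."""
        step = 360 // stages
        return (round(roll_deg / step) * step) % 360

    def map_rotation(designated_bearing_deg, roll_deg):
        """Total rotation of the map image so the designated bearing is drawn
        toward the gravity-up edge (bearing: 0 = north, 90 = east, 180 = south)."""
        return (designated_bearing_deg + gravity_up_rotation(roll_deg)) % 360

    # FIG. 8 (a): screen-up is gravity-up and south is designated -> rotate 180 deg.
    print(map_rotation(180, roll_deg=0))    # 180
    # FIG. 8 (c): screen-left is gravity-up -> rotate a further 90 deg.
    print(map_rotation(180, roll_deg=90))   # 270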
[0029] FIG. 7 illustrates control operations by the CPU 10 in the display-orientation-designated mode. Note that the user designates a certain region as the map region to be displayed, and map information and additional information of this region are read from the CD-ROM 20 and held in the map-image memory 15.
[0030] During this display-orientation-designated mode, at step F101, the CPU 10 continually monitors posture detection information (incline state) from the sensor unit 17. Then, at step F102, it is determined whether the posture of the electronic map device 1 is in a horizontal/vertical/diagonal state as illustrated in FIG. 3. When the vertical or diagonal state is determined—as in, for example, (b), (c), (d), or (e) in FIG. 3—the flow proceeds to step F103. Then, it is confirmed whether additional information whereby a bird’s-eye image and a 3D image as viewed from a certain point in the region can be synthesized (or bird’s-eye image data or 3D image data itself) is present as the map information of the region to currently display.
[0031] When no data necessary for the processing of synthesizing the bird’s-eye image and the 3D image is prepared for this region (not recorded on the CD-ROM 20), the flow proceeds from step F104 to F105 to perform normal, flat map display. Then, according to the posture of the main body of the electronic map device 1 detected at step F101, display-image data is generated so the map image, wherein up (map-up) is the designated direction, matches up in terms of gravity. This display-image data is supplied to the display driver 13, and display on the display unit 2 is executed. FIG. 8 illustrates images of this situation.
[0032] It is supposed that the user designates, for example, south as the designated direction in the display-orientation-designated mode. (a) in FIG. 8 is when it is supposed that the user is holding the electronic map device 1 vertically with the screen-up portion up in terms of gravity at this time. Here, as the map image, an image of the certain region wherein south is map-up is displayed; because screen-up and up in terms of gravity match, as illustrated, an image is displayed wherein screen-up is map-up (that is, south).
[0033] Furthermore, (b) in FIG. 8 illustrates the screen-right portion being inclined downward in a state of the user holding the electronic map device 1 vertically. At this time, screen-up of the main body of the electronic map device 1 no longer matches up in terms of gravity. However, processing at step F105 generates the map image simply so map-up (= south) matches up in terms of gravity. Therefore, as illustrated, the displayed map image displays south as being up in terms of gravity such that a map image that is long from a south-east portion to a north-west portion of the map is displayed.
[0034] Furthermore, (c) in FIG. 8 is a state wherein the user holds the electronic map device 1 vertically with the screen-left portion being upward. At this time as well, the processing at step F105 generates the map image simply so map-up (= south) matches up in terms of gravity. Therefore, as illustrated, the displayed map image displays south as being up in terms of gravity such that a map image that is long in a north-south direction of the map is displayed.
[0035] As evident from comparing (a) to (c) in FIG. 8, when the electronic map device 1 is held vertically in the display-orientation-designated mode, the map of the certain region is displayed simply according to the posture of the electronic map device 1 at this time so the designated direction is map-up in a direction that is up in terms of gravity among the screen-up, -down, -left, and -right directions. Therefore, as viewed by the user, the designated direction is displayed continually upward regardless of how the electronic map device 1 is rotated while being held vertically. In terms of a display control operation, the displayed map image changes orientation according to which direction up in terms of gravity is in terms of the screen-up, screen-down, screen-left, and screen-right directions. Note that this operation takes place not only in the vertical state but also when the diagonal state is detected.
[0036] Now, when it is determined at step F104 that a bird’s-eye view or 3D display is possible, the flow proceeds to step F106, and three-dimensional image display from the certain point is performed. Then, here as well, regardless of the posture of the electronic map device 1, a vanishing point of the three-dimensional image (farthest point in a depth direction) becomes the designated direction, display being performed so this is up in terms of gravity. That is, up in the three-dimensional image is made to match up in terms of gravity. Note that the 3D image is displayed when a posture of an incline direction of the main body is vertical and the bird’s-eye image is displayed when this is diagonal.
[0037] FIG. 9 illustrates an example of the electronic map device 1 being held vertically and the 3D image from the certain point being displayed. As the 3D image itself, an image viewing the designated direction (for example, south) from the certain point is generated. This is virtually synthesized by using additional information such as height information of buildings surrounding the specified point. Moreover, as evident from a comparison with (a), (b), and (c) in FIG. 8, regardless of which direction among the screen-up, screen-down, screen-left, and screen-right directions is up in terms of gravity in holding the electronic map device 1, in the displayed 3D image, map-up (that is, up in the 3D image) simply matches up in terms of gravity.
[0038] Note that although not illustrated, when the electronic map device 1 is held in a diagonally inclined state, bird’s-eye display is executed in a similar aspect to FIG. 9. The bird’s-eye image is an image such as that illustrated in (b) in FIG. 15.
[0039] Now, when the electronic map device 1 is horizontal, the processing flows from step F102 to F107. Here, at this point, there is no direction among two-dimensional directions of screen-up, screen-down, screen-left, and screen-right that corresponds to up in terms of gravity. Therefore, at step F107, a posture immediately before the posture becomes horizontal is determined, and at step F108, display is performed so a direction deemed to be upward immediately before is considered to be up in terms of gravity. For example, when, as in (c) in FIG. 8, the electronic map device 1 is placed horizontally immediately after screen-left is deemed to be up in terms of gravity, display of the map image wherein screen-left is the designated direction (here, south) is executed.
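Taking [0030] through [0039] together, the FIG. 7 control flow (steps F101 to F108) can be sketched for illustration as follows; the function name, the posture labels, and the handling of the horizontal case are assumptions for this sketch and are not taken from the publication.

    # Hypothetical sketch of the FIG. 7 flow, steps F101-F108.
    def display_orientation_designated_mode(posture, has_3d_data, previous_up="screen-up"):
        """posture: 'horizontal', 'vertical', or 'diagonal'; returns the display chosen."""
        # F101/F102: monitor the incline state and branch on horizontal vs. not.
        if posture in ("vertical", "diagonal"):
            # F103/F104: is bird's-eye / 3D data (or synthesizable additional
            # information) present for the region to display?
            if has_3d_data:
                # F106: 3D image when vertical, bird's-eye image when diagonal,
                # with the designated direction drawn up in terms of gravity.
                return "3D image" if posture == "vertical" else "bird's-eye image"
            # F105: otherwise flat map display, designated direction gravity-up.
            return "flat map, designated direction up"
        # F107/F108: horizontal, so keep the direction that was gravity-up just
        # before the posture became horizontal.
        return "flat map, designated direction toward " + previous_up

    print(display_orientation_designated_mode("vertical", has_3d_data=True))
    print(display_orientation_designated_mode("horizontal", has_3d_data=False,
                                              previous_up="screen-left"))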
[0040] Note that as above, in the display-orientation-designated mode, display is performed so the designated direction simply matches up in terms of gravity at a given time; it is needless to say that a detection precision and number of detection stages of the screen-up, -down, -left, and -right directions and a detection precision and number of detection stages of an incline can be set in various ways. For example, if the number of detection stages is increased, nearly continuous display-image changing when the main body is rotated is possible so map-up is continually up in terms of gravity. Moreover, if the number of stages is decreased to every 45° or 90° or the like, a display operation is performed of switching display directions upon rotating
