CERTIFICATE OF TRANSLATION ACCURACY

I am a professional reviewer and coordinator specializing in translating Asian and European languages to English and vice versa.
I served as Chief Examiner of the certified court interpreter test for the State of California and as a translator and interpreter for various federal agencies through the U.S. Department of State for more than a decade. I served as an instructor at the University of California at Berkeley and the Middlebury Institute of International Studies at Monterey.
I have more than 30 years of experience translating thousands of technical, legal, and business documents submitted to, among others, Asian judicial authorities, various U.S. federal courts, the U.S. International Trade Commission (ITC), and the USPTO Patent Trial and Appeal Board (PTAB).
I certify that the following document translated from Japanese to the English language is a true, correct, and complete translation of the corresponding source text to the best of my knowledge and ability.
I certify under penalty of perjury that the foregoing is true and correct.
Executed this 26th day of August 2023 in Contra Costa County, State of California.
By:

Alex N. Jo
Member, ATA
JP2007-41143A Feb. 15, 2007

(19) Japanese Patent Office (JP)
(12) Patent Application Gazette (A)
(11) Published Application No.: JP2007-41143 (P2007-41143A)
(43) Published: Feb. 15, 2007

(51) Int. Cl.                 FI
     G09B 29/00 (2006.01)     G09B 29/00
     G01C 21/00 (2006.01)     G01C 21/00
     G09B 29/10 (2006.01)     G09B 29/10
     G08G 1/005 (2006.01)     G08G 1/005
     G06T 11/60 (2006.01)     G06T 11/60

Theme codes (reference): 2C032 2F129 5B050 5B069 5H180
Request for Examination: No    Number of Claims: 8    OL    Total pages: 18

(21) Filed Application No.: 2005-223014 (P2005-223014)
(22) Filed: Aug. 1, 2005 (2005.8.1)
(71) Applicant: 303046277, c/o Asahi Kasei Microdevices Corporation, 1-23-7 Nishi-Shinjuku, Shinjuku-ku, Tokyo, Japan
(74) Agent or Attorney: 100077481, TANI Yoshikazu, Patent Attorney
(72) Inventor: NISHIJIMA Junichi, c/o Asahi Kasei Microdevices Corporation, 3050 Okada, Atsugi-shi, Kanagawa-ken, Japan
(72) Inventor: YAMASHITA Masaya, c/o Asahi Kasei Microdevices Corporation, 3050 Okada, Atsugi-shi, Kanagawa-ken, Japan

F term (reference): 2C032 HB22 HC11 HC25 HD03
                    2F129 AA02 BB03 BB19 BB21 BB26 BB29 DD31 EE02 EE05 EE23 EE52 EE85 EE86 FF12 HH12

Continued on last page
(54) [TITLE OF THE INVENTION] MOBILE DEVICE AND DRAWING PROCESSING CONTROL METHOD THEREOF

(57) ABSTRACT

[PROBLEM]
In accordance with changes to the held orientation state of a device, enable switching to low power consumption or optimal map image display with no sense of discomfort.

[MEANS FOR SOLUTION]
A mobile device where the device direction is detected by an azimuth angle detecting part 13 and that displays a map image on a displaying part based on this direction. The mobile device is provided with an acceleration detecting part 11 for detecting the degree of tilt of the display screen of the displaying part of the device and an estimation calculating part 16 that determines the holding, carrying, and use state of the device using these detection results. When the tilt of the device is detected to be outside a prescribed angle for a prescribed amount of time, the estimation calculating part 16 determines that the device is in a use state where the pedestrian using the device is not viewing the map image and therefore switches to low power consumption by halting display calculation processing for the map image, and writes the map image data from prior to the determination to retention memory, to prepare for redisplaying when the viewing use state is resumed thereafter.
Selected Diagram: FIG. 1
What is claimed is:
[Claim 1]
A mobile device that displays drawing on a displaying part of the device according to the device use state, comprising:
azimuth angle detecting means for detecting azimuth the device faces;
acceleration detection means for detecting the degree of tilt of the displaying part of the device; and
device orientation calculation estimating means that calculates and estimates whether the device orientation is in viewing use state or non-viewing state based on azimuth angle data from the azimuth angle detecting means and acceleration data from the acceleration detection means; wherein
when the orientation of the device is in non-viewing state, the drawing data just prior thereto is retained, drawing processing is halted, and when the orientation of the device is returned to viewing use state, the retained drawing data is used for redisplaying drawing on the displaying part.

[Claim 2]
The mobile device according to claim 1, further comprising:
movement amount calculation estimating means that calculates and estimates movement amount of the device based on a change pattern of acceleration detected by the acceleration detection means over time; wherein
when the orientation of the device is returned to viewing use state, the retained drawing data is shifted according to the movement amount for redisplay.

[Claim 3]
The mobile device according to claim 1 or 2, further comprising:
change amount calculation estimating means that calculates and estimates the amount of change in traveling direction of the device based on a change pattern of acceleration detected by the acceleration detection means over time and on a change pattern of the azimuth angle detected by the azimuth angle detecting means; wherein
when the orientation of the device is returned to viewing use state, the retained drawing data is rotated according to the change amount for redisplay.

[Claim 4]
The mobile device according to claim 1, 2, or 3, wherein the drawing data is map image data, topographical image data, or navigation information.

[Claim 5]
A drawing processing control method for a mobile device that displays drawing on a displaying part of the device according to the use state of the device, comprising:
an azimuth angle detecting step for detecting azimuth the device is facing;
an acceleration detecting step of detecting the degree of tilt of the displaying part of the device;
a device orientation calculating and estimating step for calculating and estimating whether the orientation of the device is in viewing use state or non-viewing state based on the azimuth angle from the azimuth angle detecting step and the acceleration from the acceleration detecting step;
a calculating step for calculating current position and travel direction of the device; and
a display control step for displaying drawing of the current position and the travel direction on the displaying part based on the calculation result from the calculating step; wherein
when the orientation of the device is in non-viewing state, the drawing data just prior thereto is retained, drawing processing is halted, and when the orientation of the device is returned to viewing use state, the retained drawing data is used for redisplaying drawing on the displaying part.

[Claim 6]
The drawing processing control method for a mobile device according to claim 5, further comprising:
a movement amount calculating and estimating step for calculating and estimating movement amount of the device based on a change pattern of acceleration detected in the acceleration detecting step over time; wherein
when the orientation of the device is returned to the viewing use state, the retained drawing data is shifted according to the movement amount for redisplay.

[Claim 7]
The drawing processing control method for a mobile device according to claim 5 or 6, further comprising:
a change amount calculating and estimating step for calculating and estimating the amount of change in traveling direction of the device based on a change pattern of acceleration detected in the acceleration detecting step over time and a change pattern of the azimuth angle detected by the azimuth angle detecting means; wherein
when the orientation of the device is returned to viewing use state, the azimuth of the retained drawing data is rotated according to the change amount for redisplay.
[Claim 8]
The drawing processing control method for a mobile device according to claim 5, 6, or 7, wherein the drawing data is map image data, topographical image data, or navigation information.
DETAILED DESCRIPTION OF THE INVENTION
TECHNICAL FIELD
[0001]
The present invention relates to a mobile device equipped with an autonomous navigation device for a pedestrian and to a drawing processing control method, and in more detail, relates to a mobile device equipped with azimuth detecting means such as a geomagnetism sensor or the like and acceleration detection means such as an acceleration sensor or the like, and having an autonomous navigation device for a pedestrian that can be used for a walking navigation device installed therein, and to a drawing processing control method that displays drawing on the displaying part based on the azimuth detected.
CONVENTIONAL TECHNOLOGY
[0002]
In recent years, small mobile devices equipped with a GPS (Global Positioning System) function, represented by the cell phone, have been actively developed. In these devices, there is an application example in which navigation for pedestrians is performed in conjunction with a GPS device, map distribution over a mobile phone network or the like, a display application, and the like.
[0003]
Some mobile devices that provide walking navigation for pedestrians are equipped with azimuth detecting means such as a geomagnetism sensor or the like in order to align the traveling direction of the pedestrian with the display of the map. In addition, there are also devices equipped with acceleration detection means such as an acceleration sensor for detecting the orientation of the mobile device (detecting and correcting the orientation using acceleration detection means is essential for knowing the correct azimuth for an arbitrary orientation).
[0004]
Autonomous navigation using an azimuth angle sensor and an acceleration sensor is disclosed, for example, in Patent Document 1 and Patent Document 2. The device in Patent Document 1 is a mobile position detection device that improves the position detection accuracy of a pedestrian using autonomous navigation (for example, a configuration with a sensor that detects "number of steps × stride length", correction of stride length, and movement direction) by changing the stride to match the walking state, enabling the movement direction of a pedestrian to be detected more accurately even if a GPS signal cannot be received. With the mobile position detection device, the movement position is detected by calculating "number of steps × stride length", and by correcting the stride length according to the walking state from the walking time per step using the acceleration sensor and detecting the movement direction using the geomagnetism sensor, the movement position of the pedestrian can be accurately detected using autonomous navigation. Herein, even when located in a forest or in a valley between high-rise buildings where a GPS satellite signal cannot be received, the position of a pedestrian carrying the mobile position detection device can readily be known with sufficient accuracy for practical use thanks to the improved stride length accuracy.
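As a rough illustration of the "number of steps × stride length" calculation described above, the following minimal Python sketch accumulates position from per-step heading and a corrected stride; the correction rule, constants, and function names are assumptions for illustration only and are not taken from Patent Document 1.

    # Sketch of "number of steps x stride length" dead reckoning.
    # The stride-correction rule below is a hypothetical illustration.
    import math

    def corrected_stride(base_stride_m, step_period_s):
        """Shorten the assumed stride when steps take longer (slower walking)."""
        # Hypothetical correction clamped around a 0.5 s nominal step period.
        return base_stride_m * min(1.2, max(0.6, 0.5 / step_period_s))

    def walked_position(start_xy, headings_deg, step_periods_s, base_stride_m=0.7):
        """Accumulate position from per-step heading and corrected stride."""
        x, y = start_xy
        for heading, period in zip(headings_deg, step_periods_s):
            stride = corrected_stride(base_stride_m, period)
            x += stride * math.sin(math.radians(heading))  # east component
            y += stride * math.cos(math.radians(heading))  # north component
        return x, y

    # Example: four steps heading roughly north-east at a normal pace.
    print(walked_position((0.0, 0.0), [45, 45, 50, 48], [0.5, 0.5, 0.55, 0.5]))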
[0005]
In addition, Patent Document 2 relates to a walking navigation device that performs walking navigation for a pedestrian who moves in and out of buildings. This walking navigation device is equipped with a computer, and an input device connected to this computer is mounted on the waist of the pedestrian. When the pedestrian walks, the acceleration in the traveling direction and the upward direction is detected by a traveling direction accelerometer and an upward direction accelerometer. A central processing unit (CPU) provided in the computer calculates a cross-correlation function from the detection results and compares this with cross-correlation functions of horizontal, ascending, and descending gaits stored in advance on a hard disk (HD). One of the walking behaviors is selected by this determination, and thus it is possible to readily know that the pedestrian has walked on a path with a small height difference, such as going up or down stairs.
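As a hedged illustration of the cross-correlation comparison described in this paragraph, the sketch below correlates a measured acceleration trace against stored gait templates and picks the best match; the template shapes, normalization, and scoring are assumptions for illustration and not the implementation of Patent Document 2.

    # Sketch: classify a gait by cross-correlating an acceleration trace against
    # stored templates (horizontal, ascending, descending gaits).
    import numpy as np

    def best_gait(trace, templates):
        trace = (trace - trace.mean()) / (trace.std() + 1e-9)
        scores = {}
        for name, tmpl in templates.items():
            tmpl = (tmpl - tmpl.mean()) / (tmpl.std() + 1e-9)
            # Peak of the cross-correlation, normalized by template length.
            scores[name] = np.max(np.correlate(trace, tmpl, mode="full")) / len(tmpl)
        return max(scores, key=scores.get)

    t = np.linspace(0.0, 2.0 * np.pi, 100)
    templates = {
        "horizontal": np.sin(2 * t),
        "ascending": np.sin(2 * t) + 0.5 * np.sin(4 * t),
        "descending": np.sin(2 * t) - 0.5 * np.sin(4 * t),
    }
    measured = np.sin(2 * t) + 0.45 * np.sin(4 * t)
    print(best_gait(measured, templates))  # -> "ascending"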
[0006]
Patent Document 1: Japanese Unexamined Patent Application 2000-97722
Patent Document 2: Japanese Unexamined Patent Application 2002-139340
Patent Document 3: Japanese Unexamined Patent Application 2005-157465
DISCLOSURE OF THE INVENTION
PROBLEMS TO BE SOLVED BY THE INVENTION
[0007]
However, these imposed a certain level of restriction on the user, such as requiring use in a specified orientation (for example, a mobile device fixed to the waist) or requiring calibration after mounting when the orientation is not specified.
[0008]
In addition, with a mobile device used for this manner of walking navigation, map drawing processing continues even if the device is held in an orientation in which the device screen is not being viewed, which causes a problem of consuming power. There was also a problem in that the orientation angle of the device in a state where the device screen is not being viewed is very different from the angle when in use, so the azimuth calculation and drawing processing deviate greatly from the actual situation; just after returning to a viewing position and resuming use, map data in a completely different direction relative to the direction last displayed while the device was being viewed appears on the screen.
[0009]
In light of these problems, an object of the present invention is to provide a mobile device and a drawing processing control method that enable the mobile device to be held freely while being used for walking navigation, switch to low power consumption based on changes between the viewing-state and non-viewing-state hold orientations, and enable navigation that does not cause stress to the user through optimal drawing display that does not cause discomfort.
MEANS FOR SOLVING THE PROBLEM
[0010]
The present invention is for achieving this manner of object, and the invention according to claim 1 is a mobile device that displays drawing on a displaying part of the device according to the device use state, including:
azimuth angle detecting means for detecting azimuth the device faces;
acceleration detection means for detecting the degree of tilt of the displaying part of the device; and
device orientation calculation estimating means that calculates and estimates whether the device orientation is in viewing use state or non-viewing state based on azimuth angle data from the azimuth angle detecting means and acceleration data from the acceleration detection means; wherein
when the orientation of the device is in non-viewing state, the drawing data just prior thereto is retained, drawing processing is halted, and when the orientation of the device is returned to viewing use state, the retained drawing data is used for redisplaying drawing on the displaying part.
[0011]
In addition, the invention according to claim 2 is the invention according to claim 1, further including:
movement amount calculation estimating means that calculates and estimates movement amount of the device based on a change pattern of acceleration detected by the acceleration detection means over time; wherein
when the orientation of the device is returned to viewing use state, the retained drawing data is shifted according to the movement amount for redisplay.
[0012]
In addition, the invention according to claim 3 is the invention according to claim 1 or 2, further including:
change amount calculation estimating means that calculates and estimates the amount of change in traveling direction of the device based on a change pattern of acceleration detected by the acceleration detection means over time and on a change pattern of the azimuth angle detected by the azimuth angle detecting means; wherein
when the orientation of the device is returned to viewing use state, the retained drawing data is rotated according to the change amount for redisplay.
[0013]
The invention according to claim 4 is the invention according to claim 1, 2, or 3, wherein the drawing data is map image data, topographical image data, or navigation information. Here, the navigation information refers to various display information such as distances to some target points on the route to a preset destination, directions of travel indicated by arrows, and the like.
[0014]
In addition, the invention according to claim 5 is a drawing processing control method for a mobile device that displays drawing on a displaying part of the device according to the use state of the device, including:
an azimuth angle detecting step for detecting azimuth the device is facing;
an acceleration detecting step of detecting the degree of tilt of the displaying part of the device;
a device orientation calculating and estimating step for calculating and estimating whether the orientation of the device is in viewing use state or non-viewing state based on the azimuth angle from the azimuth angle detecting step and the acceleration from the acceleration detecting step;
a calculating step for calculating current position and travel direction of the device; and
a display control step for displaying drawing of the current position and the travel direction on the displaying part based on the calculation result from the calculating step; wherein
when the orientation of the device is in non-viewing state, the drawing data just prior thereto is retained, drawing processing is halted, and when the orientation of the device is returned to viewing use state, the retained drawing data is used for redisplaying drawing on the displaying part.
[0015]
In addition, the invention according to claim 6 is the invention according to claim 5, further including:
a movement amount calculating and estimating step for calculating and estimating movement amount of the device based on a change pattern of acceleration detected in the acceleration detecting step over time; wherein
when the orientation of the device is returned to the viewing use state, the retained drawing data is shifted according to the movement amount for redisplay.
[0016]
The invention according to claim 7 is the invention according to claim 5 or 6, further including:
a change amount calculating and estimating step for calculating and estimating the amount of change in traveling direction of the device based on a change pattern of acceleration detected in the acceleration detecting step over time and a change pattern of the azimuth angle detected by the azimuth angle detecting means; wherein
when the orientation of the device is returned to viewing use state, the azimuth of the retained drawing data is rotated according to the change amount for redisplay.
[0017]
The invention according to claim 8 is the invention according to claim 5, 6, or 7, wherein the drawing data is map image data, topographical image data, or navigation information.
[0018]
In other words, the mobile device of the present invention causes the displaying part of the device to display a drawing based on the drawing data according to the usage state of the device, and includes azimuth angle detecting means for detecting the azimuth the device is facing, acceleration detection means for detecting tilt of the device, and a calculation estimating part for determining the hold, carrying, or usage state of the device based on the detection results. When the tilt of the device is detected to be outside a prescribed angle for a prescribed amount of time, the calculation estimating part determines that the device is in a use state where the pedestrian using the device is not viewing the drawing image and therefore switches to low power consumption by halting display calculation processing for the drawing data, and writes the drawing data from prior to the determination to retention memory, to prepare for redisplaying when viewing use state is resumed thereafter.
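To make the tilt-based determination in this paragraph concrete, the following is a minimal Python sketch, assuming a hypothetical viewing-angle threshold and dwell time; the constants, names, and snapshot mechanism are illustrative placeholders rather than values from this specification.

    # Sketch: decide viewing vs. non-viewing from the display tilt and pause
    # drawing when the tilt stays outside a prescribed angle for a prescribed
    # time. VIEW_ANGLE_DEG and DWELL_S are hypothetical values.
    VIEW_ANGLE_DEG = 45.0
    DWELL_S = 3.0

    class DrawingController:
        def __init__(self):
            self.viewing = True
            self.saved_map = None        # retention memory for the last map data
            self._outside_since = None

        def on_tilt_sample(self, tilt_deg, current_map, now_s):
            """Feed one tilt sample (degrees) with its timestamp in seconds."""
            if abs(tilt_deg) > VIEW_ANGLE_DEG:
                if self._outside_since is None:
                    self._outside_since = now_s
                if self.viewing and now_s - self._outside_since >= DWELL_S:
                    self.saved_map = current_map   # keep the data from just prior
                    self.viewing = False           # halt drawing, low power mode
            else:
                self._outside_since = None
                self.viewing = True                # viewing use state resumed
            return self.viewing

    ctrl = DrawingController()
    ctrl.on_tilt_sample(80.0, "map@t0", now_s=0.0)         # tilted away, timer starts
    print(ctrl.on_tilt_sample(80.0, "map@t1", now_s=3.5))  # False: drawing halted
    print(ctrl.saved_map)                                  # "map@t1" retained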
[0019]
In addition, even in the case of a device tilt such that the device is determined to be in a non-viewing state, if the calculation estimating part detects a prescribed pattern of vibration and oscillation of the device for a prescribed amount of time or more, the pedestrian using the device is determined to be walking and to intend to immediately use azimuth-related information. Therefore, the azimuth angle detecting means and acceleration detection means continue to calculate the azimuth angle and acceleration of the device at prescribed intervals, and the movement distance and travel direction are calculated based thereon.
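A hedged sketch of the walking check described here: it looks for a roughly periodic step pattern in vertical acceleration over a recent window, so that sampling of azimuth and acceleration can continue at prescribed intervals even in the non-viewing orientation. The threshold, frequency band, and peak-counting heuristic are assumptions for illustration only.

    # Sketch: treat the device as "walking" when vertical acceleration crosses a
    # threshold at a plausible stepping rate. Constants are illustrative.
    def is_walking(accel_z, sample_hz=50, min_steps=3, step_hz=(1.0, 3.0)):
        """accel_z: recent vertical acceleration samples with gravity removed."""
        threshold = 1.5  # m/s^2, hypothetical step-impact threshold
        crossings = [
            i for i in range(1, len(accel_z))
            if accel_z[i - 1] < threshold <= accel_z[i]
        ]
        if len(crossings) < min_steps:
            return False
        duration_s = (crossings[-1] - crossings[0]) / sample_hz
        rate_hz = (len(crossings) - 1) / duration_s if duration_s > 0 else 0.0
        return step_hz[0] <= rate_hz <= step_hz[1]

    # Example: a roughly 2 Hz synthetic step pattern sampled at 50 Hz.
    samples = [(2.0 if (i // 12) % 2 == 0 else 0.0) for i in range(100)]
    print(is_walking(samples))  # -> True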
[0020]
When the calculation estimating part determines that the detected tilt of the device is within a prescribed angle range for a prescribed period of time or more, the low power consumption mode is canceled and the drawing data saved in retention memory is displayed immediately, or this drawing data image is moved according to the walking movement amount and travel direction change angle based on the detection and calculation data determined while walking.
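The resume path described here can be pictured as below: a minimal sketch, assuming that the walking movement amount and the travel-direction change angle were accumulated while drawing was halted, and that shift_fn, rotate_fn, and show_fn stand in for the device's actual drawing operations; the significance thresholds are illustrative.

    # Sketch: when the viewing state is restored, cancel the low power mode and
    # redisplay the retained map, shifted and/or rotated by the movement that
    # accumulated while drawing was halted. Thresholds are illustrative.
    def resume_display(saved_map, walked_m, heading_change_deg,
                       shift_fn, rotate_fn, show_fn):
        image = saved_map
        if abs(walked_m) > 1.0:                   # pedestrian moved meaningfully
            image = shift_fn(image, walked_m)     # shift by the movement amount
        if abs(heading_change_deg) > 5.0:         # travel direction changed
            image = rotate_fn(image, heading_change_deg)
        show_fn(image)                            # display immediately
        return image

    # Usage with trivial stand-in transforms:
    resume_display("map", 12.0, 30.0,
                   shift_fn=lambda m, d: f"{m}+shift({d}m)",
                   rotate_fn=lambda m, a: f"{m}+rotate({a}deg)",
                   show_fn=print)                 # prints map+shift(12.0m)+rotate(30.0deg)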
[0021]
In addition, the drawing processing control method of the mobile device of the present invention determines whether or not the device orientation angle is within a prescribed range to estimate the viewing state or non-viewing state and, in the case of estimating the non-viewing state, switches to low power consumption by halting display calculation processing, and writes the drawing data from prior to the determination to retention memory, to prepare for redisplaying when viewing use state is resumed thereafter.
[0022]
In addition, device movement is determined to be straight forward movement or path movement, and in the case of straight forward movement, the retained drawing data is shifted by the movement amount of the device for redisplay, while in the case of path movement of the device and returning to viewing use state, the retained drawing data is rotated by the direction change angle according to the path movement distance of the device.
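A minimal sketch of the straight-versus-path decision in this paragraph, assuming a hypothetical tolerance on the net heading change; the embodiment described later bases this estimate on the change patterns of both the acceleration and the azimuth angle.

    # Sketch: classify movement while the screen was not viewed as "straight"
    # or "path" (turning) from the net azimuth change. The 15-degree tolerance
    # is an illustrative assumption.
    def classify_movement(azimuth_history_deg, straight_tolerance_deg=15.0):
        if len(azimuth_history_deg) < 2:
            return "straight"
        net_turn = (azimuth_history_deg[-1] - azimuth_history_deg[0] + 180.0) % 360.0 - 180.0
        return "straight" if abs(net_turn) <= straight_tolerance_deg else "path"

    print(classify_movement([90, 92, 91, 93]))     # -> "straight"
    print(classify_movement([90, 110, 140, 170]))  # -> "path" (net turn 80 deg)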
[0023]
Based on calculation and display control in the configuration described above, it is possible to detect the orientation and behavior of the device in space when using walking navigation, determine the state in which the device screen is not being viewed and the pedestrian's walking state, and thus stop unnecessary drawing processing. In addition, since the orientation angle of the device when the device screen is not being viewed is very different from the angle when using the device, the azimuth calculation and drawing processing would be completely different from the actual situation. During this time, the drawing processing can be stopped while retaining the drawing data for that time. Immediately after returning to the use state, the drawing data can be moved and redrawn based on the final position where the pedestrian was looking at the device. During various usage actions by the pedestrian, a screen in a totally different direction is not displayed while the device is being held in a position for viewing, so navigation use that does not cause a sense of discomfort can be provided.
[EFFECTS OF THE INVENTION]
[0024]
According to the present invention, in walking navigation using a mobile device equipped with azimuth angle detecting means and acceleration detection means, it is possible to detect whether the device is in a viewing state or a non-viewing state, such as "looking at the display screen/not looking at the display screen," and to switch to low power consumption in response to changes in the holding orientation of the device, and navigation that does not cause stress to the user can be provided by displaying optimal images with no sense of discomfort.
[DESCRIPTION OF THE PREFERRED EMBODIMENTS]
[0025]
The following is a description of an embodiment of the present invention with reference to the drawings.
FIG. 1 is a block diagram depicting an equipment configuration of a mobile device according to the present invention. In the figure, reference numeral 11 represents a 3-axis acceleration detecting part, 12 represents a 3-axis acceleration sensor, 13 represents a 3-axis azimuth angle detecting part, 14 represents a 3-axis azimuth angle sensor, 15 represents a central processing unit (CPU), 16 represents a state calculation estimating part, 16a represents a walking state estimating part, 16b represents a device orientation estimating part (device orientation calculation estimating means), 16c represents a walking direction estimating part (change amount calculation estimating means), 16d represents a walking amount estimating part (movement amount calculation estimating means), 17 represents a calculating part, 18 represents a display controlling part, 19 represents various types of memory, 20 represents display memory, 21 represents other units, 22 represents a display for displaying, and 23 represents a pedometer.
[0026]
The 3-axis acceleration detecting part 11 has the 3-axis acceleration sensor 12 for detecting tilt and motion (acceleration) of the mobile device incorporated therein, and the 3-axis azimuth angle detecting part 13 has the 3-axis azimuth angle sensor 14 for detecting geomagnetism incorporated therein. The 3-axis azimuth angle sensor 14 responds to geomagnetism, and a direction angle information signal that is output accordingly is periodically captured into the 3-axis azimuth angle detecting part 13, which then generates data indicating the facing direction of the screen part of the device displaying a map image.
[0027]
In addition, the central processing unit (CPU) 15 performs control and calculation for the mobile device, receives 3-axis acceleration data from the 3-axis acceleration detecting part 11 and 3-axis azimuth angle data from the 3-axis azimuth angle detecting part 13, and integrates dedicated software for the calculation estimating part 16 that calculates the orientation angle, motion, advancing direction, and azimuth direction of the device itself, the calculating part 17 that calculates the current position and traveling direction of the pedestrian based on the calculation estimating results from this calculation estimating part 16, GPS information, and information from other units 21, and the display controlling part 18 that performs processing for drawing on the screen displaying a map image according to the calculation results from the calculating part 17.
[0028]
Various types of memory 19 such as ROM and RAM are provided for this software, and various types of controllers, I/O, other units 21, and the like are also controlled by the CPU 15. The display memory 20 stores map image data for displaying on the display screen; this map image data itself is retained and used in conjunction with the various types of memory 19 through data distribution and the like, using physical integration with, or communication means of, the mobile device. The display for displaying 22 provides the display for the mobile device, such as a liquid crystal displaying part.
[0029]
The calculation estimating part 16 includes:
a walking state estimating part 16a that estimates the walking state of the pedestrian, in other words, whether the state is a device orientation fixed state or swing state, based on the 3-axis acceleration data from the 3-axis acceleration detecting part 11 and the 3-axis azimuth angle data from the 3-axis azimuth angle detecting part 13;
a device orientation estimating part 16b that estimates whether the orientation of the device is in a viewing use state or a non-viewing state;
a walking direction estimating part 16c that estimates the walking direction of the pedestrian, in other words, whether straight forward movement or moving along a path, based on data from the walking state estimating part 16a and the device orientation estimating part 16b; and
a walking amount estimating part 16d that estimates the walking amount from step count data and stride data from the pedometer 23;
and is configured such that the calculation estimating result data from the calculation estimating part 16 is input into the calculating part 17.
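To illustrate how the four estimating parts listed above could feed the calculating part 17, the following is a minimal Python sketch; the class and field names mirror the reference numerals for readability, but the estimator bodies are placeholders and do not reproduce the algorithms of the embodiment.

    # Sketch: composition of estimators 16a-16d feeding the calculating part 17.
    # Only the data flow mirrors FIG. 1; the estimator bodies are placeholders.
    from dataclasses import dataclass

    @dataclass
    class EstimationResult:
        walking: bool       # 16a: device orientation fixed state vs. swing state
        viewing: bool       # 16b: viewing use state vs. non-viewing state
        straight: bool      # 16c: straight forward movement vs. path movement
        walked_m: float     # 16d: walking amount (step count x stride)

    class CalculationEstimatingPart16:
        def __init__(self, stride_m):
            self.stride_m = stride_m

        def estimate(self, accel_3axis, azimuth_3axis, step_count):
            walking = max(abs(a) for a in accel_3axis[:2]) > 1.0  # placeholder 16a: horizontal shaking
            viewing = accel_3axis[2] > 7.0                        # placeholder 16b: screen roughly face-up (m/s^2)
            straight = abs(azimuth_3axis[0]) < 15.0               # placeholder 16c: small heading change
            walked_m = step_count * self.stride_m                 # 16d
            return EstimationResult(walking, viewing, straight, walked_m)

    def calculating_part_17(result, position_xy):
        """Advance the current position when walking straight ahead (placeholder)."""
        x, y = position_xy
        if result.walking and result.straight:
            y += result.walked_m
        return x, y

    r = CalculationEstimatingPart16(stride_m=0.7).estimate((1.5, 0.3, 9.5), (5.0, 0.0, 0.0), 4)
    print(r)                                    # EstimationResult(walking=True, ...)
    print(calculating_part_17(r, (0.0, 0.0)))   # (0.0, 2.8)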
[0030]
In addition, the pedometer 23 is provided in the CPU 15, and this pedometer 23 calculates the number of steps of the pedestrian based on the 3-axis acceleration data from the 3-axis acceleration detecting part 11 and the 3-axis azimuth angle data from the 3-axis azimuth angle detecting part 13, and this number of steps data is input to the calculating part 17 and the walking amount estimating part 16d. Note that the stride data is pre-set by the pedestrian, is stored in the various types of memory 19, and is read for use when estimating the walking amount.
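As a hedged illustration of the walking amount estimation described here, the toy pedometer below counts threshold crossings of vertical acceleration and multiplies by the stride that the pedestrian pre-set in memory; the threshold value and the memory layout are assumptions of this sketch.

    # Sketch: a toy pedometer that counts steps as upward threshold crossings in
    # vertical acceleration, then multiplies by the pre-set stride read from
    # memory. The 1.2 m/s^2 threshold and the dict layout are illustrative.
    memory_19 = {"stride_m": 0.68}     # stand-in for the various types of memory 19

    def count_steps(accel_z, threshold=1.2):
        return sum(
            1 for i in range(1, len(accel_z))
            if accel_z[i - 1] < threshold <= accel_z[i]
        )

    def walking_amount_16d(accel_z, memory=memory_19):
        return count_steps(accel_z) * memory["stride_m"]

    trace = [0.0, 1.5, 0.2, 1.6, 0.1, 1.4, 0.0]   # three step impacts
    print(walking_amount_16d(trace))               # -> 2.04 (3 steps x 0.68 m)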
[0031]
In this manner, the mobile device of the present invention is a mobile device that detects the direction of the device using the azimuth angle detecting part 13 and displays map image data on the displaying part of the device based on the detected direction of the device. The acceleration detecting part 11 detects the degree of tilt of the display surface of the displaying part of the device. The device orientation calculation estimating part 16b estimates whether the orientation of the device is in a viewing use state or in a non-viewing state based on the azimuth angle data from the azimuth angle detecting part 13 and the acceleration data from the acceleration detecting part 11.
[0032]
The device is configured so that when the orientation is in the non-viewing state, the previous map image data is retained and drawing processing is stopped, and when the orientation of the device is returned to the viewing use state, the retained map image data is used to redisplay the map image on the displaying part. Note that when drawing processing is paused and the calculation processing or the new drawing operation itself is paused, there are cases where, additionally, display accompaniment means such as a backlight is stopped or darkened for lower power consumption.
[0033]
In addition, a walking amount estimating part 16d is provided for calculating and estimating the amount of movement of the device based on the change pattern over time of acceleration detected by the acceleration detecting part 11, so that when the orientation of the device is restored to the viewing use state, the retained map image data can be shifted according to the movement amount for redisplay.
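A minimal sketch of the shift described here, assuming the retained map is addressed by a viewport origin in pixels and the walked distance is resolved along the travel heading; the pixels-per-meter scale and the heading convention (degrees clockwise from north) are hypothetical parameters of this illustration.

    # Sketch: shift the retained map viewport by the estimated movement amount
    # resolved along the travel heading. Scale and conventions are illustrative.
    import math

    def shift_viewport(origin_px, walked_m, heading_deg, px_per_m=10.0):
        ox, oy = origin_px
        dx = walked_m * math.sin(math.radians(heading_deg)) * px_per_m
        dy = -walked_m * math.cos(math.radians(heading_deg)) * px_per_m  # screen y grows downward
        return ox + dx, oy + dy

    print(shift_viewport((100.0, 100.0), walked_m=5.0, heading_deg=90.0))
    # heading due east: viewport moves 50 px in +x -> approximately (150.0, 100.0)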
[0034]
In addition, a walking direction estimating part 16c is provided to calculate and estimate the amount of change of the device in the traveling direction based on the change pattern over time of the acceleration detected by the acceleration detecting part 11 and the change pattern of the azimuth angle detected by the azimuth angle detecting part 13, so that when the orientation of the device is returned to the viewing use state, the retained map image data can be redisplayed rotated according to the amount of change in the traveling direction.
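To picture the rotation step, here is a minimal sketch that rotates map-space points about the display center by the travel-direction change angle; treating the retained map data as rotatable point geometry rather than a bitmap is a simplifying assumption made to keep the example dependency-free.

    # Sketch: rotate retained map geometry about the display center by the
    # change in traveling direction.
    import math

    def rotate_about(points, center, angle_deg):
        cx, cy = center
        a = math.radians(angle_deg)
        cos_a, sin_a = math.cos(a), math.sin(a)
        return [
            (cx + (x - cx) * cos_a - (y - cy) * sin_a,
             cy + (x - cx) * sin_a + (y - cy) * cos_a)
            for x, y in points
        ]

    # Travel direction changed by 90 degrees while the screen was not viewed.
    print(rotate_about([(10.0, 0.0)], center=(0.0, 0.0), angle_deg=90.0))
    # -> approximately [(0.0, 10.0)]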
[0035]
Next, map drawing processing on the screen of the mobile device of the present invention composed of the configuration depicted in FIG. 1 will be described.
[0036]
FIG. 2 is a diagram depicting, as a flowchart, a procedure for performing map drawing processing according to the orientation of the mobile device by detecting the degree of tilt and direction of the display screen of the walking navigation information displaying part using the mobile device equipped w
