(12) Patent Application Publication (10) Pub. No.: US 2008/0107411 A1
Hope (43) Pub. Date: May 8, 2008

US 20080107411A1

(54) USER DEFINED AUTOFOCUS AREA

(75) Inventor: Julian Charles Hope, Bolton (GB)

Correspondence Address:
HARRITY SNYDER, L.L.P.
11350 RANDOM HILLS ROAD, SUITE 600
FAIRFAX, VA 22030

(73) Assignee: SONY ERICSSON MOBILE COMMUNICATIONS AB, Lund (SE)

(21) Appl. No.: 11/557,200

(22) Filed: Nov. 7, 2006

Publication Classification

(51) Int. Cl.
G03B 13/34 (2006.01)
(52) U.S. Cl. ....................................................... 396/121

(57) ABSTRACT

A device allows a user to select an arbitrary portion of a scene being previewed for picture taking. In one implementation, the device includes a camera; a display to preview a scene associated with the camera; an interface configured to allow a user of the device to select a portion of the scene; and autofocus logic to adjust the focus of the camera based on the selected portion of the scene.
[Representative drawing: block diagram of device 100, showing antenna 250, communication interface, user interface logic 260, processing logic, memory 220, and autofocus logic.]

Exhibit 1052
Apple v. Qualcomm
IPR2018-01277

Patent Application Publication, May 8, 2008, US 2008/0107411 A1

[Sheet 1 of 7, FIGS. 1A and 1B: front and back views of device 100, showing elements 110, 120, 130, 150, and 170.]

[Sheet 2 of 7, FIG. 2: block diagram of device 100, including antenna 250, communication interface, user interface logic 260, processing logic, memory 220, autofocus logic 230, and camera 170.]

[Sheet 3 of 7, FIG. 3: flow chart; 302: display preview of scene; 304: receive area for autofocus; 306: perform autofocus; 308: image capture initiated? (if no, continue autofocusing); 310: capture image; 312: store captured image.]

[Sheet 4 of 7, FIG. 4: exemplary scene shown in display 130.]

[Sheet 5 of 7, FIG. 5: exemplary scene with a user-drawn selection.]

[Sheet 6 of 7, FIG. 6: exemplary scene shown in display 130 with stylus 540.]

[Sheet 7 of 7, FIG. 7: exemplary scene shown in display 130.]
USER DEFINED AUTOFOCUS AREA

BACKGROUND

[0001] 1. Technical Field of the Invention

[0002] Implementations described herein relate generally to imaging systems, and more particularly, to portable imaging devices having an autofocus feature.

[0003] 2. Description of Related Art

[0004] Many portable imaging devices typically include an autofocus feature through which the device automatically adjusts the optical system of the device to obtain correct focus on a subject. In other words, the portable imaging device will automatically place the subject of interest into focus without requiring manual adjustment by the operator.
[0005] In a typical autofocus operation, a user may center the subject of interest in the frame of the picture. The imaging device may then automatically (or, in some devices, in response to a user pressing an “autofocus” button) adjust the image so that whatever is in the center portion of the image is in-focus. This type of autofocus operation can be efficient in many situations but, in some situations, can be problematic.
SUMMARY

[0006] According to one aspect, a device includes a camera; a display to preview a scene associated with the camera; an interface configured to allow a user of the device to select a portion of the scene; and autofocus logic to adjust the focus of the camera based on the selected portion of the scene.

[0007] Additionally, the autofocus logic may adjust the focus of the camera by adjusting a focal length of an optical component of the camera using image processing techniques applied to the selected portion of the scene.

[0008] Additionally, the display may include a touch screen display.

[0009] Additionally, the device may include logic to allow the user to select the portion of the scene by drawing a closed or nearly closed shape on the display.

[0010] Additionally, the device may include logic to allow the user to select a location in the scene by touching the display with a stylus and logic to generate the portion of the scene based on the selected location.

[0011] Additionally, the generated portion of the scene may be generated as a rectangular area of the scene centered at the selected location.

[0012] Additionally, the generated portion of the scene may be generated as a shape identified based on an object corresponding to the selected location.

[0013] Additionally, the device may include logic to overlay an icon on the scene, wherein the location of the icon is controllable to select a location in the scene, the location being used to generate the selected portion of the scene.

[0014] Additionally, the autofocus logic may adjust the focus of the camera using a passive autofocus technique.

[0015] Additionally, the autofocus logic may adjust the focus of the camera using an active autofocus technique.

[0016] Additionally, the device may be a mobile phone.

[0017] In another aspect, a method includes displaying a scene from a camera; receiving a selection of a portion of the scene from a user; and adjusting an optical component of the camera to optimize focus of the camera at the selected portion of the scene.

[0018] Additionally, the method may include capturing an image of the scene in response to a command from the user; and storing the captured image.
[0019] Additionally, the receiving selection of a portion of the scene from the user may include receiving a shape drawn by the user on a display on which the scene is displayed and generating the portion of the scene based on the shape.

[0020] Additionally, the shape may be a closed or nearly closed shape.

[0021] Additionally, receiving selection of a portion of the scene from the user may include receiving selection of a location specified by touching a display and generating the portion of the scene based on the selected location.

[0022] Additionally, the generated portion of the scene may be generated as a rectangular area within the scene and centered at the selected location.

[0023] Additionally, adjusting the optical component of the camera may be based on passive autofocusing techniques.

[0024] Additionally, adjusting the optical component of the camera may be based on active autofocusing techniques.

[0025] According to another aspect, a device may include means for displaying a scene from a camera; means for receiving selection of a portion of the scene from a user; and means for adjusting an optical component of the camera based on the selected portion of the scene.

[0026] Additionally, the means for receiving may further include means for receiving a shape drawn by the user on the means for displaying; and means for generating the portion of the scene based on the shape.

[0027] Additionally, the means for receiving may further include means for receiving selection of a location specified by touching the display; and means for generating the portion of the scene based on the selected location.

[0028] Additionally, the device may include a mobile terminal.
BRIEF DESCRIPTION OF THE DRAWINGS

[0029] The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate an embodiment of the invention and, together with the description, explain the invention. In the drawings,

[0030] FIG. 1A is a front side view of an exemplary electronic device;

[0031] FIG. 1B is a back side view of the exemplary electronic device;

[0032] FIG. 2 is a diagram of exemplary conceptual components of the system architecture of the device shown in FIGS. 1A and 1B;

[0033] FIG. 3 is a flow chart illustrating exemplary operations for autofocusing a scene for an electronic device;

[0034] FIG. 4 is a diagram illustrating an exemplary scene shown in the display of an electronic device;

[0035] FIG. 5 is a diagram illustrating an exemplary technique through which a user can change the portion of the scene used for autofocusing;

[0036] FIG. 6 is a diagram illustrating an alternate exemplary technique through which a user can change the portion of the scene used for autofocusing; and

[0037] FIG. 7 is a diagram illustrating an alternate exemplary technique through which a user can change the portion of the scene used for autofocusing.
DETAILED DESCRIPTION OF EMBODIMENTS

[0038] The following detailed description of the invention refers to the accompanying drawings. The same reference numbers in different drawings may identify the same or similar elements. Also, the following detailed description does not limit the invention.
[0039] As described herein, a portable imaging device, such as a portable communication device, includes a camera autofocus feature in which the user may, if desired, select an arbitrary portion of a selected camera scene to which the autofocus operation is to be applied.
[0040] Allowing a user to select a portion of a scene for autofocusing can be particularly useful in a number of situations. For example, if the portable imaging device is attached to a tripod and the subject is located off to one side in the scene, the user can instruct the portable imaging device to autofocus based on the position of the subject without physically moving the portable imaging device. As another example, consider the situation in which the camera lens is remote to the user or to the camera phone body (e.g., such as via a short range wireless connection). In this situation, it may also be desirable to change the autofocus area while maintaining a fixed camera lens position.
Exemplary Devices

[0041] FIGS. 1A and 1B are diagrams of an exemplary portable imaging device 100 in which embodiments described herein may be implemented. In this example, portable imaging device 100 is a mobile phone. FIG. 1A illustrates the front of device 100 and FIG. 1B illustrates the back of device 100. As used herein, a “device” may include a mobile telephone (e.g., a radiotelephone); a personal communications system (PCS) terminal that may combine a cellular radiotelephone with data processing, facsimile, and/or data communications capabilities; a personal digital assistant (PDA) that may include a radiotelephone, pager, Internet/intranet access, web browser, organizer, calendar, and/or global positioning system (GPS) receiver; a laptop; a GPS device; a computer; an MP3 player; a pager; a digital camera; a video camera; binoculars; a telescope; and/or any other device including a camera.
[0042] Device 100 may include housing 110, speaker 120, display 130, control keys 140, keypad 150, microphone 160, and camera 170 (FIG. 1B). Housing 110 may protect the components of device 100 from outside elements. Housing 110 may be made from thermoplastics, metals, elastomers (e.g., synthetic rubber and/or natural rubber), and/or other similar materials. Speaker 120 may provide audible information to a user of device 100. Display 130 may provide visual information to the user. For example, display 130 may provide information regarding incoming or outgoing telephone calls, games, telephone numbers, the current time, e-mail, etc. Control keys 140 may permit the user to interact with device 100 to cause device 100 to perform one or more operations. Keypad 150 may include a standard telephone keypad and may include additional keys to enable typing information into device 100. Microphone 160 may receive audible information from the user.
[0043] Camera 170 may enable device 100 to capture and/or store video and/or images (e.g., pictures) of a scene being viewed through the lens of camera 170. Camera 170 may be on the front side of device 100 (not shown) and/or the rear side of device 100 (as shown in FIG. 1B). Control keys 140 may include, for example, a shutter key (not shown) for enabling the user to take a picture with camera 170. Display 130 may display captured or stored video and/or images. Camera 170 may be an electronic device that may capture and/or store images and/or video digitally.
[0044] FIG. 2 is a diagram of exemplary conceptual components of the system architecture of device 100 of FIGS. 1A and 1B. As shown in FIG. 2, device 100 may include processing logic 210, memory 220, communication interface 240, antenna 250, user interface logic 260, camera 170, and autofocus logic 230. Processing logic 210 may include a processor, microprocessor, an application specific integrated circuit (ASIC), or a field programmable gate array (FPGA). Processing logic 210 may include data structures or software programs to control operation of device 100 and its components. Memory 220 may include a hard disk drive (HDD), a random access memory (RAM), a read only memory (ROM), flash memory, a removable memory, and/or another type of memory to store data and/or instructions that may be used by processing logic 210, e.g., any type of a computer-readable medium. Camera 170 may store captured video and/or images, e.g., pictures, in memory 220. Display 130 may display stored video and/or images, e.g., pictures, from memory 220.
[0045] Communication interface 240 may include, for example, a USB port for communication over a cable. Communication interface 240 may include a transmitter that may convert baseband signals from processing logic 210 to radio frequency (RF) signals and/or a receiver that may convert RF signals to baseband signals. Alternatively, communication interface 240 may include a transceiver to perform functions of both a transmitter and a receiver. Communication interface 240 may connect to antenna 250 for transmission and reception of the RF signals. Antenna 250 may include one or more antennas to transmit and receive RF signals over the air. Antenna 250 may receive RF signals from communication interface 240 and transmit them over the air and receive RF signals from over the air and provide them to communication interface 240. Communication interface 240 may incorporate the Bluetooth standard or a USB serial port standard.
[0046] User interface logic 260 may include mechanisms for inputting information into device 100 and/or for outputting information from device 100. Examples of input and output mechanisms may include speaker 120 to output audio signals, microphone 160 to receive audio signals, keys 140 or 150 to permit data and control commands to be input, and/or display 130 to output visual information. Display 130 may show content, such as pictures or videos. Speaker 120 may play content, such as music or radio programming. User interface logic 260 may also include a vibrator mechanism that causes device 100 to vibrate when, for example, an incoming telephone call is received. User interface logic 260 may allow the user to receive a menu of options. The menu may allow the user to select various functions or modes associated with applications executed by device 100. User interface logic 260 may allow the user to activate a particular mode, such as a mode defined by an application running in device 100.
[0047] Autofocus logic 230 may interact with camera 170 to perform autofocusing operations relating to the optical components of camera 170. Autofocus logic 230 may be implemented in hardware, software, or a combination of hardware and software. Although illustrated as a separate component from camera 170, autofocus logic 230 could equivalently be considered as integrated within camera 170.
[0048] Techniques for automatically determining an optimal focal length for a lens (i.e., autofocus techniques) are generally known. One such set of techniques, referred to as passive autofocus techniques, is based on an analysis, using image processing techniques, of a portion of the scene in the optical system of the camera. Such techniques may, for example, calculate the high-frequency components of the portion of the scene over different focal lengths of the camera lens. The camera lens position corresponding to the maximum value of the high-frequency components corresponds to the optimal focal length.
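The sweep over lens positions described in paragraph [0048] can be sketched as follows. This is an illustrative sketch only, not text from the application: the Laplacian-based sharpness measure and the `capture_at` camera hook are assumptions introduced here for clarity.

```python
def sharpness(region):
    """High-frequency energy of a grayscale region (a list of rows).

    Uses the mean squared response of a 4-neighbor Laplacian as a
    simple contrast proxy: in-focus regions score higher.
    """
    h, w = len(region), len(region[0])
    total, n = 0.0, 0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (region[y - 1][x] + region[y + 1][x]
                   + region[y][x - 1] + region[y][x + 1]
                   - 4.0 * region[y][x])
            total += lap * lap
            n += 1
    return total / n


def passive_autofocus(capture_at, lens_positions, roi):
    """Sweep lens positions; return the one maximizing sharpness in roi.

    capture_at(pos) is a hypothetical camera hook returning a frame;
    roi = (top, left, height, width) is the autofocus area.
    """
    top, left, h, w = roi
    best_pos, best_score = None, -1.0
    for pos in lens_positions:
        frame = capture_at(pos)
        window = [row[left:left + w] for row in frame[top:top + h]]
        score = sharpness(window)
        if score > best_score:
            best_pos, best_score = pos, score
    return best_pos
```

Restricting `roi` to the user-selected area is what makes the sweep favor the chosen subject rather than the center of the frame.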
[0049] Other passive autofocus techniques, such as those based on image phase detection, are also known.

[0050] Active autofocus techniques are also known. Active autofocus systems may measure distance to the subject independently of the optical system and then adjust the optical focal length based on the measured distance. Active autofocus systems may use, for example, ultrasonic sound waves or infrared light to measure distance.
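The distance-based adjustment in paragraph [0050] can be made concrete with the thin-lens equation 1/f = 1/d_o + 1/d_i: once the subject distance d_o has been measured (e.g., ultrasonically), the required lens-to-sensor distance d_i follows. The function below is an illustrative assumption, not part of the application:

```python
def lens_image_distance_mm(focal_length_mm, subject_distance_mm):
    """Image distance d_i from the thin-lens equation 1/f = 1/d_o + 1/d_i.

    An active autofocus system measures d_o independently of the
    optics and then positions the lens d_i from the sensor.
    """
    if subject_distance_mm <= focal_length_mm:
        raise ValueError("subject at or inside the focal length cannot be focused")
    return 1.0 / (1.0 / focal_length_mm - 1.0 / subject_distance_mm)
```

For a distant subject d_i approaches f, and for nearer subjects the lens must move outward (d_i > f), which is the adjustment the rangefinder measurement drives.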
[0051] Autofocus logic 230 will be primarily described herein as implemented as a passive autofocus system, although it can be appreciated that concepts similar to those described herein may be implemented with an active autofocus system or other suitable autofocus systems.
Autofocus Logic

[0052] As mentioned, autofocus logic 230 may perform autofocusing operations to assist in taking images with camera 170. FIG. 3 is a flow chart illustrating exemplary operations for autofocusing a scene for camera 170.
[0053] When a user of device 100 is interested in taking a picture with camera 170, the user may control or otherwise manipulate device 100 so that it is available to take a picture. For example, to put device 100 into picture taking mode, the user may activate a pre-designated picture taking button or select a picture taking mode through a graphical interface presented on display 130. In response, device 100 may begin to show, in display 130, the scene that is currently incident upon the lens of camera 170 (act 302). In other words, device 100 may allow the user to preview the scene (i.e., the picture) that the user is about to take.
[0054] Autofocus logic 230 may attempt to continuously optimize the focus of the scene being previewed (acts 304, 306, and 308). More specifically, autofocus logic 230 may receive an autofocus area that corresponds to a portion of the scene for which the focus is to be optimized (act 304) and attempt to optimize the focus based on this area (act 306). In one implementation, autofocus logic 230 may initially assume that the selected portion of the image is a center portion of the image. FIG. 4 is a diagram illustrating an exemplary scene shown in display 130. As shown, display 130 may display the scene currently in view of camera 170. In this example, the scene includes three people 401, 402, and 403. Autofocus logic 230 may, by default, assume that the intended subject of the image is located in the center of the image, i.e., person 402 in this example. Accordingly, autofocus logic 230 may change the focal length of the camera lens system to focus the image based on a center area or portion of the scene, illustrated by box 410. In some implementations, box 410 may not be explicitly shown on display 130.
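The default center area (box 410) amounts to a small computation over the display dimensions. The 25% size fraction below is an assumed illustrative value, not one given in the application:

```python
def center_autofocus_area(display_w, display_h, frac=0.25):
    """Default autofocus area: a box (like box 410) centered in the scene.

    frac is an assumed tunable fraction of each display dimension.
    Returns (left, top, width, height) in display pixels.
    """
    w = max(1, int(display_w * frac))
    h = max(1, int(display_h * frac))
    return ((display_w - w) // 2, (display_h - h) // 2, w, h)
```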
[0055] The user may, if desired, change the selected portion of the scene for which autofocusing is applied, resulting in a new autofocus area (act 304). For example, in an implementation in which device 100 includes a stylus and the display is a touch-sensitive display, the user may select the portion of the scene on which to autofocus by drawing on display 130. FIG. 5 is a diagram illustrating an exemplary technique through which a user can change the portion of the scene used for autofocusing. The exemplary scene shown on display 130 in FIG. 5 is identical to that shown in FIG. 4. In this example, however, the user has selected a roughly circular area 510 using a stylus 540. Circular area 510 may be drawn at any position within display 130 by the user.
[0056] Autofocus logic 230 may adjust the focus based on the portion of the scene defined by circular area 510 (act 306). In other words, in this example, autofocus logic 230 will focus based on person 403 rather than person 402. This can be useful in the situation in which the user would like to take a picture of the entire scene shown in FIG. 5 but focus on a subject (e.g., person 403) that is not centered in the scene. It can be appreciated that area 510 is not necessarily limited to being circular. The user could, for example, draw a rectangular shaped area or any other closed or nearly closed shape that autofocus logic 230 can use to perform an autofocus operation. The user may, for example, outline a “humanoid” shape around person 403 to thus select person 403 for autofocusing.
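One simple way to turn a drawn stroke into the "closed or nearly closed shape" of paragraphs [0055] and [0056] is to check whether the stroke's endpoints nearly meet and, if so, use its bounding box as the autofocus area. The endpoint-gap heuristic and `close_tol` threshold below are assumptions for illustration, not details from the application:

```python
def stroke_to_autofocus_area(points, close_tol=20.0):
    """Turn a drawn stylus stroke into an autofocus area.

    points: (x, y) samples of the path drawn on the display.
    The stroke counts as "nearly closed" when its endpoints lie
    within close_tol pixels of each other; the resulting area is
    the stroke's bounding box as (left, top, width, height).
    Returns None for an open stroke (no selection made).
    """
    if len(points) < 3:
        raise ValueError("need at least three points")
    (x0, y0), (x1, y1) = points[0], points[-1]
    gap = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    if gap > close_tol:
        return None  # open stroke: not treated as a selection
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs) - min(xs), max(ys) - min(ys))
```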
[0057] FIG. 6 is a diagram illustrating an alternate exemplary technique through which a user can change the portion of the scene used for autofocusing. The exemplary scene shown on display 130 in FIG. 6 is identical to that shown in FIGS. 4 and 5. In this implementation, assume that device 100 includes a stylus 540 and display 130 is a touch-sensitive display. The user may select the portion of the scene on which to autofocus by tapping or otherwise touching display 130. In response, device 100 may use the touched point as the center of the area corresponding to the autofocus area. In some implementations, device 100 may draw a rectangle or other shape centered at the touched point to visually show the user the area selected for autofocusing. This is shown in FIG. 6, in which a rectangle 605 is illustrated in display 130. In other implementations, device 100 may use the touched point as the center of the area corresponding to the autofocus area without explicitly showing the area to the user.
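Deriving a rectangle such as rectangle 605 from a touched point is a matter of centering a box on the point and clamping it to the display. The box dimensions below are assumed defaults, not values from the application:

```python
def area_from_touch(x, y, display_w, display_h, box_w=80, box_h=60):
    """Autofocus area centered at a touched point (like rectangle 605).

    The box is clamped so it stays entirely on the display; box_w
    and box_h are assumed defaults. Returns (left, top, box_w, box_h).
    """
    left = min(max(x - box_w // 2, 0), display_w - box_w)
    top = min(max(y - box_h // 2, 0), display_h - box_h)
    return (left, top, box_w, box_h)
```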
[0058] FIG. 7 is a diagram illustrating yet another exemplary technique through which a user can change the portion of the scene used when autofocusing. In this example, a stylus and a touch-sensitive display are not necessary. Instead, the user may select the area corresponding to the autofocus area through, for example, keypad 150 of device 100. Device 100 may, for instance, display a cross-hair icon 710 on display 130. The user may move cross-hair icon 710 by pressing various buttons on keypad 150 (e.g., “2”, “6”, “8”, and “4” may correspond to up, right, down, and left, respectively; alternatively, a designated direction pad or joystick may be used to receive user movement selections). When cross-hair icon 710 is at the position on display 130 desired by the user, the user may press another button to select that position. In response, device 100 may use the selected point as the center of the area corresponding to the autofocus area. In some implementations, device 100 may draw a rectangle or other shape centered at the selected point to visually show the user the area corresponding to the autofocused area, similar to area 605 shown in FIG. 6.
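The keypad-driven cross-hair movement can be sketched as a key-to-offset mapping with clamping at the display edges. The key assignments come from the text ("2", "6", "8", "4" for up, right, down, left); the step size is an assumed value:

```python
STEP = 5  # assumed per-press movement in pixels
MOVES = {"2": (0, -STEP), "6": (STEP, 0), "8": (0, STEP), "4": (-STEP, 0)}


def move_crosshair(x, y, key, display_w, display_h):
    """Move a cross-hair (like icon 710) one step, clamped to the display.

    Unrecognized keys leave the position unchanged.
    """
    dx, dy = MOVES.get(key, (0, 0))
    return (min(max(x + dx, 0), display_w - 1),
            min(max(y + dy, 0), display_h - 1))
```

A select key would then hand the final (x, y) to the same centered-rectangle computation used for touch selection.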
[0059] Referring back to FIG. 3, when the user selects an arbitrary area on display 130 for autofocusing, autofocus logic 230 may adjust the focal length of camera 170 based on the selected area (act 306). For example, autofocus logic 230 may adjust the focal length to maximize the high frequency components in the image corresponding to the selected area. In this manner, device 100 autofocuses on an area chosen by the user. At some point the user may be satisfied with the scene being previewed in display 130 and may decide to “take” the picture by, for example, pressing a button or otherwise controlling device 100 to capture an image. In response, device 100 may capture the image of the scene (acts 308 and 310). Device 100 may also store the captured image for later retrieval and viewing by the user (act 312). The image may be stored, for example, in memory 220.
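The overall FIG. 3 flow (acts 302 through 312) can be sketched as a preview loop. All callback names below are hypothetical hooks introduced for illustration; they do not appear in the application:

```python
def picture_taking_loop(preview, get_autofocus_area, autofocus,
                        capture_requested, capture, store):
    """Sketch of the FIG. 3 flow; every argument is a hypothetical hook.

    Loop: show the preview, take the current autofocus area, refocus,
    and repeat until the user initiates image capture.
    """
    while True:
        frame = preview()             # act 302: display preview of scene
        area = get_autofocus_area()   # act 304: receive area for autofocus
        autofocus(frame, area)        # act 306: perform autofocus
        if capture_requested():       # act 308: image capture initiated?
            image = capture()         # act 310: capture image
            store(image)              # act 312: store captured image
            return image
```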
[0060] The above description relating to FIGS. 4-6 describes a number of implementations for allowing a user to arbitrarily select a portion of a scene for which an autofocus operation is applied. It can be appreciated that other alternatives are possible. For example, instead of using a stylus to interact with display 130, display 130 may allow the user to interact with it by touch. Additionally, instead of device 100 automatically generating a rectangle or other closed shape, such as rectangle 605, around a point selected by the user, device 100 may use more sophisticated image analysis techniques to determine the object that is of interest to the user. For example, in response to the user touching a person in a scene, device 100 may use image processing techniques to recognize the boundaries of the touched person and to use that shape for autofocusing.
[0061] Further, although the implementation of autofocus logic 230 was generally described with respect to using passive autofocusing techniques, active autofocusing techniques could also be used. For example, assume that an active autofocus system is implemented with an ultrasonic transceiver to measure distance to a target object. The ultrasonic transceiver may be aimed based on the portion of the scene selected by the user.
CONCLUSION

[0062] As described above, a user may select an arbitrary portion of a scene being previewed for picture taking. The selected portion of the scene is used to implement an image