US007918398B2

(12) United States Patent
Li et al.

(10) Patent No.: US 7,918,398 B2
(45) Date of Patent: Apr. 5, 2011
`
`(54)
`
`INDICIA READING TERMINAL HAVING
`MULTIPLE SETTING IMAGING LENS
`
`(75)
`
`Inventors: Jianhua Li, Fremont, CA (US); Chen
`Feng, Snohomish, WA (US); William H.
`Havens, Syracuse, NY (US); Ynjiun
`Wang, Cupertino, CA (US)
`
`(73) Assignee: Hand Held Products, Inc., Skaneateles
`Falls, NY (US)
`
(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 375 days.
`
`(21) Appl. No.: 12/132,480
`
(22) Filed: Jun. 3, 2008
`
`(65)
`
`Prior Publication Data
`
`US 2009/0072038 Al
`
`Mar. 19, 2009
`
`Related U.S. Application Data
`
`(60)
`
`Provisional application No. 60/933,022, filed on Jun.
`4, 2007.
`
`(51)
`
`Int. Cl.
`G06K 7110
`(2006.01)
`G06K 15112
`(2006.01)
`(52) U.S. Cl. .......... 235/462.41; 235/462.11; 235/462.24
`(58) Field of Classification Search ............. 235/462.41,
`235/462.11, 462.24
`See application file for complete search history.
`
`(56)
`
`References Cited
`
`U.S. PATENT DOCUMENTS
`4,877,949 A
`10/1989 Danielson et al.
`5,019,699 A
`511991 Koenck
`5,406,062 A
`4/1995 Hasegawa et al.
`
5,572,006 A    11/1996    Wang et al.
5,576,529 A    11/1996    Koenck et al.
5,591,955 A     1/1997    Laser
5,646,390 A     7/1997    Wang et al.
5,702,058 A    12/1997    Dobbs et al.
5,756,981 A     5/1998    Roustaei et al.
5,770,847 A     6/1998    Olmstead
5,784,102 A     7/1998    Hussey et al.
5,786,582 A     7/1998    Roustaei et al.
5,811,828 A     9/1998    Laser
5,815,200 A     9/1998    Ju et al.
5,821,518 A    10/1998    Sussmeier et al.
5,837,987 A    11/1998    Koenck et al.
(Continued)
`
FOREIGN PATENT DOCUMENTS

CN    101031930    9/2007
(Continued)
`
OTHER PUBLICATIONS

Extended European Search Report for European Patent Application No. 08010217, Dated Oct. 17, 2008, 3 pages.
`
Primary Examiner - Edwyn Labaze
(74) Attorney, Agent, or Firm - Marjama Muldoon Blasiak & Sullivan LLP
`
(57) ABSTRACT
`An indicia reading terminal can include a multiple setting
`imaging lens assembly and an image sensor having an image
`sensor array. In one embodiment, an indicia reading terminal
`in an active reading state can cycle through a set of different
`lens settings, expose pixels of an image sensor array during an
`exposure period when each new lens setting is achieved, and
`attempt to decode decodable indicia represented in frames of
`image data captured corresponding to each exposure period.
`In one embodiment, movement of an imaging lens assembly
`lens element can be provided with use of a hollow stepper
`motor.
`
`13 Claims, 10 Drawing Sheets
`
[Representative drawing: SHORT RANGE - BEST FOCUS: 2"]
`
U.S. PATENT DOCUMENTS

5,841,121 A        11/1998    Koenck
6,010,070 A         1/2000    Mizuochi et al.
6,073,851 A         6/2000    Olmstead et al.
6,223,988 B1        5/2001    Batterman et al.
6,230,975 B1        5/2001    Colley et al.
6,254,003 B1        7/2001    Pettinelli et al.
6,315,203 B1       11/2001    Ikeda et al.
6,386,452 B1        5/2002    Kawamura et al.
6,522,441 B1        2/2003    Rudeen
6,598,797 B2        7/2003    Lee
6,681,994 B1        1/2004    Koenck
6,695,209 B1        2/2004    La
6,880,759 B2        4/2005    Wilde et al.
7,044,378 B2        5/2006    Patel et al.
7,055,747 B2        6/2006    Havens et al.
7,073,715 B2        7/2006    Patel et al.
7,083,098 B2        8/2006    Joseph et al.
7,148,923 B2       12/2006    Harper et al.
7,287,696 B2       10/2007    Attia et al.
7,303,126 B2       12/2007    Patel et al.
7,568,628 B2        8/2009    Wang et al.
7,611,060 B2       11/2009    Wang et al.
2001/0003346 A1     6/2001    Feng
2004/0206825 A1    10/2004    Schmidt et al.
2005/0001035 A1     1/2005    Hawley et al.
2005/0103854 A1     5/2005    Zhu et al.
2006/0011724 A1     1/2006    Joseph et al.
2006/0043194 A1     3/2006    Barkan et al.
2006/0113386 A1     6/2006    Olmstead
2006/0163355 A1     7/2006    Olmstead et al.
2006/0202038 A1     9/2006    Wang et al.
2006/0249581 A1    11/2006    Smith
2007/0181692 A1     8/2007    Barkan et al.
2008/0223933 A1     9/2008    Smith
2008/0265034 A1*   10/2008    Gibson ............ 235/462.25
2009/0108071 A1     4/2009    Carlson
2010/0044440 A1     2/2010    Wang et al.
2010/0090007 A1     4/2010    Wang et al.
`
FOREIGN PATENT DOCUMENTS

CN    101147157        3/2008
EP    1784761          5/2007
EP    1828957          9/2007
EP    1856651         11/2007
JP    2008511917       4/2008
WO    WO-2006026141    3/2006
WO    WO-2006065450    6/2006
WO    WO-2006081466    8/2006

* cited by examiner
`
`
`
[Drawing sheets 1-10 (U.S. Patent, Apr. 5, 2011, US 7,918,398 B2); only the recoverable legends are retained below.]

Sheet 1 - FIG. 1: SHORT RANGE - BEST FOCUS: 2"; FIG. 2: MEDIUM RANGE - BEST FOCUS: 7", f2 = f1.
Sheet 2 - FIG. 3: LONG RANGE - BEST FOCUS: 24", f3 ≠ f2, f1.
Sheet 3 - FIG. 4: electrical block diagram (image sensor 32, processor 60, illumination control 62, lens control 64, display 97).
Sheet 4 - FIGS. 5 and 6: imaging module, exploded and assembled views.
Sheet 5 - FIG. 7: timing diagram (exposure periods EXP1, EXP2, ..., EXPN).
Sheet 6 - FIGS. 8 and 9: lens movement assembly and hollow stepper motor.
Sheet 7 - FIGS. 10 and 11: non-zooming imaging lens assemblies.
Sheet 8 - FIG. 12: zooming imaging lens assembly.
Sheet 9 - FIG. 13: indicia reading terminal with hand held housing.
Sheet 10 - FIG. 14: cutaway side view of the terminal of FIG. 13.
`
INDICIA READING TERMINAL HAVING MULTIPLE SETTING IMAGING LENS

CROSS REFERENCE TO RELATED APPLICATIONS

This application claims priority under 35 U.S.C. § 119(e) to Provisional Patent Application No. 60/933,022, entitled "Indicia Reading Terminal Processing Plurality of Frames of Image Data Responsively to Trigger Signal Activation," filed Jun. 4, 2007. This application is also related to U.S. patent application Ser. No. 12/132,462, entitled "Indicia Reading Terminal Processing Plurality of Frames of Image Data Responsively to Trigger Signal Activation," filed concurrently herewith. Each of the above applications is incorporated herein by reference in its entirety.
`
`2
`FIG. 8 is a cutaway side view of a lens movement assembly
`in one embodiment.
`FIG. 9 is a perspective view of a hollow stepper motor in
`one embodiment.
`FIG. 10 is a cutaway side view illustrating a non-zooming
`imaging lens assembly which can be incorporated in an indi(cid:173)
`cia reading terminal.
`FIG. 11 is a cutaway side view illustrating another non(cid:173)
`zooming imaging lens assembly which can be incorporated in
`10 an indicia reading terminal.
`FIG. 12 is a cutaway side view illustrating a zooming
`imaging lens assembly which can be incorporated in an indi(cid:173)
`cia reading terminal.
`FIG. 13 is a perspective view of an indicia reading terminal
`15 incorporating a hand held housing in one embodiment.
`FIG.14 is a cutaway side view ofanindiciareading termi(cid:173)
`nal as shown in FIG. 13.
`
`FIELD OF THE INVENTION
`
`DETAILED DESCRIPTION OF THE INVENTION
`
`The invention relates to an indicia reading terminal in
`general, and specifically, to an indicia reading terminal hav(cid:173)
`ing a multiple setting imaging lens assembly.
`
`BACKGROUND OF THE PRIOR ART
`
`A majority of commercially available image based indicia
`reading terminals are equipped with fixed position (single
`setting) imaging lens assemblies. Advances in lens technol(cid:173)
`ogy, illumination technology, image sensor technology, and
`image processing technology have increased the depth of
`field of such terminals. However, the operational field of view
`of such terminals is limited by the single setting aspect of the
`lens assemblies of such terminals.
`
`BRIEF DESCRIPTION OF THE DRAWINGS
`
`The features described herein can be better understood
`with reference to the drawings described below. The drawings
`are not necessarily to scale, emphasis instead generally being
`placed upon illustrating the principles of the invention. In the
`drawings, like numerals are used to indicate like parts
`throughout the various views.
`FIG. 1 is a schematic diagram illustrating an indicia read(cid:173)
`ing terminal having a multiple setting imaging lens assembly,
`wherein the imaging lens assembly has a plurality of lens
`elements, and wherein the imaging lens assembly is set to a
`first setting.
`FIG. 2 is a schematic diagram illustrating an indicia read(cid:173)
`ing terminal having a multiple setting imaging lens assembly,
`wherein the imaging lens assembly has a plurality of lens
`elements, and wherein the imaging lens assembly is set to a
`second setting.
`FIG. 3 is a schematic diagram illustrating an indicia read(cid:173)
`ing terminal having a multiple setting imaging lens assembly,
`wherein the imaging lens assembly has a plurality of lens
`elements, and wherein the imaging lens assembly is set to a
`third setting.
`FIG. 4 is an exemplary block diagram illustrating an exem(cid:173)
`plary component of an indicia reading terminal in one
`embodiment.
`FIG. 5 is an exploded perspective assembly view of an
`imaging module in one embodiment.
`FIG. 6 is a perspective view of an assembled imaging
`module in one embodiment.
`FIG. 7 is a timing diagram illustrating operation of an
`indicia reading terminal in one embodiment.
`
`20
`
There is provided an indicia reading terminal having a multiple setting imaging lens assembly (imaging lens). As shown in FIGS. 1, 2, and 3, terminal 10 can have an image sensor 32 and an imaging lens assembly 40 (imaging lens) capable of multiple lens settings. A multiple setting imaging lens assembly can be provided, e.g., with use of one or more lens elements capable of being moved into different multiple positions, with use of one or more lens elements having adjustable lens surface curvatures, with use of one or more lens elements having an adjustable index of refraction, or with use of any combination of the above. In the particular embodiment of FIGS. 1, 2, and 3, a multiple lens setting indicia reading terminal comprises one or more multiple position lens elements. In the specific embodiment of FIGS. 1, 2, and 3, imaging lens assembly 40 comprises seven lens elements; namely, elements 402, 403, 404, 406, 407, 408, 410, where the combination of elements 403 and 404 and the combination of lens elements 407 and 408 are lens doublets. In the embodiment of FIGS. 1, 2, and 3, imaging lens assembly 40 focuses an image of a decodable indicia disposed on a target substrate 50 onto an active surface of image sensor 32. In one embodiment, an active surface of an image sensor can be provided by an image sensor pixel array 33 (image sensor array).

A well-corrected lens assembly can be treated as a "black box" whose characteristics are defined by its cardinal points; namely, its first and second focal points, its first and second principal points, and its first and second nodal points. The first focal point is the point at which light rays (for example, coming from the left) from an infinitely distant object and parallel to the optical axis are brought to a common focus on the optical axis. If the rays entering the lens assembly and those emerging from the lens assembly are extended until they intersect, the points of intersection will define a surface, usually referred to as the principal plane. The intersection of this surface with the optical axis is the principal point. The "second" focal point and the "second" principal plane are those defined by rays approaching the system from the right. The "first" points are those defined by rays from the left.

The focal length of a lens assembly (also referred to as the effective focal length, EFL) is the distance from the principal point to the focal point. The back focal length (BFL), or the back focus, is the distance from the vertex of the last surface of the system to the second focal point (again for light traveling through the lens assembly from left to right). The front focal length (FFL) is the distance from the front surface to the first focal point. The nodal points are two axial points such that a ray directed at the first nodal point appears (after passing
through the system) to emerge from the second nodal point parallel to its original direction. When a lens assembly is bounded on both sides by air (as is true in the great majority of applications), the nodal points coincide with the principal points.

FIGS. 1, 2, and 3 all show the respective effective focal points, nodal points, and field angles in both image and object space. Note that since the lens assembly is bounded on both sides by air, the nodal points coincide with the principal points. The field of view half angle is defined by the ray which originates at the most extreme field point of object 50 that projects to the point farthest removed from the optical axis of image sensor 32. The focal point, nodal point, and field of view angle in object space are noted as fn, Nn, and θn, and the corresponding points in image space are noted as fn', Nn', and θn'. The subscript "n" represents the example associated with FIGS. 1, 2, and 3, respectively.

In general, using paraxial approximations, the distance from the lens object space nodal point to the object, Pn, the distance from the image space nodal point to the image, Qn, and the focal length fn are related through the lens equation:

1/Pn + 1/Qn = 1/fn    (eq. 1)

As the lens equation demonstrates, when the focal length is constant, the plane of nominal focus for the lens assembly can be changed simply by changing the separation between the object and the lens principal plane. If the focal length and image distance are similar in value, which is often the case in bar code imaging systems, then the image distance change will be minimal for a major shift in the object plane. The field of view for such a lens assembly is determined by the size of the active surface of the image sensor. Similarly, using paraxial approximations, the field angles for image space and object space are identical, thus:

θn' ≈ θn

Referring again to FIGS. 1, 2, and 3, the half field of view angle θn' is related to the optical configuration, with xn denoting the half dimension of the active surface of image sensor 32, by:

tan θn' = xn/Qn    (eq. 2)

This can be substituted into the lens equation to eliminate Qn, giving:

tan θn' = xn(1/fn - 1/Pn)    (eq. 3)
`
`4
`length (the FOY angle of a lens is typically expressed in terms
`of"halfFOV" units) and image plane distance. Where image
`plane distances are significantly larger than an imaging lens
`assembly's focal length, an FOY angle of imaging lens
`assembly 40 can be maintained at a substantially constant
`value by retaining the relative positions between lens ele(cid:173)
`ments. A focal length of an imaging lens assembly 40 can be
`changed by adjusting a relative position between lens ele(cid:173)
`ments of an imaging lens assembly having multiple lens
`10 elements. Thus, changing a relative position between lens
`grouping 420 and grouping 430 changes a focal length of
`imaging lens assembly 40. As mentioned, an FOY angle of
`imaging lens assembly 40 is a function of the imaging lens
`assembly's focal length.Accordingly, an FOY angle ofimag-
`15 ing lens assembly 40 will change as grouping 420 is moved
`relative to grouping 430 or vice versa. Because the distance
`between a focal point position and image sensor 32 (the image
`plane) will also change as one grouping is moved relative to
`another, a change in the relative position between grouping
`20 420 and grouping 430 can be expected to produce a change in
`a best focus position of terminal 10 as well as a change in the
`focal length and field of view angle. The act of reducing a field
`of view angle of a lens while increasing a best focus distance
`is often referred to as "zooming". Where an imaging lens
`25 assembly is capable of zooming, it is often referred to as a
`"zoom lens".
`In one embodiment, terminal 10 is configured so that a
`setting of imaging lens assembly 40 can be switched between
`a plurality of lens settings. In one embodiment, the plurality
`30 oflens settings is three lens settings.
`Various lens settings of imaging lens assembly 40, in one
`embodiment, are illustrated with reference to FIGS. 1, 2, and
`3. In setting (a), a short range setting, terminal 10 has a best
`focus distance of2" and a half FOY angle of35°. With lens 40
`35 set to setting (b ), a medium (intermediate) range setting,
`terminal 10 has a best focus distance of 7" and a half FOY
`angle of 36.9°. With lens 40 set to setting ( c ), a long range
`setting, terminal 10 has a best focus distance of 24" and a half
`FOY angle of 11.5°. A focal length of imaging lens assembly
`40 can be unchanged relative to setting (a) and setting (b).
`Between setting (a) and setting (b ), a focal length of lens
`assembly 40 can be maintained at a constant value by main(cid:173)
`taining a constant spacing between lens elements. Between
`setting (a) and setting (b) in a specific embodiment, a focal
`45 length of lens assembly 40 can be maintained at a constant
`value by maintaining a constant spacing between lens group(cid:173)
`ings where the groupings are moved farther from an image
`sensor array between setting (a) and setting (b ). By maintain(cid:173)
`ing focal length at a constant value the FOY angle of lens
`50 assembly 40 will not change substantially provided the image
`plane distance is significantly larger than the lens focal
`length. Distance and angular measurements herein are given
`as approximate measurements. A summary of possible lens
`settings in one embodiment is summarized in Table 1.
`
length (the FOV angle of a lens is typically expressed in terms of "half FOV" units) and image plane distance. Where image plane distances are significantly larger than an imaging lens assembly's focal length, an FOV angle of imaging lens assembly 40 can be maintained at a substantially constant value by retaining the relative positions between lens elements. A focal length of an imaging lens assembly 40 can be changed by adjusting a relative position between lens elements of an imaging lens assembly having multiple lens elements. Thus, changing a relative position between lens grouping 420 and grouping 430 changes a focal length of imaging lens assembly 40. As mentioned, an FOV angle of imaging lens assembly 40 is a function of the imaging lens assembly's focal length. Accordingly, an FOV angle of imaging lens assembly 40 will change as grouping 420 is moved relative to grouping 430 or vice versa. Because the distance between a focal point position and image sensor 32 (the image plane) will also change as one grouping is moved relative to another, a change in the relative position between grouping 420 and grouping 430 can be expected to produce a change in a best focus position of terminal 10 as well as a change in the focal length and field of view angle. The act of reducing a field of view angle of a lens while increasing a best focus distance is often referred to as "zooming". Where an imaging lens assembly is capable of zooming, it is often referred to as a "zoom lens".

In one embodiment, terminal 10 is configured so that a setting of imaging lens assembly 40 can be switched between a plurality of lens settings. In one embodiment, the plurality of lens settings is three lens settings.

Various lens settings of imaging lens assembly 40, in one embodiment, are illustrated with reference to FIGS. 1, 2, and 3. In setting (a), a short range setting, terminal 10 has a best focus distance of 2" and a half FOV angle of 35°. With lens 40 set to setting (b), a medium (intermediate) range setting, terminal 10 has a best focus distance of 7" and a half FOV angle of 36.9°. With lens 40 set to setting (c), a long range setting, terminal 10 has a best focus distance of 24" and a half FOV angle of 11.5°. A focal length of imaging lens assembly 40 can be unchanged relative to setting (a) and setting (b). Between setting (a) and setting (b), a focal length of lens assembly 40 can be maintained at a constant value by maintaining a constant spacing between lens elements. Between setting (a) and setting (b) in a specific embodiment, a focal length of lens assembly 40 can be maintained at a constant value by maintaining a constant spacing between lens groupings where the groupings are moved farther from an image sensor array between setting (a) and setting (b). By maintaining focal length at a constant value, the FOV angle of lens assembly 40 will not change substantially provided the image plane distance is significantly larger than the lens focal length. Distance and angular measurements herein are given as approximate measurements. A summary of possible lens settings in one embodiment is given in Table 1.
`
TABLE 1

Range           Lens Setting    Focal Length    Half FOV Angle    Best Focus Distance
Short           (a)             4.7 mm          35°               2"
Intermediate    (b)             4.7 mm          36.9°             7"
Long            (c)             17.3 mm         11.5°             24"
`
Regarding lens setting (c), it is advantageous for terminal 10 to have a reduced FOV angle at a long range setting so that a resolution of image data representing a target indicia is improved. Terminal 10 can be adapted so that when terminal
10 operates to capture frames of image data for subjecting to decoding, terminal 10 cycles between three lens settings. Terminal 10 can cycle between lens settings such that for a certain exposure period, the lens setting is at a first lens setting; for a subsequent exposure period, the lens setting is at a second lens setting; and for a further subsequent exposure period, the lens setting is at a third lens setting, and continuing with the cycling so that during an exposure period after the further subsequent exposure period, the lens returns to a first or previous lens setting, and so on. Frames that are captured corresponding to and representing light incident on an image sensor array during the certain, subsequent, and further subsequent exposure periods can be subject to an indicia decoding attempt such as a bar code decoding attempt.
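The cycling just described can be pictured with a minimal sketch in Python; the setting names simply follow Table 1, and the helper function is a hypothetical illustration, not an interface defined by the patent.

```python
# Hypothetical settings list following Table 1; the index arithmetic shows
# which lens setting is active for each successive exposure period.
LENS_SETTINGS = ("a", "b", "c")  # short, intermediate, long range

def setting_for_exposure(exposure_index: int) -> str:
    """Return the lens setting used for the given exposure period (0-based)."""
    return LENS_SETTINGS[exposure_index % len(LENS_SETTINGS)]

# Exposure periods 0..5 use settings a, b, c, a, b, c, as described above.
print([setting_for_exposure(i) for i in range(6)])  # ['a', 'b', 'c', 'a', 'b', 'c']
```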
In an alternative embodiment, as shown in Table 2, imaging lens assembly 40 can have at least three lens settings. In each of the lens settings summarized in Table 2, imaging lens assembly 40 has a different focal length, a different half FOV angle, and a different best focus distance.

TABLE 2

Range           Lens Setting    Focal Length    Half FOV Angle    Best Focus Distance
Short           (a)             4.7 mm          35°               2"
Intermediate    (b)             8 mm            23.4°             7"
Long            (c)             17.3 mm         11.5°             24"
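For comparison, the same assumed 3.62 mm sensor half dimension used in the Table 1 sketch also reproduces the Table 2 angles; the snippet below is again only an illustrative check, not data from the patent.

```python
import math

# Same assumption as the Table 1 sketch: sensor half dimension of about 3.62 mm.
for f_mm, p_in in [(4.7, 2), (8.0, 7), (17.3, 24)]:
    half_fov = math.degrees(math.atan(3.62 * (1.0 / f_mm - 1.0 / (p_in * 25.4))))
    print(f'f = {f_mm} mm, best focus {p_in}": half FOV ≈ {half_fov:.1f}°')
# Prints approximately 35.0°, 23.4°, and 11.5°, matching Table 2.
```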
`
A multiple setting lens assembly for use with terminal 10 can be conveniently provided by employing a motor for moving lens elements relative to an image plane and/or relative to one another. It will be understood, however, that a multiple setting imaging lens can be provided utilizing alternative technologies. For example, spring-based actuators can be employed for moving lens elements relative to an image plane and/or relative to each other. Also, fluid lens technologies can be employed. Fluid lens technologies can be employed for purposes of adjusting a curvature of a lens assembly lens element. Fluid lens technologies can also be employed in order to change an index of refraction of a lens assembly lens element by way of applying energy to the lens element to vary an optical property of a liquid included in the lens element.
A block diagram of an electrical component circuit supporting operations of terminal 10 is shown in FIG. 4. Image sensor 32 can be provided on an integrated circuit having an image sensor pixel array 33 (image sensor array), column circuitry 34, row circuitry 35, a gain block 36, an analog-to-digital converter 37, and a timing and control block 38. Image sensor array 33 can be a two dimensional image sensor array having a plurality of light sensitive pixels formed in a plurality of rows and columns. Terminal 10 can further include a processor 60, an illumination control circuit 62, a lens control circuit 64, an imaging lens assembly 40, a direct memory access (DMA) unit 70, a volatile system memory 80 (e.g., a RAM), a nonvolatile system memory 82 (e.g., EPROM), a storage memory 84, a wireline input/output interface 90 (e.g., Ethernet), and an RF transceiver interface 92 (e.g., IEEE 802.11). Regarding illumination control circuit 62, illumination control circuit 62 can receive illumination control signals from processor 60 and can responsively deliver power to one or more illumination light sources such as light sources 604, and one or more aiming light sources such as aiming light source 610. Terminal 10 can also include a keyboard 94, a trigger button 95, and a pointer controller 96 for input of data and for initiation of various controls, and a display 97 for output of information to an operator. Terminal 10 can also include a system bus 98 providing communication
`between processor 60 and various components of terminal 10.
`DMA unit 70 can be provided by, e.g., a field programmable
`gate array (FPGA) or an application specific integrated circuit
`(ASIC). While shown as being separate units, DMA unit 70
`and processor 60 can be provided on a common integrated
`circuit. In a further aspect, terminal 10 can include multiple
`image sensors and can include a plurality of light source
`banks. The light source banks can be controlled according to
`various control methods that can vary depending on which of
`a plurality of available operating configurations are active. An
`example of terminals that can include a plurality of image
`sensors and which can include plural light source banks that
`can be controlled in accordance with a variety of different
settings, depending on which of a plurality of different candidate configurations is active, is described in U.S. patent
`application Ser. No. (not yet assigned) entitled, "Indicia
`Reading Terminal Processing Plurality of Frames of Image
Data Responsively To Trigger Signal Activation," filed concurrently herewith and incorporated herein by reference.
In response to control signals received from processor 60, timing and control circuit 38 can send image sensor array timing signals to array 33 such as reset, exposure control, and readout timing signals. After an exposure period, a frame of image data can be read out. Analog image signals that are read out of array 33 can be amplified by gain block 36, converted into digital form by analog-to-digital converter 37, and sent to DMA unit 70. DMA unit 70, in turn, can transfer digitized image data into volatile memory 80. Processor 60 can address frames of image data retained in volatile memory 80 for decoding of decodable indicia represented therein.
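The readout path just described (array 33, gain block 36, analog-to-digital converter 37, DMA unit 70, volatile memory 80, processor 60) can be summarized with the following illustrative Python sketch. The function name, gain, and bit depth are hypothetical stand-ins chosen for the example, not parameters given by the patent.

```python
from typing import List

def read_out_frame(array_33: List[List[float]], gain: float = 2.0, adc_bits: int = 8) -> List[List[int]]:
    """Toy model of the FIG. 4 readout path for one frame.

    Each analog pixel voltage is amplified (gain block 36), then quantized
    (analog-to-digital converter 37). The resulting digital frame is what
    DMA unit 70 would transfer into volatile memory 80 for processor 60.
    """
    max_code = 2 ** adc_bits - 1
    frame = []
    for row in array_33:
        digital_row = []
        for voltage in row:
            amplified = voltage * gain                                   # gain block 36
            code = min(max_code, max(0, round(amplified * max_code)))    # ADC 37
            digital_row.append(code)
        frame.append(digital_row)
    return frame

# Example: a tiny 2x3 "sensor" of normalized analog values in [0, 1].
volatile_memory_80 = read_out_frame([[0.1, 0.5, 0.9], [0.2, 0.4, 0.05]])
print(volatile_memory_80)  # frame now available to processor 60 for a decode attempt
```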
Referring to FIGS. 5 and 6, an imaging module for supporting various components of terminal 10 is described. Mounted on first circuit board 602 can be image sensor 32, illumination light sources 604 (e.g., LEDs), and aiming light source 610, which can be provided by a laser diode assembly. A shroud 612 can be disposed forwardly of image sensor 32, and disposed forwardly of shroud 612 can be a lens moving assembly 302, which in the embodiment of FIG. 5 can be provided by a hollow stepper motor assembly having more than one hollow stepper motor. An optical plate 618 having diffusers 620 for diffusing light from illumination light sources 604 can be disposed over lens moving assembly 302 so that hole 622 fits over outer barrel 304, as will be described in greater detail herein. An imaging module in an assembled form is shown in FIG. 6.
`A timing diagram further illustrating operation of terminal
`10, in one embodiment, is shown in FIG. 7. Timeline 202
`shows a state of a trigger signal which may be made active by
`depression of trigger button 95. Terminal 10 can also be
`adapted so that a trigger signal can be made active by the
`terminal sensing that an object has been moved into a field of
`view thereof or by receipt of a serial command from an
`external computer. Terminal 10 can also be adapted so that a
`trigger signal is made active by a power up of terminal 10. For
`example, in one embodiment, terminal 10 can be supported
`on a scan stand and used for presentation reading. In such an
`embodiment, terminal 10 can be adapted so that a trigger
`signal represented by timeline 202 can be active for the entire
`time terminal 10 is powered up. With further reference to the
`timing diagram of FIG. 7, terminal 10 can be adapted so that
`after a trigger signal is made active at time 220, pixels of
image sensor 32 are exposed during first exposure period EXP1 occurring during a first time period, followed by second exposure period EXP2 occurring during a second time period, third exposure period EXP3 occurring during a third time period, and so on (after time 220 and prior to first exposure period EXP1, parameter determination frames subject to
parameter determination processing may be optionally captured subsequent to parameter determination exposure periods not indicated in FIG. 7). Referring to the timing diagram of FIG. 7, terminal 10 may expose, capture, and subject to unsuccessful decode attempts N-1 frames of image data prior to successfully decoding a frame of image data corresponding to exposure period EXPN. An exposure control signal in one embodiment is represented by timeline 204 of FIG. 7.
Terminal 10 can be adapted so that after pixels of image sensor array 33 are exposed during an exposure period, a readout control pulse is applied to array 33 to read out analog voltages from image sensor 32 representative of light incident on each pixel of a set of pixels of array 33 during the preceding exposure period. Timeline 206 illustrates a timing of readout control pulses applied to image sensor array 33. A readout control pulse can be applied to image sensor array 33 after each exposure period EXP1, EXP2, EXP3, EXPN-1, EXPN. Readout control pulse 232 can be applied for reading out a frame of image data exposed during first exposure period EXP1. Readout control pulse 234 can be applied for reading out a frame of image data exposed during second exposure period EXP2, and readout pulse 236 can be applied for reading out a frame of image data exposed during third exposure period EXP3. A readout control pulse 238 can be applied for reading out a frame of image data exposed during exposure period EXPN-1, and readout control pulse 240 can be applied for reading out a frame of image data exposed during exposure period EXPN.
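Combining the trigger signal, the per-exposure lens setting cycle, and the readout and decode sequence, the operation outlined by FIG. 7 might be sketched as below. All helper names (set_lens_setting, expose, read_out, attempt_decode, trigger_active) are hypothetical placeholders for illustration, not functions defined by the patent.

```python
from itertools import cycle
from typing import Callable, Optional

def active_reading_loop(
    lens_settings,                        # e.g. the Table 1 settings (a), (b), (c)
    set_lens_setting: Callable[[object], None],
    expose: Callable[[], object],         # expose pixels for one exposure period
    read_out: Callable[[object], bytes],  # readout pulse, digitize, DMA to memory
    attempt_decode: Callable[[bytes], Optional[str]],
    trigger_active: Callable[[], bool],
) -> Optional[str]:
    """Cycle lens settings across exposure periods until a decode succeeds
    or the trigger signal goes inactive (timelines 202, 204, 206 of FIG. 7)."""
    settings = cycle(lens_settings)
    while trigger_active():
        set_lens_setting(next(settings))   # move to the next lens setting
        exposure = expose()                # exposure period EXP_i
        frame = read_out(exposure)         # readout pulse -> frame in volatile memory
        message = attempt_decode(frame)    # decode attempt on the captured frame
        if message is not None:
            return message                 # success after N-1 unsuccessful attempts
    return None
```

A caller would supply concrete implementations of the placeholders; the loop mirrors the N-1 unsuccessful decode attempts followed by a successful decode at EXPN, and exits when the trigger signal goes inactive.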
Terminal 10 can be adapted so that making active trigger signal 202 drives terminal 10 into an active reading state. After analog voltages corresponding to pixels of image sensor array 33 are read out and digitized by analog-to-digital converter 37, digitized pixel values corresponding to the voltages can be received (captured) into system volatile memory 80. Terminal 10 can be adapted so that processor 60 can subject to a decode attempt a frame of image data retained in memory 80. For example, in attempting to decode a 1D bar code symbol represented in a frame of image data, processor 60 can execute the following processes. First, processor 60 can launch a scan line in a frame of image data, e.g., at a center of a frame, or a coordinate location determined to include a decodable indicia representation. Next, processor 60 can perform a second derivative edge detection to detect edges. After completing edge detection, processor 60 can determine data indicating widths be