US007974460B2

(12) United States Patent
Elgersma

(10) Patent No.: US 7,974,460 B2
(45) Date of Patent: Jul. 5, 2011
(54) METHOD AND SYSTEM FOR THREE-DIMENSIONAL OBSTACLE MAPPING FOR NAVIGATION OF AUTONOMOUS VEHICLES

(75) Inventor: Michael R. Elgersma, Plymouth, MN (US)
`
(73) Assignee: Honeywell International Inc., Morristown, NJ (US)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 1151 days.

(21) Appl. No.: 11/671,755

(22) Filed: Feb. 6, 2007

(65) Prior Publication Data
     US 2008/0189036 A1     Aug. 7, 2008
`
(51) Int. Cl.
     G06K 9/00 (2006.01)
     G06K 9/62 (2006.01)
     G06K 9/32 (2006.01)
     G06K 9/36 (2006.01)
(52) U.S. Cl. ........ 382/153; 382/104; 382/284; 382/157; 382/293; 701/301
(58) Field of Classification Search .......... 382/103, 382/104, 107, 153, 154, 157, 284, 293-295; 701/301
     See application file for complete search history.
`
`
(56) References Cited

     U.S. PATENT DOCUMENTS

     4,219,812 A *   8/1980  Rittenbach .............. 342/110
     4,783,753 A *  11/1988  Crimmins ................ 382/275
     4,984,279 A *   1/1991  Kidney et al. ........... 382/113
     5,101,351 A *   3/1992  Hattori .................. 701/27
     5,122,957 A *   6/1992  Hattori .................. 701/27
     5,361,127 A *  11/1994  Daily ................... 356/302
     5,436,839 A *   7/1995  Dausch et al. ............ 701/27
     5,777,690 A *   7/1998  Takeda et al. ........... 348/699
     5,793,900 A *   8/1998  Nourbakhsh et al. ....... 382/263
     6,005,961 A *  12/1999  Na ...................... 382/113
     6,130,705 A *  10/2000  Lareau et al. ........... 348/144
     6,154,558 A *  11/2000  Hsieh ................... 382/103
     6,307,959 B1 * 10/2001  Mandelbaum et al. ....... 382/154
     6,456,728 B1 *  9/2002  Doi et al. .............. 382/103
     6,470,271 B2 * 10/2002  Matsunaga ............... 701/301
     6,507,661 B1 *  1/2003  Roy ..................... 382/107
     6,853,332 B1 *  2/2005  Brookes ................. 342/191
     6,868,314 B1 *  3/2005  Frink ...................... 701/3

     (Continued)
`
`
     OTHER PUBLICATIONS

Call et al., Obstacle Avoidance for Unmanned Air Vehicles Using Image Feature Tracking, AIAA Guidance, Navigation and Control Conference and Exhibit, Aug. 21-24, 2006, pp. 1-9.*

     (Continued)
`
`
`
Primary Examiner - Bhavesh M Mehta
Assistant Examiner - Mia M Thomas
(74) Attorney, Agent, or Firm - Fogg & Powers LLC
`
`
`
`
`
`
`
(57) ABSTRACT

A method and system for obstacle mapping for navigation of an autonomous vehicle is disclosed. The method comprises providing an autonomous vehicle with an image capturing device, and focusing the image capturing device at a predetermined number of different specified distances to capture an image at each of the specified distances. The method further comprises identifying which regions in the captured images are in focus, and assigning a corresponding lens-focus distance to each of the regions that are in focus. A composite image is formed from the captured images, with each of the regions labeled with the corresponding lens-focus distance. A three-dimensional obstacle map is then produced from the composite image. The three-dimensional obstacle map has an x, y, z coordinate system, with x being proportional to pixel horizontal position, y being proportional to pixel vertical position, and z being the lens-focus distance.

20 Claims, 2 Drawing Sheets
`
[FIG. 1 thumbnail (flow chart 100): adjust lens 110 to focus at a predetermined number of specified distances (112); take photo at each distance (114); identify which regions in each photo are in focus (122); assign corresponding lens-focus distance to each region that is in focus (124); form composite photo with each region labeled with its focus distance (126); form 3-D map in mapping module 120 (128); use 3-D map of obstacles in obstacle-avoidance algorithm 130 (132)]
`APPL-1032 / Page 1 of 8
`Apple v. Corephotonics
`
`
     U.S. PATENT DOCUMENTS

     7,015,855 B1 *  3/2006  Medl et al. ............ 342/25 A
     7,027,615 B2 *  4/2006  Chen ................... 382/104
     7,231,294 B2 *  6/2007  Bodin et al. ........... 701/206
     7,298,289 B1 * 11/2007  Hoffberg ............... 340/903
     7,298,922 B1 * 11/2007  Lindgren et al. ........ 382/294
     7,330,567 B2 *  2/2008  Hong et al. ............ 382/103
     7,417,641 B1 *  8/2008  Barber et al. .......... 345/589
     7,440,585 B2 * 10/2008  Roh et al. ............. 382/103
     7,492,965 B2 *  2/2009  Blais .................. 382/284
     7,512,494 B2 *  3/2009  Nishiuchi .............. 701/300
     7,620,265 B1 * 11/2009  Wolff et al. ........... 382/276
     7,865,267 B2 *  1/2011  Sabe et al. ............ 700/245
     2001/0018640 A1 *  8/2001  Matsunaga ........... 701/301
     2001/0040505 A1 * 11/2001  Ishida et al. ....... 340/435
     2002/0105484 A1 *  8/2002  Navab et al. .......... 345/8
     2003/0072479 A1 *  4/2003  Sofia Totterman et al. .. 382/131
     2004/0013295 A1 *  1/2004  Sabe et al. ......... 382/153
     2004/0062419 A1 *  4/2004  Roh et al. .......... 382/104
     2004/0101161 A1 *  5/2004  Roh et al. .......... 382/103
     2004/0158355 A1 *  8/2004  Holmqvist et al. .... 700/245
     2004/0175019 A1 *  9/2004  Howard .............. 382/103
     2006/0132753 A1 *  6/2006  Nichols et al. ..... 356/5.07
     2007/0019181 A1 *  1/2007  Sinclair et al. .... 356/4.01
     2007/0023582 A1 *  2/2007  Steele et al. ....... 244/190
     2007/0092143 A1 *  4/2007  Higgins ............. 382/228
     2007/0122058 A1 *  5/2007  Kitaura et al. ...... 382/284
     2007/0285438 A1 * 12/2007  Kanowitz ............ 345/632
     2007/0286456 A1 * 12/2007  Ariyur et al. ....... 382/103
     2007/0286526 A1 * 12/2007  Abousleman et al. ... 382/284
     2008/0023587 A1 *  1/2008  Head et al. ....... 244/158.4
     2008/0059068 A1 *  3/2008  Strelow et al. ...... 701/214
     2008/0189036 A1 *  8/2008  Elgersma ............ 701/211
     2008/0205790 A1 *  8/2008  Wear et al. ......... 382/284
     2008/0219508 A1 *  9/2008  Ganguli et al. ...... 382/104
     2009/0088916 A1 *  4/2009  Elgersma et al. ...... 701/23
     2009/0154773 A1 *  6/2009  Pickering et al. .... 382/107
     2009/0198371 A1 *  8/2009  Emanuel et al. ...... 700/226
     2009/0226113 A1 *  9/2009  Matsumoto et al. .... 382/284
     2009/0228205 A1 *  9/2009  Ariyur et al. ....... 701/209
     2010/0034424 A1 *  2/2010  Goossen ............. 382/103
     2010/0223008 A1 *  9/2010  Dunbabin et al. ..... 701/301
`
     OTHER PUBLICATIONS

Conkur et al., The Beam Analysis Algorithm for Path Planning for Redundant Manipulators, Mechatronics 15 (2005), pp. 67-94.*
Dolgov et al., Practical Search Techniques in Path Planning for Autonomous Driving, American Association for AI, 2008, pp. 1-6.*
NPL Search History, Google Scholar, pp. 1-2, "Three Dimensional Obstacle Avoidance Algorithm".*
Nourani-Vatani et al., Practical Path Planning and Obstacle Avoidance for Autonomous Mowing, 2006, pp. 1-9.*
The Virginia Tech Autonomous Vehicle Team-Johnny 5, Apr. 27, 2004, pp. 1-15.*
Wahlde et al., An Open Path Obstacle Avoidance Algorithm Using Scanning Laser Range Data, Army Research Labs, Feb. 2009, pp. 1-20.*
Feng et al., Implementation of Dynamic Obstacle Avoidance on the CMU NAVLAB, IEEE, 1990, pp. 208-211.*
Borenstein et al., Real Time Obstacle Avoidance for Fast Mobile Robots, IEEE Systems, Man and Cybernetics, vol. 19, No. 5, Sep.-Oct. 1989, pp. 1179-1187.*
Nilsson, Visual Landmark Selection and Recognition for Autonomous Unmanned Aerial Vehicle Navigation, KTH Numerical Analysis and Computer Science (2005), pp. 1-53.*
Batavia et al., "Obstacle Detection Using Adaptive Color Segmentation and Color Stereo Homography", Carnegie Mellon University, IEEE International Conference on Robotics, 2001, pp. 1-6.*
Elfes, "Using Occupancy Grids for Mobile Robot Perception and Navigation", IEEE, 1989, pp. 1-12.*
WO 2006/021813 A1, Ansell et al., "Collision Avoidance System", pp. 1-24.*
Ghangrekar et al., "A Path Planning and Obstacle Avoidance Algorithm for an Autonomous Robotic Vehicle", UNC Charlotte, 2009, pp. 1-103.*
Piepmeier et al., "Uncalibrated Target Tracking with Obstacle Avoidance", Proceedings of the 2000 IEEE International Conf. on Robotics and Automation, San Francisco, CA, Apr. 2000, pp. 1-6.*
NPL Search History, obstacle avoidance algorithm, pp. 1-3.*
Fu et al., "Uncalibrated Visual Servoing with Obstacle Avoidance Using SQP Method", Proceedings of the 2009 IEEE International Conf. on Mechatronics and Automation, Aug. 9-12, China, pp. 1-6.*
Khatib et al., "Real Time Obstacle Avoidance for Manipulators and Mobile Robots", 1985 IEEE Proceedings, pp. 1-6.*
European Patent Office, "European Search Report", Jun. 11, 2008, Published in: EP.
Jackson, M. et al., "Airborne Technology for Distributed Air Traffic Management", IEEE Conference on Decision and Control, Dec. 15, 2005, pp. 3947-3954, Publisher: IEEE.

* cited by examiner
U.S. Patent    Jul. 5, 2011    Sheet 1 of 2    US 7,974,460 B2

[FIG. 1: flow chart of the method: adjust lens to focus at predetermined number of specified distances (110, 112); identify which regions in each photo are in focus (120, 122); assign corresponding lens-focus distance to each region that is in focus (124); form composite photo with each region labeled with its focus distance (126); form 3-D map with x = (pixel horizontal position)/(distance from lens to imager)*(lens-focus distance), y = (pixel vertical position)/(distance from lens to imager)*(lens-focus distance), z = lens-focus distance (128); use 3-D map of obstacles in obstacle-avoidance algorithm (130)]
`
`
`
U.S. Patent    Jul. 5, 2011    Sheet 2 of 2    US 7,974,460 B2

[FIG. 2: graph of focus range (meters) versus S = distance that lens is focused at (meters), for S from 5 to 55 m, plotting "near" D_near = H*s/(H+s) and "far" D_far = H*s/(H-s)]
`
`
METHOD AND SYSTEM FOR THREE-DIMENSIONAL OBSTACLE MAPPING FOR NAVIGATION OF AUTONOMOUS VEHICLES

BACKGROUND
`
`
`
`
`
`
`
`
`
`
`
Autonomous vehicles are widely used and include a variety of unmanned ground vehicles, underwater vehicles, and aerospace vehicles, such as robots and unmanned aerial vehicles (UAVs). An autonomous vehicle is required to make decisions and respond to situations completely without human intervention. There are major limitations to the overall performance, accuracy, and robustness of navigation and control of an autonomous vehicle. In order to perform navigation properly, an autonomous vehicle must be able to sense its location, steer toward a desired destination, and avoid obstacles. Various modalities have been used to provide navigation of autonomous vehicles. These include use of the Global Positioning System (GPS), inertial measurements from sensors, and image measurements from cameras.

Smaller UAVs are being developed for reconnaissance and surveillance that can be carried and deployed in the field by an individual or a small group. Such UAVs include micro air vehicles (MAVs) and organic air vehicles (OAVs), which can be remotely controlled. The typical dimension for MAVs is approximately six to twelve inches (15 to 30 cm), and development of insect-size MAVs is underway. Such air vehicles can be designed for operation in a battlefield by troops, and provide small combat teams and individual soldiers with the capability to detect enemy forces concealed in forests or hills, around buildings in urban areas, or in places where there is no direct line-of-sight. Some of these air vehicles can perch and stare, and essentially become sentinels for maneuvering troops.

In order to avoid obstacles during navigation, autonomous vehicles such as UAVs need three-dimensional (3-D) obstacle mapping. Typical vehicle sensors that are used currently are either very expensive (e.g., scanning laser detection and ranging (LADAR)) or require very computationally expensive algorithms (e.g., stereo cameras that try to track many features).
`
`
SUMMARY

The present invention is related to a method and system for obstacle mapping for navigation of an autonomous vehicle. The method comprises providing an autonomous vehicle with an image capturing device, and focusing the image capturing device at a predetermined number of different specified distances to capture an image at each of the specified distances. The method further comprises identifying which regions in each captured image are in focus, and assigning a corresponding lens-focus distance to each of the regions that are in focus. A composite image is formed based on each captured image, with each of the regions labeled with the corresponding lens-focus distance. A three-dimensional obstacle map is then produced from the composite image. The three-dimensional obstacle map has an x, y, z coordinate system, with x being proportional to pixel horizontal position, y being proportional to pixel vertical position, and z being the lens-focus distance.
`
`
`
BRIEF DESCRIPTION OF THE DRAWINGS

Features of the present invention will become apparent to those skilled in the art from the following description with reference to the drawings. Understanding that the drawings depict only typical embodiments of the invention and are not therefore to be considered limiting in scope, the invention will be described with additional specificity and detail through the use of the accompanying drawings, in which:
FIG. 1 is a combined flow chart and functional block diagram for an obstacle detection sensor system according to an embodiment of the invention; and

FIG. 2 is a graph showing range binning resolution and accuracy for an obstacle detection sensor system according to an embodiment of the invention.
`
`
`
`
DETAILED DESCRIPTION

In the following detailed description, embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. It is to be understood that other embodiments may be utilized without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense.
`
`
`
`
`
`
`
The present invention is directed to a method and system for obstacle mapping for navigation of autonomous vehicles. In general, the method comprises providing an autonomous vehicle with an image capturing device, and focusing the image capturing device at a predetermined number of different specified distances to capture an image at each of the specified distances. The method then identifies which regions in the captured images are in focus, and assigns a corresponding lens-focus distance to each of the regions that are in focus. A composite image is formed from the captured images, with each of the regions labeled with the corresponding lens-focus distance. A three-dimensional (3-D) obstacle map is then produced from the composite image.
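The sequence of steps just described (capture at several focus distances, detect in-focus regions, label them with the focus distance) can be sketched as a small pipeline. This is an illustrative sketch only, not code from the patent; `capture_at` and `focused_regions` are hypothetical stand-ins for the camera interface and the focus detector.

```python
# Illustrative sketch of the depth-from-focus mapping loop described above.
# The camera interface and focus detector are hypothetical stand-ins.

def build_obstacle_map(capture_at, focused_regions, focus_distances):
    """capture_at(d) -> image captured with the lens focused at distance d.
    focused_regions(img) -> set of (row, col) region indices that are in focus.
    Returns a dict mapping each in-focus region to its lens-focus distance."""
    composite = {}
    for d in focus_distances:
        img = capture_at(d)
        for region in focused_regions(img):
            # Label each in-focus region with the lens-focus distance;
            # keep the first (nearest) assignment if a region focuses twice.
            composite.setdefault(region, d)
    return composite

# Toy demonstration with fake capture/focus functions:
distances = [8, 9, 11, 13, 18, 31, 52]   # meters, as in the patent's example
fake_capture = lambda d: d               # the "image" is just the distance tag
fake_focus = lambda img: {(0, 0)} if img == 8 else {(0, 1)} if img == 52 else set()
m = build_obstacle_map(fake_capture, fake_focus, distances)
print(m)  # {(0, 0): 8, (0, 1): 52}
```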
`
`
`
`
`
`
`
`
The 3-D obstacle mapping can be accomplished using monocular camera autofocus algorithms to produce a two-dimensional (2-D) array of ranges for each of the regions in focus. This information in each of the images is then employed to build a 3-D obstacle map.
`
`
`
`
`
`
`
`
`
The present method and system can be used for obstacle avoidance in an autonomous vehicle such as an unmanned aerial vehicle (UAV), including a micro air vehicle (MAV) or an organic air vehicle (OAV), but are not limited to light payload platforms.
FIG. 1 is a combined flow chart and functional block diagram for an obstacle detection and mapping system 100 according to one embodiment of the invention. The system 100 includes an image capturing device 110, such as a digital camera. A mapping module 120, which contains 3-D mapping software, is in operative communication with image capturing device 110. An obstacle avoidance module 130 containing obstacle avoidance software, such as a Laplacian path planning algorithm, is in operative communication with mapping module 120.
`
`
`
`
`
`
`
Further details related to Laplacian path planning algorithms can be found in copending U.S. patent application Ser. No. 11/470,099, filed on Sep. 5, 2006, the disclosure of which is incorporated herein by reference.
`
`
`
`
`
`
`
During operation of system 100, the lens of the camera is adjusted to focus at a predetermined number of specified distances (e.g., seven distances) at block 112, and the camera takes a photo (image) at each of these distances (e.g., seven photographs) at block 114. Each of the lens settings for the specified distances is saved in a memory device of the camera, and the photos are downloaded to the mapping module 120 at high speed. The 3-D mapping software in mapping module 120 identifies which regions in each of the photos are in focus (block 122), and assigns a corresponding lens-focus distance
`
to each region that is in focus (block 124). A composite photo is formed with each region labeled with its focus distance (block 126). A 3-D obstacle map is then produced having an x, y, z coordinate system, with

    x = (pixel horizontal position / distance from lens to imager) * (lens-focus distance)

    y = (pixel vertical position / distance from lens to imager) * (lens-focus distance)

    z = (lens-focus distance)

(block 128). Thus, x is proportional to pixel horizontal position, y is proportional to pixel vertical position, and z is the lens-focus distance. The 3-D obstacle map is transferred to the obstacle avoidance module 130, which utilizes the 3-D obstacle map in a standard obstacle avoidance algorithm (block 132).
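As a minimal sketch of the coordinate relations in block 128, assuming pixel offsets are measured in meters on the imager from the optical axis, and that the distance from lens to imager is approximately the focal length:

```python
def pixel_to_xyz(u_m, v_m, lens_to_imager_m, focus_dist_m):
    """Map an in-focus pixel to 3-D obstacle coordinates per the relations
    above: x and y scale the pixel offsets (on the imager, in meters from
    the optical axis) by focus_dist / lens_to_imager; z is the lens-focus
    distance itself."""
    scale = focus_dist_m / lens_to_imager_m
    return (u_m * scale, v_m * scale, focus_dist_m)

# A pixel 1 mm right of center, with a 50 mm (0.05 m) lens-to-imager
# distance and a region lens-focus distance of 10 m, maps to roughly
# x = 0.2 m, y = 0, z = 10 m:
x, y, z = pixel_to_xyz(0.001, 0.0, 0.05, 10.0)
print(x, y, z)
```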
`
`
`
`
`
`
Further details related to implementing the method and system of the invention are set forth hereafter.

Equipment
`
`
`
`
`
`
`
`
In one embodiment, the obstacle detection and mapping system can be based on a single digital camera (8.6 megapixel) with full-size imager (e.g., 24 mm by 36 mm imager with 2400*3600 = 8.6 million pixels) and a normal 50 mm lens. The camera has a focal length (f) of 0.05 m, a pixel width (c) of (24 mm CCD width)/(2400 pixels) = 10 microns (microns are typical units for measuring pixel size), and an f-number (N) of 3.0. The field of view (FOV) = (24 mm CCD width)/(50 mm focal length) = 30 degrees. The autofocus algorithms in the camera can be used to distinguish range in 6 bins, from about 7 meters to about 45 meters. Typical image compression schemes, such as JPEG, use 4*4 or 8*8 cells of pixels, and information in the JPEG coefficients can be used to determine if that 4*4 or 8*8 region is in focus. This type of range determination can be done for each of (2400/4)*(3600/4) = 600*900 subregions, which gives a (30 degrees)/600 = 0.05 degree angular resolution. Each autofocus cell has 4*4 pixels, and photographs can be taken with six different lens positions within 0.25 sec (24 frames per sec (fps)). This provides z = range for each of the 600*900 (x, y) values in an image, which can be used to generate 3-D obstacle maps.
`
`
`
`
`
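The parameter arithmetic in this embodiment can be checked with a few lines of Python. This is only a sketch of the text's own numbers; note that the stated 30-degree field of view is an approximation of the exact 2*atan(12 mm / 50 mm), which is about 27 degrees, so the angular-resolution step below uses the text's rounded 30-degree figure.

```python
import math

# Checks the parameter arithmetic for the full-size-imager camera above
# (a sketch of the text's own numbers, not an implementation).
ccd_w_mm = 24.0     # CCD dimension the text uses for FOV and pixel width
pixels = 2400       # pixels along that dimension
focal_mm = 50.0
cell = 4            # 4*4-pixel autofocus/JPEG cells

pixel_width_um = ccd_w_mm / pixels * 1000                 # 10 microns
fov_exact_deg = math.degrees(2 * math.atan((ccd_w_mm / 2) / focal_mm))  # ~27 deg
fov_deg = 30.0                                            # text's rounded figure
subregions = (2400 // cell, 3600 // cell)                 # (600, 900)
angular_res_deg = fov_deg / 600                           # 0.05 deg per subregion

print(pixel_width_um, round(fov_exact_deg), subregions, angular_res_deg)
```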
`
`
`
In another embodiment, the obstacle detection and mapping system can be based on a single 8 megapixel digital single-lens reflex (SLR) camera. Such a system gives seven range bins for an obstacle distance of about 7 m to about 52 m, for each of 400*300 JPEG regions, where each JPEG region contains 8*8 pixels. The only required computation is the JPEG algorithm in the digital camera, and the selection of one high-spatial-frequency coefficient for each 8*8 JPEG region to determine if that region is in focus. An ultrasonic range sensor can be optionally used to augment the camera to determine range to nearby windows or featureless walls. A field of view of about 45 degrees is also needed for obstacle avoidance in this embodiment.
`
`
`
`
`
Range Resolution

In determining range resolution, depth information is obtained by using the fact that a wide-aperture and/or long focal-length lens has a small depth-of-field, so a photograph will only be in focus in regions where obstacles are in a given range bin. For example, to determine which areas of a photograph are in focus, each 8*8 cell in a JPEG representation of the photograph is examined. An 8*8 JPEG cell is in focus if-and-only-if it has large coefficients for its highest spatial frequencies. By storing the values of the largest high-spatial-frequency coefficients, an indicator is provided for whether the 8*8 cell of pixels is in focus. By taking a sequence of (e.g., around seven) photographs, with the lens focused at a sequence of distances (e.g., 8 m, 9 m, 11 m, 13 m, 18 m, 31 m, and 52 m), a composite photograph can be constructed that has depth information in seven range bins for each 8*8 pixel area on the composite photograph. Given a high resolution photograph, e.g., 3500*2300 pixels (8 megapixels), a total of (3500/8)*(2300/8) = 437*287 JPEG cells with range-bin information can be obtained.
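A sketch of this focus test, assuming a plain 2-D DCT-II as the JPEG-style transform and an arbitrary illustrative threshold (the text does not specify one):

```python
import math

def dct2_8x8(cell):
    """Plain (unscaled) 2-D DCT-II of an 8*8 cell, the transform family JPEG
    uses; cell is an 8*8 list of lists of pixel values."""
    N = 8
    out = [[0.0] * N for _ in range(N)]
    for u in range(N):
        for v in range(N):
            s = 0.0
            for x in range(N):
                for y in range(N):
                    s += (cell[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * N)))
            out[u][v] = s
    return out

def in_focus(cell, threshold=50.0):
    """Focus indicator in the spirit of the text: the cell is 'in focus' when
    its highest-spatial-frequency coefficients are large. The threshold is an
    arbitrary illustrative value, not from the patent."""
    coeffs = dct2_8x8(cell)
    hi = max(abs(coeffs[u][v]) for u in range(8) for v in range(8) if u + v >= 10)
    return hi > threshold

sharp = [[255 if (x + y) % 2 else 0 for y in range(8)] for x in range(8)]  # checkerboard
flat = [[128] * 8 for _ in range(8)]                                       # featureless
print(in_focus(sharp), in_focus(flat))  # True False
```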
`
`
`
`
`
`
`
`
`
`
If a camera/lens system is focused at its hyper-focal distance, H, then all viewed objects at distances between H/2 and infinity will be in focus. So the range-from-focus technique that is used in the present method only works out to a distance of H/2. In order to measure range out to 52 m, a camera/lens system with hyper-focal distance of at least H = f*f/(N*c) = 104 meters is needed.
`
`
`
`
`
`
`
`
`
FIG. 2 is a graph showing range binning resolution and accuracy. The graph plots the maximum and minimum distance that a lens is focused at (s) with respect to nominal focus range, for seven range bins from about 7 m to about 52 m for a camera/lens system with H = 104 m. The near distance is determined by D_near = H*s/(H+s), and the far distance is determined by D_far = H*s/(H-s). The graph shows that the spread between maximum and minimum focus distance increases rapidly as the focus distance approaches the hyper-focal distance H. The resulting number of bins of distinguishable distances is quite small for ranges approaching the hyper-focal distance, while the number of distinguishable range bins is much larger for closer ranges.
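The two depth-of-field relations can be evaluated directly to see how the bins spread; a sketch, using H = 104 m as the value the text associates with FIG. 2:

```python
# Range-bin edges from the thin-lens depth-of-field relations quoted above:
# when the lens is focused at s, obstacles between D_near = H*s/(H+s) and
# D_far = H*s/(H-s) appear in focus, for hyper-focal distance H.

def range_bin(H, s):
    d_near = H * s / (H + s)
    d_far = H * s / (H - s) if s < H else float("inf")
    return d_near, d_far

H = 104.0  # meters, the hyper-focal distance of the FIG. 2 example
for s in [8, 9, 11, 13, 18, 31, 52]:
    near, far = range_bin(H, s)
    # The spread (far - near) grows rapidly as s approaches H, matching the
    # FIG. 2 description.
    print(f"s = {s:2d} m: in focus from {near:5.1f} m to {far:6.1f} m")
```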
Vibration

On a sunny day, with an F16 lens setting, the shutter speed needs to be approximately the inverse of the ISO setting, so at ISO 100, shutter speed needs to be around 1/100 second. If an F1.8 lens is used, shutter speeds need to be around:

    (shutter time) = (1/100)*(1.8/16)^2 sec = (1/8000) sec

Vibration frequencies are at frequencies of vehicle engine revolutions per minute (RPM) and several higher harmonics. If the vehicle engine RPM is around 6000 RPM = 100 Hz, the fraction of an engine revolution undergone during the time that the shutter is open is:

    (shutter time)*(engine rate) = ((1/8000) sec)*(100 Hz) = 1/80 << 1

Thus, any vibrations should not blur the photographs.
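The exposure arithmetic above can be reproduced directly (a sketch of the text's own calculation):

```python
# Sunny-day exposure arithmetic from the Vibration discussion: the shutter
# time scales from 1/100 s at f/16 (ISO 100) by the square of the aperture
# ratio, and the blur criterion is the fraction of an engine revolution
# elapsed while the shutter is open.

base_shutter_s = 1.0 / 100                     # "sunny 16" at ISO 100
shutter_s = base_shutter_s * (1.8 / 16) ** 2   # f/1.8 lens -> about 1/8000 s

engine_rate_hz = 100.0                         # 6000 RPM engine
rev_fraction = shutter_s * engine_rate_hz      # roughly 1/80, i.e. << 1
print(shutter_s, rev_fraction)
```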
`
`
`
Sensor Update Rate

An additional requirement for good obstacle avoidance for MAV or OAV type vehicles is an obstacle-sensor update rate of about 1 Hz. Digital SLR cameras with frame rates of at least 5 Hz (5 fps) can be used to provide this capability. This would allow a sequence of seven photos to be updated at (5 fps)/(7 frames) = 0.7 Hz. For example, when the 5 fps speed of a Canon EOS-20D is in burst mode, the photographs are stored into the camera's fast memory buffer. The present method only needs to transfer a small fraction (e.g., 1/(8*8) = 1/64, since 8*8 cells of pixels are grouped) of that data out to an obstacle-avoidance algorithm, since collision avoidance does not need the full angular resolution of a digital SLR camera. With a USB2 data transfer between the camera and a computer, a full 8 megapixel image can be transferred in about 0.5 seconds. Since the present method only transfers 1/64 of the detail in the image (one byte per 8*8 pixel JPEG region, needed for range determination), the time needed to transfer range data from 1 frame is: (0.5 sec/full frame)*(1/64) = 0.008 sec.
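A sketch of the update-rate and transfer-time arithmetic from this section:

```python
# Update-rate and data-transfer arithmetic from the discussion above.

fps = 5.0                    # digital SLR burst rate
photos_per_sweep = 7         # one photo per lens-focus distance
update_rate_hz = fps / photos_per_sweep        # ~0.7 Hz obstacle-map updates

full_frame_transfer_s = 0.5  # USB2 transfer of a full 8 MP image
fraction = 1 / (8 * 8)       # one byte per 8*8-pixel JPEG region
range_data_transfer_s = full_frame_transfer_s * fraction   # ~0.008 s per frame
print(update_rate_hz, range_data_transfer_s)
```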
`
`
`
Example Calculations

To determine the feasibility of using a digital SLR camera for building a 3-D obstacle map, the following example calculates hyper-focal distance and field of view for the camera described in the previous section. Camera: Canon EOS-20D, 3500*2300 pixel 22.5 mm by 15 mm CCD (pixel width = 6.5 micron). Lens: Sigma f = 30 mm, f-number = 1.4, EX DC HSM with Canon mount. The hyper-focal distance, H, and fields of view are calculated as follows:

    H = f^2 / ((F-number)*(pixel width)) = (0.030 m)^2 / (1.4 * 6.5*10^-6 m) = 99 m

    Horizontal field of view = 2*tan^-1(((22.5 mm)/2) / 30 mm) = 41 degrees

    Vertical field of view = 2*tan^-1(((15 mm)/2) / 30 mm) = 28 degrees.
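The example calculation can be re-run numerically (a sketch using the values given in the text):

```python
import math

# Re-runs the Canon EOS-20D / Sigma 30 mm example calculation from the text.
f_m = 0.030              # focal length
f_number = 1.4
pixel_width_m = 6.5e-6   # ~22.5 mm / 3500 pixels, to the text's precision

H = f_m ** 2 / (f_number * pixel_width_m)               # hyper-focal distance, ~99 m
hfov = math.degrees(2 * math.atan((22.5e-3 / 2) / f_m)) # ~41 degrees
vfov = math.degrees(2 * math.atan((15e-3 / 2) / f_m))   # ~28 degrees
print(round(H), round(hfov), round(vfov))  # 99 41 28
```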
`
`
`
Instructions for carrying out the various process tasks, calculations, and generation of signals and other data used in the operation of the method and system of the invention can be implemented in software, firmware, or other computer readable instructions. These instructions are typically stored on any appropriate computer readable media used for storage of computer readable instructions or data structures. Such computer readable media can be any available media that can be accessed by a general purpose or special purpose computer or processor, or any programmable logic device.

Suitable computer readable media may comprise, for example, non-volatile memory devices including semiconductor memory devices such as EPROM, EEPROM, or flash memory devices; magnetic disks such as internal hard disks or removable disks; magneto-optical disks; CDs, DVDs, or other optical storage disks; nonvolatile ROM, RAM, and other like media; or any other media that can be used to carry or store desired program code means in the form of computer executable instructions or data structures. Any of the foregoing may be supplemented by, or incorporated in, specially-designed application-specific integrated circuits (ASICs).

When information is transferred or provided over