
Sensors
for Mobile
Robots

Theory and Application

H.R. Everett
Naval Command, Control and
Ocean Surveillance Center
San Diego, California

A K Peters, Ltd.
Wellesley, Massachusetts

Editorial, Sales, and Customer Service Office

A K Peters, Ltd.
289 Linden Street
Wellesley, MA 02181

Copyright © 1995 by A K Peters, Ltd.

All rights reserved. No part of the material protected by this copyright notice may be reproduced or utilized in any form, electronic or mechanical, including photocopying, recording, or by any information storage and retrieval system, without written permission from the copyright owner.

Library of Congress Cataloging-in-Publication Data

Everett, H. R., 1949-
Sensors for mobile robots : theory and application / H.R. Everett.
p. cm.
Includes bibliographical references and index.
ISBN 1-56881-048-2
1. Mobile robots. 2. Robots—Control systems. I. Title.
TJ211.415.E83 1995
629.8'92—dc20    95-17178
CIP

Many of the designations used by manufacturers and sellers to distinguish their products are claimed as trademarks. Where designations appear in this book and A K Peters was aware of the trademark claim, the designations have been printed in italics. Where designations have not been provided, every effort has been made to ensure accurate presentation of product names and specifics.

Principal illustrator: Todd Ashley Everett

Printed in the United States of America
99 98 97 96 95    10 9 8 7 6 5 4 3 2 1

Table of Contents

FOREWORD
PREFACE
ACKNOWLEDGMENTS

1. INTRODUCTION
   1.1 DESIGN CONSIDERATIONS
   1.2 THE ROBOTS
       1.2.1 WALTER (1965-1967)
       1.2.2 CRAWLER I (1966-1968)
       1.2.3 CRAWLER II (1968-1971)
       1.2.4 ROBART I (1980-1985)
       1.2.5 ROBART II (1982-)
       1.2.6 MODBOT (1990-)
       1.2.7 USMC TeleOperated Vehicle (1985-1989)
       1.2.8 MDARS Interior (1989-)
       1.2.9 Surrogate Teleoperated Vehicle (1990-1993)
       1.2.10 ROBART III (1992-)
       1.2.11 MDARS Exterior (1994-)
   1.3 REFERENCES

2. DEAD RECKONING
   2.1 ODOMETRY SENSORS
       2.1.1 Potentiometers
       2.1.2 Synchros and Resolvers
       2.1.3 Optical Encoders
   2.2 DOPPLER AND INERTIAL NAVIGATION
       2.2.1 Doppler Navigation
       2.2.2 Inertial Navigation
   2.3 TYPICAL MOBILITY CONFIGURATIONS
       2.3.1 Differential Steering
       2.3.2 Ackerman Steering
       2.3.3 Synchro Drive
       2.3.4 Tricycle Drive
       2.3.5 Omni-Directional Drive
   2.4 INTERNAL POSITION ERROR CORRECTION
   2.5 REFERENCES

3. TACTILE AND PROXIMITY SENSING
   3.1 TACTILE SENSORS
       3.1.1 Tactile Feelers
       3.1.2 Tactile Bumpers
       3.1.3 Distributed Surface Arrays
   3.2 PROXIMITY SENSORS
       3.2.1 Magnetic Proximity Sensors
       3.2.2 Inductive Proximity Sensors
       3.2.3 Capacitive Proximity Sensors
       3.2.4 Ultrasonic Proximity Sensors
       3.2.5 Microwave Proximity Sensors
       3.2.6 Optical Proximity Sensors
   3.3 REFERENCES

4. TRIANGULATION RANGING
   4.1 STEREO DISPARITY
       4.1.1 JPL Stereo Vision
       4.1.2 David Sarnoff Stereo Vision
   4.2 ACTIVE TRIANGULATION
       4.2.1 Hamamatsu Rangefinder Chip Set
       4.2.2 Draper Laboratory Rangefinder
       4.2.3 Quantic Ranging System
   4.3 ACTIVE STEREOSCOPIC
       4.3.1 HERMIES
       4.3.2 Dual-Aperture 3-D Range Sensor
   4.4 STRUCTURED LIGHT
       4.4.1 TRC Strobed-Light Triangulation System
   4.5 KNOWN TARGET SIZE
       4.5.1 NAMCO Lasernet™ Scanning Laser Sensor
   4.6 OPTICAL FLOW
       4.6.1 NIST Passive Ranging and Collision Avoidance
       4.6.2 David Sarnoff Passive Vision
   4.7 REFERENCES

5. TIME OF FLIGHT
   5.1 ULTRASONIC TOF SYSTEMS
       5.1.1 National Semiconductor's LM1812 Ultrasonic Transceiver
       5.1.2 Massa Products Ultrasonic Ranging Module Subsystems
       5.1.3 Polaroid Ultrasonic Ranging Modules
       5.1.4 Cybermotion CA-2 Collision Avoidance System
   5.2 LASER-BASED TOF SYSTEMS
       5.2.1 Schwartz Electro-Optics Laser Rangefinders
       5.2.2 RIEGL Laser Measurement Systems
       5.2.3 Odetics Fast Frame Rate 3-D Laser Imaging System
       5.2.4 RVSI Long Optical Ranging and Detection System
   5.3 REFERENCES

6. PHASE-SHIFT MEASUREMENT AND FREQUENCY MODULATION
   6.1 PHASE-SHIFT MEASUREMENT
       6.1.1 ERIM 3-D Vision Systems
       6.1.2 Perceptron LASAR
       6.1.3 Odetics Scanning Laser Imaging System
       6.1.4 Sandia Scannerless Range Imager
       6.1.5 ESP Optical Ranging System
       6.1.6 Acuity Research AccuRange 3000
       6.1.7 TRC Light Direction and Ranging System
   6.2 FREQUENCY MODULATION
       6.2.1 VRSS Automotive Collision Avoidance Radar
       6.2.2 VORAD Vehicle Detection and Driver Alert System
       6.2.3 Safety First Systems Vehicular Obstacle Detection and Warning System
       6.2.4 Millitech Millimeter Wave Radar
   6.3 REFERENCES

7. OTHER RANGING TECHNIQUES
   7.1 INTERFEROMETRY
       7.1.1 CLS Coordinate Measuring System
   7.2 RANGE FROM FOCUS
       7.2.1 Honeywell Autofocus Systems
       7.2.2 Associates and Ferren Swept-Focus Ranging
       7.2.3 JPL Range-from-Focus System
   7.3 RETURN SIGNAL INTENSITY
       7.3.1 Programmable Near-Infrared Proximity Sensor
       7.3.2 Australian National University Rangefinder
       7.3.3 MIT Near-Infrared Ranging System
       7.3.4 Honeywell Displaced-Sensor Ranging Unit
   7.4 REFERENCES

8. ACOUSTICAL ENERGY
   8.1 APPLICATIONS
   8.2 PERFORMANCE FACTORS
       8.2.1 Atmospheric Attenuation
       8.2.2 Target Reflectivity
       8.2.3 Air Turbulence
       8.2.4 Temperature
       8.2.5 Beam Geometry
       8.2.6 Ringing
       8.2.7 System-Specific Anomalies
   8.3 CHOOSING AN OPERATING FREQUENCY
   8.4 SENSOR SELECTION CASE STUDY
   8.5 REFERENCES

9. ELECTROMAGNETIC ENERGY
   9.1 OPTICAL ENERGY
       9.1.1 Electro-Optical Sources
       9.1.2 Performance Factors
       9.1.3 Choosing an Operating Wavelength
   9.2 MICROWAVE RADAR
       9.2.1 Applications
       9.2.2 Performance Factors
   9.3 MILLIMETER-WAVE RADAR
       9.3.1 Applications
       9.3.2 Performance Factors
       9.3.3 Choosing an Operating Frequency
   9.4 REFERENCES

10. COLLISION AVOIDANCE
    10.1 NAVIGATIONAL CONTROL STRATEGIES
         10.1.1 Reactive Control
         10.1.2 Representational World Modeling
         10.1.3 Combined Approach
    10.2 EXTERIOR APPLICATION CONSIDERATIONS
    10.3 NAVIGATIONAL RE-REFERENCING
    10.4 REFERENCES

11. GUIDEPATH FOLLOWING
    11.1 WIRE GUIDED
    11.2 OPTICAL STRIPE
         11.2.1 ModBot Optical Stripe Tracker
         11.2.2 U/V Stimulated Emission
    11.3 MAGNETIC TAPE
         11.3.1 Macome Magnetic Stripe Follower
         11.3.2 Apogee Magnetic Stripe Follower
         11.3.3 3M/Honeywell Magnetic Lateral Guidance System
    11.4 HEAT AND ODOR SENSING
    11.5 INTERMITTENT-PATH NAVIGATION
         11.5.1 MDARS Interior Hybrid Navigation
         11.5.2 Free Ranging On Grid
    11.6 REFERENCES

12. MAGNETIC COMPASSES
    12.1 MECHANICAL MAGNETIC COMPASSES
         12.1.1 Dinsmore Starguide Magnetic Compass
    12.2 FLUXGATE COMPASSES
         12.2.1 Zemco Fluxgate Compasses
         12.2.2 Watson Gyro Compass
         12.2.3 KVH Fluxgate Compasses
         12.2.4 Applied Physics Systems Miniature Orientation Sensor
    12.3 MAGNETOINDUCTIVE MAGNETOMETERS
         12.3.1 Precision Navigation TCM Magnetoinductive Magnetometer
    12.4 HALL-EFFECT COMPASSES
    12.5 MAGNETORESISTIVE COMPASSES
         12.5.1 Philips AMR Compass
         12.5.2 Space Electronics AMR Compass
         12.5.3 Honeywell HMR Series Smart Digital Magnetometer
    12.6 MAGNETOELASTIC COMPASSES
    12.7 REFERENCES

13. GYROSCOPES
    13.1 MECHANICAL GYROSCOPES
         13.1.1 Space-Stable Gyroscopes
         13.1.2 Gyrocompasses
         13.1.3 Rate Gyros
    13.2 OPTICAL GYROSCOPES
         13.2.1 Active Ring-Laser Gyros
         13.2.2 Passive Ring Resonator Gyros
         13.2.3 Open-Loop Interferometric Fiber-Optic Gyros
         13.2.4 Closed-Loop Interferometric Fiber-Optic Gyros
         13.2.5 Resonant Fiber-Optic Gyros
    13.3 REFERENCES

14. RF POSITION-LOCATION SYSTEMS
    14.1 GROUND-BASED RF SYSTEMS
         14.1.1 Loran
         14.1.2 Kaman Sciences Radio Frequency Navigation Grid
         14.1.3 Precision Technology Tracking and Telemetry System
         14.1.4 Motorola Mini-Ranger Falcon
         14.1.5 Harris Infogeometric System
    14.2 SATELLITE-BASED SYSTEMS
         14.2.1 Transit Satellite Navigation System
         14.2.2 Navstar Global Positioning System
    14.3 REFERENCES

15. ULTRASONIC AND OPTICAL POSITION-LOCATION SYSTEMS
    15.1 ULTRASONIC POSITION-LOCATION SYSTEMS
         15.1.1 Ultrasonic Transponder Trilateration
         15.1.2 Ultrasonic Signature Matching
    15.2 OPTICAL POSITION-LOCATION SYSTEMS
         15.2.1 CRAWLER I Homing Beacon
         15.2.2 ROBART II Recharging Beacon
         15.2.3 Cybermotion Docking Beacon
         15.2.4 Hilare
         15.2.5 NAMCO Lasernet™ Scanning Laser Sensor
         15.2.6 Caterpillar Self-Guided Vehicle
         15.2.7 TRC Beacon Navigation System
         15.2.8 Intelligent Solutions EZNav Position Sensor
         15.2.9 Imperial College Beacon Navigation System
         15.2.10 MTI Research CONAC
         15.2.11 MDARS Lateral-Post Sensor
    15.3 REFERENCES

16. WALL, DOORWAY, AND CEILING REFERENCING
    16.1 WALL REFERENCING
         16.1.1 Tactile Wall Referencing
         16.1.2 Non-Contact Wall Referencing
         16.1.3 Wall Following
    16.2 DOORWAY TRANSIT REFERENCING
    16.3 CEILING REFERENCING
         16.3.1 Polarized Optical Heading Reference
         16.3.2 Georgia Tech Ceiling Referencing System
         16.3.3 TRC HelpMate Ceiling Referencing System
         16.3.4 MDARS Overhead-Beam Referencing
    16.4 REFERENCES

17. APPLICATION-SPECIFIC MISSION SENSORS
    17.1 THE SECURITY APPLICATION
         17.1.1 Acoustical Detection
         17.1.2 Vibration Sensors
         17.1.3 Ultrasonic Presence Sensors
         17.1.4 Optical Motion Detection
         17.1.5 Passive Infrared Motion Detection
         17.1.6 Microwave Motion Detection
         17.1.7 Video Motion Detection
         17.1.8 Intrusion Detection on the Move
         17.1.9 Verification and Assessment
    17.2 AUTOMATED INVENTORY ASSESSMENT
         17.2.1 MDARS Product Assessment System
    17.3 REFERENCES

Foreword

A robot's ability to sense its world and change its behavior on that basis is what makes a robot an interesting thing to build and a useful artifact when completed. Without sensors, robots would be nothing more than fixed automation, going through the same repetitive task again and again in a carefully controlled environment. Such devices certainly have their place and are often the right economic solution. But with good sensors, robots have the potential to do so much more. They can operate in unstructured environments and adapt as the environment changes around them. They can work in dirty, dangerous places where there are no humans to keep the world safe for them. They can interact with us and with each other to work as parts of teams. They can inspire our imaginations and lead us to build devices that not so long ago were purely in the realms of fiction.

Sensors are what makes it all possible.

When it comes right down to it there are two sorts of sensors. There are visual sensors, or eyes, and there are non-visual sensors. Lots of books have been written about visual sensors and computer vision for robots. There is exactly one book devoted to non-visual sensors. This one.

We tend to be a little vision-centric in our "view" (there we go again...) of the world, as for humans vision is the most vivid sensor mechanism. But when we look at other animals, and without the impediment of introspection, another picture (hmmm...) begins to emerge. Insects have two eyes, each with at most perhaps 10,000 sensor elements. Arachnids have eight eyes, many of them vestigial, some with only a few hundred sensor elements, and at most 10,000 again. But insects have lots and lots and lots of other sensors. Cockroaches, for example, have 30,000 wind-sensitive hairs on their legs, and can sense a change in wind direction and alter the direction in which they are scuttling in only 10 milliseconds. That is why you cannot stomp on one unless you have it cornered, and on top of that get lucky. The cockroach can sense your foot coming and change course much faster than you can change where you are aiming. And those 30,000 sensitive hairs represent just one of a myriad of specialized sensors on a cockroach. Plus each different insect has many varied and often uniquely different sensors. Evolution has become a master at producing non-visual sensors.

As robotics engineers we find it hard to create new sensors, but are all aware that in general our robots have a rather impoverished connection to the world. More sensors would let us program our robots in ways that handled more situations, and do better in those situations than they would with fewer sensors. Since we cannot easily create new sensors, the next best thing would be to know what sensors were already available. Up until this point we have all maintained our own little libraries of sensors in our heads. Now Bart Everett has written down all he had in his own private library and more. Bart's robots have always stood out as those with the most sensors, because interactive sensing has always been a priority for Bart. Now he is sharing his accumulated wisdom with us, and robotdom will be a better place for it. Besides providing us with an expanded library, Bart has also done it in a way that everyone interested in robotics can understand. He takes us through the elementary physics of each sensor with an approach that a computer scientist, an electrical engineer, a mechanical engineer, or an industrial engineer can relate to and appreciate. We gain a solid understanding of just what each sensor is measuring, and what its limitations will be.

So let's go build some new robots!

Rodney A. Brooks
MIT AI Lab
Cambridge, MA

Preface

My underlying goal in the preparation of this manuscript was to present some general background on the sensing needs of a mobile system, followed by sufficient theory of operation and illustrative examples such that the overall result is both informative and of practical use. Perhaps the most challenging problem I faced early on in this endeavor was how to arrange reams of information on all the various sensors into some semblance of logical order. One considered possibility was to categorize by class of robot (i.e., airborne, underwater, indoor, exterior, autonomous, teleoperated). Given the emphasis of the book, however, it seemed more appropriate to break down the discussion by sensor type.

In an attempt to bound the problem, I decided to eliminate any treatment of airborne or underwater scenarios and focus instead on interior and exterior land-based applications. Even so, there was still considerable difficulty associated with organizing the flow. For example, at least seven different methods of non-contact ranging techniques are known to exist; one of these methods alone (triangulation) can be implemented in five different ways. Almost all such ranging systems can operate in the acoustical or electromagnetic regions of the energy spectrum; can be active or passive; and may have markedly different assigned functions in actual deployment.

After much weighing of alternative strategies, I chose to present the material in a manner that to some extent parallels the strategy often employed in robotic development. The initial thrust of most early research efforts in which I participated was simply aimed at how to get the robot to move about in a controlled and purposeful fashion. Once this hurdle is surmounted, attention can be turned to collision avoidance, wherein the system learns not to run into things while en route. The proud builders soon realize the robot can perform admirably for some finite length of time but eventually will get lost, whereupon developmental focus shifts to navigational referencing. Applications are tacked on later, sometimes almost as an afterthought.

Accordingly, following some general background discussions in Chapter 1, we start by taking a look in Chapter 2 at the sensors employed in vehicle dead reckoning, with a careful analysis of potential error sources. Tactile and proximity sensors are introduced next in Chapter 3, providing a rudimentary capability to at least detect potential obstructions in time to stop. Chapters 4 through 7 provide an overview of the various distance measurement techniques available, such as triangulation, time of flight, frequency modulation, phase-shift measurement, and interferometry. Related discussion of implementation in the acoustical, radio frequency, and electro-optical domains is presented in Chapters 8 and 9, with a special emphasis on the various factors affecting performance.

This approach hopefully provides a good foundation for later examining how such non-contact ranging sensors are employed in specific roles, first and foremost being in support of collision avoidance (Chapter 10). Navigational referencing, the subject of Chapters 11 through 16, is addressed in considerable detail as it represents one of the biggest remaining stumbling blocks to successful fielding. A few representative samples of application-specific sensors are treated in closing in Chapter 17.

In retrospect, there is considerably less emphasis than I originally intended on image-based systems, as the subject of machine vision quite obviously could be the focus of a book all in itself. And since a number of distinguished individuals far better qualified than myself have in fact taken that very objective to task, I have purposely limited discussion in this volume, and concentrated instead on various alternative (and often less complex) sensing strategies less documented in the open literature. Reference is made throughout the text to candidate systems, both commercially available and under development, in hopes of complementing theory of operation with some practical lessons in real-world usage. These illustrative examples are called out under separate headings where the discussion becomes rather detailed.

I have very much enjoyed the preparation of this manuscript, both in terms of what I learned in the process and the new contacts I made with other researchers in this exciting field. I hope the results as presented here will be useful in promoting the successful employment of mobile robotic systems through increased awareness of available supporting technologies.

H.R. Everett

San Diego, CA

Acknowledgments

A number of people have assisted me in my educational and research endeavors over the years and collectively contributed to making this book a reality. I would like to express my heart-felt appreciation to:

My uncles, Gene Everett and Joe Hickey, who introduced me to electronics at an early age.

My high school geometry teacher, Mrs. Nell Doar, for providing discipline, inspiration, and the mathematical foundation upon which I was to build.

Professor Robert Newton, my thesis advisor at the Naval Postgraduate School, who made it possible for me to pursue a rather unorthodox topic in the field of mobile robotics.

Vice Admiral Earl B. Fowler, USN (Ret.), for creating a robotics program office within the Naval Sea Systems Command, and giving me a job after graduate school.

Dr. Anita Flynn of MIT for all the late nights and weekends we spent hacking code and building our own sensors in my basement in Virginia.

Gary Gilbreath of the Naval Command Control and Ocean Surveillance Center for transforming ROBART II into a truly intelligent machine.

My son, Todd Everett, for his tremendous help in generating all the graphics used in the figures.

All those people kind enough to review this manuscript in the various stages of its completion, offering helpful insights on how best to present the material: Ron Arkin, Johann Borenstein, Fernando Figueroa, Anita Flynn, Doug Gage, Bob Garwood, Tracy Heath, Susan Hower, Robin Laird, Richard Langley, Richard Lao, Larry Mathies, and Hoa Nguyen.

In addition, portions of the material presented in Chapters 4 through 7 were previously published in Sensors and later Robotics and Autonomous Systems magazines, and updated in this book with their kind permissions.

1
Introduction

The past several years have brought about a tremendous rise in the envisioned potential of robotic systems, along with a significant increase in the number of proposed applications. Well-touted benefits typically associated with the installation of fixed-location industrial robots are improved effectiveness, higher quality, reductions in manpower, as well as greater efficiency, reliability, and cost savings. Additional drivers include the ability to perform tasks of which humans are incapable, and the removal of humans from demeaning or dangerous scenarios.

The concept of mobility has always suggested an additional range of applications beyond that of the typical factory floor, where free-roaming robots move about with an added versatility fostering even greater returns. Early developmental efforts introduced potential systems for fighting fires, handling ammunition, transporting materials, and patrolling warehouses and storage areas, to name but a few. Most of the resulting prototypes met with unexpected difficulty, primarily due to an insufficient supporting technology base. Even today, after decades of extensive research and development, the successful application of mobile robots remains for the most part an elusive dream, with only a small handful of fielded systems up and running.

While a number of technological hurdles have impeded progress, the three generally regarded as having the greatest impact are: 1) computational resources, 2) communications, and 3) sensors. The first two areas have been addressed for a variety of commercial reasons with remarkable progress. In just a little over 10 years we have transitioned from 6502- and Z80-based personal computers running under CP/M with a maximum 64-kilobyte address space, to Pentium-based systems running at 90 MHz and addressing up to 32 megabytes of memory. The recent surge in popularity of laptop computers has provided an extra impetus, with special emphasis on reduced power consumption and extended battery life. Wireless local area networks and spread-spectrum technology have likewise advanced in kind, to the point where there are now a number of vendors offering full-duplex Ethernet-compatible high-speed datalinks with ranges of several miles.
`
The third category of sensors now stands somewhat alone as the most significant technical challenge still facing developers, due primarily to a lack of high-volume applications. While there has indeed been some carry-over sensor technology from advances in flexible automation for manufacturing, it has fallen far short of the explosive growth seen in the computer and communications industries. Successful adaptation of what progress has been made is further hampered by the highly unstructured nature of a mobile robot's operating environment. Industrial process-control systems used in repetitive manufacturing scenarios, in contrast, rely on carefully placed sensors that exploit the target characteristics. Background conditions are arranged to provide minimal interference, and often aid in the detection process by purposely increasing the on-off differential or contrast. Unfortunately, such optimized configuration control is usually no longer possible once mobility is introduced as a factor in the equation.

Consider for example the issue of collision avoidance: any mobile robot intended for real-world operation must be capable of moving around without running into surrounding obstructions. In practice, however, the nature and orientation of obstacles are not known with any certainty; the system must be capable of detecting a wide variety of target surfaces under varying angles of incidence. Control of background and ambient conditions may not be possible. A priori information regarding the relative positions, orientations, and nature of objects within the sensor's field of view becomes very difficult to supply.

The situation only worsens when the operating environment is taken outdoors, for a number of reasons. To begin with, problems of scale introduce a need for additional range capability that significantly adds to system complexity and cost. While an indoor collision avoidance system may need to see only 4 to 6 feet in front of the robot, for example, exterior scenarios typically require effective coverage over a 20- to 30-foot span, sometimes more. In addition, the outdoor environment often poses additional complicating hazards to safe navigation (i.e., terrain traversability, oncoming traffic, atmospheric obscurants) that demand appropriate engineering solutions not even addressed on interior systems.
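
As a rough illustration of why the required detection range grows so quickly once a platform leaves the building, the necessary look-ahead distance can be estimated from the speed of advance, the sensing and processing latency, and the achievable deceleration. The short sketch below is not taken from the text; the speeds, latency, braking, and buffer figures are assumed values chosen only to show the order of magnitude involved.

    # Illustrative only: minimum look-ahead range needed to detect an obstruction
    # and stop in time. All numeric inputs below are assumptions for this example.

    def min_detection_range_ft(speed_fps, latency_s, decel_fps2, buffer_ft=1.0):
        """Distance covered during sensing/processing latency, plus braking
        distance at constant deceleration, plus a small standoff buffer."""
        reaction_dist = speed_fps * latency_s
        braking_dist = speed_fps ** 2 / (2.0 * decel_fps2)
        return reaction_dist + braking_dist + buffer_ft

    # Slow indoor platform: about 1.5 ft/s, 0.5-s latency, 3 ft/s^2 deceleration
    print(min_detection_range_ft(1.5, 0.5, 3.0))    # roughly 2 ft
    # Faster exterior platform: about 10 ft/s, 0.5-s latency, 6 ft/s^2 deceleration
    print(min_detection_range_ft(10.0, 0.5, 6.0))   # roughly 14 ft

Because the braking term grows with the square of speed, even modest exterior speeds push the required coverage from a few feet toward the 20- to 30-foot spans cited above, before any allowance is made for oncoming traffic or reduced braking capability on rough terrain.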

On the positive side, worldwide interest in a rapidly expanding field known as intelligent vehicle highway systems (IVHS) has already created a huge potential market for sensors to address many of these problems as faced by the automotive industry (Catling, 1994). Lower-volume autonomous mobile robot applications are sure to benefit from the inevitable spin-off technologies that have already begun to emerge in the form of low-cost laser and millimeter-wave systems, for example. Many of these new and innovative products will be presented as illustrative examples in the following chapters, in hopes of further stimulating this technology-transfer process.

1.1 Design Considerations

The problems confronting most mobile robotic development efforts arise directly from the inherent need to interact with the physical objects and entities in the environment. The platform must be able to navigate from a known position to a desired new location and orientation, avoiding any contact with fixed or moving objects while en route. There has been quite a tendency in early developmental efforts to oversimplify these issues and assume the natural growth of technology would provide the needed answers. While such solutions will ultimately come to pass, it is important to pace the evolution of the platform with a parallel development of the needed collision avoidance and navigation technologies. Fundamental in this regard are the required sensors with which to acquire high-resolution data describing the robot's physical surroundings in a timely yet practical fashion, and in keeping with the limited onboard energy and computational resources of a mobile vehicle. General considerations for such sensors are summarized below, with a brief illustrative screening sketch following the list:

• Field of view — Should be wide enough with sufficient depth of field to suit the application.

• Range capability — The minimum range of detection, as well as the maximum effective range, must be appropriate for the intended use of the sensor.

• Accuracy and resolution — Both must be in keeping with the needs of the given task.

• Ability to detect all objects in environment — Objects can absorb emitted energy; target surfaces can be specular as opposed to diffuse reflectors; ambient conditions and noise can interfere with the sensing process.

• Real-time operation — The update frequency must provide rapid, real-time data at a rate commensurate with the platform's speed of advance (and take into account the velocity of other approaching vehicles).

• Concise, easy to interpret data — The output format should be realistic from the standpoint of processing requirements; too much data can be as meaningless as not enough; some degree of preprocessing and analysis is required to provide output only when action is required.

• Redundancy — The system should provide graceful degradation and not become incapacitated due to the loss of a sensing element; a multimodal capability would be desirable to ensure detection of all targets, as well as to increase the confidence level of the output.

• Simplicity — The system should be low-cost and modular to allow for easy maintenance and evolutionary upgrades, not hardware-specific.

• Power consumption — The power requirements should be minimal in keeping with the limited resources on board a mobile vehicle.

• Size — The physical size and weight of the system should be practical with regard to the intended vehicle.
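
The checklist above can also be made concrete as a simple screening aid during sensor selection. The sketch below is purely illustrative; the field names, the pass/fail comparisons, and every number in the example are assumptions introduced here rather than criteria taken from the text.

    # Illustrative sketch: screening a candidate sensor against application needs.
    # Field names and all example figures are assumptions made for this example.
    from dataclasses import dataclass

    @dataclass
    class SensorSpec:
        field_of_view_deg: float   # angular coverage
        min_range_ft: float        # closest detectable target
        max_range_ft: float        # farthest effective detection
        update_rate_hz: float      # real-time data rate
        power_w: float             # draw on limited onboard energy
        weight_lb: float           # physical practicality for the vehicle
        modalities: frozenset      # e.g. frozenset({"ultrasonic", "near-infrared"})

    def screening_failures(candidate: SensorSpec, required: SensorSpec) -> list:
        """Return the names of requirements the candidate fails to meet."""
        checks = [
            ("field of view",     candidate.field_of_view_deg >= required.field_of_view_deg),
            ("minimum range",     candidate.min_range_ft      <= required.min_range_ft),
            ("maximum range",     candidate.max_range_ft      >= required.max_range_ft),
            ("update rate",       candidate.update_rate_hz    >= required.update_rate_hz),
            ("power consumption", candidate.power_w           <= required.power_w),
            ("size and weight",   candidate.weight_lb         <= required.weight_lb),
            ("redundancy",        required.modalities <= candidate.modalities),
        ]
        return [name for name, passed in checks if not passed]

    # Example use (all figures hypothetical):
    required  = SensorSpec(90.0, 1.0, 20.0,  5.0, 10.0, 5.0, frozenset({"ultrasonic"}))
    candidate = SensorSpec(60.0, 0.5, 35.0, 10.0,  6.0, 3.5, frozenset({"ultrasonic", "near-infrared"}))
    print(screening_failures(candidate, required))   # -> ['field of view']

In practice such a tabulation can only narrow the field; the more qualitative items in the list (ability to detect all objects, concise and easily interpreted output, simplicity) still call for the kind of hands-on performance validation discussed next.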

The various issues associated with sensor design, selection, and/or integration are complex and interwoven, and not easily conveyed from a purely theoretical perspective. Actual device characterization in the form of performance validation is invaluable in matching the capabilities and limitations of a particular sensor technology to the application at hand. Most manufacturers of established product lines prov
