DEVELOPING TECHNOLOGIES FOR ARMY AUTONOMOUS LAND VEHICLES (U)
U.S. Army Engineer Topographic Labs, Fort Belvoir, VA
R. D. Leighty et al., Oct 85, ETL-R122
UNCLASSIFIED
`
`
REPORT DOCUMENTATION PAGE                 Form Approved, OMB No. 0704-0188

AD-A187 972

1a. REPORT SECURITY CLASSIFICATION: UNCLASSIFIED
1b. RESTRICTIVE MARKINGS: (none)
2a. SECURITY CLASSIFICATION AUTHORITY
2b. DECLASSIFICATION/DOWNGRADING SCHEDULE
3. DISTRIBUTION/AVAILABILITY OF REPORT: Approved for public release;
   distribution is unlimited.
4. PERFORMING ORGANIZATION REPORT NUMBER(S): R-122
5. MONITORING ORGANIZATION REPORT NUMBER(S)
6a. NAME OF PERFORMING ORGANIZATION: USAETL
6b. OFFICE SYMBOL: CEETL-LO
6c. ADDRESS (City, State, and ZIP Code): Fort Belvoir, VA 22060-5546
7a. NAME OF MONITORING ORGANIZATION
7b. ADDRESS (City, State, and ZIP Code)
8a. NAME OF FUNDING/SPONSORING ORGANIZATION; 8b. OFFICE SYMBOL
8c. ADDRESS (City, State, and ZIP Code)
9. PROCUREMENT INSTRUMENT IDENTIFICATION NUMBER
10. SOURCE OF FUNDING NUMBERS: PROGRAM ELEMENT NO. / PROJECT NO. /
    TASK NO. / WORK UNIT ACCESSION NO.
11. TITLE (Include Security Classification): DEVELOPING TECHNOLOGIES
    FOR ARMY AUTONOMOUS LAND VEHICLES (U)
12. PERSONAL AUTHOR(S): DR. ROBERT D. LEIGHTY AND MR. GERALD R. LANE
13a. TYPE OF REPORT: TECHNICAL
13b. TIME COVERED: FROM ___ TO ___
14. DATE OF REPORT (Year, Month, Day): OCT 85
15. PAGE COUNT
16. SUPPLEMENTARY NOTATION
17. COSATI CODES: FIELD / GROUP / SUB-GROUP
18. SUBJECT TERMS: robotic land vehicle technologies
19. ABSTRACT: N/A (presentation with publications in the proceedings.)
20. DISTRIBUTION/AVAILABILITY OF ABSTRACT: UNCLASSIFIED/UNLIMITED
21. ABSTRACT SECURITY CLASSIFICATION: UNCLASSIFIED
22a. NAME OF RESPONSIBLE INDIVIDUAL: F. DARLENE SEYLER
22b. TELEPHONE (Include Area Code): (202) 355-2647
22c. OFFICE SYMBOL: CEETL-LO

DD Form 1473, JUN 86. Previous editions are obsolete.
SECURITY CLASSIFICATION OF THIS PAGE: UNCLASSIFIED
`
`
`
`DEVELOPING TECHNOLOGIES FOR ARMY
`AUTONOMOUS LAND VEHICLES (U)
`
`*ROBERT D. LEIGHTY, DR.
`U.S. ARMY ENGINEER TOPOGRAPHIC LABORATORIES
`FORT BELVOIR, VA 22060-5546
`
`GERALD R. LANE, MR.
`U.S. ARMY TANK AND AUTOMOTIVE COMMAND
`WARREN, MI 48397-5000
`
`INTRODUCTION
`
The battlefield, as described in AirLand Battle 2000, will be
characterized by considerable movement, large areas of operations in
a variety of environments, and the potential use of increasingly
sophisticated and lethal weapons throughout the area of conflict.
Opposing forces will rarely be engaged in the classical sense, and clear
differentiation between rear and forward areas will not be possible.
To operate effectively under these conditions the Army must bring new
technologies to the battlefield. In 1981 the Army commissioned a study
to suggest applications of artificial intelligence (AI) and robotics
technologies in Army combat and combat support functions [1]. One hundred
applications were suggested and these were divided into ten categories.
The technologies were indicated to be immature for a large number of
the potential applications, but the number of key technology elements
associated with AI and robotics is relatively small. Thus development
of future Army systems that integrate AI and robotic capabilities to
more effectively move, shoot, and communicate on the battlefield may
depend on the maturity of relatively few key technology elements.
Autonomous land vehicles represent one subclass of these systems which
has been a subject of increasing Army interest [2]. The potential value
of such systems for unmanned weapons platforms, reconnaissance, resupply,
etc., has been recognized at all levels [3,4]. This has created a
dilemma: the user community has initially expressed a need for these
systems, while the laboratory community has indicated a present lack
of technological maturity.
`
This paper describes research, development, and demonstration of
robotic land vehicle technologies and a recent R&D partnership between
the Defense Advanced Research Projects Agency (DARPA) and the U.S. Army
Tank and Automotive Command (TACOM). The partnership is significant
because it addresses both horns of the dilemma. The DARPA program focuses
`on developing the key technologies for autonomous vehicle navigation
`and provides a critical mass of dollars and talent to meet these
`objectives, while the Army program integrates DARPA and other technologies
`to provide demonstrations of value to the user community.
`
`DARPA STRATEGIC COMPUTING PROGRAM
`
The DARPA Strategic Computing Program (SCP) was a new initiative
in October 1983 [5]. It was designed to seize an opportunity to leverage
recent advances in AI, computer science, and microelectronics and create
a new generation of "machine intelligent technology." The program focuses
on three military applications of machine intelligence technology: (1)
the Autonomous Land Vehicle, (2) the Pilot's Associate, and (3) the
Battle Management Advisors. Each application has yearly demonstrations
of prototype systems of increasing complexity, and the requirements of
each demonstrator have been structured to "pull" new capabilities from
the technology base, rather than "push" available capabilities at the
user. The SCP has a large built-in technology base research program
addressing areas of advanced computing technologies such as image
understanding, expert systems, voice recognition, natural language
understanding, and microelectronics. These technology efforts are
appropriately linked to the demonstrators. The expected expenditure
of the SCP is $600 million over the five-year period 1984-1988.
`
`THE AUTONOMOUS LAND VEHICLE PROGRAM
`
The ALV focuses on development of a broadly applicable autonomous
navigation technology base, and not vehicle development per se. The
primary requirement of the ALV testbed is to provide a platform with
flexibility to integrate and demonstrate the SCP technologies. Objectives
of the ALV yearly demonstrations are:

1985 - Road Following Demonstration: Vehicle traverses a 2 km preset
route on a paved road at speeds up to 10 km/hr. Forward motion only
and no obstacle avoidance required.

1986 - Obstacle Avoidance Demonstration: Vehicle traverses 5 km road
course at speeds up to 20 km/hr; must recognize and maneuver to avoid
fixed objects that are small with respect to road width.

1987 - Cross-country Route Planning Demonstration: Vehicle plans and
executes a 5 km traverse of open desert terrain at speeds up to 5 km/hr.
Demonstrates soil and ground cover typing.

1988 - Road Network Route Planning and Obstacle Avoidance Demonstration:
`
`
Vehicle plans and executes a 20 km point-to-point traverse through a
road network at speeds up to 20 km/hr using landmarks as navigation
aids. Demonstration includes map updating and off-road maneuvering
to avoid obstacles.
`
1989 - Cross-country Traverse with Landmark Recognition Demonstration:
Vehicle plans and executes a 20 km traverse through desert terrain with
obstacles at speeds up to 10 km/hr. Demonstration includes replanning
when confronted with impassable obstacles.
`
1990 - Mixed Road and Open Terrain Demonstration: Vehicle plans and
executes a 20 km traverse in wooded terrain with isolated obstacles
and a 50 km traverse on paved and unpaved roads at speeds up to 50 km/hr.
Route planning includes multiple goals.
`
Martin Marietta Denver Aerospace, Denver, CO, won the competition
as ALV integrating contractor in August 1984 and has responsibilities
for all project research and development except vision algorithm
development. In this regard, the University of Maryland directly supports
the ALV project and the Technology-base Vision contractors will provide
vision algorithm support in the future. Martin Marietta is supported
by two additional contractors: Hughes AI Research Laboratory provides
planning software support, and the Environmental Research Institute of
Michigan (ERIM) is developing and supports the laser ranging imaging
system. The U.S. Army Engineer Topographic Laboratories will produce
the digital terrain data base for the Martin Marietta test area.
`
`AUTONOMOUS LAND VEHICLE TECHNOLOGIES
`
`FUNCTIONAL REQUIREMENTS FOR AUTONOMOUS LAND VEHICLES
`
Autonomous mobility in a dynamic unconstrained environment requires
that a system sense its environment, model critical features from the
sensed data, reason about the model to determine a mobility path, and
control the vehicle along the path. Functional subsystems could be:

SENSORS: The sensors subsystem must have the capability to sense
critical environmental features having impact on mobility.

PERCEPTION: The perception subsystem must be able to process sensor
data to create a perceptive model of the environment.

REASONING: The reasoning subsystem must be capable of reasoning
about the perceptive model and information from the knowledge base to
determine appropriate mobility strategies.

CONTROL: The control subsystem must execute stable control to travel
along the selected path.
`
KNOWLEDGE BASE: The vehicle system must have access to knowledge
about the environment, the capabilities of the vehicle, the mission
requirements, and characteristics of the environmental features.
`
VEHICLE: The vehicle system must have a stable platform capable
`of carrying necessary sensors, computers, electronics, and communications
`equipment at required speeds for on-road and cross-country travel.
`
HUMAN INTERFACE: The vehicle system must interface with a human
`operator to accept mission goals, report on system status, and assist
`in problem solving.
`
A general scenario for operation of the ALV platform integrates
`these functional requirements. The mission begins when a human operator
`specifies mission objectives and constraints to the vehicle system via
`a man/machine communications
`interface.
`The reasoning subsystem
`interprets mission goals and constraints and decomposes them into
`subgoals.
`From information in the knowledge base and the subgoals,
`the reasoning subsystem prepares a global plan of its route and actions.
`Upon completion of the global planning, the reasoning subsystem provides
goals to the perception subsystem for decomposition into tasks to be
`accomplished by the sensors subsystem. Scene data acquired by the sensors
`subsystem along the proposed route is passed to and processed by the
`perceptual subsystem to produce a high-level symbolic model of the
`environmental features along the route. If no obstacles are detected,
`the reasoning system updates its position along the route and issues
`commands to the control subsystem to move along the route. If obstacles
`are detected, the reasoning subsystem initiates local data acquisition
`and planning to effect circumnavigation around the obstacle. If local
`planning produces no acceptable circumnavigation path, the global planning
process is reinitiated from the current location. And if no acceptable
`route is found, the vehicle requests assistance from the operator.
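The scenario above reduces to a sense/plan/act loop with escalating fallbacks: local replanning, then global replanning, then operator assistance. A minimal sketch of that control flow follows; every function name is a hypothetical stand-in supplied by the caller, not taken from the actual ALV software.

```python
def run_mission(goals, plan_global, sense, plan_local, move, ask_operator):
    """Sense/plan/act loop with escalating fallbacks, following the
    operating scenario above. All callables are caller-supplied stubs;
    their names are illustrative, not from the ALV software."""
    route = plan_global(goals)              # global plan of route and actions
    while route:
        scene = sense(route[0])             # perceptual model of the next leg
        if not scene.get("obstacle"):
            move(route.pop(0))              # clear: command motion along route
            continue
        detour = plan_local(scene)          # local planning to circumnavigate
        if detour:
            for leg in detour:
                move(leg)
            route.pop(0)
            continue
        route = plan_global(goals)          # reinitiate global planning
        if not route:
            ask_operator("no acceptable route")  # last resort: human help
            return False
    return True
```

As in the paper's scenario, the loop only escalates to the operator when global replanning yields no acceptable route; a real system would also guard against replanning repeatedly onto the same blocked route.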
`
`DARPA TECHNOLOGY-BASE VISION FOR THE ALV
`
`The Technology-base Vision efforts of the SCP are focused on issues
`that are impediments to real-time image understanding in outdoor
environments. The research addresses the perceptual subsystem in the
above discussion and has issues that include development of: robust
and general models for objects and terrain features; general
representation schema for computer vision primitives and knowledge;
the ability to generate 3D scene descriptions; spatial reasoning
capabilities; massive computational speedups at all levels of the
computer vision problem; sound theoretical foundations for vision
process models; techniques for dealing with the dynamic aspects of
rapidly changing environments; and
`
`
`integrated vision systems that can perform complex tasks in real time.
`The Technology-base Vision efforts address these issues with a substantial
`set of contractors which include: Carnegie-Mellon University (CMU);
`SRI International (SRI); Advanced Decision Systems (ADS); Stanford
`University (SU); General Electric Corporation (GE); Hughes AI Research
Laboratory (Hughes); University of Massachusetts (UMass); University
of Southern California (USC); Honeywell Corporation (Honeywell);
Columbia University (CU); University of Rochester (UofR); and
Massachusetts Institute of Technology (MIT). A brief description of
`the research responsibilities of these organizations follows.
`
`New-Generation Vision System Development (CMU): A new-generation
`vision system is to be developed for dynamic image understanding environ-
`ments for ALV applications. A system framework will be built to
`accommodate integration of component research tasks outlined below.
`
Common Vision Representation Schema (SRI): Different representation
schema needed for various parts of the computer vision process
and the construction of a spatial directory to provide a uniform means
of handling differentiation models will be developed.
`
Visual Modeling (SRI, ADS with SU, and GE): This involves discovery
of general models to represent objects and natural terrain for
predicting and matching against real world observations. Also included
is the application of reasoning techniques to improve geometric model
construction and object identification.
`
Obstacle Avoidance (Hughes): Discriminatory techniques are
investigated for distinguishing and evaluating obstacles in the path of
a vehicle and the integration of those techniques with a planner to
avoid obstacles along a planned path.
`
Dynamic Image Interpretation (UMass): This effort focuses on
discovery of knowledge about dynamic environments and development of
improved image recognition techniques that accommodate distortions
arising from movement within the environment.
`
Target Motion Detection (USC): Motion analysis technology is studied
to detect moving objects within the ALV field of view.
`
`Object Recognition and Tracking (Honeywell): This effort involves
`discovery of improved object recognition techniques and higher-level
`knowledge to permit tracking of objects from scene to scene.
`
Real Time Issues (UofR, UMass, MIT, CU): Development of a common
parallel programming environment, parallel processing algorithms,
specialized parallel processing techniques for depth mapping, and
`
`
`integrated advanced architecture for parallel processing at all levels
`
`of the computer vision process are the thrusts of these efforts.
`
`ALV SUBSYSTEMS
`
The May 1985 road following demonstration was accomplished by Martin
Marietta with vision algorithm assistance from the University of Maryland.
This was a significant accomplishment, since the contract was awarded
in late August 1984, and an initial demonstration had the vehicle
autonomously traveling along 1 km of road at 5 km/hr. Later in the
year the vehicle traveled 2 km at a speed of 10 km/hr and required
processing at 1.75 sec/frame to segment roads with commercially available
computer hardware. The ALV subsystems in place at the end of 1985,
when the vehicle attained 5 km at 10 km/hr, will be briefly outlined.
`
Sensors: The ALV sensor subsystem employs an RCA color video CCD TV
camera and an ERIM laser range scanner. The video camera acquires 30
frames per second and delivers red, blue, and green intensity images
in analog form to a VICOM image processor that digitizes the three
color bands into 512x484 pixels with 8 bits/pixel. The perception
subsystem controls a pan/tilt drive for this sensor. The laser range
scanner is an amplitude-modulated light source that is scanned over
the area in front of the vehicle. Phase shift of reflected light from
the scene features is measured with respect to an internal reference
to determine range. The range data is processed on the VICOM in the
form of a 64x256 digital array with 8-bit accuracy and requires 1 to
2 seconds to acquire and store as a range image.
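The phase-shift ranging principle can be stated compactly: at modulation frequency f, a full 2π of phase corresponds to a round trip of one modulation wavelength, so the maximum unambiguous range is c/2f. A small sketch of that relation; the modulation frequency used below is purely illustrative, since the paper does not give ERIM's value.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def range_from_phase(phase_rad, mod_freq_hz):
    """Range implied by the measured phase shift of an amplitude-
    modulated beam. The beam travels out and back, so a full 2*pi
    shift corresponds to half a modulation wavelength of range."""
    max_unambiguous = C / (2.0 * mod_freq_hz)   # ambiguity interval, m
    return (phase_rad / (2.0 * math.pi)) * max_unambiguous

# At 15 MHz modulation (illustrative) the unambiguous interval is
# about 10 m, so a pi phase shift places the surface about 5 m away.
```

The 8-bit quantization of the ALV's 64x256 range array then divides this ambiguity interval into 256 range bins.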
`
`Perception: The perception subsystem accepts sensor images and routes
`them to the appropriate processor. It has four major components: (1) a
`video processing component that extracts road edges and activates pan con-
`trols by cues from the reasoning subsystem, (2) a range data processing
`component that produces a set of 3-D points (in the sensor coordinate
`system) representing road edges, (3) a transformation component to correct
`
`
`video or range points to 3-D vehicle coordinates, and (4) an executive that
`switches between components, based on a measure of plausibility of the pro-
`cessed edge points, to transmit a set of 3-D road edge coordinates as the
`scene model.
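The executive's switching logic amounts to scoring each component's 3-D edge points and forwarding the more plausible set. A hypothetical sketch; the plausibility measure is left to the caller because the paper does not define the one Martin Marietta used.

```python
def select_scene_model(video_edges, range_edges, plausibility):
    """Executive component: score the 3-D road-edge point sets
    produced by the video and range processing components and
    transmit the more plausible set as the scene model. The
    plausibility function is a caller-supplied assumption."""
    candidates = {"video": video_edges, "range": range_edges}
    best = max(candidates, key=lambda name: plausibility(candidates[name]))
    return best, candidates[best]
```

For illustration, `plausibility` could be as crude as `len` (prefer the component that recovered more edge points); any real measure would weigh geometric consistency of the edges as well.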
`
Reasoning: The reasoning subsystem receives a plan script from
a human test conductor and coordinates all ALV operations. It requests
scene models from the perception subsystem and converts them into smooth
trajectories that are passed to the pilot to drive the vehicle. It
also provides the perception subsystem with cues about upcoming events
or conditions along its path and is responsible for maintaining a
knowledge base that contains a map and other descriptive data about
the test track. The reasoning subsystem has three major components:
(1) a goal seeker that directs and coordinates the activity of the
reasoning subsystem from a decomposed plan script, controls information
interchange with the perception subsystem, and monitors execution of
the current activity until its completion, when the next plan script
activity is issued; (2) a navigator that receives a scene model and
a goal position, queries the knowledge base about the road location,
and computes a trajectory which is sent to the pilot; and (3) a knowledge
base that maintains a map of the test area.
`
Pilot: The pilot subsystem converts the intervals of a trajectory
into steering commands for the vehicle. It calculates steer right,
steer left, and speed commands by first determining error values for
speed, lateral position, and heading, comparing the current vehicle
heading and speed provided by the land navigation system with the
desired speed and heading specified by the current trajectory interval.
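The error computation described above can be sketched as a simple proportional law. The gains and the combination rule here are illustrative assumptions; the paper does not give the ALV's actual control law.

```python
def pilot_step(current, desired, k_heading=1.0, k_lateral=0.5):
    """One pilot cycle: form speed, lateral-position, and heading
    errors from the land-navigation state and the current trajectory
    interval, then emit speed and steering commands. Positive steer
    means steer right, negative means steer left. Gains are
    illustrative, not the ALV's."""
    speed_err = desired["speed"] - current["speed"]
    lateral_err = desired["lateral"] - current["lateral"]
    heading_err = desired["heading"] - current["heading"]
    steer_cmd = k_heading * heading_err + k_lateral * lateral_err
    return {"speed_cmd": current["speed"] + speed_err,
            "steer_cmd": steer_cmd}
```

Run once per trajectory interval, this reduces the pilot to a pure function of two states, which is why it could live on the Intel multiprocessor apart from the vision hardware.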
`
Knowledge Base: The knowledge base consists of a digital
representation of the road net.
`
Vehicle: The vehicle subsystem has an undercarriage that is an
`eight-wheel hydrostatically driven unit capable of traversing rough
`terrain at speeds up to 29 km/hr and 72 km/hr on improved surfaces.
`Steering is accomplished by reducing or reversing power to one of the
`wheel sets. A 2-inch air tight fiberglass shell is large enough to
`house on-board computers, sensors, associated electronics, electric
`power, and air conditioning for interior environmental control.

Human Interface: The human test conductor directly inputs the
`plan script for the road following test. A deadman switch serves as
`a safety device for halting unexpected or out of control trajectories.
`
`
`Hardware Architecture: The primary computer architectures include
`an Intel multiprocessor system which supports the reasoning subsystem
`and pilot, and a VICOM image processor which supports the perception
`
`
`subsystem. The multichannel controller provides an interface to the
`VICOM image processor and the laser scanner.
`In addition to the
`Intel multiprocessor and VICOM, the ALV's architecture includes a video
`tape recorder, a time-code generator, a Bendix land navigation system,
`left and right odometers, vehicle control and status sensors, and an
`ERIM laser scanner with an associated processor.
`
`FUTURE ALV SUBSYSTEM DEVELOPMENTS
`
`As indicated above, the development of the ALV capabilities is
`driven by the yearly demonstration objectives. As the demonstration
`requirements stress the performance capabilities of the methods and
`equipments, new approaches are necessary to continue the system evolution.
In many cases the methods and equipments already employed are at or
very near the state of the art, and progress will require implementation
directly from basic research in the technology base. Thus prediction
of ALV subsystem developments is risky and subject to change.
`Nevertheless, it is instructional to indicate the major near-range
`subsystem plans, given the present state of the ALV system and the
`technology base program that supports it.
`
`Sensors: A multispectral laser scanner, presently under develop-
`ment at ERIM, will replace the monospectral laser scanner now being
`employed. This scanner will use a YAG laser to develop six discrete
`wavelength beams which are detected as a range image and six reflected
`intensity images having 256x256 pixels.
`
`Perception: The primary near-range enhancements to the perception
`subsystem involve generalization of road following algorithms to allow
`faster travel along roadways with an increased range of variability.
`Avoidance of road obstacles requires their recognition and segmentation
`from sensor data.
`Offroad travel requires multispectral processing
`and segmentation to be modeled and transmitted to the reasoning subsystem.
`
Reasoning: The reasoning subsystem must evolve considerably in
`the near-range to attain the demonstration goals. It must interpret
`a wide range of road, obstacle, and terrain object models; monitor the
`status of the vehicle; reason about its present and future location;
`and adjust speed and direction as necessary.
`
Knowledge base: For 1986, models of roads and obstacles in the
data base will be expanded. In 1987 a terrain data base will be added
for a priori environmental information and terrain object models will
be introduced. A "blackboard" memory structure will be used for
maintaining cognizance of temporal activities and knowledge.
`
Vehicle: The vehicle chassis will not change through 1990; however,
the computers, electronics, and environmental accessories (e.g.,
power supplies and air conditioning) necessary for the demonstrations
must be incorporated in the vehicle enclosure.
`
`Human Interface: A user-friendly command module will be coupled
`to the vehicle with a communications interface.
`
Advanced Computers: At 10 km/hr the onboard VICOM processor is
almost compute bound. There is not enough processing power to analyze
both video and range data simultaneously; therefore the perception
subsystem must choose the set of sensory data to process. Parallel
processors are required for 1986 and beyond. To this end a 16-node
BBN Butterfly parallel processor will be utilized for the reasoning
subsystem and portions of the perception subsystem in 1986. Two onboard
VICOM processors will be used for perceptual processing until mid-1986,
and then these will be replaced by a CMU WARP computer, which is an
advanced multistage programmable systolic array. Once an integrated
hardware and software environment is developed, the WARP-Butterfly
combination will provide powerful parallel support for perception and
reasoning. Both computer systems can be upgraded with additional
capacity as required.
`
Thus it is seen that the DARPA ALV program focuses significant
efforts on key technology issues to meet the demonstration objectives
leading to a new generation of intelligent machines. The following
discussion will indicate how the Army plans to capture and use
technology spinoffs to advance the state of the art of robotic vehicle
systems capable of performing military missions.
`
`ARMY ROBOTIC VEHICLE PROGRAM
`
`The Army robotic vehicle program focuses on demonstration of state
`of the art robotic vehicle capabilities applied to combat missions of
`value to the Army user community.
`The program is structured to
`progressively demonstrate increasing degrees of autonomous capabilities
`in military missions as the technologies evolve.
`This program naturally
`complements the DARPA ALV program which is structured to demonstrate
`increasing capabilities of technologies associated with autonomous
`vehicles beginning with autonomous demonstrations of limited military
`value and increasing in military value as the technologies evolve.
`In the cooperative DARPA and Army programs the ALV provides the transfer
`of technology advances that enhance the Army program's degree of vehicle
`autonomy, while the Army provides military focus for the ALV.
`The Army
`robotic vehicle program is also a mechanism to transfer other DOD
`sponsored vehicle technologies and provides a test bed for evaluation
`of industrial Independent Research and Development (IR&D) in related
efforts. Though the Army program is long-term, it can provide short-term
spinoffs with direct military applications.
`
`
The ultimate goal of the Army program is to demonstrate increased
force effectiveness and/or improved soldier battlefield survivability.
Techniques such as remote management and multiple vehicle control will
be primary tools used in the program.
`
`ROBOTIC VEHICLE SYSTEM
`
Armor and Infantry type missions will be initially demonstrated
with a system consisting of several robot vehicles and a Robotic Command
Center (RCC). The vehicles and RCC will be coupled through communication
subsystems under evaluation. Standard RF, microwave, and fiber optics
links will be initially integrated.

The robot vehicles will have two sensor subsystems: the driving
sensor subsystem and the mission sensor subsystem. The former will
operate in two modes: (1) teleoperated extension of the operator's eyes
and ears through stereo and peripheral vision and stereo microphones,
and (2) supervisory and machine vision allowing operator management
of vehicle actions while onboard image processors interpret images from
the stereo cameras and laser scanners. The mission sensor subsystem,
incorporating thermal imaging, daylight video, and laser rangefinders,
shall be mounted on the mission modules. A telescoping mast will be
employed providing 360° independent rotation.
`
Robotic vehicle machine vision processors will track those of the
ALV, beginning with a VICOM image processor and advancing to WARP and
Butterfly processors as the technologies and needs evolve. Manual
override of vision control provides added mobility in obstacle avoidance
situations which may initially overburden the image processing system.
`
`The vehicle payload will be modularized according to mission and
`permit mounting of modules such as weapon packages or manipulators for
`logistics material handling.
`
`The RCC will be an adapted manned close combat vehicle and house
`all the robotic vehicle displays and operator controls. Displays will
`provide stereo vision, terrain data base, and peripheral and rear camera
`views for driving.
`
`DEMONSTRATION PLAN
`
1986 - The first Army demonstrations will occur in August and
`September at the ALV test site. A route reconnaissance mission will
`be performed by two robotic vehicles under separate supervised control
`from robotic control stations. The supervised control will use both
`state of the art teleoperation and autonomous road following. The robotic
`vehicles under evaluation are built under IR&D programs by FMC Corporation
`
`
and General Dynamics. The FMC vehicle is a tracked M113 and the General
Dynamics vehicle is a 4 x 4 wheeled Commando Scout (chassis is built
by Cadillac Gage).

[Figure: Advanced Ground Vehicle Technology (AGVT) testbeds]

These systems will integrate ALV software and perform military/combat
type missions using both teleoperated and autonomous control. The
teleoperation mode will be used in cross-country terrain and more
difficult maneuvers now requiring man in the control loop. The
autonomous road following algorithms are a product of the November
1985 ALV demonstration. Soldier/machine issues will be evaluated during
the missions. Follow-on Army demonstrations are planned to show
decreasing driver workloads leading to the ability of a driver to
manage multiple vehicles simultaneously.
`
1988 - A newly developed Army owned and operated robot vehicle
system will be used to evaluate the performance of a more aggressive
route reconnaissance mission using ALV software demonstrated in 1986
and 1987. Route reconnaissance will be performed autonomously on the
road at speeds up to 20 km/hr with obstacle avoidance and cross-country
traverse at 5 km/hr. In addition to the ALV software integration,
Computer Aided Remote Driving (CARD), which is a new robotic mobility
technique, will be added to allow a driver to predrive a path via his
display. Using a light pen and a light sensitive stereo display of
the driver's sensor, the driver will designate a path for the vehicle
to follow. CARD, presently being developed for the Army by JPL, is
coupled to vehicle control with an onboard land navigation subsystem
and optical tracking. Preliminary demonstrations of multiple robot
control through teleoperation and fiber optics links will be evaluated.
`
1989-1990 - Improved multi-vehicle management at speeds up to 10
km/hr will be demonstrated. A platoon of vehicles will engage in both
offensive and defensive missions to evaluate the potential increase
in force effectiveness resulting from multiple vehicle control. One
`
`
`driver and one commander/mission specialist will maneuver and operate
`the vehicles.
`Initial site selection will be road networks and smooth
`terrain.
`Speeds will be limited to 10 km/hr.
`These demonstrations
`will result primarily from the 1987 and 1988 ALV software demonstrations
`in cross-country route planning and obstacle avoidance.
`
1991+ - Armored Family of Vehicles (AFV) robotic variants will
execute missions singularly, in packs, or in concert with manned vehicles.
As the ALV software provides higher vehicle speeds, more realistic combat
missions can be demonstrated with increasing autonomy. Vehicles mounting
weapon packages or other mission modules operating in concert with manned
systems will demonstrate performance of military missions. The use
of robotic vehicles in the AFV family is the major objective of the
Army program. AFV will use the robotic vehicle demonstrations to build
follow-on requirements for singular robotic vehicles performing high-risk
tasks or for a platoon of robots in offensive type missions.
`
`ARMY ROBOTIC VEHICLE REQUIREMENTS
`
The Army user community is very interested in the application of
robotics to solve field problems and, through a number of TRADOC Centers
and Schools, has identified potential robotic concepts. Concept
requirements were formulated by a TRADOC General Officer Steering
Committee (GOSC) for AI and Robotics managed by the U.S. Army Soldier
Support Center (USASSC). USASSC released a broad requirements document
for robotic vehicles which summarizes all submissions to the GOSC by
the Schools and Centers. On 15 May 85, USASSC released a "Summary of
TRADOC Requirements for Generic Robotic Vehicle Systems" which recognizes
the evolutionary process required to field robotic vehicles, i.e.,
teleoperation available now, while also being the first step in the
evolution of autonomous systems.
`
`Two leading TRADOC Schools that drove each end of the USASSC require-
`ments were the U.S. Army Infantry School (USAIS) and the U.S. Army
`Armor School (USAARMS). USAIS requirements are for a teleoperated mobile
platform that can mount a mission system to defeat enemy armor. An
Infantryman with a control box will guide the Robotic Anti-Armor Systems
(RAS) to a firing position and then locate, aim, fire, and guide the
missile. Ultimately, USAIS would like to product-improve the RAS to
`provide autonomous mobility and automatic target detection and servicing.
`
`USAARMS's Operational and Organizational (O&O) Plan for the Family
`of Robotic Combat Vehicles looks at using a common chassis with various
mission module combinations. In the normal mode of operation, the robotic
vehicles will receive mission guidance from an RCC in a manned close
`combat vehicle operating in concert with the robotic vehicles. This
`O&O desires autonomous operation, however, for near term applications
`
`
`would field systems directed b