`
`IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS—II: ANALOG AND DIGITAL SIGNAL PROCESSING, VOL. 44, NO. 8, AUGUST 1997
`
`Integrated Circuit Testing for Quality
`Assurance in Manufacturing:
`History, Current Status, and Future Trends
`
`Andrew Grochowski, Member, IEEE, Debashis Bhattacharya, Senior Member, IEEE,
`T. R. Viswanathan, Fellow, IEEE, and Ken Laker, Fellow, IEEE
`
`(Invited Paper)
`
`Abstract— Integrated circuit (IC) testing for quality assur-
`ance is approaching 50% of the manufacturing costs for some
`complex mixed-signal IC’s. For many years the market growth
`and technology advancements in digital IC’s were driving the
`developments in testing. The increasing trend to integrate infor-
`mation acquisition and digital processing on the same chip has
`spawned increasing attention to the test needs of mixed-signal
`IC’s. The recent advances in wireless communications indicate a
`trend toward the integration of the RF and baseband mixed signal
`technologies. In this paper we examine the developments in IC
testing from the historical, current-status, and future viewpoints. In
`separate sections we address the testing developments for digital,
`mixed signal and RF IC’s. With these reviews as context, we relate
`new test paradigms that have the potential to fundamentally alter
`the methods used to test mixed-signal and RF parts.
`
Manuscript received August 8, 1996; revised January 31, 1997. This paper
was recommended by Editor J. Choma, Jr.
A. Grochowski, D. Bhattacharya, and T. R. Viswanathan are with Texas
Instruments Inc., Dallas, TX 75243 USA.
K. Laker is with the Electrical Engineering Department, University of
Pennsylvania, Philadelphia, PA 19104 USA.
Publisher Item Identifier S 1057-7130(97)03645-8.

I. INTRODUCTION

OVER THE YEARS of its development, integrated circuit
technology has brought great progress to the design of
high performance systems. Many challenges in the manufac-
turing process had to be solved to achieve this. One of the
steps in this process, namely testing, poses the most significant
challenge to contemporary and future integrated circuit (IC)
manufacturing. This is a continuing trend: with the decreasing
cost of silicon and the increasing complexity of integrated
circuits, testing constitutes a very sizable portion of the IC
manufacturing cost. The trend is further accentuated by the
emergence of mixed-signal IC’s, including radio frequency
(RF) circuits, coupled with the competitive price pressures of
the high-volume consumer market. Frequently, the cost of
testing a chip with a CODEC,1 an integrated digital signal
processor (DSP), and other baseband circuitry reaches 30 to
50% of the total cost.

1 CODEC—a related pair of coding (analog-to-digital) and decoding
(digital-to-analog) blocks in a pulse-code-modulated telephone channel.

Due to unavoidable statistical flaws in the materials and
masks used to fabricate IC’s, it is impossible to realize 100%
yield on any particular IC, where yield refers to the ratio of
good IC’s to the total number of IC’s fabricated. A good IC
is one that satisfies all of its performance specifications under
all specified conditions. The probability of a bad IC increases
in proportion to its size and complexity. It is also increased
by process sensitivities that occur in digital and analog IC’s
that rely on the control and/or matching of IC components or
parameters to achieve their specified functionality. The details
that govern IC yield and the associated economics that drive
IC manufacturing are very interesting, but beyond the scope
of this paper; the interested reader is referred to the literature,
e.g., [98], [99]. In any event, testing is an indispensable part
of the IC manufacturing process.
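Written as a formula (the symbols are ours, introduced for convenience; the definition itself is the one just given):

$$\mathrm{Yield} = \frac{N_{\mathrm{good}}}{N_{\mathrm{fabricated}}}$$

where N_good counts the IC’s that meet every specification under all specified conditions and N_fabricated is the total number of IC’s produced.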
Since testing is repeatedly performed at several manufac-
turing stages, such as after wafer fabrication and packaging, it
is very important to understand its inefficiencies. Reducing or
eliminating these inefficiencies enables a chip manufacturer
to drive down the cost of the final product [1], [2]. It is
also important to understand the reasons for, and the costs
associated with, testing. It is in the chip manufacturer’s best
interest to minimize the number of bad devices shipped to the
customer. A bad device is an IC that fails to meet one or more
specifications at any point in the manufacturing process. Poorly
designed tests, or parts not designed for testability, can result
in bad devices appearing as good parts or in good devices
failing tests and appearing as bad. The shipment of bad devices
leads to replacement costs, loss of reputation, and possible
loss of market share. The other side of this problem is not much
better: when good parts are represented as bad, the chip yield
is reduced and, correspondingly, so are the earnings of the
chip manufacturer. At this point we should introduce the reader
to the term DUT, or device under test. DUT is a generic term
used in the testing literature to refer to the component, the
bare IC die on a wafer, the packaged IC, the circuit board,
etc., that is being tested. We will use this term frequently in
this paper, mostly to refer to bare and packaged IC’s.
The major steps of a typical IC manufacturing process, with
a clear indication of the points at which device testing is
performed, are shown in the flow diagram of Fig. 1. Testing
at different steps in the product development process addresses
different issues and presents different challenges. The initial
testing is performed within the computer-aided design (CAD)
environment. At this stage the designer is verifying the
functionality and the performance of the intended product.
Many of the production tests, indicated in Fig. 1, are based
on this activity.

Fig. 1. Major steps in IC manufacturing flow.
`
After the wafer processing, the integrity of the wafer pro-
cessing is evaluated by probing sites on the wafer that contain
standardized device structures that are designed to evaluate
process control. These test sites are stand-alone IC’s that are
placed on the wafer either in place of a few primary2 IC sites
scattered over the area of the wafer or in the scribe areas in
between the primary IC sites. This test is intended to quickly
identify any common catastrophic processing defects and to
ensure accurate process control. If the wafer processing is
found to be outside the manufacturer’s specification window,
such wafers, or the lots that include them, may be scrapped
prior to any functional testing. We should point out that the
same device structures and/or additional ones on these test
sites may be used to acquire the data for the SPICE models
used in the simulation of the IC’s.

2 Every wafer contains the sites with the designs of the intended final
product, called “primary” sites. Some of these primary sites are sacrificed to
facilitate the manufacturing process control (indirect measurements of levels
of doping, layer thicknesses, etc.). The same sites are used for placement of
the features used in the alignment of different mask levels.

At the end of the wafer processing, every device (primary
IC) on the wafer undergoes a set of performance- or function-
related tests (commonly referred to as the test suite). The goal
of this step is to eliminate DUT’s that fail to satisfy one or
more of the expected performance specifications. Depending
on the test philosophy of the IC manufacturer, the test suite
used for wafer probing will include either a complete set
of the tests intended for the packaged devices or a subset. At
this time, the bad devices are “electronically marked,” and
this information is used in the next manufacturing step. The
next step is to scribe the wafer and select the good dies for
packaging. After the good IC dies are packaged, testing is
again performed to verify proper handling, wire bonding, and
packaging. The final test checks the performance of the device
over the specified temperature range.

A sample set of the qualified packaged IC’s is subjected to
stress testing that is referred to as burn-in (i.e., operating the
IC’s in an oven at elevated temperature for an extended period
of time). After removing the devices from the oven, they are
retested using the full test suite. The objective of this step is
to eliminate the “weak” devices which are likely to fail during
their early period of operation. This failure mode is referred
to as “infant mortality,” and it represents the initial portion of
the aging or so-called “bath-tub curve” [6].

Finally, a smaller sample is retested, prior to delivery,
for quality assurance (QA) purposes. Different manufacturers
implement their own independent QA programs. However,
the globalization of the microelectronics industry is driving the
establishment of QA standards such as the popular ISO9000
standard.

The repetition of similar tests performed on every IC
makes the process appear inefficient. The repetitions are
necessary to weed out bad IC’s as early in the manufacturing
process as possible and to ultimately ensure that no bad
IC’s are shipped to customers. The cost associated with
producing a bad device is magnified by a factor of ten
as it propagates undetected through the chip and system
manufacturing processes and ultimately to the field. Conse-
quently, the number of tests and the total test time need to
be balanced to achieve the requisite process quality assurance
while keeping the manufacturing cost as low as possible. The
feedback of failure-mode information to previous steps in the
manufacturing process enables a closed-loop control that is
vital to continuously improving the efficiency and quality of
the process. For this reason, simply limiting the number of
tests performed at each step is usually an unacceptable cost-
reducing measure.
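To make the factor-of-ten remark concrete (the stage labels below are our illustration; the paper states only the multiplier): a defect that would cost c to catch at wafer probe costs roughly

$$c \;\to\; 10c \;\to\; 100c \;\to\; 1000c$$

as it escapes successively to package test, to board/system manufacturing, and to the field.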
`
`II. AUTOMATED TESTING
`Historically, IC characterization and testing developed with
`the use of bench-top equipment. There are still parts, e.g.,
`in the microwave and millimeter wave area, that can only be
`tested this way. These instruments have provided and continue
`to provide high performance and flexibility of use; both
`characteristics are very important during the characterization
`and debugging phases of IC development. As IC DUT’s
`became smaller and more complex, they also became more
`difficult to fully test on the bench. It also became important to
`have several synchronized instruments operating on the DUT
`simultaneously. This use of synchronized instruments became
`a vital approach for both the characterizing of IC’s in the
`R&D laboratory and the conduct of the production tests in the
`manufacturing environment. It was then seen that a streamlined
`interconnection of such instruments could simplify the entire
`testing procedure. At the simplest level, one instrument may
`trigger or control the instrument supplying the stimuli to the
`DUT or it may control the intervention of other measuring
`instruments.
`The integration of a collection of interconnected and syn-
`chronized bench top instruments into a test station was marked
`by two significant developments. The first event was the
`introduction of standards for instrument interconnect and in-
`tercommunications; and the second was the availability of the
`microprocessor.
The IEEE 488 standard emerged in response to a growing
demand for computer-driven testing solutions. It evolved
`in the mid-1970’s out of the proposal for the byte serial
`protocol from Hewlett-Packard Company. At times it has been
`called the “ASCII bus” or the “General Purpose Interface Bus
`(GPIB)”. The standard [7] was designed to allow a user to
`easily interconnect GPIB configured instruments to each other
`and to a control computer (i.e., usually a PC with a GPIB
`plug-in card). It covers the communications protocols (for
`transmitting/receiving data and controlling communications on
`the bus), the multiconductor interconnection cables, and the
`plug-in fixtures.
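As a concrete illustration of the computer-driven control that GPIB makes possible, the fragment below queries an instrument for its identification string. It is a sketch only: it assumes an NI-488.2-style C library (ibdev/ibwrt/ibrd/ibonl), and the board index, device address, and header name vary by vendor and installation.

```c
/* Sketch: query a GPIB instrument for its ID string using an
 * NI-488.2-style C API. Board 0 and primary address 22 are
 * illustrative; the header name varies by vendor.              */
#include <stdio.h>
#include <ni488.h>

int main(void)
{
    char reply[256] = {0};
    /* board 0, primary addr 22, no secondary addr, 10-s timeout,
     * assert EOI with the last byte, no EOS character           */
    int ud = ibdev(0, 22, 0, T10s, 1, 0);
    if (ud < 0) { fprintf(stderr, "ibdev failed\n"); return 1; }

    ibwrt(ud, "*IDN?", 5L);             /* send the query         */
    ibrd(ud, reply, sizeof reply - 1);  /* read the response      */
    printf("%s\n", reply);

    ibonl(ud, 0);                       /* put the device offline */
    return 0;
}
```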
`The second breakthrough in the automation of bench top
`equipment was facilitated by the emergence of relatively
`inexpensive microprocessors, which enabled the development
`of more versatile and more reliable instrumentation. The fast
`development of digital VLSI rendered practical the incorpo-
`ration of digitizing and data processing within the instrument.
`The inclusion of microprocessors enabled instruments to be de-
`veloped that were easier to operate. The use of microprocessors
`also reduced operator error by eliminating most front panel
`controls and making the remaining controls more powerful.
These two developments led to the evolution of new and
`more efficient protocols that advanced the integration and
`synchronization of instruments and computers.
`The early 1980’s saw an emergence of VME (Versa Mod-
`ule Europa) computer bus technology. It began with Mo-
`torola’s VERSAbus. After adopting Eurocard module dimen-
`sions, VERSAbus became VMEbus (IEEE 1014 standard,
`1987). This new standard provided specifications for an open
architecture computer bus with wide bandwidth. The VMEbus
protocol proved to be well suited for a wide variety of
computer plug-in cards, but not for data acquisition
boards. Consequently, VMEbus was never well accepted by
`instrument manufacturers.
`In the late 1980’s, a number of technical representatives
`from the instrumentation manufacturers formed a committee to
`
`formulate the additional standards necessary for an open archi-
`tecture instrumentation bus based on VMEbus and the IEEE
`488 standards. This gave birth to the VXIbus, an acronym
`that stands for VMEbus Extension for Instrumentation. The
`VXIbus can be logically grouped into eight buses, as shown
`in Fig. 2. Global buses are accessible and shared by all VXI
`modules. Unique buses are routed from the slot 0 module (the
`main control unit) to other modules on a one-to-one basis.
Private buses are local buses between adjacent modules [92].

Fig. 2. VXIbus electrical architecture (the power distribution bus is not shown) [92].
`VXIbus enables the control of signal characteristics and
`propagation delay via a well defined backplane environment
`[8]. With tighter electrical characteristics comes greater flex-
`ibility and higher bandwidth communications between in-
`struments. VXI provides modularity and ease of use, while
`maintaining compatibility with GPIB. The growth in high
`volume discrete electronic component manufacturing (resis-
`tors, capacitors, diodes, etc.), and the need for repeatable
`measurements, further stimulated the automation of existing
`test methods.
`At the time of the development of the GPIB standard,
`semiconductor and discrete component manufacturers like
`Fairchild, RCA, Western Electric, TI, etc. began building their
`own automated testers. These developments were driven by the
`pressure of increasing demand for higher IC production and
`the associated need to perform large volume batch testing. The
`fact is these “homegrown” systems were very difficult to build
`reliably and they conformed to few if any standards.
`High demand in the consumer electronics market, coupled
`with the fact that it was difficult and expensive to build the
`test equipment in-house, created the right environment for the
`emergence of the third party manufacturers of automated test
`systems. ATE to test digital IC’s appeared first; and ATE for
`digital and analog IC’s were separate systems. The analog and
`mixed-signal ATE industry was started when Teradyne, Inc.
`and the LTX Corporation were founded in the late 1960’s,
`
`
`
`
`
`and the early 1970’s, respectively. These companies began
`by developing automated test equipment (ATE) to test linear
`analog IC’s, where the void was greatest. Over the next several
`years the capabilities of their ATE products evolved to serve
`the testing needs of mixed-signal (digital and analog) IC
`manufacturers. In Fig. 3 we graphically show the time-line for
`the evolution of ATE technologies. It can be debated whether
`the development of “analog only” ATE has disappeared or
`not. The fact is that today’s mixed signal ATE’s, even if
`equipped with “analog only” instruments, rely totally on digital
`signal processing (DSP) based measurement methods (i.e., the
`measured analog signals are digitized and DSP algorithms
`are used to extract the desired performance parameters). For
`this reason Fig. 3 shows the development of “pure analog”
`or “analog only” ATE to have ended with the emergence of
`mixed-signal ATE. Moreover, the trend among digital ATE
`vendors is to offer ATE’s with some mixed signal capabilities.
`Thus, the contemporary ATE industry offers products with
`varying ratios of analog and digital functionality (i.e., “big
`D little A” or “big A little D” configurations and variations
in between).

Fig. 3. The ATE evolution.3
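To make the DSP-based measurement idea concrete, the sketch below extracts the amplitude of a test tone from a digitized capture with a single-bin DFT. This is a generic illustration of the technique, not any vendor’s implementation, and it assumes coherent sampling (an integer number of tone periods, falling on bin m of an N-point record).

```c
/* Sketch: DSP-based amplitude measurement via a single-bin DFT.
 * x[] holds N digitized samples; the test tone sits on bin m.   */
#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

double tone_amplitude(const double *x, int N, int m)
{
    double re = 0.0, im = 0.0;
    for (int k = 0; k < N; k++) {
        double w = 2.0 * M_PI * (double)m * (double)k / (double)N;
        re += x[k] * cos(w);   /* correlate with the cosine phase */
        im -= x[k] * sin(w);   /* ... and with the sine phase     */
    }
    /* A sine of amplitude A confined to one bin gives |X| = A*N/2. */
    return 2.0 * hypot(re, im) / (double)N;
}
```

Gain, distortion, and noise figures follow the same pattern: the energies in the signal bin, the harmonic bins, and the remaining bins are compared against one another.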
`Mixed-signal ATE systems are sophisticated computer con-
`trolled systems which perform automatic testing of DUT’s that
`include single components, packaged and bare IC’s, circuit
`boards, and complete systems. The allocation of the ATE’s
`functionality and the control of its operation are achieved with
`software. The ATE generates and applies the stimuli to the
`IC, it senses and digitizes the IC’s responses, and it analyzes
these responses. Consequently, a contemporary ATE com-
prises the following five functional modules: STIMULUS,
`MEASUREMENT, TESTER CONTROL, PROCESSING, and
`SIGNAL HANDLING; as shown in Fig. 4.
The TESTER CONTROL, i.e., the top block in Fig. 4,
controls the entire ATE under the guidance of a test plan.
It establishes the controls for the PROCESSING, MEASURE-
MENT, and STIMULUS blocks needed to implement the test
plan. Test plans are written in a high-level programming
language; due to its efficiency, modularity, and versatility, the
C language has become the most popular language for this
task. It is also in this portion of the ATE that the interface
(menus and icons) between the test developer (and ATE
operator) and the ATE is determined.
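A test in such a plan might read as sketched below. Every ate_* name and constant is a hypothetical stand-in for a vendor-specific library; the paper does not define such an API, and no real product’s interface is implied.

```c
/* Hypothetical test-plan fragment. The ate_* declarations stand in
 * for a vendor-specific ATE library.                              */
#include <math.h>

extern void   ate_awg_sine(int chan, double volts_peak, double hertz);
extern double ate_rms(int chan);
extern void   ate_datalog(const char *name, double value);
extern void   ate_bin(int bin);

enum { CHAN_IN = 1, CHAN_OUT = 2, BIN_GOOD = 1, BIN_BAD = 2 };

/* Drive a 1-kHz, 1-V peak sine into the DUT and require a gain of
 * 20 +/- 0.5 dB.                                                  */
void dut_gain_test(void)
{
    ate_awg_sine(CHAN_IN, 1.0, 1000.0);           /* STIMULUS      */
    double vrms = ate_rms(CHAN_OUT);              /* MEASUREMENT   */
    double gain_db = 20.0 * log10(vrms / 0.7071); /* PROCESSING    */
    ate_datalog("gain_db", gain_db);
    ate_bin(gain_db > 19.5 && gain_db < 20.5 ? BIN_GOOD : BIN_BAD);
}
```

In practice, each vendor’s environment supplies its own equivalents of these calls through the GUI-configured test plan described later in this section.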
3 This graph does not assume any particular date for the onset of the bench-
top equipment development, nor does it assume the disappearance of all test
equipment in the near future. Furthermore, the dates for the appearance and
disappearance of a particular ATE type are approximate.

Fig. 4. Functional modules of an ATE.

The PROCESSING block provides
computing power necessary for preprocessing the data that is
`to be sent to the STIMULUS block. The STIMULUS block, in
`turn, is responsible for the final shaping of the signals that are
`to be applied to the DUT. The PROCESSING block also does
`the postprocessing of the digitized signals provided by the
`MEASUREMENT block. The SIGNAL HANDLING block
`embodies all the hardware used for routing the signals to and
`from the DUT.
`In addition to the functional partitioning shown in Fig. 4, ev-
ery ATE physically consists of a test head connected to a
`main cabinet containing the processor and the instruments. The
`physical appearance of a typical ATE is shown in Figs. 5 and 6.
`The main cabinet houses the STIMULUS, MEASUREMENT,
`PROCESSING, and TESTER CONTROL blocks. SIGNAL
HANDLING is the primary function of the ATE’s test head.
`It also includes signal routing through adapters, fixtures, and
`cables between modules in the ATE and the DUT. Thus, signal
`handling is an area that requires special consideration. The
amount of degradation experienced by signals, especially
analog signals at high frequencies, passing through switch
contacts and cables should not be underestimated [9].
`Because the signal handling requirement is unique to an
`ATE, the development of this hardware is very closely tied
`to the growth of the ATE industry. The instrumentation found
`in the main cabinet, on the other hand, evolved separately
`with little influence from the ATE industry except to make
`them programmable and compact. Most of these instruments
`were developed as bench-top (stand alone) units as early as
`the 1950’s. Simple hardware controllers were used in the
`1950’s but true automation occurred only subsequent to the
`availability of low cost, powerful microprocessors.
Originally, ATE vendors used a mix of proprietary instrument
interfaces and GPIB controllers in their equipment.
`Subsequent ATE models incorporated prevailing instrument
and computer bus standards. Emerging bus standards have
been incorporated within later ATE generations to varying
extents. Today, this trend continues, with the majority of the
ATE industry converging on VXIbus-based instruments and
several vendors introducing optical fiber as a means of
communicating between internal subsystems.

Fig. 5. Teradyne A580 automated tester.

Fig. 6. LTX Synchro automated tester.
`In addition, the performance of the instruments available
`within an ATE framework has undergone a tremendous im-
`provement. Today’s ATE provides a measurement environment
with a noise floor between -120 and -140 dB. This type of
`environment is necessary in the testing of high resolution
`analog-to-digital and digital-to-analog converters. Similarly,
`RF testing also requires a very low noise environment for low
`signal power measurements. Since today’s ATE was developed
`using IC’s developed in yesterday’s technologies, even the
`most advanced ATE is challenged to test IC’s that represent the
`latest state-of-the-art. For example, the increasing performance
`
`of IC’s for storage devices is resulting in increasingly more
stringent demands on the timing parameters of today’s ATE’s.
Clock jitter (phase noise) of no more than 10 ps rms is
required to properly test read-channel IC’s that realize data
rates of more than 100 Mb/s. Some of the mixed-signal
`ATE vendors provide special modes which increase the digital
`capture rates up to 400 MHz. Table I gives a general list of
`the instruments available in a truly mixed-signal ATE. The
`parameters listed in this table represent typical values, and they
`are often the differentiating factors that drive the IC manufac-
`turer’s choice between the ATE’s offered by different vendors.
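The 10-ps figure can be put in perspective with the standard jitter-limited SNR estimate (a textbook rule of thumb, not derived in this paper): for a full-scale sine at frequency f_in captured with rms clock jitter σ_j,

$$\mathrm{SNR}_{\max} = -20\log_{10}\!\left(2\pi f_{in}\,\sigma_j\right)$$

so that f_in = 50 MHz (on the order of a 100-Mb/s read channel) and σ_j = 10 ps already limit the measurement to about 50 dB.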
One of the first obstacles the ATE industry had to overcome
was signal handling and conditioning. As stated in the intro-
duction, there are two embodiments of an IC DUT that the
ATE has to interface with: a bare die on a wafer and a
packaged part. In
`both cases the signal path from an ATE to the DUT is routed
`through a multilayer device interface board (DIB), such as
`that shown in Fig. 7. Device interface boards can be either
`square or round. The round DIB’s have diameters between 12
`in and 18 in, depending on the tester type. Typically the DIB’s
`are constructed with 2–4 separate, uncommitted planes (layers)
`which can be used for separate ground planes and shielding. In
addition, the DIB’s have planes (layers) dedicated to set power
supplies (±5 V, ±15 V, etc.). The access to ATE resources
`is gained via contact pads radially located on the ring around
`the perimeter of the board. These pads make contact through
`the spring loaded coaxial pins to the resources located in the
test head.

Fig. 7. Teradyne A580 device interface board (13 inches in diameter).
During the wafer test, the signals are delivered to and from
the IC via a probe card [3]. To accommodate this interface, the
`device interface board has an opening in the center (usually
`3 to 5 inches in diameter). The small pads are located on
`the perimeter of that opening. The interconnection between
`the probe card and the device under test board is achieved
`through the spring loaded shielded pins. To maintain the signal
`integrity of the connection between the probe card and the IC,
`this interface has to be very reliable and resilient. Probe card
`
`resiliency is specified in hundreds of thousands of touchdowns
`(typically 100 000 to 300 000). The three most commonly used
`probe card types are the ceramic blade, metal blade, and epoxy
`ring. The choice of a probe card style is dependent on different
`factors such as maintenance and signal aberrations. Fig. 8
`shows an example of a probe card used for testing of the
low pin count devices.

Fig. 8. Example of low probe count probe card (4 inches in diameter).
`The diameter of a probe card ranges anywhere from 4 to 8
`in, depending on the test system. The diameter of the probe
tip is usually between 5 and 15 mils (where 1 mil = 0.001
in). The most commonly used materials for standard probes are
`tungsten, rhenium tungsten, beryllium copper, and palladium.
`The planarity of the probes depends on the chip area and
`typically ranges from less than 0.5 mil for a small die to
`below 1 mil for a large die. The probes are laid out radially
with ground layers placed in a way that allows placement of
`bypass capacitors as close to the probes as possible.
`A considerable amount of effort is currently being devoted
to the development of high-speed and RF probing. The recent
utilization of the same technologies that are used in IC
manufacturing is leading to the introduction of higher density
and higher performance probe cards [4], [5], [89].
`Packaged IC’s are tested by placing a socket adapter on
the DIB. The physical dimensions of the packages are very
`
`different for different package types. They also exhibit very
different electrical characteristics. The important issues in
selecting the socket type are the lead inductance of high-speed
(>10 MHz) paths and the series resistance of the leads. These
`characteristics affect the fidelity of the signals brought in
`and out of the DUT. The most prominent vendor names in
`the socket adapter industry are Yamaichi and Johnstech. As
`an example, Fig. 9 shows a socket for a 28 PLCC package
`
`mounted on a DIB. Shown is a close-up of the center of the
DIB with a mounted socket adapter.

Fig. 9. Socket adapter mounted on the load board.
The type of socket adapter shown in Fig. 9 is used during
`“hand testing,” that is when the devices are inserted into a
`test fixture manually. For large volume automated testing this
`socket is removed and the DIB is directly connected to the
`automated handler. Automated handlers are the instruments
which bring DUT’s into contact with the DIB. After the test is
`
`completed, the automated handler places the DUT in the ap-
`propriate bin, e.g., separate bins for good and bad devices (the
`details of the handler operation are determined by commands
`in the test plan). Some products, e.g., microprocessors, are
`tested to more than one performance criteria; thus, additional
bins are used to separate parts that satisfy the different criteria.

TABLE I
TYPICAL ATE INSTRUMENTATION PERFORMANCE
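For the multiperformance case just described (e.g., speed-grading a microprocessor), the mapping from measured parameters to bins might look like the sketch below; the thresholds, names, and handler call are hypothetical, for illustration only.

```c
/* Hypothetical binning fragment: grade parts by the maximum clock
 * rate at which they pass. Thresholds are illustrative.           */
typedef enum { BIN_FAIL = 0, BIN_100MHZ, BIN_133MHZ, BIN_166MHZ } bin_t;

bin_t speed_bin(double fmax_mhz, int functional_pass)
{
    if (!functional_pass)  return BIN_FAIL;
    if (fmax_mhz >= 166.0) return BIN_166MHZ;
    if (fmax_mhz >= 133.0) return BIN_133MHZ;
    if (fmax_mhz >= 100.0) return BIN_100MHZ;
    return BIN_FAIL;       /* too slow for the lowest grade */
}
/* In the test plan: handler_place(speed_bin(fmax, pass));  */
```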
`The second significant achievement in the ATE development
`was the simplification of the interface between the ATE and
the test developer/operator. It was largely the computational
power and programming flexibility of the DSP, coupled with
the emergence of high-performance A/D and D/A converters, that
`enabled the development of user friendly and flexible virtual
`instruments.4 In today’s ATE, test instruments for specific
`applications (e.g., video, telecommunications, hard disk drives,
`etc.) are created by configuring the ATE’s resources using
software.

Fig. 10. Test program development environment. (Courtesy of HP, ATE Division.)
`
`4 Using analog channel/digitizer/DSP and flexible software, it is possible to
`generate various instruments based on the same principles of digitizing events
`and digital signal processing.
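As one example of such a software-defined instrument, the fragment below computes the sample buffer for an AWG test tone. Choosing the number of cycles M coprime to the record length N is a common mixed-signal test practice (our addition; the footnote does not spell it out): it makes every sample land on a distinct waveform phase and keeps the subsequent DSP analysis leakage-free.

```c
/* Sketch: fill an N-sample AWG buffer with exactly M cycles of a
 * sine (M and N coprime). Played back at sample rate Fs, the tone
 * appears at f = M * Fs / N.                                      */
#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

void awg_build_tone(double *buf, int N, int M, double amplitude)
{
    for (int k = 0; k < N; k++)
        buf[k] = amplitude *
                 sin(2.0 * M_PI * (double)M * (double)k / (double)N);
}
```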
`
`
`
`
Fig. 11. Instrument status window. (Courtesy of HP, ATE Division.)
`
`For the initial “home grown” ATE systems, the human-
`machine interface was the low level machine language code
`prepared by the test engineer. The design and implementation
`of tests in these early systems were encumbered by an interface
that had its own steep learning curve. As time progressed,
the programming evolved to higher level languages, i.e.,
FORTRAN, Pascal, and C, and to a window-based graphical
`user interface (GUI). With a GUI the test development envi-
`ronment evolved into one that is intuitive and easy to navigate.
`Moreover, it freed the test engineers to focus their efforts on
the important testing issues rather than writing and debugging
`code. Fig. 10 shows an example of the program development
`window for a typical ATE. We see buttons or icons for file
`management, program compilation and launching, program
`and test debugging, and many other utilities designed to make
`the test development easier. Here the user can select a set of
`functions to be included in the test plan, start the execution
`of the program, stop the execution at any arbitrary point, and
`direct the flow of the output data.
`There are many utilities which allow the user to interrogate
`the current states of the ATE and to change them, if desired.
`All the information describing the state of each instrument
is available in the form of windows. The user can verify and/or
`change the settings of voltage/current levels, select among
`available filters, select the mode of operation, timing sets, etc.
`All these capabilities are available through an easy to follow
point-and-click environment. An example of such a utility is
shown in Fig. 11. The left panel in this figure shows a graphical
representation of the arbitrary waveform generator (AWG). It
`also shows the status of the connections and the waveform
`parameter settings. The user is permitted to change some of
`the waveform parameters when the program is stopped at a
`break point.
When digital inputs are needed, digital pattern editing
resources are available at the click of a button. Fig. 12
shows the digital pattern editor window. The window is
`divided into three areas; namely, a spreadsheet style pattern
display, a graphical pattern display, and a text timing and
`vector display. Through this window, the user can change and
`monitor the timing relations between various digital signals.
`The spreadsheet style display section allows the user to input
`instrument specific instructions. This allows triggering of the
`signal generators and digitizers. These instruments are built
`with ADC’s and DAC’s, therefore the time when the samples
are sent out of the instrument can be precisely controlled.

Fig. 12. Digital signal display. (Courtesy of HP, ATE Division.)
`The graphical waveform analyzer is shown in Fig. 13. Data
`stored in any numerical array can be displayed and edited in
`this window. Data contained in a previously stored file can be
`read and displayed using this tool, as well. The generation of
`data segments for arbitrary waveform generators (AWG) can
`also be accomplished with this tool. The data obtained by the
`digitizer or the digital capture instruments can be displayed
here, too. The ATE will update these displays whenever the
program is restarted or suspended at a break point.

Fig. 13. Waveform display. (Courtesy of LTX Corporation.)
`The important functions of data logging and the binning of
`results are governed by the test plan. Fig. 14 shows a typical
`data log or results window. The data output can be saved in
`various file formats and later used for further analysis. This
`window shows an easy to read way of presenting numerical
`values of data measured and/or calculated during tests, as
`defined in the test plan.
Test program development is completely under the test
engineer’s control. Any valid application within the software
`system can be opened during development or debugging ses-
`sions. All mixed-signal ATE vendors provide test development
`environment emulation software that can be used on any
`workstation or PC connected to the network that includes the
`ATE. This arrangement allows for easy transfer of data and
`test programs between the ATE and the engineer’s desktop
`computer. As a result, most of the test development can be
`performed prior to the physical availability of the DUT.
`The easy to use ATE environment described in this paper is
`in effect a direct port of advances that have made workstations
and PC’s more intuitive and friendly to use. The drive to sim-
`plify ATE programming will continue until the state is reached
`where the software driving the system is transparent, and the
test engineer can concentrate solely on generating the test plan
`and analyzing the results. Many ATE vendors are seriously
`pursuing the integration of the IC design and test activities
`into