Trust FPGAs to make testing simple
As system complexity increases, simple stimulus-and-response testing is no longer enough: test systems need to simulate the world around a device and test it under realistic operating conditions. This is where FPGA-based instruments come into their own, writes Jeremy Twaits from National Instruments
Let’s take a quick trip through time. First stop, the 1920s. A time when instrumentation was simple. The cathode-ray tube was the tool of choice for observing electrical waveforms; pen, paper and the human brain were the processor; and the filing cabinet was the network storage.
In the 1970s, the invention of the microprocessor went a long way to changing this. Suddenly, computers, and even instruments themselves, could process data. Instruments became capable of being controlled and automated as standard protocols and buses like SCPI and GPIB were developed.
Furthermore, as we reached the late 1990s, the concept of virtual instrumentation came to the fore, with the functionality of an instrument being determined by software. PXI was invented, providing a flexible platform for combining modular instrumentation, a high-speed backplane and a host controller to ultimately process the data.
And that brings us to today. So where do we go from here?
As devices like electronic control units (ECUs) and power amplifiers become more complex, instrumentation must continue to evolve to be able to offer full test coverage. No longer can we get away with simple stimulus and response testing, acquiring and post-processing the resultant data. Test systems need to be capable of responding with low latency, simulating the world around a device and testing it under realistic operating conditions.
For this, the test system must incorporate intelligence closer to the I/O pins – a way of processing data and initiating communication with the device under test (DUT). There are a number of options for performing the required processing.
CPUs (central processing units) have typically been used, thanks to their widespread adoption and the ease of programming them under general-purpose operating systems. However, their downfall is the moderate speed at which they can process signals. DSPs (digital signal processors) can alleviate this problem, but are not necessarily simple to program.
The best of both worlds can be achieved with the use of FPGAs, which offer the high performance of a DSP, but can be programmed in a more intuitive manner using a graphical programming environment.
With their low latency, high reliability and true parallelism, FPGAs offer an ideal platform for enhancing test methods and even enabling previously impossible applications to be created.
Continuous acquisition is made possible by streaming data directly from a DUT to an FPGA. Instead of acquiring the data and transferring it to the host for post-processing, real-time analysis can be performed on the FPGA itself.
Vendor-defined triggering and recording can become user-defined – this may be a complex digital trigger based upon the logical states of a multitude of lines, or even a custom hysteresis pattern in an analogue signal.
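As a sketch of such a user-defined trigger, the hysteresis idea might look like the following (the sample values and threshold levels are invented for illustration; on an FPGA this logic would be compiled to hardware and run at wire speed rather than as Python):

```python
def hysteresis_trigger(samples, arm_level, fire_level):
    """Fire only after the signal rises above arm_level and then
    falls below fire_level - a two-threshold (hysteresis) trigger
    that ignores small oscillations around a single level."""
    armed = False
    for i, s in enumerate(samples):
        if not armed and s > arm_level:
            armed = True          # signal crossed the upper threshold
        elif armed and s < fire_level:
            return i              # trigger on the falling crossing
    return None                   # no trigger in this block

# Small noise around 0.5 never arms the trigger; the large pulse does.
print(hysteresis_trigger([0.4, 0.6, 0.5, 2.0, 1.5, 0.2],
                         arm_level=1.0, fire_level=0.3))  # → 5
```

Because the two thresholds must both be crossed, a noisy signal hovering near a single level cannot cause spurious triggers – exactly the behaviour a vendor-defined edge trigger cannot offer.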
And finally, rather than simplistic stimulus and response open-loop testing, feedback can be incorporated, closing the loop. This is beneficial when dealing with many modern communication schemes, which often incorporate acknowledgement packets or bits. If the test system cannot correctly interpret these, it is not obeying the protocol and the DUT may be inadequately tested, or worse, stop communicating entirely. Only an FPGA can provide these kinds of low-latency responses.
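A minimal software model of that closed loop is sketched below. The `ToyDut` class and its packet format are entirely invented for illustration – a real DUT speaks a real protocol, and the responder logic would live in FPGA fabric, not Python – but it shows why the ACK must be generated from data only known at run time:

```python
class ToyDut:
    """Hypothetical DUT model: replies to each payload with a sequence
    number, and refuses new data until the previous reply is ACKed."""
    def __init__(self):
        self.seq = 0
        self.awaiting_ack = None

    def send(self, packet):
        if isinstance(packet, dict) and packet.get("type") == "ack":
            assert packet["seq"] == self.awaiting_ack, "bad ACK - DUT stalls"
            self.awaiting_ack = None
            return None
        assert self.awaiting_ack is None, "reply not ACKed - DUT stalls"
        self.seq += 1
        self.awaiting_ack = self.seq
        return {"data": packet, "seq": self.seq}

def run_exchange(dut, payloads):
    """Closed-loop exchange: after each packet the tester must return
    an ACK carrying the sequence number the DUT chose at run time.
    A fixed open-loop stimulus pattern cannot supply this."""
    received = []
    for p in payloads:
        reply = dut.send(p)
        received.append(reply["data"])
        dut.send({"type": "ack", "seq": reply["seq"]})  # close the loop
    return received

print(run_exchange(ToyDut(), ["hello", "world"]))  # → ['hello', 'world']
```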
Two examples of these enhancements in action can be seen in RF measurements and semiconductor testing.
The triggering and real-time acquisition on FPGAs is invaluable in an application like frequency domain triggering.
A traditional spectrum analyser uses a swept-spectrum approach, only analysing a given band at a given time. If a signal burst isn’t present when the spectrum analyser is sweeping over its frequency, it won’t be detected. Using FPGAs, however, it is possible to create a real-time spectrum analyser, analysing the entire band of interest at all times by performing hundreds of thousands of continuous, repeated and even overlapped FFTs (Fast Fourier Transforms).
To put this into practice, using the PXI instrumentation bus it is possible to combine a modular, PXI Express-based vector signal analyser (VSA) with an FPGA module to stream data from the VSA to the FPGA, where the FFTs are performed. The result can be compared to a mask, which, if exceeded, results in a trigger sent back to the VSA. This will then fill its onboard memory with samples and send them to the host controller for any additional processing. This can be particularly useful in applications like surveillance, where a captured signal may need a swift response.
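The essentials of that overlapped-FFT mask trigger can be sketched as follows. The window size, hop, mask limits and test signal are all hypothetical, and a naive DFT stands in for the pipelined hardware FFT an FPGA would actually use:

```python
import cmath
import math

def dft_magnitudes(window):
    """Magnitude spectrum of one acquisition window (naive DFT;
    an FPGA would use a pipelined hardware FFT instead)."""
    n = len(window)
    return [abs(sum(window[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))) / n
            for k in range(n)]

def mask_trigger(samples, mask, win=8, hop=4):
    """Slide overlapped windows across the stream and fire when any
    spectral bin exceeds the per-bin mask. Overlapping (hop < win)
    helps ensure short bursts are not missed between windows.
    Returns the sample index of the first window that trips."""
    for start in range(0, len(samples) - win + 1, hop):
        spectrum = dft_magnitudes(samples[start:start + win])
        if any(m > limit for m, limit in zip(spectrum, mask)):
            return start
    return None

quiet = [0.0] * 16
burst = [math.sin(2 * math.pi * 2 * t / 8) for t in range(8)]
mask = [0.2] * 8   # hypothetical per-bin limit
print(mask_trigger(quiet + burst, mask))  # fires at 12, once a window overlaps the burst
```

In the real system this decision would be made on every window, back to back, with the trigger routed straight to the VSA rather than returned to software.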
Another example comes from the world of semiconductor manufacture and test. When validating and testing ICs that communicate via digital protocols, the traditional approach has been to use a pattern generator and logic analyser.
The generator is used to stimulate the device with a fixed pattern of 0s and 1s, with a high-impedance period where you expect a response from the device. The analyser then verifies that the pattern generator is able to drive the bus correctly and that the IC responds with the right data at the right time.
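The fixed-vector comparison described above can be sketched in a few lines. The vector notation here is illustrative only, with `X` marking don't-care cycles:

```python
def check_response(expected, captured):
    """Compare captured bus values against an expected vector.
    'X' marks a don't-care cycle, such as the high-impedance
    period while the tester waits for the DUT to drive the bus."""
    return all(e in ("X", c) for e, c in zip(expected, captured))

print(check_response("0110XX01", "01101101"))  # → True: DUT answered correctly
print(check_response("0110XX01", "01101111"))  # → False: wrong bit in cycle 6
```

The limitation is visible in the data structure itself: the expected vector is fixed before the test runs, so it cannot adapt to anything the DUT does at run time.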
The problem comes when attempting to test multi-protocol ICs, where the different interfaces operate asynchronously or in different clock domains. The simplest way to get round this is to add test hardware for each protocol, though this is an expensive approach.
With an FPGA, it is possible to implement the protocol on the FPGA itself. The first advantage is that we can now communicate with the DUT in the manner it would experience in an actual application, rather than with static test vectors.
Secondly, the parallelism of the FPGA allows multiple clock domains to be incorporated, making it scalable to many protocols on the same IC. This approach is known as protocol-aware testing, where the operating environment of the DUT is mimicked.
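A toy illustration of independent clock domains is given below. The periods and engines are invented, and on an FPGA each domain would be a real clock region running genuinely in parallel rather than a loop in simulated time:

```python
def run_domains(engines, total_cycles):
    """Advance several protocol engines along one simulated timeline.
    Each engine has its own clock period, mimicking the independent
    clock domains an FPGA implements as parallel hardware regions."""
    for t in range(total_cycles):
        for period, on_edge in engines:
            if t % period == 0:
                on_edge(t)   # this domain sees a clock edge now

# Hypothetical fast and slow protocol engines simply record their edges.
fast_edges, slow_edges = [], []
run_domains([(2, fast_edges.append), (5, slow_edges.append)], 10)
print(fast_edges)  # → [0, 2, 4, 6, 8]
print(slow_edges)  # → [0, 5]
```

Adding a third protocol to the IC means adding a third engine, not a third piece of test hardware – which is the scalability argument made above.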
With the ability to implement inline processing, custom triggering and even new test approaches like protocol emulation, the combination of PXI and FPGA is ushering automated test into the future. This article has barely scratched the surface of applications enhanced by FPGAs, and new possibilities are continually being discovered.
From novel methods of detecting skin cancer via optical coherence tomography, to power level servoing in RF amplifier testing, FPGAs are permeating into test and measurement systems across the globe.
Whatever the future holds, we should all be thankful that instrumentation has evolved. After all, I don’t much fancy my chances of calculating a million FFTs per second in my head.
Jeremy Twaits is automated test marketing engineer at National Instruments UK & Ireland