Virtual Instrumentation: Measuring Up Remotely

Date: Monday, May 31, 2004

Dinesh Sangale, Quality Head at the GE Medical Systems (GEMS) plant in Bangalore, used to spend most of his time monitoring the manufacturing of the many critical components that go into sophisticated, high-value medical equipment such as CT scanners and X-ray machines. Some of these components are processed in ovens or high-voltage tanks that require 24x7 monitoring. Sangale and his team measured the electrical parameters, temperature and vacuum of these systems in a tiring, round-the-clock cycle.

Testing the systems was a tedious job. Every time a particular test had to be conducted (and there were numerous individual tests in a process), Sangale and his team had to plug the measuring devices into the systems, only to repeat the cycle all over again. The manual testing was time-consuming and required Sangale's intervention throughout. Many a time, the traditional measuring instruments in use, such as the digital voltmeter, digital counter and oscilloscope, presented their own problems. To add to this, the tests involved voltages as high as 2,500 V, and the frequent human intervention needed to change connections created room for errors and safety concerns.

Further, the team would note the measurements manually and later transfer them to a computer for report generation.

GEMS needed a test station that would perform all the tests with minimal human intervention. For Sangale, the answer lay in a computer-based automation system. Sometime in 2002, he heard of Austin, TX-based National Instruments, which was pioneering a concept called Virtual Instrumentation.

Virtual instruments are not really that different from the traditional instruments Sangale was using earlier. The difference is that a virtual instrument uses a personal computer for all user interaction and control, whereas a conventional instrument relies on physical displays, knobs and switches.

Historically, instrumentation systems originated in the distant past, with measuring rods, thermometers, and scales. In modern times, they have generally consisted of individual instruments. An electro-mechanical pressure gauge, for example, comprises a sensing transducer wired to signal-conditioning circuitry, which outputs a processed signal to a display panel and perhaps also to a line recorder, where a mechanical arm inks a trace of changing conditions onto a rotating drum, creating a time record of pressure changes. Even complex systems such as chemical process control applications typically employed, until the 1980s, sets of individual physical instruments wired to a central control panel: an array of physical display devices such as dials and counters, together with sets of switches, knobs and buttons for controlling the instruments. Computers entered the field of instrumentation as a way to couple an individual instrument, such as a pressure sensor, to a computer and display the measurement data on a virtual instrument panel, rendered in software on the computer monitor and containing buttons or other controls for operating the sensor.

Thus, such instrumentation software enabled the creation of a simulated physical instrument with the capability to control physical sensing components. Any virtual instrumentation system intended to connect to the variety of commercially available data-acquisition hardware must therefore include software capable of communicating effectively with these disparate device types.
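This decoupling, one software front end talking to many kinds of hardware, can be sketched as a minimal driver interface. The class names and the simulated reading below are hypothetical illustrations, not National Instruments' actual driver API:

```python
from abc import ABC, abstractmethod

class Instrument(ABC):
    """Common interface the virtual front panel programs against."""

    @abstractmethod
    def configure(self, **settings):
        ...

    @abstractmethod
    def read(self) -> float:
        ...

class SimulatedVoltmeter(Instrument):
    """Stand-in for a real hardware driver; returns a fixed reading."""

    def __init__(self):
        self._range_volts = 10.0

    def configure(self, **settings):
        self._range_volts = settings.get("range_volts", self._range_volts)

    def read(self) -> float:
        return 4.2  # a real driver would query the hardware here

def front_panel_update(device: Instrument) -> str:
    """The software front panel only ever sees the abstract interface."""
    return f"{device.read():.1f} V"

meter = SimulatedVoltmeter()
meter.configure(range_volts=10.0)
print(front_panel_update(meter))  # -> 4.2 V
```

Swapping in a driver for a different device then requires no change to the front-panel code, which is the property the paragraph above describes.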

“We no longer provide one fixed solution, but a fixed piece of hardware—like Lego blocks—with software as a front-end that delivers the application. This kind of model is becoming increasingly acceptable,” says Jayaram Pillai of National Instruments. National Instruments' software, LabVIEW, not only provides a graphical user interface on the computer screen but also an application development environment through which customers can design custom virtual instruments.

Thus, engineers and scientists can easily modify or expand virtual instrumentation systems to adapt to specific needs without having to replace the entire device. The software can be written for the specific analysis or experiment, while the hardware can be changed to suit the design of the experiment, whether it is receiving data from an apparatus or controlling it. The Lego blocks Pillai talks about are low-cost, plug-in hardware boards used for data acquisition. They are ideal for a wide range of laboratory applications because they provide reliable measurements cheaply. From digital multimeters through high-speed digitizers to RF measurement devices, a wide range of modular computer-based devices delivers data acquisition capabilities more cheaply than dedicated instruments.

“As the system-on-chip concept advances, and off-the-shelf components become cheaper and more powerful, so do the boards that use them. With these advances in technology comes an increase in data acquisition rates, measurement accuracy, and precision,” says Pillai.

Instead of the various conventional instruments he used earlier, Sangale today uses a data acquisition board or card together with a personal computer and software. In fact, a multi-purpose virtual instrument can be built using a single data acquisition board or card. The primary benefits of applying data acquisition technology to virtual instrumentation are cost, size, flexibility and ease of programming. A virtual instrumentation system configured around a data acquisition board or card can cost as little as 25 percent of a comparable conventional instrument.
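The multi-purpose idea is that one board captures a buffer of raw samples and the software decides which "instrument" to be. The sketch below fakes the sample buffer (a 50 Hz sine on a 2 V offset, with assumed sample rate and signal values chosen for illustration) and derives two instrument personalities from it in software:

```python
import math

SAMPLE_RATE = 1000.0  # Hz; assumed acquisition rate of the board
N = 1000              # one second's worth of samples

# One buffer of samples, as a DAQ board might deliver it. Here a 50 Hz
# sine wave riding on a 2 V DC offset stands in for real hardware data.
samples = [2.0 + math.sin(2 * math.pi * 50.0 * n / SAMPLE_RATE + 0.1)
           for n in range(N)]

# "Digital voltmeter" personality: the DC level is just the mean.
dc_volts = sum(samples) / len(samples)

# "Frequency counter" personality: count rising zero crossings of the
# AC component and divide by the capture duration.
ac = [s - dc_volts for s in samples]
rising = sum(1 for a, b in zip(ac, ac[1:]) if a < 0 <= b)
frequency_hz = rising / (N / SAMPLE_RATE)

print(f"DC level: {dc_volts:.2f} V, frequency: {frequency_hz:.0f} Hz")
```

Adding an oscilloscope view or an RMS meter would mean adding more analysis code, not more hardware, which is where the cost advantage over dedicated instruments comes from.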

One of the features of virtual instruments that Sangale likes is the ability to monitor testing from a remote location. It is not necessary for him to be at the testing base; at times he controls the temperature and performs testing from his home. He logs into the GEMS intranet, which gives him remote access to the test parameters. Sangale is not alone: many customers are excited by the remote monitoring that virtual instruments offer. For instance, a U.S.-based global manufacturer of bearings tests some of its bearings in India. The bearings are run continuously, and physical parameters like temperature, speed and vibration are measured in real time. The acquired data is transmitted over the network, and engineers at the company's headquarters in the U.S. analyze it. An engineer sitting in the U.S. can control the spinning speed by increasing or decreasing the RPM of the shaft.
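The remote-monitoring pattern described above can be sketched as a tiny publish-and-poll exchange over TCP. The parameter names and values below are invented for illustration, and a real deployment would add authentication and run client and server on separate machines:

```python
import json
import socket
import threading

# Hypothetical latest readings published by the test rig.
READINGS = {"temperature_c": 182.5, "vacuum_torr": 1.2e-6, "voltage_v": 2500.0}

def serve_once(server_sock):
    """Answer a single client with the current readings as JSON."""
    conn, _ = server_sock.accept()
    with conn:
        conn.sendall(json.dumps(READINGS).encode())

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))  # OS picks a free port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=serve_once, args=(server,), daemon=True).start()

# "Remote" client: in practice this runs on the engineer's machine.
with socket.create_connection(("127.0.0.1", port)) as client:
    data = json.loads(client.recv(4096).decode())

print(f"temperature: {data['temperature_c']} C")
server.close()
```

Sending a control command (say, a new shaft RPM) back over the same connection would follow the same shape in the other direction.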

Measurement is no longer just a matter of getting data. What matters, beyond the amount of data, is how quickly and efficiently one can get it. “In a manufacturing setup like ours, measurement—data acquisition and analysis, and testing—has to be done on a budget. Time is also crucial; we have to get the product to market quickly. Virtual instrumentation is an inexpensive and yet efficient testing method,” says Sangale.

Virtual instrumentation is changing the way scientists and engineers like Sangale measure and automate the world around them.
No one can predict exactly where the future will take virtual instrumentation, but one thing is clear—the PC and its related technologies will be at the center, and engineers and scientists will be more successful as a result.

The National Instruments mixed-signal test platform includes a new family of 100 MS/s mixed-signal PXI modular instruments and system resources, combined with the power of LabVIEW 7 Express. This integrated software and hardware platform is ideal for prototyping and test applications that require high performance and tightly synchronized analog and digital signal generation and acquisition. The platform can:

  • Decrease your test development time

  • Make accurate and repeatable measurements

  • Increase your test system throughput
