With the move to software-based solutions, test and measurement is entering its own digital transformation.

February 21, 2023

Daniel Shaddock, co-founder and CEO of Liquid Instruments

The test and measurement industry is undergoing a major transformation, moving from a hardware- to a software-based model that will completely redefine engineering research and development processes. This evolution requires an overhaul of the tools that industries rely on to develop, validate, and produce new technologies and devices across many critical sectors today. As organizations work to reckon with and reap the rewards of this digital transformation, an incremental approach is tempting, but a more ambitious change may be key to unlocking its full potential.

Across many areas of technology development, project managers must balance mitigating technical risks, squeezing costs, and tightening schedules, all while striving to maintain and extend competitive advantages. Managing costs over long program timelines in the face of unpredictable supply chain conditions becomes a much simpler problem with software rather than hardware solutions. Consider major advances now underway, including the progress of the commercial space industry (NewSpace); cross-domain operations that integrate land, sea, air, and space capabilities; and the integration of autonomy through artificial intelligence and machine learning (AI/ML) technology. Groundbreaking innovations like these must be supported by cutting-edge labs that are adequately equipped with the modern tools required to design, develop, and bring new products to market.

Established test and measurement companies understand the value of software, and they are investing to grow their software capabilities. Considering their decades of investment in hardware, it isn’t surprising that they’ve embraced a bolt-on approach, wrapping a veneer of software around conventionally architected hardware. This incremental change is low risk and brings some benefits for customers, but it forgoes the massive payoff that can come with a clean-sheet approach to the problem. The situation is analogous to plug-in hybrids, which add electric powertrains to bridge the gap to pure EVs. They can run on electricity some of the time, but they miss out on the substantially lower maintenance costs, better handling and performance, and improved safety that a clean-sheet EV platform can provide.

Software-defined radio is a good example of how a software-over-hardware approach has won out in demanding industrial applications. Software-defined radio began almost as a low-end hobbyist field but has evolved to take over the high-end market thanks to its rapid pace of evolution, ease of customization, and the agility of the software development process.

Modernize Test and Measurement to Drive Efficiency

Historically, test and measurement equipment was siloed into single-function hardware boxes that engineers stacked together to control and test their devices and systems. Software-defined instrumentation, however, allows both manufacturers and users to do more with less. The versatility and upgradability of this new generation of test equipment are driving a convergence and consolidation of the traditional matrix of test instruments. For users, that versatility is a clear benefit: it reduces equipment purchases and extends the usable lifespan of equipment. For manufacturers, it provides a pathway to rationalizing product portfolios, and reducing the number of product lines at a time of supply chain turmoil is an attractive side benefit. These next-generation upgrades have allowed the industry to challenge established norms and deliver a testing solution that’s faster, smaller, and more affordable than traditional approaches. Moreover, the approach benefits from a rapid rate of improvement, driven by the semiconductor-like scaling of its underlying capabilities.

As with many transitions, the timing is dictated by the maturation of several technologies. Advances in field-programmable gate arrays (FPGAs), which can be completely reconfigured based on project needs, have made them the key platform for integrating a full suite of software-defined instruments into one piece of hardware, creating a solution that is much greater than the sum of its parts. This approach can replace conventional instruments such as an oscilloscope, spectrum analyzer, signal generator, or lock-in amplifier, but the full power and flexibility of FPGAs is realized when considering the bigger problem to be solved, across multiple instruments and test scenarios. Combine this with advances in cloud infrastructure and the prevalence of mobile computing platforms, and the future direction of the industry becomes clearer.
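To make this concrete, the sketch below shows how a single FPGA-based device might be repurposed from one instrument role to another from a host script. The sdi module, device address, and method names are assumptions made up for this illustration, not any vendor’s actual API.

```python
# Hypothetical sketch: one FPGA-based device taking on different instrument roles.
# The "sdi" module, address, and method names are illustrative, not a real API.
import sdi

device = sdi.connect("192.168.1.50")  # a single piece of hardware on the bench

# Configure the FPGA as an oscilloscope and capture a trace.
scope = device.load_instrument("oscilloscope", sample_rate=500e6)
trace = scope.capture(duration=1e-3)

# Moments later, reconfigure the same fabric as a spectrum analyzer.
analyzer = device.load_instrument("spectrum_analyzer", center=1e6, span=10e6)
spectrum = analyzer.sweep()

# ...or as a signal generator driving the device under test.
generator = device.load_instrument("signal_generator")
generator.output(channel=1, waveform="sine", frequency=1e5, amplitude=0.5)
```

The same hardware fills each role; only the FPGA configuration and the software in front of it change.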

The Software-First Approach

Twenty years ago, most FPGAs deployed in the wild were reconfigured only a handful of times during their lifespans. In modern software-defined instrumentation, FPGAs might be reconfigured hundreds of times a day. As tools to program FPGAs more efficiently have emerged, the dynamism expected of FPGAs has increased dramatically, opening up new ways to leverage their capabilities.

This new software-first approach to test and measurement has a multitude of advantages over traditional equipment. Engineers can quickly access the instruments they need and run tests remotely. Researchers can easily reconfigure their setup through software to incorporate new measurements or even deploy custom real-time signal processing and data analysis. As FPGA capabilities have continued to improve rapidly, software-defined systems can replace not just a single conventional instrument but a whole rack of test equipment. Multiple instruments can operate simultaneously, with connections between them routed and rerouted directly on a single FPGA. These advantages deliver a greater degree of flexibility to power more dynamic test capabilities and configurations, better visualizations and diagnostics, and hardware-in-the-loop (HITL) testing and real-time simulation.
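As a rough illustration of that last point, and again using a hypothetical API rather than any vendor’s real one, a multi-instrument configuration might look like the following, with one instrument’s output routed to another’s input inside the FPGA instead of through front-panel cabling.

```python
# Hypothetical sketch: several instruments deployed to one FPGA simultaneously,
# with internal signal routing between them. All names are illustrative only.
import sdi

device = sdi.connect("192.168.1.50")

# Deploy two instruments side by side on the same FPGA fabric.
generator = device.load_instrument("signal_generator", slot=1)
lockin = device.load_instrument("lock_in_amplifier", slot=2)

# Route the generator's output to the lock-in's reference input on-chip,
# so the demodulation reference never leaves the FPGA.
device.route(source="slot1.output_a", destination="slot2.reference_in")

generator.output(channel=1, waveform="sine", frequency=1.234e4, amplitude=0.1)
iq_data = lockin.read(duration=0.5)  # in-phase/quadrature data from the device under test
```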

In aerospace and defense, modern development environments rely on software-defined tools to help address today’s challenges head-on. For example, using a digital twin — a virtual representation of a physical system — enables far more flexible characterization and scenario testing, decreasing the number of required hardware spins and accelerating development timelines. Similarly, HITL testing can be employed, where certain parts of a new design, such as the control algorithms, initially run on reconfigurable test hardware. The plant can first be included as a model and later migrated to a physical prototype. This enables design development and characterization that can be iterated quickly while revealing the real-world behavior of the closed-loop system. As technologies like AI/ML continue to grow in importance and become more tightly integrated with designs, a software-based approach to test will be critical to account for the vast set of possible scenarios and ensure adequate test coverage.
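To show the basic pattern, here is a minimal, self-contained sketch of HITL-style development: a simple PI controller is exercised first against a plant model, and the same loop can later be pointed at a physical prototype by swapping in a hardware interface. The plant dynamics, gains, and class names are assumptions invented for this illustration.

```python
# Minimal HITL-style sketch (illustrative only): the controller is exercised
# against a plant *model* first; the same loop can later drive real hardware.

class PlantModel:
    """First-order model standing in for the real plant."""
    def __init__(self, gain=2.0, time_constant=5.0, dt=0.01):
        self.y, self.gain, self.tau, self.dt = 0.0, gain, time_constant, dt

    def step(self, u):
        # Discrete first-order response: dy/dt = (gain*u - y) / tau
        self.y += (self.gain * u - self.y) * self.dt / self.tau
        return self.y


class PlantHardware:
    """Placeholder for the physical prototype, driven via real I/O."""
    def step(self, u):
        raise NotImplementedError("write u to a DAC, read the sensor back via an ADC")


def run_closed_loop(plant, setpoint=1.0, kp=1.5, ki=0.4, dt=0.01, steps=2000):
    """Simple PI controller; identical test code runs against model or hardware."""
    integral, y, history = 0.0, 0.0, []
    for _ in range(steps):
        error = setpoint - y
        integral += error * dt
        u = kp * error + ki * integral
        y = plant.step(u)
        history.append(y)
    return history


response = run_closed_loop(PlantModel())          # pure simulation today
# response = run_closed_loop(PlantHardware())     # same test against the prototype later
print(f"final value: {response[-1]:.3f}")
```

The value of the pattern is that the test code stays identical in both phases; only the plant object changes as the design migrates from model to prototype.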

Best Practices to Modernize Design and Test Environments

While experimental physicists and engineers tend to embrace new technologies, the commercial sector often takes a little longer to catch on. One reason for this is that design and test engineers face a range of challenges when migrating their infrastructure and adopting new tools and processes. From IT and security to data and test migration to user training and results correlation, migration can be a big undertaking.

But as software-based test and measurement solutions continue to mature, even conservative organizations are considering the advantages that this approach can offer. Areas that can benefit from software-defined instrumentation include:

  • Test systems that require tight integration between multiple instruments or system-level characterization.

  • Projects where requirements are expected to evolve while schedule and cost commitments must hold.

  • Test strategy initiatives that seek to maximize asset utilization, optimize lab space, or balance budget considerations like capex vs. opex.

Traditional standalone test and measurement hardware is simply no longer sufficient for bringing together the different technologies and subsystems that are common in industry today. Modern, software-defined devices that offer a suite of instruments are better equipped to manage these complex demands, enabling faster design and validation and taking advantage of the cloud to help users easily share and manage vast amounts of data. Once assets are deployed to the field, maintaining them requires solutions that are flexible, remotely manageable, and upgradable. To meet these challenges effectively, test equipment must translate raw data into insights and help users navigate this complexity with versatile, software-defined solutions.

Although the transition to a software-based approach to test and measurement is in its early stages, the movement is quickly gaining steam. In the years ahead, gains in efficiency, flexibility, and scalability will accelerate this evolution, with profound implications, from shorter timelines to reduced costs, that will further speed the development of emerging technologies.

Daniel Shaddock is the CEO of Liquid Instruments and a professor of physics at The Australian National University. He served as a Director’s Fellow at NASA’s Jet Propulsion Laboratory and is a Fellow of the American Physical Society.
