Where to now for the test and measurement industry?
The Internet of Things (IoT), big data and the rapid development of wireless networks are all having a significant impact on the test and measurement industry. The article below provides insights into the key technologies and methodologies affecting the industry.
Testing the hybrid
Test departments are constantly looking for ways to drive down their test costs by maximising efficiencies. One method is the continuous improvement process, which strives to improve processes, products or services by ensuring commonality across test platforms through a standardised universal tester. But far too often, a lack of consideration for market drivers, product complexity and long-term cost goals results in overdesigned, costly testers that are intended to meet every need but end up as highly complex burdens.
Thankfully, recent changes in automatic test equipment (ATE) allow test managers to look beyond the limited build-versus-buy purchase model and focus on a more holistic, hybrid approach when it comes to defining overall test strategy. In this new approach, test managers can evaluate more trade-offs, match unique needs with considerations for product complexity and contemplate how market drivers may reduce product life cycles. For the many industries typically inclined to purchase commercial ATE solutions, such as automotive, aerospace/defence and semiconductor, this trend presents an exciting new opportunity.
This is especially true for those in the semiconductor industry, who are seeing market drivers such as the IoT, tablets, smartphones, digital TVs and smart grids. These trends all leverage radiofrequency integrated circuits (RFICs) and microelectromechanical systems (MEMS) that are more integrated, complex and capable, and that consequently have significantly shorter life cycles and are much more difficult to test. As a result, an ATE test strategy that is not cost-efficient, flexible and easy to support risks capital-intensive acquisitions of entirely new equipment every few years. However, with the benefits of increased throughput and high-volume production driving strong adoption of a universal test approach in the semiconductor industry, these challenges can be readily addressed.
Leveraging big data
The emergence of the big analog data problem, which includes collecting and analysing raw data from the physical world around us, is pushing test and measurement companies to evaluate the people, processes and technologies used to develop products and services.
Unlike the big data typically associated with traditional IT sources such as social media and enterprise applications, big analog data represents a vastly untapped well of information and insight that test and measurement companies can use to identify and create competitive advantages in data-centric engineering. This is no small feat considering that IDC estimates only 5% of the data collected today is even being analysed.
In the push to better acquire, store and leverage big analog data, and specifically test data as it is known in automated test, engineers must start by recognising the role that IT plays in managing it. At present, the sheer amount of data being generated by engineering departments is opening a chasm between IT and engineering. Unless these groups work together to develop tools and methods to better use the data, this chasm will only grow deeper.
The first step to cohesion is understanding how big data is classified: structured, unstructured or semi-structured. Historically, most big data solutions have focused on structured data. Structured data is defined by the user, who enters discrete values (name, birthday, address) as raw data with clear relationships between fields. Unstructured data has no metadata, schema or other preassigned organisation.
The third category, semi-structured, is influenced by the dramatic increase in the amount of test data being collected. As more test systems are deployed for 24/7 test data collection, the volume of test data will soon surpass that of human-generated data. Because test data yields so much information, assigning structured value to each byte is difficult. Creating hierarchies of data provides structure and makes mining the data after capture easier. This semi-structured test data is typically marked with a timestamp and then analysed across a set period or for a set stimulus/response event.
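To make this concrete, below is a minimal sketch (in Python, with hypothetical device and channel names) of what such a semi-structured test record might look like: each capture carries a timestamp and a loose hierarchy of metadata around otherwise raw channel data.

```python
from datetime import datetime, timezone

# A hypothetical semi-structured test record: every capture carries a timestamp
# and a loose hierarchy (device -> test step -> channels), but the channel
# payload itself is raw, unmodelled sample data.
test_record = {
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "device": {"serial": "UUT-0042", "firmware": "1.3.7"},
    "step": "power_up_current",
    "channels": {
        "vbatt_v": [3.71, 3.70, 3.69, 3.68],
        "supply_current_ma": [120.4, 118.9, 121.2, 119.7],
    },
    "limits": {"supply_current_ma": {"max": 150.0}},
}

# Mining after capture: the hierarchy makes it easy to slice by time window
# or by a stimulus/response event without a rigid, byte-level schema.
worst_case = max(test_record["channels"]["supply_current_ma"])
limit = test_record["limits"]["supply_current_ma"]["max"]
print(f"{test_record['step']}: worst case {worst_case} mA (limit {limit} mA)")
```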
Form a cross-functional team
To effectively transform into a test data-centric organisation, a cross-functional team should jointly test solutions and ensure compatibility. This team should include a representative from IT, an engineer tasked with data collection, a data scientist and a manager with a high-level view of how new solutions will roll out to other departments. Additionally, an executive should have a vested interest in the outcome of the inclusion of test data analytics to ensure key members of the cross-functional team are held accountable for progress.
Do not expect results immediately
Many companies make the mistake of expecting a full data analytics solution in an unreasonable amount of time. Underestimating the effort required to align multiple teams while trying to overhaul existing workflow processes usually leads teams into proposing solutions without understanding their true data needs. This results in an unusable solution that end users don’t adopt.
A full data analytics solution for test data is best built in smaller, incremental steps that create momentum among end users, IT professionals, business leaders and others. Best-in-class companies often run an internal pilot within a single department before documenting data analytics requirements. This allows key stakeholders to understand the flow of the collected data and identify data bottlenecks. Addressing bottlenecks also improves yield, quality and time to market, and prevents inadequate products from reaching the market by catching more errors and out-of-specification tests. These benefits increase the company's overall profit.
Design for expansion
Companies need to keep the big picture in mind when starting pilot programs in test automation. They need to remember that solutions architected for certain groups will not scale when test data analytics solutions roll out to other departments. In addition, companies can send their engineering and design teams weekly reports to identify key trends for avoiding failures or tightening margins. This can jump-start a redesign process that addresses all possible scenarios.
By prioritising a long-term vision when designing a test data analytics solution architecture, companies can set tangible goals for expansion, and IT can plan accordingly, adding more servers as the solution is implemented across multiple departments.
Invest now for enormous payoffs
Implementing a test data solution can add tremendous value to an organisation by enabling a more productive workforce while lowering costs and increasing profit. The companies that choose to make the shift to data-centric organisations will be market leaders with access to up to 95% more data than competitors, which can make them 20% more cost-efficient.
Multi-core to many-core
In the test and measurement industry, faster processor clock rates have traditionally reduced test time and cost. Though many companies, especially those in semiconductor and consumer electronics, have benefited from upgrading the PCs that control test hardware, the days of depending on faster clock rates for computational performance gains are numbered.
Faster clock rates come at the cost of higher thermal dissipation and lower power efficiency. Therefore, over the last decade, the computing industry has focused on integrating multiple parallel processing elements, or cores, rather than increasing clock rates to improve CPU performance. Moore's law states that transistor counts double every two years, and processor vendors use those additional transistors to fabricate more cores. Today, dual- and quad-core processors are common in desktop, mobile and ultra-mobile computing segments, and servers typically have 10 or more cores.
Traditionally, the test and measurement industry has relied on computers with a desktop and/or server class of processor for higher performance. As recent sales trends indicate, the desktop segment of the computing industry is shrinking, with casual consumers moving towards more portable yet powerful platforms such as ultrabooks, tablets and all-in-ones. To better address the demands of these faster-growing market segments, the computing industry is focusing on improving the graphics performance and power efficiency of the ultra-mobile, mobile and desktop classes of processors. Increasing computational performance for these processor categories is generally a tertiary consideration. High-end mobile and desktop processors will continue to offer adequate computational performance for test and measurement applications, but only limited improvements in raw processing capability should be expected between newer generations of these processors.
For the server class of processors, the main applications for the computing industry are IT systems, data centres, cloud computing and high-performance computing for commercial and academic research. These applications are significantly more computationally intensive and are pushing the computing industry to continue to invest in increasing the raw computational capabilities of this server class of processors.
Many-core
More cores are being pushed into smaller, lower-power footprints. Processors are becoming ‘many-core’ as core counts soar beyond the 10 cores common in server-class processors today. Supercomputers provide an idea of what the processors of tomorrow will look like. Some cores are being devoted to special functions instead of solely to general computing. Graphics processing engines are a good example, driving higher display resolutions and more realistic 3D rendering. Other special-purpose cores include security engines that perform root-of-trust and encryption/decryption operations, and manageability engines that allow out-of-band management if the processor is hung, in reset or otherwise unreachable. For these many-core processors, however, the majority of cores will remain available for general computing.
Leveraging many-core
With the relative plateauing of the general-purpose computing capabilities of high-end mobile and desktop processors, engineers who want their test applications to maximise performance, lower test times and hence reduce the overall cost of ownership will need to start adopting server-class processors with many-core architectures.
Software architectures that divide computing work and can scale to leverage more than 10 processor cores will be required. Consider which tasks can be implemented in parallel from the beginning when designing new applications. When considering implementation, choose tools that allow a user to maximise the parallelism in an application. Selecting an optimising compiler, multithreaded analysis routines and thread-safe drivers is a good starting point. Also, make sure that implementation languages offer strong support for threading and an appropriate level of abstraction so that the increased software complexity does not negatively affect developer efficiency.
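As a minimal sketch of what dividing computing work can mean in practice, the Python example below fans independent analysis jobs out across a process pool that scales with the available cores; the batch contents and the analysis routine are hypothetical stand-ins, not any particular vendor's API.

```python
from concurrent.futures import ProcessPoolExecutor
import os
import statistics

def analyse(batch):
    """Hypothetical per-batch analysis routine: independent of other batches,
    so many instances can run on separate cores at the same time."""
    return {
        "batch_id": batch["batch_id"],
        "mean": statistics.fmean(batch["samples"]),
        "peak": max(batch["samples"]),
    }

def main():
    # Stand-in measurement batches; a real system would stream these from hardware.
    batches = [
        {"batch_id": i, "samples": [(i + j) % 7 * 0.1 for j in range(10_000)]}
        for i in range(64)
    ]
    # The pool scales with the available cores, whether that is 4 or 40.
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        results = list(pool.map(analyse, batches))
    print(f"analysed {len(results)} batches on up to {os.cpu_count()} cores")

if __name__ == "__main__":
    main()
```

Process-based parallelism is used here because it sidesteps Python's global interpreter lock for CPU-bound analysis; in lower-level languages the same structure maps naturally onto native thread pools.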
Ignoring parallelism will, at best, result in tepid performance gains as processors evolve. The market is pushing for graphics improvements and higher core counts. While test and measurement applications most likely will not use the graphics features, newer processors with higher core counts offer valuable performance gains to test applications designed to benefit from the upward trend in core count.
Testing in the software-driven world
We live in an increasingly software-driven world, but the growth of embedded software in modern automobiles and airplanes presents significant challenges for manufacturers trying to eliminate software bugs and make products as safe as possible.
In the aerospace and defence industry, reducing release cycles and preventing program delays have become increasingly difficult. In automotive, consumer demands are driving up test complexity and introducing new costs in areas like infotainment. In response, test managers must find affordable ways to incorporate RF testing for wireless signals and machine vision testing for assisted parking to cover an ever-widening range of I/O.
Though industry regulations provide a guide to ensure safety in embedded electronics, compliance with these regulations requires the thorough testing of embedded software across an exhaustive range of real-world scenarios. Developing and testing embedded software with an emphasis on quality can strain the balance of business needs such as short time to market, low test cost and the ability to meet the technical requirements driven by customer demand for new features and product differentiation. All embedded system manufacturers face similar demands, but they cannot sacrifice quality when it comes to safety-critical applications. Organisations that can evolve their development strategies to incorporate advanced hardware-in-the-loop (HIL) testing can reduce spending on quality-related problems, improve their market perception and, most importantly, ensure customer safety.
Meeting safety and business needs
Complying with safety standards requires an understanding of all potential health risks and hazards as well as the capability to rigorously test those scenarios. HIL testing meets many of these growing test needs at a lower cost and in a shorter time frame than physical tests and field tests. With this method, companies dynamically simulate real-world environments using mathematical models to provide closed-loop feedback to the controller being tested. HIL test becomes even more valuable as the need to offload test time in the field or the test cell intensifies with the addition of functionalities to controllers and the increase in test cases.
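Conceptually, such a closed loop can be reduced to the sketch below: a mathematical plant model stands in for the real environment and feeds simulated sensor values back to the controller under test on every timestep. The first-order thermal model and the simple on/off controller stub are illustrative assumptions only, not any specific vendor's implementation.

```python
def plant_model(temperature, heater_on, dt=0.1, ambient=20.0):
    """Hypothetical first-order thermal plant: heats while the actuator is on,
    otherwise cools back towards ambient."""
    heat_in = 10.0 if heater_on else 0.0
    return temperature + dt * (heat_in - 0.2 * (temperature - ambient))

def controller_under_test(sensed_temperature, setpoint=50.0):
    """Stand-in for the embedded controller being exercised; in a real HIL rig
    this logic runs on the actual controller hardware."""
    return sensed_temperature < setpoint

def run_hil_loop(steps=600):
    temperature = 20.0
    for step in range(steps):
        heater_on = controller_under_test(temperature)    # controller reads the simulated sensor
        temperature = plant_model(temperature, heater_on)  # model responds to the actuation
        assert temperature < 60.0, f"overshoot at step {step}"  # example safety check
    return temperature

print(f"final temperature: {run_hil_loop():.1f} °C")
```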
Scalable test platforms
Embedded software design and test teams must continue to find new ways to use this practice to ensure quality and make consumer safety a priority without sacrificing release schedules. HIL testing is usually entrusted to a dedicated test team, but developers have also been performing manual stimulus testing, known as knob-box testing, for quick functionality checks. This restricted form of testing allows them to spoof the controller by manually changing a limited number of channels. However, many functionality defects are still found in the later stages of HIL testing, or even in the field, which costs developers more time to resolve. With higher levels of automation and easily repeatable test scenarios, developers can discover more of these functionality defects earlier, so that test engineers can focus on identifying performance and integration defects. Full-rack HIL test systems are not necessary for this; instead, organisations must build scalable test platforms that provide an affordable solution across varying capabilities.
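One way to make those quick functionality checks automated and repeatable, rather than manual knob-box exercises, is to capture each stimulus setting as a scripted scenario. The sketch below uses hypothetical signal names and a stubbed controller interface purely for illustration.

```python
# Hypothetical stand-in for the interface a developer would otherwise drive by hand.
def controller_output(inputs):
    """Stubbed controller response: requests the fan above 90 °C coolant."""
    return {"fan_request": inputs["coolant_temp_c"] > 90.0}

# Each scenario replaces one manual 'knob-box' setting and reruns identically every time.
SCENARIOS = [
    {"name": "cold_start", "inputs": {"coolant_temp_c": 15.0}, "expect": {"fan_request": False}},
    {"name": "normal_operation", "inputs": {"coolant_temp_c": 85.0}, "expect": {"fan_request": False}},
    {"name": "overheat", "inputs": {"coolant_temp_c": 105.0}, "expect": {"fan_request": True}},
]

failures = 0
for scenario in SCENARIOS:
    actual = controller_output(scenario["inputs"])
    ok = actual == scenario["expect"]
    failures += not ok
    status = "PASS" if ok else f"FAIL (got {actual})"
    print(f"{scenario['name']}: {status}")
print(f"{len(SCENARIOS) - failures}/{len(SCENARIOS)} scenarios passed")
```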
As increasing embedded controller capability drives further innovation, safety regulations will be honed to ensure even greater user safety. To keep up with feature demand while preserving the quality of the overall system, test capabilities will need to grow accordingly. Simply adding more test capacity will not scale without unsustainable overhead; test managers need to adopt advanced HIL test technology and new techniques. This ensures that as industry regulations guide system engineering teams towards higher levels of safety for more advanced products, test platforms can still meet critical cost and time requirements.
The 5G era
From the late 1980s to the early 2000s, the rule of microwave instrumentation was simple: those who make the best microwave transistors win. Throughout this era, test vendors released instrumentation that pushed the envelope on characteristics like frequency range, noise floor and linearity performance. Advances in hybrid microcircuit technology, synthesiser tuning time and phase noise were some of the most critical innovations during this period.
Today, the continued evolution of wider instantaneous bandwidth represents a significant area of improvement for RF signal generator and analyser technology.
This trend of signal analysers supporting wider instantaneous bandwidth is primarily being driven by the evolution of off-the-shelf analog-to-digital converter (ADC) technology and wireless standards, but the benefits of faster ADCs reach far beyond the wireless industry. Improvements in off-the-shelf ADC technology now allow test equipment manufacturers to address the needs of customers across a broad spectrum of industries, especially aerospace and defence.
From 1G to 5G
To understand how the wireless communications industry has helped drive improvements in signal analyser technology, it is important to recognise the rapid increase of channel bandwidth across today’s modern wireless standards.
An even more telling evolution in wireless technology was the widespread development of 802.11ac devices that began several years ago. At the time, the wireless industry had created a widely adopted standard that was ahead of the capabilities of RF signal generators and analysers. As a result, many test and measurement vendors accelerated their development of wider bandwidth instruments just to support the bandwidth requirements of 802.11ac in a timely manner.
Looking ahead, the next major milestone for RF test equipment is the ability to test the fifth generation of cellular devices. And as researchers use advanced software-defined radio tools to actively prototype 5G candidate technologies such as massive MIMO, GFDM and millimetre wave communications, the potential use of wideband millimetre wave signals most likely will require RF test equipment to offer 2 GHz of bandwidth by 2017 or 2018 to support a 2020 deployment.
By any standard, achieving 2 GHz of instantaneous bandwidth in an RF signal analyser would be a major landmark in the test and measurement industry. If such an instrument existed, it would be an incredibly useful tool for bandwidth-hungry applications such as radar pulse measurements and spectrum monitoring.
Making it all possible
If you’re wondering how the industry is going to get to 2 GHz of bandwidth, a good place to start is Moore’s law, which theorises that transistor density on an integrated circuit doubles every two years. And for those in the computing industry, Moore’s law remains a strong indicator of the ever-increasing capability of computing technology today.
CPUs and FPGAs are not the only technologies that have benefited from exponential improvements in transistor density on an IC. ADC sample rates are following a similar trend. Consider the maximum available sample rate of 12-bit ADC technology versus time. Because 12-bit ADCs provide increasing dynamic range to analyse frequency domain signals, they are an effective proxy for the bandwidth capabilities of RF signal analysers.
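As a back-of-the-envelope illustration, assuming a simple Nyquist relationship and ignoring guard bands and filter roll-off, the numbers below show what 2 GHz of instantaneous bandwidth implies for converter speed and raw data throughput.

```python
# Back-of-the-envelope only: idealised Nyquist numbers with no margin for
# filter roll-off, guard bands or converter interleaving overheads.
bandwidth_hz = 2e9      # target instantaneous bandwidth
bits_per_sample = 12    # 12-bit ADC, as discussed above

# Real sampling needs at least twice the bandwidth; complex I/Q sampling
# needs roughly the bandwidth itself per converter.
real_sample_rate = 2 * bandwidth_hz
iq_sample_rate = bandwidth_hz

# Raw data rate for the real-sampled case, assuming packed 12-bit samples.
data_rate_bytes = real_sample_rate * bits_per_sample / 8

print(f"Real sampling:  >= {real_sample_rate / 1e9:.0f} GS/s")
print(f"I/Q sampling:   >= {iq_sample_rate / 1e9:.0f} GS/s per converter")
print(f"Raw data rate:  ~ {data_rate_bytes / 1e9:.0f} GB/s")
```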
Based on the current rate of development, 12-bit converter technology will soon be able to drive RF instruments to multi-gigahertz of instantaneous bandwidth and boost today’s gigahertz-bandwidth oscilloscopes to even higher resolutions.
Next-gen RF instruments
For engineers in the wireless industry, the next generation of extremely wideband instruments is poised to help drive 5G products to market. However, with a broader view of the benefits to come, engineers will soon be using exciting new measurement approaches and techniques ushered in by next-generation RF signal analysers (and even oscilloscopes).
In radar design and development, for instance, the growing bandwidth and signal-processing capabilities of instrumentation should soon yield more advanced radar prototypes. In high-volume manufacturing test, the ability to acquire ultrawideband signals in a single shot will help test engineers easily capture data from multiple wireless devices in parallel for faster multisite test configurations.
In many respects, the bandwidth limitations of yesterday’s RF signal analysers now drive some of the test techniques we use today. Now that we’re in the middle of a bandwidth revolution, we need to consider how wider bandwidth is going to empower the test techniques of tomorrow.