Making the impossible possible and the common easy
By Jeffrey Phillips, Principal Software Product Manager
Wednesday, 24 May, 2017
The rapid pace of technological advancement should be celebrated and embraced. It fuels amazing new technologies and scientific achievements that make us more connected and safer. It also pushes the limits of what we previously thought possible.
The impact of these achievements is no longer isolated to a narrow market vertical. It permeates every industry and exposes the established market incumbents to an unusual combination of disruption and growth potential.
But the pressure and the challenge to drive business impact are daunting in this climate. How do you stimulate growth while making large investments in future technologies without dramatically changing your business model? Companies are watching their operational costs balloon as they dip their toes into numerous areas of investment that require significant and often disparate expertise. Meanwhile, small start-ups with incredible focus and no prior obligations can leverage new technologies in ways that established competitors struggle to answer.
So how do you protect yourself from disruption? How do you innovate without radically increasing the cost of doing business? It all boils down to one simple question: do you feel secure in the tools you’re using?
That’s the magic question, whether it’s your personal finances, career or the engineering systems of the future. For instance, the Industrial Internet of Things ushers in a new era of both networked potential and significant risk. To best understand which software prepares you to most securely engineer future systems, you should turn to the recent past.
In 2005, the previous three technological decades could be summed up by one simple observation made by Intel co-founder Gordon Moore. Moore’s law was the prediction, based on the recent past, that the number of transistors per square inch on an integrated circuit would continue to double roughly every 18 months. What looked at any given moment like steady, incremental progress was in fact compounding into exponential growth.
Before we knew it, the CEO of every semiconductor manufacturer was talking about how many parallel processing cores they would ship over the next few years; Intel CEO Paul Otellini promised 80-core chips within five years. The demand for more processing power with lower latency marched on, and alternative processing fabrics emerged. First, the FPGA stormed into popularity with its software-defined timing and massively complex low-level programming languages. Then heterogeneous processing was born when the traditional processor and the FPGA were combined onto a single chip.
Along with this explosion of processor architectures came a flood of new programming environments, programming languages and open-source fads biding their time until the inevitable decline into oblivion. And, of course, the whole burden of figuring out how to efficiently program the processors fell on you.
But now, we look to the future. The explosion of processing capabilities is leading us forward into a world of hyperconnectivity. And this world becomes more connected as engineering systems become more distributed. Trends like 5G and the Industrial Internet of Things promise to connect infrastructure, transportation and the consumer network to enrich the lives of people the world over. It’s inarguable that software will be the defining aspect of any engineering system, if it’s not already. And it won’t be long before hardware becomes completely commoditised and the only distinguishing component of a system will be the IP that defines the logic.
Most test and measurement vendors have been slow to respond to the inevitable rise of software and are just now hitting the market with software environments that help the engineering community. But even those can only get you so far. As the industry continues to evolve, the tools engineers use to design these connected systems must meet four key challenges: productivity through abstraction, software interoperability, comprehensive data analytics and the efficient management of distributed systems.
Productivity through abstraction
Abstraction is one of those words that is so overused it’s in danger of losing its meaning. Simply put, it is making the complex common. In the world of designing engineering systems, complexity often comes from programming: the custom logic that adds the ‘smart’ to smart systems typically requires coding so complex that it separates the pros from the amateurs. The complex must become common, though.
To solve this challenge, engineers need a ‘programming optional’ workflow that lets them discover and configure measurement hardware, acquire real-world data and then perform the analytics that turn raw data into real insight. NI is introducing such a configuration-based workflow in LabVIEW NXG. It is complemented by the graphical dataflow programming paradigm native to LabVIEW, known for accelerating developer productivity in complex system design for nearly 30 years. With this configuration-based interaction style, you can progress from sensor connection all the way to the resulting action without writing a line of code, while the corresponding code modules are still constructed behind the scenes. That last step is critical: it streamlines the transition from one-off insights to repeatable, automated measurements.
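For a sense of what that workflow looks like in text form, here is a rough sketch of the same discover-configure-acquire-analyse sequence using NI’s nidaqmx Python API rather than LabVIEW NXG itself; the device and channel names (such as "Dev1/ai0") are placeholders for whatever hardware discovery reports on your system.

import statistics
import nidaqmx
from nidaqmx.constants import AcquisitionType

# Discover attached DAQ hardware
system = nidaqmx.system.System.local()
print("Devices found:", [dev.name for dev in system.devices])

# Configure a voltage measurement and acquire a finite block of samples
with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")  # placeholder channel
    task.timing.cfg_samp_clk_timing(1000, sample_mode=AcquisitionType.FINITE,
                                    samps_per_chan=1000)
    data = task.read(number_of_samples_per_channel=1000)

# Turn the raw data into a simple insight
print("Mean voltage:", statistics.mean(data), "V")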
Software interoperability
With the growing complexity of today’s solutions, the need to combine multiple software languages, environments and approaches is quickly becoming ubiquitous. But the cost of integrating these software components is considerable and continues to rise. Languages for specialised hardware platforms must be integrated with general-purpose languages as those compute platforms are combined into single devices. Typically, the design team shoulders the burden of integration itself, but that only treats the symptom; the root cause is something software vendors must fix.
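To make that integration burden concrete, here is a minimal sketch of the glue code a design team might write just to call one routine from a hypothetical C shared library out of Python; the library and function names are invented for illustration, and every language boundary in a real system needs its own version of this.

import ctypes

# "libfilter.so" and "apply_filter" are hypothetical names for illustration.
lib = ctypes.CDLL("./libfilter.so")
lib.apply_filter.argtypes = [ctypes.POINTER(ctypes.c_double), ctypes.c_size_t]
lib.apply_filter.restype = ctypes.c_int

# Marshal a Python list into a C array, call across the boundary, read back.
samples = (ctypes.c_double * 4)(0.1, 0.5, 0.9, 0.4)
status = lib.apply_filter(samples, len(samples))
print(status, list(samples))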
By design, NI’s software-centric platform places this software interoperability at the forefront of the development process. Though LabVIEW has been at the centre of this software-centric approach, many complementary NI software products are individually laser-focused on specific tasks, such as test sequencing, hardware-in-the-loop prototyping, server-based data analytics, circuit simulation for teaching engineers and online asset monitoring. These products are purposefully limited to the common workflows of the engineers and technicians performing those tasks, a characteristic they share with other tailored software in the industry. For NI software, however, LabVIEW provides extensibility through an engineering-focused programming language that defies the limitations of tailored software. Consider DAQExpress, for example.
DAQExpress is new companion software for USB and low-cost plug-in NI data acquisition hardware that drastically simplifies hardware discovery and configuration and provides access to live data in two clicks. All the configuration ‘tasks’ within this product are fully transferable to LabVIEW NXG, which simplifies the transition from hardware configuration to measurement automation.
In addition to interoperating within the NI platform, products like LabVIEW 2017 feature enhanced interoperability with IP and standard communications protocols. For embedded systems that need to interoperate with industrial automation devices, LabVIEW 2017 includes native support for IEC 61131-3, OPC-UA and the secure DDS messaging standard. It also offers new interactive machine learning algorithms and native integration with Amazon Web Services.
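As a hedged illustration of that kind of protocol-level interoperability, the snippet below reads a single value from an industrial device over OPC UA using the third-party python-opcua package (not an NI API); the endpoint URL and node identifier are placeholders.

from opcua import Client

client = Client("opc.tcp://192.168.1.10:4840")  # placeholder endpoint
try:
    client.connect()
    # "ns=2;i=2" is a hypothetical node identifier for a sensor reading
    temperature_node = client.get_node("ns=2;i=2")
    print("Current value:", temperature_node.get_value())
finally:
    client.disconnect()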
Beyond the individual innovation within each of these products, the collection represents the fruition of NI’s commitment to its ongoing investment in software. This combination of software products and their inherent interoperability separates NI’s platform from the rest. Other vendors are just now figuring out that software is the key, but NI’s investment in software has steadily increased over the last 30 years.
Comprehensive data analytics
Perhaps the most profound benefit of the mass connectedness of the world’s systems is the ability to instantaneously access and analyse every data point you collect. This capability is critical to automating decision-making and eliminating avoidable delays in corrective action when data anomalies occur.
To create the future network that can support this need, billions of dollars are being poured into research as algorithm experts around the globe race to meet targets of 1 ms latency coupled with 10 Gbps throughput. This direction introduces new demands on software. The first is to ensure that processing elements can be easily deployed across a wide variety of processing architectures and then redeployed on a different processor with minimal (ideally zero) rework. The second is to be open enough to interface with data from a practically unbounded number of nodes, arriving in an equally unbounded variety of data formats.
NI has invested in server products that allow you to intelligently and easily standardise, analyse and report on large amounts of data across your entire test organisation. A key component is providing algorithms to preprocess files and automatically standardise items such as metadata, units and file types in addition to performing basic analysis and data quality checks. Based on that data’s contents, the software can then intelligently choose which script gets run. This type of interface is critical to eliminating the complexity of real-time data analytics so you can focus on what matters: the data.
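A toy sketch of that preprocess-and-route idea, not NI’s actual server API, might look like the following: standardise units and metadata in an incoming record, run a basic quality check, then pick an analysis script based on the record’s contents (all names are illustrative).

import json

ANALYSIS_SCRIPTS = {
    "vibration": "analyze_vibration.py",
    "thermal": "analyze_thermal.py",
}

def standardise(record: dict) -> dict:
    """Normalise units and fill in required metadata fields."""
    if record.get("units") == "degF":
        record["values"] = [(v - 32) * 5 / 9 for v in record["values"]]
        record["units"] = "degC"
    record.setdefault("operator", "unknown")
    return record

def route(record: dict) -> str:
    """Run a basic quality check, then choose an analysis script by content."""
    if not record["values"]:
        raise ValueError("empty measurement - failed quality check")
    return ANALYSIS_SCRIPTS.get(record.get("measurement_type"), "analyze_generic.py")

record = standardise(json.loads('{"measurement_type": "thermal", "units": "degF", "values": [98.6, 101.2]}'))
print(route(record), record["units"], record["values"])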
Distributed systems management
The mass deployment and connectedness of these systems have renewed the need to efficiently manage all the distributed hardware from a centralised, and often remote, location. Today, this typically means replicating single-point deployments across hundreds or even thousands of systems. Centralising their management makes it possible to view a real-time dashboard of the hardware from a remote depot instead of physically visiting each system.
SystemLink is innovative software from NI that helps you centralise the coordination of a system’s device configuration, software deployment and data management. This reduces the administrative burden and logistical costs associated with systems management, and it improves test and embedded system uptime by promoting awareness of operational state and health criteria. SystemLink simplifies managing distributed systems and provides APIs for LabVIEW and other programming languages such as C++.
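In the same spirit, a central management client might poll the health of each deployed system over such an API. The sketch below uses a hypothetical REST endpoint and JSON fields invented for illustration; it is not the SystemLink API.

import requests

NODES = ["tester-01.example.com", "tester-02.example.com"]  # placeholder hosts

def poll_health(host: str) -> str:
    """Query a (hypothetical) health endpoint on one deployed system."""
    resp = requests.get(f"https://{host}/api/health", timeout=5)
    resp.raise_for_status()
    return resp.json().get("state", "unknown")

for host in NODES:
    try:
        print(host, poll_health(host))
    except requests.RequestException as err:
        print(host, "unreachable:", err)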
Ask yourself again
Beyond the individual innovation within each of these product releases, the collection represents the culmination of the ongoing investment in software that NI has committed to year after year. From helping discover the Higgs boson to decreasing test times by 100x for Qualcomm to being Nokia’s and Samsung’s solution of choice for their 5G research, NI’s software-centric platform is the building block engineers use to solve the world’s most complex challenges.
Ask yourself again: how secure do you feel in the tools you’re using?