Software makes a difference in embedded design
Rahman Jamal, technical and marketing director Europe at National Instruments, discusses some of the challenges of embedded system design for control and monitoring applications
The current offering for the embedded market comprises disjointed and complex tool chains that make it difficult for developers to create embedded systems that combine measurement and control functionality. Solving this challenge requires standardised hardware and software platforms that enable even small teams of developers to experiment and solve problems quickly and efficiently. Compared with conventional tools, which offer almost no scope for system abstraction and tend to be characterised by cryptic, hardware-dependent programming, a platform-based approach is more productive.
Graphical System Design is one such platform-based approach. It breaks an application down into basic building blocks such as I/O, analysis, processing, programming, user interface and implementation platform, and links them together using graphical programming techniques, including timing and synchronisation. This enables the user to concentrate on innovation instead of grappling with complex system design problems.
Practical examples of how hardware components are combined with software architecture can be found in areas as diverse as renewable energy, the life sciences and robotics. Users of platform-based Graphical System Design rely on it to solve challenging control and monitoring tasks in embedded systems. For example, medical imaging devices based on optical coherence tomography have been created to produce three-dimensional images. Such examples show that even small teams of developers can successfully manage complex control and signal processing tasks and quickly arrive at innovative solutions.
Developing complex things in ever shorter timescales
A constantly increasing number of new designs and escalating complexity are forcing embedded design teams to become more efficient and influencing their choice of technology. Evidence of this is provided by mobile telephones; ten years ago, devices featured a wireless module and a single processor, which was adequate for the task to be performed – telephoning while out and about.
Since then the mobile phone has been transformed into a smartphone with various interfaces (Bluetooth and WLAN) and a number of processors. These are also required in order to execute the multitude of applications – e-mails, SMS, calendar, videos, music, games, photos and telephony.
Even more complex are automobiles, with as many as 100 processors controlling the engine, braking, traction, the on-board computer, seat and mirror memory, music and navigation systems. The same trend can be seen in industrial applications: electronic systems and machines feature a wealth of control and monitoring systems whose purpose is to improve plant performance and quality. Ultimately, it always comes down to differentiation from the competition.
To support design teams with a rapid market launch, technology providers are developing components, modules and even complete embedded platforms with an ever higher degree of integration and functionality. Ultimately, companies are working towards a comprehensive platform for embedded design incorporating communication, program execution, system I/O and the design software.
This trend began with SoCs (Systems-on-Chip) and SoMs (Systems-on-Module) customised to specific embedded applications. Such components integrate all the electronic circuits and often feature the three main elements of an embedded system: a communication interface, processing and system-specific I/Os. Common examples include video digital signal processing (DSP), audio DSP, radio solutions, networking solutions, or a complete computing platform in a single chip or module.
Computers-on-module (CoMs) make up a special subcategory of SoMs. By integrating an entire computer or embedded subsystem into a single device, companies are providing more value to embedded designers through increased functionality, better integration, a more thoroughly tested design, a smaller package, and lower power consumption. SoCs and SoMs are typically offered as a standard component and are designed either for universal use or for vertical applications. Mainly manufactured in high volumes, they ensure low cost and high quality.
In some application areas, almost all development teams use the same SoC or SoM, which makes it difficult to differentiate the final design. To stand out, most design teams therefore augment the SoC or SoM with additional discrete components and programmable logic.
With the addition of programmable logic, such as a field-programmable gate array (FPGA), to the design, teams can add specialized processing – proprietary know-how or so-called Intellectual Property (IP) – improve performance, and future proof the design with the ability to update the logic at any time during development or even once they have deployed the embedded system.
The addition of FPGAs is now such established practice that SoCs are already being offered that combine a full microprocessor and an FPGA in a single device. Currently, the most interesting component from National Instruments’ perspective is the Xilinx Zynq-7000 Extensible Processing Platform (EPP). The EPP family combines a dual-core ARM Cortex-A9 processor with Xilinx 7-series FPGA fabric. These components unite high performance and flexibility, so embedded developers can differentiate their design while benefiting from the advantages of an SoC.
Embedded platform challenges
Although the advances in SoCs and SoMs are exciting, most fall short of offering a complete embedded platform. In the coming decade, software tools will play a more critical role in system design and development. In the past, many embedded designs were dictated by hardware capabilities and the effort of mapping them to system requirements.
Due to the reduction in power, cost, and size in embedded hardware over the last decade, hardware will no longer limit or dictate many embedded design choices. Productivity will. Embedded design productivity will be driven by tightly integrated software design tools that can use off-the-shelf hardware capabilities with an environment intuitive enough to be used by nearly all engineers and scientists, not only those trained in embedded software, firmware development or hardware description languages.
Smart phones are a good illustration of the influence that better software development tools can have on embedded design.
A completely integrated embedded platform must include a single software development environment that programs the heterogeneous processing systems, includes a large library of analysis and control algorithms, and has tight integration with communication and application-specific I/O, while giving the design team the ability to choose from a variety of programming approaches based on the specific application needs. The embedded platform should also be flexible and modular enough to give design teams the ability to evolve the system during the entire design flow, from first prototype to final deployment, while using the same code throughout.
Previously, the decision whether to use a low-cost microcontroller or a higher-performance CPU was fairly straightforward and based on the expected performance needs of the embedded system. Now, however, control and monitoring systems also need to deliver additional functionality, such as:
• Faster and more reliable responses to I/O
• Machine monitoring to predict failures and improve safety
• Audio and image processing
• Wireless communications and Internet connectivity
• Filtering of analogue and digital signals to get more accurate measurements
• Digital communications to intelligent sensors and other subsystems
• I/O level preprocessing for data reduction
More complex systems such as these require additional processing components, such as FPGAs, digital signal processors (DSPs) and graphics processing units (GPUs).
The reconfigurable logic within the FPGA fabric has been ideal for implementing complex state machines and application-specific digital circuitry that operate independently from processor clock cycles, with higher reliability and determinism.
Over the years, the performance of FPGAs has increased dramatically, with significant reductions in power and cost. For this reason, the use of FPGAs in embedded measurement and control designs has expanded from simple glue logic to handling signal processing tasks, such as custom digital filters, fast Fourier transforms (FFTs), and logic for proportional integral derivative (PID) control. A primary benefit of FPGAs for processing is that several algorithms can now run in parallel, unlike the sequential architecture of a processor.
For all the performance and flexibility that FPGAs offer, they are nowhere near replacing microcontrollers and microprocessors in embedded designs. Processors are still lower in cost and come with a well-established ecosystem of software abstraction, including operating systems, standard hardware drivers, and signal processing libraries with easy floating-point arithmetic. Rather, the adoption of FPGA technology has been driven by higher-performance systems that combine processors and FPGA fabric to divide and conquer complex processing needs across both sequential and parallel architectures. Integrating reprogrammable hardware into a design is the fastest way to iterate without spending time and money on redesigning PCBs.
Cloud computing in the engineering chain
For embedded systems in measurement, control and command applications, cloud computing enables data from distributed applications (e.g. condition monitoring and performance measurement in wind farms) to be aggregated centrally, and resource-intensive tasks, such as demanding image or signal processing or even program compilation and large calculations, to be offloaded. The cloud provides virtually unlimited resources for these purposes.
Admittedly, decentralised access to embedded systems brings additional hazards. Foremost among these is system security, which fundamentally involves a trade-off: more secure systems require greater investments of time and cost and sacrifice convenience. It is therefore necessary to evaluate the appropriate investment in security for each application, based on the hazard and the risk of failure. The following is a list of areas where the OS and network can be secured:
• Disable any services that leave open network ports (like FTP)
• Enable SSL support for any Web services
• Install security updates and patches from your OS vendor
• Set up antivirus and firewall software
It is also advisable to change all standard network ports, set up a VPN-capable firewall, and permit third-party applications only via white-listing and encrypted communication channels.
A software-first design paradigm is predicated on a system architecture that minimizes fixed-function hardware. This includes obvious fixed-functionality devices such as application-specific integrated circuits (ASICs) and hardware filters.
Although these fixed-function devices offer a lower per-piece component cost, they achieve that cost at the expense of future scalability. Software-defined hardware platforms, such as processors, digital signal processors (DSPs), and field-programmable gate arrays (FPGAs), give system designers the flexibility to more completely change a device’s behavior without new electrical work.
While these platforms have higher component costs, they can dramatically reduce design costs, increase market share through faster time to market, and over time increase volume and drive down cost by making it possible to use one design across multiple devices.