In the software world we have seen the adoption of continuous delivery processes transform the approach to software development. It has allowed companies to release new features to customers rapidly, incrementally, frequently and at lower risk. This kind of software development, with its own processes, testing and continuous integration, is often treated separately when developing products with a physical component, such as IoT systems with embedded CPUs or FPGAs, medical devices, safety-critical control systems or soft drinks machines.
However, the principles behind continuous delivery should be applied to the entire process of product development, not just software. This means building quality in, working in small batches and automating wherever possible, backed up by continuous improvement and constant feedback on progress.
This approach can head off problems earlier and reduce the costs of the overall product development. In this article we discuss how to apply continuous delivery principles to projects involving software and firmware on embedded devices, even where there are significant differences and additional challenges.
The Deployment Pipeline becomes a Configuration Controlled Development Process
A deployment pipeline is at the heart of successful continuous delivery in products. Traditionally, continuous integration is used in a software development process to monitor the source control system, triggering a build and test cycle for each change. This allows the health of the software to be monitored constantly and its status fed back to the team rapidly.
Not only can this be used to run the software build and test, but it can be extended to build FPGA firmware code. It can also be used to build and install any operating systems, configure the embedded platforms, deploy the resulting binaries to all target platforms and run system level tests.
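As a hedged sketch of what such a pipeline might look like when scripted, the following runs software build, firmware build, deployment and system test stages in order, stopping at the first failure so the team learns quickly which stage broke. The stage names and commands are illustrative placeholders, not any particular vendor's tooling.

```python
# Hypothetical pipeline sketch: software build, FPGA firmware build, deploy
# to the target, then system-level tests. Commands are placeholders.
import subprocess

# Ordering matters: fail fast and report which stage broke.
STAGES = [
    ("software-build", ["make", "all"]),
    ("firmware-build", ["make", "bitstream"]),            # invokes the FPGA toolchain
    ("deploy",         ["./deploy.sh", "target-board"]),  # JTAG or network deploy script
    ("system-test",    ["pytest", "tests/system"]),
]

def run_pipeline(stages, runner=subprocess.run):
    """Run stages in order; stop at the first failure and name the failing stage."""
    for name, cmd in stages:
        result = runner(cmd)
        if result.returncode != 0:
            return (name, False)  # fast feedback: which stage failed
    return (None, True)
```

Injecting the `runner` makes the orchestration logic itself testable without real build tools present, which matters once the pipeline script is treated as production code.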
The deployment pipeline should also incorporate any manual test stages, QA signoff, generation of documentation and final release.
Typically, in product development involving embedded devices, the final deployment step will either be via an over-the-air update or via deployment to the factory installation system, and this stage should also be included in the pipeline. Each stage should be carefully designed to verify the system incrementally and, in the event of failure, provide information to the team as fast as possible.
Ultimately the pipeline should provide a reliable, repeatable process which is used to build, verify and release the system and can be relied upon to maintain the overall product quality. It should also continually evolve as new hardware platforms come online and as new quality or process requirements emerge. For example, the addition of a QA signoff process should lead to an additional stage in the pipeline which ‘promotes’ the build to the next stage. This should only be activated by a QA engineer with the appropriate authority, with no need for paper-based sign off. This is especially useful in regulated industries where such processes are mandatory.
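A minimal sketch of such a promotion gate is shown below. The role names and the in-memory audit log are assumptions for illustration; a real CI server would enforce this through its own user and permission model.

```python
# Hypothetical QA promotion gate: a build only moves to the next pipeline
# stage when promoted by a user holding an authorised QA role.
AUTHORISED_ROLES = {"qa-lead", "qa-engineer"}  # illustrative role names

class PromotionGate:
    def __init__(self):
        self.audit_log = []  # (build, user, role) records form the audit trail

    def promote(self, build_id, user, role):
        """Record the promotion, or refuse it if the user lacks authority."""
        if role not in AUTHORISED_ROLES:
            raise PermissionError(f"{user} ({role}) may not promote builds")
        self.audit_log.append((build_id, user, role))
        return True
```

Keeping every promotion in an audit log is what lets the electronic signoff replace paper-based records in a regulated setting.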
Modern continuous integration tools provide the capability for the pipeline process itself to be scripted. This is immensely powerful as it provides a fully coded development process that is itself under configuration management control within the source repository. This provides a history of changes made to the development process in sync with changes to the software and firmware and maintains the full audit trail necessary for regulated industries.
Continuous Delivery: Automate everything
Automation is critical to building a reliable pipeline, particularly when targeting embedded platforms. If a build is going to be reproducible, everything must be automated. Often the first step is to automate building and deploying code onto the target device. This could be as simple as building the FPGA code and downloading the resulting image onto the device via JTAG.
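That first step might be wrapped in a script along these lines. The tool names here (`fpga-tool`, `fpga-programmer`) and their flags are placeholders for whatever batch-mode entry points your vendor toolchain provides; consult its command-line documentation for the real invocations.

```python
# Sketch of automating an FPGA build and JTAG download. Tool names and
# flags are placeholders, not a specific vendor's CLI.
import subprocess

def build_bitstream(tcl_script, runner=subprocess.run):
    """Drive the FPGA toolchain in batch (non-GUI) mode from a checked-in script."""
    return runner(["fpga-tool", "-mode", "batch", "-source", tcl_script], check=True)

def program_via_jtag(bitstream, cable="jtag0", runner=subprocess.run):
    """Download the built image onto the target over JTAG."""
    return runner(["fpga-programmer", "--cable", cable, "--write", bitstream],
                  check=True)
```

Driving the build from a version-controlled script, rather than the GUI, is what makes the firmware build repeatable on a CI agent.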
Difficulties often arise here because the tools for building FPGA code lag significantly behind the equivalent software tools in terms of automation. They are often built primarily around a GUI, with running from the command line an afterthought.
A significant investment in scripting is usually needed to get an image built and deployed onto the target. In addition, the scripts developed to automate deployment need the same level of care, attention and quality as the released code: any intermittent failures of the test harness and flaky tests will become a constant source of frustration.
The other prime area for automation is hardware configuration. For example, where a device has different power configurations, instead of the traditional approach of manual testing, the operation of a connected programmable power supply should be automated to test switching between the various modes.
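A hedged sketch of that idea follows. The SCPI-style command strings, the voltage-per-mode table and the transport interface are all assumptions for illustration; a real bench supply is driven over serial, USB or LAN with its own command set.

```python
# Sketch: switching a device through its power modes automatically via a
# programmable supply. Commands and voltages are illustrative assumptions.
MODES = {"low-power": 3.3, "normal": 5.0, "high-performance": 12.0}

class PowerSupply:
    def __init__(self, transport):
        self.transport = transport  # anything with a .write(str) method

    def set_mode(self, mode):
        """Send SCPI-style commands to select the voltage for a mode."""
        self.transport.write(f"VOLT {MODES[mode]:.1f}")
        self.transport.write("OUTP ON")

def sweep_modes(supply, check):
    """Switch through every mode and run the supplied check after each one."""
    results = {}
    for mode in MODES:
        supply.set_mode(mode)
        results[mode] = check(mode)
    return results
```

Because the transport is injected, the sweep logic can be exercised in CI with a fake transport, while the same code drives the real instrument on the test bench.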
For hardware jumper configurations, use USB-controlled switches. If test equipment is needed, its setup and execution should be automated too, so that all configurations of the system can be tested automatically. However, manual testing is still likely to be required, and it should focus on exploring areas of the system where correct operation is difficult to judge automatically (for example, user interfaces).
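Enumerating "all configurations of the system" is itself easy to automate. The sketch below generates every combination of a few illustrative configuration axes (the axis names and values are assumptions, not from a real product) so the automated tester can iterate over them.

```python
# Sketch: enumerate every combination of automatable hardware settings so
# the tester exercises them all. The axes below are illustrative.
from itertools import product

AXES = {
    "power_mode": ["low", "normal", "high"],
    "jumper":     ["pos-a", "pos-b"],   # switched via a USB-controlled relay
    "interface":  ["uart", "ethernet"],
}

def all_configurations(axes=AXES):
    """Yield one settings dict per combination of axis values."""
    names = list(axes)
    for values in product(*(axes[n] for n in names)):
        yield dict(zip(names, values))
```

With three axes of 3, 2 and 2 values this yields twelve configurations, which is exactly the kind of exhaustive, tedious sweep that should never be done by hand.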
Many of these continuous delivery practices are discussed in books such as Jez Humble’s Continuous Delivery, which provide great resources to help companies get started with continuous delivery. This approach can greatly benefit product development projects that involve hardware systems. Equally, while there are challenges around the investment required to secure the necessary tools and automation, that investment can bring real benefits such as rapid release, high quality and constant feedback into the physical product development world.