Big data might seem like an unlikely hero for manufacturing companies. Big data analysis might appear better suited to data rich industries like telecoms, advertising or social media.
But in fact manufacturing has a long history of using quality management suites to improve understanding of processes and products.
The accepted international standard is ISO 9001, whose roots date back to the 1980s. It relies on transparent, interoperable systems which allow companies to fully audit supply chains for quality and consistency. It means any component of a product such as a car can be traced right back to the workbench where it was made.
It is now so well-embedded that it is often a requirement for companies wishing to supply to big manufacturers.
Moves to ‘just-in-time’ logistics and supply chains also pushed manufacturing towards an increased use of technology. Adoption of RFID systems, especially on pallets and other parts of logistics hardware, meant the creation of a prototype ‘internet of things’ infrastructure in many factories and distribution centres.
All these existing tools bring with them a wealth of data – but this resource has not typically been exploited by the manufacturing sector.
The most important recent change for the manufacturing industry’s adoption of big data tools is the easier availability of cloud-based big data services, which allow companies to gain competitive advantage from information their systems are already collecting.
This means that what were once expensive bespoke systems which only made sense for the largest companies are now a realistic proposition for small and medium-sized companies too.
Recent research from the Alan Turing Institute and Warwick Analytics found widespread optimism about the adoption of big data in manufacturing – 92 per cent of respondents expected their projects to deliver ten per cent improvements in business performance.
The main areas of expected improvement were quality and yields, and increased production throughput through better analysis of each stage of production.
Survey respondents also saw possibilities for better customer service and longer warranties by bringing together data from production systems and CRM systems. They also predicted a speedier time to launch new products, a bigger role for predictive and preventative maintenance and wider improvements to supply chain management.
But there are still barriers to successful big data projects.
The biggest issue is still data held in different legacy systems which either take too long to join up or are too expensive to change. Systems in manufacturing have long expected lifetimes, so many struggle with even basic connectivity or interoperability – such things simply weren’t part of the design spec fifteen or twenty years ago.
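Joining up records from two such systems often comes down to finding a shared identifier. A minimal sketch in Python with pandas, assuming hypothetical exports from a production system and a quality system keyed on a part ID (all names and values here are invented for illustration):

```python
import pandas as pd
from io import StringIO

# Hypothetical export from a production (MES) system
mes_csv = StringIO(
    "part_id,station,cycle_time_s\n"
    "P-001,press-3,41.2\n"
    "P-002,press-3,39.8\n"
)
# Hypothetical export from a quality management system
qms_csv = StringIO(
    "part_id,inspection_result\n"
    "P-001,pass\n"
    "P-002,fail\n"
)

mes = pd.read_csv(mes_csv)
qms = pd.read_csv(qms_csv)

# Join on the shared identifier so production and quality records line up
combined = mes.merge(qms, on="part_id", how="left")
print(combined)
```

In practice the hard part is rarely the join itself but agreeing on that shared identifier across systems that were never designed to talk to each other.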
The second set of barriers is arguably more difficult to deal with – problems with the data itself.
Many big data projects uncover serious issues with the information the business is collecting – whether it is ‘dirty’ and takes too long to clean, or there is so much ‘noise’ that it is impossible to discern the signal. Other data can be so unstructured that making proper use of it remains difficult.
Some of these problems can be tackled within the data project itself.
There are experts and tools to clean data and to sift signals from noise. There are also systems to analyse and to better visualise data which might help reveal useful insights.
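To make the cleaning step concrete, here is a minimal sketch using pandas on a hypothetical sensor log showing three common ‘dirty data’ problems – duplicate rows, unparseable readings and missing values (the column names and figures are invented for illustration):

```python
import pandas as pd

# Hypothetical sensor log: a duplicated row and a reading logged as "n/a"
raw = pd.DataFrame({
    "timestamp": ["2024-01-01 08:00", "2024-01-01 08:00",
                  "2024-01-01 08:05", "2024-01-01 08:10"],
    "temp_c": ["21.5", "21.5", "n/a", "22.1"],
})

clean = (
    raw.drop_duplicates()                                         # remove repeated rows
       .assign(
           timestamp=lambda d: pd.to_datetime(d["timestamp"]),    # parse timestamps
           temp_c=lambda d: pd.to_numeric(d["temp_c"],
                                          errors="coerce"),       # bad values become NaN
       )
       .dropna(subset=["temp_c"])                                 # drop unusable readings
)
print(clean)
```

Real projects involve far messier data than this, but the shape of the work is the same: deduplicate, coerce to consistent types, and decide what to do with what cannot be salvaged.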
But even with all this help a surprising number of big data projects do hit this wall – the data is simply not good enough in the first place.
This is not just a problem for the project but also a bigger problem for the business. There is no point collecting data which has no use – either because it is wrong or because it is in a format which makes it unusable.
It will not make you popular, but don’t be surprised if a vital early step in a big data project is going back to the beginning and making quite fundamental changes to how data is collected in the first place.
The old maxim of ‘garbage in, garbage out’ remains true and no company can gain advantage from its data unless it is clean, accessible and usable from the start.