The Internet of Things (IoT) is beginning to provide the 'wow factor' we've come to expect from successful new technologies. Smart tennis racquets that record ball speeds, toothbrushes that keep track of our daily habits and baby sleep suits that send audio and data to a parent's mobile device all have a certain "cuteness" and appeal - and are suddenly appearing in front of us like signposts to a new, smarter and connected future.
Yet manufacturers of these products may have only a short window in which to take advantage of their first-to-market status. When our homes, cars, offices, factories and so on are all run by invisible wireless sensors, these sensors will need to be mass produced and will therefore become less expensive. With inexpensive sensors widely available, connected products will proliferate and prices will be undercut.
So where is the real IoT profit going to come from? The UK government certainly thinks that there is money to be made. Its announcement last month of an additional £45m of funding brings total funding for IoT research to £73m.
But there are two groups of companies who could benefit here. First are those that design and manufacture the products - and then there are the data analysts who will help the manufacturers, consumers or business buyers manage and create value from the data produced by them. My money is on the latter eventually becoming the more profitable.
So is this all down to big data? Yes and no - data from the IoT will make our current volumes look relatively insignificant. "A billion is the new million," commented one senior director describing the sheer proliferation we are about to experience.
But it's not just volume; it's the variety and velocity of data too. Take electricity meters, for example. Traditionally these were read and the data gathered once a quarter - smart meters now generate a reading every 15 minutes, nearly 9,000 times as many data points per subscriber each year. Next-generation aeroplanes such as Boeing's 787 Dreamliner create terabytes of sensor data per flight. Multiply this by the thousands of planes in a major airline's fleet and you have some idea of the volumes that will be pouring in to be managed and analysed.
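The jump in meter-reading volume is simple arithmetic. A short sketch (assuming a standard 365-day year, with one reading per quarter as the traditional baseline):

```python
# Quarterly reads vs. smart-meter reads every 15 minutes.
# Baseline: one manual reading per quarter = 4 readings per year.
quarterly_per_year = 4

# Smart meter: one reading every 15 minutes, all year round.
minutes_per_year = 365 * 24 * 60
smart_per_year = minutes_per_year // 15   # 35,040 readings per year

print(smart_per_year)                         # 35040
print(smart_per_year // quarterly_per_year)   # 8760 - nearly 9,000x more
```

Scale that ratio from one subscriber to a utility's entire customer base and the "billion is the new million" remark becomes concrete.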
In fact, new tools are emerging all the time to help businesses master the complexity of this big data environment without becoming highly technical specialists. One of the key processing frameworks for big data, Hadoop, is still relatively new, and there are still too few IT professionals who are fully proficient in it. NoSQL, the newer family of database architectures, is not yet widely understood, and even database experts aren't necessarily up to speed with it.
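Hadoop's core programming model, MapReduce, can be sketched in a few lines of plain Python. This toy word count is only an illustration of the two phases; a real Hadoop job distributes the map and reduce steps across a cluster and a fault-tolerant file system:

```python
from collections import defaultdict

def map_phase(lines):
    # Like a Hadoop mapper: emit a (key, value) pair per word.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # Like a Hadoop reducer: group by key and sum the values.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["smart meters send data", "sensors send data"]
print(reduce_phase(map_phase(lines)))
# {'smart': 1, 'meters': 1, 'send': 2, 'data': 2, 'sensors': 1}
```

The appeal of the model is that each phase is trivially parallel, which is exactly what makes it hard to reason about at cluster scale - hence the shortage of proficient specialists.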
So the best new solutions are those that hide the underlying complexity. This makes data analysis less difficult, turning data scientists and anyone with data integration skills into 'super' big data developers.
These new tools let users work in an intuitive visual development environment, building data flows and manipulating data graphically. In this way, raw data is translated into a much more accessible format. They also help to bridge the gap between data scientists, who are experts at statistical analysis but have few programming skills, and programmers, who struggle to interpret the data scientists' and business users' requirements.
The latest data integration tools take this process one stage further, combining two key benefits: a visual development environment allows a standard development team to do most of the development work, and that in turn frees up senior expert developers to be more innovative and create new intellectual property (IP) for the business.