Fog computing builds on agile thinking: encouraging value, speed and flexibility in IT infrastructure.
Cloud computing is widely appreciated as a necessity for Internet of Things (IoT) networking. But what is the plan of action in the event of connection failure? Fog computing – a facet of edge computing – is an increasingly vital topic for CIOs to study up on, and could prove the missing link between present and future IoT project success.
So what is fog computing? The OpenFog Consortium, a vendor-neutral group headed up by experts from Intel, Cisco, Dell, Microsoft and others, defines fog computing as “a distributed architecture which spans the continuum between the cloud and everything else [ie nodes, sensors and other IoT-enabled devices and compute platforms].” The benefits of fog for enterprise amount to faster processing as devices are in closer proximity to data generation sites, as well as cost savings from lower use of bandwidth. As ever with cloud computing endeavours, a flexible and business-specific architecture is essential, meaning long-term planning is key to ROI.
Cisco, which coined the term, said: “Similar to Cloud, Fog provides data, compute, storage, and application services to end-users. The distinguishing Fog characteristics are its proximity to end-users, its dense geographical distribution, and its support for mobility.”
Sound ideal for Internet of Things capability? Business leaders are thinking the same, with a recent survey of OpenFog Consortium members finding IoT is the top application area for fog technology by a considerable margin (70%), followed by industry-specific applications and 5G.
Autonomous vehicles for safe, easy transportation; subsurface geophysical imaging for fossil fuel detection; process manufacturing alteration for improved product variation – just some of the benefits which well-networked fog nodes can bring about in industry. Inspired? CBR has looked at the key considerations any IT decision maker or enterprise manager should be mindful of.
The major draw of fog computing for Internet of Things strategic transformation is the promise of reduced latency. In fact, latency and network bandwidth are the two most-cited reasons for interest in fog among OpenFog Consortium members. This is achieved by moving computation ‘to the edge’ of the network. Radically, this perspective moves away from enterprise dependency on the cloud, but promises speedier operations with the potential for lower cloud computing costs. This has major implications for businesses focused on mobility, particularly in light of UK bandwidth speeds and the still-distant arrival of 5G.
As Neil Bramley, Toshiba B2B Client Solutions Business Unit Director of Northern Europe, explains: “The ability to process data at the edge of the network and close to its originating source is invaluable. By minimising the strain on cloud storage services, organisations can ensure faster operations through reduced latency, only sending the most relevant data to the cloud. By spreading the load in this way, bottle-necks are greatly reduced, if not eliminated.”
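The edge-filtering pattern Bramley describes can be sketched in a few lines. This is a minimal illustration, not any vendor's API: the sensor names and the temperature threshold are hypothetical, and real deployments would run far richer logic on the node.

```python
# Illustrative sketch: process sensor readings on the edge node and
# forward only the relevant ones to the cloud, reducing bandwidth use.
# The threshold and sensor names below are assumptions for illustration.

TEMP_THRESHOLD = 75.0  # hypothetical cut-off for "relevant" readings

def process_at_edge(readings):
    """Return only the readings worth sending upstream to the cloud."""
    return [r for r in readings if r["temp_c"] > TEMP_THRESHOLD]

readings = [
    {"sensor": "line-1", "temp_c": 21.4},
    {"sensor": "line-2", "temp_c": 78.9},  # exceeds the threshold
    {"sensor": "line-3", "temp_c": 22.1},
]

to_cloud = process_at_edge(readings)
# Only one of the three readings crosses the network.
```

The point of the pattern is that the decision about what matters is made next to the data source, so the cloud link carries exceptions rather than the full stream.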
More agile architecture equals faster data delivery
Fog computing can be seen as a practice built on the much-used foundation of agile thinking; the software principle encouraging value, speed and flexibility in IT infrastructure. The improvement in latency naturally means faster compute time and data delivery.
Citing the example of sensors on the factory floor, Yen-Sze Soon, Managing Director, Accenture Digital, said: “If all of these decisions [on performance, throughput and reliability] are made in the cloud, the reliance is on connectivity speeds. But the more controllable element for data processing is the hardware. In situations where time-to-decision is vital, the difference in speed between edge and cloud can be measured in pounds and pence.” The financial implications are clear: more precise control through cutting-edge hardware could spell improved business processes and TCO.
Network availability and its impact on CX/UX
In a world of ever-increasing competition for digital business services, customer/user experience (CX/UX) is a more important KPI in 2018 than in any previous year. As Toshiba’s Bramley reasons: “With 5G’s impending arrival set to drive IoT adoption within the business sector, Edge Computing will be more necessary than ever to help contend with ever-growing swathes of data.”
Keeping up with network innovation must not fall by the wayside for the IT decision maker – or customers will vote with their feet. Telecoms choices should be taken into consideration during any enterprise fog computing endeavour.
Yet again, fog benefits are ripe here. “By adding nodes closer to the end user, this reduces the geographic distance that data has to travel,” said Appal Chintapalli, Vertiv EMEA VP of integrated rack systems. “Coupling this with the added capacity that each node delivers, data interaction and analysis is sped up which results in a significantly improved end-user experience.” On the other hand, faster data delivery and improved UX is contingent on a highly available network infrastructure.
Fog enables evaluation of local performance of an application or business process, proving particularly useful for a number of SME needs. Yet a data analytics platform is most likely required for this aspect of the IT strategy, meaning investment in PaaS software and workforce skills could well be necessary.
Nevertheless, businesses typically fall down through a lack of preparedness when it comes to data management ahead of fog implementation. “As with all projects associated with the collection and analysis of data (such as IoT related projects) customers should also think about how they harness and action that data analysis,” said Ian Waters, Director of Solutions Marketing, ThousandEyes. “We know from experience of big data type projects that the vast majority of the data captured was never really properly leveraged to deliver business value.”
Time saving: only anomalous data are flagged
An ingenious innovation of fog computing is its potential for cost savings through the system’s propensity to retrieve only data that needs immediate human action. In tandem with machine learning and artificial intelligence technologies, fog could save IT managers, engineers and maintenance workers oodles of hours through its smart data selection. We knew all along that computers were better at handling big data than human brains, so why not make best use of this in the latest edge computing upgrades?
In the words of Tom Fisher, CTO, of MapR: “What makes Edge different is the ability to enable real-time analytics. In the world of IoT, this is essential for anomaly detection and time series (trending) data.” Fisher believes that “analytics, over time, will replace the functional application logic by leveraging this same local compute for running and feeding models” based on AI and ML.
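The kind of anomaly detection Fisher describes can run as lightweight local compute on the node itself. The sketch below is an assumption about one simple approach (a rolling z-score over a window of recent readings); the window size and threshold are illustrative choices, not values from MapR or any product.

```python
# Hedged sketch of edge-side anomaly detection on time-series data:
# flag a reading when it sits more than z standard deviations from
# the mean of the preceding window. Window and threshold are
# illustrative assumptions.
from statistics import mean, stdev

def anomalies(series, window=10, z=3.0):
    """Return the indices of readings that deviate sharply from recent history."""
    flagged = []
    for i in range(window, len(series)):
        recent = series[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma and abs(series[i] - mu) > z * sigma:
            flagged.append(i)
    return flagged

data = [10.0, 10.2, 9.9, 10.1, 10.0, 9.8, 10.3, 10.1, 9.9, 10.0, 42.0]
print(anomalies(data))  # -> [10]: only the spike is flagged for escalation
```

Everything that is not flagged stays local; only the anomaly (or a trend summary) needs a human decision or a trip to the cloud.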
Cost savings: telecommunications, data centre capacity
When it comes to budgeting, the sky cannot be the limit. Fisher of MapR outlines how companies utilising fog appropriately can save big on telecoms: “The enterprise will find that by investing in and leveraging compute, they will create cost savings by bringing back only the data required. Data anomalies or summary performance metrics can be identified without moving the entire data set, representing a significant reduction in telecommunications costs.”
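A back-of-the-envelope comparison makes Fisher's point concrete. The figures below are purely illustrative (synthetic readings, JSON payloads as a stand-in for a real wire format), but they show why shipping a locally computed summary instead of the raw data set cuts telecoms costs.

```python
# Illustrative comparison: shipping every raw reading to the cloud
# versus computing summary performance metrics at the node and
# shipping only those. Data and payload format are assumptions.
import json

raw = [{"t": i, "value": 10.0 + (i % 7) * 0.1} for i in range(1000)]

summary = {
    "count": len(raw),
    "min": min(r["value"] for r in raw),
    "max": max(r["value"] for r in raw),
    "mean": sum(r["value"] for r in raw) / len(raw),
}

raw_bytes = len(json.dumps(raw).encode())
summary_bytes = len(json.dumps(summary).encode())
print(f"raw: {raw_bytes} B, summary: {summary_bytes} B")
# The summary payload is a tiny fraction of the raw stream.
```

The saving scales with the data volume: a node that summarises a million readings sends roughly the same few hundred bytes upstream as one that summarises a thousand.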
As with any digital investment – and even more so given the emergent character of IoT technologies – IT decision makers must keep a macro perspective on how today’s fog computing spend will affect adjacent budget segments such as telecoms, data storage and maintenance costs.
Nevertheless, fog computing presents several tantalising opportunities for cost-saving. Vertiv’s Chintapalli goes into more detail: “Deploying smaller nodes also reduces costs; they’re simpler and faster to install than larger facilities and deployment can be efficiently replicated at each new site. Longer term, a fog computing model strategically complements organisations’ cost optimisation efforts through power, cooling and space savings.”
Decision-making: node proximity
A key benefit of fog to bear in mind is the increase in system control and adaptation which the nodes facilitate. Joe Fagan, Senior Director for EMEA Cloud Initiatives at Seagate, explains: “In some cases, it is much more efficient to process data near its source and send only the data that has value over the network to a remote data centre.” Fagan identifies an “increasing need for data to be available in real time” which “will heighten the focus on ensuring low-latency responsiveness from edge storage solutions.”
Soon gives further insight to CBR into how fog nodes aid businesses’ ability to meet these customer expectations: “The data is analysed to optimise operations and for other decision making on a real time basis, in the cloud or on a central location potentially where compute power is available. The output is then used to adjust operations real time.”
Data privacy, security and cyberattacks
Last but potentially most important of all is the data security aspect of fog computing. As with any IoT project, the presence of multiple weakly-secured devices connected to a network presents more points of vulnerability for cyberattacks.
That said, Bramley sees an opportunity for cyber security through the fog. He told CBR: “An Edge-focused strategy can significantly strengthen the network by keeping major threats away from its core. With activity taking place between local end-points, threats such as malware or infected files can be identified at an earlier stage and contained at device level, rather than contaminating the entire network.” Individual privacy can also be increased by processing person-specific data in the fog, rather than it being collected and stored in a centralised database available to company staff.