Joe Baguley, EMEA chief technologist at VMware, writes for CBR on how businesses must change their mindset to get the most from big data, and what the trend actually means for CIOs and the IT department.
Throughout 2012, 'big data' was in danger of becoming a term as overused and confusing as 'the cloud' had become in 2011. Cut through the marketing hype, however, and there's a growing realisation that big data could be an opportunity for businesses of all sizes to gather meaningful insights to drive growth and their bottom lines.
Economies of Scale
The reality is that, around the world, we are now generating data at an accelerating rate, and this is only going to continue. Data is being created from more sources and in greater volumes, offering organisations more insight and a better idea of what works, what doesn't work and what will work in the future.
The desired end result is, and should rightly be, competitive advantage, in what remains a tough economic environment. Take, for example, a company that has not just moved from monthly to weekly or daily sales reports or stock data, but is now able to deliver real-time reporting, 24/7, to all relevant employees, even those on mobile devices. Increased flexibility, insight and responsiveness: the benefits of having this information to hand should not be underestimated.
The challenge lies in having the ability to effectively store, analyse and consume all of this data, and this requires changes in two areas: mind-set and reality.
A Change in Mind-set
Most organisations are sitting on enormous amounts of data that have yet to be mined. Businesses have to start focusing on this existing information, which often sits in large databases closely guarded by IT departments, and unlocking it.
IT departments should realise that data integration is being done now, and has always been done, by existing users in the company. Those 'business logic experts', the people in charge of the spreadsheets and number crunching, have been pulling data from various Business Intelligence systems and using their own experience to create tools to help them run operations. What's now needed is a step-change in thinking, to realise that their role is heading towards providing a data feed for the business as a whole.
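To make the shift concrete, here is a minimal sketch of turning the kind of aggregation such an expert might do by hand in a spreadsheet into a reusable feed other teams can consume. The record fields, function name and JSON output format are hypothetical illustrations, not taken from any particular BI system:

```python
import json
from collections import defaultdict

def sales_feed(records):
    """Aggregate raw sales records into a per-region feed.

    'records' is a list of dicts with hypothetical 'region' and
    'amount' fields -- the kind of rows an analyst would otherwise
    crunch in a spreadsheet.
    """
    totals = defaultdict(float)
    for row in records:
        totals[row["region"]] += row["amount"]
    # Serialise as JSON so other teams and tools can consume the
    # same numbers instead of each re-deriving them by hand.
    return json.dumps(dict(totals), sort_keys=True)

raw = [
    {"region": "EMEA", "amount": 120.0},
    {"region": "APAC", "amount": 80.0},
    {"region": "EMEA", "amount": 30.0},
]
print(sales_feed(raw))  # {"APAC": 80.0, "EMEA": 150.0}
```

The point is less the aggregation itself than the interface: once the logic lives behind a feed rather than inside one person's spreadsheet, it can be combined with other feeds across the business.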
A Change in Technology
Firstly, there's the need to place interfaces on this data to allow it to be combined with other data and data feeds, to provide holistic, new insight. But preparing for new, larger and faster data necessitates a new approach to building IT services. An almost industrial approach is required to building a scalable and resilient platform for the future that can grow and develop alongside the ever-increasing amount of statistics and figures.
The Software Defined Datacentre, a concept which has started to gain real traction over the second half of 2012, is integral to this. It takes all of the previously siloed technology that was tied to particular hardware stacks, and defines a path towards everything being performed in software. The result? A fluid and agile infrastructure defined and controlled in software.
Virtualisation of compute is the first step in moving to the Software Defined Datacentre, but this needs to be allied with moves to both virtualised storage and networking. This can then enable the adoption of big data technologies such as Hadoop, which has recently been virtualised as part of the Serengeti open-source project and is designed to allow enterprises to harness very large amounts of data for competitive advantage.
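For readers unfamiliar with Hadoop, its MapReduce programming model can be sketched in a few lines of plain Python. This toy word count mimics the shape of a Hadoop Streaming job (a map phase emitting key-value pairs, then a reduce phase summing them) but runs locally in one process, so the distribution, fault tolerance and scheduling that Hadoop itself provides are deliberately not shown:

```python
from collections import defaultdict
from itertools import chain

def mapper(line):
    # Map phase: emit a (word, 1) pair for every word in an input line.
    return [(word.lower(), 1) for word in line.split()]

def reducer(pairs):
    # Shuffle + reduce phase: group the emitted pairs by key and sum
    # the counts. In real Hadoop this happens across many machines.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["big data big insight", "data drives insight"]
word_counts = reducer(chain.from_iterable(mapper(l) for l in lines))
print(word_counts)
```

The value of Hadoop is that the same two-function shape scales from this toy example to datasets far too large for any single machine.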
So while the hype may continue, what's clear is that big data delivers a significant opportunity for businesses to do things better. More data, more insight, better forecasting and informed decision-making are desirable for any organisation, regardless of size or vertical. However, it does necessitate a change to newer, more scalable technologies and architectures, as the associated demands outstrip what existing infrastructures can deliver.
Joe Baguley, EMEA chief technologist at VMware