“Businesses are having to spend as much as three-quarters of their total database management cost on labour”
Humans now generate an estimated 2.5 quintillion bytes of data every single day, with more data created in the past two years than in the rest of human history combined, writes John Abel, Vice President of Cloud and Innovation for UK and Ireland, Oracle.
Managing this growing flood is complex, and the task comes with a high level of responsibility. Unplanned downtime costs businesses an average of £3.6 million every year, while a single data breach costs around £6 million. But throwing more bodies at the problem isn't efficient, nor is it a guarantee of success.
The 24/7 demands on businesses, combined with mounting security challenges, mean that 'manual' management is no longer an option.
Four Major Hurdles
Today’s digital business requires a new approach, in particular around the four major hurdles currently hindering organisations from harnessing the full power of data:
IT budgets are getting eaten up by the complexity of managing data manually
Businesses are having to spend as much as three-quarters of their total database management cost on labour, with the focus being on mundane administration and operational tasks rather than high-value work.
The reliability of the systems needed to support today's 24/7 business is hard to ensure
Security continues to pose a threat
The Oracle and KPMG Cloud Threat Report 2018 found that only 14 percent of respondents felt able to effectively analyse and respond to the vast majority (75-100 percent) of their security event data.
Businesses continue to be stunted by the amount of time spent on manual management
This is despite the fact that even a 10 percent increase in data accessibility translates into an additional £51.7 million in net income.
The key to overcoming these hurdles lies in artificial intelligence, machine learning and automation. Combined, these technologies let businesses manage and extract value from their information more easily, more effectively, and with less effort. One technology in particular unlocking new levels of value is the autonomous database.
From Automation to Autonomous
Automation in IT isn’t new; there has long been a desire to let the machines work on their own.
From the start of the internet era and the dawn of connected devices in the early 2000s, machine learning and automation have been used to help databases self-tune, 'hot' patch, run fault diagnostics for faster resolution, and expand their performance, scalability and availability on demand.
The hardware side, too, followed a similar pattern. Pre-built, pre-configured, 'engineered' systems optimised for key workloads gave rise to super-fast infrastructure that could likewise self-tune, self-repair and self-scale. Beyond easier data management, for the first time IT departments no longer had to hand-build infrastructure from parts sourced from multiple vendors. More recently, cloud computing has helped DBAs manage information with greater ease and speed, and at reduced up-front cost.
The Autonomous Evolution
The pace of change is continuing. With AI and next-generation cloud services now established, the autonomous database has arrived. Built on the core traits of being self-driving, self-securing and self-repairing, it offers unprecedented availability, performance and security, helping to eliminate human error. Autonomous is about intelligent automation: it is like comparing a cassette player that only plays music to a smartphone that does so much more.
With a self-driving system that uses built-in machine learning algorithms to manage itself, businesses can lower costs and increase productivity, freeing staff to be deployed on higher-value tasks. Being self-securing, the database can automatically apply patches against external attacks as soon as they become available and automatically encrypt data, lowering risk.
As Clark Kho of Accenture explains, “Most organisations don’t patch immediately. So the ability to be able to patch immediately and unconsciously gives better peace of mind that security is not something we need to worry about.”
As for self-repairing, the autonomous database is far more reliable. On average, just 4.2 seconds elapse between a database being healthy and crashing; only a machine could identify and react that quickly. The system constantly regulates its own operation, ensuring processes run smoothly. This reduces planned and unplanned downtime, which can cost businesses up to 0.2 percent of revenue.
A Vanishing Advantage?
As with any new development, there is often a first mover advantage. As Forrester predicts, businesses that use AI, big data and the Internet of Things (IoT) to uncover new business insights “will steal almost £1 trillion per annum from their less informed peers by 2020.”
As with the time value of money (TVM), value captured from technology available now is potentially worth more than the same value captured later, because of the competitive advantage it gives users in the present.
History has proven this true. Early automation went some way towards freeing DBAs from drawing straws to see who would stay in over the weekend or work late on patches. Engineered systems made massive strides in speed to market, dramatically shortening set-up times, as well as speed to insight. Cloud, too, addressed the OPEX/CAPEX balance-sheet equation. But it doesn't take long for new performance levels to be perceived as the norm, giving rise to a demand for more.
Now is the Time to Act
As more data is generated each day, there will be even more pressure on businesses to make the most of what is available. Database management will be more crucial than ever, and emerging technologies like the autonomous database will soon become the norm as they help businesses boost innovation and financial returns – without boosting costs.