In today’s quarterly-driven world, the time it takes to realise the return on any strategic investment is critical. Put simply, agile, responsive businesses outperform slow-moving, reactive ones.
‘Time to value’ (or TTV) is increasingly driving business decisions as a result. And nowhere does this need to be more closely scrutinised than when investing in the technology that helps you make the right business decisions: the robust, efficient data analytics capabilities that are essential to today’s best businesses. Think about it: you only invest in technology to make yourself more efficient or to drive a better strategic (often transformational) outcome.
Logically, the sooner you succeed at both, the better your return on investment. But too many organisations either struggle to evaluate their TTV with any confidence or simply assume that they must passively accept the timeline set by their technology suppliers. It doesn’t have to be this way! Here are five easy-to-implement steps that will help *any* organisation improve its data analytics TTV:
Get serious about the Cloud as a means to drive better TTV
Just about everybody I engage with now knows that the time, cost and expertise required to procure, install and configure systems can be eye-watering. In fact, it can take weeks and significant cost just to turn the lights on before you can even begin any development. Yet people still persist with this approach. Granted, in certain circumstances it may be the right way to go. But in the great majority of technology deployments, cloud infrastructure and platforms provide an obvious way to significantly shorten the TTV for new projects, especially for organisations whose data [a] originates in the cloud or [b] can be easily moved to the cloud. The beauty of cloud platforms is that they present a much lower barrier to entry: they are lower cost, quicker to stand up and far easier to scale as needed.
The result? A much more flexible ecosystem from which to innovate: organisations can start small and scale compute as workloads and data volumes require. This is particularly important given the explosion of data being captured by organisations and subsequently needed for analysis. And if you are worried about creating complexity by mixing on-premises and cloud solutions, fear not: there are vendors that can manage hybrid environments, presenting the on-premises and cloud components to IT as a single, cohesive environment and managing the movement of data between them as needed.
This is an area where my company, WhereScape, has particularly focused because we see it as a significant pain point for organisations.
Automate your data analytics. Now!
Many of the organisations I see in the market are stuck in a time warp when it comes to managing data analytics. At the Extract, Transform, Load or Extract, Load, Transform (ETL/ELT) stage, manual processes (which take money and time) remain prevalent. And this manual development of ETL/ELT routines can be the greatest inhibitor of TTV. Why? Because it is slow, tedious work that is error-prone if rushed and not validated.
In addition, documentation is generally not done, or if it is, it is incomplete and quickly becomes out of date, which impacts future usage. This makes no sense when there are robust solutions available that can automate ETL/ELT code generation using best-practice methodologies (Data Vault, Dimensional, etc.) while ensuring that the generated code is optimised for the platform on which it will be deployed.
This approach can reduce time (and risk) by up to 90%, allowing companies to develop at the pace the business needs, while also ensuring that the generated code is consistent, robust and well-documented. To borrow an oft-used phrase, what’s not to like?
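To make the idea concrete, here is a minimal sketch of what metadata-driven code generation looks like: a table definition lives as metadata, and the load SQL is generated from it rather than hand-written. The metadata format and table names here are hypothetical illustrations, not WhereScape’s actual engine or model.

```python
# Minimal sketch of metadata-driven ELT code generation.
# The metadata shape and table names are invented for illustration;
# real automation tools use far richer models (lineage, types, keys).

def generate_load_sql(meta):
    """Generate an INSERT-SELECT load statement from table metadata."""
    cols = ", ".join(c["name"] for c in meta["columns"])
    exprs = ", ".join(c.get("expr", c["name"]) for c in meta["columns"])
    return (
        f"INSERT INTO {meta['target']} ({cols})\n"
        f"SELECT {exprs}\n"
        f"FROM {meta['source']};"
    )

# Example: a dimension table described once, as data.
customer_dim = {
    "target": "dim_customer",
    "source": "stg_customer",
    "columns": [
        {"name": "customer_key", "expr": "md5(customer_id)"},
        {"name": "customer_name"},
        {"name": "load_date", "expr": "current_timestamp"},
    ],
}

print(generate_load_sql(customer_dim))
```

Because the SQL is derived from metadata, regenerating it after a change is trivial, and documentation can be produced from the same metadata, so it never drifts from the code.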
And continue to automate, wherever you can
I passionately believe that organisations should commit to automating the entire lifecycle of managing their data infrastructure, not just ETL code generation. Whether it is on-premises, cloud, or a mix, you should be obsessive about reducing the time, cost and risk in all phases of the lifecycle to maximise your TTV. But, again, the deployment of new (or changed) components between environments, such as Development, Test and Production, is still frequently done manually, slowing down TTV in the process. Ensuring that schema changes are made and code is deployed correctly can be quite involved, and fairly risky if it relies on a human performing the tasks manually. If we look at traditional software development best practice, there is a reason why this is automated!
So, look to automate this aspect of the lifecycle, both to speed up deployments and to reduce the risk during deployment. I’m now seeing more and more customers wanting to apply software development best practices and methodologies to their data infrastructure environments, which means automating as much as possible. It’s a good maxim to adopt to improve your TTV.
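The software-development analogue here is the migration runner: ordered change scripts applied automatically, with a record of what has already run so deployments are repeatable. The sketch below is a toy illustration of that pattern using SQLite and invented file names, not any vendor’s deployment tool.

```python
# Toy illustration of automated, ordered schema deployment.
# Migration file names (e.g. 001_create_stage.sql) are assumptions
# for the example; production tools add locking, checksums, rollback.
import pathlib
import sqlite3

def apply_migrations(conn, migrations_dir):
    """Apply *.sql files in name order, skipping ones already recorded."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS schema_history (name TEXT PRIMARY KEY)"
    )
    applied = {row[0] for row in conn.execute("SELECT name FROM schema_history")}
    for path in sorted(pathlib.Path(migrations_dir).glob("*.sql")):
        if path.name in applied:
            continue  # already deployed in a previous run
        conn.executescript(path.read_text())
        conn.execute("INSERT INTO schema_history (name) VALUES (?)", (path.name,))
        conn.commit()
```

Because the history table records every applied script, the same command can be run against Development, Test and Production and each environment converges on the same schema without manual steps.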
Don’t let your data infrastructure lag the pace of business change
Business needs are changing faster than ever in today’s dynamic and fragmented economy, where competitors can appear (or disappear) overnight. With that pressure comes the expectation that your organisation’s data infrastructure can change just as fast. But quickly understanding what needs to change, as well as the potential impacts on downstream consumers of the data, is imperative to managing the risk and time associated with evolving the existing environment. And if the environment is not well documented (yes, it has been known to happen!) or not conducive to clear lineage and impact analysis, the risk increases further.
Look to ensure that your system supports detailed lineage and impact analysis, providing a clear line of sight into how the environment can be successfully enhanced in the future. Invest in automation solutions that allow rapid code generation, deliver full impact analysis for potential changes, and automate code regeneration when changes are made. This significantly reduces both the time needed to make changes and the risk of doing so. Trust me, it’s an investment worth making.
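At its core, impact analysis is a walk over a lineage graph: start from the object you want to change and collect everything downstream of it. A minimal sketch, with invented table names standing in for what a real metadata repository would hold:

```python
# Sketch of impact analysis over a lineage graph: which downstream
# objects are affected if a given table changes? Table names are
# hypothetical; real tools derive this graph from captured metadata.
from collections import deque

# edges: object -> objects that consume it directly
lineage = {
    "stg_orders": ["fact_orders"],
    "fact_orders": ["agg_daily_sales", "orders_dashboard"],
    "agg_daily_sales": ["exec_report"],
}

def impacted(obj, lineage):
    """Breadth-first walk collecting every downstream dependant of obj."""
    seen, queue = set(), deque(lineage.get(obj, []))
    while queue:
        node = queue.popleft()
        if node not in seen:
            seen.add(node)
            queue.extend(lineage.get(node, []))
    return seen

print(impacted("stg_orders", lineage))
```

With this view, changing `stg_orders` is immediately known to touch the fact table, the aggregate, the dashboard and the executive report, so the scope of testing and regeneration is clear before any work begins.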
Don’t let the business come from Mars and IT come from Venus
The Business tells IT what it wants. IT builds a solution that addresses the request, only to find, when presenting the work back to the Business, that what was built does not match what is actually needed. Sound familiar? Of course it does! And with it comes a massive impact on TTV. Collaborative, iterative development between IT and the Business can significantly reduce TTV by ensuring IT’s work delivers the expected value to the Business.
Using automation to engage in rapid prototyping and collaborative discussion early on in a project can bring IT and the Business closer together to detect nuances in vision and gaps between business needs and technical capability.
Automation solutions, like WhereScape’s, using a metadata framework, can shrink TTV by easing the transition from prototype to a robust, on-target solution. Additionally, using rapid prototyping to provide the Business an earlier view of the project’s progression can bolster trust and the relationship between the two groups – an additional benefit that can pay dividends well into the future.
As Gartner has recently reinforced, organisations are in a race to significantly reduce the time it takes to turn their technology investments into value. And those that are fastest with TTV will create enormous competitive advantage by ensuring that they are able to turn data into insights and, by doing so, will turn ideas into commercial success. Give yourself a chance in this race by heeding the five steps above and you will be in a great position to speed up TTV. It’s a race worth winning.