Many companies are grappling with the challenge of how to store and protect their critical long-term digital information assets – from important business records to unique company heritage. Executives are concerned about how they will store, protect and read digital files 100 years from now.
The answer usually focuses on having a robust storage strategy, either by using local tiered storage or via a scalable cloud storage service. The assumption is that if digital information is stored durably (multiple copies in multiple locations, with self-healing), it will remain intact for tens if not hundreds of years.
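The "self-healing" part of such a strategy can be sketched in a few lines: periodically checksum each replica and repair any copy that has drifted from the majority. This is a minimal illustrative sketch, not a real storage system; the function names and majority-vote policy are assumptions for the example.

```python
import hashlib
from pathlib import Path


def sha256(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def self_heal(replicas: list[Path]) -> list[Path]:
    """Compare replicas by checksum and rewrite any copy that disagrees
    with the majority. Returns the list of repaired paths."""
    digests = {p: sha256(p) for p in replicas}
    # Treat the most common digest as the authoritative version.
    values = list(digests.values())
    majority = max(set(values), key=values.count)
    good = next(p for p, d in digests.items() if d == majority)
    repaired = []
    for p, d in digests.items():
        if d != majority:
            p.write_bytes(good.read_bytes())
            repaired.append(p)
    return repaired
```

Note what this sketch does and does not do: it keeps the bits intact, but it says nothing about whether those bits will still be readable by future software, which is exactly the gap the rest of this article is about.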
Move over, traditional storage
As technology refresh cycles accelerate, most of these "robust storage strategies" are missing one critical point: you might be able to get all the ones and zeros back in the future, but because file formats and software are evolving at such an unprecedented pace, you probably won't be able to read the information using the software and devices of the future.
This is why Vint Cerf (one of the fathers of the Internet and now a Google VP) has been warning us about a "Digital Dark Age" in which the records of the 21st century could be lost forever as hardware and software become obsolete and digital files become unreadable. As Cerf commented, "What can happen over time is that even if we accumulate vast archives of digital content, we may not actually know what it is."
Just look at the changes we have seen in technology over the last 20 years – remember floppy disks, Smart Drives and CD-ROMs? More importantly, what about Lotus 1-2-3, WordStar, PageMaker, early versions of Word and Excel, and the many other formats that are now unreadable using today's supported software?
It’s because of this rapid evolution that digital preservation technology is moving from niche to mainstream. Digital preservation has for a long time been recognised as an essential service by memory and cultural institutions: look at the National Archives, Yale University and even Transport for London.
However, it's now moving up the government and corporate agenda as senior stakeholders realise that storage on its own is not enough. With more "born digital" content being produced every day, requirements are only getting more complex, and the need for organisation-wide digital preservation strategies is growing.
Forget 100 Years
The growing industry consensus is that because of file format and software obsolescence and the pace of technology refresh cycles, the "tipping point" is not 100 years, but more like 10. So if you have digital files and records being created today with a retention period of 10 years or more – or that are already 10+ years old – you run the risk of not being able to read or use them when they are required.
With up to 30% of digital information needing to be retained for long-term legal, regulatory or business value needs, many organisations are sitting on a ticking time bomb. They are at risk of not being able to find or produce a usable and trustworthy digital record, whether to respond to a regulatory case or to defend themselves against litigation. On top of legal considerations, companies also have to consider the associated financial consequences and damage to brand reputation.
10 Year Window of Opportunity
10 years is, therefore, a more realistic time-frame to consider when planning to protect critical and unique digital information assets.
The key is to do something now, whilst you can, to ensure critical digital information can be read tomorrow. Building on reliable storage, digital preservation adds tools to accurately identify which formats are being used, pinpoint those at risk and reliably migrate them into newer formats that can still be read.
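The identify-then-migrate workflow described above can be sketched roughly as follows. The magic-byte signatures shown are real, but the at-risk list and function names are illustrative assumptions – production tools such as DROID, backed by the PRONOM registry, do this far more thoroughly.

```python
from pathlib import Path

# A few well-known file signatures (magic bytes). Real format registries
# such as PRONOM hold many thousands of these.
SIGNATURES = {
    b"%PDF": "pdf",
    b"\x50\x4b\x03\x04": "zip-based (docx/xlsx/odt...)",
    b"\xd0\xcf\x11\xe0": "ole2 (legacy .doc/.xls)",
    b"\x89PNG": "png",
}

# Illustrative policy only: formats we would flag for migration.
AT_RISK = {"ole2 (legacy .doc/.xls)"}


def identify(path: Path) -> str:
    """Identify a file's format from its leading magic bytes."""
    header = path.read_bytes()[:8]
    for magic, fmt in SIGNATURES.items():
        if header.startswith(magic):
            return fmt
    return "unknown"


def scan(root: Path) -> list[tuple[Path, str]]:
    """Walk a directory tree and report files whose format is at risk."""
    return [(p, fmt) for p in root.rglob("*")
            if p.is_file() and (fmt := identify(p)) in AT_RISK]
```

The output of such a scan is the starting point for migration: each flagged file is converted into a current, well-supported format while the original is retained for authenticity.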
Throughout 2015, we saw large corporations begin to wake up to this as they have realised that digital preservation isn’t just about storing digital assets, but also about being able to find and use them as well as to prove their trustworthiness when required.
This means that the storage landscape is changing as organisations add digital preservation technology to their content management and information archiving systems. In addition, digital preservation is being used to accelerate legacy application decommissioning, freeing up vital IT budget by ensuring the records those applications contained are safely stored and accessible into the future.
Unfortunately, we’re still seeing many companies claim to want their storage to be a natural extension of the business and to be able to ‘store and forget.’ In reality, there is no ‘store and forget’ – only a proactive approach to safeguarding critical digital content and the use of digital preservation technology as part of the overall information governance lifecycle.
As Malcolm Gladwell explained in his 2000 bestseller The Tipping Point, problems can and will behave like epidemics, capable of sudden and dramatic changes in direction. Pinpointing the right intervention at just the right time can cause a cascade of change that could just guard against digital disaster. It's time to stop looking at 100 years, start looking at the new 10-year tipping point, and plan your digital preservation strategy accordingly.