Fear can be a wonderful thing. From an evolutionary point of view, terror of the unknown enabled our ancestors to avoid dangerous creatures and sticky situations that would have brought an early demise; yet over-caution has, at times, hindered our species’ progress.
Nowhere is this more evident than in our approach to medical advances. For example, the early pioneer of vaccination Edward Jenner was derided in a series of vicious pamphlets and savage cartoons that predicted grave (and ridiculous) consequences for those submitting to the smallpox vaccine.
Sometimes, though, we place too much trust in new life science technologies. The tragic and terrible consequences of prescribing thalidomide to pregnant women and the barbarous “science” of lobotomy both show that the history of medical science is not one of untrammeled success. But even technologies that have been tried and tested over decades of use, such as vaccination and genetically-modified organisms, are still not universally accepted.
The obstacle for any new life science technology is therefore not just to pass all the rigorous clinical trials, but to achieve something much more difficult: win acceptance in the court of public opinion. This is the challenge currently faced by one of the most exciting medical advances in living memory, CRISPR, which promises to revolutionise healthcare – if the industry can overcome the fear that seems to accompany any advance.
CRISPR and its discontents
Genome editing is now so well-established as to be a household term, but that doesn’t mean that it is thoroughly understood by the world at large. Most people are familiar, to some extent, with genetically-modified organisms (GMOs), and will likely take a strong view on their benefits (or otherwise) to the human race.
Science has now progressed from gene modification to gene engineering, the intentional editing of plant, animal or virus genomes to achieve a specific result. The latest iteration of this technology is the gloriously-named Clustered Regularly Interspaced Short Palindromic Repeats (hereafter, thankfully, known as CRISPR/Cas9).
CRISPR works by introducing a guide RNA and an enzyme known as Cas9 into a living cell; the RNA directs Cas9 to a specific DNA sequence, which is then cut and replaced – to use technology parlance, a “rip and replace” of the genome. So far, so science fiction – except that this technique is already in use, and beginning to garner incredibly exciting results. Scientists using CRISPR have been able to create mosquitoes that are resistant to the malaria parasite, help immune cells to target cancer more effectively, and even pave the way for transplanting organs from pigs to humans.
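The targeting logic described above can be sketched in a few lines of code. This is a deliberately naive toy – real Cas9 is guided by RNA rather than a string match, requires an “NGG” PAM motif next to the target site, and relies on the cell’s own repair machinery to insert the replacement – but it captures the “find a specific sequence, then cut and replace” idea:

```python
# Toy illustration of the CRISPR/Cas9 "rip and replace" idea on a DNA string.
# Drastically simplified biology: Cas9 only cuts where the target sequence
# sits immediately upstream of an NGG "PAM" motif.

def crispr_edit(genome: str, guide: str, replacement: str) -> str:
    """Find the guide sequence followed by an NGG PAM and swap it out."""
    bases = {"A", "C", "G", "T"}
    i = genome.find(guide)
    while i != -1:
        pam = genome[i + len(guide): i + len(guide) + 3]
        # Cut only if the target is adjacent to a PAM (N-G-G).
        if len(pam) == 3 and pam[0] in bases and pam[1:] == "GG":
            return genome[:i] + replacement + genome[i + len(guide):]
        i = genome.find(guide, i + 1)
    return genome  # no editable site found: genome left unchanged

# Target "GATTGCA" sits next to the PAM "AGG", so the edit goes ahead:
edited = crispr_edit("TTACCGATTGCAAGGCATT", "GATTGCA", "GATCGCA")
```

The PAM check is what gives real CRISPR much of its precision: even a perfect sequence match is ignored unless the motif is present, which is (very roughly) why off-target edits are rarer than a simple search-and-replace would suggest.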
These examples only scratch the surface of what CRISPR can achieve, and naturally this technology is not without its discontents. Some of the opposition comes from a failure to understand how the technology works; however, there are also deep-seated ethical concerns over potential issues and applications. These include “designer babies”, the perennial debate over the morality of using human stem cells for medical research, and the issue of genetic data being used to refuse healthcare coverage, or boost insurance premiums.
You might think that these are questions for legislators, ethicists and scientists themselves – and you’d be right. But if techniques such as CRISPR are to fulfil their potential, and answer the many and varied questions over transparency and ethics, then information technology providers will have to play an important role.
CRISPR and the role of IT
Life sciences has long been a data-intensive industry, especially when it comes to genome sequencing. A single genome contains so much information that DNA has been proposed as a future data storage technology.
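The information density behind that proposal is easy to demonstrate: with four bases (A, C, G, T), each base can encode two bits, so a byte fits in just four bases. Real DNA-storage schemes add error correction and avoid problematic runs of repeated bases; the direct 2-bit mapping below is an illustration only:

```python
# Sketch of DNA as a data store: four bases = 2 bits per base,
# so one byte maps to exactly four bases. Illustration only -
# practical schemes add redundancy and avoid long homopolymer runs.

BASES = "ACGT"  # A=00, C=01, G=10, T=11

def encode(data: bytes) -> str:
    out = []
    for byte in data:
        for shift in (6, 4, 2, 0):          # four 2-bit chunks per byte
            out.append(BASES[(byte >> shift) & 0b11])
    return "".join(out)

def decode(strand: str) -> bytes:
    out = bytearray()
    for i in range(0, len(strand), 4):
        byte = 0
        for base in strand[i:i + 4]:
            byte = (byte << 2) | BASES.index(base)
        out.append(byte)
    return bytes(out)

print(encode(b"Hi"))  # -> CAGACGGC
```

At this density a single gram of DNA could, in principle, hold on the order of hundreds of petabytes – which is why the idea keeps resurfacing in storage research.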
Technologies such as Big Data storage, processing and analytics are central to resolving many of the practical and even ethical questions raised by CRISPR. For example, at any moment there are dozens of research projects and trials underway all around the world, and the various scientists and clinicians need to share huge amounts of data in order to determine the efficacy of various approaches and avenues of research.
Information technology has a crucial role to play, not just in making sense of the enormous volumes of data and enabling it to be shared easily, but also to help create a framework to coordinate international research and collaboration.
Of course, much of the actual research is highly proprietary, with life science companies and universities jealously guarding their investments in research. But given the widespread unease which CRISPR can provoke in a sceptical and uninformed public, it is becoming crucially important that we can establish global data-sharing platforms.
Collaboration platforms will enable scientists and other stakeholders – such as legislators, journalists and even ethicists – to monitor the progress of research, and better understand the long-term impact of different CRISPR applications, for example through global patient registries.
While IT will not, in itself, provide answers to all the moral and ethical conundrums, it can enable proactive regulatory engagement, providing legislators and others with the evidence they need to make informed judgments, rather than decisions based on prejudiced and uninformed opinions.
Ignorance is the enemy of progress, and that’s as true today as it was when Jenner started deliberately infecting his patients with cowpox. In the second decade of the 21st Century, we should be using the latest data technologies to eradicate misinformation and doubt, just as we brought an end to smallpox and will soon do the same for polio and a whole host of other diseases.