By David Johnson

Question: What do you get if you take five professors, eight heads of research labs, and a sprinkling of consultants and computer company bigwigs to bring the total to 22? Answer: a Software 2000 Workshop and a report, Software 2000 – a View of the Future. Claiming parallels with the 1968 Nato-sponsored conference on the software crisis, the Workshop was convened at ICL Plc's executive training centre in Windsor, Berkshire in April to discuss the impact of telecommunications and electronics on the future world. The motivation for the Workshop was the belief that a set of discontinuities will hit software in the next few years, and that there is no coherent external body of analysis or thought on these issues.

The dominant topic at the Workshop was the global information infrastructure. The technical achievement of universal communications demands discussion of the new options available and of the governmental and social issues involved, as communities are increasingly defined by interaction rather than location. The Internet is spreading rapidly, with usage and the number of users connected growing at 10% to 15% per month; it now has over 20m users, and the annualised rate of growth of the World-Wide Web carried over the Internet is a staggering 300,000% (a rough check of what that figure implies appears below). For Bill Wulf of the University of Virginia, the existence of this infrastructure will facilitate, if not demand, information services that we are all probably too myopic to see now.

The global net will also force us to rethink our use of the law. The storage of digital information is soaring, driven by the plunging cost of storage, and this is creating problems that need to be addressed. The basis of ownership of information, intellectual property protection, is different when that information can be replicated without trace and its source is not apparent, so that physical presence cannot be used as the determinant. Jurisdiction based on place may no longer be viable, and the Workshop is concerned that while rules on patent and copyright are reasonably alike in most countries, they are denied any validity by some. Without a standardised approach, Panamas and Liberias waving seductive flags of convenience will spring up for the information superhighway.
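That 300,000% figure is easier to grasp as a month-on-month rate. The following is a back-of-the-envelope sketch, assuming steady monthly compounding; it is an illustration, not a calculation from the Workshop report.

```python
# Back-of-the-envelope check of the quoted Web growth figure.
# An annualised growth rate of 300,000% means ending the year at
# 3,001 times the starting size (1 + 300000/100).

annual_rate = 300_000 / 100          # 300,000% expressed as a fractional gain
annual_factor = 1 + annual_rate      # ~3001x over twelve months

# Implied steady month-on-month factor: the twelfth root of the annual factor
monthly_factor = annual_factor ** (1 / 12)
monthly_rate = (monthly_factor - 1) * 100

print(f"Annual factor: {annual_factor:.0f}x")
print(f"Implied monthly growth: {monthly_rate:.0f}% per month")
# -> roughly 95% per month, i.e. the Web nearly doubling every month,
#    far ahead of the Internet's overall 10% to 15% per month user growth.
```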
Brian Gladman, director of communications and information systems engineering at the UK Ministry of Defence, also observed that whereas the UK does not have a Freedom of Information Act, the US does; it is therefore possible to obtain information about UK government activities through the US. However, the Workshop is concerned that the issue of how to treat digital property in law is in danger of being left to lawyers. Jessica Litman, professor of law at Wayne State University in the US, believes that the laws are too complex: "It would help enormously if we could all make our intellectual property laws simpler so that lawyers did not have to be there to explain them to tecchies."

In regulating the global network, the Workshop believes the information-sharing ethos of academia is inadequate for the commercial world. Additions to the existing rules should be evolutionary and commercially driven, rather than mandated by government, and the group saw a good example in the way standards bodies are now beginning to work. The traditional standards process is incapable of keeping pace with the current changes in information technology; increasingly, it is an information exchange that accelerates the convergence of de facto standards and the evolution of practicable processes.

The law will also be stirred as the importance of software, embedded or packaged, grows and the public perception of software's effective quality becomes more critical, especially as future generations become more computer-literate and feed into the legal and commercial worlds. Litman expects that the law will in future require off-the-shelf software to be as reliable and as fully warranted as any other product, and the question of jurisdiction for liability will become more pressing until the rules are harmonised.
The communications developments also enable the creation of vastly more complex systems by interconnecting existing ones. This demands the use of de facto standards, so as to build systems that interface with those already running; such systems will rely on modularity and interconnection rather than a single data or system model. Brian Randell, Professor of Computing Science at the University of Newcastle-upon-Tyne, believes that the complexity and fragility of these systems will be a major challenge, because they can consist of a great set of interconnected single points of failure. The past, or legacy systems, presents the largest hurdle for software in the future: new systems must support old interfaces, old data formats and old methods of working, and not merely by "graunching" (to make fit by the use of excessive force); a minimal sketch of the gentler, adapter-style alternative appears at the end of this article.

However, for Mike Lesk, manager of the Computer Science Research Division at Bell Communications Research, the most interesting possibility offered by network interaction was suggested by Garrison Keillor in his Rise of the Shy: it will lead to a reversal of the historical situation in which the loudest, prettiest or biggest have dominated discussions. Instead, those with the best writing skills, those who were shy, for example, will now have the biggest influence. Including journalists?
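To make Randell's point about legacy interconnection concrete, here is a minimal, hypothetical sketch: the old system is left running untouched and wrapped behind a modern interface, rather than being graunched to fit. The LegacyStockFeed name and its fixed-width record format are illustrative assumptions, not taken from the Workshop report.

```python
# Hypothetical sketch: interconnecting a legacy system via an adapter
# instead of forcing ("graunching") new code to speak its old format.

class LegacyStockFeed:
    """Stands in for an existing system with an old data format."""
    def fetch(self) -> str:
        # Old fixed-width record: code (4 chars) + price in pence (6 chars)
        return "ICL 012345"

class StockQuoteAdapter:
    """Modern interface layered over the legacy one. The old system keeps
    running unmodified; new systems interconnect through this module."""
    def __init__(self, feed: LegacyStockFeed):
        self.feed = feed

    def quote(self) -> dict:
        record = self.feed.fetch()
        return {"symbol": record[:4].strip(),
                "price_pounds": int(record[4:]) / 100}

adapter = StockQuoteAdapter(LegacyStockFeed())
print(adapter.quote())   # {'symbol': 'ICL', 'price_pounds': 123.45}
```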