Addressing delegates at London’s CASExpo-Europe conference, Santa Clara Laboratory general manager Morris Taradalsky yesterday confirmed that IBM is working very hard on the long-awaited Data Repository for the DB2 relational database manager. Naturally, he was unable to disclose a specific availability date, but he was more than willing to outline the role the product will play in future IBM offerings and environments. Kicking off with the findings of a recent survey, which suggest that most companies now face a four-year mainframe application development backlog, Taradalsky acknowledged that the time had come to do something about these bottlenecks. IBM’s contribution, he promised, would be to meet customer productivity requirements, specifically by facilitating shorter development life cycles and providing high-quality development tools.

Taradalsky then proceeded to share his – or rather IBM’s – vision of data processing in the 1990s. Certain changes must occur in today’s data-processing environments, he suggested, to allow development automation and distributed data processing – the IBM-specified keys to increased productivity – to shine through. The primary requirement appears to be the integration of application development and execution environments, and the automation of the current piles-of-paper routine, in which design specifications are shunted in bundles from analyst to programmer to test site.
Applications Factory

IBM’s visionary alternative is an Applications Factory, where the information base is available at every stage of development, where the best tools in the industry are integrated within an application development framework, and where co-operative processing combines the usability of the workstation with the power of the host system. The initial benefits, according to Taradalsky, will be the transformation of programmers into analysts, producing, in turn, significant reductions in development time.

Turning back to the Applications Factory, Taradalsky claimed that by integrating, storing, and making available the data accumulated during application development, the Repository will represent the heart of the development framework. Using object and entity-relationship data manipulation capabilities, he continued, the Repository will provide a powerful and flexible means of expressing and accessing information, without requiring knowledge of the physical data structures, and without corresponding changes to the application development tools. This independence is essential if application-enabling functions are to be provided from multiple sources, he added.

Four additional elements make up the proposed development framework. The provision of a Tools Access interface will allow the integration of tools into the development environment, while an Integrated Tools Structure will offer consistency and integration between tools and host capabilities. In addition, a Common User Access specification will be provided to promote portability of application development skills and ease of use, together with a Co-operative Processing Model to exploit both personal-system and host processing capabilities. Even further into the future, IBM’s plans include the incorporation of knowledge-based systems into development tools, in order to integrate existing applications with rule-based software and automate the application flow from one environment to another.
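The Repository’s key design claim – that tools manipulate entities and relationships through an abstract interface, without knowledge of the physical data structures – is a classic interface-abstraction argument. A minimal sketch of the idea follows; every name here is hypothetical and illustrative, not IBM’s actual Repository API:

```python
# Hypothetical sketch of the Repository idea the article describes: tools see
# only an abstract entity-relationship interface, so the physical storage can
# change without any corresponding change to the tools. All names are invented.
from abc import ABC, abstractmethod


class Repository(ABC):
    """Abstract entity-relationship store shared by development tools."""

    @abstractmethod
    def put_entity(self, kind: str, name: str, **attrs) -> None: ...

    @abstractmethod
    def relate(self, source: str, relation: str, target: str) -> None: ...

    @abstractmethod
    def related(self, source: str, relation: str) -> list: ...


class InMemoryRepository(Repository):
    """One possible physical representation; callers never depend on it."""

    def __init__(self):
        self.entities = {}   # name -> (kind, attribute dict)
        self.relations = []  # (source, relation, target) triples

    def put_entity(self, kind, name, **attrs):
        self.entities[name] = (kind, attrs)

    def relate(self, source, relation, target):
        self.relations.append((source, relation, target))

    def related(self, source, relation):
        return [t for s, r, t in self.relations
                if s == source and r == relation]


# A design tool and a code generator can share the same information base:
repo = InMemoryRepository()
repo.put_entity("module", "Payroll", language="COBOL")
repo.put_entity("table", "EMPLOYEE")
repo.relate("Payroll", "reads", "EMPLOYEE")
print(repo.related("Payroll", "reads"))  # ['EMPLOYEE']
```

Swapping `InMemoryRepository` for, say, a DB2-backed implementation would leave every tool written against `Repository` untouched – which is the independence the article says is essential for supporting tools from multiple sources.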