Just as Unix is turning hardware into a commodity item, reducing a user’s dependence on a single supplier, so the gradual establishment of the SQL structured query language is doing the same thing to database vendors. So says a new report on fourth generation languages, written by the research team of Martin Butler and Robin Bloor, which appropriately takes an end-of-the-decade look at a market sector that seems to have promised a lot more than it delivered. From the early 1980s on, unfortunate Cobol developers were bombarded with sales and marketing pitches telling them that they were shortly to become redundant, replaced by technical users (not necessarily programmers at all) wielding tools that could produce complex applications at 10 times the speed, with the added benefits of portability and flexibility thrown in. In theory, most commercial software authors should have abandoned traditional languages altogether by now. In practice, it is estimated that Cobol is still used for over 40% of the systems currently produced in the US, while use of the C language, spurred on by its portability and close ties with Unix, has boomed. What went wrong, and what is now right?
Often fall down
The report – 4GLs: an Evaluation and Comparison, published by ButlerBloor Ltd – identifies a number of crucial factors essential to the success of a fourth generation language that were not widely recognised initially. The common factor tying together the 22 fourth generation language products studied in the report turned out to be their use of a data dictionary, holding fields, database tables, forms and, occasionally, procedures. Such languages are simply a front-end for exploiting the information in that dictionary, and code should be developed with this in mind. It is better, says the report, to think in terms of a fourth generation environment – consisting of dictionary, forms management package, query language, report writer and conventional third generation language – which encourages more effort on the original analysis and design stages, things many 4GL vendors have led us to believe are no longer necessary. The new languages impress when generating routines for commonly-used functions – such as a simple file maintenance transaction – but often fall down when a non-standard task is required. One case study in the report tells of a user spending two weeks bending a product to produce a bar code routine that would have taken one hour in a conventional language. One aspect of 4GLs that did become rapidly evident was that functionality was nearly always gained at the expense of performance. Early 4GL-produced software was often notoriously slow and memory-hungry. Not only that, but the advantage of elements adaptable by the end user often led to the production of over-complex reporting tasks that could take up vast amounts of CPU time. The quest for ease of use led many of the early 4GLs to opt for a non-procedural approach to their languages, leading to an inflexibility that has been avoided by the newer, procedural 4GLs.
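The dictionary-as-front-end idea the report describes can be sketched in a few lines. The following is a hypothetical illustration, not taken from the report or any of the 22 products – the table, field names and layout are invented – showing how a routine query might be derived entirely from a data dictionary entry rather than written by hand:

```python
# Minimal sketch of the 4GL pattern described above: the language acts as a
# front-end over a data dictionary, and commonly-used routines (here a simple
# report query) are generated from the dictionary entries.
# All names (CUSTOMER, cust_no, etc.) are invented for illustration.

DATA_DICTIONARY = {
    "CUSTOMER": {
        "fields": {
            "cust_no": {"type": "INTEGER",      "label": "Customer No."},
            "name":    {"type": "CHAR(30)",     "label": "Name"},
            "balance": {"type": "DECIMAL(9,2)", "label": "Balance"},
        },
        # A default report layout held alongside the field definitions.
        "report_order": ["cust_no", "name", "balance"],
    }
}

def generate_select(table):
    """Build a SELECT statement purely from the dictionary entry."""
    entry = DATA_DICTIONARY[table]
    cols = ", ".join(entry["report_order"])
    return f"SELECT {cols} FROM {table};"

print(generate_select("CUSTOMER"))
# -> SELECT cust_no, name, balance FROM CUSTOMER;
```

The report's warning follows directly from this shape: anything the dictionary and its generators anticipate (file maintenance, standard reports) is nearly free, while anything outside it – the bar code routine of the case study – means fighting the tool.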
And one area that is still rarely addressed in 4GLs is a good debugging facility – surprising, since a recent survey by Ready Systems Inc revealed that software maintenance can account for 67% of total project cost, with testing at 15%, requirements, analysis and design at 11%, and coding itself a mere 7%.
By John Abbott
Despite all this, the 4GL market is very healthy, with estimates of a $3,000m business for databases and their associated languages during 1990. The products themselves have developed from the earliest versions, and new generation hardware is now more capable of coping with the extra overheads they demand. As the report points out, most of the major independent software houses – Computer Associates, Oracle, Ashton-Tate, Ingres, Cincom Systems, Software AG, Cognos and Information Builders – are all offering fourth generation languages, and there are promising products from smaller but significant companies such as Sybase, Unify, and the Netherlands-based Uniface. Around 40 products on the market are available on DEC VAX, and 30 or so run under Unix. Very few are now specific to a single manufacturer’s hardware. The report covers 22 products, including the VAX-specific Systel and the IBM-specific Synon/2. The other 20 are all available on Unix, apart from Software AG’s Natural and Computer Associates’ CA-DB:Gen, both of which will be available in Unix versions soon. The report divides product assessment into six categories: development environment, performance, architecture and scope, inter-operability, end user functionality and portability. Overall, the best ratings are achieved by Computer Associates and Software AG, which both score top marks on performance and on architecture and scope. CA-DB from Computer Associates is in fact the software it acquired in its takeover of Cullinet Software last year, and is not strictly a fourth generation language at all, but an applications generator producing third generation code with embedded SQL – resulting in high performance, and in open Fortran, Cobol or C code constrained neither to a proprietary database nor to a fourth generation language. Problems arise with maintenance, as there is currently no means of reverse engineering the 3GL code produced back to the 4GL. The other products singled out by the report include Accell from Unify, and Uniface, which is also bundled in Europe with the Sybase database as Fastbuild. These products gain top marks for the development environment, the Uniface product making extensive use of a powerful data dictionary, with good forms management and full database-independence. Those not faring so well include Today (now owned by Australian giant Computer Power Group), which is criticised for its largely non-procedural approach, and older languages such as Microprocessor Development Group’s Sculptor (hard to interface with 3GLs) and the UK National Computer Centre’s Filetab.
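The report does not reproduce CA-DB's output, but the general shape of what an applications generator produces – open 3GL source with embedded SQL – is well established. The sketch below is a hypothetical illustration of that shape only: a generator emitting a C function containing an embedded-SQL SELECT, with all table and field names invented, and no claim to match CA-DB's actual code:

```python
# Hypothetical sketch of an applications generator's output: portable 3GL
# code (here C) with embedded SQL statements of the EXEC SQL style used by
# conventional precompilers. Names are invented; this shows the shape of
# such generated code, not any specific product's output.

def generate_fetch(table, key, columns):
    """Emit a C function performing an embedded-SQL singleton SELECT."""
    cols = ", ".join(columns)
    into = ", ".join(f":{c}" for c in columns)  # host-variable targets
    return (
        f"void fetch_{table.lower()}(long {key})\n"
        "{\n"
        f"    EXEC SQL SELECT {cols}\n"
        f"             INTO {into}\n"
        f"             FROM {table}\n"
        f"             WHERE {key.upper()} = :{key};\n"
        "}\n"
    )

print(generate_fetch("CUSTOMER", "cust_no", ["name", "balance"]))
```

The maintenance problem the report raises is also visible here: once such C source is hand-edited, nothing maps the changes back to the 4GL-level definitions it was generated from.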
Others covered in the report include Focus, Informix, Ingres, Mimer, Oracle, Powerhouse, Pro IV, Progress and SAS.
The 4GL explosion is only just beginning. The widespread use of SQL means that database prices will begin to fall, and vendors must find a way of differentiating their products from those of their competitors. Of the trends in the 4GL market, the most important is database-independence, identified astutely by Unify Corp, which recently unbundled its Accell set of tools from its own database, and even persuaded its deadly rival Oracle to market them. Taking inter-operability further, it is likely that some vendors will release reverse engineering products that allow users to switch from one 4GL to another if they wish, thus breaking the language lock-in. This has already been done with 3GLs such as Cobol, and should be far easier with 4GL products. Object-oriented functionality will become increasingly important, with closer ties between fourth generation languages and third generation class libraries. And, says the report, the move in hardware to client-server configurations will largely address the performance limitations, particularly those associated with functions such as scrolled areas, windowing and the use of high level constructs – client-server architectures being the ideal hardware set-up for 4GL systems. Perhaps the days of Cobol really are numbered after all. 4GLs: an Evaluation and Comparison is available from ButlerBloor Ltd of Hull, UK at UKP380.