All through its long battle for acceptance within the mainstream computer industry, Unix has been dogged by unfavourable comparisons with the PC market. With its portability between computer types and multi-user, multi-tasking and communications capabilities aimed way above any pretensions of MS-DOS, Unix became an instant hit with software developers and technical users, but due to its complexities, failed to make any impact on the mass market software industry, which had become used to dealing with a single standardised hardware system. Over the past few years, however, the battle has been hotting up. Unix vendors have realised that by clubbing together, they can increase the market potential for software developers who want to take advantage of the wider Unix market, but are unwilling to create multiple versions of their packages for slightly different Unix machines, with all the testing, marketing and packaging overheads that entails. In short, they are looking for the establishment of a personal computer-style shrink-wrapped software market, where packages will run unmodified on any conformant machine. The responses to this have so far been various.
Microsoft Corp’s early establishment of Xenix at the desktop end of the market at least set a de facto standard there, and this was carried forward earlier this year by the Santa Cruz Operation’s announcement of Open Desktop, which bundles application software, database and Xenix (now called Unix) into a basic standard package for personal computers using iAPX-86 microprocessors and costs around $1,000. It is due for general release any time now. And in 1987, Sun Microsystems announced its Sparc programme, an attempt to establish a new hardware standard based around the Sparc RISC microprocessor by licensing the technology to other vendors. So far, only Sun itself and Solbourne Computer Inc actually have Sparc machines out on the market, but others, including ICL, are said to have them on the way.

The major effect of the Sparc initiative was to galvanise AT&T into sorting out its own attitude on compatibility at the binary, or chip, level. AT&T initially supported Sun over the Sparc, but found itself accused by the rest of the industry of uncompetitive practices, particularly after it began bankrolling Sun in return for a stake of up to 20%. Rumours, fuelled according to some sources by Sun itself, suggested that AT&T was intending to optimise future versions of Unix with hardware-specific support for the Sparc. The row led to the formation of the Open Software Foundation in May 1988, but also led to assurances from AT&T that all chip manufacturers would receive early access to new Unix releases, and promises of support from AT&T to establish a series of Application Binary Interfaces, ABIs, for each processor. The ABI programme should come to fruition next month, when the general release of AT&T’s Unix System V.4 is expected to be announced.
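The per-processor idea behind the ABI programme can be sketched in miniature. The ELF object format that System V.4 introduced stamps every executable with a code identifying the processor it was built for, so a loader can accept binaries for its own chip and reject the rest; the sketch below (a hand-built toy header, not a real binary) parses that field, assuming standard ELF header layout and e_machine values:

```python
import struct

# Toy sketch: the first bytes of an ELF executable, hand-assembled here
# rather than taken from a real binary, record which processor ABI the
# program was built for.
elf_header = (
    b"\x7fELF"                  # magic number
    b"\x01"                     # EI_CLASS: 32-bit
    b"\x02"                     # EI_DATA: big-endian
    b"\x01"                     # EI_VERSION: 1
    + b"\x00" * 9               # padding to the 16-byte e_ident field
    + struct.pack(">HH", 2, 8)  # e_type=2 (executable), e_machine=8 (MIPS)
)

# Standard ELF e_machine codes for processors named in the ABI programme.
MACHINES = {2: "SPARC", 3: "Intel 80386", 4: "Motorola 68000",
            5: "Motorola 88000", 8: "MIPS"}

# A loader reads e_ident, then the two half-words that follow it.
byte_order = ">" if elf_header[5] == 2 else "<"
e_type, e_machine = struct.unpack_from(byte_order + "HH", elf_header, 16)
print(MACHINES.get(e_machine, "unknown processor"))  # prints "MIPS"
```

An application conforming to, say, the MIPS ABI carries e_machine=8 and runs on any vendor's machine built around that chip; the same binary is meaningless to an 88000 or Sparc system, which is why each processor family needs its own interface definition.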
AT&T has said it is working with all the chip manufacturers to establish Application Binary Interfaces for all the major chips running Unix, such as the Motorola 68000 and 88000 series, Intel’s iAPX-86, the Sun Sparc, the MIPS R series, the Intergraph Clipper, and the National Semiconductor NS32000. Applications written to conform to the ABI of a particular chip should run immediately on any machine that uses the same processor, without the need for further tweaking.

Chief beneficiaries of the ABI programme will be those companies that have been able to start with a clean slate. In particular, Motorola Inc’s 88000 RISC processor, announced in the spring of 1988, has benefited from the immediate establishment of the 88open Consortium, which has now completed work on a Binary Compatibility Standard for the chip running Unix V.3.2 (confusingly, AT&T insists on reserving the term Application Binary Interface for Unix V.4). The Wilsonville, Oregon-based consortium, which now has over 50 members and offices in Japan and Europe, has also produced an Object Compatibility Standard, important for database vendors that need to know about compiler formats and library availability. The V.4 ABI for the 88000 will be a subset of the two, taking advantage of the dynamic linking capabilities of the new release. Unix V.2 and V.3 applications will run unchanged, according to 88open president Bob Anundson.

The establishment of an ABI for such processors as the 88000 will offer a far wider spectrum of hardware options than the old personal computer/Intel approach, said Anundson, with current hardware ranging from Data General’s low-cost workstations up to Bolt Beranek & Newman’s parallel supercomputers, and could be the final nail in the coffin of proprietary systems. Chip manufacturers with existing customer bases will see fewer advantages from the ABI programme, at least until Unix System V.4 applications start to appear on the market, something probably as far away as a year to 18 months, according to Anundson. MIPS currently has four different environments to contend with: its own Unix version, and separate implementations from Silicon Graphics, Ardent (now Stardent) and DEC. It is DEC’s entry that could prove most problematic: DEC has already changed the byte ordering of the MIPS processor to enable closer integration with its VAXes, and rumour has it that the R6000, an ECL development for a high-performance version due out next year, will not be fully compatible with the current R3000. If DEC fulfils its projections in the workstation market, software developers working on the MIPS processor will have their eye primarily on DEC, a situation that could force MIPS and its other chip users to switch over to DEC compatibility in order to take advantage of the software base.
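The byte-ordering point is worth spelling out, since it is what splits DEC's MIPS machines from everyone else's despite the identical processor. The same 32-bit word is stored with its bytes in opposite orders on the two configurations, so binaries and data files built for one will not interoperate with the other; a short sketch of the two layouts:

```python
import struct

value = 0x11223344

# The same 32-bit word, laid out the two ways a bi-endian chip like the
# MIPS R3000 can be configured:
little = struct.pack("<I", value)   # little-endian, DEC's VAX-friendly choice
big    = struct.pack(">I", value)   # big-endian, the MIPS/SGI configuration

print(little.hex())  # 44332211
print(big.hex())     # 11223344
```

Every memory access and every on-disk structure differs between the two, which is why a single MIPS ABI cannot cover both camps without one side changing over.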
One other approach to mass distribution of Unix software should be mentioned: that of the Architecture-Neutral Distribution Format, currently being investigated by the Open Software Foundation, which has shied away from an endorsement of the ABI approach in deference to members such as IBM and DEC, which would not gain from its introduction. ANDF looks to use intermediate formats for software distribution, so that an application can be delivered in a single, specified format without needing to know what the target computer is. Although attractive to the distribution market, the technology could suffer from a number of potentially serious problems, particularly performance, and security for the software developer that wants to make sure that competitors cannot reverse-engineer the intermediate code. Another problem could be verification that software will in fact work on any machine: a developer may not even have heard of some of the machines destined to run the product. Applications written for such technology could be up to four or five years away and, says 88open’s Bob Anundson, might never appear at all: the concept could be intended simply to stall other people.
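The intermediate-format idea behind ANDF can be illustrated with a deliberately toy example (this hypothetical stack-machine format is an invention for illustration, not the real ANDF under study at the OSF): the developer ships one architecture-neutral program, and each target machine supplies its own translator or interpreter at install time.

```python
# Neutral "distribution format": a tiny stack-machine program shipped
# once, identically, to every kind of machine.
program = [("push", 6), ("push", 7), ("mul",)]

def run(prog):
    """Stand-in for the per-machine installer or translator each vendor
    would supply; only this part needs to know the target hardware."""
    stack = []
    for op, *args in prog:
        if op == "push":
            stack.append(args[0])
        elif op == "add":
            stack.append(stack.pop() + stack.pop())
        elif op == "mul":
            stack.append(stack.pop() * stack.pop())
    return stack.pop()

print(run(program))  # prints 42
```

The scheme's drawbacks follow directly from the shape of the sketch: the extra translation step costs performance, and the neutral program is far easier to read, and so to reverse-engineer, than native binary code.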