The wonder of Moore’s Law

Yesterday I had an interesting chat with Sun's James Gosling, who, as many of you will know, is best known as the father of Java. As programmers go he's among a handful of famous and clearly quite brilliant software developers, as well known as people like Dennis Ritchie, Ken Thompson and Doug McIlroy (creators of Unix); Bill Joy (author of Berkeley Unix and TCP/IP); and Tim Berners-Lee, who invented you-know-what. Keep an eye out for my interview with Gosling, coming soon.

Anyway, I was asking him to what extent Moore's Law has worked in Java's favour, because any scalability challenges it may have had when it was first created back in 1991 are unlikely to be much of a problem today, given the low-cost, high-powered technology that good old Gordon Moore and his Law have furnished us with by 2007.

Discussing this, Gosling noted that Sony’s PlayStation 3 is more powerful than many of the supercomputers that were around 10 years ago, delivering 1.8 Teraflops of floating point performance, for under £450.

Putting that in perspective, the very slowest machine on the Top 500 Supercomputers list delivers 2.7 Teraflops from 800 processors, while the fastest in the world — the famous IBM Blue Gene with over 131,000 processors — manages 280.6 Teraflops. Which really is a lot of flops. And if you want to know why Moore’s Law has been so helpful, consider that back in 1993, the fastest supercomputer, costing many millions of dollars, could manage only 59 Gigaflops. You couldn’t even play a game on that these days.
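As a back-of-the-envelope sketch of just how helpful Moore's Law has been, you can work out the implied doubling time from the two figures above. The flops numbers are from the Top 500 list as quoted; the 13-year span is my assumption (1993 to roughly the then-current list), so treat the result as a rough illustration rather than a precise measurement:

```python
import math

# Figures quoted in the article (assumed years are approximate)
flops_1993 = 59e9        # fastest supercomputer in 1993: ~59 Gigaflops
flops_blue_gene = 280.6e12  # IBM Blue Gene: 280.6 Teraflops
years = 2006 - 1993      # assumed span between the two Top 500 snapshots

# Overall growth factor, number of doublings, and implied doubling time
growth = flops_blue_gene / flops_1993
doublings = math.log2(growth)
doubling_time_months = years * 12 / doublings

print(f"growth factor: {growth:,.0f}x")
print(f"doublings: {doublings:.1f}")
print(f"implied doubling time: {doubling_time_months:.1f} months")
```

That works out to a doubling roughly every 13 months, faster than the 18 to 24 months usually quoted for Moore's Law, which makes sense given that supercomputers gain from piling on more processors as well as from faster chips.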
