Edinburgh-based Quadstone Ltd is to release a new version of its DecisionHouse data mining and visualization tool in July. The DecisionHouse software is based on parallel processing techniques and, according to Quadstone's technical director, Nick Radcliffe, it uses technology the firm acquired from the Edinburgh Parallel Computing Center in a management buyout in March 1995.

At present, most data warehousing systems implement parallel technology only at the hardware and database level, says Radcliffe; there is practically none at the front end. This is because data mining and visualization tools are still deeply rooted in the Windows world, so constant querying produces heavy traffic, which in turn causes bottlenecks and performance problems, he continues. Quadstone's solution is to target parallelism specifically at this level, which improves performance and processing and enables the data mining tool to carry out complex and sophisticated statistical analysis. It clearly makes sense to run DecisionHouse on a large SMP (symmetric multiprocessing) or massively parallel machine with a powerful database behind it.

DecisionHouse has been on the market for just under a year and already counts British Airways Plc and J Sainsbury Plc as customers. It is not for just any user, though: it is targeted at sites with a million or more customers. The two biggest changes in the forthcoming release are a new statistical analysis engine called ScoreHouse and a new version of the existing one, TreeHouse.
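DecisionHouse's internals are not public, but the general idea of pushing parallelism into the analysis front end, splitting a large customer table into partitions and computing statistics over them concurrently rather than issuing one heavy serial query, can be sketched as follows. This is a minimal illustration only; the partition layout and the `parallel_mean` helper are assumptions, not Quadstone's API.

```python
from concurrent.futures import ThreadPoolExecutor

def partition_stats(partition):
    # Per-partition partial aggregates: these can be computed
    # independently, so each worker handles one partition.
    return (len(partition), sum(partition))

def parallel_mean(partitions, workers=4):
    # Fan the partitions out across workers, then combine the
    # partial (count, sum) results into a single global mean.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(partition_stats, partitions))
    n = sum(count for count, _ in results)
    total = sum(subtotal for _, subtotal in results)
    return total / n
```

The same split-aggregate-combine pattern extends to more elaborate statistics, and on an SMP or massively parallel machine each partition would live on its own processor or node.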
Assessment of risk
ScoreHouse produces and provides analysis of additive scorecards for assessing risk, customer worth, lifetime value, or purchasing propensity. Applicants for a credit card or mortgage, for example, go through this process and are accepted or rejected on the basis of the points they amass. The analysis task at the heart of scorecard development is the assignment of scores to observed customer characteristics, based on historical customer profiles including application, bureau and behavioral data.

TreeHouse, meanwhile, automatically builds decision trees for characterizing data, giving users the ability to profile customers and to target and segment data. In practice, TreeHouse would typically be used to measure response to marketing campaigns and to pinpoint new market segments to exploit in the future. New additions to this engine include multiple view variables for improved profiling and support for non-binary objectives.

DecisionHouse comes with everything else you would expect of a mining product, including support for multiple data sources in relational databases and flat files. The package also provides data pre-processing functions such as cleansing, selection, transformation, sampling, aggregation and computation of statistics, and it enables drill-down to the level of individual records and fields as well as summary information over the selected data. Meanwhile, there are plans afoot at Quadstone to NUMA-ize DecisionHouse: the firm is talking to the NUMA developers Data General Corp, Sequent Computer Systems Inc, Silicon Graphics Inc and Siemens Nixdorf Informationssysteme AG about content and future relationships.
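The additive-scorecard process that ScoreHouse automates, assigning points to observed applicant characteristics and accepting or rejecting on the total, can be sketched in a few lines. The bands, point values and cutoff below are invented for illustration; in practice they would be derived from historical application, bureau and behavioral data.

```python
# Hypothetical scorecard: for each characteristic, a list of
# (upper_bound, points) bands. An applicant earns the points of
# the first band whose upper bound exceeds their value.
SCORECARD = {
    "age": [(25, 10), (40, 25), (200, 35)],
    "years_at_address": [(2, 5), (5, 15), (100, 30)],
}
CUTOFF = 50  # minimum total score for acceptance (illustrative)

def score(applicant):
    # Sum the points earned across all scored characteristics.
    total = 0
    for attribute, bands in SCORECARD.items():
        value = applicant[attribute]
        for upper_bound, points in bands:
            if value < upper_bound:
                total += points
                break
    return total

def decide(applicant):
    # Accept or reject on the amassed points, as in credit scoring.
    return "accept" if score(applicant) >= CUTOFF else "reject"
```

For example, a 45-year-old applicant with six years at their address scores 35 + 30 = 65 points and is accepted under this hypothetical cutoff, while a 20-year-old with one year at their address scores 15 and is rejected.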