DataFlux updates platform to enhance scalability


SAS-owned data integration firm DataFlux has updated its Data Quality Integration Platform, improving performance and scalability.

“Data volumes are rising rapidly and some organisations are being forced to process ever larger quantities of data in batch or real time,” said DataFlux chief executive Tony Fisher. “It’s essential that data quality technologies are built with the ability to scale up to this new challenge and process hundreds of thousands of records every day.”

Version 8.1 of the platform adapts to an organisation’s data quality requirements. “We have threaded algorithms so they can run in parallel and perform more pieces of work at the same time,” Fisher said.
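The idea of threading quality checks so batches are processed concurrently can be sketched as follows. This is an illustrative example only, not DataFlux code: the record data, the `check_batch` rule, and the batch size are all assumptions.

```python
# Hypothetical sketch: running a data-quality check over record batches
# in parallel, in the spirit of the threaded algorithms Fisher describes.
from concurrent.futures import ThreadPoolExecutor

def check_batch(batch):
    """Flag records whose 'email' field fails a simple quality rule."""
    return [r for r in batch if "@" not in r.get("email", "")]

records = [
    {"id": 1, "email": "alice@example.com"},
    {"id": 2, "email": "not-an-email"},
    {"id": 3, "email": "bob@example.com"},
    {"id": 4, "email": ""},
]

# Split the records into batches and check each batch on its own worker
# thread; pool.map preserves batch order in the results.
batches = [records[i:i + 2] for i in range(0, len(records), 2)]
with ThreadPoolExecutor(max_workers=2) as pool:
    failed = [r for flagged in pool.map(check_batch, batches) for r in flagged]

print([r["id"] for r in failed])  # ids of records failing the rule
```

Splitting work into batches this way is what lets extra threads translate into higher throughput as data volumes grow.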

The algorithms work to match and standardise data, alerting organisations to instances where data does not conform to their quality standards. Fisher explained that the update also means customers will not need to purchase a new platform when data volumes outgrow their existing one.
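Match-and-standardise logic of the kind described above can be illustrated with a minimal sketch: normalise a field, then group records that become identical after standardisation. The `standardise` rule and the sample values are assumptions for illustration, not DataFlux's actual algorithms.

```python
# Hypothetical sketch of matching via standardisation: collapse case and
# whitespace so variant spellings of the same name compare equal, then
# report groups that look like duplicates.
from collections import defaultdict

def standardise(name: str) -> str:
    """Collapse whitespace and case so variant spellings compare equal."""
    return " ".join(name.split()).upper()

records = ["Acme Corp", "ACME  corp", "Widgets Ltd", "widgets ltd."]

groups = defaultdict(list)
for r in records:
    groups[standardise(r)].append(r)

# Groups with more than one member are likely duplicates to reconcile;
# singletons (here, the two non-matching "Widgets" variants) are flagged
# back to the organisation as nonconforming instead.
duplicates = {k: v for k, v in groups.items() if len(v) > 1}
print(duplicates)
```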

DataFlux accelerators will also be part of the update. These are pre-built components that help apply generic data quality rules to specific problems in order to reduce implementation time, said Fisher. DataFlux has released three accelerators so far, with a fourth planned for release before the end of the year.

A single sign-on capability has also been added to the platform. This will enhance management’s control over staff access to data, Fisher added.

© 2010 Incisive Media
