
New Standards for Big Data

Verification, credibility and accuracy.

Sponsored Big data holds great promise: structured and unstructured data harvested, processed and analysed in near-real-time by organisations sifting for new opportunities or seeking to refine existing activities. Data is pouring in from sensors, shoppers, IoT devices, social media and more, and companies are investing in big-data projects across the stack, from data lakes and processing frameworks like Hadoop to analytics tools and Intel hardware.

As we pump increasing amounts of data into these systems, how we source and manage that information matters more and more. Users can take advantage of the analytical computing power offered by scale-out configurations of x86 processors, but not all companies have been fastidious about data quality.
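Being fastidious usually starts with simple, automated checks before records ever reach the lake. A minimal sketch of such a quality gate, with hypothetical field names and thresholds:

```python
# A minimal data-quality gate at ingestion time: flag records with
# missing fields or implausible values before they enter the pipeline.
# Field names and thresholds here are hypothetical.

REQUIRED_FIELDS = {"customer_id", "timestamp", "daily_steps"}

def validate(record: dict) -> list[str]:
    """Return a list of quality problems; an empty list means the record passes."""
    problems = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    steps = record.get("daily_steps")
    if steps is not None and not (0 <= steps <= 200_000):
        problems.append(f"implausible daily_steps: {steps}")
    return problems

record = {"customer_id": 1042, "timestamp": "2017-06-01T09:30:00Z", "daily_steps": -50}
print(validate(record))  # ['implausible daily_steps: -50']
```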

That might not sound like a problem, but this new world demands reliable foundations, and inaccurate information can cause problems down the line. Say you are a financial services firm selling insurance based on customers' combined Fitbit and shopping data: what if you deny somebody a policy because the data you based that decision on was wrong – corrupted somewhere during creation, transmission, storage or analysis?
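One common safeguard against that kind of silent corruption is to record a checksum when a record is created and verify it again before the data informs a decision. A minimal sketch, assuming a simple JSON record with illustrative field names:

```python
# Integrity check sketch: store a SHA-256 digest at creation time,
# then recompute and compare before the record is used in analysis.
# Record contents here are hypothetical.
import hashlib
import json

def digest(record: dict) -> str:
    """Serialise a record deterministically and return its SHA-256 digest."""
    payload = json.dumps(record, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

# At creation: store the digest alongside the record.
record = {"customer_id": 1042, "daily_steps": 8714, "grocery_spend": 63.20}
stored = digest(record)

# Simulate silent corruption somewhere between storage and analysis.
record["daily_steps"] = 871

# Before use: recompute and compare; a mismatch disqualifies the record.
if digest(record) != stored:
    print("integrity check failed; excluding record from analysis")
```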

"Not much attention was paid to data," according to Melanie Mecca, who directs data management products and services at the CMMI Institute, a Carnegie Mellon organization that focuses on best technology practice, on our new desire for data. "It was seen as the toothpaste in the tube of features, technology and automated capabilities.

The data itself was never viewed as the foundation and the life blood of the organization's business knowledge.

