Tools needed to turn big data into useful data
Turning big data into useful data requires not just effective aggregation tools but effective analytical tools as well. Barb Darrow at GigaOM offers an overview of some of the latest analytics technologies attempting to make sense of the huge variety of data formats that businesses are pulling into their systems from social networks, sensors and other sources.
Analytics vendors like Splunk, Tableau, QlikView and Paradigm4 are gaining traction as the need for new, specialized databases becomes more apparent. Multi-dimensional data arrays are a clear example of this new need, says Andy Palmer, who co-founded Vertica Systems and VoltDB. "When you represent data in traditional relational databases, you can compromise the inherent nature of the data," Palmer says. "And if you integrate a lot of data together, ultimately that data looks like a large array. Representing an array in a traditional database is really an unnatural act."
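Palmer's point can be illustrated with a minimal sketch (the table layout and column names here are hypothetical, not drawn from any particular product): a relational database has no native array type, so the usual workaround is to explode an n-by-m array into n*m one-cell records, after which even simple array operations become queries plus client-side reassembly.

```python
import sqlite3

# A small 2-D array (say, sensor readings) that we want to store.
matrix = [
    [1.0, 2.0, 3.0],
    [4.0, 5.0, 6.0],
]

# Relational storage forces a flattened (row, col, value) layout:
# one record per cell, with the array's shape held only implicitly.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cells (i INTEGER, j INTEGER, value REAL)")
conn.executemany(
    "INSERT INTO cells (i, j, value) VALUES (?, ?, ?)",
    [(i, j, v) for i, row in enumerate(matrix) for j, v in enumerate(row)],
)

# Reading back one row of the array now takes a query plus
# client-side reassembly, instead of a direct index lookup.
row1 = [v for (v,) in conn.execute(
    "SELECT value FROM cells WHERE i = 1 ORDER BY j")]
print(row1)  # [4.0, 5.0, 6.0]

# The 2x3 array became 6 records; slices and matrix math turn into
# joins and aggregations rather than index arithmetic.
count = conn.execute("SELECT COUNT(*) FROM cells").fetchone()[0]
print(count)  # 6
conn.close()
```

An array-native database, by contrast, stores the shape and cell ordering directly, which is the gap vendors like Paradigm4 aim to fill.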
It is no longer sufficient to try fitting new types of data into traditional relational database technologies, which is what many of the high-profile database providers did in the past, Darrow writes.
A "decade or so ago, there was a raft of small, innovative object database companies (Object Design, Ontos and others) that built their businesses on the premise that relational databases could not handle objects which did not fit well into the rows-and-columns world of relational databases," she writes. "Over time, however, the big database players pushed and shoved at least some object capabilities into their databases, and those smaller companies disappeared."
This time that won't work, Palmer predicts. With the variety of data formats businesses are dealing with now, it would take far too many resources to get valuable insights using traditional databases.
- See Barb Darrow's article at GigaOM