Anyone could be forgiven for thinking that Big Data is about size. Yet if you have a bezillion bytes of data, all in one place, it’s not much of a problem at all. Especially if the data is all of one type and it’s static.
You’ve got more of a problem if you have a gigabyte of data that is scattered all over the place. Video, social media, relational databases, data streams: lots of different data types, all held on different storage media. That IS a Big Data problem.
It’s even more of a problem if you have to analyse it all in real time. I know I’ll never get around to identifying any of the video files or JPEGs I’ve created, let alone analysing them!
Big data is not about volumes of structured data in block storage. It’s about a moving target.
There are big rewards in big data. The value is in helping enterprises steer themselves more skillfully, identify new markets instantly and launch new products quickly. Big data helps a heavyweight to move like a flyweight, without surrendering its knockout punch.
However, big data moves fast. Data streams and social media sentiment, for example, change by the minute. To make matters worse, much of it is unstructured information. Without some big data guns, like in-memory computing or Hadoop’s aggregation of processing power, analysing it is nigh on impossible.
But this feat has to be pulled off by someone because, according to IDC research, by 2020 the volume of data most companies manage will multiply thirty-five-fold.
Thanks to social interactions, mobile devices, facilities, equipment, R&D, simulations and physical infrastructure, we’re currently moving past 1.8ZB (1.8 trillion gigabytes). Who knows what other sources will come along. One thing’s for sure: they’re going to make big data even bigger.
It’s a great time to be selling big data solutions, as long as you can make your customers understand what big data is.