What does "big data" mean?


While "big data" is a very broad trend, it actually has a meaningful definition:

"Big data" is when the size of the data itself becomes part of the problem. [What is data science?]

When the size of the data becomes a problem in itself depends entirely on the hardware resources at your disposal. When we had PCs with 640 KB of RAM, 20 MB of disk, and an 80386 processor, any gigabyte-sized database was very difficult to handle, whereas today gigabyte-sized databases can be processed almost instantly on commodity hardware. Therefore, "big data" today means datasets in the terabyte or petabyte range, depending on the resources available.
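The relativity the paragraph describes can be sketched in a few lines: whether a dataset is "big" is just a comparison against the resources at hand. This is a minimal illustration, not anything from the article; the 8 GB memory budget and the `is_big_data` helper are assumptions chosen for the example.

```python
# Hypothetical sketch: "big data" is relative to the machine's resources.
# The 8 GB budget below is an assumed figure for a commodity machine today.
MEMORY_BUDGET_BYTES = 8 * 1024**3  # 8 GiB

def is_big_data(dataset_bytes, budget=MEMORY_BUDGET_BYTES):
    """A dataset starts to be 'big' once it no longer fits comfortably in memory."""
    return dataset_bytes > budget

# A gigabyte was "big" for a 640 KB PC, but is trivial on today's hardware:
print(is_big_data(1 * 1024**3))   # 1 GiB on an 8 GiB machine -> False
print(is_big_data(10 * 1024**4))  # 10 TiB exceeds the budget  -> True
```

With a 1980s budget of 640 KB, the same 1 GiB dataset would cross the threshold, which is the article's point: the definition moves with the hardware.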

What has changed in the past 20 years is the commoditization of "big data". Traditionally, "big data" was the domain of oil companies, large banks, and "big science" projects, but now everybody seems to have embraced data-driven decision making.

Is big data new, or have we forgotten its old heroes?

In the past few years, big data has essentially gone from zero to hero in the enterprise tech world. Except for one small thing: it hasn't, really. Many seem to have forgotten that big data was around, and being put to good use, well before it became the buzzword du jour.


via: gigaom.com