While "big data" is a very broad trend, it does have a meaningful definition:
"big data" is when the size of the data itself becomes part of the problem. [What is data science?]
When the size of the data becomes a problem in itself depends entirely on the hardware resources at your disposal. When we had PCs with 640KB of RAM, a 20MB disk, and an 80386 processor, any gigabyte-sized database was very difficult to handle; today, gigabyte-sized databases can be processed almost instantaneously on commodity hardware. Therefore, "big data" today means datasets in the terabyte or petabyte range, depending on the resources available.
What has changed in the past 20 years is the commoditization of "big data". Traditionally, "big data" was the domain of oil companies, large banks, and "big science" projects, but now everybody seems to have embraced data-driven decision making.
In the past few years, big data has essentially gone from zero to hero in the enterprise tech world. Except for one small thing: it hasn't, really. Many seem to have forgotten that big data was around, and being put to good use, well before it became the buzzword du jour.
- 10 ways big data changes everything (gigaom.com)
- Big Data is Big Business - $50B Market by 2012 (prweb.com)
- Big Data and the Stalker Economy (forbes.com)
- The Dark Side of Big Data: Pseudo-Science & Fooled By Randomness (java.dzone.com)