I’m not sure it’s needed, but frankly, when the topic arises (and it does all the time) it’s just too tempting to pass up. Any definition is a bit circular, as “Big” data is still, of course, data.
Data is a set of qualitative or quantitative variables – it can be structured or unstructured, machine-readable or not, digital or analogue, personal or not. Ultimately it is a specific set, or sets, of individual data points, which can be used to generate insights, or be combined and abstracted to create information, knowledge and wisdom. Traditional analysis tools and software can be used to analyse and “crunch” data.
There are “dimensions” that distinguish data from BIG DATA, often summarised as the “3 Vs”: Volume, Variety, Velocity. Hence BIG DATA is not just “more” data. It is so much data, so mixed and unstructured, and accumulating so rapidly, that traditional techniques, methodologies and “normal” software (like Excel, Crystal Reports or similar) do not really work. Gartner stated that in 2011 the rate of global data growth was around 59%. Since 59% growth means the new data makes up 0.59/1.59, or roughly 37%, of the total, almost 40% of all data ever created was created in the previous year – and I am sure it is even more now...
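If you want to sanity-check that arithmetic yourself, here is a small sketch (the 59% figure is the Gartner number quoted above; everything else is just the growth-to-share conversion):

```python
# If the total stock of data grew by 59% in one year, then the data created
# during that year, as a share of all data ever created, is:
#   growth / (1 + growth)
growth = 0.59  # Gartner's quoted 2011 global data growth rate (~59%)

share_created_last_year = growth / (1 + growth)
print(f"{share_created_last_year:.1%}")  # roughly 37% - i.e. "almost 40%"
```

The same formula works for any growth rate, which is why the share can never quite reach the headline growth figure itself.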