Summary Of Concept Analysis: Big Data


Concept Analysis: Big Data
(Assignment-1)
Shalabh Singh, Research Scholar Decision Sciences Area
Indian Institute of Management, Lucknow
Email: fpm15014@iiml.ac.in, shalabhapril02@gmail.com

Originality Certificate

The work submitted is my own (not taken from any source, print or electronic). Proper referencing has been done wherever an idea or text has been borrowed.

Shalabh Singh


1. How and when did the concept under analysis come into existence? What was the essence of the concept, and its distinctiveness, to start with?
In his famous book “The Human Face of Big Data”, Rick Smolan describes big data as “the process of helping the planet grow a nervous system, one in which we are just another …”

In 1944, Fremont Rider, a librarian at Wesleyan University, made one of the first attempts to quantify the volume of data and its rate of growth: he estimated that American university libraries were doubling in size every sixteen years. On this basis he speculated that by 2040 the Yale library would hold around 200,000,000 volumes, spread over 6,000 miles of shelf space. Subsequently, many scholarly articles used the term “Information Explosion” in different contexts.
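Rider's doubling estimate can be checked with simple compound-growth arithmetic. A minimal sketch (the 16-year doubling period is from the text; the function name and 1944–2040 span used here are just for illustration):

```python
def growth_factor(start_year, end_year, doubling_period=16):
    """Multiplicative growth assuming size doubles every `doubling_period` years."""
    doublings = (end_year - start_year) / doubling_period
    return 2 ** doublings

# From 1944 to 2040 is 96 years, i.e. 6 doubling periods,
# so a collection grows by a factor of 2**6 = 64 over that span.
print(growth_factor(1944, 2040))  # 64.0
```

Six doublings in under a century is why Rider's projected shelf space runs into thousands of miles.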
The term “Big Data” was first coined by Cox and Ellsworth [] in 1997 in an IEEE conference proceeding. They used “Big Data” to refer to huge data sets that taxed the memory capacities of local and remote disks. Bryson et al. in 1999 [] were the first to publish a CACM article on Big Data; the paper pointed out the importance of the huge volumes of data stored on supercomputers and remote servers and how they could provide valuable insights. Steve Lohr did extensive research on Big Data and published an article in the New York Times titled “The Origin of Big Data”.
The early key attributes of Big Data, as indicated by Laney, were Velocity, Volume and Variety.

Today there are around 5 billion mobile phone users and between 2 and 3 billion internet users worldwide. Predictions suggested that information exchanged via telecommunications would reach roughly 750 exabytes (750 × 1024 petabytes = 750 × 1024 × 1024 terabytes) annually by 2015. This presents both a huge challenge and a huge opportunity for companies that must store and analyze such data. Today, data analytics is a key area of interest, wherein key insights can be drawn about customer buying behavior and future sales can be forecast.
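The unit conversion in the parenthesis above can be made concrete. A quick sketch using the binary prefixes the text assumes (1 exabyte = 1024 petabytes, 1 petabyte = 1024 terabytes; the 750 EB figure is the text's projection):

```python
# Binary-prefix conversion factors, as used in the text.
EB_TO_PB = 1024
PB_TO_TB = 1024

annual_eb = 750                    # projected annual traffic (exabytes)
annual_pb = annual_eb * EB_TO_PB   # 768,000 petabytes
annual_tb = annual_pb * PB_TO_TB   # 786,432,000 terabytes

print(annual_pb, annual_tb)  # 768000 786432000
```

That is, on the order of three-quarters of a billion terabytes per year, which illustrates the scale that conventional storage and analysis tools were not designed for.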
In 2012, Gartner added one more V, “Veracity”, to the earlier attributes of Big Data. Along with these 4 V's, further characteristics have been attributed to Big Data:
• Velocity: Availability of Real-Time Big Data
• Volume: Big Data is not sampled; it is observed in full and used in predictive analysis.
• Variety: Big Data can be in various forms such as image, text, videos etc.
• Veracity: High-quality data is needed for accurate analysis and trend forecasting.
• Complexity: Big Data can come from various sources and thus must be connected, linked and correlated for correct results.
Based on these characteristics, Big Data is now used in various contexts and thus finds many applications.
