Concept Analysis: Big Data
(Assignment-1)
Shalabh Singh, Research Scholar, Decision Sciences Area
Indian Institute of Management, Lucknow
Email: fpm15014@iiml.ac.in, shalabhapril02@gmail.com
Originality Certificate
The work submitted is my own and has not been taken from any source, print or electronic. Proper referencing has been provided wherever an idea or text has been drawn from elsewhere.
Shalabh Singh
Concept Analysis: Big Data
(Assignment-1)
1. How and when did the concept under analysis come into existence? What was the essence of the concept, and what was its distinctiveness to start with?
In his famous book “The Human Face of Big Data”, Rick Smolan describes big data as “the process of helping the planet grow a nervous system, one in which we are just another, …”
One of the first attempts to quantify the rate of flow of data and its volume was made in 1944, when Fremont Rider, a librarian at Wesleyan University, estimated that American university libraries were doubling in size every sixteen years. He speculated that by 2040 the Yale library would hold around 200,000,000 volumes, spread over 6,000 miles of shelf space. Subsequently, many scholarly articles used the term “Information Explosion” in different contexts.
The term “Big Data” was first coined by Cox and Ellsworth [] in 1997 in an IEEE proceeding. They referred to “Big Data” as huge data sets that taxed the capacities of main memory and of local and remote disks. Bryson et al. in 1999 [] were the first to publish a CACM article on Big Data. This paper pointed out the importance of the huge volumes of data stored on supercomputers and remote servers, and how they could provide valuable insights. Steve Lohr did extensive research on Big Data and published an article in The New York Times titled “The Origin of Big Data”.
The early key attributes of Big Data, as indicated by Laney, were Velocity, Volume, and Variety.
Today there are around 5 billion mobile phone users and between 2 and 3 billion internet users worldwide. Predictions suggest that information exchange via telecommunications will be in the range of 750 exabytes (750 × 1024 petabytes = 750 × 1024 × 1024 terabytes) annually by 2015. This presents both a huge challenge and a huge opportunity for companies that must handle and analyze this data. Today, data analytics is a key area of interest, through which key insights can be drawn about customer buying behavior and future sales forecasts.
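To make the unit arithmetic above concrete, here is a minimal sketch in Java, assuming the binary convention used in the text (1 exabyte = 1024 petabytes, 1 petabyte = 1024 terabytes); the class name is illustrative:

// Sketch: converting the projected 750 exabytes/year into smaller units,
// using the binary convention from the text (1 EB = 1024 PB, 1 PB = 1024 TB).
public class DataVolumeSketch {
    static final long PB_PER_EB = 1024L;
    static final long TB_PER_PB = 1024L;

    public static void main(String[] args) {
        long exabytes = 750L;
        long petabytes = exabytes * PB_PER_EB;   // 768,000 PB
        long terabytes = petabytes * TB_PER_PB;  // 786,432,000 TB
        System.out.printf("%d EB = %,d PB = %,d TB per year%n",
                exabytes, petabytes, terabytes);
    }
}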
In 2012, Gartner added one more V, “Veracity”, to the earlier attributes of Big Data. Along with these four V’s, further characteristics have since been attributed to Big Data:
• Velocity: availability of Big Data in real time.
• Volume: Big Data is not sampled; it is observed in full and used in predictive analysis.
• Variety: Big Data can come in various forms, such as images, text, and videos.
• Veracity: high-quality data is needed for accurate analysis and trend forecasting.
• Complexity: Big Data can come from various sources and must therefore be connected, linked, and correlated to yield correct results.
Based on these characteristics, Big Data is now used in various contexts and thus finds many applications.
Hadoop [8] is an open-source implementation of the MapReduce programming model which runs in a distributed environment. Hadoop consists of two core components: the Hadoop Distributed File System (HDFS) and the MapReduce programming and job-management framework. Both HDFS and MapReduce follow a master-slave architecture. A Hadoop client program submits a job to the MapReduce framework through the JobTracker, which runs on the master node. The JobTracker assigns tasks to the TaskTrackers running on the many slave nodes of the cluster.
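To make the model concrete, below is a minimal word-count job in the spirit of the canonical Hadoop MapReduce example, written against the org.apache.hadoop.mapreduce Java API (class and path names are illustrative, and details vary across Hadoop versions):

import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Map phase: runs in parallel on the slave nodes; emits (word, 1)
    // for every token in its input split.
    public static class TokenizerMapper
            extends Mapper<Object, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private final Text word = new Text();

        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, one);
            }
        }
    }

    // Reduce phase: receives all counts for one word and sums them.
    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    // Driver: the client code that submits the job to the framework.
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // input in HDFS
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // output in HDFS
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

The map tasks emit (word, 1) pairs in parallel across the slave nodes, and the reduce tasks sum the counts for each word; the framework itself handles the intermediate shuffle and sort between the two phases.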
Big Data has been a much-discussed technology area over the last several years. In the last 18 months, companies in the retail, manufacturing, construction, and technology sectors have realized its extreme potential and are trying to gain maximum advantage from it. A case in point is Amazon, a purely online sales portal which, according to leading web-rating organizations, ranks between 4 and 10 among websites globally; its presence in the virtual world of the internet is unquestionable.
Even though organizations hold huge amounts of data, they often cannot use it effectively because it is unstructured. However, new technologies are now available that enable the analysis of large, complex, unstructured data. Technology has become easily accessible and, as a result, the amount of data available to enterprises has increased massively. How useful that data is depends on how well it is stored, managed, and analyzed. Big Data is thus an upcoming and emerging trend in the field of information technology.
There are many different definitions of Big Data. SAS (n.d.), an analytics software company, describes it as “a popular term used to describe the exponential growth and availability of data, both structured and unstructured.” Many think Big Data has only just come into existence, but it has been around for years: banks, retailers, and advertisers have long been using big data for marketing purposes.