Big Data is the buzzword of the technology world, and other industries, realizing its importance, are slowly shifting their paradigms and finding different ways of utilizing it. This time we are looking at an article on Big Data, to understand it better and to practice reading for IELTS Academic.
The passage below is 828 words long and is typical of what appears in IELTS Academic.
You should spend 20 minutes on this task.
In the early decades of the 20th century, Henry Ford devised a manufacturing system of mass production, using specialized machinery and standardized products. It quickly became the dominant vision of technological progress. ‘Fordism’ meant automation and assembly lines; for decades onward, this became the orthodoxy of manufacturing: out with skilled craftspeople and slow work, in with a new machine-made era. But it was more than just a new set of tools. The 20th century was marked by Fordism at a cellular level and it produced a new understanding of labor, the human relationship to work, and society at large.
Big Data refers not only to very large data sets and the tools and procedures used to manipulate and analyze them, but also to a computational turn in thought and research. Just as Ford changed the way we made cars – and then transformed work itself – Big Data has emerged as a system of knowledge that is already changing the objects of knowledge, while also having the power to inform how we understand human networks and community. ‘Change the instruments, and you will change the entire social theory that goes with them,’ Latour reminds us. Big Data creates a radical shift in how we think about research. Commenting on computational social science, Lazer et al argue that it offers ‘the capacity to collect and analyze data with an unprecedented breadth and depth and scale’.
It is not just a matter of scale, nor is it enough to consider it in terms of proximity, or what Moretti (2007) refers to as distant or close analysis of texts. Rather, it is a profound change at the levels of epistemology and ethics. Big Data reframes key questions about the constitution of knowledge and the processes of research, how we should engage with information, and the nature and the categorization of reality. Just as du Gay and Pryke note that accounting tools ‘do not simply aid the measurement of economic activity, they shape the reality they measure’, so Big Data stakes out new terrains of objects, methods of knowing, and definitions of social life. Speaking in praise of what he terms ‘The Petabyte Age’, Chris Anderson, Editor-in-Chief of Wired, writes: This is a world where massive amounts of data and applied mathematics replace every other tool that might be brought to bear. Out with every theory of human behavior, from linguistics to sociology. Forget taxonomy, ontology, and psychology. Who knows why people do what they do? The point is they do it, and we can track and measure it with unprecedented fidelity.
With enough data, the numbers speak for themselves. Do numbers speak for themselves? We believe the answer is ‘no’. Significantly, Anderson’s sweeping dismissal of all other theories and disciplines is a tell: it reveals an arrogant undercurrent in many Big Data debates where other forms of analysis are too easily sidelined. Other methods for ascertaining why people do things, write things, or make things are lost in the sheer volume of numbers. This is not a space that has been welcoming to older forms of intellectual craft. As David Berry writes, Big Data provides ‘destabilising amounts of knowledge and information that lack the regulating force of philosophy.’ Instead of philosophy – which Kant saw as the rational basis for all institutions – ‘computationality might then be understood as an ontotheology, creating a new ontological “epoch” as a new historical constellation of intelligibility’. We must ask difficult questions of Big Data’s models of intelligibility before they crystallize into new orthodoxies.
If we return to Ford, his innovation was using the assembly line to break down interconnected, holistic tasks into simple, atomized, mechanistic ones. He did this by designing specialized tools that strongly predetermined and limited the action of the worker. Similarly, the specialized tools of Big Data have their own inbuilt limitations and restrictions. Twitter and Facebook, for example, are Big Data sources that offer very poor archiving and search functions. Consequently, researchers are much more likely to focus on something in the present or immediate past – tracking reactions to an election, TV finale or natural disaster – because of the sheer difficulty or impossibility of accessing older data. If we are observing the automation of particular kinds of research functions, then we must consider the inbuilt flaws of the machine tools. It is not enough to simply ask, as Anderson has suggested, ‘what can science learn from Google?’; we must also ask how the harvesters of Big Data might change the meaning of learning, and what new possibilities and new limitations may come with these systems of knowing.
Claims to Objectivity and Accuracy are Misleading
‘Numbers, numbers, numbers,’ writes Latour (2010).
‘Sociology has been obsessed with the goal of becoming a quantitative science.’ Sociology has never reached this goal, in Latour’s view, because of where it draws the line between what is and is not quantifiable knowledge in the social domain. Big Data offers the humanistic disciplines a new way to claim the status of quantitative science and objective method. It makes many more social spaces quantifiable. In reality, working with Big Data is still subjective, and what it quantifies does not necessarily have a closer claim on objective truth – particularly when considering messages from social media sites. But there remains a mistaken belief that qualitative researchers are in the business of interpreting stories and quantitative researchers are in the business of producing facts. In this way, Big Data risks reinscribing established divisions in the long-running debates about scientific method and the legitimacy of social science and humanistic inquiry.
(Passage adapted from boyd, danah and Kate Crawford. (2012). “Critical Questions for Big Data: Provocations for a Cultural, Technological, and Scholarly Phenomenon.”)
Do the following statements agree with the information given in the passage?
- Automation and assembly lines were considered synonymous with Henry Ford.
- Big Data has transformed the objects of knowledge and transformed the way we understand humans.
- Most of the tools in the current age tend to get replaced with applied mathematics and computations.
- As per David Berry, Big Data lacks the regulating forces of philosophy.
- All of the Big Data sources are poor sources of archiving and search functions.
- Qualitative researchers are in the business of producing facts.
Let us now learn the vocabulary used in the above passage –
- Orthodoxy – an authorized or generally accepted theory
- Computational – relating to the process of mathematical calculation
- Unprecedented – never known or done before
- Proximity – nearness in space, time, or relationship
- Epistemology – the theory of knowledge, especially with regard to its methods, validity and scope, and the distinction between justified belief and opinion
- Linguistics – the scientific study of language and its structure
- Ontology – the branch of metaphysics dealing with the nature of being
- Psychology – the scientific study of the human mind and its functions
- Fidelity – faithfulness to a person, cause or belief, demonstrated by continuing loyalty and support
- Ontotheology – the theology or science of being
- Epoch – a particular period of time in history or a person’s life
- Holistic – characterized by the belief that the parts of something are intimately connected and explicable only by reference to the whole
- Provocation – action or speech that makes someone angry, especially deliberately
- Reinscribe – to re-establish or rename in a new and especially stronger form or context