In information technology, the notion of Big Data refers to a set of approaches, tools, and methods for processing structured and unstructured data of enormous volume and significant diversity, so that the results can be perceived by humans and remain useful under conditions of continuous growth and distribution across multiple nodes of a computer network. The concept took shape at the end of the 2000s as an alternative to traditional database management systems and Business Intelligence class solutions. The toolset includes means for massively parallel processing of loosely structured data, above all NoSQL solutions, software frameworks implementing MapReduce algorithms, and the libraries of the Hadoop project.
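To make the MapReduce model concrete, the following is a minimal, single-process sketch in plain Python; the function names and the word-count task are illustrative assumptions, not the API of Hadoop or any particular framework. The map step emits key-value pairs from each input record, a shuffle step groups the intermediate values by key, and the reduce step aggregates each group; in a real system the map and reduce work would be spread across many nodes.

```python
from collections import defaultdict

def map_phase(record):
    """Emit (word, 1) for every word in one input record."""
    for word in record.split():
        yield word.lower(), 1

def reduce_phase(key, values):
    """Aggregate all counts emitted for one key."""
    return key, sum(values)

def mapreduce(records):
    # Shuffle: group intermediate values by key, as a distributed
    # framework would do across the network.
    groups = defaultdict(list)
    for record in records:
        for key, value in map_phase(record):
            groups[key].append(value)
    return dict(reduce_phase(k, v) for k, v in groups.items())

if __name__ == "__main__":
    logs = ["big data needs big tools", "data tools scale out"]
    print(mapreduce(logs))
    # {'big': 2, 'data': 2, 'needs': 1, 'tools': 2, 'scale': 1, 'out': 1}
```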
Examples of sources of Big Data include continuously arriving events from radio-frequency identifiers (RFID), readings from measuring devices, meteorological data, message streams from social networks, streams of location data from the devices of cellular-network subscribers, remote-sensing data, and video and audio recordings. The continued development and widespread use of these sources is expected to drive the penetration of Big Data technologies into research activities, the commercial sector, and government administration alike.
The defining characteristics of the Big Data concept are often summarized as the "three Vs": volume (the physical volume of the data), velocity (both the rate at which data grows and the need for high-speed processing and delivery of results), and variety (the ability to simultaneously process different types of structured and semi-structured data).
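A small illustration of the "variety" dimension, using hypothetical inputs of my own: the same logical event may arrive as structured CSV, semi-structured JSON, or free text, and a processing pipeline must normalize all three into one record shape before analysis.

```python
import csv
import io
import json

def normalize(raw, fmt):
    """Coerce one event, arriving in any of three formats,
    into a common (user, action) record."""
    if fmt == "json":                      # semi-structured
        obj = json.loads(raw)
        return obj["user"], obj["action"]
    if fmt == "csv":                       # structured
        row = next(csv.reader(io.StringIO(raw)))
        return row[0], row[1]
    # unstructured free text, e.g. "alice login"
    user, action = raw.split(maxsplit=1)
    return user, action

events = [
    ('{"user": "alice", "action": "login"}', "json"),
    ("bob,logout", "csv"),
    ("carol search", "text"),
]
print([normalize(raw, fmt) for raw, fmt in events])
# [('alice', 'login'), ('bob', 'logout'), ('carol', 'search')]
```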
The introduction of the term "Big Data" is attributed to Clifford Lynch, who prepared the September 3, 2008 special issue of the journal Nature devoted to the question of how technologies that open up the possibility of working with large volumes of data could influence the future of science. The issue presented material on the phenomenon of explosive growth and diversification of data and on the technological prospects of a likely leap "from quantity to quality"; the term was proposed by analogy with metaphors conventional in the English-speaking business environment, such as "big oil" and "big ore."
Although the term was introduced in an academic setting, above all in connection with the growth and diversity of scientific data, from 2009 onward it spread widely in the business press, and in 2010 the first products and solutions addressing specifically and directly the problem of big data processing appeared. By 2011, most of the largest information technology vendors, including IBM, Oracle, Microsoft, Hewlett-Packard, and EMC, were using the concept of Big Data in their business strategies, and leading IT market analysts were devoting dedicated research to it.