Fig 1: Big data is measured in petabytes or higher (Intel, 2013)

Source publication
Article
Full-text available
Data mining has been around for many years. The term data mining becomes big data mining when mining involves huge amounts of data with characteristics such as volume, velocity, and variety. Big data mining assumes importance as enterprises are producing data with exponential growth. Big data mining refers to mining voluminous data and extracting...

Similar publications

Conference Paper
Full-text available
The need to ensure privacy and data protection in educational contexts is driving a shift towards new ways of securing and managing learning records. Although there are platforms available to store educational activity traces outside of a central repository, no solution currently guarantees that these traces are authentic when they are retrieved fo...
Conference Paper
Full-text available
It is quite usual nowadays for some university courses to be taught to hundreds of students simultaneously. To accomplish the practical part of such courses, students are asked to implement practical assignments or projects and upload them to the Learning Management System for evaluation and final grading. Grading such stu...

Citations

... When weaknesses are detected in established paths, an adaptive probing technique is launched in an attempt to identify the faulty links. Faulty links are given reduced ratings and are subsequently avoided. Just and Kranakis [18] and Kargl et al. [19] proposed schemes for detecting selfish or malicious nodes in an ad hoc network. The schemes include probing mechanisms similar in functionality to that of Awerbuch et al. [6] above. Patwardhan and Iorga [20] presented a secure routing protocol called SecAODV. ...
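A minimal sketch of the link-rating behavior this passage describes, with hypothetical names, penalties, and thresholds of my own choosing: links that fail a probe are down-rated, and route selection then avoids poorly rated links.

```python
# Hypothetical sketch of probe-based link rating and avoidance; the penalty
# and threshold values are illustrative, not taken from the cited schemes.

RATING_PENALTY = 0.5      # multiplicative penalty for a link that fails a probe
AVOID_THRESHOLD = 0.2     # links rated below this are avoided entirely

link_rating = {}          # (node_a, node_b) -> rating in (0, 1]

def rate(link):
    return link_rating.get(link, 1.0)   # unknown links start fully trusted

def probe_failed(link):
    """Adaptive probing identified `link` as faulty: reduce its rating."""
    link_rating[link] = rate(link) * RATING_PENALTY

def path_score(path):
    """Score a path by its weakest link; a link below threshold disqualifies it."""
    ratings = [rate(link) for link in path]
    return min(ratings) if all(r >= AVOID_THRESHOLD for r in ratings) else 0.0

def best_path(candidate_paths):
    return max(candidate_paths, key=path_score)

# Example: after repeated probe failures on (x, d), routing avoids that path.
a = [("s", "x"), ("x", "d")]
b = [("s", "y"), ("y", "d")]
for _ in range(3):
    probe_failed(("x", "d"))
assert best_path([a, b]) == b
```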
Article
Full-text available
A Mobile Ad hoc Network (MANET) is a self-organized network of mobile nodes with a dynamic infrastructure. Designing secure routing protocols for it is extremely difficult because of these characteristics. Moreover, protocols are often designed under the assumption that there are no malicious or selfish nodes in the network. Consequently, researchers have made several efforts to design robust and secure routing protocols. In this paper, a review of the literature on essential secure routing protocols is presented. The survey is organized into Basic Routing Security Schemes, Trust-Based Routing Schemes, and Incentive-Based Schemes, which utilize detection and isolation mechanisms.
... [14] Big data mining focuses on information that provides comprehensive knowledge, representing predictive, current, and historical views that can help in making accurate business decisions. [15] Data mining can be applied to various fields that hold large amounts of data, but because the research area has a short history and has not yet passed through its 'adolescence' period, the position of data mining as a field of knowledge is still being debated. ...
... (2) Managerial implications: decision makers continuously spend hours trying to understand the insights in the data received from different sources (Sriramoju, 2017). At the same time, business decision-making capacity lies in the application of data logic and processes to find business information, that is, metrics for forecasting and problem solving, opportunities for innovation, long-term sustainability, etc. (Oswaldo, Sergio, Cáceres, & Schweimanns, 2016). ...
Article
Review of literature is a very critical part of the research journey. The purpose of this research was to provide a step-by-step guide that expedites understanding by presenting the critical components of the literature review process. We collected and synthesized business intelligence specific research papers from relevant journals with the help of a web aggregator. This research paper discusses the strategy of analyzing 553 business intelligence research papers published from 2007 to 2018. We utilized an exploratory research methodology to analyze the research conducted on BI solutions during the defined period. The research developed a holistic, theoretically grounded, and relevant approach for reviewing the literature on business intelligence. It specified, defined, and positioned the existing BI solution research and helped identify the areas which need further exploration.
... Data mining is a modern concept for analyzing data that may initially be inaccurate or heterogeneous, contain gaps, and come in huge volumes. The need for regular analysis of such data has arisen as a result of the spread of information technologies that allow detailed logging of the processes of production, trade, and finance (Sriramoju, 2017). Literally, data mining translates as the mining or digging of data. ...
Article
Full-text available
The behavior of agents ensuring financial security was analyzed on the basis of game theory, and the winning strategy, taking risk and uncertainty into account, was determined. Using data mining, the useful functions of this technology for ensuring financial security were identified: detection of suspicious transactions, credit risk analysis, analysis of client account reliability, prediction of financial indicators, and risk control. The effectiveness of various data mining algorithms was compared with respect to the nature of financial transactions and decision-making procedures in the financial security system. It was shown that the development of information technology has created a whole range of vulnerabilities in the financial system and, in particular, has transformed the form of money in modern conditions through the emergence of cryptocurrency. The influence of the formation and development of cryptocurrency on financial security at all levels of the economy, micro and macro, was analyzed.
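One of the data mining functions listed above is the detection of suspicious transactions. A minimal illustration of that idea, assuming a simple z-score rule of my own rather than the paper's actual method:

```python
# Hypothetical illustration (not the cited paper's method): flag a transaction
# whose amount deviates strongly from the account's own history.

from statistics import mean, stdev

def suspicious(history, amount, z_threshold=3.0):
    """Flag `amount` if it lies more than `z_threshold` standard deviations
    from the mean of the account's past transaction amounts."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return amount != mu
    return abs(amount - mu) / sigma > z_threshold

history = [120, 95, 130, 110, 105, 98, 125, 115]
print(suspicious(history, 118))    # False: in line with past behaviour
print(suspicious(history, 5000))   # True: flagged for review
```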
... Today, the availability of high-capacity networks, inexpensive computers and storage devices, along with the widespread adoption of hardware virtualization, service-oriented architecture, and autonomic and utility computing, has led to a growth in cloud computing. Cloud providers are experiencing growth rates of 50% per year [16]. ...
Article
Full-text available
Typically, cloud computing services are supplied by a third-party provider that owns the infrastructure. Cloud computing holds the potential to remove the need to set up high-cost computing infrastructure for the IT-based solutions and services the industry uses. It promises a flexible IT architecture, accessible through the internet from lightweight portable devices. Many sectors, such as banking, healthcare, and education, are moving toward the cloud because of the efficiency of services delivered under the pay-per-use pattern, based on resources such as processing power used, transactions performed, bandwidth consumed, data transferred, or storage space occupied. In a cloud computing environment, the entire data resides over a set of networked resources, enabling the data to be accessed through virtual machines. As the trend of using all services remotely on a pay-per-use basis, without owning the systems, keeps growing, cloud providers are expanding their service areas. In this paper, we present the range of services available across various computing systems and applications. As cloud computing plays a major role in nearly all organizations, we also present some of the concerning trends in cloud computing systems.
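The pay-per-use model the abstract describes can be made concrete with a toy bill calculation: the charge is the sum of each metered resource times its unit rate. All rates and usage figures below are invented purely for illustration.

```python
# Toy pay-per-use billing sketch; rates and usage are invented, not any
# real provider's pricing.

RATES = {
    "cpu_hours":      0.05,    # $ per CPU-hour of processing power used
    "transactions":   0.0001,  # $ per transaction performed
    "gb_transferred": 0.09,    # $ per GB of bandwidth consumed
    "gb_stored":      0.02,    # $ per GB-month of storage occupied
}

def monthly_bill(usage):
    """Sum each metered resource times its unit rate."""
    return sum(RATES[resource] * amount for resource, amount in usage.items())

usage = {"cpu_hours": 720, "transactions": 1_000_000,
         "gb_transferred": 150, "gb_stored": 500}
print(f"${monthly_bill(usage):.2f}")  # $159.50
```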
... In the era of big data, classification algorithms have important practical significance as a branch of data mining [6]. Random forest is one such classification algorithm. ...
... The total number of samples is given by Eq. (6). This paper standardizes all experimental data. In order to reduce randomness during the experiment and other factors interfering with the experimental results, multiple comparison experiments are conducted. ...
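The two passages above mention standardizing experimental data and random forest classification. A minimal end-to-end sketch with scikit-learn, using the iris dataset as a stand-in since the cited paper's experimental data is not available here:

```python
# Standardization plus random forest classification; iris stands in for the
# cited paper's experimental data.

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Standardize each feature to zero mean and unit variance, fitting the
# scaler on training data only to avoid information leakage.
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# Random forest: an ensemble of decision trees voting on the class label.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.3f}")
```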
... Defense mechanisms for mitigating XSS and SQL injection vulnerabilities focus either on client-side mechanisms or on server-side mechanisms. Client-side or browser-based mechanisms, for example Noxes [15], Noncespaces [5], or DSI [19], make changes to the browser infrastructure aiming to prevent the execution of injected scripts. All of these approaches require that end-users upgrade their browsers or install additional software; unfortunately, many users do not regularly update their systems [34]. ...
Article
Web applications have become an integral part of the daily lives of millions of users. Unfortunately, web applications are also frequently targeted by attackers, and critical vulnerabilities such as XSS and SQL injection are still common. As a consequence, much effort in the past decade has been spent on mitigating web application vulnerabilities. Current techniques focus mainly on sanitization: either on automated sanitization, the detection of missing sanitizers, the correctness of sanitizers, or the correct placement of sanitizers. However, these techniques are either unable to prevent new forms of input validation vulnerabilities such as HTTP Parameter Pollution, come with significant runtime overhead, lack precision, or require significant modifications to the client and/or server infrastructure. In this paper, we present IPAAS, a novel technique for preventing the exploitation of XSS and SQL injection vulnerabilities based on automated data type detection of input parameters. IPAAS automatically and transparently augments otherwise insecure web application development environments with input validators that result in significant and tangible security improvements for real systems. We implemented IPAAS for PHP and evaluated it on five real-world web applications with known XSS and SQL injection vulnerabilities. Our evaluation demonstrates that IPAAS would have prevented 83% of the SQL injection vulnerabilities and 65% of the XSS vulnerabilities while incurring no developer burden.
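As a rough sketch of the idea behind IPAAS as summarized above, one can infer a data type for each input parameter from values observed during development and testing, then reject requests whose parameters violate the learned type. The type lattice and names below are hypothetical; this is not the actual IPAAS implementation.

```python
# Hypothetical sketch of input-parameter type learning and enforcement;
# the type patterns are illustrative, not IPAAS's real type system.

import re

TYPE_PATTERNS = [            # ordered from most to least restrictive
    ("integer",   re.compile(r"^-?\d+$")),
    ("float",     re.compile(r"^-?\d+\.\d+$")),
    ("email",     re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$")),
    ("word",      re.compile(r"^\w+$")),
    ("free_text", re.compile(r"^[^<>'\";]*$")),  # last resort: no meta-chars
]

def infer_type(samples):
    """Pick the most restrictive type matching every observed sample."""
    for name, pattern in TYPE_PATTERNS:
        if all(pattern.match(s) for s in samples):
            return name
    return "free_text"

def validate(value, type_name):
    """Enforcement phase: the value must still match the learned type."""
    return bool(dict(TYPE_PATTERNS)[type_name].match(value))

# Learning phase: the `id` parameter only ever carried integers.
learned = {"id": infer_type(["1", "42", "1337"])}

print(validate("7", learned["id"]))            # True: well-typed request
print(validate("7 OR 1=1--", learned["id"]))   # False: injection rejected
```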
... Privacy-preserving data mining aims to produce correct data mining results without revealing sensitive information. Figure 2 shows the privacy-preserving data mining architecture [10]. Data mining techniques extract valuable information from data stores. ...
Article
The development of data mining technology brings a serious threat to individual information. The objective of privacy-preserving data mining (PPDM) is to safeguard the sensitive information contained in the data. Unwanted disclosure of sensitive information may happen during the process of delivering data mining results. In this study we identify four different types of users involved in a mining application: the data source provider, data receiver, data explorer, and decision maker. We would like to provide useful insights into the study of privacy-preserving data mining. This paper presents a comprehensive noise addition technique for protecting individual privacy in a data set used for classification, while maintaining the data quality. We add noise to all attributes, both numerical and categorical, and to both class and non-class attributes, in such a way that the original patterns are preserved in the perturbed data set. Our technique is also capable of incorporating previously proposed noise addition techniques that maintain the statistical parameters of the data set, including correlations among attributes. Thus the perturbed data set may be used not only for classification but also for statistical analysis.
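A minimal sketch of the noise-addition idea the abstract describes, not the authors' exact technique: numerical attributes receive zero-mean random noise and categorical attributes are flipped with small probability, so aggregate patterns survive while individual records no longer reveal exact values.

```python
# Hypothetical noise-addition sketch for PPDM; parameters are illustrative.

import random

def perturb_numeric(value, sigma):
    """Add zero-mean Gaussian noise; the column mean is preserved
    in expectation."""
    return value + random.gauss(0.0, sigma)

def perturb_categorical(value, domain, flip_prob=0.1):
    """With probability `flip_prob`, replace the value with a random other
    category; `domain` must contain at least one alternative."""
    if random.random() < flip_prob:
        return random.choice([v for v in domain if v != value])
    return value

record = {"age": 37, "zipcode": "12345", "diagnosis": "flu"}
perturbed = {
    "age": round(perturb_numeric(record["age"], sigma=3.0)),
    "zipcode": perturb_categorical(record["zipcode"],
                                   ["12345", "12346", "12347"]),
    # the abstract notes the class attribute can be perturbed the same way
    "diagnosis": perturb_categorical(record["diagnosis"],
                                     ["flu", "cold", "healthy"]),
}
print(perturbed)
```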
... These chunks are computed in the reduce functions. The data transfer delay can be comparable to, or even higher than, the time required to compute the data [1,9,10]. To overcome the transfer delay problem in the existing system, a novel optimized scheduling algorithm has been implemented in our study. ...
Article
Full-text available
Cloud computing is one of the emerging techniques for processing big data. Cloud computing is also known as service on demand. A large set or large volume of data is known as big data. Processing big data (MRI images and DICOM images) normally takes more time. Hard tasks such as handling big data can be solved by using the concepts of Hadoop. Enhancing the Hadoop concept helps the user to process large sets of images. The Hadoop Distributed File System (HDFS) and MapReduce are the two main functions used to enhance Hadoop. HDFS is the Hadoop file storage system, used for storing and retrieving data. MapReduce is the combination of two functions, namely map and reduce: map is the process of splitting the inputs, and reduce is the process of integrating the outputs of the map stage. Recently, medical experts have experienced problems like machine failure and fault tolerance while processing results for scanned data. A unique optimized time-scheduling algorithm, called the Dynamic Handover Reduce Function (DHRF) algorithm, is introduced in the reduce function. Enhancement of Hadoop and the cloud and introduction of DHRF help to overcome the processing risks, obtain an optimized result with less waiting time, and reduce the error percentage in the output image.
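The abstract explains map as splitting the input and reduce as integrating the map outputs. A self-contained word-count sketch of the two phases in plain Python (Hadoop itself is not required for the illustration; in a real deployment HDFS would supply the input splits):

```python
# Word-count illustration of the map and reduce phases described above.

from collections import defaultdict
from itertools import chain

def map_phase(split):
    """Map: split one input chunk into (key, value) pairs."""
    return [(word, 1) for word in split.split()]

def reduce_phase(pairs):
    """Reduce: group pairs by key, then integrate each group's values."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return {key: sum(values) for key, values in groups.items()}

splits = ["big data needs big storage", "map splits and reduce integrates"]
pairs = list(chain.from_iterable(map_phase(s) for s in splits))  # shuffle stage
print(reduce_phase(pairs))  # {'big': 2, 'data': 1, ...}
```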