Search results for: 3D plant data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27251


24611 Small Community’s Proactive Thinking to Move from Zero to 100 Percent Water Reuse

Authors: Raj Chavan

Abstract:

The City of Jal serves a population of approximately 3,500 people, including 2,100 permanent inhabitants and 1,400 oil and gas sector workers and RV park occupants. Over the past three years, Jal's population has increased by about 70 percent, mostly due to the oil and gas industry. The City anticipates that the population will exceed 4,200 by 2020, necessitating the construction of a new wastewater treatment plant (WWTP) because the old plant (aerated lagoon system) cannot accommodate such rapid population expansion without major renovations or replacement. Adhering to discharge permit restrictions has been challenging due to aging infrastructure and equipment replacement needs, as well as increasing nutrient loading to the wastewater collecting system from the additional oil and gas residents' recreational vehicles. The WWTP has not been able to maintain permit discharge standards for total nitrogen of less than 20 mg N/L and other characteristics in recent years. Based on discussions with the state's environmental department, it is likely that the future permit renewal would impose stricter conditions. Given its location in the dry, western part of the country, the City must rely on its meager groundwater supplies and scant annual precipitation. The city's groundwater supplies will be depleted sooner than predicted due to rising demand from the growing population for drinking, leisure, and other industrial uses (fracking). The sole type of reuse the city was engaging in (recreational reuse for a golf course) had to be put on hold because of an effluent water compliance issue. As of right now, all treated effluent is evaporated. The city's long-term goal is to become a zero-waste community that sends all of its treated wastewater effluent either to the golf course, Jal Lake, or the oil and gas industry for reuse. Hydraulic fracturing uses a lot of water, but if the oil and gas industry can use recycled water, it can reduce its impact on freshwater supplies. 
The City's goal of 100% reuse has been delayed by the difficulty of meeting the constraints of the regular discharge permit, given the large rise in influent loads and the aging infrastructure. The City of Jal plans to build a new WWTP that can keep up with the city's rapid population increase driven by the oil and gas industry. Several treatment methods were considered in light of the City's needs and its long-term goals, but a membrane bioreactor (MBR) was ultimately recommended since it meets all of the permit's requirements while also enabling 100 percent beneficial reuse. This talk will lay out the plan for the city to reach its goal of 100 percent reuse, as well as the various funding avenues that have been considered for the small community.

Keywords: membrane bioreactor, nitrogen, reuse, small community

Procedia PDF Downloads 73
24610 Materialized View Effect on Query Performance

Authors: Yusuf Ziya Ayık, Ferhat Kahveci

Abstract:

Database management systems currently provide various tools, such as backup and maintenance, along with statistical information on resource usage and security. In terms of query performance, this paper covers query optimization, views, indexed tables, pre-computed materialized views, and query performance analysis, in which alternative query plans can be created and the least costly one selected to optimize a query. Indexes and views can be created on related table columns. The literature review for this study showed that, over time, despite the growing capabilities of database management systems, only database administrators have been aware of the need to treat archival and transactional data differently. Data may be constantly changing, as in everyday transactional use, or fixed, as in a completed questionnaire whose data input is finished. The database applies its capabilities to both types; but as shown in the findings section, instead of repeating similar heavy calculations that produce the same results for the same query over survey data, reading from a materialized view is the simpler option. In this study, this performance difference was observed quantitatively by considering the cost of the query.
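The abstract gives no code; as a minimal sketch of the idea, the snippet below simulates a materialized view in SQLite (which has no native materialized views) by precomputing an aggregate once into a table, so later reads avoid repeating the same heavy GROUP BY over the raw survey rows. Table and column names are illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Base table of survey answers (completed-questionnaire data: effectively archival).
cur.execute("CREATE TABLE responses (question_id INTEGER, score INTEGER)")
cur.executemany(
    "INSERT INTO responses VALUES (?, ?)",
    [(q, s) for q in range(3) for s in (1, 2, 3, 4, 5)],
)

# "Materialized view": the aggregate is computed once and stored as a table.
cur.execute(
    """CREATE TABLE mv_avg_score AS
       SELECT question_id, AVG(score) AS avg_score
       FROM responses GROUP BY question_id"""
)

# Cheap read from the precomputed result instead of re-aggregating every time.
rows = cur.execute(
    "SELECT question_id, avg_score FROM mv_avg_score ORDER BY question_id"
).fetchall()
print(rows)  # [(0, 3.0), (1, 3.0), (2, 3.0)]
```

In a production system (e.g. PostgreSQL) the same effect is obtained with CREATE MATERIALIZED VIEW plus a periodic REFRESH; the cost trade-off is storage and staleness in exchange for cheaper reads.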

Keywords: cost of query, database management systems, materialized view, query performance

Procedia PDF Downloads 264
24609 An AK-Chart for the Non-Normal Data

Authors: Chia-Hau Liu, Tai-Yue Wang

Abstract:

Traditional multivariate control charts assume that measurements from manufacturing processes follow a multivariate normal distribution. However, this assumption may not hold, or may be difficult to verify, because in practice not all measurements from manufacturing processes are normally distributed. This study develops a new multivariate control chart for monitoring processes with non-normal data. We propose a mechanism that integrates a one-class classification method with an adaptive technique. The adaptive technique is used to improve the sensitivity of one-class classification to small shifts in statistical process control. In addition, this design provides an easy way to allocate the type I error, making the chart easier to implement. Finally, a simulation study and real data from industry are used to demonstrate the effectiveness of the proposed control chart.
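The paper's AK-chart is not specified here; as a hedged, non-adaptive sketch of the underlying one-class idea, the snippet below monitors non-normal bivariate data with a k-nearest-neighbour distance statistic, where the control limit is set as an empirical quantile of in-control scores so that the type I error α is allocated directly. The choice of k, the data, and the distribution are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def knn_distance(point, train, k=3):
    """One-class monitoring statistic: distance to the k-th nearest
    in-control training sample (small = consistent with in-control data)."""
    d = np.sort(np.linalg.norm(train - point, axis=1))
    return d[k - 1]

# In-control data: deliberately non-normal (exponential margins).
train = rng.exponential(1.0, size=(200, 2))

# Control limit = (1 - alpha) empirical quantile of in-control scores,
# which allocates the type I error alpha directly, without normality.
alpha = 0.05
scores = np.array([knn_distance(x, train) for x in train])
limit = np.quantile(scores, 1 - alpha)

# A clearly shifted observation should exceed the limit and signal.
shifted = np.array([8.0, 8.0])
print(knn_distance(shifted, train) > limit)  # signals out-of-control here
```

A real AK-chart would additionally adapt the statistic over time to sharpen sensitivity to small shifts; this sketch only shows the one-class limit-setting step.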

Keywords: multivariate control chart, statistical process control, one-class classification method, non-normal data

Procedia PDF Downloads 411
24608 Text Mining of Veterinary Forums for Epidemiological Surveillance Supplementation

Authors: Samuel Munaf, Kevin Swingler, Franz Brülisauer, Anthony O’Hare, George Gunn, Aaron Reeves

Abstract:

Web scraping and text mining are popular computer science methods deployed by public health researchers to augment traditional epidemiological surveillance. However, within veterinary disease surveillance, such techniques are still in the early stages of development and have not yet been fully utilised. This study presents an exploration into the utility of incorporating internet-based data to better understand the smallholder farming communities within Scotland, by means of online text extraction and the subsequent mining of this data. Web scraping of livestock fora was conducted in conjunction with text mining of the data in search of common themes, words, and topics found within the text. Results from bi-grams and topic modelling uncover four main topics of interest within the data pertaining to aspects of livestock husbandry: feeding, breeding, slaughter, and disposal. These topics were found amongst both the poultry and pig sub-forums. Topic modelling appears to be a useful method of unsupervised classification for this form of data, as it has produced clusters that relate to biosecurity and animal welfare. Internet data can be a very effective tool in aiding traditional veterinary surveillance methods, but the requirement for human validation of such data is crucial. This opens avenues of research via the incorporation of other dynamic social media data, namely Twitter and Facebook/Meta, in addition to time series analysis to highlight temporal patterns.
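The bi-gram step mentioned above can be sketched in a few lines of standard-library Python; the forum posts below are invented examples, not data from the study.

```python
from collections import Counter

# Hypothetical scraped forum posts (illustrative, not the study's corpus).
posts = [
    "advice on feeding weaner pigs before slaughter",
    "feeding schedule for backyard poultry breeding",
    "carcass disposal rules after home slaughter",
]

# Tokenise naively and count adjacent word pairs (bi-grams) across posts.
bigrams = Counter()
for post in posts:
    tokens = post.lower().split()
    bigrams.update(zip(tokens, tokens[1:]))

print(bigrams.most_common(2))
```

Real pipelines would add stop-word removal, lemmatisation, and a topic model (e.g. LDA) on top of counts like these; the human-validation caveat in the abstract applies at every stage.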

Keywords: veterinary epidemiology, disease surveillance, infodemiology, infoveillance, smallholding, social media, web scraping, sentiment analysis, geolocation, text mining, NLP

Procedia PDF Downloads 79
24607 Rhizobia-Containing Rhizobacterial Consortia and Intercropping Improved Faba Bean and Wheat Performances Under Stress Combining Drought and Phosphorus Deficiency

Authors: Said Cheto, Khawla Oukaltouma, Imane Chamkhi, Ammar Ibn Yasser, Bouchra Benmrid, Ahmed Qaddoury, Lamfeddal Kouisni, Joerg Geistlinger, Youssef Zeroual, Adnane Bargaz, Cherki Ghoulam

Abstract:

Our study aimed to assess the role of inoculating faba bean/wheat intercrops with selected rhizobacterial consortia, each gathering one rhizobia strain and two phosphate-solubilizing bacteria (PSB), in alleviating the effects of combined water deficit and P limitation on faba bean/wheat intercrops versus monocrops under greenhouse conditions. One Vicia faba L. variety (Aguadulce, "Ag") and one Triticum durum L. variety (Karim, "K") were grown as sole crops or intercrops in pots containing sterilized substrate (sand:peat, 4:1 v/v) supplied either with rock phosphate (RP) as the sole P source (P limitation) or with KH₂PO₄ in nutrient solution (P-sufficient control). Plants were inoculated with two rhizobacterial consortia: C1 (Rhizobium laguerreae, Kocuria sp., and Pseudomonas sp.) and C2 (R. laguerreae, Rahnella sp., and Kocuria sp.). Two weeks after inoculation, the plants were subjected to a water deficit of 40% of substrate water holding capacity (WHC), versus 80% WHC for well-watered plants. At the flowering stage, the trial was assessed, and the results showed that inoculation with both consortia (C1 and C2) improved faba bean biomass in terms of shoots, roots, and nodules compared to inoculation with rhizobia alone; in particular, C2 improved these parameters by 19.03, 78.99, and 72.73%, respectively. Leaf relative water content decreased under combined stress, particularly in response to C1, with a significant improvement of this parameter in wheat intercrops. For faba bean under P limitation, inoculation with C2 increased stomatal conductance (gs) by 35.73% compared to plants inoculated with rhizobia alone. Furthermore, the same inoculum, C2, improved membrane stability by 44.33%, versus 16.16% for C1, compared to inoculation with rhizobia alone under P deficit. For sole-cropped faba bean plants, inoculation with both consortia improved N accumulation compared to inoculation with rhizobia alone, with an increase of 70.75% under P limitation. Moreover, under combined stress, inoculation of the intercrop with C2 improved plant biomass and N content (112.98%) in wheat plants compared to the sole crop. Our findings revealed that consortium C2 may offer an agronomic advantage under water and P deficit and could be used as an inoculum for enhancing faba bean and wheat production under both monocropping and intercropping systems.

Keywords: drought, phosphorus, intercropping, PSB, rhizobia, Vicia faba, Triticum durum

Procedia PDF Downloads 58
24606 Panel Application for Determining Impact of Real Exchange Rate and Security on Tourism Revenues: Countries with Middle and High Level Tourism Income

Authors: M. Koray Cetin, Mehmet Mert

Abstract:

The purpose of the study is to examine the impacts of the exchange rate and a country's overall security level on tourism revenues. Numerous studies examine the bidirectional relation between macroeconomic factors and tourism revenues or tourism demand. Most of these studies support the existence of an impact of tourism revenues on the growth rate, but not vice versa. Few studies examine the impact of factors such as the real exchange rate or purchasing power parity on tourism revenues. In this context, the impact of the real exchange rate on tourism revenues is examined first, because the exchange rate is one of the main determinants of the price of international tourism services in the guest's currency. Another determinant of tourism demand for a country is its overall security level. This issue can be handled in the context of the relationship between tourism revenues and overall security, including turmoil, terrorism, border problems, and political violence. In this study, these factors are examined for several countries whose tourism revenues are above a certain level. With this structure the data form a panel, and they are evaluated with panel data analysis techniques. Panel data have at least two dimensions, one of which is time. The panel data analysis techniques are applied to data gathered from the World Bank data web page. The study expects to find impacts of the real exchange rate and security factors on tourism revenues for countries with noteworthy tourism revenues.
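The abstract does not specify its estimator; one standard panel technique it could refer to is the fixed-effects ("within") estimator, sketched below on a noise-free toy panel. Country labels, values, and the 0.5 coefficient are all illustrative.

```python
# Toy panel: country fixed effects plus a common security coefficient.
countries = ["A", "A", "A", "B", "B", "B", "C", "C", "C"]
security = [1.0, 2.0, 3.0, 2.0, 3.0, 4.0, 0.0, 1.0, 2.0]
effects = {"A": 5.0, "B": 7.0, "C": 6.0}   # time-invariant country effects
revenue = [effects[c] + 0.5 * s for c, s in zip(countries, security)]

# Fixed-effects estimator: demean each variable within its country, which
# removes the time-invariant country effect, then run OLS on the residuals.
def demean_by_group(values, groups):
    means = {}
    for g in set(groups):
        obs = [v for v, gg in zip(values, groups) if gg == g]
        means[g] = sum(obs) / len(obs)
    return [v - means[g] for v, g in zip(values, groups)]

x = demean_by_group(security, countries)
y = demean_by_group(revenue, countries)
beta = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)
print(beta)  # recovers the 0.5 coefficient exactly (the toy data are noise-free)
```

With real World Bank data one would use a panel library (and standard errors clustered by country), but the within-transformation above is the core of the technique.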

Keywords: exchange rate, panel data analysis, security, tourism revenues

Procedia PDF Downloads 330
24605 The Effect of General Data Protection Regulation on South Asian Data Protection Laws

Authors: Sumedha Ganjoo, Santosh Goswami

Abstract:

The rising reliance on technology places national security at the forefront of 21st-century issues. It complicates the efforts of emerging and developed countries to combat cyber threats and increases the inherent risk factors connected with technology. The inability to preserve data securely might have devastating repercussions on a massive scale. Consequently, it is vital to establish national, regional, and global data protection rules and regulations that penalise individuals who participate in immoral technology usage and exploit the inherent vulnerabilities of technology. This paper seeks to analyse GDPR-inspired bills in the South Asian region and determine their suitability for the development of a worldwide data protection framework, considering that Asian countries are much more diversified than European ones. In this context, the objectives of this paper are to identify GDPR-inspired bills in the South Asian region and their similarities and differences, as well as the obstacles to developing a regional-level data protection mechanism, thereby informing the need to develop a global-level mechanism. Owing to the qualitative character of this study, the researcher conducted a comprehensive literature review of prior research papers, journal articles, survey reports, and government publications on these topics. Taking the survey results into consideration, the researcher critically analysed the significant parameters highlighted in the literature. The primary results show that many nations in the region are revising their present data protection measures in line with GDPR. The data protection laws of Thailand, Malaysia, China, and Japan are considered, and significant parallels with, and differences from, GDPR are discussed in detail. The research concludes by analysing the development of the various data protection legislation regimes in South Asia.

Keywords: data privacy, GDPR, Asia, data protection laws

Procedia PDF Downloads 69
24604 Genetic Advance versus Environmental Impact toward Sustainable Protein, Wet Gluten and Zeleny Sedimentation in Bread and Durum Wheat

Authors: Gordana Branković, Dejan Dodig, Vesna Pajić, Vesna Kandić, Desimir Knežević, Nenad Đurić

Abstract:

The wheat grain quality properties are influenced by genotype, environmental conditions, and genotype × environment interaction (GEI). The increasing demand for more nutritious wheat products will direct future breeding programmes. Therefore, the aim of the investigation was to determine: i) the variability of the protein content (PC), wet gluten content (WG), and Zeleny sedimentation volume (ZS); ii) the components of variance, heritability in a broad sense (hb2), and expected genetic advance as a percent of the mean (GAM) for PC, WG, and ZS; and iii) correlations between PC, WG, ZS, and the most important agronomic traits; in order to assess expected breeding success versus environmental impact for these quality traits. The plant material consisted of 30 genotypes of bread wheat (Triticum aestivum L. ssp. aestivum) and durum wheat (Triticum durum Desf.). The trials were sown at three test locations in Serbia: Rimski Šančevi, Zemun Polje, and Padinska Skela, during 2010-2011 and 2011-2012. The experiments were set up as a randomized complete block design with four replications. The plot consisted of five rows of 1 m2 (5 × 0.2 m × 1 m). PC, WG, and ZS were determined using near-infrared spectrometry (NIRS) with the Infraneo analyser (Chopin Technologies, France). PC, WG, and ZS in bread wheat were in the ranges 13.4-16.4%, 22.8-30.3%, and 39.4-67.1 mL, respectively, and in durum wheat in the ranges 15.3-18.1%, 28.9-36.3%, and 37.4-48.3 mL, respectively. The dominant component of variance for PC, WG, and ZS in bread wheat was genotype, with a genetic variance/GEI variance (VG/VG × E) relation of 3.2, 2.9, and 1.0, respectively, and in durum wheat was GEI, with a VG/VG × E relation of 0.70, 0.69, and 0.49, respectively. hb2 and GAM values for PC, WG, and ZS in bread wheat were 94.9% and 12.6%, 93.7% and 18.4%, and 86.2% and 28.1%, respectively, and in durum wheat, 80.7% and 7.6%, 79.7% and 10.2%, and 74% and 11.2%, respectively.
The most consistent through six environments, statistically significant correlations, for bread wheat, were between PC and spike length (-0.312 to -0.637); PC, WG, ZS and grain number per spike (-0.320 to -0.620; -0.369 to -0.567; -0.301 to -0.378, respectively); PC and grain thickness (0.338 to 0.566), and for durum wheat, were between PC, WG, ZS and yield (-0.290 to -0.690; -0.433 to -0.753; -0.297 to -0.660, respectively); PC and plant height (-0.314 to -0.521); PC, WG and spike length (-0.298 to -0.597; -0.293 to -0.627, respectively); PC, WG and grain thickness (0.260 to 0.575; 0.269 to 0.498, respectively); PC, WG and grain vitreousness (0.278 to 0.665; 0.357 to 0.690, respectively). Breeding success can be anticipated for ZS in bread wheat due to coupled high values for hb2 and GAM, suggesting existence of additive genetic effects, and also for WG in bread wheat, due to very high hb2 and medium high GAM. The small, and medium, negative correlations between PC, WG, ZS, and yield or yield components, indicate difficulties to select simultaneously for high quality and yield, depending on linkage for particular genetic arrangements to be broken by recombination.
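The abstract does not state its formulas; a common textbook definition of broad-sense heritability and genetic advance as percent of mean, with purely illustrative variance components (not the paper's data), can be sketched as:

```python
import math

# Illustrative variance components for a quality trait (not the paper's data).
v_g = 1.8          # genotypic variance
v_p = 2.0          # phenotypic variance (genotypic + environmental components)
trait_mean = 15.0  # trait mean, e.g. protein content in %
k = 2.06           # selection differential at 5% selection intensity

h2 = v_g / v_p                    # broad-sense heritability (hb2)
ga = k * h2 * math.sqrt(v_p)      # expected genetic advance under selection
gam = 100.0 * ga / trait_mean     # genetic advance as percent of mean (GAM)

print(f"h2 = {h2:.1%}, GAM = {gam:.1f}%")
```

High h2 together with high GAM, as the abstract reports for Zeleny sedimentation in bread wheat, is the usual signal of additive genetic effects and thus of expected breeding success.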

Keywords: bread and durum wheat, genetic advance, protein and wet gluten content, Zeleny sedimentation volume

Procedia PDF Downloads 236
24603 Longitudinal Analysis of Internet Speed Data in the Gulf Cooperation Council Region

Authors: Musab Isah

Abstract:

This paper presents a longitudinal analysis of Internet speed data in the Gulf Cooperation Council (GCC) region, focusing on the most populous cities of each of the six countries – Riyadh, Saudi Arabia; Dubai, UAE; Kuwait City, Kuwait; Doha, Qatar; Manama, Bahrain; and Muscat, Oman. The study utilizes data collected from the Measurement Lab (M-Lab) infrastructure over a five-year period from January 1, 2019, to December 31, 2023. The analysis includes downstream and upstream throughput data for the cities, covering significant events such as the launch of 5G networks in 2019, COVID-19-induced lockdowns in 2020 and 2021, and the subsequent recovery period and return to normalcy. The results showcase substantial increases in Internet speeds across the cities, highlighting improvements in both download and upload throughput over the years. All the GCC countries have achieved above-average Internet speeds that can conveniently support various online activities and applications with excellent user experience.
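A longitudinal summary like the one described is typically built by aggregating raw speed tests into per-year medians (the usual robust summary for skewed throughput distributions). The sketch below uses invented M-Lab-style records, not the study's data.

```python
from collections import defaultdict
from statistics import median

# Hypothetical download measurements for one city: (year, Mbps).
measurements = [
    (2019, 12.0), (2019, 18.0), (2019, 15.0),
    (2021, 40.0), (2021, 55.0), (2021, 48.0),
    (2023, 90.0), (2023, 110.0), (2023, 95.0),
]

# Aggregate to a yearly median throughput for trend analysis.
by_year = defaultdict(list)
for year, mbps in measurements:
    by_year[year].append(mbps)

yearly_median = {year: median(vals) for year, vals in sorted(by_year.items())}
print(yearly_median)  # {2019: 15.0, 2021: 48.0, 2023: 95.0}
```

On real M-Lab data the same aggregation is usually done in BigQuery over the NDT tables, but the per-city, per-year median logic is the same.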

Keywords: internet data science, internet performance measurement, throughput analysis, internet speed, measurement lab, network diagnostic tool

Procedia PDF Downloads 35
24602 A Web Service Based Sensor Data Management System

Authors: Rose A. Yemson, Ping Jiang, Oyedeji L. Inumoh

Abstract:

The deployment of wireless sensor networks has increased rapidly. However, with the increased capacity and diversity of sensors, and applications ranging from biological and environmental to military, tremendous volumes of data are generated, yet more attention is placed on the distributed sensing than on how to manage, analyze, retrieve, and understand the data. This makes it quite difficult to process live sensor data and to run concurrent control and updates, because sensor data are heavyweight, complex, and slow to handle. This work focuses on developing a web service platform for automatic detection of sensors, acquisition of sensor data, storage of sensor data in a database, and processing of sensor data using reconfigurable software components. This work also creates a web-service-based sensor data management system to monitor the physical movement of an individual wearing wireless sensor network technology (SunSPOT). The sensor detects the movement of the individual by sensing the acceleration along the X, Y, and Z axes and sends the readings to a database interfaced with an internet platform. The collected data determine the posture of the person, such as standing, sitting, or lying down. The system is designed using the Unified Modeling Language (UML) and implemented using Java, JavaScript, HTML, and MySQL. The system allows an individual to be monitored closely in real time, and their physical activity details to be obtained without being physically present for in-situ measurement, enabling remote monitoring instead of time-consuming in-person checks. These details can help in evaluating an individual's physical activity and generating feedback on medication. The system can also help in keeping track of any mandatory physical activities required of the individual. These evaluations and feedback can help in maintaining a better health status for the individual and providing improved health care.
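The posture-classification step can be sketched with a simple gravity-direction rule. The axis orientation, thresholds, and function name below are illustrative assumptions, not the paper's actual algorithm (which runs in Java on SunSPOT data).

```python
GRAVITY = 9.81  # m/s^2

def classify_posture(ax, ay, az):
    """Toy posture rule for a torso-worn accelerometer. Assumed orientation:
    z is vertical when standing, x points along the chest when lying down."""
    if az > 0.8 * GRAVITY:
        return "standing"      # gravity falls mostly on the vertical axis
    if ax > 0.8 * GRAVITY:
        return "lying down"    # gravity rotated onto the chest axis
    return "sitting"           # intermediate tilt between the two

print(classify_posture(0.3, 0.2, 9.7))   # standing
print(classify_posture(9.6, 0.1, 0.5))   # lying down
print(classify_posture(5.0, 1.0, 7.0))   # sitting
```

A deployed system would low-pass-filter the raw samples to isolate the gravity component before applying any such rule, since movement adds high-frequency acceleration on all axes.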

Keywords: HTML, java, javascript, MySQL, sunspot, UML, web-based, wireless network sensor

Procedia PDF Downloads 199
24601 A Comparative Study of the Physicochemical and Structural Properties of Quinoa Protein Isolate and Yellow Squat Shrimp Byproduct Protein Isolate through pH-Shifting Modification

Authors: María José Bugueño, Natalia Jaime, Cristian Castro, Diego Naranjo, Guido Trautmann, Mario Pérez-Won, Vilbett Briones-Labarca

Abstract:

Proteins play a crucial role in various prepared foods, including dairy products, drinks, emulsions, and ready meals. These food proteins are naturally present in food waste and byproducts. The alkaline extraction and acid precipitation method is commonly used to extract proteins from plants and animals due to its product stability, cost-effectiveness, and ease of use. This study aimed to investigate the impact of pH-shifting storage at two different pH levels on the conformational changes affecting the physicochemical and functional properties of quinoa protein isolate (QPI) and yellow shrimp byproduct protein isolate (YSPI). The QPI and YSPI were extracted using the alkaline extraction-isoelectric precipitation method. The dispersions were adjusted to pH 4 or 12, stirred for 2 hours at 20°C to achieve a uniform dispersion, and then freeze-dried. Various analyses were conducted, including flexibility (F), free sulfhydryl content (Ho), emulsifying activity (EA), emulsifying capacity (EC), water holding capacity (WHC), oil holding capacity (OHC), intrinsic fluorescence, ultraviolet spectroscopy, differential scanning calorimetry (DSC), and Fourier transform infrared spectroscopy (FTIR) to assess the properties of the protein isolates. pH-shifting at pH 11 and 12 for QPI and YSPI, respectively, significantly improved protein properties, while property modification of the samples treated under acidic conditions was less pronounced. Additionally, the pH 11 and 12 treatments significantly improved F, Ho, EA, WHC, OHC, intrinsic fluorescence, ultraviolet spectroscopy, DSC, and FTIR. The increase in Ho was due to disulfide bond disruption, which produced more protein sub-units than other treatments for both proteins. This study provides theoretical support for comprehensively elucidating the functional properties of protein isolates, promoting the application of plant proteins and marine byproducts. 
The pH-shifting process effectively improves the emulsifying property and stability of QPI and YSPI, which can be considered potential plant-based or marine byproduct-based emulsifiers for use in the food industry.

Keywords: quinoa protein, yellow shrimp by-product protein, physicochemical properties, structural properties

Procedia PDF Downloads 13
24600 Unlocking Health Insights: Studying Data for Better Care

Authors: Valentina Marutyan

Abstract:

Healthcare data mining is a rapidly developing field at the intersection of technology and medicine that has the potential to change our understanding of, and approach to, providing healthcare. Healthcare data mining is the process of examining huge amounts of data to extract useful information that can be applied to improve patient care, treatment effectiveness, and overall healthcare delivery. The field looks for patterns, trends, and correlations in a variety of healthcare datasets, such as electronic health records (EHRs), medical imaging, patient demographics, and treatment histories, using advanced analytical approaches. Predictive analysis using historical patient data is a major area of interest in healthcare data mining. It enables doctors to intervene early to prevent problems or improve outcomes for patients, and it assists in early disease detection and customized treatment planning for each person. Doctors can customize a patient's care by looking at their medical history, genetic profile, and current and previous therapies; in this way, treatments can be more effective and have fewer negative consequences. Beyond helping patients, it improves the efficiency of hospitals, for example by helping them determine the number of beds or doctors they require given the number of patients they expect. This project used models such as logistic regression, random forests, and neural networks for predicting diseases and analyzing medical images. Algorithms such as k-means helped group patients, and association rule mining identified connections between treatments and patient responses. Time series techniques helped in resource management by predicting patient admissions. These methods improved healthcare decision-making and personalized treatment. 
Healthcare data mining must also deal with difficulties such as poor data quality, privacy challenges, managing large and complicated datasets, ensuring the reliability of models, managing biases, limited data sharing, and regulatory compliance. Ultimately, data mining in healthcare helps medical professionals and hospitals make better decisions and treat patients more efficiently. It comes down to using data to improve treatment, make better choices, and simplify hospital operations for all patients.
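The association rule mining mentioned above reduces to support and confidence calculations; a minimal sketch on invented patient records (not the project's data) follows.

```python
# Transactions: per-patient sets of treatments and responses (synthetic).
records = [
    {"drug_a", "improved"},
    {"drug_a", "improved"},
    {"drug_a", "no_change"},
    {"drug_b", "improved"},
]

def support(itemset):
    """Fraction of records containing every item in the itemset."""
    return sum(itemset <= r for r in records) / len(records)

# Rule {drug_a} -> {improved}: confidence = support(both) / support(antecedent).
conf = support({"drug_a", "improved"}) / support({"drug_a"})
print(round(conf, 2))  # 2 of the 3 drug_a patients improved -> 0.67
```

Algorithms such as Apriori scale this idea to many items by pruning itemsets whose support falls below a threshold before computing rule confidences.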

Keywords: data mining, healthcare, big data, large amounts of data

Procedia PDF Downloads 53
24599 Number of Necessary Parameters for Parametrization of Stabilizing Controllers for two times two RHinf Systems

Authors: Kazuyoshi Mori

Abstract:

In this paper, we consider the number of parameters for the parametrization of stabilizing controllers for RHinf systems of size 2 × 2. Fortunately, any plant of this model admits a doubly coprime factorization, so we can use the Youla parametrization to parametrize the stabilizing controllers. However, the Youla parametrization does not by itself give the minimal number of parameters. This paper shows that the minimal number of parameters is four. As a result, we show that the Youla parametrization naturally gives a parametrization of stabilizing controllers with the minimal number of parameters.
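The count of four can be made plausible from the shape of the Youla parameter. The sketch below uses one common scalar-case convention (signs and symbols differ between references, and this is not taken from the paper itself):

```latex
% Coprime factorization of the plant over RH_\infty with a Bezout identity:
P = \frac{N}{M}, \qquad X N + Y M = 1 .
% Every stabilizing controller then has the form
C(Q) = \frac{X + Q M}{Y - Q N}, \qquad Q \in RH_\infty \ \text{free},
% and internal stability follows because
M (Y - Q N) + N (X + Q M) = M Y + N X = 1 .
```

For a 2 × 2 plant admitting a doubly coprime factorization, the free parameter Q becomes a 2 × 2 matrix over RHinf, i.e. four scalar entries, which is consistent with the minimal count of four established in the paper.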

Keywords: RHinf, parametrization, number of parameters, multi-input, multi-output systems

Procedia PDF Downloads 397
24598 A Novel Heuristic for Analysis of Large Datasets by Selecting Wrapper-Based Features

Authors: Bushra Zafar, Usman Qamar

Abstract:

Large sample sizes and high dimensionality limit the effectiveness of conventional data mining methodologies. Data mining techniques are important tools for extracting knowledge from a variety of databases; they provide supervised learning in the form of classification, designing models that describe vital data classes, where the structure of the classifier is based on the class attribute. Classification efficiency and accuracy are often greatly influenced by noisy and undesirable features in real application datasets. The inherent nature of a dataset greatly masks its quality analysis and leaves quite few practical approaches to use. To our knowledge, we present for the first time an approach for investigating the structure and quality of datasets by providing a targeted analysis that localizes their noisy and irrelevant features. Machine learning relies on feature selection as a pre-processing step, which allows a few features to be selected from the full set as a subset, reducing the space according to a certain evaluation criterion. The primary objective of this study is to trim down the scope of the given data sample by searching for a small set of important features that may result in good classification performance. For this purpose, a heuristic for wrapper-based feature selection using a genetic algorithm is used, with an external classifier for discriminative feature selection. Features are selected based on their number of occurrences in the chosen chromosomes. Sample datasets have been used to demonstrate the proposed idea effectively. The proposed method improved the average accuracy across different datasets to about 95%. Experimental results illustrate that the proposed algorithm increases the accuracy of prediction of different diseases.
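The genetic-algorithm wrapper loop can be sketched as follows. In a real wrapper the fitness function trains a classifier (e.g. KNN) on the selected subset and returns its accuracy; here a toy fitness that rewards three "informative" features stands in for it, and all constants are illustrative assumptions.

```python
import random

random.seed(1)

N_FEATURES = 10
INFORMATIVE = {0, 3, 7}   # ground truth, unknown to a real wrapper

def fitness(mask):
    """Stand-in for wrapper fitness: a real wrapper would score a classifier
    trained on the selected subset. This toy version rewards informative
    features and penalises subset size, mimicking accuracy-vs-parsimony."""
    chosen = {i for i, bit in enumerate(mask) if bit}
    return len(chosen & INFORMATIVE) - 0.1 * len(chosen)

def evolve(pop_size=20, generations=40):
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N_FEATURES)  # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:              # bit-flip mutation
                j = random.randrange(N_FEATURES)
                child[j] ^= 1
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print([i for i, bit in enumerate(best) if bit])
```

The paper's occurrence-count criterion would then tally how often each feature appears across the best chromosomes of several runs, keeping the most frequent ones.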

Keywords: data mining, genetic algorithm, KNN algorithms, wrapper-based feature selection

Procedia PDF Downloads 304
24597 Improve Student Performance Prediction Using Majority Vote Ensemble Model for Higher Education

Authors: Wade Ghribi, Abdelmoty M. Ahmed, Ahmed Said Badawy, Belgacem Bouallegue

Abstract:

In higher education institutions, the most pressing priority is to improve student performance and retention. Large volumes of student data are used in educational data mining techniques to find new hidden information in students' learning behavior, particularly to uncover early symptoms of at-risk students. On the other hand, data with noise, outliers, and irrelevant information may lead to incorrect conclusions. By identifying features of students' data that have the potential to improve performance prediction results, comparing and identifying the most appropriate ensemble learning technique after preprocessing the data, and optimizing the hyperparameters, this paper aims to develop a reliable student performance prediction model for higher education institutions. Data was gathered from two different systems: a student information system and an e-learning system for undergraduate students in the College of Computer Science of a Saudi Arabian state university. The cases of 4413 students were used in this article. The process includes data collection, data integration, data preprocessing (such as cleaning, normalization, and transformation), feature selection, pattern extraction, and, finally, model optimization and assessment. Random Forest, Bagging, Stacking, Majority Vote, and two types of Boosting techniques, AdaBoost and XGBoost, are the ensemble learning approaches, whereas Decision Tree, Support Vector Machine, and Artificial Neural Network are the supervised learning techniques. Hyperparameters for the ensemble learning systems are fine-tuned to provide enhanced performance and optimal output. The findings imply that combining features of students' behavior from the e-learning and student information systems using Majority Vote produced better outcomes than the other ensemble techniques.
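The Majority Vote combiner itself is simple to state; the sketch below combines hypothetical per-model at-risk labels (1 = at risk) for five students. The model names and predictions are invented for illustration.

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-model label lists column-wise; the most common label per
    student wins (ties resolved by first-seen order via Counter)."""
    return [Counter(votes).most_common(1)[0][0] for votes in zip(*predictions)]

# Hypothetical base-model outputs for five students (1 = at risk, 0 = not).
decision_tree = [1, 0, 1, 1, 0]
svm           = [1, 1, 1, 0, 0]
neural_net    = [0, 0, 1, 1, 1]

print(majority_vote([decision_tree, svm, neural_net]))  # [1, 0, 1, 1, 0]
```

Hard voting like this ignores model confidence; a soft-voting variant would average predicted probabilities instead, which is often what tuned ensembles use in practice.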

Keywords: educational data mining, student performance prediction, e-learning, classification, ensemble learning, higher education

Procedia PDF Downloads 92
24596 Foundation of the Information Model for Connected-Cars

Authors: Hae-Won Seo, Yong-Gu Lee

Abstract:

Recent progress in the next generation of automobile technology is geared towards incorporating information technology into cars. Collectively called smart cars, these vehicles bring intelligence that provides comfort, convenience, and safety. One branch of smart cars is the connected-car system. The key concept in connected cars is the sharing of driving information among cars in a decentralized manner, enabling collective intelligence. This paper proposes a foundation for the information model needed to define driving information for smart cars. Road conditions are modeled through a unique data structure that unambiguously represents time-variant traffic in the streets. The modeled data structure is then illustrated in a navigational scenario using UML. Optimal driving-route search under dynamically changing road conditions is also discussed using the proposed data structure.
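An illustrative sketch (not the paper's UML model) of the core idea: a road graph whose edge costs depend on departure time, searched with a Dijkstra-style earliest-arrival algorithm. The road names and congestion windows are invented:

```python
# Time-variant road graph: each edge's travel time is a function of the
# departure time, modeling time-dependent traffic conditions.
import heapq

# road -> function(departure time, minutes since midnight) -> travel minutes
roads = {
    ("A", "B"): lambda t: 10 + (5 if 480 <= t < 540 else 0),  # rush hour 8-9am
    ("A", "C"): lambda t: 15,
    ("B", "D"): lambda t: 10,
    ("C", "D"): lambda t: 10 + (8 if 480 <= t < 540 else 0),
}

def fastest_route(start, goal, depart):
    """Earliest-arrival search on a graph with time-dependent edge costs."""
    best = {start: depart}
    queue = [(depart, start, [start])]
    while queue:
        t, node, path = heapq.heappop(queue)
        if node == goal:
            return t, path
        for (u, v), cost in roads.items():
            if u == node:
                arrival = t + cost(t)          # cost depends on departure time
                if arrival < best.get(v, float("inf")):
                    best[v] = arrival
                    heapq.heappush(queue, (arrival, v, path + [v]))
    return None

arrival, path = fastest_route("A", "D", depart=480)  # depart at 8:00 am
print(path, arrival)
```

Because edge costs change with time, the best route at 8:00 am can differ from the best route an hour later, which is exactly the situation the proposed data structure is meant to capture.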

Keywords: connected-car, data modeling, route planning, navigation system

Procedia PDF Downloads 360
24595 Processes and Application of Casting Simulation and Its Software’s

Authors: Surinder Pal, Ajay Gupta, Johny Khajuria

Abstract:

Casting simulation helps visualize mold filling and casting solidification; predict related defects like cold shuts, shrinkage porosity, and hard spots; and optimize the casting design to achieve the desired quality with high yield. Flow and solidification of molten metals are, however, very complex phenomena that are difficult to simulate correctly by conventional computational techniques, especially when the part geometry is intricate and the required inputs (such as thermo-physical properties and heat transfer coefficients) are not available. Simulation software is based on modeling a real phenomenon with a set of mathematical formulas. It is, essentially, a program that allows the user to observe an operation through simulation without actually performing that operation. Simulation software is used widely to design equipment so that the final product is as close to design specifications as possible without expensive in-process modification. Simulation software with real-time response is often used in gaming, but it also has important industrial applications. When the penalty for improper operation is costly, as for airplane pilots, nuclear power plant operators, or chemical plant operators, a mockup of the actual control panel is connected to a real-time simulation of the physical response, giving valuable training experience without fear of a disastrous outcome. Each casting simulation package has its own strengths; MAGMA, for example, is particularly suited to crack simulation. The latest-generation software AutoCAST, developed at IIT Bombay, provides a host of functions to support method engineers, including part thickness visualization, core design, multi-cavity mold design with common gating and feeding, application of various feed aids (feeder sleeves, chills, padding, etc.), simulation of mold filling and casting solidification, automatic optimization of feeders and gating driven by the desired quality level, and what-if cost analysis.
IIT Bombay has developed a set of applications for the foundry industry to improve casting yield and quality. Casting simulation is a fast and efficient solution, the result of more than 20 years of collaboration with major industrial partners and academic institutions around the world. In this paper, the process of casting simulation is studied.

Keywords: casting simulation software, simulation techniques, casting simulation, processes

Procedia PDF Downloads 465
24594 The Prediction Mechanism of M. cajuputi Extract from Lampung-Indonesia, as an Anti-Inflammatory Agent for COVID-19 by NFκβ Pathway

Authors: Agustyas Tjiptaningrum, Intanri Kurniati, Fadilah Fadilah, Linda Erlina, Tiwuk Susantiningsih

Abstract:

Coronavirus disease 2019 (COVID-19) remains a significant health problem. It can become a severe condition caused by a cytokine storm, in which several proinflammatory cytokines are released massively, destroying epithelial cells and potentially causing death. Anti-inflammatory agents can be used to reduce the number of severe COVID-19 cases. Melaleuca cajuputi is a plant with antiviral, antibiotic, antioxidant, and anti-inflammatory activities. This study analyzed the predicted mechanism of M. cajuputi extract from Lampung, Indonesia, as an anti-inflammatory agent for COVID-19. A database of host target proteins involved in the inflammation process of COVID-19 was constructed using data retrieved from GeneCards with the keywords "SARS-CoV-2," "inflammation," "cytokine storm," and "acute respiratory distress syndrome." A protein-protein interaction network was then generated using Cytoscape version 3.9.1 to predict the significant target proteins, and Gene Ontology (GO) and KEGG pathway analyses were conducted to identify the genes and components that play a role in COVID-19. The result was 30 nodes representing significant proteins, including NF-κβ, IL-6, IL-6R, IL-2RA, IL-2, IFN2, C3, TRAF6, IFNAR1, and DDX58. From the KEGG pathway, NF-κβ was found to have a role in the production of the proinflammatory cytokines that drive the COVID-19 cytokine storm. It is an important transcription factor in macrophages; it induces the expression of inflammatory genes encoding proinflammatory cytokines such as IL-6, TNF-α, and IL-1β. In conclusion, blocking NF-κβ is the predicted mechanism of the M. cajuputi extract as an anti-inflammatory agent for COVID-19.
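The hub-identification step of such a network analysis can be sketched in a few lines: rank the proteins in a protein-protein interaction network by degree and pick the most connected node. The toy edge list below is invented for illustration and is not the study's Cytoscape network:

```python
# Rank proteins by degree in a (toy) protein-protein interaction network.
from collections import Counter

edges = [("NFKB1", "IL6"), ("NFKB1", "TNF"), ("NFKB1", "IL1B"),
         ("IL6", "IL6R"), ("IL2", "IL2RA"), ("NFKB1", "TRAF6")]

degree = Counter()
for a, b in edges:          # each interaction raises both endpoints' degree
    degree[a] += 1
    degree[b] += 1

hub, hub_degree = degree.most_common(1)[0]
print(hub, hub_degree)
```

In this toy graph NFKB1 emerges as the hub, mirroring how tools like Cytoscape flag highly connected nodes as candidate key regulators.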

Keywords: anti-inflammation, COVID-19, cytokine storm, NF-κβ, M. cajuputi

Procedia PDF Downloads 70
24593 Safe Disposal of Processed Industrial Biomass as Alternative Organic Manure in Agriculture

Authors: V. P. Ramani, K. P. Patel, S. B. Patel

Abstract:

It is necessary to dispose of industrial wastes properly to prevent further pollution and ensure a safe environment. Such waste can be used in agriculture for the production of good-quality food at higher yields. In order to evaluate the effect and rate of processed industrial biomass on yield, nutrient contents, uptake, and soil status in maize, a field experiment was conducted during 2009-2011 at Anand on loamy sand soil for two years. Treatments of different levels of NPK (100% RD, 75% RD, and 50% RD) were included to study the possibility of reducing fertilizer application when processed biomass (BM) is used in different proportions with FYM (where RD = recommended dose, FYM = farmyard manure, and BM = processed biomass). The significantly highest grain yield of maize was recorded under the treatment of 75% NPK + BM applied at 10 t ha-1. The higher (10 t ha-1) and lower (5 t ha-1) application rates of BM with the full dose of NPK were found beneficial, being at par with the treatment of 75% NPK along with BM at 10 t ha-1. There is thus a saving of 25% of the recommended NPK dose when combined with BM at 10 t ha-1, or a 50% saving of organics when applied with the full (100%) NPK dose. The highest straw yield (7734 kg ha-1) of maize on a pooled basis was observed under the treatment of the recommended NPK dose with FYM at 7.5 t ha-1 coupled with BM at 2.5 t ha-1. The highest straw yield was at par under all treatments except the control and the application of the full recommended NPK dose coupled with BM at 7.5 t ha-1. The Fe content of maize straw was altered significantly by the treatments on a pooled basis; BM application at 7.5 t ha-1 along with the recommended NPK dose showed a significant enhancement in the Fe content of straw over the other treatments.
Among the heavy metals, the Co, Pb, and Cr contents of grain were significantly altered by the different treatments in the pooled analysis, while the Ni content of maize grain was not significantly altered by the different organics. At the higher BM application rate of 10 t ha-1, there was a slight increase in the heavy metal content of grain and straw as well as in DTPA-extractable heavy metals in soil, although the increase was not alarming. Thus, the overall results indicate that the application of BM at 5 t ha-1 along with the full NPK dose is beneficial for obtaining a higher maize yield without adversely affecting soil or plant health. They also indicate that 5 t BM ha-1 can be used in place of 10 t FYM ha-1 where FYM availability is scarce, and that 10 t BM ha-1 helps reduce the chemical fertilizer load by up to 25 percent. Lower use of agro-chemicals always favors a safe environment. However, continuous use of biomass needs periodic monitoring to check for any buildup of heavy metals in soil or plants over the years.

Keywords: alternate use of industrial waste, heavy metals, maize, processed industrial biomass

Procedia PDF Downloads 305
24592 Immuno-Modulatory Role of Weeds in Feeds of Cyprinus Carpio

Authors: Vipin Kumar Verma, Neeta Sehgal, Om Prakash

Abstract:

Cyprinus carpio is widespread in the lakes and rivers of Europe and Asia. Heavy losses in the natural environment due to anthropogenic activities, including pollution as well as pathogenic diseases, have placed this fish on the IUCN Red List of vulnerable species. The significance of a suitable diet in preserving the health status of fish is widely recognized. In the present study, artificial feed supplemented with leaves of two weed plants, Eichhornia crassipes and Ricinus communis, was evaluated for its role in modulating the fish immune system. To achieve this objective, fish were acclimatized to laboratory conditions (25 ± 1 °C; 12L:12D) for 10 days prior to the start of the experiment and divided into 4 groups: non-challenged (negative control = A) and challenged [positive control (B) and experimental (C and D)]. Groups A and B were fed non-supplemented feed, while groups C and D were fed feed supplemented with 5% Eichhornia crassipes and 5% Ricinus communis, respectively. The supplemented feeds were evaluated for their effects on growth, health, immune response, and disease resistance when the fish were challenged with Vibrio harveyi. Fingerlings of C. carpio (weight 2.0 ± 0.5 g) were exposed to a fresh overnight culture of V. harveyi (concentration 2 × 10⁵) through bath immunization for 2 hours at 10-day intervals for 40 days. Growth was monitored through the increase in relative weight. Mortality due to bacterial infection, as well as mortality attributable to the feeds, was recorded. The immune response of the fish was analyzed through differential leucocyte counts, percentage phagocytosis, and phagocytic index. The effects of V. harveyi on fish organs were examined through histopathological examination of internal organs such as the spleen, liver, and kidney. Changes in the immune response were also observed through gene expression analysis.
The antioxidant potential of the plant extracts was measured through DPPH and FRAP assays, and the amounts of total phenols and flavonoids were calculated through biochemical analysis. The chemical composition of the plants' methanol extracts was determined by GC-MS analysis, which showed the presence of various secondary metabolites and other compounds. The investigation revealed an immuno-modulatory effect of the plants when supplemented in the artificial feed of the fish.

Keywords: immuno-modulation, GC-MS, Cyprinus carpio, Eichhornia crassipes, Ricinus communis

Procedia PDF Downloads 471
24591 Biochemical Identification and Study of Antibiotic Resistance in Isolated Bacteria from WWTP TIMGAD

Authors: Abdessemed Zineb, Atia Yahia, Yeza Salima

Abstract:

Water is self-purified by the activated sludge process, which is what makes this process unique. The main goal of this work is the study of the microbial biocenosis of the influent and effluent water of the Timgad wastewater treatment plant. Of the identified biocenosis, 89.47% belongs to the ɤ-Proteobacteria, while the remaining 10.52% is equally divided between the α-Proteobacteria and β-Proteobacteria. The antibiotic susceptibility profiles reveal that over 30% are wild strains, while penicillinases are often present (11.30-20%), along with other resistance profiles. This proportion is worrying because the discharged water joins the Oued Soltez, which is used for irrigation. This drawback calls for the installation of a chlorination step.

Keywords: activated sludge, biocenosis, antibiotic profiles, penicillinases, physico-chemical quality

Procedia PDF Downloads 286
24590 A Supervised Learning Data Mining Approach for Object Recognition and Classification in High Resolution Satellite Data

Authors: Mais Nijim, Rama Devi Chennuboyina, Waseem Al Aqqad

Abstract:

Advances in the spatial and spectral resolution of satellite images have led to tremendous growth in large image databases. The data we acquire through satellites, radars, and sensors contain important geographical information that can be used for remote sensing applications such as regional planning and disaster management. Spatial data classification and object recognition are important tasks for many applications; however, classifying and identifying objects manually in images is difficult. Object recognition is often considered a classification problem, and this task can be performed using machine-learning techniques. Among the many machine-learning algorithms available, the classification is done here using supervised classifiers such as Support Vector Machines (SVM), since the area of interest is known. We propose a classification method that considers neighboring pixels in a region for feature extraction and evaluates classifications according to neighboring classes for semantic interpretation of the region of interest (ROI). A dataset was created for training and testing; the attributes were generated from pixel intensity values and mean reflectance values. We demonstrate the benefits of applying knowledge discovery and data-mining techniques to image data for accurate information extraction and classification from high-spatial-resolution remote sensing imagery.
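A hedged sketch of the neighborhood-based feature idea, assuming scikit-learn and SciPy: for each pixel, use its own intensity plus the mean of its 3×3 neighborhood as features for an SVM. The "image" and labels below are synthetic, with the brighter right half standing in for a water body:

```python
# Per-pixel SVM classification using pixel intensity plus a 3x3
# neighborhood-mean feature. The scene is a synthetic two-class image.
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.svm import SVC

rng = np.random.default_rng(0)
image = rng.random((32, 32))
image[:, 16:] += 1.0                      # right half: brighter "water" class
labels = np.zeros((32, 32), dtype=int)
labels[:, 16:] = 1

neighborhood_mean = uniform_filter(image, size=3)   # 3x3 mean reflectance
X = np.stack([image.ravel(), neighborhood_mean.ravel()], axis=1)
y = labels.ravel()

clf = SVC().fit(X, y)
accuracy = clf.score(X, y)
print(f"training accuracy: {accuracy:.3f}")
```

The neighborhood mean smooths out single-pixel noise, which is the intuition behind using region context rather than isolated pixel values for semantic interpretation.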

Keywords: remote sensing, object recognition, classification, data mining, waterbody identification, feature extraction

Procedia PDF Downloads 323
24589 Automated Multisensory Data Collection System for Continuous Monitoring of Refrigerating Appliances Recycling Plants

Authors: Georgii Emelianov, Mikhail Polikarpov, Fabian Hübner, Jochen Deuse, Jochen Schiemann

Abstract:

Recycling refrigerating appliances plays a major role in protecting the Earth's atmosphere from ozone depletion and emissions of greenhouse gases. The performance of refrigerator recycling plants in terms of material retention is the subject of strict environmental certifications and is reviewed periodically through specialized audits. The continuous collection of refrigerator data required for the input-output analysis is still mostly manual, error-prone, and not digitalized. In this paper, we propose an automated data collection system for recycling plants in order to deduce the expected material contents of individual end-of-life refrigerating appliances. The system utilizes laser scanner measurements and optical data to extract attributes of individual refrigerators by applying transfer learning with pre-trained vision models and optical character recognition. Based on the recognized features, the system automatically provides material categories and target values of contained material masses, especially foaming and cooling agents. The presented data collection system paves the way for continuous performance monitoring and efficient control of refrigerator recycling plants.

Keywords: automation, data collection, performance monitoring, recycling, refrigerators

Procedia PDF Downloads 148
24588 Sales Patterns Clustering Analysis on Seasonal Product Sales Data

Authors: Soojin Kim, Jiwon Yang, Sungzoon Cho

Abstract:

As a seasonal product is only in demand for a short time, inventory management is critical to profits. Both markdowns and stockouts decrease the return on perishable products; therefore, researchers have been interested in the distribution of seasonal products with the aim of maximizing profits. In this study, we propose a data-driven seasonal product sales pattern analysis method for individual retail outlets based on observed sales data clustering; the proposed method helps in determining distribution strategies.
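An illustrative sketch of the clustering step, assuming scikit-learn: per-outlet seasonal sales curves are normalized and clustered with k-means, so outlets are grouped by the shape of their sales pattern rather than by sales volume. The curves below are synthetic:

```python
# Cluster per-outlet seasonal sales curves by shape using k-means.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
weeks = np.arange(12)
early_peak = np.exp(-0.5 * ((weeks - 3) / 1.5) ** 2)   # demand peaks early
late_peak = np.exp(-0.5 * ((weeks - 9) / 1.5) ** 2)    # demand peaks late

# 20 outlets: 10 early-peaking, 10 late-peaking, with random scale and noise.
curves = np.vstack([s * rng.uniform(50, 200) + rng.normal(0, 3, 12)
                    for s in [early_peak] * 10 + [late_peak] * 10])

# Normalize each curve so clustering compares pattern shape, not volume.
normalized = curves / np.linalg.norm(curves, axis=1, keepdims=True)

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(normalized)
labels = km.labels_
print(labels)
```

Each resulting cluster of outlets can then be assigned its own distribution or markdown schedule, which is the decision the sales-pattern analysis is meant to support.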

Keywords: clustering, distribution, sales pattern, seasonal product

Procedia PDF Downloads 580
24587 Probability Sampling in Matched Case-Control Study in Drug Abuse

Authors: Surya R. Niraula, Devendra B Chhetry, Girish K. Singh, S. Nagesh, Frederick A. Connell

Abstract:

Background: Although random sampling is generally considered the gold standard for population-based research, the majority of drug abuse research is based on non-random sampling, despite the well-known limitations of this kind of sampling. Method: We compared the statistical properties of two surveys of drug abuse in the same community: one using snowball sampling of drug users, who then identified "friend controls," and the other using a random sample of non-drug users (controls), who then identified "friend cases." Models to predict drug abuse based on risk factors were developed for each data set using conditional logistic regression. We compared the precision of each model using a bootstrapping method, and the predictive properties of each model using receiver operating characteristic (ROC) curves. Results: Analysis of 100 random bootstrap samples drawn from the snowball-sample data set showed wide variation in the standard errors of the beta coefficients of the predictive model, none of which achieved statistical significance. On the other hand, bootstrap analysis of the random-sample data set showed less variation and did not change the significance of the predictors at the 5% level when compared to the non-bootstrap analysis. The area under the ROC curve for the model derived from the random-sample data set was similar when fitted to either data set (0.93 for the random-sample data vs. 0.91 for the snowball-sample data, p = 0.35); however, when the model derived from the snowball-sample data set was fitted to each of the data sets, the areas under the curve were significantly different (0.98 vs. 0.83, p < .001). Conclusion: The proposed method of random sampling of controls appears to be statistically superior to snowball sampling and may represent a viable alternative to it.
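A hedged sketch of the bootstrap procedure the Results section describes: refit a logistic model on resampled data and inspect the spread of a coefficient. The data are synthetic, and ordinary logistic regression stands in for the study's conditional (matched-pair) logistic regression:

```python
# Bootstrap the standard error of a logistic regression coefficient.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 300
risk_factor = rng.normal(size=n)
p = 1 / (1 + np.exp(-(0.8 * risk_factor)))      # true beta = 0.8
outcome = rng.binomial(1, p)

betas = []
for _ in range(100):                             # 100 bootstrap resamples
    idx = rng.integers(0, n, size=n)             # sample rows with replacement
    model = LogisticRegression().fit(risk_factor[idx].reshape(-1, 1),
                                     outcome[idx])
    betas.append(model.coef_[0, 0])

beta_se = np.std(betas)
print(f"bootstrap SE of beta: {beta_se:.3f}")
```

A wide spread of the resampled coefficients, as the study found for the snowball sample, signals that the original estimate is too unstable to be trusted.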

Keywords: drug abuse, matched case-control study, non-probability sampling, probability sampling

Procedia PDF Downloads 480
24586 Insights on Nitric Oxide Interaction with Phytohormones in Rice Root System Response to Metal Stress

Authors: Piacentini Diego, Della Rovere Federica, Fattorini Laura, Lanni Francesca, Cittadini Martina, Altamura Maria Maddalena, Falasca Giuseppina

Abstract:

Plants have evolved sophisticated mechanisms to cope with environmental cues. Changes in the intracellular content and distribution of phytohormones, such as the auxin indole-3-acetic acid (IAA), are involved in morphogenic adaptation to environmental stresses. In addition to phytohormones, plants can rely on a plethora of small signal molecules able to promptly sense and transduce stress signals, resulting in morpho-physiological responses thanks also to their capacity to modulate the levels, distribution, and reception of most hormones. Among these signaling molecules, nitric oxide (NO) is a critical component of several plant acclimation strategies to both biotic and abiotic stresses. Depending on its levels, NO increases plant adaptation by enhancing the enzymatic or non-enzymatic antioxidant systems or by acting as a direct scavenger of the reactive oxygen/nitrogen species (ROS/RNS) produced during the stress. In addition, exogenous application of NO-specific donor compounds has shown the involvement of this signal molecule in auxin metabolism, transport, and signaling under both physiological and stress conditions. However, the complex mechanisms underlying the action of NO in its interaction with phytohormones such as auxins during metal stress responses are still poorly understood and need further investigation. Emphasis must be placed on the response of the root system, since it is the first plant organ system to be exposed to metal soil pollution. The monocot Oryza sativa L. (rice) was chosen given its importance as a staple food for some 4 billion people worldwide. In addition, increasing evidence has shown that rice is often grown in paddy soils contaminated with high levels of the heavy metal cadmium (Cd) and the metalloid arsenic (As). The ease with which these metals are taken up by rice roots and transported to the aerial organs, up to the edible caryopses, makes rice one of the most relevant sources of these pollutants for humans.
This study aimed to evaluate whether NO has a mitigatory activity against Cd or As toxicity in the roots of rice seedlings and whether this activity requires interaction with auxin. Our results show that exogenous treatment with the NO donor SNP alleviates the stress induced by Cd, but not by As, in in-vitro-grown rice seedlings through increased intracellular root NO levels. The damage induced by the pollutants includes root growth inhibition, root histological alterations, and the production of ROS (H2O2, O2●ˉ) and RNS (ONOOˉ). SNP treatment also mitigates both the increase in root IAA levels and the alteration in IAA distribution, monitored through the OsDR5::GUS system, caused by exposure to the toxic metals. Notably, the SNP-induced mitigation of the IAA homeostasis altered by the pollutants does not involve changes in the expression of the IAA-biosynthetic genes OsYUCCA1 and ASA2. Taken together, the results highlight a mitigating role of NO in the rice root system that is pollutant-specific and that involves the interaction of this signal molecule with both IAA and brassinosteroids at multiple levels (i.e., transport, content, and distribution, as well as transcriptional/post-translational regulation). The research is supported by Progetti Ateneo, Sapienza University of Rome, grant number RG120172B773D1FF.

Keywords: arsenic, auxin, cadmium, nitric oxide, rice, root system

Procedia PDF Downloads 62
24585 Techno-Economic Optimization and Evaluation of an Integrated Industrial Scale NMC811 Cathode Active Material Manufacturing Process

Authors: Usama Mohamed, Sam Booth, Aliysn J. Nedoma

Abstract:

As part of the transition to electric vehicles, there has been a recent increase in demand for battery manufacturing. Cathodes typically account for approximately 50% of the total lithium-ion battery cell cost and are a pivotal factor in determining the viability of new industrial infrastructure. Cathodes that offer lower costs whilst maintaining or increasing performance, such as nickel-rich layered cathodes, have a significant competitive advantage when the manufacturing process is scaled up. This project evaluates the techno-economic value proposition of an integrated industrial-scale cathode active material (CAM) production process, closing the mass and energy balances and optimizing the operating conditions using a sensitivity analysis. This is done by developing a process model of a co-precipitation synthesis route in Aspen Plus, validated against experimental data. The reaction chemistry and equilibrium conditions were established based on previous literature and HSC Chemistry software. This is followed by integrating the energy streams, adding waste recovery and treatment processes, and testing the effect of key parameters (temperature, pH, reaction time, etc.) on CAM production yield and emissions. Finally, an economic analysis estimates the fixed and variable costs (including capital expenditure, labor costs, raw materials, etc.) to calculate the cost of CAM ($/kg and $/kWh), the total plant cost ($), and the net present value (NPV). This work sets the foundational blueprint for future research into sustainable industrial-scale processes for CAM manufacturing.
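The NPV step of such an economic analysis can be sketched directly; all cash-flow figures and the discount rate below are placeholders, not the study's values:

```python
# Net present value of a plant investment from discounted cash flows.
def npv(rate, cash_flows):
    """NPV = sum of cash_flows[t] / (1 + rate)**t; cash_flows[0] is the
    year-0 capital outlay (negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Year 0: illustrative plant capital cost; years 1-5: net operating cash flow.
flows = [-100e6] + [30e6] * 5
value = npv(0.08, flows)
print(f"NPV at 8%: ${value / 1e6:.1f} M")
```

A positive NPV at the chosen discount rate is the usual go/no-go signal for the plant investment; the sensitivity analysis then repeats this calculation while varying the cost and yield parameters.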

Keywords: cathodes, industrial production, nickel-rich layered cathodes, process modelling, techno-economic analysis

Procedia PDF Downloads 87
24584 Bioinformatics High Performance Computation and Big Data

Authors: Javed Mohammed

Abstract:

Right now, biomedical infrastructure lags well behind the curve. Our healthcare system is dispersed and disjointed; medical records are a mess; and we do not yet have the capacity to store and process the enormous amounts of data coming our way from widespread whole-genome sequencing. And then there are privacy issues. Despite these infrastructure challenges, some researchers are plunging into biomedical Big Data now, in hopes of extracting new and actionable knowledge: delving into molecular-level data to discover biomarkers that help classify patients based on their response to existing treatments, and pushing their results out to physicians in novel and creative ways. Computer scientists and biomedical researchers are able to transform data into models and simulations that will enable scientists, for the first time, to gain a profound understanding of the deepest biological functions. Solving biological problems may require high-performance computing (HPC), due either to the massive parallel computation required to solve a particular problem or to algorithmic complexity that may range from difficult to intractable. Many problems involve seemingly well-behaved polynomial-time algorithms (such as all-to-all comparisons) but have massive computational requirements due to the large data sets that must be analyzed. High-throughput techniques for DNA sequencing and analysis of gene expression have led to exponential growth in the amount of publicly available genomic data. With this increased availability of genomic data, traditional database approaches are no longer sufficient for rapidly performing life-science queries involving the fusion of data types. Computing systems are now so powerful that researchers can consider modeling the folding of a protein or even simulating an entire human body. This paper emphasizes computational biology's growing need for high-performance computing and Big Data.
It illustrates the indispensability of HPC in meeting the scientific and engineering challenges of the twenty-first century, and shows how protein folding (the structure and function of proteins) and phylogeny reconstruction (the evolutionary history of a group of genes) can use HPC that provides sufficient capability for evaluating or solving more limited but meaningful instances. The paper also discusses solutions to optimization problems and the benefits of Big Data for computational biology, and surveys the current state of the art and future generations of HPC computing with Big Data.
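The "all-to-all comparison" workload mentioned above can be made concrete with a toy example: pairwise identity between DNA sequences, which is polynomial-time per pair but quadratic in the number of sequences. The sequences are invented; a real HPC run would distribute the O(n²) pairs across many nodes:

```python
# All-to-all pairwise identity between (toy) equal-length DNA sequences.
from itertools import combinations

def identity(a, b):
    """Fraction of positions at which two equal-length sequences agree."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

seqs = {"s1": "ACGTACGT", "s2": "ACGTACGA", "s3": "TTTTACGT"}

# O(n^2) pairs: with millions of sequences, this is the source of the
# massive computational requirement despite the simple per-pair algorithm.
scores = {(i, j): identity(seqs[i], seqs[j]) for i, j in combinations(seqs, 2)}
print(scores)
```

Because every pair is independent, the workload is embarrassingly parallel, which is exactly why it maps so well onto HPC clusters.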

Keywords: high performance, big data, parallel computation, molecular data, computational biology

Procedia PDF Downloads 353
24583 Evaluating the Effectiveness of the Science Teacher Training Programme in National Colleges of Education: A Preliminary Study of Prospective Teachers' Perceptions

Authors: A. S. V Polgampala, F. Huang

Abstract:

This is an overview of what is entailed in an evaluation and the issues to be aware of when class observation is carried out. This study examined the effects of evaluating the teaching practice of a 7-day 'block teaching' session in a pre-service science teacher training program at a reputed National College of Education in Sri Lanka. Effects were assessed in three areas: evaluation of the training process, evaluation of the training impact, and evaluation of the training procedure. Data for this study were collected through class observation of 18 teachers from 9 to 16 February 2017. The prospective science teachers who participated in the study were evaluated based on the format newly introduced by the NIE. The data collected were analyzed qualitatively using the Miles and Huberman procedure for analyzing qualitative data: data reduction, data display, and conclusion drawing/verification. It was observed that the trainees showed confidence in teaching the targeted competencies and skills. Teacher educators' dissatisfaction has had a great impact on the evaluation process.

Keywords: evaluation, perceptions and perspectives, pre-service, science teaching

Procedia PDF Downloads 299
24582 Detecting Venomous Files in IDS Using an Approach Based on Data Mining Algorithm

Authors: Sukhleen Kaur

Abstract:

In the security field, the Intrusion Detection System (IDS) has become an important component and has received increasing attention in recent years. An IDS is one of the effective ways to detect different kinds of attacks and malicious code in a network and helps us secure the network. Data mining techniques can be applied to an IDS to analyze large amounts of data and give better results; data mining can contribute to improving intrusion detection by adding a level of focus to anomaly detection. So far, studies have focused on detecting attacks, but this paper detects malicious files. Some intruders do not attack directly; instead, they hide harmful code inside files, or corrupt files, and attack the system through them. These files are detected according to defined parameters, which form two lists of files: normal files and harmful files. After that, data mining is performed. In this paper, a hybrid classifier is used, combining the Naive Bayes and RIPPER classification methods. The results show how a file uploaded to the database is tested against the parameters and then characterized as either a normal or a harmful file, after which the mining is performed. Moreover, when a user tries to mine a harmful file, an exception is generated stating that mining cannot be performed on corrupted or harmful files.
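A hedged sketch of the file-screening idea, assuming scikit-learn: classify files as normal or harmful from a few simple parameters. The paper pairs Naive Bayes with RIPPER; only the Naive Bayes half is sketched here, and the per-file features and values are invented for illustration:

```python
# Naive Bayes screening of files as normal (0) or harmful (1) from
# illustrative per-file parameters.
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Features per file: [size_kb, byte entropy, embedded executables] (invented).
X = np.array([[120, 4.1, 0], [300, 4.5, 0], [80, 7.8, 2],
              [95, 7.5, 3], [210, 4.2, 0], [60, 7.9, 1]])
y = np.array([0, 0, 1, 1, 0, 1])       # 0 = normal, 1 = harmful

clf = GaussianNB().fit(X, y)
suspect = np.array([[100, 7.7, 2]])    # high entropy, embedded executables
verdict = int(clf.predict(suspect)[0])
print("harmful" if verdict else "normal")
```

In the paper's workflow, a file flagged as harmful at this step would then be excluded from mining, raising the exception described above instead of being processed.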

Keywords: data mining, association, classification, clustering, decision tree, intrusion detection system, misuse detection, anomaly detection, naive Bayes, ripper

Procedia PDF Downloads 401