Search results for: General data protection regulation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9070


7000 Using Data Mining Techniques for Finding Cardiac Outlier Patients

Authors: Farhan Ismaeel Dakheel, Raoof Smko, K. Negrat, Abdelsalam Almarimi

Abstract:

In this paper, we use data mining techniques to identify outlier patients who use large amounts of drugs over long periods of time. Any healthcare or health insurance system must manage the quantities of drugs utilized by patients with chronic diseases. In the Kingdom of Bahrain, about 20% of the health budget is spent on medications. Managers of healthcare systems have little information about how drugs are utilized by chronic-disease patients, whether there is misuse, and whether there are outlier patients. In this work, carried out in cooperation with the information department of the Bahrain Defence Force Hospital, we selected the data of cardiac patients for the period from 1 January 2008 to 31 December 2008 as the data for the model in this paper. We used three techniques to analyze drug utilization by cardiac patients: we first applied a clustering technique, followed by a measure of clustering validity, and finally a decision tree as the classification algorithm. The clustering divides the 1,603 patients, who received 15,806 prescriptions during this period, into three groups according to drug utilization; 23 patients (2.59%) who received 1,316 prescriptions (8.32%) are classified as outliers. The classification algorithm shows that average drug utilization and the age and gender of the patient can be considered the main predictive factors in the induced model.
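The abstract gives no implementation details; purely as an illustration, the following Python sketch combines k-means clustering, a silhouette-based validity check, and a decision tree on hypothetical per-patient utilization features (field names such as avg_drug_utilization are assumptions, not columns from the hospital data).

```python
# Hedged sketch: cluster patients by drug utilization, check clustering validity,
# and train a decision tree on the resulting labels. Data are simulated.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
patients = pd.DataFrame({
    "avg_drug_utilization": rng.gamma(shape=2.0, scale=5.0, size=1603),
    "age": rng.integers(30, 90, size=1603),
    "gender": rng.integers(0, 2, size=1603),
})

# Step 1: cluster the patients into three groups, as in the abstract.
features = patients[["avg_drug_utilization", "age", "gender"]].to_numpy()
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(features)
patients["cluster"] = kmeans.labels_

# Step 2: one possible clustering-validity measure (silhouette coefficient).
print("silhouette:", silhouette_score(features, kmeans.labels_))

# Step 3: decision tree explaining membership in the highest-utilization cluster.
outlier_cluster = patients.groupby("cluster")["avg_drug_utilization"].mean().idxmax()
patients["is_outlier"] = (patients["cluster"] == outlier_cluster).astype(int)
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(patients[["avg_drug_utilization", "age", "gender"]], patients["is_outlier"])
print("feature importances:", tree.feature_importances_)
```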

Keywords: Data Mining, Clustering, Classification, Drug Utilization.

6999 Retail Strategy to Reduce Waste Keeping High Profit Utilizing Taylor's Law in Point-of-Sales Data

Authors: Gen Sakoda, Hideki Takayasu, Misako Takayasu

Abstract:

Waste reduction is a fundamental problem for sustainability. Methods for waste reduction with point-of-sales (POS) data are proposed, utilizing the knowledge of a recent econophysics study on a statistical property of POS data. Concretely, a non-stationary time series analysis method based on the particle filter is developed, which accounts for the abnormal fluctuation scaling known as Taylor's law. This method is extended to handle sales data that are incomplete because of stock-outs, by introducing maximum likelihood estimation for censored data. A method for optimal stock determination that prices the cost of waste reduction is also proposed. This study focuses on examining the methods for large sales numbers, where Taylor's law is evident. Numerical analysis using aggregated POS data shows the effectiveness of the methods in reducing food waste while maintaining a high profit for large sales numbers. Moreover, pricing the cost of waste reduction reveals that a small profit loss realizes substantial waste reduction, especially when the proportionality constant of Taylor's law is small. Specifically, a profit loss of around 1% realizes a halving of disposal at a constant value of 0.12, which is the actual value for the processed food items used in this research. The methods provide practical and effective solutions for waste reduction while keeping a high profit, especially for large sales numbers.
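The paper's full method (a particle-filter time-series model with censored maximum likelihood) is not reproduced in the abstract; the sketch below only illustrates the underlying fluctuation-scaling idea: fit Taylor's law to daily sales counts and use it to set a stock level. The simulated data, the scaling form sigma = c * mu**beta, and the safety factor are all assumptions.

```python
# Minimal sketch (not the paper's particle-filter method): estimate Taylor's law
# from simulated daily POS counts and use it to pick a stock level per item.
import numpy as np

rng = np.random.default_rng(1)
n_items, n_days = 200, 365
true_means = rng.uniform(5, 500, size=n_items)
# Simulated sales: Poisson counts, whose fluctuations naturally scale with the mean.
sales = rng.poisson(lam=true_means[:, None], size=(n_items, n_days))

mu = sales.mean(axis=1)
sigma = sales.std(axis=1)

# Fit log(sigma) = log(c) + beta * log(mu) by ordinary least squares in log-log space.
beta, log_c = np.polyfit(np.log(mu), np.log(sigma), deg=1)
print(f"estimated Taylor exponent beta = {beta:.2f}, prefactor c = {np.exp(log_c):.2f}")

def stock_level(mean_demand: float, z: float = 1.65) -> float:
    """Stock = expected demand + z times the fluctuation predicted by Taylor's law."""
    return mean_demand + z * np.exp(log_c) * mean_demand ** beta

print("stock for an item selling ~100 units/day:", round(stock_level(100.0)))
```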

Keywords: Food waste reduction, particle filter, point of sales, sustainable development goals, Taylor's Law, time series analysis.

6998 Experimental Teaching, Perceived Usefulness, Ease of Use, Learning Interest and Science Achievement of Taiwan 8th Graders in TIMSS 2007 Database

Authors: Pei Wen Liao, Tsung Hau Jen

Abstract:

The data of Taiwanese 8th graders in the 4th cycle of the Trends in International Mathematics and Science Study (TIMSS) are analyzed to examine the influence of science teachers' preference for experimental teaching on the relationships between the affective variables (the perceived usefulness of science, the perceived ease of using science, and science learning interest) and academic achievement in science. After dealing with the missing data, data from 3,711 students and 145 science teachers were analyzed through a Hierarchical Linear Modeling technique. The major objective of this study was to determine how experimental teaching moderates the relationship between perceived usefulness and achievement.
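The abstract names Hierarchical Linear Modeling but gives no model specification; as an illustration only, the sketch below fits a two-level mixed model with students nested within teachers and a cross-level interaction between a hypothetical teacher-level experimental-teaching variable and perceived usefulness. Variable names are placeholders, not TIMSS field names.

```python
# Hedged two-level model sketch: achievement on affective variables, random
# intercept per teacher, and an interaction testing the moderation effect.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_teachers, n_per = 145, 25
df = pd.DataFrame({
    "teacher_id": np.repeat(np.arange(n_teachers), n_per),
    "usefulness": rng.normal(size=n_teachers * n_per),
    "ease": rng.normal(size=n_teachers * n_per),
    "interest": rng.normal(size=n_teachers * n_per),
    "exp_teaching": np.repeat(rng.normal(size=n_teachers), n_per),  # teacher-level variable
})
df["achievement"] = (500 + 20 * df["usefulness"] + 5 * df["exp_teaching"]
                     + 8 * df["usefulness"] * df["exp_teaching"]
                     + rng.normal(scale=60, size=len(df)))

model = smf.mixedlm(
    "achievement ~ usefulness * exp_teaching + ease + interest",
    data=df,
    groups=df["teacher_id"],   # random intercept for each teacher
)
print(model.fit().summary())
```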

Keywords: TIMSS database, Science achievement, Experimental teaching, Perceived Usefulness, Perceived Ease of Use

6997 Practices in Planning, Design and Construction of Head Race Tunnel of a Hydroelectric Project

Authors: M. S. Thakur, Mohit Shukla

Abstract:

A channel or tunnel that carries water to the penstock or pressure shaft is called a head race tunnel (HRT). It is necessary to know the general topography, the geology of the area, the state of stress, and other mechanical properties of the strata. For this, certain topographical and geological investigations, in-situ and laboratory tests, and observations are required. These investigations play an important role in tunnel design, as they help in deciding the optimum layout, shape, size, and support requirements of the tunnel. The paper includes inputs from the Nathpa Jhakri Hydroelectric Project, which is India's highest-capacity (1,500 MW) operating hydroelectric project. The paper would help design engineers with various new concepts and preparedness against geological surprises.

Keywords: Tunnelling, geology, head race tunnel, rockmass.

6996 Theoretical, Numerical and Experimental Assessment of Elastomeric Bearing Stability

Authors: Manuel A. Guzman, Davide Forcellini, Ricardo Moreno, Diego H. Giraldo

Abstract:

Elastomeric bearings (EB) are used in many applications, such as base isolation of bridges, seismic protection, and vibration control of other structures and machinery. Their versatility is due to their particular behavior, since they have different stiffnesses in the vertical and horizontal directions, allowing them to sustain vertical loads while accommodating horizontal displacements. Therefore, vertical, horizontal, and bending stiffnesses are important parameters to take into account in the design of EB. In order to establish a proper design methodology for EB, all three approaches, theoretical, finite element analysis, and experimental, should be taken into account: to assess stability under different loading states, to predict their behavior and consequently their effects on the dynamic response of structures, and to understand the complex behavior and properties of rubber-like materials, respectively. In particular, the recent large-displacement theory on the stability of EB formulated by Forcellini and Kelly is validated with both numerical simulations using the finite element method and experimental results obtained at the University of Antioquia in Medellín, Colombia. In this regard, this study reproduces the behavior of EB under compression loads and investigates the stability behavior from the three mentioned points of view.
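For context only, and not as the paper's large-displacement formulation, the classical linear (Haringx-type) estimate of the critical load of an elastomeric bearing that is widely quoted in the bearing-stability literature can be sketched as:

```latex
P_{cr} \;=\; \frac{P_S}{2}\left(\sqrt{1+\frac{4\,P_E}{P_S}}-1\right),
\qquad P_S = G A_s, \qquad P_E = \frac{\pi^{2}(EI)_s}{h^{2}},
```

where G is the rubber shear modulus, A_s and (EI)_s are the shear-effective area and bending stiffness of the bearing, and h is its total height. The paper's contribution is the validation of the more general large-displacement theory against finite element simulations and laboratory results.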

Keywords: Elastomeric bearings, experimental tests, numerical simulations, stability, large-displacement theory.

6995 The Use of Recommender Systems in Decision Support–A Case Study on Used Car Dealers

Authors: Nalinee Sophatsathit

Abstract:

This research focuses on the use of a recommender system in decision support by means of a used car dealer case study in the Bangkok metropolitan area. The goal is to develop an effective used car purchasing system for dealers based on the above premise. The underlying principle rests on content-based recommendation drawn from a set of usability surveys. A prototype was developed to conduct a buyers' survey, with respondents comprising 5 experts and 95 members of the general public. The responses were analyzed to determine the mean and standard deviation of buyers' preference. The results revealed that both groups were in favor of using the proposed system to assist their buying decision. This indicates that the proposed system is of merit to used car dealers.
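The abstract does not detail the content-based mechanism; as a generic illustration only, the sketch below scores used cars against a buyer's stated preferences with cosine similarity over simple content features. The attributes and their scaling are assumptions, not the survey instrument used in the study.

```python
# Generic content-based recommendation sketch: rank used cars by the cosine
# similarity between each car's feature vector and the buyer's preference vector.
import pandas as pd
from sklearn.preprocessing import MinMaxScaler
from sklearn.metrics.pairwise import cosine_similarity

cars = pd.DataFrame({
    "car_id": ["A", "B", "C", "D"],
    "price_thousand_thb": [450, 300, 780, 520],
    "age_years": [5, 9, 2, 4],
    "mileage_thousand_km": [80, 140, 30, 60],
    "engine_cc": [1500, 1200, 2000, 1800],
})

# Normalize the features so that no single attribute dominates the similarity.
feature_cols = ["price_thousand_thb", "age_years", "mileage_thousand_km", "engine_cc"]
scaler = MinMaxScaler().fit(cars[feature_cols])
car_vectors = scaler.transform(cars[feature_cols])

# Hypothetical buyer profile expressed in the same attribute space.
buyer = pd.DataFrame([{"price_thousand_thb": 400, "age_years": 4,
                       "mileage_thousand_km": 50, "engine_cc": 1600}])
buyer_vector = scaler.transform(buyer[feature_cols])

cars["score"] = cosine_similarity(buyer_vector, car_vectors).ravel()
print(cars.sort_values("score", ascending=False)[["car_id", "score"]])
```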

Keywords: Recommender Systems, Decision Support, Content-Based Recommendation, used car dealer.

6994 Identifying Corruption in Legislation using Risk Analysis Methods

Authors: Chvalkovska, J., Jansky, P., Mejstrik, M.

Abstract:

The objective of this article is to discuss the potential of economic analysis as a tool for the identification and evaluation of corruption in legislative acts. We propose that corruption be perceived as a risk variable within the legislative process. Therefore, we find it appropriate to employ risk analysis methods, used in various fields of economics, for the evaluation of corruption in legislation. Furthermore, we propose the incorporation of these methods into the so-called corruption impact assessment (CIA), a general framework for the detection of corruption in legislative acts. The application of the risk analysis methods is demonstrated on examples of the implementation of the proposed CIA in the Czech Republic.

Keywords: corruption; corruption impact assessment (CIA); legislative; legislative process; risk analysis; Czech Republic

6993 From Vertigo to Verticality: An Example of Phenomenological Design in Architecture

Authors: E. Osorio Schmied

Abstract:

Architects commonly attempt a depiction of organic forms when their works are inspired by nature, regardless of the building site. Nevertheless, it is also possible to try matching structures with natural scenery by applying a phenomenological approach in terms of spatial operations, regarding perceptions from nature through architectural aspects such as protection, views, and orientation. This method acknowledges a relationship between place and space, where intentions towards tangible facts then become design statements. Although spaces resulting from such a process may present an effective response to the environment, they can also offer further outcomes beyond the realm of form. The hypothesis is that, in addition to recognising a bond between architecture and nature, it is also plausible to associate such perceptions with the inner ambience of buildings, by analysing features such as daylight. The case study of a single-family house in a rainforest near Valdivia, in Chilean Patagonia, is presented, with the intention of addressing the above notions through a discussion of the actual effects of inhabiting a place by way of a series of insights, including a revision of diagrams and photographs that assist in understanding the implications of this design practice. In addition, figures based on post-occupancy behaviour and daylighting performance relate both architectural and environmental issues to a decision-making process motivated by the observation of nature.

Keywords: Architecture, design statements, nature, perception.

6992 Fiber Optic Sensors

Authors: Bahareh Gholamzadeh, Hooman Nabovati

Abstract:

Fiber optic sensor technology offers the possibility of sensing different parameters, such as strain, temperature, and pressure, in harsh environments and remote locations. These kinds of sensors modulate some feature of the light wave in an optical fiber, such as intensity or phase, or use the optical fiber as a medium for transmitting the measurement information. The advantages of fiber optic sensors over conventional electrical ones make them popular in different applications, and nowadays they are considered a key component in improving industrial processes, quality control systems, and medical diagnostics, and in preventing and controlling general process abnormalities. This paper is an introduction to fiber optic sensor technology and some of the applications that make this branch of optic technology, which is still in its early infancy, an interesting field.

Keywords: Fiber optic sensors, distributed sensors, sensor application, crack sensor.

6991 Radiation Dose Distribution for Workers in South Korean Nuclear Power Plants

Authors: B. I. Lee, S. I. Kim, D. H. Suh, J. I. Kim, Y. K. Lim

Abstract:

A total of 33,680 nuclear power plant (NPP) workers were monitored and recorded from 1990 to 2007. According to the records, the average individual radiation dose decreased continually, from 3.20 mSv/man in 1990 to 1.12 mSv/man at the end of 2007. After the International Commission on Radiological Protection (ICRP) 60 recommendation was generalized in South Korea, no nuclear power plant worker received more than 20 mSv of radiation, and the number of relatively highly exposed workers has been decreasing continuously. The age distribution of radiation workers in nuclear power plants was composed mainly of 20-30-year-olds (83%) for 1990-1994 and 30-40-year-olds (75%) for 2003-2007. The difference in individual average dose by age was not significant. Most (77%) of the NPP radiation exposures from 1990 to 2007 occurred during the refueling period. With regard to exposure type, the majority of exposures were external, representing 95% of the total, while internal exposures represented only 5%. The external effective dose was affected mainly by gamma radiation exposure, with an insignificant amount of neutron exposure. As for the internal effective dose, tritium (3H) in the pressurized heavy water reactor (PHWR) was the biggest cause of exposure.

Keywords: Dose distribution, External exposure, Nuclear power plant, Occupational radiation dose

6990 A Cumulative Learning Approach to Data Mining Employing Censored Production Rules (CPRs)

Authors: Rekha Kandwal, Kamal K.Bharadwaj

Abstract:

Knowledge is indispensable, but voluminous knowledge becomes a bottleneck for efficient processing. A great challenge for the data mining activity is the generation of a large number of potential rules as a result of the mining process; in fact, sometimes the result size is comparable to the original data. Traditional data mining pruning activities, such as support, do not sufficiently reduce the huge rule space. Moreover, many practical applications are characterized by continual change of data and knowledge, thereby making knowledge voluminous with each change. The most predominant representation of the discovered knowledge is the standard Production Rule (PR) in the form If P Then D. Michalski and Winston proposed Censored Production Rules (CPRs) as an extension of production rules that exhibit variable precision and support an efficient mechanism for handling exceptions. A CPR is an augmented production rule of the form If P Then D Unless C, where C (the censor) is an exception to the rule. Such rules are employed in situations in which the conditional statement 'If P Then D' holds frequently and the assertion C holds rarely. By using a rule of this type we are free to ignore the exception conditions when the resources needed to establish their presence are tight, or when there is simply no information available as to whether they hold or not. Thus the 'If P Then D' part of the CPR expresses important information, while the Unless C part acts only as a switch that changes the polarity of D to ~D. In this paper, a scheme based on a Dempster-Shafer Theory (DST) interpretation of a CPR is suggested for discovering CPRs from the discovered flat PRs. The discovery of CPRs from flat rules would result in a considerable reduction of the already discovered rules. The proposed scheme incrementally incorporates new knowledge and also reduces the size of the knowledge base considerably with each episode. Examples are given to demonstrate the behaviour of the proposed scheme. The suggested cumulative learning scheme would be useful in mining data streams.
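To make the "If P Then D Unless C" structure concrete, the following short sketch shows one possible in-memory representation of a censored production rule and its evaluation when resources are tight; it illustrates the CPR concept only, not the Dempster-Shafer-based discovery scheme proposed in the paper.

```python
# Illustrative Censored Production Rule: "If P Then D Unless C".
# When resources are tight the censor C is skipped and D is asserted;
# otherwise C is checked and may flip the conclusion from D to ~D.
from dataclasses import dataclass
from typing import Callable, Dict, Optional

Facts = Dict[str, bool]

@dataclass
class CensoredProductionRule:
    premise: Callable[[Facts], bool]   # P
    decision: str                      # D
    censor: Callable[[Facts], bool]    # C, the exception that rarely holds

    def fire(self, facts: Facts, check_censor: bool = True) -> Optional[str]:
        """Evaluate the rule; the censor can be skipped when resources are tight."""
        if not self.premise(facts):
            return None
        if check_censor and self.censor(facts):
            return f"not {self.decision}"   # the censor flips the polarity of D to ~D
        return self.decision

# Example CPR: "If bird Then flies Unless penguin".
rule = CensoredProductionRule(
    premise=lambda f: f.get("bird", False),
    decision="flies",
    censor=lambda f: f.get("penguin", False),
)
print(rule.fire({"bird": True}))                                        # flies
print(rule.fire({"bird": True, "penguin": True}))                       # not flies
print(rule.fire({"bird": True, "penguin": True}, check_censor=False))   # flies
```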

Keywords: Censored production rules, cumulative learning, data mining, machine learning.

6989 A Web-Based System for Mapping Features into ISO 14649-Compliant Machining Workingsteps

Authors: J. C. T. Benavente, J. C. E. Ferreira

Abstract:

The rapid development of manufacturing and information systems has caused significant changes in manufacturing environments in recent decades. Mass production has given way to flexible manufacturing systems, in which an important characteristic is customized or "on demand" production. In this scenario, a seamless, gap-free information flow becomes a key factor for the success of enterprises. In this paper we present a framework to support the mapping of features into machining workingsteps compliant with the ISO 14649 standard (known as STEP-NC). The system determines how the features can be made with the available manufacturing resources. Examples of the mapping method are presented for features such as a pocket with a general surface.

Keywords: Features, ISO 14649 standard, STEP-NC, mapping, machining workingsteps.

6988 Social Media as a Tool for Political Communication: A Case Study of India

Authors: Srikanth Bade

Abstract:

This paper discusses how the use of social media has altered certain discourses and communicated with political institutions to prompt major actions in the Indian scenario. The advent of new technology in the form of social media has drawn the general public into discussion in an open forum. How they turned their ideas into action is captured in this study. Moreover, the discourses taking place on social media are analyzed from certain philosophical traditions by adopting a framework. Hence, this paper analyses the role of social media in political communication and in changing political discourse. It also tries to address whether the deliberation carried out through social media has indeed communicated political matters to the decision-making authorities.

Keywords: Collective action and social capital, political communication, political discourse, social media.

6987 Study the Effect of Soft Errors on FlexRay-Based Automotive Systems

Authors: Yung-Yuan Chen, Kuen-Long Leu

Abstract:

FlexRay, a communication protocol for automotive control systems, was developed to fulfill the increasing demand on electronic control units for implementing systems with higher safety and more comfort. In this work, we study the impact of radiation-induced soft errors on a FlexRay-based steer-by-wire system. We injected soft errors into the general-purpose register set of FlexRay nodes to identify the most critical registers and the failure modes of the steer-by-wire system, and to measure the probability distribution of failure modes when an error occurs in the register file.
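The abstract describes injecting soft errors into general-purpose registers; as a language-level illustration only (not the authors' fault-injection environment), the snippet below shows the core operation such campaigns rely on: flipping a single bit of a register value and tallying where the faults were injected. Register names and the campaign size are assumptions.

```python
# Minimal single-bit-flip fault-injection sketch over a hypothetical 32-bit register file.
import random
from collections import Counter

def flip_bit(value: int, bit: int, width: int = 32) -> int:
    """Return the register value with one bit inverted, masked to the register width."""
    return (value ^ (1 << bit)) & ((1 << width) - 1)

# Hypothetical 16-register general-purpose register file with random contents.
registers = {f"R{i}": random.getrandbits(32) for i in range(16)}

# Injection campaign: pick a random (register, bit) target per experiment.
# A real campaign would then run the steer-by-wire workload on the corrupted
# state and record the observed failure mode; here we only tally the targets.
injections = Counter()
for _ in range(1000):
    name = random.choice(list(registers))
    bit = random.randrange(32)
    corrupted = flip_bit(registers[name], bit)
    injections[(name, "upper half" if bit >= 16 else "lower half")] += 1

print(injections.most_common(5))
```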

Keywords: Soft errors, FlexRay, fault injection, steer-by-wire

6986 Pattern Classification of Back-Propagation Algorithm Using Exclusive Connecting Network

Authors: Insung Jung, Gi-Nam Wang

Abstract:

The objective of this paper is the design of a pattern classification model based on the back-propagation (BP) algorithm for decision support systems. The standard BP model fully connects every node in each layer from input to output; therefore, it takes a lot of computing time and many iterations to reach good performance and an acceptable error rate when generating patterns or training the network. In contrast, the proposed model uses exclusive connections between the hidden-layer nodes and the output nodes. The advantage of this model is a smaller number of iterations and better performance compared with the standard back-propagation model. We simulated several classification data sets and different settings of network factors (e.g., number of hidden layers and nodes, number of classes, and iterations). During our simulations, we found that most cases were handled better by the BP model using the exclusive connection network than by standard BP. We expect this algorithm to be applicable to face identification, data analysis, and the mapping between environmental data and information.

Keywords: Neural network, Back-propagation, classification.

6985 Quick Sequential Search Algorithm Used to Decode High-Frequency Matrices

Authors: Mohammed M. Siddeq, Mohammed H. Rasheed, Omar M. Salih, Marcos A. Rodrigues

Abstract:

This research proposes a data encoding and decoding method based on the Matrix Minimization algorithm. This algorithm is applied to high-frequency coefficients for compression/encoding. The algorithm starts by converting every three coefficients to a single value; this is accomplished based on three different keys. The decoding/decompression uses a search method, called the Quick Sequential Search (QSS) Decoding Algorithm and presented in this research, which is based on sequential search to recover the exact coefficients. In the next step, the decoded data are saved in an auxiliary array. The basic idea behind the auxiliary array is to save all possible decoded coefficients; this is because another algorithm, such as conventional sequential search, could retrieve the encoded/compressed data independently from the proposed algorithm. The experimental results show that our proposed decoding algorithm retrieves the original data faster than conventional sequential search algorithms.
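The abstract summarizes the idea of collapsing three coefficients into one value using three keys and recovering them by sequential search; the sketch below is a simplified illustration of that idea with arbitrary example keys and a small coefficient range, not the paper's exact Matrix Minimization or QSS algorithm.

```python
# Simplified illustration: encode three small coefficients as one value using
# three keys, then recover them by a sequential search over the coefficient range.
from itertools import product

KEYS = (289, 17, 1)              # example keys chosen so every triple maps to a unique value
COEFF_RANGE = range(-8, 9)       # assumed range of high-frequency coefficients

def encode(c1: int, c2: int, c3: int) -> int:
    return KEYS[0] * c1 + KEYS[1] * c2 + KEYS[2] * c3

def decode_sequential(value: int):
    """Sequentially try candidate triples until one reproduces the encoded value."""
    for triple in product(COEFF_RANGE, repeat=3):
        if encode(*triple) == value:
            return triple
    raise ValueError("no coefficient triple reproduces this value")

original = (3, -5, 7)
value = encode(*original)
print(value, decode_sequential(value) == original)   # prints: 789 True
```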

Keywords: Matrix Minimization Algorithm, Decoding Sequential Search Algorithm, image compression, Discrete Cosine Transform, Discrete Wavelet Transform.

6984 A New Analytic Solution for the Heat Conduction with Time-Dependent Heat Transfer Coefficient

Authors: Te Wen Tu, Sen Yung Lee

Abstract:

An alternative approach is proposed to develop the analytic solution for one-dimensional heat conduction with one mixed-type boundary condition and a general time-dependent heat transfer coefficient. In this study, the physical meaning of the solution procedure is revealed. It is shown that the shifting function takes the physical meaning of the reciprocal of the Biot function at the initial time. Numerical results show the accuracy of this study; compared with those given in the existing literature, the difference is less than 0.3%.
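The abstract states the problem class without equations; written out for clarity in a generic non-dimensional form (an assumption, not the paper's exact notation), a one-dimensional conduction problem with a time-dependent heat transfer coefficient entering a mixed (Robin) boundary condition reads:

```latex
% Generic non-dimensional statement (assumed notation): 1-D heat conduction with
% a time-dependent Biot function Bi(tau) in the mixed boundary condition at x = 1.
\begin{aligned}
&\frac{\partial \theta}{\partial \tau} = \frac{\partial^{2} \theta}{\partial x^{2}},
    && 0 < x < 1,\ \tau > 0,\\
&\theta(0,\tau) = 0,
    && \left.\frac{\partial \theta}{\partial x}\right|_{x=1}
       + \mathrm{Bi}(\tau)\,\theta(1,\tau) = 0,\\
&\theta(x,0) = \theta_{0}(x).
\end{aligned}
```

The shifting-function idea is to split the temperature into a transformed variable plus a shifting function chosen so that the time-dependent boundary condition becomes homogeneous; as the abstract notes, this shifting function reduces to the reciprocal of the Biot function at the initial time.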

Keywords: Analytic solution, heat transfer coefficient, shifting function method, time-dependent boundary condition.

6983 A Survey in Techniques for Imbalanced Intrusion Detection System Datasets

Authors: Najmeh Abedzadeh, Matthew Jacobs

Abstract:

An intrusion detection system (IDS) is a software application that monitors malicious activities and generates alerts if any are detected. However, most network activities in IDS datasets are normal, and the relatively small number of attacks makes the available data imbalanced. Consequently, cyber-attacks can hide inside a large number of normal activities, and machine learning algorithms have difficulty learning and classifying the data correctly. In this paper, a comprehensive literature review is conducted on different types of algorithms both for implementing the IDS and for correcting the imbalanced IDS dataset. The most prominent approaches are machine learning (ML), deep learning (DL), the synthetic minority over-sampling technique (SMOTE), and reinforcement learning (RL). Most of the research uses the CSE-CIC-IDS2017, CSE-CIC-IDS2018, and NSL-KDD datasets for evaluating their algorithms.
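Among the rebalancing methods surveyed, SMOTE is the easiest to illustrate in code; the sketch below is a generic example of oversampling a synthetic imbalanced two-class problem before training a classifier, not an experiment on the CSE-CIC-IDS or NSL-KDD datasets.

```python
# Generic SMOTE example on a synthetic imbalanced binary problem
# (stand-in for an IDS dataset where attacks are the rare class).
from collections import Counter
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from imblearn.over_sampling import SMOTE

X, y = make_classification(n_samples=5000, n_features=20, weights=[0.98, 0.02],
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

print("before SMOTE:", Counter(y_train))
X_res, y_res = SMOTE(random_state=0).fit_resample(X_train, y_train)
print("after SMOTE: ", Counter(y_res))

# Train on the rebalanced data, evaluate on the untouched (still imbalanced) test set.
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_res, y_res)
print(classification_report(y_test, clf.predict(X_test), digits=3))
```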

Keywords: IDS, intrusion detection system, imbalanced datasets, sampling algorithms, big data.

6982 A New Approach for Recoverable Timestamp Ordering Schedule

Authors: Hassan M. Najadat

Abstract:

A new approach to the timestamp ordering problem in serializable schedules is presented. Since the number of database users is increasing rapidly, accuracy and the need for high throughput are main topics in the database area. Strict 2PL does not allow all possible serializable schedules and so does not yield high throughput. The main advantages of the approach are the ability to enforce recoverable transaction execution and the high achievable performance of concurrent execution in centralized databases. Compared to Strict 2PL, the general structure of the algorithm is simple and deadlock-free, and it allows executing all possible serializable schedules, which results in high throughput. Various examples involving different orders of database operations are discussed.
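The abstract does not spell out the timestamp rules; as background only, the sketch below implements the basic (non-recoverable) timestamp-ordering checks such a scheduler builds on, where each data item keeps the largest read and write timestamps seen so far. The paper's extension for guaranteeing recoverability is not reproduced here.

```python
# Basic timestamp-ordering checks: each item tracks the largest read/write
# timestamps; an operation from an "older" transaction that arrives too late
# causes that transaction to abort (return False).
class Item:
    def __init__(self):
        self.read_ts = 0
        self.write_ts = 0

def read(item: Item, ts: int) -> bool:
    """Transaction with timestamp ts tries to read item; False means abort."""
    if ts < item.write_ts:          # a younger transaction already wrote the item
        return False
    item.read_ts = max(item.read_ts, ts)
    return True

def write(item: Item, ts: int) -> bool:
    """Transaction with timestamp ts tries to write item; False means abort."""
    if ts < item.read_ts or ts < item.write_ts:
        return False                # a younger transaction already read or wrote it
    item.write_ts = ts
    return True

x = Item()
print(write(x, ts=5))   # True
print(read(x, ts=3))    # False: transaction 3 is too old and must abort
print(read(x, ts=7))    # True
print(write(x, ts=6))   # False: the item was already read by a younger transaction
```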

Keywords: Concurrency control, schedule, timestamp, transaction.

6981 Disidentification of Historical City Centers: A Comparative Study of the Old and New Settlements of Mardin, Turkey

Authors: Fatma Kürüm Varolgüneş, Fatih Canan

Abstract:

Mardin is one of the unique cities in Turkey with its rich cultural and historical heritage. Mardin’s traditional dwellings have been affected both by natural data such as climate and topography and by cultural data like lifestyle and belief. However, in the new settlements, housing is formed with modern approaches and unsuitable forms clashing with Mardin’s culture and environment. While the city is expanding, traditional textures are ignored. Thus, traditional settlements are losing their identity and are vanishing because of the rapid change and transformation. The main aim of this paper is to determine the physical and social data needed to define the characteristic features of Mardin’s old and new settlements. In this context, based on social and cultural data, old and new settlement formations of Mardin have been investigated from various aspects. During this research, the following methods have been utilized: observations, interviews, public surveys, literature review, as well as site examination via maps, photographs and questionnaire methodology. In conclusion, this paper focuses on how changes in the physical forms of cities affect the typology and the identity of cities, as in the case of Mardin.

Keywords: Urban and local identity, historical city center, traditional settlements, Mardin, Turkey.

6980 Data Mining Techniques in Computer-Aided Diagnosis: Non-Invasive Cancer Detection

Authors: Florin Gorunescu

Abstract:

Diagnosis can be achieved by building a model of a certain organ under surveillance and comparing it with real-time physiological measurements taken from the patient. This paper presents the benefits of using data mining techniques in computer-aided diagnosis (CAD), focusing on cancer detection, in order to help doctors make optimal decisions quickly and accurately. In the field of noninvasive diagnosis techniques, endoscopic ultrasound elastography (EUSE) is a recent elasticity imaging technique that allows characterizing the difference between malignant and benign tumors. Digitizing and summarizing the main features of the EUSE sample movies in vector form involves the use of exploratory data analysis (EDA). Neural networks are then trained on the corresponding EUSE sample movie vector inputs in such a way that these intelligent systems are able to offer a very precise and objective diagnosis, discriminating between benign and malignant tumors. A concrete application of these data mining techniques illustrates the suitability and the reliability of this methodology in CAD.

Keywords: Endoscopic ultrasound elastography, exploratory data analysis, neural networks, non-invasive cancer detection.

6979 Development and Characterization of Bio-Tribological, Nano-Multilayer Coatings for Medical Tools Application

Authors: L. Major, J. M. Lackner, M. Dyner, B. Major

Abstract:

The development of a new generation of bio-tribological, multilayer coatings opens an avenue for the fabrication of future high-tech functional surfaces. In the presented work, nano-composite Cr/CrN+[Cr/a-C:H implanted by metallic nanocrystals] multilayer coatings have been developed for the surface protection of medical tools. The thin films were fabricated by a hybrid Pulsed Laser Deposition technique. Complex microstructure analysis of the nano-multilayer coatings, subjected to mechanical and biological tests, was performed by means of transmission electron microscopy (TEM). The microstructure characterization revealed the layered arrangement of Cr23C6 nanoparticles in the multilayer structure. The influence of deposition conditions on the bio-tribological properties of the coatings was studied. The bio-tests were used as a screening tool for the analyzed nano-multilayer coatings before they could be deposited on medical tools. The bio-medical tests were done using fibroblasts. The mechanical properties of the coatings were investigated by means of a ball-on-disc mechanical test; the microhardness was measured using a Berkovich indenter, and the scratch adhesion test was done using a Rockwell indenter. From the bio-tribological point of view, the C106_1 material had the optimal properties.

Keywords: Bio-tribological coatings, cell-material interaction, hybrid PLD, tribology.

6978 From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks

Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone

Abstract:

Seizures are the main factor that affects the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made by continuous Electroencephalogram (EEG) signal monitoring. Seizure identification on EEG signals is performed manually by epileptologists, and this process is usually very long and error prone. The aim of this paper is to describe an automated method able to detect seizures in EEG signals, using the knowledge discovery in databases process and data mining methods and algorithms, which can support physicians during the seizure detection process. Our detection method is based on an Artificial Neural Network classifier, trained by applying the multilayer perceptron algorithm, and on a software application, called Training Builder, that has been developed for the massive extraction of features from EEG signals. This tool is able to cover all the data preparation steps, ranging from signal processing to data analysis techniques, including the sliding window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performance, reaching an accuracy of over 99% during tests on data of a single patient retrieved from a publicly available EEG dataset.
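The Training Builder tool itself is not described in the abstract; the snippet below is only a schematic of the pipeline it implies: slide a window over a simulated single-channel EEG signal, extract a few simple features per window, and train a multilayer-perceptron classifier on window labels. The window length, features, and labels are assumptions.

```python
# Schematic seizure-detection pipeline: sliding-window feature extraction on a
# simulated EEG channel, followed by a multilayer perceptron classifier.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(3)
fs = 256                                    # sampling rate (Hz), assumed
signal = rng.normal(size=fs * 600)          # 10 minutes of background activity
labels_per_sample = np.zeros_like(signal)
signal[fs*300:fs*330] += 3 * np.sin(2 * np.pi * 5 * np.arange(fs * 30) / fs)  # crude "seizure"
labels_per_sample[fs*300:fs*330] = 1

win = fs * 2                                # 2-second, non-overlapping windows (assumed)
features, labels = [], []
for start in range(0, len(signal) - win, win):
    w = signal[start:start + win]
    features.append([w.mean(), w.std(), np.abs(np.diff(w)).mean(), w.max() - w.min()])
    labels.append(int(labels_per_sample[start:start + win].mean() > 0.5))

X_train, X_test, y_train, y_test = train_test_split(
    np.array(features), np.array(labels), stratify=labels, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000, random_state=0)
clf.fit(X_train, y_train)
print("window accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```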

Keywords: Artificial Neural Network, Data Mining, Electroencephalogram, Epilepsy, Feature Extraction, Seizure Detection, Signal Processing.

6977 TELUM Land Use Model: An Investigation of Data Requirements and Calibration Results for Chittenden County MPO, U.S.A.

Authors: Georgia Pozoukidou

Abstract:

TELUM software is a land use model designed specifically to help metropolitan planning organizations (MPOs) prepare their transportation improvement programs and fulfill their numerous planning responsibilities. In this context, obtaining, preparing, and validating socioeconomic forecasts are becoming fundamental tasks for an MPO in order to ensure that consistent population and employment data are provided to travel demand models. The Chittenden County Metropolitan Planning Organization of Vermont State was used as a case study to test the applicability of the TELUM land use model. The technical insights and lessons learned from the land use model application have transferable value for all MPOs faced with land use forecast development and transportation modeling.

Keywords: Calibration data requirements, land use models, land use planning, Metropolitan Planning Organizations.

6976 Rotor Concepts for the Counter Flow Heat Recovery Fan

Authors: Christoph Speer

Abstract:

Decentralized ventilation systems should combine a small and economical design with high aerodynamic and thermal efficiency. The Counter Flow Heat Recovery Fan (CHRF) provides the ability to meet these requirements by using only one cross flow fan with a large number of blades, which generates both airflows and simultaneously acts as a regenerative counter flow heat exchanger. The successful development of the first laboratory prototype has shown the potential of this ventilation system. Condensate occurring on the surfaces of the fan blades during the cold and dry season can be recovered through the characteristic mode of operation; hence, the CHRF makes it possible to avoid the need for frost protection and a condensate drain. Through the implementation of system-specific solutions for flow balancing and a summer bypass, the required functionality is assured. The scalability of the CHRF concept allows its use in renovation as well as in new buildings, from single-room devices through to systems for office buildings. High aerodynamic and thermal efficiency and the lower number of required mechatronic components should enable a reduction in investment as well as operating costs. The rotor is the key component of the system; its requirements and possible implementation variants are presented.

Keywords: CHRF, counter flow heat recovery fan, decentralized ventilation system, renovation.

6975 Road Traffic Accidents Analysis in Mexico City through Crowdsourcing Data and Data Mining Techniques

Authors: Gabriela V. Angeles Perez, Jose Castillejos Lopez, Araceli L. Reyes Cabello, Emilio Bravo Grajales, Adriana Perez Espinosa, Jose L. Quiroz Fabian

Abstract:

Road traffic accidents are among the principal causes of traffic congestion, causing human losses, damage to health and the environment, economic losses, and material damage. Traditional studies of road traffic accidents in urban zones require a very high investment of time and money, and in addition the results are not current. Nowadays, however, in many countries crowdsourced GPS-based traffic and navigation apps have emerged as an important low-cost source of information for studies of road traffic accidents and the urban congestion they cause. In this article we identify the zones, roads, and specific times in Mexico City (CDMX) in which the largest numbers of road traffic accidents were concentrated during 2016. We built a database compiling information obtained from the social network known as Waze. The methodology employed was Knowledge Discovery in Databases (KDD) for the discovery of patterns in the accident reports, using data mining techniques with the help of Weka. The selected algorithms were Expectation Maximization (EM), to obtain the ideal number of clusters for the data, and k-means as the grouping method. Finally, the results were visualized with the Geographic Information System QGIS.
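The abstract names Weka's EM and k-means implementations; the sketch below reproduces the same two-step idea in Python rather than Weka: use a Gaussian-mixture/EM model-selection criterion to suggest a number of clusters, then group accident locations with k-means. The coordinates are simulated, not the 2016 Waze reports.

```python
# Two-step clustering sketch mirroring the described methodology: pick a cluster
# count with EM (Gaussian mixture + BIC), then group accident locations with k-means.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.mixture import GaussianMixture
from sklearn.cluster import KMeans

# Simulated accident coordinates clustered around a few hotspots.
coords, _ = make_blobs(n_samples=2000, centers=5, cluster_std=0.3, random_state=0)

# Step 1: EM over candidate cluster counts; keep the count with the lowest BIC.
bics = {k: GaussianMixture(n_components=k, random_state=0).fit(coords).bic(coords)
        for k in range(2, 11)}
best_k = min(bics, key=bics.get)
print("suggested number of clusters:", best_k)

# Step 2: k-means grouping with the suggested k; the centers approximate the hotspots.
km = KMeans(n_clusters=best_k, n_init=10, random_state=0).fit(coords)
print("hotspot centers:\n", km.cluster_centers_)
```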

Keywords: Data mining, K-means, road traffic accidents, Waze, Weka.

6974 Efficient Tuning Parameter Selection by Cross-Validated Score in High Dimensional Models

Authors: Yoonsuh Jung

Abstract:

As DNA microarray data contain a relatively small sample size compared to the number of genes, high dimensional models are often employed. In high dimensional models, the selection of the tuning parameter (or penalty parameter) is often one of the crucial parts of the modeling. Cross-validation is one of the most common methods for tuning parameter selection; it selects the parameter value with the smallest cross-validated score. However, selecting a single value as the ‘optimal’ value for the parameter can be very unstable due to sampling variation, since the sample sizes of microarray data are often small. Our approach is to choose multiple candidates for the tuning parameter first, and then average the candidates with different weights depending on their performance. The additional step of estimating the weights and averaging the candidates rarely increases the computational cost, while it can considerably improve on traditional cross-validation. We show, via real data and simulated data sets, that the values selected by the suggested methods often lead to more stable parameter selection as well as improved detection of significant genetic variables compared to traditional cross-validation.
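As a toy illustration of the averaging idea (the exact weighting scheme of the paper is not given in the abstract), the sketch below cross-validates a Lasso over a grid of penalty values, keeps several of the best candidates, and averages them with weights inversely proportional to their cross-validated error, instead of committing to the single minimizer.

```python
# Toy version of tuning-parameter averaging: instead of the single CV-optimal
# Lasso penalty, average a few top candidates weighted by their CV performance.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=60, n_features=500, n_informative=10,
                       noise=5.0, random_state=0)          # p >> n, as in microarray data

alphas = np.logspace(-2, 2, 30)
cv_mse = np.array([-cross_val_score(Lasso(alpha=a, max_iter=50000), X, y,
                                    cv=5, scoring="neg_mean_squared_error").mean()
                   for a in alphas])

# Standard choice: the single alpha with the smallest cross-validated error.
alpha_single = alphas[np.argmin(cv_mse)]

# Averaging step: take the 5 best candidates, weight them by inverse CV error.
top = np.argsort(cv_mse)[:5]
weights = (1.0 / cv_mse[top]) / (1.0 / cv_mse[top]).sum()
alpha_averaged = float(np.sum(weights * alphas[top]))

print(f"single CV choice: {alpha_single:.3f}, weighted average: {alpha_averaged:.3f}")
```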

Keywords: Cross Validation, Parameter Averaging, Parameter Selection, Regularization Parameter Search.

6973 Image Similarity: A Genetic Algorithm Based Approach

Authors: R. C. Joshi, Shashikala Tapaswi

Abstract:

The paper proposes an approach using a genetic algorithm for computing region-based image similarity. The image is represented using a set of segmented regions reflecting the color and texture properties of the image. An image is associated with a family of image features corresponding to the regions. The resemblance of two images is then defined as the overall similarity between two families of image features, quantified by a similarity measure which integrates properties of all the regions in the images. A genetic algorithm is applied to decide the most plausible matching. The performance of the proposed method is illustrated using examples from an image database of general-purpose images and is shown to produce good results.

Keywords: Image Features, color descriptor, segmented classes, texture descriptors, genetic algorithm.

6972 Using ALOHA Code to Evaluate CO2 Concentration for Maanshan Nuclear Power Plant

Authors: W. S. Hsu, S. W. Chen, Y. T. Ku, Y. Chiang, J. R. Wang, J. H. Yang, C. Shih

Abstract:

In this study, the ALOHA code was used to calculate the CO2 concentration under the CO2 storage burst condition for the Maanshan nuclear power plant (NPP). Five main types of data are input into the ALOHA code: location, building, chemical, atmospheric, and source data. The data from the Final Safety Analysis Report (FSAR) and some reports were used in this study. The ALOHA results are compared with the failure criteria of R.G. 1.78 to confirm the habitability of the control room. The comparison shows that the ALOHA result is below the R.G. 1.78 criteria, which implies that the habitability of the control room can be maintained in this case. A sensitivity study for the atmospheric parameters was also performed; the results show that the wind speed has the largest effect on the concentration calculation.

Keywords: PWR, ALOHA, habitability, Maanshan.

6971 Review and Comparison of Associative Classification Data Mining Approaches

Authors: Suzan Wedyan

Abstract:

Associative classification (AC) is a data mining approach that combines association rule mining and classification to build classification models (classifiers). AC has attracted significant attention from several researchers, mainly because it derives accurate classifiers that contain simple yet effective rules. In the last decade, a number of associative classification algorithms have been proposed, such as Classification Based on Association (CBA), Classification based on Multiple Association Rules (CMAR), Class based Associative Classification (CACA), and Classification based on Predicted Association Rule (CPAR). This paper surveys the major AC algorithms and compares the steps and methods performed in each algorithm, including rule learning, rule sorting, rule pruning, classifier building, and class prediction.

Keywords: Associative Classification, Classification, Data Mining, Learning, Rule Ranking, Rule Pruning, Prediction.
