Search results for: regression models.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3059

2189 Expected Present Value of Losses in the Computation of Optimum Seismic Design Parameters

Authors: J. García-Pérez

Abstract:

An approach to computing optimum seismic design parameters is presented. It is based on the optimization of the expected present value of the total cost, which includes the initial cost of structures as well as the cost due to earthquakes. Different types of seismicity models are considered, including one for characteristic earthquakes. Uncertainties are included in some variables to observe their influence on the optimum values. Optimum seismic design coefficients are computed for three structural types representing high-, medium- and low-rise buildings, located near to and far from the seismic sources. Ordinary and important structures are considered in the analysis. The results show that the optimum values are strongly influenced by the seismicity models as well as by the uncertainties in the variables.
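As a hedged illustration only (not the exact formulation used by the author), the optimization described above is often written as minimizing, over the design coefficient c, the initial cost plus the discounted expected losses from future earthquakes:

```latex
% Illustrative sketch of a common objective in optimum seismic design.
% All symbols are assumptions, not taken from the paper:
% C_0(c): initial construction cost for design coefficient c
% L_j(c): loss in the j-th future earthquake, occurring at time t_j
% gamma : discount rate
\[
  C_T(c) \;=\; C_0(c) \;+\; \mathbb{E}\!\left[\sum_{j} L_j(c)\, e^{-\gamma t_j}\right],
  \qquad
  c^{*} \;=\; \arg\min_{c}\; C_T(c).
\]
```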

Keywords: Importance factors, optimum parameters, seismic losses, seismic risk, total cost.

2188 A Dynamic Hybrid Option Pricing Model by Genetic Algorithm and Black-Scholes Model

Authors: Yi-Chang Chen, Shan-Lin Chang, Chia-Chun Wu

Abstract:

Unlike this study, which focuses on the trading behavior of the option market, previous research has concentrated on model-driven option pricing. For example, the Black-Scholes (B-S) model is one of the most famous option pricing models. However, the assumptions of the B-S model have been questioned in earlier reviews of pricing models. This paper therefore emphasizes the importance of dynamic behavior in option pricing, which also motivates the use of a genetic algorithm (GA). Drawing on the GA's mechanisms of natural selection and evolution, this study proposes a hybrid model, the Genetic-BS model, which combines the GA and the B-S model to estimate prices more accurately. In the final experiments, the results show that the estimated prices have a lower MAE than the prices calculated by either the B-S model or its enhanced variant, the Gram-Charlier GARCH (G-C GARCH) model. Finally, this work concludes that the Genetic-BS pricing model is practical.
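For reference, the closed-form B-S call price that such a hybrid would start from can be computed as below. This is a minimal sketch of the standard formula only; the GA layer described in the abstract is not reproduced here, and the numerical inputs are illustrative assumptions.

```python
from math import log, sqrt, exp
from scipy.stats import norm

def black_scholes_call(S, K, T, r, sigma):
    """Standard Black-Scholes price of a European call.

    S: spot price, K: strike, T: time to maturity (years),
    r: risk-free rate, sigma: volatility.
    """
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm.cdf(d1) - K * exp(-r * T) * norm.cdf(d2)

# Example: an at-the-money call with one year to maturity (placeholder values)
print(black_scholes_call(S=100.0, K=100.0, T=1.0, r=0.02, sigma=0.25))
```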

Keywords: genetic algorithm, Genetic-BS, option pricing model.

2187 Evaluation of the Impact of Dataset Characteristics for Classification Problems in Biological Applications

Authors: Kanthida Kusonmano, Michael Netzer, Bernhard Pfeifer, Christian Baumgartner, Klaus R. Liedl, Armin Graber

Abstract:

The availability of high-dimensional biological datasets, such as those from gene expression, proteomic, and metabolic experiments, can be leveraged for the diagnosis and prognosis of diseases. Many classification methods in this area have been studied to predict disease states and separate between predefined classes, such as patients with a particular disease versus healthy controls. However, most of the existing research focuses only on a specific dataset. There is a lack of generic comparison between classifiers, which might provide a guideline for biologists or bioinformaticians to select the proper algorithm for new datasets. In this study, we compare the performance of popular classifiers, namely Support Vector Machine (SVM), Logistic Regression, k-Nearest Neighbor (k-NN), Naive Bayes, Decision Tree, and Random Forest, on mock datasets. We mimic common biological scenarios by simulating various proportions of real discriminating biomarkers and different effect sizes thereof. The results show that SVM performs stably and reaches a higher AUC than the other methods, which may be explained by the ability of SVM to minimize the probability of error. Moreover, Decision Tree, with its good applicability for diagnosis and prognosis, shows good performance in our experimental setup. Logistic Regression and Random Forest, however, strongly depend on the ratio of discriminators and perform better when the number of discriminators is higher.
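A minimal sketch of the kind of comparison described above, using scikit-learn with synthetic data in place of the authors' mock biological datasets; the simulation parameters and classifier settings below are illustrative assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier

# High-dimensional mock data: a few informative "biomarkers" among many noise features
X, y = make_classification(n_samples=200, n_features=1000, n_informative=20,
                           n_redundant=0, class_sep=1.0, random_state=0)

classifiers = {
    "SVM": SVC(kernel="linear"),
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "Naive Bayes": GaussianNB(),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
}

# Cross-validated AUC as the comparison metric
for name, clf in classifiers.items():
    auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
    print(f"{name:20s} AUC = {auc.mean():.3f} +/- {auc.std():.3f}")
```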

Keywords: Classification, High dimensional data, Machine learning

2186 Bio-Heat Transfer in Various Transcutaneous Stimulation Models

Authors: Trevor E. Davis, Isaac Cassar, Yi-Kai Lo, Wentai Liu

Abstract:

This study models the use of transcutaneous electrical nerve stimulation on skin with a disk electrode in order to simulate tissue damage. The current density distribution above a disk electrode is known to be a dynamic and non-uniform quantity that is intensified at the edges of the disk. The non-uniformity can be altered by using various electrode geometries or stimulation methods. One of these methods, known as edge-retarded stimulation, has been shown to reduce this edge enhancement. Although progress has been made in modeling the behavior of a disk electrode, little has been done to test the validity of these models in simulating the actual heat transfer from the electrode. This simulation uses finite element software to couple the injection of current from a disk electrode to heat transfer described by the Pennes bioheat transfer equation. An example application of this model is the study of an experimental form of stimulation, known as edge-retarded stimulation, which reduces the current density at the edges of the electrode. It is hypothesized that reducing the current density edge enhancement effect will, in turn, reduce the temperature change and tissue damage at the edges of these electrodes. This study tests this hypothesis as a demonstration of the capabilities of the model. In the simulation, edge-retarded stimulation proved to be safer: the temperature change and the fraction of tissue necrosis are much greater with square-wave stimulation. These results also have implications for procedural changes in transcutaneous electrical nerve stimulation and transcutaneous spinal cord stimulation.
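The Pennes bioheat transfer equation referred to above, with a Joule-heating term coupling it to the injected current, is commonly written as follows; the notation and the exact form of the source term are stated here as an illustration and are not taken from the authors' implementation.

```latex
% Pennes bioheat equation with an electrical (Joule) heating source term.
% rho, c, k      : tissue density, specific heat, thermal conductivity
% rho_b, c_b     : blood density and specific heat
% omega_b        : blood perfusion rate; T_a: arterial blood temperature
% Q_m            : metabolic heat generation
% sigma|grad V|^2: Joule heating from the electrode current (assumed coupling)
\[
  \rho c \,\frac{\partial T}{\partial t}
  = \nabla\!\cdot\!\left(k \nabla T\right)
  + \rho_b c_b \,\omega_b \left(T_a - T\right)
  + Q_m
  + \sigma \left|\nabla V\right|^2 .
\]
```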

Keywords: Bioheat transfer, Electrode, Neuroprosthetics, TENS, Transcutaneous stimulation.

2185 Quality-Driven Business Process Refactoring

Authors: María Fernández-Ropero, Ricardo Pérez-Castillo, Ismael Caballero, Mario Piattini

Abstract:

Appropriate description of business processes through standard notations has become one of the most important assets for organizations. Organizations must therefore deal with quality faults in business process models, such as a lack of understandability and modifiability. These quality faults may be exacerbated if business process models are mined by reverse engineering, e.g., from existing information systems that support those business processes. Hence, business process refactoring is often used, which changes the internal structure of business processes whilst preserving their external behavior. This paper aims to choose the most appropriate set of refactoring operators through quality assessment concerning understandability and modifiability. These quality features are assessed through well-proven measures proposed in the literature. Additionally, a set of measure thresholds is heuristically established for applying the most promising refactoring operators, i.e., those that achieve the highest quality improvement according to the selected measures in each case.

Keywords: business process model, modifiability, refactoring, understandability

2184 A Forecast Model for Projecting the Amount of Hazardous Waste

Authors: J. Vilgerts, L. Timma, D. Blumberga

Abstract:

The objective of the paper is to develop a forecast model for hazardous waste (HW) flows. The methodology of the research comprised six modules: historical data, assumptions, choice of indicators, data processing, data analysis with STATGRAPHICS, and forecast models. The proposed methodology was validated in a case study for Latvia. Hypotheses on the changes in HW for the time period 2010-2020 were developed and described mathematically at confidence levels of 95.0% and 50.0%. A sensitivity analysis of the analyzed scenarios was performed. The results show that the growth of GDP affects the total amount of HW in the country. At a confidence level of 50.0%, the total amount of HW for 2010-2020 is projected to lie within a corridor from -27.7% in the optimistic scenario up to +87.8% in the pessimistic scenario. The optimistic scenario proved to be the least flexible with respect to changes in GDP growth.
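A hedged sketch of the kind of trend extrapolation with confidence bands described above; the paper used STATGRAPHICS, whereas statsmodels is substituted here, and the yearly figures are placeholder values rather than the Latvian data.

```python
import numpy as np
import statsmodels.api as sm

# Placeholder historical series: year vs. hazardous waste amount (kt), illustrative only
years = np.arange(2000, 2010)
hw = np.array([41.0, 43.5, 44.2, 47.8, 50.1, 53.0, 55.4, 58.2, 57.9, 60.3])

# Linear trend fitted by ordinary least squares
X = sm.add_constant(years)
model = sm.OLS(hw, X).fit()

# Forecast 2010-2020 with 95% and 50% prediction intervals
future = sm.add_constant(np.arange(2010, 2021))
pred = model.get_prediction(future)
print(pred.summary_frame(alpha=0.05)[["mean", "obs_ci_lower", "obs_ci_upper"]])
print(pred.summary_frame(alpha=0.50)[["obs_ci_lower", "obs_ci_upper"]])
```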

Keywords: Forecast models, hazardous waste management, sustainable development, waste management indicators.

2183 Communicative and Artistic Machines: A Survey of Models and Experiments on Artificial Agents

Authors: Artur Matuck, Guilherme F. Nobre

Abstract:

Machines can be tools, media, or social agents. Advances in technology have been delivering machines capable of autonomous expression, both through communication and through art. This paper deals with models (theoretical approach) and experiments (applied approach) related to artificial agents. On the one hand, it traces how scholars in the social sciences have worked on topics such as text automation, man-machine writing cooperation, and communication. On the other hand, it covers how scholars in the computer sciences have built communicative and artistic machines, including the programming of creativity. The aim is to present a brief survey of artificially intelligent communicators and artificially creative writers, and to provide a basis for understanding meta-authorship as well as new and further forms of man-machine co-authorship.

Keywords: Artificial communication, artificial creativity, artificial writers, meta-authorship, robotic art.

2182 External Effects on Dynamic Competitive Model of Domestic Airline and High Speed Rail

Authors: Shih-Ching Lo, Yu-Ping Liao

Abstract:

Socio-economic variables strongly influence transportation demand. Discrete choice model analyses consider socio-economic variables to study travelers' mode choice and demand. However, calibrating a discrete choice model requires extensive questionnaire surveys. Hence, an aggregate model is proposed instead. The historical passenger volumes of high speed rail and domestic civil aviation are employed to calibrate and validate the model. In this study, models with different socio-economic variables, namely oil price, GDP per capita, CPI, and economic growth rate, are compared. The results show that the model with oil price performs better than the models with the other socio-economic variables.

Keywords: forecasting, passenger volume, dynamic competitive model, social-economic variables, oil price.

2181 Applications of Artificial Neural Network to Building Statistical Models for Qualifying and Indexing Radiation Treatment Plans

Authors: Pei-Ju Chao, Tsair-Fwu Lee, Wei-Luen Huang, Long-Chang Chen, Te-Jen Su, Wen-Ping Chen

Abstract:

The main goal of this paper is to quantify the quality of different techniques for radiation treatment plans. A back-propagation artificial neural network (ANN), combined with biomedical theory, was used to model thirteen dosimetric parameters and to calculate two dosimetric indices. The correlations between the dosimetric indices and quality of life were extracted as features and used in the ANN model to make decisions in the clinic. The simulation results show that a trained multilayer back-propagation neural network model can help a doctor accept or reject a plan efficiently. In addition, the models are flexible: whenever a new treatment technique enters the market, the feature variables simply need to be imported and the model re-trained for it to be ready for use.

Keywords: neural network, dosimetric index, radiation treatment, tumor

2180 Determination of Sequential Best Replies in N-player Games by Genetic Algorithms

Authors: Mattheos K. Protopapas, Elias B. Kosmatopoulos

Abstract:

An iterative algorithm is proposed and tested in Cournot game models. It is based on the convergence of sequential best responses and the utilization of a genetic algorithm for determining each player's best response to a given strategy profile of its opponents. An extra outer loop is used to address the problem of finite accuracy, which is inherent in genetic algorithms, since the set of feasible values in such an algorithm is finite. The algorithm is tested on five Cournot models, three of which have a convergent best-replies sequence, one with divergent sequential best replies, and one with “local NE traps” [14], where classical local search algorithms fail to identify the Nash equilibrium. After a series of simulations, we conclude that the proposed algorithm converges to the Nash equilibrium, with any level of accuracy needed, in all but the case where the sequential best replies process diverges.
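A minimal sketch of the sequential best-reply loop described above, for a linear Cournot duopoly. The tiny mutation-only genetic search stands in for the full GA of the paper (no outer accuracy loop), and all parameter values are assumptions for illustration.

```python
import random

a, b, cost = 100.0, 1.0, 10.0   # inverse demand P(Q) = a - b*Q, constant marginal cost

def profit(q_i, q_j):
    return (a - b * (q_i + q_j) - cost) * q_i

def ga_best_response(q_other, pop_size=30, gens=60):
    """Approximate best response to q_other with a simple mutation-only GA."""
    pop = [random.uniform(0.0, a / b) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda q: profit(q, q_other), reverse=True)
        elite = pop[: pop_size // 3]
        # Refill the population by mutating randomly chosen elite members
        pop = elite + [max(0.0, q + random.gauss(0.0, 1.0))
                       for q in random.choices(elite, k=pop_size - len(elite))]
    return max(pop, key=lambda q: profit(q, q_other))

# Sequential best replies until the strategy profile stops changing appreciably
q1, q2 = 10.0, 10.0
for _ in range(50):
    new_q1 = ga_best_response(q2)
    new_q2 = ga_best_response(new_q1)
    if abs(new_q1 - q1) < 1e-2 and abs(new_q2 - q2) < 1e-2:
        break
    q1, q2 = new_q1, new_q2

# Analytic Cournot-Nash quantity for these parameters is (a - cost)/(3b) = 30
print(q1, q2)
```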

Keywords: Best response, Cournot oligopoly, genetic algorithms, Nash equilibrium.

2179 Predictor Factors for Treatment Failure among Patients on Second Line Antiretroviral Therapy

Authors: Mohd. A. M. Rahim, Yahaya Hassan, Mathumalar L. Fahrni

Abstract:

A second line antiretroviral therapy (ART) regimen is used when patients fail their first line regimen. Many factors, such as non-adherence, drug resistance, and virological and immunological failure, lead to failure of the second line highly active antiretroviral therapy (HAART) regimen. This study was aimed at determining predictor factors for treatment failure with second line HAART and analyzing median survival times. An observational, retrospective study was conducted in Sungai Buloh Hospital (HSB) to assess the current status of HIV patients treated with a second line HAART regimen. Convenience sampling was used, and 104 patients were included based on the study’s inclusion and exclusion criteria. Data were collected for six months, i.e., from July until December 2013, and then analysed using SPSS version 18. Kaplan-Meier and Cox regression analyses were used to measure median survival times and predictor factors for treatment failure. The study population consisted mainly of male subjects, aged 30-45 years, who were heterosexual and had had HIV infection for less than 6 years. The most common second line HAART regimen given was a lopinavir/ritonavir (LPV/r)-based combination. Kaplan-Meier analysis showed that patients on LPV/r demonstrated longer median survival times than patients on an indinavir/ritonavir (IDV/r)-based combination (p<0.001). The commonest reason for treatment failure with second line HAART was non-adherence. Based on Cox regression analysis, the other predictor factors for treatment failure with the second line HAART regimen were age and mode of HIV transmission.
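A hedged sketch of the survival-analysis workflow described above (Kaplan-Meier curves plus a Cox proportional-hazards model). The paper used SPSS; the lifelines package, the column names, and the tiny data frame below are assumptions introduced purely for illustration.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Illustrative data frame: one row per patient on second-line HAART (placeholder values)
# time_months: follow-up time; failed: 1 if treatment failure was observed
df = pd.DataFrame({
    "time_months": [6, 14, 24, 30, 8, 36, 12, 40, 18, 28],
    "failed":      [1,  0,  1,  0, 1,  0,  0,  1,  1,  0],
    "age":         [34, 41, 29, 45, 38, 31, 50, 27, 44, 36],
    "on_lpv_r":    [1,  1,  0,  1,  0,  1,  0,  1,  0,  1],  # LPV/r vs IDV/r regimen
})

# Kaplan-Meier: median time to treatment failure per regimen
for regimen, group in df.groupby("on_lpv_r"):
    kmf = KaplanMeierFitter()
    kmf.fit(group["time_months"], event_observed=group["failed"])
    print("LPV/r" if regimen else "IDV/r", "median:", kmf.median_survival_time_)

# Cox regression: hazard ratios for candidate predictor factors
cph = CoxPHFitter()
cph.fit(df, duration_col="time_months", event_col="failed")
cph.print_summary()
```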

Keywords: Adherence, antiretroviral therapy, second line, treatment failure.

2178 Analysis of Network Performance Using Aspect of Quantum Cryptography

Authors: Nisarg A. Patel, Hiren B. Patel

Abstract:

Quantum cryptography is described as a point-to-point secure key generation technology that has emerged in recent times to provide absolute security. Researchers have started studying new innovative approaches to exploit the security of Quantum Key Distribution (QKD) for large-scale communication systems, and a number of approaches and models for the utilization of QKD for secure communication have been developed. The uncertainty principle in quantum mechanics created a new paradigm for QKD. One of the approaches to the use of QKD involved network-fashioned security, whose main goal was a point-to-point quantum network that exploited QKD technology for end-to-end network security via high-speed QKD. Other approaches and models that equip networks with QKD have also been introduced in the literature. The different approach that this paper deals with is using QKD in existing protocols that are widely used on the Internet to enhance security, with the main objective of unconditional security. Our work is directed towards the analysis of QKD in mobile ad hoc networks (MANETs).

Keywords: QKD, cryptography, quantum cryptography, network performance.

2177 Effects of Polymers and Alkaline on Recovery Improvement from Fractured Models

Authors: Payam Parvasi, Mohammad Hossein Sedaghat, Reza Janamiri, Amir Hatampour

Abstract:

In this work, several ASP solutions were flooded into fractured models initially saturated with heavy oil at a constant flow rate and with different geometrical characteristics of the fracture. The ASP solutions were constituted from two polymers, i.e., a synthetic polymer (hydrolyzed polyacrylamide) and a biopolymer, together with a surfactant and two types of alkaline. The results showed that using the synthetic hydrolyzed polyacrylamide polymer increases ultimate oil recovery; however, the type of alkaline does not play a significant role in oil recovery. In addition, the position of the injection well with respect to the fracture system has remarkable effects on ASP flooding. For instance, increasing the angle between the fractures and the mean flow direction increases oil recovery and delays breakthrough time. This work can be regarded as a comprehensive survey of ASP flooding that considers most of the effective factors in this chemical EOR method.

Keywords: ASP Flooding, Fractured System, Displacement, Heavy Oil.

2176 Equilibrium and Kinetic Studies of Lead Adsorption on Activated Carbon Derived from Mangrove Propagule Waste by Phosphoric Acid Activation

Authors: Widi Astuti, Rizki Agus Hermawan, Hariono Mukti, Nurul Retno Sugiyono

Abstract:

The removal of lead ions (Pb2+) from aqueous solution by activated carbon prepared by phosphoric acid activation, employing mangrove propagules as the precursor, was investigated in a batch adsorption system. Batch studies were carried out to address various experimental parameters, including pH and contact time. The Langmuir and Freundlich models were able to describe the adsorption equilibrium, while the pseudo-first-order and pseudo-second-order models were used to describe the kinetics of Pb2+ adsorption. The results show that the adsorption data are in accordance with the Langmuir isotherm model and the pseudo-second-order kinetic model.
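A hedged sketch of fitting the isotherm and kinetic models named above by non-linear least squares; the concentration and uptake values are placeholders, not the experimental data of the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

# Placeholder equilibrium data: Ce (mg/L) vs qe (mg/g)
Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
qe = np.array([12.0, 20.0, 29.0, 36.0, 40.0])

def langmuir(Ce, qm, KL):
    return qm * KL * Ce / (1.0 + KL * Ce)

def freundlich(Ce, KF, n):
    return KF * Ce ** (1.0 / n)

(qm, KL), _ = curve_fit(langmuir, Ce, qe, p0=[40.0, 0.1])
(KF, n), _ = curve_fit(freundlich, Ce, qe, p0=[5.0, 2.0])
print(f"Langmuir:   qm={qm:.1f} mg/g, KL={KL:.3f} L/mg")
print(f"Freundlich: KF={KF:.2f}, n={n:.2f}")

# Placeholder kinetic data: t (min) vs qt (mg/g), pseudo-second-order model
t = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 120.0])
qt = np.array([10.0, 16.0, 24.0, 30.0, 34.0, 35.0])

def pseudo_second_order(t, qe_k, k2):
    return (k2 * qe_k ** 2 * t) / (1.0 + k2 * qe_k * t)

(qe_k, k2), _ = curve_fit(pseudo_second_order, t, qt, p0=[35.0, 0.01])
print(f"Pseudo-second-order: qe={qe_k:.1f} mg/g, k2={k2:.4f} g/(mg*min)")
```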

Keywords: Activated carbon, adsorption, equilibrium, kinetic, Pb2+, mangrove propagule.

2175 Comparative Study of Ecological City Criteria in Traditional Iranian Cities

Authors: Zahra Yazdani Paraii, Zohreh Yazdani Paraei

Abstract:

Many urban designers and planners have been involved in the design of environmentally friendly or nature-adaptable urban development models due to the increase in urban populations over the recent century, the limitation of natural resources, climate change, and the lack of sufficient water and food. The ecological city is one of the latest models proposed to accomplish this goal. In this work, the established indicators of the ecological city regarding energy, water, land use, and transportation are used. The model is used to compare the functioning of the traditional settlements of Iran. The results of the investigation show that the specifications and functions of the traditional settlements of Iran fit well into the ecological city model. It is found that the inhabitants of the old cities and villages in Iran had founded ecological cities based on their knowledge of the environment and its natural opportunities and limitations.

Keywords: Ecological city, traditional city, urban design, environment.

2174 Neuron Dynamics of Single-Compartment Traub Model for Hardware Implementations

Authors: J. C. Moctezuma, V. Breña-Medina, Jose Luis Nunez-Yanez, Joseph P. McGeehan

Abstract:

In this work, we perform a bifurcation analysis of a single-compartment representation of the Traub model, one of the most important conductance-based models. The analysis focuses on two principal parameters: current and leakage conductance. Stable and unstable solutions are explored, and the Hopf bifurcation and its frequency interpretation as the current varies are also examined. This study allows control of the neuron dynamics and of the neuron response when these parameters change. Analyses like this are particularly important for several applications, such as tuning parameters in a learning process, neuron excitability tests, and measuring the bursting properties of the neuron. Finally, hardware implementation results were obtained to corroborate these findings.

Keywords: Traub model, Pinsky-Rinzel model, Hopf bifurcation, single-compartment models, Bifurcation analysis, neuron modeling.

2173 A Three Elements Vector Valued Structure’s Ultimate Strength-Strong Motion-Intensity Measure

Authors: A. Nicknam, N. Eftekhari, A. Mazarei, M. Ganjvar

Abstract:

This article presents an alternative collapse capacity intensity measure in three-element form, which is influenced by the spectral ordinates at periods longer than the first-mode period at near- and far-source sites. A parameter, denoted by β, is defined through which the effects of the spectral ordinates, up to the effective period (2T1), on the intensity measure are taken into account. The methodology makes it possible to meet the hazard-level target extreme event in both probabilistic and deterministic forms. A MATLAB code involving OpenSees is developed to calculate the collapse capacities of eight archetype RC structures of 2 to 20 stories for the regression process. The incremental dynamic analysis (IDA) method is used to calculate the structures’ collapse values, accounting for element stiffness and strength deterioration. The general near-field record set presented by FEMA is used in a series of nonlinear analyses. Eight linear relationships are developed for the eight structures, leading to correlation coefficients of up to 0.93. A near-field collapse capacity prediction equation is developed taking into account the results of the regression processes obtained from the eight structures. The proposed prediction equation is validated against a set of actual near-field records, leading to good agreement. Applying the proposed equation to four archetype RC structures demonstrated different collapse capacities at near-field sites compared to those of FEMA. The differences are believed to be due to accounting for spectral shape effects.

Keywords: Collapse capacity, fragility analysis, spectral shape effects, IDA method.

2172 A Bathtub Curve from Nonparametric Model

Authors: Eduardo C. Guardia, Jose W. M. Lima, Afonso H. M. Santos

Abstract:

This paper presents a nonparametric method to obtain the hazard rate “bathtub curve” for power system components. The model is a mixture of the three known phases of a component's life, the decreasing failure rate (DFR), the constant failure rate (CFR), and the increasing failure rate (IFR), represented by three parametric Weibull models. The parameters are obtained from a simultaneous fitting of the model to the kernel nonparametric hazard rate curve. From the Weibull parameters and failure rate curves, the useful lifetime and the characteristic lifetime are defined. To demonstrate the model, the historical time-to-failure data of distribution transformers are used as an example. The resulting “bathtub curve” shows the failure rate over the equipment lifetime, which can be applied in economic and replacement decision models.
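A minimal numerical sketch of the three-phase idea described above: each phase contributes a Weibull hazard h(t) = (β/η)(t/η)^(β−1), with shape β < 1 for the DFR phase, β ≈ 1 for the CFR phase, and β > 1 for the IFR phase, combined here additively as in a competing-risks form. The parameter values are illustrative assumptions, not values fitted to transformer data.

```python
import numpy as np

def weibull_hazard(t, beta, eta):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)**(beta - 1)."""
    return (beta / eta) * (t / eta) ** (beta - 1.0)

# Illustrative phase parameters (assumed): infant mortality (beta < 1),
# useful life (beta ~ 1), wear-out (beta > 1)
phases = [(0.5, 2.0), (1.0, 25.0), (4.0, 35.0)]

t = np.linspace(0.1, 40.0, 400)                      # equipment age in years
h = sum(weibull_hazard(t, b, e) for b, e in phases)  # additive combination -> bathtub shape

# The minimum of the combined curve marks the flat "useful life" region
print("lowest hazard at about", round(float(t[np.argmin(h)]), 1), "years")
```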

Keywords: Bathtub curve, failure analysis, lifetime estimation, parameter estimation, Weibull distribution.

2171 Comparative Evaluation of Accuracy of Selected Machine Learning Classification Techniques for Diagnosis of Cancer: A Data Mining Approach

Authors: Rajvir Kaur, Jeewani Anupama Ginige

Abstract:

With recent trends in Big Data and advancements in Information and Communication Technologies, the healthcare industry is at the stage of its transition from clinician-oriented to technology-oriented care. Many people around the world die of cancer because the disease was not diagnosed at an early stage. Nowadays, computational methods in the form of Machine Learning (ML) are used to develop automated decision support systems that can diagnose cancer with high confidence in a timely manner. This paper carries out a comparative evaluation of a selected set of ML classifiers on two existing datasets: breast cancer and cervical cancer. The ML classifiers compared in this study are Decision Tree (DT), Support Vector Machine (SVM), k-Nearest Neighbor (k-NN), Logistic Regression, Ensemble (Bagged Tree) and Artificial Neural Networks (ANN). The evaluation is carried out using the standard evaluation metrics of Precision (P), Recall (R), F1-score and Accuracy. The experimental results show that ANN achieved the highest accuracy (99.4%) when tested with the breast cancer dataset. On the other hand, when these ML classifiers were tested with the cervical cancer dataset, the Ensemble (Bagged Tree) technique gave better accuracy (93.1%) than the other classifiers.

Keywords: Artificial neural networks, breast cancer, cancer dataset, classifiers, cervical cancer, F-score, logistic regression, machine learning, precision, recall, support vector machine.

2170 Interoperable CNC System for Turning Operations

Authors: Yusri Yusof, Stephen Newman, Aydin Nassehi, Keith Case

Abstract:

The changing economic climate has made global manufacturing a growing reality over the last decade, forcing companies from east and west and all over the world to collaborate beyond geographic boundaries in the design, manufacture, and assembly of products. The ISO 10303 and ISO 14649 standards (STEP and STEP-NC) have been developed to introduce interoperability into manufacturing enterprises so as to meet the challenge of responding to production on demand. This paper describes and illustrates a STEP-compliant CAD/CAPP/CAM system for the manufacture of rotational parts on CNC turning centers. The information models that support the proposed system, together with the data models defined in the ISO 14649 standard used to create the NC programs, are also described. A structured view of a STEP-compliant CAD/CAPP/CAM system framework supporting the next generation of intelligent CNC controllers for turn/mill component manufacture is provided. Finally, a proposed computational environment for a STEP-NC compliant system for turning operations (SCSTO) is described. SCSTO is the experimental part of the research, supported by the specification of information models and constructed using a structured methodology and object-oriented methods. SCSTO was developed to generate a Part 21 file based on machining features and to support the interactive generation of process plans utilizing feature extraction. A case study component has been developed to prove the concept of using the milling and turning parts of ISO 14649 to provide a turn-mill CAD/CAPP/CAM environment.

Keywords:

2169 Main Bearing Stiffness Investigation

Authors: B. Bellakhdhar, A. Dogui, J.L. Ligier

Abstract:

Simplified coupled engine block-crankshaft models based on beam theory provide an efficient substitute for engine simulation in the design process. These models require an accurate definition of the main bearing stiffness. In this paper, an investigation of this stiffness is presented. The clearance effect is studied using a smooth bearing model; it manifests itself at low shaft displacements. The hydrodynamic assessment model shows that the oil film has no stiffness at low loads and is infinitely rigid at high loads. The deformation stiffness is determined using a suitable finite element model based on real CAD geometries. As a result, a main bearing behaviour law is proposed. This behaviour law takes into account the clearance, the hydrodynamic sustention, and the deformation stiffness, and it properly ensures the transition from the low-rigidity configuration to the high-rigidity configuration.

Keywords: Clearance, deformation stiffness, main bearing behaviour law, oil film stiffness

2168 An Investigation on Electric Field Distribution around 380 kV Transmission Line for Various Pylon Models

Authors: C. F. Kumru, C. Kocatepe, O. Arikan

Abstract:

In this study, electric field distribution analyses for three pylon models are carried out with Finite Element Method (FEM) based software. Analyses are performed in both the stationary and time domains to observe instantaneous values along with the effective ones. The results of the study show that different line geometries considerably affect the magnitude and distribution of the electric field, although the line voltages are the same. Furthermore, it is observed that the maximum instantaneous electric field values obtained in the time domain analysis are considerably higher than the effective values in the stationary mode. Consequently, electric field distribution analyses should be made individually for each line model, and the exposure limit values or distances to residential buildings should be defined according to the results obtained.

Keywords: Electric field, energy transmission line, finite element method, pylon.

2167 Patient-Specific Modeling Algorithm for Medical Data Based on AUC

Authors: Guilherme Ribeiro, Alexandre Oliveira, Antonio Ferreira, Shyam Visweswaran, Gregory Cooper

Abstract:

Patient-specific models are instance-based learning algorithms that take advantage of the particular features of the patient case at hand to predict an outcome. We introduce two patient-specific algorithms based on the decision tree paradigm that use the AUC as the metric for selecting an attribute. We apply the patient-specific algorithms to predict outcomes in several datasets, including medical datasets. Compared to the entropy-based patient-specific decision path (PSDP) and CART methods, the AUC-based patient-specific decision path models performed equivalently on the area under the ROC curve (AUC). Our results provide support for patient-specific methods being a promising approach for making clinical predictions.
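A hedged sketch of the core idea described above, selecting a splitting attribute by AUC rather than by entropy; the toy data and the raw-value scoring are simplifications introduced for illustration and do not reproduce the authors' decision-path construction.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def auc_of_attribute(x, y):
    """Score one attribute by how well its raw values rank the binary outcome.

    roc_auc_score is symmetric around 0.5, so keep whichever direction
    of the attribute discriminates better.
    """
    auc = roc_auc_score(y, x)
    return max(auc, 1.0 - auc)

# Toy data: 3 candidate attributes, binary outcome
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=200)
X = np.column_stack([
    y + rng.normal(0, 1.0, 200),   # informative attribute
    y + rng.normal(0, 3.0, 200),   # weakly informative attribute
    rng.normal(0, 1.0, 200),       # pure noise attribute
])

scores = [auc_of_attribute(X[:, j], y) for j in range(X.shape[1])]
best = int(np.argmax(scores))
print("AUC per attribute:", np.round(scores, 3), "-> split on attribute", best)
```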

Keywords: Instance-based approach, area under the ROC curve, patient-specific decision path, clinical predictions.

2166 Applications of Rough Set Decompositions in Information Retrieval

Authors: Chen Wu, Xiaohua Hu

Abstract:

This paper proposes rough set models with knowledge granules at three different levels in an incomplete information system under a tolerance relation defined by the similarity between objects according to their attribute values. By introducing a dominance relation on the discourse to decompose similarity classes into three subclasses, a little-better subclass, a little-worse subclass, and a vague subclass, it dismantles the lower and upper approximations into three components. Using these components, information can be retrieved effectively to find naturally hierarchical expansions to queries and to construct answers to elaborative queries. The approach is illustrated by applying the rough set models in the design of an information retrieval system that accesses documents expanded at different granularities. The proposed method enhances the application of rough set models by increasing the flexibility of expansions and elaborative queries in information retrieval.

Keywords: Incomplete information system, Rough set model, tolerance relation, dominance relation, approximation, decomposition, elaborative query.

2165 Evaluation of the Beach Erosion Process in Varadero, Matanzas, Cuba: Effects of Different Hurricane Trajectories

Authors: Ana Gabriela Diaz, Luis Fermín Córdova, Jr., Roberto Lamazares

Abstract:

The island of Cuba, the largest of the Greater Antilles, is located in the tropical North Atlantic. It is affected annually by numerous weather events, which have caused severe damage to our coastal areas. Like many other coastlines around the world, the beautiful beaches of the Hicacos Peninsula also suffer from erosion, which leads to a structural regression of the coastline. If measures are not taken, the hotels will be exposed to the advance of the sea, and this will be a serious problem for the economy. With the aim of studying the intensity of this process, specialists from the coastal and marine engineering group of the CIH, within the framework of the research conducted in the MEGACOSTAS 2 project, carried out research to simulate extreme events and assess their impact on coastal areas, mainly regarding the definition of flood volumes and morphodynamic changes on sandy beaches. The main objective of this work is the evaluation of the erosion process of Varadero beach (a coastal sector with an important impact on the country's economy) on the Hicacos Peninsula for different hurricane trajectories. The mathematical model XBeach, which was integrated into the coastal engineering system introduced by the MEGACOSTAS 2 project, was applied to determine the areas and the most critical profiles for the hurricane trajectories under study. The results of this project have shown that the central area is the most dynamic area in the simulation of the three hurricane trajectories under study, showing high erosion volumes and the greatest average regression of the coastline, from 15 to 22 m.

Keywords: Beach, erosion, mathematical model, coastal areas.

2164 Potentials of Raphia hookeri Wine in Livelihood Sustenance among Rural and Urban Populations in Nigeria

Authors: A. A. Aiyeloja, A.T. Oladele, O. Tumulo

Abstract:

Raphia wine is an important forest product with cultural significance besides its use as medicine and food in southern Nigeria. This work aims to evaluate the profitability of Raphia wine production and marketing in Sapele Local Government Area, Nigeria. Four communities (Sapele, Ogiede, Okuoke and Elume) were randomly selected for data collection via questionnaires among producers and marketers, with a total of 50 producers and 34 marketers randomly selected for interview. Data were analyzed using descriptive statistics, profit margin, multiple regression, and rate of return on investment (RORI). Annual average profit was highest in Okuoke (producers: N90,000.00; marketers: N70,000.00) and lowest in Sapele (producers: N50,000.00; marketers: N45,000.00). The calculated RORI for marketers was 40.0% in Elume, 25.0% in Okuoke, 33.3% in Ogiede, and 50.0% in Sapele. Regression results showed that location has a significant effect on profit margins (p = 0.000, p ≤ 0.05). Both males (58.8%) and females (41.2%) invest in Raphia wine marketing, while males (100.0%) dominate production. The results show that Raphia wine has the potential to generate household income, enhance food security, and improve quality of life in rural, semi-urban, and urban communities. Improved marketing channels, storage facilities, and credit facilities via cooperative groups are recommended for producers and marketers by the agencies concerned.

Keywords: Raphia wine, Profit margin, RORI, Livelihood, Nigeria.

2163 Research on Residential Block Fabric: A Case Study of Hangzhou West Area

Authors: Wang Ye, Wei Wei

Abstract:

Residential block construction in big Chinese cities began in the 1950s, and four models have had a far-reaching influence on the modern residential block over the course of its development: the unit compound and the residential district from the 1950s to the 1980s, and the gated community and the open community from the 1990s to the present. Based on an analysis of the fabric of these four models, the article takes residential blocks in the Hangzhou west area as an example and carries out studies at the urban structure level and the block spatial level, mainly covering the urban road network, land use, community function, road organization, public space, and building fabric. Finally, the article puts forward the “Semi-open Sub-community” strategy to improve the current fabric.

Keywords: Hangzhou West Area, residential block model, residential block fabric, “Semi-open Sub-community” strategy.

2162 A Linear Regression Model for Estimating Anxiety Index Using Wide Area Frontal Lobe Brain Blood Volume

Authors: Takashi Kaburagi, Masashi Takenaka, Yosuke Kurihara, Takashi Matsumoto

Abstract:

Major depressive disorder (MDD) is one of the most common mental illnesses today. It is believed to be caused by a combination of several factors, including stress. Stress can be quantitatively evaluated using the State-Trait Anxiety Inventory (STAI), one of the best indices for evaluating anxiety. Although STAI scores are widely used in applications ranging from clinical diagnosis to basic research, the scores are calculated from a self-reported questionnaire. An objective evaluation is required because the subject may intentionally change his/her answers if multiple tests are carried out. In this article, we present a modified index called the “multi-channel Laterality Index at Rest (mc-LIR)”, obtained by recording brain activity from a wider area of the frontal lobe using multi-channel functional near-infrared spectroscopy (fNIRS). The presented index measures multiple positions near Fpz, as defined by the international 10-20 positioning system. Using 24 subjects, the dependence on the number of measuring points used to calculate the mc-LIR and its correlation coefficients with the STAI scores are reported. Furthermore, a simple linear regression was performed to estimate the STAI scores from the mc-LIR, and the cross-validation error is also reported. The experimental results show that using multiple positions near Fpz improves the correlation coefficients and the estimation compared with using only two positions.
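A hedged sketch of the estimation step described above: a simple linear regression from the index to the STAI score, with a leave-one-out cross-validation error. The mc-LIR values and scores below are synthetic placeholders, and the variable names are assumptions, not the authors' data or code.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import mean_absolute_error

# Placeholder measurements for 24 subjects (synthetic, not the authors' data)
rng = np.random.default_rng(1)
mc_lir = rng.normal(0.0, 1.0, 24).reshape(-1, 1)          # index values
stai = 40.0 + 6.0 * mc_lir.ravel() + rng.normal(0.0, 3.0, 24)  # STAI scores

# Leave-one-out cross-validated predictions from a simple linear regression
model = LinearRegression()
pred = cross_val_predict(model, mc_lir, stai, cv=LeaveOneOut())

print("LOOCV MAE:", mean_absolute_error(stai, pred))
print("correlation with true scores:", np.corrcoef(pred, stai)[0, 1])
```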

Keywords: Stress, functional near-infrared spectroscopy, frontal lobe, state-trait anxiety inventory score.

2161 Generative Adversarial Network Based Fingerprint Anti-Spoofing Limitations

Authors: Yehjune Heo

Abstract:

Fingerprint anti-spoofing approaches have been actively developed and applied in real-world applications. One of the main problems of fingerprint anti-spoofing is a lack of robustness to unseen samples, especially in real-world scenarios. A possible solution is to generate artificial but realistic fingerprint samples and use them for training in order to achieve good generalization. This paper contains experimental and comparative results with currently popular GAN-based methods and uses realistic synthesis of fingerprints in training in order to increase the performance. Among the various GAN models, the most popular, StyleGAN, is used for the experiments. The CNN models were first trained with a dataset that did not contain generated fake images, and the accuracy along with the mean average error rate were recorded. Then, the generated fake images (fake images of live fingerprints and fake images of spoof fingerprints) were each combined with the original images (real images of live fingerprints and real images of spoof fingerprints), and various CNN models were trained. The best performance of each CNN model trained with the dataset of generated fake images was recorded, along with the accuracy and the mean average error rate in each case. We observe that current GAN-based approaches need significant improvements in anti-spoofing performance, although the overall quality of the synthesized fingerprints seems to be reasonable. We include an analysis of this performance degradation, especially with a small number of samples. In addition, we suggest several approaches towards improved generalization with a small number of samples, by focusing on what GAN-based approaches should and should not learn.

Keywords: Anti-spoofing, CNN, fingerprint recognition, GAN.

2160 Effective Stacking of Deep Neural Models for Automated Object Recognition in Retail Stores

Authors: Ankit Sinha, Soham Banerjee, Pratik Chattopadhyay

Abstract:

Automated product recognition in retail stores is an important real-world application in the domain of Computer Vision and Pattern Recognition. In this paper, we consider the problem of automatically identifying the classes of the products placed on racks in retail stores from an image of the rack and information about the query/product images. We improve upon the existing approaches in terms of effectiveness and memory requirement by developing a two-stage object detection and recognition pipeline comprising of a Faster-RCNN-based object localizer that detects the object regions in the rack image and a ResNet-18-based image encoder that classifies  the detected regions into the appropriate classes. Each of the models is fine-tuned using appropriate data sets for better prediction and data augmentation is performed on each query image to prepare an extensive gallery set for fine-tuning the ResNet-18-based product recognition model. This encoder is trained using a triplet loss function following the strategy of online-hard-negative-mining for improved prediction. The proposed models are lightweight and can be connected in an end-to-end manner during deployment to automatically identify each product object placed in a rack image. Extensive experiments using Grozi-32k and GP-180 data sets verify the effectiveness of the proposed model.
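As a hedged illustration of the triplet-loss training mentioned above, the snippet below shows one PyTorch training step on a ResNet-18 encoder. The batch construction, embedding size, and margin value are assumptions, the online hard-negative mining of the paper is reduced to a plain triplet loss for brevity, and `weights=None` assumes a recent torchvision.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18

# ResNet-18 backbone as an embedding encoder (final FC replaced by a 128-d projection)
encoder = resnet18(weights=None)
encoder.fc = nn.Linear(encoder.fc.in_features, 128)

triplet_loss = nn.TripletMarginLoss(margin=0.2)
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-4)

# Dummy batch: anchor/positive are views of the same product, negative is a different product
anchor = torch.randn(8, 3, 224, 224)
positive = torch.randn(8, 3, 224, 224)
negative = torch.randn(8, 3, 224, 224)

# L2-normalized embeddings for each member of the triplet
emb_a = nn.functional.normalize(encoder(anchor), dim=1)
emb_p = nn.functional.normalize(encoder(positive), dim=1)
emb_n = nn.functional.normalize(encoder(negative), dim=1)

# One optimization step on the triplet loss
loss = triplet_loss(emb_a, emb_p, emb_n)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print("triplet loss:", loss.item())
```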

Keywords: Retail stores, Faster-RCNN, object localization, ResNet-18, triplet loss, data augmentation, product recognition.
