Search results for: placement models.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2647

1987 Assessment of Path Loss Prediction Models for Wireless Propagation Channels at L-Band Frequency over Different Micro-Cellular Environments of Ekiti State, Southwestern Nigeria

Authors: C. I. Abiodun, S. O. Azi, J. S. Ojo, P. Akinyemi

Abstract:

The design of accurate and reliable mobile communication systems depends largely on the suitability of path loss prediction methods and the adaptability of those methods to the various environments of interest. In this research, the results of the adaptability of radio channel behavior are presented, based on practical measurements carried out in the 1800 MHz frequency band. The measurements were carried out in typical urban, suburban and rural environments in Ekiti State, in the southwestern part of Nigeria. A total of seven base stations of the MTN GSM service located in the studied environments were monitored. Path loss and break-point distances were deduced from the measured received signal strength (RSS), and a practical path loss model is proposed based on the deduced break-point distances. The proposed two-slope model, a regression line and four existing path loss models were compared with the measured path loss values. The standard deviation of each model with respect to the measured path loss was estimated for each base station. The proposed model and the regression line exhibited the lowest standard deviations, followed by the COST 231-Hata model, when compared with the Erceg, Ericsson and SUI models. Generally, the proposed two-slope model shows the closest agreement with the measured values, with mean error values of 2 to 6 dB. These results show that either the proposed two-slope model or the COST 231-Hata model may be used to predict path loss values for mobile microcell coverage in the considered environments. Information from this work will be useful for the link design of microwave-band wireless access systems in the region.
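As a simple illustration of the kind of model the abstract describes, the sketch below implements a generic two-slope (break-point) path loss law and compares it with hypothetical measurements; the break-point distance, reference loss, exponents and data values are illustrative assumptions, not the fitted values from the paper.

```python
import numpy as np

def two_slope_path_loss(d, d_bp, pl_bp, n1=2.0, n2=4.0):
    """Illustrative two-slope (break-point) path loss model.

    d     : distance from the base station in metres (scalar or array)
    d_bp  : break-point distance in metres
    pl_bp : path loss at the break point in dB
    n1,n2 : path loss exponents before/after the break point (assumed values)
    """
    d = np.asarray(d, dtype=float)
    before = pl_bp + 10 * n1 * np.log10(d / d_bp)   # first slope (d <= d_bp)
    after  = pl_bp + 10 * n2 * np.log10(d / d_bp)   # second slope (d > d_bp)
    return np.where(d <= d_bp, before, after)

# Example: compare the model against hypothetical measured path loss values
distances = np.array([100, 200, 400, 800, 1600])          # m
measured  = np.array([95.0, 103.2, 112.5, 124.0, 136.8])  # dB (illustrative)
predicted = two_slope_path_loss(distances, d_bp=400, pl_bp=112.0)
rmse = np.sqrt(np.mean((measured - predicted) ** 2))
print("Predicted (dB):", np.round(predicted, 1), " RMSE:", round(rmse, 2))
```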

Keywords: Break-point distances, path loss models, path loss exponent, received signal strength.

1986 Educational Quiz Board Games for Adaptive E-Learning

Authors: Boyan Bontchev, Dessislava Vassileva

Abstract:

Internet computer games are becoming more and more attractive within the context of technology-enhanced learning. Educational games such as quizzes and quests have gained significant success in appealing to and motivating learners to study in a different way, and they provoke steadily increasing interest in new methods of application. Board games are a specific group of games in which figures are manipulated on a surface in a competitive play mode, with race conditions, according to predefined rules. The article presents a new, formalized model of traditional quizzes, puzzles and quests, shown as multimedia board games, which facilitates the construction process of such games. The authors provide different examples of quizzes and their models in order to demonstrate that the model is quite general and supports not only quizzes, mazes and quests but also any set of teaching activities. The execution process of such models is explained, as well as how they can be useful for the creation and delivery of adaptive e-learning courseware.

Keywords: Quiz, board game, e-learning, adaptive.

1985 Comparative Analysis of the Software Effort Estimation Models

Authors: Jaswinder Kaur, Satwinder Singh, Karanjeet Singh Kahlon

Abstract:

Accurate software cost estimates are critical to both developers and customers. They can be used for generating requests for proposals, contract negotiations, scheduling, monitoring and control. The exact relationship between the attributes of effort estimation is difficult to establish. A neural network is good at discovering relationships and patterns in the data. So, in this paper a comparative analysis among the existing Halstead model, Walston-Felix model, Bailey-Basili model, Doty model and a neural-network-based model is performed. The neural network outperformed the other considered models. Hence, we propose a neural network system as a soft computing approach to model the effort estimation of software systems.
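For reference, the sketch below evaluates commonly cited textbook forms of the four algorithmic models compared in the paper (effort in person-months, size in KLOC); the coefficients are the standard published ones and may differ from the calibrations used in the study.

```python
# Textbook forms of the classic effort estimation models (E in person-months, S in KLOC).
def halstead(kloc):       return 0.7 * kloc ** 1.50      # E = 0.7 * S^1.50
def walston_felix(kloc):  return 5.2 * kloc ** 0.91      # E = 5.2 * S^0.91
def bailey_basili(kloc):  return 5.5 + 0.73 * kloc ** 1.16
def doty(kloc):           return 5.288 * kloc ** 1.047   # usually quoted for S > 9 KLOC

for kloc in (10, 50, 100):
    print(f"{kloc:>4} KLOC:",
          f"Halstead={halstead(kloc):7.1f}",
          f"Walston-Felix={walston_felix(kloc):7.1f}",
          f"Bailey-Basili={bailey_basili(kloc):7.1f}",
          f"Doty={doty(kloc):7.1f}", "(person-months)")
```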

Keywords: Effort Estimation, Neural Network, Halstead Model, Walston-Felix Model, Bailey-Basili Model, Doty Model.

1984 Analyses for Primary Coolant Pump Coastdown Phenomena for Jordan Research and Training Reactor

Authors: Yazan M. Alatrash, Han-ok Kang, Hyun-gi Yoon, Shen Zhang, Juhyeon Yoon

Abstract:

Flow coastdown phenomena are very important for securing nuclear fuel integrity during loss-of-off-site-power accidents. In this study, primary coolant flow coastdown phenomena are investigated for the Jordan Research and Training Reactor (JRTR) using a simulation software package, the Modular Modeling System (MMS). Two MMS models are built. The first is a simple model to investigate the characteristics of the primary coolant pump only. The second is a model for simulation of the Primary Coolant System (PCS) loop, in which all the detailed design data of the JRTR PCS are modeled, including the geometrical arrangement data. The same design data for the PCS pump are used in both models. Coastdown curves obtained from the two models are compared to study the effect of the PCS loop coolant inertia on flow coastdown. The results showed that the loop coolant inertia effect is small in the JRTR PCS loop, i.e., it adds about one second to the coastdown half time required to halve the coolant flow rate. The effects of different flywheel inertias on the flow coastdown are also investigated. It is demonstrated that the coastdown half time increases linearly with the flywheel inertia. The designed coastdown half time is shown to be well above the design requirement for fuel integrity.
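A minimal lumped-parameter sketch of the pump-only case is given below, assuming the hydraulic torque scales with the square of the pump speed; under that assumption the flow halves at t = I*omega0/T0, which also reproduces the linear dependence of the half time on flywheel inertia reported in the abstract. The speed, torque and inertia values are illustrative, not JRTR design data.

```python
import numpy as np

# Minimal pump coastdown sketch. Assumes hydraulic torque ~ k*omega**2, so
# I*domega/dt = -k*omega**2, giving omega(t) = omega0 / (1 + t/t_half) with
# t_half = I*omega0/T0. Coolant flow is taken proportional to pump speed.

def coastdown_speed(t, inertia, omega0, rated_torque):
    k = rated_torque / omega0 ** 2
    return omega0 / (1.0 + k * omega0 * t / inertia)

omega0 = 2 * np.pi * 1500 / 60            # rad/s at an assumed 1500 rpm
rated_torque = 500.0                       # N*m, assumed rated hydraulic torque
for inertia in (20.0, 40.0, 60.0):         # kg*m^2, flywheel inertia sweep
    t_half = inertia * omega0 / rated_torque
    flow_fraction = coastdown_speed(t_half, inertia, omega0, rated_torque) / omega0
    print(f"I = {inertia:5.1f} kg*m^2 -> half time = {t_half:5.2f} s "
          f"(flow fraction at t_half = {flow_fraction:.2f})")
```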

Keywords: Flow Coastdown, Loop Coolant Inertia, Modeling, Research Reactor.

1983 Expected Present Value of Losses in the Computation of Optimum Seismic Design Parameters

Authors: J. García-Pérez

Abstract:

An approach to compute optimum seismic design parameters is presented. It is based on the optimization of the expected present value of the total cost, which includes the initial cost of the structures as well as the cost due to earthquakes. Different types of seismicity models are considered, including one for characteristic earthquakes. Uncertainties are included in some variables to observe their influence on the optimum values. Optimum seismic design coefficients are computed for three different structural types representing high-, medium- and low-rise buildings, located near and far from the seismic sources. Ordinary and important structures are considered in the analysis. The results show an important influence of the seismicity models, as well as of the uncertainties in the variables, on the optimum values.
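The optimization the abstract describes can be sketched as minimizing the sum of an initial cost that grows with the design coefficient and the expected present value of earthquake losses that decreases with it. The functional forms, constants and discounting scheme below are illustrative assumptions, not those of the paper.

```python
import numpy as np

# Minimal sketch: choose an optimum seismic design coefficient c by minimizing
# total cost = initial cost + expected present value of losses.
# All functional forms and constants are illustrative assumptions.

def initial_cost(c):
    return 1.0 + 1.2 * c ** 1.5              # construction cost grows with c

def expected_annual_loss(c):
    return 0.8 * np.exp(-6.0 * c)             # losses drop as the design strengthens

def total_cost(c, discount_rate=0.05):
    # For a constant expected annual loss rate, the present value over an
    # unbounded horizon is (annual loss) / (discount rate).
    return initial_cost(c) + expected_annual_loss(c) / discount_rate

c_grid = np.linspace(0.01, 1.0, 1000)
costs = total_cost(c_grid)
c_opt = c_grid[np.argmin(costs)]
print(f"optimum design coefficient ~ {c_opt:.3f}, total cost ~ {costs.min():.3f}")
```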

Keywords: Importance factors, optimum parameters, seismic losses, seismic risk, total cost.

1982 A Dynamic Hybrid Option Pricing Model by Genetic Algorithm and Black-Scholes Model

Authors: Yi-Chang Chen, Shan-Lin Chang, Chia-Chun Wu

Abstract:

Unlike this study, which focuses on the trading behavior of the option market, previous research has directed its attention to model-driven option pricing. For example, the Black-Scholes (B-S) model is one of the most famous option pricing models. However, the assumptions of the B-S model have been questioned in previous reviews of pricing models. This paper therefore emphasizes the importance of dynamic character for option pricing, which is also the reason for using a genetic algorithm (GA). Drawing on its mechanisms of natural selection and evolution, this study proposes a hybrid model, the Genetic-BS model, which combines the GA and B-S to estimate the price more accurately. In the final experiments, the results show that the estimated price has a lower MAE value than the price calculated by either the B-S model or its enhanced variant, the Gram-Charlier GARCH (G-C GARCH) model. Finally, this work concludes that the Genetic-BS pricing model is practical.
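The abstract does not specify how the GA and B-S components are combined, so the sketch below shows only the standard Black-Scholes European call price that the hybrid builds on; the spot, strike, rate and volatility values are illustrative. A GA layer, as described in the abstract, could then evolve inputs such as an effective volatility to minimize the MAE against observed market prices.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Standard Black-Scholes price of a European call option."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Illustrative parameters (not from the paper): spot 100, strike 105,
# 6 months to maturity, 2% risk-free rate, 25% volatility.
print(round(bs_call(S=100, K=105, T=0.5, r=0.02, sigma=0.25), 4))
```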

Keywords: genetic algorithm, Genetic-BS, option pricing model.

1981 Development of Rock Engineering System-Based Models for Tunneling Progress Analysis and Evaluation: Case Study of Tailrace Tunnel of Azad Power Plant Project

Authors: S. Golmohammadi, M. Noorian Bidgoli

Abstract:

Tunneling progress is a key parameter in the blasting method of tunneling. Taking measures to control the tunneling advance can limit the progress distance without a supporting system, subsequently reducing or eliminating the risk of damage. This paper focuses on modeling tunneling progress using three main groups of parameters (tunneling geometry, blasting pattern, and rock mass specifications) based on the Rock Engineering Systems (RES) methodology. In the proposed models, four main parameters affecting tunneling progress are considered as inputs (RMR, Q-system, specific charge of blasting, and area), with progress as the output. Data from 86 blasts conducted at the tailrace tunnel of the Azad Dam, western Iran, were used to evaluate the progress value for each blast. The results indicated that, for the 86 blasts, the progress estimated by the model aligns closely with the measured progress. This paper presents a method for building the interaction matrix (statistical basis) of the RES model. Additionally, a comparison was made between the results of the new RES-based model and a Multi-Linear Regression (MLR) analysis model. In the RES-based model, the weights of the effective parameters are RMR (35.62%), Q (28.6%), q (specific charge of blasting) (20.35%), and A (15.42%), whereas for the MLR analysis the main parameters are RMR, Q, q, and A. These findings confirm the superior performance of the RES-based model over the other proposed models.

Keywords: Rock Engineering Systems, tunneling progress, Multi Linear Regression, Specific charge of blasting.

1980 Using Historical Data for Stock Prediction of a Tech Company

Authors: Sofia Stoica

Abstract:

In this paper, we use historical data to predict the stock price of a tech company. To this end, we use a dataset consisting of the stock prices over the past five years of 10 major tech companies: Adobe, Amazon, Apple, Facebook, Google, Microsoft, Netflix, Oracle, Salesforce, and Tesla. We implemented and tested three models – a linear regressor model, a k-nearest neighbor model (KNN), and a sequential neural network – and two algorithms – Multiplicative Weight Update and AdaBoost. We found that the sequential neural network performed the best, with a testing error of 0.18%. Interestingly, the linear model performed the second best with a testing error of 0.73%. These results show that using historical data is enough to obtain high accuracies, and a simple algorithm like linear regression has a performance similar to more sophisticated models while taking less time and resources to implement.
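As an illustration of the simplest of the three models, the sketch below fits an ordinary least-squares regressor to lagged closing prices and reports a percentage error on a chronological hold-out; the price series, lag length and error metric (MAPE) are placeholders, since the paper does not detail its exact setup.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_percentage_error

# Minimal sketch: predict the next closing price from the previous `lag` closes.
# `prices` is a synthetic placeholder series, not the paper's 10-stock dataset.
rng = np.random.default_rng(0)
prices = 150 + np.cumsum(rng.normal(0, 1, 1260))    # ~5 years of synthetic closes

lag = 5
X = np.array([prices[i - lag:i] for i in range(lag, len(prices))])
y = prices[lag:]

split = int(0.8 * len(X))                            # chronological train/test split
model = LinearRegression().fit(X[:split], y[:split])
pred = model.predict(X[split:])
print("test error: {:.2%}".format(mean_absolute_percentage_error(y[split:], pred)))
```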

Keywords: Finance, machine learning, opening price, stock market.

1979 Bio-Heat Transfer in Various Transcutaneous Stimulation Models

Authors: Trevor E. Davis, Isaac Cassar, Yi-Kai Lo, Wentai Liu

Abstract:

This study models the use of transcutaneous electrical nerve stimulation on skin with a disk electrode in order to simulate tissue damage. The current density distribution above a disk electrode is known to be a dynamic and non-uniform quantity that is intensified at the edges of the disk. The non-uniformity is subject to change through the use of various electrode geometries or stimulation methods. One of these methods, known as edge-retarded stimulation, has been shown to reduce this edge enhancement. Though progress has been made in modeling the behavior of a disk electrode, little has been done to test the validity of these models in simulating the actual heat transfer from the electrode. This simulation uses finite element software to couple the injection of current from a disk electrode to heat transfer described by the Pennes bioheat transfer equation. An example application of this model is the study of an experimental form of stimulation, known as edge-retarded stimulation. The edge-retarded stimulation method reduces the current density at the edges of the electrode. It is hypothesized that reducing the current density edge enhancement effect will, in turn, reduce the temperature change and tissue damage at the edges of these electrodes. This study tests this hypothesis as a demonstration of the capabilities of this model. In this simulation, edge-retarded stimulation proved to be safer. It is shown that the temperature change and the fraction of tissue necrosis are much greater with square-wave stimulation. These results have implications for changes of procedure in transcutaneous electrical nerve stimulation and transcutaneous spinal cord stimulation as well.
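The coupling the abstract describes rests on the Pennes bioheat equation, with the injected current entering as a Joule heating source. A standard form of this equation, written here with generic symbols rather than the paper's notation, is:

```latex
\rho c \frac{\partial T}{\partial t}
  = \nabla \cdot \left( k \nabla T \right)
  + \rho_b c_b \omega_b \left( T_a - T \right)
  + Q_m + \sigma \lVert \mathbf{E} \rVert^{2}
```

where rho, c and k are the tissue density, specific heat and thermal conductivity; rho_b, c_b and omega_b are the blood density, specific heat and perfusion rate; T_a is the arterial temperature; Q_m is the metabolic heat generation; and the last term is the Joule heating produced by the injected current.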

Keywords: Bioheat transfer, Electrode, Neuroprosthetics, TENS, Transcutaneous stimulation.

1978 Quality-Driven Business Process Refactoring

Authors: María Fernández-Ropero, Ricardo Pérez-Castillo, Ismael Caballero, Mario Piattini

Abstract:

Appropriate description of business processes through standard notations has become one of the most important assets for organizations. Organizations must therefore deal with quality faults in business process models, such as a lack of understandability and modifiability. These quality faults may be exacerbated if business process models are mined by reverse engineering, e.g., from existing information systems that support those business processes. Hence, business process refactoring is often used, which changes the internal structure of business processes whilst preserving their external behavior. This paper aims to choose the most appropriate set of refactoring operators through a quality assessment concerning understandability and modifiability. These quality features are assessed through well-proven measures proposed in the literature. Additionally, a set of measure thresholds is heuristically established for applying the most promising refactoring operators, i.e., those that achieve the highest quality improvement according to the selected measures in each case.

Keywords: business process model, modifiability, refactoring, understandability

1977 A Forecast Model for Projecting the Amount of Hazardous Waste

Authors: J. Vilgerts, L. Timma, D. Blumberga

Abstract:

The objective of the paper is to develop a forecast model for hazardous waste (HW) flows. The methodology of the research included six modules: historical data, assumptions, choice of indicators, data processing, data analysis with STATGRAPHICS, and forecast models. The proposed methodology was validated in a case study for Latvia. Hypotheses on the changes in HW for the time period 2010-2020 have been developed and mathematically described with confidence levels of 95.0% and 50.0%. A sensitivity analysis for the analyzed scenarios was done. The results show that the growth of GDP affects the total amount of HW in the country. The total amount of HW is projected to lie within a corridor ranging from -27.7% in the optimistic scenario up to +87.8% in the pessimistic scenario, with a confidence level of 50.0%, for the period 2010-2020. The optimistic scenario was shown to be the least flexible with respect to changes in GDP growth.

Keywords: Forecast models, hazardous waste management, sustainable development, waste management indicators.

1976 Communicative and Artistic Machines: A Survey of Models and Experiments on Artificial Agents

Authors: Artur Matuck, Guilherme F. Nobre

Abstract:

Machines can be tools, media, or social agents. Advances in technology have been delivering machines capable of autonomous expression, both through communication and through art. This paper deals with models (theoretical approach) and experiments (applied approach) related to artificial agents. On the one hand, it traces how social science scholars have worked with topics such as text automatization, man-machine writing cooperation, and communication. On the other hand, it covers how computer science scholars have built communicative and artistic machines, including the programming of creativity. The aim is to present a brief survey of artificially intelligent communicators and artificially creative writers, and to provide a basis for understanding meta-authorship as well as new and further forms of man-machine co-authorship.

Keywords: Artificial communication, artificial creativity, artificial writers, meta-authorship, robotic art.

1975 External Effects on Dynamic Competitive Model of Domestic Airline and High Speed Rail

Authors: Shih-Ching Lo, Yu-Ping Liao

Abstract:

Socio-economic variables strongly influence transportation demand. Analyses based on discrete choice models consider socio-economic variables to study travelers' mode choice and demand. However, calibrating a discrete choice model requires a large amount of questionnaire survey data. Hence, an aggregate model is proposed. Historical data on passenger volumes for high speed rail and domestic civil aviation are employed to calibrate and validate the model. In this study, models with different socio-economic variables, namely oil price, GDP per capita, CPI and economic growth rate, are compared. The results show that the model with the oil price performs better than the models with the other socio-economic variables.

Keywords: forecasting, passenger volume, dynamic competitive model, social-economic variables, oil price.

1974 Applications of Artificial Neural Network to Building Statistical Models for Qualifying and Indexing Radiation Treatment Plans

Authors: Pei-Ju Chao, Tsair-Fwu Lee, Wei-Luen Huang, Long-Chang Chen, Te-Jen Su, Wen-Ping Chen

Abstract:

The main goal of this paper is to quantify the quality of different techniques for radiation treatment plans. A back-propagation artificial neural network (ANN) combined with biomedical theory was used to model thirteen dosimetric parameters and to calculate two dosimetric indices. The correlations between the dosimetric indices and quality of life were extracted as the features and used in the ANN model to make decisions in the clinic. The simulation results show that a trained multilayer back-propagation neural network model can help a doctor accept or reject a plan efficiently. In addition, the models are flexible: whenever a new treatment technique enters the market, the feature variables simply need to be imported and the model re-trained for it to be ready for use.
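A minimal sketch of the idea, assuming synthetic placeholder data in place of the paper's dosimetric parameters, indices and clinical labels, is a small back-propagation network mapping 13 plan features to an accept/reject decision. When a new treatment technique appears, new feature vectors would simply be appended and the network re-fitted, which is the retraining workflow the abstract describes.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Synthetic placeholder data: 300 plans, 13 dosimetric parameters each,
# with a surrogate accept/reject label (not the paper's data or labels).
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 13))
y = (X[:, :4].mean(axis=1) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=1)
net.fit(X_tr, y_tr)                                  # back-propagation training
print("held-out accuracy:", round(net.score(X_te, y_te), 3))
```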

Keywords: neural network, dosimetric index, radiation treatment, tumor

1973 Determination of Sequential Best Replies in N-player Games by Genetic Algorithms

Authors: Mattheos K. Protopapas, Elias B. Kosmatopoulos

Abstract:

An iterative algorithm is proposed and tested on Cournot game models. It is based on the convergence of sequential best responses and the use of a genetic algorithm for determining each player's best response to a given strategy profile of its opponents. An extra outer loop is used to address the problem of finite accuracy, which is inherent in genetic algorithms, since the set of feasible values in such an algorithm is finite. The algorithm is tested on five Cournot models, three of which have a convergent best-replies sequence, one with divergent sequential best replies and one with "local NE traps" [14], where classical local search algorithms fail to identify the Nash equilibrium. After a series of simulations, we conclude that the proposed algorithm converges to the Nash equilibrium, with any level of accuracy needed, in all but the case where the sequential best replies process diverges.
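To illustrate the sequential best-replies structure the algorithm relies on, the sketch below iterates closed-form best responses in a linear-demand Cournot duopoly until they converge to the Nash equilibrium; the paper instead determines each best response with a genetic algorithm, and the demand and cost parameters here are illustrative.

```python
# Sequential best replies in a linear Cournot duopoly: P = a - b*(q1 + q2),
# constant marginal cost c. Best response of player i: q_i = (a - c - b*q_j) / (2b).
a, b, c = 100.0, 1.0, 10.0

def best_response(q_other):
    return max(0.0, (a - c - b * q_other) / (2 * b))

q1, q2 = 0.0, 0.0
for it in range(100):
    q1_new = best_response(q2)
    q2_new = best_response(q1_new)           # players respond in sequence
    if abs(q1_new - q1) < 1e-9 and abs(q2_new - q2) < 1e-9:
        break
    q1, q2 = q1_new, q2_new

print(f"converged after {it} rounds: q1 = {q1:.4f}, q2 = {q2:.4f}")
print("analytical Nash quantity:", (a - c) / (3 * b))
```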

Keywords: Best response, Cournot oligopoly, genetic algorithms, Nash equilibrium.

1972 Analysis of Network Performance Using Aspect of Quantum Cryptography

Authors: Nisarg A. Patel, Hiren B. Patel

Abstract:

Quantum cryptography is described as a point-to-point secure key generation technology that has emerged in recent times to provide absolute security. Researchers have started studying new innovative approaches to exploit the security of Quantum Key Distribution (QKD) for large-scale communication systems. A number of approaches and models for the utilization of QKD for secure communication have been developed. The uncertainty principle in quantum mechanics created a new paradigm for QKD. One of the approaches to the use of QKD involved network-fashioned security. The main goal was a point-to-point quantum network that exploited QKD technology for end-to-end network security via high-speed QKD. Other approaches and models equipped with QKD in a network fashion have also been introduced in the literature. A different approach, which this paper deals with, is using QKD in existing protocols that are widely used on the Internet to enhance security, with the main objective of unconditional security. Our work is towards the analysis of QKD in mobile ad hoc networks (MANETs).

Keywords: QKD, cryptography, quantum cryptography, network performance.

1971 Effects of Polymers and Alkaline on Recovery Improvement from Fractured Models

Authors: Payam Parvasi, Mohammad Hossein Sedaghat, Reza Janamiri, Amir Hatampour

Abstract:

In this work, several ASP solutions were flooded into fractured models initially saturated with heavy oil, at a constant flow rate and with different geometrical characteristics of the fracture. The ASP solutions are composed of two polymers, i.e., a synthetic polymer (hydrolyzed polyacrylamide) and a biopolymer, a surfactant, and two types of alkali. The results showed that using the synthetic hydrolyzed polyacrylamide polymer increases the ultimate oil recovery; however, the type of alkali does not play a significant role in oil recovery. In addition, the position of the injection well with respect to the fracture system has remarkable effects on ASP flooding. For instance, increasing the angle of the fractures with respect to the mean flow direction yields more oil recovery and delays the breakthrough time. This work can be regarded as a comprehensive survey of ASP flooding that considers most of the effective factors in this chemical EOR method.

Keywords: ASP Flooding, Fractured System, Displacement, Heavy Oil.

1970 Equilibrium and Kinetic Studies of Lead Adsorption on Activated Carbon Derived from Mangrove Propagule Waste by Phosphoric Acid Activation

Authors: Widi Astuti, Rizki Agus Hermawan, Hariono Mukti, Nurul Retno Sugiyono

Abstract:

The removal of lead ions (Pb2+) from aqueous solution by activated carbon prepared by phosphoric acid activation, employing mangrove propagule as the precursor, was investigated in a batch adsorption system. Batch studies were carried out to address various experimental parameters, including pH and contact time. The Langmuir and Freundlich models were able to describe the adsorption equilibrium, while the pseudo-first-order and pseudo-second-order models were used to describe the kinetic process of Pb2+ adsorption. The results show that the adsorption data are in accordance with the Langmuir isotherm model and the pseudo-second-order kinetic model.
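For illustration, the two equilibrium isotherms named in the abstract can be fitted to concentration data with a standard nonlinear least-squares routine; the Ce/qe values and initial guesses below are placeholders, not the measurements reported in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

# Placeholder equilibrium data. Ce: equilibrium concentration (mg/L),
# qe: amount adsorbed at equilibrium (mg/g).
Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
qe = np.array([12.0, 20.0, 30.0, 38.0, 43.0])

def langmuir(Ce, qmax, KL):           # qe = qmax*KL*Ce / (1 + KL*Ce)
    return qmax * KL * Ce / (1.0 + KL * Ce)

def freundlich(Ce, KF, n):            # qe = KF * Ce**(1/n)
    return KF * Ce ** (1.0 / n)

p_l, _ = curve_fit(langmuir, Ce, qe, p0=[50, 0.05])
p_f, _ = curve_fit(freundlich, Ce, qe, p0=[5, 2])
print("Langmuir  qmax = %.1f mg/g, KL = %.3f L/mg" % tuple(p_l))
print("Freundlich KF = %.2f, n = %.2f" % tuple(p_f))
```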

Keywords: Activated carbon, adsorption, equilibrium, kinetic, Pb2+, mangrove propagule.

1969 Comparative Study of Ecological City Criteria in Traditional Iranian Cities

Authors: Zahra Yazdani Paraii, Zohreh Yazdani Paraei

Abstract:

Many urban designers and planners have been involved in designing environmentally friendly or nature-adaptable urban development models, due to the increase in urban populations over the recent century, limitations on natural resources, climate change, and the lack of sufficient water and food. The ecological city is one of the latest models proposed to accomplish this goal. In this work, the established indicators of the ecological city regarding energy, water, land use and transportation issues are used. The model is used to examine the function of traditional settlements of Iran. The results of the investigation show that the specifications and functions of the traditional settlements of Iran fit well into the ecological city model. It is found that the inhabitants of the old cities and villages in Iran had founded ecological cities based on their knowledge of the environment and its natural opportunities and limitations.

Keywords: Ecological city, traditional city, urban design, environment.

1968 Neuron Dynamics of Single-Compartment Traub Model for Hardware Implementations

Authors: J. C. Moctezuma, V. Breña-Medina, Jose Luis Nunez-Yanez, Joseph P. McGeehan

Abstract:

In this work, we carry out a bifurcation analysis for a single-compartment representation of the Traub model, one of the most important conductance-based models. The analysis focuses on two principal parameters: the injected current and the leakage conductance. Stable and unstable solutions are explored; the Hopf bifurcation and the interpretation of the firing frequency as the current varies are also examined. This study allows control of the neuron dynamics and of the neuron response when these parameters change. Analyses like this are particularly important for several applications, such as tuning parameters in the learning process, neuron excitability tests, and measuring the bursting properties of the neuron. Finally, hardware implementation results were obtained to corroborate these findings.
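As a rough illustration of the kind of analysis involved, the sketch below integrates a classic Hodgkin-Huxley single-compartment model (used here as a stand-in, since the abstract does not reproduce the Traub/Pinsky-Rinzel kinetics) and sweeps the injected current to show where repetitive firing sets in, the regime associated with the Hopf bifurcation.

```python
import numpy as np

# Classic Hodgkin-Huxley single-compartment model (stand-in for the Traub
# kinetics). Units: mV, ms, uA/cm^2, mS/cm^2.
C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3
ENa, EK, EL = 50.0, -77.0, -54.387

def rates(V):
    am = 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
    bm = 4.0 * np.exp(-(V + 65) / 18)
    ah = 0.07 * np.exp(-(V + 65) / 20)
    bh = 1.0 / (1 + np.exp(-(V + 35) / 10))
    an = 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
    bn = 0.125 * np.exp(-(V + 65) / 80)
    return am, bm, ah, bh, an, bn

def count_spikes(I_inj, t_end=500.0, dt=0.01):
    V, m, h, n, spikes, above = -65.0, 0.05, 0.6, 0.32, 0, False
    for _ in range(int(t_end / dt)):                 # forward Euler integration
        am, bm, ah, bh, an, bn = rates(V)
        m += dt * (am * (1 - m) - bm * m)
        h += dt * (ah * (1 - h) - bh * h)
        n += dt * (an * (1 - n) - bn * n)
        I_ion = gNa * m**3 * h * (V - ENa) + gK * n**4 * (V - EK) + gL * (V - EL)
        V += dt * (I_inj - I_ion) / C
        if V > 0 and not above:                       # crude spike detection
            spikes, above = spikes + 1, True
        elif V < -20:
            above = False
    return spikes

for I in (2.0, 6.0, 10.0, 20.0):                      # injected current sweep
    print(f"I = {I:4.1f} uA/cm^2 -> {count_spikes(I)} spikes in 500 ms")
```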

Keywords: Traub model, Pinsky-Rinzel model, Hopf bifurcation, single-compartment models, Bifurcation analysis, neuron modeling.

1967 A Bathtub Curve from Nonparametric Model

Authors: Eduardo C. Guardia, Jose W. M. Lima, Afonso H. M. Santos

Abstract:

This paper presents a nonparametric method to obtain the hazard rate “bathtub curve” for power system components. The model is a mixture of the three known phases of a component's life, the decreasing failure rate (DFR), the constant failure rate (CFR) and the increasing failure rate (IFR), represented by three parametric Weibull models. The parameters are obtained from a simultaneous fitting process of the model to the kernel nonparametric hazard rate curve. From the Weibull parameters and failure rate curves, the useful lifetime and the characteristic lifetime were defined. To demonstrate the model, the historical time-to-failure data of distribution transformers were used as an example. The resulting “bathtub curve” shows the failure rate over the equipment lifetime, which can be applied in economic and replacement decision models.
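One simple way to write such a mixture, assumed here as an additive (competing-risks) combination of the three Weibull phases rather than the paper's specific fitting scheme, is to sum three Weibull hazard rates with shape parameters below, at, and above one; the parameter values are illustrative, not the fitted transformer values.

```python
def weibull_hazard(t, beta, eta):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)**(beta - 1)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

def bathtub_hazard(t):
    # Additive combination of the three life phases (illustrative parameters).
    dfr = weibull_hazard(t, beta=0.5, eta=2.0)    # infant mortality, beta < 1
    cfr = weibull_hazard(t, beta=1.0, eta=40.0)   # useful life, beta = 1
    ifr = weibull_hazard(t, beta=4.0, eta=35.0)   # wear-out, beta > 1
    return dfr + cfr + ifr

for years in (0.5, 1, 5, 10, 20, 30, 40):
    print(f"t = {years:4.1f} years -> h(t) = {bathtub_hazard(years):.4f} failures/year")
```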

Keywords: Bathtub curve, failure analysis, lifetime estimation, parameter estimation, Weibull distribution.

1966 Interoperable CNC System for Turning Operations

Authors: Yusri Yusof, Stephen Newman, Aydin Nassehi, Keith Case

Abstract:

The changing economic climate has made global manufacturing a growing reality over the last decade, forcing companies from east and west and all over the world to collaborate beyond geographic boundaries in the design, manufacture and assembly of products. The ISO 10303 and ISO 14649 standards (STEP and STEP-NC) have been developed to introduce interoperability into manufacturing enterprises so as to meet the challenge of responding to production on demand. This paper describes and illustrates a STEP-compliant CAD/CAPP/CAM system for the manufacture of rotational parts on CNC turning centers. The information models to support the proposed system, together with the data models defined in the ISO 14649 standard used to create the NC programs, are also described. A structured view of a STEP-compliant CAD/CAPP/CAM system framework supporting the next generation of intelligent CNC controllers for turn/mill component manufacture is provided. Finally, a proposed computational environment for a STEP-NC compliant system for turning operations (SCSTO) is described. SCSTO is the experimental part of the research, supported by the specification of information models and constructed using a structured methodology and object-oriented methods. SCSTO was developed to generate a Part 21 file based on machining features to support the interactive generation of process plans utilizing feature extraction. A case study component has been developed to prove the concept of using the milling and turning parts of ISO 14649 to provide a turn-mill CAD/CAPP/CAM environment.

Keywords:

1965 Rapid Study on Feature Extraction and Classification Models in Healthcare Applications

Authors: S. Sowmyayani

Abstract:

The advancement of computer-aided design helps the medical and security forces. Some applications include biometric recognition, elderly fall detection, face recognition, cancer recognition, tumor recognition, etc. This paper deals with different machine learning algorithms that are generically used for any health care system. The problems of most interest are classification and regression. With the rise of big data, machine learning has become particularly important for solving problems. Machine learning uses two types of techniques: supervised learning and unsupervised learning. The former trains a model on known input and output data and predicts future outputs. Classification and regression are supervised learning techniques. Unsupervised learning finds hidden patterns in input data. Clustering is one such unsupervised learning technique. The above-mentioned models are discussed briefly in this paper.

Keywords: Supervised learning, unsupervised learning, regression, neural network.

1964 Main Bearing Stiffness Investigation

Authors: B. Bellakhdhar, A. Dogui, J.L. Ligier

Abstract:

Simplified coupled engine block-crankshaft models based on beam theory provide an efficient substitute for full engine simulation in the design process. These models require an accurate definition of the main bearing stiffness. In this paper, an investigation of this stiffness is presented. The clearance effect is studied using a smooth bearing model; it is manifested at low shaft displacements. The hydrodynamic assessment model shows that the oil film has no stiffness at low loads and is infinitely rigid at high loads. The deformation stiffness is determined using a suitable finite element model based on real CAD geometries. As a result, a main bearing behaviour law is proposed. This behaviour law takes into account the clearance, the hydrodynamic support and the deformation stiffness. It properly ensures the transition from the low-rigidity configuration to the high-rigidity configuration.

Keywords: Clearance, deformation stiffness, main bearing behaviour law, oil film stiffness

1963 An Investigation on Electric Field Distribution around 380 kV Transmission Line for Various Pylon Models

Authors: C. F. Kumru, C. Kocatepe, O. Arikan

Abstract:

In this study, electric field distribution analyses for three pylon models are carried out with Finite Element Method (FEM) based software. Analyses are performed in both the stationary and time domains to observe instantaneous values along with the effective (rms) ones. The results of the study show that different line geometries considerably affect the magnitude and distribution of the electric field, even though the line voltages are the same. Furthermore, it is observed that the maximum values of the instantaneous electric field obtained in the time domain analysis are considerably higher than the effective values in the stationary mode. Consequently, electric field distribution analyses should be made individually for each different line model, and the exposure limit values or distances to residential buildings should be defined according to the results obtained.

Keywords: Electric field, energy transmission line, finite element method, pylon.

1962 Patient-Specific Modeling Algorithm for Medical Data Based on AUC

Authors: Guilherme Ribeiro, Alexandre Oliveira, Antonio Ferreira, Shyam Visweswaran, Gregory Cooper

Abstract:

Patient-specific models are instance-based learning algorithms that take advantage of the particular features of the patient case at hand to predict an outcome. We introduce two patient-specific algorithms based on the decision tree paradigm that use the AUC as a metric to select an attribute. We apply the patient-specific algorithms to predict outcomes in several datasets, including medical datasets. Compared to the entropy-based patient-specific decision path (PSDP) and CART methods, the AUC-based patient-specific decision path models performed equivalently on the area under the ROC curve (AUC). Our results provide support for patient-specific methods being a promising approach for making clinical predictions.
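A minimal sketch of the attribute-selection step, assuming hypothetical attribute names and synthetic data (the full decision-path construction of the paper is not reproduced), scores each attribute by the AUC obtained when its raw values are used to rank a binary outcome and keeps the best one:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Synthetic placeholder data: a binary outcome and three candidate attributes.
rng = np.random.default_rng(7)
n = 200
outcome = rng.integers(0, 2, size=n)
attributes = {
    "age":        rng.normal(size=n) + 0.9 * outcome,   # informative attribute
    "heart_rate": rng.normal(size=n) + 0.2 * outcome,   # weakly informative
    "noise":      rng.normal(size=n),                    # uninformative
}

# Orientation-free AUC: an attribute that ranks in reverse is equally useful.
scores = {name: max(roc_auc_score(outcome, vals), 1 - roc_auc_score(outcome, vals))
          for name, vals in attributes.items()}
best = max(scores, key=scores.get)
print(scores, "-> selected attribute:", best)
```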

Keywords: Instance-based approach, area under the ROC curve, patient-specific decision path, clinical predictions.

1961 Applications of Rough Set Decompositions in Information Retrieval

Authors: Chen Wu, Xiaohua Hu

Abstract:

This paper proposes rough set models with three different levels of knowledge granules in an incomplete information system under a tolerance relation defined by the similarity between objects according to their attribute values. By introducing a dominance relation on the universe of discourse to decompose the similarity classes into three subclasses (a little-better subclass, a little-worse subclass and a vague subclass), it dismantles the lower and upper approximations into three components. By using these components, retrieving information to find naturally hierarchical expansions of queries and constructing answers to elaborative queries can be made effective. The paper illustrates the approach by applying the rough set models to the design of an information retrieval system that accesses documents expanded at different granularities. The proposed method enhances the application of rough set models by increasing the flexibility of expansions and elaborative queries in information retrieval.

Keywords: Incomplete information system, Rough set model, tolerance relation, dominance relation, approximation, decomposition, elaborative query.

1960 Aircraft Gas Turbine Engines Technical Condition Identification System

Authors: A. M. Pashayev, C. Ardil, D. D. Askerov, R. A. Sadiqov, P. S. Abdullayev

Abstract:

This paper shows that the application of probability-statistical methods is unfounded at the early stages of aviation gas turbine engine (GTE) technical condition diagnosis, when the flight information is fuzzy, limited and uncertain. Hence, the efficiency of applying the new Soft Computing technology at these diagnosing stages, using Fuzzy Logic and Neural Network methods, is considered. Fuzzy multiple linear and non-linear models (fuzzy regression equations), obtained on the basis of statistical fuzzy data, are trained with high accuracy. Thus, to build a more adequate model of the GTE technical condition, the dynamics of changes in the skewness and kurtosis coefficients are analysed. Studies of the changes in the skewness and kurtosis coefficient values show that the distributions of GTE operating parameters have a fuzzy character. Hence, consideration of fuzzy skewness and kurtosis coefficients is expedient. Investigation of the dynamics of changes in the basic characteristics of GTE operating parameters leads to the conclusion that Fuzzy Statistical Analysis is necessary for the preliminary identification of the engines' technical condition. Studies of the changes in the correlation coefficient values also show their fuzzy character. Therefore, the use of Fuzzy Correlation Analysis results is proposed for model selection. For checking model adequacy, the Fuzzy Multiple Correlation Coefficient of the Fuzzy Multiple Regression is considered. When sufficient information is available, a recurrent algorithm for aviation GTE technical condition identification (using Hard Computing technology) is proposed, based on measurements of the input and output parameters of the multiple linear and non-linear generalised models in the presence of measurement noise (a new recursive Least Squares Method (LSM)). The developed GTE condition monitoring system provides stage-by-stage estimation of the engine's technical condition. As an application of the given technique, the temperature condition of a new operating aviation engine was estimated.
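The recurrent identification step the abstract refers to can be illustrated with a textbook recursive least squares (RLS) update; the sketch below is a generic RLS routine applied to synthetic data, not the specific recurrent algorithm or engine parameters of the paper.

```python
import numpy as np

def rls_identify(X, y, lam=0.99, delta=100.0):
    """Textbook recursive least squares estimate of theta in y ~ X @ theta.

    lam   : forgetting factor (1.0 gives the ordinary growing-window LSM)
    delta : initial covariance scale
    """
    n = X.shape[1]
    theta = np.zeros(n)
    P = delta * np.eye(n)
    for x, yk in zip(X, y):
        Px = P @ x
        k = Px / (lam + x @ Px)               # gain vector
        theta = theta + k * (yk - x @ theta)  # update with the prediction error
        P = (P - np.outer(k, Px)) / lam       # covariance update
    return theta

# Illustrative use: identify a linear parameter model from noisy measurements.
rng = np.random.default_rng(3)
X = rng.normal(size=(500, 3))
true_theta = np.array([1.5, -0.7, 2.0])
y = X @ true_theta + rng.normal(scale=0.1, size=500)
print("estimated parameters:", np.round(rls_identify(X, y), 3))
```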

Keywords: Gas turbine engines, neural networks, fuzzy logic, fuzzy statistics.

1959 Comparative Study of Evolutionary Model and Clustering Methods in Circuit Partitioning Pertaining to VLSI Design

Authors: K. A. Sumitra Devi, N. P. Banashree, Annamma Abraham

Abstract:

Partitioning is a critical area of VLSI CAD. In order to build complex digital logic circuits, it is often essential to sub-divide multi-million transistor designs into manageable pieces. This paper looks at various aspects of partitioning techniques in VLSI CAD, targeted at various applications. We propose an evolutionary time-series model and a statistical glitch prediction system using a neural network, with global feature selection by means of clustering methods, for partitioning a circuit. For the evolutionary time-series model, we made use of genetic, memetic and neuro-memetic techniques. Our work focused on the use of clustering methods: the K-means and EM methodologies. A comparative study is provided for all techniques to solve the problem of circuit partitioning pertaining to VLSI design. The performance of all approaches is compared using benchmark data provided by the MCNC standard cell placement benchmark netlists. Analysis of the experimental results showed that the neuro-memetic model achieves better performance than the other models in recognizing sub-circuits with a minimum amount of interconnections between them.

Keywords: VLSI, circuit partitioning, memetic algorithm, genetic algorithm.

1958 Research on Residential Block Fabric: A Case Study of Hangzhou West Area

Authors: Wang Ye, Wei Wei

Abstract:

Residential block construction in big Chinese cities began in the 1950s, and four models have had a far-reaching influence on the modern residential block during its development: the unit compound and the residential district from the 1950s to the 1980s, and the gated community and the open community from the 1990s to the present. Based on an analysis of the four models' fabric, the article takes residential blocks in the Hangzhou West Area as an example and carries out studies at the urban structure level and the block spatial level, mainly covering the urban road network, land use, community function, road organization, public space and building fabric. Finally, the article puts forward a “Semi-open Sub-community” strategy to improve the current fabric.

Keywords: Hangzhou West Area, residential block model, residential block fabric, “Semi-open Sub-community” strategy.
