Search results for: Statistical Approaches
1269 Using Single Decision Tree to Assess the Impact of Cutting Conditions on Vibration
Authors: S. Ghorbani, N. I. Polushin
Abstract:
Vibration during the machining process is crucial since it affects the cutting tool, machine, and workpiece, leading to tool wear, tool breakage, and unacceptable surface roughness. This paper applies a nonparametric statistical method, the single decision tree (SDT), to identify factors affecting vibration in the machining process. Workpiece material (AISI 1045 steel, AA2024 aluminum alloy, A48-class 30 gray cast iron), cutting tool (conventional, cutting tool with holes in the toolholder, cutting tool filled with epoxy-granite), tool overhang (41-65 mm), spindle speed (630-1000 rpm), feed rate (0.05-0.075 mm/rev) and depth of cut (0.05-0.15 mm) were used as input variables, while vibration was the output parameter. It is concluded that workpiece material is the most important parameter for natural frequency, followed by cutting tool and overhang.
Keywords: Cutting condition, vibration, natural frequency, decision tree, CART algorithm.
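As a rough illustration of the CART approach described above (not the authors' code), the sketch below fits a single regression tree to cutting-condition data and ranks feature importances; the column names and the file vibration_experiments.csv are hypothetical placeholders.

# Minimal CART sketch: fit one regression tree on cutting-condition data and
# rank feature importances. Column names and the CSV file are hypothetical.
import pandas as pd
from sklearn.tree import DecisionTreeRegressor

df = pd.read_csv("vibration_experiments.csv")          # hypothetical data file
X = pd.get_dummies(df[["material", "tool_type", "overhang_mm",
                       "spindle_rpm", "feed_mm_rev", "doc_mm"]])
y = df["natural_frequency_hz"]                          # output parameter

tree = DecisionTreeRegressor(max_depth=4, random_state=0)  # CART-style squared-error splits
tree.fit(X, y)

# Higher importance = stronger effect on vibration (cf. material > tool > overhang)
for name, imp in sorted(zip(X.columns, tree.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name:20s} {imp:.3f}")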
1268 Adaptive Group of Pictures Structure Based on the Positions of Video Cuts
Authors: Lenka Krulikovská, Jaroslav Polec, Michal Martinovič
Abstract:
In this paper, we propose a method that improves the efficiency of video coding. Our method combines an adaptive GOP (group of pictures) structure with shot cut detection. We analyzed different approaches to shot cut detection in order to choose the most appropriate one. The next step is to place N frames at the positions of the detected cuts during video encoding. Finally, the efficiency of the proposed method is confirmed by simulations, and the obtained results are compared with fixed GOP structures of sizes 4, 8, 12, 16, 32, 64, 128 and with a GOP structure spanning the entire video. The proposed method achieved a bit-rate gain of 0.37% to 50.59%, while providing a PSNR (Peak Signal-to-Noise Ratio) gain of 1.33% to 0.26% in comparison with the simulated fixed GOP structures.
Keywords: Adaptive GOP structure, video coding, video content, shot cut detection.
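A minimal sketch of one standard shot cut detection approach of the kind compared in the paper (not necessarily the one the authors selected): a cut is flagged when the normalized grayscale-histogram difference between consecutive frames exceeds a threshold. The threshold value and the frame representation are assumptions.

# Illustrative shot-cut detector: flag a cut when the normalized grayscale
# histogram difference between consecutive frames exceeds a threshold.
import numpy as np

def detect_cuts(frames, threshold=0.5, bins=64):
    """frames: iterable of 2-D uint8 grayscale arrays; returns cut frame indices."""
    cuts, prev_hist = [], None
    for i, frame in enumerate(frames):
        hist, _ = np.histogram(frame, bins=bins, range=(0, 256))
        hist = hist / hist.sum()                     # normalize for comparable frames
        if prev_hist is not None:
            if 0.5 * np.abs(hist - prev_hist).sum() > threshold:
                cuts.append(i)                       # an encoder would start a new GOP here
        prev_hist = hist
    return cuts

# Example with synthetic frames: a sudden brightness change acts like a cut.
dark = np.zeros((120, 160), dtype=np.uint8)
bright = np.full((120, 160), 200, dtype=np.uint8)
print(detect_cuts([dark, dark, bright, bright]))     # -> [2]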
1267 Preliminary Knowledge Extraction from Beethoven's Sonatas: From Musical Referential Patterns to Emotional Normative Ratings
Authors: Christina Volioti, Sotiris Manitsaris, Eleni Katsouli, Vasiliki Tsekouropoulou, Leontios J. Hadjileontiadis
Abstract:
The piano sonatas of Beethoven represent part of the Intangible Cultural Heritage. The aims of this research were to further explore this intangibility by placing emphasis on defining emotional normative ratings for the “Waldstein” (Op. 53) and “Tempest” (Op. 31) Sonatas of Beethoven. To this end, a musicological analysis was conducted on these particular sonatas and referential patterns in these works of Beethoven were defined. Appropriate interactive questionnaires were designed in order to create a statistical normative rating that describes the emotional status when an individual listens to these musical excerpts. Based on these ratings, it is possible for emotional annotations for these same referential patterns to be created and integrated into the music score.
Keywords: Emotional annotations, intangible cultural heritage, musicological analysis, normative ratings.
1266 Experimental Analysis of Diesel Hydrotreating Reactor to Develop a Simplified Tool for Process Real-Time Optimization
Authors: S. Shokri, S. Zahedi, M. Ahmadi Marvast, B. Baloochi, H. Ganji
Abstract:
In this research, a systematic investigation was carried out to determine the optimum conditions of an HDS (hydrodesulfurization) reactor, and a suitable model was developed for a rigorous RTO (real-time optimization) loop of the HDS process. A systematic experimental series was designed based on CCD (central composite design) and carried out in the related pilot plant to tune the developed model. The design variables in the experiments were temperature, LHSV and pressure, while the hydrogen to fresh feed ratio was kept constant. The ranges of these variables were 320-380 ºC, 1-2 1/hr and 50-55 bar, respectively. A power-law kinetic model was also developed for further research. The rate order (power of the reactant concentration), activation energy and frequency factor of this model were 1.4, 92.66 kJ/mol and k0 = 2.7×10^9, respectively.
Keywords: Statistical model, Multiphase Reactors, Gas oil, Hydrodesulfurization, Optimization, Kinetics
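For illustration only, the reported power-law parameters can be plugged into the usual Arrhenius form r = k0 * exp(-Ea / (R*T)) * C^n; the units of the concentration and rate below are assumptions, since the abstract does not state them.

# Hedged sketch of the reported power-law kinetics with n = 1.4,
# Ea = 92.66 kJ/mol and k0 = 2.7e9 (units of C and r assumed for illustration).
import math

R = 8.314          # J/(mol*K)
N_ORDER = 1.4      # reaction order reported in the abstract
EA = 92.66e3       # activation energy, J/mol
K0 = 2.7e9         # frequency factor

def hds_rate(conc_sulfur, temp_c):
    """Power-law rate for a sulfur concentration at a temperature in deg C."""
    temp_k = temp_c + 273.15
    return K0 * math.exp(-EA / (R * temp_k)) * conc_sulfur ** N_ORDER

# The rate increases steeply across the studied 320-380 deg C window:
for t in (320, 350, 380):
    print(t, hds_rate(1.0, t))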
1265 Construction of Intersection of Nondeterministic Finite Automata Using Z Notation
Authors: Nazir Ahmad Zafar, Nabeel Sabir, Amir Ali
Abstract:
Functionalities and control behavior are both primary requirements in the design of a complex system. Automata theory plays an important role in modeling the behavior of a system. Z is an ideal notation for describing the state space of a system and then defining operations over it. Consequently, an integration of automata and Z is an effective tool for increasing the modeling power for a complex system. Further, a nondeterministic finite automaton (NFA) may have different implementations, and therefore it is necessary to verify the transformation from diagrams to code. If we describe the formal specification of an NFA before implementing it, then confidence in the transformation can be increased. In this paper, we give a procedure for integrating NFA and Z. The complement of a special type of NFA is defined. Then the union of two NFAs is formalized after defining their complements. Finally, the formal construction of the intersection of NFAs is described. The specification of this relationship is analyzed and validated using the Z/EVES tool.
Keywords: Modeling, nondeterministic finite automata, Z notation, integration of approaches, validation.
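The paper formalizes the construction in Z; purely as an illustration of the underlying operation, the sketch below gives the standard product construction for the intersection of two NFAs in plain Python. It is not the Z specification, and the example automata are made up.

# Standard product construction for the intersection of two NFAs.
from itertools import product

def nfa_intersection(nfa1, nfa2):
    """Each NFA is (states, alphabet, delta, start_set, accepting) with
    delta: dict[(state, symbol)] -> set of states."""
    q1, sigma, d1, s1, f1 = nfa1
    q2, _, d2, s2, f2 = nfa2
    states = set(product(q1, q2))
    delta = {}
    for (p, q), a in product(states, sigma):
        delta[((p, q), a)] = {(p2, q2_) for p2 in d1.get((p, a), set())
                                        for q2_ in d2.get((q, a), set())}
    start = set(product(s1, s2))
    accepting = {(p, q) for p, q in states if p in f1 and q in f2}
    return states, sigma, delta, start, accepting

# A accepts strings containing 'a'; B accepts strings ending in 'a'.
A = ({0, 1}, {'a', 'b'},
     {(0, 'a'): {0, 1}, (0, 'b'): {0}, (1, 'a'): {1}, (1, 'b'): {1}}, {0}, {1})
B = ({0, 1}, {'a', 'b'},
     {(0, 'a'): {0, 1}, (0, 'b'): {0}}, {0}, {1})
inter = nfa_intersection(A, B)
print(len(inter[0]), "product states;", inter[4], "accepting")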
1264 Folksonomy-Based Recommender Systems with User's Recent Preferences
Authors: Cheng-Lung Huang, Han-Yu Chien, Michael Conyette
Abstract:
Social bookmarking is an environment in which the user's interests gradually change over time, so the tag data associated with the current temporal period is usually more important than tag data temporally far from the current period. This implies that in the social tagging system, the items newly tagged by the user are more relevant than older items. This study proposes a novel recommender system that considers the user's recent tag preferences. The proposed system includes the following stages: grouping similar users into clusters using an E-M clustering algorithm, finding similar resources based on the user's bookmarks, and recommending the top-N items to the target user. The study examines the system's information retrieval performance using a dataset from del.icio.us, a well-known social bookmarking web site. Experimental results show that the proposed system is better and more effective than traditional approaches.
Keywords: Recommender systems, social bookmarking, tag
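A loose sketch of the flow described above, with all data as toy placeholders rather than del.icio.us records: users are grouped with an EM-style Gaussian mixture on their tag profiles, and the top-N items are then scored from cluster peers with a recency weight (the exponential half-life is an assumption, not the paper's weighting scheme).

# EM clustering of users on tag profiles, then recency-weighted top-N scoring.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
user_tag_matrix = rng.poisson(1.0, size=(20, 8)).astype(float)   # 20 users x 8 tags (toy)
gmm = GaussianMixture(n_components=3, random_state=0).fit(user_tag_matrix)
clusters = gmm.predict(user_tag_matrix)                          # E-M user clustering

# bookmarks: (user, item, age_in_days); newer bookmarks get larger weights.
bookmarks = [(u, rng.integers(0, 30), rng.integers(0, 365))
             for u in range(20) for _ in range(5)]

def recommend(target_user, top_n=5, half_life=90.0):
    peers = {u for u in range(20)
             if clusters[u] == clusters[target_user] and u != target_user}
    scores = {}
    for user, item, age in bookmarks:
        if user in peers:
            scores[item] = scores.get(item, 0.0) + np.exp(-age / half_life)
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend(target_user=0))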
1263 Model-Based Small Area Estimation with Application to Unemployment Estimates
Authors: Hichem Omrani, Philippe Gerber, Patrick Bousch
Abstract:
The problem of Small Area Estimation (SAE) is complex because of various information sources and insufficient data. In this paper, an approach for SAE is presented for decision-making at national, regional and local level. We propose an Empirical Best Linear Unbiased Predictor (EBLUP) as an estimator in order to combine several information sources to evaluate various indicators. First, we present the urban audit project and its environmental, social and economic indicators. Secondly, we propose an approach for decision making in order to estimate indicators. An application is used to validate the theoretical proposal. Finally, a decision support system based on an open-source environment is presented.
Keywords: Small area estimation, statistical method, sampling, empirical best linear unbiased predictor (EBLUP), decision-making.
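In its simplest area-level (Fay-Herriot) form, the EBLUP shrinks each direct survey estimate toward a regression-synthetic prediction. The sketch below is a simplified illustration under assumed known sampling variances and a crude moment estimate of the model variance; it is not the estimator actually fitted in the paper.

# Simplified area-level EBLUP sketch (Fay-Herriot form).
import numpy as np

def fay_herriot_eblup(y_direct, X, var_e):
    """y_direct: direct estimates per area, X: area-level covariates,
    var_e: known sampling variances. Returns EBLUP estimates."""
    beta, *_ = np.linalg.lstsq(X, y_direct, rcond=None)      # regression-synthetic part
    resid = y_direct - X @ beta
    p = X.shape[1]
    # Crude moment estimator of the area random-effect variance
    sigma2_v = max(0.0, (np.sum(resid**2) - np.sum(var_e)) / (len(y_direct) - p))
    gamma = sigma2_v / (sigma2_v + var_e)                     # shrinkage weights in [0, 1]
    return gamma * y_direct + (1 - gamma) * (X @ beta)

# Toy example: 5 areas, intercept plus one covariate, unequal sampling variances.
X = np.column_stack([np.ones(5), np.array([1.0, 2.0, 3.0, 4.0, 5.0])])
y = np.array([5.1, 6.8, 9.4, 10.2, 13.5])      # direct unemployment-rate estimates (made up)
var_e = np.array([0.5, 0.2, 1.0, 0.3, 0.8])
print(fay_herriot_eblup(y, X, var_e))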
1262 International Marketing in Business Practice of Small and Medium-Sized Enterprises
Authors: K. Matušínská, Z. Bednarčík, M. Klepek
Abstract:
This paper examines international marketing in the business practice of Czech exporting small and medium-sized enterprises (SMEs) with regard to strategic perspectives. The research focused on Czech exporting SMEs from the Moravian-Silesian region and their behavior on international markets. For the purpose of collecting data, a questionnaire was given to 262 SMEs involved in international business. Statistics utilized in this research included frequency, mean, percentage, and the chi-square test. Data were analyzed with the Statistical Package for the Social Sciences software. The analysis disclosed that there is room for improvement in strategic marketing, especially in marketing research, the perception of cultural and social differences, product adaptation and the usage of marketing communication tools.
Keywords: International Marketing, Marketing Mix, Marketing Research, Small and Medium-sized Enterprises (SMEs), Strategic Marketing.
1261 Global Security Using Human Face Understanding under Vision Ubiquitous Architecture System
Abstract:
Different methods based on biometric algorithms are presented for eigenface detection, including face recognition, identification and verification. The theme of this research is to manage the critical processing stages (accuracy, speed, security and monitoring) of face activities with the flexibility of searching and editing the secure authorized database. In this paper, we implement different techniques, such as eigenface vector reduction using texture and shape vector phenomena for complexity removal, while a density matching score with Face Boundary Fixation (FBF) extracts the most likely characteristics in the processed media content. We examine the development and performance efficiency of the database by applying our algorithms in both the recognition and detection phases. Our results show gains in accuracy and security, with better achievement than a number of previous approaches in all of the above processes.
Keywords: Ubiquitous architecture, verification, identification, recognition.
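Only as a rough sketch of the standard eigenface step mentioned above (the paper's FBF and density-matching stages are not reproduced), the example below projects images onto a PCA subspace and matches a probe by nearest neighbour; the images are random stand-ins for real face data.

# Eigenfaces sketch: PCA-based vector reduction plus nearest-neighbour matching.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
gallery = rng.random((40, 32 * 32))          # 40 flattened 32x32 "face" images (placeholders)
labels = np.repeat(np.arange(10), 4)         # 10 identities, 4 images each

pca = PCA(n_components=20, whiten=True)      # eigenface subspace / vector reduction
gallery_proj = pca.fit_transform(gallery)

def identify(probe_image):
    """Project a probe onto the eigenface space and return the closest identity."""
    probe_proj = pca.transform(probe_image.reshape(1, -1))
    distances = np.linalg.norm(gallery_proj - probe_proj, axis=1)
    return labels[int(np.argmin(distances))]

print(identify(gallery[7]))                  # matches its own identity: 1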
1260 Applications of Artificial Neural Network to Building Statistical Models for Qualifying and Indexing Radiation Treatment Plans
Authors: Pei-Ju Chao, Tsair-Fwu Lee, Wei-Luen Huang, Long-Chang Chen, Te-Jen Su, Wen-Ping Chen
Abstract:
The main goal of this paper is to quantify the quality of different techniques for radiation treatment plans. A back-propagation artificial neural network (ANN) combined with biomedical theory was used to model thirteen dosimetric parameters and to calculate two dosimetric indices. The correlations between dosimetric indices and quality of life were extracted as the features and used in the ANN model to make decisions in the clinic. The simulation results show that a trained multilayer back-propagation neural network model can help a doctor accept or reject a plan efficiently. In addition, the models are flexible: whenever a new treatment technique enters the market, the feature variables simply need to be imported and the model re-trained for it to be ready for use.
Keywords: Neural network, dosimetric index, radiation treatment, tumor.
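A hedged sketch of a back-propagation classifier for plan accept/reject decisions of the kind described above; the thirteen "dosimetric parameters", the labels and the network size are synthetic placeholders, not clinical data or the authors' trained model.

# Back-propagation ANN sketch for plan accept/reject decisions on synthetic data.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 13))                       # 13 dosimetric parameters per plan
y = (X[:, :3].mean(axis=1) > 0).astype(int)          # toy "acceptable plan" rule

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=1)
net.fit(X_tr, y_tr)                                   # back-propagation training
print("accept/reject accuracy:", net.score(X_te, y_te))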
1259 Speech Enhancement Using Kalman Filter in Communication
Authors: Eng. Alaa K. Satti Salih
Abstract:
In applications such as telecommunications, hands-free communication and recording, which need at least one microphone, the signal is usually corrupted by noise and echo. An important application is speech enhancement, which removes the noise and echo picked up by the microphone along with the desired speech. Accordingly, the microphone signal has to be cleaned using digital signal processing (DSP) tools before it is played out, transmitted, or stored. Engineers have tried different approaches to improve the speech by recovering the desired speech signal from the noisy observations, especially in mobile communication. In this paper, the speech signal observed in additive background noise is reconstructed using the Kalman filter technique: the parameters of the autoregressive (AR) process in the state-space model are estimated and the output speech signal is obtained in MATLAB. Accurate Kalman filter estimation enhances the speech and reduces the noise; the actual and estimated values that produce the reconstructed signals are then compared and discussed.
Keywords: Autoregressive Process, Kalman filter, Matlab and Noise speech.
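A minimal sketch of the state-space idea described above, assuming a low-order AR model fitted by least squares and a scalar observation of the newest sample plus white noise; it is not the paper's MATLAB implementation, and the toy AR(2) signal stands in for speech.

# Kalman-filter speech-enhancement sketch with an AR(p) state-space model.
import numpy as np

def estimate_ar(signal, order):
    """Least-squares AR coefficient estimate: s[n] ~ sum_k a[k] * s[n-k]."""
    X = np.column_stack([signal[order - k - 1:len(signal) - k - 1] for k in range(order)])
    y = signal[order:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a

def kalman_denoise(noisy, a, q_var, r_var):
    """Scalar-observation Kalman filter; state holds the last p samples."""
    p = len(a)
    F = np.vstack([a, np.eye(p - 1, p)])      # companion transition matrix
    H = np.zeros((1, p)); H[0, 0] = 1.0       # we observe the newest sample + noise
    Q = np.zeros((p, p)); Q[0, 0] = q_var     # process (excitation) noise
    x = np.zeros(p); P = np.eye(p)
    out = []
    for z in noisy:
        x = F @ x; P = F @ P @ F.T + Q                      # predict
        S = H @ P @ H.T + r_var                             # innovation variance
        K = (P @ H.T) / S                                   # Kalman gain (p x 1)
        x = x + (K * (z - H @ x)).ravel()                   # update
        P = (np.eye(p) - K @ H) @ P
        out.append(x[0])
    return np.array(out)

# Toy example: AR(2) "speech" in white noise.
rng = np.random.default_rng(0)
clean = np.zeros(500)
for n in range(2, 500):
    clean[n] = 1.5 * clean[n - 1] - 0.7 * clean[n - 2] + rng.normal(scale=0.1)
noisy = clean + rng.normal(scale=0.5, size=500)
a = estimate_ar(noisy, order=2)
denoised = kalman_denoise(noisy, a, q_var=0.01, r_var=0.25)
print("noisy MSE:", np.mean((noisy - clean) ** 2),
      "filtered MSE:", np.mean((denoised - clean) ** 2))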
1258 Application of Adaptive Neural Network Algorithms for Determination of Salt Composition of Waters Using Laser Spectroscopy
Authors: Tatiana A. Dolenko, Sergey A. Burikov, Alexander O. Efitorov, Sergey A. Dolenko
Abstract:
In this study, a comparative analysis of the approaches associated with the use of neural network algorithms for the effective solution of a complex inverse problem – the problem of identifying and determining the individual concentrations of inorganic salts in multicomponent aqueous solutions from the spectra of Raman scattering of light – is performed. It is shown that the application of artificial neural networks provides an average accuracy of determination of the concentration of each salt no worse than 0.025 M. The results of a comparative analysis of input data compression methods are presented. It is demonstrated that the use of uniform aggregation of input features allows decreasing the error of determination of individual concentrations of components by 16-18% on average.
Keywords: Inverse problems, multi-component solutions, neural networks, Raman spectroscopy.
1257 Faults Forecasting System
Authors: Hanaa E. Sayed, Hossam A. Gabbar, Shigeji Miyazaki
Abstract:
This paper presents a Faults Forecasting System (FFS) that utilizes statistical forecasting techniques to analyze process variable data in order to forecast fault occurrences. FFS proposes a new idea in detecting faults. Current fault detection techniques are based on analyzing the current status of the system variables in order to check whether the current status is faulty or not. FFS instead uses forecasting techniques to predict the future timing of faults before they happen. The proposed model applies a subset modeling strategy and a Bayesian approach in order to decrease the dimensionality of the process variables and improve fault forecasting accuracy. A practical experiment, designed and implemented at Okayama University, Japan, is presented, and the comparison shows that the proposed model achieves high forecasting accuracy ahead of the actual fault occurrence.
Keywords: Bayesian techniques, fault detection, forecasting techniques, multivariate analysis.
1256 The Effect of Precipitation on Weed Infestation of Spring Barley under Different Tillage Conditions
Authors: J. Winkler, S. Chovancová
Abstract:
The article deals with the relation between rainfall in selected months and subsequent weed infestation of spring barley. The field experiment was performed at the Mendel University agricultural enterprise in Žabčice, Czech Republic. Weed infestation was measured in spring barley vegetation in the years 2004 to 2012. Barley was grown in three tillage variants: conventional tillage technology (CT), minimization tillage technology (MT), and no tillage (NT). Precipitation was recorded at one-day intervals. Monthly precipitation was calculated from the measured values for the months of October through April. The technique of canonical correspondence analysis was applied for further statistical processing. In total, 41 different species of weeds were found in the course of the 9-year monitoring period. The results clearly show that precipitation affects the incidence of most weed species in the selected months, but acts differently in the monitored variants of tillage technologies.
Keywords: Weeds, precipitation, tillage, weed infestation forecast.
1255 Carbon Nanotubes – A Successful Hydrogen Storage Medium
Authors: Vijaya Ilango, Avika Gupta
Abstract:
Hydrogen fuel is a zero-emission fuel which uses electrochemical cells, or combustion in internal engines, to power vehicles and electric devices. Methods of hydrogen storage for subsequent use span many approaches, including high pressures, cryogenics and chemical compounds that reversibly release H2 upon heating. Most research into hydrogen storage is focused on storing hydrogen as a lightweight, compact energy carrier for mobile applications. With the accelerating demand for cleaner and more efficient energy sources, hydrogen research has attracted more attention in the scientific community. Until now, full implementation of a hydrogen-based energy system has been hindered in part by the challenge of storing hydrogen gas, especially onboard an automobile. New techniques being researched may soon make hydrogen storage more compact, safe and efficient. In this overview, a few hydrogen storage methods and the mechanism of hydrogen uptake in carbon nanotubes are summarized.
Keywords: Carbon nanotubes, Chemisorption, Hydrogen storage, Physisorption.
1254 Determining the Criteria and Their Importance Level of Calibration Supplier Selection
Authors: Ayse Gecer, Nihal Erginel
Abstract:
Quality control is a crucial step of the ISO 9001 Quality Management System standard for companies. When measuring the quality level of both raw material and semi-products/products, the calibration of the measuring device is an essential requirement. Calibration suppliers operate in the service sector, and therefore calibration supplier selection is becoming a worthy topic for improving service quality. This study presents the results of a questionnaire about the selection criteria for a calibration supplier. The questionnaire was applied to 103 companies and the results are discussed in this paper. The analysis was made with the MINITAB 14.0 statistical program. "Competence of documentations" and "technical capability" are defined as prerequisites because of the ISO/IEC 17025:2005 standard. Also, "warranties and complaint policy", "communication", "service features", "quality" and "performance history" are defined as very important criteria for calibration supplier selection.
Keywords: Calibration, criteria of calibration supplier selection, calibration supplier selection, questionnaire.
1253 Study of Features for Hand-Printed Recognition
Authors: Satish Kumar
Abstract:
The feature extraction method(s) used to recognize hand-printed characters play an important role in ICR applications. In order to achieve a high recognition rate for a recognition system, the choice of a feature that suits the given script is certainly an important task. Even if a new feature needs to be designed for a given script, it is essential to know the recognition ability of the existing features for that script. The Devanagari script is used in various Indian languages besides Hindi, the mother tongue of the majority of Indians. This research examines a variety of feature extraction approaches, which have been used in various ICR/OCR applications, in the context of hand-printed Devanagari script. The study is conducted theoretically and experimentally on more than 10 feature extraction methods. The various feature extraction methods have been evaluated on a Devanagari hand-printed database comprising more than 25000 characters belonging to 43 alphabets. The recognition ability of the features has been evaluated using three classifiers, i.e. k-NN, MLP and SVM.
Keywords: Features, hand-printed, Devanagari, classifier, database.
1252 Importance of Macromineral Ratios and Products in Association with Vitamin D in Pediatric Obesity Including Metabolic Syndrome
Authors: Mustafa M. Donma, Orkide Donma
Abstract:
Metabolisms of macrominerals, those of calcium, phosphorus and magnesium, are closely associated with the metabolism of vitamin D. Particularly magnesium, the second most abundant intracellular cation, is related to biochemical and metabolic processes in the body, such as those of carbohydrates, proteins and lipids. The status of each mineral has been investigated in obesity to some extent. Their products and ratios may possibly give much more detailed information about the matter. The aim of this study is to investigate possible relations between each macromineral and some obesity-related parameters. This study was performed on 235 children, whose ages were between 6-18 years. Aside from anthropometric measurements, hematological analyses were performed. A TANITA body composition monitor using bioelectrical impedance analysis technology was used to establish some obesity-related parameters including basal metabolic rate (BMR), total fat, mineral and muscle masses. World Health Organization body mass index (BMI) percentiles for age and sex were used to constitute the groups. The values above the 99th percentile were defined as morbid obesity. Those between the 95th and 99th percentiles were included in the obese group. The overweight group comprised children whose percentiles were between 85 and 95. Children between the 15th and 85th percentiles were defined as normal. Metabolic syndrome (MetS) components (waist circumference, fasting blood glucose, triacylglycerol, high density lipoprotein cholesterol, systolic pressure, diastolic pressure) were determined. High performance liquid chromatography was used to determine vitamin D status by measuring 25-hydroxy cholecalciferol (25-hydroxy vitamin D3, 25(OH)D). Vitamin D values above 30.0 ng/ml were accepted as sufficient. The SPSS statistical package program was used for the evaluation of data. The statistical significance degree was accepted as p < 0.05. The important points were the correlations found between vitamin D and magnesium as well as phosphorus (p < 0.05) in the group with normal BMI values. These correlations were lost in the other groups. The ratio of phosphorus to magnesium was even more highly correlated with vitamin D (p < 0.001). The negative correlation between magnesium and total fat mass (p < 0.01) was confined to the MetS group, showing the inverse relationship between magnesium levels and obesity degree. In this group, the calcium*magnesium product exhibited the highest correlation with total fat mass (p < 0.001) among all groups. Only in the MetS group was a negative correlation found between BMR and the calcium*magnesium product (p < 0.05). In conclusion, magnesium is at the center of attention concerning its relationships with vitamin D, fat mass and MetS. The ratios and products derived from macrominerals including magnesium have pointed out stronger associations than each element alone. Final considerations have shown that the unique correlations of magnesium as well as the calcium*magnesium product with total fat mass have drawn attention, particularly in the MetS group, possibly due to derangements in some basic elements of carbohydrate as well as lipid metabolism.
Keywords: Macrominerals, metabolic syndrome, pediatric obesity, vitamin D.
1251 Various Modifications of Electrochemical Barrier Layer Thinning of Anodic Aluminum Oxide
Authors: W. J. Stępniowski, W. Florkiewicz, M. Norek, M. Michalska-Domańska, E. Kościuczyk, T. Czujko
Abstract:
In this paper, two options for anodic alumina barrier layer thinning are demonstrated. The approaches differ in the duration of the voltage step. It was found that too long a voltage step in the barrier layer thinning process leads to chemical etching of the nanopores at their tops, while at the bottoms the pores are not fully opened, which is disadvantageous for further applications in nanofabrication. On the other hand, when the duration of the voltage step is controlled by the current density (the value of the current density cannot exceed 75% of the value recorded during the previous voltage step), the pores are fully opened. However, the pores at the bottom obtained with this procedure have a smaller diameter; nevertheless, this procedure provides electrical contact between the bare aluminum substrate and the electrolyte, which is suitable for template-assisted electrodeposition, one of the most cost-efficient synthesis methods in nanotechnology.
Keywords: Anodic aluminum oxide, anodization, barrier layer thinning, nanopores.
1250 A Comparison between Heuristic and Meta-Heuristic Methods for Solving the Multiple Traveling Salesman Problem
Authors: San Nah Sze, Wei King Tiong
Abstract:
The multiple traveling salesman problem (mTSP) can be used to model many practical problems. The mTSP is more complicated than the traveling salesman problem (TSP) because it requires determining which cities to assign to each salesman, as well as the optimal ordering of the cities within each salesman's tour. Previous studies proposed that Genetic Algorithm (GA), Integer Programming (IP) and several neural network (NN) approaches could be used to solve the mTSP. This paper compares the results for the mTSP solved with a Genetic Algorithm (GA) and the Nearest Neighbor Algorithm (NNA). The cities are clustered into a few groups using the k-means clustering technique, where the number of groups depends on the number of salesmen. Then, each group is solved with NNA and GA as an independent TSP. It is found that k-means clustering and NNA are superior to GA in terms of performance (evaluated by fitness function) and computing time.
Keywords: Multiple Traveling Salesman Problem, Genetic Algorithm, Nearest Neighbor Algorithm, k-Means Clustering.
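A small sketch of the clustering-plus-heuristic pipeline described above: k-means assigns cities to salesmen and each cluster is then toured with the nearest neighbour heuristic. The city coordinates, the shared depot and the cluster count are illustrative assumptions, not the paper's test instances.

# k-means city clustering followed by a nearest-neighbour tour per salesman.
import numpy as np
from sklearn.cluster import KMeans

def nearest_neighbour_tour(points, start):
    """Greedy tour over an array of (x, y) points beginning at `start`."""
    unvisited = list(range(len(points)))
    tour, current = [], start
    while unvisited:
        nxt = min(unvisited, key=lambda i: np.linalg.norm(points[i] - current))
        tour.append(nxt)
        current = points[nxt]
        unvisited.remove(nxt)
    return tour

rng = np.random.default_rng(42)
cities = rng.random((30, 2)) * 100
depot = np.array([50.0, 50.0])
n_salesmen = 3

labels = KMeans(n_clusters=n_salesmen, n_init=10, random_state=0).fit_predict(cities)
for s in range(n_salesmen):
    idx = np.where(labels == s)[0]
    order = nearest_neighbour_tour(cities[idx], depot)
    print(f"salesman {s}: visits cities {list(idx[order])}")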
1249 A Framework for Review Spam Detection Research
Authors: Mohammadali Tavakoli, Atefeh Heydari, Zuriati Ismail, Naomie Salim
Abstract:
With the increasing number of people reviewing products online in recent years, opinion-sharing websites have become the most important source of customers' opinions. Unfortunately, spammers generate and post fake reviews in order to promote or demote brands and mislead potential customers. These are notably destructive not only for potential customers, but also for business holders and manufacturers. However, research in this area is not adequate, and many critical problems related to spam detection have not been solved to date. To provide new researchers in the domain with a useful aid, in this paper we have attempted to create a high-quality framework that gives a clear vision of review spam-detection methods. In addition, this report contains a comprehensive collection of detection metrics used in proposed spam-detection approaches. These metrics are extremely applicable for developing novel detection methods.
Keywords: Fake reviews, Feature collection, Opinion spam, Spam detection.
1248 A Scatter Search and Help Policies Approaches for a New Mixed Model Assembly Lines Sequencing Problem
Authors: N. Manavizadeh, M. Rabbani, H. Sotudian, F. Jolai
Abstract:
Mixed model production is the practice of assembling several distinct and different models of a product on the same assembly line without changeovers and then sequencing those models in a way that smoothes the demand for upstream components. In this paper, we consider an objective function that minimizes total stoppage time and total idle time, and sequence-dependent setup times are taken into account. Many studies have been done on mixed model assembly lines, but in this paper we specifically focus on reducing the idle times, which is possible through various help policies. To improve the solutions, several cases were developed and about 40 test problems were considered. We use scatter search for optimization, and the experimental results show the behavior of the method and demonstrate the efficiency of our algorithm. Scatter search and help policies can produce high-quality answers, which is why they have been used in this paper.
Keywords: Mixed model assembly lines, scatter search, help policies, idle time, stoppage time.
1247 A Control Model for the Dismantling of Industrial Plants
Authors: Florian Mach, Eric Hund, Malte Stonis
Abstract:
The dismantling of disused industrial facilities such as nuclear power plants or refineries is an enormous challenge for the planning and control of the logistic processes. Existing control models do not meet the requirements for a proper dismantling of industrial plants. Therefore, the paper presents an approach for the control of dismantling and post-processing processes (e.g. decontamination) in plant decommissioning. In contrast to existing approaches, the dismantling sequence and depth are selected depending on the capacity utilization of required post-processing processes by also considering individual characteristics of respective dismantling tasks (e.g. decontamination success rate, uncertainties regarding the process times). The results can be used in the dismantling of industrial plants (e.g. nuclear power plants) to reduce dismantling time and costs by avoiding bottlenecks such as capacity constraints.
Keywords: Dismantling management, logistics planning and control models, nuclear power plant dismantling, reverse logistics.
1246 Multi-Objective Optimization of Graph Partitioning Using Genetic Algorithm
Authors: M. Farshbaf, M. R. Feizi-Derakhshi
Abstract:
Graph partitioning is an NP-hard problem with multiple conflicting objectives. The graph partitioning should minimize the inter-partition relationship while maximizing the intra-partition relationship. Furthermore, the partition load should be evenly distributed over the respective partitions. Therefore, this is a multi-objective optimization problem (MOO). One of the approaches to MOO is Pareto optimization, which has been used in this paper. The methods proposed in this paper to improve performance are injecting the best solutions of previous runs into the first generation of the next runs and storing the non-dominated set of previous generations to combine with the non-dominated set of later generations. These improvements prevent the GA from getting stuck in local optima and increase the probability of finding more optimal solutions. Finally, a simulation study is carried out to investigate the effectiveness of the proposed algorithm. The simulation results confirm the effectiveness of the proposed method.
Keywords: Graph partitioning, genetic algorithm, multi-objective optimization, Pareto front.
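As a small illustration of the Pareto step mentioned above (the GA itself is not reproduced), the helper below keeps the non-dominated partitionings when both objectives, for example cut size and load imbalance, are minimized; the candidate values are toy data.

# Keep the Pareto front of candidate partitionings under two minimized objectives.
def non_dominated(solutions):
    """solutions: list of objective tuples to minimize; returns the Pareto front."""
    front = []
    for s in solutions:
        dominated = any(all(o <= t for o, t in zip(other, s)) and other != s
                        for other in solutions)
        if not dominated:
            front.append(s)
    return front

candidates = [(10, 0.30), (8, 0.45), (12, 0.10), (9, 0.50)]
print(non_dominated(candidates))   # (9, 0.50) is dominated by (8, 0.45)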
1245 A Dynamic Equation for Downscaling Surface Air Temperature
Authors: Ch. Surawut, D. Sukawat
Abstract:
In order to utilize results from global climate models, dynamical and statistical downscaling techniques have been developed. For dynamical downscaling, usually a limited-area numerical model is used, with an associated high computational cost. This research proposes a dynamic equation for specific space-time regional climate downscaling from the Educational Global Climate Model (EdGCM) for Southeast Asia. The equation is for surface air temperature and provides downscaled values of surface air temperature at any specific location and time without running a regional climate model. In the proposed equations, surface air temperature is approximated from ground temperature, sensible heat flux and 2 m wind speed. Results from the application of the equation show that its errors are smaller than the errors of direct interpolation from EdGCM.
Keywords: Dynamic equation, downscaling, inverse distance weight interpolation.
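The abstract does not state the proposed equation itself, so it is not reproduced here; purely for orientation, the sketch below shows the inverse-distance-weighting interpolation named in the keywords, which is the kind of direct interpolation the results are compared against. The grid coordinates and temperatures are made up.

# Inverse-distance-weighting baseline for interpolating a coarse-grid field.
import numpy as np

def idw(grid_points, grid_values, target, power=2.0):
    """Interpolate a value at `target` from coarse grid points (lat, lon)."""
    d = np.linalg.norm(grid_points - target, axis=1)
    if np.any(d == 0):                      # target coincides with a grid node
        return float(grid_values[np.argmin(d)])
    w = 1.0 / d ** power
    return float(np.sum(w * grid_values) / np.sum(w))

# Four surrounding coarse-grid nodes (illustrative coordinates and temperatures).
nodes = np.array([[10.0, 100.0], [10.0, 104.0], [14.0, 100.0], [14.0, 104.0]])
t2m = np.array([27.5, 28.1, 26.9, 27.3])    # deg C
print(idw(nodes, t2m, np.array([12.0, 101.5])))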
1244 Critical Issues Affecting the Engagement by Staff in Professional Development for E-Learning: Findings from a Research Project within the Context of a National Tertiary Education Sector
Authors: J. Mansvelt, G. Suddaby, D. O'Hara
Abstract:
This paper focuses on issues of engagement by staff in professional development related to the delivery of e-learning. The paper reports on findings drawn from a New Zealand research project which is producing a sector-wide framework for professional development in tertiary e-learning. The research findings indicate that staff engaged in e-learning in tertiary institutions are not making the most effective use of the professional development opportunities available to them; rather, they seem to gain their knowledge and support from a variety of informal means. This is despite an emphasis on the provision of professional development opportunities by both government policies and the institutions themselves. The conclusion drawn from the findings is that institutional approaches to professional development for e-learning do not yet fully reflect the demands and constraints that working in a digital context imposes.
Keywords: Academic development, e-learning, engagement, professional development, tertiary education.
1243 Statistical Wavelet Features, PCA, and SVM Based Approach for EEG Signals Classification
Authors: R. K. Chaurasiya, N. D. Londhe, S. Ghosh
Abstract:
The study of the electrical signals produced by the neural activity of the human brain is called electroencephalography (EEG). In this paper, we propose an automatic and efficient EEG signal classification approach that classifies an EEG signal into one of two classes: epileptic seizure or not. In the proposed approach, we start with extracting the features by applying the Discrete Wavelet Transform (DWT) in order to decompose the EEG signals into sub-bands. These features, extracted from the detail and approximation coefficients of the DWT sub-bands, are used as input to Principal Component Analysis (PCA). The classification is based on reducing the feature dimension using PCA and deriving the support vectors using a Support Vector Machine (SVM). The experiments are performed on a real, standard dataset, and a very high level of classification accuracy is obtained.
Keywords: Discrete Wavelet Transform, Electroencephalogram, Pattern Recognition, Principal Component Analysis, Support Vector Machine.
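A compact sketch of the pipeline described above: DWT sub-band statistics feed PCA and then an SVM. The signals are synthetic surrogates rather than the real EEG dataset, and the wavelet choice ('db4', level 4), the standardization step and the particular sub-band statistics are assumptions.

# DWT sub-band features -> standardization -> PCA -> SVM, on synthetic epochs.
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

def dwt_features(signal, wavelet="db4", level=4):
    """Mean, std and energy of each approximation/detail sub-band."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    feats = []
    for c in coeffs:
        feats += [np.mean(c), np.std(c), np.sum(c ** 2)]
    return feats

rng = np.random.default_rng(0)
normal = rng.normal(size=(50, 512))                          # surrogate non-seizure epochs
seizure = 3.0 * np.sin(np.linspace(0, 60, 512)) + rng.normal(size=(50, 512))
X = np.array([dwt_features(s) for s in np.vstack([normal, seizure])])
y = np.array([0] * 50 + [1] * 50)

clf = make_pipeline(StandardScaler(), PCA(n_components=5), SVC(kernel="rbf"))
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())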
1242 The Effect of Land Cover on Movement of Vehicles in the Terrain
Authors: Dana Kristalova, Jan Mazal
Abstract:
This article deals with geographical conditions in the terrain and their effect on the movement of vehicles, in particular their effect on the speed and safety of the movement of people and vehicles. Finding optimal routes off the road network is studied in the Army environment, but it occurs in civilian settings as well, primarily in crisis situations or when providing assistance after natural disasters such as floods, fires or storms. These movements require the optimization of routes in which the effects of geographical factors are included. The most important factor is the surface of the terrain, which is determined by several geographical factors such as slope, soil conditions, micro-relief, type of surface and meteorological conditions. Their mutual impact is expressed by a coefficient of deceleration, which can be used for the commander's decision. New approaches and methods of terrain testing, mathematical computing, mathematical statistics and cartometric investigation are necessary parts of this evaluation.
Keywords: Movement in a terrain, geographical factors, surface of a field, mathematical evaluation, optimization and searching paths.
1241 Using Cooperation Approaches at Different Levels of Artificial Bee Colony Method
Authors: Vahid Zeighami, Mohsen Ghasemi, Reza Akbari
Abstract:
In this work, a Multi-Level Artificial Bee Colony (called MLABC) for optimizing numerical test functions is presented. In MLABC, two species are used. The first species employs n colonies, each of which optimizes the complete solution vector. The cooperation between these colonies is carried out by exchanging information through a leader colony, which contains a set of elite bees. The second species uses a cooperative approach in which the complete solution vector is divided into k sub-vectors, and each of these sub-vectors is optimized by a colony. The cooperation between these colonies is carried out by compiling the sub-vectors into the complete solution vector. Finally, the cooperation between the two species is obtained by exchanging information. The proposed algorithm is tested on a set of well-known test functions. The results show that the MLABC algorithm provides efficiency and robustness in solving numerical functions.
Keywords: Artificial bee colony, cooperative artificial bee colony, multilevel cooperation.
1240 A Dose Distribution Approach Using Monte Carlo Simulation in Dosimetric Accuracy Calculation for Treating the Lung Tumor
Authors: Md Abdullah Al Mashud, M. Tariquzzaman, M. Jahangir Alam, Tapan Kumar Godder, M. Mahbubur Rahman
Abstract:
This paper presents Monte Carlo (MC) method-based dose distributions in a lung tumor for a 6 MV photon beam to improve the dosimetric accuracy of cancer treatment. Polystyrene, which is a tissue-equivalent material for the lung tumor density, is used in this research. In the empirical calculations, the TRS-398 formalism of the IAEA has been used, and the setup was made according to the ICRU recommendations. The research outcomes were compared with state-of-the-art experimental results. From the experimental results, it is observed that the proposed approach provides more accurate results and improves the accuracy compared with the existing approaches. The average percentage variation between measured and TPS-simulated values was 1.337±0.531, which shows a substantial improvement compared with the state-of-the-art technology.
Keywords: Lung tumor, Monte Carlo, polystyrene, Elekta Synergy, Monaco Planning System.