Search results for: analysis methods.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 11331


11031 Automated Process Quality Monitoring with Prediction of Fault Condition Using Measurement Data

Authors: Hyun-Woo Cho

Abstract:

Detection of incipient abnormal events is important to improve the safety and reliability of machine operations and to reduce losses caused by failures. Improper set-up or alignment of parts often leads to severe problems in many machines. The construction of models for predicting faulty conditions is therefore essential for deciding when to perform machine maintenance. This paper presents a multivariate calibration monitoring approach based on the statistical analysis of machine measurement data. The calibration model is used to predict two faulty conditions from historical reference data. The approach utilizes genetic algorithm (GA) based variable selection, and we evaluate the predictive performance of several prediction methods using real data. The results show that the calibration model based on supervised probabilistic principal component analysis (SPPCA) yielded the best performance in this work. By adopting a proper variable selection scheme, the prediction performance of calibration models can be improved by excluding non-informative variables from the model-building step.
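
As a minimal illustration of the GA-based variable selection step described above, the sketch below wraps a simple genetic algorithm around a calibration classifier. A plain PCA-plus-logistic-regression pipeline stands in for the SPPCA model of the paper, and the synthetic measurement data, population size and mutation rate are illustrative assumptions.

```python
# Minimal sketch: GA-based variable selection around a calibration model.
# PCA + logistic regression stands in for the paper's SPPCA model; the
# synthetic data and GA settings below are illustrative assumptions.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))                        # 20 measurement variables
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 0.5, 200) > 0).astype(int)  # fault label

def fitness(mask):
    """Cross-validated accuracy of the calibration model on the selected variables."""
    if mask.sum() == 0:
        return 0.0
    model = make_pipeline(PCA(n_components=int(min(3, mask.sum()))),
                          LogisticRegression(max_iter=500))
    return cross_val_score(model, X[:, mask.astype(bool)], y, cv=5).mean()

pop = rng.integers(0, 2, size=(30, X.shape[1]))       # population of binary variable masks
for generation in range(25):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-10:]]           # keep the fittest masks
    children = parents[rng.integers(0, 10, size=30)].copy()
    children[rng.random(children.shape) < 0.05] ^= 1  # bit-flip mutation
    pop = children

best = pop[np.argmax([fitness(m) for m in pop])]
print("selected variables:", np.flatnonzero(best))
```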

Keywords: Prediction, operation monitoring, on-line data, nonlinear statistical methods, empirical model.

11030 The Frame Analysis and Testing for Student Formula

Authors: Tanawat Limwathanagura, Chartree Sithananun, Teekayu Limchamroon, Thanyarat Singhanart

Abstract:

The objective of this paper is to study the analysis and testing used to determine the torsional stiffness of the student formula's space frame. In a previous study, the space frame used by the Chulalongkorn University Student Formula team in the 2011 TSAE Auto Challenge Student Formula in Thailand was designed by considering the required mass and torsional stiffness, based on numerical and experimental methods. The numerical result was compared with the experimental results to verify the torsional stiffness of the space frame. The large error in the torsional stiffness of the 2011 frame showed that the experimental result could not be verified by the numerical analysis, owing to differences between the numerical model and the experimental setup. In this paper, the numerical analysis and the experiment on the same 2011 frame model are repeated with improved model settings. The improvements to both the numerical analysis and the experiment are discussed to confirm that the models used by the two methods are the same. After the frame was analyzed and tested, the results were compared to verify the torsional stiffness of the frame. It can be concluded that the improved analysis and experiments can be used to verify the torsional stiffness of the space frame.
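
The quantity being verified reduces to the torsional stiffness K = T/θ, the applied torque divided by the resulting twist angle of the frame. The short sketch below computes it from a typical twist test; the load, lever arm and measured deflections are hypothetical values, not the team's data.

```python
# Torsional stiffness K = T / theta from a typical twist test, where a
# vertical load F applied at a lever arm L twists the front bulkhead while
# the rear is fixed. All numbers below are hypothetical.
import math

F = 400.0          # applied load at the lever end [N]
L = 0.5            # lever arm from the frame centreline [m]
dz_left = 0.0042   # measured vertical deflection, left mount [m]
dz_right = -0.0038 # measured vertical deflection, right mount [m]
track = 1.2        # distance between the two measurement points [m]

torque = F * L                                                   # applied torque [N*m]
theta = math.degrees(math.atan((dz_left - dz_right) / track))    # twist angle [deg]
stiffness = torque / theta                                       # [N*m/deg]
print(f"torsional stiffness = {stiffness:.1f} N*m/deg")
```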

Keywords: Space Frame, Student Formula, Torsional Stiffness, TSAE Auto Challenge

11029 Affine Combination of Splitting Type Integrators, Implemented with Parallel Computing Methods

Authors: Adrian Alvarez, Diego Rial

Abstract:

In this work we present a family of new convergent splitting-type methods of high order that require no negative substeps, a feature that allows their application to irreversible problems. The methods are built as affine combinations of the results obtained with Lie-Trotter integrators using different step sizes. Some examples are presented and compared with symplectic methods, in particular for a pair of semilinear differential equations. The number of basic integrations required is comparable with that of symplectic integrators, but this technique allows the computations to be performed in parallel, thus reducing run times. We exhibit some example implementations with simple schemes, illustrating the modularity and scalability of the process.
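
A minimal sketch of the underlying idea follows: first-order Lie-Trotter integrators that use only positive substeps are combined affinely (here with the Richardson-type coefficients 2 and -1, an assumed choice for illustration) so that the leading error term cancels. The 2x2 non-commuting linear test problem is also an assumption, not the semilinear examples of the paper.

```python
# Minimal sketch of an affine combination of Lie-Trotter results: each basic
# integrator uses only positive substeps; only the combination coefficients
# (2 and -1 here, an assumed Richardson-type choice) carry signs.
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0, 0.0], [-1.0, 0.0]])     # A and B do not commute
u0 = np.array([1.0, 0.5])
h = 0.1

def lie_trotter(h, n):
    """Apply n Lie-Trotter substeps of size h/n (all positive)."""
    step = expm((h / n) * B) @ expm((h / n) * A)
    u = u0.copy()
    for _ in range(n):
        u = step @ u
    return u

exact = expm(h * (A + B)) @ u0
plain = lie_trotter(h, 1)                             # first-order result
combo = 2.0 * lie_trotter(h, 2) - lie_trotter(h, 1)   # affine combination, higher order
print("error, single Lie-Trotter step:", np.linalg.norm(plain - exact))
print("error, affine combination     :", np.linalg.norm(combo - exact))
```

The two calls to lie_trotter are independent, which is what makes the combination easy to evaluate in parallel.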

Keywords: Lie Trotter integrators, Irreversible Problems, Splitting Methods without negative steps, MPI, HPC.

11028 A Comparative Study on the Financial Characteristics for Development Methods of Urban Development Project - Focusing on Multi-level Replotting Method -

Authors: Jin hui Kim, Hyung kwan Cho, Ji won Moon, Hoon Chang

Abstract:

The purpose of this study is to compare and analyse the financial characteristics of development methods for urban development projects in established areas, focusing on the multi-level replotting method. The analysis showed that the type with the lowest expenditure was the 'combination type of group-land and multi-level replotting' and the type with the highest profitability was the 'multi-level replotting type'. However, the 'multi-level replotting type' still carries the risk of large additional construction costs. In addition, we subdivided the standard amount for liquidation of replotting and analysed the income-expenditure flow. This analysis showed that both the 'multi-level replotting type' and the 'combination type of group-land and multi-level replotting' improved the profitability of the project and the property change ratio. However, when the standard was set below a certain amount, the amount of original property required for replotting increased exponentially, affecting the profitability of the project.

Keywords: Urban development, multi-level replotting, financial characteristics, expropriation type, combination type, urban meteorology.

11027 Mathematical Modeling of the AMCs Cross-Contamination Removal in the FOUPs: Finite Element Formulation and Application in FOUP’s Decontamination

Authors: N. Santatriniaina, J. Deseure, T.Q. Nguyen, H. Fontaine, C. Beitia, L. Rakotomanana

Abstract:

Nowadays, with the increase in wafer size and the decrease in the critical dimensions of integrated circuit manufacturing in modern high-tech, the microelectronics industry must pay maximum attention to contamination control. The move to 300 mm wafers is accompanied by the use of Front Opening Unified Pods (FOUPs) for wafer transport and storage. In these pods, airborne cross-contamination may occur between the wafers and the pods. A predictive approach using modeling and computational methods is a very powerful way to understand and quantify the AMCs cross-contamination processes. This work investigates the numerical tools required to study the AMCs cross-contamination transfer phenomena between wafers and FOUPs. Numerical optimization and a finite element formulation for transient analysis were established. An analytical solution of the one-dimensional problem was developed and used to calibrate the physical constants: the least-squares distance between the model (the analytical 1D solution) and the experimental data is minimized. The behavior of the AMCs in transient analysis was determined. The model framework preserves the classical forms of the diffusion and convection-diffusion equations and yields a consistent form of Fick's law. The adsorption process and the surface roughness effect were also expressed as boundary conditions, using a Dirichlet-to-Neumann switching condition and an interface condition. The methodology is applied, first, using optimization methods with the analytical solution to identify the physical constants, and second, using the finite element method including the adsorption kinetics and the Dirichlet-to-Neumann switching condition.
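
The calibration step described above amounts to a least-squares fit of the 1D analytical solution to measured concentration data. The sketch below fits an erfc-type diffusion profile with SciPy; the profile form, exposure time and synthetic data are assumptions and may differ from the paper's exact analytical solution and boundary conditions.

```python
# Minimal sketch of the least-squares calibration step: fit a 1D analytical
# diffusion profile C(x,t) = C_s * erfc(x / (2*sqrt(D*t))) to concentration
# measurements. The profile form, exposure time and data are assumptions.
import numpy as np
from scipy.special import erfc
from scipy.optimize import curve_fit

def profile(x, C_s, D_nano, t=3600.0):
    """Concentration profile; D is given in units of 1e-9 m^2/s, t fixed at 1 h (assumed)."""
    return C_s * erfc(x / (2.0 * np.sqrt(D_nano * 1e-9 * t)))

x_data = np.linspace(1e-4, 5e-3, 20)                       # depth into the surface [m]
measured = profile(x_data, 1.0, 2.0) \
           + np.random.default_rng(1).normal(0, 0.01, x_data.size)

(C_s_fit, D_fit), _ = curve_fit(profile, x_data, measured, p0=[0.5, 1.0],
                                bounds=([1e-6, 1e-3], [10.0, 100.0]))
print(f"surface concentration = {C_s_fit:.3f}, diffusivity = {D_fit:.2f}e-9 m^2/s")
```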

Keywords: AMCs, FOUP, cross-contamination, adsorption, diffusion, numerical analysis, wafers, Dirichlet to Neumann, finite elements methods, Fick’s law, optimization.

11026 Approximate Solutions to Large Stein Matrix Equations

Authors: Khalide Jbilou

Abstract:

In the present paper, we propose numerical methods for solving the Stein equation AXC - X - D = 0 where the matrix A is large and sparse. Such problems appear in discrete-time control problems, filtering and image restoration. We consider the case where the matrix D is of full rank and the case where D is factored as a product of two matrices. The proposed methods are Krylov subspace methods based on the block Arnoldi algorithm. We give theoretical results and we report some numerical experiments.
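
For small dimensions the Stein equation AXC - X = D can be solved directly by vectorization, which is useful as a baseline for validating a Krylov-projection solver; the sketch below shows this dense approach, not the block Arnoldi method proposed in the paper.

```python
# Dense baseline for the Stein equation A X C - X = D via vectorization:
# (C^T kron A - I) vec(X) = vec(D). Only practical for small sizes, e.g. to
# validate a Krylov-projection solver; this is not the block Arnoldi method.
import numpy as np

rng = np.random.default_rng(0)
n, p = 6, 4
A = rng.normal(size=(n, n))
C = rng.normal(size=(p, p))
D = rng.normal(size=(n, p))

K = np.kron(C.T, A) - np.eye(n * p)
x = np.linalg.solve(K, D.reshape(-1, order="F"))   # column-stacked vec(D)
X = x.reshape(n, p, order="F")

print("residual ||AXC - X - D|| =", np.linalg.norm(A @ X @ C - X - D))
```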

Keywords: IEEEtran, journal, LATEX, paper, template.

11025 Application of Subversion Analysis in the Search for the Causes of Cracking in a Marine Engine Injector Nozzle

Authors: Leszek Chybowski, Artur Bejger, Katarzyna Gawdzińska

Abstract:

Subversion analysis is a tool used in the TRIZ (Theory of Inventive Problem Solving) methodology. This article introduces the history and describes the process of subversion analysis, as well as function analysis and analysis of the resources, used at the design stage when generating possible undesirable situations. The article charts the course of subversion analysis when applied to a fuel injection nozzle of a marine engine. The work describes the fuel injector nozzle as a technological system and presents principles of analysis for the causes of a cracked tip of the nozzle body. The system is modelled with functional analysis. A search for potential causes of the damage is undertaken and a cause-and-effect analysis for various hypotheses concerning the damage is drawn up. The importance of particular hypotheses is evaluated and the most likely causes of damage identified.

Keywords: Complex technical system, fuel injector, function analysis, importance analysis, resource analysis, sabotage analysis, subversion analysis, TRIZ.

11024 The Estimation of Human Vital Signs Complexity

Authors: L. Bikulciene, E. Venskaityte, G. Jarusevicius

Abstract:

Nonstationary and nonlinear signals generated by living complex systems defy traditional mechanistic approaches, which are based on homeostasis. Our previous studies have shown that evaluating the interactions of physiological signals with special analysis methods is suitable for the observation of physiological processes. We demonstrate the possibility of using a deep physiological model, based on the interpretation of changes in the functional states of the human body, combined with an analytical method based on matrix theory for the analysis of physiological signals, applied to high-risk cardiac patients. It is shown that the evaluation of interactions between cardiac signals reveals functional changes, peculiar to each individual, at the onset of the hemodynamic restoration procedure. Therefore, we suggest that the assessment of alterations in the functional state of the body after patients undergo surgery can be complemented by data obtained from the suggested approach to evaluating the interactions of functional variables.

Keywords: Cardiac diseases, Complex systems theory, ECG analysis, matrix analysis.

11023 EZW Coding System with Artificial Neural Networks

Authors: Saudagar Abdul Khader Jilani, Syed Abdul Sattar

Abstract:

Image compression plays a vital role in today's communication. Limited allocated bandwidth leads to slower communication, so to increase the transmission rate within the limited bandwidth the image data must be compressed before transmission. Basically, there are two types of compression: lossy and lossless. Although lossy compression gives a higher compression ratio than lossless compression, the accuracy of the retrieved image is lower. The JPEG and JPEG 2000 image compression systems use Huffman coding for image compression. The JPEG 2000 coding system uses the wavelet transform, which decomposes the image into different levels, where the coefficients in each sub-band are uncorrelated with the coefficients of other sub-bands. Embedded Zerotree Wavelet (EZW) coding exploits the multi-resolution properties of the wavelet transform to give a computationally simple algorithm with better performance than existing wavelet transforms. For further improvement of compression applications, other coding methods have recently been suggested; an ANN-based approach is one such method. Artificial neural networks have been applied to many problems in image processing and have demonstrated their superiority over classical methods when dealing with noisy or incomplete data in image compression applications. A performance analysis on different images is presented for an EZW coding system combined with the error backpropagation algorithm. The implementation and analysis show approximately 30% higher accuracy in the retrieved image compared with the existing EZW coding system.
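
The wavelet step underlying EZW coding can be sketched as a multi-level decomposition followed by a significance test against a threshold that halves on each dominant pass. The example below uses PyWavelets on a synthetic image; the ANN refinement stage described in the abstract is not reproduced.

```python
# Minimal sketch of the wavelet step behind EZW coding: decompose an image
# into subbands and flag coefficients as significant against a threshold
# that halves each pass. The synthetic image is an assumption; the ANN
# refinement stage described in the paper is not reproduced here.
import numpy as np
import pywt

img = np.outer(np.sin(np.linspace(0, 3, 64)), np.cos(np.linspace(0, 5, 64)))
coeffs = pywt.wavedec2(img, "haar", level=3)          # multi-resolution subbands
flat, _ = pywt.coeffs_to_array(coeffs)

threshold = np.abs(flat).max() / 2.0
for dominant_pass in range(4):
    significant = np.abs(flat) >= threshold
    print(f"pass {dominant_pass}: threshold={threshold:.4f}, "
          f"significant coefficients={int(significant.sum())}")
    threshold /= 2.0
```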

Keywords: Accuracy, Compression, EZW, JPEG2000, Performance.

11022 Competitors’ Influence Analysis of a Retailer by Using Customer Value and Huff’s Gravity Model

Authors: Yepeng Cheng, Yasuhiko Morimoto

Abstract:

Customer relationship analysis is vital for retail stores, especially for supermarkets. Point of sale (POS) systems make it possible to record the daily purchasing behaviors of customers in an identification point of sale (ID-POS) database, which can be used to analyze the customer behaviors of a supermarket. The customer value is an indicator, based on the ID-POS database, for detecting the customer loyalty of a store. In general, there are many supermarkets in a city, and nearby competitor supermarkets significantly affect the customer value of a supermarket's customers. However, it is impossible to obtain detailed ID-POS databases of competitor supermarkets. This study first focused on the customer value and the distance between a customer's home and the supermarkets in a city, and then constructed models based on logistic regression analysis to analyze the correlations between distance and purchasing behaviors using only the POS database of a single supermarket chain. During the modeling process, three primary problems arose: the incomparability of customer values, multicollinearity between the customer value and distance data, and the number of valid partial regression coefficients. The improved customer value, Huff's gravity model, and the inverse attractiveness frequency are considered to solve these problems. This paper presents three types of models based on these three methods for loyal customer classification and competitors' influence analysis. In numerical experiments, all types of models were useful for loyal customer classification. The model that incorporates all three methods was the most suitable for evaluating the influence of other nearby supermarkets on customers' purchasing at the supermarket chain, from the viewpoint of valid partial regression coefficients and accuracy.
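
Huff's gravity model assigns customer i a probability of patronising store j proportional to the store's attractiveness divided by the distance raised to a decay exponent, normalised over all candidate stores: P_ij = (A_j / d_ij^λ) / Σ_k (A_k / d_ik^λ). The sketch below illustrates this; the floor areas, distances and decay exponent are assumed values.

```python
# Minimal sketch of Huff's gravity model: the probability that a customer
# patronises store j is the store's attractiveness divided by distance to a
# decay power, normalised over all stores. Store areas, distances and the
# decay exponent below are illustrative assumptions.
import numpy as np

floor_area = np.array([2500.0, 1800.0, 3200.0])   # attractiveness proxy [m^2]
distance_km = np.array([[0.8, 2.1, 3.5],          # customer 1 to stores 1..3
                        [2.6, 0.9, 1.7]])         # customer 2 to stores 1..3
decay = 2.0                                       # distance-decay exponent

utility = floor_area / distance_km ** decay
huff_probability = utility / utility.sum(axis=1, keepdims=True)
print(huff_probability)   # row k: patronage probabilities of customer k
```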

Keywords: Customer value, Huff's Gravity Model, POS, retailer.

11021 Improvement of the Q-System Using the Rock Engineering System: A Case Study of Water Conveyor Tunnel of Azad Dam

Authors: S. Golmohammadi, M. Noorian Bidgoli

Abstract:

Because the status and mechanical parameters of discontinuities in the rock mass are included in the calculations, various methods of rock engineering classification are often used as a starting point for the design of different types of structures. The Q-system is one of the most frequently used methods for stability analysis and determination of support systems of underground structures in rock, including tunnels. In this method, six main parameters of the rock mass are required, namely the Rock Quality Designation (RQD), joint set number (Jn), joint roughness number (Jr), joint alteration number (Ja), joint water parameter (Jw) and Stress Reduction Factor (SRF). In this regard, in order to achieve a reasonable and optimal design, identifying the parameters that are effective for the stability of the mentioned structures is one of the most important goals and the most necessary actions in rock engineering. Therefore, it is necessary to study the relationships between the parameters of a system, how they interact with each other and, ultimately, the behaviour of the whole system. In this research, we attempt to determine the most effective parameters (key parameters) among the six rock mass parameters of the Q-system using the Rock Engineering System (RES) method, in order to improve the relationships between the parameters in the calculation of the Q value. The RES is, in fact, a method by which one can determine the degree of cause and effect of a system's parameters by constructing an interaction matrix. In this research, the geomechanical data collected from the water conveyor tunnel of Azad Dam were used to construct the interaction matrix of the Q-system. For this purpose, instead of using the conventional coding methods, which are always accompanied by defects such as uncertainty, the Q-system interaction matrix is coded using a technique that is, in effect, a statistical analysis of the data and the determination of the correlation coefficients between them. In this way the effect of each parameter on the system is evaluated with greater certainty. The results of this study show that the resulting interaction matrix provides a reasonable estimate of the effective parameters in the Q-system. Among the six parameters of the Q-system, the SRF and Jr parameters exert the maximum and minimum influence on the system (cause), respectively, while the RQD and Jw parameters are the most and least influenced by the system (effect), respectively. Therefore, by developing this method, a more accurate relation for rock mass classification can be obtained by weighting the required parameters in the Q-system.
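
The coding of the interaction matrix from data can be sketched as follows: off-diagonal entries are taken as correlation coefficients between parameters, the row sum gives the cause C_i and the column sum the effect E_i, and (C_i + E_i) yields a relative weight. The random placeholder data and this particular correlation-based coding (which is symmetric, so C and E coincide here) are illustrative assumptions, not the tunnel records of the study.

```python
# Minimal sketch of coding a RES interaction matrix from data: off-diagonal
# entries are absolute correlation coefficients between parameters, cause C_i
# is the row sum, effect E_i the column sum, and (C_i + E_i) gives a relative
# weight. Correlation-based coding is symmetric, so C and E coincide here;
# the random data below merely stand in for the tunnel measurements.
import numpy as np

params = ["RQD", "Jn", "Jr", "Ja", "Jw", "SRF"]
data = np.random.default_rng(2).normal(size=(100, 6))   # placeholder measurements

corr = np.abs(np.corrcoef(data, rowvar=False))
np.fill_diagonal(corr, 0.0)            # a parameter does not interact with itself

cause = corr.sum(axis=1)               # C_i: influence of parameter i on the system
effect = corr.sum(axis=0)              # E_i: influence of the system on parameter i
weight = (cause + effect) / (cause + effect).sum()

for name, c, e, w in zip(params, cause, effect, weight):
    print(f"{name:>3}: C={c:.2f}  E={e:.2f}  weight={w:.3f}")
```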

Keywords: Q-system, Rock Engineering System, statistical analysis, rock mass, tunnel.

11020 Static Analysis and Pseudostatic Slope Stability

Authors: Meftah Ali

Abstract:

This article analyzes static and pseudostatic slope stability using different methods: the Bishop, Janbu, Ordinary, Morgenstern-Price and GLE methods. Two-dimensional modeling of slope stability is carried out under various loadings: the earthquake effect, the water level, and moving road loads. The results show that the slope is stable in the static case without water, but in the other cases the slope loses its stability and becomes unstable. The safety factor is calculated to evaluate the stability of the slope using the limit equilibrium method, despite the differences between the results obtained by these methods, which do not rely on the same assumptions. In the end, the results of this study clearly illustrate the influence of water, moving loads and earthquakes on the stability of the slope.
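
A minimal sketch of one of the limit equilibrium methods, the Ordinary (Fellenius) method of slices with a pseudostatic horizontal seismic force k_h·W added to each slice, is given below. The slice geometry, soil parameters and seismic coefficient are assumed values, and pore water pressure is omitted for brevity.

```python
# Minimal sketch of a pseudostatic Ordinary (Fellenius) method of slices:
# a horizontal seismic force k_h*W acts on each slice, and the factor of
# safety is the ratio of resisting to driving forces along the slip surface.
# Slice data and soil parameters are illustrative assumptions; pore water
# pressure is omitted for brevity.
import numpy as np

c_kPa, phi_deg, k_h = 15.0, 25.0, 0.15                          # cohesion, friction angle, seismic coeff.
weight_kN = np.array([120.0, 210.0, 260.0, 220.0, 130.0])       # slice weights
alpha_deg = np.array([-5.0, 8.0, 20.0, 32.0, 45.0])             # base inclinations
base_len_m = np.array([2.1, 2.0, 2.1, 2.4, 2.9])                # base lengths

alpha = np.radians(alpha_deg)
phi = np.radians(phi_deg)

normal = weight_kN * np.cos(alpha) - k_h * weight_kN * np.sin(alpha)
resisting = c_kPa * base_len_m + normal * np.tan(phi)
driving = weight_kN * (np.sin(alpha) + k_h * np.cos(alpha))

print("pseudostatic factor of safety:", resisting.sum() / driving.sum())
```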

Keywords: Slope stability, pseudo static, safety factor, limit equilibrium.

11019 Ranking Alternatives in Multi-Criteria Decision Analysis using Common Weights Based on Ideal and Anti-ideal Frontiers

Authors: Saber Saati Mohtadi, Ali Payan, Azizallah Kord

Abstract:

One of the most important issues in multi-criteria decision analysis (MCDA) is determining the weights of criteria so that all alternatives can be compared based on the collective performance of the criteria. In this paper, one of the popular methods in data envelopment analysis (DEA), known as common weights (CWs), is used to determine the weights in MCDA. Two frontiers, named the ideal and anti-ideal frontiers, are defined, instead of ideal and anti-ideal alternatives, based on two newly proposed CWs models. Ideal and anti-ideal frontiers are more flexible than ideal and anti-ideal alternatives. From the optimal solutions of these two models, the distances of an alternative from the ideal and anti-ideal frontiers are derived. Then, a relative distance is introduced to measure the value of each alternative. The suggested models are linear and, despite the weight restrictions, are feasible. An example is presented to explain the method and to compare it with the existing literature.

Keywords: Anti-ideal frontier, Common weights (CWs), Ideal frontier, Multi-criteria decision analysis (MCDA)

11018 Towards Automatic Recognition and Grading of Ganoderma Infection Pattern Using Fuzzy Systems

Authors: Mazliham Mohd Su'ud, Pierre Loonis, Idris Abu Seman

Abstract:

This paper deals with the extraction of information from experts to automatically identify and recognize Ganoderma infection in oil palm stems using tomography images. The experts' knowledge is used as rules in a fuzzy inference system to classify each individual pattern observed in the tomography image. The classification is done by defining membership functions which assign one of three possible hypotheses, Ganoderma infection (G), non-Ganoderma infection (N) or intact stem tissue (I), to every abnormality pattern found in the tomography image. A complete comparison between Mamdani and Sugeno styles, triangular, trapezoidal and mixed triangular-trapezoidal membership functions, and different methods of aggregation and defuzzification is also presented and analyzed to select suitable fuzzy inference system methods for the above-mentioned task. The results showed that seven out of the 30 initially possible combinations of available fuzzy inference methods in the MATLAB Fuzzy Toolbox gave results close to the experts' estimation.
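
A minimal sketch of the classification idea, using triangular membership functions and a simple maximum-strength rule aggregation, is shown below. The input feature, membership breakpoints and rule base are invented for illustration and do not reproduce the authors' calibrated system or the MATLAB Fuzzy Toolbox comparison.

```python
# Minimal sketch of a fuzzy classification with triangular membership
# functions. The input feature (a grey-level statistic of a tomography
# pattern), the membership breakpoints and the rules are all illustrative
# assumptions, not the authors' calibrated system.
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

grey_level = 0.42                                   # feature of one pattern (assumed)
low = trimf(grey_level, 0.0, 0.15, 0.5)
medium = trimf(grey_level, 0.2, 0.5, 0.8)
high = trimf(grey_level, 0.5, 0.85, 1.0)

# Rule base (assumed): low -> Ganoderma (G), medium -> non-Ganoderma (N),
# high -> intact tissue (I); aggregation by maximum of rule strengths.
strengths = {"G": low, "N": medium, "I": high}
label = max(strengths, key=strengths.get)
print(strengths, "->", label)
```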

Keywords: Fuzzy Inference Systems, Tomography analysis, Modelization of expert's information, Ganoderma infection pattern recognition.

11017 E-Learning Methodology Development using Modeling

Authors: Sarma Cakula, Maija Sedleniece

Abstract:

Simulation and modeling computer programs are concerned with the construction of models for analyzing different perspectives and possibilities in a changing environment. The paper presents the theoretical justification and evaluation of a qualitative e-learning development model from the perspective of advancing modern technologies. The principles of qualitative e-learning in higher education, the productivity of the study process using modern technologies, different kinds of methods, and the future perspectives of e-learning in formal education have been analyzed. A theoretically grounded and practically tested model for developing e-learning methods using different technologies for different types of classroom has been worked out; it can be used in a professor's decision-making process to choose the most effective e-learning methods.

Keywords: E-learning, modeling, E-learning methods development, personal knowledge management

11016 Reading Literacy and Methods of Improving Reading

Authors: Iva Košek Bartošová, Andrea Jokešová, Eva Kozlová, Helena Matějová

Abstract:

The paper presents the results of a research team from the Faculty of Education, University of Hradec Králové in the Czech Republic. It introduces the reading methods most commonly used in the first classes of primary school and presents the results of a pilot study focused on the mastery of reading techniques and the quality of reading comprehension of pupils in the first half of a school year, during instruction in reading by an analytic-synthetic method and by a genetic method. These methods of practicing reading skills are the most widely used in the Czech Republic. During the 2015/16 school year, measurements were made of two groups of first-year pupils, and the quantitative and qualitative parameters of the pupils' reading outputs were monitored by several methods. The two methods are based on different theoretical foundations, and each has a specific educational and methodical procedure. This contribution presents results from a pilot project and draws preliminary conclusions, which will be verified in subsequent broader research at the end of the first school year of primary school.

Keywords: Analytic-synthetic method of reading, genetic method of reading, reading comprehension, reading literacy, reading methods, reading speed.

11015 On Preprocessing of Speech Signals

Authors: Ayaz Keerio, Bhargav Kumar Mitra, Philip Birch, Rupert Young, Chris Chatwin

Abstract:

Preprocessing of speech signals is considered a crucial step in the development of a robust and efficient speech or speaker recognition system. In this paper, we present some popular statistical outlier-detection based strategies to segregate the silence/unvoiced part of the speech signal from the voiced portion. The proposed methods are based on the 3σ edit rule and the Hampel identifier, which are compared with the conventional techniques: (i) short-time energy (STE) based methods, and (ii) distribution-based methods. The results obtained after applying the proposed strategies to some test voice signals are encouraging.
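
A minimal sketch of the two proposed detectors on a sequence of short-time frame energies is shown below: the 3σ edit rule flags frames more than three standard deviations from the mean, and the Hampel identifier replaces the mean and standard deviation with the median and the scaled median absolute deviation. The synthetic energy sequence is an assumption; in practice the energies would be computed from speech frames.

```python
# Minimal sketch of the two outlier-based detectors: the 3-sigma edit rule
# (mean +/- 3*std) and the Hampel identifier (median +/- 3*1.4826*MAD),
# applied to a sequence of short-time frame energies. The synthetic signal
# is an assumption; real use would compute energies from speech frames.
import numpy as np

rng = np.random.default_rng(3)
frame_energy = rng.normal(0.05, 0.01, 200)           # silence/unvoiced background
frame_energy[80:95] += 0.6                            # voiced burst

mu, sigma = frame_energy.mean(), frame_energy.std()
voiced_3sigma = np.abs(frame_energy - mu) > 3.0 * sigma

med = np.median(frame_energy)
mad = np.median(np.abs(frame_energy - med))
voiced_hampel = np.abs(frame_energy - med) > 3.0 * 1.4826 * mad

print("frames flagged by 3-sigma rule:", int(voiced_3sigma.sum()))
print("frames flagged by Hampel rule :", int(voiced_hampel.sum()))
```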

Keywords: STE based methods, Mahalanobis distance, 3σ edit rule, Hampel Identifier.

11014 Fuzzy Risk-Based Life Cycle Assessment for Estimating Environmental Aspects in EMS

Authors: Kevin Fong-Rey Liu, Ken Yeh, Cheng-Wu Chen, Han-Hsi Liang

Abstract:

Environmental aspects play a central role in an environmental management system (EMS) because they are the basis for the identification of an organization's environmental targets. The existing methods for the assessment of environmental aspects fall into three categories: risk assessment-based (RA-based), LCA-based and criterion-based methods. To combine the benefits of these three categories of research, this study proposes an integrated framework combining RA-, LCA- and criterion-based methods. The integrated framework incorporates LCA techniques for the identification of the causal linkage of aspect, pathway, receptor and impact, uses fuzzy logic to assess aspects, considers fuzzy conditions in likelihood assessment, and employs a new multi-criteria decision analysis method, multi-criteria and multi-connection comprehensive assessment (MMCA), to estimate the significant aspects in an EMS. The proposed model is verified using a real case study, and the results show that the method successfully prioritizes the environmental aspects.

Keywords: Environmental management system, environmental aspect, risk assessment, life cycle assessment.

11013 Extracting Single Trial Visual Evoked Potentials using Selective Eigen-Rate Principal Components

Authors: Samraj Andrews, Ramaswamy Palaniappan, Nidal Kamel

Abstract:

In single-trial analysis, when using Principal Component Analysis (PCA) to extract Visual Evoked Potential (VEP) signals, the selection of principal components (PCs) is an important issue. We propose a new method here that selects only the appropriate PCs; we denote the method selective eigen-rate (SER). In this method, the VEP is reconstructed based on the rate of the eigenvalues of the PCs. When the technique is applied to emulated VEP signals with added background electroencephalogram (EEG), with a focus on extracting the evoked P3 parameter, it is found to be feasible. The improvement in signal-to-noise ratio (SNR) is superior to that of two other existing methods of PC selection: Kaiser (KSR) and Residual Power (RP). Although another PC selection method, Spectral Power Ratio (SPR), gives a comparable SNR at high noise levels (i.e. strong EEG), SER gives more impressive results in such cases. Next, we applied the SER method to real VEP signals to analyse the P3 responses for matched and non-matched stimuli. The P3 parameters extracted through the proposed SER method showed a higher P3 response for the matched stimulus, which conforms to existing neuroscience knowledge. Single-trial PCA using the KSR and RP methods failed to indicate any difference between the stimuli.
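
A minimal sketch of the reconstruction idea is given below: perform PCA on the multichannel single-trial data, keep the components whose eigenvalue rate exceeds a threshold, and back-project. The 10% threshold and the emulated data are assumptions; the exact SER rate definition of the paper may differ.

```python
# Minimal sketch of PCA-based single-trial reconstruction with an
# eigenvalue-rate selection criterion: keep components whose eigenvalue
# share of the total exceeds a threshold, then back-project. The 10%
# threshold and the emulated multichannel data are assumptions.
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0, 1, 500)
p3 = np.exp(-((t - 0.3) / 0.05) ** 2)                       # emulated P3 component
trials = p3[None, :] * rng.uniform(0.8, 1.2, (16, 1)) \
         + rng.normal(0, 1.0, (16, 500))                     # 16 channels + EEG noise

X = trials - trials.mean(axis=1, keepdims=True)
cov = X @ X.T / X.shape[1]
eigval, eigvec = np.linalg.eigh(cov)                         # ascending eigenvalues
rate = eigval / eigval.sum()
keep = rate > 0.10                                           # eigen-rate criterion

reconstructed = eigvec[:, keep] @ (eigvec[:, keep].T @ X)    # back-projection
print("components kept:", int(keep.sum()))
```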

Keywords: Electroencephalogram, P3, Single trial VEP.

11012 Sensitivity Analysis for Determining Priority of Factors Controlling SOC Content in Semiarid Condition of West of Iran

Authors: Y. Parvizi, M. Gorji, M.H. Mahdian, M. Omid

Abstract:

Soil organic carbon (SOC) plays a key role in soil fertility, hydrology and contaminant control, and acts as a sink or source of terrestrial carbon that can affect the concentration of atmospheric CO2. SOC supports the sustainability and quality of ecosystems, especially in semi-arid regions. This study was conducted to determine the relative importance of 13 different explanatory climatic, soil and geometric factors on the SOC content in one of the semiarid watershed zones in Iran. Two methods, canonical discriminant analysis (CDA) and feed-forward backpropagation neural networks, were used to predict SOC. Stepwise regression and sensitivity analysis were performed to identify the relative importance of the explanatory variables. Results from the sensitivity analysis showed that a 7-2-1 neural network and a CDA model with 5 inputs had the highest predictive ability, explaining 70% and 65% of SOC variability, respectively. Since the neural network model outperformed the CDA model, it should be preferred for estimating SOC.
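
A minimal sketch of a 7-2-1 feed-forward network with a permutation-based sensitivity ranking of its inputs is given below. The synthetic data and the use of permutation importance as the sensitivity measure are assumptions; the study's exact sensitivity procedure may differ.

```python
# Minimal sketch of a 7-2-1 feed-forward network with a permutation-based
# sensitivity ranking of the inputs. The synthetic data and the choice of
# permutation importance as the sensitivity measure are assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(5)
X = rng.normal(size=(300, 7))                        # 7 climatic/soil/terrain factors
soc = 2.0 * X[:, 0] - 1.0 * X[:, 2] + 0.3 * X[:, 5] + rng.normal(0, 0.3, 300)

net = MLPRegressor(hidden_layer_sizes=(2,), max_iter=5000, random_state=0).fit(X, soc)
result = permutation_importance(net, X, soc, n_repeats=20, random_state=0)

for i in np.argsort(result.importances_mean)[::-1]:
    print(f"factor {i}: relative importance {result.importances_mean[i]:.3f}")
```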

Keywords: Soil organic carbon, modeling, neural networks, CDA.

11011 Object Tracking System Using Camshift, Meanshift and Kalman Filter

Authors: Afef Salhi, Ameni Yengui Jammaoussi

Abstract:

This paper presents the implementation of an object tracking system for video sequences. Object tracking is an important task in many vision applications. There are two main steps in video analysis: detection of interesting moving objects and tracking of such objects from frame to frame. Most tracking algorithms use pre-specified methods for preprocessing. In our work, we have implemented several object tracking algorithms (Meanshift, Camshift, Kalman filter) with different preprocessing methods. We have then evaluated the performance of these algorithms on different video sequences. The obtained results show good performance according to the degree of applicability and the evaluation criteria.
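
A minimal sketch of a CamShift tracking loop with a Kalman filter smoothing the tracked window centre, using OpenCV, is given below. The video path, the initial window and the noise covariances are assumptions; in a real system the initial window would come from a detector.

```python
# Minimal sketch of CamShift tracking with a Kalman filter smoothing the
# window centre. The video path, initial window and covariances are assumed.
import cv2
import numpy as np

cap = cv2.VideoCapture("input.avi")                  # hypothetical video file
ok, frame = cap.read()
x, y, w, h = 200, 150, 60, 60                        # assumed initial window
track_window = (x, y, w, h)

hsv_roi = cv2.cvtColor(frame[y:y+h, x:x+w], cv2.COLOR_BGR2HSV)
roi_hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])
cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)
term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

kalman = cv2.KalmanFilter(4, 2)                      # state: x, y, vx, vy
kalman.transitionMatrix = np.array([[1, 0, 1, 0], [0, 1, 0, 1],
                                    [0, 0, 1, 0], [0, 0, 0, 1]], np.float32)
kalman.measurementMatrix = np.eye(2, 4, dtype=np.float32)
kalman.processNoiseCov = 1e-3 * np.eye(4, dtype=np.float32)
kalman.measurementNoiseCov = 1e-1 * np.eye(2, dtype=np.float32)
kalman.errorCovPost = np.eye(4, dtype=np.float32)
kalman.statePost = np.array([[x + w / 2], [y + h / 2], [0], [0]], np.float32)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
    _, track_window = cv2.CamShift(back_proj, track_window, term_crit)
    cx = track_window[0] + track_window[2] / 2.0
    cy = track_window[1] + track_window[3] / 2.0
    kalman.predict()
    filtered = kalman.correct(np.array([[cx], [cy]], np.float32))
    print("raw centre:", (cx, cy), "filtered centre:", filtered[:2].ravel())
cap.release()
```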

Keywords: Tracking, meanshift, camshift, Kalman filter, evaluation.

11010 Novel Methods for Desulfurization of Fuel Oils

Authors: H. Hosseini

Abstract:

Because of the requirement for a low sulfur content in fuel oils, it is necessary to develop alternative methods for the desulfurization of heavy fuel oil. Due to the disadvantages of HDS technologies with respect to cost, safety and environmental impact, new methods have been developed. Among these methods is ultrasound-assisted oxidative desulfurization, by which compounds such as benzothiophene and dibenzothiophene can be oxidized. An alternative method is the removal of sulfur from heavy fuel oil using activated carbon in a packed column under batch conditions; in this case the removal of sulfur compounds reaches about 99%. The most important property of activated carbon is its adsorption capacity, which is due to its high surface area and pore volume.

Keywords: Desulfurization, Fuel oil, Activated carbon, Ultrasound-assisted oxidative desulfurization.

11009 Project and Module Based Teaching and Learning

Authors: Jingyu Hou

Abstract:

This paper proposes a new teaching and learning approach: project and module based teaching and learning (PMBTL). The PMBTL approach incorporates the merits of project/problem based and module based learning methods, and overcomes the limitations of these methods. The correlation between teaching, learning, practice and assessment is emphasized in this approach, and new methods have been proposed accordingly. The distinct features of these new methods differentiate the PMBTL approach from conventional teaching approaches. Evaluation of this approach in practical teaching and learning activities demonstrates its effectiveness and stability in improving the performance and quality of teaching and learning. The approach proposed in this paper can also inform the design of other teaching units.

Keywords: Computer science education, project and module based, software engineering.

11008 Net Fee and Commission Income Determinants of European Cooperative Banks

Authors: Karolína Vozková, Matěj Kuc

Abstract:

Net fee and commission income is one of the key elements of a bank's core income. In the current low-interest-rate environment, this type of income is gaining importance relative to net interest income. This paper analyses the effects of bank-specific and country-specific determinants of net fee and commission income on a set of cooperative banks from European countries over the 2007-2014 period. To do so, dynamic panel data methods (system Generalized Method of Moments) were employed. Subsequently, alternative panel data methods were run as robustness checks of the analysis. A strong positive impact of bank concentration on the share of net fee and commission income was found, which shows that cooperative banks tend to display a higher share of fee income in less competitive markets. This is probably connected with the fact that they stick with their traditional deposit-taking and loan-providing model, and fees on these services are driven down by competitors. Moreover, compared to commercial banks, cooperatives do not expand heavily into non-traditional fee-bearing services under competition, and their overall fee income share therefore decreases with the increased competitiveness of the sector.

Keywords: Cooperative banking, dynamic panel data models, net fee and commission income, system GMM.

11007 New Identity Management Scheme and its Formal Analysis

Authors: Jeonghoon Han, Hanjae Jeong, Dongho Won, Seungjoo Kim

Abstract:

As Internet technology has developed rapidly, the number of identities (IDs) managed by each individual has increased, and various ID management technologies have been developed to assist users. However, most of these technologies are vulnerable to existing hacking methods such as phishing attacks and key-logging. If the administrator's password is exposed, an attacker can access the entire contents of the stolen user's data files on other devices. To solve these problems, we propose a new ID management scheme based on a Single Password Protocol. The paper presents the details of the new scheme as well as a formal analysis of the method using BAN logic.

Keywords: Anti-phishing, BAN Logic, ID management.

11006 Modal Dynamic Analysis of a Mechanism with Deformable Elements from an Oil Pump Unit Structure

Authors: N. Dumitru, S. Dumitru, C. Copilusi, N. Ploscaru

Abstract:

In this research, experimental analyses were performed in order to determine the dynamics and stability of the oil pump mechanism within an oil unit mechanical structure. The experimental tests focused on the vibrations which occur in the rod element during operation of the oil pump unit. The dynamic parameters of the oil pump mechanism were measured and also determined through numerical computation. The entire research is based on virtual prototyping of the oil pump unit mechanical system. For a complete analysis of the mechanism, the dynamic frequency response was identified, mainly for the mechanism's driven element, based on two methods: processing and virtual simulation with the aid of MSC Adams, and experimental analysis. Through this research, a complete methodology is presented in which dynamic numerical simulations of a mechanism with deformable elements are developed and correlated with experimental tests.

Keywords: Modal dynamic analysis, oil pump, vibrations, flexible elements, frequency response.

11005 A Comparison of Recent Methods for Solving a Model 1D Convection Diffusion Equation

Authors: Ashvin Gopaul, Jayrani Cheeneebash, Kamleshsing Baurhoo

Abstract:

In this paper we study some numerical methods for solving a model one-dimensional convection-diffusion equation. The semi-discretisation of the space variable results in a system of ordinary differential equations, and the solution of the latter involves the evaluation of a matrix exponential. Since the calculation of this term is computationally expensive, we study some methods based on Krylov subspaces and on the restrictive Taylor series approximation, respectively. We also consider the Chebyshev pseudospectral collocation method for the spatial discretisation, and we present the numerical solutions obtained by these methods.
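
A minimal sketch of the semi-discretisation and of advancing the resulting ODE system by the action of the matrix exponential is given below, using SciPy's expm_multiply (a truncated-Taylor action; a Krylov subspace method as studied in the paper could be substituted). The grid size, coefficients, homogeneous Dirichlet boundaries and initial profile are assumptions.

```python
# Minimal sketch: central-difference semi-discretisation of the model
# convection-diffusion equation u_t = nu*u_xx - a*u_x with homogeneous
# Dirichlet boundaries, advanced by applying the matrix exponential to the
# initial vector. Grid size, coefficients and initial profile are assumptions.
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import expm_multiply

N, nu, a = 200, 0.01, 1.0
x = np.linspace(0.0, 1.0, N + 2)[1:-1]              # interior nodes
dx = x[1] - x[0]

diff = sparse.diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(N, N)) / dx**2
conv = sparse.diags([-1.0, 1.0], [-1, 1], shape=(N, N)) / (2.0 * dx)
L = nu * diff - a * conv                             # semi-discrete operator

u0 = np.exp(-200.0 * (x - 0.2) ** 2)                 # initial pulse
u_t = expm_multiply(L.tocsc(), u0, start=0.0, stop=0.5, num=6, endpoint=True)
print("solution snapshots shape:", u_t.shape)        # (6, N)
```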

Keywords: Chebyshev Pseudospectral collocation method, convection-diffusion equation, restrictive Taylor approximation.

11004 Identification of Outliers in Flood Frequency Analysis: Comparison of Original and Multiple Grubbs-Beck Test

Authors: Ayesha S. Rahman, Khaled Haddad, Ataur Rahman

Abstract:

At-site flood frequency analysis is used to estimate flood quantiles when the at-site record length is reasonably long. In Australia, the FLIKE software has been introduced for at-site flood frequency analysis. The advantage of FLIKE is that, for a given application, the user can compare a number of the most commonly adopted probability distributions and parameter estimation methods relatively quickly using a Windows interface. The new version of FLIKE incorporates the multiple Grubbs and Beck test, which can identify multiple potentially influential low flows. This paper presents a case study of six catchments in eastern Australia which compares two outlier identification tests (the original Grubbs and Beck test and the multiple Grubbs and Beck test) and two commonly applied probability distributions (the Generalized Extreme Value (GEV) and Log Pearson type 3 (LP3) distributions) using the FLIKE software. It has been found that the multiple Grubbs and Beck test, when used with the LP3 distribution, provides more accurate flood quantile estimates than the LP3 distribution used with the original Grubbs and Beck test. Between these two methods, the differences in flood quantile estimates have been found to be up to 61% for the six study catchments. It has also been found that the GEV distribution (with L-moments) and the LP3 distribution with the multiple Grubbs and Beck test provide quite similar results in most cases; however, a difference of up to 38% has been noted in the flood quantiles for an annual exceedance probability (AEP) of 1 in 100 for one catchment. This finding needs to be confirmed with a greater number of stations across other Australian states.
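
A minimal sketch of the at-site fitting step is given below: a GEV distribution is fitted to an annual-maximum series and the 1-in-100 AEP quantile is read off. SciPy's maximum-likelihood fit is used here for brevity, whereas the study uses L-moments for GEV and FLIKE's fitting machinery; the flow data are synthetic.

```python
# Minimal sketch of at-site flood frequency analysis: fit a GEV distribution
# to an annual-maximum series and read off the 1-in-100 AEP quantile.
# scipy's maximum-likelihood fit is used for brevity (the study uses
# L-moments and FLIKE); the annual maximum series below is synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
annual_max_flow = stats.genextreme.rvs(c=-0.1, loc=300.0, scale=90.0,
                                       size=45, random_state=rng)   # [m^3/s]

shape, loc, scale = stats.genextreme.fit(annual_max_flow)
q100 = stats.genextreme.ppf(1.0 - 0.01, shape, loc=loc, scale=scale)
print(f"1-in-100 AEP flood quantile = {q100:.0f} m^3/s")
```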

Keywords: Floods, FLIKE, probability distributions, flood frequency, outlier.

11003 Hierarchical Clustering Analysis with SOM Networks

Authors: Diego Ordonez, Carlos Dafonte, Minia Manteiga, Bernardino Arcayy

Abstract:

This work presents a neural network model for the clustering analysis of data based on Self Organizing Maps (SOM). The model evolves during the training stage towards a hierarchical structure according to the input requirements. The hierarchical structure symbolizes a specialization tool that provides refinements of the classification process. The structure behaves like a single map with different resolutions depending on the region to analyze. The benefits and performance of the algorithm are discussed in application to the Iris dataset, a classical example for pattern recognition.
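
A minimal sketch of a plain SOM trained on the Iris data is given below, showing the best-matching-unit search and the Gaussian neighbourhood update. The hierarchical refinement described in the abstract (spawning child maps for regions that need finer resolution) is not reproduced; the map size and learning schedule are assumptions.

```python
# Minimal sketch of a Self-Organizing Map trained on the Iris data with a
# winner-takes-all update and a Gaussian neighbourhood. The hierarchical
# refinement described in the paper is not reproduced; the map size and
# the learning schedule below are assumptions.
import numpy as np
from sklearn.datasets import load_iris

X = load_iris().data
X = (X - X.mean(axis=0)) / X.std(axis=0)

rows, cols, dim = 6, 6, X.shape[1]
rng = np.random.default_rng(7)
weights = rng.normal(size=(rows, cols, dim))
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)

for epoch in range(2000):
    lr = 0.5 * np.exp(-epoch / 1000.0)               # decaying learning rate
    radius = 3.0 * np.exp(-epoch / 1000.0)           # decaying neighbourhood width
    x = X[rng.integers(len(X))]
    bmu = np.unravel_index(np.argmin(((weights - x) ** 2).sum(axis=2)), (rows, cols))
    dist2 = ((grid - np.array(bmu)) ** 2).sum(axis=2)
    h = np.exp(-dist2 / (2.0 * radius ** 2))          # Gaussian neighbourhood function
    weights += lr * h[:, :, None] * (x - weights)

bmus = [np.unravel_index(np.argmin(((weights - x) ** 2).sum(axis=2)), (rows, cols)) for x in X]
print("first five best-matching units:", bmus[:5])
```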

Keywords: Neural networks, Self-organizing feature maps, Hierarchical systems, Pattern clustering methods.

11002 Design for Manufacturability and Concurrent Engineering for Product Development

Authors: Alemu Moges Belay

Abstract:

In the 1980s, companies began to feel the effect of three major influences on their product development: newer and innovative technologies, increasing product complexity and larger organizations. Companies were therefore forced to look for new product development methods. This paper focuses on two of these methods, Design for Manufacturability (DFM) and Concurrent Engineering (CE), and analyzes how they are applied. By using them, companies can shorten the product life cycle, reduce cost and meet delivery schedules. The paper also presents simplified models that can be modified and used by different companies, based on their objectives and requirements. The research methodology is the case study: two companies were studied and their product development processes analyzed; historical data were collected and interviews conducted at these companies; and, in addition, a survey of the literature and of previous research on similar topics was carried out. The paper also presents a cost-benefit analysis of implementation and estimates the implementation time. It was found that the two companies did not meet their delivery times to customers: some of the most frequently ordered products were analyzed, and 50% to 80% of them were not delivered to customers on time. The companies follow the traditional, sequential design-then-production approach to product development, which strongly affects time to market. The case study shows that, by implementing these new methods and by forming multidisciplinary teams for design and quality inspection, a company can reduce the number of workflow steps from 40 to 30.

Keywords: Design for manufacturability, Concurrent Engineering, Time-to-Market, Product development
