Search results for: long-term computational analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9506

8036 Thermal Analysis of Toroidal Transformers Using Finite Element Method

Authors: Adrian T.

Abstract:

In this paper, a three-dimensional thermal model of a power toroidal transformer is proposed for both steady-state and transient conditions. The influence of electric current and ambient temperature on the temperature distribution has been investigated. To validate the three-dimensional thermal model, experimental tests have been carried out. There is good correlation between the experimental and simulation results.

Keywords: Temperature distribution, thermal analysis, toroidal transformer.

8035 Power Flow Analysis for Radial Distribution System Using Backward/Forward Sweep Method

Authors: J. A. Michline Rupa, S. Ganesh

Abstract:

This paper proposes a backward/forward sweep method to analyze the power flow in radial distribution systems. Distribution systems have a radial structure and high R/X ratios, so the Newton-Raphson and fast decoupled methods often fail to converge for them. The proposed load flow study uses the backward/forward sweep method, which is one of the most effective methods for load-flow analysis of radial distribution systems. Using this method, the power loss in each branch and the voltage magnitude at each bus are determined. The method has been tested on the IEEE 33-bus radial distribution system and effective results are obtained using MATLAB.
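
For readers unfamiliar with the technique, the following minimal Python sketch runs one backward/forward sweep loop on a hypothetical three-bus radial feeder; the impedances, loads and tolerance are illustrative assumptions, not data from the paper.

    # Minimal backward/forward sweep on a 3-bus radial feeder (illustrative data).
    import numpy as np

    V_base = 1.0 + 0j                      # slack (substation) voltage, p.u.
    # branches listed from the substation outwards: (sending bus, receiving bus, impedance p.u.)
    branches = [(0, 1, 0.02 + 0.04j), (1, 2, 0.03 + 0.05j)]
    # constant-power loads at buses 1 and 2, p.u.
    S_load = {1: 0.10 + 0.05j, 2: 0.08 + 0.04j}

    V = {0: V_base, 1: V_base, 2: V_base}  # flat start
    for _ in range(100):
        # backward sweep: load currents, then branch currents from feeder end towards the source
        I_load = {b: np.conj(S_load.get(b, 0) / V[b]) for b in V}
        I_branch = {}
        for frm, to, z in reversed(branches):
            downstream = sum(I_branch[(f, t)] for f, t, _ in branches if f == to)
            I_branch[(frm, to)] = I_load[to] + downstream
        # forward sweep: update voltages from the substation outwards
        V_new = dict(V)
        for frm, to, z in branches:
            V_new[to] = V_new[frm] - z * I_branch[(frm, to)]
        if max(abs(V_new[b] - V[b]) for b in V) < 1e-6:
            V = V_new
            break
        V = V_new

    losses = sum(abs(I_branch[(f, t)]) ** 2 * z.real for f, t, z in branches)
    print({b: round(abs(v), 4) for b, v in V.items()}, round(losses, 5))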

Keywords: Backward/Forward sweep method, Distribution system, Load flow analysis.

8034 Review of Surface Electromyogram Signals: Its Analysis and Applications

Authors: Anjana Goen, D. C. Tiwari

Abstract:

Electromyography (EMG) is the study of muscle function through analysis of the electrical activity produced by muscles. This electrical activity, displayed in the form of a signal, is the result of neuromuscular activation associated with muscle contraction. The most common techniques for EMG signal recording use surface and needle/wire electrodes, where the latter is usually used when deep muscles are of interest. This paper focuses on the surface electromyogram (SEMG) signal. During SEMG recording, several problems have to be countered, such as noise, motion artifacts and signal instability. Thus, various signal processing techniques have been implemented to produce a reliable signal for analysis. The SEMG signal finds broad application, particularly in the biomedical field. It has been analyzed and studied for various interests such as neuromuscular disease, enhancement of muscular function and human-computer interfaces.

Keywords: Evolvable hardware (EHW), Functional Electrical Stimulation (FES), Hidden Markov Model (HMM), Hjorth Time Domain (HTD).

8033 Reliability Approximation through the Discretization of Random Variables using Reversed Hazard Rate Function

Authors: Tirthankar Ghosh, Dilip Roy, Nimai Kumar Chandra

Abstract:

It is sometimes difficult to determine the exact reliability of complex systems by analytical procedures. An approximate solution to this problem can be provided through discretization of random variables. In this paper, we describe the usefulness of discretizing a random variable using the reversed hazard rate function of its continuous version. Discretization of the exponential distribution is demonstrated, and applications of this approach are cited. Numerical calculations indicate that the proposed approach gives a very good approximation of the reliability of complex systems under a stress-strength set-up. The performance of the proposed approach is better than the existing discrete concentration method of discretization. The approach is conceptually simple, handles analytic intractability and reduces computational time, and it can be applied in manufacturing industries for producing highly reliable items.
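
As standard background (not a result of the paper): the reversed hazard rate of a continuous random variable with density f and distribution function F, and its form for the exponential distribution, are

    r(x) = \frac{f(x)}{F(x)}, \qquad r_{\mathrm{Exp}(\lambda)}(x) = \frac{\lambda e^{-\lambda x}}{1 - e^{-\lambda x}}, \quad x > 0.

One natural discretization, sketched here only as an assumption about the construction, chooses the probability mass at each integer support point k so that the discrete reversed hazard rate P(X = k)/P(X <= k) matches r(k); the paper should be consulted for the exact scheme used.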

Keywords: Discretization, Reversed Hazard Rate, Exponential distribution, reliability approximation, engineering item.

8032 Study on Performance of Wigner Ville Distribution for Linear FM and Transient Signal Analysis

Authors: Azeemsha Thacham Poyil, Nasimudeen KM

Abstract:

This research paper presents methods to assess the performance of the Wigner Ville Distribution (WVD) for time-frequency representation of non-stationary signals, in comparison with other representations such as the STFT and the spectrogram. The simultaneous time-frequency resolution of the WVD is one of the important properties that make it preferable for the analysis and detection of linear FM and transient signals. Two algorithms are proposed here to assess the resolution and to compare signal detection performance. The first method is based on measuring the area under the time-frequency plot in the case of linear FM signal analysis. The second method is based on instantaneous power calculation and is used for transient, non-stationary signals. The implementation of both methods is explained briefly with suitable diagrams. The accuracy of the measurements is validated to show the better performance of the WVD representation in comparison with the STFT and the spectrogram.
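
For reference, the Wigner Ville Distribution of a signal x(t) is the bilinear time-frequency representation

    W_x(t, f) = \int_{-\infty}^{\infty} x(t + \tau/2)\, x^{*}(t - \tau/2)\, e^{-j 2 \pi f \tau}\, d\tau,

which, unlike the windowed STFT, is not limited by a fixed window length and therefore concentrates the energy of a linear FM signal along its instantaneous frequency line.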

Keywords: WVD: Wigner Ville Distribution, STFT: Short Time Fourier Transform, FT: Fourier Transform, TFR: Time-Frequency Representation, FM: Frequency Modulation, LFM Signal: Linear FM Signal, JTFA: Joint time frequency analysis.

8031 Predicting DHF Incidence in Northern Thailand using Time Series Analysis Technique

Authors: S. Wongkoon, M. Pollar, M. Jaroensutasinee, K. Jaroensutasinee

Abstract:

This study aimed at developing a forecasting model for the number of Dengue Haemorrhagic Fever (DHF) cases in Northern Thailand using time series analysis. We developed Seasonal Autoregressive Integrated Moving Average (SARIMA) models on data collected between 2003 and 2006 and then validated the models using data collected between January and September 2007. The results showed that the forecast curves were consistent with the pattern of actual values. The most suitable model was the SARIMA(2,0,1)(0,2,0)12 model, with an Akaike Information Criterion (AIC) of 12.2931 and a Mean Absolute Percent Error (MAPE) of 8.91713. The SARIMA(2,0,1)(0,2,0)12 model fit was adequate for the data, with Portmanteau statistic Q20 = 8.98644 (χ²0.95 = 27.5871, P > 0.05). This indicated that there was no significant autocorrelation between residuals at different lag times in the SARIMA(2,0,1)(0,2,0)12 model.
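
A minimal sketch of fitting the reported model with statsmodels is given below; the case counts are synthetic placeholders and this is not the authors' software, only an illustration of the SARIMA(2,0,1)(0,2,0)12 specification and AIC-based selection.

    # Illustrative SARIMA(2,0,1)(0,2,0)12 fit with statsmodels (not the authors' software).
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    # monthly DHF case counts for 2003-2006 (synthetic placeholder values)
    train = pd.Series(
        [120, 95, 80, 60, 75, 150, 310, 420, 380, 260, 180, 140] * 4,
        index=pd.date_range("2003-01-01", periods=48, freq="MS"),
    )

    model = SARIMAX(train, order=(2, 0, 1), seasonal_order=(0, 2, 0, 12))
    result = model.fit(disp=False)
    print(result.aic)                       # model selection by AIC, as in the study
    forecast = result.forecast(steps=9)     # January to September of the validation year
    print(forecast.round(1))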

Keywords: Dengue, SARIMA, Time Series Analysis, Northern Thailand.

8030 Stress and Strain Analysis of Notched Bodies Subject to Non-Proportional Loadings

Authors: A. Ince

Abstract:

In this paper, an analytical simplified method for calculating the elasto-plastic stresses and strains of notched bodies subjected to non-proportional loading paths is discussed. The method is based on the Neuber notch correction, which relates the incremental elastic and elastic-plastic strain energy densities at the notch root, and on the material constitutive relationship. The validity of the method is demonstrated by comparing the computed results of the proposed model against finite element data for a notched shaft. The comparison showed that the model estimates notch-root elasto-plastic stresses and strains with good accuracy using linear-elastic stresses. The proposed model provides a more efficient and simpler analysis method, preferable to expensive experimental component tests and to more complex and time-consuming incremental non-linear FE analysis. The model is particularly suitable for fatigue life and fatigue damage estimates of notched components subjected to non-proportional loading paths.
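
For context, the classical uniaxial form of the Neuber rule relates the notch-root stress and strain to the nominal stress S through the elastic stress concentration factor K_t and Young's modulus E:

    \sigma \, \varepsilon = \frac{(K_t S)^2}{E}.

Solving this together with the cyclic stress-strain curve gives the elasto-plastic notch-root response; the incremental strain energy density form used in the paper generalizes this relation to each increment of a non-proportional loading path.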

Keywords: Elasto-plastic, stress-strain, notch analysis, non-proportional loadings, cyclic plasticity, fatigue.

8029 Harmonic Analysis and Performance Improvement of a Wind Energy Conversion System with Double Output Induction Generator

Authors: M. Sedighizadeh, A. Rezazadeh

Abstract:

Wind turbines with double output induction generators can operate at variable speed, permitting maximization of conversion efficiency over a wide range of wind velocities. This paper presents the performance analysis of a wind-driven double output induction generator (DOIG) operating at varying shaft speed. A periodic transient-state analysis of a DOIG equipped with two converters is carried out using a hybrid induction machine model. The harmonic content of the waveforms at various points of the drive is simulated at different speeds, based on the hybrid (dqabc) model. The sinusoidal and trapezoidal pulse-width-modulation control techniques are then used in order to improve the power factor of the machine and to attenuate the low-order harmonics injected into the supply. The two techniques are compared on the basis of the frequency spectrum, total harmonic distortion, distortion factor and power factor, and the advantages of the sinusoidal and trapezoidal pulse-width modulation techniques are summarized.
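
For reference, the total harmonic distortion and distortion factor referred to here are, for a current waveform with fundamental RMS component I_1 and harmonic components I_n,

    \mathrm{THD} = \frac{\sqrt{\sum_{n \ge 2} I_n^2}}{I_1}, \qquad \mathrm{DF} = \frac{I_1}{I_{\mathrm{rms}}} = \frac{1}{\sqrt{1 + \mathrm{THD}^2}},

so attenuating the injected low-order harmonics directly raises the distortion factor and, with it, the achievable power factor.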

Keywords: DOIG, Harmonic Analysis, Wind.

8028 Predicting Crack Initiation Due to Ratchetting in Rail Heads Using Critical Element Analysis

Authors: I. U. Wickramasinghe, D. J. Hargreaves, D. V. De Pellegrin

Abstract:

This paper presents a strategy to predict the lifetime of rails subjected to large rolling contact loads that induce ratchetting strains in the rail head. A critical element concept is used to calculate the number of loading cycles needed for crack initiation to occur at the rail head surface. In this technique, the finite element method (FEM) is used to determine the maximum equivalent ratchetting strain per load cycle, which is calculated by combining the longitudinal and shear strains in the critical element. This technique builds on a previously developed critical plane concept that has been used to calculate the number of cycles to crack initiation in rolling contact fatigue under ratchetting failure conditions. The critical element concept simplifies the analytical difficulties of critical plane analysis. Finite element analysis (FEA) is used to identify the critical element in the mesh, and the strain values of the critical element are then used to calculate the ratchetting rate analytically. Finally, a ratchetting criterion is used to calculate the number of cycles to crack initiation from the calculated ratchetting rate.

Keywords: Critical element analysis, finite element modeling (FEM), wheel/rail contact.

8027 Reliability Analysis of Computer Centre at Yobe State University Using LRU Algorithm

Authors: V. V. Singh, Yusuf Ibrahim Gwanda, Rajesh Prasad

Abstract:

In this paper, we focus on the reliability and performance analysis of the Computer Centre (CC) at Yobe State University, Damaturu, Nigeria. The CC consists of three servers: one database mail server, one redundant server and one for sharing with the client computers in the CC (called a local server). Observing the different possibilities of the functioning of the CC, the analysis evaluates various popular measures of reliability such as availability, reliability, mean time to failure (MTTF) and profit due to the operation of the system. The system can fail completely due to failure of the router, failure of the redundant server before the mail server is repaired, or switch failure. The system can also partially fail when the local server fails. Failed devices are restored according to the Least Recently Used (LRU) technique. The system can also fail entirely due to a cooling failure of the server, an electricity failure or a natural calamity such as an earthquake, fire or tsunami. All failure rates are assumed to be constant and to follow an exponential time distribution, while repair follows two types of distribution: general and Gumbel-Hougaard family copula.

Keywords: Reliability, availability, Gumbel-Hougaard family copula, MTTF, internet data center.

8026 3D Oil Reservoir Visualisation Using Octree Compression Techniques Utilising Logical Grid Co-Ordinates

Authors: S. Mulholland

Abstract:

Octree compression techniques have been used for several years to compress large three-dimensional data sets into homogeneous regions. This compression technique is ideally suited to datasets whose similar values occur in clusters. Oil engineers represent reservoirs as a three-dimensional grid in which hydrocarbons occur naturally in clusters. This research looks at the efficiency of storing these grids using octree compression techniques in which grid cells are divided into active and inactive regions. Initial experiments yielded high compression ratios, as only active leaf nodes and their ancestor (header) nodes are stored as a bitstream to a file on disk. Savings in computational time and memory were possible at decompression, as only active leaf nodes are sent to the graphics card, eliminating the need to reconstruct the original matrix. This results in a more compact vertex table, which can be loaded into the graphics card more quickly and generates shorter refresh delay times.
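
A minimal Python sketch of the idea follows, with made-up grid data (the authors' bitstream format is not described in the abstract): a region that is entirely active or entirely inactive collapses to a single leaf, so only mixed regions are subdivided.

    # Toy octree over a 3-D 0/1 activity grid: homogeneous regions collapse to one leaf.
    import numpy as np

    def build(grid):
        """Return 0 or 1 for a homogeneous block, else a list of eight child nodes."""
        if grid.min() == grid.max():
            return int(grid.flat[0])
        hx, hy, hz = (s // 2 for s in grid.shape)
        return [build(grid[x:x + hx, y:y + hy, z:z + hz])
                for x in (0, hx) for y in (0, hy) for z in (0, hz)]

    def count_nodes(node):
        return 1 if isinstance(node, int) else 1 + sum(count_nodes(c) for c in node)

    grid = np.zeros((16, 16, 16), dtype=np.uint8)
    grid[2:6, 3:9, 4:8] = 1                      # one "active" cluster, as in a reservoir
    tree = build(grid)
    print(count_nodes(tree), "octree nodes vs", grid.size, "cells")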

Keywords: 3D visualisation, compressed vertex tables, octree compression techniques, oil reservoir grids.

8025 Closed Form Optimal Solution of a Tuned Liquid Column Damper Responding to Earthquake

Authors: A. Farshidianfar, P. Oliazadeh

Abstract:

In this paper, the vibration behavior of a structure equipped with a tuned liquid column damper (TLCD) under a harmonic type of earthquake loading is studied. Due to the inherent nonlinear liquid damping, a great deal of computational effort is normally required to search numerically for the optimum parameters of the TLCD. Therefore, by linearizing the equation of motion of the single degree of freedom structure equipped with the TLCD, closed form solutions of the TLCD-structure system are derived. To verify the reliability of the analytical method, the results have been compared with those of other researchers and show good agreement. Further, the effects of optimal design parameters such as the length ratio and mass ratio on the performance of the TLCD in controlling the responses of a structure are investigated using the harmonic type of earthquake excitation. Finally, the Citicorp Center, which has a very flexible structure, is used as an example to illustrate the design procedure for the TLCD under earthquake excitation.

Keywords: Closed form solution, Earthquake excitation, TLCD.

8024 Parameter Sensitivity Analysis of Artificial Neural Network for Predicting Water Turbidity

Authors: Chia-Ling Chang, Chung-Sheng Liao

Abstract:

The present study focuses on the parameters of an Artificial Neural Network (ANN). Sensitivity analysis is applied to assess the effect of the ANN parameters on the prediction of the turbidity of raw water in a water treatment plant. The results show that the transfer function of the hidden layer is a critical parameter of the ANN: when the transfer function changes, the reliability of the water turbidity prediction differs greatly. Moreover, the estimated water turbidity is less sensitive to the number of training iterations and the learning rate than to the number of neurons in the hidden layer. Therefore, it is important to select an appropriate transfer function and a suitable number of neurons in the hidden layer in the process of parameter training and validation.
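
As an illustration of this kind of sensitivity check (a scikit-learn stand-in on synthetic data; the study's own network, inputs and training setup are not reproduced), the same network can be refitted while varying only the hidden-layer transfer function and the resulting prediction errors compared:

    # Sensitivity of prediction error to the hidden-layer transfer function (illustrative).
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_absolute_error

    rng = np.random.default_rng(1)
    X = rng.random((400, 4))                       # stand-ins for raw-water quality inputs
    y = 50 * X[:, 0] ** 2 + 10 * X[:, 1] + rng.normal(0, 2, 400)   # synthetic "turbidity"
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    for activation in ("logistic", "tanh", "relu"):    # candidate transfer functions
        net = MLPRegressor(hidden_layer_sizes=(8,), activation=activation,
                           max_iter=5000, random_state=0)
        net.fit(X_tr, y_tr)
        print(activation, round(mean_absolute_error(y_te, net.predict(X_te)), 2))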

Keywords: Artificial Neural Network (ANN), sensitivity analysis, turbidity.

8023 Fighter Aircraft Selection Using Neutrosophic Multiple Criteria Decision Making Analysis

Authors: C. Ardil

Abstract:

Fuzzy sets and intuitionistic fuzzy sets deal with the imprecision and uncertainty inherent in a complex decision problem. However, these theories are sometimes not sufficient to model the indeterminate and inconsistent information encountered in real-life problems. To overcome this insufficiency, the neutrosophic set, which is useful in practical applications, is proposed; triangular neutrosophic numbers and trapezoidal neutrosophic numbers are examined, and their definitions and applications are discussed. In this study, a decision making algorithm is developed using neutrosophic set processes, and an application to fighter aircraft selection is given as an example of a decision making problem. The estimation of the fighter aircraft selection with the neutrosophic multiple criteria decision analysis method is examined.

Keywords: neutrosophic set, multiple criteria decision making analysis, fighter aircraft selection, MCDMA, neutrosophic numbers

8022 A Monte Carlo Method to Data Stream Analysis

Authors: Kittisak Kerdprasop, Nittaya Kerdprasop, Pairote Sattayatham

Abstract:

Data stream analysis is the process of computing various summaries and derived values from large amounts of data that are continuously generated at a rapid rate. The nature of a stream does not allow a revisit of each data element, and data processing must be fast to produce timely analysis results. These requirements impose constraints on the design of the algorithms to balance correctness against timely responses. Several techniques have been proposed over the past few years to address these challenges; they can be categorized as either data-oriented or task-oriented. The data-oriented approach analyzes a subset of data or a smaller transformed representation, whereas the task-oriented scheme solves the problem directly via approximation techniques. We propose a hybrid approach to tackle the data stream analysis problem: the data stream is both statistically transformed to a smaller size and its characteristics computationally approximated. We adopt a Monte Carlo method in the approximation step, while the data reduction is performed horizontally and vertically through our EMR sampling method. The proposed method is analyzed in a series of experiments in which we apply our algorithm to clustering and classification tasks to evaluate the utility of our approach.
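
The EMR sampling scheme itself is not detailed in the abstract; as a generic stand-in, the sketch below shows single-pass reservoir sampling, the kind of Monte Carlo reduction step such hybrid approaches rely on for keeping a fixed-size uniform sample of an unbounded stream.

    # Single-pass reservoir sampling: keep k uniformly random items from a stream.
    import random

    def reservoir_sample(stream, k, seed=0):
        rng = random.Random(seed)
        sample = []
        for i, item in enumerate(stream):
            if i < k:
                sample.append(item)
            else:
                j = rng.randint(0, i)          # each item is retained with probability k/(i+1)
                if j < k:
                    sample[j] = item
        return sample

    print(reservoir_sample(range(1_000_000), k=10))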

Keywords: Data Stream, Monte Carlo, Sampling, Density Estimation.

8021 A Simulated Design and Analysis of a Solar Thermal Parabolic Trough Concentrator

Authors: Fauziah Sulaiman, Nurhayati Abdullah, Balbir Singh Mahinder Singh

Abstract:

In recent years, Malaysia has included renewable energy as an alternative fuel to help diversify the country's energy reliance on oil, natural gas, coal and hydropower, with biomass and solar energy gaining priority. The scope of this paper is the design procedure and analysis of a solar thermal parabolic trough concentrator by simulation, utilizing meteorological data from several parts of Malaysia. Parameters including the aperture area, the diameter of the receiver and the working fluid may be varied to optimize the design. The aperture area is determined by considering the width and the length of the concentrator, whereas the geometric concentration ratio (CR) is obtained by considering the aperture width and the diameter of the receiver. Three types of working fluid are investigated. Theoretically, concentration ratios can be very high, in the range of 10 to 40,000, depending on the optical elements used and continuous tracking of the sun. However, a thorough analysis is essential, as discussed in this paper, where optical precision and thermal analysis must be carried out to evaluate the performance of the parabolic trough concentrator, since the theoretical CR is not the only factor that should be considered.

Keywords: Parabolic trough concentrator, Concentration ratio, Intercept factor, Efficiency.

8020 Improving the Performance of Gas Turbine Power Plant by Modified Axial Turbine

Authors: Hakim T. Kadhim, Faris A. Jabbar, Aldo Rona, Audrius Bagdanaviciu

Abstract:

Computer-based optimization techniques can be employed to improve the efficiency of energy conversion processes, including reducing the aerodynamic loss in a thermal power plant turbomachine. In this paper, towards mitigating secondary flow losses, a design optimization workflow is implemented for the casing geometry of a 1.5-stage axial flow turbine, improving the turbine isentropic efficiency. The improved turbine is used in an open thermodynamic gas cycle with regeneration and cogeneration. Performance estimates are obtained with the commercial software Cycle-Tempo. Design and off-design conditions are considered, as well as variations in inlet air temperature. Reductions in both the natural gas specific fuel consumption and CO2 emissions are predicted when the gas turbine cycle is fitted with the new casing design. These gains are attractive for enhancing the competitiveness and reducing the environmental impact of thermal power plants.

Keywords: Axial flow turbine, computational fluid dynamics, gas turbine power plant, optimization.

8019 Aircraft Selection Process Using Preference Analysis for Reference Ideal Solution (PARIS)

Authors: C. Ardil

Abstract:

Multiple criteria decision making analysis (MCDMA) methods are applied to many real-life problems in different fields of engineering science and technology. The preference analysis for reference ideal solution (PARIS) method is proposed for an efficient MCDMA evaluation of decision problems. The multiple criteria aircraft evaluation approach integrates the mean weight, entropy weight, PARIS and TOPSIS methods, which eliminates the subjective importance weight assignment process. The evaluation criteria were identified from an extensive literature review of the aircraft selection process. The aim of this study is to propose an efficient methodology for handling the aircraft selection process, in which the proposed method effectively solves the MCDMA problem. A numerical example is presented to demonstrate the applicability and validity of the proposed MCDMA approach.
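
The PARIS ranking itself is not reproduced here; the sketch below illustrates only the entropy-weight and TOPSIS building blocks named in the abstract, applied to a made-up decision matrix in which rows are candidate aircraft and all columns are benefit-type criteria.

    # Entropy weights + TOPSIS on an illustrative decision matrix (benefit criteria only).
    import numpy as np

    X = np.array([[8.0, 7.5, 2100.0],      # rows: candidate aircraft (hypothetical)
                  [7.2, 8.1, 1900.0],      # columns: benefit-type criteria
                  [9.1, 6.8, 2300.0]])

    # entropy weighting: criteria whose values vary more across alternatives get larger weights
    P = X / X.sum(axis=0)
    E = -(P * np.log(P)).sum(axis=0) / np.log(len(X))
    w = (1 - E) / (1 - E).sum()

    # TOPSIS: relative closeness to the ideal and anti-ideal solutions
    R = X / np.linalg.norm(X, axis=0)      # vector-normalised matrix
    V = R * w
    ideal, anti = V.max(axis=0), V.min(axis=0)
    d_plus = np.linalg.norm(V - ideal, axis=1)
    d_minus = np.linalg.norm(V - anti, axis=1)
    closeness = d_minus / (d_plus + d_minus)
    print(np.round(w, 3), np.round(closeness, 3))   # higher closeness = better rank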

Keywords: aircraft selection, aircraft, multiple criteria decision making, multiple criteria decision making analysis, mean weight, entropy weight, MCDMA, PARIS, TOPSIS, VIKOR, ELECTRE, PROMETHEE

8018 A Block Cipher for Resource-Constrained IoT Devices

Authors: Muhammad Rana, Quazi Mamun, Rafiqul Islam

Abstract:

In the Internet of Things (IoT), many devices are connected and accumulate vast amounts of data. These Internet-driven raw data need to be transferred securely to end-users via dependable networks. Consequently, the challenges of IoT security in various IoT domains are paramount. Cryptography is applied to secure the networks for authentication, confidentiality, data integrity and access control. However, due to the resource-constrained properties of IoT devices, conventional ciphers may not be suitable in all IoT networks. This paper designs a robust and effective lightweight cipher to secure the IoT environment and meet the resource-constrained nature of IoT devices. We propose a symmetric, block-cipher-based lightweight cryptographic algorithm that increases the complexity of the block cipher while maintaining the lowest computational requirements possible. The proposed algorithm efficiently constructs the key register updating technique, reduces the number of encryption rounds, and adds a layer between the encryption and decryption processes.

Keywords: Internet of Things, IoT, cryptography, block cipher, s-box, key management, IoT security.

8017 Geospatial Network Analysis Using Particle Swarm Optimization

Authors: Varun Singh, Mainak Bandyopadhyay, Maharana Pratap Singh

Abstract:

The shortest path (SP) problem concerns finding the shortest path from a specific origin to a specified destination in a given network while minimizing the total cost associated with the path. This problem has widespread applications; important ones include vehicle routing in transportation systems, particularly in in-vehicle Route Guidance Systems (RGS), and the traffic assignment problem in transportation planning. Evolutionary methods such as Genetic Algorithms (GA), Ant Colony Optimization and Particle Swarm Optimization (PSO) have been applied to complex optimization problems to overcome the shortcomings of existing shortest path analysis methods. It has been reported by various researchers that PSO performs better than other evolutionary optimization algorithms in terms of success rate and solution quality. Furthermore, Geographic Information Systems (GIS) have emerged as key information systems for geospatial data analysis and visualization. This research paper focuses on the application of PSO to solving the shortest path problem between multiple points of interest (POI), based on spatial data of Allahabad City and traffic speed data collected using GPS. Geovisualization of the analysis results is carried out in GIS.
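
For context, the canonical PSO velocity and position updates on which such applications build are

    v_i^{k+1} = w\, v_i^{k} + c_1 r_1 \left( p_i^{\mathrm{best}} - x_i^{k} \right) + c_2 r_2 \left( g^{\mathrm{best}} - x_i^{k} \right), \qquad x_i^{k+1} = x_i^{k} + v_i^{k+1},

with inertia weight w, acceleration coefficients c_1 and c_2 and uniform random numbers r_1 and r_2. For shortest-path search, each particle's position vector must additionally be decoded into a node sequence (commonly via a priority-based encoding) before its path cost is evaluated; the abstract does not specify which encoding is used here.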

Keywords: GIS, Outliers, PSO, Traffic Data.

8016 Speech Intelligibility Improvement Using Variable Level Decomposition DWT

Authors: Samba Raju, Chiluveru, Manoj Tripathy

Abstract:

Intelligibility is an essential characteristic of a speech signal; it describes how well the information in the speech can be understood. Background noise in the environment can deteriorate the intelligibility of a recorded speech signal. In this paper, we present a simple variance-subtracted, variable level discrete wavelet transform that improves the intelligibility of speech. The proposed algorithm does not require an explicit estimation of the noise, i.e., prior knowledge of the noise; hence, it is easy to implement and it reduces the computational burden. The algorithm chooses a separate decomposition level for each frame based on signal-dominant and noise-dominant criteria. The performance of the proposed algorithm is evaluated with the speech intelligibility measure STOI, and the results obtained are compared with universal Discrete Wavelet Transform (DWT) thresholding and Minimum Mean Square Error (MMSE) methods. The experimental results reveal that the proposed scheme outperforms the competing methods.
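
A minimal sketch of wavelet-domain denoising of this general kind is shown below (PyWavelets, a fixed decomposition level and a universal-style threshold; the paper's variance-subtraction rule and per-frame level selection are not reproduced):

    # Generic DWT soft-threshold denoising of a noisy speech frame (illustrative only).
    import numpy as np
    import pywt

    fs = 8000
    t = np.arange(0, 0.512, 1 / fs)
    clean = np.sin(2 * np.pi * 300 * t) * np.hanning(t.size)     # toy "speech" frame
    noisy = clean + 0.3 * np.random.default_rng(0).normal(size=t.size)

    level = 4                                    # fixed here; the paper adapts the level per frame
    coeffs = pywt.wavedec(noisy, "db8", level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate from the finest scale
    thr = sigma * np.sqrt(2 * np.log(noisy.size))           # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    denoised = pywt.waverec(coeffs, "db8")[: noisy.size]
    print(round(float(np.mean((denoised - clean) ** 2)), 5))  # MSE vs the clean frame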

Keywords: Discrete Wavelet Transform, speech intelligibility, STOI, standard deviation.

8015 Parallelization of Protein Sequence Similarity Algorithms Using Remote Method Interface

Authors: Mubarak Saif Mohsen, Zurinahni Zainol, Rosalina Abdul Salam, Wahidah Husain

Abstract:

One of the major problems in the genomics field is performing sequence comparison on DNA and protein sequences. Executing sequence comparison on DNA and protein data is a computationally intensive task, and sequence comparison is the basic step of all protein sequence similarity algorithms. Parallel computing is an attractive solution to provide the computational power needed to speed up the lengthy process of sequence comparison. Our main aim is to enhance the protein sequence algorithm using the dynamic programming method. In our approach, we parallelize the dynamic programming algorithm using a multithreaded program to perform the sequence comparison, and we also develop a protein database distributed among many PCs using Remote Method Interface (RMI). As a result, we show how different sizes of protein sequence data and the computation of the scoring matrix for these protein sequences on different numbers of processors affect the processing time and speed, as opposed to sequential processing.

Keywords: Protein sequence algorithm, dynamic programming algorithm, multithreading.

8014 Comparative Study of Tensile Properties of Cortical Bone Using Sub-size Specimens and Finite Element Simulation

Authors: N. K. Sharma, J. Nayak, D. K. Sehgal, R. K. Pandey

Abstract:

Bone material is heterogeneous and hierarchical in nature; therefore, an appropriate size of bone specimen is required to analyze its tensile properties at a particular hierarchical level. The tensile properties of cortical bone are important for investigating the effects of drug treatment, disease and aging, as well as for the development of computational and analytical models. In the present study, the tensile properties of buffalo and goat femoral and tibial cortical bone are analyzed using sub-size tensile specimens. Femoral cortical bone was found to be stronger in tension than tibial cortical bone, and the tensile properties obtained using sub-size specimens closely resemble the tensile properties of full-size cortical specimens. A two-dimensional finite element (FE) model was also applied to simulate the tensile behavior of the sub-size specimens. Good agreement between the experiment and the FE model was obtained for sub-size tensile specimens of cortical bone.

Keywords: Cortical bone, sub-size specimen, full size specimen, finite element modeling.

8013 An Intelligent Human-Computer Interaction System for Decision Support

Authors: Chee Siong Teh, Chee Peng Lim

Abstract:

This paper proposes a novel architecture for developing decision support systems. Unlike conventional decision support systems, the proposed architecture endeavors to reveal the decision-making process such that humans' subjectivity can be incorporated into a computerized system while, at the same time, preserving the capability of the computerized system to process information objectively. A number of techniques used in developing the decision support system are elaborated to make the decision-making process transparent. These include procedures for high-dimensional data visualization, pattern classification, prediction, and evolutionary computational search. An artificial data set is first employed to compare the proposed approach with other methods. A simulated handwritten data set and a real data set on liver disease diagnosis are then employed to evaluate the efficacy of the proposed approach. The results are analyzed and discussed, and the potential of the proposed architecture as a useful decision support system is demonstrated.

Keywords: Interactive evolutionary computation, multivariate data projection, pattern classification, topographic map.

8012 Semi-Automatic Artifact Rejection Procedure Based on Kurtosis, Renyi's Entropy and Independent Component Scalp Maps

Authors: Antonino Greco, Nadia Mammone, Francesco Carlo Morabito, Mario Versaci

Abstract:

Artifact rejection plays a key role in many signal processing applications. Artifacts are disturbances that can occur during signal acquisition and that can alter the analysis of the signals themselves. Our aim is to remove artifacts automatically, in particular from Electroencephalographic (EEG) recordings. A technique for automatic artifact rejection, based on Independent Component Analysis (ICA) for artifact extraction and on some higher order statistics such as kurtosis and Shannon's entropy, was proposed some years ago in the literature. In this paper, we try to enhance this technique by proposing a new method based on Renyi's entropy. The performance of our method was tested and compared to that of the method in the literature, and the former proved to outperform the latter.
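
The sketch below illustrates the kurtosis half of such a pipeline with scikit-learn's FastICA on synthetic multichannel data; the Renyi's entropy criterion and the scalp-map inspection proposed in the paper are omitted. Components whose kurtosis is an outlier relative to the others are treated as artifactual and zeroed before reconstruction.

    # ICA-based artifact rejection using a simple kurtosis criterion (illustrative).
    import numpy as np
    from scipy.stats import kurtosis, zscore
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    n_ch, n_samp = 8, 4000
    eeg = rng.normal(size=(n_samp, n_ch))              # background "EEG"
    blink = np.zeros(n_samp)
    blink[500:540] = 40.0                              # a spiky, high-kurtosis artifact
    eeg += np.outer(blink, rng.random(n_ch))           # project the blink onto all channels

    ica = FastICA(n_components=n_ch, random_state=0)
    sources = ica.fit_transform(eeg)                   # shape (n_samp, n_components)

    k = kurtosis(sources, axis=0)
    bad = np.abs(zscore(k)) > 2.5                      # outlying kurtosis -> artifact component
    sources[:, bad] = 0.0
    cleaned = ica.inverse_transform(sources)
    print("rejected components:", np.where(bad)[0])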

Keywords: Artifact, EEG, Renyi's entropy, kurtosis, independent component analysis.

8011 Process Analysis through Length Consistency

Authors: James E. Ponder

Abstract:

The requirement for consistency in physics can sometimes offer a common ground between disciplines such that their fundamental equations share a common parameter set and mathematical method for equation extraction. The parameter set shared by Relativity and Quantum Wave Mechanics enables an analysis which will be seen to be very straightforward, primarily classical in nature using linear algebra concepts, yet deriving a theoretical estimate of the value of the Gravitational Constant along with dependencies never before known.

Keywords: Gravitational Constant, Physical Consistency, Quantum Mechanics, Relativity.

8010 Investigation on the Stability of Rock Slopes Subjected to Tension Cracks via Limit Analysis

Authors: W. Wu, S. Utili

Abstract:

Based on the kinematic approach of limit analysis, a full set of upper bound solutions for the stability of homogeneous rock slopes subjected to tension cracks is obtained. The generalized Hoek-Brown failure criterion is employed to describe the non-linear strength envelope of rocks. In this paper, critical failure mechanisms are determined for cracks of known depth but unspecified location, cracks of known location but unknown depth, and cracks of unspecified location and depth. It is shown that there is a drop of up to nearly 50% in the stability factors for rock slopes intersected by a tension crack compared with intact ones. Tables and charts of solutions in dimensionless form are presented for ease of use by practitioners.
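
For reference, the generalized Hoek-Brown criterion used as the strength envelope is

    \sigma_1 = \sigma_3 + \sigma_{ci} \left( m_b \frac{\sigma_3}{\sigma_{ci}} + s \right)^{a},

where \sigma_{ci} is the uniaxial compressive strength of the intact rock and m_b, s and a are parameters derived from the Geological Strength Index and the disturbance factor.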

Keywords: Hoek-Brown failure criterion, limit analysis, rock slope, tension cracks.

8009 A Calibration Approach towards Reducing ASM2d Parameter Subsets in Phosphorus Removal Processes

Authors: N. Boontian

Abstract:

A novel calibration approach that aims to reduce the ASM2d parameter subsets and decrease model complexity is presented. This approach does not impose a high computational demand, and it reduces the number of modeling parameters required to achieve ASM calibration by employing a sensitivity and iteration methodology. Parameter sensitivity is a crucial factor, and the iteration methodology enables refinement of the simulation parameter values. During the iteration process, parameter values are determined in descending order of their sensitivities, and the number of iterations required is equal to the number of model parameters in the parameter significance ranking. This approach was applied to the ASM2d model to evaluate EBPR phosphorus removal, and it was successful. The simulation results provide the calibrated parameters YPAO, YPO4, YPHA, qPHA, qPP, μPAO, bPAO, bPP, bPHA, KPS, YA, μAUT, bAUT, KO2 AUT, and KNH4 AUT, which correspond to the available experimental data.

Keywords: ASM2d, calibration approach, iteration methodology, sensitivity, phosphorus removal

8008 Performance Analysis of Self Excited Induction Generator Using Artificial Bee Colony Algorithm

Authors: A. K. Sharma, N. P. Patidar, G. Agnihotri, D. K. Palwalia

Abstract:

This paper presents the steady-state performance analysis of a Self-Excited Induction Generator (SEIG) using the Artificial Bee Colony (ABC) optimization technique. The total admittance of the induction machine is minimized to calculate the frequency and magnetizing reactance corresponding to any rotor speed, load impedance and excitation capacitance. The performance of the SEIG is then calculated using the optimized parameters found. The results obtained by the ABC algorithm are compared with results from a numerical method, and they coincide. This technique proves to be efficient in solving nonlinear constrained optimization problems and in analyzing the performance of the SEIG.

Keywords: Artificial bee colony, Steady state analysis, Self-excited induction generator, Nonlinear constrained optimization.

8007 Replicating Data Objects in Large-scale Distributed Computing Systems using Extended Vickrey Auction

Authors: Samee Ullah Khan, Ishfaq Ahmad

Abstract:

This paper proposes a novel game theoretical technique to address the problem of data object replication in large-scale distributed computing systems. The proposed technique draws inspiration from computational economic theory and employs the extended Vickrey auction. Specifically, players in a non-cooperative environment compete for scarce server-side memory space to replicate data objects so as to minimize the total network object transfer cost, while maintaining object concurrency. Optimization of such a cost in turn leads to load balancing, fault tolerance and reduced user access time. The method is experimentally evaluated against four well-known techniques from the literature: branch and bound, greedy, bin-packing and genetic algorithms. The experimental results reveal that the proposed approach outperforms the four techniques in both execution time and solution quality.
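
A minimal sketch of the single-item Vickrey (second-price) rule that the mechanism builds on is given below, with hypothetical site names and valuations; the paper's extended, multi-object variant and its actual payment rule are not reproduced. The highest bidder wins but pays the second-highest bid, which makes truthful bidding a dominant strategy.

    # Single-item Vickrey (second-price sealed-bid) auction for a replica slot (illustrative).
    def vickrey_winner(bids):
        """bids: dict mapping site name -> reported valuation of hosting the replica."""
        ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
        winner, _ = ranked[0]
        price = ranked[1][1] if len(ranked) > 1 else 0.0   # winner pays the second-highest bid
        return winner, price

    bids = {"site_A": 14.0, "site_B": 11.5, "site_C": 9.0}   # hypothetical valuations
    print(vickrey_winner(bids))       # -> ('site_A', 11.5)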

Keywords: Auctions, data replication, pricing, static allocation.
