Search results for: soft error rates
1138 Design of Measurement Interface and System for Ion Sensors
Authors: Jung-Chuan Chou, Chang-Chi Lee
Abstract:
A measurement system was successfully fabricated in this study to detect hydrogen and chlorine ion concentrations. The PIC18F4520 microcontroller was used as the control unit of the measurement system. The system was applied to sense H+ and Cl- in different samples, and the pH and pCl values were displayed promptly on a real-time LCD. The measurement method judges whether the response voltage is stable: for H+ measurement, the change between the present response voltage and the next response voltage must be smaller than 0.01%, and this condition is established in only 6 s. Likewise, for Cl- measurement the change between consecutive response voltages must be smaller than 0.01%, and this condition is established in only 5 s. Furthermore, the average error quantities were also considered; they are 0.05 and 0.07 for the measurements of pH and pCl values, respectively.
Keywords: Chlorine ion sensor, hydrogen ion sensor, microcontroller, response voltage.
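A minimal sketch of the stability check described in the abstract above, assuming the response voltage is sampled once per second; the 0.01% threshold and the 6 s / 5 s hold times come from the abstract, while the function and variable names are illustrative:

def is_stable(voltages, rel_threshold=1e-4, hold_samples=6):
    # Return True once consecutive readings differ by less than
    # rel_threshold (0.01%) for hold_samples successive comparisons.
    consecutive = 0
    for prev, curr in zip(voltages, voltages[1:]):
        if prev != 0 and abs(curr - prev) / abs(prev) < rel_threshold:
            consecutive += 1
            if consecutive >= hold_samples:
                return True
        else:
            consecutive = 0
    return False

# H+ readings use a 6-sample window; Cl- readings would use hold_samples=5.
readings = [2.0031, 2.0030, 2.0030, 2.0030, 2.0030, 2.0030, 2.0030, 2.0030]
print(is_stable(readings, hold_samples=6))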
1137 A Tool for Audio Quality Evaluation Under Hostile Environment
Authors: Akhil Kumar Arya, Jagdeep Singh Lather, Lillie Dewan
Abstract:
The aim of this paper is to evaluate audio and speech quality with the help of a digital audio watermarking technique under different types of attacks (signal impairments) such as Gaussian noise, compression error and jittering effect. These attacks together are considered a hostile environment. Audio and speech quality evaluation is an important research topic. The traditional way of evaluating speech quality is subjective testing. Subjective tests are reliable, but very expensive, time consuming, and cannot be used in certain applications such as online monitoring. Objective models, based on human perception, were developed to predict the results of subjective tests. The existing objective methods require either the original speech or a complicated computation model, which makes some applications of quality evaluation impossible.
Keywords: Digital Watermarking, DCT, Speech Quality, Attacks.
1136 Optimization of the Characteristic Straight Line Method by a "Best Estimate" of Observed, Normal Orthometric Elevation Differences
Authors: Mahmoud M. S. Albattah
Abstract:
In this paper, to optimize the "Characteristic Straight Line Method", which is used in soil displacement analysis, a "best estimate" of the geodetic leveling observations has been achieved by taking into account the concept of height systems. This concept, and consequently the concept of "height", has been discussed in detail. In landslide dynamic analysis, the soil is considered as a mosaic of rigid blocks. The soil displacement has been monitored and analyzed by using the "Characteristic Straight Line Method"; its characteristic components have been defined and constructed from a "best estimate" of the topometric observations. In measuring the elevation differences, we have used the most modern leveling equipment available. Observational procedures have also been designed to provide the most effective method of acquiring data. In addition, systematic errors which cannot be sufficiently controlled by instrumentation or observational techniques are minimized by applying appropriate corrections to the observed data: the level collimation correction minimizes the error caused by non-horizontality of the leveling instrument's line of sight for unequal sight lengths, the refraction correction is modeled to minimize the refraction error caused by temperature (density) variation of air strata, the rod temperature correction accounts for variation in the length of the leveling rod's Invar/LO-VAR® strip which results from temperature changes, the rod scale correction ensures a uniform scale which conforms to the international length standard, and the concept of height systems is introduced, in which all types of height (orthometric, dynamic, normal, gravity correction, and equipotential surface) have been investigated. The "Characteristic Straight Line Method" is slightly more convenient than the "Characteristic Circle Method": it permits the evaluation of a displacement of very small magnitude, even when the displacement is an infinitesimal quantity. The inclination of the landslide is given by the inverse of the distance from the reference point O to the "Characteristic Straight Line"; its direction is given by the bearing of the normal directed from point O to the Characteristic Straight Line (Fig. 6). A "best estimate" of the topometric observations was used to measure the elevation of carefully selected points, before and after the deformation. Gross errors have been eliminated by statistical analyses and by comparing the heights within local neighborhoods. The results of a test using an area where very interesting land surface deformation occurs are reported. Monitoring with different options and a qualitative comparison of results based on a sufficient number of check points are presented.
Keywords: Characteristic straight line method, dynamic height, landslides, orthometric height, systematic errors.
1135 Feature Subset Selection approach based on Maximizing Margin of Support Vector Classifier
Authors: Khin May Win, Nan Sai Moon Kham
Abstract:
Identification of cancer genes that might anticipate the clinical behaviors of different types of cancer is challenging due to the huge number of genes and the small number of patient samples. A new method is proposed based on supervised classification learning with support vector machines (SVMs). A new solution is described by the introduction of the Maximized Margin (MM) in the subset criterion, which permits approaching the least generalization error rate. In class prediction problems, gene selection is essential to improve the accuracy and to identify genes relevant to the cancer disease. The performance of the new method was evaluated with a real-world data experiment. It can give better accuracy for classification.
Keywords: Microarray data, feature selection, recursive feature elimination, support vector machines.
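The margin-based gene selection outlined above is in the same family as SVM recursive feature elimination; a minimal sketch with scikit-learn on placeholder data (not the authors' exact MM criterion or microarray data) is:

from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.svm import SVC

# Placeholder data standing in for a microarray matrix (samples x genes).
X, y = make_classification(n_samples=40, n_features=200, n_informative=10,
                           random_state=0)
selector = RFE(SVC(kernel="linear", C=1.0), n_features_to_select=10, step=0.1)
selector.fit(X, y)
print("selected gene indices:", list(selector.get_support(indices=True)))
print("training accuracy:", selector.score(X, y))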
1134 Fifth Order Variable Step Block Backward Differentiation Formulae for Solving Stiff ODEs
Authors: S.A.M. Yatim, Z.B. Ibrahim, K.I. Othman, F. Ismail
Abstract:
An implicit block method based on the backward differentiation formulae (BDF) for the solution of stiff initial value problems (IVPs) using variable step size is derived. We construct a variable step size block method which stores all the coefficients of the method, with a simplified strategy for controlling the step size with the intention of optimizing the performance in terms of precision and computation time. The strategy involves keeping the step size constant, halving it, or increasing it by a factor of 1.9 relative to the previous step size. The decision to change the step size is determined by the local truncation error (LTE). Numerical results are provided to support the enhancement of the method applied.
Keywords: Backward differentiation formulae, block backward differentiation formulae, stiff ordinary differential equation, variable step size.
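A minimal sketch of the step-size decision described above; the acceptance rule and safety factor are illustrative assumptions, and the BDF corrector itself is not shown:

def next_step_size(h, lte, tol, safety=0.8):
    # Keep, halve, or enlarge the step size based on the local
    # truncation error (LTE) measured against the tolerance tol.
    if lte > tol:
        return 0.5 * h, False          # step rejected: retry with h/2
    if lte < safety * tol / 1.9:
        return 1.9 * h, True           # very accurate: enlarge by 1.9
    return h, True                     # accepted: keep the step size

h, accepted = next_step_size(h=0.01, lte=2.0e-8, tol=1.0e-6)
print(h, accepted)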
1133 Stability Bound of Ruin Probability in a Reduced Two-Dimensional Risk Model
Authors: Zina Benouaret, Djamil Aissani
Abstract:
In this work, we introduce the qualitative and quantitative concepts of the strong stability method in a risk process modeling two lines of business of the same insurance company, or an insurance company and a reinsurance company, that divide both claims and premiums between them in a certain proportion. The proposed approach is based on the identification of the ruin probability associated with the considered model with the stationary distribution of a Markov random process called a reversed process. Our objective, after clarifying the conditions and the perturbation domain of the parameters, is to obtain the stability inequality of the ruin probability, which is applied to estimate the approximation error between a model with perturbed parameters and the considered model. In the stability bound obtained, all constants are written explicitly.
Keywords: Markov chain, risk models, ruin probabilities, strong stability analysis.
1132 Low Complexity Regular LDPC codes for Magnetic Storage Devices
Authors: Gabofetswe Malema, Michael Liebelt
Abstract:
LDPC codes could be used in magnetic storage devices because of their better decoding performance compared to other error correction codes. However, their hardware implementation results in large and complex decoders. This is one of the main obstacles to incorporating LDPC decoders in magnetic storage devices. We construct small, high-girth, high-rate, column-weight 2 codes from cage graphs. Though these codes have lower performance compared to higher column weight codes, they are easier to implement. The ease of implementation makes them more suitable for applications such as magnetic recording. Cages are the smallest known regular distance graphs, which give us the smallest known column-weight 2 codes for a given size, girth and rate of the code.
Keywords: Structured LDPC codes, cage graphs.
1131 Cost of Road Traffic Accidents in Egypt
Authors: Mohamed A. Ismail, Samar M. M. Abdelmageed
Abstract:
The main objective of this paper is to estimate the cost of road traffic accidents in Egypt. The Human Capital (HC) approach, specifically the Gross-Loss-of-Output methodology, is adopted for estimation. Moreover, cost values obtained by previous national literature are updated using the inflation rates. The results indicate an estimated cost of road traffic accidents in Egypt of approximately 10 billion Egyptian Pounds (about $US 1.8 billion) for the year 2008. In addition, it is expected that this cost will rise in 2009 to 11.8 billion Egyptian Pounds (about $US 2.1 billion).
Keywords: Cost, Gross-Loss-of-Output, Human Capital Approach, Road Traffic Accidents.
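A minimal sketch of the inflation-based updating of earlier cost estimates mentioned in the abstract above, using entirely hypothetical figures (the abstract does not report the underlying base year or rates):

def update_cost(base_cost, annual_inflation_rates):
    # Compound a historical cost estimate forward, year by year.
    cost = base_cost
    for rate in annual_inflation_rates:
        cost *= (1.0 + rate)
    return cost

# Hypothetical: an earlier estimate in Egyptian Pounds carried forward two years.
print(update_cost(8.5e9, [0.09, 0.11]))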
1130 Mathematical Modelling of Single Phase Unity Power Factor Boost Converter
Authors: Sanjay L. Kurkute, Pradeep M. Patil, Kakasaheb C. Mohite
Abstract:
An optimal control strategy based on a simple model of a single phase unity power factor boost converter is presented, with an evaluation of first order differential equations. This paper presents an evaluation of a single phase boost converter having power factor correction. A simple discrete model of the boost converter is formed and optimal control is obtained; a digital PI controller is adopted to adjust the control error. The method of instantaneous current control is proposed in this paper for its good tracking performance and dynamic response. The simulation and experimental results verified our design.
Keywords: Single phase, boost converter, Power factor correction (PFC), Pulse Width Modulation (PWM).
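A minimal sketch of the digital PI correction of the control error mentioned above; the gains, sampling time, and the controlled quantity are illustrative assumptions, not the paper's design:

class DiscretePI:
    # Simple digital PI controller: u[k] = Kp*e[k] + Ki*Ts*sum(e[0..k]).
    def __init__(self, kp, ki, ts):
        self.kp, self.ki, self.ts = kp, ki, ts
        self.integral = 0.0

    def update(self, error):
        self.integral += error * self.ts
        return self.kp * error + self.ki * self.integral

pi = DiscretePI(kp=0.5, ki=50.0, ts=1e-4)
# One control step: drive the (hypothetical) inductor-current error toward zero.
print(pi.update(error=0.2))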
1129 Development of a Real Time Axial Force Measurement System and IoT-Based Monitoring for Smart Bearing
Authors: Hassam Ahmed, Yuanzhi Liu, Yassine Selami, Wei Tao, Hui Zhao
Abstract:
The purpose of this research is to develop a real time axial force measurement system for a smart bearing through the use of strain gauges, whereby the data acquisition is performed by an Arduino microcontroller owing to its easy manipulation and low cost. The measured signal is acquired and then discretized using a Wheatstone bridge and an analog-to-digital converter (ADC), respectively. For bearing monitoring, a real time monitoring system based on the Internet of Things (IoT) and Bluetooth was developed. Experimental tests were performed on a bearing within a force range up to 600 kN. The experimental results show that there is a proportional linear relationship between the applied force and the output voltage, with an R-squared value of 0.9878 from the regression analysis.
Keywords: Bearing, force measurement, IoT, strain gauge.
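A minimal sketch of the force/voltage calibration implied above: a least-squares line fitted to hypothetical strain-gauge readings, with the coefficient of determination reported (the values are illustrative, not the paper's data):

import numpy as np

force_kN = np.array([0, 100, 200, 300, 400, 500, 600], dtype=float)
voltage_mV = np.array([0.02, 1.05, 2.01, 3.10, 4.02, 5.08, 6.03])  # hypothetical

slope, intercept = np.polyfit(force_kN, voltage_mV, 1)
predicted = slope * force_kN + intercept
ss_res = np.sum((voltage_mV - predicted) ** 2)
ss_tot = np.sum((voltage_mV - voltage_mV.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(f"V = {slope:.4f}*F + {intercept:.4f}, R2 = {r_squared:.4f}")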
1128 Lifting Wavelet Transform and Singular Values Decomposition for Secure Image Watermarking
Authors: Siraa Ben Ftima, Mourad Talbi, Tahar Ezzedine
Abstract:
In this paper, we present a technique for secure watermarking of grayscale and color images. This technique consists in applying the Singular Value Decomposition (SVD) in the LWT (Lifting Wavelet Transform) domain in order to insert the watermark image (grayscale) into the host image (grayscale or color image). It also uses a signature in the embedding and extraction steps. The technique is applied to a number of grayscale and color images. The performance of this technique is demonstrated by the PSNR (Peak Signal to Noise Ratio), the MSE (Mean Square Error) and the SSIM (structural similarity) computations.
Keywords: Color image, grayscale image, singular values decomposition, lifting wavelet transform, image watermarking, watermark, secure.
1127 Morphological Analysis of English L1-Persian L2 Adult Learners’ Interlanguage: From the Perspective of SLA Variation
Authors: Maassoumeh Bemani Naeini
Abstract:
Studies on interlanguage have long been engaged in describing the phenomenon of variation in SLA. Pursuing the same goal, and particularly addressing the role of linguistic features, this study describes the use of Persian morphology in the interlanguage of two adult English-speaking learners of Persian L2. Taking the general approach of a combination of contrastive analysis, error analysis and interlanguage analysis, this study focuses on the identification and prediction of possible instances of transfer from English L1 to Persian L2 across six elicitation tasks, aiming to investigate whether any contextual features may variably influence the learners' order of morpheme accuracy in the areas of copula, possessives, articles, demonstratives, plural form, personal pronouns, and genitive cases. The results describe the existence of task variation in the interlanguage system of Persian L2 learners.
Keywords: English L1, Interlanguage Analysis, Persian L2, SLA variation.
1126 Predictability of the Two Commonly Used Models to Represent the Thin-layer Re-wetting Characteristics of Barley
Authors: M. A. Basunia
Abstract:
Thirty-three re-wetting tests were conducted with barley at different combinations of temperature (5.7-46.3 °C) and relative humidity (48.2-88.6%). The two most commonly used thin-layer drying and re-wetting models, i.e. the Page and Diffusion models, were compared for their ability to fit the experimental re-wetting data based on the standard error of estimate (SEE) of the measured and simulated moisture contents. The comparison shows that both the Page and Diffusion models fit the experimental re-wetting data of barley well. The average SEE values for the Page and Diffusion models were 0.176% d.b. and 0.199% d.b., respectively. The Page and Diffusion models were found to be the most suitable equations to describe the thin-layer re-wetting characteristics of barley over a typical five-day re-wetting period. These two models can be used for the simulation of deep-bed re-wetting of barley occurring during ventilated storage and deep-bed drying.
Keywords: Thin-layer, barley, re-wetting parameters, temperature, relative humidity.
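A minimal sketch of fitting the standard Page thin-layer equation MR = exp(-k*t^n) to re-wetting data and computing a standard error of estimate; the moisture-ratio observations and the SEE definition used here are illustrative assumptions, not the paper's data or exact formula:

import numpy as np
from scipy.optimize import curve_fit

def page_model(t, k, n):
    # Page thin-layer equation for the moisture ratio.
    return np.exp(-k * t ** n)

t_hours = np.array([6, 12, 24, 48, 72, 96, 120], dtype=float)
mr_obs = np.array([0.90, 0.82, 0.68, 0.49, 0.36, 0.27, 0.21])   # hypothetical

(k, n), _ = curve_fit(page_model, t_hours, mr_obs, p0=(0.05, 1.0))
mr_fit = page_model(t_hours, k, n)
see = np.sqrt(np.sum((mr_obs - mr_fit) ** 2) / (len(mr_obs) - 2))
print(f"k = {k:.4f}, n = {n:.3f}, SEE = {see:.4f}")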
1125 The Performance Improvement of the Target Position Determining System in Laser Tracking Based on 4Q Detector using Neural Network
Authors: A. Salmanpour, Sh. Mohammad Nejad
Abstract:
One of the methods for detecting the target position error in laser tracking systems is the use of four-quadrant (4Q) detectors. If the coordinates of the target center are obtained through the usual relations between the detector outputs, the results are nonlinear and dependent on the shape and size of the target and its position on the detector screen. In this paper, we have designed a neural-network-based algorithm in which the coordinates of the target center in laser tracking systems are calculated from detector outputs obtained through visual modeling. With this method, the results, except for the part related to the detector's intrinsic limitation, are linear and independent of the shape and size of the target.
Keywords: Four quadrant detector, laser tracking system, rangefinder, tracking sensor.
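For reference, the "usual relations" the abstract refers to for a four-quadrant detector are commonly written as below; the quadrant labelling and the sample values are assumptions for illustration:

def spot_center(q1, q2, q3, q4):
    # Standard 4Q estimate of the spot center from the four quadrant signals
    # (q1: upper-right, q2: upper-left, q3: lower-left, q4: lower-right).
    total = q1 + q2 + q3 + q4
    x = ((q1 + q4) - (q2 + q3)) / total
    y = ((q1 + q2) - (q3 + q4)) / total
    return x, y

print(spot_center(0.35, 0.25, 0.15, 0.25))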
1124 Application of BP Neural Network Model in Sports Aerobics Performance Evaluation
Authors: Shuhe Shao
Abstract:
This article provides a partial evaluation index system for sports aerobics and its standards, including the following 12 indexes: health vitality, coordination, flexibility, accuracy, pace, endurance, elasticity, self-confidence, form, control, uniformity and musicality. A three-layer BP artificial neural network model comprising an input layer, a hidden layer and an output layer is established. The result shows that the model can well reflect the non-linear relationship between the performance on the 12 indexes and the overall performance. The predicted value of each sample is very close to the true value, with a relative error fluctuating around 5%, and the network training is successful. This shows that the BP network has high prediction accuracy and good generalization capacity when applied to sports aerobics performance evaluation after effective training.
Keywords: BP neural network, sports aerobics, performance, evaluation.
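A minimal sketch of a three-layer network of the kind described above (12 input indexes, one hidden layer, one overall score), here built with scikit-learn on random placeholder data rather than the paper's evaluation data:

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.random((60, 12))                                # 12 evaluation indexes per sample
y = X.mean(axis=1) + 0.05 * rng.standard_normal(60)     # placeholder overall score

net = MLPRegressor(hidden_layer_sizes=(8,), activation="logistic",
                   solver="lbfgs", max_iter=5000, random_state=0)
net.fit(X, y)
pred = net.predict(X)
print("mean relative error:", np.mean(np.abs(pred - y) / np.abs(y)))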
1123 Application New Approach with Two Networks Slow and Fast on the Asynchronous Machine
Authors: Samia Salah, M’hamed Hadj Sadok, Abderrezak Guessoum
Abstract:
In this paper, we propose a new modular approach called neuroglial, consisting of two neural networks, one slow and one fast, which emulates a recently discovered biological reality. The implementation targets complex multi-time-scale systems; validation is performed on the model of the asynchronous machine. We applied the geometric approach based on the Gerschgorin circles for the decoupling of fast and slow variables, and the singular perturbation method for the development of reduced models.
This new architecture allows for smaller networks with less complexity and better performance in terms of mean square error and convergence than the single network model.
Keywords: Gerschgorin's Circles, Neuroglial Network, Multi-time-scale Systems, Singular Perturbation Method.
1122 A Fuzzy Linear Regression Model Based on Dissemblance Index
Authors: Shih-Pin Chen, Shih-Syuan You
Abstract:
Fuzzy regression models are useful for investigating the relationship between explanatory variables and responses in fuzzy environments. To overcome the deficiencies of previous models and increase the explanatory power of fuzzy data, the graded mean integration (GMI) representation is applied to determine representative crisp regression coefficients. A fuzzy regression model is then constructed based on the modified dissemblance index (MDI), which can precisely measure the actual total error. Compared with previous studies, the results from commonly used test examples show that the proposed fuzzy linear regression model, based on the MDI and a distance criterion, has higher explanatory power and forecasting accuracy.
Keywords: Dissemblance index, fuzzy linear regression, graded mean integration, mathematical programming.
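A minimal sketch of the graded mean integration step mentioned above, defuzzifying triangular fuzzy numbers with the usual GMI formula (a + 4b + c)/6 and then fitting crisp least-squares coefficients; the fuzzy observations are hypothetical and the workflow is a simplification of the paper's MDI-based model:

import numpy as np

def gmi(tri):
    # Graded mean integration representation of a triangular
    # fuzzy number (a, b, c): (a + 4b + c) / 6.
    a, b, c = tri
    return (a + 4 * b + c) / 6.0

x_fuzzy = [(1, 2, 3), (2, 3, 4), (4, 5, 6), (6, 7, 8)]
y_fuzzy = [(2, 4, 6), (5, 6, 8), (9, 10, 12), (13, 14, 16)]
x = np.array([gmi(v) for v in x_fuzzy])
y = np.array([gmi(v) for v in y_fuzzy])
b1, b0 = np.polyfit(x, y, 1)
print(f"y = {b0:.3f} + {b1:.3f} x")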
1121 Investigating the Process Kinetics and Nitrogen Gas Production in Anammox Hybrid Reactor with Special Emphasis on the Role of Filter Media
Authors: Swati Tomar, Sunil Kumar Gupta
Abstract:
Anammox is a novel and promising technology that has changed the traditional concept of biological nitrogen removal. The process facilitates direct oxidation of ammoniacal nitrogen under anaerobic conditions with nitrite as the electron acceptor, without the addition of external carbon sources. The present study investigated the feasibility of an Anammox Hybrid Reactor (AHR), combining the dual advantages of suspended and attached growth media, for the biodegradation of ammoniacal nitrogen in wastewater. The experimental unit consisted of four 5 L AHRs inoculated with a mixed seed culture containing anoxic and activated sludge (1:1). The process was established by feeding the reactors with synthetic wastewater containing NH4-N and NO2-N in the ratio 1:1 at a hydraulic retention time (HRT) of 1 day. The reactors were gradually acclimated to higher ammonium concentrations until pseudo steady state removal was attained at a total nitrogen concentration of 1200 mg/L. During this period, the performance of the AHR was monitored at twelve different HRTs varying from 0.25 to 3.0 d, with the nitrogen loading rate (NLR) increasing from 0.4 to 4.8 kg N/m3.d. The AHR demonstrated significantly higher nitrogen removal (95.1%) at the optimal HRT of 1 day. The filter media in the AHR contributed an additional 27.2% ammonium removal, in addition to a 72% reduction in the sludge washout rate. This may be attributed to the functional mechanism of the filter media, which acts as a mechanical sieve and reduces the sludge washout rate many-fold. This enhances the biomass retention capacity of the reactor by 25%, which is the key parameter for the successful operation of high rate bioreactors. The effluent nitrate concentration, which is one of the bottlenecks of the anammox process, was also minimised significantly (42.3-52.3 mg/L). The process kinetics was evaluated using first order and Grau second order models. The first-order substrate removal rate constant was found to be 13.0 d-1. Model validation revealed that the Grau second order model was more precise and predicted the effluent nitrogen concentration with the least error (1.84±10%). A new mathematical model based on mass balance was developed to predict N2 gas production in the AHR. The mass balance model derived from total nitrogen showed a significantly higher correlation (R2=0.986) and predicted N2 gas with the least error of precision (0.12±8.49%). SEM study of the biomass indicated the presence of a heterogeneous population of cocci and rod shaped bacteria with average diameters varying from 1.2-1.5 mm. Owing to the enhanced nitrogen removal efficiency coupled with meagre production of effluent nitrate and its ability to retain high biomass, the AHR proved to be the most competitive reactor configuration for dealing with nitrogen laden wastewater.
Keywords: Anammox, filter media, kinetics, nitrogen removal.
1120 Target and Equalizer Design for Perpendicular Heat-Assisted Magnetic Recording
Authors: P. Tueku, P. Supnithi, R. Wongsathan
Abstract:
Heat-Assisted Magnetic Recording (HAMR) is one of the leading technologies identified to enable areal densities beyond 1 Tb/in2 in magnetic recording systems. Key challenges in HAMR design are the accuracy of positioning, the timing of the firing laser, the power of the laser, the thermo-magnetic head, the head-disk interface and the cooling system. We study the effect of HAMR parameters on the transition center and transition width. The HAMR channel is modeled using the Thermal Williams-Comstock (TWC) equation and the microtrack model. The target and equalizer are designed using the minimum mean square error (MMSE) criterion. The results show that the unit energy constraint outperforms the other constraints.
Keywords: Heat-Assisted Magnetic Recording, Thermal Williams-Comstock equation, Microtrack model, Equalizer.
1119 An Optimal Algorithm for Finding (r, Q) Policy in a Price-Dependent Order Quantity Inventory System with Soft Budget Constraint
Authors: S. Hamid Mirmohammadi, Shahrazad Tamjidzad
Abstract:
This paper is concerned with a single-item continuous review inventory system in which demand is stochastic and discrete. The budget consumed for purchasing the ordered items is not restricted, but an extra cost is incurred when it exceeds a specific value. The unit purchasing price depends on the quantity ordered under an all-units discount cost structure. In many actual systems, the budget, as a resource occupied by the purchased items, is limited, and the system is able to confront the resource shortage by incurring additional costs. Thus, considering the resource shortage costs as a part of the system costs, especially when the amount of resource occupied by the purchased items is influenced by quantity discounts, is well motivated by practical concerns. In this paper, an optimization problem is formulated for finding the optimal (r, Q) policy when the system is influenced by the budget limitation and discount pricing simultaneously. Properties of the cost function are investigated, and an algorithm based on a one-dimensional search procedure is then proposed for finding an optimal (r, Q) policy which minimizes the expected system costs.
Keywords: (r, Q) policy, stochastic demand, backorders, limited resource, quantity discounts.
1118 Application of EEG Wavelet Power to Prediction of Antidepressant Treatment Response
Authors: Dorota Witkowska, Paweł Gosek, Lukasz Swiecicki, Wojciech Jernajczyk, Bruce J. West, Miroslaw Latka
Abstract:
In clinical practice, the selection of an antidepressant often degrades into lengthy trial-and-error. In this work we employ the normalized wavelet power of alpha waves as a biomarker of antidepressant treatment response. This novel EEG metric takes into account both the non-stationarity and the intersubject variability of alpha waves. We recorded resting, 19-channel EEG (eyes closed) in 22 inpatients suffering from unipolar (UD, n=10) or bipolar (BD, n=12) depression. The EEG measurement was done at the end of the short washout period which followed previously unsuccessful pharmacotherapy. The normalized alpha wavelet power of the 11 responders was markedly different from that of the 11 nonresponders at several, mostly temporoparietal, sites. Using the prediction of treatment response based on the normalized alpha wavelet power, we achieved 81.8% sensitivity and 81.8% specificity for channel T4.
Keywords: Alpha waves, antidepressant, treatment outcome, wavelet.
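A minimal sketch of the kind of threshold-based prediction evaluated above, computing sensitivity and specificity for a single channel; the biomarker values are random placeholders, not the study's recordings:

import numpy as np

def sensitivity_specificity(biomarker, responded, threshold):
    # Predict response when the normalized alpha wavelet power exceeds
    # a threshold, then score the prediction against the true outcome.
    predicted = biomarker > threshold
    tp = np.sum(predicted & responded)
    tn = np.sum(~predicted & ~responded)
    fp = np.sum(predicted & ~responded)
    fn = np.sum(~predicted & responded)
    return tp / (tp + fn), tn / (tn + fp)

rng = np.random.default_rng(1)
power = np.concatenate([rng.normal(0.6, 0.1, 11), rng.normal(0.4, 0.1, 11)])
outcome = np.array([True] * 11 + [False] * 11)
print(sensitivity_specificity(power, outcome, threshold=0.5))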
1117 The Interaction between Human and Environment on the Perspective of Environmental Ethics
Authors: Mella Ismelina Farma Rahayu
Abstract:
Environmental problems cannot be separated from unethical human perspectives and behaviors toward the environment. There is a fundamental error in the philosophy of people's perspective on humans and nature and their relationship with the environment, which in turn creates inappropriate behavior in relation to the environment. The aim of this study is to investigate and to understand the ethics of the environment in the context of humans interacting with the environment, using the hermeneutic approach. The related theories and concepts collected from a literature review are used as data, which were analyzed by using interpretation, critical evaluation, internal coherence, comparisons, and heuristic techniques. As a result of this study, a picture emerges of the interaction of human and environment from the perspective of environmental ethics, as well as the problems of the value of ecological justice in the interaction of humans and the environment. We suggest that the interaction between humans and the environment needs to be based on environmental ethics, in a spirit of mutual respect between humans and the natural world.
Keywords: The environment, environmental ethics, the interaction, value.
1116 Reliability Based Performance Evaluation of Stone Column Improved Soft Ground
Authors: A. GuhaRay, C. V. S. P. Kiranmayi, S. Rudraraju
Abstract:
The present study considers the effect of the variation of different geotechnical random variables in the design of stone column-foundation systems for assessing the bearing capacity and consolidation settlement of highly compressible soil. The soil and stone column properties, and the spacing, diameter and arrangement of the stone columns, are considered as the random variables. The probability of failure (Pf) is computed for a target degree of consolidation and a target safe load by Monte Carlo Simulation (MCS). The study shows that the variations in the coefficient of radial consolidation (cr) and the cohesion of the soil (cs) are the two most important factors influencing Pf. If the coefficient of variation (COV) of cr exceeds 20%, Pf exceeds 0.001, which is unsafe according to the guidelines of the US Army Corps of Engineers. The bearing capacity also exceeds its safe value for a COV of cs > 30%. It is also observed that as the spacing between the stone columns increases, the probability of reaching a target degree of consolidation decreases. Accordingly, design guidelines, considering both the consolidation and the bearing capacity of the improved ground, are proposed for different spacings and diameters of stone columns and geotechnical random variables.
Keywords: Bearing capacity, consolidation, geotechnical random variables, probability of failure, stone columns.
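A minimal sketch of Monte Carlo estimation of a probability of failure of the kind used above, with a toy limit state (achieved degree of consolidation below a target) and a lognormally distributed cr; the distribution parameters and the response relationship are illustrative assumptions, not the paper's model:

import numpy as np

def prob_failure(n_samples=100_000, target_u=0.90, cr_mean=3.0, cr_cov=0.20):
    # Estimate Pf = P(degree of consolidation < target) by Monte Carlo,
    # treating the coefficient of radial consolidation cr as lognormal.
    rng = np.random.default_rng(42)
    sigma = np.sqrt(np.log(1.0 + cr_cov ** 2))
    mu = np.log(cr_mean) - 0.5 * sigma ** 2
    cr = rng.lognormal(mu, sigma, n_samples)
    u = 1.0 - np.exp(-0.8 * cr)        # toy placeholder response
    return np.mean(u < target_u)

print(prob_failure(cr_cov=0.20))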
1115 Coherence Analysis between Respiration and PPG Signal by Bivariate AR Model
Authors: Yue-Der Lin, Wei-Ting Liu, Ching-Che Tsai, Wen-Hsiu Chen
Abstract:
PPG is a potential tool for clinical applications. Among these, the relationship between respiration and the PPG signal has attracted attention in past decades. In this research, a bivariate AR spectral estimation method was utilized for the coherence analysis between these two signals. Ten healthy subjects participated in this research, with signals measured at different respiratory rates. The results demonstrate that high coherence exists between respiration and the PPG signal, whereas the coherence disappears in breath-holding experiments. These results imply that the PPG signal reveals respiratory information. The utilized method may provide an attractive alternative approach for related research.
Keywords: Coherence analysis, photoplethysmography (PPG), bivariate AR spectral estimation.
1114 Comparison of Imputation Techniques for Efficient Prediction of Software Fault Proneness in Classes
Authors: Geeta Sikka, Arvinder Kaur Takkar, Moin Uddin
Abstract:
Missing data is a persistent problem in almost all areas of empirical research. Missing data must be treated very carefully, as data plays a fundamental role in every analysis. Improper treatment can distort the analysis or generate biased results. In this paper, we compare and contrast various imputation techniques on data sets with missing values and make an empirical evaluation of these methods so as to construct quality software models. Our empirical study is based on two of NASA's public datasets, KC4 and KC1. The actual data sets, of 125 cases and 2107 cases respectively and without any missing values, were considered. These data sets were used to create Missing At Random (MAR) data. Listwise Deletion (LD), Mean Substitution (MS), Interpolation, Regression with an error term, and Expectation-Maximization (EM) approaches were used to compare the effects of the various techniques.
Keywords: Missing data, imputation, missing data techniques.
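A minimal sketch of two of the simpler techniques compared above (listwise deletion and mean substitution) applied to an artificially created missing-at-random column; the small table is a placeholder, not the NASA KC1/KC4 data:

import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
df = pd.DataFrame({"loc": rng.integers(10, 500, 20).astype(float),
                   "faults": rng.integers(0, 5, 20)})
# Create Missing-At-Random values in one predictor.
df.loc[rng.choice(df.index, 5, replace=False), "loc"] = np.nan

listwise = df.dropna()                               # Listwise Deletion (LD)
mean_sub = df.fillna({"loc": df["loc"].mean()})      # Mean Substitution (MS)
print(len(df), len(listwise), mean_sub["loc"].isna().sum())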
1113 Arabic Character Recognition Using Regression Curves with the Expectation Maximization Algorithm
Authors: Abdullah A. AlShaher
Abstract:
In this paper, we demonstrate how regression curves can be used to recognize 2D non-rigid handwritten shapes. Each shape is represented by a set of non-overlapping, uniformly distributed landmarks. The underlying models utilize second order polynomials to model the shapes within a training set. To estimate the regression models, we need to extract the coefficients which describe the variations for a shape class; hence, a least squares method is used to estimate such models. We then proceed by training these coefficients using the Expectation Maximization algorithm. Recognition is carried out by finding the least landmark displacement error with respect to the model curves. Handwritten isolated Arabic characters are used to evaluate our approach.
Keywords: Shape recognition, Arabic handwritten characters, regression curves, expectation maximization algorithm.
1112 Wavelet-Based ECG Signal Analysis and Classification
Authors: Madina Hamiane, May Hashim Ali
Abstract:
This paper presents the processing and analysis of ECG signals. The study is based on the wavelet transform and uses exclusively the MATLAB environment. The study includes removing baseline wander and further de-noising through the wavelet transform; metrics such as the signal-to-noise ratio (SNR), peak signal-to-noise ratio (PSNR) and mean squared error (MSE) are used to assess the efficiency of the de-noising techniques. Feature extraction is subsequently performed, whereby signal features such as the heart rate and rise and fall levels are extracted and the QRS complex is detected, which helps in classifying the ECG signal. Classification is the last step in the analysis of the ECG signals, and it is shown that these are successfully classified as normal rhythm or abnormal rhythm. The final result proved the adequacy of using the wavelet transform for the analysis of ECG signals.
Keywords: ECG Signal, QRS detection, thresholding, wavelet decomposition, feature extraction.
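A minimal sketch of wavelet de-noising together with the quality metrics listed above, written in Python with PyWavelets rather than the paper's MATLAB environment and applied to a synthetic noisy signal; the wavelet choice, decomposition level, and threshold rule are illustrative assumptions:

import numpy as np
import pywt

t = np.linspace(0, 1, 1024)
clean = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)
noisy = clean + 0.2 * np.random.default_rng(0).standard_normal(t.size)

coeffs = pywt.wavedec(noisy, "db4", level=4)
thr = 0.2 * np.sqrt(2 * np.log(noisy.size))          # universal-style threshold
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")[: noisy.size]

mse = np.mean((clean - denoised) ** 2)
snr = 10 * np.log10(np.sum(clean ** 2) / np.sum((clean - denoised) ** 2))
psnr = 10 * np.log10(np.max(np.abs(clean)) ** 2 / mse)
print(f"MSE={mse:.4f}  SNR={snr:.1f} dB  PSNR={psnr:.1f} dB")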
1111 Creative Teaching of New Product Development to Operations Managers
Authors: Marco Leite, J. M. Vilas-Boas da Silva, Isabel Duarte de Almeida
Abstract:
New Product Development (NPD) has its roots in an engineering background. Thus, one might wonder about the interest, opportunity, contents and delivery process if students from the soft sciences were involved. This paper addressed «What to teach?» and «How to do it?» as the preliminary research questions that originated the introduced propositions. The curriculum-developer model that was purposefully chosen to adapt the coursebook by pursuing macro/micro strategies was found significant by an exploratory qualitative case study. Moreover, learning was developed and value created by implementing the institutional curriculum through a creative, hands-on, experiential, problem-solving, problem-based but organized teamwork approach. The product design of an orange squeezer complying with ill-defined requirements, including drafts, sketches, prototypes, CAD simulations and a business plan, plus a website, written reports and presentations, were the deliverables that confirmed an innovative contribution towards the research and practice of teaching and learning engineering subjects to non-specialist operations manager candidates.
Keywords: Teaching Engineering to Non-specialists, Operations Managers Education, Teamwork, Product Design and Development, Market-driven NPD, Curriculum development.
1110 Neural Network Controller for Mobile Robot Motion Control
Authors: Jasmin Velagic, Nedim Osmic, Bakir Lacevic
Abstract:
In this paper, a neural network-based controller is designed for the motion control of a mobile robot. The paper treats the problems of trajectory following and posture stabilization of a mobile robot with nonholonomic constraints. For this purpose, a recurrent neural network with one hidden layer is used. It learns the relationship between the linear velocities and the position errors of the mobile robot. This neural network is trained on-line using the backpropagation optimization algorithm with an adaptive learning rate. The optimization algorithm is performed at each sample time to compute the optimal control inputs. The performance of the proposed system is investigated using a kinematic model of the mobile robot.
Keywords: Mobile robot, kinematic model, neural network, motion control, adaptive learning rate.
1109 Comanche – A Compiler-Driven I/O Management System
Authors: Wendy Zhang, Ernst L. Leiss, Huilin Ye
Abstract:
Most scientific programs have large input and output data sets that require out-of-core programming or the use of virtual memory management (VMM). Out-of-core programming is very error-prone and tedious; as a result, it is generally avoided. However, in many instances, VMM is not an effective approach because it often results in a substantial performance reduction. In contrast, compiler-driven I/O management allows a program's data sets to be retrieved in parts, called blocks or tiles. Comanche (COmpiler MANaged caCHE) is a compiler combined with a user-level runtime system that can be used to replace standard VMM for out-of-core programs. We describe Comanche and demonstrate on a number of representative problems that it substantially outperforms VMM. Significantly, our system does not require any special services from the operating system and does not require modification of the operating system kernel.
Keywords: I/O management, out-of-core, compiler, tile mapping.