Search results for: forecasting accuracy

780 A Decision Boundary based Discretization Technique using Resampling

Authors: Taimur Qureshi, Djamel A Zighed

Abstract:

Many supervised induction algorithms require discrete data, yet real data often come in both discrete and continuous formats. Quality discretization of continuous attributes is an important problem that affects the speed, accuracy and understandability of the induction models. Usually, discretization and other statistical processes are applied to subsets of the population, as the entire population is practically inaccessible. For this reason we argue that a discretization performed on a sample of the population is only an estimate of that for the entire population. Most existing discretization methods partition the attribute range into two or more intervals using a single cut point or a set of cut points. In this paper, we introduce a technique that uses resampling (such as the bootstrap) to generate a set of candidate discretization points, thereby improving discretization quality by providing a better estimate with respect to the entire population. The goal of this paper is thus to observe whether the resampling technique leads to better discretization points, which opens up a new paradigm for the construction of soft decision trees.

Keywords: Bootstrap, discretization, resampling, soft decision trees.
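
To make the resampling idea concrete, here is a minimal sketch (not the authors' exact procedure) that draws bootstrap resamples of a continuous attribute, finds an entropy-minimizing cut point on each resample, and collects the candidates; the single-cut criterion, function names and toy data are illustrative assumptions.

```python
import numpy as np

def best_binary_cut(x, y):
    """Entropy-minimizing single cut point for a continuous attribute x and binary labels y."""
    order = np.argsort(x)
    x, y = x[order], y[order]
    def entropy(labels):
        if len(labels) == 0:
            return 0.0
        p = np.bincount(labels, minlength=2) / len(labels)
        p = p[p > 0]
        return -(p * np.log2(p)).sum()
    best_cut, best_score = None, np.inf
    for i in range(1, len(x)):
        if x[i] == x[i - 1]:
            continue
        cut = 0.5 * (x[i] + x[i - 1])
        left, right = y[:i], y[i:]
        score = (len(left) * entropy(left) + len(right) * entropy(right)) / len(y)
        if score < best_score:
            best_cut, best_score = cut, score
    return best_cut

def bootstrap_cut_candidates(x, y, n_boot=200, seed=0):
    """Generate candidate discretization points from bootstrap resamples of the sample."""
    rng = np.random.default_rng(seed)
    candidates = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(x), len(x))  # resample with replacement
        cut = best_binary_cut(x[idx], y[idx])
        if cut is not None:
            candidates.append(cut)
    return np.array(candidates)

# Toy usage: the spread of the candidates estimates the uncertainty of the cut point.
x = np.random.default_rng(1).normal(size=200)
y = (x + np.random.default_rng(2).normal(scale=0.5, size=200) > 0).astype(int)
cuts = bootstrap_cut_candidates(x, y)
print(cuts.mean(), cuts.std())
```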

779 Vibration Analysis of Functionally Graded Engesser-Timoshenko Beams Subjected to Axial Load Located on a Continuous Elastic Foundation

Authors: M. Karami Khorramabadi, A. R. Nezamabadi

Abstract:

This paper studies the free vibration of functionally graded beams subjected to axial load that are simply supported at both ends and rest on a continuous elastic foundation. The displacement field of the beam is assumed based on Engesser-Timoshenko beam theory. The Young's modulus of the beam is assumed to be graded continuously across the beam thickness. Applying Hamilton's principle, the governing equation is established. The resulting equation is solved using Euler's equation. The effects of the constituent volume fractions and the foundation coefficient on the vibration frequency are presented. To investigate the accuracy of the present analysis, a comparison study is carried out with known data.

Keywords: Functionally Graded Beam, Free Vibration, Elastic Foundation, Engesser-Timoshenko Beam Theory.

778 An Alternative Approach for Assessing the Impact of Cutting Conditions on Surface Roughness Using Single Decision Tree

Authors: S. Ghorbani, N. I. Polushin

Abstract:

In this study, an approach to identify the factors affecting surface roughness in a machining process is presented. The study is based on 81 data points on surface roughness obtained over a wide range of cutting tools (conventional, cutting tool with holes, cutting tool with composite material), workpiece materials (AISI 1045 steel, AA2024 aluminum alloy, A48 class 30 gray cast iron), spindle speeds (630-1000 rpm), feed rates (0.05-0.075 mm/rev), depths of cut (0.05-0.15 mm) and tool overhangs (41-65 mm). A single decision tree (SDT) analysis was performed to identify factors for predicting surface roughness, and the CART algorithm was employed for building and evaluating the regression tree. Results show that the single decision tree outperforms traditional regression models, with higher forecast accuracy.

Keywords: Cutting condition, surface roughness, decision tree, CART algorithm.
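
A minimal sketch of the kind of CART regression-tree analysis described, using scikit-learn; the column layout mirrors the factors listed in the abstract, but the synthetic data, hyperparameters and scoring are assumptions rather than the study's actual setup.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import cross_val_score

# Columns mirror the factors listed in the abstract; values here are synthetic placeholders.
rng = np.random.default_rng(0)
n = 81
X = np.column_stack([
    rng.integers(0, 3, n),            # tool type (0: conventional, 1: with holes, 2: composite)
    rng.integers(0, 3, n),            # workpiece material (0: steel, 1: aluminum, 2: cast iron)
    rng.uniform(630, 1000, n),        # spindle speed, rpm
    rng.uniform(0.05, 0.075, n),      # feed rate, mm/rev
    rng.uniform(0.05, 0.15, n),       # depth of cut, mm
    rng.uniform(41, 65, n),           # tool overhang, mm
])
y = 0.5 + 0.01 * X[:, 5] + rng.normal(scale=0.05, size=n)   # placeholder roughness values

# CART regression tree; depth is limited to keep the tree interpretable on 81 samples.
tree = DecisionTreeRegressor(max_depth=4, min_samples_leaf=5, random_state=0)
print(cross_val_score(tree, X, y, cv=5, scoring="r2").mean())
tree.fit(X, y)
print(tree.feature_importances_)  # which cutting conditions the tree judges most influential
```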

777 Unsupervised Feature Selection Using Feature Density Functions

Authors: Mina Alibeigi, Sattar Hashemi, Ali Hamzeh

Abstract:

Since dealing with high dimensional data is computationally complex and sometimes even intractable, several feature reduction methods have recently been developed to reduce the dimensionality of the data and simplify analysis in applications such as text categorization, signal processing, image retrieval and gene expression analysis. Among feature reduction techniques, feature selection is one of the most popular methods because it preserves the original features. In this paper, we propose a new unsupervised feature selection method that removes redundant features from the original feature space by using the probability density functions of the features. To show the effectiveness of the proposed method, popular feature selection methods have been implemented and compared. Experimental results on several datasets from the UCI repository illustrate the effectiveness of the proposed method in comparison with the other methods in terms of both classification accuracy and the number of selected features.

Keywords: Feature, Feature Selection, Filter, Probability Density Function
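
The abstract does not spell out how the feature density functions are compared, so the sketch below assumes one plausible criterion: estimate each feature's density with a kernel density estimate and greedily drop features whose density overlaps too strongly with one already kept. All names, thresholds and data are illustrative, not the paper's method.

```python
import numpy as np
from scipy.stats import gaussian_kde

def density_overlap(a, b, grid):
    """Overlap between the estimated probability density functions of two standardized features."""
    pa, pb = gaussian_kde(a)(grid), gaussian_kde(b)(grid)
    return np.minimum(pa, pb).sum() * (grid[1] - grid[0])   # 1.0 means identical densities

def select_features(X, threshold=0.85):
    """Greedily keep a feature unless its density overlaps too much with an already kept one."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    grid = np.linspace(-4.0, 4.0, 200)
    kept = []
    for j in range(Xs.shape[1]):
        if all(density_overlap(Xs[:, j], Xs[:, k], grid) < threshold for k in kept):
            kept.append(j)
    return kept

rng = np.random.default_rng(0)
n = 300
f0 = rng.normal(size=n)
X = np.column_stack([
    f0,                                                      # feature 0: Gaussian
    rng.normal(size=n) + rng.choice([-2.5, 2.5], size=n),    # feature 1: bimodal
    rng.exponential(size=n),                                 # feature 2: skewed
    rng.permutation(f0),                                     # feature 3: same density as feature 0
])
print(select_features(X))   # expected to keep the three distinct densities, e.g. [0, 1, 2]
```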

776 Handwritten Character Recognition Using Multiscale Neural Network Training Technique

Authors: Velappa Ganapathy, Kok Leong Liew

Abstract:

Advancement in artificial intelligence has led to the development of various "smart" devices. A character recognition device is one such smart device, acquiring partial human intelligence with the ability to capture and recognize various characters in different languages. Firstly, multiscale neural training with modifications in the input training vectors is adopted in this paper to exploit its advantage in training higher-resolution character images. Secondly, selective thresholding using a minimum distance technique is proposed to increase the level of accuracy of character recognition. A simulator program (a GUI) is designed in such a way that the characters can be located anywhere on the blank paper on which they are written. The results show that these methods, with a moderate number of training epochs, can produce accuracies of at least 85% for handwritten upper-case English characters and numerals.

Keywords: Character recognition, multiscale, backpropagation, neural network, minimum distance technique.
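
As a rough illustration of the selective-thresholding idea based on minimum distance, the sketch below classifies a feature vector by the nearest class prototype and rejects the decision when even the best match is too far away; the prototypes, feature size and threshold are hypothetical, not the paper's network outputs.

```python
import numpy as np

def min_distance_classify(sample, prototypes, threshold):
    """Assign the class whose prototype vector is nearest; reject if even the best match
    is farther than the threshold (selective thresholding)."""
    dists = {label: np.linalg.norm(sample - proto) for label, proto in prototypes.items()}
    label = min(dists, key=dists.get)
    return label if dists[label] <= threshold else None   # None -> low confidence, re-examine

# Toy prototypes: mean feature vectors (e.g. downsampled character images) per class.
rng = np.random.default_rng(0)
prototypes = {c: rng.uniform(size=64) for c in "ABC"}
sample = prototypes["B"] + rng.normal(scale=0.05, size=64)
print(min_distance_classify(sample, prototypes, threshold=1.5))
```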

775 Performance Analysis of Brain Tumor Detection Based On Image Fusion

Authors: S. Anbumozhi, P. S. Manoharan

Abstract:

Medical image fusion plays a vital role in the medical field for diagnosing brain tumors, which can be classified as benign or malignant. It is the process of integrating multiple images of the same scene into a single fused image in order to reduce uncertainty and minimize redundancy while extracting all the useful information from the source images. Fuzzy logic is used to fuse two brain MRI images with different views. The fused image is more informative than the source images. Texture and wavelet features are extracted from the fused image. A multilevel adaptive neuro-fuzzy classifier classifies the brain tumors based on trained and tested features. The proposed method achieved 80.48% sensitivity, 99.9% specificity and 99.69% accuracy. Experimental results obtained from the fusion process show that the proposed image fusion approach performs better than conventional fusion methodologies.

Keywords: Image fusion, Fuzzy rules, Neuro-fuzzy classifier.

774 The Use of Classifiers in Image Analysis of Oil Wells Profiling Process and the Automatic Identification of Events

Authors: Jaqueline M. R. Vieira

Abstract:

Different strategies and tools are available in the oil and gas industry for detecting and analyzing tension and possible fractures in borehole walls. Most of these techniques are based on manual observation of the captured borehole images. While this strategy may be possible and convenient with small images and little data, it may become difficult and prone to errors when large databases of images must be treated. Moreover, the patterns may differ across the image area, depending on many characteristics (drilling strategy, rock components, rock strength, etc.). In this work we propose the inclusion of data-mining classification strategies in order to create a knowledge database of the segmented curves. With these classifiers, after a period of use in which parts of borehole images corresponding to tension regions and breakout areas are manually marked, the system will automatically indicate and suggest new candidate regions with higher accuracy. We suggest the use of different classification methods in order to achieve different knowledge dataset configurations.

Keywords: Brazil, classifiers, data-mining, image segmentation, oil well visualization.

773 Sampling of Variables in Discrete-Event Simulation using the Example of Inventory Evolutions in Job-Shop-Systems Based on Deterministic and Non-Deterministic Data

Authors: Bernd Scholz-Reiter, Christian Toonen, Jan Topi Tervo, Dennis Lappe

Abstract:

Time series analysis often requires data that represent the evolution of an observed variable in equidistant time steps. In order to collect such data, sampling is applied. While continuous signals may be sampled, analyzed and reconstructed by applying Shannon's sampling theorem, time-discrete signals have to be dealt with differently. In this article we consider the discrete-event simulation (DES) of job-shop systems and study the effects of different sampling rates on data quality regarding the completeness and accuracy of reconstructed inventory evolutions. In doing so, we discuss deterministic as well as non-deterministic behavior of system variables. Error curves are used to illustrate and discuss the impact of the sampling rate and to derive recommendations for its well-founded choice.

Keywords: discrete-event simulation, job-shop-system, sampling rate.
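
A minimal sketch of the sampling question studied here, under simple assumptions: a piecewise-constant inventory evolution defined by event times is sampled at several equidistant rates, and the reconstruction error against a fine reference is reported, in the spirit of the article's error curves. The event trace and rates are synthetic.

```python
import numpy as np

def sample_inventory(event_times, levels, dt, horizon):
    """Sample a piecewise-constant inventory evolution (event data) at equidistant steps dt."""
    t = np.arange(0.0, horizon, dt)
    idx = np.searchsorted(event_times, t, side="right") - 1  # last event at or before each step
    return t, levels[np.clip(idx, 0, len(levels) - 1)]

# Synthetic job-shop-like event trace: arrivals/departures change the inventory by +/-1.
rng = np.random.default_rng(0)
event_times = np.cumsum(rng.exponential(1.0, size=200))
levels = np.cumsum(rng.choice([-1, 1], size=200)).clip(min=0)
horizon = event_times[-1]

# Reconstruction error of the sampled signal vs. a fine reference, for several sampling rates.
t_ref, x_ref = sample_inventory(event_times, levels, 0.01, horizon)
for dt in (0.1, 0.5, 2.0, 5.0):
    t_s, x_s = sample_inventory(event_times, levels, dt, horizon)
    x_hat = x_s[np.searchsorted(t_s, t_ref, side="right") - 1]   # zero-order-hold reconstruction
    print(dt, np.mean(np.abs(x_hat - x_ref)))   # coarser sampling generally gives a larger error
```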

772 Automatic Detection and Classification of Diabetic Retinopathy Using Retinal Fundus Images

Authors: A. Biran, P. Sobhe Bidari, A. Almazroe, V. Lakshminarayanan, K. Raahemifar

Abstract:

Diabetic retinopathy (DR) is a severe retinal disease caused by diabetes mellitus. It leads to blindness when it progresses to the proliferative level. Early indications of DR are the appearance of microaneurysms, hemorrhages and hard exudates. In this paper, an automatic algorithm for the detection of DR is proposed. The algorithm is based on a combination of several image processing techniques, including the Circular Hough Transform (CHT), Contrast Limited Adaptive Histogram Equalization (CLAHE), Gabor filtering and thresholding. A Support Vector Machine (SVM) classifier is then used to classify retinal images as normal or abnormal, the latter including non-proliferative and proliferative DR. The proposed method has been tested on images selected from the Structured Analysis of the Retina (STARE) database using MATLAB code. The method is able to detect DR. The sensitivity, specificity and accuracy of this approach are 90%, 87.5% and 91.4%, respectively.

Keywords: Diabetic retinopathy, fundus images, STARE, Gabor filter, SVM.
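
A hedged sketch of the kind of pipeline described (CLAHE enhancement of the green channel, a small Gabor filter bank, SVM classification), using OpenCV and scikit-learn; the feature summary, parameters and the commented-out training loop with its names (image_paths, labels, test_path) are assumptions, not the paper's exact configuration.

```python
import cv2
import numpy as np
from sklearn.svm import SVC

def retina_features(img_bgr):
    """CLAHE-enhanced green channel filtered with a small Gabor bank, summarized as statistics."""
    green = img_bgr[:, :, 1]
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(green)
    feats = []
    for theta in np.linspace(0, np.pi, 4, endpoint=False):
        kernel = cv2.getGaborKernel((21, 21), sigma=4.0, theta=theta,
                                    lambd=10.0, gamma=0.5, psi=0)
        response = cv2.filter2D(enhanced, cv2.CV_32F, kernel)
        feats += [response.mean(), response.std()]   # crude texture summary per orientation
    return np.array(feats)

# Quick self-check on a blank image: one mean/std pair per Gabor orientation.
print(retina_features(np.zeros((128, 128, 3), dtype=np.uint8)).shape)

# Hypothetical training loop: images and labels (0 = normal, 1 = DR) would come from STARE.
# X = np.vstack([retina_features(cv2.imread(p)) for p in image_paths])
# clf = SVC(kernel="rbf", C=1.0).fit(X, labels)
# print(clf.predict(retina_features(cv2.imread(test_path))[None, :]))
```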

771 Over-Height Vehicle Detection in Low Headroom Roads Using Digital Video Processing

Authors: Vahid Khorramshahi, Alireza Behrad, Neeraj K. Kanhere

Abstract:

In this paper we present a new method for over-height vehicle detection in low-headroom streets and highways using digital video processing. Its accuracy, its lower price compared to existing detectors such as laser radars, and its capability of providing extra information such as speed and height measurements make this method reliable and efficient. In this algorithm, features are selected and tracked using the KLT algorithm. A blob extraction algorithm is also applied using background estimation and subtraction. Then the world coordinates of the features that lie inside the blobs are estimated using a novel calibration method. Once the heights of the features are calculated, we apply a threshold to select over-height features and eliminate the others. The over-height features are segmented using association criteria and grouped using an undirected graph. They are then tracked through sequential frames. The obtained groups correspond to over-height vehicles in the scene.

Keywords: Feature extraction, over-height vehicle detection, traffic monitoring, vehicle tracking.

770 A Hybrid Genetic Algorithm for the Sequence Dependent Flow-Shop Scheduling Problem

Authors: Mohammad Mirabi

Abstract:

The flow-shop scheduling problem (FSP) deals with the scheduling of a set of jobs that visit a set of machines in the same order. The FSP is NP-hard, which means that an efficient algorithm for solving the problem to optimality is unavailable. To meet the requirements on time and to minimize the makespan of large permutation flow-shop scheduling problems in which there are sequence-dependent setup times on each machine, this paper develops a hybrid genetic algorithm (HGA). The proposed HGA applies a modified approach to generate the population of initial chromosomes and uses an improved heuristic, called the iterated swap procedure, to improve the initial solutions. Three genetic operators are also used to produce good new offspring. The results are compared with some recently developed heuristics, and computational experiments show that the proposed HGA performs very competitively with respect to the accuracy and efficiency of the solutions.

Keywords: Hybrid genetic algorithm, Scheduling, Permutation flow-shop, Sequence dependent
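
To illustrate the kind of local improvement the iterated swap procedure performs, the sketch below evaluates the makespan of a permutation flow-shop with sequence-dependent setup times and keeps any pairwise swap that reduces it. This is a generic swap-based improvement under assumed data, not the paper's exact heuristic or its GA operators.

```python
import numpy as np

def makespan(seq, proc, setup):
    """Permutation flow-shop makespan with sequence-dependent setup times on each machine.
    proc[j, m]: processing time of job j on machine m; setup[m, a, b]: setup on machine m
    when job b directly follows job a (a == b is used for the first job in the sequence)."""
    n_mach = proc.shape[1]
    c = np.zeros((len(seq), n_mach))
    for i, job in enumerate(seq):
        prev = seq[i - 1] if i > 0 else job
        for m in range(n_mach):
            ready = c[i - 1, m] if i > 0 else 0.0
            done_prev_machine = c[i, m - 1] if m > 0 else 0.0
            c[i, m] = max(ready, done_prev_machine) + setup[m, prev, job] + proc[job, m]
    return c[-1, -1]

def iterated_swap(seq, proc, setup):
    """Repeatedly try pairwise swaps and keep any that reduce the makespan (local improvement)."""
    seq = list(seq)
    best = makespan(seq, proc, setup)
    improved = True
    while improved:
        improved = False
        for i in range(len(seq) - 1):
            for j in range(i + 1, len(seq)):
                seq[i], seq[j] = seq[j], seq[i]
                cost = makespan(seq, proc, setup)
                if cost < best:
                    best, improved = cost, True
                else:
                    seq[i], seq[j] = seq[j], seq[i]   # undo the unhelpful swap
    return seq, best

rng = np.random.default_rng(0)
proc = rng.uniform(1, 10, size=(8, 3))      # 8 jobs, 3 machines (toy data)
setup = rng.uniform(0, 2, size=(3, 8, 8))   # sequence-dependent setups per machine
print(iterated_swap(list(range(8)), proc, setup))
```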

769 Fast and Robust Long-term Tracking with Effective Searching Model

Authors: Thang V. Kieu, Long P. Nguyen

Abstract:

Kernelized Correlation Filter (KCF) based trackers have gained a lot of attention recently because of their accuracy and fast calculation speed. However, this algorithm is not robust in cases where the object is lost through a sudden change of direction, occlusion or leaving the field of view. In order to improve KCF performance in long-term tracking, this paper proposes an anomaly detection method for target-loss warning, based on analyzing the response map of each frame, and a classification algorithm based on random ferns for a reliable target re-locating mechanism. Tested on the Visual Tracker Benchmark and Visual Object Tracking datasets, the experimental results indicate that the precision and success rate of the proposed algorithm are 2.92 and 2.61 times higher than those of the original KCF algorithm, respectively. Moreover, the proposed tracker handles occlusion better than many state-of-the-art long-term tracking methods while running at 60 frames per second.

Keywords: Correlation filter, long-term tracking, random fern, real-time tracking.
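
One common way to flag target loss from a correlation-filter response map is the peak-to-sidelobe ratio; the sketch below shows that cue on synthetic response maps. It is an assumption of the general idea, not necessarily the anomaly detector used in the paper.

```python
import numpy as np

def peak_to_sidelobe_ratio(response, exclude=5):
    """Peak sharpness of a correlation response map; a sudden drop suggests the target is lost."""
    r0, c0 = np.unravel_index(np.argmax(response), response.shape)
    peak = response[r0, c0]
    mask = np.ones_like(response, dtype=bool)
    mask[max(0, r0 - exclude):r0 + exclude + 1, max(0, c0 - exclude):c0 + exclude + 1] = False
    sidelobe = response[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-12)

# Sharp (confident) vs. flat (ambiguous) response maps.
yy, xx = np.mgrid[0:50, 0:50]
sharp = np.exp(-((xx - 25) ** 2 + (yy - 25) ** 2) / 20.0)
flat = np.random.default_rng(0).uniform(0.4, 0.6, size=(50, 50))
print(peak_to_sidelobe_ratio(sharp), peak_to_sidelobe_ratio(flat))
# A threshold on this ratio could trigger the re-detection (random-fern) stage.
```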

768 Error Correction Method for 2D Ultra-Wideband Indoor Wireless Positioning System Using Logarithmic Error Model

Authors: Phornpat Chewasoonthorn, Surat Kwanmuang

Abstract:

Indoor positioning technologies have evolved rapidly. They augment the Global Positioning System (GPS), which requires line-of-sight to the sky, to track the location of people or objects. In this study, we developed an error correction method for an indoor real-time location system (RTLS) based on an ultra-wideband (UWB) sensor from Decawave. Multiple stationary nodes (anchors) were installed throughout the workspace. The distance between stationary and moving nodes (tags) can be measured using a two-way-ranging (TWR) scheme. The results show that the uncorrected ranging error of the sensor system can be as large as 1 m. To reduce the ranging error and thus increase positioning accuracy, we present an online correction algorithm using the Kalman filter. The results from experiments show that the system can reduce the ranging error down to 5 cm.

Keywords: Indoor positioning, ultra-wideband, error correction, Kalman filter.
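
A minimal sketch of Kalman filtering applied to noisy range measurements, assuming a scalar random-walk model; the paper's logarithmic error model for bias and the TWR details are not reproduced, and the noise levels here are illustrative.

```python
import numpy as np

def kalman_range_filter(measurements, q=1e-3, r=0.04):
    """Scalar Kalman filter for a slowly varying range: state = true distance (random walk).
    q: process noise variance, r: measurement noise variance (~0.2 m std, an assumption)."""
    x, p = measurements[0], 1.0        # initial state and covariance
    filtered = []
    for z in measurements:
        p = p + q                      # predict (random-walk model)
        k = p / (p + r)                # Kalman gain
        x = x + k * (z - x)            # update with the new range measurement
        p = (1.0 - k) * p
        filtered.append(x)
    return np.array(filtered)

# Toy example: a constant 3 m distance observed with 0.2 m measurement noise.
rng = np.random.default_rng(0)
z = 3.0 + rng.normal(scale=0.2, size=200)   # raw TWR ranges (illustrative)
est = kalman_range_filter(z)
print(abs(est[-1] - 3.0))                   # residual error after filtering (noise suppressed)
```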

767 An Improved Ant Colony Algorithm for Genome Rearrangements

Authors: Essam Al Daoud

Abstract:

Genome rearrangement is an important area in computational biology and bioinformatics. The basic problem in genome rearrangements is to compute the edit distance, i.e., the minimum number of operations needed to transform one genome into another. Unfortunately, the unsigned genome rearrangement problem is NP-hard. In this study, an improved ant colony optimization algorithm to approximate the edit distance is proposed. The main idea is to convert the unsigned permutation to a signed permutation and evaluate the ants by using the Kaplan algorithm. Two new operations are added to the standard ant colony algorithm: replacing the worst ants by re-sampling the ants from a new probability distribution, and applying crossover operations to the best ants. The proposed algorithm is tested and compared with the improved breakpoint reversal sort algorithm using three datasets. The results indicate that the proposed algorithm achieves a better accuracy ratio than the previous methods.

Keywords: Ant colony algorithm, Edit distance, Genome breakpoint, Genome rearrangement, Reversal sort.

766 Prediction of Phenolic Compound Migration Process through Soil Media using Artificial Neural Network Approach

Authors: Supriya Pal, Kalyan Adhikari, Somnath Mukherjee, Sudipta Ghosh

Abstract:

This study presents the application of an artificial neural network for modeling the migration of a phenolic compound through a vertical soil column. A three-layered feed-forward neural network with a back-propagation training algorithm was developed using forty-eight experimental data sets obtained from laboratory fixed-bed vertical column tests. The input parameters used in the model were the influent concentration of phenol (mg/L) at the top end of the soil column, the depth of the soil column (cm), the elapsed time after phenol injection (hr), the percentage of clay (%) and the percentage of silt (%) in the soil. The output of the ANN was the effluent phenol concentration (mg/L) from the bottom end of the soil column. The ANN predictions were compared with the experimental results of the laboratory tests and the accuracy of the ANN model was evaluated.

Keywords: Modeling, Neural Networks, Phenol, Soil media
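
A hedged sketch of a comparable three-layer feed-forward network in scikit-learn, with the five inputs and single output listed in the abstract; the synthetic data, hidden-layer size and train/test split are assumptions, not the study's records or architecture.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the 48 column-test records: 5 inputs -> effluent phenol concentration.
rng = np.random.default_rng(0)
n = 48
X = np.column_stack([
    rng.uniform(50, 500, n),    # influent phenol concentration, mg/L
    rng.uniform(10, 60, n),     # soil column depth, cm
    rng.uniform(1, 72, n),      # elapsed time after injection, hr
    rng.uniform(5, 40, n),      # clay content, %
    rng.uniform(10, 50, n),     # silt content, %
])
y = X[:, 0] * np.exp(-0.02 * X[:, 1]) * (1 - np.exp(-0.05 * X[:, 2]))  # placeholder breakthrough

# Three-layer feed-forward network (one hidden layer) trained with back-propagation.
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0))
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model.fit(X_tr, y_tr)
print(model.score(X_te, y_te))   # R^2 on held-out records as a simple accuracy check
```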

765 Investigating the Influence of Porosity on Thermal and Mechanical Properties of a C/C Composite Using Image Based FE Modelling

Authors: Abdulrahman A. Alghamdi, Paul M. Mummery, Mohammad A. Sheikh

Abstract:

In this paper, a 3D image-based composite unit cell is constructed from high-resolution tomographic images. The through-thickness thermal diffusivity and in-plane Young's modulus are predicted for the composite unit cell. The accuracy of the image-based composite unit cell is tested by comparing its results with experimental results obtained from laser flash and tensile tests. The FE predictions are in close agreement with the experimental results. The through-thickness thermal diffusivity and in-plane Young's modulus of a virgin C/C composite are predicted by replacing the properties of air (porosity) with the properties of the carbon matrix. The effect of porosity was found to be more pronounced on thermal diffusivity than on Young's modulus.

Keywords: Porosity, C/C composite, image based FE modelling, CMC.

764 Numerical Modelling of Dry Stone Masonry Structures Based on Finite-Discrete Element Method

Authors: Ž. Nikolić, H. Smoljanović, N. Živaljić

Abstract:

This paper presents a numerical model based on the finite-discrete element method for analysis of the structural response of dry stone masonry structures under static and dynamic loads. More precisely, each discrete stone block is discretized by finite elements. Material non-linearity, including fracture and fragmentation of discrete elements, as well as cyclic behavior during dynamic loading, is considered through contact elements implemented within the finite element mesh. The model was applied to several examples of such structures. The performed analysis shows high accuracy of the numerical results in comparison with the experimental ones and demonstrates the potential of the finite-discrete element method for modelling the response of dry stone masonry structures.

Keywords: Finite-discrete element method, dry stone masonry structures, static load, dynamic load.

763 Analysis of Electrocardiograph (ECG) Signal for the Detection of Abnormalities Using MATLAB

Authors: Durgesh Kumar Ojha, Monica Subashini

Abstract:

The proposed method studies and analyzes the electrocardiograph (ECG) waveform to detect abnormalities with reference to the P, Q, R and S peaks. The first phase includes the acquisition of real-time ECG data. The next phase involves signal generation followed by pre-processing. Thirdly, the procured ECG signal is subjected to feature extraction. The extracted features detect abnormal peaks present in the waveform; thus the normal and abnormal ECG signals can be differentiated based on the features extracted. The work is implemented in the familiar multipurpose tool MATLAB. This software efficiently uses algorithms and techniques for the detection of any abnormalities present in the ECG signal. Proper utilization of MATLAB functions (both built-in and user-defined) allows ECG signals to be processed and analyzed in real-time applications. The simulation helps in improving the accuracy, and the hardware could then be built conveniently.

Keywords: ECG Waveform, Peak Detection, Arrhythmia, Matlab.
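
The paper works in MATLAB; as a rough Python counterpart of the peak-detection step, the sketch below locates R-like peaks in a synthetic ECG-style trace with scipy.signal.find_peaks and derives R-R intervals. The sampling rate, thresholds and synthetic signal are assumptions, not the paper's data.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 250                                   # sampling frequency, Hz (assumed)
t = np.arange(0, 10, 1 / fs)
# Synthetic ECG-like trace: a sharp "R wave" every 0.8 s plus baseline noise.
ecg = sum(np.exp(-((t - r) ** 2) / (2 * 0.01 ** 2)) for r in np.arange(0.5, 10, 0.8))
ecg += 0.05 * np.random.default_rng(0).normal(size=t.size)

# R-peak detection: amplitude threshold plus a refractory distance of 0.4 s between beats.
peaks, _ = find_peaks(ecg, height=0.5, distance=int(0.4 * fs))
rr = np.diff(t[peaks])                     # R-R intervals, s
heart_rate = 60.0 / rr.mean()
print(len(peaks), round(heart_rate, 1))    # an irregular R-R series would flag a possible arrhythmia
```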

762 Fourier Galerkin Approach to Wave Equation with Absorbing Boundary Conditions

Authors: Alexandra Leukauf, Alexander Schirrer, Emir Talic

Abstract:

Numerical computation of wave propagation in a large domain usually requires significant computational effort. Hence, the considered domain must be truncated to a smaller domain of interest. In addition, special boundary conditions, which absorb the outward travelling waves, need to be implemented in order to describe the system domains correctly. In this work, the linear one-dimensional wave equation is approximated by utilizing the Fourier Galerkin approach. Furthermore, the artificial boundaries are realized with absorbing boundary conditions. Within this work, a systematic workflow for setting up the wave problem, including the absorbing boundary conditions, is proposed. As a result, a convenient modal system description with an effective absorbing boundary formulation is established. Moreover, the truncated model shows high accuracy compared to the global domain.

Keywords: Absorbing boundary conditions, boundary control, Fourier Galerkin approach, modal approach, wave equation.

761 X-Corner Detection for Camera Calibration Using Saddle Points

Authors: Abdulrahman S. Alturki, John S. Loomis

Abstract:

This paper discusses a corner detection algorithm for camera calibration. Calibration is a necessary step in many computer vision and image processing applications. Robust corner detection for an image of a checkerboard is required to determine intrinsic and extrinsic parameters. In this paper, an algorithm for fully automatic and robust X-corner detection is presented. Checkerboard corner points are automatically found in each image without user interaction or any prior information regarding the number of rows or columns. The approach represents each X-corner with a quadratic fitting function. Using the fact that the X-corners are saddle points, the coefficients in the fitting function are used to identify each corner location. The automation of this process greatly simplifies calibration. Our method is robust against noise and different camera orientations. Experimental analysis shows the accuracy of our method using actual images acquired at different camera locations and orientations.

Keywords: Camera Calibration, Corner Detector, Saddle Points, X-Corners.
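
A minimal sketch of the saddle-point test described: fit a quadratic surface to a local patch, check that the Hessian determinant is negative (a saddle), and read off the subpixel stationary point. The patch size, fitting details and toy X-corner are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def quadratic_saddle(patch):
    """Fit z = ax^2 + by^2 + cxy + dx + ey + f to an image patch and return the saddle point
    (in patch coordinates) if the stationary point is a saddle, otherwise None."""
    h, w = patch.shape
    y, x = np.mgrid[0:h, 0:w]
    A = np.column_stack([x.ravel()**2, y.ravel()**2, (x * y).ravel(),
                         x.ravel(), y.ravel(), np.ones(x.size)])
    a, b, c, d, e, f = np.linalg.lstsq(A, patch.ravel(), rcond=None)[0]
    hess = np.array([[2 * a, c], [c, 2 * b]])
    if np.linalg.det(hess) >= 0:           # a saddle point has det(H) < 0
        return None
    # Stationary point: solve grad = 0  ->  [2a c; c 2b] [x; y] = [-d; -e]
    return np.linalg.solve(hess, [-d, -e])

# Toy patch containing an ideal X-corner (two black / two white quadrants around the centre).
patch = np.fromfunction(lambda i, j: ((i > 8) ^ (j > 8)).astype(float), (17, 17))
print(quadratic_saddle(patch))             # subpixel corner estimate near the centre of the patch
```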

760 Neuro-Fuzzy Network Based On Extended Kalman Filtering for Financial Time Series

Authors: Chokri Slim

Abstract:

The performance of a neural network can be measured by its efficiency and accuracy. The major disadvantages of the neural network approach are that the generalization capability of neural networks is often significantly low, and that it may take a very long time to tune the weights in the network to generate an accurate model for highly complex and nonlinear systems. This paper presents a novel neuro-fuzzy architecture based on the Extended Kalman filter. To test the performance and applicability of the proposed neuro-fuzzy model, a simulation study of a nonlinear complex dynamic system is carried out. The proposed method can be applied to on-line incremental adaptive learning for the prediction of financial time series. A benchmark case study is used to demonstrate that the proposed model is a superior neuro-fuzzy modeling technique.

Keywords: Neuro-fuzzy, Extended Kalman filter, nonlinear systems, financial time series.

759 Effect of Visual Speech in Sign Speech Synthesis

Authors: Zdenek Krnoul

Abstract:

This article investigates the contribution of synthesized visual speech. Synthesis of visual speech by a computer consists of an animation, in particular of lip movements. Visual speech is also a necessary part of the non-manual component of a sign language. An appropriate methodology is proposed to determine the quality and accuracy of synthesized visual speech, and it is examined for Czech speech. Hence, this article presents a procedure for recording speech data in order to set up the synthesis system as well as to evaluate the synthesized speech. Furthermore, one option for the evaluation process is elaborated in the form of a perceptual test. This test procedure is verified on the measured data with two settings of the synthesis system. The results of the perceptual test show a statistically significant increase in intelligibility evoked by real and synthesized visual speech. The aim is to show one part of an evaluation process which leads to a more comprehensive evaluation of the sign speech synthesis system.

Keywords: Perception test, Sign speech synthesis, Talking head, Visual speech.

758 A Boundary Fitted Nested Grid Model for Tsunami Computation along Penang Island in Peninsular Malaysia

Authors: Md. Fazlul Karim, Ahmad Izani Ismail, Mohammed Ashaque Meah

Abstract:

This paper focuses on the development of a 2-D boundary fitted and nested grid (BFNG) model to compute the propagation of the 2004 Indonesian tsunami along the coastal region of Penang in Peninsular Malaysia.

In the presence of a curvilinear coastline, boundary fitted grids are suitable for representing the model boundaries accurately. On the other hand, when a large velocity gradient within a confined area is expected, the use of a nested grid system is appropriate to improve the numerical accuracy with the fewest grid points.

This paper constructs a shallow water nested and orthogonal boundary fitted grid model and presents computational results of the tsunami impact on the Penang coast due to the Indonesian tsunami of 2004. The results of the numerical simulations are compared with available data.

Keywords: Boundary Fitted Nested Model, Tsunami, Penang Island, 2004 Indonesian Tsunami.

757 A Modified Decoupled Semi-Analytical Approach Based On SBFEM for Solving 2D Elastodynamic Problems

Authors: M. Fakharian, M. I. Khodakarami

Abstract:

In this paper, a new approach for improving the semi-analytical method based on scaled boundaries for solving 2D elastodynamic problems is presented. In this approach, only the boundaries of the problem domain are discretized, using specific subparametric elements. By using mapping functions drawn from a class of higher-order Lagrange polynomials, special shape functions, Gauss-Lobatto-Legendre numerical integration, and the integral form of the weighted residual method, the coefficient matrices in the equations of elastodynamic problems become diagonal. The difference between the study conducted here and prior research lies in the procedure for producing the geometry and in the interpolation and integration functions selected. The validity and accuracy of the present method are fully demonstrated through two benchmark problems which are successfully modeled using a small number of DOFs. The numerical results agree very well with the analytical solutions and with results from other numerical methods.

Keywords: 2D Elastodynamic Problems, Lagrange Polynomials, G-L-L quadrature, Decoupled SBFEM.

756 Seismic Alert System based on Artificial Neural Networks

Authors: C. M. A. Robles G., R. A. Hernandez-Becerril

Abstract:

We address the problem of creating a seismic alert system, based upon artificial neural networks trained using the well-known back-propagation and genetic algorithms, in order to emit an alarm to the population of a specific city about an imminent earthquake greater than 4.5 degrees on the Richter scale, thereby avoiding disasters and human losses. Instead of using the propagation wave, we employed the magnitude of the earthquake to establish a correlation between the magnitudes recorded in a controlled area and in the city where we want to emit the alarm. To measure the accuracy of the proposed method, we use a database provided by CIRES, which contains the records of 2500 quakes from the State of Guerrero and Mexico City. In particular, we applied the proposed method to generate a warning in Mexico City, employing the magnitudes recorded in the State of Guerrero.

Keywords: Seismic Alert System, Artificial Neural Networks, Genetic Algorithms.

755 Application of Femtosecond Laser pulses for Nanometer Accuracy Profiling of Quartz and Diamond Substrates and for Multi-Layered Targets and Thin-Film Conductors Processing

Authors: Dmitry S. Sitnikov, Andrey V. Ovchinnikov

Abstract:

Research results and an investigation of the optimal parameters for laser cutting and profiling of diamond and quartz substrates with femtosecond laser pulses are presented. Profiles 10 μm in width, ~25 μm in depth and several millimeters long were made. The quality of the boundaries has been investigated with an AFM («Vecco»). The possibility of technological formation of profiles and micro-holes with nanometer-scale boundaries in diamond and quartz substrates is shown. Experimental results on the treatment of multilayer dielectric coatings are also presented. The possibility of precise removal of the upper layer (70–140 nm thick) is demonstrated. Processes for the treatment of thin metal films (60 nm and 350 nm thick) are considered. Isolation tracks (conductance ~10⁻¹¹ S) 1.6–2.5 μm in width are formed in conductive metal layers.

Keywords: Femtosecond laser ablation, microhole and nanoprofile formation, micromachining.

754 The Imaging Methods for Classifying Crispiness of Freeze-Dried Durian using Fuzzy Logic

Authors: Sitthichon Kanitthakun, Pinit Kumhom, Kosin Chamnongthai

Abstract:

In quality control of freeze-dried durian, crispiness is a key quality index of the product. Generally, crispiness testing has to be done by a destructive method. A nondestructive test of crispiness is required so that the samples can be reused for other kinds of testing. This paper proposes a crispiness classification method for freeze-dried durians that uses fuzzy logic for decision making. The physical changes of a freeze-dried durian include the pores appearing in its images. Three physical features, namely (1) the diameters of the pores, (2) the ratio of the pore area to the remaining area, and (3) the distribution of the pores, are considered to contribute to the crispiness. Fuzzy logic is applied to make the decision. The experimental results, compared with food expert opinion, showed that the accuracy of the proposed classification method is 83.33 percent.

Keywords: Durian, crispiness, freeze drying, pore, fuzzy logic.

753 Worm Gearing Design Improvement by Considering Varying Mesh Stiffness

Authors: A. H. Elkholy, A. H. Falah

Abstract:

A new approach has been developed to estimate the load share and distribution of worm gear drives and to calculate the instantaneous tooth meshing stiffness. In this approach, the worm gear drive is modelled as a series of spur gear slices, and each slice is analyzed separately using the well-established formulae for spur gear loading and stresses. By combining the results obtained for all slices, the loading and stressing of the entire involute worm gear set is obtained. The geometric modelling method presented allows the tooth elastic deformation and tooth root stresses of worm gear drives under different load conditions to be investigated. Based on the slicing method introduced in this study, the instantaneous meshing stiffness and load share are obtained. In comparison with existing methods, this approach offers both good analysis accuracy and less computing time.

Keywords: Gear, load/stress distribution, worm, wheel, tooth stiffness, contact line.

752 Determining Earthquake Performances of Existing Reinforced Concrete Buildings by Using ANN

Authors: Musa H. Arslan, Murat Ceylan, Tayfun Koyuncu

Abstract:

In this study, an Artificial Neural Network (ANN) analytical method has been developed for analyzing the earthquake performance of reinforced concrete (RC) buildings. 66 RC buildings with four to ten storeys were subjected to performance analysis according to parameters comprising the existing material, loading and geometrical characteristics of the buildings. The selected parameters are considered to affect the performance of RC buildings. In the performance analysis stage of the study, the level of performance these buildings could exhibit in case of an earthquake was determined on the basis of the four-grade performance levels specified in the Turkish Earthquake Code-2007 (TEC-2007). After obtaining the performance level, the 23 selected parameters of each building were matched with that level. In this stage, the ANN-based fast evaluation algorithm provided an economic and rapid evaluation of the four- to ten-storey RC buildings. According to the study, the prediction accuracy of the ANN was found to be about 74%.

Keywords: Artificial neural network, earthquake, performance, reinforced concrete.

751 Text Summarization for Oil and Gas News Article

Authors: L. H. Chong, Y. Y. Chen

Abstract:

Information is increasing in volume; companies are so overloaded with information that they may lose track of the information they actually need. It is a time-consuming task to scan through each lengthy document. A shorter version of a document which contains only the gist information is more favourable for most information seekers. Therefore, in this paper, we implement a text summarization system to produce summaries that contain the gist of oil and gas news articles. The summarization is intended to provide important information that helps oil and gas companies monitor their competitors' behaviour and formulate business strategies. The system integrates a statistical approach with three underlying concepts: keyword occurrences, the title of the news article and the location of the sentence. The generated summaries were compared with human-generated summaries from an oil and gas company. Precision and recall ratios are used to evaluate the accuracy of the generated summary. Based on the experimental results, the system is able to produce an effective summary with an average recall value of 83% at a compression rate of 25%.

Keywords: Information retrieval, text summarization, statistical approach.
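
A minimal sketch of the statistical scoring described (keyword occurrences, title overlap, sentence location), followed by selection at a 25% compression rate; the weights, tokenization and toy article are assumptions rather than the system's actual parameters.

```python
import re
from collections import Counter

def summarize(title, text, compression=0.25):
    """Score sentences by keyword frequency, overlap with the title, and position, then
    keep the top-scoring ones (in original order) up to the requested compression rate."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    freq = Counter(re.findall(r"[a-z]+", text.lower()))        # keyword occurrences
    title_words = set(re.findall(r"[a-z]+", title.lower()))
    scores = []
    for i, s in enumerate(sentences):
        tokens = re.findall(r"[a-z]+", s.lower())
        kw = sum(freq[t] for t in tokens) / max(len(tokens), 1)
        overlap = len(title_words & set(tokens))                # title-word overlap
        position = 1.0 / (i + 1)                                # earlier sentences weighted higher
        scores.append(kw + 2.0 * overlap + 3.0 * position)      # illustrative weights
    n_keep = max(1, int(len(sentences) * compression))
    keep = sorted(sorted(range(len(sentences)), key=lambda i: -scores[i])[:n_keep])
    return " ".join(sentences[i] for i in keep)

print(summarize("Gas project expansion",
                "The company announced a gas project expansion. Profits rose slightly. "
                "The expansion targets offshore fields. Weather delayed unrelated work."))
```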
