Search results for: Statistical methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4886

4766 A Sociological Study of Rural Women Attitudes toward Education, Health and Work outside Home in Beheira Governorate, Egypt

Authors: A. A. Betah

Abstract:

This research was performed to evaluate the attitudes of rural women towards education, health and work outside the home. The study was based on a random sample of 147 rural women; Kafr-Rahmaniyah village was chosen for the study because its female life expectancy at birth, education level and percentage of females in the labor force were the highest in the district. The study data were collected from rural female respondents using a face-to-face questionnaire. In addition, the study examined several factors, such as age, main occupation, family size, monthly household income, geographic cosmopoliteness, and degree of social participation of the rural women respondents. Using the Statistical Package for the Social Sciences (SPSS), the data were analyzed by non-parametric statistical methods. The main finding of this study was a significant relationship between each of these variables and each of the rural women’s attitudes toward education, health, and work outside the home. The study concluded with some recommendations, the most important of which is ensuring attention to rural women’s needs, requirements and rights by raising their health awareness, their education and their contributions to their society.

Keywords: Attitudes, education, health, rural women, work outside the home.

4765 Exploring the Spatial Characteristics of Mortality Map: A Statistical Area Perspective

Authors: Jung-Hong Hong, Jing-Cen Yang, Cai-Yu Ou

Abstract:

The analysis of geographic inequality heavily relies on the use of location-enabled statistical data and quantitative measures to present the spatial patterns of the selected phenomena and analyze their differences. To protect the privacy of individual instances and to link them to administrative units, point-based datasets are spatially aggregated into area-based statistical datasets, where only the overall status for the selected levels of spatial units is used for decision making. The partition of the spatial units thus has a dominant influence on the analyzed results, a problem well known as the Modifiable Areal Unit Problem (MAUP). A new spatial reference framework, the Taiwan Geographical Statistical Classification (TGSC), was recently introduced in Taiwan based on spatial partition principles that aim for homogeneity in the numbers of population and households. Compared to the traditional township units, TGSC provides additional levels of spatial units with finer granularity for presenting spatial phenomena and enables domain experts to select an appropriate dissemination level for publishing statistical data. This paper compares the results of using TGSC and township units on the mortality data and examines the spatial characteristics of their outcomes. For the mortality data of Taitung County between January 1st, 2008 and December 31st, 2010, the all-cause age-standardized death rate (ASDR) at the township level ranges from 571 to 1,757 per 100,000 persons, whereas the 2nd dissemination area (TGSC) shows greater variation, ranging from 0 to 2,222 per 100,000. The finer granularity of the TGSC spatial units clearly provides better outcomes for identifying and evaluating geographic inequality and can be further analyzed with statistical measures from other perspectives (e.g., population, area, environment). The management and analysis of the statistical data referring to the TGSC in this research are strongly supported by the use of Geographic Information System (GIS) technology. An integrated workflow is developed that consists of the processing of death certificates, the geocoding of street addresses, the quality assurance of geocoded results, the automatic calculation of statistical measures, the standardized encoding of measures and the geo-visualization of statistical outcomes. This paper also introduces a set of auxiliary measures from a geographic distribution perspective to further examine the hidden spatial characteristics of mortality data and to justify the analyzed results. With a common statistical area framework like TGSC, the preliminary results demonstrate promising potential for developing a web-based statistical service that can effectively access domain statistical data and present the analyzed outcomes in meaningful ways to avoid wrong decision making.

Keywords: Mortality map, spatial patterns, statistical area, variation.

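The all-cause ASDR figures quoted above come from direct age standardization. As a minimal illustration of that calculation only (not the paper's GIS workflow or TGSC aggregation), the sketch below computes an age-standardized rate per 100,000 from hypothetical age-specific deaths, populations and standard-population weights.

```python
# Direct age standardization: ASDR = sum over age groups of
# (age-specific death rate) x (standard population weight), scaled per 100,000.
# All numbers below are hypothetical and only illustrate the arithmetic.

deaths = [2, 5, 14, 40, 90]                       # deaths per age group
population = [12000, 9500, 8000, 6000, 3500]      # person-years at risk per age group
std_weights = [0.30, 0.25, 0.20, 0.15, 0.10]      # standard population weights (sum to 1)

def age_standardized_rate(deaths, population, std_weights, per=100_000):
    """Directly standardized rate per `per` persons."""
    rate = 0.0
    for d, n, w in zip(deaths, population, std_weights):
        rate += (d / n) * w
    return rate * per

print(f"ASDR = {age_standardized_rate(deaths, population, std_weights):.1f} per 100,000")
```
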
4764 Statistical Optimization of the Enzymatic Saccharification of the Oil Palm Empty Fruit Bunches

Authors: Rashid S. S., Alam M. Z.

Abstract:

A statistical optimization of the saccharification process of oil palm empty fruit bunches (EFB) was studied. The statistical analysis was done by applying a face-centered central composite design (FCCCD) under response surface methodology (RSM). In this investigation, the EFB dose, enzyme dose and saccharification period were examined, and the maximum yield of reducing sugar, 53.45% (w/w), was found with 4% (w/v) EFB and 10% (v/v) enzyme after 120 hours of incubation. From this, the conversion rate of the cellulose content of the substrate can be calculated to be more than 75% (w/w), which can be considered a remarkable achievement. All of the linear, quadratic and interaction coefficients were found to be highly significant, except for one quadratic and one interaction coefficient. The coefficient of determination (R2) is 0.9898, which confirms a satisfactory fit to the data and indicates that approximately 98.98% of the variability in the dependent variable, the saccharification of EFB, could be explained by this model.

Keywords: Face centered central composite design (FCCCD), Liquid state bioconversion (LSB), Palm oil mill effluent, Trichoderma reesei RUT C-30.

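As a rough sketch of the response-surface step described above (a full quadratic model fitted to designed runs, then judged by R2), the snippet below fits a second-order polynomial in three coded factors by ordinary least squares. The design points and yields are hypothetical placeholders, not the paper's data; the code only makes the R2 interpretation concrete.

```python
import numpy as np

# Hypothetical coded factor settings (EFB dose, enzyme dose, time) and responses.
X = np.array([
    [-1, -1, -1], [ 1, -1, -1], [-1,  1, -1], [ 1,  1, -1],
    [-1, -1,  1], [ 1, -1,  1], [-1,  1,  1], [ 1,  1,  1],
    [-1,  0,  0], [ 1,  0,  0], [ 0, -1,  0], [ 0,  1,  0],
    [ 0,  0, -1], [ 0,  0,  1], [ 0,  0,  0], [ 0,  0,  0],
], dtype=float)
y = np.array([31, 40, 35, 47, 38, 49, 44, 53, 36, 48, 39, 46, 37, 50, 52, 51], dtype=float)

def quadratic_design_matrix(X):
    """Intercept, linear, squared and two-factor interaction terms."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1**2, x2**2, x3**2,
                            x1*x2, x1*x3, x2*x3])

D = quadratic_design_matrix(X)
beta, *_ = np.linalg.lstsq(D, y, rcond=None)   # least-squares coefficients
y_hat = D @ beta
r2 = 1 - np.sum((y - y_hat)**2) / np.sum((y - y.mean())**2)
print("coefficients:", np.round(beta, 3))
print(f"R^2 = {r2:.4f}")   # share of response variability explained by the model
```
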
4763 Defect Detection of Tiles Using 2D-Wavelet Transform and Statistical Features

Authors: M. Ghazvini, S. A. Monadjemi, N. Movahhedinia, K. Jamshidi

Abstract:

In this article, a method is proposed to classify normal and defective tiles using the wavelet transform and artificial neural networks. The proposed algorithm calculates the maximum and minimum medians as well as the standard deviation and average of the detail images obtained from wavelet filters, forms feature vectors from them, and attempts to classify the given tile using a perceptron neural network with a single hidden layer. Along with proposing the median of optimum points as the basic feature and comparing it with the other statistical features in the wavelet domain, this study investigates the relative advantages of the Haar wavelet. The method has been tested on a number of different tile designs and, on average, it has been valid for over 90% of the cases. Among its other advantages, high speed and low computational load are prominent.

Keywords: Defect detection, tile and ceramic quality inspection, wavelet transform, classification, neural networks, statistical features.

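A minimal sketch of the feature pipeline described above (2D Haar decomposition, simple statistics of the detail images, then a single-hidden-layer perceptron) might look as follows. It assumes the PyWavelets and scikit-learn packages and a list of grayscale tile images with known labels; the exact feature set and training data of the paper are not reproduced.

```python
import numpy as np
import pywt                                    # PyWavelets
from sklearn.neural_network import MLPClassifier

def tile_features(image):
    """Statistics of the Haar detail subbands of one grayscale tile image."""
    _, (cH, cV, cD) = pywt.dwt2(image, "haar")  # horizontal, vertical, diagonal details
    feats = []
    for band in (cH, cV, cD):
        feats += [band.mean(), band.std(), np.median(band), band.max(), band.min()]
    return np.array(feats)

# `images` is a list of 2D numpy arrays and `labels` marks each tile as
# 0 (normal) or 1 (defective); both are hypothetical inputs supplied by the caller.
def train_tile_classifier(images, labels):
    X = np.vstack([tile_features(img) for img in images])
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    clf.fit(X, labels)
    return clf
```
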
4762 Statistical Feature Extraction Method for Wood Species Recognition System

Authors: Mohd Iz'aan Paiz Bin Zamri, Anis Salwa Mohd Khairuddin, Norrima Mokhtar, Rubiyah Yusof

Abstract:

Effective statistical feature extraction and classification are important in image-based automatic inspection and analysis. An automatic wood species recognition system is designed to perform wood inspection at customs checkpoints to avoid mislabeling of timber, which results in loss of income to the timber industry. The system focuses on analyzing the statistical properties of pores in the wood images. This paper proposes a fuzzy-based feature extractor which mimics experts’ knowledge of wood texture to extract the properties of the pore distribution from the wood surface texture. The proposed feature extractor consists of two steps, namely pore extraction and fuzzy pore management. A total of 38 statistical features are extracted from each wood image. Then, a backpropagation neural network is used to classify the wood species based on the statistical features. A comprehensive set of experiments on a database composed of 5,200 macroscopic images from 52 tropical wood species was used to evaluate the performance of the proposed feature extractor. The advantage of the proposed feature extraction technique is that it mimics the experts’ interpretation of wood texture, which allows human involvement when analyzing the wood texture. Experimental results show the efficiency of the proposed method.

Keywords: Classification, fuzzy, inspection system, image analysis.

4761 Clustering Mixed Data Using Non-normal Regression Tree for Process Monitoring

Authors: Youngji Yoo, Cheong-Sool Park, Jun Seok Kim, Young-Hak Lee, Sung-Shick Kim, Jun-Geol Baek

Abstract:

In the semiconductor manufacturing process, large amounts of data are collected from various sensors in multiple facilities. The data collected from the sensors have several different characteristics due to variables such as product type, preceding processes and recipes. In general, Statistical Quality Control (SQC) methods assume normality of the data to detect out-of-control states of processes. Although the collected data have different characteristics, using them directly as inputs to SQC will increase the variation of the data, require wide control limits, and decrease the ability to detect out-of-control states. Therefore, it is necessary to separate similar data groups from the mixed data for more accurate process control. In this paper, we propose a regression tree with a split algorithm based on the Pearson distribution system to handle non-normal distributions in a parametric way. The regression tree finds similar properties of the data across different variables. Experiments using real semiconductor manufacturing process data show improved fault detection performance.

Keywords: Semiconductor, non-normal mixed process data, clustering, Statistical Quality Control (SQC), regression tree, Pearson distribution system.

4760 An Optimal Subclass Detection Method for Credit Scoring

Authors: Luciano Nieddu, Giuseppe Manfredi, Salvatore D'Acunto, Katia La Regina

Abstract:

In this paper, a non-parametric statistical pattern recognition algorithm for the problem of credit scoring is presented. The proposed algorithm is based on a k-means clustering algorithm and allows for the determination of subclasses of homogeneous elements in the data. The algorithm is tested on two benchmark datasets and its performance is compared with other well-known pattern recognition algorithms for credit scoring.

Keywords: Constrained clustering, Credit scoring, Statistical pattern recognition, Supervised classification.

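The core idea above, using k-means to split each class into homogeneous subclasses and then classifying new applicants by the nearest subclass centroid, can be sketched as below. This is a plain scikit-learn illustration, not the authors' constrained algorithm, and the number of subclasses per class is an arbitrary assumption.

```python
import numpy as np
from sklearn.cluster import KMeans

def fit_subclass_centroids(X, y, k_per_class=3, random_state=0):
    """Run k-means inside each class; return subclass centroids and their class labels."""
    centroids, centroid_labels = [], []
    for label in np.unique(y):
        km = KMeans(n_clusters=k_per_class, n_init=10, random_state=random_state)
        km.fit(X[y == label])
        centroids.append(km.cluster_centers_)
        centroid_labels += [label] * k_per_class
    return np.vstack(centroids), np.array(centroid_labels)

def predict_by_nearest_subclass(X_new, centroids, centroid_labels):
    """Assign each sample the class of its nearest subclass centroid."""
    d = np.linalg.norm(X_new[:, None, :] - centroids[None, :, :], axis=2)
    return centroid_labels[d.argmin(axis=1)]
```
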
4759 Statistical Characteristics of Distribution of Radiation-Induced Defects under Random Generation

Authors: Pavlo Selyshchev

Abstract:

We consider fluctuations of the defect density taking into account defect interactions. A stochastic field of the displacement generation rate gives a random defect distribution. We determine the statistical characteristics (mean and dispersion) of the random field of the point defect distribution as functions of the defect generation parameters, the temperature and the properties of the irradiated crystal.

Keywords: Irradiation, Primary Defects, Interaction, Fluctuations.

4758 Development of a Biomechanical Method for Ergonomic Evaluation: Comparison with Observational Methods

Authors: M. Zare, S. Biau, M. Croq, Y. Roquelaure

Abstract:

A wide variety of observational methods have been developed to evaluate the ergonomic workloads in manufacturing. However, the precision and accuracy of these methods remain a subject of debate. The aims of this study were to develop biomechanical methods to evaluate ergonomic workloads and to compare them with observational methods.

Two observational methods, i.e. the SCANIA Ergonomic Standard (SES) and the Rapid Upper Limb Assessment (RULA), were used to assess ergonomic workloads at two simulated workstations. The workstations included four tasks, such as tightening and loosening, attachment of tubes, and strapping, as well as other actions. Sensors (inclinometers, accelerometers and goniometers) were also used to measure biomechanical data.

Our findings showed that, in the assessment of some risk factors, both RULA and SES were in agreement with the results of the biomechanical methods. However, there was disagreement on neck and wrist postures. In conclusion, the biomechanical approach was more precise than the observational methods, but some risk factors evaluated with the observational methods were not measurable with the biomechanical techniques developed.

Keywords: Ergonomic, Observational Method, Biomechanical method, Workload.

4757 Detection of Clipped Fragments in Speech Signals

Authors: Sergei Aleinik, Yuri Matveev

Abstract:

In this paper a novel method for the detection of clipping in speech signals is described. It is shown that the new method has better performance than known clipping detection methods, is easy to implement, and is robust to changes in signal amplitude, size of data, etc. Statistical simulation results are presented.

Keywords: Clipping, clipped signal, speech signal processing.

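The abstract does not spell out the detector itself, so the sketch below shows only a common baseline heuristic for the same task: a signal is flagged as clipped when an unusually large share of samples sits within a small margin of the peak amplitude. The threshold values and the synthetic test signal are assumptions, not the authors' method.

```python
import numpy as np

def looks_clipped(signal, margin=0.02, max_fraction=0.01):
    """Baseline clipping check: too many samples piled up near the peak amplitude.

    `margin` is the relative band below the peak that counts as 'at the rail';
    `max_fraction` is the largest share of such samples still considered normal.
    Both thresholds are illustrative assumptions.
    """
    x = np.asarray(signal, dtype=float)
    peak = np.max(np.abs(x))
    if peak == 0:
        return False
    near_rail = np.abs(x) >= (1.0 - margin) * peak
    return near_rail.mean() > max_fraction

# Example: Gaussian noise as a stand-in for speech, hard-limited at 1.5 standard deviations.
rng = np.random.default_rng(0)
speechlike = rng.normal(0.0, 0.2, 16000)
hard_clipped = np.clip(speechlike, -0.3, 0.3)
print(looks_clipped(speechlike), looks_clipped(hard_clipped))   # False True
```
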
4756 An Experience Report on Course Teaching in Information Systems

Authors: Carlos Oliveira

Abstract:

This paper offers a critique of the traditional model of teaching and presents alternative teaching methods, different from the traditional lecture. These methods are accompanied by reports of experience with their application in a class. It was concluded that in a lecture the student has a low learning rate and that other methods should be used to create a more engaging learning environment for the student, contributing to (or facilitating) the learning process. However, the teacher should not use a single method, but rather a range of different methods, to ensure the learning experience does not become repetitive and fatiguing for the student.

Keywords: Educational practices, experience report, IT in education, teaching methods.

4755 4D Flight Trajectory Optimization Based on Pseudospectral Methods

Authors: Kouamana Bousson, Paulo Machado

Abstract:

The optimization and control problem for 4D trajectories is a subject rarely addressed in the literature. In the 4D navigation problem we define waypoints for each mission, and the arrival time is specified at each of them. One way to design trajectories for achieving this kind of mission is to use trajectory optimization concepts. To solve a trajectory optimization problem we can use indirect or direct methods. The indirect methods are based on Pontryagin's maximum principle; in the direct methods, on the other hand, it is necessary to transform the problem into a nonlinear programming problem. We propose an approach based on direct methods with a pseudospectral integration scheme built on Chebyshev polynomials.

Keywords: Pseudospectral Methods, Trajectory Optimization, 4D Trajectories.

4754 Wheat Yield Prediction through Agro Meteorological Indices for Ardebil District

Authors: Fariba Esfandiary, Ghafoor Aghaie, Ali Dolati Mehr

Abstract:

Wheat yield prediction was carried out using different meteorological variables together with agro-meteorological indices in the Ardebil district for the years 2004–2005 and 2005–2006. On the basis of correlation coefficients, the standard error of estimate, and the relative deviation of predicted yield from actual yield under different statistical models, the best subset of agro-meteorological indices was selected, including daily minimum temperature (Tmin), accumulated difference of maximum and minimum temperatures (TD), growing degree days (GDD), accumulated water vapor pressure deficit (VPD), sunshine hours (SH) and potential evapotranspiration (PET). Yield prediction was done two months before harvesting time, coinciding with the commencement of the reproductive stage of wheat (5th of June). In the final statistical models, 83% of the wheat yield variability was accounted for by variation in the above agro-meteorological indices.

Keywords: Wheat yield prediction, agro-meteorological indices, statistical models.

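Two of the indices listed above have simple, widely used definitions; as an illustration (not the authors' exact formulation), the sketch below accumulates growing degree days and the temperature difference from daily records, then fits a linear yield model to a few hypothetical seasons with NumPy.

```python
import numpy as np

def growing_degree_days(tmax, tmin, t_base=5.0):
    """Accumulated GDD: sum of max(0, (Tmax + Tmin)/2 - Tbase) over the season.
    The 5 degC base temperature is an assumption commonly used for wheat."""
    tmax, tmin = np.asarray(tmax, float), np.asarray(tmin, float)
    daily = np.maximum((tmax + tmin) / 2.0 - t_base, 0.0)
    return daily.sum()

def accumulated_temp_difference(tmax, tmin):
    """Accumulated difference of daily maximum and minimum temperatures (TD)."""
    return float(np.sum(np.asarray(tmax, float) - np.asarray(tmin, float)))

# Hypothetical per-season index values (GDD, TD) and observed yields (t/ha), used only
# to show how a linear prediction model could be fitted to agro-meteorological indices.
indices = np.array([[1450.0, 620.0], [1390.0, 580.0], [1520.0, 640.0], [1300.0, 550.0]])
yields = np.array([3.1, 2.8, 3.4, 2.5])

A = np.column_stack([np.ones(len(indices)), indices])     # intercept + GDD + TD
coef, *_ = np.linalg.lstsq(A, yields, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((yields - pred) ** 2) / np.sum((yields - yields.mean()) ** 2)
print("coefficients:", np.round(coef, 4), " R^2 =", round(r2, 3))
```
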
4753 The Effect of Different Pre-Treatment Methods on the Shear Bond Strength of Orthodontic Tubes: An in vitro Study

Authors: A. C. B. C. J. Fernandes, V. C. de Jesus, S. Noruziaan, O. F. G. G. Vilela, K. K. Somarin, R. França, F. H. S. L. Pinheiro

Abstract:

Objective: This in vitro study aimed to evaluate the shear bond strength (SBS) of orthodontic tubes after different enamel pre-treatments. Materials and Methods: A total of 39 crown halves were randomly divided into 3 groups (n = 13). Group I (control group) was exposed to prophy paste (PP), 37% phosphoric acid (PA), and a self-etching primer (SEP). Group II received no prophylaxis, but only PA and SEP. Group III was exposed to PP and SEP. The SBS test was used to evaluate the bond strength of the orthodontic tubes one year after bonding. One-way ANOVA and Tukey’s post-hoc test were used to compare SBS values between the three groups. The statistical significance level was set to 5%. Results: The differences in SBS values between groups I (36.672 ± 9.315 MPa), II (34.242 ± 9.986 MPa), and III (39.055 ± 5.565 MPa) were not statistically significant (P > 0.05). Conclusion: This study suggests that chairside time can be significantly reduced with the use of PP and a SEP without compromising adhesion. Further evidence is needed by means of a split-mouth design trial.

Keywords: Shear bond strength, orthodontic tubes, self-etching primer, pumice, prophy.

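The statistical comparison described above (one-way ANOVA followed by Tukey's post-hoc test at a 5% significance level) can be reproduced in outline with SciPy and statsmodels. The SBS values below are randomly generated stand-ins with roughly the reported means and spreads, not the study's measurements.

```python
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(1)
# Hypothetical SBS samples (MPa), n = 13 per group, loosely matching the reported means/SDs.
g1 = rng.normal(36.7, 9.3, 13)
g2 = rng.normal(34.2, 10.0, 13)
g3 = rng.normal(39.1, 5.6, 13)

f_stat, p_value = f_oneway(g1, g2, g3)          # one-way ANOVA across the three groups
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.3f}")

values = np.concatenate([g1, g2, g3])
groups = ["I"] * 13 + ["II"] * 13 + ["III"] * 13
print(pairwise_tukeyhsd(values, groups, alpha=0.05))   # pairwise Tukey HSD comparisons
```
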
4752 Fuzzy Estimation of Parameters in Statistical Models

Authors: A. Falsafain, S. M. Taheri, M. Mashinchi

Abstract:

Using a set of confidence intervals, we develop a common approach to construct a fuzzy set as an estimator for unknown parameters in statistical models. We investigate a method to derive the explicit and unique membership function of such fuzzy estimators. The proposed method has been used to derive the fuzzy estimators of the parameters of a normal distribution and some functions of the parameters of two normal distributions, as well as the parameters of the exponential and Poisson distributions.

Keywords: Confidence interval, fuzzy number, fuzzy estimation.

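One standard way to turn a nested family of confidence intervals into a fuzzy estimator, in the spirit of the approach above, is to let the membership of a candidate value equal the largest significance level at which it is still covered by the interval. The sketch below does this for the mean of a normal sample using the t-based interval; it is a generic construction and may differ in detail from the authors' membership functions, and the sample data are hypothetical.

```python
import numpy as np
from scipy import stats

def fuzzy_mean_membership(theta, sample):
    """Membership of a candidate mean `theta`: the largest alpha for which theta
    still lies inside the two-sided (1 - alpha) t confidence interval."""
    x = np.asarray(sample, float)
    n, xbar, s = len(x), x.mean(), x.std(ddof=1)
    t_stat = abs(theta - xbar) / (s / np.sqrt(n))
    return 2.0 * stats.t.sf(t_stat, df=n - 1)   # equals 1 at the sample mean, decays outward

sample = [9.8, 10.3, 10.1, 9.7, 10.4, 10.0, 9.9]   # hypothetical observations
for theta in (9.8, 10.0, 10.3, 10.6):
    print(theta, round(fuzzy_mean_membership(theta, sample), 3))
```
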
4751 Characterisation and Classification of Natural Transients

Authors: Ernst D. Schmitter

Abstract:

Monitoring lightning electromagnetic pulses (sferics) and other terrestrial as well as extraterrestrial transient radiation signals is of considerable interest for practical and theoretical purposes in astro- and geophysics as well as meteorology. To manage a continuous flow of data, automation of the detection and classification process is important. Features based on a combination of wavelet and statistical methods proved efficient for the analysis and characterisation of transients and as input into a radial basis function network that is trained to discriminate transients ranging from pulse-like to wave-like.

Keywords: transient signals, statistics, wavelets, neural networks

4750 Event Information Extraction System (EIEE): FSM vs HMM

Authors: Shaukat Wasi, Zubair A. Shaikh, Sajid Qasmi, Hussain Sachwani, Rehman Lalani, Aamir Chagani

Abstract:

Automatic extraction of event information from social text streams (emails, social network sites, blogs, etc.) is a vital requirement for many applications, such as event planning and management systems and security applications. The key information components needed from event-related text are the event title, location, participants, date and time. Emails are very distinct from other social text streams in terms of layout, format and conversation style, and are the most commonly used communication channel for broadcasting and planning events; therefore, we have chosen emails as our dataset. In our work, we have employed two statistical NLP methods, namely Finite State Machines (FSM) and the Hidden Markov Model (HMM), for the extraction of event-related contextual information. An application has been developed providing a comparison between the two methods on the event extraction task. It comprises two modules, one for each method, and works for both bulk and direct user input. The results are evaluated using precision, recall and F-score. Experiments show that both methods produce high performance and accuracy; however, HMM performed better on title extraction, while FSM proved to be better for venue, date and time.

Keywords: Emails, Event Extraction, Event Detection, Finite state machines, Hidden Markov Model.

4749 A Practical Approach for Testing the Process Quality

Authors: Mou-Yuan Liao, Chien-Wei Wu, Chien-Hua Lin

Abstract:

The process capability index Cpk is the most widely used index in making managerial decisions, since it provides bounds on the process yield for normally distributed processes. However, existing methods for assessing process performance that are constructed by statistical inference may unfortunately lead to unreliable results, because uncertainties exist in most real-world applications. Thus, this study adopts fuzzy inference to deal with testing of Cpk. A brief score is obtained for assessing a supplier’s process instead of a severe evaluation.

Keywords: Process capability analysis, quality control.

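For reference, the crisp Cpk index that the fuzzy test above builds on is defined as min(USL − μ, μ − LSL) / (3σ) for specification limits LSL and USL. A minimal computation is sketched below with made-up process data; the paper's fuzzy inference layer is not reproduced.

```python
import numpy as np

def cpk(samples, lsl, usl):
    """Process capability index: min(USL - mean, mean - LSL) / (3 * std)."""
    x = np.asarray(samples, float)
    mu, sigma = x.mean(), x.std(ddof=1)
    return min(usl - mu, mu - lsl) / (3.0 * sigma)

rng = np.random.default_rng(0)
process = rng.normal(10.02, 0.05, 200)      # hypothetical in-control measurements
print(f"Cpk = {cpk(process, lsl=9.85, usl=10.15):.2f}")
```
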
4748 Immobilization of Lipase Enzyme by Low Cost Material: A Statistical Approach

Authors: Md. Z. Alam, Devi R. Asih, Md. N. Salleh

Abstract:

The immobilization of lipase enzyme produced from palm oil mill effluent (POME) on activated carbon (AC), selected from among low-cost support materials, was optimized. The results indicated that an immobilization of 94% was achieved with AC as the most suitable support material. A sequential optimization strategy based on a statistical experimental design, including the one-factor-at-a-time (OFAT) method, was used to determine the equilibrium time. Three components influencing lipase immobilization were optimized by response surface methodology (RSM) based on a face-centered central composite design (FCCCD). On statistical analysis of the results, the optimum enzyme loading concentration, agitation rate and activated carbon dosage were found to be 30 U/ml, 300 rpm and 8 g/L respectively, with a maximum immobilization activity of 3732.9 U/g-AC after 2 hrs of immobilization. Analysis of variance (ANOVA) showed a high regression coefficient (R2) of 0.999, which indicated a satisfactory fit of the model to the experimental data. The parameters were statistically significant at p<0.05.

Keywords: Activated carbon, adsorption, immobilization, POME based lipase.

4747 Neural Networks: From Black Box towards Transparent Box Application to Evapotranspiration Modeling

Authors: A. Johannet, B. Vayssade, D. Bertin

Abstract:

Neural networks are well known for their ability to model nonlinear functions, but, as statistical methods usually do, they take a nonparametric approach; thus, neither a priori nor a posteriori knowledge is easy to take into account. In order to deal with this problem, an original way to encode knowledge inside the architecture is proposed. This method is applied to the problem of evapotranspiration inside a karstic aquifer, a problem of great practical importance for managing water resources.

Keywords: Neural Networks, Hydrology, Evapotranspiration, Hidden Function Modeling.

4746 Assessment of Hargreaves Equation for Estimating Monthly Reference Evapotranspiration in the South of Iran

Authors: Ali Dehgan Moroozeh, B. Farhadi Bansouleh

Abstract:

Evapotranspiration is one of the most important components of the hydrological cycle. Reference evapotranspiration (ETo) is an important variable in the water and energy balances of the earth’s surface, and knowledge of the distribution of ET is a key factor in hydrology, climatology, agronomy and ecology studies. Many researchers have proposed relationships, expressed as functions of climatic factors, to estimate potential evapotranspiration and thereby prevent plant water stress or water loss. The FAO-Penman method (PM) has been recommended as the standard method. This method, however, requires many data, and these data are not available in every area of the world, so other methods should be evaluated for such conditions. When sufficient or reliable data to solve the PM equation are not available, the Hargreaves equation can be used. The Hargreaves equation (HG) requires only the daily mean, maximum and minimum air temperature and the extraterrestrial radiation. In this study, the Hargreaves method (HG) was evaluated at 12 stations in the northwest region of Iran, and the results of the HG and modified Hargreaves (M.HG) methods were compared with the results of the PM method. Statistical analysis of this comparison showed that the calibration process had a significant effect on the efficiency of the Hargreaves method.

Keywords: Evapotranspiration, Hargreaves equation, FAO-Penman method.

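The Hargreaves equation referred to above has the standard FAO-56 form ETo = 0.0023 · Ra · (Tmean + 17.8) · sqrt(Tmax − Tmin), with Ra the extraterrestrial radiation expressed in equivalent millimetres of evaporation. A minimal sketch of the daily calculation follows; the radiation value and temperatures are placeholders, and Ra is passed in rather than computed from latitude and date.

```python
import math

def hargreaves_eto(tmax, tmin, ra_mj):
    """Daily reference evapotranspiration (mm/day) by the Hargreaves equation.

    tmax, tmin : daily maximum and minimum air temperature (degC)
    ra_mj      : extraterrestrial radiation (MJ m-2 day-1); the factor 0.408
                 converts it to equivalent mm/day of evaporation.
    """
    tmean = (tmax + tmin) / 2.0
    ra_mm = 0.408 * ra_mj
    return 0.0023 * ra_mm * (tmean + 17.8) * math.sqrt(max(tmax - tmin, 0.0))

# Placeholder values for a warm summer day at a mid-latitude station.
print(f"ETo = {hargreaves_eto(tmax=33.0, tmin=18.0, ra_mj=40.0):.2f} mm/day")
```
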
4745 Development of a Complex Meteorological Support System for UAVs

Authors: Z. Bottyán, F. Wantuch, A. Z. Gyöngyösi, Z. Tuba, K. Hadobács, P. Kardos, R. Kurunczi

Abstract:

The sensitivity of UAVs to atmospheric effects is apparent. All the same, the meteorological support for UAV missions is often inadequate or partly missing. In this paper we present a new complex meteorological support system for different types of UAV pilots, specialists and decision makers. The system has two important parts with different forecasting approaches: a statistical and a dynamical one. The statistical prediction approach is based on a large climatological database and a special analog method which is able to select similar weather situations from this database and apply them during the forecasting procedure. The dynamical approach uses dedicated WRF model runs twice a day and produces 96-hour, high-resolution weather forecasts for UAV users over Hungary. An easy-to-use web-based system can provide important weather information over the Carpathian Basin in Central Europe, and the products can be accessed via an internet connection.

Keywords: Aviation meteorology, statistical weather prediction, unmanned aerial systems, WRF.

4744 Posture Recognition using Combined Statistical and Geometrical Feature Vectors based on SVM

Authors: Omer Rashid, Ayoub Al-Hamadi, Axel Panning, Bernd Michaelis

Abstract:

It is hard to perceive the interaction process with machines when visual information is not available. In this paper, we address this issue to provide interaction through visual techniques. Posture recognition is performed for American Sign Language (ASL) to recognize static alphabets and numbers. 3D information is exploited to obtain the segmentation of hands and face using a normal Gaussian distribution and depth information. Features for posture recognition are computed using statistical and geometrical properties which are translation, rotation and scale invariant. Hu moments as statistical features, and circularity and rectangularity as geometrical features, are incorporated to build the feature vectors. These feature vectors are used to train an SVM classifier that recognizes the static alphabets and numbers. For the alphabets, curvature analysis is carried out to reduce misclassifications. The experimental results show that the proposed system recognizes posture symbols with recognition rates of 98.65% and 98.6% for ASL alphabets and numbers, respectively.

Keywords: Feature Extraction, Posture Recognition, Pattern Recognition, Application.

4743 Identification of Aircraft Gas Turbine Engines Temperature Condition

Authors: Pashayev A., Askerov D., C. Ardil, Sadiqov R., Abdullayev P.

Abstract:

The groundlessness of applying probability-statistical methods is especially evident at an early stage of aviation gas turbine engine (GTE) technical condition diagnosing, when the volume of information is fuzzy, limited and uncertain; the efficiency of applying the new soft computing technology, using fuzzy logic and neural network methods, at these diagnosing stages is shown. Multiple linear and nonlinear models (regression equations) obtained on the basis of statistical fuzzy data are trained with high accuracy. When sufficient information is available, it is proposed to use a recurrent algorithm for identifying the aviation GTE technical condition from measurements of the input and output parameters of the multiple linear and nonlinear generalized models in the presence of measurement noise (a new recursive least squares method (LSM)). As an application of the given technique, the technical condition of an operating D30KU-154 aviation engine was estimated at an altitude of H = 10,600 m.

Keywords: Identification of a technical condition, aviation gas turbine engine, fuzzy logic and neural networks.

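The recursive least squares idea mentioned above, updating the coefficients of a linear model sample by sample under measurement noise, can be illustrated with the textbook RLS recursion. This is the standard algorithm with a forgetting factor, not the authors' specific new variant, and the regression data are synthetic.

```python
import numpy as np

def rls_identify(Phi, y, lam=0.99, delta=1000.0):
    """Standard recursive least squares with forgetting factor `lam`.

    Phi : (N, p) regressor matrix (rows are input measurements)
    y   : (N,) noisy output measurements
    Returns the final parameter estimate theta (p,).
    """
    n_params = Phi.shape[1]
    theta = np.zeros(n_params)
    P = delta * np.eye(n_params)                  # large initial covariance
    for phi, yk in zip(Phi, y):
        phi = phi.reshape(-1, 1)
        k = P @ phi / (lam + phi.T @ P @ phi)     # gain vector
        err = yk - (phi.T @ theta).item()         # prediction error
        theta = theta + k.flatten() * err
        P = (P - k @ phi.T @ P) / lam
    return theta

# Synthetic linear model y = 2.0*x1 - 1.5*x2 + 0.5 + noise, identified recursively.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
Phi = np.column_stack([X, np.ones(len(X))])
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.5 + rng.normal(0, 0.1, len(X))
print(np.round(rls_identify(Phi, y), 3))          # approximately [ 2.0, -1.5, 0.5 ]
```
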
4742 A File Splitting Technique for Reducing the Entropy of Text Files

Authors: Abdel-Rahman M. Jaradat, , Mansour I. Irshid, Talha T. Nassar

Abstract:

A novel file splitting technique for the reduction of the nth-order entropy of text files is proposed. The technique is based on mapping the original text file into a non-ASCII binary file using a new codeword assignment method; the resulting binary file is then split into several subfiles, each containing one or more bits from each codeword of the mapped binary file. The statistical properties of the subfiles are studied, and it is found that they reflect the statistical properties of the original text file, which is not the case when the ASCII code is used as the mapper. The nth-order entropies of these subfiles are determined, and it is found that the sum of their entropies is less than that of the original text file for the same values of extension. These interesting statistical properties of the resulting subfiles can be used to achieve better compression ratios when conventional compression techniques are applied to these subfiles individually, on a bit-wise rather than a character-wise basis.

Keywords: Bit-wise compression, entropy, file splitting, source mapping.

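To make the entropy comparison concrete, the following sketch computes the first-order (per-symbol) entropy of a byte string and of the subfiles produced by a simple bit-plane split, where subfile i collects bit i of every byte. It illustrates only the "sum of subfile entropies" measurement on a toy ASCII input, so, consistent with the abstract's remark about the ASCII mapper, the sum typically comes out larger than the original per-byte entropy; the paper's non-ASCII codeword mapping and nth-order extension are not reproduced.

```python
import math
from collections import Counter

def entropy_bits_per_symbol(symbols):
    """First-order Shannon entropy of a sequence, in bits per symbol."""
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def bit_plane_subfiles(data: bytes):
    """Split bytes into 8 subfiles; subfile i holds bit i of every byte."""
    return [bytes((b >> i) & 1 for b in data) for i in range(8)]

text = ("statistical methods " * 200).encode("ascii")   # toy input text
planes = bit_plane_subfiles(text)

h_original = entropy_bits_per_symbol(text)               # bits per byte
h_planes = [entropy_bits_per_symbol(p) for p in planes]  # bits per bit-plane symbol
print(f"original: {h_original:.3f} bits/byte")
print(f"sum over ASCII bit planes: {sum(h_planes):.3f} bits/byte")
```
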
4741 Unit Commitment Solution Methods

Authors: Sayeed Salam

Abstract:

An effort to develop a unit commitment approach capable of handling large power systems consisting of both thermal and hydro generating units offers a large profitable return. In order to be feasible, the method to be developed must be flexible, efficient and reliable. In this paper, various proposed methods are described along with their strengths and weaknesses. As all of these methods have some sort of weakness, a comprehensive algorithm that combines the strengths of different methods and overcomes each other's weaknesses would be a suitable approach for solving industry-grade unit commitment problems.

Keywords: Unit commitment, Solution methods, and Comprehensive algorithm.

4740 Identification of Aircraft Gas Turbine Engine's Temperature Condition

Authors: Pashayev A., Askerov D., C. Ardil, Sadiqov R., Abdullayev P.

Abstract:

The groundlessness of applying probability-statistical methods is especially evident at an early stage of aviation gas turbine engine (GTE) technical condition diagnosing, when the volume of information is fuzzy, limited and uncertain; the efficiency of applying the new soft computing technology, using fuzzy logic and neural network methods, at these diagnosing stages is shown. Multiple linear and nonlinear models (regression equations) obtained on the basis of statistical fuzzy data are trained with high accuracy. When sufficient information is available, it is proposed to use a recurrent algorithm for identifying the aviation GTE technical condition from measurements of the input and output parameters of the multiple linear and nonlinear generalized models in the presence of measurement noise (a new recursive least squares method (LSM)). As an application of the given technique, the technical condition of an operating D30KU-154 aviation engine was estimated at an altitude of H = 10,600 m.

Keywords: Identification of a technical condition, aviation gas turbine engine, fuzzy logic and neural networks.

4739 Retail Strategy to Reduce Waste Keeping High Profit Utilizing Taylor's Law in Point-of-Sales Data

Authors: Gen Sakoda, Hideki Takayasu, Misako Takayasu

Abstract:

Waste reduction is a fundamental problem for sustainability. Methods for waste reduction with point-of-sales (POS) data are proposed, utilizing the knowledge of a recent econophysics study on a statistical property of POS data. Concretely, a non-stationary time series analysis method based on the particle filter is developed, which takes into account the anomalous fluctuation scaling known as Taylor's law. The method is extended to handle sales data that are incomplete because of stock-outs, by introducing maximum likelihood estimation for censored data. A way of determining the optimal stock, with pricing of the cost of waste reduction, is also proposed. This study focuses on examining the methods for large sales numbers, where Taylor's law is obvious. Numerical analysis using aggregated POS data shows the effectiveness of the methods in reducing food waste while maintaining a high profit for large sales numbers. Moreover, pricing the cost of waste reduction reveals that a small profit loss realizes a substantial waste reduction, especially when the proportionality constant of Taylor's law is small. Specifically, a profit loss of around 1% realizes a halving of disposal when this constant equals 0.12, which is the actual value for the processed food items used in this research. The methods provide practical and effective solutions for waste reduction while keeping a high profit, especially with large sales numbers.

Keywords: Food waste reduction, particle filter, point of sales, sustainable development goals, Taylor's Law, time series analysis.

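Taylor's law, as used above, states that the variance of sales counts scales as a power of the mean across items, Var ≈ a·Mean^b. A quick way to check this fluctuation scaling on POS-like count data is a log-log regression of per-item variances on per-item means, sketched below on synthetic data; the particle-filter demand model and censored-data likelihood of the paper are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily sales counts for 50 items over 365 days, negative-binomial-like
# so that the variance grows faster than the mean (as real POS data tends to do).
mean_levels = np.geomspace(2, 500, 50)
sales = np.array([rng.negative_binomial(n=5, p=5 / (5 + m), size=365) for m in mean_levels])

item_mean = sales.mean(axis=1)
item_var = sales.var(axis=1, ddof=1)

# Fit log(Var) = log(a) + b * log(Mean): the slope b is the Taylor's-law exponent.
b, log_a = np.polyfit(np.log(item_mean), np.log(item_var), 1)
print(f"Taylor's law fit: Var ~ {np.exp(log_a):.3f} * Mean^{b:.2f}")
```
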
4738 Mechanical Quadrature Methods and Their Extrapolations for Solving First Kind Boundary Integral Equations of Anisotropic Darcy's Equation

Authors: Xin Luo, Jin Huang, Chuan-Long Wang

Abstract:

The mechanical quadrature methods for solving the boundary integral equations of the anisotropic Darcy's equation with Dirichlet conditions in smooth domains are presented. By applying the collectively compact theory, we prove the convergence and stability of the approximate solutions. The asymptotic expansions for the error show that the methods converge with order O(h^3), where h is the mesh size. Based on this analysis, extrapolation methods can be introduced to achieve a higher convergence rate of O(h^5). An a posteriori asymptotic error representation is derived in order to construct self-adaptive algorithms. Finally, numerical experiments show the efficiency of our methods.

Keywords: Darcy's equation, anisotropic, mechanical quadrature methods, extrapolation methods, a posteriori error estimate.

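The extrapolation step above relies on the error expansion E(h) ≈ c3·h^3 + c5·h^5 + …: combining the approximations at mesh sizes h and h/2 as (8·A(h/2) − A(h)) / 7 cancels the h^3 term and leaves an O(h^5) error. The snippet below demonstrates that cancellation on a synthetic approximation with a known exact value; it only illustrates the extrapolation arithmetic, not the boundary-integral solver itself.

```python
def richardson(a_h, a_h2, order=3, ratio=2):
    """Eliminate the leading O(h^order) error term from two approximations
    computed with mesh sizes h and h/ratio."""
    factor = ratio ** order
    return (factor * a_h2 - a_h) / (factor - 1)

EXACT = 1.0

def approx(h):
    """Synthetic method with error expansion c3*h^3 + c5*h^5 (stands in for the
    mechanical quadrature solution at mesh size h)."""
    return EXACT + 0.7 * h**3 - 0.2 * h**5

for h in (0.2, 0.1, 0.05):
    plain = approx(h)
    extrap = richardson(approx(h), approx(h / 2))
    print(f"h={h:5.2f}  error={abs(plain - EXACT):.2e}  "
          f"extrapolated error={abs(extrap - EXACT):.2e}")
```
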
4737 Detecting Circles in Image Using Statistical Image Analysis

Authors: Fathi M. O. Hamed, Salma F. Elkofhaifee

Abstract:

The aim of this work is to detect geometrically shaped objects in an image. In this paper, the object is considered to be a circle. The identification requires finding three characteristics: the number, size, and location of the objects. To achieve the goal of this work, this paper presents an algorithm that combines statistical approaches and image analysis techniques. The algorithm has been implemented to achieve the major objectives of this paper. It has been evaluated using simulated data, where it yields good results, and has then been applied to real data.

Keywords: Image processing, median filter, projection, scale-space, segmentation, threshold.

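The abstract outlines the ingredients (median filtering, thresholding, projection, segmentation) without the full algorithm, so the sketch below shows a simplified pipeline in the same spirit: threshold a grayscale image, label connected components with SciPy, and report each component's centroid and an equivalent-circle radius estimated from its area. The synthetic test image and the simple global threshold are assumptions, not the authors' procedure.

```python
import numpy as np
from scipy import ndimage

def detect_circles(image, threshold):
    """Return (count, list of (row, col, radius)) for bright blobs in `image`,
    treating each connected component as an approximate circle."""
    binary = image > threshold
    labels, count = ndimage.label(binary)                 # connected components
    centers = ndimage.center_of_mass(binary, labels, range(1, count + 1))
    results = []
    for idx, (r, c) in enumerate(centers, start=1):
        area = np.sum(labels == idx)
        radius = np.sqrt(area / np.pi)                    # equivalent-circle radius
        results.append((r, c, radius))
    return count, results

# Synthetic 200x200 test image with two filled circles plus mild noise.
yy, xx = np.mgrid[0:200, 0:200]
img = np.zeros((200, 200))
img[(yy - 60) ** 2 + (xx - 70) ** 2 <= 20 ** 2] = 1.0
img[(yy - 140) ** 2 + (xx - 150) ** 2 <= 12 ** 2] = 1.0
img += np.random.default_rng(0).normal(0, 0.05, img.shape)

n, circles = detect_circles(img, threshold=0.5)
print(n, [(round(r), round(c), round(rad, 1)) for r, c, rad in circles])
```
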