Search results for: proximity measure method (PMM).
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8900

8810 Do C-Test and Cloze Procedure Measure what they Purport to be Measuring? A Case of Criterion-Related Validity

Authors: Masoud Saeedi, Mansour Tavakoli, Shirin Rahimi Kazerooni, Vahid Parvaresh

Abstract:

This article investigated the validity of the C-test and the Cloze test, which purport to measure general English proficiency. To provide empirical evidence pertaining to the validity of the interpretations based on the results of these integrative language tests, their criterion-related validity was investigated. In doing so, the Test of English as a Foreign Language (TOEFL), which is an established, standardized, and internationally administered test of general English proficiency, was used as the criterion measure. Some 90 Iranian English majors participated in this study. They were seniors studying English at a university in Tehran, Iran. The results of the analyses showed that there is a statistically significant correlation among participants' scores on the Cloze test, the C-test, and the TOEFL. Building on the findings of the study and considering criterion-related validity as the evidential basis of the validity argument, it was cautiously deduced that these tests measure the same underlying trait. However, considering the limitations of using criterion measures to validate tests, no absolute claims can be made as to the construct validity of these integrative tests.
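
The criterion-related evidence above rests on simple score correlations. As a minimal sketch of that analysis (with synthetic scores standing in for the actual participant data, which are not reproduced here), the Pearson correlations among the three measures could be computed as follows:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical scores for 90 participants; the real study data are not shown here.
proficiency = rng.normal(500, 60, size=90)            # latent general proficiency
toefl = proficiency + rng.normal(0, 20, size=90)      # criterion measure
cloze = 0.05 * proficiency + rng.normal(0, 2, size=90)
c_test = 0.04 * proficiency + rng.normal(0, 2, size=90)

for name, scores in [("Cloze", cloze), ("C-test", c_test)]:
    r, p = stats.pearsonr(scores, toefl)
    print(f"{name} vs. TOEFL: r = {r:.2f}, p = {p:.4f}")

r, p = stats.pearsonr(cloze, c_test)
print(f"Cloze vs. C-test: r = {r:.2f}, p = {p:.4f}")
```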

Keywords: Integrative testing, C-test, Cloze test, TOEFL, validity.

8809 Theoretical Analysis of the Effect of Accounting for Special Methods in Similarity-Based Cohesion Measurement

Authors: Jehad Al Dallal

Abstract:

Class cohesion is an important object-oriented software quality attribute, and it refers to the degree of relatedness of class attributes and methods. Several class cohesion measures have been proposed in the literature, and for most of them the impact of considering the special methods (i.e., constructors, destructors, and access and delegation methods) in cohesion calculation has not been thoroughly studied theoretically. In this paper, we address this issue for three popular similarity-based class cohesion measures. For each of the considered measures, we theoretically study the impact of including or excluding special methods on the values that are obtained by applying the measure. This study is based on analyzing the definitions and formulas that are proposed for the measures. The results show that including or excluding special methods has a considerable effect on the obtained cohesion values and that this effect varies from one measure to another. The study shows the importance of considering the types of methods that have to be accounted for when proposing a similarity-based cohesion measure.
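
To make the inclusion/exclusion question concrete, the sketch below computes a generic similarity-based cohesion value, here the mean pairwise Jaccard similarity of the attribute sets used by each method, which is only a stand-in for the specific measures analyzed in the paper, for a hypothetical class with and without its constructor:

```python
from itertools import combinations

def cohesion(method_attrs):
    """Mean pairwise Jaccard similarity of the attribute sets used by each method."""
    pairs = list(combinations(method_attrs.values(), 2))
    if not pairs:
        return 0.0
    sims = [len(a & b) / len(a | b) if (a | b) else 0.0 for a, b in pairs]
    return sum(sims) / len(sims)

# Hypothetical class: a constructor touching every attribute plus three regular methods.
methods = {
    "__init__": {"a", "b", "c"},   # special method (constructor)
    "m1": {"a", "b"},
    "m2": {"b", "c"},
    "m3": {"c"},
}

with_special = cohesion(methods)
without_special = cohesion({k: v for k, v in methods.items() if k != "__init__"})
print(f"cohesion with constructor:    {with_special:.3f}")
print(f"cohesion without constructor: {without_special:.3f}")
```

Even in this toy class, dropping the constructor changes the cohesion value noticeably, which is the kind of effect the theoretical analysis quantifies for the real measures.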

Keywords: Object-oriented class, software quality, class cohesion measure, class cohesion, special methods.

8808 Antioxidant Biosensor Using Microbe

Authors: Dyah Iswantini, Trivadila, Novik Nurhidayat, Waras Nurcholis

Abstract:

Antioxidant compounds are needed by the food, beverage, and pharmaceutical industries. For this purpose, an appropriate method is required to measure the antioxidant properties in various types of samples. The spectrophotometric method usually used has some weaknesses, including high cost, long sample preparation time, and low sensitivity. Among the alternative methods developed to overcome these weaknesses is an antioxidant biosensor based on the superoxide dismutase (SOD) enzyme. Therefore, this study was carried out to measure the SOD activity originating from Deinococcus radiodurans and to determine its kinetic properties. A carbon paste electrode modified with ferrocene and immobilized SOD exhibited anodic and cathodic current peaks at potentials of +400 and +300 mV, respectively, for both pure SOD and the SOD of D. radiodurans. This indicated that the current generated was from the superoxide catalytic dismutation reaction by SOD. Optimum conditions for SOD activity were pH 9 and a temperature of 27.5 °C for D. radiodurans SOD, and pH 11 and a temperature of 20 °C for pure SOD. The dismutation reaction kinetics of superoxide catalyzed by SOD followed Lineweaver-Burk kinetics, with the apparent KM of D. radiodurans SOD smaller than that of pure SOD. The result showed that D. radiodurans SOD had higher enzyme-substrate affinity and specificity than pure SOD. It was concluded that D. radiodurans SOD has great potential as a biological recognition component for an antioxidant biosensor.
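
The Lineweaver-Burk treatment mentioned above linearizes the Michaelis-Menten equation as 1/v = (K_M/V_max)(1/[S]) + 1/V_max, so K_M and V_max follow from a straight-line fit of 1/v against 1/[S]. A minimal sketch with made-up rate data (the paper's measurements are not reproduced):

```python
import numpy as np

# Hypothetical substrate concentrations and reaction rates (arbitrary units).
S = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
v = np.array([0.9, 1.5, 2.2, 2.9, 3.4])

# Lineweaver-Burk: 1/v = (KM/Vmax) * (1/S) + 1/Vmax
slope, intercept = np.polyfit(1.0 / S, 1.0 / v, 1)
Vmax = 1.0 / intercept
KM = slope * Vmax
print(f"Vmax ≈ {Vmax:.2f}, apparent KM ≈ {KM:.2f}")
```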

Keywords: Antioxidant biosensor, Deinococcus radiodurans, enzyme kinetic, superoxide dismutase (SOD).

8807 Image Retrieval Using Fused Features

Authors: K. Sakthivel, R. Nallusamy, C. Kavitha

Abstract:

The system is designed to show images which are related to the query image. Extracting color, texture, and shape features from an image plays a vital role in content-based image retrieval (CBIR). Initially, the RGB image is converted into the HSV color space due to its perceptual uniformity. From the HSV image, color features are extracted using a block color histogram, texture features using the Haar transform, and shape features using the fuzzy C-means algorithm. Then, the characteristics of the global and local color histograms, of the texture features obtained through the co-occurrence matrix and the Haar wavelet transform, and of the shape features are compared and analyzed for CBIR. Finally, the best method for each feature is fused during similarity measurement to improve image retrieval effectiveness and accuracy.
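
As a rough sketch of the color branch of such a pipeline, the snippet below converts an RGB image to HSV, builds block hue histograms, and scores database images against a query by histogram intersection; the Haar texture and fuzzy C-means shape branches, and the final fusion step, are omitted, and the images are synthetic:

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv

def block_color_histogram(rgb, blocks=2, bins=8):
    """Concatenate per-block hue histograms of an RGB image with values in [0, 1]."""
    hsv = rgb_to_hsv(rgb)
    h, w = hsv.shape[:2]
    feats = []
    for i in range(blocks):
        for j in range(blocks):
            patch = hsv[i*h//blocks:(i+1)*h//blocks, j*w//blocks:(j+1)*w//blocks, 0]
            hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
            feats.append(hist / (hist.sum() + 1e-12))
    return np.concatenate(feats)

def histogram_intersection(f1, f2):
    return np.minimum(f1, f2).sum() / min(f1.sum(), f2.sum())

rng = np.random.default_rng(0)
query = rng.random((64, 64, 3))
database = [rng.random((64, 64, 3)) for _ in range(5)]

qf = block_color_histogram(query)
scores = [histogram_intersection(qf, block_color_histogram(img)) for img in database]
print("similarity to query:", np.round(scores, 3))
```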

Keywords: Color histogram, Haar wavelet transform, fuzzy C-means, co-occurrence matrix, similarity measure.

8806 Simultaneous Clustering and Feature Selection Method for Gene Expression Data

Authors: T. Chandrasekhar, K. Thangavel, E. N. Sathishkumar

Abstract:

Microarrays have made it possible to simultaneously monitor the expression profiles of thousands of genes under various experimental conditions. They are used to identify the co-expressed genes in specific cells or tissues that are actively used to make proteins. This method is used to analyze gene expression, an important task in bioinformatics research. Cluster analysis of gene expression data has proved to be a useful tool for identifying co-expressed genes and biologically relevant groupings of genes and samples. In this work, the K-means algorithm has been applied for clustering of gene expression data. Further, the rough set based Quick Reduct algorithm has been applied to each cluster in order to select the most similar genes having high correlation. Then the ACV measure is used to evaluate the refined clusters, and classification is used to evaluate the proposed method. The approach identifies compact clusters, and the feature selection step is used to select the relevant genes.
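
The clustering step can be sketched with scikit-learn's K-means on a synthetic expression matrix; the Quick Reduct refinement and the ACV evaluation are not shown:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Hypothetical expression matrix: 300 genes x 20 experimental conditions.
expression = rng.normal(size=(300, 20))

kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(expression)
labels = kmeans.labels_

for k in range(5):
    members = np.where(labels == k)[0]
    print(f"cluster {k}: {len(members)} genes, e.g. genes {members[:5].tolist()}")
```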

Keywords: Clustering, Feature selection, Gene expression data, Quick reduct.

8805 Bi-Directional Evolutionary Topology Optimization Based on Critical Fatigue Constraint

Authors: Khodamorad Nabaki, Jianhu Shen, Xiaodong Huang

Abstract:

This paper develops a method for considering the critical fatigue stress as a constraint in the Bi-directional Evolutionary Structural Optimization (BESO) method. Our aim is to reach an optimal design in which high-cycle fatigue failure does not occur for a specific lifetime. The critical fatigue stress is calculated based on the modified Goodman criterion and used as a stress constraint in our topology optimization problem. Since fatigue generally does not occur under compressive stresses, we use a p-norm stress aggregation that takes the highest tensile principal stress at each point as the stress measure to calculate the sensitivity numbers. The BESO method has been extended to minimize the volume of an object subjected to the critical fatigue stress constraint. The optimization results are compared with the results from the compliance minimization problem, which clearly shows the merits of our newly developed approach.
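
The p-norm aggregation of tensile principal stresses referred to above can be written as σ_PN = (Σ_e ⟨σ_1,e⟩^p)^(1/p), where ⟨·⟩ keeps only positive (tensile) values. A small sketch on hypothetical element stresses:

```python
import numpy as np

def p_norm_tensile_stress(principal_stresses, p=8):
    """Smooth approximation of the maximum tensile principal stress over all elements."""
    tensile = np.maximum(principal_stresses, 0.0)   # compressive stresses do not drive fatigue
    return (np.sum(tensile ** p)) ** (1.0 / p)

# Hypothetical maximum principal stress per finite element (MPa).
sigma_1 = np.array([120.0, -80.0, 95.0, 150.0, -30.0, 60.0])
print(f"p-norm stress measure: {p_norm_tensile_stress(sigma_1):.1f} MPa")
print(f"true maximum tensile stress: {sigma_1.max():.1f} MPa")
```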

Keywords: Topology optimization, BESO method, p-norm, fatigue constraint.

8804 The Measurement of Endogenous Higher-Order Formative Composite Variables in PLS-SEM: An Empirical Application from CRM System Development

Authors: Samppa Suoniemi, Harri Terho, Rami Olkkonen

Abstract:

In recent methodological articles related to structural equation modeling (SEM), the question of how to measure endogenous formative variables has been raised as an urgent, unresolved issue. This research presents an empirical application from the CRM system development context to test a recently developed technique, which makes it possible to measure endogenous formative constructs in structural models. PLS path modeling is used to demonstrate the feasibility of measuring antecedent relationships at the formative indicator level, not the formative construct level. Empirical results show that this technique is a promising approach to measure antecedent relationships of formative constructs in SEM.

Keywords: CRM system development, formative measures, PLS path modeling, research methodology.

8803 A Sparse Representation Speech Denoising Method Based on Adapted Stopping Residue Error

Authors: Qianhua He, Weili Zhou, Aiwu Chen

Abstract:

A sparse representation speech denoising method based on an adapted stopping residue error is presented in this paper. Firstly, the cross-correlation between the clean speech spectrum and the noise spectrum was analyzed, and an estimation method was proposed. In the denoising method, an over-complete dictionary of the clean speech power spectrum was learned with the K-singular value decomposition (K-SVD) algorithm. In the sparse representation stage, the stopping residue error was adaptively set according to the estimated cross-correlation and the adjusted noise spectrum, and the orthogonal matching pursuit (OMP) approach was applied to reconstruct the clean speech spectrum from the noisy speech. Finally, the clean speech was re-synthesised via the inverse Fourier transform with the reconstructed speech spectrum and the noisy speech phase. The experimental results show that the proposed method outperforms the conventional methods in terms of subjective and objective measures.
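
A minimal sketch of the sparse-coding step, using scikit-learn's orthogonal matching pursuit with a residual-error tolerance standing in for the adapted stopping residue; the dictionary here is random rather than K-SVD-trained, and the signal is synthetic:

```python
import numpy as np
from sklearn.linear_model import orthogonal_mp

rng = np.random.default_rng(0)
n_features, n_atoms = 64, 256

# Hypothetical over-complete dictionary; in the paper it would be learned with K-SVD.
D = rng.normal(size=(n_features, n_atoms))
D /= np.linalg.norm(D, axis=0)

# Noisy observation of a 3-sparse signal.
coef_true = np.zeros(n_atoms)
coef_true[rng.choice(n_atoms, 3, replace=False)] = rng.normal(size=3)
y = D @ coef_true + 0.05 * rng.normal(size=n_features)

# Stop when the squared residual norm falls below the (estimated) noise level.
stopping_residue = n_features * 0.05 ** 2
coef = orthogonal_mp(D, y, tol=stopping_residue)
y_hat = D @ coef
print(f"selected atoms: {np.count_nonzero(coef)}, residual: {np.linalg.norm(y - y_hat):.3f}")
```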

Keywords: Speech denoising, sparse representation, K-singular value decomposition, orthogonal matching pursuit.

8802 Quantification of Heart Rate Variability: A Measure based on Unique Heart Rates

Authors: V. I. Thajudin Ahamed, P. Dhanasekaran, A. Naseem, N. G. Karthick, T. K. Abdul Jaleel, Paul K. Joseph

Abstract:

It is established that the instantaneous heart rate (HR) of healthy humans keeps on changing. Analysis of heart rate variability (HRV) has become a popular non-invasive tool for assessing the activities of the autonomic nervous system. Depressed HRV has been found in several disorders characterised by autonomic nervous dysfunction, like diabetes mellitus (DM) and coronary artery disease. A new technique, which searches for pattern repeatability in a time series, is proposed specifically for the analysis of heart rate data. This set of indices, termed the pattern repeatability measure and the pattern repeatability ratio, is compared with approximate entropy and sample entropy. In our analysis, based on the method developed, it is observed that heart rate variability is significantly different for DM patients, particularly for patients with diabetic foot ulcer.
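
Sample entropy, one of the baselines mentioned above, is defined as SampEn(m, r) = −ln(A/B), where B counts template matches of length m within tolerance r and A counts matches of length m + 1. A compact sketch on a synthetic RR-interval series:

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy SampEn(m, r) of a 1-D series, with r = r_factor * std(x)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    N = len(x)

    def count_matches(length):
        # Use the first N - m templates for both lengths, as in the standard definition.
        templates = np.array([x[i:i + length] for i in range(N - m)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates - templates[i]), axis=1)  # Chebyshev distance
            count += np.sum(dist <= r) - 1  # exclude the self-match
        return count

    B = count_matches(m)
    A = count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(0)
rr_intervals = 0.8 + 0.05 * np.sin(np.arange(300) / 10) + 0.01 * rng.normal(size=300)
print(f"SampEn: {sample_entropy(rr_intervals):.3f}")
```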

Keywords: Autonomic nervous system, diabetes mellitus, heart rate variability, pattern identification, sample entropy

8801 A Fuzzy-Rough Feature Selection Based on Binary Shuffled Frog Leaping Algorithm

Authors: Javad Rahimipour Anaraki, Saeed Samet, Mahdi Eftekhari, Chang Wook Ahn

Abstract:

Feature selection and attribute reduction are crucial problems and widely used techniques in the fields of machine learning, data mining, and pattern recognition to overcome the well-known phenomenon of the curse of dimensionality. This paper presents a feature selection method that efficiently carries out attribute reduction, thereby selecting the most informative features of a dataset. It consists of two components: 1) a measure for feature subset evaluation, and 2) a search strategy. For the evaluation measure, we have employed the fuzzy-rough dependency degree (FRDD) of the lower approximation-based fuzzy-rough feature selection (L-FRFS) due to its effectiveness in feature selection. As for the search strategy, a modified version of the binary shuffled frog leaping algorithm (B-SFLA) is proposed. The proposed feature selection method is obtained by hybridizing the B-SFLA with the FRDD. Nine classifiers have been employed to compare the proposed approach with several existing methods over twenty-two datasets, including nine high-dimensional and large ones, from the UCI repository. The experimental results demonstrate that the B-SFLA approach significantly outperforms other metaheuristic methods in terms of the number of selected features and the classification accuracy.

Keywords: Binary shuffled frog leaping algorithm, feature selection, fuzzy-rough set, minimal reduct.

8800 Q-Test of Undergraduate Epistemology and Scientific Thought: Development and Testing of an Assessment of Scientific Epistemology

Authors: Matthew J. Zagumny

Abstract:

The QUEST is an assessment of scientific epistemic beliefs and was developed to measure students' intellectual development with regard to beliefs about knowledge and knowing. The QUEST utilizes Q-sort methodology, which requires participants to rate the degree to which statements describe them personally. As a measure of personal theories of knowledge, the QUEST instrument is described, with the Q-sort distribution and scoring explained. A preliminary demonstration of the QUEST assessment is described with two samples of undergraduate students (novice/lower-division compared to advanced/upper-division students) being assessed and their average QUEST scores compared. The usefulness of an assessment of epistemology is discussed in terms of the principle that assessment tends to drive educational practice and university mission. The critical need for university and academic programs to focus on the development of students' scientific epistemology is briefly discussed.

Keywords: Scientific epistemology, critical thinking, Q-sort method, STEM undergraduates.

8799 Improving the Design of Blood Pressure and Blood Saturation Monitors

Authors: L. Parisi

Abstract:

A blood pressure monitor or sphygmomanometer can be either manual or automatic, employing the auscultatory method or the oscillometric method, respectively. The manual version of the sphygmomanometer involves an inflatable cuff with a stethoscope used to detect the sounds generated by the arterial walls in order to measure blood pressure in an artery. An automatic sphygmomanometer can be effectively used to monitor blood pressure through a pressure sensor, which detects vibrations provoked by oscillations of the arterial walls. The pressure sensor implemented in this device improves the accuracy of the measurements taken.

Keywords: Blood pressure, blood saturation, sensors, actuators, design improvement.

8798 Uncertainty Multiple Criteria Decision Making Analysis for Stealth Combat Aircraft Selection

Authors: C. Ardil

Abstract:

Fuzzy set theory and its extensions (intuitionistic fuzzy sets, picture fuzzy sets, and neutrosophic sets) have been widely used to address imprecision and uncertainty in complex decision-making. However, they may struggle with inherent indeterminacy and inconsistency in real-world situations. This study introduces uncertainty sets as a promising alternative, offering a structured framework for incorporating both types of uncertainty into decision-making processes. This work explores the theoretical foundations and applications of uncertainty sets. A novel decision-making algorithm based on uncertainty set-based proximity measures is developed and demonstrated through a practical application: selecting the most suitable stealth combat aircraft.

The results highlight the effectiveness of uncertainty sets in ranking alternatives under uncertainty. Uncertainty sets offer several advantages, including structured uncertainty representation, robust ranking mechanisms, and enhanced decision-making capabilities due to their ability to account for ambiguity. Future research directions are also outlined, including comparative analysis with existing MCDM methods under uncertainty, sensitivity analysis to assess the robustness of rankings, and broader application to various MCDM problems with diverse complexities. By exploring these avenues, uncertainty sets can be further established as a valuable tool for navigating uncertainty in complex decision-making scenarios.
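
The uncertainty-set proximity measure itself is specific to this paper, but the general idea of ranking alternatives by closeness to an ideal profile can be illustrated with a generic normalized distance-to-ideal ranking on a hypothetical decision matrix; this is not the authors' formulation:

```python
import numpy as np

# Hypothetical decision matrix: 4 aircraft x 3 benefit criteria (larger is better).
scores = np.array([
    [7.5, 8.0, 6.5],
    [8.2, 7.1, 7.8],
    [6.9, 8.4, 7.2],
    [7.8, 7.6, 8.1],
])
weights = np.array([0.5, 0.3, 0.2])

# Normalize each criterion to [0, 1], then measure weighted proximity to the ideal point.
norm = (scores - scores.min(axis=0)) / (scores.max(axis=0) - scores.min(axis=0))
ideal = norm.max(axis=0)
distance = np.sqrt(((norm - ideal) ** 2 * weights).sum(axis=1))
proximity = 1.0 - distance / distance.max()

for rank, idx in enumerate(np.argsort(-proximity), start=1):
    print(f"rank {rank}: alternative A{idx + 1}, proximity {proximity[idx]:.3f}")
```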

Keywords: Uncertainty set, stealth combat aircraft selection, multiple criteria decision-making analysis, MCDM, uncertainty proximity analysis.

8797 Introduction of the Harmfulness of the Seismic Signal in the Assessment of the Performance of Reinforced Concrete Frame Structures

Authors: Kahil Amar, Boukais Said, Kezmane Ali, Hamizi Mohand, Hannachi Naceur Eddine

Abstract:

The principle of seismic performance evaluation methods is to provide a measure of the capability of a building, or set of buildings, to be damaged by an earthquake. The common objective of many of these methods is to supply classification criteria. The purpose of this study is to present a method for assessing the seismic performance of structures based on the pushover method; we are particularly interested in reinforced concrete frame structures, which represent a significant percentage of damaged structures after a seismic event. The work is based on the characterization of the seismic motion of the various earthquake zones in terms of PGA and PGD, obtained by means of the SIMQK_GR and PRISM software, and the correlation between the performance points and the scalar characterizing the earthquakes is developed.

Keywords: Seismic performance, Pushover method, characterization of seismic motion, harmfulness of the seismic signal

8796 Limits of Phase Modulated Frequency Shifted Holographic Vibrometry at Low Amplitudes of Vibrations

Authors: Pavel Psota, Vít Lédl, Jan Václavík, Roman Doleček, Pavel Mokrý, Petr Vojtíšek

Abstract:

This paper presents advanced time-average digital holography by means of frequency shift and phase modulation. This technique can measure amplitudes of vibrations over a very wide dynamic range, while the amplitude distribution is evaluated independently in every pixel. The main focus of the paper is to gain insight into the behavior of the method at low amplitudes of vibrations. In order to reach that, a set of experiments was performed. Results of the experiments, together with a novel noise suppression, show the limit of the method to be below 0.1 nm.

Keywords: Acousto-optical modulator, digital holography, low amplitudes, vibrometry.

8795 Simulation of 3D Flow using Numerical Model at Open-channel Confluences

Authors: R. Goudarzizadeh, S. H. Mousavi Jahromi, N. Hedayat

Abstract:

This paper numerically investigates the 3D flow pattern at the confluence of two rectangular channels meeting at a 90° angle, using the Navier-Stokes equations with the Reynolds Stress turbulence Model (RSM). The equations are solved by the Finite Volume Method (FVM), and the flow is analyzed under steady-state (single-phase) conditions. The Shumate experimental findings were used to test the validity of the data. Comparison of the simulation model with the experimental one indicated a close proximity between the flow patterns of the two sets. Effects of the discharge ratio on the dimensions of the separation zone created in the main channel downstream of the confluence indicated an inverse relation, where a decrease in discharge ratio entails an increase in the length and width of the separation zone. The study also found the model to be a powerful analytical tool in the feasibility study of hydraulic engineering projects.

Keywords: 90° confluence angle, flow separation zone, numerical modeling, turbulent flow.

8794 Identification of Nonlinear Systems Using Radial Basis Function Neural Network

Authors: C. Pislaru, A. Shebani

Abstract:

This paper uses the radial basis function neural network (RBFNN) for system identification of nonlinear systems. Five nonlinear systems are used to examine the performance of the RBFNN in modeling nonlinear systems: a dual tank system, a single tank system, a DC motor system, and two academic models. The feed-forward method is considered in this work for modelling the nonlinear dynamic models. The K-means clustering algorithm is used in this paper to select the centers of the radial basis function network because it is reliable, offers fast convergence, and can handle large data sets. The least mean square method is used to adjust the weights of the output layer, and the Euclidean distance is used to measure the width of the Gaussian function.
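
A compact sketch of the pipeline just described: K-means picks the Gaussian centers, a shared width is derived from the Euclidean spread of the centers, and the output weights are fitted here by batch least squares rather than the iterative LMS update (a simplification); the data come from a made-up nonlinear static system:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical nonlinear system to identify: y = sin(u) + 0.3 u^2 + noise.
u = rng.uniform(-3, 3, size=(400, 1))
y = np.sin(u[:, 0]) + 0.3 * u[:, 0] ** 2 + 0.05 * rng.normal(size=400)

# 1) K-means selects the RBF centers.
n_centers = 15
centers = KMeans(n_clusters=n_centers, n_init=10, random_state=0).fit(u).cluster_centers_

# 2) Shared width from the Euclidean spread of the centers (a common heuristic).
d_max = max(np.linalg.norm(c1 - c2) for c1 in centers for c2 in centers)
sigma = d_max / np.sqrt(2 * n_centers)

# 3) Hidden-layer activations and least-squares output weights.
dists = np.linalg.norm(u[:, None, :] - centers[None, :, :], axis=2)
Phi = np.exp(-(dists ** 2) / (2 * sigma ** 2))
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

y_hat = Phi @ w
print(f"training RMSE: {np.sqrt(np.mean((y - y_hat) ** 2)):.4f}")
```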

Keywords: System identification, Nonlinear system, Neural networks, RBF neural network.

8793 Ranking Genes from DNA Microarray Data of Cervical Cancer by a local Tree Comparison

Authors: Frank Emmert-Streib, Matthias Dehmer, Jing Liu, Max Muhlhauser

Abstract:

The major objective of this paper is to introduce a new method to select genes from DNA microarray data. As a criterion for selecting genes, we suggest measuring the local changes in the correlation graph of each gene and selecting those genes whose local changes are largest. More precisely, we calculate the correlation networks from DNA microarray data of cervical cancer, where each network represents a tissue of a certain tumor stage and each node in the network represents a gene. From these networks we extract one tree for each gene by a local decomposition of the correlation network. The interpretation of a tree is that it represents the n-nearest-neighbor genes on the n-th level of the tree, measured by the Dijkstra distance, and hence gives the local embedding of a gene within the correlation network. For the obtained trees we measure the pairwise similarity between trees rooted at the same gene from normal to cancerous tissues. This evaluates the modification of the tree topology due to tumor progression. Finally, we rank the obtained similarity values from all tissue comparisons and select the top-ranked genes. For these genes the local neighborhood in the correlation networks changes most between normal and cancerous tissues. As a result we find that the top-ranked genes are candidates suspected to be involved in tumor growth. This indicates that our method captures essential information from the underlying DNA microarray data of cervical cancer.

Keywords: Graph similarity, generalized trees, graph alignment, DNA microarray data, cervical cancer.

8792 Vessel Inscribed Trigonometry to Measure the Vessel Progressive Orientations in the Digital Fundus Image

Authors: Pil Un Kim, Yunjung Lee, Gihyoun Lee, Jin Ho Cho, Myoung Nam Kim

Abstract:

In this paper, vessel inscribed trigonometry (VITM) for measuring the vessel progression orientation (VPO) is proposed for two-dimensional fundus images. The VPO is a major factor in optic disc (OD) detection, which is a basic process in retina analysis. To measure the VPO, vessel skeletons (VS) are used. First, the vessels are classified into three classes, vessel end, vessel branch, and vessel stem, and the chain code maps of the VS are generated. Next, the two farthest neighborhoods of each point on the VS are searched by the proposed angle restriction. Lastly, the gradient of the straight line between the two farthest neighborhoods is estimated to measure the VPO. VITM is validated by comparison with manual results and 2D Gaussian templates. The results of experiments that applied VITM to detect the OD in fundus images confirm that the VPO obtained by the proposed measurement is accurate enough for OD detection.

Keywords: Angle measurement, Optic disc, Retina vessel, Vessel progression orientation.

8791 Development of a Non-invasive System to Measure the Thickness of the Subcutaneous Adipose Tissue Layer for Human

Authors: Hyuck Ki Hong, Young Chang Jo, Yeon Shik Choi, Beom Joon Kim, Hyo Derk Park

Abstract:

To measure the thickness of the subcutaneous adipose tissue layer, a non-invasive optical measurement system (λ = 1300 nm) is introduced. Animal and human subjects are used for the experiments. The results for the human subjects are compared with the data of ultrasound device measurements, and a high correlation (r = 0.94 for n = 11) is observed. There are two modes in the corresponding signals measured by the optical system, which can be explained by two-layered and three-layered tissue models. If the target tissue is thinner than the critical thickness, the data detected using the diffuse reflectance method follow the three-layered tissue model, so the data increase as the thickness increases. On the other hand, if the target tissue is thicker than the critical thickness, the data follow the two-layered tissue model, so they decrease as the thickness increases.

Keywords: Subcutaneous adipose tissue layer, non-invasive measurement system, two-layered and three-layered tissue models.

8790 Performance Evaluation of Universities as Groups of Decision Making Units

Authors: Ali Payan, Bijan Rahmani Parchicolaie

Abstract:

Universities have different offices such as educational, research, student, administrative, and financial offices. This paper considers universities as groups of decision making units (DMUs) in which the DMUs are their offices. This approach gives a fairer evaluation of universities than a separate evaluation of the offices of universities. The proposed approach to evaluate the group performance of universities is based on the common set of weights method in data envelopment analysis (DEA). The suggested method not only can compare groups and measure their efficiencies, but also can calculate the efficiency of units within each group and the efficiency spread of groups. Finally, the suggested method is applied to the analysis of the performance of universities in the 14th district of Islamic Azad University as groups under evaluation.

Keywords: Common set of weights, group efficiency, performance analysis, spread efficiency.

8789 Prioritization of Mutation Test Generation with Centrality Measure

Authors: Supachai Supmak, Yachai Limpiyakorn

Abstract:

Mutation testing can be applied for the quality assessment of test cases. Prioritization of mutation test generation has been a critical element of industry practice that would contribute to the evaluation of test cases. Industry generally delivers products under time-to-market pressure and thus inevitably sacrifices software testing tasks, even though many test cases are required for software verification. This paper presents an approach of applying a social network centrality measure, PageRank, to prioritize mutation test generation. The source code modules with the highest PageRank values are focused on first when developing test cases, as these modules are vulnerable to defects or anomalies which may cause consequent defects in many other associated modules. Moreover, the approach helps identify the reducible test cases in the test suite while still maintaining the same criteria as the original set of test cases.
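
A short sketch of the ranking step: build a directed dependency graph over modules and order them by PageRank. The graph below is a small hypothetical example and networkx's default PageRank parameters are used:

```python
import networkx as nx

# Hypothetical call/dependency graph: an edge u -> v means module u depends on module v.
edges = [
    ("OrderService", "PaymentGateway"),
    ("OrderService", "InventoryRepo"),
    ("CheckoutController", "OrderService"),
    ("AdminController", "OrderService"),
    ("ReportJob", "InventoryRepo"),
    ("PaymentGateway", "AuditLogger"),
]
graph = nx.DiGraph(edges)

ranks = nx.pagerank(graph)  # default damping factor 0.85
for module, score in sorted(ranks.items(), key=lambda kv: -kv[1]):
    print(f"{score:.3f}  {module}  <- prioritize mutation test generation here first")
```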

Keywords: Software testing, mutation test, network centrality measure, test case prioritization.

8788 Tracking Objects in Color Image Sequences: Application to Football Images

Authors: Mourad Moussa, Ali Douik, Hassani Messaoud

Abstract:

In this paper, we present a comparative study between two computer vision systems for object recognition and tracking. These algorithms describe two different approaches based on regions constituted by sets of pixels, which parameterize objects in shot sequences. For image segmentation and object detection, the FCM technique is used; the overlap between cluster distributions is minimized by the use of a suitable color space (other than the RGB one). The first technique takes into account a priori probabilities governing the computation of the various clusters to track objects. A Parzen kernel method is described that allows identifying the players in each frame, and we also show the importance of searching for the standard deviation value of the Gaussian probability density function. Region matching is carried out by an algorithm that operates on the Mahalanobis distance between region descriptors in two subsequent frames and uses singular value decomposition to compute a set of correspondences satisfying both the principle of proximity and the principle of exclusion.
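
The region-matching criterion relies on the Mahalanobis distance between region descriptors in consecutive frames; a minimal sketch with hypothetical descriptors is given below (the Parzen-based player identification and the SVD correspondence step are not shown):

```python
import numpy as np

def mahalanobis(x, y, cov_inv):
    d = x - y
    return float(np.sqrt(d @ cov_inv @ d))

rng = np.random.default_rng(0)
# Hypothetical region descriptors (e.g. mean color + centroid) in two consecutive frames.
regions_t = rng.normal(size=(4, 5))
regions_t1 = regions_t + 0.1 * rng.normal(size=(4, 5))   # slightly displaced regions
rng.shuffle(regions_t1)

# Covariance estimated from all descriptors, as a proxy for the descriptor statistics.
cov_inv = np.linalg.inv(np.cov(np.vstack([regions_t, regions_t1]).T) + 1e-6 * np.eye(5))

for i, r in enumerate(regions_t):
    dists = [mahalanobis(r, s, cov_inv) for s in regions_t1]
    print(f"region {i} in frame t matches region {int(np.argmin(dists))} in frame t+1")
```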

Keywords: Image segmentation, objects tracking, Parzen window, singular value decomposition, target recognition.

8787 Distribution Sampling of Vector Variance without Duplications

Authors: Erna T. Herdiani, Maman A. Djauhari

Abstract:

In recent years, the use of vector variance as a measure of multivariate variability has received much attention in a wide range of statistics. This paper deals with a more economical measure of multivariate variability, defined as vector variance minus all duplicated elements. For high-dimensional data, this increases the computational efficiency by almost 50% compared to the original vector variance. Its sampling distribution is investigated to make its applications possible.
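
Taking vector variance, as is common in this line of work, to be the sum of squared elements of the sample covariance matrix, the "no duplication" version keeps each off-diagonal element only once. A small numeric sketch on synthetic data (an illustration under that assumption, not the paper's derivation):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))           # hypothetical multivariate sample, p = 6
S = np.cov(X, rowvar=False)             # sample covariance matrix

vector_variance = np.sum(S ** 2)                   # all p*p elements
upper = S[np.triu_indices_from(S)]                 # diagonal plus each covariance once
vv_no_duplications = np.sum(upper ** 2)            # p*(p+1)/2 elements

print(f"vector variance:                {vector_variance:.4f}")
print(f"without duplicated covariances: {vv_no_duplications:.4f}")
print(f"elements used: {S.size} vs. {upper.size}")
```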

Keywords: Asymptotic distribution, covariance matrix, likelihood ratio test, vector variance.

8786 Decision Tree-based Feature Ranking using Manhattan Hierarchical Cluster Criterion

Authors: Yasmin Mohd Yacob, Harsa A. Mat Sakim, Nor Ashidi Mat Isa

Abstract:

Feature selection is gaining importance due to its contribution to saving classification cost in terms of time and computation load. In the search for essential features, one of the methods is to search via a decision tree. The decision tree acts as an intermediate feature space inducer in order to choose essential features. In decision tree-based feature selection, some studies used the decision tree as a feature ranker with a direct threshold measure, while others retained the decision tree but utilized a pruning condition that acts as a threshold mechanism to choose features. This paper proposes a threshold measure using the Manhattan hierarchical cluster distance to be utilized in feature ranking in order to choose relevant features as part of the feature selection process. The result is promising, and this method can be improved in the future by including test cases with a higher number of attributes.

Keywords: Feature ranking, decision tree, hierarchical cluster, Manhattan distance.

8785 The Effect of Pilates Method in Scholar’s Trunk Strength and Hamstring Flexibility: Gender Differences

Authors: Noelia González-Gálvez, María Carrasco Poyatos, Pablo Jorge Marcos Pardo, Yuri Feito

Abstract:

Musculoskeletal injuries in school children could be reduced by improving trunk strength and hamstring flexibility. Low levels of trunk muscle strength and hamstring flexibility may result in acute and chronic musculoskeletal diseases. The Pilates Method can be appropriate to improve these physical condition attributes and has rarely been employed with this group. On the other hand, it has been shown that trunk strength and flexibility differ between genders, but there is no evidence about the effect of exercise programs designed to improve both items in school children. Therefore, the objective of this study was to measure the effect of a six-week Pilates-based exercise program on the trunk strength and hamstring flexibility of 14-year-old school children, establishing differences by gender. The sample was composed of 57 students divided into an experimental group (EG; n=30) and a control group (CG; n=27). The Bench Trunk Curl test (BTC), the Sörensen test, and the Toe-touch test (TT) were used to measure dynamic muscular resistance in trunk flexion, isometric strength in trunk extension, and hamstring flexibility, respectively. The EG followed the Pilates exercise program for six weeks (2 days/week, 55 minutes/session). After this period of training, the EG improved trunk strength and hamstring flexibility significantly, but there were no significant differences within the CG. Although boys were better in the BTC test and girls were better in the TT test, there were no significant differences between them.

Keywords: Teens, school, trunk muscular resistance, intervention, physical performance, abdominal, back.

8784 A Renovated Cook's Distance Based On The Buckley-James Estimate In Censored Regression

Authors: Nazrina Aziz, Dong Q. Wang

Abstract:

There have been various methods created based on regression ideas to resolve the problem of data sets containing censored observations, i.e. the Buckley-James method, Miller's method, the Cox method, and the Koul-Susarla-Van Ryzin estimators. Even though comparison studies show the Buckley-James method performs better than some other methods, it is still rarely used by researchers, mainly because of the limited diagnostics analysis developed for the Buckley-James method thus far. Therefore, a diagnostic tool for the Buckley-James method is proposed in this paper. It is called the renovated Cook's Distance, RD*_i, and has been developed based on Cook's idea. The renovated Cook's Distance RD*_i has advantages (depending on the analyst's demand) over (i) the change in the fitted value for a single case, DFIT*_i, as it measures the influence of case i on all n fitted values Ŷ* (not just the fitted value for case i, as DFIT*_i does), and (ii) the change in the estimate of the coefficient when the i-th case is deleted, DBETA*_i, since DBETA*_i corresponds to the number of variables p, so it is usually easier to look at a diagnostic measure such as RD*_i since information from p variables can be considered simultaneously. Finally, an example using the Stanford Heart Transplant data is provided to illustrate the proposed diagnostic tool.
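
The renovated distance builds on the classical Cook's distance D_i = (e_i² / (p s²)) · h_ii / (1 − h_ii)², and a minimal sketch of that classical quantity on uncensored synthetic data is shown below; the Buckley-James adaptation for censored observations is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])   # design with intercept
beta = np.array([1.0, 2.0, -1.0])
y = X @ beta + rng.normal(size=n)
y[0] += 8.0                                                      # plant one influential case

# Ordinary least squares pieces.
H = X @ np.linalg.inv(X.T @ X) @ X.T          # hat matrix
h = np.diag(H)
resid = y - H @ y
s2 = resid @ resid / (n - p)

# Classical Cook's distance for each observation.
cooks_d = (resid ** 2 / (p * s2)) * h / (1 - h) ** 2
print("most influential cases:", np.argsort(-cooks_d)[:3])
```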

Keywords: Buckley-James estimators, censored regression, censored data, diagnostic analysis, product-limit estimator, renovated Cook's Distance.

8783 A New Approach to Optimal Control Problem Constrained by Canonical Form

Authors: B. Farhadinia

Abstract:

In this article, a class of optimal control problems constrained by differential and integral constraints, called the canonical form, is considered. A modified measure-theoretical approach is introduced to solve this class of optimal control problems.

Keywords: Control problem, canonical form, measure theory.

8782 Texture Observation of Bending by XRD and EBSD Method

Authors: Takashi Sakai, Yuri Shimomura

Abstract:

The crystal orientation is a factor that affects the microscopic material properties. Crystal orientation determines the anisotropy of a polycrystalline material and is closely related to its mechanical properties. In this paper, the crystal orientation of a pure copper polycrystalline material was analyzed by two different methods: X-ray diffraction (XRD) and electron backscatter diffraction (EBSD). Compared with EBSD, the X-ray beam diameter in XRD is larger, so the crystal orientation is measured relatively macroscopically. Through these measurements, we investigated the change in crystal orientation and microstructure of the pure copper.

Keywords: Bending, electron backscatter diffraction, X-ray diffraction, microstructure, IPF map, orientation distribution function.

8781 State Feedback Controller Design via Takagi-Sugeno Fuzzy Model: LMI Approach

Authors: F. Khaber, K. Zehar, A. Hamzaoui

Abstract:

In this paper, we introduce a robust state feedback controller design using linear matrix inequalities (LMIs) and a guaranteed cost approach for Takagi-Sugeno fuzzy systems. The purpose of this work is to establish a systematic method to design controllers for a class of uncertain linear and nonlinear systems. Our approach utilizes a certain type of fuzzy systems that are based on Takagi-Sugeno (T-S) fuzzy models to approximate nonlinear systems. We use a robust control methodology to design the controllers. This method not only guarantees stability, but also minimizes an upper bound on a linear quadratic performance measure. A simulation example is presented to show the effectiveness of this method.
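
For a two-rule T-S model, a standard PDC-style LMI design, find X ≻ 0 and M_i such that A_i X + X A_iᵀ − B_i M_i − M_iᵀ B_iᵀ ≺ 0 and recover K_i = M_i X⁻¹, can be sketched with cvxpy. The system matrices below are made up, and the cross-rule conditions and the guaranteed-cost terms of the paper are omitted:

```python
import numpy as np
import cvxpy as cp

# Hypothetical two-rule Takagi-Sugeno model (local linear models).
A = [np.array([[0.0, 1.0], [2.0, -1.0]]), np.array([[0.0, 1.0], [4.0, -2.0]])]
B = [np.array([[0.0], [1.0]]), np.array([[0.0], [1.5]])]
n, m = 2, 1

X = cp.Variable((n, n), symmetric=True)
M = [cp.Variable((m, n)) for _ in range(2)]

constraints = [X >> 1e-3 * np.eye(n)]
for Ai, Bi, Mi in zip(A, B, M):
    lyap = Ai @ X + X @ Ai.T - Bi @ Mi - Mi.T @ Bi.T
    constraints.append(lyap << -1e-3 * np.eye(n))   # quadratic stability of each local model

prob = cp.Problem(cp.Minimize(0), constraints)      # pure feasibility problem
prob.solve(solver=cp.SCS)

X_val = X.value
for i, Mi in enumerate(M):
    K = Mi.value @ np.linalg.inv(X_val)             # local state feedback gain
    print(f"rule {i + 1} gain K{i + 1} = {np.round(K, 3)}")
```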

Keywords: Takagi-Sugeno fuzzy model, state feedback, linear matrix inequalities, robust stability.
