Search results for: Paper assessment
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16501

2671 A Hybrid Feature Selection by Resampling, Chi squared and Consistency Evaluation Techniques

Authors: Amir-Massoud Bidgoli, Mehdi Naseri Parsa

Abstract:

In this paper, a combined feature selection method is proposed which takes advantage of sample domain filtering, resampling and feature subset evaluation to reduce the dimensions of huge datasets and select reliable features. The method utilizes both the feature space and the sample domain to improve the feature selection process, and uses a combination of Chi-squared and Consistency attribute evaluation methods to seek reliable features. It consists of two phases. The first phase filters and resamples the sample domain, and the second phase adopts a hybrid procedure to find the optimal feature space by applying Chi-squared and Consistency subset evaluation methods together with a genetic search. Experiments on various-sized datasets from the UCI Repository of Machine Learning databases show that the performance of five classifiers (Naïve Bayes, Logistic, Multilayer Perceptron, Best First Decision Tree and JRIP) improves simultaneously and that the classification error for these classifiers decreases considerably. The experiments also show that this method outperforms other feature selection methods.
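
A minimal sketch of the two-phase idea, assuming scikit-learn: a bootstrap resample of the sample domain followed by a Chi-squared feature ranking. The Consistency subset evaluation, the genetic search and the exact filtering rule used in the paper are not reproduced here, and the digits dataset merely stands in for a UCI dataset.

```python
# Illustrative sketch (not the authors' exact pipeline): resample the sample
# domain, then rank features with a Chi-squared score, assuming scikit-learn.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.utils import resample
from sklearn.feature_selection import SelectKBest, chi2

X, y = load_digits(return_X_y=True)          # stand-in for a UCI dataset

# Phase 1: resample the sample domain (here a simple bootstrap of 60% of rows).
X_res, y_res = resample(X, y, n_samples=int(0.6 * len(y)), random_state=0)

# Phase 2: Chi-squared attribute evaluation; keep the 20 highest-scoring features.
selector = SelectKBest(score_func=chi2, k=20).fit(X_res, y_res)
X_selected = selector.transform(X_res)
print("kept feature indices:", np.sort(selector.get_support(indices=True)))
```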

Keywords: feature selection, resampling, reliable features, Consistency Subset Evaluation.

2670 Evolutionary Algorithms for Learning Primitive Fuzzy Behaviors and Behavior Coordination in Multi-Objective Optimization Problems

Authors: Li Shoutao, Gordon Lee

Abstract:

Evolutionary robotics is concerned with the design of intelligent systems with life-like properties by means of simulated evolution. Approaches in evolutionary robotics can be categorized according to the control structures that represent the behavior and the parameters of the controller that undergo adaptation. The basic idea is to automatically synthesize behaviors that enable the robot to perform useful tasks in complex environments. The evolutionary algorithm searches through the space of parameterized controllers that map sensory perceptions to control actions, thus realizing a specific robotic behavior. Further, the evolutionary algorithm maintains and improves a population of candidate behaviors by means of selection, recombination and mutation. A fitness function evaluates the performance of the resulting behavior according to the robot's task or mission. In this paper, the focus is on the use of genetic algorithms to solve a multi-objective optimization problem representing robot behaviors; in particular, the A-Compander Law is employed in selecting the weight of each objective during the optimization process. Results using an adaptive fitness function show that this approach can efficiently react to complex tasks under variable environments.

Keywords: adaptive fuzzy neural inference, evolutionary tuning

2669 Support Vector Machine based Intelligent Watermark Decoding for Anticipated Attack

Authors: Syed Fahad Tahir, Asifullah Khan, Abdul Majid, Anwar M. Mirza

Abstract:

In this paper, we present an innovative scheme for blindly extracting message bits from an image distorted by an attack. A Support Vector Machine (SVM) is used to nonlinearly classify the bits of the embedded message. Traditionally, a hard decoder is used with the assumption that the underlying modeling of the Discrete Cosine Transform (DCT) coefficients does not appreciably change. In case of an attack, the distribution of the image coefficients is heavily altered. The distributions of the sufficient statistics at the receiving end corresponding to the antipodal signals overlap, and a simple hard decoder fails to classify them properly. We consider message retrieval of the antipodal signal as a binary classification problem. Machine learning techniques such as SVM are used to retrieve the message when a certain specific class of attacks is most probable. In order to validate the SVM-based decoding scheme, we have taken Gaussian noise as a test case. We generate a data set using 125 images and 25 different keys. The polynomial kernel of SVM achieved 100 percent accuracy on the test data.
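
A hedged sketch of the decoding idea, assuming scikit-learn: the attacked sufficient statistic of each antipodal bit is treated as a feature for a polynomial-kernel SVM, tuned by grid search. The synthetic Gaussian-attacked statistics below are an assumption; the paper works on DCT coefficients of 125 images and 25 keys.

```python
# Minimal sketch: treat antipodal watermark bits as a binary classification
# problem and tune a polynomial-kernel SVM by grid search; the synthetic
# "sufficient statistics" below are an assumption.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV, train_test_split

rng = np.random.default_rng(0)
n = 2000
bits = rng.integers(0, 2, n)                       # embedded message bits
stats = (2 * bits - 1) + rng.normal(0, 1.2, n)     # attacked antipodal statistic
X = stats.reshape(-1, 1)

X_tr, X_te, y_tr, y_te = train_test_split(X, bits, test_size=0.3, random_state=0)
grid = GridSearchCV(SVC(kernel="poly"),
                    {"degree": [2, 3], "C": [0.1, 1, 10]}, cv=5)
grid.fit(X_tr, y_tr)
print("bit correct ratio:", grid.score(X_te, y_te))
```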

Keywords: Bit Correct Ratio (BCR), Grid Search, Intelligent Decoding, Jackknife Technique, Support Vector Machine (SVM), Watermarking.

2668 CoSP2P: A Component-Based Service Model for Peer-to-Peer Systems

Authors: Candido Alcaide, Manuel Díaz, Luis Llopis, Antonio Marquez, Bartolome Rubio, Enrique Soler

Abstract:

The increasing complexity of software development based on peer-to-peer networks makes it necessary to create new frameworks in order to simplify the developer's task. Additionally, some applications, e.g. fire detection or security alarms, may require real-time constraints, and a high-level definition of these features eases application development. In this paper, a service model based on a component model with real-time features is proposed. The high-level model abstracts developers from implementation tasks, such as discovery, communication, security or real-time requirements. The model is oriented to deploying services on small mobile devices, such as sensors, mobile phones and PDAs, where the computation is lightweight. Services can be composed with one another by means of the port concept to form complex ad hoc systems, and their implementation is carried out using a component language called UM-RTCOM. In order to illustrate our proposals, a fire detection application is described.

Keywords: Peer-to-peer, mobile systems, real-time, service-oriented architecture.

2667 Nonlinear Impact Responses for a Damped Frame Supported by Nonlinear Springs with Hysteresis Using Fast FEA

Authors: T. Yamaguchi, M. Watanabe, M. Sasajima, C. Yuan, S. Maruyama, T. B. Ibrahim, H. Tomita

Abstract:

This paper deals with nonlinear vibration analysis using the finite element method for frame structures consisting of elastic and viscoelastic damping layers supported by multiple nonlinear concentrated springs with hysteresis damping. The frame is supported by four nonlinear concentrated springs near the four corners. The restoring forces of the springs have cubic nonlinearity, and the linear component of the nonlinear springs is expressed as a complex quantity to represent linear hysteresis damping. The damping layer of the frame structures has a complex modulus of elasticity. Further, the discretized equations in physical coordinates are transformed into nonlinear coupled ordinary differential equations using normal coordinates corresponding to the linear natural modes. By comparing the shares of strain energy of the elastic frame, the damping layer and the springs, we evaluate the influences of the damping couplings on the linear and nonlinear impact responses. We also investigate how the damping, which changes with the stiffness of the elastic frame, influences the nonlinear coupling in the damped impact responses.

Keywords: Dynamic response, Nonlinear impact response, Finite Element analysis, Numerical analysis.

2666 Knowledge Acquisition and Client Organisations: Case Study of a Student as Producer

Authors: Barry Ardley, Abi Hunt, Nick Taylor

Abstract:

As a theoretical and practical framework, this study uses the student as producer approach to learning in higher education, as adopted by the Lincoln International Business School, University of Lincoln, UK. Student as producer positions learners as skilled and capable agents, able to participate as partners with tutors in live research projects. To illuminate the nature of this approach to learning and to highlight its critical issues, the authors report on two guided student consultancy projects. These were set up with the assistance of two local organisations in the city of Lincoln, UK. Using the student as producer model to deliver the projects enabled learners to acquire and develop a range of key skills and knowledge not easily accessible in more traditional educational settings. This paper presents a systematic case study analysis of the eight organising principles of the student as producer model, as adopted by university tutors. The experience of tutors implementing student as producer suggests that the model can be widely applied to benefit not only the learning and teaching experiences of higher education students and staff, but also a university’s research programme and its community partners.

Keywords: Experiential learning, consultancy clients, student as producer.

2665 Stability Analysis of Three-Dimensional Flow and Heat Transfer over a Permeable Shrinking Surface in a Cu-Water Nanofluid

Authors: Roslinda Nazar, Amin Noor, Khamisah Jafar, Ioan Pop

Abstract:

In this paper, the steady laminar three-dimensional boundary layer flow and heat transfer of a copper (Cu)-water nanofluid in the vicinity of a permeable shrinking flat surface in an otherwise quiescent fluid is studied. The nanofluid mathematical model, in which the effect of the nanoparticle volume fraction is taken into account, is considered. The governing nonlinear partial differential equations are transformed into a system of nonlinear ordinary differential equations using a similarity transformation, which is then solved numerically using the function bvp4c from MATLAB. Dual solutions (upper and lower branch solutions) are found for the similarity boundary layer equations for a certain range of the suction parameter. A stability analysis has been performed to show which branch solutions are stable and physically realizable. The numerical results for the skin friction coefficient and the local Nusselt number, as well as the velocity and temperature profiles, are obtained, presented and discussed in detail for a range of governing parameters.
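
For comparison with the bvp4c workflow described above, a simplified sketch using SciPy's solve_bvp on a classical shrinking-sheet similarity equation, f''' + f f'' - f'^2 = 0 with suction f(0) = s, f'(0) = -1 and f'(infinity) = 0; this is an illustrative stand-in, not the paper's coupled Cu-water nanofluid equations, and the suction value is an assumed one for which dual solutions are known to exist.

```python
# Illustrative sketch only: the paper solves the full Cu-water nanofluid
# similarity equations with MATLAB's bvp4c; here a simplified shrinking-sheet
# analogue f''' + f f'' - f'^2 = 0 with f(0)=s, f'(0)=-1, f'(inf)=0 is solved
# with SciPy's solve_bvp. Dual solutions exist for sufficiently strong suction.
import numpy as np
from scipy.integrate import solve_bvp

s = 3.0                      # suction parameter (assumed value)
eta = np.linspace(0.0, 10.0, 200)

def odes(x, y):              # y = [f, f', f'']
    return np.vstack((y[1], y[2], y[1]**2 - y[0]*y[2]))

def bc(ya, yb):              # f(0)=s, f'(0)=-1, f'(inf)=0
    return np.array([ya[0] - s, ya[1] + 1.0, yb[1]])

# Initial guess close to an exponentially decaying (upper-branch-like) profile.
guess = np.vstack((s + (np.exp(-2*eta) - 1)/2, -np.exp(-2*eta), 2*np.exp(-2*eta)))
sol = solve_bvp(odes, bc, eta, guess)
print("converged:", sol.status == 0, " f''(0) =", sol.sol(0.0)[2])
```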

Keywords: Heat Transfer, Nanofluid, Shrinking Surface, Stability Analysis, Three-Dimensional Flow.

2664 The Impact of the Type of Diversification of Listed Construction Enterprises in China on Corporation Performance

Authors: Yi-Hsin Lin, Ying-Ying Li

Abstract:

The construction industry is a pillar industry in China, accounting for about 6% of the gross domestic product. Along with changes in the external environment of the construction industry in China, construction firms face fierce competition. This paper aims to investigate the relationship between the type of diversification of a construction firm and its performance in China. Based on the generalist and specialist strategies in organizational ecology, we regard a generalist organization as corresponding to an enterprise with diversified development, while specialist groups correspond to specialized, professional enterprises. This study takes advantage of the annual financial data of listed construction firms to empirically verify the relationship between diversification and corporate performance, establishing a regression equation for econometric analysis. We find that: 1) Specialization can significantly improve the profitability of listed construction firms and has a significant positive relationship with corporate performance; 2) The operating performance of listed construction enterprises that engage in unrelated diversification is higher than that of those with related diversification; 3) The relationship between state ownership of construction firms and corporate performance is negative. The more years since a firm's foundation, the higher its performance will be; however, the more years since its listing, the lower its performance will be.

Keywords: Diversification, Specialization, Construction Firm, Performance.

2663 Fingerprint Compression Using Contourlet Transform and Multistage Vector Quantization

Authors: S. Esakkirajan, T. Veerakumar, V. Senthil Murugan, R. Sudhakar

Abstract:

This paper presents a new fingerprint coding technique based on contourlet transform and multistage vector quantization. Wavelets have shown their ability in representing natural images that contain smooth areas separated with edges. However, wavelets cannot efficiently take advantage of the fact that the edges usually found in fingerprints are smooth curves. This issue is addressed by directional transforms, known as contourlets, which have the property of preserving edges. The contourlet transform is a new extension to the wavelet transform in two dimensions using nonseparable and directional filter banks. The computation and storage requirements are the major difficulty in implementing a vector quantizer. In the full-search algorithm, the computation and storage complexity is an exponential function of the number of bits used in quantizing each frame of spectral information. The storage requirement in multistage vector quantization is less when compared to full search vector quantization. The coefficients of contourlet transform are quantized by multistage vector quantization. The quantized coefficients are encoded by Huffman coding. The results obtained are tabulated and compared with the existing wavelet based ones.
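
A small sketch of the multistage vector quantization step, assuming scikit-learn's KMeans as the codebook trainer; the contourlet transform and Huffman coding stages are not shown, and the random coefficient vectors are a stand-in.

```python
# Sketch of two-stage (multistage) vector quantization on transform
# coefficients: stage 2 quantizes the residual left by stage 1.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
coeffs = rng.normal(size=(5000, 8))            # stand-in for contourlet coefficient vectors

stage1 = KMeans(n_clusters=16, n_init=10, random_state=0).fit(coeffs)
residual = coeffs - stage1.cluster_centers_[stage1.labels_]

stage2 = KMeans(n_clusters=16, n_init=10, random_state=0).fit(residual)
reconstructed = (stage1.cluster_centers_[stage1.labels_]
                 + stage2.cluster_centers_[stage2.labels_])

# Two 4-bit indices per vector replace eight floats; measure the distortion.
mse = np.mean((coeffs - reconstructed) ** 2)
print("bits per vector: 8, MSE:", round(float(mse), 4))
```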

Keywords: Contourlet Transform, Directional Filter bank, Laplacian Pyramid, Multistage Vector Quantization

2662 A Review and Comparative Analysis on Cluster Ensemble Methods

Authors: S. Sarumathi, P. Ranjetha, C. Saraswathy, M. Vaishnavi, S. Geetha

Abstract:

Clustering is an unsupervised learning technique for aggregating data objects into meaningful classes so that intra-cluster similarity is maximized and inter-cluster similarity is minimized in data mining. However, no single clustering algorithm proves to be the most effective in producing the best result. As a result, a new and challenging technique known as the cluster ensemble approach has blossomed in order to determine a solution to this problem. For the cluster analysis issue, this new technique is a successful approach. The cluster ensemble's main goal is to combine similar clustering solutions in a way that achieves precision while also improving the quality of individual data clustering. Because of the massive and rapid creation of new approaches in the field of data mining, the ongoing interest in inventing novel algorithms necessitates a thorough examination of current techniques and future innovation. This paper presents a comparative analysis of various cluster ensemble approaches, including their methodologies, formal working process, and standard accuracy and error rates. As a result, the community of clustering practitioners will benefit from this exploratory and clear review, which will aid in determining the most appropriate solution to the problem at hand.
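
As an illustration of one widely used consensus function (not any single method surveyed here), the sketch below builds a co-association matrix from several base k-means partitions and cuts its average-linkage dendrogram; the library choices are assumptions.

```python
# Co-association consensus sketch: how often each pair of points is clustered
# together across the ensemble, followed by hierarchical clustering on the
# resulting dissimilarity.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

X, _ = make_blobs(n_samples=200, centers=3, random_state=0)
n = len(X)

# Ensemble of base partitions with different seeds / numbers of clusters.
co_assoc = np.zeros((n, n))
for seed, k in [(0, 3), (1, 4), (2, 5), (3, 3)]:
    labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(X)
    co_assoc += (labels[:, None] == labels[None, :]).astype(float)
co_assoc /= 4.0

# Consensus partition: average-linkage clustering on 1 - co-association.
dist = squareform(1.0 - co_assoc, checks=False)
consensus = fcluster(linkage(dist, method="average"), t=3, criterion="maxclust")
print("consensus cluster sizes:", np.bincount(consensus)[1:])
```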

Keywords: Clustering, cluster ensemble methods, consensus function, data mining, unsupervised learning.

2661 Numerical Simulation of Supersonic Gas Jet Flows and Acoustics Fields

Authors: Lei Zhang, Wen-jun Ruan, Hao Wang, Peng-xin Wang

Abstract:

Jet noise is generated by the rocket exhaust plume during rocket engine testing. A domain decomposition approach is applied to the jet noise prediction in this paper. The aerodynamic noise coupling is based on splitting the problem into acoustic source generation and sound propagation in separate physical domains. Large Eddy Simulation (LES) is used to simulate the supersonic jet flow. Based on the simulation results of the flow fields, the jet noise distribution of the sound pressure level is obtained by applying the Ffowcs Williams-Hawkings (FW-H) acoustics equation and the Fourier transform. The calculation results show that complex structures of expansion waves, compression waves and the turbulent boundary layer can occur due to the strong interaction between the gas jet and the ambient air. In addition, the jet core region, the shock cell and the sound pressure level of the gas jet increase as the nozzle size increases. Importantly, the numerical simulation results of the far-field sound are in good agreement with the experimental measurements in directivity.

Keywords: Supersonic gas jet, Large Eddy Simulation (LES), acoustic noise, Ffowcs Williams-Hawkings (FW-H) equations, nozzle size.

2660 3D Shape Modelling of Left Ventricle: Towards Correlation of Myocardial Scintigraphy Data and Coronarography Result

Authors: A. Ben Abdallah, H. Essabbah, M. H. Bedoui

Abstract:

Myocardial scintigraphy is an imaging modality which provides functional information, whereas the coronarography modality gives useful information about coronary artery anatomy. In case of coronary artery disease (CAD), coronarography cannot determine precisely which moderate lesions (artery reduction between 50% and 70%), known as the “gray zone”, are haemodynamically significant. In this paper, we aim to define the relationship between the location and degree of the stenosis in coronary arteries and the perfusion observed on the myocardial scintigraphy. This allows us to model the evolution of the impact of these stenoses in order to justify a coronarography or to avoid it for patients suspected of being in the gray zone. Our approach is decomposed into two steps. The first step consists in modelling a coronary artery bed and stenoses of different locations and degrees. The second step consists in modelling the left ventricle at stress and at rest using the spherical harmonics model and myocardial scintigraphic data. We use the spherical harmonics descriptors to analyse the deformation of the left ventricle model between stress and rest, which permits us to conclude whether an ischemia exists and to quantify it.

Keywords: Spherical harmonics model, vascular bed, 3D reconstruction, left ventricle, myocardial scintigraphy.

2659 A Cuckoo Search with Differential Evolution for Clustering Microarray Gene Expression Data

Authors: M. Pandi, K. Premalatha

Abstract:

DNA microarray technology is a collection of microscopic DNA spots attached to a solid surface. Scientists use DNA microarrays to measure the expression levels of large numbers of genes simultaneously or to genotype multiple regions of a genome. Elucidating the patterns hidden in gene expression data offers a tremendous opportunity for an enhanced understanding of functional genomics. However, the large number of genes and the complexity of biological networks greatly increase the challenges of comprehending and interpreting the resulting mass of data, which often consists of millions of measurements. This is handled by clustering, which reveals the natural structures and identifies the interesting patterns in the underlying data. In this paper, gene-based clustering of gene expression data is proposed using Cuckoo Search with Differential Evolution (CS-DE). The experimental results are analyzed with gene expression benchmark datasets. The results show that CS-DE outperforms CS on the benchmark datasets. To validate the clustering results, this work is tested with one internal and one external cluster validation index.
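
A minimal sketch of the Cuckoo Search ingredient, namely a Lévy-flight move generated with Mantegna's algorithm, which proposes a new candidate solution relative to the current best nest; the DE crossover, the clustering objective and the gene expression data are not shown.

```python
# Levy-flight proposal step used in Cuckoo Search (Mantegna's algorithm).
import numpy as np
from math import gamma

def levy_step(shape, beta=1.5, rng=np.random.default_rng(0)):
    # Mantegna's algorithm for approximately Levy-stable step lengths.
    sigma = (gamma(1 + beta) * np.sin(np.pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma, shape)
    v = rng.normal(0, 1, shape)
    return u / np.abs(v) ** (1 / beta)

# One Cuckoo Search move: perturb a candidate toward the current best nest.
best = np.zeros(10)                       # stand-in best cluster-centre vector
nest = np.ones(10)                        # stand-in candidate solution
new_nest = nest + 0.01 * levy_step(nest.shape) * (nest - best)
print(new_nest[:3])
```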

Keywords: DNA, Microarray, genomics, Cuckoo Search, Differential Evolution, Gene expression data, Clustering.

2658 Effect of Fractional Flow Curves on the Heavy Oil and Light Oil Recoveries in Petroleum Reservoirs

Authors: Abdul Jamil Nazari, Shigeo Honma

Abstract:

This paper evaluates and compares the effect of fractional flow curves on heavy oil and light oil recoveries in a petroleum reservoir. Fingering of the flowing water is one of the serious problems of oil displacement by water, and another problem is the estimation of the amount of recoverable oil in a petroleum reservoir. To address these problems, the fractional flow of heavy oil and of light oil is investigated. The fractional flow approach treats the multiphase flow rate as a total mixed fluid and then describes the individual phases as fractions of the total flow. Laboratory experiments are implemented for two different types of oil, heavy oil and light oil, to experimentally obtain relative permeability and fractional flow curves. Application of the light oil fractional flow curve, which exhibits a regular S-shape, to the waterflooding method showed that a large amount of the mobile oil in the reservoir is displaced by water injection. In contrast, the fractional flow curve of heavy oil does not display an S-shape because of its high viscosity. Although the advance of the injected waterfront is faster than in light oil reservoirs, a significant amount of mobile oil remains behind the waterfront.
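
The analysis rests on the standard fractional flow relation f_w = 1 / (1 + (k_ro / k_rw)(mu_w / mu_o)). The sketch below evaluates it with Corey-type relative permeability curves whose exponents and end-point saturations are assumed values, simply to contrast a light oil with a heavy, high-viscosity oil.

```python
# Sketch of the standard fractional-flow relation
#   f_w = 1 / (1 + (k_ro / k_rw) * (mu_w / mu_o)),
# with assumed Corey-type relative permeability curves, contrasting a light
# oil with a heavy (high-viscosity) oil.
import numpy as np

def fractional_flow(Sw, mu_w, mu_o, Swc=0.2, Sor=0.2):
    Se = np.clip((Sw - Swc) / (1 - Swc - Sor), 1e-9, 1.0)   # normalized saturation
    krw = Se ** 3                                            # Corey water curve (assumed exponent)
    kro = (1 - Se) ** 3                                      # Corey oil curve (assumed exponent)
    return 1.0 / (1.0 + (kro / np.maximum(krw, 1e-12)) * (mu_w / mu_o))

Sw = np.linspace(0.2, 0.8, 7)
print("light oil (5 cP): ", np.round(fractional_flow(Sw, 1.0, 5.0), 3))
print("heavy oil (500 cP):", np.round(fractional_flow(Sw, 1.0, 500.0), 3))
```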

Keywords: Fractional flow curve, oil recovery, relative permeability, water fingering.

2657 Relative Radiometric Correction of Cloudy Multitemporal Satellite Imagery

Authors: Seema Biday, Udhav Bhosle

Abstract:

Repeated observation of a given area over time yields potential for many forms of change detection analysis. These repeated observations are confounded in terms of radiometric consistency due to changes in sensor calibration over time, differences in illumination, observation angles and variation in atmospheric effects. This paper demonstrates the applicability of an empirical relative radiometric normalization method to a set of multitemporal cloudy images acquired by the Resourcesat-1 LISS III sensor. The objective of this study is to detect and remove cloud cover and to normalize an image radiometrically. Cloud detection is achieved by using the Average Brightness Threshold (ABT) algorithm. The detected cloud is removed and replaced with data from another image of the same area. After cloud removal, the proposed normalization method is applied to reduce the radiometric influence caused by non-surface factors. This process identifies landscape elements whose reflectance values are nearly constant over time, i.e. the subset of non-changing pixels is identified using a frequency-based correlation technique. The quality of the radiometric normalization is statistically assessed by the R² value and the mean square error (MSE) between each pair of analogous bands.
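
A compact sketch of the normalization step for a single band: a linear gain and offset are fitted that map the subject image onto the reference image, and the fit is assessed with R² and MSE. The synthetic pixel values are an assumption, and the paper's frequency-based selection of non-changing pixels is only noted in the comments.

```python
# Relative radiometric normalization sketch for one band. In the paper, only
# non-changing (pseudo-invariant) pixels found via frequency-based correlation
# would enter the fit; the synthetic data here contain no land-cover change,
# so all pixels are used.
import numpy as np

rng = np.random.default_rng(0)
reference = rng.uniform(50, 200, size=5000)                # reference-band pixels
subject = 0.8 * reference + 12 + rng.normal(0, 3, 5000)    # same scene, different illumination

gain, offset = np.polyfit(subject, reference, 1)           # least-squares normalization
normalized = gain * subject + offset

mse = np.mean((normalized - reference) ** 2)
r2 = 1 - np.sum((normalized - reference) ** 2) / np.sum((reference - reference.mean()) ** 2)
print(f"gain={gain:.3f} offset={offset:.2f} R2={r2:.3f} MSE={mse:.2f}")
```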

Keywords: Correlation, Frequency domain, Multitemporal, Relative Radiometric Correction

2656 A General Mandatory Access Control Framework in Distributed Environments

Authors: Feng Yang, Xuehai Zhou, Dalei Hu

Abstract:

In this paper, we propose a general mandatory access control framework for distributed systems. The framework can be applied to multiple operating systems and can handle multiple stakeholders. Despite considerable advancements in the area of mandatory access control, a given approach to enforcing mandatory access control can typically only be applied to a specific operating system. Unlike the PC market, in which Windows captures the overwhelming share, there are a number of popular operating systems in the emerging smartphone environment, e.g. Android, Windows Mobile, Symbian and RIM. It should also be noted that more and more stakeholders are involved in smartphone software, such as device owners, service providers and application providers. Our framework includes three parts: the local decision layer, the middle layer and the remote decision layer. The middle layer takes charge of managing security contexts, the OS API, operations and policy combination. The design of the remote decision layer does not depend on a particular operating system because of the middle layer's existence. We implement the framework on Windows, Linux and other popular embedded systems.

Keywords: Mandatory Access Control, Distributed System, General Platform.

2655 Sensory Evaluation of the Selected Coffee Products Using Fuzzy Approach

Authors: M.A. Lazim, M. Suriani

Abstract:

Knowing consumers' preferences and perceptions in the sensory evaluation of drink products is very significant to manufacturers and retailers alike. Without appropriate sensory analysis, there is a high risk of market disappointment. This paper aims to rank the selected coffee products and also to determine the best quality attribute through sensory evaluation using a fuzzy decision making model. Three coffee drink products were used for the sensory evaluation. Data were collected from thirty judges at a hypermarket in Kuala Terengganu, Malaysia. The judges were asked to specify their sensory evaluation in linguistic terms of the quality attributes of colour, smell, taste and mouth feel for each product, and also the weight of each quality attribute. Five fuzzy linguistic terms representing the quality attributes were introduced prior to the analysis. The judgment membership functions and the weights were compared to rank the products and also to determine the best quality attribute. The product Indoc was judged first in the ranking, and 'taste' was judged the best quality attribute. These findings indicate the importance of sensory evaluation in identifying consumers' preferences and also the competency of the fuzzy approach in decision making.
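
A minimal sketch of the fuzzy aggregation idea: judges' linguistic terms are mapped to triangular fuzzy numbers, averaged per attribute, weighted and defuzzified by centroid to give a crisp product score. The term definitions, scale and attribute weights below are assumptions, not the paper's elicited values.

```python
# Fuzzy sensory aggregation sketch: linguistic terms -> triangular fuzzy
# numbers -> attribute averages -> weighted, centroid-defuzzified score.
import numpy as np

# Five linguistic terms as triangular fuzzy numbers (a, b, c) on a 0-10 scale (assumed).
TERMS = {"very poor": (0, 0, 2.5), "poor": (0, 2.5, 5), "fair": (2.5, 5, 7.5),
         "good": (5, 7.5, 10), "excellent": (7.5, 10, 10)}
ATTRS = ["colour", "smell", "taste", "mouth feel"]
WEIGHTS = np.array([0.2, 0.2, 0.4, 0.2])          # assumed attribute weights

def product_score(judgements):
    """judgements: {attribute: [linguistic term per judge]} -> crisp score."""
    attr_scores = []
    for attr in ATTRS:
        tri = np.array([TERMS[t] for t in judgements[attr]], dtype=float)
        a, b, c = tri.mean(axis=0)                # average fuzzy number
        attr_scores.append((a + b + c) / 3.0)     # centroid defuzzification
    return float(np.dot(WEIGHTS, attr_scores))

sample = {"colour": ["good", "fair"], "smell": ["good", "good"],
          "taste": ["excellent", "good"], "mouth feel": ["fair", "good"]}
print("crisp score:", round(product_score(sample), 2))
```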

Keywords: fuzzy decision making, fuzzy linguistic, membership function, sensory evaluation.

2654 Assertion-Driven Test Repair Based on Priority Criteria

Authors: Ruilian Zhao, Shukai Zhang, Yan Wang, Weiwei Wang

Abstract:

Repairing broken test cases is an expensive and challenging task in evolving software systems. Although an automated repair technique with intent preservation has been proposed, it does not take into account the association between test repairs and assertions, leading to a large number of irrelevant candidates and decreasing the repair capability. This paper proposes an assertion-driven test repair approach. Furthermore, an intent-oriented priority criterion is introduced to guide repair candidate generation, making the repairs closer to the intent of the test. In more detail, repair targets are determined through post-dominance relations between assertions and the methods that directly cause compilation errors. Then, test repairs are generated from the target in a bottom-up way, guided by the intent-oriented priority criterion. Finally, the generated repair candidates are prioritized to match the original test intent. The approach is implemented and evaluated on a benchmark of 4 open-source programs and 91 broken test cases. The results show that the approach can fix 89% (81/91) of the broken test cases, which is more effective than the existing intent-preserving test repair approach, and that our intent-oriented priority criterion works well.

Keywords: Test repair, test intent, software test, test case evolution.

2653 Comparison of Deep Convolutional Neural Networks Models for Plant Disease Identification

Authors: Megha Gupta, Nupur Prakash

Abstract:

Identification of plant diseases has been performed using machine learning and deep learning models on the datasets containing images of healthy and diseased plant leaves. The current study carries out an evaluation of some of the deep learning models based on convolutional neural network architectures for identification of plant diseases. For this purpose, the publicly available New Plant Diseases Dataset, an augmented version of PlantVillage dataset, available on Kaggle platform, containing 87,900 images has been used. The dataset contained images of 26 diseases of 14 different plants and images of 12 healthy plants. The CNN models selected for the study presented in this paper are AlexNet, ZFNet, VGGNet (four models), GoogLeNet, and ResNet (three models). The selected models are trained using PyTorch, an open-source machine learning library, on Google Colaboratory. A comparative study has been carried out to analyze the high degree of accuracy achieved using these models. The highest test accuracy and F1-score of 99.59% and 0.996, respectively, were achieved by using GoogLeNet with Mini-batch momentum based gradient descent learning algorithm.
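
A minimal PyTorch sketch of the training setup described, i.e. mini-batch gradient descent with momentum; a tiny stand-in CNN and random tensors replace the GoogLeNet/ResNet variants and the New Plant Diseases images actually used.

```python
# One mini-batch training step with SGD + momentum; the model is a small
# stand-in CNN, not the GoogLeNet evaluated in the paper, and the data are random.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
    nn.Flatten(), nn.Linear(32, 38))              # 38 = 26 disease + 12 healthy classes

optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
criterion = nn.CrossEntropyLoss()

images = torch.randn(8, 3, 64, 64)                # stand-in mini-batch
labels = torch.randint(0, 38, (8,))

optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print("mini-batch loss:", float(loss))
```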

Keywords: comparative analysis, convolutional neural networks, deep learning, plant disease identification

2652 Enhancing the Effectiveness of Air Defense Systems through Simulation Analysis

Authors: F. Felipe

Abstract:

Air Defense Systems contain high-value assets that are expected to fulfill their mission for several years - in many cases, even decades - while operating in a fast-changing, technology-driven environment. Thus, it is paramount that decision-makers can assess how effective an Air Defense System is in the face of new developing threats, as well as to identify the bottlenecks that could jeopardize the security of the airspace of a country. Given the broad extent of activities and the great variety of assets necessary to achieve the strategic objectives, a systems approach was taken in order to delineate the core requirements and the physical architecture of an Air Defense System. Then, value-focused thinking helped in the definition of the measures of effectiveness. Furthermore, analytical methods were applied to create a formal structure that preliminarily assesses such measures. To validate the proposed methodology, a powerful simulation was also used to determine the measures of effectiveness, now in more complex environments that incorporate both uncertainty and multiple interactions of the entities. The results regarding the validity of this methodology suggest that the approach can support decisions aimed at enhancing the capabilities of Air Defense Systems. In conclusion, this paper sheds some light on how consolidated approaches of Systems Engineering and Operations Research can be used as valid techniques for solving problems regarding a complex and yet vital matter.

Keywords: Air defense, effectiveness, system, simulation, decision-support.

2651 Fuzzy C-Means Clustering for Biomedical Documents Using Ontology Based Indexing and Semantic Annotation

Authors: S. Logeswari, K. Premalatha

Abstract:

Search is the most obvious application of information retrieval. The variety of widely obtainable biomedical data is enormous and is expanding fast. This expansion means that the existing techniques are not sufficient to extract the most interesting patterns from the collection as per the user requirement. Recent research concentrates more on semantic-based searching than on traditional term-based searches. Algorithms for semantic searches are implemented based on the relations that exist between the words of the documents. Ontologies are used as domain knowledge for identifying the semantic relations as well as for structuring the data for effective information retrieval. Annotation of data with concepts of an ontology is one of the widely used practices for clustering documents. In this paper, indexing based on concepts and annotation is proposed for clustering biomedical documents. The fuzzy c-means (FCM) clustering algorithm is used to cluster the documents. The performances of the proposed methods are analyzed against traditional term-based clustering for PubMed articles in five different disease communities. The experimental results show that the proposed methods outperform term-based fuzzy clustering.
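
A compact, self-contained fuzzy c-means sketch operating on a document-by-concept matrix; the MeSH-based concept indexing and semantic annotation steps are not reproduced, so random vectors stand in for the indexed PubMed documents.

```python
# Standard fuzzy c-means (FCM) updates: membership-weighted centres and
# inverse-distance membership renormalization.
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, n_iter=100, tol=1e-5, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)             # fuzzy memberships sum to 1
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-10
        U_new = 1.0 / (dist ** (2 / (m - 1)))
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return U, centers

docs = np.random.default_rng(1).random((50, 20))  # stand-in concept-index vectors
U, centers = fuzzy_c_means(docs, c=5)
print("hard cluster sizes:", np.bincount(U.argmax(axis=1), minlength=5))
```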

Keywords: MeSH Ontology, Concept Indexing, Annotation, semantic relations, Fuzzy c-means.

2650 A New Approach to Face Recognition Using Dual Dimension Reduction

Authors: M. Almas Anjum, M. Younus Javed, A. Basit

Abstract:

In this paper, a new approach to face recognition is presented that achieves a double dimension reduction, making the system computationally efficient with better recognition results, and it outperforms the common DCT technique of face recognition. In pattern recognition techniques, the discriminative information of an image increases with resolution up to a certain extent; consequently, face recognition results change with the face image resolution and are optimal at a certain resolution level. In the proposed model of face recognition, initially an image decimation algorithm is applied to the face image for dimension reduction to the resolution level which provides the best recognition results. Due to its computational speed and feature extraction potential, the Discrete Cosine Transform (DCT) is then applied to the face image. A subset of DCT coefficients from low to mid frequencies that represents the face adequately and provides the best recognition results is retained. A tradeoff between the decimation factor, the number of DCT coefficients retained and the recognition rate with minimum computation is obtained. Preprocessing of the image is carried out to increase its robustness against variations in pose and illumination level. This new model has been tested on different databases which include the ORL, Yale and EME color databases.
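
A short sketch of the dual dimension reduction, assuming SciPy: the face image is decimated, a 2D DCT is taken, and a square of low-to-mid-frequency coefficients is kept as the feature vector. The decimation factor and the subset size are assumed values, not the tradeoff found in the paper.

```python
# Decimation followed by a 2D DCT and retention of a low/mid-frequency block.
import numpy as np
from scipy.fftpack import dct

def dct2(a):
    return dct(dct(a, axis=0, norm="ortho"), axis=1, norm="ortho")

face = np.random.default_rng(0).random((112, 92))   # stand-in for an ORL image

decimated = face[::2, ::2]                           # decimation by a factor of 2 (assumed)
coeffs = dct2(decimated)
features = coeffs[:12, :12].ravel()                  # retain 144 low/mid-frequency terms (assumed)
print("feature vector length:", features.size)
```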

Keywords: Biometrics, DCT, Face Recognition, Illumination, Computation, Feature extraction.

2649 An Energy Aware Dispatch Scheme for WSNs

Authors: Siddhartha Chauhan, Kumar S. Pandey, Prateek Chandra

Abstract:

One of the key research issues in wireless sensor networks (WSNs) is how to efficiently deploy sensors to cover an area. In this paper, we present a Fishnet Based Dispatch Scheme (FiBDS) with energy-aware mobility and an interest-based sensing angle. We propose two algorithms, one a FiBDS centralized algorithm and the other a FiBDS distributed algorithm. The centralized algorithm is designed specifically for non-time-critical applications, commonly known as non-real-time applications, while the distributed algorithm is designed specifically for time-critical applications, commonly known as real-time applications. The proposed dispatch scheme works in a phase-selection manner: in each phase, a specific constraint is dealt with according to the specified priority before moving on to the next phase, and at the end of each phase only the nodes best suited for that phase are chosen. Simulation results are presented to verify their effectiveness.

Keywords: Dispatch Scheme, Energy Aware Mobility, Interest based Sensing, Wireless Sensor Networks (WSNs).

2648 Order Statistics-based “Anti-Bayesian” Parametric Classification for Asymmetric Distributions in the Exponential Family

Authors: A. Thomas, B. John Oommen

Abstract:

Although the field of parametric Pattern Recognition (PR) has been thoroughly studied for over five decades, the use of the Order Statistics (OS) of the distributions to achieve this has not been reported. The pioneering work on using OS for classification was presented in [1] for the Uniform distribution, where it was shown that optimal PR can be achieved in a counter-intuitive manner, diametrically opposed to the Bayesian paradigm, i.e., by comparing the testing sample to a few samples distant from the mean. This must be contrasted with the Bayesian paradigm in which, if we are allowed to compare the testing sample with only a single point in the feature space from each class, the optimal strategy would be to achieve this based on the (Mahalanobis) distance from the corresponding central points, for example, the means. In [2], we showed that the results could be extended for a few symmetric distributions within the exponential family. In this paper, we attempt to extend these results significantly by considering asymmetric distributions within the exponential family, for some of which even the closed form expressions of the cumulative distribution functions are not available. These distributions include the Rayleigh, Gamma and certain Beta distributions. As in [1] and [2], the new scheme, referred to as Classification by Moments of Order Statistics (CMOS), attains an accuracy very close to the optimal Bayes’ bound, as has been shown both theoretically and by rigorous experimental testing.
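
A rough, one-dimensional illustration of the anti-Bayesian idea (not the paper's exact CMOS estimator): each class is represented by a single order-statistic point, a quantile located away from its mean and toward the other class, and a test sample is assigned to the class whose point is nearer; a distance-to-mean classifier is printed alongside for comparison.

```python
# Toy "anti-Bayesian" classification by distance to order-statistic points
# (sample quantiles) rather than to the class means, on Gamma-distributed classes.
import numpy as np

rng = np.random.default_rng(0)
c1 = rng.gamma(shape=2.0, scale=1.0, size=5000)     # training samples, class 1
c2 = rng.gamma(shape=6.0, scale=1.0, size=5000)     # training samples, class 2

# Order-statistic reference points: an upper quantile of the left class and a
# lower quantile of the right class (points facing each other, far from the means).
os1, os2 = np.quantile(c1, 2/3), np.quantile(c2, 1/3)
m1, m2 = c1.mean(), c2.mean()

x1 = rng.gamma(2.0, 1.0, 2000)                      # test samples from class 1
x2 = rng.gamma(6.0, 1.0, 2000)                      # test samples from class 2
test = np.concatenate([x1, x2])
truth = np.concatenate([np.ones(2000), 2 * np.ones(2000)])

pred_os = np.where(np.abs(test - os1) < np.abs(test - os2), 1, 2)
pred_mean = np.where(np.abs(test - m1) < np.abs(test - m2), 1, 2)
print("order-statistics accuracy:", (pred_os == truth).mean())
print("distance-to-mean accuracy:", (pred_mean == truth).mean())
```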

Keywords: Classification using Order Statistics (OS), Exponential family, Moments of OS

2647 The Prostitute’s Body in Diasporic Space: Sexualized China and Chineseness in Yu Dafu’s Sinking and Yan Geling’s The Lost Daughter of Happiness

Authors: Haizhi Wu

Abstract:

Sexualization brings together the interdependent experiences of prostitution and diaspora, establishing a masculine structure where a female’s body mediates the hegemony and sexuality of men from different races. Between eroticism and homesickness, writers of the Chinese diaspora develop sensual approaches to reflect on the diasporic experience and sexual frustration. Noticeably, Yu Dafu in Sinking and Yan Geling in The Lost Daughter of Happiness both take an interest in sexual encounters between an immature teen client and an erotically powerful prostitute in Japan or America, both countries considered colonizers in Chinese history. Both utilize the metaphor of body-space interplay to hint at the out-of-text transnational interactions; the two writers, however, present distinct understandings of their bond with the history and memory of semi-colonial, semi-feudal China. Examining prostitutes’ bodies in multi-layered diasporic spaces, the central analysis of this paper addresses the sexual, colonial, and historical representations of this bodily symbol and prostitution’s role in negotiating diaspora and “Chineseness”.

Keywords: Chineseness, Diasporic spaces, Prostitutes’ bodies, Sexualization.

2646 A Safety Analysis Method for Multi-Agent Systems

Authors: Ching Louis Liu, Edmund Kazmierczak, Tim Miller

Abstract:

Safety analysis for multi-agent systems is complicated by the potentially nonlinear interactions between agents. This paper proposes a method for analyzing the safety of multi-agent systems by explicitly focusing on interactions and on the accident data of systems that are similar in structure and function to the system being analyzed. The method creates a Bayesian network using the accident data from similar systems. A feature of our method is that the events in the accident data are labeled with HAZOP guide words. Our method uses an ontology to abstract away from the details of a multi-agent implementation. Using the ontology, our method then constructs an “Interaction Map,” a graphical representation of the patterns of interactions between agents and other artifacts. Interaction maps combined with statistical data from accidents and the HAZOP classifications of events can be converted into a Bayesian network. Bayesian networks allow designers to explore “what if” scenarios and make design trade-offs that maintain safety. We show how to use the Bayesian networks and the interaction maps to improve multi-agent system designs.

Keywords: Multi-agent system, safety analysis, safety model.

2645 Authentication and Data Hiding Using a Reversible ROI-based Watermarking Scheme for DICOM Images

Authors: Osamah M. Al-Qershi, Khoo Bee Ee

Abstract:

In recent years, image watermarking has become an important research area in data security, confidentiality and image integrity. Many watermarking techniques have been proposed for medical images. However, medical images, unlike most images, require extreme care when embedding additional data within them, because the additional information must not affect the image quality and readability. Also, medical records, electronic or not, are linked to medical secrecy; for that reason, the records must be confidential. To fulfill those requirements, this paper presents a lossless watermarking scheme for DICOM images. The proposed fragile scheme combines two reversible techniques based on difference expansion for hiding the patient's data and for protecting the region of interest (ROI), with tamper detection and recovery capability. The patient's data are embedded into the ROI, while recovery data are embedded into the region of non-interest (RONI). The experimental results show that the original image can be exactly extracted from the watermarked one in case of no tampering. In case of a tampered ROI, the tampered area can be localized and recovered with a high quality version of the original area.
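
A minimal sketch of the reversible difference expansion step (Tian's method) that underlies this kind of scheme: one bit is embedded into a pixel pair and both the bit and the original pair are recovered exactly; overflow and underflow handling, and the ROI/RONI logic, are omitted.

```python
# Reversible difference expansion on a single pixel pair.
def de_embed(x, y, bit):
    l, h = (x + y) // 2, x - y            # integer average and difference
    h2 = 2 * h + bit                      # expand the difference, append the bit
    return l + (h2 + 1) // 2, l - h2 // 2

def de_extract(x2, y2):
    l, h2 = (x2 + y2) // 2, x2 - y2
    bit, h = h2 % 2, h2 // 2              # recover the bit, restore the difference
    return bit, l + (h + 1) // 2, l - h // 2

x2, y2 = de_embed(130, 127, 1)            # embed a '1' into the pair (130, 127)
print(de_extract(x2, y2))                 # -> (1, 130, 127)
```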

Keywords: DICOM, reversible, ROI-based, watermarking.

2644 Auto-regressive Recurrent Neural Network Approach for Electricity Load Forecasting

Authors: Tarik Rashid, B. Q. Huang, M-T. Kechadi, B. Gleeson

Abstract:

This paper presents an auto-regressive network called the Auto-Regressive Multi-Context Recurrent Neural Network (AR-MCRN), which forecasts the daily peak load for two large power plant systems. The auto-regressive network is a combination of both recurrent and non-recurrent networks. Weather component variables are the key elements in forecasting because any change in these variables affects the demand for energy load. The AR-MCRN is therefore used to learn the relationship between past, previous, and future exogenous and endogenous variables. Experimental results show that using the change in weather components and the change that occurred in past load as inputs to the AR-MCRN, rather than the basic weather parameters and the past load itself as inputs to the same network, produces higher accuracy of the predicted load. Experimental results also show that using exogenous and endogenous variables as inputs is better than using only the exogenous variables as inputs to the network.

Keywords: Daily peak load forecasting, neural networks, recurrent neural networks, auto regressive multi-context neural network.

2643 Energy Recovery Soft Switching Improved Efficiency Half Bridge Inverter for Electronic Ballast Applications

Authors: A. Yazdanpanah Goharrizi

Abstract:

An improved topology of a voltage-fed quasi-resonant soft-switching LCrCdc series-parallel half bridge inverter with a constant frequency for electronic ballast applications is proposed in this paper. This new topology provides a low-cost solution to reduce switching losses and circuit rating to achieve a high-efficiency ballast. The effect of switching losses on ballast efficiency is discussed from an experimental point of view. In this discussion, an improved topology that accomplishes soft-switching operation over a wide power regulation range is proposed. The proposed structure uses a reverse recovery diode to provide better operation for the ballast system. A symmetrical pulse width modulation (PWM) control scheme is implemented to regulate a wide range of output power. Simulation results are verified with the experimental measurements obtained from a ballast-lamp laboratory prototype. Different load conditions are provided in order to clarify the performance of the proposed converter.

Keywords: Electronic ballast, pulse width modulation (PWM), reverse recovery diode, soft switching.

2642 Types of Epilepsies and EEG-LORETA Findings about Epilepsy

Authors: Leila Maleki, Ahmad Esmali Kooraneh, Hossein Taghi Derakhshi

Abstract:

Neural activity in the human brain starts from the early stages of prenatal development. This activity, or the signals generated by the brain, is electrical in nature and represents not only the brain function but also the status of the whole body. At present, three methods can record functional and physiological changes within the brain with high temporal resolution of neuronal interactions at the network level: the electroencephalogram (EEG), the magnetoencephalogram (MEG), and functional magnetic resonance imaging (fMRI); each of these has advantages and shortcomings. EEG recording with a large number of electrodes is now feasible in clinical practice. Multichannel EEG recorded from the scalp surface provides very valuable but indirect information about the source distribution, whereas deep electrode measurements yield more reliable information about the source locations. Intracranial recordings and scalp EEG are used with source imaging techniques to determine the locations and strengths of the epileptic activity. As a source localization method, Low Resolution Electro-Magnetic Tomography (LORETA) is solved for the realistic geometry based on both forward methods, the Boundary Element Method (BEM) and the Finite Difference Method (FDM). In this paper, we review the EEG-LORETA findings about epilepsy.

Keywords: Epilepsy, EEG, EEG-LORETA, LORETA analysis.
