Search results for: Naïve Bayesian-based classification
544 Inverse Sets-based Recognition of Video Clips
Authors: Alexei M. Mikhailov
Abstract:
The paper discusses the mathematics of pattern indexing and its applications to the recognition of visual patterns found in video clips. It is shown that (a) pattern indexes can be represented by collections of inverted patterns, and (b) solutions to pattern classification problems can be found as intersections and histograms of inverted patterns, so that matching of the original patterns is avoided.
Keywords: Artificial neural cortex, computational biology, data mining, pattern recognition.
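As an illustrative sketch of the inverted-pattern idea summarised in this abstract (not code from the paper), the snippet below builds an inverted index over discrete pattern features and classifies a query by intersecting and histogramming the inverted lists; the feature codes and clip names are hypothetical.

```python
from collections import Counter, defaultdict

# Toy "patterns": each training pattern is a set of discrete feature codes.
patterns = {
    "clip_A": {1, 4, 7, 9},
    "clip_B": {2, 4, 8},
    "clip_C": {1, 3, 7},
}

# Inverted index: feature code -> set of patterns containing it.
inverted = defaultdict(set)
for name, feats in patterns.items():
    for f in feats:
        inverted[f].add(name)

def classify(query_feats):
    """Vote for patterns via the inverted lists instead of matching patterns directly."""
    votes = Counter()
    for f in query_feats:
        for name in inverted.get(f, ()):
            votes[name] += 1  # histogram of inverted-list hits
    return votes.most_common(1)[0] if votes else None

print(classify({1, 7, 9}))  # expected to favour "clip_A"
```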
543 CBIR Using Multi-Resolution Transform for Brain Tumour Detection and Stages Identification
Authors: H. Benjamin Fredrick David, R. Balasubramanian, A. Anbarasa Pandian
Abstract:
Image retrieval is a widely used technique in today's digital world. CBIR, expanded as Content Based Image Retrieval, is an image processing technique which identifies relevant images and retrieves them based on patterns extracted from digital images. In this paper, two research works are presented using CBIR. The first work provides an automated and interactive approach to the analysis of CBIR techniques. CBIR works on the principle of supervised machine learning, which involves feature selection followed by a training and testing phase applied to a classifier in order to perform prediction. For feature extraction, image transforms such as Contourlet, Ridgelet and Shearlet are utilized to retrieve texture features from the images. The extracted features are used to train and build a classifier using classification algorithms such as Naïve Bayes, K-Nearest Neighbour and multi-class Support Vector Machine. In the testing phase, the trained classifier predicts the class of a new input image and labels it as one of four classes, namely 1 - normal brain, 2 - benign tumour, 3 - malignant tumour and 4 - severe tumour. The second research work develops a tool for tumour stage identification using the best feature extraction method and classifier identified in the first work. Finally, the tool is used to predict the tumour stage and provide suggestions based on the stage identified by the system. This paper presents these two approaches as a contribution to the medical field, giving better retrieval performance and tumour stage identification.
Keywords: Brain tumour detection, content based image retrieval, classification of tumours, image retrieval.
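As a hedged illustration of the classification stage described in the abstract above (not the authors' implementation), the scikit-learn sketch below trains Naïve Bayes, k-nearest-neighbour and multi-class SVM classifiers on pre-extracted texture feature vectors; the random feature matrix X and labels y are placeholders for features obtained from Contourlet/Ridgelet/Shearlet transforms.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Hypothetical texture features and labels:
# 0 = normal, 1 = benign, 2 = malignant, 3 = severe.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 32))
y = rng.integers(0, 4, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

classifiers = {
    "Naive Bayes": GaussianNB(),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "Multi-class SVM": SVC(kernel="rbf", decision_function_shape="ovr"),
}
for name, clf in classifiers.items():
    clf.fit(X_train, y_train)
    print(name, "accuracy:", clf.score(X_test, y_test))
```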
542 Seismic Performance of Masonry Buildings in Algeria
Authors: F. Lazzali, S. Bedaoui
Abstract:
Structural performance and seismic vulnerability of masonry buildings in Algeria are investigated in this paper. Structural classification of such buildings is carried out regarding their structural elements. Seismicity of Algeria is briefly discussed. Then vulnerability of masonry buildings and their failure mechanisms in the Boumerdes earthquake (May 2003) are examined.
Keywords: Masonry building, seismic deficiencies, vulnerability classes.
541 Data Oriented Model of Image: as a Framework for Image Processing
Authors: A. Habibizad Navin, A. Sadighi, M. Naghian Fesharaki, M. Mirnia, M. Teshnelab, R. Keshmiri
Abstract:
This paper presents a new data oriented model of image, and a representation of it, ADBT, is introduced. ADBT supports clustering, segmentation, measuring the similarity of images, etc., with the desired precision and corresponding speed.
Keywords: Data oriented modelling, image, clustering, segmentation, classification, ADBT and image processing.
540 Using Data Clustering in Oral Medicine
Authors: Fahad Shahbaz Khan, Rao Muhammad Anwer, Olof Torgersson
Abstract:
The vast amount of information hidden in huge databases has created tremendous interest in the field of data mining. This paper examines the possibility of using data clustering techniques in oral medicine to identify functional relationships between different attributes and to classify similar patient examinations. Commonly used data clustering algorithms have been reviewed and, as a result, several interesting results have been gathered.
Keywords: Oral Medicine, Cluto, Data Clustering, Data Mining.
539 Decision Algorithm for Smart Airbag Deployment Safety Issues
Authors: Aini Hussain, M A Hannan, Azah Mohamed, Hilmi Sanusi, Burhanuddin Yeop Majlis
Abstract:
Airbag deployment has been responsible for many deaths, incidental injuries and broken bones due to low crash severity and wrong deployment decisions. Therefore, the authorities and industries have been looking for more innovative and intelligent products for future enhancements of vehicle safety systems (VSSs). Although VSS technologies have advanced considerably, they still face challenges such as how to avoid unnecessary and untimely airbag deployments that can be hazardous and fatal. Currently, most existing airbag systems deploy without regard to occupant size and position. As such, this paper focuses on occupant and crash sensing performance in frontal collisions for the new breed of so-called smart airbag systems. It provides a thorough discussion of occupancy detection, occupant size classification, occupant off-position detection to determine a safe distance zone for airbag deployment, crash-severity analysis and airbag decision algorithms via computer modeling. The proposed system model consists of three main modules, namely occupant sensing, crash severity analysis and decision fusion. The occupant sensing module utilizes the weight sensor to determine occupancy, classify the occupant size, and detect the occupant off-position condition to compute a safe distance for airbag deployment. The crash severity analysis module generates information pertinent to the airbag deployment decision. Outputs from these two modules are fused in the decision module for correct and efficient airbag deployment action. Computer modeling work is carried out using the Simulink, Stateflow, SimMechanics and Virtual Reality toolboxes.
Keywords: Crash severity analysis, occupant size classification, smart airbag, vehicle safety system.
538 Identification of Flexographic-printed Newspapers with NIR Spectral Imaging
Authors: Raimund Leitner, Susanne Rosskopf
Abstract:
Near-infrared (NIR) spectroscopy is a widely used method for material identification in laboratory and industrial applications. While standard spectrometers only allow measurements at one sampling point at a time, NIR Spectral Imaging techniques can measure, in real-time, both the size and shape of an object as well as identify the material the object is made of. The online classification and sorting of recovered paper with NIR Spectral Imaging (SI) is used with success in the paper recycling industry throughout Europe. Recently, the globalisation of recycling material streams has caused water-based flexographic-printed newspapers, mainly from the UK and Italy, to appear in central Europe as well. These flexo-printed newspapers are not sufficiently de-inkable with the standard de-inking process originally developed for offset-printed paper. This de-inking process removes the ink from recovered paper and is the fundamental processing step for producing high-quality paper from recovered paper. Thus, flexo-printed newspapers are a growing problem for the recycling industry, as they reduce the quality of the produced paper if their amount exceeds a certain limit within the recovered paper material. This paper presents the results of a research project for the development of an automated entry inspection system for recovered paper that was jointly conducted by CTR AG (Austria) and PTS Papiertechnische Stiftung (Germany). Within the project, an NIR SI prototype for the identification of flexo-printed newspaper has been developed. The prototype can identify and sort out flexo-printed newspapers in real-time and achieves a detection accuracy for flexo-printed newspaper of over 95%. NIR SI, the technology the prototype is based on, allows the development of inspection systems for incoming goods in a paper production facility as well as industrial sorting systems for recovered paper in the recycling industry in the near future.
Keywords: spectral imaging, imaging spectroscopy, NIR, water-based flexographic, flexo-printed, recovered paper, real-time classification.
537 Ant Colony Optimization for Feature Subset Selection
Authors: Ahmed Al-Ani
Abstract:
The Ant Colony Optimization (ACO) is a metaheuristic inspired by the behavior of real ants in their search for the shortest paths to food sources. It has recently attracted a lot of attention and has been successfully applied to a number of different optimization problems. Due to the importance of the feature selection problem and the potential of ACO, this paper presents a novel method that utilizes the ACO algorithm to implement a feature subset search procedure. Initial results obtained using the classification of speech segments are very promising.
Keywords: Ant Colony Optimization, ant systems, feature selection, pattern recognition.
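The following Python sketch illustrates an ACO-style feature subset search of the kind outlined above; it is not the author's algorithm, just a minimal pheromone-based selection loop with a k-NN fitness function on synthetic data standing in for speech-segment features (all parameters are illustrative).

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# Toy data standing in for speech-segment features.
X, y = make_classification(n_samples=200, n_features=20, n_informative=5, random_state=0)

n_features = X.shape[1]
pheromone = np.ones(n_features)          # one pheromone value per feature
n_ants, n_iterations, subset_size = 10, 15, 6
rho, best_subset, best_score = 0.2, None, -np.inf

def fitness(subset):
    """Cross-validated accuracy of a k-NN classifier on the chosen features."""
    clf = KNeighborsClassifier(n_neighbors=3)
    return cross_val_score(clf, X[:, subset], y, cv=3).mean()

rng = np.random.default_rng(0)
for _ in range(n_iterations):
    solutions = []
    for _ in range(n_ants):
        prob = pheromone / pheromone.sum()
        subset = rng.choice(n_features, size=subset_size, replace=False, p=prob)
        score = fitness(subset)
        solutions.append((score, subset))
        if score > best_score:
            best_score, best_subset = score, subset
    # Evaporate, then reinforce the features used by the iteration's best ant.
    pheromone *= (1.0 - rho)
    iter_best_score, iter_best_subset = max(solutions, key=lambda s: s[0])
    pheromone[iter_best_subset] += iter_best_score

print("best subset:", sorted(best_subset), "score:", round(best_score, 3))
```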
536 Image Segmentation Using 2-D Histogram in RGB Color Space in Digital Libraries
Authors: El Asnaoui Khalid, Aksasse Brahim, Ouanan Mohammed
Abstract:
This paper presents an unsupervised color image segmentation method. It is based on a hierarchical analysis of the 2-D histogram in RGB color space. This histogram minimizes the storage space of images and thus facilitates the operations between them. The improved segmentation approach shows a better identification of objects in a color image and, at the same time, the system is fast.
Keywords: Image segmentation, hierarchical analysis, 2-D histogram, classification.
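As a minimal sketch of the 2-D colour histogram idea (not the paper's hierarchical analysis), the NumPy snippet below computes a 32x32 histogram over the (R, G) channel pair of a hypothetical image and uses bin populations for a crude segmentation; the bin counts and threshold are illustrative assumptions.

```python
import numpy as np

# Hypothetical RGB image (H x W x 3), values in [0, 255].
rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(120, 160, 3), dtype=np.uint8)

# 2-D histogram over the (R, G) channel pair with 32 bins per axis;
# storing this compact histogram instead of the image saves space.
r = image[:, :, 0].ravel()
g = image[:, :, 1].ravel()
hist2d, r_edges, g_edges = np.histogram2d(r, g, bins=32, range=[[0, 256], [0, 256]])

# A crude segmentation step: map each pixel to the population of its (R, G) bin
# and keep pixels that fall in densely populated colour regions.
r_bin = r // 8          # 256 / 32 = 8 intensity levels per bin
g_bin = g // 8
mask = (hist2d[r_bin, g_bin] > hist2d.mean()).reshape(image.shape[:2])
print("segmented fraction:", mask.mean())
```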
535 Text Mining Technique for Data Mining Application
Authors: M. Govindarajan
Abstract:
Text mining applies knowledge discovery techniques to unstructured text and is also termed knowledge discovery in text (KDT) or text data mining. Among these techniques, the decision tree approach is most useful for classification problems: a tree is constructed to model the classification process in two basic steps, building the tree and applying the tree to the database. This paper describes a proposed C5.0 classifier that uses rulesets, cross-validation and boosting on top of the original C5.0 in order to reduce the error ratio. The feasibility and the benefits of the proposed approach are demonstrated by means of a medical data set (hypothyroid). The performance of a classifier on the training cases from which it was constructed gives a poor estimate of its accuracy; by sampling or by using a separate test file, the classifier is instead evaluated on cases that were not used to build it. If the cases in hypothyroid.data and hypothyroid.test were shuffled and divided into a new 2772-case training set and a 1000-case test set, C5.0 might construct a different classifier with a lower or higher error rate on the test cases. An important feature of See5 is its ability to generate classifiers called rulesets; the ruleset has an error rate of 0.5% on the test cases. The standard errors of the means provide an estimate of the variability of the results, and a more reliable estimate of predictive accuracy is obtained by f-fold cross-validation, where the error rate of a classifier produced from all the cases is estimated as the ratio of the total number of errors on the hold-out cases to the total number of cases. The Boost option with x trials instructs See5 to construct up to x classifiers in this manner; trials over numerous datasets, large and small, show that on average 10-classifier boosting reduces the error rate for test cases by about 25%.
Keywords: C5.0, Error Ratio, text mining, training data, test data.
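Since C5.0/See5 itself is a commercial tool not available in scikit-learn, the sketch below uses a CART decision tree and AdaBoost as stand-ins to illustrate the boosting and f-fold cross-validation ideas discussed above; the synthetic data merely mimics the class imbalance of a hypothyroid-style data set.

```python
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

# Synthetic, imbalanced data standing in for the hypothyroid training/test cases.
X, y = make_classification(n_samples=3772, n_features=20, n_informative=8,
                           weights=[0.92, 0.08], random_state=0)

# Single decision tree (analogous to a plain C5.0 tree), evaluated by 10-fold CV.
tree = DecisionTreeClassifier(random_state=0)
tree_scores = cross_val_score(tree, X, y, cv=10)

# Boosting: up to 10 classifiers combined, analogous to "boost with 10 trials".
boosted = AdaBoostClassifier(n_estimators=10, random_state=0)
boost_scores = cross_val_score(boosted, X, y, cv=10)

print("single tree error rate: %.3f" % (1 - tree_scores.mean()))
print("boosted error rate:     %.3f" % (1 - boost_scores.mean()))
```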
534 Products in Early Development Phases: Ecological Classification and Evaluation Using an Interval Arithmetic Based Calculation Approach
Authors: Helen L. Hein, Joachim Schwarte
Abstract:
As a pillar of sustainable development, ecology has become an important research focus, especially due to global challenges like climate change. The ecological performance of products can be scientifically assessed with life cycle assessments. In the construction sector, significant amounts of CO2 emissions are assigned to the energy used for building heating purposes. Therefore, sustainable construction materials for insulating purposes are essential, and aerogels have been explored intensively in recent years due to their low thermal conductivity. In this context, the WALL-ACE project aims to develop an aerogel-based thermal insulating plaster that would achieve very low thermal conductivities. But as, in the early development phases, a lot of information is still missing or not yet accessible, the ecological assessment of innovative products is increasingly based on uncertain data that can lead to significant deviations in the results. To predict realistically how meaningful the results are and how viable the developed products may be on their respective markets, these deviations have to be considered. Therefore, a classification method is presented in this study which may allow comparing the ecological performance of novel products with already established and competitive materials. In order to achieve this, an alternative calculation method was used that allows computing with lower and upper bounds, so that all possible values can be considered without precise data. The life cycle analysis of the considered products was conducted with an interval arithmetic based calculation method. The results lead to the conclusion that the interval solutions describing the possible environmental impacts are so wide that the usability of the results is limited. Nevertheless, further optimization in reducing the environmental impacts of aerogels seems to be needed for them to become more competitive in the future.
Keywords: Aerogel-based, insulating material, early development phase, interval arithmetic.
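A minimal sketch of computing with lower and upper bounds, in the spirit of the interval-arithmetic approach described above; the inventory values, emission factors and the Interval class are hypothetical and not taken from the WALL-ACE project.

```python
from dataclasses import dataclass

@dataclass
class Interval:
    """A closed interval [lo, hi] standing in for an uncertain inventory value."""
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        products = (self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi)
        return Interval(min(products), max(products))

# Hypothetical inventory: amount of material (kg) times emission factor (kg CO2-eq/kg),
# each known only as a lower/upper bound in the early development phase.
aerogel = Interval(0.8, 1.2) * Interval(2.5, 4.0)
binder = Interval(3.0, 3.5) * Interval(0.6, 0.9)
total = aerogel + binder
print(f"GWP estimate: [{total.lo:.2f}, {total.hi:.2f}] kg CO2-eq")
```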
533 The Optimization of Decision Rules in Multimodal Decision-Level Fusion Scheme
Authors: Andrey V. Timofeev, Dmitry V. Egorov
Abstract:
This paper introduces an original method of parametric optimization of the structure of a multimodal decision-level fusion scheme, which combines the partial solutions of the classification task obtained from an assembly of mono-modal classifiers. As a result, a multimodal fusion classifier which has the minimum value of the total error rate has been obtained.
Keywords: Classification accuracy, fusion solution, total error rate.
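The snippet below is a hedged, brute-force illustration of choosing fusion weights that minimise the total error rate of a decision-level fusion of three mono-modal classifiers; the posteriors are random placeholders and the grid search merely stands in for the paper's parametric optimization procedure.

```python
import numpy as np
from itertools import product

# Hypothetical per-class posterior outputs of three mono-modal classifiers
# for a batch of samples, plus the true labels.
rng = np.random.default_rng(0)
n_samples, n_classes = 300, 2
posteriors = [rng.dirichlet(np.ones(n_classes), size=n_samples) for _ in range(3)]
labels = rng.integers(0, n_classes, size=n_samples)

def total_error(weights):
    """Total error rate of the fused classifier for a given weight vector."""
    fused = sum(w * p for w, p in zip(weights, posteriors))
    return np.mean(fused.argmax(axis=1) != labels)

# Brute-force search over a weight grid; keep the weights with minimum total error.
grid = np.linspace(0.0, 1.0, 11)
best = min((total_error((w1, w2, 1.0 - w1 - w2)), (w1, w2, 1.0 - w1 - w2))
           for w1, w2 in product(grid, grid) if w1 + w2 <= 1.0)
print("minimum total error rate:", round(best[0], 3), "weights:", best[1])
```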
532 A Study on Intuitionistic Fuzzy h-ideal in Γ-Hemirings
Authors: S.K. Sardar, D. Mandal, R. Mukherjee
Abstract:
The notions of intuitionistic fuzzy h-ideal and normal intuitionistic fuzzy h-ideal in a Γ-hemiring are introduced and some of the basic properties of these ideals are investigated. The Cartesian product of intuitionistic fuzzy h-ideals is also defined. Finally, a characterization of intuitionistic fuzzy h-ideals in terms of fuzzy relations is obtained.
Keywords: Γ-hemiring, fuzzy h-ideal, normal, Cartesian product.
Mathematics Subject Classification (2000): 08A72, 16Y99.
531 Safety of Industrial Networks
Authors: P. Vazan, P. Tanuska, M. Kebisek, S. Duchovicova
Abstract:
The paper deals with communication standards for control and production systems. The authors formulate the requirements for communication security protection. The paper is focused on the application protocols of industrial networks and their basic classification. The typical attacks are analysed, and safety protection based on the requirements of a specific industrial network is suggested and defined in this paper.
Keywords: Application protocols, communication standards, industrial networks.
530 An Optimal Subclass Detection Method for Credit Scoring
Authors: Luciano Nieddu, Giuseppe Manfredi, Salvatore D'Acunto, Katia La Regina
Abstract:
In this paper a non-parametric statistical pattern recognition algorithm for the problem of credit scoring is presented. The proposed algorithm is based on a k-means clustering algorithm and allows for the determination of subclasses of homogeneous elements in the data. The algorithm is tested on two benchmark datasets and its performance compared with other well known pattern recognition algorithms for credit scoring.
Keywords: Constrained clustering, Credit scoring, Statistical pattern recognition, Supervised classification.
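A hedged sketch of the subclass idea described above: each class is split into homogeneous subclasses with k-means, and new samples are assigned the label of the nearest subclass centroid. The data, the number of subclasses and the distance rule are illustrative assumptions, not the paper's exact constrained-clustering algorithm.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Synthetic data standing in for a credit-scoring benchmark (0 = good, 1 = bad).
X, y = make_classification(n_samples=600, n_features=10, n_informative=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Split each class into k homogeneous subclasses and keep the centroids.
k = 3
centroids, centroid_labels = [], []
for cls in np.unique(y_tr):
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X_tr[y_tr == cls])
    centroids.append(km.cluster_centers_)
    centroid_labels.extend([cls] * k)
centroids = np.vstack(centroids)
centroid_labels = np.array(centroid_labels)

# Classify test samples by the label of the nearest subclass centroid.
dists = np.linalg.norm(X_te[:, None, :] - centroids[None, :, :], axis=2)
y_pred = centroid_labels[dists.argmin(axis=1)]
print("accuracy:", (y_pred == y_te).mean())
```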
529 A Supply Chain Perspective of RFID Systems
Authors: A. N. Nambiar
Abstract:
Radio Frequency Identification (RFID), initially introduced during WW-II, has revolutionized the world with its numerous benefits and plethora of implementations in diverse areas ranging from manufacturing to agriculture to healthcare to hotel management. This work reviews the current research in this area with emphasis on applications for supply chain management and develops a taxonomic framework to classify the literature, which will enable swift and easy content analysis and also help identify areas for future research.
Keywords: RFID, supply chain, applications, classification framework.
528 Advanced Technologies and Algorithms for Efficient Portfolio Selection
Authors: Konstantinos Liagkouras, Konstantinos Metaxiotis
Abstract:
In this paper we present a classification of the various technologies applied to the solution of the portfolio selection problem according to the discipline and the methodological framework followed. We provide a concise presentation of the emerging categories and try to identify which methods are considered obsolete and which lie at the heart of the debate. On top of that, we provide a comparative study of the different technologies applied for efficient portfolio construction and we suggest potential paths for future work that lie at the intersection of the presented techniques.
Keywords: Portfolio selection, optimization techniques, financial models, stochastics, heuristics.
527 Analysis of Sonographic Images of Breast
Authors: M. Bastanfard, S. Jafari, B. Jalaeian
Abstract:
Ultrasound images are a very useful diagnostic tool for distinguishing benign from malignant masses of the breast. However, there is a considerable overlap between benignancy and malignancy in ultrasonic images, which makes them difficult to interpret. In this paper, a new noise removal algorithm was used to improve the images and the classification process. The masses are classified by wavelet transform coefficients and morphological and textural features as a novel feature set for this goal. Bayesian estimation theory is used to classify the tissues into three classes according to their features.
Keywords: Bayesian estimation theory, breast, ultrasound, wavelet.
526 EEG Waves Classifier using Wavelet Transform and Fourier Transform
Authors: Maan M. Shaker
Abstract:
The electroencephalograph (EEG) signal is one of the most widely used signals in the bioinformatics field due to its rich information about human tasks. In this work, EEG wave classification is achieved using the Discrete Wavelet Transform (DWT) together with the Fast Fourier Transform (FFT), applied to normalized EEG data. The DWT is used as a classifier of the EEG wave frequencies, while the FFT is implemented to visualize the EEG waves in the multi-resolution representation of the DWT. Several real EEG data sets (for both normal and abnormal persons) have been tested and the results support the validity of the proposed technique.
Keywords: Bioinformatics, DWT, EEG waves, FFT.
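A hedged sketch of the DWT sub-band decomposition plus FFT inspection described above, using PyWavelets and NumPy on a synthetic one-second trace; the sampling rate, wavelet choice and band labels are illustrative assumptions.

```python
import numpy as np
import pywt

# Synthetic 1-second "EEG" trace sampled at 128 Hz: a 10 Hz alpha component plus noise.
fs = 128
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(fs)

# Multi-level DWT: each level roughly halves the frequency band, so the
# coefficients act as a coarse separation of the EEG sub-bands.
coeffs = pywt.wavedec(signal, "db4", level=4)
names = ["approx (~0-4 Hz)", "detail L4 (~4-8 Hz)", "detail L3 (~8-16 Hz)",
         "detail L2 (~16-32 Hz)", "detail L1 (~32-64 Hz)"]
for name, c in zip(names, coeffs):
    print(f"{name:22s} energy = {np.sum(c ** 2):.2f}")

# FFT of the raw trace to visualise where the dominant rhythm lies.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
print("dominant frequency (Hz):", freqs[spectrum.argmax()])
```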
525 Improvement of the Q-System Using the Rock Engineering System: A Case Study of Water Conveyor Tunnel of Azad Dam
Authors: S. Golmohammadi, M. Noorian Bidgoli
Abstract:
Because the status and mechanical parameters of discontinuities in the rock mass are included in the calculations, various rock engineering classification methods are often used as a starting point for the design of different types of structures. The Q-system is one of the most frequently used methods for stability analysis and for determining support systems of underground structures in rock, including tunnels. In this method, six main parameters of the rock mass are required, namely the Rock Quality Designation (RQD), joint set number (Jn), joint roughness number (Jr), joint alteration number (Ja), joint water parameter (Jw) and Stress Reduction Factor (SRF). In this regard, in order to achieve a reasonable and optimal design, identifying the parameters that are effective for the stability of the mentioned structures is one of the most important goals and most necessary actions in rock engineering. Therefore, it is necessary to study the relationships between the parameters of a system, how they interact with each other and, ultimately, the whole system. In this research, an attempt has been made to determine the most effective parameters (key parameters) among the six rock mass parameters of the Q-system using the Rock Engineering System (RES) method, in order to improve the relationships between the parameters in the calculation of the Q value. The RES system is, in fact, a method by which one can determine the degree of cause and effect of a system's parameters by constructing an interaction matrix. In this research, the geomechanical data collected from the water conveyor tunnel of Azad Dam were used to construct the interaction matrix of the Q-system. For this purpose, instead of using the conventional methods, which are always accompanied by defects such as uncertainty, the Q-system interaction matrix is coded using a technique that is in effect a statistical analysis of the data and a determination of the correlation coefficients between them. In this way, the effect of each parameter on the system is evaluated with greater certainty. The results of this study show that the constructed interaction matrix provides a reasonable estimate of the effective parameters in the Q-system. Among the six parameters of the Q-system, the SRF and Jr parameters have the maximum and minimum impact on the system, respectively, while the RQD and Jw parameters are the most and least affected by the system, respectively. Therefore, by developing this method, a more accurate relation for rock mass classification can be obtained by weighting the required parameters in the Q-system.
Keywords: Q-system, Rock Engineering System, statistical analysis, rock mass, tunnel.
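A hedged sketch of coding an interaction matrix from correlation coefficients, in the spirit of the statistically coded RES matrix described above; the logged parameter values are random placeholders, and using absolute correlations as interaction codes is an assumption made only for illustration.

```python
import numpy as np

# Hypothetical logged values of the six Q-system parameters along the tunnel
# (rows = survey stations, columns = RQD, Jn, Jr, Ja, Jw, SRF).
rng = np.random.default_rng(0)
params = ["RQD", "Jn", "Jr", "Ja", "Jw", "SRF"]
data = rng.normal(size=(80, 6))

# Code the RES interaction matrix with absolute correlation coefficients
# (off-diagonal entries only), as a stand-in for expert-assigned codes.
corr = np.abs(np.corrcoef(data, rowvar=False))
interaction = corr - np.diag(np.diag(corr))

# "Cause" = row sums (influence of a parameter on the system),
# "Effect" = column sums (influence of the system on a parameter).
cause = interaction.sum(axis=1)
effect = interaction.sum(axis=0)
for name, c, e in zip(params, cause, effect):
    print(f"{name:4s} cause={c:.2f} effect={e:.2f} interactivity={c + e:.2f}")
```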
524 Research on Software Security Testing
Authors: Gu Tian-yang, Shi Yin-sheng, Fang You-yuan
Abstract:
Software security testing is an important means of ensuring software security and trustworthiness. This paper first discusses the definition and classification of software security testing and broadly surveys its methods and tools. It then analyses the advantages, disadvantages and scope of application of the various methods and presents a taxonomy of security testing tools. Finally, the paper points out future focus and development directions of software security testing technology.
Keywords: security testing, security functional testing, security vulnerability testing, testing method, testing tool.
523 Rapid Monitoring of Earthquake Damages Using Optical and SAR Data
Authors: Saeid Gharechelou, Ryutaro Tateishi
Abstract:
Earthquakes are inevitable catastrophic natural disasters. Damage to buildings and man-made structures, where most human activities take place, is the major cause of earthquake casualties. A comparison of optical and SAR data is presented for the Kathmandu valley, which was severely shaken by the 2015 Nepal earthquake. Although many existing studies are based on optical data, or suggest the combined use of optical and SAR data for improved accuracy, cloud-free optical images are not guaranteed to be available when urgently needed. Therefore, this research focuses on developing a SAR-based technique aimed at rapid and accurate geospatial reporting, taking into account the limited time available in a post-disaster situation and offering quick computation based exclusively on two pairs of pre-seismic and co-seismic single look complex (SLC) images. The InSAR coherence between the pre-seismic, co-seismic and post-seismic images was used to detect changes in the damaged area. In addition, ground truth data collected in the field were applied to the optical data through random forest classification for detection of the damaged area, and were also used to assess the accuracy of the supervised classification approach. Although a higher accuracy was obtained from the optical data than from the optical-SAR integration, the availability of cloud-free images when urgently needed after an earthquake event is not assured, so further research on improving SAR-based damage detection is suggested. Very accurate damage information is essential for channelling the rescue and emergency operations, and it is expected that the quick reporting of the post-disaster damage situation quantified by the rapid earthquake assessment should assist in channelling these operations and in informing the public about the scale of the damage.
Keywords: Sentinel-1A data, Landsat-8, earthquake damage, InSAR, rapid monitoring, 2015-Nepal earthquake.
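A minimal sketch of the coherence-change idea behind the SAR-based detection described above: pixels that keep high pre-seismic coherence but lose it across the event are flagged as candidate damage. The coherence maps and thresholds here are synthetic, illustrative assumptions rather than values from the Kathmandu study.

```python
import numpy as np

# Hypothetical coherence maps (values in [0, 1]) derived from two InSAR pairs:
# pre-seismic (both images before the event) and co-seismic (spanning the event).
rng = np.random.default_rng(0)
coh_pre = np.clip(rng.normal(0.7, 0.1, size=(500, 500)), 0, 1)
coh_co = np.clip(coh_pre - np.clip(rng.normal(0.15, 0.1, size=(500, 500)), 0, None), 0, 1)

# Coherence-change index: built-up areas that collapsed lose coherence across
# the event, so a large pre-to-co drop is a candidate damage indicator.
change = coh_pre - coh_co
damage_mask = (coh_pre > 0.5) & (change > 0.3)   # thresholds are illustrative only
print("flagged damage fraction: %.1f%%" % (100 * damage_mask.mean()))
```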
522 Clustered Signatures for Modeling and Recognizing 3D Rigid Objects
Authors: H. B. Darbandi, M. R. Ito, J. Little
Abstract:
This paper describes a probabilistic method for three-dimensional object recognition using a shared pool of surface signatures. This technique uses flatness, orientation, and convexity signatures that encode the surface of a free-form object into three discriminative vectors, and then creates a shared pool of data by clustering the signatures using a distance function. This method applies Bayes's rule in the recognition process, and it is extensible to a large collection of three-dimensional objects.
Keywords: Object recognition, modeling, classification, computer vision.
521 Person Identification using Gait by Combined Features of Width and Shape of the Binary Silhouette
Authors: M.K. Bhuyan, Aragala Jagan.
Abstract:
Current image-based individual human recognition methods, such as fingerprint, face, or iris biometric modalities, generally require a cooperative subject, views from certain aspects, and physical contact or close proximity. These methods cannot reliably recognize non-cooperating individuals at a distance in the real world under changing environmental conditions. Gait, which concerns recognizing individuals by the way they walk, is a relatively new biometric without these disadvantages. The inherent gait characteristic of an individual makes it irreplaceable and useful in visual surveillance. In this paper, an efficient gait recognition system for human identification is proposed, based on extracting two features, namely the width vector of the binary silhouette and the MPEG-7 region-based shape descriptors. In the proposed method, foreground objects, i.e., humans and other moving objects, are extracted by estimating background information with a Gaussian Mixture Model (GMM); subsequently, a median filtering operation is performed to remove noise in the background-subtracted image. A moving target classification algorithm, based on shape and boundary information, is used to separate human beings (i.e., pedestrians) from other foreground objects (viz., vehicles). Subsequently, the width vector of the outer contour of the binary silhouette and the MPEG-7 Angular Radial Transform coefficients are taken as the feature vector. Next, Principal Component Analysis (PCA) is applied to the selected feature vector to reduce its dimensionality. The extracted feature vectors are used to train a Hidden Markov Model (HMM) for the identification of individuals. The proposed system is evaluated using gait sequences and the experimental results show the efficacy of the proposed algorithm.
Keywords: Gait Recognition, Gaussian Mixture Model, Principal Component Analysis, MPEG-7 Angular Radial Transform.
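A hedged sketch of the front end of the pipeline above (background subtraction, silhouette width vector, PCA) using OpenCV and scikit-learn; the synthetic frames, filter sizes and number of components are illustrative assumptions, and the HMM identification stage is omitted.

```python
import numpy as np
import cv2
from sklearn.decomposition import PCA

# Synthetic frames standing in for a gait sequence: a bright "silhouette" block
# moving over a static noisy background.
rng = np.random.default_rng(0)
frames = []
for i in range(30):
    frame = (20 * rng.random((120, 160))).astype(np.uint8)
    frame[40:100, 10 + 4 * i:30 + 4 * i] = 220          # moving subject
    frames.append(frame)

# Gaussian-mixture background subtraction followed by median filtering.
subtractor = cv2.createBackgroundSubtractorMOG2(history=20, detectShadows=False)
width_vectors = []
for frame in frames:
    mask = subtractor.apply(frame)
    mask = cv2.medianBlur(mask, 5)
    silhouette = (mask > 0).astype(np.uint8)
    # Width vector: number of silhouette pixels per image row.
    width_vectors.append(silhouette.sum(axis=1))

# Reduce the dimensionality of the width-vector features with PCA
# (these reduced features would then be fed to an HMM for identification).
X = np.array(width_vectors, dtype=float)
X_reduced = PCA(n_components=5).fit_transform(X)
print("feature matrix after PCA:", X_reduced.shape)
```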
520 Balancing Tourism and Environment: The ETM Model
Authors: U. V. Jose, Muhammed Nahar, Vijayakumar S., Sonia Jose
Abstract:
Environment, both endowed and built, is essential for tourism. However, tourism and environment maintain a complex relationship, in which the environment is in most cases at the receiving end. Many tourism development activities have adverse environmental effects, mainly emanating from the construction of general infrastructure and tourism facilities. These negative impacts of tourism can lead to the destruction of the precious natural resources on which it depends. These effects vary between locations, and their effect on a hill destination is highly critical. This study aims at developing a sustainable tourism planning model for an environmentally sensitive tourism destination in Kerala, India. Being part of the Nilgiri mountain ranges, Munnar falls in the Western Ghats, one of the biological hotspots of the world. Endowed with a unique high altitude environment, Munnar inherits highly significant ecological wealth. Giving prime importance to the protection of this ecological heritage, the study proposes a tourism planning model with resource conservation and sustainability as the paramount focus. Conceiving a novel approach towards sustainable tourism planning, the study proposes to assess tourism attractions using an Ecological Sensitivity Index (ESI) and a Tourism Attractiveness Index (TAI). The integration of these two indices forms the Ecology-Tourism Matrix (ETM), outlining the base for tourism planning in an environmentally sensitive destination. The ETM matrix leads to a classification of tourism nodes according to their Conservation Significance and Tourism Significance. The spatial integration of such nodes based on the Hub & Spoke Principle constitutes sub-regions within the STZ. The ensuing analyses lead to specific guidelines for the STZ as a whole, specific tourism nodes, hubs and sub-regions. The study results in a multi-dimensional output, viz., (1) a classification system for tourism nodes in an environmentally sensitive region/destination, (2) conservation/tourism development strategies and guidelines for the micro and macro regions, and (3) a sustainable tourism planning tool, particularly for ecologically sensitive destinations, which can be adapted for other destinations as well.
Keywords: Tourism, Environment, Spatial Planning, Model.
519 Solving of the Fourth Order Differential Equations with the Neumann Problem
Authors: Marziyeh Halimi, Roushanak Lotfikar, Simin Mansouri Borojeni
Abstract:
In this paper we consider the Neumann problem for the fourth order differential equation. First we define the weighted Sobolev space W_α^2 and the generalized solution for this equation. Then we consider the existence and uniqueness of the generalized solution, as well as give a description of the spectrum and of the domain of definition of the corresponding operator.
Keywords: Neumann problem, weighted Sobolev spaces, generalized solution, spectrum of linear operators.
2000 Mathematics Subject Classification: 34A05, 34A30.
518 Fault-Tolerant Control Study and Classification: Case Study of a Hydraulic-Press Model Simulated in Real-Time
Authors: Jorge Rodriguez-Guerra, Carlos Calleja, Aron Pujana, Iker Elorza, Ana Maria Macarulla
Abstract:
Society demands more reliable manufacturing processes capable of producing high quality products in shorter production cycles. New control algorithms have been studied to satisfy this paradigm, in which Fault-Tolerant Control (FTC) plays a significant role: it is able to detect, isolate and adapt a system when a harmful or faulty situation appears. In this paper, a general overview of FTC characteristics is given, highlighting the properties a system must ensure to be considered faultless. In addition, research to identify the main FTC techniques is presented, along with a classification based on their characteristics into two main groups: Active Fault-Tolerant Controllers (AFTCs) and Passive Fault-Tolerant Controllers (PFTCs). AFTC encompasses the techniques capable of re-configuring the process control algorithm after the fault has been detected, while PFTC comprehends the algorithms robust enough to bypass the fault without further modifications. The mentioned re-configuration requires two stages, one focused on detection, isolation and identification of the fault source and the other in charge of re-designing the control algorithm by two approaches: fault accommodation and control re-design. From the algorithms studied, one has been selected and applied to a case study based on an industrial hydraulic press. The developed model has been embedded in a real-time validation platform, which allows testing the FTC algorithms and analysing how the system will respond when a fault arises under conditions similar to those a machine will face in a factory. One AFTC approach has been chosen as the methodology the system will follow in the fault recovery process. In a first instance, the fault is detected, isolated and identified by means of a neural network. In a second instance, the control algorithm is re-configured to overcome the fault and continue working without human interaction.
Keywords: Fault-tolerant control, electro-hydraulic actuator, fault detection and isolation, control re-design, real-time.
517 Atrial Fibrillation Analysis Based on Blind Source Separation in 12-lead ECG
Authors: Pei-Chann Chang, Jui-Chien Hsieh, Jyun-Jie Lin, Feng-Ming Yeh
Abstract:
Atrial fibrillation is the most common sustained arrhythmia encountered by clinicians. Because the atrial activity waveform of atrial fibrillation is not visible to the human eye in the surface ECG, it is necessary to develop an automatic diagnosis system. The 12-lead ECG is now available in hospitals and is appropriate for using Independent Component Analysis (ICA) to estimate the atrial activity (AA) period. In this research, we also adopt a second-order blind identification approach to transform the sources extracted by ICA into a more precise signal, and we then use a frequency-domain algorithm to perform the classification. In experiments, we obtain significant results on clinical data.
Keywords: 12-Lead ECG, Atrial Fibrillation, Blind Source Separation, Kurtosis.
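A hedged sketch of the blind-source-separation step: FastICA from scikit-learn separates a synthetic 12-lead mixture into sources, and a frequency-domain check looks for the fibrillatory component. The source models, mixing matrix and frequency band are illustrative assumptions, and the second-order blind identification refinement used in the paper is not shown.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic stand-in for a short 12-lead ECG segment: a "ventricular" source,
# an "atrial" source (higher-frequency fibrillatory activity) and noise,
# mixed into 12 observed leads.
fs = 250
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(0)
ventricular = np.sin(2 * np.pi * 1.2 * t) ** 15            # spiky, QRS-like
atrial = 0.3 * np.sign(np.sin(2 * np.pi * 6.0 * t))        # ~6 Hz fibrillatory activity
noise = 0.05 * rng.standard_normal(len(t))
sources = np.c_[ventricular, atrial, noise]
mixing = rng.normal(size=(3, 12))
leads = sources @ mixing                                    # shape (samples, 12)

# Blind source separation with ICA, then inspect each estimated source in the
# frequency domain: fibrillatory atrial activity typically peaks around 4-9 Hz.
ica = FastICA(n_components=3, random_state=0)
estimated = ica.fit_transform(leads)
freqs = np.fft.rfftfreq(len(t), d=1 / fs)
for i in range(estimated.shape[1]):
    spectrum = np.abs(np.fft.rfft(estimated[:, i]))
    print(f"source {i}: dominant frequency {freqs[spectrum.argmax()]:.1f} Hz")
```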
516 Analysing and Classifying VLF Transients
Authors: Ernst D. Schmitter
Abstract:
Monitoring lightning electromagnetic pulses (sferics) and other terrestrial as well as extraterrestrial transient radiation signals is of considerable interest for practical and theoretical purposes in astro- and geophysics as well as meteorology. To manage a continuous flow of data, automation of the analysis and classification process is important. Features based on a combination of wavelet and statistical methods proved efficient for this task and serve as input to a radial basis function network that is trained to discriminate transient shapes, from pulse-like to wave-like. We concentrate on signals in the Very Low Frequency (VLF, 3-30 kHz) range in this paper, but the developed methods are independent of this specific choice.
Keywords: Transient signals, statistics, wavelets, neural networks
515 Comparison of Performance between Different SVM Kernels for the Identification of Adult Video
Authors: Hajar Bouirouga, Sanaa El Fkihi, Abdeilah Jilbab, Driss Aboutajdine
Abstract:
In this paper we propose a method for the recognition of adult video based on the support vector machine (SVM). Different kernel features are proposed to classify adult videos. SVM has the advantage that it is insensitive to the relative number of training examples in the positive (adult video) and negative (non-adult video) classes. This advantage is illustrated by comparing the performance of different SVM kernels for the identification of adult video.
Keywords: Skin detection, Support vector machine, Pornographic videos, Feature extraction, Video filtering, Classification.
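A minimal scikit-learn sketch of the kernel comparison described above, assuming pre-extracted frame features; the synthetic imbalanced data set and the kernel/parameter choices are illustrative assumptions rather than the paper's setup.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic, imbalanced two-class data standing in for skin/motion features
# extracted from video frames (1 = adult content, 0 = other).
X, y = make_classification(n_samples=1000, n_features=15, n_informative=8,
                           weights=[0.85, 0.15], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

# Compare the performance of several SVM kernels on the same split.
for kernel in ("linear", "poly", "rbf", "sigmoid"):
    clf = SVC(kernel=kernel, gamma="scale")
    clf.fit(X_tr, y_tr)
    print(f"{kernel:8s} accuracy = {clf.score(X_te, y_te):.3f}")
```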