Search results for: Clustering Algorithms.
911 A Combined Meta-Heuristic with Hyper-Heuristic Approach to Single Machine Production Scheduling Problem
Authors: C. E. Nugraheni, L. Abednego
Abstract:
This paper is concerned with the minimization of mean tardiness and flow time in a real single-machine production scheduling problem. Two variants of a genetic algorithm, used as meta-heuristics and combined with a hyper-heuristic approach, are proposed to solve this problem. These methods are applied to instances generated from real-world data from a company. Encouraging results are reported.
Keywords: Hyper-heuristics, evolutionary algorithms, production scheduling.
910 Combining Bagging and Additive Regression
Authors: Sotiris B. Kotsiantis
Abstract:
Bagging and boosting are among the most popular re-sampling ensemble methods that generate and combine a diversity of regression models using the same learning algorithm as base-learner. Boosting algorithms are considered stronger than bagging on noise-free data. However, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, in this work we built an ensemble using an averaging methodology of bagging and boosting ensembles with 10 sub-learners in each one. We performed a comparison with simple bagging and boosting ensembles with 25 sub-learners on standard benchmark datasets and the proposed ensemble gave better accuracy.
Keywords: Regressors, statistical learning.
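As a rough illustration of the combining scheme described above (not the authors' exact setup; the dataset, base learner and scikit-learn classes below are assumptions for the sketch), a bagging ensemble and a boosting ensemble with 10 sub-learners each can be averaged as follows:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import AdaBoostRegressor, BaggingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

# Synthetic regression data standing in for a benchmark dataset.
X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Two ensembles with 10 sub-learners each, sharing the same base learner.
bagging = BaggingRegressor(DecisionTreeRegressor(), n_estimators=10, random_state=0)
boosting = AdaBoostRegressor(DecisionTreeRegressor(max_depth=3), n_estimators=10, random_state=0)
bagging.fit(X_tr, y_tr)
boosting.fit(X_tr, y_tr)

# Combine by averaging the two ensembles' predictions.
y_pred = (bagging.predict(X_te) + boosting.predict(X_te)) / 2.0
print("MAE of averaged ensemble:", mean_absolute_error(y_te, y_pred))
```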
909 Financial Ethics: A Review of 2010 Flash Crash
Authors: Omer Farooq, Salman Ahmed Khan, Sadaf Khalid
Abstract:
Modern-day stock markets have become almost entirely automated. Even though this means increased profits for investors, with algorithms acting on the slightest price change within microseconds, it has also given birth to many ethical dilemmas, in the sense that the slightest mistake can cause people to lose their livelihoods. This paper reviews one such event, which happened on May 06, 2010, in which $1 trillion disappeared from the Dow Jones Industrial Average. We discuss its various aspects and the ethical dilemmas that have arisen from it.
Keywords: Flash Crash, Market Crash, Stock Market, Stock Market Crash.
908 A Novel FFT-Based Frequency Offset Estimator for OFDM Systems
Authors: Mahdi Masoumi, Mehrdad Ardebilipoor, Seyed Aidin Bassam
Abstract:
This paper proposes a novel frequency offset (FO) estimator for orthogonal frequency division multiplexing (OFDM). Simplicity is the most significant feature of this algorithm, and it can be repeated to achieve acceptable accuracy. Moreover, the fractional and integer parts of the FO are estimated jointly using the same algorithm. To do so, instead of the conventional algorithms that usually rely on a correlation function, we use the DFT of the received signal. Complexity is therefore reduced, and the synchronization procedure can be carried out by the same hardware that is used to demodulate the OFDM symbol. Finally, computer simulations show that the accuracy of this method is better than that of other conventional methods.
Keywords: DFT, Estimator, Frequency Offset, IEEE802.11a, OFDM.
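The abstract does not spell out the estimator, so the sketch below is only one way to illustrate the DFT-based idea: the integer frequency offset is recovered from the cyclic shift of known pilot subcarriers in the DFT of a received symbol. The FFT size, pilot layout and noise level are hypothetical assumptions, not taken from the paper.

```python
import numpy as np

N = 64                                   # FFT size (IEEE 802.11a-like, illustrative)
pilot_idx = np.array([7, 21, 43, 57])    # hypothetical pilot subcarriers
tx_freq = np.zeros(N, dtype=complex)
tx_freq[pilot_idx] = 1.0                 # unit pilots; data subcarriers omitted for clarity
tx_time = np.fft.ifft(tx_freq)

true_offset = 3                          # integer CFO in subcarrier spacings
n = np.arange(N)
rx_time = tx_time * np.exp(2j * np.pi * true_offset * n / N)   # CFO shifts the spectrum
rx_time += 0.01 * (np.random.randn(N) + 1j * np.random.randn(N))  # small noise

rx_freq = np.fft.fft(rx_time)
# Score every candidate shift by the pilot energy it would explain.
scores = [np.sum(np.abs(rx_freq[(pilot_idx + k) % N])) for k in range(N)]
est = int(np.argmax(scores))
est = est if est <= N // 2 else est - N  # map to a signed offset
print("estimated integer CFO:", est)     # expected: 3
```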
907 Price Prediction Line, Investment Signals and Limit Conditions Applied for the German Financial Market
Authors: Cristian Păuna
Abstract:
In the first decades of the 21st century, in the electronic trading environment, algorithmic capital investments became the primary tool to make a profit by speculation in financial markets. A significant number of traders, private and institutional investors, participate in the capital markets every day using automated algorithms. Autonomous trading software is today a considerable part of the business intelligence system of any modern financial activity. The trading decisions and orders are made automatically by computers using different mathematical models. This paper presents one of these models, called the Price Prediction Line. A mathematical algorithm is presented to build a reliable trend line, which is the basis for limit conditions and automated investment signals, the core of a computerized investment system. The paper explains how to apply these tools to generate entry and exit investment signals and limit conditions that build a mathematical filter for investment opportunities, and describes the methodology to integrate all of these into automated investment software. The paper also presents trading results obtained for the leading German financial market index with the presented methods, in order to analyze and compare different automated investment algorithms. It was found that a specific mathematical algorithm can be optimized and integrated into an automated trading system with good and sustained results for the leading German market. Investment results are compared in order to qualify the presented model. In conclusion, a risk-to-reward ratio of 1:6.12 was obtained by applying the trigonometric method to the DAX Deutscher Aktienindex over a 24-month investment period. These results are superior to those obtained with other similar models, as this paper reveals. The general idea supported by this paper is that the Price Prediction Line model is a reliable capital investment methodology that can be successfully applied to build an automated investment system with excellent results.
Keywords: Algorithmic trading, automated investment system, DAX Deutscher Aktienindex.
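The paper's trigonometric construction is not reproduced here; as a hedged sketch of the general idea only, a least-squares trend line with symmetric limit conditions could generate entry and exit signals as below (the synthetic price series, window length and margin are illustrative assumptions):

```python
import numpy as np

np.random.seed(0)
prices = 13000 + np.cumsum(np.random.randn(250))   # synthetic daily closes (illustrative)
t = np.arange(len(prices))

# Fit a straight trend line by least squares over a recent window.
window = 50
slope, intercept = np.polyfit(t[-window:], prices[-window:], 1)
trend = slope * t + intercept                      # the "prediction line"

# Simple limit conditions: flag a long entry when price dips a fixed margin below
# the trend line, and an exit when it rises the same margin above it.
margin = 5.0                                       # hypothetical threshold
entry = prices < trend - margin
exit_ = prices > trend + margin
print("entry signals:", int(entry.sum()), "exit signals:", int(exit_.sum()))
```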
906 Using Genetic Algorithm to Improve Information Retrieval Systems
Authors: Ahmed A. A. Radwan, Bahgat A. Abdel Latef, Abdel Mgeid A. Ali, Osman A. Sadek
Abstract:
This study investigates the use of genetic algorithms in information retrieval. The method is shown to be applicable to three well-known document collections, where more relevant documents are presented to users after the genetic modification. In this paper we present a new fitness function for approximate information retrieval which is faster and more flexible than the cosine similarity fitness function.
Keywords: Cosine similarity, Fitness function, Genetic Algorithm, Information Retrieval, Query learning.
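The new fitness function itself is not given in the abstract; the sketch below only illustrates the cosine-similarity baseline fitness it is compared against, inside one toy GA generation (the vocabulary size, population size and mutation step are assumptions):

```python
import numpy as np

def cosine_fitness(query, relevant_docs):
    """Baseline fitness: mean cosine similarity between a candidate query
    vector and the term vectors of known-relevant documents."""
    q = query / (np.linalg.norm(query) + 1e-12)
    d = relevant_docs / (np.linalg.norm(relevant_docs, axis=1, keepdims=True) + 1e-12)
    return float(np.mean(d @ q))

# Toy term-weight vectors over a hypothetical 6-term vocabulary.
rng = np.random.default_rng(0)
relevant = rng.random((5, 6))
population = rng.random((20, 6))          # candidate query vectors for the GA

# One GA generation: score, keep the best half, refill by mutation.
scores = np.array([cosine_fitness(q, relevant) for q in population])
parents = population[np.argsort(scores)[-10:]]
children = parents + 0.1 * rng.standard_normal(parents.shape)
population = np.vstack([parents, children])
print("best fitness so far:", scores.max())
```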
905 Performance Evaluation of Packet Scheduling with Channel Conditioning Aware Based On WiMAX Networks
Authors: Elmabruk Laias, Abdalla M. Hanashi, Mohammed Alnas
Abstract:
Worldwide Interoperability for Microwave Access (WiMAX) has become one of the most challenging issues, since it is responsible for distributing the available network resources among all users; this has led to the demand for constructing and designing highly efficient scheduling algorithms in order to improve network utilization, increase network throughput, and minimize end-to-end delay. In this study, the proposed algorithm focuses on an efficient mechanism to serve non-real-time traffic in congested networks by considering channel status.
Keywords: WiMAX, Quality of Service (QoS), OPNET, Diff-Serv (DS).
904 The Role of Optimization and Machine Learning in e-Commerce Logistics in 2030
Authors: Vincenzo Capalbo, Gianpaolo Ghiani, Emanuele Manni
Abstract:
Global e-commerce sales have reached unprecedented levels in the past few years. As this trend is only predicted to go up as we continue into the ’20s, new challenges will be faced by companies when planning and controlling e-commerce logistics. In this paper, we survey the related literature on Optimization and Machine Learning as well as on combined methodologies. We also identify the distinctive features of next-generation planning algorithms - namely scalability, model-and-run features and learning capabilities - that will be fundamental to cope with the scale and complexity of logistics in the next decade.
Keywords: e-Commerce, Logistics, Machine Learning, Optimization.
903 Digital Image Watermarking in the Wavelet Transform Domain
Authors: Kamran Hameed, Adeel Mumtaz, S.A.M. Gilani
Abstract:
In this paper, we start by characterizing the most important and distinguishing features of wavelet-based watermarking schemes. We studied the overwhelming number of algorithms proposed in the literature. The copyright protection application scenario is considered and, building on the experience that was gained, two distinguishing watermarking schemes were implemented. A detailed comparison and the obtained results are presented and discussed. We conclude that Joo's [1] technique is more robust to standard noise attacks than Dote's [2] technique.
Keywords: Digital image, Copyright protection, Watermarking, Wavelet transform.
902 Effective Features for Disambiguation of Turkish Verbs
Authors: Zeynep Orhan, Zeynep Altan
Abstract:
This paper summarizes the results of some experiments for finding effective features for the disambiguation of Turkish verbs. Word sense disambiguation is a current area of investigation in which verbs have the dominant role. Verbs generally have more senses on average than other types of words, and detecting these features for verbs may lead to improvements for other word types. In this paper we consider only the syntactic features that can be obtained from the corpus, tested using some well-known machine learning algorithms.
Keywords: Word sense disambiguation, feature selection.
901 On the Performance of Information Criteria in Latent Segment Models
Authors: Jaime R. S. Fonseca
Abstract:
Notwithstanding the widespread application of finite mixture models in segmentation, finite mixture model selection is still an important issue. In fact, the selection of an adequate number of segments is a key issue in deriving latent segment structures, and it is desirable that the selection criteria used for this purpose are effective. In order to select among several information criteria which may support the selection of the correct number of segments, we conduct a simulation study. In particular, this study is intended to determine which information criteria are more appropriate for mixture model selection when considering data sets with only categorical segmentation base variables. The generation of mixtures of multinomial data supports the proposed analysis. As a result, we establish a relationship between the level of measurement of segmentation variables and the performance of the (eleven) information criteria. The criterion AIC3 shows the best performance (it indicates the correct number of segments of the simulated structure more often) for mixtures of multinomial segmentation base variables.
Keywords: Quantitative Methods, Multivariate Data Analysis, Clustering, Finite Mixture Models, Information Theoretical Criteria, Simulation experiments.
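As a small worked illustration of the criteria being compared (the fitted log-likelihoods and parameter counts below are made-up numbers, not results from the simulation study), AIC, AIC3 and BIC differ only in the per-parameter penalty:

```python
import numpy as np

def information_criteria(log_likelihood, n_params, n_obs):
    """AIC, AIC3 and BIC for a fitted finite mixture model."""
    return {
        "AIC":  -2.0 * log_likelihood + 2.0 * n_params,
        "AIC3": -2.0 * log_likelihood + 3.0 * n_params,          # heavier per-parameter penalty
        "BIC":  -2.0 * log_likelihood + n_params * np.log(n_obs),
    }

# Hypothetical fits with 1..4 latent segments: (log-likelihood, free parameters).
fits = {1: (-1520.3, 5), 2: (-1441.7, 11), 3: (-1437.9, 17), 4: (-1436.2, 23)}
for segments, (ll, k) in fits.items():
    print(segments, information_criteria(ll, k, n_obs=600))
# The selected number of segments is the one minimising the chosen criterion.
```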
900 Maximum Power Point Tracking Using FLC Tuned with GA
Authors: Mohamed Amine Haraoubia, Abdelaziz Hamzaoui, Najib Essounbouli
Abstract:
The pursuit of maximum power point tracking (MPPT) has led to the development of many kinds of controllers, one of which is the Fuzzy Logic Controller, which has proven its worth. To further tune this controller, this paper discusses and analyzes the use of Genetic Algorithms to tune the Fuzzy Logic Controller. It provides an introduction to both systems and tests their compatibility and performance.
Keywords: Fuzzy logic controller (FLC), fuzzy logic (FL), genetic algorithm (GA), maximum power point (MPP), maximum power point tracking (MPPT).
899 Automated Transformation of 3D Point Cloud to Building Information Model: Leveraging Algorithmic Modeling for Efficient Reconstruction
Authors: Radul Shishkov, Petar Penchev
Abstract:
The digital era has revolutionized architectural practices, with Building Information Modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research presents a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data — a collection of data points in space, typically produced by 3D scanners — into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historical preservation.
Keywords: Algorithmic modeling, Building Information Modeling, point cloud, reconstruction.
898 A Fuzzy Approach to Liver Tumor Segmentation with Zernike Moments
Authors: Abder-Rahman Ali, Antoine Vacavant, Manuel Grand-Brochier, Adélaïde Albouy-Kissi, Jean-Yves Boire
Abstract:
In this paper, we present a new segmentation approach for liver lesions in regions of interest within MRI (Magnetic Resonance Imaging). This approach, based on a two-cluster Fuzzy C-Means methodology, considers a variable compactness parameter to handle uncertainty. Fine boundaries are detected by a local recursive merging of ambiguous pixels with a sequential forward floating selection with Zernike moments. The method has been tested on both synthetic and real images. When applied to synthetic images, the proposed approach provides good performance: the segmentations obtained are accurate, their shape is consistent with the ground truth, and the extracted information is reliable. The results obtained on MR images confirm these observations. Our approach allows, even for difficult cases of MR images, the extraction of a segmentation with good performance in terms of accuracy and shape, which implies that the geometry of the tumor is preserved for further clinical activities (such as automatic extraction of pharmacokinetic properties, lesion characterization, etc.).
Keywords: Defuzzification, floating search, fuzzy clustering, Zernike moments.
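The compactness-weighted variant and the Zernike-moment refinement are beyond a short sketch; the snippet below shows only a plain two-cluster fuzzy c-means on pixel intensities, with a synthetic image standing in for the MRI region of interest (all data and parameters are assumptions):

```python
import numpy as np

def fuzzy_cmeans(x, c=2, m=2.0, n_iter=100, seed=0):
    """Standard fuzzy c-means on a 1-D feature (e.g. pixel intensity)."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), c))
    u /= u.sum(axis=1, keepdims=True)                 # fuzzy memberships
    for _ in range(n_iter):
        w = u ** m
        centers = (w * x[:, None]).sum(axis=0) / w.sum(axis=0)
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        u = 1.0 / (d ** (2.0 / (m - 1.0)))            # inverse-distance memberships
        u /= u.sum(axis=1, keepdims=True)
    return centers, u

# Synthetic "region of interest": dark background plus a brighter lesion-like blob.
img = np.zeros((64, 64))
img[20:40, 20:40] = 0.8
img += 0.05 * np.random.default_rng(1).standard_normal(img.shape)

centers, u = fuzzy_cmeans(img.ravel())
lesion_mask = (u.argmax(axis=1) == centers.argmax()).reshape(img.shape)
print("lesion pixels:", int(lesion_mask.sum()))
```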
897 MIMO Radar-Based System for Structural Health Monitoring and Geophysical Applications
Authors: Davide D’Aria, Paolo Falcone, Luigi Maggi, Aldo Cero, Giovanni Amoroso
Abstract:
The paper presents a methodology for real-time structural health monitoring and geophysical applications. The key elements of the system are a high-performance MIMO RADAR sensor, an optical camera and a dedicated set of software algorithms encompassing interferometry, tomography and photogrammetry. The MIMO radar sensor proposed in this work provides an extremely high sensitivity to displacements, making the system able to react to tiny deformations (of tens of microns) with a time scale which spans from milliseconds to hours. The MIMO feature makes the system capable of providing a set of two-dimensional images of the observed scene, each mapped on the azimuth-range directions, with noticeable resolution in both dimensions and an outstanding repetition rate. The back-scattered energy, which is distributed in the 3D space, is projected on a 2D plane, where each pixel has as coordinates the Line-Of-Sight distance and the cross-range azimuthal angle. At the same time, the high-performance processing unit allows the observed scene to be sensed with remarkable refresh periods (down to milliseconds), thus opening the way for combined static and dynamic structural health monitoring. Thanks to the smart TX/RX antenna array layout, the MIMO data can be processed through a tomographic approach to reconstruct the three-dimensional map of the observed scene. This 3D point cloud is then accurately mapped on a 2D digital optical image through photogrammetric techniques, allowing for easy and straightforward interpretation of the measurements. Once the three-dimensional image is reconstructed, a 'repeat-pass' interferometric approach is exploited to provide the user of the system with high-frequency three-dimensional motion/vibration estimation of each point of the reconstructed image. At this stage, the methodology leverages consolidated atmospheric correction algorithms to provide reliable displacement and vibration measurements.
Keywords: Interferometry, MIMO RADAR, SAR, tomography.
896 A Novel Modified Adaptive Fuzzy Inference Engine and Its Application to Pattern Classification
Authors: J. Hossen, A. Rahman, K. Samsudin, F. Rokhani, S. Sayeed, R. Hasan
Abstract:
The Neuro-Fuzzy hybridization scheme has attracted research interest in pattern classification over the past decade. The present paper proposes a novel Modified Adaptive Fuzzy Inference Engine (MAFIE) for pattern classification. A modified Apriori algorithm technique is utilized to derive a minimal set of decision rules from input-output data sets. A TSK-type fuzzy inference system is constructed by automatically generating membership functions and rules using fuzzy c-means clustering and the Apriori algorithm technique, respectively. The generated adaptive fuzzy inference engine is adjusted by a least-squares fit and a conjugate gradient descent algorithm towards better performance with a minimal set of rules. The proposed MAFIE is able to reduce the number of rules, which increases exponentially as more input variables are involved. The performance of the proposed MAFIE is compared with other existing pattern classification schemes using Fisher's Iris and Wisconsin breast cancer data sets, and shown to be very competitive.
Keywords: Apriori algorithm, Fuzzy C-means, MAFIE, TSK.
895 Intelligent Audio Watermarking using Genetic Algorithm in DWT Domain
Authors: M. Ketcham, S. Vongpradhip
Abstract:
In this paper, an innovative watermarking scheme for audio signals based on genetic algorithms (GA) in the discrete wavelet transform domain is proposed. It is robust against the watermarking attacks commonly employed in the literature. In addition, the quality of the watermarked signal is also considered. We employ a GA to determine the optimal localization and intensity of the watermark. The watermark detection process can be performed without using the original audio signal. The experimental results demonstrate that the watermark is inaudible and robust to many digital signal processing operations, such as cropping, low-pass filtering, and additive noise.
Keywords: Intelligent Audio Watermarking, Genetic Algorithm, DWT Domain.
894 Cluster Based Ant Colony Routing Algorithm for Mobile Ad-Hoc Networks
Authors: Alaa E. Abdallah, Bajes Y. Alskarnah
Abstract:
Ant colony based routing algorithms are known to guarantee packet delivery, but they suffer from the huge overhead of control messages needed to discover the route. In this paper we utilize the positions of network nodes to group the nodes into connected clusters. Only cluster-heads are used to forward the route discovery control messages. Our simulations show that the new algorithm decreases the overhead dramatically without affecting the delivery rate.
Keywords: Ant colony-based routing, position-based routing, MANET.
893 Automatic Reusability Appraisal of Software Components using Neuro-fuzzy Approach
Authors: Parvinder S. Sandhu, Hardeep Singh
Abstract:
Automatic reusability appraisal could be helpful in evaluating the quality of developed or developing reusable software components and in identifying reusable components from existing legacy systems, which can save the cost of developing the software from scratch. But the issue of how to identify reusable components from existing systems has remained relatively unexplored. In this paper, we present a two-tier approach that studies the structural attributes as well as the usability or relevancy of the component to a particular domain. Latent semantic analysis is used for the feature vector representation of various software domains. It exploits the fact that feature vectors can be seen as documents containing terms (the identifiers present in the components), and so text modeling methods that capture co-occurrence information in low-dimensional spaces can be used. Further, we devised a Neuro-Fuzzy hybrid Inference System, which takes structural metric values as input and calculates the reusability of the software component. A decision tree algorithm is used to decide the initial set of fuzzy rules for the Neuro-fuzzy system. The results obtained are convincing enough to propose the system for economical identification and retrieval of reusable software components.
Keywords: Clustering, ID3, LSA, Neuro-fuzzy System, SVD.
892 Advanced Neural Network Learning Applied to Pulping Modeling
Authors: Z. Zainuddin, W. D. Wan Rosli, R. Lanouette, S. Sathasivam
Abstract:
This paper reports work done to improve the modeling of complex processes when only small experimental data sets are available. Neural networks are used to capture the nonlinear underlying phenomena contained in the data set and to partly eliminate the burden of having to specify completely the structure of the model. Two different types of neural networks were used for the pulping application. Three-layer feed-forward neural networks, trained using Preconditioned Conjugate Gradient (PCG) methods, were used in this investigation. Preconditioning is a method to improve convergence by lowering the condition number and increasing the eigenvalue clustering. The idea is to solve the modified problem M⁻¹Ax = M⁻¹b, where M is a positive-definite preconditioner that is closely related to A. We mainly focused on Preconditioned Conjugate Gradient-based training methods which originated from optimization theory, namely Preconditioned Conjugate Gradient with Fletcher-Reeves Update (PCGF), Preconditioned Conjugate Gradient with Polak-Ribiere Update (PCGP), and Preconditioned Conjugate Gradient with Powell-Beale Restarts (PCGB). The behavior of the PCG methods in the simulations proved to be robust against phenomena such as oscillations due to large step sizes.
Keywords: Convergence, pulping modeling, neural networks, preconditioned conjugate gradient.
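A minimal sketch of the preconditioned conjugate gradient iteration mentioned above, solving M⁻¹Ax = M⁻¹b with a Jacobi (diagonal) preconditioner on a small synthetic symmetric positive-definite system rather than the pulping model:

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-10, max_iter=200):
    """Solve Ax = b for SPD A, preconditioned by M (supplied as M_inv = M^-1)."""
    x = np.zeros_like(b)
    r = b - A @ x                      # initial residual
    z = M_inv @ r                      # preconditioned residual
    p = z.copy()                       # initial search direction
    for _ in range(max_iter):
        Ap = A @ p
        alpha = (r @ z) / (p @ Ap)
        x += alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            break
        z_new = M_inv @ r_new
        beta = (r_new @ z_new) / (r @ z)   # standard PCG direction update
        p = z_new + beta * p
        r, z = r_new, z_new
    return x

rng = np.random.default_rng(0)
Q = rng.standard_normal((50, 50))
A = Q @ Q.T + 50 * np.eye(50)              # SPD test matrix (illustrative)
b = rng.standard_normal(50)
M_inv = np.diag(1.0 / np.diag(A))          # Jacobi preconditioner
x = pcg(A, b, M_inv)
print("residual norm:", np.linalg.norm(A @ x - b))
```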
891 On a Conjecture Regarding the Adam Optimizer
Authors: Mohamed Akrout, Douglas Tweed
Abstract:
The great success of deep learning relies on efficient optimizers, which are the algorithms that decide how to adjust network weights and biases based on gradient information. One of the most effective and widely used optimizers in recent years has been the method of adaptive moments, or Adam, but the mathematical reasons behind its effectiveness are still unclear. Attempts to analyse its behaviour have remained incomplete, in part because they hinge on a conjecture which has never been proven, regarding ratios of powers of the first and second moments of the gradient. Here we show that this conjecture is in fact false, but that a modified version of it is true, and can take its place in analyses of Adam.
Keywords: Adam optimizer, Bock’s conjecture, stochastic optimization, average regret.
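For reference, a minimal sketch of the standard Adam update that the conjecture concerns, with bias-corrected first and second moments of the gradient (the toy quadratic objective is an assumption, not taken from the paper):

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update: exponential moving averages of the gradient (m)
    and squared gradient (v), with bias correction at step t."""
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad ** 2
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Toy use: minimise ||w||^2, whose gradient is 2w.
w = np.array([1.0, -2.0])
m = np.zeros_like(w)
v = np.zeros_like(w)
for t in range(1, 2001):
    w, m, v = adam_step(w, 2 * w, m, v, t, lr=0.01)
print("w after 2000 steps:", w)
```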
890 Intuition Operator: Providing Genomes with Reason
Authors: Grigorios N. Beligiannis, Georgios A. Tsirogiannis, Panayotis E. Pintelas
Abstract:
In this contribution, the use of a new genetic operator is proposed. The main advantage of using this operator is that it helps the evolution procedure converge faster towards the optimal solution of a problem. This new genetic operator is called the "intuition" operator. Generally speaking, one can claim that this operator is a way to include any heuristic or other local knowledge concerning the problem that cannot be embedded in the fitness function. Simulation results show that the use of this operator significantly increases the performance of the classic Genetic Algorithm by increasing the convergence speed of its population.
Keywords: Genetic algorithms, intuition operator, reasonable genomes, complex search space, nonlinear fitness functions
889 An Energy Efficient Algorithm for Distributed Mutual Exclusion in Mobile Ad-hoc Networks
Authors: Sayani Sil, Sukanta Das
Abstract:
This paper reports a distributed mutual exclusion algorithm for mobile ad-hoc networks. The network is clustered hierarchically. The proposed algorithm considers the clustered network as a logical tree and develops a token-passing scheme to achieve mutual exclusion. The performance analysis and simulation results show that its message requirement is optimal, and thus the algorithm is energy efficient.
Keywords: Critical section, Distributed mutual exclusion, Mobile ad-hoc network, Token-based algorithms.
888 Secure and Efficient Transmission of Aggregated Data for Mobile Wireless Sensor Networks
Authors: A. Krishna Veni, R. Geetha
Abstract:
Wireless Sensor Networks (WSNs) are suitable for many real-world scenarios. The retrieval of data is made efficient by data aggregation techniques. Many data aggregation techniques have been proposed, but most of the existing schemes are neither energy efficient nor secure. Moreover, the existing techniques use the traditional clustering approach, where there is a delay during packet transmission since there is no proper scheduling. The presented system uses the Velocity Energy-efficient and Link-aware Cluster-Tree (VELCT) scheme, in which a Data Collection Tree (DCT) improves the lifetime of the network. The VELCT scheme and the construction of the DCT reduce delay and traffic. The network lifetime can be increased by avoiding frequent changes in cluster topology. Secure and Efficient Transmission of Aggregated data (SETA) improves the security of data transmission via the trust value of the nodes prior to the aggregation of data. Since SETA considers only the data from trustworthy nodes for aggregation, it is more secure in transmitting the data, thereby improving the accuracy of the aggregated data.
Keywords: Aggregation, lifetime, network security, wireless sensor network.
887 Wireless Sensor Networks: A Survey on Ultra-Low Power-Aware Design
Authors: Itziar Marín, Eduardo Arceredillo, Aitzol Zuloaga, Jagoba Arias
Abstract:
A distributed wireless sensor network consists of several nodes scattered over an area of interest. These sensors have as their only power supply a pair of batteries that must let them live up to five years without replacement. That is why it is necessary to develop power-aware algorithms that save battery lifetime as much as possible. In this document, a review of power-aware design for sensor nodes is presented. As examples of implementations, some resource and task management, communication, topology control and routing protocols are named.
Keywords: Low Power Design, Power Awareness, Remote Sensing, Wireless Sensor Networks (WSN).
886 Objective Assessment of Psoriasis Lesion Thickness for PASI Scoring using 3D Digital Imaging
Authors: M.H. Ahmad Fadzil, Hurriyatul Fitriyah, Esa Prakasa, Hermawan Nugroho, S.H. Hussein, Azura Mohd. Affandi
Abstract:
Psoriasis is a chronic inflammatory skin condition which affects 2-3% of the population around the world. The Psoriasis Area and Severity Index (PASI) is the gold standard for assessing psoriasis severity as well as treatment efficacy. Although a gold standard, PASI is rarely used because it is tedious and complex. In practice, the PASI score is determined subjectively by dermatologists; therefore inter- and intra-rater variations of assessment can occur even among expert dermatologists. This research develops an algorithm to assess psoriasis lesions for PASI scoring objectively. The focus of this research is thickness assessment, one of the four PASI parameters besides area, erythema and scaliness. Psoriasis lesion thickness is measured by averaging the total elevation from the lesion base to the lesion surface. Thickness values of 122 3D images taken from 39 patients are grouped into 4 PASI thickness scores using K-means clustering. Validation of the lesion base construction is performed using twelve body curvature models and shows good results, with a coefficient of determination (R²) equal to 1.
Keywords: 3D digital imaging, base construction, PASI, psoriasis lesion thickness.
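A hedged sketch of the grouping step only (the synthetic thickness values and the cluster-to-score ordering are assumptions; the paper's actual measurements are not reproduced), using scikit-learn's k-means with four clusters:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Synthetic mean lesion elevations (in mm, illustrative only) for 122 lesions.
thickness = np.concatenate([
    rng.normal(0.15, 0.05, 40),   # slight
    rng.normal(0.45, 0.08, 40),   # moderate
    rng.normal(0.90, 0.10, 30),   # severe
    rng.normal(1.50, 0.15, 12),   # very severe
]).reshape(-1, 1)

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(thickness)

# Order clusters by their centre so labels map to thickness scores 1..4.
order = np.argsort(km.cluster_centers_.ravel())
score = {cluster: rank + 1 for rank, cluster in enumerate(order)}
pasi_thickness = np.array([score[c] for c in km.labels_])
print("lesions per thickness score:", np.bincount(pasi_thickness)[1:])
```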
885 Gene Expression Signature for Classification of Metastasis Positive and Negative Oral Cancer in Homosapiens
Authors: A. Shukla, A. Tarsauliya, R. Tiwari, S. Sharma
Abstract:
The classification of cancers into their corresponding cohorts has been a key area of research in bioinformatics, aiming at better prognosis of the disease. The high dimensionality of gene data makes this a complex task and requires a significant-data identification technique in order to reduce the dimensionality and identify significant information. In this paper, we propose a novel approach for the classification of oral cancer into metastasis-positive and metastasis-negative patients. We use significance analysis of microarrays (SAM) to identify the significant genes which constitute a gene signature. Three different gene signatures were identified using SAM from three different combinations of training datasets, and their classification accuracy was calculated on the corresponding testing datasets using k-Nearest Neighbour (kNN), Fuzzy C-Means Clustering (FCM), Support Vector Machine (SVM) and Backpropagation Neural Network (BPNN) classifiers. A final gene signature of only 9 genes was obtained from the above 3 individual gene signatures. The 9-gene signature's classification capability was compared using the same classifiers on the same testing datasets. Results obtained from experimentation show that the 9-gene signature classified all samples in the testing dataset accurately, while the individual gene signatures could not classify all of them accurately.
Keywords: Cancer, Gene Signature, SAM, Classification.
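As an illustration of the final classification step only (the expression matrix, labels and the 9-gene index set below are synthetic assumptions; SAM itself is not shown), a kNN classifier restricted to a gene signature looks like this:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.standard_normal((60, 2000))       # 60 samples x 2000 genes (synthetic)
y = rng.integers(0, 2, 60)                # 1 = metastasis positive, 0 = negative
X[y == 1, :9] += 1.5                      # make the first 9 genes informative

signature = np.arange(9)                  # indices of the selected gene signature
X_sig = X[:, signature]                   # keep only the signature genes
X_tr, X_te, y_tr, y_te = train_test_split(X_sig, y, test_size=0.3, random_state=0)

knn = KNeighborsClassifier(n_neighbors=3).fit(X_tr, y_tr)
print("test accuracy on signature genes:", knn.score(X_te, y_te))
```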
884 Fast 2.5D Model Reconstruction of Assembled Parts with High Occlusion for Completeness Inspection
Authors: Matteo Munaro, Stefano Michieletto, Edmond So, Daniele Alberton, Emanuele Menegatti
Abstract:
In this work a dual laser triangulation system is presented for fast building of 2.5D textured models of objects within a production line. This scanner is designed to produce data suitable for 3D completeness inspection algorithms. For this purpose two laser projectors have been used in order to considerably reduce the problem of occlusions in the camera movement direction. Results of reconstruction of electronic boards are presented, together with a comparison with a commercial system.
Keywords: 3D quality inspection, 2.5D reconstruction, laser triangulation, occlusions.
883 Face Recognition: A Literature Review
Authors: A. S. Tolba, A.H. El-Baz, A.A. El-Harby
Abstract:
The task of face recognition has been actively researched in recent years. This paper provides an up-to-date review of major human face recognition research. We first present an overview of face recognition and its applications. Then, a literature review of the most recent face recognition techniques is presented. Descriptions and limitations of the face databases which are used to test the performance of these face recognition algorithms are given. A brief summary of the Face Recognition Vendor Test (FRVT) 2002, a large-scale evaluation of automatic face recognition technology, and its conclusions are also given. Finally, we give a summary of the research results.
Keywords: Combined classifiers, face recognition, graph matching, neural networks.
882 Combining Bagging and Boosting
Authors: S. B. Kotsiantis, P. E. Pintelas
Abstract:
Bagging and boosting are among the most popular resampling ensemble methods that generate and combine a diversity of classifiers using the same learning algorithm for the base-classifiers. Boosting algorithms are considered stronger than bagging on noise-free data. However, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, in this work we built an ensemble using a voting methodology of bagging and boosting ensembles with 10 sub-classifiers in each one. We performed a comparison with simple bagging and boosting ensembles with 25 sub-classifiers, as well as other well-known combining methods, on standard benchmark datasets, and the proposed technique was the most accurate.
Keywords: data mining, machine learning, pattern recognition.
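A rough sketch of the voting combination described above, using scikit-learn (the dataset, base learner and soft-voting choice are assumptions for illustration, not the authors' exact configuration):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# One bagging ensemble and one boosting ensemble, 10 sub-classifiers each,
# combined by averaging their class probabilities (soft voting).
combined = VotingClassifier(
    estimators=[
        ("bagging", BaggingClassifier(DecisionTreeClassifier(), n_estimators=10, random_state=0)),
        ("boosting", AdaBoostClassifier(n_estimators=10, random_state=0)),
    ],
    voting="soft",
)
print("cross-validated accuracy:", cross_val_score(combined, X, y, cv=5).mean())
```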