Search results for: Haar-like Feature
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 940

310 Adaptive Bidirectional Flow for Image Interpolation and Enhancement

Authors: Shujun Fu, Qiuqi Ruan, Wenqia Wang

Abstract:

Image interpolation is a common problem in imaging applications. However, most existing interpolation algorithms suffer to some extent from visibly blurred edges and jagged artifacts. This paper presents an adaptive, feature-preserving bidirectional flow process, in which an inverse diffusion is performed to sharpen edges along the directions normal to the isophote lines (edges), while a normal diffusion is applied along the tangent directions to remove artifacts ("jaggies"). In order to preserve image features such as edges, corners and textures, the nonlinear diffusion coefficients are locally adjusted according to the directional derivatives of the image. Experimental results on synthetic and natural images demonstrate that our interpolation algorithm substantially improves the subjective quality of the interpolated images over conventional interpolation methods.

Keywords: anisotropic diffusion, bidirectional flow, directional derivatives, edge enhancement, image interpolation, inverse flow, shock filter.
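
As a rough illustration of the idea described above, the following minimal NumPy sketch applies one iteration of inverse (shock-like) diffusion along the isophote normal and forward diffusion along the tangent; the step size, coefficients and the simple sign-based shock term are illustrative assumptions, not the authors' exact scheme.

```python
# One illustrative bidirectional-flow iteration on a grayscale image (assumption:
# not the paper's exact scheme, only the sharpen-normal / smooth-tangent idea).
import numpy as np

def bidirectional_flow_step(u, dt=0.1, sharpen=1.0, smooth=1.0):
    # First and second derivatives (central differences).
    uy, ux = np.gradient(u)
    uyy, uyx = np.gradient(uy)
    uxy, uxx = np.gradient(ux)
    grad2 = ux**2 + uy**2 + 1e-12

    # Second derivatives along the isophote normal (eta) and tangent (xi).
    u_eta = (ux**2 * uxx + 2 * ux * uy * uxy + uy**2 * uyy) / grad2
    u_xi  = (uy**2 * uxx - 2 * ux * uy * uxy + ux**2 * uyy) / grad2

    # Inverse (shock-like) diffusion along the normal sharpens edges;
    # forward diffusion along the tangent removes jagged artifacts.
    return u + dt * (-sharpen * np.sign(u_eta) * np.sqrt(grad2) + smooth * u_xi)
```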

309 Accurate Fault Classification and Section Identification Scheme in TCSC Compensated Transmission Line using SVM

Authors: Pushkar Tripathi, Abhishek Sharma, G. N. Pillai, Indira Gupta

Abstract:

This paper presents a new approach for the protection of a Thyristor-Controlled Series Compensator (TCSC) compensated line using Support Vector Machines (SVMs). One SVM is trained for fault classification and another for section identification. The method uses three-phase current measurements, which results in better speed and accuracy than other SVM-based methods that rely on single-phase current measurement, making it suitable for real-time protection. The method was tested on 10,000 data instances with very wide variation in system conditions such as compensation level, source impedance, fault location, fault inception angle, load angle at the source bus and fault resistance. The proposed method requires only local current measurements.

Keywords: Fault Classification, Section Identification, Feature Selection, Support Vector Machine (SVM), Thyristor-Controlled Series Compensator (TCSC)
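
A minimal scikit-learn sketch of the two-SVM setup described above, assuming feature vectors have already been extracted from the three-phase currents; the synthetic arrays, labels and SVM hyper-parameters are placeholders, not the authors' data or settings.

```python
# Two SVMs trained on the same three-phase current features: one for fault type,
# one for the faulted section (placeholder data stands in for the real instances).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(10000, 6))          # placeholder features from three-phase currents
y_fault = rng.integers(0, 10, 10000)     # placeholder fault-type labels
y_section = rng.integers(0, 2, 10000)    # placeholder section labels

X_tr, X_te, yf_tr, yf_te, ys_tr, ys_te = train_test_split(
    X, y_fault, y_section, test_size=0.2, random_state=0)

fault_clf = SVC(kernel="rbf", C=10, gamma="scale").fit(X_tr, yf_tr)     # fault classification
section_clf = SVC(kernel="rbf", C=10, gamma="scale").fit(X_tr, ys_tr)   # section identification

print("fault accuracy:  ", fault_clf.score(X_te, yf_te))
print("section accuracy:", section_clf.score(X_te, ys_te))
```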

308 A New Approach to Workforce Planning

Authors: M. Othman, N. Bhuiyan, G. J. Gouw

Abstract:

In today's global and competitive market, manufacturing companies are working hard towards improving their production system performance. Most companies develop production systems that can help in cost reduction. Manufacturing systems consist of different elements including production methods, machines, processes, control and information systems. Human issues are an important part of manufacturing systems, yet most companies do not pay sufficient attention to them. In this paper, a workforce planning (WP) model is presented. A non-linear programming model is developed in order to minimize the hiring, firing, training and overtime costs. The purpose is to determine the number of workers for each worker type, the number of workers trained, and the number of overtime hours. Moreover, a decision support system (DSS) based on the proposed model is introduced using the Excel-Lingo software interfacing feature. This model will help to improve the interaction between the workers, managers and the technical systems in manufacturing.

Keywords: Decision Support System, Human Factors, Manufacturing System, Workforce Planning.

307 Machine Learning Framework: Competitive Intelligence and Key Drivers Identification of Market Share Trends among Healthcare Facilities

Authors: A. Appe, B. Poluparthi, L. Kasivajjula, U. Mv, S. Bagadi, P. Modi, A. Singh, H. Gunupudi, S. Troiano, J. Paul, J. Stovall, J. Yamamoto

Abstract:

The necessity of data-driven decisions in healthcare strategy formulation is rapidly increasing. A reliable framework which helps identify the factors impacting the market share of a healthcare provider facility or hospital (from here on termed a facility) is therefore of key importance. This pilot study aims at developing a data-driven, machine-learning regression framework which aids strategists in formulating key decisions to improve a facility's market share, which in turn improves the quality of healthcare services. The US (United States) healthcare business is chosen for the study, with data spanning 60 key facilities in Washington State and about three years of history. In the current analysis, market share is defined as the ratio of a facility's encounters to the total encounters among the group of potential competitor facilities. The study proposes a two-pronged approach: competitor identification, followed by a regression approach to evaluate and predict market share. The model-agnostic technique SHAP (SHapley Additive exPlanations) is leveraged to quantify the relative importance of the features impacting market share. Typical techniques in the literature quantify the degree of competitiveness among facilities by empirically calculating a competitive factor to interpret the severity of competition. The proposed method instead identifies a pool of competitors, develops Directed Acyclic Graphs (DAGs) and feature-level word vectors, and evaluates the key connected components at the facility level. This technique is robust since it is data-driven, which minimizes the bias of empirical techniques. The DAGs factor in partial correlations at various segregations and key demographics of facilities, along with a placeholder for various business rules (e.g., quantifying patient exchanges, provider references, and sister facilities). Multiple groups of competitors among facilities are identified. Leveraging the identified competitors, a Random Forest regression model is developed and fine-tuned to predict market share. To identify the key drivers of market share at an overall level, the permutation feature importance of the attributes is calculated. For relative quantification of features at the facility level, SHAP, a model-agnostic explainer, is incorporated; this helps to identify and rank the attributes impacting market share at each facility. The approach amalgamates two popular and efficient modeling practices, viz., machine learning with graphs and tree-based regression, to reduce bias and thereby help drive strategic business decisions.

Keywords: Competition, DAGs, hospital, healthcare, machine learning, market share, random forest, SHAP.
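
A minimal sketch of the regression-and-explanation step described above, assuming a per-facility feature table has already been assembled; the feature names, toy data and hyper-parameters are illustrative, and the competitor-identification DAG stage is not modelled here.

```python
# Random Forest regression of market share, with permutation importance for overall
# drivers and SHAP values for facility-level drivers (illustrative data only).
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
import shap

rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(180, 4)),
                 columns=["beds", "avg_wait_time", "physician_count", "population_density"])
# market share = facility encounters / total encounters among its competitor pool
y = rng.uniform(0.01, 0.4, size=180)

model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)

# Overall drivers: permutation feature importance.
overall = permutation_importance(model, X, y, n_repeats=10, random_state=0)
print(dict(zip(X.columns, overall.importances_mean.round(3))))

# Facility-level drivers: SHAP values rank features for each individual facility.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)
print(shap_values[0])     # contribution of each feature for the first facility
```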

306 A Weighted Approach to Unconstrained Iris Recognition

Authors: Yao-Hong Tsai

Abstract:

This paper presents a weighted approach to unconstrained iris recognition. Nowadays, commercial systems are usually characterized by strong acquisition constraints that rely on the subject's cooperation, which is not always achievable in real scenarios of daily life. Researchers have therefore focused on reducing these constraints while maintaining system performance through new techniques. To cope with large variation in the environment, two main improvements are developed for the proposed iris recognition system. First, to handle extremely uneven lighting conditions, statistics-based illumination normalization is applied to the eye region to increase the accuracy of the iris features; detection of the iris image is based on the AdaBoost algorithm. Second, a weighting scheme is designed using Gaussian functions of the distance to the center of the iris, and a local binary pattern (LBP) histogram is then applied to texture classification with these weights. Experiments showed that the proposed system provides users a more flexible and feasible way to interact with the verification system through iris recognition.

Keywords: Authentication, iris recognition, Adaboost, local binary pattern.
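
A minimal sketch of the Gaussian weighting applied to an LBP histogram, assuming a cropped, illumination-normalized iris region and an estimated iris center; the sigma value and LBP settings are illustrative assumptions, not the authors' parameters.

```python
# Gaussian-weighted LBP histogram: pixels near the iris center contribute more.
import numpy as np
from skimage.feature import local_binary_pattern

def weighted_lbp_histogram(iris_img, center, sigma=30.0, P=8, R=1):
    lbp = local_binary_pattern(iris_img, P, R, method="uniform")
    n_bins = P + 2                                   # number of uniform-LBP codes

    # Gaussian weight from the distance of each pixel to the iris center.
    yy, xx = np.indices(iris_img.shape)
    dist2 = (yy - center[0]) ** 2 + (xx - center[1]) ** 2
    w = np.exp(-dist2 / (2.0 * sigma ** 2))

    hist = np.bincount(lbp.astype(int).ravel(), weights=w.ravel(), minlength=n_bins)
    return hist / hist.sum()                         # normalized weighted histogram
```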

305 Automatic Classification of the Stand-to-Sit Phase in the TUG Test Using Machine Learning

Authors: Y. A. Adla, R. Soubra, M. Kasab, M. O. Diab, A. Chkeir

Abstract:

Over the past several years, researchers have shown great interest in assessing the mobility of elderly people to measure their functional status. Usually, such an assessment is done by conducting tests that require the subject to walk a certain distance, turn around, and finally sit back down. This study therefore aims to provide an at-home monitoring system that assesses the patient’s status continuously. We propose a technique to automatically detect when a subject sits down while walking at home. A Doppler radar system was used to capture the motion of the subjects. More than 20 features were extracted from the radar signals, of which 11 were chosen based on their Intraclass Correlation Coefficient (ICC > 0.75). The sequential floating forward selection wrapper was then applied to further narrow down the final feature vector. Finally, five features were introduced to the Linear Discriminant Analysis classifier, and an accuracy of 93.75% was achieved, as well as a precision and recall of 95% and 90%, respectively.

Keywords: Doppler radar system, stand-to-sit phase, TUG test, machine learning, classification
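
A minimal sketch of the selection-and-classification stage, assuming an ICC-filtered feature matrix is already available; scikit-learn's sequential selector implements plain forward selection only, so the floating variant used in the paper is approximated here, and the toy data are placeholders.

```python
# Forward feature selection wrapped around an LDA classifier (approximation of the
# paper's sequential floating forward selection; placeholder radar-feature data).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 11))            # 11 ICC-filtered radar features (placeholder)
y = rng.integers(0, 2, 200)               # 1 = stand-to-sit phase, 0 = other (placeholder)

lda = LinearDiscriminantAnalysis()
sfs = SequentialFeatureSelector(lda, n_features_to_select=5, direction="forward")
X_sel = sfs.fit_transform(X, y)

print("selected feature indices:", np.where(sfs.get_support())[0])
print("cross-validated accuracy:", cross_val_score(lda, X_sel, y, cv=5).mean())
```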

304 Characterization of Three Photodetector Types for Computed Tomography Dosimetry

Authors: C. M. M. Paschoal, D. do N. Souza, L. A. P. Santos

Abstract:

In this study, three commercial semiconductor devices were characterized in the laboratory for computed tomography dosimetry: one photodiode and two phototransistors. Four responses to irradiation were evaluated: dose linearity, energy dependence, angular dependence and loss of sensitivity after X-ray exposure. The results showed that the three devices have a response proportional to the air kerma; the energy dependence displayed by each device suggests that calibration factors should be applied to each one; and the angular dependence showed a similar pattern among the three electronic components. Regarding the fourth parameter analyzed, one phototransistor has the highest sensitivity; however, it also showed the greatest loss of sensitivity with accumulated dose. The photodiode was the device with the smallest sensitivity to radiation; on the other hand, its loss of sensitivity after irradiation is negligible. Since high accuracy is a desired feature for a dosimeter, the photodiode may be the most suitable of the three devices for dosimetry in tomography. The phototransistors can also be used for CT dosimetry; however, a correction factor would be necessary due to the loss of sensitivity with accumulated dose.

Keywords: Dosimetry, computed tomography, phototransistor, photodiode

303 Frame Texture Classification Method (FTCM) Applied on Mammograms for Detection of Abnormalities

Authors: Kjersti Engan, Karl Skretting, Jostein Herredsvela, Thor Ole Gulsrud

Abstract:

Texture classification is an important image processing task with a broad application range. Many different techniques for texture classification have been explored. Using sparse approximation as a feature extraction method for texture classification is a relatively new approach, and Skretting et al. recently presented the Frame Texture Classification Method (FTCM), showing very good results on classical texture images. As an extension of that work, the FTCM is here tested on a real-world application: detection of abnormalities in mammograms. Some extensions to the original FTCM that are useful in certain applications are implemented: two different smoothing techniques and a vector augmentation technique. Both the detection of microcalcifications (as a primary detection technique and as the last stage of a detection scheme) and of soft-tissue lesions in mammograms are explored. All the results are interesting, and especially the results using FTCM on regions of interest as the last stage in a detection scheme for microcalcifications are promising.

Keywords: detection, mammogram, texture classification, dictionary learning, FTCM

302 A New Approach of Fuzzy Methods for Evaluating of Hydrological Data

Authors: Nasser Shamskia, Seyyed Habib Rahmati, Hassan Haleh, Seyyedeh Hoda Rahmati

Abstract:

The main design criteria of most hydraulic constructions are essentially based on the runoff or discharge of water. Two of those important criteria are runoff and return period. Mostly, these measures are calculated or estimated from stochastic data. Another feature of hydrological data is their impreciseness. Therefore, in order to deal with uncertainty and impreciseness, a new fuzzy method of evaluating hydrological measures is developed based on Buckley's estimation method. The method introduces triangular fuzzy numbers for the different measures, in which both the uncertainty and impreciseness concepts are considered. Besides, since another important consideration in most hydrological studies is the comparison of a measure across different months or years, a new fuzzy method, consistent with the special form of the proposed fuzzy numbers, is also developed. Finally, to illustrate the methods more explicitly, the two algorithms are tested on a simple example and a real case study.

Keywords: Fuzzy Discharge, Fuzzy estimation, Fuzzy ranking method, Hydrological data

301 Research on Urban Point of Interest Generalization Method Based on Mapping Presentation

Authors: Chengming Li, Yong Yin, Peipei Guo, Xiaoli Liu

Abstract:

Existing point generalization algorithms focus merely on the overall information of point groups, without taking account of the attribute richness of POI (point of interest) data or of the spatial distribution constrained by roads. Against this background, a POI generalization method considering both attribute information and spatial distribution is proposed. The hierarchical characteristics of urban POI information expression are first analyzed to identify the measurement features of each hierarchy level. On this basis, an urban POI generalization strategy is put forward: POIs constrained by the urban road network are divided into three distribution patterns, and corresponding generalization methods are proposed according to the characteristics of the POI data in each distribution pattern. Experimental results showed that the method taking into account both attribute information and spatial distribution characteristics of POIs can better implement urban POI generalization in mapping presentation.

Keywords: POI, Road network, spatial information expression, selection method, distribution pattern.

300 An Improved Face Recognition Algorithm Using Histogram-Based Features in Spatial and Frequency Domains

Authors: Qiu Chen, Koji Kotani, Feifei Lee, Tadahiro Ohmi

Abstract:

In this paper, we propose an improved face recognition algorithm using histogram-based features in the spatial and frequency domains. To add spatial information of the face and improve recognition performance, a region-division (RD) method is utilized. The facial area is first divided into several regions; feature vectors of each facial part are then generated by a Binary Vector Quantization (BVQ) histogram using DCT coefficients in the low-frequency domain, as well as by a Local Binary Pattern (LBP) histogram in the spatial domain. Recognition results for the different regions are first obtained separately and then fused by weighted averaging. The publicly available ORL database, which consists of 40 subjects with 10 images per subject containing variations in lighting, pose, and expression, is used for the evaluation of the proposed algorithm. It is demonstrated that face recognition using the RD method achieves a much higher recognition rate.

Keywords: Face recognition, Binary vector quantization (BVQ), Local Binary Patterns (LBP), DCT coefficients.
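
A minimal sketch of the region-division idea with LBP histograms and weighted score fusion; the region grid, weights and chi-square matching are illustrative simplifications (the paper additionally uses DCT-based BVQ histograms in the frequency domain).

```python
# Per-region LBP histograms of a face image, fused by weighted averaging of
# per-region distances (illustrative simplification of the RD method).
import numpy as np
from skimage.feature import local_binary_pattern

def region_histograms(face, grid=(4, 4), P=8, R=1):
    lbp = local_binary_pattern(face, P, R, method="uniform")
    h, w = face.shape
    hists = []
    for i in range(grid[0]):
        for j in range(grid[1]):
            block = lbp[i*h//grid[0]:(i+1)*h//grid[0], j*w//grid[1]:(j+1)*w//grid[1]]
            hist = np.bincount(block.astype(int).ravel(), minlength=P + 2)
            hists.append(hist / max(hist.sum(), 1))
    return hists                                 # one histogram per region

def fused_distance(hists_a, hists_b, weights):
    # Per-region chi-square distances fused by weighted averaging.
    d = [np.sum((a - b) ** 2 / (a + b + 1e-12)) for a, b in zip(hists_a, hists_b)]
    return float(np.average(d, weights=weights))
```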

299 2D and 3D Unsteady Simulation of the Heat Transfer in the Sample during Heat Treatment by Moving Heat Source

Authors: Z. Veselý, M. Honner, J. Mach

Abstract:

The aim of this work is to establish 2D and 3D models of the direct unsteady task of sample heat treatment by a moving source, employing a computer model based on the finite element method. A complex boundary condition on the heat-loaded sample surface is the essential feature of the task. The computer model describes the heat treatment of the sample during the movement of the heat source over the sample surface. The work starts from the 2D task of the sample cross-section as a basic model, and possibilities of extending it from a 2D to a 3D task are discussed. The effect of adding the third model dimension on the temperature distribution in the sample is shown, and the influence of various model parameters on the sample temperatures is compared. The influence of heat source motion on the depth of material heat treatment is shown for several velocities of the movement. The presented computer model is prepared for utilization in the laser treatment of machine parts.

Keywords: Computer simulation, unsteady model, heat treatment, complex boundary condition, moving heat source.

298 Data Quality Enhancement with String Length Distribution

Authors: Qi Xiu, Hiromu Hota, Yohsuke Ishii, Takuya Oda

Abstract:

Recently, the amount of collectable manufacturing data has been rapidly increasing. At the same time, mega recalls are becoming a serious social problem. Under such circumstances, there is an increasing need to prevent mega recalls through defect analysis, such as root cause analysis and abnormality detection, utilizing manufacturing data. However, the time needed to classify the strings in manufacturing data by traditional methods is too long to meet the requirements of quick defect analysis. Therefore, we present the String Length Distribution Classification method (SLDC) to correctly classify strings in a short time. This method learns character features, especially the string length distribution, from Product IDs and Machine IDs in the BOM and asset list. By applying the proposal to strings in actual manufacturing data, we verified that the classification time can be reduced by 80%. As a result, it can be estimated that the requirement of quick defect analysis can be fulfilled.

Keywords: Data quality, feature selection, probability distribution, string classification, string length.
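
A minimal sketch of classifying strings by their length distribution, assuming labelled example strings (e.g. known Product IDs and Machine IDs) are available; the Laplace smoothing and the toy identifiers are illustrative, not the SLDC method's exact model.

```python
# Learn a length distribution per string class and classify new strings by the
# class under which their length is most probable (toy data, illustrative only).
from collections import Counter

def learn_length_distribution(strings, max_len=64, alpha=1.0):
    counts = Counter(min(len(s), max_len) for s in strings)
    total = len(strings) + alpha * (max_len + 1)
    return [(counts.get(l, 0) + alpha) / total for l in range(max_len + 1)]

def classify(string, class_dists):
    l = min(len(string), len(next(iter(class_dists.values()))) - 1)
    return max(class_dists, key=lambda c: class_dists[c][l])   # most likely class

class_dists = {
    "product_id": learn_length_distribution(["P-000123", "P-004567", "P-987654"]),
    "machine_id": learn_length_distribution(["M01", "M17", "M09"]),
}
print(classify("P-112233", class_dists))   # -> product_id
```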

297 Signature Recognition and Verification using Hybrid Features and Clustered Artificial Neural Networks (ANNs)

Authors: Manasjyoti Bhuyan, Kandarpa Kumar Sarma, Hirendra Das

Abstract:

A signature represents an individual characteristic of a person which can be used for his or her validation. For such an application, proper modeling is essential. Here we propose an offline signature recognition and verification scheme based on the extraction of several features, including one hybrid set, from the input signature, which are compared with the already trained forms. Feature points are classified using statistical parameters like mean and variance. The scanned signature is normalized in slant using a very simple algorithm, with the intention of making the system robust, which is found to be very helpful. The slant correction is further aided by the use of an Artificial Neural Network (ANN). The suggested scheme discriminates between original and forged signatures for simple and random forgeries. The primary objective is to reduce the two crucial parameters, False Acceptance Rate (FAR) and False Rejection Rate (FRR), with less training time, with the intention of making the system dynamic using a cluster of ANNs forming a multiple classifier system.

Keywords: offline, algorithm, FAR, FRR, ANN.

296 Aliveness Detection of Fingerprints using Multiple Static Features

Authors: Heeseung Choi, Raechoong Kang, Kyungtaek Choi, Jaihie Kim

Abstract:

Fake finger submission attacks are a major problem in fingerprint recognition systems. In this paper, we introduce an aliveness detection method based on multiple static features derived from a single fingerprint image. The static features comprise individual pore spacing, residual noise and several first-order statistics. Specifically, a correlation filter is adopted to address individual pore spacing. The multiple static features are useful for reflecting the physiological and statistical characteristics of live and fake fingerprints. Classification is performed by calculating liveness scores from each feature and fusing the scores through a classifier. On our dataset, we compared nine classifiers, and the best classification rate of 85% is attained using a Reduced Multivariate Polynomial classifier. Our approach is faster and more convenient for aliveness checking in field applications.

Keywords: Aliveness detection, Fingerprint recognition, individual pore spacing, multiple static features, residual noise.

295 Key Competences in Economics and Business Field: The Employers’ Side of the Story

Authors: Bruno Škrinjarić

Abstract:

Rapid technological developments and the increase in organizations’ interdependence on an international scale are changing the traditional workplace paradigm. A key feature of the knowledge-based economy is that employers are looking for individuals who possess both specific academic skills and knowledge, and the capability to be proactive and to respond to problems creatively and autonomously. The focus of this paper is workers with an Economics and Business background, and its goals are threefold: (1) to explore a wide range of competences and identify which are most important to employers; (2) to investigate the existence and magnitude of the gap between the required and possessed level of a certain competency; and (3) to inquire how this gap is connected with the performance of a company. A study was conducted on a representative sample of Croatian enterprises during the spring of 2016. Results show that generic, rather than specific, competences are more important to employers, and that the gap between the relative importance of a certain competence and its current representation in the existing workforce is greater for generic competences than for specific ones. Finally, the results do not support the hypothesis that this gap is correlated with firms’ performance.

Keywords: Competency gap, competency matching, key competences, firm performance.

294 Learning Spatio-Temporal Topology of a Multi-Camera Network by Tracking Multiple People

Authors: Yunyoung Nam, Junghun Ryu, Yoo-Joo Choi, We-Duke Cho

Abstract:

This paper presents a novel approach for representing the spatio-temporal topology of a camera network with overlapping and non-overlapping fields of view (FOVs). The topology is determined by tracking moving objects and establishing object correspondence across multiple cameras. To track people successfully in multiple camera views, we used the Merge-Split (MS) approach for object occlusion in a single camera and a grid-based approach for extracting accurate object features. In addition, we considered the appearance of people and the transition time between entry and exit zones for tracking objects across the blind regions of multiple cameras with non-overlapping FOVs. The main contribution of this paper is to estimate transition times between various entry and exit zones, and to graphically represent the camera topology as an undirected weighted graph using the transition probabilities.

Keywords: Surveillance, multiple camera, people tracking, topology.

293 Infrared Camera-Based Hand Gesture Space Touch System Implementation of Smart Device Environment

Authors: Yang-Keun Ahn, Kwang-Soon Choi, Young-Choong Park, Kwang-Mo Jung

Abstract:

This paper proposes a method to recognize the tip of a finger and space-touch hand gestures using an infrared camera in a smart device environment. The proposed method estimates the tip of a finger with a curvature-based ellipse fitting algorithm, and verifies that the estimated object is indeed a finger using the rectangular area of the fitted ellipse. The feature extracted from the verified fingertip is used to implement mouse movement and a clicking gesture. The proposed algorithm was implemented on an actual smart device to test the method. Empirical parameters were obtained from the keypad software and an image analysis tool for performance optimization, and a comparative analysis with previous research showed improved performance with the proposed method.

Keywords: Infrared camera, Hand gesture, Smart device, Space touch.

292 Suitability of Requirements Abstraction Model (RAM) Requirements for High-Level System Testing

Authors: Naeem Muhammad, Yves Vandewoude, Yolande Berbers, Robert Feldt

Abstract:

The Requirements Abstraction Model (RAM) helps in managing abstraction in requirements by organizing them at four levels (product, feature, function and component). The RAM is adaptable and can be tailored to meet the needs of various organizations. Because software requirements are an important source of information for developing high-level tests, organizations willing to adopt the RAM need to know how suitable RAM requirements are for developing high-level tests. To investigate this suitability, test cases from twenty randomly selected requirements were developed, analyzed and graded. The requirements were selected from the requirements document of a Course Management System, a web-based software system that supports teachers and students in performing course-related tasks. This paper describes the results of the requirements document analysis. The results show that requirements at the lower levels of the RAM are suitable for developing executable tests, whereas it is hard to develop tests from requirements at the higher levels.

Keywords: Market-driven requirements engineering, requirements abstraction model, requirements abstraction, system testing.

291 A Hybrid Particle Swarm Optimization Solution to Ramping Rate Constrained Dynamic Economic Dispatch

Authors: Pichet Sriyanyong

Abstract:

This paper presents the application of an Enhanced Particle Swarm Optimization (EPSO) combined with Gaussian Mutation (GM) for solving the Dynamic Economic Dispatch (DED) problem considering the operating constraints of generators. The EPSO consists of the standard PSO and a modified heuristic search approach: the ability of the traditional PSO is enhanced by applying the modified heuristic search to prevent solutions from violating the constraints. In addition, Gaussian Mutation is aimed at increasing the diversity of the global search, while also preventing the search from being trapped in suboptimal points. To illustrate its efficiency and effectiveness, the developed EPSO-GM approach is tested on 3-unit and 10-unit 24-hour systems considering the valve-point effect. From the experimental results, it can be concluded that the proposed EPSO-GM provides accurate solutions, efficiency, and robust computation compared with the other algorithms under consideration.

Keywords: Particle Swarm Optimization (PSO), Gaussian Mutation (GM), Dynamic Economic Dispatch (DED).
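
A minimal NumPy sketch of PSO with Gaussian mutation on a generic bounded objective; the toy cost function, constraint handling by clipping, and all hyper-parameters are illustrative stand-ins for the paper's EPSO-GM and the full DED model.

```python
# Basic PSO loop with an added Gaussian-mutation step for diversity; the quadratic
# "fuel cost" and box limits below are a toy example, not the paper's test systems.
import numpy as np

def pso_gm(cost, lo, hi, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, p_mut=0.1):
    rng = np.random.default_rng(0)
    dim = len(lo)
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()

    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                     # simple constraint repair

        # Gaussian mutation adds diversity and helps escape suboptimal points.
        mask = rng.random(x.shape) < p_mut
        x = np.clip(x + mask * rng.normal(0, 0.1 * (hi - lo), x.shape), lo, hi)

        f = np.array([cost(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# toy usage: minimise a quadratic cost for 3 units subject to box limits
lo, hi = np.array([100.0, 100.0, 50.0]), np.array([600.0, 400.0, 200.0])
print(pso_gm(lambda p: np.sum(0.002 * p**2 + 10 * p), lo, hi))
```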

290 A Novel Architecture for Wavelet based Image Fusion

Authors: Susmitha Vekkot, Pancham Shukla

Abstract:

In this paper, we focus on the fusion of images from different sources using multiresolution wavelet transforms. Based on reviews of popular image fusion techniques used in data analysis, different pixel- and energy-based methods are experimented with. A novel architecture with a hybrid algorithm is proposed which applies a pixel-based maximum selection rule to the low-frequency approximations and filter-mask based fusion to the high-frequency details of the wavelet decomposition. The key feature of the hybrid architecture is the combination of the advantages of pixel- and region-based fusion in a single image, which can help the development of sophisticated algorithms enhancing the edges and structural details. A Graphical User Interface is developed for the image fusion to make the research outcomes available to the end user. To utilize the GUI capabilities for medical, industrial and commercial activities without a MATLAB installation, a standalone executable application is also developed using the MATLAB Compiler Runtime.

Keywords: Filter mask, GUI, hybrid architecture, image fusion, Matlab Compiler Runtime, wavelet transform.
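
A minimal PyWavelets sketch of the hybrid fusion idea: maximum selection on the low-frequency approximation and a local-energy mask on the high-frequency details. The wavelet, decomposition level, window size and the specific mask rule are illustrative assumptions, not the authors' exact filter mask.

```python
# Wavelet-domain fusion of two registered images: max selection on approximations,
# local-energy-based selection on detail bands (illustrative rules only).
import numpy as np
import pywt
from scipy.ndimage import uniform_filter

def fuse(img_a, img_b, wavelet="db2", level=2, win=3):
    ca = pywt.wavedec2(img_a, wavelet, level=level)
    cb = pywt.wavedec2(img_b, wavelet, level=level)

    # Low-frequency approximations: pixel-wise maximum selection.
    fused = [np.maximum(ca[0], cb[0])]

    # High-frequency details: keep the coefficient with larger local energy.
    for da, db in zip(ca[1:], cb[1:]):
        bands = []
        for a, b in zip(da, db):
            mask = uniform_filter(a**2, win) >= uniform_filter(b**2, win)
            bands.append(np.where(mask, a, b))
        fused.append(tuple(bands))
    return pywt.waverec2(fused, wavelet)
```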

289 Low Latency Routing Algorithm for Unmanned Aerial Vehicles Ad-Hoc Networks

Authors: Abdel Ilah Alshabtat, Liang Dong

Abstract:

In this paper, we propose a new routing protocol for Unmanned Aerial Vehicles (UAVs) equipped with directional antennas, named the Directional Optimized Link State Routing Protocol (DOLSR). This protocol is based on the well-known Optimized Link State Routing Protocol (OLSR). Our protocol focuses on the multipoint relay (MPR) concept, which is the most important feature of OLSR. We developed a heuristic that allows the DOLSR protocol to minimize the number of multipoint relays. With this new protocol, the number of overhead packets is reduced and the end-to-end delay of the network is also minimized. We show through simulation that our protocol outperforms the Optimized Link State Routing Protocol, the Dynamic Source Routing (DSR) protocol and the Ad-Hoc On-demand Distance Vector (AODV) routing protocol in reducing end-to-end delay and enhancing the overall throughput. Our evaluation of these protocols was based on the OPNET network simulation tool.

Keywords: Mobile Ad-Hoc Networks, Ad-Hoc Routing Protocols, Optimized Link State Routing Protocol, Unmanned Aerial Vehicles, Directional Antenna.
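
A minimal sketch of greedy multipoint-relay (MPR) selection as used in OLSR-style protocols: pick one-hop neighbours until every two-hop neighbour is covered. The neighbour sets are illustrative, and the paper's heuristic further exploits directional antennas, which is not modelled here.

```python
# Greedy MPR selection: cover all 2-hop neighbours with as few 1-hop relays as possible.
def select_mprs(one_hop, two_hop_of):
    """one_hop: set of 1-hop neighbours; two_hop_of[n]: 2-hop nodes reachable via n."""
    uncovered = set().union(*two_hop_of.values()) - one_hop
    mprs = set()
    while uncovered:
        # Pick the neighbour covering the most still-uncovered 2-hop nodes.
        best = max(one_hop - mprs, key=lambda n: len(two_hop_of[n] & uncovered))
        if not two_hop_of[best] & uncovered:
            break
        mprs.add(best)
        uncovered -= two_hop_of[best]
    return mprs

neighbours = {"A", "B", "C"}
reach = {"A": {"D", "E"}, "B": {"E", "F"}, "C": {"F"}}
print(select_mprs(neighbours, reach))   # e.g. {'A', 'B'}
```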

288 A Hybrid Gene Selection Technique Using Improved Mutual Information and Fisher Score for Cancer Classification Using Microarrays

Authors: M. Anidha, K. Premalatha

Abstract:

Feature selection is significant in order to perform constructive classification in the area of cancer diagnosis. However, a large number of features compared with the number of samples makes the task of classification computationally very hard and prone to errors in microarray gene expression datasets. In this paper, we present an innovative method for selecting highly informative gene subsets of gene expression data that effectively classify the cancer data into tumorous and non-tumorous samples. The hybrid gene selection technique combines Mutual Information and the Fisher score to select informative genes. The gene selection is validated by classification using a Support Vector Machine (SVM), a supervised learning algorithm capable of solving complex classification problems. The improved Mutual Information and Fisher score with SVM as the classifier produced efficient results.

Keywords: Gene selection, mutual information, Fisher score, classification, SVM.
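
A minimal sketch of the hybrid ranking idea: score genes by mutual information and by Fisher score, combine the two rankings, and validate the selected subset with an SVM. The combination rule (rank averaging), the number of selected genes and the toy data are illustrative assumptions, not the paper's exact procedure.

```python
# Hybrid gene ranking with mutual information + Fisher score, validated by an SVM.
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 500))          # 100 samples x 500 genes (placeholder)
y = rng.integers(0, 2, 100)              # tumorous / non-tumorous labels (placeholder)

def fisher_score(X, y):
    mu, num, var_w = X.mean(0), np.zeros(X.shape[1]), np.zeros(X.shape[1])
    for c in np.unique(y):
        Xc = X[y == c]
        num += len(Xc) * (Xc.mean(0) - mu) ** 2      # between-class scatter
        var_w += len(Xc) * Xc.var(0)                 # within-class scatter
    return num / (var_w + 1e-12)

mi = mutual_info_classif(X, y, random_state=0)
fs = fisher_score(X, y)

# Combine the two criteria by averaging their ranks and keep the top genes.
combined = np.argsort(np.argsort(-mi)) + np.argsort(np.argsort(-fs))
top = np.argsort(combined)[:50]

print("SVM accuracy on selected genes:",
      cross_val_score(SVC(kernel="linear"), X[:, top], y, cv=5).mean())
```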

287 Personalized Email Marketing Strategy: A Reinforcement Learning Approach

Authors: Lei Zhang, Tingting Xu, Jun He, Zhenyu Yan, Roger Brooks

Abstract:

Email marketing is one of the most important segments of online marketing, and email content is vital to customers. Different customers may have different familiarity with a product, so a successful marketing strategy must personalize email content based on the individual customer’s product affinity. In this study, we build a personalized email marketing strategy with three types of emails: nurture, promotion, and conversion. Each type of email has a different influence on customers; we investigate this difference by analyzing customers’ open rates, click rates and opt-out rates. Feature importance from the response models is also analyzed. The goal of the marketing strategy is to improve the click rate on conversion-type emails. To build the personalized strategy, we formulate the problem as a reinforcement learning problem and adopt a Q-learning algorithm with variations. The simulation results show that our model-based strategy outperforms the current marketer’s strategy.

Keywords: Email marketing, email content, reinforcement learning, machine learning, Q-learning.
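
A minimal tabular Q-learning sketch for choosing which email type to send, assuming a discretized customer-affinity state and a reward of 1 for a click on a conversion email; the states, reward model and simulated environment are illustrative, not the paper's formulation.

```python
# Tabular Q-learning over email types (nurture / promotion / conversion) against a
# toy simulated customer whose conversion click rate grows with affinity.
import numpy as np

actions = ["nurture", "promotion", "conversion"]
n_states, n_actions = 5, len(actions)            # affinity levels 0..4
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.9, 0.1
rng = np.random.default_rng(0)

def simulate(state, action):
    """Toy environment: conversion emails are clicked more as affinity grows."""
    click_prob = [0.05, 0.08, 0.04 + 0.08 * state][action]
    reward = float(rng.random() < click_prob and action == 2)   # click on conversion email
    next_state = min(n_states - 1, state + (1 if action < 2 else 0))
    return reward, next_state

state = 0
for _ in range(10000):
    a = rng.integers(n_actions) if rng.random() < eps else int(Q[state].argmax())
    r, s2 = simulate(state, a)
    Q[state, a] += alpha * (r + gamma * Q[s2].max() - Q[state, a])   # Q-learning update
    state = s2 if rng.random() > 0.05 else 0                          # occasional reset

print("best email type per affinity level:", [actions[int(i)] for i in Q.argmax(1)])
```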

286 An Auxiliary Technique for Coronary Heart Disease Prediction by Analyzing ECG Based on ResNet and Bi-LSTM

Authors: Yang Zhang, Jian He

Abstract:

Heart disease is one of the leading causes of death in the world, and coronary heart disease (CHD) is one of the major heart diseases. The electrocardiogram (ECG) is widely used in the detection of heart diseases, but the traditional manual method of CHD prediction by analyzing the ECG requires a great deal of professional knowledge from doctors. This paper presents a sliding window and the continuous wavelet transform (CWT) to transform ECG signals into images, after which ResNet and Bi-LSTM are introduced to build the ECG feature extraction network (named ECGNet). Finally, an auxiliary system for CHD prediction was developed based on the modified ResNet18 and Bi-LSTM, and the public ECG dataset of CHD from MIMIC-III was used to train and test the system. The experimental results show that the accuracy of the method is 83%, and the F1-score is 83%. Compared with the available methods for CHD prediction based on ECG, such as kNN, decision tree, VGGNet, etc., this method not only improves the prediction accuracy but also avoids the degradation phenomenon of deep learning networks.

Keywords: Bi-LSTM, CHD, coronary heart disease, ECG, electrocardiogram, ResNet, sliding window.

285 A Cohesive Lagrangian Swarm and Its Application to Multiple Unicycle-like Vehicles

Authors: Jito Vanualailai, Bibhya Sharma

Abstract:

Swarm principles are increasingly being used to design controllers for the coordination of multi-robot systems or, in general, multi-agent systems. This paper proposes a two-dimensional Lagrangian swarm model that enables planar agents, modeled as point masses, to swarm whilst effectively avoiding each other and obstacles in the environment. A novel method, based on an extended Lyapunov approach, is used to construct the model. Importantly, the Lyapunov method ensures a form of practical stability that guarantees an emergent behavior, namely a cohesive and well-spaced swarm with a constant arrangement of individuals about the swarm centroid. Computer simulations illustrate this basic feature of collective behavior. As an application, we show how multiple planar mobile unicycle-like robots swarm to eventually form patterns in which their velocities and orientations stabilize.

Keywords: Attractive-repulsive swarm model, individual-based swarm model, Lagrangian swarm model, Lyapunov stability, Lyapunov-like function, practical stability, unicycle.

284 Flexure of Cantilever Thick Beams Using Trigonometric Shear Deformation Theory

Authors: Yuwaraj M. Ghugal, Ajay G. Dahake

Abstract:

A trigonometric shear deformation theory for the flexure of thick beams, taking into account transverse shear deformation effects, is developed. The number of variables in the present theory is the same as that in the first-order shear deformation theory. A sinusoidal function of the thickness coordinate is used in the displacement field to represent the shear deformation effects. The noteworthy feature of this theory is that the transverse shear stresses can be obtained directly from the constitutive relations with excellent accuracy, satisfying the shear-stress-free conditions on the top and bottom surfaces of the beam. Hence, the theory obviates the need for a shear correction factor. Governing differential equations and boundary conditions are obtained using the principle of virtual work. Thick cantilever isotropic beams are considered for the numerical studies to demonstrate the efficiency of the theory. The results obtained are discussed critically against those of other theories.

Keywords: Trigonometric shear deformation, thick beam, flexure, principle of virtual work, equilibrium equations, stress.
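
For reference, a commonly used form of the sinusoidal displacement field in trigonometric shear deformation beam theories is given below, where h is the beam depth, w(x) the transverse deflection and φ(x) the shear rotation; the exact field and notation used in the paper may differ.

```latex
% Assumed displacement field of a trigonometric (sinusoidal) shear deformation
% beam theory (common convention, not necessarily the paper's exact notation).
u(x,z) = -z\,\frac{dw}{dx} + \frac{h}{\pi}\,\sin\!\left(\frac{\pi z}{h}\right)\phi(x),
\qquad
w(x,z) = w(x)
```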

283 Modern Spectrum Sensing Techniques for Cognitive Radio Networks: Practical Implementation and Performance Evaluation

Authors: Antoni Ivanov, Nikolay Dandanov, Nicole Christoff, Vladimir Poulkov

Abstract:

Spectrum underutilization has made cognitive radio a promising technology for both current and future telecommunications, owing to its ability to exploit unused spectrum in bands dedicated to other wireless communication systems and thus increase their occupancy. The essential function which allows a cognitive radio device to perceive the occupancy of the spectrum is spectrum sensing. In this paper, the performance of modern adaptations of the four most widely used spectrum sensing techniques, namely energy detection (ED), cyclostationary feature detection (CSFD), matched filtering (MF) and eigenvalue-based detection (EBD), is compared. The implementation has been accomplished with the PlutoSDR hardware platform and the GNU Radio software package under very low Signal-to-Noise Ratio (SNR) conditions. The optimal detection performance of the examined methods in a realistic, implementation-oriented model is found for the common relevant parameters (number of observed samples, sensing time and required probability of false alarm).

Keywords: Cognitive radio, dynamic spectrum access, GNU Radio, spectrum sensing.
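
A minimal sketch of the simplest of the compared techniques, an energy detector, assuming known noise variance and a target probability of false alarm; the Gaussian (central-limit) approximation of the test statistic and the toy signal are illustrative, not the paper's GNU Radio implementation.

```python
# Energy detection: compare received energy against a threshold set from the
# noise variance and a target false-alarm probability (Gaussian approximation).
import numpy as np
from scipy.stats import norm

def energy_detect(samples, noise_var, pfa=0.01):
    n = len(samples)
    energy = np.sum(np.abs(samples) ** 2)
    threshold = noise_var * (n + np.sqrt(2 * n) * norm.isf(pfa))
    return energy > threshold, energy, threshold

rng = np.random.default_rng(0)
noise = rng.normal(size=4096)                                  # unit-variance noise
signal = 0.5 * np.cos(2 * np.pi * 0.1 * np.arange(4096))        # weak tone, low SNR
print(energy_detect(noise, 1.0))            # expected: occupied = False
print(energy_detect(noise + signal, 1.0))   # expected: occupied = True at this SNR and n
```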

282 Extending the Aspect Oriented Programming Joinpoint Model for Memory and Type Safety

Authors: Amjad Nusayr

Abstract:

Software security is a general term used for any type of software architecture or model in which security aspects are incorporated into the architecture even though they are not part of the main logic of the underlying program. Software security can be achieved using a combination of approaches, including but not limited to secure software design, third-party component validation, and secure coding practices. Memory safety is one feature of software security whereby we ensure that any object in memory has a valid pointer or reference with a valid type. Aspect-Oriented Programming (AOP) is a paradigm concerned with capturing cross-cutting concerns in code development. AOP is generally used for common cross-cutting concerns like logging and database transaction management. In this paper, we introduce the concepts that enable AOP to be used for the purpose of memory and type safety. We also present ideas for extending AOP in software security practices.

Keywords: Aspect oriented programming, programming languages, software security, memory and type safety.

281 More Realistic Model for Simulating Min Protein Dynamics: Lattice Boltzmann Method Incorporating the Role of Nucleoids

Authors: J. Yojina, W. Ngamsaad, N. Nuttavut, D. Triampo, Y. Lenbury, W. Triampo, P. Kanthang, S. Sriyab

Abstract:

The dynamics of Min proteins plays a central role in accurate cell division. Although the nucleoid may presumably play an important role in prokaryotic cell division, there is a lack of models that account for its participation. In this work, we apply the lattice Boltzmann method to investigate protein oscillation based on a mesoscopic model that takes into account the nucleoid's role. We found that our numerical results are in reasonably good agreement with previous experimental results. On comparing with other computational models without the presence of nucleoids, the highlight of our finding is that the local densities of MinD and MinE on the cytoplasmic membrane increase, especially along the cell width, when the size of the obstacle increases, leading to a more distinct cap-like structure at the poles. This feature indicates a realistic pattern and reflects the combination of Min protein dynamics and the nucleoid's role.

Keywords: lattice Boltzmann method, cell division, Min proteins oscillation, nucleoid
