Search results for: machine-learning algorithms
1258 Secure Transfer of Medical Images Using Hybrid Encryption
Authors: Boukhatem Mohamed Belkaid, Lahdi Mourad
Abstract:
In this paper, we propose a new encryption system for securing medical images. The hybrid encryption scheme is based on the AES and RSA algorithms and provides three security services: authentication, integrity, and confidentiality. Confidentiality is ensured by AES and authenticity by the RSA algorithm, while integrity is verified through the correlation between adjacent pixels. Our system generates a unique password for every new encryption session, which is used to encrypt each frame of the medical image in order to strengthen its security. Several metrics were used for the various tests in our analysis. The integrity test shows the effectiveness of our system and how the cryptographic fingerprint changes at reception if the image is altered in the transmission channel.
Keywords: AES, RSA, integrity, confidentiality, authentication, medical images, encryption, decryption, key, correlation
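For illustration only, the hybrid pattern described above (a fresh AES session key for every encryption session, protected with the recipient's RSA key) can be sketched in Python with the cryptography package. This is a minimal stand-in rather than the authors' system: AES-GCM's built-in tag replaces the adjacent-pixel correlation check, and the key size and OAEP padding are assumptions.

```python
# Minimal AES/RSA hybrid sketch (not the authors' system): a fresh AES session key per
# session encrypts the image frame, and the session key is wrapped with an RSA key.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Recipient's RSA key pair (in practice, only the public key is available to the sender).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()
OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def encrypt_frame(frame_bytes: bytes):
    session_key = AESGCM.generate_key(bit_length=256)        # unique key for this session
    nonce = os.urandom(12)
    ciphertext = AESGCM(session_key).encrypt(nonce, frame_bytes, None)  # confidentiality + tag
    wrapped_key = public_key.encrypt(session_key, OAEP)       # key transport via RSA
    return wrapped_key, nonce, ciphertext

def decrypt_frame(wrapped_key, nonce, ciphertext):
    session_key = private_key.decrypt(wrapped_key, OAEP)
    return AESGCM(session_key).decrypt(nonce, ciphertext, None)

frame = b"raw pixel data of one medical image frame"
assert decrypt_frame(*encrypt_frame(frame)) == frame
```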
Procedia PDF Downloads 443
1257 An Optimization Algorithm Based on Dynamic Schema with Dissimilarities and Similarities of Chromosomes
Authors: Radhwan Yousif Sedik Al-Jawadi
Abstract:
Optimization is necessary for finding appropriate solutions to a range of real-life problems. In particular, genetic (or, more generally, evolutionary) algorithms have proved very useful in solving many problems for which analytical solutions are not available. In this paper, we present an optimization algorithm called Dynamic Schema with Dissimilarity and Similarity of Chromosomes (DSDSC), which is a variant of the classical genetic algorithm. This approach constructs new chromosomes from a schema and pairs of existing ones by exploring their dissimilarities and similarities. To show the effectiveness of the algorithm, it is tested and compared with the classical GA on 15 two-dimensional optimization problems taken from the literature. We have found that, in most cases, our method outperforms the classical genetic algorithm.
Keywords: chromosome injection, dynamic schema, genetic algorithm, similarity and dissimilarity
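The DSDSC operators themselves are not spelled out in the abstract, so the sketch below shows only the classical-GA baseline against which the method is compared, minimizing a two-dimensional test function. The test function, population size, and operator settings are illustrative assumptions.

```python
# Classical-GA baseline sketch (illustrative; not the DSDSC algorithm itself),
# minimizing a two-dimensional test function of the kind used in the comparison.
import random

def fitness(x, y):
    return x**2 + y**2                      # sphere function as a stand-in test problem

def tournament(pop, k=3):
    return min(random.sample(pop, k), key=lambda ind: fitness(*ind))

def crossover(p1, p2):
    a = random.random()                     # blend crossover of the two parents
    return (a * p1[0] + (1 - a) * p2[0], a * p1[1] + (1 - a) * p2[1])

def mutate(ind, rate=0.1, scale=0.5):
    return tuple(g + random.gauss(0, scale) if random.random() < rate else g for g in ind)

pop = [(random.uniform(-5, 5), random.uniform(-5, 5)) for _ in range(50)]
for _ in range(200):                        # fixed number of generations
    pop = [mutate(crossover(tournament(pop), tournament(pop))) for _ in range(len(pop))]
best = min(pop, key=lambda ind: fitness(*ind))
print("best individual:", best, "fitness:", fitness(*best))
```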
Procedia PDF Downloads 349
1256 Conception of a Reliable Low Cost, Autonomous Explorative Hovercraft 1
Authors: A. Brand, S. Burgalat, E. Chastel, M. Jumeline, L. Teilhac
Abstract:
The paper presents the current benefits and drawbacks of a multidirectional hovercraft conceived with limited resources and designed for indoor exploration. Recent developments in the field have led to the appearance of very powerful autonomous systems capable of heavy computation and of exploring complex unknown environments. They usually rely on very complex algorithms and high-precision, high-cost sensors, and sometimes have a heavy computational load with complex data fusion. Those systems are usually powerful, but they come at a certain price, and the benefits may not be worth the cost, especially considering their hardware limitations and their power consumption. The present approach is to build a compromise between cost, power consumption, and precision of the results.
Keywords: Hovercraft, indoor exploration, autonomous, multidirectional, wireless control
Procedia PDF Downloads 417
1255 Evaluating the Performance of Color Constancy Algorithm
Authors: Damanjit Kaur, Avani Bhatia
Abstract:
Color constancy is significant for human vision since color is a pictorial cue that helps in solving different vision tasks such as tracking, object recognition, or categorization. Therefore, several computational methods have tried to simulate human color constancy abilities in order to stabilize machine color representations. Two different kinds of methods have been used, i.e., normalization and constancy. While color normalization creates a new representation of the image by canceling illuminant effects, color constancy directly estimates the color of the illuminant in order to map the image colors to a canonical version. Color constancy is the capability to determine the colors of objects independently of the color of the light source. This research work studies most of the well-known color constancy algorithms, such as white patch and gray world.
Keywords: color constancy, gray world, white patch, modified white patch
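As a point of reference (not the paper's implementation), the two baseline algorithms mentioned above reduce to a few lines of NumPy: gray world estimates the illuminant as the per-channel mean, while white patch uses the per-channel maximum.

```python
# Gray-world and white-patch color constancy sketches: estimate the illuminant per channel
# and rescale the image so the estimate maps to a neutral (canonical) color.
import numpy as np

def gray_world(image: np.ndarray) -> np.ndarray:
    """image: H x W x 3 float array in [0, 1]."""
    illuminant = image.reshape(-1, 3).mean(axis=0)     # per-channel mean as the light estimate
    gain = illuminant.mean() / illuminant              # correction factors per channel
    return np.clip(image * gain, 0.0, 1.0)

def white_patch(image: np.ndarray) -> np.ndarray:
    illuminant = image.reshape(-1, 3).max(axis=0)      # brightest value per channel
    return np.clip(image / illuminant, 0.0, 1.0)

corrected = gray_world(np.random.rand(480, 640, 3))    # placeholder image
print(corrected.reshape(-1, 3).mean(axis=0))           # roughly equal channel means afterwards
```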
Procedia PDF Downloads 320
1254 Evaluating 8D Reports Using Text-Mining
Authors: Benjamin Kuester, Bjoern Eilert, Malte Stonis, Ludger Overmeyer
Abstract:
Increasing quality requirements make reliable and effective quality management indispensable. This includes complaint handling, in which the 8D method is widely used. The 8D report, as the written documentation of the 8D method, is one of the key quality documents: it secures quality standards internally and acts as a communication medium to the customer. In practice, however, 8D reports are often faulty and of poor quality, and no quality control of 8D reports exists today. This paper describes the use of natural language processing for the automated evaluation of 8D reports. Based on semantic analysis and text-mining algorithms, the presented system is able to uncover content-related and formal quality deficiencies and thus increases the quality of complaint processing in the long term.
Keywords: 8D report, complaint management, evaluation system, text-mining
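The evaluation pipeline is not detailed in the abstract. As one hedged illustration of a purely formal check, the sketch below verifies that each discipline D1 to D8 appears in a report and contains more than a trivial amount of text; the heading convention and word-count threshold are assumptions, and the semantic analysis is not reproduced here.

```python
# Illustrative formal-quality check for an 8D report: each discipline D1-D8 must be present
# and non-trivially filled (heading convention and threshold are assumed, not from the paper).
import re

def check_8d_report(text: str, min_words: int = 5) -> dict:
    parts = re.split(r"\b(D[1-8])\b", text)             # split at the discipline headings
    sections = {parts[i]: parts[i + 1] for i in range(1, len(parts) - 1, 2)}
    findings = {}
    for disc in (f"D{i}" for i in range(1, 9)):
        body = sections.get(disc, "").strip()
        if disc not in sections:
            findings[disc] = "missing"
        elif len(body.split()) < min_words:
            findings[disc] = "too short"
        else:
            findings[disc] = "ok"
    return findings

report = "D1 team: J. Doe (QA), A. Smith (production) D2 problem: scratch on housing after milling"
print(check_8d_report(report))    # D3-D8 reported as missing in this fragment
```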
Procedia PDF Downloads 316
1253 Text Data Preprocessing Library: Bilingual Approach
Authors: Kabil Boukhari
Abstract:
In the context of information retrieval, the selection of the most relevant words is a very important step. In fact, text cleaning allows keeping only the most representative words for better use. In this paper, we propose a text preprocessing library, within an implemented application, to facilitate this task. This study has two purposes. The first is to present related work on the various steps involved in text preprocessing, covering the segmentation, stemming, and lemmatization algorithms that could be efficient in the rest of the study. The second is to implement the developed tool for text preprocessing in French and English. The library accepts unstructured text as input and provides the preprocessed text as output, based on a set of rules and on a base of stop words for both languages. The proposed library has been tested on different corpora and gave interesting results.
Keywords: text preprocessing, segmentation, knowledge extraction, normalization, text generation, information retrieval
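A comparable bilingual pipeline (segmentation, stop-word removal, stemming) can be put together with NLTK as a rough stand-in for the proposed library; the sketch below assumes the NLTK punkt and stopwords data have been downloaded and uses the Snowball stemmers for English and French.

```python
# Illustrative bilingual (English/French) preprocessing stand-in built on NLTK, not the
# proposed library. Requires: nltk.download("punkt"); nltk.download("stopwords")
from nltk.corpus import stopwords
from nltk.stem.snowball import SnowballStemmer
from nltk.tokenize import word_tokenize

def preprocess(text: str, lang: str = "english") -> list:
    stops = set(stopwords.words(lang))
    stemmer = SnowballStemmer(lang)
    tokens = word_tokenize(text.lower(), language=lang)                 # segmentation
    tokens = [t for t in tokens if t.isalpha() and t not in stops]      # cleaning / stop words
    return [stemmer.stem(t) for t in tokens]                            # stemming

print(preprocess("Information retrieval selects the most relevant words.", "english"))
print(preprocess("La recherche d'information sélectionne les mots les plus pertinents.", "french"))
```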
Procedia PDF Downloads 94
1252 Secret Security Smart Lock Using Artificial Intelligence Hybrid Algorithm
Authors: Vahid Bayrami Rad
Abstract:
From the time humans developed a collective way of life to the development of urbanization, security has always been considered one of the most important challenges of life. To protect property, locks have always been a practical tool. With the advancement of technology, the form of locks has changed from mechanical to electric. One of the most widespread applications of artificial intelligence is in the technology of surveillance and security systems. Currently, the technologies used in smart anti-theft door handles are among the most promising fields for applying artificial intelligence. Artificial intelligence is able to learn, calculate, interpret, and process by analyzing data with the help of algorithms and mathematical models, and to make smart decisions. We use an Arduino board to process the data.
Keywords: Arduino board, artificial intelligence, image processing, solenoid lock
Procedia PDF Downloads 69
1251 Comparison Analysis of Multi-Channel Echo Cancellation Using Adaptive Filters
Authors: Sahar Mobeen, Anam Rafique, Irum Baig
Abstract:
Multichannel acoustic echo cancellation is a system identification application. In a real-time environment, the signal changes very rapidly, which requires adaptive algorithms such as Least Mean Square (LMS), Leaky Least Mean Square (LLMS), Normalized Least Mean Square (NLMS), and the averaging-based algorithm (AFA), with a high convergence rate and stable behavior. LMS and NLMS are widely used adaptive algorithms due to their low computational complexity, while AFA is used for its high convergence rate. This research compares the cancellation of acoustic echo (generated in a room) through LMS, LLMS, NLMS, AFA, and the newly proposed Average Normalized Leaky Least Mean Square (ANLLMS) adaptive filters.
Keywords: LMS, LLMS, NLMS, AFA, ANLLMS
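For reference, a single-channel NLMS echo canceller (one of the baselines listed above, not the proposed ANLLMS filter) can be written directly in NumPy; the filter length, step size, and synthetic echo path below are illustrative.

```python
# Basic NLMS echo-canceller sketch (a baseline, not the proposed ANLLMS filter).
import numpy as np

def nlms_echo_cancel(far_end, mic, n_taps=64, mu=0.5, eps=1e-6):
    w = np.zeros(n_taps)                       # adaptive FIR estimate of the echo path
    err = np.zeros(len(mic))                   # echo-cancelled output signal
    for n in range(n_taps, len(mic)):
        x = far_end[n - n_taps:n][::-1]        # most recent far-end samples
        y = w @ x                              # estimated echo
        err[n] = mic[n] - y                    # residual after echo removal
        w += (mu / (eps + x @ x)) * err[n] * x # normalized LMS weight update
    return err, w

rng = np.random.default_rng(0)
far = rng.standard_normal(5000)                               # far-end (loudspeaker) signal
mic = 0.6 * np.concatenate([np.zeros(10), far[:-10]])         # synthetic room echo at the mic
residual, weights = nlms_echo_cancel(far, mic)
print("residual echo power:", np.mean(residual[1000:] ** 2))  # should be far below 0.36
```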
Procedia PDF Downloads 566
1250 Quality Assurance in Software Design Patterns
Authors: Rabbia Tariq, Hannan Sajjad, Mehreen Sirshar
Abstract:
Design patterns are widely used to make the development process easier, as they greatly help developers build software. Many design patterns have been introduced so far, but the behavior of the same design pattern may differ across domains, which can lead to the wrong selection of a design pattern. The paper aims to discover the design patterns that suit each domain best, thereby helping developers choose an effective design pattern. It presents a comprehensive analysis of design patterns based on different methodologies, including simulation, case studies, and the comparison of various algorithms. Because domains differ, the methodology used in one domain may be inapplicable to another. The paper draws conclusions based on the strengths and limitations of each design pattern in its respective domain.
Keywords: design patterns, evaluation, quality assurance, software domains
Procedia PDF Downloads 522
1249 Towards a Conscious Design in AI by Overcoming Dark Patterns
Authors: Ayse Arslan
Abstract:
One of the important elements underpinning a conscious design is the degree of toxicity in communication. This study explores the mechanisms and strategies for identifying toxic content by avoiding dark patterns. Given the breadth of hate and harassment attacks, this study explores a threat model and taxonomy to assist in reasoning about strategies for detection, prevention, mitigation, and recovery. In addition to identifying some relevant techniques, such as nudges, automatic detection, or human ranking, the study suggests the use of major metrics such as the overhead and friction of solutions on platforms and users, or balancing false positives (e.g., incorrectly penalizing legitimate users) against false negatives (e.g., users exposed to hate and harassment), to maintain a conscious design towards fairness.
Keywords: AI, ML, algorithms, policy, system design
Procedia PDF Downloads 121
1248 Conception of a Reliable Low Cost and Autonomous Explorative Hovercraft
Authors: S. Burgalat, L. Teilhac, A. Brand, E. Chastel, M. Jumeline
Abstract:
The paper presents the current benefits and drawbacks of a multidirectional autonomous hovercraft conceived with limited resources and designed for indoor exploration. Recent developments in the field have led to the appearance of very powerful autonomous systems capable of heavy computation and of exploring complex unknown environments. They usually rely on very complex algorithms and high-precision, high-cost sensors, and sometimes have a heavy computational load with complex data fusion. These systems are usually powerful but come at a certain price, and the benefits may not be worth the cost, especially considering their hardware limitations and their power consumption. The present approach is to build a compromise between cost, power consumption, and precision of the results.
Keywords: hovercraft, indoor exploration, autonomous, multidirectional, wireless control
Procedia PDF Downloads 278
1247 Predicting the Product Life Cycle of Songs on Radio - How Record Labels Can Manage Product Portfolio and Prioritise Artists by Using Machine Learning Techniques
Authors: Claus N. Holm, Oliver F. Grooss, Robert A. Alphinas
Abstract:
This research strives to predict the remaining product life cycle of a song on radio after it has been played for one or two months. The best results were achieved by using a k-d tree to find the songs most similar to the test songs and a Random Forest model to forecast radio plays. Accuracies of 82.78% and 83.44% are achieved for the two time periods, respectively. This explorative research produces over 4,500 test metrics in the search for the best combination of models and pre-processing techniques. Other algorithms tested are KNN, MLP, and CNN. The features consist only of daily radio plays and use no musical features.
Keywords: hit song science, product life cycle, machine learning, radio
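A hedged sketch of the winning combination (k-d tree retrieval of similar play-count histories plus a Random Forest forecast) is shown below with scikit-learn; the feature layout, the synthetic data, and the target definition are assumptions based on the abstract, not the authors' dataset.

```python
# Sketch of the described pipeline: k-d tree retrieval of similar songs + Random Forest forecast.
# Feature layout (daily radio plays over the first 60 days) and target are assumed from the abstract.
import numpy as np
from sklearn.neighbors import KDTree
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
X_train = rng.poisson(5, size=(500, 60)).astype(float)       # daily plays, first two months
y_train = X_train.sum(axis=1) * rng.uniform(0.5, 2.0, 500)   # placeholder "remaining plays" target

tree = KDTree(X_train)                                       # index of known play-count histories
forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

X_test = rng.poisson(5, size=(10, 60)).astype(float)
_, idx = tree.query(X_test, k=5)                             # 5 most similar songs per test song
neighbour_avg = y_train[idx].mean(axis=1)                    # neighbour-based estimate
forest_pred = forest.predict(X_test)                         # Random Forest forecast
print(np.c_[neighbour_avg, forest_pred][:3])
```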
Procedia PDF Downloads 156
1246 Pre- and Post-Analyses of Disruptive Quay Crane Scheduling Problem
Authors: K.-H. Yang
Abstract:
Quay crane operations have been well studied in the past, and a number of scheduling algorithms exist for them, but these do not consider nuisance factors that might disrupt the operations. For example, a faulty grapple makes a crane unable to load or unload containers, or a sudden strong breeze stops operations temporarily. Although these disruptive conditions occur randomly, they influence the efficiency of quay crane operations, yet disruption is neither considered in the operational procedures nor evaluated in advance for its impact. This study applies simulation and optimization approaches to develop pre-analysis and post-analysis structures for the Quay Crane Scheduling Problem in order to deal with disruptive scenarios in quay crane operation. Numerical experiments demonstrate the validity of the developed approaches.
Keywords: disruptive quay crane scheduling, pre-analysis, post-analysis, disruption
Procedia PDF Downloads 222
1245 Parallel Querying of Distributed Ontologies with Shared Vocabulary
Authors: Sharjeel Aslam, Vassil Vassilev, Karim Ouazzane
Abstract:
Ontologies and various semantic repositories have become a convenient approach for implementing model-driven architectures of distributed systems on the Web, and SPARQL is the standard language for querying them. However, although SPARQL is a well-established standard for querying semantic repositories in RDF and OWL formats, and there are commonly used APIs that support it, such as Jena for Java, a parallel option is not incorporated in them. This article presents a complete framework consisting of an object algebra for parallel RDF and an index-based implementation of a parallel query engine capable of dealing with distributed RDF ontologies that share a common vocabulary. It has been implemented in Java and, for validation of the algorithms, applied to the problem of organizing virtual exhibitions on the Web.
Keywords: distributed ontologies, parallel querying, semantic indexing, shared vocabulary, SPARQL
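The framework itself is implemented in Java on top of Jena, but the idea of fanning the same SPARQL query out over several ontologies that share a vocabulary can be illustrated in Python with rdflib and a thread pool; the file names and query below are placeholders.

```python
# Illustrative parallel querying of several RDF graphs with a shared vocabulary, using rdflib
# and a thread pool (a Python stand-in; the paper's engine is an index-based Java implementation).
from concurrent.futures import ThreadPoolExecutor
from rdflib import Graph

QUERY = """
SELECT ?s ?label
WHERE { ?s <http://www.w3.org/2000/01/rdf-schema#label> ?label }
"""

def query_graph(source):
    g = Graph()
    g.parse(source)                                  # placeholder .ttl file or URL
    return [(str(row.s), str(row.label)) for row in g.query(QUERY)]

sources = ["exhibit_a.ttl", "exhibit_b.ttl", "exhibit_c.ttl"]    # hypothetical distributed ontologies
with ThreadPoolExecutor(max_workers=len(sources)) as pool:
    merged = [row for rows in pool.map(query_graph, sources) for row in rows]
print(len(merged), "bindings collected from", len(sources), "ontologies")
```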
Procedia PDF Downloads 204
1244 A Quadratic Model to Early Predict the Blastocyst Stage with a Time Lapse Incubator
Authors: Cecile Edel, Sandrine Giscard D'Estaing, Elsa Labrune, Jacqueline Lornage, Mehdi Benchaib
Abstract:
Introduction: The use of incubators equipped with time-lapse technology in Artificial Reproductive Technology (ART) allows continuous surveillance. With morphokinetic parameters, algorithms are available to predict the potential outcome of an embryo. However, the proposed time-lapse algorithms do not take missing data into account, so some embryos cannot be classified. The aim of this work is to construct a predictive model that works even in the case of missing data. Materials and methods: Patients: A retrospective study was performed in the reproductive biology laboratory of the hospital ‘Femme Mère Enfant’ (Lyon, France) between 1 May 2013 and 30 April 2015. Embryos (n = 557) obtained from couples (n = 108) were cultured in a time-lapse incubator (Embryoscope®, Vitrolife, Goteborg, Sweden). Time-lapse incubator: The morphokinetic parameters obtained during the first three days of embryo life were used to build the predictive model. Predictive model: A quadratic regression was performed between the number of cells and time: N = a·T² + b·T + c, where N is the number of cells at time T (in hours). The regression coefficients were calculated with Excel software (Microsoft, Redmond, WA, USA); a program in Visual Basic for Applications (VBA) (Microsoft) was written for this purpose. The quadratic equation was used to compute a value that allows the prediction of blastocyst formation: the synthesized value. The area under the curve (AUC) obtained from the ROC curve was used to assess the performance of the regression coefficients and of the synthesized value. A cut-off value was calculated for each regression coefficient and for the synthesized value so as to obtain two groups for which the difference in blastocyst formation rate across the cut-off was maximal. The data were analyzed with SPSS (IBM, Chicago, IL, USA). Results: Among the 557 embryos, 79.7% reached the blastocyst stage. The synthesized value corresponds to the value calculated at a time value equal to 99, for which the highest AUC was obtained. The AUC was 0.648 (p < 0.001) for regression coefficient ‘a’, 0.363 (p < 0.001) for regression coefficient ‘b’, 0.633 (p < 0.001) for regression coefficient ‘c’, and 0.659 (p < 0.001) for the synthesized value. The results are presented as follows: blastocyst formation rate below the cut-off value versus blastocyst formation rate above the cut-off value. For regression coefficient ‘a’ the optimum cut-off value was −1.14 × 10⁻³ (61.3% versus 84.3%, p < 0.001), 0.26 for regression coefficient ‘b’ (83.9% versus 63.1%, p < 0.001), −4.4 for regression coefficient ‘c’ (62.2% versus 83.1%, p < 0.001), and 8.89 for the synthesized value (58.6% versus 85.0%, p < 0.001). Conclusion: This quadratic regression allows the outcome of an embryo to be predicted even in the case of missing data. The three regression coefficients and the synthesized value could represent the identity card of an embryo: the ‘a’ coefficient represents the acceleration of cell division and the ‘b’ coefficient the speed of cell division. We hypothesize that the ‘c’ coefficient could represent the intrinsic potential of an embryo, which could depend on the oocyte from which the embryo originates. These hypotheses should be confirmed by studies analyzing the relationship between the regression coefficients and ART parameters.
Keywords: ART procedure, blastocyst formation, time-lapse incubator, quadratic model
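The core computation (fitting N = a·T² + b·T + c to each embryo's cell counts, evaluating the fit at T = 99 h to obtain the synthesized value, and scoring it with a ROC AUC) can be reproduced in outline with NumPy and scikit-learn rather than Excel/VBA; the data below are simulated placeholders, not the study's embryos.

```python
# Outline of the quadratic model: fit N = a*T^2 + b*T + c per embryo, evaluate the fit at
# T = 99 h ("synthesized value"), and assess discrimination with ROC AUC. Data are simulated.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n_embryos = 200
times = np.linspace(10, 70, 12)                  # observation times (h) over the first 3 days

blastocyst = rng.random(n_embryos) < 0.8         # ~80% reach the blastocyst stage (placeholder)
coeffs = np.empty((n_embryos, 3))
for i in range(n_embryos):
    growth = 0.002 + 0.001 * blastocyst[i]       # toy model: faster division for "good" embryos
    counts = 1 + growth * times**2 + rng.normal(0, 0.5, times.size)
    coeffs[i] = np.polyfit(times, counts, deg=2) # returns [a, b, c]

a, b, c = coeffs.T
synthesized = a * 99**2 + b * 99 + c             # predicted cell number at T = 99 h
print("AUC of coefficient a:   ", roc_auc_score(blastocyst, a))
print("AUC of synthesized value:", roc_auc_score(blastocyst, synthesized))
```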
Procedia PDF Downloads 307
1243 Design of Permanent Sensor Fault Tolerance Algorithms by Sliding Mode Observer for Smart Hybrid Powerpack
Authors: Sungsik Jo, Hyeonwoo Kim, Iksu Choi, Hunmo Kim
Abstract:
In the SHP, an LVDT sensor detects the length changes of the EHA output, and the thrust of the EHA is controlled through the pressure sensor. A sensor can suffer a hardware fault caused by an internal problem or an external disturbance, and the EHA of the SHP can become uncontrollable when it is controlled by feedback from uncertain information. In this paper, a sliding mode observer algorithm estimates the original sensor output information under a permanent sensor fault. The proposed algorithm recovers from disconnection and short-circuit faults and also detects various sensor fault modes.
Keywords: smart hybrid powerpack (SHP), electro hydraulic actuator (EHA), permanent sensor fault tolerance, sliding mode observer (SMO), graphic user interface (GUI)
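As a toy stand-in (the SHP/EHA dynamics and observer gains are not given in the abstract), a sliding mode observer for a first-order plant shows the basic mechanism: a switching correction drives the estimate to the measured output, and the residual between sensor and estimate can be monitored for fault detection.

```python
# Toy sliding-mode-observer sketch for a first-order plant x' = a*x + b*u with measurement y = x.
# The residual |y - x_hat| can be monitored to flag a sensor fault (plant and gains are made up).
import numpy as np

a, b, L = -2.0, 1.0, 5.0              # plant parameters and observer switching gain
dt, T = 1e-3, 2.0
steps = int(T / dt)

x, x_hat = 0.0, 0.5                   # true state and deliberately mismatched initial estimate
residual = np.zeros(steps)
for k in range(steps):
    u = 1.0                                           # constant command input
    y = x                                             # healthy sensor; a fault would corrupt y here
    residual[k] = abs(y - x_hat)
    x += dt * (a * x + b * u)                         # plant integration (explicit Euler)
    x_hat += dt * (a * x_hat + b * u + L * np.sign(y - x_hat))   # sliding-mode correction

print("final estimation error:", residual[-1])        # shrinks toward zero for a healthy sensor
```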
Procedia PDF Downloads 549
1242 Design and Development of an Algorithm to Predict Fluctuations of Currency Rates
Authors: Nuwan Kuruwitaarachchi, M. K. M. Peiris, C. N. Madawala, K. M. A. R. Perera, V. U. N. Perera
Abstract:
Dealing with foreign markets has always held a special place in a country’s economy. Political and social factors come into play, making currency rates fluctuate rapidly. Currency rate prediction has become an important factor for large international businesses, since large amounts of money are exchanged between countries. This research focuses on comparing the accuracy of mainly three models: Autoregressive Integrated Moving Average (ARIMA), Artificial Neural Networks (ANN), and Support Vector Machines (SVM). A series of import, export, and USD exchange rate data with respect to the LKR was selected for training with the above-mentioned algorithms. After training on the data set and comparing the algorithms, it could be seen that the SVM predictions performed better than the other models. The results were further improved by combining the SVM and SVR models.
Keywords: ARIMA, ANN, FFNN, RMSE, SVM, SVR
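To illustrate just the SVR step (not the full ARIMA/ANN/SVM comparison), the sketch below trains a scikit-learn SVR on lagged values of a synthetic USD/LKR-like series; the lag length, kernel settings, and random-walk data are assumptions.

```python
# Illustrative SVR forecast of an exchange-rate series from lagged values
# (not the authors' full ARIMA/ANN/SVM comparison; data are synthetic).
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(7)
rate = 150 + np.cumsum(rng.normal(0, 0.3, 1000))     # synthetic USD/LKR-like random walk

lags = 5
X = np.column_stack([rate[i:len(rate) - lags + i] for i in range(lags)])  # lagged features
y = rate[lags:]                                       # next-day rate to predict
split = int(0.8 * len(y))

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.05))
model.fit(X[:split], y[:split])
pred = model.predict(X[split:])
print("test RMSE:", mean_squared_error(y[split:], pred) ** 0.5)
```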
Procedia PDF Downloads 212
1241 Orthogonal Regression for Nonparametric Estimation of Errors-In-Variables Models
Authors: Anastasiia Yu. Timofeeva
Abstract:
Two new algorithms for the nonparametric estimation of errors-in-variables models are proposed. The first algorithm is based on a penalized regression spline. The spline is represented as a piecewise-linear function, and for each linear portion an orthogonal regression is estimated; this algorithm is iterative. The second algorithm involves locally weighted regression estimation. When the independent variable is measured with error, such estimation is a complex nonlinear optimization problem. The simulation results have shown the advantage of the second algorithm under the assumption that the true smoothing parameter values are known. Nevertheless, using some indexes of fit for smoothing parameter selection gives similar results and has an oversmoothing effect.
Keywords: grade point average, orthogonal regression, penalized regression spline, locally weighted regression
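For reference, the orthogonal-regression fit of a single linear portion (the building block applied to each spline segment) reduces to a total least squares problem that can be solved with an SVD; this sketch is not the full penalized-spline or locally weighted procedure.

```python
# Orthogonal (total least squares) line fit via SVD: the building block applied to each
# linear portion of the spline; not the full penalized-spline or locally weighted algorithm.
import numpy as np

def orthogonal_line_fit(x, y):
    """Fit y = b0 + b1*x minimizing orthogonal (not vertical) distances."""
    pts = np.column_stack([x - x.mean(), y - y.mean()])
    _, _, vt = np.linalg.svd(pts, full_matrices=False)
    direction = vt[0]                       # principal direction of the centered point cloud
    b1 = direction[1] / direction[0]        # slope
    b0 = y.mean() - b1 * x.mean()           # intercept through the centroid
    return b0, b1

rng = np.random.default_rng(3)
x_true = np.linspace(0, 10, 100)
x_obs = x_true + rng.normal(0, 0.3, 100)    # errors in the independent variable too
y_obs = 1.0 + 2.0 * x_true + rng.normal(0, 0.3, 100)
print(orthogonal_line_fit(x_obs, y_obs))    # approximately (1.0, 2.0)
```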
Procedia PDF Downloads 416
1240 Secure Transfer of Medical Images Using Hybrid Encryption Authentication, Confidentiality, Integrity
Authors: Boukhatem Mohammed Belkaid, Lahdir Mourad
Abstract:
In this paper, we propose a new encryption system for securing medical images. The hybrid encryption scheme is based on the AES and RSA algorithms and provides three security services: authentication, integrity, and confidentiality. Confidentiality is ensured by AES and authenticity by the RSA algorithm, while integrity is verified through the correlation between adjacent pixels. Our system generates a unique password for every new encryption session, which is used to encrypt each frame of the medical image in order to strengthen its security. Several metrics were used for the various tests in our analysis. The integrity test shows the effectiveness of our system and how the cryptographic fingerprint changes at reception if the image is altered in the transmission channel.
Keywords: AES, RSA, integrity, confidentiality, authentication, medical images, encryption, decryption, key, correlation
Procedia PDF Downloads 540
1239 Approaches of Flight Level Selection for an Unmanned Aerial Vehicle Round-Trip in Order to Reach Best Range Using Changes in Flight Level Winds
Authors: Dmitry Fedoseyev
Abstract:
The ultimate success of unmanned aerial vehicles (UAVs) depends largely on the effective control of their flight, especially in variable wind conditions. This paper investigates different approaches to selecting the optimal flight level to maximize the range of UAVs. We propose to consider methods based on mathematical models of atmospheric conditions, as well as the use of sensor data and machine learning algorithms, to automatically optimize the flight level in real time. The proposed approaches promise to improve the efficiency and range of UAVs in various wind conditions, which may have significant implications for the application of these systems in various fields, including geodesy, environmental surveillance, and search and rescue operations.
Keywords: drone, UAV, flight trajectory, wind-searching, efficiency
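A minimal version of the simplest such approach, choosing for each leg the flight level whose forecast along-track wind gives the shortest time at a fixed airspeed, is sketched below; the wind table, the airspeed, and the assumption that the UAV may change level between the outbound and return legs are illustrative.

```python
# Illustrative flight-level selection for a fixed-airspeed round trip: choose the level
# (possibly different per leg) that minimizes total flight time given forecast along-track
# wind components (positive = tailwind). All numbers are made up.
def leg_time(distance_km, airspeed_kmh, wind_kmh):
    ground_speed = airspeed_kmh + wind_kmh
    if ground_speed <= 0:
        return float("inf")                  # cannot make headway against this wind
    return distance_km / ground_speed

def best_levels(distance_km, airspeed_kmh, winds_by_level):
    out = min(winds_by_level,
              key=lambda lvl: leg_time(distance_km, airspeed_kmh, winds_by_level[lvl]))
    back = min(winds_by_level,
               key=lambda lvl: leg_time(distance_km, airspeed_kmh, -winds_by_level[lvl]))
    total = (leg_time(distance_km, airspeed_kmh, winds_by_level[out])
             + leg_time(distance_km, airspeed_kmh, -winds_by_level[back]))
    return out, back, total

winds = {100: 5.0, 500: 15.0, 1000: -10.0, 2000: 25.0}   # altitude (m) -> along-track wind (km/h)
print(best_levels(distance_km=80.0, airspeed_kmh=90.0, winds_by_level=winds))
```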
Procedia PDF Downloads 65
1238 A Method for Solving a Bi-Objective Transportation Problem under Fuzzy Environment
Authors: Sukhveer Singh, Sandeep Singh
Abstract:
A bi-objective fuzzy transportation problem is considered, with the objectives of minimizing the total fuzzy cost and the fuzzy time of transportation without assigning priorities to them. To the best of our knowledge, there is no method in the literature to find efficient solutions of the bi-objective transportation problem under uncertainty. In this paper, a bi-objective transportation problem in an uncertain environment is formulated, and an algorithm is proposed to find its efficient solutions. The proposed algorithm avoids degeneracy and gives the optimal solution faster than other existing algorithms for the given uncertain transportation problem.
Keywords: uncertain transportation problem, efficient solution, ranking function, fuzzy transportation problem
Procedia PDF Downloads 525
1237 Structural Invertibility and Optimal Sensor Node Placement for Error and Input Reconstruction in Dynamic Systems
Authors: Maik Kschischo, Dominik Kahl, Philipp Wendland, Andreas Weber
Abstract:
Understanding and modelling of real-world complex dynamic systems in biology, engineering and other fields is often made difficult by incomplete knowledge about the interactions between system states and by unknown disturbances to the system. In fact, most real-world dynamic networks are open systems receiving unknown inputs from their environment. To understand a system and to estimate the state dynamics, these inputs need to be reconstructed from output measurements. Reconstructing the input of a dynamic system from its measured outputs is an ill-posed problem if only a limited number of states is directly measurable. A first requirement for solving this problem is the invertibility of the input-output map. In our work, we exploit the fact that invertibility of a dynamic system is a structural property, which depends only on the network topology. Therefore, it is possible to check for invertibility using a structural invertibility algorithm which counts the number of node-disjoint paths linking inputs and outputs. The algorithm is efficient enough, even for large networks of up to a million nodes. To understand the structural features influencing the invertibility of a complex dynamic network, we analyze synthetic and real networks using the structural invertibility algorithm. We find that invertibility largely depends on the degree distribution and that dense random networks are easier to invert than sparse inhomogeneous networks. We show that real networks are often very difficult to invert unless the sensor nodes are carefully chosen. To overcome this problem, we present a sensor node placement algorithm to achieve invertibility with a minimum set of measured states. This greedy algorithm is very fast and is also guaranteed to find an optimal sensor node set if it exists. Our results provide a practical approach to experimental design for open, dynamic systems. Since invertibility is a necessary condition for unknown input observers and data assimilation filters to work, it can be used as a preprocessing step to check whether these input reconstruction algorithms can be successful. If not, we can suggest additional measurements providing sufficient information for input reconstruction. Invertibility is also important for system design and model building. Dynamic models are always incomplete, and synthetic systems act in an environment where they receive inputs or even attack signals from their exterior. Being able to monitor these inputs is an important design requirement, which can be achieved by our algorithms for invertibility analysis and sensor node placement.
Keywords: data-driven dynamic systems, inversion of dynamic systems, observability, experimental design, sensor node placement
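The structural test described above can be reproduced with standard graph tooling: the sketch below counts vertex-disjoint directed paths from the unknown-input nodes to a candidate sensor set with NetworkX, which is the quantity the invertibility check relies on; the example network and node labels are arbitrary.

```python
# Sketch of the structural check: count vertex-disjoint directed paths from the unknown-input
# nodes to the candidate sensor nodes with NetworkX (the example network is arbitrary).
import networkx as nx

def disjoint_path_count(graph, inputs, outputs):
    """Maximum number of vertex-disjoint paths from the input set to the output set."""
    aux = graph.copy()
    aux.add_edges_from(("SRC", u) for u in inputs)   # super-source feeding all input nodes
    aux.add_edges_from((v, "SNK") for v in outputs)  # super-sink collecting all sensor nodes
    return len(list(nx.node_disjoint_paths(aux, "SRC", "SNK")))

G = nx.DiGraph([(1, 2), (2, 3), (1, 4), (4, 5), (5, 3), (2, 6), (4, 6)])
inputs, sensors = [1], [3, 6]
k = disjoint_path_count(G, inputs, sensors)
print("disjoint paths:", k, "(invertibility requires at least", len(inputs), "such paths)")
```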
Procedia PDF Downloads 150
1236 Multi-Level Security Measures in Cloud Computing
Authors: Shobha G. Ranjan
Abstract:
Cloud computing is an emerging, on-demand, internet-based technology. A variety of services, such as software, hardware, data storage, and infrastructure, can be shared through cloud computing. The technology is highly reliable, cost-effective, and scalable in nature. It is essential that only authorized users access these services, and the time granted to access them should also be taken into account for proper accounting purposes. Currently, many organizations implement security measures in many different ways to provide the best cloud infrastructure to their clients, but that is not the limit. This paper presents a multi-level security measures technique that is in accordance with the OSI model. Details of the proposed multi-level security measures technique are presented, along with the architecture, activities, algorithms, and the probability of success in breaking authentication.
Keywords: cloud computing, cloud security, integrity, multi-tenancy, security
Procedia PDF Downloads 501
1235 Application of Artificial Intelligence in EOR
Authors: Masoumeh Mofarrah, Amir NahanMoghadam
Abstract:
Higher oil prices and increasing oil demand are the main reasons for the great attention paid to Enhanced Oil Recovery (EOR). Comprehensive research has been carried out to develop, appraise, and improve EOR methods and their application. Recently, Artificial Intelligence (AI) has gained popularity in the petroleum industry, as it can help petroleum engineers solve fundamental petroleum engineering problems such as reservoir simulation, EOR project risk analysis, well log interpretation, and well test model selection. This study presents a historical overview of the most popular AI tools in the petroleum industry, including neural networks, genetic algorithms, fuzzy logic, and expert systems, and discusses two case studies that represent the application of two of these AI methods for selecting an appropriate EOR method based on reservoir characterization in a feasible and effective way.
Keywords: artificial intelligence, EOR, neural networks, expert systems
Procedia PDF Downloads 488
1234 Development of Non-Intrusive Speech Evaluation Measure Using S-Transform and Light-Gbm
Authors: Tusar Kanti Dash, Ganapati Panda
Abstract:
The evaluation of speech quality and intelligibility is critical to the overall effectiveness of speech enhancement algorithms. Several intrusive and non-intrusive measures are employed to calculate these parameters. Non-intrusive evaluation is the most challenging because, very often, the reference clean speech data is not available. In this paper, a novel non-intrusive speech evaluation measure is proposed using audio features derived from the Stockwell transform. These features are used with the Light Gradient Boosting Machine for the effective prediction of speech quality and intelligibility. The proposed model is analyzed using noisy and reverberant speech from four databases, and the results are compared with standard intrusive evaluation measures. The comparative analysis shows that the proposed model performs better than the standard non-intrusive models.
Keywords: non-intrusive speech evaluation, S-transform, light GBM, speech quality and intelligibility
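A skeletal version of the learning stage, regressing a quality score on per-utterance time-frequency features with LightGBM, is sketched below; random placeholder features stand in for the Stockwell-transform statistics, and the score definition is an assumption, so this is not the proposed measure.

```python
# Skeleton of the learning stage: LightGBM regression of a speech-quality score on
# time-frequency features (random placeholders stand in for Stockwell-transform statistics).
import numpy as np
import lightgbm as lgb
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 40))                      # 40 per-utterance T-F features (placeholder)
y = X[:, :5].sum(axis=1) + rng.normal(0, 0.2, 2000)  # placeholder quality score

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = lgb.LGBMRegressor(n_estimators=300, learning_rate=0.05, num_leaves=31)
model.fit(X_tr, y_tr)
print("MAE on held-out utterances:", mean_absolute_error(y_te, model.predict(X_te)))
```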
Procedia PDF Downloads 260
1233 A Systematic Review Investigating the Use of EEG Measures in Neuromarketing
Authors: A. M. Byrne, E. Bonfiglio, C. Rigby, N. Edelstyn
Abstract:
Introduction: Neuromarketing employs numerous methodologies when investigating products and advertisement effectiveness. Electroencephalography (EEG), a non-invasive measure of electrical activity from the brain, is commonly used in neuromarketing. EEG data can be considered using time-frequency (TF) analysis, where changes in the frequency of brainwaves are calculated to infer participants’ mental states, or event-related potential (ERP) analysis, where changes in amplitude are observed in direct response to a stimulus. This presentation discusses the findings of a systematic review of EEG measures in neuromarketing. A systematic review summarises evidence on a research question, using explicit measures to identify, select, and critically appraise relevant research papers. This systematic review identifies which EEG measures are the most robust predictors of customer preference and purchase intention. Methods: Search terms identified 174 papers that used EEG in combination with marketing-related stimuli. Publications were excluded if they were written in a language other than English or were not published as journal articles (e.g., book chapters). The review investigated which TF effect (e.g., theta-band power) and ERP component (e.g., N400) most consistently reflected preference and purchase intention. Machine-learning prediction was also investigated, along with the use of EEG combined with physiological measures such as eye-tracking. Results: Frontal alpha asymmetry was the most reliable TF signal, where an increase in activity over the left side of the frontal lobe indexed a positive response to marketing stimuli, while an increase in activity over the right side indexed a negative response. The late positive potential, a positive amplitude increase around 600 ms after stimulus presentation, was the most reliable ERP component, reflecting the conscious emotional evaluation of marketing stimuli. However, each measure showed mixed results when related to preference and purchase behaviour. Predictive accuracy was greatly improved through machine-learning algorithms such as deep neural networks, especially when combined with eye-tracking or facial expression analyses. Discussion: This systematic review provides a novel catalogue of the most effective uses of each EEG measure commonly used in neuromarketing. Exciting findings to emerge are the identification of frontal alpha asymmetry and the late positive potential as markers of preferential responses to marketing stimuli. Machine-learning algorithms achieved predictive accuracies as high as 97%, and future research should therefore focus on machine-learning prediction when using EEG measures in neuromarketing.
Keywords: EEG, ERP, neuromarketing, machine-learning, systematic review, time-frequency
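To make the frontal-alpha-asymmetry measure concrete, a common formulation is the log alpha-band power at a right-frontal channel minus that at a left-frontal channel, with band power estimated by Welch's method; the sketch below uses synthetic signals and assumes two frontal channels (e.g., F3 and F4) sampled at 250 Hz.

```python
# Frontal alpha asymmetry sketch: ln(alpha power at a right-frontal channel, e.g. F4)
# minus ln(alpha power at a left-frontal channel, e.g. F3). Signals here are synthetic.
import numpy as np
from scipy.signal import welch

def alpha_power(signal, fs, band=(8.0, 13.0)):
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)      # 2-second Welch windows
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.sum(psd[mask]) * (freqs[1] - freqs[0])       # approximate band power

def frontal_alpha_asymmetry(left, right, fs):
    return np.log(alpha_power(right, fs)) - np.log(alpha_power(left, fs))

fs = 250
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(0)
f3 = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)   # left frontal (synthetic)
f4 = 3.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)   # right frontal (synthetic)
print("FAA:", frontal_alpha_asymmetry(f3, f4, fs))   # > 0 implies relatively greater left activation
```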
Procedia PDF Downloads 112
1232 Prediction of Coronary Artery Stenosis Severity Based on Machine Learning Algorithms
Authors: Yu-Jia Jian, Emily Chia-Yu Su, Hui-Ling Hsu, Jian-Jhih Chen
Abstract:
The coronary arteries are the major suppliers of myocardial blood flow. When fat and cholesterol are deposited in the coronary arterial wall, narrowing and stenosis of the artery occur, which may lead to myocardial ischemia and eventually infarction. According to the World Health Organization (WHO), an estimated 7.4 million people died of coronary heart disease in 2015. According to statistics from the Ministry of Health and Welfare in Taiwan, heart disease (excluding hypertensive diseases) ranked second among the top 10 causes of death from 2013 to 2016, and it still shows a growing trend. According to the American Heart Association (AHA), the risk factors for coronary heart disease include age (> 65 years), sex (a 2:1 male-to-female ratio), obesity, diabetes, hypertension, hyperlipidemia, smoking, family history, lack of exercise, and more. We collected a dataset of 421 patients from a hospital located in northern Taiwan who received coronary computed tomography (CT) angiography. There were 300 males (71.26%) and 121 females (28.74%), with ages ranging from 24 to 92 years and a mean age of 56.3 years. Prior to coronary CT angiography, basic data of the patients, including age, gender, body mass index (BMI), diastolic blood pressure, systolic blood pressure, diabetes, hypertension, hyperlipidemia, smoking, family history of coronary heart disease, and exercise habits, were collected and used as input variables. The output variable of the prediction module is the degree of coronary artery stenosis. In this study, the dataset was randomly divided into 80% as the training set and 20% as the test set. Four machine learning algorithms, including logistic regression, stepwise regression, neural network, and decision tree, were incorporated to generate prediction results. We used the area under the curve (AUC) and accuracy (Acc.) to compare the four models; the best model was the neural network, followed by stepwise logistic regression, decision tree, and logistic regression, with 0.68 / 79%, 0.68 / 74%, 0.65 / 78%, and 0.65 / 74%, respectively. The sensitivity and specificity were 27.3% and 90.8% for the neural network, 18.2% and 92.3% for stepwise logistic regression, 13.6% and 100% for the decision tree, and 27.3% and 89.2% for logistic regression. From the results of this study, we hope to improve the accuracy by refining the model parameters or other methods in the future, and to address the low sensitivity by adjusting the imbalanced proportion of positive and negative data.
Keywords: decision support, computed tomography, coronary artery, machine learning
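A condensed sketch of the protocol described (an 80/20 split, several of the listed classifiers, and an AUC/accuracy comparison) is given below with scikit-learn on placeholder data for the listed clinical inputs; it is not the authors' code or dataset, and stepwise regression is omitted because scikit-learn has no direct equivalent.

```python
# Condensed sketch of the described protocol: 80/20 split, several classifiers, AUC/accuracy.
# Placeholder data with the listed clinical inputs; not the authors' code or dataset.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score, accuracy_score

rng = np.random.default_rng(0)
n = 421
X = pd.DataFrame({
    "age": rng.integers(24, 93, n), "male": rng.integers(0, 2, n), "bmi": rng.normal(25, 4, n),
    "sbp": rng.normal(130, 15, n), "dbp": rng.normal(80, 10, n), "diabetes": rng.integers(0, 2, n),
    "hypertension": rng.integers(0, 2, n), "hyperlipidemia": rng.integers(0, 2, n),
    "smoking": rng.integers(0, 2, n), "family_history": rng.integers(0, 2, n),
    "exercise": rng.integers(0, 2, n)})
y = rng.integers(0, 2, n)                      # placeholder label: significant stenosis yes/no

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
models = {
    "logistic regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "neural network": make_pipeline(StandardScaler(),
                                    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000)),
    "decision tree": DecisionTreeClassifier(max_depth=4),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    proba = model.predict_proba(X_te)[:, 1]
    print(f"{name}: AUC={roc_auc_score(y_te, proba):.2f}, "
          f"Acc={accuracy_score(y_te, model.predict(X_te)):.2f}")
```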
Procedia PDF Downloads 229
1231 A Survey of Discrete Facility Location Problems
Authors: Z. Ulukan, E. Demircioğlu
Abstract:
Facility location is a complex real-world problem which requires strategic management decisions. This paper provides a general review of studies, efforts, and developments in Facility Location Problems (FLPs), which are classical optimization problems with widespread applications in various areas such as transportation, distribution, production, supply chain decisions, and telecommunications. Our goal is not to review all variants studied in FLPs or to describe very detailed computational techniques and solution approaches, but rather to provide a broad overview of the major location problems that have been studied, indicating how they are formulated and what researchers have proposed to tackle them. A brief, elucidative table grouping the studies according to “General Problem Type” and “Methods Proposed” is also presented at the end of the work.
Keywords: discrete location problems, exact methods, heuristic algorithms, single source capacitated facility location problems
Procedia PDF Downloads 471
1230 Designing a Cyclic Redundancy Checker-8 for 32 Bit Input Using VHDL
Authors: Ankit Shai
Abstract:
CRC, or Cyclic Redundancy Check, is one of the most common and most powerful error-detecting codes implemented on modern computers. Most modern communication protocols use error detection algorithms in digital networks and storage devices to detect accidental changes to raw data between transmission and reception, and CRC is the most popular among these error detection codes. CRC properties are defined by the generator polynomial length and coefficients. The aim of this project is to implement an efficient FPGA-based CRC-8 that accepts a 32-bit input, taking into consideration optimal chip area and high performance, using VHDL. The proposed architecture is implemented on the Xilinx ISE simulator. It is designed while keeping in mind the hardware design, complexity, and cost factors.
Keywords: cyclic redundancy checker, CRC-8, 32-bit input, FPGA, VHDL, ModelSim, Xilinx
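The design itself is in VHDL, but a short bit-serial software model is handy as a reference to check the synthesized hardware against; the sketch below processes the 32 input bits MSB-first and assumes the common generator polynomial 0x07 (x⁸ + x² + x + 1), since the abstract does not state which polynomial is used.

```python
# Bit-serial reference model of CRC-8 over a 32-bit input word. The generator polynomial
# 0x07 (x^8 + x^2 + x + 1) is assumed; the abstract does not state which polynomial is used.
def crc8_32bit(word: int, poly: int = 0x07, init: int = 0x00) -> int:
    crc = init
    for i in range(31, -1, -1):                  # process the 32 input bits MSB first
        bit = (word >> i) & 1
        crc ^= bit << 7                          # bring the next message bit into the MSB
        if crc & 0x80:
            crc = ((crc << 1) ^ poly) & 0xFF     # divide by the generator polynomial
        else:
            crc = (crc << 1) & 0xFF
    return crc

print(hex(crc8_32bit(0xDEADBEEF)))               # reference value to compare the VHDL output against
```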
Procedia PDF Downloads 292
1229 DNA Multiplier: A Design Architecture of a Multiplier Circuit Using DNA Molecules
Authors: Hafiz Md. Hasan Babu, Khandaker Mohammad Mohi Uddin, Nitish Biswas, Sarreha Tasmin Rikta, Nuzmul Hossain Nahid
Abstract:
Nanomedicine and bioengineering use biological systems that can perform computing operations. In a biocomputational circuit, different types of biomolecules and DNA (Deoxyribonucleic Acid) are used as active components. DNA computing has the capability of performing parallel processing and has a large storage capacity, which makes it different from other computing systems. In most processors, the multiplier is treated as a core hardware block, and multiplication is one of the most time-consuming and lengthy tasks. In this paper, cost-effective DNA multipliers are designed using algorithms of molecular DNA operations and compared with conventional ones. The speed and storage capacity of a DNA multiplier are also much higher than those of a traditional silicon-based multiplier.
Keywords: biological systems, DNA multiplier, large storage, parallel processing
Procedia PDF Downloads 217