Search results for: validation techniques

7317 Land Cover Remote Sensing Classification Advanced Neural Networks Supervised Learning

Authors: Eiman Kattan

Abstract:

This study aims to evaluate the impact of classifying labelled remote sensing images with a convolutional neural network (CNN) architecture, i.e., AlexNet, on different land cover scenarios based on two remotely sensed datasets, considering both computational time and performance. A set of experiments was conducted to assess the effectiveness of the selected convolutional neural network using two implementation approaches, namely fully trained and fine-tuned. For validation purposes, two publicly available remote sensing datasets with different land cover features, AID and RSSCN7, were used in the experiments. These datasets offer a wide diversity of input data, number of classes, amount of labelled data, and texture patterns. A specifically designed interactive deep learning GPU training platform for image classification (NVIDIA DIGITS) was employed in the experiments and showed efficiency in training, validation, and testing. As a result, the fully trained approach achieved modest results on the two datasets, AID and RSSCN7, of 73.346% and 71.857% within 24 min 1 sec and 8 min 3 sec, respectively. However, a dramatic improvement in classification performance was recorded with the fine-tuning approach, reaching 92.5% and 91% within 24 min 44 sec and 8 min 41 sec, respectively. This conclusion opens opportunities for better classification performance in various applications such as agriculture and crop remote sensing.
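
As a rough illustration of the fine-tuning approach described above (not the authors' actual DIGITS configuration), the following PyTorch sketch loads an ImageNet-pretrained AlexNet, freezes its convolutional features, and replaces the final classifier layer to match the number of land-cover classes; the dataset path, class count, and hyperparameters are hypothetical.

```python
# Hedged sketch of AlexNet fine-tuning for land-cover classification.
# Assumes PyTorch/torchvision; the dataset path and class count are placeholders.
import torch
import torch.nn as nn
from torchvision import models, datasets, transforms

num_classes = 30  # e.g., AID has 30 scene classes; adjust for RSSCN7 (7 classes)

model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
for p in model.features.parameters():      # freeze the convolutional layers
    p.requires_grad = False
model.classifier[6] = nn.Linear(4096, num_classes)  # new output layer

tf = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
train_set = datasets.ImageFolder("AID/train", transform=tf)   # hypothetical path
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

optimizer = torch.optim.SGD(model.classifier.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:              # one pass shown; repeat for more epochs
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```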

Keywords: convolutional neural network, remote sensing, land cover, land use

Procedia PDF Downloads 343
7316 Metropolis-Hastings Sampling Approach for High Dimensional Testing Methods of Autonomous Vehicles

Authors: Nacer Eddine Chelbi, Ayet Bagane, Annie Saleh, Claude Sauvageau, Denis Gingras

Abstract:

As recently stated by the National Highway Traffic Safety Administration (NHTSA), to demonstrate the expected performance of a highly automated vehicle system, test approaches should include a combination of simulation, test track, and on-road testing. In this paper, we propose a new validation method for autonomous vehicles involving on-road tests (Field Operational Tests), test track (Test Matrix), and simulation (Worst Case Scenarios). We concentrate our discussion on the simulation aspects; in particular, we extend recent work based on Importance Sampling by using a Metropolis-Hastings sampling (MHS) algorithm to sample collected data from the Safety Pilot Model Deployment (SPMD) in lane-change scenarios. Our proposed MHS method is compared to the Importance Sampling method, which does not perform well in high-dimensional problems. The importance of this study is to obtain a sampler that could be applied to high-dimensional simulation problems in order to reduce and optimize the number of test scenarios necessary for the validation and certification of autonomous vehicles.
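
To make the sampling idea concrete, here is a minimal Metropolis-Hastings sketch in NumPy. It draws samples from a toy two-dimensional target density standing in for a lane-change parameter distribution (e.g., relative speed and gap); the target, proposal width, and parameter names are illustrative assumptions, not the SPMD-derived model used by the authors.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    """Toy log-density over (relative speed, gap) standing in for the
    lane-change parameter distribution estimated from naturalistic data."""
    mean = np.array([2.0, 15.0])
    cov_inv = np.linalg.inv(np.array([[1.0, 0.3], [0.3, 4.0]]))
    d = x - mean
    return -0.5 * d @ cov_inv @ d

def metropolis_hastings(log_p, x0, n_samples=10_000, step=0.5):
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_samples, x.size))
    for i in range(n_samples):
        proposal = x + step * rng.standard_normal(x.size)   # symmetric random walk
        # Accept with probability min(1, p(proposal) / p(x)).
        if np.log(rng.uniform()) < log_p(proposal) - log_p(x):
            x = proposal
        samples[i] = x
    return samples

chain = metropolis_hastings(log_target, x0=[0.0, 10.0])
print(chain[2000:].mean(axis=0))   # discard burn-in, inspect the sample mean
```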

Keywords: automated driving, autonomous emergency braking (AEB), autonomous vehicles, certification, evaluation, importance sampling, metropolis-hastings sampling, tests

Procedia PDF Downloads 259
7315 Differential Effect of Technique Majors on Isokinetic Strength in Youth Judoka Athletes

Authors: Chungyu Chen, Yi-Cheng Chen, Po-Hsian Hsu, Hsin-Ying Chen, Yen-Po Hsiao

Abstract:

The purpose of this study was to assess the muscular strength performance of the upper and lower extremities on an isokinetic system for youth judo players, and to compare strength differences between major techniques. Sixteen male and 20 female judo players (age: 16.7 ± 1.6 years, training age: 4.5 ± 0.8 years) served as volunteers for this study. Twenty-one players majored in hand techniques and 15 in foot techniques. The Biodex S4 Pro was used to assess the strength performance of the extensors and flexors in concentric action under load conditions of 30 degree/sec, 60 degree/sec, and 120 degree/sec for the elbow and knee joints. The strength parameters included the maximal torque, the normalized maximal torque, the average power, and the average maximal torque. A t-test for independent groups was used to evaluate whether hand-major and foot-major players differ significantly, with an alpha level of .05. The results showed that the maximal torque of the left knee extensor in foot-major players (243.5 ± 36.3 Nm) was significantly higher than in hand-major players (210.7 ± 21.0 Nm) under the load of 30 degree/sec (p < .05). There were no differences in upper extremity strength between the hand-major and foot-major players at the three loads (ps > .05). This indicates that judo players need to develop upper extremity strength overall to secure the execution of their major techniques.
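
For readers unfamiliar with the statistical comparison used here, the sketch below shows an independent-samples t-test with SciPy at an alpha of .05; the torque values are fabricated placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

# Placeholder peak-torque values (Nm) for left knee extensors at 30 deg/sec.
hand_major = np.array([205, 212, 198, 220, 215, 209, 214], dtype=float)
foot_major = np.array([238, 251, 229, 247, 243, 236], dtype=float)

t_stat, p_value = stats.ttest_ind(hand_major, foot_major, equal_var=True)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject H0: the groups differ in knee extensor strength.")
```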

Keywords: knee, elbow, power, judo

Procedia PDF Downloads 428
7314 Comparative Analysis of Edge Detection Techniques for Extracting Characters

Authors: Rana Gill, Chandandeep Kaur

Abstract:

Segmentation of images can be implemented using different fundamental algorithms such as edge detection (discontinuity-based segmentation), region growing (similarity-based segmentation), and iterative thresholding. A comprehensive literature review relevant to the study describes different techniques for vehicle number plate detection and the edge detection techniques widely used on different types of images. This research work is based on edge detection techniques and on calculating a threshold on the basis of five edge operators: Prewitt, Roberts, Sobel, LoG, and Canny. Segmentation of characters present in different types of images, such as vehicle number plates, house name plates, and sign boards, is selected as a case study in this work. The proposed methodology has seven stages and has been implemented using MATLAB R2010a. All five operators have been compared on the basis of their performance. The results show that the Canny operator produces the best results among the operators used, and the performance of the edge operators in decreasing order is: Canny > LoG > Sobel > Prewitt > Roberts.
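
A compact Python analogue of this comparison (using scikit-image and SciPy rather than the authors' MATLAB R2010a code) might look like the sketch below; the test image, Canny sigma, and LoG sigma are assumptions.

```python
import numpy as np
from scipy import ndimage
from skimage import data, feature, filters
from skimage.util import img_as_float

img = img_as_float(data.camera())   # stand-in for a number-plate image

edge_maps = {
    "Prewitt": filters.prewitt(img),
    "Roberts": filters.roberts(img),
    "Sobel":   filters.sobel(img),
    "LoG":     np.abs(ndimage.gaussian_laplace(img, sigma=2.0)),
    "Canny":   feature.canny(img, sigma=1.5).astype(float),
}

for name, edges in edge_maps.items():
    # Binarise each response map (Canny is already binary) and report edge density.
    binary = edges if name == "Canny" else edges > filters.threshold_otsu(edges)
    print(f"{name:8s} edge pixels: {np.mean(binary):.3%}")
```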

Keywords: segmentation, edge detection, text, extracting characters

Procedia PDF Downloads 409
7313 Passive Retrofitting Strategies for Windows in Hot and Humid Climate Vijayawada

Authors: Monica Anumula

Abstract:

Nowadays, human beings attain comfort artificially by heating, cooling, and lighting the spaces they live in; the main importance is given to the aesthetics of buildings, which are not designed to protect their occupants from the climate. They depend on artificial sources of energy, resulting in energy wastage. In order to reduce the amount of energy spent in the construction industry and to meet the Energy Package goals by 2020, new ways of constructing houses are required. The larger part of the energy consumption of a building is directly related to architectural aspects; hence, nature has to be integrated into the building design to attain the comfort zone and reduce the dependency on artificial sources of energy. The research develops bioclimatic design strategies and techniques for the walls and roofs of Vijayawada houses. Design strategies and techniques from cases with a similar climate, such as Kerala and Mangalore, are studied and analysed in this paper. Understanding the vernacular architecture and modern techniques of these cases and implementing them in the housing of Vijayawada not only decreases energy consumption but also enhances the socio-cultural values of Vijayawada. This study focuses on the comparison of vernacular techniques and modern bioclimatic building strategies to attain thermal comfort and energy reduction in a hot and humid climate. This research prompts further thinking on new strategies that combine vernacular and modern bioclimatic techniques.

Keywords: bioclimatic design, energy consumption, hot and humid climates, thermal comfort

Procedia PDF Downloads 153
7312 Use of Gaussian-Euclidean Hybrid Function Based Artificial Immune System for Breast Cancer Diagnosis

Authors: Cuneyt Yucelbas, Seral Ozsen, Sule Yucelbas, Gulay Tezel

Abstract:

Because only a small number of complex artificial immune system (AIS) approaches can work out nonlinear problems, nonlinear AIS approaches need to be developed among the well-known solution techniques. The Gaussian function is usually used for similarity estimation in classification problems and pattern recognition. In this study, diagnosis of breast cancer, the second most widespread cancer in women, was performed with different distance calculation functions, namely Euclidean, Gaussian, and a Gaussian-Euclidean hybrid function, in the clonal selection model of classical AIS on the Wisconsin Breast Cancer Dataset (WBCD), taken from the University of California, Irvine Machine Learning Repository. We used 3-fold cross-validation to train and test the dataset. According to the results, the maximum test classification accuracy was 97.35%, obtained using the Gaussian-Euclidean hybrid function on fold 3. The mean test classification accuracies were 94.78%, 94.45%, and 95.31% for the Euclidean, Gaussian, and Gaussian-Euclidean functions, respectively. With these results, the Gaussian-Euclidean hybrid function seems to be a potential distance calculation method and may be considered an alternative for hard nonlinear classification problems.
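
The exact hybrid form used by the authors is not given in the abstract; a plausible sketch of a Gaussian-Euclidean hybrid affinity for clonal-selection matching, with an assumed blending weight and kernel width, is shown below.

```python
import numpy as np

def euclidean(a, b):
    return np.linalg.norm(a - b)

def gaussian_similarity(a, b, sigma=1.0):
    """Gaussian kernel on the Euclidean distance (1 = identical antigens)."""
    return np.exp(-euclidean(a, b) ** 2 / (2.0 * sigma ** 2))

def hybrid_affinity(a, b, alpha=0.5, sigma=1.0):
    """Assumed Gaussian-Euclidean hybrid: a convex mix of scale-normalised
    Euclidean distance and (1 - Gaussian similarity); lower = better match."""
    d = euclidean(a, b) / np.sqrt(a.size)
    return alpha * d + (1.0 - alpha) * (1.0 - gaussian_similarity(a, b, sigma))

antibody = np.array([0.2, 0.7, 0.1, 0.4])
antigen = np.array([0.3, 0.6, 0.2, 0.4])   # e.g., a normalised WBCD sample
print(hybrid_affinity(antibody, antigen))
```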

Keywords: artificial immune system, breast cancer diagnosis, Euclidean function, Gaussian function

Procedia PDF Downloads 414
7311 Discovery of the Piano Extended Techniques by Focusing on Symbols That George Crumb Used in Makrokosmos Volumes

Authors: Parham Bakhtiari

Abstract:

George Crumb's Makrokosmos volumes are considered significant pieces in twentieth-century piano music and showcase the extensive use of different tones and extended techniques on the piano. Crumb's works are known for making references, particularly to music from previous eras, whose visual, aural, and numerical characteristics are symbolic in nature. Crumb created a list of symbols and abbreviated letters to clarify his unique directions to those who performed his compositions. Pianists who prepare to play Makrokosmos must dedicate time to studying and analyzing Crumb's markings diligently to accurately capture the composer's wishes. The aim of this paper is to provide a reference for pianists looking to perform George Crumb's Makrokosmos volumes. The research describes the unconventional playing techniques and discusses the music explored by the composer.

Keywords: music, piano, Crumb, Makrokosmos, performance

Procedia PDF Downloads 11
7310 Software Quality Assurance in Network Security using Cryptographic Techniques

Authors: Sidra Shabbir, Ayesha Manzoor, Mehreen Sirshar

Abstract:

The use of network communication has imposed serious threats to the security of assets over the network. Network security is becoming more prone to active and passive attacks, which may result in serious consequences for data integrity, confidentiality, and availability. Various cryptographic techniques have been proposed in the past few years to combat this problem by ensuring quality, but in order to have a fully secured network, a framework for a new cryptosystem was needed. This paper discusses certain cryptographic techniques that have shown far better improvement in network security with enhanced quality assurance. The scope of this research paper is to cover the security pitfalls in current systems and their possible solutions based on the new cryptosystems. The development of a new cryptosystem framework has paved the way to widespread network communications with enhanced quality in network security.
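
As a small, generic illustration of the kind of cryptographic building blocks discussed (and since elliptic curve cryptography appears in the keywords), the sketch below combines ECDH key agreement with AES-GCM authenticated encryption using the Python `cryptography` package; it is not the framework proposed in the paper, and the key sizes and labels are illustrative choices.

```python
from os import urandom
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Each party generates an ephemeral elliptic-curve key pair.
alice_key = ec.generate_private_key(ec.SECP256R1())
bob_key = ec.generate_private_key(ec.SECP256R1())

# ECDH: both sides derive the same shared secret from their own private key
# and the peer's public key.
alice_shared = alice_key.exchange(ec.ECDH(), bob_key.public_key())
bob_shared = bob_key.exchange(ec.ECDH(), alice_key.public_key())
assert alice_shared == bob_shared

# Derive a 256-bit AES key from the shared secret.
aes_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
               info=b"handshake").derive(alice_shared)

# AES-GCM provides confidentiality and integrity (authenticated encryption).
nonce = urandom(12)
ciphertext = AESGCM(aes_key).encrypt(nonce, b"sensitive payload", None)
plaintext = AESGCM(aes_key).decrypt(nonce, ciphertext, None)
print(plaintext)
```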

Keywords: cryptography, network security, encryption, decryption, integrity, confidentiality, security algorithms, elliptic curve cryptography

Procedia PDF Downloads 707
7309 Aerobic Bioprocess Control Using Artificial Intelligence Techniques

Authors: M. Caramihai, Irina Severin

Abstract:

This paper deals with the design of an intelligent control structure for a bioprocess of Hansenula polymorpha yeast cultivation. The objective of the process control is to produce biomass in a desired physiological state. The work demonstrates that the designed Hybrid Control Techniques (HCT) are able to recognize specific bioprocess evolution trajectories using neural networks trained specifically for this purpose, in order to estimate the model parameters and to adjust the overall bioprocess evolution through an expert system and a fuzzy structure. The design of the control algorithm, as well as its tuning through realistic simulations, is presented. Taking into consideration the synergism of different paradigms such as fuzzy logic, neural networks, and symbolic artificial intelligence (AI), we present a complete intelligent control architecture with application in bioprocess control.
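
A heavily simplified sketch of the fuzzy part of such a hybrid controller is given below: triangular membership functions over the biomass tracking error, a three-rule base, and weighted-average defuzzification. The membership breakpoints and feed-correction magnitudes are invented for illustration and are not the authors' tuned values.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return max(min((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

def fuzzy_feed_correction(biomass_error):
    """Map the biomass tracking error to a substrate feed correction (L/h)."""
    # Fuzzify: membership degrees for "negative", "zero", "positive" error.
    neg = tri(biomass_error, -2.0, -1.0, 0.0)
    zero = tri(biomass_error, -1.0, 0.0, 1.0)
    pos = tri(biomass_error, 0.0, 1.0, 2.0)
    # Rule base: negative error -> decrease feed, zero -> hold, positive -> increase.
    corrections = np.array([-0.1, 0.0, 0.1])   # hypothetical magnitudes
    weights = np.array([neg, zero, pos])
    # Defuzzify with a weighted average of singleton outputs.
    return float(np.dot(weights, corrections) / (weights.sum() + 1e-12))

print(fuzzy_feed_correction(0.6))   # positive error -> slightly increase the feed
```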

Keywords: bioprocess, intelligent control, neural nets, fuzzy structure, hybrid techniques

Procedia PDF Downloads 383
7308 Teaching, Learning and Evaluation Enhancement of Information Communication Technology Education in Schools through Pedagogical and E-Learning Techniques in the Sri Lankan Context

Authors: M. G. N. A. S. Fernando

Abstract:

This study uses a researchable framework to improve the quality of ICT education and the Teaching, Learning and Assessment/Evaluation (TLA/TLE) process. It utilizes existing resources while improving the methodologies, along with the pedagogical techniques and e-learning approaches used in the secondary schools of Sri Lanka. The study was carried out in two phases. Phase I focused on investigating the factors which affect the quality of ICT education. Based on the key factors identified in Phase I, Phase II focused on the design of an experimental application model with six activity levels. Each level in the activity model covers one or more levels in the revised Bloom's taxonomy. To further enhance the activity levels, other pedagogical techniques (activity-based learning, e-learning techniques, problem-solving activities, peer discussions, etc.) were incorporated into each level of the activity model as appropriate. The application model was validated by a panel of teachers, including a domain expert, and was also tested in the school environment. The validity of performance was demonstrated by testing six hypotheses and through other methodologies. The analysis shows that student performance with problem-solving activities increased by 19.5% due to the different treatment levels used. Compared to the existing process, it was also shown that the embedded techniques (a mixture of traditional and modern pedagogical methods and their applications) are more effective for the skills development of teachers and students.

Keywords: activity models, Bloom’s taxonomy, ICT education, pedagogies

Procedia PDF Downloads 138
7307 Digital Cinema Watermarking State of Art and Comparison

Authors: H. Kelkoul, Y. Zaz

Abstract:

Nowadays, the vigorous popularity of video processing techniques has resulted in explosive growth in the illegal use of multimedia data, so watermarking security has received much more attention. The purpose of this paper is to explore some watermarking techniques in order to observe their specificities and to select the finest methods to apply in the digital cinema domain against movie piracy, by creating an invisible watermark that includes the date, time, and place where the hacking was done. We have studied three principal watermarking techniques in the frequency domain: spread spectrum, the wavelet transform domain, and the digital cinema watermarking transform domain. In this paper, a detailed technique is presented where embedding is performed using the direct-sequence spread spectrum technique in the DWT transform domain. Experimental results show that the algorithm provides high robustness and good imperceptibility.
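
A minimal sketch of the embedding idea (a direct-sequence spread-spectrum mark added in the DWT domain) using PyWavelets is shown below; the wavelet, gain, and watermark length are assumptions, and a real digital-cinema implementation would also encode the date, time, and location payload robustly.

```python
import numpy as np
import pywt

rng = np.random.default_rng(7)

def embed_watermark(image, gain=2.0, wm_len=1024):
    """Embed a +/-1 pseudo-noise sequence into the horizontal detail band."""
    cA, (cH, cV, cD) = pywt.dwt2(image.astype(float), "haar")
    pn = rng.choice([-1.0, 1.0], size=wm_len)          # spreading sequence
    flat = cH.ravel()
    flat[:wm_len] += gain * pn                          # additive embedding
    marked = pywt.idwt2((cA, (flat.reshape(cH.shape), cV, cD)), "haar")
    return marked, pn

def detect_watermark(marked, original, pn, gain=2.0):
    """Informed correlation detector on the detail band (about 1 when present)."""
    _, (cH_m, _, _) = pywt.dwt2(marked.astype(float), "haar")
    _, (cH_o, _, _) = pywt.dwt2(original.astype(float), "haar")
    diff = cH_m.ravel()[:pn.size] - cH_o.ravel()[:pn.size]
    return float(np.dot(diff, pn) / (gain * pn.size))

frame = rng.integers(0, 256, size=(256, 256)).astype(float)   # stand-in frame
marked, pn = embed_watermark(frame)
print("correlation:", detect_watermark(marked, frame, pn))
```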

Keywords: digital cinema, watermarking, wavelet DWT, spread spectrum, JPEG2000 MPEG4

Procedia PDF Downloads 233
7306 Comparison between Pushover Analysis Techniques and Validation of the Simplified Modal Pushover Analysis

Authors: N. F. Hanna, A. M. Haridy

Abstract:

One of the main drawbacks of the Modal Pushover Analysis (MPA) is the need to perform nonlinear time-history analysis, which complicates the analysis method and increases its time. A simplified version of the MPA has been proposed based on the concept of the inelastic deformation ratio. Furthermore, the effect of the higher modes of vibration is considered by assuming linearly elastic responses, which enables the use of standard elastic response spectrum analysis. In this study, the simplified MPA (SMPA) method is applied to determine the target global drift and the inter-story drifts of a steel frame building. The effect of the higher vibration modes is considered within the framework of the SMPA. A comprehensive survey of the inelastic deformation ratio is presented. A suitable expression for the inelastic deformation ratio is then selected from the literature and implemented in the SMPA. The seismic demands estimated using the SMPA, such as target drift, base shear, and inter-story drifts, are compared with the seismic responses determined by applying the standard MPA. The accuracy of the estimated seismic demands is validated by comparison with the results obtained by nonlinear time-history analysis using real earthquake records.

Keywords: modal analysis, pushover analysis, seismic performance, target displacement

Procedia PDF Downloads 341
7305 Data Mining Techniques for Anti-Money Laundering

Authors: M. Sai Veerendra

Abstract:

Today, money laundering (ML) poses a serious threat not only to financial institutions but also to nations. This criminal activity is becoming more and more sophisticated and seems to have moved beyond the cliché of drug trafficking to financing terrorism, not to mention personal gain. Most financial institutions internationally have been implementing anti-money laundering (AML) solutions to fight investment fraud activities. However, traditional investigative techniques consume numerous man-hours. Recently, data mining approaches have been developed and are considered well-suited techniques for detecting ML activities. Within the scope of a collaboration project on developing a new data mining solution for AML units in an international investment bank in Ireland, we survey recent data mining approaches for AML. In this paper, we present not only these approaches but also give an overview of the important factors in building data mining solutions for AML activities.
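
A minimal sketch of one of the surveyed ideas, unsupervised clustering of account behaviour to flag outliers for AML review, is shown below with scikit-learn; the features, cluster count, and thresholds are invented for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Hypothetical per-account features: [avg transfer amount, transfers/day,
# share of cross-border transfers, share of cash deposits].
normal = rng.normal([2_000, 3, 0.05, 0.10], [500, 1, 0.02, 0.05], size=(500, 4))
suspicious = rng.normal([9_500, 20, 0.60, 0.70], [1_000, 5, 0.10, 0.10], size=(10, 4))
X = StandardScaler().fit_transform(np.vstack([normal, suspicious]))

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
# Distance of each account to its own cluster centre; large = atypical behaviour.
dist = np.linalg.norm(X - kmeans.cluster_centers_[kmeans.labels_], axis=1)
flagged = np.argsort(dist)[-10:]           # send the top outliers to AML analysts
print("accounts flagged for review:", flagged)
```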

Keywords: data mining, clustering, money laundering, anti-money laundering solutions

Procedia PDF Downloads 516
7304 Multi-Scaled Non-Local Means Filter for Medical Images Denoising: Empirical Mode Decomposition vs. Wavelet Transform

Authors: Hana Rabbouch

Abstract:

In recent years, there has been considerable growth in denoising techniques, mainly devoted to medical imaging. This important evolution is due not only to the progress of computing techniques but also to the emergence of multi-resolution analysis (MRA) on both mathematical and algorithmic bases. In this paper, a comparative study is conducted between the two best-known MRA-based decomposition techniques: the Empirical Mode Decomposition (EMD) and the Discrete Wavelet Transform (DWT). The comparison is carried out in a framework of multi-scale denoising, where a Non-Local Means (NLM) filter is applied scale by scale to a sample of benchmark medical images. The results prove the effectiveness of the multiscaled denoising, especially when the NLM filtering is coupled with the EMD.
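
A minimal sketch of the DWT variant of this pipeline (wavelet decomposition, NLM filtering scale by scale, reconstruction) is shown below with PyWavelets and scikit-image; the wavelet, level count, and filter parameters are assumptions rather than the paper's settings.

```python
import numpy as np
import pywt
from skimage import data, util
from skimage.restoration import denoise_nl_means, estimate_sigma

noisy = util.random_noise(data.camera(), var=0.01)   # stand-in medical image

def nlm(band):
    sigma = estimate_sigma(band)
    return denoise_nl_means(band, h=1.15 * sigma, patch_size=5,
                            patch_distance=6, fast_mode=True)

# Decompose, denoise each sub-band (scale) separately, then reconstruct.
coeffs = pywt.wavedec2(noisy, "db2", level=2)
denoised_coeffs = [nlm(coeffs[0])]                    # approximation band
for detail_level in coeffs[1:]:
    denoised_coeffs.append(tuple(nlm(band) for band in detail_level))
denoised = pywt.waverec2(denoised_coeffs, "db2")
print(denoised.shape)
```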

Keywords: medical imaging, non local means, denoising, multiscaled analysis, empirical mode decomposition, wavelets

Procedia PDF Downloads 118
7303 Predicting Susceptibility to Coronary Artery Disease using Single Nucleotide Polymorphisms with a Large-Scale Data Extraction from PubMed and Validation in an Asian Population Subset

Authors: K. H. Reeta, Bhavana Prasher, Mitali Mukerji, Dhwani Dholakia, Sangeeta Khanna, Archana Vats, Shivam Pandey, Sandeep Seth, Subir Kumar Maulik

Abstract:

Introduction: Research has demonstrated a connection between coronary artery disease (CAD) and genetics. We performed deep literature mining, using both bioinformatics and manual efforts, to identify polymorphisms conferring susceptibility to coronary artery disease, and sought to validate these findings in an Asian population. Methodology: In the first phase, we used an automated pipeline which organizes and presents structured information on SNPs, populations, and diseases. The information was obtained by applying Natural Language Processing (NLP) techniques to approximately 28 million PubMed abstracts. To accomplish this, we utilized Python scripts to extract and curate disease-related data, filter out false positives, and categorize them into 24 hierarchical groups using Named Entity Recognition (NER) algorithms. From this extensive search, a total of 466 unique PubMed Identifiers (PMIDs) and 694 Single Nucleotide Polymorphisms (SNPs) related to coronary artery disease (CAD) were identified. To refine the selection, a thorough manual examination of all the studies was carried out. Specifically, SNPs that demonstrated susceptibility to CAD and exhibited a positive Odds Ratio (OR) were selected, and a final pool of 324 SNPs was compiled. The next phase involved validating the identified SNPs in DNA samples of 96 CAD patients and 37 healthy controls from an Indian population using a Global Screening Array. Results: Out of the 324 SNPs, only 108 were detected on the array, and four of these showed a significant difference in minor allele frequency between cases and controls. These were rs187238 of the IL-18 gene, rs731236 of the VDR gene, rs11556218 of the IL16 gene, and rs5882 of the CETP gene. Prior research has reported associations of these SNPs with various pathways such as endothelial damage, susceptibility linked to vitamin D receptor (VDR) polymorphisms, and reduction of HDL-cholesterol levels, ultimately leading to the development of CAD. Among these, only rs731236 had been studied in an Indian population, and only in diabetes and vitamin D deficiency. For the first time, these SNPs are reported to be associated with CAD in an Indian population. Conclusion: This pool of 324 SNPs is a unique resource that can help to uncover risk associations in CAD. Here, we validated it in an Indian population. Further validation in different populations may offer valuable insights and contribute to the development of a screening tool, enabling the implementation of primary prevention strategies targeted at the vulnerable population.
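
The NLP pipeline itself is not published in the abstract, but a toy sketch of one of its ingredients, pulling rsIDs out of abstract text with a regular expression before NER-based disease tagging, might look like this; the example sentences are invented.

```python
import re
from collections import Counter

# Hypothetical abstract snippets standing in for PubMed records.
abstracts = [
    "The IL-18 promoter variant rs187238 was associated with CAD (OR 1.4).",
    "VDR polymorphism rs731236 showed no association in controls.",
    "We genotyped rs11556218 and rs5882 in 500 cases of coronary artery disease.",
]

RSID = re.compile(r"\brs\d+\b")
CAD_TERMS = re.compile(r"coronary artery disease|\bCAD\b", re.IGNORECASE)

snp_counts = Counter()
for text in abstracts:
    if CAD_TERMS.search(text):              # crude stand-in for NER disease tagging
        snp_counts.update(RSID.findall(text))

print(snp_counts)   # candidate SNP-disease co-mentions for manual curation
```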

Keywords: coronary artery disease, single nucleotide polymorphism, susceptible SNP, bioinformatics

Procedia PDF Downloads 49
7302 Spatial Climate Changes in the Province of Macerata, Central Italy, Analyzed by GIS Software

Authors: Matteo Gentilucci, Marco Materazzi, Gilberto Pambianchi

Abstract:

Climate change is an increasingly central issue in the world because it affects many human activities. In this context, regional studies are of great importance because they sometimes differ from the general trend. This research focuses on a small area of central Italy which overlooks the Adriatic Sea, the province of Macerata. The aim is to analyze spatial climate changes, for precipitation and temperature, over the last three climatological standard normals (1961-1990; 1971-2000; 1981-2010) through GIS software. The data collected from 30 weather stations for temperature and 61 rain gauges for precipitation were subjected to quality controls: validation and homogenization. These data were fundamental for the spatialization of the variables (temperature and precipitation) through geostatistical techniques. To assess the best geostatistical technique for interpolation, cross-validation results were used. The co-kriging method with altitude as the independent variable produced the best cross-validation results for all time periods among the methods analysed, with 'root mean square error standardized' close to 1, 'mean standardized error' close to 0, and 'average standard error' and 'root mean square error' with similar values. The maps resulting from the analysis were compared by subtraction between rasters, producing three maps of annual variation and three further maps for each month of the year (1961/1990-1971/2000; 1971/2000-1981/2010; 1961/1990-1981/2010). The results show an increase in average annual temperature of about 0.1 °C between 1961-1990 and 1971-2000 and 0.6 °C between 1961-1990 and 1981-2010. Annual precipitation instead shows an opposite trend, with an average difference from 1961-1990 to 1971-2000 of about 35 mm and from 1961-1990 to 1981-2010 of about 60 mm. Furthermore, the differences between the areas have been highlighted with area graphs and summarized in several tables as descriptive analysis. For temperature between 1961-1990 and 1971-2000, the most areally represented frequency is 0.08 °C (77.04 km² out of a total of about 2800 km²), with a kurtosis of 3.95 and a skewness of 2.19. The differences for temperature from 1961-1990 to 1981-2010 instead show a most areally represented frequency of 0.83 °C (36.9 km²), with a kurtosis of -0.45 and a skewness of 0.92. Therefore it can be said that the distribution is more peaked for 1961/1990-1971/2000 and smoother, but with stronger growth, for 1961/1990-1981/2010. In contrast, precipitation shows a very similar shape of distribution, although with different intensities, for both variation periods (1961/1990-1971/2000 and 1961/1990-1981/2010), with similar values of kurtosis (1st = 1.93; 2nd = 1.34), skewness (1st = 1.81; 2nd = 1.62), and area of the most represented frequency (1st = 60.72 km²; 2nd = 52.80 km²). In conclusion, this methodology of analysis allows the assessment of small-scale climate change for each month of the year and could be further investigated in relation to regional atmospheric dynamics.
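
The map-differencing step described above reduces to a raster subtraction once the interpolated surfaces share a common grid; a NumPy sketch of that step (with synthetic grids standing in for the co-kriged outputs, and an assumed cell size) is shown below.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic 100 x 100 mean-annual-temperature grids (deg C) standing in for the
# co-kriged surfaces of two climatological normals on the same grid.
temp_1961_1990 = 13.0 + rng.normal(0.0, 0.5, size=(100, 100))
temp_1981_2010 = temp_1961_1990 + 0.6 + rng.normal(0.0, 0.1, size=(100, 100))

diff = temp_1981_2010 - temp_1961_1990        # cell-by-cell change map
cell_area_km2 = 0.28                          # hypothetical cell size

print(f"mean change: {diff.mean():+.2f} deg C")
# Area (km^2) in each 0.1 deg C change class, i.e., the areally represented frequency.
bins = np.arange(0.0, 1.2, 0.1)
area_per_class = np.histogram(diff, bins=bins)[0] * cell_area_km2
for lo, area in zip(bins[:-1], area_per_class):
    print(f"{lo:.1f}-{lo + 0.1:.1f} deg C: {area:7.1f} km^2")
```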

Keywords: climate change, GIS, interpolation, co-kriging

Procedia PDF Downloads 101
7301 Review of Malaria Diagnosis Techniques

Authors: Lubabatu Sada Sodangu

Abstract:

Malaria is a major cause of death in tropical and subtropical nations. Malaria cases are continually rising as a result of a number of factors, even though the condition is now treatable using effective methods. In this situation, quick and effective diagnostic methods are essential for the management and control of malaria. Malaria diagnosis using conventional methods is still troublesome; hence, new technologies have been created and implemented to get around their drawbacks. This review describes the currently known malaria diagnostic techniques and their strengths and shortcomings.

Keywords: malaria, technique, diagnosis, Africa

Procedia PDF Downloads 29
7299 Cost Reduction Techniques for Provision of Shelter to Homeless

Authors: Mukul Anand

Abstract:

Quality-oriented affordable shelter for all has always been the key issue in the housing sector of our country. Homelessness is the most acute form of housing need. It is a paradox that, in spite of innumerable government-initiated programmes for affordable housing, a certain section of society is still devoid of shelter. About nineteen million (18.78 million) households grappled with housing shortage in urban India in 2012. In the Indian scenario, there is a major mismatch between the people for whom the houses are being built and those who need them. The prime challenge faced by public authorities in facilitating quality housing for all is the high cost of construction. The present paper sets out practical techniques for reducing the cost factor in housing the homeless. The key factors responsible for the delivery of cheap housing stock, such as capacity building, resource optimization, innovative low-cost building materials, and indigenous skeleton housing systems, will also be incorporated in developing these techniques. Time performance, which is an important aspect of the above factors, will also be explored so as to increase the effectiveness of low-cost housing. Along with this, best practices will be taken up as case studies, citing both conventional housing techniques and innovative low-cost housing techniques. Transportation accounts for approximately 30% of the total construction budget; thus, the use of alternative local solutions depending on the region is covered so as to highlight the major components of low-cost housing. The government has been lax regarding baseline information on the use of innovative low-cost methods and resource optimization techniques. Therefore, the paper attempts to bring to light simpler solutions for achieving low-cost housing.

Keywords: construction, cost, housing, optimization, shelter

Procedia PDF Downloads 417
7298 Fake News Detection for Korean News Using Machine Learning Techniques

Authors: Tae-Uk Yun, Pullip Chung, Kee-Young Kwahk, Hyunchul Ahn

Abstract:

Fake news is defined as news articles that are intentionally and verifiably false and could mislead readers. The spread of fake news may provoke anxiety, chaos, fear, or irrational decisions by the public. Thus, detecting fake news and preventing its spread has become a very important issue in our society. However, due to the huge amount of fake news produced every day, it is almost impossible for humans to identify it all. In this context, researchers have tried to develop automated fake news detection using machine learning techniques over the past years. However, to the best of our knowledge, no prior study has proposed an automated fake news detection method for Korean news. In this study, we aim to detect Korean fake news using text mining and machine learning techniques. Our proposed method consists of two steps. In the first step, the news content to be analyzed is converted into quantified values using various text mining techniques (topic modeling, TF-IDF, and so on). In step 2, classifiers are trained using the values produced in step 1. As the classifiers, machine learning techniques such as logistic regression, backpropagation networks, support vector machines, and deep neural networks can be applied. To validate the effectiveness of the proposed method, we collected about 200 short Korean news articles from Seoul National University's FactCheck, which provides detailed analysis reports from 20 media outlets and links to source documents for each case. Using this dataset, we will identify which text features are important as well as which classifiers are effective in detecting Korean fake news.
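
A minimal sketch of steps 1 and 2 combined, TF-IDF features feeding a logistic regression classifier with cross-validation, is shown below with scikit-learn; the example headlines and labels are placeholders, since the FactCheck corpus is not reproduced here.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Placeholder corpus; in the study, ~200 short Korean news items from FactCheck.
texts = ["Government confirms new policy after review",
         "Miracle cure discovered, doctors hate it",
         "Stock market closes higher on trade news",
         "Celebrity secretly replaced by clone, insiders say"] * 25
labels = [0, 1, 0, 1] * 25                          # 0 = real, 1 = fake

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=2),  # step 1: quantify the text
    LogisticRegression(max_iter=1000),              # step 2: train a classifier
)
scores = cross_val_score(model, texts, labels, cv=5, scoring="accuracy")
print(f"5-fold accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```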

Keywords: fake news detection, Korean news, machine learning, text mining

Procedia PDF Downloads 246
7297 RP-HPLC Method Development and Its Validation for Simultaneous Estimation of Metoprolol Succinate and Olmesartan Medoxomil Combination in Bulk and Tablet Dosage Form

Authors: S. Jain, R. Savalia, V. Saini

Abstract:

A simple, accurate, precise, sensitive and specific RP-HPLC method was developed and validated for simultaneous estimation of Metoprolol Succinate and Olmesartan Medoxomil in bulk and tablet dosage form. The RP-HPLC method has shown adequate separation for Metoprolol Succinate and Olmesartan Medoxomil from its degradation products. The separation was achieved on a Phenomenex luna ODS C18 (250mm X 4.6mm i.d., 5μm particle size) with an isocratic mixture of acetonitrile: 50mM phosphate buffer pH 4.0 adjusted with glacial acetic acid in the ratio of 55:45 v/v. The mobile phase at a flow rate of 1.0ml/min, Injection volume 20μl and wavelength of detection was kept at 225nm. The retention time for Metoprolol Succinate and Olmesartan Medoxomil was 2.451±0.1min and 6.167±0.1min, respectively. The linearity of the proposed method was investigated in the range of 5-50μg/ml and 2-20μg/ml for Metoprolol Succinate and Olmesartan Medoxomil, respectively. Correlation coefficient was 0.999 and 0.9996 for Metoprolol Succinate and Olmesartan Medoxomil, respectively. The limit of detection was 0.2847μg/ml and 0.1251μg/ml for Metoprolol Succinate and Olmesartan Medoxomil, respectively and the limit of quantification was 0.8630μg/ml and 0.3793μg/ml for Metoprolol and Olmesartan, respectively. Proposed methods were validated as per ICH guidelines for linearity, accuracy, precision, specificity and robustness for estimation of Metoprolol Succinate and Olmesartan Medoxomil in commercially available tablet dosage form and results were found to be satisfactory. Thus the developed and validated stability indicating method can be used successfully for marketed formulations.
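
For readers who want to reproduce the linearity and detection-limit arithmetic, the ICH-style computation (LOD = 3.3·σ/S and LOQ = 10·σ/S from the calibration residuals and slope) can be sketched as below; the calibration points are fabricated, not the reported data.

```python
import numpy as np

# Hypothetical calibration data for one analyte: concentration (ug/ml) vs. peak area.
conc = np.array([5, 10, 20, 30, 40, 50], dtype=float)
area = np.array([152, 300, 603, 897, 1205, 1498], dtype=float)

slope, intercept = np.polyfit(conc, area, 1)
predicted = slope * conc + intercept
residual_sd = np.sqrt(np.sum((area - predicted) ** 2) / (len(conc) - 2))
r = np.corrcoef(conc, area)[0, 1]

lod = 3.3 * residual_sd / slope        # ICH Q2(R1) limit of detection
loq = 10.0 * residual_sd / slope       # ICH Q2(R1) limit of quantification
print(f"r = {r:.4f}, LOD = {lod:.3f} ug/ml, LOQ = {loq:.3f} ug/ml")
```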

Keywords: metoprolol succinate, olmesartan medoxomil, RP-HPLC method, validation, ICH

Procedia PDF Downloads 290
7296 Empirical Roughness Progression Models of Heavy Duty Rural Pavements

Authors: Nahla H. Alaswadko, Rayya A. Hassan, Bayar N. Mohammed

Abstract:

Empirical deterministic models have been developed to predict the roughness progression of heavy duty spray sealed pavements for a dataset representing rural arterial roads. The dataset provides a good representation of the relevant network and covers a wide range of operating and environmental conditions. A large sample of historical time series data for many pavement sections was collected and prepared for use in multilevel regression analysis. The modelling parameters include road roughness as the performance parameter, and traffic loading, time, initial pavement strength, reactivity level of subgrade soil, climate condition, and condition of the drainage system as predictor parameters. The purpose of this paper is to report the approaches adopted for model development and validation. The study presents multilevel models that can account for the correlation among time series data of the same section and capture the effect of unobserved variables. Study results show that the models fit the data very well. The contribution and significance of the relevant influencing factors in predicting roughness progression are presented and explained. The paper concludes that the analysis approach used for developing the models confirmed their accuracy and reliability through a good fit to the validation data.
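
A bare-bones sketch of such a multilevel (mixed-effects) roughness model, with a random intercept per pavement section to capture the within-section correlation, is shown below using statsmodels; the variable names and synthetic data are placeholders for the rural arterial dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)

# Synthetic panel: repeated roughness observations for 40 pavement sections.
sections = np.repeat(np.arange(40), 10)
age = np.tile(np.arange(10), 40).astype(float)          # years since sealing
traffic = rng.uniform(0.1, 2.0, size=40)[sections]      # loading index, section level
section_effect = rng.normal(0.0, 0.3, size=40)[sections]
iri = (1.5 + 0.06 * age + 0.25 * traffic * age + section_effect
       + rng.normal(0.0, 0.1, size=sections.size))      # roughness (IRI, m/km)

df = pd.DataFrame({"iri": iri, "age": age, "traffic": traffic, "section": sections})

# Random intercept for each section accounts for correlation within its time series.
model = smf.mixedlm("iri ~ age + traffic:age", df, groups=df["section"])
result = model.fit()
print(result.summary())
```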

Keywords: roughness progression, empirical model, pavement performance, heavy duty pavement

Procedia PDF Downloads 146
7295 A Comparative Analysis of Various Companding Techniques Used to Reduce PAPR in VLC Systems

Authors: Arushi Singh, Anjana Jain, Prakash Vyavahare

Abstract:

Recently, Li-Fi (light fidelity) has been launched based on the visible light communication (VLC) technique and is 100 times faster than WiFi, and the 5G mobile communication system is now proposed to use VLC-OFDM as its transmission technique. The VLC system, which operates on visible light, is considered for efficient spectrum use and easy intensity modulation through LEDs. The reason for the high speed of VLC is the LED, which can flicker incredibly fast (on the order of MHz). Another advantage of employing LEDs is that they act as low-pass filters, resulting in no out-of-band emission. The VLC system falls under the category of 'green technology' for utilizing LEDs. In the present scenario, OFDM is used for high data rates, interference immunity, and high spectral efficiency. In spite of these advantages, OFDM suffers from large PAPR, ICI among carriers, and frequency offset errors. Since the data transmission technique used in the VLC system is OFDM, the system suffers the drawbacks of both OFDM and VLC: the non-linearity due to the non-linear characteristics of the LED, and the large PAPR of OFDM, which drives the high-power amplifier into its non-linear region. The proposed paper focuses on the reduction of PAPR in VLC-OFDM systems. Many techniques have been applied to reduce PAPR, such as clipping, which introduces distortion in the carrier; selective mapping, which wastes bandwidth; and partial transmit sequence, which is very complex due to the exponentially increased number of sub-blocks. The paper discusses three companding techniques, namely µ-law, A-law, and an advanced A-law companding technique. The analysis shows that the advanced A-law companding technique reduces the PAPR of the signal by adjusting the companding parameter within its range. VLC-OFDM systems are the future of wireless communication, but non-linearity in VLC-OFDM is a severe issue. The proposed paper discusses techniques to reduce PAPR, one of the non-linearities of the system. The companding techniques discussed in this paper provide better results without increasing the complexity of the system.
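
To make the companding idea concrete, the sketch below applies µ-law compression to a toy OFDM-like signal and compares the PAPR before and after; µ = 255 and the 64-subcarrier QPSK setup are illustrative choices, and the paper's advanced A-law variant is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)

def papr_db(x):
    power = np.abs(x) ** 2
    return 10.0 * np.log10(power.max() / power.mean())

def mu_law_compress(x, mu=255.0):
    """Classic mu-law compander applied to the peak-normalised signal."""
    peak = np.max(np.abs(x))
    y = np.sign(x) * np.log1p(mu * np.abs(x) / peak) / np.log1p(mu)
    return y * peak

# Toy baseband OFDM symbol: 64 QPSK subcarriers passed through an IFFT
# (real part shown, as intensity-modulated VLC carries a real signal).
symbols = (rng.choice([-1, 1], 64) + 1j * rng.choice([-1, 1], 64)) / np.sqrt(2)
signal = (np.fft.ifft(symbols) * np.sqrt(64)).real

print(f"PAPR before companding: {papr_db(signal):5.2f} dB")
print(f"PAPR after mu-law:      {papr_db(mu_law_compress(signal)):5.2f} dB")
```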

Keywords: non-linear companding techniques, peak to average power ratio (PAPR), visible light communication (VLC), VLC-OFDM

Procedia PDF Downloads 264
7294 Recommendations Using Online Water Quality Sensors for Chlorinated Drinking Water Monitoring at Drinking Water Distribution Systems Exposed to Glyphosate

Authors: Angela Maria Fasnacht

Abstract:

The detection of anomalies due to the presence of contaminants, also known as early detection, has become a critical point in water treatment plants that deserves in-depth study for its improvement and adaptation to current requirements. The design of these systems requires detailed analysis and processing of the data in real time, so it is necessary to apply various statistical methods appropriate to the data generated, such as Spearman's correlation, factor analysis, cross-correlation, and k-fold cross-validation. Statistical analysis and methods allow the evaluation of large data sets to model the behavior of variables; in this sense, statistical treatment can be considered a vital step in developing advanced machine-learning-based models that allow optimized real-time data management, applied to early detection systems in water treatment processes. These techniques facilitate the development of new technologies used in advanced sensors. In this work, these methods were applied to identify possible correlations between the measured parameters and the presence of the glyphosate contaminant in a single-pass system. The interaction of the initial concentration of glyphosate and the location of the sensors with the readings of the reported parameters was also studied.
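
A small sketch of two of the named methods, Spearman's correlation between each sensor parameter and glyphosate concentration and k-fold cross-validation of a simple predictive model, is given below; the sensor readings are synthetic placeholders and the parameter names are assumptions.

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold, cross_val_score

rng = np.random.default_rng(4)
n = 300

# Synthetic online-sensor readings: conductivity, pH, ORP, turbidity, free chlorine.
X = rng.normal(size=(n, 5))
glyphosate = 2.0 * X[:, 2] - 1.5 * X[:, 4] + rng.normal(scale=0.5, size=n)  # toy mg/L

names = ["conductivity", "pH", "ORP", "turbidity", "chlorine"]
for i, name in enumerate(names):
    rho, p = spearmanr(X[:, i], glyphosate)
    print(f"{name:12s} Spearman rho = {rho:+.2f} (p = {p:.3f})")

cv = KFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(RandomForestRegressor(n_estimators=200, random_state=0),
                         X, glyphosate, cv=cv, scoring="r2")
print(f"5-fold R^2: {scores.mean():.2f} +/- {scores.std():.2f}")
```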

Keywords: glyphosate, emergent contaminants, machine learning, probes, sensors, predictive

Procedia PDF Downloads 96
7293 Multi-Level Air Quality Classification in China Using Information Gain and Support Vector Machine

Authors: Bingchun Liu, Pei-Chann Chang, Natasha Huang, Dun Li

Abstract:

Machine learning and data mining are two important tools for extracting useful information and knowledge from large datasets. In machine learning, classification is a widely used technique to predict qualitative variables and is generally preferred over regression from an operational point of view. Due to the enormous increase in air pollution in various countries, especially China, air quality classification has become one of the most important topics in air quality research and modelling. This study introduces a hybrid classification model based on information theory and the Support Vector Machine (SVM), using the air quality data of four cities in China, namely Beijing, Guangzhou, Shanghai, and Tianjin, from Jan 1, 2014 to April 30, 2016. China's Ministry of Environmental Protection classifies daily air quality into six levels, namely Serious Pollution, Severe Pollution, Moderate Pollution, Light Pollution, Good, and Excellent, based on the respective Air Quality Index (AQI) values. Using information theory, the information gain (IG) is calculated and feature selection is done for both categorical features and continuous numeric features. Then the SVM machine learning algorithm is implemented on the selected features with cross-validation. The final evaluation reveals that the IG and SVM hybrid model performs better than the SVM (alone), Artificial Neural Network (ANN), and K-Nearest Neighbours (KNN) models in terms of accuracy as well as complexity.
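
A compact approximation of the hybrid model, information-gain-style feature scoring via mutual information followed by an SVM with cross-validation, is sketched below with scikit-learn; the pollutant features and synthetic AQI levels are placeholders for the four-city dataset.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(6)
n = 800

# Synthetic daily records: PM2.5, PM10, SO2, NO2, CO, O3 plus two noise columns.
X = rng.normal(size=(n, 8))
aqi_score = 2.0 * X[:, 0] + 1.2 * X[:, 1] + 0.8 * X[:, 3] + rng.normal(scale=0.5, size=n)
y = np.digitize(aqi_score, np.quantile(aqi_score, [1/6, 2/6, 3/6, 4/6, 5/6]))  # 6 levels

model = make_pipeline(
    StandardScaler(),
    SelectKBest(score_func=mutual_info_classif, k=5),   # information-gain-style selection
    SVC(kernel="rbf", C=10.0, gamma="scale"),
)
scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
print(f"5-fold accuracy: {scores.mean():.3f}")
```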

Keywords: machine learning, air quality classification, air quality index, information gain, support vector machine, cross-validation

Procedia PDF Downloads 207
7292 Assessment of Pre-Processing Influence on Near-Infrared Spectra for Predicting the Mechanical Properties of Wood

Authors: Aasheesh Raturi, Vimal Kothiyal, P. D. Semalty

Abstract:

We studied the mechanical properties of Eucalyptus tereticornis using FT-NIR spectroscopy. First, the spectra were pre-processed to eliminate useless information; then, a prediction model was constructed by partial least squares regression. To study the influence of pre-processing on the prediction of mechanical properties in NIR analysis of wood samples, we applied various pre-treatment methods, such as straight line subtraction, constant offset elimination, vector normalization, min-max normalization, multiplicative scatter correction, first derivative, second derivative, and their combinations with other treatments, such as first derivative + straight line subtraction, first derivative + vector normalization, and first derivative + multiplicative scatter correction. By combining the pre-processing methods with different NIR regions, the RMSECV, RMSEP, and optimum factors/rank were obtained through the optimization process of model development. More than 350 combinations were obtained during the optimization process. More than one pre-processing method gave good calibration/cross-validation and prediction/test models, but only the best calibration/cross-validation and prediction/test models are reported here. The results show that one can safely use the NIR region between 4000 and 7500 cm⁻¹ with the straight line subtraction, constant offset elimination, first derivative, and second derivative pre-processing methods, which were found to be the most appropriate for model development.
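
A condensed sketch of this workflow, a first-derivative (Savitzky-Golay) pre-treatment followed by PLS regression with cross-validated RMSE, is given below with SciPy and scikit-learn on synthetic spectra; the window size, polynomial order, factor count, and the modelled property are assumptions.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(8)
n_samples, n_points = 120, 500               # spectra restricted to one NIR region

# Synthetic absorbance spectra whose shape depends on a mechanical property (toy MOE).
moe = rng.uniform(8, 18, size=n_samples)                       # GPa, toy values
band = np.exp(-((np.arange(n_points) - 250) / 80.0) ** 2)
spectra = moe[:, None] * band[None, :] + rng.normal(scale=0.05,
                                                    size=(n_samples, n_points))

# Pre-processing: first derivative via Savitzky-Golay smoothing along each spectrum.
X = savgol_filter(spectra, window_length=11, polyorder=2, deriv=1, axis=1)

pls = PLSRegression(n_components=6)
pred = cross_val_predict(pls, X, moe, cv=10).ravel()
rmsecv = np.sqrt(np.mean((pred - moe) ** 2))
print(f"RMSECV = {rmsecv:.3f} GPa")
```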

Keywords: FT-NIR, mechanical properties, pre-processing, PLS

Procedia PDF Downloads 318
7291 Gradient Boosted Trees on Spark Platform for Supervised Learning in Health Care Big Data

Authors: Gayathri Nagarajan, L. D. Dhinesh Babu

Abstract:

Health care is one of the prominent industries that generate voluminous data, thereby creating the need for machine learning techniques with big data solutions for efficient processing and prediction. Missing data, incomplete data, real-time streaming data, sensitive data, privacy, and heterogeneity are a few of the common challenges to be addressed for efficient processing and mining of health care data. In comparison with other applications, accuracy and fast processing are of higher importance for health care applications as they relate directly to human life. Though there are many machine learning techniques and big data solutions used for efficient processing and prediction in health care data, different techniques and different frameworks prove effective for different applications, largely depending on the characteristics of the datasets. In this paper, we present a framework that uses the ensemble machine learning technique of gradient boosted trees for data classification in health care big data. The framework is built on the Spark platform, which is fast in comparison with other traditional frameworks. Unlike other works that focus on a single technique, our work presents a comparison of six different machine learning techniques along with gradient boosted trees on datasets of different characteristics. Five benchmark health care datasets are considered for experimentation, and the results of the different machine learning techniques are discussed in comparison with gradient boosted trees. The metrics chosen for comparison are the misclassification error rate and the run time of the algorithms. The goals of this paper are to i) compare the performance of gradient boosted trees with other machine learning techniques on the Spark platform specifically for health care big data and ii) discuss the results from the experiments conducted on datasets of different characteristics, thereby drawing inferences and conclusions. The experimental results show that accuracy is largely dependent on the characteristics of the datasets for the other machine learning techniques, whereas gradient boosted trees yield reasonably stable results in terms of accuracy without depending largely on the dataset characteristics.
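
An outline of the Spark side of such a framework, assembling features and fitting a gradient boosted trees classifier, is sketched below with PySpark; the CSV path and column names are placeholders, and note that Spark ML's GBTClassifier works with binary labels, as found in several benchmark health datasets.

```python
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import GBTClassifier
from pyspark.ml.evaluation import MulticlassClassificationEvaluator

spark = SparkSession.builder.appName("healthcare-gbt").getOrCreate()

# Placeholder dataset: numeric clinical features plus a binary 'label' column.
df = spark.read.csv("hdfs:///data/health_benchmark.csv", header=True, inferSchema=True)

feature_cols = [c for c in df.columns if c != "label"]
assembled = VectorAssembler(inputCols=feature_cols, outputCol="features").transform(df)

train, test = assembled.randomSplit([0.8, 0.2], seed=42)
gbt = GBTClassifier(labelCol="label", featuresCol="features", maxIter=50, maxDepth=5)
model = gbt.fit(train)

predictions = model.transform(test)
accuracy = MulticlassClassificationEvaluator(labelCol="label",
                                             metricName="accuracy").evaluate(predictions)
print(f"misclassification error rate: {1.0 - accuracy:.3f}")
```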

Keywords: big data analytics, ensemble machine learning, gradient boosted trees, Spark platform

Procedia PDF Downloads 219
7290 Development and Validation of the University of Mindanao Needs Assessment Scale (UMNAS) for College Students

Authors: Ryan Dale B. Elnar

Abstract:

This study developed a multidimensional needs assessment scale for college students called the University of Mindanao Needs Assessment Scale (UMNAS). Although there are context-specific instruments measuring the needs of clinical and non-clinical samples, the literature reveals no standardized scales to measure the needs of college students; thus, a four-phase item development process was initiated to support its content validity. Comprising seven broad facets, namely spiritual-moral, intrapersonal, socio-personal, psycho-emotional, cognitive, physical, and sexual, a pyramid model of college needs was deconstructed through a focus group discussion (FGD) sample to support the literature review. Using various construct validity procedures, the model was further tested on a total of 881 Filipino college students. The results of the study revealed evidence of the reliability and validity of the UMNAS. The reliability indices range from .929 to .933. Exploratory and confirmatory factor analyses revealed a one-factor, six-dimensional instrument to measure the needs of college students. Using multivariate regression analysis, year level and course were found to be predictors of students' needs. Content analysis attested to the usefulness of the instrument in diagnosing students' personal and academic issues and concerns in conjunction with other measures. The norming process included 1,728 students from the different colleges of the University of Mindanao. Further validation is recommended to establish a national norm for the instrument.
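
For readers interested in how reliability indices of this kind are computed, a small sketch of Cronbach's alpha over a set of Likert-type items is shown below; the item responses are simulated, not UMNAS data.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (respondents x items) matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(9)
latent = rng.normal(size=500)                         # simulated respondents
responses = np.clip(
    np.round(3 + latent[:, None] + rng.normal(scale=0.7, size=(500, 10))), 1, 5)

print(f"Cronbach's alpha = {cronbach_alpha(responses):.3f}")
```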

Keywords: needs assessment scale, validity, factor analysis, college students

Procedia PDF Downloads 426
7289 Analysis of Spamming Threats and Some Possible Solutions for Online Social Networking Sites (OSNS)

Authors: Dilip Singh Sisodia, Shrish Verma

Abstract:

Spamming is the most common issue seen nowadays on the Internet, especially on online social networking sites (such as Facebook, Twitter, and Google+). Spam messages keep wasting Internet bandwidth and the storage space of servers. On social network sites, spammers often disguise themselves by creating fake accounts and hijacking users' accounts for personal gain. They behave like normal users and continually change their spamming strategies. To counter this, most modern spam-filtering solutions are deployed on the receiver side; they are good at filtering spam for end users. In this paper we present some spamming techniques, their behaviour, and possible solutions. We have analyzed how spammers enter online social networking sites (OSNSs), how they target them, and the techniques they use. The five spamming techniques discussed are clickjacking, socially engineered attacks, cross-site scripting, URL shortening, and drive-by download. We have used the Elgg framework to demonstrate some of these spamming threats and the implementation of the corresponding solutions.

Keywords: online social networking sites, spam, attacks, internet, clickjacking / likejacking, drive-by-download, URL shortening, networking, socially engineered attacks, elgg framework

Procedia PDF Downloads 317
7288 Multidrug Therapies For HIV: Hybrid On-Off, Hysteresis On-Off Control and Simple STI

Authors: Magno Enrique Mendoza Meza

Abstract:

This paper deals with the comparison of three control techniques: hysteresis on-off control (HyOOC), hybrid on-off control (HOOC), and simple Structured Treatment Interruptions (sSTI). These techniques are applied to the mathematical model developed by Kirschner and Webb. To compare them, we use a cost functional that minimizes the wild-type and mutant virus populations, but the main objective is to minimize the systemic cost of treatment and maximize the level of healthy CD4+ T cells. HyOOC, HOOC, and sSTI are applied to drug therapies using reverse transcriptase and protease inhibitors; simulations show that these controls maintain the uninfected cells in a small, bounded neighborhood of a pre-specified level. The HyOOC and HOOC controllers are designed by an appropriate choice of virtual equilibrium points.
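
A toy numerical sketch of the hysteresis on-off idea, switching therapy on when healthy CD4+ T cells fall below a lower threshold and off only once they recover above an upper threshold, is given below; the one-state dynamics, rates, and thresholds are invented for illustration and are far simpler than the Kirschner-Webb model used in the paper.

```python
import numpy as np

# Toy scalar dynamics for healthy CD4+ T cells (cells/mm^3): decline off therapy,
# recovery while therapy is on. Thresholds and rates are illustrative only.
T_LOW, T_HIGH = 300.0, 600.0       # hysteresis band for switching therapy on/off
dt, days = 0.1, 400

t_cells, therapy_on = 450.0, False
history = []
for step in range(int(days / dt)):
    if not therapy_on and t_cells < T_LOW:
        therapy_on = True          # switch on below the lower bound
    elif therapy_on and t_cells > T_HIGH:
        therapy_on = False         # switch off only after recovery above the upper bound
    growth = 25.0 if therapy_on else -8.0        # net cells/mm^3 per day (toy rates)
    t_cells = max(t_cells + dt * (growth - 0.01 * (t_cells - 450.0)), 0.0)
    history.append((step * dt, t_cells, therapy_on))

on_fraction = np.mean([h[2] for h in history])
print(f"fraction of time on therapy: {on_fraction:.2f}, final T cells: {t_cells:.0f}")
```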

Keywords: virus dynamics, on-off control, hysteresis, multi-drug therapies

Procedia PDF Downloads 368