Search results for: support vector machines application
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15301

15031 Molecular Detection of Leishmania from the Phlebotomus Genus: Tendency towards Leishmaniasis Regression in Constantine, North-East of Algeria

Authors: K. Frahtia, I. Mihoubi, S. Picot

Abstract:

Leishmaniasis is a group of parasitic diseases with varied clinical expression caused by flagellate protozoa of the Leishmania genus. These diseases are transmitted to humans and animals by the bite of a vector insect, the female sandfly. Among the groups of dipteran disease vectors, Phlebotominae occupy a prime position and play a significant role in human pathology, such as leishmaniasis, which affects nearly 350 million people worldwide. The vector control operation launched by health services throughout the country proves to be effective since, although the prevalence of the disease remains high, especially in rural areas, leishmaniasis appears to be declining in Algeria. In this context, this study mainly concerns molecular detection of Leishmania from the vector. Furthermore, a molecular diagnosis has also been made on skin samples taken from patients in the region of Constantine, located in the North-East of Algeria. Concerning the vector, 5858 sandflies were captured, including 4360 males and 1498 females. Male specimens were identified based on their morphological characteristics. The morphological identification highlighted the presence of the Phlebotomus genus with a prevalence of 93%, against 7% represented by the Sergentomyia genus. Among the identified species, P. perniciosus is the most abundant, with 59.4% of the identified male population, followed by P. longicuspis with 24.7% of the specimens. P. perfiliewi is poorly represented with 6.7% of specimens, followed by P. papatasi with 2.2% and S. dreyfussi with 1.5%. Concerning skin samples, 45/79 (56.96%) collected samples were found positive by real-time PCR. This rate appears to be in sharp decline compared to previous years (alert peak of 30,227 cases in 2005). Concerning the detection of Leishmania from sandflies by RT-PCR, the results show that 3/60 genus-level PCR assays performed were positive, with melting temperatures corresponding to that of the reference strain (84.1 ± 0.4 °C for L. infantum). This proves that the vectors were parasitized. On the other hand, species-level identification by RT-PCR did not give any results. This could be explained by the presence of an insufficient amount of leishmanial DNA in the vector, and therefore supports the hypothesis of the regression of leishmaniasis in Constantine.

Keywords: Algeria, molecular diagnostic, phlebotomus, real time PCR

Procedia PDF Downloads 261
15030 CRISPR-DT: Designing gRNAs for the CRISPR-Cpf1 System with Improved Target Efficiency and Specificity

Authors: Houxiang Zhu, Chun Liang

Abstract:

The CRISPR-Cpf1 system has been successfully applied in genome editing. However, the target efficiency of the CRISPR-Cpf1 system varies among different gRNA sequences. The published CRISPR-Cpf1 gRNA data were reanalyzed. Many sequence and structural features of gRNAs (e.g., the position-specific nucleotide composition, position-nonspecific nucleotide composition, GC content, minimum free energy, and melting temperature) were found to correlate with target efficiency. Using machine learning technology, a support vector machine (SVM) model was created to predict target efficiency for any given gRNA. The first web service application, CRISPR-DT (CRISPR DNA Targeting), has been developed to help users design optimal gRNAs for the CRISPR-Cpf1 system by considering both target efficiency and specificity. CRISPR-DT will empower researchers in genome editing.
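
For illustration, the following minimal sketch trains a scikit-learn support vector regressor on two of the simple gRNA features named above (position-specific nucleotide composition and GC content); the example sequences, efficiency scores, and feature set are placeholders and do not reproduce the CRISPR-DT pipeline.

```python
# A minimal sketch (not the authors' CRISPR-DT code) of training an SVM regressor
# on simple gRNA sequence features; the sequences and efficiency scores below are
# placeholders.
import numpy as np
from sklearn.svm import SVR

def featurize(guide):
    """Position-specific one-hot encoding plus GC content for one gRNA."""
    mapping = {"A": [1, 0, 0, 0], "C": [0, 1, 0, 0],
               "G": [0, 0, 1, 0], "T": [0, 0, 0, 1]}
    onehot = [bit for base in guide for bit in mapping[base]]
    gc = (guide.count("G") + guide.count("C")) / len(guide)
    return onehot + [gc]

guides = ["TTTGACTGGAGTCCGAGCAGAAG", "TTTCGATGCCATTGCCTAGGATC"]  # placeholder gRNAs
efficiency = [0.72, 0.31]                                         # placeholder scores

X = np.array([featurize(g) for g in guides])
y = np.array(efficiency)

model = SVR(kernel="rbf", C=1.0, epsilon=0.1)
model.fit(X, y)
print(model.predict(X))
```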

Keywords: CRISPR-Cpf1, genome editing, target efficiency, target specificity

Procedia PDF Downloads 251
15029 Application Potential of Selected Tools in Context of Critical Infrastructure Protection and Risk Analysis

Authors: Hromada Martin

Abstract:

Risk analysis is considered a fundamental aspect of ensuring the level of critical infrastructure protection, where critical infrastructure is seen as a system, asset, or part thereof that is important for maintaining vital societal functions. The article discusses and analyzes the potential application of selected information support tools for the implementation of risk analysis and critical infrastructure protection. The use of information support in risk analysis can be viewed as a way of simplifying the analytical process. Since countless such instruments (information support) exist for these purposes, representatives were selected that have already been applied in the selected area of critical infrastructure or can be used there. All presented facts were the basis for the development of a critical infrastructure resilience evaluation methodology.

Keywords: critical infrastructure, protection, resilience, risk analysis

Procedia PDF Downloads 627
15028 Investigation of the Effects of Sampling Frequency on the THD of 3-Phase Inverters Using Space Vector Modulation

Authors: Khattab Al Qaisi, Nicholas Bowring

Abstract:

This paper presents the simulation results of the effects of sampling frequency on the total harmonic distortion (THD) of three-phase inverters using the space vector pulse width modulation (SVPWM) and space vector control (SVC) algorithms. The relationship between the variables was studied using curve fitting techniques, and it has been shown that, for 50 Hz inverters, there is an exponential relation between the sampling frequency and THD up to around 8500 Hz, beyond which the performance of the model becomes irregular, and there is a negative exponential relation between the sampling frequency and the marginal improvement to the THD. It has also been found that the performance of SVPWM is better than that of SVC at the same sampling frequency over most of the frequency range, including the range where the performance of the former is irregular.
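
For illustration, a hedged sketch of the curve-fitting step with scipy's curve_fit, assuming an exponential-decay model of THD versus sampling frequency; the data points are invented placeholders, not the paper's simulation results.

```python
# Illustrative only: fitting an exponential THD-vs-sampling-frequency trend with
# scipy's curve_fit. The data points below are made up, not the paper's results.
import numpy as np
from scipy.optimize import curve_fit

def thd_model(fs, a, b, c):
    # THD decaying exponentially towards a floor c as sampling frequency rises
    return a * np.exp(-b * fs) + c

fs = np.array([1000, 2000, 3000, 4000, 6000, 8000], dtype=float)   # Hz (placeholder)
thd = np.array([9.1, 5.2, 3.4, 2.5, 1.8, 1.5])                     # % (placeholder)

params, _ = curve_fit(thd_model, fs, thd, p0=(10.0, 5e-4, 1.0))
a, b, c = params
print(f"fitted THD(fs) = {a:.2f} * exp(-{b:.5f} * fs) + {c:.2f}")
```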

Keywords: DSI, SVPWM, THD, DC-AC converter, sampling frequency, performance

Procedia PDF Downloads 472
15027 Sentiment Analysis of Chinese Microblog Comments: Comparison between Support Vector Machine and Long Short-Term Memory

Authors: Xu Jiaqiao

Abstract:

Text sentiment analysis is an important branch of natural language processing. This technology is widely used in public opinion analysis and web browsing recommendations. At present, the mainstream sentiment analysis methods fall into three categories: methods based on a sentiment dictionary, on traditional machine learning, and on deep learning. This paper analyzes and compares the advantages and disadvantages of the SVM method from traditional machine learning and the Long Short-Term Memory (LSTM) method from deep learning in the field of Chinese sentiment analysis, using Chinese comments on Sina Microblog as the dataset. Firstly, this paper classifies and adds labels to the original comment dataset obtained by the web crawler, and then uses Jieba word segmentation to tokenize the original dataset and remove stop words. After that, this paper extracts text feature vectors and builds document word vectors to facilitate the training of the models. Finally, the SVM and LSTM models are trained respectively. The accuracy of the LSTM model is 85.80%, while the accuracy of the SVM is 91.07%; at the same time, the LSTM needs only 2.57 seconds, whereas the SVM model needs 6.06 seconds. Therefore, this paper concludes that, compared with the SVM model, the LSTM model is worse in accuracy but faster in processing speed.
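
A minimal sketch of the SVM branch described above, assuming jieba and scikit-learn are available; the two example comments, the stop-word list, and the labels are placeholders rather than the Sina Microblog dataset.

```python
# Sketch only: Jieba segmentation + TF-IDF + SVM, on placeholder comments.
import jieba
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline

stopwords = {"的", "了", "是"}  # tiny placeholder stop-word list

def tokenize(text):
    # Jieba word segmentation followed by stop-word removal
    return [w for w in jieba.lcut(text) if w.strip() and w not in stopwords]

comments = ["这部电影真的很好看", "服务太差了，再也不来了"]  # placeholder comments
labels = [1, 0]                                              # 1 = positive, 0 = negative

model = make_pipeline(
    TfidfVectorizer(tokenizer=tokenize, token_pattern=None),
    SVC(kernel="rbf", C=1.0),
)
model.fit(comments, labels)
print(model.predict(["电影很精彩"]))
```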

Keywords: sentiment analysis, support vector machine, long short-term memory, Chinese microblog comments

Procedia PDF Downloads 78
15026 Computational Linguistic Implications of Gender Bias: Machines Reflect Misogyny in Society

Authors: Irene Yi

Abstract:

Machine learning, natural language processing, and neural network models of language are becoming more and more prevalent in the fields of technology and linguistics today. Training data for machines are, at best, large corpora of human literature and, at worst, a reflection of the ugliness in society. Computational linguistics is a growing field dealing with such issues of data collection for technological development. Machines have been trained on millions of human books, only to find that, over the course of human history, derogatory and sexist adjectives are used significantly more frequently when describing females in history and literature than when describing males. This is extremely problematic, both as training data and as the outcome of natural language processing. As machines start to handle more responsibilities, it is crucial to ensure that they do not take with them historical sexist and misogynistic notions. This paper gathers data and algorithms from neural network models of language dealing with syntax, semantics, sociolinguistics, and text classification. Computational analysis of such linguistic data is used to find patterns of misogyny. Results are significant in showing the existing intentional and unintentional misogynistic notions used to train machines, as well as in developing better technologies that take into account the semantics and syntax of text to be more mindful and reflect gender equality. Further, this paper deals with the idea of non-binary gender pronouns and how machines can process these pronouns correctly, given their semantic and syntactic context. This paper also delves into the implications of gendered grammar and its effect, cross-linguistically, on natural language processing. Languages such as French or Spanish not only have rigid gendered grammar rules but also historically patriarchal societies. The progression of society comes hand in hand not only with its language but also with how machines process those natural languages. These ideas are all vital to the development of natural language models in technology, and they must be taken into account immediately.

Keywords: computational analysis, gendered grammar, misogynistic language, neural networks

Procedia PDF Downloads 105
15025 Getting to Know the Types of Asphalt, Its Manufacturing and Processing Methods and Its Application in Road Construction

Authors: Hamid Fallah

Abstract:

Asphalt is generally a mixture of stone materials with continuous granulation and a binder, which is usually bitumen. Asphalt is made in different forms according to its use. The most familiar type of asphalt is hot asphalt, or hot asphalt concrete. Stone materials usually make up more than 90% of the asphalt mixture; therefore, they have a significant impact on the quality of the resulting asphalt. According to the method of application and mixing, asphalt is divided into three categories: hot asphalt, protective asphalt, and cold asphalt. Cold mix asphalt is a mixture of stone materials and mixed bitumen or bitumen emulsion whose raw materials are mixed at ambient temperature. In some types of cold asphalt, the bitumen may be heated as necessary, but the other materials are mixed with the bitumen without heating. Protective asphalts are used to make the roadbed impermeable, to increase its abrasion and sliding resistance, and also to temporarily improve existing asphalt and concrete surfaces. This type of paving is very economical compared to hot asphalt due to the speed and ease of implementation and the limited need for asphalt machines and equipment. The present article, prepared as a descriptive library study, introduces asphalt, its types, its characteristics, and its application.

Keywords: asphalt, type of asphalt, asphalt concrete, sulfur concrete, bitumen in asphalt, sulfur, stone materials

Procedia PDF Downloads 51
15024 Classifier for Liver Ultrasound Images

Authors: Soumya Sajjan

Abstract:

Liver cancer is among the most common cancers worldwide in men and women and is one of the few cancers still on the rise. Liver disease is the 4th leading cause of death. According to new NHS (National Health Service) figures, deaths from liver diseases have reached record levels, rising by 25% in less than a decade; heavy drinking, obesity, and hepatitis are believed to be behind the rise. In this study, we focus on the development of a diagnostic classifier for ultrasound liver lesions. Ultrasound (US) sonography is an easy-to-use and widely popular imaging modality because of its ability to visualize many human soft tissues/organs without any harmful effect. This paper provides an overview of the underlying concepts, along with algorithms for processing liver ultrasound images. Naturally, ultrasound liver lesion images contain considerable speckle noise, so developing a classifier for ultrasound liver lesion images is a challenging task. We adopt a fully automatic machine learning approach for developing this classifier. First, we segment the liver image by calculating textural features from the co-occurrence matrix and the run length method. For classification, a Support Vector Machine is used, based on the risk bounds of statistical learning theory. The textural features from the different feature methods are given as input to the SVM individually. Performance analysis on training and test datasets is carried out separately using the SVM model. Whenever an ultrasonic liver lesion image is given to the SVM classifier system, the features are calculated and the lesion is classified as a normal or diseased liver lesion. We hope the result will help the physician to identify liver cancer in a non-invasive manner.
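
A hedged sketch of the GLCM-texture-plus-SVM idea, assuming scikit-image 0.19 or later (older releases name the functions greycomatrix/greycoprops); the regions of interest and labels below are synthetic placeholders.

```python
# Sketch only: GLCM texture features feeding an SVM; synthetic ROIs and labels.
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # scikit-image >= 0.19 assumed
from sklearn.svm import SVC

def glcm_features(image):
    """Contrast, correlation, energy and homogeneity from a 0-255 grayscale ROI."""
    glcm = graycomatrix(image, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    return [graycoprops(glcm, prop).mean()
            for prop in ("contrast", "correlation", "energy", "homogeneity")]

rng = np.random.default_rng(0)
rois = [rng.integers(0, 256, size=(64, 64), dtype=np.uint8) for _ in range(20)]
labels = rng.integers(0, 2, size=20)  # 0 = normal, 1 = diseased (placeholder)

X = np.array([glcm_features(r) for r in rois])
clf = SVC(kernel="rbf", C=1.0).fit(X, labels)
print(clf.predict(X[:3]))
```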

Keywords: segmentation, support vector machine, ultrasound liver lesion, co-occurrence matrix

Procedia PDF Downloads 396
15023 Computational Cell Segmentation in Immunohistochemically Image of Meningioma Tumor Using Fuzzy C-Means and Adaptive Vector Directional Filter

Authors: Vahid Anari, Leila Shahmohammadi

Abstract:

Manually diagnosing and interpreting a large cohort of immunohistochemically (IHC) stained tumor tissue under an optical microscope is subjective and tedious for specialist pathologists. Moreover, digital pathology today represents more of an evolution than a revolution in pathology. In this paper, we develop and test an unsupervised algorithm that can automatically enhance the IHC image of a meningioma tumor and classify cells into positive (proliferative) and negative (normal) cells. A dataset including 150 images is used to test the scheme. In addition, a new adaptive color image enhancement method is proposed based on a vector directional filter (VDF) and the statistical properties of the filtering window. Since the cells are distinguishable by the human eye, the accuracy and stability of the algorithm are quantitatively compared through application to a wide variety of real images.
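
A compact, self-contained fuzzy c-means sketch in plain NumPy (not the authors' implementation) for splitting stained-cell pixel colors into two clusters, e.g. positive versus negative cells; the pixel samples and cluster count are placeholders.

```python
# Sketch only: plain-NumPy fuzzy c-means on placeholder RGB pixel samples.
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Return (centers, membership) for data X of shape (n_samples, n_features)."""
    rng = np.random.default_rng(seed)
    u = rng.random((X.shape[0], c))
    u /= u.sum(axis=1, keepdims=True)          # fuzzy memberships sum to 1
    for _ in range(n_iter):
        um = u ** m
        centers = um.T @ X / um.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        new_u = 1.0 / (dist ** (2 / (m - 1)))
        new_u /= new_u.sum(axis=1, keepdims=True)
        if np.abs(new_u - u).max() < tol:
            u = new_u
            break
        u = new_u
    return centers, u

pixels = np.random.default_rng(1).random((500, 3))  # placeholder pixels in [0, 1]
centers, membership = fuzzy_c_means(pixels, c=2)
labels = membership.argmax(axis=1)                  # hard label per pixel
print(centers, np.bincount(labels))
```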

Keywords: digital pathology, cell segmentation, immunohistochemically, noise reduction

Procedia PDF Downloads 56
15022 Performance of Constant Load Feed Machining for Robotic Drilling

Authors: Youji Miyake

Abstract:

In aircraft assembly, a large number of preparatory holes are required for screw and rivet joints. Currently, many holes are drilled manually because it is difficult to machine them using conventional computerized numerical control (CNC) machines. The application of industrial robots to drill the holes has been considered as an alternative to CNC machines. However, the rigidity of robot arms is so low that vibration is likely to occur during drilling. In this study, constant-load feed machining is proposed as a method to perform high-precision drilling while minimizing the thrust force, which is considered to be the cause of vibration. In this method, the drill feed is realized by a constant load applied onto the tool so that the thrust force is theoretically kept below the applied load. The performance of the proposed method was experimentally examined through the deep hole drilling of plastic and the simultaneous drilling of metal/plastic stack plates. It was confirmed that deep hole drilling and simultaneous drilling could be performed without generating vibration by controlling the tool feed rate within the appropriate range.

Keywords: constant load feed machining, robotic drilling, deep hole, simultaneous drilling

Procedia PDF Downloads 183
15021 Control Mechanisms for Sprayer Used in Turkey

Authors: Huseyin Duran, Yesim Benal Oztekin, Kazim Kubilay Vursavus, Ilker Huseyin Celen

Abstract:

There are two main approaches to the manufacturing, marketing, and usage of plant protection machinery in Turkey. The first approach, called the ‘Product Safety Approach’, can be summarized as the minimum health and safety requirements that plant protection equipment and machinery products must meet for consumers. The second approach consists of the practices related to the Plant Protection Equipment and Machinery Directive. The product safety approach covers the plant protection machinery product groups within the framework of a new approach directive, the Machinery Safety Directive (2006/42/AT). The new directive has been in force in Turkey since 03.03.2009, in parallel with the revision of the corresponding EU directive (published in the Official Gazette dated 03.03.2009 and numbered 27158). A ‘Pesticide Application for Machines’ paragraph has been added to the 2006/42/EC Machinery Safety Directive, which in particular reveals the importance of primary health care and product safety, explaining the safety requirements for machines used in the application of plant protection products. The Ministry of Science, Industry and Technology is the organization authorized to publish and implement this regulation in Turkey. There is also a special regulation, administered by the Ministry of Food, Agriculture and Livestock, General Directorate of Food and Control, on the manufacture and sale of plant protection machinery. This regulation, prepared on the basis of Law No. 5996 on Veterinary Services, Plant Health, Food and Feed, is the ‘Regulation on Plant Protection Equipment and Machinery’ (published in the Official Gazette on 02.04.2011 with number 27893). The purposes of this regulation are the practice of healthy and reliable crop production and the preparation, implementation, and dissemination of integrated pest management programs and projects for the development of pest control methods friendly to human health and the environment. This second regulation covers the approval, manufacturing, and licensing of plant protection equipment and machinery; the duties and responsibilities of dealers; and the principles and procedures related to market supply and control. There are no inspection procedures for the application of currently used plant protection machinery in Turkey. In this study, the content and application principles of all regulatory approaches currently used in Turkey are summarized.

Keywords: plant protection equipment and machinery, product safety, market surveillance, inspection procedures

Procedia PDF Downloads 254
15020 A Supervised Learning Data Mining Approach for Object Recognition and Classification in High Resolution Satellite Data

Authors: Mais Nijim, Rama Devi Chennuboyina, Waseem Al Aqqad

Abstract:

Advances in the spatial and spectral resolution of satellite images have led to tremendous growth in large image databases. The data we acquire through satellites, radars, and sensors consist of important geographical information that can be used for remote sensing applications such as regional planning and disaster management. Spatial data classification and object recognition are important tasks for many applications; however, classifying and identifying objects manually from images is a difficult task. Object recognition is often considered a classification problem, and this task can be performed using machine-learning techniques. Among the many machine-learning algorithms, the classification is done using supervised classifiers such as Support Vector Machines (SVM), as the area of interest is known. We propose a classification method that considers neighboring pixels in a region for feature extraction and evaluates classifications precisely according to neighboring classes for semantic interpretation of the region of interest (ROI). A dataset has been created for training and testing purposes; we generated the attributes by considering pixel intensity values and mean values of reflectance. We demonstrate the benefits of using knowledge discovery and data-mining techniques, which can be applied to image data for accurate information extraction and classification from high spatial resolution remote sensing imagery.
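
A simplified sketch of using neighboring-pixel statistics as SVM features; the single reflectance band and the training labels are synthetic placeholders, not the satellite data used in the study.

```python
# Sketch only: per-pixel intensity + 3x3 neighborhood mean as SVM features.
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.svm import SVC

rng = np.random.default_rng(0)
band = rng.random((100, 100))                  # placeholder reflectance band
neigh_mean = uniform_filter(band, size=3)      # 3x3 neighborhood mean per pixel

# Per-pixel feature vector: own intensity plus neighborhood mean
X = np.stack([band.ravel(), neigh_mean.ravel()], axis=1)

# Placeholder training labels for a handful of pixels (e.g. 1 = water, 0 = other)
train_idx = rng.choice(X.shape[0], size=200, replace=False)
y_train = (X[train_idx, 0] > 0.5).astype(int)  # stand-in for analyst-labelled ROIs

clf = SVC(kernel="rbf").fit(X[train_idx], y_train)
classified = clf.predict(X).reshape(band.shape)
print(classified.shape, np.unique(classified))
```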

Keywords: remote sensing, object recognition, classification, data mining, waterbody identification, feature extraction

Procedia PDF Downloads 326
15019 Performance Comparison of Different Regression Methods for a Polymerization Process with Adaptive Sampling

Authors: Florin Leon, Silvia Curteanu

Abstract:

Developing complete mechanistic models for polymerization reactors is not easy because complex reactions occur simultaneously, a large number of kinetic parameters are involved, and sometimes the chemical and physical phenomena for mixtures involving polymers are poorly understood. To overcome these difficulties, empirical models based on sampled data can be used instead, namely the regression methods typical of the machine learning field. They have the ability to learn the trends of a process without any knowledge of its particular physical and chemical laws. Therefore, they are useful for modeling complex processes, such as the free radical polymerization of methyl methacrylate achieved in a batch bulk process. The goal is to generate accurate predictions of monomer conversion, number-average molecular weight, and weight-average molecular weight. This process is associated with non-linear gel and glass effects. For this purpose, an adaptive sampling technique is presented, which can select more samples around the regions where the values have a higher variation. Several machine learning methods are used for the modeling and their performance is compared: support vector machines, k-nearest neighbor, and random forest, as well as an original algorithm, large margin nearest neighbor regression. The suggested method provides very good results compared to the other well-known regression algorithms.
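
A rough sketch of the adaptive-sampling idea: estimate the local variation of the response on a coarse grid and allocate extra samples where it is largest. The "process" function stands in for the polymerization simulator; everything here is illustrative, not the authors' algorithm.

```python
# Sketch only: draw extra samples where the response varies most (placeholder model).
import numpy as np

def process(t):
    # Placeholder response with a sharp, gel-effect-like region around t = 0.7
    return 1.0 / (1.0 + np.exp(-40 * (t - 0.7)))

coarse_t = np.linspace(0.0, 1.0, 21)
coarse_y = process(coarse_t)

# Local variation = absolute change of the response on each coarse interval
variation = np.abs(np.diff(coarse_y))
weights = variation / variation.sum()

# Allocate an extra budget of samples proportionally to the local variation
extra_budget = 40
extra_per_interval = np.round(weights * extra_budget).astype(int)
extra_t = np.concatenate([
    np.linspace(coarse_t[i], coarse_t[i + 1], n + 2)[1:-1]
    for i, n in enumerate(extra_per_interval) if n > 0
])

samples_t = np.sort(np.concatenate([coarse_t, extra_t]))
print(f"{len(samples_t)} samples, densest near the high-variation region")
```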

Keywords: batch bulk methyl methacrylate polymerization, adaptive sampling, machine learning, large margin nearest neighbor regression

Procedia PDF Downloads 294
15018 Mixed Integer Programming-Based One-Class Classification Method for Process Monitoring

Authors: Younghoon Kim, Seoung Bum Kim

Abstract:

One-class classification plays an important role in detecting outliers and abnormalities among normal observations. In previous research, several attempts were made to extend the scope of application of one-class classification techniques to statistical process control problems. In most previous approaches, such as the support vector data description (SVDD) control chart, the design of the control limits is commonly based on the assumption that the proportion of abnormal observations is approximately equal to an expected Type I error rate in the Phase I process. Because of the limitation of one-class classification techniques based on convex optimization, we cannot make the proportion of abnormal observations exactly equal to the expected Type I error rate: controlling the Type I error rate requires optimizing constraints with integer decision variables, but convex optimization cannot satisfy this requirement. This limitation is undesirable, from both theoretical and practical perspectives, for constructing effective control charts. In this work, to address the limitation of previous approaches, we propose a one-class classification algorithm based on the mixed integer programming technique, which can solve problems formulated with continuous and integer decision variables. The proposed method minimizes the radius of a spherically shaped boundary subject to the constraint that the number of normal data points it describes equals a constant value specified by users. By modifying this constant value, users can exactly control the proportion of normal data described by the spherically shaped boundary. Thus, the proportion of abnormal observations can be made theoretically equal to an expected Type I error rate in the Phase I process. Moreover, analogously to SVDD, the boundary can be made to describe complex structures by using kernel functions. A new multivariate control chart applying the algorithm is proposed. This chart uses a monitoring statistic to characterize the degree to which a point is abnormal, as obtained through the proposed one-class classification. The control limit of the proposed chart is established by the radius of the boundary. The usefulness of the proposed method was demonstrated through experiments with simulated and real process data from a thin film transistor-liquid crystal display.
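
A simplified, hedged sketch of the core formulation: minimize the radius of a sphere that must enclose exactly k of the n observations, using binary indicator variables in a mixed-integer second-order cone program with cvxpy. The kernelized version and the control-chart construction are not reproduced; the data and the big-M constant are placeholders, and an installed mixed-integer conic solver is assumed.

```python
# Sketch only: minimal-radius sphere covering exactly k points via MI-SOCP.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))            # placeholder Phase I observations
n, d = X.shape
k = 27                                  # e.g. enclose 90% -> Type I error rate of 10%
M = 10.0                                # big-M larger than any point-to-center distance

center = cp.Variable(d)
radius = cp.Variable(nonneg=True)
inside = cp.Variable(n, boolean=True)   # 1 if observation i must lie inside the sphere

constraints = [cp.norm(X[i] - center) <= radius + M * (1 - inside[i])
               for i in range(n)]
constraints.append(cp.sum(inside) == k)

prob = cp.Problem(cp.Minimize(radius), constraints)
prob.solve()  # requires an installed mixed-integer conic solver (e.g. SCIP, MOSEK, ECOS_BB)
print(radius.value, center.value)
```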

Keywords: control chart, mixed integer programming, one-class classification, support vector data description

Procedia PDF Downloads 165
15017 Using Geo-Statistical Techniques and Machine Learning Algorithms to Model the Spatiotemporal Heterogeneity of Land Surface Temperature and its Relationship with Land Use Land Cover

Authors: Javed Mallick

Abstract:

In metropolitan areas, rapid changes in land use and land cover (LULC) have ecological and environmental consequences. Saudi Arabia's cities have experienced tremendous urban growth since the 1990s, resulting in urban heat islands, groundwater depletion, air pollution, loss of ecosystem services, and so on. This study examines the variance and heterogeneity in land surface temperature (LST) caused by LULC changes in Abha-Khamis Mushyet, Saudi Arabia, from 1990 to 2020. LULC was mapped using a support vector machine (SVM). The mono-window algorithm was used to calculate the land surface temperature (LST). To identify LST clusters, the local indicator of spatial association (LISA) model was applied to the spatiotemporal LST maps. In addition, the parallel coordinate plot (PCP) method was used to investigate the relationship between LST clusters and urban biophysical variables as a proxy for LULC. According to the LULC maps, urban areas increased by more than 330% between 1990 and 2018. Between 1990 and 2018, built-up areas had an 83.6% transitional probability. Furthermore, between 1990 and 2020, vegetation and agricultural land were converted into built-up areas at rates of 17.9% and 21.8%, respectively. Uneven LULC changes in built-up areas result in more LST hotspots. LST hotspots were associated with high NDBI but not with NDWI or NDVI. This study could assist policymakers in developing mitigation strategies for urban heat islands.
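
A minimal sketch of computing LISA (local Moran's I) clusters on a gridded LST raster, assuming the PySAL packages libpysal and esda are available; the LST grid here is synthetic, not the Abha-Khamis Mushyet data.

```python
# Sketch only: local Moran's I hotspots on a synthetic LST grid (PySAL assumed).
import numpy as np
import libpysal
from esda.moran import Moran_Local

rng = np.random.default_rng(0)
lst = rng.normal(loc=38.0, scale=2.0, size=(30, 30))   # placeholder LST grid (°C)

w = libpysal.weights.lat2W(*lst.shape)                 # contiguity weights on the grid
lisa = Moran_Local(lst.ravel(), w, permutations=999)

# Quadrant 1 = high-high clusters (candidate LST hotspots) at p < 0.05
hotspots = (lisa.q == 1) & (lisa.p_sim < 0.05)
print(f"{hotspots.sum()} hotspot cells out of {lst.size}")
```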

Keywords: land use land cover mapping, land surface temperature, support vector machine, LISA model, parallel coordinate plot

Procedia PDF Downloads 64
15016 Analysis and Rule Extraction of Coronary Artery Disease Data Using Data Mining

Authors: Rezaei Hachesu Peyman, Oliyaee Azadeh, Salahzadeh Zahra, Alizadeh Somayyeh, Safaei Naser

Abstract:

Coronary Artery Disease (CAD) is a major cause of disability in adults and a main cause of death in developed countries. In this study, data mining techniques including decision trees, artificial neural networks (ANNs), and Support Vector Machines (SVM) are used to analyze CAD data. Data from 4948 patients who had suffered from heart disease were included in the analysis. CAD is the target variable, and 24 input or predictor variables are used for the classification. The performance of these techniques is compared in terms of sensitivity, specificity, and accuracy. The most significant factor influencing CAD is chest pain. Elderly males (age > 53) have a high probability of being diagnosed with CAD. The SVM algorithm is the most useful for evaluating and predicting CAD patients as compared to non-CAD ones. The application of data mining techniques in analyzing coronary artery disease is a good method for investigating the existing relationships between variables.
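
An illustrative sketch, on placeholder data rather than the 4948-patient dataset, of comparing a decision tree, a neural network, and an SVM on a binary CAD label and reporting sensitivity, specificity, and accuracy from the confusion matrix.

```python
# Sketch only: DT / ANN / SVM comparison with sensitivity and specificity.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 24))                 # 24 predictor variables (placeholder)
y = (X[:, 0] + rng.normal(scale=0.5, size=500) > 0).astype(int)  # CAD / non-CAD

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("decision tree", DecisionTreeClassifier(random_state=0)),
                  ("ANN", MLPClassifier(max_iter=1000, random_state=0)),
                  ("SVM", SVC(kernel="rbf"))]:
    y_pred = clf.fit(X_tr, y_tr).predict(X_te)
    tn, fp, fn, tp = confusion_matrix(y_te, y_pred).ravel()
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / (tp + tn + fp + fn)
    print(f"{name}: sensitivity={sens:.2f} specificity={spec:.2f} accuracy={acc:.2f}")
```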

Keywords: classification, coronary artery disease, data-mining, knowledge discovery, extract

Procedia PDF Downloads 643
15015 Features Vector Selection for the Recognition of the Fragmented Handwritten Numeric Chains

Authors: Salim Ouchtati, Aissa Belmeguenai, Mouldi Bedda

Abstract:

In this study, we propose an offline system for the recognition of fragmented handwritten numeric chains. Firstly, we realized a recognition system for isolated handwritten digits; in this part, the study is based mainly on the evaluation of neural network performance, trained with the gradient backpropagation algorithm. The parameters used to form the input vector of the neural network are extracted from the binary images of the isolated handwritten digits by several methods: the distribution sequence, the application of sondes (probes), the Barr features, and the centered moments of the different projections and profiles. Secondly, the study is extended to the reading of fragmented handwritten numeric chains made up of a variable number of digits. The vertical projection is used to segment the numeric chain into isolated digits, and every digit (or segment) is presented separately to the input of the system achieved in the first part (the recognition system for isolated handwritten digits).
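
A hedged sketch of the vertical-projection segmentation step: summing pixel values per column of a binary image of the numeric chain and splitting at blank-column gaps. The image is a synthetic placeholder.

```python
# Sketch only: split a numeric chain into digits via its vertical projection.
import numpy as np

def segment_digits(binary_img):
    """Split a binary (0/1) image of a numeric chain into per-digit sub-images."""
    projection = binary_img.sum(axis=0)          # vertical projection per column
    in_digit = projection > 0
    segments, start = [], None
    for col, active in enumerate(in_digit):
        if active and start is None:
            start = col                          # digit begins
        elif not active and start is not None:
            segments.append(binary_img[:, start:col])
            start = None                         # digit ends at a blank column
    if start is not None:
        segments.append(binary_img[:, start:])
    return segments

# Placeholder chain: two "digits" separated by blank columns
img = np.zeros((20, 30), dtype=int)
img[5:15, 2:10] = 1
img[5:15, 18:26] = 1
print(len(segment_digits(img)), "digits found")  # -> 2
```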

Keywords: features extraction, handwritten numeric chains, image processing, neural networks

Procedia PDF Downloads 257
15014 Singular Perturbed Vector Field Method Applied to the Problem of Thermal Explosion of Polydisperse Fuel Spray

Authors: Ophir Nave

Abstract:

In our research, we present the concept of the singularly perturbed vector field (SPVF) method and its application to the thermal explosion of diesel spray combustion. Given a system of governing equations containing hidden multi-scale variables, the SPVF method transfers and decomposes such a system into fast and slow singularly perturbed subsystems (SPS). The SPVF method enables us to understand the complex system and simplify the calculations. Powerful analytical, numerical, and asymptotic methods (e.g., the method of integral (invariant) manifold (MIM), the homotopy analysis method (HAM), etc.) can then be applied to each subsystem. We compare the results obtained by the method of integral invariant manifold and by SPVF applied to a spray droplet combustion model. The research deals with the development of an innovative method for extracting fast and slow variables in physical-mathematical models. The method we developed, called the singularly perturbed vector field method, is based on a numerical algorithm of global quasi-linearization applied to a given physical model. The SPVF method has been applied successfully to combustion processes, and our results were compared to experimental results. SPVF is a general numerical and asymptotic method that reveals the hierarchy (multi-scale structure) of a given system.

Keywords: polydisperse spray, model reduction, asymptotic analysis, multi-scale systems

Procedia PDF Downloads 209
15013 Numerical Investigation of Poling Vector Angle on Adaptive Sandwich Plate Deflection

Authors: Alireza Pouladkhan, Mohammad Yavari Foroushani, Ali Mortazavi

Abstract:

This paper presents a finite element model for a sandwich plate containing a piezoelectric core. A sandwich plate with a piezoelectric core is constructed using the shear mode of piezoelectric materials. The orientation of poling vector has a significant effect on deflection and stress induced in the piezo-actuated adaptive sandwich plate. In the present study, the influence of this factor for a clamped-clamped-free-free and simple-simple-free-free square sandwich plate is investigated using Finite Element Method. The study uses ABAQUS (v.6.7) software to derive the finite element model of the sandwich plate. By using this model, the study gives the influences of the poling vector angle on the response of the smart structure and determines the maximum transverse displacement and maximum stress induced.

Keywords: finite element method, sandwich plate, poling vector, piezoelectric materials, smart structure, electric enthalpy

Procedia PDF Downloads 224
15012 Design Criteria for an Internal Information Technology Cost Allocation to Support Business Information Technology Alignment

Authors: Andrea Schnabl, Mario Bernhart

Abstract:

The controlling instrument of an internal cost allocation (IT chargeback) is commonly used to make IT costs transparent and controllable. Information technology (IT) has become a central competitive factor, especially for information industries. Consequently, the focus is not on minimizing IT costs but on the strategically aligned application of IT. Hence, an internal IT cost allocation should be designed to enhance business-IT alignment (the strategic alignment of IT) in order to support the effective application of IT from a company's point of view. To identify design criteria for an internal cost allocation that supports business alignment, a case study analysis at a typical medium-sized firm in the information industry is performed. Documents, key performance indicators, and cost accounting data over a period of 10 years are analyzed, and interviews are performed. The derived design criteria are evaluated by 6 heads of IT departments from 6 different companies that have an internal IT cost allocation in use. By applying these design criteria, an internal cost allocation serves not only for cost controlling but also as an instrument in strategic IT management.

Keywords: accounting for IT services, Business IT Alignment, internal cost allocation, IT controlling, IT governance, strategic IT management

Procedia PDF Downloads 149
15011 Music Genre Classification Based on Non-Negative Matrix Factorization Features

Authors: Soyon Kim, Edward Kim

Abstract:

In order to retrieve information from the massive stream of songs in the music industry, music search by title, lyrics, artist, mood, and genre has become more important. Despite the subjectivity and controversy over the definition of music genres across different nations and cultures, automatic genre classification systems that facilitate the process of music categorization have been developed. Manual genre selection by music producers is provided as statistical data for designing automatic genre classification systems. In this paper, an automatic music genre classification system utilizing non-negative matrix factorization (NMF) is proposed. Short-term characteristics of the music signal can be captured based on timbre features such as the mel-frequency cepstral coefficient (MFCC), decorrelated filter bank (DFB), octave-based spectral contrast (OSC), and octave band sum (OBS). Long-term time-varying characteristics of the music signal can be summarized with (1) statistical features such as the mean, variance, minimum, and maximum of the timbre features and (2) modulation spectrum features such as the spectral flatness measure, spectral crest measure, spectral peak, spectral valley, and spectral contrast of the timbre features. In addition to these conventional basic long-term feature vectors, NMF-based feature vectors are proposed to be used together for genre classification. In the training stage, NMF basis vectors were extracted for each genre class. The NMF features were calculated in the log spectral magnitude domain (NMF-LSM) as well as in the basic feature vector domain (NMF-BFV). For NMF-LSM, the entire full-band spectrum was used. However, for NMF-BFV, only the low-band spectrum was used, since the high-frequency modulation spectrum of the basic feature vectors did not contain important information for genre classification. In the test stage, using the set of pre-trained NMF basis vectors, the genre classification system extracted the NMF weighting values for each genre as the NMF feature vectors. A support vector machine (SVM) was used as the classifier. The GTZAN multi-genre music database, composed of 10 genres and 100 songs per genre, was used for training and testing. To increase the reliability of the experiments, 10-fold cross validation was used. For a given input song, an extracted NMF-LSM feature vector was composed of 10 weighting values that corresponded to the classification probabilities for the 10 genres. An NMF-BFV feature vector also had a dimensionality of 10. Combined with the basic long-term features, such as the statistical features and modulation spectrum features, the NMF features provided increased accuracy with a slight increase in feature dimensionality. The conventional basic features by themselves yielded 84.0% accuracy, but the basic features with NMF-LSM and NMF-BFV provided 85.1% and 84.2% accuracy, respectively. The basic features required a dimensionality of 460, whereas NMF-LSM and NMF-BFV each required a dimensionality of only 10. Combining the basic features, NMF-LSM, and NMF-BFV together with the SVM with a radial basis function (RBF) kernel produced a significantly higher classification accuracy of 88.3% with a feature dimensionality of 480.
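
A much-simplified, hedged sketch of the NMF feature idea: learn one non-negative basis vector per genre from that genre's training features, describe any song by its non-negative weights against the stacked bases, and feed those weights to an RBF-kernel SVM. The data, dimensions, and genre count are placeholders; the paper's NMF-LSM and NMF-BFV pipelines are not reproduced.

```python
# Sketch only: per-genre NMF bases, non-negative weights as features, RBF-SVM classifier.
import numpy as np
from scipy.optimize import nnls
from sklearn.decomposition import NMF
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_genres, songs_per_genre, feat_dim = 4, 25, 60
train = np.abs(rng.normal(size=(n_genres, songs_per_genre, feat_dim)))  # placeholder

# One basis vector per genre (rank-1 NMF of that genre's feature matrix)
bases = []
for g in range(n_genres):
    model = NMF(n_components=1, init="nndsvda", max_iter=500, random_state=0)
    model.fit(train[g])
    bases.append(model.components_[0])
B = np.stack(bases, axis=1)                       # shape (feat_dim, n_genres)

def nmf_weights(song_features):
    """Non-negative weights of a song against the per-genre bases."""
    w, _ = nnls(B, song_features)
    return w

X = np.array([nmf_weights(s) for g in range(n_genres) for s in train[g]])
y = np.repeat(np.arange(n_genres), songs_per_genre)
clf = SVC(kernel="rbf").fit(X, y)
print(clf.score(X, y))
```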

Keywords: mel-frequency cepstral coefficient (MFCC), music genre classification, non-negative matrix factorization (NMF), support vector machine (SVM)

Procedia PDF Downloads 282
15010 HcDD: The Hybrid Combination of Disk Drives in Active Storage Systems

Authors: Shu Yin, Zhiyang Ding, Jianzhong Huang, Xiaojun Ruan, Xiaomin Zhu, Xiao Qin

Abstract:

Since large-scale and data-intensive applications have been widely deployed, there is a growing demand for high-performance storage systems to support data-intensive applications. Compared with traditional storage systems, next-generation systems will embrace dedicated processors to reduce the computational load of host machines and will have hybrid combinations of different storage devices. The advent of the flash-memory-based solid state disk has played a critical role in revolutionizing the storage world. However, instead of simply replacing the traditional magnetic hard disk with the solid state disk, it is believed that finding a complementary approach that incorporates both of them is more challenging and attractive. This paper explores the idea of active storage, an emerging new storage configuration, in terms of the architecture and design, the parallel processing capability, the cooperation of other machines in a cluster computing environment, and a disk configuration, namely the hybrid combination of different types of disk drives. Experimental results indicate that the proposed HcDD achieves better I/O performance and a longer storage system lifespan.

Keywords: parallel storage system, hybrid storage system, data intensive, solid state disks, reliability

Procedia PDF Downloads 431
15009 Evaluation of Ensemble Classifiers for Intrusion Detection

Authors: M. Govindarajan

Abstract:

One of the major developments in machine learning in the past decade is the ensemble method, which finds a highly accurate classifier by combining many moderately accurate component classifiers. In this research work, new ensemble classification methods are proposed: a homogeneous ensemble classifier using bagging and a heterogeneous ensemble classifier using arcing, and their performance is analyzed in terms of accuracy. A classifier ensemble is designed using Radial Basis Function (RBF) and Support Vector Machine (SVM) base classifiers. The feasibility and the benefits of the proposed approaches are demonstrated by means of standard intrusion detection datasets. The main originality of the proposed approach is based on three main parts: a preprocessing phase, a classification phase, and a combining phase. A wide range of comparative experiments is conducted on standard intrusion detection datasets. The performance of the proposed homogeneous and heterogeneous ensemble classifiers is compared to the performance of other standard homogeneous and heterogeneous ensemble methods. The standard homogeneous ensemble methods include error-correcting output codes and Dagging, and the heterogeneous ensemble methods include majority voting and stacking. The proposed ensemble methods provide a significant improvement in accuracy compared to individual classifiers; the proposed bagged RBF and SVM perform significantly better than ECOC and Dagging, and the proposed hybrid RBF-SVM performs significantly better than voting and stacking. Also, heterogeneous models exhibit better results than homogeneous models on standard intrusion detection datasets.
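
An illustrative sketch of the two ensemble flavors discussed above, a homogeneous bagged SVM and a heterogeneous voting ensemble, on placeholder data rather than the intrusion-detection benchmarks; scikit-learn 1.2 or later is assumed (older versions use base_estimator in BaggingClassifier), and an MLP stands in for the paper's RBF network.

```python
# Sketch only: homogeneous bagged SVM vs. heterogeneous voting ensemble.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, VotingClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=600, n_features=20, random_state=0)

# Homogeneous ensemble: bagging with an SVM base classifier
bagged_svm = BaggingClassifier(estimator=SVC(kernel="rbf"),  # base_estimator= on older sklearn
                               n_estimators=10, random_state=0)

# Heterogeneous ensemble: majority voting over a neural network and an SVM
hetero = VotingClassifier(estimators=[
    ("net", MLPClassifier(hidden_layer_sizes=(50,), max_iter=1000, random_state=0)),
    ("svm", SVC(kernel="rbf")),
], voting="hard")

for name, clf in [("bagged SVM", bagged_svm), ("voting net+SVM", hetero)]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```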

Keywords: data mining, ensemble, radial basis function, support vector machine, accuracy

Procedia PDF Downloads 236
15008 Design of a Service-Enabled Dependable Integration Environment

Authors: Fuyang Peng, Donghong Li

Abstract:

The aim of information systems integration is to integrate all data sources, applications, and business flows into the new environment so that unwanted redundancies are reduced and bottlenecks and mismatches are eliminated. Two issues have to be dealt with to meet such requirements: the software architecture that supports resource integration, and the adaptor development tool that helps with the integration and migration of legacy applications. In this paper, a service-enabled dependable integration environment (SDIE) is presented, which has two key components, i.e., a dependable service integration platform and a legacy application integration tool. For the dependable platform for service integration, the service integration bus, the service management framework, the dependable engine for service composition, and the service registry and discovery components are described. For the legacy application integration tool, its basic organization, functionalities, and the dependability measures taken are presented. Due to its service-oriented integration model, its light-weight extensible container, its service-component-combination-oriented p-lattice structure, and other features, SDIE has advantages over commercial products in openness, flexibility, performance-price ratio, and feature support, and is better than most open-source integration software in functionality, performance, and dependability support.

Keywords: application integration, dependability, legacy, SOA

Procedia PDF Downloads 353
15007 Investigation and Monitoring Method of Vector Density in Kaohsiung City

Authors: Chiu-Wen Chang, I-Yun Chang, Wei-Ting Chen, Hui-Ping Ho, Chao-Ying Pan, Joh-Jong Huang

Abstract:

Dengue is a ‘community disease’ or ‘environmental disease’: as long as the environment contains suitable containers (natural or artificial) for mosquito breeding, an invading virus can lead to a dengue epidemic. Surveillance of vector density is critical to effective infectious disease control and plays an important role in monitoring the dynamics of mosquitoes in the community, such as mosquito species, density, and distribution area. The objective of this study was to examine the relationships among the vector density survey indices (Breteau index, adult index, house index, container index, and larvae index) from 2014 to 2016 in Kaohsiung City and to evaluate the effects of introducing the Breeding Elimination and Appraisal Team (hereinafter referred to as BEAT) as an intervention measure for eliminating dengue vector breeding sites, starting from May 2016. For people suspected of contracting dengue fever, a surrounding area measuring 50 meters by 50 meters was demarcated as the emergency prevention and treatment zone. BEAT performed weekly vector mosquito inspections, as well as inspections in regions with a high Gravitrap index, and assigned a risk assessment index to each region. These indices, as well as the prevention and treatment results, were reported to the epidemic-prevention-related units every week. The results indicated that the vector indices from 2014 to 2016 showed no statistically significant differences in the Breteau index, adult index, and house index (p > 0.05), but statistically significant differences in the container index and larvae index (p < 0.05). After executing the integrated elimination work, the container index and larvae index were statistically significantly different from 2014 to 2016 (p < 0.05). A post hoc test indicated that the container index of 2014 (M = 12.793) was significantly higher than that of 2016 (M = 7.631), and that the larvae index of 2015 (M = 34.065) was significantly lower than that of 2014 (M = 66.867). The results revealed that effective vector density surveillance could highlight the key breeding sites and enable immediate control action (BEAT), which successfully decreased the vector density and the risk of a dengue epidemic.
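
For reference, a small illustrative calculation of the standard larval indices named above, using the usual definitions from dengue vector surveillance and made-up survey counts rather than the Kaohsiung data.

```python
# Sketch only: standard larval indices from placeholder survey counts.
houses_inspected = 400
houses_positive = 36           # houses with at least one larvae-positive container
containers_inspected = 1500
containers_positive = 90

house_index = 100 * houses_positive / houses_inspected               # % positive houses
container_index = 100 * containers_positive / containers_inspected   # % positive containers
breteau_index = 100 * containers_positive / houses_inspected          # positive containers per 100 houses

print(f"House index: {house_index:.1f}")
print(f"Container index: {container_index:.1f}")
print(f"Breteau index: {breteau_index:.1f}")
```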

Keywords: Breteau index, dengue control, monitoring method, vector density

Procedia PDF Downloads 170
15006 Analysis Mechanized Boring (TBM) of Tehran Subway Line 7

Authors: Shahin Shabani, Pouya Pourmadadi

Abstract:

Tunnel boring machines (TBMs) have been used for the construction of various tunnels for mining projects for the purposes of access, conveyance of ore and waste, drainage, exploration, water supply, and water diversion. Several mining projects have seen the successful and economically beneficial use of TBMs, and there is an increasing awareness of the benefits of TBMs for mining projects. Key technical considerations for the use of TBMs for the construction of tunnels for mining projects include geological issues (rock type, rock alteration, rock strength, rock abrasivity, durability, groundwater inflows), depth of cover and the potential for overstressing/rockbursts, site access and terrain, portal locations, TBM constraints, minimum tunnel size, tunnel support requirements, contractor and labor experience, and project schedule demands. This study focuses on tunnelling for mining, with the goal of developing methods and tools to be used to gain understanding of these processes, and analyzes the Tehran metro. Metro Line 7 of Tehran is one of the longest (26 km) and deepest (27 m) projects under implementation. Because of major differences, such as passing under all geotechnical layers of the city, encountering the underground water table along part of the route, and using a mechanized excavation system, it is one of the special metro projects.

Keywords: TBM, tunnel boring machines economic, metro, line 7

Procedia PDF Downloads 373
15005 Rotor Radial Vent Pumping in Large Synchronous Electrical Machines

Authors: Darren Camilleri, Robert Rolston

Abstract:

Rotor radial vents make use of the pumping effect to increase airflow through the active material and thus reduce hotspot temperatures. The effect of rotor radial pumping in synchronous machines has been studied previously. This paper presents the findings of previous studies and builds upon their theories using a parametric numerical approach to investigate the rotor radial pumping effect. The pressure head generated by the poles and the radial vent flow rate were identified as important factors in maximizing the benefits of the pumping effect. The use of Minitab and ANSYS Workbench to investigate the key performance characteristics of radial pumping through a Design of Experiments (DOE) is described. The CFD results were compared with theoretical calculations. A correlation for each response variable was derived through statistical analysis. The findings confirmed the strong dependence of the vent pressure head on radial vent length, and the radial vent cross-sectional area was shown to be significant in maximizing the radial vent flow rate.
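
A hedged sketch of fitting a regression model to DOE results with statsmodels, analogous to the Minitab analysis described above; the factor levels and response values are placeholders, not the CFD results from the study.

```python
# Sketch only: OLS regression over placeholder DOE factors and responses.
import pandas as pd
import statsmodels.formula.api as smf

# Two-level, full-factorial style data: vent length, vent area, and flow rate
doe = pd.DataFrame({
    "vent_length": [0.10, 0.10, 0.20, 0.20, 0.10, 0.10, 0.20, 0.20],          # m
    "vent_area":   [1e-4, 2e-4, 1e-4, 2e-4, 1e-4, 2e-4, 1e-4, 2e-4],          # m^2
    "flow_rate":   [0.012, 0.021, 0.015, 0.027, 0.013, 0.022, 0.016, 0.028],  # m^3/s
})

model = smf.ols("flow_rate ~ vent_length * vent_area", data=doe).fit()
print(model.params)        # main effects and interaction term
print(model.rsquared)
```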

Keywords: CFD, cooling, electrical machines, regression analysis

Procedia PDF Downloads 305
15004 Automatic Staging and Subtype Determination for Non-Small Cell Lung Carcinoma Using PET Image Texture Analysis

Authors: Seyhan Karaçavuş, Bülent Yılmaz, Ömer Kayaaltı, Semra İçer, Arzu Taşdemir, Oğuzhan Ayyıldız, Kübra Eset, Eser Kaya

Abstract:

In this study, our goal was to perform tumor staging and subtype determination automatically using different texture analysis approaches for a very common cancer type, i.e., non-small cell lung carcinoma (NSCLC). In particular, we introduced a texture analysis approach, Laws' texture filters, to be used in this context for the first time. The 18F-FDG PET images of 42 patients with NSCLC were evaluated. The number of patients for each tumor stage, i.e., I-II, III, or IV, was 14. Approximately 45% of the patients had adenocarcinoma (ADC) and approximately 55% had squamous cell carcinoma (SqCC). The MATLAB technical computing language was employed in the extraction of 51 features using first order statistics (FOS), the gray-level co-occurrence matrix (GLCM), the gray-level run-length matrix (GLRLM), and Laws' texture filters. The feature selection method employed was sequential forward selection (SFS). The selected textural features were used in automatic classification by k-nearest neighbors (k-NN) and support vector machines (SVM). In the automatic classification of tumor stage, the accuracy was approximately 59.5% with the k-NN classifier (k=3) and 69% with the SVM (with the one-versus-one paradigm), using 5 features. In the automatic classification of tumor subtype, the accuracy was around 92.7% with the one-versus-one SVM. Texture analysis of FDG-PET images might be used, in addition to metabolic parameters, as an objective tool to assess tumor histopathological characteristics and for the automatic classification of tumor stage and subtype.
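
An illustrative sketch, on a placeholder feature matrix rather than the 42-patient PET data, of sequential forward selection of 5 features followed by SVM and k-NN classification; scikit-learn 0.24 or later is assumed for SequentialFeatureSelector.

```python
# Sketch only: SFS feature selection, then SVM and k-NN on the selected features.
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(42, 51))                  # 51 texture features per patient (placeholder)
y = rng.integers(0, 2, size=42)                # placeholder subtype labels (ADC / SqCC)

svm = SVC(kernel="rbf")
sfs = SequentialFeatureSelector(svm, n_features_to_select=5, direction="forward", cv=5)
X_sel = sfs.fit_transform(X, y)

for name, clf in [("SVM", svm), ("k-NN (k=3)", KNeighborsClassifier(n_neighbors=3))]:
    acc = cross_val_score(clf, X_sel, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```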

Keywords: cancer stage, cancer cell type, non-small cell lung carcinoma, PET, texture analysis

Procedia PDF Downloads 314
15003 The Connection between Social Support, Caregiver Burden, and Life Satisfaction of the Parents Whose Children Have Congenital Heart Disease

Authors: A. Uludağ, F. G. Tufekci, N. Ceviz

Abstract:

Aim: The research was carried out in order to evaluate the caregiver burden, life satisfaction, and received social support levels of parents whose children have congenital heart disease, and to examine the relationship between the social support they receive and their caregiver burden and life satisfaction. Material and Method: This descriptive, relationship-seeking research was carried out between June 7, 2012 and June 30, 2014, in Erzurum Ataturk University Research and Application Hospital, Department of Pediatrics and Children's Cardiology Polyclinic. The research was conducted with the parents (N = 157), who agreed to participate, of children aged between 3 months and 12 years. Data were gathered using a questionnaire and the Zarit Caregiver Burden, Life Satisfaction, and Social Support Scales. The acquired data were analyzed using percentage distribution, means, variance, and correlation analysis. Ethical principles were followed in the research. Results: Caregiver burden, life satisfaction, and the level of social support received from the family (p < 0.05) were found to be higher in parents whose children have serious congenital heart disease than in parents whose children have mild disease, while social support received from friends was found to be lower. It was determined that there is a strong negative relationship (p < 0.001) between both social support levels and the caregiver burden of parents, and a strong positive relationship (p < 0.001) between both support levels and life satisfaction. Conclusion: The strong negative relationship between social support and caregiver burden and the strong positive relationship between social support and life satisfaction in parents of children with congenital heart disease indicate that social support systems should be reinforced. Parents can be guided so as to make greater use of social support systems.

Keywords: congenital heart disease, child, parents, caregiver burden, life satisfaction, social support

Procedia PDF Downloads 293
15002 Relationship between Food Inflation and Agriculture Lending Rate in Ghana: A Vector Autoregressive Approach

Authors: Raymond K. Dziwornu

Abstract:

The lending rate on agricultural loans has persistently been high, which is attributed to risk in the sector. This study examined how food inflation and the agricultural lending rate react to each other in Ghana using a vector autoregressive approach. Quarterly data from 2006 to 2018 were obtained from the Bank of Ghana quarterly bulletin and the Ghana Statistical Service reports. The study found that a positive one-standard-deviation shock to food inflation causes the lending rate on agricultural loans to react negatively in the short run, but positively and steadily in the long run. This suggests the need for appropriate policy measures to reduce food inflation and, consequently, the cost of credit to the agricultural sector for its growth.
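
A minimal sketch of the VAR and impulse-response workflow with statsmodels; the quarterly series are simulated placeholders, not the Bank of Ghana or Ghana Statistical Service data.

```python
# Sketch only: fit a VAR and inspect the impulse response of the lending rate
# to a food-inflation shock, on simulated placeholder series.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
n = 52                                              # quarterly observations, 2006-2018
food_inflation = np.cumsum(rng.normal(size=n)) + 10
lending_rate = 0.3 * food_inflation + rng.normal(size=n) + 25

data = pd.DataFrame({"food_inflation": food_inflation,
                     "agric_lending_rate": lending_rate})

model = VAR(data)
results = model.fit(maxlags=4, ic="aic")            # lag order chosen by AIC
irf = results.irf(8)                                # responses over 8 quarters

# Response of the lending rate to a one-s.d. shock in food inflation
print(irf.irfs[:, data.columns.get_loc("agric_lending_rate"),
               data.columns.get_loc("food_inflation")])
```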

Keywords: food inflation, agriculture, lending rate, vector autoregressive, Ghana

Procedia PDF Downloads 133