Search results for: curve feature
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2510

2390 Effect of Fuel Injection Discharge Curve and Injection Pressure on Upgrading Power and Combustion Parameters in HD Diesel Engine with CFD Simulation

Authors: Saeed Chamehsara, Seyed Mostafa Mirsalim, Mehdi Tajdari

Abstract:

In this study, the combined effect of the fuel injection discharge curve and the injection pressure on upgrading the power of a heavy-duty diesel engine is investigated by simulating the combustion process in AVL FIRE software. The fuel injection discharge curve was changed from semi-triangular to rectangular, which is typical of common-rail fuel injection systems, and the injection pressure was varied with respect to the amount of injected fuel and the nozzle hole diameter. The injection pressure was calculated with an experimental equation developed for heavy-duty diesel engines with common-rail fuel injection systems. Power upgrades at injection pressures of 1000 and 2000 bar are discussed. For 1000 bar injection pressure with 188 mg of injected fuel and a 3 mm nozzle hole diameter, compared with the baseline case (a semi-triangular discharge curve with 139 mg of injected fuel and a 3 mm nozzle hole diameter), the power increase is about 19% with no notable change in cylinder pressure, while NOx and soot emissions decrease by about 30% and 6%, respectively. For 2000 bar injection pressure, with 196 mg of injected fuel and a 2.6 mm nozzle hole diameter, the power increase is about 22% at essentially unchanged cylinder pressure, and NOx and soot emissions decrease by 36% and 20%, respectively.

Keywords: CFD simulation, HD diesel engine, upgrading power, injection pressure, fuel injection discharge curve, combustion process

Procedia PDF Downloads 491
2389 Multi-Granularity Feature Extraction and Optimization for Pathological Speech Intelligibility Evaluation

Authors: Chunying Fang, Haifeng Li, Lin Ma, Mancai Zhang

Abstract:

Speech intelligibility assessment is an important measure for evaluating the functional outcomes of surgical and non-surgical treatment, speech therapy, and rehabilitation, and the assessment of pathological speech plays an important role in assisting experts. Pathological speech is usually non-stationary and mutational. In this paper, we describe a multi-granularity combined feature scheme that is optimized by a hierarchical visual method. First, pathological features at different granularity levels are extracted: a basic acoustic feature set (BAFS), local spectral characteristics based on Mel s-transform cepstrum coefficients (MSCC), and nonlinear dynamic characteristics based on chaotic analysis. Then, radar charts and the F-score are used to optimize the features by hierarchical visual fusion, reducing the feature set from 526 to 96 dimensions. The experimental results show that the new features, classified with a support vector machine (SVM), achieve the best performance, with a recognition rate of 84.4% on the NKI-CCRT corpus. The proposed method is thus shown to be effective and reliable for pathological speech intelligibility evaluation.
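
As a rough illustration of the F-score ranking step described in the abstract, the sketch below computes the two-class Fisher F-score for each feature and keeps the top-ranked ones. The synthetic data, the number of retained features, and the helper name f_scores are assumptions for illustration, not the authors' code.

```python
import numpy as np

def f_scores(X, y):
    """Two-class F-score for each column of X (higher = more discriminative)."""
    pos, neg = X[y == 1], X[y == 0]
    mean_all, mean_pos, mean_neg = X.mean(0), pos.mean(0), neg.mean(0)
    numer = (mean_pos - mean_all) ** 2 + (mean_neg - mean_all) ** 2
    denom = pos.var(0, ddof=1) + neg.var(0, ddof=1)
    return numer / (denom + 1e-12)          # avoid division by zero

# toy data standing in for the 526-dimensional pathological-speech features
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 526))
y = rng.integers(0, 2, size=200)

scores = f_scores(X, y)
top_idx = np.argsort(scores)[::-1][:96]     # keep the 96 best-ranked features
X_reduced = X[:, top_idx]
print(X_reduced.shape)                      # (200, 96)
```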

Keywords: pathological speech, multi-granularity feature, MSCC (Mel s-transform cepstrum coefficients), F-score, radar chart

Procedia PDF Downloads 261
2388 Kernel-Based Double Nearest Proportion Feature Extraction for Hyperspectral Image Classification

Authors: Hung-Sheng Lin, Cheng-Hsuan Li

Abstract:

Over the past few years, kernel-based algorithms have been widely used to extend linear feature extraction methods such as principal component analysis (PCA), linear discriminant analysis (LDA), and nonparametric weighted feature extraction (NWFE) to their nonlinear versions: kernel principal component analysis (KPCA), generalized discriminant analysis (GDA), and kernel nonparametric weighted feature extraction (KNWFE), respectively. These nonlinear feature extraction methods can detect nonlinear directions with the largest nonlinear variance or the largest class separability based on the given kernel function, and they have been applied to improve target detection and image classification for hyperspectral images. Double nearest proportion feature extraction (DNP) can effectively reduce the overlap effect and performs well in hyperspectral image classification. The DNP structure is an extension of the k-nearest neighbor technique: for each sample, there are two corresponding nearest proportions of samples, the self-class nearest proportion and the other-class nearest proportion. The term “nearest proportion” used here considers both local information and more global information. With these settings, the effect of overlap between the sample distributions can be reduced. The maximum likelihood estimator and the related unbiased estimator are usually not ideal estimators in high-dimensional inference problems, particularly in small-sample situations; hence, an improved estimator based on shrinkage estimation (regularization) is proposed. Based on the DNP structure, LDA is included as a special case. In this paper, the kernel method is applied to extend DNP to kernel-based DNP (KDNP). In addition to the advantages of DNP, KDNP surpasses DNP in the experimental results. According to experiments on real hyperspectral image data sets, the classification performance of KDNP is better than that of PCA, LDA, NWFE, and their kernel versions, KPCA, GDA, and KNWFE.
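
For readers unfamiliar with the kernel trick discussed above, the sketch below shows how a nonlinear feature extractor is typically applied to pixel spectra before classification; it uses scikit-learn's KernelPCA as a stand-in, since the authors' KDNP implementation is not reproduced here, and the data and parameter values are placeholders.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# placeholder hyperspectral data: 500 pixels x 200 spectral bands, 5 classes
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 200))
y = rng.integers(0, 5, size=500)

# nonlinear feature extraction with an RBF kernel, followed by a simple classifier
pipe = make_pipeline(
    KernelPCA(n_components=20, kernel="rbf", gamma=1e-3),
    KNeighborsClassifier(n_neighbors=3),
)
print(cross_val_score(pipe, X, y, cv=5).mean())
```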

Keywords: feature extraction, kernel method, double nearest proportion feature extraction, kernel double nearest proportion feature extraction

Procedia PDF Downloads 307
2387 Fingerprint Image Encryption Using a 2D Chaotic Map and Elliptic Curve Cryptography

Authors: D. M. S. Bandara, Yunqi Lei, Ye Luo

Abstract:

Fingerprints are suitable as long-term markers of human identity since they provide detailed and unique individual features that are difficult to alter and durable over a lifetime. In this paper, we propose an algorithm to encrypt and decrypt fingerprint images using a specially designed Elliptic Curve Cryptography (ECC) procedure based on block ciphers. In addition, to increase the confusion effect of the fingerprint encryption, we also utilize a chaos-based method, the Arnold Cat Map (ACM), for 2D scrambling of pixel locations. Various efficiency and security analyses are carried out experimentally. As a result, we demonstrate that the proposed fingerprint encryption/decryption algorithm is advantageous in several aspects, including efficiency, security, and flexibility. In particular, using this algorithm, we achieve a margin of about 0.1% in the Number of Pixel Changing Rate (NPCR) test compared with state-of-the-art performance.
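
The Arnold cat map scrambling referred to above has a simple closed form: a pixel at (x, y) in an N x N image moves to ((x + y) mod N, (x + 2y) mod N). A minimal sketch applying it for a chosen number of iterations is given below; the iteration count and image are illustrative, and the ECC block-cipher stage of the paper is not reproduced.

```python
import numpy as np

def arnold_cat_map(img, iterations=1):
    """Scramble pixel positions of a square image with the Arnold cat map."""
    n = img.shape[0]
    assert img.shape[0] == img.shape[1], "ACM is defined on square images"
    y, x = np.indices((n, n))
    out = img.copy()
    for _ in range(iterations):
        new_x = (x + y) % n
        new_y = (x + 2 * y) % n
        scrambled = np.empty_like(out)
        scrambled[new_y, new_x] = out[y, x]   # move each pixel to its new position
        out = scrambled
    return out

fingerprint = np.arange(256 * 256, dtype=np.uint8).reshape(256, 256)  # stand-in image
scrambled = arnold_cat_map(fingerprint, iterations=10)
```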

Keywords: arnold cat map, biometric encryption, block cipher, elliptic curve cryptography, fingerprint encryption, Koblitz’s encoding

Procedia PDF Downloads 175
2386 Object Tracking in Motion Blurred Images with Adaptive Mean Shift and Wavelet Feature

Authors: Iman Iraei, Mina Sharifi

Abstract:

A method for object tracking in motion-blurred images is proposed in this article, and we show that object tracking can be improved with this approach. The mean shift algorithm is used as the main tracker, but mean shift alone cannot track the selected object accurately in blurred scenes. To improve the tracking accuracy, the wavelet transform is therefore used. We employ a feature called blur extent, computed with the Haar wavelet, which helps obtain better tracking results. This feature can be viewed from two angles: it indicates whether an image is blurred at all and, if so, to what extent. The feature modifies the covariance matrix of the mean shift algorithm and thereby improves tracking performance. The method concentrates mostly on the motion blur parameters. The results demonstrate the ability of the method to track more accurately.

Keywords: mean shift, object tracking, blur extent, wavelet transform, motion blur

Procedia PDF Downloads 184
2385 Automated Feature Detection and Matching Algorithms for Breast IR Sequence Images

Authors: Chia-Yen Lee, Hao-Jen Wang, Jhih-Hao Lai

Abstract:

In recent years, infrared (IR) imaging has been considered a potential tool for assessing the efficacy of chemotherapy and for early detection of breast cancer. Regions of tumor growth with a high metabolic rate and angiogenesis lead to elevated local temperatures, and observing differences between heat maps over the long term helps assess the growth of breast cancer cells and detect breast cancer earlier; multi-time infrared image alignment is a necessary step in this process. Detecting and matching representative feature points is essential for good image registration and quantitative analysis. However, there are no clear boundaries in the infrared images, the subject's posture differs between shots, adhesive markers cannot remain on the body surface for a very long period, and anatomic fiducial markers are hard to find on the body surface. In other words, it is difficult to detect and match features in an IR image sequence. In this study, automated feature detection and matching algorithms are developed for two types of automatic feature points (vascular branch points and a modified Harris corner). Preliminary results show that the proposed method identifies representative feature points on the IR breast images with 98% accuracy and matches them with 93% accuracy.
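
As a minimal illustration of the corner-based half of the pipeline, the sketch below runs the standard Harris corner detector on a grayscale image with OpenCV; the image path and the threshold are placeholders, and the authors' vascular-branch-point detector and their specific modification of Harris are not reproduced.

```python
import cv2
import numpy as np

img = cv2.imread("breast_ir_frame.png", cv2.IMREAD_GRAYSCALE)  # placeholder path
gray = np.float32(img)

# Harris corner response: 2x2 gradient blocks, 3x3 Sobel aperture, k = 0.04
response = cv2.cornerHarris(gray, blockSize=2, ksize=3, k=0.04)

# keep the strongest responses as candidate feature points
threshold = 0.01 * response.max()
ys, xs = np.where(response > threshold)
print(f"{len(xs)} candidate corner points")
```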

Keywords: Harris corner, infrared image, feature detection, registration, matching

Procedia PDF Downloads 284
2384 A Robust Digital Image Watermarking Against Geometrical Attack Based on Hybrid Scheme

Authors: M. Samadzadeh Mahabadi, J. Shanbehzadeh

Abstract:

This paper presents a hybrid digital image watermarking scheme that is robust against a variety of attacks and geometric distortions. The image content is represented by important feature points obtained with an image-texture-based adaptive Harris corner detector. These feature points are extracted from the LL2 subband of the 2-D discrete wavelet transform using the Harris-Laplacian detector. We calculate the Fourier transform of circular regions around these points; the amplitude of this transform is rotation invariant. The experimental results demonstrate the robustness of the proposed method against geometric distortions and various common image processing operations such as JPEG compression, colour reduction, Gaussian filtering, median filtering, and rotation.

Keywords: digital watermarking, geometric distortions, geometrical attack, Harris Laplace, important feature points, rotation, scale invariant feature

Procedia PDF Downloads 478
2383 SIFT and Perceptual Zoning Applied to CBIR Systems

Authors: Simone B. K. Aires, Cinthia O. de A. Freitas, Luiz E. S. Oliveira

Abstract:

This paper contributes to CBIR systems applied to trademark retrieval. The proposed model incorporates aspects of the visual perception of shapes by means of a feature extractor associated with a non-symmetrical perceptual zoning mechanism based on the principles of Gestalt. The feature set was built using the Scale Invariant Feature Transform (SIFT). We carried out experiments using four different zoning strategies (Z = 4, 5H, 5V, 7) for matching and retrieval tasks. Our proposed method achieved a normalized recall (Rn) equal to 0.84. The experiments show that non-symmetrical zoning can be considered a tool for building more reliable trademark retrieval systems.
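
For reference, the sketch below shows the usual way SIFT descriptors are extracted and matched with OpenCV; the image paths and the ratio-test threshold are placeholders, and the Gestalt-based perceptual zoning of the paper is not implemented here.

```python
import cv2

img1 = cv2.imread("trademark_query.png", cv2.IMREAD_GRAYSCALE)      # placeholder paths
img2 = cv2.imread("trademark_candidate.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# brute-force matching with Lowe's ratio test
matcher = cv2.BFMatcher()
matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]
print(f"{len(good)} good matches")
```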

Keywords: CBIR, Gestalt, matching, non-symmetrical zoning, SIFT

Procedia PDF Downloads 286
2382 [Keynote Speech]: Feature Selection and Predictive Modeling of Housing Data Using Random Forest

Authors: Bharatendra Rai

Abstract:

Predictive data analysis and modeling involving machine learning techniques becomes challenging in the presence of too many explanatory variables or features. Too many features in machine learning are known not only to slow down algorithms but also to decrease model prediction accuracy. This study involves a housing dataset with 79 quantitative and qualitative features that describe various aspects people consider when buying a new house. The Boruta algorithm, which supports feature selection using a wrapper approach built around random forest, is used in this study. This feature selection process leads to 49 confirmed features, which are then used for developing predictive random forest models. The study also explores five different data partitioning ratios, and their impact on model accuracy is captured using the coefficient of determination (r-square) and root mean square error (RMSE).
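
A minimal sketch of Boruta-style feature selection followed by a random forest fit is shown below, using the third-party boruta_py package; the synthetic data and parameter values are placeholders, not the housing data or settings used in the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from boruta import BorutaPy

# placeholder data standing in for the 79-feature housing dataset
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 79))
y = X[:, 0] * 3 + X[:, 1] - X[:, 2] + rng.normal(size=500)

rf = RandomForestRegressor(n_estimators=200, n_jobs=-1, random_state=42)
selector = BorutaPy(rf, n_estimators="auto", random_state=42)
selector.fit(X, y)

X_selected = X[:, selector.support_]          # confirmed features only
rf.fit(X_selected, y)
print(f"{selector.support_.sum()} features confirmed")
```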

Keywords: housing data, feature selection, random forest, Boruta algorithm, root mean square error

Procedia PDF Downloads 290
2381 A Hybrid Feature Selection Algorithm with Neural Network for Software Fault Prediction

Authors: Khalaf Khatatneh, Nabeel Al-Milli, Amjad Hudaib, Monther Ali Tarawneh

Abstract:

Software fault prediction identifies potential faults in software modules during the development process. In this paper, we present a novel approach to software fault prediction that combines a feedforward neural network with particle swarm optimization (PSO). The PSO algorithm is employed as a feature selection technique to identify the most relevant metrics as inputs to the neural network, which enhances the quality of feature selection and subsequently improves the performance of the neural network model. In comprehensive experiments on software fault prediction datasets, the proposed hybrid approach achieves better results, outperforming traditional classification methods. The integration of PSO-based feature selection with the neural network enables the identification of critical metrics that provide more accurate fault prediction. The results show the effectiveness of the proposed approach and its potential for reducing development costs and effort by detecting faults early in the software development lifecycle. Further research and validation on diverse datasets will help solidify the practical applicability of the new approach in real-world software engineering scenarios.
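
One common way to couple PSO with a neural network for feature selection is to let each particle hold a continuous position per feature, threshold it at 0.5 to obtain a feature mask, and score the mask by cross-validated accuracy of the network. The sketch below follows that pattern; all data, swarm sizes, and coefficients are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 21))            # placeholder software metrics
y = rng.integers(0, 2, size=300)          # faulty / not faulty

def fitness(mask):
    if mask.sum() == 0:
        return 0.0
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=300, random_state=0)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

n_particles, n_features, n_iter = 8, X.shape[1], 10
pos = rng.random((n_particles, n_features))          # continuous positions in [0, 1]
vel = np.zeros_like(pos)
pbest, pbest_fit = pos.copy(), np.array([fitness(p > 0.5) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 1.0)
    fit = np.array([fitness(p > 0.5) for p in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[pbest_fit.argmax()].copy()

selected = gbest > 0.5                    # final feature mask fed to the network
print(f"selected {selected.sum()} of {n_features} metrics")
```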

Keywords: feature selection, neural network, particle swarm optimization, software fault prediction

Procedia PDF Downloads 63
2380 The Stage and Cause of Regional Industrial Specialization Evolution in China

Authors: Cheng Wen, Zhang Jianhua

Abstract:

This paper aims to probe into the general rules of industrial specialization or diversification in a region during its process of economic growth, and the specific reasons for the differences in industrial specialization across the eastern, central, and western regions of China. It is found that changes in regional industrial specialization in China, as in most countries in the world, follow a U-shaped curve: the regional industrial structure diversifies at first, and when per capita income exceeds a certain level, the distribution of economic resources in the region becomes concentrated again. From the perspective of rising total factor productivity and falling transaction costs in the process of economic development, this paper proposes a theoretical model to explain the U-shaped curve. Through an empirical test on China's provincial panel data, the paper explains the factors that cause the unequal development of industrial specialization across the eastern, central, and western regions of China.

Keywords: u-shaped curve, regional industrial specialization, technological progress, transaction costs

Procedia PDF Downloads 284
2379 Optimized and Secured Digital Watermarking Using Fuzzy Entropy, Bezier Curve and Visual Cryptography

Authors: R. Rama Kishore, Sunesh

Abstract:

Recent developments in the usage of the internet for different purposes create a great threat to the copyright protection of digital images, and digital watermarking can be used to address the problem. This paper presents a detailed review of different watermarking techniques and the latest trends in secured, robust, and imperceptible watermarking. It also discusses the optimization techniques used in the watermarking field to improve the robustness and imperceptibility of the methods, and the measures used to evaluate the performance of a watermarking algorithm. Finally, this paper proposes a watermarking algorithm that uses (2, 2)-share visual cryptography and a Bezier curve based scheme to improve the security of the watermark. The proposed method uses a fractional transformation to improve the robustness of the copyright protection, and the algorithm is optimized using fuzzy entropy for better results.
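
Since the method relies on Bezier curves, the sketch below evaluates a cubic Bezier curve with de Casteljau's algorithm, the standard numerically stable way to compute curve points; the control points are arbitrary examples, and the watermark-embedding logic of the paper is not shown.

```python
import numpy as np

def de_casteljau(control_points, t):
    """Evaluate a Bezier curve at parameter t (0 <= t <= 1)."""
    pts = np.asarray(control_points, dtype=float)
    while len(pts) > 1:
        pts = (1 - t) * pts[:-1] + t * pts[1:]   # repeated linear interpolation
    return pts[0]

# arbitrary cubic control points, e.g. positions used to locate watermark bits
ctrl = [(0, 0), (1, 3), (3, 3), (4, 0)]
curve = np.array([de_casteljau(ctrl, t) for t in np.linspace(0, 1, 50)])
print(curve[:3])
```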

Keywords: digital watermarking, fractional transform, visual cryptography, Bezier curve, fuzzy entropy

Procedia PDF Downloads 337
2378 Phillips Curve Estimation in an Emerging Economy: Evidence from Sub-National Data of Indonesia

Authors: Harry Aginta

Abstract:

Using the Phillips curve framework, this paper seeks new empirical evidence on the relationship between inflation and output in a major emerging economy. By exploiting sub-national data, the contribution of this paper is threefold. First, it resolves the issue of using on-target national inflation rates, which potentially weakens the inflation-output nexus; this is very relevant for Indonesia, as its central bank has been adopting an inflation targeting framework based on national consumer price index (CPI) inflation. Second, the study tests the relevance of the mining sector in output gap estimation; controlling for the mining sector is important to account for the effects of mining regulation and the nominal effects of coal prices on real economic activity. Third, the paper applies panel econometric methods that incorporate regional variation, which helps improve the model estimation. The results confirm the strong presence of a Phillips curve in Indonesia. A positive output gap, reflecting excess demand, pushes up inflation. In addition, the elasticity with respect to the output gap is higher when the mining sector is excluded from the output gap estimation. Besides inflation adaptation, the dynamics of the exchange rate and international commodity prices are also found to affect inflation significantly. The results are robust to alternative measurements of the output gap.

Keywords: Phillips curve, inflation, Indonesia, panel data

Procedia PDF Downloads 99
2377 Unsupervised Learning of Spatiotemporally Coherent Metrics

Authors: Ross Goroshin, Joan Bruna, Jonathan Tompson, David Eigen, Yann LeCun

Abstract:

Current state-of-the-art classification and detection algorithms rely on supervised training. In this work we study unsupervised feature learning in the context of temporally coherent video data. We focus on feature learning from unlabeled video data, using the assumption that adjacent video frames contain semantically similar information. This assumption is exploited to train a convolutional pooling auto-encoder regularized by slowness and sparsity. We establish a connection between slow feature learning and metric learning and show that the trained encoder can be used to define a more temporally and semantically coherent metric.

Keywords: machine learning, pattern clustering, pooling, classification

Procedia PDF Downloads 424
2376 Thermoluminescence Characteristic of Nanocrystalline BaSO4 Doped with Europium

Authors: Kanika S. Raheja, A. Pandey, Shaila Bahl, Pratik Kumar, S. P. Lochab

Abstract:

This paper studies a BaSO4 nanophosphor doped with europium, in which mainly the concentration of the rare earth impurity Eu (0.05, 0.1, 0.2, 0.5, and 1 mol%) has been varied. A comparative study of the thermoluminescence (TL) properties of the nanophosphor has also been carried out against a well-known standard dosimetry material, TLD-100. First, a series of samples was prepared by the chemical co-precipitation method, and the whole set was compared to the standard material (TLD-100) for TL sensitivity. BaSO4:Eu (0.2 mol%) showed the highest sensitivity of the set and, compared to the standard TLD-100, showed surprisingly high sensitivity over a large range of doses. The TL response curve of all prepared samples has also been studied over a wide range of gamma doses, from 10 Gy to 2 kGy. Almost all BaSO4:Eu samples showed remarkable linearity over a broad dose range, which is a characteristic feature of a fine TL dosimeter; the response remained linear even beyond 1 kGy. Thus, the dopant concentration of the nanophosphor has been successfully optimized for the highest TL sensitivity. The comparative study further revealed that the optimized sample shows better TL sensitivity and a linear response curve over a wider range of gamma (Co-60) doses than the standard TLD-100, which makes the optimized BaSO4:Eu promising as an efficient gamma radiation dosimeter. Lastly, the phosphor has been optimized for its annealing temperature to acquire the best results, and its fading and reusability properties have been studied.

Keywords: gamma radiation, nanoparticles, radiation dosimetry, thermoluminescence

Procedia PDF Downloads 410
2375 Cotton Transplantation as a Practice to Escape Infection with Some Soil-Borne Pathogens

Authors: E. M. H. Maggie, M. N. A. Nazmey, M. A. Abdel-Sattar, S. A. Saied

Abstract:

A successful trial of transplanting cotton is reported. Seedlings were grown in trays for 4-5 weeks in an easily prepared supporting medium such as peat moss or similar plant waste, and then carefully transplanted into the permanent field with the root system kept as intact as possible. The practice reduced the incidence of damping-off and allowed full winter crop revenues. Further work is needed to evaluate parameters such as the growth curve, flowering curve, and yield on an economic basis.

Keywords: cotton, transplanting cotton, damping-off diseases, environment sciences

Procedia PDF Downloads 337
2374 Detecting Potential Biomarkers for Ulcerative Colitis Using Hybrid Feature Selection

Authors: Mustafa Alshawaqfeh, Bilal Wajidy, Echin Serpedin, Jan Suchodolski

Abstract:

Inflammatory bowel disease (IBD) is a disease of the colon with characteristic inflammation. Clinically, IBD is detected using laboratory tests (blood and stool), radiology tests (CT and MRI imaging), capsule endoscopy, and endoscopy. There are two variants of IBD, referred to as ulcerative colitis (UC) and Crohn's disease. This study employs a hybrid feature selection method that combines a correlation-based variable ranking approach with an exhaustive-search wrapper method in order to find potential biomarkers for UC. The proposed biomarkers showed strong discriminatory power, identifying them as possible ingredients for UC therapeutics.
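
A minimal sketch of the filter-plus-wrapper idea described above: features are first ranked by absolute correlation with the class label, and an exhaustive search over small subsets of the top-ranked features then picks the combination with the best cross-validated accuracy. The classifier, subset size, and synthetic data are assumptions for illustration only.

```python
import numpy as np
from itertools import combinations
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
X = rng.normal(size=(60, 100))          # placeholder abundance features
y = rng.integers(0, 2, size=60)         # UC vs control labels (synthetic)

# filter stage: rank features by absolute Pearson correlation with the label
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
top = np.argsort(corr)[::-1][:10]       # keep the 10 best-ranked candidates

# wrapper stage: exhaustive search over all 3-feature subsets of the candidates
best_subset, best_score = None, -np.inf
for subset in combinations(top, 3):
    score = cross_val_score(SVC(), X[:, list(subset)], y, cv=5).mean()
    if score > best_score:
        best_subset, best_score = subset, score
print(best_subset, round(best_score, 3))
```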

Keywords: ulcerative colitis, biomarker detection, feature selection, inflammatory bowel disease (IBD)

Procedia PDF Downloads 367
2373 A Deletion-Cost Based Fast Compression Algorithm for Linear Vector Data

Authors: Qiuxiao Chen, Yan Hou, Ning Wu

Abstract:

The classic Douglas-Peucker Algorithm (DPA) has deficiencies such as a high risk of deleting key nodes by mistake, high complexity, and relatively slow execution; therefore, a new Deletion-Cost Based Compression Algorithm (DCA) for linear vector data is proposed. For each curve (the basic element of linear vector data), the deletion costs of all its middle nodes are calculated, and the minimum deletion cost is compared with a pre-defined threshold. If the minimum cost is greater than or equal to the threshold, all remaining nodes are kept and the curve's compression is finished. Otherwise, the node with the minimal deletion cost is deleted, the deletion costs of its two neighbors are updated, and the same loop is repeated on the compressed curve until termination. Comparative experiments on different types of linear vector data compare DPA and DCA in terms of compression quality and computing efficiency. The experimental results show that DCA outperforms DPA in both compression accuracy and execution efficiency.
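
The loop above translates almost directly into code. The sketch below uses the effective triangle area of a node with its two neighbours as the deletion cost, which is one natural choice since the paper does not spell out its cost function here, and it recomputes all costs each pass for brevity rather than updating only the two neighbours.

```python
import numpy as np

def triangle_area(a, b, c):
    """Deletion cost of node b: area of the triangle formed with its neighbours."""
    return 0.5 * abs((b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1]))

def deletion_cost_compress(points, threshold):
    """Delete the middle node with the smallest cost until the minimum cost
    reaches the threshold (the two endpoints are always kept)."""
    pts = [tuple(p) for p in points]
    while len(pts) > 2:
        costs = [triangle_area(pts[i - 1], pts[i], pts[i + 1])
                 for i in range(1, len(pts) - 1)]
        i_min = int(np.argmin(costs))
        if costs[i_min] >= threshold:
            break
        del pts[i_min + 1]     # +1 because costs[] starts at the second node
    return pts

line = [(0, 0), (1, 0.1), (2, -0.05), (3, 2), (4, 2.1), (5, 0)]
print(deletion_cost_compress(line, threshold=0.2))
```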

Keywords: Douglas-Peucker algorithm, linear vector data, compression, deletion cost

Procedia PDF Downloads 215
2372 Selection of Intensity Measure in Probabilistic Seismic Risk Assessment of a Turkish Railway Bridge

Authors: M. F. Yilmaz, B. Ö. Çağlayan

Abstract:

The fragility curve is an effective and commonly used tool for determining the earthquake performance of structural and nonstructural components, and it is also used to characterize the nonlinear behavior of bridges. There are many historical bridges in the Turkish railway network, and the earthquake performance of these bridges needs to be investigated. To derive a fragility curve, intensity measures (IMs) and engineering demand parameters (EDPs) must be chosen, and the relation between them must be established. In this study, a typical simply supported, riveted steel girder railway bridge is studied. Fragility curves for this bridge are derived using a two-parameter lognormal distribution. Time history analyses are performed for 60 selected real earthquake records to determine the relation between IMs and EDPs. Moreover, the efficiency, practicality, and sufficiency of three different IMs are discussed: PGA, Sa(0.2s), and Sa(1s), the IMs most commonly used for fragility curves in the literature.
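
A common way to obtain the two lognormal parameters (median theta and dispersion beta) is maximum likelihood fitting of exceedance observations from the time-history analyses, as sketched below; the IM values and exceedance flags are synthetic placeholders, not the bridge data of the study.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# synthetic results of 60 time-history analyses: IM value and whether the
# engineering demand parameter exceeded the chosen damage-state limit
rng = np.random.default_rng(3)
im = rng.uniform(0.05, 1.5, size=60)                 # e.g. PGA in g
exceeded = (rng.random(60) < norm.cdf(np.log(im / 0.6) / 0.4)).astype(int)

def neg_log_likelihood(params):
    theta, beta = params
    if theta <= 0 or beta <= 0:
        return np.inf
    p = np.clip(norm.cdf(np.log(im / theta) / beta), 1e-9, 1 - 1e-9)
    return -np.sum(exceeded * np.log(p) + (1 - exceeded) * np.log(1 - p))

res = minimize(neg_log_likelihood, x0=[0.5, 0.5], method="Nelder-Mead")
theta_hat, beta_hat = res.x
print(f"median = {theta_hat:.3f}, dispersion = {beta_hat:.3f}")
```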

Keywords: railway bridges, earthquake performance, fragility analyses, selection of intensity measures

Procedia PDF Downloads 334
2371 Combining an Optimized Closed Principal Curve-Based Method and Evolutionary Neural Network for Ultrasound Prostate Segmentation

Authors: Tao Peng, Jing Zhao, Yanqing Xu, Jing Cai

Abstract:

Due to missing or ambiguous boundaries between the prostate and neighboring structures, the presence of shadow artifacts, and the large variability in prostate shapes, ultrasound prostate segmentation is challenging. To handle these issues, this paper develops a hybrid method for ultrasound prostate segmentation that combines an optimized closed principal curve-based method with an evolutionary neural network; the former can fit curves with great curvature and generate a contour composed of line segments connected by sorted vertices, and the latter is used to express an appropriate map function (represented by the parameters of the evolutionary neural network) for generating a smooth prostate contour that matches the ground truth contour. Both qualitative and quantitative experimental results showed that our proposed method obtains accurate and robust performance.

Keywords: ultrasound prostate segmentation, optimized closed polygonal segment method, evolutionary neural network, smooth mathematical model, principal curve

Procedia PDF Downloads 171
2370 Hierarchical Piecewise Linear Representation of Time Series Data

Authors: Vineetha Bettaiah, Heggere S. Ranganath

Abstract:

This paper presents a Hierarchical Piecewise Linear Approximation (HPLA) for the representation of time series data in which the time series is treated as a curve in the time-amplitude image space. The curve is partitioned into segments by choosing perceptually important points as break points. Each segment between adjacent break points is recursively partitioned into two segments at the best point or midpoint until the error between the approximating line and the original curve becomes less than a pre-specified threshold. The HPLA representation achieves dimensionality reduction while preserving prominent local features and the general shape of the time series. The representation permits coarse-to-fine processing at different levels of detail, allows flexible definitions of similarity based on mathematical measures or general time series shape, and supports time series data mining operations including query by content, clustering, and classification based on whole or subsequence similarity.
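
The recursive splitting step can be illustrated with the classic top-down segmentation sketch below, which splits each segment at its point of maximum deviation from the connecting straight line until the error falls below a threshold. The error measure and threshold are assumptions for illustration, and the perceptually-important-point selection of the paper is not reproduced.

```python
import numpy as np

def segment_error(y, lo, hi):
    """Max deviation between y[lo:hi+1] and the straight line joining its ends."""
    xs = np.arange(lo, hi + 1)
    line = np.interp(xs, [lo, hi], [y[lo], y[hi]])
    return np.max(np.abs(y[lo:hi + 1] - line))

def split(y, lo, hi, tol, breaks):
    """Recursively split [lo, hi] at the point of largest deviation."""
    if hi - lo < 2 or segment_error(y, lo, hi) <= tol:
        return
    xs = np.arange(lo, hi + 1)
    line = np.interp(xs, [lo, hi], [y[lo], y[hi]])
    best = lo + int(np.argmax(np.abs(y[lo:hi + 1] - line)))
    best = min(max(best, lo + 1), hi - 1)      # keep the break strictly inside
    breaks.add(best)
    split(y, lo, best, tol, breaks)
    split(y, best, hi, tol, breaks)

t = np.linspace(0, 4 * np.pi, 200)
y = np.sin(t) + 0.05 * np.random.default_rng(0).normal(size=t.size)
breaks = {0, len(y) - 1}
split(y, 0, len(y) - 1, tol=0.1, breaks=breaks)
print(sorted(breaks))     # indices of the piecewise linear break points
```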

Keywords: data mining, dimensionality reduction, piecewise linear representation, time series representation

Procedia PDF Downloads 250
2369 Evaluating Models Through Feature Selection Methods Using Data Driven Approach

Authors: Shital Patil, Surendra Bhosale

Abstract:

Cardiac diseases are the leading cause of mortality and morbidity in the world and, over recent decades, have emerged as the most life-threatening disorder globally, accounting for a large number of deaths. Machine learning and artificial intelligence have been playing a key role in predicting heart disease, and a relevant set of features can be very helpful in predicting the disease accurately. In this study, we propose a comparative analysis of four different feature selection methods and evaluate their performance with both the raw (unbalanced) and sampled (balanced) dataset. The publicly available Z-Alizadeh Sani dataset has been used for this study. Four feature selection methods are used: data analysis, minimum Redundancy maximum Relevance (mRMR), Recursive Feature Elimination (RFE), and chi-squared. These methods are tested with eight different classification models to get the best accuracy possible. Using the balanced and unbalanced datasets, the study shows promising results in terms of various performance metrics for accurately predicting heart disease. With the raw data, the proposed method obtains a maximum AUC of 100%, a maximum F1 score of 94%, a maximum recall of 98%, and a maximum precision of 93%; with the balanced dataset, it obtains a maximum AUC of 100%, an F1 score of 95%, a maximum recall of 95%, and a maximum precision of 97%.
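
Two of the selection methods named above are available directly in scikit-learn, and SMOTE balancing is available in imbalanced-learn; the sketch below chains them for a single classifier. The synthetic data, the number of selected features, and the classifier are placeholders rather than the study's configuration.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2, RFE
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from imblearn.over_sampling import SMOTE

rng = np.random.default_rng(5)
X = np.abs(rng.normal(size=(303, 54)))     # chi2 requires non-negative features
y = (rng.random(303) < 0.3).astype(int)    # imbalanced labels, as in the raw data

# balance the classes with SMOTE before selection/classification
X_bal, y_bal = SMOTE(random_state=5).fit_resample(X, y)

# filter method: chi-squared scores
X_chi2 = SelectKBest(chi2, k=20).fit_transform(X_bal, y_bal)

# wrapper method: recursive feature elimination around a linear model
rfe = RFE(LogisticRegression(max_iter=1000), n_features_to_select=20)
X_rfe = rfe.fit_transform(X_bal, y_bal)

for name, Xs in [("chi2", X_chi2), ("RFE", X_rfe)]:
    acc = cross_val_score(LogisticRegression(max_iter=1000), Xs, y_bal, cv=5).mean()
    print(name, round(acc, 3))
```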

Keywords: cardio vascular diseases, machine learning, feature selection, SMOTE

Procedia PDF Downloads 89
2368 A Decision Support System to Detect the Lumbar Disc Disease on the Basis of Clinical MRI

Authors: Yavuz Unal, Kemal Polat, H. Erdinc Kocer

Abstract:

In this study, a decision support system comprising three stages is proposed to detect disc abnormalities of the lumbar region. In the first stage, feature extraction, T2-weighted sagittal and axial magnetic resonance images (MRI) were taken from 55 people, and 27 appearance and shape features were acquired from both the sagittal and transverse images. In the second stage, the feature weighting process, k-means clustering based feature weighting (KMCBFW), proposed by Gunes et al., was applied. Finally, in the third stage, the classification process, classifier algorithms including the multi-layer perceptron (MLP neural network), support vector machine (SVM), naive Bayes, and decision tree were used to classify whether the subject has lumbar disc disease or not. To test the performance of the proposed method, the classification accuracy (%), sensitivity, specificity, precision, recall, F-measure, kappa value, and computation times were used. The best hybrid model is the combination of k-means clustering based feature weighting and the decision tree for detecting lumbar disc disease from both sagittal and axial MR images.

Keywords: lumbar disc abnormality, lumbar MRI, lumbar spine, hybrid models, hybrid features, k-means clustering based feature weighting

Procedia PDF Downloads 499
2367 Metaphor Institutionalization as Phase Transition: Case Studies of Chinese Metaphors

Authors: Xuri Tang, Ting Pan

Abstract:

Metaphor institutionalization refers to the propagation of a metaphor that leads to its acceptance in a speech community as a norm of the language. Such knowledge is important both to theoretical studies of metaphor and to practical disciplines such as lexicography and language generation. This paper reports an empirical study of the institutionalization of 14 Chinese metaphors. It first explores the pattern of metaphor institutionalization by fitting the logistic function (or S-shaped curve) to time series data of the conventionality of the metaphors, obtained automatically from a large-scale diachronic Chinese corpus. It then reports a questionnaire-based survey on the propagation scale of each metaphor, measured as the average number of subjects who can easily understand the metaphorical expressions. The study provides two pieces of evidence supporting the hypothesis that metaphor institutionalization is a phase transition: (1) the pattern of metaphor institutionalization is an S-shaped curve, and (2) institutionalized metaphors generally do not propagate to the whole community but remain in an equilibrium state. This conclusion helps distinguish metaphor institutionalization from topicalization and other types of semantic change.
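
The S-curve fitting mentioned above is typically done with nonlinear least squares; a minimal sketch using scipy's curve_fit is given below, with synthetic conventionality scores standing in for the corpus-derived time series.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, L, k, t0):
    """Logistic (S-shaped) growth: carrying capacity L, rate k, midpoint t0."""
    return L / (1 + np.exp(-k * (t - t0)))

# synthetic yearly conventionality scores for one metaphor
years = np.arange(1990, 2015)
scores = (logistic(years, L=0.8, k=0.6, t0=2002)
          + np.random.default_rng(0).normal(0, 0.02, size=years.size))

params, _ = curve_fit(logistic, years, scores, p0=[1.0, 0.5, 2000.0])
L_hat, k_hat, t0_hat = params
print(f"L = {L_hat:.2f}, k = {k_hat:.2f}, midpoint = {t0_hat:.1f}")
```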

Keywords: metaphor institutionalization, phase transition, propagation scale, s-shaped curve

Procedia PDF Downloads 149
2366 A Novel Antenna Design for Telemedicine Applications

Authors: Amar Partap Singh Pharwaha, Shweta Rani

Abstract:

To develop a reliable and cost-effective communication platform for telemedicine applications, a novel antenna design is presented using the bacterial foraging optimization (BFO) technique. The proposed antenna geometry is achieved by etching a modified Koch curve fractal shape at the edges and a square slot at the center of the radiating element of a patch antenna. The new antenna achieves a 43.79% size reduction and better resonant characteristics than the original patch. Representative results of both simulations and numerical validations are reported in order to assess the effectiveness of the developed methodology.
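
For readers unfamiliar with the Koch geometry mentioned above, the sketch below generates the point sequence of a standard (unmodified) Koch curve by repeatedly replacing each segment with four smaller ones; the recursion depth is arbitrary, and the authors' modified Koch shape and the BFO optimization are not reproduced.

```python
import numpy as np

def koch_segment(p1, p2, depth):
    """Return the points of a Koch curve between p1 and p2 (excluding p2)."""
    if depth == 0:
        return [np.asarray(p1, float)]
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    a = p1 + (p2 - p1) / 3
    b = p1 + 2 * (p2 - p1) / 3
    d = b - a
    # apex of the equilateral bump, obtained by rotating (b - a) by 60 degrees
    apex = a + np.array([d[0] * 0.5 - d[1] * np.sqrt(3) / 2,
                         d[0] * np.sqrt(3) / 2 + d[1] * 0.5])
    pts = []
    for q1, q2 in [(p1, a), (a, apex), (apex, b), (b, p2)]:
        pts.extend(koch_segment(q1, q2, depth - 1))
    return pts

points = koch_segment((0, 0), (1, 0), depth=3) + [np.array([1.0, 0.0])]
print(len(points))   # 4**3 + 1 = 65 points
```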

Keywords: BFO, electrical permittivity, fractals, Koch curve

Procedia PDF Downloads 484
2365 Optimal Feature Extraction Dimension in Finger Vein Recognition Using Kernel Principal Component Analysis

Authors: Amir Hajian, Sepehr Damavandinejadmonfared

Abstract:

In this paper, the issue of dimensionality reduction in finger vein recognition systems is investigated using kernel principal component analysis (KPCA). One aspect of KPCA is finding the most appropriate kernel function for finger vein recognition, as there are several kernel functions that can be used within PCA-based algorithms. In this paper, however, another side of PCA-based algorithms, particularly KPCA, is investigated: the dimension of the feature vector. This is important especially in real-world applications of such algorithms, where a fixed feature vector dimension has to be set to reduce the dimension of the input and output data and extract the features from them; a classifier is then applied to classify the data and make the final decision. We analyze KPCA with polynomial, Gaussian, and Laplacian kernels in detail and investigate the optimal feature extraction dimension in finger vein recognition using KPCA.

Keywords: biometrics, finger vein recognition, principal component analysis (PCA), kernel principal component analysis (KPCA)

Procedia PDF Downloads 341
2364 Runoff Estimation in the Khiyav River Basin by Using the SCS_ CN Model

Authors: F. Esfandyari Darabad, Z. Samadi

Abstract:

The volume of runoff caused by rainfall in a river basin has attracted researchers in the field of water resources management. In this study, hydrological data such as rainfall and discharge for the Khiyav river basin of Meshkin city in northwestern Iran were first collected, and then analyzed and reconstructed. The Soil Conservation Service (SCS) has developed a method for calculating runoff that is based on the curve number (CN). This research applied the model to the Khiyav river basin using GIS techniques, employing a weighted model for calculating the curve numbers so that all the factors contributing to runoff generation, such as the geometric characteristics of the basin, the basin soil characteristics, vegetation, geology, climate, and human factors, could be considered and an accurate estimation of runoff from precipitation achieved. The findings also identified the flood-prone areas at the outlet of the Khiyav river basin, revealing that the basin has a high potential for flooding.
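
For reference, the SCS-CN runoff relation used by the method is Q = (P - Ia)^2 / (P - Ia + S) for P > Ia (and Q = 0 otherwise), with the initial abstraction Ia = 0.2 S and the potential retention S = 25400/CN - 254 in millimetres. A direct translation is sketched below; the input values are illustrative, not data from the Khiyav basin.

```python
def scs_runoff(rainfall_mm, cn):
    """Direct runoff depth (mm) from the SCS curve number method."""
    s = 25400.0 / cn - 254.0          # potential maximum retention (mm)
    ia = 0.2 * s                      # initial abstraction
    if rainfall_mm <= ia:
        return 0.0
    return (rainfall_mm - ia) ** 2 / (rainfall_mm - ia + s)

# illustrative values, not measurements from the study
print(scs_runoff(rainfall_mm=60.0, cn=78))
```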

Keywords: curve number, khiyav river basin, runoff estimation, SCS

Procedia PDF Downloads 596
2363 Hand Motion Trajectory Analysis for Dynamic Hand Gestures Used in Indian Sign Language

Authors: Daleesha M. Viswanathan, Sumam Mary Idicula

Abstract:

Dynamic hand gestures are an intrinsic component of sign language communication, and extracting the spatio-temporal features of the hand gesture trajectory plays an important role in a dynamic gesture recognition system. Finding a discrete feature descriptor for the motion trajectory based on the orientation feature is the main concern of this paper. The Kalman filter algorithm and Hidden Markov Models (HMMs) are incorporated into this recognition system for hand trajectory tracking and for spatio-temporal classification, respectively.
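
A minimal constant-velocity Kalman filter for smoothing a 2D hand trajectory is sketched below; the state layout, noise levels, and measurements are illustrative assumptions, and the orientation quantization and HMM classification stages of the paper are not included.

```python
import numpy as np

dt = 1.0
F = np.array([[1, 0, dt, 0],      # state: [x, y, vx, vy]
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], float)
H = np.array([[1, 0, 0, 0],       # we only observe the (x, y) position
              [0, 1, 0, 0]], float)
Q = 0.01 * np.eye(4)              # process noise
R = 1.0 * np.eye(2)               # measurement noise

x = np.zeros(4)
P = np.eye(4)
measurements = [(0, 0), (1.1, 0.9), (2.0, 2.2), (3.1, 2.9)]   # noisy hand positions

for z in measurements:
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update
    innovation = np.asarray(z, float) - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ innovation
    P = (np.eye(4) - K @ H) @ P
    print(np.round(x[:2], 2))     # filtered position estimate
```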

Keywords: orientation features, discrete feature vector, HMM, Indian sign language

Procedia PDF Downloads 343
2362 Mechanical Model of Gypsum Board Anchors Subjected Cyclic Shear Loading

Authors: Yoshinori Kitsutaka, Fumiya Ikedo

Abstract:

In this study, mechanical models of various anchors embedded in gypsum board and subjected to cyclic shear loading were investigated. Shear tests of anchors embedded in 200 mm square gypsum boards were conducted to measure the load-displacement curves. The strength of the gypsum board was varied over three conditions, and 12 kinds of anchors ordinarily used for gypsum board anchoring were selected. The loading conditions were monotonic loading and cyclic loading, controlled by a servo-controlled hydraulic loading system to achieve accurate measurement. The fracture energy of each anchor was estimated by analyzing the consumed energy calculated from the load-displacement curve. The effects of the gypsum board strength and the anchor type on the shear properties of gypsum board anchors were clarified. A numerical model to predict the load-unload curve of the shear deformation of gypsum board anchors caused by, for example, earthquake loading was proposed, and the validity of the model was demonstrated.
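
The consumed energy of an anchor can be estimated by integrating the area under the measured load-displacement curve; a minimal numpy sketch is shown below with made-up test values in place of the actual measurements.

```python
import numpy as np

# made-up load-displacement record for one anchor (displacement in mm, load in N)
displacement = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 5.0, 8.0])
load = np.array([0.0, 150.0, 260.0, 310.0, 280.0, 180.0, 0.0])

# consumed energy = area under the curve (N*mm), here via the trapezoidal rule
energy = np.trapz(load, displacement)
print(f"fracture energy ~ {energy:.1f} N*mm")
```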

Keywords: gypsum board, anchor, shear test, cyclic loading, load-unload curve

Procedia PDF Downloads 364
2361 Identification of COVID-SARS Variants Based on Lactate Test Results

Authors: Zoltan Horvath, Dora Nagy

Abstract:

In this research, it was examined whether individual COVID variants cause differences in the lactate curves of cyclists, since the virus variants attacked different organs in the body during infection. In our tests, we used a traditional lactate step test, the results of which were compared with the values recorded before infection. The tests showed that different virus variants produce distinct lactate curves, so the variant that caused the disease can be identified from the lactate curve. This shortens the return-to-competition time, because the most suitable post-infection return protocol can be applied to each competitor.

Keywords: COVID-Sars19, lactate, virus mutation, lactate profile

Procedia PDF Downloads 41