Search results for: independent controls of multiple electromagnetic features
10815 Music Genre Classification Based on Non-Negative Matrix Factorization Features
Authors: Soyon Kim, Edward Kim
Abstract:
In order to retrieve information from the massive stream of songs in the music industry, music search by title, lyrics, artist, mood, and genre has become increasingly important. Despite the subjectivity and controversy over the definition of music genres across nations and cultures, automatic genre classification systems that facilitate the process of music categorization have been developed. Manual genre labels assigned by music producers provide the statistical ground truth for designing automatic genre classification systems. In this paper, an automatic music genre classification system utilizing non-negative matrix factorization (NMF) is proposed. Short-term characteristics of the music signal can be captured by timbre features such as the mel-frequency cepstral coefficient (MFCC), decorrelated filter bank (DFB), octave-based spectral contrast (OSC), and octave band sum (OBS). Long-term, time-varying characteristics of the music signal can be summarized with (1) statistical features such as the mean, variance, minimum, and maximum of the timbre features and (2) modulation spectrum features such as the spectral flatness measure, spectral crest measure, spectral peak, spectral valley, and spectral contrast of the timbre features. In addition to these conventional long-term feature vectors, NMF-based feature vectors are proposed for use in genre classification. In the training stage, NMF basis vectors were extracted for each genre class. The NMF features were calculated in the log spectral magnitude domain (NMF-LSM) as well as in the basic feature vector domain (NMF-BFV). For NMF-LSM, the entire full-band spectrum was used. For NMF-BFV, however, only the low-band spectrum was used, since the high-frequency modulation spectrum of the basic feature vectors did not contain information important for genre classification.
In the test stage, using the set of pre-trained NMF basis vectors, the genre classification system extracted the NMF weighting values of each genre as the NMF feature vectors. A support vector machine (SVM) was used as the classifier. The GTZAN multi-genre music database, composed of 10 genres with 100 songs per genre, was used for training and testing. To increase the reliability of the experiments, 10-fold cross validation was used. For a given input song, the extracted NMF-LSM feature vector was composed of 10 weighting values corresponding to the classification probabilities for the 10 genres. An NMF-BFV feature vector likewise had a dimensionality of 10. Combined with the basic long-term features (statistical and modulation spectrum features), the NMF features increased accuracy with only a slight increase in feature dimensionality. The conventional basic features by themselves yielded 84.0% accuracy, while the basic features with NMF-LSM and with NMF-BFV provided 85.1% and 84.2% accuracy, respectively. The basic features required a dimensionality of 460, whereas NMF-LSM and NMF-BFV each required a dimensionality of only 10. Combining the basic features, NMF-LSM, and NMF-BFV with an SVM using a radial basis function (RBF) kernel produced a significantly higher classification accuracy of 88.3% with a feature dimensionality of 480.
Keywords: mel-frequency cepstral coefficient (MFCC), music genre classification, non-negative matrix factorization (NMF), support vector machine (SVM)
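The NMF training and weight-extraction steps described above can be sketched roughly as follows (a generic Lee-Seung multiplicative-update NMF in plain NumPy; the abstract does not specify the authors' exact update rule or iteration counts, so these details are illustrative assumptions):

```python
import numpy as np

def nmf(V, rank, n_iter=200, seed=0):
    """Factor a non-negative matrix V (features x frames) as V ~= W @ H
    with Lee-Seung multiplicative updates. In the training stage, one
    such basis matrix W would be learned per genre class."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + 1e-6
    H = rng.random((rank, m)) + 1e-6
    eps = 1e-12
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

def nmf_weights(V_new, W, n_iter=200):
    """Test stage: hold the pre-trained basis W fixed and solve only for
    the weighting values H, which serve as the NMF feature vector."""
    eps = 1e-12
    H = np.full((W.shape[1], V_new.shape[1]), 0.5)
    for _ in range(n_iter):
        H *= (W.T @ V_new) / (W.T @ W @ H + eps)
    return H
```

Concatenating the per-genre weighting values for an input song yields the low-dimensional NMF feature vector that is then fed to the SVM.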
Procedia PDF Downloads 301
10814 The Effect of Metformin in Combination with Dexamethasone on the CXCR4 Level in Multiple Myeloma Cell Line
Authors: Seyede Sanaz Seyedebrahimi, Shima Rahimi, Shohreh Fakhari, Ali Jalili
Abstract:
Background: CXCR4, a chemokine receptor, plays well-known roles in various types of cancer. Several studies have investigated the role of the CXCR4 axis in multiple myeloma (MM) pathogenesis and progression. Dexamethasone, a standard treatment for multiple myeloma, has been shown to increase CXCR4 levels in multiple myeloma cell lines. Herein, we focused on the effects of metformin and dexamethasone on CXCR4 at the cellular level and on the migration rate of cell lines after exposure to the combination compared with single-agent treatment. Materials and Method: Multiple myeloma cell lines (U266 and RPMI8226) were cultured with different concentrations of metformin and dexamethasone, alone and in combination. The simultaneous combination doses were calculated with CompuSyn software. Cell-surface and mRNA expression of CXCR4 were determined using flow cytometry and quantitative reverse transcription-polymerase chain reaction (qRT-PCR), respectively. Migration ability was evaluated with the Transwell cell migration assay. Results: In agreement with previous studies, our results showed that dexamethasone up-regulated CXCR4 in a dose-dependent manner. In contrast, metformin as a single agent reduced CXCR4 expression in U266 and RPMI8226 cells at both the cell-surface and mRNA levels. Moreover, simultaneous administration of metformin and dexamethasone suppressed CXCR4 expression more strongly than metformin alone. The migration rate through the Matrigel membrane in the combination model was remarkably lower than with either single agent. Discussion: According to our findings, the combination of metformin and dexamethasone effectively inhibited dexamethasone-induced CXCR4 expression in multiple myeloma cell lines. As a result, metformin may serve as an adjunct to other chemotherapies against multiple myeloma.
However, more research is required.
Keywords: CXCR4, dexamethasone, metformin, migration, multiple myeloma
Procedia PDF Downloads 154
10813 2D Numerical Modeling of Ultrasonic Measurements in Concrete: Wave Propagation in a Multiple-Scattering Medium
Authors: T. Yu, L. Audibert, J. F. Chaix, D. Komatitsch, V. Garnier, J. M. Henault
Abstract:
Linear ultrasonic techniques play a major role in non-destructive evaluation (NDE) of civil engineering structures in concrete, since they can meet operational requirements. Interpretation of ultrasonic measurements could be improved by a better understanding of ultrasonic wave propagation in a multiple-scattering medium. This work aims to develop a 2D numerical model of ultrasonic wave propagation in a heterogeneous medium such as concrete, integrating the multiple-scattering phenomena, in the SPECFEM software. The coherent field of multiple scattering is obtained by averaging numerical wave fields, and it is used to determine the effective phase velocity and attenuation of an equivalent homogeneous medium. First, the model is applied to a single scattering element (a cylinder) in a homogeneous linear-elastic medium and validated by comparison with the analytical solution. Then, cases of multiple scattering by sets of randomly located cylinders or polygons are simulated to perform parametric studies of the influence of frequency and of scatterer size, concentration, and shape. The effective properties are also compared with the predictions of the Waterman-Truell model to verify its validity. Finally, the viscoelastic behavior of the mortar is introduced into the simulation in order to account for the dispersion and attenuation due to the porosity of the cement paste. Future steps include comparison with experimental results, interpretation of NDE measurements, and optimization of NDE parameters prior to auscultation.
Keywords: attenuation, multiple-scattering medium, numerical modeling, phase velocity, ultrasonic measurements
Procedia PDF Downloads 273
10812 Assessment of Dental Caries in Children of Age 6 and 7 Years Old in Albania
Authors: Mimoza Canga, Irene Malagnino, Ruzhdie Qafmolla, Vergjini Mulo, Gresa Baboci, Vito Antonio Malagnino
Abstract:
Background: Dental caries is the most widespread pathology of childhood. The prevalence of dental caries varies with age, gender, socioeconomic status, geographical location, nutrition habits, and oral hygiene. Objective: The objective of the present longitudinal study is to determine the prevalence of dental caries in children aged 6 and 7 years in Vlore, Albania, over a two-year period with examinations every 6 months. Materials and methods: The present study was conducted on 530 children followed for a period of 24 months, from September 2019 to September 2021. The children in the study had different economic and social backgrounds. The dental examinations were performed by dentists working at the city hospital. The study was conducted in accordance with the Declaration of Helsinki; written permissions allowing the observations were obtained, and parents had the right to withdraw their children at any time. Statistical analysis was performed using IBM SPSS Statistics 23.0. The significance level (α) was set at 0.05, and P-values and analysis of variance (ANOVA) were used to analyze the data. Results: Among the 6-year-olds, 139 children (52.3%) had dental caries and 127 (47.7%) did not, while among the 7-year-olds, 184 children (69.7%) had dental caries in the permanent molars and 80 (30.3%) did not. A statistically significant association was observed between age group and the presence of caries: 7-year-old children had a higher prevalence of dental caries (χ² = 16.934, df = 1, p-value < 0.001). The present research also found a statistically significant correlation between time period and the presence of dental caries.
Furthermore, the prevalence of dental caries was highest in the 18-24 month period (χ² = 15.318, df = 1, p-value = 0.004). Conclusion: According to the results of the present two-year study performed in Albania, with examinations every 6 months, the prevalence of dental caries was 17.4 percentage points higher among 7-year-old children than among 6-year-old children.
Keywords: age, children, dental caries, permanent molars
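The reported chi-square association between age group and caries can be checked from the counts given in the abstract (a standard Pearson chi-square for a 2x2 table without continuity correction; function and variable names are illustrative):

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic (no continuity correction) for a 2x2
    contingency table [[a, b], [c, d]] (rows = age groups,
    columns = caries / no caries)."""
    n = sum(sum(row) for row in table)
    row_tot = [sum(row) for row in table]
    col_tot = [table[0][j] + table[1][j] for j in range(2)]
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_tot[i] * col_tot[j] / n
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# Counts reported in the abstract:
# 6-year-olds: 139 with caries, 127 without; 7-year-olds: 184 with, 80 without.
caries_table = [[139, 127], [184, 80]]
```

Evaluating the function on these counts reproduces the reported statistic (χ² ≈ 16.93 with df = 1) to rounding.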
Procedia PDF Downloads 222
10811 Changes in Pulmonary Functions in Diabetes Mellitus Type 2
Authors: N. Anand, P. S. Nayyer, V. Rana, S. Verma
Abstract:
Background: Diabetes mellitus is a group of disorders characterized by hyperglycemia and associated with microvascular and macrovascular complications. Among its lesser-known complications is involvement of the respiratory system. Changes in pulmonary volume, diffusion, and the elastic properties of the lungs, as well as in the performance of the respiratory muscles, lead to a restrictive pattern in lung function. The present study aimed to determine the changes in various pulmonary function test parameters in patients with type 2 diabetes mellitus and to study the effect of the duration of diabetes mellitus on pulmonary function tests. Methods: This cross-sectional study was performed at Dr. Baba Saheb Ambedkar Hospital and Medical College, Delhi, a tertiary care referral centre, and included 200 participants divided into 2 groups. The first group comprised diagnosed patients with diabetes and the second group comprised controls. Cases and controls symptomatic for any acute or chronic respiratory or cardiovascular illness, or with a history of smoking, were excluded. Both groups underwent spirometry to evaluate pulmonary function. Result: The mean forced vital capacity (FVC), forced expiratory volume in the first second (FEV1), and peak expiratory flow rate (PEFR) were significantly decreased (p < 0.001) compared to controls, while the mean FEV1/FVC ratio was not significantly decreased (p > 0.005). No correlation was seen with the duration of the disease. Conclusion: FVC, FEV1, and PEFR were significantly decreased in patients with diabetes mellitus, while the FEV1/FVC ratio was not significantly decreased.
The duration of diabetes mellitus was not found to have any statistically significant effect on pulmonary function tests (p > 0.005).
Keywords: diabetes mellitus, pulmonary function tests, forced vital capacity, forced expiratory volume in first second
Procedia PDF Downloads 366
10810 A New Method Separating Relevant Features from Irrelevant Ones Using Fuzzy and OWA Operator Techniques
Authors: Imed Feki, Faouzi Msahli
Abstract:
Selecting the relevant parameters from a high-dimensional process operation setting space is a problem frequently encountered in industrial process modelling. This paper presents a method for selecting the fabric physical parameters most relevant to each sensory quality feature. The proposed relevancy criterion has been developed using two approaches. The first uses a fuzzy sensitivity criterion, exploiting the relationship in the experimental data between the physical parameters and all the sensory quality features for each evaluator; an OWA aggregation procedure is then applied to aggregate the ranking lists provided by the different evaluators. In the second approach, a separate panel of experts provides ranking lists of physical features according to their professional knowledge. By again applying OWA and a fuzzy aggregation model, the data-sensitivity-based ranking list and the knowledge-based ranking list are combined, using our proposed percolation technique, to determine the final ranking list. The key idea of the percolation technique is to filter the relevant features automatically and objectively by creating a gap between the scores of relevant and irrelevant parameters. It automatically generates a threshold, which effectively reduces the human subjectivity and arbitrariness of manually chosen thresholds. For a specific sensory descriptor, the threshold is defined systematically by iteratively aggregating (n times) the ranking lists generated by the OWA and fuzzy models, according to a specific algorithm. We applied the percolation technique to a real example, a well-known finished textile product (stonewashed denim, whose hand feel is usually considered among the most important quality criteria in jeans evaluation), to separate the relevant physical features from the irrelevant ones for each sensory descriptor.
The originality and performance of the proposed relevant-feature selection method are shown by the variability in the number of physical features in the sets of selected relevant parameters. Instead of selecting identical numbers of features with a predefined threshold, the proposed method adapts to the specific nature of the complex relations between sensory descriptors and physical features, proposing lists of relevant features of different sizes for different descriptors. To obtain more reliable feature selection results, the percolation technique was applied to combine the fuzzy global relevancy and OWA global relevancy criteria, clearly distinguishing the scores of the relevant physical features from those of the irrelevant ones.
Keywords: data sensitivity, feature selection, fuzzy logic, OWA operators, percolation technique
Procedia PDF Downloads 603
10809 Face Recognition Using Discrete Orthogonal Hahn Moments
Authors: Fatima Akhmedova, Simon Liao
Abstract:
One of the most critical decision points in the design of a face recognition system is the choice of an appropriate face representation. Effective feature descriptors are expected to convey sufficient, invariant, and non-redundant facial information. In this work, we propose a set of Hahn moments as a new approach to feature description. Hahn moments have been widely used in image analysis due to their invariance, non-redundancy, and ability to extract features both globally and locally. To assess the applicability of Hahn moments to face recognition, we conduct two experiments, on the Olivetti Research Laboratory (ORL) database and the University of Notre Dame (UND) X1 biometric collection. A fusion of the global features with features from local facial regions is used as input to a conventional k-NN classifier. The method reaches an accuracy of 93% correctly recognized subjects on the ORL database and 94% on the UND database.
Keywords: face recognition, Hahn moments, recognition-by-parts, time-lapse
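The final classification step, a conventional k-NN over fused global and local feature vectors, can be sketched as follows (the Hahn-moment extraction itself is omitted; this sketch assumes precomputed feature vectors and Euclidean distance, which the abstract does not specify):

```python
import numpy as np

def knn_predict(train_X, train_y, test_X, k=1):
    """Majority-vote k-NN with Euclidean distance on fused feature
    vectors (global descriptors concatenated with local-region ones)."""
    preds = []
    for x in test_X:
        dist = np.linalg.norm(train_X - x, axis=1)  # distance to every training vector
        nearest = np.argsort(dist)[:k]              # indices of the k closest
        labels, counts = np.unique(train_y[nearest], return_counts=True)
        preds.append(labels[np.argmax(counts)])     # majority label among neighbors
    return np.array(preds)
```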
Procedia PDF Downloads 374
10808 Optimization of a Flux Switching Permanent Magnet Machine Using Laminated Segmented Rotor
Authors: Seyedmilad Kazemisangdehi, Seyedmehdi Kazemisangdehi
Abstract:
Flux switching permanent magnet machines are considered for a wide range of applications because of their outstanding merits, including high torque/power density, high efficiency, and a simple, robust rotor structure. Several topologies have therefore been proposed, such as the PM-excited flux switching machine and the hybrid excited flux switching type. Recently, a novel laminated segmented rotor flux switching permanent magnet machine was introduced. It features flux barriers in the rotor structure that enhance the performance of the machine, reducing torque ripple while simultaneously improving torque and efficiency. However, the design of the barriers was not optimized in that work. Therefore, in this paper, three coefficients describing the position of the barriers are considered for optimization. The effect of each coefficient on the performance of the machine is investigated by the finite element method, and an optimized design of the flux barriers based on these three coefficients is proposed from different points of view, including electromagnetic torque maximization and cogging torque/torque ripple minimization. The design optimized for maximum developed torque generates 0.65 Nm more torque than the non-optimized design, with an almost 0.4% improvement in efficiency.
Keywords: finite element analysis, FSPM, laminated segmented rotor flux switching permanent magnet machine, optimization
Procedia PDF Downloads 227
10807 Towards End-To-End Disease Prediction from Raw Metagenomic Data
Authors: Maxence Queyrel, Edi Prifti, Alexandre Templier, Jean-Daniel Zucker
Abstract:
Analysis of the human microbiome using metagenomic sequencing data has demonstrated high ability to discriminate various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to data analysis. Such data contain millions of short sequences read from the fragmented DNA sequences and stored as fastq files. Conventional processing pipelines consist of multiple steps, including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use and time consuming, and they rely on a large number of parameters that introduce variability and affect the estimation of the microbiome elements. Training deep neural networks directly on raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most such methods use the concept of word and sentence embeddings to create a meaningful numerical representation of DNA sequences while extracting features and reducing the dimensionality of the data. In this paper, we present an end-to-end approach, metagenome2vec, that classifies patients into disease groups directly from raw metagenomic reads. The approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which each sequence is most likely to come; and (iv) training a multiple instance learning classifier that predicts the phenotype based on the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to each genome's influence on the prediction.
Using two public real-life data sets as well as a simulated one, we demonstrate that this original approach reaches high performance, comparable with state-of-the-art methods applied directly to data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that, with further dedication, DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
Keywords: deep learning, disease prediction, end-to-end machine learning, metagenomics, multiple instance learning, precision medicine
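Step (i), building a vocabulary of k-mers from raw reads, can be sketched as follows (a minimal illustration; the actual vocabulary construction and embedding training in metagenome2vec may differ):

```python
def kmer_tokens(read, k):
    """Slide a length-k window over a DNA read to produce the overlapping
    k-mer 'words' whose embeddings are learned in step (i)."""
    return [read[i:i + k] for i in range(len(read) - k + 1)]

def build_vocabulary(reads, k):
    """Map each distinct k-mer in a corpus of reads to an integer index,
    in order of first appearance."""
    vocab = {}
    for read in reads:
        for token in kmer_tokens(read, k):
            vocab.setdefault(token, len(vocab))
    return vocab
```

The resulting token indices would then be fed to an embedding model (word2vec-style or otherwise) to obtain the numerical k-mer representations.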
Procedia PDF Downloads 124
10806 Study on a Family of Optimal Fourth-Order Multiple-Root Solver
Authors: Young Hee Geum
Abstract:
In this paper, we develop the complex dynamics of a family of optimal fourth-order multiple-root solvers and plot their basins of attraction. Möbius conjugacy maps and extraneous fixed points applied to a prototype quadratic polynomial raised to the power of the known integer multiplicity m are investigated. A 300 × 300 uniform grid centered at the origin, covering a 3 × 3 square region, is chosen to visualize the initial values in each basin of attraction using a coloring scheme based on their dynamical behavior. The illustrative basins of attraction applied to various test polynomials, and the corresponding statistical data for convergence, confirm the theoretical convergence.
Keywords: basin of attraction, conjugacy, fourth-order, multiple-root finder
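For context, the baseline multiple-root iteration that such fourth-order families improve upon is the classical modified Newton step for a root of known multiplicity m, x_{n+1} = x_n - m f(x_n)/f'(x_n), which is only second-order. A sketch (this is not the authors' fourth-order family):

```python
def multiple_root_newton(f, fprime, x0, m, tol=1e-12, max_iter=50):
    """Modified Newton iteration x <- x - m*f(x)/f'(x) for a root of
    known integer multiplicity m; second-order convergent, unlike the
    plain Newton method, which degrades to linear at multiple roots."""
    x = x0
    for _ in range(max_iter):
        d = fprime(x)
        if d == 0.0:            # avoid division by zero at a critical point
            break
        step = m * f(x) / d
        x -= step
        if abs(step) < tol:     # converged
            break
    return x
```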
Procedia PDF Downloads 290
10805 Mammographic Multi-View Cancer Identification Using Siamese Neural Networks
Authors: Alisher Ibragimov, Sofya Senotrusova, Aleksandra Beliaeva, Egor Ushakov, Yuri Markin
Abstract:
Mammography plays a critical role in screening for breast cancer in women, and artificial intelligence has enabled the automatic detection of diseases in medical images. Many current techniques for mammogram analysis focus on a single view (mediolateral or craniocaudal), while in clinical practice, radiologists consider multiple views of mammograms from both breasts to make a correct decision. Consequently, computer-aided diagnosis (CAD) systems could benefit from incorporating information gathered from multiple views. In this study, we introduce a method based on a Siamese neural network (SNN) that simultaneously analyzes three mammographic views: bilateral and ipsilateral. In this way, when a decision is made on a single image of one breast, attention is also paid to two other images: a view of the same breast in a different projection and an image of the other breast. The algorithm thus closely mimics the radiologist's practice of attending to the entire examination of a patient rather than a single image. Additionally, to the best of our knowledge, this research represents the first experiments conducted on the recently released Vietnamese dataset of digital mammography (VinDr-Mammo). On an independent test set of images from this dataset, the best model achieved an AUC of 0.87 per image. This suggests that such a system could provide a valuable automated second opinion in the interpretation of mammograms and the diagnosis of breast cancer, which in the future may help alleviate the burden on radiologists and serve as an additional layer of verification.
Keywords: breast cancer, computer-aided diagnosis, deep learning, multi-view mammogram, siamese neural network
Procedia PDF Downloads 136
10804 A Neural Approach for Color-Textured Images Segmentation
Authors: Khalid Salhi, El Miloud Jaara, Mohammed Talibi Alaoui
Abstract:
In this paper, we present a neural approach to unsupervised natural color-texture image segmentation based on both Kohonen maps and mathematical morphology, using a combination of the texture and color information of the image: fractal features based on the fractal dimension represent the texture information, and the color features are taken in RGB color space. These features are used to train the Kohonen network, which represents the underlying probability density function; the segmentation of this map is performed by the morphological watershed transformation. The performance of our color-texture segmentation approach is compared first to color-based and texture-based methods alone, and then to the k-means method.
Keywords: segmentation, color-texture, neural networks, fractal, watershed
Procedia PDF Downloads 344
10803 The Profit Trend of Cosmetics Products Using Bootstrap Edgeworth Approximation
Authors: Edlira Donefski, Lorenc Ekonomi, Tina Donefski
Abstract:
Edgeworth approximation is an important statistical method that contributes considerably to reducing the sum of the standard deviations of the independent variables' coefficients in a quantile regression model, which estimates the conditional median or other quantiles. In this paper, we apply these approximating statistical methods to an economic problem. We created and fitted a quantile regression model to examine how profit is connected with the realized sales of cosmetic products, using real data taken from a local business. The linear regression of the generated profit on the realized sales was not free of autocorrelation and heteroscedasticity, which is why we used this model instead of linear regression. Our aim is to analyze in more detail the relation between the variables under study, profit and realized sales, and to minimize the standard errors of the independent variable, the level of realized sales. The statistical methods applied in our work are the Edgeworth approximation for the independent and identically distributed (IID) case, the bootstrap version of the model, and the Edgeworth approximation for the bootstrap quantile regression model. The graphics and results presented here identify the best approximating model for our study.
Keywords: bootstrap, Edgeworth approximation, IID, quantile
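The bootstrap resampling underlying the bootstrap variants of the model can be sketched as follows (shown here for the standard error of a plain sample median rather than a quantile-regression coefficient; names and defaults are illustrative):

```python
import random
import statistics

def bootstrap_se(sample, stat, n_boot=2000, seed=42):
    """Non-parametric bootstrap estimate of a statistic's standard error:
    resample with replacement, recompute the statistic on each resample,
    and take the spread of the replicates."""
    rng = random.Random(seed)
    n = len(sample)
    replicates = []
    for _ in range(n_boot):
        resample = [sample[rng.randrange(n)] for _ in range(n)]
        replicates.append(stat(resample))
    return statistics.stdev(replicates)
```

In a bootstrap quantile regression, `stat` would instead refit the regression on each resample and return the coefficient of interest.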
Procedia PDF Downloads 158
10802 Measurements for Risk Analysis and Detecting Hazards by Active Wearables
Authors: Werner Grommes
Abstract:
Intelligent wearables (illuminated vests, hand and foot bands, smart watches with a laser diode, Bluetooth smart glasses) are flooding the market today. They contain complex electronics and are worn very close to the body, so optical measurements and limitation of the maximum luminance are needed. Smart watches are equipped with a laser diode or monitor various body currents, and special glasses generate readable text information received via radio transmission. Small high-performance batteries (lithium-ion/polymer) supply the electronics. All these products have been tested and evaluated for risk. They must, for example, meet the requirements for electromagnetic compatibility as well as the requirements for electromagnetic fields affecting humans or implant wearers; extensive analyses and measurements were carried out for this purpose. Many users are not aware of these risks, and the results of this study should serve as a suggestion to do better in the future, or simply to point out these risks. Commercial LED warning vests, LED hand and foot bands, illuminated surfaces with an inverter (high voltage), flashlights, smart watches, and Bluetooth smart glasses were checked for risks. The luminance, the electromagnetic emissions in both the low-frequency and high-frequency ranges, audible noises, and irritating flashing frequencies were measured and analyzed. Rechargeable lithium-ion or lithium-polymer batteries can burn or explode under special conditions such as overheating, overcharging, deep discharge, or use outside the temperature specification, so a risk analysis becomes necessary. The conclusion of this study is that many smart wearables are worn very close to the body and require an extensive risk analysis, and wearers of active implants such as a pacemaker or an implantable cardiac defibrillator must be considered.
If the wearable electronics include switching regulators or inverter circuits, active medical implants in the near field can be disturbed; a risk analysis is necessary here as well.
Keywords: safety and hazards, electrical safety, EMC, EMF, active medical implants, optical radiation, illuminated warning vest, electroluminescence, hand and head lamps, LED, e-light, battery safety, luminance, optical glare effects
Procedia PDF Downloads 109
10801 Design of a Permanent Magnet Based Focusing Lens for a Miniature Klystron
Authors: Kumud Singh, Janvin Itteera, Priti Ukarde, Sanjay Malhotra, P. P. Marathe, Ayan Bandyopadhay, Rakesh Meena, Vikram Rawat, L. M. Joshi
Abstract:
Applying permanent magnet technology to high-frequency miniature klystron tubes intended for space applications improves their efficiency and operational reliability. Nevertheless, generating the magnetic focusing forces needed to eliminate beam divergence once the beam crosses the electrostatic focusing regime and enters the drift region in the RF section of the tube poses several challenges. Building a high-quality magnetic focusing lens that meets the beam optics requirements in the cathode gun and RF interaction regions is considered one of the critical issues for these high-frequency miniature tubes. In this paper, the electromagnetic design and particle trajectory studies in combined electric and magnetic fields, used to optimize the magnetic circuit with 3D finite element method (FEM) analysis software, are presented. A rectangular magnet configuration was constructed to accommodate apertures for the input and output waveguide sections and to facilitate coupling of electromagnetic fields into the input klystron cavity and out of the output klystron cavity through coupling loops. Prototype lenses have been built and tested after integration with the klystron tube. We discuss the design requirements and challenges, and the results from beam transmission with the prototype lens.
Keywords: beam transmission, Brillouin, confined flow, miniature klystron
Procedia PDF Downloads 442
10800 Controlling the Expense of Political Contests Using a Modified N-Players Tullock's Model
Abstract:
This work introduces a generalization of the classical Tullock model of one-stage contests under complete information with an unlimited number of contestants. In the classical Tullock model, the contest winner is not necessarily the highest bidder; instead, the winner is determined by a draw in which the winning probabilities are proportional to the contestants' relative efforts. Tullock's model thus fits political contests well, in which the winner is not necessarily the contestant with the highest effort. This work presents a modified model that uses a simple non-discriminating rule, namely a parameter influencing the total costs planned for an election, so that the contest designer can control the contestants' efforts. The winner pays a fee, and the losers are reimbursed the same amount. Our proposed model includes a mechanism that controls the efforts exerted and balances the competition, creating a tighter, less predictable, and more interesting contest. Additionally, the proposed model satisfies a fairness criterion in the sense that it does not alter the contestants' probabilities of winning compared with the classical Tullock model. We provide an analytic solution for each contestant's optimal effort and expected reward.
Keywords: contests, Tullock's model, political elections, control expenses
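The classical Tullock contest success function on which the modified model builds can be sketched as follows (the fee/reimbursement mechanism of the proposed model is not reproduced here):

```python
def win_probabilities(efforts, r=1.0):
    """Classical Tullock contest success function:
    p_i = x_i**r / sum_j x_j**r, so winning is a draw weighted by
    relative effort rather than a prize for the highest bidder."""
    powered = [x ** r for x in efforts]
    total = sum(powered)
    return [p / total for p in powered]
```

With r = 1 the probabilities are exactly the relative efforts; larger r makes the contest more discriminating toward high-effort contestants.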
Procedia PDF Downloads 143
10799 Random Subspace Ensemble of CMAC Classifiers
Authors: Somaiyeh Dehghan, Mohammad Reza Kheirkhahan Haghighi
Abstract:
The rapid growth of domains whose data have a large number of features, while the number of samples is limited, has made it difficult to construct strong classifiers. Reducing the dimensionality of the feature space therefore becomes an essential step in the classification task. The random subspace method (or attribute bagging) is an ensemble classifier in which each base learner is trained on a subset of the features. In the present paper, we introduce the Random Subspace Ensemble of CMAC neural networks (RSE-CMAC), each of which is trained on a subset of the features, and use this model for the classification task. To evaluate the performance of our model, we compare it with the bagging algorithm on 36 UCI datasets. The results reveal that the new model performs better.
Keywords: classification, random subspace, ensemble, CMAC neural network
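A minimal sketch of the random subspace idea follows; a simple nearest-centroid classifier stands in for the CMAC network, and all names and sizes are illustrative:

```python
import random
from collections import Counter

def train_subspace_ensemble(X, y, n_learners=5, subspace_size=2, seed=0):
    """Random subspace (attribute bagging): each base learner sees only
    a random subset of the features. The base learner here is a
    nearest-centroid classifier, a stand-in for the CMAC network."""
    rng = random.Random(seed)
    n_features = len(X[0])
    ensemble = []
    for _ in range(n_learners):
        feats = rng.sample(range(n_features), subspace_size)
        # Class centroids restricted to the chosen feature subset.
        centroids = {}
        for label in set(y):
            rows = [[x[f] for f in feats] for x, t in zip(X, y) if t == label]
            centroids[label] = [sum(c) / len(c) for c in zip(*rows)]
        ensemble.append((feats, centroids))
    return ensemble

def predict(ensemble, x):
    """Majority vote over the base learners' predictions."""
    votes = []
    for feats, centroids in ensemble:
        proj = [x[f] for f in feats]
        label = min(centroids,
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(proj, centroids[c])))
        votes.append(label)
    return Counter(votes).most_common(1)[0][0]
```

Each learner's feature subset is drawn independently, so the ensemble's diversity comes from the feature space rather than from resampling the rows as in bagging.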
Procedia PDF Downloads 328
10798 Investigating the Dynamic Plantar Pressure Distribution in Individuals with Multiple Sclerosis
Authors: Hilal Keklicek, Baris Cetin, Yeliz Salci, Ayla Fil, Umut Altinkaynak, Kadriye Armutlu
Abstract:
Objectives and Goals: Spasticity is a common symptom characterized by a velocity-dependent increase in tonic stretch reflexes (muscle tone) in patients with multiple sclerosis (MS). Hypertonic muscles affect normal plantigrade contact by disturbing the accommodation of the foot to the ground while walking. Knowing the differences between healthy and neurologic foot features is important for managing spasticity-related deformities and/or determining rehabilitation goals and contents. This study was planned with the aim of investigating the dynamic plantar pressure distribution in individuals with MS and determining the differences from healthy individuals (HI). Methods: Fifty-five individuals with MS (108 feet with spasticity according to the Modified Ashworth Scale) and 20 HI (40 feet) participated in the study. A dynamic pedobarograph was utilized to evaluate dynamic loading parameters. Participants were instructed to walk at their self-selected speed seven times to eliminate the learning effect. The parameters were divided into two categories, maximum loading pressure (N/cm2) and time of maximum pressure (ms), collected from the heel medial, heel lateral, midfoot, and heads of the first, second, third, fourth, and fifth metatarsal bones. Results: There were differences between the groups in maximum loading pressure of the heel medial (p < .001), heel lateral (p < .001), midfoot (p = .041), and 5th metatarsal areas (p = .036). There were also differences between the groups in the time of maximum pressure of all metatarsal areas, midfoot, heel medial, and heel lateral (p < .001) in favor of HI. Conclusions: The study provided basic data about foot pressure distribution in individuals with MS. The results primarily showed that spasticity of the lower extremity muscles disrupted posteromedial foot loading. Secondarily, spasticity led to inappropriate timing during load transfer from the hindfoot to the forefoot.
Keywords: multiple sclerosis, plantar pressure distribution, gait, norm values
Procedia PDF Downloads 319
10797 Improved Performance in Content-Based Image Retrieval Using Machine Learning Approach
Authors: B. Ramesh Naik, T. Venugopal
Abstract:
This paper presents a novel approach that improves the high-level semantics of images based on machine learning. Contemporary approaches for image retrieval and object recognition include Fourier transforms, wavelets, SIFT, and HoG. Though these descriptors are helpful in a wide range of applications, they exploit zero-order statistics and thus lack highly descriptive image features. These descriptors usually rely on primitive visual features such as shape, color, texture, and spatial location, which are not adequate to describe the high-level semantics of the images. This leads to a semantic gap that causes unacceptable performance in image retrieval systems. A novel method referred to as discriminative learning, derived from a machine learning approach, is proposed that efficiently discriminates image features. The analysis and results of the proposed approach were validated thoroughly on the WANG and Caltech-101 databases. The results proved that this approach is very competitive in content-based image retrieval.
Keywords: CBIR, discriminative learning, region weight learning, scale invariant feature transforms
Procedia PDF Downloads 180
10796 Analyze the Effect of TETRA, Terrestrial Trunked Radio, Signal on the Health of People Working in the Gas Refinery
Authors: Mohammad Bagher Heidari, Hefzollah Mohammadian
Abstract:
TETRA (Terrestrial Trunked Radio) is a digital radio communication standard that has been implemented in several different parts of the ninth gas refinery (phase 12) by the South Pars Gas Complex. Studies on possible impacts on users' health under different exposure conditions are missing. Objectives: To investigate possible acute effects of the electromagnetic fields (EMF) of two different levels of TETRA hand-held transmitter signals on cognitive function and well-being in healthy young males. Methods: In the present double-blind cross-over study, possible effects of short-term (2.5 h) EMF exposure to handset-like TETRA signals (450-470 MHz) were studied in 30 healthy male participants (mean ± SD: 25.4 ± 2.6 years). Individuals were tested on nine study days, on which they were exposed to three different exposure conditions (sham, TETRA 1.5 W/kg, and TETRA 10.0 W/kg) in a randomly assigned and balanced order. Participants were tested in the afternoon in a fixed timeframe. Results: Attention remained unchanged in two out of three tasks. In working memory, significant changes were observed in two out of four subtasks. Significant results were found in 5 out of 35 tested parameters, four of which reflected an improvement in performance. Mood, well-being, and subjective somatic complaints were not affected by TETRA exposure. Conclusions: The results of the present study do not indicate a negative impact of short-term TETRA EMF exposure on cognitive function and well-being in healthy young men.
Keywords: TETRA (terrestrial trunked radio), electromagnetic fields (EMF), mobile telecommunication health research (MTHR), antenna
Procedia PDF Downloads 295
10795 Electromagnetic Interference Shielding of Graphene Oxide–Carbon Nanotube Hybrid ABS Composites
Authors: Jeevan Jyoti, Bhanu Pratap Singh, S. R. Dhakate
Abstract:
In the present study, multiwalled carbon nanotubes (MWCNTs) and reduced graphene oxide (RGO) were synthesized by chemical vapor deposition and the improved Hummers' method, respectively, and their composites with acrylonitrile butadiene styrene (ABS) were prepared by a co-rotating twin-screw extrusion technique. The electromagnetic interference (EMI) shielding effectiveness of the graphene oxide-carbon nanotube (GCNT) hybrid composites was investigated, and the results were compared with the EMI shielding of carbon nanotube (CNT) and reduced graphene oxide (RGO) composites in the frequency range of 12.4-18 GHz (Ku-band). The experimental results indicate that an EMI shielding effectiveness of up to -21 dB is achieved at 10 wt. % GCNT loading. The mechanism of the improvement in EMI shielding effectiveness is discussed by resolving its contributions from absorption and reflection loss. The main reason for such highly improved shielding effectiveness is the significant improvement in the electrical conductivity of the composites: the conductivity of the GCNT/ABS composites increased from 10⁻¹³ S/cm to 10⁻⁷ S/cm, an improvement of six orders of magnitude. Scanning electron microscopy (SEM) and high-resolution transmission electron microscopy (HRTEM) showed that the GCNTs were uniformly dispersed in the ABS polymer matrix, forming a network throughout the matrix and promoting reinforcement.
Keywords: ABS, EMI shielding, multiwalled carbon nanotubes, reduced graphene oxide, graphene oxide-carbon nanotube (GCNTs), twin-screw extruder, electrical conductivity
Procedia PDF Downloads 359
10794 A Hybrid Classical-Quantum Algorithm for Boundary Integral Equations of Scattering Theory
Authors: Damir Latypov
Abstract:
A hybrid classical-quantum algorithm to solve boundary integral equations (BIE) arising in problems of electromagnetic and acoustic scattering is proposed. The quantum speed-up is due to a Quantum Linear System Algorithm (QLSA). The original QLSA of Harrow et al. provides an exponential speed-up over the best-known classical algorithms, but only for sparse systems. Due to the non-local nature of integral operators, however, the matrices arising from the discretization of BIEs are dense. A QLSA for dense matrices was introduced in 2017; its runtime as a function of the system size N is bounded by O(√N polylog(N)), whereas the runtime of the best-known classical algorithm for an arbitrary dense matrix scales as O(N².³⁷³). Instead of the exponential speed-up available for sparse matrices, here we have only a polynomial speed-up; nevertheless, the sufficiently high power of this polynomial, ~4.7, should make the QLSA an appealing alternative. Unfortunately for the QLSA, the asymptotic separability of the Green's function makes the BIE matrices highly compressible. Classical fast algorithms such as the Multilevel Fast Multipole Method (MLFMM) take advantage of this fact and reduce the runtime to O(N log(N)), so the QLSA is only quadratically faster than the MLFMM. To be truly impactful for computational electromagnetics and acoustics engineers, the QLSA must provide a more substantial advantage than that. We propose a computational scheme that combines elements of the classical fast algorithms with the QLSA to achieve the required performance.
Keywords: quantum linear system algorithm, boundary integral equations, dense matrices, electromagnetic scattering theory
Procedia PDF Downloads 152
10793 Xerostomia and Caries Incidence in Relation to Metabolic Control in Children and Adolescents with Type 1 Diabetes
Authors: Eftychia Pappa, Heleni Vastardis, Christos Rahiotis, Andriani Vazaiou
Abstract:
The aim of this study was to evaluate the prevalence of dry-mouth symptoms (xerostomia) and compare it with alterations in the salivary characteristics of children and adolescents with type 1 diabetes (DM1), as measured with chair-side saliva tests. This study also investigated the possible association between salivary dysfunction and the incidence of caries, in relation to the level of metabolic control. A cross-sectional study was performed on young patients (6-18 years old) allocated among three groups: 40 poorly-controlled patients (DM1-A, HbA1c > 8%), 40 well-controlled patients (DM1-B, HbA1c ≤ 8%), and 40 age- and sex-matched healthy controls. The study was approved by the Research Ethics Committee of the University of Athens, and the parents signed written informed consent. All subjects were examined for dental caries, oral hygiene, and salivary factors. Assessments of salivary function included self-reported xerostomia, quantification of resting and stimulated whole-saliva flow rates, pH values, buffering capacity, and saliva viscosity. Salivary characteristics were evaluated with the GC Saliva Check Buffer (3M ESPE). Data were analysed by chi-square and Kruskal-Wallis tests. Subjects with diabetes reported xerostomia more frequently than healthy controls (p < 0.05). Unstimulated salivary flow rate and pH values remained significantly lower in DM1-A compared to DM1-B and controls. Low resting salivary flow rates were associated with a higher prevalence of dental caries in children and adolescents with poorly-controlled DM1 (p < 0.05). The results suggest that diabetes-induced alterations in salivary characteristics are indicative of the higher caries susceptibility of diabetics, and that chair-side saliva tests are a useful tool for caries risk assessment.
Keywords: caries risk assessment, saliva diagnostic tests, type 1 diabetes, xerostomia
Procedia PDF Downloads 285
10792 Two-Stage Estimation of Tropical Cyclone Intensity Based on Fusion of Coarse and Fine-Grained Features from Satellite Microwave Data
Authors: Huinan Zhang, Wenjie Jiang
Abstract:
Accurate estimation of tropical cyclone intensity is of great importance for disaster prevention and mitigation. Existing techniques are largely based on satellite imagery data, and the use of the inner thermal core structure of tropical cyclones still poses challenges. This paper presents a two-stage tropical cyclone intensity estimation network based on the fusion of coarse- and fine-grained features from microwave brightness temperature data, obtained from the thermal core of tropical cyclones through Advanced Technology Microwave Sounder (ATMS) inversion. Firstly, the thermal core information in the pressure direction is comprehensively expressed through the maximal intensity projection (MIP) method, constructing coarse-grained thermal core images that represent the tropical cyclone. These images provide a first-stage, coarse-grained wind speed estimate. Then, based on this result, fine-grained features are extracted by combining thermal core information from multiple view profiles with a distributed network and fused with the coarse-grained features from the first stage to obtain the final two-stage wind speed estimate. Furthermore, to better capture the long-tailed distribution of tropical cyclones, focal loss is used in the coarse-grained loss function of the first stage, and ordinal regression loss replaces traditional single-value regression in the second stage. The selected tropical cyclones span 2012 to 2021 in the North Atlantic (NA) region; the training set covers 2012 to 2017, the validation set 2018 to 2019, and the test set 2020 to 2021. Based on the Saffir-Simpson Hurricane Wind Scale (SSHS), this paper categorizes tropical cyclones into three major levels: pre-hurricane, minor hurricane, and major hurricane, achieving a classification accuracy of 86.18% and an intensity estimation error of 4.01 m/s for the NA region. The results indicate that thermal core data can effectively represent the level and intensity of tropical cyclones, warranting further exploration of tropical cyclone attributes with these data.
Keywords: artificial intelligence, deep learning, data mining, remote sensing
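The maximal intensity projection (MIP) step described above can be sketched as follows; the toy brightness-temperature volume and its (levels, lat, lon) layout are illustrative assumptions, not the paper's actual ATMS grid:

```python
import numpy as np

# Hypothetical brightness-temperature volume with shape (levels, lat, lon).
# The MIP collapses the pressure (level) axis by keeping, for each
# horizontal grid point, the maximum value across all levels.
volume = np.array([
    [[200.0, 210.0], [205.0, 215.0]],   # level 0
    [[220.0, 205.0], [210.0, 230.0]],   # level 1
])
mip = volume.max(axis=0)                # shape (lat, lon)
```

The resulting 2D image is what the first-stage network consumes as its coarse-grained representation of the thermal core.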
Procedia PDF Downloads 61
10791 Identification of Outliers in Flood Frequency Analysis: Comparison of Original and Multiple Grubbs-Beck Test
Authors: Ayesha S. Rahman, Khaled Haddad, Ataur Rahman
Abstract:
At-site flood frequency analysis is used to estimate flood quantiles when the at-site record length is reasonably long. In Australia, the FLIKE software has been introduced for at-site flood frequency analysis. The advantage of FLIKE is that, for a given application, the user can quickly compare a number of the most commonly adopted probability distributions and parameter estimation methods through a Windows interface. The new version of FLIKE incorporates the multiple Grubbs-Beck test, which can identify multiple potentially influential low flows. This paper presents a case study of six catchments in eastern Australia that compares two outlier identification tests (the original Grubbs-Beck test and the multiple Grubbs-Beck test) and two commonly applied probability distributions (Generalized Extreme Value (GEV) and Log Pearson type 3 (LP3)) using FLIKE. It has been found that the multiple Grubbs-Beck test, when used with the LP3 distribution, provides more accurate flood quantile estimates than the LP3 distribution with the original Grubbs-Beck test; between these two methods, the differences in flood quantile estimates reach up to 61% for the six study catchments. It has also been found that the GEV distribution (with L-moments) and the LP3 distribution with the multiple Grubbs-Beck test provide quite similar results in most cases; however, a difference of up to 38% was noted in the flood quantiles for an annual exceedance probability (AEP) of 1 in 100 for one catchment. These findings need to be confirmed with a greater number of stations across other Australian states.
Keywords: floods, FLIKE, probability distributions, flood frequency, outlier
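The original (single) Grubbs-Beck low-outlier screen can be sketched as below. This is a simplified illustration, not FLIKE's implementation: the critical value k_n is assumed to be supplied from published tables for the sample size, and the multiple Grubbs-Beck test iterates a related criterion over candidate low flows rather than applying one threshold:

```python
import math

def grubbs_beck_low_outliers(flows, k_n):
    """Flag annual peak flows below exp(mean - k_n * std) of the
    log-transformed flows, where k_n is the one-sided Grubbs-Beck
    critical value for the sample size (taken from tables, not
    computed here)."""
    logs = [math.log(q) for q in flows]
    n = len(logs)
    mean = sum(logs) / n
    std = math.sqrt(sum((v - mean) ** 2 for v in logs) / (n - 1))
    threshold = math.exp(mean - k_n * std)
    return [q for q in flows if q < threshold], threshold
```

Flows flagged this way are treated as potentially influential low flows and can be censored before fitting the LP3 distribution.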
Procedia PDF Downloads 449
10790 Normalized Difference Vegetation Index and Normalized Difference Chlorophyll Changes with Different Irrigation Levels on Silage Corn
Authors: Cenk Aksit, Suleyman Kodal, Yusuf Ersoy Yildirim
Abstract:
The Normalized Difference Vegetation Index (NDVI) is a widely used index that provides reference information, such as the health status of the plant and the density of vegetation in a given area, by making use of the electromagnetic radiation reflected from the plant surface. The chlorophyll index, on the other hand, provides reference information about the chlorophyll density in the plant from electromagnetic reflections at specific wavelengths; chlorophyll concentration is higher in healthy plants and decreases as plant health declines. This study aimed to determine the changes in the Normalized Difference Vegetation Index (NDVI) and the Normalized Difference Chlorophyll Index (NDCI) of silage corn irrigated with a subsurface drip irrigation system under different irrigation levels. With a 5-day irrigation interval, daily potential plant water consumption values were collected, and the calculated amount was applied as irrigation water to the full-irrigation treatment and three deficit irrigation levels. NDVI values changed according to the amount of irrigation water applied, with the highest NDVI value reached in the treatment receiving the most water. Likewise, the chlorophyll value was observed to decrease in direct proportion to the amount of irrigation water as the plant approached harvest.
Keywords: NDVI, NDCI, sub-surface drip irrigation, silage corn, deficit irrigation
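Both indices above are normalized differences of two reflectance bands. A minimal sketch, with the reflectance values illustrative and the NDCI band choice (red-edge vs. red) an assumption rather than the study's stated bands:

```python
def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red); ranges from -1 to 1,
    with dense healthy vegetation typically toward the high end."""
    return (nir - red) / (nir + red)

def ndci(red_edge, red):
    """Normalized difference chlorophyll index from red-edge and red
    reflectances (band choice assumed here for illustration)."""
    return (red_edge - red) / (red_edge + red)
```

For example, a healthy, well-irrigated canopy reflects strongly in the near-infrared relative to the red band, pushing NDVI toward 1, while stressed or sparse vegetation yields values near 0.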
Procedia PDF Downloads 95
10789 Histopathological Features of Basal Cell Carcinoma: A Ten Year Retrospective Statistical Study in Egypt
Authors: Hala M. El-hanbuli, Mohammed F. Darweesh
Abstract:
The incidence rates of any tumor vary hugely with geographical location. Basal cell carcinoma (BCC) is one of the most common skin cancers and has many histopathologic subtypes. Objective: The aim was to study the histopathological features of the BCC cases received in the Pathology Department, Kasr El-Aini Hospital, Cairo University, Egypt during the period from Jan 2004 to Dec 2013 and to evaluate their clinical characteristics from the patient data available in the request sheets. Methods: Slides and data of BCC cases were collected from the archives of the pathology department, Kasr El-Aini Hospital. All available slides were reviewed and the BCCs were histologically classified according to WHO (2006). Results: A total of 310 cases of BCC were identified, representing about 65% of the malignant skin tumors examined in the department over the 10-year period. The age ranged from 8 to 84 years, with a mean of 55.7 ± 15.5; most of the patients (85%) were above the age of 40 years. There was a slight male predominance (55%). Ulcerated BCC was the most common gross picture (60%), followed by nodular lesions (30%) and finally ulcerated nodules (10%). Most of the lesions were situated in high-risk sites (77%), where the nose was the most common site (35%), followed by the periocular area (22%), then the periauricular area (15%), and finally the perioral area (5%). No lesion was reported outside the head. The tumor size in the greatest dimension was less than 2 centimeters in 65% of cases and 2-5 centimeters in the rest. Histopathological reclassification revealed that nodular BCC was the most common subtype (68%), followed by pigmented nodular BCC (18.75%). The histologic high-risk groups represented 7.5%, about half of them (3.75%) being basosquamous carcinoma. The total incidence of multiple BCC and second primaries was 12%. Recurrent BCC represented 8%, and all of the recurrent lesions belonged to the histologic high-risk group. Conclusion: Basal cell carcinoma was the most common skin cancer in the 10-year survey. Histopathological diagnosis and classification of BCC cases are essential for determining the tumor type and its biological behavior.
Keywords: basal cell carcinoma, high risk, histopathological features, statistical analysis
Procedia PDF Downloads 148
10788 Literature Review of Empirical Studies on the Psychological Processes of End-of-Life Cancer Patients
Authors: Kimiyo Shimomai, Mihoko Harada
Abstract:
This study is a literature review of the psychological reactions that occur in end-of-life cancer patients who are nearing death. Electronic databases were searched and literature related to psychological studies of end-of-life patients was selected. There was no limit on the search period, and the search was conducted until the second week of December 2021. The keywords were "death and dying", "terminal illness", "end-of-life", "palliative care", "psycho-oncology", and "research". To ensure quality, articles were screened with reference to Holly (2017): Comprehensive Systematic Review for Advanced Practice Nursing, p. 268, Figure 10.3, and those with a score of 4 or 5 were selected. The review was conducted in two stages with reference to the procedure of George (2002): first, the databases were searched by keyword, and then relevant articles were selected from the psychology and nursing studies of end-of-life patients. The number of articles analyzed was 76 international and 17 domestic. Among the independent variables, physical variables were the most common, appearing in 36 articles (66.7%), followed by psychological variables in 35 articles (64.8%), spiritual variables in 21 articles (38%), social variables in 17 articles (31.5%), and variables related to medical care and treatment in 16 articles (29.6%). Summarizing the relationship between these independent variables and the dependent variable: when the dependent variable was a psychological variable, the independent variables were psychological, social, and physical variables, with physical variables the most common. The psychological responses that occur in end-stage cancer patients nearing death are thus mutually influenced by psychological, social, and physical variables, supporting the "total pain" concept advocated by Cicely Saunders.
Keywords: cancer patient, end-of-life, literature review, psychological process
Procedia PDF Downloads 126
10787 Numerical Investigations on the Coanda Effect
Authors: Florin Frunzulica, Alexandru Dumitrache, Octavian Preotu
Abstract:
The Coanda effect is the tendency of a jet to remain attached to a sufficiently long/large convex surface. Flows deflected by a curved surface have attracted great interest during the last fifty years; a major motivation for studying this phenomenon is the possibility of exploiting the effect in short take-off and landing aircraft and for thrust vectoring. It is also used in applications involving the mixing of two or more fluids, noise attenuation, ventilation, etc. This paper proposes a numerical study of an aerodynamic configuration that can passively amplify the Coanda effect. On a wing flap with a predetermined configuration, a channel is introduced between two particular zones, a low-pressure one and a high-pressure one, respectively. The secondary flow through this channel creates a gap between the jet and the convex surface, keeping the jet attached over a longer distance. Active control of the secondary flow through the channel, based on altering the channel section, controls the attachment of the jet to the surface and thereby the deviation angle of the jet. The numerical simulations were performed in Ansys Fluent for a series of wing flap-channel configurations with varying jet velocity. The numerical results are in good agreement with experimental results.
Keywords: blowing jet, CFD, Coanda effect, circulation control
Procedia PDF Downloads 344
10786 Climate Changes in Albania and Their Effect on Cereal Yield
Authors: Lule Basha, Eralda Gjika
Abstract:
This study analyzes climate change in Albania and its potential effects on cereal yields. Initially, monthly temperature and rainfall in Albania were studied for the period 1960-2021. Climatic variables are important when modeling cereal yield behavior, especially when significant changes in weather conditions are observed. For this purpose, in the second part of the study, linear and nonlinear models explaining cereal yield are constructed for the same period, 1960-2021. Multiple linear regression analysis and the lasso regression method are applied to the data relating cereal yield to each independent variable: average temperature, average rainfall, fertilizer consumption, arable land, land under cereal production, and nitrous oxide emissions. In our regression model, heteroscedasticity is not observed, the data follow a normal distribution, and there is low correlation between the factors, so multicollinearity is not a problem. Machine-learning methods, such as random forest, are used to predict cereal yield responses to climatic and other variables; random forest showed high accuracy compared to the other statistical models. We found that changes in average temperature negatively affect cereal yield, while the coefficients of fertilizer consumption, arable land, and land under cereal production affect production positively. Our results show that the random forest method is an effective and versatile machine-learning method for cereal yield prediction compared to the other two methods.
Keywords: cereal yield, climate change, machine learning, multiple regression model, random forest
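The linear-model step above can be illustrated with the single-predictor special case of ordinary least squares (the study itself fits multiple regression and lasso over several predictors; the data here are made up for illustration):

```python
def ols_simple(x, y):
    """Ordinary least squares for one predictor: returns
    (intercept, slope) minimizing the sum of squared errors,
    via the closed-form normal equations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    return my - slope * mx, slope
```

A negative fitted slope for the temperature predictor would correspond to the study's finding that rising average temperature is associated with lower cereal yield.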
Procedia PDF Downloads 89