Search results for: multi-resolution feature fusion
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1944

1074 Analysis of Expression Data Using Unsupervised Techniques

Authors: M. A. I Perera, C. R. Wijesinghe, A. R. Weerasinghe

Abstract:

This study was conducted to review and identify the unsupervised techniques that can be employed to analyze gene expression data in order to identify better subtypes of tumors. Identifying subtypes of cancer helps in improving the efficacy and reducing the toxicity of treatments by providing clues for targeted therapeutics. The process of gene expression data analysis is described in three steps: preprocessing, clustering, and cluster validation. Feature selection is important since genomic data are high dimensional, with a large number of features compared to the number of samples. Hierarchical clustering and K-Means are often used in the analysis of gene expression data. There are several cluster validation techniques used in validating the clusters. Heatmaps are an effective external validation method that allows the identified classes to be compared with clinical variables and analyzed visually.
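The pipeline the abstract describes (feature selection, clustering, internal validation) can be sketched as follows. The expression matrix, gene counts, variance-based selection rule, and cluster number here are purely illustrative assumptions, not the study's data or exact method:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def cluster_expression(X, n_clusters=2, n_top_genes=50):
    """Variance-based feature selection, then K-Means over samples."""
    top = np.argsort(X.var(axis=0))[::-1][:n_top_genes]   # most variable genes
    Xs = X[:, top]
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(Xs)
    return labels, silhouette_score(Xs, labels)           # internal validation

# Synthetic stand-in for an expression matrix: 40 samples x 200 genes,
# with 30 genes up-regulated in a second "subtype"
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, (40, 200))
X[20:, :30] += 3.0
labels, score = cluster_expression(X)
```

External validation in the study is done with heatmaps against clinical variables; the silhouette score above is only the internal half of the validation step.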

Keywords: cancer subtypes, gene expression data analysis, clustering, cluster validation

Procedia PDF Downloads 131
1073 Training a Neural Network Using Input Dropout with Aggressive Reweighting (IDAR) on Datasets with Many Useless Features

Authors: Stylianos Kampakis

Abstract:

This paper presents a new algorithm for neural networks called “Input Dropout with Aggressive Re-weighting” (IDAR), aimed specifically at datasets with many useless features. IDAR combines two techniques (dropout of input neurons and aggressive re-weighting) in order to eliminate the influence of noisy features. The technique can be seen as a generalization of dropout. The algorithm is tested on two different benchmark datasets: a noisy version of the iris dataset and the MADELON dataset. Its performance is compared against three other popular techniques for dealing with useless features: L2 regularization, LASSO, and random forests. The results demonstrate that IDAR can be an effective technique for handling datasets with many useless features.
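The abstract does not spell out IDAR's re-weighting rule, so the sketch below shows only the input-dropout half of the idea in NumPy: each input feature is zeroed with some probability and the survivors are rescaled so the expected activation seen by the network is unchanged. The batch shape and drop probability are illustrative assumptions:

```python
import numpy as np

def input_dropout(X, drop_prob, rng):
    """Dropout applied to the input layer: zero each entry with
    probability drop_prob and rescale survivors so the expected
    activation is unchanged."""
    mask = rng.random(X.shape) >= drop_prob
    return X * mask / (1.0 - drop_prob)

rng = np.random.default_rng(1)
X = np.ones((1000, 20))          # placeholder batch of input activations
Xd = input_dropout(X, 0.25, rng)
```

Because of the rescaling, the mean activation after dropout stays close to the original, which is what lets the same network be used at test time without the mask.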

Keywords: neural networks, feature selection, regularization, aggressive reweighting

Procedia PDF Downloads 435
1072 The Molecular Bases of Δβ T-Cell Mediated Antigen Recognition

Authors: Eric Chabrol, Sidonia B.G. Eckle, Renate de Boer, James McCluskey, Jamie Rossjohn, Mirjam H.M. Heemskerk, Stephanie Gras

Abstract:

αβ and γδ T-cells are disparate T-cell lineages that, via their use of either αβ or γδ T-cell antigen receptors (TCRs) respectively, can respond to distinct antigens. Here we characterise a new population of human T-cells, termed δβ T-cells, that express TCRs comprising a TCR-δ variable gene fused to a Joining-α/Constant-α domain, paired with an array of TCR-β chains. We characterised the cellular, functional, biophysical and structural features of this new T-cell population, revealing new insights into TCR diversity. We provide the molecular basis of how δβ T-cells can recognise a viral peptide presented by a Human Leukocyte Antigen (HLA) molecule. Our findings highlight how components from the αβ and γδ TCR gene loci can recombine to confer antigen specificity, thus expanding our understanding of T-cell biology and TCR diversity.

Keywords: new delta-beta TCR, HLA, viral peptide, structural immunology

Procedia PDF Downloads 408
1071 Hybrid-Nanoengineering™: A New Platform for Nanomedicine

Authors: Mewa Singh

Abstract:

Nanomedicine, a fusion of nanotechnology and medicine, is an emerging technology ideally suited to targeted therapies. Nanoparticles overcome the low selectivity of anti-cancer drugs toward the tumor as compared to normal tissue and hence result in less severe side-effects. Our new technology, HYBRID-NANOENGINEERING™, uses a new molecule (MR007) in the creation of nanoparticles that not only helps in nanonizing the medicine but also provides synergy to the medicine. The simplified manufacturing process results in reduced manufacturing costs. Treatment is made more convenient because hybrid nanomedicines can be produced in oral, injectable or transdermal formulations. The manufacturing process uses no protein, oil or detergents. The particle size is below 180 nm with a narrow size distribution. Importantly, these properties confer great stability on the structure. The formulation does not aggregate in plasma and is stable over a wide range of pH. The final hybrid formulation is stable for at least 18 months as a powder. More than 97 drugs, including paclitaxel, docetaxel, tamoxifen, doxorubicin, prednisone, and artemisinin, have been nanonized in water-soluble formulations. Preclinical studies on tumor cell cultures show promising results. Our HYBRID-NANOENGINEERING™ platform enables the design and development of hybrid nano-pharmaceuticals that combine efficacy with tolerability, giving patients hope for both extended overall survival and improved quality of life. This study discusses this new HYBRID-NANOENGINEERING™ platform, which targets drug delivery, synergistic and potentiating effects, and barriers to drug delivery and advanced drug delivery systems.

Keywords: nano-medicine, nano-particles, drug delivery system, pharmaceuticals

Procedia PDF Downloads 473
1070 Active Learning Techniques in Engineering Education

Authors: H. M. Anitha, Anusha N. Rao

Abstract:

The current developments in technology and ideas have given entirely new dimensions to the field of research and education. New delivery methods have been proposed, adding a further dimension to engineering education. In particular, more importance is given to new teaching practices based on Information and Communication Technologies (ICT). It is vital to adopt these new ICT methods, which lead to the emergence of novel structures and modes of education. The flipped classroom, think-pair-share and peer instruction are the latest pedagogical methods for engaging students with a course. The flipped classroom involves students watching video lectures outside the classroom and solving problems at home, while engaging in group discussions in the classroom. These are active learning methods wherein students are involved in diverse ways in learning the course. This paper gives a comprehensive study of past and present research on the flipped classroom, the think-pair-share activity and peer instruction.

Keywords: flipped classroom, think pair share, peer instruction, active learning

Procedia PDF Downloads 367
1069 A Family of Distributions on Learnable Problems without Uniform Convergence

Authors: César Garza

Abstract:

In supervised binary classification and regression problems, it is well-known that learnability is equivalent to a uniform convergence of the hypothesis class, and if a problem is learnable, it is learnable by empirical risk minimization. For the general learning setting of unsupervised learning tasks, there are non-trivial learning problems where uniform convergence does not hold. We present here the task of learning centers of mass with an extra feature that “activates” some of the coordinates over the unit ball in a Hilbert space. We show that the learning problem is learnable under a stable RLM rule. We introduce a family of distributions over the domain space with some mild restrictions for which the sample complexity of uniform convergence for these problems must grow logarithmically with the dimension of the Hilbert space. If we take this dimension to infinity, we obtain a learnable problem for which the uniform convergence property fails for a vast family of distributions.
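For the center-of-mass task the abstract mentions, the regularized loss minimization (RLM) rule has a simple closed form. The sketch below is a finite-dimensional stand-in for the Hilbert-space setting, with an illustrative regularization weight; it is not the paper's exact construction:

```python
import numpy as np

def rlm_loss(w, X, lam):
    """Empirical risk for the center-of-mass task plus Tikhonov regularization:
    (1/m) * sum_i ||w - x_i||^2 + lam * ||w||^2."""
    return np.mean(np.sum((X - w) ** 2, axis=1)) + lam * np.sum(w ** 2)

def rlm_center(X, lam):
    """Regularized loss minimizer: setting the gradient
    2*(w - mean(X)) + 2*lam*w to zero gives w = mean(X) / (1 + lam)."""
    return X.mean(axis=0) / (1.0 + lam)
```

The shrinkage by 1/(1 + lam) is what makes the rule stable: a single replaced sample moves the minimizer by at most O(1/m), which is the stability property that yields learnability without uniform convergence.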

Keywords: statistical learning theory, learnability, uniform convergence, stability, regularized loss minimization

Procedia PDF Downloads 108
1068 Transducers for Measuring Displacements of Rotating Blades in Turbomachines

Authors: Pavel Prochazka

Abstract:

The study deals with transducers for measuring vibration displacements of rotating blade tips in turbomachines. In order to prevent major accidents with extensive economic consequences, there is an urgent need for every low-pressure steam turbine stage to be equipped with a modern non-contact measuring system providing information on blade loading, damage and residual lifetime under operation. The requirement of measuring vibration and static characteristics of steam turbine blades therefore calls for the development and operational verification of new types of sensors as well as new measuring principles and methods. The task is really demanding: to measure displacements of blade tips with a resolution of the order of 10 μm at speeds up to 750 m/s, 100% humidity and temperatures up to 200 °C. While capacitive and optical transducers are used primarily in gas turbines, these transducers cannot be used in steam turbines. The reason is moisture vapor, droplets of condensing water and dirt, which disable the function of the sensors. Therefore, the most feasible approach was to focus on research into electromagnetic sensors featuring promising characteristics for the given blade materials in a steam environment. The following types of sensors have been developed and both experimentally and theoretically studied at the Institute of Thermodynamics, Academy of Sciences of the Czech Republic: eddy-current, Hall effect, inductive and magnetoresistive. Eddy-current transducers demand a small distance of 1 to 2 mm and change their properties in the harsh environment of steam turbines. Hall effect sensors have relatively low sensitivity and high values of offset, drift, and especially noise. Induction sensors do not require any supply current and have a simple construction. The magnitude of their output voltage depends on the velocity of the measured body and concurrently on the varying magnetic induction, so they cannot be used statically.
Magnetoresistive sensors are formed by magnetoresistors arranged into a Wheatstone bridge. Supplying the sensor from a current source provides better linearity. The MR sensors can be used permanently at temperatures up to 200 °C with supply currents of about 1 mA. Their frequency range of 0 to 300 kHz is an order of magnitude higher than that of the Hall effect and induction sensors. The frequency band starts at zero frequency, which is very important because the sensors can be calibrated statically. The MR sensors feature high sensitivity and low noise, and the symmetry of the bridge arrangement leads to a high common-mode rejection ratio and suppresses disturbances, which is important especially in industrial applications. Magnetoresistive transducers thus provide a range of excellent properties indicating their priority for displacement measurements of rotating blades in turbomachines.
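The bridge arithmetic behind the linearity claim can be checked numerically. The sketch below assumes a fully active bridge (all four arms change by ±dR) fed by a current source; the 1 mA supply matches the abstract, while the resistance values are illustrative assumptions:

```python
def bridge_output_current_supply(I, R, dR):
    """Differential output of a full Wheatstone bridge with arms
    R1 = R+dR, R2 = R-dR, R3 = R-dR, R4 = R+dR, fed by current source I.
    The supply current splits between the two half-bridges."""
    R1, R2, R3, R4 = R + dR, R - dR, R - dR, R + dR
    I1 = I * (R3 + R4) / (R1 + R2 + R3 + R4)   # current through R1-R2 branch
    I2 = I - I1                                 # current through R3-R4 branch
    return I2 * R3 - I1 * R1                    # differential output voltage

v = bridge_output_current_supply(1e-3, 1000.0, 5.0)  # 1 mA supply
# For this fully active bridge the output reduces to -I*dR: exactly linear in dR
```

With these arm assignments the output is -I·dR, so doubling dR exactly doubles the output, which illustrates (in a simplified form) why a current-fed symmetric bridge gives good linearity.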

Keywords: turbines, blade vibration, blade tip timing, non-contact sensors, magnetoresistive sensors

Procedia PDF Downloads 104
1067 Automatic Seizure Detection Using Weighted Permutation Entropy and Support Vector Machine

Authors: Noha Seddik, Sherine Youssef, Mohamed Kholeif

Abstract:

The automated epileptic seizure detection research field has emerged in recent years; this involves analyzing the Electroencephalogram (EEG) signals instead of the traditional visual inspection performed by expert neurologists. In this study, a Support Vector Machine (SVM) that uses Weighted Permutation Entropy (WPE) as the input feature is proposed for classifying normal and seizure EEG records. WPE is a modified form of permutation entropy (PE), a statistical parameter that measures the complexity and irregularity of a time series. It incorporates both the mapped ordinal pattern of the time series and the information contained in the amplitudes of its sample points. The proposed system exploits the fact that entropy-based measures for EEG segments during an epileptic seizure are lower than those for normal EEG.
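A minimal NumPy sketch of the WPE feature in the spirit the abstract describes: ordinal-pattern entropy in which each embedding vector is weighted by its variance, so amplitude information enters alongside the ordinal pattern. The embedding dimension and delay are illustrative choices, and the sketch assumes a non-constant signal:

```python
import numpy as np
from itertools import permutations

def weighted_permutation_entropy(x, m=3, tau=1):
    """Weighted permutation entropy: relative weighted frequency of
    ordinal patterns, each window weighted by its variance."""
    x = np.asarray(x, dtype=float)
    counts = {p: 0.0 for p in permutations(range(m))}
    total = 0.0
    for i in range(len(x) - (m - 1) * tau):
        vec = x[i:i + (m - 1) * tau + 1:tau]
        w = vec.var()                      # weight: variance of the window
        counts[tuple(int(j) for j in np.argsort(vec))] += w
        total += w                         # assumes total > 0 (non-constant x)
    p = np.array([c / total for c in counts.values() if c > 0])
    return -np.sum(p * np.log2(p))
```

A perfectly regular (monotone) segment uses a single ordinal pattern and gets entropy 0, while white noise approaches the maximum log2(m!), which is the contrast the SVM exploits between seizure and normal EEG.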

Keywords: electroencephalogram (EEG), epileptic seizure detection, weighted permutation entropy (WPE), support vector machine (SVM)

Procedia PDF Downloads 353
1066 Critical Role of Lipid Rafts in Influenza A Virus Binding to Host Cell

Authors: Dileep Kumar Verma, Sunil Kumar Lal

Abstract:

Influenza remains one of the most challenging diseases, posing a significant threat to public health and causing seasonal epidemics and pandemics. The Influenza A Virus (IAV) surface protein hemagglutinin is known to play an important role in viral attachment to the host sialic acid receptors and to concentrate in lipid rafts for efficient viral fusion. The selective use by Influenza A virus of raft micro-domains for efficient virus assembly and budding has been explored in depth. However, the detailed mechanism of IAV binding to the host cell membrane and entry into the host remains elusive. In the present study, we investigated the role of lipid rafts in the early life cycle events of IAV. The role of host lipid rafts was studied using a raft disruption method based on the extraction of cholesterol by Methyl-β-Cyclodextrin. Using GM1, a well-known lipid raft marker, we were able to observe co-localization of IAV with lipid rafts on the host cell membrane. This experiment suggests a direct involvement of lipid rafts in the initiation of the IAV life cycle. Upon disruption of lipid rafts by Methyl-β-Cyclodextrin, we observed a significant reduction in IAV binding to the host cell surface, indicating a significant decrease in virus attachment to coherent membrane rafts. Our results provide proof that host lipid rafts and their constituents play an important role in the adsorption of IAV. This study opens new avenues in IAV virus-host interactions to combat infection at the very early steps of the viral life cycle.

Keywords: lipid raft, adsorption, cholesterol, methyl-β-cyclodextrin, GM1

Procedia PDF Downloads 343
1065 Analysis of the Significance of Multimedia Channels Using Sparse PCA and Regularized SVD

Authors: Kourosh Modarresi

Abstract:

The abundance of media channels and devices has given users a variety of options to extract, discover, and explore information in the digital world. Since there is often a long and complicated path that a typical user may venture down before taking any (significant) action (such as purchasing goods and services), it is critical to know how each node (media channel) in the user's path has contributed to the final action. In this work, the significance of each media channel is computed using statistical analysis and machine learning techniques. More specifically, “Regularized Singular Value Decomposition” and “Sparse Principal Component Analysis” have been used to compute the significance of each channel toward the final action. The results of this work are a considerable improvement over present approaches.
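One way to make "significance of each channel" concrete: take a touchpoint matrix (paths × channels), compute the leading right-singular vector, and soft-threshold its loadings to get sparse significance scores. Everything below (the channel list, the Poisson touch counts, the shrinkage value) is a hypothetical stand-in, not the paper's data or exact algorithm:

```python
import numpy as np

def channel_significance(X, shrink=0.1):
    """Rank channels by the soft-thresholded leading right-singular
    vector of the centered touchpoint matrix (sparse loadings)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    v = np.abs(Vt[0])                  # leading loadings
    v = np.maximum(v - shrink, 0.0)    # soft-threshold -> variable shrinkage
    return v / v.sum() if v.sum() else v

rng = np.random.default_rng(0)
# Columns: search, display, email, social -- "search" drives most variance
X = rng.poisson([4.0, 0.5, 2.0, 0.2], size=(500, 4)).astype(float)
sig = channel_significance(X)
```

The soft-thresholding step is what turns a dense SVD loading vector into a sparse one, zeroing out channels whose contribution is indistinguishable from noise.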

Keywords: multimedia attribution, sparse principal component, regularization, singular value decomposition, feature significance, machine learning, linear systems, variable shrinkage

Procedia PDF Downloads 289
1064 Communication in a Heterogeneous Ad Hoc Network

Authors: C. Benjbara, A. Habbani

Abstract:

Wireless networks are getting more and more used in every new technology or feature, especially those without infrastructure (ad hoc mode), which provide a low-cost alternative to infrastructure-mode wireless networks and great flexibility for application domains such as environmental monitoring, smart cities, precision agriculture, and so on. These application domains share a common characteristic: the need for coexistence and intercommunication between modules belonging to different types of ad hoc networks, such as wireless sensor networks, mesh networks, mobile ad hoc networks, vehicular ad hoc networks, etc. This vision of bringing such heterogeneous networks to life would make many human tasks easier, but its development path is full of challenges. One of these challenges is the communication complexity between its components due to the lack of a common or compatible protocol standard. This article proposes a new patented routing protocol based on the OLSR standard in order to resolve the heterogeneous ad hoc network communication issue. The new protocol is applied to a specific network architecture composed of MANET, VANET, and FANET.

Keywords: Ad hoc, heterogeneous, ID-Node, OLSR

Procedia PDF Downloads 188
1063 An Analysis of the Strategies Employed to Curate, Conserve and Digitize the Timbuktu Manuscripts

Authors: F. Saptouw

Abstract:

This paper briefly reviews the range of curatorial interventions made to preserve and display the Timbuktu Manuscripts. The governments of South Africa and Mali collaborated to preserve the manuscripts, and brief notes will be presented about the value of archives in those specific spaces. The research initiatives of the Tombouctou Manuscripts Project, based at the University of Cape Town, feature prominently in the text. A brief overview of the history of the archive will be presented, and its preservation framed as a key turning point in curating the intellectual history of the continent. The strategies of preservation, curation, publication and digitization are presented as complementary interventions. Each materialization of the manuscripts contributes something significant; the complexity of the contribution depends primarily on the format of presentation. This integrated reading of the manuscripts is presented as a means to gain a more nuanced understanding of the past, one that greatly surpasses the information that would be gleaned from relying on a single media format.

Keywords: archive, curatorship, cultural heritage, museum practice, Timbuktu manuscripts

Procedia PDF Downloads 94
1062 A Novel Method for Silence Removal in Sounds Produced by Percussive Instruments

Authors: B. Kishore Kumar, Rakesh Pogula, T. Kishore Kumar

Abstract:

The pitch of an audio signal produced by musical instruments, specifically percussive instruments, is the perception of how high or low a tone is, and can be treated as a frequency closely related to the fundamental frequency. This paper presents a novel method for silence removal and segmentation of music signals produced by percussive instruments, and the performance of the proposed method is studied with the help of MATLAB simulations. The method is based on two simple features, namely the signal energy and the spectral centroid. Once the feature sequences are extracted, a simple thresholding criterion is applied in order to remove the silent areas in the sound signal. The simulations were carried out on various instruments such as the drum, flute and guitar, and the results of the proposed method were analyzed.
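The thresholding step can be sketched in a few lines using just the short-time energy feature (the spectral-centroid feature the paper also uses is omitted for brevity). The sampling rate, frame length, threshold ratio, and the synthetic drum-like burst are illustrative assumptions:

```python
import numpy as np

def remove_silence(x, fs, frame_ms=20, energy_ratio=0.1):
    """Keep only frames whose short-time energy exceeds a fraction
    of the maximum frame energy (a simple thresholding criterion)."""
    n = int(fs * frame_ms / 1000)
    frames = [x[i:i + n] for i in range(0, len(x) - n + 1, n)]
    energy = np.array([np.sum(f ** 2) for f in frames])
    keep = energy > energy_ratio * energy.max()
    return np.concatenate([f for f, k in zip(frames, keep) if k])

fs = 8000
t = np.arange(fs) / fs
burst = np.sin(2 * np.pi * 200 * t[:fs // 4])   # a quarter-second "hit"
signal = np.concatenate([np.zeros(fs // 2), burst, np.zeros(fs // 4)])
trimmed = remove_silence(signal, fs)
```

The silent frames before and after the burst fall below the energy threshold and are dropped, leaving essentially only the quarter-second event.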

Keywords: percussive instruments, spectral energy, spectral centroid, silence removal

Procedia PDF Downloads 384
1061 Effect of Personality Traits on Classification of Political Orientation

Authors: Vesile Evrim, Aliyu Awwal

Abstract:

Today, as in other domains, there is an enormous number of political transcripts available on the Web waiting to be mined and used for various purposes such as statistics and recommendations. Therefore, automatically determining the political orientation of these transcripts becomes crucial. The methodologies used by machine learning algorithms to perform the automatic classification are based on different features, such as linguistic ones. Considering the ideological differences between Liberals and Conservatives, this paper studies the effect of personality traits on political orientation classification. This is done by considering the correlation between LIWC features and the Big Five personality traits. Several experiments are conducted on the Convote U.S. Congressional-Speech dataset with seven benchmark classification algorithms. The different methodologies are applied to select different feature sets consisting of 8 to 64 features. Neuroticism is found to be the most differentiating personality trait for the classification of political polarity; when its top 10 representative features are combined with several classification algorithms, it outperforms the results presented in previous research.
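The "top 10 representative features" step can be sketched as correlation-based feature ranking followed by a benchmark classifier. The data below are synthetic stand-ins for LIWC feature vectors and polarity labels (the real study uses the Convote dataset, which is not reproduced here):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n, d = 400, 64                     # 64 LIWC-style features (hypothetical)
X = rng.normal(size=(n, d))
y = (X[:, 3] + X[:, 17] + 0.5 * rng.normal(size=n) > 0).astype(int)  # polarity

# Rank features by absolute correlation with the label; keep the top 10
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(d)])
top10 = np.argsort(corr)[::-1][:10]
score = cross_val_score(LogisticRegression(), X[:, top10], y, cv=5).mean()
```

Restricting the classifier to the top-ranked features is the same move the paper makes when it pairs the 10 most Neuroticism-representative features with its benchmark algorithms.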

Keywords: politics, personality traits, LIWC, machine learning

Procedia PDF Downloads 478
1060 Road Safety in Great Britain: An Exploratory Data Analysis

Authors: Jatin Kumar Choudhary, Naren Rayala, Abbas Eslami Kiasari, Fahimeh Jafari

Abstract:

Great Britain has one of the safest road networks in the world. However, the consequences of any death or serious injury are devastating for loved ones, as well as for those who help the severely injured. This paper aims to analyse Great Britain's road safety situation and show the response measures for areas where the total damage caused by accidents can be significantly and quickly reduced. In this paper, we perform an exploratory data analysis using STATS19 data. For the past 30 years, the UK has had a good record in reducing fatalities, ranking third based on the number of road deaths per million inhabitants. Around 165,000 accidents were reported in Great Britain in 2009, a figure that decreased every year until 2019, when it fell below 120,000. The government continues to drive down road deaths, empowering responsible road users and identifying and prosecuting the behaviours that make the roads less safe.

Keywords: road safety, data analysis, OpenStreetMap, feature expansion

Procedia PDF Downloads 117
1059 Partially Phosphorylated Polyvinyl Phosphate-PPVP Composite: Synthesis and Its Potentiality for Zr (IV) Extraction from an Acidic Medium

Authors: Khaled Alshamari

Abstract:

A synthesized, partially phosphorylated polyvinyl phosphate (PPVP) derivative was functionalized to extract zirconium (IV) from Egyptian zircon sand. The specifications of the PPVP composite were confirmed via different techniques, namely FT-IR, XPS, BET, EDX, TGA, HNMR, C-NMR, GC-MS, XRD and ICP-OES analyses, which demonstrated a satisfactory synthesis of PPVP and zircon dissolution from Egyptian zircon sand. Controlling parameters, such as pH value, shaking time, initial zirconium concentration, PPVP dose, nitrate ion concentration, co-ions, temperature and eluting agents, have been optimized. At 25 °C, pH 0, 20 min shaking, 0.05 mol/L zirconium ions and 0.5 mol/L nitrate ions, PPVP has a retention capacity of 195 mg/g, equivalent to 390 mg/L zirconium ions. From the extraction-distribution isotherm, the experimental data fit the Langmuir model better than the Freundlich model, with a theoretical value of 196.07 mg/g, which is in line with the experimental result of 195 mg/g. The adsorption of zirconium ions onto the PPVP composite follows pseudo-second-order kinetics with a theoretical capacity of 204.08 mg/g. According to the thermodynamic analysis, the extraction process is expected to be exothermic, spontaneous and favourable at low temperatures. The thermodynamic parameters ∆S (−0.03 kJ/mol), ∆H (−12.22 kJ/mol) and ∆G were also considered. As the temperature grows, ∆G values increase from −2.948 kJ/mol at 298 K to −1.941 kJ/mol at 338 K. Zirconium ions may be eluted from the loaded PPVP by 0.025 M HNO₃ with a 99% efficiency rate. Zirconium ions showed good separation factors towards some co-ions such as Hf⁴+ (28.82), Fe³+ (10.64), Ti⁴+ (28.82), V⁵+ (86.46) and U⁶+ (68.17). A successful alkali fusion technique with NaOH flux, followed by extraction with PPVP, is used to obtain a high-purity zirconia concentrate with a zircon content of 72.77% and a purity of 98.29%. The optimized factors can thus be applied in practice.
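The Langmuir comparison can be made concrete with a short fit: generate equilibrium data from a Langmuir isotherm using the abstract's theoretical capacity (196.07 mg/g) and recover q_max from the standard linearized form. The equilibrium-concentration range and the K_L value are illustrative assumptions, not the paper's measurements:

```python
import numpy as np

def langmuir_fit(Ce, qe):
    """Linearized Langmuir isotherm: Ce/qe = Ce/qmax + 1/(KL*qmax).
    A straight-line fit of Ce/qe against Ce gives slope = 1/qmax."""
    slope, intercept = np.polyfit(Ce, Ce / qe, 1)
    qmax = 1.0 / slope
    KL = 1.0 / (intercept * qmax)
    return qmax, KL

# Synthetic equilibrium data from a Langmuir isotherm with the
# theoretical capacity reported in the abstract
qmax_true, KL_true = 196.07, 0.05
Ce = np.linspace(5, 400, 25)
qe = qmax_true * KL_true * Ce / (1 + KL_true * Ce)
qmax_fit, KL_fit = langmuir_fit(Ce, qe)
```

On real data the same fit would be compared against a Freundlich fit (log qe vs. log Ce), and the better linear fit identifies the better model, as the abstract reports for Langmuir.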

Keywords: zirconium extraction, partially phosphorylated polyvinyl phosphate (PPVP), acidic medium, zircon

Procedia PDF Downloads 43
1057 Experimental Study of Sahara Climate Effect on Photovoltaic Solar Modules

Authors: A. Benatiallah, A. Hadjadj, D. Benatiallah, F. Abaidi, A. Harrouz

Abstract:

Photovoltaic systems are established as a reliable and economical source of electricity in rural and Saharan areas, especially in developing countries where the population is dispersed, energy consumption is low, and the power grid is not extended to these areas due to viability and financial problems. The energy produced by a photovoltaic system fluctuates greatly and depends on meteorological conditions. Wind is a very important and often neglected parameter in the behavior of solar modules. The electrical performance of a silicon solar module is appreciably affected by these conditions; in the present work, we have studied the behavior of a multi-crystalline solar module as a function of dust density, along with the principal electrical features of the module. The evaluation confirms that a solar module under the effect of sand collects a lower flux than under normal conditions.

Keywords: photovoltaic, multi-crystal module, experimental, effect of dust, performances

Procedia PDF Downloads 287
1057 Hydrological Response of the Glacierised Catchment: Himalayan Perspective

Authors: Sonu Khanal, Mandira Shrestha

Abstract:

Snow and glaciers are the largest dependable reserves of water for the river systems originating in the Himalayas, so accurate estimates of the volume of water contained in the snowpack and the rate of release of water from snow and glaciers are needed for efficient management of water resources. This research assesses the fusion of energy exchanges between the snowpack, the air above and the soil below according to mass and energy balance, which makes it more apposite than models using a simple temperature index for snow and glacier melt computation. UEBGrid, a distributed energy-based model, is used to calculate the melt, which is then routed by Geo-SFM. The model's robustness is maintained by incorporating the albedo generated from Landsat-7 ETM images on a seasonal basis for the year 2002-2003 and a substrate map derived from TM. The substrate map predominantly includes four major thematic layers, viz. snow, clean ice, glaciers and barren land. This approach makes use of CPC RFE-2 and MERRA gridded datasets as the source of precipitation and climatic variables. The subsequent model run for the years 2002-2008 shows that a total annual melt of 17.15 meters is generated from the Marshyangdi Basin, of which 71% is contributed by glaciers, 18% by rain and the rest by snowmelt. The albedo file is decisive in governing the melt dynamics, as a 30% increase in the generated surface albedo results in a 10% decrease in the simulated discharge. The melt, routed with the land cover and soil variables using Geo-SFM, shows a Nash-Sutcliffe efficiency of 0.60 against observed discharge for the study period.
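The reported goodness-of-fit metric is the Nash-Sutcliffe efficiency, which is simple to state and compute; the discharge arrays below are made-up numbers for illustration, not the Marshyangdi data:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
    1.0 is a perfect fit; values <= 0 mean the model is no better
    than simply predicting the observed mean."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([10.0, 12.0, 9.0, 14.0, 11.0])   # hypothetical discharge
sim = np.array([9.5, 12.5, 9.0, 13.0, 11.5])
nse = nash_sutcliffe(obs, sim)
```

An NSE of 0.60, as reported in the abstract, means the simulation explains 60% of the observed discharge variance relative to the mean-flow baseline.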

Keywords: glacier, glacier melt, snowmelt, energy balance

Procedia PDF Downloads 440
1056 Fully Automated Methods for the Detection and Segmentation of Mitochondria in Microscopy Images

Authors: Blessing Ojeme, Frederick Quinn, Russell Karls, Shannon Quinn

Abstract:

The detection and segmentation of mitochondria from fluorescence microscopy images are crucial for understanding the complex structure of the nervous system. However, the constant fission and fusion of mitochondria and image distortion in the background make the task of detection and segmentation challenging. In the literature, a number of open-source software tools and artificial intelligence (AI) methods have been described for analyzing mitochondrial images, achieving remarkable classification and quantitation results. However, the combined expertise in the medical field and in AI required to utilize these tools poses a challenge to their full adoption and use in clinical settings. Motivated by the advantages of automated methods in terms of good performance, minimum detection time, ease of implementation, and cross-platform compatibility, this study proposes a fully automated framework for the detection and segmentation of mitochondria using both image shape information and descriptive statistics. Using the low-cost, open-source Python and OpenCV libraries, the algorithms are implemented in three stages: pre-processing, image binarization, and coarse-to-fine segmentation. The proposed model is validated using a mitochondrial fluorescence dataset. Ground truth labels generated using Labkit were also used to evaluate the performance of the detection and segmentation model. The study produces good detection and segmentation results and reports the challenges encountered during the image analysis of mitochondrial morphology from the fluorescence dataset. A discussion of the methods and future perspectives of fully automated frameworks concludes the paper.
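The binarization stage can be illustrated with Otsu's method. The study uses OpenCV; below is a dependency-light NumPy re-implementation of the same thresholding idea, applied to a made-up fluorescence-like frame (dim background with two bright blobs), which is only a stand-in for the real dataset:

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's method: choose the threshold that maximizes the
    between-class variance of the grayscale histogram."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    best_t, best_var = 0, 0.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()       # class probabilities
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * p[:t]).sum() / w0      # class means
        mu1 = (np.arange(t, 256) * p[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2             # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t

rng = np.random.default_rng(0)
img = rng.integers(10, 40, (64, 64), dtype=np.uint8)  # dim background
img[20:30, 20:30] = 200                                # bright "mitochondria"
img[40:50, 10:18] = 220
t = otsu_threshold(img)
mask = img >= t
```

In the full pipeline this binary mask would feed the coarse-to-fine segmentation stage (e.g., connected components and shape statistics).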

Keywords: 2D, binarization, CLAHE, detection, fluorescence microscopy, mitochondria, segmentation

Procedia PDF Downloads 344
1055 Identifying Promoters and Their Types Based on a Two-Layer Approach

Authors: Bin Liu

Abstract:

A prokaryotic promoter, consisting of two short DNA sequences located at the -35 and -10 positions, is responsible for controlling the initiation of gene expression. Different types of promoters have different functions, and their consensus sequences are similar. In addition, consensus sequences may differ within the same type of promoter, which poses difficulties for promoter identification. Unfortunately, existing computational methods treat promoter identification as a binary classification task and can only identify whether a query sequence belongs to a specific promoter type. It is desirable to develop computational methods for effectively identifying both promoters and their types. Here, a two-layer predictor is proposed to address this problem. The first layer predicts whether a given sequence is a promoter, and the second layer predicts the type of any sequence judged to be a promoter. Meanwhile, we also analyze the importance of features and sequence conservation in two respects: promoter identification and promoter type identification. To the best of our knowledge, this is the first computational predictor to detect both promoters and their types.
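The two-layer architecture can be sketched with random forests (named in the keywords). The feature vectors and labels below are synthetic stand-ins for real promoter sequence features, and the class design is an illustrative reading of the abstract, not the authors' exact implementation:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

class TwoLayerPromoterPredictor:
    """Layer 1: promoter vs. non-promoter. Layer 2: promoter type,
    applied only to sequences judged promoters by layer 1."""
    def __init__(self):
        self.layer1 = RandomForestClassifier(n_estimators=50, random_state=0)
        self.layer2 = RandomForestClassifier(n_estimators=50, random_state=0)

    def fit(self, X, is_promoter, promoter_type):
        self.layer1.fit(X, is_promoter)
        mask = is_promoter == 1
        self.layer2.fit(X[mask], promoter_type[mask])  # types of promoters only
        return self

    def predict(self, X):
        first = self.layer1.predict(X)
        out = np.full(len(X), -1)        # -1 = not a promoter
        if first.any():
            out[first == 1] = self.layer2.predict(X[first == 1])
        return out

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 8))                     # stand-in sequence features
is_prom = (X[:, 0] > 0).astype(int)
ptype = (X[:, 1] > 0).astype(int)                 # two hypothetical types
model = TwoLayerPromoterPredictor().fit(X, is_prom, ptype)
pred = model.predict(X)
```

Cascading the two classifiers means the type model never has to learn the non-promoter class, which is the main appeal of the two-layer design over a single multi-class classifier.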

Keywords: promoter, promoter type, random forest, sequence information

Procedia PDF Downloads 168
1054 Protection of Steel Bars in Reinforced Concrete with Zinc-Based Coatings

Authors: Hamed Rajabzadeh Gatabi, Soroush Dastgheibifard, Mahsa Asnafi

Abstract:

There is no doubt that reinforced concrete has been known as one of the most significant materials used in the construction industry for many years. However, natural elements in the environment can contribute to its corrosion or failure, one cause of which is bar, or so-called reinforcement, failure. To combat this problem, one of the oxidation prevention methods investigated was the barrier protection method, implemented through the application of an organic coating, specifically fusion-bonded epoxy. In this study, a comparative method is applied to two different kinds of coated bars (zinc-rich epoxy and polyamide epoxy coated bars) and also to uncoated bars. In order to evaluate these reinforced concretes, the adhesion, toughness, thickness and corrosion performance of the coatings were compared using tools such as Cu/CuSO₄ electrodes, EIS, etc. The different types of concrete were exposed to a salty environment (NaCl 3.5%) and their durability was measured. According to the experiments, the thick epoxy coatings have acceptable adhesion and strength. The adhesion of the polyamide epoxy coatings to the bars was slightly better than that of the zinc-rich epoxy coatings; nonetheless, the polyamide epoxy was stiffer than the zinc-rich epoxy. Conversely, bars coated with zinc-rich epoxy showed more negative oxidation potentials, reflecting the sacrificial protection of the bars by zinc particles. On the whole, zinc-rich epoxy coatings are more corrosion-resistant than polyamide epoxy coatings due to the consumption of zinc, among other factors; additionally, if epoxy coatings without surface defects are applied carefully to the rebar surface, the life of steel structures can increase dramatically.

Keywords: surface coating, epoxy polyamide, reinforced concrete bars, salty environment

Procedia PDF Downloads 271
1053 Cardiovascular Disease Prediction Using Machine Learning Approaches

Authors: P. Halder, A. Zaman

Abstract:

It is estimated that heart disease accounts for one in ten deaths worldwide. According to the World Health Organization, deaths due to heart disease are among the leading causes of death in the United States, and the Centers for Disease Control and Prevention (CDC) reports that cardiovascular diseases (CVDs) account for one in four U.S. deaths. Statistics also indicate that women are more likely than men to die from heart disease as a result of strokes, while the World Health Organization reported a 50% increase in men's mortality in 2009. The consequences of cardiovascular disease are severe. Risk factors for heart disease include diabetes, high blood pressure, high cholesterol, and abnormal pulse rates. Machine learning (ML) can be used to make predictions and support decisions in the healthcare industry, and scientists have therefore turned to modern technologies such as machine learning and data mining to predict diseases. In this work, disease prediction is based on four algorithms, of which AdaBoost proved the most accurate.
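The abstract names AdaBoost as the most accurate of the four algorithms but does not detail them. As a hedged illustration of the boosting idea only, the sketch below implements AdaBoost with decision stumps in plain NumPy on a hypothetical one-dimensional toy problem, not the study's clinical data:

```python
import numpy as np

def fit_stump(X, y, w):
    """Exhaustively pick the decision stump with the lowest weighted error."""
    best = (0, 0.0, 1, np.inf)  # (feature, threshold, polarity, error)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (j, t, pol, err)
    return best

def adaboost_fit(X, y, rounds=20):
    """Labels y must be in {-1, +1}; returns a list of weighted stumps."""
    w = np.full(len(y), 1.0 / len(y))
    ensemble = []
    for _ in range(rounds):
        j, t, pol, err = fit_stump(X, y, w)
        err = max(err, 1e-10)
        alpha = 0.5 * np.log((1.0 - err) / err)  # stump weight
        pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
        w *= np.exp(-alpha * y * pred)           # upweight the mistakes
        w /= w.sum()
        ensemble.append((alpha, j, t, pol))
    return ensemble

def adaboost_predict(ensemble, X):
    score = sum(a * np.where(p * (X[:, j] - t) >= 0, 1, -1)
                for a, j, t, p in ensemble)
    return np.sign(score)

# toy 1-D example: the positive class occupies an interval, which no single
# stump can represent but a weighted vote of stumps can
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0], [6.0], [7.0], [8.0]])
y = np.array([-1, -1, 1, 1, 1, 1, -1, -1])
ens = adaboost_fit(X, y, rounds=20)
acc = (adaboost_predict(ens, X) == y).mean()
```

Each round upweights the misclassified samples, so later stumps concentrate on the hard cases; this focus on difficult examples is one common explanation for boosting's strong accuracy on tabular clinical features.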

Keywords: heart disease, cardiovascular disease, coronary artery disease, feature selection, random forest, AdaBoost, SVM, decision tree

Procedia PDF Downloads 136
1052 Online Prediction of Nonlinear Signal Processing Problems Based on Kernel Adaptive Filtering

Authors: Hamza Nejib, Okba Taouali

Abstract:

This paper presents two of the best-known kernel adaptive filtering (KAF) approaches, kernel least mean squares (KLMS) and kernel recursive least squares (KRLS), in order to predict new outputs of nonlinear signal processing problems. Both methods implement a nonlinear transfer function using kernel methods in a particular space called a reproducing kernel Hilbert space (RKHS), where the model is a linear combination of kernel functions. These functions transform the observed data from the input space to a high-dimensional feature space, an idea known as the kernel trick; KAF algorithms are thus adaptive filters developed in the RKHS. We use two nonlinear signal processing problems, Mackey-Glass chaotic time series prediction and nonlinear channel equalization, to assess the performance of the presented approaches and, finally, to determine which of them is better adapted.
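The KLMS member of this family can be sketched in a few lines: the filter grows one kernel centre per sample and updates its coefficients by the instantaneous error. The example below is an illustration only, using a simple synthetic series in place of Mackey-Glass and a Gaussian kernel whose bandwidth and step size are assumptions:

```python
import numpy as np

def gauss(a, b, sigma=1.0):
    """Gaussian kernel between two input vectors."""
    return np.exp(-np.sum((a - b) ** 2) / (2 * sigma ** 2))

class KLMS:
    """Kernel LMS: f(x) = sum_i a_i k(c_i, x), grown online in the RKHS."""
    def __init__(self, eta=0.5, sigma=1.0):
        self.eta, self.sigma = eta, sigma
        self.centers, self.coeffs = [], []

    def predict(self, x):
        return sum(a * gauss(c, x, self.sigma)
                   for c, a in zip(self.centers, self.coeffs))

    def update(self, x, d):
        e = d - self.predict(x)       # instantaneous prediction error
        self.centers.append(x)        # every input becomes a kernel centre
        self.coeffs.append(self.eta * e)
        return e

# one-step-ahead prediction of a synthetic nonlinear series
# (a stand-in for Mackey-Glass), using 3 lagged samples as input
t = np.arange(300)
series = np.sin(0.3 * t) * np.cos(0.11 * t)
filt = KLMS(eta=0.5, sigma=0.7)
errors = [abs(filt.update(series[i - 3:i], series[i]))
          for i in range(3, len(series))]
```

The prediction error shrinks as the dictionary of centres grows, which is the behaviour the paper exploits; practical KAF variants add sparsification to keep the dictionary from growing without bound.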

Keywords: online prediction, KAF, signal processing, RKHS, Kernel methods, KRLS, KLMS

Procedia PDF Downloads 381
1051 YBa2Cu3O7-d Nanoparticles Doped by Ferromagnetic Nanoparticles of Y3Fe5O12

Authors: Samir Khene

Abstract:

Present and future industrial uses of high-critical-temperature superconductors require high critical temperatures TC and strong critical current densities JC; these two aims constitute the two motivations of scientific research in this domain. The most significant feature of any superconductor, from the viewpoint of applications, is the maximum electrical transport current density it is capable of withstanding without loss of energy. In this work, vortex pinning in conventional and high-TC superconductors will be studied. Our experiments on vortex pinning in single crystals and nanoparticles of YBa2Cu3O7-d and La1.85Sr0.15CuO4 will be presented. Special attention will be given to the study of YBa2Cu3O7-d nanoparticles doped with ferromagnetic nanoparticles of Y3Fe5O12. The coexistence of ferromagnetism and superconductivity in this compound will be demonstrated, and the influence of these ferromagnetic nanoparticles on the variations of the critical current density JC in YBa2Cu3O7-d nanoparticles as a function of applied field H and temperature T will be studied.

Keywords: ferromagnetism, superconductivity, coexistence, magnetic material

Procedia PDF Downloads 57
1050 Incorporating Anomaly Detection in a Digital Twin Scenario Using Symbolic Regression

Authors: Manuel Alves, Angelica Reis, Armindo Lobo, Valdemar Leiras

Abstract:

In Industry 4.0, it is common to have a lot of sensor data, and in this deluge of data, hints of possible problems are difficult to spot. The digital twin concept aims to help answer this problem, but it is mainly used as a monitoring tool to handle the visualisation of data. Failure detection is of paramount importance in any industry, and it consumes a lot of resources; any improvement in this regard is of tangible value to the organisation. The aim of this paper is to add the ability to forecast test failures, curtailing detection times. To achieve this, several anomaly detection algorithms were compared with a symbolic regression approach: Isolation Forest, One-Class SVM, and an auto-encoder were explored, and for symbolic regression, the PySR library was used. The first results show that this approach is valid and can be added to the tools available in this context as a low-resource anomaly detection method, since, after training, the only requirement is the evaluation of a polynomial, a useful feature in the digital twin context.
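The paper's symbolic regressor is learned with PySR; as a hedged stand-in, the sketch below uses an ordinary NumPy polynomial fit on hypothetical sensor data to show the same runtime property: after training, detection reduces to evaluating a polynomial and comparing the residual against a threshold.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Normal operation" training data from a hypothetical sensor:
# readings follow a smooth trend plus small measurement noise.
t = np.linspace(0.0, 10.0, 200)
readings = 0.5 * t**2 - t + rng.normal(0.0, 0.1, t.size)

# Stand-in for symbolic regression: fit a low-degree polynomial
# to the normal data and record the residual spread.
coeffs = np.polyfit(t, readings, deg=2)
residual_std = np.std(readings - np.polyval(coeffs, t))
threshold = 3.0 * residual_std  # flag anything beyond 3 sigma

def is_anomaly(t_new, reading):
    """At run time only a polynomial evaluation and a comparison are needed."""
    return abs(reading - np.polyval(coeffs, t_new)) > threshold
```

This is the low-resource property the abstract highlights: unlike an auto-encoder or a One-Class SVM, the deployed detector is just a handful of coefficients.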

Keywords: anomaly detection, digital twin, industry 4.0, symbolic regression

Procedia PDF Downloads 103
1049 Statistical Wavelet Features, PCA, and SVM-Based Approach for EEG Signals Classification

Authors: R. K. Chaurasiya, N. D. Londhe, S. Ghosh

Abstract:

Electroencephalography is the study of the electrical signals produced by the neural activity of the human brain. In this paper, we propose an automatic and efficient EEG signal classification approach that classifies an EEG signal into one of two classes: epileptic seizure or not. The proposed approach starts by extracting features with the Discrete Wavelet Transform (DWT), which decomposes the EEG signals into sub-bands. These features, extracted from the detail and approximation coefficients of the DWT sub-bands, are used as input to Principal Component Analysis (PCA). Classification is then based on reducing the feature dimension using PCA and deriving the support vectors using a Support Vector Machine (SVM). The experiments are performed on a real, standard dataset, and a very high level of classification accuracy is obtained.
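A minimal sketch of the feature-extraction stages described above, using a hand-rolled Haar transform and an SVD-based PCA on synthetic epochs; the study's actual wavelet family, dataset, and SVM stage are not reproduced here, and the feature choices (mean, standard deviation, max-abs per sub-band) are illustrative assumptions:

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar DWT: approximation and detail coefficients."""
    x = x[: len(x) // 2 * 2].reshape(-1, 2)
    return (x[:, 0] + x[:, 1]) / np.sqrt(2), (x[:, 0] - x[:, 1]) / np.sqrt(2)

def wavelet_features(signal, levels=3):
    """Statistical features (mean, std, max-abs) from each DWT sub-band."""
    feats, approx = [], signal
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
        feats += [detail.mean(), detail.std(), np.abs(detail).max()]
    feats += [approx.mean(), approx.std(), np.abs(approx).max()]
    return np.array(feats)

def pca_reduce(F, k):
    """Project the feature matrix onto its first k principal components."""
    Fc = F - F.mean(axis=0)
    _, _, Vt = np.linalg.svd(Fc, full_matrices=False)
    return Fc @ Vt[:k].T

# toy corpus: 20 synthetic "EEG" epochs of 256 samples each
rng = np.random.default_rng(42)
epochs = rng.normal(size=(20, 256))
F = np.array([wavelet_features(e) for e in epochs])
Z = pca_reduce(F, k=4)   # reduced features, ready for an SVM classifier
```

A real pipeline would typically use a Daubechies wavelet via PyWavelets and scikit-learn's SVC in place of the hand-rolled pieces; the data flow, signal to sub-band statistics to PCA to classifier, is the same.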

Keywords: discrete wavelet transform, electroencephalogram, pattern recognition, principal component analysis, support vector machine

Procedia PDF Downloads 618
1048 Shared Decision-Making in Holistic Healthcare: Integrating Evidence-Based Medicine and Values-Based Medicine

Authors: Ling-Lang Huang

Abstract:

Research Background: Historically, the evolution of medicine has aimed not only to extend life but has also, in the process of maintaining life, inadvertently introduced suffering, presenting a contemporary challenge. We must carefully weigh the conflict between the length of life and the quality of living. Evidence-Based Medicine (EBM) exists primarily to ensure the quality of cures. However, EBM alone does not fulfill our ultimate medical goals; we must also draw on Values-Based Medicine (VBM) to find the best treatment for each patient. Research Methodology: We can attempt to integrate EBM with VBM. Within the five steps of EBM, the first three (Ask, Acquire, Appraise) focus on the physical aspect of the human being. In the fourth and fifth steps (Apply, Assess), however, the focus shifts from the physical to applying evidence-based treatment to the patient and assessing its effectiveness, considering the individual holistically. To bring VBM to patients, we can proceed in three steps. The first is "awareness": recognizing that each patient inhabits a different life-world and possesses unique differences. The second is "integration," akin to the hermeneutic concept of the fusion of horizons: being aware of differences while also understanding their origins in each patient. The third is "respect": setting aside our adherence to medical objectivity and scientific rigor to respect the ultimate healthcare decisions individuals make regarding their own lives. Discussion and Conclusion: After completing these three steps of VBM, we can return to the fifth step of EBM, Assess. Our assessment can now transcend the physical treatment focus of the initial steps and align with a holistic philosophy of care.

Keywords: shared decision-making, evidence-based medicine, values-based medicine, holistic healthcare

Procedia PDF Downloads 27
1047 The Tourism Pattern Based on Lifestyle: A Case Study of Suzhou City in China

Authors: Ling Chen, Lanyan Peng

Abstract:

In the latest round of institutional reform of the State Council, the Ministry of Culture and the Ministry of Tourism were merged into a new department, the Ministry of Culture and Tourism, embodying the idea of the fused development of the cultural and tourism industries. At the same time, domestic tourists pay increasing attention to the tourism experience and tourism quality, and tourism patterns have shifted from a sightseeing mode centred on individual scenic spots to a lifestyle mode of experiencing the cultural atmosphere of the destination. This paper therefore focuses on the lifestyle-based tourism pattern, studying its development status, content, and implementation measures. As this pattern deeply integrates the cultural and tourism industries, tourists can experience the living atmosphere, living conditions, and quality of life of the destination and gain a deep understanding of its urban cultural connotation during the trip. Suzhou has taken a series of measures to build up a lifestyle-based tourism pattern, 'Suzhou life' tourism, including regional tourism planning, integration of cultural resources, construction of the urban atmosphere, and upgrading of infrastructure. 'Suzhou life' tourism is based on Suzhou food (cooked wheaten food, dim sum, specialty snacks), tourist attractions (Suzhou gardens, the ancient city), and characteristic recreational activities (appreciating Kun opera, enjoying Suzhou Pingtan, tea drinking), and the continuous integration of these three components meets the spiritual and cultural needs of tourists and upgrades the lifestyle-based tourism pattern. Finally, the paper puts forward planning suggestions for this tourism pattern.

Keywords: tourism pattern, lifestyle, integration of cultural and tourism industries, Suzhou life

Procedia PDF Downloads 220
1046 A Framework for Review Spam Detection Research

Authors: Mohammadali Tavakoli, Atefeh Heydari, Zuriati Ismail, Naomie Salim

Abstract:

With the increasing number of people reviewing products online in recent years, opinion-sharing websites have become the most important source of customers' opinions. Unfortunately, spammers generate and post fake reviews in order to promote or demote brands and mislead potential customers. These reviews are notably destructive not only for potential customers but also for business holders and manufacturers. However, research in this area is not adequate, and many critical problems related to spam detection have not been solved to date. To aid newcomers to the domain, in this paper, we have attempted to create a high-quality framework that gives a clear view of review spam-detection methods. In addition, this report contains a comprehensive collection of the detection metrics used in proposed spam-detection approaches; these metrics are highly applicable to developing novel detection methods.

Keywords: fake reviews, feature collection, opinion spam, spam detection

Procedia PDF Downloads 393
1045 Churn Prediction for Telecommunication Industry Using Artificial Neural Networks

Authors: Ulas Vural, M. Ergun Okay, E. Mesut Yildiz

Abstract:

Telecommunication service providers demand accurate and precise prediction of customer churn probabilities to increase the effectiveness of their customer relation services. The large amount of customer data owned by the service providers is suitable for analysis by machine learning methods. In this study, customers' expenditure data are analyzed using an artificial neural network (ANN). The ANN model is applied to the data of customers with different billing durations. The proposed model successfully predicts churn probabilities with 83% accuracy using only three months of expenditure data, and the prediction accuracy increases up to 89% when nine months of data are used. The experiments also show that the accuracy of the ANN model increases on an extended feature set that includes information on changes in the bill amounts.
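A toy version of such a model can be sketched in plain NumPy. The expenditure data and the churn rule below (spending that drops over three months) are invented purely for illustration, and the one-hidden-layer network is far smaller than anything a production churn model would use:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for the study's data: monthly expenditures of 200
# customers over three months; here a customer "churns" when spending drops.
X = rng.normal(50.0, 10.0, (200, 3))
y = (X[:, 2] - X[:, 0] < -5.0).astype(float)  # hypothetical churn label
Xn = (X - X.mean(0)) / X.std(0)               # normalise the features

# One hidden layer, sigmoid activations, full-batch gradient descent
# on the cross-entropy loss.
W1 = rng.normal(0.0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(4000):
    h = sig(Xn @ W1 + b1)                  # hidden activations, (200, 8)
    p = sig(h @ W2 + b2).ravel()           # churn probabilities, (200,)
    g = (p - y)[:, None] / len(y)          # output-layer gradient
    dW2, db2 = h.T @ g, g.sum()
    dh = (g @ W2.T) * h * (1 - h)          # back-propagate through sigmoid
    dW1, db1 = Xn.T @ dh, dh.sum(0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

accuracy = ((p > 0.5) == y).mean()
```

The study's observation that accuracy grows with billing duration corresponds here to widening X with more months of spending, which gives the network more trend information to exploit.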

Keywords: customer relationship management, churn prediction, telecom industry, deep learning, artificial neural networks

Procedia PDF Downloads 128