Search results for: method detection limit
19691 Prediction of Disability-Adjustment Mental Illness Using Machine Learning
Authors: S. R. M. Krishna, R. Santosh Kumar, V. Kamakshi Prasad
Abstract:
Machine learning techniques are applied to analyze the impact of mental illness on the burden of disease, which is calculated using the disability-adjusted life year (DALY). DALYs for a disease are the sum of the years of life lost due to premature mortality (YLLs) and the years of healthy life lost due to disability (YLDs). The critical analysis is based on the data sources, machine learning techniques, and feature extraction methods. The review covers the major databases. The extracted data were examined using statistical analysis, and machine learning techniques were applied. Predicting the impact of mental illness on the population using machine learning is an alternative to traditional strategies, which are time-consuming and may not be reliable. The approach calls for comprehensive adoption, innovative algorithms, and an understanding of the limitations and challenges. The obtained prediction is a way of understanding the underlying impact of mental illness on people's health and enables estimation of healthy life expectancy. The growing impact of mental illness and the challenges associated with the detection and treatment of mental disorders make it necessary to understand its complete effect on the majority of the population.
Procedia PDF Downloads 45
19690 The Proposal of Modification of California Pipe Method for Inclined Pipe
Authors: Wojciech Dąbrowski, Joanna Bąk, Laurent Solliec
Abstract:
Technical and technological progress and the constant development of methods and devices applied in sanitary engineering are indispensable. Issues in sanitary engineering involve flow measurement for water and wastewater. Precise measurement is pivotal for further actions, such as monitoring. There are many methods and techniques of flow measurement in sanitary engineering: weirs and flumes are well-known and commonly used, but alternative methods also exist, some very simple and others relying on advanced techniques. An old-time method combined with modern techniques can be more useful than before. This paper describes a substitute method of flow gauging (the California pipe method) and proposes a modification of this method for inclined pipes. Examining the possibility of improving and developing old-time methods is the direction of this investigation.
Keywords: California pipe, sewerage, flow rate measurement, water, wastewater, improve, modification, hydraulic monitoring, stream
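For reference, the classical (horizontal-pipe) California pipe formula attributed to Vanleer computes the discharge from the pipe diameter and the air gap at the outlet. The inclined-pipe modification proposed in the paper is not given in the abstract, so the sketch below shows only the classical formula, in US customary units:

```python
def california_pipe_discharge(a_ft, d_ft):
    """Classical California pipe (Vanleer) formula for a partially filled,
    horizontal pipe discharging freely into air:
        Q = 8.69 * (1 - a/d)**1.88 * d**2.48
    where Q is in ft^3/s, d is the internal pipe diameter in feet, and a is
    the distance from the top of the inside pipe wall down to the water
    surface at the outlet (also in feet).
    """
    if not 0.0 <= a_ft <= d_ft:
        raise ValueError("the gap a must satisfy 0 <= a <= d")
    return 8.69 * (1.0 - a_ft / d_ft) ** 1.88 * d_ft ** 2.48
```

A deeper flow (smaller a) gives a larger discharge; the formula is commonly stated to be valid only when the pipe discharges less than half full, a condition a production implementation would also check.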
Procedia PDF Downloads 440
19689 Comparison of PbS/ZnS Quantum Dots Synthesis Methods
Authors: Mahbobeh Bozhmehrani, Afshin Farah Bakhsh
Abstract:
Nanoparticles with a PbS core of 12 nm and a shell of approximately 3 nm were synthesized at a PbS:ZnS ratio of 1.01:0.1 using mercaptopropionic acid as the stabilizing agent. PbS/ZnS nanoparticles present a dramatic increase in photoluminescence intensity, confirming the confinement of the PbS core, with the quantum yield increasing from 0.63 to 0.92 upon addition of the ZnS shell. In this case, synthesis by the microwave method yields nanoparticles with optical characteristics enhanced relative to those of nanoparticles synthesized by the colloidal method.
Keywords: PbS/ZnS, quantum dots, colloidal method, microwave
Procedia PDF Downloads 290
19688 Kernel-Based Double Nearest Proportion Feature Extraction for Hyperspectral Image Classification
Authors: Hung-Sheng Lin, Cheng-Hsuan Li
Abstract:
Over the past few years, kernel-based algorithms have been widely used to extend linear feature extraction methods such as principal component analysis (PCA), linear discriminant analysis (LDA), and nonparametric weighted feature extraction (NWFE) to their nonlinear versions: kernel principal component analysis (KPCA), generalized discriminant analysis (GDA), and kernel nonparametric weighted feature extraction (KNWFE), respectively. These nonlinear feature extraction methods can detect nonlinear directions with the largest nonlinear variance or the largest class separability based on the given kernel function, and they have been applied to improve target detection and image classification for hyperspectral images. Double nearest proportion feature extraction (DNP) can effectively reduce the overlap effect and performs well in hyperspectral image classification. The DNP structure is an extension of the k-nearest neighbor technique: for each sample, there are two corresponding nearest proportions of samples, the self-class nearest proportion and the other-class nearest proportion. The term "nearest proportion" as used here considers both local information and more global information. With these settings, the effect of overlap between the sample distributions can be reduced. The maximum likelihood estimator and the related unbiased estimator are usually not ideal in high-dimensional inference problems, particularly with small sample sizes; hence, an improved estimator based on shrinkage estimation (regularization) is proposed. Based on the DNP structure, LDA is included as a special case. In this paper, the kernel method is applied to extend DNP to kernel-based DNP (KDNP). In addition to the advantages of DNP, KDNP surpasses DNP in the experimental results. According to experiments on real hyperspectral image data sets, the classification performance of KDNP is better than that of PCA, LDA, NWFE, and their kernel versions, KPCA, GDA, and KNWFE.
Keywords: feature extraction, kernel method, double nearest proportion feature extraction, kernel double nearest feature extraction
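The shrinkage (regularized) estimation mentioned above can be illustrated with a common form that pulls the sample covariance toward a scaled identity target. The exact estimator and shrinkage weight used in the paper are not given in the abstract, so this is a generic sketch:

```python
def shrinkage_covariance(samples, lam=0.1):
    """Shrink the sample covariance toward a scaled identity target:
        S_lam = (1 - lam) * S + lam * (trace(S)/p) * I.
    'samples' is a list of p-dimensional observations (lists of floats).
    This is a generic regularized estimator for small-sample settings,
    not necessarily the paper's exact form. Note the shrinkage preserves
    the trace of S while damping off-diagonal terms.
    """
    n, p = len(samples), len(samples[0])
    mean = [sum(x[j] for x in samples) / n for j in range(p)]
    # sample covariance (maximum-likelihood form, divisor n)
    S = [[sum((x[i] - mean[i]) * (x[j] - mean[j]) for x in samples) / n
          for j in range(p)] for i in range(p)]
    mu = sum(S[i][i] for i in range(p)) / p  # trace(S)/p
    return [[(1 - lam) * S[i][j] + (lam * mu if i == j else 0.0)
             for j in range(p)] for i in range(p)]
```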
Procedia PDF Downloads 352
19687 The Impact of Training Method on Programming Learning Performance
Authors: Chechen Liao, Chin Yi Yang
Abstract:
Although several factors that affect learning to program have been identified over the years, there is still no consensus on why some students learn to program easily and quickly while others have difficulty. Researchers have seldom considered how to help students improve programming learning outcomes. The research was conducted at a high school in Taiwan. The participants were 330 tenth-grade students enrolled in the Basic Computer Concepts course with the same instructor. Two types of training methods, instruction-oriented and exploration-oriented, were applied. The results show that the instruction-oriented training method yields better learning performance than the exploration-oriented training method.
Keywords: learning performance, programming learning, TDD, training method
Procedia PDF Downloads 429
19686 The Use of SD Bioline TB AgMPT64® Detection Assay for Rapid Characterization of Mycobacteria in Nigeria
Authors: S. Ibrahim, U. B. Abubakar, S. Danbirni, A. Usman, F. M. Ballah, C. A. Kudi, L. Lawson, G. H. Abdulrazak, I. A. Abdulkadir
Abstract:
Performing culture and characterization of mycobacteria in low-resource settings like Nigeria is a very difficult task because very few laboratories carry out such work, largely owing to the stringent and laborious nature of the tests. Hence, a rapid, simple and accurate characterization test is needed. The "SD BIOLINE TB Ag MPT 64 Rapid®" is a simple and rapid immunochromatographic test used to differentiate members of the Mycobacterium tuberculosis complex (MTBC) from non-tuberculous mycobacteria (NTM). One hundred sputa were obtained from patients suspected of tuberculosis who presented at hospitals for check-up and treatment. The samples were cultured in a class III biosafety cabinet following level III biosafety practices. Forty isolates were obtained from the cultured sputa and identified as acid-fast bacilli (AFB) using the Ziehl-Neelsen acid-fast stain. All AFB-positive isolates were then subjected to the SD BIOLINE assay. A total of 31 (77.5%) were characterized as MTBC, while nine (22.5%) were NTM. The total turnaround time for the rapid assay was just 30 minutes, compared to days for phenotypic and genotypic methods. It is a simple, rapid and reliable test for differentiating MTBC from NTM.
Keywords: culture, mycobacteria, non-tuberculous mycobacterium, SD Bioline
Procedia PDF Downloads 350
19685 Deep Learning Based Polarimetric SAR Images Restoration
Authors: Hossein Aghababaei, Sergio Vitale, Giampaolo Ferraioli
Abstract:
In the context of Synthetic Aperture Radar (SAR) data, polarization is an important source of information for monitoring the Earth's surface. SAR systems are often designed to transmit only one polarization, a constraint that leads to either single or dual polarimetric imaging modalities. Single polarimetric systems operate with a fixed single polarization of both the transmitted and received electromagnetic (EM) waves, resulting in a single acquisition channel. Dual polarimetric systems, on the other hand, transmit in one fixed polarization and receive in two orthogonal polarizations, resulting in two acquisition channels. Dual polarimetric systems are clearly more informative than single polarimetric systems and are increasingly used for a variety of remote sensing applications. In dual polarimetric systems, the choice of polarizations for the transmitter and the receiver is open. Choosing circular transmit polarization with coherent dual linear receive polarizations forms a special dual polarimetric system called hybrid polarimetry, which brings rotational invariance to the geometrical orientation of features in the scene and optimizes the design of the radar in terms of reliability, mass, and power constraints. The complete characterization of target scattering, however, requires fully polarimetric data, which can be acquired only by systems that transmit two orthogonal polarizations. This adds complexity to data acquisition and shortens the coverage area, or swath, of fully polarimetric images compared to that of dual or hybrid polarimetric images. Solutions that augment dual polarimetric data to fully polarimetric data would therefore provide full characterization and exploitation of the backscattered field over a wider coverage with less system complexity. Several methods for reconstructing fully polarimetric images from hybrid polarimetric data can be found in the literature. Although the improvements achieved by these reconstruction techniques are undeniable, the existing methods are mostly based on model assumptions (especially the assumption of reflection symmetry), which may limit their reliability and applicability in vegetation and forest scenarios. To overcome these problems, this paper proposes a new framework for reconstructing fully polarimetric information from hybrid polarimetric data. The framework uses deep learning to augment hybrid polarimetric data without relying on model assumptions. A convolutional neural network (CNN) with a specific architecture and loss function is defined for this augmentation problem, focusing on different scattering properties of the polarimetric data. In particular, the method controls the CNN training process with respect to several characteristic features of polarimetric images, defined by combining different terms in the cost, or loss, function. The proposed method is experimentally validated on real data sets and compared with a well-known standard approach from the literature. In the experiments, the reconstruction performance of the proposed framework is superior to that of conventional reconstruction methods. The pseudo fully polarimetric data reconstructed by the proposed method also agree well with the actual fully polarimetric images acquired by radar systems, confirming the reliability and efficiency of the method.
Keywords: SAR image, deep learning, convolutional neural network, deep neural network, SAR polarimetry
Procedia PDF Downloads 98
19684 A Review of Protocols and Guidelines Addressing the Exposure of Occupants to Electromagnetic Field (EMF) Radiation in Buildings
Authors: Shabnam Monadizadeh, Charles Kibert, Jiaxuan Li, Janghoon Woo, Ashish Asutosh, Samira Roostaei, Maryam Kouhirostami
Abstract:
A significant share of the technology that has emerged over the past several decades produces electromagnetic field (EMF) radiation. Communications devices, household appliances, industrial equipment, and medical devices all produce EMF radiation with a variety of frequencies, strengths, and ranges. Some EMF radiation, such as Extremely Low Frequency (ELF), Radio Frequency (RF), and ionizing-range radiation, has been shown to have harmful effects on human health. Depending on the frequency and strength of the radiation, EMF radiation can have health effects at the cellular level as well as at the brain, nervous-system, and cardiovascular levels. Health authorities have enacted regulations, locally and globally, that set critical values to limit the adverse effects of EMF radiation. By introducing a more comprehensive field of EMF radiation study and practice, architects and designers can design for a safer electromagnetic (EM) indoor environment and, as building and construction specialists, will be able to monitor and reduce EM radiation. This paper identifies the nature of EMF radiation in the built environment, the various EMF radiation sources, and its human health effects. It addresses European and US regulations for EMF radiation in buildings and provides a preliminary action plan. The challenges of developing measurement protocols for the various EMF radiation frequency ranges and of determining the effects of EMF radiation on building occupants are discussed. This paper argues that a mature method for measuring EMF radiation in building environments, and for linking these measurements to occupant health impacts, should be developed to provide adequate safeguards for building occupants and to guide future research.
Keywords: biological effects, electromagnetic field, building regulation, human health, healthy building, clean construction
Procedia PDF Downloads 188
19683 Approximate Confidence Interval for Effect Size Based on Bootstrap Resampling Method
Authors: S. Phanyaem
Abstract:
This paper presents confidence intervals for effect size based on the bootstrap resampling method. A meta-analytic confidence interval for effect size that is easy to compute is proposed. A Monte Carlo simulation study was conducted to compare the performance of the proposed confidence intervals with existing confidence intervals. The best confidence interval method will have a coverage probability close to 0.95. Simulation results show that the proposed confidence intervals perform well in terms of coverage probability and expected length.
Keywords: effect size, confidence interval, bootstrap method, resampling
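A percentile bootstrap interval for an effect size can be sketched as follows. The specific effect-size measure and interval construction used in the paper may differ; this sketch assumes pooled-SD Cohen's d and the simple percentile method:

```python
import random
import statistics

def cohens_d(x, y):
    """Standardized mean difference (pooled-SD Cohen's d)."""
    nx, ny = len(x), len(y)
    vx, vy = statistics.variance(x), statistics.variance(y)
    pooled = ((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2)
    return (statistics.mean(x) - statistics.mean(y)) / pooled ** 0.5

def bootstrap_ci(x, y, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap CI: resample each group with replacement,
    recompute d, and take the alpha/2 and 1-alpha/2 quantiles."""
    rng = random.Random(seed)
    boots = []
    for _ in range(n_boot):
        bx = [rng.choice(x) for _ in x]
        by = [rng.choice(y) for _ in y]
        try:
            boots.append(cohens_d(bx, by))
        except ZeroDivisionError:  # degenerate zero-variance resample
            continue
    boots.sort()
    lo = boots[int((alpha / 2) * len(boots))]
    hi = boots[int((1 - alpha / 2) * len(boots)) - 1]
    return lo, hi
```

Coverage probability of such an interval is exactly what the paper's Monte Carlo study evaluates: simulate many datasets with a known true effect size and count how often the interval contains it.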
Procedia PDF Downloads 597
19682 On the Network Packet Loss Tolerance of SVM Based Activity Recognition
Authors: Gamze Uslu, Sebnem Baydere, Alper K. Demir
Abstract:
In this study, the data loss tolerance of a Support Vector Machine (SVM) based activity recognition model, and its multi-activity classification performance when data are received over a lossy wireless sensor network, are examined. First, the classification algorithm is evaluated for resilience to random data loss using 3D acceleration sensor data for the sitting, lying, walking and standing actions. The results show that the proposed classification method can recognize these activities successfully despite high data loss. Second, the effect of differentiated quality of service on activity recognition success is measured with activity data acquired over a multi-hop wireless sensor network, which introduces high data loss. The effect of the number of nodes on reliability and multi-activity classification success is demonstrated in a simulation environment. To the best of our knowledge, the effect of data loss in a wireless sensor network on the activity detection success rate of an SVM-based classification algorithm has not been studied before.
Keywords: activity recognition, support vector machines, acceleration sensor, wireless sensor networks, packet loss
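The random-loss part of such an evaluation can be sketched as below. To keep the sketch dependency-free, a nearest-centroid classifier stands in for the SVM, and packet loss is simulated by dropping each acceleration sample independently with a given probability; the actual study uses an SVM and real multi-hop network traces.

```python
import random

def nearest_centroid_fit(data):
    """data: {activity_label: [3D acceleration vectors]}.
    Returns per-class centroids (a lightweight stand-in for the SVM)."""
    cents = {}
    for label, vecs in data.items():
        n, p = len(vecs), len(vecs[0])
        cents[label] = [sum(v[j] for v in vecs) / n for j in range(p)]
    return cents

def classify(cents, vec):
    """Assign vec to the class with the nearest centroid."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(cents, key=lambda lab: dist2(cents[lab], vec))

def drop_packets(vectors, loss_rate, rng):
    """Simulate random packet loss: each sensor sample survives
    with probability (1 - loss_rate)."""
    kept = [v for v in vectors if rng.random() >= loss_rate]
    return kept or vectors[:1]  # keep at least one sample
```

Even with half the training samples dropped, the class centroids barely move for well-separated activities, which mirrors the reported robustness to high data loss.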
Procedia PDF Downloads 478
19681 Portrayal of Women in Television Advertisement
Authors: Priya Sarah Vijoy
Abstract:
The aim of this study is to analyze the portrayal of women in television advertisements. Advertising dates back several hundred years: from the beginning, the seller wanted his goods to be sold and used various techniques to achieve that objective. Advertisements have consistently confined women to traditional mother, home, or beauty/sex-oriented roles that are not representative of women's diversity. Currently, television stereotyping of women is one of the dominating forces in the media that degrade women and limit their representation. The study therefore analyzes how women are portrayed in television advertisements and whether the roles of women in television advertisements are related to the product or not.
Keywords: advertising, stereotyping, television, women
Procedia PDF Downloads 446
19680 Off-Line Text-Independent Arabic Writer Identification Using Optimum Codebooks
Authors: Ahmed Abdullah Ahmed
Abstract:
The task of recognizing the writer of a handwritten text has been an attractive research problem in the document analysis and recognition community, with applications in handwriting forensics, paleography, document examination and handwriting recognition. This research presents an automatic method for writer recognition from digitized images of unconstrained writings. Although great effort has been made by previous studies, their performance, especially in terms of accuracy, falls short, and room for improvement is still wide open. The proposed technique employs optimal codebook-based writer characterization, where each writing sample is represented by a set of features computed from two codebooks, beginning and ending. Unlike most classical codebook-based approaches, which segment the writing into graphemes, this study fragments particular areas of the writing: the beginning and ending strokes. The proposed method starts with contour detection to extract significant information from the handwriting; curve fragmentation is then employed to divide the beginning and ending zones into small fragments. Similar fragments of beginning strokes are grouped together to create the beginning cluster; similarly, the ending strokes are grouped to create the ending cluster. These two clusters lead to two codebooks (beginning and ending), formed by choosing the center of each group of similar fragments. Writings under study are then represented by the probability of occurrence of the codebook patterns, and this probability distribution is used to characterize each writer. Two writings are compared by computing the distance between their respective probability distributions. Evaluations were carried out on the standard ICFHR dataset of 206 writers using the beginning and ending codebooks separately. The ending codebook achieved the highest identification rate, 98.23%, which is the best result so far on the ICFHR dataset.
Keywords: off-line text-independent writer identification, feature extraction, codebook, fragments
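The final characterization step, representing each writing by the probability of occurrence of codebook patterns and comparing distributions by a distance, can be sketched as follows. The abstract does not specify which distance is used, so a chi-square distance is assumed here:

```python
def pattern_distribution(fragment_ids, codebook_size):
    """Probability of occurrence of each codebook pattern in a writing,
    given the index of the nearest codebook pattern for every fragment."""
    counts = [0] * codebook_size
    for i in fragment_ids:
        counts[i] += 1
    total = sum(counts)
    return [c / total for c in counts]

def chi2_distance(p, q, eps=1e-12):
    """Chi-square distance between two probability distributions
    (an assumed choice; the paper's distance is not stated)."""
    return 0.5 * sum((a - b) ** 2 / (a + b + eps) for a, b in zip(p, q))

def identify_writer(query, references):
    """references: {writer: distribution}. The writer whose reference
    distribution is nearest to the query distribution wins."""
    return min(references, key=lambda w: chi2_distance(query, references[w]))
```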
Procedia PDF Downloads 514
19679 Using a Hybrid Method to Eradicate Bamboo Growth along the Route of Overhead Power Lines
Authors: Miriam Eduful
Abstract:
The Electricity Company of Ghana (ECG) is obliged by the Public Utility and Regulation Commission to meet set performance indices. However, in certain parts of the country, bamboo-related power interruptions have become a challenge. The growth rate of the bamboo is such that the cost of regular vegetation maintenance along the route of the overhead power lines has become prohibitive. To address the problem, several methods and techniques of bamboo eradication have been used; some involve the application of chemical compounds that are considered inimical and dangerous to the environment. In this paper, three methods of bamboo eradication along the route of the ECG overhead power lines are investigated. A hybrid method was found to be very effective and ecologically friendly. The method is locally available and comparatively inexpensive to apply.
Keywords: bamboo, eradication, hybrid method, gly gold
Procedia PDF Downloads 372
19678 A Quick Method for Seismic Vulnerability Evaluation of Offshore Structures by Static and Dynamic Nonlinear Analyses
Authors: Somayyeh Karimiyan
Abstract:
To evaluate the seismic vulnerability of vital offshore structures with the highest possible precision, Nonlinear Time History Analysis (NLTHA) is the most reliable method. However, since it is very time-consuming, a quick procedure is greatly desired. This paper presents a quick method combining Push Over Analysis (POA) and NLTHA. The POA is performed first to recognize the more critical members, and then NLTHA is performed to evaluate those critical members' vulnerability more precisely. The proposed method has been applied to a jacket-type structure. Results show that combining POA and NLTHA is a reliable seismic evaluation method, and also that no single earthquake characteristic alone can be a dominant factor in vulnerability evaluation.
Keywords: jacket structure, seismic evaluation, push-over and nonlinear time history analyses, critical members
Procedia PDF Downloads 285
19677 Diagnosis of Induction Machine Faults by DWT
Authors: Hamidreza Akbari
Abstract:
In this paper, time-frequency analysis of the stator startup current is carried out to detect inclined eccentricity in an induction motor. For this purpose, the discrete wavelet transform (DWT) is used. Data are obtained from simulations using the winding function approach. The results show the validity of the approach for detecting the fault and discriminating it from other faults.
Keywords: induction machine, fault, DWT, electric
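A single level of the DWT can be illustrated with the Haar wavelet; the abstract does not name the mother wavelet actually used, so Haar is an assumption made here for simplicity:

```python
def haar_dwt(signal):
    """One level of the discrete Haar wavelet transform: pairwise
    low-pass (approximation) and high-pass (detail) coefficients.
    Repeating the transform on the approximation gives the multi-level
    decomposition used for time-frequency analysis of a startup current."""
    s = 2 ** 0.5
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    return approx, detail
```

Fault-related transients in the current show up as localized bursts in the detail coefficients at particular decomposition levels, which is what distinguishes eccentricity signatures from other faults.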
Procedia PDF Downloads 352
19676 Application of Residual Correction Method on Hyperbolic Thermoelastic Response of Hollow Spherical Medium in Rapid Transient Heat Conduction
Authors: Po-Jen Su, Huann-Ming Chou
Abstract:
In this article, we use the residual correction method to deal with transient thermoelastic problems in a hollow spherical region when the continuum medium possesses spherically isotropic thermoelastic properties. Based on linear thermoelastic theory, the equations of hyperbolic heat conduction and thermoelastic motion are combined to establish a thermoelastic dynamic model that accounts for the deformation acceleration effect and the non-Fourier effect under transient thermal shock. Approximate solutions for the temperature and displacement distributions are obtained using the residual correction method, based on the maximum principle in combination with the finite difference method, making it easier and faster to obtain upper and lower approximations of the exact solutions. The proposed method is found to be an effective numerical method with satisfactory accuracy. Moreover, the results show that the transient thermal shock induced by deformation acceleration is enhanced by non-Fourier heat conduction, with increased peak stress. The influence on the stress increases with the thermal relaxation time.
Keywords: maximum principle, non-Fourier heat conduction, residual correction method, thermo-elastic response
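The maximum-principle and finite-difference ingredients can be illustrated in their simplest setting: an explicit step for the classical (Fourier) 1D heat equation, where the discrete maximum principle holds for r = alpha*dt/dx^2 <= 1/2. The paper's hyperbolic (non-Fourier) model adds a relaxation term and spherical geometry that this sketch omits:

```python
def heat_step(T, r):
    """One explicit finite-difference step of the Fourier heat equation,
        T_i^{n+1} = T_i + r * (T_{i+1} - 2*T_i + T_{i-1}),
    with fixed boundary values. For r <= 1/2 the update is a convex
    combination of neighboring values, so the solution stays within the
    initial min/max bounds: the discrete maximum principle on which the
    residual correction method's upper/lower approximations rely."""
    new = T[:]
    for i in range(1, len(T) - 1):
        new[i] = T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
    return new
```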
Procedia PDF Downloads 429
19675 Novel Technique for Calculating Surface Potential Gradient of Overhead Line Conductors
Authors: Sudip Sudhir Godbole
Abstract:
In transmission lines, the surface potential gradient is a critical design parameter for planning overhead lines, as it determines the level of corona loss (CL), radio interference (RI) and audible noise (AN). As transmission voltage levels increase to permit bulk power transfer, bundle conductor configurations are used, and it becomes more complex to find the accurate surface stress of a bundle configuration. The majority of existing models for surface gradient calculation are based on analytical methods, which restricts their application in simulating complex surface geometry. This paper proposes a novel technique that utilizes both analytical and numerical procedures to predict the surface gradient. A 400 kV transmission line configuration has been selected as an example to compare the results of the different methods. The different strand shapes are a key variable in determining the surface gradient.
Keywords: surface gradient, Maxwell potential coefficient method, Markt and Mengele's method, successive images method, charge simulation method, finite element method
Procedia PDF Downloads 541
19674 Spatial Object-Oriented Template Matching Algorithm Using Normalized Cross-Correlation Criterion for Tracking Aerial Image Scene
Authors: Jigg Pelayo, Ricardo Villar
Abstract:
With the development of aerial laser scanning in the Philippine geospatial industry, research on remote sensing and machine vision technology has become a trend. Object detection via template matching is one of its applications, characterized as fast and real-time. This paper presents a robust pattern matching algorithm based on the normalized cross-correlation (NCC) criterion function within object-based image analysis (OBIA), utilizing high-resolution aerial imagery and low-density LiDAR data. The height information from laser scanning provides an effective partitioning order, improving the hierarchical class feature pattern and allowing unnecessary calculations to be skipped. Since detection is executed in an object-oriented platform, mathematical morphology and multi-level filter algorithms were established to effectively avoid the influence of noise, small distortions and fluctuating image saturation that affect the feature recognition rate. Furthermore, the scheme is evaluated to assess its performance in different situations and to inspect the computational complexity of the algorithms. Its effectiveness is demonstrated in areas of Misamis Oriental province, achieving an overall accuracy above 91%. The results also portray the potential and efficiency of the implemented algorithm under different lighting conditions.
Keywords: algorithm, LiDAR, object recognition, OBIA
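The NCC criterion at the heart of the matcher can be sketched in plain Python (grayscale image as a list of rows); the OBIA partitioning, morphology and filtering stages described above are omitted:

```python
def ncc(window, template):
    """Normalized cross-correlation between two equal-size flattened
    patches: mean-centered dot product divided by the patch norms, so the
    score is invariant to brightness offset and contrast scaling."""
    n = len(window)
    mw = sum(window) / n
    mt = sum(template) / n
    num = sum((w - mw) * (t - mt) for w, t in zip(window, template))
    dw = sum((w - mw) ** 2 for w in window) ** 0.5
    dt = sum((t - mt) ** 2 for t in template) ** 0.5
    if dw == 0 or dt == 0:
        return 0.0  # flat patch: correlation undefined, treat as no match
    return num / (dw * dt)

def match_template(image, template):
    """Slide the template over a 2D image (list of rows) and return the
    top-left (row, col) position with the highest NCC score."""
    th, tw = len(template), len(template[0])
    flat_t = [v for row in template for v in row]
    best, best_pos = -2.0, (0, 0)
    for y in range(len(image) - th + 1):
        for x in range(len(image[0]) - tw + 1):
            flat_w = [image[y + j][x + i] for j in range(th) for i in range(tw)]
            score = ncc(flat_w, flat_t)
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos, best
```

The OBIA scheme's advantage is that this exhaustive sliding search is run only inside candidate objects selected by the LiDAR height partitioning, which is what makes the matching fast in practice.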
Procedia PDF Downloads 249
19673 Rectenna Modeling Based on MoM-GEC Method for RF Energy Harvesting
Authors: Soulayma Smirani, Mourad Aidi, Taoufik Aguili
Abstract:
Energy harvesting has arisen as a prominent research area for low-power delivery to RF devices, and rectennas have become a key element in this technology. In this paper, electromagnetic modeling of a rectenna system is presented. In our approach, a hybrid technique is demonstrated that associates the method of auxiliary sources (MAS) with MoM-GEC (the method of moments combined with the generalized equivalent circuit technique). Auxiliary sources are used to substitute for specific electronic devices, yielding a simple and controllable model that can easily be interconnected to form different rectenna array topologies for greater energy harvesting. Finally, simulation results show the feasibility and simplicity of the proposed rectenna model, with high precision and computational efficiency.
Keywords: computational electromagnetics, MoM-GEC method, rectennas, RF energy harvesting
Procedia PDF Downloads 177
19672 Status and Results from EXO-200
Authors: Ryan Maclellan
Abstract:
EXO-200 has provided one of the most sensitive searches for neutrinoless double-beta decay, utilizing 175 kg of enriched liquid xenon in an ultra-low-background time projection chamber. This detector has demonstrated excellent energy resolution and background rejection capabilities. Using the first two years of data, EXO-200 has set a limit of 1.1x10^25 years at 90% C.L. on the neutrinoless double-beta decay half-life of Xe-136. The experiment has experienced a brief hiatus in data taking during a temporary shutdown of its host facility, the Waste Isolation Pilot Plant. EXO-200 expects to resume data taking in earnest this fall with upgraded detector electronics. Results from the analysis of EXO-200 data and an update on the current status of EXO-200 will be presented.
Keywords: double-beta, Majorana, neutrino, neutrinoless
Procedia PDF Downloads 417
19671 Progressive Damage Analysis of Mechanically Connected Composites
Authors: Şeyma Saliha Fidan, Ozgur Serin, Ata Mugan
Abstract:
When performing verification analyses of the static and dynamic loads to which composite structures used in aviation are exposed, it is necessary to obtain the bearing strength limit value for mechanically connected composite structures. For this purpose, various tests are carried out in accordance with aviation standards. Many companies around the world perform these tests, but the test costs are very high. Because of the need to produce coupons, the high cost of coupon materials, and the long test times, it is desirable to simulate these tests on the computer. To this end, various test coupons were produced using the reinforcement and alignment angles of the composite radomes integrated into the aircraft. Glass fiber reinforced and quartz prepregs were used in producing the coupons. Tests performed according to the American Society for Testing and Materials (ASTM) D5961 Procedure C standard were simulated on the computer. The analysis model was created in three dimensions in order to model the bolt-hole contact surface realistically and obtain the exact bearing strength value. The finite element model was built with the Analysis System (ANSYS). Since a physical break cannot occur in analyses carried out in the virtual environment, a hypothetical break is realized by reducing the material properties. The material property reduction coefficient was set to 10%, which is stated in the literature to give the most realistic approach. There are various theories for this method, which is called progressive failure analysis. Because the Hashin theory did not match our experimental results, the Puck progressive damage method was used in all coupon analyses. When the experimental and numerical results are compared, the initial damage and resulting force-drop points, the maximum damage load values, and the bearing strength value are very close. Furthermore, low error rates and similar damage patterns were obtained in both test and simulation models. In addition, the effects of various parameters, such as pre-stress, use of bushings, the ratio of the distance between the bolt-hole center and the plate edge to the hole diameter (E/D), the ratio of plate width to hole diameter (W/D), and hot-wet environmental conditions, were investigated with respect to the bearing strength of the composite structure.
Keywords: puck, finite element, bolted joint, composite
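The "hypothetical break by reducing material properties" can be sketched as a one-dimensional degradation loop. The 10% figure is applied here as the residual stiffness after first failure, and a simple max-stress check stands in for the Puck criterion, which needs the full stress tensor; both choices are illustrative assumptions, not the paper's actual implementation:

```python
def progressive_failure(strain_steps, e0, strength, knockdown=0.1):
    """Progressive damage sketch: strain is increased step by step; when
    the stress exceeds the strength, the modulus is knocked down to
    knockdown * e0 (a hypothetical break via property reduction), which
    produces the characteristic force drop seen in bearing tests.
    Returns the stress history over the load steps."""
    e = e0
    history = []
    for strain in strain_steps:
        stress = e * strain
        if stress > strength and e == e0:  # first failure: degrade once
            e = knockdown * e0
            stress = e * strain
        history.append(stress)
    return history
```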
Procedia PDF Downloads 105
19670 Development and Validation of a HPLC Method for Standardization of Methanolic Extract of Hypericum sinaicum Hochst
Authors: Taghreed A. Ibrahim, Atef A. El-Hela, Hala M. El-Hefnawy
Abstract:
The chromatographic profile of the methanol extract of Hypericum sinaicum was determined using HPLC-DAD. Apigenin was used as an external standard in the development and validation of the HPLC method. The proposed method is simple, rapid and reliable, and can be successfully applied for standardization of the Hypericum sinaicum methanol extract.
Keywords: quality control, standardization, flavonoids, methanol extract
Procedia PDF Downloads 507
19669 Evolution of Propiconazole and Tebuconazole Residues through the Post-Harvest Application in 'Angeleno' Plum
Authors: M. J. Rodríguez, F. M. Sánchez, B. Velardo, P. Calvo, M. J. Serradilla, J. Delgado, J. M. López
Abstract:
The main problems in the storage and subsequent transport of fruit are the decays that develop and reduce quality in destination markets. Nowadays, there is increasing interest in the use of compounds to avoid decay in post-harvest handling. Triazole fungicides are agrochemicals widely used in the agricultural industry due to their wide spectrum of action, and in some cases they are used in citrus fruit post-harvest treatment. Their use is not authorized in plum post-harvest treatment; with a view to possible future authorization, the evolution of propiconazole and tebuconazole residues is studied after post-harvest application in 'Angeleno' plum.
Keywords: maximum residue limit (MRL), triazole fungicides, decay, Prunus salicina
Procedia PDF Downloads 318
19668 A Mutually Exclusive Task Generation Method Based on Data Augmentation
Authors: Haojie Wang, Xun Li, Rui Yin
Abstract:
In order to solve memorization overfitting in the meta-learning MAML algorithm, a method of generating mutually exclusive tasks based on data augmentation is proposed. This method generates a mutex task by mapping one feature of the data to multiple labels, so that the generated mutex task is inconsistent with the data distribution of the initial dataset. Because generating mutex tasks for all data would produce a large amount of invalid data and, in the worst case, lead to exponential growth of computation, this paper also proposes a key data extraction method that extracts only part of the data to generate the mutex tasks. The experiments show that the method of generating mutually exclusive tasks can effectively solve memorization overfitting in the meta-learning MAML algorithm.
Keywords: data augmentation, mutex task generation, meta-learning, text classification
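One way to realize the feature-to-multiple-labels idea is to permute the label assignment of a task so that no feature keeps its source-distribution label; the support set then becomes the only route to solving the task, which is what defeats memorization. A hedged sketch (the authors' exact construction may differ):

```python
import random

def make_mutex_task(examples, num_classes, seed=0):
    """Create a 'mutually exclusive' task from (feature, label) pairs by
    permuting the label assignment, so each feature now corresponds to a
    label that disagrees with the source-data distribution. A model can
    then only solve the task from the support set, not by memorizing the
    original feature-label mapping."""
    rng = random.Random(seed)
    labels = list(range(num_classes))
    # Draw a derangement-style permutation: no label maps to itself.
    while True:
        perm = labels[:]
        rng.shuffle(perm)
        if all(a != b for a, b in zip(labels, perm)):
            break
    return [(x, perm[y]) for x, y in examples]

# Hypothetical text-classification examples (feature, label).
data = [("doc_a", 0), ("doc_b", 1), ("doc_c", 2)]
task = make_mutex_task(data, num_classes=3)
```

Applied with different seeds, the same data yields many tasks whose labelings are mutually inconsistent with the initial dataset.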
Procedia PDF Downloads 99
19667 Moment Estimators of the Parameters of Zero-One Inflated Negative Binomial Distribution
Authors: Rafid Saeed Abdulrazak Alshkaki
Abstract:
In this paper, the zero-one inflated negative binomial distribution is considered along with some of its structural properties, and its parameters are estimated using the method of moments. It is found that the method of moments is not a proper method for estimating the parameters of zero-one inflated negative binomial models and may give incorrect conclusions.
Keywords: zero-one inflated models, negative binomial distribution, moment estimators, non-negative integer sampling
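For the plain (uninflated) negative binomial, the method of moments reduces to two equations in the sample mean and variance; the zero-one inflated case adds inflation parameters and further moment equations, which is where the estimator becomes problematic. A sketch of the uninflated case:

```python
import numpy as np

def nb_moment_estimates(x):
    """Method-of-moments estimates (r, p) for a plain negative binomial
    sample, using E[X] = r(1-p)/p and Var[X] = r(1-p)/p**2.
    Solving gives p = mean/var and r = mean**2 / (var - mean).
    Requires var > mean (overdispersion); otherwise the moment equations
    have no valid solution -- the same kind of breakdown the abstract
    reports for the zero-one inflated case."""
    mean, var = np.mean(x), np.var(x, ddof=1)
    if var <= mean:
        raise ValueError("sample not overdispersed: no valid NB moment solution")
    p = mean / var
    r = mean**2 / (var - mean)
    return r, p

rng = np.random.default_rng(42)
sample = rng.negative_binomial(n=5, p=0.4, size=200_000)
r_hat, p_hat = nb_moment_estimates(sample)   # close to the true r=5, p=0.4
```

The `var <= mean` guard is exactly the kind of sample-dependent failure mode that multiplies once extra inflation parameters are added.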
Procedia PDF Downloads 297
19666 Modified Newton's Iterative Method for Solving System of Nonlinear Equations in Two Variables
Authors: Sara Mahesar, Saleem M. Chandio, Hira Soomro
Abstract:
A nonlinear system of equations in two variables is a system that contains variables of degree two or higher or that comprises transcendental functions. Mathematical modeling of numerous physical problems leads to systems of nonlinear equations, and solving such systems is a central challenge in applied and pure mathematics. Numerical techniques are mainly used where analytical methods fail, which leads to approximate rather than exact solutions; no general analytical technique exists for finding the exact roots of a system of nonlinear equations. Various methods have been proposed to solve such systems with improved rates of convergence and accuracy. In this paper, a new scheme is developed for solving systems of nonlinear equations in two variables. The iterative scheme proposed here is a modified form of the conventional Newton's Method (CN), whose order of convergence is two, whereas the order of convergence of the devised technique is three. Furthermore, a detailed error and convergence analysis of the proposed method is given. Additionally, various numerical test problems are compared with the results of the conventional Newton's Method (CN), which confirms the theoretical properties of the proposed method.
Keywords: conventional Newton's method, modified Newton's method, order of convergence, system of nonlinear equations
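A sketch of the comparison the abstract describes: conventional Newton's method (order two) against a two-step modification that reuses the Jacobian for a corrector step, which is a standard way to reach order three with only one Jacobian evaluation per iteration. The paper's exact scheme may differ; this is an illustrative variant:

```python
import numpy as np

def newton2(F, J, x0, tol=1e-12, max_iter=50):
    """Conventional Newton's method (CN) for F(x) = 0, x in R^2. Order 2."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        dx = np.linalg.solve(J(x), F(x))
        x = x - dx
        if np.linalg.norm(dx) < tol:
            break
    return x

def modified_newton2(F, J, x0, tol=1e-12, max_iter=50):
    """Two-step modification: freeze the Jacobian at x_n and reuse it for
    a corrector step, y = x - J(x)^-1 F(x); x_new = y - J(x)^-1 F(y).
    Order 3, one Jacobian evaluation per iteration (a standard
    third-order variant, not necessarily the paper's exact scheme)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Jx = J(x)
        y = x - np.linalg.solve(Jx, F(x))
        x_new = y - np.linalg.solve(Jx, F(y))
        if np.linalg.norm(x_new - x) < tol:
            x = x_new
            break
        x = x_new
    return x

# Example system with a transcendental term: x^2 + y^2 = 4, e^x + y = 1.
F = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, np.exp(v[0]) + v[1] - 1.0])
J = lambda v: np.array([[2*v[0], 2*v[1]], [np.exp(v[0]), 1.0]])
root = modified_newton2(F, J, x0=[1.0, -1.5])
```

Counting iterations to reach a fixed tolerance from the same starting point is the usual way to exhibit the order-three versus order-two behavior numerically.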
Procedia PDF Downloads 261
19665 Analytical Soliton Solutions of the Fractional Jaulent-Miodek System
Authors: Sajeda Elbashabsheh, Kamel Al-Khaled
Abstract:
This paper applies a modified Laplace Adomian decomposition method to solve the time-fractional Jaulent-Miodek system, using the Caputo fractional derivative. The method produces convergent series solutions with easily computable components. The effectiveness and applicability of the method are demonstrated by comparing its results with those of prior studies; results are presented in tables and figures. These solutions may be important for the explanation of some practical physical phenomena. All computations and figures in the work are done using Mathematica. The numerical results demonstrate that the current method is effective, reliable, and simple to implement for nonlinear fractional partial differential equations.
Keywords: approximate solutions, Jaulent-Miodek system, Adomian decomposition method, solitons
Procedia PDF Downloads 48
19664 Developing an Out-of-Distribution Generalization Model Selection Framework through Impurity and Randomness Measurements and a Bias Index
Authors: Todd Zhou, Mikhail Yurochkin
Abstract:
Out-of-distribution (OOD) detection is receiving increasing attention in the machine learning research community, boosted by recent technologies such as autonomous driving and image processing. This burgeoning field has created a need for more effective and efficient out-of-distribution generalization methods. Without access to label information, deploying machine learning models to out-of-distribution domains becomes extremely challenging, since it is impossible to evaluate model performance on unseen domains. To tackle this difficulty, we designed a model selection pipeline algorithm and developed a model selection framework with different impurity and randomness measurements to evaluate and choose the best-performing models for out-of-distribution data. By exploring different randomness scores based on predicted probabilities, we adopted the out-of-distribution entropy and developed a custom-designed score, "CombinedScore," as the evaluation criterion. This proposed score was created by adding labeled source information into the judging space of the uncertainty entropy score using the harmonic mean. Furthermore, prediction bias was explored through an equality-of-opportunity violation measurement. We also improved machine learning model performance through model calibration. The effectiveness of the framework with the proposed evaluation criteria was validated on the Folktables American Community Survey (ACS) datasets.
Keywords: model selection, domain generalization, model fairness, randomness measurements, bias index
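A speculative reading of the scoring described above, for illustration only: the randomness score is the mean predictive entropy on unlabeled OOD data, and the CombinedScore folds in labeled source accuracy via a harmonic mean. The normalization of the entropy term is an assumption, not taken from the paper:

```python
import numpy as np

def mean_entropy(probs):
    """Average predictive entropy of softmax outputs, shape (n, k).
    Higher entropy = more uncertain (more 'random') predictions."""
    p = np.clip(probs, 1e-12, 1.0)
    return float(np.mean(-np.sum(p * np.log(p), axis=1)))

def combined_score(source_accuracy, ood_probs):
    """Hypothetical form of the 'CombinedScore': harmonic mean of the
    labeled source accuracy and a confidence term derived from the
    unlabeled OOD entropy (1 - entropy / log k, so higher is better).
    The paper's exact formula may differ."""
    k = ood_probs.shape[1]
    confidence = 1.0 - mean_entropy(ood_probs) / np.log(k)
    return 2 * source_accuracy * confidence / (source_accuracy + confidence)

# Hypothetical OOD softmax outputs for a 3-class model.
probs = np.array([[0.9, 0.05, 0.05],
                  [0.8, 0.1, 0.1]])
score = combined_score(source_accuracy=0.85, ood_probs=probs)
```

Under this reading, model selection ranks candidate models by `score` and keeps the highest, with no OOD labels required.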
Procedia PDF Downloads 126
19663 Considerations for Effectively Using Probability of Failure as a Means of Slope Design Appraisal for Homogeneous and Heterogeneous Rock Masses
Authors: Neil Bar, Andrew Heweston
Abstract:
Probability of failure (PF) often appears alongside factor of safety (FS) in design acceptance criteria for rock slope, underground excavation and open pit mine designs. However, the design acceptance criteria generally provide no guidance on how PF should be calculated for homogeneous and heterogeneous rock masses, or on what qualifies as a 'reasonable' PF assessment for a given slope design. Observational and kinematic methods were widely used in the 1990s until advances in computing permitted the routine use of numerical modelling. In the 2000s and early 2010s, PF in numerical models was generally calculated using the point estimate method. More recently, some limit equilibrium analysis software offers statistical parameter inputs along with Monte-Carlo or Latin-Hypercube sampling methods to automatically calculate PF. Factors including rock type and density, weathering and alteration, intact rock strength, rock mass quality and shear strength, the location and orientation of geologic structure, shear strength of geologic structure and groundwater pore pressure influence the stability of rock slopes. Significant engineering and geological judgment, interpretation and data interpolation are usually applied in determining these factors and amalgamating them into a geotechnical model which can then be analysed. Most factors are estimated 'approximately' or with allowances for some variability rather than 'exactly'. When it comes to numerical modelling, some of these factors are then treated deterministically (i.e. as exact values), while others have probabilistic inputs based on the user's discretion and understanding of the problem being analysed. This paper discusses the importance of understanding the key aspects of slope design for homogeneous and heterogeneous rock masses and how they can be translated into reasonable PF assessments where the data permits.
A case study from a large open pit gold mine in a complex geological setting in Western Australia is presented to illustrate how PF can be calculated using different methods and can yield markedly different results. Ultimately, sound engineering judgement and logic are often required to decipher the true meaning and significance (if any) of some PF results.
Keywords: probability of failure, point estimate method, Monte-Carlo simulations, sensitivity analysis, slope stability
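The Monte-Carlo approach mentioned above can be sketched in a few lines: treat some inputs probabilistically, compute a factor of safety for each sample, and take PF as the fraction of samples with FS below one. This toy uses a simplified dry infinite-slope FS expression and hypothetical cohesion and friction-angle statistics, not the case-study model:

```python
import numpy as np

def factor_of_safety(c, phi_deg, gamma=26.0, depth=20.0, beta_deg=35.0):
    """Simplified infinite-slope factor of safety for a dry slope:
    FS = c / (gamma * z * sin(beta) * cos(beta)) + tan(phi) / tan(beta),
    with c in kPa, unit weight gamma in kN/m^3, depth z in m."""
    beta = np.radians(beta_deg)
    phi = np.radians(phi_deg)
    return c / (gamma * depth * np.sin(beta) * np.cos(beta)) + np.tan(phi) / np.tan(beta)

rng = np.random.default_rng(7)
n = 100_000
# Hypothetical statistical inputs: cohesion and friction angle are
# probabilistic; slope geometry and unit weight are deterministic.
c = rng.normal(loc=80.0, scale=25.0, size=n)     # cohesion, kPa
phi = rng.normal(loc=30.0, scale=4.0, size=n)    # friction angle, degrees
fs = factor_of_safety(np.clip(c, 0.0, None), phi)
pf = np.mean(fs < 1.0)                           # probability of failure
print(f"mean FS = {fs.mean():.2f}, PF = {pf:.1%}")
```

The point estimate method replaces the sampling loop with a small set of weighted evaluations at perturbed inputs; as the abstract notes, the two approaches can give markedly different PF values for the same geotechnical model.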
Procedia PDF Downloads 210
19662 Extraction of Natural Colorant from the Flowers of Flame of Forest Using Ultrasound
Authors: Sunny Arora, Meghal A. Desai
Abstract:
With the impetus towards green consumerism and the implementation of sustainable techniques, the consumption of natural products and the use of environmentally friendly techniques have gained accelerated acceptance. Butein, a natural colorant, has many medicinal properties apart from its use in the dyeing industry. Extraction of butein from the flowers of flame of forest was carried out using an ultrasonication bath. Solid loading (2-6 g), extraction time (30-50 min), volume of solvent (30-50 mL), and type of solvent (methanol, ethanol, and water) were studied to maximize the yield of butein using the Taguchi method. The highest yield of butein, 4.67% (w/w), was obtained using 4 g of plant material, 40 min of extraction time, and 30 mL of methanol as the solvent. The present method provided a large reduction in extraction time compared to the conventional method of extraction. Hence, the outcome of the present investigation could be used to develop the method at a larger scale.
Keywords: butein, flowers of Flame of the Forest, Taguchi method, ultrasonic bath
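The four factors at three levels each fit exactly into a standard L9(3^4) orthogonal array, which is how a Taguchi design keeps the experiment at nine runs instead of the 3^4 = 81 of a full factorial. A minimal sketch of the main-effects analysis, with hypothetical yields used only for illustration (they are arranged so the analysis points to the optimum reported above: 4 g, 40 min, 30 mL, methanol):

```python
import numpy as np

# Standard L9(3^4) orthogonal array (levels coded 0, 1, 2) for the four
# three-level factors in the study: solid loading (2/4/6 g), extraction
# time (30/40/50 min), solvent volume (30/40/50 mL), and solvent type
# (methanol/ethanol/water).
L9 = np.array([[0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
               [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
               [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0]])

# Hypothetical butein yields (% w/w) for the nine runs -- illustrative
# numbers only, not the paper's data.
yields = np.array([3.7, 3.5, 2.4, 3.8, 4.2, 4.0, 3.2, 3.9, 3.4])

# Main-effects ("larger is better") analysis: mean yield at each level of
# each factor; orthogonality of the array means the level with the
# highest mean is the best setting for that factor.
factors = ["solid loading", "time", "volume", "solvent"]
best = {}
for j, name in enumerate(factors):
    level_means = [yields[L9[:, j] == lvl].mean() for lvl in range(3)]
    best[name] = int(np.argmax(level_means))
print(best)  # best level index per factor
```

With these illustrative yields the selected levels are 1, 1, 0, 0, i.e. 4 g, 40 min, 30 mL, methanol, mirroring the optimum the abstract reports.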
Procedia PDF Downloads 479