Search results for: mixture regression model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19724

19334 A Novel Approach towards Test Case Prioritization Technique

Authors: Kamna Solanki, Yudhvir Singh, Sandeep Dalal

Abstract:

Software testing is a time- and cost-intensive process. Scrutiny of the code and rigorous testing are required to identify and rectify putative bugs. The process of bug identification and correction is continuous in nature, and some bugs are removed only after the software has been launched in the market. This process of validating the altered code during the maintenance phase is termed regression testing. Regression testing is ubiquitously subject to resource constraints; therefore, selecting an appropriate set of test cases from the entire gamut of available test cases is a critical issue for regression test planning. This paper presents a novel method for designing a suitable prioritization process to optimize the fault detection rate and the performance of regression testing under predefined constraints. The proposed test case prioritization method, m-ACO, alters the food-source selection criteria of natural ants and is essentially a modified version of Ant Colony Optimization (ACO). The proposed m-ACO approach has been coded in Perl, and the results are validated on three examples by computing the Average Percentage of Faults Detected (APFD) metric.
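
The APFD metric used for validation has a standard closed form: APFD = 1 - (TF1 + ... + TFm)/(n m) + 1/(2n), where TFi is the position of the first test in the ordering that reveals fault i. A minimal Python sketch of the computation is given below; the fault matrix and test ordering are hypothetical illustrations, not data from the paper.

```python
# Average Percentage of Faults Detected (APFD) for a prioritized test ordering.
# APFD = 1 - (TF1 + ... + TFm) / (n * m) + 1 / (2 * n),
# where TFi is the 1-based position of the first test that reveals fault i.

def apfd(ordering, fault_matrix):
    """ordering: list of test-case ids in prioritized order.
    fault_matrix: dict mapping test id -> set of faults that test detects."""
    n = len(ordering)
    faults = set().union(*fault_matrix.values())
    m = len(faults)
    first_positions = []
    for fault in faults:
        # position of the first test in the ordering that detects this fault
        pos = next(i + 1 for i, t in enumerate(ordering) if fault in fault_matrix[t])
        first_positions.append(pos)
    return 1 - sum(first_positions) / (n * m) + 1 / (2 * n)

# Hypothetical example: 4 tests, 3 faults
fault_matrix = {"T1": {"F1"}, "T2": {"F2", "F3"}, "T3": {"F1", "F3"}, "T4": set()}
print(apfd(["T2", "T3", "T1", "T4"], fault_matrix))  # higher is better
```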

Keywords: regression testing, software testing, test case prioritization, test suite optimization

Procedia PDF Downloads 333
19333 Effect of Mixture of Flaxseed and Pumpkin Seeds Powder on Hypercholesterolemia

Authors: Zahra Ashraf

Abstract:

Flax and pumpkin seeds are a rich source of unsaturated fatty acids, antioxidants, and fiber, known to have anti-atherogenic properties. Hypercholesterolemia is a state characterized by an elevated level of cholesterol in the blood. This research was designed to study the effect of a flax and pumpkin seed powder mixture on hypercholesterolemia and body weight. Albino rats were selected as a model for humans. Thirty male albino rats were divided into three groups: a control group, a CD-chol group (control diet + cholesterol) fed 1.5% cholesterol, and an FP-chol group (flaxseed and pumpkin seed powder + cholesterol) fed 1.5% cholesterol. Flax and pumpkin seed powders were mixed at a proportion of 5/1 (omega-3 to omega-6). Blood samples were collected to examine the lipid profile, and body weight was also measured. The data were subjected to analysis of variance. In the CD-chol group, body weight, plasma total cholesterol (TC), triacylglycerides (TG), plasma LDL-C, and the LDL/HDL ratio increased significantly, with a decrease in plasma HDL (good cholesterol). In the FP-chol group, lipid parameters and body weight decreased significantly, with an increase in HDL and a decrease in LDL (bad cholesterol). The mean values of body weight, total cholesterol, triglycerides, low-density lipoprotein, and high-density lipoprotein in the FP-chol group were 240.66±11.35 g, 59.60±2.20 mg/dl, 50.20±1.79 mg/dl, 36.20±1.62 mg/dl, and 36.40±2.20 mg/dl, respectively. The flaxseed and pumpkin seed powder mixture reduced body weight, serum cholesterol, low-density lipoprotein, and triglycerides, while a significant increase was shown in high-density lipoprotein when given to hypercholesterolemic rats. Our results suggest that the flax and pumpkin seed mixture has hypocholesterolemic effects, probably mediated by the polyunsaturated fatty acids (omega-3 and omega-6) present in the seed mixture.

Keywords: hypercholesterolemia, omega-3 and omega-6 fatty acids, cardiovascular diseases

Procedia PDF Downloads 414
19332 Walmart Sales Forecasting using Machine Learning in Python

Authors: Niyati Sharma, Om Anand, Sanjeev Kumar Prasad

Abstract:

Estimating future sales is one of the essential elements of tactical planning for any organization. Walmart sales forecasting is a good problem to start with, since it offers one of the largest retail data sets; Walmart also uses this sales estimation problem for hiring purposes. We analyze how internal and external factors affecting one of the largest companies in the US will drive its weekly sales in the future. Demand forecasting is the estimation of future demand for products or services on the basis of present and past data and the different stages of the market. Because every organization faces an uncertain future and future demand cannot be known in advance, we explore historical and recent market statistics to anticipate the forthcoming demand for individual goods, which is even more challenging in the near term; as a result, the required products can be produced in advance in line with market demand. We use several machine learning models to test forecasting accuracy and then train on the whole data set. Linear regression fitted to the training data gives an accuracy of only 8.88%, while the extra trees regression model gives the best accuracy of 97.15%.
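
As a rough sketch of the comparison described above (linear regression versus extra trees), the scikit-learn snippet below fits both models on a train/test split; the file name and the assumption that all feature columns are numeric are illustrative, not part of the original work.

```python
# Sketch: compare linear regression and extra trees on a weekly-sales data set.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.metrics import r2_score, mean_absolute_error

df = pd.read_csv("walmart_sales.csv")          # hypothetical merged store/department data
X = df.drop(columns=["Weekly_Sales"])          # assumed to contain numeric features only
y = df["Weekly_Sales"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

for name, model in [("linear regression", LinearRegression()),
                    ("extra trees", ExtraTreesRegressor(n_estimators=200, random_state=42))]:
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    print(name, "R2:", r2_score(y_test, pred), "MAE:", mean_absolute_error(y_test, pred))
```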

Keywords: random forest algorithm, linear regression algorithm, extra trees classifier, mean absolute error

Procedia PDF Downloads 146
19331 3D Hybrid Multiphysics Lattice Boltzmann Model for Studying the Flow Behavior of Emulsions in Structured Rectangular Microchannels

Authors: Luma Al-Tamimi, Hassan Farhat, Wessam Hasan

Abstract:

A three-dimensional (3D) hybrid quasi-steady thermal lattice Boltzmann model is developed to couple the effects of surfactant, temperature, interfacial tension, and contact angle. This 3D model is an extended scheme of a previously introduced two-dimensional (2D) hybrid lattice Boltzmann model. The 3D model is used to study the combined multi-physics effects on emulsion systems flowing in rectangular microchannels with and without confinements, where the suspended phase is made of droplets, plugs, or a mixture of both. The simulation results show that emulsion systems with plugs as the suspended phase are more efficient than with droplets, whereas mixed systems that form large plugs through coalescence have even greater efficiency. The 3D contact angle model generates matching results to those of the 2D model, which were validated with experiments. Furthermore, the effects of various confinements on adhering single drop systems are investigated for delineating their influence on the power required for transporting the suspended phase through the channel. It is shown that the deeper the constriction is, the lower the system efficiency. Increasing the surfactant concentration or fluid temperature in a channel with confinement carries a substantial positive effect on oil droplet transportation.

Keywords: lattice Boltzmann method, thermal, contact angle, surfactants, high viscosity ratio, porous media

Procedia PDF Downloads 173
19330 Studying the Moisture Sources and the Stable Isotope Characteristic of Moisture in Northern Khorasan Province, North-Eastern Iran

Authors: Mojtaba Heydarizad, Hamid Ghalibaf Mohammadabadi

Abstract:

Iran is an arid and semi-arid country in south-western Asia, in the Middle East, that has faced intense climatological drought since early times. Therefore, studying precipitation events and the moisture sources and air masses causing precipitation is of great importance in this region. In this study, the moisture sources and stable isotope content of precipitation moisture in three main events in 2015 have been studied in north-eastern Iran. HYSPLIT model backward trajectories showed that the Caspian Sea and a mixture of the Caspian and Mediterranean Seas are the dominant moisture sources for the studied events. This shows the roles of the cP (Siberian) and Mediterranean (MedT) air masses. Stable isotope analysis showed that precipitation events originating from the Caspian Sea, with its lower sea surface temperature (SST), have more depleted isotope values, whereas precipitation events sourced from the mixture of the Caspian and Mediterranean Seas (with higher SST) showed more enriched isotope values.

Keywords: HYSPLIT, Iran, Northern Khorasan, stable isotopes

Procedia PDF Downloads 129
19329 Choosing between the Regression Correlation, the Rank Correlation, and the Correlation Curve

Authors: Roger L. Goodwin

Abstract:

This paper presents a rank correlation curve. The traditional correlation coefficient is valid both for continuous variables and for integer variables using rank statistics. Since the correlation coefficient was established for rank statistics by Spearman, the calculation can be extended to the correlation curve. This paper examines two survey questions whose responses are non-continuous variables. We show weak to moderate correlation, with one question clearly having a negative effect on the other; a review of the qualitative literature can explain which question and why. The rank correlation curve shows which collection of responses has a positive slope and which has a negative slope, information that is unavailable from the flat, "first-glance" correlation statistics.
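
For readers unfamiliar with rank statistics, the snippet below contrasts the ordinary (Pearson) and rank (Spearman) correlations on two ordinal survey questions; the response vectors are hypothetical, not the survey data.

```python
# Rank (Spearman) vs. ordinary (Pearson) correlation for two ordinal survey questions.
from scipy import stats

q1 = [1, 2, 2, 3, 4, 4, 5, 5, 3, 2]   # hypothetical Likert-scale answers to question 1
q2 = [5, 4, 4, 3, 3, 2, 1, 2, 3, 4]   # hypothetical answers to question 2

pearson_r, pearson_p = stats.pearsonr(q1, q2)
spearman_rho, spearman_p = stats.spearmanr(q1, q2)
print(f"Pearson r = {pearson_r:.3f} (p = {pearson_p:.3f})")
print(f"Spearman rho = {spearman_rho:.3f} (p = {spearman_p:.3f})")
```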

Keywords: Bayesian estimation, regression model, rank statistics, correlation, correlation curve

Procedia PDF Downloads 469
19328 Numerical Prediction of Wall Eroded Area by Cavitation

Authors: Ridha Zgolli, Ahmed Belhaj, Maroua Ennouri

Abstract:

This study presents a new method to predict the cavitation area that may be eroded. It is based on post-processing of URANS simulations of cavitating flows. Most RANS calculations under the incompressible assumption are based on a cavitation model that uses a mixture fluid whose density (ρm) is calculated as a function of the liquid density (ρliq), the vapour or gas density (ρvap), and the vapour or gas volume fraction α: ρm = α ρvap + (1 - α) ρliq. The calculations are performed on hydrofoil geometries and compared with experimental work on the flow characteristics (pocket size, pressure, velocity). We present the cavitation model used and the approach followed to evaluate the value of α that fixes the shape of the pocket on the wall before it collapses.

Keywords: flows, CFD, cavitation, erosion

Procedia PDF Downloads 335
19327 Fluoride as Obturating Material in Primary Teeth

Authors: Syed Ameer Haider Jafri

Abstract:

The primary goal of root canal treatment in deciduous teeth is to eliminate infection and to retain the tooth in a functional state until it is physiologically exfoliated and replaced by its permanent successor. The important requisites of a root canal filling material for primary teeth are that it should resorb at a rate similar to that of the primary tooth roots, be harmless to the periapical tissue and the permanent tooth germ, resorb readily if pushed beyond the apex, be antiseptic and radio-opaque, not shrink, adhere to the canal walls, not discolor the tooth, and be easy to fill and remove if required at any stage. The presently available, commonly used obturating materials for primary teeth are zinc oxide eugenol, calcium hydroxide, and iodoform-based pastes. None of these materials meets all the ideal requirements of a root canal filling material. In search of an ideal obturating material, this study was planned, in which a mixture of calcium hydroxide, zinc oxide, and sodium fluoride and a mixture of calcium hydroxide and sodium fluoride were compared clinically and radiographically with calcium hydroxide for the obturation of the root canals of 75 carious, exposed primary mandibular second molars in 59 children aged 4-9 years. All three materials showed good results, but after a follow-up of 9 months the mixture of calcium hydroxide, two percent sodium fluoride, and zinc oxide powder closely followed the resorption of the root; the mixture of calcium hydroxide and two percent sodium fluoride followed root resorption in the beginning but later showed faster resorption in the majority of cases; whereas calcium hydroxide started depleting from the canal from the beginning, even as early as 3 months. Thus the mixture of calcium hydroxide, two percent sodium fluoride, and zinc oxide was found to be the best obturating material for primary teeth.

Keywords: obturating material, primary teeth, root canal treatment, success rate

Procedia PDF Downloads 302
19326 Numerical Simulation of Air Flow, Exhaust and Their Mixture in a Helicopter Exhaust Injective Cooler

Authors: Mateusz Paszko, Konrad Pietrykowski, Krzysztof Skiba

Abstract:

Due to low altitude and relatively low flight speed, today's combat assets such as missile weapons equipped with infrared guidance systems are among the most important threats to helicopters performing combat missions. Especially significant in helicopter aviation is the infrared emission of exhaust gases released to the surroundings. Due to their high temperature, exhaust gases are a major factor in the detectability of a helicopter performing air combat operations. This study presents the results of simulating the flow of the mixture of exhaust and air in the flow duct of an injective exhaust cooler adapted to cooperate with the PZL 10W turbine engine. The simulation was performed using a numerical model and the ANSYS Fluent software. Simulation computations were conducted for set flight conditions of the PZL W-3 Falcon helicopter. The conclusions resulting from the numerical computations should allow for optimisation of the flow duct geometry in the cooler, in order to achieve the greatest possible temperature reduction of the exhaust exiting into the surroundings. The obtained results are expected to be useful for further work on the final version of the exhaust cooler for the PZL W-3 Falcon helicopter.

Keywords: exhaust cooler, helicopter, numerical simulation, stealth

Procedia PDF Downloads 144
19325 Basic Calibration and Normalization Techniques for Time Domain Reflectometry Measurements

Authors: Shagufta Tabassum

Abstract:

The study of the dielectric properties of a binary mixture of liquids is very useful for understanding the liquid structure, molecular interactions, dynamics, and kinematics of the mixture. Time-domain reflectometry (TDR) is a powerful tool for studying the cooperative and molecular dynamics of H-bonded systems. In this paper, we discuss the basic calibration and normalization procedures for time-domain reflectometry measurements. Our approach is to explain the different types of errors that occur during TDR measurements and how these errors can be eliminated or minimized.

Keywords: time domain reflectometry measurement technique, cable and connector loss, oscilloscope loss, normalization technique

Procedia PDF Downloads 200
19324 An Inquiry of the Impact of Flood Risk on Housing Market with Enhanced Geographically Weighted Regression

Authors: Lin-Han Chiang Hsieh, Hsiao-Yi Lin

Abstract:

This study aims to determine the impact of the disclosure of a flood potential map on housing prices. The disclosure is supposed to mitigate market failure by reducing information asymmetry; opponents argue that official disclosure of simulated results will only create unnecessary disturbance in the housing market. This study identifies the impact of the disclosure of the flood potential map by comparing the hedonic price of flood potential before and after the disclosure. The flood potential map used in this study was published by the Taipei municipal government in 2015 and is the result of a comprehensive simulation based on geographical, hydrological, and meteorological factors. Residential property sales data from 2013 to 2016 are used, collected from the actual sales price registration system of the Department of Land Administration (DLA). The result shows that the impact of flood potential on the residential real estate market is statistically significant both before and after the disclosure, but the trend is clearer after the disclosure, suggesting that the disclosure does have an impact on the market. The result also shows that the impact of flood potential differs by the severity and frequency of precipitation: the negative impact of a relatively mild, high-frequency flood potential is stronger than that of a heavy, low-probability flood potential, indicating that home buyers are more concerned about the frequency than the intensity of flooding. Another contribution of this study is methodological. Classic hedonic price analysis with OLS regression suffers from two spatial problems: an endogeneity problem caused by omitted spatially related variables, and a heterogeneity problem arising from the presumption that regression coefficients are spatially constant. These two problems are seldom considered in a single model. This study deals with the endogeneity and heterogeneity problems together by combining a spatial fixed-effect model with geographically weighted regression (GWR). A series of studies indicates that the hedonic price of certain environmental assets varies spatially when GWR is applied. Since the endogeneity problem is usually not considered in typical GWR models, it is arguable that omitted spatially related variables might bias their results. By combining the spatial fixed-effect model and GWR, this study concludes that the effect of the flood potential map is highly sensitive to location, even after controlling for spatial autocorrelation. The main policy implication of this result is that it is improper to determine the potential benefit of a flood prevention policy by simply multiplying the hedonic price of flood risk by the number of houses, because the effect of flood prevention might vary dramatically by location.
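
As a rough illustration of the GWR building block, the sketch below fits a separate weighted least-squares regression at each location using Gaussian kernel weights; the bandwidth, variable names, and data are assumptions and do not reproduce the study's full specification (in particular the spatial fixed effects).

```python
# Minimal geographically weighted regression (GWR) sketch with a Gaussian kernel.
import numpy as np

def gwr_coefficients(coords, X, y, bandwidth):
    """coords: (n, 2) locations; X: (n, k) regressors incl. intercept; y: (n,) prices.
    Returns an (n, k) array of local coefficient estimates."""
    n, k = X.shape
    betas = np.empty((n, k))
    for i in range(n):
        d = np.linalg.norm(coords - coords[i], axis=1)
        w = np.exp(-0.5 * (d / bandwidth) ** 2)               # Gaussian kernel weights
        W = np.diag(w)
        betas[i] = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)  # local weighted least squares
    return betas

# Hypothetical data: price regressed on an intercept and a flood-potential dummy
rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(200, 2))
flood = rng.integers(0, 2, size=200)
price = 50 - 5 * flood + rng.normal(0, 2, size=200)
X = np.column_stack([np.ones(200), flood])
local_betas = gwr_coefficients(coords, X, price, bandwidth=2.0)
print("local flood coefficients range:", local_betas[:, 1].min(), local_betas[:, 1].max())
```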

Keywords: flood potential, hedonic price analysis, endogeneity, heterogeneity, geographically-weighted regression

Procedia PDF Downloads 287
19323 Developing Measurement Model of Interpersonal Skills of Youth

Authors: Mohd Yusri Ibrahim

Abstract:

Although it is known that interpersonal skills are essential for personal development, the debate continues as to how to measure those skills, especially in youths. This study was conducted to develop a measurement model of interpersonal skills by suggesting three constructs, namely personal, skills, and relationship; six functions, namely self, perception, listening, conversation, emotion, and conflict management; and 30 behaviours as indicators. This cross-sectional questionnaire survey was administered to 150 respondents on the east side of Peninsular Malaysia and analysed by structural equation modelling (SEM) in AMOS. The suggested constructs, functions, and indicators were considered acceptable measurement elements based on the regression weights for standardised loadings, the average variance extracted (AVE) for convergent validity, the square root of the AVE for discriminant validity, composite reliability (CR), and at least three fit indices for model fitness. Finally, a measurement model of interpersonal skills for youth was successfully developed.

Keywords: interpersonal communication, interpersonal skill, youth, communication skill

Procedia PDF Downloads 311
19322 Detecting Overdispersion for Mortality AIDS in Zero-inflated Negative Binomial Death Rate (ZINBDR) Co-infection Patients in Kelantan

Authors: Mohd Asrul Affedi, Nyi Nyi Naing

Abstract:

Overdispersion is often present in count data; when it occurs, a negative binomial (NB) model is commonly used in place of a standard Poisson model. For the analysis of count events such as mortality cases, a Poisson regression model is generally appropriate; however, it becomes inappropriate when the data contain excess zero values, in which case a zero-inflated negative binomial model is appropriate. In this article, we model mortality cases as the dependent variable with age as a categorical covariate. The objective of this study is to determine whether overdispersion exists in the mortality data of AIDS co-infection patients in Kelantan.
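
A quick way to screen for overdispersion before moving to NB or zero-inflated NB models is to fit a Poisson regression and compare its Pearson chi-square statistic to the residual degrees of freedom. The statsmodels sketch below uses hypothetical counts and variable names, not the Kelantan data.

```python
# Overdispersion check: fit Poisson regression and inspect Pearson chi2 / df.
# A ratio well above 1 suggests overdispersion (NB or zero-inflated NB may fit better).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "deaths": rng.negative_binomial(1, 0.3, size=120),   # hypothetical counts with many zeros
    "age_group": rng.integers(0, 4, size=120),           # categorical age (0-3)
})
X = pd.get_dummies(df["age_group"], prefix="age", drop_first=True).astype(float)
X = sm.add_constant(X)

poisson_fit = sm.GLM(df["deaths"], X, family=sm.families.Poisson()).fit()
print("Pearson chi2 / df =", round(poisson_fit.pearson_chi2 / poisson_fit.df_resid, 2))

nb_fit = sm.GLM(df["deaths"], X, family=sm.families.NegativeBinomial()).fit()
print("AIC Poisson vs NB:", round(poisson_fit.aic, 1), round(nb_fit.aic, 1))
```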

Keywords: negative binomial death rate, overdispersion, zero-inflated negative binomial death rate, AIDS

Procedia PDF Downloads 459
19321 Analysis of Effect of Microfinance on the Profit Level of Small and Medium Scale Enterprises in Lagos State, Nigeria

Authors: Saheed Olakunle Sanusi, Israel Ajibade Adedeji

Abstract:

The study analysed the effect of microfinance on the profit level of small and medium scale enterprises in Lagos. The data for the study were obtained by simple random sampling, and a total of one hundred and fifty (150) small and medium scale enterprises (SMEs) were sampled: seventy-five (75) microfinance users and seventy-five (75) non-users. Data were analysed using descriptive statistics, a logit model, a t-test, and ordinary least squares (OLS) regression. The mean profit of the enterprises using microfinance is ₦16.8m, while that of the non-users is ₦5.9m, and the difference is statistically significant. The logit model specified for the determinants of access to microfinance showed that three of the specified variables, namely the educational status of the enterprise head, credit utilisation, and the volume of business investment, are significant at P < 0.01. Enterprises with many years of experience, highly educated enterprise heads, and a high volume of business investment have greater potential access to microfinance. The OLS regression model indicated that three parameters, namely the number of school years, the volume of business investment, and (dummy) participation in microfinance, were significant at P < 0.05; these variables are therefore significant determinants of the impact of microfinance on profit level in the study area. The study therefore concludes and recommends that, to improve the status of small and medium scale enterprises and increase profit, the full benefit of access to microfinance can be enhanced through investment in social infrastructure and human capital development. Also, concerted efforts should be made to encourage non-users of microfinance among SMEs to use it in order to boost their profit.
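
A sketch of the two-step analysis (a logit model for access to microfinance, followed by OLS for profit) in statsmodels is shown below; the file and column names are assumptions, not the survey's actual variables.

```python
# Two-step analysis sketch: logit for access to microfinance, then OLS for profit.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("smes_lagos.csv")  # hypothetical survey data

# Step 1: determinants of access to microfinance (binary outcome)
logit_fit = smf.logit("uses_microfinance ~ years_of_schooling + credit_utilisation"
                      " + business_investment + years_of_experience", data=df).fit()
print(logit_fit.summary())

# Step 2: effect of microfinance participation on profit (OLS)
ols_fit = smf.ols("profit ~ years_of_schooling + business_investment"
                  " + uses_microfinance", data=df).fit()
print(ols_fit.summary())
```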

Keywords: credit utilisation, logit model, microfinance, small and medium enterprises

Procedia PDF Downloads 200
19320 Numerical Analysis of Laminar Reflux Condensation from Gas-Vapour Mixtures in Vertical Parallel Plate Channels

Authors: Foad Hassaninejadafarahani, Scott Ormiston

Abstract:

Reflux condensation occurs in vertical channels and tubes when there is an upward core flow of vapour (or a gas-vapour mixture) and a downward flow of the liquid film. Understanding this condensation configuration is crucial in the design of reflux condensers and distillation columns, and in loss-of-coolant safety analyses of nuclear power plant steam generators. The unique feature of this flow is the upward flow of the vapour-gas mixture (or pure vapour), which retards the liquid flow via shear at the liquid-mixture interface. The present model solves the full, elliptic governing equations in both the film and the gas-vapour core flow. The computational mesh is non-orthogonal and adapts dynamically to the phase interface, thus producing a sharp and accurate interface. Shear forces and heat and mass transfer at the interface are accounted for fundamentally. This modelling is a big step beyond current capabilities because it removes the limitations of previous reflux condensation models, which inherently cannot account for the detailed local balances of shear, mass, and heat transfer at the interface. Discretisation is based on a finite volume method and a co-located variable storage scheme, and an in-house computer code was developed to implement the numerical solution scheme. Detailed results are presented for laminar reflux condensation from steam-air mixtures flowing in vertical parallel-plate channels. The results include velocity and pressure profiles, as well as axial variations of film thickness, Nusselt number, and interface gas mass fraction.

Keywords: reflux, condensation, CFD two-phase, Nusselt number

Procedia PDF Downloads 358
19319 Estimating the Life-Distribution Parameters of Weibull-Life PV Systems Utilizing Non-Parametric Analysis

Authors: Saleem Z. Ramadan

Abstract:

In this paper, a model is proposed to determine the life distribution parameters of the useful life region for the PV system utilizing a combination of non-parametric and linear regression analysis for the failure data of these systems. Results showed that this method is dependable for analyzing failure time data for such reliable systems when the data is scarce.
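
One common way to combine a non-parametric estimate with linear regression for Weibull data is median-rank regression: estimate F(t) with Bernard's approximation and regress ln(-ln(1-F)) on ln(t). The sketch below uses hypothetical failure times and is not the paper's exact procedure.

```python
# Weibull parameter estimation by median-rank regression
# (non-parametric CDF estimate + linear regression on the linearised CDF).
import numpy as np

failure_times = np.sort(np.array([412., 608., 745., 890., 1010., 1175., 1350., 1500.]))  # hypothetical
n = len(failure_times)
ranks = np.arange(1, n + 1)
F = (ranks - 0.3) / (n + 0.4)            # Bernard's median-rank approximation

x = np.log(failure_times)
y = np.log(-np.log(1.0 - F))             # linearised Weibull CDF: y = beta*ln(t) - beta*ln(eta)
beta, intercept = np.polyfit(x, y, 1)    # slope = shape parameter
eta = np.exp(-intercept / beta)          # scale parameter

print(f"shape (beta) = {beta:.2f}, scale (eta) = {eta:.1f}")
```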

Keywords: masking, bathtub model, reliability, non-parametric analysis, useful life

Procedia PDF Downloads 555
19318 Exploring Syntactic and Semantic Features for Text-Based Authorship Attribution

Authors: Haiyan Wu, Ying Liu, Shaoyun Shi

Abstract:

Authorship attribution extracts features to identify the authors of anonymous documents. Many previous works on authorship attribution focus on statistical style features (e.g., sentence/word length) and content features (e.g., frequent words, n-grams). Modeling these features by regression or other transparent machine learning methods gives a portrait of an author's writing style, but these methods do not capture syntactic (e.g., dependency relationships) or semantic (e.g., topic) information. In recent years, some researchers have modeled syntactic trees or latent semantic information with neural networks; however, few works take them together. Besides, predictions by neural networks are difficult to explain, and explainability is vital in authorship attribution tasks. In this paper, we not only utilize statistical style and content features but also take advantage of both syntactic and semantic features. Different from an end-to-end neural model, feature selection and prediction are two separate steps in our method: an attentive n-gram network is utilized to select useful features, and logistic regression is applied to give predictions and an understandable representation of writing style. Experiments show that our extracted features improve on the state-of-the-art methods on three benchmark datasets.
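
A simplified sketch of the final prediction step (n-gram features feeding an interpretable logistic regression) is shown below in scikit-learn; the toy corpus is an assumption, and the attentive n-gram selection network is replaced here by plain character TF-IDF.

```python
# Simplified sketch: character n-gram features + logistic regression for authorship.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

docs = ["the sea was calm and the night was long",        # hypothetical author A
        "calm seas and long nights suited the captain",
        "profits rose sharply as markets rallied",         # hypothetical author B
        "the market rally lifted quarterly profits"]
authors = ["A", "A", "B", "B"]

model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(max_iter=1000),
)
model.fit(docs, authors)
print(model.predict(["the night sea was calm and long"]))  # expected: ['A']
```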

Keywords: authorship attribution, attention mechanism, syntactic feature, feature extraction

Procedia PDF Downloads 133
19317 Examining the Cognitive Abilities and Financial Literacy Among Street Entrepreneurs: Evidence From North-East, India

Authors: Aayushi Lyngwa, Bimal Kishore Sahoo

Abstract:

The study discusses the relationship between the cognitive ability and educational attainment of tribal street entrepreneurs and their financial literacy. It is driven by the objective of examining the effect of cognitive ability on financial ability on the one hand and determining the effect of the same on financial literacy on the other. A field experiment was conducted with 203 tribal street vendors in the north-eastern Indian state of Mizoram. The experiment's calculations are based on scoring each respondent on a math score (cognitive ability) and on financial and debt scores (financial ability). Categories for each of these variables, namely a math category (from the math score), a financial category (from the financial score), and a debt category (from the debt score), are then generated to run the regression model. Since the dependent variable is ordinal, an ordered logit regression model was applied. The study shows that street vendors' cognitive and financial abilities are highly correlated. It therefore confirms that cognitive ability positively affects the financial literacy of street vendors through the increase in educational attainment. It is also found that, concerning the type of street vendor, regular street vendors are more likely to have better cognitive abilities than temporary street vendors. Additionally, street vendors with greater cognitive and financial abilities gained better monthly profits and practised bookkeeping. The study focuses on a group that is economically and socially marginalised in the Indian economy. Its findings contribute to understanding financial literacy in an understudied area and provide policy implications through inclusive financial system solutions for tribal street vendors.
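
The ordered logit specification can be reproduced with the OrderedModel class available in recent statsmodels releases; the data file and column names below are assumptions.

```python
# Ordered logit sketch for an ordinal financial-literacy category.
# Requires a statsmodels version that provides OrderedModel (0.12+).
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

df = pd.read_csv("street_vendors.csv")  # hypothetical field-experiment data
df["financial_category"] = pd.Categorical(df["financial_category"], ordered=True)

fit = OrderedModel(df["financial_category"],
                   df[["math_score", "education_years", "regular_vendor"]],
                   distr="logit").fit(method="bfgs", disp=False)
print(fit.summary())
```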

Keywords: financial literacy, education, street entrepreneurs, tribals, cognitive ability, financial ability, ordered logit regression

Procedia PDF Downloads 101
19316 Genetic Change in Escherichia coli KJ122 That Improved Succinate Production from an Equal Mixture of Xylose and Glucose

Authors: Apichai Sawisit, Sirima Suvarnakuta Jantama, Sunthorn Kanchanatawee, Lonnie O. Ingram, Kaemwich Jantama

Abstract:

Escherichia coli KJ122 was engineered to produce succinate from glucose using the wild-type GalP for glucose uptake instead of the native phosphotransferase system (ptsI mutation). This strain ferments 10% (w/v) xylose poorly. Mutants were selected by serial transfers in AM1 mineral salts medium with 10% (w/v) xylose. Evolved mutants exhibited a similar improvement: co-fermentation of an equal mixture of xylose and glucose. One of these, AS1600a, produced 84.26±1.37 g/L succinate, equivalent to that produced by the parent strain KJ122 from 10% glucose (85.46±1.78 g/L). AS1600a was sequenced and found to contain a mutation in galactose permease (GalP, G236D). Expressing the mutant galP* gene in KJ122ΔgalP reproduced the xylose utilization phenotype of the mutant AS1600a. The strain AS1600a and KJ122ΔgalP (pLOI5746; galP*) also co-fermented a mixture of glucose, xylose, arabinose, and galactose in sugarcane bagasse hydrolysate for succinate production.

Keywords: xylose, furfural, succinate, sugarcane bagasse, E. coli

Procedia PDF Downloads 385
19315 Reaction Rate Behavior of a Methane-Air Mixture over a Platinum Catalyst in a Single Channel Catalytic Reactor

Authors: Doo Ki Lee, Kumaresh Selvakumar, Man Young Kim

Abstract:

Catalytic combustion is an environmentally friendly technique for burning fuels in gas turbines. In this paper, the behavior of the surface reaction rate in catalytic combustion is studied with respect to the heterogeneous oxidation of a methane-air mixture in a catalytic reactor. A plug flow reactor (PFR), a simplified single catalytic channel, assists in investigating the catalytic combustion phenomenon over the Pt catalyst by promoting the desired chemical reactions. The numerical simulation with multi-step elementary surface reactions is governed by the availability of free surface sites on the catalytic surface, and the catalytic combustion characteristics are thereby demonstrated by examining the rate of reaction for a lean fuel mixture. Further, two different surface reaction mechanisms are adopted and their surface reaction rates compared in order to identify the controlling heterogeneous reaction for better fuel conversion. The performance of the platinum catalyst under heterogeneous reaction is analyzed at the same temperature condition, where the catalyst with the higher kinetic rate of reaction has the maximum catalytic activity for enhanced methane catalytic combustion.

Keywords: catalytic combustion, heterogeneous reaction, plug flow reactor, surface reaction rate

Procedia PDF Downloads 269
19314 Rheological Characterization of Gels Based on Medicinal Plant Extracts Mixture (Zingibar Officinale and Cinnamomum Cassia)

Authors: Zahia Aliche, Fatiha Boudjema, Benyoucef Khelidj, Selma Mettai, Zohra Bouriahi, Saliha Mohammed Belkebir, Ridha Mazouz

Abstract:

The purpose of this work is to study the viscoelastic behaviour of gels formulated from plant extracts. Extracts of Zingibar officinale and Cinnamomum cassia were included in the gel at different concentrations with a view to application in anti-inflammatory drugs. The yield of the ethanolic extraction of Zingibar o. is 3.98%, and that of the Cinnamomum c. essential oil obtained by hydrodistillation is 1.67%. The ethanolic extract of Zingibar o., the essential oil of Cinnamomum c., and the mixture showed anti-DPPH radical activity, with EC50 values of 11.32, 13.48, and 14.39 mg/ml, respectively. A gel based on different concentrations of these extracts was prepared. Microbiological tests conducted against Staphylococcus aureus and Escherichia coli showed moderate inhibition by the Cinnamomum c. gel and less by the gel based on Cinnamomum c./Zingibar o. (20/80). The yeast Candida albicans is resistant to the gels. The viscoelastic properties of the formulations were characterised in dynamic and creep tests and modelled with the Kelvin-Voigt model. The influence of some parameters on the stability of the gel (time, temperature, and applied stress) has been studied.
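
For reference, the creep response of the Kelvin-Voigt element used for the fits has the standard textbook form below (not taken from the paper), where sigma_0 is the constant applied stress, G the elastic modulus, eta the viscosity, and lambda the retardation time:

```latex
% Kelvin-Voigt creep response under a constant applied stress \sigma_0
\gamma(t) = \frac{\sigma_0}{G}\left(1 - e^{-t/\lambda}\right),
\qquad \lambda = \frac{\eta}{G}
```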

Keywords: Cinnamomum cassia, Zingibar officinale, antioxidant activity, antimicrobial activity, gel, viscoelastic behaviour

Procedia PDF Downloads 84
19313 Prediction on Housing Price Based on Deep Learning

Authors: Li Yu, Chenlu Jiao, Hongrun Xin, Yan Wang, Kaiyang Wang

Abstract:

In order to study the impact of various factors on housing prices, we build different prediction models based on deep learning from existing real estate data in order to predict the housing price, or its future trend, more accurately. Considering that the factors which affect the housing price vary widely, the proposed prediction models fall into two categories. The first is based on multiple characteristic factors of the real estate: we built a Convolutional Neural Network (CNN) prediction model and a Long Short-Term Memory (LSTM) neural network prediction model based on deep learning, and a logistic regression model was implemented for comparison among the three. The second is a time series model: based on deep learning, we proposed an LSTM-1 model built purely on the time series, then implemented and compared the LSTM model and the Auto-Regressive Moving Average (ARMA) model. In this paper, a comprehensive study of second-hand housing prices in Beijing has been conducted from three aspects: data crawling and analysis, housing price prediction, and result comparison. Ultimately the best model was identified, which is of great significance for the evaluation and prediction of housing prices in the real estate industry.
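
A minimal sketch of the time-series branch (an LSTM trained on a sliding window of past prices to predict the next value) is shown below with Keras; the window length, layer sizes, and synthetic series are assumptions, not the Beijing data or the paper's LSTM-1 architecture.

```python
# Minimal LSTM sketch for a univariate housing-price time series (window -> next value).
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

prices = np.cumsum(np.random.default_rng(0).normal(0.5, 1.0, size=200)) + 100  # synthetic series
window = 12
X = np.array([prices[i:i + window] for i in range(len(prices) - window)])
y = prices[window:]
X = X[..., None]                      # shape (samples, timesteps, features=1)

model = Sequential([
    LSTM(32, input_shape=(window, 1)),
    Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, batch_size=16, verbose=0)
print("next-step forecast:", float(model.predict(X[-1:], verbose=0)[0, 0]))
```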

Keywords: deep learning, convolutional neural network, LSTM, housing prediction

Procedia PDF Downloads 302
19312 Applying Multiplicative Weight Update to Skin Cancer Classifiers

Authors: Animish Jain

Abstract:

This study uses Multiplicative Weight Update within artificial intelligence and machine learning to create models that can diagnose skin cancer from microscopic images of cancer samples. The multiplicative weight update method combines the predictions of multiple models in order to obtain more accurate results. Logistic Regression, Convolutional Neural Network (CNN), and Support Vector Machine Classifier (SVMC) models are employed within the Multiplicative Weight Update system. These models are trained on pictures of skin cancer from the ISIC Archive to look for patterns that label unseen scans as either benign or malignant. The models are then utilized in a multiplicative weight update algorithm which takes into account the precision and accuracy of each model through each successive guess and applies weights to their guesses; these guesses and weights are analyzed together to obtain the final predictions. The research hypothesis stated that there would be a significant difference in accuracy between the three models and the Multiplicative Weight Update system. The SVMC model had an accuracy of 77.88%, the CNN model 85.30%, and the Logistic Regression model 79.09%; using Multiplicative Weight Update, the algorithm reached an accuracy of 72.27%. The conclusion drawn was that there was a significant difference in accuracy between the three models and the Multiplicative Weight Update system, and that a CNN model would be the better option for this problem. This may be because Multiplicative Weight Update is less effective in a binary setting with only two possible classifications; in a categorical setting with multiple classes and groupings, a Multiplicative Weight Update system might become more proficient, as it takes into account the strengths of multiple models to classify images into multiple categories rather than only two, as shown in this study. This experimentation can help to create better algorithms and models for the future of artificial intelligence in the medical imaging field.
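
The multiplicative weight update rule itself is simple: every model that guesses wrong on a sample has its weight multiplied by (1 - eta), and the ensemble prediction is a weighted vote. A self-contained sketch is below; the toy predictions and learning rate are illustrative assumptions, not the study's data.

```python
# Multiplicative weight update over three classifiers (weighted majority vote).
import numpy as np

def mwu_predictions(model_preds, y_true, eta=0.3):
    """model_preds: (n_models, n_samples) array of 0/1 predictions, processed in sequence.
    y_true: (n_samples,) true labels. Returns ensemble predictions and final weights."""
    n_models, n_samples = model_preds.shape
    w = np.ones(n_models)
    ensemble = np.empty(n_samples, dtype=int)
    for t in range(n_samples):
        votes = model_preds[:, t]
        # weighted majority vote between class 1 and class 0
        ensemble[t] = int(w[votes == 1].sum() >= w[votes == 0].sum())
        # penalise every model that guessed wrong on this sample
        w[votes != y_true[t]] *= (1.0 - eta)
    return ensemble, w

# Hypothetical predictions from SVM, CNN, and logistic regression on 8 scans
preds = np.array([[1, 0, 1, 1, 0, 0, 1, 0],
                  [1, 1, 1, 0, 0, 1, 1, 0],
                  [0, 0, 1, 1, 0, 0, 1, 1]])
truth = np.array([1, 1, 1, 0, 0, 1, 1, 0])
ensemble, weights = mwu_predictions(preds, truth)
print("ensemble accuracy:", (ensemble == truth).mean(), "final weights:", weights.round(3))
```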

Keywords: artificial intelligence, machine learning, multiplicative weight update, skin cancer

Procedia PDF Downloads 74
19311 Investigating the Influence of the Ferro Alloys Consumption on the Slab Product Standard Cost with Different Grades Using Regression Analysis (A Case Study of Iran's Iron and Steel Industry)

Authors: Iman Fakhrian, Ali Salehi Manzari

Abstract:

Consistent profitability is one of the most important priorities in manufacturing companies, and one of the fundamental factors in increasing a company's profitability is cost management. Isfahan's Mobarakeh Steel Company is one of the largest producers of slab product grades in the Middle East. Raw material costs constitute about 70% of the company's expenditures, and the costs of ferro alloys make a remarkable contribution to raw material costs. This research aims to determine which ferro alloys have a significant effect on the variability of the standard cost of the slab product grades. The data used in this study were collected from the standard costing system of Isfahan's Mobarakeh Steel Company in 2022. The results of the regression analysis show that expense items 03020, 03045, 03125, 03130, and 03150 have a dominant role in the variability of the standard cost of the slab product grades; in other words, these ferro alloys have a noticeable and significant role in the variability of the standard cost of the slab product grades.

Keywords: consistent profitability, ferro alloys, slab product grades, regression analysis

Procedia PDF Downloads 64
19310 Assessment of Forest Above Ground Biomass Through Linear Modeling Technique Using SAR Data

Authors: Arjun G. Koppad

Abstract:

The study was conducted in Joida taluk of Uttara Kannada district, Karnataka, India, to assess land use and land cover (LULC) and forest above-ground biomass using L-band SAR data. The study area contains dense, moderately dense, and sparse forests. The sampled area was 0.01 percent of the forest area, with 30 sampling plots selected randomly. The point centre quadrate (PCQ) method was used to select the trees, and the tree growth parameters, viz. tree height, diameter at breast height (DBH), and diameter at the tree base, were collected. Tree crown density was measured with a densitometer. The biomass of each sample plot was estimated using the standard formula. In this study, the LULC classification was done using the Freeman-Durden, Yamaguchi, and Pauli polarimetric decompositions; the Freeman-Durden decomposition showed the best LULC classification, with an accuracy of 88 percent. An attempt was then made to estimate the above-ground biomass using SAR backscatter. Fully polarimetric quad-pol ALOS-2 PALSAR-2 L-band data (HH, HV, VH, and VV) were used, and a SAR-backscatter-based regression model was implemented to retrieve the forest above-ground biomass of the study area. Cross-polarization (HV) showed a good correlation with forest above-ground biomass. Multiple linear regression analysis was done to estimate the above-ground biomass of the natural forest areas of Joida taluk. Among the different polarization combinations (HH & HV, VV & HH, HV & VH, VV & VH), the combination of HH and HV showed a good correlation between field and predicted biomass; the RMSE and R² values for HH & HV and HH & VV were 78 t/ha and 0.861, and 81 t/ha and 0.853, respectively. Hence the model can be recommended for estimating AGB for dense, moderately dense, and sparse forests.
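
The multiple-linear-regression step (above-ground biomass regressed on the HH and HV backscatter channels) can be sketched as follows; the backscatter values and coefficients are hypothetical, not the study's field data.

```python
# Multiple linear regression of above-ground biomass (AGB) on SAR backscatter (HH, HV).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
hh = rng.uniform(-12, -5, size=30)                      # hypothetical sigma0 HH, dB
hv = rng.uniform(-20, -12, size=30)                     # hypothetical sigma0 HV, dB
agb = 400 + 10 * hh + 18 * hv + rng.normal(0, 30, 30)   # hypothetical field biomass, t/ha

X = sm.add_constant(np.column_stack([hh, hv]))
fit = sm.OLS(agb, X).fit()
print(fit.params)                                       # intercept, HH and HV coefficients
print("R-squared:", round(fit.rsquared, 3))
rmse = np.sqrt(np.mean((fit.fittedvalues - agb) ** 2))
print("RMSE (t/ha):", round(rmse, 1))
```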

Keywords: forest, biomass, LULC, back scatter, SAR, regression

Procedia PDF Downloads 22
19309 Bioactive Compounds Characterization of Cereal-Based Porridge Enriched with Cirina forda

Authors: Kunle Oni

Abstract:

This study investigated the bioactivity potential of porridge from yellow maize and malted sorghum enriched with Cirina forda. All the samples were analyzed using standard methods. Results showed that the highest values, 217.03 μmol TEAC/100 g, 43.3 mmol Fe2+/100 g, and 35.56% for DPPH, FRAP, and TBARS, respectively, were recorded in sample 50FYM+20MS+30CF, while the lowest values, 146.10 μmol TEAC/100 g, 20.18±0.11 mmol Fe2+/100 g, and 13.25%, were recorded in the control sample. The oxalate and tannin contents were lowest in sample 50FYM+20MS+30CF, but oxalate was highest in the control sample while tannin was highest in sample 60FYM+20MS+20CF. The phytate content was highest in the 60FYM+20MS+20CF mixture (2.32 mg/100 g) and lowest in the control (100% FYM) porridge (2.20 mg/100 g). The results also showed that the total phenolic content was highest in the 60FYM+20MS+20CF mixture (318.28 mg GAE/100 g) and lowest in the 50FYM+30MS+20CF mixture (264.18 mg GAE/100 g). The total flavonoid content was highest in the 50FYM+20MS+30CF mixture (189.31 mg RE/100 g) and lowest in the 60FYM+20MS+20CF mixture (90.10 mg RE/100 g). The enrichment of the porridge with C. forda increased the concentration of various bioactive compounds compared to the control sample. The identified compounds included cinnamic acid methyl ester, 10-methyl-E-11-tridecen-1-ol propionate, methaqualone, 3-(2-hydroxy-6-methylphenyl)-4(3H)-quinazolinone, and oleic acid.

Keywords: bioactive compounds, characterization, cereal-based porridge, Cirina forda

Procedia PDF Downloads 52
19308 Machine Learning Methods for Flood Hazard Mapping

Authors: Stefano Zappacosta, Cristiano Bove, Maria Carmela Marinelli, Paola di Lauro, Katarina Spasenovic, Lorenzo Ostano, Giuseppe Aiello, Marco Pietrosanto

Abstract:

This paper proposes a novel neural network approach to flood hazard mapping. The core of the model is a machine learning component fed by frequency ratios, namely statistical correlations between flood event occurrences and a selected number of topographic properties. The proposed hybrid model can be used to classify four increasing levels of hazard. Its classification capability was compared with the flood hazard maps of the River Basin Plans (PAI) designed by ISPRA (Istituto Superiore per la Protezione e la Ricerca Ambientale), the Italian Institute for Environmental Protection and Research. The study area of Piemonte, an Italian region, has been considered without loss of generality. The frequency ratios may be used as a standalone block to model flood hazard; nevertheless, combining them with a neural network improves the classification power by several percentage points, and the hybrid may be proposed as a basic tool for modelling flood hazard maps in a wider scope.
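
The frequency-ratio block can be computed directly from tabulated class counts: for each class of a topographic factor, FR is the share of flood occurrences falling in that class divided by the share of the total area occupied by that class. A small pandas sketch with hypothetical counts:

```python
# Frequency ratio (FR) per class of a topographic factor (hypothetical counts).
import pandas as pd

df = pd.DataFrame({
    "slope_class": ["0-5", "5-15", "15-30", ">30"],
    "flood_pixels": [420, 310, 150, 20],            # pixels of that class inside flooded areas
    "class_pixels": [50000, 80000, 60000, 30000],   # total pixels of that class in the study area
})
df["FR"] = (df["flood_pixels"] / df["flood_pixels"].sum()) / \
           (df["class_pixels"] / df["class_pixels"].sum())
print(df)   # FR > 1 indicates a class positively correlated with flooding
```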

Keywords: flood modeling, hazard map, neural networks, hydrogeological risk, flood risk assessment

Procedia PDF Downloads 171
19307 Educational Data Mining: The Case of the Department of Mathematics and Computing in the Period 2009-2018

Authors: Mário Ernesto Sitoe, Orlando Zacarias

Abstract:

University education is influenced by several factors, ranging from the adoption of strategies to strengthen the whole process to improvement of the students' own academic performance. This work uses data mining techniques to develop a predictive model to identify students with a tendency toward evasion (dropout) or retention. To this end, a database of real students' data from the Department of University Admission (DAU) and the Department of Mathematics and Informatics (DMI) was used, comprising 388 undergraduate students admitted in the years 2009 to 2014. The Weka tool was used for model building with three different techniques, namely K-nearest neighbours, random forest, and logistic regression. To allow for training on multiple train-test splits, a cross-validation approach was employed with a varying number of folds, and to reduce bias and variance and improve the performance of the models, the ensemble methods Bagging and Stacking were used. After comparing the results obtained by the three classifiers, logistic regression using Bagging with seven folds obtained the best performance, showing results above 90% in all evaluated metrics: accuracy, true positive rate, and precision. Retention is the most common tendency.
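
The best-performing configuration reported above (logistic regression inside a Bagging ensemble, evaluated with 7-fold cross-validation) can be sketched in scikit-learn as below; the original study used Weka, and the file and column names here are assumptions (scikit-learn 1.2+ for the estimator keyword).

```python
# Sketch: bagged logistic regression with 7-fold cross-validation.
import pandas as pd
from sklearn.ensemble import BaggingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

df = pd.read_csv("dmi_students.csv")               # hypothetical student records
X = df.drop(columns=["outcome"])                   # e.g. grades and admission scores (numeric)
y = df["outcome"]                                  # evasion / retention label

clf = BaggingClassifier(estimator=LogisticRegression(max_iter=1000),
                        n_estimators=50, random_state=0)
scores = cross_val_score(clf, X, y, cv=7, scoring="accuracy")
print("7-fold accuracy:", scores.mean().round(3), "+/-", scores.std().round(3))
```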

Keywords: evasion and retention, cross-validation, bagging, stacking

Procedia PDF Downloads 78
19306 Using Linear Logistic Regression to Evaluation the Patient and System Delay and Effective Factors in Mortality of Patients with Acute Myocardial Infarction

Authors: Firouz Amani, Adalat Hoseinian, Sajjad Hakimian

Abstract:

Background: Mortality due to myocardial infarction (MI) often occurs during the first hours after symptom onset, so a timely visit to the hospital for the necessary treatment can be effective in decreasing the mortality rate. The aim of this study was to investigate the impact of potentially effective factors on the mortality of MI patients using linear logistic regression. Materials and Methods: In this case-control study, all patients with acute MI who were referred to the Ardabil city hospital were studied. All deceased patients were considered the case group (n=27), and 27 matched patients without acute MI were selected as a control group. Data were collected for all patients in both groups with the same checklist and then analysed in SPSS version 24 using statistical methods. We used a linear logistic regression model to determine the factors affecting the mortality of MI patients. Results: The mean age of patients in the case group was significantly higher than in the control group (75.1±11.7 vs. 63.1±11.6, p=0.001). A history of non-cardiac diseases was significantly more frequent in the case group, at 44.4%, than in the control group, at 7.4% (p=0.002). The number of performed PCIs in the case group, 40.7%, was significantly lower than in the control group, 74.1% (p=0.013). The time between hospital admission and PCI in the case group, 110.9 min, was significantly longer than in the control group, 56 min (p=0.001). The mean delay from symptom onset to hospital admission (patient delay) and the mean delay from hospital admission to treatment (system delay) were similar between the two groups. Using the logistic regression model, we found that a history of non-cardiac diseases (OR=283) and the number of performed PCIs (OR=24.5) had a significant impact on the mortality of MI patients compared to the other factors. Conclusion: The results of this study showed that, of all the studied factors, the number of performed PCIs, a history of non-cardiac illness, and the interval between symptom onset and PCI have a significant relation with the mortality of MI patients, while the other factors were not meaningful. Further studies with larger samples investigating other factors, such as smoking and weather, are recommended.

Keywords: acute MI, mortality, heart failure, arrhythmia

Procedia PDF Downloads 118
19305 Benchmarking Machine Learning Approaches for Forecasting Hotel Revenue

Authors: Rachel Y. Zhang, Christopher K. Anderson

Abstract:

A critical aspect of revenue management is a firm's ability to predict demand as a function of price. Historically, hotels have used simple time series models (regression and/or pick-up based models) owing to the complexity of trying to build causal models of demand. Machine learning approaches are slowly attracting attention owing to their flexibility in modeling relationships. This study provides an overview of approaches to forecasting hospitality demand, focusing on the opportunities created by machine learning approaches, including K-Nearest-Neighbors, Support Vector Machine, Regression Tree, and Artificial Neural Network algorithms. The out-of-sample performance of the above approaches to forecasting hotel demand is illustrated using a proprietary sample of market-level (24 properties) transactional data for Las Vegas, NV. Causal predictive models can be built and evaluated owing to the availability of market-level (versus firm-level) data. This research also compares and contrasts the accuracy of firm-level models (i.e., predictive models for hotel A using only hotel A's data) with models using market-level data (prices, review scores, location, chain scale, etc., for all hotels within the market). The proposed models will be valuable for hotel revenue prediction given the basic characteristics of a hotel property, and can be applied to performance evaluation for an existing hotel. The findings will unveil the features that play key roles in a hotel's revenue performance, which has considerable potential usefulness in both revenue prediction and evaluation.
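
The benchmarking loop over the four learners can be set up compactly with scikit-learn; the target, feature set, and hyperparameters below are illustrative assumptions (all features assumed numeric), not the proprietary Las Vegas data.

```python
# Benchmark sketch: four learners forecasting hotel demand as a function of price and other features.
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor
from sklearn.neural_network import MLPRegressor

df = pd.read_csv("vegas_transactions.csv")          # hypothetical market-level data
X = df[["price", "review_score", "chain_scale", "lead_time"]]   # assumed numeric encodings
y = df["rooms_sold"]

models = {
    "k-nearest neighbours": KNeighborsRegressor(n_neighbors=10),
    "support vector machine": SVR(C=10.0),
    "regression tree": DecisionTreeRegressor(max_depth=6),
    "neural network": MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000),
}
for name, model in models.items():
    pipe = make_pipeline(StandardScaler(), model)
    mae = -cross_val_score(pipe, X, y, cv=5, scoring="neg_mean_absolute_error").mean()
    print(f"{name}: out-of-sample MAE = {mae:.1f}")
```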

Keywords: hotel revenue, k-nearest-neighbors, machine learning, neural network, prediction model, regression tree, support vector machine

Procedia PDF Downloads 127