Search results for: logistic model tree
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17995

17365 Transport Mode Selection under Lead Time Variability and Emissions Constraint

Authors: Chiranjit Das, Sanjay Jharkharia

Abstract:

This study focuses on transport mode selection under lead time variability and an emissions constraint. In order to reduce the carbon emissions generated by transportation, organizations often face a dilemma in transport mode selection, since logistics cost and emissions reduction tend to conflict with each other. Another important aspect of the transportation decision is lead time variability, which is rarely considered in the transport mode selection problem. In this study, we therefore provide a comprehensive analytical mathematical model for transport mode selection under an emissions constraint, and we extend the work by analysing the effect of lead time variability on mode selection through a sensitivity analysis. To account for lead time variability, two identically distributed normal random variables are incorporated into the model, representing unit lead time variability and lead time demand variability. The study thus addresses the following questions: How will transport mode selection decisions be affected by lead time variability? How will lead time variability affect total supply chain cost under carbon emissions? To answer them, a total transportation cost function is developed that includes unit purchasing cost, unit transportation cost, emissions cost, holding cost during lead time, and a penalty cost for stock-outs due to lead time variability. A set of modes is available for transport between nodes; in this paper, we consider four transport modes: air, road, rail, and water. Transportation cost, distance, and emissions level for each transport mode are treated as deterministic and static. Each mode has a different emissions level depending on the distance and product characteristics. Emissions cost is indirectly affected by lead time variability if there is any switching from a lower-emissions transport mode to a higher-emissions transport mode in order to reduce the penalty cost. A numerical analysis is provided to study the effectiveness of the mathematical model. We found that the chance of a stock-out during lead time increases with higher variability of lead time and lead time demand. Numerical results show that the penalty cost of the air transport mode is negative, which implies that the chance of stock-out is zero, but air has higher holding and emissions costs. Therefore, air transport is selected only for emergency orders to reduce the penalty cost; otherwise, rail and road are the preferred modes of transportation. This paper thus contributes to the literature with a novel approach for deciding transport mode under emissions cost and lead time variability. The model can be extended by studying the effect of lead time variability under other strategic transportation issues such as the modal split option, full truck load strategy, and demand consolidation.
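
As an illustration of the kind of per-mode total cost function described above (purchasing + transportation + emissions + lead-time holding + expected stock-out penalty under normally distributed lead-time demand), the sketch below shows one possible formulation. All parameter values and the cycle/order-quantity assumptions are hypothetical placeholders, not the authors' model inputs.

```python
# Minimal illustrative sketch of a per-mode total cost with normal lead-time demand.
from math import sqrt
from scipy.stats import norm

D = 10_000          # annual demand (units)            -- hypothetical
dist = 1_200        # shipping distance (km)           -- hypothetical
h = 2.0             # holding cost per unit per year   -- hypothetical
p = 15.0            # penalty cost per unit short      -- hypothetical
c_tax = 0.05        # emissions cost per kg CO2        -- hypothetical

# mode: (unit purchase cost, transport cost per unit-km, kg CO2 per unit-km,
#        mean lead time in days, lead-time standard deviation)
modes = {
    "air":   (10.0, 0.060, 0.00100, 2,  0.5),
    "road":  (10.0, 0.012, 0.00013, 6,  2.0),
    "rail":  (10.0, 0.008, 0.00003, 9,  3.0),
    "water": (10.0, 0.005, 0.00002, 20, 6.0),
}

d_day, sd_day = D / 365.0, 4.0   # daily demand mean and SD (hypothetical)
k = norm.ppf(0.95)               # safety factor for a 95% cycle service level

def total_cost(cp, ct, e, L, sL):
    mu_L = d_day * L                                   # mean lead-time demand
    sigma_L = sqrt(L * sd_day**2 + (d_day * sL)**2)    # lead-time demand SD
    safety_stock = k * sigma_L
    # expected units short per cycle: sigma * standard normal loss function at k
    shortage = sigma_L * (norm.pdf(k) - k * (1 - norm.cdf(k)))
    cycles = D / mu_L                                  # order quantity assumed = lead-time demand
    return (cp * D                          # purchasing
            + ct * dist * D                 # transportation
            + c_tax * e * dist * D          # emissions
            + h * (mu_L / 2 + safety_stock) # holding during lead time
            + p * shortage * cycles)        # expected stock-out penalty

for m, pars in modes.items():
    print(f"{m:6s} total cost ~ {total_cost(*pars):,.0f}")
```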

Keywords: carbon emissions, inventory theoretic model, lead time variability, transport mode selection

Procedia PDF Downloads 434
17364 Oat Beta Glucan Attenuates the Development of Atherosclerosis and Improves the Intestinal Barrier Function by Reducing Bacterial Endotoxin Translocation in ApoE-/- Mice

Authors: Dalal Alghawas, Jetty Lee, Kaisa Poutanen, Hani El-Nezami

Abstract:

Oat β-glucan, a water-soluble non-starch linear polysaccharide, has been approved as a cholesterol-lowering agent by various food safety administrations and is commonly used to reduce the risk of heart disease. The molecular weight of oat β-glucan can vary depending on the extraction and fractionation methods, and it is not clear whether molecular weight has a significant impact on slowing the development of atherosclerosis. The aim of this study was to investigate the effect of three different oat β-glucan fractions on the development of atherosclerosis in vivo, with special focus on plaque stability and intestinal barrier function. To test this, ApoE-/- female mice were fed a high-fat diet supplemented with oat bran, a high molecular weight (HMW) oat β-glucan fraction, or a low molecular weight (LMW) oat β-glucan fraction for 16 weeks. Atherosclerosis risk markers were measured in the plasma, heart, and aortic tree. Plaque size was measured in the aortic root and aortic tree. ICAM-1, VCAM-1, E-selectin, and P-selectin protein levels were assessed from the aortic tree to determine plaque stability at 16 weeks. The expression of p22phox at the aortic root was evaluated to study the NADPH oxidase complex involved in nitric oxide bioavailability and vascular elasticity. The tight junction proteins E-cadherin and beta-catenin were analysed by western blot as a test of intestinal barrier function. Plasma LPS, intestinal D-lactate levels, and hepatic FMO gene expression were measured to confirm whether a compromised intestinal barrier led to endotoxemia. The oat bran and HMW oat β-glucan diet groups were more effective than the LMW β-glucan diet group at reducing plaque size and showed marked improvements in plaque stability. The intestinal barrier was compromised in all experimental groups; however, endotoxemia levels were higher in the LMW β-glucan diet group. The oat bran and HMW oat β-glucan diet groups were more effective at attenuating the development of atherosclerosis. Possible reasons include the low viscosity of the LMW oat β-glucan in the gut and its inability to block the reabsorption of cholesterol; furthermore, the low viscosity may allow more bacterial endotoxin translocation through the impaired intestinal barrier. In the future, food technologists should carefully consider how to incorporate LMW oat β-glucan as a health-promoting food.

Keywords: Atherosclerosis, beta glucan, endotoxemia, intestinal barrier function

Procedia PDF Downloads 420
17363 Census and Mapping of Oil Palms Over Satellite Dataset Using Deep Learning Model

Authors: Gholba Niranjan Dilip, Anil Kumar

Abstract:

Accurate and reliable mapping of oil palm plantations and a census of individual palm trees are a considerable challenge. This study addresses this challenge by developing an optimized solution that applies deep learning techniques to remote sensing data. The oil palm is a very important tropical crop, and to improve its productivity and land management, it is imperative to have an accurate census over large areas. Since manual census is costly and prone to approximations, a methodology for automated census using panchromatic images from the Cartosat-2, SkySat, and WorldView-3 satellites is demonstrated. Two different study sites in Indonesia were selected. A customized set of training data and ground-truth data was created for this study from Cartosat-2 images. A pre-trained Single Shot MultiBox Detector (SSD) Lite MobileNet V2 Convolutional Neural Network (CNN) from the TensorFlow Object Detection API was subjected to transfer learning on this customized dataset. The SSD model is able to generate bounding boxes for each oil palm and to count palms with good accuracy on the panchromatic images. The detection yielded an F-score of 83.16% on seven different images. The detections were buffered and dissolved to generate polygons demarcating the boundaries of the oil palm plantations. This provided the area under the plantations and maps of their locations, thereby completing the automated census with fairly high accuracy (≈100%). The trained CNN was found competent enough to detect oil palm crowns in images obtained from multiple satellite sensors and of varying temporal vintage, and it helped to estimate the increase in oil palm plantations from 2014 to 2021 in the study area. The study shows that high-resolution panchromatic satellite imagery can successfully be used to undertake a census of oil palm plantations using CNNs.
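
To make the counting step concrete, the sketch below shows how detections from an exported SSD-style model could be filtered by score and counted per image tile. The model path, image path, and score threshold are hypothetical; the output keys follow the usual TensorFlow Object Detection API SavedModel convention and are not taken from the study itself.

```python
# Hedged sketch: counting oil palm crowns from an exported SSD-style SavedModel.
import numpy as np
import tensorflow as tf
from PIL import Image

detect_fn = tf.saved_model.load("exported_ssd_mobilenet_v2/saved_model")   # hypothetical path

image = np.array(Image.open("panchromatic_tile.png").convert("RGB"))       # hypothetical tile
input_tensor = tf.convert_to_tensor(image[np.newaxis, ...], dtype=tf.uint8)

detections = detect_fn(input_tensor)
scores = detections["detection_scores"][0].numpy()
boxes = detections["detection_boxes"][0].numpy()    # normalized [ymin, xmin, ymax, xmax]

keep = scores >= 0.5                                 # score threshold (assumed)
palm_count = int(keep.sum())
print(f"oil palms detected in this tile: {palm_count}")

# The kept boxes could then be buffered and dissolved in a GIS package (e.g., with
# shapely/geopandas) to delineate plantation polygons, as described in the abstract.
```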

Keywords: object detection, oil palm tree census, panchromatic images, single shot multibox detector

Procedia PDF Downloads 160
17362 Public Preferences for Lung Cancer Screening in China: A Discrete Choice Experiment

Authors: Zixuan Zhao, Lingbin Du, Le Wang, Youqing Wang, Yi Yang, Jingjun Chen, Hengjin Dong

Abstract:

Objectives: Few results on public attitudes toward lung cancer screening are available, either in China or abroad. This study aimed to identify preferred lung cancer screening modalities in a Chinese population and to predict uptake rates of different modalities. Materials and Methods: A discrete choice experiment questionnaire was administered to 392 Chinese individuals aged 50–74 years who were at high risk for lung cancer. Each choice set had two lung screening options and an opt-out option, and respondents were asked to choose the one they most preferred. Both mixed logit analysis and stepwise logistic analysis were conducted to explore whether preferences were related to respondent characteristics and to identify which kinds of respondents were more likely to opt out of any screening. Results: In the mixed logit analysis, the attributes that were predictive of choice at the 1% level of statistical significance were the screening interval, screening venue, and out-of-pocket cost. The preferred screening modality appeared to be screening by low-dose computed tomography (LDCT) plus a blood test once a year in a general hospital at a cost of RMB 50; this could increase the uptake rate by 0.40 compared to the baseline setting. In the stepwise logistic regression, those with no endowment insurance were more likely to opt out, while those who were older, housewives/househusbands, those with a health check habit, and those with commercial endowment insurance were less likely to opt out of a screening programme. Conclusions: There was considerable variance between the real and self-perceived risk of lung cancer among respondents, and further research is required in this area. Lung cancer screening uptake can be increased by offering various screening modalities, which can help policymakers further design the screening modality.
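
The sketch below illustrates how predicted uptake of a screening modality can be computed from a simple conditional logit with an opt-out alternative, the basic mechanism behind the uptake figures reported above. The attribute coefficients and utility specification are hypothetical placeholders, not the estimates from this study.

```python
# Illustrative sketch only: predicted uptake from a simple conditional logit.
import numpy as np

def choice_probabilities(utilities):
    """Multinomial logit choice probabilities over the offered alternatives."""
    e = np.exp(np.asarray(utilities, dtype=float))
    return e / e.sum()

# hypothetical coefficients: annual interval, general-hospital venue, out-of-pocket cost
beta = {"annual": 0.6, "general_hospital": 0.4, "cost": -0.01, "asc_screen": 0.5}

def utility(annual, general_hospital, cost_rmb):
    return (beta["asc_screen"]
            + beta["annual"] * annual
            + beta["general_hospital"] * general_hospital
            + beta["cost"] * cost_rmb)

# preferred modality vs. a baseline modality vs. opting out (opt-out utility fixed at 0)
u_preferred = utility(annual=1, general_hospital=1, cost_rmb=50)
u_baseline = utility(annual=0, general_hospital=0, cost_rmb=200)
p_pref, p_base, p_out = choice_probabilities([u_preferred, u_baseline, 0.0])
print(f"predicted uptake: preferred={p_pref:.2f}, baseline={p_base:.2f}, opt-out={p_out:.2f}")
```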

Keywords: lung cancer, screening, China, discrete choice experiment

Procedia PDF Downloads 259
17361 A Crop Growth Subroutine for Watershed Resources Management (WRM) Model

Authors: Kingsley Nnaemeka Ogbu, Constantine Mbajiorgu

Abstract:

Vegetation has a marked effect on runoff and has become an important component in hydrologic models. The Watershed Resources Management (WRM) model, a process-based, continuous, distributed-parameter simulation model developed for hydrologic and soil erosion studies at the watershed scale, lacks a crop growth component and therefore assumes constant vegetation and hydraulic parameter values throughout the duration of a hydrologic simulation. Our approach is to develop a crop growth algorithm based on the original plant growth model used in the Environmental Policy Integrated Climate (EPIC) model. This paper describes the development of a single crop growth model which has the capability of simulating all crops using unique parameter values for each crop. Simulated crop growth processes will reflect the vegetative seasonality of the natural watershed system. An existing model for evaluating vegetative resistance from hydraulic and vegetative parameters was incorporated into the WRM model. The improved WRM model will be able to evaluate the seasonal variation of the vegetative roughness coefficient with depth of flow, further enhancing the hydrologic model's capability for accurate hydrologic studies.
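
As a rough illustration of the kind of EPIC-style daily crop growth loop referred to above (Beer's law light interception and a radiation-use-efficiency biomass increment), the sketch below uses placeholder parameter values and simplified LAI logic; it is not the calibrated WRM subroutine.

```python
# Schematic daily crop growth loop in the spirit of the EPIC plant growth model.
import math

BE = 30.0        # biomass-energy ratio, kg/ha per MJ/m^2 (crop-specific, assumed)
K_LIGHT = 0.65   # canopy light extinction coefficient (assumed)
HUI_MAX = 1600.0 # heat units to maturity (assumed)

biomass, lai, hui = 0.0, 0.01, 0.0
for day in range(1, 151):                          # a 150-day growing season
    solar_mj = 20.0                                # daily solar radiation, MJ/m^2 (assumed)
    heat_units = 12.0                              # daily heat units above base temp (assumed)
    hui = min(hui + heat_units, HUI_MAX)

    par = 0.5 * solar_mj                           # photosynthetically active radiation
    ipar = par * (1.0 - math.exp(-K_LIGHT * lai))  # intercepted PAR (Beer's law)
    biomass += BE * ipar                           # potential daily biomass accumulation

    # very simple LAI development/senescence tied to the heat-unit index (placeholder logic)
    frac = hui / HUI_MAX
    lai = 5.0 * frac if frac < 0.8 else 5.0 * (1.0 - (frac - 0.8) / 0.2)
    lai = max(lai, 0.01)

print(f"season-end above-ground biomass ~ {biomass:,.0f} kg/ha")
```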

Keywords: crop yield, roughness coefficient, PAR, WRM model

Procedia PDF Downloads 409
17360 A Five-Year Follow-up Survey Using Regression Analysis Finds Only Maternal Age to Be a Significant Medical Predictor for Infertility Treatment

Authors: Lea Stein, Sabine Rösner, Alessandra Lo Giudice, Beate Ditzen, Tewes Wischmann

Abstract:

For many couples, having children is a constant life goal; however, it cannot always be fulfilled. Undergoing infertility treatment does not guarantee pregnancies and live births; couples have to deal with miscarriages and sometimes even discontinue infertility treatment. Significant medical predictors for the outcome of infertility treatment have yet to be fully identified. To further our understanding, a cross-sectional five-year follow-up survey was undertaken in which 95 women and 82 men who had been treated at the Women's Hospital of Heidelberg University participated. Binary logistic regressions as well as parametric and non-parametric methods were used to determine the relevance of biological factors (infertility diagnoses, maternal and paternal age) and lifestyle factors (smoking, drinking, overweight and underweight) for the outcome of infertility treatment (clinical pregnancy, live birth, miscarriage, dropout rate). During infertility treatment, 72.6% of couples became pregnant and 69.5% were able to give birth; 27.5% of couples suffered miscarriages, and 20.5% decided to discontinue an unsuccessful fertility treatment. The binary logistic regression models for clinical pregnancies, live births, and dropouts were statistically significant for maternal age, whereas paternal age, maternal and paternal BMI, smoking, infertility diagnoses, and infections showed no significant predictive effect on any of the outcome variables. The results confirm an effect of maternal age on infertility treatment, whereas the relevance of other medical predictors remains unclear. Further investigations should be considered to increase our knowledge of medical predictors.

Keywords: advanced maternal age, assisted reproductive technology, female factor, male factor, medical predictors, infertility treatment, reproductive medicine

Procedia PDF Downloads 109
17359 Numerical Modeling of the Depth-Averaged Flow over a Hill

Authors: Anna Avramenko, Heikki Haario

Abstract:

This paper reports the development and application of a 2D depth-averaged model. The main goal of this contribution is to apply the depth-averaged equations to a wind park model in which the geometry is represented in the mathematical model through mass and momentum source terms. The depth-averaged model will be used in the future to find the optimal position of wind turbines in the wind park. The k-ε and 2D LES turbulence models were considered in this article. 2D CFD simulations over a single hill were performed to check the depth-averaged model in practice.
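
For orientation, a generic form of the 2D depth-averaged (shallow-water-type) equations with mass and momentum source terms is shown below; this is the standard form of the model class named in the abstract, not necessarily the authors' exact formulation, with h the depth, (u, v) the depth-averaged velocities, η the free-surface elevation, and ν_t an eddy viscosity.

```latex
\begin{aligned}
&\frac{\partial h}{\partial t} + \frac{\partial (hu)}{\partial x} + \frac{\partial (hv)}{\partial y} = S_m,\\
&\frac{\partial (hu)}{\partial t} + \frac{\partial (hu^2)}{\partial x} + \frac{\partial (huv)}{\partial y}
  = -gh\,\frac{\partial \eta}{\partial x}
  + \frac{\partial}{\partial x}\!\left(h\,\nu_t \frac{\partial u}{\partial x}\right)
  + \frac{\partial}{\partial y}\!\left(h\,\nu_t \frac{\partial u}{\partial y}\right) + S_x,\\
&\frac{\partial (hv)}{\partial t} + \frac{\partial (huv)}{\partial x} + \frac{\partial (hv^2)}{\partial y}
  = -gh\,\frac{\partial \eta}{\partial y}
  + \frac{\partial}{\partial x}\!\left(h\,\nu_t \frac{\partial v}{\partial x}\right)
  + \frac{\partial}{\partial y}\!\left(h\,\nu_t \frac{\partial v}{\partial y}\right) + S_y.
\end{aligned}
```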

Keywords: depth-averaged equations, numerical modeling, CFD, wind park model

Procedia PDF Downloads 603
17358 Early Gastric Cancer Prediction from Diet and Epidemiological Data Using Machine Learning in Mizoram Population

Authors: Brindha Senthil Kumar, Payel Chakraborty, Senthil Kumar Nachimuthu, Arindam Maitra, Prem Nath

Abstract:

Gastric cancer is predominantly caused by demographic and dietary factors as compared to other cancer types. The aim of the study is to predict Early Gastric Cancer (EGC) from diet and lifestyle factors using supervised machine learning algorithms. For this study, 160 healthy individuals and 80 cases were selected who had been followed for 3 years (2016-2019) at Civil Hospital, Aizawl, Mizoram. A dataset containing 11 features that are core risk factors for gastric cancer was extracted. The supervised machine learning algorithms Logistic Regression, Naive Bayes, Support Vector Machine (SVM), Multilayer Perceptron, and Random Forest were used to analyze the dataset using Python in Jupyter Notebook version 3. The classification results were evaluated using the metrics minimum false positives, Brier score, accuracy, precision, recall, F1 score, and the Receiver Operating Characteristic (ROC) curve. In terms of accuracy (%) and Brier score, the results were: Naive Bayes - 88, 0.11; Random Forest - 83, 0.16; SVM - 77, 0.22; Logistic Regression - 75, 0.25; and Multilayer Perceptron - 72, 0.27. The Naive Bayes algorithm outperforms the others, with a very low false positive rate and Brier score and good accuracy. The Naive Bayes classification results in predicting EGC are very satisfactory using only diet and lifestyle factors, which will be very helpful for physicians in educating patients and the public, so that mortality from gastric cancer can be reduced or avoided with this knowledge mining work.
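
A minimal sketch of the kind of classifier comparison described, evaluated with accuracy and Brier score in scikit-learn, is shown below. The CSV path, column names, and split are placeholders, not the study's actual dataset or pipeline.

```python
# Hedged sketch: the five listed classifiers compared by accuracy and Brier score.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, brier_score_loss

df = pd.read_csv("gastric_risk_factors.csv")      # placeholder file with the 11 risk factors
X, y = df.drop(columns="case"), df["case"]         # 'case' = 1 for EGC, 0 for healthy (assumed)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

models = {
    "Logistic Regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "Naive Bayes": GaussianNB(),
    "SVM": make_pipeline(StandardScaler(), SVC(probability=True)),
    "Multilayer Perceptron": make_pipeline(StandardScaler(), MLPClassifier(max_iter=2000, random_state=0)),
    "Random Forest": RandomForestClassifier(random_state=0),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    proba = model.predict_proba(X_te)[:, 1]
    print(f"{name:22s} accuracy={accuracy_score(y_te, model.predict(X_te)):.2f} "
          f"brier={brier_score_loss(y_te, proba):.2f}")
```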

Keywords: Early Gastric cancer, Machine Learning, Diet, Lifestyle Characteristics

Procedia PDF Downloads 161
17357 UBCSAND Model Calibration for Generic Liquefaction Triggering Curves

Authors: Jui-Ching Chou

Abstract:

Numerical simulation is a popular method used to evaluate the effects of soil liquefaction on a structure or the effectiveness of a mitigation plan. Many constitutive models (the UBCSAND model, PM4 model, SANISAND model, etc.) have been presented to model the liquefaction phenomenon. In general, the inputs of a constitutive model need to be calibrated against the soil cyclic resistance before being applied in the numerical simulation model; simulation results can then be compared with results from simplified liquefaction potential assessment methods. In this article, inputs of the UBCSAND model, a simple elastic-plastic stress-strain model, are calibrated against several popular generic liquefaction triggering curves of simplified liquefaction potential assessment methods via the FLAC program. The calibrated inputs enable engineers to perform a preliminary evaluation of an existing structure or a new design project.

Keywords: calibration, liquefaction, numerical simulation, UBCSAND Model

Procedia PDF Downloads 173
17356 Comparison of Several Peat Qualities as Amendment to Improve Afforestation of Mine Wastes

Authors: Marie Guittonny-Larchevêque

Abstract:

In boreal Canada, industrial activities such as forestry, peat extraction, and metal mining often occur close to one another. At closure, mine waste storage facilities have to be reclaimed. On tailings storage facilities, tree plantations can achieve rapid restoration of forested landscapes. However, trees grow poorly in mine tailings, and organic amendments like peat are required to improve the tailings' structure and nutrients. Canada is a well-known producer of horticultural-quality peat, but some lower-quality peats coming from areas adjacent to the reclaimed mines could allow successful revegetation. In particular, hemic peat coming from the bottom of peat bogs is more decomposed than fibric peat and is less valued for horticulture. Moreover, forest peat is sometimes excavated and piled by the forest industry after cuttings to stimulate tree regeneration on the exposed mineral soil. The objective of this project was to compare the ability of peats of differing quality and origin to improve tailings structure, nutrients, and tree development. A greenhouse experiment was conducted over one growing season in 2016 with a completely randomized block design combining 8 repetitions (blocks) x 2 tree species (Populus tremuloides and Pinus banksiana) x 6 substrates (tailings, commercial horticultural peat, and mixtures of tailings with commercial peat, forest peat, local fibric peat, or local hemic peat) x 2 fertilization levels (with or without mineral fertilization). The tailings used came from a gold mine and were low in sulfur and trace metals. The commercial peat had a slightly acidic pH (around 6), while the other peats had a clearly acidic pH (around 3). However, mixing peat with the slightly alkaline tailings resulted in a pH close to 7 whatever the tested peat. The macroporosity of the mixtures was intermediate between the low values of tailings (4%) and the high values of commercial peat alone (34%). Seedling survival was lower on tailings for poplar compared to all other treatments, with or without fertilization. Survival and growth were similar among all treatments for pine. Fertilization had no impact on the maximal height and diameter of poplar seedlings but changed the relative performance of the substrates. When not fertilized, poplar seedlings grown in commercial peat were the tallest and largest, and those in tailings the smallest and most slender, with intermediate values in the mixtures. When fertilized, poplar seedlings grown in commercial peat were smaller and more slender compared to all other substrates. However, for this species, foliar, shoot, and root biomass production was the greatest in commercial peat and the lowest in tailings compared to all mixtures, whether fertilized or not. The mixture with local fibric peat gave seedlings the lowest foliar N concentrations of all substrates, whatever the species or the fertilization treatment. In the short term, the performance of all the tested peats was similar when mixed with tailings, showing that peats of lower quality could be valorized instead of horticultural peat. These results demonstrate that intersectorial synergies, in accordance with the principles of the circular economy, may be developed in boreal Canada between local industries around the reclamation of mine waste dumps.

Keywords: boreal trees, mine spoil, mine revegetation, intersectorial synergies

Procedia PDF Downloads 250
17355 A Crop Growth Subroutine for Watershed Resources Management (WRM) Model 1: Description

Authors: Kingsley Nnaemeka Ogbu, Constantine Mbajiorgu

Abstract:

Vegetation has a marked effect on runoff and has become an important component in hydrologic models. The Watershed Resources Management (WRM) model, a process-based, continuous, distributed-parameter simulation model developed for hydrologic and soil erosion studies at the watershed scale, lacks a crop growth component and therefore assumes constant vegetation and hydraulic parameter values throughout the duration of a hydrologic simulation. Our approach is to develop a crop growth algorithm based on the original plant growth model used in the Environmental Policy Integrated Climate (EPIC) model. This paper describes the development of a single crop growth model which has the capability of simulating all crops using unique parameter values for each crop. Simulated crop growth processes will reflect the vegetative seasonality of the natural watershed system. An existing model for evaluating vegetative resistance from hydraulic and vegetative parameters was incorporated into the WRM model. The improved WRM model will be able to evaluate the seasonal variation of the vegetative roughness coefficient with depth of flow, further enhancing the hydrologic model's capability for accurate hydrologic studies.

Keywords: runoff, roughness coefficient, PAR, WRM model

Procedia PDF Downloads 378
17354 Improving Fault Tolerance and Load Balancing in Heterogeneous Grid Computing Using Fractal Transform

Authors: Saad M. Darwish, Adel A. El-Zoghabi, Moustafa F. Ashry

Abstract:

The popularity of the Internet and the availability of powerful computers and high-speed networks as low-cost commodity components are changing the way we use computers today. These technical opportunities have led to the possibility of using geographically distributed and multi-owner resources to solve large-scale problems in science, engineering, and commerce. Recent research on these topics has led to the emergence of a new paradigm known as Grid computing. To achieve the promising potential of these tremendous distributed resources, effective and efficient load balancing algorithms are fundamentally important. Unfortunately, load balancing algorithms in traditional parallel and distributed systems, which usually run on homogeneous and dedicated resources, cannot work well in the new circumstances. In this paper, the concept of a fast fractal transform in heterogeneous grid computing, based on an R-tree and the domain-range entropy, is proposed to improve fault tolerance and the load balancing algorithm by accounting for connectivity, communication delay, network bandwidth, resource availability, and resource unpredictability. A novel two-dimensional figure of merit is suggested to describe the network effects on load balance and fault tolerance estimation. Fault tolerance is enhanced by adaptively decreasing replication time and message cost, while load balance is enhanced by adaptively decreasing mean job response time. Experimental results show that the proposed method yields superior performance over other methods.

Keywords: Grid computing, load balancing, fault tolerance, R-tree, heterogeneous systems

Procedia PDF Downloads 490
17353 Stock Market Prediction by Regression Model with Social Moods

Authors: Masahiro Ohmura, Koh Kakusho, Takeshi Okadome

Abstract:

This paper presents a regression model with autocorrelated errors in which the inputs are social moods obtained by analyzing the adjectives in Twitter posts using a document topic model. The regression model predicts Dow Jones Industrial Average (DJIA) more precisely than autoregressive moving-average models.
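
The sketch below illustrates one common way to fit a regression with autocorrelated (AR(1)) errors, using statsmodels' GLSAR. The data here are synthetic placeholders standing in for daily mood scores and the index; the actual study derives moods from a topic model over adjectives in Twitter posts.

```python
# Hedged sketch: regression of a market index on daily "social mood" scores with AR(1) errors.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 250
moods = rng.normal(size=(n, 3))                      # e.g., calm / anxious / happy scores (placeholder)
errors = np.zeros(n)
for t in range(1, n):                                # AR(1) error process
    errors[t] = 0.6 * errors[t - 1] + rng.normal(scale=0.5)
djia = 100 + moods @ np.array([1.5, -0.8, 0.4]) + errors   # synthetic index values

X = sm.add_constant(moods)
model = sm.GLSAR(djia, X, rho=1)                     # regression with AR(1) autocorrelated errors
results = model.iterative_fit(maxiter=10)            # alternate between OLS and rho estimation
print(results.params, "estimated rho:", model.rho)
```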

Keywords: stock market prediction, social moods, regression model, DJIA

Procedia PDF Downloads 548
17352 Structural Equation Modeling Semiparametric Truncated Spline Using Simulation Data

Authors: Adji Achmad Rinaldo Fernandes

Abstract:

SEM analysis is a complex multivariate analysis because it involves a number of exogenous and endogenous variables that are interconnected to form a model. The measurement model is divided into two types, namely the reflective model and the formative model. Before carrying out further tests in SEM, certain assumptions must be met, notably the linearity assumption, which determines the form of the relationship. There are three modeling approaches to path analysis: parametric, nonparametric, and semiparametric. The aim of this research is to develop semiparametric SEM and obtain the best model. The data used in the research are secondary data serving as the basis for generating simulation data. Simulation data were generated with sample sizes of 100, 300, and 500. In the semiparametric SEM analysis, the form of the relationship studied (linear and quadratic) was specified, with one and two knot points, at various levels of error variance (EV = 0.5; 1; 5). Three levels of closeness of relationship were used in the analysis of the measurement model: low (0.1-0.3), medium (0.4-0.6), and high (0.7-0.9). The best model was obtained for the linear form of the relationship between X1 and Y1. In the measurement model, a characteristic of the reflective model was observed, namely that the higher the closeness of the relationship, the better the model obtained. The originality of this research is the development of semiparametric SEM, which has not been widely studied by researchers.
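
For readers unfamiliar with truncated splines, the sketch below builds the truncated power-spline basis that typically underlies such a semiparametric structural relationship (polynomial terms plus truncated terms at the knots). The degree, knot locations, and toy data are illustrative choices, not the paper's fitted values.

```python
# Illustrative sketch: truncated power-spline basis and a least-squares fit of a structural path.
import numpy as np

def truncated_spline_basis(x, degree=2, knots=(0.5,)):
    """Columns: 1, x, ..., x^degree, (x-k1)_+^degree, (x-k2)_+^degree, ..."""
    x = np.asarray(x, dtype=float)
    cols = [x**d for d in range(degree + 1)]                        # polynomial part (incl. intercept)
    cols += [np.clip(x - k, 0.0, None) ** degree for k in knots]    # truncated terms at the knots
    return np.column_stack(cols)

# toy latent scores: a structural path eta_Y = f(eta_X) estimated on the spline basis
eta_X = np.linspace(0, 1, 200)
eta_Y = np.sin(3 * eta_X) + 0.1 * np.random.default_rng(1).normal(size=eta_X.size)

B = truncated_spline_basis(eta_X, degree=2, knots=(0.35, 0.7))      # quadratic, two knots
coef, *_ = np.linalg.lstsq(B, eta_Y, rcond=None)
print("spline coefficients:", np.round(coef, 3))
```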

Keywords: semiparametric SEM, measurement model, structural model, reflective model, formative model

Procedia PDF Downloads 40
17351 Metabolic Predictive Model for PMV Control Based on Deep Learning

Authors: Eunji Choi, Borang Park, Youngjae Choi, Jinwoo Moon

Abstract:

In this study, a predictive model for estimating the metabolic rate (MET) of the human body was developed for optimal control of the indoor thermal environment. Human body images for indoor activities and the corresponding body joint coordinate values were collected as the data sets used in the predictive model. A deep learning algorithm was used for the initial model, and its numbers of hidden layers and hidden neurons were optimized. Finally, the prediction performance was analyzed after the model was trained on the collected data. In conclusion, the feasibility of MET prediction was confirmed, and developing more varied data and refining the predictive model were proposed as directions for future study.
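
A minimal sketch of the kind of network described, a fully connected model regressing MET from flattened joint coordinates, is given below. The number of joints, layer widths, and training data are assumptions for illustration, not the authors' optimized architecture.

```python
# Hedged sketch: dense network regressing MET from body-joint coordinates (placeholder data).
import numpy as np
import tensorflow as tf

n_joints = 17                                                       # e.g., a COCO-style skeleton (assumed)
X = np.random.rand(1000, n_joints * 2).astype("float32")            # (x, y) per joint, placeholder data
y = np.random.uniform(0.8, 4.0, size=(1000, 1)).astype("float32")   # MET targets, placeholder

model = tf.keras.Sequential([
    tf.keras.Input(shape=(n_joints * 2,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),                                        # predicted MET
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2, verbose=0)
print("MAE on the placeholder data:", model.evaluate(X, y, verbose=0)[1])
```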

Keywords: deep learning, indoor quality, metabolism, predictive model

Procedia PDF Downloads 257
17350 Comparing Performance of Neural Network and Decision Tree in Prediction of Myocardial Infarction

Authors: Reza Safdari, Goli Arji, Robab Abdolkhani, Maryam Zahmatkeshan

Abstract:

Background and purpose: Cardiovascular diseases are among the most common diseases in all societies. The most important step in minimizing myocardial infarction and its complications is to minimize its risk factors. The amount of medical data is growing rapidly, and medical data mining has great potential for transforming these data into information. Using data mining techniques to generate predictive models for identifying those at risk is very helpful for reducing the effects of the disease. The present study aimed to collect data related to risk factors of myocardial infarction from patients' medical records and to develop predictive models using data mining algorithms. Methods: The present work was an analytical study conducted on a database containing 350 records. Data were related to patients admitted to Shahid Rajaei specialized cardiovascular hospital, Iran, in 2011, and were collected using a four-section data collection form. Data analysis was performed using SPSS and Clementine version 12. Seven predictive algorithms and one algorithm-based model for predicting association rules were applied to the data. Accuracy, precision, sensitivity, specificity, and positive and negative predictive values were determined, and the final model was obtained. Results: Five parameters, including hypertension, DLP, tobacco smoking, diabetes, and A+ blood group, were the most critical risk factors of myocardial infarction. Among the models, the neural network model was found to have the highest sensitivity, indicating its ability to successfully diagnose the disease. Conclusion: Risk prediction models have great potential for facilitating the management of a patient with a specific disease; health interventions or changes in lifestyle can be based on these models to improve the health of individuals at risk.
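
To make the evaluation metrics concrete, the sketch below compares a decision tree and a neural network by sensitivity, specificity, and predictive values derived from the confusion matrix. The CSV path and outcome column are placeholders; the study itself used SPSS and Clementine rather than Python.

```python
# Hedged sketch: decision tree vs. neural network scored by confusion-matrix metrics.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import confusion_matrix

df = pd.read_csv("mi_risk_factors.csv")                   # placeholder dataset (350 records)
X, y = df.drop(columns="mi"), df["mi"]                    # 'mi' = 1 if myocardial infarction (assumed)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=1)

for name, clf in [("decision tree", DecisionTreeClassifier(random_state=1)),
                  ("neural network", MLPClassifier(max_iter=2000, random_state=1))]:
    clf.fit(X_tr, y_tr)
    tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
    sens, spec = tp / (tp + fn), tn / (tn + fp)           # sensitivity, specificity
    ppv, npv = tp / (tp + fp), tn / (tn + fn)             # positive / negative predictive values
    print(f"{name}: sensitivity={sens:.2f} specificity={spec:.2f} PPV={ppv:.2f} NPV={npv:.2f}")
```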

Keywords: decision trees, neural network, myocardial infarction, Data Mining

Procedia PDF Downloads 429
17349 Model Averaging in a Multiplicative Heteroscedastic Model

Authors: Alan Wan

Abstract:

In recent years, the body of literature on frequentist model averaging in statistics has grown significantly. Most of this work focuses on models with different mean structures but leaves out the variance consideration. In this paper, we consider a regression model with multiplicative heteroscedasticity and develop a model averaging method that combines maximum likelihood estimators of unknown parameters in both the mean and variance functions of the model. Our weight choice criterion is based on a minimisation of a plug-in estimator of the model average estimator's squared prediction risk. We prove that the new estimator possesses an asymptotic optimality property. Our investigation of finite-sample performance by simulations demonstrates that the new estimator frequently exhibits very favourable properties compared to some existing heteroscedasticity-robust model average estimators. The model averaging method hedges against the selection of very bad models and serves as a remedy to variance function misspecification, which often discourages practitioners from modeling heteroscedasticity altogether. The proposed model average estimator is applied to the analysis of two real data sets.

Keywords: heteroscedasticity-robust, model averaging, multiplicative heteroscedasticity, plug-in, squared prediction risk

Procedia PDF Downloads 384
17348 Reliability Prediction of Tires Using Linear Mixed-Effects Model

Authors: Myung Hwan Na, Ho-Chun Song, EunHee Hong

Abstract:

Normal linear mixed-effects models are widely used to analyze repeated-measurement data. When heteroscedasticity and non-normality of the population distribution are detected at the same time, the normal linear mixed-effects model can give improper analysis results. To achieve more robust estimation, we use a heavy-tailed linear mixed-effects model, which gives more exact and reliable conclusions than the standard normal linear mixed-effects model.

Keywords: reliability, tires, field data, linear mixed-effects model

Procedia PDF Downloads 563
17347 Land Use, Land Cover Changes and Woody Vegetation Status of Tsimur Saint Gebriel Monastery, in Tigray Region, Northern Ethiopia

Authors: Abraha Hatsey, Nesibu Yahya, Abeje Eshete

Abstract:

The Ethiopian Orthodox Tewahido Church has a long tradition of conserving church vegetation, and church compounds serve as refuges for many endangered indigenous tree species in Northern Ethiopia. Although around 36,000 churches exist in Ethiopia, only a few have been studied so far. This study therefore assessed the land use and land cover change within a 3 km buffer (1986-2018) and the woody species diversity and regeneration status of the Tsimur St. Gebriel monastery in the Tigray region, Northern Ethiopia. For the vegetation study, systematic sampling was used with 100 m spacing between plots and between transects. The plot size was 20 m x 20 m for the main plot, with two subplots (5 m x 5 m each) for the regeneration study. Tree height, diameter at breast height (DBH), and crown area were measured in the main plot for all trees with DBH ≥ 5 cm; in the subplots, all seedlings and saplings with DBH < 5 cm were counted. The data were analyzed in Excel and in the Pass biodiversity software for diversity and evenness analysis. The major land cover classes identified were bare land, farmland, forest, shrubland, and wetland. The extents of forest and shrubland declined considerably due to bare land and agricultural land expansion within the 3 km buffer, indicating increasing pressure on the church forest. Regarding the vegetation status, a total of 19 species belonging to 13 families were recorded in the monastery. The diversity (H') and evenness recorded were 2.4 and 0.5, respectively. The tree density (DBH ≥ 5 cm) was 336 stems/ha, with a crown cover of 65%. Olea europaea was the dominant species (6.4 m²/ha of the 10.5 m² total basal area) and the most frequent (100%), with good regeneration in the monastery. The remaining species are less frequent and are mostly confined to water sources with good site conditions. Juniperus procera (overharvested) and the other indigenous species had few trees left and showed no or very poor regeneration. The species with poor density, frequency, and regeneration (Juniperus procera, Nuxia congesta, and Jasminum abyssinicum) need priority conservation and enrichment planting. The indigenous species could also serve as a potential seed source for the reproduction and restoration of nearby degraded landscapes. The buffer study also demonstrated expansion of agriculture and bare land, which could be a threat to the forest of the isolated monastery. Hence, restoring the buffer zone is the only guarantee for the healthy existence of the church forest.
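
For reference, the reported diversity metrics can be computed from species abundance counts as in the sketch below (Shannon H' and Pielou's evenness J = H'/ln S). The counts used here are hypothetical, not the monastery inventory data.

```python
# Sketch of Shannon diversity (H') and evenness from species abundance counts.
import math

abundances = {"Olea europaea": 180, "Juniperus procera": 12, "Nuxia congesta": 8,
              "Jasminum abyssinicum": 5, "other species": 60}   # stems per species (hypothetical)

total = sum(abundances.values())
proportions = [n / total for n in abundances.values()]
shannon_h = -sum(p * math.log(p) for p in proportions)          # H'
evenness = shannon_h / math.log(len(abundances))                # J = H' / ln(S)
print(f"H' = {shannon_h:.2f}, evenness = {evenness:.2f}")
```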

Keywords: church forests, regeneration, land use change, vegetation status

Procedia PDF Downloads 205
17346 Evaluating the Accuracy of Biologically Relevant Variables Generated by ClimateAP

Authors: Jing Jiang, Wenhuan Xu, Lei Zhang, Shiyi Zhang, Tongli Wang

Abstract:

Climate data quality significantly affects the reliability of ecological modeling. In the Asia Pacific (AP) region, low-quality climate data hinders ecological modeling. ClimateAP, a software developed in 2017, generates high-quality climate data for the AP region, benefiting researchers in forestry and agriculture. However, its adoption remains limited. This study aims to confirm the validity of biologically relevant variable data generated by ClimateAP during the normal climate period through comparison with the currently available gridded data. Climate data from 2,366 weather stations were used to evaluate the prediction accuracy of ClimateAP in comparison with the commonly used gridded data from WorldClim1.4. Univariate regressions were applied to 48 monthly biologically relevant variables, and the relationship between the observational data and the predictions made by ClimateAP and WorldClim was evaluated using Adjusted R-Squared and Root Mean Squared Error (RMSE). Locations were categorized into mountainous and flat landforms, considering elevation, slope, ruggedness, and Topographic Position Index. Univariate regressions were then applied to all biologically relevant variables for each landform category. Random Forest (RF) models were implemented for the climatic niche modeling of Cunninghamia lanceolata. A comparative analysis of the prediction accuracies of RF models constructed with distinct climate data sources was conducted to evaluate their relative effectiveness. Biologically relevant variables were obtained from three unpublished Chinese meteorological datasets. ClimateAPv3.0 and WorldClim predictions were obtained from weather station coordinates and WorldClim1.4 rasters, respectively, for the normal climate period of 1961-1990. Occurrence data for Cunninghamia lanceolata came from integrated biodiversity databases with 3,745 unique points. ClimateAP explains a minimum of 94.74%, 97.77%, 96.89%, and 94.40% of monthly maximum, minimum, average temperature, and precipitation variances, respectively. It outperforms WorldClim in 37 biologically relevant variables with lower RMSE values. ClimateAP achieves higher R-squared values for the 12 monthly minimum temperature variables and consistently higher Adjusted R-squared values across all landforms for precipitation. ClimateAP's temperature data yields lower Adjusted R-squared values than gridded data in high-elevation, rugged, and mountainous areas but achieves higher values in mid-slope drainages, plains, open slopes, and upper slopes. Using ClimateAP improves the prediction accuracy of tree occurrence from 77.90% to 82.77%. The biologically relevant climate data produced by ClimateAP is validated based on evaluations using observations from weather stations. The use of ClimateAP leads to an improvement in data quality, especially in non-mountainous regions. The results also suggest that using biologically relevant variables generated by ClimateAP can slightly enhance climatic niche modeling for tree species, offering a better understanding of tree species adaptation and resilience compared to using gridded data.
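
The station-level comparison described above amounts to regressing observed monthly values on each data source's predictions and reporting adjusted R² and RMSE; a minimal sketch is shown below. The arrays are synthetic placeholders, not the 2,366-station dataset.

```python
# Sketch: adjusted R^2 (from a univariate regression) and RMSE for two prediction sources.
import numpy as np

def adjusted_r2_and_rmse(observed, predicted, n_predictors=1):
    observed, predicted = np.asarray(observed), np.asarray(predicted)
    slope, intercept = np.polyfit(predicted, observed, 1)     # univariate regression
    fitted = slope * predicted + intercept
    ss_res = np.sum((observed - fitted) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    r2 = 1 - ss_res / ss_tot
    n = observed.size
    adj_r2 = 1 - (1 - r2) * (n - 1) / (n - n_predictors - 1)
    rmse = np.sqrt(np.mean((observed - predicted) ** 2))
    return adj_r2, rmse

rng = np.random.default_rng(42)
obs = rng.normal(15, 8, size=500)                    # observed mean temperature (placeholder)
climateap = obs + rng.normal(0, 1.0, size=500)       # ClimateAP-like predictions (placeholder)
worldclim = obs + rng.normal(0, 2.5, size=500)       # coarser gridded predictions (placeholder)

for name, pred in [("ClimateAP", climateap), ("WorldClim", worldclim)]:
    adj_r2, rmse = adjusted_r2_and_rmse(obs, pred)
    print(f"{name}: adjusted R^2 = {adj_r2:.3f}, RMSE = {rmse:.2f}")
```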

Keywords: climate data validation, data quality, Asia pacific climate, climatic niche modeling, random forest models, tree species

Procedia PDF Downloads 68
17345 Modeling Geogenic Groundwater Contamination Risk with the Groundwater Assessment Platform (GAP)

Authors: Joel Podgorski, Manouchehr Amini, Annette Johnson, Michael Berg

Abstract:

One-third of the world’s population relies on groundwater for its drinking water. Natural geogenic arsenic and fluoride contaminate ~10% of wells. Prolonged exposure to high levels of arsenic can result in various internal cancers, while high levels of fluoride are responsible for the development of dental and crippling skeletal fluorosis. In poor urban and rural settings, the provision of drinking water free of geogenic contamination can be a major challenge. In order to efficiently apply limited resources in the testing of wells, water resource managers need to know where geogenically contaminated groundwater is likely to occur. The Groundwater Assessment Platform (GAP) fulfills this need by providing state-of-the-art global arsenic and fluoride contamination hazard maps as well as enabling users to create their own groundwater quality models. The global risk models were produced by logistic regression of arsenic and fluoride measurements using predictor variables of various soil, geological and climate parameters. The maps display the probability of encountering concentrations of arsenic or fluoride exceeding the World Health Organization’s (WHO) stipulated concentration limits of 10 µg/L or 1.5 mg/L, respectively. In addition to a reconsideration of the relevant geochemical settings, these second-generation maps represent a great improvement over the previous risk maps due to a significant increase in data quantity and resolution. For example, there is a 10-fold increase in the number of measured data points, and the resolution of predictor variables is generally 60 times greater. These same predictor variable datasets are available on the GAP platform for visualization as well as for use with a modeling tool. The latter requires that users upload their own concentration measurements and select the predictor variables that they wish to incorporate in their models. In addition, users can upload additional predictor variable datasets either as features or coverages. Such models can represent an improvement over the global models already supplied, since (a) users may be able to use their own, more detailed datasets of measured concentrations and (b) the various processes leading to arsenic and fluoride groundwater contamination can be isolated more effectively on a smaller scale, thereby resulting in a more accurate model. All maps, including user-created risk models, can be downloaded as PDFs. There is also the option to share data in a secure environment as well as the possibility to collaborate in a secure environment through the creation of communities. In summary, GAP provides users with the means to reliably and efficiently produce models specific to their region of interest by making available the latest datasets of predictor variables along with the necessary modeling infrastructure.

Keywords: arsenic, fluoride, groundwater contamination, logistic regression

Procedia PDF Downloads 348
17344 Towards a Measurement-Based E-Government Portals Maturity Model

Authors: Abdoullah Fath-Allah, Laila Cheikhi, Rafa E. Al-Qutaish, Ali Idri

Abstract:

The emerging concept of e-government transforms the way citizens deal with their governments: citizens can execute the intended services online, anytime and anywhere. This results in great benefits for both governments (a reduced number of officers needed) and citizens (more flexibility and time savings). Therefore, building a maturity model to assess e-government portals is desirable to help in the improvement of such portals. This paper proposes an e-government maturity model based on measuring the presence of best practices. The main benefit of such a maturity model is to provide a way to rank an e-government portal based on the best practices it uses, along with a set of recommendations for moving to the next stage of the maturity model.

Keywords: best practices, e-government portal, maturity model, quality model

Procedia PDF Downloads 338
17343 Machine Learning Techniques in Bank Credit Analysis

Authors: Fernanda M. Assef, Maria Teresinha A. Steiner

Abstract:

The aim of this paper is to compare and discuss classifier algorithm options for credit risk assessment by applying different machine learning techniques. Using records from a Brazilian financial institution, this study uses a database of 5,432 companies that are clients of the bank, of which 2,600 are classified as non-defaulters, 1,551 as defaulters, and 1,281 as temporarily defaulters, meaning that these clients are overdue on their payments for up to 180 days. For each case, a total of 15 attributes was considered for a one-against-all assessment using four different techniques: Artificial Neural Networks Multilayer Perceptron (ANN-MLP), Artificial Neural Networks Radial Basis Functions (ANN-RBF), Logistic Regression (LR), and Support Vector Machines (SVM). The data were first coded using a thermometer code (numerical attributes) or dummy coding (nominal attributes). For each method, different parameters were analyzed, and the best result of each technique was then compared in terms of accuracy, false positives, false negatives, true positives, and true negatives. This comparison showed that the best method in terms of accuracy was ANN-RBF (79.20% for non-defaulter classification, 97.74% for defaulters, and 75.37% for temporarily defaulters). However, the best accuracy does not always indicate the best technique. For instance, in the classification of temporarily defaulters, this technique was surpassed in terms of false positives by SVM, which had the lowest rate (0.07%) of false positive classifications. All these details are discussed in light of the results found, and an overview of the findings is given in the conclusion of this study.

Keywords: artificial neural networks (ANNs), classifier algorithms, credit risk assessment, logistic regression, machine Learning, support vector machines

Procedia PDF Downloads 103
17342 Variation in Wood Anatomical Properties of Acacia seyal var. seyal Tree Species Growing in Different Zones in Sudan

Authors: Hanadi Mohamed Shawgi Gamal, Ashraf Mohamed Ahmed Abdalla

Abstract:

Sudan is endowed with a great diversity of tree species; nevertheless, the utilization of wood resources has traditionally concentrated on a small number of species. Given the great variation in the climatic zones of Sudan, considerable variation in anatomical properties is expected both between and within species. This variation needs to be fully explored in order to suggest the best uses for each species. Modern research on wood has substantiated that the climatic conditions under which a species grows have a significant effect on wood properties. Understanding the extent of variability in wood is important because the uses for each kind of wood are related to its characteristics; furthermore, the suitability or quality of wood for a particular purpose is determined by the variability of one or more of these characteristics. The present study examines the effect of rainfall zone on some anatomical properties of Acacia seyal var. seyal growing in Sudan. For this purpose, twenty healthy trees were collected randomly from two zones (ten trees per zone): one zone with relatively low rainfall (273 mm annually), represented by North Kordofan state and White Nile state, and a second with relatively high rainfall (701 mm annually), represented by Blue Nile state and South Kordofan state. From each sampled tree, a stem disc (3 cm thick) was cut at 10% of stem height, and one radius was obtained from the central stem disc. Two representative samples were taken from each disc, one at a 10% distance from pith to bark and the second at 90%, in order to represent juvenile and mature wood. The investigated anatomical properties were fiber length, fiber and vessel diameter, lumen diameter, and wall thickness, as well as cell proportions. The results reveal significant differences between zones in mature wood vessel diameter and wall thickness, as well as in juvenile wood vessel wall thickness, with the higher values detected in the drier zone. Significant differences were also observed in juvenile wood fiber length, diameter, and wall thickness; contrary to the vessel dimensions, fiber length, diameter, and wall thickness decreased in the drier zone. No significant differences were detected in the cell proportions of juvenile and mature wood. The significant differences in some fiber and vessel dimensions suggest corresponding differences in wood density. From these results, Acacia seyal var. seyal appears to be well adapted to changes in rainfall and may survive in any rainfall zone.

Keywords: Acacia seyal var. seyal, anatomical properties, rainfall zones, variation

Procedia PDF Downloads 148
17341 Bringing the World to Net Zero Carbon Dioxide by Sequestering Biomass Carbon

Authors: Jeffrey A. Amelse

Abstract:

Many corporations aspire to become Net Zero Carbon Dioxide by 2035-2050. This paper examines what it will take to achieve those goals. Achieving Net Zero CO₂ requires an understanding of where energy is produced and consumed, the magnitude of CO₂ generation, and a proper understanding of the Carbon Cycle. The latter leads to the distinction between CO₂ and biomass carbon sequestration. Short reviews are provided of technologies previously proposed for reducing CO₂ emissions from fossil fuels or substituting renewable energy, to focus on their limitations and to show that none offers a complete solution. Of these, CO₂ sequestration is poised to have the largest impact. It will just cost money, scale-up is a huge challenge, and it will not be a complete solution. CO₂ sequestration is still at the demonstration and semi-commercial scale. Transportation accounts for only about 30% of total U.S. energy demand, and renewables account for only a small fraction of that sector. Yet, bioethanol production consumes 40% of the U.S. corn crop, and biodiesel consumes 30% of U.S. soybeans. It is unrealistic to believe that biofuels can completely displace fossil fuels in the transportation market. Bioethanol is traced through its Carbon Cycle and shown to be both energy inefficient and an inefficient use of biomass carbon. Both biofuels and CO₂ sequestration reduce future CO₂ emissions from continued use of fossil fuels; they will not remove CO₂ already in the atmosphere. Planting more trees has been proposed as a way to reduce atmospheric CO₂, but trees are a temporary solution: when they complete their Carbon Cycle, they die and release their carbon as CO₂ to the atmosphere. Thus, planting more trees is just 'kicking the can down the road.' The only way to permanently remove CO₂ already in the atmosphere is to break the Carbon Cycle by growing biomass from atmospheric CO₂ and sequestering biomass carbon. Sequestering tree leaves is proposed as a solution. Unlike wood, leaves have a short Carbon Cycle time constant; they renew and decompose every year. Allometric equations from the USDA indicate that, theoretically, sequestering only a fraction of the world's tree leaves could get the world to Net Zero CO₂ without disturbing the underlying forests. How can tree leaves be permanently sequestered? It may be as simple as rethinking how landfills are designed, to discourage instead of encourage decomposition. In traditional landfills, municipal waste undergoes rapid initial aerobic decomposition to CO₂, followed by slow anaerobic decomposition to methane and CO₂. The latter can take hundreds to thousands of years. The first step in anaerobic decomposition is hydrolysis of cellulose to release sugars, which those who have worked on cellulosic ethanol know is challenging for a number of reasons. The key to permanent leaf sequestration may be keeping the landfills dry and exploiting known inhibitors of anaerobic bacteria.

Keywords: carbon dioxide, net zero, sequestration, biomass, leaves

Procedia PDF Downloads 128
17340 Search for Flavour Changing Neutral Current Couplings of Higgs-up Sector Quarks at Future Circular Collider (FCC-eh)

Authors: I. Turk Cakir, B. Hacisahinoglu, S. Kartal, A. Yilmaz, A. Yilmaz, Z. Uysal, O. Cakir

Abstract:

In the search for new physics beyond the Standard Model, Flavour Changing Neutral Currents (FCNC) form a promising research field in terms of observability at future colliders. Increased Higgs production at higher collider energy and luminosity is essential for verifying or falsifying our knowledge of physics and its predictions, and for the search for new physics. The prospective electron-proton collider constituent of the Future Circular Collider project is the FCC-eh, which offers great sensitivity due to its high luminosity and low interference. In this work, the thq FCNC interaction vertex with off-shell top quark decay at electron-proton colliders is studied. Using the MadGraph5_aMC@NLO multi-purpose event generator, the observability of the tuh and tch couplings is obtained in the equal-coupling scenario. The upper limit on the branching ratio of the tree-level top quark FCNC decay is determined as 0.012% at the FCC-eh with 1 ab⁻¹ of luminosity.

Keywords: FCC, FCNC, Higgs Boson, Top Quark

Procedia PDF Downloads 212
17339 Multiplying Vulnerability of Child Health Outcome and Food Diversity in India

Authors: Mukesh Ravi Raushan

Abstract:

Although obesity is considered a deadly public health issue, contributing to 2.6 million deaths worldwide every year, a developing country like India still faces malnutrition, which is more common there than in Sub-Saharan Africa: about one in every three malnourished children in the world lives in India. The paper assesses nutritional health among children using data on a total of 43,737 infants and young children aged 0-59 months (µ = 29.54; SD = 17.21) from the households selected by the National Family Health Survey, 2005-06. Wasting was measured by the Z-score of standardized weight-for-height according to the WHO child growth standards. The impact of education together with place of residence was found to be significantly associated with the complementary food diversity score (CFDS) in India. The education of the mother was positively associated with the CFDS, but the strength of the association was lower in rural India than in urban India. The results of the binary logistic regression of wasting on the seven WHO-recommended types of complementary food for children in India suggest that children who consumed milk products (OR: 0.87, p<0.0001) were less likely to be malnourished than their counterparts who did not, and likewise children who consumed seed products (OR: 0.75, p<0.0001) were less likely to be malnourished than those who did not. Wasting among children was also negatively associated with protein-containing complementary food: children who received pulses in the last 24 hours were less likely to be wasted (OR: 0.87, p<0.00001) compared to the reference categories. When the frequency of feeding the index child increases by 10 per cent, the expected change in child health outcome in terms of wasting decreases by 2 per cent in India, controlling for place of residence, education, religion, and birth order. As the index improves, the risk of malnutrition among children in India decreases.

Keywords: CFDS, food diversity index, India, logistic regression

Procedia PDF Downloads 261
17338 Comparison of Cervical Length Using Transvaginal Ultrasonography and Bishop Score to Predict Successful Induction

Authors: Lubena Achmad, Herman Kristanto, Julian Dewantiningrum

Abstract:

Background: The Bishop score is a standard method used to predict the success of induction, but the examination tends to be subjective, with high inter- and intra-observer variability, so it is presumed to have a low predictive value for the outcome of labor induction. Cervical length measurement using transvaginal ultrasound is considered a more objective assessment of the cervix; it is not a complicated procedure and is less invasive than digital vaginal examination. Objective: To compare transvaginal ultrasound and the Bishop score in predicting successful induction. Methods: This was a prospective cohort study. One hundred and twenty women with singleton pregnancies undergoing induction of labor at 37–42 weeks who met the inclusion and exclusion criteria were enrolled. Cervical assessment by both transvaginal ultrasound and the Bishop score was conducted prior to induction. Successful labor induction was defined as the ability to reach the active phase ≤ 12 hours after induction. To determine the best cut-off points for cervical length and the Bishop score, receiver operating characteristic (ROC) curves were plotted, and logistic regression analysis was used to determine which factors best predicted induction success. Results: This study showed significant differences in terms of age, premature rupture of the membranes, the Bishop score, cervical length, and funneling as predictors of successful induction. Using ROC curves, the best cut-off point for the prediction of successful induction was found to be 25.45 mm for cervical length and 3 for the Bishop score. Logistic regression showed that only premature rupture of the membranes and cervical length ≤ 25.45 mm significantly predicted the success of labor induction. Excluding premature rupture of the membranes as the indication for induction, a cervical length of less than 25.3 mm was a better predictor of successful induction. Conclusion: Compared to the Bishop score, cervical length measured by transvaginal ultrasound was a better predictor of successful induction.
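
The sketch below shows one common way such a cut-off is read from a ROC curve, by maximizing Youden's J (sensitivity + specificity - 1). The data are synthetic placeholders; since a shorter cervix predicts successful induction, the cervical length is negated so that a higher score corresponds to the positive class.

```python
# Sketch: choosing a cervical-length cut-off from a ROC curve via Youden's J (placeholder data).
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(7)
success = rng.integers(0, 2, size=120)                       # 1 = active phase within 12 h
cervical_len = np.where(success == 1,
                        rng.normal(23, 5, size=120),         # shorter when induction succeeds
                        rng.normal(30, 5, size=120))

score = -cervical_len                                        # higher score = shorter cervix
fpr, tpr, thresholds = roc_curve(success, score)
j = tpr - fpr                                                # Youden's J at each threshold
best = np.argmax(j)
print(f"AUC = {roc_auc_score(success, score):.2f}, "
      f"best cut-off ~ {-thresholds[best]:.1f} mm (J = {j[best]:.2f})")
```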

Keywords: Bishop Score, cervical length, induction, successful induction, transvaginal sonography

Procedia PDF Downloads 325
17337 CFD Simulation of a Large Scale Unconfined Hydrogen Deflagration

Authors: I. C. Tolias, A. G. Venetsanos, N. Markatos

Abstract:

In the present work, CFD simulations of a large-scale open deflagration experiment are performed. A stoichiometric hydrogen-air mixture occupies a 20 m hemisphere. Two combustion models are compared and evaluated against the experiment: the Eddy Dissipation Model and a multi-physics combustion model based on Yakhot's equation for the turbulent flame speed. The values of the models' critical parameters are investigated. The effect of the turbulence model is also examined: the k-ε model and an LES approach were tested.

Keywords: CFD, deflagration, hydrogen, combustion model

Procedia PDF Downloads 502
17336 Impact of Meteorological Factors on Influenza Activity in Pakistan; A Tale of Two Cities

Authors: Nadia Nisar

Abstract:

Background: In temperate regions, influenza activity occurs sporadically all year round, with peaks coinciding with the cold months. Meteorological and environmental conditions play a significant role in the transmission of influenza globally. In this study, we assessed the relationship between meteorological parameters and influenza activity in two geographical areas of Pakistan. Methods: Influenza data were collected from the Islamabad (north) and Multan (south) regions of the national influenza surveillance system during 2010-2015. The meteorological database was obtained from the National Climatic Data Center (Pakistan). A logistic regression model with a stepwise approach was used to explore the relationship between meteorological parameters and influenza peaks. In the statistical model, we used the weekly proportion of laboratory-confirmed influenza-positive samples to represent influenza activity, with the meteorological parameters as covariates (temperature, humidity, and precipitation). We also evaluated the link between the environmental conditions associated with seasonal influenza epidemics: 'cold-dry' and 'humid-rainy'. Results: We found that temperature and humidity were positively associated with influenza at both the northern and southern locations (OR = 0.927 (0.88-0.97)) and (OR = 1.078 (1.027-1.132)), and (OR = 1.023 (1.008-1.037)) and (OR = 0.978 (0.964-0.992)), respectively, whilst precipitation was negatively associated with influenza (OR = 1.054 (1.039-1.070)) and (OR = 0.949 (0.935-0.963)). In both regions, temperature and humidity contributed more to the model than precipitation. The p-values for all climate parameters were <0.05 by independent-sample t-test. These results demonstrate significant relationships between climate factors and influenza infection, with correlation coefficients of 0.52-0.90. The total contribution of these three climatic variables accounted for 89.04%. The reported number of influenza cases increased sharply during the cold-dry season (i.e., winter), when humidity and temperature are at minimal levels. Conclusion: Our findings showed that measures of temperature, humidity, and the cold-dry season (winter) can be used as indicators to forecast influenza infections. Therefore, integrating meteorological parameters into influenza forecasting in the surveillance system may benefit public health efforts to reduce the burden of seasonal influenza. More studies are necessary to understand the role of these parameters in viral transmission and host susceptibility.

Keywords: influenza, climate, meteorological, environmental

Procedia PDF Downloads 200