Search results for: regression models drone
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9386

7796 Studying the Effects of Economic and Financial Development as Well as Institutional Quality on Environmental Destruction in the Upper-Middle Income Countries

Authors: Morteza Raei Dehaghi, Seyed Mohammad Mirhashemi

Abstract:

The current study explored the effects of economic development, financial development, and institutional quality on environmental destruction in upper-middle-income countries over the period 1999-2011. The dependent variable is the logarithm of carbon dioxide emissions, which can be considered an index of environmental destruction or quality given its effects on the environment. Financial development and institutional development variables, as well as some control variables, were included. To test for cross-sectional correlation among the countries under study, the Pesaran and Friz tests were used. Since the results of both tests show cross-sectional correlation among the countries under study, the seemingly unrelated regression method was used for model estimation. The results show that the environmental Kuznets curve hypothesis is confirmed in upper-middle-income countries and that financial development and institutional quality have a significant effect on environmental quality. These results can inform policy makers in countries across income groups aiming for growth accompanied by improved environmental quality.

Keywords: economic development, environmental destruction, financial development, institutional development, seemingly unrelated regression

Procedia PDF Downloads 348
7795 A Sliding Mesh Technique and Compressibility Correction Effects of Two-Equation Turbulence Models for a Pintle-Perturbed Flow Analysis

Authors: J. Y. Heo, H. G. Sung

Abstract:

Numerical simulations have been performed to assess the compressibility corrections of two-equation turbulence models suitable for large-scale separated flows perturbed by pintle strokes. To take pintle movement into account, a sliding mesh method was applied. The chamber pressure, mass flow rate, and thrust were analyzed, and the response lag and sensitivity at the chamber and nozzle were estimated for a movable pintle. Nozzle performance during pintle reciprocation, i.e., the insertion and extraction processes, was analyzed to better understand the dynamic performance of the pintle nozzle.

Keywords: pintle, sliding mesh, turbulence model, compressibility correction

Procedia PDF Downloads 490
7794 Hate Speech Detection Using Deep Learning and Machine Learning Models

Authors: Nabil Shawkat, Jamil Saquer

Abstract:

Social media has accelerated our ability to engage with others and eliminated many communication barriers. On the other hand, the widespread use of social media resulted in an increase in online hate speech. This has drastic impacts on vulnerable individuals and societies. Therefore, it is critical to detect hate speech to prevent innocent users and vulnerable communities from becoming victims of hate speech. We investigate the performance of different deep learning and machine learning algorithms on three different datasets. Our results show that the BERT model gives the best performance among all the models by achieving an F1-score of 90.6% on one of the datasets and F1-scores of 89.7% and 88.2% on the other two datasets.
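The comparison above ranks models by F1-score, the harmonic mean of precision and recall. A minimal sketch of that metric from confusion counts follows; the confusion counts are hypothetical illustrations, not the study's BERT results, and the BERT pipeline itself is not reproduced.

```python
# Minimal sketch of the F1-score used to compare the classifiers.
# The confusion counts below are hypothetical, not the study's data.
def f1_score(tp: int, fp: int, fn: int) -> float:
    """Harmonic mean of precision and recall from confusion counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts for a hate-speech classifier on one dataset.
print(round(f1_score(tp=906, fp=94, fn=94), 3))
```

Because F1 balances false positives against false negatives, it is better suited than plain accuracy to hate-speech datasets, where the abusive class is usually the minority.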

Keywords: hate speech, machine learning, deep learning, abusive words, social media, text classification

Procedia PDF Downloads 136
7793 Sociodemographic Predictors of Flourishing among Older Adults in Rural and Urban Mongolia

Authors: Saranchuluun Otgon, Sugarmaa Myagmarjav, Khorolsuren Lkhagvasuren, Fabio Casati

Abstract:

Background: Flourishing is a eudaimonic dimension of psychological well-being that has been associated with positive social and health-related outcomes. Determining the factors associated with health and well-being is important for the development of evidence-based intervention programs, policies, and action plans targeting the older adult population, especially in low- and middle-income countries such as Mongolia, where evidence-based research on aging, health, and well-being is still scarce. This study makes important contributions to the study of well-being in later life and to policy activities for the older population in Mongolia. Methods: We employed multiple regression models to predict the factors of flourishing using data from 304 older adults living in urban and rural Mongolia. Data were collected with the standardized and validated questionnaire developed by Ed Diener. Results: The median flourishing scores of urban and rural older adults in Mongolia differed significantly (53 and 50, respectively). Sex (β = 2.52, p = 0.034), level of education (β = 0.94, p = 0.026), and receiving help with activities of daily living (β = 2.16, p = 0.022) determined the flourishing of older adults living in rural areas, while self-reported health (β = 0.94, p = 0.026), the number of social activities, and friend networks determined the flourishing of older adults living in urban areas. Conclusion: Older adults who live in urban areas have more psychological resources and strengths than those in rural areas. Determinants of flourishing differ across settings: individual and family factors determine flourishing in rural areas, while social ties determine flourishing in urban areas.

Keywords: flourishing, predictors, older adults, Mongolia, psychological well-being

Procedia PDF Downloads 130
7792 High-Tech Based Simulation and Analysis of Maximum Power Point in Energy System: A Case Study Using IT Based Software Involving Regression Analysis

Authors: Enemeri George Uweiyohowo

Abstract:

Improved output control of photovoltaic (PV) systems is one of the major focuses of recent PV research, owing to their low carbon emissions and efficiency. Power failures and outages from commercial providers generally hinder development in the public and private sectors and limit the growth of industries. A well-structured PV system is therefore important for efficient and cost-effective monitoring. The purpose of this paper is to validate the maximum power point of an off-grid PV system, taking into consideration the most effective tilt and orientation angles for PVs in the southern hemisphere. The power conditioning device chosen is a solar charger with maximum power point tracking (MPPT), analyzed from a pulse width modulation (PWM) perspective. The practical setup consists of a PV panel set to an orientation angle of 0°N, with corresponding tilt angles of 36°, 26° and 16°. Preliminary results include a regression analysis (normal probability plot) showing the maximum power point of the system as well as the best tilt angle for maximum power point tracking.
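The abstract does not specify which MPPT algorithm the solar charger implements; a common choice in such chargers is perturb-and-observe (P&O). The sketch below illustrates P&O against a toy single-peak power curve — the curve and step size are invented for illustration, not measured panel data from this study.

```python
# Hedged sketch of the perturb-and-observe (P&O) MPPT algorithm commonly
# found in solar chargers. The PV power curve is a toy model with a single
# maximum near 18 V, not data from the paper.
def pv_power(v: float) -> float:
    """Toy PV power curve (W) with a single maximum at v = 18 V."""
    return max(0.0, -0.5 * (v - 18.0) ** 2 + 100.0)

def perturb_and_observe(v0: float, step: float = 0.1, iters: int = 500) -> float:
    """Perturb the operating voltage; reverse direction when power drops."""
    v, direction = v0, 1.0
    p_prev = pv_power(v)
    for _ in range(iters):
        v += direction * step
        p = pv_power(v)
        if p < p_prev:          # power dropped: reverse the perturbation
            direction = -direction
        p_prev = p
    return v

v_mpp = perturb_and_observe(12.0)
print(round(v_mpp, 1))  # settles near the maximum power point (~18 V)
```

In a real charger the perturbation acts on the PWM duty cycle rather than directly on voltage, and the steady-state oscillation around the peak is the usual trade-off against tracking speed.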

Keywords: poly-crystalline PV panels, information technology (IT), maximum power point tracking (MPPT), pulse width modulation (PWM)

Procedia PDF Downloads 213
7791 Effect of Lactone Glycoside on Feeding Deterrence and Nutritive Physiology of Tobacco Caterpillar Spodoptera litura Fabricius (Noctuidae: Lepidoptera)

Authors: Selvamuthukumaran Thirunavukkarasu, Arivudainambi Sundararajan

Abstract:

Plant-derived active molecules with a known mode of action are important leads for the development of newer insecticides. Lactone glycoside was identified earlier as the active principle in Cleistanthus collinus (Roxb.) Benth. (Fam: Euphorbiaceae). It possesses feeding deterrent, insecticidal and insect growth regulatory actions at varying concentrations, and deducing its mode of action opens the possibility of its further development. A no-choice leaf disc bioassay was carried out with lactone glycoside at different doses for different instars, and Deterrence Indices were calculated. Using regression analysis, the concentrations imparting 10, 30 and 50 per cent deterrence (DI10, DI30 and DI50) were estimated. At these doses, the effects on nutritional indices such as Relative Consumption and Growth Rates (RCR and RGR), Efficiencies of Conversion of Ingested and Digested food (ECI and ECD) and Approximate Digestibility (AD) were determined. The relative consumption and growth rates of control and lactone glycoside-treated larvae were compared by regression analysis. Regression analysis of the deterrence indices revealed that the concentrations needed to impart 50 per cent deterrence were 60.66, 68.47 and 71.10 ppm for the third, fourth and fifth instars, respectively. The relative consumption rate (RCR) and relative growth rate (RGR) were reduced, confirming the antifeedant action of the fraction. Approximate digestibility (AD) was greater in treatments, indicating reduced faeces because of poor digestibility and retention of food in the gut. The efficiency of conversion of both ingested and digested food (ECI and ECD) was also greatly reduced, indicating the presence of a toxic action. This was confirmed by comparing the growth efficiencies of control and lactone glycoside-treated larvae. Lactone glycoside was found to possess both feeding deterrent and toxic modes of action. Studies on molecular targets based on this preliminary site of action could lead to new insecticide development.
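Estimating a DI50 by regression, as described above, amounts to fitting a dose-response line and inverting it at 50% deterrence. The sketch below does this with least squares on log10(dose); the dose-response data are invented for illustration and are not the study's bioassay values.

```python
import math

# Sketch of estimating the concentration giving 50% feeding deterrence
# (DI50) by least-squares regression of deterrence index on log10(dose).
# The dose-response data below are illustrative, not the study's values.
doses_ppm = [10, 20, 40, 80, 160]
deterrence = [14.0, 26.0, 39.0, 55.0, 68.0]   # per cent deterrence

x = [math.log10(d) for d in doses_ppm]
n = len(x)
mean_x = sum(x) / n
mean_y = sum(deterrence) / n
slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, deterrence))
         / sum((xi - mean_x) ** 2 for xi in x))
intercept = mean_y - slope * mean_x

# Invert the fitted line at 50% deterrence to recover the dose.
di50 = 10 ** ((50.0 - intercept) / slope)
print(f"DI50 ≈ {di50:.1f} ppm")
```

With these toy data the estimate lands in the same tens-of-ppm range the study reports; in practice probit or logit transforms of the response are often preferred near the tails of the dose-response curve.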

Keywords: Spodoptera litura Fabricius, Cleistanthus collinus (Roxb.) Benth, feeding deterrence, mode of action

Procedia PDF Downloads 155
7790 Association of Preoperative Pain Catastrophizing with Postoperative Pain after Lower Limb Trauma Surgery

Authors: Asish Subedi, Krishna Pokharel, Birendra Prasad Sah, Pashupati Chaudhary

Abstract:

Objectives: To evaluate the association between preoperative Nepali pain catastrophizing scale (N-PCS) scores and postoperative pain intensity and total opioid consumption. Methods: In this prospective cohort study, we enrolled 135 patients with an American Society of Anaesthesiologists physical status of I or II, aged between 18 and 65 years, and scheduled for surgery for lower-extremity fracture under spinal anaesthesia. The maximum postoperative pain reported during the first 24 h was classified into two groups: a no-mild pain group (Numeric rating scale [NRS] scores 1 to 3) and a moderate-severe pain group (NRS 4-10). The Spearman correlation coefficient was used to assess the association between the baseline N-PCS scores and the outcome variables, i.e., the maximum NRS pain score and the total tramadol consumption within the first 24 h after surgery. Logistic regression models were used to identify predictors of postoperative pain intensity. Results: As four patients violated the protocol, the data of 131 patients were analysed. The mean N-PCS score in the moderate-severe pain group was 27.39 ± 9.50, compared to 18.64 ± 10 in the no-mild pain group (p < 0.001). Preoperative PCS scores correlated positively with postoperative pain intensity (r = 0.39, [95% CI 0.23-0.52], p < 0.001) and total tramadol consumption (r = 0.32, [95% CI 0.16-0.47], p < 0.001). An increase in catastrophizing scores was associated with moderate-severe postoperative pain (odds ratio, 1.08 [95% confidence interval, 1.02-1.15], p = 0.006) after adjusting for gender, ethnicity and preoperative anxiety. Conclusion: Patients who reported higher pain catastrophizing preoperatively were at increased risk of experiencing moderate-severe postoperative pain.
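The odds ratio of 1.08 per N-PCS point reported above is the exponentiated logistic regression coefficient, and its Wald interval is the exponentiated coefficient ± 1.96 standard errors. The sketch below shows the conversion; the coefficient and standard error are assumed values chosen to reproduce an OR of about 1.08, not taken from the paper.

```python
import math

# The reported odds ratio is exp(beta) from the logistic model; a
# coefficient of ~0.077 per PCS point with SE ~0.031 (assumed values,
# not the study's estimates) yields an OR near 1.08 with a Wald CI.
def odds_ratio(beta: float, se: float, z: float = 1.96):
    """Return (OR, CI lower, CI upper) from a logit coefficient and its SE."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

or_, lo, hi = odds_ratio(beta=0.077, se=0.031)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

An OR of 1.08 per point compounds quickly: a 10-point higher catastrophizing score corresponds to roughly exp(10 × 0.077) ≈ 2.2 times the odds of moderate-severe pain.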

Keywords: nepali, pain catastrophizing, postoperative pain, trauma

Procedia PDF Downloads 120
7789 Forecasting 24-Hour Ahead Electricity Load Using Time Series Models

Authors: Ramin Vafadary, Maryam Khanbaghi

Abstract:

Forecasting electricity load is important for purposes such as planning, operation, and control. Accurate forecasts can save operating and maintenance costs, increase the reliability of power supply and delivery systems, and support correct decisions for future development. This paper compares various time series methods for forecasting electricity load 24 hours ahead. The methods considered are Holt-Winters smoothing, SARIMA modeling, an LSTM network, Fbprophet, and TensorFlow Probability. The performance of each method is evaluated using the forecasting accuracy criteria of mean absolute error and root mean square error. The National Renewable Energy Laboratory (NREL) residential energy consumption data is used to train the models. The results of this study show that the SARIMA model is superior to the others for 24-hour-ahead forecasts. Furthermore, a bagging technique is used to make the predictions more robust. The obtained results show that by bagging multiple time-series forecasts, we can improve the robustness of the models for 24-hour-ahead electricity load forecasting.
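The two accuracy criteria used to rank the methods are straightforward to state. A minimal implementation follows; the load values are illustrative numbers, not the NREL data.

```python
import math

# The two forecasting accuracy criteria used to compare the methods:
# mean absolute error (MAE) and root mean square error (RMSE).
# The load values below (kW) are illustrative, not the NREL data.
def mae(actual, forecast):
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def rmse(actual, forecast):
    return math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual))

actual = [3.1, 2.8, 3.5, 4.0]
forecast = [3.0, 3.0, 3.2, 4.4]
print(round(mae(actual, forecast), 3), round(rmse(actual, forecast), 3))
```

RMSE penalises large errors more heavily than MAE, so a model that occasionally misses a demand peak badly will rank worse under RMSE even when the two models have similar MAE.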

Keywords: bagging, Fbprophet, Holt-Winters, LSTM, load forecast, SARIMA, TensorFlow probability, time series

Procedia PDF Downloads 96
7788 Application of Logistic Regression Model to Ascertain the Determinants of Food Security among Households in Maiduguri Metropolis, Borno State, Nigeria

Authors: Abdullahi Yahaya Musa, Harun Rann Bakari

Abstract:

The study examined the determinants of food security among households in Maiduguri Metropolis, Borno State, Nigeria. The objectives of the study were to examine the determinants of food security among households and to identify the coping strategies employed by food-insecure households in Maiduguri Metropolis. The population of the study is 843,964 people, out of which 400 respondents were sampled. The study used a self-developed questionnaire to collect data from the four hundred (400) respondents; all copies of the questionnaire were retrieved, giving a 100% return rate. The study employed descriptive and inferential statistics for data analysis: descriptive statistics (frequency counts and percentages) were used to analyze the socio-economic characteristics of the respondents and the coping strategies, while inferential statistics (logit regression analysis) were used to analyze the determinants of food security. The results were presented in tables and discussed according to the research objectives. The study revealed that HHA, HHE, HHSZ, HHSX, HHAS, HHI, HHFS, HHFE, HHAC and HHCDR were the determinants of food security in Maiduguri Metropolis. Relying on less preferred foods, purchasing food on credit, limiting food intake to ensure children get enough, borrowing money to buy foodstuffs, relying on help from relatives or friends outside the household, adult family members skipping or reducing a meal because of insufficient finances, and rationing money to household members to buy street food were the coping strategies employed by food-insecure households. The study recommended that the Nigerian Government intensify the fight against the Boko Haram insurgency, which would enable farmers to return to farming in Borno State.

Keywords: internally displaced persons, food security, coping strategies, descriptive statistics, logistic regression model, odds ratio

Procedia PDF Downloads 147
7787 Hidden Markov Movement Modelling with Irregular Data

Authors: Victoria Goodall, Paul Fatti, Norman Owen-Smith

Abstract:

Hidden Markov Models have become popular for the analysis of animal tracking data. These models are being used to model the movements of a variety of species in many areas around the world. A common assumption of the model is that the observations need to have regular time steps. In many ecological studies, this will not be the case. The objective of the research is to modify the movement model to allow for irregularly spaced locations and investigate the effect on the inferences which can be made about the latent states. A modification of the likelihood function to allow for these irregular spaced locations is investigated, without using interpolation or averaging the movement rate. The suitability of the modification is investigated using GPS tracking data for lion (Panthera leo) in South Africa, with many observations obtained during the night, and few observations during the day. Many nocturnal predator tracking studies are set up in this way, to obtain many locations at night when the animal is most active and is difficult to observe. Few observations are obtained during the day, when the animal is expected to rest and is potentially easier to observe. Modifying the likelihood function allows the popular Hidden Markov Model framework to be used to model these irregular spaced locations, making use of all the observed data.
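One way to understand the likelihood modification described above: over a gap of k base time steps with no observation, the latent-state transition probabilities become the k-th power of the one-step transition matrix rather than the matrix itself. The sketch below shows this for a two-state chain; the transition matrix and state labels are assumed for illustration, not estimated from the lion data.

```python
# Sketch of the idea behind handling irregular gaps in a Hidden Markov
# movement model: over a gap of k base time steps, the transition
# probabilities are A^k rather than A. Two-state example with an assumed
# (illustrative) transition matrix.
def mat_mult(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_pow(a, k):
    result = [[1.0, 0.0], [0.0, 1.0]]  # 2x2 identity
    for _ in range(k):
        result = mat_mult(result, a)
    return result

A = [[0.9, 0.1],   # resting -> (resting, moving)
     [0.2, 0.8]]   # moving  -> (resting, moving)

A3 = mat_pow(A, 3)  # effective transition matrix over a 3-step gap
print([[round(p, 3) for p in row] for row in A3])
```

As the gap grows, the rows of Aᵏ approach the chain's stationary distribution, which matches the intuition that a long unobserved gap carries little information about state persistence.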

Keywords: hidden Markov Models, irregular observations, animal movement modelling, nocturnal predator

Procedia PDF Downloads 244
7786 Winter Wheat Yield Forecasting Using Sentinel-2 Imagery at the Early Stages

Authors: Chunhua Liao, Jinfei Wang, Bo Shan, Yang Song, Yongjun He, Taifeng Dong

Abstract:

Winter wheat is one of the main crops in Canada. Forecasting within-field variability of winter wheat yield at the early stages is essential for precision farming. However, crop yield modelling based on high-spatial-resolution satellite data is generally affected by the lack of continuous satellite observations, which reduces the generalization ability of the models and increases the difficulty of crop yield forecasting at the early stages. In this study, the correlations between Sentinel-2 data (vegetation indices and reflectance) and yield data collected by combine harvester were investigated, and a generalized multivariate linear regression (MLR) model was built and tested with data acquired in different years. The four-band reflectance (blue, green, red, near-infrared) performed better than the vegetation indices (NDVI, EVI, WDRVI and OSAVI) in wheat yield prediction, and prediction using multiple vegetation indices was more accurate than using a single index. The optimum stage for yield forecasting varied between fields when using vegetation indices, but was consistent when using multispectral reflectance: the growing stages from the end of flowering to the beginning of the filling stage, with the highest accuracy at the end of flowering. The best MLR model was therefore built to predict wheat yield before harvest using Sentinel-2 data acquired at the end of the flowering stage, with an average testing RMSE of 604.48 kg/ha. Further, to improve yield prediction at the early stages, three simple unsupervised domain adaptation (DA) methods were adopted to transform the reflectance data at the early stages to the optimum phenological stage. Near the booting stage, applying the mean matching (MM) approach to transform the data to the target domain (the end of flowering) reduced the average testing RMSE of the best MLR model to 799.18 kg/ha, compared to 1140.64 kg/ha when using the original data with models developed directly at the booting stage. MM performed better than the other DA methods, and "DA then MLR at the optimum stage" outperformed "MLR directly at the early stages" for winter wheat yield forecasting. These results indicate that simple domain adaptation methods have great potential for near-real-time crop yield forecasting at the early stages using remote sensing data.
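Mean matching is the simplest of the domain adaptation methods discussed: each reflectance band observed at the early stage is shifted so its mean equals that band's mean at the optimum stage. The sketch below shows the additive variant (a ratio-based variant is also common); the reflectance values are toy numbers, not Sentinel-2 data.

```python
# Sketch of mean matching (MM) domain adaptation: shift each band of
# early-stage reflectance so its mean matches the same band's mean at
# the optimum (end-of-flowering) stage. Additive variant; the values
# below are toy numbers, not Sentinel-2 data.
def mean_match(source_band, target_band):
    shift = sum(target_band) / len(target_band) - sum(source_band) / len(source_band)
    return [x + shift for x in source_band]

early_red = [0.10, 0.12, 0.14, 0.16]      # booting-stage red reflectance
flowering_red = [0.06, 0.07, 0.08, 0.09]  # end-of-flowering red reflectance

adapted = mean_match(early_red, flowering_red)
print([round(x, 3) for x in adapted])
```

After the shift, the MLR model trained at the optimum stage can be applied to the adapted early-stage data; note that MM preserves within-field variability (only the mean moves), which is exactly the signal a within-field yield model relies on.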

Keywords: wheat yield prediction, domain adaptation, Sentinel-2, within-field scale

Procedia PDF Downloads 64
7785 Comparison of Adsorbents for Ammonia Removal from Mining Wastewater

Authors: F. Al-Sheikh, C. Moralejo, M. Pritzker, W. A. Anderson, A. Elkamel

Abstract:

Ammonia in mining wastewater is a significant problem, and treatment can be especially difficult in cold climates where biological treatment is not feasible. Adsorption is one of the alternative processes that can be used to reduce ammonia concentrations to acceptable limits; therefore, a strongly acidic H+-form LEWATIT ion exchange resin and a Bowie chabazite Na-form AZLB-Na zeolite were tested to assess their effectiveness. For these adsorption tests, two packed bed columns (a mini-column constructed from a 32-cm long x 1-cm diameter piece of glass tubing, and a 60-cm long x 2.5-cm diameter Ace Glass chromatography column) containing varying quantities of the adsorbents were used. A mining wastewater with an ammonia concentration of 22.7 mg/L was fed through the columns at controlled flowrates. In the experimental work, the maximum capacities of the LEWATIT ion exchange resin were 0.438, 0.448, and 1.472 mg/g for 3, 6, and 9 g, respectively, in the mini-column and 1.739 mg/g for 141.5 g in the larger Ace column, while the capacities of the AZLB-Na zeolite were 0.424 and 0.784 mg/g for 3 and 6 g, respectively, in the mini-column and 1.1636 mg/g for 38.5 g in the Ace column. In the theoretical work, the Thomas, Adams-Bohart, and Yoon-Nelson models were used to describe the breakthrough curves of the adsorption process and to find the constants of these models. In the regeneration tests, 5% hydrochloric acid, HCl (v/v), and 10% sodium hydroxide, NaOH (w/v), were used to regenerate the LEWATIT resin and AZLB-Na zeolite with 44 and 63.8% recovery, respectively. In conclusion, continuous flow adsorption using a LEWATIT ion exchange resin and an AZLB-Na zeolite is efficient when using a co-flow technique for removal of ammonia from wastewater. The Thomas, Adams-Bohart, and Yoon-Nelson models fit the data satisfactorily, with R² close to 1 in all cases.
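Of the three breakthrough models named above, the Thomas model is the most widely used; a common linearizable form is C_t/C_0 = 1 / (1 + exp(k_Th·q_0·m/Q − k_Th·C_0·t)). The sketch below evaluates that sigmoid; the parameter values are illustrative placeholders, not the fitted constants from this study.

```python
import math

# Sketch of the Thomas model used to describe the breakthrough curves:
#   C_t/C_0 = 1 / (1 + exp(k_Th*q0*m/Q - k_Th*C0*t))
# Parameter values below are illustrative, not the study's fitted constants.
def thomas(t_min, k_th, q0, m, Q, c0):
    """Effluent-to-influent concentration ratio at time t (min)."""
    return 1.0 / (1.0 + math.exp(k_th * q0 * m / Q - k_th * c0 * t_min))

# Illustrative parameters: k_th (L/mg/min), q0 (mg/g), m (g), Q (L/min), c0 (mg/L)
params = dict(k_th=0.001, q0=1.5, m=100, Q=0.01, c0=22.7)
ratio_early = thomas(t_min=10, **params)     # early in the run: near 0
ratio_late = thomas(t_min=2000, **params)    # after breakthrough: near 1
print(round(ratio_early, 4), round(ratio_late, 4))
```

Fitting k_Th and q_0 to measured C_t/C_0 data (e.g. by linear regression on ln(C_0/C_t − 1) versus t) is how the model constants reported in such studies are obtained.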

Keywords: AZLB-Na zeolite, continuous adsorption, Lewatit resin, models, regeneration

Procedia PDF Downloads 389
7784 The Reproducibility and Repeatability of Modified Likelihood Ratio for Forensics Handwriting Examination

Authors: O. Abiodun Adeyinka, B. Adeyemo Adesesan

Abstract:

The forensic use of handwriting depends on the analysis, comparison, and evaluation decisions made by forensic document examiners. When using biometric technology in forensic applications, it is necessary to compute the Likelihood Ratio (LR) to quantify the strength of evidence under two competing hypotheses, namely the prosecution and the defense hypotheses, with a set of assumptions and methods for a given data set. It is therefore important to know how repeatable and reproducible the estimated LR is. This paper evaluated the accuracy and reproducibility of examiners' decisions. Confidence intervals for the estimated LR were presented so as to avoid an incorrect estimate that could lead to a wrong judgment in a court of law. The estimate of LR is fundamentally a Bayesian concept, and we used two LR estimators, namely Logistic Regression (LoR) and the Kernel Density Estimator (KDE). The repeatability evaluation was carried out by retesting the initial experiment after an interval of six months to observe whether examiners would repeat their decisions for the estimated LR. The experimental results, which are based on a handwriting dataset, show that the LR has different confidence intervals, which implies that the LR cannot be estimated with the same certainty everywhere. Though LoR performed better than KDE when tested on the same dataset, the two LR estimators investigated showed a consistent region in which LR values can be estimated confidently. These two findings advance our understanding of the LR when used to compute the strength of evidence in forensic handwriting examination.
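The KDE-based LR estimator mentioned above works by modelling the density of an evidence score separately under each hypothesis and taking their ratio at the observed score. A minimal Gaussian-kernel sketch follows; the similarity scores and bandwidth are toy values, not the paper's handwriting data.

```python
import math

# Sketch of a kernel-density LR estimator: the likelihood ratio is the
# density of an evidence score under the same-writer (prosecution) model
# divided by its density under the different-writer (defense) model.
# Score samples and bandwidth are toy values, not the paper's data.
def gaussian_kde(samples, bandwidth):
    def density(x):
        return sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                   for s in samples) / (len(samples) * bandwidth * math.sqrt(2 * math.pi))
    return density

same_writer = [0.80, 0.85, 0.90, 0.88, 0.92]   # similarity scores under Hp
diff_writer = [0.30, 0.40, 0.35, 0.45, 0.50]   # similarity scores under Hd

p_num = gaussian_kde(same_writer, bandwidth=0.05)
p_den = gaussian_kde(diff_writer, bandwidth=0.05)

evidence_score = 0.87
lr = p_num(evidence_score) / p_den(evidence_score)
print(f"LR ≈ {lr:.3g}")  # far greater than 1: supports the same-writer hypothesis
```

The instability the paper documents is visible here in miniature: the denominator is evaluated far in the tail of the defense density, where KDE estimates are extremely sensitive to bandwidth and sample size — hence the value of reporting confidence intervals for the LR.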

Keywords: confidence interval, handwriting, kernel density estimator, KDE, logistic regression, LoR, repeatability, reproducibility

Procedia PDF Downloads 124
7783 Towards a Standardization in Scheduling Models: Assessing the Variety of Homonyms

Authors: Marcel Rojahn, Edzard Weber, Norbert Gronau

Abstract:

Terminology is a critical instrument for every researcher. Different terminologies for the same research object may arise in different research communities, and through this inconsistency many synergistic effects are lost. Theories and models become more understandable and reusable when a common terminology is applied. This paper examines the terminological (in)consistency of the research field of job-shop scheduling through a literature review. There is an enormous variety in the choice of terms and mathematical notation for the same concept, and the comparability, reusability, and combinability of scheduling methods are unnecessarily hampered by the arbitrary use of homonyms and synonyms. The acceptance in the community of the variables and notation forms used is shown by means of a compliance quotient, demonstrated through the evaluation of 240 scientific publications on planning methods.
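The paper does not give the formula for its compliance quotient here; one natural reading is the share of surveyed publications that use the most common symbol for a concept. The sketch below computes that interpretation — the symbol counts are invented (only the total of 240 publications comes from the abstract).

```python
from collections import Counter

# Illustrative reading of a "compliance quotient": the share of surveyed
# publications using the most common symbol for a concept (here, the
# processing-time variable in job-shop scheduling). Counts are invented;
# only the total of 240 publications comes from the abstract.
symbols_used = ["p_ij"] * 150 + ["t_ij"] * 60 + ["d_ij"] * 30

counts = Counter(symbols_used)
majority_symbol, majority_count = counts.most_common(1)[0]
compliance_quotient = majority_count / len(symbols_used)
print(majority_symbol, round(compliance_quotient, 3))
```

A quotient near 1 would indicate a de facto standard notation, while values near 1/k across k synonyms would signal exactly the fragmentation the paper criticises.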

Keywords: job-shop scheduling, terminology, notation, standardization

Procedia PDF Downloads 109
7782 Development and Validation of a Coronary Heart Disease Risk Score in Indian Type 2 Diabetes Mellitus Patients

Authors: Faiz N. K. Yusufi, Aquil Ahmed, Jamal Ahmad

Abstract:

Diabetes in India is growing at an alarming rate, and the complications caused by it need to be controlled. Coronary heart disease (CHD) is one of these complications and is the prediction target of this study; any form of CHD has been taken as the event of interest. India has the second-highest number of diabetes patients in the world, yet to the best of our knowledge, there is no CHD risk score for Indian type 2 diabetes patients. A sample of 750 patients was determined and randomly collected from the Rajiv Gandhi Centre for Diabetes and Endocrinology, J.N.M.C., A.M.U., Aligarh, India. Collected variables include patient data such as sex, age, height, weight, body mass index (BMI), blood sugar fasting (BSF), post prandial sugar (PP), glycosylated haemoglobin (HbA1c), diastolic blood pressure (DBP), systolic blood pressure (SBP), smoking, alcohol habits, total cholesterol (TC), triglycerides (TG), high density lipoprotein (HDL), low density lipoprotein (LDL), very low density lipoprotein (VLDL), physical activity, duration of diabetes, diet control, history of antihypertensive drug treatment, family history of diabetes, waist circumference, hip circumference, medications, central obesity and history of CHD. Predictive risk scores for CHD events are designed by Cox proportional hazards regression. Model calibration and discrimination are assessed with the Hosmer-Lemeshow test and the area under the receiver operating characteristic (ROC) curve. Overfitting and underfitting of the model are checked by applying regularization techniques, and the best method is selected among ridge, lasso and elastic net regression. Youden's index is used to choose the optimal cut-off point for the scores. The five-year probability of CHD is predicted by both the survival function and a two-state Markov chain model, and the better technique is identified. The risk scores for CHD developed here can be calculated by doctors and patients for self-control of diabetes, and the five-year probabilities can likewise be used to forecast and monitor the condition of patients.
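Youden's index, used above to pick the risk-score cut-off, is J = sensitivity + specificity − 1, maximised over candidate cut-offs. The sketch below searches a small set of scores; the risk scores and event labels are invented illustrations, not the study's cohort.

```python
# Sketch of choosing the optimal risk-score cut-off with Youden's index
# (J = sensitivity + specificity - 1). Scores and labels are illustrative,
# not the study's cohort data.
def youden_optimal_cutoff(scores, labels):
    best_j, best_cut = -1.0, None
    for cut in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= cut and y == 1)
        fn = sum(1 for s, y in zip(scores, labels) if s < cut and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s < cut and y == 0)
        fp = sum(1 for s, y in zip(scores, labels) if s >= cut and y == 0)
        j = tp / (tp + fn) + tn / (tn + fp) - 1
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

scores = [0.2, 0.3, 0.35, 0.4, 0.55, 0.6, 0.7, 0.8]   # CHD risk scores
labels = [0,   0,   0,    1,   0,    1,   1,   1]      # 1 = CHD event
cut, j = youden_optimal_cutoff(scores, labels)
print(cut, round(j, 2))
```

Geometrically, the chosen cut-off is the point on the ROC curve farthest (vertically) from the chance diagonal, which ties this step to the ROC-based discrimination assessment in the abstract.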

Keywords: coronary heart disease, cox proportional hazards regression, ROC curve, type 2 diabetes mellitus

Procedia PDF Downloads 219
7781 Prenatal Lead Exposure and Postpartum Depression: An Exploratory Study of Women in Mexico

Authors: Nia McRae, Robert Wright, Ghalib Bello

Abstract:

Introduction: Postpartum depression is a prevalent mood disorder that is detrimental to the mental and physical health of mothers and their newborns. Lead (Pb) is a toxic metal that is associated with hormonal imbalance and mental impairments. The hormone changes that accompany pregnancy and childbirth may be exacerbated by Pb and increase new mothers’ susceptibility to postpartum depression. To the best of the authors’ knowledge, this is the only study that investigates the association between prenatal Pb exposure and postpartum depression. Identifying risk factors can contribute to improved prevention and treatment strategies for postpartum depression. Methods: Data was derived from the Programming Research in Obesity, Growth, Environment and Social Stress (PROGRESS) study which is an ongoing longitudinal birth cohort. Postpartum depression was identified by a score of 13 or above on the 10-Item Edinburg Postnatal Depression Scale (EPDS) 6-months and 12-months postpartum. Pb was measured in the blood (BPb) in the second and third trimester and in the tibia and patella 1-month postpartum. Quantile regression models were used to assess the relationship between BPb and postpartum depression. Results: BPb in the second trimester was negatively associated with the 80th percentile of depression 6-months postpartum (β: -0.26; 95% CI: -0.51, -0.01). No significant association was found between BPb in the third trimester and depression 6-months postpartum. BPb in the third trimester exhibited an inverse relationship with the 60th percentile (β: -0.23; 95% CI: -0.41, -0.06), 70th percentile (β: -0.31; 95% CI: -0.52, -0.10), and 90th percentile of depression 12-months postpartum (β: -0.36; 95% CI: -0.69, -0.03). There was no significant association between BPb in the second trimester and depression 12-months postpartum. Bone Pb concentrations were not significantly associated with postpartum depression.
Conclusion: The negative association between BPb and postpartum depression may support research which demonstrates lead is a nontherapeutic stimulant. Further research is needed to verify these results and identify effect modifiers.
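The quantile regression models used above minimise the check (pinball) loss rather than squared error; for a given τ, the constant that minimises that loss over a sample is the sample's τ-quantile. The sketch below demonstrates this with a brute-force search; the EPDS-like scores are invented, not PROGRESS data.

```python
# Sketch of the check (pinball) loss minimised by quantile regression.
# The constant minimising it over a sample is that sample's tau-quantile.
# The EPDS-like depression scores below are invented, not PROGRESS data.
def pinball_loss(y, pred, tau):
    return sum((tau if yi >= pred else tau - 1) * (yi - pred) for yi in y) / len(y)

scores = [2, 3, 4, 5, 6, 7, 8, 10, 15]  # hypothetical EPDS scores
tau = 0.8  # e.g. the 80th percentile of depression modelled in the study

# Grid-search the constant predictor minimising the loss.
best = min(range(0, 16), key=lambda c: pinball_loss(scores, c, tau))
print(best)  # the 0.8-quantile of the sample
```

Adding covariates such as BPb turns this constant into a linear predictor, which is why quantile regression can detect associations confined to the upper tail of depression scores that a mean-based model would average away.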

Keywords: depression, lead, postpartum, prenatal

Procedia PDF Downloads 225
7780 Mediating and Moderating Function of Corporate Governance on Firm Tax Planning and Firm Tax Disclosure Relationship

Authors: Mahfoudh Hussein Mgammal

Abstract:

The purpose of this paper is to investigate the moderating and mediating effects of corporate governance mechanism proxies on the relationship between tax planning, measured by effective tax rate components, and tax disclosure. We tested the hypotheses with a three-step hierarchical regression on Malaysian-listed nonfinancial firms from 2010 to 2012. We found that companies positively value tax-planning activities, indicating that tax planning is seen as a source of companies' wealth creation: there is a highly significant association between tax disclosure and the extent of tax planning. Examination of the implications of corporate governance mechanisms for the tax disclosure-tax planning association showed no significant coefficient for any of the interactive variables, which makes it hard to understand the nature of the association. Finally, we studied the sensitivity of the results; the outcomes were examined for the robustness and strength of the model specification using OLS-effect estimators and the absence of tax-planning-related factors (GRTH, LEVE, and CAPNT). These tests show no effect on the tax planning-tax disclosure association. The annual regression tests show that the panel regression results differ over time because there is a time-difference impact on the associations, and the different models are not completely proportionate as a whole. Moreover, our paper lends some support to recent theory on the importance of taxes to corporate governance by demonstrating how the agency costs of tax planning allow certain shareholders to benefit from firm activities at the expense of others.

Keywords: tax disclosure, tax planning, corporate governance, effective tax rate

Procedia PDF Downloads 151
7779 Assessment of Material Type, Diameter, Orientation and Closeness of Fibers in Vulcanized Reinforced Rubbers

Authors: Ali Osman Güney, Bahattin Kanber

Abstract:

In this work, the effects of the material type, diameter, orientation and closeness of fibers on the general performance of reinforced vulcanized rubbers are investigated using the finite element method with experimental verification. Various fiber materials such as hemp, nylon and polyester are used for different fiber diameters, orientations and closeness. 3D finite element models are developed by considering bonded contact elements between the fiber and rubber sheet interfaces. The fibers are assumed to be linear elastic, while the vulcanized rubber is considered hyper-elastic. After an experimental verification of the finite element results, the developed models are analyzed under a prescribed displacement that causes tension. The normal stresses in the fibers and the shear stresses between the fibers and the rubber sheet are investigated in all models. The large deformation of the reinforced rubber sheet is also represented for various fiber conditions under incremental loading. A general assessment is made of the best fiber properties of reinforced rubber sheets under tension-load conditions.

Keywords: reinforced vulcanized rubbers, fiber properties, out of plane loading, finite element method

Procedia PDF Downloads 347
7778 Improving the Biomechanical Resistance of a Treated Tooth via Composite Restorations Using Optimised Cavity Geometries

Authors: Behzad Babaei, B. Gangadhara Prusty

Abstract:

The objective of this study is to assess the hypotheses that a restored tooth with a class II occlusal-distal (OD) cavity can be strengthened by designing an optimized cavity geometry, as well as by selecting a composite restoration with optimized elastic moduli, when there is a sharp de-bonded edge at the interface of the tooth and restoration. Methods: A scanned human maxillary molar tooth was segmented into dentine and enamel parts. The dentine and enamel profiles were extracted and imported into finite element (FE) software. The enamel rod orientations were estimated virtually. Fifteen models of the restored tooth with different occlusal cavity depths (1.5, 2, and 2.5 mm) and internal cavity angles were generated. Using a semi-circular stone part, a 400 N load was applied at two contact points of the restored tooth model. The junctions between the enamel, dentine, and restoration were considered perfectly bonded. All parts in the model were considered homogeneous, isotropic, and elastic. Quadrilateral and triangular elements were employed in the models. A mesh convergence analysis was conducted to verify that the element count did not influence the simulation results; using a criterion of 5% error in stress, we found that meshes of over 14,000 elements yielded converged stresses. A Python script was employed to automatically assign moduli of 2-22 GPa (in increments of 4 GPa) to the composite restorations, 18.6 GPa to the dentine, and two different elastic moduli to the enamel (72 GPa in the direction of the enamel rods and 63 GPa perpendicular to it). Linear, homogeneous, and elastic material models were considered for the dentine, enamel, and composite restorations. 108 FEA simulations were conducted in succession. Results: The internal cavity angle (α) significantly altered the peak maximum principal stress at the interface of the enamel and restoration. The strongest structures against the contact loads were observed in the models with α = 100° and 105°. Even when the directional mechanical properties of the enamel rods were disregarded, interestingly, the models with α = 100° and 105° exhibited the highest resistance against the mechanical loads. Regarding the effect of occlusal cavity depth, the models with 1.5 mm depth showed higher resistance to contact loads than the models with deeper cavities (2.0 and 2.5 mm). Moreover, composite moduli in the range of 10-18 GPa alleviated the stress levels in the enamel. Significance: For the class II OD cavity models in this study, the optimal geometries, composite properties, and occlusal cavity depths were determined. Designing the cavities with α ≥ 100° was significantly effective in minimizing peak stress levels. The composite restoration with optimized properties reduced the stress concentrations at critical points of the models. Additionally, when more enamel was preserved, the enamel-restoration interface was sturdier against the mechanical loads.

Keywords: dental composite restoration, cavity geometry, finite element approach, maximum principal stress

Procedia PDF Downloads 102
7777 An Application of Graph Theory to the Electrical Circuit Using Matrix Method

Authors: Samai'la Abdullahi

Abstract:

A graph is a pair of two sets, vertices and edges, so that a graph is a pictorial representation of a system using two basic elements: nodes and edges. A node is represented by a circle (either hollow or shaded), and an edge is represented by a line segment connecting two nodes. In this paper, we present a circuit network as an application of graph theory; circuit models of graphs are represented by the logical connection method, where we formulate the matrix method of adjacency and incidence matrices and an application of the truth table.
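The adjacency and incidence matrices described above can be sketched in a few lines of code. This is an illustrative example only: the four-node circuit and its edge list are invented for the sketch, not taken from the paper.

```python
# Representing a circuit as an undirected graph via its adjacency and
# incidence matrices. Nodes are circuit junctions; edges are branches
# (e.g. resistors). The bridge-like circuit below is hypothetical.

def adjacency_matrix(n_nodes, edges):
    """Return the n x n adjacency matrix A, with A[u][v] = 1 iff {u, v} is an edge."""
    A = [[0] * n_nodes for _ in range(n_nodes)]
    for u, v in edges:
        A[u][v] = 1
        A[v][u] = 1
    return A

def incidence_matrix(n_nodes, edges):
    """Return the n x m incidence matrix M, with M[i][j] = 1 iff node i touches edge j."""
    M = [[0] * len(edges) for _ in range(n_nodes)]
    for j, (u, v) in enumerate(edges):
        M[u][j] = 1
        M[v][j] = 1
    return M

# Four junctions (0-3) and five branches connecting them.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (1, 3)]
A = adjacency_matrix(4, edges)
M = incidence_matrix(4, edges)
```

Each column of the incidence matrix sums to 2 (every branch touches exactly two junctions), which is a quick sanity check on the construction.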

Keywords: Euler circuit and path, graph representation of circuit networks, representation of graph models, representation of circuit network using logical truth table

Procedia PDF Downloads 561
7776 Using Neural Networks for Click Prediction of Sponsored Search

Authors: Afroze Ibrahim Baqapuri, Ilya Trofimov

Abstract:

Sponsored search is a multi-billion dollar industry and makes up a major source of revenue for search engines (SE). Click-through-rate (CTR) estimation plays a crucial role for ads selection, and greatly affects the SE revenue, advertiser traffic and user experience. We propose a novel architecture of solving CTR prediction problem by combining artificial neural networks (ANN) with decision trees. First, we compare ANN with respect to other popular machine learning models being used for this task. Then we go on to combine ANN with MatrixNet (proprietary implementation of boosted trees) and evaluate the performance of the system as a whole. The results show that our approach provides a significant improvement over existing models.
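As a hedged illustration of the core of CTR estimation (not the paper's ANN + MatrixNet system), a single logistic unit trained by gradient descent on synthetic impression data shows the shared idea: mapping ad features to a click probability by minimizing log-loss.

```python
import math

# Toy CTR model: one logistic unit (the simplest "neural network"),
# trained by stochastic gradient descent on log-loss. The features and
# labels below are invented for illustration only.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(X, y, lr=0.5, epochs=2000):
    """Fit weights w and bias b by per-sample gradient descent on log-loss."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of log-loss w.r.t. the logit
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

# Synthetic impressions: [ad position score, query-ad match score] -> clicked?
X = [[1.0, 0.9], [1.0, 0.2], [0.2, 0.8], [0.1, 0.1]]
y = [1, 0, 1, 0]
w, b = train(X, y)
ctr = sigmoid(sum(wj * xj for wj, xj in zip(w, X[0])) + b)      # clicked sample
ctr_neg = sigmoid(sum(wj * xj for wj, xj in zip(w, X[3])) + b)  # unclicked sample
```

A real system would replace this single unit with a multi-layer network and combine its output with boosted trees, but the log-loss objective and the probability interpretation carry over.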

Keywords: neural networks, sponsored search, web advertisement, click prediction, click-through rate

Procedia PDF Downloads 572
7775 Seismic Behavior of Suction Caisson Foundations

Authors: Mohsen Saleh Asheghabadi, Alireza Jafari Jebeli

Abstract:

Increasing population growth requires more sustainable development of energy, and wind is a non-polluting source with an inexhaustible supply. One of the vital parameters in such structures is the choice of foundation type. Suction caissons are now used extensively worldwide for offshore wind turbines. Considering the presence of a number of offshore wind farms in earthquake areas, studying the seismic behavior of suction caissons is necessary for better design. In this paper, the results obtained from three suction caisson models with different diameters (D) and skirt lengths (L) in saturated sand were compared with centrifuge test results. All models are analyzed using the 3D finite element (FE) method, taking into account the elasto-plastic Mohr-Coulomb constitutive model for the soil, which is available in the ABAQUS library. The earthquake load was applied to the base of the models with a maximum acceleration of 0.65g. The results showed that the numerical method is in relatively good agreement with the centrifuge results. The settlement and rotation of the foundation decrease with increasing skirt length and foundation diameter. The sand outside the caisson is prone to liquefaction due to its low confinement.

Keywords: liquefaction, suction caisson foundation, offshore wind turbine, numerical analysis, seismic behavior

Procedia PDF Downloads 119
7774 Effects of Temperature and Cysteine Addition on Formation of Flavor from Maillard Reaction Using Xylose and Rapeseed Meal Peptide

Authors: Zuoyong Zhang, Min Yu, Jinlong Zhao, Shudong He

Abstract:

The Maillard reaction can produce flavor-enhancing substances through chemical crosslinking between the free amino groups of a protein or polypeptide and the carbonyl groups of a reducing sugar. In this research, solutions of rapeseed meal peptide and D-xylose with or without L-cysteine (RXC or RX) were heated over a range of temperatures (80-140 °C) for 2 h. It was observed that the RXs underwent severe browning, while the RXCs showed a greater pH decrease as the temperature increased. The correlations among the quantitative sensory descriptive analysis, free amino acid (FAA), and GC-MS data of the RXCs and RXs were then analyzed using the partial least squares regression method. The results suggested that the Maillard reaction product (MRP) with cysteine formed at 120 °C (RXC-120) had better sensory properties, especially meat-like flavor, than the other MRPs. Meanwhile, based on the FAA data, glutamic acid and glycine not only made a positive contribution to the meaty aroma but also had a significant positive influence on the umami taste of the RXs. Moreover, the sulfur-containing compounds showed a significant positive correlation with the meat-like flavor of the RXCs, while the RXs, with a more caramel-like flavor, depended on furans and nitrogen-containing compounds. Therefore, an MRP with a strong meaty flavor could be obtained at 120 °C by the addition of cysteine.
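The partial least squares step can be sketched as a one-component PLS1 fit (NIPALS-style). The tiny data set below is invented purely to show the mechanics of regressing one response on several collinear predictors; it is not drawn from the study's FAA or GC-MS measurements.

```python
import math

# Minimal one-component PLS1: project centered X onto a single weight
# vector proportional to X'y, then regress y on the resulting scores.
# Rows are samples; columns stand in for e.g. volatile-compound peaks.

def center(col):
    m = sum(col) / len(col)
    return [v - m for v in col], m

def pls1_fit(X, y):
    """One PLS component: returns unit weights w and score-regression q."""
    ncols = len(X[0])
    Xc, xmeans = zip(*[center([row[j] for row in X]) for j in range(ncols)])
    yc, ymean = center(y)
    w = [sum(Xc[j][i] * yc[i] for i in range(len(y))) for j in range(ncols)]
    norm = math.sqrt(sum(v * v for v in w))
    w = [v / norm for v in w]  # unit-length weight vector
    t = [sum(w[j] * Xc[j][i] for j in range(ncols)) for i in range(len(y))]
    q = sum(ti * yi for ti, yi in zip(t, yc)) / sum(ti * ti for ti in t)
    return w, q, list(xmeans), ymean

def pls1_predict(row, w, q, xmeans, ymean):
    return q * sum(w[j] * (row[j] - xmeans[j]) for j in range(len(row))) + ymean

X = [[1.0, 0.2, 0.1], [2.0, 0.1, 0.2], [3.0, 0.3, 0.1],
     [4.0, 0.2, 0.3], [5.0, 0.1, 0.2], [6.0, 0.3, 0.1]]
y = [1.1, 2.0, 3.1, 4.0, 5.1, 6.0]  # hypothetical sensory scores
w, q, xmeans, ymean = pls1_fit(X, y)
yhat = [pls1_predict(row, w, q, xmeans, ymean) for row in X]
```

A full PLS analysis would extract several components and cross-validate their number, but the single-component case already shows how the method handles many correlated predictors with one latent direction.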

Keywords: rapeseed meal, Maillard reaction, sensory characteristics, FAA, GC–MS, partial least square regression

Procedia PDF Downloads 267
7773 A Cross-Sectional Observational Study of the Prescription Pattern of Gastro-Protective Drugs with Non-Steroidal Anti-Inflammatory Drugs in Nilgiris, India

Authors: B.S. Roopa

Abstract:

Objectives: To investigate the prevalence of concomitant use of gastro-protective drugs (GPDs) in patients treated with NSAIDs, and of GPDs prescribed at the recommended dose and frequency as prophylaxis, and also to determine the association between risk factors and the prescription of GPDs in patients treated with NSAIDs. Methods: The study was a prospective, observational, cross-sectional survey. Data from patients with prescriptions of NSAIDs at the out-patient departments of a secondary care hospital, Nilgiris, India, were collected in a specially designed proforma over a period of 45 days. Data were analyzed using χ2 tests for discrete variables. Factors that might be associated with the prescription of GPDs with NSAIDs were assessed in multiple logistic regression models. Results: Three hundred and three patients were included in this study, and the rate of GPD prescription was 89.1%. Most of the patients received an H2-receptor antagonist and, to a lesser degree, an antacid or proton pump inhibitor. Patients with a history of GI ulcer/bleeding were much more likely to be co-prescribed a GPD than those with no history of GI disorders. Compared with patients managed in a general outpatient clinic, those managed in the secondary care hospital in Nilgiris, India, were more likely to receive a GPD. Conclusions: The rate of GPD prescription with NSAIDs is high. Patients were prescribed an H2RA at a dose of 150 mg twice daily, which is not effective in reducing the risk of NSAID-induced gastric ulcers. Only the frequency of NSAID prescription was considered a significant determinant of co-prescription with GPDs in patients < 65 years and ≥ 65 years old.

Keywords: gastro-protective agents, non-steroidal anti-inflammatory agents

Procedia PDF Downloads 296
7772 Robotics Technology Supported Pedagogic Models in Science, Technology, Engineering, Arts and Mathematics Education

Authors: Sereen Itani

Abstract:

As the world aspires to technological innovation, innovative robotics technology-supported pedagogic models in STEAM education (Science, Technology, Engineering, Arts, and Mathematics) are critical in our global education system to build and enhance the next generation's 21st-century skills. Thus, diverse international schools endeavor to construct an integrated robotics- and technology-enhanced curriculum based on interdisciplinary subjects. Accordingly, it is vital that the globe remains resilient in STEAM fields by equipping future learners and educators with innovative technology experiences through robotics to support such fields. A variety of advanced teaching methods is employed to learn about robotics technology-integrated pedagogic models. It is only when STEAM and innovations in robotic technology become integrated with real-world applications that transformational learning can occur. The implementation of robotics STEAM education faces major challenges globally, and STEAM skills and concepts are often communicated in isolation from the real world. Instilling a passion for robotics and STEAM subjects and preparing educators could lead students to major in such fields by acquiring enough knowledge to make vital contributions to the global STEAM industries. This necessitates the establishment of pedagogic models such as innovative robotics technologies to enhance STEAM education and develop students' 21st-century skills. Moreover, an ICT-supported innovative robotics classroom will help educators empower and assess students academically. Globally, robotics design systems and platforms are being developed in schools and university labs, creating a suitable environment for cross-discipline robotics STEAM learning.
Accordingly, the research aims at raising awareness of the importance of robotics design systems and of methodologies for the effective employment of robotics innovative technology-supported pedagogic models to enhance and develop STEAM education globally and build the next generation's 21st-century skills.

Keywords: education, robotics, STEAM (Science, Technology, Engineering, Arts and Mathematics Education), challenges

Procedia PDF Downloads 384
7771 Computer Simulation Studies of Aircraft Wing Architectures on Vibration Responses

Authors: Shengyong Zhang, Mike Mikulich

Abstract:

Vibration is a crucial limiting consideration in the analysis and design of airplane wing structures to avoid disastrous failures due to the propagation of existing cracks in the material. In this paper, we build CAD models of aircraft wings to capture the design intent with configurations. Subsequent FEA vibration analysis is performed to study the natural vibration properties and impulsive responses of the resulting user-defined wing models. This study reveals the variations of the wing’s vibration characteristics with respect to changes in its structural configurations. Integrating CAD modelling and FEA vibration analysis enables designers to improve wing architectures for implementing design requirements in the preliminary design stage.

Keywords: aircraft wing, CAD modelling, FEA, vibration analysis

Procedia PDF Downloads 165
7770 A High Content Screening Platform for the Accurate Prediction of Nephrotoxicity

Authors: Sijing Xiong, Ran Su, Lit-Hsin Loo, Daniele Zink

Abstract:

The kidney is a major target for toxic effects of drugs, industrial and environmental chemicals and other compounds. Typically, nephrotoxicity is detected late during drug development, and regulatory animal models could not solve this problem. Validated or accepted in silico or in vitro methods for the prediction of nephrotoxicity are not available. We have established the first and currently only pre-validated in vitro models for the accurate prediction of nephrotoxicity in humans and the first predictive platforms based on renal cells derived from human pluripotent stem cells. In order to further improve the efficiency of our predictive models, we recently developed a high content screening (HCS) platform. This platform employed automated imaging in combination with automated quantitative phenotypic profiling and machine learning methods. 129 image-based phenotypic features were analyzed with respect to their predictive performance in combination with 44 compounds with different chemical structures that included drugs, environmental and industrial chemicals and herbal and fungal compounds. The nephrotoxicity of these compounds in humans is well characterized. A combination of chromatin and cytoskeletal features resulted in high predictivity with respect to nephrotoxicity in humans. Test balanced accuracies of 82% or 89% were obtained with human primary or immortalized renal proximal tubular cells, respectively. Furthermore, our results revealed that a DNA damage response is commonly induced by different PTC-toxicants with diverse chemical structures and injury mechanisms. Together, the results show that the automated HCS platform allows efficient and accurate nephrotoxicity prediction for compounds with diverse chemical structures.

Keywords: high content screening, in vitro models, nephrotoxicity, toxicity prediction

Procedia PDF Downloads 313
7769 Using Mathematical Models to Predict the Academic Performance of Students from Initial Courses in Engineering School

Authors: Martín Pratto Burgos

Abstract:

The Engineering School of the University of the Republic in Uruguay has offered an Introductory Mathematical Course since the second semester of 2019. This course has been designed to assist students in preparing for the math courses that are essential for engineering degrees, namely Math1, Math2, and Math3 in this research. The research proposes to build a model that can accurately predict a student's activity and academic progress based on their performance in the three essential mathematical courses. Additionally, there is a need for a model that can forecast the effect of the Introductory Mathematical Course on approval of the three essential courses during the first academic year. The techniques used are Principal Component Analysis and predictive modelling using the Generalised Linear Model. The dataset includes information from 5135 engineering students and 12 different characteristics based on activity and course performance. Two models are created for data that follow a binomial distribution using the R programming language. Model 1 retains variables whose p-values are less than 0.05, and Model 2 uses the stepAIC function to remove variables and obtain the lowest AIC score. After Principal Component Analysis, the main component represented on the y-axis is approval of the Introductory Mathematical Course, and the x-axis represents approval of the Math1 and Math2 courses as well as student activity three years after taking the Introductory Mathematical Course. Model 2, which considered student activity, performed best, with an AUC of 0.81 and an accuracy of 84%. According to Model 2, a student's engagement in school activities will continue for three years after approval of the Introductory Mathematical Course, because they have successfully completed the Math1 and Math2 courses; passing the Math3 course does not have any effect on the student's activity. Concerning academic progress, the best fit is Model 1, with an AUC of 0.56 and an accuracy of 91%. This model indicates that if students pass the three first-year courses, they will progress according to the timeline set by the curriculum. Both models show that the Introductory Mathematical Course does not directly affect a student's activity or academic progress. The best model for explaining the impact of the Introductory Mathematical Course on the three first-year courses was Model 1, with an AUC of 0.76 and 98% accuracy. It shows that passing the Introductory Mathematical Course helps students pass the Math1 and Math2 courses without affecting their performance in the Math3 course. Matching the three predictive models: if students pass the Math1 and Math2 courses, they will stay active for three years after taking the Introductory Mathematical Course and will continue following the recommended engineering curriculum; additionally, the Introductory Mathematical Course helps students pass Math1 and Math2 when they start Engineering School. The models obtained in the research do not consider the time students took to pass the three math courses, but they can successfully assess courses in the university curriculum.
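The AUC figures reported above can be illustrated with the Mann-Whitney formulation of ROC AUC: the probability that a randomly chosen positive case scores above a randomly chosen negative one. The labels and scores below are made up for the sketch, not the study's data.

```python
# ROC AUC via pairwise comparisons (Mann-Whitney U formulation).
# Ties between a positive and a negative score count as half a win.

def auc(labels, scores):
    """AUC = P(score of a random positive > score of a random negative)."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Hypothetical predicted probabilities of a student staying active.
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.1]
model_auc = auc(labels, scores)
```

This quadratic-time version is only for illustration; production code would sort once and use rank sums, but the value computed is the same.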

Keywords: machine-learning, engineering, university, education, computational models

Procedia PDF Downloads 95
7768 Institutional Quality and Tax Compliance: A Cross-Country Regression Evidence

Authors: Debi Konukcu Onal, Tarkan Cavusoglu

Abstract:

In modern societies, the costs of public goods and services are shared through taxes paid by citizens. However, taxation has always been a frictional issue, as tax obligations are perceived as a financial burden on taxpayers rather than a merit that fulfills the redistribution, regulation and stabilization functions of the welfare state. The tax compliance literature has evolved into discussing why people still pay taxes in systems with low costs of legal enforcement. Related empirical and theoretical work shows that a wide range of socially oriented behavioral factors can stimulate voluntary compliance, as well as subversive effects. These behavioral motivations are argued to be driven by the self-enforcing rules of informal institutions, either independently or through interactions with legal orders set by formal institutions. The main focus of this study is to investigate empirically whether institutional particularities have a significant role in explaining cross-country differences in tax noncompliance levels. A part of the controversy about the driving forces behind tax noncompliance may be attributed to the lack of empirical evidence. Thus, this study aims to fill this gap through regression estimates, which help to trace the link between institutional quality and noncompliance on a cross-country basis. The tax evasion estimates of Buehn and Schneider are used as the proxy measure for tax noncompliance levels. Institutional quality is quantified by three different indicators (percentile ranks of the Worldwide Governance Indicators, ratings of the International Country Risk Guide, and the country ratings of Freedom in the World). Robust Least Squares and Threshold Regression estimates based on a sample of Organization for Economic Co-operation and Development (OECD) countries imply that tax compliance increases with institutional quality.
Moreover, a threshold-based asymmetry is detected in the effect of institutional quality on tax noncompliance. That is, the negative effects of tax burdens on compliance are found to be more pronounced in countries with institutional quality below a certain threshold. These findings are robust to all alternative indicators of institutional quality, supporting the significant interaction of societal values with the individual taxpayer decisions.
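The threshold-based asymmetry can be sketched as a simple grid-search threshold regression: the slope of noncompliance on the tax burden is allowed to differ on either side of an institutional-quality threshold, and the threshold minimizing total squared error is selected. The data and the candidate thresholds below are synthetic; this is a stylized illustration of the estimator, not the authors' specification.

```python
# Two-regime threshold regression: split the sample at a candidate
# institutional-quality threshold tau, fit one OLS line per regime,
# and keep the tau with the smallest combined squared error.

def ols_line(xs, ys):
    """Least-squares slope and intercept for y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def sse(xs, ys, slope, intercept):
    return sum((y - slope * x - intercept) ** 2 for x, y in zip(xs, ys))

def fit_threshold(quality, burden, noncomp, candidates):
    """Return (error, tau, low-regime slope, high-regime slope)."""
    best = None
    for tau in candidates:
        lo = [i for i, q in enumerate(quality) if q < tau]
        hi = [i for i, q in enumerate(quality) if q >= tau]
        if len(lo) < 2 or len(hi) < 2:
            continue
        xl, yl = [burden[i] for i in lo], [noncomp[i] for i in lo]
        xh, yh = [burden[i] for i in hi], [noncomp[i] for i in hi]
        s_lo, b_lo = ols_line(xl, yl)
        s_hi, b_hi = ols_line(xh, yh)
        err = sse(xl, yl, s_lo, b_lo) + sse(xh, yh, s_hi, b_hi)
        if best is None or err < best[0]:
            best = (err, tau, s_lo, s_hi)
    return best

# Below the threshold, the burden raises noncompliance steeply; above, barely.
quality = [0.2, 0.3, 0.4, 0.45, 0.7, 0.8, 0.85, 0.9]
burden  = [1.0, 2.0, 3.0, 4.0, 1.0, 2.0, 3.0, 4.0]
noncomp = [2.0, 4.1, 6.0, 7.9, 1.0, 1.1, 1.2, 1.3]
err, tau, slope_lo, slope_hi = fit_threshold(quality, burden, noncomp, [0.5, 0.6])
```

In this toy sample the low-quality regime recovers a much steeper burden-noncompliance slope than the high-quality regime, mirroring the asymmetry the abstract reports.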

Keywords: institutional quality, OECD economies, tax compliance, tax evasion

Procedia PDF Downloads 134
7767 Reconstruction of Holographic Dark Energy in Chameleon Brans-Dicke Cosmology

Authors: Surajit Chattopadhyay

Abstract:

The accelerated expansion of the current universe is well established in the literature. Dark energy and modified gravity are two approaches to account for this accelerated expansion. In the present work, we consider scalar field models of dark energy, namely tachyon and DBI-essence, in the framework of chameleon Brans-Dicke cosmology. The equation of state parameter is reconstructed, and the subsequent cosmological implications are studied. We examined the stability of the obtained solutions for the crossing of the phantom divide under a quantum correction of massless conformally invariant fields, and we have seen that the quantum correction could be small when the phantom crossing occurs and that the obtained solutions of the phantom crossing could be stable under the quantum correction. In the subsequent phase, we established a correspondence between the NHDE model and the quintessence, DBI-essence and tachyon scalar field models in the framework of chameleon Brans-Dicke cosmology. We reconstruct the potentials and the dynamics for the three scalar field models considered. The reconstructed potentials are found to increase with the evolution of the universe, and at a very late stage they are observed to decay.

Keywords: dark energy, holographic principle, modified gravity, reconstruction

Procedia PDF Downloads 412