Search results for: general linear regression model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24230

21770 Assessment of Pastoralist-Crop Farmers Conflict and Food Security of Farming Households in Kwara State, Nigeria

Authors: S. A. Salau, I. F. Ayanda, I. Afe, M. O. Adesina, N. B. Nofiu

Abstract:

Food insecurity is still a critical challenge among rural and urban households in Nigeria. The country’s food insecurity situation became more pronounced due to frequent conflict between pastoralists and crop farmers. Thus, this study assesses pastoralist-crop farmer conflict and the food security of farming households in Kwara State, Nigeria. The specific objectives are to measure the food security status of the respondents, quantify pastoralist-crop farmer conflict, determine the effect of this conflict on food security, and describe the coping strategies adopted by the respondents to reduce the effect of food insecurity. A combination of purposive and simple random sampling techniques will be used to select 250 farming households for the study. The analytical tools include descriptive statistics, a Likert scale, logistic regression, and a food security index. Using the food security index approach, the percentages of households that are food secure and food insecure will be determined. Pastoralist-crop farmer conflict will be measured empirically by quantifying losses due to the conflict. The logistic regression will indicate whether pastoralist-crop farmer conflict is a critical determinant of food security among farming households in the study area. The coping strategies employed by the respondents to cushion the effects of food insecurity will also be revealed. Empirical studies on the effect of pastoralist-crop farmer conflict on food security are rare in the literature. This study will quantify conflict and reveal the direction as well as the extent of the relationship between conflict and food security. It could contribute to the identification and formulation of strategies for minimizing conflict between pastoralists and crop farmers in an attempt to reduce food insecurity. Moreover, this study could serve as valuable reference material for future research and open up new areas for further research.
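
As a rough illustration of the food security index approach mentioned above, one common convention (not necessarily the authors' exact specification) classifies a household as food secure when its per-capita food expenditure is at least two-thirds of the mean per-capita expenditure across surveyed households. A minimal sketch with purely illustrative numbers:

```python
# Hypothetical sketch of a food security index: a household is food secure
# when its per-capita food expenditure is at least 2/3 of the sample mean.
# The expenditure figures below are illustrative, not study data.

def food_security_index(per_capita_expenditures):
    """Return (index per household, share of food-secure households)."""
    mean_exp = sum(per_capita_expenditures) / len(per_capita_expenditures)
    threshold = (2.0 / 3.0) * mean_exp
    # An index Z_i >= 1 marks household i as food secure.
    indices = [exp / threshold for exp in per_capita_expenditures]
    secure_share = sum(1 for z in indices if z >= 1) / len(indices)
    return indices, secure_share

households = [1200, 800, 450, 950, 300, 700]   # per-capita food spend (illustrative)
indices, share = food_security_index(households)
print(f"{share:.0%} of households are food secure")
```

The logistic regression described in the abstract would then use this binary secure/insecure status as its dependent variable.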

Keywords: agriculture, conflict, coping strategies, food security, logistic regression

Procedia PDF Downloads 191
21769 A Comparative Study of the Techno-Economic Performance of the Linear Fresnel Reflector Using Direct and Indirect Steam Generation: A Case Study under High Direct Normal Irradiance

Authors: Ahmed Aljudaya, Derek Ingham, Lin Ma, Kevin Hughes, Mohammed Pourkashanian

Abstract:

Researchers, power companies, and state politicians have given concentrated solar power (CSP) much attention due to its capacity to generate large amounts of electricity while overcoming the intermittent nature of solar resources. The Linear Fresnel Reflector (LFR) is a well-known CSP technology that is inexpensive and has a low land-use factor, but suffers from low optical efficiency. The LFR has been considered a cost-effective alternative to the Parabolic Trough Collector (PTC) because of its simple design, which often outweighs its lower efficiency. The LFR has been found to be a promising option for directly producing steam for a thermal cycle in order to generate low-cost electricity, and it has also shown promise for indirect steam generation. The purpose of this analysis is to compare the annual performance of Direct Steam Generation (DSG) and Indirect Steam Generation (ISG) LFR power plants using molten salt and other Heat Transfer Fluids (HTF) in order to investigate their technical and economic effects. A 50 MWe solar-only system is examined as a case study for both steam production methods under extreme weather conditions. In addition, a parametric analysis is carried out to determine the optimal solar field size that provides the lowest Levelized Cost of Electricity (LCOE) while achieving the highest technical performance. Optimizing the solar field size shows that a solar multiple (SM) between 1.2 and 1.5 achieves an LCOE as low as 9 cents/kWh for direct steam generation with the linear Fresnel reflector. In addition, the power plant is capable of producing around 141 GWh annually with a capacity factor of up to 36%, whereas the ISG plant produces less energy at a higher cost. The optimization results show that DSG outperforms ISG, producing around 3% more annual energy at a 2% lower LCOE and 28% lower capital cost.

Keywords: concentrated solar power, levelized cost of electricity, linear Fresnel reflectors, steam generation

Procedia PDF Downloads 111
21768 Code Mixing and Code-Switching Patterns in Kannada-English Bilingual Children and Adults Who Stutter

Authors: Vasupradaa Manivannan, Santosh Maruthy

Abstract:

Background/Aims: Preliminary evidence suggests that code-switching and code-mixing may act as voluntary coping behaviors to avoid stuttering characteristics in children and adults; however, less is known about the types and patterns of code-mixing (CM) and code-switching (CS), and it is not known how they differ between children and adults who stutter. This study aimed to identify and compare the CM and CS patterns of Kannada-English bilingual children and adults who stutter. Method: A standard group comparison was made between five children who stutter (CWS) in the age range of 9-13 years and five adults who stutter (AWS) in the age range of 20-25 years. Participants proficient in Kannada (first language, L1) and English (second language, L2) were considered for the study. Two tasks were given to both groups: a) a general conversation (GC) with 10 random questions, and b) a narration task (NAR) (a story or a general topic, e.g., a memorable life event), each performed in three different conditions: monolingual Kannada (MK), monolingual English (ME), and bilingual (BIL). The children and adults were assessed online (via a Zoom session) with a high-quality internet connection. The audio and video samples of the full assessment session were auto-recorded and manually transcribed. The recorded samples were analyzed for the percentage of dysfluencies using the SSI-4, and the CM and CS exhibited by each participant were analyzed using the Matrix Language Frame (MLF) model parameters. The obtained data were analyzed using the Statistical Package for the Social Sciences (SPSS) software (Version 20.0). Results: The mean, median, and standard deviation values were obtained for the percentage of dysfluencies (%SS) and the frequency of CM and CS in Kannada-English bilingual children and adults who stutter for the various parameters obtained through the MLF model.
The inferential results indicated that %SS varied significantly between populations (AWS vs. CWS), languages (L1 vs. L2), and tasks (GC vs. NAR), but not across free (BIL) and bound (MK, ME) conditions. It was also found that the frequency of CM and CS patterns varies between CWS and AWS. The AWS had a lower %SS but greater use of CS patterns than CWS, which may be due to their more extensive coping skills. Language-mixing patterns were observed more in L1 than in L2, and this was significant for most of the MLF parameters. However, %SS was significantly higher (P<0.05) in L2 than in L1. The CM and CS patterns occurred more in conditions 1 and 3 than in condition 2, which may be due to the higher proficiency in L2 than in L1. Conclusion: The findings highlight the importance of assessing CM and CS behaviors, their patterns, and the frequency of CM and CS between CWS and AWS on MLF parameters in two different tasks across three conditions. The results help us to understand CM and CS strategies in bilingual persons who stutter.

Keywords: bilinguals, code mixing, code switching, stuttering

Procedia PDF Downloads 78
21767 Impact of Interest and Foreign Exchange Rates Liberalization on Investment Decision in Nigeria

Authors: Kemi Olalekan Oduntan

Abstract:

This study was carried out to empirically and descriptively analyse how interest rate and foreign exchange rate liberalization influence investment decisions in Nigeria. The study spanned the period 1985–2014; secondary data were restricted to relevant variables such as investment (proxied by gross fixed capital formation), the saving rate, the interest rate, and the foreign exchange rate. Theories and empirical literature from various scholars are reviewed in the paper. The Ordinary Least Squares regression method was used for the analysis of the data collected, and the regression results were critically interpreted and discussed. The empirical findings indicate that investment decisions in Nigeria are highly rate-sensitive. Hence, the alternative hypotheses were accepted while the respective null hypotheses were rejected: the interest rate and the foreign exchange rate have a significant effect on investment in Nigeria. Therefore, the impact of the interest rate and foreign exchange rate on the state of investment in the economy cannot be overemphasized.
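
The Ordinary Least Squares method named above can be sketched for a single regressor using the textbook normal-equation formulas; the data below are purely illustrative and not from the study:

```python
# Minimal OLS sketch for one regressor (illustrative data, not the paper's).

def ols_fit(x, y):
    """Return (intercept, slope) minimizing the sum of squared residuals."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Hypothetical interest rate (%) vs. investment (index units):
interest_rate = [5.0, 7.5, 10.0, 12.5, 15.0]
investment = [90.0, 82.0, 75.0, 66.0, 60.0]
a, b = ols_fit(interest_rate, investment)
print(f"investment = {a:.2f} + {b:.2f} * interest_rate")
```

A negative fitted slope, as here, is the pattern the abstract's rate-sensitivity finding would imply; the full study uses multiple regressors.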

Keywords: interest rate, foreign exchange liberalization, investment decision, economic growth

Procedia PDF Downloads 364
21766 Theoretical Modeling of Self-Healing Polymers Crosslinked by Dynamic Bonds

Authors: Qiming Wang

Abstract:

Dynamic polymer networks (DPNs) crosslinked by dynamic bonds have received intensive attention because of their special crack-healing capability. Diverse DPNs have been synthesized using a number of dynamic bonds, including dynamic covalent bonds, hydrogen bonds, ionic bonds, metal-ligand coordination, hydrophobic interactions, and others. Despite the promising success in polymer synthesis, the fundamental understanding of their self-healing mechanics is still at a very early stage. In particular, a general analytical model of the interfacial self-healing behaviors of DPNs has not been established. Here, we develop polymer-network-based analytical theories that can mechanistically model the constitutive and interfacial self-healing behaviors of DPNs. We consider a DPN composed of interpenetrating networks crosslinked by dynamic bonds that obey force-dependent chemical kinetics. The network chains follow inhomogeneous chain-length distributions, and during the self-healing process the dynamic polymer chains diffuse across the interface to re-form the dynamic bonds, a process modeled by a diffusion-reaction theory. The theories can predict the stress-stretch behaviors of original and self-healed DPNs, as well as the healing strength as a function of healing time. We show that the theoretically predicted healing behaviors consistently match documented experimental results for DPNs with various dynamic bonds, including dynamic covalent bonds (diarylbibenzofuranone and olefin metathesis), hydrogen bonds, and ionic bonds. We expect our model to be a powerful tool for the self-healing community to invent, design, understand, and optimize self-healing DPNs with various dynamic bonds.

Keywords: self-healing polymers, dynamic covalent bonds, hydrogen bonds, ionic bonds

Procedia PDF Downloads 187
21765 Non-Destructive Prediction System Using Near Infrared Spectroscopy for Crude Palm Oil

Authors: Siti Nurhidayah Naqiah Abdull Rani, Herlina Abdul Rahim

Abstract:

Near infrared (NIR) spectroscopy has always been of great interest in the food and agriculture industries, and the development of predictive models has facilitated the estimation process in recent years. In this research, 176 crude palm oil (CPO) samples acquired from Felda Johor Bulker Sdn Bhd were studied. A FOSS NIRSystem was used to take absorbance measurements from the samples. The wavelength range for the spectral measurements was 1600 nm to 1900 nm. A Partial Least Squares Regression (PLSR) prediction model with an optimal number of 50 principal components was implemented to study the relationship between the measured Free Fatty Acid (FFA) values and the measured spectral absorption. PLSR showed predictive ability for FFA values, with a correlation coefficient (R) of 0.9808 for the training set and 0.9684 for the testing set.
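
The PLSR technique used above can be sketched with a minimal NIPALS implementation for a single response variable. This is a generic illustration, not the authors' instrument pipeline, and the tiny two-column input matrix stands in for real NIR spectra:

```python
# Minimal PLS1 regression via the NIPALS algorithm (generic sketch; the
# data below are toy values standing in for absorbance spectra and FFA).
import numpy as np

def pls1_fit(X, y, n_components):
    """Fit PLS1; returns (x_mean, y_mean, b) so y_hat = (X - x_mean) @ b + y_mean."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc                 # weight vector from the covariance X'y
        w /= np.linalg.norm(w)
        t = Xc @ w                    # score vector
        tt = t @ t
        p = Xc.T @ t / tt             # X loading
        q = (yc @ t) / tt             # y loading
        Xc = Xc - np.outer(t, p)      # deflate X
        yc = yc - q * t               # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    b = W @ np.linalg.solve(P.T @ W, Q)   # regression vector in original X space
    return x_mean, y_mean, b

def pls1_predict(model, X):
    x_mean, y_mean, b = model
    return (np.asarray(X, dtype=float) - x_mean) @ b + y_mean

# Toy demonstration: two "wavelengths" instead of a full NIR spectrum.
X = np.array([[1, 2], [2, 1], [3, 4], [4, 3], [5, 7], [6, 5], [7, 8], [8, 6]], float)
y = 2.0 + 1.5 * X[:, 0] - 0.5 * X[:, 1]   # noiseless linear stand-in for FFA
model = pls1_fit(X, y, n_components=2)
pred = pls1_predict(model, X)
```

With as many components as predictors and noiseless data, the fit reproduces the underlying linear relation exactly; on real spectra the component count (50 in the abstract) is tuned by validation.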

Keywords: palm oil, fatty acid, NIRS, PLSR

Procedia PDF Downloads 209
21764 Quality Assessment of New Zealand Mānuka Honeys Using Hyperspectral Imaging Combined with Deep 1D-Convolutional Neural Networks

Authors: Hien Thi Dieu Truong, Mahmoud Al-Sarayreh, Pullanagari Reddy, Marlon M. Reis, Richard Archer

Abstract:

New Zealand mānuka honey is a honeybee product derived mainly from Leptospermum scoparium nectar. The potent antibacterial activity of mānuka honey derives principally from methylglyoxal (MGO), in addition to the hydrogen peroxide and other lesser activities present in all honey. MGO is formed from dihydroxyacetone (DHA), unique to L. scoparium nectar. Mānuka honey also has an idiosyncratic phenolic profile that is useful as a chemical marker. Authentic mānuka honey is highly valuable, but almost all honey is formed from natural mixtures of nectars harvested by a hive over a period of time; once diluted by other nectars, mānuka honey irrevocably loses value. We aimed to apply hyperspectral imaging (HSI) to honey frames before bulk extraction to minimise the dilution of genuine mānuka by other honey and ensure authenticity at the source. This technology is non-destructive and suitable for an industrial setting. Chemometrics using linear Partial Least Squares (PLS) and Support Vector Machine (SVM) models showed limited efficacy in interpreting the chemical footprints due to large non-linear relationships between predictor and predictand in a large sample set, likely due to honey quality variability across geographic regions. Therefore, an advanced modelling approach, one-dimensional convolutional neural networks (1D-CNN), was investigated for analysing the hyperspectral data and extracting biochemical information from honey. The 1D-CNN model showed superior prediction of honey quality (R² = 0.73, RMSE = 2.346, RPD = 2.56) compared to PLS (R² = 0.66, RMSE = 2.607, RPD = 1.91) and SVM (R² = 0.67, RMSE = 2.559, RPD = 1.98). Classification of mono-floral mānuka honey from multi-floral and non-mānuka honey exceeded 90% accuracy for all models tried. Overall, this study reveals the potential of HSI and deep learning modelling for automating the evaluation of honey quality in frames.
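
The RPD figures quoted above are conventionally computed as the standard deviation of the reference values divided by the prediction RMSE, so a higher RPD means the model resolves much more variation than it leaves as error. A minimal sketch with made-up numbers:

```python
# Ratio of performance to deviation (RPD) = SD(reference) / RMSE(prediction).
# The reference/predicted values below are illustrative, not honey data.
import math

def rpd(reference, predicted):
    n = len(reference)
    mean_ref = sum(reference) / n
    sd = math.sqrt(sum((r - mean_ref) ** 2 for r in reference) / (n - 1))
    rmse = math.sqrt(sum((r - p) ** 2 for r, p in zip(reference, predicted)) / n)
    return sd / rmse

reference = [1.0, 2.0, 3.0, 4.0, 5.0]
predicted = [1.1, 1.9, 3.2, 3.8, 5.0]
print(f"RPD = {rpd(reference, predicted):.2f}")
```

By common chemometric rules of thumb, RPD above roughly 2 to 2.5, as reported for the 1D-CNN, indicates a model usable for quantitative prediction.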

Keywords: mānuka honey, quality, purity, potency, deep learning, 1D-CNN, chemometrics

Procedia PDF Downloads 139
21763 Explaining E-Learning Systems Usage in Higher Education Institutions: UTAUT Model

Authors: Muneer Abbad

Abstract:

This research explains e-learning usage at a university in Jordan. The unified theory of acceptance and use of technology (UTAUT) model has been used as the base model to explain usage. UTAUT is a model of individual acceptance compiled mainly from different models of technology acceptance. This research is the initial part of a full explanation of the users' acceptance model that uses the Structural Equation Modelling (SEM) method to explain users' acceptance of e-learning systems based on the UTAUT model. In this part, data have been collected and prepared for further analysis. The main factors of the UTAUT model have been tested as distinct factors using exploratory factor analysis (EFA). The second phase will be confirmatory factor analysis (CFA) and SEM to explain users' acceptance of e-learning systems.

Keywords: e-learning, Moodle, adoption, Unified Theory of Acceptance and Use of Technology (UTAUT)

Procedia PDF Downloads 407
21762 Levy Model for Commodity Pricing

Authors: V. Benedico, C. Anacleto, A. Bearzi, L. Brice, V. Delahaye

Abstract:

The aim of the present paper is to construct an affordable and reliable model of commodity prices based on a recalculation of their cost through time, which allows one to visualize the potential risks and thus make more appropriate decisions regarding forecasts. Attention has been focused on the Lévy model, which is more reliable and realistic than the classical Gaussian random model, as it takes into consideration the abrupt jumps observed in cases of sudden price variation. In an application to the energy trading sector, where it has never been used before, the equations corresponding to the Lévy model have been written for electricity pricing in the European market. Parameters have been set in order to predict and simulate the price and its evolution through time to remarkable accuracy. As predicted by the Lévy model, the results show significant spikes which reach unconventional levels that the currently used Brownian model cannot reproduce.
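
A Merton-style jump-diffusion is one simple member of the Lévy family that produces the price spikes described above. The sketch below (with arbitrary parameters, not the paper's calibration) augments geometric Brownian motion with compound-Poisson log-jumps; setting `jump_rate=0` recovers the pure Brownian path:

```python
# One path of a Merton-style jump-diffusion: GBM plus Poisson-arriving
# log-jumps. Parameters are illustrative, not a calibrated electricity model.
import math
import random

def simulate_jump_diffusion(s0, mu, sigma, jump_rate, jump_scale, dt, steps, rng):
    prices = [s0]
    for _ in range(steps):
        # Ordinary diffusive log-return over one step.
        diffusion = (mu - 0.5 * sigma**2) * dt + sigma * math.sqrt(dt) * rng.gauss(0, 1)
        jump = 0.0
        if rng.random() < jump_rate * dt:      # a spike arrives this step
            jump = rng.gauss(0, jump_scale)    # log-jump size (can be large)
        prices.append(prices[-1] * math.exp(diffusion + jump))
    return prices

rng = random.Random(7)
path = simulate_jump_diffusion(s0=50.0, mu=0.0, sigma=0.2, jump_rate=5.0,
                               jump_scale=1.0, dt=1 / 365, steps=365, rng=rng)
```

Because jumps enter through the exponent, prices stay positive while single-step moves can far exceed anything a Gaussian step of the same volatility would produce.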

Keywords: commodity pricing, Lévy Model, price spikes, electricity market

Procedia PDF Downloads 429
21761 Linear Study of Electrostatic Ion Temperature Gradient Mode with Entropy Gradient Drift and Sheared Ion Flows

Authors: M. Yaqub Khan, Usman Shabbir

Abstract:

The history of plasma research reveals that the continuous struggle of experimentalists and theorists has not yet yielded satisfactory confinement. A change of approach is needed, and entropy offers one: nearly all the relevant quantities, such as number density, temperature, and electrostatic potential, are connected to entropy. Therefore, it is better to reorient the research around it. Using the Braginskii model with Boltzmannian electrons, the effect of velocity shear on the ion temperature gradient mode is studied, incorporating entropy into the magnetoplasma. A new dispersion relation is derived for the ion temperature gradient mode, and its dependence on the entropy gradient drift is shown. It is also seen that velocity shear enhances the instability, but in anomalous transport its role, unlike that of entropy, is not significant. This work will be helpful for the next steps in tokamak and space plasma research.

Keywords: entropy, velocity shear, ion temperature gradient mode, drift

Procedia PDF Downloads 387
21760 Association of the Time in Targeted Blood Glucose Range of 3.9–10 Mmol/L with the Mortality of Critically Ill Patients with or without Diabetes

Authors: Guo Yu, Haoming Ma, Peiru Zhou

Abstract:

BACKGROUND: In addition to hyperglycemia, hypoglycemia, and glycemic variability, a decrease in the time in the targeted blood glucose range (TIR) may be associated with an increased risk of death for critically ill patients. However, the relationship between the TIR and mortality may be influenced by the presence of diabetes and by glycemic variability. METHODS: A total of 998 diabetic and non-diabetic patients with severe diseases in the ICU were selected for this retrospective analysis. The TIR is defined as the percentage of time spent in the target blood glucose range of 3.9–10.0 mmol/L within 24 hours. The relationship between TIR and in-hospital death in diabetic and non-diabetic patients was analyzed, as was the effect of glycemic variability. RESULTS: The binary logistic regression model showed a significant association between the TIR as a continuous variable and the in-hospital death of severely ill non-diabetic patients (OR=0.991, P=0.015). As a classification variable, TIR≥70% was significantly associated with in-hospital death (OR=0.581, P=0.003); specifically, TIR≥70% was a protective factor against the in-hospital death of severely ill non-diabetic patients. The TIR of severely ill diabetic patients was not significantly associated with in-hospital death; however, glycemic variability was significantly and independently associated with in-hospital death (OR=1.042, P=0.027). Binary logistic regression analysis of the composite indices showed that for non-diabetic patients, the C3 index (low TIR & high CV) was a risk factor for increased mortality (OR=1.642, P<0.001). In addition, for diabetic patients, the C3 index was an independent risk factor for death (OR=1.994, P=0.008), and the C4 index (low TIR & low CV) was independently associated with increased survival.
CONCLUSIONS: The TIR of non-diabetic patients during ICU hospitalization was associated with in-hospital death even after adjusting for disease severity and glycemic variability. There was no significant association between the TIR and the mortality of diabetic patients. However, for both diabetic and non-diabetic critically ill patients, the combined effect of high TIR and low CV was significantly associated with ICU mortality. Diabetic patients seem to have higher blood glucose fluctuations and can tolerate a larger TIR range. Both diabetic and non-diabetic critically ill patients should maintain blood glucose levels within the target range to reduce mortality.
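
With equally spaced glucose readings, the TIR defined in the Methods can be approximated as the fraction of readings inside the 3.9–10.0 mmol/L band. A minimal sketch with illustrative values:

```python
# TIR as the share of (equally spaced) glucose readings inside the target
# band. The readings below are illustrative, not patient data.

def time_in_range(glucose_mmol_l, low=3.9, high=10.0):
    """Percentage of readings within [low, high]."""
    in_range = sum(1 for g in glucose_mmol_l if low <= g <= high)
    return 100.0 * in_range / len(glucose_mmol_l)

readings = [3.5, 4.2, 6.8, 9.9, 11.2, 7.4, 5.0, 10.5]  # e.g. 3-hourly over 24 h
print(f"TIR = {time_in_range(readings):.1f}%")
```

A patient with this profile would fall below the TIR≥70% threshold the study uses as its classification cut-off.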

Keywords: severe disease, diabetes, blood glucose control, time in targeted blood glucose range, glycemic variability, mortality

Procedia PDF Downloads 222
21759 Finding DEA Targets Using Multi-Objective Programming

Authors: Farzad Sharifi, Raziyeh Shamsi

Abstract:

In this paper, we obtain the projection of inefficient units in data envelopment analysis (DEA) in the case of stochastic inputs and outputs using the multi-objective programming (MOP) structure. In some problems the inputs might be stochastic while the outputs are deterministic, and vice versa. In such cases, we propose a multi-objective DEA-R model, because in some cases (e.g., when unnecessary and irrational weights in the BCC model reduce the efficiency score) an efficient DMU is introduced as inefficient by the BCC model, whereas the DMU is considered efficient by the DEA-R model. In other cases, only the ratio of stochastic data may be available (e.g., the ratio of stochastic inputs to stochastic outputs). Thus, we provide a multi-objective DEA model without explicit outputs and prove that the input-oriented MOP DEA-R model in the constant returns-to-scale case can be replaced by the MOP-DEA model without explicit outputs in the variable returns-to-scale case, and vice versa. Using interactive methods to solve the proposed model yields a projection corresponding to the viewpoint of the DM and the analyst, which is nearer to reality and more practical. Finally, an application is provided.
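
As background to the projections discussed above: in the special single-input, single-output deterministic case, CCR efficiency reduces to each DMU's output/input ratio divided by the best observed ratio, and the input-oriented projection scales the input by that score. A toy sketch of this base case, not of the paper's stochastic MOP model:

```python
# Single-input, single-output CCR efficiency (toy DEA sketch; the stochastic
# multi-objective models in the paper generalize this base case).

def ccr_efficiencies(inputs, outputs):
    """Efficiency of each DMU = its output/input ratio over the best ratio."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

inputs = [2.0, 4.0, 5.0, 8.0]    # illustrative DMU inputs
outputs = [1.0, 3.0, 2.0, 6.0]   # illustrative DMU outputs
eff = ccr_efficiencies(inputs, outputs)
# Input-oriented projection: an inefficient DMU reaches the frontier by
# scaling its input down to eff * input while keeping its output.
targets = [(e * i, o) for e, i, o in zip(eff, inputs, outputs)]
```

Here DMUs 2 and 4 define the frontier (efficiency 1), and the target for DMU 1 is the input level it would need to match the best ratio.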

Keywords: DEA, MOLP, stochastic, DEA-R

Procedia PDF Downloads 398
21758 Nonparametric Specification Testing for the Drift of the Short Rate Diffusion Process Using a Panel of Yields

Authors: John Knight, Fuchun Li, Yan Xu

Abstract:

Based on a new nonparametric estimator of the drift function, we propose a consistent test for the parametric specification of the drift function in the short-rate diffusion process using observations from a panel of yields. The test statistic is shown to follow an asymptotic normal distribution under the null hypothesis that the parametric drift function is correctly specified, and converges to infinity under the alternative. Taking the daily 7-day European rates as a proxy for the short rate, we use our test to examine whether the drift of the short-rate diffusion process is linear or nonlinear, which is an important unresolved issue in the short-rate modeling literature. The testing results indicate that none of the drift functions in this literature adequately captures the dynamics of the drift, but the nonlinear specification performs better than the linear one.
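
The paper's drift estimator is new, but the general idea behind nonparametric drift estimation can be illustrated with a standard Nadaraya-Watson kernel regression of the scaled increments on the current level, since the drift is the conditional mean of (X_{t+dt} - X_t)/dt given X_t = x. The path below is a noiseless toy, not yield data:

```python
# Nadaraya-Watson sketch of nonparametric drift estimation from a discretely
# observed path (a standard textbook estimator, not the paper's new one).
import math

def nw_drift_estimate(path, dt, x, bandwidth):
    """Kernel-weighted average of the scaled increments (X_{t+dt}-X_t)/dt near x."""
    num = den = 0.0
    for x_t, x_next in zip(path, path[1:]):
        w = math.exp(-0.5 * ((x_t - x) / bandwidth) ** 2)   # Gaussian kernel
        num += w * (x_next - x_t) / dt
        den += w
    return num / den

# Noiseless toy path generated by the drift mu(x) = 2 - x, so the estimator
# can be checked against the known drift.
mu_true = lambda s: 2.0 - s
dt = 0.1
path = [0.0]
for _ in range(50):
    path.append(path[-1] + mu_true(path[-1]) * dt)   # Euler step, no diffusion

est = nw_drift_estimate(path, dt, x=path[3], bandwidth=0.01)
```

On noisy data the bandwidth trades bias against variance; the consistency of such estimators is what underpins a specification test comparing the nonparametric fit to a parametric candidate.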

Keywords: diffusion process, nonparametric estimation, derivative security price, drift function and volatility function

Procedia PDF Downloads 368
21757 An Experiment of Three-Dimensional Point Clouds Using GoPro

Authors: Jong-Hwa Kim, Mu-Wook Pyeon, Yang-dam Eo, Ill-Woong Jang

Abstract:

The construction of geospatial information has recently been developing toward multi-dimensional geospatial information, and the population constructing spatial information is expanding from a small group of experts to the general public. In addition, studies are in progress using a variety of devices, with the aim of near real-time updates. In this paper, stereo images were acquired using a GoPro, a device widely available to the general public as well as to experts. After correcting the distortion of the images, point clouds were acquired using SIFT and DLT. On the basis of this experiment, we present the possibility of creating a real-time digital map using a video device that is readily available in everyday life.

Keywords: GoPro, SIFT, DLT, point clouds

Procedia PDF Downloads 469
21756 Willingness to Pay for Improvements of MSW Disposal: Views from Online Survey

Authors: Amornchai Challcharoenwattana, Chanathip Pharino

Abstract:

With the amount of MSW rising every day, maximizing material diversion from landfills via recycling is preferable to land dumping. Thai MSW is classified as 40-60 per cent compostable waste, while the potentially recyclable materials in the waste stream are composed of plastics, papers, glasses, and metals. However, the rate of material recovery from MSW in Thailand, excluding composting and biogas generation, is still low: Thailand's recycling rate in 2010 was only 20.5 per cent. The central government as well as local governments in Thailand have tried to curb this problem by charging users part of the MSW management fees; however, the fees are often too low to promote MSW minimization. The objective of this paper is to identify levels of willingness to pay (WTP) for MSW recycling in different social structures, with the expected outcome of sustainable MSW management for different town settlements that maximizes MSW recycling according to each town's potential. WTP was elicited using a payment card. The questionnaire was deployed as an online survey during December 2012. Responses were categorized by whether respondents lived in Bangkok, in other municipality areas, or outside municipality areas, and were analysed using descriptive statistics and multiple linear regression analysis to identify relationships and factors that could influence high or low WTP. During the survey period, there were 168 completed questionnaires from a total of 689 visits; however, only 96 questionnaires were usable. Among the respondents in the usable questionnaires, 36 lived within the boundary of the Bangkok Metropolitan Administration (BMA), while 45 lived in chartered areas classified as other municipalities outside the BMA. Most respondents were well off: 75 reported a positive monthly cash flow (77.32%), 15 reported a neutral monthly cash flow (15.46%), and 7 reported a negative monthly cash flow (7.22%).
For the WTP data, including WTP of 0 baht among valid responses, the mean WTP of respondents for good MSW management, ranked from highest to lowest by geographical location, was Bangkok (196 baht/month), municipalities (154 baht/month), and non-urbanized towns (111 baht/month). An in-depth analysis was conducted to determine whether there is room for further increases in MSW management fees beyond the current payment each respondent is making. The results from the multiple regression analysis suggested that the following factors could affect the increase or decrease of WTP: income, age, and gender. Overall, the outcome of this study suggests that survey respondents are likely to support improvements to MSW treatment that do not rely solely on landfilling. A recommendation for further studies is to obtain larger sample sizes in order to improve statistical power and provide better accuracy in the WTP study.

Keywords: MSW, willingness to pay, payment card, waste separation

Procedia PDF Downloads 290
21755 Model Predictive Controller for Pasteurization Process

Authors: Tesfaye Alamirew Dessie

Abstract:

Our study focuses on developing a Model Predictive Controller (MPC) and evaluating it against a traditional PID controller for a pasteurization process. The dynamics of the pasteurization process were estimated using system identification on the experimental data. The quality of several model architectures was evaluated using best fit with data validation, residual analysis, and stability analysis. The validation data fit the auto-regressive with exogenous input (ARX322) model of the pasteurization process by roughly 80.37 percent. The ARX322 model structure was then used to design the MPC and PID control techniques. After comparing controller performance based on settling time, overshoot percentage, and stability analysis, it was found that the MPC controller outperforms the PID controller on those parameters.
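
As a rough illustration of the PID side of the comparison above, a discrete PID loop can be closed around a toy first-order plant standing in for the identified ARX dynamics. The gains and plant coefficients here are arbitrary, not the paper's tuned values:

```python
# Discrete PID loop on a toy first-order plant y[k+1] = a*y[k] + b*u[k],
# a stand-in for the identified ARX pasteurizer model (all numbers illustrative).

def simulate_pid(kp, ki, kd, setpoint, steps, dt=1.0, a=0.9, b=0.1):
    y, integral = 0.0, 0.0
    prev_error = setpoint          # so the first derivative term is zero
    history = []
    for _ in range(steps):
        error = setpoint - y
        integral += error * dt
        derivative = (error - prev_error) / dt
        u = kp * error + ki * integral + kd * derivative
        prev_error = error
        y = a * y + b * u          # plant update
        history.append(y)
    return history

temps = simulate_pid(kp=2.0, ki=0.5, kd=0.1, setpoint=72.0, steps=200)
```

The integral term drives the steady-state error to zero; settling time and overshoot, the metrics used in the study, can be read directly off `temps`.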

Keywords: MPC, PID, ARX, pasteurization

Procedia PDF Downloads 163
21754 Fem Models of Glued Laminated Timber Beams Enhanced by Bayesian Updating of Elastic Moduli

Authors: L. Melzerová, T. Janda, M. Šejnoha, J. Šejnoha

Abstract:

Two finite element (FEM) models are presented in this paper to address the random nature of the response of glued timber structures made of wood segments with variable elastic moduli evaluated from 3600 indentation measurements. This database served to create the same number of ensembles as there were segments in the tested beam. Statistics of these ensembles were then assigned to the corresponding segments of the beams, and the Latin Hypercube Sampling (LHS) method was used to perform 100 simulations, resulting in an ensemble of 100 deflections subjected to statistical evaluation. A detailed geometrical arrangement of the individual segments in the laminated beam was considered in the construction of a two-dimensional FEM model loaded in four-point bending to comply with the laboratory tests. Since laboratory measurements of local elastic moduli may in general suffer from significant experimental error, it appears advantageous to exploit the full-scale measurements of the timber beams, i.e. deflections, to improve their prior distributions with the help of the Bayesian statistical method. This, however, requires an efficient computational model when simulating the laboratory tests numerically. To this end, a simplified model based on Mindlin's beam theory was established. The improved posterior distributions show that the most significant change of the Young's modulus distribution takes place in the laminae in the most strained zones, i.e. in the top and bottom layers within the beam center region. Posterior distributions of the moduli of elasticity were subsequently utilized in the 2D FEM model and compared with the original simulations.
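
The Latin Hypercube Sampling step can be sketched in a few lines: each dimension is cut into as many equal strata as there are samples, one point is drawn per stratum, and the strata are shuffled independently per dimension. This is a generic sketch on the unit hypercube, not the authors' implementation; mapping to elastic-modulus distributions would go through each marginal's inverse CDF:

```python
# Generic Latin Hypercube Sample on [0, 1)^d; one point per stratum and
# dimension guarantees stratified marginal coverage.
import random

def latin_hypercube(n_samples, n_dims, rng):
    samples = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        strata = list(range(n_samples))
        rng.shuffle(strata)                      # random stratum order per dimension
        for i, s in enumerate(strata):
            samples[i][d] = (s + rng.random()) / n_samples   # point within stratum
    return samples

rng = random.Random(42)
pts = latin_hypercube(100, 2, rng)   # e.g. 100 simulations over two random moduli
```

Compared with plain Monte Carlo, 100 LHS points already cover every percentile of each marginal exactly once, which is why the study needs only 100 FEM runs for its deflection ensemble.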

Keywords: Bayesian inference, FEM, four point bending test, laminated timber, parameter estimation, prior and posterior distribution, Young’s modulus

Procedia PDF Downloads 283
21753 Locus of Control, Metacognitive Knowledge, Metacognitive Regulation, and Student Performance in an Introductory Economics Course

Authors: Ahmad A. Kader

Abstract:

In the Principles of Microeconomics course taught during the Fall Semester 2019, 158 out of 179 students participated in the completion of two questionnaires and a survey describing their demographic and academic profiles. The two questionnaires comprise the 29 items of the Rotter Locus of Control Scale and the 52 items of the Schraw and Dennison Metacognitive Awareness Scale; the 52 items consist of 17 items describing knowledge of cognition and 35 items describing the regulation of cognition. The paper is intended to show the combined influence of locus of control, metacognitive knowledge, and metacognitive regulation on student performance. The survey covers variables that have been tested and recognized in the economic education literature, including GPA, gender, age, course level, race, student classification, whether the course was required or elective, employment, whether a high school economics course was taken, and attendance. Regression results show that, of the economic education variables, GPA, classification, whether the course was required or elective, and attendance are the only significant influences on student grade. Of the educational psychology variables, the regression results show that the locus of control variable has a negative and significant effect, while the metacognitive knowledge variable has a positive and significant effect on student grade. Also, the adjusted R-squared value increased markedly with the addition of the locus of control, metacognitive knowledge, and metacognitive regulation variables to the regression equation. The t-test results also show that students who are internally oriented and score high on the metacognitive knowledge scale significantly outperform students who are externally oriented and score low on the metacognitive knowledge scale. The implications of these results for educators are discussed in the paper.

Keywords: locus of control, metacognitive knowledge, metacognitive regulation, student performance, economic education

Procedia PDF Downloads 120
21752 The 10-year Risk of Major Osteoporotic and Hip Fractures Among Indonesian People Living with HIV

Authors: Iqbal Pramukti, Mamat Lukman, Hasniatisari Harun, Kusman Ibrahim

Abstract:

Introduction: People living with HIV have a higher risk of osteoporotic fracture than the general population. The purpose of this study was to predict the 10-year risk of fracture among people living with HIV (PLWH) using FRAX™ and to identify characteristics related to fracture risk. Methodology: This study consisted of 75 subjects. The ten-year probability of major osteoporotic fractures (MOF) and hip fractures (HF) was assessed using the FRAX™ algorithm. Cross-tabulation was used to identify participant characteristics related to fracture risk. Results: The overall mean 10-year probability of fracture was 2.4% (1.7) for MOF and 0.4% (0.3) for hip fractures. Participants with a parental history of hip fracture, smoking behavior, or glucocorticoid use showed higher MOF scores than those without (3.1 vs. 2.5, 4.6 vs. 2.5, and 3.4 vs. 2.5, respectively). The same factors were associated with higher HF scores (0.5 vs. 0.3, 0.8 vs. 0.3, and 0.5 vs. 0.3, respectively). Conclusions: The 10-year risk of fracture was higher among PLWH with several factors, including parental hip fracture history, smoking behavior, and glucocorticoid use. Further analysis using multivariate regression with a larger sample size is required to confirm the factors associated with high fracture risk.
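The cross-tabulation step described above can be sketched in a few lines: group the participants by a risk factor and compare mean FRAX scores between groups. The FRAX™ algorithm itself is proprietary, so the scores below are illustrative placeholders, not study data.

```python
# Sketch of the cross-tabulation step: comparing mean FRAX scores between
# participant groups (e.g. smokers vs. non-smokers). Scores are illustrative.

def mean_score_by_group(records, group_key, score_key):
    """Average a risk score separately for each level of a grouping variable."""
    totals = {}
    for r in records:
        g = r[group_key]
        s, n = totals.get(g, (0.0, 0))
        totals[g] = (s + r[score_key], n + 1)
    return {g: s / n for g, (s, n) in totals.items()}

# Hypothetical records: mof = 10-year major osteoporotic fracture probability (%)
participants = [
    {"smoker": True,  "mof": 4.6},
    {"smoker": True,  "mof": 4.4},
    {"smoker": False, "mof": 2.5},
    {"smoker": False, "mof": 2.3},
]

print(mean_score_by_group(participants, "smoker", "mof"))
# smokers show a higher mean MOF score than non-smokers
```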

Keywords: HIV, PLWH, osteoporotic fractures, hip fractures, 10-year risk of fracture, FRAX

Procedia PDF Downloads 49
21751 Relationship between the Ability of Accruals and Non-Systematic Risk of Shares for Companies Listed in Stock Exchange: Case Study, Tehran

Authors: Lina Najafian, Hamidreza Vakilifard

Abstract:

The present study focused on the relationship between the quality of accruals and non-systematic risk. The independent variables included the ability of accruals, the information content of accruals, and the amount of discretionary accruals, considered as accruals quality measures. The dependent variable was non-systematic risk based on the Fama and French Three Factor model (FFTFM) and the capital asset pricing model (CAPM). The control variables were firm size, financial leverage, stock return, cash flow fluctuations, and book-to-market ratio. The data collection method was based on library research and document mining, including financial statements. Multiple regression analysis was used to analyze the data. The study results showed that there is a significant direct relationship between financial leverage and discretionary accruals and non-systematic risk based on both FFTFM and CAPM. There is also a significant direct relationship between the ability of accruals, the information content of accruals, firm size, and stock return and non-systematic risk based on both models. It was also found that there is no relationship between the book-to-market ratio or cash flow fluctuations and non-systematic risk.

Keywords: accruals quality, non-systematic risk, CAPM, FFTFM

Procedia PDF Downloads 159
21750 Modeling a Closed Loop Supply Chain with Continuous Price Decrease and Dynamic Deterministic Demand

Authors: H. R. Kamali, A. Sadegheih, M. A. Vahdat-Zad, H. Khademi-Zare

Abstract:

In this paper, a single-product, multi-echelon, multi-period closed loop supply chain is surveyed, including a variety of costs, time conditions, and capacities, to plan and determine the values and timing of component procurement, production, distribution, recycling, and disposal, especially for high-tech products whose production cost and sale price decrease over time. For this purpose, the mathematical model of the problem, a mixed-integer linear program, is presented, and it is finally proved that the problem belongs to the category of NP-hard problems.

Keywords: closed loop supply chain, continuous price decrease, NP-hard, planning

Procedia PDF Downloads 364
21749 Comparing Machine Learning Estimation of Fuel Consumption of Heavy-Duty Vehicles

Authors: Victor Bodell, Lukas Ekstrom, Somayeh Aghanavesi

Abstract:

Fuel consumption (FC) is one of the key factors in determining the expense of operating a heavy-duty vehicle. A customer may therefore request an estimate of the FC of a desired vehicle. The modular design of heavy-duty vehicles allows their construction by specifying the building blocks, such as gear box, engine, and chassis type. If the combination of building blocks is unprecedented, it is unfeasible to measure the FC, since this would first require the construction of the vehicle. This paper proposes a machine learning approach to predict FC. This study uses information on around 40,000 vehicles' specifications and operational environmental conditions, such as road slopes and driver profiles. All vehicles have diesel engines and a mileage of more than 20,000 km. The data is used to investigate the accuracy of the machine learning algorithms linear regression (LR), k-nearest neighbor (KNN), and artificial neural networks (ANN) in predicting fuel consumption for heavy-duty vehicles. Performance of the algorithms is evaluated by reporting the prediction error on both simulated data and operational measurements, and the algorithms are compared using nested cross-validation and statistical hypothesis testing. The statistical evaluation procedure finds that ANNs have the lowest prediction error compared to LR and KNN in estimating fuel consumption on both simulated and operational data. The models have a mean relative prediction error of 0.3% on simulated data and 4.2% on operational data.
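The LR-vs-KNN comparison described above can be sketched with NumPy alone. The synthetic data, feature names, and coefficients below are illustrative assumptions, not the study's vehicle fleet or its actual error figures.

```python
# Minimal sketch: linear regression vs. k-nearest neighbors for fuel-consumption
# prediction, compared by mean relative prediction error on a held-out set.
import numpy as np

rng = np.random.default_rng(0)
n = 500
# Hypothetical features: vehicle mass (t), mean road slope (%), engine power (hp)
X = rng.uniform([10, -2, 200], [40, 2, 600], size=(n, 3))
y = 0.8 * X[:, 0] + 3.0 * X[:, 1] + 0.01 * X[:, 2] + rng.normal(0, 0.5, n)

X_train, X_test = X[:400], X[400:]
y_train, y_test = y[:400], y[400:]

# Linear regression via least squares (with an intercept column)
A = np.c_[np.ones(len(X_train)), X_train]
coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)
pred_lr = np.c_[np.ones(len(X_test)), X_test] @ coef

# K-nearest neighbors regression (k=5) on standardized features
mu, sd = X_train.mean(0), X_train.std(0)
Zt, Ze = (X_train - mu) / sd, (X_test - mu) / sd
d = np.linalg.norm(Ze[:, None, :] - Zt[None, :, :], axis=2)
pred_knn = y_train[np.argsort(d, axis=1)[:, :5]].mean(axis=1)

def mean_rel_err(pred, true):
    return float(np.mean(np.abs(pred - true) / np.abs(true)))

print(f"LR  error: {mean_rel_err(pred_lr, y_test):.3%}")
print(f"KNN error: {mean_rel_err(pred_knn, y_test):.3%}")
```

In the study this comparison is wrapped in nested cross-validation with a statistical test (Friedman) rather than a single train/test split.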

Keywords: artificial neural networks, fuel consumption, friedman test, machine learning, statistical hypothesis testing

Procedia PDF Downloads 178
21748 Exploring the Possibility of Islamic Banking as a Viable Alternative to the Conventional Banking Model

Authors: Lavan Vickneson

Abstract:

In today’s modern economy, the conventional banking model is the primary banking system used around the world. A significant problem faced by the conventional banking model is the recurring nature of banking crises. History’s record of the various banking crises, ranging from the Great Depression to the 2008 subprime mortgage crisis, is testament to the fact that banking crises continue to strike despite the preventive measures in place, such as bank’s minimum capital requirements and deposit guarantee schemes. If banking crises continue to occur despite these preventive measures, it necessarily follows that there are inherent flaws with the conventional banking model itself. In light of this, a possible alternative banking model to the conventional banking model is Islamic banking. To date, Islamic banking has been a niche market, predominantly serving Muslim investors. This paper seeks to explore the possibility of Islamic banking being more than just a niche market and playing a greater role in banking sectors around the world, by being a viable alternative to the conventional banking model.

Keywords: bank crises, conventional banking model, Islamic banking, niche market

Procedia PDF Downloads 282
21747 Profitability Analysis of Investment in Oil Palm Value Chain in Osun State, Nigeria

Authors: Moyosooore A. Babalola, Ayodeji S. Ogunleye

Abstract:

The main focus of the study was to determine the profitability of investment in the oil palm value chain of Osun State, Nigeria in 2015. The specific objectives were to describe the socio-economic characteristics of oil palm investors (producers, processors, and marketers), to determine the profitability of the investment to investors in the oil palm value chain, and to determine the factors affecting the profitability of the investment of the oil palm investors in Osun State. A sample of 100 respondents was selected in this cross-sectional survey. A multi-stage sampling procedure was used to select producers and processors, while purposive sampling was used for marketers. The data collected were analyzed using the following analytical tools: descriptive statistics, budgetary analysis, and regression analysis. The gross margin results showed that the producers and processors were more profitable than the marketers in the oil palm value chain, with benefit-cost ratios of 1.93, 1.82, and 1.11, respectively. The multiple regression analysis showed that education and years of experience were significant among marketers and producers, while age and years of experience had a significant influence on the gross margin of processors. Based on these findings, improvement in the level of education of oil palm investors is recommended in order to address the relatively low access to post-primary education among them in Osun State. In addition, it is important that training be made available to oil palm investors; this will improve the quality of their years of experience, ensuring a positive influence on their gross margin. Low access to credit among processors and producers could be corrected by making extension services available to them. Marketers would also greatly benefit from subsidized prices on oil palm products to increase their gross margin, as a large share of their total cost comes from acquiring palm oil.

Keywords: oil palm, profitability analysis, regression analysis, value chain

Procedia PDF Downloads 363
21746 Developing the Morphological Field of Problem Context to Assist Multi-Methodology in Operations Research

Authors: Mahnaz Hosseinzadeh, Mohammad Reza Mehregan

Abstract:

In this paper, we have developed a morphological field to assist multi-methodology (combining methodologies together in whole or in part) in Operations Research (OR) for problem contexts in Iranian organizations. We have attempted to identify dimensions of the problem context according to Iranian organizational problems. Then, a general morphological program is designed that helps the OR practitioner to determine the suitable OR methodology as output for any configuration of conditions in a problem context as input, and to reveal the fields necessary to be improved in OR. Applying such a program would have interesting results for OR practitioners.
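The idea of a morphological program — mapping a configuration of problem-context dimensions to a suggested methodology — can be sketched as a simple lookup. The dimensions, values, and mappings below are hypothetical illustrations, not the field developed in the paper.

```python
# Toy sketch of a general morphological program: a problem-context
# configuration (one value per dimension) maps to a suggested OR approach.
# Dimensions and mappings are invented for illustration only.

FIELD = {
    # (complexity, stakeholder agreement, power imbalance) -> suggested approach
    ("simple",  "agreed",    "low"):  "hard OR (e.g. linear programming)",
    ("complex", "agreed",    "low"):  "hard OR with simulation",
    ("complex", "contested", "low"):  "soft OR (e.g. soft systems methodology)",
    ("complex", "contested", "high"): "emancipatory OR",
}

def suggest_methodology(complexity, agreement, power):
    # Configurations outside the field fall back to combining approaches
    return FIELD.get((complexity, agreement, power),
                     "multi-methodology: combine hard and soft approaches")

print(suggest_methodology("complex", "contested", "low"))
```

A real morphological field would also record cross-consistency constraints between dimension values, not just a flat lookup.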

Keywords: hard, soft and emancipatory operations research, General Morphological Analysis (GMA), multi-methodology, problem context

Procedia PDF Downloads 298
21745 Predictive Modeling of Bridge Conditions Using Random Forest

Authors: Miral Selim, May Haggag, Ibrahim Abotaleb

Abstract:

The aging of transportation infrastructure presents significant challenges, particularly concerning the monitoring and maintenance of bridges. This study investigates the application of Random Forest algorithms for predictive modeling of bridge conditions, utilizing data from the US National Bridge Inventory (NBI). The research is significant as it aims to improve bridge management through data-driven insights that can enhance maintenance strategies and contribute to overall safety. Random Forest is chosen for its robustness, ability to handle complex, non-linear relationships among variables, and its effectiveness in feature importance evaluation. The study begins with comprehensive data collection and cleaning, followed by the identification of key variables influencing bridge condition ratings, including age, construction materials, environmental factors, and maintenance history. Random Forest is utilized to examine the relationships between these variables and the predicted bridge conditions. The dataset is divided into training and testing subsets to evaluate the model's performance. The findings demonstrate that the Random Forest model effectively enhances the understanding of factors affecting bridge conditions. By identifying bridges at greater risk of deterioration, the model facilitates proactive maintenance strategies, which can help avoid costly repairs and minimize service disruptions. Additionally, this research underscores the value of data-driven decision-making, enabling better resource allocation to prioritize maintenance efforts where they are most necessary. In summary, this study highlights the efficiency and applicability of Random Forest in predictive modeling for bridge management. Ultimately, these findings pave the way for more resilient and proactive management of bridge systems, ensuring their longevity and reliability for future use.
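The workflow above — fit a Random Forest on bridge features, predict condition, inspect feature importances — can be sketched as follows, assuming scikit-learn is available. The features and synthetic data are placeholders, not actual NBI fields.

```python
# Illustrative sketch: Random Forest predicting a binary bridge-condition label
# from NBI-style features. Data and the condition rule are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 1000
age = rng.uniform(0, 100, n)          # years since construction
material = rng.integers(0, 3, n)      # encoded construction material
traffic = rng.uniform(0, 1, n)        # normalized average daily traffic
# Hypothetical rule: older, high-traffic bridges tend to rate "poor" (1)
condition = (age / 100 + traffic + rng.normal(0, 0.3, n) > 1.2).astype(int)

X = np.c_[age, material, traffic]
X_tr, X_te, y_tr, y_te = train_test_split(X, condition, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_tr, y_tr)
print("test accuracy:", model.score(X_te, y_te))
# Feature importances indicate which variables drive predicted condition
print(dict(zip(["age", "material", "traffic"], model.feature_importances_)))
```

The train/test split mirrors the study's division of the dataset into training and testing subsets.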

Keywords: data analysis, random forest, predictive modeling, bridge management

Procedia PDF Downloads 22
21744 Development of Orbital TIG Welding Robot System for the Pipe

Authors: Dongho Kim, Sung Choi, Kyowoong Pee, Youngsik Cho, Seungwoo Jeong, Soo-Ho Kim

Abstract:

This study is about an orbital TIG welding robot system that travels on a guide rail installed on the pipe, and welds and tracks the pipe seam using LVS (laser vision sensor) joint profile data. The orbital welding robot system consists of the robot, welder, controller, and LVS. We define the relationship between welding travel speed and wire feed speed, and construct a linear equation from the maximum and minimum amounts of weld metal. Using this linear equation, we can accurately determine the welding travel speed and the wire feed speed corresponding to the weld area captured by the LVS. We applied this orbital TIG welding robot system to stainless steel and duplex pipes at the DSME (Daewoo Shipbuilding and Marine Engineering Co., Ltd.) shipyard, and the radiographic test results were nearly perfect (defect rate: 0.033%).
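The linear equation described above can be sketched as a two-point interpolation: calibrate the wire feed speed at the minimum and maximum weld-metal areas, then interpolate for any groove area the LVS measures. All numeric values below are illustrative, not DSME process parameters.

```python
# A minimal sketch of the linear relation: given calibration points at the
# minimum and maximum weld-metal area, interpolate the wire feed speed for
# any measured groove area. Units and values are hypothetical.

def wire_feed_speed(area, a_min, a_max, feed_min, feed_max):
    """Linear equation through (a_min, feed_min) and (a_max, feed_max)."""
    slope = (feed_max - feed_min) / (a_max - a_min)
    return feed_min + slope * (area - a_min)

# Illustrative calibration: area in mm^2, feed speed in mm/min
print(wire_feed_speed(30.0, a_min=20.0, a_max=40.0,
                      feed_min=800.0, feed_max=1600.0))
# midpoint area -> midpoint feed speed, 1200.0
```

The welding travel speed would follow from an analogous linear relation calibrated on the same two extreme deposit amounts.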

Keywords: adaptive welding, automatic welding, pipe welding, orbital welding, laser vision sensor, LVS, welding D/B

Procedia PDF Downloads 688
21743 A Numerical Hybrid Finite Element Model for Lattice Structures Using 3D/Beam Elements

Authors: Ahmadali Tahmasebimoradi, Chetra Mang, Xavier Lorang

Abstract:

Thanks to the additive manufacturing process, lattice structures are replacing traditional structures in the aeronautical and automobile industries. In order to evaluate the mechanical response of lattice structures, one has to resort to numerical techniques. Ansys is a globally well-known and trusted commercial software that allows us to model lattice structures and analyze their mechanical responses using either solid or beam elements; a script may be used to systematically generate lattice structures of any size. On the one hand, solid elements allow us to correctly model the contact between the substrates (the supports of the lattice structure) and the lattice structure, the local plasticity, and the junctions of the microbeams. However, their computational cost increases rapidly with the size of the lattice structure. On the other hand, although beam elements reduce the computational cost drastically, they do not correctly model the contact between the lattice structure and the substrates, nor the junctions of the microbeams. Also, the notion of local plasticity is no longer valid, and the deformed shape predicted by beam elements does not correspond to that obtained with 3D solid elements. In this work, motivated by the pros and cons of the 3D and beam models, a numerically hybrid model is presented for lattice structures, to reduce the computational cost of the simulations while avoiding the aforementioned drawbacks of beam elements. The approach consists of using solid elements for the junctions and beam elements for the microbeams connecting the corresponding junctions to each other. When the global response of the structure is linear, the results from the hybrid models are in good agreement with those from the 3D models for body-centered cubic lattice structures with z-struts (BCCZ) and without z-struts (BCC). However, the hybrid models have difficulty converging when the effects of large deformation and local plasticity are considerable in the BCCZ structures. Furthermore, the effect of the junction size on the hybrid models' results is investigated. For BCCZ lattice structures, the results are not affected by the junction size. This also holds for BCC lattice structures as long as the ratio of the junction size to the diameter of the microbeams is greater than 2. The hybrid model can also take geometric defects into account. As a demonstration, the point clouds of two lattice structures are parametrized in a platform called LATANA (LATtice ANAlysis) developed by IRT-SystemX. In this process, for each microbeam of the lattice structures, an ellipse is fitted to capture the effect of shape variation and roughness. Each ellipse is represented by three parameters: semi-major axis, semi-minor axis, and angle of rotation. Given the parameters of the ellipses, the lattice structures are constructed in SpaceClaim (Ansys) using the geometrical hybrid approach. The results show a negligible discrepancy between the hybrid and 3D models, while the computational cost of the hybrid model is lower than that of the 3D model.

Keywords: additive manufacturing, Ansys, geometric defects, hybrid finite element model, lattice structure

Procedia PDF Downloads 112
21742 Optimizing Network Latency with Fast Path Assignment for Incoming Flows

Authors: Qing Lyu, Hang Zhu

Abstract:

Various flows in the network need to pass through different types of middlebox. Improper placement of network middleboxes and path assignment for flows can greatly increase network latency and decrease network performance. Minimizing the total end-to-end latency of all the flows requires assigning paths to the incoming flows. In this paper, the flow path assignment problem with regard to the placement of various kinds of middlebox is studied. The flow path assignment problem is formulated as a linear programming problem, which is very time-consuming to solve. A naive greedy algorithm is also studied, which is very fast but incurs much more latency than the linear programming algorithm. Finally, the paper presents a heuristic algorithm named FPA, which takes bottleneck link information and estimated bandwidth occupancy into consideration and achieves near-optimal latency in much less time. Evaluation results validate the effectiveness of the proposed algorithm.
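A naive greedy assignment of the kind mentioned above can be sketched in a few lines: each incoming flow takes whichever candidate path currently carries the least load, and the chosen links are then updated. The topology, flows, and load model below are hypothetical, and this is the fast-but-suboptimal baseline, not the FPA heuristic itself.

```python
# Toy sketch of greedy flow path assignment: flows arrive one at a time and
# each picks the candidate path with the least current total link load.

def greedy_assign(flows, candidate_paths, link_load=None):
    """flows: list of (flow_id, demand).
    candidate_paths: flow_id -> list of paths, each path a tuple of link names."""
    load = dict(link_load or {})
    assignment = {}
    for flow_id, demand in flows:
        # Pick the path whose links carry the least traffic right now
        best = min(candidate_paths[flow_id],
                   key=lambda p: sum(load.get(link, 0) for link in p))
        assignment[flow_id] = best
        for link in best:
            load[link] = load.get(link, 0) + demand
    return assignment, load

flows = [("f1", 5), ("f2", 5)]
paths = {"f1": [("a", "b"), ("a", "c")],
         "f2": [("a", "b"), ("d", "b")]}
assignment, load = greedy_assign(flows, paths)
print(assignment)  # f2 avoids the path f1 already loaded
```

Because each decision is made myopically, the total latency can be far from the linear-programming optimum, which motivates the bottleneck-aware FPA heuristic.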

Keywords: flow path, latency, middlebox, network

Procedia PDF Downloads 207
21741 Prediction of Compressive Strength Using Artificial Neural Network

Authors: Vijay Pal Singh, Yogesh Chandra Kotiyal

Abstract:

Structures are a combination of various load-carrying members which safely transfer the loads from the superstructure to the foundation. At the design stage, the loading of the structure is defined and appropriate material choices are made based upon their properties, mainly related to strength. The strength of materials decreases over time because of many factors, such as environmental exposure and deformation caused by unpredictable external loads. Hence, various techniques are used to predict the strength of materials in structures. Among these, non-destructive testing (NDT) techniques can predict the strength without damaging the structure. In the present study, the compressive strength of concrete has been predicted using an Artificial Neural Network (ANN). The predicted strength was compared with the experimentally obtained compressive strength of concrete, and equations were developed for different models. A good correlation was obtained between the strength predicted by these models and the experimental values. Further, a correlation was developed using two NDT techniques for prediction of strength by regression analysis. It was found that the percentage error was reduced when the two techniques were combined rather than used singly.
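The combined-NDT regression idea can be sketched by fitting strength against rebound number alone, ultrasonic pulse velocity (UPV) alone, and both together, then comparing mean percentage error. The synthetic data below mimic the general trend (strength rises with both readings) but are not the study's experimental values, and a plain least-squares fit stands in for the ANN.

```python
# Sketch: single-predictor vs. combined-predictor regression on NDT readings,
# compared by mean percentage error. Data are synthetic illustrations.
import numpy as np

rng = np.random.default_rng(1)
n = 200
rebound = rng.uniform(20, 50, n)     # rebound hammer number
upv = rng.uniform(3.5, 5.0, n)       # ultrasonic pulse velocity, km/s
strength = 0.9 * rebound + 8.0 * upv + rng.normal(0, 2.0, n)  # MPa

def fit_and_error(X, y):
    """Least-squares fit with intercept; return mean percentage error."""
    A = np.c_[np.ones(len(X)), X]
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    pred = A @ coef
    return float(np.mean(np.abs(pred - y) / y)) * 100

err_rebound = fit_and_error(rebound[:, None], strength)
err_upv = fit_and_error(upv[:, None], strength)
err_both = fit_and_error(np.c_[rebound, upv], strength)
print(err_rebound, err_upv, err_both)
# the combined model's percentage error is lower than either single technique's
```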

Keywords: rebound, ultra-sonic pulse, penetration, ANN, NDT, regression

Procedia PDF Downloads 428