Search results for: random number
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 11612

10922 Tomato-Weed Classification by RetinaNet One-Step Neural Network

Authors: Dionisio Andujar, Juan López-Correa, Hugo Moreno, Angela Ri

Abstract:

The increased number of weeds in tomato crops greatly lowers yields. Weed identification with the aid of machine learning is important for carrying out site-specific control. The latest advances in computer vision are a powerful tool for facing this problem. The analysis of RGB (Red, Green, Blue) images through Artificial Neural Networks has developed rapidly in the past few years, providing new methods for weed classification. The development of algorithms for crop and weed species classification aims at a real-time classification system using object detection algorithms based on Convolutional Neural Networks. The study site was located in commercial corn fields. The classification system has been tested: the procedure can detect and classify weed seedlings in tomato fields. The input to the neural network was a set of 10,000 RGB images with a natural infestation of Cyperus rotundus L., Echinochloa crus-galli L., Setaria italica L., Portulaca oleracea L., and Solanum nigrum L. The validation process was done with a random selection of RGB images containing the aforementioned species. The mean average precision (mAP) was established as the metric for object detection. The results showed agreements higher than 95%. The system will provide the input for an online spraying system. Thus, this work plays an important role in Site-Specific Weed Management by reducing herbicide use in a single step.

Keywords: deep learning, object detection, cnn, tomato, weeds

Procedia PDF Downloads 90
10921 Comparative Study and Parallel Implementation of Stochastic Models for Pricing of European Options Portfolios using Monte Carlo Methods

Authors: Vinayak Bassi, Rajpreet Singh

Abstract:

Over the years, with the emergence of sophisticated computers and algorithms, finance has been quantified using computational prowess. Asset valuation has been one of the key components of quantitative finance; in fact, it has become one of the embryonic steps in determining the risk related to a portfolio, the main goal of quantitative finance. This study draws a comparison between the valuation output generated by two stochastic dynamic models, namely the Black-Scholes model and Dupire's bi-dimensional model. Both models are formulated to compute the valuation function for a portfolio of European options using Monte Carlo simulation methods. Although Monte Carlo algorithms have a slower convergence rate than calculus-based techniques (like FDM), they work quite effectively over high-dimensional dynamic models. A fidelity gap is analyzed between the static (historical) and stochastic inputs for a sample portfolio of underlying assets. In order to enhance the performance efficiency of the model, the study emphasizes the use of variance reduction methods and customized random number generators to implement parallelization. An attempt has been made to further implement Dupire's model on a GPU to achieve higher computational performance. Furthermore, ideas are discussed around performance enhancement and bottleneck identification related to the implementation of option-pricing models on GPUs.
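As a hedged illustration of the kind of Monte Carlo valuation the study describes (a minimal single-option sketch, not the authors' portfolio implementation; all parameter values below are invented), the following prices one European call under Black-Scholes dynamics, with antithetic variates as the variance-reduction step:

```python
import numpy as np

def mc_european_call(s0, k, r, sigma, t, n_paths=200_000, seed=0):
    """Monte Carlo price of a European call under Black-Scholes dynamics,
    using antithetic variates as a simple variance-reduction method."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths // 2)
    z = np.concatenate([z, -z])  # each draw is reused with flipped sign
    # terminal asset price under the risk-neutral measure
    st = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)
    payoff = np.maximum(st - k, 0.0)
    return float(np.exp(-r * t) * payoff.mean())
```

For s0 = k = 100, r = 5%, sigma = 20%, t = 1, the estimate lands near the analytic Black-Scholes value of about 10.45. The loop also parallelizes naturally, since paths are independent given per-thread random number streams.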

Keywords: monte carlo, stochastic models, computational finance, parallel programming, scientific computing

Procedia PDF Downloads 145
10920 Customer Churn Prediction by Using Four Machine Learning Algorithms Integrating Features Selection and Normalization in the Telecom Sector

Authors: Alanoud Moraya Aldalan, Abdulaziz Almaleh

Abstract:

A crucial component of maintaining a customer-oriented business, as in the telecom industry, is understanding the reasons and factors that lead to customer churn. Competition between telecom companies has greatly increased in recent years. It has become more important to understand customers' needs in this strong market, especially for customers who are looking to change their service providers, so churn prediction is now a mandatory requirement for retaining those customers, and machine learning can be utilized to accomplish this. Churn prediction has become a very important machine learning classification topic in the telecommunications industry. Understanding the factors of customer churn and how customers behave is very important for building an effective churn prediction model. This paper aims to predict churn and identify the factors behind customers' churn based on their past service usage history. To this end, the study makes use of feature selection, normalization, and feature engineering. Then, this study compared the performance of four different machine learning algorithms on the Orange dataset: Logistic Regression, Random Forest, Decision Tree, and Gradient Boosting. Performance was evaluated using the F1 score and ROC-AUC. Compared with existing models, this study produced better results: Gradient Boosting with the feature selection technique performed best, achieving a 99% F1-score and 99% AUC, and all other experiments achieved good results as well.
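The two evaluation metrics named above can be computed directly; a small, dataset-agnostic sketch (not tied to the Orange data or the authors' pipeline):

```python
import numpy as np

def f1_score(y_true, y_pred):
    """F1 score: harmonic mean of precision and recall on hard labels."""
    tp = np.sum((y_pred == 1) & (y_true == 1))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

def roc_auc(y_true, scores):
    """ROC-AUC: probability that a random churner is scored above a
    random non-churner, counting ties as half."""
    pos, neg = scores[y_true == 1], scores[y_true == 0]
    wins = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return wins + 0.5 * ties
```

F1 operates on thresholded labels while ROC-AUC operates on the raw scores, which is why both are worth reporting together.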

Keywords: machine learning, gradient boosting, logistic regression, churn, random forest, decision tree, ROC, AUC, F1-score

Procedia PDF Downloads 122
10919 Numerical Approach to a Mathematical Modeling of Bioconvection Due to Gyrotactic Micro-Organisms over a Nonlinear Inclined Stretching Sheet

Authors: Madhu Aneja, Sapna Sharma

Abstract:

The water-based bioconvection of a nanofluid containing motile gyrotactic micro-organisms over a nonlinear inclined stretching sheet has been investigated. The governing nonlinear boundary layer equations of the model are reduced to a system of ordinary differential equations via the Oberbeck-Boussinesq approximation and similarity transformations. Further, the modified set of equations with the associated boundary conditions is solved using the Finite Element Method. The impact of various pertinent parameters on the velocity, temperature, nanoparticle concentration, and density of motile micro-organisms profiles is obtained and analyzed in detail. The results show that with an increase in the angle of inclination δ, velocity decreases while temperature, nanoparticle concentration, and density of motile micro-organisms increase. Additionally, the skin friction coefficient, Nusselt number, Sherwood number, and density number are computed for various thermophysical parameters. It is noticed that increasing the Brownian motion and thermophoresis parameters leads to an increase in the temperature of the fluid, which results in a reduction in the Nusselt number. On the contrary, the Sherwood number rises with an increase in the Brownian motion and thermophoresis parameters. The findings have been validated by comparing the results of special cases with existing studies.
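The full model couples momentum, energy, nanoparticle, and micro-organism transport, but the similarity-reduction-then-boundary-value-solve workflow can be illustrated on the classical Blasius momentum equation f''' + 0.5 f f'' = 0. This is a hedged stand-in: a SciPy collocation solver is used here rather than the paper's Finite Element Method.

```python
import numpy as np
from scipy.integrate import solve_bvp

# Blasius boundary-layer equation: f''' + 0.5*f*f'' = 0,
# f(0) = f'(0) = 0, f'(inf) = 1, with the far field truncated at eta = 10.
def rhs(eta, y):                       # y = [f, f', f'']
    return np.vstack([y[1], y[2], -0.5 * y[0] * y[2]])

def bc(ya, yb):
    return np.array([ya[0], ya[1], yb[1] - 1.0])

eta = np.linspace(0.0, 10.0, 100)
y0 = np.empty((3, eta.size))
y0[0] = eta - 1.0 + np.exp(-eta)       # smooth guess satisfying the BCs
y0[1] = 1.0 - np.exp(-eta)
y0[2] = np.exp(-eta)
sol = solve_bvp(rhs, bc, eta, y0)
wall_shear = sol.y[2, 0]               # f''(0)
```

The recovered wall shear f''(0) ≈ 0.332 matches the classical Blasius value, a common sanity check before moving to the coupled system.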

Keywords: bioconvection, finite element method, gyrotactic micro-organisms, inclined stretching sheet, nanofluid

Procedia PDF Downloads 178
10918 Performance Comparison of Different Regression Methods for a Polymerization Process with Adaptive Sampling

Authors: Florin Leon, Silvia Curteanu

Abstract:

Developing complete mechanistic models for polymerization reactors is not easy, because complex reactions occur simultaneously, a large number of kinetic parameters are involved, and sometimes the chemical and physical phenomena for mixtures involving polymers are poorly understood. To overcome these difficulties, empirical models based on sampled data can be used instead, namely regression methods typical of the machine learning field. They have the ability to learn the trends of a process without any knowledge of its particular physical and chemical laws. Therefore, they are useful for modeling complex processes, such as the free radical polymerization of methyl methacrylate achieved in a batch bulk process. The goal is to generate accurate predictions of monomer conversion, number-average molecular weight and weight-average molecular weight. This process is associated with non-linear gel and glass effects. For this purpose, an adaptive sampling technique is presented, which can select more samples around the regions where the values have a higher variation. Several machine learning methods are used for the modeling and their performance is compared: support vector machines, k-nearest neighbor, and random forest, as well as an original algorithm, large margin nearest neighbor regression. The suggested method provides very good results compared to the other well-known regression algorithms.
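The adaptive sampling idea, placing more samples where the response varies most (e.g., around the sharp gel-effect transition), can be sketched as follows; this is an illustrative bisection-style variant, not the authors' exact technique:

```python
import numpy as np

def adaptive_sample(f, a, b, n_init=10, n_total=40):
    """Iteratively add sample points in the interval where the sampled
    response f changes the most, starting from a uniform grid."""
    x = list(np.linspace(a, b, n_init))
    for _ in range(n_total - n_init):
        xs = sorted(x)
        ys = [f(v) for v in xs]
        # the interval with the largest jump in f gets a midpoint sample
        gaps = [abs(ys[i + 1] - ys[i]) for i in range(len(xs) - 1)]
        k = int(np.argmax(gaps))
        x.append(0.5 * (xs[k] + xs[k + 1]))
    return np.array(sorted(x))
```

Applied to a steep sigmoidal response, the routine concentrates its budget around the transition while leaving flat regions sparsely sampled, which is exactly the behavior wanted for the gel and glass effects.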

Keywords: batch bulk methyl methacrylate polymerization, adaptive sampling, machine learning, large margin nearest neighbor regression

Procedia PDF Downloads 293
10917 Upper Bounds on the Paired Domination Number of Cubic Graphs

Authors: Bin Sheng, Changhong Lu

Abstract:

Let G be a simple undirected graph with no isolated vertex. A paired dominating set of G is a dominating set which induces a subgraph that has a perfect matching. The paired domination number of G, denoted by γₚᵣ(G), is the size of its smallest paired dominating set. Goddard and Henning conjectured that γₚᵣ(G) ≤ 4n/7 holds for every graph G of order n with δ(G) ≥ 3, except for the Petersen graph. In this paper, we prove this conjecture for cubic graphs.
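The Petersen graph's exceptional role is easy to check computationally: a brute-force search (a small illustrative script, not part of the paper's proof) shows its paired domination number is 6, which exceeds 4n/7 = 40/7 ≈ 5.71:

```python
from itertools import combinations

# Petersen graph: outer 5-cycle 0..4, inner pentagram 5..9, spokes i -- i+5
edges = {(i, (i + 1) % 5) for i in range(5)}
edges |= {(i, i + 5) for i in range(5)}
edges |= {(5 + i, 5 + (i + 2) % 5) for i in range(5)}
adj = {v: set() for v in range(10)}
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

def dominating(s):
    """Every vertex is in s or has a neighbor in s."""
    return all(v in s or adj[v] & s for v in range(10))

def has_perfect_matching(s):
    """Does the induced subgraph on s admit a perfect matching?"""
    if not s:
        return True
    u = s[0]
    return any(v in adj[u] and has_perfect_matching([w for w in s[1:] if w != v])
               for v in s[1:])

def paired_domination_number(n=10):
    for k in range(2, n + 1, 2):   # paired dominating sets have even size
        for s in combinations(range(n), k):
            if dominating(set(s)) and has_perfect_matching(list(s)):
                return k
```

For n = 10 vertices the search space is tiny, so exhaustive enumeration is instant.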

Keywords: paired dominating set, upper bound, cubic graphs, weight function

Procedia PDF Downloads 227
10916 Model-Driven and Data-Driven Approaches for Crop Yield Prediction: Analysis and Comparison

Authors: Xiangtuo Chen, Paul-Henry Cournéde

Abstract:

Crop yield prediction is a paramount issue in agriculture. The main idea of this paper is to find an efficient way to predict the yield of corn based on meteorological records. The prediction models used in this paper can be classified into model-driven approaches and data-driven approaches, according to their different modeling methodologies. The model-driven approaches are based on crop mechanistic modeling; they describe crop growth in interaction with the environment as dynamical systems. But the calibration of such a dynamical system is difficult, because it turns out to be a multidimensional non-convex optimization problem. An original contribution of this paper is to propose a statistical methodology, Multi-Scenarios Parameters Estimation (MSPE), for the parametrization of potentially complex mechanistic models from a new type of dataset (climatic data, final yield in many situations). It is tested with CORNFLO, a crop model for maize growth. On the other hand, the data-driven approach for yield prediction is free of the complex biophysical process, but it has some strict requirements about the dataset. A second contribution of the paper is the comparison of these model-driven methods with classical data-driven methods. For this purpose, we consider two classes of regression methods: methods derived from linear regression (Ridge and Lasso Regression, Principal Components Regression or Partial Least Squares Regression) and machine learning methods (Random Forest, k-Nearest Neighbor, Artificial Neural Network and SVM regression). The dataset consists of 720 records of corn yield at county scale provided by the United States Department of Agriculture (USDA) and the associated climatic data. A 5-fold cross-validation process and two accuracy metrics, root mean square error of prediction (RMSEP) and mean absolute error of prediction (MAEP), were used to evaluate the crop prediction capacity.
The results show that among the data-driven approaches, Random Forest is the most robust and generally achieves the best prediction error (MAEP 4.27%). It also outperforms our model-driven approach (MAEP 6.11%). However, the method for calibrating the mechanistic model from easily accessible datasets offers several side perspectives: the mechanistic model can potentially help to underline the stresses suffered by the crop or to identify the biological parameters of interest for breeding purposes. For this reason, an interesting perspective is to combine these two types of approaches.
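The evaluation protocol, 5-fold cross-validation reporting RMSEP and MAEP, can be sketched generically; here a closed-form ridge regression stands in for the paper's richer model list, on synthetic data rather than the USDA records:

```python
import numpy as np

def kfold_ridge(X, y, k=5, lam=0.01, seed=0):
    """k-fold cross-validation of ridge regression; returns the two
    prediction-error metrics RMSEP and MAEP over all held-out folds."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), k)
    errors = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        # closed-form ridge solution on the training fold
        A = X[train].T @ X[train] + lam * np.eye(X.shape[1])
        w = np.linalg.solve(A, X[train].T @ y[train])
        errors.append(y[test] - X[test] @ w)
    e = np.concatenate(errors)
    rmsep = float(np.sqrt(np.mean(e**2)))
    maep = float(np.mean(np.abs(e)))
    return rmsep, maep
```

By Jensen's inequality MAEP never exceeds RMSEP, a quick consistency check on any reported pair of these metrics.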

Keywords: crop yield prediction, crop model, sensitivity analysis, parameter estimation, particle swarm optimization, random forest

Procedia PDF Downloads 219
10915 A Petri Net Model to Obtain the Throughput of Unreliable Production Lines in the Buffer Allocation Problem

Authors: Joselito Medina-Marin, Alexandr Karelin, Ana Tarasenko, Juan Carlos Seck-Tuoh-Mora, Norberto Hernandez-Romero, Eva Selene Hernandez-Gress

Abstract:

A production line designer faces several challenges in manufacturing system design. One of them is the assignment of buffer slots between the machines of the production line in order to maximize the throughput of the whole line, which is known as the Buffer Allocation Problem (BAP). The BAP is a combinatorial problem that depends on the number of machines and the total number of slots to be distributed along the production line. In this paper, we propose a Petri Net (PN) model to obtain the throughput of unreliable production lines, based on PN mathematical tools and the decomposition method. The results obtained by this methodology are similar to those presented in previous works, and the number of machines is not a hard restriction.
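While the paper's throughput computation rests on Petri net tools and the decomposition method, the quantity being maximized can be illustrated with a toy discrete-time simulation of a two-machine line with one intermediate buffer (the machine success probabilities here are invented):

```python
import random

def line_throughput(p1, p2, buffer_size, steps=100_000, seed=42):
    """Discrete-time simulation of a two-machine line with one buffer.
    Machine i completes a part in a step with probability p_i; machine 1
    blocks when the buffer is full, machine 2 starves when it is empty."""
    rng = random.Random(seed)
    buf, finished = 0, 0
    for _ in range(steps):
        # both decisions are made against the buffer state at step start
        make1 = buf < buffer_size and rng.random() < p1
        make2 = buf > 0 and rng.random() < p2
        if make1:
            buf += 1
        if make2:
            buf -= 1
            finished += 1
    return finished / steps
```

Enlarging the buffer raises throughput with diminishing returns; the BAP is precisely the question of how to split a fixed slot budget across a longer line to exploit that trade-off.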

Keywords: buffer allocation problem, Petri Nets, throughput, production lines

Procedia PDF Downloads 290
10914 Supervised Machine Learning Approach for Studying the Effect of Different Joint Sets on Stability of Mine Pit Slopes Under the Presence of Different External Factors

Authors: Sudhir Kumar Singh, Debashish Chakravarty

Abstract:

Slope stability analysis is an important aspect of geotechnical engineering. It is also important from a safety and economic point of view, as any slope failure leads to loss of valuable lives and damage to property worth millions. This paper aims at mitigating the risk of slope failure by studying the effect of different joint sets on the stability of mine pit slopes under the influence of various external factors, namely degree of saturation, rainfall intensity, and seismic coefficients. A supervised machine learning approach has been utilized for making accurate and reliable predictions regarding the stability of slopes based on the value of the Factor of Safety. Numerous cases have been studied for analyzing the stability of slopes using the popular Finite Element Method, and the data thus obtained have been used as training data for the supervised machine learning models. The input data have been trained on different supervised machine learning models, namely Random Forest, Decision Tree, Support Vector Machine, and XGBoost. Distinct test data not present in the training data have been used for measuring the performance and accuracy of the different models. Although all models performed well on the test dataset, Random Forest stands out from the others due to its high accuracy of greater than 95%, providing a valuable tool at our disposal that is neither computationally expensive nor time-consuming and is in good accordance with the numerical analysis results.

Keywords: finite element method, geotechnical engineering, machine learning, slope stability

Procedia PDF Downloads 90
10913 Combining a Continuum of Hidden Regimes and a Heteroskedastic Three-Factor Model in Option Pricing

Authors: Rachid Belhachemi, Pierre Rostan, Alexandra Rostan

Abstract:

This paper develops a discrete-time option pricing model for index options. The model consists of two key ingredients. First, daily stock return innovations are driven by a continuous hidden threshold mixed skew-normal (HTSN) distribution, which generates the conditional non-normality needed to fit daily index returns. The most important feature of the HTSN is the inclusion of a latent state variable with a continuum of states, unlike traditional mixture distributions where the state variable is discrete with a small number of states. The HTSN distribution belongs to the class of univariate probability distributions whose parameters capture the dependence between the variable of interest and the continuous latent state variable (the regime). The distribution has an interpretation in terms of a mixture distribution with time-varying mixing probabilities. It has been shown empirically that this distribution outperforms its main competitor, the mixed normal (MN) distribution, in capturing the stylized facts known for stock returns, namely volatility clustering, leverage effect, skewness, kurtosis and regime dependence. Second, heteroscedasticity in the model is captured by a three-exogenous-factor GARCH model (GARCHX), whose factors are extracted from a matrix of world indices by principal component analysis (PCA); the paper presents an application to option pricing. The empirically determined factors are uncorrelated and represent truly different common components driving the returns. Both the factors and the eight parameters inherent to the HTSN distribution aim at capturing the impact of the state of the economy on price levels, since the distribution parameters have economic interpretations in terms of conditional volatilities and correlations of the returns with the hidden continuous state.
The PCA identifies statistically independent factors affecting the random evolution of a given pool of assets (in our paper, a pool of international stock indices) and sorts them by order of relative importance. The PCA computes a historical cross-asset covariance matrix and identifies principal components representing independent factors. In our paper, the factors are used to calibrate the HTSN-GARCHX model and are ultimately responsible for the nature of the distribution of the random variables being generated. We benchmark our model against the MN-GARCHX model, following the same PCA methodology, and against the standard Black-Scholes model. We show that our model outperforms the MN-GARCHX benchmark in terms of RMSE in dollar losses for put and call options, which in turn outperforms the analytical Black-Scholes model by capturing the stylized facts known for index returns, namely volatility clustering, leverage effect, skewness, kurtosis and regime dependence.
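The PCA step described above can be sketched with plain linear algebra; a minimal version (synthetic returns, not the paper's index data) that extracts uncorrelated factor time series sorted by explained variance:

```python
import numpy as np

def pca_factors(returns, k):
    """Extract the k leading principal-component factors from a (T x N)
    matrix of index returns via SVD of the centered data matrix."""
    X = returns - returns.mean(axis=0)
    # SVD of X is equivalent to eigendecomposing the covariance matrix
    _, s, vt = np.linalg.svd(X, full_matrices=False)
    explained = s**2 / np.sum(s**2)     # variance share per component
    factors = X @ vt[:k].T              # uncorrelated factor time series
    return factors, vt[:k], explained[:k]
```

Because the factor scores are projections onto orthogonal singular vectors of the centered matrix, their sample correlations vanish by construction, matching the statement that the extracted factors are uncorrelated.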

Keywords: continuous hidden threshold, factor models, GARCHX models, option pricing, risk-premium

Procedia PDF Downloads 291
10912 Prediction of Live Birth in a Matched Cohort of Elective Single Embryo Transfers

Authors: Mohsen Bahrami, Banafsheh Nikmehr, Yueqiang Song, Anuradha Koduru, Ayse K. Vuruskan, Hongkun Lu, Tamer M. Yalcinkaya

Abstract:

In recent years, we have witnessed an explosion of studies aimed at using a combination of artificial intelligence (AI) and time-lapse imaging data on embryos to improve IVF outcomes. However, despite promising results, no study has used a matched cohort of transferred embryos which differ only in pregnancy outcome, i.e., embryos from a single clinic which are similar in parameters such as morphokinetic condition, patient age, and overall clinic and lab performance. Here, we used time-lapse data on embryos with known pregnancy outcomes to see if the rich spatiotemporal information embedded in these data would allow prediction of the pregnancy outcome regardless of such critical parameters. Methodology: We did a retrospective analysis of time-lapse data from our IVF clinic, which uses the Embryoscope 100% of the time for embryo culture to the blastocyst stage, with known clinical outcomes, including live birth vs. nonpregnant (embryos with spontaneous abortion outcomes were excluded). We used time-lapse data from 200 elective single transfer embryos randomly selected from January 2019 to June 2021. Our sample included 100 embryos in each group, with no significant difference in patient age (P=0.9550) or morphokinetic scores (P=0.4032). Data from all patients were combined to make a 4th-order tensor, and feature extraction was subsequently carried out by a tensor decomposition methodology. The features were then used in a machine learning classifier to classify the two groups. Major findings: The performance of the model was evaluated using 100 random subsampling cross-validations (train (80%) - test (20%)). The prediction accuracy, averaged across 100 permutations, exceeded 80%. We also did a random grouping analysis, in which labels (live birth, nonpregnant) were randomly assigned to embryos, which yielded 50% accuracy.
Conclusion: The high accuracy in the main analysis and the low accuracy in the random grouping analysis suggest a consistent spatiotemporal pattern associated with pregnancy outcomes, regardless of patient age and embryo morphokinetic condition, and beyond already known parameters such as early cleavage or early blastulation. Despite the small sample size, this ongoing analysis is the first to show the potential of AI methods in capturing the complex morphokinetic changes embedded in embryo time-lapse data which contribute to successful pregnancy outcomes, regardless of already known parameters. Results on a larger sample size, with complementary analysis on the prediction of other key outcomes, such as euploidy and aneuploidy of embryos, will be presented at the meeting.
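The abstract does not name the specific tensor decomposition used, so as a hedged sketch here is the simplest variant of the idea: unfold the 4th-order tensor along the embryo mode and project onto the leading singular vectors, yielding one feature vector per embryo for the downstream classifier.

```python
import numpy as np

def embryo_features(tensor, rank):
    """Project each embryo onto the leading right singular vectors of the
    mode-1 (embryo) unfolding of a 4th-order data tensor. A minimal
    SVD-based stand-in for richer tensor decompositions (Tucker, CP)."""
    n_embryos = tensor.shape[0]
    unfolded = tensor.reshape(n_embryos, -1)     # mode-1 unfolding
    unfolded = unfolded - unfolded.mean(axis=0)  # center across embryos
    _, _, vt = np.linalg.svd(unfolded, full_matrices=False)
    return unfolded @ vt[:rank].T                # shape (n_embryos, rank)
```

Proper tensor decompositions preserve the multi-way structure that this flattening discards, which is one reason to prefer them on real time-lapse data.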

Keywords: IVF, embryo, machine learning, time-lapse imaging data

Procedia PDF Downloads 84
10911 Impact of Working Capital Management Strategies on Firm's Value and Profitability

Authors: Jonghae Park, Daesung Kim

Abstract:

The impact of aggressive and conservative working capital strategies on the value and profitability of firms has been evaluated by applying panel data regression analysis. The control variables used in the regression models are the natural log of firm size, sales growth, and debt. We collected a panel of 13,988 companies listed on the Korea stock market covering the period 2000-2016. The major findings of this study are as follows: 1) We find a significant negative correlation between firm profitability and the number of days inventory (INV) and days accounts payable (AP); the firm's profitability can thus be improved by reducing the number of days of inventory and days accounts payable. 2) We also find a significant positive correlation between firm profitability and the number of days accounts receivable (AR) and cash ratios (CR); in other words, cash is associated with high corporate profitability. 3) The Tobin's Q analysis showed that only the number of days accounts receivable (AR) and cash ratios (CR) had a significant relationship. In conclusion, companies can increase profitability by reducing INV and increasing AP, but INV and AP did not affect corporate value. In particular, it is necessary to increase CA and decrease AR in order to increase the firm's profitability and value.

Keywords: working capital, working capital management, firm value, profitability

Procedia PDF Downloads 169
10910 Theoretical and Experimental Investigation of Heat Pipes for Solar Collector Applications

Authors: Alireza Ghadiri, Soheila Memarzadeh, Arash Ghadiri

Abstract:

Heat pipes are efficient heat transfer devices for solar hot water heating systems, and the effective downward transfer of solar energy in an integrated heat pipe system provides increased design and implementation options. There is a lack of literature about flat plate wick-assisted heat pipe solar collectors, especially those with finned water-cooled condensers, for solar energy applications. In this paper, the consequence of incorporating fin arrays into the condenser region of a screen mesh heat pipe solar collector is investigated. An experimental model and a transient theoretical model are used to compare the performances of the solar heating system at different periods of the year. Good agreement is shown between the model and the experiment. Two working fluids are investigated (water and methanol), and results reveal that water slightly outperforms methanol, with a collector instantaneous efficiency of nearly 60%. That modest improvement is achieved by adding fins to the condenser region of the heat pipes. Results show that the collector efficiency increases as the number of fins increases (up to a certain number) and reveal that the mesh number is an important factor affecting the overall collector efficiency. An optimal heat pipe mesh number of 100 mesh/in. with two layers appears favorable in such collectors for their design and operating conditions.
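The instantaneous collector efficiency quoted above is the ratio of useful heat gain in the water to the solar radiation incident on the collector; a small helper (the numbers in the note below are illustrative, not the paper's measurements):

```python
def collector_efficiency(m_dot, cp, t_out, t_in, irradiance, area):
    """Instantaneous collector efficiency: useful heat carried away by
    the water (m_dot * cp * dT, in W) divided by the solar radiation
    incident on the collector aperture (irradiance * area, in W)."""
    q_useful = m_dot * cp * (t_out - t_in)
    return q_useful / (irradiance * area)
```

For example, 0.02 kg/s of water heated by 10 K (cp = 4186 J/kg·K) under 1000 W/m² on a 1.4 m² collector gives an efficiency of about 0.60, i.e., the order of the value reported above.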

Keywords: heat pipe, solar collector, capillary limit, mesh number

Procedia PDF Downloads 425
10909 Cows Milk Quality on Different Sized Dairy Farms

Authors: Ramutė Miseikienė, Saulius Tusas

Abstract:

Somatic cell count and bacteria count are the main indicators of cow milk quality. The aim of this study was to analyze and compare parameters of milk quality in different-sized cow herds. Milk quality on ten dairy farms was analyzed over a one-year period. The dairy farms were divided into five groups according to the number of cows on the farm (under 50 cows, 51-100 cows, 101-200 cows, 201-400 cows and more than 400 cows). The averages of somatic cell count and bacteria count in milk and the milk freezing temperature were analyzed. These parameters of milk quality were also compared between the outdoor (May to September) and indoor (October to April) periods. The largest somatic cell count (SCC) was established on the smallest farms, i.e., farms with under 50 cows and with 51-100 cows (264±9.19 and 300±10.24 thousand/ml, respectively). No reliable link was established between somatic cell count in milk and the smallest and largest dairy farms versus the farms with 101-200 and 201-400 cows (P > 0.05). Bacteria count showed a slight tendency to decrease as the number of cows on the farm increased. The highest bacteria count was determined on the farms with 51-100 cows, and the lowest bacteria count was in milk from farms keeping 201-400 and more than 401 cows. With an increasing number of cows, the maximal freezing temperature of milk decreases (a significant negative trend), i.e., the indicator improves. It should be noted that on all farms the milk freezing point never exceeded the requirement (-0.515 °C). The highest difference between SCC in milk during the indoor and outdoor periods was established on farms with 201-400 cows (218.49 thousand/ml and 268.84 thousand/ml, respectively). The somatic cell count was significantly higher (P < 0.05) during the outdoor period on large farms (201-400 and more cows). There was no significant difference between bacteria counts in milk during the outdoor and indoor periods (P > 0.05).

Keywords: bacteria, cow, farm size, somatic cell count

Procedia PDF Downloads 252
10908 Fish Scales as a Nonlethal Screening Tool for Assessing the Effects of Surface Water Contaminants in Cyprinus Carpio

Authors: Shahid Mahboob, Hafiz Muhammad Ashraf, Salma Sultana, Tayyaba Sultana, Khalid Al-Ghanim, Fahid Al-Misned, Zubair Ahmedd

Abstract:

There is an increasing need for an effective tool to estimate the risks derived from the large number of pollutants released into the environment by human activities. Typical screening procedures are highly invasive or lethal to fish. Recent studies show that fish scales respond biochemically to a range of contaminants, including toxic metals, organic compounds, and endocrine disruptors. The present study evaluated the effects of surface water contaminants on Cyprinus carpio in the Ravi River by comparing DNA extracted non-lethally from their scales with DNA extracted from the scales of fish collected from a controlled fish farm. A single, random sampling was conducted. Fish were broadly categorised into three weight categories (W1, W2 and W3). The experimental samples in the W1, W2 and W3 categories had an average DNA concentration (µg/µl) that was lower than that of the control samples. All control samples had a single DNA band, whereas the experimental samples had 1 to 2 bands in W1 fish, two bands in W2 fish, and fragmentation in the form of three bands in W3 fish. These bands reflect the effects of pollution on fish in the Ravi River. On the basis of the findings of this study, we propose that fish scales can be successfully employed as a new non-lethal tool for evaluating the effects of surface water contaminants.

Keywords: fish scales, Cyprinus carpio, heavy metals, non-invasive, DNA fragmentation

Procedia PDF Downloads 394
10907 Dynamic vs. Static Bankruptcy Prediction Models: A Dynamic Performance Evaluation Framework

Authors: Mohammad Mahdi Mousavi

Abstract:

Bankruptcy prediction models have been implemented for the continuous evaluation and monitoring of firms. Given the huge number of bankruptcy models, an extensive number of studies have focused on the question of which of these models is superior in performance. In practice, one of the drawbacks of existing comparative studies is that the relative assessment of alternative bankruptcy models remains an exercise that is mono-criterion in nature. Further, a very restricted number of criteria and measures have been applied to compare the performance of competing bankruptcy prediction models. In this research, we overcome these methodological gaps by implementing an extensive range of criteria and measures for comparison between dynamic and static bankruptcy models, and by proposing a multi-criteria framework to compare the relative performance of bankruptcy models in forecasting firm distress for UK firms.

Keywords: bankruptcy prediction, data envelopment analysis, performance criteria, performance measures

Procedia PDF Downloads 235
10906 Asia Pacific University of Technology and Innovation

Authors: Esther O. Adebitan, Florence Oyelade

Abstract:

The Millennium Development Goals (MDGs) were initiated by the UN member nations' aspiration for the betterment of human life. They are expressed in a set of numerical and time-bound targets. More recently, the aspiration has been shifting away from mere achievement towards the sustainability of the achieved MDGs beyond the 2015 target. The main objective of this study was to assess how much the hotel industry within the Nigerian Federal Capital Territory (FCT), as a member of the global community, is involved in the achievement of sustainable MDGs within the FCT. The study had two population groups, consisting of 160 hotels and the communities where these are located. A stratified random sampling technique was adopted in selecting 60 hotels based on a large, medium and small hotels categorisation, while a simple random sampling technique was used to elicit information from 30 residents of three of the hotels' host communities. The study was guided by three research questions and two hypotheses aimed at ascertaining whether hotels see the need to be involved in, and have policies in pursuit of, achieving sustained MDGs, and at determining public opinion regarding hotels' contribution towards the achievement of the MDGs in their communities. A 22-item questionnaire was designed and administered to hotel managers, while an 11-item questionnaire was designed and administered to the hotels' host communities. Frequency distribution and percentage as well as Chi-square were used to analyse the data. Results showed no significant involvement of the hotel industry in achieving sustained MDGs in the FCT and a disconnect between the hotels and their immediate communities. The study recommended that hotels should, as part of their Corporate Social Responsibility, pick at least one of the goals to work on in order to be involved in the attainment of enduring Millennium Development Goals.

Keywords: MDGs, hotels, FCT, host communities, corporate social responsibility

Procedia PDF Downloads 402
10905 Enhancing Temporal Extrapolation of Wind Speed Using a Hybrid Technique: A Case Study in West Coast of Denmark

Authors: B. Elshafei, X. Mao

Abstract:

The demand for renewable energy is increasing significantly, and major investments are being directed to the wind power generation industry as a leading source of clean energy. The wind energy sector is entirely dependent on the prediction of wind speed, which by the nature of wind is highly stochastic. This study employs deep multi-fidelity Gaussian process regression to predict wind speeds over medium-term time horizons. Data from the RUNE experiment on the west coast of Denmark were provided by the Technical University of Denmark and represent the wind speed across the study area between December 2015 and March 2016. The study investigates the effect of pre-processing the data by denoising the signal using the empirical wavelet transform (EWT) and of engaging the vector components of wind speed to increase the number of input data layers for data fusion using deep multi-fidelity Gaussian process regression (GPR). The outcomes were compared using the root mean square error (RMSE), and the results demonstrated a significant increase in prediction accuracy: strategies that use the vector components of the wind speed as additional predictors are more accurate than strategies that ignore them, reflecting the importance of including all sub-data and of pre-processing signals in wind speed forecasting models.
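
The denoise-then-regress pipeline described above can be sketched in miniature. The following Python sketch substitutes a plain single-fidelity Gaussian process with an RBF kernel for the deep multi-fidelity model, and a synthetic signal for the RUNE measurements; all values and hyperparameters are illustrative assumptions, not the authors’ setup.

```python
import numpy as np

def gpr_predict(x_tr, y_tr, x_te, length=1.5, sigma_f=2.0, sigma_n=0.3):
    """Minimal Gaussian process regression with an RBF kernel.

    Returns the posterior mean at x_te given noisy training pairs.
    """
    def k(a, b):
        d2 = (a[:, None] - b[None, :]) ** 2
        return sigma_f ** 2 * np.exp(-0.5 * d2 / length ** 2)

    K = k(x_tr, x_tr) + sigma_n ** 2 * np.eye(len(x_tr))
    alpha = np.linalg.solve(K, y_tr - y_tr.mean())
    return y_tr.mean() + k(x_te, x_tr) @ alpha

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 200)
true = 8 + 2 * np.sin(0.8 * t)                  # synthetic mean wind speed (m/s)
noisy = true + rng.normal(0, 0.5, t.size)       # measurement noise

pred = gpr_predict(t[::4], noisy[::4], t)       # train on a subsample, predict everywhere
rmse_raw = np.sqrt(np.mean((noisy - true) ** 2))
rmse_gpr = np.sqrt(np.mean((pred - true) ** 2))
```

In the full method, the denoised vector components of the wind speed would enter as additional input columns rather than a single scalar series.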

Keywords: data fusion, Gaussian process regression, signal denoise, temporal extrapolation

Procedia PDF Downloads 126
10904 Numerical Investigation of Hot Oil Velocity Effect on Force Heat Convection and Impact of Wind Velocity on Convection Heat Transfer in Receiver Tube of Parabolic Trough Collector System

Authors: O. Afshar

Abstract:

A solar receiver is designed for operation under extremely uneven heat flux distribution, cyclic weather, and cloud transient conditions, which can induce large thermal stress and even receiver failure. In this study, the effect of different oil velocities on the convection coefficient and the impact of wind velocity on the local Nusselt number are analyzed by the Finite Volume Method. The study gives an overview of numerical modeling in MATLAB as an accurate, time-efficient and economical way of analyzing heat transfer trends over a stationary receiver tube for different Reynolds numbers. The results reveal that when the oil velocity is below 0.33 m/s, the convection coefficient is negligible at low temperature. The numerical results indicate that when the oil velocity increases up to 1.2 m/s, the heat convection coefficient increases significantly; conversely, a reduction in oil velocity causes a reduction in heat conduction through the glass envelope. In addition, the local Nusselt number is reduced when the wind blows toward the concave side of the collector, which has a significant effect on reducing heat losses through the glass envelope.
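
As a rough cross-check on the reported velocity trend, the turbulent internal-flow convection coefficient can be estimated with the classical Dittus-Boelter correlation, Nu = 0.023 Re^0.8 Pr^0.4. The sketch below is not the paper’s finite-volume model; the tube diameter and oil properties are illustrative assumptions.

```python
def convection_coefficient(velocity, diameter=0.066, rho=800.0, mu=1.5e-3,
                           cp=2300.0, k=0.12):
    """Internal forced-convection coefficient h (W/m^2.K) for oil flow in a
    receiver tube via Dittus-Boelter (turbulent flow, Re > ~1e4).
    Property values are illustrative, not the study's data."""
    re = rho * velocity * diameter / mu       # Reynolds number
    pr = cp * mu / k                          # Prandtl number
    nu = 0.023 * re ** 0.8 * pr ** 0.4        # Nusselt number (heating case)
    return nu * k / diameter, re

h_low, re_low = convection_coefficient(0.33)   # near the reported threshold
h_high, re_high = convection_coefficient(1.2)  # upper velocity in the study
```

Because h scales with velocity to the 0.8 power under this correlation, raising the oil velocity from 0.33 to 1.2 m/s roughly triples the convection coefficient, consistent with the trend the abstract reports.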

Keywords: receiver tube, heat convection, heat conduction, Nusselt number

Procedia PDF Downloads 342
10903 Numerical Analysis of Passive Controlled Turbulent Flow around a Circular Cylinder

Authors: Mustafa Soyler, Mustafa M. Yavuz, Bulent Yaniktepe, Coskun Ozalp

Abstract:

In this study, unsteady two-dimensional turbulent flow around a circular cylinder and passive control of the flow with a groove on the cylinder were examined. In the CFD analysis, both steady and unsteady solutions were obtained under turbulent flow conditions. Numerical analysis of the flow around a circular cylinder is difficult because the flow is not in a stable regime when the Reynolds number is between 1000 and 10000. The analyses in this study were performed at a subcritical Reynolds number of 5000, and the results were compared with experimental values of the drag coefficient (Cd) and Strouhal number (St) available in the literature. The effect of different groove types and depths on the Cd coefficient was analyzed; the grooves increase the Cd coefficient compared to the smooth cylinder.
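
The two reported metrics can be recovered from force-coefficient time histories in a short post-processing step: St = fD/U from the dominant lift frequency, and Cd from the time-averaged drag. The sketch below uses a synthetic lift/drag signal in place of real CFD output; the shedding frequency and coefficient levels are illustrative, not the study’s results.

```python
import numpy as np

# Normalised free-stream speed, cylinder diameter, density
U, D = 1.0, 1.0
f_shed = 0.21          # hypothetical vortex-shedding frequency (Hz)
dt = 0.01
t = np.arange(0.0, 100.0, dt)

# Lift coefficient oscillates at the shedding frequency,
# drag coefficient at twice that frequency about its mean.
cl = 0.8 * np.sin(2 * np.pi * f_shed * t)
cd = 1.1 + 0.05 * np.sin(2 * np.pi * 2 * f_shed * t)

# Dominant lift frequency from the FFT gives the Strouhal number
spectrum = np.abs(np.fft.rfft(cl))
freqs = np.fft.rfftfreq(t.size, dt)
st = freqs[spectrum.argmax()] * D / U   # St = f D / U
cd_mean = cd.mean()                     # time-averaged drag coefficient
```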

Keywords: CFD, drag coefficient, flow over cylinder, passive flow control

Procedia PDF Downloads 215
10902 Influence of Vibration Amplitude on Reaction Time and Drowsiness Level

Authors: Mohd A. Azizan, Mohd Z. Zali

Abstract:

It is well established that exposure to vibration has an adverse effect on human health, comfort, and performance. However, there is little quantitative knowledge on performance combined with drowsiness level during vibration exposure. This paper reports a study investigating the influence of vibration amplitude on seated occupants’ reaction time and drowsiness level. Eighteen male volunteers were recruited for the experiment. Before commencing, the total transmitted acceleration measured at the interfaces between the seat pan, the seatback and the human body was adjusted to 0.2 m/s² r.m.s. and 0.4 m/s² r.m.s. for each volunteer. Seated volunteers were exposed to Gaussian random vibration in the 1-15 Hz frequency band at the two amplitudes (low and medium) for 20 minutes on separate days. For drowsiness measurement, volunteers completed a 10-minute psychomotor vigilance task (PVT) before and after vibration exposure and rated their subjective drowsiness on the Karolinska Sleepiness Scale (KSS) before vibration, at 5-minute intervals, and after the 20 minutes of vibration exposure. Strong evidence of drowsiness was found, as there was a significant increase in reaction time and in the number of lapses following exposure to vibration in both conditions; the effect was more apparent at the medium vibration amplitude. A steady increase in drowsiness level was also observed in the KSS for all volunteers; however, no significant KSS differences were found between the low and medium vibration amplitudes. It is concluded that exposure to vibration has an adverse effect on human alertness, more pronounced at higher vibration amplitude. Taken together, these findings suggest a role of vibration in promoting drowsiness, especially at higher vibration amplitudes.
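
A stimulus of the kind used here, Gaussian random vibration band-limited to 1-15 Hz and scaled to a target r.m.s. acceleration, can be generated as follows; the sampling rate is an assumed value, not taken from the study.

```python
import numpy as np

fs = 256                  # assumed sampling rate (Hz)
duration = 20 * 60        # 20-minute exposure (s)
target_rms = 0.2          # m/s^2 r.m.s., the low-amplitude condition

rng = np.random.default_rng(0)
white = rng.normal(size=fs * duration)

# Band-limit to 1-15 Hz by zeroing FFT bins outside the band
spec = np.fft.rfft(white)
freqs = np.fft.rfftfreq(white.size, 1 / fs)
spec[(freqs < 1.0) | (freqs > 15.0)] = 0.0
band = np.fft.irfft(spec, n=white.size)

# Scale the band-limited signal to the target r.m.s. acceleration
band *= target_rms / np.sqrt(np.mean(band ** 2))
```

The medium-amplitude condition is the same signal scaled to 0.4 m/s² r.m.s.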

Keywords: drowsiness, human vibration, karolinska sleepiness scale, psychomotor vigilance test

Procedia PDF Downloads 273
10901 Bias-Corrected Estimation Methods for Receiver Operating Characteristic Surface

Authors: Khanh To Duc, Monica Chiogna, Gianfranco Adimari

Abstract:

With three diagnostic categories, assessment of the performance of diagnostic tests is achieved by the analysis of the receiver operating characteristic (ROC) surface, which generalizes the ROC curve for binary diagnostic outcomes. The volume under the ROC surface (VUS) is a summary index usually employed for measuring the overall diagnostic accuracy. When the true disease status can be exactly assessed by means of a gold standard (GS) test, unbiased nonparametric estimators of the ROC surface and VUS are easily obtained. In practice, unfortunately, disease status verification via the GS test could be unavailable for all study subjects, due to the expensiveness or invasiveness of the GS test. Thus, often only a subset of patients undergoes disease verification. Statistical evaluations of diagnostic accuracy based only on data from subjects with verified disease status are typically biased. This bias is known as verification bias. Here, we consider the problem of correcting for verification bias when continuous diagnostic tests for three-class disease status are considered. We assume that selection for disease verification does not depend on disease status, given test results and other observed covariates, i.e., we assume that the true disease status, when missing, is missing at random. Under this assumption, we discuss several solutions for ROC surface analysis based on imputation and re-weighting methods. In particular, verification bias-corrected estimators of the ROC surface and of VUS are proposed, namely, full imputation, mean score imputation, inverse probability weighting and semiparametric efficient estimators. Consistency and asymptotic normality of the proposed estimators are established, and their finite sample behavior is investigated by means of Monte Carlo simulation studies. Two illustrations using real datasets are also given.
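
One of the corrections listed above, inverse probability weighting, can be sketched directly for the VUS: each verified subject is weighted by the inverse of its verification probability in the usual triplet-ordering estimator. This is a minimal illustration under missing-at-random selection, not the authors’ full semiparametric efficient estimator.

```python
import numpy as np

def vus_ipw(scores, disease, verified, pi):
    """Inverse-probability-weighted VUS for three ordered disease classes
    (1 < 2 < 3). Verified subjects get weight 1/pi (their verification
    probability); unverified subjects drop out of the triplet sums."""
    w = np.where(verified, 1.0 / pi, 0.0)
    idx = [np.flatnonzero((disease == k) & verified) for k in (1, 2, 3)]
    num = den = 0.0
    for i in idx[0]:
        for j in idx[1]:
            for k in idx[2]:
                wt = w[i] * w[j] * w[k]
                den += wt
                num += wt * (scores[i] < scores[j] < scores[k])
    return num / den

# Perfectly ordered test values, everyone verified: the estimator returns 1.
scores = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
disease = np.array([1, 1, 2, 2, 3, 3])
verified = np.ones(6, dtype=bool)
pi = np.ones(6)
v = vus_ipw(scores, disease, verified, pi)
```

With informative pi estimated from covariates, the same weighting removes the bias of restricting the sums to verified subjects, provided the missing-at-random assumption holds.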

Keywords: imputation, missing at random, inverse probability weighting, ROC surface analysis

Procedia PDF Downloads 403
10900 A Shift in Approach from Cereal Based Diet to Dietary Diversity in India: A Case Study of Aligarh District

Authors: Abha Gupta, Deepak K. Mishra

Abstract:

The food security debate in India has centred on the availability and accessibility of cereals, regarded as the only food group needed to check hunger and improve nutrition. The significance of fruits, vegetables, meat and other food products has been largely neglected, despite the essential nutrients they provide. There is a need to shift the emphasis from a cereal-based approach to a more diverse diet, so that the aim of achieving food security broadens from merely reducing hunger to overall health. This paper analyses how far dietary diversity has been achieved across different socio-economic groups in India. To this end, the paper sets out to determine (a) the percentage share of different food groups in total food expenditure and consumption by background characteristics, (b) the source of and preference for all food items and, (c) the diversity of diet across socio-economic groups. A cross-sectional survey covering 304 households, selected through proportional stratified random sampling, was conducted in six villages of Aligarh district of Uttar Pradesh, India. Information on the amount of food consumed, the source of consumption and food expenditure (74 food items grouped into 10 major food groups) was collected with a recall period of seven days. Per capita per day food consumption/expenditure was calculated by dividing consumption/expenditure by household size and then by seven. A food variety score was estimated by assigning 0 to food groups/items not eaten and 1 to those taken by the household in the last seven days; summing across all groups/items gave the food variety score. Diversity of diet was computed using the Herfindahl-Hirschman index. Findings show that cereals, milk, and roots and tubers contribute a major share of total consumption/expenditure. Consumption of these food groups varies across socio-economic groups, whereas fruit, vegetable, meat and other food consumption remains low and uniform. The dietary diversity estimates show a high concentration of diet, driven by the higher consumption of cereals, milk, and root and tuber products, with only slight variation across background groups. Muslim, Scheduled Caste, small-farmer, lower-income, food-insecure, below-poverty-line and labour households show a higher concentration of diet than their counterpart groups. These groups also evince a lower mean number of food items consumed in a week, owing to economic constraints and the resulting lower access to expensive food items. The results advocate a shift from a cereal-based diet to a dietary diversity that includes not only cereals and milk products but also nutrition-rich items such as fruits, vegetables, meat and other products. Integrating a dietary diversity approach into the country's food security programmes would help achieve nutrition security, as hidden hunger is widespread among the Indian population.
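
The concentration measure named above, the Herfindahl-Hirschman index, is simply the sum of squared expenditure shares: it rises toward 1 as spending concentrates in one food group and falls toward 1/n for an even spread over n groups. The shares below are illustrative, not the survey’s figures.

```python
def hhi(shares):
    """Herfindahl-Hirschman index: sum of squared budget shares.

    Ranges from 1/n (perfectly even diet over n groups) up to 1
    (all expenditure on a single food group).
    """
    total = sum(shares)
    return sum((s / total) ** 2 for s in shares)

cereal_heavy = hhi([55, 20, 10, 5, 5, 5])     # cereal-dominated budget
diversified = hhi([20, 20, 15, 15, 15, 15])   # more even spread
```

A cereal-dominated budget thus scores markedly higher (more concentrated) than an even one, which is the pattern the study reports for disadvantaged households.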

Keywords: dietary diversity, food security, India, socio-economic groups

Procedia PDF Downloads 330
10899 Human-Wildlife Conflicts in Urban Areas of Zimbabwe

Authors: Davie G. Dave, Prisca H. Mugabe, Tonderai Mutibvu

Abstract:

Globally, human-wildlife conflicts (HWCs) are on the rise. Such is the case in urban areas of Zimbabwe, yet little has been documented about it. This study was done to provide insights into the occurrence of HWCs in urban areas. It was carried out in Harare, Bindura, Masvingo, Beitbridge, and Chiredzi to determine the cause, nature, extent, and frequency of occurrence of HWCs, the key wildlife species involved, and the management practices used to combat wildlife conflicts in these areas. Several sampling techniques, encompassing multi-stage, stratified random, purposive, and simple random sampling, were employed for placing residential areas into three strata according to population density, selecting residential areas, and selecting the actual participants. Data were collected through a semi-structured questionnaire and key informant interviews. The results revealed that property destruction and crop damage were the most prevalent conflicts. Of the 15 animals cited, snakes, baboons, and monkeys were associated with the most conflicts. The occurrence of HWCs was mainly attributed to the increase in both animal and human populations. To curtail these HWCs, local people mainly used non-lethal methods, whilst lethal methods were used by the authorities for some of the reported cases. The majority of the conflicts were seasonal and less severe. Respondents raised growing concerns about wildlife conflicts, especially in areas with primates, such as Warren Park in Harare and Limpopo View in Beitbridge. There are HWC hotspots in urban areas, and to ameliorate this, a multi-action approach is suggested that includes general awareness campaigns on HWCs and land-use planning that creates green spaces to ease wildlife management.

Keywords: human-wildlife conflicts, mitigation measures, residential areas, types of conflicts, urban areas

Procedia PDF Downloads 51
10898 The Value of Routine Terminal Ileal Biopsies for the Investigation of Diarrhea

Authors: Swati Bhasin, Ali Ahmed, Valence Xavier, Ben Liu

Abstract:

Aims: Diarrhea is a problem frequently referred to the gastroenterology and surgical teams by general practitioners. To establish a diagnosis, these patients undergo colonoscopy. The current practice at our district general hospital is to take random left and right colonic biopsies. National guidelines issued by the British Society of Gastroenterology advise that all patients presenting with chronic diarrhea should have an ileoscopy as an indicator of colonoscopy completion. Our primary aim was to determine whether terminal ileum (TI) biopsy is required to establish a diagnosis of inflammatory bowel disease (IBD). Methods: Data were collected retrospectively from November 2018 to November 2019. The target population was patients who underwent colonoscopy for diarrhea. Demographic data and endoscopic and histological findings of the TI were assessed and analyzed. Results: 140 patients with a mean age of 57 years (19-84) underwent colonoscopy (M:F, 1:2.3). 92 patients had random colonic biopsies taken, and based on their histology, 15 patients (16%) were diagnosed with IBD. The TI was successfully intubated in 40 patients, of whom 32 also had colonic biopsies taken; 8 patients did not have a colonic biopsy taken. A macroscopic abnormality in the TI was detected in 5 patients, all of whom were biopsied. Based on the histological results of the biopsy, 3 patients (12%) were diagnosed with IBD. All 3 also had colonic biopsies taken simultaneously, which showed inflammation. No patient had a diagnosis of IBD confirmed on TI intubation alone, whether colonic biopsies were negative or not taken.
Conclusion: TI intubation is a highly skilled, time-consuming procedure with a higher risk of perforation which, as per our study, has little additional diagnostic value in finding IBD in patients with diarrhea if colonic biopsies are taken. We propose that diarrhea is a colonic symptom; therefore, colonic biopsies are positive for inflammation if the diarrhea is secondary to IBD. We conclude that IBD can be diagnosed with colonic biopsies alone.

Keywords: biopsy, colon, IBD, terminal ileum

Procedia PDF Downloads 108
10897 Familiarity with Nursing and Description of Nurses Duties

Authors: Narges Solaymani

Abstract:

A nurse is a person who is educated and skilled in the scientific principles and professional skills of health care, treatment, and the medical training of patients. Nursing is a very important profession in societies around the world. Although in the past all caregivers of the sick and disabled were called nurses, nowadays a nurse is a person with a university education in this field; there are nurses with bachelor's, master's, and doctoral degrees in nursing, and new duty-oriented courses have been launched at the master's level. A nurse cannot run an independent treatment centre but is a member of the treatment team in established centres such as hospitals, clinics, or offices; nurses can, however, establish counselling centres and provide nursing services at home. According to the standards, the number of nurses should be three times the number of doctors, or twice the number of hospital beds, or three nurses for every thousand people. International standards also indicate that in internal medicine and surgical wards there should be one nurse for every 4 to 6 patients.

Keywords: nurse, intensive care, CPR, bandage

Procedia PDF Downloads 48
10896 GIS Application in Surface Runoff Estimation for Upper Klang River Basin, Malaysia

Authors: Suzana Ramli, Wardah Tahir

Abstract:

Estimation of surface runoff depth is a vital part of any rainfall-runoff model: it leads to stream flow calculation and, later, prediction of flood occurrences. GIS (Geographic Information System) is an advanced and apposite tool for simulating hydrological models owing to its realistic treatment of topography. This paper discusses the calculation of surface runoff depth for two selected events in the Upper Klang River basin using GIS with the Curve Number method. GIS enables the intersection of soil type and land use maps, which produces a curve number map. The results show good correlation between simulated and observed values, with R² above 0.7, and acceptable performance on statistical measures, namely the mean error, absolute mean error, RMSE, and bias.
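
The Curve Number step of the method computes runoff depth from storm rainfall and a land-use/soil-derived CN. A minimal sketch of the standard SCS-CN relation (in millimetres) follows; the CN and rainfall values are illustrative, not the basin’s calibrated figures.

```python
def runoff_depth(p_mm, cn):
    """SCS Curve Number runoff depth (mm).

    Q = (P - 0.2*S)^2 / (P + 0.8*S), with potential retention
    S = 25400/CN - 254 (mm); runoff is zero when rainfall does not
    exceed the initial abstraction 0.2*S.
    """
    s = 25400.0 / cn - 254.0
    ia = 0.2 * s
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm + 0.8 * s)

q = runoff_depth(100.0, 75)   # about 41 mm of runoff for a 100 mm storm on CN 75
```

In the GIS workflow, this scalar relation is applied cell by cell over the curve number map produced by intersecting the soil and land-use layers.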

Keywords: surface runoff, geographic information system, curve number method, environment

Procedia PDF Downloads 269
10895 Evolution and Obstacles Encountered in the Realm of Sports Tourism in Pakistan

Authors: Muhammad Saleem

Abstract:

Tourism is one of the most rapidly expanding sectors globally, contributing around 10% of worldwide GDP. It plays a vital role in generating income, fostering employment opportunities, alleviating poverty, facilitating foreign exchange earnings, and advancing intercultural understanding, and it encompasses a spectrum of activities including transportation, communication, hospitality, catering, entertainment, and advertising. The objective of this study is to assess the evolution of, and the obstacles encountered by, sports tourism in Pakistan. In pursuit of this objective, relevant literature was scrutinized, and data were acquired from 60 respondents selected by simple random sampling. The survey comprised close-ended inquiries directed to all participants. Analytical tools such as the mean, mode, median, graphs, and percentages were employed for data analysis. The findings indicate that the mean, mode, and median consistently yield results surpassing the 70% mark, underscoring that heightened development within sports tourism significantly augments its progress. Effective governance demonstrates a favorable influence on sports tourism, and increased government-provided safety and security could amplify its expansion, attracting more tourists and consequently propelling the growth of the sports tourism sector. This study holds substantial significance for both academic scholars and industry practitioners within Pakistan's tourism landscape, as previous explorations in this realm have been relatively limited.

Keywords: obstacles-sports, evolution-tourism, sports-pakistan, sports-obstacles-pakistan

Procedia PDF Downloads 35
10894 Assessing and Identifying Factors Affecting Customers Satisfaction of Commercial Bank of Ethiopia: The Case of West Shoa Zone (Bako, Gedo, Ambo, Ginchi and Holeta), Ethiopia

Authors: Habte Tadesse Likassa, Bacha Edosa

Abstract:

Customer satisfaction is essential for banks to be productive and successful in any organization and business area. The main goal of the study is to assess and identify the factors that influence customer satisfaction in the West Shoa Zone branches of the Commercial Bank of Ethiopia (Holeta, Ginchi, Ambo, Gedo and Bako). A stratified random sampling procedure was used, and 520 customers were drawn from the target population by simple random sampling (lottery method); the sample size for each branch was allocated using the probability-proportional-to-size technique. Both descriptive and inferential statistical methods were used. A binary logistic regression model was fitted to assess the significance of factors affecting customer satisfaction, and the SPSS statistical package was used for data analysis. The results reveal that the overall level of customer satisfaction in the study area is low (38.85%) compared with those who were not satisfied (61.15%). Almost all factors included in the study were significantly associated with customer satisfaction. Based on the comparison of branches by odds ratio, customers of the Ambo and Bako branches are less satisfied than customers of the Holeta branch, while customers of the Ginchi and Gedo branches are more satisfied than those of Holeta. Since the level of customer satisfaction is low in the study area, it is recommended that the bodies concerned work cooperatively to maximize the satisfaction of their customers.
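
The branch comparison by odds ratio rests on a binary logistic model of the form logit P(satisfied) = β0 + β1 x, where exp(β1) is the reported odds ratio. A minimal Newton-Raphson fit is sketched below on synthetic data; the predictor (waiting time) and all coefficients are illustrative assumptions, not the study’s estimates.

```python
import numpy as np

def logit_fit(X, y, iters=25):
    """Binary logistic regression by Newton-Raphson.

    Returns coefficients whose exponentials are the odds ratios
    reported in studies like this one.
    """
    X = np.column_stack([np.ones(len(X)), X])     # prepend intercept
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))       # fitted probabilities
        W = p * (1 - p)                           # IRLS weights
        H = X.T @ (X * W[:, None])                # Hessian
        beta += np.linalg.solve(H, X.T @ (y - p))  # Newton step
    return beta

# Illustrative data: satisfaction (1/0) vs. waiting time in minutes
rng = np.random.default_rng(1)
wait = rng.uniform(5, 60, 400)
p_true = 1.0 / (1.0 + np.exp(-(3.0 - 0.08 * wait)))
satisfied = (rng.uniform(size=400) < p_true).astype(float)

beta = logit_fit(wait.reshape(-1, 1), satisfied)
odds_ratio = np.exp(beta[1])   # odds multiplier per extra minute of waiting
```

An odds ratio below 1 here means each extra minute of waiting lowers the odds of satisfaction, mirroring how the study reads its branch-level odds ratios.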

Keywords: customers, satisfaction, binary logistic, complain handling process, waiting time

Procedia PDF Downloads 447
10893 Mediation Role of Teachers’ Surface Acting and Deep Acting on the Relationship between Calling Orientation and Work Engagement

Authors: Yohannes Bisa Biramo

Abstract:

This study examined the mediational role of surface acting and deep acting in the relationship between calling orientation and the work engagement of teachers in secondary schools of Wolaita Zone, Wolaita, Ethiopia. A predictive, non-experimental correlational design was employed with 300 secondary school teachers. Stratified random sampling followed by a systematic random sampling technique was used to select samples from the target population. Structural Equation Modeling (SEM) was used to test the associations between the independent variables and the dependent variable, to assess the goodness of fit of the model, and to explain the path influence of the independent variables on the dependent variable. Confirmatory factor analysis (CFA) was conducted to test the validity of the scales and to assess the measurement-model fit indices. The analysis revealed that calling was significantly and positively correlated with surface acting, deep acting and work engagement; surface acting was significantly and positively correlated with deep acting and work engagement; and deep acting was significantly and positively correlated with work engagement. With respect to mediation, surface acting mediated the relationship between calling and work engagement, and deep acting likewise mediated that relationship. Using the model of the present study, school leaders and practitioners can identify core areas to consider in recruiting teachers, in providing induction training for newly employed teachers, and in performance appraisal.
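
The mediation logic tested here, an indirect effect equal to the product of the calling-to-mediator path (a) and the mediator-to-engagement path (b) with calling partialled out, can be sketched with two regressions. The data and path coefficients below are synthetic illustrations, not the study’s SEM estimates.

```python
import numpy as np

def ols_slope(x, y):
    """Slope of y on x from simple OLS with an intercept."""
    xc = x - x.mean()
    return (xc @ (y - y.mean())) / (xc @ xc)

rng = np.random.default_rng(2)
n = 500
calling = rng.normal(size=n)
# Hypothetical mediator (deep acting) and outcome (work engagement)
deep_acting = 0.6 * calling + rng.normal(scale=0.8, size=n)
engagement = 0.3 * calling + 0.5 * deep_acting + rng.normal(scale=0.8, size=n)

a = ols_slope(calling, deep_acting)              # path a: calling -> mediator
# Path b must partial out calling: regress engagement on both predictors
X = np.column_stack([np.ones(n), calling, deep_acting])
b0, direct, b = np.linalg.lstsq(X, engagement, rcond=None)[0]
indirect = a * b                                  # mediated (indirect) effect
```

A nonzero indirect effect alongside a remaining direct effect corresponds to the partial mediation pattern reported for both surface and deep acting.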

Keywords: calling, surface acting, deep acting, work engagement, mediation, teachers

Procedia PDF Downloads 64