Search results for: random non-response
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1996

1696 Land Suitability Prediction Modelling for Agricultural Crops Using Machine Learning Approach: A Case Study of Khuzestan Province, Iran

Authors: Saba Gachpaz, Hamid Reza Heidari

Abstract:

The sharp increase in population growth puts more pressure on agricultural areas to satisfy the food supply. Meeting this demand consumes more resources and, alongside other environmental concerns, highlights the need for sustainable agricultural development. Land-use management is a crucial factor in obtaining optimum productivity. Machine learning is widely used in the agricultural sector, from yield prediction to customer behavior; it learns patterns and correlations from a data set. In this study, nine physical control factors, namely soil classification, electrical conductivity, normalized difference water index (NDWI), groundwater level, elevation, annual precipitation, pH of water, annual mean temperature, and slope in the alluvial plain of Khuzestan (an agricultural hotspot in Iran), are used to decide the best agricultural land use for both rainfed and irrigated agriculture for ten different crops. For this purpose, each variable was imported into ArcGIS, and a raster layer was obtained. In the next step, using training samples, all layers were imported into the Python environment. A random forest model was applied, and the weight of each variable was specified. In the final step, results were visualized using a digital elevation model, and the importance of all factors for each of the crops was obtained. Our results show that although 62% of the study area is allocated to agricultural purposes, only 42.9% of these areas can be classified as suitable for cultivation.
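The modelling step described here (predictor layers in, per-variable weights out) can be sketched with scikit-learn's random forest; the factor names and synthetic values below are illustrative stand-ins, not the authors' Khuzestan data.

```python
# Illustrative sketch of the workflow: nine environmental predictors feed a
# random forest whose feature importances serve as the variable weights.
# Synthetic data only -- the real inputs were raster layers from ArcGIS.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
factors = ["soil_class", "ec", "ndwi", "groundwater", "elevation",
           "precip", "ph", "mean_temp", "slope"]
X = rng.normal(size=(500, len(factors)))   # 500 hypothetical training pixels
y = rng.integers(0, 10, size=500)          # best crop class per pixel (10 crops)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
weights = dict(zip(factors, model.feature_importances_))
for name, w in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(f"{name:12s} {w:.3f}")
```

On real data, each row of `X` would be a pixel's stacked layer values exported from the GIS rasters.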

Keywords: land suitability, machine learning, random forest, sustainable agriculture

Procedia PDF Downloads 52
1695 Assessing Functional Structure in European Marine Ecosystems Using a Vector-Autoregressive Spatio-Temporal Model

Authors: Katyana A. Vert-Pre, James T. Thorson, Thomas Trancart, Eric Feunteun

Abstract:

In marine ecosystems, spatial and temporal species structure is an important component of ecosystems’ response to anthropogenic and environmental factors. Although spatial distribution patterns and temporal series of fish abundance have been studied in the past, little research has addressed the joint dynamic spatio-temporal functional patterns of marine ecosystems and their use in multispecies management and conservation. Each species performs a function within the ecosystem, and the distribution of these species might not be random; a heterogeneous functional distribution makes an ecosystem more resilient to external factors. Applying a Vector-Autoregressive Spatio-Temporal (VAST) model for count data, we estimate the spatio-temporal distribution, shift in time, and abundance of 140 species of the Eastern English Channel, Bay of Biscay, and Mediterranean Sea. From the model outputs, we determined spatio-temporal clusters, calculating p-values for hierarchical clustering via multiscale bootstrap resampling. Then, we designed a functional map given the defined clusters. We found that the species distribution within the ecosystem was not random: species evolved in space and time in clusters. Moreover, these clusters remained similar over time because species of the same cluster often shifted in sync, keeping the overall structure of the ecosystem similar over time. Knowing the co-existing species within these clusters could help predict the distribution and abundance of data-poor species. Further analysis is being performed to assess the ecological functions represented in each cluster.
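The clustering step can be illustrated with a plain bootstrap stability check (the paper itself used multiscale bootstrap resampling, as in the pvclust approach); the species density matrix below is synthetic, not VAST model output.

```python
# Hedged sketch: hierarchical clustering of species by their estimated
# spatial density fields, with a plain bootstrap check of cluster stability.
# 20 hypothetical "species" x 50 spatial cells, with two planted groups.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
dens = np.vstack([rng.normal(0, 1, (10, 50)), rng.normal(3, 1, (10, 50))])

base = fcluster(linkage(dens, method="ward"), t=2, criterion="maxclust")
print("full-data split:", base)

# bootstrap over spatial cells: how often is each species pair co-clustered?
n_boot, co = 200, np.zeros((20, 20))
for _ in range(n_boot):
    cells = rng.integers(0, 50, 50)
    lab = fcluster(linkage(dens[:, cells], method="ward"), t=2, criterion="maxclust")
    co += (lab[:, None] == lab[None, :])
support = co / n_boot   # near 1.0 for stable pairs, near 0.0 otherwise
print(support[0, 1], support[0, 19])
```

High support values play the role of the p-values in the paper: they flag clusters that persist under resampling.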

Keywords: cluster distribution shift, European marine ecosystems, functional distribution, spatio-temporal model

Procedia PDF Downloads 166
1694 Examination of Public Hospital Unions Technical Efficiencies Using Data Envelopment Analysis and Machine Learning Techniques

Authors: Songul Cinaroglu

Abstract:

Regional planning in health has gained speed in developing countries in recent years. In Turkey, 89 Public Hospital Unions (PHUs) were established at the provincial level. In this study, the technical efficiencies of the 89 PHUs were examined using Data Envelopment Analysis (DEA) and machine learning techniques, dividing them into two clusters according to similarities in input and output indicators. The numbers of beds, physicians, and nurses were defined as input variables, and the numbers of outpatients, inpatients, and surgical operations as output indicators. Before performing DEA, PHUs were grouped into two clusters. The first cluster represents PHUs with higher population, demand, and service density than the others. The difference between clusters was statistically significant for all study variables (p < 0.001). After clustering, DEA was performed for the whole sample and for the two clusters separately. It was found that 11% of PHUs were efficient overall, while 21% and 17% were efficient in the first and second clusters, respectively. PHUs representing urban parts of the country, with higher population and service density, are thus more efficient than the others. A random forest decision tree graph shows that the number of inpatients, a measure of service density, is a determinative factor of PHU efficiency. It is advisable for public health policy makers to use statistical learning methods in resource planning decisions to improve efficiency in health care.
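DEA efficiency scores of the kind reported above can be computed as one small linear program per decision-making unit. The sketch below implements input-oriented CCR DEA with scipy; the four hypothetical "unions" and their input/output figures are made up.

```python
# Input-oriented CCR DEA: for each unit o, minimise theta subject to a
# convex combination of peers using at most theta * inputs of o while
# producing at least o's outputs. Rows of X: beds, physicians, nurses;
# rows of Y: outpatients, inpatients, operations (all invented numbers).
import numpy as np
from scipy.optimize import linprog

X = np.array([[120, 40, 90], [200, 60, 150], [150, 45, 100], [80, 30, 70.]]).T
Y = np.array([[9000, 3000, 800], [12000, 4000, 900],
              [11000, 3900, 950], [6000, 2000, 500.]]).T

def ccr_efficiency(o, X, Y):
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # decision vars: [theta, lambdas]
    A_in = np.c_[-X[:, [o]], X]                 # X @ lam - theta * x_o <= 0
    A_out = np.c_[np.zeros((s, 1)), -Y]         # -Y @ lam <= -y_o
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, o]],
                  bounds=[(0, None)] * (n + 1))
    return float(res.fun)

scores = [ccr_efficiency(o, X, Y) for o in range(X.shape[1])]
print([round(s, 3) for s in scores])            # 1.0 marks an efficient unit
```

At least one unit always scores 1 (it lies on the efficient frontier); the rest get scores in (0, 1).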

Keywords: public hospital unions, efficiency, data envelopment analysis, random forest

Procedia PDF Downloads 103
1693 Evolution of Predator-prey Body-size Ratio: Spatial Dimensions of Foraging Space

Authors: Xin Chen

Abstract:

It has been widely observed that marine food webs have significantly larger predator–prey body-size ratios than their terrestrial counterparts. A number of hypotheses have been proposed to account for this difference on the basis of primary productivity, trophic structure, biophysics, bioenergetics, habitat features, energy efficiency, etc. In this study, an alternative explanation is suggested based on the difference in the spatial dimensions of foraging arenas: terrestrial animals primarily forage in two-dimensional arenas, while marine animals mostly forage in three-dimensional arenas. Using 2-dimensional and 3-dimensional random walk simulations, it is shown that marine predators with 3-dimensional foraging would normally have a greater foraging efficiency than terrestrial predators with 2-dimensional foraging. Marine prey with 3-dimensional dispersion usually forms greater swarms or aggregations than terrestrial prey with 2-dimensional dispersion, which again favours greater predator foraging efficiency in marine animals. As an analytical tool, a Lotka-Volterra based adaptive dynamical model is developed with the predator-prey ratio embedded as an adaptive variable. The model predicts that high predator foraging efficiency and a high prey conversion rate will dynamically lead to the evolution of a greater predator-prey ratio. Therefore, marine food webs with 3-dimensional foraging space, which generally have higher predator foraging efficiency, will evolve a greater predator-prey ratio than terrestrial food webs.
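The simulation idea can be sketched as a random-walking predator encountering uniformly placed prey within a detection radius, run in 2-D and in 3-D. Step size, prey count, box size, and detection radius below are arbitrary choices, and the 2-D vs 3-D comparison depends strongly on them (in particular on how prey density scales with dimension), so no ordering is claimed here.

```python
# Minimal scaffold of the foraging simulation: a predator performs a
# Gaussian random walk in a periodic box and "eats" any prey that comes
# within its detection radius. All parameter values are arbitrary.
import numpy as np

def encounters(dim, n_prey=2000, n_steps=5000, radius=0.5, box=20.0, seed=0):
    rng = np.random.default_rng(seed)
    prey = rng.uniform(0, box, size=(n_prey, dim))
    pos = np.full(dim, box / 2)
    eaten = np.zeros(n_prey, dtype=bool)
    for _ in range(n_steps):
        pos = (pos + rng.normal(0, 0.3, size=dim)) % box  # periodic boundaries
        d = np.linalg.norm(prey - pos, axis=1)
        eaten |= d < radius
    return int(eaten.sum())

print("2-D encounters:", encounters(2))
print("3-D encounters:", encounters(3))
```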

Keywords: predator-prey, body size, Lotka-Volterra, random walk, foraging efficiency

Procedia PDF Downloads 46
1692 Analysis of Seismic Waves Generated by Blasting Operations and their Response on Buildings

Authors: S. Ziaran, M. Musil, M. Cekan, O. Chlebo

Abstract:

The paper analyzes the response of buildings and industrial structures to seismic waves (low-frequency mechanical vibration) generated by blasting operations. The principles of seismic analysis can be applied to different kinds of excitation, such as earthquakes, wind, explosions, random excitation from local transportation, periodic excitation from large rotating machines and/or machines with reciprocating motion, metal forming processes such as forging, shearing, and stamping, chemical reactions, construction and earth-moving work, and other strong deterministic and random energy sources caused by human activities. The article deals with the response of a residential home to seismic, low-frequency mechanical vibrations generated by nearby blasting operations. The goal was to determine the fundamental natural frequencies of the measured structure, since the resonant frequencies must be known to design suitable modal damping. The article also analyzes the package of seismic waves generated by blasting (primary P-waves and secondary S-waves) and investigates the transfer regions. To detect seismic waves resulting from an explosion, the Fast Fourier Transform (FFT) and modal analysis are used in the frequency domain, and the signal was also acquired and analyzed in the time domain. In the conclusions, the measured seismic waves caused by blasting in a nearby quarry and their effect on a nearby structure (house) are analyzed. The response of the house, including its fundamental natural frequency and possible fatigue damage, is also assessed.
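The frequency-domain step can be sketched in a few lines: an FFT of a vibration record picks out the dominant (resonant) component. The 12 Hz tone and noise level below stand in for a real blast-induced signal.

```python
# Locate the dominant frequency of a vibration record via the FFT.
# Synthetic signal: a 12 Hz component buried in Gaussian noise.
import numpy as np

fs = 1000.0                          # sampling rate, Hz
t = np.arange(0, 4.0, 1 / fs)
signal = (np.sin(2 * np.pi * 12 * t)
          + 0.2 * np.random.default_rng(0).normal(size=t.size))

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
dominant = freqs[spectrum.argmax()]
print(f"dominant frequency: {dominant:.2f} Hz")
```

For real measurements, this dominant frequency would be compared against the structure's fundamental natural frequency to judge resonance risk.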

Keywords: building structure, seismic waves, spectral analysis, structural response

Procedia PDF Downloads 369
1691 Determination of Klebsiella Pneumoniae Susceptibility to Antibiotics Using Infrared Spectroscopy and Machine Learning Algorithms

Authors: Manal Suleiman, George Abu-Aqil, Uraib Sharaha, Klaris Riesenberg, Itshak Lapidot, Ahmad Salman, Mahmoud Huleihel

Abstract:

Klebsiella pneumoniae is one of the most aggressive multidrug-resistant bacteria associated with human infections, resulting in high mortality and morbidity. Thus, for effective treatment, it is important to diagnose both the species of infecting bacteria and its susceptibility to antibiotics. Currently used methods for diagnosing bacterial susceptibility to antibiotics are time-consuming (about 24 h after the first culture), so there is a clear need for rapid methods to determine susceptibility. Infrared spectroscopy is a sensitive and simple method able to detect minor biomolecular changes in biological samples associated with developing abnormalities. The main goal of this study is to evaluate the potential of infrared spectroscopy in tandem with the Random Forest and XGBoost machine learning algorithms to diagnose the susceptibility of Klebsiella pneumoniae to antibiotics within approximately 20 minutes of the first culture. In this study, 1190 Klebsiella pneumoniae isolates were obtained from different patients with urinary tract infections. The isolates were measured with an infrared spectrometer, and the spectra were analyzed by the Random Forest and XGBoost algorithms to determine their susceptibility to nine specific antibiotics. Our results confirm that it was possible to classify the isolates into sensitive and resistant to specific antibiotics with a success rate of 80%-85% across the tested antibiotics. These results demonstrate the promising potential of infrared spectroscopy as a rapid diagnostic method for determining Klebsiella pneumoniae susceptibility to antibiotics.

Keywords: urinary tract infection (UTI), Klebsiella pneumoniae, bacterial susceptibility, infrared spectroscopy, machine learning

Procedia PDF Downloads 133
1690 The Role of Urban Development Patterns for Mitigating Extreme Urban Heat: The Case Study of Doha, Qatar

Authors: Yasuyo Makido, Vivek Shandas, David J. Sailor, M. Salim Ferwati

Abstract:

Mitigating extreme urban heat is challenging in a desert climate such as that of Doha, Qatar, since outdoor daytime temperatures are often too high for the human body to tolerate. Recent studies demonstrate that cities in arid and semiarid areas can exhibit ‘urban cool islands’ - urban areas that are cooler than the surrounding desert. However, how temperatures vary with the time of day, and which factors drive temperature change, remain open questions. To address them, we examined the spatial and temporal variation of air temperature in Doha, Qatar by conducting multiple vehicle-based local temperature observations. We also employed three statistical approaches to model surface temperatures using relevant predictors: (1) Ordinary Least Squares, (2) Regression Tree Analysis, and (3) Random Forest, for three time periods. Although the most important determinant factors varied by day and time, distance to the coast was the significant determinant at midday. A 70%/30% holdout method was used to create a testing dataset for validating the results through Pearson’s correlation coefficient. The Pearson analysis suggests that the Random Forest model predicts surface temperatures more accurately than the other methods. We conclude with recommendations about the types of development patterns that show the greatest potential for reducing extreme heat in arid climates.
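The validation scheme (70%/30% holdout, Pearson's r between predicted and observed temperatures) can be sketched as follows; the two predictors (a distance-to-coast variable and a vegetation index) and the temperature response are simulated stand-ins, not the Doha traverse data.

```python
# Random forest regression of surface temperature on two hypothetical
# predictors, validated on a 30% holdout with Pearson's r.
import numpy as np
from scipy.stats import pearsonr
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
dist_coast = rng.uniform(0, 30, 400)     # km, hypothetical predictor
veg = rng.uniform(0, 1, 400)             # vegetation index, hypothetical
temp = 45 + 0.3 * dist_coast - 5 * veg + rng.normal(0, 1, 400)

X = np.c_[dist_coast, veg]
X_tr, X_te, y_tr, y_te = train_test_split(X, temp, test_size=0.3, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
r, p = pearsonr(model.predict(X_te), y_te)
print(f"Pearson r on holdout: {r:.3f}")
```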

Keywords: desert cities, tree-structured regression model, urban cool island, vehicle temperature traverse

Procedia PDF Downloads 365
1689 Object-Based Image Analysis for Gully-Affected Area Detection in the Hilly Loess Plateau Region of China Using Unmanned Aerial Vehicle

Authors: Hu Ding, Kai Liu, Guoan Tang

Abstract:

The Chinese Loess Plateau suffers from serious gully erosion induced by natural and human causes. Detecting gully features, including the gully-affected area and its two-dimensional parameters (length, width, area, etc.), is a significant task not only for researchers but also for policy-makers. This study addresses gully-affected area detection in three catchments of the Chinese Loess Plateau, selected in Changwu, Ansai, and Suide, using an unmanned aerial vehicle (UAV). The methodology comprises a sequence of UAV data generation, image segmentation, feature calculation and selection, and random forest classification. Two experiments were conducted to investigate the influence of the segmentation strategy and feature selection. Results showed that vertical and horizontal root-mean-square errors were below 0.5 m and 0.2 m, respectively, which is ideal for the Loess Plateau region. The segmentation strategy adopted in this paper, which considers topographic information, and the optimal parameter combination can improve the segmentation results. The overall extraction accuracy achieved in Changwu, Ansai, and Suide was 84.62%, 86.46%, and 93.06%, respectively, indicating that the proposed method for detecting gully-affected areas is more objective and effective than traditional methods. This study demonstrated that UAVs can bridge the gap between field measurement and satellite-based remote sensing, striking a balance between resolution and efficiency for catchment-scale gully erosion research.

Keywords: unmanned aerial vehicle (UAV), object-based image analysis, gully erosion, gully-affected area, Loess Plateau, random forest

Procedia PDF Downloads 190
1688 Intrusion Detection in Cloud Computing Using Machine Learning

Authors: Faiza Babur Khan, Sohail Asghar

Abstract:

With the emergence of distributed environments, cloud computing is proving to be the most stimulating paradigm shift in computer technology, resulting in spectacular expansion in the IT industry. Many companies have augmented their technical infrastructure by adopting cloud resource-sharing architectures. Cloud computing has opened doors to unlimited opportunities, from application and platform availability to expandable storage and the provision of computing environments. From a security viewpoint, however, clouds introduce an added level of risk, weakening protection mechanisms and threatening privacy, data security, and the availability of on-demand service. Issues of trust, confidentiality, and integrity are elevated due to the multitenant resource-sharing architecture of the cloud. Trust, or reliability, of a cloud refers to its capability to provide the needed services precisely and unfailingly. Confidentiality is the ability of the architecture to ensure that only authorized parties access private data; integrity protects the data from being fabricated by an unauthorized user. To assure the provision of a secure cloud, a roadmap or model is therefore needed to analyze security problems, design mitigation strategies, and evaluate solutions. The aim of the paper is twofold: first, to highlight the factors that make cloud security critical, along with alleviation strategies; and second, to propose an intrusion detection model that identifies attackers preventively using a machine learning Random Forest classifier with an accuracy of 99.8%. The model uses a small number of features, and a comparison with other classifiers is also presented.

Keywords: cloud security, threats, machine learning, random forest, classification

Procedia PDF Downloads 293
1687 Customer Churn Prediction by Using Four Machine Learning Algorithms Integrating Features Selection and Normalization in the Telecom Sector

Authors: Alanoud Moraya Aldalan, Abdulaziz Almaleh

Abstract:

A crucial component of maintaining a customer-oriented business, as in the telecom industry, is understanding the reasons and factors that lead to customer churn. Competition between telecom companies has greatly increased in recent years, and it has become more important to understand customers’ needs in this strong market, especially the needs of those who are looking to change their service providers. Churn prediction is therefore a mandatory requirement for retaining those customers, and machine learning can be utilized to accomplish it. Churn prediction has become a very important topic for machine learning classification in the telecommunications industry: understanding the factors behind customer churn, and how customers behave, is essential to building an effective churn prediction model. This paper aims to predict churn and identify the factors behind customers’ churn based on their past service usage history. To this end, the study makes use of feature selection, normalization, and feature engineering. It then compares the performance of four machine learning algorithms on the Orange dataset: Logistic Regression, Random Forest, Decision Tree, and Gradient Boosting. Performance was evaluated using the F1-score and ROC-AUC. The results improve on those of existing models: Gradient Boosting with the feature selection technique performed best, achieving a 99% F1-score and 99% AUC, and all other experiments achieved good results as well.
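The best-performing pipeline reported above can be sketched with scikit-learn: gradient boosting evaluated by F1 and ROC-AUC on a holdout set. The churn data below are synthetic (two invented usage features), not the Orange dataset.

```python
# Gradient boosting churn classifier on synthetic usage data,
# scored with F1 and ROC-AUC as in the study's evaluation.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import f1_score, roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
usage = rng.exponential(200, n)            # e.g. monthly minutes (invented)
complaints = rng.poisson(1.0, n)           # support complaints (invented)
p_churn = 1 / (1 + np.exp(-(1.5 * complaints - usage / 150)))
churn = (rng.random(n) < p_churn).astype(int)

X = np.c_[usage, complaints]
X_tr, X_te, y_tr, y_te = train_test_split(X, churn, test_size=0.3, random_state=0)
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
print("F1 :", round(f1_score(y_te, clf.predict(X_te)), 3))
print("AUC:", round(roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]), 3))
```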

Keywords: machine learning, gradient boosting, logistic regression, churn, random forest, decision tree, ROC, AUC, F1-score

Procedia PDF Downloads 106
1686 Model-Driven and Data-Driven Approaches for Crop Yield Prediction: Analysis and Comparison

Authors: Xiangtuo Chen, Paul-Henry Cournéde

Abstract:

Crop yield prediction is a paramount issue in agriculture. The main idea of this paper is to find an efficient way to predict the yield of corn from meteorological records. The prediction models used in this paper can be classified into model-driven and data-driven approaches, according to their modeling methodologies. Model-driven approaches are based on mechanistic crop modeling: they describe crop growth in interaction with the environment as a dynamical system. However, calibrating such a dynamical system is difficult, because it turns out to be a multidimensional non-convex optimization problem. An original contribution of this paper is to propose a statistical methodology, Multi-Scenarios Parameters Estimation (MSPE), for the parametrization of potentially complex mechanistic models from a new type of dataset (climatic data and final yield in many situations). It is tested with CORNFLO, a crop model for maize growth. The data-driven approach to yield prediction, on the other hand, is free of the complex biophysical process but has strict requirements on the dataset. A second contribution of the paper is the comparison of the model-driven method with classical data-driven methods. For this purpose, we consider two classes of regression methods: methods derived from linear regression (Ridge and Lasso regression, Principal Components Regression, and Partial Least Squares Regression) and machine learning methods (Random Forest, k-Nearest Neighbors, Artificial Neural Networks, and SVM regression). The dataset consists of 720 records of corn yield at county scale provided by the United States Department of Agriculture (USDA) and the associated climatic data. A 5-fold cross-validation process and two accuracy metrics, root mean square error of prediction (RMSEP) and mean absolute error of prediction (MAEP), were used to evaluate crop prediction capacity.
The results show that among the data-driven approaches, Random Forest is the most robust and generally achieves the best prediction error (MAEP 4.27%). It also outperforms our model-driven approach (MAEP 6.11%). However, calibrating the mechanistic model from easily accessible data offers several side benefits: the mechanistic model can potentially help to identify the stresses suffered by the crop, or the biological parameters of interest for breeding purposes. For this reason, an interesting perspective is to combine the two types of approaches.
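The data-driven comparison can be sketched with scikit-learn: 5-fold cross-validation of a linear (Ridge) and a random forest regressor, scored by mean absolute error. The weather covariates and yield response below are simulated, not the USDA county records.

```python
# 5-fold CV comparison of a linear and a machine-learning regressor for
# yield prediction, scored by mean absolute error. Synthetic data:
# yield depends on rainfall and growing degree days plus noise.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
rain = rng.uniform(300, 900, 720)        # mm over the season (invented)
gdd = rng.uniform(1500, 3000, 720)       # growing degree days (invented)
yield_ = (2 + 0.01 * rain + 0.002 * gdd
          - 1e-6 * (rain - 600) ** 2 + rng.normal(0, 0.5, 720))

X = np.c_[rain, gdd]
models = [("ridge", Ridge()),
          ("rf", RandomForestRegressor(n_estimators=200, random_state=0))]
for name, mdl in models:
    mae = -cross_val_score(mdl, X, yield_, cv=5,
                           scoring="neg_mean_absolute_error").mean()
    print(f"{name:5s} MAE: {mae:.3f}")
```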

Keywords: crop yield prediction, crop model, sensitivity analysis, parameter estimation, particle swarm optimization, random forest

Procedia PDF Downloads 205
1685 Supervised Machine Learning Approach for Studying the Effect of Different Joint Sets on Stability of Mine Pit Slopes Under the Presence of Different External Factors

Authors: Sudhir Kumar Singh, Debashish Chakravarty

Abstract:

Slope stability analysis is an important aspect of geotechnical engineering. It also matters from a safety and economic point of view, as any slope failure leads to loss of valuable lives and damage to property worth millions. This paper aims at mitigating the risk of slope failure by studying the effect of different joint sets on the stability of mine pit slopes under the influence of various external factors, namely degree of saturation, rainfall intensity, and seismic coefficients. A supervised machine learning approach has been utilized for making accurate and reliable predictions about slope stability based on the value of the Factor of Safety. Numerous cases were studied by analyzing slope stability with the popular Finite Element Method, and the data thus obtained were used as training data for the supervised machine learning models. The input data were trained on different supervised machine learning models, namely Random Forest, Decision Tree, Support Vector Machine, and XGBoost. Distinct test data not present in the training data were used to measure the performance and accuracy of the models. Although all models performed well on the test dataset, Random Forest stands out due to its accuracy of greater than 95%, providing a valuable tool that is neither computationally expensive nor time-consuming and is in good accordance with the numerical analysis results.

Keywords: finite element method, geotechnical engineering, machine learning, slope stability

Procedia PDF Downloads 71
1684 Prediction of Live Birth in a Matched Cohort of Elective Single Embryo Transfers

Authors: Mohsen Bahrami, Banafsheh Nikmehr, Yueqiang Song, Anuradha Koduru, Ayse K. Vuruskan, Hongkun Lu, Tamer M. Yalcinkaya

Abstract:

In recent years, we have witnessed an explosion of studies aimed at using a combination of artificial intelligence (AI) and time-lapse imaging data on embryos to improve IVF outcomes. However, despite promising results, no study has used a matched cohort of transferred embryos that differ only in pregnancy outcome, i.e., embryos from a single clinic that are similar in parameters such as morphokinetic condition, patient age, and overall clinic and lab performance. Here, we used time-lapse data on embryos with known pregnancy outcomes to see whether the rich spatiotemporal information embedded in these data would allow prediction of the pregnancy outcome regardless of such critical parameters. Methodology—We did a retrospective analysis of time-lapse data from our IVF clinic, which uses the Embryoscope for 100% of embryo culture to the blastocyst stage, with known clinical outcomes of live birth vs. nonpregnant (embryos with spontaneous abortion outcomes were excluded). We used time-lapse data from 200 elective single-transfer embryos randomly selected from January 2019 to June 2021. Our sample included 100 embryos in each group, with no significant difference in patient age (P=0.9550) or morphokinetic scores (P=0.4032). Data from all patients were combined into a fourth-order tensor, and feature extraction was subsequently carried out by a tensor decomposition methodology. The features were then used in a machine learning classifier to classify the two groups. Major Findings—The performance of the model was evaluated using 100 random subsampling cross-validations (train (80%) - test (20%)). The prediction accuracy, averaged across 100 permutations, exceeded 80%. We also did a random grouping analysis, in which labels (live birth, nonpregnant) were randomly assigned to embryos, which yielded 50% accuracy.
Conclusion—The high accuracy in the main analysis and the low accuracy in the random grouping analysis suggest a consistent spatiotemporal pattern associated with pregnancy outcomes, regardless of patient age and embryo morphokinetic condition, and beyond already known parameters such as early cleavage or early blastulation. Despite the small sample size, this ongoing analysis is the first to show the potential of AI methods in capturing the complex morphokinetic changes embedded in embryo time-lapse data that contribute to successful pregnancy outcomes, regardless of already known parameters. Results on a larger sample, with complementary analysis on the prediction of other key outcomes such as embryo euploidy and aneuploidy, will be presented at the meeting.
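A loose sketch of such a pipeline, simplified to a mode-1 unfolding plus truncated SVD (an HOSVD-style stand-in for the authors' tensor decomposition methodology) and a nearest-centroid classifier, on synthetic data:

```python
# Embryo time-lapse data arranged as a 4th-order tensor
# (embryo x height x width x frame); per-embryo features come from a
# mode-1 unfolding and truncated SVD. All data are synthetic stand-ins,
# and (unlike the study) features here are extracted on the full sample,
# so this illustrates the mechanics only.
import numpy as np

rng = np.random.default_rng(0)
n_embryo, h, w, frames = 40, 8, 8, 10
tensor = rng.normal(size=(n_embryo, h, w, frames))
labels = np.r_[np.zeros(20, int), np.ones(20, int)]
tensor[labels == 1] += 0.5                  # planted group difference

# mode-1 unfolding: one row of h*w*frames voxels per embryo
unfolded = tensor.reshape(n_embryo, -1)
U, s, Vt = np.linalg.svd(unfolded - unfolded.mean(0), full_matrices=False)
features = U[:, :5] * s[:5]                 # 5 leading components per embryo

# nearest-centroid classifier with leave-one-out evaluation
correct = 0
for i in range(n_embryo):
    mask = np.arange(n_embryo) != i
    c0 = features[mask & (labels == 0)].mean(0)
    c1 = features[mask & (labels == 1)].mean(0)
    pred = int(np.linalg.norm(features[i] - c1) < np.linalg.norm(features[i] - c0))
    correct += pred == labels[i]
acc = correct / n_embryo
print("leave-one-out accuracy:", acc)
```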

Keywords: IVF, embryo, machine learning, time-lapse imaging data

Procedia PDF Downloads 64
1683 Asia Pacific University of Technology and Innovation

Authors: Esther O. Adebitan, Florence Oyelade

Abstract:

The Millennium Development Goals (MDGs) were initiated by the UN member nations’ aspiration for the betterment of human life, expressed as a set of numerical and time-bound targets. More recently, the aspiration has shifted from mere achievement to the sustainability of the achieved MDGs beyond the 2015 target. The main objective of this study was to assess how much the hotel industry within the Nigerian Federal Capital Territory (FCT), as a member of the global community, is involved in the achievement of sustainable MDGs within the FCT. The study had two population groups, consisting of 160 hotels and the communities where they are located. A stratified random sampling technique was adopted to select 60 hotels based on a large, medium, and small hotel categorisation, while a simple random sampling technique was used to elicit information from 30 residents of three of the hotels’ host communities. The study was guided by three research questions and two hypotheses aimed at ascertaining whether hotels see the need to be involved in, and have policies in pursuit of, achieving sustained MDGs, and at determining public opinion regarding hotels’ contribution towards the achievement of the MDGs in their communities. A 22-item questionnaire was designed and administered to hotel managers, while an 11-item questionnaire was designed and administered to the hotels’ host communities. Frequency distribution and percentages, as well as Chi-square, were used to analyse the data. Results showed no significant involvement of the hotel industry in achieving sustained MDGs in the FCT and a disconnect between the hotels and their immediate communities. The study recommended that hotels should, as part of their corporate social responsibility, pick at least one of the goals to work on in order to contribute to the attainment of enduring Millennium Development Goals.

Keywords: MDGs, hotels, FCT, host communities, corporate social responsibility

Procedia PDF Downloads 389
1682 The Causes and Effects of Delinquent Behaviour among Students in Juvenile Home: A Case Study of Osun State

Authors: Baleeqs, O. Adegoke, Adeola, O. Aburime

Abstract:

Juvenile delinquency is fast becoming one of the largest problems facing many societies, due to factors ranging from parenting to bullying at school, all of which have led to different theoretical notions by different scholars. Delinquency is illegal or immoral behaviour, especially by a young person who behaves in a way that is illegal or that society does not approve of. The purpose of the study was to investigate the causes and effects of delinquent behaviour among adolescents in a juvenile home in Osun State. A descriptive survey research design was employed. A random sampling technique was used to select 100 adolescents in the juvenile home in Osun State, and questionnaires were developed and given to them. The data collected were analyzed using frequency counts and percentages for the demographic data in section A, while the two research hypotheses postulated for this study were tested using the t-test statistic at a significance level of 0.05. Findings revealed that the most frequently reported school-related effect of delinquent behaviour among adolescents in the juvenile home was aggressive behaviour. Findings also revealed a significant difference in the causes and effects of delinquent behaviour among adolescents in the juvenile home in Osun State, but no significant difference in the causes and effects of delinquent behaviour among secondary school students in Osun based on gender. The following recommendations were made to address the findings of this study: more teachers should be appointed in the observation home so that teaching can be provided to the different age groups of delinquents; developing the infrastructure of short-stay homes and the observation home is a top priority; and counselling sessions at proper intervals are highly essential for these juveniles.

Keywords: behaviour, delinquency, juvenile, random sampling, statistical techniques, survey

Procedia PDF Downloads 161
1681 Bias-Corrected Estimation Methods for Receiver Operating Characteristic Surface

Authors: Khanh To Duc, Monica Chiogna, Gianfranco Adimari

Abstract:

With three diagnostic categories, assessment of the performance of diagnostic tests is achieved by the analysis of the receiver operating characteristic (ROC) surface, which generalizes the ROC curve for binary diagnostic outcomes. The volume under the ROC surface (VUS) is a summary index usually employed for measuring the overall diagnostic accuracy. When the true disease status can be exactly assessed by means of a gold standard (GS) test, unbiased nonparametric estimators of the ROC surface and VUS are easily obtained. In practice, unfortunately, disease status verification via the GS test could be unavailable for all study subjects, due to the expensiveness or invasiveness of the GS test. Thus, often only a subset of patients undergoes disease verification. Statistical evaluations of diagnostic accuracy based only on data from subjects with verified disease status are typically biased. This bias is known as verification bias. Here, we consider the problem of correcting for verification bias when continuous diagnostic tests for three-class disease status are considered. We assume that selection for disease verification does not depend on disease status, given test results and other observed covariates, i.e., we assume that the true disease status, when missing, is missing at random. Under this assumption, we discuss several solutions for ROC surface analysis based on imputation and re-weighting methods. In particular, verification bias-corrected estimators of the ROC surface and of VUS are proposed, namely, full imputation, mean score imputation, inverse probability weighting and semiparametric efficient estimators. Consistency and asymptotic normality of the proposed estimators are established, and their finite sample behavior is investigated by means of Monte Carlo simulation studies. Two illustrations using real datasets are also given.
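Of the corrections listed, the inverse probability weighting (IPW) estimator is the simplest to sketch: only verified subjects enter the VUS estimate, each weighted by the inverse of its verification probability (known here by construction; in practice it would be modelled). The data-generating choices below are purely illustrative.

```python
# IPW-corrected nonparametric VUS: the fraction of (class1, class2, class3)
# triples whose test values are correctly ordered, computed over verified
# subjects weighted by 1 / Pr(verification). Verification depends on the
# test value only, so disease status is missing at random (MAR).
import numpy as np

rng = np.random.default_rng(0)
n = 600
d = rng.integers(1, 4, n)                 # true 3-class disease status
t = d + rng.normal(0, 1.0, n)             # continuous diagnostic test

pi = 1 / (1 + np.exp(-(t - 1)))           # verification probability, MAR
pi = np.clip(pi, 0.1, 1.0)
v = rng.random(n) < pi                    # who received the gold standard

def ipw_vus(t, d, v, pi):
    w = v / pi
    (t1, w1), (t2, w2), (t3, w3) = [
        (t[(d == c) & v], w[(d == c) & v]) for c in (1, 2, 3)]
    correct = ((t1[:, None, None] < t2[None, :, None])
               & (t2[None, :, None] < t3[None, None, :]))
    num = (w1[:, None, None] * w2[None, :, None]
           * w3[None, None, :] * correct).sum()
    return float(num / (w1.sum() * w2.sum() * w3.sum()))

print(f"IPW-corrected VUS estimate: {ipw_vus(t, d, v, pi):.3f}")
```

A naive estimator using verified subjects unweighted would be biased here, since verification is more likely at high test values.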

Keywords: imputation, missing at random, inverse probability weighting, ROC surface analysis

Procedia PDF Downloads 385
1680 Human-Wildlife Conflicts in Urban Areas of Zimbabwe

Authors: Davie G. Dave, Prisca H. Mugabe, Tonderai Mutibvu

Abstract:

Globally, human-wildlife conflicts (HWCs) are on the rise. Such is the case in urban areas of Zimbabwe, yet little has been documented about them. This study was done to provide insights into the occurrence of HWCs in urban areas. The study was carried out in Harare, Bindura, Masvingo, Beitbridge, and Chiredzi to determine the cause, nature, extent, and frequency of occurrence of HWCs, the key wildlife species involved in conflicts, and the management practices used to combat wildlife conflicts in these areas. Several sampling techniques, encompassing multi-stage, stratified random, purposive, and simple random sampling, were employed to place residential areas into three strata according to population density, select residential areas, and select the actual participants. Data were collected through a semi-structured questionnaire and key informant interviews. The results revealed that property destruction and crop damage were the most prevalent conflicts. Of the 15 animals that were cited, snakes, baboons, and monkeys were associated with the most conflicts. The occurrence of HWCs was mainly attributed to the increase in both animal and human populations. To curtail these HWCs, the local people mainly used non-lethal methods, whilst lethal methods were used by the authorities for some of the reported cases. The majority of the conflicts were seasonal and less severe. There were growing concerns among respondents about wildlife conflicts, especially in areas with primates, such as Warren Park in Harare and Limpopo View in Beitbridge. There are HWC hotspots in urban areas, and to ameliorate this, a multi-action approach is needed that includes general awareness campaigns on HWCs and land-use planning that creates green spaces to ease wildlife management.

Keywords: human-wildlife conflicts, mitigation measures, residential areas, types of conflicts, urban areas

Procedia PDF Downloads 30
1679 Important Factors Affecting the Effectiveness of Quality Control Circles

Authors: Sogol Zarafshan

Abstract:

The present study aimed to identify important factors affecting the effectiveness of quality control circles in a hospital and to rank them using a combination of fuzzy VIKOR and Grey Relational Analysis (GRA). The study population consisted of five academic members and five experts in the field of nursing working in a hospital, who were selected using a purposive sampling method. Also, a sample of 107 nurses was selected through a simple random sampling method using their employee codes and a random-number table. The required data were collected using a researcher-made questionnaire consisting of 12 factors. The validity of this questionnaire was confirmed through the opinions of the experts and academic members who participated in the study, as well as through confirmatory factor analysis. Its reliability was also verified (α=0.796). The collected data were analyzed using SPSS 22.0 and LISREL 8.8, as well as the VIKOR–GRA and IPA methods. The ranking of the factors affecting the effectiveness of quality control circles showed that the highest and lowest ranks were held by ‘Managers’ and supervisors’ support’ and ‘Group leadership’, respectively. Also, the highest hospital performance was for factors such as ‘Clear goals and objectives’ and ‘Group cohesiveness and homogeneity’, and the lowest for ‘Reward system’ and ‘Feedback system’. The results showed that although ‘Training the members’, ‘Using the right tools’, and ‘Reward system’ were factors of great importance, the organization’s performance on these factors was poor. Therefore, these factors should receive more attention from the studied hospital’s managers and should be improved as soon as possible.
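The Grey Relational Analysis step can be sketched as follows (a minimal numpy illustration, assuming benefit-type criteria that vary across alternatives and the conventional distinguishing coefficient of 0.5; not the authors' exact procedure):

```python
import numpy as np

def grey_relational_grades(X, rho=0.5):
    """Rank alternatives (rows) on criteria (columns) via Grey Relational
    Analysis. Larger-is-better criteria are assumed; rho is the
    distinguishing coefficient."""
    # Normalise each criterion to [0, 1]
    Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
    ref = Xn.max(axis=0)                 # ideal reference sequence
    delta = np.abs(Xn - ref)             # deviation sequences
    dmin, dmax = delta.min(), delta.max()
    # Grey relational coefficients, then average per alternative
    xi = (dmin + rho * dmax) / (delta + rho * dmax)
    return xi.mean(axis=1)               # grey relational grade per row
```

The alternative with the highest grade is closest to the ideal reference; in the study's setting the "alternatives" would be the 12 candidate factors scored against the ranking criteria.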

Keywords: Quality control circles, Fuzzy VIKOR, Grey Relational Analysis, Importance–Performance Analysis

Procedia PDF Downloads 107
1678 The Value of Routine Terminal Ileal Biopsies for the Investigation of Diarrhea

Authors: Swati Bhasin, Ali Ahmed, Valence Xavier, Ben Liu

Abstract:

Aims: Diarrhea is a frequent cause of referral from general practitioners to the gastroenterology and surgical teams. To establish a diagnosis, these patients undergo colonoscopy. The current practice at our district general hospital is to perform random left and right colonic biopsies. National guidelines issued by the British Society of Gastroenterology advise that all patients presenting with chronic diarrhea should have an ileoscopy as an indicator of colonoscopy completion. Our primary aim was to check whether terminal ileum (TI) biopsy is required to establish a diagnosis of inflammatory bowel disease (IBD). Methods: Data were collected retrospectively from November 2018 to November 2019. The target population were patients who underwent colonoscopy for diarrhea. Demographic data and endoscopic and histology findings of the TI were assessed and analyzed. Results: 140 patients with a mean age of 57 years (19-84) underwent a colonoscopy (M:F; 1:2.3). 92 patients had random colonic biopsies taken, and based on the histological results of these, 15 patients (16%) were diagnosed with IBD. The TI was successfully intubated in 40 patients, of whom 32 also had colonic biopsies taken; 8 patients did not have a colonic biopsy taken. Macroscopic abnormality in the TI was detected in 5 patients, all of whom were biopsied. Based on the histological results of the biopsy, 3 patients (12%) were diagnosed with IBD. All 3 of these patients also had colonic biopsies taken simultaneously, which showed inflammation. None of the patients had a diagnosis of IBD confirmed on TI intubation alone where colonic biopsies were not done, and none had a diagnosis of IBD confirmed on TI intubation alone where colonic biopsies were negative.
Conclusion: TI intubation is a highly skilled, time-consuming procedure with a higher risk of perforation which, as per our study, has little additional diagnostic value in finding IBD for symptoms of diarrhea if colonic biopsies are taken. We propose that diarrhea is a colonic symptom; therefore, colonic biopsies are positive for inflammation if the diarrhea is secondary to IBD. We conclude that IBD can be diagnosed simply with colonic biopsies.

Keywords: biopsy, colon, IBD, terminal ileum

Procedia PDF Downloads 96
1677 Assessing and Identifying Factors Affecting Customers Satisfaction of Commercial Bank of Ethiopia: The Case of West Shoa Zone (Bako, Gedo, Ambo, Ginchi and Holeta), Ethiopia

Authors: Habte Tadesse Likassa, Bacha Edosa

Abstract:

Customer satisfaction is vital for banks to survive, be productive, and succeed in any organization and business area. The main goal of the study was to assess and identify the factors that influence customer satisfaction in the West Shoa Zone branches of the Commercial Bank of Ethiopia (Holeta, Ginchi, Ambo, Gedo and Bako). A stratified random sampling procedure was used, and 520 customers were drawn from the target population by simple random sampling (lottery method). Sample sizes for each bank branch were allocated using probability-proportional-to-size techniques. Both descriptive and inferential statistical methods were used in the study. A binary logistic regression model was fitted to assess the significance of factors affecting customer satisfaction. The SPSS statistical package was used for data analysis. The results reveal that the overall level of customer satisfaction in the study area is low (38.85%) compared with those who were not satisfied (61.15%). Almost all factors included in the study were significantly associated with customer satisfaction. Based on the comparison of branches by odds ratios, customers of the Ambo and Bako branches were less satisfied than customers of the Holeta branch, while customers of the Ginchi and Gedo branches were more satisfied than those of Holeta. Since the level of customer satisfaction was low in the study area, it is recommended that the concerned bodies work cooperatively to maximize the satisfaction of their customers.
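The branch comparisons rest on odds ratios from the fitted logistic model. As a simplified illustration (a hypothetical helper working from raw 2x2 satisfaction counts, not the study's SPSS output), an odds ratio with a Wald confidence interval can be computed directly:

```python
import numpy as np

def odds_ratio_ci(satisfied_a, n_a, satisfied_b, n_b, z=1.96):
    """Odds ratio of satisfaction in branch A relative to branch B,
    with a Wald 95% confidence interval on the log-odds scale."""
    a, b = satisfied_a, n_a - satisfied_a   # branch A: satisfied / not
    c, d = satisfied_b, n_b - satisfied_b   # branch B: satisfied / not
    or_ = (a / b) / (c / d)
    se = np.sqrt(1/a + 1/b + 1/c + 1/d)     # SE of log odds ratio
    lo = np.exp(np.log(or_) - z * se)
    hi = np.exp(np.log(or_) + z * se)
    return or_, lo, hi
```

An interval excluding 1 indicates a significant difference in satisfaction odds between the two branches; the logistic model extends the same idea while adjusting for the other factors.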

Keywords: customers, satisfaction, binary logistic, complain handling process, waiting time

Procedia PDF Downloads 429
1676 Mediation Role of Teachers’ Surface Acting and Deep Acting on the Relationship between Calling Orientation and Work Engagement

Authors: Yohannes Bisa Biramo

Abstract:

This study examined the mediational role of surface acting and deep acting on the relationship between calling orientation and work engagement of teachers in secondary schools of Wolaita Zone, Wolaita, Ethiopia. A predictive non-experimental correlational design was used with 300 secondary school teachers. Stratified random sampling followed by a systematic random sampling technique was used to select samples from the target population. To analyze the data, Structural Equation Modeling (SEM) was used to test the associations between the independent variables and the dependent variables. Furthermore, the goodness of fit of the study variables was tested using SEM to explain the path influence of the independent variables on the dependent variable. Confirmatory factor analysis (CFA) was conducted to test the validity of the scales in the study and to assess the measurement model fit indices. The analysis revealed that calling was significantly and positively correlated with surface acting, deep acting, and work engagement. Similarly, surface acting was significantly and positively correlated with deep acting and work engagement, and deep acting was significantly and positively correlated with work engagement. With respect to mediation analysis, the results revealed that surface acting mediated the relationship between calling and work engagement, and deep acting likewise mediated the relationship between calling and work engagement. Moreover, by using the model of the present study, school leaders and practitioners can identify core areas to be considered in recruiting and deploying teachers, in giving induction training for newly employed teachers, and in performance appraisal.
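The mediation logic, where the indirect effect is the product of the path from the predictor to the mediator (a) and the path from the mediator to the outcome (b), can be sketched with observed variables only. This is a simple Sobel-test illustration under that simplification, not the latent-variable SEM the study actually fitted:

```python
import numpy as np

def ols(X, t):
    """OLS with intercept; returns coefficients and their standard errors."""
    X1 = np.column_stack([np.ones(len(t)), X])
    beta = np.linalg.solve(X1.T @ X1, X1.T @ t)
    resid = t - X1 @ beta
    s2 = resid @ resid / (len(t) - X1.shape[1])      # residual variance
    se = np.sqrt(np.diag(s2 * np.linalg.inv(X1.T @ X1)))
    return beta, se

def sobel_indirect(x, m, y):
    """Indirect (mediated) effect a*b with its Sobel z statistic."""
    b_m, se_m = ols(x[:, None], m)                   # m ~ x      -> a path
    b_y, se_y = ols(np.column_stack([x, m]), y)      # y ~ x + m  -> b path
    a, sa = b_m[1], se_m[1]
    b, sb = b_y[2], se_y[2]
    ab = a * b                                       # indirect effect
    z = ab / np.sqrt(b**2 * sa**2 + a**2 * sb**2)    # Sobel test
    return ab, z
```

In the study's terms, x would be calling orientation, m surface or deep acting, and y work engagement; SEM additionally models measurement error in these constructs.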

Keywords: calling, surface acting, deep acting, work engagement, mediation, teachers

Procedia PDF Downloads 53
1675 The Microbial Evaluation of Raw Cow Milk Used in Private Dairy Factories in Zawia City, Libya

Authors: Obied A. Alwan, Elgerbi, M. Ali

Abstract:

This study was conducted on the raw cow milk used in the local dairy factories of Zawia. Samples were collected by completely random, unscheduled sampling. The microbiological results showed that total bacterial counts and E. coli counts were very high, and that all the manufacturing facilities included in the study lacked adequate hygienic conditions.

Keywords: raw milk, dairy factories, Libya, microbiologic

Procedia PDF Downloads 411
1674 Early Gastric Cancer Prediction from Diet and Epidemiological Data Using Machine Learning in Mizoram Population

Authors: Brindha Senthil Kumar, Payel Chakraborty, Senthil Kumar Nachimuthu, Arindam Maitra, Prem Nath

Abstract:

Gastric cancer is predominantly caused by demographic and diet factors as compared to other cancer types. The aim of the study is to predict Early Gastric Cancer (EGC) from diet and lifestyle factors using supervised machine learning algorithms. For this study, 160 healthy individuals and 80 cases were selected who had been followed for 3 years (2016-2019) at Civil Hospital, Aizawl, Mizoram. A dataset containing 11 features that are core risk factors for gastric cancer was extracted. Supervised machine learning algorithms: Logistic Regression, Naive Bayes, Support Vector Machine (SVM), Multilayer Perceptron, and Random Forest were used to analyze the dataset using Python Jupyter Notebook version 3. The classification results were evaluated using the metrics: minimum false positives, Brier score, accuracy, precision, recall, F1 score, and the Receiver Operating Characteristic (ROC) curve. With respect to accuracy (in percent) and Brier score, the results were: Naive Bayes - 88, 0.11; Random Forest - 83, 0.16; SVM - 77, 0.22; Logistic Regression - 75, 0.25; and Multilayer Perceptron - 72, 0.27. The Naive Bayes algorithm outperforms the others, with a very low false positive rate, a low Brier score, and good accuracy. Its classification results in predicting EGC were very satisfactory using only diet and lifestyle factors, which will be very helpful for physicians in educating patients and the public; thereby, mortality from gastric cancer can be reduced or avoided with this knowledge mining work.
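A minimal sketch of the winning approach, assuming binary-coded risk factors (a hand-rolled Bernoulli Naive Bayes with Laplace smoothing plus a Brier score; the authors' actual pipeline and features are not reproduced here):

```python
import numpy as np

def fit_bernoulli_nb(X, y, alpha=1.0):
    """Bernoulli Naive Bayes for binary risk factors, Laplace-smoothed."""
    classes = np.unique(y)
    priors = np.array([(y == c).mean() for c in classes])
    # P(feature = 1 | class), smoothed to avoid zero probabilities
    theta = np.array([(X[y == c].sum(axis=0) + alpha) /
                      ((y == c).sum() + 2 * alpha) for c in classes])
    return classes, priors, theta

def predict_proba(X, classes, priors, theta):
    """Posterior class probabilities: log prior + sum of log likelihoods."""
    log_lik = X @ np.log(theta).T + (1 - X) @ np.log(1 - theta).T
    log_post = log_lik + np.log(priors)
    post = np.exp(log_post - log_post.max(axis=1, keepdims=True))
    return post / post.sum(axis=1, keepdims=True)

def brier_score(y_true, p_pos):
    """Mean squared error of predicted positive-class probabilities."""
    return np.mean((p_pos - y_true) ** 2)
```

Lower Brier scores indicate better-calibrated probabilities, which is why the abstract reports it alongside accuracy.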

Keywords: Early Gastric cancer, Machine Learning, Diet, Lifestyle Characteristics

Procedia PDF Downloads 118
1673 Economic Decision Making under Cognitive Load: The Role of Numeracy and Financial Literacy

Authors: Vânia Costa, Nuno De Sá Teixeira, Ana C. Santos, Eduardo Santos

Abstract:

Financial literacy and numeracy have been regarded as paramount for rational household decision making amid the increasing complexity of financial markets. However, financial decisions are often made under sub-optimal circumstances, including cognitive overload. The present study aims to clarify how financial literacy and numeracy, taken as relevant expert knowledge for financial decision-making, modulate possible effects of cognitive load. Participants were required to choose between a sure loss and a gamble pertaining to a financial investment, either with or without a competing memory task. Two experiments were conducted, varying only the content of the competing task. In the first, the financial choice task was performed while maintaining a list of five random letters in working memory. In the second, cognitive load was based upon the retention of six random digits. In both experiments, one of the items in the list had to be recalled given its serial position. Outcomes of the first experiment revealed no significant main effect or interactions involving the cognitive load manipulation and numeracy and financial literacy skills, strongly suggesting that retaining a list of random letters did not interfere with the cognitive abilities required for financial decision making. Conversely, in the second experiment, a significant interaction between the competing mnemonic task and level of financial literacy (but not numeracy) was found for the frequency of choice of the gambling option. Overall, in the control condition, both participants with high financial literacy and those with high numeracy were more prone to choose the gambling option. However, when under cognitive load, participants with high financial literacy were as likely as their less literate counterparts to choose the gambling option.
This outcome is interpreted as evidence that financial literacy prevents intuitive risk-aversion reasoning only under highly favourable conditions, as is the case when no other task is competing for cognitive resources. In contrast, participants with higher levels of numeracy were consistently more prone to choose the gambling option in both experimental conditions. These results are discussed in the light of the opposition between classical dual-process theories and fuzzy-trace theories for intuitive decision making, suggesting that while some instances of expertise (as numeracy) are prone to support easily accessible gist representations, other expert skills (as financial literacy) depend upon deliberative processes. It is furthermore suggested that this dissociation between types of expert knowledge might depend on the degree to which they are generalizable across disparate settings. Finally, applied implications of the present study are discussed with a focus on how it informs financial regulators and the importance and limits of promoting financial literacy and general numeracy.

Keywords: decision making, cognitive load, financial literacy, numeracy

Procedia PDF Downloads 148
1672 Application of Rapidly Exploring Random Tree Star-Smart and G2 Quintic Pythagorean Hodograph Curves to the UAV Path Planning Problem

Authors: Luiz G. Véras, Felipe L. Medeiros, Lamartine F. Guimarães

Abstract:

This work approaches the automatic planning of paths for Unmanned Aerial Vehicles (UAVs) through the application of the Rapidly Exploring Random Tree Star-Smart (RRT*-Smart) algorithm. RRT*-Smart samples positions of a navigation environment through a tree-type graph. The algorithm consists of randomly expanding a tree from an initial position (root node) until one of its branches reaches the final position of the path to be planned. The algorithm ensures the planning of the shortest path as the number of iterations tends to infinity. When a new node is inserted into the tree, each neighbor of the new node is reconnected to it if and only if the extension of the path between the root node and that neighbor, with this new connection, is less than the current extension of the path between those two nodes. RRT*-Smart uses an intelligent sampling strategy to plan less extensive routes using a smaller number of iterations. This strategy is based on the creation of samples/nodes near the convex vertices of the obstacles in the navigation environment. The planned paths are smoothed through the application of quintic Pythagorean hodograph curves. The smoothing process converts a route into a dynamically viable one based on the kinematic constraints of the vehicle. This smoothing method models the hodograph components of a curve with polynomials that obey the Pythagorean theorem. Its advantage is that the obtained structure allows computation of the curve length in an exact way, without the need for quadrature techniques for the resolution of integrals.
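The rewiring rule described above can be sketched as follows (hypothetical data structures: a child-to-parent dictionary and a cost map; this is one fragment of RRT*, not the authors' implementation):

```python
import math

def rewire(parent, cost, new, neighbors, dist):
    """RRT* rewiring step: after inserting node `new`, reconnect each
    neighbor through `new` whenever that shortens its path from the root.
    parent: child -> parent map; cost: node -> path length from the root."""
    for nb in neighbors:
        via_new = cost[new] + dist(new, nb)
        if via_new < cost[nb]:      # shorter path found through the new node
            parent[nb] = new
            cost[nb] = via_new
    return parent, cost
```

It is this local reconnection, applied at every insertion, that gives RRT* its asymptotic optimality over plain RRT.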

Keywords: path planning, path smoothing, Pythagorean hodograph curve, RRT*-Smart

Procedia PDF Downloads 145
1671 The Effect of Group Counseling on the Victimhood Perceptions of Adolescent Who Are the Subject of Peer Victimization and on Their Coping Strategies

Authors: İsmail Seçer, Taştan Seçer

Abstract:

In this study, the effect of group counseling on the victimhood perceptions of primary school 7th and 8th grade students who were determined to be subjected to peer victimization, and on their coping strategies, was analyzed. The research model is the Solomon four-group experimental model. In this model, there are four groups determined by random sampling: two were used as experimental groups and the other two as control groups. The Solomon model is defined as a true experimental model. In true experimental models, there are multiple groups consisting of subjects with similar characteristics, and selection of the subjects is done by random sampling. For this purpose, 230 students from Kültür Kurumu Primary School in Erzurum were asked to fill in the Adolescent Peer Victim Form. 100 students whose victimization scores were higher and who were determined to be subjected to bullying were interviewed face to face, informed about the study, and asked whether they were willing to participate. As a result of these interviews, 60 students were selected for the experimental study, and four groups of 15 people were created with the simple random sampling method. After the groups had been formed, the experimental and control groups were determined by drawing lots. An 11-session group counseling program, prepared by the researcher according to the literature, was then applied. The purpose of the group counseling was to change the participants' ineffective coping strategies and their victimhood perceptions. Each session was planned to be 75 minutes and applied as planned. In the control groups, the counseling activities in the primary school counseling curriculum were applied for 11 weeks.
As a result of the study, the physical, emotional, and verbal victimhood perceptions of the participants in the experimental groups decreased significantly compared to the pre-experimental situation and to those in the control groups. Besides, it was determined that this change in the victimhood perceptions of the experimental groups occurred independently of variables such as gender, age, and academic success. The first finding related to coping strategies is that the scores of the participants in the experimental groups on ineffective coping strategies, such as despair and avoidance, decreased significantly compared to the pre-experimental situation and to those in the control groups. The second finding is that their scores on effective coping strategies, such as seeking help, consulting social support, resistance, and optimism, increased significantly compared to the pre-experimental situation and to those in the control groups. According to the evidence obtained through the study, it can be said that group counseling is an effective approach to change the victimhood perceptions and coping strategies of individuals who are subjected to bullying.

Keywords: bullying, perception of victimization, coping strategies, ancova analysis

Procedia PDF Downloads 353
1670 Alcohol-Containing versus Aqueous-Based Solutions for Skin Preparation in Abdominal Surgery: A Systematic Review and Meta-Analysis

Authors: Dimitra V. Peristeri, Hussameldin M. Nour, Amiya Ahsan, Sameh Abogabal, Krishna K. Singh, Muhammad Shafique Sajid

Abstract:

Introduction: The use of optimal skin antiseptic agents for the prevention of surgical site infection (SSI) is of critical importance, especially during abdominal surgical procedures. Alcohol-based chlorhexidine gluconate (CHG) and aqueous-based povidone-iodine (PVI) are the two most commonly used skin antiseptics today. The objective of this article is to evaluate the effectiveness of alcohol-based CHG versus aqueous-based PVI for skin preparation before abdominal surgery in reducing SSIs. Methods: Standard medical databases such as MEDLINE, Embase, PubMed, and the Cochrane Library were searched for randomised controlled trials (RCTs) comparing alcohol-based CHG skin preparation versus aqueous-based PVI in patients undergoing abdominal surgery. The combined outcomes for SSIs were calculated using an odds ratio (OR) with 95% confidence intervals (95% CI). All data were analysed using Review Manager (RevMan) software, version 5.4, and the meta-analysis was performed with a random-effects model. Results: A total of 11 studies, all RCTs, were included (n = 12072 participants), recruiting adult patients undergoing abdominal surgery. In the random-effects analysis, the use of alcohol-based CHG in patients undergoing abdominal surgery was associated with a reduced risk of SSI compared to aqueous-based PVI (OR: 0.84; 95% CI [0.74, 0.96], z = 2.61, p = 0.009). Conclusion: Alcohol-based CHG may be more effective in preventing SSI than aqueous-based PVI agents in abdominal surgery. The conclusion of this meta-analysis may add guiding value to reinforce current clinical practice guidelines.
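The random-effects pooling can be illustrated with a DerSimonian-Laird estimator that reconstructs each study's log-OR standard error from its 95% CI (a numpy sketch, not the RevMan computation; per-study inputs here are hypothetical):

```python
import numpy as np

def dl_random_effects_or(ors, ci_low, ci_high, z=1.96):
    """DerSimonian-Laird random-effects pooled odds ratio from
    per-study ORs and their 95% confidence intervals."""
    y = np.log(np.asarray(ors, dtype=float))
    # Back out the SE of each log-OR from the CI width
    se = (np.log(np.asarray(ci_high, float)) -
          np.log(np.asarray(ci_low, float))) / (2 * z)
    w = 1.0 / se**2                           # fixed-effect weights
    ybar = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - ybar)**2)             # Cochran's heterogeneity Q
    tau2 = max(0.0, (Q - (len(y) - 1)) /
               (np.sum(w) - np.sum(w**2) / np.sum(w)))
    wr = 1.0 / (se**2 + tau2)                 # random-effects weights
    mu = np.sum(wr * y) / np.sum(wr)
    se_mu = np.sqrt(1.0 / np.sum(wr))
    return np.exp(mu), np.exp(mu - z * se_mu), np.exp(mu + z * se_mu)
```

When between-study heterogeneity (tau2) is zero, this reduces to the fixed-effect inverse-variance pooled estimate.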

Keywords: skin preparation, surgical site infection, chlorhexidine, skin antiseptics

Procedia PDF Downloads 71
1669 Derivation of a Risk-Based Level of Service Index for Surface Street Network Using Reliability Analysis

Authors: Chang-Jen Lan

Abstract:

The current Level of Service (LOS) index adopted in the Highway Capacity Manual (HCM) for signalized intersections on surface streets is based on the average intersection delay. The delay thresholds for defining LOS grades are subjective and unrelated to critical traffic conditions. For example, an intersection delay of 80 sec per vehicle for the failing LOS grade F does not necessarily correspond to the intersection capacity. Also, a specific value of average delay may result from delay minimization, delay equalization, or other meaningful optimization criteria. To that end, a reliability version of the intersection critical degree of saturation (v/c) is introduced as the LOS index. Traditionally, the degree of saturation at a signalized intersection is defined as the ratio of the critical volume sum (per lane) to the average saturation flow (per lane) during all available effective green time within a cycle. The critical sum is the sum of the maximal conflicting movement-pair volumes in the northbound-southbound and eastbound-westbound rights of way. In this study, both movement volume and saturation flow are assumed to follow log-normal distributions. Because, when the conditions of the central limit theorem hold, multiplication of independent, positive random variables tends to produce a log-normally distributed outcome in the limit, the critical degree of saturation is expected to follow a log-normal distribution as well. Derivation of the risk index predictive limits is complex due to the maximum and absolute value operators, as well as the ratio of random variables. A fairly accurate functional form for the predictive limit at a user-specified significance level is derived. The predictive limit is then compared with the designated LOS thresholds for the intersection critical degree of saturation (denoted as X
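The log-normal ratio construction can be checked numerically. The sketch below (hypothetical parameter values; it omits the maximum over conflicting movement pairs that complicates the paper's derivation) estimates the predictive limit of v/c by Monte Carlo; since the ratio of two log-normals is itself log-normal, the result can be compared against the closed-form quantile exp((mu_v - mu_s) + z * sqrt(sig_v^2 + sig_s^2)):

```python
import numpy as np

def critical_vc_limit(mu_v, sig_v, mu_s, sig_s, alpha=0.05,
                      n=200_000, seed=1):
    """Monte Carlo upper (1-alpha) predictive limit of the critical degree
    of saturation X = critical volume sum / average saturation flow,
    with both inputs log-normally distributed."""
    rng = np.random.default_rng(seed)
    v = rng.lognormal(mu_v, sig_v, n)   # critical volume sum (per lane)
    s = rng.lognormal(mu_s, sig_s, n)   # average saturation flow (per lane)
    return np.quantile(v / s, 1 - alpha)
```

Agreement between the simulated and closed-form quantiles confirms that the ratio stays log-normal, which is what makes an analytical predictive limit tractable.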

Keywords: reliability analysis, level of service, intersection critical degree of saturation, risk based index

Procedia PDF Downloads 109
1668 Count Regression Modelling on Number of Migrants in Households

Authors: Tsedeke Lambore Gemecho, Ayele Taye Goshu

Abstract:

The main objective of this study is to identify the determinants of the number of international migrants in a household and to compare regression models for the count response. Data were collected from a total of 2288 household heads in 16 randomly sampled districts of the Hadiya and Kembata-Tembaro zones of Southern Ethiopia. Poisson mixed models, as special cases of the generalized linear mixed model, are explored to determine the effects of the predictors: age of household head, farm land size, and household size. Two ethnicities, Hadiya and Kembata, are included in the final model as dummy variables. Stepwise variable selection identified four predictors: age of head, farm land size, family size, and the dummy variable ethnic2 (0=other, 1=Kembata). These predictors are significant at the 5% significance level for the count response, number of migrants. The final Poisson mixed model consists of the four predictors with random effects for districts. The area-specific random effects are significant, with a variance of about 0.5105 and a standard deviation of 0.7145. The results show that the number of migrants increases with the head's age, family size, and farm land size. In conclusion, there is a significantly high number of international migrants per household in the area. Age of household head, family size, and farm land size are determinants that increase the number of international migrants in households. Community-based intervention is needed to monitor and regulate international migration for the benefit of society.
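Dropping the district random effects for brevity, a fixed-effects Poisson log-linear regression can be fitted by iteratively reweighted least squares (a numpy sketch under that simplification, not the authors' mixed model; variable names are illustrative):

```python
import numpy as np

def poisson_irls(X, y, iters=25):
    """Poisson regression with a log link, fitted by iteratively
    reweighted least squares (Newton-Raphson).
    X: design matrix including an intercept column; y: counts."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ beta)             # fitted means under the log link
        z = X @ beta + (y - mu) / mu      # working response
        # Weighted least squares step with working weights mu
        beta = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))
    return beta
```

A mixed model would add a district-level intercept with its own variance component (the abstract's 0.5105), shrinking district effects toward the overall mean.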

Keywords: Poisson regression, GLM, number of migrant, Hadiya and Kembata Tembaro zones

Procedia PDF Downloads 262
1667 Significance of Personnel Recruitment in Implementation of Computer Aided Design Curriculum of Architecture Schools

Authors: Kelechi E. Ezeji

Abstract:

The inclusion of relevant content in the curricula of architecture schools is vital for the attainment of Computer Aided Design (CAD) proficiency by graduates. Implementing this content involves, among other variables, the presence of competent tutors. Consequently, this study sought to investigate the importance of personnel recruitment for the inclusion of content vital to the implementation of CAD in the curriculum for architecture education, with a view to developing a framework for appropriate implementation of a CAD curriculum. It focused on departments of architecture in universities in south-east Nigeria which have been accredited by the National Universities Commission. A survey research design was employed. Data were obtained from sources within the study area using questionnaires, personal interviews, physical observation/enumeration, and examination of institutional documents. A multi-stage stratified random sampling method was adopted. The first stage of stratification involved random sampling of the departments by balloting. The second stage involved obtaining the respondents' population from the number of staff and students of the sample population. The chi-square test for nominal variables and Pearson's product-moment correlation test for interval variables were used for data analysis. With p < 0.05, the study found a significant correlation between the number of CAD-literate academic staff and the use of CAD in design studio assignments, and that an increase in the overall number of teaching staff significantly affected the total CAD credit units in the curriculum of the department. The implications of these findings are that, for successful implementation leading to the attainment of CAD proficiency, CAD literacy should be a factor in the recruitment of staff, and a policy of in-house training should be pursued.

Keywords: computer-aided design, education, personnel recruitment, curriculum

Procedia PDF Downloads 181