Search results for: cox proportional hazard regression
3989 The Study of Flood Resilient House in Ebo-Town
Authors: Alagie Salieu Nankey
Abstract:
The flood-resistant house is the key mechanism for withstanding flood hazards in Ebo-Town. It has emerged as a simple yet powerful way of mitigating flooding in the community of Ebo-Town. Even though there are different types of buildings, little is known yet about how and why floods affect buildings severely. In this paper, we examine three different types of flood-resistant buildings that are suitable for Ebo-Town. We gathered content and contextual features from six (6) respondents and used this data set to identify factors that are significantly associated with the flood-resistant house. Moreover, we built a suitable design concept. We found that, amongst all the options studied in the literature, the stilt or elevated house is the most suitable building design in Ebo-Town and the pile foundation is the most appropriate foundation type in the study area. Amongst contextual features, local materials are the most economical materials for the proposed design. This research proposes a framework that explains the theoretical relationships between flood hazard zones and flood-resistant houses in Ebo-Town. Moreover, this research informs the design of sense-making and analytics tools for the flood-resistant house.
Keywords: flood-resistant, stilt, flood hazard zone, pile foundation
Procedia PDF Downloads 44
3988 Multiobjective Optimization of a Pharmaceutical Formulation Using Regression Method
Authors: J. Satya Eswari, Ch. Venkateswarlu
Abstract:
The formulation of a commercial pharmaceutical product involves several composition factors and response characteristics. When the formulation requires to satisfy multiple response characteristics which are conflicting, an optimal solution requires the need for an efficient multiobjective optimization technique. In this work, a regression is combined with a non-dominated sorting differential evolution (NSDE) involving Naïve & Slow and ε constraint techniques to derive different multiobjective optimization strategies, which are then evaluated by means of a trapidil pharmaceutical formulation. The analysis of the results show the effectiveness of the strategy that combines the regression model and NSDE with the integration of both Naïve & Slow and ε constraint techniques for Pareto optimization of trapidil formulation. With this strategy, the optimal formulation at pH=6.8 is obtained with the decision variables of micro crystalline cellulose, hydroxypropyl methylcellulose and compression pressure. The corresponding response characteristics of rate constant and release order are also noted down. The comparison of these results with the experimental data and with those of other multiple regression model based multiobjective evolutionary optimization strategies signify the better performance for optimal trapidil formulation.Keywords: pharmaceutical formulation, multiple regression model, response surface method, radial basis function network, differential evolution, multiobjective optimization
Procedia PDF Downloads 409
3987 PM₁₀ and PM₂.₅ Concentrations in Bangkok over Last 10 Years: Implications for Air Quality and Health
Authors: Tin Thongthammachart, Wanida Jinsart
Abstract:
Atmospheric particulate matter with a diameter of less than 10 microns (PM₁₀) and less than 2.5 microns (PM₂.₅) has adverse health effects. The impact of PM was studied from both health and regulatory perspectives. Ambient PM data were collected over ten years, from 2007 to 2017, in Bangkok and the vicinity areas of Thailand. Statistical models were used to forecast PM concentrations from 2018 to 2020. Monthly averaged monitoring concentrations of PM₁₀ and PM₂.₅ were used as input to forecast the monthly average concentration of PM. The forecasting results were validated by the root mean square error (RMSE). The predicted results were used to determine the hazard risk for carcinogenic disease. The health risk values were interpolated in GIS with the ordinary kriging technique to create hazard maps of Bangkok and the vicinity area. The GIS-based maps illustrated the variability of PM distribution and high-risk locations. These evaluated results could support national policy for the sake of human health.
Keywords: PM₁₀, PM₂.₅, statistical models, atmospheric particulate matter
Procedia PDF Downloads 159
3986 A Correlations Study on Nursing Staff's Shifts Systems, Workplace Fatigue, and Quality of Working Life
Authors: Jui Chen Wu, Ming Yi Hsu
Abstract:
Background and Purpose: Shift work of nursing staff is inevitable in hospitals to provide continuing medical care. However, shift work is considered a health hazard that may cause physical and psychological problems. Serious workplace fatigue from nursing shift work might impact family, social and work life and, moreover, cause a serious reduction in the quality of medical care, or even malpractice. This study aims to explore relationships among nursing staff's shifts, workplace fatigue and quality of working life. Method: Structured questionnaires were used in this study to explore relationships among shift work, workplace fatigue and quality of working life in nursing staff. We recruited 590 nursing staff in different community teaching hospitals in Taiwan. Data were analysed by descriptive statistics, single-sample t-test, single-factor analysis, Pearson correlation coefficient and hierarchical regression. Results: The overall workplace fatigue score is 50.59 points. In further analysis, the scores of personal burnout, work-related burnout, over-commitment and client-related burnout are 57.86, 53.83, 45.95 and 44.71. Workplace fatigue differs significantly with basic attributes of nursing staff such as age, license, sleeping quality, self-conscious health status, number of chronic-disease patients cared for and number of people cared for in the obstetric ward. The shift variables revealed no significant influence on workplace fatigue in the hierarchical regression analysis. Regarding the analysis of nursing staff's basic attributes and shifts on the quality of working life, descriptive results show that the overall quality of working life of nursing staff is 3.23 points. Comparing the average scores of the six aspects, the ranked average scores are 3.47 (SD = .43) for interrelationship, 3.40 (SD = .46) for self-actualisation, 3.30 (SD = .40) for self-efficacy, 3.15 (SD = .38) for vocational concept, 3.07 (SD = .37) for work aspects, and 3.02 (SD = .56) for organization aspects. Quality of working life also differs significantly with basic attributes such as marital status, education level, years of nursing work, occupation area, sleep quality, self-conscious health status and number of patients cared for in the medical ward. There are significant differences in the quality of working life across shift mode and shift rate. The results of the hierarchical regression analysis reveal that one of the shift variables, 'shift mode', does affect staff's quality of working life. Workplace fatigue is negatively correlated with the quality of working life, and the over-commitment dimension of workplace fatigue is positively related to the vocational concept dimension of the quality of working life. According to the regression analysis of nursing staff's basic attributes, shift mode and workplace fatigue on quality of working life, workplace fatigue has a significant impact on nursing staff's quality of working life. Conclusion: According to our study, shift work is correlated with workplace fatigue in nursing staff. These results serve as an important reference for human resources management in hospitals in establishing a more positive and healthy work arrangement policy.
Keywords: nursing staff, shift, workplace fatigue, quality of working life
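For readers unfamiliar with the hierarchical (blockwise) regression mentioned above, a minimal Python sketch of the idea follows; the survey file and column names (fatigue, age, sleep_quality, shift_mode, shift_rate) are hypothetical placeholders, not the authors' data.

```python
# Minimal sketch (not the authors' code): hierarchical regression in which
# attribute controls enter first and shift variables are added as a second
# block; the change in R-squared and a nested-model F-test indicate the
# added explanatory value of the shift block.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("nursing_survey.csv")  # hypothetical data file

m1 = smf.ols("fatigue ~ age + sleep_quality", data=df).fit()            # block 1: attributes
m2 = smf.ols("fatigue ~ age + sleep_quality + C(shift_mode) + shift_rate",
             data=df).fit()                                              # block 2: + shift variables

print(f"Block 1 R^2 = {m1.rsquared:.3f}")
print(f"Block 2 R^2 = {m2.rsquared:.3f}  (change = {m2.rsquared - m1.rsquared:.3f})")
print(m2.compare_f_test(m1))  # F-statistic, p-value, df difference for the added block
```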
Procedia PDF Downloads 272
3985 Multi-Linear Regression Based Prediction of Mass Transfer by Multiple Plunging Jets
Abstract:
The paper aims to compare the performance of vertical and inclined multiple plunging jets and to model and predict their mass transfer capacity by a multi-linear regression based approach. The multiple vertical plunging jets have a jet impact angle of θ = 90°, whereas the multiple inclined plunging jets have a jet impact angle of θ = 60°. The results of the study suggest that mass transfer is higher for multiple jets, and inclined multiple plunging jets have up to 1.6 times higher mass transfer than vertical multiple plunging jets under similar conditions. The derived relationship, based on the multi-linear regression approach, has successfully predicted the volumetric mass transfer coefficient (KLa) from the operational parameters of multiple plunging jets with a correlation coefficient of 0.973, a root mean square error of 0.002 and a coefficient of determination of 0.946. The results suggest that the predicted overall mass transfer coefficient is in good agreement with actual experimental values, thereby demonstrating the utility of the derived multi-linear regression relationship, which can be successfully employed in modelling mass transfer by multiple plunging jets.
Keywords: mass transfer, multiple plunging jets, multi-linear regression, earth sciences
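A minimal Python sketch of this kind of multi-linear fit follows; the operating parameters and KLa values are made-up placeholders, not the study's measurements.

```python
# Minimal sketch (assumed variable names and synthetic values): a multi-linear
# fit of the volumetric mass transfer coefficient KLa from plunging-jet
# operating parameters, reporting R, RMSE and R^2 as in the abstract.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score

# X columns (hypothetical): jet velocity, jet diameter, jet length, impact angle
X = np.array([[2.0, 0.008, 0.30, 90], [2.5, 0.008, 0.30, 90],
              [2.0, 0.010, 0.45, 60], [3.0, 0.010, 0.45, 60],
              [3.5, 0.012, 0.60, 60], [2.8, 0.012, 0.60, 90]])
kla = np.array([0.010, 0.013, 0.015, 0.021, 0.028, 0.018])  # made-up KLa values (1/s)

model = LinearRegression().fit(X, kla)
pred = model.predict(X)

rmse = np.sqrt(mean_squared_error(kla, pred))
print(f"R = {np.corrcoef(kla, pred)[0, 1]:.3f}, RMSE = {rmse:.4f}, R^2 = {r2_score(kla, pred):.3f}")
```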
Procedia PDF Downloads 461
3984 Investigations on the Application of Avalanche Simulations: A Survey Conducted among Avalanche Experts
Authors: Korbinian Schmidtner, Rudolf Sailer, Perry Bartelt, Wolfgang Fellin, Jan-Thomas Fischer, Matthias Granig
Abstract:
This study focuses on the evaluation of snow avalanche simulations, based on a survey that has been carried out among avalanche experts. In the last decades, the application of avalanche simulation tools has gained recognition within the realm of hazard management. Traditionally, avalanche runout models were used to predict extreme avalanche runout and prepare avalanche maps. This has changed rather dramatically with the application of numerical models. For safety regulations such as road safety, simulation tools are now being coupled with real-time meteorological measurements to predict frequent avalanche hazard. That places new demands on model accuracy and requires the simulation of physical processes that previously could be ignored. These simulation tools are based on a deterministic description of the avalanche movement, allowing the prediction of certain quantities (e.g. pressure, velocities, flow heights, runout lengths, etc.) of the avalanche flow. Because of the highly variable regimes of the flowing snow, no uniform rheological law describing the motion of an avalanche is known. Therefore, analogies to the fluid dynamical laws of other materials are drawn. To transfer these constitutive laws to snow flows, certain assumptions and adjustments have to be imposed. Besides these limitations, there exist high uncertainties regarding the initial and boundary conditions. Further challenges arise when implementing the underlying flow model equations into an algorithm executable by a computer. This implementation is constrained by the choice of adequate numerical methods and their computational feasibility. Hence, the model development is compelled to introduce further simplifications and the related uncertainties. In the light of these issues, many questions arise about avalanche simulations, their assets and drawbacks, potentials for improvement as well as their application in practice. To address these questions, a survey among experts in the field of avalanche science (e.g. researchers, practitioners, engineers) from various countries has been conducted. In the questionnaire, special attention is drawn to the experts' opinions regarding the influence of certain variables on the simulation result, their uncertainty and the reliability of the results. Furthermore, it was tested to which degree a simulation result influences the decision making for a hazard assessment. A discrepancy could be found between the large uncertainty of the simulation input parameters and the relatively high reliability of the results. This contradiction can be explained by taking into account how the experts employ the simulations. The credibility of the simulations is the result of a rather thorough simulation study, where different assumptions are tested, comparing the results of different flow models along with the use of supplemental data such as chronicles, field observations and silent witnesses, among others, which are regarded as essential for the hazard assessment and for sanctioning simulation results. As the importance of avalanche simulations within hazard management grows along with their further development, studies focusing on the modeling fashion could contribute to a better understanding of how knowledge of the avalanche process can be gained by running simulations.
Keywords: expert interview, hazard management, modeling, simulation, snow avalanche
Procedia PDF Downloads 326
3983 Competition between Regression Technique and Statistical Learning Models for Predicting Credit Risk Management
Authors: Chokri Slim
Abstract:
The objective of this research is to answer the following question: Is there a significant difference between the regression model and statistical learning models in predicting credit risk management? A Multiple Linear Regression (MLR) model was compared with neural networks, including the Multi-Layer Perceptron (MLP), and a Support Vector Regression (SVR). The population of this study includes 50 banks listed on the Tunis Stock Exchange (TSE) market from 2000 to 2016. Firstly, we show the factors that have a significant effect on the quality of loan portfolios of banks in Tunisia. Secondly, we attempt to establish that the systematic use of objective techniques and methods designed to apprehend and assess risk when considering applications for granting credit has a positive effect on the quality of loan portfolios of banks and their future collectability. Finally, we try to show that bank governance has an impact on the choice of methods and techniques for analyzing and measuring the risks inherent in the banking business, including the risk of non-repayment. The results of the empirical tests confirm our claims.
Keywords: credit risk management, multiple linear regression, principal components analysis, artificial neural networks, support vector machines
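A minimal Python sketch of such a model competition follows; the data are synthetic stand-ins for financial ratios, not the Tunisian bank panel, and the model settings are illustrative assumptions.

```python
# Minimal sketch (synthetic data): comparing a multiple linear regression with
# an MLP and an SVR by cross-validated RMSE, mirroring the kind of model
# competition described in the abstract.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                                   # stand-ins for financial ratios
y = X @ np.array([0.5, -0.3, 0.2, 0.0, 0.1]) + 0.1 * rng.normal(size=200)

models = {
    "MLR": LinearRegression(),
    "MLP": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)),
    "SVR": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1.0)),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="neg_root_mean_squared_error")
    print(f"{name}: CV RMSE = {-scores.mean():.3f}")
```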
Procedia PDF Downloads 150
3982 Risk and Coping: Understanding Community Responses to Calls for Disaster Evacuation in Central Philippines
Authors: Soledad Natalia M. Dalisay, Mylene De Guzman
Abstract:
In archipelagic countries like the Philippines, many communities thrive along coastal areas. The sea is the community members' main source of livelihood and the site of many cultural activities. For these communities, the sea is their life and livelihood. Nevertheless, the sea also poses a hazard during the rainy season, when typhoons frequent their communities. Coastal communities often encounter threats from storm surges and flooding that are common when there are typhoons. During such periods, disaster evacuation programs are implemented. However, in many instances, evacuation has been the bane of local government officials implementing such programs in their communities, as resistance from community members is often encountered. Such resistance is often viewed by program implementers as being due to people being hard-headed and ignorant of the potential impacts of living in hazard prone areas. This paper argues that it is not for these reasons that people refuse to evacuate. Drawing from data collected from fieldwork done in three sites in Central Philippines affected by super typhoon Haiyan, this study aimed to provide a contextualized understanding of people's refusal to heed disaster evacuation warnings. This study utilized the multi-sited ethnography approach, with in-depth episodic interviews, focus group discussions, participatory risk mapping and key informant interviews in gathering data on people's experiences and insights, specifically on evacuation during typhoon Haiyan. This study showed that people have priorities and considerations vital to their social lives that they are protecting in their refusal to leave their homes for pre-emptive evacuation. It is not that they are unaware of the risks when they face the hazard. It is more that they had faith in the local knowledge and strategies that they have developed since the time of their ancestors as a result of living and engaging with hazards in their areas for as long as they could remember. The study also revealed that risk in encounters with hazards was gendered. Furthermore, previous engagement with local government officials and the manner in which the pre-emptive evacuation programs were implemented had cast doubt on the value of such programs in saving their lives. Life in the designated evacuation areas can be as dangerous, if not more so, than living in their coastal homes. There seems to be the impression that, in the evacuation program of the government, people were being moved from hazard zones to death zones. Thus, this paper ends with several recommendations that may contribute to building more responsive evacuation programs that aim to build people's resilience while taking into consideration the local moral world in communities in identified hazard zones.
Keywords: coastal communities, disaster evacuation, disaster risk perception, social and cultural responses to hazards
Procedia PDF Downloads 337
3981 Credit Risk Prediction Based on Bayesian Estimation of Logistic Regression Model with Random Effects
Authors: Sami Mestiri, Abdeljelil Farhat
Abstract:
The aim of this paper is to predict the credit risk of banks in Tunisia over the period 2000-2005. For this purpose, two methods for the estimation of the logistic regression model with random effects are applied: the Penalized Quasi-Likelihood (PQL) method and the Gibbs Sampler algorithm. Using information on a sample of 528 Tunisian firms and 26 financial ratios, we show that the Bayesian approach improves the quality of model predictions in terms of correct classification as well as the ROC curve result.
Keywords: forecasting, credit risk, penalized quasi-likelihood, Gibbs sampler, logistic regression with random effects, ROC curve
Procedia PDF Downloads 542
3980 Bayesian Variable Selection in Quantile Regression with Application to the Health and Retirement Study
Authors: Priya Kedia, Kiranmoy Das
Abstract:
There is a rich literature on variable selection in regression setting. However, most of these methods assume normality for the response variable under consideration for implementing the methodology and establishing the statistical properties of the estimates. In many real applications, the distribution for the response variable may be non-Gaussian, and one might be interested in finding the best subset of covariates at some predetermined quantile level. We develop dynamic Bayesian approach for variable selection in quantile regression framework. We use a zero-inflated mixture prior for the regression coefficients, and consider the asymmetric Laplace distribution for the response variable for modeling different quantiles of its distribution. An efficient Gibbs sampler is developed for our computation. Our proposed approach is assessed through extensive simulation studies, and real application of the proposed approach is also illustrated. We consider the data from health and retirement study conducted by the University of Michigan, and select the important predictors when the outcome of interest is out-of-pocket medical cost, which is considered as an important measure for financial risk. Our analysis finds important predictors at different quantiles of the outcome, and thus enhance our understanding on the effects of different predictors on the out-of-pocket medical cost.Keywords: variable selection, quantile regression, Gibbs sampler, asymmetric Laplace distribution
Procedia PDF Downloads 156
3979 Landslide Hazard a Gigantic Problem in Indian Himalayan Region: Needs In-Depth Research to Minimize Disaster
Authors: Varun Joshi, M. S. Rawat
Abstract:
The Indian Himalayan Region (IHR) is inherently fragile and susceptible to landslide hazard due to its extremely weak geology, highly rugged topography and heavy monsoonal rainfall. One of the most common hazards in the IHR is the landslide, and this event is particularly frequent in the Himalayan states of India, i.e. Jammu & Kashmir, Himachal Pradesh, Uttarakhand, Sikkim, Manipur and Arunachal Pradesh. Landslides are mostly triggered by extreme rainfall events, and their incidence increases during the monsoon months (June to September). Natural slopes that are otherwise stable get destabilized due to anthropogenic activities such as the construction of various developmental works and deforestation. These activities are required to fulfill the developmental needs and the upliftment of societal status in the region. Landslides also trigger during major earthquakes and are reported as the most observable and damaging phenomena. Studies indicate that the landslide phenomenon has increased many fold due to developmental activities in the Himalayan region. The gradually increasing and devastating consequences of landslides have turned them into one of the most important hydro-geological hazards in the Himalayan states, especially the Uttarakhand and Sikkim states of India. The most catastrophic recent rainfall, in June 2013 in Uttarakhand, led to colossal loss of life and property. The societal damage due to this incident is still to be recovered from, even after three years. The Sikkim earthquake of September 2011 is witness to the triggering of a large number of coseismic landslides. The rescue and relief teams faced huge problems in helping the trapped villagers in remote locations of the state due to roadside blockades by landslides. The recent incidences of landslides in the Uttarakhand as well as Sikkim states have created a new domain of research in terms of understanding the phenomenon of landslides and the management of disaster in such situations. Every year, landslides trigger at many locations and force dwellers to either evacuate their dwellings or lose their life and property. The communication and transportation networks are also severely affected by landslides at several locations. Many times the drinking water supply is disturbed, and shortages of daily household items are reported during the monsoon months. Minimizing the severity of landslides in the IHR requires in-depth research and developmental planning. For most of the areas in the present study, landslide hazard zonation is done on a 1:50,000 scale. Land use planning maps on an extensive basis are not available. Therefore, there is a need for large-scale landslide hazard zonation and land use planning maps. If scientists conduct research on the desired aspects and the outcomes of that research are utilized by the government in developmental planning, then the incidents of landslides could be minimized and the subsequent impact on society, life and property would be reduced. Along with the scientific research, there is another need for awareness generation in the region, for stakeholders and local dwellers, to combat the landslide hazard if triggered in their location.
Keywords: coseismic, Indian Himalayan Region, landslide hazard zonation, Sikkim, societal, Uttarakhand
Procedia PDF Downloads 251
3978 Ordinal Regression with Fenton-Wilkinson Order Statistics: A Case Study of an Orienteering Race
Authors: Joonas Pääkkönen
Abstract:
In sports, individuals and teams are typically interested in final rankings. Final results, such as times or distances, dictate these rankings, also known as places. Places can be further associated with ordered random variables, commonly referred to as order statistics. In this work, we introduce a simple, yet accurate order statistical ordinal regression function that predicts relay race places with changeover-times. We call this function the Fenton-Wilkinson Order Statistics model. This model is built on the following educated assumption: individual leg-times follow log-normal distributions. Moreover, our key idea is to utilize Fenton-Wilkinson approximations of changeover-times alongside an estimator for the total number of teams as in the notorious German tank problem. This original place regression function is sigmoidal and thus correctly predicts the existence of a small number of elite teams that significantly outperform the rest of the teams. Our model also describes how place increases linearly with changeover-time at the inflection point of the log-normal distribution function. With real-world data from Jukola 2019, a massive orienteering relay race, the model is shown to be highly accurate even when the size of the training set is only 5% of the whole data set. Numerical results also show that our model exhibits smaller place prediction root-mean-square-errors than linear regression, mord regression and Gaussian process regression.Keywords: Fenton-Wilkinson approximation, German tank problem, log-normal distribution, order statistics, ordinal regression, orienteering, sports analytics, sports modeling
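A minimal Python sketch of the core idea follows; the leg-time parameters and team count are assumed values for illustration, not the fitted Jukola 2019 model.

```python
# Minimal sketch: if changeover time is approximately log-normal with
# parameters (mu, sigma) obtained from a Fenton-Wilkinson moment match, the
# expected place at changeover time t is roughly N_teams * CDF(t), which is
# sigmoidal in t as the abstract describes.
import numpy as np
from scipy.stats import norm

def fenton_wilkinson_lognormal(mus, sigmas):
    """Approximate the sum of independent log-normals by a single log-normal
    with matched first and second moments (Fenton-Wilkinson)."""
    m1 = np.sum(np.exp(mus + sigmas**2 / 2))                                       # mean of the sum
    m2 = np.sum(np.exp(2 * mus + sigmas**2) * (np.exp(sigmas**2) - 1)) + m1**2     # second moment
    sigma2 = np.log(m2 / m1**2)
    return np.log(m1) - sigma2 / 2, np.sqrt(sigma2)

def predicted_place(t, mu, sigma, n_teams):
    return n_teams * norm.cdf((np.log(t) - mu) / sigma)

# hypothetical log-normal leg-time parameters for the first three legs (times in minutes)
mu_fw, sigma_fw = fenton_wilkinson_lognormal(np.array([4.3, 4.1, 4.5]),
                                             np.array([0.25, 0.20, 0.30]))
print(predicted_place(t=200.0, mu=mu_fw, sigma=sigma_fw, n_teams=1500))
```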
Procedia PDF Downloads 124
3977 The Predictors of Student Engagement: Instructional Support vs Emotional Support
Authors: Tahani Salman Alangari
Abstract:
Student success can be impacted by internal factors such as their emotional well-being and external factors such as organizational support and instructional support in the classroom. This study aims to identify at least one factor that forecasts student engagement. It is a cross-sectional study conducted on 6,206 teachers and encompassed three years of data collection and observations of math instruction in approximately 50 schools and 300 classrooms. A multiple linear regression revealed that a model predicting student engagement from emotional support, classroom organization, and instructional support was significant. Four linear regression models were tested using hierarchical regression to examine the effects of the independent variables: emotional support was the highest predictor of student engagement, while instructional support was the lowest.
Keywords: student engagement, emotional support, organizational support, instructional support, well-being
Procedia PDF Downloads 81
3976 Elastic and Plastic Collision Comparison Using Finite Element Method
Authors: Gustavo Rodrigues, Hans Weber, Larissa Driemeier
Abstract:
The prediction of post-impact conditions and the behavior of the bodies during impact have been the object of several collision models. The formulation from Hertz's theory, dating from the 19th century, is generally used. These models consider the repulsive force as proportional to the deformation of the bodies under contact and may consider it proportional to the rate of deformation. The objective of the present work is to analyze the behavior of the bodies during impact using the Finite Element Method (FEM) with elastic and plastic material models. The main parameters to evaluate are the contact force, the time of contact and the deformation of the bodies. An advantage of using the FEM approach is the possibility of applying a plastic deformation to the model according to the material definition: the Johnson-Cook plasticity model is used, whose parameters are obtained through empirical tests of real materials. This model allows analyzing the permanent deformation caused by impact, a phenomenon observed in the real world depending on the forces applied to the body. These results are compared with each other and with the Hertz-theory-based model.
Keywords: collision, impact models, finite element method, Hertz theory
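A minimal Python sketch of the Johnson-Cook flow stress evaluated by such a material model follows; the constants are typical illustrative values, not the parameters used in the paper.

```python
# Minimal sketch: the Johnson-Cook flow stress that an FEM plasticity model
# evaluates at each integration point,
# sigma = (A + B*eps^n) * (1 + C*ln(rate/rate0)) * (1 - T*^m).
import numpy as np

def johnson_cook_stress(eps_p, eps_rate, T, A, B, n, C, m,
                        eps_rate0=1.0, T_room=293.0, T_melt=1793.0):
    """Flow stress [MPa] for plastic strain eps_p, strain rate eps_rate [1/s]
    and temperature T [K]."""
    T_star = np.clip((T - T_room) / (T_melt - T_room), 0.0, 1.0)  # homologous temperature
    return (A + B * eps_p**n) * (1.0 + C * np.log(eps_rate / eps_rate0)) * (1.0 - T_star**m)

# example with mild-steel-like constants (assumed, for illustration only)
print(johnson_cook_stress(eps_p=0.1, eps_rate=100.0, T=400.0,
                          A=350.0, B=275.0, n=0.36, C=0.022, m=1.0))
```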
Procedia PDF Downloads 175
3975 Comprehensive Risk Analysis of Decommissioning Activities with Multifaceted Hazard Factors
Authors: Hyeon-Kyo Lim, Hyunjung Kim, Kune-Woo Lee
Abstract:
The decommissioning process of nuclear facilities can be said to consist of a sequence of problem-solving activities, partly because there may exist working environments contaminated by radiological exposure, and partly because there may also exist industrial hazards such as fire, explosions, toxic materials, and electrical and physical hazards. As for individual hazard factors, risk assessment techniques are becoming known to industrial workers with the advance of safety technology, but the way to integrate their results is not. Furthermore, there are few workers who have extensive past experience of decommissioning operations. Therefore, not a few countries in the world have been trying to develop appropriate counter techniques in order to guarantee the safety and efficiency of the process. In spite of that, there still exists neither a domestic nor an international standard, since nuclear facilities are too diverse and unique. In consequence, it is quite inevitable to imagine and assess the whole risk of each anticipated situation one by one. This paper aimed to find an appropriate technique to integrate individual risk assessment results from the viewpoint of experts. Thus, on one hand, the whole risk assessment activity for decommissioning operations was modeled as a sequence of individual risk assessment steps, and on the other, a hierarchical risk structure was developed. Then, a risk assessment procedure that can elicit individual hazard factors one by one was introduced with reference to the standard operation procedure (SOP) and hierarchical task analysis (HTA). With an assumption of quantification and normalization of individual risks, a technique to estimate relative weight factors was tried by using the conventional Analytic Hierarchy Process (AHP), and its result was reviewed with reference to the judgment of experts. Besides, taking the ambiguity of human judgment into consideration, a debate based upon fuzzy inference was added with a mathematical case study.
Keywords: decommissioning, risk assessment, analytic hierarchy process (AHP), fuzzy inference
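A minimal Python sketch of the AHP weight-estimation step follows; the 3x3 pairwise-comparison matrix is a hypothetical example, not the experts' judgments from the study.

```python
# Minimal sketch: relative weight factors from the principal eigenvector of an
# AHP judgment matrix, with the consistency ratio used to check the judgments.
import numpy as np

def ahp_weights(A):
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                      # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                     # normalized weights
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)             # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)     # Saaty's random index
    return w, ci / ri                                # weights, consistency ratio

# pairwise comparisons of three hazard groups (hypothetical):
# radiological, fire/explosion, electrical/physical
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
weights, cr = ahp_weights(A)
print(weights, f"CR = {cr:.3f}")   # CR < 0.1 is conventionally acceptable
```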
Procedia PDF Downloads 424
3974 Solving Types of Mathematical Routine and Non-Routine Problems in Algebra
Authors: Verónica Díaz Quezada
Abstract:
Despite the importance given to the development of problem-solving skills and the requirement to solve problems framed in mathematical or real-life contexts, in practice these are not evident in relation to the teaching of proportional variations. This qualitative and descriptive study aims (i) to improve the problem-solving ability of high school students in Chile, and (ii) to elaborate and describe a didactic intervention strategy based on learning situations in proportional variations, focused on solving types of routine problems in various contexts and non-routine problems. For this purpose, participant observation was conducted, together with a test of mathematics problems and an opinion questionnaire administered to thirty-six high school students. The results show that the highest academic performance is evidenced in routine problems in a purely mathematical context, a realistic context and a fantasy context, and in non-routine problems, except in routine problems in a real context and compound proportionality problems. The results highlight the need to consider different types of problems in the teaching of mathematics that relate the discipline to everyday life situations.
Keywords: algebra, high school, proportion variations, non-routine problem solving, routine problem solving
Procedia PDF Downloads 140
3973 Modeling Standpipe Pressure Using Multivariable Regression Analysis by Combining Drilling Parameters and a Herschel-Bulkley Model
Authors: Seydou Sinde
Abstract:
The aims of this paper are to formulate mathematical expressions that can be used to estimate the standpipe pressure (SPP). The developed formulas take into account the main factors that, directly or indirectly, affect the behavior of SPP values. Fluid rheology and well hydraulics are some of these essential factors. Mud plastic viscosity, yield point, flow power, consistency index, flow rate, and drillstring and annular geometries are represented by the frictional pressure (Pf), which is one of the input independent parameters and is calculated, in this paper, using the Herschel-Bulkley rheological model. Other input independent parameters include the rate of penetration (ROP), the applied load or weight on the bit (WOB), bit revolutions per minute (RPM), bit torque (TRQ), and hole inclination and direction coupled in the hole curvature or dogleg (DL). The technique of repeating parameters and the Buckingham Pi theorem are used to reduce the number of input independent parameters into the dimensionless revolutions per minute (RPMd), the dimensionless torque (TRQd), and the dogleg, which is already in the dimensionless form of radians. A multivariable linear and polynomial regression technique using PTC Mathcad Prime 4.0 is used to analyze and determine the exact relationships between the dependent parameter, which is SPP, and the remaining three dimensionless groups. Three models proved sufficiently satisfactory to estimate the standpipe pressure: multivariable linear regression model 1, containing three regression coefficients, for vertical wells; multivariable linear regression model 2, containing four regression coefficients, for deviated wells; and a multivariable polynomial quadratic regression model, containing six regression coefficients, for both vertical and deviated wells. Although linear regression model 2 (with four coefficients) is relatively more complex and contains an additional term over linear regression model 1 (with three coefficients), the former did not really add significant improvements over the latter except for some minor values. Thus, the effect of the hole curvature or dogleg is insignificant and can be omitted from the input independent parameters without a significant loss of accuracy. The polynomial quadratic regression model is considered the most accurate model due to its relatively higher accuracy for most of the cases. Data from nine wells in the Middle East were used to run the developed models, with satisfactory results provided by all of them, even if the multivariable polynomial quadratic regression model gave the best and most accurate results. The development of these models is useful not only to monitor and predict, with accuracy, the values of SPP but also to control early and check the integrity of the well hydraulics, as well as to take corrective actions should any unexpected problems appear, such as pipe washouts, jet plugging, excessive mud losses, fluid gains, kicks, etc.
Keywords: standpipe, pressure, hydraulics, nondimensionalization, parameters, regression
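A minimal Python sketch of a multivariable quadratic regression on such dimensionless groups follows; the numbers are synthetic, the Herschel-Bulkley frictional-pressure calculation is not shown, and the full quadratic surface used here has more terms than the paper's six-coefficient form.

```python
# Minimal sketch (synthetic values, not the nine Middle-East wells): fitting a
# generic quadratic surface of standpipe pressure on dimensionless groups
# analogous to RPMd, TRQd and dogleg.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
X = np.column_stack([rng.uniform(0.5, 2.0, 40),      # stand-in for RPMd
                     rng.uniform(0.1, 1.0, 40),      # stand-in for TRQd
                     rng.uniform(0.0, 0.1, 40)])     # stand-in for DL (radians)
spp = 1500 + 800 * X[:, 0] + 300 * X[:, 1] ** 2 + 50 * rng.normal(size=40)  # psi, synthetic

quad = make_pipeline(PolynomialFeatures(degree=2, include_bias=True),
                     LinearRegression(fit_intercept=False))
quad.fit(X, spp)
print("R^2 =", quad.score(X, spp))
print("coefficients:", quad.named_steps["linearregression"].coef_)
```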
Procedia PDF Downloads 84
3972 Suitability of Indonesia's Tax Administration with Abu Yusuf Thought
Authors: Dina Safrina
Abstract:
This paper aims to discuss the suitability of tax administration in Indonesia with Islamic Shari'a by looking at Abu Yusuf's ideas on taxation. This is qualitative research using library research as the data collection method, that is, studying, examining in depth and citing theories or concepts from a number of works of literature. The purpose of this paper is to find out whether taxation in Indonesia is consistent with the thinking of Islamic economists, namely Abu Yusuf's ideas, which became known among economists as the canons of taxation. The ability to pay, flexibility in the timing of payment for taxpayers and the centralization of decision-making in the tax administration are some of the principles he emphasizes. In taxation, he recommends the use of the Muqassamah (proportional tax) system rather than the Mixed (fixed tax) system. In this regard, the determination of tax rates in Indonesia makes use of fixed tax, proportional tax, progressive tax and regressive tax schemes. Abu Yusuf opposed the existence of the Qabalah institution (the guarantor of tax payments to the state) at the time and suggested a tax administration centered on and paid directly to the state. This is in accordance with what is already applied in Indonesia, where tax collection is done centrally. The tax system in Indonesia uses a self-assessment system, in which the authority and responsibility to calculate, pay and report the tax are given by the government to the taxpayer; this becomes a gap for taxpayers to commit fraud. The prerequisites that must be met for the success of this system are tax consciousness, tax honesty, tax mindedness, and tax discipline.
Keywords: Abu Yusuf, Indonesia, tax, tax administration
Procedia PDF Downloads 418
3971 Estimation of Functional Response Model by Supervised Functional Principal Component Analysis
Authors: Hyon I. Paek, Sang Rim Kim, Hyon A. Ryu
Abstract:
In functional linear regression, one typical problem is to reduce the dimension. Compared with multivariate linear regression, functional linear regression is regarded as an infinite-dimensional case, and the main task is to reduce the dimensions of the functional response and the functional predictors. One common approach is to apply functional principal component analysis (FPCA) to the functional predictors and then use a few leading functional principal components (FPCs) to predict the functional model. The leading FPCs estimated by typical FPCA explain a major part of the variation of the functional predictor, but these leading FPCs may not be strongly correlated with the functional response, so they may not be significant in the prediction of the response. In this paper, we propose a supervised functional principal component analysis method for a functional response model, with FPCs obtained by considering the correlation with the functional response. Our method would have better prediction accuracy than the typical FPCA method.
Keywords: supervised, functional principal component analysis, functional response, functional linear regression
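A minimal Python sketch of the supervision idea follows; it uses synthetic curves, a scalar summary of the response in place of a functional response, and ordinary PCA as a stand-in for FPCA, so it illustrates the component re-ranking only, not the paper's estimator.

```python
# Minimal sketch: FPCA-style scores from PCA on densely observed curves, with
# the components re-ranked by their correlation with the response instead of
# by explained variance alone.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 100)
n = 150
modes = rng.normal(size=(n, 3))
X = (modes[:, [0]] * np.sin(2 * np.pi * t)        # curves built from three modes
     + modes[:, [1]] * np.cos(2 * np.pi * t)
     + modes[:, [2]] * np.sin(4 * np.pi * t)
     + 0.1 * rng.normal(size=(n, t.size)))
y = 2.0 * modes[:, 2] + 0.2 * rng.normal(size=n)  # response driven by a minor mode

pca = PCA(n_components=5).fit(X)
fpc_scores = pca.transform(X)                     # component scores of the predictor curves

corr = [abs(np.corrcoef(fpc_scores[:, j], y)[0, 1]) for j in range(5)]
print("variance ranking   :", np.argsort(pca.explained_variance_ratio_)[::-1][:3])
print("supervised ranking :", np.argsort(corr)[::-1][:3])   # by |corr| with the response
```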
Procedia PDF Downloads 75
3970 Awareness on Department of Education's Disaster Risk Reduction Management Program at Oriental Mindoro National High School: Basis for Support School DRRM Program
Authors: Nimrod Bantigue
Abstract:
The Department of Education is continuously providing safe teaching-learning facilities and hazard-free environments to the learners. To achieve this goal, teachers’ awareness of DepEd’s DRRM programs and activities is extremely important; thus, this descriptive correlational quantitative study was conceptualized. This research answered four questions on the profile and level of awareness of the 153 teacher respondents of Oriental Mindoro National High School for the academic year 2018-2019. Stratified proportional sampling was employed, and both descriptive and inferential statistics were utilized to treat data. The findings revealed that the majority of the teachers at OMNHS are female and are in the age bracket of 20-40. Most are married and pursue graduate studies. They have moderate awareness of the Department of Education’s DRRM programs and activities in terms of assessment of risks activities, planning activities, implementation activities during disaster and evaluation and monitoring activities with 3.32, 3.12, 3.40 and 3.31 as computed means, respectively. Further, the result showed a significant relationship between the profile of the respondents such as age, civil status and educational attainment and the level of awareness. On the contrary, sex does not have a significant relationship with the level of awareness. The Support School DRRM program with Utilization Guide on School DRRM Manual was proposed to increase, improve and strengthen the weakest areas of awareness rated in each DRRM activity, such as assessment of risks, planning, and implementation during disasters and monitoring and evaluation.Keywords: awareness, management, monitoring, risk reduction
Procedia PDF Downloads 219
3969 Analyzing the Influence of Hydrometeorological Extremes, Geological Setting, and Social Demographic on Public Health
Authors: Irfan Ahmad Afip
Abstract:
The main research objective is to accurately identify the possible severity of a leptospirosis outbreak in a certain area based on the features input into a multivariate regression model. The research question is whether the possibility of an outbreak in a specific area is influenced by features such as social demographics and hydrometeorological extremes. If the occurrence of an outbreak is subject to these features, then the epidemic severity for an area will differ depending on its environmental setting, because the features will influence the possibility and severity of an outbreak. Specifically, the research objectives were three-fold, namely: (a) to identify the relevant multivariate features and visualize the patterns in the data, (b) to develop a multivariate regression model based on the selected features and determine the possibility of a leptospirosis outbreak in an area, and (c) to compare the predictive ability of the multivariate regression model and machine learning algorithms. Several secondary data features were collected for locations in the state of Negeri Sembilan, Malaysia, based on the possibility that they would be relevant for determining the outbreak severity in the area. The relevant features then become inputs to a multivariate regression model; a linear regression model is a simple and quick solution for creating prognostic capabilities. A multivariate regression model has proven more precise prognostic capabilities than univariate models. The expected outcome of this research is to establish a correlation between the social demographic and hydrometeorological features and the bacteria causing leptospirosis; it will also contribute to understanding the underlying relationship between the pathogen and the ecosystem. The relationship established can be beneficial for the health department or urban planners to inspect and prepare for future outcomes in event detection and system health monitoring.
Keywords: geographical information system, hydrometeorological, leptospirosis, multivariate regression
Procedia PDF Downloads 115
3968 On Estimating the Headcount Index by Using the Logistic Regression Estimator
Authors: Encarnación Álvarez, Rosa M. García-Fernández, Juan F. Muñoz, Francisco J. Blanco-Encomienda
Abstract:
The problem of estimating a proportion has important applications in the field of economics and, in general, in many areas such as the social sciences. A common application in economics is the estimation of the headcount index. In this paper, we define the general headcount index as a proportion. Furthermore, we introduce a new quantitative method for estimating the headcount index. In particular, we suggest using the logistic regression estimator for the problem of estimating the headcount index. Using a real data set, results derived from Monte Carlo simulation studies indicate that the logistic regression estimator can be more accurate than the traditional estimator of the headcount index.
Keywords: poverty line, poor, risk of poverty, Monte Carlo simulations, sample
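A minimal Python sketch of a logistic-regression-assisted estimate of a headcount-type proportion follows; the income data and auxiliary variables are simulated, not the paper's data, and the estimator shown is one simple model-assisted variant rather than the authors' exact definition.

```python
# Minimal sketch: the headcount index is the proportion of units below the
# poverty line; a logistic regression of the poverty indicator on auxiliary
# variables is fitted on a sample, and the averaged predicted probabilities
# over the full auxiliary frame give a model-assisted estimate, compared here
# with the plain sample proportion.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
N = 10_000
aux = rng.normal(size=(N, 2))                        # auxiliary variables known for everyone
income = np.exp(1.0 + 0.6 * aux[:, 0] + 0.3 * aux[:, 1] + 0.5 * rng.normal(size=N))
poor = (income < 2.0).astype(int)                    # poverty line at 2.0 (arbitrary units)

sample = rng.choice(N, size=500, replace=False)      # survey sample
clf = LogisticRegression().fit(aux[sample], poor[sample])

traditional = poor[sample].mean()                    # traditional estimator: sample proportion
logistic_est = clf.predict_proba(aux)[:, 1].mean()   # logistic regression estimator
print(f"true H = {poor.mean():.3f}, sample = {traditional:.3f}, logistic = {logistic_est:.3f}")
```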
Procedia PDF Downloads 423
3967 A Comparative Study on Sampling Techniques of Polynomial Regression Model Based Stochastic Free Vibration of Composite Plates
Authors: S. Dey, T. Mukhopadhyay, S. Adhikari
Abstract:
This paper presents an exhaustive comparative investigation of sampling techniques for polynomial regression model based stochastic natural frequencies of composite plates. Both individual and combined variations of the input parameters are considered to map the computational time and accuracy of each modelling technique. The finite element formulation of composites is capable of dealing with both correlated and uncorrelated random input variables such as fibre parameters and material properties. The results obtained by polynomial regression (PR) using different sampling techniques are compared. The suitability of sampling techniques such as 2k factorial designs, central composite design, A-optimal design, I-optimal design, D-optimal design, Taguchi's orthogonal array design, Box-Behnken design, Latin hypercube sampling and the Sobol sequence is illustrated. A statistical analysis of the first three natural frequencies is presented to compare the results and their performance.
Keywords: composite plate, natural frequency, polynomial regression model, sampling technique, uncertainty quantification
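A minimal Python sketch of comparing two of the named sampling techniques for a polynomial-regression surrogate follows; an analytic toy function stands in for the composite-plate finite element model, so only the workflow is illustrated.

```python
# Minimal sketch: build a quadratic polynomial-regression surrogate from two
# designs, Latin hypercube and Sobol sampling, and compare their accuracy on a
# common random test set.
import numpy as np
from scipy.stats import qmc
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

def frequency(x):                         # stand-in for the stochastic natural frequency
    return 100 + 20 * x[:, 0] - 15 * x[:, 1] + 5 * x[:, 0] * x[:, 2] + 2 * x[:, 1] ** 2

d, n_train = 3, 64                        # 3 random inputs, 64 training samples
designs = {"LHS": qmc.LatinHypercube(d=d, seed=1).random(n_train),
           "Sobol": qmc.Sobol(d=d, seed=1).random(n_train)}
x_test = np.random.default_rng(2).uniform(size=(500, d))

for name, x_train in designs.items():
    surrogate = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
    surrogate.fit(x_train, frequency(x_train))
    rmse = np.sqrt(mean_squared_error(frequency(x_test), surrogate.predict(x_test)))
    print(f"{name}: surrogate RMSE = {rmse:.3f}")
```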
Procedia PDF Downloads 513
3966 Heart Attack Prediction Using Several Machine Learning Methods
Authors: Suzan Anwar, Utkarsh Goyal
Abstract:
Heart rate (HR) is a predictor of cardiovascular, cerebrovascular, and all-cause mortality in the general population, as well as in patients with cardiovascular and cerebrovascular diseases. Machine learning (ML) significantly improves the accuracy of cardiovascular risk prediction, increasing the number of patients identified who could benefit from preventive treatment while avoiding unnecessary treatment of others. This research examines the relationship between an individual's various heart health inputs, like age, sex, cp, trestbps, thalach, oldpeak, etc., and the likelihood of developing heart disease. Machine learning techniques like logistic regression and decision tree are used, implemented in Python. The results of testing and evaluating the models using the Heart Failure Prediction Dataset show the chance of a person having heart disease with variable accuracy. Logistic regression yielded an accuracy of 80.48% without data handling. With data handling (normalization, StandardScaler), logistic regression resulted in an improved accuracy of 87.80%, decision tree 100%, random forest 100%, and SVM 100%.
Keywords: heart rate, machine learning, SVM, decision tree, logistic regression, random forest
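A minimal Python sketch of this kind of classifier comparison follows; the CSV file name and the 'target' column are assumptions based on the features listed in the abstract, not the authors' notebook.

```python
# Minimal sketch: the same four classifiers on a heart-disease table, with
# StandardScaler as the "data handling" step, scored by test-set accuracy.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC

df = pd.read_csv("heart.csv")                        # hypothetical file with a 'target' column
X, y = df.drop(columns="target"), df["target"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0, stratify=y)

models = {
    "logistic regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "random forest": RandomForestClassifier(random_state=0),
    "SVM": make_pipeline(StandardScaler(), SVC()),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: accuracy = {model.score(X_te, y_te):.4f}")
```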
Procedia PDF Downloads 138
3965 Efficient Model Selection in Linear and Non-Linear Quantile Regression by Cross-Validation
Authors: Yoonsuh Jung, Steven N. MacEachern
Abstract:
The check loss function is used to define quantile regression. In the context of cross-validation, it is also employed as a validation function when the underlying truth is unknown. However, our empirical study indicates that validation with the check loss often leads to choosing an overfitted model. In this work, we suggest a modified, or L2-adjusted, check loss which rounds the sharp corner in the middle of the check loss. It has a large effect in guarding against overfitted models to some extent. Through various simulation settings of linear and non-linear regression, the improvement of the check loss by the L2 adjustment is empirically examined. This adjustment is devised to shrink to zero as the sample size grows.
Keywords: cross-validation, model selection, quantile regression, tuning parameter selection
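A minimal Python sketch of the two losses follows; the quadratic rounding shown is one possible smoothing within a window delta, not necessarily the authors' exact definition.

```python
# Minimal sketch: the check (pinball) loss used in quantile-regression
# cross-validation and an L2-adjusted version that rounds its corner at zero
# inside |r| < delta, where delta would shrink as the sample size grows.
import numpy as np

def check_loss(r, tau):
    """Standard check loss for residuals r at quantile level tau."""
    return np.where(r >= 0, tau * r, (tau - 1) * r)

def l2_adjusted_check_loss(r, tau, delta):
    """Quadratic (Huber-like) rounding of the check loss inside |r| < delta;
    matches the check loss in value and slope at |r| = delta."""
    side = np.where(r >= 0, tau, 1 - tau)
    rounded = side * r**2 / (2 * delta)
    shifted = check_loss(r, tau) - side * delta / 2
    return np.where(np.abs(r) < delta, rounded, shifted)

r = np.linspace(-2, 2, 9)
print(check_loss(r, tau=0.5))
print(l2_adjusted_check_loss(r, tau=0.5, delta=0.5))
```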
Procedia PDF Downloads 438
3964 Isothermal Crystallization Kinetics of Lauric Acid Methyl Ester from DSC Measurements
Authors: Charine Faith H. Lagrimas, Rommel N. Galvan, Rizalinda L. de Leon
Abstract:
An ongoing study of methyl laurate to be used as a refrigerant in an HVAC system requires the crystallization kinetics of the said substance. Step-wise and normal forms of the Avrami model parameters were used to describe the isothermal crystallization kinetics of methyl laurate at different temperatures from Differential Scanning Calorimetry (DSC) measurements. At 3 °C, the parameters showed that methyl laurate exhibits secondary crystallization. The primary crystallization occurred with instantaneous nuclei and spherulitic growth, followed by a secondary instantaneous nucleation with growth of lower dimensionality, rod-like. At 4 °C to 6 °C, the exotherms from DSC implied that the system was within the isokinetic range. The kinetic behavior is the same, namely instantaneous nucleation with one-dimensional growth. The differences across the isokinetic-range temperatures are the activation energies (directly proportional to T) and the nucleation rates (inversely proportional to T). From the images obtained during the crystallization of methyl laurate using an optical microscope, it is confirmed that the observed nucleation and crystal growth modes are consistent with the parameters from the Avrami model.
Keywords: Avrami model, isothermal crystallization, lipids kinetics, methyl laurate
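A minimal Python sketch of an Avrami fit follows; the relative-crystallinity data points are made up for illustration, not the DSC exotherms from the study.

```python
# Minimal sketch: fitting the Avrami model X(t) = 1 - exp(-k * t^n) to
# isothermal crystallization data; n near 1 suggests one-dimensional (rod-like)
# growth with instantaneous nucleation, n near 3-4 spherulitic growth.
import numpy as np
from scipy.optimize import curve_fit

def avrami(t, k, n):
    return 1.0 - np.exp(-k * t**n)

t = np.array([0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0])             # time, min (made up)
X_rel = np.array([0.08, 0.22, 0.38, 0.52, 0.75, 0.88, 0.98])   # relative crystallinity (made up)

(k, n), _ = curve_fit(avrami, t, X_rel, p0=(0.1, 2.0))
print(f"k = {k:.3f} min^-n, Avrami exponent n = {n:.2f}")

# The same parameters follow from the linearized form
# ln(-ln(1 - X)) = ln k + n ln t, fitted by ordinary least squares:
slope, intercept = np.polyfit(np.log(t), np.log(-np.log(1 - X_rel)), 1)
print(f"linearized: n = {slope:.2f}, k = {np.exp(intercept):.3f}")
```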
Procedia PDF Downloads 342
3963 Vision Aided INS for Soft Landing
Authors: R. Sri Karthi Krishna, A. Saravana Kumar, Kesava Brahmaji, V. S. Vinoj
Abstract:
The lunar surface may contain rough and non-uniform terrain with dips and peaks. Soft-landing is a method of landing the lander on the lunar surface without any damage to the vehicle. This project focuses on finding a safe landing site for the vehicle by developing a method for the lateral velocity determination of the lunar lander. This is done by processing the real time images obtained by means of an on-board vision sensor. The hazard avoidance phase of the soft-landing starts when the vehicle is about 200 m above the lunar surface. Here, the lander has a very low velocity of about 10 cm/s:vertical and 5 m/s:horizontal. On the detection of a hazard the lander is navigated by controlling the vertical and lateral velocity. In order to find an appropriate landing site and to accordingly navigate, the lander image processing is performed continuously. The images are taken continuously until the landing site is determined, and the lander safely lands on the lunar surface. By integrating this vision-based navigation with the INS a better accuracy for the soft-landing of the lunar lander can be obtained.Keywords: vision aided INS, image processing, lateral velocity estimation, materials engineering
Procedia PDF Downloads 466
3962 Instability Index Method and Logistic Regression to Assess Landslide Susceptibility in County Route 89, Taiwan
Authors: Y. H. Wu, Ji-Yuan Lin, Yu-Ming Liou
Abstract:
This study aims to set up the landslide susceptibility map of County Route 89 in Ren-Ai Township, Nantou County, using the Instability Index Method and logistic regression. Seven susceptibility factors, including slope angle, aspect, elevation, distance to fold, distance to river, distance to road and accumulated rainfall, were obtained by GIS based on the Typhoon Toraji landslide area identified by the Industrial Technology Research Institute in 2001. The landslide percentage of each factor was calculated to acquire the weights and grade the grid cells by means of the Instability Index Method. In this study, landslide susceptibility is classified into four grades: high, medium high, medium low and low, in order to determine the advantages and disadvantages of the two models. The precision of the models is verified by the classification error matrix and the SRC curve. These results suggest that the logistic regression model is a preferred method over the instability index in the assessment of landslide susceptibility. It is suitable for landslide prediction and precaution in this area in the future.
Keywords: instability index method, logistic regression, landslide susceptibility, SRC curve
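A minimal Python sketch of the landslide-percentage weighting at the core of the Instability Index Method follows; the toy grid table, class labels and factor weight are hypothetical, not the Route 89 inventory.

```python
# Minimal sketch: the landslide percentage of each factor class, relative to the
# overall rate, gives a class score, and a factor weight scales the scores into
# a susceptibility index per grid cell.
import pandas as pd

grid = pd.DataFrame({
    "slope_class": ["0-15", "15-30", "30-45", "30-45", "15-30", ">45", ">45", "0-15"],
    "landslide":   [0,       0,       1,       1,       0,       1,     0,     0],
})

class_rate = grid.groupby("slope_class")["landslide"].mean()       # landslide percentage per class
overall_rate = grid["landslide"].mean()
score = (class_rate / overall_rate).rename("instability_score")    # frequency-ratio-style score

grid = grid.join(score, on="slope_class")
weight_slope = 0.35                                  # factor weight (from the full multi-factor analysis)
grid["susceptibility"] = weight_slope * grid["instability_score"]
print(grid)
```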
Procedia PDF Downloads 292
3961 Seismic Vulnerability Mitigation of Non-Engineered Buildings
Authors: Muhammad Tariq A. Chaudhary
Abstract:
The tremendous loss of life that resulted in the aftermath of recent earthquakes in developing countries is mostly due to the collapse of non-engineered and semi-engineered building structures. Such structures are used as houses, schools, primary healthcare centres and government offices. These building are classified structurally into two categories viz. non-engineered and semi-engineered. Non-engineered structures include: adobe, Unreinforced Masonry (URM) and wood buildings. Semi-engineered buildings are mostly low-rise (up to 3 story) light concrete frame structures or masonry bearing walls with reinforced concrete slab. This paper presents an overview of the typical damage observed in non-engineered structures and their most likely causes in the past earthquakes with specific emphasis on the performance of such structures in the 2005 Kashmir earthquake. It is demonstrated that seismic performance of these structures can be improved from life-safety viewpoint by adopting simple low-cost modifications to the existing construction practices. Incorporation of some of these practices in the reconstruction efforts after the 2005 Kashmir earthquake are examined in the last section for mitigating seismic risk hazard.Keywords: Kashmir earthquake, non-engineered buildings, seismic hazard, structural details, structural strengthening
Procedia PDF Downloads 286
3960 Implementation of a Non-Poissonian Model in a Low-Seismicity Area
Authors: Ludivine Saint-Mard, Masato Nakajima, Gloria Senfaute
Abstract:
In areas with low to moderate seismicity, probabilistic seismic hazard analysis frequently uses a Poisson approach, which assumes independence in time and space of events to determine the annual probability of earthquake occurrence. Nevertheless, in countries with a high seismic rate, such as Japan, non-Poissonian models, which assume that the next earthquake occurrence depends on the date of the previous one, are frequently used. The objective of this paper is to apply a non-Poissonian model in a region of low to moderate seismicity to get feedback on the following questions: can we overcome the lack of data to determine some key parameters, and can we deal with the uncertainties so as to apply this methodology widely in an industrial context? The Brownian Passage Time model was applied to a fault located in France, and we conclude that even if the lack of data can be overcome with some calculations, the amount of uncertainty and the number of scenarios lead to numerous branches in the PSHA, making this method difficult to apply over a large number of low to moderate seismicity areas and in an industrial context.
Keywords: probabilistic seismic hazard, non-Poissonian model, earthquake occurrence, low seismicity
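A minimal Python sketch of a Brownian Passage Time conditional-probability calculation follows; the recurrence parameters are illustrative values, not those of the French fault studied.

```python
# Minimal sketch: the Brownian Passage Time distribution is the inverse
# Gaussian; with mean recurrence interval T_mean and aperiodicity alpha, the
# probability of an earthquake in the next dt years, conditional on t_elapsed
# years of quiescence, follows directly from its CDF.
from scipy.stats import invgauss

def bpt_conditional_probability(t_elapsed, dt, T_mean, alpha):
    # scipy's invgauss(mu=alpha**2, scale=T_mean/alpha**2) has mean T_mean and
    # coefficient of variation alpha, matching the BPT parameterization.
    dist = invgauss(mu=alpha**2, scale=T_mean / alpha**2)
    return (dist.cdf(t_elapsed + dt) - dist.cdf(t_elapsed)) / dist.sf(t_elapsed)

# e.g. mean recurrence 1000 yr, aperiodicity 0.5, 600 yr since the last event
print(f"P(event in next 50 yr) = {bpt_conditional_probability(600, 50, 1000, 0.5):.4f}")
```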
Procedia PDF Downloads 62