Search results for: Cox proportional hazard regression
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4246

3376 Factors Affecting Expectations and Intentions of University Students in Educational Context

Authors: Davut Disci

Abstract:

Objective: To measure the factors affecting university students' expectations and intentions regarding mobile phone use in educational contexts, using structural equation and regression modeling techniques. Design and Methodology: According to the literature, Mobile Addiction, Parental Surveillance-Safety/Security, Social Relations, and Mobile Behavior are the terms most commonly used to describe people's mobile phone use. These variables were therefore measured to estimate their effects on expectations and intentions of using mobile phones in an educational context. 421 university students (229 female, 192 male) participated in this study. To examine mobile behavior and educational expectations and intentions, a questionnaire was prepared and administered online to the participants, who had to answer all questions. Responses to closed-ended questions were analyzed using the Statistical Package for the Social Sciences (SPSS); reliabilities were measured with Cronbach's alpha, hypotheses were examined using multiple and linear regression analysis, and the model was tested with the Structural Equation Modeling (SEM) technique, which is important for testing the model scientifically. Responses to open-ended questions were also taken into consideration. Results: Analysis of the closed-ended questions found that Mobile Addiction, Parental Surveillance, Social Relations, and the frequency of using mobile phone applications affect participants' mobile behavior to different degrees, helping them use mobile phones in an educational context. In the open-ended questions, participants stated that they use many mobile applications in their learning environment for contacting friends, watching educational videos, and finding course material on the internet. They also agreed that the mobile phone brings greater flexibility to their lives.
According to the SEM results, the model could not be confirmed; it may need to be improved before it can be supported in SEM as well as in multiple regression. Conclusion: This study shows that the specified model can be used by educationalists and school authorities to improve their learning environments.
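
The multiple regression step described above can be sketched as follows, using ordinary least squares on invented data (the variable names and generating coefficients are assumptions for illustration, not the study's actual instrument or results):

```python
import numpy as np

# Hypothetical illustration: regress an "intention to use mobile learning"
# score on predictor scores such as mobile addiction, parental surveillance,
# and social relations. All values below are simulated, not the study's data.
rng = np.random.default_rng(0)
n = 421  # sample size reported in the abstract
addiction = rng.normal(3.0, 1.0, n)
surveillance = rng.normal(2.5, 0.8, n)
social = rng.normal(3.5, 0.9, n)
intention = 0.4*addiction + 0.2*surveillance + 0.3*social + rng.normal(0, 0.5, n)

# Ordinary least squares: solve for beta in y = X beta + error
X = np.column_stack([np.ones(n), addiction, surveillance, social])
beta, *_ = np.linalg.lstsq(X, intention, rcond=None)

# R-squared: proportion of outcome variance explained by the model
pred = X @ beta
ss_res = np.sum((intention - pred) ** 2)
ss_tot = np.sum((intention - intention.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
```

With a sample of this size, the estimated coefficients recover the generating values closely; SEM adds measurement models and latent variables on top of this regression core.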

Keywords: learning technology, instructional technology, mobile learning, technology

Procedia PDF Downloads 452
3375 The Relationship between Inventory Management and Profitability: A Comparative Research on Turkish Firms Operated in Weaving Industry, Eatables Industry, Wholesale and Retail Industry

Authors: Gamze Sekeroglu, Mikail Altan

Abstract:

Working capital comprises all of a firm's current assets. Inventories, one of the elements of working capital, are particularly important among current assets, because profitability, an indicator of a firm's financial success, is achieved with minimum cost and an optimum inventory quantity. This study therefore investigates, comparatively, the effect of inventory management on the profitability of Turkish firms operating in the weaving, eatables, and wholesale and retail industries between 2003 and 2012. The research data consist of profitability ratios and inventory turnover ratios calculated from the balance sheets and income statements of firms listed on Borsa Istanbul (BIST). The relationship between inventories and profitability is investigated using regression and correlation analysis in SPSS 20. The results for the three industry groups are interpreted comparatively. Accordingly, a positive relationship between inventory management and profitability was found in the eatables industry; however, no relationship was found in either the weaving industry or the wholesale and retail industry.
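
The correlation-and-regression analysis described above reduces to a computation like the following; the turnover and profitability figures here are invented for the sketch, not the study's BIST data:

```python
import numpy as np

# Toy data: inventory turnover ratio vs. a profitability ratio (e.g. ROA)
# for eight hypothetical firms. Values are illustrative only.
inventory_turnover = np.array([4.2, 5.1, 6.3, 3.8, 7.0, 5.5, 6.8, 4.9])
return_on_assets   = np.array([0.05, 0.07, 0.09, 0.04, 0.11, 0.08, 0.10, 0.06])

# Pearson correlation coefficient between the two ratios
r = np.corrcoef(inventory_turnover, return_on_assets)[0, 1]

# Simple linear regression (slope, intercept) via least squares
slope, intercept = np.polyfit(inventory_turnover, return_on_assets, 1)
```

A positive `r` and slope, as found for the eatables industry, would indicate that faster inventory turnover is associated with higher profitability in the sample.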

Keywords: profitability, regression analysis, inventory management, working capital

Procedia PDF Downloads 336
3374 Eco-Nanofiltration Membranes: Nanofiltration Membrane Technology Utilization-Based Fiber Pineapple Leaves Waste as Solutions for Industrial Rubber Liquid Waste Processing and Fertilizer Crisis in Indonesia

Authors: Andi Setiawan, Annisa Ulfah Pristya

Abstract:

Indonesia's rubber plantation area reaches 2.9 million hectares, with productivity reaching 1.38 million. High rubber productivity is directly proportional to the amount of waste produced by the rubber processing industry, and waste that has not been treated optimally causes environmental pollution. Rubber industrial wastewater contains high levels of nitrogen compounds (nitrate and ammonia) and phosphate compounds, which cause water pollution and odor problems due to the high ammonia content. On the other hand, demand for NPK fertilizers in Indonesia continues to increase from year to year, and their production requires ammonia and phosphate as raw materials. Domestic demand amounts to 400,000 tons of ammonia a year, and Indonesia imports 200,000 tons of ammonia per year, valued at IDR 4.2 trillion. Similarly, phosphoric acid is lacking and must be imported from Jordan, Morocco, South Africa, the Philippines, and India, as much as 225 thousand tons per year. At present, rubber wastewater treatment is generally done by containing the waste in a tank, allowing it to settle, filtering it, and releasing the rest into the environment. However, this method is inefficient and incurs high energy costs, because many stages are required before clean water can be produced and discharged into the river. On the other hand, Indonesia has great pineapple-growing potential, and the fruit can be harvested throughout the year across the country. In 2010, pineapple production in Indonesia reached 1,406,445 tons, about 9.36 percent of total fruit production. Increased productivity is directly proportional to the amount of pineapple leaf waste, which is produced continuously and is usually just dumped in the ground or disposed of with other waste at the final disposal site. An eco-nanofiltration membrane based on pineapple leaf fiber waste could solve these environmental problems efficiently.
Nanofiltration is a pressure-driven process in which transport of each molecule can occur by convection or diffusion. Nanofiltration membranes can filter water at the nano scale so as to separate, from the processed waste, residues of economic value with higher N and P content, which can serve as raw material for the manufacture of NPK fertilizer and help overcome the fertilizer crisis in Indonesia. The raw material used to manufacture the eco-nanofiltration membrane is cellulose from pineapple fiber, processed into cellulose acetate, which is biodegradable and only requires a membrane change every six months. The expected outcome is a green eco-technology: nanofiltration membranes would not only treat rubber industry waste effectively, efficiently, and in an environmentally friendly way, but also lower the cost of waste treatment compared to conventional methods.

Keywords: biodegradable, cellulose diacetate, fertilizers, pineapple, rubber

Procedia PDF Downloads 449
3373 The Effect of Environmental, Social, and Governance (ESG) Disclosure on Firms’ Credit Rating and Capital Structure

Authors: Heba Abdelmotaal

Abstract:

This paper explores the impact of the extent of a company's environmental, social, and governance (ESG) disclosure on its credit rating and capital structure. The analysis is based on a sample of 202 of the FTSE 350 firms over the period 2008-2013. The ESG disclosure score is measured using the proprietary Bloomberg score, based on the extent of a company's ESG disclosure. The credit rating is measured by the QuiScore, a measure of the likelihood that a company will become bankrupt in the twelve months following the date of calculation. Capital structure is measured by the long-term debt ratio. Two hypotheses are tested using panel data regression. The results suggest that a higher degree of ESG disclosure leads to a better credit rating, and that there is a significant negative relationship between ESG disclosure and the long-term debt ratio. The paper discusses the implication that the transparency resulting from ESG disclosure could support the monitoring function: it increases transparency for the credit rating agencies and could also affect managers' actions. This study provides empirical evidence on the materiality of ESG disclosure for credit rating changes and firms' capital structure decisions.

Keywords: capital structure, credit rating agencies, ESG disclosure, panel data regression

Procedia PDF Downloads 360
3372 Electroencephalogram Based Approach for Mental Stress Detection during Gameplay with Level Prediction

Authors: Priyadarsini Samal, Rajesh Singla

Abstract:

Many mobile games provide entertainment but also introduce stress to the human brain. In recognizing this mental stress, the brain-computer interface (BCI) plays an important role, offering various neuroimaging approaches that help analyze brain signals. Electroencephalography (EEG) is the most commonly used method among them, as it is non-invasive, portable, and economical. This paper investigates the pattern in brain signals when mental stress is introduced. Two healthy volunteers played a game whose aim was to find hidden words in a grid, with levels chosen randomly. The EEG signals during gameplay were recorded to investigate the impact of stress as the levels changed from easy to medium to hard. A total of 16 EEG features were analyzed in this experiment, including power band features with relative powers and event-related desynchronization, along with statistical features. A support vector machine was used as the classifier, which resulted in an accuracy of 93.9% for three-level stress analysis; for two levels, accuracies of 92% and 98% were achieved. In addition, a similar game was played by the volunteers, and a suitable regression model was designed for prediction, where the feature sets of the first and second games were used for testing and training purposes, respectively; an accuracy of 73% was found.
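
One of the power-band features named above can be sketched as follows on a synthetic signal; the sampling rate, epoch length, and band edges are illustrative assumptions, not the paper's exact pipeline:

```python
import numpy as np

# Relative band power from a synthetic 4-second EEG epoch.
fs = 256                       # sampling rate in Hz (assumed)
t = np.arange(0, 4, 1 / fs)    # 4-second epoch
rng = np.random.default_rng(1)
# Synthetic EEG: strong alpha (10 Hz), weaker beta (20 Hz), plus noise
signal = (2.0 * np.sin(2*np.pi*10*t)
          + 0.5 * np.sin(2*np.pi*20*t)
          + 0.3 * rng.normal(size=t.size))

# Power spectrum via the real FFT
freqs = np.fft.rfftfreq(signal.size, 1 / fs)
power = np.abs(np.fft.rfft(signal)) ** 2

def band_power(lo, hi):
    """Total spectral power in the band [lo, hi) Hz."""
    mask = (freqs >= lo) & (freqs < hi)
    return power[mask].sum()

total = band_power(0.5, 45.0)
rel_alpha = band_power(8, 13) / total    # relative alpha power
rel_beta  = band_power(13, 30) / total   # relative beta power
```

Features like `rel_alpha` and `rel_beta`, computed per epoch and channel, are the kind of inputs fed to a classifier such as an SVM.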

Keywords: brain computer interface, electroencephalogram, regression model, stress, word search

Procedia PDF Downloads 188
3371 Cleaner Technology for Stone Crushers

Authors: S. M. Ahuja

Abstract:

There are about 12,000 stone crusher units in India, located in clusters around urban areas close to the stone quarries. These crushers create a lot of fugitive dust emissions and noise pollution, which is a major health hazard for the people working in the crushers and also for those living in their vicinity. Ambient air monitoring was carried out near various stone crushers, and it was observed that fugitive emissions varied from 300 to 8000 mg/Nm3. A number of stone crushers were studied thoroughly, their existing pollution control devices were examined, and the limitations of the existing technology were identified. A technology has been conceived consisting of minimal effective spray nozzles to reduce emissions at the source, followed by a containment-cum-control system with modular cyclones as the air pollution control device. In addition, a preliminary energy audit carried out in some of the stone crushers indicates substantial potential for energy saving.

Keywords: stone crushers, spray nozzles, energy audit

Procedia PDF Downloads 333
3370 Scour Depth Prediction around Bridge Piers Using Neuro-Fuzzy and Neural Network Approaches

Authors: H. Bonakdari, I. Ebtehaj

Abstract:

The prediction of scour depth around bridge piers is frequently considered in river engineering, and scour depth estimation around piers is a key aspect of efficient and optimum bridge structure design. In this study, scour depth around bridge piers is estimated using two methods, the Adaptive Neuro-Fuzzy Inference System (ANFIS) and the Artificial Neural Network (ANN). The parameters effective in scour depth prediction are determined via dimensional analysis, and scour depth is then predicted with the ANN and ANFIS methods. The methods' performances are compared with the nonlinear regression (NLR) method. The results show that both methods presented in this study outperform existing methods. Moreover, using the ratio of pier length to flow depth, the ratio of median particle diameter to flow depth, the ratio of pier width to flow depth, the Froude number, and the standard deviation of bed grain size as parameters leads to optimal performance in scour depth estimation.
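
The dimensionless inputs named above can be computed as follows; the numeric values are invented for the sketch, not the study's data:

```python
import math

# Illustrative flow and geometry values (assumed, SI units)
pier_length = 6.0    # pier length, m
pier_width = 1.2     # pier width, m
flow_depth = 3.0     # flow depth, m
velocity = 1.5       # mean flow velocity, m/s
d50 = 0.002          # median grain diameter, m
g = 9.81             # gravitational acceleration, m/s^2

froude = velocity / math.sqrt(g * flow_depth)  # Froude number
length_ratio = pier_length / flow_depth        # pier length / flow depth
width_ratio = pier_width / flow_depth          # pier width / flow depth
grain_ratio = d50 / flow_depth                 # d50 / flow depth
```

These ratios, together with the standard deviation of the bed grain size, form the input vector that the ANN or ANFIS model maps to a scour depth estimate.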

Keywords: adaptive neuro-fuzzy inference system (ANFIS), artificial neural network (ANN), bridge pier, scour depth, nonlinear regression (NLR)

Procedia PDF Downloads 221
3369 How Social Support, Interaction with Clients and Work-Family Conflict Contribute to Mental Well-Being for Employees in the Human Service System

Authors: Uwe C. Fischer

Abstract:

Mental health and well-being for employees working in the human service system are becoming more and more important, given the increasing rate of absenteeism at work. Besides individual capacities, social and community factors appear important in the working setting. Starting from a demand-resource framework that includes the classical demand-control aspects, the present study considered social support systems, the specific demands and resources of client work, and work-family conflict. We hypothesize that these factors have a meaningful association with the mental quality of life of employees working in the social, educational, and health sectors. 1140 employees working in human service organizations (education, youth care, nursing, etc.) were asked about strains and resources at work (selected scales from the Salutogenetic Subjective Work Assessment, SALSA, and new scales of our own for client work), work-family conflict, and mental quality of life from the German Short Form Health Survey. Considering the complex influences of the variables, we conducted a multiple hierarchical regression analysis. One third of the total variance of mental quality of life can be explained by the variables in the model. When the variables concerning social influences were included in the hierarchical regression, the influence of the work-related control resource decreased. Excessive workload, work-family conflict, social support from supervisors, co-workers, and persons outside work, as well as the strains and resources associated with client work, had significant regression coefficients. Conclusions: Social support systems are crucial for mental well-being in the social, educational, and health-related service sectors. The work-family conflict finding underlines the importance of work-life balance, and the specific strains and resources of client work, measured with newly constructed scales, showed great impact on mental health.
Occupational health promotion should therefore focus more on social factors within and outside the workplace.
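
The hierarchical (blockwise) regression logic described above can be sketched as follows; the variable names and simulated effects are assumptions for illustration, not the study's scales or estimates:

```python
import numpy as np

# Simulated data: fit a base demand-control model, then add a block of
# "social" predictors and inspect the change in R-squared.
rng = np.random.default_rng(2)
n = 300
workload = rng.normal(size=n)
control = rng.normal(size=n)
support = rng.normal(size=n)
wf_conflict = rng.normal(size=n)
wellbeing = (-0.3*workload + 0.2*control + 0.3*support
             - 0.4*wf_conflict + rng.normal(0, 1, n))

def r_squared(predictors, y):
    """R-squared of an OLS fit with intercept."""
    X = np.column_stack([np.ones(len(y)), predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

# Step 1: demand-control block only
r2_step1 = r_squared(np.column_stack([workload, control]), wellbeing)
# Step 2: add social support and work-family conflict
r2_step2 = r_squared(
    np.column_stack([workload, control, support, wf_conflict]), wellbeing)
delta_r2 = r2_step2 - r2_step1  # variance explained by the added block
```

The increment `delta_r2` is what a hierarchical analysis reports for each added block, here showing the contribution of the social variables beyond the base model.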

Keywords: client interaction, human service system, mental health, social support, work-family conflict

Procedia PDF Downloads 440
3368 Analytical Authentication of Butter Using Fourier Transform Infrared Spectroscopy Coupled with Chemometrics

Authors: M. Bodner, M. Scampicchio

Abstract:

Fourier Transform Infrared (FT-IR) spectroscopy coupled with chemometrics was used to distinguish between butter samples and non-butter samples, and quantification of the margarine content in adulterated butter samples was investigated. The fingerprinting region (1400-800 cm–1) was used to develop unsupervised pattern recognition (Principal Component Analysis, PCA), supervised modeling (Soft Independent Modelling by Class Analogy, SIMCA), classification (Partial Least Squares Discriminant Analysis, PLS-DA), and regression (Partial Least Squares Regression, PLS-R) models. PCA of the fingerprinting region shows a clustering of the two sample types. All samples were classified in their rightful class by the SIMCA approach; however, nine adulterated samples (between 1% and 30% w/w margarine) were assigned to both the butter class and the non-butter class. In the two-class PLS-DA model (R2 = 0.73; RMSEP, Root Mean Square Error of Prediction = 0.26% w/w), sensitivity was 71.4% and the Positive Predictive Value (PPV) 100%. Its threshold was calculated at 7% w/w margarine in adulterated butter samples. Finally, a PLS-R model (R2 = 0.84, RMSEP = 16.54%) was developed. PLS-DA proved a suitable classification tool and PLS-R a proper quantification approach. The results demonstrate that FT-IR spectroscopy combined with PLS-R can be used as a rapid, simple, and safe method to distinguish pure butter samples from adulterated ones and to determine the degree of margarine adulteration in butter samples.
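
The PCA clustering step described above reduces to an SVD on mean-centred spectra; the "spectra" below are synthetic two-class data, not FT-IR measurements:

```python
import numpy as np

# Synthetic "spectra": two classes with slightly different mean profiles.
rng = np.random.default_rng(3)
n_wavenumbers = 50
base = rng.normal(size=n_wavenumbers)           # shared baseline spectrum
shift = np.linspace(0, 1, n_wavenumbers)        # class-specific offset
class_a = base + 0.1 * rng.normal(size=(10, n_wavenumbers))
class_b = base + 2.0 * shift + 0.1 * rng.normal(size=(10, n_wavenumbers))
X = np.vstack([class_a, class_b])               # 20 samples x 50 variables

# PCA via SVD on the mean-centred data matrix
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                 # projections onto the first two PCs
explained = S**2 / np.sum(S**2)        # explained-variance ratios
```

A score plot of the first two components is what reveals the clustering of the two sample types; supervised methods such as SIMCA and PLS-DA then build class models on top of such projections.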

Keywords: adulterated butter, margarine, PCA, PLS-DA, PLS-R, SIMCA

Procedia PDF Downloads 147
3367 Investigating the Relationship between Emotional Intelligence and Self-Efficacy of Physical Education Teachers in Ilam Province

Authors: Ali Heyrani, Maryam Saidyousefi

Abstract:

The aim of the present study was to investigate the relationship between the emotional intelligence and self-efficacy of physical education teachers in Ilam province. The research method is descriptive-correlational. The participants were 170 physical education teachers (90 male, 80 female) aged 20 to 50 years, selected randomly. The instruments for data collection were the Bar-On Emotional Intelligence Questionnaire (1997), to assess the teachers' emotional intelligence, and a self-efficacy questionnaire to measure their self-efficacy; both questionnaires have been shown to be reliable and valid in domestic use. To analyze the data, descriptive statistics and inferential tests (Kolmogorov-Smirnov test, Pearson correlation, and multiple regression) were used at a significance level of p < 0.05. The results showed a significant positive relationship between total emotional intelligence and teachers' self-efficacy: the greater a physical education teacher's emotional intelligence, the greater the self-efficacy. Stepwise regression analysis further showed that among the components of emotional intelligence, three components, General Mood, Adaptability, and Interpersonal Communication, have a significant positive relationship with self-efficacy and are able to predict the self-efficacy of physical education teachers. Applying these results could help education authorities promote teachers' emotional intelligence and thereby improve their self-efficacy and their success in teaching and training learners.
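
Questionnaire studies like this one typically report Cronbach's alpha for scale reliability; the computation can be sketched as follows on an invented 12-respondent, 5-item score matrix (Likert 1-5, values illustrative only):

```python
import numpy as np

# Rows = respondents, columns = items of one hypothetical scale.
scores = np.array([
    [4, 4, 5, 4, 4],
    [3, 3, 3, 2, 3],
    [5, 5, 4, 5, 5],
    [2, 2, 2, 3, 2],
    [4, 5, 4, 4, 4],
    [3, 2, 3, 3, 3],
    [5, 4, 5, 5, 4],
    [1, 2, 1, 2, 2],
    [4, 4, 4, 3, 4],
    [2, 3, 2, 2, 3],
    [5, 5, 5, 4, 5],
    [3, 3, 4, 3, 3],
])

k = scores.shape[1]
item_vars = scores.var(axis=0, ddof=1)       # variance of each item
total_var = scores.sum(axis=1).var(ddof=1)   # variance of the total score
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance)
alpha = k / (k - 1) * (1 - item_vars.sum() / total_var)
```

Values of alpha above roughly 0.7 are conventionally read as acceptable internal consistency; the highly consistent toy data here yields a high alpha.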

Keywords: emotional intelligence, self-efficacy, physical education teachers, Ilam province

Procedia PDF Downloads 523
3366 Robust Inference with a Skew T Distribution

Authors: M. Qamarul Islam, Ergun Dogan, Mehmet Yazici

Abstract:

There is a growing body of evidence that non-normal data is more prevalent in nature than the normal one. Examples can be quoted from, but not restricted to, the areas of Economics, Finance and Actuarial Science. The non-normality considered here is expressed in terms of fat-tailedness and asymmetry of the relevant distribution. In this study a skew t distribution that can be used to model a data that exhibit inherent non-normal behavior is considered. This distribution has tails fatter than a normal distribution and it also exhibits skewness. Although maximum likelihood estimates can be obtained by solving iteratively the likelihood equations that are non-linear in form, this can be problematic in terms of convergence and in many other respects as well. Therefore, it is preferred to use the method of modified maximum likelihood in which the likelihood estimates are derived by expressing the intractable non-linear likelihood equations in terms of standardized ordered variates and replacing the intractable terms by their linear approximations obtained from the first two terms of a Taylor series expansion about the quantiles of the distribution. These estimates, called modified maximum likelihood estimates, are obtained in closed form. Hence, they are easy to compute and to manipulate analytically. In fact the modified maximum likelihood estimates are equivalent to maximum likelihood estimates, asymptotically. Even in small samples the modified maximum likelihood estimates are found to be approximately the same as maximum likelihood estimates that are obtained iteratively. It is shown in this study that the modified maximum likelihood estimates are not only unbiased but substantially more efficient than the commonly used moment estimates or the least square estimates that are known to be biased and inefficient in such cases. 
Furthermore, in conventional regression analysis it is assumed that the error terms are normally distributed, and hence the well-known least squares method is considered a suitable and preferred method for making the relevant statistical inferences. However, a number of empirical studies have shown that non-normal errors are more prevalent, and even transforming and/or filtering techniques may not produce normally distributed residuals. Here, multiple linear regression models with random errors following a non-normal pattern are studied. Through an extensive simulation it is shown that the modified maximum likelihood estimates of the regression parameters are plausibly robust to the distributional assumptions and to various data anomalies, as compared to the widely used least squares estimates. Relevant tests of hypotheses are developed and explored for desirable properties in terms of their size and power. The tests based upon modified maximum likelihood estimates are found to be substantially more powerful than the tests based upon least squares estimates. Several examples are provided from the areas of Economics and Finance where such distributions are interpretable in terms of the efficient market hypothesis with respect to asset pricing, portfolio selection, risk measurement, capital allocation, etc.

Keywords: least square estimates, linear regression, maximum likelihood estimates, modified maximum likelihood method, non-normality, robustness

Procedia PDF Downloads 397
3365 The Use of Boosted Multivariate Trees in Medical Decision-Making for Repeated Measurements

Authors: Ebru Turgal, Beyza Doganay Erdogan

Abstract:

Machine learning aims to model the relationship between a response and features. Medical decision-making researchers would like to make decisions about patients' course and treatment by examining repeated measurements over time, and the boosting approach is now being used in machine learning as an influential tool for these aims. The aim of this study is to demonstrate the use of multivariate tree boosting in this field; the main reason for utilizing this approach in decision-making is the ease with which it handles complex relationships. To show how multivariate tree boosting can be used to identify important features and feature-time interactions, we used data collected retrospectively from the records of the Ankara University Chest Diseases Department. The dataset includes repeated PF ratio measurements, with a planned follow-up time of 120 hours. A set of different models was tested. In conclusion, classification with a weighted combination of classifiers is a reliable method, as has been shown repeatedly in simulations. Furthermore, when time-varying variables are taken into consideration within this framework, it becomes possible to make accurate decisions about regression and survival problems.
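
The core boosting idea, repeatedly fitting weak learners to the current residuals, can be sketched with depth-1 regression trees (stumps) and squared-error loss; this is a minimal illustration of the principle, not the multivariate tree boosting package the study used:

```python
import numpy as np

# Toy 1-D regression problem: noisy sine curve.
rng = np.random.default_rng(4)
x = np.sort(rng.uniform(0, 10, 200))
y = np.sin(x) + 0.2 * rng.normal(size=x.size)

def fit_stump(x, residual):
    """Best single split (threshold) minimising squared error of the residual."""
    best = None
    for s in np.unique(x)[1:]:
        left, right = residual[x < s], residual[x >= s]
        sse = ((left - left.mean())**2).sum() + ((right - right.mean())**2).sum()
        if best is None or sse < best[0]:
            best = (sse, s, left.mean(), right.mean())
    _, s, lval, rval = best
    return lambda q: np.where(q < s, lval, rval)

# Boosting loop: each stump corrects what the ensemble still gets wrong.
pred = np.zeros_like(y)
learning_rate = 0.5
for _ in range(50):
    stump = fit_stump(x, y - pred)
    pred = pred + learning_rate * stump(x)

mse = np.mean((y - pred) ** 2)
```

Multivariate tree boosting extends this scheme to multiple correlated responses and richer trees, which is what allows feature-time interactions in repeated-measurements data to be captured.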

Keywords: boosted multivariate trees, longitudinal data, multivariate regression tree, panel data

Procedia PDF Downloads 203
3364 Loan Repayment Prediction Using Machine Learning: Model Development, Django Web Integration and Cloud Deployment

Authors: Seun Mayowa Sunday

Abstract:

Loan prediction is one of the most significant and recognised fields of research in the banking, insurance, and financial security industries. Some prediction systems on the market are built as static software; however, because static software only operates under strictly regulated rules, it cannot aid customers beyond those limitations, and machine learning (ML) techniques are required for loan prediction. Four separate machine learning models, random forest (RF), decision tree (DT), k-nearest neighbour (KNN), and logistic regression, are used to create the loan prediction model. Using the Anaconda Navigator and the required ML libraries, the models are created and evaluated with the appropriate metrics. The random forest performs with the highest accuracy, 80.17%, and was therefore implemented in the Django framework. For real-time testing, the web application is deployed on Alibaba Cloud, which is among the four biggest cloud computing providers. To the best of our knowledge, this research is the first academic paper to combine model development and the Django framework with deployment to the Alibaba cloud computing platform.
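
One of the four models above, logistic regression, can be sketched from scratch as follows; the two features ("income" and "debt ratio") and the data are invented for illustration, not the study's dataset:

```python
import numpy as np

# Simulated loan data: higher income and lower debt ratio -> more
# likely repaid. Feature names and effects are assumptions.
rng = np.random.default_rng(5)
n = 500
income = rng.normal(0, 1, n)
debt_ratio = rng.normal(0, 1, n)
logits = 2.0 * income - 2.0 * debt_ratio
repaid = (rng.uniform(size=n) < 1 / (1 + np.exp(-logits))).astype(float)

# Logistic regression trained by batch gradient descent on log-loss
X = np.column_stack([np.ones(n), income, debt_ratio])
w = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))           # predicted repayment probability
    w -= 0.1 * (X.T @ (p - repaid)) / n    # gradient of mean log-loss

# Classify at probability 0.5 (logit 0) and measure training accuracy
accuracy = np.mean(((X @ w) > 0) == (repaid == 1))
```

In practice a library implementation (e.g. from a standard ML toolkit) would be used, with accuracy evaluated on a held-out test split rather than the training set as here.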

Keywords: k-nearest neighbor, random forest, logistic regression, decision tree, django, cloud computing, alibaba cloud

Procedia PDF Downloads 137
3363 Monitoring Blood Pressure Using Regression Techniques

Authors: Qasem Qananwah, Ahmad Dagamseh, Hiam AlQuran, Khalid Shaker Ibrahim

Abstract:

Blood pressure gives physicians deep insight into the cardiovascular system, and determining an individual's blood pressure is a standard clinical procedure for cardiovascular problems. Conventional techniques to measure blood pressure (e.g., the cuff method) allow only a limited number of readings over a given period (e.g., every 5-10 minutes). Additionally, these systems disturb blood flow, impeding continuous blood pressure monitoring, especially in emergency cases or for critically ill persons. In this paper, the most important statistical features of the photoplethysmogram (PPG) signal were extracted to estimate blood pressure noninvasively. PPG signals from more than 40 subjects were measured and analyzed, and 12 features were extracted. The features were fed to principal component analysis (PCA) to find the most important independent features with the highest correlation with blood pressure. The results show that the mean of the stiffness index and the standard deviation of the beat-to-beat heart rate were the most important features. A model representing both features for Systolic Blood Pressure (SBP) and Diastolic Blood Pressure (DBP) was obtained using a statistical regression technique, with surface fitting used to best fit the series of data. The results show an error of 4.95% in estimating the SBP and 3.99% in estimating the DBP.
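
The surface-fitting step can be sketched as a least-squares fit of SBP as a quadratic surface of the two selected features; all values and generating coefficients below are invented for illustration, not the study's measurements:

```python
import numpy as np

# Simulated feature values for 40 subjects (units assumed).
rng = np.random.default_rng(6)
n = 40
stiffness = rng.uniform(5, 12, n)   # stiffness index
hr_sd = rng.uniform(2, 10, n)       # SD of beat-to-beat heart rate
sbp = (80 + 4.0*stiffness - 1.5*hr_sd
       + 0.1*stiffness*hr_sd + rng.normal(0, 2, n))

# Design matrix for a quadratic surface z = f(x, y)
A = np.column_stack([np.ones(n), stiffness, hr_sd,
                     stiffness**2, hr_sd**2, stiffness*hr_sd])
coef, *_ = np.linalg.lstsq(A, sbp, rcond=None)
fitted = A @ coef

# Mean absolute percentage error of the fit, as in the reported
# percentage errors for SBP and DBP
mape = np.mean(np.abs((sbp - fitted) / sbp)) * 100
```

The same design works for DBP with its own coefficient vector; reporting the percentage error of the fitted surface mirrors the 4.95% / 3.99% figures above.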

Keywords: blood pressure, noninvasive optical system, principal component analysis, PCA, continuous monitoring

Procedia PDF Downloads 161
3362 Development of PCI Prediction Models for Distress Evaluation of Asphalt Pavements

Authors: Hamid Noori

Abstract:

A scientific approach is essential for evaluating pavement surface conditions at the network level. The Pavement Condition Index (PCI) is widely used to assess surface conditions and determine appropriate treatments. This study examines three national highways using a network survey vehicle to collect distress data. The first two corridors were used for evaluation and comparison, while the third corridor validated the predicted PCI values. Multiple linear regression (MLR) was initially used to model the relationship between PCI and the distress variables but showed poor predictive accuracy; therefore, k-nearest neighbors (KNN) and artificial neural network (ANN) models were developed, which provided better results. A methodology for prioritizing pavement sections was introduced, in which sections are ranked on PCI, IRI, and rut values through Combined Index Rankings (CIR). In addition, a methodology is proposed for selecting the appropriate treatment for the ranked candidate pavement sections: the treatment selection process considers PCI, IRI, rutting, and FWD test results, in line with a customized PCI rating scale, and a decision tree was developed to recommend suitable treatments based on these criteria.
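
The KNN model above amounts to predicting PCI by averaging the k most similar surveyed sections; the distress features and data below are invented for the sketch, not the study's survey data:

```python
import numpy as np

# Simulated training sections: PCI decreases with cracking and rutting.
rng = np.random.default_rng(7)
n = 120
cracking = rng.uniform(0, 30, n)   # % area cracked (assumed feature)
rutting = rng.uniform(0, 20, n)    # rut depth, mm (assumed feature)
pci = 100 - 1.5*cracking - 1.0*rutting + rng.normal(0, 3, n)

X_train = np.column_stack([cracking, rutting])
y_train = pci

def knn_predict(query, k=5):
    """Average PCI of the k nearest training sections (Euclidean distance)."""
    d = np.linalg.norm(X_train - query, axis=1)
    nearest = np.argsort(d)[:k]
    return y_train[nearest].mean()

estimate = knn_predict(np.array([10.0, 5.0]), k=5)
```

In a real pipeline the features would be standardized first so that no single distress dominates the distance, and k would be tuned on the validation corridor.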

Keywords: pavement distresses, pavement condition index, multiple linear regression, artificial neural network, k-nearest neighbors, combined index ranking

Procedia PDF Downloads 0
3361 VISMA: A Method for System Analysis in Early Lifecycle Phases

Authors: Walter Sebron, Hans Tschürtz, Peter Krebs

Abstract:

The choice of applicable analysis methods in safety or systems engineering depends on the depth of knowledge about a system and on the respective lifecycle phase. However, the analysis method chain still shows gaps, as it should support system analysis throughout a system's lifecycle, from a rough concept in the pre-project phase until end-of-life. This paper's goal is to discuss an analysis method, the VISSE Shell Model Analysis (VISMA) method, which aims at closing the gap in the early system lifecycle phases, such as the conceptual, pre-project, or project start phase. It was originally developed to aid in defining the system boundary of electronic system parts, such as a control unit for a pump motor, but it can also be applied to non-electronic system parts. The VISMA method is a graphical, sketch-like method that stratifies a system and its parts into inner and outer shells, like the layers of an onion. It analyses a system in a two-step approach, from the innermost to the outermost components, followed by the reverse direction. To ensure a complete view of a system and its environment, the VISMA should be performed by (multifunctional) development teams. To introduce the method, a set of rules and guidelines has been defined to enable a proper shell build-up. In the first step, the innermost system, named the system under consideration (SUC), is selected as the focus of the subsequent analysis. Then its directly adjacent components, responsible for providing input to and receiving output from the SUC, are identified; these components form the first shell around the SUC. Next, the input and output components of the components in the first shell are identified and form the second shell around the first one. Continuing this way, shell after shell is added, with its respective parts, until the external border of the complete system is reached.
Last, two external shells, the environment shell and the use case shell, are added to complete the system view, which is also stored for future use. In the second step, the shells are examined in the reverse direction (outside to inside) in order to remove superfluous components or subsystems. Input chains to the SUC and output chains from the SUC are described graphically via arrows, to highlight functional chains through the system. As a result, this method offers a clear graphical description and overview of a system, its main parts, and its environment, while the focus remains on a specific SUC. It helps to identify the interfaces and interfacing components of the SUC as well as important external interfaces of the overall system, and it supports the identification of the first internal and external hazard causes and causal chains. Additionally, the method promotes a holistic picture and cross-functional understanding of a system, its contributing parts, internal relationships, and possible dangers within a multidisciplinary development team.

Keywords: analysis methods, functional safety, hazard identification, system and safety engineering, system boundary definition, system safety

Procedia PDF Downloads 226
3360 Predictive Analysis of the Stock Price Market Trends with Deep Learning

Authors: Suraj Mehrotra

Abstract:

The stock market is a volatile, bustling marketplace that is a cornerstone of economics. It signals whether companies are thriving or in a downward spiral, and a thorough understanding of it is important: many companies have whole divisions dedicated to the analysis of both their own stock and that of rival companies. Linking the world of finance, especially the stock market, with artificial intelligence (AI) is a relatively recent development. Predicting how stocks will perform, considering all external factors and previous data, has traditionally been a human task; with the help of AI, machine learning models can produce more complete predictions of financial trends. For the stock market specifically, predicting the open, closing, high, and low prices for the next day is very hard, and machine learning makes this task considerably easier. A model that builds upon itself and takes in external factors as weights can predict trends far into the future. Used effectively, it can open new doors in the business and finance world and help companies make better, more complete decisions. This paper explores the various techniques used in the prediction of stock prices, from traditional statistical methods to deep learning and neural-network-based approaches, and provides a detailed analysis of these techniques as well as the challenges in predictive analysis. Comparing testing-set accuracy for four models (linear regression, neural network, decision tree, and naïve Bayes) on several stocks (Apple, Google, Tesla, Amazon, United Healthcare, Exxon Mobil, J.P. Morgan Chase, and Johnson & Johnson), the naïve Bayes and linear regression models worked best. On the testing set, the naïve Bayes model had the highest accuracy along with the linear regression model, followed by the neural network model and then the decision tree model.
The training set showed similar results, except that the decision tree model was perfect, with complete accuracy in its predictions. This indicates that the decision tree model likely overfitted the training set, which explains its weaker performance on the testing set.
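The overfitting pattern described above can be reproduced in a small synthetic experiment: a model that memorizes the training data (as a fully grown decision tree effectively does; a 1-nearest-neighbour lookup is used here as a stand-in) is perfect on the training set but degrades on the test set, while a simple linear model generalizes. All data below are synthetic, not the stock data from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic next-day "up/down" labels: a linear signal plus label noise.
X_train = rng.normal(size=(200, 3))
X_test = rng.normal(size=(100, 3))
w_true = np.array([1.0, -0.5, 0.25])
y_train = (X_train @ w_true + rng.normal(scale=0.5, size=200) > 0).astype(int)
y_test = (X_test @ w_true + rng.normal(scale=0.5, size=100) > 0).astype(int)

def linear_model_acc(Xtr, ytr, X, y):
    # Least-squares fit, thresholded at 0.5 (linear regression as classifier).
    w, *_ = np.linalg.lstsq(np.c_[Xtr, np.ones(len(Xtr))], ytr, rcond=None)
    pred = (np.c_[X, np.ones(len(X))] @ w > 0.5).astype(int)
    return (pred == y).mean()

def memorizer_acc(Xtr, ytr, X, y):
    # 1-nearest-neighbour lookup: memorizes the training set, much like
    # an unpruned decision tree.
    pred = np.array([ytr[np.argmin(((Xtr - x) ** 2).sum(axis=1))] for x in X])
    return (pred == y).mean()

train_acc_tree = memorizer_acc(X_train, y_train, X_train, y_train)  # perfect
test_acc_tree = memorizer_acc(X_train, y_train, X_test, y_test)     # degrades
train_acc_lin = linear_model_acc(X_train, y_train, X_train, y_train)
test_acc_lin = linear_model_acc(X_train, y_train, X_test, y_test)   # holds up
```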

Keywords: machine learning, testing set, artificial intelligence, stock analysis

Procedia PDF Downloads 96
3359 Risk and Uncertainty in Aviation: A Thorough Analysis of System Vulnerabilities

Authors: C. V. Pietreanu, S. E. Zaharia, C. Dinu

Abstract:

Hazard assessment and risk quantification are key components for estimating the impact of existing regulations. Since regulatory compliance cannot cover all risks in aviation, the authors point out that an accurate analysis can be outlined by studying causal factors and eliminating uncertainty. The research begins by delimiting the relevant notions, as confusion over these terms has, over time, resulted in less rigorous analyses. Throughout this paper, it is emphasized that variation in human performance and organizational factors represents the biggest threat from an operational perspective. The advanced risk assessment methods analyzed by the authors therefore aim to understand the vulnerabilities of a system exhibiting nonlinear behavior. Ultimately, mathematically modeling existing hazards and risks while eliminating uncertainty implies establishing an optimal solution, i.e. risk minimization.

Keywords: control, human factor, optimization, risk management, uncertainty

Procedia PDF Downloads 249
3358 Optimization of Hemp Fiber Reinforced Concrete for Various Environmental Conditions

Authors: Zoe Chang, Max Williams, Gautham Das

Abstract:

The purpose of this study is to evaluate the incorporation of hemp fibers (HF) in concrete. Hemp fiber reinforced concrete (HFRC) is becoming more popular as an alternative to regular mix designs. This study evaluated the compressive strength of HFRC with respect to the mix procedure. Hemp fibers were obtained from the manufacturer and hand-processed to ensure uniformity in width and length. The fibers were added to the concrete as both wet and dry mixes to investigate and optimize the mix design process. Results indicated that the dry mix had a compressive strength of 1157 psi, compared to 985 psi for the wet mix. The dry mix compressive strength was within range of the standard mix compressive strength of 1533 psi. The statistical analysis revealed that the mix design process needs further optimization and uniformity concerning the addition of HF. Regression analysis gave the standard mix design a coefficient of 0.9, compared to 0.375 for the dry mix, indicating variation in the mixing process. During the dry mix, the addition of plain hemp fibers caused them to intertwine, creating lumps and inconsistency. During the wet mixing process, however, combining water and hemp fibers before incorporation allowed the fibers to disperse uniformly within the mix; hence the regression analysis indicated a better coefficient of 0.55. This study concludes that HFRC is a viable alternative to regular mixes; however, more research on its characteristics needs to be conducted.
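The regression coefficient reported above is the coefficient of determination (R-squared) of strength against a predictor; a lower value for an inconsistent mix reflects a noisier mixing process. A sketch of the computation on hypothetical strength readings (the study's raw data are not reproduced here):

```python
import numpy as np

def r_squared(y, y_pred):
    """Coefficient of determination: fraction of variance explained."""
    ss_res = np.sum((y - y_pred) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1 - ss_res / ss_tot

def fit_r2(x, y):
    """R-squared of a least-squares line fitted to (x, y)."""
    slope, intercept = np.polyfit(x, y, 1)
    return r_squared(y, slope * x + intercept)

# Hypothetical compressive-strength readings (psi) across curing days;
# the dry-HF series is deliberately noisy to mimic the lumping problem.
days = np.array([7, 14, 21, 28], dtype=float)
standard = np.array([1100, 1300, 1450, 1533], dtype=float)  # consistent mix
dry_hf = np.array([900, 1200, 950, 1157], dtype=float)      # inconsistent mix

r2_standard = fit_r2(days, standard)
r2_dry = fit_r2(days, dry_hf)
```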

Keywords: hemp fibers, hemp reinforced concrete, wet & dry, freeze thaw testing, compressive strength

Procedia PDF Downloads 200
3357 River's Bed Level Changing Pattern Due to Sedimentation, Case Study: Gash River, Kassala, Sudan

Authors: Faisal Ali, Hasssan Saad Mohammed Hilmi, Mustafa Mohamed, Shamseddin Musa

Abstract:

The Gash River is an ephemeral river that usually flows from July to September. It has a braided pattern and a high sediment content, with 15,200 ppm in suspension and 360 kg/sec as bed load, and its bed has an average slope of 1.3 m/km. The objectives of this study were: assessing the Gash River bed level patterns; quantifying the annual variations in the Gash bed level; and recommending a suitable method to reduce sediment accumulation on the Gash River bed. The study covered the period 1905-2013, using datasets that included the Gash River flows and cross sections. The results showed an increasing trend in the river bed level of about 5 cm per year. This has changed the flood-routing behavior and consequently increased the flood hazard in Kassala city tremendously.
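The bed-level trend quoted above is the slope of a least-squares line fitted to bed-level surveys over time. A sketch with hypothetical survey values chosen to match the reported rate of roughly 5 cm per year (not the actual Gash cross-section data):

```python
import numpy as np

# Hypothetical annual bed-level surveys (metres above a local datum);
# a built-in rise of 0.05 m/year mimics the trend reported in the study.
years = np.arange(1990, 2014, dtype=float)
rng = np.random.default_rng(1)
bed_level = (500.0 + 0.05 * (years - years[0])
             + rng.normal(scale=0.02, size=years.size))  # survey noise

# Least-squares linear trend: the slope is the annual change in bed level.
slope_m_per_year, intercept = np.polyfit(years, bed_level, 1)
trend_cm_per_year = slope_m_per_year * 100.0
```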

Keywords: bed level, cross section, gash river, sedimentation

Procedia PDF Downloads 543
3356 Impact Factor Analysis for Spatially Varying Aerosol Optical Depth in Wuhan Agglomeration

Authors: Wenting Zhang, Shishi Liu, Peihong Fu

Abstract:

As an indicator of air quality directly related to the concentration of ground-level PM2.5, the spatio-temporal variation of Aerosol Optical Depth (AOD) and the analysis of its impact factors have been a hot spot in air pollution research. This paper addresses the non-stationarity and autocorrelation (Moran’s I index of 0.75) of AOD in the Wuhan agglomeration (WHA) in central China, using geographically weighted regression (GWR) to identify the spatial relationship between AOD and its impact factors. The 3 km AOD product of the Moderate Resolution Imaging Spectroradiometer (MODIS) is used in this study. Beyond socio-economic factors, land-use density, vegetation cover, and elevation, a landscape metric is also considered as a factor. The results suggest that the GWR model is capable of dealing with spatially varying relationships, with R-squared, corrected Akaike Information Criterion (AICc), and standardized residuals better than those of the ordinary least squares (OLS) model. The GWR results suggest that urban development, forest cover, the landscape metric, and elevation are the major driving factors of AOD. Generally, higher AOD tends to occur in places with more intense urban development, less forest, and flat terrain.
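At its core, GWR fits a separate weighted least-squares regression at each location, with weights from a distance-decay kernel, so the coefficients can vary across space. A minimal hand-rolled sketch on synthetic data (not the MODIS AOD data), where the effect of an "urban density" covariate strengthens from west to east:

```python
import numpy as np

def gwr_coefficients(coords, X, y, bandwidth):
    """Minimal geographically weighted regression: at each location a
    weighted least-squares fit, with Gaussian kernel weights based on
    distance to that location. Returns one coefficient vector per site."""
    Xd = np.c_[np.ones(len(X)), X]          # add intercept column
    betas = np.empty((len(coords), Xd.shape[1]))
    for i, c in enumerate(coords):
        d2 = ((coords - c) ** 2).sum(axis=1)
        w = np.exp(-d2 / (2 * bandwidth ** 2))   # Gaussian distance decay
        W = np.diag(w)
        betas[i] = np.linalg.solve(Xd.T @ W @ Xd, Xd.T @ W @ y)
    return betas

# Synthetic AOD surface: the effect of "urban density" strengthens
# from west (x=0) to east (x=10).
rng = np.random.default_rng(2)
coords = rng.uniform(0, 10, size=(150, 2))
urban = rng.normal(size=150)
local_slope = 0.2 + 0.1 * coords[:, 0]       # spatially varying coefficient
aod = 0.5 + local_slope * urban + rng.normal(scale=0.01, size=150)

betas = gwr_coefficients(coords, np.c_[urban], aod, bandwidth=1.5)
west = betas[coords[:, 0] < 3, 1].mean()
east = betas[coords[:, 0] > 7, 1].mean()
```

A global OLS fit would report a single averaged slope; the per-location GWR coefficients recover the west-to-east gradient.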

Keywords: aerosol optical depth, geographically weighted regression, land use change, Wuhan agglomeration

Procedia PDF Downloads 357
3355 Fear of Negative Evaluation, Social Support and Wellbeing in People with Vitiligo

Authors: Rafia Rafique, Mutmina Zainab

Abstract:

The present study investigated the relationship between fear of negative evaluation (FNE), social support, and well-being in people with vitiligo. It was hypothesized that a low level of FNE and greater social support would predict well-being, and that social support would moderate the relationship between FNE and well-being. A correlational research design was used, and a non-probability purposive sampling technique was used to collect a sample (N=122) of people with vitiligo. Hierarchical moderated regression analysis was used to test prediction and moderation. The Brief Fear of Negative Evaluation Scale, the Multidimensional Scale of Perceived Social Support (MSPSS), and the Mental Health Continuum-Short Form (MHC-SF) were used to evaluate the study variables. Fear of negative evaluation negatively predicted well-being (emotional and psychological). Social support from significant others and friends predicted social well-being, and social support from family predicted emotional and psychological well-being. Social support from significant others moderated the relationship between FNE and emotional well-being, and social support from family moderated the relationship between FNE and social well-being. Dermatologists treating people with vitiligo need to educate them and their families about the buffering role of social support (family and significant others). Future studies should focus on other important mediating factors that could explain the relationship between fear of negative evaluation and well-being.
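Hierarchical moderated regression tests moderation by adding a product (interaction) term after the main effects. A sketch on synthetic data shaped like the design above (n = 122 to match the sample size; all coefficients are invented, not the study's estimates):

```python
import numpy as np

def ols(X, y):
    """Ordinary least squares with an intercept; returns coefficients."""
    Xd = np.c_[np.ones(len(X)), X]
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    return beta

# Synthetic data: social support buffers (moderates) the negative
# effect of FNE on well-being via a positive interaction term.
rng = np.random.default_rng(3)
n = 122
fne = rng.normal(size=n)
support = rng.normal(size=n)
wellbeing = (5 - 0.6 * fne + 0.4 * support
             + 0.3 * fne * support
             + rng.normal(scale=0.5, size=n))

# Hierarchical step: main effects first, then add the interaction.
X_main = np.c_[fne, support]
X_moderated = np.c_[fne, support, fne * support]
beta = ols(X_moderated, wellbeing)
# beta = [intercept, b_fne, b_support, b_interaction]; a significant
# b_interaction is the evidence for moderation.
```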

Keywords: fear of negative evaluation, hierarchical moderated regression, vitiligo, well-being

Procedia PDF Downloads 303
3354 Survival Analysis after a First Ischaemic Stroke Event: A Case-Control Study in the Adult Population of England

Authors: Padma Chutoo, Elena Kulinskaya, Ilyas Bakbergenuly, Nicholas Steel, Dmitri Pchejetski

Abstract:

Stroke is associated with a significant risk of morbidity and mortality. There is a scarcity of research on long-term survival after first-ever ischaemic stroke (IS) events in England with regard to the effects of different medical therapies and comorbidities. The objective of this study was to model all-cause mortality after an IS diagnosis in the adult population of England. Using a retrospective case-control design, we extracted the electronic medical records of patients born in or before 1960 in England with a first-ever ischaemic stroke diagnosis from January 1986 to January 2017 within The Health Improvement Network (THIN) database. Participants with a history of ischaemic stroke were matched to 3 controls by sex, age at diagnosis, and general practice. The primary outcome was all-cause mortality. The hazards of all-cause mortality were estimated using a Weibull-Cox survival model, which included both scale and shape effects and a shared random effect of general practice. The model included sex, birth cohort, socio-economic status, comorbidities, and medical therapies. 20,250 patients with a history of IS (cases) and 55,519 controls were followed for up to 30 years. From 2008 to 2015, the one-year all-cause mortality for IS patients declined by an absolute 0.5%. Prescriptions of preventive treatments to cases, including statins and antihypertensives, increased considerably over time. However, prescriptions for antiplatelet drugs decreased in routine general practice from 2010 onwards. The survival model revealed a survival benefit of antiplatelet treatment for stroke survivors, with a hazard ratio (HR) of 0.92 (0.90 – 0.94). IS diagnosis had significant interactions with gender, age at entry, and hypertension diagnosis. IS diagnosis was associated with a high risk of all-cause mortality, with HR = 3.39 (3.05-3.72) for cases compared to controls.
Hypertension was associated with poor survival, with HR = 4.79 (4.49 - 5.09) for hypertensive cases relative to non-hypertensive controls, though the detrimental effect of hypertension did not reach significance for hypertensive controls, HR = 1.19 (0.82 - 1.56). This study of English primary care data showed that between 2008 and 2015, the rates of prescription of stroke preventive treatments increased and short-term all-cause mortality after IS declined. However, stroke still resulted in poor long-term survival. Hypertension, a modifiable risk factor, was found to be associated with poor survival outcomes in IS patients, while antiplatelet drugs were found to be protective. Better efforts are required to reduce the burden of stroke through health service development and primary prevention.
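A Weibull-Cox model with scale and shape effects is too much for a short sketch, but the headline quantity, a hazard ratio, can be illustrated with the simpler exponential survival model, where the maximum-likelihood hazard is the number of events divided by the total follow-up time. The numbers below are simulated (and censoring is ignored for brevity); they merely mimic a case-versus-control contrast:

```python
import numpy as np

def exponential_hazard(events, person_years):
    """MLE of a constant hazard under the exponential survival model."""
    return events / person_years

# Simulated follow-up (not the study's data): cases die at roughly
# 3.4 times the rate of matched controls.
rng = np.random.default_rng(4)
true_hr = 3.4
h_control = 0.02                       # deaths per person-year
t_control = rng.exponential(1 / h_control, size=5000)
t_case = rng.exponential(1 / (h_control * true_hr), size=5000)

# Hazard ratio: case hazard over control hazard.
hr = (exponential_hazard(len(t_case), t_case.sum())
      / exponential_hazard(len(t_control), t_control.sum()))
```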

Keywords: general practice, hazard ratio, The Health Improvement Network (THIN), ischaemic stroke, multiple imputation, Weibull-Cox model

Procedia PDF Downloads 187
3353 Transformational Justice for Employees' Job Satisfaction

Authors: Hassan Barau Singhry

Abstract:

Purpose: Leadership, or the absence of it, is an important behaviour affecting employees’ job satisfaction. Although there are many models of leadership, one that stands out in a period of change is transformational leadership. The aim of this study is to investigate the role of organizational justice in the relationship between transformational leadership and employee job satisfaction. The study is based on the assumption that change begins with leaders, and that leaders should be fair and just. Methodology: A cross-sectional survey using a structured questionnaire was employed to collect the data. The population was drawn from the three tiers of government in Nigeria: local, state, and federal. Stratified random sampling was used, and 418 middle managers of public organizations responded to the questionnaire. Multiple regression aided by structural equation modeling was employed to test four hypothesized relationships. Findings: The regression results support the mediating role of organizational justice (distributive, procedural, interpersonal, and informational justice) in the link between transformational leadership and job satisfaction. Originality/value: This study adds to the human resource management literature by empirically validating and integrating transformational leadership behaviour with the four dimensions of organizational justice theory. The study is expected to be beneficial to top- and middle-level administrators as well as to theory building and testing.
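The mediation test behind findings like these is commonly run as a product-of-coefficients analysis: the indirect effect is the path from leadership to justice multiplied by the path from justice to satisfaction. A sketch on synthetic data (n = 418 to match the sample size; effect sizes are invented):

```python
import numpy as np

def ols(X, y):
    """Ordinary least squares with an intercept; returns coefficients."""
    Xd = np.c_[np.ones(len(X)), X]
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    return beta

# Synthetic data: transformational leadership (TL) raises perceived
# justice, which in turn raises job satisfaction.
rng = np.random.default_rng(5)
n = 418
tl = rng.normal(size=n)
justice = 0.5 * tl + rng.normal(scale=0.8, size=n)            # a-path
satisfaction = (0.2 * tl + 0.6 * justice
                + rng.normal(scale=0.8, size=n))

a = ols(np.c_[tl], justice)[1]                  # TL -> justice
b = ols(np.c_[tl, justice], satisfaction)[2]    # justice -> satisfaction | TL
direct = ols(np.c_[tl, justice], satisfaction)[1]
c_total = ols(np.c_[tl], satisfaction)[1]       # total effect
indirect = a * b                                # mediated effect
```

For OLS with a single mediator, the total effect equals the direct effect plus the indirect effect exactly.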

Keywords: distributive justice, job satisfaction, organizational justice, procedural justice, transformational leadership

Procedia PDF Downloads 175
3352 Application of Groundwater Level Data Mining in Aquifer Identification

Authors: Liang Cheng Chang, Wei Ju Huang, You Cheng Chen

Abstract:

Investigation and research are key to the conjunctive use of surface and groundwater resources. The hydrogeological structure is an important basis for groundwater analysis and simulation. Traditionally, the hydrogeological structure is determined manually based on geological drill logs, the structure of wells, groundwater levels, and so on. In Taiwan, a groundwater observation network has been built, and a large amount of groundwater-level observation data is available. The groundwater level is the state variable of the groundwater system: it reflects the system response, combining the hydrogeological structure with groundwater injection and extraction. This study applies analytical tools to the observation database to develop a methodology for the identification of confined and unconfined aquifers. These tools include frequency analysis, cross-correlation analysis between rainfall and groundwater level, groundwater regression curve analysis, and a decision tree. The developed methodology is then applied to groundwater layer identification in two groundwater systems: the Zhuoshui River alluvial fan and the Pingtung Plain. The frequency analysis applies the Fourier transform to the time series of groundwater-level observations and analyzes the daily-frequency amplitude of the groundwater level caused by artificial groundwater extraction. The cross-correlation analysis between rainfall and groundwater level is used to obtain the groundwater replenishment time between infiltration and the peak groundwater level during wet seasons. The groundwater regression curve, the average rate of groundwater regression, is used to analyze the internal flux in the groundwater system and the flux caused by artificial behaviors. The decision tree uses the information obtained from the abovementioned analytical tools and optimizes the best estimation of the hydrogeological structure.
The developed method reaches a training accuracy of 92.31% and a verification accuracy of 93.75% on the Zhuoshui River alluvial fan, and a training accuracy of 95.55% and a verification accuracy of 100% on the Pingtung Plain. This high accuracy indicates that the developed methodology is an effective tool for identifying hydrogeological structures.
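The frequency-analysis step above can be sketched directly: daily pumping and recovery imprint a 1 cycle/day component on the groundwater level, which the Fourier transform exposes. Synthetic hourly data stand in for the observation-well records:

```python
import numpy as np

# Synthetic hourly groundwater-level record: a slow seasonal swing plus
# a daily drawdown/recovery cycle caused by pumping. Real observation-
# well data would replace this.
hours = np.arange(0, 24 * 365, dtype=float)
rng = np.random.default_rng(6)
level = (
    10.0
    + 0.5 * np.sin(2 * np.pi * hours / (24 * 365))   # seasonal component
    + 0.2 * np.sin(2 * np.pi * hours / 24)           # daily pumping signal
    + rng.normal(scale=0.02, size=hours.size)        # measurement noise
)

# FFT of the de-meaned series; frequencies in cycles per day
# (sample spacing d = 1 hour = 1/24 day).
spectrum = np.abs(np.fft.rfft(level - level.mean()))
freqs = np.fft.rfftfreq(hours.size, d=1.0 / 24.0)

# A strong amplitude at exactly 1 cycle/day flags artificial extraction.
daily_bin = np.argmin(np.abs(freqs - 1.0))
peak_freq = freqs[np.argmax(np.where(freqs > 0.5, spectrum, 0.0))]
```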

Keywords: aquifer identification, decision tree, groundwater, Fourier transform

Procedia PDF Downloads 157
3351 Numerical Solving Method for Specific Dynamic Performance of Unstable Flight Dynamics with PD Attitude Control

Authors: M. W. Sun, Y. Zhang, L. M. Zhang, Z. H. Wang, Z. Q. Chen

Abstract:

In the realm of flight control, Proportional-Derivative (PD) control is still widely used for attitude control in practice, particularly for pitch control, and the attitude dynamics under a PD controller deserve deep investigation. Based on empirical knowledge about unstable flight dynamics, the conditions on control parameter combinations that generate a single or a finite number of closed-loop oscillations, a rather smooth response preferred by practitioners, are presented in analytical or numerical form. To analyze the effects of these parameter combinations, the roots of several polynomials are sought to obtain feasible solutions. The conditions can also be plotted in a 2-D plane using multiple interval operations, which makes them more explicit. Finally, numerical examples are used to validate the proposed methods, and some comparisons are also performed.
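The smooth, non-oscillatory response preferred by practitioners corresponds to a sufficiently damped closed loop. For the simplest unstable pitch model, theta_ddot = a*theta + u with a > 0, the PD law u = -kp*theta - kd*theta_dot gives the closed loop theta_ddot + kd*theta_dot + (kp - a)*theta = 0, which is stable and oscillation-free when kp > a and kd^2 >= 4(kp - a). A simulation sketch with illustrative parameters (not the paper's):

```python
# Unstable pitch dynamics theta_ddot = a*theta + u (a > 0), stabilized
# by PD control u = -kp*theta - kd*theta_dot. With kp=8, kd=4, a=4 the
# closed loop is critically damped: kd**2 == 4*(kp - a).
a, kp, kd = 4.0, 8.0, 4.0
dt, steps = 0.001, 10000
theta, theta_dot = 1.0, 0.0        # initial attitude error
history = []
for _ in range(steps):
    u = -kp * theta - kd * theta_dot
    theta_ddot = a * theta + u
    theta_dot += theta_ddot * dt   # forward-Euler integration
    theta += theta_dot * dt
    history.append(theta)

final_error = abs(history[-1])     # decays to ~0 without oscillating
```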

Keywords: attitude control, dynamic performance, numerical solving method, interval, unstable flight dynamics

Procedia PDF Downloads 581
3350 Blood Glucose Level Measurement from Breath Analysis

Authors: Tayyab Hassan, Talha Rehman, Qasim Abdul Aziz, Ahmad Salman

Abstract:

The constant monitoring of blood glucose level is necessary for maintaining the health of patients and to alert medical specialists to take preemptive measures before the onset of any complication as a result of diabetes. Current clinical monitoring of blood glucose repeatedly uses invasive methods, which are uncomfortable and may result in infections in diabetic patients. Several attempts have been made to develop non-invasive techniques for blood glucose measurement. The existing methods in this area are unreliable and less accurate, and other approaches claiming high accuracy have not been tested on extended datasets, so their results are not statistically significant. It is a well-known fact that the acetone concentration in breath has a direct relation with blood glucose level. In this paper, we develop a first-of-its-kind, reliable, high-accuracy breath analyzer for non-invasive blood glucose measurement. The acetone concentration in breath was measured using an MQ 138 sensor in samples collected from one hundred patients at local hospitals in Pakistan. The blood glucose levels of these patients were determined using the conventional invasive clinical method. We propose a linear regression model that is trained to map breath acetone level to the measured blood glucose level, achieving high accuracy.
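The calibration step amounts to fitting a least-squares line from sensor reading to reference glucose. A sketch on simulated data (the sensitivity of 45 mg/dL per ppm and all readings below are invented, not measurements from the study):

```python
import numpy as np

# Simulated calibration data: breath acetone (ppm) vs. clinically
# measured blood glucose (mg/dL), for one hundred hypothetical patients.
rng = np.random.default_rng(7)
n = 100
acetone_ppm = rng.uniform(0.5, 5.0, size=n)
glucose = 60.0 + 45.0 * acetone_ppm + rng.normal(scale=8.0, size=n)

# Least-squares line mapping sensor reading to glucose level.
slope, intercept = np.polyfit(acetone_ppm, glucose, 1)
predicted = slope * acetone_ppm + intercept

# Mean absolute relative difference, a common glucose-accuracy metric.
mard = np.mean(np.abs(predicted - glucose) / glucose)
```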

Keywords: blood glucose level, breath acetone concentration, diabetes, linear regression

Procedia PDF Downloads 173
3349 Shedding Light on the Black Box: Explaining Deep Neural Network Prediction of Clinical Outcome

Authors: Yijun Shao, Yan Cheng, Rashmee U. Shah, Charlene R. Weir, Bruce E. Bray, Qing Zeng-Treitler

Abstract:

Deep neural network (DNN) models are being explored in the clinical domain, following their recent success in other domains such as image recognition. For clinical adoption, outcome prediction models require explanation, but due to their multiple non-linear inner transformations, DNN models are viewed by many as a black box. In this study, we developed a deep neural network model for predicting 1-year mortality of patients who underwent major cardiovascular procedures (MCVPs), using a temporal image representation of past medical history as input. The dataset was obtained from the electronic medical data warehouse administered by the Veterans Affairs Informatics and Computing Infrastructure (VINCI). We identified 21,355 veterans who had their first MCVP in 2014. Features for prediction included demographics, diagnoses, procedures, medication orders, hospitalizations, and frailty measures extracted from clinical notes. Temporal variables were created based on the patient history data in the 2-year window prior to the index MCVP, and a temporal image was created from these variables for each individual patient. To generate an explanation for the DNN model, we defined a new concept called the impact score, which quantifies the impact of the presence or value of a clinical condition on the predicted outcome. Like the (log) odds ratios reported by a logistic regression (LR) model, impact scores are continuous variables intended to shed light on the black box model. For comparison, a logistic regression model was fitted on the same dataset. In our cohort, about 6.8% of patients died within one year. The DNN model achieved an area under the curve (AUC) of 78.5%, while the LR model achieved an AUC of 74.6%. A strong but not perfect correlation was found between the aggregated impact scores and the log odds ratios (Spearman’s rho = 0.74), which helped validate our explanation approach.
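One way to realize such an impact score (a plausible construction, not necessarily the paper's exact definition) is by perturbation: switch a clinical condition off in the input, rerun the network, and report the change in predicted risk on the log-odds scale, which makes it directly comparable to an LR log odds ratio. A toy sketch with a small random network:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(x, W1, b1, w2, b2):
    """Tiny feed-forward network: one hidden layer, sigmoid output."""
    h = np.tanh(x @ W1 + b1)
    return sigmoid(h @ w2 + b2)

def impact_score(x, feature, W1, b1, w2, b2):
    """Perturbation-style impact score: change in predicted risk on the
    log-odds scale when the feature is switched off."""
    x_off = x.copy()
    x_off[feature] = 0.0
    p_on = predict(x, W1, b1, w2, b2)
    p_off = predict(x_off, W1, b1, w2, b2)
    return np.log(p_on / (1 - p_on)) - np.log(p_off / (1 - p_off))

# Random toy network standing in for the trained mortality model.
rng = np.random.default_rng(8)
W1 = rng.normal(scale=0.5, size=(5, 8))
b1 = rng.normal(scale=0.1, size=8)
w2 = rng.normal(scale=0.5, size=8)
b2 = -0.5
patient = np.array([1.0, 0.0, 1.0, 1.0, 0.0])   # binary condition flags
scores = [impact_score(patient, j, W1, b1, w2, b2) for j in range(5)]
```

Conditions absent from the patient's record get a score of exactly zero, since switching them off changes nothing.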

Keywords: deep neural network, temporal data, prediction, frailty, logistic regression model

Procedia PDF Downloads 153
3348 QSAR Studies of Certain Novel Heterocycles Derived from bis-1,2,4-Triazoles as Anti-Tumor Agents

Authors: Madhusudan Purohit, Stephen Philip, Bharathkumar Inturi

Abstract:

In this paper, we report the quantitative structure-activity relationship (QSAR) of novel bis-triazole derivatives for predicting their activity profile. The full model encompassed a dataset of 46 bis-triazoles. The Tripos Sybyl X 2.0 program was used to conduct CoMSIA QSAR modeling. The Partial Least-Squares (PLS) analysis method was used to conduct the statistical analysis and to derive a QSAR model based on the field values of the CoMSIA descriptors. The compounds were divided into test and training sets and were evaluated with various CoMSIA parameters to find the best QSAR model. An optimum number of components was first determined separately by cross-validated regression for the CoMSIA model, which was then applied in the final analysis. A series of parameters was used for the study, and the best-fit model was obtained using donor, partition coefficient, and steric parameters. The CoMSIA models demonstrated good statistical results, with a regression coefficient (r2) and a cross-validated coefficient (q2) of 0.575 and 0.830, respectively. The standard error for the predicted model was 0.16322. In the CoMSIA model, the steric descriptors make a marginally larger contribution than the electrostatic descriptors. The finding that the steric descriptor is the largest contributor to the CoMSIA QSAR models is consistent with the observation that more than half of the binding site area is occupied by steric regions.
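The model-quality statistic q2 used above is a leave-one-out cross-validated analogue of r2. A minimal one-component PLS (NIPALS-style) with a LOO q2, run on synthetic descriptor data (46 rows to match the dataset size; values are random, not the actual CoMSIA fields):

```python
import numpy as np

def pls1_fit(X, y):
    """One-component PLS: weight vector from the X-y covariance,
    then a regression of y on the latent score. Data mean-centred."""
    w = X.T @ y
    w /= np.linalg.norm(w)
    t = X @ w                       # latent scores
    q = (t @ y) / (t @ t)           # y-loading
    return w, q

def q_squared(X, y):
    """Leave-one-out cross-validated q2, the statistic used to judge
    CoMSIA/CoMFA models: 1 - PRESS / SS."""
    press, ss = 0.0, 0.0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        Xtr, ytr = X[mask], y[mask]
        xm, ym = Xtr.mean(axis=0), ytr.mean()
        w, q = pls1_fit(Xtr - xm, ytr - ym)
        pred = ym + q * ((X[i] - xm) @ w)
        press += (y[i] - pred) ** 2
        ss += (y[i] - ym) ** 2
    return 1.0 - press / ss

# Synthetic "descriptor field" with one latent direction driving activity.
rng = np.random.default_rng(9)
latent = rng.normal(size=46)
X = np.outer(latent, rng.normal(size=30)) + rng.normal(scale=0.3, size=(46, 30))
activity = 2.0 * latent + rng.normal(scale=0.2, size=46)

q2 = q_squared(X, activity)
```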

Keywords: 3D QSAR, CoMSIA, triazoles, novel heterocycles

Procedia PDF Downloads 444
3347 Use of Protection Motivation Theory to Assess Preventive Behaviors of COVID-19

Authors: Maryam Khazaee-Pool, Tahereh Pashaei, Koen Ponnet

Abstract:

Background: The global prevalence and morbidity of Coronavirus disease 2019 (COVID-19) are high. Preventive behaviors are proven to reduce the damage caused by the disease. There is a paucity of information on the determinants of preventive behaviors in response to COVID-19 in Mazandaran province, north of Iran. We therefore aimed to evaluate the protection motivation theory (PMT) in promoting COVID-19 preventive behaviors in Mazandaran province. Materials and Methods: In this descriptive cross-sectional study, 1220 individuals participated. They were recruited via social networks using convenience sampling in 2020. Data were collected online using a demographic questionnaire and a valid and reliable scale based on PMT. Data analysis was done using the Pearson correlation coefficient and linear regression in SPSS V24. Results: The mean age of the participants was 39.34±8.74 years. The regression model showed perceived threat (ß=0.033, P=0.007), perceived costs (ß=0.039, P=0.045), perceived self-efficacy (ß=0.116, P<0.001), and perceived fear (ß=0.131, P<0.001) to be significant predictors of COVID-19 preventive behaviors. This model accounted for 78% of the variance in these behaviors. Conclusion: Given the PMT constructs associated with protection against COVID-19, educational programs and health promotion based on the theory, delivered through social networks, could help increase people's motivation towards protective behaviors against COVID-19.

Keywords: questionnaire development, validation, intention, prevention, COVID-19

Procedia PDF Downloads 44