Search results for: panel data regression models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 29903

29093 Modelling Consistency and Change of Social Attitudes in 7 Years of Longitudinal Data

Authors: Paul Campbell, Nicholas Biddle

Abstract:

There is a complex, endogenous relationship between individual circumstances, attitudes, and behaviour. This study uses longitudinal panel data to assess changes in social and political attitudes over a 7-year period. Attitudes are captured with the question 'what is the most important issue facing Australia today', collected at multiple time points in a longitudinal survey of 2200 Australians. Consistency of attitudes, and factors predicting change over time, are assessed. The consistency of responses has methodological implications for data collection, specifically how often such questions ought to be asked of a population. When change in attitude is observed, this study assesses the extent to which individual demographic characteristics, personality traits, and broader societal events predict change.
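
As a rough illustration of the kind of panel analysis described above, the sketch below (not the authors' code; the file and column names are hypothetical) measures response consistency across waves and fits a logistic regression predicting attitude change:

```python
# Hedged sketch: attitude consistency across survey waves and a logistic regression
# of attitude change on respondent characteristics. Column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("attitudes_panel.csv")            # hypothetical long-format file
wide = panel.pivot(index="person_id", columns="wave", values="top_issue")

# Consistency: share of respondents giving the same answer in consecutive waves.
waves = sorted(panel["wave"].unique())
for w0, w1 in zip(waves[:-1], waves[1:]):
    same = (wide[w0] == wide[w1]).mean()
    print(f"waves {w0} -> {w1}: {same:.1%} unchanged")

# Predict change between first and last wave from baseline characteristics.
base = panel[panel["wave"] == waves[0]].set_index("person_id")
base["changed"] = (wide[waves[0]] != wide[waves[-1]]).astype(int)
model = smf.logit("changed ~ age + C(education) + openness", data=base).fit()
print(model.summary())
```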

Keywords: attitudes, longitudinal survey analysis, personality, social values

Procedia PDF Downloads 126
29092 Mapping Man-Induced Soil Degradation in Armenia's High Mountain Pastures through Remote Sensing Methods: A Case Study

Authors: A. Saghatelyan, Sh. Asmaryan, G. Tepanosyan, V. Muradyan

Abstract:

One of the major concerns for Armenia has been soil degradation that has emerged as a result of unsustainable management and use of grasslands, which in turn largely impacts the environment, agriculture and, ultimately, human health. Hence, assessment of soil degradation is an essential and urgent objective, set out to measure its possible consequences and to develop a potential management strategy. In recent years, remote sensing (RS) technologies have become an essential tool for assessing pasture degradation. This research was done with the intention of measuring the precision of Linear Spectral Unmixing (LSU) and NDVI-SMA methods in estimating soil surface components related to degradation (fractional vegetation cover (FVC), bare soil fractions, surface rock cover) and of determining the appropriateness of these methods for mapping man-induced soil degradation in high mountain pastures. Taking into consideration the spatially complex and heterogeneous biogeophysical structure of the studied site, we used high-resolution multispectral QuickBird imagery of a pasture site in one of Armenia’s rural communities, Nerkin Sasoonashen. The accuracy assessment was done by comparing the land cover abundance data derived through RS methods with the ground-truth land cover abundance data. A significant regression was established between the ground-truth FVC estimate and both the NDVI-LSU and LSU-produced vegetation abundance data (R2=0.636 and R2=0.625, respectively). For bare soil fractions, linear regression produced a coefficient of determination of R2=0.708. Because of the poor spectral resolution of the QuickBird imagery, LSU failed in the assessment of surface rock abundance (R2=0.015). This research documents that a reduction in vegetation cover runs in parallel with an increase in man-induced soil degradation, whereas in the absence of man-induced soil degradation the bare soil fraction does not exceed a certain level. The outcomes show that the proposed method of man-induced soil degradation assessment through FVC, bare soil fractions and field data adequately reflects the current status of soil degradation throughout the studied pasture site and may be employed as an alternative to more complicated models for soil degradation assessment.
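
For illustration, the regression-based validation step described above can be sketched as follows (illustrative only; the plot-level values are made up, not the study's data):

```python
# Hedged sketch: validating remotely sensed fraction estimates against field
# observations with a simple linear regression, as done for FVC and bare-soil
# fractions in the abstract. Arrays below are hypothetical plot-level data.
import numpy as np
from scipy import stats

ground_fvc = np.array([0.82, 0.61, 0.45, 0.30, 0.74, 0.55])   # field FVC estimates
lsu_fvc    = np.array([0.78, 0.64, 0.40, 0.35, 0.70, 0.60])   # LSU-derived fractions

slope, intercept, r_value, p_value, std_err = stats.linregress(ground_fvc, lsu_fvc)
print(f"R^2 = {r_value**2:.3f}, p = {p_value:.4f}")
print(f"fit: LSU = {slope:.2f} * ground + {intercept:.2f}")
```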

Keywords: Armenia, linear spectral unmixing, remote sensing, soil degradation

Procedia PDF Downloads 323
29091 Predicting Factors of Hearing Protection Device Use of Workers in Kaolin Mineral Dressing Factories, Thailand

Authors: Watcharapong Yaowarat, Thanee Kaewthummanukul, Waruntorn Jongrungrotsakul

Abstract:

Noise-induced hearing loss, the most significant occupational health and safety problem among the working population, can be effectively prevented through the use of hearing protection devices (HPDs). This study aimed to examine whether perceived benefits, perceived barriers, perceived self-efficacy, and interpersonal and situational influences on using hearing protection could predict HPD use among 132 qualified workers in production lines at Kaolin Mineral Dressing factories in Uttaradit and Lampang provinces. Data collection was undertaken from August to September 2020 using the interview form developed by Yaruang et al. (2010), which was reviewed by a panel of experts and whose reliability was at an acceptable level. Data analysis was performed using logistic regression analysis. The results revealed that only the situational factor of using hearing protection could predict HPD use, accounting for 21.80 percent of the total variance in HPD use. It was also found that participants whose score for the situational factors of using hearing protection was greater than or equal to the median were 4.16 times more likely to use HPDs than those who scored below the median (OR = 4.16, p < .05). The results thus indicate that organizational policies addressing worker health, along with a supportive environment for HPD use, in particular the provision of various HPDs, are of great importance. Therefore, occupational health nurses and related health teams should promote workers’ use of HPDs through knowledge dissemination, adopting strategies appropriate to the workplace context, leading to the achievement of worker health policy focused on work safety.
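
A minimal sketch of the analysis described, assuming hypothetical column names and data (this is not the study's code):

```python
# Hedged sketch: logistic regression of HPD use on a median-split situational-
# influence score, reporting the odds ratio. File and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("hpd_survey.csv")                 # hypothetical 132-worker dataset
df["situational_high"] = (df["situational_score"]
                          >= df["situational_score"].median()).astype(int)

fit = smf.logit("hpd_use ~ situational_high", data=df).fit()
odds_ratio = np.exp(fit.params["situational_high"])
print(f"OR = {odds_ratio:.2f}, p = {fit.pvalues['situational_high']:.3f}")
print(f"pseudo R^2 = {fit.prsquared:.3f}")         # rough analogue of variance explained
```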

Keywords: predicting factors, hearing protection device, factors predicting hearing protection device use, kaolin mineral dressing factories

Procedia PDF Downloads 133
29090 Machine Learning Development Audit Framework: Assessment and Inspection of Risk and Quality of Data, Model and Development Process

Authors: Jan Stodt, Christoph Reich

Abstract:

The usage of machine learning models for prediction is growing rapidly, and proof that the intended requirements are met is essential. Audits are a proven method to determine whether requirements or guidelines are met. However, machine learning models have intrinsic characteristics, such as the quality of training data, that make it difficult to demonstrate the required behavior and make audits more challenging. This paper describes an ML audit framework that evaluates and reviews the risks of machine learning applications, the quality of the training data, and the machine learning model itself. We evaluate and demonstrate the functionality of the proposed framework by auditing a steel plate fault prediction model.
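
To make the idea concrete, the sketch below shows the kind of data-quality and model-quality checks such an audit might automate; it is an illustration under assumed file and column names, not the framework proposed in the paper:

```python
# Hedged sketch: basic training-data quality metrics and a hold-out model review
# for a fault prediction task. Feature/label names are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

data = pd.read_csv("steel_plate_faults.csv")       # hypothetical file
X, y = data.drop(columns="fault_type"), data["fault_type"]

# Data-quality audit: sample size, missing values, class balance.
audit = {
    "rows": len(data),
    "missing_share": float(data.isna().mean().mean()),
    "class_balance": y.value_counts(normalize=True).to_dict(),
}
print(audit)

# Model audit: hold-out performance per class.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0, stratify=y)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, model.predict(X_te)))
```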

Keywords: audit, machine learning, assessment, metrics

Procedia PDF Downloads 260
29089 Access Control System for Big Data Application

Authors: Winfred Okoe Addy, Jean Jacques Dominique Beraud

Abstract:

Access control systems (ACs) are some of the most important components in safety areas. Inaccuracies in regulatory frameworks make personal policies and remedies more appropriate than standard models or protocols. This problem is exacerbated by the increasing complexity of software, such as integrated Big Data (BD) software for controlling large volumes of encrypted data and resources embedded in a dedicated BD production system. This paper proposes a general access control strategy for the dissemination of Big Data, since it is crucial to secure the data provided to data consumers (DC). We present a general access control circulation strategy for the Big Data domain, describing the benefit of using designated access control for BD units and performance, and taking into consideration the needs of both BD and AC systems. We then present a generic Big Data access control system to improve the dissemination of Big Data.

Keywords: access control, security, Big Data, domain

Procedia PDF Downloads 129
29088 Accounting Knowledge Management and Value Creation of SME in Chatuchak Market: Case Study Ceramics Product

Authors: Runglaksamee Rodkam

Abstract:

The purpose of this research was to study the influence of accountants’ potential performance on their working process, with a case study of Government Savings Banks in the northeast of Thailand. The independent variables included accounting knowledge, accounting skill, accounting value, accounting ethics, and accounting attitude, while the dependent variable was the success of the working process. A total of 155 accountants working for Government Savings Banks were selected by random sampling. A questionnaire was used as a tool for collecting data. The statistical analyses in this research included percentages, means, and multiple regression analysis. The findings revealed that the majority of accountants were female, with an age between 35-40 years old. Most of the respondents had an undergraduate degree and ten years of experience. Moreover, the factors of accounting knowledge, accounting skill, accounting value, accounting ethics, and accounting attitude were rated at a high level. The regression analysis of the observed data revealed a causal relationship in which the observed variables could explain at least 51 percent of the variation in the success of the accountants’ working process.
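
A minimal sketch of the regression step described above, assuming a hypothetical survey file and column names (not the study's code):

```python
# Hedged sketch: multiple regression of working-process success on the five
# accounting factors named in the abstract. Column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("accountants_survey.csv")         # hypothetical 155-respondent dataset
model = smf.ols(
    "success ~ knowledge + skill + value + ethics + attitude", data=df
).fit()
print(model.summary())                             # R-squared corresponds to the ~51% reported
```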

Keywords: influence, potential performance, success, working process

Procedia PDF Downloads 248
29087 A Study of High Viscosity Oil-Gas Slug Flow Using Gamma Densitometer

Authors: Y. Baba, A. Archibong-Eso, H. Yeung

Abstract:

Experimental studies of high-viscosity oil-gas flows in horizontal pipelines published in the literature have indicated that hydrodynamic slug flow is the dominant flow pattern observed. Investigations have shown that hydrodynamic slugging brings about high instabilities in pressure that can damage production facilities, making it essential to study the high-viscosity slug flow regime so as to improve the understanding of its flow dynamics. Most slug flow models used in the petroleum industry for the design of pipelines, together with their closure relationships, were formulated based on observations of low-viscosity liquid-gas flows. New experimental investigations and data are therefore required to validate these models. In cases where these models underperform, improving upon them or building new predictive models and correlations will also depend on the new experimental dataset and a further understanding of the flow dynamics in high-viscosity oil-gas flows. In this study, conducted at the Flow Laboratory, Oil and Gas Engineering Centre of Cranfield University, slug flow variables such as pressure gradient, mean liquid holdup, slug frequency and slug length for oil viscosities ranging from 1.0 to 5.5 Pa.s are experimentally investigated and analysed. The study was carried out in a 0.076 m ID pipe; two fast-sampling gamma densitometers and pressure transducers (differential and point) were used to obtain experimental measurements. Comparison of the measured slug flow parameters with the existing slug flow prediction models available in the literature showed disagreement with the high-viscosity experimental data, thus highlighting the importance of building new predictive models and correlations.
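
As a rough illustration of how such slug flow variables can be extracted from a gamma densitometer signal, the sketch below thresholds a synthetic holdup time series; the sampling rate, threshold and translational velocity are assumptions, not values from the study:

```python
# Hedged sketch: mean liquid holdup, slug frequency and mean slug length from a
# holdup time series by simple thresholding. Signal and parameters are illustrative.
import numpy as np

fs = 250.0                                    # assumed sampling frequency, Hz
t = np.arange(0, 60, 1 / fs)
# synthetic slug-like holdup signal: film at ~0.55, slug body at ~0.90
holdup = 0.55 + 0.35 * (np.sin(2 * np.pi * 0.4 * t) > 0.3)

mean_holdup = holdup.mean()
is_slug = holdup > 0.7                        # threshold separating slug body from film
starts = np.flatnonzero(np.diff(is_slug.astype(int)) == 1) + 1

slug_frequency = len(starts) / t[-1]          # slugs per second
v_t = 1.2                                     # assumed slug translational velocity, m/s
durations = []
for s in starts:
    rest = is_slug[s:]
    end = s + (np.argmin(rest) if not rest.all() else len(rest))
    durations.append((end - s) / fs)
mean_slug_length = v_t * np.mean(durations)   # length = velocity x residence time

print(f"mean holdup = {mean_holdup:.2f}, slug frequency = {slug_frequency:.2f} Hz, "
      f"mean slug length = {mean_slug_length:.2f} m")
```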

Keywords: gamma densitometer, mean liquid holdup, pressure gradient, slug frequency and slug length

Procedia PDF Downloads 325
29086 Estimation of Sediment Transport into a Reservoir Dam

Authors: Kiyoumars Roushangar, Saeid Sadaghian

Abstract:

Although accurate sediment load prediction is very important in the planning, design, operation and maintenance of water resources structures, the transport mechanism is complex, and deterministic transport models based on simplifying assumptions often lead to large prediction errors. In this research, firstly, two intelligent ANN methods, Radial Basis Function and General Regression Neural Networks, are adopted to model the total sediment load transported into the Madani Dam reservoir (north of Iran) using the measured data, and then the applicability of the sediment transport methods developed by Engelund and Hansen, Ackers and White, Yang, and Toffaleti for predicting sediment load discharge is evaluated. Based on a comparison of the results, it is found that the GRNN model gives better estimates than the sediment rating curve and the aforementioned classic methods.
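
Since a GRNN is essentially Nadaraya-Watson kernel regression, its core can be sketched in a few lines (illustrative only, with synthetic data rather than the Madani Dam measurements):

```python
# Hedged sketch: a General Regression Neural Network as a Gaussian-kernel weighted
# average of training targets. Inputs could be hydraulic variables; here synthetic.
import numpy as np

def grnn_predict(X_train, y_train, X_new, sigma=0.5):
    """Predict y at X_new as a Gaussian-kernel weighted average of training targets."""
    d2 = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

rng = np.random.default_rng(0)
X_train = rng.uniform(0, 1, (50, 1))                          # e.g. normalized discharge
y_train = 10 * X_train[:, 0] ** 2 + rng.normal(0, 0.5, 50)    # synthetic sediment load
X_new = np.linspace(0, 1, 5).reshape(-1, 1)
print(grnn_predict(X_train, y_train, X_new, sigma=0.1))
```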

Keywords: sediment transport, dam reservoir, RBF, GRNN, prediction

Procedia PDF Downloads 492
29085 An Estimating Equation for Survival Data with a Possibly Time-Varying Covariates under a Semiparametric Transformation Models

Authors: Yemane Hailu Fissuh, Zhongzhan Zhang

Abstract:

An estimating equation technique is an alternative to the widely used maximum likelihood methods that eases some of the complexity arising from the characteristics of time-varying covariates. When both time-varying covariates and left truncation are considered in the model, maximum likelihood estimation procedures become much more burdensome and complex. To ease this complexity, this study proposes modified estimating equations, which have received considerable attention from researchers, under a semiparametric transformation model. The purpose of this article was to develop the modified estimating equation under a flexible and general class of semiparametric transformation models for left-truncated and right-censored survival data with time-varying covariates. Besides the commonly applied Cox proportional hazards model, such problems can also be analyzed with a general class of semiparametric transformation models to estimate the effect of treatment, given possibly time-varying covariates, on the survival time. The consistency and asymptotic properties of the estimators were derived via the expectation-maximization (EM) algorithm. The finite-sample performance of the estimators under the proposed model was illustrated via simulation studies and the Stanford heart transplant data. To sum up, the bias for covariates was adjusted by estimating the density function of the truncation time variable, and the effect of possibly time-varying covariates was evaluated in some special semiparametric transformation models.
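
For concreteness, a commonly used formulation of this model class is sketched below; the notation is illustrative and may differ from the paper's exact equations:

```latex
% Commonly used formulation of semiparametric transformation models with
% time-varying covariates (illustrative; notation may differ from the paper).
% With G a known transformation, H an unspecified increasing function, and
% Z(.) a possibly time-varying covariate process, the conditional cumulative
% hazard of the event time is
\[
  \Lambda(t \mid Z) \;=\; G\!\left( \int_{0}^{t} e^{\beta^{\top} Z(s)} \, dH(s) \right),
\]
% where G(x) = x recovers the Cox proportional hazards model and
% G(x) = \log(1 + x) gives the proportional odds model. Estimation typically
% solves martingale-type estimating equations of the form
\[
  \sum_{i=1}^{n} \int_{0}^{\tau} Z_i(t)\,
     \Big\{ dN_i(t) - Y_i(t)\, d\hat{\Lambda}(t \mid Z_i) \Big\} \;=\; 0,
\]
% with N_i the counting process, Y_i the at-risk indicator (which encodes
% left truncation), and \tau the end of follow-up.
```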

Keywords: EM algorithm, estimating equation, semiparametric transformation models, time-to-event outcomes, time varying covariate

Procedia PDF Downloads 149
29084 Simulation of Optimal Runoff Hydrograph Using Ensemble of Radar Rainfall and Blending of Runoffs Model

Authors: Myungjin Lee, Daegun Han, Jongsung Kim, Soojun Kim, Hung Soo Kim

Abstract:

Recently, localized heavy rainfall and typhoons have occurred frequently due to climate change, and the resulting damage is becoming larger. Therefore, more accurate prediction of rainfall and runoff may be needed. However, gauge rainfall has limited spatial accuracy. Radar rainfall is better than gauge rainfall for explaining the spatial variability of rainfall, but it is mostly underestimated and involves uncertainty. Therefore, an ensemble of radar rainfall was simulated using an error structure relative to gauge rainfall to overcome this uncertainty. The simulated ensemble was used as the input data of the rainfall-runoff models to obtain an ensemble of runoff hydrographs. Previous studies have discussed the accuracy of rainfall-runoff models: even if the same input data, such as rainfall, are used for runoff analysis in the same basin, the models can give different results because of the uncertainty involved in the models. Therefore, we used two models, the SSARR model, which is a lumped model, and the Vflo model, which is a distributed model, and tried to simulate the optimum runoff considering the uncertainty of each rainfall-runoff model. The study basin is located in the Han River basin, and we obtained one integrated runoff hydrograph, an optimum runoff hydrograph, using blending methods such as Multi-Model Super Ensemble (MMSE), Simple Model Average (SMA), and Mean Square Error (MSE). From this study, we could confirm the accuracy of the rainfall and the rainfall-runoff models using ensemble scenarios and various rainfall-runoff models, and the results can be used to study flood control measures under climate change. Acknowledgements: This work is supported by the Korea Agency for Infrastructure Technology Advancement (KAIA) grant funded by the Ministry of Land, Infrastructure and Transport (Grant 18AWMP-B083066-05).
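
A minimal sketch of the blending idea, using a simple model average and an inverse-MSE weighting on synthetic hydrographs (illustrative only; not the study's data or exact blending formulas):

```python
# Hedged sketch: blending two simulated runoff hydrographs into one optimum
# hydrograph with a Simple Model Average and an inverse-MSE weighting.
import numpy as np

observed = np.array([10., 40., 120., 300., 220., 90., 35.])   # observed discharge
q_ssarr  = np.array([12., 50., 100., 260., 240., 100., 40.])  # lumped-model output
q_vflo   = np.array([ 8., 35., 130., 330., 200., 80., 30.])   # distributed-model output

# Simple Model Average (SMA): equal weights.
q_sma = (q_ssarr + q_vflo) / 2.0

# Inverse-MSE weighting: the model with the smaller error gets the larger weight.
mse = np.array([np.mean((observed - q) ** 2) for q in (q_ssarr, q_vflo)])
w = (1.0 / mse) / (1.0 / mse).sum()
q_weighted = w[0] * q_ssarr + w[1] * q_vflo

for name, q in [("SSARR", q_ssarr), ("Vflo", q_vflo),
                ("SMA", q_sma), ("MSE-weighted", q_weighted)]:
    print(f"{name:12s} RMSE = {np.sqrt(np.mean((observed - q) ** 2)):.1f}")
```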

Keywords: radar rainfall ensemble, rainfall-runoff models, blending method, optimum runoff hydrograph

Procedia PDF Downloads 270
29083 The Relationship between Parenting Style, Nonattachment and Inferiority

Authors: Yu-Chien Huang, Shu-Chen Yang

Abstract:

Introduction: Parenting style, nonattachment, and inferiority are important topics in psychology, but related research on nonattachment is still lacking. Therefore, the purpose of this study was to explore the relationships between parenting style, nonattachment, and inferiority. Methods: We conducted a correlational study, and three instruments were utilized to collect data: a parenting style scale, a nonattachment scale, and an inferiority scale. Cronbach's α indicated good inter-item reliability, and the test-retest reliability showed good consistency. The data were analyzed using descriptive statistics, the Chi-square test, one-way ANOVA, Pearson’s correlation, and regression analysis. Results: A total of 200 participants were tested in this research. Inferiority had a positive correlation with authoritarian parenting style; nonattachment had a negative correlation with authoritarian parenting style and with inferiority; thus the hypotheses were supported. In the linear mediation models, nonattachment was found to partially mediate the relationship between authoritarian parenting style and inferiority. Conclusion: These findings imply that interventions aimed at enhancing nonattachment are a good strategy for alleviating inferiority.

Keywords: inferiority, nonattachment, parenting style, psychology

Procedia PDF Downloads 124
29082 Towards an Effective Approach for Modelling near Surface Air Temperature Combining Weather and Satellite Data

Authors: Nicola Colaninno, Eugenio Morello

Abstract:

The urban environment affects local-to-global climate and, in turn, suffers from global warming phenomena, with worrying impacts on human well-being, health, and social and economic activities. The physico-morphological features of the built-up space affect urban air temperature locally, causing the urban environment to be warmer compared to the surrounding rural areas. This occurrence, typically known as the Urban Heat Island (UHI), is normally assessed by means of air temperature from fixed weather stations and/or traverse observations, or based on remotely sensed Land Surface Temperatures (LST). The information provided by ground weather stations is key for assessing local air temperature. However, the spatial coverage is normally limited due to the low density and uneven distribution of the stations. Although different interpolation techniques such as Inverse Distance Weighting (IDW), Ordinary Kriging (OK), or Multiple Linear Regression (MLR) are used to estimate air temperature from observed points, such an approach may not effectively reflect the real climatic conditions of an interpolated point. Quantifying local UHI for extensive areas based on weather stations’ observations alone is not practicable. Alternatively, the use of thermal remote sensing has been widely investigated based on LST. Data from Landsat, ASTER, or MODIS have been extensively used. Indeed, LST has an indirect but significant influence on air temperatures. However, high-resolution near-surface air temperature (NSAT) is currently difficult to retrieve. Here we have experimented with Geographically Weighted Regression (GWR) as an effective approach to enable NSAT estimation by accounting for the spatial non-stationarity of the phenomenon. The model combines on-site measurements of air temperature from fixed weather stations with satellite-derived LST. The approach is structured in two main steps. First, a GWR model has been set up to estimate NSAT at low resolution, by combining air temperature from discrete observations retrieved by weather stations (dependent variable) and LST from satellite observations (predictor). At this step, MODIS data from the Terra satellite, at 1 kilometer of spatial resolution, have been employed. Two time periods are considered according to the satellite revisit period, i.e. 10:30 am and 9:30 pm. Afterward, the results have been downscaled to 30 meters of spatial resolution by setting up a GWR model between the previously retrieved near-surface air temperature (dependent variable), the multispectral information provided by the Landsat mission, in particular the albedo, and the Digital Elevation Model (DEM) from the Shuttle Radar Topography Mission (SRTM), both at 30 meters. Albedo and DEM are now the predictors. The area under investigation is the Metropolitan City of Milan, which covers an area of approximately 1,575 km2 and encompasses a population of over 3 million inhabitants. Both models, low- (1 km) and high-resolution (30 meters), have been validated by cross-validation relying on indicators such as R2, Root Mean Squared Error (RMSE) and Mean Absolute Error (MAE). All the employed indicators give evidence of highly efficient models. In addition, an alternative network of weather stations, available for the City of Milano only, has been employed for testing the accuracy of the predicted temperatures, giving an RMSE of 0.6 and 0.7 for daytime and night-time, respectively.
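
The core of a GWR fit can be sketched as a locally weighted least-squares regression with a Gaussian distance kernel; the sketch below is illustrative (station coordinates, temperatures and LST values are made up) and is not the authors' implementation:

```python
# Hedged sketch: Geographically Weighted Regression as local weighted least squares
# with a Gaussian distance kernel. Data values are hypothetical.
import numpy as np

def gwr_predict(coords_obs, y_obs, X_obs, coords_new, X_new, bandwidth):
    """Local WLS estimate of y at each new location (GWR with a Gaussian kernel)."""
    preds = []
    for c, x in zip(coords_new, X_new):
        d = np.linalg.norm(coords_obs - c, axis=1)
        w = np.exp(-0.5 * (d / bandwidth) ** 2)             # Gaussian kernel weights
        Xd = np.column_stack([np.ones(len(X_obs)), X_obs])  # add intercept
        W = np.diag(w)
        beta = np.linalg.solve(Xd.T @ W @ Xd, Xd.T @ W @ y_obs)
        preds.append(np.array([1.0, *np.atleast_1d(x)]) @ beta)
    return np.array(preds)

# Example: predict near-surface air temperature from LST at two unsampled pixels.
coords_obs = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [2, 2]], dtype=float)  # stations (km)
t_air = np.array([24.1, 24.8, 23.9, 25.2, 26.0])              # observed air temperature
lst = np.array([[27.0], [28.1], [26.5], [29.0], [30.2]])      # MODIS-like LST predictor
coords_new = np.array([[0.5, 0.5], [1.5, 1.5]])
lst_new = np.array([[27.5], [29.5]])
print(gwr_predict(coords_obs, t_air, lst, coords_new, lst_new, bandwidth=1.0))
```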

Keywords: urban climate, urban heat island, geographically weighted regression, remote sensing

Procedia PDF Downloads 189
29081 Polymer Industrial Floors: The Possibility of Using Secondary Raw Materials from Solar Panels

Authors: J. Kosikova, B. Vacenovska, M. Vyhnankova

Abstract:

The paper reports on the subject of recycling and further use of secondary raw materials obtained from solar panels, which has become a very topical subject in recent years. Recycling these panels is very difficult and complex, and the use of the resulting secondary raw materials is still not fully resolved. Within the research carried out at the Brno University of Technology, new polymer materials used for industrial floors are being developed. Secondary raw materials are incorporated into these polymers as fillers. One of the tested filler materials was glass obtained from solar panels. The following text describes the procedures and results of the tests that were performed on these materials, confirming the possibility of the use of solar panel glass in industrial polymer flooring systems.

Keywords: fillers, industrial floors, recycling, secondary raw material, solar panel

Procedia PDF Downloads 280
29080 Improving Student Programming Skills in Introductory Computer and Data Science Courses Using Generative AI

Authors: Genady Grabarnik, Serge Yaskolko

Abstract:

Generative Artificial Intelligence (AI) has significantly expanded its applicability with the incorporation of Large Language Models (LLMs) and has become a technology that promises to automate some areas that were very difficult to automate before. The paper describes the introduction of generative AI into introductory computer and data science courses and analyzes the effect of this introduction. Generative AI is incorporated into the educational process in two ways. For the instructors, we create prompt templates for the generation of tasks and the grading of students' work, including feedback on submitted assignments. For the students, we introduce basic prompt engineering, which in turn is used for generating test cases based on the description of the problems, generating code snippets for single-block programming problems, and partitioning average-complexity programming problems into such blocks. The above-mentioned classes are run using Large Language Models, and feedback from instructors and students and the courses’ outcomes are collected. The analysis shows a statistically significant positive effect and the preference of both stakeholders.

Keywords: introductory computer and data science education, generative AI, large language models, application of LLMS to computer and data science education

Procedia PDF Downloads 52
29079 Facility Data Model as Integration and Interoperability Platform

Authors: Nikola Tomasevic, Marko Batic, Sanja Vranes

Abstract:

Emerging Semantic Web technologies can be seen as the next step in the evolution of intelligent facility management systems. In particular, this involves increased usage of open-source and/or standardized concepts for data classification and semantic interpretation. To deliver such facility management systems, providing a comprehensive integration and interoperability platform in the form of a facility data model is a prerequisite. In this paper, one possible modelling approach to provide such an integrative facility data model, based on the ontology modelling concept, is presented. The complete ontology development process, starting from input data acquisition, through ontology concept definition and finally ontology concept population, is described. At the beginning, a core facility ontology was developed, representing the generic facility infrastructure comprised of the common facility concepts relevant from the facility management perspective. To develop the data model of a specific facility infrastructure, first an extension and then a population of the core facility ontology was performed. For the development of the full-blown facility data models, Malpensa and Fiumicino airports in Italy, two major European air-traffic hubs, were chosen as a test-bed platform. Furthermore, the way these ontology models supported the integration and interoperability of the overall airport energy management system was analyzed as well.
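
A toy sketch of the described workflow (core ontology definition, extension, population) using rdflib; the namespace, class names and individuals are hypothetical, not the project's actual ontology:

```python
# Hedged sketch: a tiny "core facility" ontology defined, extended and populated
# with rdflib, mirroring the steps described above. All names are hypothetical.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

FAC = Namespace("http://example.org/facility#")      # hypothetical namespace
g = Graph()
g.bind("fac", FAC)

# Core facility concepts (the generic infrastructure).
for cls in ("Facility", "Building", "EnergySystem", "Sensor"):
    g.add((FAC[cls], RDF.type, RDFS.Class))
g.add((FAC.Building, RDFS.subClassOf, FAC.Facility))
g.add((FAC.hasEnergySystem, RDF.type, RDF.Property))

# Extension: an airport is a specific kind of facility.
g.add((FAC.Airport, RDFS.subClassOf, FAC.Facility))

# Population: individuals for a concrete facility instance.
g.add((FAC.MalpensaAirport, RDF.type, FAC.Airport))
g.add((FAC.Terminal1HVAC, RDF.type, FAC.EnergySystem))
g.add((FAC.MalpensaAirport, FAC.hasEnergySystem, FAC.Terminal1HVAC))
g.add((FAC.MalpensaAirport, RDFS.label, Literal("Malpensa Airport")))

print(g.serialize(format="turtle"))
```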

Keywords: airport ontology, energy management, facility data model, ontology modeling

Procedia PDF Downloads 443
29078 TELUM Land Use Model: An Investigation of Data Requirements and Calibration Results for Chittenden County MPO, U.S.A.

Authors: Georgia Pozoukidou

Abstract:

TELUM software is a land use model designed specifically to help metropolitan planning organizations (MPOs) prepare their transportation improvement programs and fulfill their numerous planning responsibilities. In this context, obtaining, preparing, and validating socioeconomic forecasts are becoming fundamental tasks for an MPO in order to ensure that consistent population and employment data are provided to travel demand models. The Chittenden County Metropolitan Planning Organization in the State of Vermont was used as a case study to test the applicability of the TELUM land use model. The technical insights and lessons learned from the land use model application have transferable value for all MPOs faced with land use forecast development and transportation modelling.

Keywords: calibration data requirements, land use models, land use planning, metropolitan planning organizations

Procedia PDF Downloads 285
29077 A Case Comparative Study of Infant Mortality Rate in North-West Nigeria

Authors: G. I. Onwuka, A. Danbaba, S. U. Gulumbe

Abstract:

This study investigated the infant mortality rate observed at a general hospital in Kaduna-South, Kaduna State, North West Nigeria, and examined the causes of infant mortality. The data used for this analysis were collected at the statistics unit of the hospital. The analysis was carried out using the multiple linear regression technique, which showed that there is a linear relationship between the dependent variable (deaths) and the independent variables (malaria, measles, anaemia, and coronary heart disease). The resultant model also revealed that a unit increment in each of these diseases would result in a unit increment in deaths recorded; 98.7% of the total variation in mortality is explained by the given model. The highest mortality was recorded in July 2005 and the lowest in October 2009. Recommendations were made based on the results of the study.

Keywords: infant mortality rate, multiple linear regression, diseases, serial correlation

Procedia PDF Downloads 319
29076 Competitiveness of African Countries through Open Quintuple Helix Model

Authors: B. G. C. Ahodode, S. Fekkaklouhail

Abstract:

Following the triple helix theory, this study aims to evaluate the effect of the innovation system on African countries’ competitiveness, taking into account external contributions, given that developing countries (especially African countries) are characterized by weak innovation systems whose synergy operates more at the foreign level than at the domestic and global levels. To do this, we used correlation tests, parsimonious regression techniques, and panel estimation between 2013 and 2016. Results show that the degree of innovation synergy has a significant effect on competitiveness in Africa. Specifically, while the opening system (OPESYS) and social system (SOCSYS) contribute, in order of importance, 0.634 and 0.284 points of increase in the GCI (significant at 1%), the political system (POLSYS) and educational system (EDUSYS) increase it by only 0.322 and 0.169 (significant at 5%), while the effect of the economic system (ECOSYS) on the Global Competitiveness Index is not significant.

Keywords: innovation system, innovation, competitiveness, Africa

Procedia PDF Downloads 64
29075 Sales-Based Dynamic Investment and Leverage Decisions: A Longitudinal Study

Authors: Rihab Belguith, Fathi Abid

Abstract:

The paper develops a system-based approach to investigate the dynamic adjustment of the debt structure and investment policies of Dow Jones index firms. This approach enables the assessment of the relations among sales, debt, and investment opportunities by considering the simultaneous effect of market environmental change and future growth opportunities. We integrate the firm-specific sales variance to capture industry conditions in the model. Empirical results were obtained from a panel data set of firms from different sectors. The analysis supports that environmental change does not affect all industries equally, since operating leverage, and hence the sensitivity to sales variance, differs among industries. Including the adjusted firm-specific variance, we find that there is no monotonic relation between leverage, sales, and investment. The firm may choose a low debt level in response to high sales variance but high leverage to attenuate the negative relation between sales variance and the current level of investment. We further find that while the overall effect of debt maturity on leverage is unaffected by the level of growth opportunities, the shorter the maturity of debt, the smaller the direct effect of sales variance on investment.

Keywords: dynamic panel, investment, leverage decision, sales uncertainty

Procedia PDF Downloads 240
29074 Optimization and Simulation Models Applied in Engineering Planning and Management

Authors: Abiodun Ladanu Ajala, Wuyi Oke

Abstract:

Mathematical simulation and optimization models packaged within interactive computer programs provide a common way for planners and managers to predict the behaviour of any proposed water resources system design or management policy before it is implemented. Modelling is a principal technique for predicting the behaviour of proposed infrastructure designs or management policies, and models can be developed and used to help identify the specific alternative plans that best meet stated objectives. This study discusses various types of models, their development, architecture, data requirements, and applications in the field of engineering. It also outlines the advantages and limitations of each of the optimization and simulation models presented. The techniques explored in this review include dynamic programming, linear programming, fuzzy optimization, evolutionary algorithms and, finally, artificial intelligence techniques. Previous studies carried out using some of these techniques were reviewed, and most of the results showed that optimization and simulation indeed provide viable alternatives and predictions that form a basis for decision-making in building engineering structures and in engineering planning and management.
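
As a small worked example of one of the techniques listed, the sketch below solves a toy water-allocation linear program with SciPy; the coefficients are made up:

```python
# Hedged sketch: a toy linear-programming formulation of a water-allocation problem.
# Allocate water from one reservoir to two users to maximize total benefit subject
# to supply and demand limits (all numbers are illustrative).
from scipy.optimize import linprog

# maximize 5*x1 + 3*x2  ->  minimize -5*x1 - 3*x2
c = [-5.0, -3.0]                       # negated benefits per unit of water
A_ub = [[1.0, 1.0]]                    # x1 + x2 <= total available supply
b_ub = [100.0]
bounds = [(0, 70), (0, 60)]            # per-user demand caps

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("allocation:", res.x, "total benefit:", -res.fun)
```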

Keywords: linear programming, mutation, optimization, simulation

Procedia PDF Downloads 580
29073 Developing and Evaluating Clinical Risk Prediction Models for Coronary Artery Bypass Graft Surgery

Authors: Mohammadreza Mohebbi, Masoumeh Sanagou

Abstract:

The ability to predict clinical outcomes is of great importance to physicians and clinicians. A number of different methods have been used in an effort to accurately predict these outcomes, including scoring systems based on multivariate statistical modelling and models involving the use of classification and regression trees. The process usually consists of two consecutive phases, namely model development and external validation. The model development phase consists of building a multivariate model and evaluating its predictive performance by examining calibration and discrimination, together with internal validation. External validation tests the predictive performance of a model by assessing its calibration and discrimination in different but plausibly related patients. A motivating example focusing on prediction modelling with a sample of patients who underwent coronary artery bypass graft (CABG) surgery is used for illustrative purposes, and a set of primary considerations for evaluating prediction model studies, using specific quality indicators as criteria to help stakeholders evaluate the quality of a prediction model study, is proposed.
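
A minimal sketch of the development-and-validation workflow described, checking discrimination (AUC) and calibration on a hold-out set; the data are simulated, not real CABG records:

```python
# Hedged sketch: develop a risk model, then check discrimination and calibration
# on held-out patients. Data are simulated for illustration.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from sklearn.calibration import calibration_curve

X, y = make_classification(n_samples=2000, n_features=8, weights=[0.85], random_state=1)
X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.3,
                                              random_state=1, stratify=y)

model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)   # model development
p_val = model.predict_proba(X_val)[:, 1]                      # validation predictions

print("discrimination (AUC):", round(roc_auc_score(y_val, p_val), 3))
obs, pred = calibration_curve(y_val, p_val, n_bins=5)          # calibration check
for o, p in zip(obs, pred):
    print(f"predicted risk {p:.2f} vs observed rate {o:.2f}")
```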

Keywords: clinical prediction models, clinical decision rule, prognosis, external validation, model calibration, biostatistics

Procedia PDF Downloads 289
29072 Construction of QSAR Models to Predict Potency on a Series of substituted Imidazole Derivatives as Anti-fungal Agents

Authors: Sara El Mansouria Beghdadi

Abstract:

Quantitative structure–activity relationship (QSAR) modelling is one of the main computational tools used in medicinal chemistry. Over the past two decades, the incidence of fungal infections has increased due to the development of resistance. In this study, QSAR modelling was performed on a series of esters of 2-carboxamido-3-(1H-imidazole-1-yl) propanoic acid derivatives. These compounds have shown moderate to very good antifungal activity. Multiple linear regression (MLR) was used to generate the linear 2D-QSAR models. The dataset consists of 115 compounds with their antifungal activity (log MIC) against «Candida albicans» (ATCC SC5314). Descriptors were calculated, and different models were generated using ChemOffice, Avogadro, and GaussView software. The selected model was validated. The study suggests that an increase in lipophilicity and a reduction in the electronic character of the substituent at R1, as well as a reduction in the steric hindrance of the substituent at R2 and its aromatic character, support the potentiation of the antifungal effect. The results of the QSAR could help scientists propose new compounds with higher antifungal activities intended for immunocompromised patients susceptible to multi-resistant nosocomial infections.

Keywords: quantitative structure–activity relationship, imidazole, antifungal, candida albicans (ATCC SC5314)

Procedia PDF Downloads 79
29071 A Novel Approach towards Test Case Prioritization Technique

Authors: Kamna Solanki, Yudhvir Singh, Sandeep Dalal

Abstract:

Software testing is a time- and cost-intensive process. A scrutiny of the code and rigorous testing are required to identify and rectify putative bugs. The process of bug identification and its consequent correction is continuous in nature, and often some of the bugs are removed after the software has been launched in the market. This process of code validation of the altered software during the maintenance phase is termed regression testing. Regression testing ubiquitously faces resource constraints; therefore, the selection of an appropriate set of test cases from the entire gamut of test cases is a critical issue for regression test planning. This paper presents a novel method for designing a suitable prioritization process to optimize the fault detection rate and the performance of regression testing under predefined constraints. The proposed method for test case prioritization, m-ACO, alters the food source selection criteria of natural ants and is basically a modified version of Ant Colony Optimization (ACO). The proposed m-ACO approach has been coded in Perl, and results are validated using three examples by computation of the Average Percentage of Faults Detected (APFD) metric.
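
For reference, the APFD metric used to validate the orderings can be computed as in the sketch below (the fault matrix is a made-up example, not the paper's data):

```python
# Hedged sketch: computing the Average Percentage of Faults Detected (APFD) for a
# given test-case ordering. The fault matrix here is illustrative.
def apfd(order, fault_matrix):
    """APFD = 1 - (TF1 + ... + TFm) / (n*m) + 1/(2n), where TFi is the 1-based
    position of the first test in `order` that exposes fault i."""
    n, m = len(order), len(fault_matrix[0])
    first_positions = []
    for fault in range(m):
        pos = next(i + 1 for i, t in enumerate(order) if fault_matrix[t][fault])
        first_positions.append(pos)
    return 1 - sum(first_positions) / (n * m) + 1 / (2 * n)

# rows = test cases, columns = faults each test detects (True/False)
faults = [
    [True,  False, False],   # test 0
    [False, True,  False],   # test 1
    [True,  True,  True],    # test 2
]
print("unprioritized order:", round(apfd([0, 1, 2], faults), 3))
print("prioritized order:  ", round(apfd([2, 0, 1], faults), 3))
```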

Keywords: regression testing, software testing, test case prioritization, test suite optimization

Procedia PDF Downloads 330
29070 A Comparative Analysis of ARIMA and Threshold Autoregressive Models on Exchange Rate

Authors: Diteboho Xaba, Kolentino Mpeta, Tlotliso Qejoe

Abstract:

This paper assesses the in-sample forecasting of South African exchange rates, comparing a linear ARIMA model with a SETAR model. The study uses monthly adjusted data on South African exchange rates with 420 observations. The Akaike information criterion (AIC) and the Schwarz information criterion (SIC) are used for model selection. Mean absolute error (MAE), root mean squared error (RMSE) and mean absolute percentage error (MAPE) are the error metrics used to evaluate the forecasting capability of the models. The Diebold-Mariano (DM) test is employed to check forecast accuracy in order to distinguish the forecasting performance of the two models (ARIMA and SETAR). The results indicate that both models perform well when modelling and forecasting the exchange rates, but SETAR seems to outperform ARIMA.
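
A minimal sketch of the ARIMA side of the comparison, computing the in-sample error metrics named above on a simulated series (not the study's data; a SETAR fit would need a separate threshold-model implementation):

```python
# Hedged sketch: fit an ARIMA model to a monthly series and compute MAE, RMSE, MAPE
# on in-sample one-step predictions. The series is simulated for illustration.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(42)
rate = pd.Series(14 + np.cumsum(rng.normal(0, 0.1, 420)))   # synthetic monthly series

fit = ARIMA(rate, order=(1, 1, 1)).fit()
pred = fit.predict(start=1, end=len(rate) - 1)              # in-sample predictions
actual = rate.iloc[1:]

mae = np.mean(np.abs(actual - pred))
rmse = np.sqrt(np.mean((actual - pred) ** 2))
mape = np.mean(np.abs((actual - pred) / actual)) * 100
print(f"AIC={fit.aic:.1f}  MAE={mae:.3f}  RMSE={rmse:.3f}  MAPE={mape:.2f}%")
```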

Keywords: ARIMA, error metrices, model selection, SETAR

Procedia PDF Downloads 239
29069 Control Strategy for a Solar Vehicle Race

Authors: Francois Defay, Martim Calao, Jean Francois Dassieu, Laurent Salvetat

Abstract:

Electric vehicles are a solution for reducing pollution by using green energy. The Shell Eco-marathon provides rules intended to minimize battery use during the race. The use of a solar panel combined with efficient motor control and race strategy allows, in the best case, a 60 kg vehicle with one pilot to be driven using only solar energy. This paper presents a complete modelization of a solar vehicle used for the Shell Eco-marathon. This project, called Helios, is a cooperation between undergraduate students, academic institutes, and industrial partners. The prototype is an ultra-energy-efficient vehicle based on a one-square-meter solar panel and a custom-made brushless controller to optimize the electrical part. The vehicle is equipped with sensors and an embedded system to provide all the data in real time in order to evaluate the best strategy for the course. A complete modelization with Matlab/Simulink is used to test the optimal strategy to increase the overall endurance. Experimental results are presented to validate the different parts of the model: mechanical, aerodynamic, electrical, and solar panel. The major finding of this study is to provide solutions to identify the model parameters (rolling resistance coefficient, drag coefficient, motor torque coefficient, etc.) by means of experimental results combined with identification techniques. Once the coefficients are validated, the strategy to optimize the consumption and the average speed can be tested first in simulation before being implemented for the race. The paper describes all the simulation and experimental parts and provides results in order to optimize the global efficiency of the vehicle. This work was started four years ago, has involved many students in the experimental and theoretical parts, and has helped increase knowledge of electrically self-sufficient vehicles.

Keywords: electrical vehicle, endurance, optimization, shell eco-marathon

Procedia PDF Downloads 260
29068 Machine Learning for Classifying Risks of Death and Length of Stay of Patients in Intensive Unit Care Beds

Authors: Itamir de Morais Barroca Filho, Cephas A. S. Barreto, Ramon Malaquias, Cezar Miranda Paula de Souza, Arthur Costa Gorgônio, João C. Xavier-Júnior, Mateus Firmino, Fellipe Matheus Costa Barbosa

Abstract:

Information and Communication Technologies (ICT) in healthcare are crucial for efficiently delivering medical services to patients. These ICTs are also known as e-health and comprise technologies such as electronic record systems, telemedicine systems, and personalized devices for diagnosis. The focus of e-health is to improve the quality of health information, strengthen national health systems, and ensure accessible, high-quality health care for all. All the data gathered by these technologies make it possible to help clinical staff with automated decisions using machine learning. In this context, we collected patient data such as heart rate, oxygen saturation (SpO2), blood pressure, respiration, and others. With these data, we were able to develop machine learning models for patients’ risk of death and for estimating the length of stay in ICU beds. This paper presents the methodology for applying machine learning techniques to develop these models. Although we implemented these models on an IoT healthcare platform, helping clinical staff in an ICU, it is essential to create a robust clinical validation process and to monitor the proposed models.

Keywords: ICT, e-health, machine learning, ICU, healthcare

Procedia PDF Downloads 97
29067 Asymmetric Information and Composition of Capital Inflows: Stock Market Microstructure Analysis of Asia Pacific Countries

Authors: Farid Habibi Tanha, Hawati Janor, Mojtaba Jahanbazi

Abstract:

The purpose of this study is to examine the effect of asymmetric information on the composition of capital inflows. This study uses stock market microstructure to capture asymmetric information. Such an approach allows one to capture the level and extent of asymmetric information from a firm’s perspective. The study focuses on a two-dimensional measure of market microstructure in capturing asymmetric information. The composition of capital inflows is measured by running six models simultaneously. Employing the panel data technique, the main finding of this research is that an increase in the asymmetric information of the stock market, in either of the two dimensions of width and depth, leads to a reduction of foreign investment in both forms, foreign portfolio investment (FPI) and foreign direct investment (FDI), with the reduction in FPI being greater than that in FDI. The significant effect of asymmetric information on capital inflows implicitly suggests that policymakers control changes in foreign capital inflows through transparency in the market.

Keywords: capital flows composition, asymmetric information, stock market microstructure, foreign portfolio investment, foreign direct investment

Procedia PDF Downloads 357
29066 Urbanization and Income Inequality in Thailand

Authors: Acumsiri Tantikarnpanit

Abstract:

This paper aims to examine the relationship between urbanization and income inequality in Thailand during the period 2002–2020, using a panel of data for 76 provinces collected from Thailand’s National Statistical Office (Labor Force Survey: LFS), as well as geospatial data from the U.S. Air Force Defense Meteorological Satellite Program (DMSP) and the Visible Infrared Imaging Radiometer Suite Day/Night Band (VIIRS-DNB) satellite for nineteen selected years. This paper employs two different definitions to identify urban areas: 1) urban areas defined by Thailand's National Statistical Office (Labor Force Survey: LFS), and 2) urban areas estimated using nighttime light data from the DMSP and VIIRS-DNB satellites. The second method includes two sub-categories: 2.1) determining urban areas by calculating nighttime light density corresponding to a population density of 300 people per square kilometer, and 2.2) calculating urban areas based on nighttime light density corresponding to a population density of 1,500 people per square kilometer. The empirical analysis, based on Ordinary Least Squares (OLS), fixed effects, and random effects models, reveals a consistent U-shaped relationship between income inequality and urbanization. The findings from the econometric analysis demonstrate that urbanization or population density has a significant and negative impact on income inequality, while the square of urbanization shows a statistically significant positive impact on income inequality. Additionally, there is a negative association between logarithmically transformed income and income inequality. This paper also proposes the inclusion of satellite imagery, geospatial data, and spatial econometric techniques in future studies to conduct quantitative analysis of spatial relationships.
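
A minimal sketch of the panel specifications described (pooled OLS, fixed effects, random effects with a quadratic urbanization term), using a simulated province panel and assuming the linearmodels package; this is not the paper's estimation code:

```python
# Hedged sketch: pooled OLS, fixed-effects and random-effects panel regressions with
# a quadratic urbanization term, on a simulated province-by-year panel.
import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS, PooledOLS, RandomEffects

rng = np.random.default_rng(0)
idx = pd.MultiIndex.from_product([range(76), range(2002, 2021)],
                                 names=["province", "year"])
df = pd.DataFrame(index=idx)
df["urban"] = rng.uniform(0, 1, len(df))
df["log_income"] = rng.normal(10, 0.5, len(df))
df["gini"] = (0.5 - 0.3 * df["urban"] + 0.2 * df["urban"] ** 2
              - 0.01 * df["log_income"] + rng.normal(0, 0.02, len(df)))
df["urban_sq"] = df["urban"] ** 2

pooled = PooledOLS.from_formula("gini ~ 1 + urban + urban_sq + log_income", df).fit()
fe = PanelOLS.from_formula("gini ~ 1 + urban + urban_sq + log_income + EntityEffects", df).fit()
re = RandomEffects.from_formula("gini ~ 1 + urban + urban_sq + log_income", df).fit()
for name, res in [("Pooled OLS", pooled), ("Fixed effects", fe), ("Random effects", re)]:
    print(name, res.params[["urban", "urban_sq"]].round(3).to_dict())
```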

Keywords: income inequality, nighttime light, population density, Thailand, urbanization

Procedia PDF Downloads 72
29065 A Double PWM Source Inverter Technique with Reduced Leakage Current for Application on Standalone Systems

Authors: Md.Noman Habib Khan, M. S. Tajul Islam, T. S. Gunawan, M. Hasanuzzaman

Abstract:

The photovoltaic (PV) panel with no galvanic isolation is a well-known technique that is effective and delivers power with enhanced efficiency. The PV generation presented here is for a stand-alone system installed in remote areas, where the resulting power is connected to an electronic load installation instead of being tied to the grid. Though very small, the transformer-less topology is shown to have leakage current in the pico-ampere range. Using the PWM technique, the leakage current in different situations is shown. The results demonstrated in this paper show how the pico-ampere current is reduced to the femto-ampere range through the use of inductors and capacitors of suitable values with the load.

Keywords: photovoltaic (PV) panel, duty cycle, pulse duration modulation (PDM), leakage current

Procedia PDF Downloads 528
29064 Observed Changes in Constructed Precipitation at High Resolution in Southern Vietnam

Authors: Nguyen Tien Thanh, Günter Meon

Abstract:

Precipitation plays a key role in the water cycle, in defining local climatic conditions, and in the ecosystem. It is also an important input parameter for water resources management and hydrologic models. With spatially continuous data, the certainty of discharge predictions or other environmental factors is unquestionably better than without. Such data are, however, not always readily available for a small basin, especially in the coastal region of Vietnam, due to the sparse network of meteorological stations (30 stations) along a coastline of about 3,260 km. Furthermore, available gridded precipitation datasets are not fine enough when applied to hydrologic models. Under conditions of global warming, the application of spatial interpolation methods is crucial for climate change impact studies to obtain spatially continuous data. In recent research projects, although some methods perform better than others, no method yields the best results for all cases. The objective of this paper, therefore, is to investigate different spatial interpolation methods for daily precipitation over a small basin (approximately 400 km2) located in the coastal region of Southern Vietnam and to find the most efficient interpolation method for this catchment. Five different interpolation methods, consisting of Cressman, ordinary kriging, regression kriging, dual kriging and inverse distance weighting, have been applied to identify the best method for the area of study at the spatio-temporal scale (daily, 10 km x 10 km). A 30-year precipitation database was created and merged with available gridded datasets. Finally, observed changes in the constructed precipitation were analyzed. The results demonstrate that ordinary kriging interpolation is an effective approach to analyzing the daily precipitation. Mixed trends of increasing and decreasing monthly, seasonal and annual precipitation have been documented at significant levels.
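
As a small illustration, inverse distance weighting, one of the five methods compared, can be sketched as below; ordinary kriging, the best performer in the study, would typically rely on a dedicated library such as PyKrige. The gauge coordinates and rainfall values are made up:

```python
# Hedged sketch: inverse distance weighting of daily rain-gauge values onto grid
# cells. Coordinates and rainfall values are hypothetical.
import numpy as np

def idw(xy_obs, z_obs, xy_new, power=2.0):
    """Interpolate z at xy_new as an inverse-distance-weighted mean of observations."""
    z_new = np.empty(len(xy_new))
    for i, p in enumerate(xy_new):
        d = np.linalg.norm(xy_obs - p, axis=1)
        if np.any(d < 1e-9):                     # point coincides with a gauge
            z_new[i] = z_obs[np.argmin(d)]
            continue
        w = 1.0 / d ** power
        z_new[i] = np.sum(w * z_obs) / np.sum(w)
    return z_new

gauges = np.array([[0., 0.], [10., 0.], [0., 10.], [10., 10.]])   # gauge coordinates (km)
rain = np.array([12.0, 30.0, 18.0, 25.0])                         # daily rainfall (mm)
grid = np.array([[5., 5.], [2., 8.]])                             # grid cells to estimate
print(idw(gauges, rain, grid))
```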

Keywords: interpolation, precipitation, trend, vietnam

Procedia PDF Downloads 273