Search results for: low order model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27286

24316 The Quality Improvement of Painting Assignments for Grade 4-6 Students by Using PDCA Cycle

Authors: Pawinee Sorawech

Abstract:

The purpose of this study was to investigate the quality improvement of painting assignments for grade 4-6 students by using the PDCA cycle. The study employed a qualitative technique, with Suan Sunandha Rajabhat University and its demonstration school selected as the study area, and in-depth interviews were used. The findings revealed that the PDCA cycle was an appropriate model for increasing the quality of painting assignments for grade 4-6 students. The six improvement steps were: studying the PDCA model, setting up a plan, determining the scope of work, creating a strategy, developing the quality of the painting assignments, and producing a handbook for the quality improvement of painting assignments.

Keywords: quality, painting assignments, PDCA cycle, grade 4-6 students

Procedia PDF Downloads 482
24315 Using Flow Line Modelling and Remote Sensing for Reconstructing Glacier Volume Loss for Athabasca Glacier, Canadian Rockies

Authors: Rituparna Nath, Shawn J. Marshall

Abstract:

Glaciers are among the most sensitive climatic indicators, as they respond strongly to small climatic shifts. We develop a flow line model of glacier dynamics to simulate the past and future extent of glaciers in the Canadian Rocky Mountains, with the aim of coupling this model within larger-scale regional climate models of glacier response to climate change. This paper focuses on glacier-climate modeling and reconstructions of glacier volume from the Little Ice Age (LIA) to present for Athabasca Glacier, Alberta, Canada. Glacier thickness, volume, and mass change will be reconstructed using flow line modelling and examination of different climate scenarios that give good reconstructions of LIA ice extent. With the availability of SPOT 5 imagery, digital elevation models, and the GIS Arc Hydro tool, ice catchment properties, glacier width, and LIA moraines have been extracted using automated procedures. Simulation of glacier mass change will inform estimates of meltwater runoff over the historical period, and model calibration from the LIA reconstruction will aid in future projections of the effects of climate change on glacier recession. Furthermore, the model developed will be useful for future studies with ensembles of glaciers.

Keywords: flow line modeling, Athabasca Glacier, glacier mass balance, remote sensing, Arc Hydro tool, Little Ice Age

Procedia PDF Downloads 268
24314 Numerical Investigation of the Electromagnetic Common Rail Injector Characteristics

Authors: Rafal Sochaczewski, Ksenia Siadkowska, Tytus Tulwin

Abstract:

The paper describes the modeling of a fuel injector for common rail systems. A one-dimensional model of a solenoid-valve-controlled injector with a Valve Covered Orifice (VCO) nozzle was built in AVL Hydsim. This model captures the dynamic phenomena that occur in the injector. The accuracy of the calibration, based on adjusting the parameters of the control valve and the nozzle needle lift, was verified by comparing the computed injector flow rates. The model is capable of precise simulation of injector operating parameters in relation to injection time and fuel pressure in the fuel rail. As a result, characteristics of the injector flow rate and backflow were obtained.

Keywords: common rail, diesel engine, fuel injector, modeling

Procedia PDF Downloads 412
24313 Two Layer Photo-Thermal Deflection Model to Investigate the Electronic Properties in BGaAs/GaAs Alloys

Authors: S. Ilahi, M. Baira, F. Saidi, N. Yacoubi, L. Auvray, H. Maaref

Abstract:

The photo-thermal deflection technique (PTD) is used to study the nonradiative recombination process in BGaAs/GaAs alloys with boron compositions of 3% and 8% grown by metal organic chemical vapor deposition (MOCVD). A two-layer theoretical model has been developed that takes into account both the thermal and electronic contributions to the photothermal signal, allowing the extraction of the electronic parameters, namely the electronic diffusivity and the surface and interface recombination. It is found that increasing the boron composition alters the transport properties of the BGaAs epilayers.

Keywords: photothermal deflection technique, two-layer model, BGaAs/GaAs alloys, boron composition

Procedia PDF Downloads 301
24312 Parameter Identification of Granular Soils around the PMT Test by Inverse Analysis

Authors: Younes Abed

Abstract:

The successful application of in-situ testing of soils heavily depends on the development of methods for interpreting the tests. The pressuremeter test simulates the expansion of a cylindrical cavity, and because it has well-defined boundary conditions, it is more amenable to rigorous theoretical analysis (i.e. cavity expansion theory) than most other in-situ tests. In this article, and in order to make the identification process more convenient, we propose a relatively simple procedure for the numerical identification of some mechanical parameters of a granular soil, in particular the elastic modulus and the friction angle, from a pressuremeter curve. The procedure, applied here to identify the parameters of the generalised Prager model associated with the Drucker-Prager criterion from a pressuremeter curve, is based on an inverse analysis approach, which consists of minimizing a function representing the difference between the experimental curve and the curve obtained by integrating the model along the loading path of the in-situ test. The numerical process implemented here is based on an established finite element program. We present a validation of the proposed approach against a database of cylindrical cavity expansion tests. This database consists of four types of tests: thick cylinder tests carried out on Hostun RF sand, pressuremeter tests carried out on Hostun sand, in-situ pressuremeter tests carried out at the Fos site with a marine self-boring pressuremeter, and in-situ pressuremeter tests performed at the Labenne site with a Menard pressuremeter.
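A minimal sketch of the inverse-analysis loop described in this abstract, assuming a least-squares misfit minimised with SciPy; the cavity-expansion simulator is only a placeholder for the finite element program, and all names and starting values are illustrative.

```python
# Sketch of the identification loop (hypothetical names); the FE pressuremeter
# simulation is a placeholder for the established finite element program that
# integrates the generalised Prager / Drucker-Prager model along the loading path.
import numpy as np
from scipy.optimize import minimize

def simulate_pressure_curve(E, phi, cavity_strains):
    """Placeholder: return cavity pressures at the given cavity strains for
    elastic modulus E and friction angle phi (replace with the FE model)."""
    raise NotImplementedError("plug in the finite element cavity-expansion model")

def misfit(params, cavity_strains, measured_pressures):
    E, phi = params
    simulated = simulate_pressure_curve(E, phi, cavity_strains)
    return np.sum((simulated - measured_pressures) ** 2)   # least-squares gap

# strains, pressures = ...  experimental pressuremeter curve
# result = minimize(misfit, x0=[50e6, 35.0], args=(strains, pressures),
#                   method="Nelder-Mead")
# E_identified, phi_identified = result.x
```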

Keywords: granular soils, cavity expansion, pressuremeter test, finite element method, identification procedure

Procedia PDF Downloads 292
24311 Controlling HVAC Parameters with a Brain Emotional Learning Based Intelligent Controller (BELBIC)

Authors: Javad Abdi, Azam Famil Khalili

Abstract:

Modeling emotions has attracted much attention in recent years, both in cognitive psychology and in the design of artificial systems. However, far from being a negative factor in decision-making, emotions have been shown to be a strong faculty for making fast, satisfying decisions. In this paper, we have adapted a computational model based on the limbic system in the mammalian brain for control engineering applications. Learning in this model is based on Temporal Difference (TD) learning. We applied the proposed controller (termed BELBIC) to a simple model of a submarine that was required to reach a desired depth underwater. Our results demonstrate excellent control action, disturbance handling, and robustness to system parameter variations for TD-BELBIC. Given the present conditions, past system actions, and the control aims, the proposed method can control the system so that these objectives are attained in the least amount of time and in the best way.
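A minimal sketch of the temporal-difference update that underlies learning in BELBIC-style controllers; the state discretisation and the emotional reward signal are placeholders, and the full amygdala/orbitofrontal structure of BELBIC is not reproduced here.

```python
# TD(0) value update on a tabular state space (illustrative only).
import numpy as np

def td0_update(value, state, next_state, reward, alpha=0.1, gamma=0.95):
    """One TD(0) step: move value[state] toward the bootstrapped target."""
    td_error = reward + gamma * value[next_state] - value[state]
    value[state] += alpha * td_error
    return td_error

value = np.zeros(100)        # e.g. discretised submarine depth states (assumed)
# For each control step: td0_update(value, s, s_next, emotional_reward)
# where emotional_reward encodes the depth-tracking error in BELBIC's reward function.
```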

Keywords: artificial neural networks, temporal difference, brain emotional learning based intelligent controller, heating, ventilating and air conditioning

Procedia PDF Downloads 433
24310 Efficiency and Scale Elasticity in Network Data Envelopment Analysis: An Application to International Tourist Hotels in Taiwan

Authors: Li-Hsueh Chen

Abstract:

Efficient operation is increasingly important for hotel managers. Unlike the manufacturing industry, hotels cannot store their products. In addition, many hotels provide room service and food and beverage service simultaneously. When the efficiencies of hotels are evaluated, the internal structure should be considered. Hence, based on the operational characteristics of hotels, this study proposes a DEA model to simultaneously assess the efficiencies of the room production division, food and beverage production division, room service division, and food and beverage service division. However, not only the enhancement of efficiency but also the adjustment of scale can improve performance. In terms of the adjustment of scale, scale elasticity or returns to scale can help managers make decisions concerning expansion or contraction. In order to construct a reasonable approach to measuring the efficiencies and scale elasticities of hotels, this study builds an alternative variable-returns-to-scale-based two-stage network DEA model combining parallel and series structures to explore the scale elasticities of the whole system, the room production division, the food and beverage production division, the room service division, and the food and beverage service division, based on data from the international tourist hotel industry in Taiwan. The results may provide valuable information on operational performance and scale for managers and decision makers.
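A minimal single-stage, constant-returns CCR envelopment sketch using SciPy's linear programming, shown only to illustrate the LP at the core of DEA; the paper's variable-returns two-stage network model adds intermediate products and a convexity constraint not reproduced here, and the input/output data are placeholders.

```python
# Input-oriented CCR efficiency of one DMU via the envelopment LP (illustrative).
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """X: (m inputs x n DMUs), Y: (s outputs x n DMUs); returns theta for DMU o."""
    m, n = X.shape
    s, _ = Y.shape
    c = np.r_[1.0, np.zeros(n)]                    # minimise theta
    A_in = np.c_[-X[:, [o]], X]                    # sum(lambda*x) <= theta*x_o
    A_out = np.c_[np.zeros((s, 1)), -Y]            # sum(lambda*y) >= y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    bounds = [(None, None)] + [(0, None)] * n      # theta free, lambdas >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

# X = np.array([...]); Y = np.array([...])   # hotel divisions' inputs/outputs (assumed)
# print([ccr_efficiency(X, Y, j) for j in range(X.shape[1])])
```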

Keywords: efficiency, scale elasticity, network data envelopment analysis, international tourist hotel

Procedia PDF Downloads 225
24309 Association Between Short-term NOx Exposure and Asthma Exacerbations in East London: A Time Series Regression Model

Authors: Hajar Hajmohammadi, Paul Pfeffer, Anna De Simoni, Jim Cole, Chris Griffiths, Sally Hull, Benjamin Heydecker

Abstract:

Background: There is strong interest in the relationship between short-term air pollution exposure and human health. Most studies in this field focus on serious health outcomes such as death or hospital admission, but air pollution exposure affects many people with less severe impacts, such as exacerbations of respiratory conditions. A lack of quantitative analysis and inconsistent findings suggest that improved methodology is needed to understand these effects more fully. Method: We developed a time series regression model to quantify the relationship between daily NOₓ concentration and asthma exacerbations requiring oral steroids in primary care settings. Explanatory variables include daily NOₓ concentration measurements from the 8 available background and roadside monitoring stations in east London and daily ambient temperature from London City Airport, located in east London. Lags of NOₓ concentrations of up to 21 days (3 weeks) were used in the model. The dependent variable was the daily number of oral steroid courses prescribed for GP-registered patients with asthma in east London. A mixed distribution model was then fitted to the significant lags of the regression model. Result: Results of the time series modelling showed a significant relationship between NOₓ concentrations on each day and the number of oral steroid courses prescribed in the following three weeks. In addition, the model using only roadside stations performs better than the model with a mixture of roadside and background stations.
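A minimal sketch of the distributed-lag regression described above, assuming a daily data frame with hypothetical column names; the published model (including the mixed distribution fitted to the significant lags) may differ in detail.

```python
# Daily counts regressed on NOx lags up to 21 days plus temperature (illustrative).
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("east_london_daily.csv", parse_dates=["date"], index_col="date")

max_lag = 21                                    # lags of NOx up to 3 weeks
for k in range(max_lag + 1):
    df[f"nox_lag{k}"] = df["nox"].shift(k)      # value observed k days earlier

df = df.dropna()
X = sm.add_constant(df[[f"nox_lag{k}" for k in range(max_lag + 1)] + ["temperature"]])
model = sm.OLS(df["steroid_courses"], X).fit()
print(model.summary())                          # inspect which lags are significant
```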

Keywords: air pollution, time series modeling, public health, road transport

Procedia PDF Downloads 144
24308 Effect of Genuine Missing Data Imputation on Prediction of Urinary Incontinence

Authors: Suzan Arslanturk, Mohammad-Reza Siadat, Theophilus Ogunyemi, Ananias Diokno

Abstract:

Missing data is a common challenge in statistical analyses of most clinical survey datasets. A variety of methods have been developed to enable the analysis of survey data with missing values, of which imputation is the most commonly used. However, in order to minimize the bias introduced by imputation, one must choose the right imputation technique and apply it to the correct type of missing data. In this paper, we have identified different types of missing values: missing data due to skip pattern (SPMD), undetermined missing data (UMD), and genuine missing data (GMD), and applied rough set imputation to only the GMD portion of the missing data. We have used rough set imputation to evaluate the effect of such imputation on prediction by generating several simulation datasets based on an existing epidemiological dataset (MESA). To measure how well each dataset lends itself to the prediction model (logistic regression), we have used p-values from the Wald test. To evaluate the accuracy of the prediction, we have considered the width of the 95% confidence interval for the probability of incontinence. Both imputed and non-imputed simulation datasets were fit to the prediction model, and both turned out to be significant (p-value < 0.05). However, the Wald score shows a better fit for the imputed compared to the non-imputed datasets (28.7 vs. 23.4). The average confidence interval width decreased by 10.4% when the imputed dataset was used, meaning higher precision. The results show that using the rough set method for missing data imputation on GMD data improves the predictive capability of the logistic regression. Further studies are required to generalize this conclusion to other clinical survey datasets.

Keywords: rough set, imputation, clinical survey data simulation, genuine missing data, predictive index

Procedia PDF Downloads 168
24307 Clinician's Perspective of Common Factors of Change in Family Therapy: A Cross-National Exploration

Authors: Hassan Karimi, Fred Piercy, Ruoxi Chen, Ana L. Jaramillo-Sierra, Wei-Ning Chang, Manjushree Palit, Catherine Martosudarmo, Angelito Antonio

Abstract:

Background: The two psychotherapy camps, the randomized clinical trials (RCTs) and the common factors model, have competitively claimed specific explanations for therapy effectiveness. Recently, scholars have called for empirical evidence showing the role of common factors in therapeutic outcome in marriage and family therapy. Purpose: This cross-national study aims to explore how clinicians, across different nations and theoretical orientations, attribute the contribution of common factors to therapy outcome. Method: A brief common factors questionnaire (CFQ, with a Cronbach’s alpha of 0.77) was developed and administered in seven nations. A series of statistical analyses (paired-samples t-tests, independent-samples t-tests, ANOVA) was conducted to compare clinicians’ perceived contributions of total common factors versus model-specific factors, to compare each pair of common factor categories, and to compare clinicians from collectivistic nations with clinicians from an individualistic nation. Results: Clinicians across seven nations attributed 86% of therapeutic change to common factors versus 14% to model-specific factors. Clinicians attributed 34% of therapeutic change to client factors, 26% to therapist factors, 26% to relationship factors, and 14% to model-specific techniques. The ANOVA test indicated that each of the three categories of common factors (client 34%, therapist 26%, relationship 26%) showed a higher contribution to therapeutic outcome than the category of model-specific factors (techniques, 14%). Clinicians with a psychology degree attributed more contribution to model-specific factors than clinicians with MFT and counseling degrees, who attributed more contribution to client factors. Clinicians from collectivistic nations attributed larger contributions to therapist factors (M=28.96, SD=12.75) than the US clinicians (M=23.22, SD=7.73). The US clinicians attributed a larger contribution to client factors (M=39.02, SD=15.04) than clinicians from the collectivistic nations (M=28.71, SD=15.74). Conclusion: The findings indicate that clinicians across the globe attributed more than two-thirds of therapeutic change to CFs, which emphasizes the importance of training in the common factors model in the field. CFs, like model-specific factors, vary in their contribution to therapy outcome in relation to the specific client, therapist, problem, treatment model, and sociocultural context. Sociocultural expectations and norms should be considered as a context in which both CFs and model-specific factors function toward therapeutic goals. Clinicians need to foster cultural competency, specifically regarding the divergent ways that CFs can be activated due to specific sociocultural values.

Keywords: common factors, model-specific factors, cross-national survey, therapist cultural competency, enhancing therapist efficacy

Procedia PDF Downloads 287
24306 Performance Optimization on Waiting Time Using Queuing Theory in an Advanced Manufacturing Environment: Robotics to Enhance Productivity

Authors: Ganiyat Soliu, Glen Bright, Chiemela Onunka

Abstract:

Performance optimization plays a key role in controlling waiting time during manufacturing in an advanced manufacturing environment in order to improve productivity. Queuing mathematical modeling theory was used to examine the performance of a multi-stage production line. Robotics, as a disruptive technology, was implemented in a virtual manufacturing scenario during the packaging process to study the effect of waiting time on productivity. The queuing mathematical model was used to determine the optimum service rate required by robots during the packaging stage of manufacturing to yield an optimum production cost. Different rates of production were assumed in the virtual manufacturing environment, and the cost of packaging was estimated together with the optimum production cost. An equation was generated using queuing mathematical modeling theory, and the method adopted for the analysis of the scenario is the Newton-Raphson method. The queuing theory presented here provides an adequate analysis of the number of robots required to regulate waiting time in order to increase output. The arrival rate of the product was high, which shows that the queuing mathematical model was effective in minimizing service cost and waiting time during manufacturing. At a reduced waiting time, there was an improvement in the number of products obtained per hour. The overall productivity was improved based on the assumptions used in the queuing modeling theory implemented in the virtual manufacturing scenario.
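As an illustration of a Newton-Raphson step applied to a queuing cost model, the sketch below finds the service rate that minimises a combined service-plus-waiting cost for a single M/M/1 packaging station; the cost coefficients, arrival rate, and the single-station simplification are assumptions rather than the paper's multi-stage model.

```python
# Minimise C(mu) = c_service*mu + c_wait*lam/(mu - lam) by solving C'(mu) = 0
# with Newton-Raphson. Closed form for this simple cost: mu* = lam + sqrt(c_wait*lam/c_service).
def optimal_service_rate(lam, c_service, c_wait, mu0=None, tol=1e-8, max_iter=100):
    mu = mu0 if mu0 is not None else 1.05 * lam         # start just above the arrival rate
    for _ in range(max_iter):
        f = c_service - c_wait * lam / (mu - lam) ** 2  # C'(mu)
        fprime = 2.0 * c_wait * lam / (mu - lam) ** 3   # C''(mu)
        step = f / fprime
        mu -= step                                      # iterates increase toward the root
        if abs(step) < tol:
            break
    return mu

# e.g. 40 products/hour arriving, illustrative cost ratios:
print(optimal_service_rate(lam=40.0, c_service=5.0, c_wait=20.0))   # ~52.65 units/hour
```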

Keywords: performance optimization, productivity, queuing theory, robotics

Procedia PDF Downloads 154
24305 A Closed-Loop Design Model for Sustainable Manufacturing by Integrating Forward Design and Reverse Design

Authors: Yuan-Jye Tseng, Yi-Shiuan Chen

Abstract:

In this paper, a new concept of closed-loop design model is presented. The closed-loop design model is developed by integrating forward design and reverse design. Based on this new concept, a closed-loop design model for sustainable manufacturing by integrated evaluation of forward design, reverse design, and green manufacturing using a fuzzy analytic network process is developed. In the design stage of a product, with a given product requirement and objective, there can be different ways to design the detailed components and specifications. Therefore, there can be different design cases to achieve the same product requirement and objective. Thus, in the design evaluation stage, it is required to analyze and evaluate the different design cases. The purpose of this research is to develop a model for evaluating the design cases by integrated evaluation of forward design, reverse design, and green manufacturing models. A fuzzy analytic network process model is presented for integrated evaluation of the criteria in the three models. The comparison matrices for evaluating the criteria in the three groups are established. The total relational values among the three groups represent the total relational effects. In application, a super matrix can be created and the total relational values can be used to evaluate the design cases for decision-making to select the final design case. An example product is demonstrated in this presentation. It shows that the model is useful for integrated evaluation of forward design, reverse design, and green manufacturing to achieve a closed-loop design for sustainable manufacturing objective.

Keywords: design evaluation, forward design, reverse design, closed-loop design, supply chain management, closed-loop supply chain, fuzzy analytic network process

Procedia PDF Downloads 676
24304 A Basic Modeling Approach for the 3D Protein Structure of Insulin

Authors: Daniel Zarzo Montes, Manuel Zarzo Castelló

Abstract:

Proteins play a fundamental role in biology, but their structure is complex, and it is a challenge for teachers to conceptually explain the differences between their primary, secondary, tertiary, and quaternary structures. On the other hand, there are currently many computer programs to visualize the 3D structure of proteins, but they require advanced training and knowledge. Moreover, it becomes difficult to visualize the sequence of amino acids in these models, and how the protein conformation is reached. Given this drawback, a simple and instructive procedure is proposed in order to teach the protein structure to undergraduate and graduate students. For this purpose, insulin has been chosen because it is a protein that consists of 51 amino acids, a relatively small number. The methodology has consisted of the use of plastic atom models, which are frequently used in organic chemistry and biochemistry to explain the chirality of biomolecules. For didactic purposes, when the aim is to teach the biochemical foundations of proteins, a manipulative system seems convenient, starting from the chemical structure of amino acids. It has the advantage that the bonds between amino acids can be conveniently rotated, following the pattern marked by the 3D models. First, the 51 amino acids were modeled, and then they were linked according to the sequence of this protein. Next, the three disulfide bonds that characterize the stability of insulin have been established, and then the alpha-helix structure has been formed. In order to reach the tertiary 3D conformation of this protein, different interactive models available on the Internet have been visualized. In conclusion, the proposed methodology seems very suitable for biology and biochemistry students because they can learn the fundamentals of protein modeling by means of a manipulative procedure as a basis for understanding the functionality of proteins. This methodology would be conveniently useful for a biology or biochemistry laboratory practice, either at the pre-graduate or university level.

Keywords: protein structure, 3D model, insulin, biomolecule

Procedia PDF Downloads 55
24303 Modeling Corruption Dynamics Within Bono and Ahafo Police Service in Ghana

Authors: Adam Ahmed Hosney

Abstract:

The existence of a culture of corruption within an institution such as the police can be a sign of failure from various angles. There is a general perception among Ghanaians that the most corrupt institution is the police service. The purpose of this study is to formulate and analyze a nonlinear mathematical model that investigates corruption as an epidemic within the Ghana police service. The study derives the basic reproduction number that governs corruption extinction and corruption persistence. The threshold conditions for all equilibrium points are obtained using linearization and Lyapunov functional methods, and they demonstrate local asymptotic stability for both the corruption-endemic and corruption-free equilibrium states. The model was analyzed qualitatively, and the solution was derived. The feasible region of the model is positively invariant and attracting; it is therefore sufficient to study the dynamics of the model within this region. Graphical results are presented and discussed to illustrate the solution. Results show that corruption will die out within the police service if the government shows no tolerance for those involved in corrupt practices. The findings indicate that leaders should be trustworthy, demonstrate a complete and viable commitment to addressing corruption, and make it a priority to provide mass education to all citizens, as well as using religious leaders to fight corruption, since most Ghanaians are religious and trust their leaders.
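A minimal sketch of a compartmental corruption model in the spirit described above, integrated with SciPy; the compartments, parameter values, and the resulting reproduction number expression are illustrative assumptions rather than the paper's exact formulation.

```python
# Susceptible / Corrupt / Reformed compartments with constant population (illustrative).
from scipy.integrate import solve_ivp

beta, gamma, mu = 0.4, 0.2, 0.05      # corruption transmission, reform, and exit rates

def corruption_model(t, y):
    S, C, R = y
    N = S + C + R
    dS = mu * N - beta * S * C / N - mu * S
    dC = beta * S * C / N - (gamma + mu) * C
    dR = gamma * C - mu * R
    return [dS, dC, dR]

R0 = beta / (gamma + mu)              # corruption dies out if R0 < 1 in this sketch
sol = solve_ivp(corruption_model, (0, 200), [0.95, 0.05, 0.0], dense_output=True)
print(f"R0 = {R0:.2f}; corrupt fraction at t=200: {sol.y[1, -1]:.3f}")
```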

Keywords: mathematical model, differential equation, dynamical system, simulation

Procedia PDF Downloads 27
24302 Using Structural Equation Modeling to Analyze the Impact of Remote Work on Job Satisfaction

Authors: Florian Pfeffel, Valentin Nickolai, Christian Louis Kühner

Abstract:

Digitalization has disrupted the traditional workplace environment by allowing many employees to work from anywhere at any time. This trend of working from home was further accelerated by the COVID-19 crisis, which forced companies to rethink their workplace models. While in many companies this shift happened out of pure necessity, many employees were left more satisfied with their jobs due to the opportunity to work from home. This study focuses on employees’ job satisfaction in the service sector in dependence on the different work models, which are defined as a “work from home” model, the traditional “work in office” model, and a hybrid model. Using structural equation modeling (SEM), these three work models have been analyzed based on 13 factors influencing job satisfaction that have been further summarized into the three groups “classic influencing factors”, “influencing factors changed by remote working”, and “new remote working influencing factors”. Based on the influencing factors on job satisfaction, a survey was conducted with n = 684 employees in the service sector. Cronbach’s alpha of the individual constructs was shown to be suitable. Furthermore, the construct validity of the constructs was confirmed by face validity, content validity, convergent validity (AVE > 0.5; CR > 0.7), and discriminant validity. Additionally, confirmatory factor analysis (CFA) confirmed the model fit for the investigated sample (CMIN/DF: 2.567; CFI: 0.927; RMSEA: 0.048). The SEM analysis showed that the most significant influencing factor on job satisfaction is “identification with the work” with β = 0.540, followed by “Appreciation” (β = 0.151), “Compensation” (β = 0.124), “Work-Life-Balance” (β = 0.116), and “Communication and Exchange of Information” (β = 0.105). While the significance of each factor can vary depending on the work model, the SEM analysis shows that identification with the work is the most significant factor in all three work models and, in the case of the traditional office work model, it is the only significant influencing factor. The study shows that employees who work entirely remotely or in a hybrid work model are significantly more satisfied with their jobs, with a job satisfaction score of 5.0 on a scale from 1 (very dissatisfied) to 7 (very satisfied), than employees who do not have the option to work from home, who score 4.6. This is a result of the lower identification with the work in the model without any remote working. Furthermore, the responses indicate that it is important to consider the individual preferences of each employee when it comes to the work model in order to achieve higher overall job satisfaction. Thus, it can be argued that companies can profit from more motivation and higher productivity by considering individual work model preferences, thereby increasing identification with the respective work.

Keywords: home-office, identification with work, job satisfaction, new work, remote work, structural equation modeling

Procedia PDF Downloads 82
24301 Using the Technology Acceptance Model to Examine Seniors’ Attitudes toward Facebook

Authors: Chien-Jen Liu, Shu Ching Yang

Abstract:

Using the technology acceptance model (TAM), this study examined the external variables of technological complexity (TC) to acquire a better understanding of the factors that influence the acceptance of computer application courses by learners at Active Aging Universities. After the learners in this study had completed a 27-hour Facebook course, 44 learners responded to a modified TAM survey. Data were collected to examine the path relationships among the variables that influence the acceptance of Facebook-mediated community learning. The partial least squares (PLS) method was used to test the measurement and the structural model. The study results demonstrated that attitudes toward Facebook use directly influence behavioral intentions (BI) with respect to Facebook use, evincing a high prediction rate of 58.3%. In addition to the perceived usefulness (PU) and perceived ease of use (PEOU) measures that are proposed in the TAM, other external variables, such as TC, also indirectly influence BI. These four variables can explain 88% of the variance in BI and demonstrate a high level of predictive ability. Finally, limitations of this investigation and implications for further research are discussed.

Keywords: technology acceptance model (TAM), technological complexity, partial least squares (PLS), perceived usefulness

Procedia PDF Downloads 346
24300 Scenario-Based Analysis of Electric Vehicle Penetration in Road Transportation in Laos

Authors: Bouneua Khamphilavanh, Toshihiko Masui

Abstract:

The penetration of EV (electric vehicle) technology into Lao road transportation was analyzed in this study using the AIM/CGE [Laos] model. The computable general equilibrium (CGE) model was developed by the Asia-Pacific Integrated Model (AIM) team. In line with the increase in the number of road vehicles, energy demand in the transport sector has gradually increased, which has resulted in a large share of the budget being spent on importing fossil fuels during the last decade and high carbon dioxide emissions from the transport sector. Hence, the aim of this research is to analyze the impact of EV penetration on the economy and CO₂ emissions in the short, medium, and long term. By the year 2050, the expected gross domestic product (GDP) value will gradually fall by up to one percent, because Laos will spend more of its budget on importing EVs. The cumulative CO₂ emission from 2020 to 2050 in the BAU case will be 12,000 GgCO₂eq, and that in the EV mitigation case will be 9,300 GgCO₂eq, equivalent to roughly 77% of the BAU emissions, i.e., a cumulative CO₂ emission reduction of about 23% in the road transport sector from introducing EV technology.

Keywords: GDP, CO₂ mitigation, CGE model, EV technology, transport

Procedia PDF Downloads 278
24299 Assessment of Landfill Pollution Load on Hydroecosystem by Use of Heavy Metal Bioaccumulation Data in Fish

Authors: Gintarė Sauliutė, Gintaras Svecevičius

Abstract:

Landfill leachates contain a number of persistent pollutants, including heavy metals. These have the ability to spread in ecosystems and accumulate in fish, most of which are classified as top consumers of trophic chains. Fish are free-swimming organisms, but, perhaps due to their species-specific ecological and behavioural properties, they often prefer the most suitable biotopes and therefore do not necessarily avoid harmful substances or environments. That is why it is necessary to evaluate the dispersion of persistent pollutants in the hydroecosystem using fish tissue metal concentrations. In hydroecosystems of hybrid type (e.g. river-pond-river), the distance from the pollution source could be a good indicator of this kind of metal distribution. The studies were carried out in the hybrid-type ecosystem neighbouring the Kairiai landfill, which is located 5 km east of the Šiauliai City. Fish tissue (gills, liver, and muscle) metal concentration measurements were performed on two ecologically different types of fish according to their feeding characteristics: benthophagous (gibel carp, roach) and predatory (northern pike, perch). A number of mathematical models (linear, non-linear, using log and other transformations) were applied in order to identify the most satisfactory description of the interdependence between fish tissue metal concentration and the distance from the pollution source. However, only the log-multiple regression model revealed the pattern that the distance from the pollution source is closely and positively correlated with metal concentration in all the predatory fish tissues studied (gills, liver, and muscle).

Keywords: bioaccumulation in fish, heavy metals, hydroecosystem, landfill leachate, mathematical model

Procedia PDF Downloads 286
24298 Bankruptcy Prediction Analysis on Mining Sector Companies in Indonesia

Authors: Devina Aprilia Gunawan, Tasya Aspiranti, Inugrah Ratia Pratiwi

Abstract:

This research aims to classify mining sector companies based on Altman’s Z-score model and to provide an analysis of the financial ratios in the Altman Z-score model, in order to give a picture of the financial condition of mining sector companies in Indonesia and their future viability, and to find out the partial and simultaneous impact of each of the financial ratio variables in the Altman Z-score model, namely WC/TA, RE/TA, EBIT/TA, MVE/TL, and S/TA, on the financial condition represented by the Z-score itself. Among the 38 mining sector companies listed on the Indonesia Stock Exchange (IDX), 28 companies were selected as the research sample according to purposive sampling criteria. The results of this research showed that during the three-year research period 2010-2012, the number of companies predicted to be healthy in each year was less than half of the total sample. The multiple regression analysis showed that all of the research hypotheses are accepted, which means that WC/TA, RE/TA, EBIT/TA, MVE/TL, and S/TA, both partially and simultaneously, had an impact on the companies’ financial condition.
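A minimal sketch of the Altman Z-score computation behind the classification, assuming the original 1968 coefficients and cut-offs; the study may use a variant of the model, and the input names are illustrative.

```python
# Z = 1.2*WC/TA + 1.4*RE/TA + 3.3*EBIT/TA + 0.6*MVE/TL + 1.0*S/TA (original model).
def altman_z(wc, re, ebit, mve, sales, ta, tl):
    return (1.2 * wc / ta + 1.4 * re / ta + 3.3 * ebit / ta
            + 0.6 * mve / tl + 1.0 * sales / ta)

def classify(z):
    # Conventional cut-offs for the original Z-score model
    if z > 2.99:
        return "healthy (safe zone)"
    if z >= 1.81:
        return "grey zone"
    return "distressed"

# z = altman_z(wc=..., re=..., ebit=..., mve=..., sales=..., ta=..., tl=...)
# print(classify(z))
```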

Keywords: Altman’s Z-score model, financial condition, mining companies, Indonesia

Procedia PDF Downloads 529
24297 Exacerbated Psychotic Symptoms, Social Support, Stressful Life Events, and Medication Use Self-Efficacy Impact on Social Dysfunction: A Cross-Sectional Self-Rated Study of Persons with Schizophrenia Misusing Methamphetamines

Authors: Ek-Uma Imkome, Jintana Yunibhand, Waraporn Chaiyawat

Abstract:

Background: Persons with schizophrenia who misuse methamphetamines suffer from social dysfunction that impacts their quality of life. Knowledge of the factors related to social dysfunction will guide effective interventions. Objectives: To determine the direct, indirect, and total effects of exacerbated psychotic symptoms, social support, stressful life events, and medication use self-efficacy on social dysfunction in Thai patients with schizophrenia and methamphetamine misuse. Methods: Data were collected from patients with schizophrenia and methamphetamine misuse by self-report. A linear structural relationship was used to test the hypothesized path model. Results: The hypothesized model was found to fit the empirical data and explained 54% of the variance in psychotic symptoms (X² = 114.35, df = 92, p-value = 0.05, X²/df = 1.24, GFI = 0.96, AGFI = 0.92, CFI = 1.00, NFI = 0.99, NNFI = 0.99, RMSEA = 0.02). The highest total effect on social dysfunction was from psychotic symptoms (0.67, p < 0.05). Medication use self-efficacy had a direct effect on psychotic symptoms (-0.25, p < 0.01), and social support had a direct effect on medication use self-efficacy (0.36, p < 0.01). Conclusions: Psychotic symptoms and stressful life events were the significant factors that had a direct influence on social dysfunction. Therefore, interventions designed to manage these factors are crucial in order to enhance social functioning in this population.

Keywords: psychotic symptoms, methamphetamine, schizophrenia, stressful life events, social dysfunction, social support, medication use self efficacy

Procedia PDF Downloads 208
24296 Anomaly Detection in a Data Center with a Reconstruction Method Using a Multi-Autoencoders Model

Authors: Victor Breux, Jérôme Boutet, Alain Goret, Viviane Cattin

Abstract:

Early detection of anomalies in data centers is important to reduce downtimes and the costs of periodic maintenance. However, there is little research on this topic and even less on the fusion of sensor data for the detection of abnormal events. The goal of this paper is to propose a method for anomaly detection in data centers by combining sensor data (temperature, humidity, power) and deep learning models. The model described in the paper uses one autoencoder per sensor to reconstruct the inputs. The autoencoders contain Long Short-Term Memory (LSTM) layers and are trained using the normal samples of the relevant sensors selected by correlation analysis. The difference signal between the input and its reconstruction is then used to classify the samples using feature extraction and a random forest classifier. The data measured by the sensors of a data center between January 2019 and May 2020 are used to train the model, while the data between June 2020 and May 2021 are used to assess it. The performance of the model is assessed a posteriori through the F1-score by comparing detected anomalies with the data center’s history. The proposed model outperforms the state-of-the-art reconstruction method, which uses only one autoencoder taking multivariate sequences and detects an anomaly with a threshold on the reconstruction error, with an F1-score of 83.60% compared to 24.16%.
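A minimal sketch of one per-sensor LSTM autoencoder and the downstream random forest, in the spirit of the method described above; Keras and scikit-learn are assumed here, and the window length, latent size, and feature extraction are illustrative rather than the paper's exact architecture.

```python
# One LSTM autoencoder per sensor, trained on normal windows only (illustrative).
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers
from sklearn.ensemble import RandomForestClassifier

def build_lstm_autoencoder(timesteps, n_features=1, latent=16):
    inputs = keras.Input(shape=(timesteps, n_features))
    encoded = layers.LSTM(latent)(inputs)                       # compress the window
    decoded = layers.RepeatVector(timesteps)(encoded)           # expand back in time
    decoded = layers.LSTM(latent, return_sequences=True)(decoded)
    outputs = layers.TimeDistributed(layers.Dense(n_features))(decoded)
    model = keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")
    return model

# ae_temp = build_lstm_autoencoder(timesteps=60)
# ae_temp.fit(X_temp_normal, X_temp_normal, epochs=50, batch_size=64)
# Per-window reconstruction-error feature, then a random forest on stacked features:
# errors = np.abs(X_test - ae_temp.predict(X_test)).mean(axis=(1, 2))
# clf = RandomForestClassifier().fit(feature_matrix, labels)
```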

Keywords: anomaly detection, autoencoder, data centers, deep learning

Procedia PDF Downloads 194
24295 Tax Evasion and Macroeconomic (In)stability

Authors: Wei-Neng Wang, Jhy-Yuan Shieh, Jhy-Hwa Chen, Juin-Jen Chang

Abstract:

This paper incorporates tax evasion into a one-sector real business cycle (RBC) model to explore the quantitative interrelations between the income tax rate and equilibrium (in)determinacy, where the income tax rate is endogenously determined in order to balance the government budget. We find that in a tax evasion economy the level of the effective income tax rate, rather than the level of the income tax rate itself, is the key factor for equilibrium (in)determinacy. In an economy with tax evasion, a higher income tax rate is not sufficient to make the equilibrium indeterminate; it must be combined with a sufficiently low fraction of tax evasion, which can cause agents' optimistic expectations to become self-fulfilling and make sunspot fluctuations more likely to occur. On the other hand, an economy with tax evasion can see its macroeconomy become more stable, and a higher fraction of income tax evasion may have a stronger stabilizing effect.

Keywords: tax evasion, balanced-budget rule, equilibrium (in)determinacy, effective income tax rate

Procedia PDF Downloads 63
24294 Model and Neural Control of the Depth of Anesthesia during Surgery

Authors: Javier Fernandez, Mayte Medina, Rafael Fernandez de Canete, Nuria Alcain, Juan Carlos Ramos-Diaz

Abstract:

At present, the experimentation of anesthetic drugs on patients requires a regulation protocol, and the response of each patient to several doses of the drug must be well known. Therefore, the development of pharmacological dose control systems is a promising field of research in anesthesiology. In this paper, a non-linear compartmental pharmacokinetic-pharmacodynamic model has been developed that describes the depth-of-anesthesia effect in a sufficiently reliable way over a set of patients, with the depth effect quantified by the Bi-Spectral Index. Afterwards, an Artificial Neural Network (ANN) predictive controller was designed based on the depth-of-anesthesia model so as to keep the patient in the optimum condition while undergoing surgical treatment. For the purpose of quantifying the efficiency of the neural predictive controller, a classical proportional-integral-derivative controller was also developed to compare both strategies. Results show the superior performance of the neural predictive controller in Bi-Spectral Index reference tracking.

Keywords: anesthesia, bi-spectral index, neural network control, pharmacokinetic-pharmacodynamical model

Procedia PDF Downloads 337
24293 Influence of Foundation Size on Seismic Response of Mid-rise Buildings Considering Soil-Structure-Interaction

Authors: Quoc Van Nguyen, Behzad Fatahi, Aslan S. Hokmabadi

Abstract:

Performance-based seismic design is a modern approach to earthquake-resistant design that shifts the emphasis from “strength” to “performance”. Soil-structure interaction (SSI) can significantly influence the performance level of structures. In this paper, a fifteen-storey moment-resisting frame sitting on shallow foundations (footings) of different sizes is simulated numerically using ABAQUS software. The developed three-dimensional numerical simulation accounts for the nonlinear behaviour of the soil medium by considering the variation of soil stiffness and damping as a function of the shear strain developed in the soil elements during the earthquake. An elastic-perfectly plastic model is adopted to simulate the piles and structural elements. Quiet boundary conditions are assigned to the numerical model, and appropriate interface elements, capable of modelling sliding and separation between the foundation and soil elements, are considered. Numerical results in terms of base shear, lateral deformations, and inter-storey drifts of the structure are compared for soil-structure interaction systems with different foundation sizes as well as the fixed-base condition (excluding SSI). It can be concluded that conventional design procedures excluding SSI may result in an aggressive design. Moreover, the size of the foundation can influence the dynamic characteristics and seismic response of the building due to SSI and should therefore be given careful consideration in order to ensure a safe and cost-effective seismic design.

Keywords: soil-structure interaction, seismic response, shallow foundation, ABAQUS, Rayleigh damping

Procedia PDF Downloads 506
24292 Numerical Simulations on Feasibility of Stochastic Model Predictive Control for Linear Discrete-Time Systems with Random Dither Quantization

Authors: Taiki Baba, Tomoaki Hashimoto

Abstract:

The random dither quantization method enables us to achieve much better performance than the simple uniform quantization method in the design of quantized control systems. Motivated by this fact, the stochastic model predictive control method, in which a performance index is minimized subject to probabilistic constraints imposed on the state variables of the system, has been proposed for linear feedback control systems with random dither quantization. In other words, a method for solving optimal control problems subject to probabilistic state constraints for linear discrete-time control systems with random dither quantization has already been established. To the best of our knowledge, however, the feasibility of this kind of optimal control problem has not yet been studied. Our objective in this paper is to investigate the feasibility of stochastic model predictive control problems for linear discrete-time control systems with random dither quantization. To this end, we provide the results of numerical simulations that verify the feasibility of stochastic model predictive control problems for linear discrete-time control systems with random dither quantization.

Keywords: model predictive control, stochastic systems, probabilistic constraints, random dither quantization

Procedia PDF Downloads 282
24291 A Comparative Evaluation of the SIR and SEIZ Epidemiological Models to Describe the Diffusion Characteristics of COVID-19 Polarizing Viewpoints Online

Authors: Maryam Maleki, Esther Mead, Mohammad Arani, Nitin Agarwal

Abstract:

This study is conducted to examine how opposing viewpoints related to COVID-19 were diffused on Twitter. To accomplish this, six datasets using two epidemiological models, SIR (Susceptible, Infected, Recovered) and SEIZ (Susceptible, Exposed, Infected, Skeptics), were analyzed. The six datasets were chosen because they represent opposing viewpoints on the COVID-19 pandemic. Three of the datasets contain anti-subject hashtags, while the other three contain pro-subject hashtags. The time frame for all datasets is three years, starting from January 2020 to December 2022. The findings revealed that while both models were effective in evaluating the propagation trends of these polarizing viewpoints, the SEIZ model was more accurate with a relatively lower error rate (6.7%) compared to the SIR model (17.3%). Additionally, the relative error for both models was lower for anti-subject hashtags compared to pro-subject hashtags. By leveraging epidemiological models, insights into the propagation trends of polarizing viewpoints on Twitter were gained. This study paves the way for the development of methods to prevent the spread of ideas that lack scientific evidence while promoting the dissemination of scientifically backed ideas.
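A minimal sketch of the SEIZ compartment equations in the formulation commonly used for Twitter diffusion studies, integrated with SciPy; the parameter values and initial populations are placeholders, and the study itself estimates them by fitting the hashtag time series.

```python
# SEIZ: Susceptible, Exposed, Infected (tweeting), skeptics (Z); illustrative parameters.
from scipy.integrate import solve_ivp

def seiz(t, y, beta, b, p, l, rho, eps):
    S, E, I, Z = y
    N = S + E + I + Z
    dS = -beta * S * I / N - b * S * Z / N
    dE = (1 - p) * beta * S * I / N + (1 - l) * b * S * Z / N - rho * E * I / N - eps * E
    dI = p * beta * S * I / N + rho * E * I / N + eps * E
    dZ = l * b * S * Z / N
    return [dS, dE, dI, dZ]

params = dict(beta=0.5, b=0.3, p=0.2, l=0.4, rho=0.1, eps=0.05)   # placeholders
y0 = [10_000, 0, 10, 0]
sol = solve_ivp(seiz, (0, 120), y0, args=tuple(params.values()), dense_output=True)
# sol.y[2] is the infected (tweeting) compartment, to compare against observed counts.
```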

Keywords: mathematical modeling, epidemiological model, SEIZ model, SIR model, COVID-19, Twitter, social network analysis, social contagion

Procedia PDF Downloads 62
24290 Dynamics of Adiabatic Rapid Passage in an Open Rabi Dimer Model

Authors: Justin Zhengjie Tan, Yang Zhao

Abstract:

Adiabatic rapid passage, a popular method of achieving population inversion, is studied in a Rabi dimer model in the presence of noise, which acts as a dissipative environment. The integration of the multi-Davydov D2 Ansatz into the time-dependent variational framework enables us to model this intricate quantum system accurately. By driving the system with a field strength resonant with the energy spacing, the probability of adiabatic rapid passage, which is modelled after the Landau-Zener model, can be derived along with several other observables, such as the photon population. The effects of a dissipative environment are reproduced by coupling the system to a common phonon mode. By manipulating the strength and frequency of the driving field, along with the coupling strength of the phonon mode to the qubits, we are able to control the qubit and photon dynamics and subsequently increase the probability of adiabatic rapid passage.
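For reference, the textbook Landau-Zener expression against which the passage probability is usually benchmarked, written in the standard two-level convention; the Rabi dimer adds photon and phonon couplings that this closed form does not capture.

```latex
% Two-level Hamiltonian swept linearly through the avoided crossing:
%   H(t) = \tfrac{1}{2}\begin{pmatrix} vt & \Delta \\ \Delta & -vt \end{pmatrix}
% Probability of a diabatic transition versus successful adiabatic passage:
P_{\mathrm{LZ}} = \exp\!\left(-\frac{\pi \Delta^{2}}{2\hbar v}\right),
\qquad
P_{\mathrm{ARP}} = 1 - P_{\mathrm{LZ}}
```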

Keywords: quantum electrodynamics, adiabatic rapid passage, Landau-Zener transitions, dissipative environment

Procedia PDF Downloads 86
24289 Developing a Green Strategic Management Model Regarding HSE-MS

Authors: Amin Padash, Gholam Reza Nabi Bid Hendi, Hassan Hoveidi

Abstract:

Purpose: The aim of this research is to develop a model for green management based on the Health, Safety and Environmental Management System (HSE-MS). An HSE-MS can be a powerful tool for organizations both to improve their environmental, health, and safety performance and to enhance their business efficiency through green management. Model: The model developed in this study can be used by industries as a guideline for implementing green management by considering the Health, Safety and Environmental Management System. Case Study: The Pars Special Economic / Energy Zone Organization, on behalf of Iran’s Petroleum Ministry and the National Iranian Oil Company (NIOC), manages and develops the South and North oil and gas fields in the region. Methodology: In terms of its objective, this research is applied; in terms of implementation, it is descriptive and prescriptive. We used MCDM (Multiple Criteria Decision-Making) techniques to determine the priorities of the factors. Following a process approach, the model consists of the following steps and components: first, the factors involved in green issues are determined; based on them, a framework is constructed; then, using an MCDM algorithm (TOPSIS), the priorities of the basic variables are determined. The authors believe that the proposed model and the results of this research can help industry managers implement green practices according to the Health, Safety and Environmental Management System in a more efficient and effective manner. Findings and conclusion: The basic factors involved in green issues and their weights are the main finding; the model and the relations between the factors are a further finding of this research. The case study considers a petrochemical company for promoting ecological-industry thinking.
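A minimal crisp TOPSIS sketch for ranking factors, shown only to illustrate the algorithm named in the methodology; the decision matrix, weights, and benefit/cost flags are placeholders, and the paper combines TOPSIS with fuzzy-AHP weighting that is not reproduced here.

```python
# Classic TOPSIS: normalise, weight, measure distance to ideal and anti-ideal solutions.
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix: alternatives x criteria; benefit[j] True if criterion j is 'larger is better'."""
    norm = matrix / np.sqrt((matrix ** 2).sum(axis=0))        # vector normalisation
    v = norm * weights                                         # weighted normalised matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))    # positive ideal solution
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))     # negative ideal solution
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                             # closeness coefficient

# scores = topsis(np.array([[...], [...]]), weights=np.array([...]),
#                 benefit=np.array([True, True, False]))
# Higher closeness means higher priority among the HSE-MS / green-management factors.
```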

Keywords: fuzzy-AHP method, green management, health, safety and environmental management system, MCDM technique, TOPSIS

Procedia PDF Downloads 411
24288 Service Interactions Coordination Using a Declarative Approach: Focuses on Deontic Rule from Semantics of Business Vocabulary and Rules Models

Authors: Nurulhuda A. Manaf, Nor Najihah Zainal Abidin, Nur Amalina Jamaludin

Abstract:

Coordinating service interactions is a vital part of developing distributed applications that are built up as networks of autonomous participants (e.g., software components, web services, online resources) and involve collaboration between a diverse set of participant services from different providers. The complexity of coordinating service interactions reflects how important the techniques and approaches for designing and coordinating the interactions between participant services are to ensuring that the overall goal of the collaboration is achieved. The objective of this research is to develop the capability of steering a complex service interaction towards a desired outcome. Therefore, an efficient technique for modelling, generating, and verifying the coordination of service interactions is developed. The developed model describes service interactions using a service choreography approach and focuses on a declarative approach, advocating an Object Management Group (OMG) standard, the Semantics of Business Vocabulary and Rules (SBVR). This model, namely the SBVR model for service choreographies, focuses on declarative deontic rules expressing both obligation and prohibition, which can be particularly useful in coordinating service interactions. The generated SBVR model is then formulated and transformed into an Alloy model using the Alloy Analyzer in order to verify it. The transformation of SBVR into Alloy makes it possible to automatically generate the corresponding coordination of service interactions (service choreography), hence producing an immediate instance of execution that satisfies the constraints of the specification and verifying whether a specific request can be realised in the generated choreography.

Keywords: service choreography, service coordination, behavioural modelling, complex interactions, declarative specification, verification, model transformation, semantics of business vocabulary and rules, SBVR

Procedia PDF Downloads 155
24287 Application of Seasonal Autoregressive Integrated Moving Average Model for Forecasting Monthly Flows in Waterval River, South Africa

Authors: Kassahun Birhanu Tadesse, Megersa Olumana Dinka

Abstract:

Reliable information on future river flows is essential for the planning and management of any river system. For a data-scarce river system with only river flow records, like the Waterval River, univariate time series models are appropriate for river flow forecasting. In this study, a univariate Seasonal Autoregressive Integrated Moving Average (SARIMA) model was applied to forecast Waterval River flow using the GRETL statistical software. Mean monthly river flows from 1960 to 2016 were used for modeling. Different unit root tests and a Mann-Kendall trend analysis were performed to test the stationarity of the observed flow time series. The time series was differenced to remove the seasonality. Using the correlogram of the seasonally differenced time series, different SARIMA models were identified, their parameters were estimated, and a diagnostic check of the model forecasts was performed using white noise and heteroscedasticity tests. Finally, based on the minimum Akaike Information Criterion (AIC) and Hannan-Quinn criterion (HQC), SARIMA (3, 0, 2) x (3, 1, 3)12 was selected as the best model for Waterval River flow forecasting. Therefore, this model can be used to generate future river flow information for water resources development and management in the Waterval River system. The SARIMA model can also be used for forecasting other similar univariate time series with seasonal characteristics.
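A minimal sketch of fitting the selected SARIMA(3,0,2)x(3,1,3)12 model in Python with statsmodels (the study itself used GRETL); the file and column names are placeholders.

```python
# Fit the selected seasonal model and forecast two years of monthly flows.
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

flows = pd.read_csv("waterval_monthly_flow.csv", parse_dates=["date"],
                    index_col="date")["flow"]

model = SARIMAX(flows, order=(3, 0, 2), seasonal_order=(3, 1, 3, 12),
                enforce_stationarity=False, enforce_invertibility=False)
fit = model.fit(disp=False)
print(fit.summary())                       # AIC/BIC/HQC diagnostics, cf. model selection
forecast = fit.get_forecast(steps=24)      # two years of monthly flow forecasts
print(forecast.predicted_mean)
```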

Keywords: heteroscedasticity, stationarity test, trend analysis, validation, white noise

Procedia PDF Downloads 205