Search results for: finite element model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19098

15288 Using Flow Line Modelling and Remote Sensing to Reconstruct a Glacier Volume Loss Model for Athabasca Glacier, Canadian Rockies

Authors: Rituparna Nath, Shawn J. Marshall

Abstract:

Glaciers are among the most sensitive climatic indicators, as they respond strongly to small climatic shifts. We develop a flow line model of glacier dynamics to simulate the past and future extent of glaciers in the Canadian Rocky Mountains, with the aim of coupling this model into larger-scale regional climate models of glacier response to climate change. This paper focuses on glacier-climate modeling and reconstructions of glacier volume from the Little Ice Age (LIA) to the present for Athabasca Glacier, Alberta, Canada. Glacier thickness, volume, and mass change are reconstructed using flow line modelling and examination of different climate scenarios that give good reconstructions of LIA ice extent. Using SPOT 5 imagery, digital elevation models, and the GIS Arc Hydro tool, ice catchment properties, glacier widths, and LIA moraines have been extracted with automated procedures. Simulation of glacier mass change will inform estimates of meltwater runoff over the historical period, and model calibration against the LIA reconstruction will aid future projections of the effects of climate change on glacier recession. Furthermore, the model developed will support future studies of ensembles of glaciers.

Keywords: flow line modeling, Athabasca Glacier, glacier mass balance, remote sensing, Arc Hydro tool, Little Ice Age

Procedia PDF Downloads 268
15287 Numerical Investigation of the Electromagnetic Common Rail Injector Characteristics

Authors: Rafal Sochaczewski, Ksenia Siadkowska, Tytus Tulwin

Abstract:

The paper describes the modeling of a fuel injector for common rail systems. A one-dimensional model of a solenoid-valve-controlled injector with a Valve Covered Orifice (VCO) nozzle was built in AVL Hydsim. This model captures the dynamic phenomena that occur in the injector. The accuracy of the calibration, based on adjusting the parameters of the control valve and the nozzle needle lift, was verified by comparing the computed injector flow rates. The model is capable of precisely simulating injector operating parameters as functions of injection time and fuel pressure in the fuel rail. As a result, characteristics of the injector flow rate and backflow were obtained.

Keywords: common rail, diesel engine, fuel injector, modeling

Procedia PDF Downloads 412
15286 Two Layer Photo-Thermal Deflection Model to Investigate the Electronic Properties in BGaAs/GaAs Alloys

Authors: S. Ilahi, M. Baira, F. Saidi, N. Yacoubi, L. Auvray, H. Maaref

Abstract:

The photo-thermal deflection technique (PTD) is used to study the nonradiative recombination process in BGaAs/GaAs alloys with boron compositions of 3% and 8%, grown by metal organic chemical vapor deposition (MOCVD). A two-layer theoretical model has been developed that accounts for both the thermal and the electronic contributions to the photothermal signal, allowing the electronic parameters, namely the electronic diffusivity and the surface and interface recombination, to be extracted. It is found that increasing the boron composition alters the transport properties of the BGaAs epilayers.

Keywords: photothermal deflection technique, two-layer model, BGaAs/GaAs alloys, boron composition

Procedia PDF Downloads 301
15285 Dow Polyols near Infrared Chemometric Model Reduction Based on Clustering: Reducing Thirty Global Hydroxyl Number (OH) Models to Less Than Five

Authors: Wendy Flory, Kazi Czarnecki, Matthijs Mercy, Mark Joswiak, Mary Beth Seasholtz

Abstract:

Polyurethane materials are present in a wide range of industrial segments such as furniture, building and construction, composites, automotive, electronics, and more. Dow is one of the leaders in the manufacture of the two main raw materials, isocyanates and polyols, used to produce polyurethane products, and is also a key player in the manufacture of polyurethane systems/formulations designed for targeted applications. In 1990, the first analytical chemometric models were developed and deployed in the Dow polyols business QC labs for the quantification of OH, water, cloud point, and viscosity. Over the years many models have been added; there are now over 140 models for quantification and hundreds for product identification, too many to support reasonably. There are 29 global models for the quantification of OH alone, covering more than 70 products at many sites. An attempt was made to consolidate these into a single model. While the consolidated model showed good statistics across the entire OH range, several products exhibited a bias under ASTM E1655 individual product validation. This project summary presents the strategy for global model updates for OH, reducing the number of quantification models from over 140 to five or fewer using chemometric methods. To identify the best product groupings, we find clusters by reducing spectra to a few dimensions via Principal Component Analysis (PCA) and Uniform Manifold Approximation and Projection (UMAP). Results from these cluster analyses and a separate validation set allowed Dow to reduce the number of models for predicting OH from 29 to 3 without loss of accuracy.
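The clustering step described above can be sketched in outline. The following is a minimal, hypothetical illustration — synthetic "spectra", PCA via SVD, and a naive 2-means in place of the full PCA/UMAP workflow — and is not Dow's actual procedure.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic NIR "spectra": two product families with different band positions
grid = np.linspace(0, 1, 120)
family_a = np.exp(-((grid - 0.3) / 0.05) ** 2)
family_b = np.exp(-((grid - 0.7) / 0.05) ** 2)
spectra = np.vstack([family_a + 0.02 * rng.standard_normal(120) for _ in range(40)]
                    + [family_b + 0.02 * rng.standard_normal(120) for _ in range(40)])

# PCA via SVD: project spectra onto the top 2 principal components
centered = spectra - spectra.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ vt[:2].T

# Naive 2-means on the scores to recover the product groupings
centers = scores[[0, -1]]                       # seed with one sample per family
for _ in range(20):
    labels = np.argmin(((scores[:, None] - centers) ** 2).sum(-1), axis=1)
    centers = np.array([scores[labels == k].mean(axis=0) for k in range(2)])

print(labels[:40].std() == 0 and labels[40:].std() == 0)  # families separated?
```

In the real workflow, the clusters found this way would suggest which products can safely share one OH model.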

Keywords: hydroxyl, global model, model maintenance, near infrared, polyol

Procedia PDF Downloads 135
15284 Control HVAC Parameters by Brain Emotional Learning Based Intelligent Controller (BELBIC)

Authors: Javad Abdi, Azam Famil Khalili

Abstract:

Modeling emotions has attracted much attention in recent years, both in cognitive psychology and in the design of artificial systems. Although often regarded as a negative factor in decision-making, emotions have been shown to be a strong faculty for making fast, satisfying decisions. In this paper, we adapt a computational model based on the limbic system of the mammalian brain for control engineering applications. Learning in this model is based on Temporal Difference (TD) learning. We applied the proposed controller (termed BELBIC) to a simple model of a submarine that is required to reach a desired depth underwater. Our results demonstrate excellent control action, disturbance handling, and robustness to system parameter variation for TD-BELBIC. Given the present conditions, the system's past actions, and the control aims, the proposed method can control the system so that these objectives are attained in the least time and in the best way.

Keywords: artificial neural networks, temporal difference, brain emotional learning based intelligent controller, heating, ventilating and air conditioning

Procedia PDF Downloads 433
15283 A Convolutional Deep Neural Network Approach for Skin Cancer Detection Using Skin Lesion Images

Authors: Firas Gerges, Frank Y. Shih

Abstract:

Malignant melanoma, known simply as melanoma, is a type of skin cancer that appears as a mole on the skin. It is critical to detect this cancer at an early stage because it can spread across the body and may lead to the patient's death. When detected early, melanoma is curable. In this paper, we propose a deep learning model (a convolutional neural network) to automatically classify skin lesion images as malignant or benign. Images underwent pre-processing steps to diminish the effect of the normal skin region on the model. The proposed model showed a significant improvement over previous work, achieving an accuracy of 97%.
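As a rough illustration of the kind of model proposed, the following sketches a single conv → ReLU → max-pool → dense forward pass in NumPy with random, untrained weights. Every shape and name here is illustrative; the real classifier would be a trained multi-layer network.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2-D cross-correlation of a single-channel image with one kernel."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping max pooling; trims edges that do not fit."""
    h, w = (x.shape[0] // size) * size, (x.shape[1] // size) * size
    x = x[:h, :w]
    return x.reshape(h // size, size, w // size, size).max(axis=(1, 3))

def forward(img, kernel, w_dense, b_dense):
    """Conv -> ReLU -> max-pool -> dense -> sigmoid: P(malignant)."""
    feat = np.maximum(conv2d(img, kernel), 0.0)   # ReLU activation
    pooled = max_pool(feat)
    logit = pooled.ravel() @ w_dense + b_dense
    return 1.0 / (1.0 + np.exp(-logit))           # sigmoid probability

rng = np.random.default_rng(0)
img = rng.random((28, 28))                        # stand-in for a lesion image
kernel = rng.standard_normal((3, 3))
pooled_len = ((28 - 3 + 1) // 2) ** 2             # 13 x 13 = 169 features
p = forward(img, kernel, rng.standard_normal(pooled_len) * 0.1, 0.0)
print(0.0 < p < 1.0)
```

Training would adjust `kernel`, `w_dense`, and `b_dense` by backpropagation on labeled lesion images.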

Keywords: deep learning, skin cancer, image processing, melanoma

Procedia PDF Downloads 148
15282 Association Between Short-term NOx Exposure and Asthma Exacerbations in East London: A Time Series Regression Model

Authors: Hajar Hajmohammadi, Paul Pfeffer, Anna De Simoni, Jim Cole, Chris Griffiths, Sally Hull, Benjamin Heydecker

Abstract:

Background: There is strong interest in the relationship between short-term air pollution exposure and human health. Most studies in this field focus on serious health outcomes such as death or hospital admission, but air pollution exposure affects many people with less severe impacts, such as exacerbations of respiratory conditions. A lack of quantitative analysis and inconsistent findings suggest that improved methodology is needed to understand these effects more fully. Method: We developed a time series regression model to quantify the relationship between daily NOₓ concentration and asthma exacerbations requiring oral steroids in primary care settings. Explanatory variables included daily NOₓ concentration measurements from the 8 available background and roadside monitoring stations in east London and daily ambient temperature from London City Airport, also in east London. Lags of NOₓ concentrations up to 21 days (3 weeks) were used in the model. The dependent variable was the daily number of oral steroid courses prescribed for GP-registered patients with asthma in east London. A mixed distribution model was then fitted to the significant lags of the regression model. Result: The time series modelling showed a significant relationship between NOₓ concentrations on each day and the number of oral steroid courses prescribed in the following three weeks. In addition, the model using only roadside stations performs better than the model using a mixture of roadside and background stations.
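A distributed-lag regression of the kind described can be sketched as follows. The data are synthetic, and the lag structure, effect sizes, and plain OLS estimator (rather than the authors' mixed distribution model) are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
days = 400
nox = 40 + 10 * rng.standard_normal(days)          # daily NOx concentration

# Synthetic outcome: steroid courses depend on NOx at lags 0..21 days,
# with an effect that decays over the three-week window
true_lag_effect = np.exp(-np.arange(22) / 7.0) * 0.05
courses = np.array([
    5 + sum(true_lag_effect[k] * nox[t - k] for k in range(22))
    + rng.standard_normal() * 0.5
    for t in range(21, days)
])

# Distributed-lag design matrix: column k holds NOx lagged by k days
X = np.column_stack([nox[21 - k:days - k] for k in range(22)])
X = np.column_stack([np.ones(len(courses)), X])    # intercept column

beta, *_ = np.linalg.lstsq(X, courses, rcond=None) # OLS fit
print(beta[1:5])                                   # estimated lag-0..3 effects
```

With enough days of data, the estimated lag coefficients recover the decaying lag profile, which is the quantity of epidemiological interest here.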

Keywords: air pollution, time series modeling, public health, road transport

Procedia PDF Downloads 144
15281 Level Set Based Extraction and Update of Lake Contours Using Multi-Temporal Satellite Images

Authors: Yindi Zhao, Yun Zhang, Silu Xia, Lixin Wu

Abstract:

The contours and areas of water surfaces, especially lakes, often change due to natural disasters and construction activities. Extracting and updating water contours from satellite images using image processing algorithms is an effective approach, but producing water surface contours that are close to the true boundaries remains a challenging task. This paper compares the performance of three level set models for extracting lake contours: the Chan-Vese (CV) model, the signed pressure force (SPF) model, and the region-scalable fitting (RSF) energy model. Experiments indicate that the RSF model, in which a region-scalable fitting energy functional is defined and incorporated into a variational level set formulation, is superior to CV and SPF, and that it yields desirable contour lines when there are “holes” in the water regions, such as islands in a lake. Therefore, the RSF model is applied to extracting lake contours from Landsat satellite images. Four temporal Landsat images, from the years 2000, 2005, 2010, and 2014, are used in our study. All were acquired in May, with the same path/row (121/036), covering Xuzhou City, Jiangsu Province, China. Firstly, the near infrared (NIR) band is selected for water extraction. Image registration is conducted on the NIR bands of the different temporal images for information update, and linear stretching is applied to distinguish water from other land cover types. For the first temporal image, acquired in 2000, lake contours are extracted via the RSF model initialized with user-defined rectangles. Afterwards, using the lake contours extracted from the previous temporal image as initial values, lake contours are updated for the current temporal image by means of the RSF model. Meanwhile, changed and unchanged lakes are detected. The results show that great changes have taken place in two lakes, i.e. Dalong Lake and Panan Lake, and that RSF can effectively extract and update lake contours using multi-temporal satellite images.
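The region-competition idea underlying these models can be illustrated with the curvature-free core of the Chan-Vese piecewise-constant model: alternate between estimating the inside/outside mean intensities and reassigning pixels. This is a deliberate simplification of the variational level set formulations compared in the paper, run here on a synthetic lake-with-island image.

```python
import numpy as np

def chan_vese_core(img, n_iter=20):
    """Curvature-free core of the Chan-Vese model: alternately estimate the
    mean intensities c1 (inside) and c2 (outside), then reassign each pixel
    to whichever region fit (I - c)^2 is smaller."""
    mask = img > img.mean()                 # crude initialisation
    for _ in range(n_iter):
        c1 = img[mask].mean()
        c2 = img[~mask].mean()
        mask = (img - c1) ** 2 < (img - c2) ** 2
    return mask

# Synthetic "lake": a bright ring-shaped water body with an island hole
rng = np.random.default_rng(2)
yy, xx = np.mgrid[0:80, 0:80]
r2 = (xx - 40) ** 2 + (yy - 40) ** 2
lake = (r2 < 30 ** 2) & (r2 > 8 ** 2)
img = lake * 1.0 + 0.1 * rng.standard_normal((80, 80))

seg = chan_vese_core(img)
print(np.mean(seg == lake))   # fraction of correctly labelled pixels
```

The full variational models add a curvature (length) penalty, which is what keeps contours smooth on noisy real imagery; the region-fit term shown here is why "holes" such as islands are recovered naturally.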

Keywords: level set model, multi-temporal image, lake contour extraction, contour update

Procedia PDF Downloads 366
15280 Sedimentological and Geochemical Characteristics of Aeolian Sediments and Their Implication for Sand Origin in the Yarlung Zangbo River Valley, Southern Qinghai-Tibetan Plateau

Authors: Na Zhou, Chun-Lai Zhang, Qing Li, Bingqi Zhu, Xun-Ming Wang

Abstract:

The understanding of the dynamics of aeolian sand in the Yarlung Zangbo River Valley (YLZBV), southern Qinghai-Tibetan Plateau, including its origins, transportation, and deposition, remains preliminary. In this study, we investigated the origin of aeolian sediments in the YLZBV by analyzing the grain size distribution and geochemical composition of dune sediments collected from the wide river terraces. The major purpose is to characterize the sedimentological and geochemical compositions of these aeolian sediments, trace their sources, and understand their influencing factors. The grain size and geochemistry variations, which show a significant correlation between grain size distribution and element abundances, give strong evidence that an important part of the aeolian sediments in the downstream areas was first derived from the upper reaches by intense fluvial processes. However, the sediments experienced significant mixing with local inputs and were reworked by regional wind transport. The diverse compositions and tight associations in the major and trace element geochemistry between the upstream and downstream aeolian sediments and the local detrital rocks, collected from the surrounding mountains, suggest that the upstream aeolian sediments originated from various close-range rock types and experienced intensive mixing via aeolian-fluvial dynamics. The sand mass transported by water and wind was roughly estimated to quantify the interplay between the aeolian and fluvial processes controlling sediment transport, yield, and ultimately the shaping of the aeolian landforms along the mainstream of the YLZBV.

Keywords: grain size distribution, geochemistry, wind and water load, sand source, Yarlung Zangbo River Valley

Procedia PDF Downloads 97
15279 Clinician's Perspective of Common Factors of Change in Family Therapy: A Cross-National Exploration

Authors: Hassan Karimi, Fred Piercy, Ruoxi Chen, Ana L. Jaramillo-Sierra, Wei-Ning Chang, Manjushree Palit, Catherine Martosudarmo, Angelito Antonio

Abstract:

Background: The two psychotherapy camps, the randomized clinical trials (RCTs) camp and the common factors model camp, have competitively claimed specific explanations for therapy effectiveness. Recently, scholars have called for empirical evidence on the role of common factors in therapeutic outcomes in marriage and family therapy. Purpose: This cross-national study aims to explore how clinicians, across different nations and theoretical orientations, attribute the contribution of common factors to therapy outcome. Method: A brief common factors questionnaire (CFQ, with a Cronbach's alpha of 0.77) was developed and administered in seven nations. A series of statistical analyses (paired-samples t-test, independent-samples t-test, ANOVA) was conducted to compare clinicians' perceived contributions of total common factors versus model-specific factors, to compare each pair of common factors categories, and to compare clinicians from collectivistic nations with clinicians from individualistic nations. Results: Clinicians across seven nations attributed 86% of therapeutic change to common factors versus 14% to model-specific factors. Clinicians attributed 34% of therapeutic change to client factors, 26% to therapist factors, 26% to relationship factors, and 14% to model-specific techniques. The ANOVA test indicated that each of the three categories of common factors (client 34%, therapist 26%, relationship 26%) contributed more to therapeutic outcome than the category of model-specific factors (techniques 14%). Clinicians with a psychology degree attributed more contribution to model-specific factors than clinicians with MFT and counseling degrees, who attributed more contribution to client factors. Clinicians from collectivistic nations attributed larger contributions to therapist factors (M=28.96, SD=12.75) than the US clinicians (M=23.22, SD=7.73). The US clinicians attributed a larger contribution to client factors (M=39.02, SD=15.04) than clinicians from the collectivistic nations (M=28.71, SD=15.74). Conclusion: The findings indicate that clinicians across the globe attributed well over two-thirds of therapeutic change to common factors, which underscores the importance of training in the common factors model in the field. Common factors, like model-specific factors, vary in their contribution to therapy outcome in relation to the specific client, therapist, problem, treatment model, and sociocultural context. Sociocultural expectations and norms should be considered as the context in which both common factors and model-specific factors function toward therapeutic goals. Clinicians need to foster cultural competency, specifically regarding the divergent ways that common factors can be activated due to specific sociocultural values.

Keywords: common factors, model-specific factors, cross-national survey, therapist cultural competency, enhancing therapist efficacy

Procedia PDF Downloads 287
15278 A Closed-Loop Design Model for Sustainable Manufacturing by Integrating Forward Design and Reverse Design

Authors: Yuan-Jye Tseng, Yi-Shiuan Chen

Abstract:

In this paper, a new concept of a closed-loop design model is presented, developed by integrating forward design and reverse design. Based on this concept, a closed-loop design model for sustainable manufacturing is developed with integrated evaluation of forward design, reverse design, and green manufacturing using a fuzzy analytic network process (ANP). In the design stage of a product, a given product requirement and objective can be met by different detailed component designs and specifications, so there can be different design cases achieving the same requirement and objective. In the design evaluation stage, these different design cases must therefore be analyzed and evaluated. The purpose of this research is to develop a model for evaluating the design cases by integrated evaluation of the forward design, reverse design, and green manufacturing models. A fuzzy ANP model is presented for integrated evaluation of the criteria in the three models, and the comparison matrices for evaluating the criteria in the three groups are established. The total relational values among the three groups represent the total relational effects. In application, a supermatrix can be created and the total relational values used to evaluate the design cases and select the final design case. An example product is demonstrated, showing that the model is useful for integrated evaluation of forward design, reverse design, and green manufacturing to achieve the closed-loop design objective for sustainable manufacturing.
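The supermatrix step mentioned above can be sketched as follows. The weights are hypothetical defuzzified values for three criteria clusters, and the limit computation shown is the standard ANP power method rather than the authors' full fuzzy model.

```python
import numpy as np

# Hypothetical defuzzified supermatrix over 3 criteria clusters:
# forward design (F), reverse design (R), green manufacturing (G).
# Each column is stochastic: the influence weights on that column's element.
W = np.array([
    [0.2, 0.5, 0.3],   # influence of F
    [0.5, 0.2, 0.4],   # influence of R
    [0.3, 0.3, 0.3],   # influence of G
])

def limit_supermatrix(W, tol=1e-10, max_iter=1000):
    """Raise W to successive powers until the entries stop changing;
    any column of the limit matrix then gives the global priority vector."""
    M = W.copy()
    for _ in range(max_iter):
        M_next = M @ W
        if np.abs(M_next - M).max() < tol:
            return M_next
        M = M_next
    return M

priorities = limit_supermatrix(W)[:, 0]
print(priorities)          # global weights of F, R, G
```

In the full method, a design case scoring highest against these global priorities would be selected as the final design.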

Keywords: design evaluation, forward design, reverse design, closed-loop design, supply chain management, closed-loop supply chain, fuzzy analytic network process

Procedia PDF Downloads 676
15277 Modeling Corruption Dynamics Within Bono and Ahafo Police Service in Ghana

Authors: Adam Ahmed Hosney

Abstract:

The existence of a culture of corruption within an institution such as the police can be a sign of failure from various angles, and there is a general perception among Ghanaians that the most corrupt institution is the police service. The purpose of this study is to formulate and analyze a nonlinear mathematical model that treats corruption within the Ghana police service as an epidemic. The study derives the basic reproduction number that separates corruption extinction from corruption persistence. The threshold conditions for all equilibrium points are obtained using linearization and Lyapunov functional methods, and they demonstrate local asymptotic stability for both the corruption-endemic and the corruption-free equilibrium states. The model was analyzed qualitatively and its solution derived; the feasible region is positively invariant and attracting, so it is adequate to study the dynamics of the model within it. Graphical results are presented and discussed to illustrate the solution. Results show that corruption will die out within the police service if the government shows no tolerance for those involved in corrupt practices. The findings indicate that leaders should be trustworthy, demonstrate a complete and viable commitment to addressing corruption, and prioritize mass education of all citizens, as well as engaging religious leaders in the fight against corruption, since most Ghanaians are religious and trust their leaders.
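A minimal sketch of this kind of compartmental "corruption epidemic" model, with hypothetical rates, staff turnover, and forward-Euler integration; the actual model and parameters in the paper may differ.

```python
def simulate(beta, gamma=0.15, mu=0.05, c0=0.05, dt=0.01, t_end=200.0):
    """Forward-Euler integration of a minimal corruption-transmission model
    with staff turnover, in fractions of the force:
    S' = mu - beta*S*C - mu*S   (susceptible officers)
    C' = beta*S*C - (gamma + mu)*C   (corrupt officers)
    beta: corrupting-contact rate; gamma: reform rate; mu: turnover rate."""
    s, c = 1.0 - c0, c0
    for _ in range(int(t_end / dt)):
        ds = mu - beta * s * c - mu * s
        dc = beta * s * c - (gamma + mu) * c
        s, c = s + dt * ds, c + dt * dc
    return s, c

# Basic reproduction number R0 = beta / (gamma + mu) governs the outcome:
_, c_low = simulate(beta=0.1)    # R0 = 0.5 < 1: corruption dies out
_, c_high = simulate(beta=0.8)   # R0 = 4.0 > 1: corruption persists (endemic)
print(round(c_low, 4), round(c_high, 4))
```

Raising gamma (swift sanctioning and reform of corrupt officers) pushes R0 below one, which is the "zero tolerance" conclusion in quantitative form.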

Keywords: mathematical model, differential equation, dynamical system, simulation

Procedia PDF Downloads 27
15276 Specific Frequency of Globular Clusters in Different Galaxy Types

Authors: Ahmed H. Abdullah, Pavel Kroupa

Abstract:

Globular clusters (GCs) are important objects for tracing the early evolution of a galaxy. We study the correlation between the cluster population and the global properties of the host galaxy. We find that the correlation between the cluster population N_GC and the baryonic mass M_b of the host galaxy is best described as N_GC ≈ 10^(-5.6038) M_b. In order to understand the origin of the U-shaped relation between the GC specific frequency S_N and M_b (caused by the high values of S_N for dwarf galaxies and giant ellipticals and a minimum S_N for intermediate-mass galaxies with M_b ≈ 10^10 M_⊙), we derive a theoretical model for the specific frequency, S_N,th. The model is based on the slope β of the power-law embedded cluster mass function and different formation time scales Δt of the galaxy. Our results show good agreement between the observations and the model for particular values of β and Δt. The model reproduces the higher values of S_N,th with β = 1.5 at intermediate formation time scales.

Keywords: galaxies: dwarf, globular cluster: specific frequency, number of globular clusters, formation time scale

Procedia PDF Downloads 326
15275 Using Structural Equation Modeling to Analyze the Impact of Remote Work on Job Satisfaction

Authors: Florian Pfeffel, Valentin Nickolai, Christian Louis Kühner

Abstract:

Digitalization has disrupted the traditional workplace environment by allowing many employees to work from anywhere at any time. This trend of working from home was further accelerated by the COVID-19 crisis, which forced companies to rethink their workplace models. While in many companies this shift happened out of pure necessity, many employees ended up more satisfied with their jobs due to the opportunity to work from home. This study focuses on employees’ job satisfaction in the service sector as a function of the work model, defined as a “work from home” model, the traditional “work in office” model, or a hybrid model. Using structural equation modeling (SEM), these three work models have been analyzed based on 13 factors influencing job satisfaction, which are further summarized into the three groups “classic influencing factors”, “influencing factors changed by remote working”, and “new remote working influencing factors”. Based on these influencing factors, a survey was conducted with n = 684 employees in the service sector. Cronbach’s alpha of the individual constructs was shown to be suitable. Furthermore, the construct validity of the constructs was confirmed by face validity, content validity, convergent validity (AVE > 0.5; CR > 0.7), and discriminant validity. Additionally, confirmatory factor analysis (CFA) confirmed the model fit for the investigated sample (CMIN/DF: 2.567; CFI: 0.927; RMSEA: 0.048). The SEM analysis showed that the most significant influencing factor on job satisfaction is “identification with the work” (β = 0.540), followed by “appreciation” (β = 0.151), “compensation” (β = 0.124), “work-life balance” (β = 0.116), and “communication and exchange of information” (β = 0.105).
While the significance of each factor can vary depending on the work model, the SEM analysis shows that identification with the work is the most significant factor in all three work models and, in the case of the traditional office work model, the only significant influencing factor. The study shows that employees who work entirely remotely or in a hybrid model are significantly more satisfied with their jobs, with a job satisfaction score of 5.0 on a scale from 1 (very dissatisfied) to 7 (very satisfied), than employees who do not have the option to work from home, who score 4.6. This results from the lower identification with the work in the model without any remote working. Furthermore, the responses indicate that it is important to consider the individual preferences of each employee when choosing the work model in order to achieve higher overall job satisfaction. Thus, it can be argued that companies can benefit from more motivation and higher productivity by considering individual work model preferences, thereby increasing identification with the respective work.
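The Cronbach's alpha reliability check mentioned above can be computed directly. This sketch uses synthetic Likert-style items driven by one latent factor, not the study's data.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) response matrix.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Synthetic construct: 4 items driven by one latent factor plus noise,
# for the same sample size as the study (n = 684)
rng = np.random.default_rng(3)
latent = rng.standard_normal(684)                 # one latent score per respondent
items = latent[:, None] + 0.6 * rng.standard_normal((684, 4))

print(round(cronbach_alpha(items), 3))            # comfortably above 0.7 here
```

Values above roughly 0.7 are the conventional threshold for "suitable" internal consistency, which is the criterion the abstract refers to.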

Keywords: home-office, identification with work, job satisfaction, new work, remote work, structural equation modeling

Procedia PDF Downloads 82
15274 Using the Technology Acceptance Model to Examine Seniors’ Attitudes toward Facebook

Authors: Chien-Jen Liu, Shu Ching Yang

Abstract:

Using the technology acceptance model (TAM), this study examined the external variable of technological complexity (TC) to acquire a better understanding of the factors that influence the acceptance of computer application courses by learners at Active Aging Universities. After the learners in this study had completed a 27-hour Facebook course, 44 of them responded to a modified TAM survey. Data were collected to examine the path relationships among the variables that influence the acceptance of Facebook-mediated community learning. The partial least squares (PLS) method was used to test the measurement and the structural model. The results demonstrated that attitudes toward Facebook use directly influence behavioral intentions (BI) with respect to Facebook use, with a high prediction rate of 58.3%. In addition to the perceived usefulness (PU) and perceived ease of use (PEOU) measures proposed in the TAM, external variables such as TC also indirectly influence BI. These four variables explain 88% of the variance in BI, demonstrating a high level of predictive ability. Finally, limitations of this investigation and implications for further research are discussed.

Keywords: technology acceptance model (TAM), technological complexity, partial least squares (PLS), perceived usefulness

Procedia PDF Downloads 346
15273 Scenario-Based Analysis of Electric Vehicle Penetration in Road Transportation in Laos

Authors: Bouneua Khamphilavanh, Toshihiko Masui

Abstract:

The penetration of electric vehicle (EV) technology into Lao road transportation was analyzed in this study using the AIM/CGE [Laos] model, a computable general equilibrium (CGE) model developed by the Asia-Pacific Integrated Model (AIM) team. In line with the growing number of road vehicles, energy demand in the transport sector has gradually increased, resulting in a large budget spent on importing fossil fuels during the last decade and high carbon dioxide emissions from the transport sector. The aim of this research is therefore to analyze the impact of EV penetration on the economy and on CO₂ emissions in the short, middle, and long term. By 2050, because Laos will spend more on importing EVs, the expected gross domestic product (GDP) will gradually decline by up to one percent. The cumulative CO₂ emissions from 2020 to 2050 will be 12,000 GgCO₂eq in the BAU case and 9,300 GgCO₂eq in the EV mitigation case; that is, introducing EV technology reduces cumulative road-transport CO₂ emissions to roughly 77% of the BAU level, a reduction of about 23%.

Keywords: GDP, CO₂ mitigation, CGE model, EV technology, transport

Procedia PDF Downloads 278
15272 Bankruptcy Prediction Analysis on Mining Sector Companies in Indonesia

Authors: Devina Aprilia Gunawan, Tasya Aspiranti, Inugrah Ratia Pratiwi

Abstract:

This research aims to classify mining sector companies in Indonesia based on Altman’s Z-score model, to analyze the model’s financial ratios in order to give a picture of the financial condition of these companies and their future viability, and to find out the partial and simultaneous impact of each of the financial ratio variables in the Altman Z-score model, namely WC/TA, RE/TA, EBIT/TA, MVE/TL, and S/TA, on the financial condition represented by the Z-score itself. Among the 38 mining sector companies listed on the Indonesia Stock Exchange (IDX), 28 were selected as the research sample according to purposive sampling criteria. The results showed that during the three-year research period, 2010-2012, the number of companies predicted to be healthy in each year was less than half of the total sample. The multiple regression analysis showed that all of the research hypotheses are accepted, which means that WC/TA, RE/TA, EBIT/TA, MVE/TL, and S/TA, both partially and simultaneously, have an impact on a company’s financial condition.
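The combination of the five ratios into a Z-score can be sketched directly. The coefficients below are Altman's original (1968) ones for listed manufacturers, which the study may adapt, and the input ratios are hypothetical.

```python
def altman_z(wc_ta, re_ta, ebit_ta, mve_tl, s_ta):
    """Original Altman (1968) Z-score:
    Z = 1.2*WC/TA + 1.4*RE/TA + 3.3*EBIT/TA + 0.6*MVE/TL + 1.0*S/TA."""
    return 1.2 * wc_ta + 1.4 * re_ta + 3.3 * ebit_ta + 0.6 * mve_tl + 1.0 * s_ta

def zone(z):
    """Conventional cut-offs: Z > 2.99 'safe', 1.81-2.99 grey zone,
    Z < 1.81 financial distress."""
    if z > 2.99:
        return "safe"
    if z >= 1.81:
        return "grey"
    return "distress"

# Ratios for a hypothetical mining company
z = altman_z(wc_ta=0.15, re_ta=0.20, ebit_ta=0.10, mve_tl=1.10, s_ta=0.80)
print(round(z, 2), zone(z))   # -> 2.25 grey
```

Classifying each sample company per year this way is how the "healthy versus distressed" counts in the abstract are obtained.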

Keywords: Altman’s Z-score model, financial condition, mining companies, Indonesia

Procedia PDF Downloads 529
15271 Anomaly Detection in a Data Center with a Reconstruction Method Using a Multi-Autoencoders Model

Authors: Victor Breux, Jérôme Boutet, Alain Goret, Viviane Cattin

Abstract:

Early detection of anomalies in data centers is important to reduce downtime and the costs of periodic maintenance. However, there is little research on this topic and even less on the fusion of sensor data for the detection of abnormal events. The goal of this paper is to propose a method for anomaly detection in data centers that combines sensor data (temperature, humidity, power) and deep learning models. The model described in the paper uses one autoencoder per sensor to reconstruct the inputs. The autoencoders contain Long Short-Term Memory (LSTM) layers and are trained on the normal samples of the relevant sensors, selected by correlation analysis. The difference signal between the input and its reconstruction is then used to classify the samples, using feature extraction and a random forest classifier. Data measured by the sensors of a data center between January 2019 and May 2020 are used to train the model, while data between June 2020 and May 2021 are used to assess it. Performance is assessed a posteriori through the F1-score by comparing detected anomalies with the data center’s history. The proposed model, with an F1-score of 83.60% compared to 24.16%, outperforms the state-of-the-art reconstruction method, which uses a single autoencoder taking multivariate sequences and detects anomalies with a threshold on the reconstruction error.
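The reconstruction-error principle of the method can be illustrated with a lightweight stand-in: a PCA "autoencoder" on synthetic correlated sensor data, with a threshold learned from normal samples. The paper's model uses per-sensor LSTM autoencoders and a random forest instead; this is only the shared core idea.

```python
import numpy as np

rng = np.random.default_rng(4)

# Normal training data: temperature, humidity, power move together (correlated)
base = rng.standard_normal((500, 1))
train = np.hstack([base + 0.05 * rng.standard_normal((500, 1)) for _ in range(3)])

# PCA "autoencoder": project onto the top principal component and back
mean = train.mean(axis=0)
_, _, vt = np.linalg.svd(train - mean, full_matrices=False)
encode_decode = lambda x: (x - mean) @ vt[:1].T @ vt[:1] + mean

# Threshold on reconstruction error, learned from the normal samples only
train_err = np.linalg.norm(train - encode_decode(train), axis=1)
threshold = train_err.mean() + 3 * train_err.std()

err = lambda x: np.linalg.norm(x - encode_decode(x), axis=1)[0]
normal = np.array([[0.5, 0.5, 0.5]])     # sensors agree: reconstructs well
anomaly = np.array([[0.5, 0.5, 3.0]])    # power spikes alone: large error
print(err(normal) < threshold, err(anomaly) > threshold)
```

An event that breaks the learned correlation between sensors cannot be reconstructed from the normal subspace, so its error exceeds the threshold — the same signal the paper feeds to its classifier.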

Keywords: anomaly detection, autoencoder, data centers, deep learning

Procedia PDF Downloads 194
15270 Model and Neural Control of the Depth of Anesthesia during Surgery

Authors: Javier Fernandez, Mayte Medina, Rafael Fernandez de Canete, Nuria Alcain, Juan Carlos Ramos-Diaz

Abstract:

At present, the experimentation of anesthetic drugs on patients requires a regulation protocol, and the response of each patient to several doses of the drug must be well known. The development of pharmacological dose control systems is therefore a promising field of research in anesthesiology. In this paper, a non-linear compartmental pharmacokinetic-pharmacodynamic model has been developed that describes the depth-of-anesthesia effect in a sufficiently reliable way over a set of patients, with the depth effect quantified by the Bispectral Index. An Artificial Neural Network (ANN) predictive controller has then been designed, based on the depth-of-anesthesia model, to keep the patient in the optimal condition during surgical treatment. To quantify the efficiency of the neural predictive controller, a classical proportional-integral-derivative controller has also been developed for comparison. Results show the superior performance of the predictive neural controller in Bispectral Index reference tracking.
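A generic three-compartment pharmacokinetic model with an effect-site compartment, of the kind such controllers are built on, can be sketched as follows. All rate constants are illustrative placeholders, not the fitted patient parameters of the paper.

```python
import numpy as np

def pk_simulate(dose, k10=0.12, k12=0.11, k21=0.055, k13=0.04, k31=0.003,
                ke0=0.46, dt=0.01, t_end=30.0):
    """Three-compartment pharmacokinetic model with an effect-site compartment,
    integrated by forward Euler. a1..a3: drug amounts in the central, fast
    peripheral, and slow peripheral compartments; ce: effect-site level
    (proportional units). Rate constants are per minute, purely illustrative."""
    a1, a2, a3, ce = dose, 0.0, 0.0, 0.0
    trace = []
    for _ in range(int(t_end / dt)):
        da1 = -(k10 + k12 + k13) * a1 + k21 * a2 + k31 * a3
        da2 = k12 * a1 - k21 * a2
        da3 = k13 * a1 - k31 * a3
        dce = ke0 * (a1 - ce)              # effect site lags the plasma level
        a1 = a1 + dt * da1
        a2 = a2 + dt * da2
        a3 = a3 + dt * da3
        ce = ce + dt * dce
        trace.append(ce)
    return np.array(trace)

ce = pk_simulate(dose=100.0)               # response to a single bolus
peak_minute = ce.argmax() * 0.01           # time of peak effect-site level
print(round(peak_minute, 2), round(ce[-1], 2))
```

A model predictive controller would simulate this forward from the current state to choose the infusion rate that tracks a Bispectral Index target; the pharmacodynamic step mapping `ce` to the index is omitted here.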

Keywords: anesthesia, bi-spectral index, neural network control, pharmacokinetic-pharmacodynamic model

Procedia PDF Downloads 337
15269 Developing a Model for Information Giving Behavior in Virtual Communities

Authors: Pui-Lai To, Chechen Liao, Tzu-Ling Lin

Abstract:

Virtual communities have created a range of new social spaces in which to meet and interact with one another. Whether as stand-alone models or as supplements that sustain competitive advantage for conventional business models, virtual communities have been hailed as one of the major strategic innovations of the new economy. However, for a virtual community to evolve, the biggest challenge is how to make members actively give information or provide advice. Even in busy virtual communities, usually only a small fraction of members post information actively. In order to investigate the determinants of the information-giving willingness of those contributors who actively provide their opinions, we propose a model to understand the reasons for contribution in communities. The study will serve as a basis for the future growth of information giving in virtual communities.

Keywords: information giving, social identity, trust, virtual community

Procedia PDF Downloads 322
15268 Numerical Simulations on Feasibility of Stochastic Model Predictive Control for Linear Discrete-Time Systems with Random Dither Quantization

Authors: Taiki Baba, Tomoaki Hashimoto

Abstract:

The random dither quantization method achieves much better performance than simple uniform quantization in the design of quantized control systems. Motivated by this fact, a stochastic model predictive control method, in which a performance index is minimized subject to probabilistic constraints imposed on the state variables of the system, has been proposed for linear feedback control systems with random dither quantization. In other words, a method for solving optimal control problems subject to probabilistic state constraints for linear discrete-time control systems with random dither quantization has already been established. To the best of our knowledge, however, the feasibility of such optimal control problems has not yet been studied. Our objective in this paper is to investigate the feasibility of stochastic model predictive control problems for linear discrete-time control systems with random dither quantization. To this end, we provide numerical simulation results that verify this feasibility.
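The core idea of random dither quantization can be demonstrated directly: adding uniform noise over one quantization step before rounding makes the quantizer unbiased on average, which plain uniform quantization is not. A minimal sketch with illustrative parameters:

```python
import random

def uniform_quantize(x, step):
    """Plain uniform quantizer: round to the nearest multiple of `step`."""
    return step * round(x / step)

def dithered_quantize(x, step, rng):
    """Random dither: add uniform noise in [-step/2, step/2) before rounding."""
    d = rng.uniform(-step / 2, step / 2)
    return uniform_quantize(x + d, step)

rng = random.Random(0)
x, step, n = 0.3, 1.0, 100_000

plain = uniform_quantize(x, step)  # always rounds 0.3 down to 0.0
dithered_mean = sum(dithered_quantize(x, step, rng) for _ in range(n)) / n

print(plain, round(dithered_mean, 2))
```

The dithered quantizer emits 1.0 with probability 0.3 and 0.0 otherwise, so its sample mean approaches the true value 0.3, while the plain quantizer is stuck at 0.0; this unbiasedness is what makes the stochastic (probabilistic-constraint) treatment of the quantized control loop natural.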

Keywords: model predictive control, stochastic systems, probabilistic constraints, random dither quantization

Procedia PDF Downloads 282
15267 A Comparative Evaluation of the SIR and SEIZ Epidemiological Models to Describe the Diffusion Characteristics of COVID-19 Polarizing Viewpoints on Online

Authors: Maryam Maleki, Esther Mead, Mohammad Arani, Nitin Agarwal

Abstract:

This study examines how opposing viewpoints related to COVID-19 were diffused on Twitter. To accomplish this, six datasets were analyzed using two epidemiological models, SIR (Susceptible, Infected, Recovered) and SEIZ (Susceptible, Exposed, Infected, Skeptics). The six datasets were chosen because they represent opposing viewpoints on the COVID-19 pandemic: three contain anti-subject hashtags, while the other three contain pro-subject hashtags. The time frame for all datasets spans three years, from January 2020 to December 2022. The findings revealed that while both models were effective in evaluating the propagation trends of these polarizing viewpoints, the SEIZ model was more accurate, with a relatively lower error rate (6.7%) than the SIR model (17.3%). Additionally, the relative error for both models was lower for anti-subject hashtags than for pro-subject hashtags. By leveraging epidemiological models, insights were gained into the propagation trends of polarizing viewpoints on Twitter. This study paves the way for methods that curb the spread of ideas lacking scientific evidence while promoting the dissemination of scientifically backed ideas.
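For reference, the SIR model mentioned above reduces to three coupled ODEs, dS/dt = -βSI/N, dI/dt = βSI/N - γI, dR/dt = γI, which can be integrated in a few lines; the parameters below are illustrative, not the fitted values of the study:

```python
def simulate_sir(beta, gamma, s0, i0, r0, days, dt=0.1):
    """Forward-Euler integration of the SIR compartmental model."""
    s, i, r = float(s0), float(i0), float(r0)
    n = s0 + i0 + r0  # total population, conserved by construction
    history = [(s, i, r)]
    for _ in range(int(days / dt)):
        ds = -beta * s * i / n
        dr = gamma * i
        di = -ds - dr  # infections in minus recoveries out
        s += ds * dt
        i += di * dt
        r += dr * dt
        history.append((s, i, r))
    return history

hist = simulate_sir(beta=0.3, gamma=0.1, s0=9990, i0=10, r0=0, days=160)
peak_infected = max(i for _, i, _ in hist)
print(round(peak_infected))
```

Fitting such a model to hashtag-adoption counts (and likewise the SEIZ variant, which adds Exposed and Skeptic compartments) is what yields the error rates the abstract compares.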

Keywords: mathematical modeling, epidemiological model, seiz model, sir model, covid-19, twitter, social network analysis, social contagion

Procedia PDF Downloads 62
15266 Dynamics of Adiabatic Rapid Passage in an Open Rabi Dimer Model

Authors: Justin Zhengjie Tan, Yang Zhao

Abstract:

Adiabatic rapid passage, a popular method of achieving population inversion, is studied in a Rabi dimer model in the presence of noise, which acts as a dissipative environment. The integration of the multi-Davydov D2 Ansatz into the time-dependent variational framework enables us to model the intricate quantum system accurately. By driving the system with a field strength resonant with the energy spacing, the probability of adiabatic rapid passage, modelled after the Landau-Zener model, can be derived along with several other observables, such as the photon population. The effects of a dissipative environment are reproduced by coupling the system to a common phonon mode. By manipulating the strength and frequency of the driving field, along with the coupling strength of the phonon mode to the qubits, we are able to control the qubit and photon dynamics and thereby increase the probability of adiabatic rapid passage.
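The Landau-Zener result referenced above gives, in one common convention, the probability of the system remaining in its diabatic state as P = exp(-2πΔ²/(ħ|v|)) for coupling Δ and sweep rate v. A small sketch (units with ħ = 1, illustrative values) shows how slower sweeps drive the transfer probability toward one:

```python
import math

def landau_zener_diabatic_probability(coupling, sweep_rate, hbar=1.0):
    """Probability of staying in the diabatic state after the crossing:
    P = exp(-2*pi*|coupling|^2 / (hbar*|sweep_rate|))."""
    return math.exp(-2 * math.pi * coupling ** 2 / (hbar * abs(sweep_rate)))

# Slower sweeps (smaller sweep_rate) push the transfer probability toward 1,
# which is the regime adiabatic rapid passage exploits.
sweep_rates = [10.0, 1.0, 0.1]
transfer = [1.0 - landau_zener_diabatic_probability(0.5, v) for v in sweep_rates]
print([round(p, 3) for p in transfer])
```

The dimer model of the paper dresses this picture with photon and phonon degrees of freedom, but the sweep-rate/coupling trade-off above is still the knob being turned.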

Keywords: quantum electrodynamics, adiabatic rapid passage, Landau-Zener transitions, dissipative environment

Procedia PDF Downloads 87
15265 Adsorption of Cd2+ from Aqueous Solutions Using Chitosan Obtained from a Mixture of Littorina littorea and Achatinoidea Shells

Authors: E. D. Paul, O. F. Paul, J. E. Toryila, A. J. Salifu, C. E. Gimba

Abstract:

Adsorption of Cd²⁺ ions from aqueous solution by chitosan, a natural polymer obtained from a mixture of the exoskeletons of Littorina littorea (periwinkle) and Achatinoidea (snail), was studied at varying adsorbent dose, contact time, metal ion concentration, temperature, and pH using the batch adsorption method. The equilibrium adsorption isotherms were determined between 298 K and 345 K. The adsorption data were fitted to the Langmuir, Freundlich, and pseudo-second-order kinetic models. The Langmuir isotherm model best fitted the experimental data, with a maximum monolayer adsorption of 35.1 mg kg⁻¹ at 308 K. The entropy and enthalpy of adsorption were -0.1121 kJ mol⁻¹ K⁻¹ and -11.43 kJ mol⁻¹, respectively. The Freundlich adsorption model gave Kf and n values consistent with good adsorption. The pseudo-second-order reaction model gave a straight-line plot with a rate constant of 1.291 × 10⁻³ kg mg⁻¹ min⁻¹. The qe value was 21.98 mg kg⁻¹, indicating that the adsorption of cadmium ions by the chitosan composite followed the pseudo-second-order kinetic model.
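The pseudo-second-order parameters quoted above (qe, k) are conventionally read off the linearised form t/qt = 1/(k·qe²) + t/qe: plotting t/qt against t gives slope 1/qe and intercept 1/(k·qe²). The sketch below recovers both from synthetic data chosen to mirror the reported magnitudes, not the paper's measurements:

```python
def fit_pseudo_second_order(times, qt):
    """Least-squares fit of the linearised pseudo-second-order model:
    t/qt = 1/(k*qe^2) + t/qe, i.e. a line y = intercept + slope*t."""
    ys = [t / q for t, q in zip(times, qt)]
    n = len(times)
    mean_t = sum(times) / n
    mean_y = sum(ys) / n
    slope = (sum((t - mean_t) * (y - mean_y) for t, y in zip(times, ys))
             / sum((t - mean_t) ** 2 for t in times))
    intercept = mean_y - slope * mean_t
    qe = 1.0 / slope
    k = 1.0 / (intercept * qe ** 2)
    return qe, k

# Synthetic kinetics from qt = qe^2*k*t / (1 + qe*k*t), qe = 22, k = 1.3e-3
qe_true, k_true = 22.0, 1.3e-3
times = [5, 10, 20, 40, 80, 160, 320]
qt = [qe_true ** 2 * k_true * t / (1 + qe_true * k_true * t) for t in times]

qe_fit, k_fit = fit_pseudo_second_order(times, qt)
print(round(qe_fit, 2), round(k_fit, 5))
```

Because the linearisation is exact for data generated from the model, the fit recovers qe and k exactly; with experimental data the same fit would give the reported qe ≈ 21.98 mg kg⁻¹ and k ≈ 1.291 × 10⁻³ kg mg⁻¹ min⁻¹.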

Keywords: adsorption, chitosan, littorina littorea, achatinoidea, natural polymer

Procedia PDF Downloads 403
15264 Developing a Green Strategic Management Model with regarding HSE-MS

Authors: Amin Padash, Gholam Reza Nabi Bid Hendi, Hassan Hoveidi

Abstract:

Purpose: The aim of this research is to develop a model for green management based on the Health, Safety and Environmental Management System (HSE-MS). An HSE-MS can be a powerful tool for organizations both to improve their environmental, health, and safety performance and to enhance their business efficiency through green management. Model: The model developed in this study can be used by industries as a guideline for implementing green management with regard to the HSE-MS. Case Study: The Pars Special Economic/Energy Zone Organization, on behalf of Iran’s Petroleum Ministry and the National Iranian Oil Company (NIOC), manages and develops the South and North oil and gas fields in the region. Methodology: In terms of objective, this research is applied; in terms of implementation, it is descriptive and prescriptive. An MCDM (Multiple Criteria Decision-Making) technique was used to determine the priorities of the factors. Following a process approach, the model consists of the following steps and components: first, the factors involved in green issues are determined and a framework is built around them; then, using the MCDM algorithm TOPSIS, the priorities of the basic variables are determined. The authors believe that the proposed model and the results of this research can help industry managers implement green practices in accordance with the HSE-MS in a more efficient and effective manner. Findings and conclusion: The basic factors involved in green issues and their weights are the main finding; the model and the relations between the factors are further findings of this research. A petrochemical company is considered as the case for promoting ecological industry thinking.
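A minimal TOPSIS implementation makes the prioritisation step concrete: alternatives are scored by closeness to an ideal solution relative to an anti-ideal one. The alternatives, criteria, and weights below are illustrative placeholders, not the factors identified in the study:

```python
import math

def topsis(matrix, weights, benefit):
    """TOPSIS ranking. matrix[i][j] = score of alternative i on criterion j;
    benefit[j] = True if larger values are better on criterion j."""
    m, n = len(matrix), len(matrix[0])
    # Vector-normalise each criterion column, then apply weights.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    # Ideal and anti-ideal solutions per criterion.
    ideal = [(max if benefit[j] else min)(v[i][j] for i in range(m)) for j in range(n)]
    anti = [(min if benefit[j] else max)(v[i][j] for i in range(m)) for j in range(n)]
    scores = []
    for i in range(m):
        d_pos = math.sqrt(sum((v[i][j] - ideal[j]) ** 2 for j in range(n)))
        d_neg = math.sqrt(sum((v[i][j] - anti[j]) ** 2 for j in range(n)))
        scores.append(d_neg / (d_pos + d_neg))  # closeness coefficient in [0, 1]
    return scores

# Hypothetical HSE alternatives scored on:
# [environmental impact (cost), safety score (benefit), health score (benefit)]
alts = [[3.0, 8.0, 7.0], [5.0, 9.0, 6.0], [2.0, 6.0, 8.0]]
scores = topsis(alts, weights=[0.4, 0.35, 0.25], benefit=[False, True, True])
best = max(range(len(scores)), key=scores.__getitem__)
print([round(s, 3) for s in scores], best)
```

The alternative with the highest closeness coefficient ranks first; in the study, this ranking is what orders the green-management factors.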

Keywords: fuzzy AHP method, green management, health, safety and environmental management system, MCDM technique, TOPSIS

Procedia PDF Downloads 411
15263 Service Interactions Coordination Using a Declarative Approach: Focuses on Deontic Rule from Semantics of Business Vocabulary and Rules Models

Authors: Nurulhuda A. Manaf, Nor Najihah Zainal Abidin, Nur Amalina Jamaludin

Abstract:

Coordinating service interactions is a vital part of developing distributed applications that are built as networks of autonomous participants (e.g., software components, web services, online resources) and involve collaboration between a diverse number of participant services on different providers. The complexity of coordinating service interactions reflects how important the right techniques and approaches are for designing and coordinating the interaction between participant services, so that the overall goal of the collaboration is achieved. The objective of this research is to develop the capability of steering a complex service interaction towards a desired outcome. Therefore, an efficient technique for modelling, generating, and verifying the coordination of service interactions is developed. The developed model describes service interactions using the service choreography approach, focusing on a declarative approach and advocating an Object Management Group (OMG) standard, the Semantics of Business Vocabulary and Rules (SBVR). This model, namely the SBVR model for service choreographies, focuses on declarative deontic rules expressing both obligation and prohibition, which are particularly useful for coordinating service interactions. The generated SBVR model is then formulated and transformed into an Alloy model using the Alloy Analyzer for verification. The transformation of SBVR into Alloy makes it possible to automatically generate the corresponding coordination of service interactions (the service choreography), producing an immediate instance of execution that satisfies the constraints of the specification, and to verify whether a specific request can be realised in the generated choreography.
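The deontic-rule idea can be illustrated, far more simply than the paper's SBVR-to-Alloy transformation, as a check of obligation and prohibition rules against a trace of service interactions. The participants, actions, and rules below are hypothetical:

```python
# Deontic rule kinds from SBVR: obligation ("must occur")
# and prohibition ("must not occur").
RULES = [
    ("obligation", "Seller", "send_invoice"),
    ("prohibition", "Buyer", "cancel_after_shipment"),
]

def check_choreography(trace, rules):
    """Return the rules violated by a trace of (participant, action) pairs."""
    violations = []
    for kind, who, action in rules:
        happened = (who, action) in trace
        if kind == "obligation" and not happened:
            violations.append((kind, who, action))
        if kind == "prohibition" and happened:
            violations.append((kind, who, action))
    return violations

ok_trace = [("Buyer", "place_order"), ("Seller", "send_invoice")]
bad_trace = [("Buyer", "place_order"), ("Buyer", "cancel_after_shipment")]

print(check_choreography(ok_trace, RULES))
print(check_choreography(bad_trace, RULES))
```

An Alloy model plays the same role exhaustively: rather than checking one trace, the analyzer searches for any trace that satisfies (or violates) the declarative constraints.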

Keywords: service choreography, service coordination, behavioural modelling, complex interactions, declarative specification, verification, model transformation, semantics of business vocabulary and rules, SBVR

Procedia PDF Downloads 155
15262 Application of Seasonal Autoregressive Integrated Moving Average Model for Forecasting Monthly Flows in Waterval River, South Africa

Authors: Kassahun Birhanu Tadesse, Megersa Olumana Dinka

Abstract:

Reliable future river flow information is basic for the planning and management of any river system. For a data-scarce river system with only river flow records, like the Waterval River, univariate time series models are appropriate for river flow forecasting. In this study, a univariate Seasonal Autoregressive Integrated Moving Average (SARIMA) model was applied to forecast Waterval River flow using the GRETL statistical software. Mean monthly river flows from 1960 to 2016 were used for modeling. Different unit root tests and Mann-Kendall trend analysis were performed to test the stationarity of the observed flow time series. The time series was differenced to remove the seasonality. Using the correlogram of the seasonally differenced time series, different SARIMA models were identified, their parameters were estimated, and diagnostic checking of model forecasts was performed using white noise and heteroscedasticity tests. Finally, based on the minimum Akaike Information Criterion (AIC) and Hannan-Quinn Criterion (HQC), SARIMA (3, 0, 2) x (3, 1, 3)12 was selected as the best model for Waterval River flow forecasting. This model can therefore be used to generate future river flow information for water resources development and management in the Waterval River system. The SARIMA approach can also be used to forecast other similar univariate time series with seasonal characteristics.
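The seasonal differencing step mentioned above (the D = 1 term of the seasonal part of SARIMA, period 12 for monthly data) can be shown directly: subtracting the value from the same month of the previous year removes a period-12 cycle, leaving only the non-seasonal component. Synthetic data for illustration:

```python
import math

def seasonal_difference(series, period=12):
    """First seasonal difference, y'_t = y_t - y_(t-period): the
    transformation applied by SARIMA's seasonal D = 1 term."""
    return [series[t] - series[t - period] for t in range(period, len(series))]

# Synthetic monthly 'flow': level + seasonal cycle + linear trend.
flows = [100 + 30 * math.sin(2 * math.pi * t / 12) + 0.5 * t for t in range(60)]

diffed = seasonal_difference(flows)
print(round(min(diffed), 3), round(max(diffed), 3))
```

The sinusoidal seasonal component cancels exactly, and the linear trend collapses to a constant (0.5 × 12 = 6.0), so the differenced series is flat; on real flows, the ARMA terms of the SARIMA model are then fitted to what remains.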

Keywords: heteroscedasticity, stationarity test, trend analysis, validation, white noise

Procedia PDF Downloads 205
15261 Machine Learning Model to Predict TB Bacteria-Resistant Drugs from TB Isolates

Authors: Rosa Tsegaye Aga, Xuan Jiang, Pavel Vazquez Faci, Siqing Liu, Simon Rayner, Endalkachew Alemu, Markos Abebe

Abstract:

Tuberculosis (TB) is a major cause of disease globally. In most cases, TB is treatable and curable, but only with the proper treatment. Drug-resistant TB occurs when the bacteria become resistant to the drugs used to treat TB. Current strategies to identify drug-resistant TB bacteria are laboratory-based, and it takes a long time to identify the resistant bacteria and treat the patient accordingly. Machine learning (ML) and data science can offer new approaches to this problem. In this study, we propose to develop an ML-based model that predicts the antibiotic resistance phenotypes of TB isolates in minutes, so that the right treatment can be given to the patient immediately. The study uses the whole genome sequences (WGS) of TB isolates, extracted from the NCBI repository, as training data. Samples from different countries were included so that the model generalizes over a large group of TB isolates from different regions of the world; this exposes the model to diverse behaviors of the TB bacteria and makes it robust. Model training considered three types of information extracted from the WGS data: all variants found within the candidate genes (F1), predetermined resistance-associated variants (F2), and resistance-associated gene information for the particular drug. Two major datasets were constructed from this information: F1 and F2 were treated as two independent datasets, and the third type of information was used as the class label for both. Five machine learning algorithms were considered to train the model: Support Vector Machine (SVM), Random Forest (RF), Logistic Regression (LR), Gradient Boosting, and AdaBoost.
The models were trained on the datasets F1, F2, and F1F2, the latter being the merged F1 and F2 dataset. Additionally, an ensemble approach was used: the F1 and F2 datasets were each run through the Gradient Boosting algorithm, and their outputs were combined into a new dataset, the F1F2 ensemble dataset, on which models were trained using the five algorithms. As the experiments show, the ensemble model trained with the Gradient Boosting algorithm outperformed the rest of the models. In conclusion, this study suggests the ensemble approach, i.e., the RF + Gradient Boosting model, for predicting the antibiotic resistance phenotypes of TB isolates, as it outperformed the other models.
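The stacking-style ensemble described above can be sketched with stand-in base learners: the threshold rules below take the place of gradient-boosting models trained on the F1 and F2 feature sets, and a simple vote takes the place of the meta-model. All data and rules are illustrative, not the study's pipeline:

```python
# Toy stacking sketch. Base learners produce predictions from the two
# feature sets; a meta-learner combines those predictions.
def base_f1(sample):
    """Stand-in for a model trained on candidate-gene variants (F1)."""
    return 1 if sample["f1_hits"] >= 2 else 0

def base_f2(sample):
    """Stand-in for a model trained on known resistance variants (F2)."""
    return 1 if sample["f2_hits"] >= 1 else 0

def meta(sample):
    """Stand-in meta-model over the base outputs; 1 = predicted resistant."""
    votes = base_f1(sample) + base_f2(sample)
    return 1 if votes >= 1 else 0

isolates = [
    {"f1_hits": 0, "f2_hits": 0, "resistant": 0},
    {"f1_hits": 3, "f2_hits": 0, "resistant": 1},
    {"f1_hits": 0, "f2_hits": 2, "resistant": 1},
]
preds = [meta(s) for s in isolates]
accuracy = sum(p == s["resistant"] for p, s in zip(preds, isolates)) / len(isolates)
print(preds, accuracy)
```

In the real pipeline, each stand-in would be a fitted gradient-boosting classifier and the meta-model would itself be trained on the base models' held-out predictions.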

Keywords: machine learning, MTB, WGS, drug resistant TB

Procedia PDF Downloads 52
15260 Flood Early Warning and Management System

Authors: Yogesh Kumar Singh, T. S. Murugesh Prabhu, Upasana Dutta, Girishchandra Yendargaye, Rahul Yadav, Rohini Gopinath Kale, Binay Kumar, Manoj Khare

Abstract:

The Indian subcontinent is severely affected by floods that cause intense, irreversible devastation to crops and livelihoods. With the increased incidence of floods and their related catastrophes, an early warning system for flood prediction and an efficient flood management system for the river basins of India are a must. Accurately modeled hydrological conditions and a web-based early warning system may significantly reduce the economic losses incurred due to floods and enable end users to issue advisories with better lead time. This study describes the design and development of an EWS-FP using advanced computational tools and methods, viz. High-Performance Computing (HPC), remote sensing, GIS technologies, and open-source tools, for the Mahanadi River Basin of India. The flood prediction is based on a robust 2D hydrodynamic model, which solves the shallow water equations using the finite volume method. Considering the complexity of hydrological modeling and the size of the basins in India, there is always a trade-off between better forecast lead time and the optimal resolution at which the simulations are run. High-performance computing provides the computational means to overcome this issue for the construction of national-level or basin-level flash flood warning systems with high resolution for local-level warning analysis and better lead time. High-performance computers with capacities on the order of teraflops and petaflops prove useful when running simulations over such large areas at optimal resolutions. In this study, a free and open-source, HPC-based 2-D hydrodynamic model, capable of simulating rainfall run-off, river routing, and tidal forcing, is used. The model was tested for a part of the Mahanadi River Basin (the Mahanadi Delta) with actual and predicted discharge, rainfall, and tide data. The simulation time was reduced from 8 hrs to 3 hrs by increasing the CPU nodes from 45 to 135, which shows good scalability and performance enhancement.
The simulated flood inundation extent and stage were compared with SAR data and CWC observed gauge data, respectively. The system shows good accuracy and a lead time suitable for flood forecasting in near-real-time. To disseminate warnings to end users, a network-enabled solution was developed using open-source software. The system has query-based flood damage assessment modules with outputs in the form of spatial maps and statistical databases. It effectively facilitates the management of post-disaster activities caused by floods, such as displaying spatial maps of the affected area and inundated roads, and maintains a steady flow of information at all levels, with different access rights depending on the criticality of the information. It is designed to help users manage information related to flooding during critical flood seasons and analyze the extent of the damage.
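At the heart of such a 2D hydrodynamic model is a finite-volume update of the shallow water equations. A much-reduced 1D sketch with a Lax-Friedrichs flux (periodic domain, illustrative dam-break-like initial condition) shows the structure, including the exact mass conservation a finite-volume scheme provides:

```python
# 1-D shallow-water finite-volume step with a Lax-Friedrichs flux.
# Cell state: (h, hu) = water depth and momentum; g is gravity.
G = 9.81

def flux(h, hu):
    """Physical flux of the 1-D shallow water equations."""
    u = hu / h
    return hu, hu * u + 0.5 * G * h * h

def lax_friedrichs(hl, hul, hr, hur, dx, dt):
    """Numerical flux at an interface between left and right cell states."""
    fl, fr = flux(hl, hul), flux(hr, hur)
    a = dx / dt
    return (0.5 * (fl[0] + fr[0]) - 0.5 * a * (hr - hl),
            0.5 * (fl[1] + fr[1]) - 0.5 * a * (hur - hul))

def step(hs, hus, dx, dt):
    """One finite-volume update over all cells (periodic boundary)."""
    n = len(hs)
    new_h, new_hu = hs[:], hus[:]
    for i in range(n):
        l, r = (i - 1) % n, (i + 1) % n
        fl = lax_friedrichs(hs[l], hus[l], hs[i], hus[i], dx, dt)
        fr = lax_friedrichs(hs[i], hus[i], hs[r], hus[r], dx, dt)
        new_h[i] = hs[i] - dt / dx * (fr[0] - fl[0])
        new_hu[i] = hus[i] - dt / dx * (fr[1] - fl[1])
    return new_h, new_hu

# Dam-break-like initial condition: a hump of still water.
n, dx, dt = 100, 1.0, 0.01
hs = [2.0 if 40 <= i < 60 else 1.0 for i in range(n)]
hus = [0.0] * n
mass0 = sum(hs) * dx
for _ in range(200):
    hs, hus = step(hs, hus, dx, dt)
print(round(sum(hs) * dx, 6), round(mass0, 6))
```

Because each interface flux leaves one cell and enters its neighbour, total water volume is conserved to machine precision; a production 2D model adds the second dimension, rainfall/tide source terms, and wetting-drying, and it is this per-cell loop that HPC parallelises across nodes.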

Keywords: flood, modeling, HPC, FOSS

Procedia PDF Downloads 89
15259 An Application of the Single Equation Regression Model

Authors: S. K. Ashiquer Rahman

Abstract:

Recently, oil has become more influential in almost every economic sector as a key material. As can be seen from the news, when the oil price changes or OPEC announces a new strategy, the effect spreads to every part of the economy, directly and indirectly. That is why people closely observe the oil price and try to forecast its changes. The most important factor affecting the price is supply, which is determined by the number of wildcats drilled. Therefore, a study of the number of wellheads and other economic variables may give us some understanding of the mechanism governing oil supply. In this paper, we consider the relationship between the number of wellheads and three key factors: the price at the wellhead, domestic output, and GNP in constant dollars. We also add trend variables to the models because the consumption of oil varies over time. Moreover, this paper uses econometric methods to estimate the parameters of the model, applies tests to verify the results, and draws conclusions from the model.
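The single-equation setup described above, regressing wellhead counts on a price variable and a trend, is ordinary least squares. A compact sketch via the normal equations, with synthetic data standing in for the paper's series, is:

```python
def ols(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y.
    X rows include a constant term; returns the coefficient vector."""
    k = len(X[0])
    A = [[sum(row[a] * row[b] for row in X) for b in range(k)] for a in range(k)]
    rhs = [sum(row[a] * yi for row, yi in zip(X, y)) for a in range(k)]
    # Gaussian elimination with partial pivoting.
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            rhs[r] -= f * rhs[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (rhs[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

# Synthetic series: wells = 10 + 2*price + 0.5*trend (exact, so OLS recovers it).
prices = [3, 4, 6, 5, 7, 8, 9, 11, 10, 12]
rows = [(1.0, p, t) for t, p in enumerate(prices)]  # (constant, price, trend)
wells = [10 + 2 * p + 0.5 * t for (_, p, t) in rows]

beta = ols(rows, wells)
print([round(c, 6) for c in beta])
```

With real data, the same fit is followed by the verification tests the abstract mentions (significance tests, residual diagnostics) before conclusions are drawn; additional regressors such as domestic output and GNP simply add columns to X.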

Keywords: price, domestic output, GNP, trend variable, wildcat activity

Procedia PDF Downloads 62