Search results for: COSMO models

5248 Naked Machismo: Uncovered Masculinity in an Israeli Home Design Campaign

Authors: Gilad Padva, Sigal Barak Brandes

Abstract:

This research centers on an unexpected Israeli advertising campaign for Elemento, a local furniture company, which eroticizes male nudity. The campaign comprises a series of printed ads depicting naked male models in effeminate positions, published in Haaretz, a small-circulation yet highly prestigious daily newspaper typically read by urban middle-upper-class left-wing Israelis. Apparently, this campaign embodies an alternative masculinity that challenges the prevalent machismo in Israeli society and advertising. Although some of the ads focus on young men in effeminate positions, they never expose their genitals and anuses, and their bodies are never permeable. The 2010s Elemento male models are seemingly contrasted with conventional representations of manhood in contemporary mainstream advertising. They display a somewhat inactive, passive and self-indulgent masculinity which involves 'conspicuous leisure'. In the process of commodity fetishism, the advertised furniture is emptied of the original meaning of its production, and then filled with new meanings in ways that both mystify the product and turn it into a fetish object. Yet, our research critically reconsiders this sensational campaign as a sophisticated patriarchal parody that does not subvert but rather reconfirms and even fetishizes patriarchal premises; it parodies effeminacy rather than the prevalent (Israeli) machismo. Following Pierre Bourdieu's politics of cultural taste, our research reconsiders and criticizes the male models' domesticated masculinity in a fantasized and cosmopolitan hedonistic habitus. Notwithstanding, we suggest that the Elemento campaign, despite its conformity, does question some Israeli and global axioms about gender roles, corporeal ideologies, idealized bodies, and domesticated phalluses and anuses. Although the naked truth is uncovered by this campaign, it does erect a vibrant discussion of contemporary masculinities and their exploitation in current mass consumption.

Keywords: male body, campaign, advertising, gender studies, men's studies, Israeli culture, masculinity, parody, effeminacy

Procedia PDF Downloads 214
5247 Logical-Probabilistic Modeling of the Reliability of Complex Systems

Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia

Abstract:

The paper presents logical-probabilistic methods, models and algorithms for the reliability assessment of complex systems, based on which a web application for structural analysis and reliability assessment of systems was created. The reliability assessment process included the following stages, which were reflected in the application: 1) Construction of a graphical scheme of the structural reliability of the system; 2) Transformation of the graphical scheme into a logical representation and modeling of the shortest paths of successful functioning of the system; 3) Description of the system operability condition with a logical function in disjunctive normal form (DNF); 4) Transformation of the DNF into orthogonal disjunctive normal form (ODNF) using the orthogonalization algorithm; 5) Replacement of the logical elements with probabilistic elements in the ODNF, yielding a reliability estimation polynomial and a quantitative reliability value; 6) Calculation of the weights of the elements. Using the logical-probabilistic methods, models and algorithms discussed in the paper, special software was created, by means of which a quantitative assessment of the reliability of systems with a complex structure is produced. As a result, structural analysis of systems and the research and design of systems with optimal structure are carried out.
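As an illustration of the pipeline described above, the following minimal Python sketch computes system reliability and element weights for a system defined by its shortest success paths. It is not the authors' web application or their orthogonalization code; it replaces the ODNF step with exact state enumeration (which gives the same result for small systems), and all component probabilities are hypothetical.

```python
from itertools import product

# Shortest paths of successful functioning (minimal path sets),
# given as sets of component indices; hypothetical 4-component system.
MIN_PATHS = [{0, 1}, {2, 3}, {0, 3}]

def system_works(state):
    """System works if every component of at least one minimal path works."""
    return any(all(state[i] for i in path) for path in MIN_PATHS)

def reliability(p):
    """Exact reliability by enumerating all 2^n component states.

    Equivalent to evaluating the orthogonalized (mutually exclusive) form of
    the DNF success function with probabilities substituted for logical variables."""
    r = 0.0
    for state in product([0, 1], repeat=len(p)):
        prob = 1.0
        for i, s in enumerate(state):
            prob *= p[i] if s else (1.0 - p[i])
        if system_works(state):
            r += prob
    return r

def element_weight(p, i):
    """Birnbaum-style importance: dR/dp_i = R(p_i=1) - R(p_i=0)."""
    hi, lo = list(p), list(p)
    hi[i], lo[i] = 1.0, 0.0
    return reliability(hi) - reliability(lo)

p = [0.95, 0.90, 0.92, 0.85]  # hypothetical component reliabilities
print("system reliability:", round(reliability(p), 4))
print("element weights:", [round(element_weight(p, i), 4) for i in range(len(p))])
```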

Keywords: Complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability, weight of element

Procedia PDF Downloads 76
5246 Development of an Information System Based on the Establishment and Evaluation of Performance Rating by Application Part/Type of Remodeling Element Technologies

Authors: Sungwon Jung

Abstract:

The proportion of apartment houses in South Korea that are 20 years old or older is approximately 20% (1.55 million units), and an explosive increase in aged housing is expected around the first planned new towns. Accordingly, we should prepare for social issues such as difficulties in housing leases and the degradation of housing performance. Improving the performance of aged houses is essential for achieving the national energy and carbon reduction goals, and we should develop techniques to respond to the changing construction environment. Furthermore, we should develop a performance evaluation system appropriate to residents' demands, such as the improvement of remodeling floor plans through performance upgrades in line with the residence types of housing-vulnerable groups such as low-income households and elderly people living alone. For this purpose, remodeling techniques and business models optimized for the target complexes must be disseminated through the development of various business models. In addition, it is necessary to improve the remodeling business by improving the laws and systems related to residential performance improvement and to prepare techniques to respond to increasing business demands. In other words, performance improvement, evaluation, and knowledge systems need to be researched as new issues related to remodeling that have not been addressed in existing research.

Keywords: remodelling, performance evaluation, web-based system, big data

Procedia PDF Downloads 227
5245 Conceptualizing the Cyber Insecurity Risk in the Ethics of Automated Warfare

Authors: Otto Kakhidze, Hoda Alkhzaimi, Adam Ramey, Nasir Memon

Abstract:

This paper provides an alternative, cybersecurity-based conceptual framework for the ethics of automated warfare. The large body of work produced on fully or partially autonomous warfare systems tends to overlook malicious security factors, such as the possibility of technical attacks on these systems, when it comes to moral and legal decision-making. The argument provides a risk-oriented justification for why technical malicious risks cannot be dismissed in legal, ethical and policy considerations when warfare models are being implemented and deployed. The assumptions of the paper are supported by a broader model that incorporates the perspective of technological vulnerabilities through the lenses of game theory, just war theory, and standard and non-standard defense ethics. The paper argues that a conventional risk-benefit analysis that does not consider ethical factors is insufficient for making legal and policy decisions on automated warfare. This approach will provide the substructure for security and defense experts as well as legal scholars, ethicists and decision theorists to work towards common justificatory grounds that will accommodate the technical security concerns that have been overlooked in the current legal and policy models.

Keywords: automated warfare, ethics of automation, inherent hijacking, security vulnerabilities, risk, uncertainty

Procedia PDF Downloads 357
5244 MhAGCN: Multi-Head Attention Graph Convolutional Network for Web Services Classification

Authors: Bing Li, Zhi Li, Yilong Yang

Abstract:

Web service classification can improve the quality of service discovery and management in the service repository and is widely used to locate the services developers need. Although traditional classification methods based on supervised learning models can perform classification tasks, developers need to manually tag web services, and the quality of these tags may not be sufficient to establish an accurate classifier for service classification. With the number of web services doubling, manual tagging has become unrealistic. In recent years, the attention mechanism has made remarkable progress in the field of deep learning, and its huge potential has been demonstrated in various fields. This paper designs a multi-head attention graph convolutional network (MHAGCN) service classification method, which can assign different weights to the neighborhood nodes without complicated matrix operations or relying on an understanding of the entire graph structure. The framework combines the advantages of the attention mechanism and the graph convolutional neural network, and it can classify web services through automatic feature extraction. Comprehensive experimental results on a real dataset not only show the superior performance of the proposed model over existing models but also demonstrate its potentially good interpretability for graph analysis.
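The core mechanism described above — attention-weighted aggregation over graph neighborhoods with several heads — can be sketched in a few lines of NumPy. This sketch is illustrative only and does not reproduce the authors' MHAGCN architecture, training procedure, or dataset; all dimensions, weights, and the tanh scoring function are hypothetical choices.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def multi_head_attention_layer(X, adj, W_list, a_list):
    """One multi-head graph attention layer (GAT-style sketch).

    X      : (n_nodes, d_in) node features
    adj    : (n_nodes, n_nodes) binary adjacency (with self-loops)
    W_list : per-head projection matrices, each (d_in, d_out)
    a_list : per-head attention vectors, each (2 * d_out,)
    Returns concatenated head outputs, shape (n_nodes, n_heads * d_out)."""
    heads = []
    for W, a in zip(W_list, a_list):
        H = X @ W                                   # project node features
        out = np.zeros_like(H)
        for i in range(X.shape[0]):
            nbrs = np.nonzero(adj[i])[0]
            # attention score for each neighbor j of node i
            scores = np.array([np.tanh(np.concatenate([H[i], H[j]]) @ a) for j in nbrs])
            alpha = softmax(scores)                 # normalized neighbor weights
            out[i] = (alpha[:, None] * H[nbrs]).sum(axis=0)
        heads.append(out)
    return np.concatenate(heads, axis=1)

rng = np.random.default_rng(0)
n, d_in, d_out, n_heads = 5, 8, 4, 3                # hypothetical sizes
X = rng.normal(size=(n, d_in))
adj = np.eye(n) + np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
W_list = [rng.normal(size=(d_in, d_out)) for _ in range(n_heads)]
a_list = [rng.normal(size=2 * d_out) for _ in range(n_heads)]
print(multi_head_attention_layer(X, adj, W_list, a_list).shape)  # (5, 12)
```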

Keywords: attention mechanism, graph convolutional network, interpretability, service classification, service discovery

Procedia PDF Downloads 138
5243 Statistical Analysis with Prediction Models of User Satisfaction in Software Project Factors

Authors: Katawut Kaewbanjong

Abstract:

We analyzed a large volume of data and identified the software project factors significantly associated with user satisfaction. A statistical significance analysis (logistic regression) and a collinearity analysis determined the significant factors from a group of 71 pre-defined factors across 191 software projects in ISBSG Release 12. The eight models used to test the predictive potential of these factors were neural network, k-NN, naïve Bayes, random forest, decision tree, gradient boosted tree, linear regression, and logistic regression. Fifteen pre-defined factors were truly significant in predicting user satisfaction, and they provided 82.71% prediction accuracy when used with a neural network prediction model. These factors were client-server, personnel changes, total defects delivered, project inactive time, industry sector, application type, development type, how methodology was acquired, development techniques, decision making process, intended market, size estimate approach, size estimate method, cost recording method, and effort estimate method. These findings may benefit software development managers considerably.

Keywords: prediction model, statistical analysis, software project, user satisfaction factor

Procedia PDF Downloads 126
5242 Evaluating India's Smart Cities against the Sustainable Development Goals

Authors: Suneet Jagdev

Abstract:

Seventeen Sustainable Development Goals were adopted by world leaders in September 2015 at the United Nations Sustainable Development Summit. These goals were adopted by UN member states to promote prosperity, health and human rights while protecting the planet. Around the same time, the Government of India launched the Smart City Initiative to speed up the development of state-of-the-art infrastructure and services in 100 cities, with a focus on sustainable and inclusive development. These cities are meant to become role models for other cities in India and promote sustainable regional development. This paper examines the goals set under the Smart City Initiative and evaluates them against the Sustainable Development Goals, using case studies of selected Smart Cities in India. The study concludes that most Smart City projects at present consist of individual solutions to individual problems identified in a community rather than comprehensive models for complex issues in cities across India. A systematic, logical and comparative analysis of relevant literature and data has been carried out, drawing on government sources, government papers, research papers by experts on the topic, and results from online surveys. Case studies have been used for a graphical analysis highlighting the issues of migration, ecology, economy and social equity in these Smart Cities.

Keywords: housing, migration, smart cities, sustainable development goals, urban infrastructure

Procedia PDF Downloads 412
5241 Development of Time Series Forecasting Model for Dengue Cases in Nakhon Si Thammarat, Southern Thailand

Authors: Manit Pollar

Abstract:

Identifying dengue epidemic periods early would be helpful in taking the actions necessary to prevent dengue outbreaks. Providing an accurate prediction of dengue epidemic seasons will allow local authorities sufficient time to take the necessary decisions and actions to safeguard the situation. This study aimed to develop a forecasting model for the number of dengue incidences in Nakhon Si Thammarat Province, Southern Thailand, using time series analysis. We developed Seasonal Autoregressive Integrated Moving Average (SARIMA) models on the monthly data collected between 2003-2011 and validated the models using data collected between January-September 2012. The results revealed that the SARIMA(1,1,0)(1,2,1)₁₂ model closely described the trends and seasonality of dengue incidence in Nakhon Si Thammarat for the years 2003-2011. The study showed that the one-step approach to predicting dengue incidences provided significantly more accurate predictions than the twelve-step approach. The model, even if based purely on statistical data analysis, can provide a useful basis for the allocation of resources for disease prevention.
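A short sketch of fitting the reported SARIMA specification with statsmodels is given below. The monthly series here is synthetic and stands in for the 2003-2011 surveillance data (which is not available in the abstract), and for simplicity a multi-step out-of-sample forecast is shown rather than the paper's one-step versus twelve-step comparison.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic monthly case counts standing in for the 2003-2011 surveillance data.
rng = np.random.default_rng(42)
idx = pd.date_range("2003-01", periods=108, freq="MS")
seasonal = 50 + 30 * np.sin(2 * np.pi * idx.month / 12)
cases = pd.Series(seasonal + rng.poisson(10, size=len(idx)), index=idx)

train, test = cases[:-9], cases[-9:]        # hold out Jan-Sep of the final year

# SARIMA(1,1,0)(1,2,1)12 as reported in the abstract
model = SARIMAX(train, order=(1, 1, 0), seasonal_order=(1, 2, 1, 12),
                enforce_stationarity=False, enforce_invertibility=False)
fit = model.fit(disp=False)

forecast = fit.get_forecast(steps=len(test)).predicted_mean
mae = (forecast - test).abs().mean()
print("AIC:", round(fit.aic, 1))
print("hold-out MAE:", round(mae, 2))
```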

Keywords: SARIMA, time series model, dengue cases, Thailand

Procedia PDF Downloads 360
5240 Defective Autophagy Disturbs Neural Migration and Network Activity in hiPSC-Derived Cockayne Syndrome B Disease Models

Authors: Julia Kapr, Andrea Rossi, Haribaskar Ramachandran, Marius Pollet, Ilka Egger, Selina Dangeleit, Katharina Koch, Jean Krutmann, Ellen Fritsche

Abstract:

It is widely acknowledged that animal models do not always represent human disease. Human brain development in particular is difficult to model in animals due to a variety of structural and functional species-specificities. This causes significant discrepancies between predicted and apparent drug efficacies in clinical trials and their subsequent failure. Emerging alternatives based on 3D in vitro approaches, such as human brain spheres or organoids, may in the future reduce and ultimately replace animal models. Here, we present a human induced pluripotent stem cell (hiPSC)-based 3D neural in vitro disease model for Cockayne Syndrome B (CSB). CSB is a rare hereditary disease accompanied by severe neurologic defects, such as microcephaly, ataxia and intellectual disability, with currently no treatment options. Therefore, the aim of this study is to investigate the molecular and cellular defects found in neural hiPSC-derived CSB models, since understanding the underlying pathology of CSB enables the development of treatment options. The two CSB models used in this study comprise a patient-derived hiPSC line and its isogenic control as well as a CSB-deficient cell line based on a healthy hiPSC line (IMR90-4) background, thereby excluding genetic background-related effects. Neurally induced and differentiated brain sphere cultures were characterized via RNA sequencing, western blot (WB), immunocytochemistry (ICC) and multielectrode arrays (MEAs). CSB deficiency leads to altered gene expression of markers for autophagy, focal adhesion and neural network formation. Cell migration was significantly reduced and electrical activity was significantly increased in the disease cell lines. These data hint at the cellular pathologies possibly underlying CSB. By induction of autophagy, the migration phenotype could be partially rescued, suggesting a crucial role of disturbed autophagy in the defective neural migration of the disease lines. Altered autophagy may also lead to inefficient mitophagy. Accordingly, the disease cell lines were shown to have a lower mitochondrial base activity and a higher susceptibility to mitochondrial stress induced by rotenone. Since mitochondria play an important role in neurotransmitter cycling, we suggest that defective mitochondria may lead to the altered electrical activity in the disease cell lines. Failure to clear the defective mitochondria by mitophagy, and thus missing initiation cues for new mitochondrial production, could potentiate this problem. With our data, we aim at establishing a disease adverse outcome pathway (AOP), thereby adding to the in-depth understanding of this multi-faceted disorder and subsequently contributing to alternative drug development.

Keywords: autophagy, disease modeling, in vitro, pluripotent stem cells

Procedia PDF Downloads 121
5239 An Empirical Evaluation of Performance of Machine Learning Techniques on Imbalanced Software Quality Data

Authors: Ruchika Malhotra, Megha Khanna

Abstract:

The development of change prediction models can help software practitioners in planning testing and inspection resources at early phases of software development. However, a major challenge faced during the training process of any classification model is the imbalanced nature of software quality data. Data with very few instances of the minority outcome categories lead to an inefficient learning process, and a classification model developed from imbalanced data generally does not predict these minority categories correctly. Thus, for a given dataset, a minority of classes may be change-prone whereas the majority of classes may be non-change-prone. This study explores various alternatives for adeptly handling imbalanced software quality data using different sampling methods and effective MetaCost learners. The study also analyzes and justifies the use of different performance metrics when dealing with imbalanced data. In order to empirically validate the different alternatives, the study uses change data from three application packages of an open-source Android dataset and evaluates the performance of six different machine learning techniques. The results of the study indicate extensive improvement in the performance of the classification models when using resampling methods and robust performance measures.
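A minimal sketch of the two ideas highlighted above — a resampling method plus robust performance measures — is shown below. It uses a synthetic imbalanced dataset and simple random oversampling rather than the Android change data or the specific sampling methods and MetaCost learners evaluated in the paper; the class balance, model, and metrics are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score, recall_score
from sklearn.model_selection import train_test_split
from sklearn.utils import resample

# Synthetic imbalanced "change-proneness" data (5% minority class).
X, y = make_classification(n_samples=2000, n_features=10, weights=[0.95, 0.05],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

def evaluate(clf, X_fit, y_fit):
    clf.fit(X_fit, y_fit)
    proba = clf.predict_proba(X_te)[:, 1]
    pred = clf.predict(X_te)
    # Robust measures recommended for imbalanced data: AUC and minority recall.
    return roc_auc_score(y_te, proba), recall_score(y_te, pred)

# 1) No treatment of the imbalance.
base = evaluate(RandomForestClassifier(random_state=0), X_tr, y_tr)

# 2) Random oversampling of the minority class (a simple resampling method).
minority = np.where(y_tr == 1)[0]
extra = resample(minority, replace=True,
                 n_samples=(y_tr == 0).sum() - len(minority), random_state=0)
idx = np.concatenate([np.arange(len(y_tr)), extra])
over = evaluate(RandomForestClassifier(random_state=0), X_tr[idx], y_tr[idx])

print("baseline   AUC=%.3f recall=%.3f" % base)
print("oversample AUC=%.3f recall=%.3f" % over)
```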

Keywords: change proneness, empirical validation, imbalanced learning, machine learning techniques, object-oriented metrics

Procedia PDF Downloads 420
5238 An Enhanced Hybrid Backoff Technique for Minimizing the Occurrence of Collision in Mobile Ad Hoc Networks

Authors: N. Sabiyath Fatima, R. K. Shanmugasundaram

Abstract:

In Mobile Ad-hoc Networks (MANETs), every node acts as both transmitter and receiver. The existing backoff models do not accurately forecast the performance of the wireless network and also suffer from elevated packet collisions. Every time a collision happens, the station's contention window (CW) is doubled until it reaches the maximum value. The main objective of this paper is to diminish collisions by means of a contention window Multiplicative Increase Decrease Backoff (CWMIDB) scheme. The intention of increasing the CW is to reduce the collision probability by spreading the traffic over a larger period of time. Within wireless ad hoc networks, the CWMIDB algorithm dynamically controls the contention window of the nodes experiencing collisions. During packet communication, the backoff counter is uniformly selected from the range [0, CW-1]. Here, CW denotes the contention window, and its value depends on the number of unsuccessful transmissions that have occurred for the packet. On the initial transmission attempt, CW is set to its minimum value (Cmin); if the transmission attempt fails, the value is doubled, and it is set back to the minimum value on a successful transmission. CWMIDB is simulated in the NS2 environment and its performance is compared with the Binary Exponential Backoff algorithm. The simulation results show an improvement in transmission probability compared to that of the existing backoff algorithm.
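The multiplicative increase/decrease logic can be sketched in a few lines of Python. This is not the paper's NS2 implementation; the window bounds, the decrease factor (the paper resets CW to its minimum on success, whereas a configurable multiplicative decrease is shown here), and the collision probability are all hypothetical.

```python
import random

CW_MIN, CW_MAX = 16, 1024      # hypothetical contention-window bounds
INCREASE, DECREASE = 2.0, 0.5  # multiplicative factors on collision / success

def backoff_slots(cw):
    """Backoff counter uniformly chosen from [0, CW-1]."""
    return random.randint(0, int(cw) - 1)

def transmit(cw, p_collision):
    """Attempt one transmission; return (success, updated CW)."""
    success = random.random() > p_collision
    if success:
        cw = max(CW_MIN, cw * DECREASE)   # shrink the window after success
    else:
        cw = min(CW_MAX, cw * INCREASE)   # grow the window after a collision
    return success, cw

def simulate(n_packets=10000, p_collision=0.3):
    cw, successes, total_wait = CW_MIN, 0, 0
    for _ in range(n_packets):
        total_wait += backoff_slots(cw)
        ok, cw = transmit(cw, p_collision)
        successes += ok
    return successes / n_packets, total_wait / n_packets

rate, wait = simulate()
print(f"success probability {rate:.3f}, mean backoff {wait:.1f} slots")
```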

Keywords: backoff, contention window, CWMIDB, MANET

Procedia PDF Downloads 280
5237 Improving the Management Systems of the Ownership Risks in Conditions of Transformation of the Russian Economy

Authors: Mikhail V. Khachaturyan

Abstract:

The article analyzes the problems of improving the systems for managing ownership risks in the conditions of the transformation of the Russian economy. Among the main sources of threats that business owners should highlight are the inefficient implementation of business models and the interaction with hired managers. In this context, it is particularly important to analyze the relationship between business models and ownership risks. The analysis of this problem is relevant for a number of reasons: firstly, the increased risk appetite of the owner directly affects the business model and the composition of his holdings; secondly, owners with significant stakes in the company shape particular types of ownership risks, and these relations have a significant influence on a firm's competitiveness and ultimately determine its survival; and thirdly, an inefficient system of ownership risk management is one of the main causes of mass bankruptcies, which significantly affects the stable operation of the economy as a whole. The separation of the processes of possession, disposal and use in modern organizations is the cause not only of problems in the interaction between the owner and managers in managing the organization as a whole, but also of asymmetric information about the kinds and forms of the main risks. Managers tend to avoid risky projects and inhibit the diversification of the organization's assets, while owners may insist on the development of such projects, with the aim not only of creating new value for themselves and consumers, but also of increasing the value of the company as a result of increasing capital. In terms of the separation of ownership and management, the evaluation of projects by their risk-yield ratio requires preserving the owner's influence on the process of developing and making management decisions. It is obvious that without a clearly structured system for the owner's participation in managing the risks of their business, further development is hopeless. In the current conditions of forming a risk management system, owners are compelled to compromise between the desire to increase the organization's ability to produce new value, and consequently increase its worth through the implementation of risky projects, and the need to tolerate the cost of the lost opportunities of risk diversification. Improving the effectiveness of ownership risk management may also help creditors to enforce claims against inefficient owners, which will ultimately contribute to models of ownership control that exclude insolvency scenarios. It is obvious that, in modern conditions, the success of a model of ownership risk management and audit is largely determined by the ability and willingness of the owner to find a compromise between the potential opportunities for expanding the firm's ability to create new value through risk and maintaining the current level of new value creation at an acceptable level of risk through the use of diversification models.

Keywords: improving, ownership risks, problem, Russia

Procedia PDF Downloads 352
5236 Estimation of Probabilistic Fatigue Crack Propagation Models of AZ31 Magnesium Alloys under Various Load Ratio Conditions by Using the Interpolation of a Random Variable

Authors: Seon Soon Choi

Abstract:

The essential purpose of this study is to present a fatigue crack propagation model that well describes the stochastic fatigue crack growth behavior in a rolled magnesium alloy, AZ31, under various load ratio conditions. Fatigue crack propagation experiments were carried out in laboratory air under four load ratio (R) conditions using AZ31 to investigate the crack growth behavior. The stochastic fatigue crack growth behavior was analyzed using the interpolation of a random variable, Z, introduced into an empirical fatigue crack propagation model. The empirical fatigue models used in this study are the Paris-Erdogan model, Walker model, Forman model, and modified Forman model. It was found that the random variable is useful in describing the stochastic fatigue crack growth behavior under various load ratio conditions. A good probabilistic model describing the stochastic fatigue crack growth behavior under various load ratio conditions is also proposed.
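The role of the random variable Z can be illustrated with a short Python sketch: a multiplicative lognormal factor is introduced into a Paris-Erdogan growth law and the resulting scatter in fatigue life is propagated by numerical integration. The constants, geometry factor, and Z distribution below are hypothetical and are not the fitted AZ31 values from the paper.

```python
import numpy as np

# Hypothetical Paris-Erdogan constants (not the fitted AZ31 values).
C, m = 1.0e-11, 3.0            # da/dN = Z * C * (dK)^m, dK in MPa*sqrt(m), a in m
SIGMA_MAX, R = 120.0, 0.1      # maximum stress and load ratio
Y = 1.12                       # assumed geometry factor
a0, a_crit = 1e-3, 20e-3       # initial and critical crack lengths (m)

def delta_k(a):
    """Stress intensity range for the assumed geometry."""
    return Y * SIGMA_MAX * (1.0 - R) * np.sqrt(np.pi * a)

def cycles_to_failure(z, da=1e-5):
    """Integrate da/dN with a multiplicative random factor z."""
    a, n = a0, 0.0
    while a < a_crit:
        n += da / (z * C * delta_k(a) ** m)
        a += da
    return n

rng = np.random.default_rng(1)
# Lognormal random variable Z capturing scatter in the crack-growth rate.
Z = rng.lognormal(mean=0.0, sigma=0.2, size=200)
lives = np.array([cycles_to_failure(z) for z in Z])
print("median life %.3e cycles, 5%%-95%% range [%.3e, %.3e]"
      % (np.median(lives), *np.percentile(lives, [5, 95])))
```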

Keywords: magnesium alloys, fatigue crack propagation model, load ratio, interpolation of random variable

Procedia PDF Downloads 413
5235 A Study on Improvement of Performance of Anti-Splash Device for Cargo Oil Tank Vent Pipe Using CFD Simulation and Artificial Neural Network

Authors: Min-Woo Kim, Ok-Kyun Na, Jun-Ho Byun, Jong-Hwan Park, Seung-Hwa Yang, Joon-Hong Park, Young-Chul Park

Abstract:

This study focuses on a comparative analysis and improvement of the flow characteristics of the Anti-Splash Device located under the P/V valve, along with new concept design models, using CFD analysis and an artificial neural network. The P/V valve, located on the upper deck to relieve the rising pressure and vacuum conditions in the inner tanks of liquid cargo ships, has been involved in oil outflow accidents caused by transverse and longitudinal sloshing forces. The Anti-Splash Device is fitted to prevent this problem in the shipbuilding industry, but oil outflow accidents are still reported by ship owners. Thus, four types of new design models are presented in this study, and a comparative analysis is conducted between the new models and the existing model. The key criterion for this problem is the flux at the outlet of the Anti-Splash Device. Therefore, the flow and velocity are determined by transient analysis, and the optimum model and design parameters for development are then decided. Subsequently, the Anti-Splash Device needs to be developed and verified through flow tests using experimental equipment in order to obtain certification.

Keywords: anti-splash device, P/V valve, sloshing, artificial neural network

Procedia PDF Downloads 592
5234 Replacement of the Distorted Dentition of the Cone Beam Computed Tomography Scan Models for Orthognathic Surgery Planning

Authors: T. Almutairi, K. Naudi, N. Nairn, X. Ju, B. Eng, J. Whitters, A. Ayoub

Abstract:

Purpose: At present, Cone Beam Computed Tomography (CBCT) imaging does not record dental morphology accurately due to the scattering produced by metallic restorations and the reported magnification. The aim of this pilot study is the development and validation of a new method for the replacement of the distorted dentition of CBCT scans with the dental image captured by a digital intraoral scanner. Materials and Method: Six dried skulls with orthodontic brackets on the teeth were used in this study. Three intra-oral markers made of dental stone were constructed and attached to the orthodontic brackets. The skulls were CBCT scanned, and the occlusal surfaces were captured using the TRIOS® 3D intraoral scanner. Marker-based and surface-based registrations were performed to fuse the digital intra-oral scan (IOS) into the CBCT models. This produced a new composite digital model of the skull and dentition. The skulls were scanned again using the commercially accurate Faro® laser arm to produce the 'gold standard' model for the assessment of the accuracy of the developed method. The accuracy of the method was assessed by measuring the distance between the occlusal surfaces of the new composite model and the 'gold standard' 3D model of the skull and teeth. The procedure was repeated one week apart to measure the reproducibility of the method. Results: The results showed no statistically significant difference between the measurements on the first and second occasions. The absolute mean distance between the new composite model and the laser model ranged between 0.11 mm and 0.20 mm. Conclusion: The dentition of the CBCT scan can be accurately replaced with the dental image captured by the intra-oral scanner to create a composite model. This method will improve the accuracy of orthognathic surgical prediction planning, with the final goal of fabricating a physical occlusal wafer to guide orthognathic surgery without the need for dental impressions.
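The marker-based registration step can be illustrated with the standard Kabsch/Procrustes solution for a least-squares rigid transform between corresponding marker points. The sketch below uses hypothetical marker coordinates, not the study's scan data, and says nothing about the surface-based refinement the authors also performed.

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (rotation R, translation t) mapping src -> dst.

    src, dst: (n, 3) arrays of corresponding marker points (Kabsch algorithm)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)            # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Hypothetical coordinates of the three dental-stone markers in each scan (mm).
ios_markers = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 8.0, 2.0]])
angle = np.deg2rad(12.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
cbct_markers = ios_markers @ R_true.T + np.array([3.0, -1.5, 4.0])

R, t = rigid_register(ios_markers, cbct_markers)
fused = ios_markers @ R.T + t                      # IOS dentition mapped into CBCT space
print("max residual (mm):", np.abs(fused - cbct_markers).max().round(6))
```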

Keywords: orthognathic surgery, superimposition, models, cone beam computed tomography

Procedia PDF Downloads 200
5233 Application of Gamma Frailty Model in Survival of Liver Cirrhosis Patients

Authors: Elnaz Saeedi, Jamileh Abolaghasemi, Mohsen Nasiri Tousi, Saeedeh Khosravi

Abstract:

Goals and Objectives: A typical analysis of survival data involves the modeling of time-to-event data, such as the time until death. A frailty model is a random effect model for time-to-event data, where the random effect has a multiplicative influence on the baseline hazard function. This article aims to investigate the use of a gamma frailty model with concomitant variables in order to identify the prognostic factors that influence the survival times of liver cirrhosis patients. Methods: Data were taken from the recorded information of patients with liver cirrhosis who were scheduled for liver transplantation during the one-year study period (May 2008-May 2009) and who were followed up for at least seven years in Imam Khomeini Hospital in Iran. In order to determine the factors affecting cirrhotic patients' survival in the presence of latent variables, the gamma frailty distribution was applied. Parametric models, namely the exponential and Weibull distributions, were considered for the survival time. Data analysis was performed using R software, and a significance level of 0.05 was used for all tests. Results: 305 patients with liver cirrhosis, including 180 (59%) men and 125 (41%) women, were studied. The average age of the patients was 39.8 years. At the end of the study, 82 (26%) patients had died, among them 48 (58%) men and 34 (42%) women. The main cause of liver cirrhosis was hepatitis B (23%), with cryptogenic cirrhosis (22.6%) identified as the second factor. Overall, mean survival over the seven-year follow-up was 28.44 months; for deceased and censored patients it was 19.33 and 31.79 months, respectively. Exponential and Weibull survival models incorporating the gamma frailty distribution were fitted to the cirrhosis data. In both models, factors including age, serum bilirubin, serum albumin, and encephalopathy had a significant effect on the survival time of cirrhotic patients. Conclusion: To investigate the factors affecting the time to death of patients with liver cirrhosis in the presence of latent variables, a gamma frailty model with parametric distributions appears desirable.
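The structure of such a model — a Weibull baseline hazard multiplied by a gamma-distributed frailty and a covariate term — can be made concrete with a small simulation. The parameters, covariates, and censoring scheme below are hypothetical illustrations, not the fitted estimates from the cirrhosis data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical parameters (not the fitted cirrhosis estimates).
shape, scale = 1.4, 60.0        # Weibull baseline cumulative hazard H0(t) = (t/scale)^shape
beta_age, beta_bili = 0.03, 0.25
theta = 0.5                     # frailty variance; frailty ~ Gamma(1/theta, theta), mean 1

n = 305
age = rng.normal(40, 10, n)
bilirubin = rng.lognormal(0.5, 0.4, n)
frailty = rng.gamma(1.0 / theta, theta, n)

# Conditional hazard h(t|z,x) = z * exp(x'b) * h0(t); simulate event times by
# inverting the cumulative hazard: T = scale * (-log(U) / (z * exp(x'b)))^(1/shape).
lin_pred = np.exp(beta_age * (age - 40) + beta_bili * np.log(bilirubin))
u = rng.uniform(size=n)
t_event = scale * (-np.log(u) / (frailty * lin_pred)) ** (1.0 / shape)

censor = rng.uniform(12, 84, n)            # administrative censoring times (months)
time = np.minimum(t_event, censor)
status = (t_event <= censor).astype(int)   # 1 = death observed, 0 = censored
print("events:", status.sum(), "median follow-up (months):", round(np.median(time), 1))
```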

Keywords: frailty model, latent variables, liver cirrhosis, parametric distribution

Procedia PDF Downloads 263
5232 Stability Assessment of Chamshir Dam Based on DEM, South West Zagros

Authors: Rezvan Khavari

Abstract:

The Zagros fold-thrust belt in SW Iran is part of the Alpine-Himalayan system and consists of a variety of structures with different sizes and geometries. The study area is the Chamshir Dam, which is located on the Zohreh River, 20 km southeast of Gachsaran City (southwest Iran). Satellite images are a valuable means available to geologists for locating geological or geomorphological features expressing regional fault or fracture systems; therefore, satellite images were used for the structural analysis of the Chamshir dam area. In addition, 3D models of the area were constructed using the DEM and geological maps. Based on these models, all the acquired fracture trace data were integrated in a Geographic Information System (GIS) environment using ArcGIS software. Based on field investigation and the DEM model, the main structures in the area consist of the Cham Shir syncline and two fault sets: the main thrust faults with a NW-SE direction and small normal faults with a NE-SW direction. There are three joint sets in the study area, two of which (J1 and J3) form the main large fractures around the Chamshir dam. These fractures coincide with the normal faults in the NE-SW direction. The third joint set, in the NW-SE direction, is normal to the others. In general, according to the topographic, geomorphological and structural geological evidence, the Chamshir dam site has a potential for sliding in some parts of the Gachsaran Formation.

Keywords: DEM, Chamshir Dam, Zohreh River, satellite images

Procedia PDF Downloads 483
5231 Numerical Pricing of Financial Options under Irrational Exercise Times and Regime-Switching Models

Authors: Mohammad Saber Rohi, Saghar Heidari

Abstract:

In this paper, we study the pricing problem of American options under a regime-switching model with the possibility of a non-optimal exercise policy (an early or late exercise time), which is called an irrational strategy. For this, we consider a Markov-modulated model for the dynamics of the underlying asset, as an alternative to the classical Black-Scholes-Merton (BSM) model, and an intensity-based model for the irrational strategy, in order to provide more realistic American option prices under irrational behavior in real financial markets. Applying a partial differential equation (PDE) approach, the pricing problem of American options under regime-switching models can be formulated as a system of coupled PDEs. To solve the resulting system of PDEs in this model, we apply a finite element method as the numerical procedure for the resulting variational inequality. Under some appropriate assumptions, we establish the stability of the method and compare its accuracy to some recent works to illustrate the suitability of the proposed model and the accuracy of the applied numerical method for the pricing problem of American options under a regime-switching model with irrational behavior.

Keywords: irrational exercise strategy, rationality parameter, regime-switching model, American option, finite element method, variational inequality

Procedia PDF Downloads 74
5230 A Machine Learning Approach for Anomaly Detection in Environmental IoT-Driven Wastewater Purification Systems

Authors: Giovanni Cicceri, Roberta Maisano, Nathalie Morey, Salvatore Distefano

Abstract:

The main goal of this paper is to present a solution for a water purification system based on an Environmental Internet of Things (EIoT) platform to monitor and control water quality, together with machine learning (ML) models to support decision making and speed up the water purification processes. A real case study has been implemented by deploying an EIoT platform and a network of devices, called Gramb meters and belonging to the Gramb project, on wastewater purification systems located in Calabria, southern Italy. The data thus collected are used to monitor the wastewater quality, detect anomalies and predict the behaviour of the purification system. To this end, three different statistical and machine learning models have been adopted and compared: Autoregressive Integrated Moving Average (ARIMA), Long Short-Term Memory (LSTM) autoencoder, and Facebook Prophet (FP). The results demonstrated that the ML solution (the LSTM autoencoder) outperforms the classical statistical approaches (ARIMA, FP) in terms of accuracy, efficiency and effectiveness in monitoring and controlling the wastewater purification processes.

Keywords: environmental internet of things, EIoT, machine learning, anomaly detection, environment monitoring

Procedia PDF Downloads 154
5229 Quantification of the Erosion Effect on Small Caliber Guns: Experimental and Numerical Analysis

Authors: Dhouibi Mohamed, Stirbu Bogdan, Chabotier André, Pirlot Marc

Abstract:

The effects of erosion and wear on the performance of small caliber guns have been analyzed through numerical and experimental studies, although mainly qualitative observations have been performed, and correlations between the volume change of the chamber and the maximum pressure are limited. This paper focuses on the development of a numerical model to predict the evolution of the maximum pressure as the interior shape of the chamber changes during the different phases of the weapon's life. To fulfill this goal, an experimental campaign, followed by a numerical simulation study, is carried out. Two test barrels, «5.56x45mm NATO» and «7.62x51mm NATO», are considered. First, a Coordinate Measuring Machine (CMM) with a contact scanning probe is used to measure the interior profile of the barrels after each 300-shot cycle until they are worn out. Simultaneously, the EPVAT (Electronic Pressure Velocity and Action Time) method with a dedicated WEIBEL radar is used to measure (i) the chamber pressure, (ii) the action time, and (iii) the bullet velocity in each barrel. Second, a numerical simulation study is carried out: a coupled interior ballistic model is developed using the dynamic finite element program LS-DYNA. In this work, two different models are elaborated: (i) a coupled Eulerian-Lagrangian model using fluid-structure interaction (FSI) techniques, and (ii) a coupled thermo-mechanical finite element model using a lumped parameter model (LPM) as a subroutine. These numerical models are validated and checked against three experimental results, namely (i) the muzzle velocity, (ii) the chamber pressure, and (iii) the surface morphology of fired projectiles. The results show good agreement between experiments and numerical simulations. Next, a comparison between the two models is conducted; the projectile motions, the dynamic engraving resistances and the maximum pressures are compared and analyzed. Finally, using the database thus obtained, a statistical correlation between the muzzle velocity, the maximum pressure and the chamber volume is established.

Keywords: engraving process, finite element analysis, gun barrel erosion, interior ballistics, statistical correlation

Procedia PDF Downloads 219
5228 Novel Adaptive Radial Basis Function Neural Networks Based Approach for Short-Term Load Forecasting of Jordanian Power Grid

Authors: Eyad Almaita

Abstract:

In this paper, a novel adaptive Radial Basis Function Neural Network (RBFNN) algorithm is used to forecast the hour-by-hour electrical load demand in Jordan. A small and effective RBFNN model is used to forecast the hourly total load demand based on a small number of features: the load in the previous day, the load on the same day in the previous week, the temperature in the same hour, the hour number, the day number, and the day type. The proposed adaptive RBFNN model can enhance the reliability of the conventional RBFNN after embedding the network in the system. This is achieved by introducing an adaptive algorithm that allows the weights of the RBFNN to change after the training process is completed, which eliminates the need to retrain the RBFNN model. The data used in this paper are real data measured by the National Electric Power Co. (Jordan). The data for the period January 2012 - April 2013 are used to train the RBFNN models, and the data for the period May 2013 - September 2013 are used to validate the models' effectiveness.
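A minimal sketch of the underlying idea — an RBF network trained in batch, then updated online without full retraining — is given below. It uses synthetic load-like data and a simple least-mean-squares weight update; the features, center placement, and learning rate are hypothetical and are not the authors' adaptive algorithm or the Jordanian dataset.

```python
import numpy as np

rng = np.random.default_rng(3)

def rbf_features(X, centers, width):
    """Gaussian RBF design matrix."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

# Synthetic hourly-load-like target built from a few features (hour, day type, temperature).
n = 500
X = np.column_stack([rng.uniform(0, 24, n), rng.integers(0, 2, n), rng.normal(25, 5, n)])
y = (800 + 150 * np.sin(2 * np.pi * X[:, 0] / 24) + 60 * X[:, 1]
     + 5 * X[:, 2] + rng.normal(0, 10, n))

centers = X[rng.choice(n, 20, replace=False)]       # 20 hypothetical RBF centers
width = 8.0
Phi = np.column_stack([rbf_features(X, centers, width), np.ones(n)])  # + bias
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)          # batch training

# Adaptive step: least-mean-squares update of the output weights on a new sample,
# so the network can track drift without retraining from scratch.
def adapt(w, phi_new, y_new, lr=1e-4):
    err = y_new - phi_new @ w
    return w + lr * err * phi_new

x_new = np.array([18.0, 1.0, 30.0])
phi_new = np.append(rbf_features(x_new[None, :], centers, width)[0], 1.0)
w = adapt(w, phi_new, 900.0)
print("prediction after adaptation:", round(float(phi_new @ w), 1))
```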

Keywords: load forecasting, adaptive neural network, radial basis function, short-term, electricity consumption

Procedia PDF Downloads 348
5227 Feasibility of an Extreme Wind Risk Assessment Software for Industrial Applications

Authors: Francesco Pandolfi, Georgios Baltzopoulos, Iunio Iervolino

Abstract:

The impact of extreme winds on industrial assets and the built environment is gaining increasing attention from stakeholders, including the corporate insurance industry. This has led to progressively more in-depth studies of building vulnerability and fragility to wind. Wind vulnerability models are used in probabilistic risk assessment to relate a loss metric to an intensity measure of the natural event, usually a gust or a mean wind speed. In fact, vulnerability models can be integrated with the wind hazard, which consists of associating a probability with each intensity level in a time interval (e.g., by means of return periods), to provide an assessment of future losses due to extreme wind. This has also given impetus to world- and regional-scale wind hazard studies. Another approach often adopted for the probabilistic description of building vulnerability to wind is the use of fragility functions, which provide the conditional probability that selected building components will exceed certain damage states, given the wind intensity. In fact, in the wind engineering literature, it is more common to find structural system- or component-level fragility functions than wind vulnerability models for an entire building. Loss assessment based on component fragilities requires logical combination rules that define the building's damage state given the damage state of each component, and the availability of a consequence model that provides the losses associated with each damage state. When risk calculations are based on numerical simulation of a structure's behavior during extreme wind scenarios, the interaction of component fragilities is intertwined with the computational procedure. However, simulation-based approaches are usually computationally demanding and case-specific. In this context, the present work introduces the ExtReMe wind risk assESsment prototype Software, ERMESS, which is being developed at the University of Naples Federico II. ERMESS is a wind risk assessment tool for insurance applications to industrial facilities, collecting a wide assortment of available wind vulnerability models and fragility functions to facilitate their incorporation into risk calculations based on in-built or user-defined wind hazard data. This software implements an alternative method for building-specific risk assessment based on existing component-level fragility functions and on a number of simplifying assumptions for their interactions. The applicability of this alternative procedure is explored by means of an illustrative proof-of-concept example, which considers four main building components, namely the roof covering, the roof structure, the envelope walls and the envelope openings. The application shows that, despite the simplifying assumptions, the procedure can yield risk evaluations comparable to those obtained via more rigorous building-level simulation-based methods, at least in the considered example. The advantage of this approach lies in the fact that a database of building component fragility curves can be put to use for the development of new wind vulnerability models, covering building typologies not yet adequately addressed by existing works and whose rigorous development is usually beyond the budget of portfolio-related industrial applications.
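The basic convolution of a component fragility with a wind hazard curve can be sketched as follows. This is a generic illustration, not ERMESS's method or its combination rules: the hazard curve, the lognormal fragility for a single component (roof covering), and the fixed-fraction consequence model are all hypothetical assumptions.

```python
import numpy as np
from scipy.stats import lognorm

# Hypothetical hazard curve: annual rate of exceeding gust speed v (m/s).
def hazard_rate(v, v0=20.0, k=4.0, lam0=0.5):
    return lam0 * (v0 / v) ** k

# Hypothetical lognormal fragility for one component (e.g., roof covering):
# median capacity 45 m/s, logarithmic standard deviation 0.25.
frag = lognorm(s=0.25, scale=45.0)

# Annual rate of component damage: integrate fragility * occurrence density of gusts,
# where the density is minus the derivative of the hazard exceedance curve.
v = np.linspace(20.0, 90.0, 701)
exceedance = hazard_rate(v)
density = -np.gradient(exceedance, v)
annual_damage_rate = np.trapz(frag.cdf(v) * density, v)

# Simple consequence model: damage to this component costs a fixed fraction of its value.
component_value, damage_ratio = 200_000.0, 0.35
expected_annual_loss = annual_damage_rate * component_value * damage_ratio
print(f"annual damage rate {annual_damage_rate:.4f}, EAL {expected_annual_loss:,.0f}")
```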

Keywords: component wind fragility, probabilistic risk assessment, vulnerability model, wind-induced losses

Procedia PDF Downloads 182
5226 Fluid Catalytic Cracking: Zeolite Catalyzed Chemical Industry Processes

Authors: Mithil Pandey, Ragunathan Bala Subramanian

Abstract:

One of the major conversion technologies in the oil refining industry is fluid catalytic cracking (FCC), which produces the majority of the world's gasoline. Useful products are generated from vacuum gas oil, heavy gas oil and residue feedstocks by the FCC unit in an oil refinery. Moreover, zeolite catalysts (zeo-catalysts) have found widespread application and have proved to be substantial and paradigmatic in oil refining and petrochemical processes, such as FCC, because of their porous features. Several well-known zeo-catalysts have been fabricated and applied in industrial processes as milestones in history and have brought about huge changes in petrochemicals. So far, more than twenty types of zeolites have been applied industrially, and their versatile porous architectures with their essential features have contributed to their catalytic efficiency. This poster depicts the evolution of pore models in zeolite catalysts, which has been accompanied by increasing environmental and industrial demands. The crucial role of modulating pore models in enhancing the catalytic performance of zeo-catalysts in various industrial processes is outlined. The development of industrial processes for FCC, aromatic conversions and olefin production makes it obvious that the pore architecture plays a very important role in zeo-catalysis processes. Given the different requirements of industrial processes, the rational construction of the pore model is critically essential. Moreover, the pore structure of the zeolite has a substantial and direct effect on the utilization efficiency of the zeo-catalyst.

Keywords: catalysts, fluid catalytic cracking, industrial processes, zeolite

Procedia PDF Downloads 358
5225 Detection and Quantification of Active Pharmaceutical Ingredients as Adulterants in Garcinia cambogia Slimming Preparations Using NIR Spectroscopy Combined with Chemometrics

Authors: Dina Ahmed Selim, Eman Shawky Anwar, Rasha Mohamed Abu El-Khair

Abstract:

A rapid, simple and efficient method requiring minimal sample treatment was developed for the authentication of Garcinia cambogia fruit peel powder and the determination of undeclared active pharmaceutical ingredients (APIs) in its herbal slimming dietary supplements using near-infrared spectroscopy combined with chemometrics. Five featured adulterants, namely sibutramine, metformin, orlistat, ephedrine, and theophylline, were selected as target compounds. The near-infrared spectral data matrix of authentic Garcinia cambogia fruit peel and of specimens degraded by intentional contamination with the five selected APIs was subjected to hierarchical clustering analysis to investigate their clustering pattern. SIMCA models were established to verify the genuineness of Garcinia cambogia fruit peel, which resulted in the perfect classification of all tested specimens. Adulterated samples were used for the construction of PLSR models based on different API contents at minute levels of fraudulent practice (LOQ < 0.2% w/w). The suggested approach can be applied to enhance and guarantee the safety and quality of Garcinia fruit peel powder as a raw material and in dietary supplements.
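The PLSR quantification step can be illustrated with scikit-learn. The spectra below are synthetic stand-ins (a smooth baseline plus two hypothetical absorption bands whose intensity scales with adulterant concentration); they are not the study's NIR dataset, and the number of latent variables is an arbitrary choice.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(11)

# Synthetic "NIR spectra": 120 samples x 500 wavelengths, with the adulterant
# concentration encoded in two hypothetical absorption bands.
n, p = 120, 500
conc = rng.uniform(0.0, 2.0, n)                      # % w/w adulterant
baseline = rng.normal(0, 0.01, (n, p)).cumsum(axis=1)  # smooth baseline variation
band1 = np.exp(-0.5 * ((np.arange(p) - 180) / 12) ** 2)
band2 = np.exp(-0.5 * ((np.arange(p) - 350) / 18) ** 2)
spectra = baseline + np.outer(conc, 0.8 * band1 + 0.4 * band2)

pls = PLSRegression(n_components=5)
pred = cross_val_predict(pls, spectra, conc, cv=10).ravel()
rmse = mean_squared_error(conc, pred) ** 0.5
print(f"cross-validated R2 = {r2_score(conc, pred):.3f}, RMSE = {rmse:.3f} %w/w")
```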

Keywords: Garcinia cambogia, quality control, NIR spectroscopy, chemometrics

Procedia PDF Downloads 78
5224 Suitability of Satellite-Based Data for Groundwater Modelling in Southwest Nigeria

Authors: O. O. Aiyelokun, O. A. Agbede

Abstract:

Numerical modelling of groundwater flow can be susceptible to calibration errors due to the lack of adequate ground-based hydro-meteorological stations in river basins. Groundwater resources management in Southwest Nigeria is currently challenged by overexploitation, lack of planning and monitoring, urbanization and climate change; hence, to adopt models as decision support tools for the sustainable management of groundwater, they must be adequately calibrated. Since river basins in Southwest Nigeria are characterized by missing data and lack adequate ground-based hydro-meteorological stations, the need to adopt satellite-based data for constructing distributed models is crucial. This study seeks to evaluate the suitability of satellite-based data as a substitute for ground-based data for computing boundary conditions, by determining whether ground- and satellite-based meteorological data fit well in the Ogun and Oshun River basins. The Climate Forecast System Reanalysis (CFSR) global meteorological dataset was first obtained in daily form and converted to monthly form for a period of 432 months (January 1979 to June 2014). Afterwards, ground-based meteorological data for Ikeja (1981-2010), Abeokuta (1983-2010), and Oshogbo (1981-2010) were compared with the CFSR data using goodness-of-fit (GOF) statistics. The study revealed that, based on the mean absolute error (MAE), coefficient of correlation (r) and coefficient of determination (R²), all meteorological variables except wind speed fit well. It was further revealed that maximum and minimum temperature, relative humidity and rainfall had a high index of agreement (d) and ratio of standard deviations (rSD), implying that the CFSR dataset could be used to compute boundary conditions such as groundwater recharge and potential evapotranspiration. The study concluded that satellite-based data such as the CFSR should be used as input when constructing groundwater flow models in river basins in Southwest Nigeria, where the majority of the river basins are partially gauged and characterized by long gaps in hydro-meteorological records.
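The goodness-of-fit statistics named above (MAE, r, R², Willmott's index of agreement d, and the ratio of standard deviations) are straightforward to compute; the sketch below does so on synthetic monthly series that merely stand in for the station and CFSR records used in the study.

```python
import numpy as np

def gof(obs, sim):
    """Goodness-of-fit statistics used to compare station and CFSR series."""
    mae = np.mean(np.abs(sim - obs))
    r = np.corrcoef(obs, sim)[0, 1]
    r2 = r ** 2
    # Willmott's index of agreement d
    denom = np.sum((np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    d = 1.0 - np.sum((obs - sim) ** 2) / denom
    rsd = sim.std() / obs.std()               # ratio of standard deviations
    return {"MAE": mae, "r": r, "R2": r2, "d": d, "rSD": rsd}

rng = np.random.default_rng(5)
months = np.arange(432)                        # Jan 1979 - Jun 2014
station_rain = 100 + 80 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 20, 432)
cfsr_rain = station_rain + rng.normal(0, 25, 432)   # reanalysis with extra noise

stats = gof(station_rain, cfsr_rain)
print({k: round(v, 3) for k, v in stats.items()})
```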

Keywords: boundary condition, goodness of fit, groundwater, satellite-based data

Procedia PDF Downloads 132
5223 An Evaluation of the Artificial Neural Network and Adaptive Neuro Fuzzy Inference System Predictive Models for the Remediation of Crude Oil-Contaminated Soil Using Vermicompost

Authors: Precious Ehiomogue, Ifechukwude Israel Ahuchaogu, Isiguzo Edwin Ahaneku

Abstract:

Vermicompost is the product of a decomposition process in which various species of worms break down a mixture of decomposing vegetable or food waste, bedding materials, and vermicast. This process is called vermicomposting, while the rearing of worms for this purpose is called vermiculture. Several works have verified the adsorption of toxic metals by vermicompost, but its application for the retention of organic compounds is still scarce. This research demonstrates the effectiveness of earthworm waste (vermicompost) for the remediation of crude oil-contaminated soils. The remediation methods adopted in this study were two soil washing methods, namely the batch and column processes, which represent laboratory and in-situ remediation, respectively. Characterization of the vermicompost and the crude oil-contaminated soil was performed before and after soil washing using Fourier-transform infrared spectroscopy (FTIR), scanning electron microscopy (SEM), X-ray fluorescence (XRF), X-ray diffraction (XRD) and atomic absorption spectrometry (AAS). The optimization of the washing parameters, using response surface methodology (RSM) based on a Box-Behnken design, was performed on the responses from the laboratory experimental results. This study also investigated the application of machine learning models, namely an artificial neural network (ANN) and an adaptive neuro-fuzzy inference system (ANFIS), which were evaluated using the coefficient of determination (R²) and the mean square error (MSE). The removal efficiency obtained from the Box-Behnken design experiment ranged from 29% to 98.9% for the batch process remediation. Optimization of the experimental factors, carried out using the numerical desirability function method of response surface methodology (RSM), produced the highest removal efficiency of 98.9% at an absorbent dosage of 34.53 grams, an adsorbate concentration of 69.11 g/ml, a contact time of 25.96 min, and a pH value of 7.71. The removal efficiency obtained from the multilevel general factorial design experiment ranged from 56% to 92% for the column process remediation. The coefficient of determination (R²) for the ANN was 0.9974 and 0.9852 for the batch and column processes, respectively, showing agreement between experimental and predicted results. For the batch and column processes, respectively, the coefficient of determination (R²) for RSM was 0.9712 and 0.9614, which also demonstrates agreement between experimental and predicted findings. For the batch and column processes, the ANFIS coefficients of determination were 0.7115 and 0.9978, respectively. It can be concluded that machine learning models can predict the removal of crude oil from polluted soil using vermicompost. Therefore, it is recommended to use machine learning models to predict the removal of crude oil from contaminated soil using vermicompost.

Keywords: ANFIS, ANN, crude-oil, contaminated soil, remediation and vermicompost

Procedia PDF Downloads 113
5222 Ecosystem Carbon Stocks Vary in Reference to the Models Used, Socioecological Factors and Agroforestry Practices in Central Ethiopia

Authors: Gadisa Demie, Mesele Negash, Zerihun Asrat, Lojka Bohdan

Abstract:

Deforestation and forest degradation in the tropics have led to significant carbon (C) emissions. Agroforestry (AF) is a suitable land-use option for tackling such declines in ecosystem services, including through climate change mitigation. However, it is unclear how biomass models, AF practices, and socio-ecological factors determine these roles, which hinders the implementation of climate change mitigation initiatives. This study aimed to estimate the ecosystem C stocks of the studied AF practices in relation to socio-ecological variables in central Ethiopia. Out of 243 AF farms inventoried, 108 were chosen at random from three AF practices to estimate their biomass and soil organic carbon (SOC). A total of 432 soil samples were collected from the 0-30 and 30-60 cm soil depths; 216 samples each were taken for the soil organic carbon fraction (%C) and bulk density computations. The study found that recently developed allometric equations were the most accurate for estimating the biomass C of trees growing in the landscape when compared to previous models. The study found higher overall biomass C in woodlots (165.62 Mg ha⁻¹) than in homegardens (134.07 Mg ha⁻¹) and parklands (19.98 Mg ha⁻¹). Conversely, overall SOC was higher for homegardens (143.88 Mg ha⁻¹) and lower for parklands (53.42 Mg ha⁻¹). The ecosystem C stock was comparable between homegardens (277.95 Mg ha⁻¹) and woodlots (275.44 Mg ha⁻¹). The study found that elevation, wealth level, AF farm age, and farm size had a positive and significant (P < 0.05) effect on overall biomass and ecosystem C stocks, while the effect of slope was non-significant (P > 0.05). Similarly, SOC increased with increasing elevation, AF farm age, and wealth status but decreased with slope, and the effect of AF farm size was non-significant. The study also showed that species diversity had a positive (P < 0.05) effect on overall biomass C stocks in homegardens. Overall, the study highlights that AF practices have great potential to lock up carbon in biomass and soils; however, this potential is determined by socio-ecological variables. These factors should therefore be considered in management strategies that preserve trees in agricultural landscapes in order to mitigate climate change and support farmers' livelihoods.

Keywords: agricultural landscape, biomass, climate change, soil organic carbon

Procedia PDF Downloads 53
5221 Dynamic Reliability for a Complex System and Process: Application on Offshore Platform in Mozambique

Authors: Raed KOUTA, José-Alcebiades-Ernesto HLUNGUANE, Eric Châtele

Abstract:

The search for and exploitation of new fossil energy resources is taking place in a context of gradual depletion of existing deposits. Despite the adoption of international targets to combat global warming, the demand for fuels continues to grow, contradicting the movement towards an energy-efficient society. The increase in the share of offshore production in global hydrocarbon output tends to compensate for the depletion of terrestrial reserves, thus constituting a major challenge for the players in the sector. Through the economic potential it represents and the energy independence it provides, offshore exploitation is also a challenge for States such as Mozambique, which have large maritime areas and whose environmental wealth must be taken into account. The exploitation of new reserves on economically viable terms depends on the available technologies. The development of deep and ultra-deep offshore fields requires significant research and development efforts, and progress has also been made in managing the multiple risks inherent in this activity. Our study proposes a reliability approach to develop products and processes designed to operate at sea. Indeed, the context of an offshore platform requires highly reliable solutions that overcome the difficulties of accessing the system for regular maintenance and quick repairs and that resist deterioration and degradation processes. One of the characteristics of the failures we consider is that the actual conditions of use are 'extreme.' These conditions depend on time and on the interactions between the different causes; these two factors give the degradation process its dynamic character, hence the need to develop dynamic reliability models. Our work highlights mathematical models that can explicitly handle interactions between components and process variables. These models are accompanied by numerical resolution methods that help to structure a dynamic reliability approach in a physical and probabilistic context. The application developed makes it possible to evaluate the reliability, availability, and maintainability of a floating storage and unloading platform for liquefied natural gas production.

Keywords: dynamic reliability, offshore platform, stochastic process, uncertainties

Procedia PDF Downloads 122
5220 An Inquiry into the Usage of Complex Systems Models to Examine the Effects of the Agent Interaction in a Political Economic Environment

Authors: Ujjwall Sai Sunder Uppuluri

Abstract:

Group theory is a powerful tool that researchers can use to provide a structural foundation for their agent-based models, which this paper argues are the future of the social science disciplines. More specifically, researchers can use them to apply evolutionary theory to the study of complex social systems. This paper illustrates one example of how an agent-based model can theoretically be formulated from the application of group theory, system dynamics, and evolutionary biology to analyze the strategies pursued by states to mitigate risk and maximize the use of resources in order to achieve the objective of economic growth. This example can be applied to other social phenomena, and this is what makes group theory so useful to the analysis of complex systems: the theory provides the mathematical, formulaic proof for validating the complex-system models that researchers build, as discussed in this paper. The aim of this research is also to provide researchers with a framework that can be used to model political entities such as states on a 3-dimensional plane, with the x-axis representing the resources (tangible and intangible) available to them, the y-axis the risks, and the z-axis the objective. There also exist other states with different constraints pursuing different strategies to climb the mountain. This mountain's environment is made up of the risks the state faces and its resource endowments. The mountain is also layered in the sense that it has multiple peaks that must be overcome to reach the tallest peak. A state that sticks to a single strategy, or pursues a strategy that is not conducive to climbing the specific peak it has reached, is not able to continue its advancement; to overcome the obstacle in its path, the state must innovate. Based on the definition of a group, we can categorize each state as being its own group. Each state is a closed system, one made up of micro-level agents who have their own vectors and pursue strategies (actions) to achieve certain sub-objectives. The state also has an identity, the inverse being anarchy and/or inaction. Finally, the agents making up a state interact with each other through competition and collaboration to mitigate risks and achieve sub-objectives that fall within the primary objective. Thus, researchers can categorize the state as an organism that reflects the sum of the output of the interactions pursued by agents at the micro level. When states compete, they employ strategies, and the state with the better strategy (reflected by the strategies pursued by its parts) is able to out-compete its counterpart to acquire some resource, mitigate some risk or fulfil some objective. This paper will attempt to illustrate how group theory, combined with evolutionary theory and system dynamics, can allow researchers to model the long-run development, evolution, and growth of political entities through a bottom-up approach.

Keywords: complex systems, evolutionary theory, group theory, international political economy

Procedia PDF Downloads 141
5219 Simplified Modelling of Visco-Elastic Fluids for Use in Recoil Damping Systems

Authors: Prasad Pokkunuri

Abstract:

Visco-elastic materials combine the stress response properties of both solids and fluids and have found use in a variety of damping applications, both vibrational and acoustic. Defense and automotive applications, in particular, are subject to high impact and shock loading, for example aircraft landing gear, firearms, and shock absorbers. Field-responsive fluids, a class of smart materials, are the preferred choice of energy absorbent because of their controllability: their stress response can be controlled by the application of a magnetic or electric field in a closed loop, and their rheological properties (elasticity, plasticity, and viscosity) can be varied all the way from those of a liquid such as water to those of a hard solid. This work presents a simplified model to study the impulse response behavior of such fluids for use in recoil damping systems. The well-known Burgers equation, in conjunction with various visco-elastic constitutive models, is used to represent the fluid behavior. The Kelvin-Voigt, Upper Convected Maxwell (UCM), and Oldroyd-B constitutive models are implemented in this study. Using these models in a one-dimensional framework eliminates additional complexities due to geometry, pressure, body forces, and other source terms. Using a finite difference formulation to numerically solve the governing equation(s), the response to an initial impulse is studied. The disturbance is confined within the problem domain with no-inflow, no-outflow boundary conditions, and its decay characteristics are studied. Visco-elastic fluids typically involve a time-dependent stress relaxation, which gives rise to interesting behavior when the fluid is subjected to an impulsive load. For particular values of viscous damping and elastic modulus, the fluid settles into a stable oscillatory state, absorbing and releasing energy without much decay. The simplified formulation enables a comprehensive study of the different modes of system response by varying the relevant parameters. Using the insights gained from this study, extension to a more detailed multi-dimensional model is considered.
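The overall numerical setup — a one-dimensional Burgers-type equation solved by finite differences with an initial impulse and closed boundaries — can be sketched as below. This sketch uses only the purely viscous form of the equation, omitting the Kelvin-Voigt, UCM, and Oldroyd-B stress coupling of the paper, and the grid, viscosity, and impulse shape are hypothetical.

```python
import numpy as np

# Finite-difference solution of u_t + u * u_x = nu * u_xx on [0, 1]
# with an initial Gaussian impulse and zero-gradient (no-flux) boundaries.
nx, nu = 201, 0.01
dx = 1.0 / (nx - 1)
dt = 0.2 * dx ** 2 / nu                              # stable explicit time step
x = np.linspace(0.0, 1.0, nx)
u = np.exp(-((x - 0.5) / 0.05) ** 2)                 # initial impulse

def step(u):
    un = u.copy()
    # upwind convection + central diffusion at the interior nodes
    conv = un[1:-1] * (un[1:-1] - un[:-2]) / dx
    diff = nu * (un[2:] - 2 * un[1:-1] + un[:-2]) / dx ** 2
    u[1:-1] = un[1:-1] + dt * (diff - conv)
    u[0], u[-1] = u[1], u[-2]                        # no-inflow / no-outflow boundaries
    return u

energy = []
for n in range(4000):
    u = step(u)
    if n % 400 == 0:
        energy.append(0.5 * np.sum(u ** 2) * dx)     # decay of the impulse energy

print(["%.4f" % e for e in energy])
```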

Keywords: Burgers equation, impulse response, recoil damping systems, visco-elastic fluids

Procedia PDF Downloads 294