Search results for: representation of graph models

7370 Intrusion Detection in Computer Networks Using a Hybrid Model of Firefly and Differential Evolution Algorithms

Authors: Mohammad Besharatloo

Abstract:

Intrusion detection is an important research topic in network security because of the growing use of computer network services. Intrusion detection aims to detect unauthorized use or abuse of networks and systems by intruders. The intrusion detection system is therefore an efficient tool for controlling user access through predefined rules. Since the data used in intrusion detection systems are high-dimensional, a proper representation is required to reveal the basic structure of these data, and redundant features must be eliminated to create the best representation subset. In the proposed method, a hybrid model of the differential evolution and firefly algorithms is employed to choose the best subset of features, and a decision tree and a support vector machine (SVM) are adopted to assess the quality of the selected features. First, the sorted population is divided into two sub-populations, and the two optimization algorithms are applied to these sub-populations, respectively. The sub-populations are then merged to create the population for the next iteration. The performance of the proposed method is evaluated on the KDD Cup 99 dataset. The simulation results show that the proposed method outperforms the other methods in this context.
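
A minimal Python sketch of the split-evolve-merge loop described above may help make the procedure concrete. The real-coded mask encoding, the parameter values, the simplified DE/rand/1 and firefly moves, and the decision-tree fitness on a synthetic dataset are all illustrative assumptions rather than the authors' implementation.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the KDD Cup 99 data; masks, moves and parameters are illustrative.
rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=20, n_informative=5, random_state=0)

def fitness(ind):
    mask = ind > 0.5                      # threshold the real-coded vector to a feature subset
    if not mask.any():
        return 0.0
    return cross_val_score(DecisionTreeClassifier(random_state=0), X[:, mask], y, cv=3).mean()

pop = rng.random((20, X.shape[1]))        # population of real-coded feature masks
for generation in range(10):
    order = np.argsort([fitness(ind) for ind in pop])[::-1]      # sort population by fitness
    half_a, half_b = pop[order[:10]].copy(), pop[order[10:]].copy()

    # Differential-evolution (DE/rand/1) moves on the first sub-population.
    for i in range(len(half_a)):
        a, b, c = half_a[rng.choice(len(half_a), 3, replace=False)]
        trial = np.clip(a + 0.5 * (b - c), 0.0, 1.0)
        if fitness(trial) > fitness(half_a[i]):
            half_a[i] = trial

    # Simplified firefly moves on the second sub-population: drift toward its
    # brightest member plus a small random perturbation.
    best = half_b[int(np.argmax([fitness(ind) for ind in half_b]))]
    half_b = np.clip(half_b + 0.5 * (best - half_b) + 0.1 * (rng.random(half_b.shape) - 0.5), 0.0, 1.0)

    pop = np.vstack([half_a, half_b])     # merge to form the population of the next iteration

best = pop[int(np.argmax([fitness(ind) for ind in pop]))]
print("selected features:", np.flatnonzero(best > 0.5))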

Keywords: intrusion detection system, differential evolution, firefly algorithm, support vector machine, decision tree

Procedia PDF Downloads 94
7369 Modelling Phase Transformations in Zircaloy-4 Fuel Cladding under Transient Heating Rates

Authors: Jefri Draup, Antoine Ambard, Chi-Toan Nguyen

Abstract:

Zirconium alloys exhibit solid-state phase transformations under thermal loading. These can lead to a significant evolution of the microstructure and associated mechanical properties of materials used in nuclear fuel cladding structures. Therefore, the ability to capture the effects of phase transformation on the material constitutive behavior is of interest during conditions of severe transient thermal loading. Whilst typical Avrami, or Johnson-Mehl-Avrami-Kolmogorov (JMAK), type models for phase transformations have been shown to have a good correlation with the behavior of Zircaloy-4 under constant heating rates, the effects of variable and fast heating rates are not fully explored. The present study utilises the results of in-situ high energy synchrotron X-ray diffraction (SXRD) measurements in order to validate the phase transformation models for Zircaloy-4 under fast variable heating rates. These models are used to assess the performance of fuel cladding structures under loss of coolant accident (LOCA) scenarios. The results indicate that simple Avrami type models can provide a reasonable indication of the phase distribution in experimental test specimens under variable fast thermal loading. However, the accuracy of these models deteriorates under the faster heating regimes, i.e., 100 °C s⁻¹. The study highlights areas for improvement of simple Avrami type models, such as the inclusion of temperature-rate dependence of the JMAK n-exponent.
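
A short numerical sketch of the JMAK calculation under a fast heating ramp, using the rate form of the Avrami equation with an Arrhenius rate constant; the pre-factor, activation energy and exponent below are placeholders rather than calibrated Zircaloy-4 values.

import numpy as np

n = 2.5                            # JMAK exponent (its heating-rate dependence is the suggested refinement)
k0, Q, R = 1.0e10, 2.0e5, 8.314    # placeholder Arrhenius pre-factor (1/s), activation energy (J/mol), gas constant

def k(T):                          # temperature-dependent rate constant, T in kelvin
    return k0 * np.exp(-Q / (R * T))

heating_rate = 100.0               # K/s, in the fast-transient regime discussed above
dt = 1.0e-3
T, X = 900.0, 1.0e-9               # start near the alpha phase with a tiny transformed fraction
history = []
while X < 0.999 and T < 1500.0:
    # Rate form of X(t) = 1 - exp(-(k t)^n): dX/dt = n k (1 - X) (-ln(1 - X))^((n-1)/n)
    dXdt = n * k(T) * (1.0 - X) * (-np.log(1.0 - X)) ** ((n - 1.0) / n)
    X = min(X + dXdt * dt, 0.999999)
    T += heating_rate * dt
    history.append((T, X))

for T_i, X_i in history[::1500]:
    print(f"T = {T_i:7.1f} K   transformed (beta) fraction = {X_i:.3f}")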

Keywords: accident, fuel, modelling, zirconium

Procedia PDF Downloads 142
7368 A Pedagogical Case Study on Consumer Decision Making Models: A Selection of Smart Phone Apps

Authors: Yong Bum Shin

Abstract:

This case focuses on the Weighted Additive Difference, Conjunctive, Disjunctive, and Elimination by Aspects methodologies in consumer decision-making models, and on the Simple Additive Weighting (SAW) approach in the multi-criteria decision-making (MCDM) area. Most decision-making models illustrate that the rank reversal phenomenon is unpreventable. This paper shows that rank reversal occurs in popular managerial methods such as Weighted Additive Difference (WAD), the Conjunctive method, the Disjunctive method, and Elimination by Aspects (EBA), as well as in MCDM methods such as Simple Additive Weighting (SAW). Finally, the Unified Commensurate Multiple (UCM) model, which successfully addresses these rank reversal problems in the most popular MCDM methods, is presented.
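
The rank reversal at issue can be reproduced in a few lines. The sketch below uses SAW with linear max-normalization on an invented two-criterion decision matrix: adding a third alternative reverses the order of the first two.

import numpy as np

weights = np.array([0.5, 0.5])           # two benefit criteria, equal weights (invented data)

def saw_scores(matrix):
    normalized = matrix / matrix.max(axis=0)   # divide each criterion by its column maximum
    return normalized @ weights

original = np.array([[10.0, 4.0],        # alternative A1
                     [ 4.0, 6.0]])       # alternative A2
extended = np.vstack([original,
                      [40.0, 1.0]])      # new alternative A3 dominates criterion 1 only

print("before:", saw_scores(original))   # A1 = 0.833 > A2 = 0.700
print("after: ", saw_scores(extended))   # A1 = 0.458 < A2 = 0.550 -> ranks of A1 and A2 reverse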

Keywords: multiple criteria decision making, rank inconsistency, unified commensurate multiple, analytic hierarchy process

Procedia PDF Downloads 82
7367 A Comparative Evaluation of the SIR and SEIZ Epidemiological Models to Describe the Diffusion Characteristics of COVID-19 Polarizing Viewpoints Online

Authors: Maryam Maleki, Esther Mead, Mohammad Arani, Nitin Agarwal

Abstract:

This study is conducted to examine how opposing viewpoints related to COVID-19 were diffused on Twitter. To accomplish this, six datasets using two epidemiological models, SIR (Susceptible, Infected, Recovered) and SEIZ (Susceptible, Exposed, Infected, Skeptics), were analyzed. The six datasets were chosen because they represent opposing viewpoints on the COVID-19 pandemic. Three of the datasets contain anti-subject hashtags, while the other three contain pro-subject hashtags. The time frame for all datasets is three years, starting from January 2020 to December 2022. The findings revealed that while both models were effective in evaluating the propagation trends of these polarizing viewpoints, the SEIZ model was more accurate with a relatively lower error rate (6.7%) compared to the SIR model (17.3%). Additionally, the relative error for both models was lower for anti-subject hashtags compared to pro-subject hashtags. By leveraging epidemiological models, insights into the propagation trends of polarizing viewpoints on Twitter were gained. This study paves the way for the development of methods to prevent the spread of ideas that lack scientific evidence while promoting the dissemination of scientifically backed ideas.
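
A hedged sketch of the two compartment models, integrated with SciPy: the SEIZ right-hand side follows one common formulation from the rumor-diffusion literature, and the rates, initial conditions and population size are illustrative rather than the values fitted to the hashtag datasets.

import numpy as np
from scipy.integrate import odeint

N = 100_000.0   # illustrative user population

def sir(state, t, beta, gamma):
    S, I, R = state
    return [-beta * S * I / N,
            beta * S * I / N - gamma * I,
            gamma * I]

def seiz(state, t, beta, b, rho, eps, p, l):
    # S susceptible, E exposed, I infected (adopted the viewpoint), Z skeptic.
    S, E, I, Z = state
    return [-beta * S * I / N - b * S * Z / N,
            (1 - p) * beta * S * I / N + (1 - l) * b * S * Z / N - rho * E * I / N - eps * E,
            p * beta * S * I / N + rho * E * I / N + eps * E,
            l * b * S * Z / N]

t = np.linspace(0, 60, 300)                                        # days
sir_path = odeint(sir, [N - 10, 10, 0], t, args=(0.4, 0.1))        # illustrative rates
seiz_path = odeint(seiz, [N - 10, 0, 10, 0], t, args=(0.4, 0.2, 0.1, 0.05, 0.6, 0.3))

# "Infected" here means users who have tweeted the viewpoint; the study compares
# each model's relative error against the observed adoption counts.
print("peak infected, SIR :", sir_path[:, 1].max())
print("peak infected, SEIZ:", seiz_path[:, 2].max())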

Keywords: mathematical modeling, epidemiological model, seiz model, sir model, covid-19, twitter, social network analysis, social contagion

Procedia PDF Downloads 67
7366 Comparative Sustainability Performance Analysis of Australian Companies Using Composite Measures

Authors: Ramona Zharfpeykan, Paul Rouse

Abstract:

Organizational sustainability is important to both organizations themselves and their stakeholders. Despite its increasing popularity and the growing number of organizations reporting on sustainability, research on evaluating and comparing the sustainability performance of companies is limited. The aim of this study was to develop models to measure sustainability performance for both cross-sectional and longitudinal comparisons across companies in the same or different industries. A secondary aim was to see if sustainability reports can be used to evaluate sustainability performance. The study used both a content analysis of Australian sustainability reports in the mining and metals and financial services sectors for 2011-2014 and a survey of Australian and New Zealand organizations. Two methods, ranging from a composite index using uniform weights to data envelopment analysis (DEA), were employed to analyze the data and develop the models. The results show strong statistically significant relationships between the developed models, which suggests that each model provides a consistent, systematic and reasonably robust analysis. The results of the models show that for both industries, companies that had sustainability scores above or below the industry average stayed almost the same during the study period. These indices and models can be used by companies to evaluate their sustainability performance and compare it with previous years, or with other companies in the same or different industries. These methods can also be used by various stakeholders and sustainability ranking organizations such as the Global Reporting Initiative (GRI).
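
A minimal sketch of the uniform-weight composite index, one of the two measures used (the DEA counterpart requires a linear-programming formulation and is omitted); the indicator names and scores are hypothetical.

import numpy as np

indicators = ["emissions", "water", "community", "governance"]   # hypothetical GRI-style indicators
scores = np.array([[0.8, 0.6, 0.9, 0.7],    # company A, each indicator scored on [0, 1]
                   [0.5, 0.9, 0.4, 0.6],    # company B
                   [0.7, 0.7, 0.7, 0.7]])   # company C

weights = np.full(len(indicators), 1.0 / len(indicators))        # uniform weights
composite = scores @ weights
industry_average = composite.mean()

for name, value in zip("ABC", composite):
    flag = "above" if value >= industry_average else "below"
    print(f"company {name}: index = {value:.3f} ({flag} industry average {industry_average:.3f})")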

Keywords: data envelopment analysis, sustainability, sustainability performance measurement system, sustainability performance index, global reporting initiative

Procedia PDF Downloads 181
7365 A Study of High Viscosity Oil-Gas Slug Flow Using Gamma Densitometer

Authors: Y. Baba, A. Archibong-Eso, H. Yeung

Abstract:

Experimental studies of high viscosity oil-gas flows in horizontal pipelines published in the literature have indicated that hydrodynamic slug flow is the dominant flow pattern observed. Investigations have shown that hydrodynamic slugging brings about high pressure instabilities that can damage production facilities, making it essential to study the high-viscosity slug flow regime so as to improve the understanding of its flow dynamics. Most slug flow models used in the petroleum industry for the design of pipelines, together with their closure relationships, were formulated based on observations of low viscosity liquid-gas flows. New experimental investigations and data are therefore required to validate these models. In cases where these models underperform, improving upon or building new predictive models and correlations will also depend on new experimental datasets and a further understanding of the flow dynamics in high-viscosity oil-gas flows. In this study, conducted at the Flow Laboratory, Oil and Gas Engineering Centre of Cranfield University, slug flow variables such as pressure gradient, mean liquid holdup, slug frequency and slug length for oil viscosities ranging from 1.0–5.5 Pa·s are experimentally investigated and analysed. The study was carried out in a 0.076 m ID pipe; two fast-sampling gamma densitometers and pressure transducers (differential and point) were used to obtain the experimental measurements. Comparison of the measured slug flow parameters to the existing slug flow prediction models available in the literature showed disagreement with the high viscosity experimental data, thus highlighting the importance of building new predictive models and correlations.

Keywords: gamma densitometer, mean liquid holdup, pressure gradient, slug frequency and slug length

Procedia PDF Downloads 330
7364 Simulation of Optimal Runoff Hydrograph Using Ensemble of Radar Rainfall and Blending of Runoff Models

Authors: Myungjin Lee, Daegun Han, Jongsung Kim, Soojun Kim, Hung Soo Kim

Abstract:

Recently, localized heavy rainfall and typhoons have occurred frequently due to climate change, and the resulting damage is becoming larger. Therefore, more accurate prediction of rainfall and runoff may be needed. However, gauge rainfall has limited spatial accuracy. Radar rainfall explains the spatial variability of rainfall better than gauge rainfall, but it is mostly underestimated and involves uncertainty. Therefore, an ensemble of radar rainfall was simulated using an error structure with respect to gauge rainfall to overcome this uncertainty. The simulated ensemble was used as the input data of the rainfall-runoff models to obtain an ensemble of runoff hydrographs. Previous studies have discussed the accuracy of rainfall-runoff models. Even if the same input data, such as rainfall, are used for runoff analysis with different models in the same basin, the models can give different results because of the uncertainty involved in the models. Therefore, we used two models, the SSARR model, which is a lumped model, and the Vflo model, which is a distributed model, and tried to simulate the optimum runoff considering the uncertainty of each rainfall-runoff model. The study basin is located in the Han River basin, and we obtained one integrated, optimum runoff hydrograph using blending methods such as Multi-Model Super Ensemble (MMSE), Simple Model Average (SMA), and Mean Square Error (MSE). From this study, we could confirm the accuracy of the rainfall and the rainfall-runoff models using the ensemble scenario and various rainfall-runoff models, and this result can be used to study flood control measures under climate change. Acknowledgements: This work is supported by the Korea Agency for Infrastructure Technology Advancement (KAIA) grant funded by the Ministry of Land, Infrastructure and Transport (Grant 18AWMP-B083066-05).
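
A short sketch of blending two model hydrographs (for instance a lumped SSARR run and a distributed Vflo run) into a single hydrograph; the simple average and the inverse-MSE weighting below stand in for the SMA/MSE-type blending named above, and the discharge series are synthetic.

import numpy as np

observed = np.array([ 5., 12., 40., 95., 70., 38., 20., 10.])   # m3/s, synthetic event
model_a  = np.array([ 6., 15., 35., 80., 75., 42., 24., 12.])   # "lumped" model run
model_b  = np.array([ 4., 10., 48., 99., 60., 33., 17.,  9.])   # "distributed" model run

def mse(sim):
    return np.mean((sim - observed) ** 2)

# Simple Model Average (SMA): equal weights for both models.
sma = 0.5 * model_a + 0.5 * model_b

# MSE-based weights: each model weighted by the inverse of its error.
w_a, w_b = 1.0 / mse(model_a), 1.0 / mse(model_b)
mse_blend = (w_a * model_a + w_b * model_b) / (w_a + w_b)

for name, sim in [("model A", model_a), ("model B", model_b),
                  ("SMA blend", sma), ("inverse-MSE blend", mse_blend)]:
    print(f"{name:18s} MSE = {mse(sim):6.2f}")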

Keywords: radar rainfall ensemble, rainfall-runoff models, blending method, optimum runoff hydrograph

Procedia PDF Downloads 280
7363 Wind Fragility of Window Glass in 10-Story Apartment with Two Different Window Models

Authors: Viriyavudh Sim, WooYoung Jung

Abstract:

Damage due to high wind is not limited to load-resisting components such as beams and columns. The majority of damage is due to breaches in the building envelope, such as broken roofs, windows, and doors. In this paper, the wind fragility of window glass in a residential apartment was determined to compare the difference between two window configuration models. The Monte Carlo simulation method was used to derive damage data, and analytical fragilities were constructed. The fragility of the window system showed that windows located in the leeward wall had a higher probability of failure, especially those close to the edge of the structure. Between the two window models, Model 2 had a higher probability of failure, which was due to the number of panels in this configuration.
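
A compact sketch of how such a fragility curve can be derived by Monte Carlo simulation: at each wind speed, sample the glass panel resistance and the applied wind pressure and count failures; the pressure coefficient and resistance statistics are illustrative assumptions, not the values used in the study.

import numpy as np

rng = np.random.default_rng(1)
n_sim = 20_000
air_density = 1.225                                      # kg/m3
pressure_coeff = rng.normal(1.2, 0.12, n_sim)            # illustrative combined coefficient (leeward edge panel)
resistance = rng.lognormal(np.log(2500.0), 0.2, n_sim)   # illustrative glass panel capacity, Pa

wind_speeds = np.arange(20, 81, 5)                       # gust speeds in m/s
fragility = []
for v in wind_speeds:
    demand = 0.5 * air_density * pressure_coeff * v ** 2     # wind pressure on the panel, Pa
    fragility.append(np.mean(demand > resistance))           # probability of glass failure

for v, pf in zip(wind_speeds, fragility):
    print(f"V = {v:2d} m/s  P(failure) = {pf:.3f}")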

Keywords: wind fragility, glass window, high rise building, wind disaster

Procedia PDF Downloads 259
7362 Non-Linear Causality Inference Using BAMLSS and Bi-CAM in Finance

Authors: Flora Babongo, Valerie Chavez

Abstract:

Inferring causality from observational data is one of the fundamental subjects, especially in quantitative finance. So far, most papers analyze additive noise models with either linearity, nonlinearity or Gaussian noise. We fill in the gap by providing a nonlinear and non-Gaussian causal multiplicative noise model that aims to distinguish the cause from the effect using a two-step method based on Bayesian additive models for location, scale and shape (BAMLSS) and on causal additive models (CAM). We have tested our method on simulated and real data, and we reached an accuracy of 0.86 on average. As real data, we considered the causality between financial indices, such as the S&P 500, Nasdaq, CAC 40 and Nikkei, and companies' log-returns. Our results can be useful in inferring causality when the data is heteroskedastic or non-injective.

Keywords: causal inference, DAGs, BAMLSS, financial index

Procedia PDF Downloads 152
7361 RAPDAC: Role Centric Attribute Based Policy Driven Access Control Model

Authors: Jamil Ahmed

Abstract:

Access control models aim to decide whether a user should be denied or granted access to the user's requested activity. Various access control models have been established and proposed. The most prominent of these include role-based, attribute-based and policy-based access control models, as well as the role-centric attribute-based access control model. In this paper, a novel access control model called the "Role-centric Attribute-based Policy-Driven Access Control (RAPDAC) model" is presented. RAPDAC incorporates the concept of "policy" into the role-centric attribute-based access control model. It leverages the concept of policy by precisely combining the evaluation of conditions, attributes, permissions and roles in order to authorize access. This approach allows the access control policy of a real-time application to be captured in a well defined manner. The RAPDAC model allows access decisions to be made at a much finer granularity, as illustrated by the case study of a real-time library information system.
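
A toy sketch of a role-centric, attribute-based, policy-driven decision in the spirit of the model described above; the library-domain roles, attributes and policy conditions are invented for illustration and are not the RAPDAC specification itself.

# Invented library-domain roles, attributes and policies (not the RAPDAC specification).
ROLE_PERMISSIONS = {
    "librarian": {"checkout_book", "add_book", "view_record"},
    "member":    {"checkout_book", "view_record"},
}

# A policy refines role permissions with conditions over user, resource and context attributes.
POLICIES = [
    {"permission": "checkout_book",
     "condition": lambda user, res, ctx: user["fines"] == 0 and ctx["library_open"]},
    {"permission": "add_book",
     "condition": lambda user, res, ctx: user["department"] == res["section"]},
]

def authorize(user, permission, resource, context):
    if permission not in ROLE_PERMISSIONS.get(user["role"], set()):
        return False                                     # role check (the RBAC part)
    for policy in POLICIES:                              # policy/attribute check (the ABAC part)
        if policy["permission"] == permission and not policy["condition"](user, resource, context):
            return False
    return True

member = {"role": "member", "fines": 0, "department": "fiction"}
book = {"section": "fiction"}
print(authorize(member, "checkout_book", book, {"library_open": True}))   # True
print(authorize(member, "add_book", book, {"library_open": True}))        # False: role lacks the permission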

Keywords: authorization, access control model, role based access control, attribute based access control

Procedia PDF Downloads 161
7360 On Stochastic Models for Fine-Scale Rainfall Based on Doubly Stochastic Poisson Processes

Authors: Nadarajah I. Ramesh

Abstract:

Much of the research on stochastic point process models for rainfall has focused on Poisson cluster models constructed from either the Neyman-Scott or Bartlett-Lewis processes. The doubly stochastic Poisson process provides a rich class of point process models, especially for fine-scale rainfall modelling. This paper provides an account of recent development on this topic and presents the results based on some of the fine-scale rainfall models constructed from this class of stochastic point processes. Amongst the literature on stochastic models for rainfall, greater emphasis has been placed on modelling rainfall data recorded at hourly or daily aggregation levels. Stochastic models for sub-hourly rainfall are equally important, as there is a need to reproduce rainfall time series at fine temporal resolutions in some hydrological applications. For example, the study of climate change impacts on hydrology and water management initiatives requires the availability of data at fine temporal resolutions. One approach to generating such rainfall data relies on the combination of an hourly stochastic rainfall simulator, together with a disaggregator making use of downscaling techniques. Recent work on this topic adopted a different approach by developing specialist stochastic point process models for fine-scale rainfall aimed at generating synthetic precipitation time series directly from the proposed stochastic model. One strand of this approach focused on developing a class of doubly stochastic Poisson process (DSPP) models for fine-scale rainfall to analyse data collected in the form of rainfall bucket tip time series. In this context, the arrival pattern of rain gauge bucket tip times N(t) is viewed as a DSPP whose rate of occurrence varies according to an unobserved finite state irreducible Markov process X(t). Since the likelihood function of this process can be obtained, by conditioning on the underlying Markov process X(t), the models were fitted with maximum likelihood methods. The proposed models were applied directly to the raw data collected by tipping-bucket rain gauges, thus avoiding the need to convert tip-times to rainfall depths prior to fitting the models. One advantage of this approach was that the use of maximum likelihood methods enables a more straightforward estimation of parameter uncertainty and comparison of sub-models of interest. Another strand of this approach employed the DSPP model for the arrivals of rain cells and attached a pulse or a cluster of pulses to each rain cell. Different mechanisms for the pattern of the pulse process were used to construct variants of this model. We present the results of these models when they were fitted to hourly and sub-hourly rainfall data. The results of our analysis suggest that the proposed class of stochastic models is capable of reproducing the fine-scale structure of the rainfall process, and hence provides a useful tool in hydrological modelling.
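
A brief sketch of simulating this kind of doubly stochastic Poisson process: an unobserved two-state Markov process X(t) switches the bucket-tip rate of N(t) between a light and a heavy regime; all rates below are illustrative rather than fitted values.

import numpy as np

rng = np.random.default_rng(7)
switch_rates = np.array([1.0 / 12.0, 1.0 / 3.0])   # mean sojourn of 12 h (light) and 3 h (heavy), illustrative
tip_rates = np.array([0.5, 30.0])                  # bucket tips per hour in each hidden state, illustrative

def simulate_dspp(total_hours=48.0):
    t, state, tips = 0.0, 0, []
    while t < total_hours:
        sojourn = rng.exponential(1.0 / switch_rates[state])     # time spent in the current state
        end = min(t + sojourn, total_hours)
        n = rng.poisson(tip_rates[state] * (end - t))            # tip count while the rate is constant
        tips.extend(np.sort(rng.uniform(t, end, n)))             # uniform arrival times given the count
        t, state = end, 1 - state                                # switch regime
    return np.array(tips)

tip_times = simulate_dspp()
print(f"{tip_times.size} bucket tips over 48 h; first five at {np.round(tip_times[:5], 2)} h")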

Keywords: fine-scale rainfall, maximum likelihood, point process, stochastic model

Procedia PDF Downloads 279
7359 Integrated Mathematical Modeling and Advanced Visualization of Magnetic Nanoparticles for Drug Delivery, Drug Release and Effects on Cancer Cell Treatment

Authors: Norma Binti Alias, Che Rahim Che The, Norfarizan Mohd Said, Sakinah Abdul Hanan, Akhtar Ali

Abstract:

This paper discusses the transport of magnetically targeted drugs through blood within vessels, tissues and cells. Three integrated mathematical models are discussed and analyzed for the concentration of drug and the blood flow carrying magnetic nanoparticles. Cell therapy has brought advances in the field of nanotechnology to fight tumors, and the systematic therapeutic effect on single cells can reduce the growth of cancer tissue. This nanoscale system can be measured and modelled by identifying some parameters and applying fundamental principles of mathematical modeling and simulation. The mathematical modeling of single-cell growth depends on three types of cell densities: proliferative, quiescent and necrotic cells. The aim of this paper is to enhance the simulation of these three types of models. The first model represents the transport of drugs by coupled partial differential equations (PDEs) of 3D parabolic type in a cylindrical coordinate system. This model is integrated with non-Newtonian flow equations, with blood as the transport medium, and with the magnetic force on the magnetic nanoparticles. The interaction between the magnetic force and the drug with magnetic properties produces induced currents, and the applied magnetic field yields forces that tend to slow the movement of blood and bring the drug to the cancer cells. Nanoscale devices allow the drug to leave the blood vessels, spread through the tissue and reach the cancer cells. The second model describes the transport of drug nanoparticles from the vascular system to a single cell. The treatment of the vascular system involves the identification of parameters such as magnetic nanoparticle targeted delivery, blood flow, momentum transport, density and viscosity of the drug and blood medium, intensity of the magnetic fields and the radius of the capillary. Based on two discretization techniques, the finite difference method (FDM) and the finite element method (FEM), the set of integrated models is transformed into a series of grid points to obtain a large system of equations. The third model is a single-cell density model involving three sets of first-order PDEs for proliferating, quiescent and necrotic cells changing over time and space in Cartesian coordinates, regulated by different rates of nutrient consumption. The model shows that proliferative and quiescent cell growth depends on some parameter changes and that the necrotic cells emerge as the tumor core. Some numerical schemes for solving the system of equations are compared and analyzed. Simulation and computation of the discretized model are supported by Matlab and the C programming language on a single processing unit. Numerical results and analysis of the algorithms are presented through informative tables, multiple graphs and multidimensional visualization. In conclusion, the integration of the three types of mathematical models and the comparison of their numerical performance indicate a superior tool and analysis for solving the complete magnetic drug delivery system, which has significant effects on the growth of the targeted cancer cells.

Keywords: mathematical modeling, visualization, PDE models, magnetic nanoparticle drug delivery model, drug release model, single cell effects, avascular tumor growth, numerical analysis

Procedia PDF Downloads 428
7358 Gastronomy: The Preferred Digital Business Models and Impacts in Business Economics within Hospitality, Tourism, and Catering Sectors through Online Commerce

Authors: John Oupa Hlatshwayo

Abstract:

Background: There seem to be preferred digital business models, with varying impacts, within the hospitality, tourism and catering sub-sectors explored through online commerce, all of which are ingrained in the business economics domain. Aim: The study aims to establish whether such phenomena (digital business models) exist, and to what extent if any, within the hospitality, tourism and catering industries, respectively. Setting: This is a qualitative study conducted by exploring four institutions globally through case studies. Method: This research used explanatory case studies to answer 'how' or 'why' questions where the researcher has little control over the occurrence of events. It is qualitative research combining deductive and inductive methods. Hence, a comprehensive approach to analyzing the qualitative data was attainable through immersion, by reading to understand the information. Findings: The results corroborated the notion that digital business models are applicable, by and large, in business economics. Thus, three sectors in which enterprises operate in the business economics sphere have been narrowed down, i.e., hospitality, tourism and catering; these are also referred to as triangular polygons due to their atypical nature of being 'stand-alone' yet 'sub-sectors', but there are confounding factors to consider. Conclusion: The significance of digital business models and digital transformation shows an inevitable merger between business and technology within hospitality, tourism, and catering. Contribution: This symbiotic relationship of business and technology, the persistent evolution of clients' interface with end-products, the forever changing market, and the current adaptation and adjustment to the 'new world order' by enterprises must be embraced constantly, without fail, by business practitioners, academics, business students, organizations and governments.

Keywords: digital business models, hospitality, tourism, catering, business economics

Procedia PDF Downloads 22
7357 Predicting Stack Overflow Accepted Answers Using Features and Models with Varying Degrees of Complexity

Authors: Osayande Pascal Omondiagbe, Sherlock A. Licorish

Abstract:

Stack Overflow is a popular community question and answer portal which is used by practitioners to solve technology-related challenges during software development. Previous studies have shown that this forum is becoming a substitute for official software programming languages documentation. While tools have looked to aid developers by presenting interfaces to explore Stack Overflow, developers often face challenges searching through many possible answers to their questions, and this extends the development time. To this end, researchers have provided ways of predicting acceptable Stack Overflow answers by using various modeling techniques. However, less interest is dedicated to examining the performance and quality of typically used modeling methods, and especially in relation to models’ and features’ complexity. Such insights could be of practical significance to the many practitioners that use Stack Overflow. This study examines the performance and quality of various modeling methods that are used for predicting acceptable answers on Stack Overflow, drawn from 2014, 2015 and 2016. Our findings reveal significant differences in models’ performance and quality given the type of features and complexity of models used. Researchers examining classifiers’ performance and quality and features’ complexity may leverage these findings in selecting suitable techniques when developing prediction models.
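
A condensed sketch of the kind of comparison reported above, contrasting a simple classifier on shallow features with a more complex one; the feature names and synthetic data are hypothetical stand-ins for the real Stack Overflow dump.

import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 2000
answers = pd.DataFrame({
    "answer_length": rng.integers(20, 3000, n),           # hypothetical features
    "code_blocks": rng.integers(0, 6, n),
    "answerer_reputation": rng.integers(1, 100_000, n),
    "minutes_after_question": rng.integers(1, 10_000, n),
})
# Synthetic label loosely tied to the features, only to make the sketch runnable.
logit = (0.0005 * answers["answerer_reputation"] - 0.001 * answers["minutes_after_question"]
         + 0.5 * answers["code_blocks"] - 20)
accepted = (rng.random(n) < 1 / (1 + np.exp(-logit / 10))).astype(int)

for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                    ("random forest", RandomForestClassifier(n_estimators=200, random_state=0))]:
    auc = cross_val_score(model, answers, accepted, cv=5, scoring="roc_auc").mean()
    print(f"{name:20s} mean ROC AUC = {auc:.3f}")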

Keywords: feature selection, modeling and prediction, neural network, random forest, stack overflow

Procedia PDF Downloads 132
7356 Agile Methodology for Modeling and Design of Data Warehouses -AM4DW-

Authors: Nieto Bernal Wilson, Carmona Suarez Edgar

Abstract:

Organizations have structured and unstructured information in different formats, sources, and systems. Part of this information comes from ERP systems under OLTP processing that support the information system; however, at the OLAP processing level these organizations present some deficiencies. Part of the problem lies in the lack of interest in extracting knowledge from their data sources, as well as the absence of operational capabilities to tackle this kind of project. Data warehouses and their applications are considered non-proprietary tools, which are of great interest to business intelligence, since they are base repositories for creating models or patterns (behavior of customers, suppliers, products, social networks and genomics) and facilitate corporate decision making and research. This paper presents a simple, structured methodology inspired by agile development models such as Scrum, XP and AUP, together with object-relational models, spatial data models, and a baseline of data modeling under UML and Big Data; in this way it seeks to deliver an agile methodology for the development of data warehouses that is simple and easy to apply. The methodology naturally takes into account processes for information analysis, visualization and data mining, particularly for pattern generation and for models derived from the structured fact objects.

Keywords: data warehouse, model data, big data, object fact, object relational fact, process developed data warehouse

Procedia PDF Downloads 412
7355 Designing Agile Product Development Processes by Transferring Mechanisms of Action Used in Agile Software Development

Authors: Guenther Schuh, Michael Riesener, Jan Kantelberg

Abstract:

Due to the fugacity of markets and the reduction of product lifecycles, manufacturing companies from high-wage countries are nowadays faced with the challenge to place more innovative products on the market within ever shorter development times. At the same time, volatile customer requirements have to be satisfied in order to successfully differentiate from market competitors. One potential approach to address these challenges is provided by agile values and principles. These agile values and principles have already proved their success within software development projects in the form of management frameworks like Scrum or concrete procedure models such as Extreme Programming or Crystal Clear. Those models lead to significant improvements regarding quality, costs and development time and are therefore used within most software development projects. Motivated by the success within the software industry, manufacturing companies have tried to transfer agile mechanisms of action to the development of hardware products ever since. Though first empirical studies show similar effects in the agile development of hardware products, no comprehensive procedure model for the design of development iterations has yet been developed for hardware development, due to the different constraints of the domains. For this reason, this paper focuses on the design of agile product development processes by transferring mechanisms of action used in agile software development towards product development. This is conducted by decomposing the individual systems 'product development' and 'agile software development' into relevant elements and then symbiotically composing the elements of both systems with respect to the design of agile product development processes. In a first step, existing product development processes are described following existing approaches of system theory. By analyzing existing case studies from industrial companies as well as academic approaches, characteristic objectives, activities and artefacts are identified within a target, action and object system. In partial model two, mechanisms of action are derived from existing procedure models of agile software development. These mechanisms of action are classified into a superior strategy level, a system level comprising characteristic, domain-independent activities and their cause-effect relationships, and an activity-based element level. Within partial model three, the influence of the identified agile mechanisms of action on the characteristic system elements of product development processes is analyzed. For this reason, the target, action and object systems of product development are compared with the strategy, system and element levels of the agile mechanisms of action using graph theory. Furthermore, the necessity of the existence of activities within an iteration can be determined by defining activity-specific degrees of freedom. Based on this analysis, agile product development processes are designed in the form of different types of iterations in a last step. By defining iteration-differentiating characteristics and their interdependencies, a logic for the configuration of activities, their form of execution as well as relevant artefacts for the specific iteration is developed. Furthermore, characteristic types of iteration for agile product development are identified.

Keywords: activity-based process model, agile mechanisms of action, agile product development, degrees of freedom

Procedia PDF Downloads 208
7354 Identifying Coloring in Graphs with Twins

Authors: Souad Slimani, Sylvain Gravier, Simon Schmidt

Abstract:

Recently, several vertex-identifying notions were introduced (identifying coloring, lid-coloring, ...); these notions were inspired by identifying codes. All of them, as well as the original identifying codes, are based on separating two vertices according to some conditions on their closed neighborhoods. Therefore, twins cannot be identified, so most known results focus on twin-free graphs. Here, we show how twins can modify the optimal value of vertex-identifying parameters for identifying coloring and locally identifying coloring.
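
A small illustration, using networkx on a toy graph, of why twins obstruct identification: two vertices with identical closed neighborhoods can never be separated by any condition on closed neighborhoods.

import networkx as nx

G = nx.Graph([(1, 2), (1, 3), (2, 3), (3, 4), (4, 5)])   # vertices 1 and 2 are closed twins

def closed_neighborhood(G, v):
    return set(G[v]) | {v}

closed_twins = [(u, v) for u in G for v in G
                if u < v and closed_neighborhood(G, u) == closed_neighborhood(G, v)]
print("closed twins:", closed_twins)   # [(1, 2)] -- no neighborhood-based coloring can distinguish them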

Keywords: identifying coloring, locally identifying coloring, twins, separating

Procedia PDF Downloads 148
7353 Using Traffic Micro-Simulation to Assess the Benefits of Accelerated Pavement Construction for Reducing Traffic Emissions

Authors: Sudipta Ghorai, Ossama Salem

Abstract:

Pavement maintenance, repair, and rehabilitation (MRR) processes may have considerable environmental impacts due to traffic disruptions associated with work zones. The simulation models in use to predict work zone emissions have mostly been static emission factor models (SEFD). These models calculate emissions based on average operating conditions, e.g., average speed and type of vehicles. Although they produce accurate results for large-scale planning studies, they are not suitable for analyzing driving conditions at the micro level, such as acceleration, deceleration, idling, cruising, and queuing in a work zone. The purpose of this study is to prepare a comprehensive work zone environmental assessment (WEA) framework to calculate the emissions caused by disrupted traffic by integrating traffic microsimulation tools with emission models. This will help highway officials assess the benefits of accelerated construction and opt for the most suitable TMP, not only economically but also from an environmental point of view.

Keywords: accelerated construction, pavement MRR, traffic microsimulation, congestion, emissions

Procedia PDF Downloads 449
7352 Performance Evaluation of Using Genetic Programming Based Surrogate Models for Approximating Simulation of Complex Geochemical Transport Processes

Authors: Hamed K. Esfahani, Bithin Datta

Abstract:

Transport of reactive chemical contaminant species in groundwater aquifers is a complex and highly non-linear physical and geochemical process, especially for real-life scenarios. Simulating this transport process involves solving complex nonlinear equations and generally requires huge computational time for a given aquifer study area. Development of optimal remediation strategies in aquifers may require repeated solution of such complex numerical simulation models. To overcome this computational limitation and improve the computational feasibility of a large number of repeated simulations, Genetic Programming based trained surrogate models are developed to approximately simulate such complex transport processes. The transport of acid mine drainage, a hazardous pollutant, is first simulated using a numerical simulation model, HYDROGEOCHEM 5.0, for a contaminated aquifer in a historic mine site. The simulation model solution results for an illustrative contaminated aquifer site are then approximated by training and testing a Genetic Programming (GP) based surrogate model. Performance evaluation of the ensemble GP models as surrogate models for the reactive species transport in groundwater demonstrates the feasibility of their use and the associated computational advantages. The results show the efficiency and feasibility of using ensemble GP surrogate models as approximate simulators of complex hydrogeologic and geochemical processes in a contaminated groundwater aquifer, incorporating uncertainties in the historic mine site.

Keywords: geochemical transport simulation, acid mine drainage, surrogate models, ensemble genetic programming, contaminated aquifers, mine sites

Procedia PDF Downloads 278
7351 Discrete Choice Modeling in Education: Evaluating Early Childhood Educators’ Practices

Authors: Michalis Linardakis, Vasilis Grammatikopoulos, Athanasios Gregoriadis, Kalliopi Trouli

Abstract:

Discrete choice models belong to the family of conjoint analysis methods that are applied to the preferences of respondents towards a set of scenarios describing alternative choices. The scenarios are pre-designed to cover all the attributes of the alternatives that may affect the choices. In this study, we examine how preschool educators integrate physical activities into their everyday teaching practices through the use of discrete choice models. One of the advantages of discrete choice models compared to other, more traditional data collection methods (e.g., questionnaires and interviews that use ratings) is that the respondent is asked to select among competitive and realistic alternatives, rather than objectively rate each attribute that the alternatives may have. We present the effort to construct and choose representative attributes that would cover all possible choices of the respondents, and the scenarios that have arisen. For the purposes of the study, we used a sample of 50 preschool educators in Greece who responded to 4 scenarios (from the total of 16 scenarios that the orthogonal design produced), with each scenario having three alternative teaching practices. Seven attributes of the alternatives were used in the scenarios. For the analysis of the data, we used a multinomial logit model with random effects, a multinomial probit model and a generalized mixed logit model. The conclusions drawn from the estimated parameters of the models are discussed.
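
A compact sketch of the conditional-logit choice probabilities underlying the estimated models: each scenario offers three alternative teaching practices described by attributes, utilities are linear in the attributes, and choice probabilities follow a softmax. The attribute levels and taste coefficients are hypothetical, not the fitted estimates.

import numpy as np

# Three alternatives in one scenario, described by (duration in minutes,
# indoor = 1 / outdoor = 0, requires equipment = 1/0) -- hypothetical attributes.
alternatives = np.array([[10.0, 1.0, 0.0],
                         [20.0, 0.0, 1.0],
                         [30.0, 0.0, 0.0]])
beta = np.array([0.05, -0.4, -0.8])              # hypothetical taste coefficients

utility = alternatives @ beta
prob = np.exp(utility) / np.exp(utility).sum()   # P(choose j) = exp(V_j) / sum_k exp(V_k)
for j, p in enumerate(prob, start=1):
    print(f"alternative {j}: choice probability = {p:.3f}")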

Keywords: conjoint analysis, discrete choice models, educational data, multivariate statistical analysis

Procedia PDF Downloads 465
7350 Forecasting Model for Rainfall in Thailand: Case Study Nakhon Ratchasima Province

Authors: N. Sopipan

Abstract:

In this paper, we study the rainfall time series of weather stations in Nakhon Ratchasima province, Thailand, using various statistical methods to analyse the behaviour of rainfall in the study areas. Time-series analysis is an important tool in modelling and forecasting rainfall. ARIMA and Holt-Winters models based on exponential smoothing were built. All the models proved to be adequate. Therefore, they could give information that can help decision makers establish strategies for the proper planning of agriculture, drainage systems and other water resource applications in Nakhon Ratchasima province. We found that the best-performing forecasting model is ARIMA(1,0,1)(1,0,1)12.
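
A minimal sketch of fitting and forecasting the selected ARIMA(1,0,1)(1,0,1)12 specification with statsmodels; the monthly rainfall series below is synthetic, standing in for the Nakhon Ratchasima station records.

import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
months = pd.date_range("2000-01", periods=180, freq="MS")
seasonal = 120 + 100 * np.sin(2 * np.pi * (months.month - 4) / 12)   # synthetic wet/dry season shape
rainfall = pd.Series(np.clip(seasonal + rng.normal(0, 30, len(months)), 0, None), index=months)

model = SARIMAX(rainfall, order=(1, 0, 1), seasonal_order=(1, 0, 1, 12), trend="c")
fit = model.fit(disp=False)
print(fit.summary().tables[0])
print(fit.forecast(steps=12).round(1))   # next 12 months of forecast rainfall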

Keywords: ARIMA Models, exponential smoothing, Holt-Winter model

Procedia PDF Downloads 300
7349 From Myth to Screen: A Cultural Criticism of the Adaptation of Nordic Mythology in Marvel Cinematic Universe’s Thor Trilogy

Authors: Vathya Anindita Putri, Henny Saptatia Drajati Nugrahani

Abstract:

This research aims to explore the representation of Nordic mythology in the commercial film "Thor", produced by the Marvel Cinematic Universe. First, it examines the adaptation and representation of Nordic mythology in "Thor" compared to other media. Second, it considers the importance of the mise en scène technique, the comprehensive portrayal of Nordic mythology, and the audience's experience in enjoying the film. This research is conducted using qualitative methods. The two research questions are analyzed using three theories: adaptation theory by Robert Stam, mise en scène theory by Jean-Luc Godard, and cultural criticism theory by Michel Foucault. Robert Stam emphasizes the importance of the social and historical context in understanding film adaptations. Film adaptations always occur in a specific cultural and historical context; therefore, authors and producers must consider these factors when creating a successful adaptation. Jean-Luc Godard uses the "politique des auteurs" approach to understand that films are not just cultural products made for entertainment, but works of art by authors and directors. It is important to explore how authors and directors convey their ideas and emotions in their films, in this case a film set in Nordic mythology. Foucault takes an approach to analyzing power that considers how power operates and influences social relationships in a specific context. Foucault's theory is used to analyze how the representation of Nordic mythology is used as an instrument of power by the Marvel Cinematic Universe to influence how the audience views Nordic mythology. The initial finding of this research is that the fusion of Nordic mythology with modern superhero storytelling in the film "Thor", produced by Marvel, is successful. The film contains conflicts of the modern world and represents the symbolism of Nordic mythology. The rich and interesting atmosphere of Nordic mythology is presented through epic battle scenes, captivating characters, and the use of visual effects that make the film more vivid and real.

Keywords: adaptation theory, cultural criticism theory, film criticism, Marvel cinematic universe, Mise en Scene theory, Nordic mythology

Procedia PDF Downloads 87
7348 A Computational Framework for Decoding Hierarchical Interlocking Structures with SL Blocks

Authors: Yuxi Liu, Boris Belousov, Mehrzad Esmaeili Charkhab, Oliver Tessmann

Abstract:

This paper presents a computational solution for designing reconfigurable interlocking structures that are fully assembled with SL Blocks. Formed by S-shaped and L-shaped tetracubes, SL Block is a specific type of interlocking puzzle. Analogous to molecular self-assembly, the aggregation of SL blocks will build a reversible hierarchical and discrete system where a single module can be numerously replicated to compose semi-interlocking components that further align, wrap, and braid around each other to form complex high-order aggregations. These aggregations can be disassembled and reassembled, responding dynamically to design inputs and changes with a unique capacity for reconfiguration. To use these aggregations as architectural structures, we developed computational tools that automate the configuration of SL blocks based on architectural design objectives. There are three critical phases in our work. First, we revisit the hierarchy of the SL block system and devise a top-down-type design strategy. From this, we propose two key questions: 1) How to translate 3D polyominoes into SL block assembly? 2) How to decompose the desired voxelized shapes into a set of 3D polyominoes with interlocking joints? These two questions can be considered the Hamiltonian path problem and the 3D polyomino tiling problem. Then, we derive our solution to each of them based on two methods. The first method is to construct the optimal closed path from an undirected graph built from the voxelized shape and translate the node sequence of the resulting path into the assembly sequence of SL blocks. The second approach describes interlocking relationships of 3D polyominoes as a joint connection graph. Lastly, we formulate the desired shapes and leverage our methods to achieve their reconfiguration within different levels. We show that our computational strategy will facilitate the efficient design of hierarchical interlocking structures with a self-replicating geometric module.
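
A small sketch of the first method: build an undirected graph over the voxels of a target shape (face-adjacent voxels share an edge) and search for a Hamiltonian path, whose node sequence then orders the SL-block assembly. The brute-force depth-first search below is purely illustrative, not the paper's optimized construction.

from itertools import product

voxels = set(product(range(3), range(2), range(1)))    # a small 3 x 2 x 1 voxelized slab as the target shape

def neighbors(v):
    x, y, z = v
    steps = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    return [(x + dx, y + dy, z + dz) for dx, dy, dz in steps
            if (x + dx, y + dy, z + dz) in voxels]

def hamiltonian_path(path, visited):
    if len(path) == len(voxels):           # every voxel visited exactly once
        return path
    for nxt in neighbors(path[-1]):
        if nxt not in visited:
            found = hamiltonian_path(path + [nxt], visited | {nxt})
            if found:
                return found
    return None

for start in voxels:                       # try every voxel as the starting block
    path = hamiltonian_path([start], {start})
    if path:
        print("assembly sequence:", path)
        break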

Keywords: computational design, SL-blocks, 3D polyomino puzzle, combinatorial problem

Procedia PDF Downloads 130
7347 Experimental Study of Energy Absorption Efficiency (EAE) of Warp-Knitted Spacer Fabric Reinforced Foam (WKSFRF) Under Low-Velocity Impact

Authors: Amirhossein Dodankeh, Hadi Dabiryan, Saeed Hamze

Abstract:

Using fabrics to reinforce composites considerably improves their mechanical properties, including resistance to impact loads and the energy absorption of the composites. Warp-knitted spacer fabrics (WKSF) are fabrics consisting of two layers of warp-knitted fabric connected by pile yarns. These connections create a space between the layers filled by the pile yarns and give the fabric a three-dimensional shape. Today, because of the unique properties of spacer fabrics, they are widely used in the transportation, construction, and sports industries. Polyurethane (PU) foams are commonly used as energy absorbers, but WKSF has much better moisture transfer and compressive properties and lower heat resistance than PU foam. It therefore seems that warp-knitted spacer fabric reinforced PU foam (WKSFRF) can lead to the production and use of a composite with better energy absorption than the foam, enhanced mould formation, and improved mechanical properties. In this paper, the energy absorption efficiency (EAE) of WKSFRF under low-velocity impact is investigated experimentally. The contribution of each structural parameter of the WKSF to the absorption of impact energy has also been investigated. For this purpose, WKSF with different structures, namely two different thicknesses, small and large mesh sizes, and meshes positioned facing each other or not facing each other, were produced. Then six types of composite samples with different structural parameters were fabricated. The physical properties of the samples, such as weight per unit area and fiber volume fraction, were measured for three samples of each type of composite. Low-velocity impact with an initial energy of 5 J was carried out on three samples of each type of composite. The output of the low-velocity impact test is an acceleration-time (A-T) curve containing many noisy data points; in order to achieve appropriate results, these points were removed using the FILTFILT function of MATLAB R2018a. Using Newton's laws, a force-displacement (F-D) curve was derived from the A-T curve. The amount of energy absorbed is equal to the area under the F-D curve. The results show that the maximum energy absorption is 2.858 J, which corresponds to the samples reinforced with fabric with large mesh, high thickness, and meshes not facing each other. An index called energy absorption efficiency was defined as the absorbed energy of a composite divided by its fiber volume fraction. Using this index, the best EAE among the samples is 21.6, which occurs in the sample with large mesh, high thickness, and meshes facing each other. The EAE of this sample is also 15.6% better than the average EAE of the other composite samples. Generally, the energy absorption increased on average by 21.2% with increasing thickness, by 9.5% with increasing mesh size from small to large, and by 47.3% by changing the position of the meshes from facing to non-facing.
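
A short sketch of the post-processing pipeline described above, translated to Python: zero-phase filtering of the acceleration-time signal (SciPy's filtfilt mirrors MATLAB's FILTFILT), force and displacement from Newton's second law, absorbed energy as the area under the F-D curve, and EAE as that energy divided by the fiber volume fraction. The signal, impactor mass and volume fraction are synthetic placeholders.

import numpy as np
from scipy.integrate import trapezoid
from scipy.signal import butter, filtfilt

fs, mass, v0 = 20_000.0, 5.0, 1.414          # sampling rate (Hz), impactor mass (kg), speed for 5 J (m/s)
t = np.arange(0.0, 0.02, 1.0 / fs)
accel = -60.0 * np.sin(np.pi * t / 0.02) + np.random.default_rng(0).normal(0.0, 4.0, t.size)  # noisy synthetic A-T signal, m/s2

b, a = butter(4, 1000.0 / (fs / 2.0))        # 4th-order low-pass at 1 kHz
accel_smooth = filtfilt(b, a, accel)         # zero-phase filtering, as with MATLAB's FILTFILT

force = mass * (-accel_smooth)               # contact force on the specimen, N
velocity = v0 + np.cumsum(accel_smooth) / fs
displacement = np.cumsum(velocity) / fs      # integrate twice for impactor displacement, m

energy_absorbed = trapezoid(force, displacement)   # area under the F-D curve, J
fiber_volume_fraction = 0.13                       # hypothetical value for one sample type
print(f"absorbed energy = {energy_absorbed:.2f} J, EAE = {energy_absorbed / fiber_volume_fraction:.1f}")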

Keywords: composites, energy absorption efficiency, foam, geometrical parameters, low-velocity impact, warp-knitted spacer fabric

Procedia PDF Downloads 171
7346 Interoperability Maturity Models for Consideration When Using School Management Systems in South Africa: A Scoping Review

Authors: Keneilwe Maremi, Marlien Herselman, Adele Botha

Abstract:

The main purpose and focus of this paper is to determine the Interoperability Maturity Models to consider when using School Management Systems (SMS). The importance of this is to inform and help schools know which Interoperability Maturity Model is best suited to their SMS. To address this purpose, the paper applies a scoping review to ensure that all aspects are covered. The scoping review includes papers written from 2012-2019, and a comparison of the different types of Interoperability Maturity Models is discussed in detail, including the background information, the levels of interoperability, and the areas of consideration in each Maturity Model. The literature was obtained from the following databases: IEEE Xplore and Scopus; the following search engines were used: Harzings and Google Scholar. The topic of the paper was used as a search term for the literature, and the term 'Interoperability Maturity Models' was used as a keyword. The data were analyzed in terms of the definition of interoperability, Interoperability Maturity Models, and levels of interoperability. The results provide a table that shows the focus area of concern for each Maturity Model (based on the scoping review, in which only 24 papers out of the 740 publications initially identified in the field were found to be best suited for the paper). This resulted in the most discussed Interoperability Maturity Models for consideration: the Information Systems Interoperability Maturity Model (ISIMM) and the Organizational Interoperability Maturity Model for C2 (OIM).

Keywords: interoperability, interoperability maturity model, school management system, scoping review

Procedia PDF Downloads 209
7345 Representation of Female Experiences by Upcoming African Women Writers: A Case Study of Three Post-2000 South African Narratives

Authors: Liberty Takudzwa Nyete

Abstract:

This paper examines the representation of women's experiences in relation to womanhood as depicted by three selected South African female authors. The study examines the challenges, difficulties and strategies used by various female characters to deal with situations in a typical apartheid and post-apartheid society. It also explores the way in which gender, race and class discourses are treated in the selected texts. The three authors, born and bred at the peak of the anti-apartheid movement and women's protest against patriarchy, witnessed the effects of apartheid on both their families and societies at large, which could perhaps have influenced their writing. The study is informed by both the feminist and womanist ideologies postulated by different theorists. In particular, the study of Not Woman Enough considers issues of motherhood, womanhood and racism; that of Shameless focuses on the importance of women's narration of their own stories, sexuality and racism; and the depiction of sexual violence, class, and women's roles in the fight against oppression is explored with regard to This Book Betrays My Brother. Thus, the study concludes on the social transformations that include women in all spheres of life, such as education and the economy, which were largely dominated by men but are no longer defined by economic status, physical attributes, class or sexuality.

Keywords: apartheid, feminism, prostitution, sexual violence, womanism, womanhood

Procedia PDF Downloads 245
7344 Models, Methods and Technologies for Protection of Critical Infrastructures from Cyber-Physical Threats

Authors: Ivan Župan

Abstract:

Critical infrastructure is essential for the functioning of a country and is designated for special protection by governments worldwide. Due to the increase in smart technology usage in every facet of industry, including critical infrastructure, exposure to malicious cyber-physical attacks has grown in the last few years. Proper security measures must be undertaken in order to defend against cyber-physical threats that can disrupt the normal functioning of critical infrastructure and, consequently, the functioning of the country. This paper provides a review of the scientific literature on models, methods and technologies used to protect industries from cyber-physical threats. The literature was examined from three aspects. The first aspect, resilience, concerns itself with the robustness of the system's defense against threats, as well as preparation for and education about potential future threats. The second aspect concerns security risk management for systems with cyber-physical aspects, and the third aspect investigates available testbed environments for testing developed models on scaled models of vulnerable infrastructure.

Keywords: critical infrastructure, cyber-physical security, smart industry, security methodology, security technology

Procedia PDF Downloads 77
7343 Narrating 1968: Felipe Cazals’ Canoa (1976) and Images of Massacre

Authors: Nancy Elizabeth Naranjo Garcia

Abstract:

Canoa (1976) by Felipe Cazals is a film that exposes the consequences of the power that the Mexican State exercised over the 1968 student movement. The film, in this particular way, approaches the Tlatelolco Massacre from a point of view that takes into consideration the events that led up to it. Nonetheless, the reference to the political tension in Canoa remains ambiguous. Thus, the cinematographic representation refers to an event that leaves space for reflection, and as a consequence leaves evidence of an image that signals the notion of survival, as Georges Didi-Huberman points out. In addition to denouncing the oppressive force of the Mexican State, the images in Canoa also emphasize what did not happen in Tlatelolco and its condensation with the student activists. To observe the images that Canoa offers in a new light, this work proposes further exploration with the following questions: How do the images in Canoa narrate? How are the images inserted in the film? In this fashion, a more profound comprehension of the objective and the essence of the images becomes feasible. As a result, it is possible to analyze the images of Canoa alongside the literature on the real killing at San Miguel Canoa. The film visualizes a testimony of the event that once seemed unimaginable, an image that anticipates and structures the event that followed. Therefore, this study takes a second look at how Canoa considers not only the killing at San Miguel Canoa and the Tlatelolco Massacre, but goes further to contextualize an unimaginable image.

Keywords: cinematographic representation, student movement, Tlatelolco Massacre, unimaginable image

Procedia PDF Downloads 222
7342 Mental Health Representation in Video Games

Authors: Leonid Rybakovski

Abstract:

Contemporary media offer a variety of themes for the diverse tastes of their audiences. The digital games medium was long perceived mostly as an instrument of entertainment. But being part of global trends, constantly pushing the boundaries of storytelling in virtual reality, and standing on the edge of technology also brings huge responsibility for game designers around the globe. A topic that has emerged over the last years is the individual's mental state. In recent years there has been a shift in the representation of mental problems in commercial game releases such as Hellblade: Senua's Sacrifice and Sea of Solitude. The aim of this study is to research the approach to mental illness representation in media and digital games over the years and to suggest alternatives for putting characters who suffer from mental illness at the forefront of the storyline. This study traces dominant representations of characters with mental illness in digital games, reflecting the major change of the game industry toward inclusiveness. At the same time, the research embraces a hybrid approach to the academic study of digital games and includes the development of a game that follows a post-traumatic young girl, forcing the users to live her life through her eyes. The game prototype was developed as part of the MDes Game Design and Development program and consisted of academic research and game development practices.

Keywords: framing analysis, mental condition, up keying, game mechanics

Procedia PDF Downloads 174
7341 Comparative Analysis of Effecting Factors on Fertility by Birth Order: A Hierarchical Approach

Authors: Ali Hesari, Arezoo Esmaeeli

Abstract:

Given the dramatic changes in fertility and higher-order births in Iran during recent decades, knowledge about the factors affecting different birth orders is of crucial importance. In this study, in line with the hierarchical structure of much social science data, the effects of variables at different levels of social phenomena that determine the different birth orders occurring in the 365 days preceding the 1390 census are explored using a multilevel approach. In this paper, a 2% sample of individual-level records from the 1390 census is analyzed with HLM software. Three different hierarchical linear regression models are estimated: for the first and second, the third, and the fourth and higher birth orders. The results show different outcomes for the three models. The individual-level variables entered in the equations are region of residence (rural/urban), age, educational level and labor force participation status, and the province-level variable is GDP per capita. The results show that the individual-level variables have different effects in these three models, and that at the second level the models have different random and fixed effects.
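
A brief sketch of a two-level model of the kind estimated here (individuals nested in provinces), with the individual-level covariates and the province-level GDP term entering as fixed effects and a random intercept for province; the dataframe columns and synthetic data are hypothetical stand-ins for the census variables.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n, n_provinces = 3000, 31
province = rng.integers(0, n_provinces, n)
gdp = rng.normal(0, 1, n_provinces)                     # standardized province GDP per capita (synthetic)
data = pd.DataFrame({
    "province": province,
    "urban": rng.integers(0, 2, n),                     # hypothetical stand-ins for the census variables
    "age": rng.integers(15, 50, n),
    "education": rng.integers(0, 4, n),
    "employed": rng.integers(0, 2, n),
    "gdp": gdp[province],
})
# Hypothetical outcome: births of the given order in the reference year.
data["birth"] = (0.3 - 0.05 * data["education"] - 0.1 * data["urban"]
                 + 0.05 * data["gdp"] + rng.normal(0, 0.5, n))

model = smf.mixedlm("birth ~ urban + age + education + employed + gdp",
                    data, groups=data["province"])      # random intercept per province
result = model.fit()
print(result.summary())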

Keywords: fertility, birth order, hierarchical approach, fixed effects, random effects

Procedia PDF Downloads 339