Search results for: real gas model

19859 On Parameter Estimation of Simultaneous Linear Functional Relationship Model for Circular Variables

Authors: N. A. Mokhtar, A. G. Hussin, Y. Z. Zubairi

Abstract:

This paper proposes a new simultaneous simple linear functional relationship model under the assumption of equal error variances. We derive the maximum likelihood estimates of the parameters in the simultaneous model together with their covariance matrix. A simulation study shows small bias values for the parameter estimates, suggesting the suitability of the estimation method. As an illustration, the proposed simultaneous model is applied to real wind direction and wave direction data measured by two different instruments.

Keywords: simultaneous linear functional relationship model, Fisher information matrix, parameter estimation, circular variables

Procedia PDF Downloads 334
19858 A Mean–Variance–Skewness Portfolio Optimization Model

Authors: Kostas Metaxiotis

Abstract:

Portfolio optimization is one of the most important topics in finance. This paper proposes a mean–variance–skewness (MVS) portfolio optimization model. Traditionally, the portfolio optimization problem is solved within the mean–variance (MV) framework. In this study, we formulate the proposed model as a three-objective optimization problem in which the portfolio's expected return and skewness are maximized while the portfolio risk is minimized. To solve the proposed three-objective portfolio optimization model, we apply an adapted version of the non-dominated sorting genetic algorithm II (NSGA-II). Finally, we validate the proposed model using a real dataset from the FTSE-100.
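
As a rough illustration of the three objectives involved, the sketch below (hypothetical Python, not the authors' NSGA-II implementation) evaluates expected return, variance, and skewness for randomly sampled long-only portfolios and keeps the non-dominated ones; the asset returns, portfolio counts, and Dirichlet sampling are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
returns = rng.normal(0.001, 0.02, size=(500, 10))   # hypothetical daily returns for 10 assets

def portfolio_objectives(w, r):
    """Return (expected return, variance, skewness) of portfolio weights w."""
    p = r @ w
    mu, var = p.mean(), p.var()
    skew = ((p - mu) ** 3).mean() / var ** 1.5
    return mu, var, skew

def dominates(a, b):
    """True if objective vector a dominates b (maximize return and skewness, minimize variance)."""
    ga, gb = (a[0], -a[1], a[2]), (b[0], -b[1], b[2])
    return all(x >= y for x, y in zip(ga, gb)) and any(x > y for x, y in zip(ga, gb))

# sample random long-only portfolios and keep the non-dominated (Pareto) ones
weights = rng.dirichlet(np.ones(returns.shape[1]), size=300)
objs = [portfolio_objectives(w, returns) for w in weights]
pareto = [i for i in range(len(weights))
          if not any(dominates(objs[j], objs[i]) for j in range(len(weights)) if j != i)]
print(f"{len(pareto)} non-dominated portfolios out of {len(weights)}")
```

An NSGA-II style algorithm replaces the random sampling with selection, crossover, and mutation driven by the same dominance relation.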

Keywords: evolutionary algorithms, portfolio optimization, skewness, stock selection

Procedia PDF Downloads 169
19857 Second Order Cone Optimization Approach to Two-stage Network DEA

Authors: K. Asanimoghadam, M. Salahi, A. Jamalian

Abstract:

Data envelopment analysis is an approach to measuring the efficiency of decision-making units with multiple inputs and outputs. Many decision-making units also contain decision-making subunits that are not considered in most data envelopment analysis models. In addition, the inputs and outputs of decision-making units are usually considered desirable, while in some real-world problems the nature of some inputs or outputs is undesirable. In this paper, we study the evaluation of the efficiency of two-stage decision-making units, where some outputs are undesirable, using two non-radial models, the SBM and the ASBM models. We formulate the nonlinear ASBM model as a second order cone optimization problem. Finally, we compare the two models for both external and internal evaluation approaches on two real-world examples in the presence of undesirable outputs. The results show that, in both external and internal evaluations, the overall efficiency value of the ASBM model is greater than or equal to that of the SBM model, and in internal evaluation, the ASBM model is more flexible than the SBM model.
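
For background only, the sketch below sets up a standard input-oriented radial (CCR) DEA linear program with SciPy; it is not the paper's two-stage SBM or ASBM second order cone formulation, and the input/output numbers are made up.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o.
    X: (m, n) inputs, Y: (s, n) outputs, columns are DMUs."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                      # minimize theta
    A_in = np.c_[-X[:, [o]], X]                      # sum_j lambda_j x_ij <= theta * x_io
    A_out = np.c_[np.zeros((s, 1)), -Y]              # sum_j lambda_j y_rj >= y_ro
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.x[0]

# toy data: 2 inputs, 1 output, 4 DMUs (hypothetical numbers)
X = np.array([[4.0, 7.0, 8.0, 4.0],
              [3.0, 3.0, 1.0, 2.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])
print([round(ccr_efficiency(X, Y, o), 3) for o in range(X.shape[1])])
```

Efficient units score 1.0; the ASBM approach of the paper instead uses a non-radial, slacks-based objective recast as second order cone constraints.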

Keywords: network DEA, conic optimization, undesirable output, SBM

Procedia PDF Downloads 175
19856 A Data Envelopment Analysis Model in a Multi-Objective Optimization with Fuzzy Environment

Authors: Michael Gidey Gebru

Abstract:

Most Data Envelopment Analysis models operate in a static environment with input and output parameters taken as deterministic data. However, due to the ambiguity brought on by shifting market conditions, input and output data are not always precisely gathered in real-world scenarios. Fuzzy numbers can be used to address this kind of ambiguity in input and output data. Therefore, this work aims to extend crisp Data Envelopment Analysis to Data Envelopment Analysis in a fuzzy environment. In this study, the input and output data are regarded as triangular fuzzy numbers. Then, the Data Envelopment Analysis model with a fuzzy environment is solved using a multi-objective method to gauge the efficiency of the Decision Making Units. Finally, the developed Data Envelopment Analysis model is illustrated with an application to real data on 50 educational institutions.
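
A triangular fuzzy number is typically described by a lower bound, a modal value, and an upper bound, and handled through its alpha-cuts. The minimal sketch below illustrates that representation only; it is not the paper's multi-objective fuzzy DEA formulation, and the numbers are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class TriangularFuzzyNumber:
    """Triangular fuzzy number (l, m, u): lower bound, modal value, upper bound."""
    l: float
    m: float
    u: float

    def alpha_cut(self, alpha: float):
        """Interval of values with membership degree at least alpha (0 <= alpha <= 1)."""
        return (self.l + alpha * (self.m - self.l),
                self.u - alpha * (self.u - self.m))

# a fuzzy input observed as "about 120, somewhere between 100 and 150"
x = TriangularFuzzyNumber(100, 120, 150)
for a in (0.0, 0.5, 1.0):
    print(a, x.alpha_cut(a))
```

The interval endpoints of such alpha-cuts are what typically enter the lower- and upper-bound efficiency models of a fuzzy DEA approach.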

Keywords: efficiency, Data Envelopment Analysis, fuzzy, higher education, input, output

Procedia PDF Downloads 24
19855 Model Averaging in a Multiplicative Heteroscedastic Model

Authors: Alan Wan

Abstract:

In recent years, the body of literature on frequentist model averaging in statistics has grown significantly. Most of this work focuses on models with different mean structures but leaves out the variance consideration. In this paper, we consider a regression model with multiplicative heteroscedasticity and develop a model averaging method that combines maximum likelihood estimators of unknown parameters in both the mean and variance functions of the model. Our weight choice criterion is based on a minimisation of a plug-in estimator of the model average estimator's squared prediction risk. We prove that the new estimator possesses an asymptotic optimality property. Our investigation of finite-sample performance by simulations demonstrates that the new estimator frequently exhibits very favourable properties compared to some existing heteroscedasticity-robust model average estimators. The model averaging method hedges against the selection of very bad models and serves as a remedy to variance function misspecification, which often discourages practitioners from modeling heteroscedasticity altogether. The proposed model average estimator is applied to the analysis of two real data sets.
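
As a small illustration of the multiplicative heteroscedasticity setting (before any model averaging), the sketch below fits a linear mean function together with a log-linear variance function by maximum likelihood; the data-generating values and the choice of variance covariate are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 400
x = rng.normal(size=(n, 2))
z = x[:, :1]                                   # variance covariate (assumption)
beta_true, gamma_true = np.array([1.0, -0.5]), np.array([0.8])
sigma = np.exp(0.5 * z @ gamma_true)           # multiplicative heteroscedasticity
y = x @ beta_true + sigma * rng.normal(size=n)

def neg_loglik(theta):
    """Negative log-likelihood of y = x'beta + e, Var(e) = exp(z'gamma)."""
    beta, gamma = theta[:2], theta[2:]
    var = np.exp(z @ gamma)
    resid = y - x @ beta
    return 0.5 * np.sum(np.log(var) + resid ** 2 / var)

fit = minimize(neg_loglik, x0=np.zeros(3), method="BFGS")
print("beta, gamma estimates:", np.round(fit.x, 3))
```

Model averaging would combine several such candidate fits, with weights chosen to minimise a plug-in estimate of the squared prediction risk.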

Keywords: heteroscedasticity-robust, model averaging, multiplicative heteroscedasticity, plug-in, squared prediction risk

Procedia PDF Downloads 344
19854 Design and Development of a Platform for Analyzing Spatio-Temporal Data from Wireless Sensor Networks

Authors: Walid Fantazi

Abstract:

The development of sensor technology (such as microelectromechanical systems (MEMS), wireless communications, embedded systems, distributed processing, and wireless sensor applications) has contributed to a broad range of wireless sensor network (WSN) applications capable of collecting large amounts of spatiotemporal data in real time. These systems require real-time processing to store the data and to query them as they arrive. To cover these needs, we propose in this paper a Snapshot spatiotemporal data model based on object-oriented concepts. This model reduces storage requirements and data redundancy, which makes it easier to execute spatiotemporal queries and saves analysis time. Further, to ensure the robustness of the system and to eliminate congestion in main memory, we propose an in-RAM spatiotemporal indexing technique called Captree*. As a result, we offer an RIA (Rich Internet Application)-based SOA application architecture that allows remote monitoring and control.

Keywords: WSN, indexing data, SOA, RIA, geographic information system

Procedia PDF Downloads 232
19853 Comparative Study of Deep Reinforcement Learning Algorithm Against Evolutionary Algorithms for Finding the Optimal Values in a Simulated Environment Space

Authors: Akshay Paranjape, Nils Plettenberg, Robert Schmitt

Abstract:

Traditional optimization methods like evolutionary algorithms are widely used in production processes to find an optimal or near-optimal solution for control parameters based on the simulated environment space of a process. These algorithms are computationally intensive and therefore do not provide the opportunity for real-time optimization. This paper utilizes the Deep Reinforcement Learning (DRL) framework to find an optimal or near-optimal solution for control parameters. A model based on maximum a posteriori policy optimization (Hybrid-MPO) that can handle both numerical and categorical parameters is used as a benchmark for comparison. A comparative study shows that DRL can find optimal solutions of similar quality to those of evolutionary algorithms while requiring significantly less time, making it preferable for real-time optimization. The results are confirmed in a large-scale validation study on datasets from production and other fields. A trained XGBoost model is used as a surrogate for process simulation. Finally, multiple ways to improve the model are discussed.
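
The surrogate idea mentioned above can be sketched briefly: an XGBoost regressor trained on historical process data stands in for the simulation, and candidate control parameters are scored against it. The process function, parameter ranges, and random-search loop below are purely illustrative and are not the paper's Hybrid-MPO or DRL setup.

```python
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(2)

# hypothetical process: quality as an unknown function of two control parameters
X_hist = rng.uniform(0, 1, size=(1000, 2))
y_hist = -((X_hist[:, 0] - 0.3) ** 2 + (X_hist[:, 1] - 0.7) ** 2) + rng.normal(0, 0.01, 1000)

# train the surrogate on historical process data
surrogate = xgb.XGBRegressor(n_estimators=200, max_depth=4)
surrogate.fit(X_hist, y_hist)

# evaluate candidate control parameters against the surrogate instead of the real process
candidates = rng.uniform(0, 1, size=(5000, 2))
scores = surrogate.predict(candidates)
best = candidates[np.argmax(scores)]
print("best candidate parameters:", np.round(best, 3))
```

An optimizer (evolutionary or DRL-based) would replace the random candidate generation while keeping the surrogate as the cheap evaluation step.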

Keywords: reinforcement learning, evolutionary algorithms, production process optimization, real-time optimization, hybrid-MPO

Procedia PDF Downloads 84
19852 Kalman Filter for Bilinear Systems with Application

Authors: Abdullah E. Al-Mazrooei

Abstract:

In this paper, we present a new kind of bilinear system in state-space form. The evolution of this system depends on the product of the state vector with itself. The well-known Lotka-Volterra and Lorenz models are special cases of this new model. We also present a generalization of the Kalman filter which is suitable for the new bilinear model. An application to real measurements is introduced to illustrate the efficiency of the proposed algorithm.
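
To give a flavour of filtering such systems, the sketch below runs an extended Kalman filter on a toy state model with an elementwise quadratic (bilinear-type) term; the matrices, noise levels, and the use of a plain EKF rather than the authors' generalized filter are all assumptions of the example.

```python
import numpy as np

rng = np.random.default_rng(3)

# hypothetical bilinear-type model: x_{k+1} = A x_k + B (x_k * x_k) + w_k, z_k = H x_k + v_k
A = np.array([[0.9, 0.1], [0.0, 0.95]])
B = np.array([[0.01, 0.0], [0.0, -0.02]])
H = np.array([[1.0, 0.0]])
Q, R = 0.01 * np.eye(2), np.array([[0.04]])

def f(x):
    return A @ x + B @ (x * x)

def F_jac(x):
    return A + B @ np.diag(2 * x)        # Jacobian of f, used for covariance propagation

x_true = np.array([1.0, 0.5])
x_est, P = np.zeros(2), np.eye(2)
for _ in range(50):
    x_true = f(x_true) + rng.multivariate_normal(np.zeros(2), Q)
    z = H @ x_true + rng.multivariate_normal(np.zeros(1), R)
    # predict
    F = F_jac(x_est)
    x_est, P = f(x_est), F @ P @ F.T + Q
    # update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_est = x_est + K @ (z - H @ x_est)
    P = (np.eye(2) - K @ H) @ P
print("final estimate:", np.round(x_est, 3), "true:", np.round(x_true, 3))
```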

Keywords: bilinear systems, state space model, Kalman filter, application, models

Procedia PDF Downloads 408
19851 Real-Time Visualization Using GPU-Accelerated Filtering of LiDAR Data

Authors: Sašo Pečnik, Borut Žalik

Abstract:

This paper presents a real-time visualization technique and filtering of classified LiDAR point clouds. The visualization is capable of displaying filtered information organized in layers by the classification attribute saved within LiDAR data sets. We explain the used data structure and data management, which enables real-time presentation of layered LiDAR data. Real-time visualization is achieved with LOD optimization based on the distance from the observer without loss of quality. The filtering process is done in two steps and is entirely executed on the GPU and implemented using programmable shaders.

Keywords: filtering, graphics, level-of-details, LiDAR, real-time visualization

Procedia PDF Downloads 281
19850 Virtualization and Visualization Based Driver Configuration in Operating System

Authors: Pavan Shah

Abstract:

In embedded systems, virtualization and visualization technology can provide an effective and measurable contribution to the software development environment. Virtualization can also provide efficient resource sharing between real-time hardware applications and the host environment. A notable challenge of virtualization is minimizing I/O overhead, whether the technology is used in a software development environment (SDE), in the runtime environment of a real-time embedded system, or in a real-time operating system (RTOS). In this paper, we focus on the network I/O overhead generated by virtualization and visualization and on the implementation of standardized I/O (Virto), which can work as a front-end network driver in a real-time operating system (RTOS) hardware module. Although several studies exist on virtualized operating system environments and on Virto for general-purpose operating systems, our implementation targets the open-source Virto for a real-time operating system (RTOS). The measurement results show that the implementation can improve the bandwidth and latency of memory management in the real-time operating system environment, yielding more accurate results for the trained model.

Keywords: virtualization, visualization, network driver, operating system

Procedia PDF Downloads 109
19849 Interest Rate Prediction with Taylor Rule

Authors: T. Bouchabchoub, A. Bendahmane, A. Haouriqui, N. Attou

Abstract:

This paper presents simulation results for Forex prediction model equations in order to provide an approximate forecast of interest rates. First, Hall-Taylor (HT) equations have been used together with the Taylor rule (TR) to adapt them to the European and American Forex markets. The initial Taylor rule equation is conceived for all Forex transactions in every state: it includes only one equation and six parameters. Here, the model has been used with the Hall-Taylor equations, initially comprising twelve equations, which have been reduced to only three equations. The analysis has been developed on the following base macroeconomic variables: real exchange rate, investment wages, anticipated inflation, realized inflation, real production, interest rates, production gap, and potential production. This model has been used to study specifically the impact of an inflation shock on key policy interest rates.
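
For reference, the baseline single-equation Taylor rule that the adaptation starts from is commonly written as i = r* + π + a_π(π − π*) + a_y(y − y*). A minimal sketch, with the conventional 0.5 coefficients and a 2% neutral real rate assumed for illustration:

```python
def taylor_rule(inflation, target_inflation, output_gap,
                neutral_real_rate=2.0, a_pi=0.5, a_y=0.5):
    """Standard Taylor (1993) rule: nominal policy rate in percent.

    inflation, target_inflation: year-over-year inflation rates (%)
    output_gap: (real GDP - potential GDP) / potential GDP, in %
    """
    return (neutral_real_rate + inflation
            + a_pi * (inflation - target_inflation)
            + a_y * output_gap)

# example: inflation 3%, target 2%, output 1% above potential
print(taylor_rule(3.0, 2.0, 1.0))  # -> 6.0
```

The HT adaptation described in the abstract extends this single equation with additional equations and parameters for the European and American markets.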

Keywords: interest rate, Forex, Taylor rule, production, European Central Bank (ECB), Federal Reserve System (FED).

Procedia PDF Downloads 506
19848 A Novel Approach of Power Transformer Diagnostic Using 3D FEM Parametrical Model

Authors: M. Brandt, A. Peniak, J. Makarovič, P. Rafajdus

Abstract:

This paper deals with a novel approach to power transformer diagnostics. The approach identifies the exact location and the range of a fault in the transformer and helps to reduce the operating costs related to handling the faulty transformer, its disassembly, and repair. The advantage of the approach is the possibility of simulating a healthy transformer as well as all faults that can occur in a transformer during its operation, without disassembling it, which is very expensive in practice. The approach is based on creating the frequency-dependent impedance of the transformer from sweep frequency response analysis measurements and from 3D FE parametrical modeling of the fault in the transformer. The parameters of the 3D FE model are the position and the range of the axial short circuit. Then, by comparing the frequency-dependent impedances of the parametrical models with the measured ones, the location and the range of the fault are identified. The approach was tested on a real transformer and showed high coincidence between the real fault and the simulated one.

Keywords: transformer, parametrical model of transformer, fault, sweep frequency response analysis, finite element method

Procedia PDF Downloads 457
19847 Partial Differential Equation-Based Modeling of Brain Response to Stimuli

Authors: Razieh Khalafi

Abstract:

The brain is the information processing centre of the human body. Stimuli in the form of information are transferred to the brain, and the brain then decides how to respond to them. In this research, we propose a new partial differential equation which analyses EEG signals and establishes a relationship between incoming stimuli and the brain's response to them. In order to test the proposed model, a set of external stimuli was applied to the model, and the model's outputs were checked against real EEG data. The results show that the model reproduces the EEG signal well. The proposed model is useful not only for modelling the EEG signal under external stimuli but also for modelling the brain response to internal stimuli.

Keywords: brain, stimuli, partial differential equation, response, EEG signal

Procedia PDF Downloads 533
19846 Knowledge Discovery from Production Databases for Hierarchical Process Control

Authors: Pavol Tanuska, Pavel Vazan, Michal Kebisek, Dominika Jurovata

Abstract:

The paper gives the results of a project oriented toward the use of knowledge discovered from production systems for the needs of hierarchical process control. One of the main project goals was the proposal of a knowledge discovery model for process control. Specific data mining methods and techniques were used for the defined process control problems. The gained knowledge was applied to a real production system, and the proposed solution was thereby verified. The paper documents how newly discovered knowledge can be applied in real hierarchical process control and specifies the opportunities for applying the proposed knowledge discovery model to hierarchical process control.

Keywords: hierarchical process control, knowledge discovery from databases, neural network, process control

Procedia PDF Downloads 455
19845 Authoring of Augmented Reality Manuals for Not Physically Available Products

Authors: Vito M. Manghisi, Michele Gattullo, Alessandro Evangelista, Enricoandrea Laviola

Abstract:

In this work, we compared two solutions for displaying a demo version of an Augmented Reality (AR) manual when the real product is not available, opting to replace it with its computer-aided design (CAD) model. AR has been proved effective in maintenance and assembly operations by many studies in the literature. However, most of them present solutions for existing products, usually converting old printed manuals into AR manuals. In this case, authoring consists of defining how to convey the existing instructions through AR. It is not a simple choice, and demo versions are created to test the quality of the design. However, this becomes impossible when the product is not physically available, as for new products. A solution could be creating an entirely virtual environment containing the product and the instructions. However, in this way, user interaction is completely different from that in the real application, so it would be hard to test the usability of the AR manual. This work aims to propose and compare two different solutions for displaying a demo version of an AR manual to support authoring when the product is not physically available. As a case study, we used an innovative semi-hermetic compressor that has not yet been produced. The applications were developed for a handheld device using Unity 3D. The main issue was how to show the compressor and attach instructions to it. In one approach, we used Vuforia natural feature tracking to attach a CAD model of the compressor to a 2D image, a 1:1-scale drawing of the top view of the CAD model. In this way, during the AR manual demonstration, the 3D model of the compressor is displayed on the user's device in place of the real compressor, and all the virtual instructions are attached to it. In the other approach, we first created a support application that shows the CAD model of the compressor on a marker. Then, we recorded a video of this application while moving around the marker, obtaining a video that shows the CAD model from every point of view. For the AR manual, we used the Vuforia model target (360° option) to track the CAD model of the compressor as if it were the real compressor. Then, during the demonstration, the video is shown on a fixed large screen, and the instructions in the AR manual are displayed attached to it. The main drawback of the first solution is that everyone working on the authoring of the AR manual must keep the printed image, but it shows the product at real scale, and interaction during the demonstration is very simple. The second solution does not need a printed marker during the demonstration, only a screen; still, the compressor model is resized, and interaction is awkward since the user has to play the video on the screen to rotate the compressor. The two solutions were evaluated together with the company, and the first one was preferred due to its more natural interaction.

Keywords: augmented reality, human computer interaction, operating instructions, maintenance, assembly

Procedia PDF Downloads 105
19844 On the Design of a Secure Two-Party Authentication Scheme for Internet of Things Using Cancelable Biometrics and Physically Unclonable Functions

Authors: Behnam Zahednejad, Saeed Kosari

Abstract:

The widespread deployment of the Internet of Things (IoT) has raised security and privacy issues in this environment. Designing a secure two-factor authentication scheme between the user and the server is still a challenging task. In this paper, we focus on Cancelable Biometrics (CB) as an authentication factor in IoT. We show that a previous CB-based scheme fails to provide real two-factor security and Perfect Forward Secrecy (PFS) and suffers from database attacks and user traceability. We then propose an improved scheme based on CB and Physically Unclonable Functions (PUF), which can provide real two-factor security, PFS, user unlinkability, and resistance to database attacks. In addition, Key Compromise Impersonation (KCI) resilience is achieved in our scheme. We also prove the security of the proposed scheme formally, using both the Real-Or-Random (RoR) model and the ProVerif analysis tool. Regarding usability, we conducted a performance analysis and showed that our scheme has a lower communication cost than the previous CB-based scheme. The computational cost of our scheme is also acceptable for the IoT environment.

Keywords: IoT, two-factor security, cancelable biometric, key compromise impersonation resilience, perfect forward secrecy, database attack, real-or-random model, ProVerif

Procedia PDF Downloads 76
19843 Elastic and Plastic Collision Comparison Using Finite Element Method

Authors: Gustavo Rodrigues, Hans Weber, Larissa Driemeier

Abstract:

The prediction of post-impact conditions and of the behavior of the bodies during impact has been the object of several collision models. The formulation generally used derives from Hertz's theory, dating from the 19th century. These models consider the repulsive force as proportional to the deformation of the bodies in contact and may also consider it proportional to the rate of deformation. The objective of the present work is to analyze the behavior of the bodies during impact using the Finite Element Method (FEM) with elastic and plastic material models. The main parameters to evaluate are the contact force, the contact time, and the deformation of the bodies. An advantage of the FEM approach is the possibility of applying plastic deformation to the model according to the material definition: the Johnson–Cook plasticity model is used, whose parameters are obtained through empirical tests on real materials. This model allows analyzing the permanent deformation caused by impact, a phenomenon observed in the real world depending on the forces applied to the body. These results are compared with each other and with the Hertz-theory-based model.
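
For reference, the classical Hertzian force-indentation law that FEM results are typically compared against can be sketched as follows; the sphere-sphere geometry and steel-like material constants are assumptions for the example.

```python
import numpy as np

def hertz_contact_force(delta, R1, R2, E1, E2, nu1, nu2):
    """Hertzian normal contact force between two elastic spheres: F = (4/3) E* sqrt(R) delta^(3/2).

    delta: approach (indentation depth), m
    R1, R2: sphere radii, m; E1, E2: Young's moduli, Pa; nu1, nu2: Poisson ratios
    """
    E_star = 1.0 / ((1 - nu1 ** 2) / E1 + (1 - nu2 ** 2) / E2)   # effective modulus
    R = 1.0 / (1.0 / R1 + 1.0 / R2)                              # effective radius
    return (4.0 / 3.0) * E_star * np.sqrt(R) * delta ** 1.5

# two 10 mm steel-like spheres pressed together by 10 micrometres
print(hertz_contact_force(10e-6, 0.01, 0.01, 210e9, 210e9, 0.3, 0.3))
```

The nonlinear exponent 3/2 is what distinguishes the Hertzian law from a simple linear spring, and plasticity models such as Johnson–Cook further depart from it by allowing permanent deformation.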

Keywords: collision, impact models, finite element method, Hertz Theory

Procedia PDF Downloads 150
19842 The Relationship between Urbanization and the Rapid Development of Real Estate Industry in China: Taking Chongqing as an Example

Authors: Deng Tingting

Abstract:

There is a very close interaction between the rapid development of the real estate industry and regional urbanization; in essence, the real estate problem can be boiled down to the problem of urbanization. The future growth of hundreds of millions of urban residents will determine the development of basic demand in the real estate market. At the same time, the practical problems of urbanization seriously restrict the healthy development of real estate itself. The two interact with each other through the industrial structure, the economic aggregate, regional population flows, and many other linkage factors. Through a case analysis of Chongqing, this paper finds that Chongqing's urbanization and the overall development level of its real estate industry are still in a stage of development and upgrading, and that their development potential and future room for growth are still very large. Therefore, from the perspective of the regional economy, studying the interaction between the two is of great significance for accelerating the process of urbanization in Chongqing, promoting the healthy development of the real estate industry, and promoting rapid growth of the regional economy.

Keywords: urbanization, demographics, real estate, interrelationships

Procedia PDF Downloads 105
19841 Stacking Ensemble Approach for Combining Different Methods in Real Estate Prediction

Authors: Sol Girouard, Zona Kostic

Abstract:

A home is often the largest and most expensive purchase a person makes. Whether the decision leads to a successful outcome is determined by a combination of critical factors. In this paper, we propose a method that efficiently handles all the factors in residential real estate and performs predictions over a feature space with high dimensionality while controlling for overfitting. The proposed method was built on gradient descent and boosting algorithms and uses a mixed optimizing technique to improve the prediction power. Since a single model usually cannot handle all the cases, our approach builds multiple models based on different subsets of the predictors. The algorithm was tested on 3 million homes across the U.S., and the experimental results demonstrate the efficiency of this approach by outperforming techniques currently used in forecasting prices. With everyday changes in the real estate market, our proposed algorithm capitalizes on new events, allowing more efficient predictions.
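
A generic stacking setup of the kind described, combining boosted and bagged base learners under a linear meta-learner, might look like the scikit-learn sketch below; the synthetic features, the particular base models, and the RidgeCV meta-learner are illustrative choices, not the authors' pipeline or their 3-million-home dataset.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor, StackingRegressor
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import cross_val_score

# stand-in tabular features (size, location indices, age, ...) and sale prices
X, y = make_regression(n_samples=2000, n_features=20, noise=10.0, random_state=0)

stack = StackingRegressor(
    estimators=[
        ("gbr", GradientBoostingRegressor(n_estimators=100, learning_rate=0.05)),
        ("rf", RandomForestRegressor(n_estimators=100)),
    ],
    final_estimator=RidgeCV(),   # meta-learner combines the base predictions
    cv=5,
)
scores = cross_val_score(stack, X, y, cv=3, scoring="r2")
print("cross-validated R^2:", np.round(scores, 3))
```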

Keywords: real estate prediction, gradient descent, boosting, ensemble methods, active learning, training

Procedia PDF Downloads 252
19840 Analyzing the Effects of Real Income and Biomass Energy Consumption on Carbon Dioxide (CO2) Emissions: Empirical Evidence from the Panel of Biomass-Consuming Countries

Authors: Eyup Dogan

Abstract:

This empirical study aims to analyze the impacts of real income and biomass energy consumption on the level of emissions in the EKC model for a panel of biomass-consuming countries over the period 1980-2011. Because we detect the presence of cross-sectional dependence and heterogeneity across countries in the analyzed data, we use panel estimation methods that are robust to cross-sectional dependence and heterogeneity. The CADF and CIPS panel unit root tests indicate that carbon emissions, real income, and biomass energy consumption are stationary in first differences. The LM bootstrap panel cointegration test shows that the analyzed variables are cointegrated. Results from the panel group-mean DOLS and panel group-mean FMOLS estimators show that an increase in biomass energy consumption decreases CO2 emissions and that the EKC hypothesis is validated. Therefore, countries are advised to boost their production and use of biomass energy to achieve a lower level of emissions.

Keywords: biomass energy, CO2 emissions, EKC model, heterogeneity, cross-sectional dependence

Procedia PDF Downloads 273
19839 Adaption Model for Building Agile Pronunciation Dictionaries Using Phonemic Distance Measurements

Authors: Akella Amarendra Babu, Rama Devi Yellasiri, Natukula Sainath

Abstract:

Whereas human beings can easily learn and adopt pronunciation variations, machines need training before being put into use. Humans also keep a minimal vocabulary, whose pronunciation variations are stored at the front of their memory for ready reference, while machines keep the entire pronunciation dictionary for ready reference. Supervised methods used for the preparation of pronunciation dictionaries take large amounts of manual effort, cost, and time and are not suitable for real-time use. This paper presents an unsupervised adaptation model for building agile and dynamic pronunciation dictionaries online. These methods mimic the human approach of learning new pronunciations in real time. A new algorithm for measuring sound distances, called Dynamic Phone Warping, is presented and tested. The performance of the system is measured using an adaptation model, and the precision metric is found to be better than 86 percent.
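
Phone-level distance measures of this kind are usually built on dynamic-programming alignment. The sketch below shows a plain edit-distance-style alignment between two phone sequences; the uniform costs and the toy ARPAbet-like phone strings are assumptions, and the paper's Dynamic Phone Warping algorithm may weight substitutions by phonemic similarity instead.

```python
def phone_distance(seq_a, seq_b, sub_cost=None):
    """Dynamic-programming alignment distance between two phone sequences.

    sub_cost(p, q) gives the substitution cost between phones p and q;
    by default 0 for identical phones and 1 otherwise.
    """
    if sub_cost is None:
        sub_cost = lambda p, q: 0.0 if p == q else 1.0
    m, n = len(seq_a), len(seq_b)
    d = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        d[i][0] = i
    for j in range(1, n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            d[i][j] = min(d[i - 1][j] + 1,                       # deletion
                          d[i][j - 1] + 1,                       # insertion
                          d[i - 1][j - 1] + sub_cost(seq_a[i - 1], seq_b[j - 1]))
    return d[m][n]

# two pronunciations of "tomato" in a toy phone set
print(phone_distance("T AH M EY T OW".split(), "T AH M AA T OW".split()))  # -> 1.0
```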

Keywords: pronunciation variations, dynamic programming, machine learning, natural language processing

Procedia PDF Downloads 150
19838 Marginalized Two-Part Joint Models for Generalized Gamma Family of Distributions

Authors: Mohadeseh Shojaei Shahrokhabadi, Ding-Geng (Din) Chen

Abstract:

Positive continuous outcomes with a substantial number of zero values and incomplete longitudinal follow-up are quite common in medical cost data. To jointly model semi-continuous longitudinal cost data and survival data and to provide marginalized covariate effect estimates, a marginalized two-part joint model (MTJM) has been developed for outcome variables with lognormal distributions. In this paper, we propose MTJM models for outcome variables from the generalized gamma (GG) family of distributions. The GG distribution constitutes a general family that includes, exactly or approximately, the most frequently used distributions such as the gamma, exponential, Weibull, and log-normal. In the proposed MTJM-GG model, the conditional mean from a conventional two-part model with a three-parameter GG distribution is parameterized to provide a marginal interpretation for the regression coefficients. In addition, MTJM-gamma and MTJM-Weibull are developed as special cases of MTJM-GG. To illustrate the applicability of the MTJM-GG, we applied the model to a set of real electronic health record data recently collected in Iran, and we provide SAS code for the application. The simulation results showed that when the outcome distribution is unknown or misspecified, which is usually the case in real data sets, the MTJM-GG consistently outperforms the other models. The GG family of distributions facilitates estimating a model with improved fit over the MTJM-gamma, standard Weibull, or log-normal distributions.

Keywords: marginalized two-part model, zero-inflated, right-skewed, semi-continuous, generalized gamma

Procedia PDF Downloads 154
19837 Real-Time Data Stream Partitioning over a Sliding Window in Real-Time Spatial Big Data

Authors: Sana Hamdi, Emna Bouazizi, Sami Faiz

Abstract:

In recent years, real-time spatial applications, like location-aware services and traffic monitoring, have become more and more important. Such applications result in dynamic environments where data as well as queries are continuously moving. As a result, there is a tremendous amount of real-time spatial data generated every day. The growth of the data volume seems to outpace the advance of our computing infrastructure. For instance, in real-time spatial Big Data, users expect to receive the results of each query within a short time period, regardless of the load on the system. But with a huge amount of real-time spatial data generated, the system performance degrades rapidly, especially in overload situations. To solve this problem, we propose the use of data partitioning as an optimization technique. Traditional horizontal and vertical partitioning can increase the performance of the system and simplify data management, but they remain insufficient for real-time spatial Big Data; they cannot deal with real-time and stream queries efficiently. Thus, in this paper, we propose a novel data partitioning approach for real-time spatial Big Data named VPA-RTSBD (Vertical Partitioning Approach for Real-Time Spatial Big Data). This contribution is an implementation of the Matching algorithm for traditional vertical partitioning. We first find the optimal attribute sequence by using the Matching algorithm. Then, we propose a new cost model for database partitioning that keeps the data amount of each partition within a balanced limit and provides parallel execution guarantees for the most frequent queries. VPA-RTSBD aims to obtain a real-time partitioning scheme and to deal with stream data. It improves the performance of query execution by maximizing the degree of parallel execution. This improves QoS (Quality of Service) in real-time spatial Big Data, especially with a huge volume of stream data. The performance of our contribution is evaluated via simulation experiments. The results show that the proposed algorithm is both efficient and scalable and that it outperforms comparable algorithms.
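
Vertical partitioning is usually driven by how closely attributes are used together across the frequent queries. The sketch below groups attributes whose query-usage columns are within a small Hamming distance of each other; the workload matrix, attribute names, threshold, and greedy grouping are illustrative assumptions and not the paper's Matching algorithm or cost model.

```python
import numpy as np

# attribute usage matrix: rows = frequent queries, columns = attributes
# usage[q, a] = 1 if query q reads attribute a (hypothetical workload)
usage = np.array([
    [1, 1, 0, 0, 1],   # q1
    [1, 1, 0, 0, 0],   # q2
    [0, 0, 1, 1, 0],   # q3
    [0, 0, 1, 1, 1],   # q4
])
attributes = ["id", "position", "temperature", "humidity", "timestamp"]

def hamming(u, v):
    """Hamming distance between two attribute usage vectors."""
    return int(np.sum(u != v))

# greedy grouping: an attribute joins the first partition whose seed column is close enough
threshold, partitions = 1, []
for a, col in enumerate(usage.T):
    for part in partitions:
        if hamming(col, usage.T[part[0]]) <= threshold:
            part.append(a)
            break
    else:
        partitions.append([a])

print([[attributes[a] for a in part] for part in partitions])
```

Attributes that are always read together end up in the same vertical fragment, so frequent queries touch fewer partitions and can run in parallel.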

Keywords: real-time spatial big data, quality of service, vertical partitioning, horizontal partitioning, matching algorithm, hamming distance, stream query

Procedia PDF Downloads 138
19836 Payment for Pain: Differences between Hypothetical and Real Preferences

Authors: J. Trarbach, S. Schosser, B. Vogt

Abstract:

Decision-makers tend to prefer the first alternative over subsequent alternatives, which is called the primacy effect. To reliably measure this effect, we conducted an experiment with real consequences for preference statements. We elicit subjects' preferences for two sequences of pain using a rating scale, i.e. hypothetical preferences, and willingness to pay, i.e. real preferences. Within these sequences, both the overall intensity and the duration of pain are identical. Hence, a rational decision-maker should be indifferent, whereas the primacy effect predicts a stronger preference for the first sequence. What we see is a primacy effect only for hypothetical preferences. This effect vanishes for real preferences.

Keywords: decision making, primacy effect, real incentives, willingness to pay

Procedia PDF Downloads 273
19835 The Contact Behaviors of Seals Under Combined Normal and Tangential Loading: A Multiscale Finite Element Contact Analysis

Authors: Runliang Wang, Jianhua Liu, Duo Jia, Xiaoyu Ding

Abstract:

The contact between sealing surfaces plays a vital role in guaranteeing the sealing performance of various seals. To date, analyses of sealing structures have rarely considered both the structural parameters (macroscale) and the surface roughness information (microscale) of sealing surfaces, due to the complex modeling process. Meanwhile, most contact analyses applied to seals have been conducted only under normal loading, which is still some distance from real loading conditions in engineering. In this paper, a multiscale rough contact model of the cone-cone seal was established, which takes both the macrostructural parameters of the seal and the surface roughness information of the sealing surfaces into consideration. Using the finite element method (FEM), combined normal and tangential loading was applied to the model to simulate the assembly process of the cone-cone seal. The evolution of the contact behaviors during the assembly process, such as the real contact area (RCA), the distribution of contact pressure, and the contact status, is studied in detail. The results showed a non-linear relationship between the RCA and the load, which is different from the normal loading cases. In addition, the evolution of the real contact area of cone-cone seals with isotropic and anisotropic rough surfaces is also compared quantitatively.

Keywords: contact mechanics, FEM, randomly rough surface, real contact area, sealing

Procedia PDF Downloads 160
19834 Constructing a Probabilistic Ontology from a DBLP Data

Authors: Emna Hlel, Salma Jamousi, Abdelmajid Ben Hamadou

Abstract:

Every knowledge representation model for real-world applications must be able to cope with the effects of uncertain phenomena. One of the main defects of classical ontologies is their inability to represent and reason with uncertainty. To remedy this defect, we propose a method to construct a probabilistic ontology that integrates uncertain information into an ontology modeling a set of DBLP (Digital Bibliography & Library Project) publications, using a probabilistic model.

Keywords: classical ontology, probabilistic ontology, uncertainty, Bayesian network

Procedia PDF Downloads 324
19833 Integration of Big Data to Predict Transportation for Smart Cities

Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin

Abstract:

Intelligent transportation systems are essential to building smarter cities. Machine learning based transportation prediction is a highly promising approach because it makes invisible aspects visible. In this context, this research aims to build a prototype model that predicts the transportation network by using big data and machine learning technology. In detail, among urban transportation systems, this research focuses on the bus system. The research problem is that the existing headway model cannot respond to dynamic transportation conditions; thus, bus delays often occur. To overcome this problem, a prediction model is presented that finds patterns of bus delay by applying machine learning to the following data sets: traffic, weather, and bus status. This research presents a flexible headway model to predict bus delay and analyzes the result. The prototype model is built on real-time bus data. The data are gathered through public data portals and real-time Application Program Interfaces (API) provided by the government. These data are fundamental resources for organizing interval pattern models of bus operations based on traffic environment factors (road speeds, station conditions, weather, and real-time bus operation information). The prototype model is designed with the machine learning tool RapidMiner Studio, and tests are conducted for bus delay prediction. This research presents experiments to increase the prediction accuracy for bus headway by analyzing urban big data. Big data analysis is important for predicting the future and finding correlations by processing huge amounts of data. Therefore, based on the analysis method, this research represents an effective use of machine learning and urban big data to understand urban dynamics.
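
Outside RapidMiner, the same kind of delay-prediction experiment can be sketched with a few lines of scikit-learn; the feature columns, the synthetic delay function, and the gradient-boosting regressor below are assumptions made only to show the shape of the workflow.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 5000

# hypothetical feature set mirroring the abstract: traffic, weather, and bus status
df = pd.DataFrame({
    "road_speed_kmh": rng.uniform(10, 60, n),
    "rain_mm": rng.exponential(1.0, n),
    "passengers_boarding": rng.poisson(8, n),
    "scheduled_headway_min": rng.choice([5, 10, 15], n),
})
# synthetic delay: slower roads, rain, and crowded stops increase it
delay = (30 / df["road_speed_kmh"] + 0.5 * df["rain_mm"]
         + 0.1 * df["passengers_boarding"] + rng.normal(0, 0.3, n))

X_tr, X_te, y_tr, y_te = train_test_split(df, delay, test_size=0.2, random_state=0)
model = GradientBoostingRegressor().fit(X_tr, y_tr)
print("MAE (minutes):", round(mean_absolute_error(y_te, model.predict(X_te)), 2))
```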

Keywords: big data, machine learning, smart city, social cost, transportation network

Procedia PDF Downloads 236
19832 Classification of Generative Adversarial Network Generated Multivariate Time Series Data Featuring Transformer-Based Deep Learning Architecture

Authors: Thrivikraman Aswathi, S. Advaith

Abstract:

There can be cases where the use of real data is limited, for example when it is hard to get access to a large volume of real data, and synthetic data generation is then needed. It produces high-quality synthetic data while maintaining the statistical properties of a specific dataset. In the present work, a generative adversarial network (GAN) is trained to produce multivariate time series (MTS) data, since MTS data are now being gathered more often in various real-world systems. Furthermore, the GAN-generated MTS data are fed into a transformer-based deep learning architecture that categorizes the data into predefined classes. The model is then evaluated across various distinct domains by generating the corresponding MTS data.
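
A transformer-based classifier for multivariate time series can be outlined in PyTorch as below; the layer sizes, mean-pooling head, and input dimensions are illustrative assumptions rather than the architecture used in the paper.

```python
import torch
import torch.nn as nn

class MTSTransformerClassifier(nn.Module):
    """Transformer encoder that classifies multivariate time series."""
    def __init__(self, n_features, n_classes, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.input_proj = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):                 # x: (batch, time, features)
        h = self.encoder(self.input_proj(x))
        return self.head(h.mean(dim=1))   # mean-pool over time, then classify

# e.g. batches of GAN-generated series: 8 channels, 100 time steps, 3 classes
model = MTSTransformerClassifier(n_features=8, n_classes=3)
x = torch.randn(16, 100, 8)
logits = model(x)
print(logits.shape)                       # torch.Size([16, 3])
```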

Keywords: GAN, transformer, classification, multivariate time series

Procedia PDF Downloads 100
19831 Real Estate Rigidities: The Effect of Cash Transactions and the Impact of Demonetisation on Them

Authors: Dishant Shahi, Aradhya Shandilya, Nand Kumar

Abstract:

We study here the impact of the black-money component, referred to as the X component in the text, on real estate transactions. The X component not only acts as a friction in transactions but also leads to dysfunction in the real estate capital market. The effect of the component is presented using a model economy involving property deals that seeks to resemble that of India. The rigidities which hinder smooth transactions in property or land deals are depicted, and their impact on the economy as a whole is modelled. The effect of the subprime crisis (2007) on the Indian housing capital market, and the role which the X component played during it, is also included in one of the sections. Throughout the text, we use four-quadrant (4Q) graphs to study the supply and demand causalities involved in commercial real estate. At the end, we include the impact of demonetisation, the latest move by the Indian Government to control the huge amount of black money in circulation, as a measure to counter the overvaluation of property assets arising from the X component, along with its impact on housing, rents, and the capital market.

Keywords: X-component, 4Q graph, real estate, capital markets, demonetisation, consumer sentiments

Procedia PDF Downloads 341
19830 Integrated Vegetable Production Planning Considering Crop Rotation Rules Using a Mathematical Mixed Integer Programming Model

Authors: Mohammadali Abedini Sanigy, Jiangang Fei

Abstract:

In this paper, a mathematical optimization model was developed to maximize profit in a vegetable production planning problem. It serves as a decision support system that assists farmers in land allocation to crops and in harvest scheduling decisions. The developed model can handle different rotation rules in two consecutive cycles of production, which is a common practice in organic production systems. Moreover, different production methods for the same crop were considered in the model formulation. The main strength of the model is that it is not restricted to predetermined production periods, which makes the planning more flexible. The model is classified as a mixed integer programming (MIP) model, formulated in PYOMO, a Python package for formulating optimization models, and solved via the Gurobi and CPLEX optimizer packages. The model was tested with secondary data from 'Australian vegetable growing farms', and the results of the computational test runs were obtained and discussed. The results show that the model can successfully provide reliable solutions for real-size problems.
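
A stripped-down Pyomo sketch of this kind of land-allocation MIP with a simple rotation rule is shown below; the crops, profits, land area, two-cycle structure, and the GLPK solver call are assumptions for illustration (any installed MIP solver such as Gurobi or CPLEX would also work), and the sketch omits the paper's harvest scheduling and production-method choices.

```python
from pyomo.environ import (ConcreteModel, Var, Objective, Constraint,
                           NonNegativeReals, Binary, maximize, SolverFactory)

crops, cycles = ["lettuce", "carrot", "broccoli"], [1, 2]
profit = {"lettuce": 900, "carrot": 600, "broccoli": 750}   # $/ha, hypothetical
total_land = 10.0                                           # ha

m = ConcreteModel()
m.area = Var(crops, cycles, domain=NonNegativeReals)        # ha allocated to crop c in cycle t
m.grow = Var(crops, cycles, domain=Binary)                  # 1 if crop c is grown in cycle t

m.obj = Objective(expr=sum(profit[c] * m.area[c, t] for c in crops for t in cycles),
                  sense=maximize)

# land availability per cycle
m.land = Constraint(cycles, rule=lambda m, t:
                    sum(m.area[c, t] for c in crops) <= total_land)
# area can be positive only if the crop is selected in that cycle
m.link = Constraint(crops, cycles, rule=lambda m, c, t:
                    m.area[c, t] <= total_land * m.grow[c, t])
# simple rotation rule: the same crop is not grown in both consecutive cycles
m.rotation = Constraint(crops, rule=lambda m, c: m.grow[c, 1] + m.grow[c, 2] <= 1)

SolverFactory("glpk").solve(m)                              # assumes a MIP solver is installed
for c in crops:
    print(c, [round(m.area[c, t].value, 1) for t in cycles])
```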

Keywords: crop rotation, harvesting, mathematical model formulation, vegetable production

Procedia PDF Downloads 162