Search results for: stochastic approximation
328 A Bayesian Network Approach to Customer Loyalty Analysis: A Case Study of Home Appliances Industry in Iran
Authors: Azam Abkhiz, Abolghasem Nasir
Abstract:
To achieve sustainable competitive advantage in the market, it is necessary to provide and improve customer satisfaction and loyalty. To reach this objective, companies need to identify and analyze their customers. Thus, it is critical to measure the level of customer satisfaction and loyalty very carefully. This study attempts to build a conceptual model that provides clear insights into customer loyalty. Using Bayesian networks (BNs), a model is proposed to evaluate customer loyalty and its consequences, such as repurchase and positive word of mouth. A BN is a probabilistic approach that predicts the behavior of a system based on observed stochastic events. The most relevant determinants of customer loyalty are identified through a literature review. Perceived value, service quality, trust, corporate image, satisfaction, and switching costs are the most important variables that explain customer loyalty. The data are collected through a questionnaire-based survey of 1430 customers of a home appliances manufacturer in Iran. Four scenarios and sensitivity analyses are performed to analyze the impact of the different determinants on customer loyalty. The proposed model allows businesses not only to set their targets but to proactively manage their customer behaviors as well.
Keywords: customer satisfaction, customer loyalty, Bayesian networks, home appliances industry
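The marginalization at the heart of such a Bayesian network can be sketched in a few lines. A minimal illustration for a toy Satisfaction → Loyalty → Repurchase chain, with made-up conditional probabilities (illustrative assumptions, not values estimated in the study):

```python
# Toy Bayesian-network inference by enumeration for the chain
# Satisfaction -> Loyalty -> Repurchase.
# All probabilities below are illustrative assumptions, not study values.

P_LOYAL_GIVEN_SAT = {True: 0.85, False: 0.30}         # P(Loyalty=1 | Satisfaction)
P_REPURCHASE_GIVEN_LOYAL = {True: 0.90, False: 0.25}  # P(Repurchase=1 | Loyalty)

def p_repurchase(satisfied):
    """P(Repurchase=1 | Satisfaction) by summing out the hidden Loyalty node."""
    total = 0.0
    for loyal in (True, False):
        p_loyal = (P_LOYAL_GIVEN_SAT[satisfied] if loyal
                   else 1.0 - P_LOYAL_GIVEN_SAT[satisfied])
        total += p_loyal * P_REPURCHASE_GIVEN_LOYAL[loyal]
    return total
```

The same enumeration generalizes to the full network of determinants (perceived value, trust, image, etc.) at the cost of one nested sum per hidden node.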
Procedia PDF Downloads 139
327 Competition and Cooperation of Prosumers in Cournot Games with Uncertainty
Authors: Yong-Heng Shi, Peng Hao, Bai-Chen Xie
Abstract:
Solar prosumers are playing increasingly prominent roles in the power system. However, their uncertainty affects the outcomes and functioning of the power market, especially in an asymmetric information environment. Therefore, an important issue is how to take effective measures to reduce the impact of uncertainty on market equilibrium. We propose a two-level stochastic differential game model to explore the Cournot decision problem of prosumers. In particular, we study the impact of punishment and cooperation mechanisms on the efficiency of the Cournot game in which prosumers face uncertainty. The results show that under fixed-rate and variable-rate penalty mechanisms, producers and consumers tend to take conservative actions to hedge risks, and the variable-rate mechanism is more reasonable. Compared with non-cooperative situations, prosumers can improve the efficiency of the game through cooperation, which we attribute to the superposition of market power and uncertainty reduction. In addition, a market environment of asymmetric information intensifies the role of uncertainty: it reduces social welfare but increases the income of prosumers. For regulators, promoting alliances is an effective measure to realize the integration, optimization, and stable grid connection of producers and consumers.
Keywords: Cournot games, power market, uncertainty, prosumer cooperation
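For readers unfamiliar with Cournot equilibria, the deterministic baseline of such a game is easy to reproduce. A minimal sketch with linear inverse demand and constant marginal cost (illustrative parameters only; the paper's two-level stochastic differential game is far richer):

```python
def cournot_best_response_iteration(a, b, c, n_iter=200):
    """Symmetric two-firm Cournot game with inverse demand p = a - b*(q1 + q2)
    and constant marginal cost c. Firm i's best response to rival output q_j is
    q_i = (a - c - b*q_j) / (2*b); alternating best responses converge to the
    Nash equilibrium q* = (a - c) / (3*b). Deterministic illustration only."""
    q1 = q2 = 0.0
    for _ in range(n_iter):
        q1 = max(0.0, (a - c - b * q2) / (2.0 * b))
        q2 = max(0.0, (a - c - b * q1) / (2.0 * b))
    return q1, q2
```

With a = 100, b = 1, c = 10, both firms converge to q* = (100 - 10)/3 = 30; uncertainty and penalty mechanisms perturb exactly this baseline.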
Procedia PDF Downloads 107
326 Elastic Deformation of Multistory RC Frames under Lateral Loads
Authors: Hamdy Elgohary, Majid Assas
Abstract:
Estimation of lateral displacements and interstory drifts represents a major step in multistory frame design. In the preliminary design stage, it is essential to perform a fast check of the expected values of the lateral deformations. This step helps to ensure compliance of the expected values with the design code requirements. Also, in some cases during or after the detailed design stage, a design reviewer may need to carry out a fast check of the lateral deformations. In the present paper, a parametric study is carried out on the factors affecting the lateral displacements of multistory frame buildings. Based on the results of the parametric study, simplified empirical equations are recommended for the direct determination of the lateral deflection of multistory frames. The results obtained using the recommended equations have been compared with the results obtained by finite element analysis. The comparison shows that the proposed equations lead to a good approximation for the estimation of the lateral deflection of multistory RC frame buildings.
Keywords: lateral deflection, interstory drift, approximate analysis, multistory frames
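A rough first-pass drift check of the kind the abstract motivates can be done with a shear-building idealization. A minimal sketch, assuming fixed-fixed columns so each story's lateral stiffness is the sum of 12EI/h³ over its columns (a generic textbook approximation, not the paper's fitted empirical equations):

```python
def lateral_displacements(story_shears, E, I_columns, h):
    """Shear-building drift check: story stiffness k = sum of 12*E*I/h**3 over
    the columns (both ends fixed), interstory drift = story shear / k, and floor
    displacements accumulate from the base upward.
    story_shears[0] is the bottom story; consistent units assumed throughout."""
    k = sum(12.0 * E * I / h ** 3 for I in I_columns)
    displacements, total = [], 0.0
    for shear in story_shears:
        total += shear / k   # add this story's drift to the running sum
        displacements.append(total)
    return displacements
```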
Procedia PDF Downloads 271
325 Photon Blockade in Non-Hermitian Optomechanical Systems with Nonreciprocal Couplings
Authors: J. Y. Sun, H. Z. Shen
Abstract:
We study photon blockade at exceptional points for a non-Hermitian optomechanical system coupled to a driven whispering-gallery-mode microresonator with two nanoparticles, under the weak optomechanical coupling approximation, where exceptional points emerge periodically as the relative angle of the nanoparticles is controlled. We find that conventional photon blockade occurs at exceptional points for the eigenenergy resonance of the single-excitation subspace driven by a laser field, and we discuss the physical origin of conventional photon blockade. Under the weak driving condition, we analyze the influence of the different parameters on conventional photon blockade. We investigate conventional photon blockade at nonexceptional points, where it exists at two optimal detunings because the eigenstates in the single-excitation subspace split from one (coalescence) at exceptional points into two at nonexceptional points. Unconventional photon blockade can occur at nonexceptional points, while it does not exist at exceptional points: since the two different quantum pathways to the two-photon state are not formed there, the required destructive quantum interference cannot occur. The realization of photon blockade in our proposal provides a viable and flexible way to prepare single-photon sources in non-Hermitian optomechanical systems.
Keywords: optomechanical systems, photon blockade, non-Hermitian, exceptional points
Procedia PDF Downloads 140
324 Revisiting the Fiscal Theory of Sovereign Risk from the DSGE View
Authors: Eiji Okano, Kazuyuki Inagaki
Abstract:
We revisit Uribe's 'Fiscal Theory of Sovereign Risk', which advocates that there is a trade-off between stabilizing inflation and suppressing default. We develop a class of dynamic stochastic general equilibrium (DSGE) models with nominal rigidities and compare two de facto inflation stabilization policies, optimal monetary policy and optimal monetary and fiscal policy, with a minimizing-interest-rate-spread policy that completely suppresses default. Under the optimal monetary and fiscal policy, not only the nominal interest rate but also the tax rate works to minimize welfare costs through stabilizing inflation. Under the optimal monetary and fiscal policy, both inflation and the output gap are completely stabilized, although they fluctuate under the optimal monetary policy alone. In addition, volatility in the default rate under the optimal monetary and fiscal policy is considerably lower than that under the optimal monetary policy. Thus, there is no trade-off between stabilizing inflation and suppressing default. Furthermore, while the minimizing-interest-rate-spread policy makes the inflation rate severely volatile, the optimal monetary and fiscal policy stabilizes both inflation and default. The trade-off between stabilizing inflation and suppressing default is not as severe as pointed out by Uribe.
Keywords: sovereign risk, optimal monetary policy, fiscal theory of the price level, DSGE
Procedia PDF Downloads 321
323 A Numerical Description of a Fibre Reinforced Concrete Using a Genetic Algorithm
Authors: Henrik L. Funke, Lars Ulke-Winter, Sandra Gelbrich, Lothar Kroll
Abstract:
This work reports an approach for the automatic adaptation of concrete formulations based on genetic algorithms (GAs) to optimize a wide range of different fit-functions. To achieve this goal, a method was developed that provides a numerical description of a fibre reinforced concrete (FRC) mixture regarding the production technology and the property spectrum of the concrete. In a first step, the FRC mixture with seven fixed components was characterized by varying the amounts of the components. For that purpose, ten concrete mixtures were prepared and tested. The testing procedure comprised flow spread, compressive strength, and bending tensile strength. The analysis and approximation of the measured data were carried out with GAs. The aim was to obtain a closed mathematical expression that best describes the given point cloud of FRC data by applying a Gene Expression Programming with Free Coefficients (GEP-FC) strategy. The seven-parameter FRC-mixture model generated according to this method correlated well with the measured data. The developed procedure can be used to find closed mathematical expressions for concrete mixtures based on measured data.
Keywords: concrete design, fibre reinforced concrete, genetic algorithms, GEP-FC
Procedia PDF Downloads 280
322 Bayesian Value at Risk Forecast Using Realized Conditional Autoregressive Expectile Model with an Application to Cryptocurrency
Authors: Niya Chen, Jennifer Chan
Abstract:
In the financial market, risk management helps to minimize potential losses and maximize profit. There are two ways to assess risk. The first is to calculate the risk directly based on volatility; the most common risk measurements are Value at Risk (VaR), the Sharpe ratio, and beta. Alternatively, we can look at quantiles of the return distribution to assess the risk. Popular return models such as GARCH and stochastic volatility (SV) focus on modeling the mean of the return distribution via capturing the volatility dynamics; the quantile/expectile method, however, gives an idea of the distribution at extreme return values and allows us to forecast VaR using the return directly. The advantage of these non-parametric methods is that they are not bound by the distribution assumptions of parametric methods. The difference between them is that the expectile uses a second-order loss function while quantile regression uses a first-order loss function. We consider several quantile functions, different volatility measures, and estimates from some volatility models. To estimate the expectile of the model, we use the Realized Conditional Autoregressive Expectile (CARE) model with a Bayesian method. We would like to see whether our proposed models outperform existing models for cryptocurrency, and we test this mainly on Bitcoin as well as Ethereum.
Keywords: expectile, CARE model, CARR model, quantile, cryptocurrency, Value at Risk
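The expectile's asymmetric least-squares definition can be computed by iteratively reweighted means. A minimal sketch for a plain sample expectile (the CARE model adds an autoregressive structure and realized volatility measures on top of this basic idea):

```python
def expectile(xs, tau, n_iter=200):
    """Sample tau-expectile by iteratively reweighted means: the expectile m
    solves m = sum(w_i * x_i) / sum(w_i) with w_i = tau for x_i > m and
    w_i = 1 - tau otherwise (asymmetric least squares). tau = 0.5 gives the
    ordinary mean; tau near 1 (or 0) emphasizes the upper (lower) tail."""
    m = sum(xs) / len(xs)          # start from the mean
    for _ in range(n_iter):
        w = [tau if x > m else 1.0 - tau for x in xs]
        m = sum(wi * xi for wi, xi in zip(w, xs)) / sum(w)
    return m
```

The second-order loss makes expectiles sensitive to the magnitude of tail returns, which is exactly the property the abstract contrasts against first-order quantile regression.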
Procedia PDF Downloads 109
321 Off-Farm Work and Cost Efficiency in Staple Food Production among Small-Scale Farmers in North Central Nigeria
Authors: C. E. Ogbanje, S. A. N. D. Chidebelu, N. J. Nweze
Abstract:
The study evaluated off-farm work and cost efficiency in staple food production among small-scale farmers in North Central Nigeria. A multistage sampling technique was used to select 360 respondents (participants and non-participants in off-farm work). The primary data obtained were analysed using a stochastic cost frontier and tests of means' differences. Capital input was lower for participants (N2,596.58) than for non-participants (N11,099.14). Gamma (γ) was statistically significant. Farm size significantly (p<0.01) increased cost outlay for participants and non-participants. Average input prices of enterprises one and two significantly (p<0.01) increased cost. Sex, household size, credit obtained, formal education, farming experience, and farm income significantly (p<0.05) reduced cost inefficiency for non-participants. Average cost efficiency was 11%. Farm capital was wasted. Participants' substitution of capital for labour did not put them at a disadvantage. Extension agents should encourage farmers to obtain financial relief from off-farm work, but not to the extent of endangering farm cost efficiency.
Keywords: cost efficiency, mean difference, North Central Nigeria, off-farm work, participants and non-participants, small-scale farmers
Procedia PDF Downloads 362
320 Optimizing the Passenger Throughput at an Airport Security Checkpoint
Authors: Kun Li, Yuzheng Liu, Xiuqi Fan
Abstract:
High security standards and high screening efficiency seem to contradict each other in the airport security check process. Improving efficiency as far as possible while maintaining the same security standard is significantly meaningful. This paper utilizes knowledge of operations research and stochastic processes to establish mathematical models to explore this problem. We analyze the current airport security check process and use the M/G/1 and M/G/k models from queuing theory to describe it. We find that the least efficient part, the bottleneck of the queuing system, is the Pre-Check lane. To improve passenger throughput and reduce the variance of passengers' waiting time, we adjust our models, apply the Monte Carlo method, and put forward three modifications: adjust the ratio of Pre-Check lanes to regular lanes flexibly, determine the optimal number of security screening lines based on cost analysis, and adjust the distributions of arrival and service times based on Monte Carlo simulation results. We also analyze the impact of cultural differences as a sensitivity analysis. Finally, we give recommendations for the current airport security check process.
Keywords: queuing theory, security check, stochastic process, Monte Carlo simulation
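For the M/G/1 building block mentioned above, the Pollaczek-Khinchine formula gives the mean waiting time in closed form. A minimal sketch of this generic queuing-theory result (not the paper's calibrated checkpoint model):

```python
def mg1_mean_wait(lam, mean_service, second_moment_service):
    """Pollaczek-Khinchine mean waiting time in queue for an M/G/1 system:
    Wq = lam * E[S^2] / (2 * (1 - rho)), with utilization rho = lam * E[S].
    lam is the Poisson arrival rate; E[S], E[S^2] describe the service time."""
    rho = lam * mean_service
    if rho >= 1.0:
        raise ValueError("unstable queue: utilization rho must be < 1")
    return lam * second_moment_service / (2.0 * (1.0 - rho))
```

With exponential service, E[S²] = 2·E[S]² recovers the M/M/1 result; deterministic service (E[S²] = E[S]²) halves the mean wait, which is why reducing service-time variance at a checkpoint matters as much as reducing its mean.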
Procedia PDF Downloads 200
319 Convex Restrictions for Outage Constrained MU-MISO Downlink under Imperfect Channel State Information
Authors: A. Preetha Priyadharshini, S. B. M. Priya
Abstract:
In this paper, we consider the MU-MISO downlink scenario under imperfect channel state information (CSI). The main issue with imperfect CSI is keeping the probability of each user's achievable-rate outage below a given threshold level. Such rate outage constraints present significant analytical challenges. Many probabilistic methods are used to solve the transmit optimization problem under imperfect CSI. Here, decomposition-based large deviation inequality and Bernstein-type inequality convex restriction methods are used to solve the optimization problem under imperfect CSI. These methods are used to achieve improved output quality and lower complexity. They provide a safe, tractable approximation of the original rate outage constraints. Based on implementations of these methods, performance has been evaluated in terms of feasible rate and average transmission power. The simulation results show that both methods offer significantly improved outage quality and lower computational complexity.
Keywords: imperfect channel state information, outage probability, multiuser multi-input single-output, channel state information
Procedia PDF Downloads 813
318 Approximating Maximum Speed on Road from Curvature Information of Bezier Curve
Authors: M. Yushalify Misro, Ahmad Ramli, Jamaludin M. Ali
Abstract:
Bezier curves have useful properties for the path generation problem; for instance, they can generate the reference trajectory for vehicles to satisfy path constraints. The algorithms join cubic Bezier curve segments smoothly to generate the path. One useful property of Bezier curves is curvature. In mathematics, curvature is the amount by which a geometric object deviates from being flat, or straight in the case of a line. Another example is a circle, where the curvature is equal to the reciprocal of its radius at any point on the circle. The smaller the radius, the higher the curvature and the more sharply the vehicle needs to bend. In this study, we use a Bezier curve to fit a highway-like curve. We use a different approach to find the best approximation for the curve so that it will resemble the highway-like curve. We compute the curvature by analytical differentiation of the Bezier curve, and we then compute the maximum driving speed using the curvature information obtained. Our research rests on some assumptions: first, that the Bezier curve estimates the real shape of the curve, which can be verified visually. Even though the fitting process of the Bezier curve does not interpolate exactly on the curve of interest, we believe that the estimation of speed is acceptable. We verified our result against a manual calculation of the curvature from the map.
Keywords: speed estimation, path constraints, reference trajectory, Bezier curve
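The analytical curvature computation described here is straightforward to reproduce for a single cubic Bezier segment. A minimal sketch; the comfort-limited speed formula v = sqrt(a_lat / κ) and the lateral-acceleration value are assumptions for illustration, not necessarily the paper's speed model:

```python
import math

def cubic_bezier_curvature(P, t):
    """Curvature of a 2-D cubic Bezier curve at parameter t, from the analytic
    first and second derivatives: k = |x'y'' - y'x''| / (x'^2 + y'^2)^(3/2).
    P is a list of four (x, y) control points."""
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = P
    u = 1.0 - t
    dx = 3.0 * (u * u * (x1 - x0) + 2.0 * u * t * (x2 - x1) + t * t * (x3 - x2))
    dy = 3.0 * (u * u * (y1 - y0) + 2.0 * u * t * (y2 - y1) + t * t * (y3 - y2))
    ddx = 6.0 * (u * (x2 - 2.0 * x1 + x0) + t * (x3 - 2.0 * x2 + x1))
    ddy = 6.0 * (u * (y2 - 2.0 * y1 + y0) + t * (y3 - 2.0 * y2 + y1))
    return abs(dx * ddy - dy * ddx) / (dx * dx + dy * dy) ** 1.5

def max_speed(P, t, lateral_accel=3.0):
    """Comfort-limited speed v = sqrt(a_lat / k); lateral_accel (m/s^2) is an
    assumed passenger-comfort limit, not a value from the paper."""
    k = cubic_bezier_curvature(P, t)
    return float("inf") if k == 0.0 else math.sqrt(lateral_accel / k)
```

As a sanity check, collinear control points give zero curvature, and the standard cubic approximation of a unit quarter circle gives curvature close to 1 at mid-parameter.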
Procedia PDF Downloads 375
317 Interactive Winding Geometry Design of Power Transformers
Authors: Paffrath Meinhard, Zhou Yayun, Guo Yiqing, Ertl Harald
Abstract:
Winding geometry design is an important part of power transformer electrical design. Conventionally, the winding geometry is designed manually, which is a time-consuming job because it involves many iteration steps in order to meet all cost, manufacturing, and electrical requirements. Here a method is presented that automatically generates the winding geometry for given user parameters and allows the user to interactively set and change parameters. To achieve this goal, the winding problem is transferred to a mixed integer nonlinear optimization problem. The relevant geometrical design parameters are defined as optimization variables. The cost and other requirements are modeled as constraints. For the solution, a stochastic ant colony optimization algorithm is applied. It is well known that an optimizer can get stuck in a local minimum. For the winding problem, we present efficient strategies to escape local minima; furthermore, a reduced variable search range helps to accelerate the solution process. Numerical examples show that the optimization result is delivered within seconds, such that the user can interactively change the variable search area and constraints to improve the design.
Keywords: ant colony optimization, mixed integer nonlinear programming, power transformer, winding design
Procedia PDF Downloads 380
316 Active Linear Quadratic Gaussian Secondary Suspension Control of Flexible Bodied Railway Vehicle
Authors: Kaushalendra K. Khadanga, Lee Hee Hyol
Abstract:
Passenger comfort has been paramount in the design of suspension systems of high-speed cars. To analyze the effect of vibration on vehicle ride quality, a vertical model of a six-degree-of-freedom railway passenger vehicle, with front and rear suspension, is built. It includes car body flexibility effects and vertical rigid modes. A second-order linear shaping filter is constructed to shape Gaussian white noise into random rail excitation. The temporal correlation between the front and rear wheels is given by a second-order Pade approximation. The complete track and vehicle model is then assembled. An active secondary suspension system based on a Linear Quadratic Gaussian (LQG) optimal control method is designed. The results show that the LQG control method reduces the vertical acceleration, pitching acceleration, and vertical bending vibration of the car body compared with the passive system.
Keywords: active suspension, bending vibration, railway vehicle, vibration control
Procedia PDF Downloads 260
315 Data Hiding by Vector Quantization in Color Image
Authors: Yung Gi Wu
Abstract:
With the growth of computers and networks, digital data can be spread anywhere in the world quickly. In addition, digital data can be copied or tampered with easily, so security has become an important topic in the protection of digital data. A digital watermark is a method to protect the ownership of digital data. Embedding the watermark will certainly influence the quality. In this paper, vector quantization (VQ) is used to embed the watermark into the image to achieve data hiding. This kind of watermarking is invisible, which means that users will not be conscious of the existence of the embedded watermark even though the embedded image differs only slightly from the original image. Meanwhile, VQ carries a heavy computational burden, so we adopt a fast VQ encoding scheme using partial distortion search (PDS) and a mean approximation scheme to speed up the data hiding process. The watermarks we hide in the image can be gray-level, bi-level, or color images. Text can also be used as a watermark. To test the robustness of the system, we use Photoshop to apply sharpening, cropping, and alteration to check whether the extracted watermark is still recognizable. Experimental results demonstrate that the proposed system can resist the above three kinds of tampering in general cases.
Keywords: data hiding, vector quantization, watermark, color image
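The partial distortion search speed-up works by abandoning a codeword early, as soon as its partial squared error already exceeds the best distortion found so far. A minimal sketch of PDS nearest-codeword search (generic VQ technique; the paper additionally combines it with a mean approximation scheme not shown here):

```python
def pds_nearest_codeword(vector, codebook):
    """Full-search VQ with partial distortion search (PDS): accumulate the
    squared error dimension by dimension and abandon a codeword as soon as
    the running sum can no longer beat the best distortion found so far.
    Returns (best_index, best_squared_distortion)."""
    best_index, best_dist = -1, float("inf")
    for index, codeword in enumerate(codebook):
        d = 0.0
        for v, c in zip(vector, codeword):
            d += (v - c) ** 2
            if d >= best_dist:
                break  # early abandon: this codeword cannot win
        else:
            best_index, best_dist = index, d
    return best_index, best_dist
```

The result is identical to brute-force full search; only the number of multiply-accumulate operations drops, which is what makes embedding into large images practical.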
Procedia PDF Downloads 364
314 Analytical Solutions to the N-Dimensional Schrödinger Equation with a Collective Potential Model to Study Energy Spectra and Thermodynamic Properties of Selected Diatomic Molecules
Authors: Benedict I. Ita, Etido P. Inyang
Abstract:
In this work, solutions of the N-dimensional Schrödinger equation with the screened modified Kratzer plus inversely quadratic Yukawa potential (SMKIQYP) have been obtained with the Greene-Aldrich approximation scheme using the Nikiforov-Uvarov method. The eigenvalues and the normalized eigenfunctions are obtained. We then apply the energy spectrum to study four diatomic molecules (HCl, N₂, NO, and CO). The results show that the energy spectra of these diatomic molecules increase as the quantum numbers increase. The energy equation was also used to calculate the partition function and other thermodynamic properties, and we predicted the partition functions of CO and NO. To check the accuracy of our work, the special cases of the collective potential (the modified Kratzer and screened modified Kratzer potentials) yield energy eigenvalues that agree excellently with the existing literature.
Keywords: Schrödinger equation, Nikiforov-Uvarov method, modified screened Kratzer, inversely quadratic Yukawa potential, diatomic molecules
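The Greene-Aldrich scheme replaces the centrifugal 1/r² term by an exponential expression so that the equation becomes solvable in closed form. A minimal numerical check of one commonly used form of the approximation (conventions vary between papers, so treat the exact form as an assumption):

```python
import math

def greene_aldrich_inv_r2(r, alpha):
    """One common form of the Greene-Aldrich approximation,
    1/r^2 ~ alpha^2 / (1 - exp(-alpha*r))^2, valid for alpha*r << 1.
    alpha is the screening parameter of the potential."""
    return alpha ** 2 / (1.0 - math.exp(-alpha * r)) ** 2

def relative_error(r, alpha):
    """Relative error of the approximation against the exact 1/r^2."""
    exact = 1.0 / r ** 2
    return abs(greene_aldrich_inv_r2(r, alpha) - exact) / exact
```

The error shrinks roughly linearly with alpha·r, which is why the scheme works well for the short-range screened potentials considered here.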
Procedia PDF Downloads 84
313 A Parallel Cellular Automaton Model of Tumor Growth for Multicore and GPU Programming
Authors: Manuel I. Capel, Antonio Tomeu, Alberto Salguero
Abstract:
Tumor growth from a transformed cancer cell up to a clinically apparent mass spans a range of spatial and temporal magnitudes. Through computer simulations, cellular automata (CA) can accurately describe the complexity of the development of tumors. Tumor development prognosis can be made without making patients undergo annoying medical examinations or painful invasive procedures if we develop appropriate CA-based software tools. In silico testing mainly refers to computational biology research studies applicable to clinical actions in medicine. Establishing sound computer-based models of cellular behavior certainly reduces costs and saves precious time with respect to carrying out experiments in vitro at labs or in vivo with living cells and organisms. These models aim to produce scientifically relevant results compared to traditional in vitro testing, which is slow, expensive, and does not generally have acceptable reproducibility under the same conditions. For speeding up computer simulations of cellular models, the literature shows recent proposals based on the CA approach that include advanced techniques, such as the clever use of efficient supporting data structures when modeling with deterministic or stochastic cellular automata. Multiparadigm and multiscale simulation of tumor dynamics is just beginning to be developed by the concerned research community. The use of stochastic cellular automata (SCA), whose parallel programming implementations are open to yielding high computational performance, is of much interest to explore up to its computational limits. There have been some optimization-based approaches to advancing multiparadigm models of tumor growth, which mainly pursue improving the performance of these models by guaranteeing efficient memory accesses or by considering the dynamic evolution of the memory space (grids, trees, etc.) that holds crucial data during simulations.
In our opinion, the different optimizations mentioned above are not decisive enough to achieve the high-performance computing power that cell-behavior simulation programs actually need. The possibility of using multicore and GPU parallelism as a promising multiplatform framework for developing new programming techniques to speed up the computation time of simulations has only started to be explored in the last few years. This paper presents a model that incorporates parallel processing, identifying the synchronization necessary for speeding up tumor growth simulations implemented in Java and C++ programming environments. The speed-up improvement provided by specific parallel syntactic constructs, such as executors (thread pools) in Java, is studied. The new parallel tumor growth model is tested using implementations in the Java and C++ languages on two different platforms: an Intel Core i-X chipset and an HPC cluster of processors at our university. The parallelization of the Polesczuk and Enderling model (commonly used by researchers in mathematical oncology) proposed here is analyzed with respect to performance gain. We intend to apply the model and the overall parallelization technique presented here to solid tumors of specific affiliation such as prostate, breast, or colon. Our final objective is to set up a multiparadigm model capable of modelling angiogenesis, the growth inhibition induced by chemotaxis, and the effect of therapies based on the presence of cytotoxic/cytostatic drugs.
Keywords: cellular automaton, tumor growth model, simulation, multicore and manycore programming, parallel programming, high performance computing, speed up
Procedia PDF Downloads 244
312 Loading Factor Performance of a Centrifugal Compressor Impeller: Specific Features and Way of Modeling
Authors: K. Soldatova, Y. Galerkin
Abstract:
A loading factor performance is necessary for modeling a centrifugal compressor gas dynamic performance curve. Measured loading factors are a linear function of the flow coefficient at the impeller exit. The performance does not depend on the compressibility criterion. To simulate loading factor performances, the authors present two parameters: the loading factor at zero flow rate and the angle between the ordinate and the performance line. The calculated non-viscous loading factor performances are also linear and close to the experimental performances. Loading factor performances were studied for several dozen impellers with different blade exit angles, blade thicknesses and numbers, ratios of blade exit/inlet height, and two different types of blade mean-line configuration. Some trends of influence are evident: the influence of blade thickness is comparatively small, and the influence of the geometry parameters is greater for impellers with bigger blade exit angles, etc. Approximating equations for both parameters are suggested. The next phase of the work will be simulating experimental performances with the suggested approximation equations as a base.
Keywords: loading factor performance, centrifugal compressor, impeller, modeling
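The two-parameter description of a linear loading factor performance can be recovered from measured points by least squares. A minimal sketch (the sign and angle conventions here are assumptions, not necessarily the authors'):

```python
import math

def fit_loading_factor_line(phi, psi):
    """Ordinary least-squares line psi = psi0 + slope * phi through measured
    loading factors. Returns the zero-flow loading factor psi0, the slope, and
    the acute angle (radians) between the ordinate and the performance line,
    i.e. the two parameters proposed in the abstract (conventions assumed)."""
    n = float(len(phi))
    mean_phi = sum(phi) / n
    mean_psi = sum(psi) / n
    sxx = sum((x - mean_phi) ** 2 for x in phi)
    sxy = sum((x - mean_phi) * (y - mean_psi) for x, y in zip(phi, psi))
    slope = sxy / sxx
    psi0 = mean_psi - slope * mean_phi
    angle = math.atan2(1.0, abs(slope))  # measured from the psi-axis
    return psi0, slope, angle
```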
Procedia PDF Downloads 349
311 CT Images Based Dense Facial Soft Tissue Thickness Measurement by Open-source Tools in Chinese Population
Authors: Ye Xue, Zhenhua Deng
Abstract:
Objectives: Facial soft tissue thickness (FSTT) data can be obtained from CT scans by measuring face-to-skull distances at sparsely distributed anatomical landmarks located manually on the face and skull. However, automated measurement at dense points on 3D facial and skull models using open-source software has become a viable option due to the development of computer-assisted imaging technologies. By utilizing dense FSTT information, it becomes feasible to generate plausible automated facial approximations. Therefore, establishing a comprehensive, detailed, and densely calculated FSTT database is crucial for enhancing the accuracy of facial approximation. Materials and methods: This study utilized head CT scans from 250 Chinese adults of Han ethnicity, with 170 participants originally born and residing in northern China and 80 participants in southern China. The age of the participants ranged from 14 to 82 years, and all samples were divided into five non-overlapping age groups. Additionally, samples were divided into three categories based on BMI information. The 3D Slicer software was utilized to segment bone and soft tissue based on different Hounsfield unit (HU) thresholds, and surface models of the face and skull were reconstructed for all samples from the CT data. The following procedures were performed using MeshLab: converting the face models into hollowed, cropped surface models and automatically measuring the Hausdorff distance (referred to as FSTT) between the skull and face models. Hausdorff point clouds were colorized based on depth value and exported as PLY files. A histogram of the depth distributions could be viewed and subdivided into smaller increments. All PLY files were visualized to show the Hausdorff distance value of each vertex. Basic descriptive statistics (i.e., mean, maximum, minimum, standard deviation, etc.) and the distribution of FSTT were analyzed considering sex, age, BMI, and birthplace.
Statistical methods employed included multiple regression analysis, ANOVA, and principal component analysis (PCA). Results: The distribution of FSTT is mainly influenced by BMI and sex, as further supported by the results of the PCA. Additionally, FSTT values exceeding 30 mm were found to be more sensitive to sex. Birthplace-related differences were observed in areas such as the forehead, orbital, mandibular, and zygoma regions. Specifically, there are distribution variances in the depth range of 20-30 mm, particularly in the mandibular region. Northern males exhibit thinner FSTT in the frontal region of the forehead compared with southern males, while females show fewer distribution differences between north and south, except for the zygoma region. The observed distribution variance in the orbital region could be attributed to differences in orbital size and shape. Discussion: This study provides a database of the distribution of FSTT in Chinese individuals and suggests that open-source tools serve well for FSTT measurement. By incorporating birthplace as an influential factor in the distribution of FSTT, a greater level of detail can be achieved in facial approximation.
Keywords: forensic anthropology, forensic imaging, cranial facial reconstruction, facial soft tissue thickness, CT, open-source tool
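The face-to-skull distance measurement described above is a Hausdorff-type computation between two point sets. A minimal brute-force sketch for small 3-D point sets (MeshLab performs the per-vertex version of this on dense meshes):

```python
def directed_hausdorff(A, B):
    """Directed Hausdorff distance: max over a in A of min over b in B of
    the Euclidean distance |a - b|. Brute force, fine for small point sets."""
    def dist(p, q):
        return sum((pi - qi) ** 2 for pi, qi in zip(p, q)) ** 0.5
    return max(min(dist(a, b) for b in B) for a in A)

def hausdorff(A, B):
    """Symmetric Hausdorff distance between point sets A and B."""
    return max(directed_hausdorff(A, B), directed_hausdorff(B, A))
```

For FSTT the quantity of interest is the per-vertex closest distance from each face vertex to the skull surface (the inner min above), which is what gets colorized and exported in the PLY files.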
Procedia PDF Downloads 58
310 Polynomial Chaos Expansion Combined with Exponential Spline for Singularly Perturbed Boundary Value Problems with Random Parameter
Authors: W. K. Zahra, M. A. El-Beltagy, R. R. Elkhadrawy
Abstract:
Many practical problems in science and technology have developed over the past decades, for instance, the mathematical boundary layer theory or the approximation of solutions for different problems described by differential equations. When such problems involve large or small parameters, they become increasingly complex and therefore require the use of asymptotic methods. In this work, we consider singularly perturbed boundary value problems that contain very small parameters. Moreover, we consider these perturbation parameters as random variables. We propose a numerical method to solve this kind of problem. The proposed method is based on an exponential spline, Shishkin mesh discretization, and polynomial chaos expansion. The polynomial chaos expansion is used to handle the randomness in the perturbation parameter. Furthermore, Monte Carlo simulations (MCS) are used to validate the solution and the accuracy of the proposed method. Numerical results are provided to show the applicability and efficiency of the proposed method, which maintains remarkably high accuracy and achieves ε-uniform convergence of almost second order.
Keywords: singular perturbation problem, polynomial chaos expansion, Shishkin mesh, two small parameters, exponential spline
Procedia PDF Downloads 160
309 Some Accuracy Related Aspects in Two-Fluid Hydrodynamic Sub-Grid Modeling of Gas-Solid Riser Flows
Authors: Joseph Mouallem, Seyed Reza Amini Niaki, Norman Chavez-Cussy, Christian Costa Milioli, Fernando Eduardo Milioli
Abstract:
Sub-grid closures for filtered two-fluid models (fTFM) useful in large scale simulations (LSS) of riser flows can be derived from highly resolved simulations (HRS) with microscopic two-fluid modeling (mTFM). Accurate sub-grid closures require accurate mTFM formulations as well as accurate correlation of relevant filtered parameters to suitable independent variables. This article deals with both of those issues. The accuracy of mTFM is addressed by assessing the impact of gas sub-grid turbulence on HRS filtered predictions. A gas-turbulence-like effect is artificially inserted by means of a stochastic forcing procedure implemented in the physical space over the momentum conservation equation of the gas phase. The correlation issue is addressed by introducing a three-filtered-variable correlation analysis (three-marker analysis) performed under a variety of different macro-scale conditions typical of risers. While the more elaborate correlation procedure clearly improved accuracy, accounting for gas sub-grid turbulence had no significant impact on predictions.Keywords: fluidization, gas-particle flow, two-fluid model, sub-grid models, filtered closures
Procedia PDF Downloads 123
308 Influence of Parameters of Modeling and Data Distribution for Optimal Condition on Locally Weighted Projection Regression Method
Authors: Farhad Asadi, Mohammad Javad Mollakazemi, Aref Ghafouri
Abstract:
Recent research in neural network science and neuroscience on modeling complex time series data and statistical learning has focused mostly on learning from high-dimensional input spaces and signals. Local linear models are a strong choice for modeling local nonlinearity in data series. Locally weighted projection regression is a flexible and powerful algorithm for nonlinear approximation in high-dimensional signal spaces. In this paper, different learning scenarios for one- and two-dimensional data series with different distributions are investigated by simulation; noise is then added to the data distribution to create differently disordered time series and to evaluate the algorithm's local prediction of nonlinearity. The performance of the algorithm is then simulated, and its sensitivity to the data distribution, together with the influence of the important local-validity parameter under different data distributions, is explained for cases where the data distribution is broad or the number of data points is small.Keywords: local nonlinear estimation, LWPR algorithm, online training method, locally weighted projection regression method
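The local modeling idea behind LWPR can be illustrated with a one-dimensional locally weighted linear fit (a simplification of LWPR, which additionally projects high-dimensional inputs onto local directions). The kernel bandwidth, test function, and noise level below are illustrative assumptions:

```python
import math
import random

def local_linear_predict(x_query, xs, ys, bandwidth=0.05):
    # weight every training point with a Gaussian kernel centred at the query,
    # then fit a weighted straight line and evaluate it at the query point
    w = [math.exp(-0.5 * ((x - x_query) / bandwidth) ** 2) for x in xs]
    sw = sum(w)
    xb = sum(wi * xi for wi, xi in zip(w, xs)) / sw
    yb = sum(wi * yi for wi, yi in zip(w, ys)) / sw
    num = sum(wi * (xi - xb) * (yi - yb) for wi, xi, yi in zip(w, xs, ys))
    den = sum(wi * (xi - xb) ** 2 for wi, xi in zip(w, xs))
    slope = num / den if den > 0 else 0.0
    return yb + slope * (x_query - xb)

rng = random.Random(1)
xs = [i / 100 for i in range(100)]
ys = [math.sin(2 * math.pi * x) + rng.gauss(0, 0.05) for x in xs]
pred = local_linear_predict(0.25, xs, ys)  # true value is sin(pi/2) = 1
```

The bandwidth plays the role of the local-validity parameter discussed in the abstract: widening it averages over more of the nonlinearity, narrowing it makes the fit more sensitive to sparse or disordered data.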
Procedia PDF Downloads 502
307 Deciding Graph Non-Hamiltonicity via a Closure Algorithm
Authors: E. R. Swart, S. J. Gismondi, N. R. Swart, C. E. Bell
Abstract:
We present a heuristic algorithm that decides graph non-Hamiltonicity. All graphs are directed, each undirected edge regarded as a pair of counter-directed arcs. Each of the n! Hamilton cycles in a complete graph on n+1 vertices is mapped to an n-permutation matrix P where p(u,i)=1 if and only if the i-th arc in a cycle enters vertex u, starting and ending at vertex n+1. We first create an exclusion set E by noting all arcs (u, v) not in G, sufficient to code precisely all cycles excluded from G, i.e. cycles not in G use at least one arc not in G. Members are pairs of components of P, {p(u,i),p(v,i+1)}, i=1, ..., n-1. A doubly stochastic-like relaxed LP formulation of the Hamilton cycle decision problem is constructed. Each {p(u,i),p(v,i+1)} in E is coded as variable q(u,i,v,i+1)=0, i.e. it shrinks the feasible region. We then implement the Weak Closure Algorithm (WCA), which tests necessary conditions of a matching, together with Boolean closure to decide 0/1 variable assignments. Each {p(u,i),p(v,j)} not in E is tested for membership in E, and if possible, added to E (q(u,i,v,j)=0) to iteratively maximize |E|. If the WCA constructs E to be maximal, the set of all {p(u,i),p(v,j)}, then G is decided non-Hamiltonian. Only non-Hamiltonian G share this maximal property. Ten non-Hamiltonian graphs (10 through 104 vertices) and 2000 randomized 31-vertex non-Hamiltonian graphs are tested and correctly decided non-Hamiltonian. For Hamiltonian G, the complement of E covers a matching, perhaps useful in searching for cycles. We also present an example where the WCA fails.Keywords: Hamilton cycle decision problem, computational complexity theory, graph theory, theoretical computer science
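A minimal sketch of the initial construction of the exclusion set E, assuming a simple arc-list representation of G; the LP relaxation and the WCA iterations themselves are beyond a short example, and the restriction to the internal positions i = 1..n-1 follows the abstract's description:

```python
def exclusion_set(n, arcs):
    # Pairs {p(u,i), p(v,i+1)} forced to zero because arc (u,v) is absent
    # from G. Vertices are 1..n+1, every cycle starts and ends at vertex
    # n+1, and only internal positions i = 1..n-1 are coded here.
    present = set(arcs)
    missing = [(u, v) for u in range(1, n + 1) for v in range(1, n + 1)
               if u != v and (u, v) not in present]
    return {((u, i), (v, i + 1)) for (u, v) in missing for i in range(1, n)}

# a complete graph excludes nothing; deleting one arc excludes it at every
# consecutive pair of internal positions
complete = [(u, v) for u in range(1, 5) for v in range(1, 5) if u != v]
E_full = exclusion_set(3, complete)
E_one = exclusion_set(3, [a for a in complete if a != (1, 2)])
```

Any cycle not in G must use some missing arc (u,v) at some position i, so it activates one of these pairs, which is exactly why setting q(u,i,v,i+1)=0 removes precisely the excluded cycles from the feasible region.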
Procedia PDF Downloads 373
306 Analyzing the Effects of Supply and Demand Shocks in the Spanish Economy
Authors: José M Martín-Moreno, Rafaela Pérez, Jesús Ruiz
Abstract:
In this paper we use a small open economy Dynamic Stochastic General Equilibrium (DSGE) model for the Spanish economy to search for a deeper characterization of the determinants of Spain's macroeconomic fluctuations throughout the period 1970-2008. In order to do this, we distinguish between tradable and non-tradable goods to take into account the fact that the presence of non-tradable goods in this economy is one of the largest in the world. We estimate a DSGE model with supply and demand shocks (sectorial productivity, public spending, international real interest rate, and preferences) using Kalman filter techniques. We find the following results. First of all, our variance decomposition analysis suggests that 1) the preference shock basically accounts for private consumption volatility, 2) the idiosyncratic productivity shock accounts for non-tradable output volatility, and 3) the sectorial productivity shock along with the international interest rate both greatly account for tradable output volatility. Secondly, the model closely replicates the time path observed in the data for the Spanish economy, and finally, the model captures the main cyclical qualitative features of this economy reasonably well.Keywords: business cycle, DSGE models, Kalman filter estimation, small open economy
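The Kalman filter estimation mentioned above can be illustrated in a heavily simplified scalar state-space setting rather than the full DSGE model; the AR(1) coefficient and noise variances below are illustrative assumptions:

```python
import random

def kalman_filter(ys, a, q, r, x0=0.0, p0=1.0):
    # scalar state-space model: x_t = a*x_{t-1} + w_t, w ~ N(0, q)
    #                           y_t = x_t + v_t,       v ~ N(0, r)
    x, P = x0, p0
    estimates = []
    for y in ys:
        x, P = a * x, a * a * P + q          # predict
        K = P / (P + r)                      # Kalman gain
        x, P = x + K * (y - x), (1 - K) * P  # update
        estimates.append(x)
    return estimates

# simulate a noisy AR(1) state and filter it back out of the observations
rng = random.Random(3)
states, obs = [], []
x = 0.0
for _ in range(500):
    x = 0.9 * x + rng.gauss(0, 0.3)
    states.append(x)
    obs.append(x + rng.gauss(0, 1.0))
filtered = kalman_filter(obs, a=0.9, q=0.09, r=1.0)
mse_obs = sum((y - s) ** 2 for y, s in zip(obs, states)) / len(states)
mse_filt = sum((e - s) ** 2 for e, s in zip(filtered, states)) / len(states)
```

In DSGE estimation the same predict/update recursion runs on the linearized model's vector state, and the filter's prediction-error decomposition supplies the likelihood that the shock parameters are estimated from.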
Procedia PDF Downloads 416
305 A Simulation-Optimization Approach to Control Production, Subcontracting and Maintenance Decisions for a Deteriorating Production System
Authors: Héctor Rivera-Gómez, Eva Selene Hernández-Gress, Oscar Montaño-Arango, Jose Ramon Corona-Armenta
Abstract:
This research studies the joint production, maintenance, and subcontracting control policy for an unreliable deteriorating manufacturing system. Production activities are controlled by a variant of the Hedging Point Policy, and since the system is subject to deterioration, it progressively reduces its capacity to satisfy product demand. Multiple deterioration effects are considered, reflected mainly in the quality of the parts produced and the reliability of the machine. Subcontracting is available as support to satisfy product demand; overhaul maintenance can also be conducted to reduce the effects of deterioration. The main objective of the research is to determine simultaneously the production, maintenance, and subcontracting rates which minimize the total incurred cost. A stochastic dynamic programming model is developed and solved through a simulation-based approach composed of statistical analysis and optimization with the response surface methodology. The obtained results highlight the strong interactions between production, deterioration, and quality, which justify the development of an integrated model. A numerical example and a sensitivity analysis are presented to validate our results.Keywords: subcontracting, optimal control, deterioration, simulation, production planning
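The Hedging Point Policy underlying the production control can be sketched as a simple threshold rule; the threshold z, capacity, demand rate, and failure process below are illustrative assumptions, not the paper's calibrated values:

```python
import random

def hedging_point_control(inventory, z, u_max, demand):
    # classic hedging point rule: build a safety stock up to threshold z,
    # then produce exactly at the demand rate; stop producing above it
    if inventory < z:
        return u_max
    if inventory == z:
        return demand
    return 0.0

def simulate(z=5.0, u_max=2.0, demand=1.0, steps=2000, dt=0.1, seed=7):
    # toy discrete-time run with random machine up/down switches
    rng = random.Random(seed)
    x, up = 0.0, True
    for _ in range(steps):
        if rng.random() < 0.01:
            up = not up
        rate = hedging_point_control(x, z, u_max, demand) if up else 0.0
        x += (rate - demand) * dt
    return x

final_inventory = simulate()
```

The safety stock hedges against downtime; in the paper's setting the threshold is a decision variable optimized jointly with the maintenance and subcontracting rates, whereas here it is fixed for illustration.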
Procedia PDF Downloads 579
304 Stackelberg Security Game for Optimizing Security of Federated Internet of Things Platform Instances
Authors: Violeta Damjanovic-Behrendt
Abstract:
This paper presents an approach for optimal cyber security decisions to protect instances of a federated Internet of Things (IoT) platform in the cloud. The presented solution implements the repeated Stackelberg Security Game (SSG) and a model called the Stochastic Human behaviour model with AttRactiveness and Probability weighting (SHARP). SHARP employs the Subjective Utility Quantal Response (SUQR) for formulating a subjective utility function, which is based on the evaluations of alternative solutions during decision-making. We augment the repeated SSG (including SHARP and SUQR) with a reinforcement learning algorithm called Naïve Q-Learning. Naïve Q-Learning belongs to the category of active and model-free Machine Learning (ML) techniques, in which the agent (either the defender or the attacker) attempts to find an optimal security solution. In this way, we combine game theory (GT) and ML algorithms for discovering optimal cyber security policies. The proposed security optimization components will be validated in a collaborative cloud platform that is based on the Industrial Internet Reference Architecture (IIRA) and its recently published security model.Keywords: security, internet of things, cloud computing, stackelberg game, machine learning, naive q-learning
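A minimal sketch of the tabular Q-learning that the Naïve Q-Learning component builds on, shown here on a toy chain environment rather than the security game; the states, rewards, and hyperparameters are illustrative assumptions:

```python
import random

def chain_step(s, a):
    # toy environment: 4 states in a line, action 1 moves right, action 0
    # stays; reaching state 3 pays reward 1 and ends the episode
    s2 = min(s + 1, 3) if a == 1 else s
    done = s2 == 3
    return s2, (1.0 if done else 0.0), done

def q_learning(n_states=4, n_actions=2, episodes=2000,
               alpha=0.1, gamma=0.9, eps=0.1, seed=0):
    rng = random.Random(seed)
    Q = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            if rng.random() < eps:
                a = rng.randrange(n_actions)                      # explore
            else:
                a = max(range(n_actions), key=lambda i: Q[s][i])  # exploit
            s2, r, done = chain_step(s, a)
            target = r if done else r + gamma * max(Q[s2])
            Q[s][a] += alpha * (target - Q[s][a])  # temporal-difference update
            s = s2
    return Q

Q = q_learning()
```

Because the update uses only observed transitions, the agent needs no model of its opponent, which is what makes this approach model-free and attractive in the adversarial SSG setting.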
Procedia PDF Downloads 354
303 The Pressure Effect and First-Principles Study of Strontium Chalcogenides SrS
Authors: Benallou Yassine, Amara Kadda, Bouazza Boubakar, Soudini Belabbes, Arbouche Omar, M. Zemouli
Abstract:
The study of the pressure effect on materials, their functionality, and their properties is very important, insofar as it provides the opportunity to identify other applications, such as the optical properties of the alkaline earth chalcogenides like SrS. Here we present first-principles calculations performed using the full-potential linearized augmented plane wave method (FP-LAPW) within the Generalized Gradient Approximation developed by Perdew–Burke–Ernzerhof for solids (PBEsol). The calculated structural parameters, such as the lattice parameters, the bulk modulus B, and its pressure derivative B', are in reasonable agreement with the available experimental and theoretical data. In addition, the elastic properties such as the elastic constants (C11, C12, and C44), the shear modulus G, the Young modulus E, the Poisson's ratio ν, and the B/G ratio are also given. Exchange and correlation effects were treated with the Tran-Blaha modified Becke-Johnson (TB-mBJ) potential for the electronic properties. The pressure effect on the electronic properties was visualized by calculating the variation of the band gap as a function of pressure. The obtained results are compared to available experimental data and to other theoretical calculations.Keywords: SrS, GGA-PBEsol+TB-MBJ, density functional, Perdew–Burke–Ernzerhof, FP-LAPW, pressure effect
Procedia PDF Downloads 569
302 An Insight into the Probabilistic Assessment of Reserves in Conventional Reservoirs
Authors: Sai Sudarshan, Harsh Vyas, Riddhiman Sherlekar
Abstract:
The oil and gas industry has been unwilling to adopt a stochastic definition of reserves. Nevertheless, Monte Carlo simulation methods have gained acceptance by engineers, geoscientists, and other professionals who want to evaluate prospects or otherwise analyze problems that involve uncertainty. One of the common applications of Monte Carlo simulation is the estimation of recoverable hydrocarbon from a reservoir. Monte Carlo simulation makes use of random samples of parameters or inputs to explore the behavior of a complex system or process. It finds application whenever one needs to make an estimate, forecast, or decision where there is significant uncertainty. First, the project focuses on performing Monte Carlo simulation on a given data set using the U.S. Department of Energy's MonteCarlo software, which is a freeware E&P tool. Further, an algorithm for simulation has been developed in MATLAB; the program performs the simulation by prompting the user for input distributions and the parameters associated with each distribution (i.e., mean, standard deviation, min., max., most likely, etc.). It also prompts the user for the desired probability for which reserves are to be calculated. The algorithm so developed and tested in MATLAB is further implemented in Python, where existing libraries for statistics and graph plotting have been imported to generate better outcomes. With Qt Designer (PyQt), codes for a simple graphical user interface have also been written. The graph so plotted is then validated against the results already available from the U.S. DOE MonteCarlo software.Keywords: simulation, probability, confidence interval, sensitivity analysis
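The reserve-estimation workflow described above can be sketched as follows; the volumetric formula is the standard one, but every triangular distribution bound below is an illustrative assumption rather than field data:

```python
import random

def simulate_reserves(n=50000, seed=42):
    # volumetric estimate: OOIP = 7758 * A * h * phi * (1 - Sw) / Bo  (STB),
    # recoverable = OOIP * RF; all triangular ranges below are assumed
    rng = random.Random(seed)
    recoverable = []
    for _ in range(n):
        area = rng.triangular(800, 1200, 1000)   # acres
        h = rng.triangular(20, 60, 40)           # net pay, ft
        phi = rng.triangular(0.12, 0.25, 0.18)   # porosity
        sw = rng.triangular(0.20, 0.45, 0.30)    # water saturation
        rf = rng.triangular(0.15, 0.40, 0.25)    # recovery factor
        bo = rng.triangular(1.1, 1.4, 1.2)       # formation volume factor
        ooip = 7758.0 * area * h * phi * (1.0 - sw) / bo
        recoverable.append(ooip * rf)
    recoverable.sort()
    pct = lambda q: recoverable[int(q * n)]
    # P90 (exceeded with 90% probability) is the 10th percentile, and so on
    return pct(0.10), pct(0.50), pct(0.90)

p90, p50, p10 = simulate_reserves()
```

Reading reserves off the sorted sample at the requested probability is exactly the "desired probability" prompt the abstract describes; swapping a triangular distribution for a lognormal or uniform one only changes the sampling lines.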
Procedia PDF Downloads 382
301 Analysis of Three-Dimensional Longitudinal Rolls Induced by Double Diffusive Poiseuille-Rayleigh-Benard Flows in Rectangular Channels
Authors: O. Rahli, N. Mimouni, R. Bennacer, K. Bouhadef
Abstract:
This numerical study investigates the appearance of travelling waves and the behavior of the Poiseuille-Rayleigh-Benard (PRB) flow induced in 3D thermosolutal mixed convection (TSMC) in horizontal rectangular channels. The governing equations are discretized by using a control volume method with the third-order QUICK scheme for approximating the advection terms. The SIMPLER algorithm is used to handle the coupling between the momentum and continuity equations. To avoid excessively high computer time, the full approximation storage (FAS) scheme with the full multigrid (FMG) method is used to solve the problem. For a broad range of dimensionless controlling parameters, the contribution of this work is to analyze the flow regimes of the steady longitudinal thermoconvective rolls (noted R//) for both thermal and mass transfer (TSMC). The transition from opposing volume forces to cooperating ones considerably affects the birth and development of the longitudinal rolls. The heat and mass transfer distributions are also examined.Keywords: heat and mass transfer, mixed convection, poiseuille-rayleigh-benard flow, rectangular duct
Procedia PDF Downloads 298
300 Iterative Solver for Solving Large-Scale Frictional Contact Problems
Authors: Thierno Diop, Michel Fortin, Jean Deteix
Abstract:
Since the precise formulation of the elastic part is irrelevant for the description of the algorithm, we shall consider a generic case. In practice, however, we will have to deal with a nonlinear material (for instance, a Mooney-Rivlin model). We are interested in solving a finite element approximation of the problem, leading to large-scale nonlinear discrete problems and, after linearization, to large linear systems and ultimately to calculations needing iterative methods. This also implies that the penalty method, and therefore the augmented Lagrangian method, are to be banned because of their negative effect on the condition number of the underlying discrete systems and thus on the convergence of iterative methods. This is a break from the mainstream of methods for contact, in which the augmented Lagrangian is the principal tool. We shall first present the problem and its discretization; this will lead us to describe a general solution algorithm relying on a preconditioner for saddle-point problems, which we shall describe in some detail as it is not entirely standard. We will propose an iterative approach for solving three-dimensional frictional contact problems between elastic bodies, including contact with a rigid body, contact between two or more bodies, and also self-contact.Keywords: frictional contact, three-dimensional, large-scale, iterative method
Procedia PDF Downloads 210
299 Body Image Dissatisfaction and Personal Behavioral Control in Obese Patients Who Are Attending Treatment
Authors: Mariela Gonzalez, Zoraide Lugli, Eleonora Vivas, Rosana Guzmán
Abstract:
The objective was to determine the predictive capacity of perceived self-efficacy for weight control, locus of weight control, and weight self-management skills with respect to body image dissatisfaction in obese people who attend treatment. This cross-sectional study was conducted in the city of Maracay, Venezuela, with 243 obese patients who attend treatment, 173 female and 70 male, with ages ranging between 18 and 57 years. The sample's body mass index ranged between 29.39 and 44.14. The following instruments were used: the Body Shape Questionnaire (BSQ), the inventory of body weight self-regulation, the inventory of self-efficacy in the regulation of body weight, and the inventory of the locus of weight control. Descriptive statistics, measures of central tendency, correlation coefficients, and multiple regression coefficients were calculated; it was found that low perceived self-efficacy in weight control and a high external locus of control predict dissatisfaction with body image in obese patients who attend treatment. The findings are a first approximation accounting for the importance of personal control variables in the study of the psychological distress of the overweight individual.Keywords: dissatisfaction with body image, obese people, personal control, psychological variables
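The multiple regression step can be illustrated on synthetic data; the coefficients, scales, and variable names below are illustrative assumptions, not the study's estimates (pure-stdlib ordinary least squares via the normal equations):

```python
import random

def ols(X, y):
    # ordinary least squares via the normal equations (X'X) beta = X'y,
    # solved by Gaussian elimination with partial pivoting
    k = len(X[0])
    A = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for i in reversed(range(k)):
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]
    return beta

# synthetic survey: dissatisfaction falls with self-efficacy and rises with
# external locus of control (coefficients assumed, not the study's results)
rng = random.Random(11)
X, y = [], []
for _ in range(500):
    se = rng.uniform(0, 5)   # hypothetical self-efficacy score
    loc = rng.uniform(0, 5)  # hypothetical external locus of control score
    X.append([1.0, se, loc])
    y.append(10.0 - 2.0 * se + 1.5 * loc + rng.gauss(0, 1.0))
beta = ols(X, y)
```

The signs of the recovered slopes mirror the study's qualitative finding: a negative coefficient on self-efficacy and a positive one on external locus of control.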
Procedia PDF Downloads 432