Search results for: computation independent model (CIM)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18212

18212 Model Driven Architecture Methodologies: A Review

Authors: Arslan Murtaza

Abstract:

Model Driven Architecture (MDA) is a technique presented by the OMG (Object Management Group) for software development in which different models are proposed and then converted into code. The main idea is to specify the task using a PIM (Platform Independent Model), transform it into a PSM (Platform Specific Model), and then convert it into code. This review paper describes some challenges and issues faced in MDA, the types and transformations of models (e.g. CIM, PIM and PSM), and an evaluation of MDA-based methodologies.

Keywords: OMG, model driven architecture (MDA), computation independent model (CIM), platform independent model (PIM), platform specific model (PSM), MDA-based methodologies

Procedia PDF Downloads 411
18211 Model-Independent Price Bounds for the Swiss Re Mortality Bond 2003

Authors: Raj Kumari Bahl, Sotirios Sabanis

Abstract:

In this paper, we are concerned with the valuation of the first catastrophic mortality bond that was launched in the market, namely the Swiss Re Mortality Bond 2003. This bond encapsulates the behavior of a well-defined mortality index to generate payoffs for the bondholders. Pricing this bond is a challenging task. We express the payoff of the terminal principal of the bond in terms of the payoff of an Asian put option and present an approach to derive model-independent bounds exploiting comonotonicity theory. We invoke Jensen’s inequality for the computation of lower bounds and employ a Lagrange optimization technique to achieve the upper bound. The success of these bounds is based on the availability of compatible European mortality options in the market. We carry out Monte Carlo simulations to estimate the bond price and illustrate the strength of these bounds across a variety of models. The fact that our bounds are model-independent is a crucial breakthrough in the pricing of catastrophic mortality bonds.
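
A minimal numerical illustration of the pricing ingredients described above: a Monte Carlo estimate of an Asian-put-style terminal-principal payoff next to the Jensen's-inequality lower bound. The geometric Brownian motion index, the strike, and the flat discount rate are placeholder assumptions, not the paper's mortality-index dynamics.

```python
"""Monte Carlo price of an Asian-put-style principal reduction vs. a
Jensen's-inequality lower bound.  A sketch only: the mortality index is
modelled here as geometric Brownian motion purely for illustration; the
paper derives bounds that do not rely on such a model choice."""
import numpy as np

rng = np.random.default_rng(0)
S0, mu, sigma = 100.0, 0.01, 0.05      # hypothetical index level and dynamics
K, T, n_steps, n_paths = 100.0, 3.0, 3, 100_000
r = 0.02                               # flat discount rate (assumption)

dt = T / n_steps
# Simulate index paths with annual observations
z = rng.standard_normal((n_paths, n_steps))
log_paths = np.cumsum((mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
paths = S0 * np.exp(log_paths)

# Terminal principal loss behaves like an Asian put on the average index level
avg = paths.mean(axis=1)
payoff = np.maximum(K - avg, 0.0)
mc_price = np.exp(-r * T) * payoff.mean()

# Jensen's inequality: E[(K - A)^+] >= (K - E[A])^+  -- a lower bound that
# needs only the expected average, not the full distribution of the index.
lower_bound = np.exp(-r * T) * max(K - avg.mean(), 0.0)

print(f"Monte Carlo price : {mc_price:.4f}")
print(f"Jensen lower bound: {lower_bound:.4f}")
```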

Keywords: mortality bond, Swiss Re Bond, mortality index, comonotonicity

Procedia PDF Downloads 217
18210 Aerodynamic Coefficients Prediction from Minimum Computation Combinations Using OpenVSP Software

Authors: Marine Segui, Ruxandra Mihaela Botez

Abstract:

OpenVSP is an aerodynamic solver developed by the National Aeronautics and Space Administration (NASA) that allows building a reliable model of an aircraft. This software performs an aerodynamic simulation according to the angle of attack the aircraft makes with the incoming airstream and to its speed. A reliable aerodynamic model of the Cessna Citation X was designed, but it required a large amount of computation time. As a consequence, a prediction method was established that allows predicting lift and drag coefficients for all Mach numbers and all angles of attack, excluding stall conditions, from a computation of only three angles of attack and a single Mach number. Aerodynamic coefficients given by the prediction method for a Cessna Citation X model were finally compared with the aerodynamic coefficients obtained using a complete OpenVSP study.
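
The abstract does not spell out the prediction scheme, so the sketch below shows only one plausible reading, as an assumption: fit a linear lift curve and a parabolic drag polar to three angles of attack at a reference Mach number, then scale with a Prandtl-Glauert correction for other Mach numbers.

```python
"""A hedged sketch of coefficient prediction from a minimal set of runs:
three angles of attack at a single reference Mach number.  The fitted
models (linear CL(alpha), parabolic drag polar) and the Prandtl-Glauert
scaling used here are illustrative assumptions, not the authors' method."""
import numpy as np

# Hypothetical OpenVSP results at M_ref = 0.4 for alpha = 0, 2, 4 degrees
alpha_deg = np.array([0.0, 2.0, 4.0])
cl_ref = np.array([0.20, 0.38, 0.56])
cd_ref = np.array([0.022, 0.027, 0.038])
M_ref = 0.4

# Linear lift curve: CL = CL0 + CLa * alpha  (valid away from stall)
cl_a, cl_0 = np.polyfit(alpha_deg, cl_ref, 1)
# Parabolic drag polar: CD = CD0 + k * CL^2
k, cd_0 = np.polyfit(cl_ref**2, cd_ref, 1)

def predict(alpha, mach):
    """Predict CL, CD at any (alpha, Mach) in the subsonic, pre-stall range."""
    beta = np.sqrt(1.0 - M_ref**2) / np.sqrt(1.0 - mach**2)  # Prandtl-Glauert ratio
    cl = (cl_0 + cl_a * alpha) * beta
    cd = cd_0 + k * cl**2
    return cl, cd

print(predict(alpha=3.0, mach=0.6))
```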

Keywords: aerodynamic, coefficient, cruise, improving, longitudinal, openVSP, solver, time

Procedia PDF Downloads 201
18209 Acoustic Room Impulse Response Computation with Image Sources and Frequency Dependent Boundary Reflection Coefficients

Authors: Pratik Gandhi, Kavitha Chandra, Charles Thompson

Abstract:

A computational model of the acoustic room impulse response between transmitters and receivers located in an enclosed cavity under the influence of frequency-dependent reflection coefficients of the walls is presented. The characteristic features of the impulse responses that differentiate these results from frequency-independent reflecting surfaces are discussed. The image-source model is derived from the first-principles solution to the Green's function of the acoustic wave equation. The post-processing of the computed impulse response with a band-pass filter to better represent the response of a loudspeaker is demonstrated.
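
A minimal sketch of the image-source idea with a frequency-dependent wall reflection coefficient, assuming a shoebox room, an illustrative R(f), and a small image order; the paper's own model is derived from the Green's function of the wave equation, so treat this only as a companion illustration.

```python
"""Minimal image-source sketch of a shoebox-room impulse response with a
frequency-dependent wall reflection coefficient.  Room size, source and
receiver positions, the reflection model R(f), and the image order are
illustrative assumptions."""
import itertools
import numpy as np

c, fs, n_fft = 343.0, 8000, 4096            # sound speed, sample rate, FFT size
L = np.array([5.0, 4.0, 3.0])               # room dimensions (m)
src = np.array([1.0, 1.5, 1.2])
rec = np.array([3.5, 2.0, 1.5])
order = 3                                   # max image index per axis

freqs = np.fft.rfftfreq(n_fft, 1.0 / fs)

def wall_reflection(f):
    """Hypothetical frequency-dependent reflection: softer walls at high f."""
    return 0.9 / np.sqrt(1.0 + (f / 2000.0) ** 2)

H = np.zeros_like(freqs, dtype=complex)
R = wall_reflection(freqs)

for n in itertools.product(range(-order, order + 1), repeat=3):
    for q in itertools.product((0, 1), repeat=3):
        img = (1 - 2 * np.array(q)) * src + 2 * np.array(n) * L
        d = np.linalg.norm(img - rec)
        n_refl = sum(abs(ni - qi) + abs(ni) for ni, qi in zip(n, q))
        # Each arrival: spherical spreading, propagation delay, R(f)^reflections
        H += (R ** n_refl) * np.exp(-2j * np.pi * freqs * d / c) / (4 * np.pi * d)

rir = np.fft.irfft(H, n_fft)                # room impulse response (time domain)
print("peak arrival at t = %.4f s" % (np.argmax(np.abs(rir)) / fs))
```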

Keywords: acoustic room impulse response, frequency dependent reflection coefficients, Green's function, image model

Procedia PDF Downloads 189
18208 Presenting the Mathematical Model to Determine Retention in the Watersheds

Authors: S. Shamohammadi, L. Razavi

Abstract:

In this paper, based on the principal concepts of the SCS-CN model, a new mathematical model for the computation of the retention potential (S) is presented. In the mathematical model, not only are the precipitation-runoff concepts of the SCS-CN model precisely represented in mathematical form, but new concepts, called “maximum retention” and “total retention”, are also introduced, and the concepts of potential retention capacity, maximum retention, and total retention are separated from each other. In the proposed model, actual retention (F), maximum actual retention (Fmax), total retention (S), maximum retention (Smax), and potential retention (Sp) are clearly defined for the first time, so that Sp is not a variable but a function of the morphological characteristics of the watershed. Indeed, based on the mathematical relation of the conceptual curve of the SCS-CN model, the proposed model provides a new method for the computation of actual retention in a watershed, from which runoff is then simply determined. In the corresponding relations, in addition to precipitation (P), initial retention (Ia), the cumulative actual retention capacity (F), total retention (S), runoff (Q), antecedent moisture (M), and potential retention (Sp), we introduce Fmax and Fmin, referring to the maximum and minimum actual retention, respectively, as well as ksh, a coefficient which depends on the morphological characteristics of the watershed. Advantages of the modified version versus the original model include better precision, higher performance, easier calibration, and faster computation.
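
For reference, the classical SCS-CN relations that the proposed model starts from can be written in a few lines; the paper's new quantities (Smax, Fmax, Fmin, ksh) are not reproduced here.

```python
"""Classical SCS-CN relations underlying the proposed retention model:
potential retention S from the curve number, initial abstraction Ia, runoff
Q, and the actual retention F = (P - Ia) - Q.  Only the standard relations
are shown; the paper's new quantities are not specified in the abstract."""

def scs_cn_runoff(p_mm: float, cn: float, lam: float = 0.2):
    """Return (S, Ia, Q, F) in millimetres for rainfall p_mm and curve number cn."""
    s = 25400.0 / cn - 254.0          # potential retention (mm)
    ia = lam * s                      # initial abstraction
    if p_mm <= ia:
        return s, ia, 0.0, 0.0        # all rainfall retained, no runoff
    q = (p_mm - ia) ** 2 / (p_mm - ia + s)
    f = (p_mm - ia) - q               # actual retention once runoff starts
    return s, ia, q, f

print(scs_cn_runoff(p_mm=80.0, cn=75.0))
```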

Keywords: model, mathematical, retention, watershed, SCS

Procedia PDF Downloads 425
18207 Factors Affecting Customer Loyalty in the Independent Surveyor Service Industry in Indonesia

Authors: Sufrin Hannan, Budi Suharjo, Rita Nurmalina, Kirbrandoko

Abstract:

The challenge for independent surveyor service companies is now growing with increasing uncertainty in business. Government protection of the domestic independent surveyor industry from competition, such as the entry of global surveyors into Indonesia, also no longer exists. Therefore, building customer loyalty becomes very important to create a long-term relationship between an independent surveyor and its customers. This study aims to develop a model that can be used to build customer loyalty by looking at various factors that determine customer loyalty, especially for independent surveyors for coal inspection in Indonesia. The development of this model uses the relationship marketing approach. The hypotheses are tested on the 10 variables that determine customer loyalty, either directly or indirectly. The data were collected from 200 questionnaires filled in by independent surveyor decision makers from 51 exporting companies and coal trading companies in Indonesia and analyzed using Structural Equation Modeling (SEM). The results show that the customer loyalty of independent surveyors is influenced by customer satisfaction, trust, switching barriers, and relationship bonds. Research on customer satisfaction shows that customer satisfaction is influenced by perceived quality and perceived value, while perceived quality is influenced by reliability, assurance, responsiveness, and empathy.

Keywords: relationship marketing, customer loyalty, customer satisfaction, switching barriers, relationship bonds

Procedia PDF Downloads 141
18206 Indexing and Incremental Approach Using Map Reduce Bipartite Graph (MRBG) for Mining Evolving Big Data

Authors: Adarsh Shroff

Abstract:

Big data is a collection of datasets so large and complex that it becomes difficult to process using database management tools. Operations such as search, analysis, and visualization are performed on big data by using data mining, which is the process of extracting patterns or knowledge from large data sets. In recent years, the results of data mining applications have tended to become stale and obsolete over time. Incremental processing is a promising approach to refreshing mining results: it utilizes previously saved states to avoid the expense of re-computation from scratch. This project uses i2MapReduce, an incremental processing extension to MapReduce, the most widely used framework for mining big data. i2MapReduce performs key-value pair level incremental processing rather than task-level re-computation, supports not only one-step computation but also more sophisticated iterative computation, which is widely used in data mining applications, and incorporates a set of novel techniques to reduce I/O overhead for accessing preserved fine-grain computation states. To optimize the mining results, i2MapReduce is evaluated using a one-step algorithm and three iterative algorithms with diverse computation characteristics for efficient mining.
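
A toy, plain-Python illustration of key-value pair level incremental processing in the spirit described above: per-key states are preserved and only the keys touched by new input are updated. This is not the i2MapReduce API.

```python
"""Key-value-level incremental processing sketch: preserved per-key states
are updated from a delta input instead of re-running the whole job."""
from collections import Counter

def map_phase(docs):
    """Map: emit (word, count) pairs for every document in the batch."""
    pairs = Counter()
    for doc in docs:
        pairs.update(doc.lower().split())
    return pairs

def incremental_update(saved_state: Counter, delta_docs) -> Counter:
    """Merge only the keys touched by the new documents into the saved state."""
    delta = map_phase(delta_docs)
    for word, count in delta.items():       # key-value level merge
        saved_state[word] += count
    return saved_state

state = map_phase(["big data is large", "data mining extracts patterns"])
state = incremental_update(state, ["incremental data processing avoids recomputation"])
print(state.most_common(3))
```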

Keywords: big data, map reduce, incremental processing, iterative computation

Procedia PDF Downloads 312
18205 Comparing Spontaneous Hydrolysis Rates of Activated Models of DNA and RNA

Authors: Mohamed S. Sasi, Adel M. Mlitan, Abdulfattah M. Alkherraz

Abstract:

This research project aims to investigate the difference in relative rates of phosphoryl transfer relevant to the biological catalysis of DNA and RNA in pH-independent reactions. Activated models of DNA and RNA, alkyl-aryl phosphate diesters (with 4-nitrophenyl as a good leaving group), have successfully been prepared to gather kinetic parameters. Eyring plots for the pH-independent hydrolysis of 1 and 2 were established at different temperatures in the range 100–160 °C. These measurements have been used to provide a better estimate of the difference in relative rates between the reactivity of DNA and RNA cleavage. The Eyring plots gave an extrapolated rate of kH2O = 1 × 10⁻¹⁰ s⁻¹ for 1 (RNA model) and 2 (DNA model) at 25 °C. Comparing the reactivity of the RNA model and the DNA model shows that the relative rates in the pH-independent reactions are surprisingly similar at 25 °C. This allows us to obtain chemical insights into how biological catalysts such as enzymes may have evolved to perform their current functions.
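
A short sketch of the Eyring-plot extrapolation step: fit ln(k/T) against 1/T over the 100–160 °C range and extrapolate to 25 °C. The rate constants used below are made-up placeholders, not the measured values for compounds 1 and 2.

```python
"""Eyring-plot extrapolation sketch: linear fit of ln(k/T) vs. 1/T, then
extrapolation of the rate constant to 25 degC.  The rate data are
hypothetical placeholders."""
import numpy as np

R = 8.314               # gas constant, J mol^-1 K^-1
kB_over_h = 2.0837e10   # Boltzmann/Planck constant ratio, K^-1 s^-1

# Hypothetical observed rate constants (s^-1) at the experimental temperatures
T = np.array([100.0, 120.0, 140.0, 160.0]) + 273.15
k_obs = np.array([2.1e-7, 1.3e-6, 6.8e-6, 3.0e-5])

# Eyring: ln(k/T) = ln(kB/h) + dS/R - dH/(R*T)  ->  linear in 1/T
slope, intercept = np.polyfit(1.0 / T, np.log(k_obs / T), 1)
dH = -slope * R                              # activation enthalpy (J/mol)
dS = (intercept - np.log(kB_over_h)) * R     # activation entropy (J/mol/K)

T25 = 298.15
k25 = T25 * np.exp(intercept + slope / T25)  # extrapolated rate at 25 degC
print(f"dH = {dH/1000:.1f} kJ/mol, dS = {dS:.1f} J/mol/K, k(25C) = {k25:.2e} s^-1")
```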

Keywords: DNA and RNA models, relative rates, reactivity, phosphoryl transfer

Procedia PDF Downloads 397
18204 Verifiable Secure Computation of Large Scale Two-Point Boundary Value Problems Using Certificate Validation

Authors: Yogita M. Ahire, Nedal M. Mohammed, Ahmed A. Hamoud

Abstract:

Scientific computation outsourcing is gaining popularity because it allows customers with limited computing resources and storage devices to outsource complex computation workloads to more powerful service providers. However, it raises some security and privacy concerns and challenges, such as customer input and output privacy, as well as cloud cheating behaviors. This study was motivated by these concerns and focuses on privacy-preserving two-point boundary value problems (BVP) as a common and realistic instance of verifiable secure multiparty computing. We examine a secure and verifiable scheme with correctness guarantees, utilizing standard multiparty approaches to compute the result of a computation and then solely using verifiable methods to check that the result is correct.
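
For context, the workload being outsourced is a two-point BVP solve; a minimal finite-difference sketch of the plaintext computation is shown below, with the masking and verification layers of the protocol deliberately left out.

```python
"""Plain (non-secure) two-point BVP solve for y''(x) = f(x), y(a)=alpha,
y(b)=beta, by second-order central differences -- the kind of workload a
client would outsource.  The protocol's privacy layers are not modelled."""
import numpy as np

def solve_bvp_fd(f, a, b, alpha, beta, n=200):
    """Uniform grid with n interior points; returns grid x and solution y."""
    h = (b - a) / (n + 1)
    x = np.linspace(a, b, n + 2)
    # Tridiagonal system  (y_{i-1} - 2 y_i + y_{i+1}) / h^2 = f(x_i)
    A = (np.diag(-2.0 * np.ones(n)) +
         np.diag(np.ones(n - 1), 1) +
         np.diag(np.ones(n - 1), -1)) / h**2
    rhs = f(x[1:-1]).astype(float)
    rhs[0] -= alpha / h**2               # fold the known boundary values
    rhs[-1] -= beta / h**2               # into the right-hand side
    y = np.empty(n + 2)
    y[0], y[-1] = alpha, beta
    y[1:-1] = np.linalg.solve(A, rhs)
    return x, y

# Example: y'' = -pi^2 sin(pi x), y(0)=y(1)=0  ->  exact solution sin(pi x)
x, y = solve_bvp_fd(lambda t: -np.pi**2 * np.sin(np.pi * t), 0.0, 1.0, 0.0, 0.0)
print("max error:", np.max(np.abs(y - np.sin(np.pi * x))))
```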

Keywords: verifiable computing, cloud computing, secure and privacy-preserving BVP, secure computation outsourcing

Procedia PDF Downloads 59
18203 Developed Text-Independent Speaker Verification System

Authors: Mohammed Arif, Abdessalam Kifouche

Abstract:

Speech is a very convenient way of communication between people and machines, and it conveys information about the identity of the talker. Since speaker recognition technology is increasingly securing our everyday lives, the objective of this paper is to develop two automatic text-independent speaker verification (TI SV) systems using low-level spectral features and machine learning methods. (i) The first system is based on a support vector machine (SVM), which has been widely used in voice signal processing for speaker recognition, i.e., verifying the identity of a speaker based on his or her voice characteristics; and (ii) the second is based on a Gaussian Mixture Model (GMM) and a Universal Background Model (UBM), which combine functions from different resources to implement the SVM-based system.
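
A compact scikit-learn sketch of the GMM-UBM scoring idea. Assumed details: random stand-in features instead of MFCC frames, and a speaker model trained directly rather than MAP-adapted from the UBM, which is a simplification of the usual recipe.

```python
"""GMM-UBM verification sketch: a universal background model over pooled
features and a target-speaker model, scored by an average log-likelihood
ratio.  Random features stand in for real cepstral (MFCC) frames."""
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
n_dim = 13                                   # e.g. 13 MFCCs per frame

background = rng.normal(0.0, 1.0, (5000, n_dim))   # many speakers pooled
enroll = rng.normal(0.3, 1.0, (800, n_dim))        # target speaker frames
test_target = rng.normal(0.3, 1.0, (300, n_dim))
test_impostor = rng.normal(-0.4, 1.0, (300, n_dim))

ubm = GaussianMixture(n_components=16, covariance_type="diag",
                      random_state=0).fit(background)
spk = GaussianMixture(n_components=16, covariance_type="diag",
                      random_state=0).fit(enroll)

def llr_score(frames):
    """Average per-frame log-likelihood ratio: speaker model vs. UBM."""
    return np.mean(spk.score_samples(frames) - ubm.score_samples(frames))

threshold = 0.0
for name, frames in [("target", test_target), ("impostor", test_impostor)]:
    s = llr_score(frames)
    print(f"{name}: LLR = {s:.3f} -> {'accept' if s > threshold else 'reject'}")
```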

Keywords: speaker verification, text-independent, support vector machine, Gaussian mixture model, cepstral analysis

Procedia PDF Downloads 10
18202 Novel Approach to Privacy-Preserving Secure Multiparty Computation of Complex Solid Geometric Shape

Authors: Rizwan Rizwan

Abstract:

Secure Multiparty Computation is an emerging area of research within the cryptographic community, enabling secure collaboration among multiple parties while safeguarding their sensitive data. While Secure Multiparty Computation has been extensively studied in the context of plane geometry, its application to complex solid geometric shapes remains relatively unexplored. This research paper aims to bridge this gap by proposing a solution for the secure multiparty computation of intersecting tetrahedra. We present an approach to calculate the volume of intersecting tetrahedra securely while preserving the privacy of the input data provided by each participating party. The proposed solution leverages accepted simulation paradigms to prove the privacy of the computation. We thoroughly analyze the computational and communication complexities of our approach, demonstrating that they closely align with the minimum theoretical complexity for the problems at hand. This optimal nature of our solution ensures efficient and secure collaborative geometric computations.
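
The geometric primitive underneath the protocol is the tetrahedron volume from a 3x3 determinant; the plaintext version is sketched below, with the understanding that in the secure setting the same arithmetic would run on secret shares held by the parties.

```python
"""Volume of a tetrahedron from its four vertices via a 3x3 determinant.
This is the plaintext computation only; the secret-sharing and simulation
arguments of the protocol are not modelled here."""
import numpy as np

def tetrahedron_volume(a, b, c, d):
    """|det([b-a, c-a, d-a])| / 6 for vertices given as length-3 sequences."""
    a, b, c, d = map(np.asarray, (a, b, c, d))
    m = np.stack([b - a, c - a, d - a])
    return abs(np.linalg.det(m)) / 6.0

# Unit right tetrahedron: volume 1/6
print(tetrahedron_volume([0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]))
```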

Keywords: cryptography, secure multiparty computation, solid geometry, protocol, simulation paradigm

Procedia PDF Downloads 17
18201 Incremental Learning of Independent Topic Analysis

Authors: Takahiro Nishigaki, Katsumi Nitta, Takashi Onoda

Abstract:

In this paper, we present a method for applying Independent Topic Analysis (ITA) to an increasing amount of document data. The amount of document data has been increasing since the spread of the Internet, and ITA was presented as one method to analyze such data. ITA is a method for extracting independent topics from document data by using Independent Component Analysis (ICA), a technique from signal processing. However, it is difficult to apply ITA to a growing document collection, because ITA must use all of the document data, so the temporal and spatial costs are very high. Therefore, we present Incremental ITA, which extracts independent topics from an increasing amount of document data. Incremental ITA updates the independent topics when new document data are added, after the independent topics have been extracted from the previous data. Finally, we show the results of applying Incremental ITA to benchmark datasets.
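
A sketch of the batch step that ITA builds on, using scikit-learn's FastICA on a small TF-IDF matrix; the incremental update rule proposed in the paper is not reproduced here, so this is illustration only.

```python
"""Batch topic extraction with ICA on a document-term matrix.  The corpus,
the number of components, and the use of TF-IDF weighting are illustrative
assumptions; the paper's incremental update is not implemented."""
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import FastICA

docs = [
    "stock market prices rise on earnings reports",
    "team wins the championship game in overtime",
    "central bank adjusts interest rates for inflation",
    "injured striker misses the football season opener",
]

vec = TfidfVectorizer()
X = vec.fit_transform(docs).toarray()
vocab = np.array(vec.get_feature_names_out())

ica = FastICA(n_components=2, random_state=0)
doc_topics = ica.fit_transform(X)        # document-by-topic activations
topic_terms = ica.components_            # topic-by-term mixing directions

for t, row in enumerate(topic_terms):
    top = vocab[np.argsort(np.abs(row))[::-1][:4]]
    print(f"topic {t}: {', '.join(top)}")
```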

Keywords: text mining, topic extraction, independent, incremental, independent component analysis

Procedia PDF Downloads 275
18200 The Profit Trend of Cosmetics Products Using Bootstrap Edgeworth Approximation

Authors: Edlira Donefski, Lorenc Ekonomi, Tina Donefski

Abstract:

Edgeworth approximation is one of the most important statistical methods, and it has made a considerable contribution to reducing the standard deviations of the independent variables’ coefficients in a quantile regression model. This model estimates the conditional median or other quantiles. In this paper, we have applied approximating statistical methods to an economic problem. We have created and generated a quantile regression model to see how the profit gained is connected with the realized sales of cosmetic products in real data taken from a local business. The linear regression of the generated profit on the realized sales was not free of autocorrelation and heteroscedasticity, which is the reason we have used this model instead of linear regression. Our aim is to analyze in more detail the relation between the variables taken into study, the profit and the realized sales, and how to minimize the standard errors of the independent variable involved in this study, the level of realized sales. The statistical methods that we have applied in our work are the Edgeworth approximation for independent and identically distributed (IID) cases, the bootstrap version of the model, and the Edgeworth approximation for the bootstrap quantile regression model. The graphics and the results that we have presented here identify the best approximating model for our study.
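
A brief sketch of the modelling pipeline described above on synthetic data: a median quantile regression of profit on realized sales plus a paired bootstrap of the sales coefficient. The Edgeworth correction itself is not implemented here, and the data are placeholders.

```python
"""Median (q = 0.5) quantile regression with a paired bootstrap of the
sales coefficient.  Synthetic, heavy-tailed data stand in for the business
data used in the paper."""
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 200
sales = rng.uniform(10, 100, n)
profit = 5.0 + 0.4 * sales + rng.standard_t(df=3, size=n) * 4   # heavy-tailed noise

X = sm.add_constant(sales)
fit = sm.QuantReg(profit, X).fit(q=0.5)
print("median-regression coefficients:", fit.params)

# Paired bootstrap of the sales coefficient
B, coefs = 500, []
for _ in range(B):
    idx = rng.integers(0, n, n)
    coefs.append(sm.QuantReg(profit[idx], X[idx]).fit(q=0.5).params[1])
coefs = np.array(coefs)
print("bootstrap SE of sales coefficient:", coefs.std(ddof=1))
print("95% bootstrap interval:", np.percentile(coefs, [2.5, 97.5]))
```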

Keywords: bootstrap, edgeworth approximation, IID, quantile

Procedia PDF Downloads 126
18199 Multi-Objective Evolutionary Computation Based Feature Selection Applied to Behaviour Assessment of Children

Authors: F. Jiménez, R. Jódar, M. Martín, G. Sánchez, G. Sciavicco

Abstract:

Attribute or feature selection is one of the basic strategies to improve the performance of data classification tasks and, at the same time, to reduce the complexity of classifiers, and it is particularly fundamental when the number of attributes is relatively high. Its application to unsupervised classification is restricted to a limited number of experiments in the literature. Evolutionary computation has already proven itself to be a very effective choice to consistently reduce the number of attributes towards a better classification rate and a simpler semantic interpretation of the inferred classifiers. We present a feature selection wrapper model composed of a multi-objective evolutionary algorithm, the clustering method Expectation-Maximization (EM), and the classifier C4.5, for the unsupervised classification of data extracted from a psychological test named BASC-II (Behavior Assessment System for Children - II ed.), with two objectives: maximizing the likelihood of the clustering model and maximizing the accuracy of the obtained classifier. We present a methodology to integrate feature selection for unsupervised classification, model evaluation, decision making (to choose the most satisfactory model according to an a posteriori process in a multi-objective context), and testing. We compare the performance of the classifiers obtained by the multi-objective evolutionary algorithms ENORA and NSGA-II, and the best solution is then validated by the psychologists who collected the data.
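
A small wrapper sketch of the two objectives named above (clustering log-likelihood and classifier accuracy on the cluster labels), with a random search over feature subsets standing in for ENORA/NSGA-II and a CART tree standing in for C4.5, purely as assumptions for illustration.

```python
"""Wrapper feature selection for unsupervised classification: each feature
subset is scored by (i) the EM clustering log-likelihood and (ii) the
cross-validated accuracy of a tree trained on the cluster labels.  Random
subset search replaces the evolutionary algorithms, and iris replaces the
BASC-II data."""
import numpy as np
from sklearn.datasets import load_iris
from sklearn.mixture import GaussianMixture
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X = load_iris().data               # stand-in for the questionnaire data
rng = np.random.default_rng(3)

def evaluate(mask):
    """Objectives: (EM mean log-likelihood, CV accuracy of tree on cluster labels)."""
    Xs = X[:, mask]
    gmm = GaussianMixture(n_components=3, random_state=0).fit(Xs)
    labels = gmm.predict(Xs)
    acc = cross_val_score(DecisionTreeClassifier(random_state=0), Xs, labels, cv=5).mean()
    return gmm.score(Xs), acc

candidates = []
for _ in range(20):                # random subsets in place of the evolutionary search
    mask = rng.random(X.shape[1]) < 0.6
    if mask.any():
        candidates.append((mask, *evaluate(mask)))

# Keep the non-dominated (Pareto) subsets with respect to both objectives
pareto = [c for c in candidates
          if not any(o[1] >= c[1] and o[2] >= c[2] and (o[1], o[2]) != (c[1], c[2])
                     for o in candidates)]
for mask, ll, acc in pareto:
    print(list(np.flatnonzero(mask)), f"loglik={ll:.2f}", f"acc={acc:.3f}")
```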

Keywords: evolutionary computation, feature selection, classification, clustering

Procedia PDF Downloads 335
18198 Analyzing the Factors that Cause Parallel Performance Degradation in Parallel Graph-Based Computations Using Graph500

Authors: Mustafa Elfituri, Jonathan Cook

Abstract:

Recently, graph-based computations have become more important in large-scale scientific computing, as they provide a methodology to model many types of relations between independent objects. They are being actively used in fields as varied as biology, social networks, cybersecurity, and computer networks. At the same time, graph problems have properties such as irregularity and poor locality that make their performance different from that of regular applications. Therefore, parallelizing graph algorithms is a hard and challenging task. Initial evidence is that standard computer architectures do not perform very well on graph algorithms, and little is known about exactly what causes this. The Graph500 benchmark is a representative application for parallel graph-based computations, which have highly irregular data access and are driven more by traversing connected data than by computation. In this paper, we present results from analyzing the performance of various example implementations of Graph500, including a shared memory (OpenMP) version, a distributed (MPI) version, and a hybrid version. We measured and analyzed all the factors that affect its performance in order to identify possible changes that would improve it. Results are discussed in relation to the factors that contribute to performance degradation.
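
For reference, the kernel that Graph500 times is a breadth-first search; the plain-Python sketch below makes the irregular, pointer-chasing access pattern visible, without the OpenMP/MPI parallelism discussed in the paper.

```python
"""Level-synchronous BFS of the kind Graph500 benchmarks.  Plain Python,
serial, with a tiny hand-built graph; no parallelism or Kronecker graph
generator is modelled here."""
from collections import deque

def bfs(adj, source):
    """adj: dict mapping vertex -> list of neighbours; returns the parent map."""
    parent = {source: source}
    frontier = deque([source])
    while frontier:
        u = frontier.popleft()
        for v in adj[u]:                 # irregular, cache-unfriendly accesses
            if v not in parent:
                parent[v] = u
                frontier.append(v)
    return parent

adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
print(bfs(adj, source=0))
```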

Keywords: graph computation, graph500 benchmark, parallel architectures, parallel programming, workload characterization

Procedia PDF Downloads 106
18197 Optimizing the Capacity of a Convolutional Neural Network for Image Segmentation and Pattern Recognition

Authors: Yalong Jiang, Zheru Chi

Abstract:

In this paper, we study the factors that determine the capacity of a Convolutional Neural Network (CNN) model and propose ways to evaluate and adjust the capacity of a CNN model to best match a specific pattern recognition task. Firstly, a scheme is proposed to adjust the number of independent functional units within a CNN model to make it better fitted to a task. Secondly, the number of independent functional units in the capsule network is adjusted to fit it to the training dataset. Thirdly, a method based on Bayesian GAN is proposed to enrich the variances in the current dataset to increase its complexity. Experimental results on the PASCAL VOC 2010 Person Part dataset and the MNIST dataset show that, in both conventional CNN models and capsule networks, the number of independent functional units is an important factor that determines the capacity of a network model. By adjusting the number of functional units, the capacity of a model can better match the complexity of a dataset.

Keywords: CNN, convolutional neural network, capsule network, capacity optimization, character recognition, data augmentation, semantic segmentation

Procedia PDF Downloads 117
18196 A Parallel Computation Based on GPU Programming for a 3D Compressible Fluid Flow Simulation

Authors: Sugeng Rianto, P.W. Arinto Yudi, Soemarno Muhammad Nurhuda

Abstract:

The computation of a 3D compressible fluid flow for a virtual environment with haptic interaction is a non-trivial issue, especially in how to reach good performance and a balance between visualization, tactile feedback interaction, and computation. In this paper, we describe our approach of computation methods based on parallel programming on a GPU. The 3D fluid flow solvers have been developed for smoke dispersion simulation by combining cubic interpolated propagation (CIP) based fluid flow solvers with the parallelism and programmability of the GPU. The fluid flow solver is generated in a GPU-CPU message passing scheme to enable rapid development of haptic feedback modes for fluid dynamic data. A rapid solution is obtained by applying the CIP fluid flow solvers, from which multiphase fluid flow equations can be solved simultaneously. To gain further acceleration in the computation, the Navier-Stokes Equations (NSEs) are packed into channels of texels, where computation models are performed on pixels that can be considered a grid of cells. Therefore, despite the complexity of the obstacle geometry, processing on multiple vertices and pixels can be done simultaneously in parallel. The data are also shared in global memory for the CPU to control the haptics in providing kinaesthetic interaction and feeling. The results show that GPU-based parallel computation approaches provide effective simulation of a compressible fluid flow model for real-time interaction in 3D computer graphics on a PC platform. This report has shown the feasibility of a new approach of solving the compressible fluid flow equations on the GPU. The experimental tests proved that compressible fluid flowing over the few model obstacles with haptic interaction can be effectively and efficiently simulated at a reasonable frame rate with realistic visualization. These results confirm that good performance and a balance between visualization, tactile feedback interaction, and computation can be achieved.

Keywords: CIP, compressible fluid, GPU programming, parallel computation, real-time visualisation

Procedia PDF Downloads 401
18195 Gaussian Mixture Model Based Identification of Arterial Wall Movement for Computation of Distension Waveform

Authors: Ravindra B. Patil, P. Krishnamoorthy, Shriram Sethuraman

Abstract:

This work proposes a novel Gaussian Mixture Model (GMM) based approach for accurate tracking of the arterial wall and the subsequent computation of the distension waveform using the Radio Frequency (RF) ultrasound signal. The approach was evaluated on ultrasound RF data acquired from an artery-mimicking flow phantom using a prototype ultrasound system. The effectiveness of the proposed algorithm is demonstrated by comparing it with existing wall tracking algorithms. The experimental results show that the proposed method provides a 20% reduction in the error margin compared to the existing approaches in tracking the arterial wall movement. This approach, coupled with an ultrasound system, can be used to estimate the arterial compliance parameters required for the screening of cardiovascular disorders.

Keywords: distension waveform, Gaussian Mixture Model, RF ultrasound, arterial wall movement

Procedia PDF Downloads 470
18194 Model and Algorithm for Dynamic Wireless Electric Vehicle Charging Network Design

Authors: Trung Hieu Tran, Jesse O'Hanley, Russell Fowler

Abstract:

When in-wheel wireless charging technology for electric vehicles becomes mature, the development of an integrated charging station network will be essential. In this paper, we thus investigate the optimisation problem of in-wheel wireless electric vehicle charging network design. A mixed-integer linear programming model is formulated to solve the problem to optimality. In addition, a meta-heuristic algorithm is proposed for efficiently solving large-sized instances within a reasonable computation time. A parallel computing strategy is integrated into the algorithm to speed up its computation. Experimental results carried out on benchmark instances show that our model and algorithm can find optimal solutions and demonstrate their potential for practical applications.

Keywords: electric vehicle, wireless charging station, mathematical programming, meta-heuristic algorithm, parallel computing

Procedia PDF Downloads 47
18193 Dissociation of CDS from CVA Valuation Under Notation Changes

Authors: R. Henry, J-B. Paulin, St. Fauchille, Ph. Delord, K. Benkirane, A. Brunel

Abstract:

In this paper, the CVA computation of an interest rate swap is presented based on the counterparty's rating. Ratings and default probabilities given by Moody’s Investors Service are used to calculate our CVA for a specific swap with different maturities. With this computation, the influence of rating variation on CVA can be shown. The approach is applied to the analysis of Greek CDS variation during the Greek crisis, between 2008 and 2011. The main point is the determination of the correlation between the fluctuation of the Greek CDS cumulative value and the variation of the swap CVA due to the change of rating.
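
A stylized version of the rating-driven CVA sum, CVA = (1 - R) Σ DF(t) EE(t) ΔPD(t), with placeholder exposures, discount factors, and rating-to-default-probability mappings rather than Moody's data.

```python
"""Stylized CVA of an interest rate swap from rating-implied default
probabilities.  Exposure profile, discounting, and the per-rating cumulative
default curves are illustrative assumptions."""
import numpy as np

recovery = 0.4
years = np.arange(1, 6)                       # 5-year swap, annual grid
df = np.exp(-0.02 * years)                    # flat 2% discounting (assumption)
ee = np.array([1.2, 1.6, 1.7, 1.3, 0.7])      # expected positive exposure (MM)

# Hypothetical cumulative default probabilities per rating
cum_pd = {"A": 0.003 * years, "Baa": 0.009 * years, "B": 0.04 * years}

def cva(rating):
    pd_incr = np.diff(cum_pd[rating], prepend=0.0)   # marginal default prob per year
    return (1.0 - recovery) * np.sum(df * ee * pd_incr)

for r in ("A", "Baa", "B"):
    print(f"rating {r}: CVA = {cva(r):.4f} MM")
```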

Keywords: CDS, computation, CVA, Greek crisis, interest rate swap, maturity, rating, swap

Procedia PDF Downloads 275
18192 Aperiodic and Asymmetric Fibonacci Quasicrystals: Next Big Future in Quantum Computation

Authors: Jatindranath Gain, Madhumita DasSarkar, Sudakshina Kundu

Abstract:

Quantum information is stored in states with multiple quasiparticles, which have a topological degeneracy. Topological quantum computation is concerned with two-dimensional many-body systems that support such excitations. Anyons are the elementary building blocks of quantum computation. Anyons tunneling in a double-layer system can transition to an exotic non-Abelian state and produce Fibonacci anyons, which are powerful enough for universal topological quantum computation (TQC). Here the exotic behavior of a Fibonacci superlattice, and hence of Fibonacci anyons, is studied by using analytical transfer matrix methods. These Fibonacci anyons could build a quantum computer, which is an emerging and exciting direction today in nanophotonics and quantum computation.
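
As a toy companion to the transfer-matrix analysis mentioned above, the sketch below propagates a plane wave through a Fibonacci-ordered two-layer stack; the indices and thicknesses are invented, and an optical multilayer stands in for the MQW superlattice studied in the paper.

```python
"""Transfer-matrix sketch for a 1-D Fibonacci superlattice: the layer
sequence follows the substitution A -> AB, B -> A, and each layer
contributes a standard 2x2 characteristic matrix at normal incidence.
All material parameters are illustrative placeholders."""
import numpy as np

def fibonacci_word(generations):
    word = "A"
    for _ in range(generations):
        word = "".join("AB" if ch == "A" else "A" for ch in word)
    return word

layers = {"A": (2.3, 90e-9), "B": (1.45, 140e-9)}   # (refractive index, thickness)

def transmittance(wavelength, word, n_in=1.0, n_out=1.52):
    M = np.eye(2, dtype=complex)
    for ch in word:
        n, d = layers[ch]
        delta = 2.0 * np.pi * n * d / wavelength
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_out])               # stack characteristic vector
    return 4.0 * n_in * n_out / abs(n_in * B + C) ** 2

word = fibonacci_word(8)                            # 8th-generation Fibonacci sequence
for wl in (500e-9, 633e-9, 800e-9):
    print(f"lambda = {wl*1e9:.0f} nm: T = {transmittance(wl, word):.3f}")
```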

Keywords: quantum computing, quasicrystals, Multiple Quantum wells (MQWs), transfer matrix method, fibonacci anyons, quantum hall effect, nanophotonics

Procedia PDF Downloads 343
18191 A Deterministic Large Deviation Model Based on Complex N-Body Systems

Authors: David C. Ni

Abstract:

In previous efforts, we constructed N-body systems by an extended Blaschke product (EBP), which represents a non-temporal and nonlinear extension of the Lorentz transformation. In this construction, we rely on only two parameters, the nonlinear degree and the relative momentum, to characterize the systems. We further explored root computation via iteration with an algorithm extended from the Jenkins-Traub method. The solution sets demonstrate a form of σ + i[-t, t], where σ and t are real numbers, and the [-t, t] shows various canonical distributions. In this paper, we correlate the convergent sets in the original domain with the solution sets, which demonstrate large-deviation distributions in the codomain. We proceed to compare our approach with established formulations and principles, such as the Donsker-Varadhan and Wentzell-Freidlin theories. The deterministic model based on this construction allows us to explore applications in the areas of finance and statistical mechanics.

Keywords: nonlinear Lorentz transformation, Blaschke equation, iteration solutions, root computation, large deviation distribution, deterministic model

Procedia PDF Downloads 363
18190 Unsupervised Reciter Recognition Using Gaussian Mixture Models

Authors: Ahmad Alwosheel, Ahmed Alqaraawi

Abstract:

This work proposes an unsupervised, text-independent probabilistic approach to recognize a Quran reciter's voice. It is an accurate approach that works in real-time applications and does not require prior information about reciter models. It has two phases: in the training phase, the reciters' acoustical features are modeled using Gaussian Mixture Models, while in the testing phase, an unlabeled reciter's acoustical features are examined against the GMM models. Using this approach, highly accurate results are achieved with an efficient computation time.

Keywords: Quran, speaker recognition, reciter recognition, Gaussian Mixture Model

Procedia PDF Downloads 349
18189 Symbolic Computation and Abundant Travelling Wave Solutions to Modified Burgers' Equation

Authors: Muhammad Younis

Abstract:

In this article, the novel (G′/G)-expansion method is successfully applied to construct abundant travelling wave solutions to the modified Burgers’ equation with the aid of computation. The method is reliable and useful, and it gives more general exact travelling wave solutions than the existing methods. The obtained solutions are in the form of hyperbolic, trigonometric and rational functions, including solitary, singular and periodic solutions, which have many potential applications in physical science and engineering. Some of these solutions are new and some have already been constructed. Additionally, the constraint conditions for the existence of the solutions are also listed.

Keywords: traveling wave solutions, NLPDE, computation, integrability

Procedia PDF Downloads 397
18188 Computationally Efficient Electrochemical-Thermal Li-Ion Cell Model for Battery Management System

Authors: Sangwoo Han, Saeed Khaleghi Rahimian, Ying Liu

Abstract:

Vehicle electrification is gaining momentum, and many car manufacturers promise to deliver more electric vehicle (EV) models to consumers in the coming years. In controlling the battery pack, the battery management system (BMS) must maintain optimal battery performance while ensuring the safety of the battery pack. Tasks related to battery performance include determining state-of-charge (SOC), state-of-power (SOP), state-of-health (SOH), cell balancing, and battery charging. Safety-related functions include making sure cells operate within the specified static and dynamic voltage windows and temperature range, derating power, detecting faulty cells, and warning the user if necessary. The BMS often utilizes an RC circuit model to model a Li-ion cell because of its robustness and low computation cost, among other benefits. Because an equivalent circuit model such as the RC model is not a physics-based model, it can never be a prognostic model to predict battery state-of-health and avoid any safety risk even before it occurs. A physics-based Li-ion cell model, on the other hand, is more capable at the expense of computation cost. To avoid the high computation cost associated with a full-order model, many researchers have demonstrated the use of a single particle model (SPM) for BMS applications. One drawback associated with the single particle modeling approach is that it forces the use of the average current density in the calculation. The SPM would be appropriate for simulating drive cycles where there is insufficient time to develop a significant current distribution within an electrode. However, under a continuous or high-pulse electrical load, the model may fail to predict cell voltage or Li⁺ plating potential. To overcome this issue, a multi-particle reduced-order model is proposed here. The use of multiple particles combined with either linear or nonlinear charge-transfer reaction kinetics enables capturing the current density distribution within an electrode under any type of electrical load. To maintain a computational complexity similar to that of an SPM, the governing equations are solved sequentially to minimize iterative solving processes. Furthermore, the model is validated against a full-order model implemented in COMSOL Multiphysics.

Keywords: battery management system, physics-based li-ion cell model, reduced-order model, single-particle and multi-particle model

Procedia PDF Downloads 78
18185 The Impact of Political Connections on the Function of Independent Directors

Authors: Chih-Lin Chang, Tzu-Ching Weng

Abstract:

The purpose of this study is to explore the relationship between corporate political ties and the functions of independent directors. With reference to variables from the literature, such as the characteristics of the board of directors, a single comprehensive function indicator is established as a proxy variable for the function of independent directors, and the impact of political connections on independent directors is further discussed. This research takes Taiwan-listed enterprises from 2014 to 2020 as the main research object and conducts empirical research through descriptive statistics, correlation, and regression analysis. The empirical results show that political connections have a positive impact on the number of independent directors; political connections also have a significant positive relationship with the functions of independent directors, which means that because companies have political connections, the seats and functions of independent directors receive more attention and their oversight role is strengthened.

Keywords: political, connection, independent, director, function

Procedia PDF Downloads 66
18186 Confidence Envelopes for Parametric Model Selection Inference and Post-Model Selection Inference

Authors: I. M. L. Nadeesha Jayaweera, Adao Alex Trindade

Abstract:

In choosing a candidate model in likelihood-based modeling via an information criterion, the practitioner is often faced with the difficult task of deciding just how far up the ranked list to look. Motivated by this pragmatic necessity, we construct an uncertainty band for a generalized (model selection) information criterion (GIC), defined as a criterion for which the limit in probability is identical to that of the normalized log-likelihood. This includes common special cases such as AIC and BIC. The method starts from the asymptotic normality of the GIC for the joint distribution of the candidate models in an independent and identically distributed (IID) data framework and proceeds by deriving the (asymptotically) exact distribution of the minimum. The calculation of an upper quantile for its distribution then involves the computation of multivariate Gaussian integrals, which is amenable to efficient implementation via the R package "mvtnorm". The performance of the methodology is tested on simulated data by checking the coverage probability of nominal upper quantiles and compared to the bootstrap. Both methods give coverages close to nominal for large samples, but the bootstrap is two orders of magnitude slower. The methodology is subsequently extended to two other commonly used model structures: regression and time series. In the regression case, we derive the corresponding asymptotically exact distribution of the minimum GIC invoking Lindeberg-Feller type conditions for triangular arrays and are thus able to similarly calculate upper quantiles for its distribution via multivariate Gaussian integration. The bootstrap once again provides a default competing procedure, and we find that similar comparison performance metrics hold as for the IID case. The time series case is complicated by a far more intricate asymptotic regime for the joint distribution of the model GIC statistics. Under a Gaussian likelihood, the default in most packages, one needs to derive the limiting distribution of a normalized quadratic form for a realization from a stationary series. Under conditions on the process satisfied by ARMA models, a multivariate normal limit is once again achieved. The bootstrap can, however, be employed for its computation, whence we are once again in the multivariate Gaussian integration paradigm for upper quantile evaluation. Comparisons of this bootstrap-aided semi-exact method with the full-blown bootstrap once again reveal a similar performance but faster computation speeds. One of the most difficult problems in contemporary statistical methodological research is to be able to account for the extra variability introduced by model selection uncertainty, the so-called post-model selection inference (PMSI). We explore ways in which the GIC uncertainty band can be inverted to make inferences on the parameters. This is being attempted in the IID case by pivoting the CDF of the asymptotically exact distribution of the minimum GIC. For inference on one parameter at a time and a small number of candidate models, this works well, whence the attained PMSI confidence intervals are wider than the MLE-based Wald intervals, as expected.

Keywords: model selection inference, generalized information criteria, post-model selection, asymptotic theory

Procedia PDF Downloads 59
18185 Symbolic Computation on Variable-Coefficient Non-Linear Dispersive Wave Equations

Authors: Edris Rawashdeh, I. Abu-Falahah, H. M. Jaradat

Abstract:

The variable-coefficient non-linear dispersive wave equation is investigated with the aid of symbolic computation. By virtue of a newly developed simplified bilinear method, multi-soliton solutions for such an equation have been derived. Effects of the inhomogeneities of media and nonuniformities of boundaries, depicted by the variable coefficients, on the soliton behavior are discussed with the aid of the characteristic curve method and graphical analysis.

Keywords: dispersive wave equations, multiple soliton solution, Hirota Bilinear Method, symbolic computation

Procedia PDF Downloads 419
18184 Stochastic Control of Decentralized Singularly Perturbed Systems

Authors: Walid S. Alfuhaid, Saud A. Alghamdi, John M. Watkins, M. Edwin Sawan

Abstract:

Designing a controller for stochastic decentralized interconnected large-scale systems usually involves a high degree of complexity and computational effort. Noise, observability and controllability of all system states, connectivity, and channel bandwidth are other constraints on design procedures for distributed large-scale systems. The quasi-steady-state model investigated in this paper is a reduced-order model of the original system obtained using singular perturbation techniques. This paper presents an optimal control synthesis to design an observer-based feedback controller by standard stochastic control theory techniques, using the Linear Quadratic Gaussian (LQG) approach and Kalman filter design, with lower complexity and computation requirements. A numerical example is given at the end to demonstrate the efficiency of the proposed method.
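
A minimal LQG synthesis sketch on a small stand-in model: an LQR gain and a Kalman filter gain from two Riccati equations, combined into an observer-based controller. The matrices and noise intensities below are illustrative, not the paper's interconnected large-scale system.

```python
"""LQG design on a toy reduced (quasi-steady-state) model: LQR state
feedback plus a Kalman filter, combined as an observer-based controller.
System matrices and noise intensities are placeholder assumptions."""
import numpy as np
from scipy.linalg import solve_continuous_are

# Reduced slow-subsystem model  dx = A x + B u + w,  y = C x + v
A = np.array([[0.0, 1.0], [-2.0, -0.5]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
Q = np.diag([10.0, 1.0]); R = np.array([[1.0]])      # LQR weights
W = 0.1 * np.eye(2);      V = np.array([[0.01]])     # process / measurement noise

# LQR gain:  u = -K x_hat
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

# Kalman filter gain (dual Riccati equation)
Pf = solve_continuous_are(A.T, C.T, W, V)
L = Pf @ C.T @ np.linalg.inv(V)

# Observer-based controller:  dx_hat = (A - B K - L C) x_hat + L y
Acl = A - B @ K - L @ C
print("LQR gain K:", K)
print("Kalman gain L:", L.ravel())
print("controller eigenvalues:", np.linalg.eigvals(Acl))
```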

Keywords: decentralized, optimal control, output, singular perturbation

Procedia PDF Downloads 335
18183 Antecedent and Outcome of New Product Development in Leather Industry, Bangkok and Vicinity, Thailand

Authors: Bundit Pungnirund

Abstract:

The purposes of this research were to develop and monitor the antecedent factors that directly affect the success rate of new product development. This was a case study of the leather industry in Bangkok, Thailand. A total of 350 leather factories were used as the sample group. The findings revealed that the new product development model was in harmony with the empirical data at an acceptable level, with the statistical values χ² = 6.45, df = 7, p-value = .48856; RMSEA = .000; RMR = .0029; AGFI = .98; GFI = 1.00. The independent variable that directly influenced the dependent variable at the highest level was marketing outcome, with an influence coefficient of 0.32, and the independent variable that indirectly influenced the dependent variable at the highest level was a clear organizational policy, with an influence coefficient of 0.17, whereas all the independent variables together could predict the model at 48 percent.

Keywords: antecedent, new product development, leather industry, Thailand

Procedia PDF Downloads 268