Search results for: Prediction of financial markets
1424 The Influence of Knowledge Transfer on Outputs of Innovative Process – Case Study of Czech Regions
Authors: J. Stejskal, P. Hajek
Abstract:
The goal of this article is to analyse knowledge transfer at the regional level of the Czech Republic. We show how the goals of enterprises' innovative activities are related to the rate of cooperation with different actors within regional innovation systems as well as in other world regions. The results show that in most Czech regions the most important partners of enterprises are their suppliers and clients. The cooperation rate of enterprises correlates significantly mainly with enterprises' efforts to enter new markets and to reduce labour costs per unit of output. The importance of this cooperation decreases as the partner's distance increases. Regarding the type of cooperating partner, cooperation within an enterprise was associated with increasing market share and decreasing labour costs, whereas cooperation with clients was associated with efforts to replace outdated products or processes or to enter new markets. Enterprises pay less attention to cooperation with government authorities and organizations; the reasons for the marginalization of this cooperation should be subjected to further detailed investigation.
Keywords: Knowledge transfer, innovative process, Czech Republic, region.
1423 The Influence of EU Regulation of Margin Requirements on Market Stock Volatility
Authors: Nadira Kaimova
Abstract:
This paper examines the influence of margin regulation on stock market volatility in the EU over the period 1993–2014. Regulating margin requirements, or haircuts, for securities financing transactions has long been considered a potential tool to limit the build-up of leverage and dampen volatility in financial markets. The margin requirement dictates how much investors can borrow against securities and can therefore be an important part of investment. Using daily and monthly stock returns, no convincing evidence is found that EU margin requirements have served to dampen stock market volatility. The expected negative relation between margin requirements and the amount of margin credit outstanding is detected, and it is confirmed that changes in margin requirements under EU regulation have tended to follow rather than lead changes in market volatility. The analysis uses the modified Levene statistic to test whether the standard deviation of stock returns in the 25, 50 and 100 days preceding margin changes is the same as that in the succeeding 25, 50 and 100 days. The sample starts in May 1993, when the regulator was first empowered to set the initial margin requirement, and ends in May 2014. To test whether margin requirements influence stock market volatility over the long term, the sample of stock returns is divided into 14 periods, according to the 14 changes in margin requirements.
Keywords: Levene statistic, Margin Regulation, Stock Market, Volatility.
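A minimal sketch of the pre/post-change volatility comparison described above, assuming a pandas Series of daily returns indexed by trading date and a hypothetical margin-change date; SciPy's median-centred Levene test stands in for the modified Levene statistic.

```python
import pandas as pd
from scipy.stats import levene

def volatility_change_test(returns: pd.Series, change_date, window: int = 50):
    """Compare return volatility in the `window` trading days before and after
    a margin-requirement change using the median-centred (modified) Levene test."""
    pos = returns.index.get_loc(change_date)          # assumes a unique date index
    before = returns.iloc[max(0, pos - window):pos]
    after = returns.iloc[pos + 1:pos + 1 + window]
    stat, p_value = levene(before, after, center="median")
    return stat, p_value

# Hypothetical usage: test each margin change at 25-, 50- and 100-day windows
# for change_date in margin_change_dates:
#     for w in (25, 50, 100):
#         print(change_date, w, volatility_change_test(daily_returns, change_date, w))
```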
1422 Development of Accident Predictive Model for Rural Roadway
Authors: Fajaruddin Mustakim, Motohiro Fujita
Abstract:
This paper presents a study of accident analysis and black spots, and develops accident predictive models based on data collected on a rural roadway, Federal Route 50 (F050), Malaysia. Road accident trends and a black spot ranking were established for the F050. The development of the accident prediction model concentrates on the Parit Raja area, from KM 19 to KM 23. A multiple non-linear regression method was used to relate the discrete accident data to road and traffic flow explanatory variables. The dependent variable was modelled as the number of crashes expressed as an accident point weighting; however, accident point weighting has rarely been accounted for in road accident prediction models. The results show that the existing number of major access points without traffic lights, higher speeds, increasing Annual Average Daily Traffic (AADT), growing numbers of motorcycles and motorcars, and reduced time gaps are potential contributors to increased accident rates on rural roadways.
Keywords: Accident Trends, Black Spot Study, Accident Prediction Model
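To illustrate the multiple non-linear regression step, the sketch below fits a multiplicative power-form crash model with SciPy; the functional form, the explanatory variables (AADT, major access points, mean speed) and the figures are illustrative assumptions, not the paper's calibrated model.

```python
import numpy as np
from scipy.optimize import curve_fit

def crash_model(X, a, b1, b2, b3):
    # Multiplicative power form: an illustrative non-linear shape
    aadt, access_points, mean_speed = X
    return a * aadt**b1 * access_points**b2 * mean_speed**b3

# Hypothetical per-segment observations (AADT, major access points, mean speed, APW)
aadt = np.array([8500.0, 12000.0, 15500.0, 9800.0, 14200.0])
access = np.array([4.0, 7.0, 10.0, 5.0, 9.0])
speed = np.array([62.0, 70.0, 78.0, 65.0, 75.0])
apw = np.array([12.0, 25.0, 48.0, 16.0, 40.0])     # accident point weighting

params, _ = curve_fit(crash_model, (aadt, access, speed), apw,
                      p0=[1e-6, 1.0, 1.0, 1.0],
                      bounds=([0, 0, 0, 0], [1.0, 3.0, 3.0, 3.0]),
                      maxfev=20000)
print("fitted coefficients:", params)
print("predicted APW:", crash_model((aadt, access, speed), *params))
```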
1421 Evaluation of Context Information for Intermittent Networks
Authors: S. Balaji, E. Golden Julie, Y. Harold Robinson
Abstract:
A context-aware adaptive routing protocol is presented for unicast communication in intermittently connected mobile ad hoc networks (MANETs). Node selection is based on Kalman filter prediction theory and also makes use of utility functions. The existing context-aware adaptive routing is based on the spray and wait technique, but its message delivery time is high and its resource wastage is large. In this paper, we describe a spray and focus routing scheme that avoids these problems.
Keywords: Context aware adaptive routing, Kalman filter prediction, spray and wait, spray and focus, intermittent networks.
1420 Evaluation of Chiller Power Consumption Using Grey Prediction
Authors: Tien-Shun Chan, Yung-Chung Chang, Cheng-Yu Chu, Wen-Hui Chen, Yuan-Lin Chen, Shun-Chong Wang, Chang-Chun Wang
Abstract:
98% of the energy needed in Taiwan is imported, and the prices of petroleum and electricity have been increasing. In addition, facility capacity, the amount of electricity generated, the amount of electricity consumed and the number of Taiwan Power Company customers have continued to increase. For these reasons, energy conservation has become an important topic. In the past, linear regression was used to establish power consumption models for chillers. In this study, grey prediction is used to evaluate the power consumption of a chiller so as to lower the total power consumption at peak load (so that power providers do not need to keep increasing their generation and facility capacity). In grey prediction, only a few numerical values (at least four) are needed to establish the power consumption models for chillers. If the part load ratio (PLR), the temperatures of the supply and return chilled water, and the temperatures of the supply and return cooling water are taken into consideration, quite accurate results (with accuracy close to 99% for short-term predictions) may be obtained. Through such methods, we can predict whether the power consumption at peak load will exceed the contract capacity agreed between the corresponding entity and the Taiwan Power Company. If the power consumption at peak load exceeds the contracted demand, the temperature of the supply chilled water may be adjusted so as to reduce the PLR and hence lower the power consumption.
Keywords: Grey system theory, grey prediction, chiller.
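A minimal GM(1,1) grey prediction sketch for short-term forecasting from a handful of recent readings, as described above; the chiller power values are hypothetical.

```python
import numpy as np

def gm11_forecast(x0: np.ndarray, steps: int = 1) -> np.ndarray:
    """GM(1,1) grey prediction: fit on an observed series x0 (at least four
    values) and forecast `steps` values ahead."""
    n = len(x0)
    x1 = np.cumsum(x0)                                 # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])                      # background values
    B = np.column_stack((-z1, np.ones(n - 1)))
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # development coefficient, grey input
    k = np.arange(n + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # accumulated prediction
    x0_hat = np.concatenate(([x1_hat[0]], np.diff(x1_hat)))
    return x0_hat[n:]

# Hypothetical recent chiller power readings (kW)
history = np.array([412.0, 420.5, 431.2, 438.9, 447.3])
print("next-period forecast:", gm11_forecast(history, steps=1))
```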
1419 Investigation of Improved Chaotic Signal Tracking by Echo State Neural Networks and Multilayer Perceptron via Training of Extended Kalman Filter Approach
Authors: Farhad Asadi, S. Hossein Sadati
Abstract:
This paper presents the prediction performance of a feedforward multilayer perceptron (MLP) and an echo state network (ESN) trained with an extended Kalman filter. Feedforward neural networks and ESNs are powerful networks that can track and predict nonlinear signals. However, their tracking performance depends on the specific signals or data sets and carries a risk of instability accompanied by large errors. In this study, we explore this process by applying different network sizes and leaking rates for the prediction of nonlinear or chaotic signals with MLP neural networks. Major problems of ESN training, such as the initialization of the network and the improvement of prediction performance, are tackled. The influence of the activation function coefficient in the hidden layer and of other key parameters is investigated through simulation. The extended Kalman filter is employed to improve the sequential learning and the regulation of the learning rate of the feedforward neural networks. This training approach has vital features when the signals have a chaotic or non-stationary sequential pattern. Examination of the results shows that the variance is minimized at each step of the computation and the tracking is therefore smoothed, indicating satisfactory tracking characteristics under certain conditions. In addition, the simulation results confirm the satisfactory performance of both neural networks with the modified parameterization in tracking nonlinear signals.
Keywords: Feedforward neural networks, nonlinear signal prediction, echo state neural networks approach, leaking rates, capacity of neural networks.
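A minimal echo state network sketch for one-step-ahead prediction of a chaotic signal with an explicit leaking rate; for brevity the readout is fitted with ridge regression rather than the extended Kalman filter training used in the paper, and the signal, sizes and rates are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def logistic_series(n, r=3.9, x0=0.5):
    """Chaotic logistic-map series used as a stand-in nonlinear signal."""
    x = np.empty(n)
    x[0] = x0
    for t in range(1, n):
        x[t] = r * x[t - 1] * (1.0 - x[t - 1])
    return x

signal = logistic_series(2000)
u, y = signal[:-1], signal[1:]                 # input and one-step-ahead target

n_res, leak, rho = 200, 0.3, 0.9               # reservoir size, leaking rate, spectral radius
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= rho / max(abs(np.linalg.eigvals(W)))      # rescale to the chosen spectral radius

# Collect reservoir states with a leaky-integrator update
states = np.zeros((len(u), n_res))
x = np.zeros(n_res)
for t, ut in enumerate(u):
    x = (1 - leak) * x + leak * np.tanh(W_in[:, 0] * ut + W @ x)
    states[t] = x

# Linear readout fitted by ridge regression (the paper trains with an EKF instead)
washout, lam = 100, 1e-6
S, T = states[washout:], y[washout:]
W_out = np.linalg.solve(S.T @ S + lam * np.eye(n_res), S.T @ T)
pred = states @ W_out
print("one-step prediction MSE:", np.mean((pred[washout:] - y[washout:]) ** 2))
```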
1418 Dynamic Self-Scheduling of Pumped-Storage Power Plant in Energy and Ancillary Service Markets Using Sliding Window Technique
Authors: P. Kanakasabapathy, Radhika S.
Abstract:
In the competitive electricity market environment, the profit of a pumped-storage plant in the energy market can be maximized by operating it as a generator when the market clearing price is high and as a pump, moving water from the lower reservoir to the upper reservoir, when the price is low. An optimal self-scheduling plan has been developed for a pumped-storage plant, carried out on a weekly basis in order to maximize the profit of the plant, taking into account all the major uncertainties such as sudden ancillary service delivery requests and price forecasting errors. For a pumped-storage power plant to operate in a real-time market, successive self-scheduling has to be carried out considering the forecast of the day-ahead market and the reservoir storage as modified by the ancillary service requests of the previous day. The sliding window technique is used for this successive self-scheduling to ensure profit for the plant.
Keywords: Ancillary services, BPSO, Power System Economics (Electricity markets), Self-Scheduling, Sliding Window Technique.
1417 The Contribution of Edgeworth, Bootstrap and Monte Carlo Methods in Financial Data
Authors: Edlira Donefski, Tina Donefski, Lorenc Ekonomi
Abstract:
Edgeworth approximation, bootstrap and Monte Carlo simulation have a considerable impact on achieving certain results related to different problems under study. In our paper, we treat a financial case concerning the effect that the components of the cash flow of one of the most successful businesses in the world (the financial, operational and investing activities) have on the cash and cash equivalents at the end of the three-month period. To gain a better view of this case, we created a vector autoregression model and then generated the impulse responses in terms of asymptotic analysis (Edgeworth approximation), Monte Carlo simulation and residual bootstrap, based on the standard errors of every series created. The generated results show common tendencies for the three methods applied, which verifies the advantage of the three methods in the optimization of a model that contains many variants.
Keywords: Autoregression, Bootstrap, Edgeworth Expansion, Monte Carlo Method.
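A sketch of the VAR-plus-residual-bootstrap impulse responses described above, using statsmodels; the cash-flow series are synthetic placeholders and the bootstrap is a simple recursive residual resampling, not necessarily the exact scheme of the paper.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(1)

# Placeholder quarterly cash-flow components (operating, investing, financing)
data = pd.DataFrame(rng.normal(size=(60, 3)),
                    columns=["operating", "investing", "financing"])

p, horizon = 2, 8
res = VAR(data).fit(p)
point_irf = res.irf(horizon).irfs                  # (horizon+1, k, k) responses

def residual_bootstrap_irfs(n_boot=200):
    """Rebuild the sample recursively from resampled residuals, refit the VAR
    and collect the impulse responses (a simple residual bootstrap)."""
    resid = res.resid.to_numpy()
    draws = np.empty((n_boot,) + point_irf.shape)
    for b in range(n_boot):
        e = resid[rng.integers(0, len(resid), size=len(resid))]
        y = [row for row in data.to_numpy()[:p]]
        for t in range(len(resid)):
            nxt = res.intercept.copy()
            for i in range(p):
                nxt = nxt + res.coefs[i] @ y[-1 - i]   # coefs[i] is the lag-(i+1) matrix
            y.append(nxt + e[t])
        draws[b] = VAR(np.asarray(y)).fit(p).irf(horizon).irfs
    return np.percentile(draws, [5, 95], axis=0)

lower, upper = residual_bootstrap_irfs()
print("response of 'operating' to a shock in 'financing':")
print("point estimate:", point_irf[:, 0, 2].round(3))
print("90% bootstrap band:", lower[:, 0, 2].round(3), upper[:, 0, 2].round(3))
```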
1416 An Ontology for Investment in Chinese Steel Company
Authors: Liming Chen, Baoxin Xiu, Zhaoyun Ding, Bin Liu, Xianqiang Zhu
Abstract:
In the era of big data, public investors are faced with more complicated information related to investment decisions than ever before. To survive in the fierce competition, it has become increasingly urgent for investors to combine multi-source knowledge and evaluate companies' true value efficiently. To this end, a rule-based ontology reasoning method is proposed to support the value assessment of steel companies. Considering the delay in financial disclosure, and based on cost-benefit analysis, this paper introduces financial analysis of supply chain enterprises and constructs the ontology model used to assess the value of a steel company. In addition, domain knowledge is formally expressed with the help of the Web Ontology Language (OWL) and Semantic Web Rule Language (SWRL) rules. Finally, a case study on a steel company in China proves the effectiveness of the proposed method.
Keywords: Financial ontology, steel company, supply chain, ontology reasoning.
1415 Meteorological Data Study and Forecasting Using Particle Swarm Optimization Algorithm
Authors: S. Esfandeh, M. Sedighizadeh
Abstract:
Weather systems use enormously complex combinations of numerical tools for study and forecasting. Unfortunately, due to phenomena in the world climate, such as the greenhouse effect, classical models may become insufficient, mostly because they lack adaptation. Therefore, the weather forecasting problem is well suited to heuristic approaches, such as evolutionary algorithms. Experimentation with heuristic methods like the Particle Swarm Optimization (PSO) algorithm can lead to the development of new insights or promising models that can be fine-tuned with more focused techniques. This paper describes a PSO approach for the analysis and prediction of data and provides experimental results of the aforementioned method on real-world meteorological time series.
Keywords: Weather, Climate, PSO, Prediction, Meteorological
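A minimal global-best PSO sketch, fitting a simple seasonal temperature model to hypothetical monthly readings; the model form, data and PSO settings are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical monthly mean temperatures (two years of observations)
t = np.arange(24)
observed = 18 + 8 * np.sin(2 * np.pi * t / 12 + 0.6) + rng.normal(0, 0.8, t.size)

def model(params, t):
    mean, amp, phase = params
    return mean + amp * np.sin(2 * np.pi * t / 12 + phase)

def cost(params):
    return np.mean((model(params, t) - observed) ** 2)

# Standard global-best PSO over (mean, amplitude, phase)
n_particles, n_iter, w, c1, c2 = 30, 200, 0.7, 1.5, 1.5
pos = rng.uniform([0, 0, -np.pi], [40, 20, np.pi], (n_particles, 3))
vel = np.zeros_like(pos)
pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 3))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    costs = np.array([cost(p) for p in pos])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print("fitted (mean, amplitude, phase):", gbest.round(3))
print("next-month prediction:", model(gbest, np.array([24]))[0].round(2))
```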
1414 Prediction of Natural Gas Viscosity using Artificial Neural Network Approach
Authors: E. Nemati Lay, M. Peymani, E. Sanjari
Abstract:
The viscosity of natural gas is an important parameter in energy industries such as natural gas storage and transportation. In this study, the viscosity of different natural gas compositions is modelled using an artificial neural network (ANN) based on the back-propagation method. A reliable database including more than 3841 experimental viscosity data points is used for training and testing the ANN. The designed neural network can predict the natural gas viscosity from the pseudo-reduced pressure and pseudo-reduced temperature with an average absolute relative deviation (AARD) of 0.221%. The accuracy of the designed ANN has been compared to other published empirical models; the comparison indicates that the proposed method provides accurate results.
Keywords: Artificial neural network, Empirical correlation, Natural gas, Viscosity
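A sketch of the ANN regression described above, using scikit-learn's MLPRegressor on pseudo-reduced pressure and temperature; the synthetic data and network sizes are placeholders for the paper's 3841-point experimental database and tuned architecture.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)

# Synthetic stand-in for the experimental database: pseudo-reduced pressure and
# temperature as inputs, a smooth synthetic viscosity (cP) as the target
n = 2000
ppr = rng.uniform(0.2, 15.0, n)
tpr = rng.uniform(1.05, 3.0, n)
mu = 0.01 * (1 + 0.08 * ppr) / tpr**0.7 + rng.normal(0, 2e-4, n)

X = np.column_stack((ppr, tpr))
X_tr, X_te, y_tr, y_te = train_test_split(X, mu, test_size=0.2, random_state=0)

ann = MLPRegressor(hidden_layer_sizes=(20, 10), activation="tanh",
                   max_iter=5000, random_state=0)
ann.fit(X_tr, y_tr)

pred = ann.predict(X_te)
aard = 100 * np.mean(np.abs((pred - y_te) / y_te))   # average absolute relative deviation, %
print(f"AARD% on held-out data: {aard:.3f}")
```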
1413 Application and Assessment of Artificial Neural Networks for Biodiesel Iodine Value Prediction
Authors: Raquel M. de Sousa, Sofiane Labidi, Allan Kardec D. Barros, Alex O. Barradas Filho, Aldalea L. B. Marques
Abstract:
Several parameters are established in order to measure biodiesel quality. One of them is the iodine value, an important parameter that measures the total unsaturation within a mixture of fatty acids. Limiting unsaturated fatty acids is necessary since heating larger quantities of them results either in the formation of deposits inside the engine or in damage to the lubricant. Since determination of the iodine value by the official procedure tends to be very laborious, with high costs and toxic reagents, this study uses artificial neural networks (ANN) to predict the iodine value as an alternative to these problems. The network development methodology used 13 fatty acid esters as inputs, and back-propagation convergence algorithms were optimized in order to obtain an architecture for iodine value prediction. This study demonstrates the ability of neural networks to learn the correlation between biodiesel quality properties, in this case the iodine value, and the molecular structures that make it up. The model developed in the study reached a correlation coefficient (R) of 0.99 for both network validation and network simulation, with the Levenberg-Marquardt algorithm.
Keywords: Artificial Neural Networks, Biodiesel, Iodine Value, Prediction.
1412 Foreign Real Estate Investment and the Australian Residential Property Market: A Study on Chinese Investors
Authors: Peng Yew Wong
Abstract:
House prices in the Australian capital cities reached record levels subsequent to the 2008 Global Financial Crisis (GFC), and many believed that foreign investors, especially Chinese investors, were the main reason for the escalation of house prices in the Australian capital cities. This research conducted cross-border semi-structured interviews in Shanghai, China, to uncover historical evidence and emerging trends supporting the existence of a significant relationship between overseas investors and residential housing market performance in Australia subsequent to the 2008 GFC. Some unique investment strategies of private investors from China that emphasise non-capitalist factors such as early education were identified, alongside insights into the significant Chinese government policies that have incentivised cross-border investment from China. It is believed that this understanding will assist policy makers to effectively manage the overheated Australian residential property market without compromising the steady flow of foreign real estate investment (FREI).
Keywords: Australian housing market, residential property, foreign real estate investment, education, China investor.
1411 High Capacity Data Hiding based on Predictor and Histogram Modification
Authors: Hui-Yu Huang, Shih-Hsu Chang
Abstract:
In this paper, we propose a high-capacity image hiding technique based on pixel prediction and the difference of the modified histogram. The approach uses pixel prediction and the difference of the modified histogram to calculate the best embedding point; it improves the prediction accuracy and increases the pixel difference so as to raise the hiding capacity. We also use histogram modification to prevent overflow and underflow. Experimental results demonstrate that, at the same average hiding capacity, our proposed method still maintains high image quality and low distortion.
Keywords: data hiding, predictor
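To illustrate the general idea, the sketch below embeds a bit stream by shifting the prediction-error histogram around its peak, using a left-neighbour predictor; it is a simplified single-peak scheme without overflow handling, not the paper's exact algorithm.

```python
import numpy as np

def embed_histogram_shift(image, bits, peak=None):
    """Embed a bit sequence by shifting the prediction-error histogram.
    Each pixel is predicted by its (already processed) left neighbour; errors
    equal to the histogram peak carry one payload bit, larger errors are shifted
    by one to open a gap. No overflow/underflow handling in this sketch."""
    img = image.astype(np.int32)
    stego = img.copy()
    if peak is None:                                   # choose the embedding point
        vals, counts = np.unique(img[:, 1:] - img[:, :-1], return_counts=True)
        peak = int(vals[counts.argmax()])
    bit_iter = iter(bits)
    for r in range(img.shape[0]):
        for c in range(1, img.shape[1]):               # first column is left unchanged
            pred = stego[r, c - 1]
            e = int(img[r, c]) - int(pred)
            if e > peak:
                e += 1                                  # shift to make room next to the peak
            elif e == peak:
                e += next(bit_iter, 0)                  # peak errors carry the payload
            stego[r, c] = pred + e
    return np.clip(stego, 0, 255).astype(np.uint8), peak

# Hypothetical usage on a random 8-bit "image"
rng = np.random.default_rng(4)
cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
stego, peak = embed_histogram_shift(cover, [1, 0, 1, 1, 0, 0, 1])
print("embedding point (peak error):", peak)
print("pixels changed:", int((stego != cover).sum()))
```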
1410 Prediction of Saturated Hydraulic Conductivity Dynamics in an Iowan Agriculture Watershed
Authors: Mohamed Elhakeem, A. N. Thanos Papanicolaou, Christopher Wilson, Yi-Jia Chang
Abstract:
In this study, a physically based modeling framework was developed to predict saturated hydraulic conductivity (Ksat) dynamics in the Clear Creek Watershed (CCW), Iowa. The modeling framework integrated selected pedotransfer functions and watershed models with geospatial tools. A number of pedotransfer functions and agricultural watershed models were examined to select the models that best represent the study site conditions. Model selection was based on statistical measures of the models' errors compared to the Ksat field measurements conducted in the CCW under different soil, climate and land use conditions. The study has shown that the predictions of the combined pedotransfer function of Rosetta and the Water Erosion Prediction Project (WEPP) provided the best agreement with the measured Ksat values in the CCW compared to the other tested models. Therefore, Rosetta and WEPP were integrated with Geographic Information System (GIS) tools for visualization of the data in the form of geospatial maps and prediction of Ksat variability in the CCW due to seasonal changes in climate and land use activities.
Keywords: Saturated hydraulic conductivity, pedotransfer functions, watershed models, geospatial tools.
1409 Underpricing of IPOs during Hot and Cold Market Periods on the South African Stock Exchange (JSE)
Authors: Brownhilder N. Neneh, A. Van Aardt Smit
Abstract:
Underpricing is one anomaly in the initial public offering (IPO) literature that has been widely observed across different stock markets, with different trends emerging over different time periods. This study seeks to determine how IPOs on the JSE performed on the first day, first week and first month over the period 1996-2011. Underpricing trends are documented for both hot and cold market periods in terms of four main sectors (cyclical, defensive, growth and interest rate sensitive stocks). Using a sample of 360 listed companies on the JSE, the empirical findings establish that IPOs on the JSE are significantly underpriced, with an average market-adjusted first-day return of 62.9%. It is also established that hot market IPOs on the JSE are more underpriced than cold market IPOs. Also observed is the fact that as the offer price per share increases above the median price for any given period, the level of underpricing decreases substantially. While significant differences exist in the level of underpricing of IPOs in the four sectors in hot and cold market periods, interest rate sensitive stocks showed a different trend from the other sectors and thus require further investigation to uncover this pattern.
Keywords: Underpricing, hot and cold markets, South Africa, JSE.
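A short sketch of the market-adjusted first-day return underlying the underpricing measure; the listing and index figures are hypothetical, and the simple subtraction form of market adjustment is assumed.

```python
import pandas as pd

# Hypothetical listings: offer price, first-day close and the market index level
# on the offer date and at the end of the first trading day
ipos = pd.DataFrame({
    "offer_price": [10.0, 5.5, 22.0],
    "close_day1":  [14.8, 6.1, 25.3],
    "index_offer": [31250.0, 30980.0, 32410.0],
    "index_day1":  [31420.0, 31050.0, 32300.0],
})

raw_return = ipos["close_day1"] / ipos["offer_price"] - 1
market_return = ipos["index_day1"] / ipos["index_offer"] - 1
ipos["mar_day1"] = raw_return - market_return        # market-adjusted first-day return

print(ipos[["offer_price", "mar_day1"]])
print(f"average market-adjusted first-day return: {ipos['mar_day1'].mean():.1%}")
```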
1408 Money Laundering and Financing of Terrorism
Authors: C. Mallada Fernández
Abstract:
Economic development and the globalization of international markets have created a favourable atmosphere for the emergence of new forms of crime such as money laundering or the financing of terrorism, which may contribute to destabilizing and damaging economic systems. In particular, money laundering has acquired great importance since the 9/11 attacks, which has led, on the one hand, to the establishment and development of preventive measures and, on the other hand, to a progressive hardening of penal measures. Since then, the regulations imposed to fight money laundering have been viewed as key components also in the fight against terrorist financing. Terrorism was, at the beginning, a "national" crime connected with internal problems of the State (for instance the RAF in Germany or ETA in Spain), but over the last 20 years it has become an international problem connected with the defence and security of States. Therefore, the new strategic concept for the defence and security of NATO includes a comprehensive list of security threats to the Alliance, such as terrorism, international instability, money laundering and attacks on cyberspace, among others. With this new concept, money laundering and terrorism have become a priority in national defence.
In this work, we analyse the methods to combat these new threats to national security. We study the preventive legislation to combat money laundering and the financing of terrorism, the financial intelligence units (FIUs) that exchange information between States, and hawala banking.
Keywords: Control of financial flows, money laundering, terrorism, financing of terrorism.
1407 Determining the Width and Depths of Cut in Milling on the Basis of a Multi-Dexel Model
Authors: Jens Friedrich, Matthias A. Gebele, Armin Lechler, Alexander Verl
Abstract:
Chatter vibrations and process instabilities are the most important factors limiting the productivity of the milling process. Chatter can lead to damage to the tool, the part or the machine tool. Therefore, the estimation and prediction of process stability is very important. The process stability depends on the spindle speed, the depth of cut and the width of cut. In milling, the process conditions are defined in the NC program. While the spindle speed is directly coded in the NC program, the depth and width of cut are unknown. This paper presents a new simulation-based approach for the prediction of the depth and width of cut of a milling process. The prediction is based on a material removal simulation with an analytically represented tool shape and a multi-dexel approach for the workpiece. The new calculation method allows the direct estimation of the depth and width of cut, which are the parameters influencing process stability, instead of the removed volume as existing approaches do. This knowledge can be used to predict the stability of new, unknown parts. Moreover, with an additional vibration sensor, the stability lobe diagram of a milling process can be estimated and improved based on the estimated depth and width of cut.
Keywords: Dexel, process stability, material removal, milling.
1406 Financial Burden of Family for the Children with Autism Spectrum Disorder
Authors: M. R. Bhuiyan, S. M. M. Hossain, M. Z. Islam
Abstract:
Autism Spectrum Disorder (ASD) is the fastest growing serious developmental disorder, characterized by social deficits, communicative difficulties and repetitive behaviors. ASD is an emerging public health issue globally and is associated with a huge financial burden on the family, the community and the nation. The aim of this study was to assess the financial burden on families of children with ASD. This cross-sectional study was carried out from July 2015 to June 2016 among 154 children with ASD. Data were collected by face-to-face interviews with a semi-structured questionnaire, following a systematic random sampling technique. The majority (73.4%) of the children were male, and the mean (±SD) age was 6.66 ± 2.97 years. Most (88.8%) of the children were from urban areas, with an average monthly family income of Tk. 41785.71 ± 23936.45. The average monthly direct cost for the children was Tk. 17656.49 ± 9984.35, the indirect cost was Tk. 13462.90 ± 9713.54, and the total treatment cost was Tk. 23076.62 ± 15341.09. Special education (Tk. 4871.00), therapy (Tk. 4124.07) and travel (Tk. 3988.31) were the major types of direct cost, while loss of income (Tk. 14570.18) was the chief indirect cost incurred by the families. The study found that the majority (59.8%) of the children attending special schools incurred Tk. 20001-78700 as total treatment cost, which was statistically significant (p<0.001). Again, families with higher monthly income incurred higher treatment costs (r=0.526, p<0.05). The difference between the mean direct and indirect costs was significant (t=4.190, df=61, p<0.001). According to the analysis of variance, the mean differences by father's educational status in direct cost (F=10.337, p<0.001) and total treatment cost (F=7.841, p<0.001) were statistically significant. The study revealed that most children with ASD were under five years of age and three-fourths were male; according to monthly family income, most families were middle class. The study recommends cost-effective interventions and financial safety-net measures to reduce the financial burden on families of children with ASD.
Keywords: Autism spectrum disorder, financial burden, direct cost, indirect cost, Special education.
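The reported comparisons can be reproduced mechanically with standard tests; the sketch below uses simulated costs shaped like the summary statistics above (not the survey data) to show the paired t-test, Pearson correlation and one-way ANOVA steps.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Simulated per-family monthly costs (Tk.) shaped like the reported summaries
n = 62
direct = rng.normal(17656, 9984, n).clip(min=0)
indirect = rng.normal(13463, 9713, n).clip(min=0)
income = rng.normal(41786, 23936, n).clip(min=5000)
total = direct + indirect

# Paired t-test between mean direct and indirect cost (paper: t=4.190, df=61)
t_stat, p_val = stats.ttest_rel(direct, indirect)
print(f"paired t-test: t={t_stat:.3f}, p={p_val:.4f}")

# Correlation between family income and total treatment cost (paper: r=0.526)
r, p = stats.pearsonr(income, total)
print(f"Pearson r={r:.3f}, p={p:.4f}")

# One-way ANOVA of direct cost across (hypothetical) father's-education groups
f_stat, p_anova = stats.f_oneway(direct[:20], direct[20:40], direct[40:])
print(f"ANOVA: F={f_stat:.3f}, p={p_anova:.4f}")
```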
1405 Investigation of Some Technical Indexes in Stock Forecasting Using Neural Networks
Authors: Myungsook Klassen
Abstract:
Training neural networks to capture an intrinsic property of a large volume of high-dimensional data is a difficult task, as the training process is computationally expensive. Input attributes should be carefully selected to keep the dimensionality of the input vectors relatively small. Technical indexes commonly used for stock market prediction with neural networks are investigated to determine their effectiveness as inputs. A feedforward neural network trained with the Levenberg-Marquardt algorithm is applied to perform one-step-ahead forecasting of NASDAQ and Dow stock prices.
Keywords: Stock Market Prediction, Neural Networks, Levenberg-Marquardt Algorithm, Technical Indexes
1404 Knowledge Based Model for Power Transformer Life Cycle Management Using Knowledge Engineering
Authors: S. S. Bhandari, N. Chakpitak, K. Meksamoot, T. Chandarasupsang
Abstract:
Under the limitation of the investment budget, a utility company is required to maximize the utilization of its existing assets during their life cycle while satisfying both engineering and financial requirements. However, utilities often lack knowledge about the status of each asset in the portfolio, in terms of both technical and financial value. This paper presents a knowledge-based model that helps utility companies make optimal decisions on power transformer utilization. The CommonKADS methodology, a structured approach to the representation of knowledge and expertise, is utilized for designing and developing the knowledge-based model. A case study of a one-MVA power transformer of the Nepal Electricity Authority is presented. The results show that reusable knowledge can be categorized, modeled and utilized within the utility company using the proposed methodologies. Moreover, the results show that the utility company can achieve both engineering and financial benefits from its utilization.
Keywords: CommonKADS, Knowledge Engineering, Life Cycle Management, Power Transformer.
1403 An Integrative Bayesian Approach to Supporting the Prediction of Protein-Protein Interactions: A Case Study in Human Heart Failure
Authors: Fiona Browne, Huiru Zheng, Haiying Wang, Francisco Azuaje
Abstract:
Recent years have seen a growing trend towards the integration of multiple information sources to support large-scale prediction of protein-protein interaction (PPI) networks in model organisms. Despite advances in computational approaches, the combination of multiple "omic" datasets representing the same type of data, e.g. different gene expression datasets, has not been rigorously studied. Furthermore, there is a need to further investigate the inference capability of powerful approaches, such as fully-connected Bayesian networks, in the context of the prediction of PPI networks. This paper addresses these limitations by proposing a Bayesian approach to integrate multiple datasets, some of which encode the same type of "omic" data, to support the identification of PPI networks. The case study reported involved the combination of three gene expression datasets relevant to human heart failure (HF). In comparison with two traditional methods, the naive Bayesian and maximum likelihood ratio approaches, the proposed technique can accurately identify known PPIs and can be applied to infer potentially novel interactions.
Keywords: Bayesian network, Classification, Data integration, Protein interaction networks.
1402 Factors of Non-Conformity Behavior and the Emergence of a Ponzi Game in the Riba-Free (Interest-Free) Banking System of Iran
Authors: Amir Hossein Ghaffari Nejad, Forouhar Ferdowsi, Reza Mashhadi
Abstract:
In the interest-free banking system of Iran, the savings of society take the form of bank deposits, and banks, using Islamic contracts, allocate these resources to applicants for facilities and credit. In the meantime, the central bank, with the aim of implementing monetary policy, determines the maximum interest rate on bank deposits in line with macroeconomic requirements. In recent years, however, the country's economic constraints, including stagflation and the institutional weaknesses of Iran's financial market, have resulted in massive disturbances in the balance sheet of the banking system, leading to a period of maturity mismatch between the banks' assets and liabilities and the emergence of a Ponzi game. As a consequence, the interest rate set in long-term bank deposit contracts came to exceed the maximum rate set by the central bank. The result of this condition was the allocation of newly mobilized resources to meet past commitments towards old depositors; as a result, a significant part of the supply of funds leaked out of the financing cycle and a credit crunch emerged. The purpose of this study is to identify the most important factors affecting the occurrence of this non-conformity behavior, using data from 19 public and private banks of Iran. For this purpose, the causes of the banks' non-conformity behavior have been investigated using the panel vector autoregression (PVAR) method for the period 2007-2015. Granger causality test results suggest that the return of parallel markets for bank deposits, non-performing loans and the high ratio of facilities to banks' deposits are all causes of the formation of non-conformity behavior. Also, according to the results of the impulse response functions and variance decomposition, non-performing loans (NPL) and the ratio of facilities to deposits have the largest long-term effect and the highest contribution to explaining the changes in banks' non-conformity behavior in determining the interest rate on deposits.
Keywords: Non-conformity behavior, Ponzi game, panel vector autoregression, nonperforming loans.
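A simplified, aggregate (non-panel) Granger causality check in the spirit of the analysis above; the quarterly series are synthetic, and the paper itself uses a panel VAR across the 19 banks.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(6)

# Synthetic quarterly series: non-performing-loan ratio and the average deposit rate,
# constructed so that the rate responds to the lagged NPL ratio
n = 36
npl = 0.08 + np.cumsum(rng.normal(0, 0.004, n))
deposit_rate = 0.15 + 1.5 * np.concatenate(([0.08], npl[:-1])) + rng.normal(0, 0.01, n)
series = pd.DataFrame({"deposit_rate": deposit_rate, "npl": npl})

# Does NPL Granger-cause the deposit rate? (column order: effect first, cause second)
result = grangercausalitytests(series[["deposit_rate", "npl"]], maxlag=2)
p_value = result[2][0]["ssr_ftest"][1]
print(f"p-value for 'NPL Granger-causes deposit rate' at 2 lags: {p_value:.4f}")
```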
1401 Development of Maximum Entropy Method for Prediction of Droplet-size Distribution in Primary Breakup Region of Spray
Authors: E. Movahednejad, F. Ommi
Abstract:
Droplet size distributions in the cold spray of a fuel are important for the observed combustion behavior. Specification of droplet size and velocity distributions immediately downstream of injectors is also essential as boundary conditions for advanced computational fluid dynamics (CFD) and two-phase spray transport calculations. This paper describes the development of a new model to be incorporated into the maximum entropy principle (MEP) formalism for the prediction of the droplet size distribution in the droplet formation region. The MEP approach can predict the most likely droplet size and velocity distributions under a set of constraints expressing the available information related to the distribution. In this article, by considering the mechanisms of turbulence generation inside the nozzle and wave growth on the jet surface, an attempt is made to provide a logical framework coupling the flow inside the nozzle to the resulting atomization process. The purpose of this paper is to describe the formulation of this new model and to incorporate it into the maximum entropy principle by coupling the sub-models together using source terms of momentum and energy. Comparison between the model predictions and experimental data for a gas turbine swirling nozzle and an annular spray indicates good agreement between model and experiment.
Keywords: Droplet, instability, Size Distribution, Turbulence, Maximum Entropy
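In general form, the maximum entropy formalism referred to above selects the distribution that maximizes Shannon entropy subject to the constraints supplied by the sub-models; a generic sketch (with the specific constraints g_i left abstract) is:

```latex
\begin{align*}
  \max_{f}\quad & S = -\int_{0}^{\infty}\!\!\int_{0}^{\infty} f(D,U)\,\ln f(D,U)\, \mathrm{d}D\,\mathrm{d}U \\
  \text{s.t.}\quad & \int_{0}^{\infty}\!\!\int_{0}^{\infty} f(D,U)\, g_i(D,U)\, \mathrm{d}D\,\mathrm{d}U = \langle g_i \rangle ,
  \qquad i = 0,1,\dots,m ,
\end{align*}
\begin{equation*}
  f(D,U) = \exp\!\Big( -\lambda_0 - \sum_{i=1}^{m} \lambda_i\, g_i(D,U) \Big).
\end{equation*}
```

Here D and U denote droplet diameter and velocity, and the Lagrange multipliers λ_i are fixed by substituting the exponential solution back into the constraints, which in the paper are built from the momentum and energy source terms of the coupled sub-models.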
1400 Climate Related Financial Risk for Automobile Industry and Impact to Financial Institutions
Authors: S. Mahalakshmi, B. Senthil Arasu
Abstract:
As per the recent changes in global policies, climate-related change and the impact it causes across every sector are viewed as green swan events: in essence, climate-related changes can happen often and lead to risk and considerable uncertainty, and they need to be mitigated rather than treated as black swan events. This raises the question of how this risk can be computed, so that financial institutions can plan to mitigate it. Climate-related changes affect all risk types: credit risk, market risk, operational risk, liquidity risk, reputational risk and others. The models required to compute this have to consider the different industrial needs of the counterparty, as well as the contributing factors, whether in the form of different risk drivers, different transmission channels, different approaches, or the granularity of available data. This suggests that climate-related change, although it affects Pillar I risks, should be treated as a Pillar II risk. It has to be modeled specifically based on the financial institution's actual exposure to different industries, instead of generalizing the risk charge, and it will have to be held as additional capital by the financial institution, on top of its Pillar I risks and its existing Pillar II risks. In this paper, we present a risk assessment framework to model and assess climate change risks, for both credit and market risk. The framework helps in assessing different scenarios and how the different transition risks affect the risk associated with the different parties. The paper considers the increase in the concentration of greenhouse gases, which in turn causes global warming. It then considers various scenarios in which different risk drivers impact the credit and market risk of an institution, by understanding the transmission channels and also considering the transition risk. The paper then focuses on an industry that is fast seeing disruption: the automobile industry. The framework is used to show how climate change and changes to the relevant policies have impacted the entire financial institution. Appropriate statistical models for forecasting, anomaly detection and scenario modeling are built to demonstrate how the framework can be used by the relevant agencies to understand their financial risks. The paper also focuses on the climate risk calculation for the Pillar II capital requirement, and on why it makes sense for a bank to maintain this in addition to its regular Pillar I and Pillar II capital.
Keywords: Capital calculation, climate risk, credit risk, pillar II risk, scenario modeling.
1399 An Improved Model for Prediction of the Effective Thermal Conductivity of Nanofluids
Authors: K. Abbaspoursani, M. Allahyari, M. Rahmani
Abstract:
Thermal conductivity is an important characteristic of a nanofluid in laminar flow heat transfer. This paper presents an improved model for the prediction of the effective thermal conductivity of nanofluids based on dimensionless groups. The model expresses the thermal conductivity of a nanofluid as a function of the thermal conductivities of the solid and the liquid, their volume fractions and the particle size. The proposed model includes a parameter which accounts for the interfacial shell, Brownian motion and particle aggregation. The model is validated against experimental results for TiO2-water and Al2O3-water nanofluids.
Keywords: Critical particle size, nanofluid, model, thermal conductivity.
1398 Predicting the Impact of the Defect on the Overall Environment in Function Based Systems
Authors: Parvinder S. Sandhu, Urvashi Malhotra, E. Ardil
Abstract:
A lot of work has been done on predicting the fault proneness of software systems. However, the severity of the faults is more important than the number of faults existing in the developed system, since major faults matter most to a developer and need immediate attention. In this paper, we try to predict the level of impact of the existing faults in software systems. Neuro-fuzzy based predictor models are applied to NASA's public domain defect dataset, coded in the C programming language. Correlation-based Feature Selection (CFS) evaluates the worth of a subset of attributes by considering the individual predictive ability of each feature along with the degree of redundancy between them, so CFS is used for selecting the metrics that are most highly correlated with the level of severity of faults. The results are compared with the prediction results of Logistic Model Trees (LMT), which were earlier quoted as the best technique in [17]. The results are recorded in terms of accuracy, Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE). They show that the neuro-fuzzy based model provides relatively better prediction accuracy than the other models and can hence be used for modeling the level of impact of faults in function-based systems.
Keywords: Software Metrics, Fuzzy, Neuro-Fuzzy, Software Faults, Accuracy, MAE, RMSE.
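A sketch of the CFS merit used for metric selection; absolute Pearson correlation stands in for the symmetric-uncertainty measure CFS normally uses, and the metric names and data are hypothetical.

```python
import numpy as np
import pandas as pd

def cfs_merit(data: pd.DataFrame, features, target):
    """CFS merit of a feature subset:
    merit = k * mean|feature-class corr| / sqrt(k + k(k-1) * mean|feature-feature corr|)."""
    k = len(features)
    r_cf = np.mean([abs(data[f].corr(data[target])) for f in features])
    if k == 1:
        return r_cf
    r_ff = np.mean([abs(data[f1].corr(data[f2]))
                    for i, f1 in enumerate(features) for f2 in features[i + 1:]])
    return k * r_cf / np.sqrt(k + k * (k - 1) * r_ff)

# Hypothetical software metrics with a numeric severity-level target
rng = np.random.default_rng(7)
df = pd.DataFrame({"loc": rng.normal(200, 50, 300),
                   "cyclomatic": rng.normal(12, 4, 300),
                   "halstead_effort": rng.normal(5000, 1500, 300)})
df["severity"] = 0.01 * df["loc"] + 0.3 * df["cyclomatic"] + rng.normal(0, 1, 300)

# Compare candidate metric subsets by their CFS merit
for subset in (["loc"], ["loc", "cyclomatic"], ["loc", "cyclomatic", "halstead_effort"]):
    print(subset, round(cfs_merit(df, subset, "severity"), 3))
```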
1397 Applications of Prediction and Identification Using Adaptive DCMAC Neural Networks
Authors: Yu-Lin Liao, Ya-Fu Peng
Abstract:
An adaptive dynamic cerebellar model articulation controller (DCMAC) neural network for solving prediction and identification problems is proposed in this paper. The proposed DCMAC has superior capability to the conventional cerebellar model articulation controller (CMAC) neural network in terms of efficient learning mechanism, guaranteed system stability and dynamic response. A recurrent network is embedded in the DCMAC by adding feedback connections in the association memory space so that the DCMAC captures the dynamic response, with the feedback units acting as memory elements. The dynamic gradient descent method is adopted to adjust the DCMAC parameters on-line. Moreover, an analytical method based on a Lyapunov function is proposed to determine the learning rates of the DCMAC, so that variable optimal learning rates are derived to achieve the most rapid convergence of the identification error. Finally, the adaptive DCMAC is applied in two computer simulations. Simulation results show that an accurate identification response and superior dynamic performance can be obtained thanks to the powerful on-line learning capability of the proposed DCMAC.
Keywords: adaptive, cerebellar model articulation controller, CMAC, prediction, identification
1396 Mixtures of Monotone Networks for Prediction
Authors: Marina Velikova, Hennie Daniels, Ad Feelders
Abstract:
In many data mining applications, it is known a priori that the target function should satisfy certain constraints imposed by, for example, economic theory or a human decision maker. In this paper we consider partially monotone prediction problems, where the target variable depends monotonically on some of the input variables but not on all. We propose a novel method to construct prediction models in which monotone dependences with respect to some of the input variables are preserved by virtue of construction. Our method belongs to the class of mixture models. The basic idea is to convolute monotone neural networks with weight (kernel) functions to make predictions. Using simulation and real case studies, we demonstrate the application of our method. To obtain a sound assessment of the performance of our approach, we use standard neural networks with weight decay and partially monotone linear models as benchmark methods for comparison. The results show that our approach outperforms partially monotone linear models in terms of accuracy. Furthermore, the incorporation of partial monotonicity constraints not only leads to models that are in accordance with the decision maker's expertise, but also considerably reduces the model variance in comparison to standard neural networks with weight decay.
Keywords: mixture models, monotone neural networks, partially monotone models, partially monotone problems.
1395 Prediction of Basic Wind Speed for Ayeyarwady
Authors: Chaw Su Mon
Abstract:
The paper presents a preliminary study on the modeling and estimation of basic wind speed (extreme wind gusts) for the consideration of vulnerability and the design of buildings in the Ayeyarwady Region. The establishment of appropriate design wind speeds is a critical step towards the calculation of design wind loads for structures. The extreme value analysis in this prediction work is based on the anemometer data (1970-2009) maintained by the Department of Meteorology and Hydrology, Pathein. Statistical and probabilistic approaches are used to derive formulas for estimating 3-second gusts from the recorded data (10-minute sustained mean wind speeds).
Keywords: Basic Wind Speed, Building, Gusts, Statistical and probabilistic approaches
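The paper does not spell out its statistical procedure; the sketch below shows one common extreme-value approach, fitting a Gumbel distribution to (synthetic) annual maximum gusts and reading off return-period wind speeds.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)

# Synthetic annual maximum 3-second gusts (m/s), one value per year 1970-2009
annual_max_gust = rng.gumbel(loc=28.0, scale=4.5, size=40)

# Fit a Gumbel (Type I extreme value) distribution to the annual maxima
loc, scale = stats.gumbel_r.fit(annual_max_gust)

# Basic wind speed for return period T: the (1 - 1/T) quantile of the annual maxima
for T in (50, 100):
    v_T = stats.gumbel_r.ppf(1 - 1 / T, loc=loc, scale=scale)
    print(f"{T}-year basic wind speed: {v_T:.1f} m/s")
```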