Search results for: empirical measure
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5515


5425 Kou Jump Diffusion Model: An Application to the S&P 500, Nasdaq 100 and Russell 2000 Index Options

Authors: Wajih Abbassi, Zouhaier Ben Khelifa

Abstract:

The present research aims at the empirical validation of three option valuation models: the ad-hoc Black-Scholes model as proposed by Berkowitz (2001), the constant elasticity of variance model of Cox and Ross (1976) and the Kou jump-diffusion model (2002). Our empirical analysis has been conducted on a sample of 26,974 options written on three indexes, the S&P 500, Nasdaq 100 and Russell 2000, that were traded during the year 2007, just before the sub-prime crisis. We start by presenting the theoretical foundations of the models of interest. Then we use the trust-region-reflective algorithm to estimate the structural parameters of these models from a cross-section of option prices. The empirical analysis shows the superiority of the Kou jump-diffusion model. This superiority arises from the ability of this model to portray the behavior of market participants and to be closest to the true distribution that characterizes the evolution of these indices. Indeed, the double-exponential jump distribution exhibits three interesting properties: the leptokurtic feature, the memoryless property and the psychological aspect of market participants. Numerous empirical studies have shown that markets tend to overreact to good news and underreact to bad news. Despite these advantages, there are few empirical studies based on this model, partly because its probability distribution and option valuation formula are rather complicated. This paper is the first to use nonlinear curve fitting through the trust-region-reflective algorithm on cross-sections of option prices to estimate the structural parameters of the Kou jump-diffusion model.
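
To make the calibration step concrete, the following is a minimal sketch (not the authors' code) of nonlinear curve fitting of a pricing model to a cross-section of option prices with SciPy's trust-region-reflective solver. For brevity it fits a stand-in Black-Scholes price rather than the Kou (2002) formula, and all quotes are synthetic.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.stats import norm

def model_price(params, S, K, T, r):
    # Stand-in pricing function (Black-Scholes call); a real calibration of the
    # Kou jump-diffusion model would use its closed-form formula here instead.
    sigma = params[0]
    d1 = (np.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

# Synthetic cross-section of option quotes (illustrative numbers only)
S, r = 100.0, 0.02
K = np.linspace(80.0, 120.0, 9)
T = np.full_like(K, 0.5)
market_prices = model_price([0.25], S, K, T, r)   # pretend these are observed quotes

def residuals(params):
    return model_price(params, S, K, T, r) - market_prices

fit = least_squares(residuals, x0=[0.4], bounds=([1e-4], [2.0]), method="trf")
print(fit.x)   # recovers the volatility used to generate the quotes (about 0.25)
```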

Keywords: jump-diffusion process, Kou model, Leptokurtic feature, trust-region-reflective algorithm, US index options

Procedia PDF Downloads 407
5424 A Similarity/Dissimilarity Measure to Biological Sequence Alignment

Authors: Muhammad A. Khan, Waseem Shahzad

Abstract:

Analysis of protein sequences is carried out to discover their structural and ancestral relationships. Sequence similarity points to similar protein structures and functions and supports homology detection. Biological sequences composed of amino acid residues or nucleotides provide significant information through sequence alignment. In this paper, we present a new similarity/dissimilarity measure for sequence alignment based on the primary structure of a protein. The approach finds the distance between two given sequences using the novel sequence alignment algorithm and a mathematical model. The algorithm runs in O(n²) time. A distance matrix is generated to construct a phylogenetic tree of different species. The new similarity/dissimilarity measure outperforms other existing methods.
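
As an illustration of the kind of O(n²) dynamic-programming computation involved, the sketch below computes a generic alignment-based distance and a pairwise distance matrix. It is not the paper's novel measure; the sequences and costs are made up.

```python
def alignment_distance(a, b, gap=1, mismatch=1):
    """Generic O(n*m) dynamic-programming alignment/edit distance between two
    residue strings; illustrative only, not the paper's proposed measure."""
    n, m = len(a), len(b)
    d = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * gap
    for j in range(1, m + 1):
        d[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0 if a[i - 1] == b[j - 1] else mismatch
            d[i][j] = min(d[i - 1][j] + gap,       # deletion
                          d[i][j - 1] + gap,       # insertion
                          d[i - 1][j - 1] + cost)  # match / substitution
    return d[n][m]

# Pairwise distance matrix, the usual input for building a phylogenetic tree
seqs = {"human": "MKTAYIAK", "mouse": "MKTAYLAK", "yeast": "MSTQYIGK"}
names = list(seqs)
matrix = [[alignment_distance(seqs[x], seqs[y]) for y in names] for x in names]
print(names, matrix)
```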

Keywords: alignment, distance, homology, mathematical model, phylogenetic tree

Procedia PDF Downloads 157
5423 New Approaches for the Handwritten Digit Image Features Extraction for Recognition

Authors: U. Ravi Babu, Mohd Mastan

Abstract:

The present paper proposes a novel approach to handwritten digit recognition. It extracts digit image features based on a distance measure and derives an algorithm to classify the digit images. The distance measure is computed on the thinned image; thinning is one of the preprocessing techniques in image processing. The paper mainly concentrates on the extraction of features from the digit image for effective recognition of the numeral. To assess its effectiveness, the proposed method was tested on the MNIST, CENPARMI and CEDAR databases and on newly collected data. The proposed method was applied to more than one lakh (100,000) digit images and obtains good comparative recognition results, achieving a recognition rate of about 97.32%.

Keywords: handwritten digit recognition, distance measure, MNIST database, image features

Procedia PDF Downloads 439
5422 Coverage Probability Analysis of WiMAX Network under Additive White Gaussian Noise and Predicted Empirical Path Loss Model

Authors: Chaudhuri Manoj Kumar Swain, Susmita Das

Abstract:

This paper explores a detailed procedure for predicting a path loss (PL) model and its application in estimating the coverage probability of a WiMAX network. For this, a hybrid approach is followed in predicting an empirical PL model of a 2.65 GHz WiMAX network deployed in a suburban environment. Data collection, statistical analysis and regression analysis are the phases of operation incorporated in this approach, and the importance of each phase is discussed. The procedure for collecting data such as the received signal strength indicator (RSSI) through an experimental setup is demonstrated. From the collected data set, empirical PL and RSSI models are predicted with regression techniques. Furthermore, with the aid of the predicted PL model, essential parameters such as the PL exponent and the coverage probability of the network are evaluated. This work may significantly assist in the deployment and optimisation of cellular networks.
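
A minimal sketch of the regression phase, assuming a standard log-distance path loss model and hypothetical drive-test samples; the 2.65 GHz measurement campaign itself is not reproduced here, and the transmit power is an assumed value.

```python
import numpy as np

# Hypothetical drive-test samples: distance from the base station (m) and RSSI (dBm)
d = np.array([100, 200, 400, 800, 1200, 1600])
rssi = np.array([-62, -70, -79, -88, -93, -97])

tx_power = 43.0                      # assumed transmit power + antenna gains (dBm)
pl = tx_power - rssi                 # measured path loss (dB)

# Log-distance model: PL(d) = PL(d0) + 10 * n * log10(d / d0)
d0 = 100.0
slope, intercept = np.polyfit(np.log10(d / d0), pl, 1)
n_exponent = slope / 10.0            # path loss exponent
print(f"PL(d0) = {intercept:.1f} dB, path loss exponent n = {n_exponent:.2f}")
```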

Keywords: WiMAX, RSSI, path loss, coverage probability, regression analysis

Procedia PDF Downloads 148
5421 The Shannon Entropy and Multifractional Markets

Authors: Massimiliano Frezza, Sergio Bianchi, Augusto Pianese

Abstract:

Introduced by Shannon in 1948 in the field of information theory as the average rate at which information is produced by a stochastic set of data, the concept of entropy has gained much attention as a measure of the uncertainty and unpredictability associated with a dynamical system, possibly described by a stochastic process. In particular, the Shannon entropy measures the degree of order/disorder of a given signal and provides useful information about the underlying dynamical process. It has found widespread application in a variety of fields, such as, for example, cryptography, statistical physics and finance. In this regard, many contributions have employed different measures of entropy in an attempt to characterize financial time series in terms of market efficiency, market crashes and/or financial crises. The Shannon entropy has also been considered as a measure of the risk of a portfolio or as a tool in asset pricing. This work investigates the theoretical link between the Shannon entropy and multifractional Brownian motion (mBm), a stochastic process that has recently been the focus of renewed interest in finance as a driving model of stochastic volatility. In particular, after exploring the current state of research in this area and highlighting some of the key results and open questions that remain, we show a well-defined relationship between the Shannon (log)entropy and the memory function H(t) of the mBm. In detail, we allow both the length of the time series and the time scale to vary across the analysis to study how the relationship changes. On the one hand, applications are developed after generating surrogates of mBm trajectories based on different memory functions; on the other hand, an empirical analysis of several international stock indexes, which confirms the previous results, concludes the work.
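
The following is a minimal sketch of a histogram-based Shannon entropy estimator for a one-dimensional signal with fixed bin edges. It only illustrates the basic measure used throughout the paper, not the derived relationship between the (log)entropy and the memory function H(t) of the mBm; the signals are synthetic.

```python
import numpy as np

def shannon_entropy(x, bins):
    """Histogram estimate of the Shannon entropy (in nats) of a 1-D signal."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)
edges = np.linspace(-6.0, 6.0, 61)            # common, fixed bin edges
calm = rng.normal(scale=0.5, size=5000)       # low dispersion  -> lower entropy
wild = rng.normal(scale=2.0, size=5000)       # high dispersion -> higher entropy
print(shannon_entropy(calm, edges), shannon_entropy(wild, edges))
```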

Keywords: Shannon entropy, multifractional Brownian motion, Hurst–Holder exponent, stock indexes

Procedia PDF Downloads 83
5420 A Systematic Review of Situational Awareness and Cognitive Load Measurement in Driving

Authors: Aly Elshafei, Daniela Romano

Abstract:

With the development of autonomous vehicles, a human-machine interaction (HMI) system is needed for a safe transition of control when a takeover request (TOR) is required. An important part of the HMI system is the ability to monitor the level of situational awareness (SA) of any driver in real time, in different scenarios, and without any pre-calibration. The purpose of this systematic review is to present state-of-the-art machine learning models used to measure SA, investigating the limitations of each type of sensor, the gaps, and the sensor and computational model best suited to driving applications. To the authors' best knowledge, this is the first literature review identifying online and offline classification methods used to measure SA, explaining which measurements are subject- or session-specific, and how many classifications can be done with each classification model. This information can help researchers measuring SA to identify the model best suited to their application.

Keywords: situational awareness, autonomous driving, gaze metrics, EEG, ECG

Procedia PDF Downloads 100
5419 Predicting Bridge Pier Scour Depth with SVM

Authors: Arun Goel

Abstract:

Prediction of maximum local scour is necessary for the safe and economical design of bridges. A number of equations have been developed over the years to predict local scour depth using laboratory data, and a few pier equations have also been proposed using field data. Most of these equations are empirical in nature, as indicated by past publications. In this paper, attempts have been made to compute the local depth of scour around a bridge pier in dimensional and non-dimensional form by using linear regression, simple regression and SVM (Poly and Rbf) techniques, along with a few conventional empirical equations. The outcome of this study suggests that SVM (Poly and Rbf) based modeling can be employed as an alternative to linear regression, simple regression and the conventional empirical equations in predicting the scour depth of bridge piers. The results of the present study based on the non-dimensional form of bridge pier scour indicate improved performance of SVM (Poly and Rbf) in comparison to the dimensional form.
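
A minimal sketch, assuming hypothetical non-dimensional input variables and illustrative values, of how support vector regression with polynomial and RBF kernels can be fitted to scour data using scikit-learn; the feature choice and numbers are not taken from the paper.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical non-dimensional inputs, e.g. flow depth / pier width, Froude
# number, sediment size ratio -> relative scour depth ds/b (illustrative only).
X = np.array([[1.2, 0.25, 40], [2.0, 0.30, 55], [0.8, 0.20, 35],
              [1.6, 0.35, 60], [2.4, 0.40, 70], [1.0, 0.22, 45]])
y = np.array([1.1, 1.5, 0.9, 1.4, 1.8, 1.0])

svm_rbf = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.05))
svm_poly = make_pipeline(StandardScaler(), SVR(kernel="poly", degree=2, C=10.0))
svm_rbf.fit(X, y)
svm_poly.fit(X, y)
print(svm_rbf.predict([[1.4, 0.28, 50]]), svm_poly.predict([[1.4, 0.28, 50]]))
```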

Keywords: modeling, pier scour, regression, prediction, SVM (Poly and Rbf kernels)

Procedia PDF Downloads 432
5418 Estimation of the Upper Tail Dependence Coefficient for Insurance Loss Data Using an Empirical Copula-Based Approach

Authors: Adrian O'Hagan, Robert McLoughlin

Abstract:

Considerable focus in the world of insurance risk quantification is placed on modeling loss values from lines of business (LOBs) that possess upper tail dependence. Copulas such as the Joe, Gumbel and Student-t copula may be used for this purpose. The copula structure imparts a desired level of tail dependence on the joint distribution of claims from the different LOBs. Alternatively, practitioners may possess historical or simulated data that already exhibit upper tail dependence, through the impact of catastrophe events such as hurricanes or earthquakes. In these circumstances, it is not desirable to induce additional upper tail dependence when modeling the joint distribution of the loss values from the individual LOBs. Instead, it is of interest to accurately assess the degree of tail dependence already present in the data. The empirical copula and its associated upper tail dependence coefficient are presented in this paper as robust, efficient means of achieving this goal.
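
A minimal sketch of one simple nonparametric estimate of the upper tail dependence coefficient from paired loss data: the empirical probability that one line of business exceeds its q-quantile given that the other does, evaluated at a high q as a finite-sample proxy for the q → 1 limit. The data are simulated with a common catastrophe-style shock, and this is only one of several possible empirical-copula-based estimators, not necessarily the one used in the paper.

```python
import numpy as np

def upper_tail_dependence(x, y, q=0.95):
    """Empirical P(X > x_q and Y > y_q) / (1 - q), a simple finite-sample
    proxy for the upper tail dependence coefficient."""
    x_thr, y_thr = np.quantile(x, q), np.quantile(y, q)
    joint = np.mean((x > x_thr) & (y > y_thr))
    return joint / (1.0 - q)

rng = np.random.default_rng(1)
# Hypothetical losses from two lines of business sharing a catastrophe shock
shock = rng.pareto(3.0, 10_000)
lob1 = rng.lognormal(0.0, 0.5, 10_000) + shock
lob2 = rng.lognormal(0.2, 0.6, 10_000) + shock
print(upper_tail_dependence(lob1, lob2, q=0.95))
```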

Keywords: empirical copula, extreme events, insurance loss reserving, upper tail dependence coefficient

Procedia PDF Downloads 267
5417 Problems Arising in Visual Perception: A Philosophical and Epistemological Analysis

Authors: K. A. Tharanga, K. H. H. Damayanthi

Abstract:

Perception is an epistemological concept discussed in philosophy. Perception, in other words vision, is one of the ways in which human beings acquire empirical knowledge through the five senses. However, we face innumerable problems when acquiring knowledge from perception, and therefore the knowledge gained through perception is uncertain; what we see in the external world may not be real. These are the major issues that we face when receiving knowledge through perception. Sometimes there is no physical existence of what we really see; in such cases, the perception is relative. The following frames will be taken into consideration when perception is analysed: illusions and delusions, the figure of a physical object, the appearance and reality of a physical object, the time factor, and the colour of a physical object. Seeing and knowing vary according to the above conceptual frames. We cannot come to a proper conclusion about what we see in the empirical world, because the things that we see are not really there. Hence, the scientific knowledge gained from observation is doubtful. All the factors discussed in science remain in the physical world. There is a leap from one's existence to the existence of a world outside his or her mind. Indeed, one can suppose that what he or she takes to be real is just a massive deception. However, on the basis of the above facts, if someone begins to doubt the whole world, his or her view unavoidably becomes scepticism or nihilism. This is a certain reality.

Keywords: empirical, perception, scepticism, nihilism

Procedia PDF Downloads 113
5416 Poincaré Plot for Heart Rate Variability

Authors: Mazhar B. Tayel, Eslam I. AlSaba

Abstract:

The heart is one of the most important organs in the body. It affects, and is affected by, many factors in the body; therefore, it is a good indicator of the body's condition. Because the heart signal is non-stationary, its variability should be studied. Thus, heart rate variability (HRV) has attracted considerable attention in psychology and medicine and has become an important dependent measure in psychophysiology and behavioral medicine. Quantification and interpretation of heart rate variability, however, remain complex issues fraught with pitfalls. This paper presents one of the non-linear techniques to analyze HRV. It discusses what the Poincaré plot is, how it works, its benefits especially for HRV, the limitation of the Poincaré descriptors SD1 and SD2, and how to overcome this limitation by using the complex correlation measure (CCM). The CCM is more sensitive to changes in the temporal structure of the Poincaré plot than SD1 and SD2.
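
A minimal sketch of the standard Poincaré descriptors SD1 and SD2 computed from successive RR intervals; the complex correlation measure discussed in the paper is not shown, and the RR series below is synthetic.

```python
import numpy as np

def poincare_sd(rr):
    """SD1 / SD2 of the Poincare plot built from successive RR intervals
    (rr[n], rr[n+1]), using the standard successive-difference formulas."""
    x, y = rr[:-1], rr[1:]
    sd1 = np.sqrt(np.var(x - y) / 2.0)   # dispersion perpendicular to the identity line
    sd2 = np.sqrt(np.var(x + y) / 2.0)   # dispersion along the identity line
    return sd1, sd2

rng = np.random.default_rng(2)
rr = 0.8 + 0.05 * np.sin(np.linspace(0, 20, 300)) + rng.normal(0, 0.01, 300)
print(poincare_sd(rr))
```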

Keywords: heart rate variability, chaotic system, Poincaré plot, variance, standard deviation, complex correlation measure

Procedia PDF Downloads 377
5415 Comparative Settlement Analysis under Embankments with Empirical Formulas and Settlement Plate Measurement for Reducing Building Cracks around Embankments

Authors: Safitri Nur Wulandari, M. Ivan Adi Perdana, Prathisto L. Panuntun Unggul, R. Dary Wira Mahadika

Abstract:

In road construction on soft soil, a soil improvement method is needed to increase the bearing capacity of the subgrade so that it can withstand traffic loads. Much of the land in Indonesia consists of soft soil, a type of clay with a consistency of very soft to medium stiff, an undrained shear strength Cu < 0.25 kg/cm2, or an estimated NSPT value < 5 blows/ft. This study focuses on the effect of the preloading load (embankment) on the settlement ratio under the embankment, which in turn affects building cracks around the embankment. The method used in this research is a superposition method for the embankment load distribution at 27 locations, with undisturbed soil samples taken at several borehole points in Java and Kalimantan, Indonesia. The results of settlement plate monitoring in the field are then correlated using the Asaoka method. The settlement plate monitoring results were taken from an embankment of Ahmad Yani Airport in Semarang at 32 points. The values of Cc (compression index) were based on laboratory test results, while untested Cc values were obtained from the empirical formula of Ardhana and Mochtar (1999). The field monitoring results were almost the same as those of the empirical formulation, with a standard deviation of 4%, where the empirical results of this analysis follow a linear formula. The empirical linear formula for determining the effect of a 4.25 m high embankment, for an embankment slope of 1:8 and the same initial embankment height as in the field, is 3.1209x + y = 0.0026. At the edge of the embankment the settlement is not equal to 0: at a quarter of the embankment width the average settlement ratio is 0.951, while at the edge of the embankment the settlement ratio is 0.049. The influence area around the embankment is approximately 1 meter for a 1:8 slope and 7 meters for a 1:2 slope, so the embankment can cause cracks in nearby buildings, which should be accounted for in sustainable development.
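
For the settlement-plate side of the comparison, a minimal sketch of the Asaoka construction, assuming hypothetical plate readings taken at equal time intervals: each settlement is regressed on the previous reading, and the fixed point of the fitted line gives the predicted final settlement. The numbers are illustrative, not the airport monitoring data.

```python
import numpy as np

# Hypothetical settlement-plate readings (m) at equal time intervals
s = np.array([0.10, 0.18, 0.24, 0.29, 0.33, 0.36, 0.38, 0.40])

# Asaoka construction: fit s_i = beta0 + beta1 * s_(i-1); the fixed point of
# the fitted line is the predicted final (ultimate) settlement.
beta1, beta0 = np.polyfit(s[:-1], s[1:], 1)
s_final = beta0 / (1.0 - beta1)
print(f"predicted final settlement = {s_final:.3f} m")
```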

Keywords: building cracks, influence area, settlement plate, soft soil, empirical formula, embankment

Procedia PDF Downloads 327
5414 Impact of Interest and Foreign Exchange Rates Liberalization on Investment Decision in Nigeria

Authors: Kemi Olalekan Oduntan

Abstract:

This study was carried out in order to empirically and descriptively analyze how interest rate and foreign exchange rate liberalization influence investment decisions in Nigeria. The study spanned the period 1985-2014; secondary data were restricted to relevant variables such as investment (proxied by gross fixed capital formation), the saving rate, the interest rate and the foreign exchange rate. Theories and empirical literature from various scholars were reviewed in the paper. The ordinary least squares regression method was used for the data analysis. The results of the regression were critically interpreted and discussed. The empirical findings revealed that investment decisions in Nigeria are highly rate-sensitive. Hence, all the alternative hypotheses were accepted and the respective null hypotheses rejected, as the interest rate and the foreign exchange rate have a significant effect on investment in Nigeria. Therefore, the impact of the interest rate and the foreign exchange rate on the state of investment in the economy cannot be overemphasized.

Keywords: interest rate, foreign exchange liberalization, investment decision, economic growth

Procedia PDF Downloads 345
5413 Empirical Measures to Enhance Germination Potential and Control Browning of Tissue Cultures of Andrographis paniculata

Authors: Nidhi Jindal, Ashok Chaudhury, Manisha Mangal

Abstract:

Andrographis paniculata (Burm. f.) Wallich ex Nees (family Acanthaceae), popularly known as the King of Bitters, is an important medicinal herb. It has an astonishingly wide range of medicinal properties, such as anti-inflammatory, antidiarrhoeal, antiviral, antimalarial, hepatoprotective, cardiovascular, anticancer, and immunostimulatory activities. It is widely cultivated in southern Asia. Though propagation of this herb generally occurs through seeds, it has many germination problems, which has prompted scientists to work out alternative techniques for its mass production. The potential of tissue culture techniques as an alternative tool for AP multiplication was found to be promising. However, the high mortality rate of explants caused by phenolic browning is one of the difficulties reported. Low multiplication rates were reported in the proliferation phase, as well as culture decline characterized by leaf fall and loss of overall vigor. In view of the above problems, a study was undertaken to overcome seed dormancy in order to improve germination potential, and to investigate possible means for the successful proliferation of cultures via preventive approaches against failures caused by phenolic browning. Experiments were conducted to improve germination potential, and among all the chemical and mechanical treatments tried, scarification of seeds with sandpaper proved to be the best method, enhancing the germination potential to 82.44% within 7 days. Similarly, several pretreatments and media combinations were tried to overcome browning of explants, leading to the conclusion that the addition of 0.1% citric acid and 0.2% ascorbic acid to the media, followed by rapid subculturing of explants, reduced browning and decline of explants by 67.45%.

Keywords: plant tissue culture, empirical measure, germination, tissue culture

Procedia PDF Downloads 397
5412 Explaining Listening Comprehension among L2 Learners of English: The Contribution of Vocabulary Knowledge and Working Memory Capacity

Authors: Ahmed Masrai

Abstract:

Listening comprehension constitutes a considerable challenge for second language (L2) learners, but little is known about the explanatory power of different variables in explaining variance in listening comprehension. Since research in this area is, to the researcher's knowledge, relatively scarce in comparison to that focusing on the relationship between reading comprehension and factors such as vocabulary and working memory, there is a need for studies that seek to fill the gap in our knowledge about the specific contribution of working memory capacity (WMC), aural vocabulary knowledge and written vocabulary knowledge to explaining listening comprehension. Using a sample of 130 learners of English as a foreign language, the present study examines what proportion of the variance in listening comprehension is explained by aural vocabulary knowledge, written vocabulary knowledge, and WMC. Four measures were used to collect the required data for the study: (1) A-Lex, a measure of aural vocabulary knowledge; (2) XK-Lex, a measure of written vocabulary knowledge; (3) the Listening Span Task, a measure of WMC; and (4) the IELTS Listening Test, a measure of listening comprehension. The results show that aural vocabulary knowledge is the strongest predictor of listening comprehension, followed by WMC, while written vocabulary knowledge is the weakest predictor. The study discusses implications for the explanatory power of aural vocabulary knowledge and WMC for listening comprehension and for pedagogical practice in L2 classrooms.

Keywords: listening comprehension, second language, vocabulary knowledge, working memory

Procedia PDF Downloads 361
5411 Measure of Pleasure of Drug Users

Authors: Vano Tsertsvadze, Marina Chavchanidze, Lali Khurtsia

Abstract:

The problem of drug use is often seen as a combination of psychological and social problems, but it can also be considered an economically rational decision in the process of buying pleasure (looking after children, reading, harvesting fruit in the fall, sex, eating, etc.). Before making such decisions, people face a trade-off: when someone chooses a delicious meal, she makes a completely rational decision that the pleasure of eating is worth more to her than the pleasure she would experience after a two-month diet, showing off her figure on the summer beach. This argument is also true for alcohol, drugs and cigarettes. Smoking has a negative effect on health, but smokers are not deterred by the threat of lung cancer 40 years later; the pleasure of smoking in the moment is valued more. Our hypothesis is that unsatisfied pleasure and frustration probably determine the risk of drug dependence. The purpose of the research is: (1) to determine a relative unit of measure of pleasure, which will be used to measure and assess the intensity of various human pleasures; (2) to compare the intensity of pleasure from different kinds of activity with the pleasure received from drug use; (3) based on the analysis of the data, to identify factors affecting rational decision making. Research method: respondents will be asked to recall the greatest pleasure of their life, which will be used as a measure of the other pleasures. The study will use focus groups and structured interviews.

Keywords: drug, drug-user, measurement, satisfaction

Procedia PDF Downloads 305
5410 Improving Temporal Correlations in Empirical Orthogonal Function Expansions for Data Interpolating Empirical Orthogonal Function Algorithm

Authors: Ping Bo, Meng Yunshan

Abstract:

Satellite-derived sea surface temperature (SST) is a key parameter for many operational and scientific applications. However, a disadvantage of SST data is the high percentage of missing values, mainly caused by cloud cover. The Data Interpolating Empirical Orthogonal Functions (DINEOF) algorithm is an EOF-based technique for reconstructing missing data and has been widely used in the oceanographic field. The reconstruction of SST images within a long time series using DINEOF can cause large discontinuities, and one solution to this problem is to filter the temporal covariance matrix to reduce the spurious variability. Based on previous research, an algorithm is presented in this paper to improve the temporal correlations in the EOF expansion. As in previous research, a filter, such as a Laplacian filter, is applied to the temporal covariance matrix, but the presented algorithm also considers the temporal relationship between the two images used in the filter: for example, two images in the same season are more likely to be correlated than two images in different seasons, so the latter pair is given less weight in the filter. The presented approach is tested on the monthly nighttime 4-km Advanced Very High Resolution Radiometer (AVHRR) Pathfinder SST for the long-term period spanning from 1989 to 2006. The results obtained with the presented algorithm are compared to those from the original DINEOF algorithm without filtering and from the DINEOF algorithm with filtering but without taking the temporal relationship into account.
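
A schematic sketch, not the algorithm of the paper, of the underlying idea of weighting the temporal filter by seasonal proximity: entries of the temporal covariance matrix are smoothed more strongly between time steps that fall in the same season. The data, weighting function and blending parameter are all made up for illustration.

```python
import numpy as np

def seasonal_weights(months, scale=1.5):
    """Weight each pair of time steps by closeness in the seasonal cycle,
    so same-season pairs are weighted more strongly than cross-season pairs."""
    m = np.asarray(months)
    diff = np.abs(m[:, None] - m[None, :])
    diff = np.minimum(diff, 12 - diff)            # circular month distance
    return np.exp(-(diff / scale) ** 2)

def filter_temporal_covariance(cov, months, alpha=0.3):
    """Schematic season-aware smoothing: blend the covariance matrix with a
    seasonally weighted average of its temporal neighbours."""
    w = seasonal_weights(months)
    w /= w.sum(axis=1, keepdims=True)
    return (1 - alpha) * cov + alpha * (w @ cov @ w.T)

rng = np.random.default_rng(3)
data = rng.normal(size=(200, 36))                 # 200 pixels x 36 monthly images
months = np.arange(36) % 12 + 1
cov = np.cov(data, rowvar=False)                  # temporal covariance matrix
cov_filtered = filter_temporal_covariance(cov, months)
```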

Keywords: data interpolating empirical orthogonal function, image reconstruction, sea surface temperature, temporal filter

Procedia PDF Downloads 297
5409 A Combinatorial Representation for the Invariant Measure of Diffusion Processes on Metric Graphs

Authors: Michele Aleandri, Matteo Colangeli, Davide Gabrielli

Abstract:

We study a generalization to a continuous setting of the classical Markov chain tree theorem. In particular, we consider an irreducible diffusion process on a metric graph. The unique invariant measure has an atomic component on the vertices and an absolutely continuous part on the edges. We show that the corresponding density at x can be represented by a normalized superposition of the weights associated with the metric arborescences oriented toward the point x. A metric arborescence is a metric tree oriented towards its root. The weight of each oriented metric arborescence is obtained as the product of the exponential of integrals of the form ∫ 2b/σ², where b is the drift and σ² is the diffusion coefficient, along the oriented edges, of a weight for each node determined by the local orientation of the arborescence around the node, and of the inverse of the diffusion coefficient at x. The metric arborescences are obtained by cutting the original metric graph along some edges.

Keywords: diffusion processes, metric graphs, invariant measure, reversibility

Procedia PDF Downloads 144
5408 Prioritization of Mutation Test Generation with Centrality Measure

Authors: Supachai Supmak, Yachai Limpiyakorn

Abstract:

Mutation testing can be applied for the quality assessment of test cases. Prioritization of mutation test generation has been a critical element of industry practice that contributes to the evaluation of test cases. The industry generally delivers products under time-to-market pressure and thus inevitably sacrifices software testing tasks, even though many test cases are required for software verification. This paper presents an approach that applies a social network centrality measure, PageRank, to prioritize mutation test generation. The source code with the highest PageRank values is focused on first when developing test cases, as these modules are vulnerable to defects or anomalies that may cause consequent defects in many other associated modules. Moreover, the approach helps identify the reducible test cases in the test suite, while still maintaining the same criteria as the original set of test cases.
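
A minimal sketch, using networkx and a hypothetical module dependency graph, of ranking source modules by PageRank so that mutation tests are generated for the most central modules first; the module names and edges are illustrative only.

```python
import networkx as nx

# Hypothetical static dependency graph: an edge u -> v means module u calls
# (depends on) module v, so a defect in v can propagate to u.
calls = [("ui", "auth"), ("ui", "orders"), ("orders", "payments"),
         ("orders", "inventory"), ("payments", "auth"), ("reports", "orders")]
g = nx.DiGraph(calls)

# Modules with the highest PageRank are the most central targets: generate
# and run mutation tests for them first.
rank = nx.pagerank(g, alpha=0.85)
priority = sorted(rank, key=rank.get, reverse=True)
print(priority)
```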

Keywords: software testing, mutation test, network centrality measure, test case prioritization

Procedia PDF Downloads 84
5407 A Theoretical and Empirical Analysis of the Efficiency of Chinese Electricity Pricing

Authors: Jianlin Wang, Jiajia Zhao

Abstract:

This paper applies theoretical and empirical methods to examine the relationship between the electricity price and the coal price, as well as between electricity consumption and industry output, for China during January 1999 to December 2012. Our results indicate that there is no causality between the coal price and the electricity price when other factors are controlled for. However, we find bi-directional causality between electricity consumption and industry output. Overall, the electricity price set by China’s NDRC is inefficient, which led to the electricity supply shortage after 2004. It is time for China’s reformers to reform the electricity pricing system.

Keywords: electricity price, coal price, power supply, China

Procedia PDF Downloads 451
5406 Comprehensive Experimental Study to Determine Energy Dissipation of Nappe Flows on Stepped Chutes

Authors: Abdollah Ghasempour, Mohammad Reza Kavianpour, Majid Galoie

Abstract:

This study investigated the fundamental parameters which have an effective role in the energy dissipation of nappe flows on stepped chutes, in order to estimate an empirical relationship using dimensional analysis. To achieve this goal, comprehensive experimental studies on several large-scale physical models with various step geometries, slopes, discharges, etc. were carried out. For all models, hydraulic parameters such as velocity, pressure, water depth and flow regime were measured precisely. The effective parameters could then be determined by analysis of the experimental data. Finally, a dimensional analysis was performed in order to estimate an empirical relationship for evaluating the energy dissipation of nappe flows on stepped chutes. Because large-scale physical models were used in this study, the empirical relationship is in very good agreement with the experimental results.

Keywords: nappe flow, energy dissipation, stepped chute, dimensional analysis

Procedia PDF Downloads 341
5405 Organizational Performance and Impact of Social Innovation

Authors: Alfonso Unceta, Javier Castro-Spila

Abstract:

This paper offers a conceptual and empirical exploration of the relationship between organizational performance and the impact of social innovation. The paper contributes to the social innovation field in three domains: a) it provides analytical and empirical evidence linking organizational performance to the impact of social innovation; b) it provides a first outline of impact assessment of social innovation when it is developed by a diversity of heterogeneous actors (systemic social innovation); c) it provides a first outline for the development of innovation policies to support social innovations according to a typology of organizations and a typology of impact.

Keywords: absorptive capacity, social innovation impact, organizational performance, RESINDEX, Basque Country

Procedia PDF Downloads 459
5404 Is It Important to Measure the Volumetric Mass Density of Nanofluids?

Authors: Z. Haddad, C. Abid, O. Rahli, O. Margeat, W. Dachraoui, A. Mataoui

Abstract:

The present study aims to measure the volumetric mass density of NiPd-heptane nanofluids synthesized using a one-step method known as thermal decomposition of metal-surfactant complexes. The particle concentration is up to 7.55 g/l and the temperature range of the experiment is from 20°C to 50°C. The measured values were compared with mixture theory, and good agreement between the theoretical equation and the measurements was obtained. Moreover, the volumetric mass density data for nanofluids available in the literature are reviewed.
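
For reference, a minimal sketch of the mixture (two-phase) rule that such measurements are commonly compared against; the particle and base-fluid densities below are illustrative placeholders, not the paper's data.

```python
def nanofluid_density(phi, rho_p, rho_bf):
    """Mixture rule: rho_nf = phi * rho_p + (1 - phi) * rho_bf,
    where phi is the particle volume fraction."""
    return phi * rho_p + (1.0 - phi) * rho_bf

# Illustrative numbers only: a dilute suspension of metal particles in heptane
print(nanofluid_density(phi=0.001, rho_p=11_500.0, rho_bf=684.0))  # kg/m^3
```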

Keywords: NiPd nanoparticles, nanofluids, volumetric mass density, stability

Procedia PDF Downloads 380
5403 Knowledge Transfer from Experts to Novice: An Empirical Study on Online Communities

Authors: Firmansyah David

Abstract:

This paper aims to investigate factors that drive individuals to transfer their knowledge in the context of online communities. By revisiting tacit-to-explicit knowledge creation, this research attempts to contribute empirically using three online forums: (1) Software Engineering; (2) Aerospace Simulator; (3) Health Insurance System. A qualitative approach was deployed to map and recognize the pattern of users' knowledge transfer (KT), particularly from expert to novice. The findings suggest a common pattern in how experts make the effort to formulate 'explicit' knowledge and how novices 'understand' such knowledge. This research underlines that skill, intuition, judgment, values and beliefs are the prominent factors, both for experts and for novices. Further, this research has distinguished groups of experts and novices by their ability to transfer and to 'adopt' new knowledge. Future research should triangulate the method with a quantitative study to measure the level of adoption of (new) knowledge by individuals.

Keywords: explicit, expert, knowledge, online community

Procedia PDF Downloads 248
5402 Static vs. Stream Mining Trajectories Similarity Measures

Authors: Musaab Riyadh, Norwati Mustapha, Dina Riyadh

Abstract:

Trajectory similarity can be defined as the cost of transforming one trajectory into another based on a certain similarity method. It is at the core of numerous mining tasks such as clustering, classification, and indexing. Various approaches have been suggested to measure similarity based on the geometric and dynamic properties of a trajectory, the overlapping between trajectory segments, and the confined area between entire trajectories. In this article, these approaches are evaluated based on computational cost, memory usage, accuracy, and the amount of data needed in advance, in order to determine their suitability for stream mining applications. The evaluation results show that stream mining applications favor similarity methods that have low computational cost and memory usage, require only a single scan of the data, and are free of mathematical complexity, owing to the high-speed generation of the data.

Keywords: global distance measure, local distance measure, semantic trajectory, spatial dimension, stream data mining

Procedia PDF Downloads 376
5401 The Properties of Risk-based Approaches to Asset Allocation Using Combined Metrics of Portfolio Volatility and Kurtosis: Theoretical and Empirical Analysis

Authors: Maria Debora Braga, Luigi Riso, Maria Grazia Zoia

Abstract:

Risk-based approaches to asset allocation are portfolio construction methods that do not rely on the input of expected returns for the asset classes in the investment universe and only use risk information. They include the Minimum Variance strategy (MV strategy), the traditional (volatility-based) Risk Parity strategy (SRP strategy), the Most Diversified Portfolio strategy (MDP strategy) and, for many, the Equally Weighted strategy (EW strategy). All the mentioned approaches are based on portfolio volatility as the reference risk measure, but in 2023 the Kurtosis-based Risk Parity strategy (KRP strategy) and the Minimum Kurtosis strategy (MK strategy) were introduced. Understandably, they used the fourth root of the portfolio fourth moment as a proxy for portfolio kurtosis in order to work with a homogeneous function of degree one. This paper contributes mainly theoretically and methodologically to the framework of risk-based asset allocation approaches with two steps forward. First, a new and more flexible objective function considering a linear combination (with positive coefficients that sum to one) of portfolio volatility and portfolio kurtosis is used to serve either a risk minimization goal or a homogeneous risk distribution goal. Hence, the new basic idea consists in extending the achievement of typical risk-based approaches' goals to a combined risk measure. To give the rationale behind operating with such a risk measure, it is worth remembering that volatility and kurtosis are both expressions of uncertainty, to be read as dispersion of returns around the mean, and that both preserve adherence to a symmetric framework and consideration for the entire returns distribution; they differ in that the former captures the "normal"/"ordinary" dispersion of returns, while the latter is able to catch the huge dispersion. Therefore, the combined risk metric, which uses two individual metrics focused on the same phenomenon but differently sensitive to its intensity, allows the asset manager to express, by varying the "relevance coefficient" associated with the individual metrics in the objective function, a wide set of plausible investment goals for the portfolio construction process, while serving investors differently concerned with tail risk and traditional risk. Since this is the first study that implements risk-based approaches using a combined risk measure, it becomes of fundamental importance to investigate the portfolio effects triggered by this innovation. The paper also offers a second contribution. Until the recent advent of the MK strategy and the KRP strategy, efforts to highlight interesting properties of risk-based approaches were inevitably directed towards the traditional MV strategy and SRP strategy. Previous literature established an increasing order in terms of portfolio volatility, starting from the MV strategy, through the SRP strategy, and arriving at the EW strategy, and provided the mathematical proof of the "equalization effect" concerning marginal risks for the MV strategy and risk contributions for the SRP strategy. Regarding the validity of similar conclusions for the MK strategy and the KRP strategy, a theoretical demonstration is still pending. This paper fills this gap.
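
A minimal sketch of the risk-minimization variant of a combined objective of this kind, assuming simulated returns: the weights minimize a convex combination of portfolio volatility and the fourth root of the portfolio fourth central moment (used here as the kurtosis proxy). The risk-parity (homogeneous risk distribution) variant and the paper's exact formulation are not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
R = rng.standard_t(df=5, size=(1000, 4)) * 0.01    # hypothetical asset returns

def combined_risk(w, returns, lam=0.5):
    """lam * portfolio volatility + (1 - lam) * fourth root of the portfolio
    fourth central moment (both homogeneous of degree one in the weights)."""
    rp = returns @ w
    vol = rp.std()
    kurt_proxy = np.mean((rp - rp.mean()) ** 4) ** 0.25
    return lam * vol + (1 - lam) * kurt_proxy

n = R.shape[1]
constraints = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
bounds = [(0.0, 1.0)] * n
res = minimize(combined_risk, x0=np.full(n, 1.0 / n), args=(R, 0.5),
               bounds=bounds, constraints=constraints, method="SLSQP")
print(res.x)   # long-only weights minimizing the combined risk measure
```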

Keywords: risk parity, portfolio kurtosis, risk diversification, asset allocation

Procedia PDF Downloads 45
5400 Implementation of Statistical Parameters to Form Entropic Mathematical Models

Authors: Gurcharan Singh Buttar

Abstract:

It has been discovered that although these two areas, statistics and information theory, are independent in nature, they can be combined to create applications in multidisciplinary mathematics. This is because, in the field of statistics, statistical parameters (measures) play an essential role with reference to the population (distribution) under investigation, while an information measure is crucial in the study of the ambiguity, diversity, and unpredictability present in an array of phenomena. The following communication is a link between the two, and it is demonstrated that the well-known conventional statistical measures can be used as measures of information.

Keywords: probability distribution, entropy, concavity, symmetry, variance, central tendency

Procedia PDF Downloads 140
5399 Approaching Collaborative Governance Legitimacy through Discursive Legitimation Analysis

Authors: Carlo Schick

Abstract:

Legitimacy can be regarded as the very fabric of political orders. Up to this point, IR scholarship has been particularly interested in the legitimacy of nation-states, international regimes and non-governmental actors. The legitimacy of collaborative governance comprising public, private and civic actors, however, has not received much attention from an IR perspective. This is partly due to the fact that the concept of legitimacy is difficult to operationalise and measure in settings where there is no clear boundary between political authorities and those who are subject to collaborative governance. In this case, legitimacy cannot be empirically approached in its own terms, but can only be analysed in terms of dialectic legitimation processes. The author develops a three-fold analytical framework based on a dialogical understanding of legitimation. Legitimation first has to relate to public legitimacy demands and contestations of collaborative governance, and second to legitimacy claims issued by collaborative governance networks themselves. Lastly, collaborative governance is dependent on constant self-legitimisation. The paper closes by suggesting a discourse analytic approach for further empirical research on the legitimacy of collaborative governance.

Keywords: legitimacy, collaborative governance, discourse analysis, dialectic legitimation

Procedia PDF Downloads 312
5398 Global Voltage Harmonic Index for Measuring Harmonic Situation of Power Grids: A Focus on Power Transformers

Authors: Alireza Zabihi, Saeed Peyghami, Hossein Mokhtari

Abstract:

With the increasing deployment of renewable power plants, such as solar and wind, it is crucial to measure the harmonic situation of the grid. The power electronics systems used to connect these plants to the network can introduce harmonics, leading to increased losses, reduced efficiency, false operation of protective relays, and equipment damage due to harmonic intensification. This paper proposes a global voltage harmonic index to measure the harmonic situation of the power grid, with a focus on power transformers. The proposed index takes into account the losses caused by harmonics in power transformers, which are of great importance and value to the network, and thereby provides a comprehensive measure of the harmonic situation of the grid. The effectiveness of the proposed index is evaluated on a real-world distribution network, and the results demonstrate its ability to identify the harmonic situation of the network, particularly in relation to power transformers. The proposed index has the potential to support power companies in optimizing their power systems and to guide researchers in developing effective mitigation strategies for harmonics in the power grid.

Keywords: global voltage harmonic index, harmonics, power grid, power quality, power transformers, renewable energy

Procedia PDF Downloads 98
5397 Determinants of Non-Performing Loans: An Empirical Investigation of Bank-Specific Micro-Economic Factors

Authors: Amir Ikram, Faisal Ijaz, Qin Su

Abstract:

This empirical study was undertaken to explore the determinants of non-performing loans (NPLs) of the small and medium enterprises (SME) sector held by commercial banks. Primary data were collected through a well-structured survey questionnaire from credit analysts/bankers of 42 branches of 9 commercial banks operating in the district of Lahore (Pakistan) for 2014-2015. Selective descriptive analysis and the Pearson chi-square technique were used to illustrate and evaluate the significance of different variables affecting NPLs. Branch age, duration of the loan, and credit policy were found to be significant determinants of NPLs. The study proposes that bank-specific and SME-specific microeconomic variables directly influence NPLs, while macroeconomic factors act as intermediary variables. A framework exhibiting the causal nexus of NPLs was also drawn on the basis of the empirical findings. The results elaborate various origins of NPLs and suggest that they are primarily instigated by the loan sanctioning procedure of the financial institution. The paper also underlines the risk management practices adopted by the banks at branch level to avert the risk of loan default. The empirical investigation of bank-specific microeconomic factors of NPLs with respect to Pakistan’s economy is the novelty of the study. Broader strategic policy implications are provided for credit analysts and entrepreneurs.

Keywords: commercial banks, microeconomic factors, non-performing loans, small and medium enterprises

Procedia PDF Downloads 237
5396 Capital Accumulation, Technology Diffusion and Economic Growth: An Empirical Application to Tunisian Case

Authors: Ahmed Bellakhdhar

Abstract:

This paper aims to test the impact of various variables, namely investment in physical capital, investment in human capital, openness to trade and foreign direct investment, and distance from the technology frontier, on economic growth in the Tunisian context during the period 1976-2010. The empirical results indicate that the impact of human capital is significantly positive. This finding confirms the hypothesis that human capital is a main driver of economic performance through its role in improving internal productive capacity and the absorption of foreign technology, especially via foreign direct investment. The effect of FDI is significantly positive in all alternative regressions, and the coefficient associated with the physical capital variable is positive but not significant overall. Concerning the import of technologically advanced equipment, our estimates show the absence of a significant direct impact on economic growth in Tunisia. Our empirical results also support the assumption of a nonlinear relationship between taxation and growth and demonstrate the existence of an inverted-U curve between the two variables, in the spirit of the “Laffer curve”.

Keywords: Endogenous growth, Human capital, Technology transfer, Absorptive capacity

Procedia PDF Downloads 114