Search results for: sequential causal inference
463 Performance Evaluation of Dynamic Signal Control System for Mixed Traffic Conditions
Authors: Aneesh Babu, S. P. Anusha
Abstract:
A dynamic signal control system combines traditional traffic lights with an array of sensors to intelligently control vehicle and pedestrian traffic. The present study focuses on evaluating the performance of dynamic signal control systems for mixed traffic conditions. Data collected from the four approaches of a typical four-legged signalized intersection in Trivandrum city in the Kerala state of India are used for the study. The performance of three different dynamic signal control methods, namely (i) the non-sequential method, (ii) Webster design for consecutive signal cycles using flow as input, and (iii) dynamic signal control using RFID delay as input, was evaluated. The evaluation of the dynamic signal control systems was carried out using a calibrated VISSIM microsimulation model. Python programming was used to integrate the dynamic signal control algorithm through the COM interface in VISSIM. The intersection delay obtained from the different dynamic signal control methods was compared with the delay obtained from fixed signal control. Based on the study results, it was observed that intersection delay was reduced significantly by using dynamic signal control methods. The dynamic signal control method using delay from RFID sensors resulted in a higher percentage reduction in delay and hence is a suitable choice for implementation under mixed traffic conditions. The developed dynamic signal control strategies can be implemented in ITS applications under mixed traffic conditions.
Keywords: dynamic signal control, intersection delay, mixed traffic conditions, RFID sensors
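As a minimal illustration of the flow-based variant above, the sketch below recomputes Webster's optimum cycle length and green splits from approach flows in Python; the saturation flows, lost time, and flow figures are illustrative assumptions, not values from the study.

```python
# Minimal sketch of a Webster-style cycle/green-split update, as used in the
# flow-based variant above. Saturation flows, lost time and approach flows are
# illustrative assumptions, not values from the study.

def webster_timings(flows, sat_flows, lost_time_per_cycle=16.0):
    """Return (cycle_length, green_times) for one signal cycle.

    flows      -- observed approach flows for the next cycle (veh/h)
    sat_flows  -- saturation flows per approach (veh/h)
    """
    y = [q / s for q, s in zip(flows, sat_flows)]   # critical flow ratios
    Y = min(sum(y), 0.95)                           # cap near saturation to avoid blow-up
    # Webster's optimum cycle length: C0 = (1.5 L + 5) / (1 - Y)
    c0 = (1.5 * lost_time_per_cycle + 5.0) / (1.0 - Y)
    effective_green = c0 - lost_time_per_cycle
    greens = [effective_green * yi / Y for yi in y] # green time proportional to y_i
    return c0, greens

# Example: four approaches of a four-legged intersection
cycle, greens = webster_timings(
    flows=[900, 650, 700, 500],
    sat_flows=[1800, 1800, 1800, 1800],
)
print(round(cycle, 1), [round(g, 1) for g in greens])
```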
Procedia PDF Downloads 106
462 Supplier Risk Management: A Multivariate Statistical Modelling and Portfolio Optimization Based Approach for Supplier Delivery Performance Development
Authors: Jiahui Yang, John Quigley, Lesley Walls
Abstract:
In this paper, the authors develop a stochastic model of investment in supplier delivery performance development from a buyer’s perspective. The authors propose a multivariate model through a Multinomial-Dirichlet distribution within an empirical Bayesian inference framework, representing both the epistemic and aleatory uncertainties in deliveries. A closed-form solution is obtained, and the lower and upper bounds for both the optimal investment level and the expected profit under uncertainty are derived. The theoretical properties provide decision makers with useful insights into supplier delivery performance improvement problems where multiple delivery statuses are involved. The authors also extend the model from a single-supplier investment to a supplier portfolio, using a Lagrangian method to obtain a theoretical expression for the optimal investment level and overall expected profit. The model enables a buyer to know how the marginal expected profit/investment level of each supplier changes with respect to the budget and which supplier should be invested in when additional budget is available. An application of this model is illustrated in a simulation study. Overall, the main contribution of this study is to provide an optimal investment decision-making framework for supplier development, taking into account multiple delivery statuses as well as multiple projects.
Keywords: decision making, empirical Bayesian, portfolio optimization, supplier development, supply chain management
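The sketch below illustrates the conjugate Multinomial-Dirichlet updating that the model builds on; the three delivery statuses, prior pseudo-counts, and profit weights are illustrative assumptions, not quantities from the paper.

```python
import numpy as np

# Sketch of the Multinomial-Dirichlet updating underlying the model: delivery
# outcomes fall into K statuses (here early / on-time / late), the prior is
# Dirichlet(alpha), and observed counts update it conjugately. Statuses, prior
# pseudo-counts and profit weights are illustrative assumptions.

alpha_prior = np.array([2.0, 10.0, 3.0])        # pseudo-counts for 3 statuses
observed    = np.array([1, 14, 5])              # deliveries seen so far

alpha_post = alpha_prior + observed             # conjugate posterior
p_hat = alpha_post / alpha_post.sum()           # posterior mean status probabilities

profit_per_status = np.array([4.0, 5.0, -2.0])  # illustrative payoff per delivery
expected_profit = float(p_hat @ profit_per_status)

print(p_hat.round(3), round(expected_profit, 3))
```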
Procedia PDF Downloads 288
461 Improving the Run Times of Existing and Historical Demand Models Using Simple Python Scripting
Authors: Abhijeet Ostawal, Parmjit Lall
Abstract:
The run times of a large strategic model that we were managing had become too long, leading to delays in project delivery, increased costs, and a loss in productivity. Software developers continuously work towards developing more efficient tools by changing their algorithms and processes. The issue faced by our team was how to apply the latest technologies to validated existing models that are based on much older versions of the software and do not have the latest capabilities. The multi-modal transport model that we had could only be run with the assignments in sequential order. Recent upgrades to the software now allow the assignment to be run in parallel, a concept called parallelisation. Parallelisation is invoked through a Python script that works only within the latest version of the software. A full model transfer to the latest version was not possible due to time, budget, and the potential changes in trip assignment. This article shows how to adapt and update the Python script so that it can be used with older software versions by calling the latest version for the assignment and then returning to the old version, without affecting the results. Through a process of trial and error, run-time savings of up to 30-40% have been achieved. Assignment results were maintained within the older version, and through this learning process we have applied the methodology to other, even older versions of the software, resulting in large time savings and greater productivity and efficiency for both client and consultant.
Keywords: model run time, demand model, parallelisation, python scripting
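A hedged sketch of the idea of handing only the assignment stage to the newer software version is given below; the executable paths, stage names, and command-line flags are hypothetical placeholders, since the transport-modelling package and its interface are not named in the abstract.

```python
import subprocess

# Hedged sketch of the "call the new version only for assignment" idea.
# The executables, stage names and flags below are hypothetical placeholders;
# the actual transport-modelling package and its command line are not named
# in the abstract.

OLD_EXE = r"C:\Apps\TransportModel\v4\model.exe"   # validated legacy version
NEW_EXE = r"C:\Apps\TransportModel\v9\model.exe"   # version supporting parallel assignment

def run(exe, *args):
    subprocess.run([exe, *args], check=True)

# 1. Demand and skim steps stay in the validated old version.
run(OLD_EXE, "--run-stage", "demand", "--scenario", "base")

# 2. Only the assignment is handed to the new version, run in parallel.
run(NEW_EXE, "--run-stage", "assignment", "--scenario", "base", "--threads", "8")

# 3. Results are read back so downstream steps continue in the old version.
run(OLD_EXE, "--run-stage", "post-assignment", "--scenario", "base")
```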
Procedia PDF Downloads 118
460 Protective Effects of Vitamin C and Vitamin E on Experimentally Induced Testicular Torsion and Detorsion in Rat Model
Authors: Anu Vinod Ranade
Abstract:
Aim: To evaluate and compare the effects of Vitamin C and Vitamin E on experimentally induced testicular torsion and detorsion in rats. Methods: Forty male Wistar albino rats were divided into five groups. Animals in Group I underwent a sham operation; Group II consisted of animals that were subjected to torsion for three hours followed by detorsion for 24 hours without any treatment; while Groups III, IV, and V were orally pretreated with Vitamin C (40 mg/kg b.w.), Vitamin E (100 mg/kg b.w.), and a combination of Vitamin C and Vitamin E, respectively, for a period of 30 days. The testes of the experimental groups were manually rotated 720° clockwise for three hours and counter-rotated for 24 hours to induce ischemia and reperfusion. Sequential biopsies were performed, and the testes were collected at the end of 24 hours of detorsion for morphological evaluation. Result: There was a significant decrease in the standard tubular diameter and the epithelial height of the seminiferous tubules in the untreated group when compared to sham controls. The standard tubular diameter and seminiferous epithelial height showed near-normal values when animals were pretreated with Vitamin C and Vitamin E individually or in combination. Conclusion: The results showed that pretreatment with the antioxidants Vitamin E and Vitamin C administered prior to testicular torsion in rats significantly reduced the torsion- and detorsion-induced histopathological injury.
Keywords: vitamin C, vitamin E, standard tubular diameter, standard epithelial height, testicular torsion
Procedia PDF Downloads 315
459 Fast Approximate Bayesian Contextual Cold Start Learning (FAB-COST)
Authors: Jack R. McKenzie, Peter A. Appleby, Thomas House, Neil Walton
Abstract:
Cold-start is a notoriously difficult problem that can occur in recommendation systems and arises when there is insufficient information to draw inferences for users or items. To address this challenge, a contextual bandit algorithm – the Fast Approximate Bayesian Contextual Cold Start Learning algorithm (FAB-COST) – is proposed, which is designed to provide improved accuracy compared to the traditionally used Laplace approximation in the logistic contextual bandit, while controlling both algorithmic complexity and computational cost. To this end, FAB-COST uses a combination of two moment-projection variational methods: Expectation Propagation (EP), which performs well at the cold start but becomes slow as the amount of data increases; and Assumed Density Filtering (ADF), which has slower growth of computational cost with data size but requires more data to obtain an acceptable level of accuracy. By switching from EP to ADF when the dataset becomes large, the algorithm is able to exploit their complementary strengths. The empirical justification for FAB-COST is presented and systematically compared to other approaches on simulated data. In a benchmark against the Laplace approximation on real data consisting of over 670,000 impressions from autotrader.co.uk, FAB-COST demonstrates an increase of over 16% in user clicks at one point. On the basis of these results, it is argued that FAB-COST is likely to be an attractive approach to cold-start recommendation systems in a variety of contexts.
Keywords: cold-start learning, expectation propagation, multi-armed bandits, Thompson Sampling, variational inference
Procedia PDF Downloads 108
458 Long Run Estimates of Population, Consumption and Economic Development of India: An ARDL Bounds Testing Approach of Cointegration
Authors: Sanjay Kumar, Arumugam Sankaran, Arjun K., Mousumi Das
Abstract:
Domestic consumption and population growth have a positive impact on economic growth and development, as observed in the Harrod-Domar and endogenous growth models. The paper challenges the Solow growth model, which argues that population growth has a detrimental impact on per capita and steady-state growth. Unlike the Solow model, the paper observes that per capita income growth never falls to zero and remains positive. Hence, our goal here is to investigate the relationship among population, domestic consumption, and economic growth in India. For this estimation, annual data from 1980-2016 were collected from the World Development Indicators and the Reserve Bank of India. To capture the long-run as well as short-run dynamics among the variables, we employed the ARDL bounds testing approach to cointegration, followed by a modified Wald causality test to determine the direction of causality. The cointegration and ARDL estimates reveal that there is a long-run, positive, and statistically significant relationship among the variables under study. At the same time, the causality test shows that a causal relationship exists among the variables. Hence, this calls for policies with a long-run perspective that strengthen the capabilities and entitlements of people and stabilize domestic demand, so as to serve the long-run and short-run growth and stability of the economy.
Keywords: cointegration, consumption, economic development, population growth
Procedia PDF Downloads 159
457 Pushing the Boundary of Parallel Tractability for Ontology Materialization via Boolean Circuits
Authors: Zhangquan Zhou, Guilin Qi
Abstract:
Materialization is an important reasoning service for applications built on the Web Ontology Language (OWL). To make materialization efficient in practice, current research focuses on deciding the tractability of an ontology language and designing parallel reasoning algorithms. However, some well-known large-scale ontologies, such as YAGO, have been shown to have good performance for parallel reasoning even though they are expressed in ontology languages that are not parallelly tractable, i.e., the reasoning is inherently sequential in the worst case. This motivates us to study the problem of parallel tractability of ontology materialization from a theoretical perspective. That is, we aim to identify the ontologies for which materialization is parallelly tractable, i.e., in the NC complexity class. Since NC is defined in terms of Boolean circuits, which are widely used to investigate parallel computing problems, we first transform the problem of materialization into the evaluation of Boolean circuits and then study the problem of parallel tractability based on circuits. In this work, we focus on datalog-rewritable ontology languages. We use Boolean circuits to identify two classes of datalog-rewritable ontologies (called parallelly tractable classes) such that materialization over them is parallelly tractable. We further investigate the parallel tractability of materialization for a datalog-rewritable OWL fragment, DHL (Description Horn Logic). Based on the above results, we analyze real-world datasets and show that many ontologies expressed in DHL belong to the parallelly tractable classes.
Keywords: ontology materialization, parallel reasoning, datalog, Boolean circuit
Procedia PDF Downloads 271
456 Programming without Code: An Approach and Environment to Conditions-On-Data Programming
Authors: Philippe Larvet
Abstract:
This paper presents the concept of an object-based programming language in which tests (if... then... else) and control structures (while, repeat, for...) disappear and are replaced by conditions on data. Following the object paradigm, data are still embedded inside objects as variable-value couples, but object methods are expressed in the form of logical propositions (‘conditions on data’ or CODs). For instance: variable1 = value1 AND variable2 > value2 => variable3 = value3. Implementing this approach, a central inference engine runs and examines objects one after another, collecting all CODs of each object. CODs are treated as rules in a rule-based system: the left part of each proposition (left side of the ‘=>’ sign) is the premise and the right part is the conclusion. So premises are evaluated and conclusions are fired. Conclusions modify the variable-value couples of the object, and the engine goes on to examine the next object. The paper develops the principles of writing CODs instead of complex algorithms. Through samples, the paper also presents several hints for implementing a simple mechanism able to process this ‘COD language’. The proposed approach can be used within the context of simulation, process control, industrial systems validation, etc. By writing simple and rigorous conditions on data, instead of using classical languages that take long to learn, engineers and specialists can easily simulate and validate the functioning of complex systems.
Keywords: conditions on data, logical proposition, programming without code, object-oriented programming, system simulation, system validation
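The following minimal Python sketch, with an illustrative rule representation and sample object, shows the inference cycle described above: the engine visits each object, evaluates the premise of every condition-on-data against its variable-value couples, and fires the conclusions that hold.

```python
# Minimal sketch of the COD inference cycle described above. The rule
# representation and the sample object are illustrative assumptions, not the
# paper's actual environment.

class CodObject:
    def __init__(self, name, data, rules):
        self.name = name
        self.data = dict(data)      # variable-value couples
        self.rules = rules          # list of (premise, conclusion) callables

    def step(self):
        fired = False
        for premise, conclusion in self.rules:
            if premise(self.data):  # evaluate the premise against the data
                conclusion(self.data)
                fired = True
        return fired

def engine(objects, max_cycles=10):
    """Visit objects one after another until no COD fires any more."""
    for _ in range(max_cycles):
        if not any(obj.step() for obj in objects):
            break

# Example COD: temperature > 80 AND pump = "on" AND alarm = False => alarm = True
tank = CodObject(
    "tank",
    {"temperature": 85, "pump": "on", "alarm": False},
    [(lambda d: d["temperature"] > 80 and d["pump"] == "on" and not d["alarm"],
      lambda d: d.update(alarm=True))],
)
engine([tank])
print(tank.data["alarm"])   # True
```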
Procedia PDF Downloads 221
455 Application of RS and GIS Technique for Identifying Groundwater Potential Zone in Gomukhi Nadhi Sub Basin, South India
Authors: Punitha Periyasamy, Mahalingam Sudalaimuthu, Sachikanta Nanda, Arasu Sundaram
Abstract:
India holds 17.5% of the world’s population but only 2% of the world's total geographical area, and 27.35% of that area is categorized as wasteland due to a lack of, or limited, groundwater. There is therefore heavy demand for groundwater for agricultural and non-agricultural activities to balance the country's growth rate. With this in mind, an attempt is made to identify the groundwater potential zones in the Gomukhi river sub-basin of the Vellar river basin, Tamil Nadu, India, covering an area of 1146.6 sq. km and consisting of 9 blocks from Peddanaickanpalayam to Villupuram. Thematic maps of geology, geomorphology, lineaments, land use and land cover, and drainage were prepared for the study area using IRS P6 data. The collateral data, including rainfall, water level, and the soil map, were collected for analysis and inference. The digital elevation model (DEM) was generated from the Shuttle Radar Topography Mission (SRTM), and the slope of the study area was obtained. ArcGIS 10.1 served as a powerful spatial analysis tool to delineate the groundwater potential zones in the study area by means of weighted overlay analysis. Each individual parameter of the thematic maps was ranked and weighted in accordance with its influence on raising the groundwater level. The potential zones in the study area are classified as very good, good, moderate, and poor, with areal extents of 15.67, 381.06, 575.38, and 174.49 sq. km, respectively.
Keywords: ArcGIS, DEM, groundwater, recharge, weighted overlay
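The weighted-overlay step can be illustrated with a small numpy sketch; the ranks, influence weights, and toy raster below are assumptions for illustration and do not reproduce the study's actual weighting scheme.

```python
import numpy as np

# Sketch of weighted-overlay analysis: each thematic layer is ranked per cell,
# the ranked layers are combined with influence weights, and the composite
# score is classified into potential zones. Ranks, weights and the tiny raster
# grid are illustrative assumptions.

geology   = np.array([[3, 2], [4, 1]])   # rank 1 (poor) .. 4 (very good)
geomorph  = np.array([[4, 3], [2, 2]])
lineament = np.array([[2, 4], [3, 1]])
slope     = np.array([[3, 3], [1, 4]])

weights = {"geology": 0.30, "geomorph": 0.30, "lineament": 0.20, "slope": 0.20}

score = (weights["geology"] * geology + weights["geomorph"] * geomorph +
         weights["lineament"] * lineament + weights["slope"] * slope)

# Classify the composite score into four potential classes
classes = np.digitize(score, bins=[1.5, 2.5, 3.5])   # 0=poor .. 3=very good
labels = np.array(["poor", "moderate", "good", "very good"])
print(labels[classes])
```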
Procedia PDF Downloads 444
454 Interval Bilevel Linear Fractional Programming
Authors: F. Hamidi, N. Amiri, H. Mishmast Nehi
Abstract:
The bilevel programming (BP) model has been presented for a decision-making process that consists of two decision makers in a hierarchical structure. In fact, BP is a model of a static two-person game (the leader player at the upper level and the follower player at the lower level) wherein each player tries to optimize his/her personal objective function under dependent constraints; this game is sequential and non-cooperative. The decision variables are divided between the two players, and one’s choice affects the other’s benefit and choices. In other words, BP consists of two nested optimization problems with two objective functions (upper and lower), where the constraint region of the upper-level problem is implicitly determined by the lower-level problem. In real cases, the coefficients of an optimization problem may not be precise, i.e., they may be intervals. In this paper, we develop an algorithm for solving interval bilevel linear fractional programming problems, that is, bilevel problems in which both objective functions are linear fractional, the coefficients are intervals, and the common constraint region is a polyhedron. From the original problem, the best and the worst bilevel linear fractional problems are derived; then, using the extended Charnes and Cooper transformation, each fractional problem can be reduced to a linear problem. We can then find the best and the worst optimal values of the leader objective function by two algorithms.
Keywords: best and worst optimal solutions, bilevel programming, fractional, interval coefficients
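The building block of the approach, the Charnes-Cooper reduction of a single linear fractional program to a linear program, can be sketched with scipy as below; the small numerical instance is an illustrative assumption and the full interval bilevel algorithm is not reproduced.

```python
import numpy as np
from scipy.optimize import linprog

# Charnes-Cooper reduction of a single linear-fractional program
#   max (c.x + alpha) / (d.x + beta)  s.t.  A x <= b, x >= 0,
# assuming d.x + beta > 0 on the feasible region. The numerical instance is
# purely illustrative; the interval bilevel machinery is not shown.

c, alpha = np.array([2.0, 1.0]), 0.0
d, beta  = np.array([1.0, 3.0]), 4.0
A = np.array([[1.0, 1.0], [2.0, 1.0]])
b = np.array([6.0, 8.0])

n = len(c)
# Transformed variables z = (y, t) with y = t*x and t = 1/(d.x + beta)
obj = -np.concatenate([c, [alpha]])                 # linprog minimizes
A_ub = np.hstack([A, -b.reshape(-1, 1)])            # A y - b t <= 0
b_ub = np.zeros(len(b))
A_eq = np.concatenate([d, [beta]]).reshape(1, -1)   # d.y + beta t = 1
b_eq = np.array([1.0])

res = linprog(obj, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * (n + 1))
y, t = res.x[:n], res.x[n]
print(y / t, -res.fun)    # recovered x and the optimal fractional objective
```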
Procedia PDF Downloads 446
453 ISMARA: Completely Automated Inference of Gene Regulatory Networks from High-Throughput Data
Authors: Piotr J. Balwierz, Mikhail Pachkov, Phil Arnold, Andreas J. Gruber, Mihaela Zavolan, Erik van Nimwegen
Abstract:
Understanding the key players and interactions in the regulatory networks that control gene expression and chromatin state across different cell types and tissues in metazoans remains one of the central challenges in systems biology. Our laboratory has pioneered a number of methods for automatically inferring core gene regulatory networks directly from high-throughput data by modeling gene expression (RNA-seq) and chromatin state (ChIP-seq) measurements in terms of genome-wide computational predictions of regulatory sites for hundreds of transcription factors and micro-RNAs. These methods have now been completely automated in an integrated webserver called ISMARA that allows researchers to analyze their own data by simply uploading RNA-seq or ChIP-seq data sets, and it provides results in an integrated web interface as well as in downloadable flat files. For any data set, ISMARA infers the key regulators in the system, their activities across the input samples, the genes and pathways they target, and the core interactions between the regulators. We believe that by empowering experimental researchers to apply cutting-edge computational systems biology tools to their data in a completely automated manner, ISMARA can play an important role in developing our understanding of regulatory networks across metazoans.
Keywords: gene expression analysis, high-throughput sequencing analysis, transcription factor activity, transcription regulation
Procedia PDF Downloads 65
452 Airbnb, Hotel Industry and Optimum Strategies: Evidence from European Cities, Barcelona, London and Paris
Authors: Juan Pedro Aznar Alarcon, Josep Maria Sayeras Maspera
Abstract:
Airbnb and other similar platforms offer a near substitute for the traditional accommodation service supplied by the hotel sector. In this context, hotels can try to compete by offering higher quality and additional services, which imply the need for new investments, or they can try to compete by reducing prices. The theoretical model presented in this paper analyzes the best response using a sequential game theory model. The main conclusion is that, due to the financial constraints that small and medium hotels face, these hotels have reduced prices, whereas hotels that belong to international groups or have easy access to financial resources have increased their investment to raise the quality of the service provided. To check the validity of the theoretical model, financial data from Barcelona, London, and Paris hotels were used, analyzing profitability, the quality of the service provided, the propensity to invest, and the evolution of gross profit. The model and the empirical data provide the basis for industrial policy in the hospitality industry. Addressing the extra cost that small hotels in Europe face compared with bigger firms would help to improve the level of quality provided and, to some extent, have positive externalities in terms of job creation and increasing added value for the industry.
Keywords: Airbnb, profitability, hospitality industry, game theory
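A toy backward-induction sketch of such a sequential leader-follower game is shown below; the action labels and payoff numbers are purely illustrative and are not estimates from the paper.

```python
# Toy backward-induction sketch of a sequential game of the kind modeled above:
# the hotel (leader) chooses to invest in quality or to cut prices, and a
# platform entrant (follower) then chooses how aggressively to expand.
# Payoff numbers are purely illustrative, not estimates from the paper.

# payoffs[hotel_action][entrant_action] = (hotel_profit, entrant_profit)
payoffs = {
    "invest":    {"expand": (6, 4), "hold": (9, 2)},
    "cut_price": {"expand": (3, 5), "hold": (5, 3)},
}

def solve_sequential(payoffs):
    best = None
    for hotel_action, replies in payoffs.items():
        # Follower best-responds to the observed leader action
        entrant_action = max(replies, key=lambda a: replies[a][1])
        hotel_profit = replies[entrant_action][0]
        if best is None or hotel_profit > best[2]:
            best = (hotel_action, entrant_action, hotel_profit)
    return best

print(solve_sequential(payoffs))   # subgame-perfect outcome: ('invest', 'expand', 6)
```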
Procedia PDF Downloads 348
451 Optimization of Technical and Technological Solutions for the Development of Offshore Hydrocarbon Fields in the Kaliningrad Region
Authors: Pavel Shcherban, Viktoria Ivanova, Alexander Neprokin, Vladislav Golovanov
Abstract:
Currently, LLC «Lukoil-Kaliningradmorneft» is implementing a comprehensive program for the development of offshore fields in the Kaliningrad region. This is largely associated with the depletion of the onshore resource base in the region, as well as with the positive results of geological exploration of the surrounding Baltic Sea area and the data on the volume of hydrocarbon recovery from the single offshore field currently operating in the Kaliningrad region – D-6 «Kravtsovskoye». The article analyzes the main stages of LLC «Lukoil-Kaliningradmorneft»’s program for developing the hydrocarbon resources of the region's shelf and suggests an optimization algorithm for managing the multi-criteria process of developing shelf deposits. The algorithm is formulated as a sequential decision-making problem, a branch of dynamic programming. Applying the algorithm during the consolidation of the initial data, the preparation of project documentation, and the further exploration and development of offshore fields will make it possible to optimize the set of technical and technological solutions and increase the economic efficiency of the field development project implemented by LLC «Lukoil-Kaliningradmorneft».
Keywords: offshore fields of hydrocarbons of the Baltic Sea, development of offshore oil and gas fields, optimization of the field development scheme, solution of multicriteria tasks in oil and gas complex, quality management in oil and gas complex
Procedia PDF Downloads 200
450 An Intelligent Scheme Switching for MIMO Systems Using Fuzzy Logic Technique
Authors: Robert O. Abolade, Olumide O. Ajayi, Zacheaus K. Adeyemo, Solomon A. Adeniran
Abstract:
Link adaptation is an important strategy for achieving robust wireless multimedia communications based on quality of service (QoS) demands. Scheme switching in multiple-input multiple-output (MIMO) systems is an aspect of link adaptation, and it involves selecting among different MIMO transmission schemes or modes so as to adapt to varying radio channel conditions for the purpose of achieving QoS delivery. However, finding the most appropriate switching method for MIMO links is still a challenge, as existing methods are either computationally complex or not always accurate. This paper presents an intelligent switching method for a MIMO system consisting of two schemes - transmit diversity (TD) and spatial multiplexing (SM) - using a fuzzy logic technique. In this method, two channel quality indicators (CQIs), namely the average received signal-to-noise ratio (RSNR) and the received signal strength indicator (RSSI), are measured and passed as inputs to the fuzzy logic system, which then gives a decision - an inference. The switching decision of the fuzzy logic system is fed back to the transmitter to switch between the TD and SM schemes. Simulation results show that the proposed fuzzy-logic-based switching technique outperforms the conventional static switching technique in terms of bit error rate and spectral efficiency.
Keywords: channel quality indicator, fuzzy logic, link adaptation, MIMO, spatial multiplexing, transmit diversity
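A compact Mamdani-style sketch of the switching logic is given below; the triangular membership breakpoints and the two rules are illustrative assumptions rather than the paper's actual fuzzy system.

```python
# Compact Mamdani-style sketch of the switching logic: two channel-quality
# inputs (average RSNR in dB, RSSI in dBm) pass through triangular memberships,
# two illustrative rules fire, and the stronger rule selects TD or SM.
# Membership breakpoints and the rule base are assumptions, not the paper's.

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def switch_scheme(rsnr_db, rssi_dbm):
    low_snr,  high_snr   = tri(rsnr_db, -5, 5, 15),  tri(rsnr_db, 10, 25, 40)
    weak_sig, strong_sig = tri(rssi_dbm, -110, -95, -80), tri(rssi_dbm, -85, -65, -45)

    # Rule 1: low SNR or weak signal  -> transmit diversity (robustness)
    td_strength = max(low_snr, weak_sig)
    # Rule 2: high SNR and strong signal -> spatial multiplexing (throughput)
    sm_strength = min(high_snr, strong_sig)

    return "SM" if sm_strength > td_strength else "TD"

print(switch_scheme(rsnr_db=28, rssi_dbm=-60))   # strong channel -> SM
print(switch_scheme(rsnr_db=4,  rssi_dbm=-100))  # poor channel   -> TD
```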
Procedia PDF Downloads 152
449 Ambivalence as Ethical Practice: Methodologies to Address Noise, Bias in Care, and Contact Evaluations
Authors: Anthony Townsend, Robyn Fasser
Abstract:
While complete objectivity is a desirable scientific position from which to conduct a care and contact evaluation (CCE), it is precisely the recognition that we are inherently incapable of operating objectively that is the foundation of ethical practice and skilled assessment. Drawing upon recent research by Daniel Kahneman (2021) on the differences between noise and bias, as well as the inherent biases collectively termed “The Elephant in the Brain” by Kevin Simler and Robin Hanson (2019) from Oxford University, this presentation addresses the various ways in which our judgments, perceptions, and even procedures can be distorted and contaminated while conducting a CCE, and also considers the value of second-order cybernetics and the psychodynamic concept of ‘ambivalence’ as a conceptual basis for assessment methodologies that limit such errors or at least better identify them. Both a conceptual framework for ambivalence, our higher-order capacity to allow the convergence and consideration of multiple emotional experiences and cognitive perceptions to inform our reasoning, and a practical methodology for assessment relying on data triangulation, Bayesian inference, and hypothesis testing are presented as a means of promoting ethical practice for health care professionals conducting CCEs. An emphasis on widening awareness and perspective, limiting ‘splitting’, is demonstrated both in how this form of emotional processing plays out in alienating dynamics in families and in the assessment thereof. In addressing this concept, this presentation aims to illuminate the value of ambivalence as foundational to ethical practice for assessors.
Keywords: ambivalence, forensic, psychology, noise, bias, ethics
Procedia PDF Downloads 86
448 Relationship between Electricity Consumption and Economic Growth: Evidence from Nigeria (1971-2012)
Authors: N. E Okoligwe, Okezie A. Ihugba
Abstract:
Few scholars disagree that electricity consumption is an important supporting factor for economic growth. However, the relationship between electricity consumption and economic growth manifests differently in different countries, according to previous studies. This paper examines the causal relationship between electricity consumption and economic growth in Nigeria. In an attempt to do this, the paper tests the validity of the modernization or dependency hypothesis by employing various econometric tools, such as the Augmented Dickey-Fuller (ADF) and Johansen cointegration tests, the error correction mechanism (ECM), and the Granger causality test, on time series data from 1971-2012. Granger causality is found to run neither from electricity consumption to real GDP nor from GDP to electricity consumption during the period of study. The null hypothesis is accepted at the 5 per cent level of significance, where the probability values (0.2251 and 0.8251) are greater than five per cent, because both variables are probably determined by other factors, such as the increase in urban population, the unemployment rate, and the number of Nigerians who benefit from the increase in GDP; likewise, the increase in electricity demand is not determined by the increase in GDP (income) over the period of study, because electricity demand has always been greater than consumption. Consequently, policy makers in Nigeria should, in the early stages of reconstruction, prioritize building capacity additions and infrastructure development for the electric power sector, as this would support sustainable economic growth in Nigeria.
Keywords: economic growth, electricity consumption, error correction mechanism, granger causality test
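A minimal statsmodels sketch of this test sequence (unit-root checks followed by Granger causality) is shown below; a synthetic two-series dataset stands in for the 1971-2012 electricity and GDP data, which are not reproduced here.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller, grangercausalitytests

# Sketch of the test sequence used above: ADF unit-root tests, then Granger
# causality in both directions. The synthetic two-series DataFrame is only a
# stand-in for the 1971-2012 electricity-consumption / real-GDP data.

rng = np.random.default_rng(0)
n = 42
gdp = np.cumsum(rng.normal(0.5, 1.0, n))          # I(1)-like series
elec = 0.6 * gdp + rng.normal(0.0, 1.0, n)
df = pd.DataFrame({"elec": elec, "gdp": gdp})

# Augmented Dickey-Fuller test on levels and first differences
for col in df:
    stat, pval, *_ = adfuller(df[col])
    print(f"ADF level {col}: p={pval:.3f}")
    stat, pval, *_ = adfuller(df[col].diff().dropna())
    print(f"ADF diff  {col}: p={pval:.3f}")

# Granger causality: second column tested as a predictor of the first,
# so the two calls test both directions (results are printed per lag).
grangercausalitytests(df[["gdp", "elec"]], maxlag=2)
grangercausalitytests(df[["elec", "gdp"]], maxlag=2)
```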
Procedia PDF Downloads 309
447 Learning a Bayesian Network for Situation-Aware Smart Home Service: A Case Study with a Robot Vacuum Cleaner
Authors: Eu Tteum Ha, Seyoung Kim, Jeongmin Kim, Kwang Ryel Ryu
Abstract:
The smart home environment, backed by IoT (Internet of Things) technologies, enables intelligent services based on awareness of the situation a user is currently in. One of the convenient sensors for recognizing the situations within a home is the smart meter, which can monitor the status of each electrical appliance in real time. This paper aims at learning a Bayesian network that models the causal relationship between the user's situations and the status of the electrical appliances. Using such a network, we can infer the current situation based on the observed status of the appliances. However, learning the conditional probability tables (CPTs) of the network requires many training examples, which cannot be obtained unless the user's situations are closely monitored by some means. This paper proposes a method for learning the CPT entries of the network that relies only on user feedback generated occasionally. In our case study with a robot vacuum cleaner, feedback comes in whenever the user gives the robot an order that goes against its preprogrammed setting. Given a network with randomly initialized CPT entries, our proposed method uses this feedback information to adjust the relevant CPT entries in the direction of increasing the probability of recognizing the desired situations. Simulation experiments show that our method can rapidly improve the recognition performance of the Bayesian network using a relatively small number of feedbacks.
Keywords: Bayesian network, IoT, learning, situation-awareness, smart home
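The feedback-driven adjustment can be sketched as below for a single CPT row; the node, appliance patterns, step size, and feedback convention are illustrative assumptions.

```python
# Minimal sketch of the feedback-driven CPT adjustment described above. A
# single CPT row P(situation | appliance status pattern) is nudged toward the
# situation implied by the user's corrective feedback and then renormalised.
# The node, appliance patterns, step size and feedback convention are
# illustrative assumptions, not the paper's exact procedure.

def adjust_cpt(cpt, evidence, desired_situation, step=0.1):
    """Increase P(desired_situation | evidence) and renormalise the row."""
    row = cpt[evidence]
    row[desired_situation] += step
    total = sum(row.values())
    for s in row:
        row[s] /= total

# CPT rows indexed by the observed appliance pattern (TV, washing machine)
cpt = {
    ("tv_on",  "washer_off"): {"watching_tv": 0.5, "away": 0.5},
    ("tv_off", "washer_on"):  {"watching_tv": 0.5, "away": 0.5},
}

# The user overrides the robot vacuum while the TV is on -> they are watching TV
adjust_cpt(cpt, ("tv_on", "washer_off"), "watching_tv")
print(cpt[("tv_on", "washer_off")])
```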
Procedia PDF Downloads 522
446 Earnings Volatility and Earnings Predictability
Authors: Yosra Ben Mhamed
Abstract:
Most previous research investigating the importance of earnings volatility for a firm’s value has focused on the effects of earnings volatility on the cost of capital. Many studies illustrate that earnings volatility can reduce a firm’s value by raising its cost of capital. However, a few recent studies directly examine the relation between earnings volatility and subsequent earnings levels. In our study, we further explore the role of volatility in forecasting. Our study makes two primary contributions to the literature. First, taking into account the level of the firm's current performance, we provide a causal theory for the link between volatility and earnings predictability, whereas previous studies testing the linearity of this relationship have not offered any underlying theory. Secondly, our study contributes to the vast body of fundamental analysis research that identifies a set of variables that improve valuation by showing that earnings volatility affects the estimation of future earnings. Projections of earnings are used in valuation research and practice to derive estimates of firm value. Since we want to examine the impact of volatility on earnings predictability, we sort the sample into three portfolios according to their level of earnings volatility in ascending order. For each quintile, we present the predictability coefficient. In a second test, each of these portfolios is then sorted into three further quintiles based on their level of current earnings, yielding nine quintiles. This allows us to observe whether volatility strongly predicts decreases in earnings predictability only for the highest quintile of earnings. In general, we find that earnings volatility has an inverse relationship with earnings predictability. Our results also show that the sensitivity of earnings predictability to ex-ante volatility is more pronounced among profitable firms. The findings are most consistent with overinvestment and persistence explanations.
Keywords: earnings volatility, earnings predictability, earnings persistence, current profitability
Procedia PDF Downloads 433
445 Factors Affecting Employee’s Effectiveness at Job in Banking Sectors of Pakistan
Authors: Sajid Aman
Abstract:
Jobs in the banking sector in Pakistan are perceived as very tough, due to which employee turnover is very high. However, the managerial role is very important in influencing employees’ attitudes toward their turnout. This paper explores the manager’s role in influencing employees’ effectiveness on the job. The paper adopted a pragmatic approach by combining both qualitative and quantitative data, employing an exploratory sequential strategy under a mixed-methods research design. Qualitative data were analyzed using thematic analysis. Five major themes emerged as key factors increasing employees' effectiveness in the banking sector: the manager’s attitude towards employees, his leadership style, listening to employees' personal problems, the provision of personal loans without interest, and future career prospects. The quantitative data revealed that a manager’s attitude, leadership style, availability to listen to employees’ personal problems, and future career prospects are strongly associated with employees’ effectiveness at the job. However, personal loans without interest were found to have no significant association with employees' effectiveness at the job. The study concludes that the manager’s role is highly important for the effectiveness of employees at their jobs in the banking sector. It is suggested that managers should have a positive attitude towards employees and give time to listening to employees' problems, even personal ones.
Keywords: banking sector, employee’s effectiveness, manager’s role, leadership style
Procedia PDF Downloads 32
444 Investigation of Cytotoxic Compounds in Ethyl Acetate and Chloroform Extracts of Nigella sativa Seeds by Sulforhodamine-B Assay-Guided Fractionation
Authors: Harshani Uggallage, Kapila D. Dissanayaka
Abstract:
A Sulforhodamine-B assay-guided fractionation of Nigella sativa seeds was conducted to determine the presence of cytotoxic compounds against human hepatoma (HepG2) cells. Initially, a freeze-dried sample of Nigella sativa seeds was sequentially extracted into solvents of increasing polarity. Crude extracts from the sequential extraction of Nigella sativa seeds in chloroform and ethyl acetate showed the highest cytotoxicity. The combined mixture of these two extracts was subjected to bioassay-guided fractionation using a modified Kupchan method of partitioning, followed by Sephadex® LH-20 chromatography. This chromatographic separation process resulted in a column fraction with a convincing IC50 (half-maximal inhibitory concentration) value of 13.07 µg/ml, which is promising for developing therapeutic drug leads against human hepatoma. Reversed-phase high-performance liquid chromatography (HPLC) was finally conducted on the same column fraction, and the result indicates the presence of one or several main cytotoxic compounds against human HepG2 cells.
Keywords: cytotoxic compounds, half-maximal inhibitory concentration, high-performance liquid chromatography, human HepG2 cells, nigella sativa seeds, Sulforhodamine-B assay
Procedia PDF Downloads 400
443 Non-Linear Regression Modeling for Composite Distributions
Authors: Mostafa Aminzadeh, Min Deng
Abstract:
Modeling loss data is an important part of actuarial science. Actuaries use models to predict future losses and manage financial risk, which can also be beneficial for marketing purposes. In the insurance industry, small claims happen frequently while large claims are rare. Traditional distributions such as the Normal, Exponential, and inverse Gaussian are not suitable for describing insurance data, which often show skewness and fat tails. Several authors have studied classical and Bayesian inference for the parameters of composite distributions, such as Exponential-Pareto, Weibull-Pareto, and Inverse Gamma-Pareto. These models separate small to moderate losses from large losses using a threshold parameter. This research introduces a computational approach using a nonlinear regression model for loss data that relies on multiple predictors. Simulation studies were conducted to assess the accuracy of the proposed estimation method and confirmed that it provides precise estimates of the regression parameters. It is important to note that this approach can be applied to a dataset if goodness-of-fit tests confirm that the composite distribution under study fits the data well. To demonstrate the computations, a real data set from the insurance industry is analyzed. A Mathematica code uses the Fisher scoring algorithm as an iterative method to obtain the maximum likelihood estimates (MLEs) of the regression parameters.
Keywords: maximum likelihood estimation, fisher scoring method, non-linear regression models, composite distributions
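The paper's Mathematica routine is not reproduced here; as a stand-in, the sketch below runs a generic Fisher-scoring iteration for a Poisson log-link regression on simulated data to show the structure of the scoring loop.

```python
import numpy as np

# Generic Fisher-scoring loop for a Poisson log-link regression, shown only as
# a stand-in for the scoring iteration the paper applies to the composite
# distribution's likelihood. Data are simulated; sizes are illustrative.

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + one predictor
beta_true = np.array([0.5, 0.8])
y = rng.poisson(np.exp(X @ beta_true))

beta = np.zeros(X.shape[1])
for _ in range(25):
    mu = np.exp(X @ beta)                    # mean under the current estimate
    score = X.T @ (y - mu)                   # gradient of the log-likelihood
    fisher = X.T @ (mu[:, None] * X)         # expected information matrix
    step = np.linalg.solve(fisher, score)    # scoring step
    beta = beta + step
    if np.max(np.abs(step)) < 1e-8:          # convergence check
        break

print(beta.round(3))                         # close to beta_true
```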
Procedia PDF Downloads 32
442 Bayesian Inference of Physicochemical Quality Elements of Tropical Lagoon Nokoué (Benin)
Authors: Hounyèmè Romuald, Maxime Logez, Mama Daouda, Argillier Christine
Abstract:
In view of the very strong degradation of aquatic ecosystems, it is urgent to set up monitoring systems that are best able to report on the effects of the stresses they undergo. This is particularly true in developing countries, where specific and relevant quality standards and funding for monitoring programs are lacking. The objective of this study was to make a relevant and objective choice of physicochemical parameters informative of the main stressors occurring in African lakes and to identify their alteration thresholds. Based on statistical analyses of the relationship between several driving forces and the physicochemical parameters of the Nokoué lagoon, relevant physicochemical parameters were selected for its monitoring, using an innovative method based on Bayesian statistical modeling. Eleven physicochemical parameters were selected for their response to at least one stressor, and their threshold quality standards were also established: total phosphorus (<4.5 mg/L), orthophosphates (<0.2 mg/L), nitrates (<0.5 mg/L), TKN (<1.85 mg/L), dry organic matter (<5 mg/L), dissolved oxygen (>4 mg/L), BOD (<11.6 mg/L), salinity (7.6), water temperature (<28.7 °C), pH (>6.2), and transparency (>0.9 m). According to the System for the Evaluation of Coastal Water Quality, these thresholds correspond to “good to medium” suitability classes, except for total phosphorus. One of the original features of this study is the use of the bounds of the credibility interval of the fixed-effect coefficients as local alteration standards for characterizing the physicochemical status of this anthropized African ecosystem.
Keywords: driving forces, alteration thresholds, acadjas, monitoring, modeling, human activities
Procedia PDF Downloads 94
441 Glycan Analyzer: Software to Annotate Glycan Structures from Exoglycosidase Experiments
Authors: Ian Walsh, Terry Nguyen-Khuong, Christopher H. Taron, Pauline M. Rudd
Abstract:
Glycoproteins and their covalently bonded glycans play critical roles in the immune system, cell communication, disease, and disease prognosis. Ultra-performance liquid chromatography (UPLC) coupled with mass spectrometry is conventionally used to qualitatively and quantitatively characterize glycan structures in a given sample. Exoglycosidases are enzymes that catalyze the sequential removal of monosaccharides from the non-reducing end of glycans. They naturally have specificity for a particular type of sugar, its stereochemistry (α or β anomer), and its position of attachment to an adjacent sugar on the glycan. Thus, monitoring the peak movements (in both the UPLC and MS1) after application of exoglycosidases provides a unique and effective way to annotate sugars in high detail, i.e., differentiating positional and linkage isomers. Manual annotation of an exoglycosidase experiment is difficult and time consuming. As such, with increasing sample complexity and numbers of exoglycosidases, the analysis could require manually interpreting hundreds of peak movements. Recently, we have implemented pattern recognition software for the automated interpretation of UPLC-MS1 exoglycosidase digestions. In this work, we explain the software, indicate how much time it will save, and provide example usage showing the annotation of positional and linkage isomers in immunoglobulin G, apolipoprotein J, and simple glycan standards.
Keywords: bioinformatics, automated glycan assignment, liquid chromatography, mass spectrometry
Procedia PDF Downloads 200
440 Latent Factors of Severity in Truck-Involved and Non-Truck-Involved Crashes on Freeways
Authors: Shin-Hyung Cho, Dong-Kyu Kim, Seung-Young Kho
Abstract:
Truck-involved crashes have higher severity than non-truck-involved crashes. There have been many studies of crash frequency and the development of severity models, but those studies only analyzed the relationships between observed variables. To identify why more people are injured or killed when trucks are involved in a crash, we must quantify the complex causal relationships between crash severity and risk factors by adopting latent factors of crashes. The aim of this study was to develop a structural equation model of truck-involved and non-truck-involved crashes, including five latent variables: a crash factor, an environmental factor, a road factor, a driver factor, and a severity factor. To clarify the unique characteristics of truck-involved crashes compared to non-truck-involved crashes, a confirmatory analysis method was used. To develop the model, we extracted crash data from 10,083 crashes on Korean freeways from 2008 through 2014. The results showed that the most significant variable affecting crash severity is the crash factor, which can be expressed by the location, cause, and type of the crash. For non-truck-involved crashes, the crash and environment factors increase crash severity; conversely, the road and driver factors tend to reduce it. For truck-involved crashes, the driver factor has a significant effect on crash severity, although its effect is slightly smaller than that of the crash factor. A multiple group analysis was employed to analyze the differences between these heterogeneous groups of drivers.
Keywords: crash severity, structural equation modeling (SEM), truck-involved crashes, multiple group analysis, crash on freeway
Procedia PDF Downloads 383
439 Internal Combustion Engine Fuel Composition Detection by Analysing Vibration Signals Using ANFIS Network
Authors: M. N. Khajavi, S. Nasiri, E. Farokhi, M. R. Bavir
Abstract:
Alcohol fuels are renewable, produce low pollution, and have high octane numbers; therefore, they are important as fuels for internal combustion engines. Detecting the percentage of these alcohol fuels blended with gasoline is a complicated, time-consuming, and expensive process. Nowadays, these measurements are carried out in well-equipped laboratories, based on international standards. The aim of this research is to detect the fuel composition based on vibration analysis of engine block signals; by doing so, considerable savings in time and cost can be achieved. Five different fuels were prepared, consisting of pure gasoline (G) as the base fuel and blends of this fuel with different percentages of ethanol and methanol. For example, a volumetric blend of pure gasoline with 10 percent ethanol is called E10. By this convention, M10 (10% methanol plus 90% pure gasoline), E30 (30% ethanol plus 70% pure gasoline), and M30 (30% methanol plus 70% pure gasoline) were prepared. To simulate real working conditions for this experiment, the vehicle was mounted on a chassis dynamometer and run at 1900 rpm under a 30 kW load. To measure the engine block vibration, a three-axis accelerometer was mounted between cylinders 2 and 3. After acquisition of the vibration signal, eight time-domain features of these signals were used as inputs to an Adaptive Neuro-Fuzzy Inference System (ANFIS). The designed ANFIS was trained to classify these five different fuels. The results show the suitable classification ability of the designed ANFIS network, with 96.3 percent correct classification.
Keywords: internal combustion engine, vibration signal, fuel composition, classification, ANFIS
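The abstract does not list the eight time-domain features, so the sketch below assumes a common set (RMS, crest factor, kurtosis, and so on) to illustrate the feature-extraction step that feeds the classifier.

```python
import numpy as np

# Sketch of the time-domain feature extraction feeding the classifier. The
# paper uses eight time features of the block-vibration signal but does not
# list them in the abstract, so a common set is assumed here for illustration.

def time_features(signal):
    s = np.asarray(signal, dtype=float)
    rms = np.sqrt(np.mean(s ** 2))
    peak = np.max(np.abs(s))
    return {
        "mean": s.mean(),
        "std": s.std(),
        "rms": rms,
        "peak": peak,
        "crest_factor": peak / rms,
        "skewness": np.mean(((s - s.mean()) / s.std()) ** 3),
        "kurtosis": np.mean(((s - s.mean()) / s.std()) ** 4),
        "impulse_factor": peak / np.mean(np.abs(s)),
    }

# Synthetic stand-in for one accelerometer channel (1900 rpm ~ 31.7 Hz)
t = np.linspace(0, 1, 10_000)
vibration = np.sin(2 * np.pi * 31.7 * t) + 0.3 * np.random.default_rng(0).normal(size=t.size)
print({k: round(v, 3) for k, v in time_features(vibration).items()})
```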
Procedia PDF Downloads 401
438 Soft Computing Employment to Optimize Safety Stock Levels in Supply Chain Dairy Product under Supply and Demand Uncertainty
Authors: Riyadh Jamegh, Alla Eldin Kassam, Sawsan Sabih
Abstract:
In order to cope with uncertainty and the resulting inability to meet customers' requests, organizations tend to hold a certain safety stock level (SSL). This level must be chosen carefully in order to avoid increased holding cost due to an excessive SSL or shortage cost due to an SSL that is too low. This paper uses soft computing (fuzzy logic) to identify the optimal SSL; the fuzzy model uses a dynamic concept to cope with a highly complex environment. The proposed model deals with three input variables, i.e., the demand stability level, the raw material availability level, and the on-hand inventory level, using dynamic fuzzy logic to obtain the best SSL as an output. In this model, the demand stability, raw material, and on-hand inventory levels are described linguistically and then treated by the inference rules of the fuzzy model to extract the best safety stock level. The aim of this research is to provide a dynamic approach for identifying the safety stock level that can be implemented in different industries. A numerical case study from the dairy industry, with a 200 g yogurt cup product, is presented to validate the proposed model. The obtained results are compared with the current safety stock level, which is calculated using the traditional approach. The importance of the proposed model is demonstrated by the significant reduction in the safety stock level.
Keywords: inventory optimization, soft computing, safety stock optimization, dairy industries inventory optimization
Procedia PDF Downloads 125
437 Character and Evolution of Electronic Waste: A Technologically Developing Country's Experience
Authors: Karen C. Olufokunbi, Odetunji A. Odejobi
Abstract:
This paper examines the generation, accumulation, and growth of e-waste in a developing country. Images and other data about computer e-waste were collected using a digital camera, 290 copies of a questionnaire, and three structured interviews, with the Obafemi Awolowo University (OAU), Ile-Ife, Nigeria environment as a case study. The numerical data were analysed using the R data analysis and processing tool. Automata-based techniques and a Petri net modelling tool were used to design and simulate a computational model for the recovery of saleable materials from e-waste. The R analysis showed, at a 95 percent confidence level, that 417 units of computer equipment will be disposed of by 2020. Compared to the 800 units in circulation in 2014, 50 percent of personal computer components will become e-waste. This indicates that personal computer components were in high demand due to their low cost and will be disposed of more rapidly when replaced by new computer equipment. Also, 57 percent of the respondents discarded their computer e-waste by throwing it into the garbage bin or by dumping it. The process model, simulated with a Coloured Petri net modelling tool, showed that the e-waste dynamics form a forward sequential process in the form of a pipeline, meaning that the recovery of saleable materials from e-waste occurs in identifiable discrete stages and that e-waste will continue to accumulate and grow in volume over time.
Keywords: Coloured Petri net, computational modelling, electronic waste, electronic waste process dynamics
Procedia PDF Downloads 166
436 Food Insecurity and Quality of Life among the Poor Elderly in South Korea
Authors: Jayoung Cho
Abstract:
Poverty has become a social problem in South Korea, given that seven out of ten elderly people experience multidimensional poverty. As quality of life is a major social welfare measure of a society, verifying the major factors affecting the quality of life of the elderly in poverty can provide baseline data for the promotion of welfare. This study aims to investigate the longitudinal relationship between food insecurity and quality of life among the elderly in poverty. Panel regression analysis of 5-year longitudinal panel data derived from the Korea Welfare Panel Study (KWPS, 2011-2015) was used to address the research question. A total of 1,327 elderly people aged 65 or older with less than 60% of the median income were analyzed. The main results of the study are as follows. First, the level of quality of life of the poor elderly averaged 5 and fluctuated over time. Second, food insecurity and the quality of life of the elderly in poverty had a longitudinal causal relationship. Furthermore, the statistical significance of food insecurity was the highest despite controlling for the major variables affecting the quality of life of the poor elderly. Therefore, political and practical approaches to food insecurity are strongly suggested for improving the quality of life of the elderly in poverty. In practical interventions, it is necessary to pay attention to food insecurity when assessing the poor elderly. Also, there is a need to build a new delivery system that incorporates segmented health- and nutrition-related services. This study has academic significance in that it raises the issue of food insecurity among the poor elderly and confirms the longitudinal relationship between food insecurity and quality of life.
Keywords: food insecurity, longitudinal panel analysis, poor elderly, quality of life
Procedia PDF Downloads 240
435 Cognitive Benefits of Being Bilingual: The Effect of Language Learning on the Working Memory in Emerging Miao-Mandarin Juveniles in Rural Regions of China
Authors: Peien Ma
Abstract:
The bilingual effect/advantage hypothesis posits a positive effect of being bilingual on general cognitive abilities, but it is unknown which factors tend to modulate these bilingualism effects on working memory capacity. This study conducted empirical field research with a group of low-SES emerging bilinguals, Miao people, in the hill tribes of rural China to investigate whether bilingualism affected their verbal working memory performance. Twenty Miao-Chinese bilinguals (13 girls and 7 boys with a mean age of 11.45, SD = 1.67) and 20 Chinese monolingual peers (13 girls and 7 boys with a mean age of 11.6, SD = 0.68) were recruited. These bilingual and monolingual juveniles, matched on age, sex, socioeconomic status, and educational status, completed a language background questionnaire and a standard forward and backward digit span test adapted from the Wechsler Adult Intelligence Scale-Revised (WAIS-R). The results showed that the bilinguals earned a significantly higher overall mean score on the task, suggesting superior working memory ability compared with the monolinguals. These bilingual cognitive benefits were independent of proficiency levels in the learners’ two languages. The results suggest that bilingualism enhances working memory in sequential bilinguals from low-SES backgrounds and shed light on our understanding of the bilingual advantage from a psychological and social perspective.
Keywords: bilingual effects, heritage language, Miao/Hmong language, Mandarin, working memory
Procedia PDF Downloads 157
434 Julia-Based Computational Tool for Composite System Reliability Assessment
Authors: Josif Figueroa, Kush Bubbar, Greg Young-Morris
Abstract:
The reliability evaluation of composite generation and bulk transmission systems is crucial for ensuring a reliable supply of electrical energy to significant system load points. However, evaluating adequacy indices using probabilistic methods such as sequential Monte Carlo simulation can be computationally expensive. Despite this, it is necessary when time-varying and interdependent resources, such as renewables and energy storage systems, are involved. Recent advances in solving power network optimization problems and in parallel computing have improved runtime performance while maintaining solution accuracy. This work introduces CompositeSystems, an open-source composite system reliability evaluation tool developed in Julia™, to address the current deficiencies of commercial and non-commercial tools. It describes the tool's design, validation, and effectiveness, including an analysis of two different formulations of the optimal power flow problem. The simulations demonstrate excellent agreement with existing published studies while improving replicability and reproducibility. Overall, the proposed tool can provide valuable insights into the performance of transmission systems, making it an important addition to the existing toolbox for power system planning.
Keywords: open-source software, composite system reliability, optimization methods, Monte Carlo methods, optimal power flow
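The tool itself is written in Julia; the Python fragment below is only a toy illustration of a sequential (chronological) Monte Carlo adequacy run, with illustrative unit data and a bare capacity check in place of the optimal power flow step.

```python
import numpy as np

# Toy sequential Monte Carlo sketch of an adequacy assessment. The actual tool
# described above is written in Julia; this Python fragment only illustrates
# the idea. Unit sizes, failure/repair rates and the flat load are illustrative
# assumptions, and the network/optimal-power-flow step is replaced by a bare
# capacity check.

rng = np.random.default_rng(42)
units_mw = np.array([200.0, 200.0, 150.0, 100.0, 100.0])
fail_per_h, repair_per_h = 0.001, 0.02      # hourly transition probabilities
load_mw, hours, years = 520.0, 8760, 50

lole_samples = []
for _ in range(years):
    up = np.ones(units_mw.size, dtype=bool)      # all units start available
    loss_hours = 0
    for _ in range(hours):                       # chronological (sequential) walk
        r = rng.random(units_mw.size)
        up = np.where(up, r > fail_per_h, r < repair_per_h)
        if units_mw[up].sum() < load_mw:         # unmet load this hour
            loss_hours += 1
    lole_samples.append(loss_hours)

print(f"LOLE ≈ {np.mean(lole_samples):.2f} h/year")  # toy figure, not a real index
```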
Procedia PDF Downloads 73