Search results for: expected utility maximization
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3610

3580 An Evidence Map of Cost-Utility Studies in Non-Small Cell Lung Cancer

Authors: Cassandra Springate, Alexandra Furber, Jack E. Hines

Abstract:

Objectives: To create an evidence map of the cost-utility studies available with non-small cell lung cancer patients, and identify the geographical settings and interventions used. Methods: Using the Disease, Study Type, and Model Type filters in heoro.com we identified all cost-utility studies published between 1960 and 2017 with patients with non-small cell lung cancer. These papers were then indexed according to pre-specified categories. Results: Heoro.com identified 89 independent publications, published between 1995 and 2017. Of the 89 papers, 74 were published since 2010, 28 were from the USA, and 35 were from Europe, 16 of which were from the UK. Other publications were from China and Japan (13), Canada (9), Australia and New Zealand (4), and other countries (8). Fifty-nine studies included a chemotherapy intervention, of which 23 included erlotinib or gefitinib, 21 included pemetrexed or docetaxel, others included nivolumab (3), pembrolizumab (2), crizotinib (2), denosumab (2), necitumumab (1), and bevacizumab (1). Also, 19 studies modeled screening, staging, or surveillance strategies. Conclusions: The cost-utility studies found for NSCLC most commonly looked at the effectiveness of different chemotherapy treatments, with some also evaluating the addition of screening strategies. Most were also conducted with patient data from the USA and Europe.

Keywords: cancer, cost-utility, economic model, non-small cell lung cancer

Procedia PDF Downloads 128
3579 Optimal Diversification and Bank Value Maximization

Authors: Chien-Chih Lin

Abstract:

This study argues that the optimal diversifications for the maximization of bank value are asymmetrical; they depend on the business cycle. During times of expansion, systematic risks are relatively low, and hence there is only a slight effect from raising them with a diversified portfolio. Consequently, the benefit of reducing individual risks dominates any loss from raising systematic risks, leading to a higher value for a bank by holding a diversified portfolio of assets. On the contrary, in times of recession, systematic risks are relatively high. It is more likely that the loss from raising systematic risks surpasses the benefit of reducing individual risks from portfolio diversification. Consequently, more diversification leads to lower bank values. Finally, some empirical evidence from the banks in Taiwan is provided.

Keywords: diversification, default probability, systemic risk, banking, business cycle

Procedia PDF Downloads 410
3578 Increasing Performance of Autopilot Guided Small Unmanned Helicopter

Authors: Tugrul Oktay, Mehmet Konar, Mustafa Soylak, Firat Sal, Murat Onay, Orhan Kizilkaya

Abstract:

In this paper, we attempt to increase the autonomous performance of a small unmanned helicopter. For this purpose, a small unmanned helicopter, called ZANKA-Heli-I, was manufactured at Erciyes University, Faculty of Aeronautics and Astronautics. For performance maximization, autopilot parameters are determined by minimizing a cost function consisting of flight performance parameters such as settling time, rise time, and overshoot during trajectory tracking. A stochastic optimization method, simultaneous perturbation stochastic approximation, is employed for this purpose. Using this approach, a considerable increase in autonomous performance (around 23%) is obtained.
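The SPSA tuning loop described above can be sketched as follows. The quadratic cost is a stand-in for the flight-performance cost (settling time, rise time, overshoot) used in the paper, and the gain sequences and target parameter values are illustrative assumptions, not the authors' settings:

```python
import numpy as np

def spsa_minimize(cost, theta0, iterations=200, a=0.1, c=0.1, seed=0):
    """Simultaneous Perturbation Stochastic Approximation (SPSA).

    Estimates the gradient from only two cost evaluations per iteration,
    regardless of the number of parameters being tuned.
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    for k in range(1, iterations + 1):
        ak = a / k ** 0.602            # standard SPSA gain sequences
        ck = c / k ** 0.101
        delta = rng.choice([-1.0, 1.0], size=theta.shape)  # Bernoulli +/-1 perturbation
        g_hat = (cost(theta + ck * delta) - cost(theta - ck * delta)) / (2 * ck) * (1 / delta)
        theta = theta - ak * g_hat     # stochastic gradient step
    return theta

# Hypothetical surrogate: penalize deviation of three autopilot gains from
# values assumed to yield good settling time, rise time, and overshoot.
target = np.array([2.0, -1.0, 0.5])
cost = lambda th: np.sum((th - target) ** 2)
theta_opt = spsa_minimize(cost, np.zeros(3))
```

Because each iteration needs only two cost evaluations, the method scales well when every evaluation is an expensive simulated or flown trajectory.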

Keywords: small helicopters, hierarchical control, stochastic optimization, autonomous performance maximization, autopilots

Procedia PDF Downloads 559
3577 Pure Scalar Equilibria for Normal-Form Games

Authors: Herbert W. Corley

Abstract:

A scalar equilibrium (SE) is an alternative type of equilibrium in pure strategies for an n-person normal-form game G. It is defined using optimization techniques to obtain a pure strategy for each player of G by maximizing an appropriate utility function over the acceptable joint actions. The players’ actions are determined by the choice of the utility function. Such a utility function could be agreed upon by the players or chosen by an arbitrator. An SE is an equilibrium since no player of G can increase the value of this utility function by changing his strategy. SEs are formally defined, and examples are given. In a greedy SE, the goal is to assign actions to the players giving them the largest individual payoffs jointly possible. In a weighted SE, each player is assigned weights modeling the degree to which he helps every player, including himself, achieve as large a payoff as jointly possible. In a compromise SE, each player wants a fair payoff under a reasonable interpretation of fairness. In a parity SE, the players want their payoffs to be as nearly equal as jointly possible. Finally, a satisficing SE achieves a personal target payoff value for each player. The vector payoffs associated with each of these SEs are shown to be Pareto optimal among all such acceptable vectors, as well as computationally tractable.
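As an illustration, a greedy-style SE and a parity-style SE can be computed by direct enumeration of the joint actions. The game, the use of the payoff sum as the greedy utility, and the negative payoff spread as the parity utility are hypothetical choices for this sketch, not the paper's exact definitions:

```python
# Hypothetical 2-player normal-form game: payoffs[(a1, a2)] = (u1, u2)
payoffs = {
    (0, 0): (3, 3), (0, 1): (0, 5),
    (1, 0): (5, 0), (1, 1): (1, 1),   # a Prisoner's-Dilemma-like game
}

def scalar_equilibrium(payoffs, utility):
    """Pick the joint action maximizing a scalar utility of the payoff vector.

    A global maximizer is an equilibrium with respect to this utility:
    no player can increase the utility value by changing strategy.
    """
    return max(payoffs, key=lambda joint: utility(payoffs[joint]))

greedy = scalar_equilibrium(payoffs, utility=sum)                   # total payoff
parity = scalar_equilibrium(payoffs, lambda u: -abs(u[0] - u[1]))   # near-equal payoffs
```

Note that the greedy SE here selects (0, 0), which is not the Nash equilibrium of this game; that contrast is exactly what makes scalar equilibria an alternative solution concept.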

Keywords: compromise equilibrium, greedy equilibrium, normal-form game, parity equilibrium, pure strategies, satisficing equilibrium, scalar equilibria, utility function, weighted equilibrium

Procedia PDF Downloads 92
3576 Utility Assessment Model for Wireless Technology in Construction

Authors: Yassir AbdelRazig, Amine Ghanem

Abstract:

Construction projects are information intensive in nature and involve many activities that are related to each other. Wireless technologies can be used to improve the accuracy and timeliness of data collected from construction sites and to share it with the appropriate parties. Nonetheless, the construction industry tends to be conservative and shows hesitation to adopt new technologies. A main concern for owners, contractors, or any person in charge of a job site is the cost of the technology in question. Wireless technologies are not cheap; there are many expenses to be taken into consideration, and a study should be completed to make sure that the benefits and savings resulting from the use of this technology are worth the expense. This research attempts to assess the effectiveness of using appropriate wireless technologies based on criteria such as performance, reliability, and risk. The assessment is based on a utility function model that breaks the selection problem down into the attributes of the alternatives. The attributes are then assigned weights, and single attributes are measured. Finally, the single-attribute utilities are combined to develop one aggregate utility index for each alternative.
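A minimal sketch of the aggregation step, assuming an additive utility model with hypothetical weights, alternatives, and 0-1 single-attribute scores (the paper does not specify its numbers):

```python
# Hypothetical single-attribute scores (0-1 scale) for two wireless alternatives,
# on the criteria named in the abstract: performance, reliability, and risk
# (a higher risk score here means lower risk, so "higher is better" throughout).
weights = {"performance": 0.5, "reliability": 0.3, "risk": 0.2}
alternatives = {
    "RFID":  {"performance": 0.7, "reliability": 0.8, "risk": 0.9},
    "Wi-Fi": {"performance": 0.9, "reliability": 0.6, "risk": 0.5},
}

def aggregate_utility(scores, weights):
    """Additive multi-attribute utility: weighted sum of single-attribute utilities."""
    return sum(weights[c] * scores[c] for c in weights)

index = {name: aggregate_utility(s, weights) for name, s in alternatives.items()}
best = max(index, key=index.get)
```

In practice the weights would come from an elicitation method such as the analytic hierarchy process named in the keywords.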

Keywords: analytic hierarchy process, decision theory, utility function, wireless technologies

Procedia PDF Downloads 315
3575 Household Wealth and Portfolio Choice When Tail Events Are Salient

Authors: Carlson Murray, Ali Lazrak

Abstract:

Robust experimental evidence of systematic violations of expected utility (EU) establishes that individuals facing risk overweight utility from low probability gains and losses when making choices. These findings motivated the development of models of preferences with probability weighting functions, such as rank-dependent utility (RDU). We solve for the optimal investing strategy of an RDU investor in a dynamic binomial setting, from which we derive implications for investing behavior. We show that relative to EU investors with constant relative risk aversion, commonly measured probability weighting functions produce optimal RDU terminal wealth with significant downside protection and upside exposure. We additionally find that in contrast to EU investors, RDU investors optimally choose a portfolio that contains fair bets providing payoffs that can be interpreted as lottery outcomes or exposure to idiosyncratic returns. In a calibrated version of the model, we calculate that RDU investors would be willing to pay 5% of their initial wealth for the freedom to trade away from an optimal EU wealth allocation. The dynamic trading strategy that supports the optimal wealth allocation implies portfolio weights that are independent of initial wealth but require a higher risky share after good stock return histories. Optimal trading also implies the possibility of non-participation when historical returns are poor. Our model fills a gap in the literature by providing new quantitative and qualitative predictions that can be tested experimentally or using data on household wealth and portfolio choice.
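The overweighting of low-probability outcomes that drives these results can be illustrated with a common probability weighting function. This sketch assumes the Tversky-Kahneman (1992) weighting form and a simple CRRA-style utility; the paper's calibration may differ:

```python
import numpy as np

def tk_weight(p, gamma=0.65):
    """Tversky-Kahneman (1992) probability weighting function."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def rdu_value(outcomes, probs, gamma=0.65, crra=0.5):
    """Rank-dependent utility of a gamble.

    Utilities are weighted by differences of the transformed decumulative
    probabilities, with outcomes sorted from best to worst.
    """
    order = np.argsort(outcomes)[::-1]
    x = np.asarray(outcomes, float)[order]
    p = np.asarray(probs, float)[order]
    decum = np.cumsum(p)                      # P(outcome >= x_i)
    w = tk_weight(decum, gamma)
    dw = np.diff(np.concatenate(([0.0], w)))  # decision weights
    return np.sum(dw * x**crra)               # utility u(x) = x**0.5

# A 1% chance at 100 is overweighted relative to expected utility,
# which is the mechanism behind the lottery-like optimal payoffs above.
lottery = rdu_value([100.0, 0.0], [0.01, 0.99])
eu = 0.01 * 100.0**0.5
```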

Keywords: behavioral finance, probability weighting, portfolio choice

Procedia PDF Downloads 401
3574 Transforming Water-Energy-Gas Industry through Smart Metering and Blockchain Technology

Authors: Khoi A. Nguyen, Rodney A. Stewart, Hong Zhang

Abstract:

Advanced metering technologies coupled with informatics create an opportunity to form digital multi-utility service providers. These providers will be able to concurrently collect customers’ medium-to-high-resolution water, electricity, and gas demand data and provide user-friendly platforms to feed this information back to customers and supply/distribution utility organisations. With the emergence of blockchain technology, a new research area has been explored that helps bring this multi-utility service provider concept to a much higher level. This study introduces a breakthrough system architecture in which smart metering technology for water, energy, and gas (WEG) is combined with blockchain technology to provide customers with a novel real-time consumption report and a decentralized resource trading platform. A pilot study on four properties in Australia has been undertaken to demonstrate this system, showing clear benefits for both customers and utilities.

Keywords: blockchain, digital multi-utility, end use, demand forecasting

Procedia PDF Downloads 153
3573 Political Economy and Human Rights Engaging in Conversation

Authors: Manuel Branco

Abstract:

This paper argues that mainstream economics is one of the reasons that can explain the difficulty in fully realizing human rights, because its logic is intrinsically contradictory to human rights, especially economic, social, and cultural rights. First, its utilitarianism, in both its cardinal and ordinal understanding, contradicts human rights principles. Maximizing aggregate utility along the lines of cardinal utility is a theoretical exercise that consists in ensuring as much as possible that gains outweigh losses in society. In this process an individual may get worse off, though. While mainstream logic is comfortable with this, human rights logic is not. Indeed, universality is a key principle in human rights, and for this reason the maximization exercise should aim at satisfying all citizens’ requests when goods and services necessary to secure human rights are at stake. The ordinal version of utilitarianism, in turn, contradicts the human rights principle of indivisibility. Contrary to ordinal utility theory, which ranks baskets of goods, human rights do not accept ranking when these goods and services are necessary to secure human rights. Second, by relying preferably on market logic to allocate goods and services, mainstream economics contradicts human rights because the intermediation of money prices and the purpose of profit may cause exclusion, thus compromising the principle of universality. Finally, mainstream economics sees human rights mainly as constraints on the development of its logic. According to this view, securing human rights would be considered a cost weighing on economic efficiency and, therefore, something to be minimized. Fully realizing human rights therefore requires a different approach. This paper discusses a human rights-based political economy.
This political economy, among other characteristics, should give up mainstream economics' narrow utilitarian approach, give up its belief that market logic should guide all exchanges of goods and services between human beings, and finally give up its view of human rights as constraints on rational choice and consequently on good economic performance. Giving up the narrow utilitarian approach means, first, embracing procedural utility and human rights-aimed consequentialism. Second, a more radical break can be imagined: non-utilitarian, or even anti-utilitarian, approaches may emerge as alternatives, though these two standpoints are not necessarily mutually exclusive. Giving up market exclusivity means embracing decommodification; more specifically, an approach that takes into consideration the value produced outside the market and an allocation process no longer necessarily centered on money prices. Giving up the view of human rights as constraints means, finally, considering human rights as an expression of wellbeing and a manifestation of choice. This means, in turn, an approach that uses indicators of economic performance other than growth at the macro level and profit at the micro level, because what we measure affects what we do.

Keywords: economic and social rights, political economy, economic theory, markets

Procedia PDF Downloads 130
3572 Simulation of Utility Accrual Scheduling and Recovery Algorithm in Multiprocessor Environment

Authors: A. Idawaty, O. Mohamed, A. Z. Zuriati

Abstract:

This paper presents the development of an event-based Discrete Event Simulation (DES) for a recovery algorithm known as Backward Recovery Global Preemptive Utility Accrual Scheduling (BR_GPUAS). This algorithm implements the Backward Recovery (BR) mechanism as a fault recovery solution under the existing Time/Utility Function/Utility Accrual (TUF/UA) scheduling domain for a multiprocessor environment. The BR mechanism attempts to take faulty tasks back to their initial safe state and then re-executes the affected section of the faulty tasks to enable recovery. Considering that faults may occur in the components of any system, a fault tolerance mechanism that can nullify their erroneous effects is necessary. Current TUF/UA scheduling algorithms use the abortion recovery mechanism and simply abort the erroneous task as their fault recovery solution. No existing algorithm in the TUF/UA multiprocessor scheduling domain has considered transient faults and implemented the BR mechanism as a fault recovery mechanism to nullify the erroneous effects and solve the recovery problem in this domain. The developed BR_GPUAS simulator derives its set of parameters, events, and performance metrics from a detailed analysis of the base model. Simulation results reveal that the BR_GPUAS algorithm can save almost 20-30% of the accumulated utilities, making it reliable and efficient for real-time applications in the multiprocessor scheduling environment.
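A toy illustration of why backward recovery can accrue more utility than abortion recovery under a time/utility function. The task set, fault pattern, and linearly decreasing TUF below are invented for this sketch and are not the simulator's actual parameters:

```python
# Toy comparison of abortion recovery vs backward recovery (BR) under a
# Time/Utility Function: a task completing at time t accrues utility
# max(0, deadline - t). Single processor, tasks run in order.
tasks = [
    # (execution_time, fault_occurs, affected_section_time, deadline)
    (4.0, True, 1.5, 20.0),
    (3.0, False, 0.0, 20.0),
    (5.0, True, 2.0, 20.0),
]

def simulate(recovery):
    clock, accrued = 0.0, 0.0
    for exec_time, faulty, section, deadline in tasks:
        if faulty and recovery == "abort":
            clock += exec_time      # work is done, then discarded: zero utility
            continue
        clock += exec_time
        if faulty:                  # BR: re-execute only the affected section
            clock += section
        accrued += max(0.0, deadline - clock)
    return accrued

u_abort = simulate("abort")
u_br = simulate("backward")
```

Backward recovery pays a re-execution cost on the shared clock, yet still accrues utility for the recovered tasks, which abortion forfeits entirely.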

Keywords: real-time system (RTS), time utility function/ utility accrual (TUF/UA) scheduling, backward recovery mechanism, multiprocessor, discrete event simulation (DES)

Procedia PDF Downloads 283
3571 A Multicriteria Mathematical Programming Model for Farm Planning in Greece

Authors: Basil Manos, Parthena Chatzinikolaou, Fedra Kiomourtzi

Abstract:

This paper presents a Multicriteria Mathematical Programming model for farm planning and sustainable optimization of agricultural production. The model can be used as a tool for the analysis and simulation of agricultural production plans, as well as for the study of the impacts of various measures of the Common Agricultural Policy in the member states of the European Union. The model can achieve the optimum production plan of a farm or an agricultural region by combining in one utility function different conflicting criteria, such as the maximization of gross margin and the minimization of fertilizer use, under a set of constraints on land, labor, available capital, the Common Agricultural Policy, etc. The proposed model was applied to the region of Larisa in central Greece. The optimum production plan achieves a greater gross return, lower fertilizer use, and lower irrigation water use than the existing production plan.
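The weighted-sum utility formulation can be sketched as a small linear program. The crop margins, fertilizer rates, criterion weights, and resource constraints below are illustrative assumptions, not the Larisa data:

```python
from scipy.optimize import linprog

# Hypothetical two-crop plan: maximize gross margin while penalizing
# fertilizer use, combined in one weighted utility (weights sum to 1).
margin = [350.0, 500.0]      # gross margin per hectare for crop 1, crop 2
fertil = [120.0, 300.0]      # fertilizer use per hectare
w_margin, w_fert = 0.7, 0.3

# linprog minimizes, so negate the per-hectare utility coefficients.
c = [-(w_margin * m - w_fert * f) for m, f in zip(margin, fertil)]
res = linprog(
    c,
    A_ub=[[1.0, 1.0],        # land: at most 100 ha
          [2.0, 5.0]],       # labor: at most 320 labor-days
    b_ub=[100.0, 320.0],
    bounds=[(0.0, None), (0.0, None)],
)
plan = res.x                  # hectares allocated to each crop
```

Shifting the weights traces out different compromises between gross margin and fertilizer use, which is how such a model simulates policy scenarios.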

Keywords: sustainable optimization, multicriteria analysis, agricultural production, farm planning

Procedia PDF Downloads 578
3570 Water Supply and Utility Management to Address Urban Sanitation Issues

Authors: Akshaya P., Priyanjali Prabhkaran

Abstract:

The paper examines the formulation of strategies to develop a comprehensive model of city-level water utility management that addresses urban sanitation issues. Water is a prime life-sustaining natural resource and nature’s gift to all living beings on the earth, and multiple urban sanitation issues are tied to the supply of water in a city. Many of these issues are linked to population expansion and economic inequity. Increased water usage and development have caused water scarcity, and a lack of water supply increases the chance of unhygienic conditions in cities. In this study, urban sanitation issues are identified with respect to water supply and utility management, and cities are compared based on their best practices and initiatives. From these best practices and initiatives, suitable sustainable measures are identified to address water supply issues at the city level. The paper concludes with a list of provisions that should be considered as suitable measures for water supply and utility management at the city level to address urban sanitation issues.

Keywords: water, benchmarking water supply, water supply networks, water supply management

Procedia PDF Downloads 79
3569 A Multi-Attribute Utility Model for Performance Evaluation of Sustainable Banking

Authors: Sonia Rebai, Mohamed Naceur Azaiez, Dhafer Saidane

Abstract:

In this study, we develop a performance evaluation model based on a multi-attribute utility approach aimed at reaching sustainable banking (SB) status. The model is built to account for various bank stakeholders in a win-win paradigm. In addition, it offers the opportunity to adopt a global measure of performance as an indication of a bank’s degree of sustainability. This measure is referred to as the banking sustainability performance index (BSPI). This index may constitute a basis for ranking banks. Moreover, it may constitute a bridge between the assessment types of financial and extra-financial rating agencies. A real application is performed on three French banks.

Keywords: multi-attribute utility theory, performance, sustainable banking, financial rating

Procedia PDF Downloads 439
3568 Reworking of the Anomalies in the Discounted Utility Model as a Combination of Cognitive Bias and Decrease in Impatience: Decision Making in Relation to Bounded Rationality and Emotional Factors in Intertemporal Choices

Authors: Roberta Martino, Viviana Ventre

Abstract:

Every day we face choices whose consequences are deferred in time. These are intertemporal choices, and they play an important role in the social, economic, and financial world. The Discounted Utility Model is the mathematical model of reference for calculating the utility of intertemporal prospects. The discount rate is the main element of the model, as it describes how the individual perceives the indeterminacy of subsequent periods. Empirical evidence has shown a discrepancy between the behavior predicted by the model and the actual choices made by decision makers. In particular, the term temporal inconsistency denotes choices that do not remain optimal with the passage of time. This phenomenon has been described with hyperbolic models of the discount rate, which, unlike the linear or exponential nature assumed by the discounted utility model, is not constant over time. This paper explores the problem of inconsistency by tracing the decision-making process through the concept of impatience. The degree of impatience and the degree of decrease of impatience are two parameters that allow one to quantify the weight of emotional factors and cognitive limitations during the evaluation and selection of alternatives. Although the theory assumes perfectly rational decision makers, behavioral finance and cognitive psychology have made it possible to understand that cognitive distortions and emotional influences have an inevitable impact on the decision-making process. The degree to which impatience decreases is the focus of the first part of the study. By comparing preferences that are consistent and inconsistent over time, it was possible to verify that some anomalies in the discounted utility model result from the combination of cognitive biases and emotional factors.
In particular: the delay effect and the interval effect are compared through the concept of misperception of time; starting from psychological considerations, a criterion is proposed to identify the causes of the magnitude effect that considers the differences in outcomes rather than their ratio; and the sign effect is analyzed by integrating the loss-aversion insights of Prospect Theory into the evaluation of prospects with negative outcomes. An experiment confirms three findings: the greatest variation in the degree of decrease in impatience corresponds to shorter intervals close to the present; the greatest variation in the degree of impatience occurs for outcomes of lower magnitude; and the variation in the degree of impatience is greatest for negative outcomes. The experimental phase was implemented by constructing the hyperbolic factor through the administration of questionnaires designed for each anomaly. This work formalizes the underlying causes of the discrepancy between the discounted utility model and the empirical evidence of preference reversal.
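The preference reversal at the heart of temporal inconsistency can be reproduced with a hyperbolic discount function. The amounts, delays, and discount parameters below are illustrative, not values from the study:

```python
import math

def exp_value(amount, delay, r=0.05):
    """Exponential (DU-model) discounting: constant rate, time-consistent."""
    return amount * math.exp(-r * delay)

def hyp_value(amount, delay, k=0.5):
    """Hyperbolic discounting: the implied discount rate falls with delay."""
    return amount / (1 + k * delay)

ss = (100.0, 0.0)    # smaller-sooner reward (amount, delay in months)
ll = (110.0, 1.0)    # larger-later reward

# Preference today vs. the same pair pushed 12 months into the future:
# the hyperbolic agent reverses, choosing SS now but LL from a distance.
now_prefers_ss = hyp_value(*ss) > hyp_value(*ll)
later_prefers_ll = hyp_value(100.0, 12.0) < hyp_value(110.0, 13.0)

# Exponential discounting never reverses: the value ratio is delay-invariant.
exp_ratio_now = exp_value(*ss) / exp_value(*ll)
exp_ratio_later = exp_value(100.0, 12.0) / exp_value(110.0, 13.0)
```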

Keywords: decreasing impatience, discount utility model, hyperbolic discount, hyperbolic factor, impatience

Procedia PDF Downloads 82
3567 Utilities as Creditors: The Effect of Enforcement of Water Bill Payment in Zambia

Authors: Elizabeth Spink

Abstract:

Providing safe and affordable drinking water to low-income households in developing countries remains a challenge. Policy goals of increasing household piped-water access and cost recovery for utility providers are often at odds. Nonpayment of utility bills is frequently cited as a constraint to improving the quality of utility service. However, nonpayment is widely tolerated, and households often accumulate significant debt to the utility provider. This study examines the effect of enforcement of water bill payment through supply disconnections in Livingstone, Zambia. This research uses a dynamic model of household monthly payments and accumulation of arrears, which determine the probability of disconnection, and simulates the effect of exogenous changes in enforcement levels. This model is empirically tested using an event-study framework of exogenous increases in enforcement capacity that occur during administrative rezoning events, which reduce the number of households that one enforcement agent is responsible for. The results show that households are five percentage points more likely to make a payment in the months following a rezoning event, but disconnections for low-income households increase as well, resulting in little change in revenue collected by the water utility. The results suggest that high enforcement of water bill payments toward credit-constrained households may be ineffective and lead to reduced piped-water access.

Keywords: enforcement, nonpayment, piped-water access, water utilities

Procedia PDF Downloads 205
3566 Construction Contractor Pre-Qualification Using Multi-Attribute Utility Theory: A Multiplicative Approach

Authors: B. Vikram, Y. Anu Leena, Y. Anu Neena, M. V. Krishna Rao, V. S. S. Kumar

Abstract:

The construction industry is often criticized for inefficiencies in outcomes such as time and cost overruns, low productivity, poor quality, and inadequate customer satisfaction. To enhance the chances for a construction project to be successful, selecting an able contractor is one of the fundamental decisions to be made by clients. The selection of the most appropriate contractor is a multi-criteria decision making (MCDM) process. In this paper, multi-attribute utility theory (MAUT) is employed, utilizing the multiplicative form of the utility function, to rank prequalified contractors. Performance assessment criteria covering contracting company attributes, experience record, past performance, performance potential, financial stability, and project-specific criteria are considered for contractor evaluation. A case study of a multistoried building, for which four contractors submitted bids, is considered to illustrate the applicability of the multiplicative approach of MAUT to ranking prequalified contractors. The proposed MAUT decision-making methodology can also be employed in other decision-making situations.
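A sketch of ranking with the multiplicative form of the utility function. The attribute weights and contractor scores are hypothetical; the scaling constant K is obtained from the standard multiplicative-MAUT consistency condition 1 + K = Π(1 + K·k_i):

```python
def solve_scaling_constant(k, lo=1e-9, hi=100.0, tol=1e-10):
    """Nonzero root K of 1 + K = prod(1 + K*k_i), found by bisection.

    The bracket assumes sum(k) < 1, which puts the root at K > 0.
    """
    def f(K):
        prod = 1.0
        for ki in k:
            prod *= 1.0 + K * ki
        return prod - (1.0 + K)
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2.0

# Hypothetical attribute weights k_i (experience, past performance, finances)
# and per-contractor single-attribute utilities u_i in [0, 1].
k = [0.4, 0.3, 0.2]
K = solve_scaling_constant(k)

def multiplicative_utility(u, k, K):
    """Multiplicative MAUT aggregate: U = (prod(1 + K*k_i*u_i) - 1) / K."""
    prod = 1.0
    for ki, ui in zip(k, u):
        prod *= 1.0 + K * ki * ui
    return (prod - 1.0) / K

contractors = {"A": [0.9, 0.7, 0.8], "B": [0.6, 0.9, 0.7], "C": [0.8, 0.8, 0.5]}
ranking = sorted(contractors,
                 key=lambda c: multiplicative_utility(contractors[c], k, K),
                 reverse=True)
```

Unlike the additive form, the multiplicative form captures interaction among attributes; a very low score on one attribute drags the aggregate down more than a weighted sum would.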

Keywords: multi-attribute utility theory, construction industry, prequalification, contractor

Procedia PDF Downloads 414
3565 Review of Vehicle to Grid Applications in Recent Years

Authors: Afsane Amiri

Abstract:

Electric Vehicle (EV) technology is expected to take a major share of the light-vehicle market in the coming decades. Charging of EVs will put an extra burden on the distribution grid, and in some cases adjustments will need to be made. In this paper, a review of different plug-in and vehicle-to-grid (V2G) capable vehicles is given, along with their power electronics topologies. The economic implications of charging the vehicle or sending power back to the utility are described in brief.

Keywords: energy storage system, battery unit, cost, optimal sizing, plug-in electric vehicles (PEVs), smart grid

Procedia PDF Downloads 578
3564 Fast Tumor Extraction Method Based on Nl-Means Filter and Expectation Maximization

Authors: Sandabad Sara, Sayd Tahri Yassine, Hammouch Ahmed

Abstract:

The development of science has allowed computer scientists to touch medicine and bring aid to radiologists, as we present in this article. Our work focuses on the detection and localization of tumor areas in the human brain, performed completely automatically, without any human intervention. Given the huge volume of MRI scans to be processed per day, a radiologist can spend hours providing a tremendous effort; this burden becomes less heavy with the automation of this step. In this article we present an automatic and effective tumor detection method consisting of three steps: filtering the image using the NL-means filter; applying the expectation-maximization (EM) algorithm to retrieve the tumor mask from the brain MRI; and extracting the tumor area using the mask obtained in the second step. To prove the effectiveness of this method, multiple evaluation criteria are used so that we can compare our method to extraction methods frequently used in the literature.
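The EM step for a two-component intensity mixture can be sketched as follows, using synthetic 1D intensities in place of a denoised MRI slice (a real pipeline would run NL-means first and operate on image voxels):

```python
import numpy as np

# Synthetic stand-in for denoised MRI intensities: a large dark background
# population plus a smaller bright "tumor" population.
rng = np.random.default_rng(0)
pixels = np.concatenate([rng.normal(0.2, 0.05, 900), rng.normal(0.8, 0.05, 100)])

# EM for a two-component 1D Gaussian mixture.
mu = np.array([0.1, 0.9])       # initial means
sigma = np.array([0.1, 0.1])    # initial standard deviations
pi = np.array([0.5, 0.5])       # initial mixing weights
for _ in range(50):
    # E-step: posterior responsibility of each component for each pixel.
    dens = (pi * np.exp(-0.5 * ((pixels[:, None] - mu) / sigma) ** 2)
            / (sigma * np.sqrt(2 * np.pi)))
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means, and variances.
    nk = resp.sum(axis=0)
    pi = nk / len(pixels)
    mu = (resp * pixels[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((resp * (pixels[:, None] - mu) ** 2).sum(axis=0) / nk)

bright = int(np.argmax(mu))          # tumor assumed to be the brighter component
mask = resp[:, bright] > 0.5         # the tumor mask used for extraction
```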

Keywords: MRI, Em algorithm, brain, tumor, Nl-means

Procedia PDF Downloads 303
3563 Enhance Concurrent Design Approach through a Design Methodology Based on an Artificial Intelligence Framework: Guiding Group Decision Making to Balanced Preliminary Design Solution

Authors: Loris Franchi, Daniele Calvi, Sabrina Corpino

Abstract:

This paper presents a design methodology in which stakeholders are assisted in exploring a so-called negotiation space, aiming at the maximization of both group social welfare and each stakeholder’s perceived utility. The outcome is fewer design iterations needed for design convergence together with higher solution effectiveness. During the early stage of a space project, not only the knowledge about the system but also the decision outcomes are often unknown. The scenario is exacerbated by the fact that decisions taken at this stage imply delayed costs. Hence, a clear definition of the problem under analysis is necessary, especially in the initial definition. This can be obtained through a robust generation and exploration of design alternatives. This process must consider that design usually involves various individuals who take decisions affecting one another. Effective coordination among these decision-makers is critical, and finding a mutually agreeable solution will reduce the iterations involved in the design process. To handle this scenario, the paper proposes a design methodology that aims to speed up the maturation of the mission concept. This speed-up is obtained through a guided exploration of the negotiation space, which involves autonomous exploration and optimization of trade opportunities among stakeholders via artificial intelligence algorithms. The negotiation space is generated via a multidisciplinary collaborative optimization method infused with game theory and multi-attribute utility theory. In particular, game theory models the negotiation process to reach equilibria among stakeholder needs. Because of the huge dimension of the negotiation space, a collaborative optimization framework with an evolutionary algorithm has been integrated in order to guide the game process to search efficiently and rapidly for the Pareto equilibria among stakeholders.
Finally, the concept of utility constitutes the mechanism that bridges the language barrier between experts of different backgrounds and differing needs, using the elicited and modeled needs to evaluate a multitude of alternatives. To highlight the benefits of the proposed methodology, the paper presents the design of a CubeSat mission for the observation of the lunar radiation environment. The derived solution balances all stakeholders’ needs and guarantees the effectiveness of the selected mission concept thanks to its robustness to change. The benefits provided by the proposed design methodology are highlighted, and further developments are proposed.
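The Pareto filtering at the core of the negotiation-space search can be sketched as follows; the candidate designs and stakeholder utilities are hypothetical:

```python
# Hypothetical negotiation space: candidate mission designs scored by the
# perceived utility of three stakeholders (higher is better).
candidates = {
    "design-1": (0.9, 0.4, 0.5),
    "design-2": (0.6, 0.7, 0.6),
    "design-3": (0.5, 0.6, 0.5),   # dominated by design-2
    "design-4": (0.3, 0.9, 0.8),
}

def dominates(a, b):
    """a Pareto-dominates b: at least as good for every stakeholder,
    strictly better for at least one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

pareto = {
    name for name, u in candidates.items()
    if not any(dominates(v, u) for v in candidates.values())
}
```

In the full methodology, an evolutionary algorithm would generate and refine the candidate set instead of enumerating a fixed dictionary, but the dominance test that defines the Pareto set is the same.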

Keywords: concurrent engineering, artificial intelligence, negotiation in engineering design, multidisciplinary optimization

Procedia PDF Downloads 109
3562 Exploiting Non-Uniform Utility of Computing: A Case Study

Authors: Arnab Sarkar, Michael Huang, Chuang Ren, Jun Li

Abstract:

The increasing importance of computing in modern society has brought substantial growth in the demand for more computational power. In some problem domains, such as scientific simulations, available computational power still sets a limit on what can be practically explored in computation. For many types of code, there is non-uniformity in the utility of computation; that is, not every piece of computation contributes equally to the quality of the result. If this non-uniformity is understood well and exploited effectively, we can utilize available computing power much more effectively. In this paper, we discuss a case study of exploring such non-uniformity in a particle-in-cell simulation platform. We find both that significant non-uniformity exists and that it is generally straightforward to exploit. We show the potential of an order-of-magnitude effective performance gain while keeping comparable quality of output. We also discuss some challenges in both the practical application of the idea and the evaluation of its impact.

Keywords: approximate computing, landau damping, non uniform utility computing, particle-in-cell

Procedia PDF Downloads 232
3561 Solution of Insurance Pricing Model Giving Optimum Premium Level for Both Insured and Insurer by Game Theory

Authors: Betul Zehra Karagul

Abstract:

A game consists of the strategies available to each actor, the rules that govern which strategies the actors may choose, and the way the actors evaluate their knowledge and the utility of the resulting outcomes. Game theory examines human behavior (preferences) in strategic situations in which each actor takes into account the actions that others will make in response to his own moves. In a game with a finite number of players, a combination of strategies from which no player can gain by unilaterally changing his own strategy is called a Nash equilibrium. Insurance is a two-person game in which the insurer and the insured are the actors. Both sides act in favor of their own utility functions. The insured has to pay a premium to buy the insurance cover; the insured will want to pay a low premium, while the insurer wants to receive a high premium. In this study, the state of equilibrium for insurance pricing is examined in terms of the insurer and the insured using game theory.
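A discretized version of this two-person game can be solved by brute-force best-response checking. The premium grid, loss distribution, and the insured's risk premium are illustrative assumptions, not the paper's model:

```python
from itertools import product

# Hypothetical discretized insurance game: the insurer picks a premium level,
# the insured picks whether to buy cover. Payoffs are (insurer, insured).
premiums = [80, 100, 120]        # candidate premium levels
loss, prob = 500.0, 0.2          # insurable loss and its probability
risk_premium = 30.0              # insured's extra value placed on certainty

def payoff(premium, buy):
    if buy:
        # Insurer earns premium minus the expected claim; insured pays premium.
        return premium - prob * loss, -premium
    # Without cover the insured bears the risk, disvalued at its
    # certainty equivalent (expected loss plus the risk premium).
    return 0.0, -(prob * loss + risk_premium)

def pure_nash(premiums):
    """Enumerate profiles where each side is best-responding to the other."""
    eq = []
    for p, buy in product(premiums, [True, False]):
        best_insurer = all(payoff(p, buy)[0] >= payoff(q, buy)[0] for q in premiums)
        best_insured = payoff(p, buy)[1] >= payoff(p, not buy)[1]
        if best_insurer and best_insured:
            eq.append((p, buy))
    return eq
```

With these numbers the insured is willing to pay up to the certainty equivalent of 130, so the equilibrium premium is the highest grid level below it at which buying remains a best response.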

Keywords: game theory, insurance pricing, Nash equilibrium, utility function

Procedia PDF Downloads 332
3560 A Method of Manufacturing Low Cost Utility Robots and Vehicles

Authors: Gregory E. Ofili

Abstract:

Introduction and Objective: Climate change and a global economy mean farmers must adapt and gain access to affordable and reliable automation technologies. Key barriers include a lack of transportation, electricity, and internet service, coupled with costly enabling technologies and limited local subject matter expertise. Methodology/Approach: Resourcefulness is essential to mechanization on a farm. This runs contrary to the tech industry practice of planned obsolescence and disposal. One solution is plug-and-play hardware that allows farmers to assemble, repair, program, and service their own fleet of industrial machines. To that end, we developed a method of manufacturing low-cost utility robots, transport vehicles, and solar/wind energy-harvesting systems, all running on the open-source Robot Operating System (ROS). We demonstrate this technology by fabricating a utility robot and an all-terrain (4x4) utility vehicle constructed of aluminum trusses: it weighs just 40 pounds, can transport 200 pounds of cargo, and sells for less than $2,000. Conclusions & Policy Implications: Electricity, internet, and automation are essential for productivity and competitiveness. With planned obsolescence, the priorities of technology suppliers are not aligned with the farmer's realities. This patent-pending method of manufacturing low-cost industrial robots and electric vehicles has met its objective: to create low-cost machines that the farmer can assemble, program, and repair with basic hand tools.

Keywords: automation, robotics, utility robot, small-hold farm, robot operating system

Procedia PDF Downloads 46
3559 Facility Anomaly Detection with Gaussian Mixture Model

Authors: Sunghoon Park, Hank Kim, Jinwon An, Sungzoon Cho

Abstract:

Internet of Things allows one to collect data from facilities, which are then used to monitor them and even predict malfunctions in advance. Conventional quality-control methods focus on setting a normal range for a sensor value, defined by a lower control limit and an upper control limit, and declaring as an anomaly anything falling outside it. However, interactions among sensor values are ignored, leading to suboptimal performance. We propose a multivariate approach that takes many sensor values into account at the same time. In particular, a Gaussian Mixture Model is used, trained to maximize the likelihood using the Expectation-Maximization algorithm. The number of Gaussian component distributions is determined by the Bayesian Information Criterion. The negative log-likelihood is used as an anomaly score. The actual usage scenario goes as follows: for each instance of sensor values from a facility, an anomaly score is computed; if it is larger than a threshold, an alarm goes off and a human expert intervenes and checks the system. Real-world data from a building energy system were used to test the model.
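The scoring idea can be illustrated with a minimal 1-D sketch (synthetic data and a tiny hand-rolled EM loop, not the facility model itself): fit a two-component Gaussian mixture with EM, then use the negative log-likelihood as the anomaly score, so an observation far from every component scores high.

```python
import math, random

def gauss(x, mu, var):
    # Univariate normal density.
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def fit_gmm(data, k=2, iters=50):
    # Plain EM for a 1-D Gaussian mixture: the E-step computes component
    # responsibilities, the M-step re-estimates weights, means, and variances.
    w = [1.0 / k] * k
    mu = [min(data), max(data)]      # simple deterministic initialization
    var = [1.0] * k
    for _ in range(iters):
        resp = []
        for x in data:               # E-step
            p = [w[j] * gauss(x, mu[j], var[j]) for j in range(k)]
            s = sum(p)
            resp.append([pj / s for pj in p])
        for j in range(k):           # M-step
            nj = sum(r[j] for r in resp)
            w[j] = nj / len(data)
            mu[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            var[j] = sum(r[j] * (x - mu[j]) ** 2 for r, x in zip(resp, data)) / nj
            var[j] = max(var[j], 1e-6)  # guard against variance collapse
    return w, mu, var

def anomaly_score(x, w, mu, var):
    # Negative log-likelihood under the fitted mixture: higher = more anomalous.
    return -math.log(sum(w[j] * gauss(x, mu[j], var[j]) for j in range(len(w))))

# Two normal operating modes around 0 and 10, plus one far-off reading.
random.seed(1)
data = [random.gauss(0, 1) for _ in range(200)] + \
       [random.gauss(10, 1) for _ in range(200)]
w, mu, var = fit_gmm(data)
normal = anomaly_score(0.5, w, mu, var)
anomalous = anomaly_score(25.0, w, mu, var)
print(sorted(mu), normal, anomalous)
```

In practice one would fit mixtures with several values of k, keep the one with the lowest BIC as the abstract describes, and set the alarm threshold from the score distribution of known-normal data.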

Keywords: facility anomaly detection, Gaussian mixture model, anomaly score, expectation-maximization algorithm

Procedia PDF Downloads 247
3558 The Accuracy of Small Firms at Predicting Their Employment

Authors: Javad Nosratabadi

Abstract:

This paper investigates the difference between firms' actual and expected employment, along with the amount of loans they invested. In addition, it examines the relationship between the amount of loans received by firms and wages. Empirically, using causal-effect estimation and firm-level data from a province in Iran between 2004 and 2011, the results show that there is a range of loan amounts for which firms' expected employment meets their actual employment; for any other loan amount, there is a gap between firms' actual and expected employment. Furthermore, the results show a positive and significant relationship between the loan amount invested by firms and wages.

Keywords: expected employment, actual employment, wage, loan

Procedia PDF Downloads 130
3557 Expected Roles and Practical Roles of the University Council in the Perception of the Staff in Suan Sunandha Rajabhat University

Authors: Suwaree Yordchim, Rosjana Chandrasa, Toby Gibbs, Pornthip Ruangprach

Abstract:

This research aims to 1) study the actual and expected role performance of the University Council as viewed by personnel, and 2) compare the expected role performance of the University Council. The sample group consists of 295 personnel at Suan Sunandha Rajabhat University (303 of the 348 questionnaires distributed to different departments were returned). The research tools are questionnaires and structured interview forms. The questionnaire data were analyzed with a statistics program, and the interview data were analyzed by percentage and mean. The results revealed that: 1) overall, the actual role performance of the University Council as viewed by staff at Suan Sunandha Rajabhat University is at a medium level, while the expected role performance is high in all dimensions; 2) comparing the actual and expected role performance of the University Council as viewed by personnel, the viewpoints differed significantly at the 0.05 level in all dimensions.

Keywords: expected role, practical role, university council, personnel

Procedia PDF Downloads 400
3556 Software-Defined Networks in Utility Power Networks

Authors: Ava Salmanpour, Hanieh Saeedi, Payam Rouhi, Elahe Hamzeil, Shima Alimohammadi, Siamak Hossein Khalaj, Mohammad Asadian

Abstract:

Software-defined networking (SDN) is a network architecture designed to control a network centrally through software applications. This ability enables remote control of the whole network regardless of the underlying network technology. In this architecture, the network intelligence is separated from the physical infrastructure, meaning that the required network components can be implemented virtually as software applications. Today, power networks are characterized by high complexity, with a large number of intelligent devices processing both huge amounts of data and important information. Therefore, reliable and secure communication networks are required, and SDNs are a strong candidate to meet this need. In this paper, the capabilities and characteristics of SDNs are reviewed and several basic controllers are compared. The importance of using SDNs to increase efficiency and reliability in utility power networks is discussed, and SDN-based power networks are compared with traditional networks.
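The match-action separation at the heart of SDN can be sketched with a toy controller and switch (a conceptual illustration only, not a specific controller API or OpenFlow implementation; the policy rules are invented for the example): the switch merely looks packets up in its flow table, while all forwarding decisions live in the controller, which installs rules remotely.

```python
class Switch:
    """Dumb forwarding element: matches packets against installed rules."""
    def __init__(self, name):
        self.name = name
        self.flow_table = []  # (match_fn, action) pairs, in priority order

    def install_rule(self, match_fn, action):
        self.flow_table.append((match_fn, action))

    def handle(self, packet):
        for match_fn, action in self.flow_table:
            if match_fn(packet):
                return action
        return "send_to_controller"  # table miss: escalate to the control plane

class Controller:
    """Central control plane: forwarding logic is decided here and pushed
    down to switches as match-action rules, independent of the hardware."""
    def program(self, switch):
        # Hypothetical policy for a utility network: drop an untrusted
        # subnet, steer SCADA traffic (Modbus port 502) onto its own VLAN.
        switch.install_rule(lambda p: p["src"].startswith("10.1."), "drop")
        switch.install_rule(lambda p: p["port"] == 502, "forward:scada_vlan")
        switch.install_rule(lambda p: True, "forward:default")

sw = Switch("substation-1")
Controller().program(sw)
print(sw.handle({"src": "10.2.0.7", "port": 502}))   # forward:scada_vlan
print(sw.handle({"src": "10.1.0.9", "port": 502}))   # drop
```

Because the switch contains no policy of its own, changing network behavior across many substations reduces to reprogramming one controller, which is the operational advantage the abstract highlights.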

Keywords: software-defined network, SDNs, utility network, OpenFlow, communication, gas and electricity, controller

Procedia PDF Downloads 84
3555 Theta-Phase Gamma-Amplitude Coupling as a Neurophysiological Marker in Neuroleptic-Naive Schizophrenia

Authors: Jun Won Kim

Abstract:

Objective: Theta-phase gamma-amplitude coupling (TGC) was used as a novel evidence-based tool to reflect the dysfunctional cortico-thalamic interaction in patients with schizophrenia. However, to the best of our knowledge, no studies have reported the diagnostic utility of TGC in the resting-state electroencephalogram (EEG) of neuroleptic-naive patients with schizophrenia compared to healthy controls. Thus, the purpose of this EEG study was to understand the underlying mechanisms in patients with schizophrenia by comparing the resting-state TGC between the two groups and to evaluate its diagnostic utility. Method: The subjects included 90 patients with schizophrenia and 90 healthy controls. All patients were diagnosed with schizophrenia according to the criteria of the Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM-IV) by two independent psychiatrists using semi-structured clinical interviews. Because patients were either drug-naive (first episode) or had not been taking psychoactive drugs for one month before the study, the influence of medications could be excluded. Six frequency bands were defined for the spectral analyses: delta (1–4 Hz), theta (4–8 Hz), slow alpha (8–10 Hz), fast alpha (10–13.5 Hz), beta (13.5–30 Hz), and gamma (30–80 Hz). The spectral power of the EEG data was calculated with the fast Fourier transform using the 'spectrogram.m' function of the Signal Processing Toolbox in Matlab. An analysis of covariance (ANCOVA) was performed to compare the TGC results between the groups, adjusted using a Bonferroni correction (P < 0.05/19 = 0.0026). Receiver operating characteristic (ROC) analysis was conducted to examine the discriminating ability of the TGC data for schizophrenia diagnosis. Results: The patients with schizophrenia showed a significant increase in the resting-state TGC at all electrodes. The delta, theta, slow alpha, fast alpha, and beta powers showed low accuracies of 62.2%, 58.4%, 56.9%, 60.9%, and 59.0%, respectively, in discriminating the patients with schizophrenia from the healthy controls. The ROC analysis performed on the TGC data generated the most accurate result among the EEG measures, with an overall classification accuracy of 92.5%. Conclusion: As TGC includes phase, which carries information about neuronal interactions in the EEG recording, TGC is expected to be useful for understanding the mechanisms of the dysfunctional cortico-thalamic interaction in patients with schizophrenia. The resting-state TGC was increased in the patients with schizophrenia compared to the healthy controls and had a higher discriminating ability than the other parameters. These findings may be related to the compensatory hyper-arousal patterns of the dysfunctional default-mode network (DMN) in schizophrenia. Further research exploring the association between TGC and medical or psychiatric conditions that may confound EEG signals will help clarify the potential utility of TGC.
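A coupling measure of this kind can be sketched in a few lines (a generic phase-amplitude coupling estimate in the style of Canolty et al.'s modulation index, on synthetic signals; this is not the authors' exact pipeline): band-pass the signal around theta and gamma, take the analytic signal to obtain theta phase and gamma amplitude, and measure how strongly the amplitude depends on the phase.

```python
import numpy as np

def analytic(x):
    # Analytic signal via the frequency domain (equivalent to a Hilbert transform).
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:n // 2] = 2.0
    h[n // 2] = 1.0  # n is assumed even here
    return np.fft.ifft(X * h)

def bandpass(x, fs, lo, hi):
    # Crude FFT band-pass: zero every frequency bin outside [lo, hi] Hz.
    freqs = np.abs(np.fft.fftfreq(len(x), 1.0 / fs))
    X = np.fft.fft(x)
    X[(freqs < lo) | (freqs > hi)] = 0.0
    return np.real(np.fft.ifft(X))

def tgc_index(x, fs, theta=(4, 8), gamma=(30, 80)):
    # Modulation index: magnitude of the mean gamma-amplitude vector
    # rotated by the instantaneous theta phase.
    phase = np.angle(analytic(bandpass(x, fs, *theta)))
    amp = np.abs(analytic(bandpass(x, fs, *gamma)))
    return np.abs(np.mean(amp * np.exp(1j * phase)))

fs = 256
t = np.arange(0, 8, 1.0 / fs)
theta_wave = np.sin(2 * np.pi * 6 * t)
rng = np.random.default_rng(0)
# Coupled signal: 40 Hz gamma whose amplitude rises at a fixed theta phase.
coupled = theta_wave + (1 + theta_wave) * np.sin(2 * np.pi * 40 * t) \
          + 0.1 * rng.standard_normal(len(t))
# Uncoupled control: constant-amplitude gamma riding on the same theta.
uncoupled = theta_wave + np.sin(2 * np.pi * 40 * t) \
            + 0.1 * rng.standard_normal(len(t))
print(tgc_index(coupled, fs), tgc_index(uncoupled, fs))
```

The index is large only when gamma bursts lock to a preferred theta phase, which is why it captures cross-frequency interaction that band powers alone miss.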

Keywords: quantitative electroencephalography (QEEG), theta-phase gamma-amplitude coupling (TGC), schizophrenia, diagnostic utility

Procedia PDF Downloads 117
3554 Tolerance of Colonoscopy: Questioning Its Utility in the Elderly

Authors: Faizan Rathore, Naveed Sultan, Humphrey O. Connor

Abstract:

This study was carried out from January 2012 to December 2012 to assess current practice in Kerry General Hospital against the age-related indicators for colonoscopies. A total of 1474 colonoscopies were performed: 1177 (79.9%) were diagnostic and 297 (20.1%) were therapeutic. Patients were divided into four age groups: under 75, 75-80, 81-85, and 86+. The trend analysis revealed an increase in diagnostic colonoscopies and a decrease in therapeutic colonoscopies with age. 664 (45.04%) of colonoscopies were reported as normal, the most common finding, and 1330 (90.2%) of colonoscopies occurred without any complications. The most frequent complication was patient discomfort, present in 112 (7.6%) of patients; the least frequent was urticaria around the IV site, present in 1 (0.1%) of cases. Patient discomfort was more common in younger patients: 98 cases aged under 75, 11 cases aged 75-80, 2 cases aged 81-85, and 1 case aged 86 or over. Poor tolerance was found in 14 (1.1%) of patients under 75, 1 (0.8%) of patients aged 75-80, 1 (1.7%) of patients aged 81-85, and none (0%) aged over 86. This study establishes the safety of colonoscopy, its low rate of complications, and its better tolerance in the elderly; however, its utility, especially in the presence of other comorbidities in the elderly, is questionable.

Keywords: colonoscopy, elderly patients, utility, tolerance

Procedia PDF Downloads 407
3553 Evaluation and New Modeling Improvement of Water Quality

Authors: Sebahat Seker

Abstract:

Since drinking water quality and public health are closely connected, studies on drinking and domestic water are of vital importance. Ardahan Province is one of the provinces in the Northeast Anatolian Region, where animal husbandry and agriculture form the basis of the economy. The city mains water uses underground spring water as its source and is chlorinated and delivered to the city center by gravity. However, mains water is not available outside the central district of the city, and the majority of the people meet their drinking and utility water needs from wells they have drilled individually. Water is vital for all living things; under normal conditions, a healthy person consumes approximately 1.8-2 liters of water. The quality and use of potable water is therefore one of the most important issues for health, and the quality parameters of drinking and utility water have been established by the scientific community. Scientific studies on drinking water quality and its impact on public health are among the most active research topics worldwide. Although our country is surrounded by water on three sides, potable water resources are scarce, and in the Eastern Anatolia Region it is difficult for the public to access drinking and utility water due to difficult climatic and geographic conditions. In this study, samples will be taken at regular intervals from drinking and utility water at the designated stations, and water quality parameters will be determined. The fact that no such study has been carried out in the region before, together with the limited knowledge of the local people about water quality, makes the study important in terms of both originality and widespread effect.

Keywords: water quality, modelling, evaluation, Northeastern Anatolia

Procedia PDF Downloads 163
3552 Logistic Model Tree and Expectation-Maximization for Pollen Recognition and Grouping

Authors: Endrick Barnacin, Jean-Luc Henry, Jack Molinié, Jimmy Nagau, Hélène Delatte, Gérard Lebreton

Abstract:

Palynology is a field of interest for many disciplines, with multiple applications such as chronological dating, climatology, allergy treatment, and even honey characterization. Unfortunately, the analysis of a pollen slide is a complicated and time-consuming task that requires the intervention of experts in the field, who are becoming increasingly rare due to economic and social conditions, so the automation of this task is a necessity. Pollen slide analysis is mainly a visual process, as it is carried out with the naked eye. That is why a primary route to automating palynology is digital image processing, which has the lowest cost and relatively good accuracy in pollen retrieval. In this work, we propose a system combining recognition and grouping of pollen. It uses a Logistic Model Tree to classify pollen species already known to the system while detecting any unknown species; the unknown pollen grains are then divided into groups using a cluster-based approach. Good success rates were achieved for the recognition of known species, and the automated clustering appears to be a promising approach.
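The recognize-then-group pipeline can be illustrated with a tiny sketch on synthetic 2-D feature vectors (a nearest-centroid classifier with a distance threshold stands in for the Logistic Model Tree, and a basic 2-means loop stands in for the EM-style grouping; none of this is the authors' implementation): samples close to a known class centroid are labeled, everything else is flagged as unknown and clustered.

```python
import math, random

def nearest(sample, centroids):
    # Name of the known class whose centroid is closest to the sample.
    return min(centroids, key=lambda c: math.dist(sample, centroids[c]))

def classify(sample, centroids, threshold):
    # Reject-option classification: too far from every known class -> unknown.
    label = nearest(sample, centroids)
    if math.dist(sample, centroids[label]) > threshold:
        return "unknown"
    return label

def two_means(points, iters=20):
    # Minimal k-means with k=2, the hard-assignment cousin of EM clustering.
    c = [points[0], points[-1]]
    for _ in range(iters):
        groups = [[], []]
        for p in points:
            groups[0 if math.dist(p, c[0]) <= math.dist(p, c[1]) else 1].append(p)
        c = [tuple(sum(x) / len(g) for x in zip(*g)) if g else c[i]
             for i, g in enumerate(groups)]
    return groups

# Synthetic 'feature vectors': two known species plus two unseen ones.
random.seed(0)
def blob(cx, cy, n=30):
    return [(random.gauss(cx, 0.3), random.gauss(cy, 0.3)) for _ in range(n)]

known = {"species_a": (0.0, 0.0), "species_b": (5.0, 0.0)}
samples = blob(0, 0) + blob(5, 0) + blob(0, 6) + blob(5, 6)

labels = [classify(s, known, threshold=1.5) for s in samples]
unknowns = [s for s, l in zip(samples, labels) if l == "unknown"]
clusters = two_means(unknowns)
print(labels.count("unknown"), [len(g) for g in clusters])
```

The two unseen blobs are rejected by the threshold and then split cleanly by the clustering step, mirroring the abstract's combination of recognition of known species with grouping of the unknowns.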

Keywords: pollen recognition, logistic model tree, expectation-maximization, local binary pattern

Procedia PDF Downloads 157
3551 An Approach to Maximize the Influence Spread in the Social Networks

Authors: Gaye Ibrahima, Mendy Gervais, Seck Diaraf, Ouya Samuel

Abstract:

In this paper, we consider influence maximization in social networks, giving particular importance to the initial diffusers, called seeds. The goal is to efficiently find a subset of k elements in the social network that will begin and maximize the information diffusion process. We propose a new approach that preprocesses the social network before determining the seeds: it eliminates information feedback toward an element considered as a seed by extracting an acyclic spanning subnetwork. We first propose two versions (v1 and v2) of an algorithm called the SCG-algorithm (Spanning Connected Graph algorithm), which takes as input a connected social network, directed or not. We then propose a generalization, the SG-algorithm (Spanning Graph algorithm), which takes any graph as input. Both algorithms are effective, and each has polynomial complexity. To show the pertinence of our approach, two seed sets are determined, and the one given by our approach yields better results. The performance of this approach is clearly visible in the simulations carried out with the R software and the igraph package.
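The preprocessing idea, removing feedback by extracting an acyclic spanning subgraph before choosing seeds, can be sketched generically (a BFS spanning tree plus an out-degree seed heuristic on a made-up toy graph; this illustrates the concept, not the authors' SCG-algorithm):

```python
from collections import deque

def spanning_tree(adj, root):
    # Extract an acyclic spanning subgraph (a BFS tree) from a connected
    # graph given as an adjacency dict. Tree edges point away from the
    # root, so diffusion cannot feed back toward an ancestor.
    tree = {u: [] for u in adj}
    seen = {root}
    q = deque([root])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                tree[u].append(v)
                q.append(v)
    return tree

def top_k_seeds(tree, k):
    # Simple seed heuristic on the acyclic graph: highest out-degree first.
    return sorted(tree, key=lambda u: len(tree[u]), reverse=True)[:k]

# Small undirected network containing the cycle a-b-c-a.
adj = {
    "a": ["b", "c"],
    "b": ["a", "c", "d", "e"],
    "c": ["a", "b", "f"],
    "d": ["b"], "e": ["b"], "f": ["c"],
}
tree = spanning_tree(adj, "a")
print(tree)
print(top_k_seeds(tree, 2))
```

In the tree, the cycle a-b-c-a has been broken (the edge b-c is dropped), so any influence simulation on the spanning subgraph counts each activation once instead of recirculating it, which is the feedback-elimination effect the abstract describes.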

Keywords: acyclic spanning graph, centrality measures, information feedback, influence maximization, social network

Procedia PDF Downloads 217