Search results for: algorithmic pricing
239 Asset Pricing Puzzle and GDP-Growth: Pre and Post Covid-19 Pandemic Effect on Pakistan Stock Exchange
Authors: Mohammad Azam
Abstract:
This work empirically investigates Gross Domestic Product (GDP) growth as a mediating variable between various factors and portfolio returns, using a broad sample of 522 financial and non-financial firms listed on the Pakistan Stock Exchange between January 1993 and June 2022. The study employs structural equation modeling and ordinary least squares regression to determine the findings before and during the Covid-19 pandemic, a period that has not received due attention from researchers. The analysis reveals that the market and investment factors are redundant, whereas size and value show significant results, and GDP growth exerts a significant mediating effect over the whole time frame. For the pre-Covid-19 period, the results reveal that market, value, and investment are redundant, but size, profitability, and GDP growth are significant. During Covid-19, the statistics indicate that market and investment are redundant, size and GDP growth are highly significant, and value and profitability are moderately significant. The ordinary least squares regression shows that market and investment are statistically insignificant, whereas size is highly significant and value and profitability are marginally significant. Using the GDP-growth-augmented model, a slight increase in R-squared is observed. The size, value, and profitability factors are recommended to investors on the Pakistan Stock Exchange. Conclusively, in the Pakistani market, GDP growth exhibits a weak mediating effect between risk premia and portfolio returns.
Keywords: asset pricing puzzle, mediating role of GDP-growth, structural equation modeling, COVID-19 pandemic, Pakistan stock exchange
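The mediation logic the abstract describes (factor → GDP growth → returns) can be sketched with a Baron-Kenny-style OLS decomposition. All variable names and coefficients below are hypothetical illustrations on synthetic data, not the study's sample:

```python
import numpy as np

def ols(X, y):
    # OLS coefficients via least squares: beta = argmin ||X b - y||
    return np.linalg.lstsq(X, y, rcond=None)[0]

rng = np.random.default_rng(0)
n = 500
factor = rng.normal(size=n)                                    # e.g. a size factor
gdp = 0.5 * factor + rng.normal(scale=0.5, size=n)             # mediator
returns = 0.3 * factor + 0.4 * gdp + rng.normal(scale=0.5, size=n)

ones = np.ones(n)
total = ols(np.column_stack([ones, factor]), returns)[1]       # total effect
a = ols(np.column_stack([ones, factor]), gdp)[1]               # factor -> mediator
direct, b = ols(np.column_stack([ones, factor, gdp]), returns)[1:3]
indirect = a * b                                               # mediated portion
# For linear OLS on the same sample, total = direct + indirect holds exactly.
```

A non-trivial `indirect` component alongside a reduced `direct` coefficient is the pattern a significant mediator produces.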
Procedia PDF Downloads 73
238 The Hidden Role of Interest Rate Risks in Carry Trades
Authors: Jingwen Shi, Qi Wu
Abstract:
We study the role played by interest rate risk in carry trade returns in order to understand the forward premium puzzle. Our goal is to investigate to what extent the carry trade return is indeed compensation for risk taking and, more importantly, to reveal the nature of these risks. Using option data not only on exchange rates but also on interest rate swaps (swaptions), our first finding is that, besides the consensus currency risks, interest rate risks also contribute a non-negligible portion of the carry trade return. What strikes us is our second finding: large downside risks of future exchange rate movements are, in fact, priced significantly in the options market on interest rates. The role played by interest rate risk differs structurally from that of currency risk. There is a unique premium associated with interest rate risk, though seemingly small in size, which compensates for tail risks, the left tail to be precise. On the technical front, our study relies on accurately retrieving implied distributions from currency options and interest rate swaptions simultaneously, especially their tail components. For this purpose, our major modeling work is to build a new international asset pricing model with an orthogonal setup for pricing kernels and non-Gaussian dynamics, in order to capture three sets of option skew accurately and consistently across currency options and interest rate swaptions, domestic and foreign, within one model. Our results open a door for studying the forward premium anomaly through implied information from the interest rate derivative market.
Keywords: carry trade, forward premium anomaly, FX option, interest rate swaption, implied volatility skew, uncovered interest rate parity
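Retrieving an implied distribution from option prices, as the abstract's methodology requires, can be illustrated with the Breeden-Litzenberger relation: the risk-neutral density is the discounted second derivative of call prices in strike. This is a generic sketch using Black-Scholes prices as the input curve, not the authors' model:

```python
import numpy as np
from math import erf, exp, log, sqrt

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    # Black-Scholes European call, used here only to generate a smooth price curve.
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

S, T, r, sigma = 100.0, 1.0, 0.02, 0.25
K = np.linspace(20.0, 300.0, 2001)
C = np.array([bs_call(S, k, T, r, sigma) for k in K])

# Breeden-Litzenberger: f(K) = e^{rT} * d^2C/dK^2 is the implied density,
# whose left tail carries the downside-risk premium discussed above.
dK = K[1] - K[0]
density = np.exp(r * T) * np.gradient(np.gradient(C, dK), dK)
mass = float(np.sum(density) * dK)   # total probability recovered from prices
```

On real quotes the second derivative is taken on a smoothed/interpolated smile; the recovered `mass` close to 1 is the usual sanity check.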
Procedia PDF Downloads 445
237 Transparency of Algorithmic Decision-Making: Limits Posed by Intellectual Property Rights
Authors: Olga Kokoulina
Abstract:
Today, algorithms are assuming a leading role in various areas of decision-making. Prompted by a promise of increased economic efficiency and of fueling solutions to pressing societal challenges, algorithmic decision-making is often celebrated as an impartial and constructive substitute for human adjudication. But in the face of this implied objectivity and efficiency, the application of algorithms is also marred with mounting concerns about embedded biases, discrimination, and exclusion. In Europe, vigorous debates on the risks and adverse implications of algorithmic decision-making largely revolve around the potential of data protection laws to tackle some of the related issues. For example, one of the often-cited venues for mitigating the impact of potentially unfair decision-making practices is the so-called 'right to explanation'. In essence, this right is derived from the provisions of the General Data Protection Regulation (‘GDPR’) ensuring data subjects' right of access and mandating the obligation of data controllers to provide relevant information about the existence of automated decision-making and meaningful information about the logic involved. Taking the corresponding rights and obligations in the context of the GDPR's specific provision on automated decision-making, the debates mainly focus on the efficacy and exact scope of the 'right to explanation'. The underlying logic of the argued remedy lies in a transparency imperative: allowing data subjects to acquire as much knowledge as possible about the decision-making process means empowering individuals to take control of their data and take action. In other words, forewarned is forearmed. The related discussions are ongoing, comprehensive, and often heated. However, they are also frequently misguided and isolated: embracing data protection law as the ultimate and sole lens is often not sufficient.
Mandating the disclosure of the technical specifications of employed algorithms in the name of transparency for, and empowerment of, data subjects potentially encroaches on the interests and rights of IPR holders, i.e., the business entities behind the algorithms. The study aims at pushing the boundaries of the transparency debate beyond the data protection regime. By systematically analysing legal requirements and current judicial practice, it assesses the limits posed to the transparency requirement and the right of access by intellectual property law, namely by copyright and trade secrets. It is asserted that trade secrets, in particular, present an often-insurmountable obstacle to realising the potential of the transparency requirement. In reaching that conclusion, the study explores the limits of the protection afforded by the European Trade Secrets Directive and contrasts them with the scope of the respective rights and obligations related to data access and portability enshrined in the GDPR. As shown, the far-reaching scope of protection under trade secrecy is evidenced both through the assessment of its subject matter and through the exceptions from such protection. As a way forward, the study scrutinises several possible legislative solutions, such as a flexible interpretation of the public interest exception in trade secrets law and the introduction of a strict liability regime in cases of non-transparent decision-making.
Keywords: algorithms, public interest, trade secrets, transparency
Procedia PDF Downloads 124
236 The Role of Artificial Intelligence in Patent Claim Interpretation: Legal Challenges and Opportunities
Authors: Mandeep Saini
Abstract:
The rapid advancement of Artificial Intelligence (AI) is transforming various fields, including intellectual property law. This paper explores the emerging role of AI in interpreting patent claims, a critical and highly specialized area within intellectual property rights. Patent claims define the scope of legal protection granted to an invention, and their precise interpretation is crucial in determining the boundaries of the patent holder's rights. Traditionally, this interpretation has relied heavily on the expertise of patent examiners, legal professionals, and judges. However, the increasing complexity of modern inventions, especially in fields like biotechnology, software, and electronics, poses significant challenges to human interpretation. Introducing AI into patent claim interpretation raises several legal and ethical concerns. This paper addresses critical issues such as the reliability of AI-driven interpretations, the potential for algorithmic bias, and the lack of transparency in AI decision-making processes. It considers the legal implications of relying on AI, particularly regarding accountability for errors and the potential challenges to AI interpretations in court. The paper includes a comparative study of AI-driven patent claim interpretations versus human interpretations across different jurisdictions to provide a comprehensive analysis. This comparison highlights the variations in legal standards and practices, offering insights into how AI could impact the harmonization of international patent laws. The paper proposes policy recommendations for the responsible use of AI in patent law. It suggests legal frameworks that ensure AI tools complement, rather than replace, human expertise in patent claim interpretation. These recommendations aim to balance the benefits of AI with the need for maintaining trust, transparency, and fairness in the legal process. 
By addressing these critical issues, this research contributes to the ongoing discourse on integrating AI into the legal field, specifically within intellectual property rights. It provides a forward-looking perspective on how AI could reshape patent law, offering both opportunities for innovation and challenges that must be carefully managed to protect the integrity of the legal system.
Keywords: artificial intelligence (AI), patent claim interpretation, intellectual property rights, algorithmic bias, natural language processing, patent law harmonization, legal ethics
Procedia PDF Downloads 21
235 Embedded Hybrid Intuition: A Deep Learning and Fuzzy Logic Approach to Collective Creation and Computational Assisted Narratives
Authors: Roberto Cabezas H
Abstract:
The current work shows the methodology developed to create narrative lighting spaces for the multimedia performance piece 'cluster: the vanished paradise.' This empirical research explores unconventional roles for machines in subjective creative processes by delving into the semantics of data and machine intelligence algorithms in hybrid technological and creative contexts, expanding epistemic domains through human-machine cooperation. The creative process in the scenic and performing arts is guided mostly by intuition; from that idea, we developed an approach to embed collective intuition in computational creative systems by joining the properties of Generative Adversarial Networks (GANs) and fuzzy clustering in a semi-supervised data creation and analysis pipeline. The model uses GANs to learn from phenomenological data (data generated from experience with lighting scenography) and algorithmic design data (data augmented by procedural design methods); fuzzy clustering is then applied to the artificially created data from the GANs to define narrative transitions built on a membership index. This process allowed for the creation of simple and complex spaces with expressive capabilities, with position and light intensity as the parameters guiding the narrative. Hybridization comes not only from the human-machine symbiosis but also from the integration of different techniques in the implementation of the aided design system. Machine intelligence tools as proposed in this work are well suited to redefine collaborative creation by learning to express and expand a conglomerate of ideas and a wide range of opinions for the creation of sensory experiences.
We found GANs and fuzzy logic to be ideal tools for developing new computational models based on interaction, learning, emotion, and imagination that expand the traditional algorithmic model of computation.
Keywords: fuzzy clustering, generative adversarial networks, human-machine cooperation, hybrid collective data, multimedia performance
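The membership-index idea can be sketched with a plain fuzzy c-means loop. This is a generic NumPy illustration of the clustering step only, not the authors' pipeline; the two synthetic clusters standing in for "lighting states" (position, intensity) are invented for the example:

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, iters=100, seed=0):
    # Fuzzy c-means: soft memberships U (each row sums to 1) and cluster centers.
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = d ** (-2.0 / (m - 1.0))
        U /= U.sum(axis=1, keepdims=True)   # the membership index per data point
    return centers, U

# Two synthetic "lighting state" clusters: (position, intensity)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)), rng.normal(3.0, 0.3, (50, 2))])
centers, U = fuzzy_cmeans(X, c=2)
```

Points between the two centers receive intermediate memberships, which is what makes the index usable for gradual narrative transitions rather than hard scene cuts.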
Procedia PDF Downloads 142
234 Design Data Sorter Circuit Using Insertion Sorting Algorithm
Authors: Hoda Abugharsa
Abstract:
In this paper, we propose to design a sorter circuit using the insertion sorting algorithm. The circuit will be designed using the Algorithmic State Machine (ASM) method, which means converting the insertion sorting flowchart into an ASM chart. The ASM chart will then be used to design the sorter circuit and the control unit.
Keywords: insertion sorting algorithm, ASM chart, sorter circuit, state machine, control unit
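For reference, the insertion-sort control flow that the ASM chart would encode — an outer index, a compare/shift inner loop, and an insert step — looks like this in software form (a Python sketch, not the hardware description):

```python
def insertion_sort(values):
    # Each outer iteration inserts values[i] into the already-sorted prefix;
    # the inner loop's compare/shift steps are what map onto ASM chart states.
    a = list(values)
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]    # shift larger element one slot right
            j -= 1
        a[j + 1] = key         # insert the key at its position
    return a
```

In hardware terms, `i` and `j` become counters, the comparison becomes a datapath comparator, and each loop branch becomes a state transition in the control unit.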
Procedia PDF Downloads 445
233 The Strategic Gas Aggregator: A Key Legal Intervention in an Evolving Nigerian Natural Gas Sector
Authors: Olanrewaju Aladeitan, Obiageli Phina Anaghara-Uzor
Abstract:
Despite the abundance of natural gas deposits in Nigeria and the immense potential this presents for both domestic and export-oriented revenue, there exists an imbalance in the preference for export as against the development and optimal utilization of natural gas for the domestic industry. Considerable amounts of gas are still being wasted by flaring in the country to this day. Although the government has set in place initiatives to harness gas at the flare and thereby reduce volumes flared, gas producers would rather direct the gas produced to the export market, whereas gas apportioned to the domestic market is often marred by low domestic gas prices, which discourage producers. The exported fraction of gas production no doubt yields healthy revenues for the government and an encouraging return on investment for the gas producers, and for this reason export sales remain enticing and preferable to the domestic sale of gas. This export pull, if left unchecked, impacts negatively on the domestic market, which is in no position to match prices on international markets. The issue of gas pricing remains critical to the optimal development of the domestic gas industry, in that it forms the basis for producers' investment decisions on the allocation of their scarce resources and on which projects to channel their output into in order to maximize profit. In order to rebalance the domestic industry and streamline the market for gas, the Gas Aggregation Company of Nigeria, also known as the Strategic Aggregator, was proposed under the Nigerian Gas Master Plan of 2008 and then established pursuant to the National Gas Supply and Pricing Regulations of 2008 to implement the domestic gas supply obligation, which focuses on ramping up gas volumes for domestic utilization by mandatorily requiring each gas producer to dedicate a portion of its gas production to domestic utilization before having recourse to the export market.
The 2008 Regulations further stipulate penalties in the event of non-compliance. This study, in the main, assesses the adequacy of the legal framework for the Nigerian gas industry, given that the operational laws are structured more for oil than for gas; examines the legal basis for the Strategic Aggregator in the light of the Domestic Gas Supply and Pricing Policy 2008 and the National Domestic Gas Supply and Pricing Regulations 2008; and makes a case for a review of the pivotal role of the Aggregator in the Nigerian gas market. In undertaking this assessment, the doctrinal research methodology was adopted. Findings reveal the reawakening of the Federal Government to the immense potential of its gas industry as a critical sector of its economy and the need for a sustainable domestic natural gas market. A review of the ownership structure of the Aggregator to comprise a balanced mix of the Federal Government, gas producers, and other key stakeholders, in order to ensure the effective implementation of the domestic supply obligations, becomes all the more imperative.
Keywords: domestic supply obligations, natural gas, Nigerian gas sector, strategic gas aggregator
Procedia PDF Downloads 224
232 Efficient Corporate Image as a Strategy for Enhancing Profitability in Hotels
Authors: Lucila T. Magalong
Abstract:
The hotel industry has been using corporate image and reputation to maintain service quality, customer satisfaction, and customer loyalty, and to leverage itself against competitors and facilitate growth strategies. With increasing pressure to perform, hotels have even created hybrid service strategies to compete in niche markets across pricing and service-level parameters.
Keywords: corporate image, hotel industry, service quality, customer expectations
Procedia PDF Downloads 465
231 Capacity Oversizing for Infrastructure Sharing Synergies: A Game Theoretic Analysis
Authors: Robin Molinier
Abstract:
Industrial symbiosis (IS) relies on two basic modes of cooperation between organizations: infrastructure/service sharing and resource substitution (the use of waste materials, fatal energy, and recirculated utilities for production). The former consists in the intensification of use of an asset and thus requires comparing the incremental investment cost to be incurred with the stand-alone cost each potential participant would face to satisfy its own requirements. In order to investigate how such a cooperation mode can be implemented, we formulate a game-theoretic model integrating the grassroots investment decision and the ex-post access pricing problem. In the first period, two actors set cooperatively (resp. non-cooperatively) a level of common (resp. individual) infrastructure capacity oversizing to attract ex post a potential entrant with a plug-and-play offer (available capacity, tariff). The entrant's requirement is randomly distributed and known only after investments have taken place. Capacity cost exhibits a sub-additive property, so there is room for profitable overcapacity setting in the first period under conditions that we derive. The entrant's willingness to pay for access to the infrastructure is driven both by her stand-alone cost and by the complement cost to be incurred in case she chooses to access an infrastructure whose available capacity is lower than her requirement level. The expected complement cost function is thus derived, and we show that it is decreasing, convex, and shaped by the distribution function of the entrant's requirements. For both uniform and triangular distributions, the optimal capacity level is obtained in the cooperative setting, and equilibrium levels are determined in the non-cooperative case. Regarding the latter, we show that competition is deterred by the first-period investor with the highest requirement level.
Using the non-cooperative game outcomes, which give lower bounds for the profit-sharing problem in the cooperative one, we solve the whole game and describe situations supporting sharing agreements.
Keywords: capacity, cooperation, industrial symbiosis, pricing
Procedia PDF Downloads 440
230 Investigating Students’ Cognitive Processes in Solving Stoichiometric Problems and Its Implications for Teaching and Learning Chemistry
Authors: Allen A. Espinosa, Larkins A. Trinidad
Abstract:
The present study investigated collegiate students' problem-solving strategies and misconceptions in solving stoichiometric problems and subsequently formulated a teaching framework from the results. The study found that the most prominent strategies among students are the mole method and the proportionality method, both algorithmic by nature. Misconceptions were also noted, as some students rely on Avogadro's number in converting between moles. It is therefore suggested that the teaching of stoichiometry should not be confined to demonstration; students should be involved in the process of thinking of ways to solve the problem.
Keywords: stoichiometry, Avogadro's number, mole method, proportionality method
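The mole method the study identifies amounts to a chain of unit conversions: grams → moles via molar mass, a stoichiometric mole ratio from the balanced equation, then moles → grams. A small illustrative sketch (molar masses rounded):

```python
# Molar masses in g/mol (rounded values, for illustration only)
MOLAR_MASS = {"H2": 2.016, "O2": 32.00, "H2O": 18.015}

def mass_of_product(mass_reactant, reactant, product, mole_ratio):
    # mole_ratio = moles of product per mole of reactant, from the equation.
    # Note: Avogadro's number is NOT needed for gram-mole conversions --
    # relying on it here is the misconception the study observed.
    moles_reactant = mass_reactant / MOLAR_MASS[reactant]
    moles_product = moles_reactant * mole_ratio
    return moles_product * MOLAR_MASS[product]

# 2 H2 + O2 -> 2 H2O: 4.032 g H2 (2 mol) yields 2 mol H2O = 36.03 g
water = mass_of_product(4.032, "H2", "H2O", mole_ratio=2 / 2)
```

Making each conversion step explicit, rather than memorizing the chain, is the kind of reasoning the proposed teaching framework aims at.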
Procedia PDF Downloads 381
229 Performance of the Strong Stability Method in the Univariate Classical Risk Model
Authors: Safia Hocine, Zina Benouaret, Djamil Aïssani
Abstract:
In this paper, we study the performance of the strong stability method in the univariate classical risk model. We are interested in the stability bounds established using two approaches: the first is based on the strong stability method developed for general Markov chains; the second is based on the theory of regenerative processes. Adopting an algorithmic procedure, we study the performance of the stability method in the case of exponentially distributed claim amounts. After presenting the stability bounds numerically and graphically, we interpret and compare the results.
Keywords: Markov chain, regenerative process, risk model, ruin probability, strong stability
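The ruin probability under study can be approximated by directly simulating the classical (Cramér-Lundberg) risk model with exponential claims; the parameters below are illustrative, not the paper's:

```python
import numpy as np

def ruin_probability(u, lam, mean_claim, c, horizon, n_paths=5000, seed=1):
    # Surplus process R(t) = u + c*t - S(t): claims ~ Exp(mean_claim)
    # arrive as a Poisson(lam) process; estimate P(ruin before horizon).
    rng = np.random.default_rng(seed)
    ruined = 0
    for _ in range(n_paths):
        t, surplus = 0.0, float(u)
        while True:
            dt = rng.exponential(1.0 / lam)       # inter-arrival time
            t += dt
            if t >= horizon:
                break                              # survived the horizon
            surplus += c * dt - rng.exponential(mean_claim)
            if surplus < 0.0:
                ruined += 1
                break
    return ruined / n_paths

# With initial surplus u=0, claim rate 1, mean claim 1, premium rate 1.25,
# the known infinite-horizon ruin probability is lam*mean_claim/c = 0.8.
p = ruin_probability(u=0.0, lam=1.0, mean_claim=1.0, c=1.25, horizon=50.0)
```

Such a simulation gives a baseline against which analytic stability bounds like those in the paper can be compared numerically.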
Procedia PDF Downloads 324
228 Identification of Configuration Space Singularities with Local Real Algebraic Geometry
Authors: Marc Diesse, Hochschule Heilbronn
Abstract:
We address the question of identifying the configuration space singularities of linkages, i.e., points where the configuration space is not locally a submanifold of Euclidean space. Because the configuration space cannot be smoothly parameterized at such points, these singularities have a significantly negative impact on the kinematics of the linkage. It is known that Jacobian methods do not provide sufficient conditions for the existence of configuration space singularities. Herein, we present several additional algebraic criteria that do provide sufficient conditions. Further, we use those criteria to analyze certain classes of planar linkages. These examples also show how the presented criteria can be checked using algorithmic methods.
Keywords: linkages, configuration space singularities, real algebraic geometry, analytic geometry
Procedia PDF Downloads 148
227 Waad Bil Mourabaha Pricing
Authors: Dchieche Amina, Aboulaich Rajae
Abstract:
In this work, we model the Waad Bil Mourabaha contract. This Islamic contract provides the right to buy goods at a future date with a Mourabaha. A Waad is a promise of sale or purchase of goods, declared unilaterally. In spite of the divergence between some schools of Islamic law about the Waad, this contract allows us to study a sophisticated and interesting instrument, Waad Bil Mourabaha, that can be used for hedging. In order to price the Waad Bil Mourabaha contract, we use an adapted Black-Scholes model under Shariah-compliant assumptions.
Keywords: Islamic finance, Black-Scholes model, call option, risks, hedging
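A minimal sketch of the pricing idea: treating the Waad (the unilateral promise to buy at the Mourabaha price at maturity) as a European call under a Black-Scholes model. Inputs are illustrative, and the paper's Shariah-compliant adaptations are not reproduced here:

```python
from math import erf, exp, log, sqrt

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def waad_bil_mourabaha(S, K, T, r, sigma):
    # Black-Scholes call: the right (not obligation) to buy at price K at time T,
    # which mirrors the optionality embedded in the Waad.
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Illustrative inputs: asset at 100, promised Mourabaha price 105,
# six months to maturity, 3% benchmark rate, 20% volatility.
value = waad_bil_mourabaha(S=100.0, K=105.0, T=0.5, r=0.03, sigma=0.2)
```

In the adapted model, the discount rate and dynamics would be replaced by Shariah-compliant counterparts; the call-style payoff structure is what carries over.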
Procedia PDF Downloads 507
226 Co-integration for Soft Commodities with Non-Constant Volatility
Authors: E. Channol, O. Collet, N. Kostyuchyk, T. Mesbah, Quoc Hoang Long Nguyen
Abstract:
In this paper, a pricing model is proposed for co-integrated commodities, extending the Larsson model. Futures formulae have been derived, and tests have been performed with non-constant volatility. The model has been applied to energy commodities (gas, CO2, energy) and soft commodities (corn, wheat). Results show that non-constant volatility leads to more accurate short-term prices, which provides a better evaluation of value-at-risk and, more generally, improves risk management.
Keywords: co-integration, soft commodities, risk management, value-at-risk
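The co-integration premise can be sketched with the Engle-Granger two-step check: regress one price series on the other, then test whether the residuals mean-revert. This is a pure-NumPy illustration on synthetic "corn"/"wheat" series sharing a stochastic trend, not the paper's model or data:

```python
import numpy as np

def dickey_fuller_rho(y, x):
    # Step 1: long-run OLS relation y = a + b*x, keep the residuals e.
    X = np.column_stack([np.ones_like(x), x])
    a, b = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - a - b * x
    # Step 2: regress diff(e) on lagged e. rho near 0 means a unit root
    # (no co-integration); a clearly negative rho means mean reversion.
    de, lag = np.diff(e), e[:-1]
    return (lag @ de) / (lag @ lag)

rng = np.random.default_rng(0)
trend = np.cumsum(rng.normal(size=1000))            # shared stochastic trend
corn = trend + rng.normal(scale=0.5, size=1000)
wheat = 0.8 * trend + rng.normal(scale=0.5, size=1000)
rho = dickey_fuller_rho(corn, wheat)                # strongly negative here
```

A full test would compare the rho statistic against Engle-Granger critical values; the sketch only shows the mean-reversion signal that co-integrated pairs produce.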
Procedia PDF Downloads 547
225 The Value Relevance of Components of Other Comprehensive Income When Net Income Is Disaggregated
Authors: Taisier A. Zoubi, Feras Salama, Mahmud Hossain, Yass A. Alkafaji
Abstract:
The purpose of this study is to examine the equity pricing of other comprehensive income when earnings are disaggregated into several components. Our findings indicate that other comprehensive income better explains variation in stock returns when net income is reported in a disaggregated form. Additionally, we find that disaggregating both net income and other comprehensive income explains more of the variation in stock returns than the two summary components of comprehensive income. Our results survive a series of robustness checks.
Keywords: market valuation, other comprehensive income, value-relevance, incremental information content
Procedia PDF Downloads 301
224 Clubhouse: A Minor Rebellion against the Algorithmic Tyranny of the Majority
Authors: Vahid Asadzadeh, Amin Ataee
Abstract:
Since the advent of social media, there has been a wave of optimism among researchers and civic activists about the influence of virtual networks on the democratization process, which has gradually waned. One of the lesser-known concerns is how to increase the possibility of hearing the voices of different minorities. According to the theory of media logic, the media, using their technological capabilities, act as a structure through which events and ideas are interpreted. Social media, through the use of machine learning and algorithms, has formed a structure in which the voices of minorities and less popular topics are lost amid the commotion of trends. In fact, the recommender systems and algorithms used in social media are designed to promote trends and make popular content more popular, while content that belongs to minorities is constantly marginalized. As social networks gradually play a more active role in politics, the possibility of freely participating in the reproduction and reinterpretation of structures in general, and political structures in particular (as Laclau and Mouffe had in mind), can be considered a criterion for democracy in action. The point is that the media logic of virtual networks is shaped by the rule, and even the tyranny, of the majority, and this logic does not make it possible to design a self-founding and self-revolutionary model of democracy. In other words, today's social networks, though seemingly full of variety, are governed by the logic of homogeneity and do not allow for multiplicity as in immanent radical democracies (influenced by Gilles Deleuze). However, with the emergence and increasing popularity of Clubhouse as a new social medium, there seems to be a shift in the social media space: the diminishing role of algorithms and recommender systems as content delivery interfaces.
This has meant that in the Clubhouse, the voices of minorities are better heard, and the diversity of political tendencies manifests itself better. The purpose of this article is, first, to show how social networks serve the elimination of minorities in general, and second, to argue that the media logic of social networks must adapt to new interpretations of democracy that give more space to minorities and human rights. Finally, the article will show how the Clubhouse serves these new interpretations of democracy, at least in a minimal way. To achieve these goals, using a descriptive-analytical method, the relation between media logic and postmodern democracy will first be examined. The political economy of popularity in social media and its conflict with democracy will then be discussed. Finally, it will be explored how the Clubhouse provides a new horizon for the concepts embodied in radical democracy, a horizon that more effectively serves the rights of minorities and human rights in general.
Keywords: algorithmic tyranny, Clubhouse, minority rights, radical democracy, social media
Procedia PDF Downloads 145
223 Parallel Computing: Offloading Matrix Multiplication to GPU
Authors: Bharath R., Tharun Sai N., Bhuvan G.
Abstract:
This project focuses on developing a parallel computing method aimed at optimizing matrix multiplication through GPU acceleration. Addressing algorithmic challenges, GPU programming intricacies, and integration issues, the project aims to enhance efficiency and scalability. The methodology involves algorithm design, GPU programming, and optimization techniques. Future plans include advanced optimizations, extended functionality, and integration with high-level frameworks. User engagement is emphasized through user-friendly interfaces, open-source collaboration, and continuous refinement based on feedback. The project's impact extends to significantly improving matrix multiplication performance in scientific computing and machine learning applications.
Keywords: matrix multiplication, parallel processing, CUDA, performance boost, neural networks
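The core GPU optimization for matrix multiplication is tiling: partition the output into blocks so each thread block can reuse sub-tiles staged in fast shared memory. The decomposition itself can be sketched on the CPU with NumPy (a sketch of the blocking scheme in general, not this project's CUDA kernel):

```python
import numpy as np

def blocked_matmul(A, B, tile=32):
    # Accumulate C tile by tile -- the same partitioning a CUDA kernel
    # assigns to its grid of thread blocks, where each (A, B) tile pair
    # would be staged in shared memory on a real GPU.
    n, k = A.shape
    k2, m = B.shape
    assert k == k2, "inner dimensions must match"
    C = np.zeros((n, m), dtype=A.dtype)
    for i in range(0, n, tile):
        for j in range(0, m, tile):
            for p in range(0, k, tile):   # sweep the shared inner dimension
                C[i:i+tile, j:j+tile] += A[i:i+tile, p:p+tile] @ B[p:p+tile, j:j+tile]
    return C

rng = np.random.default_rng(0)
A, B = rng.normal(size=(96, 70)), rng.normal(size=(70, 48))
C = blocked_matmul(A, B)   # matches A @ B
```

The slicing handles ragged edge tiles automatically; on a GPU the same role is played by bounds checks inside the kernel.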
Procedia PDF Downloads 58
222 Pricing Techniques to Mitigate Recurring Congestion on Interstate Facilities Using Dynamic Feedback Assignment
Authors: Hatem Abou-Senna
Abstract:
Interstate 4 (I-4) is a primary east-west transportation corridor between the cities of Tampa and Daytona, serving commuter, commercial, and recreational traffic. I-4 is known to have severe recurring congestion during peak hours. The congestion spans about 11 miles in the evening peak period in the central corridor area, as I-4 is the only non-tolled limited-access facility connecting the Orlando Central Business District (CBD) and the tourist attractions area (Walt Disney World). Florida officials had been skeptical of tolling I-4 prior to the recent legislation, and the public, through the media, had been complaining about the excessive toll facilities in Central Florida. So, in search of a plausible mitigation of congestion on the I-4 corridor, this research evaluates the effectiveness of different toll pricing alternatives that might divert traffic from I-4 to the toll facilities during the peak period. The network is composed of two main diverging limited-access highways, the freeway (I-4) and the toll road SR 417, in addition to two east-west parallel toll roads, SR 408 and SR 528, intersecting the above-mentioned highways at both ends. I-4 and toll road SR 408 are the routes most frequently used by commuters. SR 417 is a relatively uncongested toll road that is 15 miles longer than I-4 and carries $5 in tolls, compared to no monetary cost on I-4 for the same trip. The results of the calibrated Orlando PARAMICS network showed that percentages of route diversion vary from one route to another and depend primarily on the travel cost between specific origin-destination (O-D) pairs. Most drivers going from Disney (O1) or Lake Buena Vista (O2) to Lake Mary (D1) were found to have a high propensity towards using I-4, even when eliminating tolls and/or providing real-time information.
However, a diversion from I-4 to SR 417 for these O-D pairs occurred only in the cases of an incident or lane closure on I-4, due to the increase in delay and travel costs, and when information was provided to travelers. Furthermore, drivers that diverted from I-4 to SR 417 and SR 528 did not gain significant travel-time savings. This was attributed to the limited extra capacity of the alternative routes in the peak period and the longer traveling distance. When the remaining origin-destination pairs were analyzed, average travel-time savings on I-4 ranged between 10 and 16%, amounting to 10 minutes at most, with a 10% increase in the network average speed. The propensity for diversion on the network increased significantly when eliminating tolls on SR 417 and SR 528 while doubling the tolls on SR 408, along with the incident and lane-closure scenarios on I-4 and with real-time information provided. The toll roads were found to be a viable alternative to I-4 for these specific O-D pairs, depending on the users' perception of the toll cost, which was reflected in their specific travel times. However, on the macroscopic level, it was concluded that route diversion through toll reduction or elimination on surrounding toll roads would have only a minimal impact on reducing I-4 congestion during the peak period.
Keywords: congestion pricing, dynamic feedback assignment, microsimulation, PARAMICS, route diversion
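The diversion behavior described above follows a generalized-cost comparison: a driver leaves I-4 for SR 417 only when the time saved is worth more than the toll at their value of time. A toy sketch with illustrative numbers (the study's calibrated values and the microsimulation's feedback loop are not reproduced):

```python
def generalized_cost(time_min, toll_usd, vot_usd_per_hr=20.0):
    # Convert travel time to dollars at the value of time (VOT), add the toll.
    return toll_usd + (time_min / 60.0) * vot_usd_per_hr

def diverts_to_sr417(i4_time_min, sr417_time_min, toll_usd=5.0):
    # Divert only if the tolled route's generalized cost is lower.
    return generalized_cost(sr417_time_min, toll_usd) < generalized_cost(i4_time_min, 0.0)

# Free-flow: SR 417 is longer and tolled, so the driver stays on I-4.
free_flow = diverts_to_sr417(i4_time_min=35, sr417_time_min=45)
# An incident adding 40 minutes of delay on I-4 tips the comparison.
incident = diverts_to_sr417(i4_time_min=75, sr417_time_min=45)
```

This is why diversion appeared mainly under incident and lane-closure scenarios: only then does the delay on I-4 outweigh the toll and the extra 15 miles.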
Procedia PDF Downloads 178221 Consumer Behavior and the Demand for Sustainable Buildings in an Emerging Market: The Example of Brazil
Authors: Vinícius L. L. Morrone, David Douek, Helder M. F. Pereira, Bernadete L. M. Grandolpho
Abstract:
This work aimed to identify the relationships between the level of consumer environmental awareness and the search for sustainable properties, as well as to understand the main sustainability structures considered by these consumers during the decision process. Additionally, the paper examined the influence that environmental awareness and financial status have on buyers' disposition to pay more for sustainable properties. To achieve these objectives, 318 questionnaires were answered electronically after being sent to the Green Building Brazil email list, as well as to the client lists of other real estate developers. Of all the questionnaires answered, 71 were discarded, leaving a total of 247 admitted questionnaires to be analyzed. The responses were evaluated based on the theory of consumer decision making, especially on the influence factors of this process. The data were processed with a PLS model using the R software. The results show that the level of consumer environmental awareness effectively affects consumers' willingness to acquire a sustainable property or, at least, a property with some environmentally friendly structures. Consumers' environmental awareness also positively impacts the importance consumers give to individual environmentally friendly structures. Also, as the value a consumer assigns to those individual structures rises, so does his willingness to buy a sustainable property. Additionally, the impact of consumers' environmental awareness and financial status on the willingness to pay more for a property with those attributes was analyzed. The results indicate that there was no relationship between consumers' environmental awareness and their willingness to pay more for a sustainable property. On the other hand, the financial status and the family income of the consumers showed a positive relation with the willingness to pay more for a sustainable property. 
This indicates that consumers with better financial conditions, who according to the analysis do not necessarily have greater environmental awareness, are those who are willing to pay more for a sustainable property. Thus, this study indicates that, even if environmental awareness positively impacts the demand for sustainable structures and properties, this impact is not reflected in prices, due to the price elasticity of consumption, especially among lower-income consumers. This paper adds to the literature by outlining the consumer decision process in the real estate market of emerging economies and by presenting some drivers for pricing decisions.Keywords: consumer behavior, environmental awareness, real estate pricing, sustainable buildings
Procedia PDF Downloads 190220 A Genetic Algorithm Approach for Multi Constraint Team Orienteering Problem with Time Windows
Authors: Uyanga Sukhbaatar, Ahmed Lbath, Mendamar Majig
Abstract:
The Orienteering Problem (OP) is the best-known starting point for modeling the tourist trip design problem. In order to meet tourists' interests and constraints, the OP has been extended in ways that make it increasingly complicated to solve. The Multi Constraint Team Orienteering Problem with Time Windows (MCTOPTW) is the latest extension of the OP, which differs from other extensions by including more extra associated constraints. The goal of the MCTOPTW is to maximize the tourist's satisfaction score while not violating any of these constraints. This paper presents a genetic algorithm approach to tackle the MCTOPTW. Our algorithm is tested on benchmark data from the literature, and the performance results are compared.Keywords: multi constraint team orienteering problem with time windows, genetic algorithm, tour planning system
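The genetic algorithm outline can be sketched for the plain orienteering problem (a single route, a time budget, and no time windows or extra constraints, so this is a much-simplified stand-in for the MCTOPTW solver; all instance data and GA parameters below are illustrative):

```python
import random

def fitness(order, scores, times, budget):
    """Total score collected by visiting nodes in order until the budget runs out."""
    total_score = total_time = 0.0
    for node in order:
        total_time += times[node]
        if total_time > budget:
            break
        total_score += scores[node]
    return total_score

def crossover(p1, p2):
    """Order crossover: keep a slice of p1, fill the remaining nodes in p2's order."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    middle = p1[a:b]
    rest = [n for n in p2 if n not in middle]
    return rest[:a] + middle + rest[a:]

def mutate(order, rate=0.2):
    """Swap two positions with the given probability."""
    order = order[:]
    if random.random() < rate:
        i, j = random.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]
    return order

def genetic_op(scores, times, budget, pop_size=40, generations=100):
    """Elitist GA over visit orders for a budget-limited orienteering tour."""
    nodes = list(range(len(scores)))
    pop = [random.sample(nodes, len(nodes)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda o: fitness(o, scores, times, budget), reverse=True)
        elite = pop[: pop_size // 2]
        children = [mutate(crossover(*random.sample(elite, 2))) for _ in elite]
        pop = elite + children
    return max(pop, key=lambda o: fitness(o, scores, times, budget))

random.seed(0)
scores = [5, 8, 3, 9, 6]   # satisfaction score per point of interest
times = [2, 3, 1, 4, 2]    # visiting time per point of interest
best = genetic_op(scores, times, budget=6)
best_score = fitness(best, scores, times, 6)
```

The MCTOPTW version would additionally reject offspring violating time windows and the extra constraints, and maintain one route per team member.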
Procedia PDF Downloads 626219 A Theory and Empirical Analysis on the Efficiency of Chinese Electricity Pricing
Authors: Jianlin Wang, Jiajia Zhao
Abstract:
This paper applies theory and empirical methods to examine the relationship between the electricity price and the coal price, as well as between electricity consumption and industry output, for China during January 1999 to December 2012. Our results indicate that there is no causality between the coal price and the electricity price once other factors are controlled for. However, we found a bi-directional causality between electricity consumption and industry output. Overall, the electricity price set by China's NDRC is inefficient, which led to the electricity supply shortage after 2004. It is time for China's reformers to reform the electricity pricing system.Keywords: electricity price, coal price, power supply, China
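Causality findings of this kind are commonly established with Granger tests; a minimal numpy-only version can be sketched as follows (the synthetic series and lag choice are illustrative assumptions, and this simplified F-test is not the authors' exact specification):

```python
import numpy as np

def granger_f(y, x, lags=2):
    """F-statistic testing whether lags of x improve a prediction of y
    beyond what lags of y alone achieve (Granger causality)."""
    n = len(y)
    Y = y[lags:]
    X_r = np.column_stack(
        [np.ones(n - lags)] + [y[lags - k : n - k] for k in range(1, lags + 1)])
    X_u = np.column_stack(
        [X_r] + [x[lags - k : n - k] for k in range(1, lags + 1)])
    def rss(X):
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        return np.sum((Y - X @ beta) ** 2)
    rss_r, rss_u = rss(X_r), rss(X_u)
    df1, df2 = lags, len(Y) - X_u.shape[1]
    return ((rss_r - rss_u) / df1) / (rss_u / df2)

# Synthetic example: x drives y with a one-period lag, but not vice versa.
rng = np.random.default_rng(0)
x = rng.standard_normal(300)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()
f_xy = granger_f(y, x)   # should be large: x Granger-causes y
f_yx = granger_f(x, y)   # should be small: y does not cause x
```

Here f_xy lands far above any conventional critical value, while f_yx stays near 1, the pattern a "no causality between coal and electricity prices" result would fail to show.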
Procedia PDF Downloads 468218 Prospective Museum Visitor Management Based on Prospect Theory: A Pragmatic Approach
Authors: Athina Thanou, Eirini Eleni Tsiropoulou, Symeon Papavassiliou
Abstract:
The problem of museum visitor experience and congestion management, in its various forms, has come increasingly under the spotlight over the last few years, since overcrowding can significantly decrease the quality of visitors' experience. Evidence suggests that on busy days the amount of time a visitor spends inside a crowded house museum can fall by up to 60% compared to a quiet mid-week day. In this paper we consider the aforementioned problem by treating museums as evolving social systems that induce constraints. However, in a cultural heritage space, as opposed to the majority of social environments, the momentum of the experience is primarily controlled by the visitor himself. Visitors typically behave selfishly regarding the maximization of their own Quality of Experience (QoE), commonly expressed through a utility function that takes several parameters into consideration, with crowd density and waiting/visiting time being among the key ones. In such a setting, congestion occurs either when the utility of one visitor decreases due to the behavior of other persons, or when the costs of undertaking an activity rise due to the presence of other persons. We initially investigate how visitors' behavioral risk attitudes, as captured and represented by prospect theory, affect their decisions in resource sharing settings, where visitors' decisions and experiences are strongly interdependent. Different from the majority of existing studies and literature, we highlight that visitors are not risk-neutral utility maximizers, but demonstrate risk-aware behavior according to their personal risk characteristics. 
In our work, exhibits are organized into two groups: a) “safe exhibits” that correspond to less congested ones, where the visitors receive guaranteed satisfaction in accordance with the visiting time invested, and b) common pool of resources (CPR) exhibits, which are the most popular exhibits with possibly increased congestion and an uncertain outcome in terms of visitor satisfaction. A key difference is that the visitor satisfaction due to CPR exhibits strongly depends not only on the invested time decision of a specific visitor, but also on that of the rest of the visitors. In the latter case, the over-investment in time, or equivalently the increased congestion, potentially leads to “exhibit failure”, meaning that the visitors gain no satisfaction from their observation of this exhibit due to high congestion. We present a framework where each visitor, in a distributed manner, determines his time investment in safe or CPR exhibits to optimize his QoE. Based on this framework, we analyze and evaluate how visitors, acting as prospect-theoretic decision-makers, respond and react to the various pricing policies imposed by the museum curators. Based on detailed evaluation results and experiments, we present interesting observations regarding the impact of several parameters and characteristics, such as visitor heterogeneity and the use of alternative pricing policies, on scalability, user satisfaction, museum capacity, resource fragility, and operating-point stability. Furthermore, we study and present the effectiveness of alternative pricing mechanisms, when used as implicit tools, to deal with the congestion management problem in museums and potentially decrease the exhibit failure probability (fragility), while considering the visitors' risk preferences.Keywords: museum resource and visitor management, congestion management, prospect theory, cyber physical social systems
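The risk-aware behavior that prospect theory predicts can be illustrated with the standard Tversky-Kahneman value function (the parameter values below are the classic literature estimates, not the paper's, and the payoff numbers are hypothetical):

```python
def prospect_value(outcome, alpha=0.88, beta=0.88, loss_aversion=2.25):
    """Tversky-Kahneman value function: concave for gains, convex and
    steeper for losses (loss aversion), relative to a reference point of 0."""
    if outcome >= 0:
        return outcome ** alpha
    return -loss_aversion * ((-outcome) ** beta)

# A visitor weighs a "safe exhibit" payoff against a gamble on a CPR exhibit:
# 60% chance of an enjoyable visit (+10) vs 40% chance of exhibit failure (-10).
gamble_value = 0.6 * prospect_value(10) + 0.4 * prospect_value(-10)
safe_value = prospect_value(2)   # guaranteed modest satisfaction
```

Although the gamble's raw expected payoff (0.6 * 10 - 0.4 * 10 = +2) equals the safe payoff, loss aversion pushes its prospect-theoretic value below the safe exhibit's, so the risk-aware visitor shifts time toward the safe exhibit, exactly the deviation from risk-neutral maximization the paper exploits.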
Procedia PDF Downloads 184217 Automated Recognition of Still’s Murmur in Children
Authors: Sukryool Kang, James McConnaughey, Robin Doroshow, Raj Shekhar
Abstract:
Still’s murmur, a vibratory heart murmur, is the most common normal innocent murmur of childhood. Many children with this murmur are unnecessarily referred for cardiology consultation and testing, which exacts a high financial and emotional cost on the patients and their parents. To date, pediatricians have not been successful at distinguishing Still’s murmur from the murmurs of true heart disease. In this paper, we present a new algorithmic approach to distinguish Still’s murmur from pathological murmurs in children. We propose two distinct features, spectral width and signal power, which describe the sharpness of the spectrum and the signal intensity of the murmur, respectively. Seventy pediatric heart sound recordings of 41 Still’s and 29 pathological murmurs were used to develop and evaluate our algorithm, which achieved a true positive rate of 97% and a false positive rate of 0%. This approach would meet clinical standards in recognizing Still’s murmur.Keywords: AR modeling, auscultation, heart murmurs, Still's murmur
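One plausible reading of the two features can be sketched as follows (the keywords suggest the paper uses an AR-model-based spectrum, so the FFT-based definitions, the sampling rate, and the synthetic signals here are all illustrative assumptions):

```python
import numpy as np

def murmur_features(signal, fs):
    """Spectral width (power-weighted standard deviation of frequency, in Hz)
    and mean signal power of a heart-sound segment."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    centroid = np.sum(freqs * spectrum) / np.sum(spectrum)
    width = np.sqrt(np.sum(((freqs - centroid) ** 2) * spectrum) / np.sum(spectrum))
    power = np.mean(signal ** 2)
    return width, power

fs = 2000                                     # Hz, an assumed auscultation rate
t = np.arange(0, 0.5, 1.0 / fs)
narrow = np.sin(2 * np.pi * 100 * t)          # tone-like, "vibratory" murmur
rng = np.random.default_rng(1)
broad = 0.5 * rng.standard_normal(len(t))     # harsh, broadband murmur
w_narrow, p_narrow = murmur_features(narrow, fs)
w_broad, p_broad = murmur_features(broad, fs)
```

A vibratory, tone-like murmur concentrates its energy and yields a small spectral width, while a harsh broadband murmur yields a large one, which is the separation the classifier relies on.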
Procedia PDF Downloads 368216 Competitiveness and Pricing Policy Assessment for Resilience Surface Access System at Airports
Authors: Dimitrios J. Dimitriou
Abstract:
Air transport is growing very fast worldwide, and many changes have taken place in the planning, management, and decision-making process. Given the complexity of airport operation, the best use of existing capacity is the key driver of efficiency and productivity. This paper deals with an evaluation framework for ground access at airports that uses a set of mode choice indicators to provide key messages about an airport's ground access performance. The application presents results for a sample of 12 European airports, illustrating recommendations to define policy and improve service for the air transport access chain.Keywords: airport ground access, air transport chain, airport access performance, airport policy
Procedia PDF Downloads 370215 Algorithmic Fault Location in Complex Gas Networks
Authors: Soban Najam, S. M. Jahanzeb, Ahmed Sohail, Faraz Idris Khan
Abstract:
With the recent increase in reliance on gas as a primary source of energy across the world, a lot of research has been conducted on gas distribution networks. As the complexity and size of these networks grow, so does the leakage of gas in the distribution network. One of the most crucial factors in the production and distribution of gas is UFG, or Unaccounted-for Gas. The presence of UFG signifies that there is a difference between the amount of gas distributed and the amount of gas billed. Our approach is to use information acquired from several specified points in the network. This information is used to calculate the loss occurring in the network using the developed algorithm. The algorithm can also identify leakage at any point in the pipeline, so faults can be detected and rectified with minimal time, effort, and resources.Keywords: FLA, fault location analysis, GDN, gas distribution network, GIS, geographic information system, NMS, network management system, OMS, outage management system, SSGC, Sui Southern Gas Company, UFG, unaccounted for gas
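The core idea, comparing metered flow at specified points to localize the loss, can be sketched as a segment-by-segment mass balance (the segment names, readings, units, and tolerance are hypothetical; a real GDN algorithm would also account for pressure, temperature, and metering error):

```python
def locate_leak(segments, tolerance=0.5):
    """Rank pipeline segments whose metered inflow exceeds metered outflow by
    more than the metering tolerance; each segment is (name, inflow, outflow)."""
    suspects = [(name, inflow - outflow)
                for name, inflow, outflow in segments
                if inflow - outflow > tolerance]
    return sorted(suspects, key=lambda s: s[1], reverse=True)

# Hypothetical meter readings (e.g. in MMSCFD) at specified network points
readings = [
    ("A-B", 120.0, 119.8),   # within metering tolerance
    ("B-C", 119.8, 116.2),   # 3.6 units lost: likely leak location
    ("C-D", 116.2, 115.9),
]
suspects = locate_leak(readings)
```

Summed over the whole network, the same imbalances are what shows up in the books as UFG; localizing them per segment is what turns a UFG figure into a fault location.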
Procedia PDF Downloads 626214 Data-Driven Market Segmentation in Hospitality Using Unsupervised Machine Learning
Authors: Rik van Leeuwen, Ger Koole
Abstract:
Within hospitality, marketing departments use segmentation to create tailored strategies that ensure personalized marketing. This study provides a data-driven approach by segmenting guest profiles via hierarchical clustering based on an extensive set of features. The industry requires understandable outcomes that contribute to adaptability, so that marketing departments can make data-driven decisions and ultimately drive profit. A marketing department specified a business question that guides the unsupervised machine learning algorithm. Features of guests change over time; therefore, there is a probability that guests transition from one segment to another. The purpose of the study is to provide the steps in the process from raw data to actionable insights, which serve as a guideline for how hospitality companies can adopt an algorithmic approach.Keywords: hierarchical cluster analysis, hospitality, market segmentation
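A minimal sketch of the clustering step follows (the synthetic guest features, Ward linkage, and the cut at k = 3 are illustrative assumptions; the study's actual feature set is far richer):

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(42)
# Synthetic guest profiles: [nights stayed, average daily spend, booking lead time]
budget   = rng.normal([2, 80, 10],  [0.5, 10, 3],  size=(50, 3))
family   = rng.normal([7, 150, 60], [1.0, 20, 10], size=(50, 3))
business = rng.normal([2, 300, 5],  [0.5, 30, 2],  size=(50, 3))
profiles = np.vstack([budget, family, business])

# Standardize features, build the Ward-linkage tree, and cut it at k = 3 segments
z = (profiles - profiles.mean(axis=0)) / profiles.std(axis=0)
labels = fcluster(linkage(z, method="ward"), t=3, criterion="maxclust")
```

The dendrogram built by `linkage` is what makes the outcome understandable to a marketing department: the cut height, and therefore the number of segments, can be chosen against the business question rather than fixed in advance.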
Procedia PDF Downloads 108213 A Game-Theory-Based Price-Optimization Algorithm for the Simulation of Markets Using Agent-Based Modelling
Authors: Juan Manuel Sanchez-Cartas, Gonzalo Leon
Abstract:
A price competition algorithm for agent-based models (ABMs) based on game theory principles is proposed to deal with the simulation of theoretical market models. The algorithm is applied to the classical Hotelling model and to a two-sided market model to show that it leads to the optimal behavior predicted by the theoretical models. However, when the theoretical models fail to predict the equilibrium, the algorithm is capable of reaching a feasible outcome. The results highlight that the algorithm can be implemented in other simulation models to guarantee rational users and endogenous optimal behaviors. Also, it can be applied as a verification tool, given that it is theoretically grounded.Keywords: agent-based models, algorithmic game theory, multi-sided markets, price optimization
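For the textbook Hotelling setup (two firms at the endpoints of a unit line, unit transport cost t, marginal cost c), iterated best responses converge to the known equilibrium p* = c + t. A sketch of that check, not the paper's own algorithm, which handles richer models:

```python
def best_response(p_rival, cost, t):
    """Profit-maximizing reply in Hotelling's linear city with firms at the
    endpoints: demand for firm 1 is 1/2 + (p2 - p1) / (2t), so the
    first-order condition gives p1 = (p2 + cost + t) / 2."""
    return (p_rival + cost + t) / 2.0

def iterate_prices(cost=1.0, t=2.0, p_start=10.0, steps=200):
    """Simultaneous best-response dynamics; a contraction with factor 1/2,
    so prices converge geometrically from any starting point."""
    p1 = p2 = p_start
    for _ in range(steps):
        p1, p2 = best_response(p2, cost, t), best_response(p1, cost, t)
    return p1, p2

p1, p2 = iterate_prices()   # should approach the equilibrium p* = cost + t = 3
```

This is the verification use the abstract alludes to: simulated ABM prices can be compared against such a closed-form equilibrium wherever theory supplies one.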
Procedia PDF Downloads 455212 Multi-Scaled Non-Local Means Filter for Medical Images Denoising: Empirical Mode Decomposition vs. Wavelet Transform
Authors: Hana Rabbouch
Abstract:
In recent years, there has been considerable growth in denoising techniques devoted mainly to medical imaging. This important evolution is due not only to the progress of computing techniques, but also to the emergence of multi-resolution analysis (MRA) on both mathematical and algorithmic bases. In this paper, a comparative study is conducted between the two best-known MRA-based decomposition techniques: the Empirical Mode Decomposition (EMD) and the Discrete Wavelet Transform (DWT). The comparison is carried out in a framework of multi-scale denoising, where a Non-Local Means (NLM) filter is applied scale by scale to a sample of benchmark medical images. The results demonstrate the effectiveness of multi-scale denoising, especially when the NLM filtering is coupled with the EMD.Keywords: medical imaging, non local means, denoising, multiscaled analysis, empirical mode decomposition, wavelets
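The scale-by-scale idea can be sketched in one dimension with a Haar DWT and a small non-local means filter (a numpy-only toy: the paper works on 2-D images and also compares against EMD, and every filter parameter here is illustrative):

```python
import numpy as np

def haar_decompose(x, levels):
    """1-D Haar DWT: returns the final approximation and per-level details."""
    details = []
    for _ in range(levels):
        details.append((x[0::2] - x[1::2]) / np.sqrt(2))
        x = (x[0::2] + x[1::2]) / np.sqrt(2)
    return x, details

def haar_reconstruct(approx, details):
    """Invert haar_decompose."""
    for detail in reversed(details):
        up = np.empty(2 * len(approx))
        up[0::2] = (approx + detail) / np.sqrt(2)
        up[1::2] = (approx - detail) / np.sqrt(2)
        approx = up
    return approx

def nlm_1d(x, patch=3, search=10, h=0.5):
    """Tiny 1-D non-local means: each sample is replaced by a similarity-
    weighted average of samples whose local neighborhoods look alike."""
    pad = np.pad(x, patch, mode="reflect")
    out = np.empty_like(x)
    for i in range(len(x)):
        nbh = pad[i : i + 2 * patch + 1]
        lo, hi = max(0, i - search), min(len(x), i + search + 1)
        dists = np.array([np.mean((nbh - pad[j : j + 2 * patch + 1]) ** 2)
                          for j in range(lo, hi)])
        weights = np.exp(-dists / h ** 2)
        out[i] = weights @ x[lo:hi] / weights.sum()
    return out

# Multi-scale denoising: decompose, NLM-filter each detail band, reconstruct.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 256)
clean = np.sin(2 * np.pi * 4 * t)
noisy = clean + 0.3 * rng.standard_normal(256)
approx, details = haar_decompose(noisy, 3)
denoised = haar_reconstruct(approx, [nlm_1d(d) for d in details])
mse_noisy = np.mean((noisy - clean) ** 2)
mse_denoised = np.mean((denoised - clean) ** 2)
```

Filtering the detail bands rather than the raw signal is the point of the MRA framework: noise dominates the fine scales, so the NLM averaging removes it there with little damage to the coarse-scale structure.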
Procedia PDF Downloads 141211 Dynamics of Investor's Behaviour: An Analytical Survey Study in Indian Securities Market
Authors: Saurabh Agarwal
Abstract:
This paper attempts to formalise the effect of demographic variables such as marital status, gender, occupation, and age on the source of investment advice, which, in turn, affects the herd behaviour of investors and the probability of investment in the near future. Further, postulations have been made about the most preferred investment option, the purpose of saving, and the source of investment. The impact of theoretical analysis on the choice among investment alternatives has also been investigated. The analysis contributes to understanding the different investment choices made by households in India. The insights offered in the paper indirectly contribute to uncovering various unexplained asset pricing puzzles.Keywords: portfolio choice, investment decisions, investor’s behaviour, Indian securities market
Procedia PDF Downloads 367210 A Generalized Framework for Adaptive Machine Learning Deployments in Algorithmic Trading
Authors: Robert Caulk
Abstract:
A generalized framework for adaptive machine learning deployments in algorithmic trading is introduced, tested, and released as open-source code. The presented software aims to test the hypothesis that recent data contains enough information to form a probabilistically favorable short-term price prediction. Further, the framework contains various adaptive machine learning techniques that are geared toward generating profit during strong trends and minimizing losses during trend changes. Results demonstrate that this adaptive machine learning approach is capable of capturing trends and generating profit. The presentation also discusses the importance of defining the parameter space associated with the dynamic training dataset and using that parameter space to identify and remove outliers from prediction data points. Meanwhile, the generalized architecture enables common users to exploit the powerful machinery while focusing on high-level feature engineering and model testing. The presentation also highlights common strengths and weaknesses associated with the presented technique and presents a broad range of well-tested starting points for feature set construction, target setting, and statistical methods for enforcing risk management and maintaining probabilistically favorable entry and exit points. The presentation also describes the end-to-end data processing tools associated with FreqAI, including automatic data fetching, data aggregation, feature engineering, safe and robust data pre-processing, outlier detection, custom machine learning and statistical tools, data post-processing, adaptive-training backtest emulation, and deployment of adaptive training in live environments. Finally, the generalized user interface is also discussed in the presentation. Feature engineering is simplified so that users can seed their feature sets with common indicator libraries (e.g. TA-lib, pandas-ta). 
The user also feeds data expansion parameters to fill out a large feature set for the model, which can contain as many as 10,000+ features. The presentation describes the various object-oriented programming techniques employed to make FreqAI agnostic to third-party libraries and external data sources. In other words, the back-end is constructed in such a way that users can leverage a broad range of common regression libraries (CatBoost, LightGBM, scikit-learn, etc.) as well as common neural network libraries (TensorFlow, PyTorch) without worrying about the logistical complexities associated with data handling and API interactions. The presentation finishes by drawing conclusions about the most important parameters associated with a live deployment of the adaptive learning framework and provides a road map for future development in FreqAI.Keywords: machine learning, market trend detection, open-source, adaptive learning, parameter space exploration
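The core adaptive pattern, re-training on a sliding window of recent data before every prediction, can be reduced to a numpy-only sketch (a toy linear model on synthetic mean-reverting returns, not FreqAI itself; the window and lag values are illustrative):

```python
import numpy as np

def rolling_retrain(prices, window=50, n_lags=3):
    """Re-fit a small linear model on only the most recent `window` returns at
    every step, then predict the next one-step return: the sliding-window
    retraining pattern reduced to its simplest possible form."""
    returns = np.diff(prices) / prices[:-1]
    preds, actuals = [], []
    for t in range(window + n_lags, len(returns)):
        rows = [returns[i - n_lags : i] for i in range(t - window, t)]
        X = np.column_stack([np.ones(window), np.array(rows)])
        y = returns[t - window : t]
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        preds.append(np.r_[1.0, returns[t - n_lags : t]] @ coef)
        actuals.append(returns[t])
    return np.array(preds), np.array(actuals)

# Synthetic mean-reverting returns: here recent data really is informative.
rng = np.random.default_rng(7)
r = np.zeros(500)
for t in range(1, 500):
    r[t] = -0.4 * r[t - 1] + 0.01 * rng.standard_normal()
prices = 100.0 * np.cumprod(1.0 + r)
preds, actuals = rolling_retrain(prices)
corr = float(np.corrcoef(preds, actuals)[0, 1])
```

In the full framework the linear fit is swapped for any of the supported regressors, and the window, lags, and feature set become entries in the user-defined parameter space.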
Procedia PDF Downloads 89