Search results for: decisions under uncertainty
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 776

146 Recommended Practice for Experimental Evaluation of the Seepage Sensitivity Damage of Coalbed Methane Reservoirs

Authors: Hao Liu, Lihui Zheng, Chinedu J. Okere, Chao Wang, Xiangchun Wang, Peng Zhang

Abstract:

The coalbed methane (CBM) extraction industry (an unconventional energy source) has not established guidelines for the experimental evaluation of sensitivity damage in coal samples. The experimental processes used in previous research have mainly followed the industry standard for conventional oil and gas reservoirs (CIS). However, the existing evaluation method ignores certain critical differences between CBM reservoirs and conventional reservoirs, which inevitably results in an inaccurate evaluation of sensitivity damage and, eventually, poor decisions regarding the formulation of formation damage prevention measures. In this study, we propose improved experimental guidelines for evaluating the seepage sensitivity damage of CBM reservoirs by addressing the shortcomings of the existing methods. The proposed method was established through a theoretical analysis of the main drawbacks of the existing methods and validated through comparative experiments. The results show that the proposed evaluation technique provides reliable experimental results that better reflect actual reservoir conditions and can correctly guide the future development of CBM reservoirs. This study pioneers research on the optimization of experimental parameters for the efficient exploration and development of CBM reservoirs.

Keywords: Coalbed methane, formation damage, permeability, unconventional energy source.

145 Assessing the Adaptive Re-Use Potential of Buildings as Part of the Disaster Management Process

Authors: A. Esra İdemen, Sinan M. Şener, Emrah Acar

Abstract:

The technological paradigm of the disaster management field, especially in the case of governmental intervention strategies, is generally based on rapid and flexible accommodation solutions. From various technical solution patterns used to address the immediate housing needs of disaster victims, the adaptive re-use of existing buildings can be considered to be both low-cost and practical. However, there is a scarcity of analytical methods to screen, select and adapt buildings to help decision makers in cases of emergency. Following an extensive literature review, this paper aims to highlight key points and problem areas associated with the adaptive re-use of buildings within the disaster management context. In other disciplines such as real estate management, the adaptive re-use potential (ARP) of existing buildings is typically based on the prioritization of a set of technical and non-technical criteria which are then weighted to arrive at an economically viable investment decision. After a disaster, however, the assessment of the ARP of buildings requires consideration of different/additional layers of analysis which stem from general disaster management principles and the peculiarities of different types of disasters, as well as of their victims. In this paper, a discussion of the development of an adaptive re-use potential (ARP) assessment model is presented. It is thought that governmental and non-governmental decision makers who are required to take quick decisions to accommodate displaced masses following disasters are likely to benefit from the implementation of such a model.
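
To make the weighting idea concrete, the sketch below shows how a handful of technical and non-technical criteria might be combined into a single adaptive re-use potential score for candidate buildings. The criteria names, weights and scores are hypothetical placeholders chosen for illustration, not values from the study.

```python
# Hedged sketch: weighted-sum ARP scoring for candidate buildings.
# Criteria, weights and scores are illustrative assumptions, not study data.

CRITERIA_WEIGHTS = {
    "structural_soundness": 0.30,        # technical
    "utility_availability": 0.20,        # technical
    "proximity_to_disaster_area": 0.25,  # disaster-management layer
    "capacity_per_m2": 0.15,             # non-technical
    "ownership_clarity": 0.10,           # non-technical / legal
}

def arp_score(building_scores: dict) -> float:
    """Combine per-criterion scores (0..1) into a single weighted ARP value."""
    return sum(CRITERIA_WEIGHTS[c] * building_scores.get(c, 0.0)
               for c in CRITERIA_WEIGHTS)

candidates = {
    "school_gym":    {"structural_soundness": 0.9, "utility_availability": 0.7,
                      "proximity_to_disaster_area": 0.8, "capacity_per_m2": 0.6,
                      "ownership_clarity": 1.0},
    "old_warehouse": {"structural_soundness": 0.6, "utility_availability": 0.4,
                      "proximity_to_disaster_area": 0.9, "capacity_per_m2": 0.9,
                      "ownership_clarity": 0.5},
}

# Rank candidate buildings by their ARP score, highest first.
ranked = sorted(candidates, key=lambda b: arp_score(candidates[b]), reverse=True)
for name in ranked:
    print(f"{name}: ARP = {arp_score(candidates[name]):.2f}")
```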

Keywords: Adaptive re-use of buildings, assessment model, disaster management, temporary housing.

144 Artificial Intelligence-Based Chest X-Ray Test of COVID-19 Patients

Authors: Dhurgham Al-Karawi, Nisreen Polus, Shakir Al-Zaidi, Sabah Jassim

Abstract:

The management of COVID-19 patients based on chest imaging is emerging as an essential tool for evaluating the spread of the pandemic that has gripped the global community. It has already been used to monitor the condition of COVID-19 patients with compromised respiratory status. The use of chest imaging for the medical triage of patients showing moderate-to-severe clinical COVID-19 features has increased, owing to the rapid spread of the pandemic to all continents and communities. This article demonstrates the development of machine learning techniques for testing COVID-19 patients using Chest X-Ray (CXR) images in nearly real time, distinguishing COVID-19 infection with a significantly high level of accuracy. The testing covers a combination of different datasets of CXR images of COVID-19-positive patients, patients with viral and bacterial infections, and people with clear chests. The proposed AI scheme successfully distinguishes CXR scans of COVID-19-infected patients from CXR scans of viral and bacterial pneumonia as well as normal cases, with an average accuracy of 94.43%, sensitivity of 95%, and specificity of 93.86%. Predicted decisions are supported by visual evidence to help clinicians speed up the initial assessment of new suspected cases, especially in resource-constrained environments.
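
A minimal sketch of the texture-based pipeline suggested by the keywords (local binary pattern features feeding a conventional classifier) is given below. Synthetic arrays stand in for real CXR images, an SVM stands in for the classifier actually used, and the LBP parameters are assumptions for illustration; the Gabor-filter features and the reported performance are not reproduced.

```python
# Hedged sketch: LBP texture features + SVM on stand-in "CXR" images.
# Real data, Gabor features and tuning from the paper are not reproduced here.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

P, R = 8, 1  # assumed LBP neighbourhood parameters

def lbp_histogram(img: np.ndarray) -> np.ndarray:
    """Uniform LBP code histogram used as a fixed-length texture descriptor."""
    codes = local_binary_pattern(img, P, R, method="uniform")
    hist, _ = np.histogram(codes, bins=P + 2, range=(0, P + 2), density=True)
    return hist

# Synthetic stand-ins: class 0 = "normal", class 1 = "COVID-19-like" texture.
rng = np.random.default_rng(0)
images = [rng.normal(0.5, 0.1 + 0.2 * y, size=(64, 64))
          for y in (0, 1) for _ in range(50)]
labels = [y for y in (0, 1) for _ in range(50)]

X = np.array([lbp_histogram(img) for img in images])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=1)

clf = SVC(kernel="rbf").fit(X_tr, y_tr)
print("toy accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```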

Keywords: COVID-19, chest x-ray scan, artificial intelligence, texture analysis, local binary pattern transform, Gabor filter.

143 Celebrity Endorsement: How It Works When a Celebrity Fits the Brand and Advertisement

Authors: Göksel Şimşek

Abstract:

Celebrities are admired, appreciated and imitated all over the world. As a natural result of this, many brands today choose to work with celebrities in their advertisements. The more brands include celebrities in their marketing communication strategies, the tougher the competition in this field becomes, and the larger the share of the marketing budget allocated to it. Brands invest in celebrities who will represent them in order to build the image they want to create.

This study aimed to spotlight the perceptions of Turkish consumers regarding the use of celebrities in advertisements and marketing communication, and to understand the possible effects on subsequent purchasing decisions. In addition, consumers’ reactions and perceptions were investigated in the context of the product-celebrity match, the extent to which the celebrity conforms to the concept of the advertisement, and the celebrity-target audience match.

To achieve this purpose, a quantitative case study was conducted on Mavi Jeans (a textile company). Information was obtained through a survey. The results from this case study are supported by relevant theories concerning the main subject. The most valuable finding is that, instead of creating an advertisement around a celebrity in demand at the time, using a celebrity who fits the concept of the advertisement and feeds that concept rather than replacing it, that is, true celebrity endorsement, leads to more striking and positive results.

Keywords: Celebrity endorsement, product-celebrity match, advertising.

142 Typical Day Prediction Model for Output Power and Energy Efficiency of a Grid-Connected Solar Photovoltaic System

Authors: Yan Su, L. C. Chan

Abstract:

A novel typical day prediction model has been built and validated with measured data from a grid-connected solar photovoltaic (PV) system in Macau. Unlike the conventional statistical method used in previous studies of PV systems, which obtains results by averaging nearby continuous points, the present typical day statistical method obtains the value at every minute of a typical day by averaging discontinuous points at the same minute across different days. This typical day statistical method based on discontinuous point averaging makes it possible to obtain Gaussian-shaped dynamic distributions of solar irradiance and output power for a yearly or monthly typical day. Based on the yearly typical day statistical analysis, the maximum possible accumulated output energy in a year under on-site climate conditions and the corresponding optimal PV system running time are obtained. Periodic Gaussian-shaped prediction models for solar irradiance, output energy and system energy efficiency have been built, and their coefficients have been determined from the yearly, maximum and minimum monthly typical day Gaussian distribution parameters, which are obtained by iterating to the minimum Root Mean Squared Deviation (RMSD). With the present model, the dynamic effects due to the time of day are retained, while the day-to-day uncertainty due to changing weather is smoothed but still included. The periodic Gaussian-shaped correlations for solar irradiance, output power and system energy efficiency compare favorably with data from the PV system in Macau and prove to be an improvement over previous models.
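
The discontinuous-point averaging and Gaussian fitting described above can be sketched as follows. The synthetic irradiance records, the single-Gaussian profile and the parameter values are assumptions for illustration; the paper's full periodic model for output power and efficiency is not reproduced.

```python
# Hedged sketch: minute-of-day averaging across days, then a Gaussian fit
# whose coefficients minimise RMSD. Synthetic data stand in for Macau records.
import numpy as np
from scipy.optimize import curve_fit

minutes = np.arange(0, 24 * 60)          # minute of day
days = 30
rng = np.random.default_rng(0)

def gaussian(t, a, mu, sigma):
    """Gaussian-shaped typical-day profile."""
    return a * np.exp(-((t - mu) ** 2) / (2.0 * sigma ** 2))

# Synthetic irradiance: a noisy Gaussian day profile, different each day.
true = gaussian(minutes, 800.0, 12.5 * 60, 3.0 * 60)
records = np.array([np.clip(true * rng.uniform(0.5, 1.1) +
                            rng.normal(0, 30, minutes.size), 0, None)
                    for _ in range(days)])

# "Discontinuous point averaging": average the SAME minute over different days.
typical_day = records.mean(axis=0)

# Fit the Gaussian typical-day model by least squares (equivalent to min RMSD).
popt, _ = curve_fit(gaussian, minutes, typical_day, p0=[700, 12 * 60, 180])
rmsd = np.sqrt(np.mean((gaussian(minutes, *popt) - typical_day) ** 2))
print("fitted (a, mu[min], sigma[min]):", np.round(popt, 1), " RMSD:", round(rmsd, 2))
```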

Keywords: Grid Connected, RMSD, Solar PV System, Typical Day.

141 Using Satellite Images Datasets for Road Intersection Detection in Route Planning

Authors: Fatma El-zahraa El-taher, Ayman Taha, Jane Courtney, Susan Mckeever

Abstract:

Understanding road networks plays an important role in navigation applications such as self-driving vehicles and route planning for individual journeys. Intersections of roads are essential components of road networks. Understanding the features of an intersection, from a simple T-junction to larger multi-road junctions, is critical to decisions such as crossing roads or selecting the safest routes. The identification and profiling of intersections from satellite images is a challenging task. While deep learning approaches offer state-of-the-art performance in image classification and detection, the availability of training datasets is a bottleneck in this approach. In this paper, a labelled satellite image dataset for the intersection recognition problem is presented. It consists of 14,692 satellite images of Washington DC, USA. To support other users of the dataset, an automated download and labelling script is provided for dataset replication. The challenges of construction and fine-grained feature labelling of a satellite image dataset are examined, including the issue of how to address features that are spread across multiple images. Finally, the accuracy of detecting intersections in satellite images is evaluated.

Keywords: Satellite images, remote sensing images, data acquisition, autonomous vehicles, robot navigation, route planning, road intersections.

140 A Comprehensive Survey on RAT Selection Algorithms for Heterogeneous Networks

Authors: Abdallah AL Sabbagh, Robin Braun, Mehran Abolhasan

Abstract:

Due to the coexistence of different Radio Access Technologies (RATs), Next Generation Wireless Networks (NGWN) are predicted to be heterogeneous in nature. The coexistence of different RATs requires Common Radio Resource Management (CRRM) to support the provision of Quality of Service (QoS) and the efficient utilization of radio resources. RAT selection algorithms are part of the CRRM algorithms. Put simply, their role is to verify whether an incoming call can be accommodated in a heterogeneous wireless network, and to decide which of the available RATs is most suitable for the needs of the incoming call before admitting it. The goal of a RAT selection algorithm is to guarantee the QoS requirements of all accepted calls while providing the most efficient utilization of the available radio resources. Conventional call admission control algorithms are designed for homogeneous wireless networks and do not provide a solution suited to the heterogeneous wireless networks that the NGWN represents. Therefore, there is a need to develop RAT selection algorithms for heterogeneous wireless networks. In this paper, we propose an approach for RAT selection that includes receiving different criteria, assessing them and making decisions, and then selecting the most suitable RAT for incoming calls. A comprehensive survey of different RAT selection algorithms for heterogeneous wireless networks is also presented.
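
A toy sketch of the "receive criteria, assess, decide, admit" flow is given below. The RAT names, criteria and weights are invented for illustration and are not taken from any of the surveyed algorithms.

```python
# Hedged sketch: admit an incoming call on the most suitable RAT.
# RATs, criteria and weights are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class RAT:
    name: str
    free_capacity_kbps: float
    signal_quality: float      # 0..1
    cost_per_mb: float         # arbitrary units

def select_rat(rats, required_kbps, weights=(0.5, 0.3, 0.2)):
    """Filter RATs that can satisfy the call's QoS, then rank the remainder."""
    feasible = [r for r in rats if r.free_capacity_kbps >= required_kbps]
    if not feasible:
        return None  # call blocked: no RAT can guarantee the requested QoS
    w_cap, w_sig, w_cost = weights
    def utility(r):
        return (w_cap * r.free_capacity_kbps / 10_000
                + w_sig * r.signal_quality
                - w_cost * r.cost_per_mb)
    return max(feasible, key=utility)

rats = [RAT("UMTS", 384, 0.8, 0.05), RAT("WLAN", 5_000, 0.6, 0.01),
        RAT("LTE", 10_000, 0.9, 0.03)]
chosen = select_rat(rats, required_kbps=1_000)
print("admit call on:", chosen.name if chosen else "blocked")
```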

Keywords: Heterogeneous Wireless Network, RAT selection algorithms, Next Generation Wireless Network (NGWN), Beyond 3G Network, Common Radio Resource Management (CRRM).

139 Using Knowledge Management and Critical Thinking to Understand Thai Perceptions and Decisions towards Work-Life Balance in a Multinational Software Development Firm

Authors: N. Mantalay, N. Chakpitak, W. Janchai, P. Sureepong

Abstract:

Work-life balance has been acknowledged and promoted for the sake of employee retention. It is essential for a manager to understand the human resources situation within a company to help employees work happily and perform at their best. This paper suggests that knowledge management and critical thinking are useful for motivating employees to think about their work-life balance. A qualitative case study is presented, which aimed to discover the meaning of work-life balance from the perspective of Thai knowledge workers and how it affects their decision-making towards work resignation. The results identified three types of work-life balance dimensions: a work-life balance encompassing both the workplace and the private life setting, an organizational working-life balance only, and a work-life balance only in the private life setting. These aspects all influenced the decision-making of the employees. Factors within the theme of an organizational work-life balance involved systematic administration, fair treatment, employee recognition, challenging assignments to gain working experience, assignment engagement, teamwork, relationships with superiors, and the working environment, while factors concerning the private life setting were about personal demands such as increasing their salary or starting their own business.

Keywords: knowledge management, work-life balance, knowledge workers, decision-making, critical thinking, diverse workforce

138 An Effective Decision-Making Strategy Based on Multi-Objective Optimization for Commercial Vehicles in Highway Scenarios

Authors: Weiming Hu, Xu Li, Xiaonan Li, Zhong Xu, Li Yuan, Xuan Dong

Abstract:

Maneuver decision-making plays a critical role in high-performance intelligent driving. This paper proposes a risk assessment-based decision-making network (RADMN) to address the problem of driving strategy for the commercial vehicle. RADMN integrates two networks, aiming at identifying the risk degree of collision and rollover and providing decisions to ensure the effectiveness and reliability of driving strategy. In the risk assessment module, risk degrees of the backward collision, forward collision and rollover are quantified for hazard recognition. In the decision module, a deep reinforcement learning based on multi-objective optimization (DRL-MOO) algorithm is designed, which comprehensively considers the risk degree and motion states of each traffic participant. To evaluate the performance of the proposed framework, Prescan/Simulink joint simulation was conducted in highway scenarios. Experimental results validate the effectiveness and reliability of the proposed RADMN. The output driving strategy can guarantee the safety and provide key technical support for the realization of autonomous driving of commercial vehicles.

Keywords: Decision-making strategy, risk assessment, multi-objective optimization, commercial vehicle.

137 An Investigation into the Impact of Techno-Entrepreneurship Education on Self-Employment

Authors: F. Farzin

Abstract:

Research has shown that techno-entrepreneurship is economically significant. Therefore, it is suggested that teaching techno-entrepreneurship may be important, because such programmes would prepare current and future generations of learners to recognise and act on high-technology opportunities. Education in techno-entrepreneurship may increase knowledge of how to start one’s own enterprise and how to recognise technological opportunities for commercialisation, thereby improving decision-making about starting a new venture; it also influences decisions about capturing business opportunities and turning them into successful ventures. Universities can play a major role in connecting and networking techno-entrepreneurship students towards a cooperative attitude with real business practice and industry knowledge. To investigate whether education for techno-entrepreneurs really helps, this paper chooses a comparison of literature reviews as its method of research. After reviewing the literature on the impact of techno-entrepreneurship education on self-employment, six studies with aims and objectives similar to those of this paper were selected. These particular papers were chosen based on a keyword search and because their aims, objectives and research gaps were close to those of the current research. In addition, they were all concerned with the influence of techno-entrepreneurship education on self-employment and on students’ intentions to start new ventures. The findings showed that teaching techno-entrepreneurship may influence students’ intentions and their future self-employment, but which courses should be covered and how long programmes should last need further investigation.

Keywords: Techno-entrepreneurship education, training, higher education, intention, self-employment.

136 Isolation and Classification of Red Blood Cells in Anemic Microscopic Images

Authors: Jameela Ali Alkrimi, Loay E. George, Azizah Suliman, Abdul Rahim Ahmad, Karim Al-Jashamy

Abstract:

Red blood cells (RBCs) are among the most commonly and intensively studied types of blood cells in cell biology. Anemia, a deficiency of RBCs, is characterized by a hemoglobin level below normal. In this study, an image-processing-based methodology was developed to localize and extract RBCs from microscopic images, and a machine learning approach is adopted to classify the localized anemic RBC images. Several textural and geometrical features are calculated for each extracted RBC. The training set of features was analyzed using principal component analysis (PCA). With the proposed method, RBCs were isolated in 4.3 seconds from an image containing 18 to 27 cells. The reasons for using PCA are its low computational complexity and its suitability for finding the most discriminating features, which can lead to accurate classification decisions. Our classifiers yielded accuracy rates of 100%, 99.99%, and 96.50% for the K-nearest neighbor (K-NN) algorithm, support vector machine (SVM), and radial basis function neural network (RBFNN), respectively. Classification was also evaluated using sensitivity, specificity, and kappa statistical parameters. In conclusion, the classification results were obtained within a short time period, and the results improved when PCA was used.
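
A minimal sketch of the feature-reduction and classification stage (PCA followed by a K-NN classifier) follows. Synthetic feature vectors stand in for the extracted textural and geometrical RBC features, and the parameter choices are illustrative only.

```python
# Hedged sketch: PCA for feature reduction, then K-NN classification.
# Synthetic features stand in for the extracted RBC descriptors.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# 20 stand-in textural/geometrical features per cell, 3 illustrative classes.
X, y = make_classification(n_samples=500, n_features=20, n_informative=8,
                           n_classes=3, random_state=0)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

pca = PCA(n_components=0.95)            # keep components explaining 95% variance
X_tr_p, X_te_p = pca.fit_transform(X_tr), pca.transform(X_te)

knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr_p, y_tr)
print(classification_report(y_te, knn.predict(X_te_p)))
```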

Keywords: Red blood cells, pre-processing image algorithms, classification algorithms, principal component analysis PCA, confusion matrix, kappa statistical parameters, ROC.

135 A Post Keynesian Environmental Macroeconomic Model for Agricultural Water Sustainability under Climate Change in the Murray-Darling Basin, Australia

Authors: Ke Zhao, Colin Richardson, Jerry Courvisanos, John Crawford

Abstract:

Climate change has profound consequences for the agriculture of south-eastern Australia and the climate-induced water shortage in the Murray-Darling Basin. Post Keynesian Economics (PKE) macro-dynamics, along with Kaleckian investment and growth theory, are used to develop an ecological-economic system dynamics model of this complex nonlinear river basin system. The Murray-Darling Basin Simulation Model (MDB-SM) uses the principles of PKE to incorporate the fundamental uncertainty of farmers' economic behaviour regarding the investments they make and the climate change they face, particularly as regards water ecosystem services. MDB-SM provides a framework for macroeconomic policies, especially for long-term fiscal policy and for policy directed at the sustainability of agricultural water, as measured by socio-economic well-being considerations, which include sustainable consumption and investment in the river basin. The model can also reproduce other ecological and economic aspects and, for certain parameters and initial values, exhibit endogenous business cycles and ecological sustainability with realistic characteristics. Most importantly, MDB-SM provides a platform for the analysis of alternative economic policy scenarios. These results reveal the importance of understanding water ecosystem adaptation under climate change by integrating a PKE macroeconomic analytical framework with the system dynamics modelling approach. Once parameterised and supplied with historical initial values, MDB-SM should prove to be a practical tool for providing alternative long-term policy simulations of agricultural water and socio-economic well-being.

Keywords: Agricultural water, Macroeconomic dynamics, Modeling, Investment dynamics, Sustainability, Unemployment, Economics, Keynesian, Kaleckian.

134 Customer Segmentation Model in E-commerce Using Clustering Techniques and LRFM Model: The Case of Online Stores in Morocco

Authors: Rachid Ait daoud, Abdellah Amine, Belaid Bouikhalene, Rachid Lbibb

Abstract:

Given the increase in the number of e-commerce sites, the number of competitors has become very large. This means that companies have to take appropriate decisions in order to meet the expectations of their customers and satisfy their needs. In this paper, we present a case study applying the LRFM (length, recency, frequency and monetary) model and clustering techniques to the electronic commerce sector, with a view to evaluating the customer value of Moroccan e-commerce websites and then developing effective marketing strategies. To achieve these objectives, we adopt the LRFM model with a two-stage clustering method. In the first stage, the self-organizing maps method is used to determine the best number of clusters and the initial centroids. In the second stage, the k-means method is applied to segment 730 customers into nine clusters according to their L, R, F and M values. The results show that cluster 6 is the most important cluster, because its average L, R, F and M values are higher than the overall averages. In addition, this study considers another variable describing the payment method used by customers, in order to improve and strengthen the cluster analysis. The cluster analysis demonstrates that the payment method is one of the key indicators of a new index for assessing the level of customers' confidence in the company's website.
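
The second-stage segmentation can be sketched as below. The SOM stage that selects the number of clusters and the initial centroids is not reproduced (k is simply fixed to nine, as in the paper), and the LRFM values are randomly generated stand-ins for the 730 real customers.

```python
# Hedged sketch: k-means segmentation on standardized L, R, F, M values.
# The SOM stage that selects k and initial centroids is not reproduced.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_customers = 730
lrfm = np.column_stack([
    rng.integers(1, 1000, n_customers),   # L: length of relationship (days)
    rng.integers(1, 365, n_customers),    # R: recency (days since last order)
    rng.integers(1, 50, n_customers),     # F: frequency (number of orders)
    rng.gamma(2.0, 300.0, n_customers),   # M: monetary value
]).astype(float)

X = StandardScaler().fit_transform(lrfm)
km = KMeans(n_clusters=9, n_init=10, random_state=0).fit(X)

overall_mean = lrfm.mean(axis=0)
for c in range(9):
    cluster_mean = lrfm[km.labels_ == c].mean(axis=0)
    # A cluster whose averages exceed the overall averages is a candidate
    # "most valuable" segment (recency is usually better when LOWER).
    print(f"cluster {c}: size={np.sum(km.labels_ == c):3d}, "
          f"mean LRFM={np.round(cluster_mean, 1)}, overall={np.round(overall_mean, 1)}")
```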

Keywords: Customer value, LRFM model, Cluster analysis, Self-Organizing Maps method (SOM), K-means algorithm, loyalty.

133 Deep Reinforcement Learning Approach for Trading Automation in the Stock Market

Authors: Taylan Kabbani, Ekrem Duman

Abstract:

Deep Reinforcement Learning (DRL) algorithms can scale to previously intractable problems. The automation of profit generation in the stock market is possible using DRL by combining the financial asset price "prediction" step and the portfolio "allocation" step into one unified process, producing fully autonomous systems capable of interacting with their environment to make optimal decisions through trial and error. This work presents a DRL model to generate profitable trades in the stock market, effectively overcoming the limitations of supervised learning approaches. We formulate the trading problem as a Partially Observed Markov Decision Process (POMDP) model, considering the constraints imposed by the stock market, such as liquidity and transaction costs. We then solve the formulated POMDP problem using the Twin Delayed Deep Deterministic Policy Gradient (TD3) algorithm and achieve a 2.68 Sharpe ratio on the test dataset. From the point of view of stock market forecasting and intelligent decision-making mechanisms, this paper demonstrates the superiority of DRL in financial markets over other types of machine learning and shows its credibility and advantages for strategic decision-making.
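
As a small companion to the reported 2.68 Sharpe ratio, the sketch below shows one common way of computing an annualized Sharpe ratio from a series of daily portfolio returns, such as those produced by a trained agent on a test set. The return series and the 252-day annualization factor are assumptions; the TD3 agent itself is not reproduced.

```python
# Hedged sketch: annualized Sharpe ratio from daily returns produced by a
# trading policy (e.g. the actions of a trained TD3 agent on a test period).
import numpy as np

def sharpe_ratio(daily_returns: np.ndarray, risk_free_daily: float = 0.0,
                 periods_per_year: int = 252) -> float:
    """Mean excess return over its standard deviation, annualized."""
    excess = daily_returns - risk_free_daily
    return np.sqrt(periods_per_year) * excess.mean() / excess.std(ddof=1)

rng = np.random.default_rng(0)
test_returns = rng.normal(0.0008, 0.01, size=252)  # stand-in daily P&L series
print("annualized Sharpe:", round(sharpe_ratio(test_returns), 2))
```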

Keywords: Autonomous agent, deep reinforcement learning, MDP, sentiment analysis, stock market, technical indicators, twin delayed deep deterministic policy gradient.

132 Augmented Reality for Maintenance Operator for Problem Inspections

Authors: Chong-Yang Qiao, Teeravarunyou Sakol

Abstract:

Current production-oriented factories need maintenance operators working in shifts to monitor and inspect complex systems and different equipment in situations of mechanical breakdown. Augmented reality (AR) is an emerging technology that embeds data into the environment for situation awareness, helping maintenance operators make decisions and solve problems. An application was designed to identify problems in steam generators and to inspect centrifugal pumps. The objective of this research was to find the best AR medium and the preferred type of problem-solving strategy among analogy, the focal object method and means-ends analysis. Two leakage-inspection scenarios were considered: temperature and vibration. Two experiments were used for usability evaluation and future innovation, covering the decision-making process and the problem-solving strategy. This study found that maintenance operators prefer a built-in magnifier to zoom in on components (55.6%), a 3D exploded view to track the problem parts (50%), and a line chart to spot altered data or information (61.1%). There is a significant difference in the use of analogy (44.4%), focal objects (38.9%) and the means-ends strategy (16.7%). The marked differences between maintainers and operators lie in how they apply a problem-solving strategy. Future work should explore multimedia information retrieval to support maintenance operators in decision-making.

Keywords: Augmented reality, situation awareness, decision-making, problem-solving.

131 A System for Analyzing and Eliciting Public Grievances Using Cache Enabled Big Data

Authors: P. Kaladevi, N. Giridharan

Abstract:

The main purpose of the system for analyzing and eliciting public grievances is to receive and process all kinds of complaints from the public and respond to users. As the number of complaints grows, the data become big data, which is difficult to store and process. The proposed system uses HDFS to store the big data and MapReduce to process it. The concept of caching was applied in the system to provide immediate responses and timely action using big data analytics; cache-enabled big data improves the response time of the system. The unstructured data provided by users are handled efficiently through the MapReduce algorithm. The processing of complaints takes place in the order of the hierarchy of authority. The drawbacks of the traditional database system used in the existing system are overcome by our system through the cache-enabled Hadoop Distributed File System. MapReduce framework code can potentially leak sensitive data during the computation process; we therefore propose a system that adds noise to the output of the reduce phase to avoid signaling the presence of sensitive data. If a complaint is not processed within the allotted time, it is automatically forwarded to a higher authority, which ensures assured processing. A copy of the filed complaint is sent as a digitally signed PDF document to the user's email address, which serves as proof. The system reports serve as essential data when making important decisions based on legislation.
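
The reduce-phase noise idea can be illustrated with a pure-Python map/reduce simulation. The Laplace mechanism used here is one standard way to perturb counts, chosen for illustration, and the complaint categories are invented; the actual Hadoop/HDFS implementation is not reproduced.

```python
# Hedged sketch: a tiny in-memory map/reduce over complaint records, with
# Laplace noise added to the reduce-phase counts so that exact totals do not
# signal the presence of sensitive records. Categories are invented examples.
from collections import defaultdict
import numpy as np

rng = np.random.default_rng(42)

complaints = [
    {"id": 1, "category": "water_supply"},
    {"id": 2, "category": "roads"},
    {"id": 3, "category": "water_supply"},
    {"id": 4, "category": "sanitation"},
]

# Map phase: emit (category, 1) for every complaint.
mapped = [(c["category"], 1) for c in complaints]

# Shuffle/partition phase: group emitted values by key.
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce phase: sum per key, then perturb with Laplace noise. The noise scale
# trades accuracy against the risk of disclosing sensitive records.
def noisy_reduce(values, scale=1.0):
    return max(0.0, sum(values) + rng.laplace(0.0, scale))

for category, values in sorted(groups.items()):
    print(category, round(noisy_reduce(values), 2))
```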

Keywords: Big Data, Hadoop, HDFS, Caching, MapReduce, web personalization, e-governance.

130 Agreement Options on Multi Criteria Group Decision and Negotiation

Authors: Christiono Utomo, Arazi Idrus, Madzlan Napiah, Mohd. Faris Khamidi

Abstract:

This paper presents a conceptual model of agreement options in negotiation support for civil engineering decisions. The negotiation support facilitates the solving of group-choice decision-making problems in civil engineering in order to reduce the impact of the mud volcano disaster in Sidoarjo, Indonesia. The approach is based on the application of the analytic hierarchy process (AHP) method for multi-criteria decisions over a three-level decision hierarchy. Decisions for reducing the impact are very complicated, since many parties are involved at a critical time. Where a number of stakeholders are involved in choosing a single alternative from a set of solution alternatives, there are differing concerns caused by differing stakeholder preferences, experiences, and backgrounds. Therefore, group-choice decision support is required to enable each stakeholder to evaluate and rank the solution alternatives before engaging in negotiation with the other stakeholders. Such civil engineering solutions, as alternatives, are referred to as agreement options; they are determined by identifying the possible stakeholder choices and then determining the optimal solution for each group of stakeholders. Determination of the optimal solution is based on a game theory model of an n-person general-sum game with complete information that involves forming coalitions among stakeholders.
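
The AHP step of the model can be sketched as follows: priorities are taken from the principal eigenvector of a pairwise comparison matrix, and a consistency ratio checks the coherence of the judgments. The 3x3 comparison values below are invented for illustration and are not the study's data.

```python
# Hedged sketch: AHP priority vector and consistency ratio for one decision
# level. The pairwise judgments below are illustrative, not the study's data.
import numpy as np

# Pairwise comparison of three hypothetical solution alternatives (Saaty scale).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
priorities = np.abs(eigvecs[:, k].real)
priorities /= priorities.sum()            # normalized priority vector

n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)            # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]        # Saaty's random index
print("priorities:", np.round(priorities, 3), " CR:", round(ci / ri, 3))
```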

Keywords: Agreement options, AHP, agent, negotiation, multicriteria, game theory, and coalition.

129 Procedure Model for Data-Driven Decision Support Regarding the Integration of Renewable Energies into Industrial Energy Management

Authors: M. Graus, K. Westhoff, X. Xu

Abstract:

Climate change is driving change in all aspects of society. While the expansion of renewable energies proceeds, general studies about the potential of demand-side management have not convinced industry to reinforce smart grid considerations in its operational business. In this article, a procedure model for case-specific, data-driven decision support for industrial energy management, based on a holistic data analytics approach, is presented. The model is demonstrated on the example of a strategic decision problem: integrating renewable energies into industrial energy management. This question arises from considerations of changing the electricity contract model from a standard rate to volatile energy prices corresponding to the energy spot market, which is increasingly affected by renewable energies. The procedure model corresponds to a data analytics process consisting of data modelling, analysis, simulation and optimization steps. This procedure helps to quantify the potential of sustainable production concepts based on data from a factory. The model is validated with data from a printer, by analogy with a simple production machine. The overall goal is to establish smart grid principles for industry via the transformation from knowledge-driven to data-driven decisions within manufacturing companies.

Keywords: Data analytics, green production, industrial energy management, optimization, renewable energies, simulation.

128 The Impact of Transaction Costs on Rebalancing an Investment Portfolio in Portfolio Optimization

Authors: B. Marasović, S. Pivac, S. V. Vukasović

Abstract:

Constructing a portfolio of investments is one of the most significant financial decisions facing individuals and institutions. In accordance with modern portfolio theory, maximization of return at minimal risk should be the investment goal of any successful investor. In addition, the costs incurred when setting up a new portfolio or rebalancing an existing portfolio must be included in any realistic analysis. In this paper, rebalancing an investment portfolio in the presence of transaction costs on the Croatian capital market is analyzed. The model applied in the paper is an extension of the standard portfolio mean-variance optimization model in which transaction costs are incurred to rebalance an investment portfolio. This model allows different costs for different securities, and different costs for buying and selling. In order to find an efficient portfolio using this model, the solution of a quadratic programming problem of similar size to the Markowitz model must be found first, followed by the solution of a linear programming problem. Furthermore, the impact of transaction costs on the efficient frontier is investigated. Moreover, it is shown that the global minimum variance portfolio on the efficient frontier always has the same level of risk regardless of the amount of transaction costs. Although the position of the efficient frontier depends on both the amount of transaction costs and the initial portfolio, it can be concluded that the rightmost portfolio on the efficient frontier always contains only one stock, the one with the highest expected return and the highest risk.
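
The rebalancing model with proportional transaction costs can be sketched as a small convex program. The returns, covariances, cost rates and risk-aversion value below are invented, and this single program is a simplified stand-in for the paper's quadratic-then-linear two-step procedure with separate buying and selling costs.

```python
# Hedged sketch: mean-variance rebalancing with proportional transaction costs.
# Data are invented; this condenses the model into one convex program rather
# than the paper's quadratic-plus-linear two-step procedure.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n = 5                                            # number of securities
mu = rng.uniform(0.02, 0.12, n)                  # expected returns
F = rng.normal(size=(n, n))
Sigma = F @ F.T / n + 0.01 * np.eye(n)           # positive-definite covariance
Sigma = (Sigma + Sigma.T) / 2                    # enforce exact symmetry
w0 = np.full(n, 1.0 / n)                         # current (initial) portfolio
cost_rate = rng.uniform(0.002, 0.01, n)          # per-security cost rate
gamma = 5.0                                      # risk aversion

w = cp.Variable(n)
objective = cp.Maximize(mu @ w
                        - gamma * cp.quad_form(w, Sigma)
                        - cost_rate @ cp.abs(w - w0))
problem = cp.Problem(objective, [cp.sum(w) == 1, w >= 0])
problem.solve()

print("rebalanced weights:", np.round(w.value, 3))
print("turnover cost:", round(float(cost_rate @ np.abs(w.value - w0)), 4))
```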

Keywords: Croatian capital market, Fractional quadratic programming, Markowitz model, Portfolio optimization, Transaction costs.

127 Twitter Sentiment Analysis during the Lockdown on New Zealand

Authors: Smah Doeban Almotiri

Abstract:

One of the most common fields of natural language processing (NLP) is sentiment analysis. The feeling conveyed in a text can be successfully mined for various events using sentiment analysis. Twitter is viewed as a reliable data source for sentiment analytics studies, since people used social media to receive and exchange different types of data on a broad scale during the COVID-19 epidemic. The processing of such data may aid in making critical decisions on how to keep the situation under control. The aim of this research is to examine how sentiment differed in a single geographic region during the lockdown at two different times. A total of 1,162 tweets related to the COVID-19 pandemic lockdown were analyzed, collected using the keyword hashtags (lockdown, COVID-19); the first sample covered tweets from March 23, 2020, until April 23, 2020, and the second sample, for the following year, covered March 1, 2021, until April 4, 2021. Natural language processing, a form of artificial intelligence, was used to calculate the sentiment value of all the tweets using the AFINN lexicon sentiment analysis method. The findings revealed that sentiment at both times during the region's lockdown was positive in the samples of this study, which are specific to the geographical area of New Zealand. This research suggests applying machine-learning-based sentiment methods such as Crystal Feel and extending the sample size by using more tweets over a longer period of time.
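
The AFINN scoring step can be illustrated with a tiny lexicon-based sketch. The handful of word valences below are in the spirit of AFINN but are not the real lexicon, and the example tweets are invented.

```python
# Hedged sketch: AFINN-style lexicon scoring of tweets. The mini-lexicon and
# tweets below are illustrative; the real AFINN list contains thousands of terms.
import re

MINI_AFINN = {            # word -> valence in [-5, 5], in the style of AFINN
    "good": 3, "great": 3, "safe": 1, "hope": 2, "grateful": 3,
    "bad": -3, "sad": -2, "worried": -3, "stuck": -2, "lockdown": 0,
}

def afinn_score(text: str) -> int:
    """Sum the valence of every known word; >0 positive, <0 negative."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return sum(MINI_AFINN.get(tok, 0) for tok in tokens)

tweets = [
    "Feeling safe and grateful during lockdown, hope everyone is good",
    "So worried and sad, stuck inside again #lockdown",
]
for t in tweets:
    score = afinn_score(t)
    label = "positive" if score > 0 else "negative" if score < 0 else "neutral"
    print(f"{score:+d} {label}: {t}")
```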

Keywords: sentiment analysis, Twitter analysis, lockdown, Covid-19, AFINN, NodeJS

126 Seismic Fragility Assessment of Strongback Steel Braced Frames Subjected to Near-Field Earthquakes

Authors: Mohammadreza Salek Faramarzi, Touraj Taghikhany

Abstract:

In this paper, the seismic fragility of a recently developed hybrid structural system, known as the strongback system (SBS), is assessed. In this system, an elastic vertical truss is formed to mitigate the occurrence of the soft-story mechanism and improve the distribution of story drifts over the height of the structure. The strengthened members of the braced span are designed to remain substantially elastic during levels of excitation where soft-story mechanisms are likely to occur, imposing a nearly uniform story drift distribution. Due to the distinctive characteristics of near-field ground motions, it is necessary to study the effect of these records on the seismic performance of the SBS. To this end, a set of 56 near-field ground motion records suggested by the FEMA P695 methodology is used. For the fragility assessment, nonlinear dynamic analyses are carried out in OpenSEES based on the procedure recommended in the HAZUS technical manual. Four damage states, including slight, moderate, extensive, and complete damage (collapse), are considered. To evaluate each damage state, the inter-story drift ratio and floor acceleration are used as engineering demand parameters. Further, to extend the evaluation of the collapse state of the system, a different collapse criterion suggested in FEMA P695 is applied. It is concluded that the SBS can significantly increase the collapse capacity and consequently decrease the collapse risk of the structure during its lifetime. Comparing the observed mean annual frequency (MAF) of exceedance of each damage state against the allowable values presented in performance-based design methods, it is found that using the elastic vertical truss improves the structural response effectively.
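
Fragility curves of the kind used here are commonly expressed as lognormal functions of an intensity measure; the sketch below fits the two lognormal parameters to exceedance counts by maximum likelihood. All numbers are synthetic stand-ins for IDA results, not values from the study.

```python
# Hedged sketch: fit a lognormal fragility curve P(DS | IM) = Phi(ln(IM/theta)/beta)
# to exceedance counts from IDA-style analyses. All data here are synthetic.
import numpy as np
from scipy.stats import norm, binom
from scipy.optimize import minimize

im_levels = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.2])   # e.g. Sa(T1) in g
n_records = 56                                          # records per IM level
exceedances = np.array([2, 9, 22, 38, 48, 53])          # synthetic counts

def neg_log_like(params):
    theta, beta = params
    if theta <= 0 or beta <= 0:
        return np.inf
    p = norm.cdf(np.log(im_levels / theta) / beta)      # exceedance probability
    return -np.sum(binom.logpmf(exceedances, n_records, p))

result = minimize(neg_log_like, x0=[0.7, 0.4], method="Nelder-Mead")
theta_hat, beta_hat = result.x
print(f"median capacity theta = {theta_hat:.3f} g, dispersion beta = {beta_hat:.3f}")
```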

Keywords: Strongback System, Near-fault, Seismic fragility, Uncertainty, IDA, Probabilistic performance assessment.

125 Modelling Hydrological Time Series Using Wakeby Distribution

Authors: Ilaria Lucrezia Amerise

Abstract:

The statistical modelling of precipitation data for a given portion of territory is fundamental for the monitoring of climatic conditions and for Hydrogeological Management Plans (HMP). This modelling is rendered particularly complex by the changes taking place in the frequency and intensity of precipitation, presumably attributable to global climate change. This paper applies the Wakeby distribution (with five parameters) as a theoretical reference model. The number and quality of the parameters indicate that this distribution may be an appropriate choice for interpolating hydrological variables; moreover, the Wakeby distribution is particularly suitable for describing phenomena that produce heavy tails. The proposed estimation methods for determining the values of the Wakeby parameters are the same as those used for density functions with heavy tails. The commonly used procedure is the classic method of moments weighted with probabilities (probability weighted moments, PWM), although this has often shown difficulty in converging, or rather, convergence to a configuration of inappropriate parameters. In this paper, we analyze the problem of likelihood estimation for a random variable expressed through its quantile function. The method of maximum likelihood is, in this case, more demanding than in more usual estimation settings. Its appeal lies in the sampling and asymptotic properties of the maximum likelihood estimators, which improve the estimates by providing indications of their variability and, therefore, their accuracy and reliability. These features are highly appreciated in contexts where poor decisions, attributable to an inefficient or incomplete information base, can cause serious damage.
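
Because the Wakeby distribution is defined through its quantile function, it is straightforward to simulate from. The sketch below codes the five-parameter quantile function and draws a heavy-tailed sample by inverse-transform sampling; the parameter values are arbitrary illustrative choices.

```python
# Hedged sketch: the 5-parameter Wakeby quantile function and inverse-transform
# sampling from it. Parameter values are arbitrary illustrative choices.
import numpy as np

def wakeby_quantile(F, xi, alpha, beta, gamma, delta):
    """x(F) = xi + (alpha/beta)*(1-(1-F)^beta) - (gamma/delta)*(1-(1-F)^(-delta))."""
    F = np.asarray(F, dtype=float)
    return (xi
            + (alpha / beta) * (1.0 - (1.0 - F) ** beta)
            - (gamma / delta) * (1.0 - (1.0 - F) ** (-delta)))

# Arbitrary parameters producing a heavy right tail (delta > 0).
params = dict(xi=0.0, alpha=5.0, beta=2.0, gamma=1.0, delta=0.3)

rng = np.random.default_rng(0)
u = rng.uniform(0, 1, size=10_000)
sample = wakeby_quantile(u, **params)     # simulated "daily precipitation" values

probs = [0.5, 0.9, 0.99, 0.999]
print("quantiles:", {p: round(float(wakeby_quantile(p, **params)), 2) for p in probs})
print("sample mean/max:", round(sample.mean(), 2), round(sample.max(), 2))
```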

Keywords: Generalized extreme values (GEV), likelihood estimation, precipitation data, Wakeby distribution.

124 Genetic Programming: Principles, Applications and Opportunities for Hydrological Modelling

Authors: Oluwaseun K. Oyebode, Josiah A. Adeyemo

Abstract:

Hydrological modelling plays a crucial role in the planning and management of water resources, especially in water-stressed regions where the need to effectively manage the available water resources is of critical importance. However, due to the complex, nonlinear and dynamic behaviour of hydro-climatic interactions, achieving reliable modelling of water resource systems and accurate projection of hydrological parameters is extremely challenging. Although a significant number of modelling techniques (process-based and data-driven) have been developed and adopted in that regard, the field of hydrological modelling is still considered one that has progressed sluggishly over the past decades. This is mainly a result of the identification of some degree of uncertainty in the methodologies and results of the techniques adopted. In recent times, evolutionary computation (EC) techniques have been developed and introduced in response to the search for efficient and reliable means of providing accurate solutions to hydrological problems. This paper presents a comprehensive review of the underlying principles, methodological needs and applications of a promising evolutionary computation modelling technique, genetic programming (GP). It examines the specific characteristics of the technique that make it suitable for solving hydrological modelling problems. It discusses the opportunities inherent in the application of GP to water-related studies such as rainfall estimation, rainfall-runoff modelling, streamflow forecasting, sediment transport modelling, water quality modelling and groundwater modelling, among others. Furthermore, the means by which such opportunities could be harnessed in the near future are discussed. In all, a case is made for the full embrace of GP and its variants in hydrological modelling studies, so as to put in place strategies that would translate into meaningful progress in the modelling of water resource systems and positively influence decision-making by relevant stakeholders.

Keywords: Computational modelling, evolutionary algorithms, genetic programming, hydrological modelling.

123 Dynamic Features Selection for Heart Disease Classification

Authors: Walid MOUDANI

Abstract:

The healthcare environment is generally perceived as being information rich yet knowledge poor, and there is a lack of effective analysis tools to discover hidden relationships and trends in the data. In fact, valuable knowledge can be discovered by applying data mining techniques in healthcare systems. In this study, a proficient methodology is presented for extracting significant patterns from coronary heart disease data warehouses for the prediction of heart attack, which unfortunately continues to be a leading cause of mortality worldwide. For this purpose, we propose to dynamically enumerate the optimal subsets of the reduced features of high interest using the rough sets technique combined with dynamic programming, and to validate the classification using a Random Forest (RF) decision tree to identify risky heart disease cases. This work is based on a large amount of data collected from several clinical institutions, drawing on the medical profiles of patients. Moreover, experts' knowledge in this field has been taken into consideration in order to define the disease and its risk factors, and to establish significant knowledge relationships among the medical factors. A computer-aided system was developed for this purpose based on a population of 525 adults. The performance of the proposed model is analyzed and evaluated against a set of benchmark techniques applied to this classification problem.

Keywords: Multi-Classifier Decisions Tree, Features Reduction, Dynamic Programming, Rough Sets.

122 Assessment-Assisted and Relationship-Based Financial Advising: Using an Empirical Assessment to Understand Personal Investor Risk Tolerance in Professional Advising Relationships

Authors: Jerry Szatko, Edan L. Jorgensen, Stacia Jorgensen

Abstract:

A crucial component to the success of any financial advising relationship is for the financial professional to understand the perceptions, preferences and thought-processes carried by the financial clients they serve. Armed with this information, financial professionals are more quickly able to understand how they can tailor their approach to best match the individual preferences and needs of each personal investor. Our research explores the use of a quantitative assessment tool in the financial services industry to assist in the identification of the personal investor’s consumer behaviors, especially in terms of financial risk tolerance, as it relates to their financial decision making. Through this process, the Unitifi Consumer Insight Tool (UCIT) was created and refined to capture and categorize personal investor financial behavioral categories and the financial personality tendencies of individuals prior to the initiation of a financial advisement relationship. This paper discusses the use of this tool to place individuals in one of four behavior-based financial risk tolerance categories. Our discoveries and research were aided through administration of a web-based survey to a group of over 1,000 individuals. Our findings indicate that it is possible to use a quantitative assessment tool to assist in predicting the behavioral tendencies of personal consumers when faced with consumer financial risk and decisions.

Keywords: Behavior based advising, behavioral finance, financial advising, financial advisor tools, financial risk tolerance.

121 The Analysis of Regulation on Sustainability in Financial Sector in Lithuania

Authors: D. Kubiliute

Abstract:

The Republic of Lithuania is known as a trusted location for global business institutions, and it attracts investors with its competitive environment for financial service providers. Along with the aspiration to offer a strong, results-oriented and innovation-driven environment for financial service providers, the Lithuanian regulatory authorities consistently implement the European Union's high regulatory standards for financial activities, including sustainability-related disclosures. Since the European Union directed its policy towards the transition to a climate-neutral, green, competitive and inclusive economy, additional regulatory requirements for financial market participants have been adopted: disclosure of sustainable activities, transparency, prevention of greenwashing, and others. The financial sector is one of the key factors influencing the implementation of sustainability objectives in European Union policies and mitigating the negative effects of climate change: public funds are not enough to make a significant impact on sustainable investment, so directing public and private capital to green projects may help to finance the necessary changes. The topic of this study is original and has not yet been widely analyzed in Lithuanian legal discourse. Quantitative and qualitative methodologies and logical, systematic and critical analysis principles are used; the aim of this study is to reveal the problems of implementing sustainability regulation in the Lithuanian financial sector. Additional regulatory requirements could cause serious changes in financial business operations: additional funds, employees and time have to be dedicated so that companies can implement these regulations. A lack of knowledge and data on how to implement the new regulatory requirements for sustainability reporting causes a great deal of uncertainty for financial market participants, and for some companies it might even become a question of business continuity. It is considered that the supervisory authorities should find a balance between financial market needs and legal regulation.

Keywords: Financial, market participant, legal, regulation, sustainability.

120 Breast Cancer Survivability Prediction via Classifier Ensemble

Authors: Mohamed Al-Badrashiny, Abdelghani Bellaachia

Abstract:

This paper presents a classifier ensemble approach for predicting the survivability of breast cancer patients using the latest database version of the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute. The system consists of two main components: a features selection component and a classifier ensemble component. The features selection component divides the features in the SEER database into four groups and then tries to find the most important features among the four groups that maximize the weighted average F-score of a certain classification algorithm. The ensemble component uses three different classifiers, each of which models a different set of features from SEER through the features selection module. On top of them, another classifier is used to give the final decision based on the output decisions and confidence scores from each of the underlying classifiers. Different classification algorithms have been examined; the best setup found uses the decision tree, Bayesian network, and Naïve Bayes algorithms for the underlying classifiers and Naïve Bayes for the classifier ensemble step. The system outperforms all systems published to date when evaluated against the exact same SEER data (period 1973-2002), giving an 87.39% weighted average F-score compared to 85.82% and 81.34% for the other published systems. By increasing the data size to cover the whole database (period 1973-2014), the overall weighted average F-score rises to 92.4% on the held-out unseen test set.
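
The ensemble step can be sketched with a standard stacking setup. Synthetic data replace SEER, a Bernoulli Naïve Bayes stands in for the Bayesian-network base learner (scikit-learn has no Bayesian network classifier), and the feature-group selection module is omitted.

```python
# Hedged sketch: stacked ensemble with a Naive Bayes meta-classifier, scored by
# weighted-average F-score. Synthetic data replace SEER; a Bernoulli NB stands
# in for the paper's Bayesian-network base learner.
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB, BernoulliNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

X, y = make_classification(n_samples=2000, n_features=25, n_informative=10,
                           weights=[0.7, 0.3], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

base_learners = [
    ("dtree", DecisionTreeClassifier(max_depth=8, random_state=0)),
    ("bnb", BernoulliNB()),        # stand-in for the Bayesian network classifier
    ("gnb", GaussianNB()),
]
ensemble = StackingClassifier(estimators=base_learners,
                              final_estimator=GaussianNB())
ensemble.fit(X_tr, y_tr)

pred = ensemble.predict(X_te)
print("weighted average F-score:", round(f1_score(y_te, pred, average="weighted"), 4))
```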

Keywords: Classifier ensemble, breast cancer survivability, data mining, SEER.

119 Object Negotiation Mechanism for an Intelligent Environment Using Event Agents

Authors: Chiung-Hui Chen

Abstract:

With advancements in science and technology, the concept of the Internet of Things (IoT) has gradually developed. The development of the intelligent environment adds intelligence to objects in the living space by using the IoT. In the smart environment, when multiple users share the living space, if different service requirements from different users arise, then the context-aware system will have conflicting situations for making decisions about providing services. Therefore, the purpose of establishing a communication and negotiation mechanism among objects in the intelligent environment is to resolve those service conflicts among users. This study proposes developing a decision-making methodology that uses “Event Agents” as its core. When the sensor system receives information, it evaluates a user’s current events and conditions; analyses object, location, time, and environmental information; calculates the priority of the object; and provides the user services based on the event. Moreover, when the event is not single but overlaps with another, conflicts arise. This study adopts the “Multiple Events Correlation Matrix” in order to calculate the degree values of incidents and support values for each object. The matrix uses these values as the basis for making inferences for system service, and to further determine appropriate services when there is a conflict.

Keywords: Internet of things, intelligent object, event agents, negotiation mechanism, degree of similarity.

118 Microscopic Simulation of Toll Plaza Safety and Operations

Authors: Bekir O. Bartin, Kaan Ozbay, Sandeep Mudigonda, Hong Yang

Abstract:

The use of microscopic traffic simulation in evaluating the operational and safety conditions at toll plazas is demonstrated. Two toll plazas in New Jersey were selected as case studies, and their models were developed and validated in the Paramics traffic simulation software. In order to simulate drivers' lane selection behavior in Paramics, a utility-based lane selection approach was implemented through the Paramics Application Programming Interface (API). For each vehicle approaching the toll plaza, a utility value is assigned to each toll lane by taking into account the factors that are likely to affect drivers' lane selection behavior, such as the approach lane, the exit lane and queue lengths. The results demonstrate that comparable operational conditions, such as lane-by-lane toll plaza traffic volumes, can be attained using this approach. In addition, the assessment of safety at toll plazas is conducted via a surrogate safety measure. In particular, the crash index (CI), an improved surrogate measure of time-to-collision (TTC) that reflects the severity of a crash, is used in the simulation analyses. The results indicate that the spatial and temporal frequency of observed crashes can be simulated using the proposed methodology. Further analyses can be conducted to evaluate and compare different operational decisions and safety measures using microscopic simulation models.
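
The utility-based lane selection logic implemented through the API can be sketched in plain Python. The utility terms and coefficients below are invented for illustration and do not reflect the calibrated values of the study.

```python
# Hedged sketch: utility-based toll-lane choice. A vehicle scores each open
# lane from queue length, lateral shift from its approach lane, and
# payment-type compatibility; coefficients are illustrative, not calibrated.
from dataclasses import dataclass

@dataclass
class TollLane:
    lane_id: int
    queue_length: int        # vehicles currently queued
    lane_type: str           # "ETC" or "cash"

def lane_utility(lane, approach_lane, has_etc_tag,
                 w_queue=1.0, w_shift=0.4, w_type=5.0):
    """Higher utility = more attractive lane for this driver."""
    type_ok = (lane.lane_type == "ETC") == has_etc_tag
    return (-w_queue * lane.queue_length
            - w_shift * abs(lane.lane_id - approach_lane)
            + (w_type if type_ok else -w_type))

def choose_lane(lanes, approach_lane, has_etc_tag):
    return max(lanes, key=lambda l: lane_utility(l, approach_lane, has_etc_tag))

plaza = [TollLane(1, 6, "cash"), TollLane(2, 3, "cash"),
         TollLane(3, 4, "ETC"), TollLane(4, 1, "ETC")]
picked = choose_lane(plaza, approach_lane=2, has_etc_tag=True)
print("vehicle selects lane", picked.lane_id)
```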

Keywords: Microscopic simulation, toll plaza, surrogate safety, application programming interface.

117 Concept for Knowledge out of Sri Lankan Non-State Sector: Performances of Higher Educational Institutes and Successes of Its Sector

Authors: S. Jeyarajan

Abstract:

A concept of knowledge is derived from a study conducted on successful competition among Sri Lankan non-state higher educational institutes. The concept is drawn from knowledge management practices collected from Emerald Insight and similarly reputed literature, and from the non-state higher educational sector itself. A test is conducted to reveal the existence of these collected practices in Sri Lankan non-state higher education institutes and the reasons behind them. Further, the unavailability of such studies and uncertainty about the number of participants for data collection in the Sri Lankan context led to the selection of a qualitative research method, which used attributes of the Delphi method to manage such uncertainty. Data were collected using the dramaturgical method, which supports efficient use of the Delphi method. Grounded theory was selected as the data analysis technique and was conducted in intermixed discourses to manage the different perspectives in the data, which were collected systematically through perspective and modified snowball sampling techniques. Consequently, the analysis reveals agreement between the grounded theory results and the findings of a foreign study, even though the present study was conducted as qualitative research and the foreign study as quantitative research; as such, the present study widens the discoveries of the foreign study. Further, having uncovered the reasons behind the existence of these practices, the present results yield a concept of knowledge from the Sri Lankan non-state sector for managing higher educational institutes successfully.

Keywords: Adherence of snowball sampling into perspective sampling, Delphi method in qualitative method, grounded theory development in intermix discourses of analysis, knowledge management for success of higher educational institutes.
