Search results for: decision problem np-complete
10027 Bi-objective Network Optimization in Disaster Relief Logistics
Authors: Katharina Eberhardt, Florian Klaus Kaiser, Frank Schultmann
Abstract:
Last-mile distribution is one of the most critical parts of a disaster relief operation. Various uncertainties, such as infrastructure conditions, resource availability, and fluctuating beneficiary demand, render last-mile distribution challenging in disaster relief operations. The need to balance critical performance criteria like response time, meeting demand and cost-effectiveness further complicates the task. The occurrence of disasters cannot be controlled, and the magnitude is often challenging to assess. In summary, these uncertainties create a need for additional flexibility, agility, and preparedness in logistics operations. As a result, strategic planning and efficient network design are critical for an effective and efficient response. Furthermore, the increasing frequency of disasters and the rising cost of logistical operations amplify the need to provide robust and resilient solutions in this area. Therefore, we formulate a scenario-based bi-objective optimization model that integrates pre-positioning, allocation, and distribution of relief supplies extending the general form of a covering location problem. The proposed model aims to minimize underlying logistics costs while maximizing demand coverage. Using a set of disruption scenarios, the model allows decision-makers to identify optimal network solutions to address the risk of disruptions. We provide an empirical case study of the public authorities’ emergency food storage strategy in Germany to illustrate the potential applicability of the model and provide implications for decision-makers in a real-world setting. Also, we conduct a sensitivity analysis focusing on the impact of varying stockpile capacities, single-site outages, and limited transportation capacities on the objective value. The results show that the stockpiling strategy needs to be consistent with the optimal number of depots and inventory based on minimizing costs and maximizing demand satisfaction. 
The strategy has the potential for optimization, as network coverage is insufficient and relies on very high transportation and personnel capacity levels. As such, the model provides decision support for public authorities to determine an efficient stockpiling strategy and distribution network, and it offers recommendations for increased resilience. However, certain factors have yet to be considered in this study and should be addressed in future work, such as additional network constraints and heuristic algorithms.
Keywords: humanitarian logistics, bi-objective optimization, pre-positioning, last-mile distribution, decision support, disaster relief networks
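The cost-versus-coverage trade-off described above can be illustrated with a deliberately tiny sketch. The depot data below are invented for illustration and have nothing to do with the German case study; the abstract's model is a scenario-based covering location formulation, whereas this toy simply enumerates depot subsets and keeps the Pareto-optimal ones (no other subset is both cheaper and covers at least as much demand).

```python
from itertools import combinations

# Toy data (illustrative assumptions, not the paper's case study):
# candidate depot sites with opening costs, and the demand points
# each depot can cover.
depot_cost = {"A": 4.0, "B": 3.0, "C": 5.0}
covers = {"A": {1, 2}, "B": {2, 3}, "C": {1, 3, 4}}

def evaluate(depots):
    """Return (total cost, number of demand points covered)."""
    cost = sum(depot_cost[d] for d in depots)
    covered = set().union(*(covers[d] for d in depots)) if depots else set()
    return cost, len(covered)

# Enumerate all depot subsets, then keep the Pareto-optimal ones.
solutions = []
for r in range(len(depot_cost) + 1):
    for subset in combinations(depot_cost, r):
        solutions.append((subset, *evaluate(subset)))

pareto = [s for s in solutions
          if not any(o[1] <= s[1] and o[2] >= s[2]
                     and (o[1], o[2]) != (s[1], s[2])
                     for o in solutions)]
```

Each entry of `pareto` is one non-dominated network design; a decision-maker then picks along the front according to how much coverage is worth per unit of cost.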
Procedia PDF Downloads 80
10026 Prediction of Coronary Artery Stenosis Severity Based on Machine Learning Algorithms
Authors: Yu-Jia Jian, Emily Chia-Yu Su, Hui-Ling Hsu, Jian-Jhih Chen
Abstract:
The coronary artery is the major supplier of myocardial blood flow. When fat and cholesterol are deposited in the coronary arterial wall, narrowing and stenosis of the artery occur, which may lead to myocardial ischemia and eventually infarction. According to the World Health Organization (WHO), an estimated 7.4 million people died of coronary heart disease in 2015. According to statistics from the Ministry of Health and Welfare in Taiwan, heart disease (except for hypertensive diseases) ranked second among the top 10 causes of death from 2013 to 2016, and it still shows a growing trend. According to the American Heart Association (AHA), the risk factors for coronary heart disease include: age (> 65 years), sex (men to women with a 2:1 ratio), obesity, diabetes, hypertension, hyperlipidemia, smoking, family history, lack of exercise, and more. We have collected a dataset of 421 patients from a hospital located in northern Taiwan who received coronary computed tomography (CT) angiography. There were 300 males (71.26%) and 121 females (28.74%), with age ranging from 24 to 92 years and a mean age of 56.3 years. Prior to coronary CT angiography, basic data of the patients, including age, gender, body mass index (BMI), diastolic blood pressure, systolic blood pressure, diabetes, hypertension, hyperlipidemia, smoking, family history of coronary heart disease, and exercise habits, were collected and used as input variables. The output variable of the prediction module is the degree of coronary artery stenosis. In this study, the dataset was randomly divided into 80% as training set and 20% as test set. Four machine learning algorithms, including logistic regression, stepwise logistic regression, neural network, and decision tree, were incorporated to generate prediction results. We used area under the curve (AUC) / accuracy (Acc.)
to compare the four models. The best model was the neural network, followed by stepwise logistic regression, decision tree, and logistic regression, with AUC / accuracy of 0.68 / 79%, 0.68 / 74%, 0.65 / 78%, and 0.65 / 74%, respectively. The sensitivity of the neural network was 27.3% and its specificity 90.8%; stepwise logistic regression had a sensitivity of 18.2% and specificity of 92.3%; the decision tree had a sensitivity of 13.6% and specificity of 100%; logistic regression had a sensitivity of 27.3% and specificity of 89.2%. Based on these results, we hope to improve accuracy in the future by tuning the model parameters or using other methods, and to address the low sensitivity by adjusting the imbalanced proportion of positive and negative data.
Keywords: decision support, computed tomography, coronary artery, machine learning
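The modelling protocol described above (random 80/20 split, models compared by AUC and accuracy) can be sketched with scikit-learn. The synthetic dataset, feature count, and hyperparameters below are placeholders, not the study's 421-patient clinical data.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score, accuracy_score

# Synthetic stand-in for the clinical data: 421 "patients", 11 risk-factor
# inputs (age, BMI, blood pressure, ...), binary "significant stenosis" label.
X, y = make_classification(n_samples=421, n_features=11, n_informative=5,
                           random_state=0)

# Same protocol as the abstract: random 80/20 train/test split.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "decision tree": DecisionTreeClassifier(max_depth=4, random_state=0),
}

results = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    prob = model.predict_proba(X_te)[:, 1]       # score for the positive class
    results[name] = (roc_auc_score(y_te, prob),
                     accuracy_score(y_te, model.predict(X_te)))
```

Reporting AUC alongside accuracy, as the abstract does, matters here precisely because an imbalanced label can make raw accuracy look good while sensitivity stays low.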
Procedia PDF Downloads 229
10025 Exploring Alignability Effects and the Role of Information Structure in Promoting Uptake of Energy Efficient Technologies
Authors: Rebecca Hafner, David Elmes, Daniel Read
Abstract:
The current research applies decision-making theory to the problem of increasing uptake of energy-efficient technologies in the marketplace, where uptake is currently slower than one might predict following rational choice models. We apply the alignable/non-alignable features effect and explore the impact of varying information structure on consumers’ preference for standard versus energy-efficient technologies. In two studies we present participants with a choice between similar (boiler vs. boiler) vs. dissimilar (boiler vs. heat pump) technologies, described by a list of alignable and non-alignable attributes. In Study One there is a preference for alignability when options are similar, an effect mediated by an increased tendency to infer that missing information is the same. No effects of alignability on preference are found when options differ. One explanation for this split-shift in attentional focus is a change in construal levels, potentially induced by the added consideration of environmental concern. Study Two was designed to explore the interplay between alignability and construal level in greater detail. We manipulated construal level via a thought prime task prior to the same heating-systems choice task, and find a general preference for non-alignability, regardless of option type. We draw theoretical and applied implications for the type of information structure best suited to the promotion of energy-efficient technologies.
Keywords: alignability effects, decision making, energy-efficient technologies, sustainable behaviour change
Procedia PDF Downloads 314
10024 The Impact of Structural Empowerment on Risk Management Practices: A Case Study of Saudi Arabia Construction Small and Medium-Sized Enterprises
Authors: S. Alyami, S. Mohammad
Abstract:
Risk management practices have a significant impact on construction SMEs. The effective utilisation of these practices depends on culture change in order to optimise decision making for critical activities within construction projects. Thus, successful implementation of empowerment strategies would enable operational employees to participate in effective decision making. However, there remain many barriers to individuals and organisations within empowerment strategies that require empirical investigation before the industry can benefit from their implementation. Gaps in understanding the relationship between employee empowerment and risk management practices still exist. This research paper aims to examine the impact of structural empowerment on risk management practices in construction SMEs. A questionnaire was distributed to 162 participants, comprising project and civil engineers, within a case study of Saudi construction SMEs. Partial least squares based structural equation modeling (PLS-SEM) was utilised to perform the analysis. The results reveal a positive relationship between empowerment and risk management practices. The study shows how structural empowerment contributes to operational employees' participation in risk management practices through activities such as decision making, self-efficacy, and autonomy. The findings of this study will contribute to closing the current gaps in the construction SMEs context.
Keywords: construction SMEs, culture, decision making, empowerment, risk management
Procedia PDF Downloads 119
10023 A Value-Oriented Metamodel for Small and Medium Enterprises’ Decision Making
Authors: Romain Ben Taleb, Aurélie Montarnal, Matthieu Lauras, Mathieu Dahan, Romain Miclo
Abstract:
To be competitive and sustainable, any company has to maximize its value. However, unlike listed companies that can assess their values based on market shares, most Small and Medium Enterprises (SMEs), which are non-listed, cannot have direct and live access to this critical information. Traditional accounting reports give SME decision-makers only limited insight into the real impact of their day-to-day decisions on the company’s performance and value. Most of the time, an SME’s financial valuation is made once a year, as the associated process is time- and resource-consuming, requiring several months and external expertise to complete. To solve this issue, we propose in this paper a value-oriented metamodel that enables real-time and dynamic assessment of an SME’s value based on a broad definition of its assets. These assets cover a wider scope of the company's resources and better account for immaterial assets. The proposal, which is illustrated in a case study, discusses the benefits of incorporating assets in SME valuation.
Keywords: SME, metamodel, decision support system, financial valuation, assets
Procedia PDF Downloads 93
10022 An Algorithm for the Map Labeling Problem with Two Kinds of Priorities
Authors: Noboru Abe, Yoshinori Amai, Toshinori Nakatake, Sumio Masuda, Kazuaki Yamaguchi
Abstract:
We consider the problem of placing labels of points on a plane. For each point, its position, the size of its label, and a priority are given. Moreover, several candidates for its label position are prespecified, and each such label position is assigned a priority. The objective of our problem is to maximize the total sum of priorities of placed labels and their points. By refining a labeling algorithm so that it can use these priorities, we propose a new heuristic algorithm that is better suited to handling the assigned priorities.
Keywords: map labeling, greedy algorithm, heuristic algorithm, priority
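A minimal sketch of the greedy idea, under assumed data structures (the paper's exact algorithm is not given in the abstract): each point has a priority, each candidate label rectangle has its own priority, and candidates are placed in order of combined score while skipping overlaps and allowing at most one label per point.

```python
def overlaps(a, b):
    """Axis-aligned rectangle intersection test; rects are (x1, y1, x2, y2)."""
    return not (a[2] <= b[0] or b[2] <= a[0] or a[3] <= b[1] or b[3] <= a[1])

def greedy_labeling(points):
    """points: list of (point_priority, [(label_priority, rect), ...])."""
    candidates = [(pp + lp, i, rect)                 # combined score
                  for i, (pp, labels) in enumerate(points)
                  for lp, rect in labels]
    candidates.sort(reverse=True)                    # best score first
    placed, used_points, total = [], set(), 0
    for score, i, rect in candidates:
        if i in used_points:                         # one label per point
            continue
        if any(overlaps(rect, r) for r in placed):   # no overlapping labels
            continue
        placed.append(rect)
        used_points.add(i)
        total += score
    return total, placed

# Two points whose highest-priority labels overlap: the greedy pass places
# the best label first, then falls back to a lower-priority candidate
# position for the second point.
points = [
    (5, [(3, (0, 0, 2, 1)), (1, (0, 1, 2, 2))]),
    (4, [(3, (1, 0, 3, 1)), (2, (1, 1, 3, 2))]),
]
total, placed = greedy_labeling(points)
```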
Procedia PDF Downloads 433
10021 Applying Neural Networks for Solving Record Linkage Problem via Fuzzy Description Logics
Authors: Mikheil Kalmakhelidze
Abstract:
The record linkage (RL) problem has become more and more important in recent years due to the growing interest in big data analysis. The problem can be formulated in a very simple way: given two entries a and b of a database, decide whether they represent the same object or not. There are two classical ways of solving the RL problem: deterministic and probabilistic. Using a simple Bayes classifier in many cases produces useful results, but sometimes they prove to be poor. In recent years, several successful approaches have been made towards solving specific RL problems with neural network algorithms, including the single-layer perceptron, the multilayer backpropagation network, etc. In our work, we model the RL problem for a specific dataset of student applications in fuzzy description logic (FDL), where linkage of a specific pair (a, b) depends on the truth value of the corresponding formula A(a, b) in a canonical FDL model. As a main result, we build a neural network for deciding the truth value of FDL formulas in a canonical model and thus link the RL problem to machine learning. We apply the approach to a dataset with 10000 entries and also compare it to classical RL solving approaches. The results prove to be more accurate than the standard probabilistic approach.
Keywords: description logic, fuzzy logic, neural networks, record linkage
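The abstract links record matching to simple neural models. As an illustrative sketch only (the hypothetical applicant records and the single-layer perceptron below are stand-ins, not the paper's FDL-based network), one can turn a record pair into similarity features and train a perceptron to decide match versus non-match:

```python
import difflib

def pair_features(a, b):
    """Similarity features for two records of the form (name, birth_year)."""
    name_sim = difflib.SequenceMatcher(None, a[0].lower(), b[0].lower()).ratio()
    year_match = 1.0 if a[1] == b[1] else 0.0
    return [name_sim, year_match, 1.0]           # constant bias feature

def train_perceptron(feature_rows, labels, epochs=50, lr=0.1):
    """Classic perceptron rule: nudge weights on each misclassified pair."""
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for x, y in zip(feature_rows, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0
            for j in range(len(w)):
                w[j] += lr * (y - pred) * x[j]
    return w

def is_match(w, a, b):
    x = pair_features(a, b)
    return sum(wi * xi for wi, xi in zip(w, x)) > 0

# Hypothetical training pairs: 1 = same applicant, 0 = different.
train = [(("John Smith", 1990), ("Jon Smith", 1990), 1),
         (("John Smith", 1990), ("Jane Doe", 1985), 0),
         (("Maria Garcia", 1988), ("Maria Garcia", 1988), 1),
         (("Maria Garcia", 1988), ("Mario Rossi", 1991), 0)]
w = train_perceptron([pair_features(a, b) for a, b, _ in train],
                     [lab for _, _, lab in train])
```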
Procedia PDF Downloads 274
10020 Loudspeaker Parameters Inverse Problem for Improving Sound Frequency Response Simulation
Authors: Y. T. Tsai, Jin H. Huang
Abstract:
The sound pressure level (SPL) of a moving-coil loudspeaker (MCL) is often simulated and analyzed using the lumped parameter model. However, the SPL of an MCL cannot be simulated precisely in the high-frequency region, because the value of the cone effective area changes with the geometry variation in different mode shapes, which in turn affects the acoustic radiation mass and resistance. Herein, the paper presents an inverse method that can measure the value of the cone effective area at various frequency points and estimate the MCL electroacoustic parameters simultaneously. The proposed inverse method comprises the direct problem, adjoint problem, and sensitivity problem in collaboration with the nonlinear conjugate gradient method. Estimated values from the inverse method are validated experimentally by comparison with the measured SPL curve. The results presented in this paper not only improve the accuracy of the lumped parameter model but also provide valuable information for loudspeaker cone design.
Keywords: inverse problem, cone effective area, loudspeaker, nonlinear conjugate gradient method
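A toy illustration of fitting model parameters by nonlinear conjugate gradient, the optimizer named in the abstract. The two-parameter response curve below is an invented stand-in for the lumped-parameter loudspeaker model, the "measured" data are synthesized from known parameters, and the gradient is taken numerically:

```python
def model(f, p):
    """Toy two-parameter frequency response (illustrative stand-in only)."""
    a, b = p
    return a / (1.0 + (b * f) ** 2)

FREQS = [0.1 * k for k in range(1, 30)]
TRUE_P = (2.0, 0.5)
MEASURED = [model(f, TRUE_P) for f in FREQS]      # synthetic "measured SPL"

def cost(p):
    """Sum-of-squares misfit between model and measurement."""
    return sum((model(f, p) - m) ** 2 for f, m in zip(FREQS, MEASURED))

def grad(p, h=1e-6):
    """Central-difference gradient of the cost."""
    g = []
    for j in range(len(p)):
        up, dn = list(p), list(p)
        up[j] += h
        dn[j] -= h
        g.append((cost(up) - cost(dn)) / (2.0 * h))
    return g

def ncg(p, iters=100):
    """Fletcher-Reeves nonlinear conjugate gradient with backtracking."""
    g = grad(p)
    d = [-gi for gi in g]
    for _ in range(iters):
        c0, t = cost(p), 1.0
        while t > 1e-12:                          # backtracking line search
            cand = [pi + t * di for pi, di in zip(p, d)]
            if cost(cand) < c0:
                p = cand
                break
            t *= 0.5
        g_new = grad(p)
        beta = sum(x * x for x in g_new) / max(sum(x * x for x in g), 1e-30)
        d = [-gn + beta * di for gn, di in zip(g_new, d)]   # conjugate direction
        g = g_new
    return p

fit = ncg([1.0, 1.0])                             # recover parameters from data
```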
Procedia PDF Downloads 303
10019 Multidirectional Product Support System for Decision Making in Textile Industry Using Collaborative Filtering Methods
Authors: A. Senthil Kumar, V. Murali Bhaskaran
Abstract:
In the information technology domain, people use various tools and software for official and personal purposes. Nowadays, people struggle to choose data accessing and extraction tools when buying and selling their products, and they also worry about quality factors such as price, durability, color, size, and availability of the product. The main purpose of this research study is to find solutions to these unsolved existing problems. The proposed algorithm is a Multidirectional Rank Prediction (MDRP) decision-making algorithm designed to support effective strategic decisions at all levels of data extraction; it uses a real-time textile dataset and analyzes the results. Finally, the results are obtained and compared with existing measurement methods such as PCC, SLCF, and VSS. The resulting accuracy is higher than that of the existing rank prediction methods.
Keywords: Knowledge Discovery in Databases (KDD), Multidirectional Rank Prediction (MDRP), Pearson’s Correlation Coefficient (PCC), Vector Space Similarity (VSS)
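Since PCC is one of the baselines named above, a minimal collaborative-filtering sketch using Pearson's correlation may help. The rating matrix is hypothetical, the prediction rule is the standard mean-centred, similarity-weighted formula rather than the MDRP algorithm itself, and for brevity the similarity is computed over full vectors (including the target item, a simplification):

```python
from math import sqrt

def pearson(u, v):
    """Pearson correlation between two co-rated item vectors."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    den = (sqrt(sum((a - mu) ** 2 for a in u))
           * sqrt(sum((b - mv) ** 2 for b in v)))
    return num / den if den else 0.0

# Hypothetical rating matrix (rows: users, columns: textile products).
ratings = {
    "u1": [5, 4, 1, 1],
    "u2": [4, 5, 2, 1],
    "u3": [1, 2, 5, 4],
}

def predict(user, item, ratings):
    """Mean-centred, similarity-weighted prediction of a user's rating."""
    base = sum(ratings[user]) / len(ratings[user])
    num = den = 0.0
    for other, r in ratings.items():
        if other == user:
            continue
        w = pearson(ratings[user], r)
        num += w * (r[item] - sum(r) / len(r))    # neighbour's centred rating
        den += abs(w)
    return base + num / den if den else base
```

Note how a strongly negative correlation (a user with opposite taste) still contributes usefully: its disagreement is flipped by the negative weight.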
Procedia PDF Downloads 288
10018 Consideration of Uncertainty in Engineering
Authors: A. Mohammadi, M. Moghimi, S. Mohammadi
Abstract:
Engineers need computational methods that provide solutions less sensitive to environmental effects, so techniques should be used that take uncertainty into account to control and minimize the risk associated with design and operation. In order to consider uncertainty in an engineering problem, the optimization problem should be solved for a suitable range of each uncertain input variable instead of just one estimated point. With deterministic optimization, a large computational burden is required to consider every possible and probable combination of uncertain input variables. Several methods have been reported in the literature to deal with problems under uncertainty. In this paper, different methods are presented and analyzed.
Keywords: uncertainty, Monte Carlo simulation, stochastic programming, scenario method
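One of the methods named in the keywords, Monte Carlo simulation, can be sketched briefly: instead of a single point estimate, each uncertain input is sampled from a distribution and propagated through the engineering model, yielding a distribution of the response rather than one number. The beam-stress response and the input distributions below are illustrative assumptions:

```python
import random
import statistics

def beam_stress(moment, width, height):
    """Toy engineering response: max bending stress of a rectangular
    section, sigma = 6*M / (b*h^2)."""
    return 6.0 * moment / (width * height ** 2)

random.seed(42)

# Uncertain inputs modelled as distributions instead of single estimates.
samples = []
for _ in range(10_000):
    moment = random.gauss(10.0, 1.0)        # kN*m, mean and std
    width = random.gauss(0.30, 0.01)        # m
    height = random.gauss(0.50, 0.02)       # m
    samples.append(beam_stress(moment, width, height))

mean = statistics.mean(samples)             # compare with the deterministic
std = statistics.stdev(samples)             # point value 6*10/(0.3*0.5^2) = 800
p95 = sorted(samples)[int(0.95 * len(samples))]   # a risk-oriented quantile
```

The 95th percentile, not the mean, is what a risk-aware design check would compare against an allowable stress.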
Procedia PDF Downloads 416
10017 Clarification of the Essential of Life Cycle Cost upon Decision-Making Process: An Empirical Study in Building Projects
Authors: Ayedh Alqahtani, Andrew Whyte
Abstract:
Life Cycle Cost (LCC) is one of the goals and key pillars of construction management science because it comprises many of the functions and processes that assist organisations and agencies in achieving their goals. It has therefore become important to design and control assets during their whole life cycle, from the design and planning phase through to the disposal phase. LCC analysis (LCCA) aims to improve decision-making in asset ownership by taking into account all the cost elements attributable to the asset throughout its life. Current application of the LCC approach is impractical owing to misunderstanding of its advantages. The main objective of this research is to show the relationship between capital cost and long-term running costs. One hundred and thirty-eight actual building projects in the United Kingdom (UK) were used in order to achieve and measure the above-mentioned objective of the study. The results show that LCC is one of the most significant tools and should be considered in the decision-making process.
Keywords: building projects, capital cost, life cycle cost, maintenance costs, operation costs
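The capital-versus-running-cost relationship discussed above is usually made concrete with a present-value calculation. The figures below are invented, not drawn from the 138 UK projects; the sketch shows how a higher capital cost can still yield the lower life cycle cost:

```python
def life_cycle_cost(capital, annual_running, years, discount_rate):
    """Present value of owning an asset: capital cost up front plus
    discounted annual running (operation + maintenance) costs."""
    pv_running = sum(annual_running / (1 + discount_rate) ** t
                     for t in range(1, years + 1))
    return capital + pv_running

# Hypothetical comparison over a 30-year horizon at a 5% discount rate:
# a cheaper building with higher running costs versus a dearer one
# that is cheaper to operate.
option_a = life_cycle_cost(capital=1_000_000, annual_running=80_000,
                           years=30, discount_rate=0.05)
option_b = life_cycle_cost(capital=1_300_000, annual_running=50_000,
                           years=30, discount_rate=0.05)
```

Despite costing 300,000 more up front, option B is cheaper over the whole life cycle, which is exactly the kind of conclusion a capital-cost-only comparison would miss.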
Procedia PDF Downloads 546
10016 The Functional Magnetic Resonance Imaging and the Consumer Behaviour: Reviewing Recent Research
Authors: Mikel Alonso López
Abstract:
In the first decade of the twenty-first century, advanced imaging techniques began to be applied to neuroscience research. Functional Magnetic Resonance Imaging (fMRI) is one of the most important and most used research techniques for the investigation of emotions, because it makes it easy to observe the brain areas that oxygenate when performing certain tasks. In this research, we review the main studies, conducted using fMRI, on the influence of emotions in the decision-making process.
Keywords: decision making, emotions, fMRI, consumer behaviour
Procedia PDF Downloads 480
10015 ECO ROADS: A Solution to the Vehicular Pollution on Roads
Authors: Harshit Garg, Shakshi Gupta
Abstract:
One of the major problems in today’s world is growing pollution; the increasing pollution rate lies behind every environmental problem. Looking at the statistics, one can find that vehicular pollution accounts for more than 70% of total pollution, affecting the environment as well as human health proportionally. Since vehicles run on roads, why not have roads that could adsorb that pollution, not only once but a number of times? Every problem has a solution, and this one can be addressed with state-of-the-art technology: innovative ideas and thinking can make technology a solution to the problem of vehicular pollution on roads. Solving the problem up to a certain limit/percentage can be formulated into a new term called ECO ROADS.
Keywords: environment, pollution, roads, sustainability
Procedia PDF Downloads 558
10014 Implications of Meteorological Parameters in Decision Making for Public Protective Actions during a Nuclear Emergency
Authors: M. Hussain, K. Mahboob, S. Z. Ilyas, S. Shaheen
Abstract:
Plume dispersion modeling is a computational procedure to establish a relationship between emissions, meteorology, atmospheric concentrations, deposition, and other factors. The emission characteristics (stack height, stack diameter, release velocity, heat content, chemical and physical properties of the gases/particles released, etc.), terrain (surface roughness, local topography, nearby buildings), and meteorology (wind speed, stability, mixing height, etc.) are required for modeling the plume dispersion and estimating ground and air concentrations. During the early phase of the Fukushima accident, plume dispersion modeling was performed and decisions were taken on the implementation of protective measures. Differences in the estimated results and in the decisions made by different countries on protective actions created concern in the local and international community regarding the exact identification of the safe zone. The current study highlights the importance of accurate weather data availability, a scientific approach to decision-making on urgent protective actions, and a compatible, harmonized approach to plume dispersion modeling during a nuclear emergency. As a case study, the influence of meteorological data on plume dispersion modeling and the decision-making process has been examined.
Keywords: decision making process, radiation doses, nuclear emergency, meteorological implications
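The dependence on meteorology described above is visible even in the simplest dispersion model, the textbook Gaussian plume equation with ground reflection, sketched here for illustration. In practice the dispersion parameters sigma_y and sigma_z would be derived from downwind distance and atmospheric stability class rather than supplied directly:

```python
import math

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Gaussian plume concentration with ground reflection.
    q: emission rate (g/s), u: wind speed (m/s), y: crosswind offset (m),
    z: receptor height (m), h: effective release height (m),
    sigma_y/sigma_z: dispersion parameters (m) at the downwind distance."""
    lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = (math.exp(-(z - h) ** 2 / (2 * sigma_z ** 2))
                + math.exp(-(z + h) ** 2 / (2 * sigma_z ** 2)))   # reflection
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Ground-level centreline concentration for an illustrative release.
c = gaussian_plume(q=100.0, u=5.0, y=0.0, z=0.0, h=50.0,
                   sigma_y=80.0, sigma_z=40.0)
```

The inverse dependence on wind speed u is one reason weather data quality dominates the result: doubling the wind halves the predicted concentration, which can move the boundary of a protective-action zone considerably.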
Procedia PDF Downloads 183
10013 Application of the Critical Decision Method for Monitoring and Improving Safety in the Construction Industry
Authors: Juan Carlos Rubio Romero, Francisco Salguero Caparros, Virginia Herrera-Pérez
Abstract:
No one is in the slightest doubt about the high levels of risk involved in work in the construction industry. They are even higher in structural construction work. The Critical Decision Method (CDM) is a semi-structured interview technique that uses cognitive probes to identify the different disturbances that workers have to deal with in their work activity. At present, the vision of safety focused on daily performance and on the things that go well for safety and health management is facing the new paradigm known as Resilience Engineering. The aim of this study has been to describe the variability in formwork labour on concrete structures in the construction industry and, from there, to find out workers' resilient attitude to unexpected events that they have experienced during their working lives. For this purpose, a series of semi-structured interviews were carried out, applying the Critical Decision Method, with construction employees in Spain with extensive experience in formwork labour. This work has been the first application of the Critical Decision Method in the field of construction and, more specifically, in the execution of structures. The results obtained show that situations categorised as unthought-of are identified to a greater extent than potentially unexpected situations. The identification during these interviews of both expected and unexpected events provides insight into the critical decisions made and actions taken to improve resilience in daily practice in this construction work. From this study, it is clear that it is essential to gain more knowledge about the nature of the human cognitive process in work situations within complex socio-technical systems such as construction sites. This could lead to more effective design of workplaces in the search for improved human performance.
Keywords: resilience engineering, construction industry, unthought-of situations, critical decision method
Procedia PDF Downloads 148
10012 Carbon Skimming: Towards an Application to Summarise and Compare Embodied Carbon to Aid Early-Stage Decision Making
Authors: Rivindu Nethmin Bandara Menik Hitihamy Mudiyanselage, Matthias Hank Haeusler, Ben Doherty
Abstract:
Investors and clients in the Architecture, Engineering and Construction industry find it difficult to understand complex datasets and reports with little to no graphic representation. The stakeholders examined in this paper include designers, design clients, and end-users. Communicating embodied carbon information graphically and concisely can aid decision support early in a building's life cycle. It is essential to create a common visualisation approach, as the level of knowledge about embodied carbon varies between stakeholders. The tool, designed in conjunction with Bates Smart, condenses Tally Life Cycle Assessment data into a carbon hot-spotting visualisation, highlighting the sections with the highest amounts of embodied carbon. This allows stakeholders at every stage of a given project to better understand the carbon implications with minimal effort. It further allows stakeholders to differentiate building elements by their carbon values, which enables evaluation of the cost-effectiveness of the selected materials at an early stage. To examine and build a decision-support tool, an action-design research methodology of iterative cycles was used, along with precedents of embodied carbon visualisation tools. Accordingly, the importance of visualisation and Building Information Modelling is also explored to understand the best format for relaying these results.
Keywords: embodied carbon, visualisation, summarisation, data filtering, early-stage decision-making, materiality
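A minimal sketch of the hot-spotting idea: condense per-element data to the few contributors that dominate the total. The element names and carbon totals below are invented, and the actual tool works on Tally LCA exports rather than a hand-written dictionary:

```python
# Hypothetical per-element embodied-carbon totals (kgCO2e), as might be
# exported from an LCA tool, condensed to a ranked hotspot summary.
elements = {
    "structural concrete": 410_000,
    "rebar": 150_000,
    "facade glazing": 95_000,
    "interior finishes": 40_000,
    "roofing": 25_000,
}

def hotspots(elements, share=0.8):
    """Smallest set of elements accounting for `share` of the total
    embodied carbon, ranked from the largest contributor down."""
    total = sum(elements.values())
    ranked = sorted(elements.items(), key=lambda kv: kv[1], reverse=True)
    picked, running = [], 0
    for name, carbon in ranked:
        picked.append((name, carbon, carbon / total))   # element, kgCO2e, share
        running += carbon
        if running / total >= share:
            break
    return picked

top = hotspots(elements)
```

A visualisation layer would then colour-code exactly this ranked subset, letting a client see at a glance where material substitutions buy the most carbon reduction.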
Procedia PDF Downloads 83
10011 Instant Location Detection of Objects Moving at High Speed in C-OTDR Monitoring Systems
Authors: Andrey V. Timofeev
Abstract:
A practical, efficient approach is suggested for estimating the instant bounds of high-speed objects in C-OTDR monitoring systems. In the case of super-dynamic objects (trains, cars), it is difficult to obtain an adequate estimate of the instantaneous object localization because of estimation lag: reliable estimation of a monitored object's coordinates requires time for the C-OTDR system to collect observations, and only once the required sample volume has been collected can the final decision be issued. But this is contrary to the requirements of many real applications. For example, in rail traffic management systems we need localization data for dynamic objects in real time. The way to solve this problem is to use a set of statistically independent parameters of C-OTDR signals to obtain the most reliable solution in real time. Parameters of this type can be called 'signaling parameters' (SPs). Several SPs carry information about the instant localization of dynamic objects for each C-OTDR channel. The problem is that some of these parameters are very sensitive to the dynamics of seismoacoustic emission sources but are non-stable, while, as a rule, an SP that is very stable is insensitive. This report describes a method for co-processing SPs designed to obtain the most effective dynamic object localization estimates within the C-OTDR monitoring system framework.
Keywords: C-OTDR-system, co-processing of signaling parameters, high-speed objects localization, multichannel monitoring systems
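The abstract does not specify the co-processing rule, but a standard way to combine several statistically independent, noisy estimates (one per channel or per signaling parameter) is inverse-variance weighting, sketched here purely for illustration; the channel values are invented:

```python
def fuse(estimates):
    """Inverse-variance fusion of independent estimates.
    estimates: list of (position_estimate, variance) pairs.
    Returns the fused estimate and its (smaller) variance."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    position = sum(w * est for (est, _), w in zip(estimates, weights)) / total
    return position, 1.0 / total

# Three hypothetical channels reporting the object position (m) with
# differing reliability: the fused variance beats the best single channel.
channels = [(102.0, 4.0), (98.0, 1.0), (105.0, 9.0)]
pos, var = fuse(channels)
```

This illustrates the trade-off the abstract describes: an unstable but sensitive parameter enters with a large variance and is automatically down-weighted, while it still contributes to reducing the overall uncertainty.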
Procedia PDF Downloads 471
10010 Preventing Corruption in Dubai: Governance, Contemporary Strategies and Systemic Flaws
Authors: Graham Brooks, Belaisha Bin Belaisha, Hakkyong Kim
Abstract:
The problem of preventing and/or reducing corruption is a major international one. This paper, however, focuses specifically on how organisations in Dubai are tackling the problem of money laundering. This research establishes that Dubai has a clear international anti-money-laundering framework but suffers from national weaknesses such as diverse anti-money-laundering working practices, a lack of communication and information sharing, and disparate organisational vested self-interest.
Keywords: corruption, governance, money laundering, prevention, strategies
Procedia PDF Downloads 274
10009 Heart Attack Prediction Using Several Machine Learning Methods
Authors: Suzan Anwar, Utkarsh Goyal
Abstract:
Heart rate (HR) is a predictor of cardiovascular, cerebrovascular, and all-cause mortality in the general population, as well as in patients with cardiovascular and cerebrovascular diseases. Machine learning (ML) significantly improves the accuracy of cardiovascular risk prediction, increasing the number of patients identified who could benefit from preventive treatment while avoiding unnecessary treatment of others. This research examines the relationship between an individual's various heart health inputs, such as age, sex, cp, trestbps, thalach, oldpeak, etc., and the likelihood of developing heart disease. Machine learning techniques such as logistic regression and decision trees are used, implemented in Python. The results of testing and evaluating the model on the Heart Failure Prediction Dataset give the chance of a person having heart disease with varying accuracy. Logistic regression yielded an accuracy of 80.48% without data handling. With data handling (normalization, StandardScaler), logistic regression achieved an improved accuracy of 87.80%, decision tree 100%, random forest 100%, and SVM 100%.
Keywords: heart rate, machine learning, SVM, decision tree, logistic regression, random forest
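The effect of data handling mentioned above (normalization via StandardScaler before logistic regression) can be sketched with scikit-learn. The synthetic data and resulting accuracies here are placeholders for the heart dataset, and scaling helps often but not always:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for the heart data (age, sex, cp, trestbps, ...),
# with one column rescaled to mimic raw clinical units.
X, y = make_classification(n_samples=500, n_features=8, random_state=1)
X[:, 0] *= 100.0        # e.g. a blood-pressure-like column on a large scale

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=1)

# Without data handling: fit directly on raw features.
raw = LogisticRegression(max_iter=200).fit(X_tr, y_tr)

# With data handling: standardize inside a pipeline so the scaler is
# fit on training data only, avoiding test-set leakage.
scaled = make_pipeline(StandardScaler(),
                       LogisticRegression(max_iter=200)).fit(X_tr, y_tr)

acc_raw = raw.score(X_te, y_te)
acc_scaled = scaled.score(X_te, y_te)
```

Putting the scaler inside the pipeline is the important habit: fitting it on the full dataset before splitting would leak test-set statistics into training.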
Procedia PDF Downloads 138
10008 Solving Optimal Control of Semilinear Elliptic Variational Inequalities Obstacle Problems using Smoothing Functions
Authors: El Hassene Osmani, Mounir Haddou, Naceurdine Bensalem
Abstract:
In this paper, we investigate optimal control problems governed by semilinear elliptic variational inequalities involving constraints on the state and, more precisely, the obstacle problem. We present a relaxed formulation of the problem using smoothing functions. Since we adopt a numerical point of view, we first relax the feasible domain of the problem; then, using both mathematical programming methods and penalization methods, we obtain optimality conditions with smooth Lagrange multipliers. Some numerical experiments using the IPOPT algorithm (Interior Point Optimizer) are presented to verify the efficiency of our approach.
Keywords: complementarity problem, IPOPT, Lagrange multipliers, mathematical programming, optimal control, smoothing methods, variational inequalities
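Smoothing functions of the kind mentioned above replace a nonsmooth complementarity condition with a differentiable approximation. One common choice from the complementarity literature (shown for illustration; the paper's specific smoothing functions are not given in the abstract) is the smoothed Fischer-Burmeister function:

```python
import math

def fb_smooth(a, b, mu):
    """Smoothed Fischer-Burmeister function.  For mu = 0 its zeros are
    exactly the pairs with a >= 0, b >= 0 and a*b = 0 (complementarity);
    for mu > 0 it is differentiable everywhere, so smooth optimality
    conditions with Lagrange multipliers can be written down."""
    return math.sqrt(a * a + b * b + 2.0 * mu) - a - b

# As mu -> 0 the smooth surrogate approaches the nonsmooth condition:
# here (a, b) = (2, 0) satisfies complementarity, and the residual of
# the smoothed equation shrinks with mu.
residuals = [fb_smooth(2.0, 0.0, mu) for mu in (1.0, 1e-2, 1e-4, 1e-6)]
```

A solver such as IPOPT is then applied to the smoothed system for a decreasing sequence of mu, following the path of solutions toward the original obstacle problem.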
Procedia PDF Downloads 174
10007 Informed Urban Design: Minimizing Urban Heat Island Intensity via Stochastic Optimization
Authors: Luis Guilherme Resende Santos, Ido Nevat, Leslie Norford
Abstract:
The Urban Heat Island (UHI) is characterized by increased air temperatures in urban areas compared to the undeveloped rural surrounding environment. With urbanization and densification, the intensity of the UHI increases, bringing negative impacts on livability, health, and the economy. In order to reduce those effects, design factors must be taken into consideration when planning future developments. Given design constraints such as population size and the availability of area for development, non-trivial decisions regarding the buildings’ dimensions and their spatial distribution are required. We develop a framework for the optimization of urban design that jointly minimizes UHI intensity and buildings’ energy consumption. First, the design constraints are defined according to spatial and population limits in order to establish realistic boundaries that would be applicable to real-life decisions. Second, the tools Urban Weather Generator (UWG) and EnergyPlus are used to generate outputs of UHI intensity and total buildings’ energy consumption, respectively. Those outputs vary with a set of inputs related to urban morphology, such as building height, urban canyon width, and population density. Lastly, an optimization problem is cast in which the utility function quantifies the performance of each design candidate (e.g., minimizing a linear combination of UHI and energy consumption) and a set of constraints is imposed. Solving this optimization problem is difficult, since there is no simple analytic form that represents the UWG and EnergyPlus models. We therefore cannot use any direct optimization techniques but instead develop an indirect "black box" optimization algorithm. To this end we develop a solution based on a stochastic optimization method known as the Cross Entropy method (CEM).
The CEM translates the deterministic optimization problem into an associated stochastic optimization problem which is simple to solve analytically. We illustrate our model on a typical residential area in Singapore. Due to fast growth in population and built area, and to land availability generated by land reclamation, urban planning decisions are of the utmost importance for the country. Furthermore, the hot and humid climate in the country raises concern for the impact of the UHI. The problem presented is highly relevant to early urban design stages, and the objective of such a framework is to guide decision-makers and assist them in including and evaluating urban microclimate and energy aspects in the process of urban planning.
Keywords: building energy consumption, stochastic optimization, urban design, urban heat island, urban weather generator
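The abstract does not spell out the CEM update rules, but the core idea it names — repeatedly sampling candidate designs, keeping an elite subset, and refitting the sampling distribution to that subset — can be sketched as follows. This is a minimal illustration, with a toy quadratic standing in for the UWG/EnergyPlus black-box utility; all parameter values are assumptions, not the authors' settings.

```python
import numpy as np

def cross_entropy_minimize(f, mu, sigma, n_samples=100, n_elite=10, n_iter=50):
    """Minimize a black-box f by iteratively refitting a Gaussian to the elite samples."""
    mu, sigma = np.array(mu, float), np.array(sigma, float)
    for _ in range(n_iter):
        # Sample candidate designs from the current distribution
        x = np.random.normal(mu, sigma, size=(n_samples, len(mu)))
        scores = np.array([f(xi) for xi in x])
        elite = x[np.argsort(scores)[:n_elite]]          # best-performing candidates
        # Refit the distribution to the elite set (the analytic CE update for a Gaussian)
        mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-8
    return mu

# Toy stand-in for the UWG/EnergyPlus utility function, with minimum at (3, 2)
np.random.seed(0)
best = cross_entropy_minimize(lambda v: (v[0] - 3) ** 2 + (v[1] - 2) ** 2,
                              mu=[0, 0], sigma=[5, 5])
```

For a Gaussian sampling distribution the cross-entropy update reduces to the sample mean and standard deviation of the elite set, which is what makes the associated stochastic problem "simple to solve analytically."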
Procedia PDF Downloads 133
10006 'Explainable Artificial Intelligence' and Reasons for Judicial Decisions: Why Justifications and Not Just Explanations May Be Required
Authors: Jacquelyn Burkell, Jane Bailey
Abstract:
Artificial intelligence (AI) solutions deployed within the justice system face the critical task of providing acceptable explanations for decisions or actions. These explanations must satisfy the joint criteria of public and professional accountability, taking into account the perspectives and requirements of multiple stakeholders, including judges, lawyers, parties, witnesses, and the general public. This research project analyzes and integrates existing bodies of literature in order to propose guidelines for explainable AI in the justice system. Specifically, we review three bodies of literature: (i) explanations of the purpose and function of 'explainable AI'; (ii) the relevant case law, judicial commentary and legal literature focused on the form and function of reasons for judicial decisions; and (iii) the literature focused on the psychological and sociological functions of these reasons for judicial decisions from the perspective of the public. Our research suggests that while judicial ‘reasons’ (arguably accurate descriptions of the decision-making process and factors) do serve explanatory functions similar to those identified in the literature on 'explainable AI', they also serve an important ‘justification’ function (post hoc constructions that justify the decision that was reached). Further, members of the public also look for both justification and explanation in reasons for judicial decisions, and the absence of either feature is likely to contribute to diminished public confidence in the legal system. Therefore, artificially automated judicial decision-making systems that simply attempt to document the process of decision-making are unlikely in many cases to be useful to and accepted within the justice system.
Instead, these systems should focus on the post-hoc articulation of principles and precedents that support the decision or action, especially in cases where legal subjects’ fundamental rights and liberties are at stake.
Keywords: explainable AI, judicial reasons, public accountability, explanation, justification
Procedia PDF Downloads 127
10005 Integrated Marketing Communication to Influencing International Standard Energy Economy Car Buying Decision of Consumers in Bangkok
Authors: Pisit Potjanajaruwit
Abstract:
The objective of this research was to study the influence of integrated marketing communication on the buying decisions of consumers in Bangkok. A total of 397 responses were collected from customers who drive in Bangkok. A questionnaire was utilized as a tool to collect data. Statistics utilized in this research included frequency, percentage, mean, standard deviation, and multiple regression analysis. Data were analyzed using the Statistical Package for the Social Sciences. The findings revealed that the majority of respondents were male, aged between 25 and 34 years old, held an undergraduate degree, and were married and living together. The average income of respondents was between 10,001 and 20,000 baht. In terms of occupation, the majority worked for private companies. The factors influencing the buying decisions of consumers in Bangkok included sales promotion, through low-interest installment plans and discounts; personal selling, through salespersons who introduced products and gave product information; public relations via websites; direct marketing through the annual motor show; and advertising through television media.
Keywords: Bangkok metropolis, ECO car, integrated marketing communication, international standard
Procedia PDF Downloads 317
10004 A Combined Meta-Heuristic with Hyper-Heuristic Approach to Single Machine Production Scheduling Problem
Authors: C. E. Nugraheni, L. Abednego
Abstract:
This paper is concerned with the minimization of mean tardiness and mean flow time in a real single-machine production scheduling problem. Two variants of genetic algorithms, used as meta-heuristics combined with a hyper-heuristic approach, are proposed to solve this problem. These methods are applied to instances generated from real-world data from a company. Encouraging results are reported.
Keywords: hyper-heuristics, evolutionary algorithms, production scheduling, meta-heuristic
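The two objectives named in the abstract are standard scheduling measures. A minimal sketch of how mean flow time and mean tardiness are computed for a candidate job sequence on a single machine (assuming, as is common, that all jobs are ready at time zero; the data here are invented for illustration):

```python
def schedule_metrics(sequence, processing, due):
    """Mean flow time and mean tardiness of a job sequence on one machine.
    sequence: job indices in processing order; all jobs assumed ready at time 0."""
    t = flow = tard = 0
    for j in sequence:
        t += processing[j]           # completion time of job j
        flow += t                    # flow time = completion - release (release = 0)
        tard += max(0, t - due[j])   # tardiness = lateness clipped at zero
    n = len(sequence)
    return flow / n, tard / n

# Three jobs scheduled in earliest-due-date order
mf, mt = schedule_metrics([0, 1, 2], processing=[2, 3, 4], due=[3, 5, 10])
```

A genetic algorithm for this problem would evaluate each chromosome (a permutation of jobs) with a function like this and evolve the population toward permutations with lower objective values.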
Procedia PDF Downloads 381
10003 Upon One Smoothing Problem in Project Management
Authors: Dimitri Golenko-Ginzburg
Abstract:
A CPM network project with deterministic activity durations, in which activities require homogeneous resources with fixed capacities, is considered. The problem is to determine the optimal schedule of starting times for all network activities within their maximal allowable limits (in order not to exceed the network's critical time) so as to minimize the maximum resources required by the project at any point in time. In the case when a non-critical activity may start only at discrete moments with a pre-given time span, the problem becomes NP-complete, and an optimal solution may be obtained via a look-over algorithm. For the case when a look-over requires too much computational time, an approximate algorithm is suggested. The algorithm's performance ratio, i.e., the relative accuracy error, is determined. Experimentation has been undertaken to verify the suggested algorithm.
Keywords: resource smoothing problem, CPM network, look-over algorithm, lexicographical order, approximate algorithm, accuracy estimate
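The objective being minimized — the peak resource requirement over time for a given choice of starting times — can be evaluated with a simple event sweep. A minimal sketch (the activity data are invented; this is the objective evaluation, not the look-over algorithm itself):

```python
def peak_resource(starts, durations, demands):
    """Maximum concurrent resource requirement of a schedule.
    Activity i occupies [starts[i], starts[i] + durations[i]) and
    needs demands[i] units of the homogeneous resource."""
    events = []
    for s, d, r in zip(starts, durations, demands):
        events.append((s, r))        # activity starts: demand rises
        events.append((s + d, -r))   # activity ends: demand falls
    level = peak = 0
    # Sorting puts end events before start events at the same instant
    for _, delta in sorted(events):
        level += delta
        peak = max(peak, level)
    return peak

# Shifting the second activity from t=0 to t=2 within its slack
p_clustered = peak_resource([0, 0], [2, 2], [3, 2])
p_shifted = peak_resource([0, 2], [2, 2], [3, 2])
```

A smoothing algorithm searches over the feasible start times of non-critical activities for the assignment that minimizes this peak.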
Procedia PDF Downloads 302
10002 Survey Paper on Graph Coloring Problem and Its Application
Authors: Prateek Chharia, Biswa Bhusan Ghosh
Abstract:
Graph coloring is one of the prominent concepts in graph theory. It can be defined as a coloring of the various regions of a graph such that all the constraints are fulfilled. In this paper, various graph coloring approaches, such as greedy coloring, heuristic search for a maximum independent set, and graph coloring using an edge table, are described. Graph coloring can be used in various real-time applications, such as student timetable generation, Sudoku as a graph coloring problem, and GSM phone networks.
Keywords: graph coloring, greedy coloring, heuristic search, edge table, sudoku as a graph coloring problem
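Of the approaches the survey names, greedy coloring is the simplest: visit vertices in some order and give each one the smallest color not used by an already-colored neighbor. A minimal sketch (the example graph is invented for illustration):

```python
def greedy_coloring(adj):
    """Greedy vertex coloring.
    adj: dict mapping each vertex to an iterable of its neighbors.
    Returns a dict vertex -> color (colors are 0, 1, 2, ...)."""
    color = {}
    for v in adj:                    # visit vertices in dictionary order
        used = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in used:             # smallest color unused by colored neighbors
            c += 1
        color[v] = c
    return color

# A triangle (0-1-2) plus a pendant vertex 3 attached to 2
g = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
colors = greedy_coloring(g)          # three colors suffice here
```

Greedy coloring is fast but order-sensitive: the number of colors it uses depends on the vertex ordering, which is why heuristic orderings (e.g. by decreasing degree) are studied alongside it.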
Procedia PDF Downloads 542
10001 A Financial Analysis of the Current State of IKEA: A Case Study
Authors: Isabela Vieira, Leonor Carvalho Garcez, Adalmiro Pereira, Tânia Teixeira
Abstract:
In the present work, we aim to analyze IKEA as a company by focusing on its development, financial analysis, and future benchmarks, as well as applying some of the knowledge learned in class, namely hedging and other financial risk mitigation solutions, to understand how IKEA navigates and protects itself from risk. The decision to choose IKEA for our case study has to do with the company's long history since the 1940s and its high degree of internationalization across 63 different markets. The company also publishes clear financial reports, which aided us in writing the present essay and naturally contributed to our decision.
Keywords: Ikea, financial risk, risk management, hedge
Procedia PDF Downloads 61
10000 A General Variable Neighborhood Search Algorithm to Minimize Makespan of the Distributed Permutation Flowshop Scheduling Problem
Authors: G. M. Komaki, S. Mobin, E. Teymourian, S. Sheikh
Abstract:
This paper addresses minimizing the makespan of the distributed permutation flowshop scheduling problem. In this problem, there are several parallel identical factories, or flowshops, each with a series of identical machines. Each job should be allocated to one of the factories, and all of the operations of the job should be performed in the allocated factory. This problem has recently gained attention, and due to its NP-hard nature, metaheuristic algorithms have been proposed to tackle it. The majority of the proposed algorithms require large computational time, which is their main drawback. In this study, a general variable neighborhood search (GVNS) algorithm is proposed in which several time-saving schemes have been incorporated. The GVNS also uses a sophisticated method to change the shaking procedure, or perturbation, depending on the progress of the incumbent solution, in order to prevent stagnation of the search. The performance of the proposed algorithm is compared to state-of-the-art algorithms on standard benchmark instances.
Keywords: distributed permutation flow shop, scheduling, makespan, general variable neighborhood search algorithm
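The objective the GVNS optimizes — the makespan of a distributed permutation flowshop — is the makespan of the slowest factory, where each factory's makespan follows the standard permutation-flowshop recursion. A minimal sketch of this evaluation (the processing times and job-to-factory assignment are invented for illustration):

```python
def flowshop_makespan(jobs, proc):
    """Permutation flowshop makespan of a job sequence.
    proc[j][k] = processing time of job j on machine k."""
    if not jobs:
        return 0
    m = len(proc[jobs[0]])
    done = [0] * m                   # completion time of the latest job on each machine
    for j in jobs:
        for k in range(m):
            # Job j starts on machine k after finishing on machine k-1
            # and after the previous job frees machine k
            done[k] = max(done[k], done[k - 1] if k else 0) + proc[j][k]
    return done[-1]

def distributed_makespan(assignment, proc):
    """System makespan = makespan of the slowest factory.
    assignment: one job sequence per factory."""
    return max(flowshop_makespan(seq, proc) for seq in assignment)

proc = [[3, 2], [2, 4], [4, 1]]      # 3 jobs, 2 machines per factory
cmax = distributed_makespan([[0, 2], [1]], proc)
```

A VNS-type algorithm would explore neighborhoods that both re-sequence jobs within a factory and move jobs between factories, using an evaluation like this for each candidate.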
Procedia PDF Downloads 354
9999 Simulation Model of Induction Heating in COMSOL Multiphysics
Authors: K. Djellabi, M. E. H. Latreche
Abstract:
The induction heating phenomenon depends on various factors, making the problem highly nonlinear. The mathematical analysis of this problem is in most cases very difficult and is reduced to simple cases. Further knowledge of induction heating systems is generated in production environments, but these trial-and-error procedures are long and expensive. Numerical models of the induction heating problem are another approach that reduces the abovementioned drawbacks. This paper deals with a simulation model of the induction heating problem, created in COMSOL Multiphysics. We present results of numerical simulations of the induction heating process for cylindrical workpieces in an inductor with four coils. The modeling of the induction heating process was performed with COMSOL Multiphysics version 4.2a, and we present temperature charts for the study.
Keywords: induction heating, electromagnetic field, inductor, numerical simulation, finite element
Procedia PDF Downloads 316
9998 Solutions of Fuzzy Transportation Problem Using Best Candidates Method and Different Ranking Techniques
Authors: M. S. Annie Christi
Abstract:
The Transportation Problem (TP) is based on the supply and demand of commodities transported from one source to different destinations. Usual methods for finding solutions to TPs are the North-West Corner Rule, the Least Cost Method, Vogel's Approximation Method, etc. Transportation costs tend to vary over time, so fuzzy numbers can be used to give solutions suited to this situation. In this study, the Best Candidate Method (BCM) is applied. For ranking, the Centroid Ranking Technique (CRT) and the Robust Ranking Technique have been adopted to transform the fuzzy TP, and the above methods are applied to the EDWARDS Vacuum Company, Crawley, West Sussex, United Kingdom. A comparative study is also given. We see that the transportation cost can be minimized by the application of CRT under BCM.
Keywords: best candidate method, centroid ranking technique, fuzzy transportation problem, robust ranking technique, transportation problem
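The abstract does not give the ranking formula it uses, but for triangular fuzzy numbers the centroid ranking score is commonly taken as the x-coordinate of the centroid of the membership triangle, (a + b + c) / 3. A minimal sketch of defuzzifying a fuzzy cost matrix this way, so that a crisp method such as BCM can be applied (the cost data are invented for illustration):

```python
def centroid_rank(tfn):
    """Centroid ranking score of a triangular fuzzy number (a, b, c):
    the x-coordinate of the centroid of its membership triangle."""
    a, b, c = tfn
    return (a + b + c) / 3.0

def defuzzify_costs(fuzzy_costs):
    """Replace each fuzzy transportation cost with its centroid score,
    yielding a crisp cost matrix for methods such as BCM."""
    return [[centroid_rank(cost) for cost in row] for row in fuzzy_costs]

fuzzy = [[(1, 2, 3), (2, 4, 6)],
         [(0, 3, 6), (1, 1, 1)]]
crisp = defuzzify_costs(fuzzy)
```

Once the costs are crisp, any of the classical TP methods mentioned above (or the BCM) can be run on the resulting matrix, and different ranking techniques yield different crisp matrices and thus potentially different solutions.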
Procedia PDF Downloads 295