Search results for: decision fusion
3748 Proposal of a Model Supporting Decision-Making on Information Security Risk Treatment
Authors: Ritsuko Kawasaki, Takeshi Hiromatsu
Abstract:
Management is required to understand all information security risks within an organization and to decide which risks should be treated, at what level, and at what cost. However, such decision-making is usually not easy, because various risk treatment measures must be selected at suitable application levels. In addition, some measures may have conflicting objectives, which further complicates the selection. Therefore, this paper provides a model that supports the selection of measures by applying multi-objective analysis to find an optimal solution. Additionally, a list of measures is provided to make the selection easier and more effective without any omission of measures.
Keywords: information security risk treatment, selection of risk measures, risk acceptance, multi-objective optimization
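A minimal sketch of the multi-objective selection idea described in this abstract: scalarize risk reduction against cost with weights and search the application levels under a budget. All measures, costs, scores, weights, and the budget below are invented for illustration; the paper's actual model and measure list are not reproduced here.

```python
# A toy weighted-sum scalarization of the cost vs. risk-reduction trade-off,
# assuming hypothetical measures with three application levels each.
from itertools import product

measures = {                      # measure: (cost, risk_reduction) per level 0..2
    "firewall":   [(0, 0.0), (10, 0.4), (25, 0.6)],
    "training":   [(0, 0.0), (5, 0.2), (12, 0.35)],
    "encryption": [(0, 0.0), (8, 0.3), (20, 0.5)],
}
budget, w_cost, w_risk = 30, 0.4, 0.6   # weights scalarize the two objectives

best, best_score = None, float("-inf")
for levels in product(range(3), repeat=len(measures)):
    picks = [tbl[l] for tbl, l in zip(measures.values(), levels)]
    cost = sum(c for c, _ in picks)
    reduction = sum(r for _, r in picks)
    if cost <= budget:
        score = w_risk * reduction - w_cost * cost / budget
        if score > best_score:
            best, best_score = levels, score

print(dict(zip(measures, best)), round(best_score, 3))
```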
Procedia PDF Downloads 379
3747 Studies of Rule Induction by STRIM from the Decision Table with Contaminated Attribute Values from Missing Data and Noise — in the Case of Critical Dataset Size —
Authors: Tetsuro Saeki, Yuichi Kato, Shoutarou Mizuno
Abstract:
STRIM (Statistical Test Rule Induction Method) has been proposed as a method to effectively induce if-then rules from a decision table, which is considered a sample set obtained from the population of interest. Its usefulness has been confirmed by simulation experiments specifying rules in advance, and by comparison with conventional methods. However, scope for future development remains before STRIM can be applied to the analysis of real-world datasets. The first requirement is to determine the size of the dataset needed for inducing true rules, since finding statistically significant rules is the core of the method. The second is to examine the capacity of rule induction from datasets with attribute values contaminated by missing data and noise, since real-world datasets usually contain such contaminated data. This paper examines the first problem theoretically, in connection with the rule length. The second problem is then examined in a simulation experiment, utilizing the critical dataset size derived from the first step. The experimental results show that STRIM is highly robust in the analysis of datasets with contaminated attribute values, and hence is applicable to real-world data.
Keywords: rule induction, decision table, missing data, noise
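Since finding statistically significant rules is the core of the method, a toy sketch of that test step may help: accept a candidate if-then rule only when the decision-class distribution under its condition deviates significantly from the marginal distribution. The data and the z-test form below are illustrative assumptions, not the paper's actual procedure.

```python
# A toy significance test for a candidate rule on an invented decision table.
import math

table = [  # (attribute values, decision class)
    ((1, 2), "d1"), ((1, 3), "d1"), ((1, 1), "d1"), ((1, 2), "d2"),
    ((2, 2), "d2"), ((2, 1), "d2"), ((3, 3), "d1"), ((3, 1), "d2"),
] * 10  # replicate to mimic a larger sample

def rule_z(table, attr_idx, attr_val, d_class):
    n = len(table)
    p0 = sum(d == d_class for _, d in table) / n          # marginal rate
    hits = [d for a, d in table if a[attr_idx] == attr_val]
    if not hits:
        return 0.0
    p1 = sum(d == d_class for d in hits) / len(hits)      # conditional rate
    se = math.sqrt(p0 * (1 - p0) / len(hits))
    return (p1 - p0) / se                                  # z statistic

z = rule_z(table, attr_idx=0, attr_val=1, d_class="d1")
print(f"z = {z:.2f}; rule 'if a0=1 then d1' is significant" if abs(z) > 1.96
      else f"z = {z:.2f}; not significant")
```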
Procedia PDF Downloads 396
3746 Aircraft Line Maintenance Equipped with Decision Support System
Authors: B. Sudarsan Baskar, S. Pooja Pragati, S. Raj Kumar
Abstract:
Cost effectiveness in aircraft maintenance has become a high priority in recent years. It can be achieved when line maintenance activities are incorporated at airports during turnaround time (TAT). The present work addresses the shortcomings that affect the dispatching of aircraft, aiming at high fleet operability and low maintenance cost. The operational and cost constraints are discussed, and an alternative mechanism is proposed. The possible allocation of all deferred maintenance tasks to a set of suitable airport resources is termed an alternative and is discussed in this paper using data collected from Kingfisher Airlines.
Keywords: decision support system, aircraft maintenance planning, maintenance-cost, RUL (remaining useful life), logistics, supply chain management
Procedia PDF Downloads 502
3745 The Effect of Law on Politics
Authors: Boukrida Rafiq
Abstract:
Democracy is based on the notion that all citizens have the right to participate in the managing of political affairs and that every citizen's input is of equal importance. This basic assumption clearly places emphasis on public participation in maintaining a stable democracy. The level of public participation, however, is highly contested, with many theorists arguing that too much public participation would overwhelm and ultimately cripple democratic systems. On the other hand, those who favor high levels of participation argue that more citizen involvement leads to greater representation. Regardless of these disagreements over the ideal level of participation, there is widespread agreement among scholars that, at the very least, some participation is necessary to maintain democratic systems. The ways in which citizens participate vary greatly and, depending on the method used, influence political decision-making at varying levels. The method of political participation is key in controlling public influence over political affairs and is therefore also an integral part of maintaining democracy, whether it be "thin" (low levels of participation) or "robust" (high levels of participation). High levels of participation, or "robust" democracy, are argued by some theorists to enhance democracy by providing the opportunity for more issues to be represented during decision-making. The notion of widespread participation was first advanced by classical theorists.
Keywords: assumption clearly places emphasis, ultimately cripple, influence political decision making at varying, classical theorists
Procedia PDF Downloads 460
3744 GPU-Based Back-Projection of Synthetic Aperture Radar (SAR) Data onto 3D Reference Voxels
Authors: Joshua Buli, David Pietrowski, Samuel Britton
Abstract:
Processing SAR data usually requires constraints in extent in the Fourier domain as well as approximations and interpolations onto a planar surface to form an exploitable image. This results in a potential loss of data, requires several interpolative techniques, and restricts visualization to two-dimensional plane imagery. The data can be interpolated into a ground plane projection, with or without terrain as a component, all to better view SAR data in an image domain comparable to what a human would view, to ease interpretation. An alternate but computationally heavy method that makes use of more of the data is the basis of this research. Pre-processing of the SAR data is completed first (matched filtering, motion compensation, etc.), the data is then range compressed, and lastly, the contribution from each pulse is determined for each specific point in space by searching the time-history data for the reflectivity values of each pulse, summed over the entire collection. This results in a per-3D-point reflectivity using the entire collection domain. New advances in GPU processing have finally allowed this rapid projection of acquired SAR data onto any desired reference surface (called backprojection). Mathematically, the computations are fast and easy to implement, despite limitations in SAR phase history data size and 3D point cloud size. Backprojection processing algorithms are embarrassingly parallel, since each 3D point in the scene has the same reflectivity calculation applied for all pulses, independent of all other 3D points and pulse data under consideration. Therefore, given the simplicity of the single backprojection calculation, the work can be spread across thousands of GPU threads, allowing for accurate reflectivity representation of a scene. Furthermore, because reflectivity values are associated with individual three-dimensional points, a plane is no longer the sole permissible mapping base; a digital elevation model or even a cloud of points (collected from any sensor capable of measuring ground topography) can be used as a basis for the backprojection technique. This technique minimizes any interpolations and modifications of the raw data, maintaining maximum data integrity. This innovative processing will allow SAR data to be rapidly brought into a common reference frame for immediate exploitation and data fusion with other three-dimensional data and representations.
Keywords: backprojection, data fusion, exploitation, three-dimensional, visualization
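Because each 3D point accumulates contributions from every pulse independently, the core loop is easy to sketch. The numpy version below is a heavily simplified illustration with invented geometry; a real implementation would add phase correction, range interpolation, motion compensation, and a CUDA kernel in place of the pulse loop.

```python
# A simplified sketch of per-voxel backprojection over range-compressed pulses.
import numpy as np

n_pulses, n_bins, dr = 128, 512, 0.5      # pulses, range bins, bin size (m)
rng = np.random.default_rng(0)
pulses = (rng.standard_normal((n_pulses, n_bins))
          + 1j * rng.standard_normal((n_pulses, n_bins)))   # stand-in data
platform = np.stack([np.linspace(0, 100, n_pulses),         # x along track
                     np.zeros(n_pulses),
                     np.full(n_pulses, 500.0)], axis=1)      # altitude (m)
voxels = rng.uniform([0, 0, 0], [100, 100, 30], size=(1000, 3))  # 3D points
r0 = 400.0                                # range to first bin (m)

# Each voxel sums the matching range-bin sample of every pulse independently;
# this embarrassingly parallel structure maps one GPU thread per voxel.
image = np.zeros(len(voxels), dtype=complex)
for p in range(n_pulses):
    r = np.linalg.norm(voxels - platform[p], axis=1)     # voxel-to-antenna range
    bins = np.clip(((r - r0) / dr).astype(int), 0, n_bins - 1)
    image += pulses[p, bins]

print("per-voxel reflectivity magnitude:", np.abs(image[:5]))
```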
Procedia PDF Downloads 85
3743 Meteorological Risk Assessment for Ships with Fuzzy Logic Designer
Authors: Ismail Karaca, Ridvan Saracoglu, Omer Soner
Abstract:
Fuzzy logic, an advanced method to support decision-making, is used by scientists in many disciplines. Fuzzy programming is a product of fuzzy logic, fuzzy rules, and implication. In marine science, fuzzy programming for ships is increasing dramatically, together with autonomous ship studies. In this paper, a program to support the decision-making process for ship navigation has been designed. The program is built from fuzzy logic and rules, taking marine accidents and expert opinions into account. After the program was designed, it was tested against 46 ship accidents reported by the Transportation Safety Investigation Center of Turkey. Wind speed, sea condition, visibility, and day/night ratio were used as input data. They were converted into a risk factor within the Fuzzy Logic Designer application, using fuzzy rules set by marine experts. Finally, the experts' meteorological risk factor for each accident was compared with the program's risk factor, and the error rate was calculated. The main objective of this study is to improve the navigational safety of ships by using an advanced decision support model. According to the study results, fuzzy programming is a robust model that supports safe navigation.
Keywords: calculation of risk factor, fuzzy logic, fuzzy programming for ship, safety navigation of ships
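The inputs-to-risk-factor conversion can be illustrated with a minimal Mamdani-style inference sketch. The membership functions, the two rules, and all thresholds below are invented for illustration; the paper's actual rule base was set by marine experts and is not reproduced here.

```python
# A minimal Mamdani-style fuzzy risk sketch with made-up memberships and rules.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                 (c - x) / (c - b + 1e-9)), 0)

risk_axis = np.linspace(0, 10, 101)

def assess(wind_ms, visibility_nm):
    wind_high = tri(wind_ms, 10, 20, 30)
    wind_low = tri(wind_ms, 0, 5, 12)
    vis_poor = tri(visibility_nm, 0, 1, 3)
    vis_good = tri(visibility_nm, 2, 6, 10)
    # Rules: high wind AND poor visibility -> high risk; low wind AND good -> low
    fire_hi = min(wind_high, vis_poor)
    fire_lo = min(wind_low, vis_good)
    out = np.maximum(np.minimum(fire_hi, tri(risk_axis, 6, 8, 10)),
                     np.minimum(fire_lo, tri(risk_axis, 0, 2, 4)))
    if out.sum() == 0:
        return 5.0                       # neutral default when no rule fires
    return float((risk_axis * out).sum() / out.sum())   # centroid defuzzification

print(assess(wind_ms=22, visibility_nm=0.5))   # stormy and foggy: high risk
```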
Procedia PDF Downloads 189
3742 An Assessment of Airport Collaborative Decision-Making System Using Predictive Maintenance
Authors: Faruk Aras, Melih Inal, Tansel Cinar
Abstract:
The coordination of airport staff, especially in the operations and maintenance departments, is important for airport operation, as this coordination increases efficiency in all operations. Therefore, a Collaborative Decision-Making (CDM) system targets improving the overall productivity of all operations by optimizing the use of resources and improving the predictability of actions. Increased productivity can be of major benefit to all airport operations and also increases cost-efficiency. This study explains how predictive maintenance using IoT (Internet of Things), predictive operations, and statistical data such as Mean Time To Failure (MTTF) improve airport terminal operations and the utilization of airport terminal equipment, in collaboration with the CDM system/Airport Operation Control Center (AOCC). Data generated by the predictive maintenance methods is retrieved and analyzed by maintenance managers to predict when a problem is about to occur. With that information, maintenance can be scheduled when needed. As an example, if the maintenance team is in collaboration with the AOCC, the AOCC operator can assign a new gate for which all the equipment leading to that gate, such as travellators, elevators, and escalators, is operational, since the maintenance team is aware of the health of the equipment through predictive maintenance methods. Applying predictive maintenance methods based on analyzing the health of airport terminal equipment dramatically reduces the risk of downtime through on-time repairs. Calls can be classified as high priority, requiring urgent repair action; medium priority, requiring repair at the earliest opportunity; and low priority, allowing maintenance to be scheduled when convenient. In all cases, identifying potential problems early resulted in better allocation of airport terminal resources by the AOCC.
Keywords: airport, predictive maintenance, collaborative decision-making system, Airport Operation Control Center (AOCC)
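The three-level priority classification can be sketched directly. The thresholds on predicted remaining life below are assumptions for illustration, not values from the study.

```python
# An illustrative priority classifier based on MTTF and hours in service.
from dataclasses import dataclass

@dataclass
class Equipment:
    name: str
    mttf_hours: float        # statistical Mean Time To Failure
    hours_in_service: float  # since last repair

def priority(eq: Equipment) -> str:
    remaining = eq.mttf_hours - eq.hours_in_service
    if remaining <= 24:
        return "high: urgent repair action"
    if remaining <= 24 * 7:
        return "medium: repair at earliest opportunity"
    return "low: schedule when convenient"

fleet = [Equipment("escalator-B12", 8000, 7990),
         Equipment("travellator-A3", 12000, 11900),
         Equipment("elevator-T2", 10000, 2000)]
for eq in fleet:
    print(eq.name, "->", priority(eq))
```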
Procedia PDF Downloads 365
3741 Option Pricing Theory Applied to the Service Sector
Authors: Luke Miller
Abstract:
This paper develops an option pricing methodology to value strategic pricing strategies in the service sector. More specifically, this study provides a unifying taxonomy of current service sector pricing practices, frames these pricing decisions as strategic real options, demonstrates accepted option valuation techniques to assess service sector pricing decisions, and suggests future research areas where pricing decisions and real options overlap. Enhancing revenue in the service sector requires proactive decision making in a world of uncertainty. In an effort to strategically price service products, revenue enhancement necessitates a careful study of the service costs, customer base, competition, legalities, and shared economies with the market. Pricing decisions involve the quality of inputs, manpower, and best practices to maintain superior service. These decisions further hinge on identifying relevant pricing strategies and understanding how these strategies impact a firm's value. A relatively new area of research applies option pricing theory to investments in real assets and is commonly known as real options. The real options approach is based on the premise that many corporate decisions to invest or divest in assets are simply an option, wherein the firm has the right to make an investment without any obligation to act. The decision maker therefore has more flexibility, and the value of this operating flexibility should be taken into consideration. The real options framework has already been applied to numerous areas, including manufacturing, inventory, natural resources, research and development, strategic decisions, technology, and stock valuation. Additionally, numerous surveys have identified a growing need for the real options decision framework within all areas of corporate decision-making. Despite the wide applicability of real options, no study has been carried out linking service sector pricing decisions and real options. This is surprising given that the service sector comprises 80% of US employment and Gross Domestic Product (GDP). Identifying real options as a practical tool to value different service sector pricing strategies is believed to have a significant impact on firm decisions. This paper identifies and discusses four distinct pricing strategies available to the service sector from an options perspective: (1) cost-based profit margin, (2) increased customer base, (3) platform pricing, and (4) buffet pricing. Within each strategy lie several pricing tactics available to the service firm. These tactics can be viewed as options the decision maker has to best manage a strategic position in the market. To demonstrate the effectiveness of including flexibility in the pricing decision, a series of pricing strategies were developed and valued using a real options binomial lattice structure. The options pricing approach discussed in this study allows service firms to directly incorporate market-driven perspectives into the decision process and thus synchronize service operations with organizational economic goals.
Keywords: option pricing theory, real options, service sector, valuation
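A binomial lattice valuation of this kind is straightforward to sketch. The Cox-Ross-Rubinstein parameterization and the numbers below are illustrative assumptions, not the pricing strategies valued in the paper.

```python
# A small Cox-Ross-Rubinstein lattice for a real option with payoff max(V-K, 0).
import math

def real_option_value(v0, k, r, sigma, t_years, steps):
    """Value of the flexibility to adopt a strategy worth V at cost K."""
    dt = t_years / steps
    u = math.exp(sigma * math.sqrt(dt))       # up factor
    d = 1 / u
    p = (math.exp(r * dt) - d) / (u - d)      # risk-neutral probability
    disc = math.exp(-r * dt)
    # terminal payoffs, then roll back through the lattice
    values = [max(v0 * u**j * d**(steps - j) - k, 0.0) for j in range(steps + 1)]
    for _ in range(steps):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

# e.g. strategy value 100, cost to exercise 110, 2 years, 30% volatility
print(round(real_option_value(100, 110, 0.05, 0.30, 2.0, 200), 2))
```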
Procedia PDF Downloads 355
3740 Description of Decision Inconsistency in Intertemporal Choices and Representation of Impatience as a Reflection of Irrationality: Consequences in the Field of Personalized Behavioral Finance
Authors: Roberta Martino, Viviana Ventre
Abstract:
Empirical evidence has, over time, confirmed that the behavior of individuals is inconsistent with the descriptions provided by the Discounted Utility Model, an essential reference for calculating the utility of intertemporal prospects. The model assumes that individuals calculate the utility of intertemporal prospects by adding up the values of all outcomes, obtained by multiplying the cardinal utility of each outcome by the discount function estimated at the time the outcome is received. The shape of the discount function is crucial for the preferences of the decision maker because it represents the perception of the future and determines whether preferences are temporally consistent or temporally inconsistent. In particular, because different formulations of the discount function lead to different conclusions in predicting choice, the descriptive ability of models with a hyperbolic trend is greater than that of linear or exponential models. Suboptimal choices from any temporal point of view are the consequence of this mechanism, whose psychological factors are encapsulated in the trend of the discount rate. In addition, analyzing the decision-making process from a psychological perspective, there is an equivalence between the selection of dominated prospects and a degree of impatience that decreases over time. The first part of the paper describes and investigates the anomalies of the Discounted Utility Model by relating the cognitive distortions of the decision maker to the emotional factors generated during the evaluation and selection of alternatives. Specifically, by studying the degree to which impatience decreases, it is possible to quantify how the psychological and emotional mechanisms of the decision maker result in a lack of decision persistence. In addition, this description presents inconsistency as the consequence of an inconsistent attitude towards time-delayed choices. The second part of the paper presents an experimental phase in which we show the relationship between inconsistency and impatience in different contexts. Analysis of the degree to which impatience decreases confirms the influence of the decision maker's emotional impulses for each anomaly of the utility model discussed in the first part of the paper. This work provides an application in the field of personalized behavioral finance. Indeed, the numerous behavioral diversities, evident even in the degrees of decrease in impatience in the experimental phase, support the idea that optimal strategies may not satisfy individuals in the same way. With the aim of homogenizing the categories of investors and providing a personalized approach to advice, the results proven in the experimental phase are used in a complementary way with information from the field of behavioral finance to implement the Analytic Hierarchy Process model in intertemporal choices, useful for strategic personalization. In the construction of the Analytic Hierarchy Process, the degree of decrease in impatience is understood as reflecting irrationality in decision-making and is therefore used for the construction of weights between anomalies and behavioral traits.
Keywords: analytic hierarchy process, behavioral finance, financial anomalies, impatience, time inconsistency
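The temporal inconsistency produced by a hyperbolic discount function can be demonstrated in a few lines. The parameter values below are illustrative assumptions, chosen so that the hyperbolic agent reverses its choice as the rewards draw near while the exponential agent does not.

```python
# Exponential vs. hyperbolic discounting and the resulting preference reversal.
import math

def exponential(t, r=0.15):
    return math.exp(-r * t)

def hyperbolic(t, k=0.3):
    return 1 / (1 + k * t)

small_soon, large_late = 100, 150   # reward amounts
delay_gap = 2                       # the larger reward arrives 2 periods later

for now in (0, 10):                 # evaluate the same pair near and far in time
    for name, d in (("exponential", exponential), ("hyperbolic", hyperbolic)):
        u_soon = small_soon * d(now)
        u_late = large_late * d(now + delay_gap)
        choice = "soon" if u_soon > u_late else "late"
        print(f"t={now:>2} {name:>11}: {choice}")
# The exponential agent chooses consistently at both horizons; the hyperbolic
# agent flips from "late" (far away) to "soon" (close up): decision inconsistency.
```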
Procedia PDF Downloads 68
3739 Advanced Analytical Competency Is Necessary for Strategic Leadership to Achieve High-Quality Decision-Making
Authors: Amal Mohammed Alqahatni
Abstract:
This paper is a non-empirical analysis of existing literature on digital leadership competency, data-driven organizations, and dealing with AI technology (big data). It provides insights into the importance of developing a leader's analytical skills and style to be more effective in high-quality decision-making in a data-driven organization and to achieve creativity during the organization's digital transformation. Despite the enormous potential of big data, there are not enough experts in the field. Many organizations have faced issues with leadership style, which is considered an obstacle to organizational improvement. The paper investigates the obstacles to leadership style in this context and the challenges leaders face in coaching and development. A leader's lack of analytical skill with AI technology, such as big data tools, was noticed, as was a lack of understanding of the value of that data, resulting in poor communication with others, especially in meetings where decisions should be made. By acknowledging the different dynamics of work competency and organizational structure and culture, organizations can make the necessary adjustments to best support their leaders. This paper reviews prior research studies and applies what is known to assist with current obstacles. It addresses how analytical leadership will assist in overcoming challenges in a data-driven organization's work environment.
Keywords: digital leadership, big data, leadership style, digital leadership challenge
Procedia PDF Downloads 69
3738 Fuzzy Linear Programming Approach for Determining the Production Amounts in Food Industry
Abstract:
In recent years, rapid and correct decision making has become crucial for both people and enterprises. However, uncertainty makes decision-making difficult. Fuzzy logic is used for coping with this situation. Thus, fuzzy linear programming models have been developed in order to handle uncertainty in the objective function and the constraints. In this study, a problem of a factory in the food industry is investigated, the required data is obtained, and the problem is formulated as a fuzzy linear programming model. The model is solved using the Zimmermann approach, which is one of the approaches for fuzzy linear programming. As a result, the solution gives the amount of production for each product type in order to gain maximum profit.
Keywords: food industry, fuzzy linear programming, fuzzy logic, linear programming
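Zimmermann's symmetric approach turns a fuzzy LP into a crisp one by maximizing a common satisfaction level λ over the fuzzy profit goal and fuzzy resource limits. The sketch below shows that construction with scipy; the profit coefficients, limits, tolerances, and aspiration level are invented, since the factory's data is not given in the abstract.

```python
# A compact sketch of Zimmermann's max-min fuzzy LP with invented numbers.
import numpy as np
from scipy.optimize import linprog

c = np.array([40.0, 30.0])          # unit profits for two product types
A = np.array([[2.0, 1.0],           # resource usage per unit
              [1.0, 3.0]])
b = np.array([100.0, 90.0])         # fuzzy resource limits
p = np.array([10.0, 15.0])          # tolerances on the limits
z0, pz = 2200.0, 400.0              # profit aspiration level and its tolerance

# Variables: x1, x2, lam. Maximize lam subject to:
#   c.x >= z0 - (1 - lam) * pz      (profit membership)
#   A.x <= b + (1 - lam) * p        (resource memberships)
obj = [0.0, 0.0, -1.0]              # linprog minimizes, so minimize -lam
A_ub = np.vstack([np.hstack([-c, [pz]]),                 # -c.x + pz*lam <= pz - z0
                  np.hstack([A, p.reshape(-1, 1)])])     # A.x + p*lam <= b + p
b_ub = np.concatenate([[pz - z0], b + p])
res = linprog(obj, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None), (0, 1)])
x1, x2, lam = res.x
print(f"produce x1={x1:.1f}, x2={x2:.1f}, satisfaction lambda={lam:.2f}")
```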
Procedia PDF Downloads 650
3737 A Comparative Study on Automatic Feature Classification Methods of Remote Sensing Images
Authors: Lee Jeong Min, Lee Mi Hee, Eo Yang Dam
Abstract:
Geospatial feature extraction is a very important issue in remote sensing research. Image classification has traditionally been based on statistical techniques, but in recent years, data mining and machine learning techniques for automated image processing have been applied to remote sensing, focusing on the possibility of generating improved results. In this study, artificial neural network and decision tree techniques are applied to classify high-resolution satellite images, the results are compared with those of maximum likelihood classification (MLC), a statistical technique, and the pros and cons of each technique are analyzed.
Keywords: remote sensing, artificial neural network, decision tree, maximum likelihood classification
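A minimal version of such a comparison can be sketched on synthetic data standing in for multispectral pixel features; the real study uses labeled high-resolution imagery, and the Gaussian MLC here is a common but assumed formulation.

```python
# Decision tree vs. Gaussian maximum likelihood classification on toy "pixels".
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=4, n_informative=3,
                           n_redundant=1, n_classes=3, n_clusters_per_class=1,
                           random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(max_depth=6, random_state=0).fit(Xtr, ytr)

# MLC: fit one Gaussian per class, assign each pixel to the most likely class
models = [multivariate_normal(Xtr[ytr == k].mean(0), np.cov(Xtr[ytr == k].T))
          for k in range(3)]
mlc_pred = np.argmax([m.logpdf(Xte) for m in models], axis=0)

print("decision tree accuracy:", (tree.predict(Xte) == yte).mean())
print("MLC accuracy:          ", (mlc_pred == yte).mean())
```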
Procedia PDF Downloads 347
3736 The Nexus of Decentralized Policy, Social Heterogeneity and Poverty in Equitable Forest Benefit Sharing in the Lowland Community Forestry Program of Nepal
Authors: Dhiraj Neupane
Abstract:
Decentralized policy and practices have largely concentrated on the transfer of decision-making authority from central to local institutions (or people) in the developing world. Such policy and practices have always aimed at the equitable and efficient management of resources in line with poverty reduction. The transfer of forest decision-making autonomy has also been glorified as the best forest management alternative to maximize forest benefits and improve the livelihood of local people living near forests. However, social heterogeneity and the poor decision-making capacity of local institutions (or people) create a nexus of problems in managing the resources and sharing forest benefits among user households, despite the policy objectives. The situation is severe in the lowland of Nepal, where forest resources have higher economic potential and user households have heterogeneous socio-economic conditions. The study discovered that, exercising their decision-making autonomy, user households set low values for timber, the most valuable product of the community forest, in the name of equitable access to timber for all user households. Because society is heterogeneous in its socio-economic conditions, households in better economic conditions were always taking a higher share of forest benefits. The low valuation of timber has negative consequences for equitable benefit sharing and provides poor support to the livelihood improvement of user households. Moreover, low valuation may increase local demand for timber and human pressure on forests.
Keywords: decentralized forest policy, Nepal, poverty, social heterogeneity, Terai
Procedia PDF Downloads 287
3735 The Implementation of the Human Right of Self-Determination: The Example of Nagorno-Karabakh Republic
Authors: S. Vlasyan
Abstract:
The article deals with the implementation of the right to self-determination of peoples, using the example of the Nagorno-Karabakh Republic. The problem of the correlation of two fundamental principles of international law, i.e., territorial integrity and the right to self-determination of peoples, has been considered one of the vital issues in the field of international law for several decades. In this article, the author analyzes the decision of the Supreme Court of Canada regarding specific issues of the secession of Quebec from Canada, as well as the decisions of the International Court of Justice in the case concerning East Timor (Portugal v. Australia) and in the case of Western Sahara. The author formulates the legal conditions of Nagorno-Karabakh's secession.
Keywords: right of self-determination, territorial integrity, the principles of International Law, Nagorno-Karabakh Republic
Procedia PDF Downloads 408
3734 A Condition-Based Maintenance Policy for Multi-Unit Systems Subject to Deterioration
Authors: Nooshin Salari, Viliam Makis
Abstract:
In this paper, we propose a condition-based maintenance policy for multi-unit systems, considering the existence of economic dependency among units. We consider a system composed of N identical units, where each unit deteriorates independently. The deterioration process of each unit is modeled as a three-state continuous-time homogeneous Markov chain with two working states and a failure state. The average production rate of units varies across the working states, and the demand rate of the system is constant. Units are inspected at equidistant time epochs, and the decision regarding performing maintenance is determined by the number of units in the failure state. If the total number of units in the failure state exceeds a critical level, maintenance is initiated, where units in the failed state are replaced correctively and units in the deteriorated state are maintained preventively. Our objective is to determine the optimal number of failed units to initiate maintenance, minimizing the long-run expected average cost per unit time. The problem is formulated and solved in the semi-Markov decision process (SMDP) framework. A numerical example is developed to demonstrate the proposed policy, and a comparison with the corrective maintenance policy is presented.
Keywords: reliability, maintenance optimization, semi-Markov decision process, production
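While the paper solves the problem exactly in the SMDP framework, the structure of the threshold policy can be illustrated with a crude Monte Carlo stand-in: simulate N independent three-state units, trigger maintenance when the failed count reaches a critical level, and compare average costs across levels. All transition probabilities and costs below are invented.

```python
# A Monte Carlo stand-in for the threshold maintenance policy described above.
import random

def simulate(n_units=10, threshold=3, horizon=10_000,
             p_degrade=0.05, p_fail=0.08,
             c_inspect=1.0, c_repl=50.0, c_prev=10.0, seed=1):
    random.seed(seed)
    states = [0] * n_units          # 0 good, 1 degraded, 2 failed
    cost = 0.0
    for _ in range(horizon):        # one step = one inspection interval
        for i in range(n_units):    # each unit deteriorates independently
            if states[i] == 0 and random.random() < p_degrade:
                states[i] = 1
            elif states[i] == 1 and random.random() < p_fail:
                states[i] = 2
        cost += c_inspect
        if states.count(2) >= threshold:
            cost += states.count(2) * c_repl + states.count(1) * c_prev
            states = [0] * n_units  # corrective + preventive maintenance
    return cost / horizon

for m in range(1, 6):               # search the critical level numerically
    print(f"threshold={m}: avg cost/time = {simulate(threshold=m):.2f}")
```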
Procedia PDF Downloads 165
3733 Structural Health Monitoring-Integrated Structural Reliability Based Decision Making
Authors: Caglayan Hizal, Kutay Yuceturk, Ertugrul Turker Uzun, Hasan Ceylan, Engin Aktas, Gursoy Turan
Abstract:
Monitoring concepts for structural systems have been investigated by researchers for decades, since such tools are quite convenient for planning interventions on structures. Despite the considerable development in this regard, the efficient use of monitoring data in reliability assessment and prediction models still needs improvement. More specifically, reliability-based seismic risk assessment of engineering structures may play a crucial role in the post-earthquake decision-making process. After an earthquake, professionals can identify heavily damaged structures based on visual observations. Among these, it is hard to identify the ones with minimal signs of damage, even if they have experienced considerable structural degradation. Besides, visual observations are open to human interpretation, which makes the decision process controversial and thus less reliable. In this context, when a continuous monitoring system has been previously installed on the corresponding structure, this decision process can be completed rapidly and with higher confidence by means of the observed data. At this stage, the Structural Health Monitoring (SHM) procedure plays an important role, since it makes it possible to estimate the system reliability based on a recursively updated mathematical model. Therefore, integrating an SHM procedure into the reliability assessment process comes forward as an important challenge, due to the uncertainties arising in the updated model from environmental, material, and earthquake-induced changes. In this context, this study presents a case study on the SHM-integrated reliability assessment of continuously monitored, progressively damaged systems. The objective of this study is to get instant feedback on the current state of the structure after an extreme event, such as an earthquake, by involving the observed data rather than visual inspections. Thus, the decision-making process after such an event can be carried out on a rational basis. In the near future, this can pave the way for the design of self-reporting structures that can warn about their current situation after an extreme event.
Keywords: condition assessment, vibration-based SHM, reliability analysis, seismic risk assessment
Procedia PDF Downloads 143
3732 Calpoly Autonomous Transportation Experience: Software for Driverless Vehicle Operating on Campus
Authors: F. Tang, S. Boskovich, A. Raheja, Z. Aliyazicioglu, S. Bhandari, N. Tsuchiya
Abstract:
Calpoly Autonomous Transportation Experience (CATE) is a driverless vehicle that we are developing to provide safe, accessible, and efficient transportation of passengers throughout the Cal Poly Pomona campus for events such as orientation tours. Unlike other self-driving vehicles, which are usually developed to operate with other vehicles and reside only on road networks, CATE will operate exclusively on the walk-paths of the campus (potentially narrow passages) with pedestrians traveling from multiple locations. Safety becomes paramount as CATE operates within the same environment as pedestrians. As driverless vehicles assume greater roles in today's transportation, this project will contribute to autonomous driving with pedestrian traffic in a highly dynamic environment. The CATE project requires significant interdisciplinary work. Researchers from mechanical engineering, electrical engineering, and computer science are working together to attack the problem from different perspectives (hardware, software, and system). In this abstract, we describe the software aspects of the project, with a focus on the requirements and the major components. CATE shall provide a GUI interface for the average user to interact with the car and access its available functionalities, such as selecting a destination from any origin on campus. We have developed an interface that provides an aerial view of the campus map, the current car location, routes, and the goal location. Users can interact with CATE through audio or manual inputs. CATE shall plan routes from the origin to the selected destination for the vehicle to travel. We will use an existing aerial map of the campus and convert it to a spatial graph configuration, where the vertices represent landmarks and the edges represent paths that the car should follow with some designated behaviors (such as staying on the right side of the lane or following an edge). Graph search algorithms such as A* will be implemented as the default path planning algorithm. D* Lite will be explored to efficiently recompute the path when there are any changes to the map. CATE shall avoid static obstacles and walking pedestrians within a safe distance. Unlike traveling along traditional roadways, CATE's route directly coexists with pedestrians. To ensure the safety of the pedestrians, we will use sensor fusion techniques that combine data from both lidar and stereo vision for obstacle avoidance, while also allowing CATE to operate along its intended route. We will also build prediction models for pedestrian traffic patterns. CATE shall improve its localization and work in GPS-denied situations. CATE relies on its GPS to give its current location, which has a precision of a few meters. We have implemented an Unscented Kalman Filter (UKF) that allows the fusion of data from multiple sensors (such as GPS, IMU, and odometry) in order to increase the confidence of localization. We have also noticed that GPS signals can easily get degraded or blocked on campus due to high-rise buildings or trees. The UKF can also help here to generate a better state estimate. In summary, CATE will provide an on-campus transportation experience that coexists with dynamic pedestrian traffic. In future work, we will extend it to multi-vehicle scenarios.
Keywords: driverless vehicle, path planning, sensor fusion, state estimate
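The default path planner described above is classic A* over a landmark graph. A compact sketch follows; the campus nodes, edges, and coordinates are invented placeholders for the actual aerial-map graph.

```python
# A minimal A* search over a small landmark graph with Euclidean heuristic.
import heapq, math

coords = {"gate": (0, 0), "library": (2, 1), "quad": (3, 3),
          "gym": (5, 2), "dorms": (6, 5)}
graph = {"gate": ["library"], "library": ["gate", "quad", "gym"],
         "quad": ["library", "dorms"], "gym": ["library", "dorms"],
         "dorms": ["quad", "gym"]}

def dist(a, b):
    (x1, y1), (x2, y2) = coords[a], coords[b]
    return math.hypot(x2 - x1, y2 - y1)

def a_star(start, goal):
    frontier = [(0.0, start, [start])]     # (f = g + heuristic, node, path)
    best_g = {start: 0.0}
    while frontier:
        _, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        for nxt in graph[node]:
            g = best_g[node] + dist(node, nxt)
            if g < best_g.get(nxt, float("inf")):
                best_g[nxt] = g
                heapq.heappush(frontier, (g + dist(nxt, goal), nxt, path + [nxt]))
    return None

print(a_star("gate", "dorms"))   # e.g. ['gate', 'library', 'quad', 'dorms']
```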
Procedia PDF Downloads 144
3731 Towards a Framework for Embedded Weight Comparison Algorithm with Business Intelligence in the Plantation Domain
Authors: M. Pushparani, A. Sagaya
Abstract:
Embedded systems have emerged as important elements in various domains, with extensive applications in the automotive, commercial, consumer, healthcare, and transportation markets, as there is an emphasis on intelligent devices. On the other hand, Business Intelligence (BI) has also been extensively used in a range of applications, especially in the agriculture domain, which is the area of this research. The aim of this research is to create a framework for an Embedded Weight Comparison Algorithm with Business Intelligence (EWCA-BI). The weight comparison algorithm will be embedded within the plantation management system and the weighbridge system. This algorithm will be used to estimate the weight at the site and compare it with the actual weight at the plantation. The algorithm will be used to raise the necessary alerts when there is a discrepancy in the weight, thus enabling better decision making. In the current practice, data are collected from various locations in various forms. It is a challenge to consolidate the data to obtain timely and accurate information for effective decision making. Adding to this, unstable network connections lead to difficulty in getting timely and accurate information. To overcome these challenges, the weight comparison algorithm is embedded on a portable device that also assists in data capture and synchronizes data across locations, overcoming the network shortcomings at collection points. The EWCA-BI will provide real-time information at any given point of time, thus enabling non-latent BI reports that provide crucial information for efficient operational decision making. This research has a high potential for bringing embedded systems into the agriculture industry. EWCA-BI will provide BI reports with accurate information from uncompromised data using an embedded system and provide alerts, therefore enabling effective operation management decision-making at the site.
Keywords: embedded business intelligence, weight comparison algorithm, oil palm plantation, embedded systems
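The comparison-and-alert core is simple to sketch. The load records, field names, and the 2% tolerance below are assumptions for illustration, not values from the framework.

```python
# A toy weight-comparison check: flag loads whose site estimate and
# weighbridge reading disagree beyond a tolerance.
TOLERANCE_PCT = 2.0   # acceptable difference between estimate and actual

loads = [  # (load_id, estimated_kg at collection point, actual_kg at weighbridge)
    ("FFB-001", 1520.0, 1512.5),
    ("FFB-002", 1480.0, 1399.0),
    ("FFB-003", 1605.0, 1601.0),
]

def check(load_id, estimated, actual):
    diff_pct = abs(estimated - actual) / actual * 100
    status = "ALERT" if diff_pct > TOLERANCE_PCT else "ok"
    return (f"{load_id}: est={estimated:.0f}kg actual={actual:.0f}kg "
            f"diff={diff_pct:.1f}% -> {status}")

for row in loads:
    print(check(*row))   # discrepancies feed the BI report in real time
```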
Procedia PDF Downloads 285
3730 Decision Location and Resource Requirement for Relief Goods Assembly
Authors: Glenda B. Minguito, Jenith L. Banluta
Abstract:
One of the critical aspects of humanitarian operations is the distribution of relief goods to the affected community. The common assumption is that relief goods are prepositioned ahead of disasters, which is not applicable in developing countries like the Philippines. During disasters, the on-the-ground government agencies and responders have to procure, sort, weigh, and pack the relief goods. There is a need to review the relief goods preparation, as it seriously affects the delivery of aid necessary for human survival. This study also identifies the ideal location of the assembly hub to minimize the distance to the affected community. This paper reveals that location and resources are dependent on the type of disaster encountered at the local level. The Center-of-Gravity method and the Multiple Activity Chart were applied in the analysis.
Keywords: humanitarian supply chain, location decision, resource allocation, local level
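The Center-of-Gravity method locates the hub at the demand-weighted centroid of the affected communities. A short sketch with invented coordinates and demand weights:

```python
# Center-of-Gravity siting of the relief goods assembly hub.
communities = [  # (x_km, y_km, demand in relief packs)
    (2.0, 8.0, 1200),
    (5.5, 3.0, 800),
    (9.0, 6.5, 1500),
    (4.0, 1.0, 400),
]

total = sum(w for _, _, w in communities)
hub_x = sum(x * w for x, _, w in communities) / total
hub_y = sum(y * w for _, y, w in communities) / total
print(f"candidate hub location: ({hub_x:.2f} km, {hub_y:.2f} km)")
```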
Procedia PDF Downloads 148
3729 Transparency of Algorithmic Decision-Making: Limits Posed by Intellectual Property Rights
Authors: Olga Kokoulina
Abstract:
Today, algorithms are assuming a leading role in various areas of decision-making. Prompted by a promise to provide increased economic efficiency and fuel solutions for pressing societal challenges, algorithmic decision-making is often celebrated as an impartial and constructive substitute for human adjudication. But in the face of this implied objectivity and efficiency, the application of algorithms is also marred by mounting concerns about embedded biases, discrimination, and exclusion. In Europe, vigorous debates on the risks and adverse implications of algorithmic decision-making largely revolve around the potential of data protection laws to tackle some of the related issues. For example, one of the often-cited avenues to mitigate the impact of potentially unfair decision-making practices is a so-called 'right to explanation'. In essence, the overall right is derived from the provisions of the General Data Protection Regulation ('GDPR') ensuring the right of data subjects to access, and mandating the obligation of data controllers to provide, the relevant information about the existence of automated decision-making and meaningful information about the logic involved. Taking the corresponding rights and obligations in the context of the specific provision on automated decision-making in the GDPR, the debates mainly focus on the efficacy and exact scope of the 'right to explanation'. In essence, the underlying logic of the argued remedy lies in a transparency imperative. Allowing data subjects to acquire as much knowledge as possible about the decision-making process means empowering individuals to take control of their data and take action. In other words, forewarned is forearmed. The related discussions and debates are ongoing, comprehensive, and often heated. However, they are also frequently misguided and isolated: embracing data protection law as the ultimate and sole lens is often not sufficient. Mandating the disclosure of the technical specifications of employed algorithms in the name of transparency for and empowerment of data subjects potentially encroaches on the interests and rights of IPR holders, i.e., the business entities behind the algorithms. The study aims at pushing the boundaries of the transparency debate beyond the data protection regime. By systematically analysing legal requirements and current judicial practice, it assesses the limits of the transparency requirement and right to access posed by intellectual property law, namely by copyright and trade secrets. It is asserted that trade secrets, in particular, present an often-insurmountable obstacle to realising the potential of the transparency requirement. In reaching that conclusion, the study explores the limits of the protection afforded by the European Trade Secrets Directive and contrasts them with the scope of the respective rights and obligations related to data access and portability enshrined in the GDPR. As shown, the far-reaching scope of protection under trade secrecy is evidenced both through the assessment of its subject matter and through the exceptions to such protection. As a way forward, the study scrutinises several possible legislative solutions, such as a flexible interpretation of the public interest exception in trade secrets as well as the introduction of a strict liability regime in case of non-transparent decision-making.
Keywords: algorithms, public interest, trade secrets, transparency
Procedia PDF Downloads 124
3728 Knowledge, Hierarchy and Decision-Making: Analysis of Documentary Filmmaking Practices in India
Authors: Nivedita Ghosh
Abstract:
In his critique of Lefebvre's view that 'technological capacities' are class-dependent, Francois Hetman argues that technology today is participatory, allowing the entry of individuals from different levels of social stratification. As a result, we are entering an era of technology operators or 'clerks' who become the new decision-makers because of the knowledge they possess of the use of technologies. In response to Hetman's thesis, this paper argues that knowledge of technology, while indeed providing a momentary space for decision-making, does not necessarily restructure social hierarchies. Through case studies from the world of Indian documentary filmmaking, this paper puts forth the view that Hetman's clerks, despite being technologically advanced, do not break into the filmmaking hierarchical order. This remains true even in situations where technical knowledge rests mostly with those on the lowest rungs of the filmmaking ladder. Instead, technological knowledge provides the space for other kinds of relationships to evolve, such as 'trusting the technician' or 'admiration for the technician's work'. Furthermore, what continues to define the documentary filmmaking hierarchy is the conceptualization capacities of the practitioners, which are influenced by a similarity in socio-cultural backgrounds and by film school training accessible primarily to the filmmakers rather than the technicians. Accordingly, the paper concludes with the argument that, more than 'technological capacities', it is 'conceptualization capacities' which are class-dependent, especially when we study the field of documentary filmmaking.
Keywords: documentary filmmaking, India, technology, knowledge, hierarchy
Procedia PDF Downloads 262
3727 Requirements Definitions of Real-Time System Using the Behavioral Patterns Analysis (BPA) Approach: The Healthcare Multi-Agent System
Authors: Assem El-Ansary
Abstract:
This paper illustrates the event-oriented Behavioral Pattern Analysis (BPA) modeling approach using the Healthcare Multi-Agent System. The Event defined in BPA is a real-life conceptual entity that is unrelated to any implementation. The major contributions of this research are the Behavioral Pattern Analysis (BPA) modeling methodology and the development of an interactive software tool (DECISION), which is based on a combination of the Analytic Hierarchy Process (AHP) and the ELECTRE Multi-Criteria Decision Making (MCDM) methods.
Keywords: analysis, modeling methodology, software modeling, event-oriented, behavioral pattern, use cases, Healthcare Multi-Agent System
Procedia PDF Downloads 550
3726 Presentation of International Military Intervention Correlates (IMIC) Database
Authors: Daniil Chernov
Abstract:
In the modern world, the number of conventional interstate wars is declining, while the number of military interventions is rising. States no longer initiate conflicts by declaring war but actively intervene in existing military confrontations, often using a comparable number of coercive means. According to existing scholarly understanding, the decision to use force in international relations (in any form) is influenced by roughly the same set of factors: the dynamics of domestic political processes, national interests, international law, and ethical considerations. The database on armed interventions to be presented in the report develops this multifactor model of decision-making. The database describes more than 200 different parameters for armed interventions between 1992 and 2022. The report will present the structure of the database, descriptive statistics, and its key advantages over other sources.
Keywords: conflict resolution, international relations, military intervention, database
Procedia PDF Downloads 34
3725 The Promise of Nunca Más after Cambiemos: Representations of the 2x1 Decision of the Supreme Court and Santiago Maldonado's Disappearance in the Newspaper La Nación
Authors: Uluhan Berk Ondul
Abstract:
This article aims to shed light on the new stage of transitional justice in Argentina by examining the representations of the 2x1 decision of the Supreme Court and Santiago Maldonado's disappearance in the newspaper La Nación. The two events hold the key to understanding Argentina's journey since the return to democracy, as they concern the same crimes of the dictatorship, namely, the forced disappearance of civilians and the impunity that follows. In the case of a convicted torturer, the Supreme Court of Argentina ruled on 3 May 2017 that days spent in preventive detention beyond two years should count double toward the overall sentence. This court decision was met with severe resistance from members of parliament as well as the human rights movement. The second item on the list still continues and divides the country into two camps: (1) those who think that the police force has committed another act of forced disappearance in the case of activist Santiago Maldonado and (2) those who accuse the peronistas (the party and supporters of the ex-president Cristina Fernandez de Kirchner) of using this subject as a means to score political points. As a newspaper known for its proximity to the current administration, La Nación offers insight into the direction of the country and also demonstrates how the neoliberal mindset works. The results of the study show that the transitional justice process in Argentina is far from complete, as the promise of Nunca Más is still not a shared value but a political statement.
Keywords: Argentina, Fallo 2x1, impunity, Santiago Maldonado, transitional justice
Procedia PDF Downloads 231
3724 Lobbying Regulation in the EU: Transparency's Achilles' Heel
Authors: Krambia-Kapardis Maria, Neophytidou Christina
Abstract:
Lobbying is an inherent aspect of democratic regimes across the globe. Although it can provide decision-makers with valuable knowledge and grant stakeholders access to the decision-making process, it can also lead to undue influence and unfair competition at the expense of the public interest if it is not transparent. Given the multi-level governance structure of the EU, it is no surprise that the EU policy-making arena has become a place-to-be for lobbyists. However, in order to ensure that influence is legitimate and not biased by any business interests, lobbying must be effectively regulated. A comparison with the US and Canadian lobbying regulatory frameworks, drawing on some good practices from EU countries, makes it apparent that lobbying is the Achilles' heel of transparency in the EU. It is evident that EU institutions suffer from ineffective regulations and could in fact benefit from a more robust, mandatory, and better-implemented system of lobbying regulation.
Keywords: EU, lobbying regulation, transparency, democratic regimes
Procedia PDF Downloads 422
3723 Intelligent Agent Travel Reservation System Requirements Definitions Using the Behavioral Patterns Analysis (BPA) Approach
Authors: Assem El-Ansary
Abstract:
This paper illustrates the event-oriented Behavioral Pattern Analysis (BPA) modeling approach in developing an Intelligent Agent Reservation System (IARS). The Event defined in BPA is a real-life conceptual entity that is unrelated to any implementation. The major contributions of this research are the development of the Behavioral Pattern Analysis (BPA) modeling methodology and of an interactive software tool (DECISION), which is based on a combination of the Analytic Hierarchy Process (AHP) and the ELECTRE Multi-Criteria Decision Making (MCDM) methods.
Keywords: analysis, intelligent agent, reservation system, modeling methodology, software modeling, event-oriented, behavioral pattern, use cases
Procedia PDF Downloads 484
3722 A Data-Mining Model for Protection of FACTS-Based Transmission Line
Authors: Ashok Kalagura
Abstract:
This paper presents a data-mining model for the fault-zone identification of flexible AC transmission systems (FACTS)-based transmission lines including a thyristor-controlled series compensator (TCSC) and a unified power-flow controller (UPFC), using ensemble decision trees. Given the randomness in the ensemble of decision trees stacked inside the random forests model, it provides an effective decision on fault-zone identification. Half-cycle post-fault current and voltage samples from the fault inception are used as an input vector, with target output '1' for a fault after the TCSC/UPFC and '0' for a fault before the TCSC/UPFC. The algorithm is tested on simulated fault data with wide variations in the operating parameters of the power system network, including a noisy environment, providing a reliability measure of 99% with a faster response time (3/4th of a cycle from fault inception). The results of the presented approach using the RF model indicate reliable identification of the fault zone in FACTS-based transmission lines.
Keywords: distance relaying, fault-zone identification, random forests, RFs, support vector machine, SVM, thyristor-controlled series compensator, TCSC, unified power-flow controller, UPFC
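The ensemble classifier itself is a standard random forest over sampled waveforms. The sketch below trains one on synthetic half-cycle feature vectors; the label rule and noise level are placeholders for the simulated fault records described in the abstract.

```python
# An illustrative random forest fault-zone classifier on synthetic samples.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n, n_samples_per_fault = 1000, 32            # 32 post-fault V/I samples per record
X = rng.standard_normal((n, n_samples_per_fault))
y = (X[:, :8].mean(axis=1) > 0).astype(int)  # stand-in for before/after zone label
X += 0.3 * rng.standard_normal(X.shape)      # measurement noise

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)
print("fault-zone identification accuracy:", rf.score(Xte, yte))
```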
Procedia PDF Downloads 423
3721 A Framework for the Evaluation of Infrastructures' Serviceability
Authors: Kyonghoon Kim, Wonyoung Park, Taeil Park
Abstract:
In 1994, Korea experienced the national tragedy of the Seongsu Bridge collapse. The accident was severe enough to alert government officers to the problems of the existing management policy for national infrastructures. As a result, the government legislated the 'Guidelines for the safety inspection and test of infrastructure', which have been utilized as the primary decision-making tool for the maintenance and rehabilitation of infrastructure for the last twenty years. Although it is clear that the guideline established the basics of how to evaluate and manage the condition of infrastructures in a systematic manner, it is equally clear that the guideline needs improvements in order to support reasonable investment decisions for budget allocation. Because its inspection and evaluation procedures mainly focus on the structural condition of infrastructures, it is hard to make decisions when infrastructures are at the same level of structural condition. In addition, it does not properly reflect various aspects of infrastructures, such as performance, public demand, and capacity, which are more valuable to the public. Regardless of their importance, these factors have commonly been neglected in the governmental decision-making process because they are somewhat subjective and difficult to quantify in a rational manner. Thus, this study proposes a framework to properly evaluate the serviceability indicators using AHP and a fuzzy approach. The framework is expected to assist governmental agencies in establishing effective investment strategies for budget planning.
Keywords: infrastructure, evaluation, serviceability, fuzzy
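The AHP half of such a framework can be sketched in a few lines: derive indicator weights from a pairwise comparison matrix via its principal eigenvector and check the consistency ratio. The comparison matrix below is an invented example, not the study's.

```python
# AHP weights and consistency check for three serviceability indicators.
import numpy as np

# indicators: structural condition, performance, public demand (illustrative)
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                     # principal eigenvector -> weights

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)         # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]          # Saaty's random index
print("weights:", np.round(weights, 3), "CR =", round(ci / ri, 3))  # CR < 0.1 ok
```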
Procedia PDF Downloads 286
3720 Regular or Irregular: An Investigation of Medicine Consumption Pattern with Poisson Mixture Model
Authors: Lichung Jen, Yi Chun Liu, Kuan-Wei Lee
Abstract:
Abundant data has accumulated in databases nowadays and is commonly used to support decision-making. In the healthcare industry, for instance, ordering pharmacy inventory is one of a hospital's key decisions. With a large drug inventory, costs increase, and expiration dates might lead to future issues such as drug disposal and recycling. In contrast, underestimating demand for pharmacy inventory, particularly standing drugs, affects medical treatment and possibly the hospital's reputation. The prescription behaviour of hospital physicians is one of the critical factors influencing this decision, particularly irregular prescription behaviour. If a drug's usage in a month is irregular and less than the regular usage, it may cause a trend of subsequent stockpiling. On the contrary, if a drug has been prescribed more often than expected, it may result in insufficient inventory. We propose a hierarchical Bayesian mixture model with two components to identify physicians' regular/irregular prescription patterns with probabilities. The heterogeneity of hospitals is considered in our proposed hierarchical Bayes model. The results suggest that modeling the prescription patterns of physicians is beneficial for estimating the order quantity of medication and the pharmacy inventory management of the hospital. Managerial implications and future research are discussed.
Keywords: hierarchical Bayesian model, Poisson mixture model, medicine prescription behavior, irregular behavior
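The two-component structure can be illustrated with a plain EM fit of a Poisson mixture to synthetic monthly prescription counts. This is a deliberate simplification of the paper's hierarchical Bayesian model, and all rates and sample sizes below are invented.

```python
# EM for a two-component Poisson mixture on synthetic prescription counts.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(7)
counts = np.concatenate([rng.poisson(4, 300),     # irregular low-usage months
                         rng.poisson(15, 700)])   # regular usage months

pi, lam = np.array([0.5, 0.5]), np.array([2.0, 10.0])   # initial guesses
for _ in range(200):
    # E-step: responsibility of each component for each month's count
    resp = pi * poisson.pmf(counts[:, None], lam)
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: update mixing weights and Poisson rates
    pi = resp.mean(axis=0)
    lam = (resp * counts[:, None]).sum(axis=0) / resp.sum(axis=0)

print("mixing weights:", np.round(pi, 2), "rates:", np.round(lam, 2))
# The fitted responsibilities classify each month as regular vs. irregular.
```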
Procedia PDF Downloads 127
3719 The Role of Temporary Migration as Coping Mechanism of Weather Shock: Evidence from Selected Semi-Arid Tropic Villages in India
Authors: Kalandi Charan Pradhan
Abstract:
In this study, we investigate whether weather variation determines temporary labour migration, using 210 sample households from six Semi-Arid Tropic (SAT) villages in India over the period 2005-2014. The study attempts to examine how households use temporary labour migration as a coping mechanism to minimise risk rather than to maximise household utility. The study employs a panel logit regression model to predict the probability of a household having at least one temporary labour migrant. According to the econometric results, along with demographic and socioeconomic factors, weather variation plays an important role in determining the migration decision at the household level. In order to capture weather variation, the study uses the mean crop yield deviation over the study period. Based on the random effects logit regression results, the study finds a concave relationship between weather variation and the decision of temporary labour migration. This argument supports the theory of the New Economics of Labour Migration (NELM), which holds that the labour migration decision not only maximises household utility but also helps to minimise risks.
Keywords: temporary migration, socioeconomic factors, weather variation, crop yield, logit estimation
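A hedged sketch of this kind of estimation: a logit on synthetic household-year data with a squared yield-deviation term, whose negative coefficient would indicate the concave relationship the study reports. The covariates and coefficients are assumptions, not the study's data, and a pooled logit stands in for the random-effects panel logit.

```python
# A pooled logit stand-in for the panel logit migration model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 2100                                   # 210 households x 10 years
yield_dev = rng.normal(0, 1, n)            # mean crop yield deviation
hh_size = rng.integers(2, 10, n)
land = rng.exponential(1.5, n)

# concave effect: migration propensity rises with the shock, then levels off
latent = 0.8 * yield_dev - 0.25 * yield_dev**2 + 0.15 * hh_size - 0.4 * land
migrant = (latent + rng.logistic(0, 1, n) > 0).astype(int)

X = sm.add_constant(np.column_stack([yield_dev, yield_dev**2, hh_size, land]))
fit = sm.Logit(migrant, X).fit(disp=0)
print(fit.params)  # a negative squared term indicates the concave relationship
```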
Procedia PDF Downloads 223