Search results for: gambling decision
3524 Implementation of Inference Fuzzy System as a Valuation Subsidiary is Based Particle Swarm Optimization for Solves the Issue of Decision Making in Middle Size Soccer Robot League
Authors: Zahra Abdolkarimi, Naser Zouri
Abstract:
The rapid growth of robotics has created a complex and motivating body of work in robotics and artificial intelligence, and RoboCup has become a mechatronics-flavoured testbed that combines theory and engineering practice. The RoboCup initiative also provides the robotics community with a degree of standardization and a widely discussed testing benchmark. Its declared long-term goal is to field, by 2050, an independent team of robots that can beat the human world champion team under FIFA rules. In the middle size soccer league, a robot's decision making depends on the reactions of the environment, its teammates and the rival players; this work uses a fuzzy inference system as a valuation subsidiary, tuned by Particle Swarm Optimization, to solve the robots' decision-making problem on the field. The evaluation criterion, used to compare the approach with other methods, is the percentage of victories achieved against the same team playing randomly. The results show that the combination of Particle Swarm Optimization and the fuzzy inference system outperforms the other robot decision-making algorithms considered. Keywords: PSO algorithm, inference fuzzy system, chaos theory, soccer robot league
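To make the approach concrete, the following is a minimal, hypothetical sketch (not the authors' implementation): a triangular fuzzy valuation scores a candidate action from two assumed inputs, distance to the ball and distance to the nearest opponent, and a plain global-best PSO tunes the membership parameters against a toy stand-in for the win-rate fitness described in the abstract. All names, parameters and the fitness function are illustrative assumptions.

```python
import random

def tri(x, a, b, c):
    """Triangular membership value of x for parameters a <= b <= c."""
    if x <= a or x >= c:
        return 0.0
    if x < b:
        return (x - a) / (b - a) if b > a else 1.0
    return (c - x) / (c - b) if c > b else 1.0

def action_score(dist_ball, dist_opponent, params):
    """Fuzzy valuation of one action: prefer being near the ball and far from opponents."""
    a1, b1, c1 = sorted(params[0:3])
    a2, b2, c2 = sorted(params[3:6])
    near_ball = tri(10.0 - dist_ball, a1, b1, c1)   # closeness grows as distance shrinks
    free_space = tri(dist_opponent, a2, b2, c2)
    return min(near_ball, free_space)               # Mamdani-style AND

def fitness(params, games=200):
    """Toy stand-in for 'win percentage against a randomly playing team'."""
    wins = 0
    for _ in range(games):
        d_ball, d_opp = random.uniform(0, 10), random.uniform(0, 10)
        if action_score(d_ball, d_opp, params) > 0.5:
            wins += 1
    return wins / games

def pso(dim=6, swarm=20, iters=50, lo=0.0, hi=10.0):
    """Plain global-best PSO over the membership parameters."""
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pbest_f = [fitness(p) for p in pos]
    g = max(range(swarm), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (0.7 * vel[i][d] + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            f = fitness(pos[i])
            if f > pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f > gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

if __name__ == "__main__":
    best_params, best_win_rate = pso()
    print(best_params, best_win_rate)
```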
Procedia PDF Downloads 405
3523 Intelligent Building as a Pragmatic Approach towards Achieving a Sustainable Environment
Authors: Zahra Hamedani
Abstract:
Many technological developments in recent years have opened up the possibility of using intelligent buildings for a number of important applications, ranging from minimizing resource usage and increasing building efficiency to maximizing comfort, adaptation to inhabitants and responsiveness to environmental changes. The concept of an intelligent building refers to a highly embedded, interactive environment that, by exploiting artificial intelligence, knows its own configuration, anticipates the optimum dynamic response to prevailing environmental stimuli, and actuates the appropriate physical reaction to provide comfort and efficiency. This paper gives a general characterization of the intelligence paradigm and its impact on the architectural arena and, by examining the performance of artificial intelligence, describes a mechanism for analysis and, ultimately, for decision-making to control the environment. The mechanism is a hierarchy of rational agents comprising decision-making, information, communication and physical layers. This multi-agent system relies upon machine learning techniques for automated discovery, prediction and decision-making. The application of this mechanism to the adaptation and responsiveness of an intelligent building is then presented at two scales, environmental and user. Finally, we review definitions of sustainability and evaluate the potential of intelligent building systems in the creation of a sustainable architecture and environment. Keywords: artificial intelligence, intelligent building, responsiveness, adaption, sustainability
Procedia PDF Downloads 411
3522 A Multi-Objective Decision Making Model for Biodiversity Conservation and Planning: Exploring the Concept of Interdependency
Authors: M. Mohan, J. P. Roise, G. P. Catts
Abstract:
Despite living in an era where conservation zones are de-facto the central element in any sustainable wildlife management strategy, we still find ourselves grappling with several pareto-optimal situations regarding resource allocation and area distribution for the same. In this paper, a multi-objective decision making (MODM) model is presented to answer the question of whether or not we can establish mutual relationships between these contradicting objectives. For our study, we considered a Red-cockaded woodpecker (Picoides borealis) habitat conservation scenario in the coastal plain of North Carolina, USA. Red-cockaded woodpecker (RCW) is a non-migratory territorial bird that excavates cavities in living pine trees for roosting and nesting. The RCW groups nest in an aggregation of cavity trees called ‘cluster’ and for our model we use the number of clusters to be established as a measure of evaluating the size of conservation zone required. The case study is formulated as a linear programming problem and the objective function optimises the Red-cockaded woodpecker clusters, carbon retention rate, biofuel, public safety and Net Present Value (NPV) of the forest. We studied the variation of individual objectives with respect to the amount of area available and plotted a two dimensional dynamic graph after establishing interrelations between the objectives. We further explore the concept of interdependency by integrating the MODM model with GIS, and derive a raster file representing carbon distribution from the existing forest dataset. Model results demonstrate the applicability of interdependency from both linear and spatial perspectives, and suggest that this approach holds immense potential for enhancing environmental investment decision making in future.Keywords: conservation, interdependency, multi-objective decision making, red-cockaded woodpecker
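As an illustration of how such a multi-objective allocation can be posed as a linear program, the sketch below uses a simple weighted-sum formulation with scipy; the coefficients, objectives and area figures are invented for demonstration and are not the study's data. Varying the available area and re-solving is one way to trace the kind of objective-versus-area curves the abstract describes.

```python
import numpy as np
from scipy.optimize import linprog

# Decision variables: hectares assigned to x1 = RCW cluster habitat, x2 = carbon/biofuel
# management, x3 = timber production (all figures below are illustrative, not from the study).
w = np.array([0.4, 0.3, 0.3])                 # weights on clusters, carbon, NPV
returns = np.array([[0.05, 0.02, 0.00],       # clusters supported per hectare
                    [2.10, 3.50, 1.20],       # tonnes C retained per hectare
                    [120., 80.0, 400.]])      # NPV ($ per hectare)

# Normalise each objective by its best single-use value so the weighted sum is unitless.
scaled = returns / returns.max(axis=1, keepdims=True)
c = -(w @ scaled)                             # linprog minimises, so negate

total_area = 10_000                           # hectares available
res = linprog(c, A_ub=[[1.0, 1.0, 1.0]], b_ub=[total_area],
              bounds=[(0, total_area)] * 3, method="highs")
print("area allocation (ha):", res.x)
print("RCW clusters:", returns[0] @ res.x, " carbon (t):", returns[1] @ res.x,
      " NPV ($):", returns[2] @ res.x)
```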
Procedia PDF Downloads 338
3521 Purchasing Decision-Making in Supply Chain Management: A Bibliometric Analysis
Authors: Ahlem Dhahri, Waleed Omri, Audrey Becuwe, Abdelwahed Omri
Abstract:
In industrial processes, decision-making ranges across different scales, from process control to supply chain management. The purchasing decision-making process in the supply chain is currently gaining attention as a critical contributor to a company's strategic success. Given the scarcity of thorough summaries in prior studies, this bibliometric analysis adopts a meticulous approach to building quantitative knowledge on the constantly evolving subject of purchasing decision-making in supply chain management. Through bibliometric analysis, we examine a sample of 358 peer-reviewed articles from the Scopus database. VOSviewer and Gephi software were employed to analyze, combine, and visualize the data. Data analytic techniques, including citation networks, PageRank analysis, co-citation, and publication trends, have been used to identify influential works and outline the discipline's intellectual structure. The outcomes of this descriptive analysis highlight the most prominent articles, authors, journals, and countries based on their citations and publications. The findings illustrate an increase in the number of publications, exhibiting a slightly growing trend in this field. Co-citation analysis coupled with content analysis of the most cited articles identified five research themes: integrating sustainability into the supplier selection process; supplier selection under disruption risks, assessment and mitigation strategies; fuzzy MCDM approaches for supplier evaluation and selection; purchasing decisions in vendor problems; and decision-making techniques in supplier selection and order lot sizing problems. With the help of a graphic timeline, this exhaustive map of the field visualizes the evolution of publications and demonstrates a gradual shift of research interest from vendor selection problems to integrating sustainability into the supplier selection process. These clusters offer insights into the wide variety of purchasing methods and conceptual frameworks that have emerged; however, they have not been validated empirically. The findings suggest that future research should provide a greater depth of practical and empirical analysis to enrich the theories. These outcomes provide a powerful road map for further study in this area. Keywords: bibliometric analysis, citation analysis, co-citation, Gephi, network analysis, purchasing, SCM, VOSviewer
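A minimal sketch of two of the analytic steps mentioned above (PageRank on a citation network and co-citation counting), using networkx on a made-up set of citation records rather than the Scopus export:

```python
import itertools
import networkx as nx

# Toy citation records: each tuple is (citing_paper, cited_paper). Real input would come
# from the exported Scopus records; these identifiers are invented for illustration.
citations = [("P1", "P3"), ("P1", "P4"), ("P2", "P3"), ("P2", "P4"),
             ("P5", "P3"), ("P5", "P1"), ("P6", "P4"), ("P6", "P1")]

G = nx.DiGraph(citations)

# PageRank highlights influential works in the citation network.
rank = nx.pagerank(G, alpha=0.85)
print(sorted(rank.items(), key=lambda kv: -kv[1]))

# Co-citation: two papers are co-cited when the same paper cites both of them.
cocitation = {}
for citing in G.nodes:
    cited = list(G.successors(citing))
    for a, b in itertools.combinations(sorted(cited), 2):
        cocitation[(a, b)] = cocitation.get((a, b), 0) + 1
print(cocitation)   # edge weights of the co-citation network used for clustering
```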
Procedia PDF Downloads 86
3520 Proposal of a Model Supporting Decision-Making on Information Security Risk Treatment
Authors: Ritsuko Kawasaki, Takeshi Hiromatsu
Abstract:
Management is required to understand all information security risks within an organization and to decide which risks should be treated, at what level, and with what budget. Such decision-making is not usually easy, because the various risk treatment measures must be selected together with suitable application levels, and some measures have objectives that conflict with one another, which makes the selection even more difficult. This paper therefore provides a model that supports the selection of measures by applying multi-objective analysis to find an optimal solution. Additionally, a list of measures is provided to make the selection easier and more effective without omitting any relevant measure. Keywords: information security risk treatment, selection of risk measures, risk acceptance, multi-objective optimization
Procedia PDF Downloads 380
3519 Studies of Rule Induction by STRIM from the Decision Table with Contaminated Attribute Values from Missing Data and Noise — in the Case of Critical Dataset Size —
Authors: Tetsuro Saeki, Yuichi Kato, Shoutarou Mizuno
Abstract:
STRIM (Statistical Test Rule Induction Method) has been proposed as a method to effectively induce if-then rules from the decision table, which is considered a sample set obtained from the population of interest. Its usefulness has been confirmed by simulation experiments specifying rules in advance, and by comparison with conventional methods. However, scope for development remains before STRIM can be applied to the analysis of real-world data sets. The first requirement is to determine the size of the dataset needed for inducing true rules, since finding statistically significant rules is the core of the method. The second is to examine the capacity of rule induction from datasets with contaminated attribute values created by missing data and noise, since real-world datasets usually contain such contaminated data. This paper examines the first problem theoretically, in connection with the rule length. The second problem is then examined in a simulation experiment, utilizing the critical dataset size derived from the first step. The experimental results show that STRIM is highly robust in the analysis of datasets with contaminated attribute values, and hence is applicable to real-world data. Keywords: rule induction, decision table, missing data, noise
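The statistical core of the abstract, testing whether a candidate if-then rule is significant, can be illustrated with a generic one-proportion z-test on a toy decision table; STRIM's actual test statistic follows the cited papers, so this is only the general idea:

```python
import math

# Toy decision table: each row is ((condition attribute values), decision class).
# A candidate rule "if C1 = 1 then D = 1" is judged significant when the proportion
# of D = 1 among rows matching the condition departs from the overall proportion.
rows = [((1, 0), 1), ((1, 1), 1), ((1, 0), 1), ((1, 1), 0),
        ((0, 0), 0), ((0, 1), 0), ((0, 0), 1), ((0, 1), 0)] * 25  # 200 samples

def rule_z_statistic(rows, attr_index, attr_value, decision_value):
    matching = [d for (c, d) in rows if c[attr_index] == attr_value]
    n = len(matching)
    p_hat = sum(1 for d in matching if d == decision_value) / n
    p0 = sum(1 for (_, d) in rows if d == decision_value) / len(rows)  # marginal rate
    se = math.sqrt(p0 * (1 - p0) / n)
    return (p_hat - p0) / se, n

z, n = rule_z_statistic(rows, attr_index=0, attr_value=1, decision_value=1)
print(f"z = {z:.2f} on n = {n} matching rows")   # |z| > 1.96 ~ significant at the 5% level
```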
Procedia PDF Downloads 396
3518 Aircraft Line Maintenance Equipped with Decision Support System
Authors: B. Sudarsan Baskar, S. Pooja Pragati, S. Raj Kumar
Abstract:
Cost effectiveness in aircraft maintenance has become a high priority in recent years. It can be achieved effectively when line maintenance activities are incorporated at airports during turnaround time (TAT). The present work identifies the shortcomings that affect the dispatch of aircraft, aiming at high fleet operability and low maintenance cost. The operational and cost constraints are discussed and an alternative mechanism is suggested. The possible allocation of all deferred maintenance tasks to a set of suitable airport resources is termed an alternative and is discussed in this paper using data collected from Kingfisher Airlines. Keywords: decision support system, aircraft maintenance planning, maintenance-cost, RUL (remaining useful life), logistics, supply chain management
Procedia PDF Downloads 506
3517 The Effect of Law on Politics
Authors: Boukrida Rafiq
Abstract:
Democracy is based on the notion that all citizens have the right to participate in the managing of political affairs and that every citizen's input is of equal importance. This basic assumption clearly places emphasis on public participation in maintaining a stable democracy. The level of public participation, however, is highly contested, with many theorists arguing that too much public participation would overwhelm and ultimately cripple democratic systems. On the other hand, those who favor high levels of participation argue that more citizen involvement leads to greater representation. Regardless of these disagreements over the ideal level of participation, there is widespread agreement amongst scholars that, at the very least, some participation is necessary to maintain democratic systems. The ways in which citizens participate vary greatly and, depending on the method used, influence political decision making to varying degrees. The method of political participation is key to controlling public influence over political affairs and is therefore also an integral part of maintaining democracy, whether it be "thin" (low levels of participation) or "robust" (high levels of participation). High levels of participation, or "robust" democracy, are argued by some theorists to enhance democracy by providing the opportunity for more issues to be represented during decision making. The notion of widespread participation was first advanced by classical theorists. Keywords: assumption clearly places emphasis, ultimately cripple, influence political decision making at varying, classical theorists
Procedia PDF Downloads 462
3516 Meteorological Risk Assessment for Ships with Fuzzy Logic Designer
Authors: Ismail Karaca, Ridvan Saracoglu, Omer Soner
Abstract:
Fuzzy Logic, an advanced method to support decision-making, is used by scientists in many disciplines. Fuzzy programming is a product of fuzzy logic, fuzzy rules, and implication. In marine science, fuzzy programming for ships is increasing rapidly alongside autonomous ship studies. In this paper, a program to support the decision-making process for ship navigation has been designed. The program is built on fuzzy logic and fuzzy rules, taking marine accidents and expert opinions into account. After the program was designed, it was tested on 46 ship accidents reported by the Transportation Safety Investigation Center of Turkey. Wind speed, sea condition, visibility, and day/night ratio have been used as input data. They have been converted into a risk factor within the Fuzzy Logic Designer application, using fuzzy rules set by marine experts. Finally, the experts' meteorological risk factor for each accident is compared with the program's risk factor, and the error rate is calculated. The main objective of this study is to improve the navigational safety of ships by using an advanced decision-support model. According to the study results, fuzzy programming is a robust model that supports safe navigation. Keywords: calculation of risk factor, fuzzy logic, fuzzy programming for ship, safety navigation of ships
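A reduced sketch of such a Mamdani-style risk model, written here with the scikit-fuzzy control API instead of the Fuzzy Logic Designer and using only two of the four inputs; the universes, membership functions and rules are illustrative assumptions, not the expert rules of the study:

```python
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl

# Universes of discourse (illustrative ranges, not the thresholds used in the study)
wind = ctrl.Antecedent(np.arange(0, 61, 1), 'wind')         # knots
vis = ctrl.Antecedent(np.arange(0, 11, 0.5), 'visibility')  # nautical miles
risk = ctrl.Consequent(np.arange(0, 101, 1), 'risk')        # meteorological risk factor

wind['calm'] = fuzz.trimf(wind.universe, [0, 0, 20])
wind['strong'] = fuzz.trimf(wind.universe, [10, 60, 60])
vis['poor'] = fuzz.trimf(vis.universe, [0, 0, 3])
vis['good'] = fuzz.trimf(vis.universe, [2, 10, 10])
risk['low'] = fuzz.trimf(risk.universe, [0, 0, 40])
risk['high'] = fuzz.trimf(risk.universe, [30, 100, 100])

rules = [
    ctrl.Rule(wind['strong'] | vis['poor'], risk['high']),   # expert-style rule
    ctrl.Rule(wind['calm'] & vis['good'], risk['low']),
]

sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
sim.input['wind'] = 35
sim.input['visibility'] = 1.5
sim.compute()
print(f"risk factor: {sim.output['risk']:.1f}")   # compare with the expert's risk rating
```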
Procedia PDF Downloads 191
3515 An Assessment of Airport Collaborative Decision-Making System Using Predictive Maintenance
Authors: Faruk Aras, Melih Inal, Tansel Cinar
Abstract:
The coordination of airport staff, especially in the operations and maintenance departments, is important for airport operation, and better coordination increases efficiency across all operations. A Collaborative Decision-Making (CDM) system therefore aims to improve the overall productivity of all operations by optimizing the use of resources and improving the predictability of actions. Increased productivity is of major benefit to all airport operations and also improves cost-efficiency. This study explains how predictive maintenance using IoT (Internet of Things), predictive operations and statistical data such as Mean Time To Failure (MTTF) improve airport terminal operations and the utilization of airport terminal equipment, in collaboration with the collaborative decision-making system/Airport Operation Control Center (AOCC). Data generated by the predictive maintenance methods are retrieved and analyzed by maintenance managers to predict when a problem is about to occur. With that information, maintenance can be scheduled when needed. For example, if the maintenance team works in collaboration with the AOCC, the AOCC operator can assign a new gate knowing that all the equipment leading to that gate, such as travellators, elevators and escalators, is operational, because predictive maintenance keeps the team aware of the health of the equipment. Applying predictive maintenance methods based on analyzing the health of airport terminal equipment dramatically reduces the risk of downtime through timely repairs. Calls can be classified as high priority, requiring urgent repair action; medium priority, requiring repair at the earliest opportunity; and low priority, allowing maintenance to be scheduled when convenient. In all cases, identifying potential problems early results in better allocation of airport terminal resources by the AOCC. Keywords: airport, predictive maintenance, collaborative decision-making system, Airport Operation Control Center (AOCC)
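A toy sketch of the MTTF-based prioritization idea; the failure records, thresholds and category cut-offs are invented for illustration:

```python
from statistics import mean

# Operating hours between consecutive failures, per equipment unit (hypothetical records;
# real data would come from IoT condition monitoring and the maintenance system).
failure_intervals = {
    "escalator_A12": [1900, 2100, 1750],
    "travellator_B03": [4200, 3900],
    "elevator_C07": [800, 650, 700, 720],
}

def mttf(intervals):
    """Mean Time To Failure estimated from observed intervals between failures."""
    return mean(intervals)

def priority(hours_since_repair, mttf_hours):
    """Rule-of-thumb bucketing into the three call categories mentioned in the abstract."""
    ratio = hours_since_repair / mttf_hours
    if ratio > 0.9:
        return "high (urgent repair)"
    if ratio > 0.6:
        return "medium (repair at earliest opportunity)"
    return "low (schedule when convenient)"

for unit, intervals in failure_intervals.items():
    m = mttf(intervals)
    # hours_since_repair would normally be read from the maintenance system; fixed here
    print(unit, f"MTTF={m:.0f}h", priority(hours_since_repair=0.8 * m, mttf_hours=m))
```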
Procedia PDF Downloads 365
3514 Option Pricing Theory Applied to the Service Sector
Authors: Luke Miller
Abstract:
This paper develops an options pricing methodology to value strategic pricing strategies in the services sector. More specifically, this study provides a unifying taxonomy of current service sector pricing practices, frames these pricing decisions as strategic real options, demonstrates accepted option valuation techniques to assess service sector pricing decisions, and suggests future research areas where pricing decisions and real options overlap. Enhancing revenue in the service sector requires proactive decision making in a world of uncertainty. In an effort to strategically price service products, revenue enhancement necessitates a careful study of the service costs, customer base, competition, legalities, and shared economies with the market. Pricing decisions involve the quality of inputs, manpower, and best practices to maintain superior service. These decisions further hinge on identifying relevant pricing strategies and understanding how these strategies impact a firm’s value. A relatively new area of research applies option pricing theory to investments in real assets and is commonly known as real options. The real options approach is based on the premise that many corporate decisions to invest or divest in assets are simply an option wherein the firm has the right to make an investment without any obligation to act. The decision maker, therefore, has more flexibility and the value of this operating flexibility should be taken into consideration. The real options framework has already been applied to numerous areas including manufacturing, inventory, natural resources, research and development, strategic decisions, technology, and stock valuation. Additionally, numerous surveys have identified a growing need for the real options decision framework within all areas of corporate decision-making. Despite the wide applicability of real options, no study has been carried out linking service sector pricing decisions and real options. This is surprising given the service sector comprises 80% of the US employment and Gross Domestic Product (GDP). Identifying real options as a practical tool to value different service sector pricing strategies is believed to have a significant impact on firm decisions. This paper identifies and discusses four distinct pricing strategies available to the service sector from an options’ perspective: (1) Cost-based profit margin, (2) Increased customer base, (3) Platform pricing, and (4) Buffet pricing. Within each strategy lie several pricing tactics available to the service firm. These tactics can be viewed as options the decision maker has to best manage a strategic position in the market. To demonstrate the effectiveness of including flexibility in the pricing decision, a series of pricing strategies were developed and valued using a real options binomial lattice structure. The options pricing approach discussed in this study allows service firms to directly incorporate market-driven perspectives into the decision process and thus synchronizing service operations with organizational economic goals.Keywords: option pricing theory, real options, service sector, valuation
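A minimal example of the binomial lattice valuation mentioned above, using the standard Cox-Ross-Rubinstein construction for a European-style option to invest; the project value, cost and volatility figures are hypothetical and unrelated to the pricing strategies analyzed in the paper:

```python
import math

def binomial_option_value(v0, cost, sigma, r, T, steps, option="call"):
    """Cox-Ross-Rubinstein lattice value of the right (not obligation) to pay `cost`
    for a project currently worth `v0`, exercisable at expiry (European-style)."""
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))       # up factor
    d = 1.0 / u                               # down factor
    p = (math.exp(r * dt) - d) / (u - d)      # risk-neutral probability
    disc = math.exp(-r * dt)

    # project values and option payoffs at the final nodes
    values = [v0 * (u ** j) * (d ** (steps - j)) for j in range(steps + 1)]
    payoff = [max(v - cost, 0.0) if option == "call" else max(cost - v, 0.0)
              for v in values]

    # roll back through the lattice
    for _ in range(steps):
        payoff = [disc * (p * payoff[j + 1] + (1 - p) * payoff[j])
                  for j in range(len(payoff) - 1)]
    return payoff[0]

# e.g. the flexibility to launch a new service line in 2 years: 20% volatility, 5% rate
print(binomial_option_value(v0=100.0, cost=110.0, sigma=0.20, r=0.05, T=2.0, steps=200))
```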
Procedia PDF Downloads 356
3513 Description of Decision Inconsistency in Intertemporal Choices and Representation of Impatience as a Reflection of Irrationality: Consequences in the Field of Personalized Behavioral Finance
Authors: Roberta Martino, Viviana Ventre
Abstract:
Empirical evidence has, over time, confirmed that the behavior of individuals is inconsistent with the descriptions provided by the Discounted Utility Model, an essential reference for calculating the utility of intertemporal prospects. The model assumes that individuals calculate the utility of intertemporal prospectuses by adding up the values of all outcomes obtained by multiplying the cardinal utility of the outcome by the discount function estimated at the time the outcome is received. The trend of the discount function is crucial for the preferences of the decision maker because it represents the perception of the future, and its trend causes temporally consistent or temporally inconsistent preferences. In particular, because different formulations of the discount function lead to various conclusions in predicting choice, the descriptive ability of models with a hyperbolic trend is greater than linear or exponential models. Suboptimal choices from any time point of view are the consequence of this mechanism, the psychological factors of which are encapsulated in the discount rate trend. In addition, analyzing the decision-making process from a psychological perspective, there is an equivalence between the selection of dominated prospects and a degree of impatience that decreases over time. The first part of the paper describes and investigates the anomalies of the discounted utility model by relating the cognitive distortions of the decision-maker to the emotional factors that are generated during the evaluation and selection of alternatives. Specifically, by studying the degree to which impatience decreases, it’s possible to quantify how the psychological and emotional mechanisms of the decision-maker result in a lack of decision persistence. In addition, this description presents inconsistency as the consequence of an inconsistent attitude towards time-delayed choices. The second part of the paper presents an experimental phase in which we show the relationship between inconsistency and impatience in different contexts. Analysis of the degree to which impatience decreases confirms the influence of the decision maker's emotional impulses for each anomaly in the utility model discussed in the first part of the paper. This work provides an application in the field of personalized behavioral finance. Indeed, the numerous behavioral diversities, evident even in the degrees of decrease in impatience in the experimental phase, support the idea that optimal strategies may not satisfy individuals in the same way. With the aim of homogenizing the categories of investors and to provide a personalized approach to advice, the results proven in the experimental phase are used in a complementary way with the information in the field of behavioral finance to implement the Analytical Hierarchy Process model in intertemporal choices, useful for strategic personalization. In the construction of the Analytic Hierarchy Process, the degree of decrease in impatience is understood as reflecting irrationality in decision-making and is therefore used for the construction of weights between anomalies and behavioral traits.Keywords: analytic hierarchy process, behavioral finance, financial anomalies, impatience, time inconsistency
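The inconsistency discussed above can be reproduced in a few lines by comparing an exponential discount function with a hyperbolic one on the same pair of delayed outcomes; the discount parameters and amounts are arbitrary:

```python
def exponential(t, delta=0.15):
    return (1 + delta) ** (-t)          # constant-rate discounting (Discounted Utility Model)

def hyperbolic(t, k=0.5):
    return 1.0 / (1 + k * t)            # declining impatience

def prefers_larger_later(discount, t_small, x_small, t_large, x_large):
    return x_large * discount(t_large) > x_small * discount(t_small)

# Choice: 100 at t = 10 vs 110 at t = 11, evaluated far in advance and at the moment of choice.
for name, f in (("exponential", exponential), ("hyperbolic", hyperbolic)):
    in_advance = prefers_larger_later(f, 10, 100, 11, 110)
    at_choice = prefers_larger_later(f, 0, 100, 1, 110)   # same pair seen from t = 10
    print(f"{name:12s} in advance -> larger-later: {in_advance}, "
          f"at the moment -> larger-later: {at_choice}")
# The exponential chooser is consistent; the hyperbolic chooser reverses the preference,
# which is the inconsistency the paper links to a decreasing degree of impatience.
```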
Procedia PDF Downloads 68
3512 Advanced Analytical Competency Is Necessary for Strategic Leadership to Achieve High-Quality Decision-Making
Authors: Amal Mohammed Alqahatni
Abstract:
This paper is a non-empirical analysis of existing literature on digital leadership competency, data-driven organizations, and dealing with AI technology (big data). This paper will provide insights into the importance of developing the leader’s analytical skills and style to be more effective for high-quality decision-making in a data-driven organization and achieve creativity during the organization's transformation to be digitalized. Despite the enormous potential that big data has, there are not enough experts in the field. Many organizations faced an issue with leadership style, which was considered an obstacle to organizational improvement. It investigates the obstacles to leadership style in this context and the challenges leaders face in coaching and development. The leader's lack of analytical skill with AI technology, such as big data tools, was noticed, as was the lack of understanding of the value of that data, resulting in poor communication with others, especially in meetings when the decision should be made. By acknowledging the different dynamics of work competency and organizational structure and culture, organizations can make the necessary adjustments to best support their leaders. This paper reviews prior research studies and applies what is known to assist with current obstacles. This paper addresses how analytical leadership will assist in overcoming challenges in a data-driven organization's work environment.Keywords: digital leadership, big data, leadership style, digital leadership challenge
Procedia PDF Downloads 69
3511 Fuzzy Linear Programming Approach for Determining the Production Amounts in Food Industry
Abstract:
In recent years, rapid and correct decision making has become crucial for both people and enterprises. However, uncertainty makes decision-making difficult. Fuzzy logic is used for coping with this situation, and fuzzy linear programming models have been developed to handle uncertainty in the objective function and the constraints. In this study, a problem of a factory in the food industry is investigated, the required data are obtained, and the problem is formulated as a fuzzy linear programming model. The model is solved using the Zimmermann approach, one of the standard approaches to fuzzy linear programming. As a result, the solution gives the amount of production for each product type that yields the maximum profit. Keywords: food industry, fuzzy linear programming, fuzzy logic, linear programming
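A small sketch of the Zimmermann max-min formulation solved with scipy's linprog; the profits, resource limits and tolerances below are invented, not the factory's data. The auxiliary variable lambda is the common satisfaction degree of the fuzzy profit goal and the fuzzy resource limits.

```python
from scipy.optimize import linprog

# Two products with hypothetical profits and resource usage.
profit = [40.0, 30.0]                   # per unit
use = [[2.0, 1.0],                      # machine hours per unit
       [1.0, 3.0]]                      # raw material (kg) per unit
b = [100.0, 90.0]                       # nominal resource limits
tol = [10.0, 15.0]                      # admissible violation of each limit (fuzziness)
z0, p0 = 2000.0, 400.0                  # lowest acceptable profit and its aspiration spread

# Variables: x1, x2, lam.  Zimmermann's max-min model maximises lam with
#   profit(x) >= z0 + lam * p0     and     use_i(x) <= b_i + (1 - lam) * tol_i
c = [0.0, 0.0, -1.0]                    # minimise -lam
A_ub = [
    [-profit[0], -profit[1], p0],       # -profit + lam * p0 <= -z0
    [use[0][0], use[0][1], tol[0]],     # use_1 + lam * tol_1 <= b_1 + tol_1
    [use[1][0], use[1][1], tol[1]],
]
b_ub = [-z0, b[0] + tol[0], b[1] + tol[1]]
bounds = [(0, None), (0, None), (0, 1)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
x1, x2, lam = res.x
print(f"produce {x1:.1f} and {x2:.1f} units, satisfaction degree lambda = {lam:.2f}")
```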
Procedia PDF Downloads 652
3510 A Comparative Study on Automatic Feature Classification Methods of Remote Sensing Images
Authors: Lee Jeong Min, Lee Mi Hee, Eo Yang Dam
Abstract:
Geospatial feature extraction is a very important issue in remote sensing research. Image classification has traditionally relied on statistical techniques but, in recent years, data mining and machine learning techniques for automated image processing have been applied to remote sensing, with a focus on the possibility of generating improved results. In this study, an artificial neural network and a decision tree technique are applied to classify high-resolution satellite images and are compared with the result of maximum likelihood classification (MLC), a statistical technique, together with an analysis of the pros and cons of each technique. Keywords: remote sensing, artificial neural network, decision tree, maximum likelihood classification
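A compact comparison in the spirit of the study, using synthetic per-pixel features and scikit-learn, with quadratic discriminant analysis standing in for the Gaussian maximum likelihood classifier; the data and accuracies are illustrative only:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.metrics import accuracy_score

# Stand-in for per-pixel spectral features of a high-resolution scene (4 bands, 3 classes).
X, y = make_classification(n_samples=3000, n_features=4, n_informative=4, n_redundant=0,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "MLC (Gaussian, as QDA)": QuadraticDiscriminantAnalysis(),   # statistical baseline
    "Decision tree": DecisionTreeClassifier(max_depth=8, random_state=0),
    "ANN (MLP)": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name:25s} overall accuracy = {accuracy_score(y_te, model.predict(X_te)):.3f}")
```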
Procedia PDF Downloads 347
3509 The Nexus of Decentralized Policy, Social Heterogeneity and Poverty in Equitable Forest Benefit Sharing in the Lowland Community Forestry Program of Nepal
Authors: Dhiraj Neupane
Abstract:
Decentralized policy and practice have largely concentrated on the transfer of decision-making authority from central to local institutions (or people) in the developing world. Such policy and practice have always aimed at the equitable and efficient management of resources in line with poverty reduction. The transfer of forest decision-making autonomy has likewise been promoted as the best forest management alternative to maximize forest benefits and improve the livelihood of local people living near the forests. However, social heterogeneity and the poor decision-making capacity of local institutions (or people) stand in tension with these policy objectives when resources are managed and forest benefits are shared among user households. The situation is severe in the lowland of Nepal, where forest resources have higher economic potential and user households have heterogeneous socio-economic conditions. The study found that, using their decision-making autonomy, user households set low prices for timber in the name of equitable access for all user households, timber being the most valuable product of the community forest. Because the society is heterogeneous in socio-economic conditions, households in better economic conditions were always taking a higher share of the forest benefits. The low valuation of timber has negative consequences for equitable benefit sharing and provides poor support to the livelihood improvement of user households. Moreover, low valuation is likely to increase local demand for timber and the human pressure on forests. Keywords: decentralized forest policy, Nepal, poverty, social heterogeneity, Terai
Procedia PDF Downloads 288
3508 The Implementation of the Human Right of Self-Determination: the Example of Nagorno-Karabakh Republic
Authors: S. Vlasyan
Abstract:
The article deals with the implementation of the right to self-determination of peoples on the example of Nagorno-Karabakh Republic. The problem of correlation of two fundamental principles of international law i. e. territorial integrity and the right to self-determination of peoples is considered to be one of the vital issues in the field of international law for several decades. So, in this article, the author analyzes the decision of the Supreme Court of Canada regarding specific issues of secession of Quebec from Canada, as well as the decision of the International Court of Justice in the case concerning East Timor (Portugal v. Australia), and in the case of Western Sahara. The author formulates legal conditions of Nagorno-Karabakh secession.Keywords: right of self-determination, territorial integrity, the principles of International Law, Nagorno-Karabakh Republic
Procedia PDF Downloads 410
3507 A Condition-Based Maintenance Policy for Multi-Unit Systems Subject to Deterioration
Authors: Nooshin Salari, Viliam Makis
Abstract:
In this paper, we propose a condition-based maintenance policy for multi-unit systems considering the existence of economic dependency among units. We consider a system composed of N identical units, where each unit deteriorates independently. Deterioration process of each unit is modeled as a three-state continuous time homogeneous Markov chain with two working states and a failure state. The average production rate of units varies in different working states and demand rate of the system is constant. Units are inspected at equidistant time epochs, and decision regarding performing maintenance is determined by the number of units in the failure state. If the total number of units in the failure state exceeds a critical level, maintenance is initiated, where units in failed state are replaced correctively and deteriorated state units are maintained preventively. Our objective is to determine the optimal number of failed units to initiate maintenance minimizing the long run expected average cost per unit time. The problem is formulated and solved in the semi-Markov decision process (SMDP) framework. A numerical example is developed to demonstrate the proposed policy and the comparison with the corrective maintenance policy is presented.Keywords: reliability, maintenance optimization, semi-Markov decision process, production
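The paper solves the problem in the SMDP framework; as a rough, assumption-laden illustration of the threshold policy itself, the following brute-force simulation evaluates the long-run average cost for each possible number of failed units that triggers maintenance (the transition probabilities and costs are invented):

```python
import random

# Each unit: states 0 (good), 1 (deteriorated), 2 (failed); per-inspection-interval
# transition probabilities discretised from the continuous-time chain (illustrative values).
P = {0: [0.90, 0.08, 0.02],
     1: [0.00, 0.85, 0.15],
     2: [0.00, 0.00, 1.00]}
N = 10                                            # identical units
C_PM, C_CM, C_SETUP, C_LOST = 50, 200, 300, 30    # preventive, corrective, setup, lost production

def average_cost(threshold, horizon=50_000):
    """Simulated long-run average cost per inspection interval for a given threshold."""
    states, cost = [0] * N, 0.0
    for _ in range(horizon):
        states = [random.choices([0, 1, 2], weights=P[s])[0] for s in states]
        failed = states.count(2)
        cost += C_LOST * failed                   # failed units produce nothing
        if failed >= threshold:                   # maintenance initiated at inspection
            cost += C_SETUP + C_CM * failed + C_PM * states.count(1)
            states = [0] * N                      # failed replaced, deteriorated restored
    return cost / horizon

best = min(range(1, N + 1), key=average_cost)
print("number of failed units that should initiate maintenance:", best)
```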
Procedia PDF Downloads 165
3506 Integrating Machine Learning and Rule-Based Decision Models for Enhanced B2B Sales Forecasting and Customer Prioritization
Authors: Wenqi Liu, Reginald Bailey
Abstract:
This study proposes a comprehensive and effective approach to business-to-business (B2B) sales forecasting by integrating advanced machine learning models with a rule-based decision-making framework. The methodology addresses the critical challenge of optimizing sales pipeline performance and improving conversion rates through predictive analytics and actionable insights. The first component involves developing a classification model to predict the likelihood of conversion, aiming to outperform traditional methods such as logistic regression in terms of accuracy, precision, recall, and F1 score. Feature importance analysis highlights key predictive factors, such as client revenue size and sales velocity, providing valuable insights into conversion dynamics. The second component focuses on forecasting sales value using a regression model, designed to achieve superior performance compared to linear regression by minimizing mean absolute error (MAE), mean squared error (MSE), and maximizing R-squared metrics. The regression analysis identifies primary drivers of sales value, further informing data-driven strategies. To bridge the gap between predictive modeling and actionable outcomes, a rule-based decision framework is introduced. This model categorizes leads into high, medium, and low priorities based on thresholds for conversion probability and predicted sales value. By combining classification and regression outputs, this framework enables sales teams to allocate resources effectively, focus on high-value opportunities, and streamline lead management processes. The integrated approach significantly enhances lead prioritization, increases conversion rates, and drives revenue generation, offering a robust solution to the declining pipeline conversion rates faced by many B2B organizations. Our findings demonstrate the practical benefits of blending machine learning with decision-making frameworks, providing a scalable, data-driven solution for strategic sales optimization. This study underscores the potential of predictive analytics to transform B2B sales operations, enabling more informed decision-making and improved organizational outcomes in competitive markets.Keywords: machine learning, XGBoost, regression, decision making framework, system engineering
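A condensed sketch of the three-layer idea (conversion classifier, value regressor, rule-based prioritization) on synthetic leads, using scikit-learn gradient boosting in place of the XGBoost models and invented thresholds:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
# Hypothetical lead features: client revenue size, sales velocity (days in stage), engagement
X = np.column_stack([rng.lognormal(3, 1, n), rng.uniform(1, 90, n), rng.uniform(0, 1, n)])
converted = (0.3 * np.log(X[:, 0]) - 0.01 * X[:, 1] + X[:, 2] + rng.normal(0, 0.5, n)) > 1.2
deal_value = X[:, 0] * (5 + 10 * X[:, 2]) + rng.normal(0, 50, n)

X_tr, X_te, c_tr, c_te, v_tr, v_te = train_test_split(X, converted, deal_value, random_state=0)

clf = GradientBoostingClassifier().fit(X_tr, c_tr)              # conversion likelihood
reg = GradientBoostingRegressor().fit(X_tr[c_tr], v_tr[c_tr])   # sales value, fitted on won deals

def prioritize(p_convert, pred_value, p_hi=0.6, v_hi=1500, p_lo=0.3):
    """Rule layer on top of the two models (thresholds are illustrative)."""
    if p_convert >= p_hi and pred_value >= v_hi:
        return "high"
    if p_convert >= p_lo:
        return "medium"
    return "low"

p = clf.predict_proba(X_te)[:, 1]
v = reg.predict(X_te)
labels = [prioritize(pi, vi) for pi, vi in zip(p, v)]
print({lab: labels.count(lab) for lab in ("high", "medium", "low")})
```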
Procedia PDF Downloads 25
3505 Structural Health Monitoring-Integrated Structural Reliability Based Decision Making
Authors: Caglayan Hizal, Kutay Yuceturk, Ertugrul Turker Uzun, Hasan Ceylan, Engin Aktas, Gursoy Turan
Abstract:
Monitoring concepts for structural systems have been investigated by researchers for decades since such tools are quite convenient to determine intervention planning of structures. Despite the considerable development in this regard, the efficient use of monitoring data in reliability assessment, and prediction models are still in need of improvement in their efficiency. More specifically, reliability-based seismic risk assessment of engineering structures may play a crucial role in the post-earthquake decision-making process for the structures. After an earthquake, professionals could identify heavily damaged structures based on visual observations. Among these, it is hard to identify the ones with minimum signs of damages, even if they would experience considerable structural degradation. Besides, visual observations are open to human interpretations, which make the decision process controversial, and thus, less reliable. In this context, when a continuous monitoring system has been previously installed on the corresponding structure, this decision process might be completed rapidly and with higher confidence by means of the observed data. At this stage, the Structural Health Monitoring (SHM) procedure has an important role since it can make it possible to estimate the system reliability based on a recursively updated mathematical model. Therefore, integrating an SHM procedure into the reliability assessment process comes forward as an important challenge due to the arising uncertainties for the updated model in case of the environmental, material and earthquake induced changes. In this context, this study presents a case study on SHM-integrated reliability assessment of the continuously monitored progressively damaged systems. The objective of this study is to get instant feedback on the current state of the structure after an extreme event, such as earthquakes, by involving the observed data rather than the visual inspections. Thus, the decision-making process after such an event can be carried out on a rational basis. In the near future, this can give wing to the design of self-reported structures which can warn about its current situation after an extreme event.Keywords: condition assessment, vibration-based SHM, reliability analysis, seismic risk assessment
Procedia PDF Downloads 145
3504 Towards a Framework for Embedded Weight Comparison Algorithm with Business Intelligence in the Plantation Domain
Authors: M. Pushparani, A. Sagaya
Abstract:
Embedded systems have emerged as important elements in various domains, with extensive applications in the automotive, commercial, consumer, healthcare and transportation markets, as the emphasis on intelligent devices grows. Business Intelligence (BI), in turn, has been used extensively in a range of applications, especially in the agriculture domain, which is the area of this research. The aim of this research is to create a framework for an Embedded Weight Comparison Algorithm with Business Intelligence (EWCA-BI). The weight comparison algorithm will be embedded within the plantation management system and the weighbridge system. It will estimate the weight at the site, compare it with the actual weight recorded at the plantation, and raise the necessary alerts when there is a discrepancy, thus enabling better decision making. In current practice, data are collected from various locations in various forms, and it is a challenge to consolidate them to obtain timely and accurate information for effective decision making. In addition, unstable network connections make it difficult to obtain timely and accurate information. To overcome these challenges, the weight comparison algorithm is embedded on a portable device that also assists in data capture and synchronizes data across locations, overcoming the network shortcomings at collection points. The EWCA-BI will provide real-time information at any given point in time, enabling non-latent BI reports that provide crucial information for efficient operational decision making. This research has high potential for bringing embedded systems into the agriculture industry. EWCA-BI will provide BI reports with accurate, uncompromised data using an embedded system and will raise alerts, thereby enabling effective operation management decision-making at the site. Keywords: embedded business intelligence, weight comparison algorithm, oil palm plantation, embedded systems
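A minimal, hypothetical version of the weight comparison and alerting logic described above (the tolerances, alert levels and figures are assumptions):

```python
def weight_alert(estimated_kg, weighbridge_kg, tolerance_pct=2.0):
    """Compare the field estimate against the weighbridge reading and flag discrepancies.
    The tolerance and alert levels are illustrative, not the thresholds used in the study."""
    diff_pct = abs(estimated_kg - weighbridge_kg) / weighbridge_kg * 100
    if diff_pct <= tolerance_pct:
        return None
    level = "critical" if diff_pct > 2 * tolerance_pct else "warning"
    return {"level": level, "difference_pct": round(diff_pct, 2),
            "estimated_kg": estimated_kg, "weighbridge_kg": weighbridge_kg}

# Records synchronised from the portable device at a collection point (made-up figures)
loads = [("lorry-07", 8_350, 8_290), ("lorry-12", 7_900, 8_610), ("lorry-15", 9_100, 9_080)]
for lorry, estimate, actual in loads:
    alert = weight_alert(estimate, actual)
    if alert:
        print(lorry, alert)    # fed into the BI report for operational decision making
```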
Procedia PDF Downloads 287
3503 Decision Location and Resource Requirement for Relief Goods Assembly
Authors: Glenda B. Minguito, Jenith L. Banluta
Abstract:
One of the critical aspects of humanitarian operations is the distribution of relief goods to the affected community. The common assumption is that relief goods are prepositioned before disasters, which is not applicable in developing countries like the Philippines. During disasters, the on-the-ground government agencies and responders have to procure, sort, weigh and pack the relief goods. There is a need to review relief goods preparation, as it seriously affects the delivery of aid necessary for human survival. This study also identifies the ideal location of the assembly hub so as to minimize the distance to the affected community. The paper reveals that location and resources depend on the type of disaster encountered at the local level. The Center-of-Gravity method and Multiple Activity Chart were applied in the analysis. Keywords: humanitarian supply chain, location decision, resource allocation, local level
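The Center-of-Gravity step can be illustrated in a few lines; the coordinates and household counts below are invented for demonstration:

```python
def center_of_gravity(points):
    """Weighted centroid of affected-community locations.
    points: list of (x, y, weight) where weight is e.g. number of affected households."""
    total = sum(w for _, _, w in points)
    x = sum(px * w for px, _, w in points) / total
    y = sum(py * w for _, py, w in points) / total
    return x, y

# Hypothetical community coordinates (km on a local grid) and affected households
communities = [(2.0, 3.5, 1200), (8.5, 1.0, 400), (5.0, 9.0, 800), (1.5, 7.5, 300)]
hub_x, hub_y = center_of_gravity(communities)
print(f"candidate assembly hub at ({hub_x:.2f}, {hub_y:.2f})")
```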
Procedia PDF Downloads 151
3502 Transparency of Algorithmic Decision-Making: Limits Posed by Intellectual Property Rights
Authors: Olga Kokoulina
Abstract:
Today, algorithms are assuming a leading role in various areas of decision-making. Prompted by a promise to provide increased economic efficiency and fuel solutions for pressing societal challenges, algorithmic decision-making is often celebrated as an impartial and constructive substitute for human adjudication. But in the face of this implied objectivity and efficiency, the application of algorithms is also marred with mounting concerns about embedded biases, discrimination, and exclusion. In Europe, vigorous debates on risks and adverse implications of algorithmic decision-making largely revolve around the potential of data protection laws to tackle some of the related issues. For example, one of the often-cited venues to mitigate the impact of potentially unfair decision-making practice is a so-called 'right to explanation'. In essence, the overall right is derived from the provisions of the General Data Protection Regulation (‘GDPR’) ensuring the right of data subjects to access and mandating the obligation of data controllers to provide the relevant information about the existence of automated decision-making and meaningful information about the logic involved. Taking corresponding rights and obligations in the context of the specific provision on automated decision-making in the GDPR, the debates mainly focus on efficacy and the exact scope of the 'right to explanation'. In essence, the underlying logic of the argued remedy lies in a transparency imperative. Allowing data subjects to acquire as much knowledge as possible about the decision-making process means empowering individuals to take control of their data and take action. In other words, forewarned is forearmed. The related discussions and debates are ongoing, comprehensive, and, often, heated. However, they are also frequently misguided and isolated: embracing the data protection law as ultimate and sole lenses are often not sufficient. Mandating the disclosure of technical specifications of employed algorithms in the name of transparency for and empowerment of data subjects potentially encroach on the interests and rights of IPR holders, i.e., business entities behind the algorithms. The study aims at pushing the boundaries of the transparency debate beyond the data protection regime. By systematically analysing legal requirements and current judicial practice, it assesses the limits of the transparency requirement and right to access posed by intellectual property law, namely by copyrights and trade secrets. It is asserted that trade secrets, in particular, present an often-insurmountable obstacle for realising the potential of the transparency requirement. In reaching that conclusion, the study explores the limits of protection afforded by the European Trade Secrets Directive and contrasts them with the scope of respective rights and obligations related to data access and portability enshrined in the GDPR. As shown, the far-reaching scope of the protection under trade secrecy is evidenced both through the assessment of its subject matter as well as through the exceptions from such protection. As a way forward, the study scrutinises several possible legislative solutions, such as flexible interpretation of the public interest exception in trade secrets as well as the introduction of the strict liability regime in case of non-transparent decision-making.Keywords: algorithms, public interest, trade secrets, transparency
Procedia PDF Downloads 125
3501 Knowledge, Hierarchy and Decision-Making: Analysis of Documentary Filmmaking Practices in India
Authors: Nivedita Ghosh
Abstract:
In his critique of Lefebvre’s view that ‘technological capacities’ are class-dependent, Francois Hetman argues that technology today is participatory, allowing the entry of individuals from different levels of social stratification. As a result, we are entering into an era of technology operators or ‘clerks’ who become the new decision-makers because of the knowledge they possess of the use of technologies. In response to Hetman’s thesis, this paper argues that knowledge of technology, while indeed providing a momentary space for decision-making, does not necessarily restructure social hierarchies. Through case studies presented from the world of Indian documentary filmmaking, this paper puts forth the view that Hetman’s clerks, despite being technologically advanced, do not break into the filmmaking hierarchical order. This remains true even for a situation where technical knowledge rests most with those in the lowest rungs of the filmmaking ladder. Instead, technological knowledge provides the space for other kinds of relationships to evolve, such as those of ‘trusting the technician’ or ‘admiration for the technician’s work’. Furthermore, what continues to define documentary filmmaking hierarchy is conceptualization capacities of the practitioners, which are influenced by a similarity in socio-cultural backgrounds and film school training accessible primarily to the filmmakers instead of the technicians. Accordingly, the paper concludes with the argument that more than ‘technological-capacities’, it is ‘conceptualization capacities’ which are class-dependent, especially when we study the field of documentary filmmaking.Keywords: documentary filmmaking, India, technology, knowledge, hierarchy
Procedia PDF Downloads 262
3500 Requirements Definitions of Real-Time System Using the Behavioral Patterns Analysis (BPA) Approach: The Healthcare Multi-Agent System
Authors: Assem El-Ansary
Abstract:
This paper illustrates the event-oriented Behavioral Pattern Analysis (BPA) modeling approach using the Healthcare Multi-Agent System. The Event defined in BPA is a real-life conceptual entity that is unrelated to any implementation. The major contributions of this research are: The Behavioral Pattern Analysis (BPA) modeling methodology. The development of an interactive software tool (DECISION), which is based on a combination of the Analytic Hierarchy Process (AHP) and the ELECTRE Multi-Criteria Decision Making (MCDM) methods.Keywords: analysis, modeling methodology, software modeling, event-oriented, behavioral pattern, use cases, Healthcare Multi-Agent System
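The AHP part of such a decision tool can be illustrated with the standard principal-eigenvector computation of priority weights and Saaty's consistency ratio; the pairwise comparison matrix below is an invented example, not one from the DECISION tool:

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from a pairwise comparison matrix (principal eigenvector method)
    plus Saaty's consistency ratio. The matrix passed in is illustrative only."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)                    # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}.get(n, 1.32)  # Saaty's random index
    return w, ci / ri

# Pairwise judgements over three hypothetical criteria (e.g. cost, responsiveness, safety)
M = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
weights, cr = ahp_weights(M)
print("weights:", np.round(weights, 3), "consistency ratio:", round(cr, 3))
```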
Procedia PDF Downloads 552
3499 Presentation of International Military Intervention Correlates (IMIC) Database
Authors: Daniil Chernov
Abstract:
In the modern world, the number of conventional interstate wars is declining while the number of military interventions is rising. States no longer initiate conflicts by declaring war but actively intervene in existing military confrontations, often using a comparable number of coercive means. According to existing scholarly understanding, the decision to use force in international relations (in any form) is influenced by roughly the same set of factors: the dynamics of domestic political processes, national interests, international law, and ethical considerations. In the database on armed intervention to be presented in the report, the multifactor model of decision-making is developed. The database describes more than 200 different parameters for armed interventions between 1992 and 2022. The report will present the structure of the database, descriptive statistics, and its key advantages over other sources.Keywords: conflict resolution, international relations, military intervention, database
Procedia PDF Downloads 42
3498 The Promise of Nunca Más after Cambiemos: Representations of the 2x1 Decision of the Supreme Court and Santiago Maldonado's Disappearance in the Newspaper La Nación
Authors: Uluhan Berk Ondul
Abstract:
This article aims to shed light on the new stage of transitional justice in Argentina by examining the representations of the 2x1 decision of the Supreme Court and Santiago Maldonado's disappearance in the newspaper La Nación. The two events hold the key to understanding Argentina's journey since its return to democracy, as they concern the same crimes of the dictatorship, namely the forced disappearance of civilians and the impunity that followed. In the case of a convicted torturer, the Supreme Court of Argentina ruled on 3 May 2017 that the days spent in preventive detention after two years should count double towards the overall sentence. This court decision was met with severe resistance from members of parliament as well as from the human rights movement. The second event is still unfolding and divides the country into two camps: (1) those who think that the police force has committed another act of forced disappearance in the case of the activist Santiago Maldonado, and (2) those who accuse the peronistas (the party and supporters of the ex-president Cristina Fernandez de Kirchner) of using this subject as a means to score political points. As a newspaper known for its proximity to the current administration, La Nación offers insight into the direction of the country and also demonstrates how the neoliberal mindset works. The results of the study show that the transitional justice process in Argentina is far from complete, as the promise of Nunca Más is still not a shared value but a political statement. Keywords: Argentina, Fallo 2x1, impunity, Santiago Maldonado, transitional justice
Procedia PDF Downloads 231
3497 Lobbying Regulation in the EU: Transparency’s Achilles’ Heel
Authors: Krambia-Kapardis Maria, Neophytidou Christina
Abstract:
Lobbying is an inherent aspect of democratic regimes across the globe. Although it can provide decision-makers with valuable knowledge and grant stakeholders access to the decision-making process, it can also lead to undue influence and unfair competition at the expense of the public interest if it is not transparent. Given the multi-level governance structure of the EU, it is no surprise that the EU policy-making arena has become a place-to-be for lobbyists. However, in order to ensure that influence is legitimate and not biased by business interests, lobbying must be effectively regulated. A comparison with the US and Canadian lobbying regulatory frameworks, together with some good practices from EU countries, makes it apparent that lobbying is the Achilles' heel of transparency in the EU. It is evident that EU institutions suffer from ineffective regulation and could in fact benefit from a more robust, mandatory and better implemented system of lobbying regulation. Keywords: EU, lobbying regulation, transparency, democratic regimes
Procedia PDF Downloads 424
3496 Intelligent Agent Travel Reservation System Requirements Definitions Using the Behavioral Patterns Analysis (BPA) Approach
Authors: Assem El-Ansary
Abstract:
This paper illustrates the event-oriented Behavioral Pattern Analysis (BPA) modeling approach in developing an Intelligent Agent Reservation System (IARS). The Event defined in BPA is a real-life conceptual entity that is unrelated to any implementation. The major contributions of this research are developing the Behavioral Pattern Analysis (BPA) modeling methodology, and developing an interactive software tool (DECISION) which is based on a combination of the Analytic Hierarchy Process (AHP) and the ELECTRE Multi-Criteria Decision Making (MCDM) methods.Keywords: analysis, intelligent agent, reservation system, modeling methodology, software modeling, event-oriented, behavioral pattern, use cases
Procedia PDF Downloads 485
3495 A Data-Mining Model for Protection of FACTS-Based Transmission Line
Authors: Ashok Kalagura
Abstract:
This paper presents a data-mining model for fault-zone identification of flexible AC transmission systems (FACTS)-based transmission lines, including a thyristor-controlled series compensator (TCSC) and a unified power-flow controller (UPFC), using ensembles of decision trees. Given the randomness in the ensemble of decision trees stacked inside the random forests model, it provides an effective decision for fault-zone identification. Half-cycle post-fault current and voltage samples from the fault inception are used as the input vector, against a target output of '1' for a fault after the TCSC/UPFC and '0' for a fault before the TCSC/UPFC. The algorithm is tested on simulated fault data with wide variations in the operating parameters of the power system network, including a noisy environment, providing a reliability measure of 99% with a fast response time (3/4 of a cycle from fault inception). The results of the presented approach using the RF model indicate reliable identification of the fault zone in FACTS-based transmission lines. Keywords: distance relaying, fault-zone identification, random forests, RFs, support vector machine, SVM, thyristor-controlled series compensator, TCSC, unified power-flow controller, UPFC
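A schematic version of the random-forest fault-zone classifier on synthetic half-cycle feature vectors; the data, labels and feature layout are placeholders for the simulated fault records used in the paper:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
n = 1000
# Stand-in for half-cycle post-fault current/voltage samples at the relay location
# (e.g. 10 samples per phase quantity); real vectors would come from EMT simulations.
X = rng.normal(size=(n, 40))
# Label 1: fault beyond the TCSC/UPFC, 0: fault before it (synthetic decision boundary)
y = (X[:, :10].mean(axis=1) + 0.5 * X[:, 20:30].mean(axis=1) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
rf = RandomForestClassifier(n_estimators=200, random_state=1).fit(X_tr, y_tr)
print("fault-zone identification accuracy:", accuracy_score(y_te, rf.predict(X_te)))
```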
Procedia PDF Downloads 424