Search results for: decision support technique
14655 Option Pricing Theory Applied to the Service Sector
Authors: Luke Miller
Abstract:
This paper develops an options pricing methodology to value strategic pricing strategies in the services sector. More specifically, this study provides a unifying taxonomy of current service sector pricing practices, frames these pricing decisions as strategic real options, demonstrates accepted option valuation techniques to assess service sector pricing decisions, and suggests future research areas where pricing decisions and real options overlap. Enhancing revenue in the service sector requires proactive decision making in a world of uncertainty. In an effort to strategically price service products, revenue enhancement necessitates a careful study of the service costs, customer base, competition, legalities, and shared economies with the market. Pricing decisions involve the quality of inputs, manpower, and best practices to maintain superior service. These decisions further hinge on identifying relevant pricing strategies and understanding how these strategies impact a firm’s value. A relatively new area of research applies option pricing theory to investments in real assets and is commonly known as real options. The real options approach is based on the premise that many corporate decisions to invest or divest in assets are simply options wherein the firm has the right to make an investment without any obligation to act. The decision maker, therefore, has more flexibility, and the value of this operating flexibility should be taken into consideration. The real options framework has already been applied to numerous areas including manufacturing, inventory, natural resources, research and development, strategic decisions, technology, and stock valuation. Additionally, numerous surveys have identified a growing need for the real options decision framework within all areas of corporate decision-making. Despite the wide applicability of real options, no study has been carried out linking service sector pricing decisions and real options. This is surprising given that the service sector comprises 80% of US employment and Gross Domestic Product (GDP). Identifying real options as a practical tool to value different service sector pricing strategies is believed to have a significant impact on firm decisions. This paper identifies and discusses four distinct pricing strategies available to the service sector from an options perspective: (1) Cost-based profit margin, (2) Increased customer base, (3) Platform pricing, and (4) Buffet pricing. Within each strategy lie several pricing tactics available to the service firm. These tactics can be viewed as options the decision maker has to best manage a strategic position in the market. To demonstrate the effectiveness of including flexibility in the pricing decision, a series of pricing strategies were developed and valued using a real options binomial lattice structure. The options pricing approach discussed in this study allows service firms to directly incorporate market-driven perspectives into the decision process, thereby synchronizing service operations with organizational economic goals.
Keywords: option pricing theory, real options, service sector, valuation
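As an editorial illustration of the binomial lattice valuation this abstract relies on, the sketch below values a simple deferral-type real option on a pricing strategy with a Cox-Ross-Rubinstein lattice. All inputs (project value, exercise cost, volatility, rates) are hypothetical and are not taken from the paper.

```python
# Illustrative sketch (not the authors' model): valuing the flexibility to adopt a
# pricing strategy as a real option via a Cox-Ross-Rubinstein binomial lattice.
import math

def real_option_binomial(V0, K, r, sigma, T, steps, option="call"):
    """Value the flexibility to adopt (call) or abandon (put) a strategy.

    V0    : present value of cash flows from the strategy (hypothetical)
    K     : investment/switching cost to exercise the strategy
    r     : risk-free rate, sigma : volatility of the project value
    T     : option life in years, steps : number of lattice steps
    """
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))          # up factor
    d = 1.0 / u                                  # down factor
    p = (math.exp(r * dt) - d) / (u - d)         # risk-neutral probability
    disc = math.exp(-r * dt)

    # terminal payoffs at each lattice node (j = number of up moves)
    values = []
    for j in range(steps + 1):
        VT = V0 * (u ** j) * (d ** (steps - j))
        payoff = max(VT - K, 0.0) if option == "call" else max(K - VT, 0.0)
        values.append(payoff)

    # backward induction through the lattice
    for i in range(steps - 1, -1, -1):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j]) for j in range(i + 1)]
    return values[0]

# Hypothetical example: option to launch a platform-pricing scheme within 2 years.
print(real_option_binomial(V0=100.0, K=95.0, r=0.03, sigma=0.35, T=2.0, steps=200))
```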
Procedia PDF Downloads 355
14654 Study of Two MPPTs for Photovoltaic Systems Using Controllers Based in Fuzzy Logic and Sliding Mode
Authors: N. Ould cherchali, M. S. Boucherit, L. Barazane, A. Morsli
Abstract:
Photovoltaic power is widely used to supply isolated or unpopulated areas (lighting, pumping, etc.). A great advantage is that this source is inexhaustible, offers great safety in use, and is clean. However, the dynamic models used to describe a photovoltaic system are complicated and nonlinear, and due to the nonlinear I-V and P-V characteristics of photovoltaic generators, a maximum power point tracking technique (MPPT) is required to maximize the output power. In this paper, two online maximum power point tracking techniques using robust controllers are proposed for photovoltaic systems: the first technique uses a fuzzy logic controller (FLC) and the second uses a sliding mode controller (SMC). The two maximum power point tracking controllers receive the partial derivative of power as input, and the output is the duty cycle corresponding to maximum power. A photovoltaic generator with a boost converter is developed using MATLAB/Simulink to verify the performance of the proposed techniques. The SMC technique provides good tracking speed under fast-changing irradiation, while when the irradiation changes slowly or is constant, the panel power obtained with the FLC technique presents a much smoother signal with fewer fluctuations.
Keywords: fuzzy logic controller, maximum power point, photovoltaic system, tracker, sliding mode controller
Procedia PDF Downloads 547
14653 Logistics Support as a Key Success Factor in Gastronomy
Authors: Hanna Zietara
Abstract:
Gastronomy is one of the oldest forms of commercial activity. It is currently one of the most popular and still dynamically developing branches of business. Socio-economic changes, its widespread occurrence, and new techniques or culinary styles create almost unlimited possibilities for its development. Importantly, regardless of the form of business adopted, food service is strongly related to logistics processes, and areas of food service that are closely linked to logistics are of strategic importance. Any inefficiency in logistics processes results in reduced chances for success and for achieving competitive advantage by companies belonging to the catering industry. The aim of the paper is to identify the areas of logistics support occurring in the catering business that affect the scope of the logistics processes implemented. This aim is realized through a plural homogeneous approach based on direct observation, text analysis of current documents, and in-depth free targeted interviews.
Keywords: gastronomy, competitive advantage, logistics, logistics support
Procedia PDF Downloads 163
14652 Climate Change and Urban Flooding: The Need to Rethink Urban Flood Management through Resilience
Authors: Suresh Hettiarachchi, Conrad Wasko, Ashish Sharma
Abstract:
The ever-changing and expanding urban landscape increases the stress on urban systems to support and maintain safe and functional living spaces. Flooding presents one of the more serious threats to this safety, putting a larger number of people in harm’s way in congested urban settings. Climate change is adding to this stress by creating a dichotomy in the urban flood response. On the one hand, climate change is causing storms to intensify, resulting in more destructive, rarer floods, while on the other hand, longer dry periods are decreasing the severity of more frequent, less intense floods. This variability is creating a need to be more agile and innovative in how we design for and manage urban flooding. Here, we argue that to cope with the challenge that climate change brings, we need to move towards urban flood management through resilience rather than flood prevention. We also argue that dealing with the larger variation in flood response to climate change means that we need to look at flooding from all aspects rather than the single-dimensional focus of flood depths and extents. In essence, we need to rethink how we manage flooding in the urban space. This change in our thought process and approach to flood management requires a practical way to assess and quantify resilience that is built into the urban landscape so that informed decision-making can support the required changes in planning and infrastructure design. Towards that end, we propose a Simple Urban Flood Resilience Index (SUFRI) based on a robust definition of resilience as a tool to assess flood resilience. The application of a simple resilience index such as the SUFRI can provide a practical tool that considers urban flood management in a multi-dimensional way and can present solutions that were not previously considered. When such an index is grounded on a clear and relevant definition of resilience, it can be a reliable and defensible way to assess and assist the process of adapting to the increasing challenges in urban flood management with climate change.
Keywords: urban flood resilience, climate change, flood management, flood modelling
Procedia PDF Downloads 48
14651 Fuzzy Linear Programming Approach for Determining the Production Amounts in Food Industry
Abstract:
In recent years, rapid and correct decision making has become crucial for both people and enterprises. However, uncertainty makes decision-making difficult, and fuzzy logic is used for coping with this situation. Thus, fuzzy linear programming models are developed in order to handle uncertainty in the objective function and the constraints. In this study, a problem of a factory in the food industry is investigated, the required data are obtained, and the problem is formulated as a fuzzy linear programming model. The model is solved using the Zimmermann approach, which is one of the approaches for fuzzy linear programming. As a result, the solution gives the amount of production for each product type in order to gain maximum profit.
Keywords: food industry, fuzzy linear programming, fuzzy logic, linear programming
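To illustrate the Zimmermann formulation mentioned in this abstract, the following sketch solves a tiny fuzzy production-planning problem as an ordinary LP by maximizing the overall satisfaction degree λ. The profit coefficients, resource limits, and tolerances are hypothetical and are not the factory data used in the study.

```python
# Minimal sketch of the (symmetric) Zimmermann approach for a fuzzy LP on
# hypothetical production-planning data (two products, two fuzzy resource limits).
import numpy as np
from scipy.optimize import linprog

c  = np.array([40.0, 30.0])        # profit per unit of product 1 and 2 (assumed)
A  = np.array([[2.0, 1.0],         # machine hours per unit
               [1.0, 3.0]])        # labour hours per unit
b  = np.array([100.0, 90.0])       # nominal resource limits
p  = np.array([10.0, 15.0])        # tolerances on the resource limits
z0, p0 = 2400.0, 400.0             # profit aspiration level and its tolerance

# Decision vector: [x1, x2, lam]; maximise lam (overall satisfaction degree).
obj = np.array([0.0, 0.0, -1.0])

# Fuzzy objective:    c'x >= z0 - (1 - lam)*p0  ->  -c'x + p0*lam <= p0 - z0
# Fuzzy constraints:  A x <= b + (1 - lam)*p    ->   A x + p*lam  <= b + p
A_ub = np.vstack([np.hstack([-c, [p0]]),
                  np.hstack([A, p.reshape(-1, 1)])])
b_ub = np.hstack([p0 - z0, b + p])

res = linprog(obj, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None), (0, 1)], method="highs")
x1, x2, lam = res.x
print(f"production plan: x1={x1:.1f}, x2={x2:.1f}, satisfaction lambda={lam:.2f}")
```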
Procedia PDF Downloads 650
14650 Chemical Bath Deposition Technique of CdS Used in Closed Space Sublimation of CdTe Solar Cell
Authors: Z. Mahmood, F. U. Babar, S. Naz, H. U. Rehman
Abstract:
Cadmium sulphide (CdS) was deposited on a Tec 15 glass substrate with the help of CBD (the chemical bath deposition process), and then cadmium telluride (CdTe) was deposited on the CdS with the help of CSS (the closed space sublimation technique) for the construction of a solar cell. The thicknesses of all the deposited materials were measured with the help of ellipsometry. The I-V graphs were drawn in order to observe the current-voltage output. The efficiency of the cell was also graphed along with the fill factor (graphs not given here). The efficiency came out to be approximately 16.5%, while the maximum efficiency of CIGS (copper-indium-gallium-selenide) cells is 20%. The efficiency of a solar cell can be further enhanced by adopting quality materials, good experimental devices, and proper procedures. The grain size was analyzed with the help of a scanning electron microscope and Rutherford backscattering spectroscopy (RBS).
Keywords: Chemical Bath Deposition Technique (CBD), cadmium sulphide (CdS), CdTe, CSS (Closed Space Sublimation)
Procedia PDF Downloads 364
14649 The Implementation of the Human Right of Self-Determination: The Example of Nagorno-Karabakh Republic
Authors: S. Vlasyan
Abstract:
The article deals with the implementation of the right to self-determination of peoples, using the example of the Nagorno-Karabakh Republic. The problem of the correlation of two fundamental principles of international law, i.e., territorial integrity and the right to self-determination of peoples, has been considered one of the vital issues in the field of international law for several decades. In this article, the author analyzes the decision of the Supreme Court of Canada regarding specific issues of the secession of Quebec from Canada, as well as the decisions of the International Court of Justice in the case concerning East Timor (Portugal v. Australia) and in the case of Western Sahara. The author formulates the legal conditions of Nagorno-Karabakh secession.
Keywords: right of self-determination, territorial integrity, the principles of International Law, Nagorno-Karabakh Republic
Procedia PDF Downloads 408
14648 Rapid Soil Classification Using Computer Vision with Electrical Resistivity and Soil Strength
Authors: Eugene Y. J. Aw, J. W. Koh, S. H. Chew, K. E. Chua, P. L. Goh, Grace H. B. Foo, M. L. Leong
Abstract:
This paper presents the evaluation of various soil testing methods, such as the four-probe soil electrical resistivity method and the cone penetration test (CPT), that can complement a newly developed rapid soil classification scheme using computer vision, to improve the accuracy and productivity of on-site classification of excavated soil. In Singapore, excavated soils from the local construction industry are transported to Staging Grounds (SGs) to be reused as fill material for land reclamation. Excavated soils are mainly categorized into two groups (“Good Earth” and “Soft Clay”) based on particle size distribution (PSD) and water content (w) from soil investigation reports and on-site visual survey, such that proper treatment and usage can be exercised. However, this process is time-consuming and labor-intensive. Thus, a rapid classification method is needed at the SGs. Four-probe soil electrical resistivity and CPT were evaluated for their feasibility as suitable additions to the computer vision system to further develop this innovative non-destructive and instantaneous classification method. The computer vision technique comprises soil image acquisition using an industrial-grade camera; image processing and analysis via calculation of Grey Level Co-occurrence Matrix (GLCM) textural parameters; and decision-making using an Artificial Neural Network (ANN). It was found from the previous study that the ANN model coupled with ρ can classify soils into “Good Earth” and “Soft Clay” in less than a minute, with an accuracy of 85% based on selected representative soil images. To further improve the technique, the following three items were targeted to be added onto the computer vision scheme: the apparent electrical resistivity of soil (ρ) measured using a set of four probes arranged in Wenner’s array, the soil strength measured using a modified mini cone penetrometer, and w measured using a set of time-domain reflectometry (TDR) probes. Laboratory proof-of-concept was conducted through a series of seven tests with three types of soils – “Good Earth”, “Soft Clay,” and a mix of the two. Validation was performed against the PSD and w of each soil type obtained from conventional laboratory tests. The results show that ρ, w and CPT measurements can be collectively analyzed to classify soils into “Good Earth” or “Soft Clay” and are feasible as complementing methods to the computer vision system.
Keywords: computer vision technique, cone penetration test, electrical resistivity, rapid and non-destructive, soil classification
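As a rough illustration of the image-texture part of the pipeline described above (GLCM textural parameters fed to a neural network), the sketch below uses synthetic textures as stand-ins for soil photographs; the GLCM settings, network size, and labels are assumptions, not the study's configuration. In the full scheme, the resistivity ρ, water content w, and CPT readings would simply be appended to the texture feature vector before training.

```python
# Sketch: GLCM texture features -> small ANN classifier ("Good Earth" vs "Soft Clay").
# In practice each image would come from a camera, e.g.
#   grey = img_as_ubyte(color.rgb2gray(io.imread(path)))
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # 'greycomatrix' in older scikit-image
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

PROPS = ["contrast", "dissimilarity", "homogeneity", "energy", "correlation"]
rng = np.random.default_rng(0)

def glcm_features(grey_u8):
    """Grey-level co-occurrence features for one 8-bit greyscale image."""
    glcm = graycomatrix(grey_u8, distances=[1, 3], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    return np.hstack([graycoprops(glcm, p).ravel() for p in PROPS])

def fake_image(coarse):
    """Synthetic stand-in for a soil photo: coarse vs fine random texture."""
    if coarse:
        base = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
        return np.kron(base, np.ones((4, 4), dtype=np.uint8))   # blocky, coarse grains
    return rng.integers(0, 256, size=(256, 256), dtype=np.uint8)

labels = np.array([0, 0, 0, 0, 1, 1, 1, 1])      # 0 = "Good Earth", 1 = "Soft Clay" (stand-ins)
X = np.array([glcm_features(fake_image(coarse=bool(lab))) for lab in labels])

X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.25, random_state=0)
ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
ann.fit(X_tr, y_tr)
print("held-out accuracy:", ann.score(X_te, y_te))
```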
Procedia PDF Downloads 239
14647 Innovation Knowledge Management for the Public Sector in Thailand
Authors: Supattra Kanchanopast
Abstract:
This article presents the process of change for innovation in the Thai public sector in order to create higher client satisfaction. Change management should concern the potential of the change agent or leader, the long-term vision or policy (political side) of the organization, the communication within the organization, a suitable organizational culture and structure, the preparedness of the personnel, and the fitness of the reward system. Sustaining innovation creation is not as sophisticated as traditionally believed. A basic management principle of identifying clarified and motivating goals needs to be followed by creating support systems after implementation and by ensuring that the stakeholders benefit from the innovation projects. Finally, creating an amiable atmosphere among the practitioners, including effective evaluation and reward schemes, will support the innovation. However, none of this will ever take place unless support is gained from the leaders of those organizations, as well as from the staff and clients involved.
Keywords: change management, client satisfaction, innovation management, Thai public sector
Procedia PDF Downloads 252
14646 A Condition-Based Maintenance Policy for Multi-Unit Systems Subject to Deterioration
Authors: Nooshin Salari, Viliam Makis
Abstract:
In this paper, we propose a condition-based maintenance policy for multi-unit systems considering the existence of economic dependency among units. We consider a system composed of N identical units, where each unit deteriorates independently. The deterioration process of each unit is modeled as a three-state continuous-time homogeneous Markov chain with two working states and a failure state. The average production rate of units varies across the working states, and the demand rate of the system is constant. Units are inspected at equidistant time epochs, and the decision regarding performing maintenance is determined by the number of units in the failure state. If the total number of units in the failure state exceeds a critical level, maintenance is initiated, whereby units in the failed state are replaced correctively and units in the deteriorated state are maintained preventively. Our objective is to determine the optimal number of failed units at which to initiate maintenance, minimizing the long-run expected average cost per unit time. The problem is formulated and solved in the semi-Markov decision process (SMDP) framework. A numerical example is developed to demonstrate the proposed policy, and a comparison with the corrective maintenance policy is presented.
Keywords: reliability, maintenance optimization, semi-Markov decision process, production
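To make the threshold policy described above concrete, the toy Monte Carlo sketch below simulates N independently deteriorating units inspected at fixed epochs and compares candidate failure thresholds by long-run average cost. This is an editorial illustration with assumed rates and costs, not the paper's SMDP solution or numerical example.

```python
# Toy simulation of a failure-count threshold policy: N units, each a 3-state
# Markov chain (0 = good, 1 = deteriorated, 2 = failed), inspected every `tau`.
# Maintenance is triggered when the number of failed units reaches threshold k.
import numpy as np

rng = np.random.default_rng(0)
N, tau, horizon = 6, 1.0, 5000            # units, inspection interval, #inspections
a01, a02, a12 = 0.10, 0.02, 0.25           # transition rates (assumed)
C_INSP, C_SETUP, C_CORR, C_PREV, C_DOWN = 5.0, 50.0, 40.0, 15.0, 30.0  # assumed costs

def evolve(state):
    """Advance one unit over one inspection interval of length tau."""
    t = 0.0
    while state != 2:
        rate = a01 + a02 if state == 0 else a12
        t += rng.exponential(1.0 / rate)
        if t > tau:
            return state
        if state == 0:
            state = 1 if rng.random() < a01 / (a01 + a02) else 2
        else:
            state = 2
    return 2

def average_cost(k):
    states, cost = np.zeros(N, dtype=int), 0.0
    for _ in range(horizon):
        states = np.array([evolve(s) for s in states])
        failed, deteriorated = np.sum(states == 2), np.sum(states == 1)
        cost += C_INSP + C_DOWN * failed            # inspection + lost-production proxy
        if failed >= k:                             # trigger maintenance
            cost += C_SETUP + C_CORR * failed + C_PREV * deteriorated
            states[:] = 0                           # all units restored to 'good'
    return cost / (horizon * tau)

for k in range(1, N + 1):
    print(f"threshold k={k}: long-run average cost ~ {average_cost(k):.2f}")
```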
Procedia PDF Downloads 165
14645 Development of a Decision-Making Method by Using Machine Learning Algorithms in the Early Stage of School Building Design
Authors: Rajaian Hoonejani Mohammad, Eshraghi Pegah, Zomorodian Zahra Sadat, Tahsildoost Mohammad
Abstract:
Over the past decade, energy consumption in educational buildings has steadily increased. The purpose of this research is to provide a method to quickly predict the energy consumption of buildings by evaluating zones separately and decomposing the building to eliminate the complexity of geometry at the early design stage. To produce this framework, machine learning algorithms such as support vector regression (SVR) and artificial neural networks (ANN) are used to predict energy consumption and thermal comfort metrics in a school as a case study. The database consists of more than 55,000 samples in three climates of Iran. Cross-validation evaluation and unseen data have been used for validation. For a specific label, cooling energy, the accuracy of prediction is at least 84% and 89% for SVR and ANN, respectively. The results show that the SVR performed much better than the ANN.
Keywords: early stage of design, energy, thermal comfort, validation, machine learning
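The model-comparison step this abstract describes (SVR versus ANN with cross-validation) can be sketched as below. The data are synthetic stand-ins for the simulated zone samples, and the model settings are assumptions rather than the authors' tuned configurations.

```python
# Sketch: compare SVR and an ANN regressor by cross-validation on synthetic data
# standing in for zone-level features and a "cooling energy" target.
import numpy as np
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.uniform(size=(2000, 6))          # e.g. zone geometry, glazing ratio, climate index...
y = 50 + 80 * X[:, 0] - 30 * X[:, 1] + 10 * rng.standard_normal(2000)  # synthetic target

svr = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.5))
ann = make_pipeline(StandardScaler(), MLPRegressor(hidden_layer_sizes=(64, 32),
                                                   max_iter=1000, random_state=0))

for name, model in [("SVR", svr), ("ANN", ann)]:
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean cross-validated R^2 = {r2.mean():.3f}")
```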
Procedia PDF Downloads 73
14644 Statistical Wavelet Features, PCA, and SVM-Based Approach for EEG Signals Classification
Authors: R. K. Chaurasiya, N. D. Londhe, S. Ghosh
Abstract:
The study of the electrical signals produced by the neural activity of the human brain is called electroencephalography (EEG). In this paper, we propose an automatic and efficient EEG signal classification approach. The proposed approach is used to classify an EEG signal into two classes: epileptic seizure or not. In the proposed approach, we start with extracting features by applying the Discrete Wavelet Transform (DWT) in order to decompose the EEG signals into sub-bands. These features, extracted from the detail and approximation coefficients of the DWT sub-bands, are used as input to Principal Component Analysis (PCA). The classification is based on reducing the feature dimension using PCA and deriving the support vectors using a Support Vector Machine (SVM). The experiments are performed on a real and standard dataset, and a very high level of classification accuracy is obtained.
Keywords: discrete wavelet transform, electroencephalogram, pattern recognition, principal component analysis, support vector machine
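A minimal sketch of this DWT → PCA → SVM pipeline is shown below on synthetic stand-in signals; the wavelet choice, decomposition level, statistics, and SVM settings are assumptions, not the authors' exact setup or dataset.

```python
# Sketch: DWT sub-band statistics -> PCA -> SVM for two-class "EEG" epochs.
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def dwt_features(signal, wavelet="db4", level=4):
    """Statistics of approximation and detail coefficients for one epoch."""
    feats = []
    for c in pywt.wavedec(signal, wavelet, level=level):
        feats += [c.mean(), c.std(), np.abs(c).max(), np.mean(c ** 2)]
    return feats

# Synthetic two-class data: 'seizure-like' epochs carry an extra slow oscillation.
n, length = 200, 512
t = np.arange(length) / 256.0
X, y = [], []
for i in range(n):
    label = i % 2
    sig = rng.standard_normal(length)
    if label:
        sig += 2.0 * np.sin(2 * np.pi * 3.0 * t)
    X.append(dwt_features(sig))
    y.append(label)
X, y = np.array(X), np.array(y)

clf = make_pipeline(StandardScaler(), PCA(n_components=8), SVC(kernel="rbf", C=10.0))
print("5-fold accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```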
Procedia PDF Downloads 638
14643 Evaluation of the Architect-Friendliness of LCA-Based Environmental Impact Assessment Tools
Authors: Elke Meex, Elke Knapen, Griet Verbeeck
Abstract:
The focus of sustainable building is gradually shifting from energy efficiency towards the more global environmental impact of building design during all life-cycle stages. In this context, many tools have been developed that use an LCA approach to assess the environmental impact at the whole-building level. Since the building design strongly influences the final environmental performance and the architect plays a key role in the design process, it is important that these tools are adapted to the architect's work method and support decision making from the early design phase on. Therefore, a comparative evaluation of the degree of architect-friendliness of some building-level LCA tools is made, based on an evaluation framework specifically developed for the architect’s viewpoint. In order to allow comparison of the results, a reference building has been designed, documented for different design phases, and entered in all software tools. The evaluation according to the framework shows that the existing tools are not very architect-friendly. Suggestions for improvement are formulated.
Keywords: architect-friendliness, design supportive value, evaluation framework, tool comparison
Procedia PDF Downloads 540
14642 Structural Health Monitoring-Integrated Structural Reliability Based Decision Making
Authors: Caglayan Hizal, Kutay Yuceturk, Ertugrul Turker Uzun, Hasan Ceylan, Engin Aktas, Gursoy Turan
Abstract:
Monitoring concepts for structural systems have been investigated by researchers for decades, since such tools are quite convenient for determining the intervention planning of structures. Despite the considerable development in this regard, the use of monitoring data in reliability assessment and prediction models is still in need of improvement. More specifically, reliability-based seismic risk assessment of engineering structures may play a crucial role in the post-earthquake decision-making process for the structures. After an earthquake, professionals can identify heavily damaged structures based on visual observations. Among these, it is hard to identify the ones with minimal signs of damage, even if they have experienced considerable structural degradation. Besides, visual observations are open to human interpretation, which makes the decision process controversial and, thus, less reliable. In this context, when a continuous monitoring system has been previously installed on the corresponding structure, this decision process might be completed rapidly and with higher confidence by means of the observed data. At this stage, the Structural Health Monitoring (SHM) procedure has an important role, since it can make it possible to estimate the system reliability based on a recursively updated mathematical model. Therefore, integrating an SHM procedure into the reliability assessment process comes forward as an important challenge due to the uncertainties arising for the updated model in case of environmental, material, and earthquake-induced changes. In this context, this study presents a case study on the SHM-integrated reliability assessment of continuously monitored, progressively damaged systems. The objective of this study is to get instant feedback on the current state of the structure after an extreme event, such as an earthquake, by involving the observed data rather than visual inspections. Thus, the decision-making process after such an event can be carried out on a rational basis. In the near future, this can pave the way for the design of self-reporting structures that can warn about their current condition after an extreme event.
Keywords: condition assessment, vibration-based SHM, reliability analysis, seismic risk assessment
Procedia PDF Downloads 143
14641 Modeling of Strong Motion Generation Areas of the 2011 Tohoku, Japan Earthquake Using Modified Semi-Empirical Technique Incorporating Frequency Dependent Radiation Pattern Model
Authors: Sandeep, A. Joshi, Kamal, Piu Dhibar, Parveen Kumar
Abstract:
In the present work, strong ground motion has been simulated using a modified semi-empirical technique (MSET) with a frequency-dependent radiation pattern model. Joshi et al. (2014) modified the semi-empirical technique to incorporate the modeling of strong motion generation areas (SMGAs). A frequency-dependent radiation pattern model is applied to simulate high-frequency ground motion more precisely. The identified SMGAs (Kurahashi and Irikura 2012) of the 2011 Tohoku earthquake (Mw 9.0) were modeled using this modified technique. Records are simulated for both frequency-dependent and constant radiation pattern functions. Simulated records for both cases are compared with observed records in terms of peak ground acceleration and pseudo-acceleration response spectra at different stations. Comparison of simulated and observed records in terms of root mean square error suggests that the method is capable of simulating records that match the observations over a wide frequency range for this earthquake and bear a realistic appearance in terms of shape and strong motion parameters. The results confirm the efficacy and suitability of the rupture model defined by five SMGAs for the developed modified technique.
Keywords: strong ground motion, semi-empirical, strong motion generation area, frequency dependent radiation pattern, 2011 Tohoku Earthquake
Procedia PDF Downloads 537
14640 Towards a Framework for Embedded Weight Comparison Algorithm with Business Intelligence in the Plantation Domain
Authors: M. Pushparani, A. Sagaya
Abstract:
Embedded systems have emerged as important elements in various domains, with extensive applications in the automotive, commercial, consumer, healthcare, and transportation markets, as there is an emphasis on intelligent devices. On the other hand, Business Intelligence (BI) has also been used extensively in a range of applications, especially in the agriculture domain, which is the area of this research. The aim of this research is to create a framework for an Embedded Weight Comparison Algorithm with Business Intelligence (EWCA-BI). The weight comparison algorithm will be embedded within the plantation management system and the weighbridge system. This algorithm will be used to estimate the weight at the site, which will be compared with the actual weight at the plantation. The algorithm will be used to build the necessary alerts when there is a discrepancy in the weight, thus enabling better decision making. In the current practice, data are collected from various locations in various forms. It is a challenge to consolidate data to obtain timely and accurate information for effective decision making. Adding to this, unstable network connections lead to difficulty in getting timely and accurate information. To overcome these challenges, the algorithm is embedded on a portable device that also assists in data capture and synchronizes data at the various locations, overcoming the network shortcomings at collection points. The EWCA-BI will provide real-time information at any given point of time, thus enabling non-latent BI reports that will provide crucial information for efficient operational decision making. This research has a high potential for bringing embedded systems into the agriculture industry. The EWCA-BI will provide BI reports with accurate information from uncompromised data using an embedded system and provide alerts, therefore enabling effective operational management decision-making at the site.
Keywords: embedded business intelligence, weight comparison algorithm, oil palm plantation, embedded systems
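The core comparison-and-alert idea in this abstract can be sketched in a few lines. The field names, the 3% tolerance, and the record values below are illustrative assumptions, not the EWCA-BI specification.

```python
# Minimal sketch of the weight-comparison rule: compare the weight estimated at the
# collection site with the weighbridge weight and raise an alert on large discrepancies.
from dataclasses import dataclass

TOLERANCE = 0.03   # 3% allowed discrepancy (assumed)

@dataclass
class HarvestRecord:
    block_id: str
    site_weight_kg: float         # estimated at the plantation collection point
    weighbridge_weight_kg: float  # measured at the mill weighbridge

def check_discrepancy(rec: HarvestRecord) -> str:
    diff = rec.weighbridge_weight_kg - rec.site_weight_kg
    ratio = abs(diff) / rec.site_weight_kg
    if ratio > TOLERANCE:
        return f"ALERT {rec.block_id}: discrepancy {diff:+.0f} kg ({ratio:.1%})"
    return f"OK {rec.block_id}: within tolerance ({ratio:.1%})"

records = [HarvestRecord("B-101", 12500, 12480),
           HarvestRecord("B-102", 9800, 9100)]
for r in records:
    print(check_discrepancy(r))
```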
Procedia PDF Downloads 285
14639 Decision Location and Resource Requirement for Relief Goods Assembly
Authors: Glenda B. Minguito, Jenith L. Banluta
Abstract:
One of the critical aspects of humanitarian operations is the distribution of relief goods to the affected community. The common assumption is that relief goods are prepositioned during disasters, which is not applicable in developing countries like the Philippines. During disasters, the on-the-ground government agencies and responders have to procure, sort, weigh, and pack the relief goods. There is a need to review relief goods preparation, as it seriously affects the delivery of the aid necessary for human survival. This study also identifies the ideal location of the assembly hub to minimize the distance to the affected community. This paper reveals that location and resources are dependent on the type of disasters encountered at the local level. The Center-of-Gravity method and the Multiple Activity Chart were applied in the analysis.
Keywords: humanitarian supply chain, location decision, resource allocation, local level
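For readers unfamiliar with the Center-of-Gravity method mentioned here, the short sketch below computes a demand-weighted centroid as a candidate assembly-hub location. The coordinates and demand figures are hypothetical, not the study's data.

```python
# Sketch of the Center-of-Gravity method for siting a relief-goods assembly hub.
def center_of_gravity(points):
    """points: list of (x, y, demand). Returns the demand-weighted centroid."""
    total = sum(w for _, _, w in points)
    cx = sum(x * w for x, _, w in points) / total
    cy = sum(y * w for _, y, w in points) / total
    return cx, cy

affected_areas = [            # (x km, y km, relief packs required) - hypothetical
    (2.0, 3.5, 1200),
    (5.5, 1.0, 800),
    (8.0, 6.0, 2500),
    (3.0, 7.5, 600),
]
hub_x, hub_y = center_of_gravity(affected_areas)
print(f"suggested assembly hub location: ({hub_x:.2f}, {hub_y:.2f})")
```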
Procedia PDF Downloads 148
14638 Improvement of Healthcare Quality and Psychological Stress Relief for Transition Programs in Intensive Care Units
Authors: Ru-Yu Lien, Shih-Hsin Hung, Shu-Fen Lu, Shu-I Chin, Wen-Ju Yang, Wan Ming-Shang, Chien-Ying Wang
Abstract:
Background: Upon recovery from a critical condition, patients are normally transferred from the intensive care units (ICUs) to the general wards. However, transferring patients to a new environment causes stressful experiences for both patients and their families. Therefore, there is a necessity to communicate with the patients and their families to reduce psychological stress and unplanned returns. Methods: This study was performed in the general ICUs from January 1, 2021, to December 31, 2021, at Taipei Veteran General Hospital. The patients who were evaluated by doctors and liaison nurses and transferred to the general wards were selected as the research subjects and ranked by the Critical Care Transition Program (CCTP). The program was applied to 40 patients in a study group, while a control group of 40 patients received usual care support. The psychological condition of patients was evaluated with a migration stress scale and a hospital anxiety and depression scale. In addition, the rate of return to the ICU was also measured. Results: A total of 63 patients out of 80 (78.8%) experienced moderate to severe degrees of anxiety, and 42 patients (52.6%) experienced moderate to severe degrees of depression before being transferred. Anxiety and depression changed more after the transfer; moreover, when the transition program was applied, they were lower than without it. The return-to-ICU rate in the study group was lower than in the usual transition group, with an adjusted odds ratio of 0.21 (95% confidence interval: 0.05-0.888, P=0.034). Conclusion: Our study found that the transfer program could reduce the anxiety and depression of patients and the associated stress on their families during the transition from the ICU. Before transfer out of the ICU, healthcare providers need to assess the needs of patients to set the goals of the care plan and perform patient-centered decision-making with multidisciplinary support.
Keywords: ICU, critical care transition program, healthcare, transition program
Procedia PDF Downloads 84
14637 Transparency of Algorithmic Decision-Making: Limits Posed by Intellectual Property Rights
Authors: Olga Kokoulina
Abstract:
Today, algorithms are assuming a leading role in various areas of decision-making. Prompted by a promise to provide increased economic efficiency and fuel solutions for pressing societal challenges, algorithmic decision-making is often celebrated as an impartial and constructive substitute for human adjudication. But in the face of this implied objectivity and efficiency, the application of algorithms is also marred by mounting concerns about embedded biases, discrimination, and exclusion. In Europe, vigorous debates on the risks and adverse implications of algorithmic decision-making largely revolve around the potential of data protection laws to tackle some of the related issues. For example, one of the often-cited avenues to mitigate the impact of potentially unfair decision-making practices is the so-called 'right to explanation'. In essence, the overall right is derived from the provisions of the General Data Protection Regulation ('GDPR') ensuring the right of data subjects to access and mandating the obligation of data controllers to provide the relevant information about the existence of automated decision-making and meaningful information about the logic involved. Taking the corresponding rights and obligations in the context of the specific provision on automated decision-making in the GDPR, the debates mainly focus on the efficacy and the exact scope of the 'right to explanation'. In essence, the underlying logic of the argued remedy lies in a transparency imperative. Allowing data subjects to acquire as much knowledge as possible about the decision-making process means empowering individuals to take control of their data and take action. In other words, forewarned is forearmed. The related discussions and debates are ongoing, comprehensive, and, often, heated. However, they are also frequently misguided and isolated: embracing data protection law as the ultimate and sole lens is often not sufficient. Mandating the disclosure of the technical specifications of employed algorithms in the name of transparency for and empowerment of data subjects potentially encroaches on the interests and rights of IPR holders, i.e., the business entities behind the algorithms. The study aims at pushing the boundaries of the transparency debate beyond the data protection regime. By systematically analysing legal requirements and current judicial practice, it assesses the limits of the transparency requirement and right to access posed by intellectual property law, namely by copyrights and trade secrets. It is asserted that trade secrets, in particular, present an often-insurmountable obstacle to realising the potential of the transparency requirement. In reaching that conclusion, the study explores the limits of the protection afforded by the European Trade Secrets Directive and contrasts them with the scope of the respective rights and obligations related to data access and portability enshrined in the GDPR. As shown, the far-reaching scope of the protection under trade secrecy is evidenced both through the assessment of its subject matter and through the exceptions from such protection. As a way forward, the study scrutinises several possible legislative solutions, such as a flexible interpretation of the public interest exception in trade secrets as well as the introduction of a strict liability regime in case of non-transparent decision-making.
Keywords: algorithms, public interest, trade secrets, transparency
Procedia PDF Downloads 124
14636 Proof of Concept Design and Development of a Computer-Aided Medical Evaluation of Symptoms Web App: An Expert System for Medical Diagnosis in General Practice
Authors: Ananda Perera
Abstract:
Computer-Assisted Medical Evaluation of Symptoms (CAMEOS) is a medical expert system designed to help general practitioners (GPs) make an accurate diagnosis. CAMEOS comprises a knowledge base, user input, an inference engine, a reasoning module, and an output statement. The knowledge base was developed by the author. User input is an HTML file. The physician user collects data in the consultation. Data is sent to the inference engine on the servers. CAMEOS uses set theory to simulate diagnostic reasoning. The program output is a list of differential diagnoses, the most probable diagnosis, and the diagnostic reasoning.
Keywords: CDSS, computerized decision support systems, expert systems, general practice, diagnosis, diagnostic systems, primary care diagnostic system, artificial intelligence in medicine
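As an illustration of the set-theoretic reasoning this abstract mentions, the toy sketch below ranks candidate diagnoses by the overlap between the symptoms entered by the physician and per-diagnosis symptom sets. The knowledge-base entries are made up for illustration and are not the CAMEOS knowledge base.

```python
# Toy sketch of set-based diagnostic matching: knowledge base maps diagnoses to
# symptom sets; candidates are ranked by the fraction of the profile matched.
KNOWLEDGE_BASE = {
    "influenza":         {"fever", "cough", "myalgia", "headache"},
    "strep pharyngitis": {"fever", "sore throat", "tender cervical nodes"},
    "migraine":          {"headache", "photophobia", "nausea"},
}

def rank_diagnoses(patient_symptoms):
    patient = set(patient_symptoms)
    scored = []
    for dx, profile in KNOWLEDGE_BASE.items():
        overlap = patient & profile                 # set intersection
        score = len(overlap) / len(profile)         # fraction of the profile matched
        scored.append((score, dx, sorted(overlap)))
    return sorted(scored, reverse=True)

for score, dx, matched in rank_diagnoses(["fever", "cough", "headache"]):
    print(f"{dx:18s} score={score:.2f} matched={matched}")
```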
Procedia PDF Downloads 155
14635 Knowledge, Hierarchy and Decision-Making: Analysis of Documentary Filmmaking Practices in India
Authors: Nivedita Ghosh
Abstract:
In his critique of Lefebvre’s view that ‘technological capacities’ are class-dependent, Francois Hetman argues that technology today is participatory, allowing the entry of individuals from different levels of social stratification. As a result, we are entering an era of technology operators or ‘clerks’ who become the new decision-makers because of the knowledge they possess of the use of technologies. In response to Hetman’s thesis, this paper argues that knowledge of technology, while indeed providing a momentary space for decision-making, does not necessarily restructure social hierarchies. Through case studies presented from the world of Indian documentary filmmaking, this paper puts forth the view that Hetman’s clerks, despite being technologically advanced, do not break into the filmmaking hierarchical order. This remains true even in situations where technical knowledge rests mostly with those on the lowest rungs of the filmmaking ladder. Instead, technological knowledge provides the space for other kinds of relationships to evolve, such as those of ‘trusting the technician’ or ‘admiration for the technician’s work’. Furthermore, what continues to define the documentary filmmaking hierarchy is the conceptualization capacities of the practitioners, which are influenced by a similarity in socio-cultural backgrounds and the film school training accessible primarily to the filmmakers instead of the technicians. Accordingly, the paper concludes with the argument that, more than ‘technological capacities’, it is ‘conceptualization capacities’ which are class-dependent, especially when we study the field of documentary filmmaking.
Keywords: documentary filmmaking, India, technology, knowledge, hierarchy
Procedia PDF Downloads 262
14634 Requirements Definitions of Real-Time System Using the Behavioral Patterns Analysis (BPA) Approach: The Healthcare Multi-Agent System
Authors: Assem El-Ansary
Abstract:
This paper illustrates the event-oriented Behavioral Pattern Analysis (BPA) modeling approach using the Healthcare Multi-Agent System. The Event defined in BPA is a real-life conceptual entity that is unrelated to any implementation. The major contributions of this research are (1) the Behavioral Pattern Analysis (BPA) modeling methodology and (2) the development of an interactive software tool (DECISION), which is based on a combination of the Analytic Hierarchy Process (AHP) and the ELECTRE Multi-Criteria Decision Making (MCDM) methods.
Keywords: analysis, modeling methodology, software modeling, event-oriented, behavioral pattern, use cases, Healthcare Multi-Agent System
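For readers unfamiliar with the AHP component mentioned in this abstract, the short sketch below derives criterion weights from a pairwise comparison matrix via its principal eigenvector and checks consistency. The comparison matrix is a made-up example, not taken from the DECISION tool.

```python
# Sketch of the basic AHP step: priority weights from a pairwise comparison matrix.
import numpy as np

# Pairwise comparisons of three requirement criteria on Saaty's 1-9 scale (assumed).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio (random index RI = 0.58 for a 3x3 matrix).
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
print("priority weights:", np.round(weights, 3), " CR =", round(ci / 0.58, 3))
```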
Procedia PDF Downloads 550
14633 An Inventory Management Model to Manage the Stock Level for Irregular Demand Items
Authors: Riccardo Patriarca, Giulio Di Gravio, Francesco Costantino, Massimo Tronci
Abstract:
An accurate inventory management policy plays a crucial role in several high-availability sectors. In these sectors, due to the high cost of spares and backorders, an (S-1, S) replenishment policy is necessary for high-availability items. The policy enables the shipment of an efficient substitute item any time the inventory size decreases by one. This policy can be modelled following the Multi-Echelon Technique for Recoverable Item Control (METRIC). The METRIC is a system-based technique that allows defining the optimum stock level in a multi-echelon network, adopting measures in line with the decision-maker’s perspective. The METRIC defines an availability-cost function with inventory costs and required service levels, using as inputs data about the demand trend, the supplying and maintenance characteristics of the network, and the budget/availability constraints. The traditional METRIC relies on the hypothesis that a Poisson distribution represents the demand distribution well in the case of items with a low failure rate. In this research, however, we explore the effects of using a Poisson distribution to model the demand of low-failure-rate items characterized by an irregular demand trend. This demand characteristic is not included in the traditional METRIC formulation, leading to the need to revise it. Using the CV (Coefficient of Variation) and ADI (Average inter-Demand Interval) classification, we define the inherent flaws of the Poisson-based METRIC for irregular demand items and define an innovative ad hoc distribution which can better fit the irregular demands. This distribution allows defining proper stock levels to reduce stocking and backorder costs due to the high irregularities in the demand trend. A case study in the aviation domain clarifies the benefits of this innovative METRIC approach.
Keywords: METRIC, inventory management, irregular demand, spare parts
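The Poisson-based building block that the traditional METRIC relies on can be sketched as follows: for an (S-1, S) item with Poisson demand over the resupply lead time, compute the fill rate and expected backorders, and pick the smallest stock level meeting a target. The demand rate, lead time, and 95% target below are hypothetical, and this is an illustration of the standard single-item calculation, not the paper's revised distribution.

```python
# Sketch: (S-1, S) item with Poisson pipeline demand - fill rate and expected backorders.
from scipy.stats import poisson

lam, lead_time, target_fill = 4.0, 0.5, 0.95
pipeline_mean = lam * lead_time            # mean number of units in resupply

def expected_backorders(S, mu):
    # EBO(S) = sum_{x > S} (x - S) * P(X = x), truncated at a high quantile
    upper = int(poisson.ppf(0.999999, mu)) + 1
    return sum((x - S) * poisson.pmf(x, mu) for x in range(S + 1, upper + 1))

def fill_rate(S, mu):
    return poisson.cdf(S - 1, mu)          # probability an arriving demand finds stock

S = 0
while fill_rate(S, pipeline_mean) < target_fill:
    S += 1
print(f"stock level S={S}, fill rate={fill_rate(S, pipeline_mean):.3f}, "
      f"EBO={expected_backorders(S, pipeline_mean):.4f}")
```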
Procedia PDF Downloads 347
14632 Solving the Set Covering Problem Using the Binary Cat Swarm Optimization Metaheuristic
Authors: Broderick Crawford, Ricardo Soto, Natalia Berrios, Eduardo Olguin
Abstract:
In this paper, we present a binary cat swarm optimization for solving the set covering problem. The set covering problem is a well-known NP-hard problem with many practical applications, including those involving scheduling, production planning, and location problems. Binary cat swarm optimization is a recent swarm metaheuristic technique based on the behavior of discrete cats. Domestic cats show the ability to hunt and are curious about moving objects. The cats have two modes of behavior: seeking mode and tracing mode. We illustrate this approach with 65 instances of the problem from the OR-Library. Moreover, we solve this problem with 40 new binarization techniques and select the technique with the best results obtained. Finally, we make a comparison between the results obtained in previous studies and the new binarization technique, that is, with roulette wheel as the transfer function and V3 as the discretization technique.
Keywords: binary cat swarm optimization, binarization methods, metaheuristic, set covering problem
Procedia PDF Downloads 396
14631 Identifying the Barriers Facing Chinese Small and Medium-Sized Enterprises and Evaluating the Effectiveness of Public Supports
Authors: A. Yongsheng Guo, B. Obedat. Abdulazeez, C. Xiaoxian Zhu
Abstract:
This study aimed to identify the barriers to the development of small and medium-sized enterprises (SMEs) in China and build a theoretical framework to evaluate the support provided by the authorities and institutions. A grounded theory approach was adopted to collect and analyze data. Thirty-two interviews were conducted with SME managers, and open, axial and selective coding was utilized to develop themes. Based on institutional theory, grounded theory models were used to present findings. The findings showed that the main barriers in the business environment were defaulting on contracts, bureaucracy in procedures, lack of financial and legal support, limited intermediaries and channels, and poor quality of products and services. This study found that many programs were provided to support SMEs. A theoretical framework was developed to evaluate the performance of the programs from the managers’ perspective. The concepts of economy, efficiency and effectiveness were used to evaluate the perceived value of the programs. This study suggests that specialized programs are needed to suit sector-specific requirements, and creative packages are helpful in supporting SMEs' growth.
Keywords: business support, public economics, public programme, SME
Procedia PDF Downloads 50
14630 Complex Decision Rules in Quality Assurance Processes for Quick Service Restaurant Industry: Human Factors Determining Acceptability
Authors: Brandon Takahashi, Marielle Hanley, Gerry Hanley
Abstract:
The large-scale quick-service restaurant industry is a complex business to manage optimally. With over 40 suppliers providing different ingredients for food preparation and thousands of restaurants serving over 50 unique food offerings across a wide range of regions, the company must implement a quality assurance process. Businesses want to deliver, efficiently, reliably, and successfully, quality food that the public wants to buy at a low cost. They also want to make sure that their food offerings are never unsafe to eat or of poor quality. A good reputation (and profitable business) developed over the years can be gone in an instant if customers fall ill eating your food. Poor quality also results in food waste, and the cost of corrective actions is compounded by the reduction in revenue. Product compliance evaluation assesses whether the supplier’s ingredients are in compliance with the specifications of several attributes (physical, chemical, organoleptic) that a company will test to ensure that quality, safe-to-eat food is given to the consumer and will deliver the same eating experience in all parts of the country. The technical component of the evaluation includes the chemical and physical tests that produce numerical results related to shelf-life, food safety, and organoleptic qualities. The psychological component of the evaluation includes the organoleptic assessment, which involves the use of the sense organs. The rubric for product compliance evaluation has four levels: (1) Ideal: meeting or exceeding all technical (physical and chemical), organoleptic, and psychological specifications. (2) Deviation from ideal but no impact on quality: not meeting or exceeding some technical and organoleptic/psychological specifications without impact on consumer quality, and meeting all food safety requirements. (3) Acceptable: not meeting or exceeding some technical and organoleptic/psychological specifications, resulting in a reduction of consumer quality but not enough to lessen demand, and meeting all food safety requirements. (4) Unacceptable: not meeting food safety requirements, independent of meeting technical and organoleptic specifications, or meeting all food safety requirements but with product quality resulting in consumer rejection of the food offering. Sampling of products and consumer tastings within the distribution network are a second critical element of the quality assurance process and are the data sources for the statistical analyses. Each finding is not independently assessed with the rubric. For example, the chemical data will be used to back up/support any inferences on the sensory profiles of the ingredients. Certain flavor profiles may not be as apparent when mixed with other ingredients, which leads to weighing specifications differentially in the acceptability decision. Quality assurance processes are essential to achieve the balance of quality and profitability by making sure the food is safe and tastes good while identifying and remediating product quality issues before they hit the stores. Comprehensive quality assurance procedures implement human factors methodologies, and this report provides recommendations for the systemic application of quality assurance processes for quick-service restaurant services. This case study reviews the complex decision rubric and evaluates processes to ensure that the right balance of cost, quality, and safety is achieved.
Keywords: decision making, food safety, organoleptics, product compliance, quality assurance
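The four-level rubric described in this abstract maps naturally onto a small decision rule. The sketch below encodes that logic; the boolean inputs and field names are simplifications and assumptions, not the company's actual evaluation criteria.

```python
# Illustrative sketch: the four-level product compliance rubric as a decision rule.
from dataclasses import dataclass

@dataclass
class EvaluationResult:
    meets_food_safety: bool   # all food-safety requirements met
    meets_all_specs: bool     # all technical + organoleptic/psychological specs met
    quality_reduced: bool     # deviations reduce perceived consumer quality
    consumer_rejects: bool    # quality loss large enough to reject the offering

def rubric_level(r: EvaluationResult) -> str:
    if not r.meets_food_safety or r.consumer_rejects:
        return "4 - Unacceptable"
    if r.meets_all_specs:
        return "1 - Ideal"
    if not r.quality_reduced:
        return "2 - Deviation from ideal, no impact on quality"
    return "3 - Acceptable (reduced quality, demand unaffected)"

sample = EvaluationResult(meets_food_safety=True, meets_all_specs=False,
                          quality_reduced=True, consumer_rejects=False)
print(rubric_level(sample))
```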
Procedia PDF Downloads 188
14629 Presentation of International Military Intervention Correlates (IMIC) Database
Authors: Daniil Chernov
Abstract:
In the modern world, the number of conventional interstate wars is declining while the number of military interventions is rising. States no longer initiate conflicts by declaring war but actively intervene in existing military confrontations, often using a comparable amount of coercive means. According to existing scholarly understanding, the decision to use force in international relations (in any form) is influenced by roughly the same set of factors: the dynamics of domestic political processes, national interests, international law, and ethical considerations. In the database on armed intervention to be presented in the report, a multifactor model of decision-making is developed. The database describes more than 200 different parameters for armed interventions between 1992 and 2022. The report will present the structure of the database, descriptive statistics, and its key advantages over other sources.
Keywords: conflict resolution, international relations, military intervention, database
Procedia PDF Downloads 34
14628 A Study of Learning Achievement for Heat Transfer by Using Experimental Sets of Convection with the Predict-Observe-Explain Teaching Technique
Authors: Wanlapa Boonsod, Nisachon Yangprasong, Udomsak Kitthawee
Abstract:
Thermal physics education is a complicated and challenging topic to discuss in any classroom. As a result, most students tend to be uninterested in learning this topic. In the current study, a convection experiment set was devised to show how heat can be transferred by a convection system to a thermoelectric plate until an LED flashes. This research aimed to 1) create a natural convection experimental set, 2) study learning achievement on the convection experimental set with the predict-observe-explain (POE) technique, and 3) study satisfaction with the convection experimental set with the predict-observe-explain (POE) technique. The samples were chosen by purposive sampling and comprised 28 students in grade 11 at Patumkongka School in Bangkok, Thailand. The primary research instrument was the lesson plan for the predict-observe-explain (POE) technique on heat transfer using the convection experimental set. The instruments used to collect data included an achievement measure on heat transfer by convection, a satisfaction questionnaire administered after the learning activity, and the predict-observe-explain (POE) technique for heat transfer using the convection experimental set. The research format comprised a one-group pretest-posttest design. The data were analyzed with the GeoGebra program. The statistics used in the research were the mean, standard deviation, and t-test for dependent samples. The resulting convection experimental set was composed of a thermoelectric module whose top side was attached to a heat sink and whose other side was attached to a stainless plate. Electrical current was indicated by the flashing of a 5 V LED. The entire thermoelectric assembly was set up on the top of the box and heated by an alcohol burner. Learning achievement measured with the predict-observe-explain (POE) technique and the natural convection experimental set was statistically higher than before learning at the 0.01 level. Satisfaction with POE for physics learning of heat transfer using the convection experimental set was at a high level (4.83 out of 5.00).
Keywords: convection, heat transfer, physics education, POE
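The dependent-samples t-test used for the pretest-posttest comparison can be sketched as below. The scores are fabricated stand-ins for the 28 students' achievement scores, shown only to illustrate the analysis, not the study's data.

```python
# Sketch: paired (dependent-samples) t-test for a one-group pretest-posttest design.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
pretest = rng.normal(12, 3, size=28).round()
posttest = pretest + rng.normal(5, 2, size=28).round()   # hypothetical gain after POE

t, p = stats.ttest_rel(posttest, pretest, alternative="greater")
print(f"mean gain = {np.mean(posttest - pretest):.2f}, t = {t:.2f}, one-sided p = {p:.4f}")
```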
Procedia PDF Downloads 218
14627 The Effects of Aging on the Cost of Operating and Support: An Empirical Study Applied to Weapon Systems
Authors: Byungchae Kim, Jiwoo Nam
Abstract:
Aging of weapon systems can cause the failure and degeneration of components, which results in an increase in operating and support (O&S) costs. However, it is questionable whether this aging effect is strong enough to significantly influence national defense spending through a rapid increase in O&S costs. To examine this, we conduct a literature review analyzing the aging effect on US weapon systems. We also conduct empirical research using a maintenance database of Korean weapon systems, the Defense Logistics Integrated Information System (DAIIS). We run regressions of various types of O&S cost on weapon system age to investigate the statistical significance of the aging effect and use a generalized linear model to find relations between the failures of differently priced components and age. Our major finding is that although the aging effect exists, its impact on weapon system cost does not seem to be too large, considering that several O&S cost elements do not depend on age.
Keywords: O&S cost, aging effect, weapon system, GLM
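The GLM step mentioned in this abstract can be sketched as a Poisson regression of component failure counts on system age. The data below are synthetic (the DAIIS records are not public), and the exposure variable and coefficient values are assumptions for illustration only.

```python
# Sketch: Poisson GLM of annual component failure counts on weapon system age.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
age = rng.uniform(1, 25, size=500)                 # system age in years (synthetic)
exposure = rng.uniform(200, 1200, size=500)        # operating hours (assumed exposure)
true_rate = np.exp(-4.0 + 0.03 * age)              # weak aging effect built into the data
failures = rng.poisson(true_rate * exposure)

X = sm.add_constant(age)
model = sm.GLM(failures, X, family=sm.families.Poisson(), exposure=exposure).fit()
print(model.summary().tables[1])                   # the age coefficient is the aging effect
print("% increase in failure rate per year of age:",
      round(100 * (np.exp(model.params[1]) - 1), 2))
```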
Procedia PDF Downloads 142
14626 The Effects of Leadership on the Claim of Responsibility
Authors: Katalin Kovacs
Abstract:
In most forms of violence, the perpetrators intend to hide their identities. Terrorism is different. Terrorist groups often take responsibility for their attacks, and consequently they reveal their identities. This unique characteristic of terrorism has been largely overlooked, and scholars are still puzzled as to why terrorist groups claim responsibility for their attacks. Certainly, the claim of responsibility is worth analysing. It would help to have a clearer picture of what terrorist groups try to achieve and how, but also to develop an understanding of the strategic planning of terrorist attacks and the message the terrorists intend to deliver. The research aims to answer the question of why terrorist groups choose to claim responsibility for some of their attacks and not for others. In order to do so, the claim of responsibility is considered to be a tactical choice, based on the assumption that terrorists weigh the costs and benefits of claiming responsibility. The main argument is that terrorist groups do not claim responsibility in cases where there is no tactical advantage gained from claiming responsibility. The idea that the claim of responsibility has tactical value offers the opportunity to test these assertions using a large-scale empirical analysis. The claim of responsibility as a tactical choice depends on other tactical choices, such as the choice of target, the internationality of the attack, the number of victims, and whether the group occupies territory or operates as an underground group. The structure of the terrorist groups and the level of decision making also affect the claim of responsibility. Terrorists on the lower levels are less disciplined than the leaders. This means that terrorists on lower levels pay less attention to the strategic objectives, engage more easily in indiscriminate violence, and consequently are less likely to claim responsibility. Therefore, the research argues that terrorists at the highest level of decision making would claim responsibility for the attacks, as they are the ones who take the strategic objectives into account. As most studies on terrorism fail to provide definitions, the research is fragmented and incomparable. Separate, isolated studies do not support comprehensive thinking. It is also very important to note that only a few studies use quantitative methods. The aim of the research is to develop a new and comprehensive overview of the claim of responsibility based on strong quantitative evidence. By using well-established definitions and operationalisation, the current research focuses on a broad range of attributes that can have tactical value in order to determine the circumstances under which terrorists are more likely to claim responsibility.
Keywords: claim of responsibility, leadership, tactical choice, terrorist group
Procedia PDF Downloads 313