Search results for: Bayes' decision
2377 High Techno-Parks in the Economy of Azerbaijan and Their Management Problems
Authors: Rasim M. Alguliyev, Alovsat G. Aliyev, Roza O. Shahverdiyeva
Abstract:
The paper investigated the role and position of high techno-parks, which are among the priorities of Azerbaijan. The main objectives, functions and features of the establishment of high techno-parks were analyzed, as well as the organization of the activities of their structural elements, which make up the park complex, and their interactions. The development, organization and management of high techno-parks were studied. The key features and functions of the management of innovative structures were explained. The need for a comprehensive management system for the development of high techno-parks was emphasized, and the major problems were analyzed. In addition, methods were proposed for the development of information systems supporting decision making in the systematic and sustainable management of the parks.
Keywords: innovative development, innovation processes, innovation economy, innovation infrastructure, high technology park, efficient management, management decisions, information insurance
Procedia PDF Downloads 473
2376 Signal Restoration Using Neural Network Based Equalizer for Nonlinear Channels
Authors: Z. Zerdoumi, D. Benatia, D. Chicouche
Abstract:
This paper investigates the application of artificial neural networks to the problem of nonlinear channel equalization. The difficulties caused by channel distortions such as inter-symbol interference (ISI) and nonlinearity can be overcome by nonlinear equalizers employing neural networks. It has been shown that multilayer perceptron based equalizers significantly outperform linear equalizers. We present a multilayer perceptron based equalizer with decision feedback (MLP-DFE) trained with the back-propagation algorithm. The capacity of the MLP-DFE to deal with nonlinear channels is evaluated. From simulation results, it can be noted that the MLP based DFE significantly improves the restored signal quality, the steady-state mean square error (MSE), and the minimum bit error rate (BER) compared with its conventional counterpart.
Keywords: artificial neural network, signal restoration, nonlinear channel equalization, equalization
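To make the MLP-DFE structure concrete, here is a minimal sketch assuming an invented nonlinear ISI channel, BPSK symbols, and scikit-learn's back-propagation-trained MLPRegressor; the tap counts, decision delay, and channel coefficients are illustrative, not the paper's configuration.

```python
# Minimal MLP decision-feedback equalizer sketch on a hypothetical channel.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
N = 20000
s = rng.choice([-1.0, 1.0], size=N)                 # BPSK source symbols

# Hypothetical nonlinear channel: linear ISI plus quadratic distortion + noise.
h = np.array([0.35, 0.85, 0.35])
lin = np.convolve(s, h, mode="full")[:N]
r = lin + 0.2 * lin**2 + 0.05 * rng.normal(size=N)

FF, FB, D = 5, 2, 2                                 # feedforward/feedback taps, delay
X, y = [], []
for n in range(FF, N):
    ff = r[n - FF + 1:n + 1]                        # current + past received samples
    fb = s[n - D - FB:n - D]                        # past symbols (teacher forcing;
    X.append(np.concatenate([ff, fb]))              # a live DFE feeds back decisions)
    y.append(s[n - D])                              # target: delayed symbol
X, y = np.array(X), np.array(y)

mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=200, random_state=0)
cut = 15000
mlp.fit(X[:cut], y[:cut])                           # back-propagation training

pred = np.sign(mlp.predict(X[cut:]))                # hard symbol decisions
print(f"BER on held-out symbols: {np.mean(pred != y[cut:]):.4f}")
```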
Procedia PDF Downloads 497
2375 eTransformation Framework for the Cognitive Systems
Authors: Ana Hol
Abstract:
Digital systems are in the cognitive wave of eTransformation and are now extensively aimed at meeting the demands of individuals, both those of customers requiring services and those of service providers. It is also apparent that successful future systems will not simply open doors for traditional owners/users to offer and receive services, as Uber does today, but will require more customized and cognitively enabled infrastructures that are responsive to the system user's needs. To identify what is required for such systems, this research reviews the historical and current effects of the eTransformation process by studying: 1. eTransitions of company websites and mobile applications, 2. The emergence of new shared-economy business models such as Uber, and 3. New requirements for demand-driven cognitive systems capable of learning and just-in-time decision making. Based on the analysis, this study proposes a Cognitive eTransformation Framework capable of guiding implementations of new responsive and user-aware systems.
Keywords: system implementations, AI supported systems, cognitive systems, eTransformation
Procedia PDF Downloads 238
2374 A Comprehensive Key Performance Indicators Dashboard for Emergency Medical Services
Authors: Giada Feletti, Daniela Tedesco, Paolo Trucco
Abstract:
The present study aims to develop a dashboard of Key Performance Indicators (KPIs) to enhance information and predictive capabilities in Emergency Medical Services (EMS) systems, supporting both the operational and strategic decisions of different actors. The research methodology starts with a review of the technical-scientific literature concerning the indicators currently used to measure the performance of EMS systems. From this literature analysis, it emerged that current studies focus on two distinct perspectives: the ambulance service, a fundamental component of pre-hospital health treatment, and patient care in the Emergency Department (ED). The perspective proposed by this study is to consider an integrated view of the ambulance service process and the ED process, both essential to ensure high quality of care and patient safety. Thus, the proposal focuses on the entire healthcare service process and, as such, makes it possible to consider the interconnection between the two EMS processes, the pre-hospital and the hospital one, connected by the assignment of the patient to a specific ED. In this way, the entire patient management can be optimized. Therefore, attention is paid to dependencies between decisions that current EMS management models tend to neglect or underestimate. In particular, the integration of the two processes enables the evaluation of the advantage of an ED selection decision made with visibility of the EDs' saturation status, therefore considering the distance, the available resources and the expected waiting times. Starting from a critical review of the KPIs proposed in the extant literature, the dashboard was designed: the high number of analyzed KPIs was reduced by eliminating first those not in line with the aim of the study and then those supporting a similar functionality. The KPIs finally selected were tested on a realistic dataset, which led us to exclude additional indicators due to the unavailability of the data required for their computation. The final dashboard, which was discussed and validated by experts in the field, includes a variety of KPIs able to support operational and planning decisions, early warning, and citizens' real-time awareness of ED accessibility. By associating each KPI with the EMS phase it refers to, it was also possible to design a well-balanced dashboard covering both the efficiency and the effectiveness of the entire EMS process. Indeed, only the initial phases, related to the interconnection between the ambulance service and patient care, are covered by traditional KPIs, as opposed to the subsequent phases taking place in the hospital ED. This could be taken into consideration for potential future development of the dashboard. Moreover, the research could proceed by building a multi-layer dashboard composed of a first level with a minimal set of KPIs measuring the basic performance of the EMS system at an aggregate level, and further levels with KPIs that bring additional and more detailed information.
Keywords: dashboard, decision support, emergency medical services, key performance indicators
Procedia PDF Downloads 113
2373 Classification Based on Deep Neural Cellular Automata Model
Authors: Yasser F. Hassan
Abstract:
Deep learning is a branch of machine learning that has achieved great success in research and applications. Cellular neural networks are regarded as an array of nonlinear analog processors, called cells, connected in a way that allows parallel computation. The paper discusses how to use a deep learning structure to represent a neural cellular automata model. The proposed learning technique in the cellular automata model is examined from the structure of deep learning. A deep neural cellular automata system modifies each neuron based on the behavior of the individual and its decision, as a result of multi-level deep structure learning. The paper presents the architecture of the model, and the results of simulating the approach are given. Results from the implementation enrich the deep neural cellular automata system and shed light on the concept formulation of the model and the learning within it.
Keywords: cellular automata, neural cellular automata, deep learning, classification
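A minimal sketch of one neural cellular automaton step, in which every cell applies the same two-layer neural rule to its 3x3 neighbourhood; the weights are random here, whereas in the model described above they would be learned, and stacking steps plays the role of depth.

```python
# One neural-CA update: a shared small MLP maps each 3x3 neighbourhood
# to the cell's next state; repeating the step gives a "deep" dynamics.
import numpy as np

def neural_ca_step(grid, W1, b1, W2, b2):
    H, Wd = grid.shape
    padded = np.pad(grid, 1, mode="wrap")           # toroidal neighbourhoods
    new = np.empty_like(grid)
    for i in range(H):
        for j in range(Wd):
            nb = padded[i:i + 3, j:j + 3].ravel()   # 9 neighbourhood values
            h = np.tanh(nb @ W1 + b1)               # hidden layer of the rule
            new[i, j] = np.tanh(h @ W2 + b2)        # next cell state
    return new

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(9, 8)), np.zeros(8)       # untrained, for illustration
W2, b2 = rng.normal(size=8), 0.0

grid = rng.uniform(-1, 1, size=(16, 16))
for _ in range(5):                                  # stacked steps = deeper model
    grid = neural_ca_step(grid, W1, b1, W2, b2)
print(grid.shape, float(grid.min()), float(grid.max()))
```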
Procedia PDF Downloads 198
2372 Design of a Computational Model to Support the Calculation of a Structural Health Index for Bridges
Authors: Jeison Sánchez Araya, Cesar Garita, Giannina Ortiz
Abstract:
In many Latin American countries, including Costa Rica, the poor condition of national road bridges significantly hinders socioeconomic progress. Addressing this issue, this article introduces a computational method designed to evaluate and monitor bridge health over time. It outlines a business intelligence model that facilitates data storage from bridge inspections and supports structural health index calculations. A Power BI prototype displays crucial visualizations that improve decision making on infrastructure investments. This approach leverages business intelligence and hierarchical visualization techniques, offering a solution to quantitatively assess bridge health and prioritize investments in national infrastructure efficiently.
Keywords: bridges, business intelligence, structural health index, structural health monitoring
Procedia PDF Downloads 2
2371 Application of Deep Learning in Top Pair and Single Top Quark Production at the Large Hadron Collider
Authors: Ijaz Ahmed, Anwar Zada, Muhammad Waqas, M. U. Ashraf
Abstract:
We demonstrate the performance of a very efficient tagger, based on deep neural network algorithms, applied to hadronically decaying top quark pairs as signal, compared against QCD multi-jet background events. A significant enhancement of performance in boosted top quark events is observed with our limited computing resources. We also compare modern machine learning approaches and perform a multivariate analysis of boosted top-pair as well as single top quark production through the weak interaction at √s = 14 TeV at a proton-proton collider. The most relevant known background processes are incorporated. Using Boosted Decision Tree (BDT), likelihood and Multilayer Perceptron (MLP) techniques, the analysis is trained and its performance is compared with the conventional cut-and-count approach.
Keywords: top tagger, multivariate, deep learning, LHC, single top
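A toy version of the multivariate step, assuming synthetic distributions for a few jet-level discriminating variables rather than simulated collisions; scikit-learn's gradient-boosted trees stand in for the BDT.

```python
# Toy signal/background separation with invented kinematic features:
# jet mass near m_top, jet pT, and a tau32-like substructure ratio.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000
sig = np.column_stack([rng.normal(173, 15, n),      # jet mass [GeV], near m_top
                       rng.normal(500, 120, n),     # jet pT [GeV]
                       rng.beta(2, 5, n)])          # substructure ratio (toy)
bkg = np.column_stack([rng.exponential(60, n),      # falling QCD-like jet mass
                       rng.normal(420, 150, n),
                       rng.beta(5, 2, n)])
X = np.vstack([sig, bkg])
y = np.concatenate([np.ones(n), np.zeros(n)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3)
bdt.fit(X_tr, y_tr)
print("ROC AUC:", roc_auc_score(y_te, bdt.predict_proba(X_te)[:, 1]))
```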
Procedia PDF Downloads 111
2370 Implications of Social Rights Adjudication on the Separation of Powers Doctrine: Colombian Case
Authors: Mariam Begadze
Abstract:
Separation of Powers (SOP) has often been the most frequently posed objection against the judicial enforcement of socio-economic rights. Although a lot has been written to refute those objections, it has very rarely been assessed what effect the current practice of social rights adjudication has had on the construction of the SOP doctrine in specific jurisdictions. Colombia is an appropriate case study for this question. The notion of collaborative SOP in the 1991 Constitution has affected the court's conception of its role. On the other hand, trends in the jurisprudence have further shaped the collaborative notion of SOP. Other institutional characteristics of Colombian constitutional law have played their share of the role as well. The tutela action, a particularly flexible and fast judicial action for individuals, has placed the judiciary in a more confrontational relation vis-à-vis the political branches. Later interventions through abstract review of austerity measures further contributed to that development. Logically, the court's activism in this sphere has attracted attacks from the political branches, which have turned out to be unsuccessful precisely due to the court's outreach to the middle class, whose direct reliance on the court has turned into its direct democratic legitimacy. Only later have structural judgments attempted to revive the collaborative notion behind the SOP doctrine. However, the court-supervised monitoring of implementation has itself manifested fluctuations in the mode of collaboration, moving towards more managerial supervision recently. This is not surprising considering the highly dysfunctional political system in Colombia, where distrust seems to be the default starting point in the interaction of the branches. The paper aims to answer the question of what the appropriate judicial tools are to realize the collaborative notion of SOP in a context where the court has to strike a balance between a strong executive and a weak and largely dysfunctional legislative branch. If the recurrent abuse lies in the indifference and inaction of the legislative branch in failing to engage with political issues seriously, what are the tools in the court's hands to activate the political process? The answer to this question partly lies in the court's other strand of jurisprudence, in which it combines substantive objections with procedural ones concerning the operation of the legislative branch. The primary example is the decision on value-added tax on basic goods, in which the court invalidated the law based on the absence of sufficient deliberation in Congress on the bill's implications for the equity and progressiveness of the entire taxing system. The decision led to the Congressional rejection of an identical bill based on the arguments put forward by the court. The case is perhaps the best illustration of the collaborative notion of SOP, in which the court refrains from categorical pronouncements while doing its bit to activate the political process. This also legitimizes the court's activism, based on its role of countering the most perilous abuse in the Colombian context: the failure of the political system to engage seriously with serious political questions.
Keywords: Colombian constitutional court, judicial review, separation of powers, social rights
Procedia PDF Downloads 105
2369 Multi Agent System Architecture Oriented Prometheus Methodology Design for Reverse Logistics
Authors: F. Lhafiane, A. Elbyed, M. Bouchoum
Abstract:
The design of reverse logistics networks has attracted growing attention under the stringent pressures of both environmental awareness and business sustainability. Reverse logistics activities, which include the return, remanufacture, disassembly and disposal of products, can be quite complex to manage. In addition, demand can be difficult to predict, and decision making is one of the challenging tasks. This complexity has amplified the need to develop an integrated architecture for product return as an enterprise system. The main purpose of this paper is to design a multi-agent system (MAS) architecture using the Prometheus methodology to efficiently manage reverse logistics processes. The proposed MAS architecture includes five types of agents: Gatekeeping Agent, Collection Agent, Sorting Agent, Processing Agent and Disposal Agent, which act respectively during the five steps of the reverse logistics network.
Keywords: reverse logistics, multi agent system, Prometheus methodology
Procedia PDF Downloads 471
2368 Estimating the Value of Statistical Life under the Subsidization and Cultural Effects
Authors: Mohammad A. Alolayan, John S. Evans, James K. Hammitt
Abstract:
The value of statistical life (VSL) has been estimated for a Middle Eastern country with a highly subsidized economy. In this study, in-person interviews were conducted on a stratified random sample to estimate the value of mortality risk. Double-bounded dichotomous choice questions, followed by an open-ended question, were used in the interviews to investigate the respondents' willingness to pay for mortality risk reduction. High willingness to pay was found to be associated with high income and education. Also, females were found to have a lower willingness to pay than males. The estimated value of statistical life is larger than the ones estimated for Western countries, where a taxation system exists. This estimate provides decision makers in an Eastern country with a baseline for monetizing the health benefits of a proposed policy or program. Also, the value of statistical life for a country in the region can be extrapolated from this estimate by using the benefit transfer method.
Keywords: mortality, risk, VSL, willingness-to-pay
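A worked toy example of how a VSL figure follows from stated willingness to pay, and how a benefit-transfer adjustment might look; all numbers, including the income levels and the unit income elasticity, are invented for illustration.

```python
# VSL from stated preference data: VSL = mean WTP / mortality-risk reduction.
import numpy as np

wtp = np.array([120.0, 300.0, 80.0, 450.0, 200.0])  # annual WTP per respondent ($)
risk_reduction = 1e-4                                # offered cut in annual death risk

vsl = wtp.mean() / risk_reduction
print(f"Mean WTP = ${wtp.mean():.0f}, VSL = ${vsl:,.0f}")

# Benefit transfer to a similar country, scaling by relative income with an
# assumed income elasticity of 1 (both incomes are hypothetical):
vsl_other = vsl * (30_000 / 45_000) ** 1.0
print(f"Transferred VSL = ${vsl_other:,.0f}")
```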
Procedia PDF Downloads 315
2367 2.5D Face Recognition Using Gabor Discrete Cosine Transform
Authors: Ali Cheraghian, Farshid Hajati, Soheila Gheisari, Yongsheng Gao
Abstract:
In this paper, we present a novel 2.5D face recognition method based on the Gabor Discrete Cosine Transform (GDCT). In the proposed method, the Gabor filter is applied to extract feature vectors from the texture and the depth information. Then, the Discrete Cosine Transform (DCT) is used for dimensionality and redundancy reduction to improve computational efficiency. The system combines texture and depth information at the decision level, which yields higher performance compared to methods that use texture or depth information separately. The proposed algorithm is examined on the publicly available Bosphorus database, including models with pose variation. The experimental results show that the proposed method has higher performance compared to the benchmark.
Keywords: Gabor filter, discrete cosine transform, 2.5D face recognition, pose
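A condensed sketch of a GDCT-style feature pipeline, assuming OpenCV's Gabor kernels and SciPy's DCT: filter the image at several orientations, then keep a low-frequency block of DCT coefficients per response. The image here is random noise standing in for a registered 2.5D texture or depth map, and all filter parameters are illustrative.

```python
# Gabor filtering + low-frequency 2-D DCT coefficients as a face descriptor.
import numpy as np
import cv2
from scipy.fft import dctn

img = np.random.default_rng(0).uniform(size=(64, 64)).astype(np.float32)

features = []
for theta in np.arange(0, np.pi, np.pi / 4):         # 4 Gabor orientations
    kern = cv2.getGaborKernel((15, 15), sigma=3.0, theta=theta,
                              lambd=8.0, gamma=0.5, psi=0)
    resp = cv2.filter2D(img, cv2.CV_32F, kern)
    coeffs = dctn(resp, norm="ortho")[:8, :8]        # keep 64 low-freq coefficients
    features.append(coeffs.ravel())

feature_vector = np.concatenate(features)            # per-modality descriptor
print(feature_vector.shape)                          # (256,)
# Texture and depth vectors would each be matched separately and the two
# match scores fused at the decision level, as the abstract describes.
```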
Procedia PDF Downloads 328
2366 Assuming the Decision of Having One (More) Child: The New Dimensions of the Post Communist Romanian Family
Authors: Horea-Serban Raluca-Ioana, Istrate Marinela
Abstract:
The first part of the paper analyzes the dynamics of the total fertility rate at both the national and regional level, pointing out the regional disparities in the distribution of this indicator. At the same time, we also focus on the collapse of the number of live births, on the changes in the fertility rate by birth rank, as well as on the failure to achieve the desired number of children. The second part of the study centres upon a survey applied to urban families with 3 or more offspring. The preliminary analysis highlights the fact that increased fertility (beyond the 3rd birth rank) is triggered by the parents' above-average material condition and superior education. The current situation of Romania, which is still passing through a period of relatively rapid demographic changes marked by numerous convulsions, requires a new approach, in compliance with the recent interpretations appropriate to a new post-transitional demographic regime.
Keywords: fertility rate, family size intention, third birth rank, regional disparities
Procedia PDF Downloads 327
2365 Efficiency Measurement of Indian Sugar Manufacturing Firms - a DEA Approach
Authors: Amit Kumar Dwivedi, Priyanko Ghosh
Abstract:
Data Envelopment Analysis (DEA) has been used to calculate the technical and scale efficiency measures of the public and private sugar manufacturing firms of the Indian sugar industry (2006 to 2010). Within the DEA framework, the input- and output-oriented Variable Returns to Scale (VRS) and Constant Returns to Scale (CRS) models are employed for the study of decision-making units (DMUs). A representative sample of 43 firms, which account for a major portion of the total market share, is studied. The selection criterion for the inclusion of a firm in the analysis was total sales of INR 5,000 million or more in the year 2010. After reviewing the literature, it was found that no study had been conducted in the context of Indian sugar manufacturing firms in the post-liberalization era, which motivated us to initiate this study.
Keywords: technical efficiency, Indian sugar manufacturing units, DEA, input output oriented
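A minimal input-oriented CCR (constant returns to scale) DEA model, solved per decision-making unit with SciPy's linear programming routine; the inputs and outputs below are toy figures, not the study's firm data.

```python
# Input-oriented CCR DEA: minimise theta such that a convex combination of
# peers uses at most theta * inputs of the evaluated DMU while producing
# at least its outputs.
import numpy as np
from scipy.optimize import linprog

X = np.array([[20., 30.], [40., 25.], [30., 50.], [25., 40.]])  # inputs (n x m)
Y = np.array([[100.], [90.], [120.], [80.]])                    # outputs (n x s)
n = len(X)

for o in range(n):
    # Decision variables: [theta, lambda_1 .. lambda_n]
    c = np.r_[1.0, np.zeros(n)]                     # minimise theta
    # Inputs:  sum_j lambda_j * x_j - theta * x_o <= 0
    A_in = np.c_[-X[o][:, None], X.T]
    # Outputs: -sum_j lambda_j * y_j <= -y_o
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[1]), -Y[o]],
                  bounds=[(0, None)] * (n + 1))
    print(f"DMU {o}: technical efficiency = {res.x[0]:.3f}")

# Adding the convexity constraint sum(lambda) == 1 via A_eq turns this CRS
# model into the VRS (BCC) variant mentioned in the abstract.
```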
Procedia PDF Downloads 542
2364 The Twin Terminal of Pedestrian Trajectory Based on City Intelligent Model (CIM) 4.0
Authors: Chen Xi, Lao Xuerui, Li Junjie, Jiang Yike, Wang Hanwei, Zeng Zihao
Abstract:
To further promote the development of smart cities, the microscopic "nerve endings" of the City Intelligent Model (CIM) are extended to be more sensitive. In this paper, we develop a pedestrian trajectory twin terminal based on CIM and CNN technology. It uses 5G networks, architectural and geoinformatics technologies, and convolutional neural networks combined with deep learning networks for human behaviour recognition models, to provide empirical data such as pedestrian flow data and human behavioural characteristics data, and ultimately to form spatial performance evaluation criteria and spatial performance warning systems, making the empirical data accurate and intelligent for prediction and decision making.
Keywords: urban planning, urban governance, CIM, artificial intelligence, convolutional neural network
Procedia PDF Downloads 150
2363 Collect Meaningful Information about Stock Markets from the Web
Authors: Saleem Abuleil, Khalid S. Alsamara
Abstract:
Events represent a significant source of information on the web; they deliver information about events that have occurred around the world on all kinds of subjects and areas. These events can be collected and organized to provide valuable and useful information for decision makers, researchers, as well as any person seeking knowledge. In this paper, we discuss ongoing research that targets the stock markets domain to observe and record changes (events) when they happen, collect them, understand the meaning of each one of them, and organize the information along with its meaning in a well-structured format. Using the Semantic Role Labeling (SRL) technique, we identified four factors for each event: the verb of action and the three roles associated with it (entity name, attribute, and attribute value). We have generated a set of rules and techniques to support our approach to analyzing and understanding the meaning of the events taking place in stock markets.
Keywords: natural language processing, Arabic language, event extraction and understanding, semantic role labeling, stock market
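As a rough illustration of the four-factor event representation, here is a heavily simplified, rule-based sketch in English using spaCy's dependency parse; the paper itself targets Arabic text with full SRL, and the headline, the spaCy model name, and the hard-coded attribute label are assumptions made purely for illustration.

```python
# Extract a (verb, entity, attribute, value) tuple from a market headline
# with shallow dependency rules; a stand-in for proper SRL.
import spacy

nlp = spacy.load("en_core_web_sm")                   # assumes the model is installed
doc = nlp("Apple shares rose 3 percent after the earnings report.")

for token in doc:
    if token.pos_ == "VERB":                         # candidate action verb
        subj = [c for c in token.children if c.dep_ == "nsubj"]
        nums = [t for t in token.subtree if t.like_num]
        if subj:
            entity = " ".join(t.text for t in subj[0].subtree)
            value = nums[0].text if nums else None
            print({"verb": token.lemma_,
                   "entity": entity,
                   "attribute": "price change",      # placeholder label
                   "value": value})
```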
Procedia PDF Downloads 393
2362 Applications of Digital Tools, Satellite Images and Geographic Information Systems in Data Collection of Greenhouses in Guatemala
Authors: Maria A. Castillo H., Andres R. Leandro, Jose F. Bienvenido B.
Abstract:
During the last 20 years, the globalization of economies, population growth, and the increase in the consumption of fresh agricultural products have generated greater demand for ornamentals, flowers, fresh fruits, and vegetables, mainly from tropical areas. This market situation has demanded greater competitiveness and control over production, with more efficient protected agriculture technologies, which provide greater productivity and make it possible to guarantee the required quality and quantity in a constant and sustainable way. Guatemala, located in the north of Central America, is one of the largest exporters of agricultural products in the region and exports fresh vegetables, flowers, fruits, ornamental plants, and foliage, most of which are grown in greenhouses. Although there are no official agricultural statistics on greenhouse production, several theses and congress reports have presented consistent estimates. A wide range of protection structures and roofing materials are used, from the most basic and simple ones for rain control to highly technical and automated structures connected with remote sensors for the monitoring and control of crops. With this breadth of technological models, it is necessary to analyze georeferenced data related to the cultivated area, to the different existing models, and to the covering materials, integrated with altitude, climate, and soil data. The georeferenced registration of production units, data collection with digital tools, and the use of satellite images and geographic information systems (GIS) provide reliable tools to elaborate more complete, agile, and dynamic information maps. This study details a proposed methodology for gathering georeferenced data on high protection structures (greenhouses) in Guatemala, structured in four phases: diagnosis of available information, definition of the geographic frame, selection of satellite images, and integration with a geographic information system (GIS). It especially takes account of the actual lack of complete data needed to obtain a reliable decision-making system; this gap is addressed through the proposed methodology. A summary of the results is presented for each phase, and finally, an evaluation with some improvements and tentative recommendations for further research is added. The main contribution of this study is to propose a methodology that reduces the gap in georeferenced data on protected agriculture in this specific area, where data are not generally available, and provides data of better quality, traceability, accuracy, and certainty for strategic agricultural decision making, applicable to other crops, production models and similar or neighboring geographic areas.
Keywords: greenhouses, protected agriculture, GIS, Guatemala, satellite image, digital tools, precision agriculture
Procedia PDF Downloads 194
2361 Evaluation of the Boiling Liquid Expanding Vapor Explosion Thermal Effects in Hassi R'Mel Gas Processing Plant Using Fire Dynamics Simulator
Authors: Brady Manescau, Ilyas Sellami, Khaled Chetehouna, Charles De Izarra, Rachid Nait-Said, Fati Zidani
Abstract:
During a fire in an oil and gas refinery, several thermal accidents can occur and cause serious damage to people and the environment. Among these accidents, the BLEVE (Boiling Liquid Expanding Vapor Explosion) is the most observed and remains a major concern for risk decision-makers. It corresponds to a violent vaporization of explosive nature following the rupture of a vessel containing a liquid at a temperature significantly higher than its normal boiling point at atmospheric pressure. Its effects on the environment generally appear in three ways: blast overpressure, radiation from the fireball if the liquid involved is flammable, and fragment hazards. In order to estimate the potential damage that would be caused by such an explosion, risk decision-makers often use quantitative risk analysis (QRA). This analysis is a rigorous and advanced approach that requires reliable data in order to obtain a good estimate and control of risks. However, in most cases, the data used in QRA are obtained from empirical correlations. These empirical correlations generally overestimate BLEVE effects because they are based on simplifications and do not take into account real parameters such as geometry effects. Considering that these risk analyses are based on an assessment of BLEVE effects on human life and plant equipment, more precise and reliable data should be provided. From this point of view, CFD modeling of BLEVE effects appears as a solution to the limitations of the empirical laws. In this context, the main objective is to develop a numerical tool to predict BLEVE thermal effects using the CFD code FDS version 6. Simulations are carried out with a mesh size of 1 m. The fireball source is modeled as a vertical release of hot fuel over a short time. The modeling of fireball dynamics is based on single-step combustion, using an EDC model coupled with the default LES turbulence model. Fireball characteristics (diameter, height, heat flux and lifetime) from the large-scale BAM experiment are used to demonstrate the ability of FDS to simulate the various stages of the BLEVE phenomenon, from ignition up to total burnout. The influence of release parameters such as the injection rate and the radiative fraction on the fireball heat flux is also presented. Predictions are very encouraging and show good agreement with the BAM experiment data. In addition, a numerical study is carried out on an operational propane accumulator in an Algerian gas processing plant of the SONATRACH company, located in the Hassi R'Mel gas field (the largest gas field in Algeria).
Keywords: BLEVE effects, CFD, FDS, fireball, LES, QRA
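For context on the empirical correlations the abstract says QRA typically relies on, here is a sketch of widely cited Roberts-type fireball relations and a point-source radiation model; the coefficients, the propane heat of combustion, the radiative fraction and the example mass are indicative values from the general literature, not figures from this study.

```python
# Classic BLEVE fireball correlations: maximum diameter, burn duration,
# and radiant flux received at a ground distance (point-source model).
import math

def bleve_fireball(mass_kg, distance_m, heat_of_combustion=46.3e6,
                   radiative_fraction=0.3, transmissivity=1.0):
    d_max = 5.8 * mass_kg ** (1 / 3)                # max fireball diameter [m]
    duration = (0.45 * mass_kg ** (1 / 3) if mass_kg < 3e4
                else 2.6 * mass_kg ** (1 / 6))      # burn time [s]
    # Radiant flux at the receiver, point-source approximation:
    q = (radiative_fraction * mass_kg * heat_of_combustion * transmissivity
         / (4 * math.pi * distance_m ** 2 * duration))
    return d_max, duration, q

d, t, q = bleve_fireball(mass_kg=50_000, distance_m=300)
print(f"D_max = {d:.0f} m, duration = {t:.1f} s, flux = {q / 1000:.1f} kW/m^2")
```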
Procedia PDF Downloads 186
2360 Digital Twins in the Built Environment: A Systematic Literature Review
Authors: Bagireanu Astrid, Bros-Williamson Julio, Duncheva Mila, Currie John
Abstract:
Digital Twins (DT) are an innovative concept of cyber-physical integration of data between an asset and its virtual replica. They originated in established industries such as manufacturing and aviation and have garnered increasing attention as a potentially transformative technology within the built environment. With the potential to support decision-making, real-time simulations, forecasting and the management of operations, DT do not fall under a singular scope. This makes defining and leveraging the potential uses of DT a potentially missed opportunity. Despite their recognised potential in established industries, literature on DT in the built environment remains limited. Inadequate attention has been given to the implementation of DT in construction projects, as opposed to its operational-stage applications. Additionally, the absence of a standardised definition has resulted in inconsistent interpretations of DT in both industry and academia. There is a need to consolidate research to foster a unified understanding of DT. Such consolidation is indispensable to ensure that future research is undertaken on a solid foundation. This paper aims to present a comprehensive systematic literature review of the role of DT in the built environment. To accomplish this objective, a review and thematic analysis was conducted, encompassing relevant papers from the last five years. The identified papers are categorised based on their specific areas of focus, and their content was translated into a thorough classification of DT. In characterising DT and the associated data processes identified, this systematic literature review has identified six DT opportunities specifically relevant to the built environment: Facilitating collaborative procurement methods, Supporting net-zero and decarbonisation goals, Supporting Modern Methods of Construction (MMC) and off-site manufacturing (OSM), Providing increased transparency and stakeholder collaboration, Supporting complex decision making (real-time simulations and forecasting abilities) and Seamless integration with the Internet of Things (IoT), data analytics and other DT. Finally, a discussion of each area of research is provided. A table of definitions of DT across the reviewed literature is provided, seeking to delineate the current state of DT implementation in the built environment context. Gaps in knowledge are identified, as well as research challenges and opportunities for further advancement in the implementation of DT within the built environment. This paper critically assesses the existing literature to identify the potential of DT applications, aiming to harness the transformative capabilities of data in the built environment. By fostering a unified comprehension of DT, this paper contributes to advancing the effective adoption and utilisation of this technology, accelerating progress towards the realisation of smart cities, decarbonisation, and the other envisioned roles for DT in the construction domain.
Keywords: built environment, design, digital twins, literature review
Procedia PDF Downloads 81
2359 Risk Assessment of Building Information Modelling Adoption in Construction Projects
Authors: Amirhossein Karamoozian, Desheng Wu, Behzad Abbasnejad
Abstract:
Building information modelling (BIM) is a new technology to enhance the efficiency of project management in the construction industry. In addition to the potential benefits of this useful technology, there are various risks and obstacles to applying it in construction projects. In this study, a decision-making approach is presented for risk assessment of BIM adoption in construction projects. Various risk factors of applying BIM during the different phases of the project lifecycle are identified with the help of the Delphi method, experts' opinions and the related literature. Afterward, Shannon's entropy and Fuzzy TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution) are applied to derive the priorities of the identified risk factors. Results indicated that lack of knowledge among professional engineers about workflows in BIM and conflicts of opinion between different stakeholders are the risk factors with the highest priority.
Keywords: risk, BIM, fuzzy TOPSIS, construction projects
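A compact sketch of the weighting-and-ranking step: Shannon entropy weights derived from a decision matrix, followed by TOPSIS closeness scores. For brevity this is crisp TOPSIS rather than the fuzzy variant used in the study (which would replace the scores with triangular fuzzy numbers), and the risk-factor scores are invented.

```python
# Entropy-weighted TOPSIS over a toy risk-factor decision matrix.
import numpy as np

# Rows: risk factors; columns: evaluation criteria (e.g. probability, impact).
M = np.array([[7., 8., 6.],
              [5., 6., 9.],
              [8., 5., 7.],
              [6., 9., 5.]])

P = M / M.sum(axis=0)                                # column-normalised shares
E = -(P * np.log(P)).sum(axis=0) / np.log(len(M))    # Shannon entropy per criterion
w = (1 - E) / (1 - E).sum()                          # entropy weights

V = w * M / np.linalg.norm(M, axis=0)                # weighted normalised matrix
ideal, anti = V.max(axis=0), V.min(axis=0)           # benefit-type criteria assumed
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)                  # higher = closer to ideal
print("Priority order (highest risk first):", np.argsort(-closeness))
```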
Procedia PDF Downloads 229
2358 An Emergentist Defense of Incompatibility between Morally Significant Freedom and Causal Determinism
Authors: Lubos Rojka
Abstract:
The common perception of morally responsible behavior is that it presupposes freedom of choice, and that free decisions and actions are not determined by natural events, but by a person. In other words, the moral agent has the ability and the possibility of doing otherwise when making morally responsible decisions, and natural causal determinism cannot fully account for morally significant freedom. The incompatibility between a person's morally significant freedom and causal determinism appears to be a natural position. Nevertheless, some of the most influential philosophical theories of moral responsibility are compatibilist or semi-compatibilist, and they exclude the requirement of alternative possibilities, which contradicts the claims of classical incompatibilism. Compatibilists often employ Frankfurt-style thought experiments to prove their theory. The goal of this paper is to examine the role of imaginary Frankfurt-style examples in compatibilist accounts. More specifically, the compatibilist accounts defended by John Martin Fischer and Michael McKenna will be inserted into the broader understanding of a person elaborated by Harry Frankfurt, Robert Kane and Walter Glannon. Deeper analysis reveals that the exclusion of alternative possibilities based on Frankfurt-style examples is problematic and misleading. A more comprehensive account of moral responsibility and morally significant (source) freedom requires higher-order complex theories of human will and consciousness, in which rational and self-creative abilities, and a real possibility to choose otherwise, at least on some occasions during a lifetime, are necessary. Theoretical moral reasons and their logical relations seem to require a sort of higher-order agent-causal incompatibilism. The ability of theoretical or abstract moral reasoning requires complex (strongly emergent) mental and conscious properties, among them an effective free will, together with first- and second-order desires. Such a hierarchical theoretical model unifies reasons-responsiveness, mesh theory and emergentism. It is incompatible with physical causal determinism, because such determinism only allows non-systematic processes that may be hard to predict, but not complex (strongly) emergent systems. An agent's effective will and conscious reflectivity is the starting point of a morally responsible action, which explains why a decision is 'up to the subject'. A free decision does not always have a complete causal history. This kind of emergentist source hyper-incompatibilism seems to be the best direction for the search for an adequate explanation of moral responsibility in the traditional (merit-based) sense. Physical causal determinism as a universal theory would exclude morally significant freedom and responsibility in the traditional sense because it would exclude the emergence of, and supervenience by, the essential complex properties of human consciousness.
Keywords: consciousness, free will, determinism, emergence, moral responsibility
Procedia PDF Downloads 164
2357 The Factors that Effect to User Satisfaction of Information System in Bangkok Hospital
Authors: Somchai Buaroong
Abstract:
This research attempted to study information system success in the dimension of user satisfaction and to find the association between the independent factors of user experience, user knowledge, and user attitude. The study sample was selected using simple random sampling and comprised 190 users who had used the Bangkok HIS. Data were collected from 165 returned questionnaires. The results found that overall user satisfaction was at a moderate level; user satisfaction with information quality and system quality was at a moderate level, while satisfaction with service quality was at a high level. The computer knowledge of the users was at a moderate level, and user attitude was at a positive level. The participation of the users was at a low level: participation in decision making and in evaluation was at a low level, while participation in implementation and in benefit was at a moderate level.
Keywords: information system success, hospital information system, user attitude, user satisfaction
Procedia PDF Downloads 321
2356 Analysis of Influencing Factors on Infield-Logistics: A Survey of Different Farm Types in Germany
Authors: Michael Mederle, Heinz Bernhardt
Abstract:
The management of machine fleets or autonomous vehicle control will considerably increase efficiency in future agricultural production. Especially entire process chains, e.g. harvesting complexes with several interacting combine harvesters, grain carts, and removal trucks, provide a lot of optimization potential. Organization and pre-planning ensure that these efficiency reserves become accessible. One way to achieve this is to optimize infield path planning. Autonomous machinery in particular requires precise specifications about infield logistics in order to be navigated effectively and process-optimized in the fields, individually or in machine complexes. In the past, a lot of theoretical optimization has been done regarding infield logistics, mainly based on field geometry. However, there are reasons why farmers often do not apply the infield strategy suggested by mathematical route planning tools. To make computational optimization more useful for farmers, this study focuses on these influencing factors through expert interviews. As a result, practice-oriented navigation, not only to the field but also within the field, will become possible. The survey study is intended to cover the entire range of German agriculture. Rural mixed farms with simple technical equipment are considered, as well as large agricultural cooperatives which farm thousands of hectares using track guidance and various other electronic assistance systems. First results show that farm managers using guidance systems increasingly attune their infield logistics to direction-giving obstacles such as power lines. In consequence, they can avoid inefficient boom flippings while applying plant protection with the sprayer. Livestock farmers rather focus on the application of organic manure, with its specific requirements concerning road conditions, landscape terrain or field access points. The cultivation of sugar beets makes great demands on infield patterns because of its particularities, such as the row crop system or high logistics demands. Furthermore, several machines working in the same field simultaneously influence each other, regardless of whether or not they are of the same type. Specific infield strategies are always based on the interaction of several different influences and decision criteria. Single working steps like tillage, seeding, plant protection or harvest mostly cannot each be considered individually; the entire production process has to be taken into consideration to determine the right infield logistics. One long-term objective of this examination is to integrate the obtained influences on infield strategies as decision criteria into an infield navigation tool. In this way, path planning will become more practical for farmers, which is a basic requirement for automatic vehicle control and increasing process efficiency.
Keywords: autonomous vehicle control, infield logistics, path planning, process optimizing
Procedia PDF Downloads 233
2355 Citation Analysis of New Zealand Court Decisions
Authors: Tobias Milz, L. Macpherson, Varvara Vetrova
Abstract:
The law is a fundamental pillar of human societies, as it shapes, controls and governs how humans conduct business, behave and interact with each other. Recent advances in computer-assisted technologies such as NLP, data science and AI are creating opportunities to support the practice, research and study of this pervasive domain. It is therefore not surprising that there has been an increase in investment in supporting technologies for the legal industry (also known as "legal tech" or "law tech") over the last decade. A sub-discipline of particular appeal is concerned with assisted legal research. Supporting law researchers and practitioners in retrieving information from the vast amount of ever-growing legal documentation is of natural interest to the legal research community. One tool that has been in use for this purpose since the early nineteenth century is legal citation indexing. Among other use cases, citation indexes provided an effective means to discover precedent cases. Nowadays, computer-assisted network analysis tools allow for new and more efficient ways to reveal the "hidden" information that is conveyed through citation behavior. Unfortunately, openly available legal data is still lacking in New Zealand, and access to such networks is only commercially available via providers such as LexisNexis. Consequently, there is a need to create, analyze and provide a legal citation network with sufficient data to support legal research tasks. This paper describes the development and analysis of a legal citation network for New Zealand containing over 300,000 decisions from 125 different courts across all areas of law and jurisdiction. Using Python, the authors assembled web crawlers, scrapers and an OCR pipeline to collect court decisions from openly available sources such as NZLII and convert them into uniform, machine-readable text. This facilitated the use of regular expressions to identify references to other court decisions within the decision text. The data was then imported into a graph-based database (Neo4j), with the courts and their respective cases represented as nodes and the extracted citations as links. Furthermore, additional links between the courts of connected cases were added to indicate indirect citations between the courts. Neo4j, as a graph-based database, allows efficient querying and the use of network algorithms such as PageRank to reveal the most influential/most cited courts and court decisions over time. This paper shows that the in-degree distribution of the New Zealand legal citation network resembles a power-law distribution, which indicates possible scale-free behavior of the network. This is in line with findings for the respective citation networks of the U.S. Supreme Court, Austria and Germany. The authors of this paper provide the database as an openly available data source to support further legal research. The decision texts can be exported from the database for NLP-related legal research, while the network can be used for in-depth analysis. For example, users of the database can restrict the network algorithms and metrics to specific courts in order to filter the results to an area of law of interest.
Keywords: case citation network, citation analysis, network analysis, Neo4j
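As a small-scale stand-in for the described network analysis, the sketch below builds a citation digraph from (citing, cited) pairs and ranks decisions with PageRank using networkx; the case identifiers are invented, and the production system would run the equivalent algorithm inside Neo4j.

```python
# Build a citation graph from extracted (citing, cited) pairs and rank
# decisions by PageRank; in-degree gives raw citation counts.
import networkx as nx

citations = [("NZSC 10/2015", "NZCA 55/2012"),
             ("NZSC 10/2015", "NZHC 201/2010"),
             ("NZCA 55/2012", "NZHC 201/2010"),
             ("NZHC 44/2018", "NZSC 10/2015"),
             ("NZHC 87/2019", "NZSC 10/2015")]

G = nx.DiGraph(citations)                            # edge: citing -> cited
scores = nx.pagerank(G, alpha=0.85)
for case, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{case}: {score:.3f}")

# Raw citation counts, whose distribution the paper reports as
# approximately power-law at full scale:
print(dict(G.in_degree()))
```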
Procedia PDF Downloads 107
2354 Microgrid Design Under Optimal Control With Batch Reinforcement Learning
Authors: Valentin Père, Mathieu Milhé, Fabien Baillon, Jean-Louis Dirion
Abstract:
Microgrids offer potential solutions to meet the need for local grid stability and to increase the autonomy of isolated networks through the integration of intermittent renewable energy production and storage facilities. In such a context, sizing production and storage for a given network is a complex task, highly dependent on input data such as the power load profile and renewable resource availability. This work aims at developing an operating-cost computation methodology for different microgrid designs, based on the use of deep reinforcement learning (RL) algorithms to tackle the optimal operation problem in stochastic environments. RL is a data-based sequential decision control method based on Markov decision processes that enables the consideration of random variables for control at a chosen time scale. Agents trained via RL constitute a promising class of Energy Management Systems (EMS) for the operation of microgrids with energy storage. Microgrid sizing (or design) is generally performed by minimizing investment costs and the operational costs arising from the EMS behavior. The latter might include economic aspects (power purchases, facility aging), social aspects (load curtailment), and ecological aspects (carbon emissions). Sizing variables are related to major constraints on the optimal operation of the network by the EMS. In this work, an islanded-mode microgrid is considered. Renewable generation is provided by photovoltaic panels; an electrochemical battery ensures short-term electricity storage. The controllable unit is a hydrogen tank that is used as a long-term storage unit. The proposed approach focuses on the transfer of agent learning for near-optimal operating-cost approximation with deep RL for each microgrid size. Like most data-based algorithms, the training step in RL requires significant computation time. The objective of this work is thus to study the potential of Batch-Constrained Q-learning (BCQ) for the optimal sizing of microgrids and, especially, to reduce the computation time of operating-cost estimation across several microgrid configurations. BCQ is an offline RL algorithm that is known to be data-efficient and can learn better policies than online RL algorithms trained on the same buffer. The general idea is to use the learned policies of agents trained in similar environments to constitute a buffer, which is then used to train BCQ, so that agent learning can be performed without updates during interaction sampling. A comparison between online RL and the presented method is performed based on the score per environment and on the computation time.
Keywords: batch-constrained reinforcement learning, control, design, optimal
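A tabular sketch of the batch-constrained idea behind BCQ, assuming an invented toy MDP and replay buffer: Q-values are learned from a fixed batch, and the bootstrapped maximum is restricted to actions the estimated behaviour policy makes sufficiently likely, which is the mechanism BCQ uses (with learned generative models rather than counts) to curb extrapolation error.

```python
# Batch-constrained Q-learning on a tiny tabular MDP with a fixed buffer.
import numpy as np

rng = np.random.default_rng(0)
nS, nA, gamma, tau = 4, 3, 0.95, 0.3

# Fixed buffer of (s, a, r, s') transitions gathered by some behaviour policy.
buffer = [(rng.integers(nS), rng.integers(nA), rng.normal(),
           rng.integers(nS)) for _ in range(2000)]

# Empirical behaviour model G(a|s) estimated from the batch (counts stand in
# for BCQ's generative model).
counts = np.zeros((nS, nA))
for s, a, _, _ in buffer:
    counts[s, a] += 1
G = counts / counts.sum(axis=1, keepdims=True)

Q = np.zeros((nS, nA))
for _ in range(50):                                  # fitted-Q style passes
    for s, a, r, s2 in buffer:
        # Constrain the max to actions with relative likelihood above tau.
        allowed = G[s2] / G[s2].max() >= tau
        target = r + gamma * Q[s2][allowed].max()
        Q[s, a] += 0.1 * (target - Q[s, a])

print("Greedy batch-constrained policy:", Q.argmax(axis=1))
```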
Procedia PDF Downloads 123
2353 Web-Based Paperless Campus: An Approach to Reduce the Cost and Complexity of Education Administration
Authors: Yekini N. Asafe, Haastrup A. Victor, Lawal N. Olawale, Okikiola F. Mercy
Abstract:
Recent increases in access to personal computers and networking systems have made it feasible to perform much of the cumbersome and costly paper-based administration in all organizations. Desktop computers, networking systems, high-capacity storage devices and telecommunications systems currently allow various formats of data to be transferred, processed, stored and disseminated for the purpose of decision making. Going paperless brings more benefits compared to a fully paper-based office. This paper proposes a model for the design and implementation of an e-administration system (paperless campus) for an institution of learning. If this model is designed and implemented, it will reduce the cost and complexity of educational administration and also eliminate the menaces and environmental hazards attributed to paper-based administration within schools and colleges.
Keywords: e-administration, educational administration, paperless campus, paper-based administration
Procedia PDF Downloads 380
2352 Effectuation of Interactive Advertising: An Empirical Study on Egyptian Tourism Advertising
Authors: Bassant Eyada, Hanan Atef Kamal Eldin
Abstract:
Advertising has witnessed a diffusion and development of technology to promote products and services, increasingly relying on interactivity between the consumer and the advertisement. Consumers seek, self-select, process, use and respond to the information provided, hence providing the potential to increase consumers' efficiency, involvement, trustworthiness, response, and satisfaction towards the advertised product or service. The power of interactive personalized messages shifts the focus of traditional advertising to more targeted consumers, sending out tailored messages for more specific individual needs and preferences, defining the importance and relevance that consumers attach to the advertisement, and therefore enhancing the ability to persuade and the quality of decision making. In this paper, the researchers discuss and explore innovative interactive advertising, its effectiveness on consumers and the benefits the advertisements provide, by designing an interactive ad to be placed at international airports promoting tourism in Egypt.
Keywords: advertising, effectiveness, interactivity, Egypt
Procedia PDF Downloads 316
2351 A Model Architecture Transformation with Approach by Modeling: From UML to Multidimensional Schemas of Data Warehouses
Authors: Ouzayr Rabhi, Ibtissam Arrassen
Abstract:
To provide a complete analysis of the organization and to help decision-making, leaders need relevant data; Data Warehouses (DW) are designed to meet such needs. However, designing a DW is not trivial, and there is no formal method to derive a multidimensional schema from heterogeneous databases. In this article, we present a model-driven approach to the design of data warehouses. We describe a multidimensional meta-model and specify a set of transformations starting from a Unified Modeling Language (UML) metamodel. In this approach, the UML metamodel and the multidimensional one are both considered platform-independent models (PIM). The first meta-model is mapped onto the second through transformation rules written in the Query/View/Transformation (QVT) language. This proposal is validated through the application of our approach to generating a multidimensional schema for a Balanced Scorecard (BSC) DW. We are interested in the BSC perspectives, which are highly linked to the vision and strategies of an organization.
Keywords: data warehouse, meta-model, model-driven architecture, transformation, UML
Procedia PDF Downloads 160
2350 A Multi-Agent Intelligent System for Monitoring Health Conditions of Elderly People
Authors: Ayman M. Mansour
Abstract:
In this paper, we propose a multi-agent intelligent system for monitoring the health conditions of elderly people. Monitoring the health condition of elderly people is a complex problem that involves different medical units and requires continuous monitoring. Such an expert system is highly needed in rural areas because of the inadequate number of available specialized physicians or nurses. Such monitoring must involve autonomous interactions between these medical units in order to be effective. A multi-agent system is formed by a community of agents that exchange information and proactively help one another to achieve the goal of elderly monitoring. The agents in the developed system are equipped with an intelligent decision maker that arms them with rule-based reasoning capability, which can assist physicians in making decisions regarding the medical condition of elderly people.
Keywords: fuzzy logic, inference system, monitoring system, multi-agent system
Procedia PDF Downloads 608
2349 Radical Web Text Classification Using a Composite-Based Approach
Authors: Kolade Olawande Owoeye, George R. S. Weir
Abstract:
The spread of terrorism and extremism activities on the internet has become a major threat to governments and national security due to their potential dangers, which has necessitated intelligence gathering via the web and real-time monitoring of potential websites for extremist activities. However, manual classification of such content is practically difficult and time-consuming. In response to this challenge, an automated classification system called the composite technique was developed. This is a computational framework that explores the combination of both semantic and syntactic features of the textual contents of a web page. We implemented the framework on a dataset of extremist webpages that had been subjected to a manual classification process. We then developed a classification model on the data using the J48 decision tree algorithm to generate a measure of how well each page can be classified into its appropriate class. The classification results obtained from our method, when compared with other state-of-the-art approaches, indicated a 96% success rate in classifying webpages when matched against the manual classification.
Keywords: extremist, web pages, classification, semantics, posit
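A compact analogue of the classification step, with scikit-learn's CART decision tree standing in for Weka's J48 (C4.5) and TF-IDF terms standing in for the syntactic features; the miniature labelled pages are invented.

```python
# Decision-tree text classification over web-page content.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

pages = ["join our cause fight the oppressors now",
         "weekly gardening tips for healthy roses",
         "martyrdom and armed struggle recruitment",
         "local football scores and match reports"]
labels = ["extremist", "benign", "extremist", "benign"]

clf = make_pipeline(TfidfVectorizer(),               # lexical/syntactic features
                    DecisionTreeClassifier(random_state=0))
clf.fit(pages, labels)
print(clf.predict(["sign up to fight for the struggle"]))

# The composite approach would concatenate semantic features (e.g. Posit or
# sentiment-style scores) with these lexical ones before training the tree.
```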
Procedia PDF Downloads 145
2348 On the Impracticality of Kierkegaard's Community of Authentic Individuals
Authors: Andrew Ka Pok Tam
Abstract:
Kierkegaard was for a long time misinterpreted as an anti-social philosopher, until recent years, when more discussions emerged on his concept of community in the Journals and Papers, inspired by Karl Bayer. A community, which is based upon individuals' relations to one another, is different from the crowd or the public, where the numerical majority makes decisions. As a result, authenticity is only possible in the community. However, Kierkegaard did not explain how, in reality, we can preserve an individual's authenticity by establishing a community instead of a public. Kierkegaard was against the democratic reform in 1848 Denmark because he thought all elections mean that the majority wins, and the authenticity of the single individual would be suppressed. However, Kierkegaard himself did not suggest an alternative political system that might preserve the authenticity of the individual. This paper aims to evaluate the possibility of establishing a Kierkegaardian community in practice so as to preserve every individual's authenticity. This paper argues that the practicality of a Kierkegaardian community is limited: in order to have effective communication and relations among individuals, a Kierkegaardian community must be small and inefficient, as every individual must remain authentic in all political decisions for the whole community.
Keywords: authenticity, community, individual, Kierkegaard
Procedia PDF Downloads 361