Search results for: decision to choose
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4510

3460 Breast Cancer Risk is Predicted Using Fuzzy Logic in MATLAB Environment

Authors: S. Valarmathi, P. B. Harathi, R. Sridhar, S. Balasubramanian

Abstract:

The use of machine learning tools in medical diagnosis is increasing due to the improved effectiveness of classification and recognition systems that help medical experts diagnose breast cancer. In this study, ID3 chooses the splitting attribute with the highest information gain, where gain is defined as the difference in entropy before and after the split. It is applied to age, location, taluk, stage, year, period, marital status, treatment, heredity, sex, and habitat against the classes Very Serious (VS), Very Serious Moderate (VSM), Serious (S), and Not Serious (NS) to calculate the information gain. A ranked histogram gives the gain of each field for the breast cancer data. Doctors use TNM staging, which decides the risk level of the breast cancer and serves as an important decision-making field in fuzzy logic for perception-based measurement. The spatial risk area (taluk) of breast cancer is calculated. The results clearly show that Coimbatore (North and South) was at higher risk of breast cancer than other areas at the 20% criterion. The weighted value of each taluk was compared with the criterion value and integrated with Map Objects to visualize the results. The ID3 algorithm shows the high breast cancer risk regions in the study area. The study has outlined, discussed, and applied soft computing techniques such as the ID3 algorithm for prognostic decision making on the seriousness of breast cancer.
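As a hedged illustration of the splitting criterion the abstract describes (not the authors' code), the snippet below computes ID3-style information gain on a toy data set with a made-up "stage" attribute and the VS/VSM/S/NS-style risk labels:

```python
# Hypothetical sketch: information gain as used by ID3, on invented toy data.
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(records, attribute, labels):
    """Entropy before the split minus the weighted entropy after
    splitting on `attribute` (gain, as ID3 defines it)."""
    before = entropy(labels)
    after = 0.0
    for v in set(r[attribute] for r in records):
        subset = [lab for r, lab in zip(records, labels) if r[attribute] == v]
        after += len(subset) / len(labels) * entropy(subset)
    return before - after

# Toy data: each record has a TNM-like "stage"; labels are risk classes.
records = [{"stage": "I"}, {"stage": "I"}, {"stage": "III"}, {"stage": "III"}]
labels = ["NS", "NS", "VS", "VS"]
print(round(information_gain(records, "stage", labels), 3))  # 1.0 (perfect split)
```

ID3 would pick the attribute with the largest such gain at each node; the ranked histogram the abstract mentions is simply these per-field gains sorted.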

Keywords: ID3 algorithm, breast cancer, fuzzy logic, MATLAB

Procedia PDF Downloads 509
3459 Detecting Music Enjoyment Level Using Electroencephalogram Signals and Machine Learning Techniques

Authors: Raymond Feng, Shadi Ghiasi

Abstract:

An electroencephalogram (EEG) is a non-invasive technique that records electrical activity in the brain using scalp electrodes. Researchers have studied the use of EEG to detect emotions and moods by collecting signals from participants and analyzing how those signals correlate with their activities. In this study, researchers investigated the relationship between EEG signals and music enjoyment. Participants listened to music while data was collected. During the signal-processing phase, power spectral densities (PSDs) were computed from the signals, and dominant brainwave frequencies were extracted from the PSDs to form a comprehensive feature matrix. A machine learning approach was then taken to find correlations between the processed data and the music enjoyment level indicated by the participants. To improve on previous research, multiple machine learning models were employed, including a K-Nearest Neighbors Classifier, a Support Vector Classifier, and a Decision Tree Classifier. Hyperparameter tuning was used to fine-tune each model and further increase its performance. The experiments showed that a strong correlation exists, with the Decision Tree Classifier with tuned hyperparameters yielding 85% accuracy. This study shows that EEG is a reliable means to detect music enjoyment and has future applications, including personalized music recommendation, mood adjustment, and mental health therapy.
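The dominant-frequency feature described above can be sketched as follows, on a synthetic signal rather than real EEG data (a naive DFT power spectrum stands in for the PSD estimator the authors used):

```python
# Illustrative sketch: extract the dominant frequency of a signal from its
# power spectrum, the kind of feature the abstract's feature matrix is built on.
import math

def power_spectrum(signal, fs):
    """Naive DFT power spectrum; returns (frequency, power) pairs up to Nyquist."""
    n = len(signal)
    spectrum = []
    for k in range(n // 2 + 1):
        re = sum(x * math.cos(2 * math.pi * k * t / n) for t, x in enumerate(signal))
        im = -sum(x * math.sin(2 * math.pi * k * t / n) for t, x in enumerate(signal))
        spectrum.append((k * fs / n, (re * re + im * im) / n))
    return spectrum

def dominant_frequency(signal, fs):
    """Frequency bin with the highest power (ignoring the DC component)."""
    spec = power_spectrum(signal, fs)
    return max(spec[1:], key=lambda fp: fp[1])[0]

# A 10 Hz "alpha-band" sine sampled at 128 Hz for one second.
fs = 128
signal = [math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]
print(dominant_frequency(signal, fs))  # 10.0
```

In practice a Welch-style PSD would replace the naive DFT, and one dominant frequency per electrode and band would be stacked into the feature matrix fed to the classifiers.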

Keywords: EEG, electroencephalogram, machine learning, mood, music enjoyment, physiological signals

Procedia PDF Downloads 47
3458 Wireless Sensor Network for Forest Fire Detection and Localization

Authors: Tarek Dandashi

Abstract:

WSNs may provide a fast and reliable solution for the early detection of environmental events like forest fires. This is crucial for alerting and calling for fire brigade intervention. Sensor nodes communicate sensor data to a host station, which enables a global analysis and the generation of a reliable decision on a potential fire and its location. A WSN with TinyOS and nesC for capturing and transmitting a variety of sensor information with controlled sources, data rates, and durations, and for recording and displaying activity traces, is presented. We propose a similarity distance (SD) between the distribution of currently sensed data and that of a reference. At any given time, a fire causes diverging readings in the reported data, which alters the usual data distribution. Basically, SD is a metric on the Cumulative Distribution Function (CDF). SD is designed to be invariant to day-to-day changes of temperature, changes due to the surrounding environment, and normal changes in weather, all of which preserve the data locality. Evaluation shows that SD sensitivity is quadratic in the increase of sensor node temperature for groups of sensors of different sizes and neighborhoods. Simulation of fire spreading, with ignition placed at random locations under some wind speed, shows that SD takes a few minutes to reliably detect fires and locate them. We also discuss false negatives and false positives and their impact on decision reliability.
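One plausible form of such a CDF-based metric is the maximum gap between two empirical CDFs (a Kolmogorov-Smirnov-style distance); the exact metric the authors use is not given in the abstract, so the sketch below is an assumption with invented readings:

```python
# Hedged sketch of a similarity distance (SD) as a metric on empirical CDFs.
def empirical_cdf(sample, x):
    """Fraction of observations in `sample` that are <= x."""
    return sum(1 for v in sample if v <= x) / len(sample)

def similarity_distance(reference, current):
    """Max CDF gap over all observed values; 0 means identical distributions."""
    points = sorted(set(reference) | set(current))
    return max(abs(empirical_cdf(reference, x) - empirical_cdf(current, x))
               for x in points)

reference = [20, 21, 21, 22, 22, 23]   # usual temperature readings (degrees C)
fire_like = [20, 28, 35, 41, 22, 55]   # diverging readings during a fire
print(similarity_distance(reference, reference))       # 0.0
print(similarity_distance(reference, fire_like) > 0.5) # True
```

A uniform day-to-day temperature shift moves both CDFs together, which is consistent with the invariance the abstract claims, while a fire stretches the tail of the current distribution and drives the gap up.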

Keywords: forest fire, WSN, wireless sensor network, algorithm

Procedia PDF Downloads 255
3457 Autonomic Sonar Sensor Fault Manager for Mobile Robots

Authors: Martin Doran, Roy Sterritt, George Wilkie

Abstract:

NASA, ESA, and NSSC space agencies have plans to put planetary rovers on Mars in 2020. For these future planetary rovers to succeed, they will heavily depend on sensors to detect obstacles. This will become even more important if rovers grow less dependent on commands received from earth-based control and more dependent on self-configuration and self-decision making. These planetary rovers will face harsh environments, and the possibility of hardware failure is high, as seen in past missions. In this paper, we focus on using Autonomic principles, where self-healing, self-optimization, and self-adaptation are explored using the MAPE-K model, expanding this model to encapsulate attributes such as Awareness, Analysis, and Adjustment (AAA-3). In the experimentation, a Pioneer P3-DX research robot is used to simulate a planetary rover. The sonar sensors on the P3-DX robot simulate the sensors on a planetary rover (even though, in reality, sonar sensors cannot operate in a vacuum). Experiments using the P3-DX robot focus on how our software system can adapt to the loss of sonar sensor functionality. The autonomic manager system is responsible for deciding how to make use of the remaining 'enabled' sonar sensors to compensate for those that are 'disabled'. The key result of this research is that the robot can still detect objects even with reduced sonar sensor capability.

Keywords: autonomic, self-adaptation, self-healing, self-optimization

Procedia PDF Downloads 343
3456 Development of an Index for Asset Class in Ex-Ante Portfolio Management

Authors: Miang Hong Ngerng, Noor Diyana Jasme, May Jin Theong

Abstract:

A volatile market environment is inevitable. Fund managers struggle to choose the right strategy to survive and overcome uncertainty and adverse market movements. Finding certainty in the midst of an uncertain future is therefore one of the key performance objectives for fund managers. Currently available theoretical results are not practical due to their strong reliance on the investment assumptions made. This paper identifies the components that can be forecasted in an ex-ante setting, which is the realistic situation facing a fund manager in the actual execution of asset allocation in portfolio management. The partial least squares method was used to generate an index from 10 years of accounting data for 191 companies listed on the KLSE. The result shows that the index reflects the inner nature of the business, and up to 30% of the stock return can be explained by the index.
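A "30% of the stock return can be explained" claim is usually framed as an R-squared; the sketch below, on made-up numbers rather than the paper's KLSE data, shows how that share of explained variance is computed for a single-factor index:

```python
# Illustrative sketch: R-squared of a simple OLS regression of returns on an index.
def r_squared(x, y):
    """R-squared of the ordinary least-squares fit y ~ a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    return (sxy * sxy) / (sxx * syy)

index_values = [1.0, 1.2, 0.9, 1.5, 1.1, 1.3]          # hypothetical index levels
stock_returns = [0.02, 0.05, -0.01, 0.08, 0.01, 0.06]  # hypothetical stock returns
print(round(r_squared(index_values, stock_returns), 2))
```

Partial least squares generalizes this idea: it builds the index (latent component) so as to maximize covariance with the returns, rather than taking the index as given.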

Keywords: active portfolio management, asset allocation, ex-ante investment, asset class, partial least squares

Procedia PDF Downloads 263
3455 Determination Optimum Strike Price of FX Option Call Spread with USD/IDR Volatility and Garman–Kohlhagen Model Analysis

Authors: Bangkit Adhi Nugraha, Bambang Suripto

Abstract:

In September 2016, Bank Indonesia (BI) released regulation no. 18/18/PBI/2016, which permits bank clients to use the FX option call spread on USD/IDR. Basically, this product combines buying an FX call option (paying a premium) with selling an FX call option (receiving a premium) to protect against currency depreciation while capping the potential upside, at a cheap premium cost. BI classifies this product as a structured product, that is, a combination of at least two financial instruments, either derivative or non-derivative. The call spread is the first structured product on IDR permitted by BI since 2009, in response to increased demand from Indonesian firms for FX hedging through derivatives to protect their foreign-currency assets or liabilities against market risk. The share of hedging products in the Indonesian FX market increased from 35% in 2015 to 40% in 2016, with the majority in swap products (FX forward, FX swap, cross-currency swap). A swap is priced from the interest rate differential of the two currencies in the pair. The cost of a swap is about 7% for USD/IDR, with one-year USD/IDR volatility at 13%. That cost level makes swap products seem expensive to hedging buyers. Because the cost of a call spread (around 1.5-3%) is cheaper than a swap, most Indonesian firms use NDF FX call spreads on USD/IDR offshore, with an outstanding amount of around 10 billion USD. The cheaper cost of the call spread is its main advantage for hedging buyers. The problem arises because the BI regulation requires the call spread buyer to hedge dynamically: if the buyer chooses strike price 1 and strike price 2 and the USD/IDR exchange rate surpasses strike price 2, the buyer must enter another call spread with strike price 1' (strike price 1' = strike price 2) and strike price 2' (strike price 2' > strike price 1'). This could double the premium cost of the call spread, or worse, and defeat the hedging buyer's purpose of finding the cheapest hedging cost. It is therefore crucial for the buyer to choose optimum strike prices before entering into the transaction. To help hedging buyers find the optimum strike prices and avoid expensive multiple premium costs, we observe ten years (2005-2015) of historical USD/IDR volatility data and compare them with the price movement of the USD/IDR call spread using the Garman-Kohlhagen model (the common formula for FX option pricing). We use statistical tools to analyze data correlations, understand the nature of call spread price movements over the ten years, and determine the factors affecting price movement. We select a range of strike prices and tenors and calculate the probability of dynamic hedging occurring and how much it costs. We found the USD/IDR currency pair to be too uncertain, making dynamic hedging riskier and more expensive. We validated this result on one year of data and found a small RMS error. The study result can be used to understand the nature of the FX call spread and to determine optimum strike prices for a hedging plan.
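The Garman-Kohlhagen valuation and the call spread premium the abstract works with can be sketched as below; the spot, strikes, rates, and volatility are illustrative stand-ins, not the paper's data:

```python
# Hedged sketch: Garman-Kohlhagen pricing of an FX call, and a call spread
# premium as the difference of two calls.
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def gk_call(spot, strike, t, vol, r_dom, r_for):
    """Garman-Kohlhagen value of an FX call (domestic currency per unit foreign)."""
    d1 = (log(spot / strike) + (r_dom - r_for + 0.5 * vol ** 2) * t) / (vol * sqrt(t))
    d2 = d1 - vol * sqrt(t)
    return spot * exp(-r_for * t) * norm_cdf(d1) - strike * exp(-r_dom * t) * norm_cdf(d2)

def call_spread_premium(spot, k1, k2, t, vol, r_dom, r_for):
    """Buy the K1 call, sell the K2 call (K1 < K2): net premium paid."""
    return gk_call(spot, k1, t, vol, r_dom, r_for) - gk_call(spot, k2, t, vol, r_dom, r_for)

# Illustrative USD/IDR-style inputs: spot 13000, strikes 13500/14500,
# one-year tenor, 13% volatility, 7% IDR rate, 1% USD rate.
premium = call_spread_premium(13000, 13500, 14500, 1.0, 0.13, 0.07, 0.01)
print(round(premium, 2))
```

The sold upper-strike call always has positive value, so the spread premium is strictly cheaper than the plain call, which is the cost advantage the abstract describes; the dynamic-hedging rule then risks paying this premium more than once.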

Keywords: FX call spread USD/IDR, USD/IDR volatility statistical analysis, Garman–Kohlhagen Model on FX Option USD/IDR, Bank Indonesia Regulation no.18/18/PBI/2016

Procedia PDF Downloads 371
3454 Solution of Insurance Pricing Model Giving Optimum Premium Level for Both Insured and Insurer by Game Theory

Authors: Betul Zehra Karagul

Abstract:

A game consists of the strategies each actor can choose from, together with rules that govern those choices and express how the actors evaluate their knowledge and the utility of the resulting outcomes. Game theory examines human behavior (preferences) in strategic situations in which each actor considers the actions that others will take in response to his own moves. In a game with a finite number of players, each choosing among strategies with certain probabilities, there is a balance from which no player can profitably deviate unilaterally; this is called a Nash equilibrium. Insurance is a two-person game in which the insurer and the insured are the actors. Both sides act according to their utility functions. The insured has to pay a premium to buy insurance cover; the insured wants to pay a low premium, while the insurer wants to receive a high premium. In this study, the equilibrium state of insurance pricing is examined in terms of the insurer and the insured using game theory.
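The equilibrium notion can be made concrete with a small best-response check; the payoffs below are invented for illustration and are not the paper's model:

```python
# Hedged sketch: pure-strategy Nash equilibria of a toy insurer-vs-insured game,
# found by checking that neither player gains by deviating unilaterally.
def pure_nash_equilibria(payoff_a, payoff_b):
    """Return cells (i, j) where row player A and column player B both best-respond."""
    rows, cols = len(payoff_a), len(payoff_a[0])
    equilibria = []
    for i in range(rows):
        for j in range(cols):
            a_best = all(payoff_a[i][j] >= payoff_a[k][j] for k in range(rows))
            b_best = all(payoff_b[i][j] >= payoff_b[i][l] for l in range(cols))
            if a_best and b_best:
                equilibria.append((i, j))
    return equilibria

# Rows: insurer offers (low premium, high premium); columns: insured (buy, decline).
insurer = [[3, 0],    # low premium: modest profit if bought, nothing if declined
           [1, -1]]   # high premium: big margin if bought, admin loss if declined
insured = [[2, 1],    # cheap cover beats going uninsured
           [0, 1]]    # expensive cover is worse than declining
print(pure_nash_equilibria(insurer, insured))  # [(0, 0)]
```

In this toy game the unique equilibrium is (low premium, buy): the insurer cannot raise the premium without losing the sale, and the insured cannot do better than buying, which mirrors the premium-level trade-off the abstract studies.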

Keywords: game theory, insurance pricing, Nash equilibrium, utility function

Procedia PDF Downloads 346
3453 Probing Environmental Sustainability via Brownfield Remediation: A Framework to Manage Brownfields in Ethiopia Lesson to Africa

Authors: Mikiale Gebreslase Gebremariam, Chai Huaqi, Tesfay Gebretsdkan Gebremichael, Dawit Nega Bekele

Abstract:

In recent years, brownfield redevelopment projects (BRPs) have contributed to the overarching paradigm of the United Nations 2030 agenda. At present, most developed nations have adopted BRPs as an efficacious urban policy tool. However, in developing and some advanced countries, BRPs are lacking due to limited awareness, policy tools, and financial capacity for cleaning up brownfield sites. For example, the growth and development of Ethiopian cities were achieved at the cost of poor urban planning, including no community consultation and excessive urbanization for future growth. The demand for land resources is increasingly urgent as a result of migration to major cities and towns for socio-economic reasons and of population growth. In the past, the development mode of major cities has been horizontal urbanization stretching outwards in search of more land resources; while the outer cities grow, the inner cities suffer from environmental pollution. It is noteworthy that the rapid development of cities has not brought about an increase in people's happiness index. Thus, the proposed management framework for managing brownfields in Ethiopia, as a lesson to developing nations facing similar challenges and growth, will add immense value in solving these problems and give insights into brownfield land utilization. Under the umbrella of the grey incidence decision-making model, and in consideration of multiple stakeholders and tight environmental and economic constraints, the proposed management framework integrates criteria from economic, social, environmental, technical, and risk aspects into the grey incidence decision-making model and gives useful guidance for managing brownfields in Ethiopia. Furthermore, it will contribute to the future development of the social economy and the missions of the 2030 UN Sustainable Development Goals.
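Grey incidence (grey relational) analysis, the decision model the framework builds on, can be sketched as ranking alternatives by their grey relational grade against an ideal alternative. The criteria scores and sites below are hypothetical, and the standard distinguishing coefficient rho = 0.5 is assumed:

```python
# Hedged sketch of grey relational grades for candidate brownfield sites.
def grey_relational_grades(matrix, rho=0.5):
    """Rows = alternatives, columns = benefit criteria scaled to [0, 1].
    Grade = mean grey relational coefficient against the ideal row of all 1s."""
    deltas = [[abs(1.0 - v) for v in row] for row in matrix]
    dmin = min(min(row) for row in deltas)
    dmax = max(max(row) for row in deltas)
    grades = []
    for row in deltas:
        coeffs = [(dmin + rho * dmax) / (d + rho * dmax) for d in row]
        grades.append(sum(coeffs) / len(coeffs))
    return grades

# Three candidate sites scored on economic, social, and environmental criteria.
sites = [[0.9, 0.8, 0.7],
         [0.4, 0.5, 0.6],
         [0.2, 0.9, 0.3]]
grades = grey_relational_grades(sites)
best = max(range(len(sites)), key=lambda i: grades[i])
print(best)  # 0 (site 0 ranks highest)
```

In the framework described above, the criteria columns would come from the economic, social, environmental, technical, and risk aspects, weighted per stakeholder before the grades are compared.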

Keywords: brownfields, environmental sustainability, Ethiopia, grey-incidence decision-making, sustainable urban development

Procedia PDF Downloads 80
3452 Biodiesel Production from Canola Oil Using Trans-Esterification Process with Koh as a Catalyst

Authors: M. Nafis Alfarizi, Dinda A. Utami, Arif Hidayat

Abstract:

Biodiesel is one solution to reduce the use of petroleum fuels. Many alternative feedstocks can be used, among them canola oil. The purpose of this study was to determine the suitability of canola oil and KOH for the trans-esterification reaction in biodiesel production. Canola oil has a very high purity, so it can be used as an alternative feedstock for biodiesel production, and biodiesel of excellent quality is expected. We used the trans-esterification process, in which the triglyceride is reacted with an alcohol over KOH as a catalyst to produce biodiesel, with glycerol as a byproduct; we chose trans-esterification because canola oil has an FFA content of only 0.445%. The variables studied in this research include the ratio of canola oil to methanol, temperature, time, and the catalyst percentage used. For analysis, we used GC-MS and FTIR to characterize the canola oil. The development of canola oil seems to be a promising route to high-quality biodiesel. The reaction conditions yielded 97.87 wt% methyl ester (biodiesel) product using a 0.5 wt% KOH catalyst with a canola-to-methanol ratio of 1:8 at 60°C.

Keywords: biodiesel, canola oil, KOH, trans-esterification

Procedia PDF Downloads 241
3451 Integrating of Multi-Criteria Decision Making and Spatial Data Warehouse in Geographic Information System

Authors: Zohra Mekranfar, Ahmed Saidi, Abdellah Mebrek

Abstract:

This work aims to develop multi-criteria decision making (MCDM) and spatial data warehouse (SDW) methods, which will be integrated into a GIS according to a 'GIS dominant' approach. The GIS operating tools will be made operational to operate the SDW. MCDM methods can provide solutions to sets of problems with various and multiple criteria. When the problem is complex and integrates a spatial dimension, it makes sense to combine the MCDM process with other approaches such as data mining and ascending analyses. We present in this paper an experiment showing a geo-decisional methodology of SDW construction. On-line analytical processing (OLAP) technology, which combines basic multidimensional analysis with the concepts of data mining, provides powerful tools to highlight inductions and information not obvious with traditional tools. However, these OLAP tools become more complex in the presence of the spatial dimension. The integration of OLAP with a GIS is the future geographic and spatial information solution. GIS offers advanced functions for the acquisition, storage, analysis, and display of geographic information. However, its effectiveness for complex spatial analysis is questionable due to its determinism and decisional rigor. A prerequisite for the implementation of any analysis or exploration of spatial data is the construction and structuring of a spatial data warehouse (SDW). This SDW must be easily usable by the GIS and by the tools offered by an OLAP system.

Keywords: data warehouse, GIS, MCDM, SOLAP

Procedia PDF Downloads 166
3450 Application of Supervised Deep Learning-based Machine Learning to Manage Smart Homes

Authors: Ahmed Al-Adaileh

Abstract:

Renewable energy sources, domestic storage systems, controllable loads, and machine learning technologies will be key components of future smart home management systems. An energy management scheme is presented that uses a deep learning (DL) approach to support smart home management systems consisting of a standalone photovoltaic system, a storage unit, a heating, ventilation, and air-conditioning system, and a set of conventional and smart appliances. The objective of the proposed scheme is to apply DL-based machine learning to predict various running parameters within a smart home's environment, in order to achieve maximum comfort levels for occupants, reduced electricity bills, and less dependency on the public grid. The problem is formulated with reinforcement learning, where decisions are taken by applying a continuous-time Markov decision process. The main contribution of this research is the proposed framework, which applies DL to enhance the system's supervised dataset and thereby effectively support smart home systems. A case study involving a set of conventional and smart appliances with dedicated processing units in an inhabited building demonstrates the validity of the proposed framework, and a visualization graph shows 'before' and 'after' results.

Keywords: smart homes systems, machine learning, deep learning, Markov Decision Process

Procedia PDF Downloads 184
3449 Potentials of Additive Manufacturing: An Approach to Increase the Flexibility of Production Systems

Authors: A. Luft, S. Bremen, N. Balc

Abstract:

The task of flexibility planning and design, just like factory planning, is to create the long-term systemic framework that constitutes the restriction for short-term operational management. This is a strategic challenge since, due to the ill-structured character of the underlying flexibility decision problem, multiple types of flexibility need to be considered over the course of various scenarios, production programs, and production system configurations. In this context, an evaluation model has been developed that integrates both conventional and additive resources at a basic task level and allows the quantification of flexibility enhancement in terms of mix and volume flexibility, complexity reduction, and machine capacity. The model helps companies decide, early in the decision-making process, on the potential gains of implementing additive manufacturing technologies at a strategic level. For companies, it is essential to consider both additive and conventional manufacturing beyond pure unit costs. It is necessary to achieve an integrative view of manufacturing that incorporates both additive and conventional manufacturing resources and quantifies their potential with regard to flexibility and manufacturing complexity. This also requires a structured process for strategic production system design that spans the design of various scenarios and allows for multi-dimensional and comparative analysis. A corresponding guideline for the planning of additive resources at a strategic level is laid out in this paper.

Keywords: additive manufacturing, production system design, flexibility enhancement, strategic guideline

Procedia PDF Downloads 115
3448 How to Perform Proper Indexing?

Authors: Watheq Mansour, Waleed Bin Owais, Mohammad Basheer Kotit, Khaled Khan

Abstract:

Efficient query processing is one of the utmost requisites in any business environment to satisfy consumer needs. This paper investigates the various types of indexing models, viz. primary, secondary, and multi-level. The investigation is done under the ambit of the various types of queries for which each indexing model performs with efficacy. This study also discusses the inherent advantages and disadvantages of each indexing model and how an indexing model can be chosen for a particular environment. The paper also draws parallels between the various indexing models and provides recommendations that would help a database administrator zero in on an indexing model suited to the needs and requirements of the production environment. In addition, to satisfy industry and consumer needs arising from today's colossal data generation, this study proposes two novel indexing techniques that can be used to index highly unstructured and structured Big Data with efficacy. The study also briefly discusses some best practices that the industry should follow in order to choose an indexing model apposite to its prerequisites and requirements.
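The core trade-off between index types can be sketched in a few lines; the table data and column semantics below are made up, with a dict standing in for a hash index and a sorted list standing in for a B-tree:

```python
# Illustrative sketch: a hash index answers exact-match lookups in O(1) expected
# time, while a sorted (B-tree-like) index also supports range queries.
import bisect

rows = [(3, "shirt"), (1, "sock"), (7, "hat"), (5, "scarf"), (2, "glove")]

# Hash index on the key column: exact-match only.
hash_index = {key: value for key, value in rows}

# Sorted index (stand-in for a B-tree): supports range scans via binary search.
sorted_keys = sorted(key for key, _ in rows)

def range_scan(lo, hi):
    """Keys k with lo <= k <= hi, found in O(log n + matches)."""
    left = bisect.bisect_left(sorted_keys, lo)
    right = bisect.bisect_right(sorted_keys, hi)
    return sorted_keys[left:right]

print(hash_index[5])     # scarf
print(range_scan(2, 5))  # [2, 3, 5]
```

This is the kind of query-shape argument the paper makes: a workload dominated by point lookups favors hashing, while range-heavy workloads favor tree-structured (primary or multi-level) indexes.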

Keywords: indexing, hashing, latent semantic indexing, B-tree

Procedia PDF Downloads 152
3447 The Correlation between Territory Planning and Logistics Development: Methodological Approach

Authors: Ebtissem Sassi, Abdellatif Benabdelhafid, Sami Hammami

Abstract:

Congestion, pollution, and space misuse are the major risks in the hinterland. Managing these risks is a major issue for all the actors involved in territory management. Good control of these risks rests on the consideration of environmental and physical constraints, since implementing a policy requires the simultaneous, efficient use of territorial resources and of financial resources, which are becoming increasingly scarce. Yet this balance can be difficult for all the actors to establish simultaneously; indeed, every actor tends to favor some of these objectives to the detriment of others. In this framework, we have set the objective of designing and building a model that centralizes multidisciplinary data and serves as both an analysis tool and a decision support tool. In this article, we elaborate some methodological axes allowing good management of the territory system through (i) determination of the structural factors of the decision support system and (ii) integration of methods and tools favoring the territorial decision process. A logistics territory geographic information system is a model dealing with this issue. The objective of this model is to facilitate exchanges between the actors around a common question that has been the research subject of the human sciences (geography, economics) and the natural sciences (ecology), as well as to find an optimal solution responding simultaneously to all these objectives.

Keywords: complexity, territory, logistics, territory planning, conceptual model, GIS, MCA

Procedia PDF Downloads 123
3446 In the Spirit of Open Educational Resources: Library Resources and Fashion Merchandising

Authors: Lizhu Y. Davis, Gretchen Higginbottom, Vang Vang

Abstract:

This presentation explores the adoption of library resources to engage students in a Visual Merchandising course during the 2016 spring semester. This study was a cross-disciplinary collaboration between the Fashion Merchandising Program and the Madden Library at California State University, Fresno. The goal of the project was to explore and assess the students’ use of library resources as a part of the Affordable Learning Solutions Initiative, a California State University (CSU) Office of the Chancellor Program that enables faculty to choose and provide high-quality, free or low-cost educational materials for their students. Students were interviewed afterwards and the results were generally favorable and provided insight into how students perceive and use library resources to support their research needs. This study reveals an important step in examining how open educational resources impact student learning.

Keywords: collaboration, library resources, open educational resources, visual merchandising

Procedia PDF Downloads 308
3445 Federalism, Dual Sovereignty, and the Supreme Court of Nigeria

Authors: Edoba Bright Omoregie

Abstract:

Nigeria became a federation in 1954, six years before it gained independence from British colonial rule. The country has remained a federation since then, despite the challenging circumstances of military rule and civil strife which have tested its federal credentials. Since 1961, when it first decided a federalism dispute, cases over vertical and horizontal division of powers have inundated the country's Supreme Court. In its current practice of federalism since democratic rule resumed in 1999, the country has witnessed a spell of intergovernmental disputes over a good number of federalism issues. Such conflicts have eventually found their way to the Supreme Court for resolution, not as a final appellate court (which it is in other, non-federal matters) but as a court of first and final instance, following the constitutional provision granting the court such power. However, in April 2014, one such dispute was denied a hearing when the court declined original jurisdiction to determine the matter. The suit was instituted by one state of the federation against the federal government and the other 35 states, challenging the collection of value added tax (a consumption tax) on certain goods and services within the state. The paper appraises the rationale of the court's decision and reasons that the decision to decline jurisdiction results from an avoidable misunderstanding of the dual sovereignty instituted by the federal system of Nigeria, as well as a misconception of the role the court is constitutionally assigned to play in resolving intergovernmental schisms in the federal system.

Keywords: dual sovereignty, federalism, intergovernmental conflict, Supreme Court

Procedia PDF Downloads 547
3444 Multi-Scale Green Infrastructure: An Integrated Literature Review

Authors: Panpan Feng

Abstract:

The concept of green infrastructure originated in Europe and the United States. It aims to ensure the smart growth of urban and rural ecosystems and to achieve sustainable ecological, social, and economic development in urban and rural areas by combining green infrastructure with the gray infrastructure of traditional planning. Based on a literature review of the theoretical origin, value connotation, and measurement methods of green infrastructure, this study summarizes the research content of green infrastructure across the three spatial levels of region, city, and block, and divides it into a functional dimension, a spatial dimension, and a strategic dimension. The results show that, in the functional dimension, from region to city to block, research on green infrastructure gradually shifts from ecological functions to social functions. In the spatial dimension, from region to city to block, research on the spatial form of green infrastructure has shifted from two-dimensional to three-dimensional, and the spatial structure of green infrastructure has shifted from single ecological elements to multiple composite elements. From a strategic perspective, green infrastructure research serves mostly as a spatial planning tool grounded in land management, environmental livability, and ecological psychology, providing a measure of decision-making support.

Keywords: green infrastructure, multi-scale, social and ecological functions, spatial strategic decision-making tools

Procedia PDF Downloads 47
3443 The Significance of Awareness about Gender Diversity for the Future of Work: A Multi-Method Study of Organizational Structures and Policies Considering Trans and Gender Diversity

Authors: Robin C. Ladwig

Abstract:

The future of work is becoming less predictable, which requires organizations to be increasingly adaptable to social and work changes. Society is transforming with regard to gender identity, in the sense that more people are coming forward to identify as trans and gender diverse (TGD). Organizations are ill-equipped to provide a safe and encouraging work environment because they lack inclusive organizational structures. This qualitative multi-method research on TGD inclusivity in the workplace explores the enablers of, and barriers to, TGD individuals engaging satisfactorily in the work environment and organizational culture. Furthermore, these TGD insights are analyzed for their organizational implications and for awareness from a leadership and management perspective. The semi-structured online interviews with TGD individuals and the photo-elicitation open-ended questionnaire addressed to leadership and management in diversity, career development, and human resources were analyzed with a critical grounded theory approach. The findings demonstrate the significance of TGD voices, the support of leadership and management, and the synergy between voices and leadership. Hence, practical implications are indicated, such as revising exclusionary language used in policies, data collection, or communication, and reconsidering organizational decision-making by leaders to include TGD voices.

Keywords: future of work, occupational identity, organisational decision-making, trans and gender diverse identity

Procedia PDF Downloads 118
3442 Fraud Detection in Credit Cards with Machine Learning

Authors: Anjali Chouksey, Riya Nimje, Jahanvi Saraf

Abstract:

Online transactions have increased dramatically in this new 'social-distancing' era, and with them, fraud in online payments has also increased significantly. Fraud is a significant problem in various industries, such as insurance and banking. These frauds include leaking sensitive credit card information, which can easily be misused. With governments also pushing online transactions, e-commerce is booming, but due to increasing fraud in online payments, e-commerce businesses are suffering a great loss of trust from their customers and find credit card fraud to be a big problem. People have started using online payment options and have thus become easy targets of credit card fraud. In this research paper, we discuss machine learning algorithms: we applied a decision tree, XGBoost, k-nearest neighbours, logistic regression, random forest, and SVM to a dataset of online credit card transactions. We test all these algorithms for detecting fraud cases using the confusion matrix and F1 score, and we calculate the accuracy score for each model to identify which algorithm is best suited to detecting fraud.
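The evaluation metrics named above can be computed from scratch as below; the true/predicted labels are toy values, not the paper's dataset:

```python
# Hedged sketch: confusion matrix, accuracy, and F1 for binary fraud detection
# (1 = fraud, 0 = legitimate), on invented labels.
def confusion_matrix(y_true, y_pred):
    """Returns (tp, fp, fn, tn)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp, fp, fn, tn

def scores(y_true, y_pred):
    """Accuracy and F1 derived from the confusion matrix."""
    tp, fp, fn, tn = confusion_matrix(y_true, y_pred)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, f1

y_true = [1, 0, 0, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 1, 0, 0, 0, 0]
accuracy, f1 = scores(y_true, y_pred)
print(accuracy, round(f1, 2))  # 0.75 0.67
```

Because fraud datasets are heavily imbalanced, the F1 score (which ignores the abundant true negatives) is usually a better model-selection criterion here than raw accuracy, which is why the abstract reports both.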

Keywords: machine learning, fraud detection, artificial intelligence, decision tree, k nearest neighbour, random forest, XGBOOST, logistic regression, support vector machine

Procedia PDF Downloads 138
3441 Generation of Knowledge with Self-Learning Methods for Ophthalmic Data

Authors: Klaus Peter Scherer, Daniel Knöll, Constantin Rieder

Abstract:

Problem and Purpose: Intelligent systems are available and helpful for supporting human decision-making, especially when complex surgical eye interventions are necessary and must be performed. Normally, such a decision support system consists of a knowledge-based module, which is responsible for the real assistance power, provided by explanation and logical reasoning processes. The interview-based acquisition and generation of the complex knowledge itself is crucial, because there are many correlations between the complex parameters. In this project, (semi-)automated self-learning methods are therefore researched and developed to enhance the quality of such a decision support system. Methods: For ophthalmic data sets of real hospital patients, advanced data mining procedures are very helpful. In particular, subgroup analysis methods are developed, extended, and used to analyze and discover the correlations and conditional dependencies between the structured patient data. After causal dependencies are found, a ranking must be performed to generate rule-based representations. For this, anonymized patient data are transformed into a special machine-readable format. The imported data serve as input for conditional probability algorithms that calculate the parameter distributions with respect to a given goal parameter. Results: In the field of knowledge discovery, advanced methods and applications could be used to produce operation- and patient-related correlations. New knowledge was generated by finding causal relations between the operational equipment, the medical instances, and patient-specific history through a dependency ranking process. After transformation into association rules, logic-based representations were available for the clinical experts to evaluate the new knowledge. The structured data sets take account of about 80 parameters as characteristic features per patient.
For patient groups of different sizes (100, 300, 500), both single-target and multi-target values were set for the subgroup analysis, so the newly generated hypotheses could be interpreted with regard to their dependence on, or independence of, patient number. Conclusions: The aim and advantage of such a semi-automatic self-learning process are the extension of the knowledge base by finding new parameter correlations. The discovered knowledge is transformed into association rules and serves as the rule-based representation of the knowledge in the knowledge base. Moreover, more than one goal parameter of interest can be considered by the semi-automated learning process. With ranking procedures, the strongest premises and also conjunctively associated conditions can be found for concluding the goal parameter of interest. In this way, knowledge hidden in structured tables or lists can be extracted as a rule-based representation. This is a real assistance power for communication with the clinical experts.
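As a rough illustration of the conditional-probability ranking described above (the field names and records below are invented, not actual ophthalmic data), each candidate premise, a (field, value) pair, can be scored by the conditional probability of the goal parameter within its subgroup, normalized by the base rate (its lift):

```python
# Hypothetical, anonymised patient records; field names are illustrative only.
records = [
    {"instrument": "A", "history": "diabetes", "complication": True},
    {"instrument": "A", "history": "none",     "complication": False},
    {"instrument": "B", "history": "diabetes", "complication": True},
    {"instrument": "B", "history": "none",     "complication": False},
    {"instrument": "A", "history": "diabetes", "complication": True},
    {"instrument": "B", "history": "none",     "complication": True},
]

def rank_conditions(records, goal_field, goal_value):
    """Rank (field, value) premises by lift: P(goal | premise) / P(goal)."""
    base = sum(r[goal_field] == goal_value for r in records) / len(records)
    ranking = []
    for field in records[0]:
        if field == goal_field:
            continue
        for value in {r[field] for r in records}:
            subgroup = [r for r in records if r[field] == value]
            confidence = sum(r[goal_field] == goal_value for r in subgroup) / len(subgroup)
            ranking.append((confidence / base, field, value))
    return sorted(ranking, reverse=True)

top = rank_conditions(records, "complication", True)[0]
print(top)  # strongest premise for the goal parameter
```

The top-ranked premise then becomes the antecedent of an association rule such as "history = diabetes → complication", which is the kind of rule-based representation handed to the clinical experts.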

Keywords: expert system, knowledge-based support, ophthalmic decision support, self-learning methods

Procedia PDF Downloads 248
3440 Optimizing Design Works in Construction Consultant Company: A Knowledge-Based Application

Authors: Phan Nghiem Vu, Le Tuan Vu, Ta Quang Tai

Abstract:

The optimal construction design used during the execution of a construction project is a key factor in achieving high productivity and customer satisfaction; however, this management process is sometimes carried out without the care and systematic method it deserves, with negative consequences. This study proposes a knowledge management (KM) approach that enables the intelligent use of experienced and acknowledged engineers to improve the management of construction design works for a project. A knowledge-based application to support this decision-making process is then proposed and described. To define and design the system, semi-structured interviews were conducted within five construction consulting organizations with the purpose of studying how the optimization process is implemented in practice and the knowledge that supports it. A system for optimizing construction design works (OCDW) based on knowledge was developed and then validated with construction experts. The OCDW was regarded as a valuable tool for optimizing construction design works, supporting organizations in building a corporate memory on this issue and reducing both the reliance on individual knowledge and the subjectivity of the decision-making process. The benefits provided by the performance support system include reduced costs and time, improved product design quality, satisfied customer requirements, and a stronger organizational brand.

Keywords: optimizing construction design work, construction consultant organization, knowledge management, knowledge-based application

Procedia PDF Downloads 120
3439 Simulation-based Decision Making on Intra-hospital Patient Referral in a Collaborative Medical Alliance

Authors: Yuguang Gao, Mingtao Deng

Abstract:

The integration of independently operating hospitals into a unified healthcare service system has become a strategic imperative in the pursuit of hospitals’ high-quality development. Central to the concept of group governance over such transformation, exemplified by a collaborative medical alliance, is the delineation of shared value, vision, and goals. Given the inherent disparity in capabilities among hospitals within the alliance, particularly in the treatment of different diseases characterized by Disease Related Groups (DRG) in terms of effectiveness, efficiency and resource utilization, this study aims to address the centralized decision-making of intra-hospital patient referral within the medical alliance to enhance the overall production and quality of service provided. We first introduce the notion of production utility, where a higher production utility for a hospital implies better performance in treating patients diagnosed with that specific DRG group of diseases. Then, a Discrete-Event Simulation (DES) framework is established for patient referral among hospitals, where patient flow modeling incorporates a queueing system with fixed capacities for each hospital. The simulation study begins with a two-member alliance. The pivotal strategy examined is a "whether-to-refer" decision triggered when the bed usage rate surpasses a predefined threshold for either hospital. Then, the decision encompasses referring patients to the other hospital based on DRG groups’ production utility differentials as well as bed availability. The objective is to maximize the total production utility of the alliance while minimizing patients’ average length of stay and turnover rate. Thus the parameter under scrutiny is the bed usage rate threshold, influencing the efficacy of the referral strategy. 
Extending the study to a three-member alliance, which could readily be generalized to multi-member alliances, we maintain the core setup while introducing an additional “which-to-refer" decision that involves referring patients with specific DRG groups to the member hospital according to their respective production utility rankings. The overarching goal remains consistent, for which the bed usage rate threshold is once again a focal point for analysis. For the two-member alliance scenario, our simulation results indicate that the optimal bed usage rate threshold hinges on the discrepancy in the number of beds between member hospitals, the distribution of DRG groups among incoming patients, and variations in production utilities across hospitals. Transitioning to the three-member alliance, we observe similar dependencies on these parameters. Additionally, it becomes evident that an imbalanced distribution of DRG diagnoses and further disparity in production utilities among member hospitals may lead to an increase in the turnover rate. In general, it was found that the intra-hospital referral mechanism enhances the overall production utility of the medical alliance compared to individual hospitals without partnership. Patients’ average length of stay is also reduced, showcasing the positive impact of the collaborative approach. However, the turnover rate exhibits variability based on parameter setups, particularly when patients are redirected within the alliance. In conclusion, the re-structuring of diagnostic disease groups within the medical alliance proves instrumental in improving overall healthcare service outcomes, providing a compelling rationale for the government's promotion of patient referrals within collaborative medical alliances.
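A minimal sketch of the "whether-to-refer" rule for a two-member alliance can convey the mechanism, though not the authors' full discrete-event model: all bed counts, utilities, arrival and discharge rates, and the threshold below are illustrative assumptions, not parameters from the study.

```python
import random

random.seed(1)
beds = {"H1": 20, "H2": 10}          # fixed bed capacity per hospital
occupied = {"H1": 0, "H2": 0}
utility = {"H1": 1.0, "H2": 0.7}     # production utility for one DRG group
THRESHOLD = 0.8                       # bed-usage rate that triggers referral
total_utility = 0.0
referred = 0

for day in range(365):
    # discharges: each occupied bed empties with probability 0.2 per day
    for h in occupied:
        occupied[h] -= sum(random.random() < 0.2 for _ in range(occupied[h]))
    # arrivals at H1 only, for simplicity
    for _ in range(random.randint(2, 6)):
        target = "H1"
        if occupied["H1"] / beds["H1"] > THRESHOLD and occupied["H2"] < beds["H2"]:
            target = "H2"            # refer: H1 is saturated and H2 has a free bed
            referred += 1
        if occupied[target] < beds[target]:
            occupied[target] += 1
            total_utility += utility[target]
        # else: patient lost when both the target and the alternative are full

print(referred, round(total_utility, 1))
```

Sweeping THRESHOLD over a grid and re-running this loop is the kind of analysis the paper performs: the alliance-wide utility gained by the referral rule trades off against the lower per-patient utility of the receiving hospital.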

Keywords: collaborative medical alliance, disease related group, patient referral, simulation

Procedia PDF Downloads 42
3438 Debate between Breast Milk and Formula Milk in Nutritional Value

Authors: Nora Alkharji, Wafa Fallatah

Abstract:

Introduction: One of the major issues to consider when deciding what to feed a baby is the quality of the food itself. Whilst commercially prepared infant formulas are a nutritious alternative to breast milk and even contain some vitamins and nutrients, most major medical organizations consider breastfeeding the best nutritional option for babies. Choosing whether to breastfeed or formula feed a baby is one of the first decisions expectant parents will make. The American Academy of Pediatrics (AAP) agrees with other organizations, such as the American Medical Association (AMA), the American Dietetic Association (ADA), and the World Health Organization (WHO), in recommending breastfeeding as the best nutrition for babies and the best suited to a baby's digestive system. In addition, breastfeeding helps combat infections, prevent allergies, and protect against various chronic conditions. The decision to breastfeed or formula feed one's baby is a very personal one; however, certain points need to be clarified regarding the nutritional value of breastfeeding versus formula feeding to allow for informed decision-making. Methodology: A formal debate on whether breastfeeding or formula feeding is the better choice, between two lactation consultants, with arguments based on evidence-based medicine; duration of the debate: 45 minutes. Result: Clarification and heightened awareness of the benefits of breastfeeding. Conclusion: This debate will make the choice between breastfeeding and formula feeding a relatively easy one for both health workers and parents.

Keywords: breastmilk, formula milk, nutritional, comparison

Procedia PDF Downloads 458
3437 Core Number Optimization Based Scheduler to Order/Map Simulink Applications

Authors: Asma Rebaya, Imen Amari, Kaouther Gasmi, Salem Hasnaoui

Abstract:

In recent years, the number of cores in digital signal and general-purpose processors has increased spectacularly. Concurrently, significant research has been carried out to benefit from this high degree of parallelism. Indeed, this research focuses on providing efficient scheduling of hardware/software systems onto multicore architectures. The scheduling process consists of statically choosing one core to execute each task and specifying an execution order for the application's tasks. In this paper, we describe an efficient scheduler that calculates the optimal number of cores required to schedule an application, gives a heuristic scheduling solution, and evaluates its cost. Our results are evaluated and compared with those of the Preesm scheduler, and we show that ours allows better scheduling in terms of latency, computation time, and number of cores.

Keywords: computation time, hardware/software system, latency, optimization, multi-cores platform, scheduling

Procedia PDF Downloads 273
3436 Study of Cavitation Erosion of Pump-Storage Hydro Power Plant Prototype

Authors: Tine Cencič, Marko Hočevar, Brane Širok

Abstract:

An experimental investigation was made to detect cavitation in a pump-storage hydro power plant prototype suffering from cavitation in pump mode. Vibrations and acoustic emission on the housing of the turbine bearing and pressure fluctuations in the draft tube were measured, and the corresponding signals were recorded and analyzed. The analysis was based on the high-frequency content of the measured variables. The pump-storage hydro power plant prototype was operated at various input loads and Thoma numbers. Several cavitation estimators were evaluated according to the coefficient of determination between the Thoma number and each estimator; the best results were achieved with a compound discharge coefficient cavitation estimator. Cavitation estimators were also evaluated in several frequency intervals. Finally, a prediction of cavitation erosion was made in order to choose appropriate maintenance and repair periods.
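The estimator ranking described above amounts to computing an ordinary coefficient of determination (R²) for a linear fit of each estimator against the Thoma number. The operating points below are invented for illustration, not measurements from the prototype:

```python
def r_squared(x, y):
    """Coefficient of determination of a least-squares line fit of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Hypothetical operating points: Thoma number vs. two candidate estimators.
sigma = [0.10, 0.12, 0.14, 0.16, 0.18]
est_a = [9.8, 8.1, 6.2, 4.1, 2.0]    # tracks the Thoma number closely
est_b = [5.0, 7.5, 3.0, 6.0, 4.5]    # weak relation to the Thoma number
print(round(r_squared(sigma, est_a), 3), round(r_squared(sigma, est_b), 3))
```

An estimator like `est_a`, with R² close to 1, is the kind that would be selected for condition monitoring; the compound discharge coefficient estimator played this role in the study.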

Keywords: cavitation erosion, turbine, cavitation measurement, fluid dynamics

Procedia PDF Downloads 405
3435 Foresight in the Food Supply System in Bogotá

Authors: Suarez-Puello Alejandro, Baquero-Ruiz Andrés F, Suarez-Puello Rodrigo

Abstract:

This paper discusses the results of a foresight exercise that analyzes Bogotá's fruit, vegetable, and tuber supply chain strategy, described in the Food Supply and Security Master Plan (FSSMP), to provide the inhabitants of Bogotá, Colombia, with basic food products at a fair price. The methodology consisted of using quantitative and qualitative foresight tools, such as system dynamics and variable selection methods, to better represent interactions among stakeholders and obtain more integral results that could shed light on this complex situation. First, the Master Plan is used as an input to establish the objectives and scope of the exercise. Then, stakeholders and their relationships are identified. Next, system dynamics is used to model product, information, and money flows along the fruit, vegetable, and tuber supply chain. Two scenarios are presented, discussing actions by the public sector and the reactions that could be expected from the whole food supply system. Finally, these impacts are compared to the Master Plan's objectives, suggesting recommendations that could improve its execution. This foresight exercise, performed at a governmental level, is intended to promote the wider use of foresight as an anticipatory, decision-making tool that offers solutions to complex problems.
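By way of illustration only (the stock, rates, and horizon below are invented, not FSSMP figures), a system-dynamics model of this kind ultimately reduces to stocks integrated over flows:

```python
# Minimal stock-and-flow sketch in the spirit of system dynamics:
# one inventory stock for a single product, integrated with Euler steps.
inventory = 100.0          # tonnes held by city wholesalers (illustrative)
supply_rate = 50.0         # tonnes/day arriving from producing regions
demand_rate = 48.0         # tonnes/day consumed by the city
dt = 1.0                   # one-day time step

history = []
for day in range(30):
    spoilage = 0.01 * inventory            # 1% of stock lost per day
    inventory += dt * (supply_rate - demand_rate - spoilage)
    history.append(inventory)

print(round(history[-1], 1))
```

Scenario analysis then means changing a rate (e.g. a public-sector intervention that raises the supply rate) and re-running the integration to compare trajectories against the Master Plan's targets.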

Keywords: decision making, foresight, public policies, supply chain, system dynamics

Procedia PDF Downloads 429
3434 Method for Requirements Analysis and Decision Making for Restructuring Projects in Factories

Authors: Rene Hellmuth

Abstract:

The requirements for factory planning and the buildings concerned have changed in recent years. Factory planning has the task of designing the products, plants, processes, organization, areas, and building of a factory. Regular restructuring is gaining importance as a means of maintaining the competitiveness of a factory. Restrictions regarding new areas, shorter life cycles of products and production technology, as well as a VUCA world (volatility, uncertainty, complexity, and ambiguity), lead to more frequent rebuilding measures within a factory. Restructuring of factories is the most common planning case today, more common than new construction, revitalization, and dismantling of factories. The increasing importance of restructuring processes shows that the ability to change was, and is, a promising concept for companies reacting to permanently changing conditions. The factory building is the basis for most changes within a factory. If an adaptation of a construction project (factory) is necessary, the inventory documents must be checked, and often time-consuming planning of the adaptation must take place to define the relevant components to be adapted and finally evaluate them. The different requirements of the planning participants from the disciplines of factory planning (production planner, logistics planner, automation planner) and industrial construction planning (architect, civil engineer) come together during reconstruction and must be structured. This raises the research question: Which requirements do the disciplines involved in reconstruction planning place on a digital factory model? A subordinate research question is: How can model-based decision support be provided for a more efficient design of conversions within a factory?
Because of the high adaptation rate of factories and their buildings described above, a methodology for restructuring factories, based on the requirements engineering method from software development, is conceived and designed for practical application in factory restructuring projects. The explorative research procedure according to Kubicek is applied; explorative research is suitable when the practical usability of the research results has priority. Furthermore, it is shown how best to use a digital factory model in practice. The focus is on mobile applications to meet the needs of factory planners on site. An augmented reality (AR) application is designed and created to provide decision support for planning variants. The aim is to contribute to a shortening of the planning process and to model-based decision support for more efficient change management. This requires a methodology that reduces the deficits of the existing approaches. Time and cost expenditure are represented in the AR tablet solution based on a building information model (BIM). Overall, the requirements of those involved in the planning process for a digital factory model in the case of restructuring within a factory are thus first determined in a structured manner. The results are then applied and transferred to a construction site solution based on augmented reality.

Keywords: augmented reality, digital factory model, factory planning, restructuring

Procedia PDF Downloads 121
3433 Comparison Study of Machine Learning Classifiers for Speech Emotion Recognition

Authors: Aishwarya Ravindra Fursule, Shruti Kshirsagar

Abstract:

At the intersection of artificial intelligence and human-centered computing, this paper delves into speech emotion recognition (SER). It presents a comparative analysis of machine learning models such as K-Nearest Neighbors (KNN), logistic regression, support vector machines (SVM), decision trees, ensemble classifiers, and random forests, applied to SER. The research employs four datasets: CREMA-D, SAVEE, TESS, and RAVDESS. It focuses on extracting salient audio signal features such as the zero-crossing rate (ZCR), chroma features (chroma_stft), Mel Frequency Cepstral Coefficients (MFCC), the root mean square (RMS) value, and the Mel spectrogram. These features are used to train and evaluate the models' ability to recognize eight types of emotion from speech: happy, sad, neutral, angry, calm, disgust, fear, and surprise. Among the models, the random forest algorithm demonstrated superior performance, achieving approximately 79% accuracy, which suggests its suitability for SER within the parameters of this study. The research contributes to SER by showcasing the effectiveness of various machine learning algorithms and feature extraction techniques, and the findings hold promise for the development of more precise emotion recognition systems in the future.
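Two of the simpler features listed above, ZCR and RMS energy, can be computed per frame with the standard library alone (MFCCs, chroma, and Mel spectrograms would typically come from a library such as librosa). The 440 Hz test tone below is synthetic, used only to sanity-check the feature values:

```python
import math

def frame_features(signal, frame_len=1024, hop=512):
    """Per-frame zero-crossing rate and RMS energy of a mono signal."""
    zcr, rms = [], []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        # a crossing is a sign change between consecutive samples
        crossings = sum(1 for a, b in zip(frame, frame[1:]) if (a < 0) != (b < 0))
        zcr.append(crossings / frame_len)
        rms.append(math.sqrt(sum(x * x for x in frame) / frame_len))
    return zcr, rms

# Synthetic test tone: 440 Hz sine at 16 kHz, 0.5 s, amplitude 0.5
sr = 16000
tone = [0.5 * math.sin(2 * math.pi * 440 * n / sr) for n in range(sr // 2)]
zcr, rms = frame_features(tone)
mean = lambda xs: sum(xs) / len(xs)
print(round(mean(zcr), 3), round(mean(rms), 3))
```

For a 440 Hz sine, roughly 880 sign changes occur per second (two per period), and the RMS of a sine of amplitude 0.5 is 0.5/√2 ≈ 0.354, so the printed means act as a quick correctness check before feeding such features to a classifier.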

Keywords: comparison, ML classifiers, KNN, decision tree, SVM, random forest, logistic regression, ensemble classifiers

Procedia PDF Downloads 35
3432 Correlation Analysis between Sensory Processing Sensitivity (SPS), Meares-Irlen Syndrome (MIS) and Dyslexia

Authors: Kaaryn M. Cater

Abstract:

Students with sensory processing sensitivity (SPS), Meares-Irlen Syndrome (MIS) and dyslexia can become overwhelmed and struggle to thrive in traditional tertiary learning environments. An estimated 50% of tertiary students who disclose learning related issues are dyslexic. This study explores the relationship between SPS, MIS and dyslexia. Baseline measures will be analysed to establish any correlation between these three minority methods of information processing. SPS is an innate sensitivity trait found in 15-20% of the population and has been identified in over 100 species of animals. Humans with SPS are referred to as Highly Sensitive People (HSP) and the measure of HSP is a 27 point self-test known as the Highly Sensitive Person Scale (HSPS). A 2016 study conducted by the author established base-line data for HSP students in a tertiary institution in New Zealand. The results of the study showed that all participating HSP students believed the knowledge of SPS to be life-changing and useful in managing life and study, in addition, they believed that all tutors and in-coming students should be given information on SPS. MIS is a visual processing and perception disorder that is found in approximately 10% of the population and has a variety of symptoms including visual fatigue, headaches and nausea. One way to ease some of these symptoms is through the use of colored lenses or overlays. Dyslexia is a complex phonological based information processing variation present in approximately 10% of the population. An estimated 50% of dyslexics are thought to have MIS. The study exploring possible correlations between these minority forms of information processing is due to begin in February 2017. An invitation will be extended to all first year students enrolled in degree programmes across all faculties and schools within the institution. An estimated 900 students will be eligible to participate in the study. 
Participants will be asked to complete a battery of online questionnaires, including the Highly Sensitive Person Scale, the International Dyslexia Association adult self-assessment, and the adapted Irlen indicator. All three scales have been used extensively in the literature and have been validated in many populations. All participants whose scores on any (or several) of the three questionnaires suggest a minority method of information processing will receive an invitation to meet with a learning advisor and will be given access to counselling services if they choose. Meeting with a learning advisor is not mandatory, and some participants may choose not to receive help. Data will be collected using the Question Pro platform, and baseline data will be analysed using correlation and regression analysis to identify relationships and predictors among SPS, MIS, and dyslexia. This study forms part of a larger three-year longitudinal study, and participants will be required to complete questionnaires at annual intervals in subsequent years of the study until completion of (or withdrawal from) their degree. At these data collection points, participants will be questioned on any additional support received relating to their minority method(s) of information processing. Data from this study will be available by April 2017.
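The planned correlation analysis amounts to computing pairwise Pearson coefficients between the three questionnaire scores. A minimal sketch, with invented scores (not study data) for two of the instruments:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical questionnaire scores for five participants
hsps = [88, 95, 60, 120, 72]      # Highly Sensitive Person Scale totals
irlen = [4, 6, 2, 9, 3]           # adapted Irlen indicator scores
print(round(pearson_r(hsps, irlen), 2))
```

A coefficient near +1 for a pair of instruments would indicate the kind of SPS/MIS/dyslexia association the study sets out to test; in practice, the study's analysis would also report significance and regression coefficients.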

Keywords: dyslexia, highly sensitive person (HSP), Meares-Irlen Syndrome (MIS), minority forms of information processing, sensory processing sensitivity (SPS)

Procedia PDF Downloads 226
3431 Visual Text Analytics Technologies for Real-Time Big Data: Chronological Evolution and Issues

Authors: Siti Azrina B. A. Aziz, Siti Hafizah A. Hamid

Abstract:

New approaches to analyzing and visualizing data streams in real time are important for enabling decision makers to make prompt decisions. Financial market trading and surveillance, large-scale emergency response, and crowd control are example scenarios that require real-time analytics and data visualization. This situation has led to the development of techniques and tools that support humans in analyzing the source data. With the emergence of Big Data and social media, new techniques and tools are required to process streaming data, and a range of tools implementing some of these functionalities is available today. In this paper, we present a chronological evaluation of the evolution of technologies supporting real-time analytics and visualization of data streams. Based on research papers published from 2002 to 2014, we gathered general information, main techniques, challenges, and open issues. The techniques for streaming text visualization are identified in chronological order based on the Text Visualization Browser. This paper aims to review the evolution of streaming text visualization techniques and tools, as well as to discuss the problems and challenges of each identified tool.

Keywords: information visualization, visual analytics, text mining, visual text analytics tools, big data visualization

Procedia PDF Downloads 391