Search results for: decision processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7455

6285 An Improved Parallel Algorithm of Decision Tree

Authors: Jiameng Wang, Yunfei Yin, Xiyu Deng

Abstract:

Parallel optimization is one of the important research topics of data mining at this stage. Taking Classification and Regression Tree (CART) parallelization as an example, this paper proposes a parallel data mining algorithm based on SSP-OGini-PCCP. Aiming at the problem of choosing the best CART segmentation point, this paper designs an S-SP model without data association; and in order to calculate the Gini index efficiently, a parallel OGini calculation method is designed. In addition, in order to improve the efficiency of the pruning algorithm, a synchronous PCCP pruning strategy is proposed in this paper. In this paper, the optimal segmentation calculation, Gini index calculation, and pruning algorithm are studied in depth. These are important components of parallel data mining. By constructing a distributed cluster simulation system based on SPARK, data mining methods based on SSP-OGini-PCCP are tested. Experimental results show that this method can increase the search efficiency of the best segmentation point by an average of 89%, increase the search efficiency of the Gini segmentation index by 3853%, and increase the pruning efficiency by 146% on average; and as the size of the data set increases, the performance of the algorithm remains stable, which meets the requirements of contemporary massive data processing.
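SSP, OGini and PCCP are the paper's own components and are not reproduced here; the sketch below only shows the serial computation they parallelize, i.e., scoring candidate split points of one feature by the weighted Gini index. The function names and the exhaustive search are illustrative assumptions.

```python
import numpy as np

def gini(labels):
    """Gini impurity of a label array."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(feature, labels):
    """Exhaustively score candidate split points on one feature and
    return the threshold with the lowest weighted Gini index."""
    order = np.argsort(feature)
    x, y = feature[order], labels[order]
    best_thr, best_score = None, np.inf
    for i in range(1, len(x)):
        if x[i] == x[i - 1]:
            continue  # identical values cannot be separated
        left, right = y[:i], y[i:]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        if score < best_score:
            best_thr, best_score = (x[i] + x[i - 1]) / 2, score
    return best_thr, best_score

# toy usage
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = (x > 0.3).astype(int)
print(best_split(x, y))
```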

Keywords: classification, Gini index, parallel data mining, pruning ahead

Procedia PDF Downloads 123
6284 Resource Framework Descriptors for Interestingness in Data

Authors: C. B. Abhilash, Kavi Mahesh

Abstract:

Human beings are the most advanced species on earth, largely because of the ability to communicate and share information via human language. In today's world, a huge amount of data is available on the web in text format. This has also resulted in the generation of big data in structured and unstructured formats. In general, the data is in textual form, which is highly unstructured. To get insights and actionable content from this data, we need to incorporate the concepts of text mining and natural language processing. In our study, we mainly focus on interesting data through which interesting facts are generated for the knowledge base. The approach is to derive the analytics from the text via the application of natural language processing. Using the semantic web Resource Description Framework (RDF), we generate triples from the given data and derive interesting patterns. The methodology also illustrates data integration using RDF for reliable, interesting patterns.
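The paper's pipeline is not given in implementation detail; the snippet below is a minimal sketch of how triples can be generated from facts extracted by NLP and serialized with the rdflib library. The namespace and the example facts are hypothetical.

```python
from rdflib import Graph, Namespace, RDF, URIRef

EX = Namespace("http://example.org/kb/")   # hypothetical namespace
g = Graph()
g.bind("ex", EX)

# facts extracted from text by an NLP stage (illustrative placeholders)
facts = [("Camphor", "treats", "Cough"), ("Turmeric", "treats", "Inflammation")]

for subj, pred, obj in facts:
    s, p, o = EX[subj], EX[pred], EX[obj]
    g.add((s, RDF.type, EX.Entity))
    g.add((s, p, o))                       # the subject-predicate-object triple

print(g.serialize(format="turtle"))        # inspect the generated knowledge base
```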

Keywords: RDF, interestingness, knowledge base, semantic data

Procedia PDF Downloads 162
6283 An Experimental Study of Scalar Implicature Processing in Chinese

Authors: Liu Si, Wang Chunmei, Liu Huangmei

Abstract:

A prominent component of the semantic versus pragmatic debate, scalar implicature (SI) has been gaining great attention ever since it was proposed by Horn. The constant debate is between the structural and the pragmatic approach. The former claims that generation of SI is costless, automatic, and dependent mostly on the structural properties of sentences, whereas the latter advocates both that such generation is largely dependent upon context and that the process is costly. Many experiments, among which Katsos's text comprehension experiments are influential, have been designed and conducted in order to verify these views, but the results are not conclusive. Besides, most of the experiments were conducted with English language materials. Katsos conducted one off-line and three on-line text comprehension experiments, in which the previous shortcomings were addressed to a certain extent and the conclusion was in favor of the pragmatic approach. We intend to test the results of Katsos's experiments on Chinese scalar implicature. Four experiments in both off-line and on-line conditions to examine the generation and response time of SI in Chinese "yixie" (some) and "quanbu (dou)" (all) will be conducted in order to find out whether the structural or the pragmatic approach can be sustained. The study mainly aims to answer the following questions: (1) Can SI be generated in the upper- and lower-bound contexts, as Katsos confirmed, when Chinese language materials are used in the experiment? (2) Can SI be first generated and then cancelled, as the default view claims, or can it not be generated at all in a neutral context when Chinese language materials are used? (3) Is SI generation costless or costly in terms of processing resources? (4) In line with the SI generation process, what conclusion can be made about the cognitive processing model of language meaning? Is it a parallel model or a linear model? Or is it a dynamic and hierarchical model? According to previous theoretical debates and experimental conflicts, it can be presumed that SI in Chinese might be generated in upper-bound contexts. Besides, the response time might be faster in upper-bound contexts than in lower-bound contexts. SI generation in a neutral context might be the slowest. Finally, a conclusion would be drawn that the processing model of SI cannot be verified by either a purely structural or a purely pragmatic approach. It is, rather, a dynamic and complex processing mechanism in which language forms, ad hoc context, mental context, background knowledge, speakers' interaction, etc. all interact.

Keywords: cognitive linguistics, pragmatics, scalar implicature, experimental study, Chinese language

Procedia PDF Downloads 361
6282 Influence of Thermal Processing Methods on Antinutrient of Artocarpus heterophyllus Seeds

Authors: Marina Zulkifli, Mohd Faizal Mashhod, Noriham Abdullah

Abstract:

The aim of this study was to determine the antinutrient compounds of jackfruit (Artocarpus heterophyllus) seeds as affected by thermal processes. Two types of heat treatment were applied, namely boiling and microwave cooking. Results of this study showed that boiling caused a significant decrease in phytate content (30.01%), oxalate content (33.22%), saponin content (35.69%) and tannin content (44.58%) as compared to microwave cooking and raw seed. The percentage loss of antinutrient compounds in microwaved seed was: phytate 24.58%, oxalate 27.28%, saponin 16.50% and tannin 32.21%. Hence, these findings suggest that boiling is an effective treatment to reduce the level of these toxic compounds in foods.

Keywords: jackfruit, heat treatments, antinutrient compounds, thermal processing

Procedia PDF Downloads 433
6281 Computational Intelligence and Machine Learning for Urban Drainage Infrastructure Asset Management

Authors: Thewodros K. Geberemariam

Abstract:

The rapid physical expansion of urbanization coupled with aging infrastructure presents unique decision and management challenges for many big city municipalities. Cities must therefore upgrade and maintain the existing aging urban drainage infrastructure systems to keep up with the demands. Given the overall contribution of assets to municipal revenue and the importance of infrastructure to the success of a livable city, many municipalities are currently looking for a robust and smart urban drainage infrastructure asset management solution that combines management, financial, engineering and technical practices. This robust decision-making shall rely on sound, complete, current and relevant data that enables asset valuation, impairment testing, lifecycle modeling, and forecasting across the multiple asset portfolios. In this paper, predictive computational intelligence (CI) and multi-class machine learning (ML), coupled with online, offline, and historical record data collected from an array of multi-parameter sensors, are used for the extraction of different operational and non-conforming patterns hidden in structured and unstructured data to determine and produce actionable insight on the current and future states of the network. This paper aims to improve the strategic decision-making process by identifying all possible alternatives, evaluating the risk of each alternative, and choosing the alternative most likely to attain the required goal in a cost-effective manner, using historical and near real-time urban drainage infrastructure data for urban drainage infrastructure assets that have previously not benefited from computational intelligence and machine learning advancements.
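The abstract does not name specific algorithms; as one concrete reading of the multi-class pattern-extraction step, the sketch below trains a standard classifier on synthetic multi-parameter sensor features to label asset condition states. The feature names and condition classes are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(1)
# hypothetical sensor-derived features: flow, level, pipe age, blockage events
X = rng.normal(size=(1000, 4))
# hypothetical condition classes: 0 = good, 1 = degraded, 2 = non-conforming
y = np.clip((X[:, 2] + 0.5 * X[:, 3] + rng.normal(0, 0.5, 1000) + 1.0).round(), 0, 2).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```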

Keywords: computational intelligence, machine learning, urban drainage infrastructure, classification, prediction, asset management

Procedia PDF Downloads 152
6280 Performance Comparison of ADTree and Naive Bayes Algorithms for Spam Filtering

Authors: Thanh Nguyen, Andrei Doncescu, Pierre Siegel

Abstract:

Classification is an important data mining technique and can be used for data filtering in artificial intelligence. The broad applicability of classification to all kinds of data leads to its use in nearly every field of modern life. Classification helps us group different items according to the features judged interesting and useful. In this paper, we compare two classification methods, Naïve Bayes and ADTree, used to detect spam e-mail. This choice is motivated by the fact that the Naive Bayes algorithm is based on probability calculus while the ADTree algorithm is based on decision trees. The parameter settings of the above classifiers aim at maximizing the true positive rate and minimizing the false positive rate. The experimental results present classification accuracy and cost analysis in view of optimal classifier choice for spam detection. The number of attributes is also examined in order to obtain a trade-off between the number of attributes and the classification accuracy.
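ADTree (an alternating decision tree, available in Weka) has no scikit-learn counterpart, so the sketch below uses an ordinary decision tree as a stand-in and compares it with multinomial Naive Bayes on a tiny invented bag-of-words spam example.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.tree import DecisionTreeClassifier

messages = ["win a free prize now", "cheap meds online", "meeting at noon",
            "project report attached", "free cash offer", "lunch tomorrow?"]
labels = [1, 1, 0, 0, 1, 0]          # 1 = spam, 0 = ham (toy data)

X = CountVectorizer().fit_transform(messages)   # bag-of-words attributes
for model in (MultinomialNB(), DecisionTreeClassifier(random_state=0)):
    model.fit(X, labels)
    print(type(model).__name__, "training accuracy:", model.score(X, labels))
```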

Keywords: classification, data mining, spam filtering, naive bayes, decision tree

Procedia PDF Downloads 411
6279 Proposal of a Model Supporting Decision-Making Based on Multi-Objective Optimization Analysis on Information Security Risk Treatment

Authors: Ritsuko Kawasaki (Aiba), Takeshi Hiromatsu

Abstract:

Management is required to understand all information security risks within an organization and to decide which information security risks should be treated, at what level, and at what cost. However, such decision-making is not usually easy, because various measures for risk treatment must be selected with suitable application levels. In addition, some measures may have objectives that conflict with each other, which also makes the selection difficult. Moreover, risks generally have trends, and these should also be considered in risk treatment. Therefore, this paper provides an extension of the model proposed in the previous study. The original model supports the selection of measures by applying a combination of the weighted average method and the goal programming method for multi-objective analysis to find an optimal solution. The extended model adds the notion of weights on the risks, where a larger weight indicates a higher priority of the risk.
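The original weighted-average and goal-programming formulation is not reproduced here; the sketch below only illustrates the extension described in the abstract, i.e., weighting risks by priority when scoring candidate measures under a cost budget. The risks, measures, costs and the greedy selection rule are illustrative assumptions, not the authors' optimization model.

```python
# Each measure mitigates some risks (0-1 effectiveness) at a given cost (all values invented).
measures = {
    "firewall upgrade": {"cost": 40, "mitigation": {"intrusion": 0.7, "leak": 0.2}},
    "staff training":   {"cost": 25, "mitigation": {"phishing": 0.6, "leak": 0.3}},
    "encryption":       {"cost": 35, "mitigation": {"leak": 0.8}},
}
# Larger weight = higher-priority risk (the extension proposed in the paper).
risk_weight = {"intrusion": 0.5, "phishing": 0.2, "leak": 0.3}
budget = 60

def score(m):
    """Weighted-average mitigation score of a measure."""
    return sum(risk_weight[r] * e for r, e in m["mitigation"].items())

# Greedy score-per-cost selection as a simple stand-in for the multi-objective optimization.
chosen, spent = [], 0
for name, m in sorted(measures.items(), key=lambda kv: -score(kv[1]) / kv[1]["cost"]):
    if spent + m["cost"] <= budget:
        chosen.append(name)
        spent += m["cost"]
print(chosen, "total cost:", spent)
```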

Keywords: information security risk treatment, selection of risk measures, risk acceptance, multi-objective optimization

Procedia PDF Downloads 461
6278 Utilizing Literature Review and Shared Decision-Making to Support a Patient Make the Decision: A Case Study of Virtual Reality for Postoperative Pain

Authors: Pei-Ru Yang, Yu-Chen Lin, Jia-Min Wu

Abstract:

Background: A 58-year-old man with a history of osteoporosis and diabetes presented with chronic pain in his left knee due to severe knee joint degeneration. Knee replacement surgery was recommended by the doctor, but the patient had low pain tolerance and wondered whether virtual reality could relieve acute postoperative wound pain. Methods: We used the PICO (patient, intervention, comparison, and outcome) approach to generate indexed keywords and searched systematic review articles from 2017 to 2021 in the Cochrane Library, PubMed, and ClinicalKey databases. Results: The initial literature results included 38 articles, including 12 Cochrane Library articles and 26 PubMed articles. One article was selected for further analysis after removing duplicates and off-topic articles. The eight trials included in this article were published between 2013 and 2019 and recruited a total of 723 participants. The studies, conducted in India, Lebanon, Iran, South Korea, Spain, and China, included adults who underwent hemorrhoidectomy, dental surgery, craniotomy or spine surgery, episiotomy repair, and knee surgery, with mean ages ranging from 24.1 ± 4.1 to 73.3 ± 6.5 years. Virtual reality is an emerging non-drug postoperative analgesia method. The findings showed that pain scores were reduced by a mean of 1.48 points (95% CI: -2.02 to -0.95, p-value < 0.0001) in minor surgery and 0.32 points in major surgery (95% CI: -0.53 to -0.11, p-value < 0.03), and overall postoperative satisfaction improved. Discussion: Postoperative pain is a common clinical problem in surgical patients. Research has confirmed that virtual reality can create an immersive interactive environment, communicate with patients, and effectively relieve postoperative pain. However, virtual reality requires the purchase of hardware, software and other related computer equipment, and its high cost is a disadvantage. We selected the best literature based on clinical questions to answer the patient's question and used shared decision making (SDM) to help the patient make decisions based on the clinical situation after knee replacement surgery, so as to improve the quality of patient-centered care.

Keywords: knee replacement surgery, postoperative pain, shared decision making, virtual reality

Procedia PDF Downloads 68
6277 Effects of Temperature and the Use of Bacteriocins on Cross-Contamination from Animal Source Food Processing: A Mathematical Model

Authors: Benjamin Castillo, Luis Pastenes, Fernando Cerdova

Abstract:

The contamination of food by microbial agents is a common problem in the industry, especially regarding the elaboration of animal source products. Incorrect handling of the machinery or of the raw materials can cause a decrease in production or an epidemiological outbreak due to intoxication. In order to improve food product quality, different methods have been used to reduce or, at least, to slow down the growth of pathogens, especially spoilage, infectious or toxigenic bacteria. These methods are usually carried out under low temperatures and short processing times (abiotic agents), along with the application of antibacterial substances such as bacteriocins (biotic agents), in a controlled and efficient way that fulfills the purpose of bacterial control without damaging the final product. Therefore, the objective of the present study is to design a secondary mathematical model that allows the prediction of the impact of both biotic and abiotic factors associated with animal source food processing. In order to accomplish this objective, the authors propose a three-dimensional differential equation model whose components are bacterial growth; release, production and artificial incorporation of bacteriocins; and changes in the pH of the medium. These three dimensions are constantly influenced by the temperature of the medium. Secondly, this model is adapted to an idealized situation of cross-contamination in animal source food processing, with the study agents being both the animal product and the contact surface. Thirdly, the stochastic simulations and the parametric sensitivity analysis are compared with reference data. The main results obtained from the analysis and simulations of the mathematical model were that, although bacterial growth can be stopped at low temperatures, even lower temperatures are needed to eradicate it. However, this can be not only expensive but also counterproductive in terms of the quality of the raw materials, while, on the other hand, higher temperatures accelerate bacterial growth. In other respects, the use of bacteriocins is an effective alternative in the short and medium term. Moreover, a low pH is an indicator of bacterial growth, since many spoilage bacteria are lactic acid bacteria. Lastly, the processing times are a secondary agent of concern when the rest of the aforementioned agents are under control. Our main conclusion is that adapting a mathematical model to the context of the industrial process can generate new tools that predict bacterial contamination, the impact of bacterial inhibition, and processing times. In addition, the proposed mathematical model has a logistic structure of broad applicability, which can be replicated for non-meat food products, other pathogens, or even contamination by cross-contact with food allergens.
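The model equations are not given in the abstract; the sketch below only shows the general shape of such a three-dimensional, temperature-dependent system (bacterial population N, bacteriocin concentration B, medium pH) integrated with SciPy. All functional forms and parameter values are illustrative assumptions, not the authors' model.

```python
import numpy as np
from scipy.integrate import solve_ivp

def model(t, y, T):
    N, B, pH = y                       # bacteria, bacteriocin, medium pH
    mu = 0.5 * np.exp(0.08 * (T - 4))  # growth rate rises with temperature (assumed form)
    growth = mu * N * (1 - N / 1e8)    # logistic bacterial growth
    kill = 0.002 * B * N               # bacteriocin-induced inactivation
    dN = growth - kill
    dB = 1e-7 * N - 0.05 * B           # release/production minus decay
    dpH = -1e-9 * growth               # acidification by lactic acid bacteria
    return [dN, dB, dpH]

# 48-hour simulation of a contaminated contact surface held at 10 degrees C
sol = solve_ivp(model, (0, 48), [1e3, 0.0, 6.5], args=(10,), dense_output=True)
print("final state at T = 10 C:", sol.y[:, -1])
```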

Keywords: bacteriocins, cross-contamination, mathematical model, temperature

Procedia PDF Downloads 144
6276 GIS Model for Sanitary Landfill Site Selection Based on Geotechnical Parameters

Authors: Hecson Christian, Joel Macwan

Abstract:

Landfill site selection in an urban area is a critical issue in the planning process. With the growth of urbanization, it has a mammoth impact on the economy, ecology, and environmental health of the region. Outsized amounts of waste are produced, and the problem worsens every day. Hence, the selection of an ideal site for a sanitary landfill is a challenge for urban planners and solid waste managers. A disposal site is a function of many parameters. Among all of them, geotechnical parameters are vital, as they relate to the surrounding open land. Moreover, accessible, safe and acceptable land is also scarce. Therefore, in this paper geotechnical parameters are used to develop a GIS model to identify an ideal location for landfill purposes. The metropolitan city of Surat is a highly populated and fast-growing urban area in India. The research objectives are to conduct field experiments to collect data and to transfer the findings to a GIS platform to evolve a model for finding an ideal location. Planners' preferences were obtained, and the analytic hierarchy process (AHP) was used to find the weight of each parameter. The integration of GIS and Multi-Criteria Decision Analysis (MCDA) techniques is applied to improve decision-making, as it provides an environment for the transformation and combination of geographical data and planners' preferences. GIS performs deterministic overlay and buffer operations, while MCDA methods evaluate alternatives based on the decision makers' subjective values and priorities. The research results have shown many alternative locations. An economic analysis of the selected site from an actual operations point of view is not included in this research.
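As an illustration of the AHP weighting step mentioned above, the sketch below derives criterion weights from a pairwise comparison matrix via the principal eigenvector and reports a consistency ratio. The example geotechnical criteria and comparison values are hypothetical.

```python
import numpy as np

# Pairwise comparisons (Saaty scale) for, e.g., soil permeability,
# depth to groundwater, bearing capacity -- values are illustrative only.
A = np.array([[1.0, 3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                              # normalized criterion weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)      # consistency index
cr = ci / 0.58                            # random index RI = 0.58 for n = 3 (Saaty)
print("weights:", np.round(w, 3), "consistency ratio:", round(cr, 3))
```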

Keywords: GIS, AHP, MCDA, geotechnical

Procedia PDF Downloads 145
6275 Redefining Infrastructure as Code Orchestration Using AI

Authors: Georges Bou Ghantous

Abstract:

This research delves into the transformative impact of Artificial Intelligence (AI) on Infrastructure as Code (IaaC) practices, specifically focusing on the redefinition of infrastructure orchestration. By harnessing AI technologies such as machine learning algorithms and predictive analytics, organizations can achieve unprecedented levels of efficiency and optimization in managing their infrastructure resources. AI-driven IaaC introduces proactive decision-making through predictive insights, enabling organizations to anticipate and address potential issues before they arise. Dynamic resource scaling, facilitated by AI, ensures that infrastructure resources can seamlessly adapt to fluctuating workloads and changing business requirements. Through case studies and best practices, this paper sheds light on the tangible benefits and challenges associated with AI-driven IaaC transformation, providing valuable insights for organizations navigating the evolving landscape of digital infrastructure management.

Keywords: artificial intelligence, infrastructure as code, efficiency optimization, predictive insights, dynamic resource scaling, proactive decision-making

Procedia PDF Downloads 34
6274 Decision Support System for Fetus Status Evaluation Using Cardiotocograms

Authors: Oyebade K. Oyedotun

Abstract:

The cardiotocogram is a technical recording of the fetal heart rate and of uterine contractions during pregnancy. During pregnancy, several complications can occur to both the mother and the fetus; hence it is crucial that medical experts are able to find technical means to check the health of the mother and especially of the fetus. It is very important that the fetus develops as expected in stages during the pregnancy period; however, the task of monitoring the health status of the fetus is not easily achieved, as the fetus is not wholly physically available to medical experts for inspection. Hence, doctors have to resort to other tests that can give an indication of the status of the fetus. One such diagnostic test is to obtain cardiotocograms of the fetus. From the analysis of the cardiotocograms, medical experts can determine the status of the fetus and therefore decide on the necessary medical interventions. Generally, medical experts classify examined cardiotocograms as ‘normal’, ‘suspect’, or ‘pathological’. This work presents an artificial neural network based decision support system which can filter cardiotocogram data, producing the corresponding statuses of the fetuses. The capability of artificial neural networks to explore cardiotocogram data and learn features that distinguish one class from the others has been exploited in this research. In this research, feedforward and radial basis neural networks were trained on a publicly available database to classify the processed cardiotocogram data into one of the three classes: ‘normal’, ‘suspect’, or ‘pathological’. Classification accuracies of 87.8% and 89.2% were achieved during the test phase of the trained network for the feedforward and radial basis neural networks, respectively. It is hoped that, while the system described in this work may not be a complete replacement for a medical expert in fetus status evaluation, it can significantly reinforce confidence in the medical diagnoses reached by experts.
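As an illustration of the feedforward network described above, the sketch below trains a small multilayer perceptron to separate the three classes from synthetic CTG-like features. The real study used a publicly available cardiotocography database and also a radial basis function network, which scikit-learn does not provide directly; the feature set and labelling rule here are invented.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
# synthetic stand-ins for CTG features (baseline FHR, accelerations, decelerations, variability)
X = rng.normal(size=(2000, 4))
# 0 = normal, 1 = suspect, 2 = pathological (invented labelling rule)
y = np.select([X[:, 0] + X[:, 3] > 1.0, X[:, 2] > 0.8], [2, 1], default=0)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
net = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0))
net.fit(X_tr, y_tr)
print("test accuracy:", round(net.score(X_te, y_te), 3))
```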

Keywords: decision support, cardiotocogram, classification, neural networks

Procedia PDF Downloads 332
6273 A Reliable Multi-Type Vehicle Classification System

Authors: Ghada S. Moussa

Abstract:

Vehicle classification is an important task in traffic surveillance and intelligent transportation systems. Classification of vehicle images faces several problems, such as high intra-class vehicle variation, occlusion, shadow, and illumination changes. These problems and others must be considered to develop a reliable vehicle classification system. In this study, a reliable multi-type vehicle classification system based on the Bag-of-Words (BoW) paradigm is developed. Our proposed system used and compared four well-known classifiers, namely Linear Discriminant Analysis (LDA), Support Vector Machine (SVM), k-Nearest Neighbour (KNN), and Decision Tree, to classify vehicles into four categories: motorcycles, small, medium and large. Experiments on a large dataset show that our approach is efficient and reliable in classifying vehicles, with an accuracy of 95.7%. The SVM outperforms the other classification algorithms in terms of both accuracy and robustness, alongside a considerable reduction in execution time. The innovativeness of the developed system is that it can serve as a framework for many vehicle classification systems.
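The sketch below reproduces the Bag-of-Words pipeline in miniature: local descriptors (SIFT in the paper) are quantized into a visual vocabulary with k-means, each image becomes a word histogram, and an SVM classifies the histograms into the four vehicle categories. Random arrays stand in for real SIFT descriptors so the sketch stays self-contained.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(3)
K = 32  # vocabulary size (assumed)

# One (n_descriptors x 128) array per image; in practice these come from a SIFT extractor.
images = [rng.normal(loc=c, size=(rng.integers(50, 80), 128))
          for c in (0.0, 0.5, 1.0, 1.5) for _ in range(20)]
labels = np.repeat([0, 1, 2, 3], 20)          # motorcycle, small, medium, large

vocab = KMeans(n_clusters=K, n_init=10, random_state=0).fit(np.vstack(images))

def bow_histogram(desc):
    """Quantize descriptors against the vocabulary and return a normalized histogram."""
    words = vocab.predict(desc)
    return np.bincount(words, minlength=K) / len(words)

X = np.array([bow_histogram(d) for d in images])
clf = SVC(kernel="rbf", C=10).fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```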

Keywords: vehicle classification, bag-of-words technique, SVM classifier, LDA classifier, KNN classifier, decision tree classifier, SIFT algorithm

Procedia PDF Downloads 358
6272 Unconscious Bias in Judicial Decisions: Legal Genealogy and Disgust in Cases of Private, Adult, Consensual Sexual Acts Leading to Injury

Authors: Susanna Menis

Abstract:

‘Unconscious’ bias is widespread, affecting society at all levels of decision-making and beyond. Placed in the context of the law, this study explores the direct effect of the psycho-social and cultural evolution of unconscious bias on how a judicial decision was made. The aim of this study is to contribute to socio-legal scholarship by examining the formation of unconscious bias and its influence on the creation of legal rules that judges believe reflect social solidarity and protect against violence. The study seeks to understand how concepts like criminalization and unlawfulness are constructed by the common law. The study methodology follows two theoretical approaches: historical genealogy and emotions as sociocultural phenomena. Both methods share the ‘tracing back’ of the original formation of a social way of seeing and doing things. The significance of this study lies in the importance of reflecting on the ways unconscious bias may be formed; placing judges’ decisions under this spotlight forces us to challenge the status quo, interrogate justice, and seek refinement of the law.

Keywords: legal genealogy, emotions, disgust, criminal law

Procedia PDF Downloads 61
6271 Accounting Management Information System for Convenient Shop in Bangkok Thailand

Authors: Anocha Rojanapanich

Abstract:

The purpose of this research is to design and develop an accounting management information system for convenience shops in Bangkok, Thailand. The study applied the System Development Life Cycle (SDLC), which began with the study and analysis of current data, including the existing system. Then, the system was designed and developed to meet users' requirements over the internet using application software such as MySQL for database management, Apache HTTP Server as the web server, and PHP (Hypertext Preprocessor) as the interface between the web server, the database and the users. The system was designed as two subsystems: the main system for the head office and the branch system for branch shops. These consist of three parts, classified by user management, namely shop management, inventory management and Point of Sale (POS) management, together with the cost information needed for decision making.

Keywords: accounting management information system, convenient shop, cost information for decision making system, development life cycle

Procedia PDF Downloads 420
6270 Impact of Varying Malting and Fermentation Durations on Specific Chemical, Functional Properties, and Microstructural Behaviour of Pearl Millet and Sorghum Flour Using Response Surface Methodology

Authors: G. Olamiti, T. K. Takalani, D. Beswa, A. I. O. Jideani

Abstract:

The study investigated the effects of malting and fermentation times on selected chemical and functional properties and on the microstructural behaviour of flours from the Agrigreen and Babala pearl millet cultivars and from sorghum, using response surface methodology (RSM). A Central Composite Rotatable Design (CCRD) was applied to two independent variables, malting and fermentation time (h), at levels of 24, 48, and 72 h. The results for the dependent parameters, namely pH, titratable acidity (TTA), water absorption capacity (WAC), oil absorption capacity (OAC), bulk density (BD), dispersibility and microstructural behaviour of the flours, showed significant differences (p < 0.05) with malting and fermentation time. Babala flour exhibited the highest pH value, 4.78, at 48 h malting and 81.9 h fermentation. Agrigreen flour showed the highest TTA value, 0.159%, at 81.94 h malting and 48 h fermentation. WAC was also highest in malted and fermented Babala flour, at 2.37 ml g-1, for 81.94 h malting and 48 h fermentation. Sorghum flour exhibited the lowest OAC, 1.67 ml g-1, at 14 h malting and 48 h fermentation. Agrigreen flour recorded the lowest bulk density, 0.53 g ml-1, for 72 h malting and 24 h fermentation. Sorghum flour exhibited the highest dispersibility, 56.34%, after 24 h malting and 72 h fermentation. The response surface plots showed that increased malting and fermentation times influenced the dependent parameters. The microstructural behaviour of the malted and fermented pearl millet and sorghum flours showed isolated, oval, spherical, or polygonal to smooth surfaces. The optimal processing conditions, i.e., malting and fermentation times, were 32.24 h and 63.32 h for Agrigreen, 35.18 h and 34.58 h for Babala, and 36.75 h and 47.88 h for sorghum, with a high desirability of 1.00. Validation of the optimum malting and fermentation times (h) on the dependent parameters confirmed the experimental values. Food processing companies can use the study's findings to improve food processing and quality.

Keywords: pearl millet, malting, fermentation, microstructural behaviour

Procedia PDF Downloads 71
6269 A Combined Approach Based on Artificial Intelligence and Computer Vision for Qualitative Grading of Rice Grains

Authors: Hemad Zareiforoush, Saeed Minaei, Ahmad Banakar, Mohammad Reza Alizadeh

Abstract:

The quality inspection of rice (Oryza sativa L.) during its various processing stages is very important. In this research, an artificial intelligence-based model coupled with computer vision techniques was developed as a decision support system for qualitative grading of rice grains. For conducting the experiments, first, 25 samples of rice grains with different levels of percentage of broken kernels (PBK) and degree of milling (DOM) were prepared and their qualitative grade was assessed by experienced experts. Then, the quality parameters of the same samples examined by experts were determined using a machine vision system. A grading model was developed based on fuzzy logic theory in MATLAB software to establish a relationship between the qualitative characteristics of the product and its quality. In total, 25 rules were used for qualitative grading based on the AND operator and the Mamdani inference system. The fuzzy inference system consisted of two input linguistic variables, namely DOM and PBK, which were obtained by the machine vision system, and one output variable (quality of the product). The model output was finally defuzzified using the Center of Maximum (COM) method. In order to evaluate the developed model, the output of the fuzzy system was compared with the experts’ assessments. It was revealed that the developed model can estimate the qualitative grade of the product with an accuracy of 95.74%.
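The paper builds its Mamdani system in MATLAB; the snippet below is a minimal Python sketch of the same kind of two-input, one-output grader using scikit-fuzzy. The membership functions, the three-rule base and the mean-of-maximum defuzzifier (standing in for the Center of Maximum method) are illustrative assumptions, not the authors' 25-rule model.

```python
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl

dom = ctrl.Antecedent(np.arange(0, 101, 1), "dom")        # degree of milling (%)
pbk = ctrl.Antecedent(np.arange(0, 51, 1), "pbk")         # percentage of broken kernels
grade = ctrl.Consequent(np.arange(0, 11, 0.1), "grade")   # qualitative grade (0-10)

dom.automf(3, names=["low", "medium", "high"])
pbk.automf(3, names=["low", "medium", "high"])
grade["poor"] = fuzz.trimf(grade.universe, [0, 0, 5])
grade["fair"] = fuzz.trimf(grade.universe, [3, 5, 7])
grade["good"] = fuzz.trimf(grade.universe, [5, 10, 10])
grade.defuzzify_method = "mom"   # mean of maximum, approximating Center of Maximum

rules = [ctrl.Rule(dom["high"] & pbk["low"], grade["good"]),
         ctrl.Rule(dom["medium"] | pbk["medium"], grade["fair"]),
         ctrl.Rule(pbk["high"], grade["poor"])]

sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
sim.input["dom"], sim.input["pbk"] = 85, 6
sim.compute()
print("estimated grade:", round(sim.output["grade"], 2))
```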

Keywords: machine vision, fuzzy logic, rice, quality

Procedia PDF Downloads 419
6268 The Formulation of Inference Fuzzy System as a Valuation Subsidiary Based on Particle Swarm Optimization to Solve the Issue of Decision Making in Middle Size Soccer Robot League

Authors: Zahra Abdolkarimi, Naser Zouri

Abstract:

The ultimate purpose of RoboCup is to create, by 2050, an independent team of robots able to win against the human world champion team under FIFA rules. The remarkable growth of robotics has created a collection of complex and motivating subjects in robotics and artificial intelligence, and has established a mechatronics foundation for both theoretical and technical work in RoboCup. The decision making of the robots depends on the reaction of the environment, the player itself and the rival players, using an inference fuzzy system as a valuation subsidiary to solve the robots' decision problems in the field game. The measure of selection, in comparison with other methods, is the percentage of victories of the same team when it plays randomly.

Keywords: particle swarm optimization, chaos theory, inference fuzzy system, simulation environment, rational fuzzy system, Mamdani and Assilian, defuzzify

Procedia PDF Downloads 386
6267 Deep Reinforcement Learning for Optimal Decision-Making in Supply Chains

Authors: Nitin Singh, Meng Ling, Talha Ahmed, Tianxia Zhao, Reinier van de Pol

Abstract:

We propose the use of reinforcement learning (RL) as a viable alternative for optimizing supply chain management, particularly in scenarios with stochasticity in product demands. RL's adaptability to changing conditions and its demonstrated success in diverse fields of sequential decision-making make it a promising candidate for addressing supply chain problems. We investigate the impact of demand fluctuations in a multi-product supply chain system and develop RL agents with learned generalizable policies. We provide experimentation details for training RL agents and a statistical analysis of the results. We study the generalization ability of RL agents for different demand uncertainty scenarios and observe superior performance compared to agents trained with fixed demand curves. The proposed methodology has the potential to lead to cost reduction and increased profit for companies dealing with frequent inventory movement between supply and demand nodes.
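The authors' environments and agents are not described at implementation level; the sketch below illustrates the underlying idea on a deliberately tiny single-product problem, learning an ordering policy with tabular Q-learning under stochastic (Poisson) demand. The state/action discretization, cost structure and demand distribution are assumptions, and deep RL would replace the table with a neural network.

```python
import numpy as np

rng = np.random.default_rng(4)
MAX_INV, MAX_ORDER = 20, 10
HOLD_COST, STOCKOUT_COST, PRICE = 1.0, 4.0, 5.0

Q = np.zeros((MAX_INV + 1, MAX_ORDER + 1))   # state = inventory level, action = order quantity
alpha, gamma, eps = 0.1, 0.95, 0.1

inv = 10
for step in range(50_000):
    a = rng.integers(0, MAX_ORDER + 1) if rng.random() < eps else int(Q[inv].argmax())
    demand = rng.poisson(6)                       # stochastic demand (assumed)
    stock = min(inv + a, MAX_INV)
    sold = min(stock, demand)
    next_inv = stock - sold
    reward = PRICE * sold - HOLD_COST * next_inv - STOCKOUT_COST * max(demand - stock, 0)
    Q[inv, a] += alpha * (reward + gamma * Q[next_inv].max() - Q[inv, a])
    inv = next_inv

print("learned order quantity per inventory level:", Q.argmax(axis=1))
```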

Keywords: inventory management, reinforcement learning, supply chain optimization, uncertainty

Procedia PDF Downloads 107
6266 Pale, Soft, Exudative (PSE) Turkey Meat in a Brazilian Commercial Processing Plant

Authors: Danielle C. B. Honorato, Rafael H. Carvalho, Adriana L. Soares, Ana Paula F. R. L. Bracarense, Paulo D. Guarnieri, Massami Shimokomaki, Elza I. Ida

Abstract:

Over the past decade, Brazilian production of turkey meat has increased by more than 50%, indicating that turkey meat holds great potential for the Brazilian economy, contributing to the growth of agribusiness on the international market. However, significant color changes may occur during processing, leading to a pale, soft and exudative (PSE) appearance on the surface of the breast meat due to low water holding capacity (WHC). Changes in the functional properties of PSE meat occur due to denaturation of the myofibrillar proteins caused by rapid postmortem glycolysis, resulting in a rapid pH decline while the carcass is still warm. The aim of this study was to analyze the physical, chemical and histological characteristics of PSE turkey meat obtained from a Brazilian commercial processing plant. Turkey breast samples (n=64) were collected at the processing line and classified as PSE at L* ≥ 53. The pH was also analyzed after the L* measurement. In sequence, PSE meat samples were evaluated for WHC, cooking loss (CL), shear force (SF), myofibril fragmentation index (MFI) and protein denaturation (PD), and examined histologically. The abnormal color samples presented lower pH values, 16% lower fiber diameter, 11% lower SF and 2% lower WHC than those classified as normal. The CL, PD and MFI were, respectively, 9%, 18% and 4% higher in PSE samples. The Pearson correlation between L* values and CL, PD and MFI was positive, while SF and pH values presented a negative correlation. Under light microscopy, the diameter of PSE muscle cells was approximately 16% smaller than in normal samples, and an extracellular enlargement of the endomysium and perimysium sheaths was observed as a consequence of the higher water loss previously indicated by the lower WHC values. Thus, the results showed that PSE turkey breast meat presented significant changes in its physical, chemical and histological characteristics that may impair its functional properties.

Keywords: functional properties, histological evaluation, meat quality, PSE

Procedia PDF Downloads 460
6265 Exploring the Spatial Characteristics of Mortality Map: A Statistical Area Perspective

Authors: Jung-Hong Hong, Jing-Cen Yang, Cai-Yu Ou

Abstract:

The analysis of geographic inequality heavily relies on the use of location-enabled statistical data and quantitative measures to present the spatial patterns of the selected phenomena and analyze their differences. To protect the privacy of individual instance and link to administrative units, point-based datasets are spatially aggregated to area-based statistical datasets, where only the overall status for the selected levels of spatial units is used for decision making. The partition of the spatial units thus has dominant influence on the outcomes of the analyzed results, well known as the Modifiable Areal Unit Problem (MAUP). A new spatial reference framework, the Taiwan Geographical Statistical Classification (TGSC), was recently introduced in Taiwan based on the spatial partition principles of homogeneous consideration of the number of population and households. Comparing to the outcomes of the traditional township units, TGSC provides additional levels of spatial units with finer granularity for presenting spatial phenomena and enables domain experts to select appropriate dissemination level for publishing statistical data. This paper compares the results of respectively using TGSC and township unit on the mortality data and examines the spatial characteristics of their outcomes. For the mortality data between the period of January 1st, 2008 and December 31st, 2010 of the Taitung County, the all-cause age-standardized death rate (ASDR) ranges from 571 to 1757 per 100,000 persons, whereas the 2nd dissemination area (TGSC) shows greater variation, ranged from 0 to 2222 per 100,000. The finer granularity of spatial units of TGSC clearly provides better outcomes for identifying and evaluating the geographic inequality and can be further analyzed with the statistical measures from other perspectives (e.g., population, area, environment.). The management and analysis of the statistical data referring to the TGSC in this research is strongly supported by the use of Geographic Information System (GIS) technology. An integrated workflow that consists of the tasks of the processing of death certificates, the geocoding of street address, the quality assurance of geocoded results, the automatic calculation of statistic measures, the standardized encoding of measures and the geo-visualization of statistical outcomes is developed. This paper also introduces a set of auxiliary measures from a geographic distribution perspective to further examine the hidden spatial characteristics of mortality data and justify the analyzed results. With the common statistical area framework like TGSC, the preliminary results demonstrate promising potential for developing a web-based statistical service that can effectively access domain statistical data and present the analyzed outcomes in meaningful ways to avoid wrong decision making.
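For readers unfamiliar with the measure, the ASDR values quoted above are obtained by direct age standardization: age-specific death rates are weighted by a standard population. A minimal computation with invented age bands and counts is sketched below.

```python
# Direct age standardization (per 100,000); all numbers are illustrative.
age_bands = ["0-14", "15-44", "45-64", "65+"]
deaths    = [3, 12, 45, 160]               # deaths in the study area per band
pop       = [9000, 20000, 12000, 6000]     # resident population per band
std_pop   = [20000, 45000, 25000, 10000]   # standard (reference) population per band

# weight each age-specific rate by the standard population share
asdr = sum(d / p * w for d, p, w in zip(deaths, pop, std_pop)) / sum(std_pop) * 100_000
print(f"age-standardized death rate: {asdr:.0f} per 100,000")
```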

Keywords: mortality map, spatial patterns, statistical area, variation

Procedia PDF Downloads 258
6264 Adaptive Decision Feedback Equalizer Utilizing Fixed-Step Error Signal for Multi-Gbps Serial Links

Authors: Alaa Abdullah Altaee

Abstract:

This paper presents an adaptive decision feedback equalizer (ADFE) for multi-Gbps serial links utilizing a fixed-step error signal extracted from the cross-points of received data symbols. The extracted signal is generated based on the violation of received data symbols with respect to minimum detection requirements at the clock and data recovery (CDR) stage. The iterations of the adaptation process search for the optimum feedback tap coefficients to maximize the data eye opening and minimize the adaptation convergence time. The effectiveness of the proposed architecture is validated using simulation results of a serial link designed in an IBM 130 nm 1.2 V CMOS technology. The data link with variable channel lengths is analyzed using Spectre from Cadence Design Systems with BSIM4 device models.
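The circuit-level adaptation loop cannot be reproduced from the abstract alone; the sketch below illustrates the algorithm family it belongs to, a one-tap decision feedback equalizer whose coefficient is adapted with a fixed step using only the signs of the error and of the fed-back decision (sign-sign LMS). The channel model, step size and tap count are assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 20_000
bits = rng.choice([-1.0, 1.0], size=n)

# crude channel: one post-cursor ISI term plus noise (assumed model)
rx = bits + 0.4 * np.concatenate(([0.0], bits[:-1])) + 0.05 * rng.normal(size=n)

tap, mu = 0.0, 1e-3          # single feedback tap, fixed adaptation step
prev_dec, errors = 0.0, 0
for k in range(n):
    y = rx[k] - tap * prev_dec                   # subtract estimated post-cursor ISI
    dec = 1.0 if y >= 0 else -1.0                # slicer decision
    e = y - dec                                  # error observed at the slicer
    tap += mu * np.sign(e) * np.sign(prev_dec)   # fixed-step sign-sign update
    errors += dec != bits[k]
    prev_dec = dec

print("final tap estimate:", round(tap, 3), "bit errors:", errors)
```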

Keywords: adaptive DFE, CMOS equalizer, error detection, serial links, timing jitter, wire-line communication

Procedia PDF Downloads 120
6263 The Incidental Linguistic Information Processing and Its Relation to General Intellectual Abilities

Authors: Evgeniya V. Gavrilova, Sofya S. Belova

Abstract:

The present study was aimed at clarifying the relationship between general intellectual abilities and efficiency in free recall and rhymed words generation task after incidental exposure to linguistic stimuli. The theoretical frameworks stress that general intellectual abilities are based on intentional mental strategies. In this context, it seems to be crucial to examine the efficiency of incidentally presented information processing in cognitive task and its relation to general intellectual abilities. The sample consisted of 32 Russian students. Participants were exposed to pairs of words. Each pair consisted of two common nouns or two city names. Participants had to decide whether a city name was presented in each pair. Thus words’ semantics was processed intentionally. The city names were considered to be focal stimuli, whereas common nouns were considered to be peripheral stimuli. Along with that each pair of words could be rhymed or not be rhymed, but this phonemic aspect of stimuli’s characteristic (rhymed and non-rhymed words) was processed incidentally. Then participants were asked to produce as many rhymes as they could to new words. The stimuli presented earlier could be used as well. After that, participants had to retrieve all words presented earlier. In the end, verbal and non-verbal abilities were measured with number of special psychometric tests. As for free recall task intentionally processed focal stimuli had an advantage in recall compared to peripheral stimuli. In addition all the rhymed stimuli were recalled more effectively than non-rhymed ones. The inverse effect was found in words generation task where participants tended to use mainly peripheral stimuli compared to focal ones. Furthermore peripheral rhymed stimuli were most popular target category of stimuli that was used in this task. Thus the information that was processed incidentally had a supplemental influence on efficiency of stimuli processing as well in free recall as in word generation task. Different patterns of correlations between intellectual abilities and efficiency in different stimuli processing in both tasks were revealed. Non-verbal reasoning ability correlated positively with free recall of peripheral rhymed stimuli, but it was not related to performance on rhymed words’ generation task. Verbal reasoning ability correlated positively with free recall of focal stimuli. As for rhymed words generation task, verbal intelligence correlated negatively with generation of focal stimuli and correlated positively with generation of all peripheral stimuli. The present findings lead to two key conclusions. First, incidentally processed stimuli had an advantage in free recall and word generation task. Thus incidental information processing appeared to be crucial for subsequent cognitive performance. Secondly, it was demonstrated that incidentally processed stimuli were recalled more frequently by participants with high nonverbal reasoning ability and were more effectively used by participants with high verbal reasoning ability in subsequent cognitive tasks. That implies that general intellectual abilities could benefit from operating by different levels of information processing while cognitive problem solving. This research was supported by the “Grant of President of RF for young PhD scientists” (contract № is 14.Z56.17.2980- MK) and the Grant № 15-36-01348a2 of Russian Foundation for Humanities.

Keywords: focal and peripheral stimuli, general intellectual abilities, incidental information processing

Procedia PDF Downloads 231
6262 Autonomous Ground Vehicle Navigation Based on a Single Camera and Image Processing Methods

Authors: Auday Al-Mayyahi, Phil Birch, William Wang

Abstract:

A vision-based navigation system for an autonomous ground vehicle (AGV) equipped with a single camera in an indoor environment is presented. A proposed navigation algorithm has been utilized to detect obstacles represented by coloured mini-cones placed in different positions inside a corridor. For the recognition of the relative position and orientation of the AGV with respect to the coloured mini-cones, the features of the corridor structure are extracted using a single camera vision system. The relative position, the offset distance and the steering angle of the AGV from the coloured mini-cones are derived from the simple corridor geometry to obtain a mapped environment in real-world coordinates. The corridor is first captured as an image using the single camera. Image processing functions are then performed to identify the existence of the cones within the environment. A bounding box surrounding each cone allows the locations of the cones to be identified in a pixel coordinate system. Thus, by matching the mapped and pixel coordinates using a projection transformation matrix, the real offset distances between the camera and the obstacles are obtained. Real-time experiments in an indoor environment are carried out with a wheeled AGV in order to demonstrate the validity and the effectiveness of the proposed algorithm.
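The pixel-to-world mapping described above can be expressed as a planar projective transformation. The sketch below estimates such a matrix from four known floor points with OpenCV and applies it to a detected cone centre; the calibration correspondences and the detected point are invented for illustration.

```python
import numpy as np
import cv2

# four corridor floor points: pixel coordinates and their known world coordinates (metres)
pixel_pts = np.float32([[120, 460], [520, 455], [400, 300], [240, 305]])
world_pts = np.float32([[-0.5, 1.0], [0.5, 1.0], [0.5, 3.0], [-0.5, 3.0]])

H = cv2.getPerspectiveTransform(pixel_pts, world_pts)   # 3x3 projection matrix

# centre of a cone's bounding box returned by the image-processing stage (assumed)
cone_px = np.float32([[[330, 380]]])
cone_world = cv2.perspectiveTransform(cone_px, H)[0, 0]

offset_x, distance_y = cone_world
steering_angle = np.degrees(np.arctan2(offset_x, distance_y))
print(f"offset {offset_x:.2f} m, distance {distance_y:.2f} m, steer {steering_angle:.1f} deg")
```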

Keywords: autonomous ground vehicle, navigation, obstacle avoidance, vision system, single camera, image processing, ultrasonic sensor

Procedia PDF Downloads 302
6261 Understanding the Behavioral Mechanisms of Pavlovian Biases: Intriguing Insights from Replication and Reversal Paradigms

Authors: Sanjiti Sharma, Carol Seger

Abstract:

Pavlovian biases are crucial to decision-making processes; however, if left unchecked, they can extend to maladaptive behavior such as Substance Use Disorders (SUDs), anxiety, and much more. This study explores the interaction between Pavlovian biases and goal-directed instrumental learning by examining how each adapts to task reversal. It was hypothesized that Pavlovian biases would be slow to adjust after reversal due to their reliance on inflexible learning, whereas the more flexible goal-directed instrumental learning system would adapt more quickly. The experiment utilized a modified Go/No-Go task with two phases: replication of existing findings and a task reversal paradigm. Results demonstrated the flexibility of instrumental learning, with participants adapting after reversal. However, Pavlovian biases led to decreased accuracy post-reversal, with slow adaptation, especially when conflicting with instrumental objectives. These findings emphasize the inflexible nature of Pavlovian biases and their role in decision-making and cognitive rigidity.

Keywords: pavlovian bias, goal-directed learning, cognitive flexibility, learning bias

Procedia PDF Downloads 26
6260 Biogas Production from Pistachio (Pistacia vera L.) Processing Waste

Authors: İ. Çelik, Goksel Demirer

Abstract:

Turkey is the third largest producer of pistachio (Pistacia vera L.) after Iran and the United States. Harvested pistachio nuts are covered with an organic hull which is removed by the de-hulling process. Most of the pistachio by-products produced during de-hulling are considered agricultural waste and are often mixed with soil; to a lesser extent they are used as feedstuff by local livestock farmers, and a small portion is used as herbal medicine. Due to their high organic and phenolic content as well as their high solids concentration, pistachio processing wastes create significant waste management problems unless they are properly managed. However, there is no well-established waste management method for the waste generated during the processing of pistachios. This study investigated the anaerobic treatability and biogas generation potential of pistachio hull waste. The effect of pre-treatment on biogas generation potential was also investigated. For this purpose, Biochemical Methane Potential (BMP) assays were conducted for two Chemical Oxygen Demand (COD) concentrations of 22 and 33 g tCOD l-1 in the absence and presence of chemical and thermal pre-treatment methods. The results revealed that anaerobic digestion of the pistachio de-hulling wastes and subsequent biogas production as a renewable energy source are possible. The observed percent COD removal and methane yield values of the pre-treated pistachio de-hulling waste samples were significantly higher than those of the raw pistachio de-hulling waste. The highest methane yield was observed as 213.4 ml CH4/g COD.

Keywords: pistachio de-hulling waste, biogas, renewable energy, pre-treatment

Procedia PDF Downloads 215
6259 Social Media, Networks and Related Technology: Business and Governance Perspectives

Authors: M. A. T. AlSudairi, T. G. K. Vasista

Abstract:

The concept of social media is moving to the top of the agenda for many business and public sector executives today. Decision makers, as well as consultants, try to identify ways in which firms and enterprises can make profitable use of social media and network-related applications such as Wikipedia, Facebook, YouTube, Google+, and Twitter. While it is fun and useful to participate in these media and networks for achieving communication effectively and efficiently, semantic and sentiment analysis and interpretation become a crucial issue. So, the objective of this paper is to provide a literature review on social media, networks and related technology with respect to semantics and sentiment or opinion analysis, covering business and governance perspectives. In this regard, a case study on the use and adoption of social media in Saudi Arabia is discussed. It is concluded that semantic web technology plays a significant role in analyzing social networks and social media content for extracting interpretational knowledge towards strategic decision support.

Keywords: CRASP methodology, formative assessment, literature review, semantic web services, social media, social networks

Procedia PDF Downloads 451
6258 Decision Support System for Hospital Selection in Emergency Medical Services: A Discrete Event Simulation Approach

Authors: D. Tedesco, G. Feletti, P. Trucco

Abstract:

The present study aims to develop a Decision Support System (DSS) to support the operational decision of the Emergency Medical Service (EMS) regarding the assignment of medical emergency requests to Emergency Departments (ED). In the literature, this problem is also known as “hospital selection” and concerns the definition of policies for the selection of the ED to which patients who require further treatment are transported by ambulance. The employed research methodology consists of the first phase of revision of the technical-scientific literature concerning DSSs to support the EMS management and, in particular, the hospital selection decision. From the literature analysis, it emerged that current studies are mainly focused on the EMS phases related to the ambulance service and consider a process that ends when the ambulance is available after completing a request. Therefore, all the ED-related issues are excluded and considered as part of a separate process. Indeed, the most studied hospital selection policy turned out to be proximity, thus allowing to minimize the transport time and release the ambulance in the shortest possible time. The purpose of the present study consists in developing an optimization model for assigning medical emergency requests to the EDs, considering information relating to the subsequent phases of the process, such as the case-mix, the expected service throughput times, and the operational capacity of different EDs in hospitals. To this end, a Discrete Event Simulation (DES) model was created to evaluate different hospital selection policies. Therefore, the next steps of the research consisted of the development of a general simulation architecture, its implementation in the AnyLogic software and its validation on a realistic dataset. The hospital selection policy that produced the best results was the minimization of the Time To Provider (TTP), considered as the time from the beginning of the ambulance journey to the ED at the beginning of the clinical evaluation by the doctor. Finally, two approaches were further compared: a static approach, which is based on a retrospective estimate of the TTP, and a dynamic approach, which is based on a predictive estimate of the TTP determined with a constantly updated Winters model. Findings reveal that considering the minimization of TTP as a hospital selection policy raises several benefits. It allows to significantly reduce service throughput times in the ED with a minimum increase in travel time. Furthermore, an immediate view of the saturation state of the ED is produced and the case-mix present in the ED structures (i.e., the different triage codes) is considered, as different severity codes correspond to different service throughput times. Besides, the use of a predictive approach is certainly more reliable in terms of TTP estimation than a retrospective approach but entails a more difficult application. These considerations can support decision-makers in introducing different hospital selection policies to enhance EMSs performance.
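A stripped-down sketch of such a discrete event simulation is given below using SimPy: incoming requests choose the ED that minimizes an estimated Time To Provider (travel time plus expected waiting), rather than plain proximity. Arrival rates, travel times, capacities and service times are invented parameters, and the Winters forecasting component is omitted.

```python
import random
import simpy

random.seed(0)
EDS = {"A": {"travel": 8, "capacity": 2, "service": 30},
       "B": {"travel": 15, "capacity": 4, "service": 25}}   # minutes; assumed values

def estimated_ttp(name, res):
    """Travel time plus a crude queue-based estimate of waiting time."""
    ed = EDS[name]
    waiting = len(res.queue) + len(res.users)
    return ed["travel"] + waiting * ed["service"] / ed["capacity"]

def patient(env, resources, log):
    choice = min(EDS, key=lambda n: estimated_ttp(n, resources[n]))  # TTP-minimizing policy
    yield env.timeout(EDS[choice]["travel"])                  # ambulance journey
    arrival = env.now
    with resources[choice].request() as req:
        yield req                                             # wait for a provider
        log.append(EDS[choice]["travel"] + env.now - arrival)  # realized TTP
        yield env.timeout(random.expovariate(1 / EDS[choice]["service"]))

def generator(env, resources, log):
    while True:
        yield env.timeout(random.expovariate(1 / 10))         # a request every ~10 min
        env.process(patient(env, resources, log))

env = simpy.Environment()
resources = {n: simpy.Resource(env, capacity=EDS[n]["capacity"]) for n in EDS}
log = []
env.process(generator(env, resources, log))
env.run(until=24 * 60)
print("mean TTP (min):", round(sum(log) / len(log), 1))
```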

Keywords: discrete event simulation, emergency medical services, forecast model, hospital selection

Procedia PDF Downloads 90
6257 Reinforcement-Learning Based Handover Optimization for Cellular Unmanned Aerial Vehicles Connectivity

Authors: Mahmoud Almasri, Xavier Marjou, Fanny Parzysz

Abstract:

The demand for services provided by Unmanned Aerial Vehicles (UAVs) is increasing pervasively across several sectors, including public safety, economic, and delivery services. As the number of applications using UAVs grows rapidly, more powerful, quality-of-service-aware, and power-efficient computing units are necessary. Recently, cellular technology has drawn more attention as a means of connectivity that can ensure reliable and flexible communication services for UAVs. In cellular technology, flying at high speed and altitude is subject to several key challenges, such as frequent handovers (HOs), high interference levels, connectivity coverage holes, etc. Additional HOs may lead to "ping-pong" between the UAVs and the serving cells, resulting in a decrease in the quality of service and in energy efficiency. In order to optimize the number of HOs, we develop in this paper a Q-learning-based algorithm. While existing works focus on adjusting the number of HOs in a static network topology, we take into account the impact of cell deployment for three different simulation scenarios (rural, semi-rural and urban areas). We also consider the impact of the decision distance, at which the drone has the choice to make a switching decision affecting the number of HOs. Our results show that a Q-learning-based algorithm allows the average number of HOs to be significantly reduced compared to a baseline case where the drone always selects the cell with the highest received signal. Moreover, we also identify which hyper-parameters have the largest impact on the number of HOs in the three tested environments, i.e., rural, semi-rural, or urban.
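The paper's exact state and reward design is not given in the abstract; the toy sketch below captures the core trade-off of a Q-learning handover policy: each handover carries a penalty, so the agent learns to switch cells only when the signal gain justifies it. The signal model, reward shaping and hyper-parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
N_CELLS, HO_PENALTY = 4, 2.0
alpha, gamma, eps = 0.2, 0.9, 0.1

# state = currently serving cell, action = cell to serve next
Q = np.zeros((N_CELLS, N_CELLS))

def rssi(t):
    """Toy received-signal levels along the drone trajectory (assumed model)."""
    base = np.array([np.sin(t / 40 + c) for c in range(N_CELLS)])
    return 10 * base + rng.normal(0, 1, N_CELLS)

serving, handovers = 0, 0
for t in range(30_000):
    a = rng.integers(N_CELLS) if rng.random() < eps else int(Q[serving].argmax())
    sig = rssi(t)
    reward = sig[a] - (HO_PENALTY if a != serving else 0.0)   # penalize ping-pong HOs
    Q[serving, a] += alpha * (reward + gamma * Q[a].max() - Q[serving, a])
    handovers += a != serving
    serving = a

print("handovers during training:", handovers)
```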

Keywords: drones connectivity, reinforcement learning, handovers optimization, decision distance

Procedia PDF Downloads 108
6256 Selection of Solid Waste Landfill Site Using Geographical Information System (GIS)

Authors: Fatih Iscan, Ceren Yagci

Abstract:

Rapid population growth, urbanization and industrialization are known as the most important drivers of environmental problems. The elimination and management of solid waste is also among the most important environmental problems. One of the main problems in solid waste management is the selection of the best site for the disposal of solid waste. Lately, Geographical Information Systems (GIS) have been used to ease the selection of landfill areas. GIS can incorporate the necessary economic, environmental and political constraints and plays an important role in landfill site selection as a decision support tool. In this study, map layers are prepared so as to minimize the effect of environmental, social and cultural factors and maximize the effect of engineering and economic factors in the site selection of landfill areas, and the use of GIS as a decision support mechanism in solid waste landfill site selection is presented through a practical application in the Güzelyurt district of Aksaray, Turkey.

Keywords: GIS, landfill, solid waste, spatial analysis

Procedia PDF Downloads 359