Search results for: multi variable decision making
10979 Optimization of Dez Dam Reservoir Operation Using Genetic Algorithm
Authors: Alireza Nikbakht Shahbazi, Emadeddin Shirali
Abstract:
Since water resources optimization problems are complicated by the variety of decision-making criteria and objective functions, they are sometimes impossible to solve with conventional optimization methods, or doing so is time- and money-consuming. The use of modern tools and methods is therefore inevitable in solving such problems. An accurate and well-founded utilization policy has to be determined in order to use natural resources such as water reservoirs optimally. Water reservoir programming studies aim to determine the final cultivated land area based on predefined agricultural models and water requirements; the dam utilization rule curve is also provided in such studies. The basic information applied in water reservoir programming studies generally includes meteorological, hydrological, agricultural and water-reservoir-related data, and the geometric characteristics of the reservoir. The Dez dam water resource system was simulated using this basic information in order to determine the capability of its reservoir to meet the objectives of the proposed plan. A genetic algorithm, as a metaheuristic method, was applied to derive utilization rule curves (intersecting the reservoir volume), and MATLAB was used to solve the resulting model. Rule curves were first obtained through the genetic algorithm. Then the significance of using rule curves, and of the decrease in the number of decision variables in the system, was determined by simulating the system and comparing the results with the optimization results (Standard Operating Procedure). One of the most essential issues in optimizing a complicated water resource system is the growing number of variables: a lot of time is required to find an optimal answer and, in some cases, no desirable result is obtained. In this research, intersecting the reservoir volume has been applied as a modern technique to reduce the number of variables.
Water reservoir programming studies have been performed based on basic information, general hypotheses and standards, applying a monthly simulation technique over a 30-year statistical period. Results indicated that applying the rule curve prevents extreme shortages and decreases monthly shortages.
Keywords: optimization, rule curve, genetic algorithm method, Dez dam reservoir
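The rule-curve optimization described above can be sketched with a toy genetic algorithm; the monthly inflows, demands, reservoir capacity and GA settings below are illustrative assumptions, not the Dez reservoir data:

```python
import random

random.seed(42)

# Hypothetical monthly inflows and demands (hm^3) and reservoir capacity;
# the real study used 30 years of hydrological data for the Dez reservoir.
INFLOW = [90, 80, 70, 60, 50, 40, 35, 30, 40, 55, 70, 85]
DEMAND = [60, 60, 65, 70, 75, 80, 80, 75, 65, 60, 55, 55]
CAPACITY = 300.0

def simulate_cost(rule_curve):
    """Simulate one year of monthly operation: water is only released
    down to the rule-curve level; squared shortages penalize extreme
    shortages more heavily than spread-out ones."""
    storage, cost = CAPACITY / 2, 0.0
    for m in range(12):
        storage = min(storage + INFLOW[m], CAPACITY)
        release = min(DEMAND[m], max(storage - rule_curve[m], 0.0))
        storage -= release
        cost += (DEMAND[m] - release) ** 2
    return cost

def genetic_algorithm(pop_size=40, generations=60):
    pop = [[random.uniform(0, CAPACITY) for _ in range(12)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=simulate_cost)
        parents = pop[:pop_size // 2]           # elitist truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, 12)       # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(12)            # Gaussian point mutation
            child[i] = min(max(child[i] + random.gauss(0, 10), 0.0), CAPACITY)
            children.append(child)
        pop = parents + children
    return min(pop, key=simulate_cost)

best_curve = genetic_algorithm()
```

Keeping the sorted parents each generation makes the search elitist, so the best cost never worsens from one generation to the next.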
Procedia PDF Downloads 269
10978 A Comparative Study of Cognitive Factors Affecting Social Distancing among Vaccinated and Unvaccinated Filipinos
Authors: Emmanuel Carlo Belara, Albert John Dela Merced, Mark Anthony Dominguez, Diomari Erasga, Jerome Ferrer, Bernard Ombrog
Abstract:
Social distancing errors are commonly prevalent among both vaccinated and unvaccinated people in the Filipino community. This study aims to identify these factors and relate how they affect our daily lives. The observed factors include memory, attention, anxiety, decision-making, and stress. Upon applying ergonomic tools and statistical treatments such as the t-test and multiple linear regression, stress and attention turned out to have the greatest impact on social distancing errors.
Keywords: vaccinated, unvaccinated, social distancing, Filipinos
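One of the statistical treatments the abstract names, the t-test, can be sketched as a Welch's t-test on hypothetical error scores; the data below are invented for illustration, not the study's measurements:

```python
import math
from statistics import mean, variance

# Invented social-distancing error scores for the two groups the study
# compared (vaccinated vs. unvaccinated respondents).
vaccinated = [2.1, 1.8, 2.4, 2.0, 1.9, 2.2]
unvaccinated = [2.9, 3.1, 2.7, 3.3, 2.8, 3.0]

def welch_t(a, b):
    """Welch's t statistic and degrees of freedom (no equal-variance
    assumption), using the Welch-Satterthwaite approximation for df."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    t = (mean(a) - mean(b)) / math.sqrt(va + vb)
    df = (va + vb) ** 2 / (va ** 2 / (len(a) - 1) + vb ** 2 / (len(b) - 1))
    return t, df

t_stat, dof = welch_t(vaccinated, unvaccinated)
```

A large-magnitude t statistic here would indicate that the two groups' mean error scores differ beyond what sampling noise explains.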
Procedia PDF Downloads 206
10977 Implementation of a Treatment Escalation Plan during the COVID-19 Outbreak in Aneurin Bevan University Health Board
Authors: Peter Collett, Mike Pynn, Haseeb Ur Rahman
Abstract:
For the last few years across the UK, there has been a push towards implementing treatment escalation plans (TEP) for every patient admitted to hospital. This is a paper form which is completed by a junior doctor and then countersigned by the consultant responsible for the patient's care. It is designed to address what level of care is appropriate for the patient in question at the point of entry to hospital, helping decide whether the patient would benefit from ward-based, high-dependency or intensive care. TEP forms are completed to ensure the patient's best interests are maintained, and they aim to facilitate difficult decisions which may be required at a later date. For example, when a frail patient with significant co-morbidities, unlikely to survive a pathology requiring an intensive care admission, is admitted to hospital, the decision can be made early that the patient would not benefit from an ICU admission. This decision can be reversed depending on the clinical course of the patient's admission. It promotes discussions with the patient regarding their wishes to receive certain levels of healthcare. This poster describes the steps taken in the Aneurin Bevan University Health Board (ABUHB) when implementing the TEP form. The team implementing the TEP form campaigned for its use to the board of directors. The directors were eager to hear of the experiences of other health boards which had implemented the TEP form, so the team presented the data produced in a number of health boards and demonstrated the proposed form. Concern was raised regarding the legalities of the form and that it could upset patients and relatives if it was not explained properly. This delayed the effectuation of the TEP form, and further research and discussion were required. When COVID-19 reached the UK, the National Institute for Health and Clinical Excellence issued guidance stating that every patient admitted to hospital should be issued a TEP form.
The TEP form was accelerated through the vetting process and was approved with immediate effect. The TEP form in ABUHB has now been in circulation for a month. An audit investigating its uptake and a survey gathering opinions have been conducted.
Keywords: acute medicine, clinical governance, intensive care, patient-centered decision making
Procedia PDF Downloads 180
10976 Women Participation in Agriculture and Rural Development Activities in Kwacciyar-Lalle and Mogonho Communities of Sokoto State, Nigeria
Authors: B. Z. Abubakar, J. P. Voh, B. F. Umar, S. Khalid, A. A. Barau, J. Aigbe
Abstract:
The study was conducted to identify and assess the various community development programmes designed and executed by the Sokoto Agricultural and Community Development Project (SACDP), with the assistance of the International Fund for Agricultural Development (IFAD), among women beneficiaries in the Kwacciyar-lalle and Mogonho communities of Sokoto state. A simple random sampling technique was employed to select 20 project beneficiaries in each of the selected communities, giving a total of 40 beneficiaries. A structured questionnaire, descriptive statistics such as frequencies and percentages, and participatory methodologies such as focus group discussion and pair-wise ranking were used to analyze the data. Results showed that the majority of the beneficiaries (75%) were married and undertook animal rearing as their major occupation. Results further showed that 85% of the beneficiaries were involved in decision making, which enhanced their participation. Pair-wise ranking showed the dug well as the most preferred activity, followed by the construction of an Islamic school in Kwacciyar-lalle, while well construction, followed by the provision of improved animal species, was most preferred in Mogonho. Recommendations made in light of achieving people's participation include the provision of more infrastructural facilities and working materials.
Keywords: community development, focus group, pair-wise ranking, infrastructure
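The pair-wise ranking used in the participatory analysis can be sketched as a simple win-count over preference votes; the activities and the recorded preferences below are illustrative stand-ins for the focus-group results:

```python
from itertools import combinations

# Illustrative activities and focus-group preferences: for each pair,
# which activity the group preferred. The study ranked options such as
# dug wells, Islamic school construction, and improved animal species.
activities = ["dug well", "Islamic school", "improved animals", "road repair"]
preferences = {
    ("dug well", "Islamic school"): "dug well",
    ("dug well", "improved animals"): "dug well",
    ("dug well", "road repair"): "dug well",
    ("Islamic school", "improved animals"): "Islamic school",
    ("Islamic school", "road repair"): "Islamic school",
    ("improved animals", "road repair"): "improved animals",
}

def pairwise_rank(activities, preferences):
    """Rank activities by the number of pairwise contests each one wins."""
    wins = {a: 0 for a in activities}
    for pair in combinations(activities, 2):
        wins[preferences[pair]] += 1
    return sorted(activities, key=lambda a: wins[a], reverse=True)

ranking = pairwise_rank(activities, preferences)
```

With consistent preferences the win counts form a strict order; in practice, circular preferences would produce ties that the facilitator resolves by discussion.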
Procedia PDF Downloads 376
10975 Inclusive Cities Decision Matrix Based on a Multidimensional Approach for Sustainable Smart Cities
Authors: Madhurima S. Waghmare, Shaleen Singhal
Abstract:
The concepts of smartness, inclusion, and sustainability are multidisciplinary and fuzzy, rooted in economic and social development theories and policies which are reflected in the spatial development of cities. It is a challenge to convert these concepts from aspirations into transforming actions, and there is a dearth of assessment and planning tools to support city planners and administrators in developing smart, inclusive, and sustainable cities. To address this gap, this study develops an inclusive cities decision matrix based on an exploratory approach and using mixed methods. The matrix is soundly based on a review of multidisciplinary urban sector literature and is refined and finalized based on inputs from experts and insights from case studies. The application of the decision matrix to the case study cities in India suggests that contemporary planning tools for cities need to be multidisciplinary and flexible to respond to the unique needs of diverse contexts. The paper suggests that a multidimensional and inclusive approach to city planning can play an important role in building sustainable smart cities.
Keywords: inclusive-cities decision matrix, smart cities in India, city planning tools, sustainable cities
Procedia PDF Downloads 158
10974 Designing State Feedback Multi-Target Controllers by the Use of Particle Swarm Optimization Algorithm
Authors: Seyedmahdi Mousavihashemi
Abstract:
Optimization ('improving') is one of the most important subjects of research interest and has given rise to various algorithms. In many geometrical problems, we are faced with objective functions that should be optimized, and in group approaches the cooperation of all the functions leads to convergence. In this study, the particle swarm optimization (PSO) algorithm is used; applying it improves the given performance norms. The results reveal that using a swarm algorithm with reinforced particles in state feedback design improves the given performance norm, and that in the optimized design of multi-target state feedback control, the network maintains its bearing structure. The results also show that PSO is usable for the optimization of state feedback controllers.
Keywords: multi-objective, enhanced, feedback, optimization, algorithm, particle, design
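The particle swarm optimization at the heart of this abstract can be sketched as follows; the quadratic cost stands in for a closed-loop performance norm evaluated at candidate feedback gains, and the swarm parameters are assumed values:

```python
import random

random.seed(0)

# Stand-in cost: in the paper this would be a performance norm of the
# closed-loop system for a candidate gain vector; here a simple bowl
# with minimum at (1, 1, 1).
def cost(x):
    return sum((xi - 1.0) ** 2 for xi in x)

def pso(dim=3, swarm=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Canonical PSO: each particle is pulled toward its personal best
    (pbest) and the global best (gbest) with inertia weight w."""
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=cost)[:]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if cost(pos[i]) < cost(pbest[i]):
                pbest[i] = pos[i][:]
                if cost(pbest[i]) < cost(gbest):
                    gbest = pbest[i][:]
    return gbest

gains = pso()
```

For a real controller design, `cost` would simulate or analytically evaluate the closed-loop norm, which is typically far more expensive than this toy bowl.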
Procedia PDF Downloads 504
10973 DeepLig: A de-novo Computational Drug Design Approach to Generate Multi-Targeted Drugs
Authors: Anika Chebrolu
Abstract:
Mono-targeted drugs can be of limited efficacy against complex diseases. Recently, multi-target drug design has emerged as a promising tool to fight these challenging diseases. However, the scope of current computational approaches for multi-target drug design is limited. DeepLig presents a de-novo drug discovery platform that uses reinforcement learning to generate and optimize novel, potent, and multi-targeted drug candidates against protein targets. DeepLig's model consists of two networks in interplay: a generative network and a predictive network. The generative network, a Stack-Augmented Recurrent Neural Network, utilizes a stack memory unit to remember and recognize molecular patterns when generating novel ligands from scratch. The generative network passes each newly created ligand to the predictive network, which then uses multiple Graph Attention Networks simultaneously to forecast the average binding affinity of the generated ligand towards multiple target proteins. With each iteration, given feedback from the predictive network, the generative network learns to optimize itself to create molecules with a higher average binding affinity towards multiple proteins. DeepLig was evaluated on its ability to generate multi-target ligands against two distinct proteins, against three distinct proteins, and against two distinct binding pockets on the same protein. In each test case, DeepLig was able to create a library of valid, synthetically accessible, and novel molecules with optimal and equipotent binding energies. We propose that DeepLig provides an effective approach to designing multi-targeted drug therapies that can potentially show higher success rates during in-vitro trials.
Keywords: drug design, multitargeticity, de-novo, reinforcement learning
Procedia PDF Downloads 103
10972 Collaborative Energy Optimization for Multi-Microgrid Distribution System Based on Two-Stage Game Approach
Authors: Hanmei Peng, Yiqun Wang, Mao Tan, Zhuocen Dai, Yongxin Su
Abstract:
Efficient energy management in multi-microgrid distribution systems holds significant importance for enhancing the economic benefits of regional power grids. To better balance conflicts among various stakeholders, a two-stage game-based collaborative optimization approach is proposed in this paper, effectively addressing the realistic scenario involving both competition and collaboration among stakeholders. The first stage, aimed at maximizing individual benefits, involves constructing a non-cooperative tariff game model for the distribution network and surplus microgrids. In the second stage, considering power flow and physical line capacity constraints, we establish a cooperative P2P game model for the multi-microgrid distribution system, and the optimization employs the Lagrange method of multipliers to handle the complex constraints. Simulation results demonstrate that the proposed approach can effectively improve the system economics while harmonizing individual and collective rationality.
Keywords: cooperative game, collaborative optimization, multi-microgrid distribution system, non-cooperative game
Procedia PDF Downloads 75
10971 Detailed Analysis of Multi-Mode Optical Fiber Infrastructures for Data Centers
Authors: Matej Komanec, Jan Bohata, Stanislav Zvanovec, Tomas Nemecek, Jan Broucek, Josef Beran
Abstract:
With the exponential growth of social networks, video streaming and increasing demands on data rates, the number of newly built data centers rises proportionately. The data centers, however, have to adjust to the rapidly increased amount of data that has to be processed. For this purpose, multi-mode (MM) fiber based infrastructures are often employed. This stems from the fact that connections in data centers are typically realized over short distances, where the application of MM fibers and components considerably reduces costs. On the other hand, the usage of MM components brings specific requirements for installation and service conditions. Moreover, it has to be taken into account that MM fiber components have higher production tolerances for parameters like core and cladding diameters, eccentricity, etc. Due to the high demands on the reliability of data center components, the determination of a properly excited optical field inside the MM fiber core belongs to the key parameters while designing such an MM optical system architecture. An appropriately excited mode field of the MM fiber provides an optimal power budget in connections, leads to a decrease of insertion losses (IL) and achieves effective modal bandwidth (EMB). The main parameter, in this case, is the encircled flux (EF), which should be properly defined for variable optical sources and the consequent different mode-field distributions. In this paper, we present a detailed investigation and measurements of the mode-field distribution for short MM links intended in particular for data centers, with an emphasis on reliability and safety. These measurements are essential for large MM network design. Various scenarios, containing different fibers and connectors, were tested in terms of IL and mode-field distribution to reveal potential challenges.
Furthermore, we focused on the estimation of particular defects and errors which can realistically occur, such as eccentricity, connector shifting or dust; these were simulated and measured, and their influence on EF statistics and on the functionality of the data center infrastructure was evaluated. The experimental tests were performed at the two wavelengths commonly used in MM networks, 850 nm and 1310 nm, to verify the EF statistics. Finally, we provide recommendations for data center systems and networks using OM3 and OM4 MM fiber connections.
Keywords: optical fiber, multi-mode, data centers, encircled flux
Procedia PDF Downloads 380
10970 Renewable Energy and Environment: Design of a Decision Aided Tool for Sustainable Development
Authors: Mustapha Ouardouz, Mina Amharref, Abdessamed Bernoussi
Abstract:
For countries with limited energy resources, the future of energy goes through renewables (solar, wind, etc.). Renewable energies constitute a major component of the energy strategy to cover a substantial part of growing needs and to contribute to environmental protection by replacing fossil fuels. Indeed, sustainable development involves the promotion of renewable energy and the preservation of the environment through the use of clean energy technologies, limiting emissions of greenhouse gases and reducing the pressure exerted on forest cover. Studies of the impact of energy use on the environment and of farm-related risks are therefore necessary, and a global approach integrating all the various sectors involved in such a project seems to be the best one. In this paper, we present an approach based on multi-criteria analysis and the realization of a pilot to achieve the development of an innovative geo-intelligent environmental platform. An implementation of this platform will collect, process, analyze and manage environmental data in connection with the nature of the energy used in the studied region. As an application, we consider a region in the north of Morocco characterized by intense agricultural and industrial activities and using diverse renewable energies.
The strategic goals of this platform are: decision support for better governance; improving the responsiveness of the public and private companies connected by providing them in real time with reliable data; modeling and simulation of energy scenarios; the identification of socio-technical solutions for introducing renewable energies and estimating the technically implantable potential through socio-economic analyses; the assessment of infrastructure for the region and its communities; and the preservation and enhancement of natural resources for better citizen governance through the democratization of access to environmental information. The tool will also perform simulations integrating the environmental impacts of natural disasters, particularly those linked to climate change; indeed, extreme events such as floods, droughts and storms will no longer be rare and should therefore be integrated into such projects.
Keywords: renewable energies, decision aided tool, environment, simulation
Procedia PDF Downloads 463
10969 Understanding Consumer Behavior Towards Business Ethics: Is it Really Important for Consumers
Authors: Ömer Akkaya, Muammer Zerenler
Abstract:
Ethics is important for all the shareholders and stakeholders a firm has in its environment. Whether a firm behaves ethically or unethically has a significant influence on consumers' decision-making and buying processes. This research tries to explain business ethics from the consumers' perspective. The survey includes several questions to explain how consumers react if they know a firm has behaved unethically or ethically. What are consumers' expectations regarding the ethical behavior of firms? Do consumers reward or punish firms on the basis of ethics? Is a firm's ethical behavior really important to consumers?
Keywords: business ethics, consumer behavior, ethics, social responsibility
Procedia PDF Downloads 365
10968 The Amount of Information Processing and Balance Performance in Children: The Dual-Task Paradigm
Authors: Chin-Chih Chiou, Tai-Yuan Su, Ti-Yu Chen, Wen-Yu Chiu, Chungyu Chen
Abstract:
The purpose of this study was to investigate the effect on reaction time (RT) and balance performance as the number of stimulus-response choices increases, comparing amounts of information processing of 0 bits and 1 bit based on Hick's law, using a dual-task design. Eighteen children (age: 9.38 ± 0.27 years old) were recruited as participants and asked to perform RT and balance tasks separately and simultaneously under the following five conditions: simple RT (0-bit decision), choice RT (1-bit decision), single balance control, balance control with simple RT, and balance control with choice RT. The Biodex 950-300 balance system and a You-Shang response timer were used to record and analyze postural stability and information processing speed (RT), respectively. Repeated-measures one-way ANOVA with HSD post-hoc tests and 2 (balance) × 2 (amount of information processing) repeated-measures two-way ANOVA were used to test the parameters of balance performance and RT (α = .05). The results showed that the overall stability index in the 1-bit decision was lower than in the 0-bit decision, and the mean deflection in the 1-bit decision was lower than in single balance performance. Simple RTs were faster than choice RTs in both the single-task and dual-task conditions. This indicates that the chronometric approach to RT can be used to infer the attention requirement of the secondary task. However, this study did not find that children's balance performance was interfered with by the increase in the amount of information processing.
Keywords: capacity theory, reaction time, Hick's law, balance
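Hick's law, which underlies the 0-bit and 1-bit conditions above, can be illustrated directly; the intercept and slope below are assumed constants for illustration, and this sketch uses the log2(N) form (some formulations use log2(N+1)):

```python
import math

def hick_information(n_choices):
    """Bits to be processed for n equally likely stimulus-response
    alternatives: 1 choice -> 0 bits (simple RT), 2 choices -> 1 bit."""
    return math.log2(n_choices)

def predicted_rt(n_choices, a=0.25, b=0.15):
    """Hypothetical linear chronometric model RT = a + b * bits, with
    intercept a (s) and slope b (s/bit) chosen only for illustration."""
    return a + b * hick_information(n_choices)

simple_rt = predicted_rt(1)   # 0-bit decision (simple RT)
choice_rt = predicted_rt(2)   # 1-bit decision (two-choice RT)
```

The predicted RT difference between the two conditions is exactly the slope b times one bit, which is what makes the simple-vs-choice comparison a clean probe of processing load.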
Procedia PDF Downloads 455
10967 Single Valued Neutrosophic Hesitant Fuzzy Rough Set and Its Application
Authors: K. M. Alsager, N. O. Alshehri
Abstract:
In this paper, we propose the notion of the single valued neutrosophic hesitant fuzzy rough set, obtained by combining the single valued neutrosophic hesitant fuzzy set and the rough set. This combination is a powerful tool for dealing with uncertainty, granularity and incompleteness of knowledge in information systems. We present both the definition and some basic properties of the proposed model. Finally, we give a general approach which is applied to a decision-making problem in disease diagnosis, and demonstrate the effectiveness of the approach through a numerical example.
Keywords: single valued neutrosophic fuzzy set, single valued neutrosophic fuzzy hesitant set, rough set, single valued neutrosophic hesitant fuzzy rough set
Procedia PDF Downloads 279
10966 The Design Optimization for Sound Absorption Material of Multi-Layer Structure
Authors: Un-Hwan Park, Jun-Hyeok Heo, In-Sung Lee, Tae-Hyeon Oh, Dae-Kyu Park
Abstract:
Sound absorbing material is used as an automotive interior material, and its sound absorption coefficient should be predicted in order to design it. However, predicting the sound absorption coefficient is difficult because the material is composed of several layers, so its targets are instead achieved through many rounds of experimental tuning, which costs a lot of time and money. In this paper, we propose a process to estimate the sound absorption coefficient of a multi-layer structure. To estimate the coefficient, the physical properties of each material are used. These properties are predicted by the Foam-X software from sound absorption coefficient data measured with an impedance tube; since there are many physical properties and the measurement equipment is expensive, the values predicted by the software are used instead. Through the measurement of the sound absorption coefficient of each material, its physical properties are calculated inversely, and the properties of each material are then used to calculate the sound absorption coefficient of the multi-layer material. Since the absorption coefficient of the multi-layer structure can be calculated, an optimized design is possible through simulation. We then compare and analyze the calculated sound absorption coefficient with data measured in a scaled reverberation chamber and with impedance tubes for a prototype. If this method is used when developing automotive interior materials with multi-layer structures, the development effort can be reduced because the design can be optimized by simulation, saving cost and time.
Keywords: sound absorption material, sound impedance tube, sound absorption coefficient, optimization design
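A common way to compute the normal-incidence absorption coefficient of a rigid-backed multi-layer stack is the transfer matrix method, sketched below; the porous layer's complex impedance and wavenumber are assumed values for illustration, not properties identified by Foam-X:

```python
import cmath
import math

RHO0, C0 = 1.21, 343.0          # air density (kg/m^3) and sound speed (m/s)
Z0 = RHO0 * C0                  # characteristic impedance of air

def layer_matrix(z_c, k_c, d):
    """Normal-incidence transfer matrix of one homogeneous layer with
    characteristic impedance z_c, wavenumber k_c and thickness d."""
    kd = k_c * d
    return [[cmath.cos(kd), 1j * z_c * cmath.sin(kd)],
            [1j * cmath.sin(kd) / z_c, cmath.cos(kd)]]

def mat_mul(a, b):
    return [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
            [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

def absorption(layers, freq):
    """Absorption coefficient of a rigid-backed stack; each layer is
    (z_c, wavenumber_function, thickness), listed front to back."""
    t = [[1, 0], [0, 1]]
    for z_c, k_fn, d in layers:
        t = mat_mul(t, layer_matrix(z_c, k_fn(freq), d))
    z_s = t[0][0] / t[1][0]          # surface impedance (rigid backing: v = 0)
    r = (z_s - Z0) / (z_s + Z0)      # pressure reflection coefficient
    return 1.0 - abs(r) ** 2

# Hypothetical two-layer absorber: a lossy porous layer (assumed complex
# impedance and wavenumber) over a 20 mm air gap.
def k_porous(f):
    return 2 * math.pi * f / C0 * (1.2 - 0.2j)

def k_air(f):
    return 2 * math.pi * f / C0

stack = [(Z0 * (1.5 - 0.3j), k_porous, 0.02), (Z0, k_air, 0.02)]
alpha_1k = absorption(stack, 1000.0)
```

A lossless air layer alone absorbs essentially nothing (|r| = 1), while the lossy porous layer gives a strictly positive coefficient; an optimizer can then search over layer thicknesses and properties.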
Procedia PDF Downloads 295
10965 Variable Refrigerant Flow (VRF) Zonal Load Prediction Using a Transfer Learning-Based Framework
Authors: Junyu Chen, Peng Xu
Abstract:
In the context of global efforts to enhance building energy efficiency, accurate thermal load forecasting is crucial for both device sizing and predictive control. Variable Refrigerant Flow (VRF) systems are widely used in buildings around the world, yet VRF zonal load prediction has received limited attention. Because building-level prediction methods gloss over differences between VRF zones, zone-level load forecasting can significantly enhance accuracy. Given that modern VRF systems generate high-quality data, this paper introduces transfer learning to leverage these data and further improve prediction performance. The framework also addresses the challenge of predicting loads for building zones with no historical data, offering greater accuracy and usability than pure white-box models. The study first establishes an initial variable set for VRF zonal building loads and generates a foundational white-box database using EnergyPlus. Key variables for VRF zonal loads are identified using methods including SRRC, PRCC, and Random Forest. XGBoost and LSTM are employed to generate pre-trained black-box models based on the white-box database. Finally, real-world data are incorporated into the pre-trained model using transfer learning to enhance its performance in operational buildings. In this paper, zone-level load prediction is integrated with transfer learning, and a framework is proposed to improve the accuracy and applicability of VRF zonal load prediction.
Keywords: zonal load prediction, variable refrigerant flow (VRF) system, transfer learning, EnergyPlus
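The pre-train-then-fine-tune idea can be sketched with a deliberately tiny stand-in: a linear model trained by gradient descent on simulated data, then fine-tuned on a few "real" points. All data, the linear form, and the learning rates are illustrative assumptions; the paper uses EnergyPlus-generated databases with XGBoost and LSTM models:

```python
def grad_step(theta, xs, ys, lr):
    """One gradient-descent step on mean squared error for y = w*x + b."""
    w, b = theta
    n = len(xs)
    gw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    gb = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    return (w - lr * gw, b - lr * gb)

def train(theta, xs, ys, lr, steps):
    for _ in range(steps):
        theta = grad_step(theta, xs, ys, lr)
    return theta

def mse(theta, xs, ys):
    w, b = theta
    return sum((w * x + b - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

# Simulated (white-box) data: load = 2*T + 1 is an assumed relation.
sim_x = [0.0, 1.0, 2.0, 3.0, 4.0]
sim_y = [2 * x + 1 for x in sim_x]
# A few "real" operational measurements with a slightly shifted relation.
real_x = [1.0, 2.0, 3.0]
real_y = [2.2 * x + 1.1 for x in real_x]

# Pre-train on abundant simulated data, then fine-tune the same
# parameters briefly on the scarce real data (transfer learning).
pretrained = train((0.0, 0.0), sim_x, sim_y, lr=0.02, steps=2000)
fine_tuned = train(pretrained, real_x, real_y, lr=0.02, steps=200)
```

The fine-tuned model fits the real measurements better than the un-tuned pre-trained one, while starting from the pre-trained parameters means far fewer real data points are needed than training from scratch.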
Procedia PDF Downloads 33
10964 A Contribution to the Polynomial Eigen Problem
Authors: Malika Yaici, Kamel Hariche, Tim Clarke
Abstract:
The relationship between eigenstructure (eigenvalues and eigenvectors) and latent structure (latent roots and latent vectors) is established. In control theory, eigenstructure is associated with the state space description of a dynamic multi-variable system, and latent structure is associated with its matrix fraction description. Beginning with block controller and block observer state space forms and moving on to any general state space form, we develop the identities that relate eigenvectors and latent vectors in either direction. Numerical examples illustrate this result. A brief discussion of the potential of these identities in linear control system design follows. Additionally, we present a consequent result: a quick and easy method to solve the polynomial eigenvalue problem for regular matrix polynomials.
Keywords: eigenvalues/eigenvectors, latent values/vectors, matrix fraction description, state space description
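The polynomial eigenvalue problem mentioned at the end can be illustrated in the scalar case: the latent roots of a monic polynomial are the eigenvalues of its companion matrix, recovered here with a plain power iteration. Matrix polynomials use a block companion form in exactly the same way; the example polynomial and solver settings are chosen only for illustration:

```python
def companion(coeffs):
    """Companion matrix of the monic polynomial
    s^n + c_{n-1} s^{n-1} + ... + c_0, with coeffs = [c_0, ..., c_{n-1}].
    Its eigenvalues are exactly the latent roots of the polynomial."""
    n = len(coeffs)
    m = [[0.0] * n for _ in range(n)]
    for i in range(n - 1):
        m[i][i + 1] = 1.0           # super-diagonal of ones
    for j in range(n):
        m[n - 1][j] = -coeffs[j]    # negated coefficients in the last row
    return m

def power_iteration(m, iters=200):
    """Dominant eigenvalue by power iteration (assumes a simple, real
    dominant eigenvalue; the start vector is chosen so it is not itself
    a non-dominant eigenvector for this example)."""
    n = len(m)
    v = [1.0] + [0.0] * (n - 1)
    lam = 1.0
    for _ in range(iters):
        w = [sum(m[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(w, key=abs)       # largest-magnitude component as estimate
        v = [x / lam for x in w]
    return lam

# P(s) = s^2 - 3s + 2 = (s - 1)(s - 2): latent roots 1 and 2.
C = companion([2.0, -3.0])
dominant = power_iteration(C)
```

In practice one would hand the (block) companion matrix to a general eigensolver to get all latent roots at once; power iteration is shown only to keep the sketch self-contained.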
Procedia PDF Downloads 474
10963 Earthquake Identification to Predict Tsunami in Andalas Island, Indonesia Using Back Propagation Method and Fuzzy TOPSIS Decision Seconder
Authors: Muhamad Aris Burhanudin, Angga Firmansyas, Bagus Jaya Santosa
Abstract:
Earthquakes are natural hazards that can trigger the most dangerous hazard of all: a tsunami. On 26 December 2004, a giant earthquake occurred north-west of Andalas Island. It generated a giant tsunami that struck Sumatra, Bangladesh, India, Sri Lanka, Malaysia and Singapore; more than twenty thousand people died. The occurrence of earthquakes and tsunamis cannot be avoided, but the hazard can be mitigated by earthquake forecasting, and early preparation is the key factor in reducing damage and consequences. We aim to investigate the pattern of earthquakes quantitatively so that the trend can be known. We study earthquakes that have happened on Andalas Island, Indonesia, over the last decade. Andalas is an island with high seismicity, with more than a thousand events occurring in a year, because it lies in the tectonic subduction zone between the Indian Ocean plate and the Eurasian plate. Tsunami forecasting is needed for mitigation action; thus, a tsunami forecasting method is presented in this work. Neural networks have been used widely in much research to estimate earthquakes, and it is accepted that earthquakes can be predicted using the backpropagation method. First, an ANN is trained to predict the tsunami of 26 December 2004 using earthquake data recorded before it; then, with the trained ANN, we predict the next earthquake. Not every earthquake will trigger a tsunami; there are certain characteristics of earthquakes that can cause one, and a wrong decision can cause other problems in society, so a method is needed to reduce the possibility of wrong decisions. Fuzzy TOPSIS is a statistical method that is widely used as a decision seconder with respect to given parameters, and it can make the best decision on whether an earthquake causes a tsunami or not. This work combines earthquake prediction using the neural network method with Fuzzy TOPSIS to decide whether a predicted earthquake triggers a tsunami wave or not.
A neural network model is capable of capturing non-linear relationships, and Fuzzy TOPSIS is capable of determining the best decision better than other statistical methods in tsunami prediction.
Keywords: earthquake, fuzzy TOPSIS, neural network, tsunami
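The TOPSIS decision-seconder step can be sketched in its crisp (non-fuzzy) form; the earthquake events, criteria and weights below are invented for illustration, not the study's parameters:

```python
import math

# Invented earthquake events scored on three benefit-type criteria
# (magnitude, shallowness, subduction-zone proximity); weights assumed.
events = {
    "event A": [9.1, 0.9, 0.95],
    "event B": [6.0, 0.3, 0.40],
    "event C": [7.5, 0.6, 0.70],
}
weights = [0.5, 0.25, 0.25]

def topsis(alternatives, weights):
    """Closeness coefficient of each alternative to the ideal solution;
    all criteria are treated as benefit criteria for simplicity."""
    cols = list(zip(*alternatives.values()))
    norms = [math.sqrt(sum(x * x for x in col)) for col in cols]
    scored = {name: [w * x / nrm for x, w, nrm in zip(row, weights, norms)]
              for name, row in alternatives.items()}
    ideal = [max(col) for col in zip(*scored.values())]   # best on each criterion
    anti = [min(col) for col in zip(*scored.values())]    # worst on each criterion

    def dist(u, v):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

    return {name: dist(vec, anti) / (dist(vec, anti) + dist(vec, ideal))
            for name, vec in scored.items()}

closeness = topsis(events, weights)
most_likely = max(closeness, key=closeness.get)
```

In the paper's pipeline, the ANN's predicted earthquake parameters would feed the criteria matrix, and fuzzy (interval) scores would replace the crisp numbers used here.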
Procedia PDF Downloads 503
10962 Medium-Scale Multi-Juice Extractor for Food Processing
Authors: Flordeliza L. Mercado, Teresito G. Aguinaldo, Helen F. Gavino, Victorino T. Taylan
Abstract:
Most fruits and vegetables are available in large quantities during peak season, when they are oftentimes marketed at low prices and left to rot or fed to farm animals. The lack of efficient storage facilities, and the additional cost and unavailability of small machinery for food processing, result in low prices and wastage. Incidentally, processed fresh fruits and vegetables are gaining importance nowadays, and health-conscious people are also into 'juicing'. One way to reduce wastage and ensure an all-season availability of crop juices at reasonable cost is to develop equipment for the effective extraction of juice. The study was conducted to design, fabricate and evaluate a multi-juice extractor using locally available materials, making it relatively cheap and affordable for medium-scale enterprises. The study was also conducted to formulate juice blends using the extracted juices and calamansi juice at different blending percentages, and to evaluate their chemical properties and sensory attributes. Furthermore, the chemical properties of the extracted meals were evaluated for future applications. The multi-juice extractor has an overall dimension of 963 mm x 300 mm x 995 mm, a gross weight of 82 kg and 5 major components, namely: feeding hopper, extracting chamber, juice and meal outlets, transmission assembly, and frame. The machine performance was evaluated based on juice recovery, extraction efficiency, extraction rate, extraction recovery, and extraction loss, considering apple and carrot as the crops, with three replications each, analyzed using a t-test. The formulated juice blends were subjected to sensory evaluation, and the data gathered were analyzed using an Analysis of Variance appropriate for a Completely Randomized Design. Results showed that the machine's juice recovery (73.39%), extraction rate (16.40 li/hr), and extraction efficiency (88.11%) for apple were significantly higher than for carrot, while extraction recovery (99.88%) was higher for apple than for carrot.
Extraction loss (0.12%) was lower for apple than for carrot but was not significantly affected by crop. Based on adding a percentage mark-up to the extraction cost (Php 2.75/kg), the breakeven weight and payback period for a 35% mark-up are 4,710.69 kg and 1.22 years, respectively; for a 50% mark-up, the breakeven weight is 3,492.41 kg and the payback period is 0.86 year (10.32 months). Results of the sensory evaluation of juice blends showed that the type of juice significantly influenced all the sensory parameters, while the blending percentage, including its interactions, had no significant effect on any sensory parameter, making the apple-calamansi juice blend more preferred than the carrot-calamansi juice blend in terms of all the sensory parameters. The machine's performance is higher for apple than for carrot, and the cost analysis of the use of the machine revealed that it is financially viable, with a payback period of 1.22 years (35% mark-up) or 0.86 year (50% mark-up) for the machine cost, generating an income of Php 23,961.60 or Php 34,444.80 per year using a 35% or 50% mark-up, respectively. The juice blends were of good quality based on the values obtained in the chemical analysis, and the extracted meal could also be used to produce another product based on the values obtained from the proximate analysis.
Keywords: food processing, fruits and vegetables, juice extraction, multi-juice extractor
Procedia PDF Downloads 328
10961 High-Frequency Cryptocurrency Portfolio Management Using Multi-Agent System Based on Federated Reinforcement Learning
Authors: Sirapop Nuannimnoi, Hojjat Baghban, Ching-Yao Huang
Abstract:
Over the past decade, with the rapid development of blockchain technology since the birth of Bitcoin, there has been a massive increase in the usage of cryptocurrencies. However, cryptocurrencies are often not seen as a reliable investment opportunity due to the market’s erratic behavior and high price volatility. With the recent success of deep reinforcement learning (DRL), portfolio management can be modeled and automated. In this paper, we propose a novel DRL-based multi-agent system that automatically makes proper trading decisions on multiple cryptocurrencies and gains profits in the highly volatile cryptocurrency market. We also extend this multi-agent system with horizontal federated transfer learning to better adapt to the inclusion of new cryptocurrencies in our portfolio; therefore, through the concept of diversification, we can maximize our profits and minimize our trading risks. Experimental results from multiple simulation scenarios reveal that the proposed algorithmic trading system offers three promising advantages over other systems: maximized profits, minimized risks, and adaptability.
Keywords: cryptocurrency portfolio management, algorithmic trading, federated learning, multi-agent reinforcement learning
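The abstract does not detail the federated transfer step; as a loose illustration, horizontal federated learning is commonly implemented by averaging model parameters across clients. The sketch below (all names and weights are hypothetical, not the authors' system) shows how a newly added cryptocurrency's agent could be initialized from the federated average of the existing agents' policy weights before local DRL fine-tuning.

```python
# Hypothetical sketch of a horizontal federated transfer step: each per-asset
# trading agent keeps its own policy weights, and a new agent starts from the
# element-wise average of the existing agents' weights.

def federated_average(agent_weights):
    """Element-wise average of each agent's policy weight vector."""
    n_agents = len(agent_weights)
    n_params = len(agent_weights[0])
    return [sum(w[i] for w in agent_weights) / n_agents for i in range(n_params)]

# Three existing agents (e.g., for BTC, ETH, LTC), each with a toy 3-parameter policy.
weights = [
    [0.2, 0.5, -0.1],
    [0.4, 0.3, 0.1],
    [0.0, 0.1, 0.3],
]

# A newly added cryptocurrency's agent is initialized from the averaged weights,
# then would be fine-tuned locally on its own market data.
new_agent_init = federated_average(weights)
print(new_agent_init)  # ≈ [0.2, 0.3, 0.1]
```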
Procedia PDF Downloads 121
10960 Optimal Maintenance and Improvement Policies in Water Distribution System: Markov Decision Process Approach
Authors: Jong Woo Kim, Go Bong Choi, Sang Hwan Son, Dae Shik Kim, Jung Chul Suh, Jong Min Lee
Abstract:
A Markov Decision Process (MDP) based methodology is implemented to establish the optimal maintenance schedule that minimizes cost. The formulation of the MDP problem is presented using information about the current state of each pipe, the improvement cost, the failure cost, and a pipe deterioration model. The objective function and the detailed dynamic programming (DP) algorithm are modified because of the difficulty of implementing conventional DP approaches. The optimal schedule derived from the suggested model is compared to several policies via Monte Carlo simulation. The validity of the solution and the improvement in computational time are demonstrated.
Keywords: Markov decision processes, dynamic programming, Monte Carlo simulation, periodic replacement, Weibull distribution
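The abstract does not give the model's data, so the following is only a minimal sketch of the kind of MDP it describes: value iteration choosing between keeping and replacing a pipe, over deterioration states, transition probabilities, and costs that are all invented for illustration (the paper's Weibull-based deterioration model is replaced by a simple Markov chain here).

```python
# Minimal value-iteration sketch of a pipe maintenance MDP. States 0..3 are
# increasing deterioration; actions are "keep" or "replace". All numbers are
# illustrative assumptions, not the paper's data.

STATES = [0, 1, 2, 3]
GAMMA = 0.9  # discount factor

# P[action][state] -> list of (next_state, probability); "keep" lets the pipe
# deteriorate, "replace" returns it to the as-new state 0.
P = {
    "keep":    {0: [(0, 0.7), (1, 0.3)], 1: [(1, 0.6), (2, 0.4)],
                2: [(2, 0.5), (3, 0.5)], 3: [(3, 1.0)]},
    "replace": {s: [(0, 1.0)] for s in STATES},
}
# Immediate cost: failure risk grows with deterioration; replacement costs 10.
COST = {"keep": {0: 0, 1: 1, 2: 4, 3: 20}, "replace": {s: 10 for s in STATES}}

V = {s: 0.0 for s in STATES}
for _ in range(500):  # value iteration to (near) convergence
    V = {s: min(COST[a][s] + GAMMA * sum(p * V[t] for t, p in P[a][s])
                for a in P) for s in STATES}

policy = {s: min(P, key=lambda a: COST[a][s] + GAMMA * sum(p * V[t] for t, p in P[a][s]))
          for s in STATES}
print(policy)  # highly deteriorated states should choose "replace"
```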
Procedia PDF Downloads 425
10959 Application of a New Efficient Normal Parameter Reduction Algorithm of Soft Sets in Online Shopping
Authors: Xiuqin Ma, Hongwu Qin
Abstract:
A new efficient normal parameter reduction algorithm for soft sets in decision making was recently proposed. However, up to the present, few documents have focused on real-life applications of this algorithm. Accordingly, we apply the new efficient normal parameter reduction algorithm to real-life online shopping datasets, such as a BlackBerry mobile phone dataset. Experimental results show that this algorithm is both suitable and feasible for dealing with online shopping data.
Keywords: soft sets, parameter reduction, normal parameter reduction, online shopping
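For readers unfamiliar with the idea, the toy sketch below illustrates one common formulation of normal parameter reduction: a parameter subset is dispensable if every object scores the same total over it, so removing it preserves the ranking of choice values. The boolean dataset and the brute-force search are illustrative assumptions, not the authors' algorithm or dataset.

```python
# Toy normal parameter reduction for a boolean soft set. Rows: candidate
# phones (objects); columns: parameters (e.g., price-ok, camera, battery,
# brand). Entry 1 means the phone satisfies the parameter.
from itertools import combinations

soft_set = [
    [1, 0, 1, 1],
    [0, 1, 1, 0],
    [1, 0, 0, 0],
]
n_params = len(soft_set[0])

def choice_values(rows, cols):
    return [sum(row[c] for c in cols) for row in rows]

# Find the largest parameter subset whose per-object sums are all equal:
# removing it leaves every object's relative choice value unchanged.
dispensable = ()
for k in range(n_params - 1, 0, -1):
    for subset in combinations(range(n_params), k):
        if len(set(choice_values(soft_set, subset))) == 1:
            dispensable = subset
            break
    if dispensable:
        break

kept = [c for c in range(n_params) if c not in dispensable]
print("dispensable:", dispensable, "reduced choice values:", choice_values(soft_set, kept))
```

Here the full choice values [3, 2, 1] reduce to [2, 1, 0] over the kept parameters: the ranking of the phones is unchanged with half the parameters removed.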
Procedia PDF Downloads 517
10958 The Impact of Socio-Economic and Type of Religion on the Behavior of Obedience among Arab-Israeli Teenagers
Authors: Sadhana Ghnayem
Abstract:
This article examines the relationship between several socio-economic and background variables of Arab-Israeli families and their effect on the conflict management style of forcing, in which teenage children are expected to obey their parents without questioning. The article explores the inter-generational gap and the desire of Arab-Israeli parents to force their teenage children to obey without questioning. The independent variables include: the sex of the parent, religion (Christian or Muslim), the parent’s income, the parent’s years of education, and the sex of the teenage child. The dependent variable, “obedience without questioning,” is reported twice: by each of the parents as well as by the children. We circulated a questionnaire and collected data from a sample of 180 parents and their adolescent children living in the Galilee area during 2018. In this questionnaire, we asked each parent and his/her teenage child whether the latter is expected to follow the instructions of the former without questioning. The findings of this article indicate, first, that Christian-Arab families are less authoritarian than Muslim families in demanding sheer obedience from their children. Second, female parents indicate more often than male parents that their teenage child indeed obeys without questioning. Third, there is a negative correlation between income and obedience without questioning, although the regression coefficient of this variable is close to zero. Fourth, there is a positive correlation between years of education and obedience as reported by the children; in other words, more educated parents are more likely to demand obedience from their children.
Finally, after running the regression, the study also found that the effects of religion and of the sex of the child on the dependent variable of obedience are statistically significant at the 95% and 90% confidence levels, respectively.
Keywords: conflict, religion, conflict management style, obedience
Procedia PDF Downloads 175
10957 'Systems' and Its Impact on Virtual Teams and Electronic Learning
Authors: Shavindrie Cooray
Abstract:
It is vital that students are supported in having balanced conversations about topics that might be controversial; this process is crucial to the development of critical thinking skills. It can be difficult to attain in e-learning environments, with some research finding that students report a perceived loss in the quality of knowledge exchange and performance. This research investigated whether Systems Theory could be applied to structure the discussion, improve information sharing, and reduce conflicts when students work in online environments. The research involved 160 participants across four categories of student groups at a college in the northeastern US. Each group was given a shared problem and was expected to propose a solution. Two groups worked face-to-face: the first engaged with the problem and each other with no intervention from a facilitator, while the second worked on the problem using Systems tools to facilitate problem structuring, group discussion, and decision-making. There were likewise two types of virtual teams. The first virtual group also used Systems tools to facilitate problem structuring and group discussion; however, all interactions were conducted in a synchronous virtual environment. The second virtual team also met in real time but worked with no intervention. Findings from the study demonstrated that the teams (both virtual and face-to-face) using Systems tools shared more information with each other than the other teams; additionally, these teams reported an increased level of disagreement amongst their members but also expressed more confidence and satisfaction with the experience and the resulting decision compared to the other groups.
Keywords: e-learning, virtual teams, systems approach, conflicts
Procedia PDF Downloads 141
10956 Predicting Provider Service Time in Outpatient Clinics Using Artificial Intelligence-Based Models
Authors: Haya Salah, Srinivas Sharan
Abstract:
Healthcare facilities use appointment systems to schedule appointments and manage access to their medical services. With the growing demand for outpatient care, it is now imperative to manage physicians’ time effectively. However, high variation in consultation duration affects the clinical scheduler’s ability to estimate appointment duration and allocate provider time appropriately. Underestimating consultation times can lead to physician burnout, misdiagnosis, and patient dissatisfaction; on the other hand, appointment durations that are longer than required lead to doctor idle time and fewer patient visits. Therefore, a good estimate of consultation duration has the potential to improve timely access to care, resource utilization, quality of care, and patient satisfaction. Although the literature on factors influencing consultation length abounds, little work has been done to predict it using data-driven approaches. This study therefore aims to predict consultation duration using supervised machine learning (ML) algorithms, which predict an outcome variable (e.g., consultation duration) based on potential features that influence the outcome. In particular, ML algorithms learn from a historical dataset without being explicitly programmed and uncover the relationship between the features and the outcome variable. A subset of the data used in this study was obtained from the electronic medical records (EMR) of four outpatient clinics located in central Pennsylvania, USA. In addition, publicly available information on doctors’ characteristics, such as gender and experience, was extracted from online sources. This research develops three popular ML algorithms (deep learning, random forest, and gradient boosting machine) to predict the treatment time required for a patient and conducts a comparative analysis of their predictive performance.
The findings of this study indicate that ML algorithms have the potential to predict provider service time with superior accuracy. While the current experience-based appointment duration estimation adopted by the clinics resulted in a mean absolute percentage error (MAPE) of 25.8%, the deep learning algorithm developed in this study yielded the best performance, with a MAPE of 12.24%, followed by the gradient boosting machine (13.26%) and random forest (14.71%). This research also identified the critical variables affecting consultation duration: patient type (new vs. established), doctor’s experience, zip code, appointment day, and doctor’s specialty. Moreover, several practical insights were obtained from the comparative analysis of the ML algorithms. The machine learning approach presented in this study can serve as a decision support tool and could be integrated into the appointment system for effectively managing patient scheduling.
Keywords: clinical decision support system, machine learning algorithms, patient scheduling, prediction models, provider service time
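The MAPE metric used to compare the approaches above can be sketched in a few lines; the consultation durations below are made up for illustration, but the metric itself is the one the study quotes.

```python
# Mean absolute percentage error (MAPE), the comparison metric in the study.
# The duration values below are hypothetical, not the clinics' data.

def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

actual_minutes = [20, 30, 15, 45]
scheduler_estimate = [30, 30, 20, 30]  # experience-based, fixed-ish slot lengths
model_estimate = [22, 28, 16, 42]      # hypothetical ML predictions

baseline = mape(actual_minutes, scheduler_estimate)
model = mape(actual_minutes, model_estimate)
assert model < baseline  # a good data-driven model should reduce MAPE
print(round(baseline, 1), round(model, 1))
```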
Procedia PDF Downloads 125
10955 Corporate In-Kind Donations and Economic Efficiency: The Case of Surplus Food Recovery and Donation
Authors: Sedef Sert, Paola Garrone, Marco Melacini, Alessandro Perego
Abstract:
This paper aims to enhance our current understanding of the motivations behind corporate in-kind donations and to find out whether economic efficiency may be a major driver. Our empirical setting consists of surplus food recovery and donation by companies in the food supply chain. This choice of empirical setting is motivated by growing attention to the paradox of food insecurity and food waste: an estimated 842 million people worldwide suffer from regularly not getting enough food, while approximately 1.3 billion tons of food are wasted globally each year. Recently, many authors have started considering surplus food donation to nonprofit organizations as a way to cope with the social issue of food insecurity and the environmental issue of food waste. The corporate philanthropy literature has examined the motivations behind corporate donations for social purposes, such as altruism, enhancement of employee morale, the organization’s image, supplier/customer relationships, and local community support. However, the relationship with economic efficiency has not been studied, and in many cases pure economic efficiency as a decision-making factor is neglected. Although some studies in the literature hint at the economic value created by surplus food donation, such as saved landfill fees or tax deductions, so far no study has focused deeply on this phenomenon. In this paper, we develop a conceptual framework that explores the economic barriers and drivers of the alternative surplus food management options, i.e., discounts, secondary markets, feeding animals, composting, energy recovery, and disposal. A case study methodology is used to conduct the research. Protocols for semi-structured interviews were prepared based on an extensive literature review and adapted after expert opinions.
The interviews were conducted mostly with the supply chain and logistics managers of 20 food-sector companies operating in Italy, in particular in the Lombardy region. The results show that, in the current situation, food manufacturing companies can achieve cost savings by recovering and donating surplus food compared with the other options, especially disposal. On the other hand, the retail and food service sectors are not economically incentivized to recover and donate surplus food to disadvantaged populations. The paper shows that not only strategic and moral motivations but also economic motivations play an important role in the managerial decision-making process in surplus food management. We also believe that our research, while rooted in the surplus food management topic, delivers some interesting implications for more general research on corporate in-kind donations. It also shows that there is considerable room for policy making that favors the recovery and donation of surplus products.
Keywords: corporate philanthropy, donation, recovery, surplus food
Procedia PDF Downloads 317
10954 Impact Position Method Based on Distributed Structure Multi-Agent Coordination with JADE
Authors: YU Kaijun, Liang Dong, Zhang Yarong, Jin Zhenzhou, Yang Zhaobao
Abstract:
For impact monitoring of distributed structures, the traditional positioning methods are based on time differences and include the four-point arc positioning method and the triangulation positioning method. In actual operation, however, both methods exhibit errors. In this paper, the multi-agent blackboard coordination principle is used to combine the two methods. The fusion steps are: (1) the four-point arc locating agent calculates the initial point and records it in the blackboard module; (2) the triangulation agent obtains its initial parameters by accessing the initial point; (3) the triangulation agent repeatedly accesses the blackboard module to update its initial parameters and also logs its calculated point to the blackboard; (4) when the subsequent calculated point and the initial calculated point agree within the allowable error, the whole coordination fusion process is finished. This paper presents a multi-agent collaboration method whose agent framework is JADE. The JADE platform consists of several agent containers, with agents running in each container. Because of JADE’s management and debugging tools, it is very convenient to deal with complex data in a large structure. Finally, based on the data in JADE, the results show that the impact location method based on multi-agent coordination fusion can reduce the errors of the two methods.
Keywords: impact monitoring, structural health monitoring (SHM), multi-agent system (MAS), blackboard coordination, JADE
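Steps (1)-(4) above can be sketched as a simple shared-blackboard loop. The two "agents" below are stand-in functions, not JADE agents or the actual arc/triangulation solvers; the refinement rule and coordinates are invented for illustration.

```python
# Toy sketch of the blackboard fusion loop in steps (1)-(4).

blackboard = {}

def four_point_arc_agent():
    # (1) Compute an initial impact point and post it to the blackboard.
    blackboard["point"] = (10.0, 8.0)

def triangulation_agent(tolerance=0.05):
    # (2)-(3) Read the current estimate, refine it, and post the update,
    # until (4) successive estimates agree within the allowable error.
    while True:
        x, y = blackboard["point"]
        refined = (0.5 * (x + 9.6), 0.5 * (y + 8.4))  # stand-in refinement step
        dx, dy = abs(refined[0] - x), abs(refined[1] - y)
        blackboard["point"] = refined
        if max(dx, dy) < tolerance:
            return refined

four_point_arc_agent()
fused = triangulation_agent()
print(fused)  # converges near the stand-in solution (9.6, 8.4)
```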
Procedia PDF Downloads 181
10953 Evaluation of Research in the Field of Energy Efficiency and MCA Methods Using Publications Databases
Authors: Juan Sepúlveda
Abstract:
Energy is a fundamental component of sustainability; access to and use of this resource are related to economic growth, social improvement, and environmental impact. In this sense, energy efficiency has been studied as a factor that enhances the positive impacts of energy on communities; however, implementing efficiency requires strong policies and strategies that usually rely on individual measures focused on independent dimensions. In this paper, energy efficiency is studied as a multi-objective problem, using scientometric analysis to discover trends and patterns that make it possible to identify the main variables and study approaches related to the further development of models that integrate energy efficiency and multi-criteria analysis (MCA) into policy making for small communities.
Keywords: energy efficiency, MCA, scientometric, trends
Procedia PDF Downloads 376
10952 Development of Researcher Knowledge in Mathematics Education: Towards a Confluence Framework
Authors: Igor Kontorovich, Rina Zazkis
Abstract:
We present a framework of researcher knowledge development in conducting a study in mathematics education. The key components of the framework are: knowledge germane to conducting a particular study, processes of knowledge accumulation, and catalyzing filters that influence a researcher’s decision making. The components of the framework originated from a confluence of constructs and theories in mathematics education, higher education, and sociology. Drawing on a self-reflective interview with a leading researcher in mathematics education, Professor Michèle Artigue, we illustrate how the framework can be utilized in data analysis. Criteria for evaluating the framework are discussed.
Keywords: community of practice, knowledge development, mathematics education research, researcher knowledge
Procedia PDF Downloads 514
10951 Climate Change and Urban Flooding: The Need to Rethink Urban Flood Management through Resilience
Authors: Suresh Hettiarachchi, Conrad Wasko, Ashish Sharma
Abstract:
The ever-changing and expanding urban landscape increases the stress on urban systems to support and maintain safe, functional living spaces. Flooding presents one of the more serious threats to this safety, putting larger numbers of people in harm’s way in congested urban settings. Climate change adds to this stress by creating a dichotomy in the urban flood response: on the one hand, climate change is causing storms to intensify, resulting in more destructive, rarer floods, while on the other hand, longer dry periods are decreasing the severity of the more frequent, less intense floods. This variability creates a need to be more agile and innovative in how we design for and manage urban flooding. Here, we argue that to cope with the challenge climate change brings, we need to move towards urban flood management through resilience rather than flood prevention. We also argue that dealing with the larger variation in flood response under climate change means looking at flooding from all aspects rather than through the single-dimensional focus of flood depths and extents. In essence, we need to rethink how we manage flooding in the urban space. This change in our thinking and approach to flood management requires a practical way to assess and quantify the resilience built into the urban landscape, so that informed decision-making can support the required changes in planning and infrastructure design. Towards that end, we propose a Simple Urban Flood Resilience Index (SUFRI), based on a robust definition of resilience, as a tool to assess flood resilience. The application of a simple resilience index such as the SUFRI can provide a practical tool that considers urban flood management in a multi-dimensional way and can present solutions that were not previously considered.
When such an index is grounded in a clear and relevant definition of resilience, it can be a reliable and defensible way to assess and assist the process of adapting to the increasing challenges of urban flood management under climate change.
Keywords: urban flood resilience, climate change, flood management, flood modelling
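The abstract does not define SUFRI's formula. Purely as an illustrative assumption, a simple multi-dimensional resilience index might aggregate normalized sub-indicators with weights, as sketched below; the dimension names, values, and weights are hypothetical and are not taken from the paper.

```python
# Hypothetical weighted-aggregation sketch of a simple multi-dimensional
# flood resilience index (not SUFRI's actual formula).

def resilience_index(indicators, weights):
    """Weighted average of sub-indicators, each normalized to [0, 1]."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(v * w for v, w in zip(indicators, weights))

# Assumed dimensions: drainage capacity, recovery time, exposure, redundancy.
weights = [0.3, 0.3, 0.2, 0.2]
catchment_a = resilience_index([0.8, 0.6, 0.4, 0.7], weights)
catchment_b = resilience_index([0.5, 0.4, 0.3, 0.6], weights)
print(round(catchment_a, 2), round(catchment_b, 2))  # a scores as the more resilient catchment
```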
Procedia PDF Downloads 56
10950 Constructing Decision Trees Based on the K-Means Clustering Algorithm
Authors: Loai Abdallah, Malik Yousef
Abstract:
A domain space for the data should reflect the actual similarity between objects, since objects belonging to the same cluster usually share common traits even though their geometric distance might be relatively large. In general, the Euclidean distance between data points represented by a large number of features does not capture the actual relation between those points. In this study, we propose a new method that constructs a different space, based on clustering, to form a new distance metric. The new distance space is based on ensemble clustering (EC): the EC distance space is defined by tracking the membership of the points over multiple runs of a clustering algorithm. Over this distance, we train a decision tree classifier (DT-EC). The results obtained by applying DT-EC to 10 datasets confirm our hypothesis that embedding the EC space as a distance metric improves performance.
Keywords: ensemble clustering, decision trees, classification, K nearest neighbors
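The ensemble-clustering (EC) distance described above can be sketched directly: run a clustering algorithm several times, track the co-membership of each pair of points, and define their distance as the fraction of runs in which they land in different clusters. The label vectors below are hard-coded stand-ins for repeated K-means runs; this is an illustration of the idea, not the authors' implementation.

```python
# EC distance: fraction of clustering runs in which two points are separated.

def ec_distance(labels_per_run, i, j):
    separated = sum(1 for labels in labels_per_run if labels[i] != labels[j])
    return separated / len(labels_per_run)

# Four points, three clustering runs (each entry is a cluster id per point).
runs = [
    [0, 0, 1, 1],
    [0, 0, 1, 0],
    [0, 0, 1, 1],
]

# Points 0 and 1 always co-cluster -> distance 0; points 2 and 3 co-cluster
# in 2 of 3 runs -> distance 1/3; points 0 and 2 never co-cluster -> 1.
print(ec_distance(runs, 0, 1), ec_distance(runs, 2, 3), ec_distance(runs, 0, 2))
```

A decision tree or KNN classifier can then be trained over this pairwise distance instead of the Euclidean one.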
Procedia PDF Downloads 197