Search results for: optimization problem
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9628

7768 Optimization of Maintenance of PV Module Arrays Based on Asset Management Strategies: A Case Study

Authors: L. Alejandro Cárdenas, Fernando Herrera, David Nova, Juan Ballesteros

Abstract:

This paper presents a methodology to optimize the maintenance of grid-connected photovoltaic systems, considering the cleaning and module replacement periods based on an asset management strategy. The methodology is based on the analysis of the energy production of the PV plant, the energy feed-in tariff, and the cost of cleaning and replacement of the PV modules, with the overall revenue received being the optimization variable. The methodology is evaluated in a case study of a 5.6 kWp solar PV plant located on the Bogotá campus of the Universidad Nacional de Colombia. The asset management strategy implemented consists of assessing the PV modules through visual inspection, energy performance analysis, pollution, and degradation. Within the visual inspection of the plant, the general condition of the modules and the structure is assessed, identifying dust deposition, visible fractures, and water accumulation on the bottom. The energy performance analysis is performed with the energy production reported by the monitoring systems and compared with the values estimated in the simulation. The pollution analysis is based on the soiling rate due to dust accumulation, which can be modelled as a black box with an exponential function dependent on historical pollution values. The soiling rate is calculated with data collected from the energy generated over two years in a photovoltaic plant on the campus of the Universidad Nacional de Colombia. Additionally, the alternative of assessing the temperature degradation of the PV modules is evaluated by estimating the cell temperature from parameters such as ambient temperature and wind speed. The medium-term energy decrease of the PV modules is assessed with the asset management strategy by calculating a health index to determine the replacement period of the modules due to degradation. This study proposes a tool for decision making related to the maintenance of photovoltaic systems.
This work is motivated by the projected increase in solar photovoltaic installations in power systems, associated with the commitments made in the Paris Agreement to reduce CO2 emissions. In the Colombian context, it is estimated that by 2030, 12% of the installed power capacity will be solar PV.
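The revenue trade-off behind the cleaning-period optimization can be sketched as a one-variable search; all the numbers below (soiling rate constant, daily energy, tariff, cleaning cost) are invented placeholders for illustration, not the plant's measured values:

```python
import math

def soiling_loss(days_since_cleaning, k=0.0015):
    """Fraction of energy lost to dust, modelled as an exponential
    black-box function of the days elapsed since the last cleaning.
    k is a hypothetical site-specific soiling rate (1/day)."""
    return 1.0 - math.exp(-k * days_since_cleaning)

def net_revenue(cleaning_period_days, daily_energy_kwh=25.0,
                tariff=0.12, cleaning_cost=40.0, horizon_days=365):
    """Revenue over the horizon minus cleaning costs, accumulating the
    soiling loss between successive cleanings."""
    revenue, day = 0.0, 0
    while day < horizon_days:
        for d in range(min(cleaning_period_days, horizon_days - day)):
            revenue += daily_energy_kwh * (1.0 - soiling_loss(d)) * tariff
        revenue -= cleaning_cost
        day += cleaning_period_days
    return revenue

# Sweep candidate cleaning periods (in weeks) and keep the most profitable.
best = max(range(7, 181, 7), key=net_revenue)
```

Too-frequent cleaning wastes cleaning cost; too-rare cleaning loses energy revenue, so an interior optimum exists.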

Keywords: asset management, PV module, optimization, maintenance

Procedia PDF Downloads 39
7767 The Relation between Coping Strategies for Stress and Mental Health in the Families of Drug Addicts at Self-Referral and Private Units

Authors: Farnoush Haghanipour

Abstract:

This research studies the relation between coping strategies for stress and mental health in the families of drug addicts attending self-referral and private units in Guilan province. For this purpose, 251 families (parents, spouses) who had referred to private and self-referral drug-withdrawal centers were selected by random sampling. The research method was cross-sectional and descriptive, and the purpose was to establish the relation between the types of coping strategies for stress and mental health condition, with attention to demographic variables. To collect the data, the Coping Strategies Questionnaire (CSQ) and the General Health Questionnaire (GHQ) were used, and the data were analyzed with descriptive statistics (mean, standard deviation) and inferential statistics (correlation coefficient and regression). The correlation between mental health and the problem-focused, emotion-focused, and detachment strategies was confirmed at a confidence level above 99%. In other words, relations were found between mental health and the problem-focused (r = 0.34), emotion-focused (r = 0.52), and detachment (r = 0.18) strategies at the 0.05 significance level, while the avoidant strategy showed no significant relation with mental health (r = 0.034, not significant at the 0.05 level). The relation between the problem-focused coping strategy and mental health, with attention to demographic variables, was also significant, verified at a confidence level above 99%. In the regression analysis, the problem-focused coping strategy made the largest contribution to predicting mental health, whereas the relations of the avoidant, emotional, and detachment strategies with mental health were not significant with attention to demographic variables.

Keywords: stress, coping strategy with stress, mental health, self introducer and private

Procedia PDF Downloads 307
7766 Multi-Objective Optimization and Effect of Surface Conditions on Fatigue Performance of Burnished Components Made of AISI 52100 Steel

Authors: Ouahiba Taamallah, Tarek Litim

Abstract:

The study deals with the burnishing of AISI 52100 steel and the influence of the treatment parameters (Py, i, and f) on surface integrity. The results show that the optimal effects are closely related to the treatment parameters. With a 92% improvement in roughness, SB can be defined as a finishing operation within the machining range. With an 85% gain in consolidation rate, the treatment constitutes an efficient process for work-hardening of the material. In addition, a statistical study based on regression and Taguchi's design made it possible to develop mathematical models to predict the output responses according to the studied burnishing parameters. Response Surface Methodology (RSM) showed the simultaneous influence of the burnishing parameters and allowed the optimal treatment parameters to be observed. Analysis of variance (ANOVA) validated the prediction models with determination coefficients R2 = 94.60% and R2 = 93.41% for surface roughness and micro-hardness, respectively. Furthermore, a multi-objective optimization identified a regime characterized by Py = 20 kgf, i = 5 passes, and f = 0.08 mm.rev-1, which favors minimum surface roughness and maximum micro-hardness. The result was validated by a composite desirability Di = 1 for both surface roughness and micro-hardness. Applying the optimal parameters, burnishing showed its beneficial effects on fatigue resistance, especially for imposed loading in the low-cycle fatigue regime of the material, where the lifespan increased by 90%.
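The composite-desirability criterion used to pick the optimal regime can be sketched as follows; the three candidate regimes and their response values (Ra, HV) below are invented for illustration, not the paper's measurements:

```python
def desirability_min(y, y_min, y_max):
    """Smaller-the-better desirability (e.g. surface roughness):
    1 at the best observed value, 0 at the worst."""
    if y <= y_min: return 1.0
    if y >= y_max: return 0.0
    return (y_max - y) / (y_max - y_min)

def desirability_max(y, y_min, y_max):
    """Larger-the-better desirability (e.g. micro-hardness)."""
    if y >= y_max: return 1.0
    if y <= y_min: return 0.0
    return (y - y_min) / (y_max - y_min)

def composite(d1, d2):
    """Geometric mean of the individual desirabilities."""
    return (d1 * d2) ** 0.5

# Hypothetical regimes (Py in kgf, i passes, f in mm/rev) -> (Ra in um, HV)
regimes = {(10, 3, 0.12): (0.90, 610),
           (20, 5, 0.08): (0.35, 780),
           (30, 7, 0.16): (0.55, 700)}
ra = [v[0] for v in regimes.values()]
hv = [v[1] for v in regimes.values()]
scores = {k: composite(desirability_min(v[0], min(ra), max(ra)),
                       desirability_max(v[1], min(hv), max(hv)))
          for k, v in regimes.items()}
best = max(scores, key=scores.get)
```

A regime that simultaneously attains the lowest roughness and the highest hardness in the set reaches the maximum composite desirability D = 1, as reported for the optimal regime.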

Keywords: AISI 52100 steel, burnishing, Taguchi, fatigue

Procedia PDF Downloads 185
7765 Exploring Students' Alternative Conceptions of Vector Components

Authors: Umporn Wutchana

Abstract:

An open-ended problem and unstructured interviews were used to explore students' conceptual and procedural understanding of vector components. The open-ended problem was designed based on research instruments used in previous physics education research. Without any physical context, students were asked to find the magnitude and draw the graphical form of vector components. The problem was given to 211 first-year students of the faculty of science during the third (summer) semester of the 2014 academic year. The students spent approximately 15 minutes of their second attempt at the General Physics I course, after having failed it, to complete the problem. Their responses were classified based on the similarity of the errors they contained. An unstructured interview was then conducted: 7 students were randomly selected and asked to reason about and explain their answers. The results showed that 53% of the 211 students provided the correct numerical magnitudes of the vector components, while 10.9% confused and interchanged the magnitudes of the x- and y-components. Another 20.4% provided only symbols, and the remaining 15.6% gave no answer. When asked to draw the graphical form of the vector components, only 10% of the 211 students answered correctly. The majority produced errors that revealed alternative conceptions: 46.5% drew components with longer and/or shorter magnitudes, and 43.1% drew vectors in different forms or wrote down other symbols. Results from the unstructured interviews indicated that some students had simply memorized the method to obtain the numerical magnitudes of the x- and y-components. Regarding the graphical form, some students thought that the component vectors should be shorter than the given vector, so that they could be combined to equal its length, while others thought that the component vectors should have the same length as the given vector.
It is likely that many students had not developed a strong foundational understanding of vector components, but had merely memorized the solution, or the way to compute the magnitude, and attributed little meaning to the concept.
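The relationship the students were probed on can be stated compactly: each component of a vector is never longer than the vector itself, and recombining the components recovers the original magnitude. A minimal numerical check (the angle is chosen arbitrarily for illustration):

```python
import math

def components(magnitude, angle_deg):
    """Resolve a vector into its x- and y-components."""
    theta = math.radians(angle_deg)
    return magnitude * math.cos(theta), magnitude * math.sin(theta)

ax, ay = components(10.0, 36.87)   # a 10-unit vector at ~36.87 degrees
length = math.hypot(ax, ay)        # recombining recovers the magnitude
```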

Keywords: graphical vectors, vectors, vector components, misconceptions, alternative conceptions

Procedia PDF Downloads 181
7764 Parenting a Child with Mental Health Problems: The Role of Self-compassion

Authors: Vered Shenaar-Golan, Nava Wald, Uri Yatzkar

Abstract:

Background: Parenting children with mental health problems poses multiple challenges, including coping with difficult behavior and negative child emotions. The impact on parents includes financial strain, negative social stigma, and negative feelings of guilt or blame, resulting in significant stress and lower levels of well-being. Given findings that self-compassion plays a significant role in reducing stress and improving well-being, the current study examined the role of self-compassion in the experience of parents raising a child with mental health problems. The study tested (1) whether child behavioral/emotional problem severity is associated with higher parental stress and lower parental well-being; (2) whether self-compassion is associated with lower parental stress and higher parental well-being; and (3) whether self-compassion is a stronger predictor of parental stress and well-being than child behavioral/emotional problem severity. Methods: Three hundred and six mothers and two hundred and fifty-six fathers of children attending a hospital child and adolescent psychiatric center were assessed at admission. Consenting parents completed four questionnaires: Child Strength and Difficulty – parent version, Self-compassion, Parent Feeling Inventory, and Well-Being. Results: Child behavioral/emotional problem severity was associated with higher parental stress and lower parental well-being, and self-compassion was a stronger predictor of parental stress and well-being levels than child behavioral/emotional problem severity. For children with internalizing but not externalizing behavioral/emotional problems, parental self-compassion was the only predictor of parental well-being beyond the severity of child behavioral/emotional problems. 
Conclusions: Cultivating self-compassion is important in reducing parental stress and increasing parental well-being, particularly with internalizing presentations, and should be considered when designing therapeutic interventions for parents.

Keywords: parenting children with mental health problems, self-compassion, parental stress, feelings, well-being

Procedia PDF Downloads 74
7763 Probabilistic Approach of Dealing with Uncertainties in Distributed Constraint Optimization Problems and Situation Awareness for Multi-agent Systems

Authors: Sagir M. Yusuf, Chris Baber

Abstract:

In this paper, we describe how Bayesian inferential reasoning contributes to obtaining well-satisfied predictions for Distributed Constraint Optimization Problems (DCOPs) with uncertainties. We also demonstrate how DCOPs can be merged with multi-agent knowledge understanding and prediction (i.e., Situation Awareness). The DCOP functions were merged with a Bayesian Belief Network (BBN) in the form of situation, awareness, and utility nodes. We describe how the uncertainties can be represented in the BBN and how effective predictions can be made using the expectation-maximization algorithm or the conjugate gradient descent algorithm. The idea of variable prediction using Bayesian inference may reduce the number of variables in the agents' sampling domain and also allow the estimation of missing variables. Experimental results showed that the BBN makes compelling predictions from samples containing uncertainties, compared with perfect samples. That is, Bayesian inference can help in handling the uncertainty and dynamism of DCOPs, which is a current issue in the DCOP community. We show how Bayesian inference can be formalized with Distributed Situation Awareness (DSA) using uncertain and missing agents' data. The whole framework was tested on a multi-UAV mission for forest fire search. Future work focuses on augmenting the existing architecture to deal with dynamic DCOP algorithms and multi-agent information merging.
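As a minimal sketch of the Bayesian updating involved, a hidden "fire" state can be inferred from a noisy UAV sensor report, with a missing observation simply leaving the prior unchanged; the probabilities below are invented for illustration, not taken from the paper's experiments:

```python
p_fire = 0.1             # prior belief that a cell contains fire
p_obs_given_fire = 0.9   # sensor detection rate (illustrative)
p_obs_given_clear = 0.2  # sensor false-alarm rate (illustrative)

def posterior_fire(observed):
    """P(fire | sensor report) by Bayes' rule; None models a missing
    observation, for which the prior is returned unchanged."""
    if observed is None:
        return p_fire
    if observed:
        num = p_obs_given_fire * p_fire
        den = num + p_obs_given_clear * (1 - p_fire)
    else:
        num = (1 - p_obs_given_fire) * p_fire
        den = num + (1 - p_obs_given_clear) * (1 - p_fire)
    return num / den
```

A positive report raises the belief from the prior, a negative report lowers it, and missing data falls back to the prior, which is the mechanism that lets agents keep estimating unobserved variables.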

Keywords: DCOP, multi-agent reasoning, Bayesian reasoning, swarm intelligence

Procedia PDF Downloads 114
7762 Hybrid Approach for Software Defect Prediction Using Machine Learning with Optimization Technique

Authors: C. Manjula, Lilly Florence

Abstract:

Software technology is developing rapidly, which leads to the growth of various industries. Nowadays, software-based applications are widely adopted for business purposes. For any software company, the development of reliable software is a challenging task, because a faulty software module may be harmful to the growth of the industry and its business. Hence there is a need for techniques that can be used for the early prediction of software defects. Due to the complexity of manual prediction, automated software defect prediction techniques have been introduced. These techniques learn patterns from previous software versions and find the defects in the current version. They have attracted researchers due to their significant impact on industrial growth through the identification of bugs in software. Several studies have been carried out on this basis, but achieving the desired defect prediction performance is still challenging. To address this issue, we present a machine learning based hybrid technique for software defect prediction. First, a Genetic Algorithm (GA) is presented, in which an improved fitness function is used for better optimization of the features in the data sets. These features are then processed through a Decision Tree (DT) classification model. Finally, an experimental study is presented in which results from the proposed GA-DT hybrid approach are compared with those from the plain DT classification technique. The results show that the proposed hybrid approach achieves better classification accuracy.
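The GA-based feature optimization can be sketched on a toy dataset; the fitness function, penalty weight, and GA settings below are illustrative rather than the paper's, and a simple table classifier stands in for the decision tree:

```python
import random
from itertools import product
from collections import Counter

random.seed(42)

# Toy dataset: 6 binary "metrics"; the defect label depends only on
# features 0 and 1, the rest are irrelevant.
X = [list(bits) for bits in product([0, 1], repeat=6)]
y = [row[0] & row[1] for row in X]

def fitness(mask):
    """Training accuracy of a table classifier on the selected
    features, minus a small penalty per feature so smaller subsets win."""
    if not mask:
        return 0.0
    groups = {}
    for row, label in zip(X, y):
        groups.setdefault(tuple(row[i] for i in mask), []).append(label)
    correct = sum(max(Counter(labels).values()) for labels in groups.values())
    return correct / len(X) - 0.02 * len(mask)

def evolve(pop_size=20, generations=40, p_mut=0.2):
    """Tiny GA over feature bitmasks: truncation selection,
    one-point crossover, bit-flip mutation."""
    pop = [[random.random() < 0.5 for _ in range(6)] for _ in range(pop_size)]
    def score(ind):
        return fitness([i for i, b in enumerate(ind) if b])
    for _ in range(generations):
        pop.sort(key=score, reverse=True)
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, 6)
            child = [bit ^ (random.random() < p_mut) for bit in a[:cut] + b[cut:]]
            children.append(child)
        pop = parents + children
    best = max(pop, key=score)
    return [i for i, b in enumerate(best) if b], score(best)

mask, fit = evolve()   # expected to recover the informative features
```

In the hybrid approach the surviving feature subset would then be handed to the decision tree classifier for the actual defect prediction.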

Keywords: decision tree, genetic algorithm, machine learning, software defect prediction

Procedia PDF Downloads 326
7761 Motion of an Infinitesimal Particle in Binary Stellar Systems: Kepler-34, Kepler-35, Kepler-16, Kepler-413

Authors: Rajib Mia, Badam Singh Kushvah

Abstract:

The present research was motivated by the recent discovery of binary star systems. In this paper, we use the restricted three-body problem in binary stellar systems, considering the photogravitational effects of both stars. The aim of this study is to investigate the motion of an infinitesimal mass in the vicinity of the Lagrangian points. The stability and periodic orbits of the collinear points, and the stability and trajectories of the triangular points, are studied in the stellar binary systems Kepler-34, Kepler-35, Kepler-413, and Kepler-16. A detailed comparison is made among the periodic orbits and trajectories.
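For orientation, in the classical (non-photogravitational) restricted three-body problem the triangular points form equilateral triangles with the primaries; a minimal sketch, using an illustrative mass parameter of the order of a near-equal-mass binary such as Kepler-34 (value assumed, not taken from the paper):

```python
import math

def triangular_points(mu):
    """L4/L5 of the circular restricted three-body problem, with the
    primaries at (-mu, 0) and (1 - mu, 0) in the rotating frame and
    unit separation (classical case, radiation pressure neglected)."""
    x = 0.5 - mu
    y = math.sqrt(3.0) / 2.0
    return (x, y), (x, -y)

mu = 0.49                            # illustrative near-equal-mass ratio
l4, l5 = triangular_points(mu)
d1 = math.dist(l4, (-mu, 0.0))       # distance to the first primary
d2 = math.dist(l4, (1.0 - mu, 0.0))  # distance to the second primary
```

Both distances equal the primary separation, which is the equilateral-triangle property; the photogravitational terms studied in the paper shift these points away from the classical positions.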

Keywords: exoplanetary systems, Lagrangian points, periodic orbit, restricted three body problem, stability

Procedia PDF Downloads 427
7760 Performance Enhancement of Hybrid Racing Car by Design Optimization

Authors: Tarang Varmora, Krupa Shah, Karan Patel

Abstract:

Environmental pollution and the shortage of conventional fuel are the main concerns in the transportation sector. Most vehicles use an internal combustion engine (ICE) powered by gasoline fuels, which results in the emission of toxic gases. A hybrid electric vehicle (HEV), powered by an electric machine and an ICE, is capable of reducing both the emission of toxic gases and fuel consumption. However, to build an HEV it is necessary to accommodate a motor and batteries in the vehicle along with the engine and fuel tank, so the overall weight of the vehicle increases. To improve fuel economy and acceleration, the weight of the HEV can be minimized. In this paper, a design methodology to reduce the weight of a hybrid racing car is proposed. To this end, the chassis design is optimized, and an attempt is made to obtain maximum strength with minimum material weight. The best configuration out of the three main configurations, series, parallel, and dual-mode (series-parallel), is chosen. Moreover, the most suitable types of motor, battery, braking system, steering system, and suspension system are identified. The racing car is designed and analyzed in simulation software. The safety of the vehicle is assured by performing static and dynamic analysis on the chassis frame. From the results, it is observed that the weight of the racing car is reduced by 11% without compromising safety or cost. It is believed that the proposed design and specifications can be implemented practically for manufacturing a hybrid racing car.

Keywords: design optimization, hybrid racing car, simulation, vehicle, weight reduction

Procedia PDF Downloads 289
7759 Community Based Participatory Research in Opioid Use: Design of an Informatics Solution

Authors: Sue S. Feldman, Bradley Tipper, Benjamin Schooley

Abstract:

Nearly every community in the US has been impacted by opioid-related addictions and deaths; it is a national problem that is threatening our social and economic welfare. Most believe that by tackling this problem from a prevention perspective, advances can be made toward breaking the chain of addiction. One mechanism, community based participatory research, involves the community in the prevention approach. This project combines that approach with a design science approach to develop an integrated solution. The findings suggested accountable care communities, transpersonal psychology, and social exchange theory as product kernel theories. An evaluation was conducted on a prototype.

Keywords: substance use and abuse recovery, community resource centers, accountable care communities, community based participatory research

Procedia PDF Downloads 148
7758 Two and Three Layer Lamination of Nanofiber

Authors: Roman Knizek, Denisa Karhankova, Ludmila Fridrichova

Abstract:

Owing to their exceptional properties, nanofibers and nanofiber layers are achieving an increasingly wide range of uses. Nowadays nanofibers are used mainly in the field of air filtration, where they remove submicron particles, bacteria, and viruses. Their efficiency does not change over time, and their power consumption is much lower than that of electrically charged filters. Nanofibers are primarily used for the conversion and storage of energy, in both air and liquid filtration, in food and packaging, and in protecting the environment, but also in health care, which is made possible by their newly discovered properties. However, a major problem of the nanofiber layer is its practically zero abrasion resistance; it is, therefore, necessary to laminate the nanofiber layer with another suitable material. Unfortunately, lamination of nanofiber layers is a major problem, since the nanofiber layer contains small pores through which it is very difficult for the adhesive to penetrate. Therefore, there is still only a small percentage of products with these unique fibers.

Keywords: nanofiber layer, nanomembrane, lamination, electrospinning

Procedia PDF Downloads 720
7757 A Description Logics Based Approach for Building Multi-Viewpoints Ontologies

Authors: M. Hemam, M. Djezzar, T. Djouad

Abstract:

We are interested in the problem of building an ontology in a heterogeneous organization, taking into account the different viewpoints and different terminologies of the communities in the organization. Such an ontology, which we call a multi-viewpoint ontology, confers on the same universe of discourse several partial descriptions, each relative to a particular viewpoint. In addition, these partial descriptions share, at the global level, ontological elements that constitute a consensus between the various viewpoints. In order to address this problem, we define a multi-viewpoint knowledge model based on the notions of viewpoint and ontology. The multi-viewpoint knowledge model is then used to formalize the multi-viewpoint ontology in a description logics language.

Keywords: description logic, knowledge engineering, ontology, viewpoint

Procedia PDF Downloads 308
7756 Optimization of Hot Metal Charging Circuit in a Steel Melting Shop Using Industrial Engineering Techniques for Achieving Manufacturing Excellence

Authors: N. Singh, A. Khullar, R. Shrivastava, I. Singh, A. S. Kumar

Abstract:

Steel forms the basis of any modern society and is essential to economic growth. India's annual crude steel production has seen a consistent increase over the past years and is poised to grow to 300 million tons per annum by 2030-31 from the current level of 110-120 million tons per annum. The steel industry is highly capital-intensive and, to remain competitive, it is imperative that it invests in operational excellence. Due to the inherent nature of the industry, there is a large amount of variability in its supply chain, both internally and externally. The production and productivity of a steel plant are greatly affected by the bottlenecks present in material flow logistics. The internal logistics, consisting of the transport of liquid metal within a steel melting shop (SMS), present an opportunity to increase throughput with marginal capital investment. The study was carried out at one of the three SMS's of an integrated steel plant located in the eastern part of India. The objective of this study was to identify means to optimize SMS hot metal logistics through the application of industrial engineering techniques. The study also covered the identification of non-value-added activities and proposed methods to eliminate delays and improve the throughput of the SMS.

Keywords: optimization, steel making, supply chain, throughput enhancement, workforce productivity

Procedia PDF Downloads 115
7755 Heat and Mass Transfer in a Saturated Porous Medium Confined in Cylindrical Annular Geometry

Authors: A. Ja, J. Belabid, A. Cheddadi

Abstract:

This paper reports the numerical simulation of double-diffusive natural convection flows within a horizontal annulus filled with a saturated porous medium. The analysis concerns the influence of the different parameters governing the problem, namely the Rayleigh number Ra, the Lewis number Le, and the buoyancy ratio N, on the heat and mass transfer and on the flow structure, for a fixed radius ratio R = 2. The numerical model used for the discretization of the dimensionless governing equations is based on the finite difference method, using the ADI scheme. The study is focused on steady-state solutions in the cooperating situation.

Keywords: natural convection, double-diffusion, porous medium, annular geometry, finite differences

Procedia PDF Downloads 339
7754 Fuzzy Linear Programming Approach for Determining the Production Amounts in Food Industry

Authors: B. Güney, Ç. Teke

Abstract:

In recent years, rapid and correct decision making has become crucial for both people and enterprises. However, uncertainty makes decision making difficult, and fuzzy logic is used for coping with this situation. Thus, fuzzy linear programming models have been developed in order to handle uncertainty in the objective function and the constraints. In this study, a problem of a factory in the food industry is investigated, the required data are obtained, and the problem is formulated as a fuzzy linear programming model. The model is solved using the Zimmermann approach, which is one of the solution approaches for fuzzy linear programming. As a result, the solution gives the amount of production for each product type in order to gain maximum profit.
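The Zimmermann max-min idea can be sketched on an invented two-product example: the fuzzy profit goal and the fuzzy resource limits are turned into membership functions, and the production plan maximizes the smallest membership. The profit coefficients, aspiration level, and tolerances below are illustrative, and a coarse grid search stands in for solving the equivalent crisp LP:

```python
# Two products with unit profits 3 and 2; fuzzy goal and resource limits.
def clip01(v):
    return max(0.0, min(1.0, v))

def mu_profit(x1, x2, target=14.0, tol=4.0):
    """Membership of the profit goal: 1 at >= target, 0 at target - tol."""
    return clip01((3*x1 + 2*x2 - (target - tol)) / tol)

def mu_resource(used, limit, tol):
    """Membership of a softly violated resource constraint:
    1 at <= limit, 0 at limit + tol."""
    return clip01(((limit + tol) - used) / tol)

def satisfaction(x1, x2):
    """Overall degree of satisfaction = the weakest membership."""
    return min(mu_profit(x1, x2),
               mu_resource(x1 + x2, limit=5.0, tol=1.0),     # labour
               mu_resource(2*x1 + x2, limit=8.0, tol=2.0))   # machine

# Grid search over production amounts, standing in for the crisp LP
# that maximises lambda in Zimmermann's formulation.
grid = [i / 10 for i in range(0, 81)]
best = max(((x1, x2) for x1 in grid for x2 in grid),
           key=lambda p: satisfaction(*p))
lam = satisfaction(*best)
```

In practice the crisp equivalent (maximize lambda subject to each membership being at least lambda) is solved with an ordinary LP solver rather than a grid.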

Keywords: food industry, fuzzy linear programming, fuzzy logic, linear programming

Procedia PDF Downloads 645
7753 Using Geopolymer Technology for the Stabilization and Reutilization of Expansive Slag

Authors: W. H. Lee, T. W. Cheng, K. Y. Lin, S. W. Huang, Y. C. Ding

Abstract:

Basic oxygen furnace (BOF) slag and electric arc furnace (EAF) slag are by-products of iron making and steel making, each produced in quantities of over 100 million tons annually in Taiwan. These slags have excellent engineering properties, such as high hardness and density, high compressive strength, and a low abrasion ratio, and can replace natural aggregate in building materials. However, both BOF and EAF slag have an expansion problem, because they contain free lime. The purpose of this study was to stabilize BOF and EAF slag using geopolymer technology, in the hope of preventing and solving the expansion problem. The experimental results showed that geopolymer technology can successfully solve and prevent the expansion problem. The main properties of the stabilized slags are analyzed with regard to their use as building materials, and an autoclave is used to study the volume stability of the specimens. The compressive strength of geopolymer mortar with BOF/EAF slag reached over 21 MPa after curing for 28 days. After autoclave testing, the volume expansion did not exceed 0.2%, and the compressive strength even grew to over 35 MPa. These results were successfully applied in a ready-mixed concrete plant, with the same experimental results as at laboratory scale. These results give encouragement that stabilized and reutilized BOF/EAF slag could feasibly replace natural fine aggregate by using geopolymer technology.

Keywords: BOF slag, EAF slag, autoclave test, geopolymer

Procedia PDF Downloads 131
7752 A Narrative Study of Resilience and the Response to Adversity

Authors: Yun Hang Stanley Cheung

Abstract:

In recent years, many educators and entrepreneurs have suggested that students' and workers' ability to respond to adversity is very important, as it affects their problem-solving strategies and their ultimate success in career or life. Resilience is discussed as the process of bouncing back and the ability to adapt well in response to adversity; being resilient does not mean living without any stress and difficulty, but growing and thriving under pressure. The purpose of this study is to describe the process of resilience and the response to adversity. The narrative inquiry aims at understanding the experiential process of responding to adversity and the associated problem-solving strategies (such as emotion control, motivation, and the decision-making process), as well as turning the experience into a life story, which may be evaluated by its teller and its listeners. The narrative study describes the researcher's own experience of responding to adversity during recovery from serious burn injuries sustained in a hill fire at the age of 12, as well as the adversities and obstacles related to the tragedy after the physical recovery. Sense-Making Theory and McCormack's Lenses were used for the constructive perspective and for data analysis. To conclude, this study describes a life story of fighting adversity, and the narratives yield some suggestions, which point out that positive thinking is necessary to build up resilience and the ability to respond immediately to adversity. Some problem-solving strategies toward adversity are also discussed, which are helpful for resilience education for youth and young adults.

Keywords: adversity response, life story, narrative inquiry, resilience

Procedia PDF Downloads 304
7751 Balancing an Inverted Pendulum System Using Sliding Mode Control

Authors: Sheren H. Salah, Ahmed Y. Ben Sasi

Abstract:

The inverted pendulum system is a classic control problem that is used in universities around the world. It is a suitable process for testing prototype controllers due to its high non-linearity and lack of stability. The inverted pendulum represents a challenging control problem, as it continually moves toward an uncontrolled state. This paper presents the possibility of balancing an inverted pendulum system using sliding mode control (SMC). The goal is to determine which control strategy delivers better performance with respect to the pendulum's angle and the cart's position; therefore, a proportional-integral-derivative (PID) controller is used for comparison. The results show that SMC produced a better response than PID control in both normal and noisy systems.
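The SMC idea can be sketched on a pendulum linearized about the upright position (cart dynamics omitted; the gains and time step below are invented for illustration, not the paper's design): the control cancels the unstable model term and then switches across a sliding surface to drive the state onto it.

```python
import math

# Simplified pendulum linearised about upright: theta_ddot = a*theta + u,
# with a = g/l for unit length (open-loop unstable).
a, c, k, dt = 9.81, 2.0, 1.0, 1e-3

def smc(theta, omega):
    """Sliding mode law: cancel the model term, then switch across the
    sliding surface s = omega + c*theta."""
    s = omega + c * theta
    return -a * theta - c * omega - k * math.copysign(1.0, s)

theta, omega = 0.3, 0.0        # initial tilt of 0.3 rad, at rest
for _ in range(5000):          # 5 s of semi-implicit Euler simulation
    u = smc(theta, omega)
    omega += (a * theta + u) * dt
    theta += omega * dt
# theta is driven close to zero despite the open-loop instability
```

Once the state reaches the surface, s = 0 implies the first-order decay theta_dot = -c*theta, so the angle converges exponentially; the switching term is what produces the characteristic SMC robustness (and chattering).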

Keywords: inverted pendulum (IP), proportional-integral derivative (PID), sliding mode control (SMC), systems and control engineering

Procedia PDF Downloads 583
7750 Thermal Analysis and Optimization of a High-Speed Permanent Magnet Synchronous Motor with Toroidal Windings

Authors: Yuan Wan, Shumei Cui, Shaopeng Wu

Abstract:

Toroidal windings are used to reduce the axial length of the motor, so as to suit applications with severe restrictions on axial length. However, slotting in the outer edge of the stator decreases the heat-dissipation capacity of the water cooling of the housing. Besides, the windings in the outer slots increase the copper loss, which further increases the difficulty of heat dissipation. At present, carbon-fiber composite retaining sleeves are increasingly mounted over the magnets to ensure rotor strength at high speeds. Due to the poor thermal conductivity of the carbon-fiber sleeve, the cooling of the rotor becomes very difficult, which may result in the irreversible demagnetization of the magnets at excessively high temperatures. It is therefore necessary to analyze the temperature rise of such a motor. This paper builds a computational fluid dynamics (CFD) model of a toroidal-winding high-speed permanent magnet synchronous motor (PMSM) with water cooling of the housing and forced air cooling of the rotor. Thermal analysis was carried out based on the model, and the factors that affect the temperature rise were investigated. A thermal optimization of the prototype was then achieved. Finally, a small-size prototype was manufactured and the thermal analysis results were verified.

Keywords: thermal analysis, temperature rise, toroidal windings, high-speed PMSM, CFD

Procedia PDF Downloads 486
7749 Decision Making in Medicine and Treatment Strategies

Authors: Kamran Yazdanbakhsh, Somayeh Mahmoudi

Abstract:

Three reasons argue for the use of decision theory in medicine: 1. The growth and complexity of medical knowledge make it difficult to process treatment information effectively without resorting to sophisticated analytical methods, especially when it comes to detecting errors and identifying treatment opportunities in large databases. 2. There is wide geographic variability in medical practice. In a context where medical costs are borne, at least in part, by the patient, this variability raises doubts about the relevance of the choices made by physicians. These differences are generally attributed to differing estimates of the probabilities of treatment success and to differing assessments of the value placed on success or failure. Without explicit decision criteria, it is difficult to identify precisely the sources of these variations in treatment. 3. Beyond the principle of informed consent, patients need to be involved in decision-making. For this, the decision process should be explained and broken down. A decision problem consists of selecting the best option among a set of choices. The question is what is meant by "best option", that is, which criteria should guide the choice. The purpose of decision theory is to answer this question. The systematic use of decision models allows us to better understand differences in medical practice and facilitates the search for consensus. In this regard, there are three types of situations: certain situations, risky situations, and uncertain situations. 1. In certain situations, the consequence of each decision is certain. 2. In risky situations, each decision can have several consequences, and the probability of each consequence is known. 3. In uncertain situations, each decision can have several consequences, but their probabilities are not known. Our aim in this article is to show how decision theory can usefully be mobilized to meet the needs of physicians.
Decision theory can make decisions more transparent: first, by systematically clarifying the data of the problem, and second, by stating a few basic principles that should guide the choice. Once the problem is clarified, decision theory provides operational tools to represent the available information and determine patient preferences, thus assisting both patient and doctor in their choices.
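The three situation types described above can be illustrated with a minimal sketch. The treatment names, utilities, and probabilities below are hypothetical, and the two decision rules (expected utility for risk, maximin for uncertainty) are standard textbook criteria, not the article's own model:

```python
# Two treatment options, each with possible outcomes scored as utilities
# (0 = failure, 1 = full recovery); the numbers are illustrative only.
treatments = {
    "surgery":    [(0.70, 1.0), (0.30, 0.0)],   # (probability, utility)
    "medication": [(0.90, 0.6), (0.10, 0.2)],
}

def expected_utility(outcomes):
    """Decision under risk: the probability of each consequence is known."""
    return sum(p * u for p, u in outcomes)

def maximin_utility(outcomes):
    """Decision under uncertainty: probabilities unknown, compare worst cases."""
    return min(u for _, u in outcomes)

best_risk = max(treatments, key=lambda t: expected_utility(treatments[t]))
best_uncertain = max(treatments, key=lambda t: maximin_utility(treatments[t]))
print(best_risk)       # -> surgery (highest expected utility: 0.70 vs. 0.56)
print(best_uncertain)  # -> medication (best worst-case outcome: 0.2 vs. 0.0)
```

Note that the two criteria select different treatments here, which is exactly why making the decision criterion explicit matters.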

Keywords: decision making, medicine, treatment strategies, patient

Procedia PDF Downloads 575
7748 Consensus Reaching Process and False Consensus Effect in a Problem of Portfolio Selection

Authors: Viviana Ventre, Giacomo Di Tollo, Roberta Martino

Abstract:

The portfolio selection problem includes the evaluation of many criteria that are difficult to compare directly and is characterized by uncertain elements. The portfolio selection problem can be modeled as a group decision problem in which several experts are invited to present their assessments. In this context, it is important to study and analyze the process of reaching consensus among group members. Indeed, due to the various diversities among experts, reaching consensus is not necessarily simple or easily achievable. Moreover, the concept of consensus is accompanied by the concept of false consensus, which is particularly interesting in the dynamics of group decision-making processes. False consensus can alter the evaluation and selection phase of the alternatives and is the consequence of the decision maker's inability to recognize that their preferences are conditioned by subjective structures. The present work aims to investigate the dynamics of consensus attainment in a group decision problem in which equivalent portfolios are proposed. In particular, the study aims to analyze the impact of the subjective structure of the decision maker during the evaluation and selection phase of the alternatives. Therefore, the experimental framework is divided into three phases. In the first phase, experts are asked to evaluate the characteristics of all portfolios individually, without peer comparison, arriving independently at the selection of the preferred portfolio. The experts' evaluations are used to obtain individual Analytical Hierarchical Processes that define the weight each expert gives to each criterion with respect to the proposed alternatives. This step provides insight into how the decision maker's decision process develops, step by step, from goal analysis to alternative selection. The second phase includes the description of the decision maker's state through Markov chains.
In fact, the individual weights obtained in the first phase can be reviewed and described as transition weights from one state to another. Thus, with the construction of the individual transition matrices, the possible next state of the expert is determined from the individual weights at the end of the first phase. Finally, the experts meet, and the process of reaching consensus is analyzed by considering the single individual state obtained at the previous stage and the false consensus bias. The work contributes to the study of the impact of subjective structures, quantified through the Analytical Hierarchical Process, and how they combine with the false consensus bias in group decision-making dynamics and the consensus reaching process in problems involving the selection of equivalent portfolios.
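The first-phase step of deriving an expert's criterion weights can be sketched as follows. The pairwise comparison matrix below uses hypothetical judgments on three portfolio criteria (say, return, risk, and liquidity) on Saaty's 1-9 scale, and the geometric-mean method is one common approximation of the AHP priority vector, not necessarily the exact procedure used in the study:

```python
import math

# Hypothetical pairwise comparisons: entry [i][j] says how strongly
# criterion i is preferred over criterion j (reciprocal matrix).
pairwise = [
    [1.0, 3.0, 5.0],   # return vs. return, risk, liquidity
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]

# Geometric mean of each row, then normalize to obtain the weight vector.
geo = [math.prod(row) ** (1 / len(row)) for row in pairwise]
weights = [g / sum(geo) for g in geo]
print([round(w, 3) for w in weights])  # weights sum to 1; return dominates
```

These weight vectors are what the second phase then reinterprets as transition weights between expert states.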

Keywords: analytical hierarchical process, consensus building, false consensus effect, markov chains, portfolio selection problem

Procedia PDF Downloads 90
7747 Internal and External Validity in Experimental Economics

Authors: Helena Chytilova, Robin Maialeh

Abstract:

Experimental economics is subject to criticism regarding the frequently discussed trade-off between internal and external validity requirements, a trade-off which seems to be critically flawed. The incompatibility between the trade-off condition and the condition of internal validity as a prerequisite for external validity is presented. In addition, the imprecise concept of artificiality, found rather to improve external validity, seems to strengthen the illusory status of the tension between external and internal validity. Internal validity is further analysed with regard to the Duhem-Quine problem, where the unpredictability argument is significantly weakened through the application of inductivism within an illustrative hypothetical-deductive model. The discussion outlined above partially weakens critical arguments related to the robustness of results in experimental economics, provided a perfectly controlled experimental environment is secured.

Keywords: Duhem-Quine problem, external validity, inductivism, internal validity

Procedia PDF Downloads 279
7746 If You Can't Teach Yourself, No One Can

Authors: Timna Mayer

Abstract:

This paper explores the vast potential of self-directed learning in violin pedagogy. Based in practice and drawing on concepts from neuropsychology, the author, a violinist and teacher, outlines five learning principles. Self-directed learning is defined as an ongoing process based on problem detection, definition, and resolution. The traditional roles of teacher and student are reimagined within this context. A step-by-step guide to applied self-directed learning suggests a model for both teachers and students that realizes student independence in the classroom, leading to higher-level understanding and more robust performance. While the value of self-directed learning is well-known in general pedagogy, this paper is novel in applying the approach to the study of musical performance, a field which is currently dominated by habit and folklore, rather than informed by science.

Keywords: neuropsychology and musical performance, self-directed learning, strategic problem solving, violin pedagogy

Procedia PDF Downloads 145
7745 Test Suite Optimization Using an Effective Meta-Heuristic BAT Algorithm

Authors: Anuradha Chug, Sunali Gandhi

Abstract:

Regression testing is a very expensive and time-consuming process carried out to ensure the validity of modified software. Because resources are often insufficient to re-execute all test cases in a time-constrained environment, efforts are underway to generate test data automatically, without human effort. Many search-based techniques have been proposed to generate efficient, effective, and optimized test data so that the overall cost of software testing can be minimized. The generated test data should be able to uncover all potential lapses that exist in the software or product. Inspired by the natural behavior of bats searching for food, the current study employed a meta-heuristic, search-based bat algorithm to optimize the test data on the basis of certain parameters without compromising its effectiveness. Mathematical functions are also applied to effectively filter out redundant test data. As many as 50 Java programs are used to check the effectiveness of the proposed test data generation, and it was found that an 86% saving in testing effort can be achieved using the bat algorithm while covering 100% of the software code under test. The bat algorithm was found to be more efficient in terms of simplicity and flexibility when its results were compared with other nature-inspired algorithms such as the Firefly Algorithm (FA), Hill Climbing (HC), and Ant Colony Optimization (ACO). The output of this study would be useful to testers, as they can achieve 100% path coverage with a minimum number of test cases.
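The bat algorithm's core loop can be sketched in a few lines. The version below minimizes a simple benchmark function rather than operating on real test-suite data, and the population size, frequency range, loudness, and pulse rate are illustrative constants, not the study's tuned parameters:

```python
import random

def sphere(x):
    """Simple benchmark objective standing in for a test-suite cost function."""
    return sum(xi * xi for xi in x)

def bat_optimize(obj, dim=2, n_bats=15, iters=200, fmin=0.0, fmax=2.0):
    random.seed(1)  # deterministic run for illustration
    bats = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_bats)]
    vel = [[0.0] * dim for _ in range(n_bats)]
    best = min(bats, key=obj)[:]
    loudness, pulse_rate = 0.5, 0.5
    for _ in range(iters):
        for i in range(n_bats):
            f = fmin + (fmax - fmin) * random.random()   # frequency tuning
            for d in range(dim):
                vel[i][d] += (bats[i][d] - best[d]) * f  # echolocation-style move
                bats[i][d] += vel[i][d]
            if random.random() > pulse_rate:             # local walk around best
                bats[i] = [b + 0.01 * random.gauss(0, 1) for b in best]
            # accept an improving solution with probability = loudness
            if obj(bats[i]) < obj(best) and random.random() < loudness:
                best = bats[i][:]
    return best

best = bat_optimize(sphere)
print(best, sphere(best))  # a point near the minimum at the origin
```

A full implementation would also decrease loudness and increase pulse rate over time, as in Yang's original formulation.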

Keywords: regression testing, test case selection, test case prioritization, genetic algorithm, bat algorithm

Procedia PDF Downloads 373
7744 A Quantum Leap: Developing Quantum Semi-Structured Complex Numbers to Solve the “Division by Zero” Problem

Authors: Peter Jean-Paul, Shanaz Wahid

Abstract:

The problem of division by zero can be stated as: “what is the value of 0 x 1/0?” This expression has been considered undefined by mathematicians because it can have two equally valid solutions, either 0 or 1. Recently, the semi-structured complex number set was invented to solve “division by zero”. However, whilst the number set had some merits, it was considered to have a poor theoretical foundation and did not provide a quality solution to “division by zero”. Moreover, the set lacked consistency in simple algebraic calculations, producing contradictory results when dividing by zero. To overcome these issues, this research starts by treating the expression "0 x 1/0" as a quantum mechanical system that produces two entangled results, 0 and 1. Dirac notation (a tool from quantum mechanics) was then used to redefine the unstructured unit p in semi-structured complex numbers so that p represents the superposition of two results (0 and 1) and collapses into a single value when used in algebraic expressions. In the process, this paper describes a new number set called Quantum Semi-structured Complex Numbers that provides a valid solution to the problem of “division by zero”. This research shows that this new set (1) forms a “Field”, (2) can produce consistent results when solving division by zero problems, and (3) can be used to accurately describe systems whose mathematical descriptions involve division by zero. This research served to provide a firm foundation for Quantum Semi-structured Complex Numbers and support their practical use.
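The redefinition of the unstructured unit p can be sketched in Dirac notation as follows (an illustrative reading of the construction, not necessarily the authors' exact formulation):

```latex
|p\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad |\alpha|^{2} + |\beta|^{2} = 1,
```

so that the expression 0 x 1/0 is represented as a superposition of its two candidate values and collapses to a single value (0 or 1) only when evaluated within an algebraic expression.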

Keywords: division by zero, semi-structured complex numbers, quantum mechanics, Hilbert space, Euclidean space

Procedia PDF Downloads 151
7743 Intelligent Crowd Management Systems in Trains

Authors: Sai S. Hari, Shriram Ramanujam, Unnati Trivedi

Abstract:

The advent of mass transit systems such as rail, metro, and maglev, along with various other rail-based transport, has satisfied the demand for public transport for the masses to a great extent. However, meeting the demand does not necessarily mean it is managed efficiently, elegantly, or in an encapsulating manner. The primary problem identified, and the one this paper seeks to solve, is the haphazard manner in which compartments are occupied. The problem is solved by comparing an empty train with an occupied one: the pixel data of an occupied train is compared with the pixel data of an empty train using the Canny edge detection technique. After the comparison, the system informs passengers at subsequent stops which compartments are unoccupied or have low occupancy, thus redirecting them and preventing overcrowding.
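The occupancy check described above can be sketched as a comparison of edge density between an occupied frame and an empty reference frame. A real deployment would apply OpenCV's Canny detector to camera images; here a tiny synthetic grid and a plain gradient threshold stand in for both, so the numbers are illustrative only:

```python
def edge_count(img, threshold=1):
    """Count pixels whose horizontal or vertical gradient exceeds a threshold."""
    h, w = len(img), len(img[0])
    edges = 0
    for y in range(h - 1):
        for x in range(w - 1):
            gx = abs(img[y][x + 1] - img[y][x])
            gy = abs(img[y + 1][x] - img[y][x])
            if max(gx, gy) > threshold:
                edges += 1
    return edges

empty = [[0] * 8 for _ in range(8)]       # uniform empty compartment
occupied = [row[:] for row in empty]
for y in range(2, 5):                     # a passenger-shaped blob of bright pixels
    for x in range(3, 6):
        occupied[y][x] = 9

# Many more edges than the empty reference suggests the compartment is occupied.
low_occupancy = edge_count(occupied) - edge_count(empty) < 5
print(low_occupancy)  # -> False: this compartment would not be recommended
```

The same thresholding idea, applied per compartment, yields the list of low-occupancy compartments to announce at the next stop.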

Keywords: canny edge detection, comparison, encapsulation, redirection

Procedia PDF Downloads 327
7742 A Review on Big Data Movement with Different Approaches

Authors: Nay Myo Sandar

Abstract:

With the growth of technologies and applications, large amounts of data are being produced at an increasing rate by various sources such as social media networks, sensor devices, and other information-serving devices. This massive, complex, and exponentially growing collection of datasets is called big data. Traditional database systems cannot store and process such data due to its size and complexity. Consequently, cloud computing is a potential solution for data storage and processing, since it can provide a pool of server and storage resources. However, moving large amounts of data to and from the cloud is a challenging issue, since large data sizes can incur high latency. With respect to the big data movement problem, this paper reviews the literature of previous works, discusses research issues, and identifies approaches for dealing with the problem.

Keywords: big data, cloud computing, big data movement, network techniques

Procedia PDF Downloads 78
7741 Total Controllability of the Second Order Nonlinear Differential Equation with Delay and Non-Instantaneous Impulses

Authors: Muslim Malik, Avadhesh Kumar

Abstract:

A stronger concept of exact controllability which is called Total Controllability is introduced in this manuscript. Sufficient conditions have been established for the total controllability of a control problem, governed by second order nonlinear differential equation with delay and non-instantaneous impulses in a Banach space X. The results are obtained using the strongly continuous cosine family and Banach fixed point theorem. Also, the total controllability of an integrodifferential problem is investigated. At the end, some numerical examples are provided to illustrate the analytical findings.
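A generic template for the class of systems studied can be written as follows (an illustrative form using standard notation for non-instantaneous impulses; the manuscript's exact equations may differ):

```latex
x''(t) = A\,x(t) + f\bigl(t,\, x(t),\, x(t-\tau)\bigr) + B\,u(t),
\qquad t \in (s_i, t_{i+1}],\; i = 0, 1, \dots, N,
```
```latex
x(t) = g_i\bigl(t, x(t)\bigr), \quad t \in (t_i, s_i],
\qquad x(0) = x_0, \quad x'(0) = x_1,
```

where A generates the strongly continuous cosine family, u is the control, τ is the delay, and the maps g_i describe the non-instantaneous impulses acting on the intervals (t_i, s_i].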

Keywords: Banach fixed point theorem, non-instantaneous impulses, strongly continuous cosine family, total controllability

Procedia PDF Downloads 294
7740 Pareto Optimal Material Allocation Mechanism

Authors: Peter Egri, Tamas Kis

Abstract:

Scheduling problems have been studied by algorithmic mechanism design research from the beginning. This paper focuses on a practically important but theoretically rather neglected field: the project scheduling problem, where jobs connected by precedence constraints compete for various nonrenewable resources, such as materials. Although the centralized problem can be solved in polynomial time by applying the algorithm of Carlier and Rinnooy Kan from the Eighties, obtaining materials in a decentralized environment is usually far from optimal. It can be observed in practical production scheduling situations that project managers tend to cache the required materials as soon as possible in order to avoid later delays due to material shortages. This greedy practice usually leads both to excess stocks for some projects and materials and, simultaneously, to shortages for others. The aim of this study is to develop a model for the material allocation problem of a production plant, where a central decision maker—the inventory—should assign the resources arriving at different points in time to the jobs. Since the actual due dates are not known by the inventory, the mechanism design approach is applied with the projects as the self-interested agents. The goal of the mechanism is to elicit the required information and allocate the available materials such that the maximal tardiness among the projects is minimized. It is assumed that, except for the due dates, the inventory is familiar with all other parameters of the problem. A further requirement is that, due to practical considerations, monetary transfer is not allowed. Therefore, a mechanism without money is sought, which excludes some widely applied solutions such as the Vickrey–Clarke–Groves scheme. In this work, a type of Serial Dictatorship Mechanism (SDM) is presented for the studied problem, including a polynomial-time algorithm for computing the material allocation.
The resulting mechanism is both truthful and Pareto optimal. Thus, randomization over the possible priority orderings of the projects yields a universally truthful and Pareto optimal randomized mechanism. However, it is shown that, in contrast to problems like the many-to-many matching market, not every Pareto optimal solution can be generated with an SDM. In addition, no performance guarantee can be given compared to the optimal solution, so this approximation characteristic is investigated with an experimental study. All in all, the current work studies a practically relevant scheduling problem and presents a novel truthful material allocation mechanism which eliminates the potential benefit of the greedy behavior that negatively influences the outcome. The resulting allocation is also shown to be Pareto optimal, which is the most widely used criterion describing a necessary condition for a reasonable solution.
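The serial-dictatorship idea can be sketched in a few lines. The version below is a deliberate simplification: projects in a fixed priority order each claim the material units they reported needing, while the paper's mechanism additionally uses the elicited due dates to minimize maximal tardiness. All project names, due dates, and stock levels are hypothetical:

```python
# Each project reports (truthfully, under SDM) the due dates of its jobs,
# each job requiring one unit of material.
reported_due_dates = {
    "project_A": [3, 7],
    "project_B": [2],
    "project_C": [5, 6],
}
priority_order = ["project_B", "project_A", "project_C"]  # e.g. drawn at random
stock = 4  # units of material currently held by the inventory

allocation = {p: 0 for p in reported_due_dates}
for project in priority_order:           # each "dictator" in turn...
    need = len(reported_due_dates[project])
    take = min(need, stock)              # ...takes what it needs, if available
    allocation[project] = take
    stock -= take

print(allocation)  # -> {'project_A': 2, 'project_B': 1, 'project_C': 1}
print(stock)       # -> 0
```

Truthfulness follows because a project's report cannot improve its own position in the priority order, and randomizing `priority_order` gives the universally truthful variant mentioned above.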

Keywords: material allocation, mechanism without money, polynomial-time mechanism, project scheduling

Procedia PDF Downloads 328
7739 A Digital Twin Approach to Support Real-time Situational Awareness and Intelligent Cyber-physical Control in Energy Smart Buildings

Authors: Haowen Xu, Xiaobing Liu, Jin Dong, Jianming Lian

Abstract:

Emerging smart buildings often employ cyberinfrastructure, cyber-physical systems, and Internet of Things (IoT) technologies to increase the automation and responsiveness of building operations for better energy efficiency and lower carbon emissions. These operations include the control of Heating, Ventilation, and Air Conditioning (HVAC) and lighting systems, which are often considered a major source of energy consumption in both commercial and residential buildings. Developing energy-saving control models for optimizing HVAC operations usually requires the collection of high-quality instrumental data from iterations of in-situ building experiments, which can be time-consuming and labor-intensive. This abstract describes a digital twin approach to automate building energy experiments for optimizing HVAC operations through the design and development of an adaptive web-based platform. The platform is created to enable (a) automated data acquisition from a variety of IoT-connected HVAC instruments, (b) real-time situational awareness through domain-based visualizations, (c) adaptation of HVAC optimization algorithms based on experimental data, (d) sharing of experimental data and model predictive controls through web services, and (e) cyber-physical control of individual instruments in the HVAC system using outputs from different optimization algorithms. Through the digital twin approach, we aim to replicate a real-world building and its HVAC systems in an online computing environment to automate the development of building-specific model predictive controls and collaborative experiments in buildings located in different climate zones in the United States. We present two case studies to demonstrate our platform’s capability for real-time situational awareness and cyber-physical control of the HVAC in the Flexible Research Platforms on the Oak Ridge National Laboratory (ORNL) main campus.
Our platform is developed using adaptive and flexible architecture design, rendering the platform generalizable and extendable to support HVAC optimization experiments in different types of buildings across the nation.
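The acquire-optimize-actuate loop behind capabilities (a) and (e) can be sketched as follows. The class and function names are hypothetical stand-ins, and the trivial comfort-band rule stands in for the platform's model predictive controls:

```python
class FakeThermostat:
    """Stand-in for an IoT-connected HVAC instrument."""
    def __init__(self, temp):
        self.temp = temp          # current zone temperature (degrees C)
        self.setpoint = None      # last commanded setpoint

    def read(self):
        return self.temp

    def apply_setpoint(self, sp):
        self.setpoint = sp

def choose_setpoint(reading, comfort=(20.0, 24.0)):
    """Trivial comfort-band rule standing in for an MPC output."""
    low, high = comfort
    if reading < low:
        return low                # heat up into the comfort band
    if reading > high:
        return high               # cool down into the comfort band
    return reading                # already comfortable; hold

# One pass of the digital-twin control loop over several zones:
# read each instrument, compute a control, push it back.
zones = {"zone_1": FakeThermostat(18.5), "zone_2": FakeThermostat(26.0)}
for name, device in zones.items():
    device.apply_setpoint(choose_setpoint(device.read()))

print({n: d.setpoint for n, d in zones.items()})
# -> {'zone_1': 20.0, 'zone_2': 24.0}
```

In the actual platform, the read and apply steps would go through web services to the building's instruments, and the controller would be replaced by the experiment-adapted optimization algorithms.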

Keywords: energy-saving buildings, digital twins, HVAC, cyber-physical system, BIM

Procedia PDF Downloads 98