Search results for: uncertainty principle
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2082

1812 Appearance-Based Discrimination in a Workplace: An Emerging Problem for Labor Law Relationships

Authors: Irmina Miernicka

Abstract:

Nowadays, dress codes and widely understood appearance are becoming more important in the workplace. They are often used to standardize the image of an employer, to communicate a corporate image and to ensure that customers can easily identify it. They are also a way to build the employer's professional image. Additionally, in many cases, an employer will introduce a dress code for health and safety reasons. Employers increasingly oblige employees to follow certain rules concerning their clothing, grooming, make-up, body art or even weight. An important research problem is to find the limits of the employer's interference with the external appearance of employees. These limits are primarily determined by two main obligations of the employer, i.e., the obligation to respect the employee's personal rights and the principle of equal treatment and non-discrimination in employment. It should also be remembered that the limits of the employer's interference will be different when certain rules concerning the employee's appearance result directly from the provisions of laws and other acts of universally binding law (workwear, official clothing, and uniforms). The analysis of this issue was based on literature and jurisprudence, both domestic and foreign, including U.S. and European case law, and led the author to put forward a thesis that there are four main principles which will protect the employer from an allegation of discrimination. First, the principle of adequacy: requirements regarding dress code must be appropriate to the position and type of work performed by the employee. Second, in accordance with the purpose limitation principle, an employer may introduce certain requirements regarding the appearance of employees if there is a legitimate, objective justification for this (such as work safety or the type of work performed), not dictated by the employer's subjective feelings and preferences. Third, these requirements must not place an excessive burden on workers or be disproportionate in relation to the employer's objective (principle of proportionality). Fourth, the employer should also ensure that the requirements imposed in the workplace are equally burdensome for, and equally enforced against, all groups of employees; otherwise, it may expose itself to allegations of discrimination based on sex or age. At the same time, it is also possible to differentiate the situation of some employees if these differences are small, reflect established habits and traditions, and employees are obliged to maintain the same level of professionalism in their positions. Although this subject may seem insignificant, the frequent application of dress codes and the increasing awareness of both employees and employers indicate that its legal aspects need to be thoroughly analyzed. Many legal cases brought before U.S. and European courts show that employees seek legal protection when they consider that their rights have been violated by a dress code introduced in the workplace.

Keywords: labor law, the appearance of an employee, discrimination in the workplace, dress code in a workplace

Procedia PDF Downloads 125
1811 Resistance and Sub-Resistances of RC Beams Subjected to Multiple Failure Modes

Authors: F. Sangiorgio, J. Silfwerbrand, G. Mancini

Abstract:

Geometric and mechanical properties all influence the resistance of RC structures and may, in certain combinations of property values, increase the risk of a brittle failure of the whole system. This paper presents a statistical and probabilistic investigation of the resistance of RC beams designed according to Eurocodes 2 and 8 and subjected to multiple failure modes, under both the natural variation of material properties and the uncertainty associated with cross-section and transverse reinforcement geometry. A full probabilistic model based on the JCSS Probabilistic Model Code is derived. Different beams are studied through material nonlinear analysis via Monte Carlo simulations. The resistance model is consistent with Eurocode 2. Both a multivariate statistical evaluation and a data clustering analysis of the outcomes are then performed. Results show that the ultimate load behaviour of RC beams subjected to flexural and shear failure modes seems to be mainly influenced by the combination of the mechanical properties of both the longitudinal reinforcement and the stirrups, and by the tensile strength of concrete, the latter of which appears to affect the overall response of the system in a nonlinear way. The model uncertainty of the resistance model used in the analysis undoubtedly plays an important role in interpreting the results.
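
A minimal illustration of this kind of Monte Carlo resistance study is sketched below; the rectangular-section flexural resistance formula and the distribution parameters are generic textbook assumptions for illustration, not the JCSS/Eurocode 2 model used in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n_sim = 100_000

# Illustrative random variables (lognormal strengths, normal geometry)
fy = rng.lognormal(mean=np.log(500e6), sigma=0.05, size=n_sim)   # steel yield [Pa]
fc = rng.lognormal(mean=np.log(38e6),  sigma=0.15, size=n_sim)   # concrete strength [Pa]
b  = rng.normal(0.30, 0.005, n_sim)                              # section width [m]
d  = rng.normal(0.55, 0.010, n_sim)                              # effective depth [m]
As = 4 * np.pi * (0.020 / 2) ** 2                                # four 20 mm bars [m^2]

# Textbook flexural resistance of a singly reinforced rectangular section
a = As * fy / (0.85 * fc * b)          # depth of the equivalent stress block
MR = As * fy * (d - a / 2)             # bending resistance [N*m]

print(f"mean M_R = {MR.mean()/1e3:.1f} kNm, CoV = {MR.std()/MR.mean():.3f}")
print(f"5% fractile = {np.quantile(MR, 0.05)/1e3:.1f} kNm")
```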

Keywords: modelling, Monte Carlo simulations, probabilistic models, data clustering, reinforced concrete members, structural design

Procedia PDF Downloads 472
1810 Production and Leftovers Usage Policies to Minimize Food Waste under Uncertain and Correlated Demand

Authors: Esma Birisci, Ronald McGarvey

Abstract:

One of the common problems in the food service industry is demand uncertainty. This research presents a multi-criteria optimization approach to identify the efficient frontier of points lying between the minimum-waste and minimum-shortfall solutions in an uncertain demand environment. It also addresses correlation across demands for items (e.g., hamburgers are often demanded with french fries). Reducing overproduction food waste (and its corresponding environmental impacts) and avoiding shortfalls (leaving some customers hungry) need to be treated as two contradictory objectives in an all-you-care-to-eat food service operation. We identify optimal production adjustments relative to demand forecasts, demand thresholds for the utilization of leftovers, and percentages of demand to be satisfied by leftovers, considering two alternative metrics for overproduction waste: mass and greenhouse gas emissions. Demand uncertainty and demand correlations are addressed using a kernel density estimation approach. A statistical analysis of the changes in decision variable values across each of the efficient frontiers can then be performed to identify the key variables that could be modified to reduce the amount of wasted food at a minimal increase in shortfalls. We illustrate our approach with an application to empirical data from Campus Dining Services operations at the University of Missouri.
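
The kernel-density treatment of correlated demand can be sketched roughly as follows; the item names, demand data and production scalings are invented for illustration and are not the Campus Dining Services data or the paper's optimization model.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Hypothetical historical daily demands for two correlated items (e.g. burgers, fries)
hist = rng.multivariate_normal([400, 380], [[900, 700], [700, 800]], size=200).T

kde = gaussian_kde(hist)              # joint density captures the correlation
scenarios = kde.resample(5000)        # sampled demand scenarios, shape (2, 5000)

def waste_and_shortfall(production, demand):
    """Expected overproduction waste and unmet demand for a fixed production plan."""
    over = np.maximum(production[:, None] - demand, 0.0)
    short = np.maximum(demand - production[:, None], 0.0)
    return over.sum(axis=0).mean(), short.sum(axis=0).mean()

# Trace a crude waste/shortfall trade-off by scaling production around mean demand
for scale in (0.9, 1.0, 1.1, 1.2):
    plan = scale * hist.mean(axis=1)
    w, s = waste_and_shortfall(plan, scenarios)
    print(f"scale={scale:.1f}: expected waste={w:.1f}, expected shortfall={s:.1f}")
```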

Keywords: environmental studies, food waste, production planning, uncertain and correlated demand

Procedia PDF Downloads 374
1809 Verification of a Simple Model for Rolling Isolation System Response

Authors: Aarthi Sridhar, Henri Gavin, Karah Kelly

Abstract:

Rolling Isolation Systems (RISs) are a simple and effective means to mitigate earthquake hazards to equipment in critical and precious facilities, such as hospitals, network collocation facilities, supercomputer centers, and museums. The RIS works by isolating components from ground acceleration, reducing the inertial forces felt by the subsystem. The RIS consists of two platforms with counter-facing concave surfaces (dishes) in each corner. Steel balls lie inside the dishes and allow relative motion between the top and bottom platforms. Formerly, a mathematical model for the dynamics of RISs was developed using Lagrange's equations (LE) and experimentally validated. A new mathematical model was developed using Gauss's Principle of Least Constraint (GPLC) and verified by comparing impulse response trajectories of the GPLC model and the LE model in terms of the peak displacements and accelerations of the top platform. Mathematical models for the RIS are tedious to derive because of the non-holonomic rolling constraints imposed on the system. However, using Gauss's Principle of Least Constraint to find the equations of motion removes some of the obscurity and yields a system that can be easily extended. Though the GPLC model requires more state variables, the equations of motion are far simpler. The non-holonomic constraint is enforced in terms of accelerations and therefore requires additional constraint stabilization so that errors in numerical integration do not drive the system unstable. The GPLC model allows the incorporation of more physical aspects of the RIS, such as the contribution of the vertical velocity of the platform to the kinetic energy and the mass of the balls. This mathematical model for the RIS is a tool to predict the motion of the isolation platform. The ability to statistically quantify the expected responses of the RIS is critical in the implementation of earthquake hazard mitigation.
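
For context, Gauss's Principle of Least Constraint states that, among all accelerations compatible with the constraints, the true acceleration minimizes a mass-weighted deviation from the unconstrained motion. A generic statement (not the paper's specific RIS equations) is:

```latex
\min_{\ddot{q}}\; G(\ddot{q}) \;=\;
  \tfrac{1}{2}\,\bigl(\ddot{q}-M^{-1}f\bigr)^{\top} M \,\bigl(\ddot{q}-M^{-1}f\bigr)
\quad \text{subject to} \quad
A(q,\dot{q})\,\ddot{q} \;=\; b(q,\dot{q}),
```

where M is the mass matrix, f the applied forces, and the linear relation in the accelerations expresses the (non-holonomic) rolling constraints written at acceleration level, which is why constraint stabilization is needed during numerical integration.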

Keywords: earthquake hazard mitigation, earthquake isolation, Gauss’s Principle of Least Constraint, nonlinear dynamics, rolling isolation system

Procedia PDF Downloads 252
1808 Economic Analysis of the Interaction of Freedom, Institutions and Development in the Countries of North Africa: Amartya Sen's Capability Approach

Authors: Essardi Omar, Razzouk Redouane

Abstract:

The concept of freedom requires countries all over the world to pay attention to welfare and the quality of life. Despite many efforts in the development economics literature, economists have often failed to incorporate the ideas of freedom and rights into their theoretical and empirical work. However, Amartya Sen's capability approach and the research that builds on it provide a basis for moving forward in the theory and measurement of development. Using an approach based on correlation and data analysis, in particular the tool of principal component analysis, we study the assessments of the World Bank, Freedom House, the Fraser Institute, and MINEFE experts. Our empirical objective is to reveal the institutional and freedom characteristics related to the development of emerging countries, in order to help explain the recent performance of Central and Eastern Europe and Latin America compared with the countries of North Africa. To do this, we first build indicators based on the liberties/institutions dilemma; second, we introduce institutional variables and freedom variables to compare freedom, the quality of institutions and development across the countries observed.
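
As a rough sketch of the principal component analysis step, assuming a country-by-indicator matrix of standardized freedom and institutional scores (the data below are placeholders, not the World Bank, Freedom House, Fraser Institute or MINEFE assessments):

```python
import numpy as np

# Hypothetical matrix: rows = countries, columns = freedom/institution indicators
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 6))

# Standardize, then obtain principal components via SVD
Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)

explained = s**2 / (s**2).sum()      # share of variance explained by each component
scores = Xs @ Vt.T                   # country scores on the components
loadings = Vt.T                      # indicator loadings

print("variance explained:", np.round(explained, 3))
print("first two country scores:\n", np.round(scores[:2, :2], 3))
```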

Keywords: freedoms, institutions, development, capability approach, principal component analysis

Procedia PDF Downloads 430
1807 Optimizing the Public Policy Information System under the Environment of E-Government

Authors: Qian Zaijian

Abstract:

E-government is one of the hot issues in current academic research on public policy and management. As the organic integration of information and communication technology (ICT) and public administration, e-government is one of the most important areas of the contemporary information society. The policy information system is a basic subsystem of the public policy system; its operation affects the overall effect of the policy process and can even exert a direct impact on the operation of a public policy and its success or failure. The basic principle of its operation is the collection, processing, analysis and release of information for a specific purpose. E-government supports the public policy information system by promoting public access to policy information resources, enabling information transmission through e-participation and e-consultation in the process of policy analysis and information processing, and providing electronic services for stored policy information, thereby promoting the optimization of policy information systems. However, due to many factors, the capacity of e-government to promote the optimization of policy information systems has its practical limits. In building e-government in China, we should adhere to the principle of freedom of information, eliminate the information divide (gap), expand e-consultation, and break down information silos, among other major paths, so as to promote the optimization of public policy information systems.

Keywords: China, e-consultation, e-democracy, e-government, e-participation, ICTs, public policy information systems

Procedia PDF Downloads 867
1806 Optimal Investment and Consumption Decision for an Investor with Ornstein-Uhlenbeck Stochastic Interest Rate Model through Utility Maximization

Authors: Silas A. Ihedioha

Abstract:

In this work, it is considered that an investor's portfolio comprises two assets: a risky stock whose price process is driven by geometric Brownian motion, and a risk-free asset whose rate of return follows an Ornstein-Uhlenbeck stochastic interest rate model, where consumption, taxes, transaction costs and dividends are involved. This paper aims at the optimization of the investor's expected utility of consumption and of the terminal return on his investment at the terminal time, under a power utility preference. Using the dynamic optimization procedure of the maximum principle, a second-order nonlinear partial differential equation (PDE), the Hamilton-Jacobi-Bellman (HJB) equation, was obtained, from which an ordinary differential equation (ODE) was derived via elimination of variables. The solution to the ODE gave the closed-form solution of the investor's problem. It was found that the optimal investment in the risky asset is horizon-dependent and is a ratio of the total amount available for investment to the relative risk aversion coefficient.
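
In generic notation (the symbols below are chosen for illustration and are not taken from the paper), the asset dynamics and the power utility preference referred to above are typically written as:

```latex
dS_t = \mu S_t\,dt + \sigma S_t\,dW_t, \qquad
dr_t = \kappa(\theta - r_t)\,dt + \sigma_r\,dB_t, \qquad
U(c) = \frac{c^{\gamma}}{\gamma},\ \ \gamma<1,\ \gamma\neq 0,

% Value function V(t,x,r) then satisfies an HJB equation of the schematic form
V_t + \sup_{\pi_t,\,c_t}\Bigl\{ \mathcal{L}^{\pi,c}V + U(c_t) \Bigr\} = 0,
```

where the first equation is the geometric Brownian motion of the stock, the second is the Ornstein-Uhlenbeck interest rate, and \(\mathcal{L}^{\pi,c}\) denotes the generator of the controlled wealth-rate process.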

Keywords: optimal, investment, Ornstein-Uhlenbeck, utility maximization, stochastic interest rate, maximum principle

Procedia PDF Downloads 225
1805 Victims' Legal Representation before the International Criminal Court: Freedom of Choice and Role of Victims' Legal Representatives

Authors: Erinda Male

Abstract:

Participation of a lawyer in any criminal proceedings on behalf of an accused person or a victim is essential to a fair trial. Legal representation is particularly crucial in proceedings before international tribunals, especially in the International Criminal Court. The paper thus focuses on the importance of the legal representation of victims and defendants before the ICC, as well as on the role of the legal representative in the proceedings before the court and the principle of freedom of choice of legal representatives. Also, the paper presents a short overview of the significance of legal representatives for victims and the necessity to protect their primary role in the ICC system, and ensure that it is coherent and respectful of victims’ rights. Victim participation is an important part of the ICC Statute and it is designed to help ensure that those most affected by the crimes are able to engage with the Court. Proper and quality legal representation ensures meaningful participation of victims at stages of the proceedings before ICC. Finally, the paper acknowledges the role of legal representatives during the pre-trial, trial and post-trial phase, the different modalities in selecting the legal representatives as well as balancing victims’ participation with the right of the accused to a fair trial.

Keywords: fair trial, freedom of choice principle, international criminal court, legal representatives, victims

Procedia PDF Downloads 394
1804 Probabilistic Life Cycle Assessment of the Nano Membrane Toilet

Authors: A. Anastasopoulou, A. Kolios, T. Somorin, A. Sowale, Y. Jiang, B. Fidalgo, A. Parker, L. Williams, M. Collins, E. J. McAdam, S. Tyrrel

Abstract:

Developing countries are nowadays confronted with great challenges related to domestic sanitation services in view of the imminent water scarcity. Contemporary sanitation technologies established in these countries are likely to pose health risks unless waste management standards are followed properly. This paper provides a solution to sustainable sanitation with the development of an innovative toilet system, called Nano Membrane Toilet (NMT), which has been developed by Cranfield University and sponsored by the Bill & Melinda Gates Foundation. The particular technology converts human faeces into energy through gasification and provides treated wastewater from urine through membrane filtration. In order to evaluate the environmental profile of the NMT system, a deterministic life cycle assessment (LCA) has been conducted in SimaPro software employing the Ecoinvent v3.3 database. The particular study has determined the most contributory factors to the environmental footprint of the NMT system. However, as sensitivity analysis has identified certain critical operating parameters for the robustness of the LCA results, adopting a stochastic approach to the Life Cycle Inventory (LCI) will comprehensively capture the input data uncertainty and enhance the credibility of the LCA outcome. For that purpose, Monte Carlo simulations, in combination with an artificial neural network (ANN) model, have been conducted for the input parameters of raw material, produced electricity, NOX emissions, amount of ash and transportation of fertilizer. The given analysis has provided the distribution and the confidence intervals of the selected impact categories and, in turn, more credible conclusions are drawn on the respective LCIA (Life Cycle Impact Assessment) profile of NMT system. Last but not least, the specific study will also yield essential insights into the methodological framework that can be adopted in the environmental impact assessment of other complex engineering systems subject to a high level of input data uncertainty.

Keywords: sanitation systems, nano-membrane toilet, lca, stochastic uncertainty analysis, Monte Carlo simulations, artificial neural network

Procedia PDF Downloads 226
1803 Calibration of Syringe Pumps Using Interferometry and Optical Methods

Authors: E. Batista, R. Mendes, A. Furtado, M. C. Ferreira, I. Godinho, J. A. Sousa, M. Alvares, R. Martins

Abstract:

Syringe pumps are commonly used for drug delivery in hospitals and clinical environments. These instruments are critical in neonatology and oncology, where any variation in the flow rate or drug dosing quantity can lead to severe incidents and even the death of the patient. Therefore, it is very important to determine the accuracy and precision of these devices using suitable calibration methods. The Volume Laboratory of the Portuguese Institute for Quality (LVC/IPQ) uses two different methods to calibrate syringe pumps from 16 nL/min up to 20 mL/min. The interferometric method uses an interferometer to monitor the distance travelled by the pusher block of the syringe pump in order to determine the flow rate. Knowing the internal diameter of the syringe with very high precision, the travelled distance, and the time needed to travel that distance, it is possible to calculate the flow rate of the fluid inside the syringe and its uncertainty. As an alternative to the gravimetric and interferometric methods, a methodology based on optical technology was also developed to measure flow rates; it relies mainly on measuring the increase in the volume of a drop over time. The objective of this work is to compare the results of the calibration of two syringe pumps using the different methodologies described above. The obtained results were consistent for the three methods used. The uncertainty values were very similar for all three methods, being higher for the optical drop method due to setup limitations.
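
A back-of-the-envelope version of the interferometric calculation, flow rate from syringe bore and pusher displacement with a simple relative-uncertainty combination, is given below; the numerical values are invented for illustration and are not LVC/IPQ data.

```python
import math

# Hypothetical measured quantities and their standard uncertainties
d,  u_d  = 4.61e-3, 0.5e-6      # syringe internal diameter [m]
dx, u_dx = 1.250e-3, 2e-9       # pusher displacement from interferometer [m]
dt, u_dt = 600.0,   0.01        # elapsed time [s]

area = math.pi * d**2 / 4.0
Q = area * dx / dt              # volumetric flow rate [m^3/s]

# First-order combination of uncorrelated relative uncertainties:
# Q ~ d^2 * dx / dt  =>  (u_Q/Q)^2 = (2*u_d/d)^2 + (u_dx/dx)^2 + (u_dt/dt)^2
rel_u = math.sqrt((2 * u_d / d) ** 2 + (u_dx / dx) ** 2 + (u_dt / dt) ** 2)

print(f"Q = {Q * 6e10:.3f} uL/min, relative standard uncertainty = {100 * rel_u:.3f} %")
```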

Keywords: calibration, flow, interferometry, syringe pump, uncertainty

Procedia PDF Downloads 109
1802 An Absolute Femtosecond Rangefinder for Metrological Support in Coordinate Measurements

Authors: Denis A. Sokolov, Andrey V. Mazurkevich

Abstract:

In the modern world, there is an increasing demand for highly precise measurements in various fields, such as aircraft, shipbuilding, and rocket engineering. This has resulted in the development of measuring instruments that are capable of measuring the coordinates of objects within a range of up to 100 meters, with an accuracy of up to one micron. The calibration process for such optoelectronic measuring devices (trackers and total stations) involves comparing their measurement results with a reference measurement on a linear or spatial basis. The reference used in such measurements can be a reference base or a reference rangefinder with the capability to measure angle increments (EDM); the base serves as a set of reference points for this purpose. The concept of the EDM for replicating the unit of measurement has been implemented on a mobile platform, which allows angular changes in the direction of the laser radiation in two planes. To determine the distance to an object, a high-precision interferometer of our own design is employed. The laser radiation travels to corner reflectors, which form a spatial reference with precisely known positions. When the femtosecond pulses from the reference arm and the measuring arm coincide, an interference signal is created, repeating at the frequency of the laser pulses. The distance between reference points determined by the interference signals is calculated in accordance with the recommendations of the International Bureau of Weights and Measures for the indirect measurement of the time of flight of light, according to the definition of the meter. This distance is D/2 = c/(2nF), approximately 2.5 meters, where c is the speed of light in a vacuum, n is the refractive index of the medium, and F is the femtosecond pulse repetition frequency. The achieved Type A uncertainty of the measurement of the distance to reflectors placed 64 m away (N·D/2, where N is an integer) and spaced 1 m apart from each other does not exceed 5 microns. The angular uncertainty is calculated theoretically, since standard high-precision ring encoders will be used and are not a focus of this study. The Type B uncertainty components are not taken into account either, as the components that contribute most do not depend on the selected coordinate measuring method. This technology is being explored in the context of laboratory applications under controlled environmental conditions, where it is possible to achieve an advantage in terms of accuracy. In general, the EDM tests showed high accuracy, and theoretical calculations and experimental studies on an EDM prototype have shown that the Type A uncertainty of distance measurements to reflectors can be less than 1 micrometer. The results of this research will be utilized to develop a highly accurate mobile absolute rangefinder designed for the calibration of high-precision laser trackers and laser rangefinders, as well as other equipment, using a 64-meter laboratory comparator as a reference.
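
A quick numerical check of the relation D/2 = c/(2nF) quoted above; the repetition frequency used here is an assumed value consistent with the stated ~2.5 m, not a figure from the paper.

```python
c = 299_792_458.0      # speed of light in vacuum [m/s]
n = 1.0                # refractive index of air, taken as ~1 here
F = 60e6               # assumed femtosecond-pulse repetition rate [Hz]

half_D = c / (2 * n * F)
print(f"D/2 = {half_D:.3f} m")   # ~2.498 m, i.e. the ~2.5 m quoted in the abstract
```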

Keywords: femtosecond laser, pulse correlation, interferometer, laser absolute range finder, coordinate measurement

Procedia PDF Downloads 60
1801 Assessment of Climate Change Impacts on the Hydrology of Upper Guder Catchment, Upper Blue Nile

Authors: Fikru Fentaw Abera

Abstract:

Climate change alters regional hydrologic conditions and results in a variety of impacts on water resource systems. Such hydrologic changes will affect almost every aspect of human well-being. The goal of this paper is to assess the impact of climate change on the hydrology of the Upper Guder catchment, located in the northwest of Ethiopia. GCM-derived scenario experiments (HadCM3 A2a and B2a SRES emission scenarios) were used for the climate projection. The Statistical DownScaling Model (SDSM) was used to generate possible future local meteorological variables in the study area. The downscaled data were then used as input to the Soil and Water Assessment Tool (SWAT) to simulate the corresponding future streamflow regime in the Upper Guder catchment of the Abay River Basin. A semi-distributed hydrological model, SWAT, was developed, and Generalized Likelihood Uncertainty Estimation (GLUE) was utilized for uncertainty analysis. GLUE is linked with SWAT in the Calibration and Uncertainty Program known as SWAT-CUP. The three benchmark periods simulated for this study were the 2020s, 2050s and 2080s. The time series generated by the HadCM3 GCM (A2a and B2a) and the Statistical DownScaling Model (SDSM) indicate a significant increasing trend in maximum and minimum temperature values and a slight increasing trend in precipitation for both the A2a and B2a emission scenarios at both the Gedo and Tikur Inch stations for all three benchmark periods. The hydrologic impact analysis, performed with the downscaled temperature and precipitation time series as input to the hydrological model SWAT, was carried out for both the A2a and B2a emission scenarios. The model output shows that there may be an annual increase in flow volume of up to 35% for both emission scenarios in the three benchmark periods. All seasons show an increase in flow volume for both the A2a and B2a emission scenarios for all time horizons. Potential evapotranspiration in the catchment will also increase annually, on average by 3-15% for the 2020s and by 7-25% for the 2050s and 2080s, for both the A2a and B2a emission scenarios.

Keywords: climate change, Guder sub-basin, GCM, SDSM, SWAT, SWAT-CUP, GLUE

Procedia PDF Downloads 365
1800 The Changes in Consumer Behavior and the Decision-Making Process after COVID-19 in Greece

Authors: Markou Vasiliki, Serdaris Panagiotis

Abstract:

Consumer behavior and the consumer decision-making process are affected by uncertainty. The onslaught of the COVID-19 pandemic has changed the consumer decision-making process in many ways. This change can be seen both in the buying process (how and where consumers shop) and in the types of goods and services they are looking for. In addition, the largely economic uncertainty that came from this event, together with its effects on both society and the economy in general, created new consumer behaviors. Traditional forms of shopping are no longer a primary choice; consumers have turned to digital channels such as e-commerce and social media to fulfill their needs. The purpose of this article is to examine how much the consumer decision-making process has been affected after the pandemic and whether consumer behavior has changed. An online survey was conducted to examine the change in decision making. Essentially, the demographic factors that influence the decision-making process were examined, as well as the social and economic factors. The research is divided into two parts. The first part is a literature review of the research that has been carried out to identify the relevant factors, and the second part is the empirical investigation, which was carried out electronically using a questionnaire built with Google Forms. The questionnaire was divided into several sections. They included questions about consumer behavior, but mainly about how consumers make decisions today, whether those decisions have changed due to the pandemic, and whether those changes are permanent. For the decision-making questions, goods were divided into essential products, high-tech products, transactions with the state, and others. About 500 consumers aged between 18 and 75 participated in the research. The data were processed with both descriptive statistics and econometric models. The results showed that consumer behavior and the decision-making process have changed: consumers now widely use the internet for shopping, and consumer behaviors and consumption patterns have changed. Social and economic factors play an important role; income, gender and other factors were found to be statistically significant. In addition, it is worth noting that the percentage of respondents who made purchases through the internet for the first time during the pandemic was remarkable and related to age. Essentially, the arrival of the pandemic caused uncertainty for individuals, mainly financial, and this affected the decision-making process. Moreover, shopping through the internet is now the first choice, especially among young people, and it seems to be on its way to becoming established.

Keywords: consumer behavior, decision making, COVID-19, Greece, behavior change

Procedia PDF Downloads 48
1799 Improved Image Retrieval for Efficient Localization in Urban Areas Using Location Uncertainty Data

Authors: Mahdi Salarian, Xi Xu, Rashid Ansari

Abstract:

Accurate localization of mobile devices based on camera-acquired visual media information usually requires a search over a very large GPS-referenced image database. This paper proposes an efficient method for limiting the search space of the image retrieval engine by extracting and leveraging additional media information about the Estimated Positional Error (EPE) to address complexity and accuracy issues in the search, especially for compensating GPS location inaccuracy in dense urban areas. The improved performance is achieved through up to a hundred-fold reduction in the search area used by available reference methods, while providing improved accuracy. To test our procedure, we created a database by acquiring Google Street View (GSV) images for downtown Chicago. Other available databases are not suitable for our approach due to the lack of EPE for the query images. We tested the procedure using more than 200 query images, along with their EPE, acquired mostly in the densest areas of Chicago with different phones and in different conditions such as low illumination and from under rail tracks. The effectiveness of our approach and the effect of the size and sector angle of the search area are discussed, and experimental results demonstrate how the proposed method can improve performance simply by utilizing data that are already available on mobile systems such as smartphones.
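
The core idea, shrinking the retrieval search space to a region whose radius reflects the reported positional error, can be sketched as follows; the haversine helper, the "EPE plus margin" radius rule and the sample coordinates are illustrative assumptions, not the paper's exact procedure.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    R = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def candidate_images(query_fix, epe_m, database, margin_m=20.0):
    """Keep only GPS-referenced database images inside the uncertainty circle."""
    lat_q, lon_q = query_fix
    radius = epe_m + margin_m
    return [img for img in database
            if haversine_m(lat_q, lon_q, img["lat"], img["lon"]) <= radius]

# Hypothetical usage: a query fix in downtown Chicago with a 35 m reported EPE
db = [{"id": 1, "lat": 41.8784, "lon": -87.6297},
      {"id": 2, "lat": 41.8950, "lon": -87.6200}]
print(candidate_images((41.8781, -87.6298), epe_m=35.0, database=db))
```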

Keywords: localization, retrieval, GPS uncertainty, bag of words

Procedia PDF Downloads 283
1798 Evaluation of Spatial Distribution Prediction for Site-Scale Soil Contaminants Based on Partition Interpolation

Authors: Pengwei Qiao, Sucai Yang, Wenxia Wei

Abstract:

Soil pollution has become an important issue in China. Accurate spatial distribution prediction of pollutants with interpolation methods is the basis for soil remediation in the site. However, a relatively strong variability of pollutants would decrease the prediction accuracy. Theoretically, partition interpolation can result in accurate prediction results. In order to verify the applicability of partition interpolation for a site, benzo (b) fluoranthene (BbF) in four soil layers was adopted as the research object in this paper. IDW (inverse distance weighting)-, RBF (radial basis function)-and OK (ordinary kriging)-based partition interpolation accuracies were evaluated, and their influential factors were analyzed; then, the uncertainty and applicability of partition interpolation were determined. Three conclusions were drawn. (1) The prediction error of partitioned interpolation decreased by 70% compared to unpartitioned interpolation. (2) Partition interpolation reduced the impact of high CV (coefficient of variation) and high concentration value on the prediction accuracy. (3) The prediction accuracy of IDW-based partition interpolation was higher than that of RBF- and OK-based partition interpolation, and it was suitable for the identification of highly polluted areas at a contaminated site. These results provide a useful method to obtain relatively accurate spatial distribution information of pollutants and to identify highly polluted areas, which is important for soil pollution remediation in the site.
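
For reference, a bare-bones inverse distance weighting interpolator of the kind compared above is sketched below; the partitioning of the site into zones is not shown, and the sample points, concentrations and power exponent are arbitrary placeholders rather than the study's BbF data.

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
    """Inverse distance weighting: weights ~ 1/d^power, applied per query point."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power
    return (w * z_known).sum(axis=1) / w.sum(axis=1)

# Hypothetical BbF concentrations (mg/kg) at sampled locations within one partition
xy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
z = np.array([0.8, 2.5, 1.1, 6.4])
grid = np.array([[5.0, 5.0], [2.0, 8.0]])

print(idw(xy, z, grid))   # interpolated concentrations at the query points
```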

Keywords: accuracy, applicability, partition interpolation, site, soil pollution, uncertainty

Procedia PDF Downloads 145
1797 Stochastic Fleet Sizing and Routing in Drone Delivery

Authors: Amin Karimi, Lele Zhang, Mark Fackrell

Abstract:

Rural-to-urban population migration is a global phenomenon, with projections indicating that by 2050, 68% of the world's population will inhabit densely populated urban centers. Concurrently, the popularity of e-commerce shopping has surged, evidenced by a 51% increase in total e-commerce sales from 2017 to 2021. Consequently, distribution and logistics systems, integral to effective supply chain management, confront escalating hurdles in efficiently delivering and distributing products within bustling urban environments. Additionally, events such as environmental challenges and the COVID-19 pandemic have shown that decision-makers face numerous sources of uncertainty. Therefore, to design an efficient and reliable logistics system, uncertainty must be considered. This study examines fleet sizing and routing while considering uncertainty in the demand rate. Fleet sizing is typically a strategic-level decision, while routing is an operational-level one. In this study, a carrier must make two types of decisions: strategic-level decisions regarding the number and types of drones to be purchased, and operational-level decisions regarding planning routes based on the available fleet and realized demand. If the available fleet is insufficient to serve some customers, the carrier must outsource those deliveries at a relatively high cost, calculated per order. With this hierarchy of decisions, the problem can be modeled using two-stage stochastic programming, as sketched below. The first-stage decisions involve planning the number and type of drones to be purchased, while the second-stage decisions involve planning the routes. To solve this model, we employ logic-based Benders decomposition, which decomposes the problem into a master problem and a set of sub-problems. The master problem is a mixed-integer programming model that finds the best fleet-sizing decisions, and the sub-problems are capacitated vehicle routing problems that account for battery status. Additionally, we assume a heterogeneous fleet differing in load and battery capacity, and we consider that battery health deteriorates over time as the plan covers multiple periods.
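
In compact generic form (notation invented here, not taken from the paper), the two-stage structure described above reads:

```latex
\min_{x \in X} \; c^{\top} x \;+\; \mathbb{E}_{\xi}\bigl[\,Q(x,\xi)\,\bigr],
\qquad
Q(x,\xi) \;=\; \min_{y \in Y(x,\xi)} \; q(\xi)^{\top} y,
```

where x collects the first-stage fleet-sizing decisions (number and type of drones), \(\xi\) the realized demand, and \(Q(x,\xi)\) the optimal second-stage routing and outsourcing cost given the purchased fleet.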

Keywords: drone-delivery, stochastic demand, VRP, fleet sizing

Procedia PDF Downloads 60
1796 Understanding the Influence of Fibre Meander on the Tensile Properties of Advanced Composite Laminates

Authors: Gaoyang Meng, Philip Harrison

Abstract:

When manufacturing composite laminates, the fibre directions within the laminate are never perfectly straight and inevitably contain some degree of stochastic in-plane waviness or ‘meandering’. In this work, we aim to understand the relationship between the degree of meandering of the fibre paths and the resulting uncertainty in the laminate's final mechanical properties. To do this, a numerical tool is developed to automatically generate meandering fibre paths in each of the laminate's 8 plies (using Matlab), and after mapping this information into finite element simulations (using Abaqus), the statistical variability of the tensile mechanical properties of a [45°/90°/-45°/0°]s carbon/epoxy (IM7/8552) laminate is predicted. The stiffness, first-ply failure strength and ultimate failure strength are obtained. Results are generated by inputting the degree of variability in the fibre paths, and the laminate is then examined in all directions (from 0° to 359° in increments of 1°). The resulting predictions are output as flower (polar) plots for convenient analysis. The average fibre orientation of each ply in a given laminate is determined by the laminate layup code [45°/90°/-45°/0°]s. However, in each case, the plies contain increasingly large amounts of in-plane waviness (quantified by the standard deviation of the fibre direction in each ply across the laminate). Four different amounts of variability in the fibre direction are tested (2°, 4°, 6° and 8°). Results show that both the average tensile stiffness and the average tensile strength decrease, while the standard deviations increase, with an increasing degree of fibre meander. The variability in stiffness is found to be relatively insensitive to the rotation angle, but the variability in strength is sensitive. Specifically, the uncertainty in laminate strength is relatively low at orientations centred around multiples of the 45° rotation angle, and relatively high between these rotation angles. To concisely represent all the information contained in the various polar plots, rotation-angle-dependent Weibull distribution equations are fitted to the data. The resulting equations can be used to quickly estimate the size of the error bars for the different mechanical properties resulting from the amount of fibre directional variability contained within the laminate. A longer-term goal is to use these equations to quickly introduce realistic variability at the component level.
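
A small example of fitting a Weibull distribution to strength data for one rotation angle, using scipy's weibull_min as a stand-in for the rotation-angle-dependent fits described above; the strength samples below are synthetic, not the simulated IM7/8552 results.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(7)

# Synthetic ultimate-strength samples (MPa) for one laminate rotation angle
strengths = weibull_min.rvs(c=25.0, scale=820.0, size=60, random_state=rng)

# Fit a two-parameter Weibull (location fixed at zero), as is common for strength data
shape, loc, scale = weibull_min.fit(strengths, floc=0.0)

p5 = weibull_min.ppf(0.05, shape, loc=loc, scale=scale)   # 5th-percentile strength
print(f"shape = {shape:.1f}, scale = {scale:.1f} MPa, 5% value = {p5:.1f} MPa")
```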

Keywords: advanced composite laminates, FE simulation, in-plane waviness, tensile properties, uncertainty quantification

Procedia PDF Downloads 89
1795 Estimating CO₂ Storage Capacity under Geological Uncertainty Using 3D Geological Modeling of Unconventional Reservoir Rocks in Block nv32, Shenvsi Oilfield, China

Authors: Ayman Mutahar Alrassas, Shaoran Ren, Renyuan Ren, Hung Vo Thanh, Mohammed Hail Hakimi, Zhenliang Guan

Abstract:

The significant effect of CO₂ on the global climate and the environment has attracted increasing concern worldwide. Enhanced oil recovery (EOR) associated with the sequestration of CO₂, particularly into depleted oil reservoirs, is considered a viable approach under financial limitations, since it improves oil recovery from existing reservoirs and strengthens the link between global-scale CO₂ capture and geological sequestration. Consequently, practical measures are required to attain large-scale CO₂ emission reduction. This paper presents an integrated modeling workflow to construct an accurate 3D reservoir geological model to estimate the storage capacity of CO₂ under geological uncertainty in an unconventional oil reservoir of the Paleogene Shahejie Formation (Es1) in block Nv32, Shenvsi oilfield, China. In this regard, geophysical data, including well logs from twenty-two well locations and seismic data, were combined with geological and engineering data and used to construct a 3D reservoir geological model. The geological modeling focused on four tight reservoir units of the Shahejie Formation (Es1-x1, Es1-x2, Es1-x3, and Es1-x4). The validated 3D reservoir models were subsequently used to calculate the theoretical CO₂ storage capacity in block Nv32, Shenvsi oilfield. Well logs were utilized to predict petrophysical properties such as porosity and permeability, as well as lithofacies, and indicate that the Es1 reservoir units are mainly sandstone, shale, and limestone, with proportions of 38.09%, 32.42%, and 29.49%, respectively. Well-log-based petrophysical results also show that the Es1 reservoir units generally exhibit 2–36% porosity, 0.017 mD to 974.8 mD permeability, and moderate to good net-to-gross ratios. These estimated values of porosity, permeability, lithofacies, and net-to-gross ratio were upscaled and distributed laterally using the Sequential Gaussian Simulation (SGS) and Sequential Indicator Simulation (SIS) methods to generate 3D reservoir geological models. The reservoir geological models show lateral heterogeneities in the reservoir properties and lithofacies, and the best reservoir rocks exist in the Es1-x4, Es1-x3, and Es1-x2 units, respectively. In addition, the reservoir volumetrics of the Es1 units in block Nv32 were also estimated based on the petrophysical property models and found to be between 0.554368

Keywords: CO₂ storage capacity, 3D geological model, geological uncertainty, unconventional oil reservoir, block Nv32

Procedia PDF Downloads 181
1794 From Type-I to Type-II Fuzzy System Modeling for Diagnosis of Hepatitis

Authors: Shahabeddin Sotudian, M. H. Fazel Zarandi, I. B. Turksen

Abstract:

Hepatitis is one of the most common and dangerous diseases affecting humankind, exposing millions of people to serious health risks every year. The diagnosis of hepatitis has always been a challenge for physicians. This paper presents an effective method for the diagnosis of hepatitis based on interval Type-II fuzzy logic. The proposed system includes three steps: pre-processing (feature selection), Type-I and Type-II fuzzy classification, and system evaluation. KNN-FD feature selection is used as the pre-processing step in order to exclude irrelevant features and to improve classification performance and efficiency in generating the classification model. In the fuzzy classification step, an “indirect approach” is used for fuzzy system modeling, implementing the exponential compactness and separation index to determine the number of rules in the fuzzy clustering approach. We first propose a Type-I fuzzy system, which achieved an accuracy of approximately 90.9%. In the proposed system, the process of diagnosis faces vagueness and uncertainty in the final decision; this imprecise knowledge was therefore managed by using interval Type-II fuzzy logic. The results obtained show that interval Type-II fuzzy logic is able to diagnose hepatitis with an average accuracy of 93.94%, the highest classification accuracy reached thus far. This rate of accuracy demonstrates that the Type-II fuzzy system performs better than the Type-I system and indicates a higher capability of the Type-II fuzzy system for modeling uncertainty.

Keywords: hepatitis disease, medical diagnosis, type-I fuzzy logic, type-II fuzzy logic, feature selection

Procedia PDF Downloads 307
1793 Timing and Probability of Presurgical Teledermatology: Survival Analysis

Authors: Felipa de Mello-Sampayo

Abstract:

The aim of this study is to investigate, from the patient's perspective, the timing and probability of using teledermatology, comparing it with a conventional referral system. The dynamic stochastic model's main added value consists of its concrete application to patients waiting for dermatologic surgical intervention. Patients with low health-level uncertainty must use teledermatology treatment as soon as possible, which is precisely when teledermatology is least valuable. The results of the model were then tested empirically with the teledermatology network covering the area served by the Hospital Garcia da Horta, Portugal, which links the primary care centers of 24 health districts with the hospital's dermatology department via the corporate intranet of the Portuguese healthcare system. Health-level volatility can be understood as the hazard of developing skin cancer, and the trend of the health level as the bias towards developing skin lesions. The results of the survival analysis suggest that the theoretical model can explain the use of teledermatology: it depends negatively on the volatility of patients' health and positively on the trend of health, i.e., the lower the risk of developing skin cancer and the younger the patients, the more presurgical teledermatology one expects to occur. Presurgical teledermatology also depends positively on out-of-pocket expenses and negatively on the opportunity costs of teledermatology, i.e., the lower the benefit missed by using teledermatology, the more presurgical teledermatology one expects to occur.

Keywords: teledermatology, wait time, uncertainty, opportunity cost, survival analysis

Procedia PDF Downloads 129
1792 On Implementing Sumak Kawsay in Post Bellum Principles: The Reconstruction of Natural Damage in the Aftermath of War

Authors: Lisa Tragbar

Abstract:

In post-war scenarios, reconstruction is a guiding principle for creating a Just Peace and restoring a stable post-war society. Just Peace theorists explore normative behaviour after war, including the duties and responsibilities of different actors and the peacebuilding strategies needed to achieve a lasting, positive peace. Environmental peace ethicists have argued for including the role of nature in the ethics of war and peace. This text explores the question of why and how to rethink the value of nature in post-war scenarios. The aim is to include the rights of nature within a maximalist account of reconstruction by highlighting sumak kawsay in the post-war period. Destruction of nature is usually considered collateral damage in war scenarios. Common universal standards for post-war reconstruction are restitution, compensation and reparation programmes, which is a mostly anthropocentric approach. The problem of reconstruction in the aftermath of war is the merely instrumental value assigned to nature. The responsibility to rebuild needs to be revisited within a non-anthropocentric context. There is an ongoing debate between minimalist and maximalist approaches to post-war reconstruction. While Michael Walzer argues for minimalist in-and-out interventions, Alex Bellamy argues for maximalist strategies such as the responsibility to protect, a UN concept on how to face mass atrocity crimes and how to reconstruct peace. While supporting the tradition of a maximalist responsibility to rebuild, these normative post bellum concepts do not yet sufficiently consider the rights of nature in the aftermath of war. While the reconstruction of infrastructure seems important and necessary, concepts that strengthen the intrinsic value of nature in post bellum measures must also be included. Peace is not a Just Peace without a thriving nature that provides the conditions and resources to live and to guarantee human rights. Ecuador's indigenous philosophy of life can contribute to the restoration of nature after war by changing the perspective on the value of nature. Sumak kawsay includes the de-hierarchisation of humans and nature and the principle of reciprocity towards nature. Transferring this idea of life and interconnectedness to post-war reconstruction practices, post bellum perpetrators have restorative obligations not only to people but also to nature. This maximalist approach would include both a restitutive principle, by restoring the balance between humans and nature, and a retributive principle, by punishing the perpetrators through compensatory duties to nature. A maximalist approach to post-war reconstruction that takes into account the rights of nature expands the normative post-war questions to a more complex field of responsibilities. After a war, Just Peace is restored only once not only human rights but also the rights of nature are secured. A minimalist post bellum approach to reconstruction does not locate future problems at their source and does not offer a solution for the inclusion of obligations to nature. There is a lack of obligations towards nature after a war, which can be changed through a different perspective: the indigenous philosophy of life provides the necessary principles for a comprehensive reconstruction of Just Peace.

Keywords: normative ethics, peace, post-war, sumak kawsay, applied ethics

Procedia PDF Downloads 79
1791 Decomposition of the Discount Function Into Impatience and Uncertainty Aversion. How Neurofinance Can Help to Understand Behavioral Anomalies

Authors: Roberta Martino, Viviana Ventre

Abstract:

Intertemporal choices are choices under conditions of uncertainty in which the consequences are distributed over time. The Discounted Utility Model is the essential reference for describing the individual in the context of intertemporal choice. The model is based on the idea that the individual selects the alternative with the highest utility, calculated by multiplying the cardinal utility of the outcome, as if its receipt were instantaneous, by the discount function, which decreases the utility value according to how far the actual receipt of the outcome lies from the moment the choice is made. Initially, the discount function was assumed to have an exponential form, whose rate of decrease over time is constant, in line with the profile of the rational investor described by classical economics. Instead, empirical evidence called for the formulation of alternative, hyperbolic models that better represent the actual behaviour of investors. Attitudes that do not comply with the principles of classical rationality are termed anomalous, i.e., difficult to rationalize and describe through normative models. The development of behavioral finance, which describes investor behavior through cognitive psychology, has shown that deviations from rationality are due to the bounded rationality of human beings. This means that when a choice is made in a very difficult and information-rich environment, the brain strikes a compromise between the cognitive effort required and the selection of an alternative. Moreover, the evaluation and selection of alternatives, and the collection and processing of information, are dynamics conditioned by systematic distortions of the decision-making process, namely the behavioral biases involving the individual's emotional and cognitive systems. In this paper, we present an original decomposition of the discount function to investigate the psychological principles of hyperbolic discounting. The curve can be decomposed into two components: the first component is responsible for the smaller decrease in the outcome's value as time increases and is related to the individual's impatience; the second component relates to the change in direction of the tangent vector to the curve and indicates how strongly the individual perceives the indeterminacy of the future, reflecting his or her aversion to uncertainty. This decomposition allows interesting conclusions to be drawn with respect to the concept of impatience and the emotional drives involved in decision-making. The contribution that neuroscience can make to decision theory and intertemporal choice theory is vast, as it would allow the decision-making process to be described in terms of the relationship between the individual's emotional and cognitive factors. Neurofinance is a discipline that uses a multidisciplinary approach to investigate how the brain influences decision-making. Indeed, considering that the decision-making process is linked to the activity of the prefrontal cortex and the amygdala, neurofinance can help determine the extent to which anomalous attitudes respect the principles of rationality.
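
For reference, the two standard discount functions contrasted in the abstract can be written as below; the decomposition itself is the paper's contribution and is not reproduced here.

```latex
D_{\mathrm{exp}}(t) = e^{-kt},
\qquad
D_{\mathrm{hyp}}(t) = \frac{1}{1 + kt}, \qquad k > 0,
```

with discounted utility \(U = D(t)\,u(x)\) for an outcome \(x\) received after delay \(t\); the exponential form implies a constant discount rate, while the hyperbolic form implies a discount rate that declines with delay, which is what gives rise to the observed anomalies.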

Keywords: impatience, intertemporal choice, neurofinance, rationality, uncertainty

Procedia PDF Downloads 130
1790 Production Planning for Animal Food Industry under Demand Uncertainty

Authors: Pirom Thangchitpianpol, Suttipong Jumroonrut

Abstract:

This research investigates the distribution of demand for animal food and the optimal amount of food production at minimum cost. The data consist of customer purchase orders for laying-hen feed, the price of the feed, the cost per unit of feed inventory, and the costs incurred when the feed is out of stock, such as fines, overtime, and urgent purchases of materials. They were collected from January 1990 to December 2013 at a factory in Nakhonratchasima province. The collected data are analyzed in order to explore the distribution of the monthly feed demand for laying hens and to determine the inventory rate per unit. The results are used in a stochastic linear programming model for aggregate planning from which the optimal production, or minimum cost, can be obtained. Programming algorithms in MATLAB and the linprog tool are used to obtain the solution. The distribution of the feed demand for laying hens and random numbers drawn from it are used in the model. The study shows that the monthly feed demand for laying hens follows a normal distribution, and monthly average production amounts (in units of 30 kg) are obtained for January to December. The minimum total cost, averaged over 12 months, is Baht 62,329,181.77; the production plan can therefore reduce costs by 14.64% relative to the actual cost.

Keywords: animal food, stochastic linear programming, aggregate planning, production planning, demand uncertainty

Procedia PDF Downloads 380
1789 Prediction of Incompatibility Between Excipients and API in Gliclazide Tablets Using Infrared Spectroscopy and Principal Component Analysis

Authors: Farzad Khajavi

Abstract:

Recognition of the interaction between active pharmaceutical ingredients (APIs) and excipients is a pivotal factor in the development of all pharmaceutical dosage forms. By predicting the interaction between an API and excipients, we will be able to prevent the formation of impurities, or at least lessen their amount. In this study, we used principal component analysis (PCA) to predict the interaction between Gliclazide, a secondary amine, and Lactose in pharmaceutical solid dosage forms. The infrared spectra of binary mixtures of Gliclazide with Lactose at different mole ratios were recorded, and the resulting matrix was analyzed with PCA. By plotting the score columns of the analyzed matrix, the incompatibility between Gliclazide and Lactose was observed. This incompatibility was confirmed experimentally: we observed the appearance of the impurity originating from the Maillard reaction between Gliclazide and Lactose in the chromatograms of the manufactured tablets at room temperature and under accelerated stability conditions, and this impurity increases over the stability months. By changing Lactose to Mannitol and using Dibasic Calcium Phosphate in the tablet formulation, the amount of the impurity decreased and fell within the acceptance range defined by the British Pharmacopoeia for Gliclazide tablets. This method is a fast and simple way to predict the existence of incompatibility between excipients and active pharmaceutical ingredients.
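
A schematic of the score-plot analysis described above, using synthetic spectra in place of the measured Gliclazide-Lactose data; the band positions and the ratio-dependent shift are invented solely to illustrate the workflow.

```python
import numpy as np

rng = np.random.default_rng(3)
wavenumbers = np.linspace(400, 4000, 600)

def fake_spectrum(shift):
    """Synthetic IR-like spectrum: two bands plus noise; 'shift' mimics an interaction."""
    peaks = (np.exp(-((wavenumbers - 1700 - shift) / 30) ** 2)
             + 0.6 * np.exp(-((wavenumbers - 3300) / 80) ** 2))
    return peaks + rng.normal(0, 0.01, wavenumbers.size)

# Binary mixtures at different mole ratios; a ratio-dependent band shift stands in
# for a real API-excipient interaction
ratios = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
X = np.vstack([fake_spectrum(12 * r) for r in ratios])

Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T

# A nonlinear trend of PC1/PC2 scores versus mole ratio would flag an incompatibility
for r, sc in zip(ratios, scores):
    print(f"ratio {r:.2f}: PC1 = {sc[0]: .3f}, PC2 = {sc[1]: .3f}")
```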

Keywords: PCA, gliclazide, impurity, infrared spectroscopy, interaction

Procedia PDF Downloads 209
1788 Two-stage Robust Optimization for Collaborative Distribution Network Design Under Uncertainty

Authors: Reza Alikhani

Abstract:

This research focuses on the establishment of horizontal cooperation among companies to enhance their operational efficiency and competitiveness. The study proposes an approach to horizontal collaboration, called coalition configuration, which involves partnering companies sharing distribution centers in a network design problem. The paper investigates which coalition should be formed in each distribution center to minimize the total cost of the network. Moreover, potential uncertainties, such as operational and disruption risks, are considered during the collaborative design phase. To address this problem, a two-stage robust optimization model for collaborative distribution network design under surging demand and facility disruptions is presented, along with a column-and-constraint generation algorithm to obtain exact solutions tailored to the proposed formulation. Extensive numerical experiments are conducted to analyze solutions obtained by the model in various scenarios, including decisions ranging from fully centralized to fully decentralized settings, collaborative versus non-collaborative approaches, and different amounts of uncertainty budgets. The results show that the coalition formation mechanism proposes some solutions that are competitive with the savings of the grand coalition. The research also highlights that collaboration increases network flexibility and resilience while reducing costs associated with demand and capacity uncertainties.

Keywords: logistics, warehouse sharing, robust facility location, collaboration for resilience

Procedia PDF Downloads 70
1787 Estimation of Road Traffic Emissions and Dispersion under Developing Countries' Conditions

Authors: Hicham Gourgue, Ahmed Aharoune, Ahmed Ihlal

Abstract:

We present in this work our model of road traffic emissions (line sources) and of the dispersion of these emissions, named DISPOLSPEM (Dispersion of Poly Sources and Pollutants Emission Model). In its emission part, this model was designed to keep the bottom-up and top-down approaches consistent. It also allows emission inventories to be generated from a reduced set of input parameters adapted to the conditions existing in Morocco and in other developing countries. While several simplifications are made, the overall performance of the model is preserved. A further important advantage of the model is that it allows the uncertainty of the emission rate to be calculated with respect to each of the input parameters. In the dispersion part of the model, an improved line source model has been developed, implemented and tested against a reference solution; it provides an improvement in accuracy over previous formulas of the line-source Gaussian plume model without being too demanding in terms of computational resources. In the case study presented here, the biggest errors were associated with the ends of line source sections; these errors will be canceled out by adjacent sections of line sources during the simulation of a road network. In cases where the wind is parallel to the source line, the combined use of discretized-source and analytical line-source formulas remarkably reduces the error. Because this combination is applied only for a small number of wind directions, it should not excessively increase the calculation time.
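
A stripped-down discretized line source of the Gaussian-plume type is sketched below; the dispersion coefficients are crude power-law placeholders and the road geometry is invented, so this is only a generic illustration, not DISPOLSPEM's improved formulas.

```python
import numpy as np

def gaussian_point(q, x, y, z, u, h=0.5):
    """Gaussian plume concentration from one point source of strength q [g/s]."""
    x = np.maximum(x, 1.0)                           # downwind receptors only
    sigma_y = 0.08 * x * (1 + 0.0001 * x) ** -0.5    # rough rural-type coefficients
    sigma_z = 0.06 * x * (1 + 0.0015 * x) ** -0.5
    return (q / (2 * np.pi * u * sigma_y * sigma_z)
            * np.exp(-y**2 / (2 * sigma_y**2))
            * (np.exp(-(z - h)**2 / (2 * sigma_z**2))
               + np.exp(-(z + h)**2 / (2 * sigma_z**2))))

def line_source(q_per_m, length, n_seg, receptor, u):
    """Discretize a road segment along the y-axis into n_seg point sources and sum."""
    ys = np.linspace(-length / 2, length / 2, n_seg)
    q_seg = q_per_m * length / n_seg
    xr, yr, zr = receptor
    return sum(gaussian_point(q_seg, xr, yr - ys_i, zr, u) for ys_i in ys)

# Hypothetical road: 0.01 g/(s*m) emission, receptor 50 m downwind, 2 m/s wind
print(f"{line_source(0.01, 500.0, 200, (50.0, 0.0, 1.5), 2.0):.2e} g/m^3")
```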

Keywords: air pollution, dispersion, emissions, line sources, road traffic, urban transport

Procedia PDF Downloads 442
1786 Using Interval Type-2 Fuzzy Controller for Diabetes Mellitus

Authors: Nafiseh Mollaei, Reihaneh Kardehi Moghaddam

Abstract:

In Diabetes Mellitus, controlling insulin is very difficult. This incurable disease affects millions of people worldwide. Glucose is a sugar which provides energy to the cells, and insulin is the hormone which supports the absorption of glucose. A fuzzy control strategy is attractive for glucose control because it mimics the first- and second-phase responses that the pancreatic beta cells use to regulate glucose. We propose two control algorithms for insulin infusion: a type-1 fuzzy controller and an interval type-2 fuzzy method. The closed-loop system has been simulated for different patients with different parameters, in the presence of a food intake disturbance, and it has been shown that the blood glucose concentration settles at a normoglycemic level of 110 mg/dl within a reasonable amount of time. This paper deals with type 1 diabetes as a nonlinear model, simulated in the MATLAB/SIMULINK environment. The novel model, termed the Augmented Minimal Model, is used in the simulations. There are some uncertainties in this model due to factors such as blood glucose variability, daily meals, or sudden stress. Different control methods may be used to eliminate the effects of this uncertainty. In this article, the fuzzy controllers' performance was assessed in terms of their ability to track a normoglycemic set point (110 mg/dl) in response to a [0-10] g meal disturbance. Finally, the development reported in this paper is intended to simplify insulin delivery and thus increase the quality of life of the patient.
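To make the fuzzy-control idea concrete, the sketch below shows a type-1 fuzzy mapping from glucose error to insulin infusion rate. It is not the paper's controller or the Augmented Minimal Model: the membership functions, rule base and infusion levels are illustrative assumptions, and no interval type-2 footprint of uncertainty is modelled here.

```python
# Minimal sketch of a type-1 fuzzy insulin controller (assumed design).
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with corners a <= b <= c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_insulin_rate(glucose, setpoint=110.0):
    """Map the glucose error (mg/dl) to an insulin infusion rate (mU/min)."""
    e = glucose - setpoint
    # antecedent memberships of the error (assumed universes)
    mu = {
        "negative": tri(e, -200.0, -100.0, 0.0),   # hypoglycaemic tendency
        "zero":     tri(e, -40.0, 0.0, 40.0),      # near the set point
        "positive": tri(e, 0.0, 100.0, 200.0),     # hyperglycaemia
    }
    # rule consequents: singleton infusion rates (assumed)
    rate = {"negative": 0.0, "zero": 8.0, "positive": 25.0}
    num = sum(mu[k] * rate[k] for k in mu)
    den = sum(mu.values())
    return num / den if den > 0 else rate["zero"]   # weighted-average defuzzification

for g in (90, 110, 150, 220):
    print(g, "mg/dl ->", round(fuzzy_insulin_rate(g), 2), "mU/min")
```

In an interval type-2 design, each membership value would be an interval (a footprint of uncertainty) rather than a single number, and the defuzzification step would include type reduction; the structure of the rule base stays the same.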

Keywords: interval type-2, fuzzy controller, minimal augmented model, uncertainty

Procedia PDF Downloads 431
1785 Artificial Neural Network to Predict the Optimum Performance of Air Conditioners under Environmental Conditions in Saudi Arabia

Authors: Amr Sadek, Abdelrahaman Al-Qahtany, Turkey Salem Al-Qahtany

Abstract:

In this study, a backpropagation artificial neural network (ANN) model has been used to predict the cooling and heating capacities of air conditioners (AC) under different conditions. A sufficiently large set of measurement results was obtained from the national energy-efficiency laboratories in Saudi Arabia and used for the learning process of the ANN model. The parameters affecting the performance of the AC, including temperature, humidity level, specific heat enthalpy indoors and outdoors, and the air volume flow rate of the indoor units, have been considered. These parameters were used as inputs to the ANN model, while the cooling and heating capacity values were set as the targets. A backpropagation ANN model with two hidden layers and one output layer could successfully correlate the input parameters with the targets. The characteristics of the ANN model, including the input-processing, transfer, neuron-distance, topology, and training functions, are discussed. The performance of the ANN model was monitored over the training epochs and assessed using the mean squared error function. The model was then used to predict the performance of the AC under conditions that were not included in the measurement results. The optimum performance of the AC was also predicted under the different environmental conditions in Saudi Arabia. The uncertainty of the ANN model predictions has been evaluated, taking into account the randomness of the data and incomplete learning.
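The workflow described here, a two-hidden-layer backpropagation regressor trained on (temperature, humidity, enthalpy, flow) inputs with capacity targets and a mean-squared-error criterion, can be sketched as below. The training data are synthetic stand-ins generated from an assumed relation, not the laboratory measurements used in the study, and the layer sizes are arbitrary.

```python
# Minimal sketch of a backpropagation MLP for AC capacity prediction.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
n = 2000
t_out = rng.uniform(30, 52, n)          # outdoor temperature, deg C
t_in = rng.uniform(18, 30, n)           # indoor temperature, deg C
rh = rng.uniform(0.1, 0.8, n)           # relative humidity
flow = rng.uniform(400, 1600, n)        # indoor air volume flow, m3/h
X = np.column_stack([t_out, t_in, rh, flow])

# assumed ground-truth relation (capacity degrades at high outdoor temp)
y = (3.0 + 0.002 * flow - 0.06 * (t_out - 35) + 0.5 * rh
     + rng.normal(0, 0.05, n))          # cooling capacity, kW

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 8), activation="tanh",
                 max_iter=3000, random_state=0),   # two hidden layers
)
model.fit(X_tr, y_tr)
print("test MSE:", round(mean_squared_error(y_te, model.predict(X_te)), 4))
```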

Keywords: artificial neural network, uncertainty of model predictions, efficiency of air conditioners, cooling and heating capacities

Procedia PDF Downloads 74
1784 Responsibility of International Financial Institutions for Harmful Environmental Consequences Arising from Their Development Interventions

Authors: Reham Barakat

Abstract:

Over the last few decades, the influence of International Financial Institutions (IFIs), especially the World Bank (WB), has significantly increased. Since the early 1980s, IFIs have assumed a greater role, especially in developing countries; their total lending has increased dramatically, affecting billions of people in their Borrower States. Although the purpose of the development assistance provided by IFIs is to alleviate poverty and promote economic and social development in their member countries, IFIs have been subject to massive criticism by civil society institutions, international NGOs and local communities for the harmful environmental, social and economic impacts resulting from their development interventions in borrower countries, such as deforestation, displacement of indigenous peoples, and unemployment. While the role of IFIs has expanded over time, affecting billions of people, their accountability mechanisms have lagged behind and have been criticized for lacking sufficient independence and enforceability. The serious adverse environmental impacts of the World Bank’s funded projects, along with their weak accountability mechanisms, raise the question: 'To what extent should IFIs be held internationally responsible for the harmful environmental consequences arising from their development interventions?'. This paper argues that IFIs are legally responsible for the harmful environmental consequences arising from their development interventions. The study (i) identifies the applicable laws and relevant primary rules from which the international environmental obligations of IFIs towards their borrower countries are derived, (ii) assesses the World Bank’s compliance with the principles of international environmental law, including the precautionary principle, the polluter pays principle, and the principle of Good-Neighborliness, (iii) assesses the World Bank’s current internal accountability mechanisms for harmful environmental impacts resulting from the World Bank’s funded projects, and finally (iv) identifies the appropriate dispute settlement mechanisms through which states and non-state actors could raise claims against IFIs for harmful environmental consequences arising from their interventions.

Keywords: international environmental law, international financial institutions, international responsibility, world bank, environmental and social safeguards

Procedia PDF Downloads 173
1783 Design and Development of Solar Water Cooler Using Principle of Evaporation

Authors: Vipul Shiralkar, Rohit Khadilkar, Shekhar Kulkarni, Ismail Mullani, Omkar Malvankar

Abstract:

The use of water coolers has increased, and they have become important appliances in a world of global warming. Most coolers are electrically operated. In this study, an experimental setup of an evaporative water cooler using solar energy is designed and developed. It works on the principle of heat transfer by evaporation of water. Water flows through copper tubes arranged in a specific array. A cotton wick is wrapped around the copper tubes, and rubber pipes are arranged above them in the same pattern as the copper tubes. Water percolating from the rubber pipes is absorbed by the cotton wick. The setup has a 40 L water capacity with a forced-cooling arrangement and a variable-speed fan powered by solar energy stored in a 20 Ah battery. Fan speed greatly affects the temperature drop, so tests were performed at different fan speeds. The maximum temperature drop achieved was 9 °C at a fan speed of 1440 rpm, which is very attractive. Because this water cooler uses solar energy, it is cost-efficient and affordable for rural communities as well. The cooler is free from harmful emissions, unlike refrigerant-based systems, and is therefore environmentally friendly. Very little maintenance is required compared to a conventional electrical water cooler.
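A back-of-the-envelope energy balance helps put these figures in perspective. The short calculation below is only a sketch under stated assumptions (the reported ~9 °C drop applied to the full 40 L charge, evaporation as the only heat sink, and an assumed 12 V system voltage for the 20 Ah battery, which the abstract does not specify).

```python
# Rough energy-balance check for the evaporative cooler described above.
RHO_WATER = 1000.0       # kg/m^3
CP_WATER = 4186.0        # J/(kg K)
H_FG = 2.26e6            # J/kg, latent heat of vaporisation near ambient

volume_l = 40.0          # water charge, litres
delta_t = 9.0            # deg C, reported maximum temperature drop
heat_removed = RHO_WATER * (volume_l / 1000.0) * CP_WATER * delta_t
water_evaporated = heat_removed / H_FG          # mass lost to evaporation

battery_wh = 20.0 * 12.0  # 20 Ah battery at an assumed 12 V
print(f"heat to remove      : {heat_removed / 1e6:.2f} MJ")
print(f"water evaporated    : {water_evaporated:.2f} kg")
print(f"fan energy available: {battery_wh:.0f} Wh ({battery_wh * 3.6 / 1000:.2f} MJ)")
```

Roughly 1.5 MJ must be rejected, which corresponds to evaporating about 0.7 kg of water; the battery only has to drive the fan, not supply this heat.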

Keywords: evaporation, cooler, energy, copper, solar, cost

Procedia PDF Downloads 320