Search results for: cumulative probabilities
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 536


236 Disaggregation the Daily Rainfall Dataset into Sub-Daily Resolution in the Temperate Oceanic Climate Region

Authors: Mohammad Bakhshi, Firas Al Janabi

Abstract:

High-resolution rain data are very important inputs for hydrological models. Among the models available for generating high-resolution rainfall data, temporal disaggregation was chosen for this study. The paper attempts to generate rainfall at three different resolutions (4-hourly, hourly and 10-minute) from daily data over a roughly 20-year record period. The disaggregation was performed with the DiMoN tool, which is based on a random cascade model and the method of fragments. Differences between the observed and simulated rain datasets are evaluated with a variety of statistical and empirical methods: the Kolmogorov-Smirnov (K-S) test, the usual statistics, and exceedance probability. The tool preserved the daily rainfall values on wet days well; however, the generated rainfall is concentrated in shorter time periods, producing stronger storms. The difference between the generated and observed cumulative distribution function curves of the 4-hourly datasets passed the K-S test criteria, while for the hourly and 10-minute datasets the p-value had to be employed to show that their differences were reasonable. The results are encouraging considering the overestimation inherent in generated high-resolution rainfall data.
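
A minimal sketch of the K-S comparison and exceedance-probability curve described above, using synthetic gamma-distributed stand-ins for the observed and disaggregated hourly series (the distribution parameters and array names are illustrative, not the paper's data):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    observed_hourly = rng.gamma(0.40, 2.0, 2000)    # stand-in hourly depths (mm)
    simulated_hourly = rng.gamma(0.38, 2.2, 2000)   # stand-in disaggregated output

    # Two-sample K-S test: the largest gap between the two empirical CDFs.
    ks_stat, p_value = stats.ks_2samp(observed_hourly, simulated_hourly)
    print(f"K-S statistic D = {ks_stat:.3f}, p-value = {p_value:.4f}")

    # Exceedance probability: P(depth > x) against the sorted depths.
    x = np.sort(observed_hourly)[::-1]
    exceedance = np.arange(1, x.size + 1) / (x.size + 1)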

Keywords: DiMoN Tool, disaggregation, exceedance probability, Kolmogorov-Smirnov test, rainfall

Procedia PDF Downloads 177
235 Reliability Indices Evaluation of SEIG Rotor Core Magnetization with Minimum Capacitive Excitation for WECs

Authors: Lokesh Varshney, R. K. Saket

Abstract:

This paper presents a reliability indices evaluation of the rotor core magnetization of an induction motor operated as a self-excited induction generator (SEIG), using a probability distribution approach and Monte Carlo simulation. During the experimental study, parallel capacitors with a calculated minimum capacitive value were connected across the terminals of the induction motor operating as a SEIG with unregulated shaft speed. A three-phase, 4-pole, 50 Hz, 5.5 hp, 12.3 A, 230 V induction motor coupled with a DC shunt motor was tested in the electrical machine laboratory with variable reactive loads. Based on this experimental study, it is possible to choose a reliable induction machine to operate as a SEIG for unregulated renewable energy applications in remote areas or where the grid is not available. The failure density function, cumulative failure distribution function, survivor function, hazard model, probability of success and probability of failure for the reliability evaluation of the three-phase induction motor operating as a SEIG are presented graphically in this paper.
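
A minimal sketch of how the listed reliability indices relate to one another, assuming (for illustration only) Weibull-distributed failure times estimated by Monte Carlo sampling:

    import numpy as np

    rng = np.random.default_rng(0)
    t_fail = rng.weibull(2.0, size=10_000) * 1000.0     # Monte Carlo failure times (h)

    t = np.linspace(1, 2000, 200)
    F = np.array([(t_fail <= ti).mean() for ti in t])   # cumulative failure distribution F(t)
    R = 1.0 - F                                         # survivor (reliability) function R(t)
    f = np.gradient(F, t)                               # failure density f(t) = dF/dt
    h = f / np.clip(R, 1e-9, None)                      # hazard rate h(t) = f(t) / R(t)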

Keywords: residual magnetism, magnetization curve, induction motor, self-excited induction generator, probability distribution, Monte Carlo simulation

Procedia PDF Downloads 532
234 A Joint Possibilistic-Probabilistic Tool for Load Flow Uncertainty Assessment-Part II: Case Studies

Authors: Morteza Aien, Masoud Rashidinejad, Mahmud Fotuhi-Firuzabad

Abstract:

Power systems are innately uncertain systems, and robust uncertainty assessment tools are needed to deal with them. This paper examines, through several case studies, the uncertainty assessment formulation of the load flow (LF) problem developed in its companion paper, considering different kinds of uncertainties. The proposed methodology is based on evidence theory and the joint propagation of possibilistic and probabilistic uncertainties. Load and wind power generation are treated as probabilistic uncertain variables, while electric vehicles (EVs) and gas turbine distributed generation (DG) units are treated as possibilistic uncertain variables. The cumulative distribution function (CDF) of the system output parameters obtained by the pure probabilistic method lies within the belief and plausibility functions obtained by the joint propagation approach. Furthermore, the imprecision in the DG parameters is explicitly reflected by the gap between the belief and plausibility functions. This gap, due to the epistemic uncertainty on the DG resource parameters, grows as the penetration level increases.
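
A toy illustration of the belief/plausibility bounding described above, assuming the joint propagation yields one interval per probabilistic sample (all numbers are synthetic; the actual power-system computation is far richer):

    import numpy as np

    rng = np.random.default_rng(1)
    center = rng.normal(1.0, 0.05, 5000)   # probabilistic part (load, wind)
    half_width = 0.02                      # epistemic spread from DG parameters
    lo, hi = center - half_width, center + half_width

    thresholds = np.linspace(0.8, 1.2, 100)
    # Bel(X <= x): intervals entirely below x; Pl(X <= x): intervals reaching below x.
    belief = np.array([(hi <= x).mean() for x in thresholds])
    plausibility = np.array([(lo <= x).mean() for x in thresholds])
    # The pure-probabilistic CDF (computed from the interval centers) stays inside.
    cdf = np.array([(center <= x).mean() for x in thresholds])
    assert np.all((belief <= cdf) & (cdf <= plausibility))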

Keywords: electric vehicles, joint possibilistic-probabilistic uncertainty modeling, uncertain load flow, wind turbine generator

Procedia PDF Downloads 401
233 Effect of Organic Fertilizers on the Improvement of Soil Microbiological Functioning under Saline Conditions of Arid Regions: Impact on Carbon and Nitrogen Mineralization

Authors: Oustani Mabrouka, Halilat Md Tahar, Hannachi Slimane

Abstract:

This study was conducted on representative and contrasting soils of arid regions. It focuses on the compared influence of two organic fertilizers, poultry manure (PM) and bovine manure (BM), on improving the microbial functioning of non-saline (SS) and saline (SSS) soils, in particular the mineralization of nitrogen and carbon. Microbiological activity was estimated by a respirometric test (CO2-C emissions) and the extraction of two forms of mineral nitrogen (NH4+-N and NO3--N). After 56 days of incubation under controlled conditions (28 °C and 80% of field capacity), the two types of manure showed that mineralization activity varies according to the type of soil and the organic substrate itself. The highest cumulative quantities of CO2-C, NH4+-N and NO3--N at the end of incubation were recorded in the non-saline (SS) soil treated with poultry manure, with 1173.4, 4.26 and 8.40 mg/100 g of dry soil, respectively. The reductions in the release rate of CO2-C and in nitrification under saline conditions were 21% and 36.78%, respectively. The organic substrate had a stimulating effect on the density of all microbial groups studied. Overall, the results show the usefulness of the two types of manure for improving the microbiological functioning of arid soils.

Keywords: salinity, organic matter, microorganisms, mineralization, nitrogen, carbon, arid regions

Procedia PDF Downloads 251
232 Releasing Two Insect Predators to Control of Aphids Under Open-field Conditions

Authors: Mohamed Ahmed Gesraha, Amany Ramadan Ebeid

Abstract:

Aphids are noxious and persistent pests in open fields worldwide. Many authors have studied the possibility of aphid control by releasing ladybirds and lacewings at different rates under open-field conditions. The results clarify that releasing 3rd-instar larvae of Coccinella undecimpunctata at a rate of 1 larva:50 aphids was more effective than rates of 1:100 or 1:200 for controlling the Aphis gossypii population in an okra field, producing more than a 90% reduction in the aphid population within 15 days. When 2nd-instar larvae of Chrysoperla carnea were released at 1:5, 1:10 and 1:20 (predator:aphid), the first rate was the most effective, inducing a 98.93% reduction in the aphid population, while the two other rates produced smaller reductions. Additionally, in the case of double releases, the reduction at the 1:5 rate was 99.63%, emphasizing that this rate was the most effective; the other rates induced 97.05% and 95.64% reductions. Generally, a double release was more effective than a single one at all tested rates, because of the cumulative presence of the predators in large numbers over the same period of the experiment. It can be concluded that releasing insect predators (Coccinella undecimpunctata or Chrysoperla carnea) at an early larval stage is sufficient to reduce aphid populations under open-field conditions.

Keywords: releasing predators, lacewings, ladybird, open fields

Procedia PDF Downloads 147
231 Sono- and Photocatalytic Degradation of Indigocarmine in Water Using ZnO

Authors: V. Veena, Suguna Yesodharan, E. P. Yesodharan

Abstract:

Two Advanced Oxidation Processes (AOPs), sono- and photo-catalysis mediated by the semiconductor oxide catalyst ZnO, have been found effective for the removal of trace amounts of the toxic dye pollutant Indigocarmine (IC) from water. The effects of various reaction parameters, such as dye concentration, catalyst dosage, temperature, pH and dissolved oxygen, as well as the addition of oxidisers and the presence of salts in water, on the rate of degradation have been evaluated and optimised. The degradation follows variable kinetics depending on the concentration of the substrate, the order of reaction varying from 1 to 0 with increasing concentration. The reaction proceeds through a number of intermediates, many of which have been identified by GC-MS; the intermediates do not affect the rate of degradation significantly. The influence of anions such as chloride, sulphate, fluoride, carbonate, bicarbonate and phosphate on the degradation of IC is not consistent and does not follow any predictable pattern: phosphates and fluorides inhibit the degradation, while chloride, sulphate, carbonate and bicarbonate enhance it. Adsorption studies of the dye in the absence and presence of these anions show that there may not be any direct correlation between the adsorption of the dye on the catalyst and the degradation. Oxidants such as hydrogen peroxide and persulphate enhance the degradation, though their combined effect is less than the cumulative effect of the individual components. COD measurements show that the degradation proceeds to complete mineralisation. The results will be presented and a probable mechanism for the degradation will be discussed.
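
The reported drift in apparent reaction order from 1 (dilute) to 0 (concentrated) is commonly modelled with Langmuir-Hinshelwood kinetics, rate = kKC/(1 + KC); a fitting sketch on synthetic data (k, K and the concentrations are illustrative, not the paper's):

    import numpy as np
    from scipy.optimize import curve_fit

    def lh_rate(C, k, K):
        # first-order when K*C << 1, zero-order when K*C >> 1
        return k * K * C / (1.0 + K * C)

    rng = np.random.default_rng(2)
    C = np.array([5, 10, 20, 40, 80, 160.0])   # dye concentration (mg/L)
    rate = lh_rate(C, 0.9, 0.05) * (1 + 0.03 * rng.normal(size=C.size))

    (k_fit, K_fit), _ = curve_fit(lh_rate, C, rate, p0=(1.0, 0.1))
    print(f"k = {k_fit:.2f} mg/L/min, K = {K_fit:.3f} L/mg")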

Keywords: AOP, COD, indigocarmine, photocatalysis, sonocatalysis

Procedia PDF Downloads 298
230 Credible Autopsy Report for Investigators and Judiciary

Authors: Sudhir K. Gupta

Abstract:

Introduction: When a forensic doctor determines that a suspicious death is a suicide, homicide, or accident, the decision becomes virtually incontestable by the investigating police officer, and the question arises whether the medical opinion was formed with the necessary checks and balances against the other possibilities of the case. It is suggested that the opinion of forensic medical experts is conventional, mutable, and shifts from one expert to another. The determination of suicide, accident, or homicide is mandatory and is the gold standard of a death investigation. Forensic investigations serve many audiences, but the court is by far the most critical: the likely questions on direct and cross-examination determine how forensic doctors gather and handle evidence and what conclusions they reach. Methodology: The author interacted with the investigating authority, visited the crime scenes, and perused the postmortem reports, subsequent opinions, crime scene photographs, and the statements of witnesses and the accused. All relevant scientific documents and the opinions of other forensic doctors, forensic scientists, and ballistic experts involved in these cases were further analysed to arrive at an opinion with scientific justification. Findings: The opinions arrived at by the author, and how they helped the judiciary deliver justice in these cases, are discussed in this article. This can help readers understand the process involved in formulating a credible forensic medical expert opinion for investigators and the judiciary. Conclusion: A criminal case might be won or lost over doubt cast on the chain of custody. Medically trained forensic doctors therefore learn to practice their profession in legally appropriate ways, and opinions must be based on medical justifications with credible references.

Keywords: forensic doctor, professional credibility, investigation, expert opinion

Procedia PDF Downloads 45
229 Applying Concept Mapping to Explore Temperature Abuse Factors in the Processes of Cold Chain Logistics Centers

Authors: Marco F. Benaglia, Mei H. Chen, Kune M. Tsai, Chia H. Hung

Abstract:

As societal and family structures, consumer dietary habits, and awareness of food safety and quality continue to evolve in most developed countries, the demand for refrigerated and frozen foods has been growing, and the issues related to their preservation have gained increasing attention. A well-established cold chain logistics system is essential to avoid any temperature abuse; therefore, assessing potential disruptions in the operational processes of cold chain logistics centers becomes pivotal. This study first employs HACCP to find disruption factors in cold chain logistics centers that may cause temperature abuse. Then, concept mapping is applied: selected experts engage in brainstorming sessions to identify further factors. The panel consists of ten experts: four from logistics and home delivery, two from retail distribution, one from the food industry, two from low-temperature logistics centers, and one from the freight industry. Disruptions include equipment-related aspects, human factors, management aspects, and process-related considerations. The areas of observation encompass freezer rooms, refrigerated storage areas, loading docks, sorting areas, and vehicle parking zones. The experts also categorize the disruption factors based on perceived similarities and build a similarity matrix, and each factor is evaluated for its impact, frequency, and investment importance. Next, multidimensional scaling, cluster analysis, and other methods are used to analyze these factors. Key disruption factors are identified based on their impact and frequency, and the factors that companies prioritize and are willing to invest in are then determined by assessing investors' risk-aversion behavior. Finally, Cumulative Prospect Theory (CPT) is applied to verify the risk patterns. In total, 66 disruption factors are found and categorized into six clusters: (1) "Inappropriate Use and Maintenance of Hardware and Software Facilities", (2) "Inadequate Management and Operational Negligence", (3) "Product Characteristics Affecting Quality and Inappropriate Packaging", (4) "Poor Control of Operation Timing and Missing Distribution Processing", (5) "Inadequate Planning for Peak Periods and Poor Process Planning", and (6) "Insufficient Cold Chain Awareness and Inadequate Training of Personnel". The study also identifies five critical factors in the operational processes of cold chain logistics centers: "Lack of Personnel's Awareness Regarding Cold Chain Quality", "Personnel Not Following Standard Operating Procedures", "Personnel's Operational Negligence", "Management's Inadequacy", and "Lack of Personnel's Knowledge About Cold Chain". The findings show that cold chain operators prioritize prevention and improvement efforts in the "Inappropriate Use and Maintenance of Hardware and Software Facilities" cluster, particularly the factors "Temperature Setting Errors" and "Management's Inadequacy". However, the application of CPT reveals that companies are not usually willing to invest in improving factors in this cluster because of their low likelihood of occurrence, even though they acknowledge the severity of the consequences should they occur. Hence, the main implication is that the key disruption factors in cold chain logistics centers' processes are associated with personnel issues; comprehensive training, periodic audits, and the establishment of reasonable incentives and penalties for both new employees and managers may therefore significantly reduce disruption issues.
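
For context, a minimal sketch of a CPT valuation of a rare, severe disruption, using the standard Tversky-Kahneman (1992) value and probability-weighting functions with their published loss-side parameters (assumed here for illustration; the study's own parameterization is not given in the abstract):

    def value(x, alpha=0.88, lam=2.25):
        # S-shaped value function: concave for gains, steeper (loss-averse) for losses
        return x**alpha if x >= 0 else -lam * (-x)**alpha

    def weight(p, gamma=0.69):
        # inverse-S probability weighting: small probabilities are overweighted
        return p**gamma / (p**gamma + (1 - p)**gamma)**(1 / gamma)

    # A rare but severe disruption: 5% chance of a 100-unit loss.
    p, loss = 0.05, -100.0
    cpt_value = weight(p) * value(loss)
    expected_value = p * loss
    print(cpt_value, expected_value)   # CPT value is more negative than plain EV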

Keywords: concept mapping, cold chain, HACCP, cumulative prospect theory

Procedia PDF Downloads 23
228 Modeling of Electrokinetic Mixing in Lab on Chip Microfluidic Devices

Authors: Virendra J. Majarikar, Harikrishnan N. Unni

Abstract:

This paper sets out to demonstrate the modeling of electrokinetic mixing in a lab-on-chip microfluidic device, employing stationary and time-dependent electroosmotic flow in a microchannel with alternating zeta patches on the lower surface of the micromixer. Electroosmotic flow is amplified using different 2D and 3D model designs with alternating zeta potential values of 25, 50, and 100 mV to achieve high-concentration mixing in the electrokinetically driven microfluidic system. The enhancement of electrokinetic mixing is studied using finite element modeling, and the simulation workflow is accomplished in defined integral steps. The presence of alternating zeta patches helps induce microvortex flows inside the channel, which in turn improve the mixing efficiency. Fluid flow and concentration fields are simulated by solving the Navier-Stokes equation (applying the Helmholtz-Smoluchowski slip velocity boundary condition) and the convection-diffusion equation. The effects of the magnitude of the zeta potential, the number of alternating zeta patches, etc. are analysed thoroughly. The 2D simulations reveal a cumulative increase in concentration mixing, whereas the 3D simulations differ slightly from the 2D model at low zeta potential within the T-shaped micromixer, for inlet concentrations of 1 mol/m3 and 0 mol/m3, respectively. Moreover, the 2D model results were compared with those of the 3D model to indicate the importance of 3D modeling in the microfluidic design process.
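
The Helmholtz-Smoluchowski condition has the closed form u = -εζE/μ, so the slip speeds implied by the quoted zeta values can be estimated directly; the field strength and water properties below are assumed values, not the paper's:

    # Helmholtz-Smoluchowski slip speed magnitude u = eps * zeta * E / mu
    eps0, eps_r = 8.854e-12, 80.0     # vacuum permittivity (F/m), rel. permittivity of water
    mu = 1.0e-3                       # dynamic viscosity of water (Pa*s)
    E = 1.0e4                         # applied field (V/m), illustrative

    for zeta_mV in (25, 50, 100):
        zeta = zeta_mV * 1e-3
        u = eps0 * eps_r * zeta * E / mu
        print(f"zeta = {zeta_mV:3d} mV -> u = {u * 1e6:.0f} um/s")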

Keywords: COMSOL Multiphysics®, electrokinetic, electroosmotic, microfluidics, zeta potential

Procedia PDF Downloads 218
227 Biochar Assisted Municipal Wastewater Treatment and Nutrient Recycling

Authors: A. Pokharel, A. Farooque, B. Acharya

Abstract:

Pyrolysis can be used for energy production from waste biomass from agriculture and forestry. Biochar is the solid byproduct of pyrolysis, and its cascading use can offset the cost of the process. A wide variety of research on biochar has highlighted its ability to adsorb nutrients, metals and complex compounds; filter suspended solids; enhance the growth of microorganisms; retain water and nutrients; and increase the carbon content of soil. In addition, sustainable biochar systems are an attractive approach to carbon sequestration and total waste management. Commercially available biochar from Sigma-Aldrich was studied for the adsorption of nitrogen from the effluent of a municipal wastewater treatment plant, and its adsorption isotherm and breakthrough curve were determined. The effects of biochar in aerobic as well as anaerobic bioreactors were also studied; in both cases, biomass increased in the presence of biochar. The amount of gas produced by the anaerobic digestion of a fruit mix (apple and banana) was similar, but the rate of production was significantly faster in biochar-fed reactors. The cumulative goal of the study is to use biochar in various wastewater treatment units, such as the aeration tank, secondary clarifier and tertiary nutrient recovery system, as well as in the anaerobic digestion of the sludge, to optimize utilization and add value before it is used as a soil amendment.
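
A sketch of the isotherm-fitting step mentioned above, assuming a Langmuir form q = qmax*b*C/(1 + b*C); the data points are hypothetical stand-ins for the nitrogen-adsorption measurements:

    import numpy as np
    from scipy.optimize import curve_fit

    def langmuir(C, qmax, b):
        # q: uptake at equilibrium concentration C; qmax: monolayer capacity
        return qmax * b * C / (1.0 + b * C)

    C_eq = np.array([2, 5, 10, 20, 40, 80.0])      # equilibrium N conc. (mg/L)
    q = np.array([0.8, 1.7, 2.8, 4.0, 4.9, 5.5])   # uptake (mg/g), hypothetical

    (qmax, b), _ = curve_fit(langmuir, C_eq, q, p0=(6.0, 0.05))
    print(f"qmax = {qmax:.2f} mg/g, b = {b:.3f} L/mg")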

Keywords: biochar, nutrient recycling, wastewater treatment, soil amendment

Procedia PDF Downloads 107
226 Probabilistic Crash Prediction and Prevention of Vehicle Crash

Authors: Lavanya Annadi, Fahimeh Jafari

Abstract:

Transportation brings immense benefits to society, but it also has costs: the cost of infrastructure, personnel and equipment, the loss of life and property in road traffic accidents, delays due to traffic congestion, and various indirect costs. Much research has been done to identify the factors that affect road accidents, such as road infrastructure, traffic, sociodemographic characteristics, land use, and the environment. The aim of this research is to predict vehicle crash probabilities in the United States using machine learning, focusing on natural and structural causes and excluding spontaneous ones such as overspeeding. The factors considered range from weather variables, such as weather conditions, precipitation, visibility, wind speed, wind direction, temperature, pressure, and humidity, to man-made road structures, such as bumps, roundabouts, no-exits, turning loops, give-ways, etc. The probabilities are divided into ten classes, and all predictions are based on multiclass classification techniques, which are supervised learning. This study considers all crashes in all states collected by the US government. To calculate the probability, the multinomial expected value was used and assigned as the crash-probability classification label. We applied three different classification models: multiclass Logistic Regression, Random Forest and XGBoost. The numerical results show that XGBoost achieved a 75.2% accuracy rate, which indicates the part played by natural and structural causes in crashes. The paper also provides in-depth insights through exploratory data analysis.
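
A minimal sketch of the multiclass setup described: weather and road-structure features in, one of ten crash-probability classes out. The data are synthetic, and the model is one of the three listed (Random Forest); this is not the study's pipeline:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(3)
    X = rng.normal(size=(5000, 12))                       # weather + road features
    score = 2 * X[:, 0] + X[:, 1] + rng.normal(size=5000) # latent risk score
    y = np.clip(np.round(score), -5, 4).astype(int) + 5   # ten classes, 0..9

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
    model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print("accuracy:", accuracy_score(y_te, model.predict(X_te)))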

Keywords: road safety, crash prediction, exploratory analysis, machine learning

Procedia PDF Downloads 72
225 The Effects of Land Use Types to Determine the Status of Sustainable River

Authors: Michael Louis Sunaris, Robby Yussac Tallar

Abstract:

The concept of the sustainable river is evolving in Indonesia today. The condition of many Indonesian rivers has deteriorated in both quality and quantity. This degradation is caused by rapid land use change resulting from increased population growth and human activity, and it brings about the degradation of the watersheds themselves; the types of land use within a watershed are an important factor in determining the status of river sustainability. Therefore, an evaluation method is required to determine the sustainability status of waterbodies within a watershed. The purpose of this study is to analyze how various types of land use determine the status of river sustainability, taking the upstream Citarum watersheds as the study area. The analysis shows that the sustainability status index of the rivers in the study area has changed from good to average or bad. The rapid and uncontrolled changes in land use, especially in the upper watershed areas, are the main cause of this change over time, and the cumulative runoff coefficients were found to increase significantly. These findings indicate that the damage to the watersheds affects the yearly water surplus or deficit; the rivers in Indonesia should therefore be protected and conserved. The river sustainability index indicates the condition of watersheds by defining the status of their rivers, in order to support sustainable water resource management.
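
The runoff coefficient the study tracks can be illustrated with a simple area-weighted composite over land use types (the areas and coefficients below are illustrative, not the Citarum values):

    # Area-weighted runoff coefficient: rises as the urban share of the watershed grows.
    land_use = {                 # name: (area_km2, runoff coefficient C)
        "forest":      (120.0, 0.15),
        "agriculture": ( 80.0, 0.35),
        "urban":       ( 60.0, 0.75),
    }
    total_area = sum(a for a, _ in land_use.values())
    c_composite = sum(a * c for a, c in land_use.values()) / total_area
    print(f"composite C = {c_composite:.2f}")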

Keywords: land use change, runoff coefficient, a simple index, sustainable river

Procedia PDF Downloads 121
224 Wireless Sensor Network for Forest Fire Detection and Localization

Authors: Tarek Dandashi

Abstract:

WSNs may provide a fast and reliable solution for the early detection of environmental events like forest fires, which is crucial for alerting and calling for fire brigade intervention. Sensor nodes communicate sensor data to a host station, which enables a global analysis and the generation of a reliable decision on a potential fire and its location. A WSN built with TinyOS and nesC is presented for capturing and transmitting a variety of sensor information with controlled source, data rates and duration, and for recording and displaying activity traces. We propose a similarity distance (SD) between the distribution of currently sensed data and that of a reference. At any given time, a fire causes diverging opinions in the reported data, which alters the usual data distribution. Essentially, SD is a metric on the cumulative distribution function (CDF). SD is designed to be invariant to day-to-day changes of temperature, changes due to the surrounding environment, and normal changes in weather, all of which preserve the data locality. Evaluation shows that SD sensitivity grows quadratically with an increase in sensor node temperature, for groups of sensors of different sizes and neighborhoods. Simulation of fire spreading, with ignition placed at random locations under some wind speed, shows that SD takes a few minutes to reliably detect fires and locate them. We also discuss false negatives and false positives and their impact on decision reliability.
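
One plausible reading of the SD metric, reconstructed as a K-S-type maximum gap between the CDF of current readings and a reference CDF (an illustrative reconstruction, not the paper's exact definition):

    import numpy as np

    def similarity_distance(current, reference):
        # Largest vertical gap between the two empirical CDFs.
        grid = np.union1d(current, reference)
        def ecdf(sample):
            return np.searchsorted(np.sort(sample), grid, side="right") / len(sample)
        return float(np.max(np.abs(ecdf(current) - ecdf(reference))))

    rng = np.random.default_rng(4)
    reference = rng.normal(25, 2, 500)     # normal-day temperature readings
    current = reference + 6.0              # fire-shifted readings
    print(similarity_distance(current, reference))   # close to 1 -> raise an alarm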

Keywords: forest fire, WSN, wireless sensor network, algorithm

Procedia PDF Downloads 236
223 Effect of Dimensional Reinforcement Probability on Discrimination of Visual Compound Stimuli by Pigeons

Authors: O. V. Vyazovska

Abstract:

Behavioral efficiency is one of the main principles of success in nature. The accuracy of visual discrimination is determined by attention, learning experience, and memory. In the experimental setting, pigeons' responses to visual stimuli presented on a monitor screen are behaviorally manifested by pecking or not pecking the stimulus, by the number of pecks, by reaction time, etc. The higher the probability of reward, the more likely pigeons are to respond to the stimulus. We trained 8 pigeons (Columba livia) on a stagewise go/no-go visual discrimination task. Sixteen visual stimuli were created from all possible combinations of four binary dimensions: brightness (dark/bright), size (large/small), line orientation (vertical/horizontal), and shape (circle/square). In the first stage, we presented S+ and 4 S- stimuli: the first differing from S+ in all four dimension values, the second sharing brightness with S+, the third sharing brightness and orientation with S+, and the fourth sharing brightness, orientation and size. Then all 16 stimuli were added. Pigeons correctly rejected 6-8 of the 11 newly added S- stimuli at the beginning of the second stage. The results revealed that the pigeons' behavior at the beginning of the second stage was controlled by the reward probabilities of the four dimensions learned in the first stage. The number of mistakes in dimension discrimination at the beginning of the second stage depended on the number of S- stimuli sharing that dimension with S+ in the first stage: a significant inverse correlation was found between the number of S- stimuli sharing dimension values with S+ in the first stage and the dimensional learning rate at the beginning of the second stage. Pigeons were more confident in discriminating the shape and size dimensions, and the mistakes they made at the beginning of the second stage were not associated with these dimensions. These results help elucidate the principles of dimensional stimulus control in learning compound multidimensional visual stimuli.

Keywords: visual go/no go discrimination, selective attention, dimensional stimulus control, pigeon

Procedia PDF Downloads 108
222 An Investigation of the Therapeutic Effects of Indian Classical Music (Raga Bhairavi) on Mood and Physiological Parameters of Scholars

Authors: Kalpana Singh, Nikita Katiyar

Abstract:

This research investigates the impact of Raga Bhairavi, a prominent musical scale in Indian classical music, on the mood and basic physiological parameters of research scholars at the University of Lucknow, India. The study focuses on the potential therapeutic effects of listening to Raga Bhairavi during morning hours. A controlled experimental design is employed, utilizing self-reporting tools for mood assessment and monitoring physiological indicators such as heart rate, oxygen saturation, body temperature and blood pressure. The hypothesis posits that exposure to Raga Bhairavi will lead to positive mood modulation and a reduction in physiological stress markers among research scholars. Data collection involves pre- and post-exposure measurements, providing insights into the immediate and cumulative effects of the musical intervention. The study aims to contribute valuable information to the growing field of music therapy, offering a potential avenue for enhancing the well-being and productivity of individuals engaged in intense cognitive activities. The results may have implications for the integration of music-based interventions in academic and research environments, fostering a conducive atmosphere for intellectual pursuits.

Keywords: bio-musicology, classical music, mood assessment, music therapy, physiology, Raga Bhairavi

Procedia PDF Downloads 16
221 Free Will and Compatibilism in Decision Theory: A Solution to Newcomb’s Paradox

Authors: Sally Heyeon Hwang

Abstract:

Within decision theory, there are normative principles that dictate how one should act, in addition to empirical theories of actual behavior. As a normative guide to one's actual behavior, evidential or causal decision-theoretic equations allow one to identify outcomes with maximal utility values. The choice that each person makes will, of course, differ according to varying assignments of weight and probability values. Regarding these different choices, it remains a subject of considerable philosophical controversy whether individual subjects have the capacity to exercise free will with respect to the assignment of probabilities, or whether instead the assignment is in some way constrained. A version of this question is given a precise form in Richard Jeffrey's assumption that free will is necessary for Newcomb's paradox to count as a decision problem. This paper argues, against Jeffrey, that decision theory does not require the assumption of libertarian freedom. One of the hallmarks of decision-making is its application across a wide variety of contexts; the implications of a background assumption of free will are similarly varied. One constant across contexts of decision is that there are always at least two levels of choice for a given agent, depending on the degree of prior constraint. Within the context of Newcomb's problem, when the predictor is attempting to guess the choice the agent will make, he or she is analyzing the determined aspects of the agent, such as past characteristics, experiences, and knowledge. On the other hand, as David Lewis' backtracking argument concerning the relationship between past and present events brings to light, there are similarly varied ways in which the past can actually be dependent on the present. One implication of this argument is that even in deterministic settings, an agent can have more free will than it may seem. This paper thus argues against the view that a stable background assumption of free will or determinism in decision theory is necessary, arguing instead for a compatibilist decision theory that yields a novel treatment of Newcomb's problem.
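
For readers unfamiliar with why Newcomb's problem splits the two decision theories, a worked payoff comparison under the standard setup (the predictor accuracy and prior are illustrative choices):

    # Standard payoffs: the opaque box holds $1M iff the predictor foresaw one-boxing.
    acc = 0.99                      # predictor accuracy (illustrative)
    M, K = 1_000_000, 1_000

    # Evidential decision theory: condition the prediction on the chosen act.
    edt_one_box = acc * M + (1 - acc) * 0
    edt_two_box = (1 - acc) * (M + K) + acc * K

    # Causal decision theory: the contents are already fixed; let p be the prior
    # probability the $1M is there (the act cannot change it).
    p = 0.5
    cdt_one_box = p * M
    cdt_two_box = p * (M + K) + (1 - p) * K   # = cdt_one_box + K, for every p

    print(edt_one_box > edt_two_box)   # True: EDT recommends one-boxing
    print(cdt_two_box > cdt_one_box)   # True: CDT two-boxes, by dominance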

Keywords: decision theory, compatibilism, free will, Newcomb’s problem

Procedia PDF Downloads 294
220 Assessment of Land Surface Temperature Using Satellite Remote Sensing

Authors: R. Vidhya, M. Navamuniyammal, M. Sivakumar, S. Reeta

Abstract:

Unplanned urbanization affects the environment through pollution, altered atmospheric conditions, decreased vegetation, and changes in the pervious and impervious soil surface; the Urban Heat Island is considered the cumulative effect of all these impacts. In this paper, the urban heat island effect is studied for Chennai city, Tamil Nadu, South India, using satellite remote sensing data. LANDSAT 8 OLI and TIRS data acquired on 9 September 2014 were used to derive a Land Surface Temperature (LST) map, a vegetation fraction map, an impervious surface fraction map, and Normalized Difference Water Index (NDWI), Normalized Difference Built-up Index (NDBI) and Normalized Difference Vegetation Index (NDVI) maps. The relationships among LST, vegetation fraction, NDBI, NDWI, and NDVI were calculated. Chennai city's urban heat island effect is significant, and the results indicate that LST has a strong negative correlation with the vegetation present and a positive correlation with NDBI. Vegetation is the main factor for controlling urban heat island effects in an urban area like Chennai. This study will help in developing land use planning measures, based on remote sensing derivatives, to reduce heat effects in urban areas.
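
The three spectral indices used above have standard band formulas; a sketch with Landsat 8 OLI band conventions (Green = B3, Red = B4, NIR = B5, SWIR1 = B6) and random stand-in reflectance grids:

    import numpy as np

    def ndvi(nir, red):   return (nir - red) / (nir + red)      # vegetation
    def ndwi(green, nir): return (green - nir) / (green + nir)  # water (McFeeters)
    def ndbi(swir, nir):  return (swir - nir) / (swir + nir)    # built-up area

    rng = np.random.default_rng(5)   # stand-in reflectance grids, one per band
    green, red, nir, swir = (rng.uniform(0.05, 0.6, (100, 100)) for _ in range(4))
    print(float(ndvi(nir, red).mean()), float(ndbi(swir, nir).mean()))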

Keywords: land surface temperature, brightness temperature, emissivity, vegetation index

Procedia PDF Downloads 244
219 Place of Radiotherapy in the Treatment of Intracranial Meningiomas: Experience of the Cancer Center Emir Abdelkader of Oran Algeria

Authors: Taleb L., Benarbia M., Boutira F. M., Allam H., Boukerche A.

Abstract:

Introduction and purpose of the study: Meningiomas are the most common non-glial intracranial tumors in adults, accounting for approximately 30% of all central nervous system tumors. The aim of our study is to determine the epidemiological, clinical, therapeutic, and evolutionary characteristics of a cohort of patients with intracranial meningioma treated with radiotherapy at the Emir Abdelkader Cancer Center in Oran. Material and methods: This is a retrospective study of 44 patients treated during the period from 2014 to 2020. The overall survival and relapse-free survival curves were calculated using the Kaplan-Meier method. Results and statistical analysis: The median age of the patients was 49 years [21-76 years], with a clear female predominance (sex ratio = 2.4). The average diagnostic delay was seven months [2 to 24 months]; the circumstances of discovery were dominated by headaches in 54.5% of cases (n=24), visual disturbances in 40.9% (n=18), and motor disorders in 15.9% (n=7). The tumor was located mainly at the base of the skull in 52.3% of patients (n=23), including 29.5% (n=13) in the cavernous sinus, with 27.3% (n=12) parasagittal and 20.5% (n=9) at the convexity. The diagnosis was confirmed surgically in 36 patients (81.8%), with the histopathological study showing grades I, II, and III in 40.9%, 29.5%, and 11.4% of cases, respectively. Radiotherapy was indicated postoperatively in 45.5% of patients (n=20), as exclusive treatment in 27.3% (n=12), and after tumor recurrence in 27.3% of cases (n=18). The irradiation doses delivered were 50 Gy (20.5%), 54 Gy (65.9%), and 60 Gy (13.6%). With a median follow-up of 69 months, the probabilities of relapse-free survival and overall survival are 93.2% and 95.4% at three years, and 71.2% and 80.7% at five years, respectively. Conclusion: Meningiomas are common primary brain tumors, most often benign, but they can also progress aggressively. Their treatment is essentially surgical, but radiotherapy retains its place in specific situations, allowing good tumor control and overall survival.
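
A minimal Kaplan-Meier estimator of the kind used for these survival curves; the follow-up times and event flags below are synthetic stand-ins, not the 44-patient data:

    import numpy as np

    def kaplan_meier(time, event):
        """time: follow-up in months; event: 1 = relapse/death, 0 = censored."""
        order = np.argsort(time)
        time, event = time[order], event[order]
        at_risk, surv, curve = len(time), 1.0, []
        for t, e in zip(time, event):
            if e:                               # survival drops only at events
                surv *= 1.0 - 1.0 / at_risk
            at_risk -= 1                        # censored patients leave the risk set
            curve.append((float(t), surv))
        return curve

    t = np.array([6, 12, 20, 30, 36, 45, 60, 69])
    d = np.array([0, 1, 0, 0, 1, 0, 0, 0])
    print(kaplan_meier(t, d))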

Keywords: diagnosis, meningioma, surgery, radiotherapy, survival

Procedia PDF Downloads 60
218 Automatic Identification and Monitoring of Wildlife via Computer Vision and IoT

Authors: Bilal Arshad, Johan Barthelemy, Elliott Pilton, Pascal Perez

Abstract:

Reliable, informative, and up-to-date information about the location, mobility, and behavioural patterns of animals will enhance our ability to research and preserve biodiversity. The fusion of infrared sensors and camera traps offers an inexpensive way to collect wildlife data in the form of images. However, extracting useful data from these images, such as identifying and counting animals, remains a manual, time-consuming, and costly process. In this paper, we demonstrate that such information can be automatically retrieved using state-of-the-art deep learning methods. Another major challenge ecologists face is counting a single animal multiple times because it reappears in other images taken by the same or other camera traps; yet such information can be extremely useful for tracking wildlife and understanding its behaviour. To tackle the multiple-count problem, we have designed a meshed network of camera traps that share the captured images along with timestamps, cumulative counts, and the dimensions of the animal. The proposed method leverages edge computing to support real-time tracking and monitoring of wildlife. It has been validated in the field and can easily be extended to other applications focusing on wildlife monitoring and management, where the traditional way of monitoring is expensive and time-consuming.
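
A toy sketch of the de-duplication idea: sightings shared across the camera mesh are merged when the same species reappears within a short time window (the window length and record layout are assumptions for illustration, not the paper's protocol):

    from datetime import datetime, timedelta

    sightings = [  # (camera_id, species, timestamp), pooled across the mesh
        ("cam1", "deer", datetime(2019, 6, 1, 6, 0)),
        ("cam2", "deer", datetime(2019, 6, 1, 6, 3)),   # same animal, re-seen nearby
        ("cam1", "deer", datetime(2019, 6, 1, 9, 0)),   # later, counted again
    ]

    window = timedelta(minutes=10)
    count, last_seen = 0, {}
    for cam, species, ts in sorted(sightings, key=lambda s: s[2]):
        if species not in last_seen or ts - last_seen[species] > window:
            count += 1                  # a genuinely new sighting
        last_seen[species] = ts
    print(count)                        # 2, not 3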

Keywords: computer vision, ecology, internet of things, invasive species management, wildlife management

Procedia PDF Downloads 107
217 Intergenerational Class Mobility in Greece: A Cross-Cohort Analysis with Evidence from European Union-Statistics on Income and Living Conditions

Authors: G. Stamatopoulou, M. Symeonaki, C. Michalopoulou

Abstract:

In this work, we study intergenerational social mobility in Greece in order to provide up-to-date evidence on changes in mobility patterns throughout the years. The analysis covers both men and women aged 25-64 and addresses three main research objectives: first, to examine the relationship between the socio-economic status of parents and that of their children; second, to investigate the evolution of mobility patterns across birth cohorts; and third, to explore the role of education in shaping mobility patterns. For the analysis, we draw data on both parental and individual social outcomes from different national databases. The social classes of origin and destination are measured according to the European Socio-Economic Classification (ESeC), while the respondents' educational attainment is coded into categories based on the International Standard Classification of Education (ISCED). Applying Markov transition probability theory and a range of measures and models, this work focuses on the magnitude and direction of the movements that take place in the Greek labour market, as well as the level of social fluidity. Three-way mobility tables are presented, in which the transition probabilities between the classes of destination and origin are calculated for different cohorts, along with a range of absolute and relative mobility rates and distance measures. The study covers birth cohorts from 1940 to 1995, shedding light on the effects of national institutional processes on the social movements of individuals. Given the evidence on the mobility patterns of the most recent birth cohorts, we also investigate the possible effects of the 2008 economic crisis.
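
The core Markov step is row-normalising an origin-by-destination mobility table into transition probabilities; a sketch with illustrative counts for three ESeC-style classes:

    import numpy as np

    counts = np.array([           # rows: class of origin, cols: class of destination
        [120,  60,  20],
        [ 50, 140,  40],
        [ 15,  55, 100],
    ])
    P = counts / counts.sum(axis=1, keepdims=True)   # P[i, j] = Pr(dest j | origin i)
    immobility = np.trace(counts) / counts.sum()     # share staying on the diagonal
    print(P.round(2), f"immobile share = {immobility:.2f}")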

Keywords: cohort analysis, education, Greece, intergenerational mobility, social class

Procedia PDF Downloads 76
216 Experimental Study on a Solar Heat Concentrating Steam Generator

Authors: Qiangqiang Xu, Xu Ji, Jingyang Han, Changchun Yang, Ming Li

Abstract:

In place of a complex solar concentrating unit, this paper designs a solar heat-concentrating medium-temperature steam-generating system. Solar radiation is collected by a large solar collecting and heat-concentrating plate and converged onto a metal evaporating pipe with highly efficient heat transfer. In the meantime, heat loss is reduced by employing a double-glazed cover and other heat-insulating structures; thus, a high temperature is reached in the metal evaporating pipe. The influence of the system's structural parameters on performance is analyzed, and the steam production rate and steam production are obtained for different values of solar irradiance, solar collecting and heat-concentrating plate area, plate temperature and heat loss. The results show that when the solar irradiance is higher than 600 W/m2, the effective heat-collecting area is 7.6 m2 and the double-glazed cover is adopted, the system's heat loss is lower than the solar input, and stable steam is produced in the metal evaporating pipe at 100 °C, 110 °C, and 120 °C, respectively. When the average solar irradiance is about 896 W/m2 and the cumulative steaming time is about 5 hours, the daily steam production of the system is about 6.174 kg. Within a single day, the solar irradiance is larger at noon, so the steam production rate is largest then; before 9:00 and after 16:00, the solar irradiance is smaller, and the steam production rate is almost zero.
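
A back-of-envelope check of the reported daily yield against the collected solar energy, assuming feedwater enters at 25 °C and evaporates at 100 °C (assumptions not stated in the abstract):

    G = 896.0          # mean irradiance, W/m2 (reported)
    A = 7.6            # effective collecting area, m2 (reported)
    t = 5 * 3600.0     # steaming time, s (reported ~5 h)
    m = 6.174          # reported steam production, kg

    E_in = G * A * t                                    # collected solar energy, J
    E_steam = m * (4186.0 * (100 - 25) + 2.257e6)       # sensible + latent heat, J
    print(f"implied overall efficiency ~ {E_steam / E_in:.0%}")   # on the order of 13%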

Keywords: heat concentrating, heat loss, medium temperature, solar steam production

Procedia PDF Downloads 154
215 Simulation-Based Validation of Safe Human-Robot-Collaboration

Authors: Titanilla Komenda

Abstract:

Human-machine-collaboration is a direct interaction between humans and machines to fulfil specific tasks. These so-called collaborative machines are used without fencing and interact with humans in predefined workspaces. Even though human-machine-collaboration enables a flexible adaption to variable degrees of freedom, industrial applications are rarely found. The reason is not a lack of technical progress but rather limitations in the planning processes that must ensure safety for operators. Until now, humans and machines have mainly been considered separately in the planning process, focusing on ergonomics and system performance, respectively. Within human-machine-collaboration, these aspects must not be seen in isolation from each other but rather need to be analysed in interaction. Furthermore, a simulation model is needed that can validate the system performance and ensure the safety of the operator at any given time. Following on from this, a holistic simulation model is presented, enabling a simulative representation of collaborative tasks, including both humans and machines. The presented model includes not only a geometry and a motion model of interacting humans and machines but also a numerical behaviour model of humans as well as a Boolean probabilistic sensor model. With this, error scenarios can be simulated, validating system behaviour in unplanned situations. As these models can be defined on the basis of Failure Mode and Effects Analysis as well as error probabilities, their implementation in a collaborative model is discussed and evaluated regarding limitations and simulation times. The functionality of the model is shown on industrial applications by comparing simulation results with video data. The analysis shows the impact of considering human factors in the planning process, in contrast to meeting system performance only. In this sense, an optimisation function is presented that meets the trade-off between human and machine factors and aids in a successful and safe realisation of collaborative scenarios.

Keywords: human-machine-system, human-robot-collaboration, safety, simulation

Procedia PDF Downloads 334
214 New Variational Approach for Contrast Enhancement of Color Image

Authors: Wanhyun Cho, Seongchae Seo, Soonja Kang

Abstract:

In this work, we propose a variational technique for image contrast enhancement that utilizes global and local information around each pixel. The energy functional is defined as a weighted linear combination of three terms: a local contrast term, a global contrast term, and a dispersion term. The local contrast term improves the contrast of the input image by increasing the grey-level differences between each pixel and its neighbors, utilizing the contextual information around each pixel. The global contrast term enhances the contrast of the image by minimizing the difference between its empirical distribution function and a target cumulative distribution function, making the probability distribution of pixel values symmetric about the median. The dispersion term controls the departure of the new pixel values from those of the original image, preserving the original image characteristics as far as possible. Second, we derive the Euler-Lagrange equation for the true image that achieves the minimum of the proposed functional, using the fundamental lemma of the calculus of variations, and we solve this equation with a gradient descent method, one of the dynamic approximation techniques. Finally, various experiments demonstrate that the proposed method can enhance the contrast of color images better than existing techniques.
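
To isolate the intuition behind the global contrast term, a sketch that remaps grey levels so the empirical CDF matches a symmetric target; the full method minimises all three terms jointly, so this one-term version is only illustrative:

    import numpy as np

    def match_to_symmetric(img, target_cdf_inv):
        # Replace each pixel by target_cdf_inv(empirical CDF value of that pixel).
        flat = img.ravel()
        ranks = flat.argsort().argsort() / (flat.size - 1)   # empirical CDF in [0, 1]
        return target_cdf_inv(ranks).reshape(img.shape)

    img = np.random.default_rng(6).beta(5, 2, (64, 64))      # skewed input image
    out = match_to_symmetric(img, lambda u: u)               # uniform target, symmetric about 0.5
    print(abs(np.median(out) - 0.5) < 0.05)                  # median moved to the center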

Keywords: color image, contrast enhancement technique, variational approach, Euler-Lagrange equation, dynamic approximation method, EME measure

Procedia PDF Downloads 419
213 Decision Making in Medicine and Treatment Strategies

Authors: Kamran Yazdanbakhsh, Somayeh Mahmoudi

Abstract:

Three reasons argue for making good use of decision theory in medicine: 1. The growth and complexity of medical knowledge make it difficult to use treatment information effectively without resorting to sophisticated analytical methods, especially when it comes to detecting errors and identifying opportunities for treatment in large databases. 2. There is wide geographic variability in medical practice. In a context where medical costs are borne, at least in part, by the patient, these variations raise doubts about the relevance of the choices made by physicians. The differences are generally attributed to differing estimates of the probabilities of success of the treatments involved and differing assessments of the consequences of success or failure; without explicit decision criteria, it is difficult to identify precisely the sources of these variations in treatment. 3. Beyond the principle of informed consent, patients need to be involved in decision-making, and for this, the decision process should be explained and broken down. A decision problem is to select the best option among a set of choices. The problem is what is meant by "best option", that is, what criteria guide the choice; the purpose of decision theory is to answer this question. The systematic use of decision models allows us to better understand differences in medical practice and facilitates the search for consensus. Here, three types of situations arise: certain situations, risky situations, and uncertain situations. 1. In certain situations, the consequence of each decision is certain. 2. In risky situations, every decision can have several consequences, and the probability of each consequence is known. 3. In uncertain situations, each decision can have several consequences, and their probabilities are not known. Our aim in this article is to show how decision theory can usefully be mobilized to meet the needs of physicians. Decision theory can make decisions more transparent: first, by systematically clarifying the data considered in the problem, and second, by asking which basic principles should guide the choice. Once the problem is clarified, decision theory provides operational tools to represent the available information and determine patient preferences, and thus to assist the patient and doctor in their choices.
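
A tiny worked example of the "risky situation" case, choosing the treatment with the highest expected utility (the probabilities and utilities are invented for illustration):

    # Each treatment maps to a list of (probability, utility) outcome pairs.
    treatments = {
        "surgery":    [(0.90, 0.95), (0.10, 0.20)],
        "medication": [(0.70, 0.85), (0.30, 0.60)],
    }
    for name, outcomes in treatments.items():
        eu = sum(p * u for p, u in outcomes)   # expected utility
        print(name, round(eu, 3))              # surgery 0.875, medication 0.775
    best = max(treatments, key=lambda t: sum(p * u for p, u in treatments[t]))
    print("choose:", best)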

Keywords: decision making, medicine, treatment strategies, patient

Procedia PDF Downloads 555
212 Adsorption Mechanism of Heavy Metals and Organic Pesticide on Industrial Construction and Demolition Waste and Its Runoff Behaviors

Authors: Sheng Huang, Xin Zhao, Xiaofeng Gao, Tao Zhou, Shijin Dai, Youcai Zhao

Abstract:

The adsorption of heavy metal pollutants (Zn, Cd, Pb, Cr, Cu) and organic pesticides (phorate, dithiophosphate diethyl, triethyl phosphorothioate), along with their combined contamination, on the surface of industrial construction and demolition waste (ICDW) was investigated. Brick powder was selected as the representative ICDW; its maximum equilibrium adsorption of the heavy metals under a single controlled contamination matrix reached 5.41, 0.81, 0.45, 1.13 and 0.97 mg/g, respectively. The effects of pH and the spiking dose of ICDW were also investigated. The equilibrium adsorption of the organic pesticides varied from 0.02 to 0.97 mg/g and was negatively correlated with the size distribution and hydrophilicity. The presence of organic pesticides on the ICDW surface affected heavy metal adsorption in various ways, mainly through the binding of metal ions and floccule formation, along with wrapping of the surface by pesticide pollutants. Zn adsorption dropped sharply from 7.1 mg/g on clean ICDW to 0.15 mg/g on phorate-contaminated ICDW, while the adsorption of Pb, Cr and Cd showed an increase-then-decrease pattern. In addition, the runoff of pesticide contaminants was investigated under 25 mm/h simulated rainfall. The results showed that the cumulative runoff amount fitted a power function well, with r2 = 0.95 and 0.91 for 1DAA (1 day between contamination and runoff) and 7DAA, respectively. This study helps evaluate the release of industrial construction and demolition waste contamination into aquatic systems.
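
The power-function fit reported above (Q = a*t^b for cumulative runoff) can be reproduced as follows; the data pairs are synthetic stand-ins for the measured runoff series:

    import numpy as np
    from scipy.optimize import curve_fit

    t = np.array([5, 10, 15, 20, 30, 45, 60.0])        # runoff time (min)
    Q = np.array([0.8, 1.4, 1.9, 2.2, 2.9, 3.6, 4.1])  # cumulative runoff (mg)

    def power(t, a, b):
        return a * t**b

    (a, b), _ = curve_fit(power, t, Q, p0=(0.5, 0.5))
    resid = Q - power(t, a, b)
    r2 = 1 - (resid**2).sum() / ((Q - Q.mean())**2).sum()
    print(f"Q = {a:.2f} * t^{b:.2f}, r2 = {r2:.2f}")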

Keywords: adsorption mechanism, industrial construction waste, metals, pesticide, runoff

Procedia PDF Downloads 422
211 Formulation Development and Evaluation Chlorpheniramine Maleate Containing Nanoparticles Loaded Thermo Sensitive in situ Gel for Treatment of Allergic Rhinitis

Authors: Vipin Saini, Manish Kumar, Shailendra Bhatt, A. Pandurangan

Abstract:

The aim of the present study was to fabricate a thermosensitive gel containing Chlorpheniramine maleate (CPM)-loaded nanoparticles for intranasal administration for the effective treatment of allergic rhinitis. Chitosan-based nanoparticles were prepared by the precipitation method, and the developed NPs were then incorporated into a Poloxamer 407- and Carbopol 934P-based mucoadhesive thermo-reversible gel. The developed formulations were evaluated for particle size, PDI, % entrapment efficiency and % cumulative drug permeation. Formulation NP3 was selected as the optimized formulation on the basis of its minimum particle size (143.9 nm), maximum entrapment efficiency (80.10±0.414%) and highest drug permeation (90.92±0.531%). The optimized formulation NP3 was then formulated into a thermo-reversible in situ gel. This intensifies the contact between the nasal mucosa and the drug and facilitates drug absorption, resulting in increased bioavailability. Formulation G4 was selected as the optimized gel on the basis of its gelation ability and mucoadhesive strength. Histology was carried out to examine any damage caused by the optimized G4 formulation; the results revealed no visual signs of tissue damage, indicating safe nasal delivery of the nanoparticulate in situ gel formulation G4. Thus, the intranasal CPM NP-loaded in situ gel was found to be a promising formulation for the treatment of allergic rhinitis.

Keywords: chitosan, nanoparticles, in situ gel, chlorpheniramine maleate, poloxamer 407

Procedia PDF Downloads 148
210 Apollo Clinical Excellence Scorecard (ACE@25): An Initiative to Drive Quality Improvement in Hospitals

Authors: Anupam Sibal

Abstract:

Whatever is measured tends to improve. With a view to objectively measuring and improving clinical quality across the Apollo Group Hospitals, the ACE@25 (Apollo Clinical Excellence@25) initiative was launched in January 2009. ACE@25 is a clinically balanced scorecard incorporating 25 clinical quality parameters involving complication rates, mortality rates, one-year survival rates and average length of stay after major procedures, such as liver and renal transplant, CABG, TKR, THR, TURP, PTCA, endoscopy, large bowel resection and MRM, covering all major specialties. Also included are hospital-acquired infection rates, pain satisfaction and medication errors. Benchmarks were chosen from the world's best hospitals. Outcomes receive weighted scores, color-coded green, orange and red, and the maximum cumulative score is 100. Data are reported monthly by 43 group hospitals online on the Lighthouse platform. Action-taken reports for parameters falling in the red are submitted quarterly and reviewed by the board, and an audit team audits the data at all locations every six months. Scores are linked to the appraisal of the medical head, and there is an "ACE@25" Champion Award for the highest scorer. At the start of the initiative, scores for different parameters varied from green to red. Most hospitals improved their scores over the following four years on parameters that had initially scored red or orange, and the overall score for the group rose from 72 in 2010 to 81 in 2015.

Keywords: benchmarks, clinical quality, lighthouse, platform, scores

Procedia PDF Downloads 265
209 A Novel Approach of NPSO on Flexible Logistic (S-Shaped) Model for Software Reliability Prediction

Authors: Pooja Rani, G. S. Mahapatra, S. K. Pandey

Abstract:

In this paper, we propose a novel approach combining Neural Networks and Particle Swarm Optimization for software reliability prediction. We first explain how to apply a compound function in a neural network so that we can derive a Flexible Logistic (S-shaped) Growth Curve (FLGC) model. This model mathematically represents software failure as a random process and can be used to evaluate software development status during testing. To avoid becoming trapped in local minima, we apply the Particle Swarm Optimization method to train the proposed model on failure test data sets. Driving the model with computation-based intelligence modeling in this way yields the Neuro-Particle Swarm Optimization (NPSO) model. We test the model with different inertia weights for updating particle positions and velocities, obtain results for the best inertia weight, and compare them with a personal-best-oriented PSO (pPSO), which helps choose the local best within the network neighborhood. The applicability of the proposed model is demonstrated on a real failure test data set. The results obtained from the experiments show that the proposed model has a fairly accurate prediction capability for software reliability.
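
A sketch of the PSO half of the approach: fitting an S-shaped cumulative-failure curve m(t) = a/(1 + k*exp(-b*t)) (one common flexible-logistic form, assumed here; the paper's FLGC may differ) by minimising squared error, with a linearly decreasing inertia weight:

    import numpy as np

    rng = np.random.default_rng(7)
    t = np.arange(1.0, 21.0)
    data = 100 / (1 + 30 * np.exp(-0.4 * t)) + rng.normal(0, 1, t.size)  # cumulative failures

    def sse(p):                               # p = (a, k, b): squared fitting error
        a, k, b = p
        return ((a / (1 + k * np.exp(-b * t)) - data) ** 2).sum()

    n_particles, dim, iters = 30, 3, 200
    lo = np.array([50.0, 1.0, 0.01])
    hi = np.array([200.0, 100.0, 1.0])
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest, pbest_f = x.copy(), np.array([sse(p) for p in x])
    for it in range(iters):
        g = pbest[pbest_f.argmin()]           # global best particle
        w = 0.9 - 0.5 * it / iters            # linearly decreasing inertia weight
        r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
        v = w * v + 2.0 * r1 * (pbest - x) + 2.0 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([sse(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
    print(pbest[pbest_f.argmin()].round(2))   # recovered (a, k, b)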

Keywords: software reliability, flexible logistic growth curve model, software cumulative failure prediction, neural network, particle swarm optimization

Procedia PDF Downloads 320
208 Fragility Analysis of a Soft First-Story Building in Mexico City

Authors: Rene Jimenez, Sonia E. Ruiz, Miguel A. Orellana

Abstract:

On 09/19/2017, a Mw = 7.1 intraslab earthquake occurred in Mexico, causing the collapse of about 40 buildings. Many of these were 5- or 6-story buildings with a soft first story, so it is desirable to perform a structural fragility analysis of typical structures representative of those buildings and to propose a reliable structural solution. Here, a typical 5-story building constituted by regular R/C moment-resisting frames in the first story and confined masonry walls in the upper levels, similar to the structures that collapsed in the 09/19/2017 Mexico earthquake, is analyzed. Three structural solutions of the 5-story building are considered: S1) the building designed in accordance with the Mexico City Building Code 2004; S2) the same building with the column dimensions of the first story reduced; and S3) solution S2 with viscous dampers added at the first story. A number of incremental dynamic analyses are performed for each structural solution, using a 3D structural model. The hysteretic behavior model of the masonry was calibrated with experiments performed at the Laboratory of Structures at UNAM. Ten seismic ground motions are used to excite the structures; they were recorded on intermediate soil of Mexico City, where the structures are located, with a dominant period of around 1 s. The fragility curves of the buildings are obtained for different values of the maximum inter-story drift demand. The results show that solutions S1 and S3 give similar probabilities of exceedance of a given inter-story drift value at the same seismic intensity, while solution S2 presents a higher probability of exceedance for the same seismic intensity and inter-story drift demand. Therefore, it is concluded that solution S3 (the building with a soft first story and energy dissipation devices) can be a reliable solution from the structural point of view.
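
A sketch of how incremental-dynamic-analysis runs are commonly turned into a fragility curve: fit a lognormal CDF to the fraction of ground-motion records exceeding the drift limit at each intensity level (the counts below are illustrative, not the study's):

    import numpy as np
    from scipy import stats
    from scipy.optimize import curve_fit

    im = np.array([0.1, 0.2, 0.3, 0.4, 0.6, 0.8])    # intensity measure, e.g. Sa (g)
    exceed = np.array([0, 1, 3, 5, 8, 10]) / 10.0    # fraction of 10 records exceeding

    def lognorm_cdf(x, theta, beta):
        # theta: median capacity; beta: lognormal dispersion
        return stats.norm.cdf(np.log(x / theta) / beta)

    (theta, beta), _ = curve_fit(lognorm_cdf, im, exceed, p0=(0.4, 0.5))
    print(f"median capacity = {theta:.2f} g, dispersion = {beta:.2f}")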

Keywords: demand hazard analysis, fragility curves, incremental dynamic analyses, soft first story, structural capacity

Procedia PDF Downloads 145
207 Hybrid Wind Solar Gas Reliability Optimization Using Harmony Search under Performance and Budget Constraints

Authors: Meziane Rachid, Boufala Seddik, Hamzi Amar, Amara Mohamed

Abstract:

Today’s energy industry seeks maximum benefit with maximum reliability, and to achieve this goal, design engineers depend on reliability optimization techniques. This work uses the harmony search (HS) meta-heuristic optimization method to solve the design optimization problem of wind-solar-gas power systems. We consider the case where redundant electrical components are chosen to achieve a desirable level of reliability. The electrical power components of the system are characterized by their cost, capacity and reliability. Reliability is considered in this work as the ability to satisfy the consumer demand, represented as a piecewise cumulative load curve; this definition of the reliability index is widely used for power systems. The proposed meta-heuristic seeks the optimal design of series-parallel power systems in which a multiple choice of wind generators, transformers and lines is allowed from a list of products available in the market. Our approach has the advantage of allowing electrical power components with different parameters to be allocated in electrical power systems. To allow fast reliability estimation, a universal moment generating function (UMGF) method is applied. A computer program has been developed to implement the UMGF method and the HS algorithm, and an illustrative example is presented.
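
A compact harmony search loop for a toy redundancy-allocation problem in the same spirit: choose component counts per subsystem to maximise series-parallel reliability under a budget (HMCR, PAR, costs and reliabilities are illustrative, not the paper's data):

    import numpy as np

    rng = np.random.default_rng(8)
    r = np.array([0.90, 0.85, 0.80])     # component reliability per subsystem
    c = np.array([3.0, 2.0, 1.5])        # component cost
    budget, hms, hmcr, par = 20.0, 10, 0.9, 0.3

    def fitness(x):                      # series system of parallel subsystems
        if (c * x).sum() > budget:
            return 0.0                   # infeasible designs score zero
        return float(np.prod(1 - (1 - r) ** x))

    hm = rng.integers(1, 5, (hms, 3))    # harmony memory of candidate designs
    fit = np.array([fitness(x) for x in hm])
    for _ in range(2000):
        # memory consideration (prob. hmcr) or random re-draw, per decision variable
        new = np.array([hm[rng.integers(hms), j] if rng.random() < hmcr
                        else rng.integers(1, 5) for j in range(3)])
        if rng.random() < par:           # pitch adjustment: nudge one count
            j = rng.integers(3)
            new[j] = max(1, new[j] + rng.choice([-1, 1]))
        f_new = fitness(new)
        worst = fit.argmin()
        if f_new > fit[worst]:           # replace the worst harmony
            hm[worst], fit[worst] = new, f_new
    print(hm[fit.argmax()], round(float(fit.max()), 4))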

Keywords: reliability optimization, harmony search algorithm (HSA), universal moment generating function (UMGF)

Procedia PDF Downloads 549