Search results for: mitigation techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7554

5724 Statistical Comparison of Ensemble Based Storm Surge Forecasting Models

Authors: Amin Salighehdar, Ziwen Ye, Mingzhe Liu, Ionut Florescu, Alan F. Blumberg

Abstract:

Storm surge is an abnormal rise in water level caused by a storm. Accurately predicting a storm surge is a challenging problem. Researchers have developed various ensemble modeling techniques that combine several individual forecasts to produce an overall, presumably better, forecast. Some simple ensemble modeling techniques exist in the literature; for instance, Model Output Statistics (MOS) and running mean-bias removal are widely used in the storm surge prediction domain. However, these methods have drawbacks: MOS, for example, is based on multiple linear regression and needs a long period of training data. To overcome the shortcomings of these simple methods, researchers have proposed more advanced ones. For instance, ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting that produces an improved forecast by combining several instances of Bayesian Model Averaging (BMA). An ensemble dressing method, in turn, is based on identifying the best member forecast and using it for prediction. Our contribution in this paper can be summarized as follows. First, we investigate whether ensemble models perform better than any single forecast; this requires identifying the single best forecast, and we present a methodology based on a simple Bayesian selection method to select it. Second, we present several new and simple ways to construct ensemble models, using correlation and standard deviation as weights when combining different forecast models. Third, we use these ensembles to forecast storm surge level and compare them with several existing models in the literature. We then investigate whether developing a complex ensemble model is indeed needed; to this end, we use the simple average (one of the simplest and most widely used ensemble models) as a benchmark.
Predicting the peak surge level during a storm, as well as the precise time at which this peak occurs, is crucial; we therefore develop a statistical platform to compare the performance of the various ensemble methods. This statistical analysis is based on the root mean square error of each ensemble forecast during the testing period and on the magnitude and timing of the forecasted peak surge compared with the actual peak and its time. In this work, we analyze four hurricanes: Hurricanes Irene and Lee in 2011, Hurricane Sandy in 2012, and Hurricane Joaquin in 2015. Since Hurricane Irene developed at the end of August 2011 and Hurricane Lee started just after Irene at the beginning of September 2011, we consider them a single contiguous hurricane event in this study. The data set used for this study was generated by the New York Harbor Observing and Prediction System (NYHOPS). We find that even the simplest possible way of creating an ensemble produces results superior to any single forecast. We also show that the ensemble models we propose generally perform better than the simple average ensemble technique.
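The weighting and comparison scheme described above can be sketched with made-up numbers; the observations, forecasts, and training window below are illustrative assumptions, not NYHOPS data or the authors' exact method:

```python
import math

# Hypothetical surge series (metres); illustrative values, not NYHOPS data.
obs = [0.5, 0.9, 1.6, 2.1, 1.4, 0.8]
forecasts = {
    "model_a": [0.4, 1.0, 1.5, 1.9, 1.5, 0.9],
    "model_b": [0.6, 0.8, 1.7, 2.4, 1.2, 0.7],
    "model_c": [0.3, 0.7, 1.2, 1.6, 1.1, 0.6],
}

def rmse(pred, actual):
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(pred, actual)) / len(actual))

def corr(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Correlation-weighted ensemble: each member is weighted by its correlation
# with the observations, and the weights are normalized to sum to one.
weights = {k: corr(v, obs) for k, v in forecasts.items()}
total = sum(weights.values())
ensemble = [sum(weights[k] * forecasts[k][t] for k in forecasts) / total
            for t in range(len(obs))]

# Benchmark: the simple average of all members.
simple_avg = [sum(f[t] for f in forecasts.values()) / len(forecasts)
              for t in range(len(obs))]

# Comparison metrics from the platform: RMSE plus peak magnitude/timing errors.
peak_err = max(ensemble) - max(obs)
timing_err = ensemble.index(max(ensemble)) - obs.index(max(obs))
print(rmse(ensemble, obs), rmse(simple_avg, obs), peak_err, timing_err)
```

In practice the weights would be estimated on a training window separate from the evaluation period, and a standard-deviation-based scheme would use, for example, inverse error standard deviations in place of the correlations.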

Keywords: Bayesian learning, ensemble model, statistical analysis, storm surge prediction

Procedia PDF Downloads 311
5723 Silymarin Reverses Scopolamine-Induced Memory Deficit in Object Recognition Test in Rats: A Behavioral, Biochemical, Histopathological and Immunohistochemical Study

Authors: Salma A. El-Marasy, Reham M. Abd-Elsalam, Omar A. Ahmed-Farid

Abstract:

Dementia is characterized by impairments in memory and other cognitive abilities. This study aims to elucidate the possible ameliorative effect of silymarin on scopolamine-induced dementia using the object recognition test (ORT). The study was extended to demonstrate the roles of cholinergic activity, oxidative stress, neuroinflammation, brain neurotransmitters and histopathological changes in the anti-amnestic effect of silymarin in demented rats. Wistar rats were pretreated with silymarin (200, 400, 800 mg/kg) or donepezil (10 mg/kg) orally for 14 consecutive days. Dementia was induced after the last drug administration by a single intraperitoneal dose of scopolamine (16 mg/kg). Behavioral, biochemical, histopathological, and immunohistochemical analyses were then performed. Rats pretreated with silymarin counteracted scopolamine-induced non-spatial working memory impairment in the ORT; silymarin also decreased acetylcholinesterase (AChE) activity, reduced malondialdehyde (MDA), elevated reduced glutathione (GSH), and restored gamma-aminobutyric acid (GABA) and dopamine (DA) contents in cortical and hippocampal brain homogenates. Silymarin dose-dependently reversed scopolamine-induced histopathological changes. Immunohistochemical analysis showed that silymarin dose-dependently mitigated protein expression of glial fibrillary acidic protein (GFAP) and nuclear factor kappa-B (NF-κB) in the brain cortex and hippocampus. All these effects of silymarin were similar to those of the standard anti-amnestic drug, donepezil. This study reveals that the ameliorative effect of silymarin on scopolamine-induced dementia in rats, as assessed by the ORT, may be mediated in part by enhancement of cholinergic activity, antioxidant and anti-inflammatory activities, as well as mitigation of changes in brain neurotransmitters and histopathology.

Keywords: dementia, donepezil, object recognition test, rats, silymarin, scopolamine

Procedia PDF Downloads 140
5722 Safeguarding the Construction Industry: Interrogating and Mitigating Emerging Risks from AI in Construction

Authors: Abdelrhman Elagez, Rolla Monib

Abstract:

This empirical study investigates the observed risks associated with adopting Artificial Intelligence (AI) technologies in the construction industry and proposes potential mitigation strategies. While AI has transformed several industries, the construction industry is slowly adopting advanced technologies like AI, introducing new risks that lack critical analysis in the current literature. A comprehensive literature review identified a research gap, highlighting the lack of critical analysis of risks and the need for a framework to measure and mitigate the risks of AI implementation in the construction industry. Consequently, an online survey was conducted with 24 project managers and construction professionals, possessing experience ranging from 1 to 30 years (with an average of 6.38 years), to gather industry perspectives and concerns relating to AI integration. The survey results yielded several significant findings. Firstly, respondents exhibited a moderate level of familiarity (66.67%) with AI technologies, while the industry's readiness for AI deployment and current usage rates remained low at 2.72 out of 5. Secondly, the top-ranked barriers to AI adoption were identified as lack of awareness, insufficient knowledge and skills, data quality concerns, high implementation costs, absence of prior case studies, and the uncertainty of outcomes. Thirdly, the most significant risks associated with AI use in construction were perceived to be a lack of human control (decision-making), accountability, algorithm bias, data security/privacy, and lack of legislation and regulations. Additionally, the participants acknowledged the value of factors such as education, training, organizational support, and communication in facilitating AI integration within the industry. 
These findings emphasize the necessity for tailored risk assessment frameworks, guidelines, and governance principles to address the identified risks and promote the responsible adoption of AI technologies in the construction sector.

Keywords: risk management, construction, artificial intelligence, technology

Procedia PDF Downloads 113
5721 Low-Impact Development Strategies Assessment for Urban Design

Authors: Y. S. Lin, H. L. Lin

Abstract:

Climate change and land-use change caused by urban expansion increase the frequency of urban flooding. To mitigate the increase in runoff volume, low-impact development (LID) is a green approach that reduces the area of impervious surface and manages stormwater at the source with decentralized, micro-scale control measures. However, current benefit assessment and practical application of LID in Taiwan still tend toward development plans at the community and building-site scales. In urban design, site-based moisture-holding capacity has been a common index for evaluating the effectiveness of LID, but it ignores the diversity and complexity of urban built environments, such as different densities, positive and negative spaces, and building volumes. Such inflexible regulations are not only difficult for most developed areas to implement but are also unsuited to the full range of built environment types, bringing little benefit to some of them. Looking to strengthen the link between LID and urban design so as to reduce runoff and cope with urban flooding, this research considers the characteristics of different types of built environments when developing LID strategies. The built environments are classified by cluster analysis based on density measures such as Ground Space Index (GSI), Floor Space Index (FSI), number of floors (L), and Open Space Ratio (OSR), and their impervious surface rates and runoff volumes are analyzed. Flood situations are simulated using a quasi-two-dimensional flood plain flow model, and the flood mitigation effectiveness of different types of built environments under different low-impact development strategies is evaluated. The information from this assessment can be implemented more precisely in urban design. In addition, it helps to enact regulations on low-impact development strategies in urban design that are more suitable for each type of built environment.
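The density measures used for the clustering are linked by simple identities (in the Spacematrix tradition, FSI = GSI x L and OSR = (1 - GSI)/FSI); a minimal sketch with assumed figures for one hypothetical block:

```python
# Illustrative density measures for a hypothetical urban block; the areas
# below are assumed values, not data from the study.
site_area = 10_000.0          # m^2 of the block
footprint = 4_000.0           # m^2 of built ground coverage
gross_floor_area = 20_000.0   # m^2 summed over all floors

GSI = footprint / site_area          # Ground Space Index (coverage)
FSI = gross_floor_area / site_area   # Floor Space Index (intensity)
L = FSI / GSI                        # average number of floors
OSR = (1 - GSI) / FSI                # Open Space Ratio (open space per floor area)

# A crude lower bound on the impervious surface rate: the building footprint
# (pavement and other sealed surfaces would be added on top of this).
impervious_rate_lower_bound = GSI
```

Blocks with similar (GSI, FSI, L, OSR) vectors would then fall into the same cluster, so an LID strategy can be matched to a built-environment type rather than applied uniformly.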

Keywords: low-impact development, urban design, flooding, density measures

Procedia PDF Downloads 338
5720 Hyperspectral Imaging and Nonlinear Fukunaga-Koontz Transform Based Food Inspection

Authors: Hamidullah Binol, Abdullah Bal

Abstract:

Nowadays, food safety is a great public concern; therefore, robust and effective techniques are required for assessing the safety of goods. Hyperspectral Imaging (HSI) is an attractive tool for researchers inspecting food quality and safety, with applications such as meat quality assessment, automated poultry carcass inspection, quality evaluation of fish, bruise detection in apples, quality analysis and grading of citrus fruits, bruise detection in strawberries, visualization of sugar distribution in melons, measuring ripening of tomatoes, defect detection in pickling cucumbers, and classification of wheat kernels. HSI can concurrently collect large amounts of spatial and spectral data on the objects being observed, yielding detection capabilities that cannot be achieved with either imaging or spectroscopy alone. This paper presents a nonlinear technique based on the kernel Fukunaga-Koontz transform (KFKT) for detection of fat content in ground meat using HSI. The KFKT, the nonlinear version of the FKT, is one of the most effective techniques for solving problems of a two-pattern nature. The conventional FKT method has been improved with kernel machines to increase nonlinear discrimination ability and to capture higher-order statistics of the data. The approach proposed in this paper aims to segment the fat content of ground meat by treating the fat as the target class to be separated from the remaining classes (as clutter). We applied the KFKT to visible and near-infrared (VNIR) hyperspectral images of ground meat to determine fat percentage. The experimental studies indicate that the proposed technique produces high detection performance for fat ratio in ground meat.
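The linear FKT underlying the kernel version can be sketched on synthetic data; the arrays below are random stand-ins for spectral samples of the target (fat) and clutter classes, not real VNIR measurements, and the kernel extension would replace the correlation matrices with kernel (Gram) matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "spectra": rows = pixels, columns = 8 spectral bands. The target
# class is given extra energy in the first band to make the classes differ.
target = rng.normal(0.0, 1.0, (200, 8)) * np.array([2, 1, 1, 1, 1, 1, 1, 1.0])
clutter = rng.normal(0.0, 1.0, (200, 8))

# Fukunaga-Koontz transform: whiten the summed correlation matrix, then
# eigendecompose the whitened target correlation. The eigenvalues pair up,
# lambda_target + lambda_clutter = 1, so directions that best represent the
# target are worst for the clutter (the "two-pattern" property).
R1 = target.T @ target / len(target)
R2 = clutter.T @ clutter / len(clutter)
vals, vecs = np.linalg.eigh(R1 + R2)
P = vecs @ np.diag(vals ** -0.5)          # whitening operator for R1 + R2
lam, phi = np.linalg.eigh(P.T @ R1 @ P)   # eigenvalues lie in [0, 1]
```

Detection then projects a test pixel onto the eigenvectors whose eigenvalues are closest to 1 (target-dominant) and compares the energy captured there against the clutter-dominant directions.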

Keywords: food (ground meat) inspection, Fukunaga-Koontz transform, hyperspectral imaging, kernel methods

Procedia PDF Downloads 433
5719 Elucidating Microstructural Evolution Mechanisms in Tungsten via Layerwise Rolling in Additive Manufacturing: An Integrated Simulation and Experimental Approach

Authors: Sadman Durlov, Aditya Ganesh-Ram, Hamidreza Hekmatjou, Md Najmus Salehin, Nora Shayesteh Ameri

Abstract:

In the field of additive manufacturing, tungsten stands out for its exceptional resistance to high temperatures, making it an ideal candidate for use in extreme conditions. However, its inherent brittleness and vulnerability to thermal cracking pose significant challenges to its manufacturability. This study explores the microstructural evolution of tungsten processed through layer-wise rolling in laser powder bed fusion additive manufacturing, utilizing a comprehensive approach that combines advanced simulation techniques with empirical research. We aim to uncover the complex processes of plastic deformation and microstructural transformations, with a particular focus on the dynamics of grain size, boundary evolution, and phase distribution. Our methodology employs a combination of simulation and experimental data, allowing for a detailed comparison that elucidates the key mechanisms influencing microstructural alterations during the rolling process. This approach facilitates a deeper understanding of the material's behavior under additive manufacturing conditions, specifically in terms of deformation and recrystallization. The insights derived from this research not only deepen our theoretical knowledge but also provide actionable strategies for refining manufacturing parameters to improve the tungsten components' mechanical properties and functional performance. By integrating simulation with practical experimentation, this study significantly enhances the field of materials science, offering a robust framework for the development of durable materials suited for challenging operational environments. Our findings pave the way for optimizing additive manufacturing techniques and expanding the use of tungsten across various demanding sectors.

Keywords: additive manufacturing, layer wise rolling, refractory materials, in-situ microstructure modifications

Procedia PDF Downloads 64
5718 Exploring Community Benefits Frameworks as a Tool for Addressing Intersections of Equity and the Green Economy in Toronto's Urban Development

Authors: Cheryl Teelucksingh

Abstract:

Toronto is in the midst of an urban development and infrastructure boom. Population growth and concerns about urban sprawl and carbon emissions have put pressure on the municipal and provincial governments to rethink urban development. Toronto's approach to climate change mitigation and adaptation has positioned the emerging green economy as part of the solution. However, the emerging green economy may not benefit all Torontonians in terms of jobs, improved infrastructure, and enhanced quality of life. Community benefits agreements (CBAs) are comprehensive, negotiated commitments in which the funders and builders of major infrastructure projects formally agree to work with community interest groups based where the development is taking place, toward mutually beneficial environmental and labor market outcomes. When community groups are equitably represented in the process, they stand to benefit not only from the jobs created by the project itself but also from longer-term community benefits related to the quality of the completed work, including advocacy for communities' environmental needs. It is believed that green employment initiatives in Toronto should give greater consideration to best practices learned from community benefits agreements. Drawing on the findings of a funded qualitative study in Toronto (Canada), "The Green Gap: Toward Inclusivity in Toronto's Green Economy" (2013-2016), this paper examines the emergent CBA in Toronto associated with the development of a light rail transit project. Theoretical and empirical consideration is given to research gaps around CBAs and the role of various stakeholders, and the potential for CBAs to gain traction in Toronto's urban development context is discussed. The narratives of stakeholders across Toronto's green economy are interwoven with a discussion of the CBA model in Toronto and other jurisdictions.

Keywords: green economy in Toronto, equity, community benefits agreements, environmental justice, community sustainability

Procedia PDF Downloads 345
5717 Information Visualization Methods Applied to Nanostructured Biosensors

Authors: Osvaldo N. Oliveira Jr.

Abstract:

The control of molecular architecture inherent in some experimental methods for producing nanostructured films has had great impact on devices of various types, including sensors and biosensors. The self-assembled monolayer (SAM) and electrostatic layer-by-layer (LbL) techniques, for example, are now routinely used to produce tailored architectures for biosensing in which biomolecules are immobilized with long-lasting preserved activity. Enzymes, antigens, antibodies, peptides and many other molecules serve as molecular recognition elements for detecting an equally wide variety of analytes. The principles of detection are also varied, including electrochemical methods, fluorescence spectroscopy and impedance spectroscopy. This presentation provides an overview of biosensors made with nanostructured films to detect antibodies associated with tropical diseases and HIV, in addition to the detection of analytes of medical interest such as cholesterol and triglycerides. Because large amounts of data are generated in biosensing experiments, computational and statistical methods have been used to optimize performance. Multidimensional projection techniques such as Sammon's mapping have been shown to be more efficient than traditional multivariate statistical analysis in identifying small concentrations of anti-HIV antibodies and in distinguishing between blood serum samples of animals infected with two tropical diseases, namely Chagas disease and leishmaniasis. Optimization of biosensing may include combining another information visualization method, the parallel coordinates technique, with artificial intelligence methods in order to identify the most suitable frequencies for reaching higher sensitivity with impedance spectroscopy. Also discussed is the possible convergence of technologies, through which machine learning and other computational methods may be used to treat data from biosensors within an expert system for clinical diagnosis.
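Sammon's mapping itself is compact enough to sketch; the code below runs plain gradient descent on the Sammon stress for random synthetic vectors standing in for biosensor responses (the data, learning rate, and iteration count are all assumptions, not the presenter's setup):

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy 10-dimensional "sensor response" vectors for 30 samples (assumed data).
X = rng.normal(size=(30, 10))

def pdist(Y):
    """All pairwise Euclidean distances as an (n, n) matrix."""
    diff = Y[:, None, :] - Y[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

D = pdist(X)                                   # input-space distances
c = D[np.triu_indices(len(X), 1)].sum()        # Sammon normalization constant

def stress(Y):
    d = pdist(Y)
    iu = np.triu_indices(len(X), 1)
    return ((D[iu] - d[iu]) ** 2 / np.maximum(D[iu], 1e-12)).sum() / c

# Gradient descent on the Sammon stress toward a 2-D layout.
Y = rng.normal(scale=1e-2, size=(30, 2))
lr = 0.1
s0 = stress(Y)
for _ in range(200):
    d = np.maximum(pdist(Y), 1e-12)
    np.fill_diagonal(d, 1.0)                   # avoid division by zero on diagonal
    W = (D - d) / (d * np.maximum(D, 1e-12))
    np.fill_diagonal(W, 0.0)
    grad = -2.0 / c * (W[:, :, None] * (Y[:, None, :] - Y[None, :, :])).sum(1)
    Y -= lr * grad
```

Lower stress means the inter-point distances in the 2-D plot better preserve the original high-dimensional distances, which is what makes small concentration differences visually separable.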

Keywords: clinical diagnosis, information visualization, nanostructured films, layer-by-layer technique

Procedia PDF Downloads 342
5716 Practices of Waterwise Circular Economy in Water Protection: A Case Study on Pyhäjärvi, SW Finland

Authors: Jari Koskiaho, Teija Kirkkala, Jani Salminen, Sarianne Tikkanen, Sirkka Tattari

Abstract:

Here, phosphorus (P) loading to Lake Pyhäjärvi (SW Finland) was reviewed, load reduction targets were determined, and different waterwise circular economy measures for reaching the targets were evaluated. In addition to the P loading from the lake's catchment, a significant amount of internal P loading occurs in the lake. There are no point-source emissions into the lake; thus, the most important source of external nutrient loading is agriculture. According to simulations made with the LLR model, the chemical state of the lake is at the border between the classes 'Satisfactory' and 'Good'. The LLR simulations suggest that a reduction of some hundreds of kilograms in annual P loading would be needed to reach an unquestionably 'Good' state. Evaluation of the waterwise circular economy measures suggested that they possess great potential for reaching the target P load reduction: if applied extensively and in a versatile, targeted manner across the catchment, their combined effect would reach the target. In terms of cost-effectiveness, the waterwise measures were ranked as follows: best, fishing; second, recycling of vegetation from reed beds, wetlands and buffer zones; third, recycling field drainage waters stored in wetlands and ponds for irrigation; fourth, controlled drainage and irrigation; and fifth, recycling of the sediments of wetlands and ponds for soil enrichment. We also identified various waterwise nutrient recycling measures to decrease the P content of arable land; the cost-effectiveness of such measures may be very good. Solutions are needed for Finnish water protection in general, and particularly for regions like the Lake Pyhäjärvi catchment with intensive domestic animal production, where 'P hotspots' are a crucial issue.

Keywords: circular economy, lake protection, mitigation measures, phosphorus

Procedia PDF Downloads 110
5715 Earth Observations and Hydrodynamic Modeling to Monitor and Simulate the Oil Pollution in the Gulf of Suez, Red Sea, Egypt

Authors: Islam Abou El-Magd, Elham Ali, Moahmed Zakzouk, Nesreen Khairy, Naglaa Zanaty

Abstract:

The marine environment and coastal zone are rich in natural resources that contribute to the local economy of Egypt. The Gulf of Suez and Red Sea area accommodates diverse human activities that contribute to the local economy, including oil exploration and production, tourism, and export and import harbors; however, it is always under threat of pollution from human activities. This research aimed at integrating in-situ measurements and remotely sensed data with a hydrodynamic model to map and simulate oil pollution. High-resolution satellite sensors, including Sentinel-2 and Planet Labs imagery, were used to trace the oil pollution. A spectral band ratio of band 4 (infrared) over band 3 (red) underpinned the mapping of point-source pollution from the oil industrial estates; this ratio is supported by the absorption windows detected in the hyperspectral profiles. An ASD in-situ hyperspectral device was used to measure the oil pollution in the marine environment experimentally. The experiment measured water behavior in three cases: a) clear water without oil, b) water covered with crude oil, and c) water some time after the crude oil was released. The spectral curves clearly identify absorption windows for oil pollution, particularly at 600-700 nm. The MIKE 21 model was applied to simulate the dispersion of the oil contamination and to create scenarios for crisis management. The model requires careful preparation of bathymetry, tide, wave, and atmospheric data, obtained partly from online modeled data and partly from historical in-situ stations. The simulation made it possible to project the movement of the oil spill and could form the basis of a warning system for mitigation. Details of the research results are described in the paper.
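The band-ratio step can be illustrated on a synthetic raster; the reflectance values, the simulated slick, and the mean-plus-two-sigma threshold below are demonstration assumptions, not the study's calibration:

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic 100x100 reflectance grids standing in for the infrared (band 4)
# and red (band 3) channels; the values are assumed, not real imagery.
red = rng.uniform(0.05, 0.15, (100, 100))
nir = rng.uniform(0.02, 0.10, (100, 100))
nir[40:60, 40:60] += 0.2   # simulated slick: elevated infrared response

eps = 1e-6
ratio = nir / (red + eps)                    # band 4 / band 3 ratio
threshold = ratio.mean() + 2 * ratio.std()   # simple, assumed threshold rule
oil_mask = ratio > threshold                 # candidate oil-polluted pixels

print(int(oil_mask.sum()), "candidate oil pixels")
```

In a real workflow the thresholded mask would be validated against the in-situ hyperspectral measurements before being passed to the hydrodynamic model as an initial slick extent.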

Keywords: oil pollution, remote sensing, modelling, Red Sea, Egypt

Procedia PDF Downloads 350
5714 Development and Implementation of a Business Technology Program Based on Techniques for Reusing Water in a Colombian Company

Authors: Miguel A. Jimenez Barros, Elyn L. Solano Charris, Luis E. Ramirez, Lauren Castro Bolano, Carlos Torres Barreto, Juliana Morales Cubillo

Abstract:

This project sought to mitigate high levels of water consumption in industrial processes, in accordance with the water-rationing plans promoted at national and international levels in response to the water consumption projections published by the United Nations. Water consumption has three main uses: municipal (common use), agricultural, and industrial, with the latter consuming a minor share (around 20% of total consumption). Aware of worldwide water scarcity, a Colombian company that manufactures mass consumption products decided to implement policies and techniques for water treatment, recycling, and reuse. The project consisted of a business technology program that permits better use of the wastewater generated by production operations. This approach reduces potable water consumption, improves the condition of the water discharged to the sewage system, generates a positive environmental impact for the region, and serves as a reference model at national and international levels. To achieve this objective, a process flow diagram was used to identify the industrial processes that require potable water. This strategy allowed the company to determine a water reuse plan at the operational level without affecting the requirements of the manufacturing process and, moreover, to support the activities carried out in administrative buildings. Afterwards, the company evaluated and selected the chemical and biological processes required for water reuse, in compliance with Colombian law. The implementation of the business technology program optimized water use and raised the recirculation rate to 70%, accomplishing an important reduction of the regional environmental impact.
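The effect of the reported 70% recirculation rate on potable-water intake is simple arithmetic; the daily demand figure below is assumed, since the abstract reports only the rate:

```python
# Assumed process water demand; the abstract gives only the recirculation rate.
daily_demand_m3 = 1000.0
recirculation_rate = 0.70  # reported by the business technology program

reused_m3 = daily_demand_m3 * recirculation_rate
fresh_intake_m3 = daily_demand_m3 - reused_m3  # roughly 300 m^3/day instead of 1000
```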

Keywords: bio-reactor, potable water, reverse osmosis, water treatment

Procedia PDF Downloads 239
5713 Rejuvenation of Aged Kraft-Cellulose Insulating Paper Used in Transformers

Authors: Y. Jeon, A. Bissessur, J. Lin, P. Ndungu

Abstract:

Most transformers use cellulose paper that has been chemically modified through the Kraft process, which acts as an effective insulator. Cellulose ageing and oil degradation are directly linked to fouling of the transformer and accumulation of large quantities of waste insulating paper. In addition to the technical difficulties, this proves costly for power utilities. No cost-effective method for the rejuvenation of cellulose paper has yet been documented or proposed, since replacement of used insulating paper is implemented as the best option. This study proposes and contrasts different methods of rejuvenating accelerated-aged cellulose insulating paper by chemical and bio-bleaching processes. Of the three bleaching methods investigated, two are chemical, conventional chlorine-based sodium hypochlorite (m/v) and chlorine-free hydrogen peroxide (v/v), while the third is a bio-bleaching technique that uses a bacterial isolate, Acinetobacter strain V2. For chemical bleaching, reagent strengths of 0.3%, 0.6%, 0.9%, 1.2%, 1.5% and 1.8% over 4 h were analyzed. Bio-bleaching employed the bacterial isolate Acinetobacter strain V2 to bleach the aged Kraft paper over 4 h. Determinations of alpha cellulose content, degree of polymerization and viscosity were carried out on Kraft-cellulose insulating paper before and after bleaching. Overall, the investigated chemical and bio-bleaching techniques were effective in treating degraded and accelerated-aged Kraft-cellulose insulating paper, although to varying extents. Optimum conditions for chemical bleaching were attained at bleaching strengths of 1.2% (m/v) NaOCl and 1.5% (v/v) H2O2, yielding alpha cellulose contents of 82.4% and 80.7% and degrees of polymerization of 613 and 616, respectively. Bio-bleaching using Acinetobacter strain V2 proved to be the superior technique, with an alpha cellulose level of 89.0% and a degree of polymerization of 620. Chemical bleaching requires careful and controlled clean-up treatments because it is chlorine- or hydrogen peroxide-based, whereas bio-bleaching is an extremely eco-friendly technique.

Keywords: alpha cellulose, bio-bleaching, degree of polymerization, Kraft-cellulose insulating paper, transformer, viscosity

Procedia PDF Downloads 275
5712 Study of Radiation Response in Lactobacillus Species

Authors: Kanika Arora, Madhu Bala

Abstract:

The small intestine epithelium is highly sensitive and a major target of ionizing radiation. Radiation causes gastrointestinal toxicity either directly, by deposition of energy, or indirectly (through inflammation or bystander effects), generating free radicals and reactive oxygen species. The oxidative stress generated by radiation causes active inflammation within the intestinal mucosa, leading to structural and functional impairment of the gut epithelial barrier. As a result, there is a loss of tolerance to normal dietary antigens and commensal flora, together with an exaggerated response to pathogens. Dysbiosis may therefore be thought to play a role in radiation enteropathy and can contribute to radiation-induced bowel toxicity. Lactobacilli residing in the gut share a long conjoined evolutionary history with their hosts and have thereby developed intimate and complex symbiotic relationships. The objective of this study was to look for strains with varying resistance to ionizing radiation and to see whether the niche of the bacteria plays any role in their radiation resistance. In this study, we isolated Lactobacillus spp. from a probiotic preparation and from the murine gastrointestinal tract, both of which are important sources for its isolation. Biochemical characterization did not show significant differences in properties, while a significant difference was observed in the carbohydrate utilization preferences of the isolates. The effect on lactobacilli cells of ionizing radiation from a Co-60 gamma source (10 Gy) was investigated, and a curve of cellular survival versus absorbed dose was determined. Radiation resistance studies showed that the isolates' responses to cobalt-60 gamma radiation differed from each other, and a significant, dose-dependent decrease in survival was observed. The present study thus revealed that radioresistance in Lactobacillus depends upon the source from which the strains have been isolated.
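Dose-dependent survival of this kind is commonly summarized with a single-hit exponential model, S(D) = exp(-D/D0); the dose-survival numbers below are illustrative, not the study's measurements:

```python
import math

# Illustrative dose-survival data for a hypothetical Lactobacillus isolate:
# fraction of cells surviving at each absorbed dose in Gy (assumed values).
doses = [0, 2, 4, 6, 8, 10]
survival = [1.0, 0.61, 0.37, 0.22, 0.14, 0.08]

# Fit S(D) = exp(-D / D0) by least squares on ln S through the origin.
xs = doses[1:]
ys = [math.log(s) for s in survival[1:]]
slope = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
D0 = -1.0 / slope  # dose reducing survival to ~37%; larger D0 = more resistant
```

A fitted D0 gives a single number per isolate, so strains from different niches (probiotic preparation versus murine gut) can be compared directly for radioresistance.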

Keywords: dysbiosis, lactobacillus, mitigation, radiation

Procedia PDF Downloads 141
5711 The Prevalence of Organized Retail Crime in Riyadh, Saudi Arabia

Authors: Saleh Dabil

Abstract:

This study investigates the prevalence of organized retail crime in supermarkets in Riyadh, Saudi Arabia. Store managers, security managers and general employees were asked about the types of retail crime that occur in their stores. Three independent variables were related to reports of organized retail theft: (1) the supermarket profile (volume, location, standard and type of the store), (2) the social and physical environment of the store (maintenance, cleanliness and overall organizational cooperation), and (3) the security techniques and loss prevention electronics used. The theoretical framework of this study is based on social disorganization theory. The study concludes that organized retail theft is moderately apparent in Riyadh stores. The general results showed that the store environment affects the prevalence of organized retail theft in relation to the gender of thieves, age groups, working shift, type of stolen items, and the number of thieves involved in a single case. Among other reasons, one factor in organized theft is the economic pressure on customers, which varies with the location of the store. How stores deal with theft was also investigated to obtain a clear picture of their handling of organized retail theft. The results showed that, in most cases, thieves are released without any action, and are sometimes given a written warning; very few cases are referred to the police. This study suggests several measures for addressing organized theft: first, distributing duties and responsibilities well among employees, especially for security purposes; second, installing strong security systems and designing well-planned store layouts; and third, providing training for general employees, including periodic security skills training. Other factors and suggestions discussed in the study can be found in the full text.

Keywords: organized crime, retail, theft, loss prevention, store environment

Procedia PDF Downloads 201
5710 Lightweight Ceramics from Clay and Ground Corncobs

Authors: N.Quaranta, M. Caligaris, R. Varoli, A. Cristobal, M. Unsen, H. López

Abstract:

Corncobs are agricultural wastes that can be used as fuel or as raw material in different industrial processes such as cement manufacture, contaminant adsorption, and chemical compound synthesis. The aim of this work is to characterize this waste and analyze the feasibility of its use as a pore-forming material in the manufacture of lightweight ceramics for the civil construction industry. The raw materials are characterized using various techniques: X-ray diffraction analysis, differential thermal and thermogravimetric analyses (DTA-TGA), FTIR spectroscopy, and ecotoxicity evaluation, among others. The ground corncobs, with particle size less than 2 mm, are mixed with clay up to 30% by volume and shaped by uniaxial pressing at 25 MPa, with 6% humidity, in moulds of 70 mm x 40 mm x 18 mm. The green bodies are then heat treated at 950°C for two hours, following the treatment curves used in the ceramic industry. The ceramic probes are characterized by several techniques: density, porosity and water absorption, permanent volumetric variation, loss on ignition, microscopy analysis, and mechanical properties. DTA-TGA analysis of the corncobs shows a small mass loss in the TGA curve in the range 20°-250°C and exothermic peaks at 250°-500°C. The FTIR spectrum of the corncob sample shows the characteristic pattern of this kind of organic matter, with stretching vibration bands of adsorbed water, methyl groups, C-O and C-C bonds, and the complex form of the cellulose and hemicellulose glycosidic bonds. The obtained ceramic bodies present good external characteristics, without loose edges, and adequate properties for market requirements. The porosity values of the sintered pieces are higher than those of the reference sample without waste addition. The results generally indicate that it is possible to use corncobs as a porosity former in ceramic bodies without modifying the sintering temperatures usually employed in the industry.
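A minimal sketch of how the density, porosity, and water-absorption figures for the fired probes can be obtained from the classical Archimedes (water-immersion) weighings; the three masses below are hypothetical illustration values, not measurements from the study.

```python
def apparent_porosity(dry_g, suspended_g, saturated_g):
    """Open-pore volume as a percentage of bulk (exterior) volume."""
    return 100.0 * (saturated_g - dry_g) / (saturated_g - suspended_g)

def water_absorption(dry_g, saturated_g):
    """Mass of absorbed water as a percentage of dry mass."""
    return 100.0 * (saturated_g - dry_g) / dry_g

def bulk_density(dry_g, suspended_g, saturated_g, rho_water=1.0):
    """Bulk density in g/cm3, taking water at rho_water g/cm3."""
    return dry_g * rho_water / (saturated_g - suspended_g)

# Hypothetical weighings (grams): dry, suspended in water, saturated surface-dry.
dry, susp, sat = 52.0, 30.0, 60.0
print(f"apparent porosity: {apparent_porosity(dry, susp, sat):.1f} %")
print(f"water absorption : {water_absorption(dry, sat):.1f} %")
print(f"bulk density     : {bulk_density(dry, susp, sat):.2f} g/cm3")
```

A higher apparent porosity for the corncob-bearing pieces than for the reference clay, at similar mechanical integrity, is exactly the outcome the abstract reports.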

Keywords: ceramic industry, biomass, recycling, hemicellulose glycosidic bonds

Procedia PDF Downloads 407
5709 Flood Management Plans in Different Flooding Zones of Gujranwala and Rawalpindi Divisions, Punjab, Pakistan

Authors: Muhammad Naveed

Abstract:

In this paper, flood issues in the Gujranwala and Rawalpindi divisions are discussed as a matter of primary importance, as these zones have been affected by flooding continuously in recent years. The regional variability of the problem, the present status of ongoing management measures, their adequacy, and future needs in flood management are covered. Flood problems in these zones arise from the Chenab and Jhelum river basins. Particular problems related to floods in these divisions are the lack of major dams on the Chenab and Jhelum rivers and the mismanagement of river and canal water, such as dam-break flows and waterlogging in Tal zones. Besides the rivers, the major nalaas (drainage channels) in these regions, such as Nalaa Lai in Rawalpindi and Nalaa Daik, Nalaa Palkhu, and Nalaa Aik in Gujranwala, are a major cause of floods. Proper management of these nalaas and timely evacuation of the nearby population could reduce flood impacts in these regions. Progress on different flood management measures, both structural and non-structural, is discussed. Future needs for efficient and successful flood management in Pakistan are also pointed out. We describe different hard and soft engineering techniques to cope with flood situations in these zones, which are especially vulnerable owing to poor management of canal and river water. Effective management and use of hard and soft techniques will be needed in the coming years to control major flooding in flood-risk zones and to minimize loss of life as well as agricultural and financial losses, since floods and other natural disasters are a major drawback to the economic prosperity of the country.

Keywords: flood management, rivers, major dams, agricultural and financial loss, future management and control

Procedia PDF Downloads 203
5708 Research and Application of Multi-Scale Three Dimensional Plant Modeling

Authors: Weiliang Wen, Xinyu Guo, Ying Zhang, Jianjun Du, Boxiang Xiao

Abstract:

Reconstructing and analyzing three-dimensional (3D) models from in situ measured data is important for a number of research topics and applications in plant science, including plant phenotyping, functional-structural plant modeling (FSPM), plant germplasm resource protection, and agricultural technology popularization. Plant modeling spans many scales, from cell, tissue, and organ up to whole plant and canopy. The techniques currently used for data capture, feature analysis, and 3D reconstruction differ considerably across these scales. In this context, morphological data acquisition, 3D analysis, and modeling of plants at different scales are introduced systematically, and the data-capture equipment commonly used at each scale is described. Hot issues and difficulties at each scale are then discussed. Several examples are given, such as micron-scale phenotyping quantification and 3D microstructure reconstruction of vascular bundles within maize stalks based on micro-CT scanning; 3D reconstruction of leaf surfaces and feature extraction from point clouds acquired with a 3D handheld scanner; and plant modeling by combining parameter-driven 3D organ templates. Several application examples using the 3D models and analysis results are also introduced. A 3D maize canopy was constructed, and light distribution was simulated within the canopy for the design of ideal plant types. A grape tree model was constructed from 3D digitizing and point-cloud data and was used to produce science content for the 11th International Conference on Grapevine Breeding and Genetics. Using tissue models of plants, Google Glass was used to look around visually inside a plant to understand its internal structure. With the development of information technology, 3D data acquisition and data processing techniques will play a greater role in plant science.
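A minimal sketch of one routine step in organ-scale point-cloud processing of the kind mentioned above: voxel-grid downsampling, which replaces all points falling in the same voxel by their centroid before surface reconstruction or feature extraction. The random cloud and the 5 mm voxel size are illustrative assumptions, not the paper's pipeline.

```python
import numpy as np

def voxel_downsample(points, voxel=0.005):
    """Return one centroid per occupied voxel of edge length `voxel` (metres)."""
    keys = np.floor(points / voxel).astype(np.int64)
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.reshape(-1)            # guard against NumPy shape quirks
    n_voxels = inverse.max() + 1
    sums = np.zeros((n_voxels, 3))
    counts = np.zeros(n_voxels)
    np.add.at(sums, inverse, points)         # accumulate per-voxel coordinate sums
    np.add.at(counts, inverse, 1.0)
    return sums / counts[:, None]

rng = np.random.default_rng(0)
cloud = rng.random((10_000, 3)) * 0.1        # 10,000 points in a 10 cm cube
reduced = voxel_downsample(cloud, voxel=0.005)
print(len(cloud), "points reduced to", len(reduced), "centroids")
```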

Keywords: plant, three dimensional modeling, multi-scale, plant phenotyping, three dimensional data acquisition

Procedia PDF Downloads 280
5707 Monitoring Deforestation Using Remote Sensing and GIS

Authors: Tejaswi Agarwal, Amritansh Agarwal

Abstract:

Forest ecosystems play a very important role in the global carbon cycle: they store about 80% of all above-ground and 40% of all below-ground terrestrial organic carbon. There is much interest in the extent of tropical forests and their rates of deforestation for two reasons: greenhouse gas contributions and the profoundly negative impact on biodiversity. Deforestation has many ecological, social, and economic consequences, one of which is the loss of biological diversity. The rapid deployment of remote sensing (RS) satellites and the development of RS analysis techniques in the past three decades have provided a reliable, effective, and practical way to characterize terrestrial ecosystem properties. Global estimates of tropical deforestation vary widely, ranging from 50,000 to 170,000 km2/yr; recent FAO estimates for 1990-1995 cite 116,756 km2/yr globally. Remote sensing can be a very useful tool for monitoring forests and the associated deforestation to a sufficient level of accuracy without physically surveying the forest areas, many of which are inaccessible. The methodology for the assessment of forest cover using digital image processing (ERDAS) has been followed. The satellite data for the study were procured in digital format from the Indian Institute of Remote Sensing (IIRS), Dehradun. While procuring the satellite data, care was taken to ensure that the data were cloud free and did not belong to the dry, leafless season. The Normalized Difference Vegetation Index (NDVI), NDVI = (NIR - Red) / (NIR + Red), has been used as a numerical indicator of the reduction in ground biomass. After calculating the NDVI variations and the associated mean, we analysed the change in ground biomass. Through this paper, we indicate the rate of deforestation over a given period of time by comparing the forest cover at different time intervals.
With the help of remote sensing and GIS techniques, it is clearly shown that the total forest cover is continuously degrading and being transformed into various land use/land cover categories.
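The NDVI formula quoted above can be sketched per pixel in a few lines of NumPy; the two 3x3 reflectance arrays are made-up illustration values, not data from the study.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """NDVI = (NIR - Red) / (NIR + Red); values fall in [-1, 1]."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

red = np.array([[0.10, 0.12, 0.40],
                [0.08, 0.30, 0.35],
                [0.09, 0.11, 0.38]])   # bare soil reflects strongly in red
nir = np.array([[0.50, 0.55, 0.42],
                [0.48, 0.33, 0.36],
                [0.52, 0.49, 0.40]])   # healthy vegetation reflects strongly in NIR
index = ndvi(nir, red)
print("per-pixel NDVI:\n", index.round(2))
print("mean NDVI:", float(index.mean().round(3)))
```

Comparing the mean NDVI of the same scene at two acquisition dates then gives the biomass-change indicator the abstract describes.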

Keywords: remote sensing, deforestation, supervised classification, NDVI, change detection

Procedia PDF Downloads 1213
5706 NDVI as a Measure of Change in Forest Biomass

Authors: Amritansh Agarwal, Tejaswi Agarwal

Abstract:

Forest ecosystems play a very important role in the global carbon cycle: they store about 80% of all above-ground and 40% of all below-ground terrestrial organic carbon. There is much interest in the extent of tropical forests and their rates of deforestation for two reasons: greenhouse gas contributions and the profoundly negative impact on biodiversity. Deforestation has many ecological, social, and economic consequences, one of which is the loss of biological diversity. The rapid deployment of remote sensing (RS) satellites and the development of RS analysis techniques in the past three decades have provided a reliable, effective, and practical way to characterize terrestrial ecosystem properties. Global estimates of tropical deforestation vary widely, ranging from 50,000 to 170,000 km2/yr; recent FAO estimates for 1990-1995 cite 116,756 km2/yr globally. Remote sensing can be a very useful tool for monitoring forests and the associated deforestation to a sufficient level of accuracy without physically surveying the forest areas, many of which are inaccessible. The methodology for the assessment of forest cover using digital image processing (ERDAS) has been followed. The satellite data for the study were procured from the USGS website in digital format. While procuring the satellite data, care was taken to ensure that the data were cloud and aerosol free by making use of the FLAASH atmospheric correction technique. The Normalized Difference Vegetation Index (NDVI), NDVI = (NIR - Red) / (NIR + Red), has been used as a numerical indicator of the reduction in ground biomass. After calculating the NDVI variations and the associated mean, we analysed the change in ground biomass. Through this paper, we indicate the rate of deforestation over a given period of time by comparing the forest cover at different time intervals.
With the help of remote sensing and GIS techniques, it is clearly shown that the total forest cover is continuously degrading and being transformed into various land use/land cover categories.
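A sketch of the change-detection step described above: given NDVI rasters of the same scene at two dates, flag pixels whose NDVI dropped by more than a threshold as probable biomass loss. The two small rasters and the 0.2 threshold are illustrative assumptions.

```python
import numpy as np

def biomass_loss_mask(ndvi_t1, ndvi_t2, drop=0.2):
    """True where NDVI decreased by more than `drop` between the two dates."""
    return (np.asarray(ndvi_t1) - np.asarray(ndvi_t2)) > drop

ndvi_2010 = np.array([[0.80, 0.70],
                      [0.75, 0.60]])
ndvi_2020 = np.array([[0.30, 0.65],
                      [0.20, 0.55]])
mask = biomass_loss_mask(ndvi_2010, ndvi_2020)
print(f"{100 * mask.mean():.0f}% of pixels flagged as probable forest loss")
print("mean NDVI change:", float((ndvi_2020 - ndvi_2010).mean().round(3)))
```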

Keywords: remote sensing, deforestation, supervised classification, NDVI, change detection

Procedia PDF Downloads 405
5705 Experimental Study of Damage in a Composite Structure by Vibration Analysis: Glass/Polyester

Authors: R. Abdeldjebar, B. Labbaci, L. Missoum, B. Moudden, M. Djermane

Abstract:

The basic components of a composite material make it very sensitive to damage, which calls for reliable and efficient damage-detection techniques. This work focuses on damage detection by vibration analysis, whose main objective is to exploit the dynamic response of a structure to detect and understand the damage. The experimental results are compared with those predicted by numerical models to confirm the effectiveness of the approach.
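A minimal sketch of the principle behind vibration-based damage detection: damage reduces stiffness, which lowers the structure's natural frequencies. Two synthetic free-decay responses stand in for measurements on intact and damaged glass/polyester probes; the 50 Hz and 46.5 Hz modes are assumptions for illustration.

```python
import numpy as np

def peak_frequency(signal, fs):
    """Frequency (Hz) of the largest spectral peak of a sampled signal."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return float(freqs[spectrum.argmax()])

fs = 1000.0                                   # sampling rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)
intact  = np.exp(-0.5 * t) * np.sin(2 * np.pi * 50.0 * t)
damaged = np.exp(-0.8 * t) * np.sin(2 * np.pi * 46.5 * t)  # stiffness loss

f_intact = peak_frequency(intact, fs)
f_damaged = peak_frequency(damaged, fs)
print(f"intact: {f_intact:.1f} Hz, damaged: {f_damaged:.1f} Hz, "
      f"shift: {f_intact - f_damaged:.1f} Hz")
```

In practice such frequency shifts, together with mode-shape changes, are what the measured dynamic response is compared against the numerical model for.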

Keywords: experimental, composite, vibration analysis, damage

Procedia PDF Downloads 677
5704 Seismic Inversion for Geothermal Exploration

Authors: E. N. Masri, E. Takács

Abstract:

Amplitude Versus Offset (AVO) and simultaneous model-based impedance inversion techniques have not commonly been utilized for geothermal exploration; however, some recent publications have called attention to the fact that they can be very useful in geothermal investigations. In this study, we present rock-physical attributes obtained from 3D pre-stack seismic data and well logs collected in a study area in the NW part of the Pannonian Basin, where the geothermal reservoir is located in the fractured zones of the Triassic basement and has been reached by three production-injection well pairs. The holes were planned very successfully based on the conventional 3D migrated stack volume prior to this study. The available geophysical-geological datasets subsequently provided a great opportunity to test modern inversion procedures in the same area. In this presentation, we summarize the theory and application of the most promising seismic inversion techniques from the viewpoint of geothermal exploration. We demonstrate P- and S-wave impedance volumes, as well as velocity (Vp and Vs), density, and Vp/Vs ratio attribute volumes calculated from the seismic and well-logging data sets. After a detailed discussion, we conclude that P-wave impedance and the Vp/Vs ratio are the most helpful parameters for lithology discrimination in the study area. They detect the hot-water-saturated fracture zone very well, so they can be very useful in mapping the investigated reservoir. Integrated interpretation of all the obtained rock-physical parameters is essential. We are extending the pre-stack seismic tools discussed above by studying the possibilities of Elastic Impedance Inversion (EII) for geothermal exploration. That procedure provides two other useful rock-physical properties, the compressibility and the rigidity (Lamé parameters). Results for these newly created elastic parameters will also be demonstrated in the presentation.
Geothermal extraction is of great interest nowadays, and we can adopt several methods that have been successfully applied in hydrocarbon exploration for decades to discover new reservoirs and reduce drilling risk and cost.
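The attributes named above follow from standard rock-physics relations applied sample by sample to the logs: P- and S-impedance, the Vp/Vs ratio, and the Lamé parameters (rigidity mu and Lamé's first parameter lambda). The basement values below are illustrative assumptions, not data from the study.

```python
# Hypothetical log values at one depth sample (SI units).
vp, vs, rho = 5800.0, 3200.0, 2650.0   # P velocity (m/s), S velocity (m/s), density (kg/m3)

zp = rho * vp                           # P-wave (acoustic) impedance
zs = rho * vs                           # S-wave impedance
mu = rho * vs ** 2                      # rigidity (shear modulus), Pa
lam = rho * vp ** 2 - 2.0 * mu          # Lame's first parameter, Pa

print(f"Zp = {zp:.3e}  Zs = {zs:.3e}  Vp/Vs = {vp / vs:.2f}")
print(f"mu = {mu:.3e} Pa   lambda = {lam:.3e} Pa")
```

Because a water-saturated fracture zone lowers Vp much more than Vs, anomalies in Zp and Vp/Vs are the natural indicators to map, as the abstract concludes.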

Keywords: fractured zone, seismic, well-logging, inversion

Procedia PDF Downloads 131
5703 A Reading Light That Can Adjust Indoor Light Intensity According to the Activity and Person to Improve the Indoor Visual Comfort of Occupants, Tested Using Post-Occupancy Evaluation Techniques on a Sri Lankan Population

Authors: R.T.P. De Silva, T. K. Wijayasiriwardhane, B. Jayawardena

Abstract:

Most people nowadays spend their time in indoor environments, so a quality indoor environment is needed. This study was conducted to identify how indoor visual comfort can be improved using a personalized light system. Light intensity, light color, glare, and contrast are the main factors that affect visual comfort. The light intensity needed to perform a task changes with the task; by providing the necessary light intensity, we can improve the visual comfort of occupants. Hue can affect the emotions of occupants, and preferred light colors and intensities change with the occupant's age and gender. The research was conducted to identify whether there is a relationship between personalization and visual comfort. To validate this, an Internet of Things (IoT)-based reading light was designed. This light can operate at standard light levels or at personalized light levels; it can also measure the current light intensity of the environment and maintain a continuous light level appropriate to the task. The test was conducted with 25 undergraduates, 5 school students, and 5 adults, and feedback was gathered using post-occupancy evaluation (POE) techniques in three steps: without any light control, with the standard light level, and with the personalized light level. Users spent 10 minutes under each condition, and their feedback was collected after each step. According to the results, 94% of participants rated the personalized light system as comfortable. The feedback shows that staying under a continuous light level helps users maintain concentration. Future research can examine how the color of indoor light affects the indoor visual comfort of occupants using a personalized light system, and the proposed IoT-based light can be improved to change light colors according to user preference.
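A minimal sketch of the control loop such a reading light could run: measure the ambient illuminance, compare it with the target for the current task, and nudge the lamp output toward the level that closes the gap. The target lux values, lamp capacity, and proportional gain are all assumptions for illustration, not the device described in the paper.

```python
TASK_TARGET_LUX = {"reading": 500.0, "computer work": 300.0, "relaxing": 150.0}

def adjust(lamp_output, measured_lux, task, gain=0.1):
    """One proportional control step; returns the new lamp output clamped to [0, 1]."""
    error = TASK_TARGET_LUX[task] - measured_lux
    return min(1.0, max(0.0, lamp_output + gain * error / 500.0))

# Simulate convergence in a dim room: daylight contributes 120 lx and the
# lamp adds up to 600 lx at full output.
lamp, daylight, lamp_max = 0.0, 120.0, 600.0
for _ in range(50):
    lamp = adjust(lamp, daylight + lamp * lamp_max, "reading")
print(f"steady-state lamp output: {lamp:.2f} "
      f"(room at ~{daylight + lamp * lamp_max:.0f} lx)")
```

The loop settles where lamp plus daylight meet the task target, which is the "maintain continuous light levels according to the task" behaviour the abstract describes.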

Keywords: indoor environment quality, internet of things based light system, post occupancy evaluation, visual comfort

Procedia PDF Downloads 160
5702 A New Graph Theoretic Problem with Ample Practical Applications

Authors: Mehmet Hakan Karaata

Abstract:

In this paper, we first coin a new graph theoretic problem with numerous applications. Second, we provide two algorithms for the problem. The first solution uses a brute-force technique, whereas the second is based on an initial identification of the cycles in the given graph. We then provide a correctness proof of the algorithm. Applications of the problem include graph analysis, graph drawing, and network structuring.
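The second algorithm begins by identifying cycles in the given graph. As the abstract does not spell out that routine, here is a minimal sketch of one standard way to do it: a depth-first search on an undirected graph that returns the vertices of the first cycle found via a back edge.

```python
def find_cycle(adj):
    """Return one cycle as a vertex list, or None if the graph is acyclic."""
    visited, parent = set(), {}

    def dfs(node, prev):
        visited.add(node)
        parent[node] = prev
        for nxt in adj[node]:
            if nxt == prev:
                continue
            if nxt in visited:          # back edge: nxt is an ancestor of node
                cycle, v = [nxt], node
                while v != nxt:         # walk parents back up to the ancestor
                    cycle.append(v)
                    v = parent[v]
                return cycle
            found = dfs(nxt, node)
            if found:
                return found
        return None

    for start in adj:                   # cover every connected component
        if start not in visited:
            found = dfs(start, None)
            if found:
                return found
    return None

# A 5-cycle plus a separate acyclic component.
graph = {1: [2, 5], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4, 1], 6: [7], 7: [6]}
print("cycle found:", find_cycle(graph))
```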

Keywords: algorithm, cycle, graph algorithm, graph theory, network structuring

Procedia PDF Downloads 391
5701 Buy-and-Hold versus Alternative Strategies: A Comparison of Market-Timing Techniques

Authors: Jonathan J. Burson

Abstract:

With the rise of virtually costless, mobile-based trading platforms, stock market trading activity has increased significantly over the past decade, particularly among the millennial generation. This increased attention to the stock market, combined with the recent turmoil caused by the economic upset of COVID-19, makes the topics of market-timing and forecasting particularly relevant. While the overall stock market saw an unprecedented, historically long bull market from March 2009 to February 2020, the end of that bull market reignited investors' search for ways to reduce risk and increase return. Similar searches for outperformance occurred in the early and late 2000s as the Dotcom bubble burst and the Great Recession led to years of negative returns for mean-variance, index investors. Extensive research has been conducted on fundamental analysis, technical analysis, macroeconomic indicators, microeconomic indicators, and other techniques, all using different methodologies and investment periods, in pursuit of higher returns with lower risk. The enormous variety of timeframes, data, and methodologies used by these diverse forecasting methods makes it difficult to compare the outcome of each method directly with the others. This paper establishes a process to evaluate market-timing methods in an apples-to-apples manner based on simplicity, performance, and feasibility. Preliminary findings show that certain technical analysis models provide a higher return with lower risk when compared to the buy-and-hold method and to other market-timing strategies. Furthermore, technical analysis models tend to be easier for individual investors, both in acquiring the data and in analyzing it, making technical analysis-based market-timing methods the preferred choice for retail investors.
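One of the simplest technical-analysis timing rules can be sketched in a few lines: hold the asset only while its price is above an n-day simple moving average, and sit in cash otherwise, then compare the result with buy-and-hold. The price series, window length, and zero trading costs are illustrative assumptions, not the models evaluated in the paper.

```python
def sma(prices, n):
    """Simple moving average; None until n observations are available."""
    return [sum(prices[i - n + 1:i + 1]) / n if i >= n - 1 else None
            for i in range(len(prices))]

def strategy_return(prices, n=5):
    """Total return of holding only when yesterday's close is above its SMA."""
    ma = sma(prices, n)
    wealth = 1.0
    for i in range(1, len(prices)):
        if ma[i - 1] is not None and prices[i - 1] > ma[i - 1]:
            wealth *= prices[i] / prices[i - 1]   # invested this period
    return wealth - 1.0

prices = [100, 101, 99, 102, 104, 103, 106, 101, 97, 95, 98, 103, 107]
bh = prices[-1] / prices[0] - 1.0
print(f"buy-and-hold: {bh:+.1%}   SMA(5) timing: {strategy_return(prices):+.1%}")
```

Running both rules over identical periods and data is the apples-to-apples comparison the paper argues for; which rule wins depends entirely on the series.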

Keywords: buy-and-hold, forecast, market-timing, probit, technical analysis

Procedia PDF Downloads 100
5700 A Comparison of Convolutional Neural Network Architectures for the Classification of Alzheimer’s Disease Patients Using MRI Scans

Authors: Tomas Premoli, Sareh Rowlands

Abstract:

In this study, we investigate the impact of various convolutional neural network (CNN) architectures on the accuracy of diagnosing Alzheimer’s disease (AD) using patient MRI scans. Alzheimer’s disease is a debilitating neurodegenerative disorder that affects millions worldwide. Early, accurate, and non-invasive diagnostic methods are required for providing optimal care and symptom management. Deep learning techniques, particularly CNNs, have shown great promise in enhancing this diagnostic process. We aim to contribute to the ongoing research in this field by comparing the effectiveness of different CNN architectures and providing insights for future studies. Our methodology involved preprocessing MRI data, implementing multiple CNN architectures, and evaluating the performance of each model. We employed intensity normalization, linear registration, and skull stripping for our preprocessing. The selected architectures included VGG, ResNet, and DenseNet models, all implemented using the Keras library. We employed transfer learning and trained models from scratch to compare their effectiveness. Our findings demonstrated significant differences in performance among the tested architectures, with DenseNet201 achieving the highest accuracy of 86.4%. Transfer learning proved to be helpful in improving model performance. We also identified potential areas for future research, such as experimenting with other architectures, optimizing hyperparameters, and employing fine-tuning strategies. By providing a comprehensive analysis of the selected CNN architectures, we offer a solid foundation for future research in Alzheimer’s disease diagnosis using deep learning techniques. Our study highlights the potential of CNNs as a valuable diagnostic tool and emphasizes the importance of ongoing research to develop more accurate and effective models.
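A minimal sketch of the intensity-normalization preprocessing step mentioned above, implemented as a z-score over non-background voxels; the 4x4 "slice" is a toy stand-in for an MRI volume, and the exact scheme used in the paper is not specified here.

```python
import numpy as np

def zscore_normalize(volume, background=0.0):
    """Zero-mean, unit-variance scaling over voxels above `background`."""
    mask = volume > background
    mean, std = volume[mask].mean(), volume[mask].std()
    out = np.zeros_like(volume, dtype=float)
    out[mask] = (volume[mask] - mean) / std    # background voxels stay at zero
    return out

slice_ = np.array([[  0.,   0., 120., 130.],
                   [  0., 140., 150., 160.],
                   [ 90., 100., 110.,   0.],
                   [  0.,  80.,  95., 105.]])
normalized = zscore_normalize(slice_)
print("brain-voxel mean:", round(float(normalized[slice_ > 0].mean()), 6))
print("brain-voxel std :", round(float(normalized[slice_ > 0].std()), 6))
```

Normalizing intensities this way puts scans from different scanners and sessions on a comparable scale before they reach the CNN.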

Keywords: Alzheimer’s disease, convolutional neural networks, deep learning, medical imaging, MRI

Procedia PDF Downloads 80
5699 Ports and Airports: Gateways to Vector-Borne Diseases in Portugal Mainland

Authors: Maria C. Proença, Maria T. Rebelo, Maria J. Alves, Sofia Cunha

Abstract:

Vector-borne diseases are transmitted to humans by mosquitoes, sandflies, bugs, ticks, and other vectors. Some pathogens are re-transmitted to new vectors if the infected human is bitten again while infection levels are high. The vector remains infected for life and can transmit infectious diseases not only between humans but also from animals to humans. Some vector-borne diseases are very disabling and account for more than one million deaths worldwide. Mosquitoes of the Culex pipiens sl. complex are the most abundant in Portugal, and we now have a data set from the surveillance program that has been carried out since 2006 across the country. All mosquito species are included, but the wide coverage of Culex pipiens sl. and its importance for public health make this vector an interesting candidate for assessing the risk of disease amplification. This work focuses on ports and airports, identified as key areas of high vector density. Since mosquitoes are ectothermic organisms, the main factor for vector survival and pathogen development is temperature. Local minimum and maximum air temperatures for each area of interest are averaged by month from data gathered daily at the national network of meteorological stations and interpolated in a geographic information system (GIS). The temperature ranges ideal for several pathogens are known, and this work shows how to combine them with the meteorological data at each port and airport facility to focus the implementation of countermeasures efficiently, reducing transmission risk and mitigation costs simultaneously. The results show an increased alert level with decreasing latitude, which corresponds to higher minimum and maximum temperatures and a lower amplitude of the daily temperature range.
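The screening step described above can be sketched as follows: average the monthly minima and maxima at a facility and flag the months whose mean falls inside a pathogen's development window. The facility temperatures and the 14-34 degC window are illustrative assumptions, not values from the study.

```python
def monthly_mean(tmin, tmax):
    """Crude monthly mean temperature from monthly minima and maxima."""
    return [(lo + hi) / 2.0 for lo, hi in zip(tmin, tmax)]

def risk_months(tmin, tmax, window=(14.0, 34.0)):
    """Month numbers (1-12) whose mean lies inside the pathogen window."""
    lo, hi = window
    return [m + 1 for m, t in enumerate(monthly_mean(tmin, tmax)) if lo <= t <= hi]

# Hypothetical monthly minima/maxima (degC) for a southern port facility.
tmin = [8, 8, 10, 11, 13, 16, 18, 18, 17, 14, 11, 9]
tmax = [15, 16, 18, 19, 22, 26, 29, 29, 27, 22, 18, 16]
print("months at risk:", risk_months(tmin, tmax))
```

Run per facility on the GIS-interpolated temperatures, this yields the calendar of months when countermeasures at each port or airport pay off most.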

Keywords: human health, risk assessment, risk management, vector-borne diseases

Procedia PDF Downloads 423
5698 Reliability Analysis in Power Distribution System

Authors: R. A. Deshpande, P. Chandhra Sekhar, V. Sankar

Abstract:

In this paper, we discuss the basic reliability evaluation techniques needed to evaluate the reliability of distribution systems, as applied in distribution system planning and operation. A reliability study can also help predict the reliability performance of the system after quantifying the impact of adding new components. The number and locations of new components needed to improve the reliability indices to certain limits are identified and studied.
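The customer-oriented reliability indices referred to above are commonly the IEEE 1366 ones; a minimal sketch of two of them, SAIFI and SAIDI, computed from hypothetical feeder interruption records (not data from the paper):

```python
def saifi(interruptions, customers_served):
    """System Average Interruption Frequency Index (interruptions/customer/yr)."""
    return sum(n for n, _ in interruptions) / customers_served

def saidi(interruptions, customers_served):
    """System Average Interruption Duration Index (minutes/customer/yr)."""
    return sum(n * minutes for n, minutes in interruptions) / customers_served

# (customers affected, outage duration in minutes) over one year on a feeder.
events = [(1200, 45), (300, 120), (2500, 15)]
total_customers = 10_000
print(f"SAIFI = {saifi(events, total_customers):.2f} interruptions/customer")
print(f"SAIDI = {saidi(events, total_customers):.2f} minutes/customer")
```

Recomputing these indices with a proposed component (say, a mid-feeder recloser that shrinks the customers affected per event) quantifies the improvement the paper describes.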

Keywords: distribution system, reliability indices, urban feeder, rural feeder

Procedia PDF Downloads 777
5697 Anterior Tooth Misalignment: Orthodontics or Restorative Treatment

Authors: Maryam Firouzmandi, Moosa Miri

Abstract:

The smile is considered one of the most effective means of influencing people, and increasing numbers of patients request cosmetic dental procedures to achieve a perfect smile. Depending on the patient's age, oral and facial characteristics, and the dentist's expertise, different treatment concepts are available. Orthodontics is the most conservative and ideal treatment for crowded anterior teeth; however, patients may reject it because of occupational time constraints, physical discomfort including pain and functional limitations, psychological discomfort, and appearance during treatment. In addition, orthodontic treatment will not resolve deficits in the contour and color of the anterior teeth. As a consequence, patients may instead demand restorative techniques to resolve their anterior misalignment, an approach often called 'instant orthodontics'. Since its introduction, however, adhesive dentistry has at times suffered from overuse. Creating short-term attractive smiles at the expense of long-term dental health and optimal tooth biomechanics through cosmetic techniques should not be considered an ethical approach. The objective of this narrative review was to search the literature for guidelines on decision making and treatment planning for anterior tooth misalignment. Indications for orthodontic treatment, restorative treatment, a combination of both, and adjunctive periodontal surgery are discussed through clinical cases aimed at achieving a proportional smile. Restorative modalities, including disking, cosmetic contouring, veneers, and crowns, are compared with limited or comprehensive orthodontic options. A brief review of the pros and cons of snap-on smile appliances for masking misalignment is also presented. Diagnostic tools such as mock-ups, wax-ups, and digital smile design are also considered in order to achieve more conservative and functional treatments with respect to biologic factors.

Keywords: crowding, misalignment, veneer, crown, orthodontics

Procedia PDF Downloads 119
5696 Indigenous Understandings of Climate Vulnerability in Chile: A Qualitative Approach

Authors: Rosario Carmona

Abstract:

This article discusses the importance of indigenous peoples' participation in climate change mitigation and adaptation. Specifically, it analyses different understandings of climate vulnerability among the diverse actors involved in climate change policies in Chile: indigenous people, state officials, and academics. The data were collected through participant observation and interviews conducted between October 2017 and January 2019 in Chile. Following Karen O'Brien, there are two types of vulnerability: outcome vulnerability and contextual vulnerability. How vulnerability to climate change is understood determines the approach taken, which actors are involved, and which knowledge is considered in addressing it. Because climate change is a very complex phenomenon, it is necessary to transform institutions and their responses. To do so, it is fundamental to consider both perspectives and different types of knowledge, particularly those of the most vulnerable, such as indigenous people. Over centuries of close coexistence with the environment, indigenous societies have elaborated coping strategies, and some of them are already adapting to climate change. Indigenous people in Chile are no exception, yet they tend to be excluded from decision-making processes, and indigenous knowledge is frequently seen as subjective and arbitrary in comparison with science. Nevertheless, in recent years indigenous knowledge has gained particular relevance in the academic world, and indigenous actors are gaining prominence in international negotiations. Some mechanisms promote their participation (e.g., the Cancun safeguards, World Bank operational policies, REDD+), though not without difficulties, and since 2016 parties have been working on a Local Communities and Indigenous Peoples Platform. This paper also explores the incidence of this process in Chile.
Although there is progress in the participation of indigenous people, this participation responds to the operational policies of the funding agencies rather than to a real commitment by the state to this sector. The State of Chile omits any review of the structures that promote inequality and the exclusion of indigenous people. In this way, climate change policies could become a new mechanism of coloniality that validates a single type of knowledge and leads to new territorial control strategies, which increases vulnerability.

Keywords: indigenous knowledge, climate change, vulnerability, Chile

Procedia PDF Downloads 130
5695 Surveying Apps in Dam Excavation

Authors: Ali Mohammadi

Abstract:

Whenever the ground must be excavated, a surveyor is required to check the work against the design. In projects such as dams and tunnels, these controls are even more important because any mistake can increase the cost. Time is also of great importance in these projects, and one way to reduce drilling time is to use techniques that reduce the surveying time. Nowadays, with mobile phones, we can design apps that perform the calculations and drawings for us on the phone. Moreover, if a device normally requires a computer to access its information, an app can transfer that information to the mobile phone for use in the field, removing the need to return to the office.
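A small sketch of the kind of calculation such an app can perform on the phone: the plane-surveying "inverse" problem, giving the horizontal distance and grid azimuth between two stations. The coordinates below are illustrative, not from any real project.

```python
import math

def inverse(e1, n1, e2, n2):
    """Horizontal distance (m) and azimuth (deg, clockwise from grid north)."""
    de, dn = e2 - e1, n2 - n1
    dist = math.hypot(de, dn)
    az = math.degrees(math.atan2(de, dn)) % 360.0   # note (dE, dN) argument order
    return dist, az

# Instrument station and stake-out target in local easting/northing (metres).
dist, az = inverse(1000.0, 2000.0, 1300.0, 2400.0)
print(f"distance {dist:.2f} m, azimuth {az:.1f} deg")
```

Wrapped in a simple mobile interface, a routine like this lets the surveyor compute stake-out elements at the excavation face instead of back at the office.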

Keywords: app, tunnel, excavation, dam

Procedia PDF Downloads 73