Search results for: uncertainties
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 421

121 Quantifying Uncertainties in an Archetype-Based Building Stock Energy Model by Use of Individual Building Models

Authors: Morten Brøgger, Kim Wittchen

Abstract:

Focus on reducing energy consumption in existing buildings at large scale, e.g. in cities or countries, has been increasing in recent years. In order to reduce energy consumption in existing buildings, political incentive schemes are put in place and large-scale investments are made by utility companies. Prioritising these investments requires a comprehensive overview of the energy consumption in the existing building stock, as well as potential energy savings. However, a building stock comprises thousands of buildings with different characteristics, making it difficult to model energy consumption accurately. Moreover, the complexity of the building stock makes it difficult to convey model results to policymakers and other stakeholders. In order to manage the complexity of the building stock, building archetypes are often employed in building stock energy models (BSEMs). Building archetypes are formed by segmenting the building stock according to specific characteristics. Segmenting the building stock according to building type and building age is common, among other things because this information is often easily available. This segmentation makes it easy to convey results to non-experts. However, using a single archetypical building to represent all buildings in a segment of the building stock is associated with a loss of detail. Thermal characteristics are aggregated while other characteristics, which could affect the energy efficiency of a building, are disregarded. Thus, using a simplified representation of the building stock could come at the expense of the accuracy of the model. The present study evaluates the accuracy of a conventional archetype-based BSEM that segments the building stock according to building type and age. The accuracy is evaluated in terms of the archetypes' ability to accurately emulate the average energy demands of the corresponding buildings they were meant to represent. This is done for the buildings' energy demands as a whole as well as for relevant sub-demands. Both are evaluated in relation to the type and age of the building. This should provide researchers who use archetypes in BSEMs with an indication of the expected accuracy of the conventional archetype model, as well as the accuracy lost in specific parts of the calculation due to the use of the archetype method.
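As a toy illustration of the aggregation loss the abstract describes, the sketch below segments a small stock by building type and age band, lets one archetype represent each segment by the mean heat demand of its buildings, and measures the detail lost as the mean absolute deviation of individual buildings from their archetype. All numbers are invented; the study's actual archetypes and demands are not reproduced here.

```python
from statistics import mean
from collections import defaultdict

# (building type, age band, heat demand in kWh/m2/yr) -- invented data
buildings = [
    ("house", "pre-1960", 180.0), ("house", "pre-1960", 220.0),
    ("house", "1961-1990", 120.0), ("flat", "pre-1960", 95.0),
    ("flat", "pre-1960", 125.0), ("flat", "1961-1990", 80.0),
]

# Segment the stock by (type, age band), as a conventional archetype-based
# BSEM would.
segments = defaultdict(list)
for btype, age, demand in buildings:
    segments[(btype, age)].append(demand)

# Each archetype carries the mean demand of its segment.
archetypes = {seg: mean(ds) for seg, ds in segments.items()}

# Detail lost by the archetype simplification: mean absolute deviation of
# individual buildings from their archetype's demand.
mae = mean(abs(d - archetypes[(t, a)]) for t, a, d in buildings)
```

The same comparison, run segment by segment and sub-demand by sub-demand against individual building models, is essentially what the study uses to quantify the archetype method's accuracy.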

Keywords: building stock energy modelling, energy-savings, archetype

Procedia PDF Downloads 122
120 The Importance of Reflection and Collegial Support for Clinical Instructors When Evaluating Failing Students in a Clinical Nursing Course

Authors: Maria Pratt, Lynn Martin

Abstract:

Context: In nursing education, clinical instructors are crucial in assessing and evaluating students' performance in clinical courses. However, instructors often struggle when assigning failing grades to students at risk of failing. Research Aim: This qualitative study aims to understand clinical instructors' experiences evaluating students with unsatisfactory performance, including how reflection and collegial support impact this evaluation process. Methodology, Data Collection, and Analysis Procedures: This study employs Gadamer's Hermeneutic Inquiry as the research methodology. A purposive maximum variation sampling technique was used to recruit eight clinical instructors from a collaborative undergraduate nursing program in Southwestern Ontario. Semi-structured, open-ended, and audio-taped interviews were conducted with the participants. The hermeneutic analysis was applied to interpret the interview data to allow for a thorough exploration and interpretation of the instructors' experiences evaluating failing students. Findings: The main findings of this qualitative research indicate that evaluating failing students was emotionally draining for the clinical instructors who experienced multiple challenges, uncertainties, and negative feelings associated with assigning failing grades. However, the analysis revealed that ongoing reflection and collegial support played a crucial role in mitigating the challenges they experienced. Conclusion: This study contributes to the theoretical understanding of nursing education by shedding light on clinical instructors' challenges in evaluating failing students. It emphasizes the emotional toll associated with this process and the role that reflection and collegial support play in alleviating those challenges. The findings underscore the need for ongoing professional development and support for instructors in nursing education. 
By understanding and addressing clinical instructors' experiences, nursing education programs can better equip them to effectively evaluate struggling students and provide the necessary support for their professional growth.

Keywords: clinical instructor, student evaluation, nursing, reflection, support

Procedia PDF Downloads 52
119 The Control of Wall Thickness Tolerance during Pipe Purchase Stage Based on Reliability Approach

Authors: Weichao Yu, Kai Wen, Weihe Huang, Yang Yang, Jing Gong

Abstract:

Metal-loss corrosion is a major threat to the safety and integrity of gas pipelines, as it may result in burst failures that can cause severe consequences, including enormous economic losses as well as personnel casualties. It is therefore important to ensure the integrity and efficiency of corroding pipelines, considering the wall thickness, which plays an important role in the failure probability of a corroding pipeline. In practice, the wall thickness is controlled during the pipe purchase stage. For example, the API SPEC 5L standard regulates the allowable tolerance of the wall thickness from the specified value during pipe purchase. The allowable wall thickness tolerance is used to determine the wall thickness distribution characteristics, such as the mean value, standard deviation and distribution type. Taking the uncertainties of the input variables in the burst limit-state function into account, the reliability approach rather than the deterministic approach is used to evaluate the failure probability. Moreover, the cost of pipe purchase is influenced by the allowable wall thickness tolerance: stricter control of the wall thickness usually corresponds to a higher pipe purchase cost. Changing the wall thickness tolerance therefore varies both the probability of a burst failure and the cost of the pipe. This paper describes an approach to optimizing the wall thickness tolerance considering both the safety and the economy of corroding pipelines. The corrosion burst limit-state function in Annex O of CSA Z662-07 is employed to evaluate the failure probability using the Monte Carlo simulation technique. By changing the allowable wall thickness tolerance, the parameters of the wall thickness distribution in the limit-state function are changed, and the reliability approach yields the corresponding variations in the burst failure probability. On the other hand, changing the wall thickness tolerance leads to a change in the pipe purchase cost. Using the variations in failure probability and pipe cost caused by changing the wall thickness tolerance specification, the optimal allowable tolerance can be obtained and used to define pipe purchase specifications.
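The reliability calculation described above can be sketched as a toy Monte Carlo experiment. The limit state below is a simplified Barlow-type burst formula with invented distributions and an assumed operating pressure, not the Annex O limit-state function itself; the point is only to show how tightening the wall-thickness tolerance (a smaller standard deviation) lowers the estimated burst failure probability.

```python
import math
import random

random.seed(42)

def burst_failure_probability(wt_mean, wt_std, n=50_000):
    """Monte Carlo estimate of P(burst) for a corroding pipe segment.

    Toy limit state g = p_burst - p_op with a Barlow-type burst pressure
    penalized by defect depth; all distributions and constants are
    hypothetical stand-ins for the Annex O model.
    """
    failures = 0
    for _ in range(n):
        sigma_u = random.gauss(530.0, 25.0)            # tensile strength, MPa (assumed)
        wt = random.gauss(wt_mean, wt_std)             # wall thickness, mm
        d = random.lognormvariate(math.log(2.0), 0.4)  # corrosion depth, mm (assumed)
        p_burst = 2.0 * sigma_u * wt / 914.0 * (1.0 - d / wt)
        if p_burst <= 6.0:                             # assumed operating pressure, MPa
            failures += 1
    return failures / n

# A looser tolerance (larger wt_std) should yield a higher failure probability.
pf_loose = burst_failure_probability(10.0, 1.0)
pf_tight = burst_failure_probability(10.0, 0.3)
```

Repeating this for a range of tolerances, and attaching a purchase cost to each, gives exactly the safety-versus-cost trade-off the paper optimizes.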

Keywords: allowable tolerance, corroding pipeline segment, operation cost, production cost, reliability approach

Procedia PDF Downloads 358
118 Empowering 21st Century Students with Self-Employability Skill Competencies in an Era of Uncertainties of Paid Employment Jobs

Authors: Pac Ordu

Abstract:

The paper was conceived on the premise that unemployment of tertiary education graduates has become an endemic problem in Nigeria. Recognizing the objective of current education as schooling for paid employment, the paper argues that the basic objective of present-day education should be schooling to become self-employed. While schooling to become a successful employee was the focus for the older generation, schooling to become self-employed is defined as the focus for 21st-century teaching and learning. Hence, the paper condemns the inability of curriculum implementers to teach creative trends that enable students to acquire practical skills and small-business operation-oriented competencies. A review of some disciplines was made to show the new trend of education that would empower Nigerian students with small business enterprise operation skills for self-employment on graduation. This was further made to draw the attention of institutions and curriculum designers to the need for the curriculum to be functional in line with the demands of the innovative economic environment. The paper also notes that a period of recession, with its attendant effects, is the best period for students of entrepreneurship to dream and create their own small business enterprises. It highlights the role of the Federal College of Education (Technical) Omoku, Rivers State, Nigeria, and the national recognition it has received for developing an innovative, practical model of teaching entrepreneurship education in the Nigerian Colleges of Education system. In order to equip students for economic survival on graduation, the introduction of innovative teaching can only be successful if lecturers shift their focus away from the conventional emphasis on theory to students' energy quotients. While the paper recommends that lecturers be creative and teach outside the curriculum box, it further recommends that students use the period of their studentship to dream, create and operate their own small business enterprises.

Keywords: 21st century students, curriculum, entrepreneurship, hands-on-training, innovative

Procedia PDF Downloads 81
117 Comparative Study of Water Quality Parameters in the Proximity of Various Landfill Sites in India

Authors: Abhishek N. Srivastava, Rahul Singh, Sumedha Chakma

Abstract:

The rapid urbanization in developing countries is generating an enormous amount of waste, leading to the creation of unregulated landfill sites at various disposal locations. The liquid waste, known as leachate, produced at these landfill sites severely affects the surrounding water quality. The water quality in the proximity of a landfill is affected by various physico-chemical parameters of leachate, such as pH, alkalinity, total hardness, conductivity, chloride, total dissolved solids (TDS), total suspended solids (TSS), sulphate, nitrate, phosphate, fluoride, sodium and potassium; biological parameters, such as biochemical oxygen demand (BOD), chemical oxygen demand (COD) and faecal coliform; and heavy metals, such as cadmium (Cd), lead (Pb), iron (Fe), mercury (Hg), arsenic (As), cobalt (Co), manganese (Mn), zinc (Zn), copper (Cu), chromium (Cr) and nickel (Ni). However, all these parameters are distributed in the leachate according to the nature of the waste dumped at each landfill site; therefore, it becomes very difficult to predict which leachate parameter is mainly responsible for water quality contamination. The present study undertakes a comparative analysis of the physical, chemical and biological parameters of various landfills in India, viz. the Okhla, Ghazipur and Bhalswa landfills in NCR Delhi, the Deonar landfill in Mumbai, the Dhapa landfill in Kolkata, and the Kodungaiyur and Perungudi landfills in Chennai. The statistical analysis of the parameters was carried out using the Statistical Package for the Social Sciences (SPSS) and the LandSim 2.5 model to simulate the long-term effect of various parameters on different time scales. Further, the uncertainties of the various input parameters have been characterized using the fuzzy alpha cut (FAC) technique to check the sensitivity of various water quality parameters in the proximity of the landfill sites. Finally, the study would help to suggest the best method for the prevention of pollution migration from the landfill sites on a priority basis.
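The fuzzy alpha cut technique mentioned above can be sketched in a few lines: each uncertain input is a triangular fuzzy number, an alpha-cut turns it into an interval, and the intervals are pushed through the model to get an output interval. The dilution function and all numbers below are purely illustrative, not the study's model or data.

```python
from itertools import product

def alpha_cut(tri, alpha):
    """Alpha-cut interval of a triangular fuzzy number (low, mode, high)."""
    low, mode, high = tri
    return (low + alpha * (mode - low), high - alpha * (high - mode))

def propagate(f, fuzzy_inputs, alpha):
    """Vertex-method propagation: evaluate f at every corner of the
    alpha-cut hyper-rectangle and keep the min/max (valid when f is
    monotone in each input over the intervals)."""
    cuts = [alpha_cut(t, alpha) for t in fuzzy_inputs]
    values = [f(*corner) for corner in product(*cuts)]
    return (min(values), max(values))

# Hypothetical leachate dilution: concentration at a well from a source
# concentration C0 (mg/L) and a dilution factor DF.
conc = lambda c0, df: c0 / df
interval = propagate(conc, [(800.0, 1000.0, 1300.0), (8.0, 10.0, 14.0)], alpha=0.5)
```

A narrow output interval at high alpha means the result is insensitive to that input's uncertainty, which is how FAC flags the sensitive water quality parameters.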

Keywords: landfill leachate, water quality, LandSim, fuzzy alpha cut

Procedia PDF Downloads 95
116 Comparison of Water Equivalent Ratio of Several Dosimetric Materials in Proton Therapy Using Monte Carlo Simulations and Experimental Data

Authors: M. R. Akbari, H. Yousefnia, E. Mirrezaei

Abstract:

Range uncertainties of protons are currently a topic of interest in proton therapy. Two of the parameters that are often used to specify proton range are water equivalent thickness (WET) and water equivalent ratio (WER). Since the WER value of a specific material is nearly constant across proton energies, it is the more useful parameter to compare. In this study, WER values were calculated for different proton energies in polymethyl methacrylate (PMMA), polystyrene (PS) and aluminum (Al) using the FLUKA and TRIM codes. The results were compared with analytical, experimental and simulated SEICS code data obtained from the literature. In the FLUKA simulation, a cylindrical phantom, 1000 mm in height and 300 mm in diameter, filled with the studied materials was simulated. A typical mono-energetic proton pencil beam, in the wide range of incident energies usually applied in proton therapy (50 MeV to 225 MeV), impinges normally on the phantom. In order to obtain the WER values for the considered materials, cylindrical detectors, 1 mm in height and 20 mm in diameter, were also simulated along the beam trajectory in the phantom. In the TRIM calculations, the type of projectile, the energy and angle of incidence, and the type and thickness of the target material must be defined. The mode of 'detailed calculation with full damage cascades' was selected for proton transport in the target material. The biggest difference in WER values between the codes was 3.19%, 1.9% and 0.67% for Al, PMMA and PS, respectively. In Al and PMMA, the biggest differences between each code and the experimental data were 1.08%, 1.26%, 2.55%, 0.94%, 0.77% and 0.95% for SEICS, FLUKA and SRIM, respectively. FLUKA and SEICS had the greatest agreement (≤0.77% difference in PMMA and ≤1.08% difference in Al, respectively) with the available experimental data in this study. It is concluded that the FLUKA and TRIM codes are capable of simulating Bragg curves and calculating WER values in the studied materials, and can also predict the Bragg peak location and range of proton beams with acceptable accuracy.
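The WER comparison the abstract relies on reduces to a simple ratio: the proton range in water divided by the range in the material at the same incident energy. The sketch below uses placeholder range values (not PSTAR data or the study's simulated ranges) just to show the calculation and the near-constancy of WER across energies.

```python
# Hypothetical CSDA ranges in cm for protons in water and PMMA at two
# incident energies (placeholders, not measured or tabulated values).
ranges = {
    100.0: {"water": 7.7, "pmma": 6.6},
    200.0: {"water": 26.0, "pmma": 22.4},
}

def wer(energy_mev, material):
    """Water equivalent ratio: range in water over range in the material
    at the same incident energy."""
    r = ranges[energy_mev]
    return r["water"] / r[material]

wer_100 = wer(100.0, "pmma")
wer_200 = wer(200.0, "pmma")
```

With real range data the two WER values would likewise agree to within about a percent, which is why WER, rather than WET, is the quantity compared between codes.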

Keywords: water equivalent ratio, dosimetric materials, proton therapy, Monte Carlo simulations

Procedia PDF Downloads 286
115 Cost-Effectiveness Analysis of the Use of COBLATION™ Knee Chondroplasty versus Mechanical Debridement in German Patients

Authors: Ayoade Adeyemi, Leo Nherera, Paul Trueman, Antje Emmermann

Abstract:

Background and objectives: Radiofrequency (RF) generated plasma chondroplasty is considered a promising treatment alternative to mechanical debridement (MD) with a shaver. The aim of the study was to perform a cost-effectiveness analysis comparing costs and outcomes following COBLATION chondroplasty versus mechanical debridement in patients with knee pain associated with a medial meniscus tear and an idiopathic ICRS grade III focal lesion of the medial femoral condyle, from a payer perspective. Methods: A decision-analytic model was developed comparing economic and clinical outcomes between the two treatment options in German patients following knee chondroplasty. Revision rates based on the frequency of repeat arthroscopy, osteotomy and conversion to total knee replacement, reimbursement costs and outcomes data over a 4-year time horizon were extracted from the published literature. One-way sensitivity analyses were conducted to assess uncertainties around model parameters. A threshold analysis determined the revision rate at which the model results change. All costs were reported in 2016 euros; future costs were discounted at a 3% annual rate. Results: Over a 4-year period, COBLATION chondroplasty resulted in an overall net cost saving of €461 due to a lower revision rate of 14%, compared with 48% for MD. The threshold analysis showed that both options were associated with comparable costs if the COBLATION revision rate was assumed to increase up to 23%. The initial procedure cost for COBLATION was higher compared to MD, and outcome scores were significantly improved at 1 and 4 years post-operation versus MD. Conclusion: The analysis shows that COBLATION chondroplasty is a cost-effective option compared to mechanical debridement in the treatment of patients with a medial meniscus tear and an idiopathic ICRS grade III defect of the medial femoral condyle.
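The core of such a decision-analytic comparison can be sketched as an expected-cost calculation: initial procedure cost plus the revision rate times the discounted revision cost. The revision rates (14% vs. 48%) and the 3% discount rate come from the abstract; the monetary inputs and the assumption that a revision occurs at the midpoint of the 4-year horizon are hypothetical placeholders, so the resulting saving is illustrative only and not the study's €461 figure.

```python
def expected_cost(initial_cost, revision_rate, revision_cost,
                  years=4, discount=0.03):
    """Simplified payer-perspective expected cost: the initial procedure
    plus one possible revision, assumed (for illustration) to occur at the
    midpoint of the horizon and discounted at the annual rate."""
    t = years / 2
    return initial_cost + revision_rate * revision_cost / (1 + discount) ** t

# Hypothetical procedure and revision costs in euros.
cost_coblation = expected_cost(2500.0, 0.14, 6000.0)
cost_md = expected_cost(2000.0, 0.48, 6000.0)
saving = cost_md - cost_coblation
```

Re-running the calculation while sweeping the COBLATION revision rate upward is exactly the threshold analysis described: the break-even point is the rate at which `saving` crosses zero.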

Keywords: COBLATION, cost-effectiveness, knee chondroplasty, mechanical debridement

Procedia PDF Downloads 356
114 Calculational-Experimental Approach of Radiation Damage Parameters on VVER Equipment Evaluation

Authors: Pavel Borodkin, Nikolay Khrennikov, Azamat Gazetdinov

Abstract:

Ensuring the integrity of VVER-type reactor equipment is now highly topical in connection with the safety justification of NPP units and the extension of their service life to 60 years and more. First of all, this concerns older units with VVER-440 and VVER-1000 reactors. The justification of VVER equipment integrity depends on the reliability of the estimate of the degree of equipment damage. One of the mandatory requirements providing the reliability of such estimates, and also of evaluations of VVER equipment lifetime, is the monitoring of the equipment's radiation loading parameters. In this connection, there is a problem of justifying the normative parameters used to estimate pressure vessel metal embrittlement, such as the fluence and fluence rate (FR) of fast neutrons above 0.5 MeV. From the point of view of regulatory practice, a comparison of displacement per atom (DPA) and fast neutron fluence (FNF) above 0.5 MeV is of practical concern. In accordance with the Russian regulatory rules, the neutron fluence F(E > 0.5 MeV) is the radiation exposure parameter used in predicting steel embrittlement under neutron irradiation. However, DPA is a more physically legitimate measure of neutron damage in Fe-based materials. If the DPA distribution in reactor structures is more conservative than the neutron fluence, this case should attract the attention of the regulatory authority. The purpose of this work was to show which radiation load parameters (fluence, DPA) on the VVER equipment should be under control, and to give reasoned estimates of such parameters over the whole volume of the equipment. The second task was to give a conservative estimate of each parameter, including its uncertainty. The results of recent investigations allow the conservatism of calculational predictions to be tested, and, as shown in the paper, combining ex-vessel measured data with calculated data makes it possible to assess unpredicted uncertainties arising from the specific features of the individual equipment of a VVER reactor. Some results of these calculational-experimental investigations are presented in this paper.

Keywords: equipment integrity, fluence, displacement per atom, nuclear power plant, neutron activation measurements, neutron transport calculations

Procedia PDF Downloads 128
113 Evaluation of Teaching Performance in Higher Education: From the Students' Responsibility to Their Evaluative Competence

Authors: Natacha Jesus-Silva, Carla S. Pereira, Natercia Durao, Maria Das Dores Formosinho, Cristina Costa-Lobo

Abstract:

Any assessment process, by its very nature, raises a wide range of doubts, uncertainties, and insecurities of all kinds. The evaluation process should be ethically irreproachable, treating each and every one of those evaluated according to a conduct that ensures that the process is fair, helping everyone to recognize and feel comfortable with the processes and results of the evaluation. This is a very important starting point and implies that positive and constructive conceptions and attitudes are developed regarding the evaluation of teaching performance, where students' responsibility is desired. It is not uncommon to find teachers feeling threatened at various levels, in particular as regards their autonomy and their professional dignity. Evaluation must be useful in that it should enable decisions to be taken to improve teacher performance, the quality of teaching, or the learning climate of the school. This study is part of a research project whose main objective is to identify, select, evaluate and synthesize the available evidence on quality indicators in higher education. This paper presents the parameters resulting from pedagogical surveys at a Portuguese higher education institution in the north of the country for the 2015/2016 school year, administered to 1751 students across a total of 11 degrees and 18 master's degrees. It analyses the evaluation made by students of the performance of a group of 68 teachers working full time. The paper presents the lessons learned in the last three academic years, allowing for the identification of effects in the following areas: teaching strategies and methodologies, capacity for systematization, learning climate, and creation of conditions for active student participation. It describes the procedures resulting from the descriptive analysis (frequency analysis, descriptive measures and association measures) and the inferential analysis (one-way ANOVA, one-way MANOVA, two-way MANOVA and correlation analysis).
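The one-way ANOVA named above tests whether mean student ratings differ between teachers. The study used SPSS; the sketch below just computes the F statistic from scratch on invented 1-to-5 ratings of three hypothetical teachers, to make the reported quantity concrete.

```python
def one_way_anova_f(*groups):
    """F statistic of a one-way ANOVA: between-group mean square over
    within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Invented ratings (1-5 scale) of three teachers by three students each.
f_stat = one_way_anova_f([4.2, 4.5, 4.1], [3.1, 3.4, 3.0], [4.8, 4.6, 4.9])
```

A large F (here the group means are clearly separated relative to within-group spread) is what would flag a statistically significant difference in perceived teaching performance, subject to the usual F-distribution p-value.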

Keywords: teaching performance, higher education, students' responsibility, indicators of teaching management

Procedia PDF Downloads 250
112 Changes in Forest Cover Regulate Streamflow in Central Nigerian Gallery Forests

Authors: Rahila Yilangai, Sonali Saha, Amartya Saha, Augustine Ezealor

Abstract:

Gallery forests in sub-Saharan Africa are rapidly disappearing due to intensive anthropogenic activities, thus reducing ecosystem services, one of which is water provisioning. The role played by forest cover in regulating streamflow and water yield is not well understood, especially in West Africa. This pioneering 2-year study investigated the interrelationships between plant cover and hydrology in protected and unprotected gallery forests. Rainfall, streamflow, and evapotranspiration (ET) measurements and estimates over 2015-2016 were obtained to form a water balance for both catchments. In addition, transpiration in the protected gallery forest with high vegetation cover was calculated from stomatal conductance readings of selected species chosen from plot-level data on plant diversity and abundance. Results showed that annual streamflow was significantly higher in the unprotected site than in the protected site, even when normalized by catchment area. However, streamflow commenced earlier and lasted longer in the protected site than in the degraded unprotected site, suggesting regulation by the greater tree density in the protected site. Streamflow correlated strongly with rainfall, with the highest peak in August. As expected, transpiration measurements were lower than potential evapotranspiration estimates, while rainfall exceeded ET in the water cycle. The water balance partitioning suggests that the lower vegetation cover in the unprotected catchment leads to larger runoff in the rainy season and less infiltration, so that streams dry up earlier than in the protected catchment. This baseline information is important for understanding the contribution of plants to water cycle regulation and for modelling integrative water management, in applied research and natural resource management aimed at sustaining water resources under changing land cover and climate uncertainties in this data-poor region.
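The water balance partitioning behind these results reduces to treating streamflow as the residual Q = P - ET - ΔS. The sketch below uses hypothetical annual depths (not the study's measurements) only to show the accounting that makes the lower-ET, lower-cover catchment yield more annual runoff.

```python
def runoff_mm(rainfall_mm, et_mm, storage_change_mm=0.0):
    """Streamflow as the catchment water-balance residual Q = P - ET - dS,
    with all terms expressed as depths (mm) over the catchment."""
    return rainfall_mm - et_mm - storage_change_mm

# Hypothetical annual depths: same rainfall, but the unprotected catchment
# has lower ET (less vegetation cover), so more water leaves as runoff.
q_protected = runoff_mm(1400.0, 950.0)
q_unprotected = runoff_mm(1400.0, 800.0)
```

The timing difference the study observed (earlier, longer flow in the protected site) is not captured by this annual residual; it comes from the infiltration-versus-quick-runoff split within the rainy season.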

Keywords: evapotranspiration, gallery forest, rainfall, streamflow, transpiration

Procedia PDF Downloads 134
111 Quality Assurances for an On-Board Imaging System of a Linear Accelerator: Five Months Data Analysis

Authors: Liyun Chang, Cheng-Hsiang Tsai

Abstract:

To ensure that radiation is precisely delivered to the target in cancer patients, linear accelerators are equipped with pretreatment on-board imaging systems through which the patient setup is verified before each daily treatment. New-generation radiotherapy using beam-intensity modulation, which usually involves treatments with steep dose gradients, is claimed to achieve both a higher degree of dose conformation in the targets and a further reduction of toxicity in normal tissues. However, this benefit is counterproductive if the beam is delivered imprecisely. To avoid irradiating critical organs or normal tissues rather than the target, it is very important to carry out quality assurance (QA) of this on-board imaging system. The QA of the On-Board Imager® (OBI) system of one Varian Clinac-iX linear accelerator was performed through our procedures, modified from a relevant report and AAPM TG-142. Two image modalities of the OBI system were examined: 2D radiography and 3D cone-beam computed tomography (CBCT). Daily and monthly QA was executed for five months in the categories of safety, geometric accuracy and image quality. A marker phantom and a blade calibration plate were used for the QA of geometric accuracy, while the Leeds phantom and the Catphan 504 phantom were used in the QA of radiographic and CBCT image quality, respectively. The reference images were generated with a GE LightSpeed CT simulator and an ADAC Pinnacle treatment planning system. Finally, the image quality was analyzed with the OsiriX medical imaging system. For the geometric accuracy test, the average deviations of the OBI isocenter in each direction are less than 0.6 mm with uncertainties less than 0.2 mm, while all the other items have displacements of less than 1 mm. For radiographic image quality, the spatial resolution is 1.6 lp/cm with contrasts less than 2.2%. The spatial resolution, low contrast, and HU homogeneity of CBCT are greater than 6 lp/cm, less than 1% and within 20 HU, respectively. All tests are within the criteria, except that the HU value of Teflon measured in full-fan mode exceeds the suggested value, which could be due to its inherently high HU value and needs to be rechecked. The OBI system in our facility was thus demonstrated to be reliable, with stable image quality. QA of the OBI system is essential to achieving the best treatment for each patient.

Keywords: CBCT, image quality, quality assurance, OBI

Procedia PDF Downloads 265
110 A Systematic Review on Development of a Cost Estimation Framework: A Case Study of Nigeria

Authors: Babatunde Dosumu, Obuks Ejohwomu, Akilu Yunusa-Kaltungo

Abstract:

Cost estimation in construction is often difficult, particularly when dealing with risks and uncertainties, which are inevitable and peculiar to developing countries like Nigeria. The direct consequences are major deviations in cost, duration, and quality. The fundamental aim of this study is to develop a framework for assessing the impacts of risk on cost estimation, which in turn causes variability between the contract sum and the final account. This is very important, as initial estimates given to clients should reflect a certain magnitude of consistency and accuracy, upon which the client builds other planning-related activities, and should also enhance the capabilities of construction industry professionals by enabling better prediction of the final account from the contract sum. To achieve this, a systematic literature review was conducted with 'cost variability' and 'construction projects' as the search string within three databases: Scopus, Web of Science, and EBSCO (Business Source Premier), and the results were further analyzed to identify gaps in knowledge. From the extensive review, it was found that the number of factors causing deviation between final accounts and contract sums ranged between 1 and 45. Besides, it was discovered that a cost estimation framework similar to the Building Cost Information Service (BCIS) is unavailable in Nigeria, which is a major reason why initial estimates are very often inconsistent, leading to project delay, abandonment, or determination at the expense of the huge sums of money invested. It was concluded that the development of a cost estimation framework, adjudged an important tool in risk shedding rather than risk sharing in project risk management, would be a panacea to the cost estimation problems leading to cost variability in the Nigerian construction industry by the time this ongoing Ph.D. research is completed. It is recommended that practitioners in the construction industry always take risk into account in order to facilitate the rapid development of the construction industry in Nigeria, which should give stakeholders in both the private and public sectors a more in-depth understanding of the estimation effectiveness and efficiency to be adopted.

Keywords: cost variability, construction projects, future studies, Nigeria

Procedia PDF Downloads 159
109 Integration of Agile Philosophy and Scrum Framework to Missile System Design Processes

Authors: Misra Ayse Adsiz, Selim Selvi

Abstract:

In today's world, technology is competing with time. In order to catch up with the world's companies and adapt quickly to change, it is necessary to speed up processes and keep pace with the rate of technological change. Missile system design processes handled with classical methods fall behind in this race, because customer requirements are not clear and demands change again and again during the design process. Therefore, a methodology suitable for the dynamics of missile system design has been investigated, and the processes used to keep up with the era are examined. When commonly used design processes are analyzed, it is seen that none of them is dynamic enough for today's conditions, so a hybrid design process is established. After a detailed review of the existing processes, it was decided to focus on the Scrum framework and agile philosophy. Scrum is a process framework focused on developing software and handling change management with rapid methods. In addition, agile philosophy is intended to respond quickly to changes. In this study, the aim is to integrate the Scrum framework and agile philosophy, which are the most appropriate ways to achieve rapid production and adaptation to change, into the missile system design process. With this approach, the design team involved in the system design process stays in communication with the customer and follows an iterative approach to change management. These methods, which are currently used in the software industry, have been integrated into the product design process. A team is created for the system design process, and the Scrum roles are realized with the customer included: a Scrum team consists of the product owner, the development team and the Scrum master. Scrum events, which are short, purposeful and time-limited, are organized to serve coordination rather than long meetings. Instead of the classic system design methods used in product development studies, a missile design is carried out with this blended method. With the help of this design approach, it becomes easier to anticipate changing customer demands, produce quick solutions to demands, and combat uncertainties in the product development process. With the feedback of the customer included in the process, work proceeds towards marketing, design and financial optimization.

Keywords: agile, design, missile, scrum

Procedia PDF Downloads 140
108 Image-Based UAV Vertical Distance and Velocity Estimation Algorithm during the Vertical Landing Phase Using Low-Resolution Images

Authors: Seyed-Yaser Nabavi-Chashmi, Davood Asadi, Karim Ahmadi, Eren Demir

Abstract:

The landing phase of a UAV is critical, as the many uncertainties in this phase can easily lead to a hard landing or even a crash. In this paper, the estimation of relative distance and velocity to the ground, one of the most important processes during the landing phase, is studied. Accurate measurement sensors are either very expensive, like LIDAR, or limited in operational range, like ultrasonic sensors. Additionally, absolute positioning systems such as GPS or IMU cannot provide the distance to the ground independently. The focus of this paper is to determine whether the relative distance and velocity between the UAV and the ground can be measured during the landing phase using only low-resolution images taken by a monocular camera. The Lucas-Kanade feature detection technique is employed to extract the most suitable features in a series of images taken during the UAV landing. Two different approaches based on the Extended Kalman Filter (EKF) are proposed, and their performance in estimating the relative distance and velocity is compared. The first approach uses the kinematics of the UAV as the process model and the calculated optical flow as the measurement; the second uses the feature's projection on the camera plane (pixel position) as the measurement while employing both the kinematics of the UAV and the dynamics of the projected point as the process model to estimate both relative distance and relative velocity. To verify the results, a sequence of low-quality images taken by a camera moving on a specifically developed testbed was used to compare the performance of the proposed algorithms. The case studies show that the low image quality introduces considerable noise, which degrades the performance of the first approach. On the other hand, using the projected feature position is much less sensitive to this noise and estimates the distance and velocity with relatively high accuracy. This approach can also predict the future projected feature position, which can drastically reduce the computational workload, an important criterion for real-time applications.
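As a rough illustration of the second approach, the sketch below runs a two-state EKF (height and vertical speed) whose scalar measurement is the projected scale of a ground feature, z = f/h. The focal length, noise levels, and descent profile are invented for the example and are not taken from the paper.

```python
import random

# Minimal 1-D EKF sketch: state x = [h, v] (height, vertical speed),
# measurement z = f / h (projected scale of a unit ground feature).
F_PX = 500.0   # hypothetical focal length [px]
DT = 0.1       # frame interval [s]

def ekf_step(x, P, z, q=1e-3, r=4.0):
    """One predict/update cycle (2-state, scalar measurement)."""
    h, v = x
    # predict with constant-velocity kinematics: F = [[1, DT], [0, 1]]
    h_p, v_p = h + v * DT, v
    p00 = P[0][0] + DT * (P[1][0] + P[0][1]) + DT * DT * P[1][1] + q
    p01 = P[0][1] + DT * P[1][1]
    p10 = P[1][0] + DT * P[1][1]
    p11 = P[1][1] + q
    # update with z = f / h, Jacobian H = [-f/h^2, 0]
    H0 = -F_PX / (h_p * h_p)
    S = H0 * p00 * H0 + r                 # innovation covariance
    K0, K1 = p00 * H0 / S, p10 * H0 / S   # Kalman gain
    y = z - F_PX / h_p                    # innovation
    x_new = (h_p + K0 * y, v_p + K1 * y)
    P_new = [[(1 - K0 * H0) * p00, (1 - K0 * H0) * p01],
             [p10 - K1 * H0 * p00, p11 - K1 * H0 * p01]]
    return x_new, P_new

# Simulated descent from 20 m at 1 m/s with noisy pixel measurements
random.seed(0)
x, P = (25.0, 0.0), [[10.0, 0.0], [0.0, 1.0]]
true_h = 20.0
for _ in range(100):
    true_h -= 1.0 * DT
    z = F_PX / true_h + random.gauss(0.0, 2.0)
    x, P = ekf_step(x, P, z)
print(round(x[0], 1), round(x[1], 2))  # estimated height [m] and descent rate [m/s]
```

After 100 frames the estimate should sit near the true 10 m height and -1 m/s descent rate, despite the deliberately biased initial guess.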

Keywords: altitude estimation, drone, image processing, trajectory planning

Procedia PDF Downloads 80
107 Assessing the Mass Concentration of Microplastics and Nanoplastics in Wastewater Treatment Plants by Pyrolysis Gas Chromatography−Mass Spectrometry

Authors: Yanghui Xu, Qin Ou, Xintu Wang, Feng Hou, Peng Li, Jan Peter van der Hoek, Gang Liu

Abstract:

The level and removal of microplastics (MPs) in wastewater treatment plants (WWTPs) have been well evaluated in terms of particle number, while the mass concentrations of MPs, and especially nanoplastics (NPs), remain unclear. In this study, microfiltration, ultrafiltration, and hydrogen peroxide digestion were used to extract MPs and NPs in different size ranges (0.01−1, 1−50, and 50−1000 μm) across the whole treatment schemes of two WWTPs. By identifying specific pyrolysis products, pyrolysis gas chromatography−mass spectrometry was used to quantify the mass concentrations of six selected polymer types: polymethyl methacrylate (PMMA), polypropylene (PP), polystyrene (PS), polyethylene (PE), polyethylene terephthalate (PET), and polyamide (PA). The mass concentrations of total MPs and NPs decreased from 26.23 and 11.28 μg/L in the influent to 1.75 and 0.71 μg/L in the effluent, with removal rates of 93.3% and 93.7% in plants A and B, respectively. PP, PET, and PE were the dominant polymer types in the wastewater, while PMMA, PS, and PA accounted for only a small fraction. The mass concentrations of NPs (0.01−1 μm) were much lower than those of MPs (>1 μm), accounting for 12.0−17.9% and 5.6−19.5% of the total MPs and NPs, respectively. Notably, the removal efficiency differed with polymer type and size range. The low-density polymers (e.g., PP and PE) showed lower removal efficiency than the high-density PET in both plants. Since smaller particles pass the tertiary sand filter or membrane filter more easily, the removal efficiency of NPs was lower than that of the larger MPs. Based on the annual wastewater effluent discharge, an estimated 0.321 and 0.052 tons of MPs and NPs, respectively, are released into the river each year.
Overall, this study investigated the mass concentrations of MPs and NPs across a wide size range (0.01−1000 μm) in wastewater, providing valuable information on the pollution level and distribution characteristics of MPs, and especially NPs, in WWTPs. However, the study has limitations and uncertainties, particularly regarding sample collection and MP/NP detection. The plastic items used (e.g., sampling buckets, ultrafiltration membranes, centrifugal tubes, and pipette tips) may introduce contamination. Additionally, the proposed method causes losses of MPs, and especially NPs, which can lead to their underestimation. Further studies are recommended to address these challenges.
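The removal rates quoted above follow from a simple mass balance between influent and effluent concentrations; a minimal sketch using the reported totals (the pairing of influent and effluent figures is assumed here purely for illustration):

```python
# Removal efficiency from influent/effluent mass concentrations [ug/L].
# A simple mass-balance check, not the authors' own analysis code.

def removal_efficiency(influent, effluent):
    """Percent of mass removed across the treatment train."""
    return 100.0 * (influent - effluent) / influent

mp_removal = removal_efficiency(26.23, 1.75)   # total MPs: 26.23 -> 1.75 ug/L
np_removal = removal_efficiency(11.28, 0.71)   # total NPs: 11.28 -> 0.71 ug/L
print(f"MPs: {mp_removal:.1f}%  NPs: {np_removal:.1f}%")  # 93.3% and 93.7%
```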

Keywords: microplastics, nanoplastics, mass concentration, WWTPs, Py-GC/MS

Procedia PDF Downloads 245
106 Sustainable Ecological Agricultural Systems in Bangladesh: Environmental, Economic and Social Perspective of Compost

Authors: Protima Chakraborty

Abstract:

The sustainability of conventional agriculture in Bangladesh is under threat from the continuous degradation of land and water resources and from declining yields due to the indiscriminate use of agrochemicals. NASL (Northern Agro Services Limited) is pursuing efforts to promote ecological agriculture, with emphasis on better use of organic fertilizer resources and the reduction of external inputs. This paper examines the sustainability of two production systems in terms of their environmental soundness, economic viability, and social acceptability, based on empirical data collected through demonstration land cultivation, a household survey, soil sample analysis, observations, and discussions with key informants. Twelve indicators were selected to evaluate sustainability. Significant differences were found between the two systems in crop diversification, soil fertility management, pest and disease management, and the use of agrochemicals and organic compost. Significant variations were also found in other indicators, such as land-use pattern, crop yield and stability, risk and uncertainties, and food security. Although crop yield and financial return were found to be only slightly higher in the ecological system, the economic return and value addition per unit of land show the positive effect of using compost rather than chemical fertilizer. The findings suggest that ecological agriculture tends to be ecologically, economically, and socially sounder than conventional agriculture, as it requires considerably fewer agrochemicals, adds more organic matter to the soil, provides balanced food, and draws on more local inputs without markedly compromising output and financial benefits. Broad policy measures, including the creation of mass awareness of the adverse health effects of agrochemical-based products, are outlined for the promotion of ecological agriculture.

Keywords: Bangladesh, compost, conventional agriculture, organic fertilizer, environmental sustainability, economic viability, social acceptability

Procedia PDF Downloads 212
105 Derivation of Fragility Functions of Marine Drilling Risers Under Ocean Environment

Authors: Pranjal Srivastava, Piyali Sengupta

Abstract:

The performance of marine drilling risers is crucial in the offshore oil and gas industry to ensure safe drilling operations with minimum downtime. Experimental investigations of marine drilling risers are limited in the literature owing to the expensive and exhaustive test setup required to replicate a realistic riser model and ocean environment in the laboratory. Therefore, this study presents an analytical model of a marine drilling riser for determining its fragility under ocean environmental loading. The marine drilling riser is idealized as a continuous beam with a concentric circular cross-section. The hydrodynamic loading acting on the riser is determined by Morison's equation. By considering the equilibrium of forces on the riser for the connected and normal drilling conditions, the governing partial differential equations in terms of the independent variables z (depth) and t (time) are derived. Subsequently, the Runge-Kutta method and the finite difference method are employed to solve the partial differential equations arising from the analytical model. The proposed analytical approach is successfully validated against experimental results from the literature. From the dynamic analysis results, the critical design parameters, namely the peak displacements, the upper and lower flex joint rotations, and the von Mises stresses of the marine drilling riser, are determined. An extensive parametric study is conducted to explore the effects of top tension, drilling depth, ocean current speed, and platform drift on these critical design parameters. Thereafter, incremental dynamic analysis is performed to derive the fragility functions of shallow-water and deep-water marine drilling risers under ocean environmental loading. The proposed methodology can also be adopted for the downtime estimation of marine drilling risers, incorporating the ranges of uncertainties associated with the ocean environment, especially in deep and ultra-deep water.
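The hydrodynamic load in such a model combines a drag term and an inertia term per unit length of riser, per Morison's equation. A minimal sketch with assumed diameter, coefficients, and flow values (not the study's parameters):

```python
import math

# Distributed hydrodynamic force [N/m] from Morison's equation:
# f = 0.5*rho*Cd*D*u*|u| + rho*Cm*(pi*D^2/4)*du_dt
RHO = 1025.0        # seawater density [kg/m^3]
D = 0.53            # riser outer diameter [m] (assumed)
CD, CM = 1.0, 2.0   # drag and inertia coefficients (typical assumed values)

def morison_force(u, du_dt):
    """u: water particle velocity [m/s] relative to the riser,
    du_dt: water particle acceleration [m/s^2]."""
    drag = 0.5 * RHO * CD * D * u * abs(u)
    inertia = RHO * CM * (math.pi * D**2 / 4.0) * du_dt
    return drag + inertia

# Example: 1.2 m/s current with a 0.3 m/s^2 wave-induced acceleration
f_total = morison_force(1.2, 0.3)
print(round(f_total, 1))  # force per unit length [N/m]
```

In the analytical model this force enters the right-hand side of the riser's equation of motion at each depth z and time t.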

Keywords: drilling riser, marine, analytical model, fragility

Procedia PDF Downloads 115
104 Projected Uncertainties in Herbaceous Production Result from Unpredictable Rainfall Pattern and Livestock Grazing in a Humid Tropical Savanna Ecosystem

Authors: Daniel Osieko Okach, Joseph Otieno Ondier, Gerhard Rambold, John Tenhunen, Bernd Huwe, Dennis Otieno

Abstract:

Increased human activities such as grazing, logging, and agriculture, alongside unpredictable rainfall patterns, have been detrimental to ecosystem service delivery, thereby compromising the productivity potential of these ecosystems. This study aimed at simulating the impact of drought (50% of ambient rainfall) and enhanced rainfall (150%) on future herbaceous CO2 uptake, biomass production, and soil C:N dynamics in a humid savanna ecosystem influenced by livestock grazing. Rainfall patterns were simulated using manipulation experiments set up to reduce (to 50%) and increase (to 150%) the ambient (100%) rainfall amounts in grazed and non-grazed plots. The impact of the manipulated rainfall regime on herbaceous CO2 fluxes, biomass production, and soil C:N dynamics was measured against the volumetric soil water content (VWC), logged every 30 minutes using 5TE soil moisture sensors (Decagon Devices Inc., Washington, USA) installed at 20 cm soil depth in every plot. Herbaceous biomass was estimated using a destructive method augmented by standardized photographic imaging. CO2 fluxes were measured using the ecosystem chamber method, and the gas was analysed with a LI-820 gas analyzer (USA). The C:N ratio was calculated from the soil carbon and nitrogen contents of the plots under study (analyzed using EA2400 CHNS/O and EA2410 N elemental analyzers, respectively). The pattern of VWC was directly influenced by the rainfall amount, with lower VWC observed in the grazed than in the non-grazed plots. Rainfall variability, grazing, and their interaction significantly affected changes in VWC (p < 0.05) and, subsequently, total biomass and CO2 fluxes. VWC had a strong influence on CO2 fluxes under 50% rainfall reduction in the grazed plots (r2 = 0.91; p < 0.05) and under ambient rainfall in the ungrazed plots (r2 = 0.77; p < 0.05). The dependence of biomass on VWC across plots was stronger under grazed (r2 = 0.78 - 0.87; p < 0.05) than under ungrazed (r2 = 0.44 - 0.85; p < 0.05) conditions. The C:N ratio, however, was not correlated with VWC across plots. This study provides insight into how humid savannas will respond to changes driven by rainfall variability and livestock grazing, and consequently into the sustainable management of such ecosystems.
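The r² values reported above summarize regressions of fluxes or biomass on VWC. Purely as an illustration of how a coefficient of determination is computed from paired observations (the data below are invented, not the study's):

```python
def r_squared(x, y):
    """Coefficient of determination for a simple linear fit of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

# Hypothetical paired readings: VWC [%] vs. CO2 flux [umol m-2 s-1]
vwc = [12, 15, 18, 22, 25, 28, 31]
flux = [2.1, 2.6, 3.4, 4.0, 4.8, 5.1, 6.0]
r2 = r_squared(vwc, flux)
print(round(r2, 2))
```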

Keywords: CO2 fluxes, rainfall manipulation, soil properties, sustainability

Procedia PDF Downloads 100
103 Biological Hazards and Laboratory-Inflicted Infections in Sub-Saharan Africa

Authors: Godfrey Muiya Mukala

Abstract:

This research looks at an array of fields in Sub-Saharan Africa, comprising agriculture, food enterprises, medicine, genetically modified organisms, microbiology, and nanotechnology, that can gain from biotechnological research and development. Work on dangerous organisms, mainly bacteria, rickettsiae, fungi, parasites, or genetically engineered organisms, has raised serious questions about the biological danger they pose to human beings and the environment because of their uncertainties. In addition, the recurrence of previously managed diseases and the emergence of new diseases are connected to biosafety challenges, especially in rural settings in low- and middle-income countries. Notably, biotechnology laboratories are required to adopt biosafety measures to protect their workforce, community, environment, and ecosystem from unforeseen materials and organisms. Sensitization and the inclusion of educational frameworks for laboratory workers are essential to acquiring solid knowledge of harmful biological agents, in addition to the human pathogenicity, susceptibility, and epidemiology relevant to the biological data used in research and development. This article reviews and analyzes research with the aim of identifying the proper implementation of universally accepted practices in laboratory safety and biological hazards. It identifies sound microbiological methods, adequate containment equipment, sufficient resources, safety barriers, and specific training and education of the laboratory workforce as the means to decrease and contain biological hazards. Knowledge of standardized microbiological techniques and processes, together with the use of containment facilities, protective barriers, and equipment, goes a long way in preventing occupational infections. Similarly, risk reduction and prevention may be attained through training, education, and research on biohazards and on the pathogenicity and epidemiology of the relevant microorganisms. In this way, medical professionals in rural settings may apply knowledge acquired from the past to anticipate possible concerns in the future.

Keywords: sub-saharan africa, biotechnology, laboratory, infections, health

Procedia PDF Downloads 42
102 Open Fields' Dosimetric Verification for a Commercially-Used 3D Treatment Planning System

Authors: Nashaat A. Deiab, Aida Radwan, Mohamed Elnagdy, Mohamed S. Yahiya, Rasha Moustafa

Abstract:

This study evaluates and investigates the dosimetric performance of our institution's 3D treatment planning system (TPS), Elekta PrecisePLAN, for open 6 MV fields, including square, rectangular, varied-SSD, centrally blocked, missing-tissue, square MLC, and MLC-shaped fields, guided by the recommended QA tests prescribed in AAPM TG-53, the NCS Report 15 test packages, IAEA TRS-430, and ESTRO Booklet No. 7. The study was performed on an Elekta Precise linear accelerator designed for a clinical range of 4, 6, and 15 MV photon beams, with asymmetric jaws and a fully integrated multileaf collimator that enables high conformance to the target with sharp field edges. Seven different tests were applied to a solid water-equivalent phantom along with a 2D array dose detection system, and the doses calculated with PrecisePLAN were compared with the measured doses to verify the accuracy of the dose calculations for the open-field configurations listed above. The QA results showed dosimetric accuracy of the TPS for open fields within the specified tolerance limits. However, for the large square field (25 cm × 25 cm) and the rectangular field (20 cm × 5 cm), some points were out of tolerance in the penumbra region (11.38% and 10.9%, respectively). For the SSD-variation test, the large field resulting from an SSD of 125 cm with a 10 cm × 10 cm field showed an error of 0.2% at the central axis and 1.01% in the penumbra. Overall, the results yielded differences within the accepted tolerance level as recommended, although large fields showed variations in the penumbra. These differences between the dose values predicted by the TPS and the measured values at the same points may result from limitations of the dose calculation, uncertainties in the measurement procedure, or fluctuations in the output of the accelerator.
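The pass/fail comparison underlying such QA results is a point-by-point percent difference checked against a region-specific tolerance. A schematic sketch with assumed tolerance values (actual limits follow the cited AAPM TG-53 / IAEA TRS-430 recommendations):

```python
# Percent difference between TPS-calculated and measured dose at a point,
# checked against a region-specific tolerance. The tolerance values below
# are illustrative assumptions, not the report's prescribed limits.

def percent_diff(d_calc, d_meas):
    return 100.0 * (d_calc - d_meas) / d_meas

def within_tolerance(d_calc, d_meas, region):
    tol = {"central_axis": 2.0, "penumbra": 10.0}[region]  # assumed limits [%]
    return abs(percent_diff(d_calc, d_meas)) <= tol

ok_axis = within_tolerance(101.0, 100.0, "central_axis")  # 1.0% difference
ok_penumbra = within_tolerance(88.6, 100.0, "penumbra")   # -11.4% difference
print(ok_axis, ok_penumbra)
```

A point like the 11.38% penumbra deviation reported above would fail such a check, exactly as the study notes for the large fields.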

Keywords: quality assurance, dose calculation, 3D treatment planning system, photon beam

Procedia PDF Downloads 471
101 Uncertainty Quantification of Corrosion Anomaly Length of Oil and Gas Steel Pipelines Based on Inline Inspection and Field Data

Authors: Tammeen Siraj, Wenxing Zhou, Terry Huang, Mohammad Al-Amin

Abstract:

The high-resolution inline inspection (ILI) tool is used extensively in the pipeline industry to identify, locate, and measure metal-loss corrosion anomalies on buried oil and gas steel pipelines. Corrosion anomalies may occur singly (i.e., individual anomalies) or as clusters (i.e., colonies of corrosion anomalies). Although ILI technology has advanced immensely, the sizes of corrosion anomalies reported by ILI tools carry measurement errors due to the limitations of the tools and the associated sizing algorithms, and to the detection threshold of the tools (i.e., the minimum detectable feature dimension). Quantifying the measurement error in the ILI data is crucial for corrosion management and for developing maintenance strategies that satisfy safety and economic constraints. The measurement error associated with the length of corrosion anomalies (in the longitudinal direction of the pipeline) has been scarcely reported in the literature and is investigated in the present study. Limitations in the ILI tool and the clustering process can cause clustering error, defined as the error introduced during clustering by including or excluding a single anomaly or a group of anomalies in or from a cluster. Clustering error has been found to be one of the biggest contributors to the relatively high uncertainties associated with ILI-reported anomaly length. As such, this study focuses on developing a consistent and comprehensive framework to quantify the measurement errors in the ILI-reported anomaly length by comparing the ILI data and the corresponding field measurements for individual and clustered corrosion anomalies. The analysis carried out in this study is based on ILI and field measurement data for a set of anomalies collected from two segments of a buried natural gas pipeline currently in service in Alberta, Canada. The data analyses showed that the measurement error associated with the ILI-reported length of anomalies without clustering error, denoted Type I anomalies, is markedly smaller than that for anomalies with clustering error, denoted Type II anomalies. A methodology employing data mining techniques is further proposed to classify Type I and Type II anomalies based on the ILI-reported corrosion anomaly information.
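As a toy illustration of separating Type I from Type II anomalies: the paper proposes data-mining techniques on ILI-reported features, and a minimal, hypothetical stand-in is a single decision stump fitted on the relative ILI-vs-field length error. Both the feature and the data below are invented for illustration.

```python
# Fit a one-threshold classifier ("decision stump") on |relative length
# error| to separate Type I (label 0) from Type II (label 1) anomalies.

def fit_stump(errors, labels):
    """Pick the threshold that best splits the two classes."""
    best_t, best_acc = None, -1.0
    for t in sorted(set(errors)):
        preds = [1 if e >= t else 0 for e in errors]
        acc = sum(p == y for p, y in zip(preds, labels)) / len(labels)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t, best_acc

# Hypothetical |(ILI length - field length) / field length| values
errs = [0.05, 0.08, 0.12, 0.10, 0.45, 0.60, 0.38, 0.52]
labs = [0, 0, 0, 0, 1, 1, 1, 1]
t, acc = fit_stump(errs, labs)
print(t, acc)  # chosen threshold and training accuracy
```

A real classifier would use several ILI-reported features and cross-validation; the stump only conveys the idea of thresholding a discriminative feature.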

Keywords: clustered corrosion anomaly, corrosion anomaly assessment, corrosion anomaly length, individual corrosion anomaly, metal-loss corrosion, oil and gas steel pipeline

Procedia PDF Downloads 279
100 Tools and Techniques in Risk Assessment in Public Risk Management Organisations

Authors: Atousa Khodadadyan, Gabe Mythen, Hirbod Assa, Beverley Bishop

Abstract:

Risk assessment, and the knowledge provided through this process, is a crucial part of any decision-making process in the management of risks and uncertainties. Failure in the assessment of risks can undermine the entire risk management process, which in turn can lead to failure to achieve organisational objectives, with significant damaging consequences for the populations affected by the potential risks being assessed. The choice of tools and techniques in risk assessment can influence the degree and scope of decision-making and, subsequently, the risk response strategy. Various qualitative and quantitative tools and techniques are deployed within the broad process of risk assessment. The sheer diversity of tools and techniques available to practitioners makes it difficult for organisations to consistently employ the most appropriate methods. Their adoption is rendered more difficult in public risk regulation organisations due to the sensitive and complex nature of their activities. This is particularly the case in areas relating to the environment, food, and human health and safety, where organisational goals are tied up with societal, political, and individual goals at national and international levels. Hence, this study set out to recognise, analyse, and evaluate the different decision-support tools and techniques employed in assessing risks in public risk management organisations. The research is part of a mixed-methods study that examined the perception of risk assessment and the extent to which organisations practise risk assessment tools and techniques. The study adopted a semi-structured questionnaire with qualitative and quantitative data analysis, covering a range of public risk regulation organisations from the UK, Germany, France, Belgium, and the Netherlands. The results indicated that public risk management organisations use diverse tools and techniques in the risk assessment process. Primary hazard analysis, brainstorming, and hazard analysis and critical control points were described as the most practised risk identification techniques. Within qualitative and quantitative risk analysis, the participants named expert judgement, risk probability and impact assessment, sensitivity analysis, and data gathering and representation as the most practised techniques.
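One of the techniques named above, risk probability and impact assessment, reduces in its simplest form to scoring each risk as probability × impact on ordinal scales and bucketing the score into qualitative bands. The 1-5 scale and band cut-offs below are illustrative assumptions, not a standard prescribed by the study:

```python
# Minimal probability-impact matrix scoring. Scales and cut-offs are assumed.

def risk_rating(probability, impact):
    """probability and impact on a 1-5 ordinal scale."""
    score = probability * impact
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

rating_a = risk_rating(4, 5)   # likely and severe
rating_b = risk_rating(2, 2)   # unlikely and minor
print(rating_a, rating_b)
```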

Keywords: decision-making, public risk management organisations, risk assessment, tools and techniques

Procedia PDF Downloads 248
99 A Blockchain-Based Framework for the Development of a Social Economy Platform

Authors: Hasna Elalaoui Elabdallaoui, Abdelaziz Elfazziki, Mohamed Sadgal

Abstract:

Outlines: The social economy is a moral approach to solidarity applied to the development of projects. To reconcile economic activity and social equity, crowdfunding serves as an alternative means of financing social projects. Several collaborative blockchain platforms exist. Blockchain eliminates the need for a central authority or an unaccountable middleman; it reduces the costs of a successful crowdfunding campaign, since no commission is paid to an intermediary, improves the transparency of record keeping, and reduces reliance on authorities who may be prone to corruption. Objectives: The objectives are to define a software infrastructure for the participatory financing of projects within a social and solidarity economy, allowing transparent, secure, and fair management, and to provide a financial mechanism that improves financial inclusion. Methodology: The proposed methodology comprises a literature review of crowdfunding platforms, a literature review of financing mechanisms, requirements analysis and project definition, a business plan, the platform development process and implementation technology, and the testing of an MVP. Contributions: The solution proposes a new approach to crowdfunding based on the Musharaka principle of Islamic finance, a financial innovation that integrates ethics and the social dimension into contemporary banking practices. Conclusion: Crowdfunding platforms need to secure projects and admit only quality projects while still offering a wide range of options to funders. Thus, a framework based on blockchain technology and Islamic financing is proposed to manage this trade-off between the quality and quantity of options. The proposed financing system, Musharaka, is a mode of financing that prohibits interest and excessive uncertainty. The implementation is offered on the secure Ethereum platform: investors sign and initiate contribution transactions using a digital signature wallet managed by a cryptographic algorithm and smart contracts. Our proposal is illustrated by a crop irrigation project in the Marrakech region.
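The core sharing rule such a smart contract would encode, under the usual Musharaka convention, is that profits are split by pre-agreed ratios while losses are borne in proportion to the capital contributed. A sketch of that rule in plain Python (partner names and figures are illustrative, and the actual contract logic on Ethereum would be written in Solidity):

```python
# Musharaka sharing rule: profit by agreed ratio, loss by capital share.

def musharaka_split(capital, profit_ratio, result):
    """capital and profit_ratio are dicts keyed by partner; result is the
    project's profit (> 0) or loss (< 0)."""
    if result >= 0:
        return {p: result * profit_ratio[p] for p in capital}
    total = sum(capital.values())
    return {p: result * capital[p] / total for p in capital}

capital = {"funder_pool": 70_000.0, "entrepreneur": 30_000.0}
ratios = {"funder_pool": 0.6, "entrepreneur": 0.4}   # agreed up front
gains = musharaka_split(capital, ratios, 10_000.0)   # profit split 60/40
losses = musharaka_split(capital, ratios, -5_000.0)  # loss split 70/30
print(gains, losses)
```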

Keywords: social economy, Musharaka, blockchain, smart contract, crowdfunding

Procedia PDF Downloads 45
98 Influential Parameters in Estimating Soil Properties from Cone Penetrating Test: An Artificial Neural Network Study

Authors: Ahmed G. Mahgoub, Dahlia H. Hafez, Mostafa A. Abu Kiefa

Abstract:

The Cone Penetration Test (CPT) is a common in-situ test that generally investigates a much greater volume of soil, more quickly, than is possible through sampling and laboratory tests. It therefore has the potential both to realize cost savings and to assess soil properties rapidly and continuously. The principal objective of this paper is to demonstrate the feasibility and efficiency of using artificial neural networks (ANNs) to predict the soil angle of internal friction (Φ) and the soil modulus of elasticity (E) from CPT results, accounting for the uncertainties and non-linearities of the soil. In addition, ANNs are used to study the influence of different parameters and to recommend which should be included as input parameters to improve the prediction. Neural networks discover relationships in the input data sets through the iterative presentation of the data and the intrinsic mapping characteristics of neural topologies. The General Regression Neural Network (GRNN), one of the more powerful neural network architectures, is utilized in this study. A large amount of field and experimental data, including CPT results, plate load tests, direct shear box tests, grain size distributions, and calculated overburden pressures, was obtained from a large project in the United Arab Emirates and used for the training and validation of the neural network. A comparison was made between the results obtained from the ANN approach and some common traditional correlations that predict Φ and E from CPT results, with respect to the actual measurements in the collected data. The results show that the ANN is a very powerful tool: very good agreement was obtained between the ANN estimates and the actual measured results, in comparison with other correlations available in the literature. The study recommends some easily available parameters that should be included in the estimation of the soil properties to improve the prediction models. It is shown that the use of the friction ratio in the estimation of Φ and the use of the fines content in the estimation of E considerably improve the prediction models.
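A GRNN is essentially Nadaraya-Watson kernel regression: the prediction is a Gaussian-weighted average of the training targets, with a single smoothing parameter sigma to tune. A minimal sketch with made-up (cone resistance, friction ratio) → friction angle pairs, purely to show the mechanics:

```python
import math

def grnn_predict(x, train_x, train_y, sigma=1.0):
    """GRNN prediction: Gaussian-kernel weighted average of train_y."""
    weights = [math.exp(-sum((a - b) ** 2 for a, b in zip(x, xi))
                        / (2 * sigma ** 2)) for xi in train_x]
    return sum(w * y for w, y in zip(weights, train_y)) / sum(weights)

# Hypothetical training data: (qc [MPa], friction ratio [%]) -> phi [deg]
train_x = [(5.0, 0.5), (10.0, 0.7), (20.0, 0.4), (30.0, 0.3)]
train_y = [30.0, 33.0, 38.0, 41.0]
phi = grnn_predict((15.0, 0.5), train_x, train_y, sigma=5.0)
print(round(phi, 1))  # interpolates between the neighbouring samples
```

In practice the inputs would be normalized and sigma selected by cross-validation; the sketch omits both steps.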

Keywords: angle of internal friction, cone penetrating test, general regression neural network, soil modulus of elasticity

Procedia PDF Downloads 391
97 Dynamic Reliability for a Complex System and Process: Application on Offshore Platform in Mozambique

Authors: Raed Kouta, José-Alcebiades-Ernesto Hlunguane, Eric Châtele

Abstract:

The search for and exploitation of new fossil energy resources is taking place in the context of the gradual depletion of existing deposits. Despite the adoption of international targets to combat global warming, the demand for fuels continues to grow, contradicting the movement towards an energy-efficient society. The increase in the share of offshore in global hydrocarbon production tends to compensate for the depletion of terrestrial reserves, constituting a major challenge for players in the sector. Through the economic potential it represents and the energy independence it provides, offshore exploitation is also a challenge for states such as Mozambique, which have large maritime areas and whose environmental wealth must be considered. The exploitation of new reserves on economically viable terms depends on the available technologies. The development of deep and ultra-deep offshore fields requires significant research and development efforts, and progress has also been made in managing the multiple risks inherent in this activity. Our study proposes a reliability approach for developing products and processes designed to live at sea. Indeed, the context of an offshore platform demands highly reliable solutions that can overcome the difficulties of accessing the system for regular maintenance and quick repairs, and that resist deterioration and degradation processes. One characteristic of the failures we consider is the actual conditions of use, which are 'extreme.' These conditions depend on time and on the interactions between the different causes; these two factors give the degradation process its dynamic character, hence the need to develop dynamic reliability models. Our work highlights mathematical models that can explicitly manage the interactions between components and process variables. These models are accompanied by numerical resolution methods that help structure a dynamic reliability approach in a physical and probabilistic context. The application developed makes it possible to evaluate the reliability, availability, and maintainability of a floating storage and unloading platform for liquefied natural gas production.
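A common numerical route for such dynamic reliability models is Monte Carlo simulation of a stochastic degradation process: a component accumulates random wear each period and fails when the cumulative level crosses a threshold. The sketch below uses exponential increments with invented rates and threshold, not the platform's data:

```python
import random

# Monte Carlo estimate of R(t) = P(no failure by t) for a component whose
# degradation grows by random exponential increments each period.
random.seed(1)

def reliability(t_periods, rate=1.0, threshold=30.0, n_sims=20_000):
    """Fraction of simulated histories that survive t_periods."""
    survived = 0
    for _ in range(n_sims):
        level = 0.0
        for _ in range(t_periods):
            level += random.expovariate(1.0 / rate)  # mean `rate` wear/period
            if level >= threshold:
                break
        else:
            survived += 1
    return survived / n_sims

r_short = reliability(20)   # expected wear 20 << threshold 30: high R
r_long = reliability(40)    # expected wear 40 > threshold: low R
print(round(r_short, 3), round(r_long, 3))
```

Interactions between components and time-varying conditions, central to the paper's models, would enter by coupling the increment distributions across components and periods.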

Keywords: dynamic reliability, offshore plateform, stochastic process, uncertainties

Procedia PDF Downloads 93
96 Seismic Response of Structure Using a Three Degree of Freedom Shake Table

Authors: Ketan N. Bajad, Manisha V. Waghmare

Abstract:

Earthquakes are among the biggest threats to civil engineering structures, costing billions of dollars and thousands of lives around the world every year. Various experimental techniques, such as pseudo-dynamic tests (a nonlinear structural dynamics technique), real-time pseudo-dynamic tests, and shaking table tests, can be employed to verify the seismic performance of structures. A shake table is a device used for shaking structural models or building components mounted on it; it simulates a seismic event using existing seismic data, closely reproducing earthquake inputs. This paper deals with the use of the shaking table test method to check the response of a structure subjected to an earthquake. Shake tables come in several types: vertical, horizontal, servo-hydraulic, and servo-electric. The goal of this work is to perform a seismic analysis of a civil engineering structure with the help of a three degree of freedom (i.e., X, Y, and Z directions) shake table. A 3 DOF shaking table is a useful experimental apparatus, as it imitates a desired acceleration signal in real time for evaluating and assessing the seismic performance of a structure. The study proceeds with the design and erection of a 3 DOF shake table by trial and error. The table is designed for a capacity of up to 981 N. Further, to study the seismic response of a steel industrial building, a proportionately scaled-down model is fabricated and tested on the shake table. An accelerometer mounted on the model is used to record the data. The experimental results are validated against the results obtained from software. It is found that the model can be used to determine how the structure behaves in response to an applied earthquake motion, but it cannot be used for direct numerical conclusions (such as stiffness or deflection values), as many uncertainties are involved in scaling a small-scale model. The model shows the modal forms and gives rough deflection values. The experimental results demonstrate that the shake table is one of the most effective methods available for the seismic assessment of structures.
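The scaling uncertainties mentioned above stem from similitude relations: with a geometric scale factor and acceleration preserved (the usual choice when gravity cannot be scaled), time scales as the square root of the length scale and frequency as its inverse. A sketch using an assumed 1:10 model (not the paper's actual scale):

```python
import math

# Similitude factors (model/prototype) for a scaled shake-table model with
# acceleration preserved: S_t = sqrt(S_L / S_a), S_f = 1 / S_t.

def similitude(scale_length, scale_accel=1.0):
    """Return (time, frequency) scale factors."""
    scale_time = math.sqrt(scale_length / scale_accel)
    return scale_time, 1.0 / scale_time

s_t, s_f = similitude(1 / 10)        # assumed 1:10 model, gravity preserved
print(round(s_t, 3), round(s_f, 2))  # time compressed, frequencies raised
```

This is why the model reveals modal forms and qualitative behaviour, while quantities such as stiffness and deflection cannot be read off directly without converting through these factors.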

Keywords: accelerometer, three degree of freedom shake table, seismic analysis, steel industrial shed

Procedia PDF Downloads 104
95 Structural Health Monitoring-Integrated Structural Reliability Based Decision Making

Authors: Caglayan Hizal, Kutay Yuceturk, Ertugrul Turker Uzun, Hasan Ceylan, Engin Aktas, Gursoy Turan

Abstract:

Monitoring concepts for structural systems have been investigated by researchers for decades, since such tools are convenient for determining intervention plans for structures. Despite considerable development in this regard, the efficient use of monitoring data in reliability assessment and prediction models still needs improvement. More specifically, reliability-based seismic risk assessment of engineering structures may play a crucial role in the post-earthquake decision-making process. After an earthquake, professionals can identify heavily damaged structures based on visual observations. It is much harder, however, to identify structures with minimal visible signs of damage, even if they have experienced considerable structural degradation. Besides, visual observations are open to human interpretation, which makes the decision process controversial and thus less reliable. In this context, when a continuous monitoring system has previously been installed on the structure in question, this decision process can be completed rapidly and with higher confidence by means of the observed data. At this stage, the Structural Health Monitoring (SHM) procedure plays an important role, since it makes it possible to estimate the system reliability based on a recursively updated mathematical model. Therefore, integrating an SHM procedure into the reliability assessment process is an important challenge, owing to the uncertainties that arise in the updated model under environmental, material, and earthquake-induced changes. In this context, this study presents a case study on SHM-integrated reliability assessment of continuously monitored, progressively damaged systems. The objective of the study is to obtain instant feedback on the current state of the structure after an extreme event, such as an earthquake, by using the observed data rather than visual inspections. Thus, the decision-making process after such an event can be carried out on a rational basis. In the near future, this could pave the way for self-reporting structures that warn about their own condition after an extreme event.
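The abstract does not specify the recursively updated model. As a minimal illustration of recursive updating from monitoring data (not the authors' method), the sketch below updates the probability of a damage state from successive natural-frequency measurements; the healthy and damaged frequencies and the noise level are assumed values:

```python
import math

def update_damage_probability(prior, observed_freq,
                              healthy_freq=2.00, damaged_freq=1.70,
                              sigma=0.05):
    """One recursive Bayes step: returns P(damaged | measurement), assuming
    Gaussian measurement noise around each state's natural frequency (Hz)."""
    def gaussian_likelihood(mu):
        return math.exp(-0.5 * ((observed_freq - mu) / sigma) ** 2)
    num = prior * gaussian_likelihood(damaged_freq)
    den = num + (1.0 - prior) * gaussian_likelihood(healthy_freq)
    return num / den

# The posterior of each step becomes the prior of the next -- the
# "recursively updated" part. A frequency drop after the event drives
# the damage probability up without any visual inspection.
p = 0.05                                  # low prior belief in damage
for f in (1.98, 1.74, 1.71, 1.69):        # identified frequencies over time
    p = update_damage_probability(p, f)   # p ends close to 1
```

A real SHM pipeline would identify the frequencies from vibration data and map the damage-state probability to a structural reliability index, but the recursion has the same shape.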

Keywords: condition assessment, vibration-based SHM, reliability analysis, seismic risk assessment

Procedia PDF Downloads 107
94 Knowledge Co-Production on Future Climate-Change-Induced Mass-Movement Risks in Alpine Regions

Authors: Elisabeth Maidl

Abstract:

The interdependence of climate change and natural hazards entails large uncertainties regarding future risks. Regional stakeholders, experts in natural hazard management, and scientists hold specific knowledge and corresponding mental models of such risks. This diversity of views makes it difficult to find common and broadly accepted prevention measures. If the specific knowledge of these types of actors is shared in an interactive knowledge-production process, a broader and common understanding of complex risks becomes possible, allowing the actors to agree on long-term solution strategies. Previous studies on mental models confirm that actors with specific vulnerabilities perceive different aspects of a topic and accordingly prefer different measures. Bringing these perspectives together has the potential to reduce uncertainty and to close blind spots in the search for solutions. However, studies that examine the mental models of regional actors with regard to concrete future mass-movement risks have so far been lacking. The project tests and evaluates the feasibility of knowledge co-creation for the anticipatory prevention of climate-change-induced mass-movement risks in the Alps. As a key element, the mental models of the three groups of actors involved are compared. Embedded in the research programme Climate Change Impacts on Alpine Mass Movements (CCAMM2), the project is carried out in two Swiss mountain regions. It is structured in four phases: 1) the preparatory phase, in which the participants are identified; 2) the baseline phase, in which qualitative interviews and a quantitative pre-survey are conducted with the actors; 3) the knowledge-co-creation phase, in which the actors take part in a moderated exchange meeting and a participatory modelling workshop on specific risks in the region; and 4) a concluding public information event. Results show that participants' mental models are shaped by place of origin, profession, beliefs, and values, which results in distinct narratives on climate change and hazard risks. Further, the more intensively participants interact with each other, the more likely they are to change their views. This provides empirical evidence on how changes in opinions and mindsets can be induced and fostered.

Keywords: climate change, knowledge-co-creation, participatory process, natural hazard risks

Procedia PDF Downloads 31
93 Criticality Assessment Model for Water Pipelines Using Fuzzy Analytical Network Process

Authors: A. Assad, T. Zayed

Abstract:

Water networks (WNs) are responsible for providing adequate amounts of safe, high-quality water to the public. Like other critical infrastructure systems, WNs are subject to deterioration, which increases the number of breaks and leaks and lowers water quality. In Canada, 35% of water assets require critical attention, and there is a significant gap between the needed and the implemented investments. Thus, the need for efficient rehabilitation programs is becoming more urgent given the paradigm of aging infrastructure and tight budgets. The first step towards developing such programs is to formulate a performance index that reflects the current condition of water assets along with their criticality. While numerous studies in the literature have focused on various aspects of condition assessment and reliability, limited efforts have investigated the criticality of such components. Critical water mains are those whose failure causes significant economic, environmental, or social impacts on a community. Including criticality in the computation of the performance index will serve as a prioritization tool for the optimal allocation of the available resources and budget. In this study, several social, economic, and environmental factors that dictate the criticality of water pipelines were elicited from the literature. Expert opinions were sought to provide pairwise comparisons of the importance of these factors. Subsequently, fuzzy logic combined with the Analytical Network Process (ANP) was utilized to calculate the weights of the criteria factors. Multi-Attribute Utility Theory (MAUT) was then employed to integrate these weights with the attribute values of several pipelines in the Montreal WN. The result is a criticality index, ranging from 0 to 1, that quantifies the severity of the consequences of failure of each pipeline. A novel contribution of this approach is that it accounts both for the interdependency between criteria factors and for the inherent uncertainties in calculating the criticality. The practical value of the study lies in an automated Excel-MATLAB tool that utility managers and decision-makers can use in planning future maintenance and rehabilitation activities where highly efficient use of material and time resources is required.
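The fuzzy ANP weighting step requires expert pairwise-comparison data, but the final MAUT aggregation can be sketched. In the sketch below, the factor names, weights, and utility functions are illustrative assumptions, not values from the study, and the ANP/fuzzy step is replaced by fixed weights:

```python
def criticality_index(attributes, weights, utilities):
    """Additive MAUT aggregation: each attribute is mapped to a 0-1
    utility, then combined with weights that sum to one, giving a
    criticality index in [0, 1]."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(w * utilities[f](attributes[f]) for f, w in weights.items())

# Hypothetical factors for one pipeline; in the study the weights would
# come from fuzzy ANP over expert pairwise comparisons.
weights = {"economic": 0.5, "social": 0.3, "environmental": 0.2}
utilities = {
    "economic": lambda diameter_mm: min(diameter_mm / 1000.0, 1.0),
    "social": lambda hospital_proximity: hospital_proximity,     # already 0-1
    "environmental": lambda soil_sensitivity: soil_sensitivity,  # already 0-1
}
pipeline = {"economic": 500, "social": 0.8, "environmental": 0.5}
index = criticality_index(pipeline, weights, utilities)          # 0.59
```

Ranking pipelines by this index is what turns the assessment into the prioritization tool the abstract describes; modelling interdependency between factors is exactly what the ANP step adds over a plain weighted sum like this one.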

Keywords: water networks, criticality assessment, asset management, fuzzy analytical network process

Procedia PDF Downloads 120
92 Analysing the Applicability of a Participatory Approach to Life Cycle Sustainability Assessment: Case Study of a Housing Estate Regeneration in London

Authors: Sahar Navabakhsh, Rokia Raslan, Yair Schwartz

Abstract:

Decision-making on the regeneration of housing estates, i.e., whether to refurbish or rebuild, has mostly been driven by economic factors. To enable sustainable growth, it is vital that the environmental and social impacts of different scenarios are also taken into account. The methodology that encompasses all three pillars of sustainable development is Life Cycle Sustainability Assessment (LCSA), which includes Life Cycle Assessment (LCA) for assessing the environmental impacts of buildings. In current practice, LCA is typically conducted after the design stage and by sustainability experts. Not only is undertaking an LCA at this stage less effective, but issues such as the limited scope for defining and assessing environmental impacts, the implications of changes in the system boundary, the alteration of each of the variable metrics, the use of different life cycle impact assessment methods, and the use of various inventory data for life cycle inventory analysis can produce considerably contrasting results. Given the niche, specialist nature of building LCA, most stakeholders contribute neither to the generation nor to the interpretation of the impact assessment, and because of the uncertainties mentioned above, the results can be generated and interpreted subjectively. For an effective and democratic assessment of environmental impacts, different stakeholders, in particular the community and the design team, should collaborate in the process of data collection, assessment, and analysis. This paper examines and evaluates a participatory approach to LCSA through the analysis of a case study of a housing estate in South West London. The study was conducted using tier-based collaborative methods to collect and share data through surveys and co-design workshops with community members and the design team as the main stakeholders. The assessment of life cycle impacts was conducted throughout the process and influenced the decision-making on the design of the Community Plan. The evaluation demonstrates improved assessment transparency and outcomes, alongside other socio-economic benefits of identifying and engaging the most contributive stakeholders in the process of conducting an LCSA.

Keywords: life cycle assessment, participatory LCA, life cycle sustainability assessment, participatory processes, decision-making, housing estate regeneration

Procedia PDF Downloads 117