Search results for: run off estimation and rainfall
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2548

928 Evaluation of Geomechanical and Geometrical Parameters’ Effects on Hydro-Mechanical Estimation of Water Inflow into Underground Excavations

Authors: M. Mazraehli, F. Mehrabani, S. Zare

Abstract:

In general, mechanical and hydraulic processes are not independent of each other in jointed rock masses. Therefore, the hydro-mechanical coupling of geomaterials should be a central concern in rock mechanics. Rocks naturally contain discontinuities whose presence strongly influences the mechanical and hydraulic characteristics of the medium. Given this effect, experimental investigations on intact rock alone cannot characterize jointed rock mass behavior; hence, numerical methods are used for this purpose. In this paper, water inflow into a tunnel beneath a significant water table has been estimated using a hydro-mechanical discrete element method (HM-DEM). In addition, the effects of geomechanical and geometrical parameters, including the constitutive model, friction angle, joint spacing, dip of the joint sets, and stress factor, on the estimated inflow rate have been studied. Results demonstrate that inflow rates differ between constitutive models and that the inflow rate decreases with increasing joint spacing and stress factor.
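
For orientation, a commonly cited closed-form baseline for the steady-state inflow per unit length of a circular tunnel below a water table (a Goodman-type solution) is shown below; it is given here only as a reference point for the coupled HM-DEM estimates discussed in the abstract, with k the hydraulic conductivity, h the head above the tunnel axis, and r the tunnel radius:

```latex
Q_0 \approx \frac{2\pi k h}{\ln\!\left(2h/r\right)}
```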

Keywords: distinct element method, fluid flow, hydro-mechanical coupling, jointed rock mass, underground excavations

Procedia PDF Downloads 165
927 Pseudo Modal Operating Deflection Shape Based Estimation Technique of Mode Shape Using Time History Modal Assurance Criterion

Authors: Doyoung Kim, Hyo Seon Park

Abstract:

Studies of System Identification (SI) based on Structural Health Monitoring (SHM) have been actively conducted for structural safety. Recently, SI techniques have developed rapidly within the output-only SI paradigm for estimating modal parameters. Output-only SI methods such as Frequency Domain Decomposition (FDD) and Stochastic Subspace Identification (SSI) rely on algorithms based on orthogonal decomposition, such as singular value decomposition (SVD). However, the SVD entails a high computational cost for estimating modal parameters. This paper proposes a technique to estimate mode shapes at a lower computational cost. The technique extracts pseudo modal Operating Deflection Shapes (ODS) through band-pass filtering and introduces a time history Modal Assurance Criterion (MAC). Finally, mode shapes can be estimated from the pseudo modal ODS and the time history MAC. Analytical simulations of vibration measurements were performed, and the resulting mode shapes and computation times were compared between a representative SI method and the proposed method.
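
As a reading aid, the sketch below (not the authors' code; the signal, frequencies, and sensor layout are hypothetical) shows the two standard ingredients named in the abstract: band-pass filtering a multi-channel response to obtain a pseudo modal ODS, and the MAC between two mode-shape vectors.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(x, f_low, f_high, fs, order=4):
    """Zero-phase band-pass filter used to isolate a pseudo modal ODS."""
    b, a = butter(order, [f_low, f_high], btype="bandpass", fs=fs)
    return filtfilt(b, a, x, axis=0)

def mac(phi_1, phi_2):
    """Modal Assurance Criterion between two mode-shape vectors (0..1)."""
    num = np.abs(phi_1 @ phi_2) ** 2
    den = (phi_1 @ phi_1) * (phi_2 @ phi_2)
    return num / den

# Hypothetical multi-channel acceleration history (n_samples x n_sensors)
fs = 256.0
t = np.arange(0, 10, 1 / fs)
acc = np.sin(2 * np.pi * 3.0 * t)[:, None] * np.array([0.3, 0.7, 1.0])
acc += 0.05 * np.random.randn(len(t), 3)

ods = bandpass(acc, 2.5, 3.5, fs)              # pseudo modal ODS near 3 Hz
phi_est = ods[np.argmax(np.abs(ods[:, -1]))]   # deflection snapshot at a response peak
phi_ref = np.array([0.3, 0.7, 1.0])            # reference mode shape
print("MAC =", mac(phi_est, phi_ref))
```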

Keywords: modal assurance criterion, mode shape, operating deflection shape, system identification

Procedia PDF Downloads 407
926 Artificial Neural Network Based Approach for Estimation of Individual Vehicle Speed under Mixed Traffic Condition

Authors: Subhadip Biswas, Shivendra Maurya, Satish Chandra, Indrajit Ghosh

Abstract:

Developing a speed model is a challenging task, particularly under mixed traffic conditions, where the traffic composition plays a significant role in determining vehicular speed. The present research models individual vehicular speed in the context of mixed traffic on an urban arterial. Traffic speed and volume data were collected from three midblock arterial road sections in New Delhi. Using the field data, a volume-based speed prediction model was developed adopting the Artificial Neural Network (ANN) methodology. The model developed in this work is capable of estimating speed for each individual vehicle category. Validation results show a great deal of agreement between the observed speeds and the values predicted by the model. It was also observed that the ANN-based model is more accurate than other existing models. Finally, a sensitivity analysis was performed with the model to examine the effects of traffic volume and its composition on individual speeds.
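
A minimal sketch of a volume-based ANN speed model is given below, assuming a table of classified counts per vehicle category; the feature names, synthetic data and network size are hypothetical, not taken from the study.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Assumed categories: cars, two-wheelers, auto-rickshaws, heavy vehicles (veh/h)
X = rng.uniform([100, 50, 20, 10], [1500, 600, 300, 150], size=(n, 4))
# Synthetic "observed" car speed (km/h), decreasing with volume
y = 55 - 0.015 * X[:, 0] - 0.01 * X[:, 3] + rng.normal(0, 2, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0))
model.fit(X_tr, y_tr)
print("R^2 on held-out data:", model.score(X_te, y_te))
```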

Keywords: speed model, artificial neural network, arterial, mixed traffic

Procedia PDF Downloads 388
925 Statistical Inferences for GQARCH-Itô-Jumps Model Based on the Realized Range Volatility

Authors: Fu Jinyu, Lin Jinguan

Abstract:

This paper introduces a novel approach that unifies two types of models: the continuous-time jump-diffusion used to model high-frequency data and the discrete-time GQARCH employed to model low-frequency financial data, by embedding the discrete GQARCH structure with jumps in the instantaneous volatility process. This model is named the “GQARCH-Itô-Jumps model.” We adopt the realized range-based threshold estimation for high-frequency financial data rather than realized return-based volatility estimators, which entail the loss of intra-day information on the price movement. Meanwhile, a quasi-likelihood function for the low-frequency GQARCH structure with jumps is developed for the parametric estimation. The asymptotic theory is established for the proposed estimators in the case of finite-activity jumps. Moreover, simulation studies are implemented to check the finite-sample performance of the proposed methodology. Finally, it is demonstrated how the proposed approach can be applied in practice to financial data.
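
As background, a generic realized range-based daily variance estimator of the kind referred to above sums the squared intraday log high-low ranges with the Parkinson scaling factor; the sketch below illustrates it on made-up 5-minute data (the paper's threshold version, which additionally filters jumps, is not reproduced here).

```python
import numpy as np

def realized_range_variance(high, low):
    """high, low: arrays of intraday interval highs/lows for one trading day."""
    log_range = np.log(np.asarray(high)) - np.log(np.asarray(low))
    # Parkinson scaling: E[(ln H - ln L)^2] = 4 ln(2) * sigma^2 for Brownian motion
    return np.sum(log_range ** 2) / (4.0 * np.log(2.0))

# Hypothetical 5-minute highs/lows
high = np.array([100.4, 100.9, 101.2, 100.8])
low = np.array([99.8, 100.2, 100.6, 100.1])
print("Realized range variance:", realized_range_variance(high, low))
```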

Keywords: Itô process, GQARCH, leverage effects, threshold, realized range-based volatility estimator, quasi-maximum likelihood estimate

Procedia PDF Downloads 154
924 Absorbed Dose Estimation of 68Ga-EDTMP in Human Organs

Authors: S. Zolghadri, H. Yousefnia, A. R. Jalilian

Abstract:

Bone metastases are observed in a wide range of cancers and lead to intolerable pain. Since early detection can help physicians decide on the type of treatment, various phosphonate-based radiopharmaceuticals such as ⁶⁸Ga-EDTMP have been developed. In this work, owing to the importance of absorbed dose, the human absorbed dose of this new agent was calculated for the first time based on biodistribution data in wild-type rats. ⁶⁸Ga was obtained from a ⁶⁸Ge/⁶⁸Ga generator with radionuclidic and radiochemical purity higher than 99%. The radiolabeled complex was prepared under optimized conditions, and its radiochemical purity was checked by the instant thin layer chromatography (ITLC) method using Whatman No. 2 paper and saline; the results indicated a radiochemical purity higher than 99%. The radiolabeled complex was injected into the wild-type rats and its biodistribution was studied up to 120 min. As expected, the major accumulation was observed in bone. The absorbed dose of each human organ was calculated from the rat biodistribution using the RADAR method. Bone surface and bone marrow, with 0.112 and 0.053 mSv/MBq, respectively, received the highest absorbed dose. According to these results, the radiolabeled complex is a suitable and safe option for PET bone imaging.
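
For context, the RADAR dose schema referred to in the abstract computes the dose to a target organ T from the time-integrated activity (number of disintegrations) Ñ_S in each source organ S, here extrapolated from the rat biodistribution, multiplied by a precomputed dose factor DF(T←S); the generic form is:

```latex
D_{T} = \sum_{S} \tilde{N}_{S}\; \mathrm{DF}(T \leftarrow S)
```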

Keywords: absorbed dose, EDTMP, ⁶⁸Ga, rats

Procedia PDF Downloads 193
923 Introduction to Various Innovative Techniques Suggested for Seismic Hazard Assessment

Authors: Deepshikha Shukla, C. H. Solanki, Mayank K. Desai

Abstract:

Amongst all natural hazards, earthquakes have the potential to cause the greatest damage. Since earthquake forces are random in nature and unpredictable, quantification of the hazard is essential for its assessment. The time and place of a future earthquake are both uncertain. Since earthquakes can neither be prevented nor predicted, engineers have to design and construct in such a way that the damage to life and property is minimized. Seismic hazard analysis plays an important role in the earthquake-resistant design of structures by providing rational values of the input parameters. In this paper, both mathematical and computational methods adopted by researchers globally over the past five years are discussed. Mathematical approaches involving the concepts of Poisson's ratio, convex set theory, the empirical Green's function, Bayesian probability estimation applied to seismic hazard, and the FOSM (first-order second-moment) algorithm are discussed. Computational approaches and the numerical model SSIFiBo, developed in MATLAB to study dynamic soil-structure interaction problems, are also discussed, together with GIS-based tools, which are predominantly used in seismic hazard assessment.
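
As a point of reference (standard background, not a result of the paper), probabilistic seismic hazard analysis commonly assumes a Poisson occurrence model, under which the probability of at least one exceedance of a ground-motion level with annual exceedance rate λ during an exposure time t is:

```latex
P(\text{at least one exceedance in } t) = 1 - e^{-\lambda t}
```

For example, a 10% probability of exceedance in 50 years corresponds to λ ≈ 0.0021 per year, i.e. a return period of roughly 475 years.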

Keywords: computational methods, MATLAB, seismic hazard, seismic measurements

Procedia PDF Downloads 339
922 On Estimating the Low Income Proportion with Several Auxiliary Variables

Authors: Juan F. Muñoz-Rosas, Rosa M. García-Fernández, Encarnación Álvarez-Verdejo, Pablo J. Moya-Fernández

Abstract:

Poverty measurement is a very important topic in many studies in the social sciences. One of the most important indicators when measuring poverty is the low income proportion, which gives the proportion of people in a population classified as poor. This indicator is generally unknown and is therefore estimated using survey data obtained from official surveys carried out by statistical agencies such as Eurostat. The main feature of such survey data is that they contain several variables. The variable used to estimate the low income proportion is called the variable of interest. The survey data may also contain additional variables, known as auxiliary variables, related to the variable of interest; when available, they can be used to improve the estimation of the low income proportion. In this paper, we use Monte Carlo simulation studies to analyze numerically the performance of estimators based on several auxiliary variables. In the simulation study, we considered real data sets obtained from the 2011 European Union Statistics on Income and Living Conditions survey. Results derived from this study indicate that the estimators based on auxiliary variables are more accurate than the naive estimator.
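
To make the terminology concrete, the toy sketch below (simulated data and weights, not the survey files, and not the authors' estimators) contrasts a weighted naive estimator of the low income proportion with a simple difference-type estimator that uses one auxiliary variable whose population proportion is assumed known.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
w = rng.uniform(50, 150, n)                 # survey weights
x = rng.normal(30, 8, n)                    # auxiliary variable (e.g. registered income)
y = 0.8 * x + rng.normal(0, 5, n)           # income (variable of interest)

def weighted_quantile(v, q, w):
    order = np.argsort(v)
    cw = np.cumsum(w[order])
    return v[order][np.searchsorted(cw, q * cw[-1])]

z = 0.6 * weighted_quantile(y, 0.5, w)      # poverty line: 60% of median income
p_naive = np.sum(w * (y < z)) / np.sum(w)   # naive (Hajek-type) low income proportion

# Difference estimator: adjust with an auxiliary "poor" indicator whose
# population proportion is assumed known (hypothetical value here).
x_poor = (x < 0.6 * weighted_quantile(x, 0.5, w)).astype(float)
P_x_known = 0.20
p_x_hat = np.sum(w * x_poor) / np.sum(w)
p_diff = p_naive + (P_x_known - p_x_hat)

print(f"naive: {p_naive:.3f}, difference estimator: {p_diff:.3f}")
```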

Keywords: inclusion probability, poverty, poverty line, survey sampling

Procedia PDF Downloads 456
921 Variation in N₂ Fixation and N Contribution by 30 Groundnut (Arachis hypogaea L.) Varieties Grown in Blesbokfontein Mpumalanga Province, South Africa

Authors: Titus Y. Ngmenzuma, Cherian Mathews, Felix D. Dakora

Abstract:

In Africa, poor nutrient availability, particularly of N and P, coupled with low soil moisture due to erratic rainfall, constitutes the major crop production constraint. Although inorganic fertilizers are an option for meeting crop nutrient requirements for increased grain yield, their high cost and scarcity make them inaccessible to resource-poor farmers in Africa. Because crops grown on such nutrient-poor soils are micronutrient deficient, incorporating N₂-fixing legumes into cropping systems can sustainably improve crop yield and nutrient accumulation in the grain. In Africa, groundnut can easily form an effective symbiosis with native soil rhizobia, leading to a marked N contribution in cropping systems. In this study, field experiments were conducted at Blesbokfontein in Mpumalanga Province to assess N₂ fixation and N contribution by 30 groundnut varieties during the 2018/2019 planting season using the ¹⁵N natural abundance technique. The results revealed marked differences in shoot dry matter yield, symbiotic N contribution, soil N uptake and grain yield among the groundnut varieties. The percent N derived from fixation ranged from 37 to 44% for varieties ICGV131051 and ICGV13984. The amount of N-fixed ranged from 21 to 58 kg/ha for varieties Chinese and IS-07273, soil N uptake from 31 to 80 kg/ha for varieties IS-07947 and IS-07273, and grain yield from 193 to 393 kg/ha for varieties ICGV15033 and ICGV131096, respectively. Compared to earlier studies on groundnut in South Africa, this study has shown low N₂ fixation and N contribution to the cropping system, possibly due to environmental factors such as low soil moisture. Because the groundnut varieties differed in their growth, symbiotic performance and grain yield, more field testing is required over a range of differing agro-ecologies to identify genotypes suitable for different cropping environments.
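
For reference, the ¹⁵N natural abundance technique mentioned above is usually evaluated with the standard expression below, where δ¹⁵N_ref is the isotopic signature of a non-fixing reference plant, δ¹⁵N_legume that of the groundnut, and B the δ¹⁵N of groundnut grown with N₂ fixation as its sole N source; the amount of N-fixed then follows by multiplying %Ndfa by the shoot N content:

```latex
\%\mathrm{Ndfa} = 100 \times \frac{\delta^{15}N_{\mathrm{ref}} - \delta^{15}N_{\mathrm{legume}}}{\delta^{15}N_{\mathrm{ref}} - B}
```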

Keywords: ¹⁵N natural abundance, percent N derived from fixation, amount of N-fixed, grain yield

Procedia PDF Downloads 186
920 Chemical Characterization and Antioxidant Capacity of Flour From Two Soya Bean Cultivars (Glycine Max)

Authors: Meziani Samira, Menadi Noreddine, Labga Lahouaria, Chenni Fatima Zohra, Toumi Asma

Abstract:

A comparative study between two varieties of soya beans was carried out in this work. The method consisted of preparing a by-product (flour) from two soybean varieties, including a Chinese variety imported and marketed in Algeria. The chemical composition in terms of ash, protein and fat was determined. The minerals, namely potassium and sodium, were measured by flame spectrophotometry. In addition, the polyphenol content was estimated and the antioxidant activity of the methanol extracts of the flours was evaluated by the Ferric Reducing Antioxidant Power (FRAP) assay. The results revealed that the soy flours from the two cultivars contained, on average, 8% moisture, more than 50% protein, 1.58-1.87 g fat, and 0.28-0.30 g ash. A slight difference was found for contents of 489 mg/ml of K⁺ and 20 mg/ml of Na⁺. In addition, the phenolic content of the methanolic extracts was almost 37 mg GAE/g for both soy flour cultivars. The estimated ferric reducing antioxidant power (FRAP) of the soy flour might be related to its richness in polyphenols, which is similar to that of the Chinese variety. The soya flour varieties tested contained a significant amount of protein and phenolic compounds with good antioxidant properties.

Keywords: soya beans, soya flour, protein, total polyphenols

Procedia PDF Downloads 89
919 A Partially Accelerated Life Test Planning with Competing Risks and Linear Degradation Path under Tampered Failure Rate Model

Authors: Fariba Azizi, Firoozeh Haghighi, Viliam Makis

Abstract:

In this paper, we propose a method to model the relationship between failure time and degradation for a simple step-stress test where the underlying degradation path is linear and different causes of failure are possible. It is assumed that the intensity function depends only on the degradation value; no assumptions are made about the distribution of the failure times. A simple step-stress test is used to shorten the failure time of products, and a tampered failure rate (TFR) model is proposed to describe the effect of the changing stress on the intensities. We assume that some of the products that fail during the test have a cause of failure that is only known to belong to a certain subset of all possible causes; this situation is known as masking. In the presence of masking, the maximum likelihood estimates (MLEs) of the model parameters are obtained through an expectation-maximization (EM) algorithm by treating the causes of failure as missing values. The effect of incomplete information on the estimation of the parameters is studied through a Monte Carlo simulation. Finally, a real example is analyzed to illustrate the application of the proposed methods.
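
A toy EM sketch for masked causes is shown below under strong simplifying assumptions (two independent exponential competing risks, complete failure data, no step-stress or degradation structure); it only illustrates the idea of treating masked causes as missing data, not the paper's TFR/degradation model.

```python
import numpy as np

rng = np.random.default_rng(2)
lam_true = np.array([0.02, 0.05])
n = 400
t1 = rng.exponential(1 / lam_true[0], n)
t2 = rng.exponential(1 / lam_true[1], n)
t = np.minimum(t1, t2)                      # observed failure times
cause = np.where(t1 < t2, 0, 1)             # true causes (0 or 1)
masked = rng.random(n) < 0.3                # 30% of causes are masked

lam = np.array([0.01, 0.01])                # initial guess
for _ in range(200):
    # E-step: posterior probability of cause 0 for masked units
    g0 = np.where(masked, lam[0] / lam.sum(), (cause == 0).astype(float))
    # M-step: expected cause counts divided by total time at risk
    lam_new = np.array([g0.sum(), (1 - g0).sum()]) / t.sum()
    if np.allclose(lam_new, lam, rtol=1e-8):
        break
    lam = lam_new

print("true rates:", lam_true, " EM estimates:", np.round(lam, 4))
```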

Keywords: cause of failure, linear degradation path, reliability function, expectation-maximization algorithm, intensity, masked data

Procedia PDF Downloads 329
918 Assessment of Exploitation Vulnerability of Quantum Communication Systems with Phase Encryption

Authors: Vladimir V. Nikulin, Bekmurza H. Aitchanov, Olimzhon A. Baimuratov

Abstract:

Quantum communication technology takes advantage of the intrinsic properties of laser carriers, such as very high data rates and low power requirements, to offer unprecedented data security. Quantum processes at the physical layer of encryption are used for signal encryption with very competitive performance characteristics. The range of applications for QC systems spans from fiber-based to free-space links and from secure banking operations to mobile airborne and space-borne networking, where the links are subjected to channel distortions. Under practical conditions, the channel can alter the optical wave front characteristics, including its phase. In addition, phase noise of the communication source and photo-detection noise alter the signal, bringing additional ambiguity into the measurement process. If quantized values of photons are used to encrypt the signal, exploitation of quantum communication links becomes extremely difficult. In this paper, we present the results of analysis and simulation studies of the effects of noise on phase estimation for quantum systems with different numbers of encryption bases and operating at different power levels.

Keywords: encryption, phase distortion, quantum communication, quantum noise

Procedia PDF Downloads 551
917 Nonparametric Path Analysis with a Truncated Spline Approach in Modeling Waste Management Behavior Patterns

Authors: Adji Achmad Rinaldo Fernandes, Usriatur Rohma

Abstract:

Nonparametric path analysis is a statistical method that does not rely on the assumption that the form of the regression curve is known. The purpose of this study is twofold: first, to determine the best truncated spline nonparametric path function between linear and quadratic polynomial degrees with 1, 2, and 3 knot points; and second, to test the significance of the estimates of the best truncated spline nonparametric path function in a model of the effect of perceived benefits and perceived convenience on the behavior of converting waste into economic value, mediated by the intention to change people's mindset about waste, using the t-test statistic at the jackknife resampling stage. The data used in this study are primary data obtained from research grants. The results show that the best nonparametric truncated spline path model is the quadratic polynomial with 3 knot points. In addition, the significance tests of the best truncated spline nonparametric path estimates using jackknife resampling show that all exogenous variables have a significant influence on the endogenous variables.
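
The sketch below builds a quadratic truncated power spline basis with three knots, the functional form selected as best in the study, and fits it by least squares; knot locations and data are hypothetical.

```python
import numpy as np

def truncated_spline_basis(x, knots, degree=2):
    """Columns: 1, x, ..., x^degree, then (x - k)_+^degree for each knot."""
    x = np.asarray(x, dtype=float)
    cols = [x ** p for p in range(degree + 1)]
    cols += [np.maximum(x - k, 0.0) ** degree for k in knots]
    return np.column_stack(cols)

rng = np.random.default_rng(3)
x = rng.uniform(1, 5, 150)                      # e.g. perceived-benefit scores
y = 0.5 + 0.8 * x - 0.1 * (x - 3) ** 2 + rng.normal(0, 0.3, 150)

B = truncated_spline_basis(x, knots=[2.0, 3.0, 4.0], degree=2)
beta, *_ = np.linalg.lstsq(B, y, rcond=None)    # nonparametric path coefficients
print("estimated coefficients:", np.round(beta, 3))
```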

Keywords: nonparametric path analysis, truncated spline, linear, quadratic, behavior to turn waste into economic value, jackknife resampling

Procedia PDF Downloads 46
916 Numerical Study for the Estimation of Hydrodynamic Current Drag Coefficients for the Colombian Navy Frigates Using Computational Fluid Dynamics

Authors: Mauricio Gracia, Luis Leal, Bharat Verma

Abstract:

Computational fluid dynamics (CFD) has nowadays become an important tool in the hydrodynamic design of modern ships. CFD is used to model any phenomenon related to fluid flow in a control volume, such as a ship or any offshore structure in the sea. In the present study, the current force drag coefficients for a Colombian Navy frigate in deep and shallow water are estimated through the application of CFD. The study describes the process of simulating the ship current drag coefficients, conducted with the STAR-CCM+ software package. A scale model of the Almirante Padilla-class frigate is investigated. The results show the ship current drag coefficient calculated for a current speed of 1 knot at a 90° drift angle for the full-scale ship. The predicted results were compared against the current drag coefficients published in the Lloyd's Register OCIMF report. It is shown that the simulation results agree fairly well with the published values and that the STAR-CCM+ code can predict current drag coefficients.
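
For clarity, a current drag (force) coefficient of this kind is simply a non-dimensionalized current force; in OCIMF-style conventions the lateral coefficient is typically formed with the product of the length between perpendiculars L_pp and the draft T as the reference area (this convention is stated as general background, not taken from the paper):

```latex
C_{c} = \frac{F_{c}}{\tfrac{1}{2}\,\rho\, V_c^{2}\, L_{pp}\, T}
```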

Keywords: CFD, current drag coefficient, STAR-CCM+, OCIMF, bollard pull

Procedia PDF Downloads 171
915 An Integrated Assessment (IA) of Water Resources in the Speightstown Catchment, Barbados Using a GIS-Based Decision Support System

Authors: Anuradha Maharaj, Adrian Cashman

Abstract:

The cross-cutting nature of water as a resource translates into the need for a better understanding of its movement, storage and loss at all points in the hydro-socioeconomic cycle. An integrated approach to addressing the issue of sustainability means quantitatively understanding the linkages within this cycle, the role of water managers in resource allocation, and the critical factors influencing scarcity. The Water Evaluation and Planning (WEAP) tool is an integrative model that combines catchment-scale hydrologic processes with a water management model driven by environmental requirements and socioeconomic demands. The concept of demand priorities is included to represent the areas of greatest use within a given catchment. Located on Barbados' west coast, Speightstown and the surrounding areas encompass a well-developed tourist, residential and agricultural area. The main water resource for this area, and for the rest of the island, is groundwater. The availability of groundwater in Barbados may be adversely affected by projected changes in climate, such as reduced wet season rainfall. Economic development and changing sector priorities, together with climate-related changes, have the potential to affect water resource abundance and, by extension, the allocation of resources in areas such as Speightstown. In order to investigate the potential impacts on the Speightstown area specifically, a WEAP model of the study area was developed to estimate the presently available water (baseline reference scenario 2000-2010). From this baseline scenario, projected changes in availability over the near-term (2035-2045) and medium/long-term (2065-2075) time frames will be explored. The generated estimates can assist water managers in better evaluating the status of and identifying trends in water use, and in formulating adaptation measures to offset future deficits.

Keywords: water evaluation and planning system (WEAP), water availability, demand and supply, water allocation

Procedia PDF Downloads 349
914 End-to-End Pyramid Based Method for Magnetic Resonance Imaging Reconstruction

Authors: Omer Cahana, Ofer Levi, Maya Herman

Abstract:

Magnetic Resonance Imaging (MRI) is a lengthy medical scan, mainly because of its long acquisition time. This length is largely dictated by the traditional sampling theorem, which defines a lower bound on the sampling rate. However, it is still possible to accelerate the scan by using a different approach such as Compressed Sensing (CS) or Parallel Imaging (PI). These two complementary methods can be combined to achieve a faster scan with high-fidelity imaging. To achieve that, two conditions must be satisfied: i) the signal must be sparse under a known transform domain, and ii) the sampling method must be incoherent. In addition, a nonlinear reconstruction algorithm must be applied to recover the signal. While the rapid advances in Deep Learning (DL) have produced tremendous successes in various computer vision tasks, the field of MRI reconstruction is still in its early stages. In this paper, we present an end-to-end method for MRI reconstruction from k-space to image. Our method contains two parts. The first is sensitivity map estimation (SME), a small yet effective network that can easily be extended to a variable number of coils. The second is reconstruction, a top-down architecture with lateral connections developed for building high-level refinement at all scales. Our method achieves state-of-the-art results on the fastMRI benchmark, the largest and most diverse benchmark for MRI reconstruction.
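
As background (a standard formulation, not a result specific to this paper), the combined CS/PI reconstruction that learning-based methods unroll or replace can be written as a regularized inverse problem, with M the k-space sampling mask, F the Fourier transform, S the coil sensitivity maps (here supplied by the SME network), Ψ a sparsifying transform and y the measured multi-coil k-space data:

```latex
\hat{x} = \arg\min_{x}\; \tfrac{1}{2}\,\lVert M F S x - y \rVert_2^2 + \lambda \lVert \Psi x \rVert_1
```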

Keywords: magnetic resonance imaging, image reconstruction, pyramid network, deep learning

Procedia PDF Downloads 90
913 Landslide Susceptibility Mapping Using Soft Computing in Amhara Saint

Authors: Semachew M. Kassa, Africa M. Geremew, Tezera F. Azmatch, Nandyala Darga Kumar

Abstract:

Because landslides can seriously harm both the environment and society, methods such as the frequency ratio (FR) and the analytical hierarchy process (AHP) have been developed from past landslide failure points to produce landslide susceptibility maps. However, it is still difficult to select the most efficient method and to correctly identify the main driving factors for particular regions. In this study, we used fourteen landslide conditioning factors (LCFs) and five soft computing algorithms, including Random Forest (RF), Support Vector Machine (SVM), Logistic Regression (LR), Artificial Neural Network (ANN), and Naïve Bayes (NB), to predict landslide susceptibility at a 12.5 m spatial resolution. According to the classification results based on the inventory landslide points, the performance of the RF (F1-score: 0.88, AUC: 0.94), ANN (F1-score: 0.85, AUC: 0.92), and SVM (F1-score: 0.82, AUC: 0.86) methods was significantly better than that of the LR (F1-score: 0.75, AUC: 0.76) and NB (F1-score: 0.73, AUC: 0.75) methods. The findings also showed that around 35% of the study region consists of areas with high and very high landslide risk (susceptibility greater than 0.5). The very high-risk locations were primarily found in the western and southeastern regions, and all five models showed good agreement and similar geographic distribution patterns in landslide susceptibility. The areas with the highest landslide risk include the western part of Amhara Saint Town, the northern part, and the St. Gebreal Church villages, with mean susceptibility values greater than 0.5. Rainfall, distance to road, and slope were typically among the leading factors for most villages, although the primary contributing factors to landslide vulnerability varied slightly across the five models. Decision-makers and policy planners can use the information from our study to make informed decisions and establish policies. It also suggests that different places should take different safeguards to reduce or prevent serious damage from landslide events.
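
To illustrate the workflow in the simplest terms, the sketch below trains one of the five classifiers (random forest) on a hypothetical table of conditioning factors and reports the same metrics quoted above (F1 and AUC); it is a simplified stand-in, not the study's GIS-based pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score, roc_auc_score

rng = np.random.default_rng(4)
n = 1000
X = rng.normal(size=(n, 14))                      # 14 landslide conditioning factors (synthetic)
logit = 1.2 * X[:, 0] + 0.8 * X[:, 1] - 0.6 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)  # landslide / non-landslide labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)

prob = rf.predict_proba(X_te)[:, 1]               # susceptibility in [0, 1]
print("F1 :", round(f1_score(y_te, rf.predict(X_te)), 2))
print("AUC:", round(roc_auc_score(y_te, prob), 2))
print("high/very high share:", round(float(np.mean(prob > 0.5)), 2))
```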

Keywords: artificial neural network, logistic regression, landslide susceptibility, naïve Bayes, random forest, support vector machine

Procedia PDF Downloads 79
912 Estimation of the Parameters of Muskingum Methods for the Prediction of the Flood Depth in the Moudjar River Catchment

Authors: Fares Laouacheria, Said Kechida, Moncef Chabi

Abstract:

The objective of the study was hydrological routing modelling for the continuous monitoring of the hydrological situation in the Moudjar river catchment, especially during floods, with the Hydrologic Engineering Center-Hydrologic Modelling System (HEC-HMS). HEC-GeoHMS was used to transfer data from a geographic information system (GIS) to HEC-HMS for delineating and modelling the river catchment in order to estimate the runoff volume, which is used as input to the hydrological routing model. Two hydrological routing models were used, namely the Muskingum and Muskingum-Cunge routing models. A comparison between the parameters of the Muskingum and Muskingum-Cunge routing models in HEC-HMS was used for modelling flood routing in the Moudjar river catchment and for determining the relationship between these parameters and the physical characteristics of the river. The results indicate that the effects of input parameters such as the weighting factor "X" and the travel time "K" on the output are significant, and that the Muskingum routing model was more sensitive to the input parameters than the Muskingum-Cunge routing model. This study can contribute to understanding and improving knowledge of the mechanisms of river floods, especially in ungauged river catchments.
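
For readers unfamiliar with the parameters K and X, the sketch below implements classical Muskingum channel routing (storage S = K[XI + (1-X)O]), the simpler of the two models compared in HEC-HMS; the values of K, X, the time step and the inflow hydrograph are illustrative only.

```python
import numpy as np

def muskingum_route(inflow, K, X, dt):
    """Route an inflow hydrograph; K and dt in hours, inflow in m^3/s."""
    D = 2 * K * (1 - X) + dt
    c0 = (dt - 2 * K * X) / D
    c1 = (dt + 2 * K * X) / D
    c2 = (2 * K * (1 - X) - dt) / D
    outflow = np.empty_like(inflow, dtype=float)
    outflow[0] = inflow[0]                       # assume an initial steady state
    for i in range(1, len(inflow)):
        outflow[i] = c0 * inflow[i] + c1 * inflow[i - 1] + c2 * outflow[i - 1]
    return outflow

inflow = np.array([10, 30, 80, 140, 110, 70, 40, 20, 12, 10], dtype=float)
print(np.round(muskingum_route(inflow, K=6.0, X=0.2, dt=2.0), 1))
```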

Keywords: HEC-HMS, hydrological modelling, Muskingum routing model, Muskingum-Cunge routing model

Procedia PDF Downloads 275
911 Risk Based Building Information Modeling (BIM) for Urban Infrastructure Transportation Project

Authors: Debasis Sarkar

Abstract:

Building Information Modeling (BIM) is a holistic documentation process for operational visualization, design coordination, estimation and project scheduling. BIM software defines objects parametrically and is a tool for virtual reality. The primary advantage of implementing BIM is the visual coordination of the building structure and systems such as Mechanical, Electrical and Plumbing (MEP), and it also identifies possible conflicts between the building systems. This paper is an attempt to develop a risk-based BIM model which highlights the primary advantages of applying BIM to urban infrastructure transportation projects. It has been observed that about 40% of Architecture, Engineering and Construction (AEC) companies use BIM, but primarily for their outsourced projects. Also, 65% of the respondents agree that BIM would be used quite strongly for future construction projects in India. The 3D models developed with Revit 2015 software would reduce coordination problems amongst the architects, structural engineers, contractors and building service providers (MEP). Integration of risk management with BIM would provide enhanced coordination, collaboration and a high probability of successful completion of complex infrastructure transportation projects within the stipulated time and cost frame.

Keywords: building information modeling (BIM), infrastructure transportation, project risk management, underground metro rail

Procedia PDF Downloads 309
910 Model Based Fault Diagnostic Approach for Limit Switches

Authors: Zafar Mahmood, Surayya Naz, Nazir Shah Khattak

Abstract:

The degree of freedom relates to our capability to observe or model the energy paths within a system. The higher the number of energy paths being modeled, the higher the degree of freedom, but also the greater the time and modeling complexity, which conflicts with today's need for minimum time to market. Since the number of residuals that can be uniquely isolated depends on the number of independent outputs of the system, the number of sensors required increases. Examples of discrete position sensors that may be used to form an array include limit switches, Hall effect sensors, optical sensors, magnetic sensors, etc. Their mechanical design can usually be tailored to fit in the transitional path of a Single Throw Mechanical Equipment (STME) in a variety of mechanical configurations. Case studies of multi-sensor systems were carried out, and actual sensor data were used to test this generic framework. It is investigated how proper modeling of limit switches as timing sensors could lead to a unified and neutral residual space while keeping the implementation cost reasonably low.

Keywords: low-cost limit sensors, fault diagnostics, Single Throw Mechanical Equipment (STME), parameter estimation, parity-space

Procedia PDF Downloads 616
909 Digital Material Characterization Using the Quantum Fourier Transform

Authors: Felix Givois, Nicolas R. Gauger, Matthias Kabel

Abstract:

Efficient digital material characterization is of great interest in many fields of application. It consists of the following three steps. First, a 3D reconstruction of 2D scans must be performed. Then, the resulting gray-value image of the material sample is enhanced by image processing methods. Finally, partial differential equations (PDE) are solved on the segmented image, and by averaging the resulting solution fields, effective properties like stiffness or conductivity can be computed. Due to the high resolution of current CT images, the latter step is typically performed with matrix-free solvers. Among them, a solver that uses the explicit formula of the Green-Eshelby operator in Fourier space has been proposed by Moulinec and Suquet. Its algorithmically most complex part is the Fast Fourier Transform (FFT). In our talk, we will discuss the potential quantum advantage that can be obtained by replacing the FFT with the Quantum Fourier Transform (QFT). We will especially show that the data transfer for noisy intermediate-scale quantum (NISQ) devices can be improved by using appropriate boundary conditions for the PDE, which also allows using semi-classical versions of the QFT. In the end, we will compare the results of the QFT-based algorithm for simple geometries with the results of the FFT-based homogenization method.
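
As a side note (an illustration, not part of the talk): on n qubits the QFT acts on the 2^n amplitudes of a state exactly as a unitary discrete Fourier transform, so for small sizes its action can be checked classically against a rescaled numpy inverse FFT.

```python
import numpy as np

n_qubits = 3
N = 2 ** n_qubits
omega = np.exp(2j * np.pi / N)
# QFT matrix: QFT[j, k] = omega^(j*k) / sqrt(N)
QFT = np.array([[omega ** (j * k) for k in range(N)] for j in range(N)]) / np.sqrt(N)

state = np.random.default_rng(5).normal(size=N) + 0j
state /= np.linalg.norm(state)                    # normalized amplitudes

print(np.allclose(QFT @ state, np.sqrt(N) * np.fft.ifft(state)))  # True
print(np.allclose(QFT.conj().T @ QFT, np.eye(N)))                 # unitary
```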

Keywords: maximum likelihood amplitude estimation (MLQAE), numerical homogenization, quantum Fourier transformation (QFT), NISQ devices

Procedia PDF Downloads 75
908 Evaluation of Flow Alteration under Climate Change Scenarios for Disaster Risk Management in Lower Mekong Basin: A Case Study in Prek Thnot River in Cambodia

Authors: Vathanachannbo Veth, Ilan Ich, Sophea Rom Phy, Ty Sok, Layheang Song, Sophal Try, Chantha Oeurng

Abstract:

Climate change is one of the major global challenges, inducing disaster risks and threatening livelihoods and communities through adverse impacts on food and water security, ecosystems, and services. The Prek Thnot River Basin of Cambodia, one of the largest tributaries of the Lower Mekong, has been exposed to hazards and disasters, particularly floods, which are attributed to the effects of climate change. Therefore, an assessment of precipitation and streamflow changes under climate change was carried out in this river basin using the Soil and Water Assessment Tool (SWAT) model and different flow indices under a baseline (1997 to 2011) and climate change scenarios (RCP2.6 and RCP8.5 with three General Circulation Models (GCMs): GFDL, GISS, and IPSL) for two time horizons: the near future (2030s: 2021 to 2040) and the medium future (2060s: 2051 to 2070). Both intensity and frequency indices, compared with the historical extreme rainfall indices, change significantly under GFDL RCP8.5 for both the 2030s and the 2060s. The average rate of change of Rx1day, Rx10day, SDII, and R20mm in the 2030s and 2060s under both RCP2.6 and RCP8.5 was found to increase under GFDL and to decrease under both GISS and IPSL. The mean percentage change of the flow analyzed with the IHA tool (Group 1) indicated that the flow in the Prek Thnot River increases under GFDL for both RCP2.6 and RCP8.5 in both the 2030s and the 2060s, whereas under GISS the flow decreases. Under IPSL, the flow increases in five months (January, February, October, November, and December) and decreases in the other seven months. This study provides water resources managers and policymakers with a wide range of precipitation and water flow projections within the Prek Thnot River Basin in the context of plausible climate change scenarios.
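
The four extreme-rainfall indices named above have standard definitions (ETCCDI-style, with a 1 mm wet-day threshold assumed here); the sketch below computes them from a simulated daily precipitation series, not from the basin's data.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(6)
days = pd.date_range("2031-01-01", "2031-12-31", freq="D")
p = pd.Series(rng.gamma(shape=0.4, scale=12.0, size=len(days)), index=days)  # mm/day

rx1day = p.max()                                   # max 1-day precipitation
rx10day = p.rolling(10).sum().max()                # max 10-day precipitation
wet = p[p >= 1.0]                                  # wet days (>= 1 mm)
sdii = wet.sum() / wet.size                        # simple daily intensity index
r20mm = int((p >= 20.0).sum())                     # very heavy precipitation days

print(f"Rx1day={rx1day:.1f} mm, Rx10day={rx10day:.1f} mm, "
      f"SDII={sdii:.1f} mm/day, R20mm={r20mm} days")
```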

Keywords: IHA, climate change, disaster risk, Prek Thnot River Basin, Cambodia

Procedia PDF Downloads 100
907 The Effectiveness of Environmental Policy Instruments for Promoting Renewable Energy Consumption: Command-and-Control Policies versus Market-Based Policies

Authors: Mahmoud Hassan

Abstract:

Understanding the impact of market- and non-market-based environmental policy instruments on renewable energy consumption (REC) is crucial for the design and choice of policy packages. This study aims to empirically investigate the effect of the environmental policy stringency index (EPS) and its components on REC in 27 OECD countries over the period from 1990 to 2015, and then to use the results to identify what an appropriate environmental policy mix should look like. Relying on the two-step system GMM estimator, we provide evidence that increasing environmental policy stringency as a whole promotes renewable energy consumption in these 27 developed economies. Moreover, policymakers are able, through market- and non-market-based environmental policy instruments, to increase the use of renewable energy. However, not all of these instruments are effective for achieving this goal. The results indicate that R&D subsidies and trading schemes have a positive and significant impact on REC, while taxes, feed-in tariffs and emission standards do not have a significant effect. Furthermore, R&D subsidies are more effective than trading schemes for stimulating the use of clean energy. These findings proved to be robust across the three alternative panel techniques used.
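
For readers unfamiliar with the estimator, a dynamic panel specification of the kind typically estimated by two-step system GMM can be written as below (an illustrative form, not the paper's exact equation), where X_it collects control variables, μ_i is a country effect, and lagged levels and first differences of the regressors serve as instruments:

```latex
REC_{it} = \alpha\, REC_{i,t-1} + \beta\, EPS_{it} + \gamma' X_{it} + \mu_i + \varepsilon_{it}
```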

Keywords: environmental policy stringency, renewable energy consumption, two-step system-GMM estimation, linear dynamic panel data model

Procedia PDF Downloads 179
906 F-VarNet: Fast Variational Network for MRI Reconstruction

Authors: Omer Cahana, Maya Herman, Ofer Levi

Abstract:

Magnetic resonance imaging (MRI) is a long medical scan, mainly because of its long acquisition time. This length is largely dictated by the traditional sampling theorem, which defines a lower bound on the sampling rate. However, it is still possible to accelerate the scan by using a different approach, such as compressed sensing (CS) or parallel imaging (PI). These two complementary methods can be combined to achieve a faster scan with high-fidelity imaging. In order to achieve that, two properties have to hold: i) the signal must be sparse under a known transform domain, and ii) the sampling method must be incoherent. In addition, a nonlinear reconstruction algorithm needs to be applied to recover the signal. While the deep learning (DL) field has advanced rapidly and demonstrated tremendous success in various computer vision tasks, the field of MRI reconstruction is still at an early stage. In this paper, we present an extension of the state-of-the-art model in MRI reconstruction, VarNet. We extend VarNet with dilated convolutions at different scales, which enlarges the receptive field to capture more contextual information. Moreover, we simplify the sensitivity map estimation (SME), since it contains many layers that are unnecessary for this task. These improvements yield significant decreases in computation cost as well as higher accuracy.
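
As a rough sketch of the dilation idea (an assumption-based illustration, not the F-VarNet code), the block below applies parallel 3x3 convolutions with different dilation rates and fuses them, which widens the receptive field at several scales.

```python
import torch
import torch.nn as nn

class MultiScaleDilatedBlock(nn.Module):
    def __init__(self, in_ch=2, out_ch=32, dilations=(1, 2, 4)):
        super().__init__()
        # One branch per dilation rate; padding = dilation keeps the spatial size
        self.branches = nn.ModuleList(
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=d, dilation=d)
            for d in dilations
        )
        self.fuse = nn.Conv2d(out_ch * len(dilations), out_ch, kernel_size=1)

    def forward(self, x):
        feats = torch.cat([torch.relu(b(x)) for b in self.branches], dim=1)
        return self.fuse(feats)

x = torch.randn(1, 2, 128, 128)              # e.g. real/imaginary channels of an image patch
print(MultiScaleDilatedBlock()(x).shape)     # torch.Size([1, 32, 128, 128])
```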

Keywords: MRI, deep learning, variational network, computer vision, compress sensing

Procedia PDF Downloads 159
905 Comparison of Different Reanalysis Products for Predicting Extreme Precipitation in the Southern Coast of the Caspian Sea

Authors: Parvin Ghafarian, Mohammadreza Mohammadpur Panchah, Mehri Fallahi

Abstract:

Synoptic patterns from the surface up to the tropopause are very important for forecasting the weather and atmospheric conditions, and there are many tools to prepare and analyze these maps. Reanalysis data, the outputs of numerical weather prediction models, satellite images, meteorological radar, and weather station data are used in forecasting centers around the world to predict the weather. Forecasting extreme precipitation on the southern coast of the Caspian Sea (CS) is the main issue due to the complex topography, and there are different types of climate in these areas. In this research, we used two reanalysis datasets, the ECMWF Reanalysis 5th Generation (ERA5) and the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis, for verification of the numerical model. ERA5 is the latest version of the ECMWF reanalysis; its temporal resolution is hourly, while that of NCEP/NCAR is six-hourly. Atmospheric parameters such as mean sea level pressure, geopotential height, relative humidity, wind speed and direction, and sea surface temperature were selected and analyzed, and different types of precipitation (rain and snow) were considered. The results showed that NCEP/NCAR better captures the intensity of the atmospheric systems, whereas ERA5 is suitable for extracting parameter values at specific points and is appropriate for analyzing snowfall events over the CS (snow cover and snow depth). Sea surface temperature plays the main role in generating instability over the CS, especially when cold air passes over it; however, the sea surface temperature in the NCEP/NCAR product has low resolution near the coast. Both datasets were able to detect the meteorological synoptic patterns that led to heavy rainfall over the CS, but due to their time lag they are not suitable for operational forecast centers; their application lies in research and in the verification of meteorological models. Finally, ERA5 has better resolution than the NCEP/NCAR reanalysis, but NCEP/NCAR data are available from 1948 onward and are appropriate for long-term research.

Keywords: synoptic patterns, heavy precipitation, reanalysis data, snow

Procedia PDF Downloads 122
904 Impact of Environmental Rule of Law towards Positive Environmental Outcomes in Nigeria

Authors: Kate N. Okeke

Abstract:

The ever-growing needs of man have pushed him strongly towards industrialization, which has left, and is still leaving, environmental degradation and its attendant negative impacts in its wake. It is therefore not surprising that the enjoyment of fundamental rights such as food supply, security of lives and property, freedom of worship, health and education has been drastically affected by such degradation. In recognition of the imperative need to protect the environment and human rights, many global instruments and constitutions have recognized the right to a healthy and sustainable environment. Some environmental advocates and quite a number of works on the subject call for the recognition of environmental rights via the rule of law as a vital means of achieving positive outcomes. However, although numerous countries have constitutional environmental provisions, most of them, such as Nigeria, have shown poor environmental performance. A notable problem is that a constitution which recognizes environmental rights may, in its other provisions, contradict itself by making those rights unenforceable. Adopting a descriptive, analytical, comparative and explanatory study design in reviewing a successful positive environmental outcome achieved via the rule of law, this article argues that, on balance, the rule of law weighs more than the mere recognition of environmental rights and should therefore receive more attention from environmental lawyers and advocates. With the rule of law, members of a society can be sure of getting the most out of the environmental rights existing in their legal system. Members of the Niger Delta communities of Nigeria would benefit from the environmental rights existing in Nigeria: they are exposed to environmental degradation and pollution, with effects such as acidic rainfall and the pollution of farmland and clean water sources, as consequences of oil and gas exploration. It would also pave the way for resolving the violence between cattle herdsmen and farmers in the Middle Belt and other regions of Nigeria, whose clashes are over natural resource control. Having seen that the environmental rule of law is vital to sustainable development, this paper aims to contribute to discussions on how best the vehicle of the rule of law can be driven towards achieving positive environmental outcomes, in reliance on other enforceable provisions in the Nigerian Constitution and on domesticated international instruments, in order to attain a sustainable environment and development.

Keywords: environment, rule of law, constitution, sustainability

Procedia PDF Downloads 156
903 Analysis of an Alternative Data Base for the Estimation of Solar Radiation

Authors: Graciela Soares Marcelli, Elison Eduardo Jardim Bierhals, Luciane Teresa Salvi, Claudineia Brazil, Rafael Haag

Abstract:

The sun is a source of renewable energy, and its use as both a source of heat and light is one of the most promising energy alternatives for the future. To design thermal or photovoltaic systems, a solar irradiation database is necessary. Brazil still has a limited number of meteorological stations providing such measurements, which makes reanalysis products a significant alternative data source. ERA-Interim is a global atmospheric reanalysis produced by the European Centre for Medium-Range Weather Forecasts (ECMWF). The data assimilation system used for the production of ERA-Interim is based on a 2006 version of the IFS (Cy31r2) and includes a 4-dimensional variational analysis (4D-Var) with a 12-hour analysis window. The spatial resolution of the dataset is approximately 80 km, on 60 vertical levels from the surface up to 0.1 hPa. This work aims to make a comparative analysis between the ERA-Interim data and the data observed in the Solarimetric Atlas of the State of Rio Grande do Sul, to verify its applicability in the absence of an observed data network. The results obtained are analyzed for a study region as an alternative means of assessing the solar energy potential of a given region.

Keywords: energy potential, reanalyses, renewable energy, solar radiation

Procedia PDF Downloads 162
902 Evaluation of Settlement of Coastal Embankments Using Finite Elements Method

Authors: Sina Fadaie, Seyed Abolhassan Naeini

Abstract:

Coastal embankments play an important role in coastal structures by reducing the effect of wave forces and controlling the movement of sediments. Many coastal areas are underlain by weak and compressible soils. Estimation of the settlement of coastal embankments during construction is highly important for the design and safety control of embankments and appurtenant structures. Accordingly, selecting and establishing an appropriate model with a reasonable level of complexity is one of the challenges for engineers. Although there are advanced models in the literature regarding the design of embankments, there is not enough information on the prediction of their associated settlement, particularly in coastal areas with considerable soft soils. Marine engineering studies are important in Iran due to the existence of two major coastal areas located in the northern and southern parts of the country. In the present study, the validity of Terzaghi's consolidation theory has been investigated. In addition, the settlement of these coastal embankments during construction is predicted using dedicated methods in the PLAXIS software, with the help of appropriate boundary conditions and soil layering. The results indicate that, for the existing soil conditions at the site, certain parameters must be considered in the analysis. Consequently, a model is introduced to estimate the settlement of embankments in such geotechnical conditions.
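
For reference, the classical Terzaghi relations being tested here give the primary consolidation settlement of a normally consolidated clay layer and the dimensionless time factor governing how fast it develops (textbook forms, stated as background): s_c is the settlement, C_c the compression index, e_0 the initial void ratio, H the layer thickness, σ'_0 and Δσ the initial effective stress and the stress increment, c_v the coefficient of consolidation and H_dr the drainage path length:

```latex
s_c = \frac{C_c\, H}{1 + e_0}\,\log_{10}\!\left(\frac{\sigma'_0 + \Delta\sigma}{\sigma'_0}\right),
\qquad
T_v = \frac{c_v\, t}{H_{dr}^{2}}
```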

Keywords: consolidation, settlement, coastal embankments, numerical methods, finite elements method

Procedia PDF Downloads 155
901 Modelling the Long Run of Aggregate Import Demand in Libya

Authors: Said Yousif Khairi

Abstract:

For a developing economy such as Libya, imports of capital goods, raw materials and manufactured goods are vital for sustainable economic growth. In 2006, Libya imported LD 8 billion (US$ 6.25 billion) worth of goods, composed mainly of machinery and transport equipment (49.3%), raw materials (18%), and food products and live animals (13%); this represented about 10% of GDP. Thus, it is pertinent to investigate the factors affecting the amount of Libyan imports. An econometric model representing the aggregate import demand for Libya was developed and estimated using the bounds test procedure, which is based on an unrestricted error correction model (UECM). The data employed for the estimation cover the period 1970-2010. The results of the bounds test revealed that the volume of imports and its determinants, namely real income, the consumer price index and the exchange rate, are cointegrated. The findings indicate that, in the short run, the demand for imports is inelastic with respect to income and the price level, while the exchange rate variable is statistically significant. In the long run, import demand is elastic with respect to income, while it remains inelastic with respect to the price level and the exchange rate. This indicates that imports are important elements for Libyan economic growth in the long run.
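
For clarity, the bounds-test UECM behind such an estimation typically takes the form below (an illustrative specification with generic lag orders, not the paper's exact equation), where M is import volume, Y real income, P the consumer price index and E the exchange rate; the bounds F-test then examines H₀: δ₁ = δ₂ = δ₃ = δ₄ = 0 (no level relationship):

```latex
\Delta \ln M_t = \alpha_0
+ \sum_{i=1}^{p} b_i\, \Delta \ln M_{t-i}
+ \sum_{i=0}^{q} c_i\, \Delta \ln Y_{t-i}
+ \sum_{i=0}^{r} d_i\, \Delta \ln P_{t-i}
+ \sum_{i=0}^{s} g_i\, \Delta \ln E_{t-i}
+ \delta_1 \ln M_{t-1} + \delta_2 \ln Y_{t-1} + \delta_3 \ln P_{t-1} + \delta_4 \ln E_{t-1} + \varepsilon_t
```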

Keywords: import demand, UECM, bounds test, Libya

Procedia PDF Downloads 358
900 Economic Evaluation of Offshore Wind Project under Uncertainty and Risk Circumstances

Authors: Sayed Amir Hamzeh Mirkheshti

Abstract:

Offshore wind energy, as a strategic renewable energy source, has been growing rapidly due to its availability, abundance and clean nature. On the other hand, the budget of such projects is considerably higher than that of other renewable energy projects, and they take longer to complete. Accordingly, precise estimation of time and cost is needed in order to raise awareness among developers and society and to convince them to develop this kind of energy despite its difficulties. Risks occurring during a project cause its duration and cost to change constantly. Therefore, to develop offshore wind power, it is critical to consider all potential risks which impact the project and to simulate their effects. Knowing these risks is also useful for selecting the most effective response strategies, such as avoidance, transfer, and acceptance, in order to decrease their probability and impact. This paper presents an evaluation of the feasibility of a 500 MW offshore wind project in the Persian Gulf and examines its situation under uncertain resources and risk. The purpose of this study is to evaluate the time and cost of the offshore wind project under risk circumstances and uncertain resources by using Monte Carlo simulation. We analyzed each risk and activity along with their distribution functions and their effects on the project.
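
To make the simulation idea concrete, the sketch below propagates triangular (minimum, most likely, maximum) estimates of activity durations and costs, plus one probabilistic risk event, into distributions of total duration and cost; all activity names and figures are hypothetical and are not the case-study data.

```python
import numpy as np

rng = np.random.default_rng(7)
n_sims = 10_000

activities = {                      # (min, mode, max) duration in months
    "permitting":   (6, 9, 18),
    "foundations":  (8, 12, 20),
    "turbines":     (6, 10, 16),
    "grid_connect": (4, 6, 12),
}
# Total duration assuming the activities run sequentially
dur = sum(rng.triangular(*tri, n_sims) for tri in activities.values())

base_cost = rng.triangular(1.6e9, 1.9e9, 2.6e9, n_sims)        # EUR, illustrative
risk_hits = rng.random(n_sims) < 0.25                          # e.g. a vessel-delay risk
cost = base_cost + risk_hits * rng.triangular(5e7, 1e8, 2e8, n_sims)

print(f"P80 duration: {np.percentile(dur, 80):.1f} months")
print(f"P80 cost    : {np.percentile(cost, 80) / 1e9:.2f} bn EUR")
```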

Keywords: wind energy project, uncertain resources, risks, Monte Carlo simulation

Procedia PDF Downloads 350
899 Method Validation for Determining Platinum and Palladium in Catalysts Using Inductively Coupled Plasma Optical Emission Spectrometry

Authors: Marin Senila, Oana Cadar, Thorsten Janisch, Patrick Lacroix-Desmazes

Abstract:

The study presents the analytical capability and validation of a method based on microwave-assisted acid digestion for the quantitative determination of platinum and palladium in catalysts using inductively coupled plasma optical emission spectrometry (ICP-OES). In order to validate the method, the main figures of merit, such as the limit of detection, the limit of quantification, precision and accuracy, were considered, and the measurement uncertainty was estimated using the bottom-up approach according to the international guidelines of ISO/IEC 17025. The limits of detection, estimated from the blank signal using the 3s criterion, were 3.0 mg/kg for Pt and 3.6 mg/kg for Pd, while the limits of quantification were 9.0 mg/kg for Pt and 10.8 mg/kg for Pd. Precision, evaluated as the standard deviation of repeatability (n = 5 parallel samples), was less than 10% for both precious metals. The accuracy of the method, verified by recovery estimation using the certified reference material NIST SRM 2557 (pulverized recycled monolith), was 99.4% for Pt and 101% for Pd. The obtained limits of quantification and accuracy were satisfactory for the intended purpose. The paper presents all the steps necessary to validate the method for the determination of Pt and Pd in catalysts using inductively coupled plasma optical emission spectrometry.
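
For illustration only (made-up numbers, not the paper's raw data), the snippet below applies the usual blank-based criteria, LOD = 3·s_blank/slope and LOQ = 10·s_blank/slope, together with a recovery check against a certified reference value.

```python
import numpy as np

blank_signal = np.array([102, 98, 105, 101, 99, 103], dtype=float)  # blank intensities (counts)
slope = 12.5                         # calibration slope, counts per (mg/kg); hypothetical

s_blank = blank_signal.std(ddof=1)
lod = 3 * s_blank / slope            # limit of detection
loq = 10 * s_blank / slope           # limit of quantification

measured_pt, certified_pt = 1250.0, 1258.0      # mg/kg in a CRM (hypothetical values)
recovery = 100 * measured_pt / certified_pt

print(f"LOD = {lod:.1f} mg/kg, LOQ = {loq:.1f} mg/kg, recovery = {recovery:.1f}%")
```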

Keywords: catalyst analysis, ICP-OES, method validation, platinum, palladium

Procedia PDF Downloads 165