Search results for: indicator estimation
2086 Bayesian Inference for High Dimensional Dynamic Spatio-Temporal Models
Authors: Sofia M. Karadimitriou, Kostas Triantafyllopoulos, Timothy Heaton
Abstract:
Reduced dimension Dynamic Spatio-Temporal Models (DSTMs) jointly describe the spatial and temporal evolution of a function observed subject to noise. A basic state space model is adopted for the discrete temporal variation, while a continuous autoregressive structure describes the continuous spatial evolution. Application of such a DSTM relies upon the pre-selection of a suitable reduced set of basis functions, and this can present a challenge in practice. In this talk, we propose an online estimation method for high dimensional spatio-temporal data based upon a DSTM, and we attempt to resolve this issue by allowing the basis to adapt to the observed data. Specifically, we present a wavelet decomposition in order to obtain a parsimonious approximation of the continuous spatial process. This parsimony can be achieved by placing a Laplace prior distribution on the wavelet coefficients. The aim of using the Laplace prior is to filter out wavelet coefficients with low contribution, and thus achieve dimension reduction with significant computational savings. We then propose a Hierarchical Bayesian State Space model, for the estimation of which we offer an appropriate particle filter algorithm. The proposed methodology is illustrated using real environmental data.
Keywords: multidimensional Laplace prior, particle filtering, spatio-temporal modelling, wavelets
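As an illustration of the sparsity such a Laplace prior induces (not the authors' particle filter itself), the MAP estimate of a coefficient under an independent Laplace prior corresponds to soft thresholding. The following minimal sketch, with hypothetical wavelet coefficients and threshold, shows how low-contribution coefficients are filtered out:

```python
import numpy as np

def soft_threshold(coeffs, lam):
    """MAP shrinkage of coefficients under an independent Laplace prior.

    Coefficients whose magnitude falls below lam are set to zero,
    which is where the dimension reduction comes from.
    """
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - lam, 0.0)

# Hypothetical wavelet coefficients of a spatial field at one time step
rng = np.random.default_rng(0)
w = rng.normal(0.0, 1.0, size=256)
w[:16] *= 10.0                      # a few large-scale coefficients dominate

w_sparse = soft_threshold(w, lam=2.0)
print("kept", np.count_nonzero(w_sparse), "of", w.size, "coefficients")
```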
Procedia PDF Downloads 427
2085 Assessment of DNA Degradation Using Comet Assay: A Versatile Technique for Forensic Application
Authors: Ritesh K. Shukla
Abstract:
Degradation of biological samples in terms of macromolecules (DNA, RNA, and protein) is a major challenge in forensic investigation, as it can mislead result interpretation. Currently, there are no precise methods available to circumvent this problem; therefore, at a preliminary level, methods are urgently needed to address this issue. To this end, the Comet assay is one of the most versatile, rapid and sensitive molecular biology techniques for assessing DNA degradation. This technique helps to assess DNA degradation even with very small amounts of sample. Moreover, the method conveniently does not require any additional DNA extraction or isolation step during the degradation assessment. Samples are embedded directly on an agarose pre-coated microscope slide, and electrophoresis is performed on the same slide after the lysis step. After electrophoresis, the slide is stained with a DNA-binding dye and observed under a fluorescence microscope equipped with Komet software. With the help of this technique, the extent of DNA degradation can be assessed, which allows samples to be screened before DNA fingerprinting to determine whether they are appropriate for DNA analysis. This technique not only helps to assess the degradation of DNA; many other challenges in forensic investigation, such as estimating the time since deposition of biological fluids, repairing genetic material from degraded biological samples, and early estimation of time since death, could also be addressed. With the help of this study, an attempt was made to explore the application of a well-known molecular biology technique, the Comet assay, in the field of forensic science. This assay will open new avenues in forensic research and development.
Keywords: comet assay, DNA degradation, forensic, molecular biology
Procedia PDF Downloads 155
2084 Optimizing the Nanoliposome of Nisin Produced by Sonication
Authors: Seyed Moslemi S. A., Hesari J., Valizadeh H., Rezaiee-Mokaram R.
Abstract:
Nanotechnology, nanoscience and related fields will impact daily human life in the not too distant future. The basic materials of liposomes are lipids, which can be obtained from a variety of sources. In this research, lecithin and cholesterol were used to prepare liposomes. A probe sonicator was used to reduce liposome particle size and produce nanoliposomes. Encapsulation efficiency was analyzed with a pyrogallol red indicator and autoanalyzer equipment. The smallest particle size was 220 nm (100 mg lecithin, 50 mg cholesterol, 12 min and amplitude of 90%). The highest encapsulation efficiency was 13.5% (120 mg lecithin, 45 mg cholesterol, 12 min and amplitude of 92%).
Keywords: optimizing, nanoliposome, nisin, cheese
Procedia PDF Downloads 483
2083 Estimation of Normalized Glandular Doses Using a Three-Layer Mammographic Phantom
Authors: Kuan-Jen Lai, Fang-Yi Lin, Shang-Rong Huang, Yun-Zheng Zeng, Po-Chieh Hsu, Jay Wu
Abstract:
The normalized glandular dose (DgN) is used to estimate the energy deposited in the breast during mammography in clinical practice. Monte Carlo simulations frequently use a uniformly mixed phantom for calculating the conversion factor. However, breast tissues are not uniformly distributed, leading to errors in conversion factor estimation. This study constructed a three-layer phantom to estimate the normalized glandular dose more accurately. The MCNP code (Monte Carlo N-Particle code) was used to create the geometric structure. We simulated three types of target/filter combinations (Mo/Mo, Mo/Rh, Rh/Rh), six voltages (25-35 kVp), six HVL parameters and nine breast phantom thicknesses (2-10 cm) for the three-layer mammographic phantom. The conversion factors for 25%, 50% and 75% glandularity were calculated. The error of the conversion factors compared with the results of the American College of Radiology (ACR) was within 6%; for Rh/Rh, the difference was within 9%. The difference between the 50% average glandularity and the uniform phantom was 7.1% to -6.7% for the Mo/Mo combination at a voltage of 27 kVp, a half value layer of 0.34 mmAl, and a breast thickness of 4 cm. According to the simulation results, regression analysis found that the three-layer mammographic phantom at 0%-100% glandularity can be used to accurately calculate the conversion factors. Differences in glandular tissue distribution lead to errors in conversion factor calculation, and the three-layer mammographic phantom can provide accurate estimates of glandular dose in clinical practice.
Keywords: Monte Carlo simulation, mammography, normalized glandular dose, glandularity
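In clinical use, a conversion factor of this kind is typically applied by multiplying the measured entrance air kerma by a DgN value interpolated for the patient's glandularity. The sketch below illustrates that step only; the table values are hypothetical, not the ones computed in the study:

```python
import numpy as np

# Hypothetical DgN conversion factors (mGy per mGy entrance air kerma) at three
# glandularities, for one target/filter, kVp, HVL and breast-thickness combination.
glandularity = np.array([25.0, 50.0, 75.0])      # percent
dgn_table    = np.array([0.190, 0.170, 0.155])   # illustrative values only

def mean_glandular_dose(entrance_air_kerma_mgy, patient_glandularity):
    """MGD = entrance air kerma x DgN, with DgN linearly interpolated."""
    dgn = np.interp(patient_glandularity, glandularity, dgn_table)
    return entrance_air_kerma_mgy * dgn

print(round(mean_glandular_dose(8.0, 40.0), 3), "mGy")
```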
Procedia PDF Downloads 189
2082 Incorporation of Safety into Design by Safety Cube
Authors: Mohammad Rajabalinejad
Abstract:
Safety is often treated as a requirement or a performance indicator throughout the design process, and this does not always result in optimally safe products or systems. This paper suggests integrating the best safety practices with the design process to enrich the exploration experience for designers and add extra value for customers. For this purpose, the commonly practiced safety standards and design methods have been reviewed and their common blocks merged to form the Safety Cube. The Safety Cube combines common blocks for design, hazard identification, risk assessment and risk reduction through an integral approach. An example application presents the use of the Safety Cube for the design of machinery.
Keywords: safety, safety cube, product, system, machinery, design
Procedia PDF Downloads 246
2081 Earnings vs Cash Flows: The Valuation Perspective
Authors: Megha Agarwal
Abstract:
This research paper compares earnings-based and cash-flow-based methods of enterprise valuation. The theoretically equivalent methods based on earnings, such as the Residual Earnings Model (REM), the Abnormal Earnings Growth Model (AEGM), the Residual Operating Income Method (ReOIM), the Abnormal Operating Income Growth Model (AOIGM) and its extensions, and multipliers such as the price/earnings and price/book value ratios, or cash-flow-based models such as the Dividend Valuation Method (DVM) and the Free Cash Flow Method (FCFM), all provide different valuation estimates for the Indian corporate giant Reliance India Limited (RIL). An ex-post analysis of published accounting and financial data for four financial years, from 2008-09 to 2011-12, has been conducted. A comparison of these valuation estimates with the actual market capitalization of the company shows that the complex accounting-based model AOIGM provides the closest forecasts. These differing estimates may arise from inconsistencies in discount rates, growth rates and the other forecasted variables. Although inputs for earnings-based models may be available to investors and analysts through published statements, precise estimation of free cash flows may be better undertaken by internal management. Estimating value from more stable parameters such as residual operating income and RNOA could be considered superior to valuations based on the more volatile return on equity.
Keywords: earnings, cash flows, valuation, Residual Earnings Model (REM)
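The residual earnings family of models values equity as current book value plus the present value of expected earnings in excess of the equity charge. The following sketch shows that logic only; all figures are hypothetical and are not RIL's data, and the retention ratio and continuing-value assumption are illustrative simplifications:

```python
def residual_earnings_value(book_value, forecast_earnings, cost_of_equity, terminal_growth):
    """Equity value = current book value + PV of forecast residual earnings
    plus a continuing value for residual earnings growing at a constant rate."""
    value, bv = book_value, book_value
    for t, e in enumerate(forecast_earnings, start=1):
        re = e - cost_of_equity * bv                  # residual earnings for year t
        value += re / (1 + cost_of_equity) ** t
        bv += 0.6 * e                                 # assume 60% of earnings retained
    # continuing value, assuming the last residual earnings figure persists and grows
    re_last = forecast_earnings[-1] - cost_of_equity * bv
    cv = re_last * (1 + terminal_growth) / (cost_of_equity - terminal_growth)
    return value + cv / (1 + cost_of_equity) ** len(forecast_earnings)

# Hypothetical inputs (in billions), purely for illustration
print(round(residual_earnings_value(100.0, [14.0, 15.5, 17.0], 0.12, 0.04), 1))
```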
Procedia PDF Downloads 376
2080 State Estimator Performance Enhancement: Methods for Identifying Errors in Modelling and Telemetry
Authors: M. Ananthakrishnan, Sunil K Patil, Koti Naveen, Inuganti Hemanth Kumar
Abstract:
The state estimation output of an EMS forms the base case for all other advanced applications used in real time by a power system operator. Tuning the state estimator is a repeated process and cannot be abandoned once a good solution is obtained. This paper demonstrates methods to improve the state estimator solution by identifying incorrect modelling and telemetry inputs to the application. In this work, the identification of database topology modelling errors by plotting the static network using node-to-node connection details is demonstrated with examples. Analytical methods to identify wrong transmission parameters, incorrect limits, and mistakes in pseudo load and generator modelling are explained with various observed cases. Further, methods used for active and reactive power tuning using the bus summation display, the reactive power absorption summary, and transformer tap correction are also described. In a large power system, verifying all network static data and modelling parameters on a regular basis is difficult. The proposed tuning methods can be easily used by operators to quickly identify errors and obtain the best possible state estimation performance. This, in turn, can lead to improved decision-support capabilities, ultimately enhancing the safety and reliability of the power grid.
Keywords: active power tuning, database modelling, reactive power, state estimator
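The state estimator referred to here is typically a weighted least squares (WLS) estimator, whose residuals are what modelling and telemetry errors show up in. A minimal linearized (DC) sketch is given below; the line susceptances, measurements and weights are hypothetical and this is not any vendor's EMS implementation:

```python
import numpy as np

# Minimal linearized (DC) weighted least squares state estimation sketch.
# Measurements z relate to the state x (bus voltage angles) via z = H x + e.
H = np.array([[10.0, -10.0,   0.0],    # flow on line 1-2
              [ 0.0,  10.0, -10.0],    # flow on line 2-3
              [10.0,   0.0, -10.0]])   # flow on line 1-3 (susceptances hypothetical)
z = np.array([0.52, 0.31, 0.85])       # measured flows (p.u.), hypothetical telemetry
W = np.diag([1 / 0.01**2, 1 / 0.01**2, 1 / 0.02**2])   # inverse measurement variances

# Fix bus 1 as the angle reference and solve the normal equations for buses 2 and 3.
Hr = H[:, 1:]
G = Hr.T @ W @ Hr                      # gain matrix
x_hat = np.linalg.solve(G, Hr.T @ W @ z)
residuals = z - Hr @ x_hat             # large normalized residuals flag bad data or models
print(x_hat, residuals)
```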
Procedia PDF Downloads 7
2079 Immunosuppressive Effect of Chloroquine through the Inhibition of Myeloperoxidase
Authors: J. B. Minari, O. B. Oloyede
Abstract:
Polymorphonuclear neutrophils (PMNs) play a crucial role in a variety of infections caused by bacteria, fungi, and parasites. Indeed, the involvement of PMNs in host defence against Plasmodium falciparum is well documented both in vitro and in vivo. Many antimalarial drugs, such as chloroquine, used in the treatment of human malaria significantly reduce the immune response of the host in vitro and in vivo. Myeloperoxidase is the most abundant enzyme found in the polymorphonuclear neutrophil and plays a crucial role in its function. This study was carried out to investigate the effect of chloroquine on the enzyme. In investigating the effects of the drug on myeloperoxidase, the influence of concentration and pH, partition ratio estimation, and the kinetics of inhibition were studied. This study showed that chloroquine is a concentration-dependent inhibitor of myeloperoxidase with an IC50 of 0.03 mM. Partition ratio estimation showed that 40 enzymatic turnover cycles are required for complete inhibition of myeloperoxidase in the presence of chloroquine. The influence of pH on the effect of chloroquine on the enzyme showed significant inhibition of myeloperoxidase at physiological pH. The kinetic inhibition studies showed that chloroquine caused non-competitive inhibition with an inhibition constant Ki of 0.27 mM. The results obtained from this study show that chloroquine is a potent inhibitor of myeloperoxidase and is capable of inactivating the enzyme. It is therefore considered that the inhibition of myeloperoxidase in the presence of chloroquine, as revealed in this study, may partly explain the impairment of polymorphonuclear neutrophils and the consequent immunosuppression of the host defence system against secondary infections.
Keywords: myeloperoxidase, chloroquine, inhibition, neutrophil, immune
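For the reversible kinetic component, the classic non-competitive inhibition rate law is v = Vmax[S] / ((Km + [S])(1 + [I]/Ki)). The sketch below evaluates relative enzyme activity at several chloroquine concentrations using the Ki reported in the abstract; Vmax, Km and the substrate level are hypothetical, and the turnover-dependent inactivation described in the abstract is a separate mechanism not modelled here:

```python
import numpy as np

def noncompetitive_rate(S, I, Vmax, Km, Ki):
    """Classic non-competitive inhibition: v = Vmax*S / ((Km + S) * (1 + I/Ki))."""
    return Vmax * S / ((Km + S) * (1.0 + I / Ki))

Ki = 0.27                       # mM, inhibition constant reported in the abstract
Vmax, Km, S = 1.0, 0.05, 0.1    # hypothetical kinetic constants and substrate level

v0 = noncompetitive_rate(S, 0.0, Vmax, Km, Ki)
for I in (0.0, 0.03, 0.27, 1.0):            # chloroquine concentrations in mM
    v = noncompetitive_rate(S, I, Vmax, Km, Ki)
    print(f"[I] = {I:5.2f} mM -> relative activity {v / v0:.2f}")
```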
Procedia PDF Downloads 374
2078 Lamb Fleece Quality as an Indicator of Endoparasitism
Authors: Maria Christine Rizzon Cintra, Tâmara Duarte Borges, Cristina Santos Sotomaior
Abstract:
Lamb fleece quality can be influenced by many factors, including welfare, stress, nutritional imbalance and the presence of ectoparasites. The association between fleece quality and endoparasitism has, until now, not been well established. The present study was undertaken to evaluate whether a visual fleece score could predict lamb parasitosis, with a focus on gastrointestinal parasites. Fleece quality was scored based on a combination of cleanliness and wool cover, using a three-point scale (1-3). Score 1: fleece shows no sign of dirt or contamination and has sufficient wool for the breed and time of year, with whole-body coverage. Score 2: fleece slightly damp or wet, with the coat contaminated by small patches of mud or dung and some areas of loose fleece, but no shed or bald patches of more than 10 cm in diameter. Score 3: fleece filthy, very wet and coated in mud or dung, with loose fleece, shed areas and bald patches greater than 10 cm, and some areas possibly trailing. All fleece quality scores (FQS) were assessed with the lamb restrained to ensure close inspection, taken along the lamb's back and considering just one side of the body. To confirm gastrointestinal parasitism and anemia, faecal egg counts (FEC) and hematocrit were determined for each animal. Lambs were also weighed. All these measurements were taken every 15 days, from 60 days until 150 days of life, using 48 Texel x Ile de France crossbred animals. For statistical analysis, the Stratigraphic program (version 4.1) was used; differences among FQS, weight gain, age, hematocrit and FEC were assessed using analysis of variance followed by Duncan's test, and correlations were assessed by Pearson's test at P<0.05. Results showed that animals scored as '3' for FQS had a lower hematocrit and a higher FEC (p<0.05) than animals scored as '1' (hematocrit: 26, 24, 23 and FEC: 2107, 2962, 4626, respectively, for FQS 1, 2 and 3). There were correlations between FQS and FEC (r = 0.16), FQS and hematocrit (r = -0.33) and FQS and weight gain (r = -0.20), indicating that animals with the worst FQS (score 3) had greater gastrointestinal parasite infection, were more anemic and had lower weight gain than animals scored as '1' or '2'. Concerning lamb age, animals that received score '3' for FQS maintained gastrointestinal parasite infection over time (P<0.05). It was concluded that FQS could be an important indicator to include in selective treatment for controlling verminosis in lambs.
Keywords: fleece, gastrointestinal parasites, sheep, welfare
Procedia PDF Downloads 241
2077 Fuzzy Wavelet Model to Forecast the Exchange Rate of IDR/USD
Authors: Tri Wijayanti Septiarini, Agus Maman Abadi, Muhammad Rifki Taufik
Abstract:
The IDR/USD exchange rate can serve as an indicator for analyzing the Indonesian economy. The exchange rate is an important factor because it has a large effect on the Indonesian economy overall, so analysis of exchange rate data is needed. The IDR/USD exchange rate data are decomposed into frequency and time components, which can help the government monitor the Indonesian economy. This method is very effective for this purpose, gives highly accurate results and has a simple structure. In this paper, the exchange rate data used are weekly data from December 17, 2010 until November 11, 2014.
Keywords: the exchange rate, fuzzy Mamdani, discrete wavelet transforms, fuzzy wavelet
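The time-frequency decomposition referred to here is the discrete wavelet transform. The sketch below performs one level of a Haar DWT on a toy series as an illustration; the numbers are hypothetical, not the actual weekly IDR/USD data, and the fuzzy Mamdani forecasting stage is not shown:

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform.

    Returns the approximation (low-frequency trend) and detail
    (high-frequency fluctuation) coefficients of an even-length series.
    """
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

# Hypothetical weekly exchange-rate series (IDR per USD), for illustration only
rate = np.array([9020, 9035, 9010, 9050, 9080, 9075, 9110, 9140], dtype=float)
trend, fluctuation = haar_dwt(rate)
print(trend)        # smoothed trend, which could feed the fuzzy forecasting stage
print(fluctuation)  # high-frequency component
```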
Procedia PDF Downloads 571
2076 A Multidimensional Indicator-Based Framework to Assess the Sustainability of Productive Green Roofs: A Case Study in Madrid
Authors: Francesca Maria Melucci, Marco Panettieri, Rocco Roma
Abstract:
Cities are at the forefront of achieving the sustainable development goals set out in Agenda 2030. For this reason, increasing attention has been given to creating resilient, sustainable, inclusive and green cities, and finding solutions to these problems is one of the greatest challenges faced by researchers today. Urban green infrastructure in particular, including green roofs, plays a key role in tackling environmental, social and economic problems. The starting point was an extensive literature review on (1) research developments on the benefits (environmental, economic and social) and implications of green roofs; (2) sustainability assessment and applied methodologies; and (3) specific indicators to measure impacts on urban sustainability. Through this review, the appropriate qualitative and quantitative characteristics that are part of the complex 'green roof' system were identified, as studies that holistically capture its multifunctional nature are still lacking. This paper therefore aims to find a method to improve community participation in green roof initiatives and to support local governance processes in developing efficient proposals to achieve better sustainability and resilience of cities. To this end, the multidimensional indicator-based framework presented by Tapia in 2021 has been tested for the first time on a green roof in the city of Madrid. The framework's set of indicators was supplemented with other indicators, such as those for waste management and circularity (OECD Inventory of Circular Economy indicators) and sustainability performance. The specific indicators to be used in the case study were decided after a consultation phase with relevant stakeholders. Data on the community's willingness to participate in green roof implementation initiatives were collected through interviews and online surveys with a heterogeneous sample of citizens. The results of applying the framework suggest how the different aspects of sustainability influence the choice of a green roof and provide input on the main mechanisms involved in citizens' willingness to participate in such initiatives.
Keywords: urban agriculture, green roof, urban sustainability, indicators, multi-criteria analysis
Procedia PDF Downloads 72
2075 Defining a Framework for Holistic Life Cycle Assessment of Building Components by Considering Parameters Such as Circularity, Material Health, Biodiversity, Pollution Control, Cost, Social Impacts, and Uncertainty
Authors: Naomi Grigoryan, Alexandros Loutsioli Daskalakis, Anna Elisse Uy, Yihe Huang, Aude Laurent (Webanck)
Abstract:
In response to the building and construction sectors accounting for a third of all energy demand and emissions, the European Union has introduced new laws and regulations for the construction sector that emphasize material circularity, energy efficiency, biodiversity, and social impact. Existing design tools assess sustainability in early-stage design for products or buildings; however, there is no standardized methodology for measuring the circularity performance of building components. Existing assessment methods for building components focus primarily on carbon footprint but lack the comprehensive analysis required to design for circularity. The research conducted in this paper covers the parameters needed to assess sustainability in the design process of architectural products such as doors, windows, and facades, and maps a framework for a tool that assists designers with real-time sustainability metrics. Considering the life cycle of building components such as facades, windows, and doors involves the life cycle stages applied to product design as well as many of the methods used in the life cycle analysis of buildings. The current industry standards of sustainability assessment for metal building components follow cradle-to-grave life cycle assessment (LCA), track Global Warming Potential (GWP), and document the parameters used for an Environmental Product Declaration (EPD). Developed by the Ellen MacArthur Foundation, the Material Circularity Indicator (MCI) is a methodology utilizing data from LCA and EPDs to rate circularity, with a value between 0 and 1 where higher values indicate higher circularity. Expanding on the MCI with additional indicators such as the Water Circularity Index (WCI), the Energy Circularity Index (ECI), the Social Circularity Index (SCI), and Life Cycle Economic Value (EV), and calculating biodiversity risk and uncertainty, the assessment of an architectural product's impact can be targeted more specifically based on product requirements, performance, and lifespan. Broadening the scope of LCA calculation for products to incorporate aspects of building design allows product designers to account for the disassembly of architectural components. For example, the Material Circularity Indicator for architectural products such as windows and facades is typically low due to the impact of glass, as 70% of glass ends up in landfills because of damage in the disassembly process. A low MCI can be combatted by expanding beyond cradle-to-grave assessment and focusing the design process on disassembly, recycling, and repurposing with the help of real-time assessment tools. Design for Disassembly and Urban Mining have been integrated within the construction field on small scales as project-based exercises, without addressing the entire supply chain of architectural products. By adopting more comprehensive sustainability metrics and incorporating uncertainty calculations, the sustainability of building components can be assessed more accurately with decarbonization and disassembly in mind, addressing the large-scale commercial markets within construction, some of the most significant contributors to climate change.
Keywords: architectural products, early-stage design, life cycle assessment, material circularity indicator
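A heavily simplified sketch of the MCI logic follows. It keeps only the core idea (virgin feedstock and unrecoverable waste drive a linear flow index, discounted by a utility factor); the full Ellen MacArthur Foundation formulation includes additional terms, for example for recycling-process losses, and all input values here are hypothetical:

```python
def material_circularity_indicator(mass, frac_recycled_feedstock, frac_reused_feedstock,
                                   frac_collected_for_recycling, frac_reused_at_eol,
                                   lifetime_ratio=1.0):
    """Simplified Material Circularity Indicator sketch.

    V: virgin feedstock, W: unrecoverable waste, LFI: linear flow index,
    F(X) = 0.9 / X: utility factor for the lifetime/intensity ratio X.
    Recycling-process losses are ignored here (assumed 100% efficient).
    """
    V = mass * (1.0 - frac_recycled_feedstock - frac_reused_feedstock)
    W = mass * (1.0 - frac_collected_for_recycling - frac_reused_at_eol)
    lfi = (V + W) / (2.0 * mass)
    utility = 0.9 / lifetime_ratio
    return max(0.0, 1.0 - lfi * utility)

# Hypothetical aluminium-framed window: 30% recycled content,
# 60% collected for recycling at end of life, service life equal to the average.
print(round(material_circularity_indicator(1.0, 0.30, 0.0, 0.60, 0.0), 2))
```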
Procedia PDF Downloads 88
2074 Effect Analysis of an Improved Adaptive Speech Noise Reduction Algorithm in Online Communication Scenarios
Authors: Xingxing Peng
Abstract:
With the development of society, there are more and more online communication scenarios, such as teleconferencing and online education. In conference communication, voice quality is a very important component, and noise may greatly reduce the communication experience of participants. Therefore, voice noise reduction has an important impact on scenarios such as voice calls. This research focuses on the key technologies of the sound transmission process, with the purpose of preserving audio quality as far as possible so that the listener hears clearer and smoother sound. To address the problem that traditional speech enhancement algorithms perform poorly on non-stationary noise, an adaptive speech noise reduction algorithm is studied in this paper. Traditional noise estimation methods are mainly designed for stationary noise. Here, we study the spectral characteristics of different noise types, especially the characteristics of non-stationary burst noise, and design a noise estimator module to deal with non-stationary noise. Noise features are extracted from non-speech segments, and the noise estimation module is adjusted in real time according to different noise characteristics. This adaptive algorithm can enhance speech according to the noise characteristics and improve the performance of traditional algorithms on non-stationary noise, so as to achieve a better enhancement effect. The experimental results show that the proposed algorithm is effective and can better adapt to different types of noise, thereby obtaining a better speech enhancement effect.
Keywords: speech noise reduction, speech enhancement, self-adaptation, Wiener filter algorithm
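A generic illustration of the approach described, frame-wise Wiener filtering with a noise spectrum tracked on non-speech frames, is sketched below. It is not the paper's exact estimator; the VAD labels, smoothing factor and frame setup are assumptions:

```python
import numpy as np

def wiener_enhance(frames, is_speech, alpha=0.9, eps=1e-10):
    """Frame-wise Wiener filtering with a noise PSD tracked on non-speech frames.

    frames: (n_frames, frame_len) windowed time-domain frames.
    is_speech: boolean array from a VAD; the noise PSD is updated only when False.
    """
    spec = np.fft.rfft(frames, axis=1)
    noise_psd = np.mean(np.abs(spec[~is_speech]) ** 2, axis=0) + eps
    out = np.empty_like(frames)
    for i, X in enumerate(spec):
        if not is_speech[i]:                     # adapt the noise estimate in noise-only frames
            noise_psd = alpha * noise_psd + (1 - alpha) * np.abs(X) ** 2
        snr = np.maximum(np.abs(X) ** 2 / noise_psd - 1.0, 0.0)   # a-posteriori SNR estimate
        gain = snr / (snr + 1.0)                 # Wiener gain
        out[i] = np.fft.irfft(gain * X, n=frames.shape[1])
    return out

# Toy usage: 20 random frames, the first 5 marked as noise-only by a hypothetical VAD
frames = np.random.randn(20, 256) * 0.1
vad = np.array([False] * 5 + [True] * 15)
enhanced = wiener_enhance(frames, vad)
```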
Procedia PDF Downloads 58
2073 Modeling Default Probabilities of the Chosen Czech Banks in the Time of the Financial Crisis
Authors: Petr Gurný
Abstract:
One of the most important tasks in risk management is the correct determination of the probability of default (PD) of particular financial subjects. In this paper, the possibility of determining a financial institution's PD using credit-scoring models is discussed. The paper is divided into two parts. The first part is devoted to the estimation of three different models (based on linear discriminant analysis, logit regression and probit regression) from a sample of almost three hundred US commercial banks. Afterwards, these models are compared and verified on a control sample with a view to choosing the best one. The second part of the paper is aimed at applying the chosen model to a portfolio of three key Czech banks to estimate their present financial stability. However, it is no less important to be able to estimate the evolution of PD in the future. For this reason, the second task in this paper is to estimate the probability distribution of future PD for the Czech banks. To this end, the values of particular indicators are sampled randomly and the distribution of PDs is estimated, under the assumption that the indicators are distributed according to a multidimensional subordinated Lévy model (specifically, the Variance Gamma and Normal Inverse Gaussian models). Although the obtained results show that all banks are relatively healthy, there is still a high chance that "a financial crisis" will occur, at least in terms of probability; this is indicated by various quantiles of the estimated distributions. Finally, it should be noted that the applicability of the estimated model (with respect to the data used) is limited to the recessionary phase of the financial market.
Keywords: credit-scoring models, multidimensional subordinated Lévy model, probability of default
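A logit credit-scoring model of the kind estimated here maps financial indicators x to a default probability PD = 1 / (1 + exp(-(b0 + b'x))). The sketch below only evaluates that mapping; the coefficients and ratios are hypothetical, not the estimates obtained in the paper:

```python
import numpy as np

def logit_pd(indicators, intercept, coefficients):
    """Logit credit-scoring model: PD = 1 / (1 + exp(-(b0 + b'x)))."""
    score = intercept + np.dot(coefficients, indicators)
    return 1.0 / (1.0 + np.exp(-score))

# Hypothetical coefficients for three financial ratios
# (capital adequacy, return on assets, non-performing-loan ratio).
b0 = -3.0
b = np.array([-8.0, -25.0, 12.0])

x_bank = np.array([0.15, 0.012, 0.04])    # illustrative ratios for one bank
print(f"estimated PD = {logit_pd(x_bank, b0, b):.4f}")
```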
Procedia PDF Downloads 456
2072 Modeling Diel Trends of Dissolved Oxygen for Estimating the Metabolism in Pristine Streams in the Brazilian Cerrado
Authors: Wesley A. Saltarelli, Nicolas R. Finkler, Adriana C. P. Miwa, Maria C. Calijuri, Davi G. F. Cunha
Abstract:
The metabolism of streams is an indicator of ecosystem disturbance due to the influences of the catchment on the structure of the water bodies. The study of respiration and photosynthesis allows the estimation of energy fluxes through food webs and the analysis of autotrophic and heterotrophic processes. We aimed to evaluate the metabolism of streams located in the Brazilian savannah, the Cerrado (Sao Carlos, SP), by determining and modeling the daily changes of dissolved oxygen (DO) in the water during one year. Three water bodies with minimal anthropogenic interference in their surroundings were selected: Espraiado (ES), Broa (BR) and Canchim (CA). Every two months, water temperature, pH and conductivity are measured with a multiparameter probe. Nitrogen and phosphorus forms are determined according to standard methods. Also, canopy cover percentages are estimated in situ with a spherical densitometer. Stream flows are quantified through the conservative tracer (NaCl) method. For the metabolism study, DO (PME-MiniDOT) and light (Odyssey Photosynthetic Active Radiation) sensors log data every ten minutes for at least three consecutive days. The reaeration coefficient (k2) is estimated through the tracer gas (SF6) method. Finally, we model the variations in DO concentrations and calculate the rates of gross and net primary production (GPP and NPP) and respiration based on the one-station method described in the literature. Three sampling campaigns were carried out, in October and December 2015 and February 2016 (the next will be in April, June and August 2016), and the results from the first two periods are already available. The mean water temperatures in the streams were 20.0 ± 0.8 °C (Oct) and 20.7 ± 0.5 °C (Dec). In general, electrical conductivity values were low (ES: 20.5 ± 3.5 µS/cm; BR: 5.5 ± 0.7 µS/cm; CA: 33 ± 1.4 µS/cm). The mean pH values were 5.0 (BR), 5.7 (ES) and 6.4 (CA). The mean concentrations of total phosphorus were 8.0 µg/L (BR), 66.6 µg/L (ES) and 51.5 µg/L (CA), whereas soluble reactive phosphorus concentrations were always below 21.0 µg/L. The BR stream had the lowest concentration of total nitrogen (0.55 mg/L) compared to CA (0.77 mg/L) and ES (1.57 mg/L). The average discharges were 8.8 ± 6 L/s (ES), 11.4 ± 3 L/s (BR) and 2.4 ± 0.5 L/s (CA). The average percentages of canopy cover were 72% (ES), 75% (BR) and 79% (CA). Significant daily changes were observed in the DO concentrations, reflecting predominantly heterotrophic conditions (respiration exceeded gross primary production, with negative net primary production). GPP varied from 0 to 0.4 g/m²·d (in Oct and Dec), and R varied from 0.9 to 22.7 g/m²·d (Oct) and from 0.9 to 7 g/m²·d (Dec). The predominance of heterotrophic conditions suggests increased vulnerability of the ecosystems to artificial inputs of organic matter that would demand oxygen. The investigation of metabolism in pristine streams can help define natural reference conditions of trophic state.
Keywords: low-order streams, metabolism, net primary production, trophic state
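The one-station method models the DO balance as dC/dt = GPP(t) - ER + k2·(Csat - C). The toy forward simulation below illustrates the shape of the resulting diel curve only; all rates are hypothetical and expressed volumetrically (converting the areal g/m²·d rates above would additionally require the mean stream depth):

```python
import numpy as np

def simulate_do(hours, gpp_daily, er_daily, k2_per_day, c_sat, c0):
    """One-station DO balance: dC/dt = GPP(t) - ER + k2*(Csat - C).

    GPP follows a half-sine light curve over 06:00-18:00, scaled so its daily
    integral equals gpp_daily; ER and reaeration act continuously.
    Concentrations in mg O2/L, rates per day, hourly Euler steps.
    """
    dt = 1.0 / 24.0
    c = np.empty(hours + 1)
    c[0] = c0
    for i in range(hours):
        hour = i % 24
        light = np.sin(np.pi * (hour - 6) / 12.0) if 6 <= hour <= 18 else 0.0
        gpp = gpp_daily * light * np.pi          # instantaneous production rate
        dcdt = gpp - er_daily + k2_per_day * (c_sat - c[i])
        c[i + 1] = c[i] + dcdt * dt
    return c

# Hypothetical heterotrophic stream: low GPP, higher respiration
do = simulate_do(72, gpp_daily=0.4, er_daily=3.0, k2_per_day=20.0, c_sat=8.0, c0=7.5)
print(round(do.min(), 2), round(do.max(), 2))
```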
Procedia PDF Downloads 258
2071 The Non-Stationary BINARMA(1,1) Process with Poisson Innovations: An Application on Accident Data
Authors: Y. Sunecher, N. Mamode Khan, V. Jowaheer
Abstract:
This paper considers the modelling of a non-stationary bivariate integer-valued autoregressive moving average process of order one (BINARMA(1,1)) with correlated Poisson innovations. The BINARMA(1,1) model is specified using the binomial thinning operator and by assuming that the cross-correlation between the two series is induced by the innovation terms only. Based on these assumptions, the non-stationary marginal and joint moments of the BINARMA(1,1) are derived iteratively by using some initial stationary moments. As regards the estimation of the parameters of the proposed model, the conditional maximum likelihood (CML) estimation method is derived based on thinning and convolution properties. The forecasting equations of the BINARMA(1,1) model are also derived. A simulation study is also proposed in which BINARMA(1,1) count data are generated using a multivariate Poisson R code for the innovation terms. The performance of the BINARMA(1,1) model is then assessed through a simulation experiment, and the mean estimates of the model parameters obtained are all efficient, based on their standard errors. The proposed model is then used to analyse real-life accident data from the motorway in Mauritius, based on some covariates: policemen, daily patrol, speed cameras, traffic lights and roundabouts. The BINARMA(1,1) model is applied to the accident data, and the CML estimates clearly indicate a significant impact of the covariates on the number of accidents on the motorway in Mauritius. The forecasting equations also provide reliable one-step-ahead forecasts.
Keywords: non-stationary, BINARMA(1,1) model, Poisson innovations, conditional maximum likelihood, CML
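The binomial thinning operator a∘X replaces scalar multiplication for counts: each of the X events survives independently with probability a, so a∘X ~ Binomial(X, a). As a simplified stand-in for one component of the bivariate model, the sketch below simulates a univariate INARMA(1,1)-type recursion with Poisson innovations; the parameter values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(42)

def thin(x, a):
    """Binomial thinning a∘x: each of the x counts survives with probability a."""
    return rng.binomial(x, a)

def simulate_inarma11(n, alpha, beta, lam):
    """Univariate INARMA(1,1)-type recursion with Poisson innovations:
    X_t = alpha∘X_{t-1} + beta∘e_{t-1} + e_t
    (a simplified stand-in for one component of the bivariate BINARMA(1,1))."""
    x = np.zeros(n, dtype=int)
    e_prev = rng.poisson(lam)
    for t in range(1, n):
        e = rng.poisson(lam)
        x[t] = thin(x[t - 1], alpha) + thin(e_prev, beta) + e
        e_prev = e
    return x

series = simulate_inarma11(200, alpha=0.4, beta=0.3, lam=2.0)
print(series[:20], series.mean())
```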
Procedia PDF Downloads 129
2070 Deliberation of Daily Evapotranspiration and Evaporative Fraction Based on Remote Sensing Data
Authors: J. Bahrawi, M. Elhag
Abstract:
Estimation of evapotranspiration is always a major component in water resources management. Traditional techniques for calculating daily evapotranspiration based on field measurements are valid only at local scales. Earth observation satellite sensors are thus used to overcome the difficulties of obtaining daily evapotranspiration measurements at the regional scale. The Surface Energy Balance System (SEBS) model was adopted to estimate daily evapotranspiration and relative evaporation, along with other land surface energy fluxes. The model requires agro-climatic data that improve the model outputs. Advanced Along-Track Scanning Radiometer (AATSR) and Medium Resolution Imaging Spectrometer (MERIS) imagery was used to estimate daily evapotranspiration and relative evaporation over the entire Nile Delta region in Egypt, supported by meteorological data collected from six weather stations located within the study area. Daily evapotranspiration maps derived from the SEBS model show strong agreement with ground-truth data taken from 92 points uniformly distributed over the study area. Moreover, daily evapotranspiration and relative evaporation are strongly correlated. Reliable estimation of daily evapotranspiration supports decision makers in reviewing current land use practices in terms of water management, while enabling them to propose proper land use changes.
Keywords: daily evapotranspiration, relative evaporation, SEBS, AATSR, MERIS, Nile Delta
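A common step in SEBS-type workflows is the evaporative-fraction upscaling: the instantaneous fluxes at satellite overpass give EF = LE / (Rn - G), and EF is assumed constant over the day to convert daily available energy into daily ET. The sketch below illustrates that step only, with hypothetical flux values for a single pixel:

```python
# Evaporative-fraction upscaling sketch (illustrative values, not Nile Delta data).
LAMBDA = 2.45e6          # latent heat of vaporisation, J/kg

def evaporative_fraction(rn_inst, g_inst, h_inst):
    """EF from instantaneous net radiation, soil heat flux and sensible heat (W/m2)."""
    le_inst = rn_inst - g_inst - h_inst          # latent heat flux by residual
    return le_inst / (rn_inst - g_inst)

def daily_et_mm(ef, rn_daily_mj, g_daily_mj=0.0):
    """Daily ET (mm/day) from EF and daily available energy (MJ/m2/day)."""
    le_daily_j = ef * (rn_daily_mj - g_daily_mj) * 1e6
    return le_daily_j / LAMBDA                   # kg/m2 of water, i.e. mm

ef = evaporative_fraction(rn_inst=550.0, g_inst=80.0, h_inst=160.0)
print(round(ef, 2), round(daily_et_mm(ef, rn_daily_mj=15.0), 2), "mm/day")
```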
Procedia PDF Downloads 259
2069 Linear Regression Estimation of Tactile Comfort for Denim Fabrics Based on In-Plane Shear Behavior
Authors: Nazli Uren, Ayse Okur
Abstract:
Tactile comfort of a textile product is an essential property and a major concern when it comes to customer perceptions and preferences. The subjective nature of comfort and the difficulties of simulating human hand sensory feelings make it hard to establish a well-accepted link between tactile comfort and objective evaluations. On the other hand, the shear behavior of a fabric is a mechanical parameter that can be measured by various objective test methods. The principal aim of this study is to determine the tactile comfort of commercially available denim fabrics by subjective measurements, create a tactile score database for denim fabrics, and investigate the relations between tactile comfort and shear behavior. The in-plane shear behavior of 17 commercially available denim fabrics with a variety of raw materials and weave structures was measured by a custom-designed shear frame and the conventional bias extension method in the two corresponding diagonal directions. The tactile comfort of the denim fabrics was also determined via subjective customer evaluations. The aforesaid relations were statistically investigated and expressed as regression equations. The analyses of the relations between tactile comfort and shear behavior showed considerably high correlation coefficients, and the suggested regression equations were likewise found to be statistically significant. Accordingly, it was concluded that the tactile comfort of denim fabrics can be estimated with high precision based on the results of in-plane shear behavior measurements.
Keywords: denim fabrics, in-plane shear behavior, linear regression estimation, tactile comfort
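A minimal example of the kind of regression described, a subjective tactile score fitted against a measured shear parameter by least squares, is sketched below. The data points, the choice of shear rigidity as the predictor and the resulting equation are all hypothetical:

```python
import numpy as np

# Hypothetical data: subjective tactile comfort score vs measured shear rigidity.
shear_rigidity = np.array([0.8, 1.1, 1.4, 1.7, 2.0, 2.4, 2.9])   # arbitrary units
tactile_score  = np.array([4.6, 4.2, 3.9, 3.5, 3.2, 2.8, 2.3])   # 1-5 panel rating

slope, intercept = np.polyfit(shear_rigidity, tactile_score, deg=1)
predicted = intercept + slope * shear_rigidity
r = np.corrcoef(tactile_score, predicted)[0, 1]

print(f"tactile ~ {intercept:.2f} + ({slope:.2f}) * shear_rigidity, r = {r:.3f}")
```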
Procedia PDF Downloads 302
2068 Reducing the Incidence Rate of Pressure Sore in a Medical Center in Taiwan
Authors: Chang Yu Chuan
Abstract:
Background and Aim: A pressure sore is not only the consequence of gradual damage to the skin leading to tissue defects but also an important indicator of clinical care. If hospitalized patients develop pressure sores without proper care, the result is delayed healing, wound infection, increased physical pain for the patient, prolonged hospital stay and even death, which has a negative impact on the quality of care and also increases nursing manpower and medical costs. This project is aimed at decreasing the incidence of pressure sores in one internal medicine ward. Our data showed 53 cases (0.61%) of pressure sores in 2015, which exceeded the average (0.5%) of the Taiwan Clinical Performance Indicator (TCPI) for medical centers. The purpose of this project is to reduce the incidence rate of pressure sores in the ward. After data collection and analysis from January to December 2016, the reasons for developing pressure sores were found to be: 1. lack of knowledge about pressure sore prevention among nursing staff; 2. no relevant courses on preventing pressure ulcers and pressure wound care being held in this unit; 3. a low completion rate of the pressure sore care education that family members should receive from nursing staff; 4. insufficient decompression equipment; 5. lack of standard procedures for body-turning and positioning care. After team brainstorming, several strategies were proposed, including holding in-service education, pressure sore care seed training, purchasing decompression mattresses and memory pillows, designing more health education tools, such as health education pamphlets, posters and multimedia films demonstrating body-turning and positioning, and formulating and promoting standard operating procedures. In this way, nursing staff can understand the body-turning and positioning guidelines for pressure sore prevention and enhance the quality of care. After the implementation of this project, the pressure sore density significantly decreased from 0.61% (53 cases) to 0.45% (28 cases) in this ward. The project shows good results, sets a good example for nurses working on the ward and helps to enhance the quality of care.
Keywords: body-turning and positioning, incidence density, nursing, pressure sore
Procedia PDF Downloads 267
2067 The Grain Size Distribution of Sandy Soils in Libya
Authors: Massoud Farag Abouklaish
Abstract:
The main aim of the present study is to investigate and classify the particle size distribution of sandy soils in Libya. More than fifty soil samples were collected from many regions in the north, west and south of Libya. Laboratory sieve analysis tests were performed on disturbed soil samples to determine the grain size distribution and to provide an indicator of general engineering behavior and a good understanding of the soils. Test results are presented and analysed, and conclusions and recommendations are made.
Keywords: Libya, grain size, sandy soils, sieve analysis tests
Procedia PDF Downloads 612
2066 Stochastic Repair and Replacement with a Single Repair Channel
Authors: Mohammed A. Hajeeh
Abstract:
This paper examines the behavior of a system which, upon failure, is either replaced with a certain probability p or imperfectly repaired with probability q. The system is analyzed using the method of Kolmogorov's forward equations; the analytical expression for the steady-state availability is derived as an indicator of the system's performance. It is found that the analysis becomes more complex as the number of imperfect repairs increases. It is also observed that the availability increases as the number of states and the replacement probability increase. Using such an approach in more complex configurations and in dynamic systems is cumbersome; therefore, it is advisable to resort to simulation or heuristics. An example is provided for demonstration.
Keywords: repairable models, imperfect, availability, exponential distribution
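For a small version of such a model, the steady state of the forward equations can be obtained numerically from the generator matrix. The sketch below uses a hypothetical three-state chain (working, under imperfect repair, being replaced) with exponential rates; it illustrates the balance-equation approach only and is not the paper's derivation:

```python
import numpy as np

# Steady-state availability sketch for a small Markov repair/replacement model.
# States: 0 = working, 1 = under imperfect repair, 2 = being replaced.
# On failure (rate lam) the unit goes to replacement with prob p, repair with prob q = 1 - p.
lam, mu_rep, mu_new, p = 0.1, 1.0, 0.5, 0.3    # hypothetical exponential rates
q = 1.0 - p

Q = np.array([
    [-lam,     q * lam,  p * lam],
    [ mu_rep, -mu_rep,   0.0    ],
    [ mu_new,  0.0,     -mu_new ],
])

# Solve pi Q = 0 with sum(pi) = 1 (the steady state of the forward equations).
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("steady-state availability =", round(pi[0], 4))
```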
Procedia PDF Downloads 287
2065 VaR or TCE: Explaining the Preferences of Regulators
Authors: Silvia Faroni, Olivier Le Courtois, Krzysztof Ostaszewski
Abstract:
While a lot of research concentrates on the merits of VaR and TCE, which are the two most classic risk indicators used by financial institutions, little has been written on explaining why regulators favor the choice of VaR or TCE in their set of rules. In this paper, we investigate the preferences of regulators with the aim of understanding why, for instance, a VaR with a given confidence level is ultimately retained. Further, this paper provides equivalence rules that explain how a given choice of VaR can be equivalent to a given choice of TCE. Then, we introduce a new risk indicator that extends TCE by providing a more versatile weighting of the constituents of probability distribution tails. All of our results are illustrated using the generalized Pareto distribution.
Keywords: generalized pareto distribution, generalized tail conditional expectation, regulator preferences, risk measure
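As a purely numerical illustration of the two indicators compared here, the sketch below computes the empirical VaR (a quantile of the loss distribution) and TCE (the mean loss beyond that quantile) from samples of a generalized Pareto distribution; the shape and scale parameters are hypothetical and the equivalence rules of the paper are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(7)

# Losses drawn from a generalized Pareto distribution via inverse-CDF sampling.
xi, sigma = 0.2, 1.0                                # hypothetical shape and scale
u = rng.uniform(size=200_000)
losses = sigma / xi * ((1 - u) ** (-xi) - 1.0)

def var_tce(samples, level):
    var = np.quantile(samples, level)
    tce = samples[samples >= var].mean()            # mean loss given the VaR is exceeded
    return var, tce

for level in (0.95, 0.99, 0.995):
    var, tce = var_tce(losses, level)
    print(f"level {level:.3f}: VaR = {var:.2f}, TCE = {tce:.2f}")
```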
Procedia PDF Downloads 170
2064 Correlation Analysis between the Corporate Governance and Financial Performance of Banking Sectors Using Parameter Estimation
Authors: Vishwa Nath Maurya, Rama Shanker Sharma, Saad Talib Hasson Aljebori, Avadhesh Kumar Maurya, Diwinder Kaur Arora
Abstract:
The present paper deals with the problem of determining the relationship between corporate governance variables and the financial performance of Islamic banks. We address corporate governance in the banking sector, where it is of increasing importance due to the special nature of banks: the bankruptcy of a bank affects not only the relevant parties, from customers to depositors and lenders, but also financial stability and then the economy as a whole. The paper considers the specificity of governance in Islamic banks, which face a double governance structure: the Anglo-Saxon governance system and the Islamic governance system. In addition, we focus on measuring the impact of corporate governance variables on financial performance through an empirical study of a sample of Islamic banks during the period 2005-2012 in the GCC region. Our study implies that there is a very strong relationship between governance variables and the financial performance of Islamic banks: there is a positive relationship between return on assets and the composition of the board of directors, the size of the board of directors, the number of committees in the board, as well as the number of members of the Sharia Supervisory Board, while there is a clear negative relationship between return on assets and ownership concentration.
Keywords: correlation analysis, parametric estimation, corporate governance, financial performance, financial stability, conventional banks, bankruptcy, Islamic governance system
Procedia PDF Downloads 516
2063 A Hybrid Genetic Algorithm and Neural Network for Wind Profile Estimation
Authors: M. Saiful Islam, M. Mohandes, S. Rehman, S. Badran
Abstract:
The increasing need for wind power is directing us to acquire precise knowledge of wind resources, and a methodical investigation of potential locations is required for wind power deployment. High penetration of wind energy into the grid is leading to multi-megawatt installations with huge investment costs; this calls for determining appropriate places for wind farm operation. For accurate assessment, detailed examination of the wind speed profile, relative humidity, temperature and other geological or atmospheric parameters is required. Among all the uncertainty factors influencing wind power estimation, vertical extrapolation of wind speed is perhaps the most difficult and critical one. Different approaches have been used for the extrapolation of wind speed to hub height, mainly based on the log law, the power law and various modifications of the two. This paper proposes an Artificial Neural Network (ANN) and Genetic Algorithm (GA) based hybrid model, namely GA-NN, for vertical extrapolation of wind speed. This model is very simple in the sense that it does not require any parametric estimates such as the wind shear coefficient, roughness length or atmospheric stability, and it is also reliable compared to other methods. The model uses available measured wind speeds at 10 m, 20 m and 30 m heights to estimate wind speeds up to 100 m. Good agreement is found between measured and estimated wind speeds at 30 m and 40 m, with approximately 3% mean absolute percentage error. Comparisons with an ANN and the power law further prove the feasibility of the proposed method.
Keywords: wind profile, vertical extrapolation of wind, genetic algorithm, artificial neural network, hybrid machine learning
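For reference, the two conventional baselines mentioned above, the power law and the log law, are sketched below; the shear exponent and roughness length are illustrative assumptions, and this is not the proposed GA-NN model itself:

```python
import numpy as np

def power_law(u_ref, z_ref, z, alpha=0.14):
    """Power-law extrapolation: u(z) = u_ref * (z / z_ref)^alpha."""
    return u_ref * (z / z_ref) ** alpha

def log_law(u_ref, z_ref, z, z0=0.03):
    """Log-law extrapolation: u(z) = u_ref * ln(z/z0) / ln(z_ref/z0)."""
    return u_ref * np.log(z / z0) / np.log(z_ref / z0)

# Hypothetical 10 m measurement extrapolated to higher heights
u10 = 5.2                      # m/s measured at 10 m
for z in (30, 50, 100):
    print(z, "m:", round(power_law(u10, 10, z), 2), round(log_law(u10, 10, z), 2), "m/s")
```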
Procedia PDF Downloads 490
2062 Atmospheric CO2 Capture via Temperature/Vacuum Swing Adsorption in SIFSIX-3-Ni
Authors: Eleni Tsalaporta, Sebastien Vaesen, James M. D. MacElroy, Wolfgang Schmitt
Abstract:
Carbon dioxide capture has attracted the attention of many governments, industries and scientists over the last few decades, due to the rapid increase in atmospheric CO2 concentration, and several studies have been conducted in this area over the last few years. In many of these studies, CO2 capture with complex Pressure Swing Adsorption (PSA) cycles has been associated with high energy consumption despite the promising capture performance of such processes. The purpose of this study is the economic capture of atmospheric carbon dioxide for its transformation into a clean type of energy. A single-column Temperature/Vacuum Swing Adsorption (TSA/VSA) process is proposed as an alternative to multi-column Pressure Swing Adsorption (PSA) processes. The proposed adsorbent is SIFSIX-3-Ni, a newly developed MOF (Metal Organic Framework) with extended CO2 selectivity and capacity. There are three stages involved in this paper: (i) SIFSIX-3-Ni is synthesized and pelletized, and its physical and chemical properties are examined before and after the pelletization process; (ii) experiments are designed and undertaken for the estimation of the diffusion and adsorption parameters and limitations for CO2 captured from air; and (iii) the CO2 adsorption capacity and dynamic characteristics of SIFSIX-3-Ni are investigated both experimentally and mathematically by employing a single-column TSA/VSA for the capture of atmospheric CO2. This work is further supported by a technical-economic study for the estimation of the investment cost and the energy consumption of the single-column TSA/VSA process. The simulations are performed using gPROMS.
Keywords: carbon dioxide capture, temperature/vacuum swing adsorption, metal organic frameworks, SIFSIX-3-Ni
Procedia PDF Downloads 263
2061 Estimation of World Steel Production by Process
Authors: Reina Kawase
Abstract:
World GHG emissions should be reduced by 50% by 2050 compared with the 1990 level, and CO2 emission reduction from the steel sector, an energy-intensive sector, is essential. To estimate CO2 emissions from the steel sector worldwide, an estimation of steel production is required. World steel production by process is estimated for the period 2005-2050, with the world divided into 35 aggregated regions. For the steel-making process, two kinds of processes are considered: the basic oxygen furnace (BOF) and the electric arc furnace (EAF). Steel production by process in each region is determined based on current production capacity, the supply-demand balance of steel and scrap, technological innovation in steel making, projected steel consumption, and goods trade. World steel production under a moderate countermeasure scenario in 2050 increases by 1.3 times compared with that in 2012. When domestic scrap recycling is promoted, steel production in developed regions increases by about 1.5 times, and their share changes from 34% (2012) to about 40% (2050), because developed regions are the main suppliers of scrap. In this case, 48-57% of world steel production is produced by EAF. Under the scenario that emphasizes the supply-demand balance of steel, steel production in developing regions increases by 1.4 times and is larger than that in developed regions; the share of developing regions, however, is not so different from the current level. The increase in steel production by EAF is the largest under the scenario in which the supply-demand balance of steel is an important factor, where its share reaches 65%.
Keywords: global steel production, production distribution scenario, steel making process, supply-demand balance
Procedia PDF Downloads 450
2060 Factors Influencing Site Overhead Cost of Construction Projects in Egypt: A Comparative Analysis
Authors: Aya Effat, Ossama A. Hosny, Elkhayam M. Dorra
Abstract:
Estimating costs is a crucial step in construction management and should be completed at the beginning of every project to establish the project's budget. The precision of the cost estimate plays a significant role in the success of construction projects, as it allows project managers to manage the project's costs effectively. Site overhead costs constitute a significant portion of construction project budgets, necessitating accurate prediction and management. These costs are influenced by a multitude of factors, requiring thorough examination and analysis to understand their relative importance and impact. Thus, the main aim of this research is to enhance the contractor's ability to predict and manage site overheads by identifying and analyzing the main factors influencing site overhead costs in the Egyptian construction industry. Through a comprehensive literature review, key factors were first identified and subsequently validated using a thorough comparative analysis of data from 55 real-life construction projects. Through this comparative analysis, the relationship between each factor and the site overheads percentage, as well as between each site overheads subcategory and each project construction phase, was identified and examined. Furthermore, correlation analysis was performed to check for multicollinearity and to identify the factors with the highest impact. The findings of this research offer valuable insights into the key drivers of site overhead costs in the Egyptian construction industry. By understanding these factors, construction professionals can make informed decisions regarding the estimation and management of site overhead costs.
Keywords: comparative analysis, cost estimation, construction management, site overheads
Procedia PDF Downloads 17
2059 Design and Test of a Robust Bearing-Only Target Motion Analysis Algorithm Based on Modified Gain Extended Kalman Filter
Authors: Mohammad Tarek Al Muallim, Ozhan Duzenli, Ceyhun Ilguy
Abstract:
Passive sonar is a method for detecting acoustic signals in the ocean; it detects the acoustic signals emanating from external sources. With passive sonar, we can determine only the bearing of the target, with no information about its range. Target Motion Analysis (TMA) is a process for estimating the position and speed of a target using passive sonar information. Since bearing is the only available information, the technique is called bearing-only TMA. Many TMA techniques have been developed; however, until now there has not been a very effective method that can always track an unknown target and extract its moving trace. In this work, an effective bearing-only TMA algorithm is designed. The measured bearing angles are very noisy; moreover, for a multi-beam sonar, the measurements are quantized due to the sonar beam width. To deal with this, a modified gain extended Kalman filter algorithm is used. The algorithm is fine-tuned, and many modules are added to improve the performance. A special validation gate module is used to ensure the stability of the algorithm. Many indicators of performance and confidence level are designed and tested. A new method to detect whether the target is maneuvering is proposed. Moreover, a reactive optimal observer maneuver based on bearing measurements is proposed, which ensures convergence to the right solution at all times. To test the performance of the proposed TMA algorithm, a simulation is carried out with a MATLAB program. The simulator models a discrete scenario for an observer and a target, and takes into consideration all the practical aspects of the problem, such as a smooth transition in speed, a circular turn of the ship, noisy measurements, and quantized bearing measurements from a multi-beam sonar. The tests were done for many given test scenarios. For all the tests, full tracking was achieved within 10 minutes with very little error: the range estimation error was less than 5%, the speed error less than 5% and the heading error less than 2 degrees. The online performance estimator is mostly aligned with the real performance, and the range estimation confidence level gives a value of 90% when the range error is less than 10%. The experiments show that the proposed TMA algorithm is very robust and has low estimation error; however, the convergence time of the algorithm still needs to be improved.
Keywords: target motion analysis, Kalman filter, passive sonar, bearing-only tracking
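The core of any bearing-only tracker is the nonlinear bearing measurement model and its Jacobian, used in the Kalman update. The sketch below shows a single standard EKF measurement update (not the modified-gain variant described above); the state, covariance and measurement values are hypothetical:

```python
import numpy as np

# Minimal bearing-only EKF measurement update. State: [x, y, vx, vy] of the
# target relative to the observer; the measurement is the bearing angle.
def bearing(state, observer):
    dx, dy = state[0] - observer[0], state[1] - observer[1]
    return np.arctan2(dx, dy)                      # bearing measured from north

def bearing_jacobian(state, observer):
    dx, dy = state[0] - observer[0], state[1] - observer[1]
    r2 = dx * dx + dy * dy
    return np.array([[dy / r2, -dx / r2, 0.0, 0.0]])

def ekf_update(x, P, z, observer, sigma_bearing):
    H = bearing_jacobian(x, observer)
    R = np.array([[sigma_bearing ** 2]])
    innov = np.array([[(z - bearing(x, observer) + np.pi) % (2 * np.pi) - np.pi]])
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + (K @ innov).ravel(), (np.eye(4) - K @ H) @ P

# Hypothetical single update
x0 = np.array([5000.0, 8000.0, -3.0, 1.0])          # metres and m/s, assumed prior
P0 = np.diag([2000.0**2, 2000.0**2, 5.0**2, 5.0**2])
x1, P1 = ekf_update(x0, P0, z=np.deg2rad(33.0),
                    observer=np.array([0.0, 0.0]), sigma_bearing=np.deg2rad(1.0))
print(x1)
```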
Procedia PDF Downloads 402
2058 Estimation of Groundwater Recharge Value for Al-Najaf City, Iraq
Authors: Hayder H. Kareem
Abstract:
Groundwater recharge is a crucial parameter for any groundwater management system. The variability of recharge rates and the difficulty of estimating this factor by direct observation in many cases make estimating the recharge value complex. Various methods exist to estimate groundwater recharge, each with some limitations on its applicability. This paper focuses on a real study area, Al-Najaf City, Iraq. In this city there are few groundwater aquifers, but the aquifer considered in this study is the one closest to the ground surface, the Dibdibba aquifer. According to the Aridity Index estimated in the paper, Al-Najaf City is classified as a region located in an arid climate, which identified the most appropriate method for estimating groundwater recharge as Thornthwaite's formula (Thornthwaite's method). From the calculations, the estimated average groundwater recharge over the period 1980-2014 for Al-Najaf City is 40.32 mm/year. Groundwater recharge directly affects the groundwater table level (groundwater head). Therefore, to verify this recharge value, the MODFLOW program was used to apply it by examining the relationship between the calculated and observed heads: a groundwater model of the Al-Najaf City study area was built in MODFLOW to simulate the area for different purposes, one of which is to simulate groundwater recharge. The MODFLOW results show that this value of groundwater recharge is extremely high and needs to be reduced. Therefore, a further sensitivity test was carried out for the Al-Najaf City study area with MODFLOW by changing the recharge value; the best estimate of the groundwater recharge value for this city was found to be 16.5 mm/year, as this value gives the best fit between the calculated and observed heads, with minimum values of RMSE (13.175%) and RSS (1454 m²).
Keywords: Al-Najaf City, groundwater modelling, recharge estimation, visual MODFLOW
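Thornthwaite-type recharge estimates are usually obtained from a monthly soil-water-balance bookkeeping in which recharge occurs only when rainfall exceeds potential evapotranspiration (PET) and the soil store is already full. The sketch below shows that bookkeeping only; the monthly precipitation, PET and soil capacity values are hypothetical and are not the Al-Najaf data, and the Thornthwaite PET calculation itself is taken as an input:

```python
def annual_recharge(precip_mm, pet_mm, soil_capacity_mm=30.0):
    """Monthly water-balance recharge: surplus above field capacity percolates."""
    store, recharge = 0.0, 0.0
    for p, pet in zip(precip_mm, pet_mm):
        surplus = p - pet
        if surplus <= 0:
            store = max(0.0, store + surplus)    # soil moisture is drawn down
        else:
            store += surplus
            if store > soil_capacity_mm:         # excess above capacity becomes recharge
                recharge += store - soil_capacity_mm
                store = soil_capacity_mm
    return recharge

# Hypothetical arid-climate monthly values (mm), January to December
precip = [80, 60, 20, 8, 2, 0, 0, 0, 1, 5, 25, 70]
pet    = [40, 55, 90, 130, 190, 240, 260, 240, 180, 120, 70, 45]
print(round(annual_recharge(precip, pet), 1), "mm/year")
```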
Procedia PDF Downloads 135
2057 Stature Prediction from Anthropometry of Extremities among Jordanians
Authors: Amal A. Mashali, Omar Eltaweel, Elerian Ekladious
Abstract:
The stature of an individual plays an important role in identification, which is often required in medico-legal practice. The estimation of stature is an important step in the identification of dismembered remains or when only part of a skeleton is available, as in major disasters or cases of mutilation. There are no published anthropometric data for the Jordanian population. The present study was designed to determine the relationship of stature to some anthropometric measures in a sample of the Jordanian population and to determine the most accurate and reliable measure for predicting the stature of an individual. A cross-sectional study was conducted on 336 healthy adult volunteers, free of bone diseases, nutritional diseases and abnormalities of the extremities, after obtaining their consent. Students of the Faculty of Medicine, Mutah University helped in collecting the data. The anthropometric measurements (anatomically defined) were stature, humerus length, hand length and breadth, foot length and breadth, foot index and knee height, on both the right and left sides of the body. The measurements were comparable on both sides of the body in the studied sample. All the anthropometric data showed a significant relation with age except knee height. There was a significant difference between male and female measurements except for the foot index, where F = 0.269. There was a significant positive correlation between the different measures and the stature of the individuals. Three equations were developed for the estimation of stature. The most sensitive measure for predicting stature was found to be the humerus length.
Keywords: foot index, foot length, hand length, humerus length, stature
Procedia PDF Downloads 306