Search results for: interval estimation
2135 Formulation of Extended-Release Gliclazide Tablet Using a Mathematical Model for Estimation of Hypromellose
Authors: Farzad Khajavi, Farzaneh Jalilfar, Faranak Jafari, Leila Shokrani
Abstract:
Gliclazide was formulated as an extended-release tablet in 30 and 60 mg dosage forms using hypromellose (HPMC K4M) as a retarding agent. Drug-release profiles were investigated in comparison with the reference Diamicron MR 30 and 60 mg tablets. The effects of powder particle size, the amount of hypromellose in the formulation, tablet hardness, and halving the tablets on the drug-release profile were investigated. A mathematical model describing hypromellose behaviour in the initial period of drug release was proposed for estimating the hypromellose content of modified-release gliclazide 60 mg tablets. The model is based on the erosion of hypromellose in the dissolution medium and is applicable to describing the release profiles of insoluble drugs. Therefore, from the amount of drug dissolved at early time points and the model, the amount of hypromellose in a formulation can be predicted. The model was used to predict the HPMC K4M content in modified-release gliclazide 30 mg and extended-release quetiapine 200 mg tablets.
Keywords: gliclazide, hypromellose, drug release, modified-release tablet, mathematical model
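The abstract gives the model only qualitatively; the sketch below illustrates, under an assumed zero-order erosion law, how hypromellose content could be back-estimated from early-time dissolution data. The erosion constant and dissolution readings are hypothetical, not the paper's values.

```python
# Hedged sketch: back-estimating hypromellose content from early-time
# dissolution data, assuming a simple zero-order erosion law
#   f(t) ~= (k_e / m_hpmc) * t
# where f is the released drug fraction, k_e an erosion-rate constant (mg/h)
# calibrated on a known formulation, and m_hpmc the HPMC mass per tablet.
# Both k_e and the readings below are hypothetical.
import numpy as np

t = np.array([0.5, 1.0, 1.5, 2.0])          # h, early time points
f = np.array([0.04, 0.082, 0.118, 0.16])    # released drug fraction

slope, _ = np.polyfit(t, f, 1)              # released fraction per hour
k_e = 4.8                                   # mg/h (assumed calibration)
m_hpmc = k_e / slope
print(f"estimated HPMC K4M content: {m_hpmc:.0f} mg per tablet")
```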
Procedia PDF Downloads 222
2134 The Effect of Elapsed Time on the Cardiac Troponin-T Degradation and Its Utility as a Time Since Death Marker in Cases of Death Due to Burn
Authors: Sachil Kumar, Anoop K. Verma, Uma Shankar Singh
Abstract:
Estimating the postmortem interval (PMI) in different causes of death is extremely important, since it often helps an expert form an opinion on the exact cause of death and establish that it was not feigned; evaluation of the body at the crime scene before autopsy is therefore essential. The approach described here analyses the degradation (proteolysis) of a cardiac protein in deaths due to burns as a marker of time since death. Cardiac tissue samples were collected from six medico-legal autopsies (Department of Forensic Medicine and Toxicology, King George's Medical University, Lucknow, India) after informed consent from the relatives, and postmortem degradation was studied by incubating the cardiac tissue at room temperature (20 ± 2 °C) for different periods (~7.30, 18.20, 30.30, 41.20, 41.40, 54.30, 65.20, and 88.40 hours). The cases were burn victims without any prior history of disease who died in hospital and whose exact time of death was known. The analysis involved extraction of the protein, separation by denaturing gel electrophoresis (SDS-PAGE), and visualization by Western blot using cTnT-specific monoclonal antibodies. The band areas within each lane were quantified by scanning and digitizing the image using a Gel Doc system. As postmortem time progresses, the intact cTnT band degrades to fragments that are easily detected by the monoclonal antibodies. A decreasing trend in the level of cTnT (% of intact) was found as the postmortem hours increased. A significant difference was observed between <15 h and other postmortem hours (p<0.01), and significant differences in cTnT level (% of intact) were also observed between 16-25 h and both 56-65 h and >75 h (p<0.01). The Western blot data clearly showed the intact protein at 42 kDa, three major fragments (28 kDa, 30 kDa, 10 kDa), three additional minor fragments (12 kDa, 14 kDa, and 15 kDa), and the formation of low-molecular-weight fragments. Overall, both PMI and the burned cardiac tissue had statistically significant effects; the greatest amount of protein breakdown was observed within the first 41.40 hours, after which the intact protein slowly disappears. If the percent intact cTnT is calculated from the total area integrated within a Western blot lane, it shows a pseudo-first-order relationship when plotted against postmortem time. A strong, significant positive correlation was found between cTnT and postmortem hours (r=0.87, p=0.0001), and the regression analysis showed good explained variability (R²=0.768). The postmortem troponin-T fragmentation observed in this study reveals a sequential, time-dependent process with potential for use as a predictor of PMI in cases of burning.
Keywords: burn, degradation, postmortem interval, troponin-T
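The pseudo-first-order relationship lends itself to a simple decay fit; the minimal sketch below, with hypothetical data in place of the study's measurements, shows how a decay constant and a PMI estimate could be obtained.

```python
# Hedged sketch: fitting a pseudo-first-order decay, intact(t) = 100*exp(-k*t),
# to percent-intact cTnT vs. postmortem hours. Data are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([7.5, 18.2, 30.5, 41.4, 54.5, 65.3, 88.7])   # postmortem hours
intact = np.array([92, 78, 61, 47, 35, 27, 14])           # % intact cTnT

model = lambda t, k: 100.0 * np.exp(-k * t)
(k,), _ = curve_fit(model, t, intact, p0=[0.02])
print(f"decay constant k = {k:.4f} 1/h")

# Inverting the fit gives a PMI estimate from a measured %-intact value:
measured = 40.0
pmi = -np.log(measured / 100.0) / k
print(f"estimated PMI for {measured}% intact: {pmi:.1f} h")
```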
Procedia PDF Downloads 449
2133 Downtime Estimation of Building Structures Using Fuzzy Logic
Authors: M. De Iuliis, O. Kammouh, G. P. Cimellaro, S. Tesfamariam
Abstract:
Community resilience has gained significant attention due to recent unexpected natural and man-made disasters. Resilience is the process of maintaining livable conditions in the event of interruptions to normally available services. Estimating the resilience of systems, ranging from individuals to communities, is a formidable task due to the complexity of the process. The most challenging parameter in resilience assessment is downtime, the time needed for a system to recover its services following a disaster event. Estimating the exact downtime of a system requires many inputs and resources that are not always obtainable. The uncertainties in downtime estimation are usually handled using probabilistic methods, which require large amounts of historical data, and the estimation process also involves ignorance, imprecision, vagueness, and subjective judgment. In this paper, a fuzzy-based approach to estimate the downtime of building structures following earthquake events is proposed. Fuzzy logic can integrate descriptive (linguistic) knowledge and numerical data into the fuzzy system; this allows the use of walk-down surveys, which collect data in linguistic or numerical form, and permits a fast and economical estimation of parameters that involve uncertainties. The first step of the method is to determine the building's vulnerability: a rapid visual screening is designed to acquire information about the analyzed building (e.g., year of construction, structural system, site seismicity). Then, fuzzy logic is implemented using a hierarchical scheme to determine the building's damageability, which is the main ingredient in estimating the downtime. Generally, the downtime can be divided into three main components: downtime due to the actual damage (DT1), downtime caused by rational and irrational delays (DT2), and downtime due to utilities disruption (DT3). In this work, DT1 is computed by relating the damageability results obtained from the visual screening to component repair times already defined in the literature, while DT2 and DT3 are estimated using the REDi™ Guidelines. The downtime of the building is finally obtained by combining the three components, as illustrated in the sketch below. The proposed method also identifies the downtime corresponding to each of the three recovery states: re-occupancy, functional recovery, and full recovery. Future work is aimed at extending the methodology from downtime to building resilience, providing a simple tool that authorities can use for decision making.
Keywords: resilience, restoration, downtime, community resilience, fuzzy logic, recovery, damage, built environment
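A minimal sketch of the damageability-to-downtime step follows; the membership functions, repair times, and combination rule are illustrative assumptions, since the abstract does not specify them.

```python
# Hedged sketch of a fuzzy damageability-to-downtime step. The paper's actual
# membership functions, rule base, and combination rule are not given in the
# abstract; the triangular sets, repair times, and the combination of
# DT1/DT2/DT3 below are illustrative assumptions.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function evaluated at point x."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

damageability = 0.55  # normalized 0..1 output of the hierarchical fuzzy scheme

# Fuzzy damage grades with assumed repair times (days) from the literature
grades = {"slight": (0.0, 0.2, 0.4, 15), "moderate": (0.3, 0.5, 0.7, 90),
          "severe": (0.6, 0.8, 1.0, 240)}

# Weighted-average (Sugeno-style) defuzzification to obtain DT1
w = {g: tri(damageability, a, b, c) for g, (a, b, c, _) in grades.items()}
dt1 = sum(w[g] * rt for g, (_, _, _, rt) in grades.items()) / sum(w.values())

dt2, dt3 = 45.0, 20.0            # delays and utility disruption (assumed)
downtime = dt1 + max(dt2, dt3)   # assumed combination rule
print(f"DT1 = {dt1:.0f} days, total downtime ~ {downtime:.0f} days")
```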
Procedia PDF Downloads 160
2132 Artificial Neural Network and Satellite Derived Chlorophyll Indices for Estimation of Wheat Chlorophyll Content under Rainfed Condition
Authors: Muhammad Naveed Tahir, Wang Yingkuan, Huang Wenjiang, Raheel Osman
Abstract:
Numerous models are used in prediction and decision-making, but most of them are linear, and linear models reach their limitations with non-linearity in the data, making accurate estimation difficult in the natural environment. Artificial neural networks (ANNs) have found extensive acceptance for modeling the complex, non-linear real world, since they have more general and flexible functional forms than traditional statistical methods can effectively provide. The link between information technology and agriculture will become firmer in the near future. Monitoring crop biophysical properties non-destructively can provide a rapid and accurate understanding of a crop's response to various environmental influences. Crop chlorophyll content is an important indicator of crop health and therefore of crop yield. In recent years, remote sensing has been accepted as a robust tool for site-specific management, detecting crop parameters at both local and large scales. The present research combined an ANN model with satellite-derived chlorophyll indices from LANDSAT 8 imagery for real-time wheat chlorophyll estimation. Cloud-free LANDSAT 8 scenes were acquired (February-March, 2016-17) at the same time as a ground-truthing campaign in which chlorophyll was measured with a SPAD-502 meter. Vegetation indices were derived from the LANDSAT 8 imagery using ERDAS Imagine (v.2014) software: the Normalized Difference Vegetation Index (NDVI), Green Normalized Difference Vegetation Index (GNDVI), Chlorophyll Absorption Ratio Index (CARI), Modified Chlorophyll Absorption Ratio Index (MCARI), and Transformed Chlorophyll Absorption Ratio Index (TCARI). For ANN modeling, MATLAB and SPSS tools were used; a multilayer perceptron (MLP) in MATLAB provided very satisfactory results. Of the data, 61.7% was used to train the MLP, 28.3% for validation, and the remaining 10% to evaluate and validate the ANN model results. For error evaluation, the sum-of-squares error and relative error were used: the model summary showed a sum-of-squares error of 10.786 and an average overall relative error of 0.099. MCARI and NDVI were revealed to be the most sensitive indices for assessing wheat chlorophyll content, with the highest coefficients of determination (R² = 0.93 and 0.90, respectively). The results suggest that retrieving crop chlorophyll content from high-spatial-resolution satellite imagery with an ANN model provides an accurate, reliable assessment of crop health status at larger scales, which can help in managing crop nutrition requirements in real time.
Keywords: ANN, chlorophyll content, chlorophyll indices, satellite images, wheat
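As a rough illustration of the workflow (index computation feeding an MLP regressor), the sketch below uses scikit-learn and synthetic reflectances in place of the paper's MATLAB MLP and LANDSAT/SPAD data.

```python
# Hedged sketch: MLP regression from vegetation indices to SPAD-type
# chlorophyll readings. scikit-learn stands in for the paper's MATLAB MLP,
# and the reflectances/targets are synthetic, not LANDSAT/SPAD data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
nir, red, green = rng.uniform(0.2, 0.6, (3, 300))  # synthetic reflectances

ndvi = (nir - red) / (nir + red)           # Normalized Difference VI
gndvi = (nir - green) / (nir + green)      # Green NDVI
X = np.column_stack([ndvi, gndvi])
spad = 20 + 45 * ndvi + 10 * gndvi + rng.normal(0, 1.5, 300)  # synthetic truth

# Roughly the paper's split: 61.7% training, the rest held out
X_tr, X_te, y_tr, y_te = train_test_split(X, spad, train_size=0.617,
                                          random_state=0)
mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
mlp.fit(X_tr, y_tr)
print(f"held-out R^2: {r2_score(y_te, mlp.predict(X_te)):.3f}")
```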
Procedia PDF Downloads 146
2131 Analysis of the Predictive Performance of Value at Risk Estimations in Times of Financial Crisis
Authors: Alexander Marx
Abstract:
Measuring and mitigating market risk is essential for the stability of enterprises, especially for major banking corporations and investment banks. For these risk measurement and mitigation processes, Value at Risk (VaR) is the risk metric most commonly used by practitioners. In past years, significant weaknesses have been seen in the predictive performance of VaR in times of financial market crisis. To address this issue, the purpose of this study is to investigate VaR estimation models and their predictive performance by applying a series of backtesting methods to the stock market indices of the G7 countries (Canada, France, Germany, Italy, Japan, the UK, and the US) and Europe. The study employs parametric, non-parametric, and semi-parametric VaR estimation models and covers three periods spanning the most recent financial market crises: the overall period (2006-2022), the global financial crisis (2008-2009), and the COVID-19 period (2020-2022). Since the regulatory authorities have introduced and mandated Conditional Value at Risk (Expected Shortfall) as an additional regulatory risk management metric, the study also analyzes and compares both risk metrics on their predictive performance.
Keywords: value at risk, financial market risk, banking, quantitative risk management
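A minimal example of the kind of model-plus-backtest pair the study evaluates is sketched below: a rolling parametric (normal) VaR and a Kupiec proportion-of-failures test, run on simulated returns rather than index data.

```python
# Hedged sketch: rolling parametric (normal) 99% VaR plus a Kupiec
# proportion-of-failures backtest. Returns are simulated, not market data.
import numpy as np
from scipy.stats import chi2, norm

rng = np.random.default_rng(1)
returns = rng.standard_t(df=4, size=1500) * 0.01   # heavy-tailed daily returns

alpha, window = 0.99, 250
violations = tested = 0
for t in range(window, len(returns)):
    hist = returns[t - window:t]
    var = -(hist.mean() + norm.ppf(1 - alpha) * hist.std(ddof=1))  # 99% VaR
    tested += 1
    violations += returns[t] < -var

# Kupiec POF: likelihood-ratio statistic ~ chi2(1) under correct coverage
p, x, n = 1 - alpha, violations, tested
lr = -2 * (np.log((1 - p) ** (n - x) * p ** x)
           - np.log((1 - x / n) ** (n - x) * (x / n) ** x))
print(f"violations: {x}/{n}, Kupiec p-value: {1 - chi2.cdf(lr, df=1):.3f}")
```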
Procedia PDF Downloads 94
2130 Repeatable Scalable Business Models: Can Innovation Drive an Entrepreneur's Un-Validated Business Model?
Authors: Paul Ojeaga
Abstract:
Can the level of innovation use drive un-validated business models across regions? To what extent does industrial sector attractiveness drive firms' success across regions at the time of start-up? This study examines the role of innovation in start-up success in six regions of the world (namely Sub-Saharan Africa, the Middle East and North Africa, Latin America, South East Asia Pacific, the European Union, and the United States representing North America) using macroeconomic variables. While there have been studies using firm-level data, results from such studies are not suitable for national policy decisions, and the need to inform regional innovation policy also remains open, providing room for this study. Results using dynamic panel estimation show that innovation counts in the early infancy stage of the new-business life cycle. The results are robust even after controlling for time fixed effects, and the study reports variance-covariance-robust standard errors.
Keywords: industrial economics, un-validated business models, scalable models, entrepreneurship
Procedia PDF Downloads 282
2129 Ultra-High Frequency Passive Radar Coverage for Car Detection in Semi-Urban Scenarios
Authors: Pedro Gómez-del-Hoyo, Jose-Luis Bárcena-Humanes, Nerea del-Rey-Maestre, María-Pilar Jarabo-Amores, David Mata-Moya
Abstract:
A study of the achievable coverage of passive radar systems in terrestrial traffic monitoring applications is presented. The study includes estimation of the bistatic radar cross sections of different commercial vehicle models, which turn out to be challengingly low values that make detection difficult. A semi-urban scenario is selected to evaluate the impact of the excess propagation losses generated by irregular relief. A bistatic passive radar exploiting the UHF frequencies radiated by digital video broadcasting transmitters is assumed. A general method of coverage estimation using electromagnetic simulators in combination with the estimated average bistatic radar cross section of a car is applied. In order to reduce the computational cost, a hybrid solution is implemented, assuming free space for the target-receiver path but estimating the excess propagation losses for the transmitter-target one.
Keywords: bistatic radar cross section, passive radar, propagation losses, radar coverage
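The coverage computation reduces to evaluating the bistatic radar equation over a grid of transmitter-target and target-receiver ranges; the sketch below uses assumed round-number parameters and omits the terrain-dependent excess propagation losses the paper adds via electromagnetic simulation.

```python
# Hedged sketch: bistatic radar SNR over a range grid, the core of a
# coverage estimate. All parameters are assumed values, not the study's.
import numpy as np

pt = 5e3                    # DVB-T transmitter EIRP, W (assumed)
gr = 10 ** (15 / 10)        # receiver gain, 15 dBi (assumed)
lam = 3e8 / 600e6           # wavelength at 600 MHz (UHF DVB-T band)
sigma_b = 10 ** (0 / 10)    # car bistatic RCS, 0 dBsm (assumed, low)
k, t0, b, nf = 1.38e-23, 290, 8e6, 10 ** (6 / 10)  # noise terms, 8 MHz channel

rt = np.linspace(1e3, 30e3, 200)[:, None]   # transmitter-target range, m
rr = np.linspace(1e3, 30e3, 200)[None, :]   # target-receiver range, m

snr = (pt * gr * lam**2 * sigma_b) / ((4 * np.pi) ** 3 * k * t0 * b * nf
                                      * rt**2 * rr**2)
snr_db = 10 * np.log10(snr)
coverage = snr_db > 12.0    # detectable cells for an assumed 12 dB threshold
print(f"covered fraction of the grid: {coverage.mean():.2%}")
```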
Procedia PDF Downloads 336
2128 Estimating the Parameter of the Mean in a Normal Distribution by Maximum Likelihood, Bayes, and Markov Chain Monte Carlo Methods
Authors: Autcha Araveeporn
Abstract:
This paper compares estimation of the mean of a normal distribution by maximum likelihood (ML), Bayes, and Markov chain Monte Carlo (MCMC) methods. The ML estimator is the average of the data; the Bayes estimator is derived from the prior distribution; and the MCMC estimator is approximated by Gibbs sampling from the posterior distribution. After the parameter is estimated by each method, hypothesis testing is used to check the robustness of the estimators. Data are simulated from a normal distribution with a true mean of 2 and a variance of 4, 9, or 16, with sample sizes of 10, 20, 30, and 50. The results show that the ML and MCMC estimates differ perceivably from the true parameter when the sample size is 10 or 20 with variance 16. Furthermore, the Bayes estimator, computed from a prior distribution with mean 1 and variance 12, showed a significant difference in the mean with variance 9 at sample sizes 10 and 20.
Keywords: Bayes method, Markov chain Monte Carlo method, maximum likelihood method, normal distribution
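A minimal sketch of the three estimators follows; the prior mean and variance match the abstract, while the inverse-gamma prior on the variance in the Gibbs step is an added assumption the abstract does not specify.

```python
# Hedged sketch of the three estimators for a normal mean. Prior
# hyperparameters (mean 1, variance 12) follow the abstract; the
# inverse-gamma prior on the variance in the Gibbs step is assumed.
import numpy as np

rng = np.random.default_rng(42)
x = rng.normal(loc=2.0, scale=4.0, size=20)   # true mean 2, variance 16
n, xbar, s2 = len(x), x.mean(), x.var(ddof=1)

# 1) Maximum likelihood: the sample mean
mu_ml = xbar

# 2) Conjugate Bayes with known variance: posterior mean of mu
mu0, tau2, sigma2 = 1.0, 12.0, 16.0
mu_bayes = (mu0 / tau2 + n * xbar / sigma2) / (1 / tau2 + n / sigma2)

# 3) Gibbs sampler for (mu, sigma2) with inverse-gamma(a0, b0) on sigma2
a0, b0 = 2.0, 2.0
mu, sig2, draws = xbar, s2, []
for _ in range(5000):
    post_var = 1 / (1 / tau2 + n / sig2)
    mu = rng.normal(post_var * (mu0 / tau2 + n * xbar / sig2),
                    np.sqrt(post_var))
    b = b0 + 0.5 * np.sum((x - mu) ** 2)
    sig2 = 1 / rng.gamma(a0 + n / 2, 1 / b)   # inverse-gamma draw
    draws.append(mu)
mu_mcmc = np.mean(draws[1000:])               # discard burn-in

print(f"ML {mu_ml:.3f} | Bayes {mu_bayes:.3f} | MCMC {mu_mcmc:.3f}")
```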
Procedia PDF Downloads 356
2127 Adaptation of Hough Transform Algorithm for Text Document Skew Angle Detection
Authors: Kayode A. Olaniyi, Olabanji F. Omotoye, Adeola A. Ogunleye
Abstract:
Skew detection and correction form an important part of digital document analysis, because uncompensated skew can deteriorate document features and complicate further document-image processing steps. Efficient text-document analysis and digitization can rarely be achieved when a document is skewed even at a small angle, so once documents have been digitized through the scanning system and binarized, skew correction is required before further image analysis. Research effort has been put into this area, with algorithms developed to eliminate document skew. Skew-angle correction algorithms can be compared on performance criteria, the most important of which are accuracy of skew-angle detection, range of detectable skew angles, processing speed, computational complexity, and consequently the memory space used. The standard Hough transform has been successfully implemented for text-document skew-angle estimation; however, its accuracy depends largely on how fine a step size is used for the angle, which consumes more time and memory space as accuracy increases, especially when the number of pixels is considerably large. Whenever the Hough transform is used, there is always a tradeoff between accuracy and speed, so a more efficient solution is needed that optimizes space as well as time. In this paper, an improved Hough transform (HT) technique that optimizes space as well as time to robustly detect document skew is presented. The modified algorithm resolves the contradiction between memory space, running time, and accuracy. Our algorithm starts by estimating the angle to zero decimal places using the standard Hough transform, achieving minimal running time and space but limited accuracy. Then, if the estimated angle is x degrees, the basic algorithm is run again over the range around x degrees with an accuracy of one decimal place, and the process is iterated until the desired accuracy is achieved, as sketched below. The skew estimation and correction algorithm is implemented in MATLAB, and memory-space and processing-time estimates are tabulated for skew angles assumed to lie between 0° and 45°. The simulation results in MATLAB show the high performance of our algorithms, with less computational time and memory space used in detecting document skew for a variety of documents with different levels of complexity.
Keywords: Hough transform, skew detection, skew angle, skew correction, text document
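The coarse-to-fine idea can be sketched as follows; this version uses Python/scikit-image rather than the paper's MATLAB implementation, and the peak-selection details are assumptions.

```python
# Hedged sketch of the coarse-to-fine Hough idea: a 1-degree-resolution pass,
# then refined passes restricted to a shrinking window around the estimate.
import numpy as np
from skimage.transform import hough_line, hough_line_peaks

def dominant_angle(binary_img, thetas):
    """Angle (rad) of the strongest Hough peak for the given theta grid."""
    h, angles, dists = hough_line(binary_img, theta=thetas)
    _, peak_angles, _ = hough_line_peaks(h, angles, dists, num_peaks=1)
    return peak_angles[0]

def estimate_skew(binary_img, decimals=2):
    # Coarse pass: whole-degree steps over the assumed 0..45 degree range
    thetas = np.deg2rad(np.arange(0.0, 46.0, 1.0))
    angle = dominant_angle(binary_img, thetas)
    # Refinement: shrink the window and the step by 10x each iteration
    for d in range(1, decimals + 1):
        step = 10.0 ** -d
        lo = np.rad2deg(angle) - 10 * step
        hi = np.rad2deg(angle) + 10 * step
        angle = dominant_angle(binary_img,
                               np.deg2rad(np.arange(lo, hi, step)))
    return np.rad2deg(angle)

# usage: skew = estimate_skew(binary_page); rotate the page by -skew degrees
```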
Procedia PDF Downloads 159
2126 Fatigue Life Estimation Using N-Code for Drive Shaft of Passenger Vehicle
Authors: Tae An Kim, Hyo Lim Kang, Hye Won Han, Seung Ho Han
Abstract:
The drive shaft of a passenger vehicle transmits the engine torque from the gearbox and differential gears to the wheels. It must also compensate for all variations in angle or length resulting from manoeuvring and deflection, to keep the joints perfectly synchronized. Torsional fatigue failures occur frequently at the connections of the spline joints at the end of the drive shaft. In this study, the fatigue life of a passenger-vehicle drive shaft was estimated using finite element analysis. The commercial software n-Code was applied under twisting load conditions of 0-134 kgf·m and 0-188 kgf·m, in which the shear-strain-range versus fatigue-life relationship was taken into account, considering the Signed Shear method, the Smith-Watson-Topper equation, the Neuber and Hoffmann-Seeger methods, a size sensitivity factor, and the surface roughness effect. The estimated fatigue life was verified by a twisting load test of the real drive shaft in a test rig. (Human Resource Training Project for Industry Matched R&D, KIAT, N036200004.)
Keywords: drive shaft, fatigue life estimation, passenger vehicle, shear strain range-fatigue life relationship, torsional fatigue failure
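For the strain-life step, the Smith-Watson-Topper equation can be solved numerically for life; the sketch below uses generic, assumed material constants rather than the shaft's actual properties.

```python
# Hedged sketch: solving the Smith-Watson-Topper (SWT) strain-life equation
#   sigma_max * eps_a * E = sf**2 * (2N)**(2b) + sf * ef * E * (2N)**(b + c)
# for fatigue life N. The material constants are generic steel-like
# placeholders, not the drive shaft's actual properties.
import numpy as np
from scipy.optimize import brentq

E = 206e3                # MPa, Young's modulus
sf, b = 900.0, -0.09     # fatigue strength coefficient/exponent (assumed)
ef, c = 0.50, -0.56      # fatigue ductility coefficient/exponent (assumed)

def swt_residual(n2, sigma_max, eps_a):
    """Residual of the SWT equation at 2N = n2."""
    return (sf**2 * n2**(2 * b) + sf * ef * E * n2**(b + c)
            - sigma_max * eps_a * E)

sigma_max, eps_a = 420.0, 0.0032  # MPa, strain amplitude (assumed loading)
n2 = brentq(swt_residual, 1.0, 1e9, args=(sigma_max, eps_a))
print(f"estimated fatigue life: {n2 / 2:.3g} cycles")
```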
Procedia PDF Downloads 275
2125 A Study of Mode Choice Model Improvement Considering Age Grouping
Authors: Young-Hyun Seo, Hyunwoo Park, Dong-Kyu Kim, Seung-Young Kho
Abstract:
The purpose of this study is to provide an improved mode choice model with parameters that include age grouping for the prime-aged and the elderly. The 2010 Household Travel Survey data were used, and improper samples were removed through analysis. The chosen alternative, date of birth, mode, origin code, destination code, departure time, and arrival time were taken from the Household Travel Survey. By preprocessing the data, travel time, travel cost, mode, and the shares of people aged 45 to 55 years, 55 to 65 years, and over 65 years were calculated. After this preparation, the mode choice model was constructed in LIMDEP by maximum likelihood estimation. A significance test was conducted for nine parameters: three age groups for each of three modes. The test was then repeated for the mode choice model containing the significant parameters together with the travel cost and travel time variables. The model estimation shows that, as age increases, the preference for the car decreases and the preference for the bus increases; the structure of such a model is sketched below. This study is meaningful in that individual and household characteristics are applied to an aggregate model.
Keywords: age grouping, aging, mode choice model, multinomial logit model
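The sketch below shows the multinomial logit structure the abstract describes, with utilities linear in travel time, travel cost, and an age-group share; the coefficients are illustrative, not the estimated LIMDEP values.

```python
# Hedged sketch of a multinomial logit mode-choice probability with an
# age-share term. Coefficients and input values are illustrative.
import numpy as np

modes = ["car", "bus", "rail"]
beta_time, beta_cost = -0.045, -0.0021       # per minute, per currency unit
beta_age65 = {"car": -0.8, "bus": 0.6, "rail": 0.2}  # assumed age effects
asc = {"car": 0.0, "bus": -0.3, "rail": -0.5}        # alternative constants

time = {"car": 25, "bus": 40, "rail": 35}    # minutes
cost = {"car": 3000, "bus": 1200, "rail": 1500}
share_over65 = 0.18                          # share of travellers over 65

v = np.array([asc[m] + beta_time * time[m] + beta_cost * cost[m]
              + beta_age65[m] * share_over65 for m in modes])
p = np.exp(v - v.max()) / np.exp(v - v.max()).sum()  # numerically stable
for m, pm in zip(modes, p):
    print(f"P({m}) = {pm:.3f}")
```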
Procedia PDF Downloads 322
2124 Uncontrollable Inaccuracy in Inverse Problems
Authors: Yu Menshikov
Abstract:
In this paper, the influence of errors in function derivatives at the initial time, obtained by experiment (uncontrollable inaccuracy), on the results of an inverse problem solution was investigated. It was shown that these errors distort the inverse problem solution, as a rule, near the beginning of the interval on which the solution is analyzed. Several methods for removing the influence of uncontrollable inaccuracy are suggested.
Keywords: inverse problems, filtration, uncontrollable inaccuracy
Procedia PDF Downloads 505
2123 Study on Errors in Estimating the 3D Gaze Point for Different Pupil Sizes Using Eye Vergences
Authors: M. Pomianek, M. Piszczek, M. Maciejewski
Abstract:
Binocular eye tracking technology is increasingly used in industry, entertainment, and marketing analysis. In virtual reality, eye tracking systems are already the basis for user interaction with the environment. In such systems, high accuracy in determining the user's eye fixation point is very important due to the specificity of the virtual reality head-mounted display (HMD). Often, however, there are unknown errors in the eye tracking technology used, as well as errors resulting from the positioning of the devices relative to the user's eyes. But can the virtual environment itself influence estimation errors? The paper presents mathematical analyses and empirical studies of the determination of the fixation point and of the errors resulting from the change in pupil size in response to the intensity of the displayed scene. The article covers both static laboratory tests and tests on a real user. Based on the research results, optimization solutions are proposed that reduce gaze estimation errors. The studies show that errors in estimating the fixation point of vision can be minimized both by improving the pupil-positioning algorithm in the video image and by using more precise methods to calibrate the eye tracking system in three-dimensional space.
Keywords: eye tracking, fixation point, pupil size, virtual reality
Procedia PDF Downloads 132
2122 The Current Situation and Perspectives of Electricity Demand and Estimation of Carbon Dioxide Emissions and Efficiency
Abstract:
This article presents the current and future energy situation in Libya. The electric power efficiency and operating hours of power plants are evaluated from 2005 to 2010, and carbon dioxide emissions are estimated for most power plants. In 2005, the efficiency of steam power plants lay in the range of 20% to 28%, while gas turbine power plant efficiency ranged between 9% and 25%, which can be considered low. However, efficiency improvement has been clearly observed in some power plants from 2008 to 2010, especially in the North Benghazi and west Tripoli plants, which were in fact converted to combined cycle. The efficiency of the North Benghazi power plant increased from 25% to 46.6%, while in Tripoli it increased from 22% to 34%. On the other hand, no efficiency improvement was observed in the gas turbine power plants. Relative to the quantity of fuel used, the carbon dioxide emissions from the electricity generation plants were very high. Finally, the energy demand was estimated in terms of the maximum load and the annual load factor (i.e., the ratio between output power and installed power).
Keywords: power plant, efficiency improvement, carbon dioxide emissions, energy situation in Libya
Procedia PDF Downloads 478
2121 Estimation of Reservoir Capacity and Sediment Deposition Using Remote Sensing Data
Authors: Odai Ibrahim Mohammed Al Balasmeh, Tapas Karmaker, Richa Babbar
Abstract:
In this study, reservoir capacity and sediment deposition were estimated using remote sensing data. Satellite images were synchronized with water level and storage capacity to find the change in sediment deposition due to soil erosion and transport by streamflow. The spread area of the water bodies was estimated using vegetation indices, e.g., the normalized difference vegetation index (NDVI) and the normalized difference water index (NDWI). The 3D reservoir bathymetry was modeled by integrating water level, storage capacity, and area. From models of different time spans, the change in reservoir storage capacity was estimated. Another reservoir with known water level, storage capacity, area, and sediment deposition was used to validate the estimation technique. The t-test was used to assess the agreement between observed and estimated reservoir capacity and sediment deposition.
Keywords: satellite data, normalized difference vegetation index, NDVI, normalized difference water index, NDWI, reservoir capacity, sedimentation, t-test hypothesis
Procedia PDF Downloads 167
2120 Estimation of Aquifer Properties Using Pumping Tests: Case Study of Pydibhimavaram Industrial Area, Srikakulam, India
Authors: G. Venkata Rao, P. Kalpana, R. Srinivasa Rao
Abstract:
Adequate and reliable estimates of aquifer parameters are of utmost importance for the proper management of vital groundwater resources. At present, the groundwater is polluted by industrial waste disposed of on the land, and the contaminants are transported in the aquifer from one area to another depending on the characteristics of the aquifer and of the contaminants. To model this contaminant transport, accurate estimation of the aquifer properties is needed. Conventionally, these properties are estimated through pumping tests carried out on water wells, since the occurrence and movement of groundwater in the aquifer are characteristically defined by the aquifer parameters. The pumping (aquifer) test is the standard technique for estimating the various hydraulic properties of aquifer systems, viz. transmissivity (T), hydraulic conductivity (K), storage coefficient (S), etc., for which graphical methods are widely used. The study area for the pumping tests is the Pydibheemavaram industrial area near the coastal belt of Srikakulam, AP, India. The main objective of the present work is to estimate the aquifer properties for developing a contaminant transport model of the study area.
Keywords: aquifer, contaminant transport, hydraulic conductivity, industrial waste, pumping test
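One of the standard graphical analyses the abstract refers to is the Cooper-Jacob straight-line method; a minimal sketch with synthetic drawdown data is given below.

```python
# Hedged sketch: estimating transmissivity T and storativity S from drawdown
# data with the Cooper-Jacob straight-line method. Data are synthetic.
import numpy as np

Q = 0.005    # pumping rate, m^3/s (assumed)
r = 30.0     # distance from pumped well to observation well, m (assumed)

t = np.array([60, 120, 300, 600, 1200, 3600.0])     # time since pumping, s
s = np.array([0.12, 0.19, 0.28, 0.35, 0.42, 0.53])  # drawdown, m (synthetic)

# Straight-line fit of drawdown against log10(time)
slope, intercept = np.polyfit(np.log10(t), s, 1)
delta_s = slope                          # drawdown change per log cycle
T = 2.303 * Q / (4 * np.pi * delta_s)    # transmissivity, m^2/s
t0 = 10 ** (-intercept / slope)          # time-axis intercept where s = 0
S = 2.25 * T * t0 / r**2                 # storage coefficient
print(f"T = {T:.2e} m^2/s, S = {S:.2e}")
```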
Procedia PDF Downloads 446
2119 Estimation of Emanation Properties of Kimberlites and Host Rocks of Lomonosov Diamond Deposit in Russia
Authors: E. Yu. Yakovlev, A. V. Puchkov
Abstract:
The study is devoted to experimental work on the assessment of the emanation properties of the kimberlites and host rocks of the Lomonosov diamond deposit of the Arkhangelsk diamondiferous province. The aim of the study is to estimate the factors influencing the formation of the radon field over kimberlite pipes. For the various rock types composing the kimberlite pipe and the near-pipe space, the following parameters were measured: porosity, density, radium-226 activity, free radon activity, and emanation coefficient. The results showed that the largest amount of free radon is produced by the rocks of the near-pipe space, which are the Vendian host deposits and are characterized by high values of emanation coefficient, radium activity, and porosity. The lowest values of these parameters are characteristic of the vent-facies kimberlites, which limits the build-up of free radon activity in the body of the pipe. The results of this experimental work confirm the prospects of using emanation methods for prospecting kimberlite pipes.
Keywords: emanation coefficient, kimberlites, porosity, radon volumetric activity
Procedia PDF Downloads 139
2118 Automatic Post Stroke Detection from Computed Tomography Images
Authors: C. Gopi Jinimole, A. Harsha
Abstract:
For detecting strokes, computed tomography (CT) is the preferred modality for imaging abnormalities or infarction in the brain. Because of the problems with the window settings used to evaluate brain CT images, they perform very poorly in early-stage infarction detection. This paper presents an automatic estimation method for the window settings of CT images to give proper contrast to hyper-infarction present in the brain. In the proposed work, the window width is estimated automatically for each slice, and the window centre is changed to a new value of 31 HU, which is the average of the HU values of grey matter and white matter in the brain. The automatic window-width estimation is based on the average of the median of statistical central moments. With the newly suggested window centre and the estimated window width, the hyper-infarction or post-stroke regions in CT brain images are properly detected. The proposed approach assists radiologists in CT evaluation of early quantitative signs of delayed stroke, so that the severe hemorrhage it can lead to may be prevented by providing timely medication to patients.
Keywords: computed tomography (CT), hyper-infarction or post-stroke region, Hounsfield unit (HU), window centre (WC), window width (WW)
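The windowing step itself is straightforward once WC and WW are chosen; the sketch below fixes WC at the abstract's 31 HU and uses a simple placeholder statistic for the per-slice width, not the paper's moment-based rule.

```python
# Hedged sketch of CT intensity windowing with the abstract's fixed window
# centre of 31 HU. The per-slice width estimate is a simple placeholder,
# not the paper's central-moment rule.
import numpy as np

def apply_window(slice_hu, wc=31.0, ww=80.0):
    """Map HU values to 0..255 display grey levels for a given WC/WW."""
    lo, hi = wc - ww / 2.0, wc + ww / 2.0
    out = np.clip(slice_hu, lo, hi)
    return ((out - lo) / (hi - lo) * 255.0).astype(np.uint8)

def estimate_ww(slice_hu):
    # Placeholder: spread of brain-tissue HU (0..80 HU) as a width proxy
    brain = slice_hu[(slice_hu > 0) & (slice_hu < 80)]
    return max(40.0, 4.0 * brain.std())

# usage with a synthetic slice standing in for a real CT slice
slice_hu = np.random.default_rng(0).normal(31, 12, (256, 256))
display = apply_window(slice_hu, wc=31.0, ww=estimate_ww(slice_hu))
```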
Procedia PDF Downloads 203
2117 Satellite Derived Evapotranspiration and Turbulent Heat Fluxes Using Surface Energy Balance System (SEBS)
Authors: Muhammad Tayyab Afzal, Muhammad Arslan, Mirza Muhammad Waqar
Abstract:
One of the key components of the water cycle is evapotranspiration (ET), which represents water consumption by vegetated and non-vegetated surfaces. Conventional techniques for measuring ET are point-based and representative of the local scale only. Satellite remote sensing data, with large area coverage and high temporal frequency, provide representative measurements of several relevant biophysical parameters required for the estimation of ET at regional scales. The objective of this research is to exploit satellite data in order to estimate evapotranspiration. This study uses the Surface Energy Balance System (SEBS) model to calculate daily actual evapotranspiration (ETa) in Larkana District, Sindh, Pakistan, using Landsat TM data for cloud-free days. As there is no flux tower in the study area for direct measurement of the latent heat flux (evapotranspiration) or the sensible heat flux, the model-estimated values of ET were compared with the reference evapotranspiration (ETo) computed by the FAO-56 Penman-Monteith method using meteorological data. For a country like Pakistan, irrigated agriculture in the river basins is the largest user of fresh water. For better assessment and management of irrigation water requirements, estimating the consumptive use of water for agriculture is very important, as agriculture is the main consumer of water. ET is also an essential term in the water balance, since a major share of irrigation water and precipitation on cropland is lost through it; its accurate estimation can therefore help in the efficient management of irrigation water. The results of this study can be used to analyse surface conditions, i.e., temperature, energy budgets, and related characteristics; with this information, vegetation health and suitable agricultural conditions can be monitored, and steps can be taken to increase agricultural production.
Keywords: SEBS, remote sensing, evapotranspiration, ETa
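At its core, SEBS-style retrieval computes the latent heat flux as the energy-balance residual and scales it to a daily ET depth; the sketch below uses assumed flux values and omits the satellite-driven computation of the sensible heat flux.

```python
# Hedged sketch of the surface energy balance step behind SEBS-style ET:
# latent heat flux as the residual LE = Rn - G - H, converted to daily ET.
# Flux values are illustrative; a real SEBS run derives H from satellite
# land-surface temperature and roughness, which is not shown here.
Rn = 520.0   # net radiation, W/m^2 (midday, assumed)
G = 60.0     # soil heat flux, W/m^2 (assumed)
H = 180.0    # sensible heat flux, W/m^2 (assumed; SEBS computes this)

LE = Rn - G - H           # latent heat flux, W/m^2
ef = LE / (Rn - G)        # evaporative fraction

# Scale an assumed daily-average available energy to an ET depth (mm/day):
Rn24_minus_G = 160.0      # W/m^2, daily mean available energy (assumed)
lambda_v = 2.45e6         # J/kg, latent heat of vaporization
et_daily = ef * Rn24_minus_G / lambda_v * 86400.0  # kg/m^2/day == mm/day
print(f"evaporative fraction {ef:.2f}, ETa ~ {et_daily:.1f} mm/day")
```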
Procedia PDF Downloads 333
2116 Dose Evaluations with SNAP/RADTRAD for Loss of Coolant Accidents in a BWR6 Nuclear Power Plant
Authors: Kai Chun Yang, Shao-Wen Chen, Jong-Rong Wang, Chunkuan Shih, Jung-Hua Yang, Hsiung-Chih Chen, Wen-Sheng Hsu
Abstract:
In this study, we build a Symbolic Nuclear Analysis Package / RADionuclide Transport, Removal And Dose Estimation (SNAP/RADTRAD) model of the Kuosheng Nuclear Power Plant based on the Final Safety Analysis Report (FSAR) and other plant data. It is used to estimate the radiation dose at the Exclusion Area Boundary (EAB), the Low Population Zone (LPZ), and the control room following the 'release from the containment' case in a loss-of-coolant accident (LOCA). The RADTRAD analysis shows that the evaluated doses at the EAB, the LPZ, and the control room are close to the FSAR data, and all of the doses are lower than the regulatory limits. Finally, we perform a sensitivity analysis and observe that the evaluated doses increase as the control-room intake rate increases.
Keywords: RADTRAD, radionuclide transport, removal and dose estimation, SNAP, symbolic nuclear analysis package, boiling water reactor, NPP, Kuosheng
Procedia PDF Downloads 343
2115 An Audit on the Quality of Pre-Operative Intra-Oral Digital Radiographs Taken for Dental Extractions in a General Practice Setting
Authors: Gabrielle O'Donoghue
Abstract:
Background: Pre-operative radiographs facilitate assessment and treatment planning in minor oral surgery. Quality assurance for dental radiography advocates the As Low As Reasonably Achievable (ALARA) principle in collecting accurate diagnostic information. Aims: To audit the quality of digital intraoral periapicals (IOPAs) taken prior to dental extractions in a metropolitan general dental practice setting. Standards: The National Radiological Protection Board (NRPB) guidance outlines three grades of radiograph quality: excellent (Grade 1, > 70% of total exposures), diagnostically acceptable (Grade 2, < 20%), and unacceptable (Grade 3, < 10%). Methodology: A study of pre-operative radiographs taken prior to dental extractions by 44 practitioners across 12 private general dental practices in a large metropolitan area. A total of 725 extractions were assessed, allowing 258 IOPAs to be reviewed in one audit cycle. Results: First cycle: of 258 IOPAs, 223 (86.4%) scored Grade 1, 27 (10.5%) Grade 2, and 8 (3.1%) Grade 3. The standard was met. 35 dental extractions were performed without an available pre-operative radiograph. Action Plan & Recommendations: Results were distributed to all staff, and a continuing professional development evening was organized to outline recommendations for improving image quality. A second audit cycle is proposed at a six-month interval to review the recommendations and appraise the results. Conclusion: The overall standard of radiographs met the published guidelines. A significant reduction in the number of procedures undertaken without pre-operative imaging is expected at the six-month interval. An investigation into non-diagnostic imaging and associated adverse patient outcomes is being considered. Maintenance of the standards achieved is predicted in the second audit cycle to ensure consistently high-quality imaging.
Keywords: audit, oral radiology, oral surgery, periapical radiographs, quality assurance
Procedia PDF Downloads 166
2114 Use of Multistage Transition Regression Models for Credit Card Income Prediction
Authors: Denys Osipenko, Jonathan Crook
Abstract:
Because of the variety of cardholders' behaviour types and income sources, each consumer account can move among a variety of states: inactive, transactor, revolver, delinquent, or defaulted, and each state requires its own model for income prediction. Estimating the transition probabilities between statuses at the account level helps to avoid the memorylessness of the Markov chain approach. This paper investigates transition-probability estimation approaches for credit card income prediction at the account level. The key question of the empirical research is which approach gives more accurate results: multinomial logistic regression, or multistage conditional logistic regression with a binary target. Both models have shown moderate predictive power. The prediction accuracy of the multistage approach depends on the order of stages chosen for the conditional binary logistic regressions. On the other hand, multinomial logistic regression is easier to use and gives integrated estimates for all states without prioritization. Further investigations can therefore concentrate on alternative modeling approaches such as discrete choice models.
Keywords: multinomial regression, conditional logistic regression, credit account state, transition probability
Procedia PDF Downloads 487
2113 Determining the Width and Depth of Cut in Milling on the Basis of a Multi-Dexel Model
Authors: Jens Friedrich, Matthias A. Gebele, Armin Lechler, Alexander Verl
Abstract:
Chatter vibrations and process instabilities are the most important factors limiting the productivity of the milling process. Chatter can lead to damage of the tool, the part, or the machine tool; therefore, the estimation and prediction of process stability is very important. The process stability depends on the spindle speed, the depth of cut, and the width of cut. In milling, the process conditions are defined in the NC program: while the spindle speed is directly coded in the NC program, the depth and width of cut are unknown. This paper presents a new simulation-based approach for predicting the depth and width of cut of a milling process. The prediction is based on a material removal simulation with an analytically represented tool shape and a multi-dexel approach for the workpiece. The new calculation method allows the direct estimation of the depth and width of cut, which are the influencing parameters of process stability, instead of the removed volume that existing approaches estimate. This knowledge can be used to predict the stability of new, unknown parts. Moreover, with an additional vibration sensor, the stability lobe diagram of a milling process can be estimated and improved based on the estimated depth and width of cut.
Keywords: dexel, process stability, material removal, milling
Procedia PDF Downloads 525
2112 Multiple Linear Regression for Rapid Estimation of Subsurface Resistivity from Apparent Resistivity Measurements
Authors: Sabiu Bala Muhammad, Rosli Saad
Abstract:
Multiple linear regression (MLR) models for the fast estimation of true subsurface resistivity from apparent resistivity field measurements are developed and assessed in this study. The parameters investigated were apparent resistivity (ρₐ), horizontal location (X), and depth (Z) of measurement as the independent variables, and true resistivity (ρₜ) as the dependent variable. To achieve linearity in both resistivity variables, the datasets were first transformed into the logarithmic domain, following diagnostic checks of the normality of the dependent variable and of heteroscedasticity, to ensure accurate models. Four MLR models were developed based on hierarchical combinations of the independent variables. The generated MLR coefficients were applied to another dataset to estimate ρₜ values for validation. Contours of the estimated ρₜ values were plotted and compared with the observed data plots, at the same colour scale and blanking, for visual assessment. The accuracy of the models was assessed using the coefficient of determination (R²), the standard error (SE), and the weighted mean absolute percentage error (wMAPE). It is concluded that the MLR models can estimate ρₜ with a high level of accuracy.
Keywords: apparent resistivity, depth, horizontal location, multiple linear regression, true resistivity
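A minimal sketch of the log-domain regression, using the fullest of the four variable combinations, is given below with synthetic data standing in for the field measurements.

```python
# Hedged sketch of the log-domain MLR: log10(rho_t) regressed on log10(rho_a),
# X, and Z. Data are synthetic stand-ins for inversion/field results.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(3)
n = 400
rho_a = 10 ** rng.uniform(0.5, 3.0, n)   # apparent resistivity, ohm-m
X = rng.uniform(0, 200, n)               # horizontal location, m
Z = rng.uniform(0.5, 30, n)              # depth, m
# Synthetic "true" resistivity loosely tied to the predictors
rho_t = 10 ** (0.2 + 0.9 * np.log10(rho_a) + 0.002 * Z
               + rng.normal(0, 0.05, n))

features = np.column_stack([np.log10(rho_a), X, Z])
target = np.log10(rho_t)

mlr = LinearRegression().fit(features, target)
pred = 10 ** mlr.predict(features)       # back-transform to ohm-m
print(f"R^2 (log domain): {r2_score(target, mlr.predict(features)):.3f}")
print("coefficients [log10(rho_a), X, Z]:", np.round(mlr.coef_, 4))
```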
Procedia PDF Downloads 276
2111 Effects of Irrigation Scheduling and Soil Management on Maize (Zea mays L.) Yield in Guinea Savannah Zone of Nigeria
Authors: I. Alhassan, A. M. Saddiq, A. G. Gashua, K. K. Gwio-Kura
Abstract:
The main objective of any irrigation program is the development of an efficient water management system that sustains crop growth and development and avoids physiological water stress in the growing plants. A field experiment to evaluate the effects of some soil moisture conservation practices on the yield and water use efficiency (WUE) of maize was carried out in three locations (Mubi and Yola in the northern Guinea savannah, and Ganye in the southern Guinea savannah of Adamawa State, Nigeria) during the dry seasons of 2013 and 2014. The experiment consisted of three irrigation levels (7-, 10-, and 12-day irrigation intervals), two mulch levels (mulched and un-mulched), and two tillage practices (no tillage and minimum tillage), arranged in a randomized complete block design with a split-split plot arrangement and replicated three times. The Blaney-Criddle method was used for estimating crop evapotranspiration. The results indicated that the seven-day irrigation interval and the mulched treatment had significant effects (P < 0.05) on grain yield and water use efficiency in all locations. The main effect of tillage on grain yield and WUE was non-significant (P > 0.05). The interaction effects of irrigation and mulch on grain yield and WUE were significant (P < 0.05) at Mubi and Yola. Generally, higher grain yield and WUE were recorded under mulching with seven-day irrigation intervals, whereas lower values were recorded un-mulched with 12-day irrigation intervals. Tillage exerted little influence on yield and WUE. Results from Ganye were generally higher than those recorded at Mubi and Yola; they also showed that an irrigation interval of 10 days with mulching could be adopted for the Ganye area, while a seven-day interval is more appropriate for Mubi and Yola.
Keywords: irrigation, maize, mulching, tillage, savanna
Procedia PDF Downloads 215
2110 Maximum Deformation Estimation for Reinforced Concrete Buildings Using Equivalent Linearization Method
Authors: Chien-Kuo Chiu
Abstract:
In displacement-based seismic design and evaluation, the equivalent linearization method is one of the approximate methods for estimating the maximum inelastic displacement response of a system. In this study, the accuracy of two equivalent linearization methods is investigated. The investigation covers three soil conditions in Taiwan (Taipei Basin 1, 2, and 3) and five building heights (H_r = 10, 20, 30, 40, and 50 m). The first method is the Taiwan equivalent linearization method (TELM), which was proposed based on the Japanese equivalent linearization method with the modification factor α_T = 0.85. The second method is proposed on the basis of the study of Lin and Miranda, with some modifications for Taiwan soil conditions. This study shows that the Taiwanese equivalent linearization method gives better estimates than the modified Lin and Miranda method (MLM): the error indices for the Taiwanese method are 16%, 13%, and 12% for Taipei Basin 1, 2, and 3, respectively. Furthermore, a ductility demand spectrum of a single-degree-of-freedom (SDOF) system is presented as a guide for engineers to estimate the ductility demand of a structure.
Keywords: displacement-based design, ductility demand spectrum, equivalent linearization method, RC buildings, single-degree-of-freedom
Procedia PDF Downloads 162
2109 The Effect of Institutions on Economic Growth: An Analysis Based on Bayesian Panel Data Estimation
Authors: Mohammad Anwar, Shah Waliullah
Abstract:
This study investigated panel data regression models. Bayesian and classical methods were used to study the impact of institutions on economic growth with data from 1990-2014, especially in developing countries. Under both the classical and the Bayesian methodology, two panel data models were estimated: common effects and fixed effects. For the Bayesian approach, prior information is used, with a normal-gamma prior for the panel data models, and the analysis was done with the WinBUGS14 software. The estimated results showed that the panel data models are valid models under the Bayesian methodology. In the Bayesian approach, all independent variables had positive and significant effects on the dependent variable. Based on the standard errors of all the models, the fixed effect model is the best model in the Bayesian estimation of panel data models, as it was shown to have the lowest standard error compared to the other models.
Keywords: Bayesian approach, common effect, fixed effect, random effect, dynamic random effect model
Procedia PDF Downloads 68
2108 Least Squares Solution for Linear Quadratic Gaussian Problem with Stochastic Approximation Approach
Authors: Sie Long Kek, Wah June Leong, Kok Lay Teo
Abstract:
The linear quadratic Gaussian (LQG) model is a standard mathematical model for the stochastic optimal control problem. The combination of the linear quadratic estimator and the linear quadratic regulator allows the state estimation and the optimal control policy to be designed separately; this is known as the separation principle. In this paper, an efficient computational method is proposed to solve the LQG problem. In our approach, the Hamiltonian function is defined and the necessary conditions are derived. In addition, the output error is defined and a least-squares optimization problem is introduced. From the first-order necessary condition, the gradient of the sum of squares of the output error is established. On this basis, a stochastic approximation approach is employed to update the optimal control policy, and the iteration is stopped once a given tolerance is met, yielding the optimal solution of the LQG problem. For illustration, an example of the LQG problem is studied; the result shows the efficiency of the proposed approach and demonstrates its applicability to solving the LQG problem.
Keywords: iteration procedure, least squares solution, linear quadratic Gaussian, output error, stochastic approximation
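The update at the heart of the approach is a stochastic-approximation (Robbins-Monro type) descent on the output-error sum of squares; the sketch below applies it to a toy scalar system with a finite-difference gradient, an illustrative stand-in for the paper's LQG example and gradient derivation.

```python
# Hedged sketch of a stochastic-approximation update on a feedback gain:
#   theta_{k+1} = theta_k - a_k * g_k,
# where g_k is a noisy finite-difference gradient of the summed squared
# output error. The scalar plant is a toy stand-in, not the paper's example.
import numpy as np

rng = np.random.default_rng(7)

def output_error(theta, steps=200):
    """Sum of squared outputs/inputs under feedback u = -theta*x with noise."""
    x, cost = 1.0, 0.0
    for _ in range(steps):
        u = -theta * x
        x = 0.95 * x + 0.1 * u + 0.01 * rng.normal()
        cost += x**2 + 0.1 * u**2
    return cost

theta, delta = 0.0, 0.05
for k in range(1, 201):
    a_k = 0.1 / k   # Robbins-Monro step sizes (sum diverges, squares converge)
    g = (output_error(theta + delta)
         - output_error(theta - delta)) / (2 * delta)
    theta -= a_k * g
print(f"feedback gain after 200 iterations: {theta:.3f}")
```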
Procedia PDF Downloads 187
2107 Image Features Comparison-Based Position Estimation Method Using a Camera Sensor
Authors: Jinseon Song, Yongwan Park
Abstract:
In this paper, we propose a method that can estimate a user's position based on a database built from a single camera. Previous positioning approaches calculate distance from the arrival time of signals, as in GPS (Global Positioning System) or RF (radio frequency) systems; however, these methods have the weakness of a large error range under signal interference. A camera sensor offers a solution, but a single camera has difficulty obtaining relative position data, and a stereo camera has difficulty providing real-time position data because of the large amount of image data involved. First, we build an image database of a space in which the positioning service can be provided with a single camera. Next, we judge similarity through image matching between the database images and the image transmitted by the user. Finally, we determine the user's position from the position of the most similar database image. To verify the proposed method, we experimented in real indoor and outdoor environments. The proposed method has a wide positioning range and can determine not only the user's position but also the direction.
Keywords: positioning, distance, camera, features, SURF (Speeded-Up Robust Features), database, estimation
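The retrieval step (match the user's image to the database, take the best entry's surveyed position) is sketched below; ORB is used in place of SURF, which requires opencv-contrib, and the file names and positions are hypothetical.

```python
# Hedged sketch of the image-retrieval step: match the user's frame against a
# small database and adopt the most similar entry's surveyed position. ORB
# stands in for SURF; the file names and positions are hypothetical.
import cv2

orb = cv2.ORB_create(nfeatures=1000)
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

database = [("db_img_01.png", (12.0, 3.5)),   # (image file, surveyed x, y)
            ("db_img_02.png", (15.0, 3.5))]

query = cv2.imread("user_frame.png", cv2.IMREAD_GRAYSCALE)
_, q_desc = orb.detectAndCompute(query, None)

def match_score(path):
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    _, desc = orb.detectAndCompute(img, None)
    matches = bf.match(q_desc, desc)
    # more matches with smaller average distance => more similar
    avg_dist = sum(m.distance for m in matches) / max(len(matches), 1)
    return len(matches) / (1.0 + avg_dist)

best_path, best_pos = max(database, key=lambda e: match_score(e[0]))
print(f"estimated user position: {best_pos} (from {best_path})")
```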
Procedia PDF Downloads 349
2106 Roles of Aquatic Plants on Erosion Relief of Stream Bed
Authors: Jin-Hong Kim
Abstract:
The roles of vegetation in mitigating stream-bed erosion or facilitating the deposition of fine sediments were examined for several species of aquatic plants. Field investigations were performed to estimate the change in bed level and the flow characteristics. The results showed that Phragmites japonica mitigates 0.3-0.4 m of erosion in the vegetated region at flow velocities higher than 1.0 m/s, and Phragmites communis mitigates 0.2-0.3 m of erosion at flow velocities higher than 0.7 m/s. Salix gracilistyla plays a greater role than Phragmites japonica and Phragmites communis in sustaining a stable channel, mitigating 0.4-0.5 m of erosion at flow velocities higher than 1.4 m/s. Miscanthus sacchariflorus plays a weaker role than Phragmites japonica and Salix gracilistyla, but it still helps to sustain a stable bed. These results show that vegetation is effective in mitigating erosion of, or facilitating deposition on, the stream bed.
Keywords: aquatic plants, Phragmites japonica, Phragmites communis, Salix gracilistyla
Procedia PDF Downloads 385