Search results for: loss estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5125

4405 Formulation of Extended-Release Gliclazide Tablet Using a Mathematical Model for Estimation of Hypromellose

Authors: Farzad Khajavi, Farzaneh Jalilfar, Faranak Jafari, Leila Shokrani

Abstract:

Gliclazide was formulated as extended-release tablets in 30 and 60 mg dosage strengths using hypromellose (HPMC K4M) as a retarding agent. Drug-release profiles were compared with those of the reference Diamicron MR 30 and 60 mg tablets. The effects of powder particle size, the amount of hypromellose in the formulation, tablet hardness, and halving the tablets on the drug-release profile were investigated. A mathematical model describing hypromellose behavior during the initial period of drug release was proposed for estimating the hypromellose content of the modified-release gliclazide 60 mg tablet. This model is based on erosion of hypromellose in the dissolution medium and is applicable to the release profiles of insoluble drugs. Therefore, from the amount of drug dissolved at early time points and the model, the amount of hypromellose in a formulation can be predicted. The model was used to predict the HPMC K4M content of modified-release gliclazide 30 mg and extended-release quetiapine 200 mg tablets.
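The abstract does not give the model's equations, but the idea of inferring HPMC content from early-time release data can be sketched. The linear early-time release law and the calibration constant `a` below are illustrative assumptions, not the authors' model:

```python
# Hypothetical erosion-based sketch: early-time release is assumed linear,
# f(t) = k * t, with the erosion rate k tied to HPMC content through a
# calibration constant a (k = a / m_hpmc). Both a and the data are made up.

def fit_rate(times, released):
    # least-squares slope through the origin: k = sum(t*f) / sum(t*t)
    return sum(t * f for t, f in zip(times, released)) / sum(t * t for t in times)

def estimate_hpmc(times, released, a=2.4):
    # invert the hypothetical calibration k = a / m_hpmc
    return a / fit_rate(times, released)

times = [0.5, 1.0, 1.5, 2.0]          # hours
released = [0.04, 0.08, 0.12, 0.16]   # fraction of dose dissolved
k = fit_rate(times, released)          # 0.08 per hour
m = estimate_hpmc(times, released)     # 2.4 / 0.08 = 30 (arbitrary mass units)
```

In practice the calibration would have to be established from formulations of known HPMC content, as the abstract implies was done with the 60 mg tablet.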

Keywords: Gliclazide, hypromellose, drug release, modified-release tablet, mathematical model

Procedia PDF Downloads 201
4404 Weight Loss and Symptom Improvement in Women with Secondary Lymphedema Using Semaglutide

Authors: Shivani Thakur, Jasmin Dominguez Cervantes, Ahmed Zabiba, Fatima Zabiba, Sandhini Agarwal, Kamalpreet Kaur, Hussein Maatouk, Shae Chand, Omar Madriz, Tiffany Huang, Saloni Bansal

Abstract:

The prevalence of lymphedema among women in rural communities highlights the importance of developing effective treatment and prevention methods. Subjects with secondary lymphedema in California’s Central Valley were surveyed at six surgical clinics to assess demographics and symptoms of lymphedema. Additionally, subjects on semaglutide treatment for obesity and/or T2DM were monitored for diabetes management, weight-loss progress, and lymphedema symptoms, and compared with subjects not treated with semaglutide. Subjects were followed for 12 months. Those treated with semaglutide completed a pre-treatment questionnaire and follow-up post-treatment questionnaires at 3, 6, 9, and 12 months, along with medical assessment; untreated subjects completed similar questionnaires. The questionnaires investigated subjective feelings regarding lymphedema symptoms and management using a Likert scale; quantitative leg measurements were collected, and blood work was reviewed at these appointments. Paired-difference t-tests, chi-squared tests, and independent-sample t-tests were performed. Fifty subjects aged 18-75 years completed the surveys evaluating secondary lymphedema: 90% female, 69% Hispanic, 45% Spanish speaking, 42% disabled, 57% employed, 54% with income below 30 thousand dollars, and an average BMI of 40. In both the treatment and non-treatment groups, the most common symptoms were leg swelling (x̄=3.2, SD=1.3), leg pain (x̄=3.2, SD=1.6), loss of daily function (x̄=3, SD=1.4), and negative body image (x̄=4.4, SD=0.54).
Subjects with more than 3 months of semaglutide treatment, compared with the untreated group, demonstrated greater weight loss (55% of treated subjects achieved a 10% weight loss vs. 3% of untreated subjects; average BMI reduction of 11% vs. 2.5%, p<0.05) and improved subjective feelings about their lymphedema symptoms: leg swelling (x̄=2.4, SD=0.45 vs. x̄=3.2, SD=1.3, p<0.05), leg pain (x̄=2.2, SD=0.45 vs. x̄=3.2, SD=1.6, p<0.05), and heaviness (x̄=2.2, SD=0.45 vs. x̄=3, SD=1.56, p<0.05). Improvement in diabetes management was demonstrated by an average 0.9% decrease in A1C values in the treated group compared with 0.1% in the untreated group (p<0.05). Treated subjects also showed a 6 cm decrease in the circumference of the leg, knee, calf, and ankle compared with 2 cm in untreated subjects (p<0.05). Semaglutide significantly improved weight loss, T2DM management, leg circumference, and the functional, physical, and psychosocial symptoms of secondary lymphedema.
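The paired-difference t-test used in this analysis can be illustrated with a minimal pure-Python sketch; the Likert scores below are made up for the example, not the study's data:

```python
import math

def paired_t(pre, post):
    # paired-difference t statistic: t = mean(d) / (sd(d) / sqrt(n)),
    # where d are the within-subject differences
    d = [a - b for a, b in zip(pre, post)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# illustrative Likert scores (1-5) before and after treatment
pre  = [4, 5, 3, 4, 5]
post = [2, 3, 2, 3, 2]
t = paired_t(pre, post)   # ~4.81 with n-1 = 4 degrees of freedom
```

The statistic is then compared against the t distribution with n-1 degrees of freedom to obtain the p-value.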

Keywords: diabetes, secondary lymphedema, semaglutide, obesity

Procedia PDF Downloads 43
4403 Tranexamic Acid in Orthopedic Surgery in Children

Authors: K. Amanzoui, A. Erragh, M. Elharit, A. Afif, K. Elfakhr, S. Kalouch, A. Chlilek

Abstract:

Orthopedic surgery is a source of intra- and postoperative bleeding, exposing patients to several risks, and various measures, known as transfusion-sparing methods, have been proposed to reduce bleeding during surgery, including tranexamic acid, which has shown its effectiveness in numerous studies. A prospective analytical study of 50 children was carried out in the orthopedic traumatology operating room of the EL HAROUCHI hospital of the CHU IBN ROCHD in Casablanca over a period of six months (April to October 2022). Patients were randomized into two groups: one receiving tranexamic acid (group A) and a control group not receiving it (group B). The average age was 10.3 years, and 58.8% were female. The most common type of surgery was thoracolumbar scoliosis correction (52%). The average preoperative hemoglobin was 12.28 g/dl in group A, against 12.67 g/dl in the control group, with no significant difference between the two groups (p=0.148). Mean intraoperative bleeding was 396.29 ml in group A versus 412 ml in the control group; no significant difference was observed for this parameter (p=0.632). The average hemoglobin level in the immediate postoperative period was 10.2 g/dl: 10.95 g/dl in group A versus 10.93 g/dl in group B. At 24 hours postoperatively, the mean hemoglobin value was 10.29 g/dl in group A against 9.5 g/dl in group B. The blood loss recorded during the first 24 hours was 209.43 ml in group A, against 372 ml in group B, a significant difference between the two groups (p=0.001). There was no statistically significant difference between the two groups in the use of vascular filling, ephedrine, or intraoperative transfusion, whereas for postoperative transfusion a statistically significant difference between group A and group B was noted. These results suggest that tranexamic acid is an effective, simple, and low-cost way to limit postoperative blood loss and the need for transfusion.

Keywords: tranexamic acid, blood loss, orthopedic surgery, children

Procedia PDF Downloads 52
4402 Adopting Collaborative Business Processes to Prevent the Loss of Information in Public Administration Organisations

Authors: A. Capodieci, G. Del Fiore, L. Mainetti

Abstract:

Recently, the use of Web 2.0 tools has increased in companies and public administration organizations. This phenomenon, known as "Enterprise 2.0", has de facto modified common organizational and operative practices, leading "knowledge workers" to change their working practices through the use of Web 2.0 communication tools. Unfortunately, these tools have not been integrated with existing enterprise information systems, a situation that could potentially lead to a loss of information. This is an important problem in an organizational context, because knowledge of the information exchanged within the organization is needed to increase its efficiency and competitiveness. In this article, we demonstrate that it is possible to capture this knowledge using collaboration processes: processes of abstraction created in accordance with design patterns and applied to new organizational operative practices.

Keywords: business practices, business process patterns, collaboration tools, enterprise 2.0, knowledge workers

Procedia PDF Downloads 338
4401 Downtime Estimation of Building Structures Using Fuzzy Logic

Authors: M. De Iuliis, O. Kammouh, G. P. Cimellaro, S. Tesfamariam

Abstract:

Community resilience has gained significant attention due to recent unexpected natural and man-made disasters. Resilience is the process of maintaining livable conditions in the event of interruptions in normally available services. Estimating the resilience of systems, ranging from individuals to communities, is a formidable task due to the complexity involved. The most challenging parameter in resilience assessment is the 'downtime': the time needed for a system to recover its services following a disaster event. Estimating the exact downtime of a system requires many inputs and resources that are not always obtainable. The uncertainties in downtime estimation are usually handled with probabilistic methods, which require large amounts of historical data. The estimation process also involves ignorance, imprecision, vagueness, and subjective judgment. In this paper, a fuzzy-based approach to estimating the downtime of building structures following earthquake events is proposed. Fuzzy logic can integrate descriptive (linguistic) knowledge and numerical data into the fuzzy system. This ability allows the use of walk-down surveys, which collect data in linguistic or numerical form. The use of fuzzy logic permits a fast and economical estimation of parameters that involve uncertainties. The first step of the method is to determine the building's vulnerability: a rapid visual screening is designed to acquire information about the analyzed building (e.g., year of construction, structural system, site seismicity). Then, fuzzy logic is implemented using a hierarchical scheme to determine the building damageability, which is the main ingredient for estimating the downtime. Generally, downtime can be divided into three main components: downtime due to the actual damage (DT1); downtime caused by rational and irrational delays (DT2); and downtime due to utilities disruption (DT3).
In this work, DT1 is computed by relating the building damageability results obtained from the visual screening to component repair times already defined in the literature. DT2 and DT3 are estimated using the REDi guidelines. The downtime of the building is finally obtained by combining the three components. The proposed method also identifies the downtime corresponding to each of the three recovery states: re-occupancy, functional recovery, and full recovery. Future work aims to extend the current methodology from downtime to the resilience of buildings, providing a simple tool that authorities can use for decision making.
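A minimal sketch of the damage-to-downtime step: triangular membership functions over a damageability score and a weighted-average (Sugeno-style) defuzzification. The damageability scale, rule breakpoints, and repair times below are illustrative assumptions, not the paper's or REDi's values:

```python
def tri(x, a, b, c):
    # triangular membership function: 0 outside (a, c), peaking at 1 when x = b
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def downtime_days(damageability):
    # damageability assumed on a 0-10 scale; each rule maps a damage level
    # to an illustrative repair time in days
    rules = [
        (tri(damageability, -1, 0, 4), 10),    # slight   -> ~10 days
        (tri(damageability, 2, 5, 8), 90),     # moderate -> ~90 days
        (tri(damageability, 6, 10, 11), 360),  # severe   -> ~360 days
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den  # weighted-average (Sugeno-style) defuzzification
```

A score between two damage levels blends their repair times in proportion to the membership degrees, which is how the linguistic survey inputs translate into a numerical downtime.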

Keywords: resilience, restoration, downtime, community resilience, fuzzy logic, recovery, damage, built environment

Procedia PDF Downloads 150
4400 The MTHFR C677T Polymorphism Screening: A Challenge in Recurrent Pregnancy Loss

Authors: Rim Frikha, Nouha Bouayed, Afifa Sellami, Nozha Chakroun, Salima Daoud, Leila Keskes, Tarek Rebai

Abstract:

Introduction: Recurrent pregnancy loss (RPL), defined as two or more pregnancy losses, is a serious clinical problem. Methylenetetrahydrofolate reductase (MTHFR) polymorphisms, most commonly the C677T variant, are recognized as inherited thrombophilias that might affect embryonic development and pregnancy success and cause pregnancy complications such as RPL. Material and Methods: DNA was extracted from peripheral blood samples, and PCR-RFLP was performed for molecular diagnosis of the C677T MTHFR polymorphism in 70 patients (35 couples) with more than two fetal losses. Aims and Objectives: The aim of this study is to determine the frequency of MTHFR C677T among Tunisian couples with RPL and to critically analyze the available literature on the importance of MTHFR polymorphism testing in the management of RPL. Results and Comments: No C677T mutation was detected in the RPL carriers. This result may be related to the sample size and to the inclusion criteria (number of abortions). The association between MTHFR polymorphisms and pregnancy complications has been reported, but with controversial results, and there is a lack of evidence for the MTHFR polymorphism testing previously recommended by the American College of Medical Genetics and Genomics (ACMG). Our study highlights the importance of screening for the MTHFR polymorphism, since the real impact of such a thrombotic molecular defect on pregnancy outcome is evident. Folic acid supplementation of these patients during pregnancy can prevent such complications and lead to a successful pregnancy outcome.

Keywords: methylenetetrahydrofolate reductase, C677T, recurrent pregnancy loss, genetic testing

Procedia PDF Downloads 286
4399 Artificial Neural Network and Satellite Derived Chlorophyll Indices for Estimation of Wheat Chlorophyll Content under Rainfed Condition

Authors: Muhammad Naveed Tahir, Wang Yingkuan, Huang Wenjiang, Raheel Osman

Abstract:

Numerous models are used in prediction and decision-making processes, but most of them are linear, and linear models reach their limitations with non-linearity in data, making accurate estimation difficult. Artificial Neural Networks (ANNs) have found extensive acceptance for modeling the complex, non-linear real world, since they have more general and flexible functional forms than traditional statistical methods. The link between information technology and agriculture will become firmer in the near future. Monitoring crop biophysical properties non-destructively can provide a rapid and accurate understanding of crop response to various environmental influences. Crop chlorophyll content is an important indicator of crop health and therefore of crop yield. In recent years, remote sensing has been accepted as a robust tool for site-specific management by detecting crop parameters at both local and large scales. The present research combined an ANN model with satellite-derived chlorophyll indices from LANDSAT 8 imagery to predict wheat chlorophyll content in real time. Cloud-free LANDSAT 8 scenes were acquired (February-March, 2016-17) at the same time as a ground-truthing campaign for chlorophyll estimation using a SPAD-502 meter. Vegetation indices were derived from the LANDSAT 8 imagery using ERDAS Imagine (v.2014) software for chlorophyll determination, including the Normalized Difference Vegetation Index (NDVI), Green Normalized Difference Vegetation Index (GNDVI), Chlorophyll Absorption Ratio Index (CARI), Modified Chlorophyll Absorption Ratio Index (MCARI), and Transformed Chlorophyll Absorption Ratio Index (TCARI). For ANN modeling, MATLAB and SPSS tools were used; the Multilayer Perceptron (MLP) in MATLAB provided very satisfactory results.
For training the MLP, 61.7% of the data were used; 28.3% were used for validation, and the remaining 10% were used to evaluate and test the ANN model results. For error evaluation, the sum of squares error and the relative error were used. The ANN model summary showed a sum of squares error of 10.786 and an average overall relative error of 0.099. MCARI and NDVI were revealed to be the most sensitive indices for assessing wheat chlorophyll content, with the highest coefficients of determination (R²=0.93 and 0.90, respectively). The results suggest that retrieving crop chlorophyll content from high-spatial-resolution satellite imagery with an ANN model provides an accurate, reliable assessment of crop health status at a larger scale, which can help in managing crop nutrition requirements in real time.
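The spectral indices named above have standard definitions that can be sketched directly. MCARI is defined on narrow bands near 550, 670, and 700 nm, so applying it to Landsat 8's broad bands involves band substitutions not shown here; the reflectance values below are illustrative:

```python
def ndvi(nir, red):
    # Normalized Difference Vegetation Index
    return (nir - red) / (nir + red)

def gndvi(nir, green):
    # Green Normalized Difference Vegetation Index
    return (nir - green) / (nir + green)

def mcari(r700, r670, r550):
    # Modified Chlorophyll Absorption Ratio Index (narrow-band form)
    return ((r700 - r670) - 0.2 * (r700 - r550)) * (r700 / r670)

# illustrative surface reflectances
print(ndvi(0.5, 0.1))            # healthy vegetation: high NIR, low red
print(mcari(0.12, 0.05, 0.08))
```

These index values, computed per pixel, would form the input features of the MLP, with SPAD readings as the target.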

Keywords: ANN, chlorophyll content, chlorophyll indices, satellite images, wheat

Procedia PDF Downloads 132
4398 Case Study: Hyperbaric Oxygen Therapy for Idiopathic Sudden Sensorineural Hearing Loss

Authors: Magdy I. A. Alshourbagi

Abstract:

Background: The National Institute on Deafness and Other Communication Disorders defines idiopathic sudden sensorineural hearing loss (ISSNHL) as an idiopathic loss of hearing of at least 30 dB across 3 contiguous frequencies occurring within 3 days. The most common clinical presentation is a sudden unilateral hearing loss, tinnitus, a sensation of aural fullness, and vertigo. The etiologies and pathologies of ISSNHL remain unclear. Several pathophysiological mechanisms have been described, including vascular occlusion, viral infections, labyrinthine membrane breaks, immune-associated disease, abnormal cochlear stress response, trauma, abnormal tissue growth, toxins, ototoxic drugs, and cochlear membrane damage. The rationale for using hyperbaric oxygen to treat ISSNHL is supported by the high metabolism and paucity of vascularity of the cochlea. The cochlea and the structures within it require a high oxygen supply, while the direct vascular supply, particularly to the organ of Corti, is minimal. Tissue oxygenation of the structures within the cochlea occurs via oxygen diffusion from cochlear capillary networks into the perilymph and the cortilymph. The perilymph is the primary oxygen source for these intracochlear structures; unfortunately, perilymph oxygen tension is decreased significantly in patients with ISSNHL. To achieve a consistent rise in perilymph oxygen content, the arterial-perilymphatic oxygen concentration difference must be extremely high, which can be restored with hyperbaric oxygen therapy. Subject and Methods: A 37-year-old man presented at the clinic with a five-day history of muffled hearing and tinnitus of the right ear. Symptoms were of sudden onset, with no associated pain, dizziness, or otorrhea, and no past history of hearing problems or medical illness. Family history was negative. Physical examination was normal.
Otologic examination revealed normal tympanic membranes bilaterally, with no evidence of cerumen or middle ear effusion. Tuning fork examination showed a positive Rinne test bilaterally but lateralization of the Weber test to the left side, indicating right-ear sensorineural hearing loss. Audiometric analysis confirmed sensorineural hearing loss of about 70 dB across all frequencies in the right ear. Routine lab work was all within normal limits. A clinical diagnosis of idiopathic sudden sensorineural hearing loss of the right ear was made, and the patient began medical treatment (corticosteroid, vasodilator, and HBO therapy). The recommended treatment profile consists of 100% O2 at 2.5 atmospheres absolute for 60 minutes daily (six days per week) for 40 treatments. The optimal number of HBOT treatments varies, depending on the severity and duration of symptoms and the response to treatment. Results: As HBOT is not yet a standard treatment for idiopathic sudden sensorineural hearing loss, it was introduced to this patient as an adjuvant therapy. The HBOT program was scheduled for 40 sessions in a 12-seat multiplace chamber and was started on day seven after the onset of hearing loss. After the tenth session of HBOT, improvement of both hearing (by audiogram) and tinnitus was obtained in the affected (right) ear. Conclusions: HBOT may be used as an adjuvant therapy for idiopathic sudden sensorineural hearing loss. It may promote oxygenation of the inner ear apparatus and revive hearing ability. Patients who fail to respond to oral and intratympanic steroids may benefit from this treatment. Further investigation is warranted, including animal studies to understand the molecular and histopathological aspects of HBOT and randomized controlled clinical studies.

Keywords: idiopathic sudden sensorineural hearing loss (issnhl), hyperbaric oxygen therapy (hbot), the decibel (db), oxygen (o2)

Procedia PDF Downloads 416
4397 Studies of the Corrosion Kinetics of Metal Alloys in Stagnant Simulated Seawater Environment

Authors: G. Kabir, A. M. Mohammed, M. A. Bawa

Abstract:

The paper presents the corrosion behavior of Naval Brass, an aluminum alloy, and carbon steel in simulated seawater under stagnant conditions. The behavior was characterized as a function of chloride ion concentration, in the range of 3.0 wt% to 3.5 wt%, and exposure time. The weight-loss coupon immersion technique was employed: the weight loss of each alloy was measured and the corrosion rate determined from it. It was found that the corrosion rates of the alloys are related to the chloride ion concentration, the exposure time, and the kinetics of passive-film formation on each alloy. Carbon steel suffers corrosion many times greater than Naval Brass, indicating that the brass exhibits relatively strong resistance to corrosion in the seawater exposure environment, while the aluminum alloy exhibits an excellent and beneficial corrosion resistance, greater still than that of Naval Brass. Despite their prohibitive cost, Naval Brass and the aluminum alloy show beneficial corrosion behavior that can support a wide range of seashore applications. The corrosion kinetics parameters indicate that the corrosion reaction is limited by diffusion mass transfer of the corrosion reactants rather than being reaction controlled.
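Weight-loss measurements convert to a corrosion rate with the standard coupon formula (ASTM G31 style); the coupon numbers below are illustrative, not the study's data:

```python
def corrosion_rate_mm_per_year(weight_loss_mg, density_g_cm3, area_cm2, hours):
    # standard weight-loss conversion:
    # CR [mm/y] = 87.6 * W / (D * A * T)
    # with W in mg, D in g/cm^3, A in cm^2, T in hours
    return 87.6 * weight_loss_mg / (density_g_cm3 * area_cm2 * hours)

# illustrative coupon: 10 mg lost over 100 h from a 10 cm^2 carbon-steel coupon
cr = corrosion_rate_mm_per_year(10, 7.85, 10, 100)   # ~0.112 mm/y
```

Comparing such rates across alloys and chloride concentrations is exactly the kind of analysis the abstract describes.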

Keywords: alloys, chloride ions concentration, corrosion kinetics, corrosion rate, diffusion mass transfer, exposure time, seawater, weight loss

Procedia PDF Downloads 283
4396 Analysis of the Predictive Performance of Value at Risk Estimations in Times of Financial Crisis

Authors: Alexander Marx

Abstract:

Measuring and mitigating market risk is essential for the stability of enterprises, especially major banking corporations and investment banks. For these risk measurement and mitigation processes, Value at Risk (VaR) is the risk metric most commonly used by practitioners. In past years, significant weaknesses in the predictive performance of VaR have appeared in times of financial market crisis. To address this issue, this study investigates VaR estimation models and their predictive performance by applying a series of backtesting methods to the stock market indices of the G7 countries (Canada, France, Germany, Italy, Japan, UK, US) and Europe. The study employs parametric, non-parametric, and semi-parametric VaR estimation models and covers three periods spanning the most recent financial market crises: the overall period (2006–2022), the global financial crisis (2008–2009), and the COVID-19 period (2020–2022). Since regulatory authorities have introduced and mandated the Conditional Value at Risk (Expected Shortfall) as an additional regulatory risk management metric, the study also analyzes and compares both risk metrics on their predictive performance.
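A minimal sketch of the non-parametric (historical-simulation) VaR and the exceedance count on which backtesting methods such as Kupiec's test are built; the return series below is illustrative:

```python
def historical_var(returns, level=0.95):
    # historical-simulation VaR: the loss at the (1 - level) empirical quantile
    s = sorted(returns)
    idx = int((1 - level) * len(s))
    return -s[idx]

def count_exceedances(returns, var):
    # backtesting input: days on which the realized loss exceeded the VaR
    return sum(1 for r in returns if -r > var)

returns = [i / 100 for i in range(-10, 10)]  # illustrative daily returns
var95 = historical_var(returns)              # 0.09 for this series
breaches = count_exceedances(returns, var95) # 1 (the -10% day)
```

Backtests then compare the observed breach frequency against the expected 1 - level proportion.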

Keywords: value at risk, financial market risk, banking, quantitative risk management

Procedia PDF Downloads 80
4395 Repeatable Scalable Business Models: Can Innovation Drive an Entrepreneur's Un-Validated Business Model?

Authors: Paul Ojeaga

Abstract:

Can the level of innovation drive un-validated business models across regions? To what extent does industrial sector attractiveness drive firms' success across regions at the time of start-up? This study examines the role of innovation in start-up success in six regions of the world (Sub-Saharan Africa, the Middle East and North Africa, Latin America, South East Asia Pacific, the European Union, and the United States, representing North America) using macroeconomic variables. While there have been studies using firm-level data, results from such studies are not suitable for national policy decisions, and the need to drive regional innovation policy also begs for an answer, providing room for this study. Results using dynamic panel estimation show that innovation counts in the early infancy stage of the new-business life cycle. The results are robust even after controlling for time fixed effects, and the study presents variance-covariance-robust standard errors.

Keywords: industrial economics, un-validated business models, scalable models, entrepreneurship

Procedia PDF Downloads 263
4394 Ultra-High Frequency Passive Radar Coverage for Cars Detection in Semi-Urban Scenarios

Authors: Pedro Gómez-del-Hoyo, Jose-Luis Bárcena-Humanes, Nerea del-Rey-Maestre, María-Pilar Jarabo-Amores, David Mata-Moya

Abstract:

A study of the achievable coverage of passive radar systems in terrestrial traffic monitoring applications is presented. The study includes estimation of the bistatic radar cross section of different commercial vehicle models, which yields challengingly low values that make detection difficult. A semi-urban scenario is selected to evaluate the impact of the excess propagation losses generated by irregular relief. A bistatic passive radar exploiting the UHF frequencies radiated by digital video broadcasting transmitters is assumed. A general method of coverage estimation using electromagnetic simulators in combination with the estimated average bistatic radar cross section of a car is applied. To reduce the computational cost, a hybrid solution is implemented, assuming free space for the target-receiver path but estimating the excess propagation losses for the transmitter-target path.
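Coverage estimation of this kind rests on the bistatic radar equation, which can be sketched directly; all numbers below (transmit power, antenna gains, wavelength, bistatic RCS) are illustrative, not the paper's values:

```python
import math

def bistatic_received_power(pt, gt, gr, wavelength, rcs_bistatic,
                            rt, rr, loss=1.0):
    # bistatic radar equation:
    # Pr = Pt*Gt*Gr*lambda^2*sigma_b / ((4*pi)^3 * Rt^2 * Rr^2 * L)
    # rt: transmitter-target range, rr: target-receiver range,
    # loss: combined system and excess propagation losses (linear)
    return (pt * gt * gr * wavelength ** 2 * rcs_bistatic) / (
        (4 * math.pi) ** 3 * rt ** 2 * rr ** 2 * loss)

# illustrative DVB-T-like numbers (UHF ~600 MHz -> wavelength ~0.5 m)
p_near = bistatic_received_power(1e3, 10, 100, 0.5, 1.0, 5e3, 2e3)
p_far  = bistatic_received_power(1e3, 10, 100, 0.5, 1.0, 5e3, 4e3)
```

In the hybrid scheme the abstract describes, the simulated excess losses on the transmitter-target path would enter through the `loss` factor, while the target-receiver path keeps the free-space form.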

Keywords: bistatic radar cross section, passive radar, propagation losses, radar coverage

Procedia PDF Downloads 316
4393 Aerodynamic Designing of Supersonic Centrifugal Compressor Stages

Authors: Y. Galerkin, A. Rekstin, K. Soldatova

Abstract:

The Universal Modeling Method, well proven for industrial compressors, was applied to the design of a high-flow-rate supersonic stage. The results were checked by ANSYS CFX and NUMECA FINE/Turbo calculations. The impeller proved very effective at transonic flow velocities, and the efficiency of the stator elements is acceptable at the design Mach numbers; their loss coefficient versus inlet flow angle performance correlates well with the Universal Modeling prediction. The impeller demonstrated satisfactory operation at the design flow rate. The supersonic flow behavior in the impeller inducer at the shroud blade-to-blade surface at Φdes deserves additional study.

Keywords: centrifugal compressor stage, supersonic impeller, inlet flow angle, loss coefficient, return channel, shock wave, vane diffuser

Procedia PDF Downloads 449
4392 Estimating the Mean Parameter of the Normal Distribution by Maximum Likelihood, Bayes, and Markov Chain Monte Carlo Methods

Authors: Autcha Araveeporn

Abstract:

This paper compares estimation of the mean of a normal distribution by the Maximum Likelihood (ML), Bayes, and Markov Chain Monte Carlo (MCMC) methods. The ML estimator is the average of the data; the Bayes estimator is derived from the prior distribution; and the MCMC estimator is approximated by Gibbs sampling from the posterior distribution. After the parameter is estimated by each method, hypothesis testing is used to check the robustness of the estimators. Data are simulated from a normal distribution with a true mean of 2 and variance of 4, 9, or 16, with sample sizes of 10, 20, 30, and 50. The results show that the ML and MCMC estimates differ noticeably from the true parameter when the sample size is 10 or 20 with variance 16. Furthermore, the Bayes estimator, computed from a prior distribution with mean 1 and variance 12, showed a significant difference in the mean with variance 9 at sample sizes of 10 and 20.
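The three estimators can be sketched for the conjugate normal case. Since the posterior for a normal mean with known variance is itself normal, the sketch below samples it directly as a Monte Carlo stand-in for the Gibbs sampler; the data are made up, while the prior settings (mean 1, variance 12) and known variance 4 follow the abstract:

```python
import random

def ml_mean(x):
    # maximum-likelihood estimator of a normal mean: the sample average
    return sum(x) / len(x)

def bayes_posterior(x, sigma2, mu0, tau2):
    # conjugate normal prior N(mu0, tau2) with known variance sigma2:
    # the posterior is N(m, v) with precision-weighted mean m
    n = len(x)
    v = 1.0 / (1.0 / tau2 + n / sigma2)
    m = v * (mu0 / tau2 + n * ml_mean(x) / sigma2)
    return m, v

def mc_posterior_mean(x, sigma2, mu0, tau2, draws=5000, seed=0):
    # Monte Carlo stand-in for MCMC: sample the (known) posterior directly
    m, v = bayes_posterior(x, sigma2, mu0, tau2)
    rng = random.Random(seed)
    return sum(rng.gauss(m, v ** 0.5) for _ in range(draws)) / draws

x = [1.5, 2.5, 2.0, 3.0, 1.0]                              # illustrative data
m, v = bayes_posterior(x, sigma2=4.0, mu0=1.0, tau2=12.0)  # m = 1.9375
```

The posterior mean shrinks the sample average (here 2.0) toward the prior mean in proportion to the relative precisions, which is the behavior the abstract's comparison probes.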

Keywords: Bayes method, Markov chain Monte Carlo method, maximum likelihood method, normal distribution

Procedia PDF Downloads 340
4391 Teachers’ Awareness of the Significance of Lifelong Learning: A Case Study of Secondary School Teachers of Batna - Algeria

Authors: Bahloul Amel

Abstract:

This study attempts to raise the awareness of stakeholders and authorities about the sensitivity of Algerian secondary school teachers of English as a Foreign Language to students' loss of the English language skills learned, with effort and at expense, during formal schooling, and about the measures proposed to arrest that loss. Data were collected from secondary school teachers of EFL and analyzed quantitatively using a questionnaire containing open-ended and close-ended questions. The results indicate a consensus on the need for actions to make assessment techniques outcome-oriented. Most of the participants were in favor of including curricular activities involving contextualized learning, problem-solving learning, critical self-awareness, self- and peer-assisted learning, and the use of computers and the internet, so as to make learners autonomous.

Keywords: lifelong learning, EFL, contextualized learning, Algeria

Procedia PDF Downloads 328
4390 The Decrease of Collagen or Mineral Affects Fracture in the Turkey Long Bones

Authors: P. Vosynek, T. Návrat, M. Peč, J. Pořízka, P. Diviš

Abstract:

Changes in the mechanical properties and response behavior of bone are an important external sign of medical problems such as osteoporosis, bone remodeling after fracture or surgery, osteointegration, and the bone tissue loss of astronauts in space. Measuring the mechanical behavior of bone in physiological and osteoporotic states, quantified by different degrees of protein (collagen) and mineral loss, is thus an important topic in biomechanical research. This contribution deals with the relation between the mechanical properties of the turkey long bone (tibia) in physiological, demineralized, and deproteinized states. Three methods were used for comparison: densitometry, three-point bending, and harmonic response analysis. The results help to find correlations between the methods and to estimate their possible application in medical practice.

Keywords: bone properties, long bone, osteoporosis, response behavior

Procedia PDF Downloads 466
4389 Adaptation of Hough Transform Algorithm for Text Document Skew Angle Detection

Authors: Kayode A. Olaniyi, Olabanji F. Omotoye, Adeola A. Ogunleye

Abstract:

Skew detection and correction form an important part of digital document analysis, because uncompensated skew can deteriorate document features and complicate further document-image processing steps. Efficient text document analysis and digitization can rarely be achieved when a document is skewed even at a small angle. Once documents have been digitized through the scanning system and binarization has been performed, document skew correction is required before further image analysis. Research efforts have been put into this area, with algorithms developed to eliminate document skew. Skew-correction algorithms can be compared on performance criteria, the most important being accuracy of skew angle detection, the range of detectable skew angles, processing speed, computational complexity, and consequently the memory space used. The standard Hough Transform has successfully been applied to estimating the skew angle of text documents. However, the accuracy of the standard Hough Transform depends largely on how fine the angular step size is; finer steps consume more time and memory space, especially where the number of pixels is considerably large. Whenever the Hough Transform is used, there is a trade-off between accuracy and speed, so a more efficient solution is needed that optimizes space as well as time. In this paper, an improved Hough Transform (HT) technique that optimizes space as well as time to robustly detect document skew is presented. The modified Hough Transform algorithm resolves the contradiction among memory space, running time, and accuracy. Our algorithm starts with a first estimate of the angle, accurate to zero decimal places, using the standard Hough Transform, achieving minimal running time and space but limited accuracy.
Then, to increase accuracy, if the angle estimated by the basic Hough algorithm is x degrees, we run the basic algorithm again over the range x ± 1 degrees with an accuracy of one decimal place, and the process is iterated until the desired level of accuracy is achieved. The skew estimation and correction algorithm for text images is implemented in MATLAB. The memory space and processing time are tabulated under the assumption that the skew angle lies between 0° and 45°. The simulation results, demonstrated in MATLAB, show the high performance of our algorithm, with less computational time and memory space used in detecting document skew for a variety of documents with different levels of complexity.

Keywords: Hough transform, skew detection, skew angle, skew correction, text document

Procedia PDF Downloads 134
4388 Influence of Bragg Reflectors Pairs on Resonance Characteristics of Solidly Mounted Resonators

Authors: Vinita Choudhary

Abstract:

The solidly mounted resonator (SMR) is a bulk acoustic wave device consisting of a piezoelectric layer sandwiched between two electrodes on top of a Bragg reflector, which is in turn attached to a substrate. To transform the effective acoustic impedance of the substrate to a near-zero value, the Bragg reflector is composed of alternating high and low acoustic impedance layers of quarter-wavelength thickness. This work presents the design and investigation of acoustic Bragg reflectors (ABRs) for solidly mounted bulk acoustic wave resonators through analysis and simulation. The performance of the resonator is analyzed using 1D Mason modeling. The performance parameters studied are the effect of the number of Bragg pairs on transmissivity, reflectivity, insertion loss, and the electromechanical coupling and quality factor of a resonator operating at 5 GHz.
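As a rough illustration of the quarter-wavelength design rule and of why the pair count matters, the sketch below models each ideal lossless quarter-wave layer as an impedance inverter. The material figures (SiO2/W on a silicon substrate) are indicative assumptions, not values from the paper.

```python
import math

def quarter_wave_thickness(velocity_m_s, f0_hz):
    """Quarter-wavelength layer thickness at the design frequency f0."""
    return velocity_m_s / (4.0 * f0_hz)

def impedance_seen_through_reflector(z_substrate, z_low, z_high, n_pairs):
    """Acoustic impedance presented to the piezoelectric stack through
    n low/high Bragg pairs at f0. An ideal quarter-wave layer of
    impedance Z0 transforms a load Z into Z0**2 / Z, so each pair
    scales the load by (z_low / z_high)**2 -- driving it toward zero
    when z_low < z_high, as required for an SMR mirror."""
    z = z_substrate
    for _ in range(n_pairs):
        z = z_high ** 2 / z   # high-impedance layer (e.g. W)
        z = z_low ** 2 / z    # low-impedance layer (e.g. SiO2)
    return z
```

Adding pairs drives the effective impedance geometrically toward zero, which is why reflectivity and insertion loss saturate beyond a few pairs.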

Keywords: Bragg reflectors, SMR, insertion loss, quality factor

Procedia PDF Downloads 69
4387 Modeling and Simulating Productivity Loss Due to Project Changes

Authors: Robert Pellerin, Michel Gamache, Remi Trudeau, Nathalie Perrier

Abstract:

The context of large engineering projects is particularly favorable to the appearance of engineering changes and contractual modifications, which are potential causes for claims. In this paper, we investigate one of the critical components of the claim management process: calculating the impact of changes in terms of productivity losses due to the need to accelerate some project activities. When project changes are initiated, delays can arise. Project activities are then often executed in fast-tracking mode in an attempt to respect the completion date, but the acceleration of project execution and the resulting rework can entail important costs as well as induce productivity losses. In the past, numerous methods have been proposed to quantify the duration of delays, the gains achieved by project acceleration, and the loss of productivity. The costs related to such changes fall into two categories: direct costs and indirect costs. Direct costs are easily quantifiable, whereas indirect costs are rarely taken into account when calculating the cost of an engineering change or contract modification, even though several research projects have addressed this subject. The proposed models, however, have been accepted neither by companies nor in court: they require extensive data and are often seen as too specific to be applied to all projects. These techniques also ignore resource constraints and the interdependencies between the causes of delays and the delays themselves. To resolve this issue, this research proposes a simulation model that mimics how major engineering changes or contract modifications are handled in large construction projects. The model replicates the use of overtime in a reactive scheduling mode in order to simulate the loss of productivity that occurs when a project change arises. Multiple tests were conducted to compare the results of the proposed simulation model with statistical analyses conducted by other researchers. Different scenarios were also run to determine the impact of the number of activities, the time of occurrence of the change, the availability of resources, and the type of project change on productivity loss. Our results demonstrate that the number of activities in the project is a critical variable influencing its productivity. When changes occur, a large number of activities leads to a much lower productivity loss than a small number of activities: productivity declines about 25 percent faster for 30-job projects than for 120-job projects. The moment of occurrence of a change also shows a significant impact on productivity; the sooner the change occurs, the lower the productivity of the labor force. The availability of resources likewise affects the productivity of a project when a change is implemented: the loss of productivity is higher when the amount of resources is restricted.

Keywords: engineering changes, indirect costs, overtime, productivity, scheduling, simulation

Procedia PDF Downloads 225
4386 Fatigue Life Estimation Using N-Code for Drive Shaft of Passenger Vehicle

Authors: Tae An Kim, Hyo Lim Kang, Hye Won Han, Seung Ho Han

Abstract:

The drive shaft of a passenger vehicle transmits the engine torque from the gearbox and differential gears to the wheels. It must also compensate for all variations in angle or length resulting from manoeuvring and deflection, for perfect synchronization between joints. Torsional fatigue failures occur frequently at the connection parts of the spline joints at the end of the drive shaft. In this study, the fatigue life of a passenger-vehicle drive shaft was estimated using finite element analysis. The commercial software n-Code was applied under twisting load conditions of 0–134 kgf·m and 0–188 kgf·m, in which the shear strain range-fatigue life relationship was taken into account, considering the Signed Shear method, the Smith-Watson-Topper equation, the Neuber-Hoffman Seeger method, a size sensitivity factor, and the surface roughness effect. The estimated fatigue life was verified by a twisting load test of the real drive shaft in a test rig. (Human Resource Training Project for Industry Matched R & D, KIAT, N036200004.)
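The abstract does not reproduce the strain-life relationship itself; as a generic illustration of how a life estimate is obtained from a damage equation of this family, the sketch below inverts the Smith-Watson-Topper equation numerically. The strain-life constants are hypothetical textbook-style values, not properties of the actual drive shaft material.

```python
import math

def swt_parameter(two_n, E, sigma_f, eps_f, b, c):
    """Smith-Watson-Topper damage parameter sigma_max * eps_a as a
    function of reversals 2N, from the strain-life constants
    sigma_f', eps_f', b, c and Young's modulus E."""
    return (sigma_f ** 2 / E) * two_n ** (2 * b) + sigma_f * eps_f * two_n ** (b + c)

def swt_life(sigma_max, eps_a, E, sigma_f, eps_f, b, c):
    """Invert the SWT equation for fatigue life in cycles by bisection
    in log space; the parameter decreases monotonically with life
    because b < 0 and c < 0."""
    target = sigma_max * eps_a
    lo, hi = 1.0, 1e12
    for _ in range(200):
        mid = math.sqrt(lo * hi)          # geometric midpoint
        if swt_parameter(mid, E, sigma_f, eps_f, b, c) > target:
            lo = mid                      # life is longer than mid reversals
        else:
            hi = mid
    return 0.5 * hi                       # cycles N = (2N) / 2
```

A dedicated tool such as n-Code additionally applies the mean-stress, notch, size and surface-roughness corrections named above before solving an equation of this form.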

Keywords: drive shaft, fatigue life estimation, passenger vehicle, shear strain range-fatigue life relationship, torsional fatigue failure

Procedia PDF Downloads 258
4385 Comparative Fragility Analysis of Shallow Tunnels Subjected to Seismic and Blast Loads

Authors: Siti Khadijah Che Osmi, Mohammed Ahmad Syed

Abstract:

Underground structures are crucial components that require detailed analysis and design. Tunnels, for instance, are extensively constructed as transportation infrastructure and utility networks, especially in urban environments. Given their prime importance to the economy and to public safety, which cannot be compromised, any instability in these tunnels will be highly detrimental to their performance. Recent experience suggests that tunnels become vulnerable during earthquake and blast scenarios, yet only a very limited number of studies has been carried out to understand the dynamic response and performance of underground tunnels under such unpredictable extreme hazards. In view of the importance of enhancing the resilience of these structures, the overall aim of the study is to evaluate the probabilistic future performance of shallow tunnels subjected to seismic and blast loads by developing a detailed fragility analysis. Critical non-linear time history numerical analyses were performed using the finite element software Midas GTS NX, taking into consideration structural typology, ground motion and explosive characteristics, the effect of soil conditions, and other associated uncertainties affecting the tunnel integrity, which may ultimately lead to catastrophic failure of the structure. The proposed fragility curves for both extreme loadings are discussed and compared, providing significant information on the performance of the tunnel under extreme hazards which may be beneficial for future risk assessment and loss estimation.
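The abstract does not state the functional form of the fragility curves; a common choice in seismic and blast fragility work, shown below purely as an assumption, is the lognormal model, in which a damage state is reached with probability Φ(ln(IM/median)/β).

```python
import math

def lognormal_fragility(im, median, beta):
    """Probability of reaching or exceeding a damage state at intensity
    measure `im` (e.g. PGA, or a blast scaled distance), given the
    median capacity and the logarithmic dispersion beta."""
    return 0.5 * (1.0 + math.erf(math.log(im / median) / (beta * math.sqrt(2.0))))
```

Plotting this probability over a grid of intensity values for each damage state yields the fragility curves that are compared between the seismic and blast cases.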

Keywords: fragility analysis, seismic loads, shallow tunnels, blast loads

Procedia PDF Downloads 319
4384 A Study of Mode Choice Model Improvement Considering Age Grouping

Authors: Young-Hyun Seo, Hyunwoo Park, Dong-Kyu Kim, Seung-Young Kho

Abstract:

The purpose of this study is to provide an improved mode choice model that considers parameters for age groups spanning prime-aged and elderly travellers. The study used 2010 Household Travel Survey data, from which improper samples were removed during analysis. The chosen alternative, date of birth, mode, origin code, destination code, departure time, and arrival time were taken from the Household Travel Survey. By preprocessing the data, travel time, travel cost, mode, and the ratios of people aged 45 to 55 years, 55 to 65 years, and over 65 years were calculated. After this preparation, the mode choice model was constructed in LIMDEP by maximum likelihood estimation. A significance test was conducted for nine parameters: three age groups for each of three modes. The test was then repeated for the mode choice model with the significant parameters plus the travel cost and travel time variables. The model estimation shows that as age increases, the preference for the car decreases and the preference for the bus increases. This study is meaningful in that individual and household characteristics are applied to an aggregate model.
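For reference, the choice probabilities underlying a multinomial logit model of this kind take the familiar softmax form over linear-in-parameters utilities. The coefficients and alternatives below are placeholders for illustration, not the values estimated in LIMDEP.

```python
import math

def systematic_utility(asc, b_time, b_cost, time_min, cost):
    """Linear-in-parameters utility V = ASC + b_time*time + b_cost*cost."""
    return asc + b_time * time_min + b_cost * cost

def mnl_probabilities(utilities):
    """Multinomial logit probabilities P(i) = exp(V_i) / sum_j exp(V_j),
    computed with a max-utility shift for numerical stability."""
    vmax = max(utilities.values())
    expv = {mode: math.exp(v - vmax) for mode, v in utilities.items()}
    total = sum(expv.values())
    return {mode: e / total for mode, e in expv.items()}
```

Age-group effects enter by adding the age-share variables (with their estimated coefficients) to each mode's systematic utility.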

Keywords: age grouping, aging, mode choice model, multinomial logit model

Procedia PDF Downloads 312
4383 Optimization of Economic Order Quantity of Multi-Item Inventory Control Problem through Nonlinear Programming Technique

Authors: Prabha Rohatgi

Abstract:

To obtain efficient control over the large inventory of drugs in the pharmacy department of any hospital, medicines are generally categorized first on the basis of their cost, using 'ABC' (Always Better Control) analysis, and then on the basis of their criticality, using 'VED' (Vital, Essential, Desirable) analysis, for prioritization. About one-third of the annual expenditure of a hospital is spent on medicines. To minimize the inventory investment, hospital management may like to keep the medicines inventory low, as medicines are perishable items. The main aim of every hospital is to provide better services to patients under limited resources. To achieve a satisfactory level of health care services for outdoor patients, a hospital has to keep an eye on the wastage of medicines, because medicines that pass their expiry date cause a great loss of money from a budget that is limited and allocated for a particular period of time. The objective of this study is to identify the categories of medicines requiring intensive managerial control. In this paper, to minimize the total inventory cost and the cost associated with the wastage of money due to expiry of medicines, an inventory control model is used as an estimation tool, and a nonlinear programming technique is then applied under a limited budget and a fixed number of orders to be placed in a limited time period. Numerical computations are given, showing that by using scientific methods in hospital services, inventory can be managed more effectively under limited resources and better health care services can be provided. The secondary data were collected from a hospital to give empirical evidence.
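A minimal sketch of the nonlinear-programming idea follows, assuming the constraint is on average inventory investment; the item data are hypothetical and the paper's actual formulation (budget plus a fixed number of orders) may differ. The Lagrange multiplier simply inflates each item's holding cost inside the classical EOQ formula.

```python
import math

def constrained_eoq(items, budget):
    """items: list of (annual_demand, order_cost, unit_cost, holding_cost).
    Minimise ordering + holding cost subject to the average inventory
    investment sum(c_i * Q_i / 2) <= budget. The first-order condition
    gives Q_i = sqrt(2*D_i*A_i / (h_i + lam*c_i)); the multiplier lam
    is found by bisection on the budget constraint."""
    def quantities(lam):
        return [math.sqrt(2.0 * d * a / (h + lam * c)) for d, a, c, h in items]
    def investment(qs):
        return sum(c * q / 2.0 for (d, a, c, h), q in zip(items, qs))
    if investment(quantities(0.0)) <= budget:
        return quantities(0.0)          # plain EOQs are already feasible
    lo, hi = 0.0, 1.0
    while investment(quantities(hi)) > budget:
        hi *= 2.0                       # bracket the multiplier
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if investment(quantities(mid)) > budget:
            lo = mid
        else:
            hi = mid
    return quantities(hi)
```

The same mechanism extends to several constraints (e.g. a cap on the number of orders) with one multiplier each.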

Keywords: ABC-VED inventory classification, multi item inventory problem, nonlinear programming technique, optimization of EOQ

Procedia PDF Downloads 235
4382 Study on Errors in Estimating the 3D Gaze Point for Different Pupil Sizes Using Eye Vergences

Authors: M. Pomianek, M. Piszczek, M. Maciejewski

Abstract:

Binocular eye tracking technology is increasingly used in industry, entertainment, and marketing analysis. In the case of virtual reality, eye tracking systems are already the basis for user interaction with the environment. In such systems, high accuracy in determining the user's fixation point is very important due to the specificity of virtual reality head-mounted displays (HMDs). Often, however, unknown errors occur in the eye tracking technology used, as well as errors resulting from the positioning of the devices in relation to the user's eyes. But can the virtual environment itself influence estimation errors? The paper presents mathematical analyses and empirical studies of the determination of the fixation point and of the errors resulting from the change in pupil size in response to the intensity of the displayed scene. The article covers both static laboratory tests and tests on real users. Based on the research results, optimization solutions are proposed to reduce gaze estimation errors. The studies show that errors in estimating the fixation point can be minimized both by improving the pupil positioning algorithm in the video image and by using more precise methods to calibrate the eye tracking system in three-dimensional space.
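As background, a common vergence-based way to obtain the 3D fixation point (stated here as an assumption, since the paper's exact formulation is not given in the abstract) is the midpoint of the shortest segment between the two gaze rays:

```python
import numpy as np

def fixation_point(p_left, d_left, p_right, d_right):
    """Midpoint of the shortest segment between the left and right gaze
    rays (eye positions p, gaze directions d). Rays from real trackers
    rarely intersect exactly, so the midpoint of closest approach is a
    standard vergence-based estimate. Parallel rays (zero vergence)
    make the denominator vanish and must be handled separately."""
    u = d_left / np.linalg.norm(d_left)
    v = d_right / np.linalg.norm(d_right)
    w0 = p_left - p_right
    a, b, c = u @ u, u @ v, v @ v
    d, e = u @ w0, v @ w0
    denom = a * c - b * b
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return 0.5 * ((p_left + s * u) + (p_right + t * v))
```

Small angular errors in the ray directions, such as those caused by pupil-size changes, are amplified with viewing distance, which is why depth estimates degrade fastest.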

Keywords: eye tracking, fixation point, pupil size, virtual reality

Procedia PDF Downloads 116
4381 The Current Situation and Perspectives of Electricity Demand and Estimation of Carbon Dioxide Emissions and Efficiency

Authors: F. Ahwide, Y. Aldali

Abstract:

This article presents the current and future energy situation in Libya. Electric power efficiency and operating hours in power plants are evaluated from 2005 to 2010, and carbon dioxide emissions from most of the power plants are estimated. In 2005, the efficiency of steam power plants was in the range of 20% to 28%, while the efficiency of gas turbine power plants ranged between 9% and 25%, which can be considered low. However, efficiency improvements were clearly observed in some power plants from 2008 to 2010, especially the North Benghazi and West Tripoli plants; in fact, these power plants were converted to combined cycle. The efficiency of the North Benghazi power plant increased from 25% to 46.6%, while in Tripoli it increased from 22% to 34%. On the other hand, no efficiency improvement was observed in the gas turbine power plants. Relative to the quantity of fuel used, the carbon dioxide emissions resulting from electricity generation were very high. Finally, the energy demand was estimated in terms of the maximum load and the annual load factor (i.e., the ratio between the output power and the installed power).
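For clarity, the two indicators used above reduce to simple ratios; the figures in the example are illustrative, not the Libyan plant data.

```python
def thermal_efficiency(net_output_mwh, fuel_energy_mwh):
    """Net electrical energy delivered divided by fuel energy consumed."""
    return net_output_mwh / fuel_energy_mwh

def annual_load_factor(annual_energy_gwh, installed_capacity_mw):
    """Ratio of the average output power over the year (8760 h) to the
    installed power, as defined in the text."""
    average_mw = annual_energy_gwh * 1000.0 / 8760.0
    return average_mw / installed_capacity_mw
```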

Keywords: power plant, efficiency improvement, carbon dioxide emissions, energy situation in Libya

Procedia PDF Downloads 455
4380 Estimation of Reservoir Capacity and Sediment Deposition Using Remote Sensing Data

Authors: Odai Ibrahim Mohammed Al Balasmeh, Tapas Karmaker, Richa Babbar

Abstract:

In this study, reservoir capacity and sediment deposition were estimated using remote sensing data. Satellite images were synchronized with water level and storage capacity to find the change in sediment deposition due to soil erosion and transport by streamflow. The spread area of the water bodies was estimated using vegetation indices, namely the normalized difference vegetation index (NDVI) and the normalized difference water index (NDWI). The 3D reservoir bathymetry was modeled by integrating water level, storage capacity, and area. From models spanning different periods, the change in reservoir storage capacity was estimated. Another reservoir with known water level, storage capacity, area, and sediment deposition was used to validate the estimation technique. The t-test was used to assess the agreement between observed and estimated reservoir capacity and sediment deposition.
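The two indices are simple band ratios. The sketch below uses the standard definitions (McFeeters' form of NDWI); the zero threshold for water is a common assumption, not a value stated in the abstract.

```python
def ndvi(nir, red):
    """Normalized difference vegetation index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def ndwi(green, nir):
    """McFeeters normalized difference water index; open water typically
    yields NDWI > 0, the usual threshold for mapping the water-spread
    area of a reservoir."""
    return (green - nir) / (green + nir)

def water_mask(green_band, nir_band, threshold=0.0):
    """Per-pixel water mask for equal-length lists of band values."""
    return [ndwi(g, n) > threshold for g, n in zip(green_band, nir_band)]
```

Counting masked pixels and multiplying by the pixel area gives the water-spread area that is paired with the gauged water level for each image date.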

Keywords: satellite data, normalized difference vegetation index, NDVI, normalized difference water index, NDWI, reservoir capacity, sedimentation, t-test hypothesis

Procedia PDF Downloads 147
4379 Estimation of Aquifer Properties Using Pumping Tests: Case Study of Pydibhimavaram Industrial Area, Srikakulam, India

Authors: G. Venkata Rao, P. Kalpana, R. Srinivasa Rao

Abstract:

Adequate and reliable estimates of aquifer parameters are of utmost importance for proper management of vital groundwater resources. In the present scenario, the groundwater is polluted by industrial waste disposed over the land, and the contaminants are transported in the aquifer from one area to another depending on the characteristics of the aquifer and the contaminants. To model this contaminant transport, accurate estimation of aquifer properties is essential. Conventionally, these properties are estimated through pumping tests carried out on water wells. The occurrence and movement of groundwater in the aquifer are characteristically defined by the aquifer parameters. The pumping (aquifer) test is the standard technique for estimating the various hydraulic properties of aquifer systems, viz. transmissivity (T), hydraulic conductivity (K), and storage coefficient (S), for which the graphical method is widely used. The study area for the pumping tests is the Pydibhimavaram industrial area near the coastal belt of Srikakulam, AP, India. The main objective of the present work is to estimate the aquifer properties for developing a contaminant transport model for the study area.
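One common graphical method, the Cooper-Jacob straight-line analysis (named here as an example; the paper does not specify which method it uses), reduces to two formulas once the semi-log drawdown plot has been fitted. The input values below are illustrative.

```python
import math

def cooper_jacob(pumping_rate, drawdown_per_log_cycle, t_zero, radius):
    """Cooper-Jacob straight-line estimates from a drawdown-vs-log10(time)
    plot: `drawdown_per_log_cycle` is the slope of the fitted line,
    `t_zero` its zero-drawdown time intercept, and `radius` the distance
    to the observation well. Consistent units (e.g. m and days) are
    assumed. Returns transmissivity T and storage coefficient S."""
    T = 2.303 * pumping_rate / (4.0 * math.pi * drawdown_per_log_cycle)
    S = 2.25 * T * t_zero / radius ** 2
    return T, S
```

Dividing T by the saturated thickness of the aquifer then gives the hydraulic conductivity K.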

Keywords: aquifer, contaminant transport, hydraulic conductivity, industrial waste, pumping test

Procedia PDF Downloads 425
4378 Post Harvest Losses and Food Security in Northeast Nigeria What Are the Key Challenges and Concrete Solutions

Authors: Adebola Adedugbe

Abstract:

The challenge of post-harvest losses poses a serious threat to food security in Nigeria, and particularly in its northeast, with the country losing about $9 billion annually to post-harvest losses in the sector. Post-harvest loss (PHL) is the quantitative and qualitative loss of food in the various post-harvest operations. In Nigeria, post-harvest losses have been a major challenge to food security and improved farmers' incomes; in 2022, the Nigerian government said that over 30 percent of the food produced by Nigerian farmers perishes after harvest. For many in northeast Nigeria, agriculture is the predominant source of livelihood and income. Persistent communal conflicts, floods, and the decade-old attacks by the Boko Haram insurgency have drastically disrupted farming activities, with farmlands becoming insecure and inaccessible as communities are forced to abandon ancestral homes. The impact of climate change is also affecting agricultural and fishing activities, leading to shortages of food supplies, acute hunger, and loss of livelihoods. This has continued to weigh on the region's and the country's food production and availability, causing the loss of billions of US dollars in income annually in this sector. The root causes of post-harvest losses in crops, livestock, and fisheries include, among others, a lack of modern post-harvest equipment, chemicals, and technologies for combating losses. The 2019 Global Hunger Index showed Nigeria's situation progressing from a 'serious' to an 'alarming' level. As part of measures to address the problem of post-harvest losses experienced by farmers, the federal government of Nigeria has concessioned 17 silos with 6,000 metric tonnes of storage space to the private sector to give farmers access to storage facilities. This paper discusses the causes and effects of post-harvest losses, solutions for handling them, and ways to optimize returns on food security in northeast Nigeria.

Keywords: farmers, food security, northeast Nigeria, postharvest loss

Procedia PDF Downloads 56
4377 Estimation of Emanation Properties of Kimberlites and Host Rocks of Lomonosov Diamond Deposit in Russia

Authors: E. Yu. Yakovlev, A. V. Puchkov

Abstract:

The study presents experimental work on the assessment of the emanation properties of kimberlites and host rocks of the Lomonosov diamond deposit of the Arkhangelsk diamondiferous province. The aim of the study is to estimate the factors influencing the formation of the radon field over kimberlite pipes. For the various types of rocks composing the kimberlite pipe and the near-pipe space, the following parameters were measured: porosity, density, radium-226 activity, free radon activity, and emanation coefficient. The results showed that the largest amount of free radon is produced by the rocks of the near-pipe space, which are the Vendian host deposits and are characterized by high values of the emanation coefficient, radium activity, and porosity. The lowest values of these parameters are characteristic of vent-facies kimberlites, which limits the build-up of free radon activity in the body of the pipe. The results of this experimental work confirm the prospects of using emanation methods for prospecting kimberlite pipes.

Keywords: emanation coefficient, kimberlites, porosity, radon volumetric activity

Procedia PDF Downloads 122
4376 A Versatile Data Processing Package for Ground-Based Synthetic Aperture Radar Deformation Monitoring

Authors: Zheng Wang, Zhenhong Li, Jon Mills

Abstract:

Ground-based synthetic aperture radar (GBSAR) is a powerful remote sensing tool for monitoring deformation associated with various geohazards, e.g. landslides, mudflows, avalanches, infrastructure failures, and the subsidence of residential areas. Unlike spaceborne SAR with its fixed revisit period, GBSAR data can be acquired with an adjustable temporal resolution through either continuous or discontinuous operation. However, challenges arise in processing high temporal-resolution continuous GBSAR data, including the extreme cost of random-access memory (RAM), the delay in producing displacement maps, and the loss of temporal evolution. Moreover, repositioning errors between discontinuous campaigns impede the accurate measurement of surface displacements. Therefore, a versatile package with two complete chains is developed in this study to process both continuous and discontinuous GBSAR data and address the aforementioned issues. The first chain is based on a small-baseline subset concept and processes continuous GBSAR images unit by unit, where the images within a window form a basic unit. With this strategy, the RAM requirement is reduced to a single unit of images, and the chain can theoretically process an infinite number of images. The evolution of surface displacements can be detected because the chain keeps temporarily coherent pixels that are present only in certain units rather than throughout the whole observation period. The chain supports real-time processing of continuous data, so the delay in creating displacement maps can be shortened without waiting for the entire dataset. The other chain measures deformation between discontinuous campaigns. Temporal averaging is carried out on the stack of images from a single campaign in order to improve the signal-to-noise ratio of discontinuous data and minimise the loss of coherence. The temporally averaged images are then processed by a dedicated interferometry procedure that integrates advanced interferometric SAR algorithms such as robust coherence estimation, non-local filtering, and the selection of partially coherent pixels. Experiments were conducted using both synthetic and real-world GBSAR data. Displacement time series at the sub-millimetre level are achieved in several applications (e.g. a coastal cliff, a sand dune, a bridge, and a residential area), indicating the feasibility of the developed GBSAR data processing package for deformation monitoring in a wide range of scientific and practical applications.
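The unit-by-unit strategy of the first chain can be caricatured as bounded-memory stream processing. This is a structural sketch only: `estimate_displacement` stands in for the actual interferometric processing of a unit, and the non-overlapping window policy is an assumption.

```python
from collections import deque

def process_unit_by_unit(images, unit_size, estimate_displacement):
    """Consume a continuous GBSAR image stream in fixed-size units so
    that at most `unit_size` images are held in RAM at once, and emit
    one displacement product per unit instead of waiting for the
    whole dataset -- mirroring the first chain's bounded-RAM,
    low-latency behaviour."""
    unit = deque()
    for image in images:
        unit.append(image)
        if len(unit) == unit_size:
            yield estimate_displacement(list(unit))
            unit.clear()
```

Because each unit is processed independently, pixels that are coherent only within some units can still contribute to the displacement time series, which is how the temporal evolution is preserved.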

Keywords: ground-based synthetic aperture radar, interferometry, small baseline subset algorithm, deformation monitoring

Procedia PDF Downloads 141