Search results for: interval time-varying delays
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1114

934 Multi-Criteria Test Case Selection Using Ant Colony Optimization

Authors: Niranjana Devi N.

Abstract:

Test case selection chooses the subset of fit test cases and removes unfit, ambiguous, redundant, and unnecessary ones, which in turn improves the quality and reduces the cost of software testing. Test case optimization is the problem of finding the best subset of test cases, from a pool of candidates to be audited, that meets all the objectives of testing concurrently. However, most research has evaluated the fitness of test cases on a single parameter, fault-detecting capability, and optimized the test cases against a single objective. In the proposed approach, nine parameters are considered for test case selection, and the best subset of parameters is obtained using an Interval Type-2 Fuzzy Rough Set. Test case selection is done in two stages. The first stage is a fuzzy entropy-based filtration technique, used for estimating and reducing the ambiguity in test case fitness evaluation and selection. The second stage is an ant colony optimization-based wrapper technique with a forward search strategy, employed to select test cases from the reduced test suite of the first stage. The results are evaluated using the coverage parameters Precision, Recall, F-Measure, APSC, APDC, and SSR. The experimental evaluation demonstrates that this approach avoids considerable computational effort.
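
A minimal sketch can make the first stage of the pipeline concrete. The Python snippet below assumes De Luca-Termini fuzzy entropy over a single fitness membership value per test case; the test-case names, membership values, and the 0.9 entropy threshold are illustrative assumptions, not values from the paper.

```python
import math

def fuzzy_entropy(mu):
    # De Luca-Termini fuzzy entropy of one membership value:
    # 0 for crisp values (0 or 1), maximal (1.0) at mu = 0.5.
    if mu in (0.0, 1.0):
        return 0.0
    return -(mu * math.log2(mu) + (1 - mu) * math.log2(1 - mu))

def entropy_filter(test_cases, threshold=0.9):
    # Stage 1: drop test cases whose fitness membership is too
    # ambiguous (high entropy) before the ACO wrapper stage.
    return {tc: mu for tc, mu in test_cases.items()
            if fuzzy_entropy(mu) <= threshold}

# Hypothetical fitness memberships for four test cases.
suite = {"tc1": 0.95, "tc2": 0.50, "tc3": 0.10, "tc4": 0.55}
reduced = entropy_filter(suite)  # tc2 and tc4 are too ambiguous
```

The second, ACO-based wrapper stage would then search over subsets of `reduced` only, which is where the computational saving comes from.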

Keywords: ant colony optimization, fuzzy entropy, interval type-2 fuzzy rough set, test case selection

Procedia PDF Downloads 630
933 Determinants of Diarrhoea Prevalence Variations in Mountainous Informal Settlements of Kigali City, Rwanda

Authors: Dieudonne Uwizeye

Abstract:

Introduction: Diarrhoea is one of the major causes of morbidity and mortality among communities living in urban informal settlements of developing countries. It is assumed that a mountainous environment introduces variations in the burden among residents of the same settlements. Design and Objective: A cross-sectional study was done in Kigali to explore the effect of mountainous informal settlements on diarrhoea risk variations. Data were collected from 1,152 households through a household survey and transect walks to observe the status of sanitation. The outcome variable was the incidence of diarrhoea among household members of any age. The study used the most knowledgeable person in the household as the main respondent. Mostly this was the woman of the house, as she was more likely to know the health status of every household member through her various roles: mother, wife, and head of the household, among others. The analysis used cross-tabulation and logistic regression. Results: Results suggest that risks for diarrhoea vary depending on home location in the settlements. Diarrhoea risk increased as the distance from the road increased: the logistic regression analysis yielded adjusted odds ratios of 2.97 (95% confidence interval: 1.35-6.55) and 3.50 (95% confidence interval: 1.61-7.60) for levels two and three respectively, compared with level one. The status of sanitation within and around homes was also significantly associated with increased diarrhoea. Equally, stable households were less likely to have diarrhoea (adjusted odds ratio 0.45, 95% confidence interval: 0.25-0.81). However, the study did not find evidence for a significant association between diarrhoea risk and household socioeconomic status in the multivariable model; it is assumed that environmental factors in mountainous settings prevailed. Households using the available public water sources were more likely to have diarrhoea. Recommendation: The study recommends the provision and extension of infrastructure for improved water, drainage, sanitation, and waste management facilities. Equally, studies should be done to identify the level of contamination and the potential origin of contaminants in the valley water sources, to adequately control the risks for diarrhoea in mountainous urban settings.
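
The reported effect sizes follow the standard logistic-regression recipe: exponentiate a coefficient to get an adjusted odds ratio, and exponentiate the coefficient plus or minus 1.96 standard errors for the 95% confidence interval. A Python sketch, using an illustrative coefficient and standard error chosen to roughly reproduce the level-two estimate (not the study's actual regression output):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    # Convert a logistic-regression coefficient (log odds) and its
    # standard error into an odds ratio with a 95% Wald interval.
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative values approximating the level-two result
# (adjusted OR ~2.97, 95% CI ~1.35-6.55); hypothetical, not the paper's data.
or_level2, ci_low, ci_high = odds_ratio_ci(beta=1.089, se=0.403)
```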

Keywords: urbanisation, diarrhoea risk, mountainous environment, urban informal settlements in Rwanda

Procedia PDF Downloads 141
932 Model for Calculating Traffic Mass and Deceleration Delays Based on Traffic Field Theory

Authors: Liu Canqi, Zeng Junsheng

Abstract:

This study identifies two typical bottlenecks that occur when a vehicle cannot change lanes: car following and car stopping. The ideas of the traffic field and traffic mass are presented. When other vehicles are present in front of the target vehicle within a particular distance, a force is created that affects the target vehicle's driving speed. The characteristics of the driver and the vehicle collectively determine the traffic mass; the driving speed of the vehicle and external variables have no bearing on it. At the physical level, this study examines the car-following bottleneck, identifies the outside factors that influence driving, accounts for the kinetic energy a vehicle converts into potential energy during deceleration, and builds a calculation model for traffic mass. From an economic standpoint, an energy-time conversion coefficient is derived from the social average wage level and the average cost of motor fuel. The Vissim simulation program is used to measure the vehicle's deceleration distance and delays under the Wiedemann car-following model. Using the conversion model between traffic mass and deceleration delay, the deceleration delay measured by simulation is compared with the theoretical value calculated by the model. The experimental data demonstrate that the model is reliable, since the error rate between the theoretical deceleration delay and the simulated measurement is less than 10%. The article concludes that the traffic field affects moving cars on the road and that both physical and socioeconomic factors should be taken into account when studying car-following behavior. The socioeconomic relationship between a vehicle's deceleration delay and its traffic mass can be used to calculate the energy-time conversion coefficient when dealing with the bottleneck of cars stopping and starting.
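
The core accounting step, converting the kinetic energy shed during deceleration into a time (and hence economic) cost, can be sketched as follows. The function names, the 50,000 J/s conversion coefficient, and the vehicle parameters are hypothetical illustrations of the idea, not the paper's calibrated model.

```python
def kinetic_energy(mass_kg, v_mps):
    # Translational kinetic energy in joules.
    return 0.5 * mass_kg * v_mps ** 2

def deceleration_delay(mass_kg, v0, v1, coeff_j_per_s):
    # Delay equivalent (seconds) of the kinetic energy shed when
    # slowing from v0 to v1, via an energy-time conversion
    # coefficient derived economically (e.g. from average wages
    # and fuel prices) in the paper's framework.
    shed = kinetic_energy(mass_kg, v0) - kinetic_energy(mass_kg, v1)
    return shed / coeff_j_per_s

# A 1500 kg car braking from 15 m/s to 5 m/s sheds 150 kJ; with a
# hypothetical coefficient of 50 kJ/s this maps to a 3 s delay.
delay_s = deceleration_delay(1500, 15, 5, 50_000)
```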

Keywords: traffic field, social economics, traffic mass, bottleneck, deceleration delay

Procedia PDF Downloads 30
931 Cost Overrun in Construction Projects

Authors: Hailu Kebede Bekele

Abstract:

Construction delays occur when project events do not take place at the expected time, due to causes related to the client, consultant, and contractor. Delay is the major cause of the cost overrun that leads to poor project efficiency. The difference between the actual cost at completion and the original estimate is known as cost overrun. Cost overruns are not simple issues that can be neglected; attention must be given to prevent the organization from failing and its financial expenses from escalating. Reasons raised in different studies show that the problem may arise in construction projects due to errors in budgeting, unfavorable weather conditions, inefficient machinery, and extravagance. The study focuses on mega projects, whose pace can significantly change the cost overrun calculation. Fifteen mega projects were identified to study the problem of cost overrun on site. The contractor, consultant, and client are the principal stakeholders in these mega projects, and 20 people from each sector were selected to participate in the investigation of the current mega construction project. The main objective of the study is to prioritize the major causes of the cost overrun problem. The methodology employed is qualitative, mostly rating the causes of construction project cost overrun; interviews, open-ended and closed-ended questions, group discussions, and qualitative rating methods are used to study construction project overruns. The results show that design mistakes, labor shortages, payment delays, old equipment and scheduling, weather conditions, lack of skilled labor, transportation, inflation, order variations, market price fluctuations, and people's attitudes and philosophies are the principal causes of cost overrun that degrade project performance. The institute should follow the scheduled activities to move the project forward positively.

Keywords: cost overrun, delay, mega projects, design

Procedia PDF Downloads 29
930 Fuzzy Time Series- Markov Chain Method for Corn and Soybean Price Forecasting in North Carolina Markets

Authors: Selin Guney, Andres Riquelme

Abstract:

Among the main purposes of optimal and efficient forecasts of agricultural commodity prices is to guide firms in their economic decision making, such as planning business operations and marketing decisions. Governments are also beneficiaries and suppliers of agricultural price forecasts. They use this information to establish proper agricultural policy; hence, the forecasts affect social welfare, and systematic errors in forecasts could lead to a misallocation of scarce resources. Various empirical approaches with different methodologies have been applied to forecast commodity prices. The most commonly used approaches depend on classical time series models that assume the values of the response variables are precise, which is quite often not true in reality. Recently, this literature has largely evolved toward fuzzy time series models, which relax classical time series assumptions such as stationarity and large sample size requirements. Besides, the fuzzy modeling approach allows decision making with estimated values under incomplete information or uncertainty. A number of fuzzy time series models have been developed and implemented over the last decades; however, most of them are not appropriate for forecasting repeated and nonconsecutive transitions in the data. The modeling scheme used in this paper eliminates this problem by introducing a Markov modeling approach that accounts for both repeated and nonconsecutive transitions. Also, the length of interval is crucial to forecast accuracy. The problem of determining the length of interval arbitrarily is overcome by proposing a methodology that determines the proper length based on the distribution or mean of the first differences of the series, improving forecast accuracy.
The specific purpose of this paper is to propose and investigate the potential of a new forecasting model that integrates this methodology for determining the proper length of interval with a fuzzy time series-Markov chain model. Moreover, the forecasting accuracy of the proposed integrated model is compared to different univariate time series models, and the superiority of the proposed method over competing methods, with respect to modelling and forecasting on the basis of forecast evaluation criteria, is demonstrated. The application is to daily corn and soybean prices observed at three commercially important North Carolina markets: Candor, Cofield, and Roaring River for corn, and Fayetteville, Cofield, and Greenville City for soybeans. One main conclusion of this paper is that using fuzzy logic improves forecast performance and accuracy; the effectiveness and potential benefits of the proposed model are confirmed by small selection criterion values such as MAPE. The paper concludes with a discussion of the implications of integrating fuzzy logic and non-arbitrary determination of the length of interval for the reliability and accuracy of price forecasts. The empirical results represent a significant contribution to our understanding of the applicability of fuzzy modeling in commodity price forecasts.
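
One widely used non-arbitrary rule of this kind bases the interval length on the mean of the absolute first differences of the series (Huarng's average-based length). A minimal Python sketch, with hypothetical prices and the final rounding-to-a-base step omitted:

```python
def average_based_length(series):
    # Half the mean absolute first difference: a common
    # data-driven choice of fuzzy-interval length.
    diffs = [abs(b - a) for a, b in zip(series, series[1:])]
    return (sum(diffs) / len(diffs)) / 2

prices = [4.10, 4.25, 4.05, 4.40, 4.30]  # hypothetical daily prices
length = average_based_length(prices)    # mean diff 0.20 -> length 0.10
```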

Keywords: commodity, forecast, fuzzy, Markov

Procedia PDF Downloads 196
929 Fuzzy Time Series Forecasting Based on Fuzzy Logical Relationships, PSO Technique, and Automatic Clustering Algorithm

Authors: A. K. M. Kamrul Islam, Abdelhamid Bouchachia, Suang Cang, Hongnian Yu

Abstract:

Forecasting models have a great impact on prediction and will continue to do so in the future. Although many forecasting models have been studied in recent years, most researchers focus on forecasting methods based on fuzzy time series. A forecasting model's accuracy depends fully on two factors: the length of the intervals in the universe of discourse and the content of the forecast rules. Moreover, a hybrid forecasting method can be a more effective and efficient way to improve forecasts than an individual model. Different hybrid forecasting models have combined fuzzy time series with evolutionary algorithms, but their performance is not quite satisfactory. In this paper, we propose a hybrid forecasting model that deals with first-order as well as high-order fuzzy time series and uses particle swarm optimization to improve forecast accuracy. The proposed method uses the historical enrollments of the University of Alabama as the dataset. First, an automatic clustering algorithm is used to calculate appropriate intervals for the historical enrollments. Then particle swarm optimization and fuzzy time series are combined, which shows better forecasting accuracy than other existing models.

Keywords: fuzzy time series (FTS), particle swarm optimization, clustering algorithm, hybrid forecasting model

Procedia PDF Downloads 219
928 Seismic Microzonation Analysis for Damage Mapping of the 2006 Yogyakarta Earthquake, Indonesia

Authors: Fathul Mubin, Budi E. Nurcahya

Abstract:

In 2006, a large earthquake occurred in the province of Yogyakarta, causing considerable damage. This motivated an investigation of the seismic vulnerability index around the earthquake zone, a study known as earthquake hazard microzonation. The research was conducted at the site of Prambanan Temple and its surroundings, including homes and civil buildings, because the 2006 earthquake damaged the temples in the Prambanan complex and nearby structures. Data were collected for 60 minutes using three-component seismograph measurements at 165 points with a spacing of 1000 meters. The recorded time series were analyzed using the spectral ratio method known as the Horizontal to Vertical Spectral Ratio (HVSR). The results of this analysis, the dominant frequency (Fg) and the maximum amplification factor (Ag), are used to obtain the seismic vulnerability index. The dominant frequency ranges from 0.5 to 30 Hz, the amplification factor from 0.5 to 9, and the seismic vulnerability index from 0.1 to 50. The distribution map of the seismic vulnerability index was consistent with the observed building damage. Further surveys should extend east (Klaten) and south (Bantul, DIY) to produce a complete distribution map of the seismic vulnerability index.
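
The vulnerability index in HVSR studies is typically Nakamura's Kg = Ag²/Fg. A one-line Python sketch with hypothetical measurement values (note that extreme values such as Ag = 9 at Fg = 1.62 Hz would indeed give an index of 50, matching the reported upper bound):

```python
def vulnerability_index(ag, fg):
    # Nakamura's seismic vulnerability index Kg = Ag^2 / Fg, from the
    # HVSR amplification factor Ag and dominant frequency Fg (Hz).
    return ag ** 2 / fg

kg = vulnerability_index(ag=9.0, fg=1.62)  # hypothetical measurement point
```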

Keywords: amplification factor, dominant frequency, microzonation analysis, seismic vulnerability index

Procedia PDF Downloads 169
927 Effect of Tissue Preservation Chemicals on Decomposition in Different Soil Types

Authors: Onyekachi Ogbonnaya Iroanya, Taiye Abdullahi Gegele, Frank Tochukwu Egwuatu

Abstract:

Introduction: Forensic taphonomy is a multifaceted area that incorporates decomposition and chemical and biological cadaver exposure into post-mortem event chronology and reconstruction to predict the post-mortem interval (PMI). The aim of this study was to evaluate the integrity of DNA extracted from the remains of embalmed, decomposed Sus domesticus tissues buried in different soil types. Method: A total of 12 limbs of Sus domesticus weighing between 0.7 and 1.4 kg were used. Each sample across the groups was treated with 10% formaldehyde, absolute methanol, or 50% pine oil for 24 hours before burial, except the control samples, which were buried immediately. All samples were buried in shallow simulated clay, sandy, and loamy soil graves for 12 months. The DNA of each sample was extracted and quantified with a Nanodrop spectrophotometer (6305 JENWAY). The rate of decomposition was examined through modified qualitative decomposition analysis. Extracted DNA was amplified through PCR, and bands were visualized via gel electrophoresis. A biochemical enzyme assay was done for each burial grave soil. Result: The limbs in all burial groups lost weight over the burial period. There was a significant increase in soil urease in the samples preserved in formaldehyde across the three soil type groups (p≤0.01). Also, the control grave soils recorded significantly higher alkaline phosphatase, dehydrogenase, and calcium carbonate values compared to experimental grave soils (p≤0.01). The experimental samples showed a significant decrease in DNA concentration and purity compared to the control groups (p≤0.01). The soil biochemical analysis showed that the embalming treatment altered the relationship between organic matter decomposition and soil biochemical properties, as observed in the fluctuations recorded in the soil biochemical parameters. The PCR-amplified DNA showed no bands on the gel electrophoresis plates.
Conclusion: In criminal investigations, factors such as burial grave soil, grave soil biochemical properties, and antemortem exposure to embalming chemicals should be considered in post-mortem interval (PMI) determination.

Keywords: forensic taphonomy, post-mortem interval (PMI), embalmment, decomposition, grave soil

Procedia PDF Downloads 135
926 Complexity in a Leslie-Gower Delayed Prey-Predator Model

Authors: Anuraj Singh

Abstract:

Complex dynamics are explored in a prey-predator system with multiple delays, in which the predator dynamics follow a Leslie-Gower scheme. The existence of periodic solutions via Hopf bifurcation with respect to the delay parameters is established. Numerical simulations are performed to substantiate the analytical findings. The system shows rich dynamic behavior, including chaos and limit cycles.
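
A delayed prey-predator system of this type can be explored numerically with a simple Euler scheme that keeps a history buffer for the delayed terms. The sketch below uses a generic Leslie-Gower form with one discrete delay in the predator's carrying-capacity term; all parameter values are illustrative, not taken from the paper.

```python
def simulate(r1=1.0, b=0.1, a=0.2, r2=0.5, c=0.3, tau=1.0,
             dt=0.01, steps=2000, x0=5.0, y0=2.0):
    # Euler integration of a Leslie-Gower prey (x) / predator (y) model:
    #   x' = x * (r1 - b*x - a*y)
    #   y' = y * (r2 - c * y(t - tau) / x(t - tau))
    lag = round(tau / dt)
    xs, ys = [x0] * (lag + 1), [y0] * (lag + 1)  # constant initial history
    for _ in range(steps):
        x, y = xs[-1], ys[-1]
        xd, yd = xs[-1 - lag], ys[-1 - lag]      # delayed states
        xs.append(x + dt * x * (r1 - b * x - a * y))
        ys.append(y + dt * y * (r2 - c * yd / max(xd, 1e-9)))
    return xs, ys

xs, ys = simulate()  # stays bounded for these illustrative parameters
```

Scanning the trajectories while increasing `tau` is the numerical counterpart of the Hopf-bifurcation analysis: beyond a critical delay the equilibrium loses stability and periodic solutions appear.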

Keywords: chaos, Hopf bifurcation, stability, time delay

Procedia PDF Downloads 299
925 Process Optimization for Albanian Crude Oil Characterization

Authors: Xhaklina Cani, Ilirjan Malollari, Ismet Beqiraj, Lorina Lici

Abstract:

Oil characterization is an essential step in the design, simulation, and optimization of refining facilities. To make optimal crude selection and processing decisions, a refiner must have accurate information about crude oil quality. This includes the crude oil TBP curve, the main input for correct operation of refinery crude oil atmospheric distillation plants. Crude oil is typically characterized on the basis of a distillation assay. This procedure is reasonably well defined and represents the mixture of actual components that boil within a boiling point interval by hypothetical components that boil at the average boiling temperature of the interval. The crude oil assay typically includes TBP distillation according to ASTM D-2892, which characterizes the part of the oil that boils up to a 400 °C atmospheric equivalent boiling point. To model the yield curves obtained by physical distillation, it is necessary to compare the model with the experimental data. Most commercial simulators use different numbers of components and pseudo-components to represent crude oil. Laboratory tests include distillations, vapor pressures, flash points, pour points, cetane numbers, octane numbers, densities, and viscosities. The aim of the study is to draw true boiling point curves for different crude oil resources in Albania and to compare the model against the experimental data for optimal characterization of the crude oil.
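
The pseudo-component representation described above is easy to sketch: each TBP cut interval is replaced by a hypothetical component boiling at the interval's average temperature. Illustrative Python, with made-up cut temperatures:

```python
def pseudo_components(cut_points_c):
    # Map consecutive TBP cut temperatures (deg C) to pseudo-components
    # boiling at each interval's average temperature.
    return [(lo, hi, (lo + hi) / 2)
            for lo, hi in zip(cut_points_c, cut_points_c[1:])]

cuts = [40, 80, 120, 160, 200]   # hypothetical TBP cut points, deg C
pcs = pseudo_components(cuts)    # first pseudo-component boils at 60 deg C
```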

Keywords: TBP distillation curves, crude oil, optimization, simulation

Procedia PDF Downloads 282
924 Gaussian Probability Density for Forest Fire Detection Using Satellite Imagery

Authors: S. Benkraouda, Z. Djelloul-Khedda, B. Yagoubi

Abstract:

We present a method for early detection of forest fires from a thermal infrared satellite image, using the image matrix of the probability of belonging. The principle of the method is to compare a theoretical mathematical model to an experimental one. We treated each line of the image matrix as a realization of a non-stationary random process. Since the distribution of pixels in the satellite image is statistically dependent, we divided these lines into small stationary and ergodic intervals to characterize the image by an adequate mathematical model. A standard deviation was chosen to generate random variables, so each interval behaves naturally like white Gaussian noise. The Gaussian was selected as the mathematical model representing the large majority of pixels, which can be considered the image background. After a few pre-processing steps, the parameters of the theoretical Gaussian model were extracted from the modeled image; these parameters are used to calculate the probability that each interval of the modeled image belongs to the theoretical Gaussian model. High-intensity pixels are regarded as foreign to the background, so they have a low probability, while pixels that belong to the background have a high probability. Finally, we present the inverse of the matrix of interval probabilities for better fire detection.

Keywords: forest fire, forest fire detection, satellite image, normal distribution, theoretical Gaussian model, thermal infrared matrix image

Procedia PDF Downloads 112
923 Analysis of Maternal Death Surveillance and Response: Causes and Contributing Factors in Addis Ababa, Ethiopia, 2022

Authors: Sisay Tiroro Salato

Abstract:

Background: Ethiopia has been implementing the maternal death surveillance and response system to provide real-time actionable information, including causes of death and contributing factors. An analysis of maternal mortality surveillance data was conducted to identify the causes and underlying factors in Addis Ababa, Ethiopia. Methods: We carried out a retrospective analysis of surveillance data on 324 maternal deaths reported in Addis Ababa, Ethiopia, from 2017 to 2021. The data were extracted from the national maternal death surveillance and response database, including information from case investigation, verbal autopsy, and facility extraction forms. The data were analyzed by computing frequencies and are presented as numbers, proportions, and ratios. Results: Of 324 maternal deaths, 92% occurred in health facilities, 6.2% in transit, and 1.5% at home. The mean age at death was 28 years, ranging from 17 to 45. The maternal mortality ratio per 100,000 live births was 77 for the five years, ranging from 126 in 2017 to 21 in 2021. Direct and indirect causes were responsible for 87% and 13% of deaths, respectively. The direct causes included obstetric haemorrhage, hypertensive disorders in pregnancy, puerperal sepsis, embolism, obstructed labour, and abortion. The third delay (delay in receiving care after reaching health facilities) accounted for 57% of deaths, while the first delay (delay in deciding to seek health care) and the second delay (delay in reaching health facilities) accounted for 34% and 24%, respectively. Late arrival at the referral facility, delayed management after admission, and non-recognition of danger signs were underlying factors. Conclusion: Over 86% of maternal deaths were attributable to avoidable direct causes. The majority of women do try to reach health services when an emergency occurs, but the third delay presents a major problem. Improving the quality of care at the healthcare facility level will help to reduce maternal deaths.

Keywords: maternal death, surveillance, delays, factors

Procedia PDF Downloads 69
922 Parallel Multisplitting Methods for Differential Systems

Authors: Malika El Kyal, Ahmed Machmoum

Abstract:

We prove the superlinear convergence of asynchronous multisplitting methods applied to differential equations. The study is based on the technique of nested sets, which makes it possible to specify the kind of convergence obtained in the asynchronous mode. The main characteristic of an asynchronous mode is that the local algorithm does not have to wait for predetermined messages to become available. We allow some processors to communicate more frequently than others, and we allow the communication delays to be substantial and unpredictable. Note that synchronous algorithms in the computer science sense are particular cases of our asynchronous formulation.

Keywords: parallel methods, asynchronous mode, multisplitting, ODE

Procedia PDF Downloads 498
921 Estimating the Traffic Impacts of Green Light Optimal Speed Advisory Systems Using Microsimulation

Authors: C. B. Masera, M. Imprialou, L. Budd, C. Morton

Abstract:

Even though signalised intersections are necessary for urban road traffic management, they can act as bottlenecks and disrupt traffic operations. Interrupted traffic flow causes congestion, delays, stop-and-go conditions (i.e. excessive acceleration and deceleration) and longer journey times. Vehicle and infrastructure connectivity offers the potential to provide improved new services with additional driver-assistance functions. This paper focuses on one application of vehicle-to-infrastructure communication, namely Green Light Optimal Speed Advisory (GLOSA). To assess the effectiveness of GLOSA in the urban road network, an integrated microscopic traffic simulation framework is built in the VISSIM software. Vehicle movements and vehicle-infrastructure communications are simulated through the External Driver Model interface. A control algorithm is developed that recommends an optimal speed, continuously updated at every time step, for all vehicles approaching a signal-controlled point. This algorithm allows vehicles to pass a traffic signal without stopping, or to minimise stopping times at a red phase. The study is performed with all vehicles connected, at a 100% penetration rate; conventional vehicles are also simulated in the same network as a reference. A straight road segment composed of two opposite directions with two traffic lights per lane is studied. The simulation is run under traffic volumes of 150 and 200 vehicles per hour per lane to identify how different traffic densities influence the benefits of GLOSA. The results indicate that traffic flow is improved by the application of GLOSA: vehicles passed through the traffic lights more smoothly, and waiting times were reduced by up to 28 seconds. Average delays decreased for the entire network by 86.46% and 83.84% under traffic densities of 150 and 200 vehicles per hour per lane, respectively.
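
The advisory logic itself reduces to a feasibility check: find a speed within limits that puts the vehicle at the stop line during a green window. A simplified constant-speed sketch in Python (the speed limits, the single-window phase model, and the function interface are illustrative assumptions, not the paper's algorithm):

```python
def advisory_speed(distance_m, green_start_s, green_end_s,
                   v_max=13.9, v_min=2.0):
    # Recommend a constant speed (m/s) that reaches the stop line
    # while the signal is green; None if no feasible speed exists.
    if green_start_s <= 0 < green_end_s and distance_m / v_max <= green_end_s:
        return v_max                      # already green: proceed at limit
    fastest = distance_m / max(green_start_s, 1e-9)  # arrive as green starts
    slowest = distance_m / green_end_s               # arrive as green ends
    lo, hi = max(slowest, v_min), min(fastest, v_max)
    return hi if lo <= hi else None       # prefer the highest feasible speed

v = advisory_speed(200, 20, 35)  # 200 m to go, green in 20 s: advise 10 m/s
```

In a microsimulation setup, such a check would be re-evaluated at every time step for each approaching vehicle, with deceleration to a stop as the fallback when no feasible speed exists.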

Keywords: connected vehicles, GLOSA, intelligent transport systems, vehicle-to-infrastructure communication

Procedia PDF Downloads 136
920 Tracking Patient Pathway for Assessing Public Health and Financial Burden to Community for Pulmonary Tuberculosis: Pointer from Central India

Authors: Ashish Sinha, Pushpend Agrawal

Abstract:

Background: Patients with undiagnosed pulmonary TB predominantly act as reservoirs for its transmission, causing 10-15 secondary infections over the next 1-5 years. Delays in diagnosis and treatment may worsen the disease and increase the risk of death. Identifying the factors responsible for such delays by tracking patient pathways to treatment may help in planning better interventions. The provision of free diagnosis and treatment forms the cornerstone of the National Tuberculosis Elimination Programme (NTEP). Out-of-pocket expenditure (OOPE) is defined as the money spent by the patient during TB care outside public health facilities. Free TB care at all health facilities could reduce out-of-pocket expenses to the minimum possible levels. Material and Methods: This cross-sectional study was conducted among 252 randomly selected TB patients from Nov – Oct 2022 through in-depth interviews following informed verbal consent. We documented their journey from initial symptoms until they reached the public health facility, along with their out-of-pocket expenditure pertaining to TB care. Results: Total treatment delay was 91±72 days on average (median: 77 days, IQR: 45-104 days), while the isolated patient delay was 31±45 days (median: 15 days, IQR: 0-43 days); diagnostic delay was 57±60 days (median: 42 days, IQR: 14-78 days); and treatment delay was 19±18 days (median: 15 days, IQR: 11-19 days). Patient delay (> 30 days) was significantly associated with ignorance of the classic symptoms of pulmonary TB, self-medication, illiteracy, and middle and lower social class. Diagnostic delay was significantly higher among those who contacted private health facilities, were unaware of signs and symptoms, had more than two consultations, and did not get an appropriate referral for TB care. Most (97%) of the study participants interviewed claimed to have incurred some expenditure. Median total expenses were 6,155 rupees (IQR: 2,625-15,175), and more than half of the study participants (141; 56%) had expenses above 5,000 rupees. Median transport expenses were 525 rupees (IQR: 200-1,012); median consultation expenses were 700 rupees (IQR: 200-1,600); median investigation expenses were 1,000 rupees (IQR: 0-3,025); and median medicine expenses were 3,350 rupees (IQR: 1,300-7,525). OOPE for consultation, investigation, and medicine was significantly higher among patients who ignored classical signs and symptoms of TB, made repeated visits to private health facilities, or practiced self-medication. Transport expenses and delays in seeking care showed an upward trend with OOPE (r = 1). Conclusion: Delays in TB care due to low awareness of the signs and symptoms of TB, poor care seeking, lack of proper consultation, and lack of appropriate referrals indicate the areas that need proper attention from programme managers. Despite a centrally sponsored programme, the financial burden on TB patients is still in the unacceptable range. OOPE could be reduced to the lowest possible level by addressing the factors linked to it.

Keywords: patient pathway, delay, pulmonary tuberculosis, out of pocket expenses

Procedia PDF Downloads 36
919 Estimation of a Finite Population Mean under Random Non Response Using Improved Nadaraya and Watson Kernel Weights

Authors: Nelson Bii, Christopher Ouma, John Odhiambo

Abstract:

Non-response is a potential source of error in sample surveys; it introduces bias and large variance in the estimation of finite population parameters. Regression models are recognized as one of the techniques for reducing the bias and variance due to random non-response using auxiliary data. In this study, random non-response is assumed to occur in the survey variable in the second stage of cluster sampling, with full auxiliary information available throughout. Auxiliary information is used at the estimation stage via a regression model to address the problem of random non-response; in particular, it is used via an improved Nadaraya-Watson kernel regression technique to compensate for random non-response. The asymptotic bias and mean squared error of the proposed estimator are derived. In addition, a simulation study indicates that the proposed estimator has smaller bias and smaller mean squared error values than existing estimators of the finite population mean. The proposed estimator is also shown to have tighter confidence interval lengths at a 95% coverage rate. The results obtained in this study are useful, for instance, in choosing efficient estimators of the finite population mean in demographic sample surveys.
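
The Nadaraya-Watson estimator at the heart of the compensation step is a locally weighted mean of the observed responses. A minimal Gaussian-kernel version in Python (the data and bandwidth are illustrative; the paper's improved variant adds corrections beyond this basic form):

```python
import math

def nw_estimate(x0, xs, ys, h=1.0):
    # Nadaraya-Watson regression: kernel-weighted mean of ys at x0,
    # using a Gaussian kernel of bandwidth h.
    w = [math.exp(-0.5 * ((x - x0) / h) ** 2) for x in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]                # exactly linear: y = 1 + 2x
y_mid = nw_estimate(1.5, xs, ys, h=0.8)  # symmetric point: recovers 4.0
```

In the non-response setting, predictions of this form stand in for the missing second-stage observations using the auxiliary variable x.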

Keywords: mean squared error, random non-response, two-stage cluster sampling, confidence interval lengths

Procedia PDF Downloads 107
918 Control of Biofilm Formation and Inorganic Particle Accumulation on Reverse Osmosis Membrane by Hypochlorite Washing

Authors: Masaki Ohno, Cervinia Manalo, Tetsuji Okuda, Satoshi Nakai, Wataru Nishijima

Abstract:

Reverse osmosis (RO) membranes have been widely used in desalination to purify water for drinking and other purposes. Although at present most RO membranes have no resistance to chlorine, chlorine-resistant membranes are being developed. Direct chlorine treatment or chlorine washing will therefore be an option for preventing biofouling on chlorine-resistant membranes. Furthermore, if particle accumulation can be controlled by chlorine washing, expensive pretreatment for particle removal can be eliminated or simplified. The objective of this study was to determine the effective hypochlorite washing conditions required for controlling biofilm formation and inorganic particle accumulation on an RO membrane in a continuous flow channel with RO membrane and spacer. In this study, direct chlorine washing was done by soaking fouled RO membranes in hypochlorite solution, and fluorescence intensity was used to quantify biofilm on the membrane surface. After 48 h of soaking the membranes in high-fouling-potential waters, the fluorescence intensity decreased from 470 to 0 under the following washing conditions: 10 mg/L chlorine concentration, 2 times/d washing interval, and 30 min washing time. Within the ranges tested (chlorine concentration 0.5–10 mg/L, washing interval 1–4 times/d, washing time 1–30 min), the chlorine concentration required to control biofilm formation decreased as the washing interval or washing time increased. For the sample solutions used in the study, a 10 mg/L chlorine concentration with a 2 times/d interval and 5 min washing time was required for biofilm control. The optimum chlorine washing conditions obtained from the soaking experiments proved applicable to controlling biofilm formation in continuous flow experiments as well.
Moreover, chlorine washing employed to control biofilm in the presence of suspended particles resulted in lower amounts of organic (0.03 mg/cm2) and inorganic (0.14 mg/cm2) deposits on the membrane than for sample water without chlorine washing (0.14 mg/cm2 and 0.33 mg/cm2, respectively). The amount of biofilm formed was reduced by 79% by continuous washing with 10 mg/L free chlorine, and the inorganic accumulation decreased by 58%, to a level similar to that of pure water with kaolin (0.17 mg/cm2) as feed water. These results confirmed that particle accumulation is accelerated by biofilm formation, and that inhibiting biofilm growth can almost completely prevent further particle accumulation. In addition, a hypochlorite washing condition that effectively controls both biofilm formation and particle accumulation could be achieved.

Keywords: reverse osmosis, washing condition optimization, hypochlorous acid, biofouling control

Procedia PDF Downloads 315
917 Parking Service Effectiveness at Commercial Malls

Authors: Ahmad AlAbdullah, Ali AlQallaf, Mahdi Hussain, Mohammed AlAttar, Salman Ashknani, Magdy Helal

Abstract:

We study the effectiveness of the parking service provided at Kuwaiti commercial malls and explore potential problems and feasible improvements. Commercial malls are important to Kuwaitis as entertainment and shopping centers due to the lack of other alternatives. The difficulty and relatively long time wasted in finding a parking spot at the mall are real annoyances. We applied queuing analysis to one of the major malls that offers paid parking (1040 parking spots) in addition to free parking. Patrons of the mall usually complained of the traffic jams and delays at the entrance to the paid parking (the average delay to park exceeds 15 min for about 62% of the patrons, while the average time spent in the mall is about 2.6 hours). However, the analysis showed acceptable service levels at the check-in gates of the parking garage. A detailed review of vehicle movement at the gateways indicated that arriving and departing cars had to share parts of the gateway to the garage, which caused the traffic jams and delays. A simple comparison indicated that the largest commercial mall in Kuwait does not suffer such parking issues, while other smaller yet important malls do, including the one we studied. This suggests that the well-designed inlets and outlets of that gigantic mall permit smooth parking despite parking being totally free and the mall being the first choice of most people for entertainment and shopping. A simulation model is being developed for further analysis and verification. Simulation can overcome the mathematical difficulty of using non-Poisson queuing models. The simulation model is used to explore potential changes to the parking garage entrance layout. With the inclusion of drivers' behavior inside the parking garage, effectiveness indicators can be derived to address the economic feasibility of extending the parking capacity and increasing service levels. Outcomes of the study are planned to be generalized, as appropriate, to other commercial malls in Kuwait.
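For a single bank of identical gates with Poisson arrivals and exponential service, the kind of check-in analysis described above is conventionally an M/M/c (Erlang C) calculation. A minimal sketch, with gate counts and rates chosen purely for illustration (the study's actual data are not reproduced here):

```python
import math

def erlang_c_wait(lam, mu, c):
    """Mean waiting time in queue (W_q) for an M/M/c system.

    lam: arrival rate (cars/min), mu: service rate per gate (cars/min),
    c: number of gates. Valid only when rho = lam/(c*mu) < 1.
    """
    a = lam / mu                      # offered load in Erlangs
    rho = a / c
    assert rho < 1, "system unstable"
    # Erlang C formula: probability an arriving car must wait
    summation = sum(a ** k / math.factorial(k) for k in range(c))
    top = a ** c / (math.factorial(c) * (1 - rho))
    p_wait = top / (summation + top)
    return p_wait / (c * mu - lam)    # mean time in queue (minutes)

# Hypothetical figures: 4 check-in gates, 3 cars/min arriving,
# each gate processing 1 car/min on average
print(round(erlang_c_wait(3.0, 1.0, 4), 2))
```

When arrivals or service times are not Poisson/exponential, as the abstract notes, this closed form no longer applies, which is why the authors turn to simulation.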

Keywords: commercial malls, parking service, queuing analysis, simulation modeling

Procedia PDF Downloads 317
916 A Review of Paleo-Depositional Environment and Thermal Alteration Index of Carboniferous, Permian, and Triassic of A1-9 Well, NW Libya

Authors: M. A. Alrabib, Y. Sherif, A. K. Mohamed, E. A. Elfandi, E. I. Fandi

Abstract:

This paper presents a palaeo-environmental analysis and hydrocarbon-show evaluation of well A1-9. A poor to very poor oil show was identified in the interval from the Dembaba Formation to the Hassaouna Formation, and the palaeo-environmental analysis indicates that neither a particularly good reservoir nor a source rock has developed in the area. Recent palaeo-environmental work shows that the sedimentary succession in this area comprises the Upper Paleozoic rocks of the Carboniferous and Permian and the Mesozoic (Triassic) sedimentary sequences. No early Paleozoic rocks have been found in this area; these rocks were eroded during Late Carboniferous and Early Permian time. There is evidence of a major marine transgression during latest Permian and earliest Triassic time. From depths of 5930-5940 feet to 10800-10810 feet, the TAI of the Al Guidr, Bir Al Jaja, Al Uotia, and Hebilia units and the top varies between 3+ and 4- (mature to dry gas); this interval includes part of the Dembaba Formation. From a depth of 10800-10810 feet to total sediment depth (11944 feet, log), an interval that includes the rest of the Dembaba Formation, the underlying equivalents of the Assedjefar and M'Rar Formations, and the underlying indeterminate unit (Hassouna Formation), the TAI varies between 4 and 5 (dry gas; black and deformed).

Keywords: paleoenvironmental, thermal alteration index, north western Libya, hydrocarbon

Procedia PDF Downloads 444
915 Aquatic Therapy Improving Balance Function of Individuals with Stroke: A Systematic Review with Meta-Analysis

Authors: Wei-Po Wu, Wen-Yu Liu, Wei-Ting Lin, Hen-Yu Lien

Abstract:

Introduction: Improving balance function for individuals after stroke is a crucial target in physiotherapy. Aquatic therapy, which challenges an individual's postural control in an unstable fluid environment, may be beneficial in enhancing balance functions. The purpose of this systematic review with meta-analyses was to validate the effects of aquatic therapy on improving balance functions for individuals with stroke in contrast to conventional physiotherapy. Method: Available studies were explored in three electronic databases: PubMed, Scopus, and Web of Science. The search was not limited by publication date. The included studies had to be randomized controlled trials (RCTs) and contain at least one outcome measurement of balance function. The PEDro scale was adopted to assess the quality of the included studies, while the 'Oxford Centre for Evidence-Based Medicine 2011 Levels of Evidence' was used to evaluate the level of evidence. After data extraction, studies with the same outcome measures were pooled together for meta-analysis. Result: Ten studies with 282 participants were included in the analyses. The research quality of the studies ranged from fair to good (4 to 8 points). Levels of evidence of the included studies were graded as level 2 and 3. Finally, scores of the Berg Balance Scale (BBS), eyes-closed force-plate center of pressure velocity (anterior-posterior and medial-lateral axes), and the Timed Up and Go test were pooled and analyzed separately.
The pooled results showed improvement in balance function (BBS mean difference (MD): 1.39 points; 95% confidence interval (CI): 0.05-2.29; p=0.002) (eyes-closed force-plate center of pressure velocity (anterior-posterior axis) MD: 1.39 mm/s; 95% CI: 0.93-1.86; p<0.001) (eyes-closed force-plate center of pressure velocity (medial-lateral axis) MD: 1.48 mm/s; 95% CI: 0.15-2.82; p=0.03) and mobility (MD: 0.9 seconds; 95% CI: 0.07-1.73; p=0.03) of individuals with stroke after aquatic therapy compared with conventional therapy. Although there were significant differences between the two treatment groups, the differences in improvement were relatively small. Conclusion: Aquatic therapy improved general balance function and mobility in individuals with stroke better than conventional physiotherapy.
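Pooled mean differences of the kind reported above are conventionally obtained by inverse-variance weighting. A minimal fixed-effect sketch with hypothetical per-study values (not the review's data, and the review's actual pooling model is not specified here):

```python
import math

def pool_fixed_effect(estimates, ses):
    """Inverse-variance fixed-effect pooled mean difference and 95% CI.

    estimates: per-study mean differences; ses: their standard errors.
    Each study is weighted by the inverse of its variance.
    """
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

# Hypothetical BBS mean differences (points) and SEs from three trials
md, lo, hi = pool_fixed_effect([1.2, 1.6, 1.1], [0.8, 0.6, 0.9])
print(round(md, 2), round(lo, 2), round(hi, 2))
```

A random-effects model would additionally widen the interval by a between-study variance term when the trials are heterogeneous.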

Keywords: aquatic therapy, balance function, meta-analysis, stroke, systematic review

Procedia PDF Downloads 166
914 A Comparative Study of Sampling-Based Uncertainty Propagation with First Order Error Analysis and Percentile-Based Optimization

Authors: M. Gulam Kibria, Shourav Ahmed, Kais Zaman

Abstract:

In system analysis, uncertainty in the input variables causes uncertainty in the system responses. Different probabilistic approaches for uncertainty representation and propagation in such cases exist in the literature. Different uncertainty representation approaches result in different outputs, and some approaches might result in a better estimation of the system response than others. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge (MUQC) has posed challenges about uncertainty quantification. Subproblem A of the challenge, the uncertainty characterization subproblem, is addressed in this study. In this subproblem, the challenge is to gather knowledge about unknown model inputs, which have inherent aleatory and epistemic uncertainties, from the responses (output) of the given computational model. We use two different methodologies to approach the problem. In the first methodology, we use sampling-based uncertainty propagation with first-order error analysis. In the other approach, we place emphasis on the use of Percentile-Based Optimization (PBO). The NASA Langley MUQC's subproblem A is developed in such a way that both aleatory and epistemic uncertainties need to be managed. The challenge problem classifies each uncertain parameter as belonging to one of the following three types: (i) an aleatory uncertainty modeled as a random variable, with a fixed functional form and known coefficients; this uncertainty cannot be reduced; (ii) an epistemic uncertainty modeled as a fixed but poorly known physical quantity that lies within a given interval; this uncertainty is reducible; (iii) a parameter that might be aleatory, but for which sufficient data might not be available to adequately model it as a single random variable. For example, the parameters of a normal variable, e.g., the mean and standard deviation, might not be precisely known but could be assumed to lie within some intervals.
This results in a distributional p-box: the physical parameter carries an aleatory uncertainty, but the parameters prescribing its mathematical model are subject to epistemic uncertainties. Each of the parameters of the random variable is an unknown element of a known interval, and this uncertainty is reducible. From the study, it is observed that, due to practical limitations or computational expense, the sampling in the sampling-based methodology is not exhaustive. That is why the sampling-based methodology has a high probability of underestimating the output bounds. Therefore, an optimization-based strategy to convert uncertainty described by interval data into a probabilistic framework is necessary. This is achieved in this study by using PBO.
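Sampling-based propagation through such a distributional p-box can be sketched as a double loop: an outer loop over epistemic draws from the interval, an inner loop over aleatory draws from the resulting distribution. The toy model, interval, and sample sizes below are assumptions for illustration only:

```python
import random

def output_bounds(model, mu_interval, sigma, n_epistemic=50, n_aleatory=2000):
    """Sampling-based propagation through a distributional p-box.

    The mean of a normal input is epistemic (known only to an interval);
    the scatter around it is aleatory. For each sampled mean, the inner
    aleatory samples give an output distribution; the envelope over all
    epistemic samples bounds the 95th percentile of the response.
    """
    lo, hi = float("inf"), float("-inf")
    for _ in range(n_epistemic):
        mu = random.uniform(*mu_interval)   # epistemic draw from the interval
        ys = sorted(model(random.gauss(mu, sigma)) for _ in range(n_aleatory))
        p95 = ys[int(0.95 * n_aleatory)]    # empirical 95th percentile
        lo, hi = min(lo, p95), max(hi, p95)
    return lo, hi

random.seed(0)
# Toy model y = x^2 with the mean known only to lie in [1, 2], sd = 0.5
lo, hi = output_bounds(lambda x: x * x, (1.0, 2.0), 0.5)
print(round(lo, 2), round(hi, 2))
```

Because the outer loop can never exhaust the interval, the envelope it returns tends to be too narrow; this is the underestimation problem the abstract attributes to sampling and the motivation for the optimization-based alternative.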

Keywords: aleatory uncertainty, epistemic uncertainty, first order error analysis, uncertainty quantification, percentile-based optimization

Procedia PDF Downloads 208
913 Clinical Impact of Delirium and Antipsychotic Therapy: 10-Year Experience from a Referral Coronary Care Unit

Authors: Niyada Naksuk, Thoetchai Peeraphatdit, Vitaly Herasevich, Peter A. Brady, Suraj Kapa, Samuel J. Asirvatham

Abstract:

Introduction: Little is known about the safety of antipsychotic therapy for delirium in the coronary care unit (CCU). Our aim was to examine the effect of delirium and antipsychotic therapy among CCU patients. Methods: Pre-study Confusion Assessment Method-Intensive Care Unit (CAM-ICU) criteria were implemented in screening consecutive patients admitted to Mayo Clinic, Rochester, USA, from 2004 through 2013. Death status was prospectively ascertained. Results: Of 11,079 study patients, the incidence of delirium was 8.3% (n=925). Delirium was associated with an increased risk of in-hospital mortality (adjusted OR 1.49; 95% CI, 1.08-2.08; P=.02) and one-year mortality among patients who survived CCU admission (adjusted HR 1.46; 95% CI, 1.12-1.87; P=.005). A total of 792 doses of haloperidol (median 5 [IQR 3-10] mg/day) or quetiapine (median 25 [IQR 13-50] mg/day) were given to 244 patients with delirium. The clinical characteristics of patients with delirium who did and did not receive antipsychotic therapy were not different (baseline corrected QT [QTc] interval 460±61 ms vs. 457±58 ms, respectively; P = 0.57). In comparison to baseline, mean QTc intervals after the first and third doses of the antipsychotics were not significantly prolonged in the haloperidol (448±56, 458±57, and 450±50 ms, respectively) or quetiapine groups (459±54, 467±68, and 462±46 ms, respectively) (P > 0.05 for all). Additionally, in-hospital mortality (adjusted OR 0.67; 95% CI, 0.42-1.04; P=.07), ventricular arrhythmia (adjusted OR 0.87; 95% CI, 0.17-3.62; P=.85) and one-year mortality among the hospital survivors (adjusted HR 0.86; 95% CI 0.62-1.17; P = 0.34) were not different in patients with delirium irrespective of whether or not they received antipsychotics. Conclusions: In patients admitted to the CCU, delirium was associated with an increase in both in-hospital and one-year mortality.
Low doses of haloperidol and quetiapine appeared to be safe, without an increase in risk of sudden cardiac death, in-hospital mortality, or one-year mortality in carefully monitored patients.

Keywords: arrhythmias, haloperidol, mortality, qtc interval, quetiapine

Procedia PDF Downloads 347
912 Meta-Analysis of Particulate Matter Production in Developing and Developed Countries

Authors: Hafiz Mehtab Gull Nasir

Abstract:

Industrial development and urbanization have significant impacts on air emissions, and their relationship diverges at different stages of economic progress. The industrial revolution propelled these activities as principal paths to economic and social transformation; nevertheless, they also promoted environmental degradation. As a result, both developed and developing countries went through fast-paced development, during which developed countries implemented legislation to control environmental pollution while developing countries took advantage of technology without caring for the environment. In this study, a meta-analysis is performed on the production of particulate matter (PM10 and PM2.5) in urbanized cities of first, second, and third world countries to assess air quality. The cities were selected based on ranked-set principles. For PM10, third world countries showed the highest PM level (~95% confidence interval of 0.74-1.86), followed by second world countries, which had a more managed situation, while first world countries showed the lowest pollution (~95% confidence interval of 0.12-0.2). Similarly, the highest level of PM2.5 was produced by third world countries, followed by the second and first world countries. The PM2.5 levels of second and third world countries were not significantly different, whereas first world countries showed the minimum PM load. The study thus revealed that different levels of pollution exist among countries: developed countries have devised better strategies for pollution control, while developing countries care least about their environmental resources. It is suggested that although industrialization and urbanization directly interfere with natural elements, the degradation of nature appears to be driven more by societal factors than by the technology itself.

Keywords: meta-analysis, particulate matter, developing countries, urbanization

Procedia PDF Downloads 313
911 Identification of Workplace Hazards of Underground Coal Mines

Authors: Madiha Ijaz, Muhammad Akram, Sima Mir

Abstract:

Underground coal mining is carried out manually in Pakistan. Exposure to ergonomic hazards (musculoskeletal disorders) is very common among the coal cutters in these mines. Cutting coal in narrow spaces poses a great threat to both the upper and lower limbs of these workers. To observe the prevalence of such hazards, a thorough study was conducted on 600 workers from 30 mines (20 workers per mine) located in two districts of Punjab province, Pakistan. The Rapid Upper Limb Assessment (RULA) and Rapid Entire Body Assessment (REBA) sheets were used for the study, along with the standard Nordic Musculoskeletal Disorders Questionnaire. SPSS 25 software was used for data analysis of upper and lower limb disorders, and regression models were run for upper and lower back pain. According to the results, work stage (drilling and blasting, coal cutting, timbering and supporting, etc.), work experience, and number of repetitions performed per minute were significant (p-values 0.00, 0.004, and 0.009, respectively) for discomfort in the upper and lower limbs. Age had a p-value of 0.00 for upper limb and 0.012 for lower limb disorders. The task of coal cutting was strongly associated with pain in the upper back (odds ratio 13.21, 95% confidence interval (CI) 14.0-21.64) and lower back (odds ratio 3.7, 95% CI 1.3-4.2). Scored on the RULA and REBA sheets, every work stage was ranked at 7, the highest level of risk. The workers were young (mean age 28.7 years), with a mean BMI of 28.1 kg/m2.
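Adjusted odds ratios with 95% confidence intervals, as quoted above, follow the usual Wald construction on the log-odds scale: exponentiate the logistic-regression coefficient and its ±1.96 standard-error band. A minimal sketch in which the coefficient and its standard error are hypothetical (chosen near the lower-back figure reported):

```python
import math

def or_ci(beta, se, z=1.96):
    """Odds ratio and Wald 95% CI from a logistic-regression
    coefficient (beta) and its standard error (se)."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical inputs: beta = ln(3.7) with an assumed SE of 0.30
beta, se = math.log(3.7), 0.30
o, lo, hi = or_ci(beta, se)
print(round(o, 2), round(lo, 2), round(hi, 2))
```

One consequence of this construction is that a valid Wald interval always contains its point estimate, which makes an interval like "13.21 (CI 14.0-21.64)" look like a transcription slip.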

Keywords: workplace hazards, ergonomic disorders, limb disorders, MSDs

Procedia PDF Downloads 56
910 Temperature-Dependent Post-Mortem Changes in Human Cardiac Troponin-T (cTnT): An Approach in Determining Postmortem Interval

Authors: Sachil Kumar, Anoop Kumar Verma, Wahid Ali, Uma Shankar Singh

Abstract:

Globally, approximately 55.3 million people die each year. In India, there were 95 lakh annual deaths in 2013; the number of deaths resulting from homicides, suicides, and unintentional injuries in the same period was about 5.7 lakh. The ever-increasing crime rate necessitates the development of methods for determining time since death. An erroneous time-of-death window can lead investigators down the wrong path or possibly focus a case on an innocent suspect. In this regard, research was carried out analyzing the temperature-dependent postmortem degradation of cardiac troponin-T (cTnT) in the myocardium as a marker for time since death. Cardiac tissue samples were collected from medico-legal autopsies (n=6) in the Department of Forensic Medicine and Toxicology, King George's Medical University, Lucknow, India, after informed consent from the relatives, and post-mortem degradation was studied by incubating the cardiac tissue at room temperature (20±2 °C), 12 °C, 25 °C, and 37 °C for different time periods (~5, 26, 50, 84, 132, 157, 180, 205, and 230 hours). The cases included were subjects of road traffic accidents (RTA) without any prior history of disease who died in the hospital and whose exact time of death was known. The analysis involved extraction of the protein, separation by denaturing gel electrophoresis (SDS-PAGE), and visualization by Western blot using cTnT-specific monoclonal antibodies. The area of the bands within a lane was quantified by scanning and digitizing the image using Gel Doc. The data show a distinct temporal profile corresponding to the degradation of cTnT by proteases found in cardiac muscle. The disappearance of intact cTnT and the appearance of lower-molecular-weight bands are easily observed. Western blot data clearly showed the intact protein at 42 kDa, two major fragments (27 kDa, 10 kDa), additional minor fragments (32 kDa), and the formation of low-molecular-weight fragments as time increased.
At 12 °C, the intensity of the intact cTnT band decreased steadily compared with RT, 25 °C, and 37 °C. Overall, both PMI and temperature had a statistically significant effect, with the greatest protein breakdown observed within the first 38 h and at the highest temperature, 37 °C. The combination of high temperature (37 °C) and long postmortem interval (105.15 hrs) had the most drastic effect on the breakdown of cTnT. If the percent intact cTnT is calculated from the total area integrated within a Western blot lane, the percent intact cTnT shows a pseudo-first-order relationship when plotted against the log of time postmortem. These plots show a good coefficient of correlation, r = 0.95 (p=0.003), for the regression of the human heart under different temperature conditions. The data presented demonstrate that this technique can provide an extended time range during which the postmortem interval can be more accurately estimated.
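The pseudo-first-order relationship described above amounts to fitting a straight line to percent intact cTnT against the logarithm of time postmortem. A minimal least-squares sketch with hypothetical percent-intact values at the study's sampling times (not the study's measurements):

```python
import math

def fit_line(xs, ys):
    """Ordinary least squares: returns slope, intercept, Pearson r."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    slope = sxy / sxx
    return slope, my - slope * mx, sxy / math.sqrt(sxx * syy)

# Hypothetical percent-intact values at the sampling times (hours)
times = [5, 26, 50, 84, 132, 157, 180, 205, 230]
pct = [95, 70, 58, 47, 38, 34, 31, 28, 26]
log_t = [math.log(t) for t in times]

slope, intercept, r = fit_line(log_t, pct)
print(round(r, 3))  # a strong negative correlation is expected
```

Once such a calibration line is fixed, an unknown PMI is read off by inverting it: measure percent intact cTnT, then solve the line for log time.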

Keywords: degradation, postmortem interval, proteolysis, temperature, troponin

Procedia PDF Downloads 357
909 Deficits in Perceptual and Musical Memory in Individuals with Major Depressive Disorder

Authors: Toledo-Fernandez Aldebaran

Abstract:

Introduction: One of the least explored cognitive functions in relation to depression is the processing of musical stimuli; music perception and memory can become impaired as well. The term amusia is used to define a type of agnosia caused by damage to basic processes that creates a general inability to perceive music. Therefore, the main objective is to explore performance-based and self-reported deficits in music perception and memory in people with major depressive disorder (MDD). Method: Data were collected from April to October 2021, recruiting people who met the eligibility criteria and using the Montreal Battery of Evaluation of Amusia (MBEA) to evaluate performance-based music perception and memory, along with the module for depression of the Mini International Neuropsychiatric Interview and the Amusic Dysfunction Inventory (ADI), which evaluates the participants' self-report concerning their abilities in music perception. Results: 64 participants were evaluated. The main analysis, comparing people with MDD and the control group, showed only one statistically significant difference, on the Interval subtest of the MBEA. No difference was found in the dimensions assessed by the ADI. Conclusion: Deficits in interval perception can be explained by mental fatigue, to which people with depression are more vulnerable, rather than by specific deficits in musical perception and memory associated with depressive disorder. Additionally, significant associations were found between musical deficits as observed by performance-based evidence and music dysfunction according to self-report, which could suggest that some people with depression are capable of detecting these deficits in themselves.

Keywords: depression, amusia, music, perception, memory

Procedia PDF Downloads 30
908 The Effects of Blanching, Boiling and Steaming on Ascorbic Acid Content, Total Phenolic Content, and Colour in Cauliflowers (Brassica oleracea var. Botrytis)

Authors: Huei Lin Lee, Wee Sim Choo

Abstract:

The effects of blanching, boiling and steaming on the ascorbic acid content, total phenolic content and colour in cauliflower (Brassica oleracea var. botrytis) were investigated. It was found that blanching was the best thermal process to apply to cauliflower compared with boiling and steaming. Blanching and steaming retained more of the ascorbic acid content (AAC) of cauliflower than boiling. As for the total phenolic content (TPC), blanching retained a higher TPC in cauliflower than boiling and steaming; there were no significant differences between the TPC of boiled and steamed cauliflowers. As for the colour measurement, there were no significant differences in the colour of the cauliflower at different lead times (from processing to the point of consumption) at 30-minute intervals up to 3 hours, but there were slight variations in L*, a*, and b* values among the thermally processed cauliflowers (blanched, boiled and steamed). The cauliflowers in this study were found to give a desirable white colour (L* value in the range of 77-83) in all three thermal processes (blanching, boiling and steaming). There was no significant effect of lead time (30-minute intervals up to 3 hours) on raw cauliflowers or on any of the three thermally processed (blanched, boiled and steamed) cauliflowers.

Keywords: ascorbic acid, cauliflower, colour, phenolics

Procedia PDF Downloads 294
907 Effects of Poultry Manure Rates on Some Growth and Yield Attributes of Cucumber in Owerri, South Eastern Nigeria

Authors: Chinwe Pearl Poly-Mbah, Evelyn Obioma, Juliet Amajuoyi

Abstract:

The investigation reported here examined the growth and yield responses of cucumber to poultry manure rates in Owerri, Southeastern Nigeria. Fruit vegetables are widely cultivated and produced in Northern Nigeria but greatly consumed in Southern Nigeria, where cucumbers command high demand and prices but are minimally cultivated. Unfortunately, farmers in Northern Nigeria incur many losses because cucumber is a perishable vegetable and is transported all the way from Northern Nigeria, where it is produced, to Southern Nigeria, where it is consumed; hence the high cost of cucumber fruits in Southern Nigeria. There is a need, therefore, to evolve packages that will enhance cucumber production in Southern Nigeria. The main objective of this study was to examine the effects of poultry manure rates on the growth and yield of cucumber in Owerri, South Eastern Nigeria. Specifically, this study was designed to assess the effect of poultry manure rates on the number of days to 50% seedling emergence, vine length per plant, leaf area per plant and the number of leaves produced per plant. The design used for the experiment was a Randomized Complete Block Design (RCBD) with three blocks (replications). Treatments consisted of well-decomposed poultry manure at rates of 0, 2, 4 and 6 tons/ha. Data were collected on the number of days to 50% seedling emergence, vine length per plant at two-week intervals, leaf number per plant at two-week intervals, leaf area per plant at two-week intervals, number of fruits produced per plant, and fresh weight of fruits per plant at harvest.
Results from the analysis of variance (ANOVA) showed highly significant effects (P=0.05) of poultry manure on the growth and yield parameters studied, which included number of days to 50% seedling emergence, vine length per plant, leaf number per plant, leaf area per plant, and fruit number and fruit weight per plant, such that increases in poultry manure rate led to increases in the growth and yield parameters studied. Therefore, the null hypothesis (Ho) was rejected, while the alternative hypothesis was accepted. Farmers should be made aware that growing cucumber with poultry manure in the southeastern Nigeria agro-ecology is a successful enterprise.

Keywords: cucumber, effects, growth and yield, manure

Procedia PDF Downloads 198
906 Risk Factors of Hospital Acquired Infection Mortality in a Tunisian Intensive Care Unit

Authors: Ben Cheikh Asma, Bouafia Nabiha, Ammar Asma, Ezzi Olfa, Meddeb Khaoula, Chouchène Imed, Boussarsar Hamadi, Njah Mansour

Abstract:

Background: Hospital-acquired infection (HAI) constitutes an important worldwide health problem and is associated with high mortality rates in intensive care units (ICU). This study aimed to determine the HAI mortality rate in a Tunisian intensive care unit and identify its risk factors. Methods: We conducted a prospective observational cohort study over a 12-month period (September 15th, 2015 to September 15th, 2016) in the adult medical ICU of University Hospital Farhat Hached (Sousse, Tunisia). All patients admitted to the ICU for more than 48 hours were included in the study. We used an anonymous standardized survey record form to collect data, completed by a medical hygienist assisted by an intensivist. We adopted the definitions of the Centers for Disease Control and Prevention (Atlanta) to detect HAI, and used Kaplan-Meier survival analysis and Cox proportional hazards regression to identify independent risk factors of HAI mortality. Results: Of 171 patients, 67 developed ICU-acquired infection (global incidence rate = 39.2%). The mean age of patients was 59 ± 21.2 years, and 60.8% were male. The most frequently identified infections were pulmonary acquired infections (ventilator-associated pneumonia (VAP) and infected atelectasis, with density rates of 21.4 VAP/1000 days of mechanical ventilation and 9.4 infected atelectasis/1000 days of mechanical ventilation, respectively) and central venous catheter-associated infection (CVC-AI), with a density rate of 28.4 CVC-AI/1000 CVC-days. The HAI mortality rate was 66.7% (n=44). The median survival was 20 ± 3.36 days (95% confidence interval [13.39-26.60]). Specific mortality rates according to infection site were 65.5%, 36.4% and 4.5% for VAP, CVC-associated infection and infected atelectasis, respectively. In univariate analysis, significant associations between mortality and cardiovascular history (p=0.04), tracheotomy (p=0.00), peripheral venous catheterization (p=0.04), VAP (p=0.04) and infected atelectasis (p=0.04) were detected.
Independent risk factors for HAI mortality were VAP (hazard ratio = 3.14, 95% confidence interval [1.63-6.05], p=0.001) and tracheotomy (hazard ratio = 0.22, 95% confidence interval [0.10-0.44], p<0.001). Conclusions: In the present study, the hospital-acquired infection mortality rate was relatively high. We need to intensify the fight against these infections, especially ventilator-associated pneumonia, which has been associated with a higher risk of mortality in many studies. Thus, more effective infection control interventions are necessary in our hospital.

Keywords: hospital acquired infection, intensive care unit, mortality, risk factors

Procedia PDF Downloads 454
905 Application of Forensic Entomology to Estimate the Post Mortem Interval

Authors: Meriem Taleb, Ghania Tail, Fatma Zohra Kara, Brahim Djedouani, T. Moussa

Abstract:

Forensic entomology has grown immensely as a discipline in the past thirty years. The main purpose of forensic entomology is to establish the post mortem interval, or PMI. Three days after death, insect evidence is often the most accurate, and sometimes the only, method of determining elapsed time since death. This work presents the estimation of the PMI in an experiment to test the reliability of the accumulated degree days (ADD) method and the application of this method in a real case. The study was conducted at the Laboratory of Entomology at the National Institute for Criminalistics and Criminology of the National Gendarmerie, Algeria. The domestic rabbit Oryctolagus cuniculus L. was selected as the animal model. On 8th July 2012, the animal was killed. Larvae were collected and raised to adulthood. The oviposition time was estimated by summing up average daily temperatures minus the minimum development temperature (specific to each species); when the required sum is reached, it corresponds to the oviposition day. Weather data were obtained from the nearest meteorological station. After rearing was accomplished, three species emerged: Lucilia sericata, Chrysomya albiceps, and Sarcophaga africa. For Chrysomya albiceps, an accumulation of 186°C is necessary. The emergence of adults occurred on 22nd July 2012, and a value of 193.4°C was reached on 9th August 2012. Lucilia sericata requires an accumulation of 207°C. The emergence of adults occurred on 23rd July 2012, and a value of 211.35°C was reached on 9th August 2012. We should also consider that oviposition may occur more than 12 hours after death. Thus, the obtained PMI is in agreement with the actual time of death. We illustrate the use of this method in the investigation of a case of a decaying human body found on 3rd March 2015 in Bechar, in the southwest of the Algerian desert. Maggots were collected and sent to the Laboratory of Entomology.
Lucilia sericata adults were identified on 24th March 2015 after emergence. A sum of 211.6°C was reached on 1st March 2015, which corresponds to the estimated day of oviposition. Therefore, the estimated date of death is 1st March 2015 ± 24 hours. The PMI estimated by the accumulated degree days (ADD) method appears to be very precise. Entomological evidence should always be used in homicide investigations when the time of death cannot be determined by other methods.
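The ADD bookkeeping used in both the experiment and the case can be sketched as follows. The daily temperatures and the 10 °C base temperature below are hypothetical; the 186 °C·day threshold is the accumulation the abstract cites for Chrysomya albiceps:

```python
def oviposition_day(daily_mean_temps, required_add, base_temp):
    """Accumulate degree days (daily mean minus base temperature,
    floored at zero) walking backwards from the day of collection or
    emergence until the species' required ADD is reached.

    daily_mean_temps: ordered most recent day first
    (index 0 = day of adult emergence/collection).
    Returns (days back at which the threshold is met, total ADD).
    """
    accumulated = 0.0
    for days_back, temp in enumerate(daily_mean_temps):
        accumulated += max(temp - base_temp, 0.0)
        if accumulated >= required_add:
            return days_back, accumulated
    raise ValueError("ADD threshold not reached in the data provided")

# Hypothetical summer temperatures (°C, most recent day first) and an
# assumed base temperature of 10 °C; 186 °C·day is the requirement
# the abstract cites for Chrysomya albiceps.
temps = [28, 29, 27, 30, 31, 29, 28, 27, 26, 28, 29, 30]
days, total = oviposition_day(temps, required_add=186.0, base_temp=10.0)
print(days, round(total, 1))
```

In practice the base temperature and required ADD come from published development tables for the identified species, and the daily means come from the nearest meteorological station, as in the abstract.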

Keywords: forensic entomology, accumulated degree days, postmortem interval, diptera, Algeria

Procedia PDF Downloads 253