Search results for: standard error
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6627


5307 A Comprehensive Study on Freshwater Aquatic Life Health Quality Assessment Using Physicochemical Parameters and Planktons as Bio Indicator in a Selected Region of Mahaweli River in Kandy District, Sri Lanka

Authors: S. M. D. Y. S. A. Wijayarathna, A. C. A. Jayasundera

Abstract:

Mahaweli River is the longest and largest river in Sri Lanka and the major drinking water source for a large portion of the 2.5 million inhabitants of the Central Province. The aim of this study was to determine the water quality and aquatic life health quality in a selected region of the Mahaweli River. Six sampling locations (Site 1: 7° 16' 50" N, 80° 40' 00" E; Site 2: 7° 16' 34" N, 80° 40' 27" E; Site 3: 7° 16' 15" N, 80° 41' 28" E; Site 4: 7° 14' 06" N, 80° 44' 36" E; Site 5: 7° 14' 18" N, 80° 44' 39" E; Site 6: 7° 13' 32" N, 80° 46' 11" E) with various anthropogenic activities on the river bank were selected for a period of three months from Tennekumbura Bridge to Victoria Reservoir. Temperature, pH, Electrical Conductivity (EC), Total Dissolved Solids (TDS), Dissolved Oxygen (DO), 5-day Biological Oxygen Demand (BOD5), Total Suspended Solids (TSS), hardness, anion concentrations, and metal concentrations were measured as physicochemical parameters according to standard methods. Plankton were considered as biological parameters. Using a plankton net (20 µm mesh size), surface water samples were collected into acid-washed dried vials and stored in an ice box during transportation. The diversity and abundance of plankton were identified within 4 days of sample collection under the light microscope, using standard manuals of plankton identification. Almost all the measured physicochemical parameters were within the CEA standard limits for aquatic life, the Sri Lanka Standards (SLS), or the World Health Organization's guidelines for drinking water. The concentration of orthophosphate ranged from 0.232 to 0.708 mg L⁻¹ and exceeded the CEA standard limit for aquatic life (0.400 mg L⁻¹) at Site 1 and Site 2, where there is high disturbance from cultivation and nearby households.
According to the Pearson correlation analysis (significant at p < 0.05), some physicochemical parameters (temperature, DO, TDS, TSS, phosphate, sulphate, chloride, fluoride, and sodium) were significantly correlated with the distribution of plankton species such as Aulocoseira, Navicula, Synedra, Pediastrum, Fragilaria, Selenastrum, Oscillatoria, Tribonema, and Microcystis. Furthermore, species indicative of blooms (Aulocoseira), organic pollution (Navicula), and phosphate-rich eutrophic water (Microcystis) were found, indicating deteriorated water quality in the Mahaweli River due to agricultural activities, solid waste disposal, and the release of domestic effluents. Therefore, improved environmental monitoring and management are necessary to control further deterioration of the river's water quality.
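The Pearson screening step described above can be sketched in a few lines. The site values below are hypothetical, purely for illustration; they are not the study's measurements.

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical orthophosphate (mg/L) vs. relative Microcystis abundance
# at six sites (illustrative numbers only)
phosphate = [0.71, 0.62, 0.45, 0.30, 0.28, 0.23]
microcystis = [120, 100, 60, 35, 30, 20]
r = pearson_r(phosphate, microcystis)
```

A strongly positive r here would flag phosphate as a candidate driver of the bloom-forming species, which is the kind of association the study reports.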

Keywords: bio indicator, environmental variables, planktons, physicochemical parameters, water quality

Procedia PDF Downloads 103
5306 Modelling of Groundwater Resources for Al-Najaf City, Iraq

Authors: Hayder H. Kareem, Shunqi Pan

Abstract:

Groundwater is a vital water resource in many areas of the world, particularly in the Middle East, where water resources are scarce and depleting. Sustainable management and planning of groundwater resources have become essential and urgent given the impact of global climate change. In recent years, numerical models have been widely used to predict flow patterns and assess water resource security, as well as groundwater quality as affected by transported contaminants. In this study, MODFLOW is used to study the current status of groundwater resources and the risk to water resource security in the region centred on Al-Najaf City, which is located in the mid-west of Iraq, adjacent to the Euphrates River. A conceptual model is built using the geologic and hydrogeologic data collected for the region, together with the Digital Elevation Model (DEM) data obtained from the "Global Land Cover Facility" (GLCF) and the "United States Geological Survey" (USGS) for the study area. The computer model also incorporates the distribution of 69 wells in the area, with steady pre-defined hydraulic heads along its boundaries. The model is then applied with a recharge rate (from precipitation) of 7.55 mm/year, obtained from the analysis of field data in the study area for the period 1980-2014. The hydraulic conductivity from measurements at the well locations is interpolated for model use. The model is calibrated against the measured hydraulic heads at 50 of the 69 wells in the domain, and the results show good agreement: the standard error of estimate (SEE), root-mean-square error (RMSE), normalized RMSE, and correlation coefficient are 0.297 m, 2.087 m, 6.899%, and 0.971, respectively. A sensitivity analysis is also carried out, and it is found that the model is sensitive to recharge, particularly when the rate is greater than 15 mm/year.
Hydraulic conductivity is found to be another parameter which can affect the results significantly; it therefore requires high-quality field data. The results show a general flow pattern from west to east across the study area, which agrees well with the observations and the gradient of the ground surface. It is found that, with the current operational pumping rates of the wells in the area, a dry area develops in Al-Najaf City due to the large quantity of groundwater withdrawn. The computed water balance with the current operational pumping shows that the Euphrates River supplies approximately 11759 m³/day to the groundwater, instead of gaining 11178 m³/day from the groundwater as it would if there were no pumping from the wells. It is expected that the results obtained from the study can provide important information for the sustainable and effective planning and management of the regional groundwater resources for Al-Najaf City.
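The calibration statistics quoted above (RMSE, normalized RMSE, correlation coefficient) can be reproduced from observed and simulated heads as follows. The well heads below are hypothetical placeholders, not the study's data.

```python
import math

def fit_metrics(observed, simulated):
    """RMSE, RMSE normalised by the observed range (%), and Pearson r -
    the calibration statistics reported for the groundwater model."""
    n = len(observed)
    resid = [o - s for o, s in zip(observed, simulated)]
    rmse = math.sqrt(sum(e * e for e in resid) / n)
    nrmse = 100.0 * rmse / (max(observed) - min(observed))
    mo, ms = sum(observed) / n, sum(simulated) / n
    num = sum((o - mo) * (s - ms) for o, s in zip(observed, simulated))
    den = math.sqrt(sum((o - mo) ** 2 for o in observed) *
                    sum((s - ms) ** 2 for s in simulated))
    return rmse, nrmse, num / den

# Hypothetical observed vs. simulated hydraulic heads (m) at five wells
obs = [22.0, 24.5, 27.0, 30.0, 33.5]
sim = [21.5, 25.0, 26.5, 30.5, 33.0]
rmse, nrmse, r = fit_metrics(obs, sim)
```

A low normalized RMSE together with r close to 1, as in the abstract, indicates a well-calibrated head field.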

Keywords: Al-Najaf city, conceptual modelling, groundwater, unconfined aquifer, visual MODFLOW

Procedia PDF Downloads 207
5305 A Dynamic Cardiac Single Photon Emission Computer Tomography Using Conventional Gamma Camera to Estimate Coronary Flow Reserve

Authors: Maria Sciammarella, Uttam M. Shrestha, Youngho Seo, Grant T. Gullberg, Elias H. Botvinick

Abstract:

Background: Myocardial perfusion imaging (MPI) is typically performed with static imaging protocols and visually assessed for perfusion defects based on the relative intensity distribution. Dynamic cardiac SPECT, on the other hand, is a new imaging technique based on time-varying information of radiotracer distribution, which permits quantification of myocardial blood flow (MBF). In this abstract, we report the progress and current status of dynamic cardiac SPECT using a conventional gamma camera (Infinia Hawkeye 4, GE Healthcare) for estimation of myocardial blood flow and coronary flow reserve. Methods: A group of patients at high risk of coronary artery disease was enrolled to evaluate our methodology. A low-dose/high-dose rest/pharmacologic-induced-stress protocol was implemented. A standard rest and a standard stress radionuclide dose of ⁹⁹ᵐTc-tetrofosmin (140 keV) were administered. The dynamic SPECT data for each patient were reconstructed using the standard 4-dimensional maximum likelihood expectation maximization (ML-EM) algorithm. The acquired data were used to estimate the myocardial blood flow (MBF). The correspondence between flow values in the main coronary vasculature and the myocardial segments defined by the standardized myocardial segmentation and nomenclature was derived. The coronary flow reserve, CFR, was defined as the ratio of stress to rest MBF values. CFR values estimated with SPECT were also validated with dynamic PET. Results: The territorial MBF in the LAD, RCA, and LCX ranged from 0.44 ml/min/g to 3.81 ml/min/g. The MBF estimates from PET and SPECT in an independent cohort of 7 patients showed a statistically significant correlation, r = 0.71 (p < 0.001). The corresponding CFR correlation was moderate (r = 0.39) yet statistically significant (p = 0.037). The mean stress MBF was significantly lower for angiographically abnormal than for normal territories (normal mean MBF = 2.49 ± 0.61, abnormal mean MBF = 1.43 ± 0.62, p < 0.001). Conclusions: The visually assessed image findings in clinical SPECT are subjective and may not reflect direct physiologic measures of a coronary lesion. The MBF and CFR measured with dynamic SPECT are fully objective and available only with the data generated by the dynamic SPECT method. A quantitative approach such as measuring CFR using dynamic SPECT imaging is a better mode of diagnosing CAD than visual assessment of stress and rest images from static SPECT.
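The CFR definition used in the abstract (the ratio of stress to rest MBF) can be sketched directly. The per-territory values below are hypothetical, chosen only to illustrate the calculation.

```python
def coronary_flow_reserve(stress_mbf, rest_mbf):
    """CFR as defined in the abstract: stress MBF divided by rest MBF."""
    return stress_mbf / rest_mbf

# Hypothetical per-territory MBF values in ml/min/g (illustrative only)
rest   = {"LAD": 0.80, "LCX": 0.95, "RCA": 0.70}
stress = {"LAD": 2.40, "LCX": 1.90, "RCA": 1.05}
cfr = {t: coronary_flow_reserve(stress[t], rest[t]) for t in rest}
```

A territory whose CFR stays near 1 despite pharmacologic stress (here the hypothetical RCA) is the kind of finding that quantitative CFR can detect but relative static imaging can miss.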

Keywords: dynamic SPECT, clinical SPECT/CT, selective coronary angiography, ⁹⁹ᵐTc-Tetrofosmin

Procedia PDF Downloads 146
5304 Impact of Hybrid Optical Amplifiers on 16 Channel Wavelength Division Multiplexed System

Authors: Inderpreet Kaur, Ravinder Pal Singh, Kamal Kant Sharma

Abstract:

This paper addresses the different configurations of optical amplifiers used with 16 channels in a Wavelength Division Multiplexed system. Systems with 16 channels have been simulated to evaluate various parameters, Bit Error Rate (BER) and Quality Factor (Q factor), at threshold values over a wavelength range from 1471 nm to 1611 nm. Various combinations of configurations with EDFA and FRA have been analyzed, and the EDFA-FRA configuration was found to perform satisfactorily in terms of performance indices and stable region. The paper also compares the various parameters quantified with each configuration individually. The Q factor was found to be high, with a low BER and high resolution, for the EDFA-FRA configuration.
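The BER and Q factor the paper evaluates are linked, for a binary optical link with Gaussian noise, by the standard relation BER = ½ erfc(Q/√2); a short sketch:

```python
import math

def ber_from_q(q):
    """Standard Gaussian-noise relation between Q factor and bit error
    rate for a binary optical receiver: BER = 0.5 * erfc(Q / sqrt(2))."""
    return 0.5 * math.erfc(q / math.sqrt(2))

# Q = 6 corresponds to the conventional BER ~ 1e-9 threshold
ber6 = ber_from_q(6.0)
```

This is why a configuration with a higher Q factor (such as the EDFA-FRA combination found here) necessarily exhibits a lower BER.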

Keywords: EDFA, FRA, WDM, Q factor, BER

Procedia PDF Downloads 348
5303 Tracing Sources of Sediment in an Arid River, Southern Iran

Authors: Hesam Gholami

Abstract:

Elevated suspended sediment loads in riverine systems resulting from accelerated erosion due to human activities are a serious threat to the sustainable management of watersheds and ecosystem services therein worldwide. Therefore, mitigation of deleterious sediment effects as a distributed or non-point pollution source in the catchments requires reliable provenance information. Sediment tracing or sediment fingerprinting, as a combined process consisting of sampling, laboratory measurements, different statistical tests, and the application of mixing or unmixing models, is a useful technique for discriminating the sources of sediments. From 1996 to the present, different aspects of this technique, such as grouping the sources (spatial and individual sources), discriminating the potential sources by different statistical techniques, and modification of mixing and unmixing models, have been introduced and modified by many researchers worldwide, and have been applied to identify the provenance of fine materials in agricultural, rural, mountainous, and coastal catchments, and in large catchments with numerous lakes and reservoirs. In the last two decades, efforts exploring the uncertainties associated with sediment fingerprinting results have attracted increasing attention. The frameworks used to quantify the uncertainty associated with fingerprinting estimates can be divided into three groups comprising Monte Carlo simulation, Bayesian approaches and generalized likelihood uncertainty estimation (GLUE). Given the above background, the primary goal of this study was to apply geochemical fingerprinting within the GLUE framework in the estimation of sub-basin spatial sediment source contributions in the arid Mehran River catchment in southern Iran, which drains into the Persian Gulf. 
The accuracy of GLUE predictions generated using four different sets of statistical tests for discriminating three sub-basin spatial sources was evaluated using 10 virtual sediment (VS) samples with known source contributions, based on the root mean square error (RMSE) and mean absolute error (MAE). Based on the results, the contributions modeled by GLUE for the western, central, and eastern sub-basins are 1-42% (overall mean 20%), 0.5-30% (overall mean 12%), and 55-84% (overall mean 68%), respectively. According to the mean absolute fit (MAF; ≥ 95% for all target sediment samples) and goodness-of-fit (GOF; ≥ 99% for all samples), our suggested modeling approach is an accurate technique for quantifying sediment sources in catchments. Overall, the estimated source proportions can help watershed engineers plan targeted conservation programs for soil and water resources.
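The GLUE idea, randomly sampling candidate source proportions, scoring each set against the target sediment's tracer signature, and summarising the better-scoring ("behavioural") sets, can be sketched minimally. The tracer values and the squared-error likelihood surrogate below are hypothetical, not the study's geochemistry or its exact likelihood measure.

```python
import random

def glue_contributions(target, sources, n_draws=20000, keep_frac=0.05, seed=1):
    """Minimal GLUE-style sketch: sample random source proportions,
    score each set by its mixing error against the target tracers,
    and average the best-scoring ('behavioural') proportion sets."""
    random.seed(seed)
    scored = []
    for _ in range(n_draws):
        w = [random.random() for _ in sources]
        s = sum(w)
        w = [x / s for x in w]  # proportions sum to 1
        err = sum((target[i] - sum(w[k] * sources[k][i]
                                   for k in range(len(sources)))) ** 2
                  for i in range(len(target)))
        scored.append((err, w))
    scored.sort(key=lambda t: t[0])
    keep = [w for _, w in scored[: int(n_draws * keep_frac)]]
    n = len(keep)
    return [sum(w[k] for w in keep) / n for k in range(len(sources))]

# Two hypothetical geochemical tracers for three sub-basin sources
sources = [[10.0, 2.0], [20.0, 6.0], [40.0, 3.0]]
target = [30.0, 3.6]  # a mixture dominated by the third source
means = glue_contributions(target, sources)
```

The spread of the behavioural sets (not shown) is what gives GLUE its uncertainty ranges, analogous to the 1-42% / 0.5-30% / 55-84% intervals reported above.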

Keywords: sediment source tracing, generalized likelihood uncertainty estimation, virtual sediment mixtures, Iran

Procedia PDF Downloads 73
5302 Cross Professional Team-Assisted Teaching Effectiveness

Authors: Shan-Yu Hsu, Hsin-Shu Huang

Abstract:

The main purpose of this teaching research is to design an interdisciplinary team-assisted teaching method for trainees and interns and to review its effectiveness on trainees' understanding of peritoneal dialysis. The subjects were fifth- and sixth-year trainees at a medical center's medical school. The teaching methods included media teaching, demonstration of technical operations, face-to-face communication with patients, special case discussions, and field visits to the peritoneal dialysis room. Learning effectiveness was evaluated before and after the intervention, supplemented by verbal assessment. Statistical analysis was performed using the SPSS paired-sample t-test to determine whether there was a difference in peritoneal dialysis professional cognition before and after the teaching intervention. Descriptive statistics show that the average pre-test score was 74.44 (standard deviation 9.34) and the average post-test score was 95.56 (standard deviation 5.06). The paired-sample t-test yielded p = 0.006, showing a significant difference in the peritoneal dialysis professional cognition test before and after the intervention. The interdisciplinary team-assisted teaching method helps trainees and interns improve their professional awareness of peritoneal dialysis. At the same time, trainee physicians gave positive feedback on the inter-professional team-assisted teaching method. This teaching research finds that clinical ability development education for trainees and interns can employ cross-professional team-assisted teaching methods to support clinical teaching guidance.
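The paired-sample t-test used above can be sketched from first principles: compute each trainee's score difference, then divide the mean difference by its standard error. The six pre/post scores below are hypothetical, not the study's data.

```python
import math

def paired_t(before, after):
    """Paired-sample t statistic and degrees of freedom, the test the
    abstract runs in SPSS, computed on hypothetical pre/post scores."""
    d = [b - a for a, b in zip(before, after)]  # per-subject differences
    n = len(d)
    mean_d = sum(d) / n
    sd = math.sqrt(sum((x - mean_d) ** 2 for x in d) / (n - 1))
    return mean_d / (sd / math.sqrt(n)), n - 1

# Hypothetical pre/post test scores for six trainees
pre  = [70, 65, 80, 75, 72, 78]
post = [92, 90, 98, 95, 96, 97]
t, df = paired_t(pre, post)
```

A large t with n-1 degrees of freedom then maps to the small p-value (p = 0.006 in the abstract) via the t distribution.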

Keywords: monitor quality, patient safety, health promotion objective, cross-professional team-assisted teaching methods

Procedia PDF Downloads 139
5301 Statistical Comparison of Ensemble Based Storm Surge Forecasting Models

Authors: Amin Salighehdar, Ziwen Ye, Mingzhe Liu, Ionut Florescu, Alan F. Blumberg

Abstract:

Storm surge is an abnormal water level caused by a storm. Accurate prediction of a storm surge is a challenging problem. Researchers have developed various ensemble modeling techniques to combine several individual forecasts to produce an overall, presumably better, forecast. Some simple ensemble modeling techniques exist in the literature; for instance, Model Output Statistics (MOS) and running mean-bias removal are widely used in the storm surge prediction domain. However, these methods have drawbacks: MOS, for instance, is based on multiple linear regression and needs a long period of training data. To overcome the shortcomings of these simple methods, researchers have proposed more advanced ones. For instance, ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting that creates a better forecast of sea level using a combination of several instances of Bayesian Model Averaging (BMA). An ensemble dressing method is based on identifying the best member forecast and using it for prediction. Our contribution in this paper can be summarized as follows. First, we investigate whether the ensemble models perform better than any single forecast; for this we need to identify the single best forecast, and we present a methodology based on a simple Bayesian selection method to select it. Second, we present several new and simple ways to construct ensemble models, using correlation and standard deviation as weights in combining different forecast models. Third, we use these ensembles and compare them with several existing models in the literature to forecast storm surge level. We then investigate whether developing a complex ensemble model is indeed needed. To achieve this goal, we use a simple average (one of the simplest and most widely used ensemble models) as the benchmark.
Predicting the peak surge level during a storm, as well as the precise time at which this peak occurs, is crucial; thus, we develop a statistical platform to compare the performance of various ensemble methods. This statistical analysis is based on the root mean square error of the ensemble forecast during the testing period and on the magnitude and timing of the forecasted peak surge compared with the actual peak and its timing. In this work, we analyze four hurricanes: hurricanes Irene and Lee in 2011, hurricane Sandy in 2012, and hurricane Joaquin in 2015. Since hurricane Irene developed at the end of August 2011 and hurricane Lee started just after Irene at the beginning of September 2011, in this study we consider them as a single contiguous hurricane event. The data set used for this study is generated by the New York Harbor Observing and Prediction System (NYHOPS). We find that even the simplest possible way of creating an ensemble produces results superior to any single forecast. We also show that the ensemble models we propose generally perform better than the simple average ensemble technique.
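The weighted-combination idea above (using, e.g., each member's historical correlation with observations as its weight) can be sketched as follows. The member forecasts and weights are hypothetical, not NYHOPS output.

```python
def weighted_ensemble(forecasts, weights):
    """Combine member forecasts with normalised weights; the paper uses
    correlation (or standard deviation) as the weighting quantity."""
    s = sum(weights)
    w = [x / s for x in weights]
    return [sum(w[k] * f[k] for k in range(len(f))) for f in zip(*forecasts)]

# Three hypothetical surge forecasts (m) at four times, weighted by each
# member's assumed historical correlation with observed surge
members = [
    [0.9, 1.4, 2.1, 1.6],
    [1.1, 1.6, 2.5, 1.8],
    [0.7, 1.2, 1.9, 1.4],
]
corr_weights = [0.9, 0.8, 0.6]
ens = weighted_ensemble(members, corr_weights)
```

Setting all weights equal recovers the simple-average benchmark the paper compares against.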

Keywords: Bayesian learning, ensemble model, statistical analysis, storm surge prediction

Procedia PDF Downloads 306
5300 Interoperability Standard for Data Exchange in Educational Documents in Professional and Technological Education: A Comparative Study and Feasibility Analysis for the Brazilian Context

Authors: Giovana Nunes Inocêncio

Abstract:

Professional and technological education (EPT) plays a pivotal role in equipping students for specialized careers, and it is imperative to establish a framework for efficient data exchange among educational institutions. The primary focus of this article is to address the pressing need for document interoperability within the context of EPT. The challenges, motivations, and benefits of implementing interoperability standards for digital educational documents are thoroughly explored. These documents include EPT completion certificates, academic records, and curricula. It is evident that the intersection of IT governance and interoperability standards holds the key to transforming the landscape of technical education in Brazil. IT governance provides the strategic framework for effective data management, aligning with educational objectives, ensuring compliance, and managing risks. By adopting interoperability standards, the technical education sector in Brazil can facilitate data exchange, enhance data security, and promote international recognition of qualifications. The use of the XML (Extensible Markup Language) standard further strengthens the foundation for structured data exchange, fostering efficient communication, standardization of curricula, and enhancement of educational materials. IT governance, interoperability standards, and data management play a critical role in driving the quality, efficiency, and security of technical education. The adoption of these standards fosters transparency, stakeholder coordination, and regulatory compliance, ultimately empowering the technical education sector to meet the dynamic demands of the 21st century.
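As a sketch of the XML-based exchange the article advocates, an EPT completion certificate could be serialized as a structured document. The element names below are illustrative assumptions, not a published Brazilian schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML layout for an EPT completion certificate;
# tag names and values are illustrative only.
cert = ET.Element("certificate", {"type": "EPT-completion"})
ET.SubElement(cert, "student").text = "Maria Silva"
ET.SubElement(cert, "institution").text = "Instituto Federal"
ET.SubElement(cert, "course").text = "Tecnico em Informatica"
ET.SubElement(cert, "completionYear").text = "2023"
xml_doc = ET.tostring(cert, encoding="unicode")

# A receiving institution can parse the same document back
parsed = ET.fromstring(xml_doc)
```

An agreed schema of this kind is what lets two institutions exchange certificates without bilateral format negotiations.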

Keywords: interoperability, education, standards, governance

Procedia PDF Downloads 67
5299 Development of Structural Deterioration Models for Flexible Pavement Using Traffic Speed Deflectometer Data

Authors: Sittampalam Manoharan, Gary Chai, Sanaul Chowdhury, Andrew Golding

Abstract:

The primary objective of this paper is to present a simplified approach to developing a structural deterioration model for flexible pavements using traffic speed deflectometer data. Maintaining assets to meet functional performance alone is not economical or sustainable in the long term; it would end up requiring much larger investments from road agencies and extra costs for road users. Performance models have to include both structural and functional predictive capabilities in order to assess needs and the time frame of those needs. As such, structural modelling plays a vital role in the prediction of pavement performance. Structural condition is important for predicting the remaining life and overall health of a road network and also has a major influence on the valuation of road pavement. Therefore, the structural deterioration model is a critical input into a pavement management system for accurately predicting pavement rehabilitation needs. The Traffic Speed Deflectometer (TSD) is a vehicle-mounted Doppler laser system that is capable of continuously measuring the structural bearing capacity of a pavement while moving at traffic speeds. The device's high accuracy, high speed, and continuous deflection profiles are useful for network-level applications such as predicting road rehabilitation needs and remaining structural service life. The methodology adopted here utilizes time-series TSD maximum deflection (D0) data in conjunction with rutting, rutting progression, pavement age, subgrade strength, and equivalent standard axle (ESA) data. Regression analyses were then undertaken to establish a correlation equation for structural deterioration as a function of rutting, pavement age, seal age, and equivalent standard axles (ESA). This study developed a simple structural deterioration model which will enable available TSD structural data to be incorporated into a pavement management system for developing network-level pavement investment strategies.
Therefore, the available funding can be used effectively to minimize the whole-of-life cost of the road asset and also improve pavement performance. This study will contribute to narrowing the knowledge gap in structural data usage in network-level investment analysis and provides a simple methodology for using structural data effectively in the investment decision-making process for road agencies managing aging road assets.
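The regression step described above can be sketched in its simplest form, ordinary least squares with a single explanatory variable. The age/deflection pairs below are hypothetical, and a real model would include the additional regressors (rutting, seal age, ESA) named in the abstract.

```python
def fit_linear(x, y):
    """Ordinary least squares for y = a + b*x, a minimal stand-in for
    the regression relating deterioration to explanatory variables."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b  # intercept, slope

# Hypothetical pavement age (years) vs. TSD maximum deflection D0 (microns)
age = [2, 5, 8, 11, 14]
d0  = [210, 240, 270, 300, 330]
a, b = fit_linear(age, d0)
```

The fitted slope is the deterioration rate; projecting the line forward gives the remaining-structural-life estimate that feeds the pavement management system.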

Keywords: adjusted structural number (SNP), maximum deflection (D0), equivalent standard axle (ESA), traffic speed deflectometer (TSD)

Procedia PDF Downloads 145
5298 Evaluation of 3D Templated Synthetic Vascular Graft Compared with Standard Graft in a Rat Model

Authors: Won-Min Jo, Suk-Hee Park, Tae-Hee Kim

Abstract:

Although the number of vascular surgeries using vascular grafts is increasing, they are limited by graft-related complications and size discrepancy. Current efforts to develop an ideal synthetic vascular graft for clinical application using tissue engineering or 3D printing are far from satisfactory. Therefore, we aimed to re-design the vascular graft with modified materials and 3D printing techniques, and we also demonstrate the improved clinical applicability of our vascular graft. We designed 3D printed polyvinyl alcohol (PVA) templates according to vessel size and shape, and these were dip-coated with salt-suspended thermoplastic polyurethane (TPU). Next, the core template was removed to obtain a customized porous TPU graft. Mechanical testing and cytotoxicity studies showed that the synthetic 3D templated vascular grafts (3DT) were more appropriate for clinical use than commercially available polytetrafluoroethylene grafts (ePTFE; standard graft, SG). Finally, we implanted the 3DTs and SGs into the rat abdominal aorta using a patch technique. Four groups of animals (SG_7 days, SG_30 days, 3DT_7 days, and 3DT_30 days) were enrolled in this study. The abdominal aorta was surgically opened and sutured with the SG or 3DT using 8/0 Prolene. The degree of endothelial cell activation, neovascularization, thrombus formation, calcification, inflammatory infiltrates, and fibrosis was analyzed histopathologically. There was significantly decreased thrombogenesis in the group treated with the 3DT for 30 days compared with the groups treated with the SG for 7 and 30 days and the 3DT for 7 days. In addition, the group treated with the 3DT for 30 days may also have shown increased postoperative endothelialization in the early stages. In conclusion, this study suggests the possibility of using the 3DT as an SG substitute in vascular surgery.

Keywords: 3D templated graft, thrombogenesis, calcification, inflammation

Procedia PDF Downloads 7
5297 Comparative Studies and Optimization of Biodiesel Production from Oils of Selected Seeds of Nigerian Origin

Authors: Ndana Mohammed, Abdullahi Musa Sabo

Abstract:

The oils used in this work were extracted from seeds of Ricinus communis, Hevea brasiliensis, Gossypium hirsutum, Azadirachta indica, Glycine max, and Jatropha curcas by solvent extraction with n-hexane, giving yields of 48.00±0.00%, 44.30±0.52%, 45.50±0.64%, 47.60±0.51%, 41.50±0.32%, and 46.50±0.71%, respectively. However, these feedstocks are challenging for the trans-esterification reaction because they were found to contain high amounts of free fatty acids (FFA) (6.37±0.18, 17.20±0.00, 6.14±0.05, 8.60±0.14, 5.35±0.07, and 4.24±0.02 mg KOH/g, in the order above). As a result, a two-stage trans-esterification process was used to produce biodiesel: acid esterification was used to reduce the high FFA content to 1% or less, and the second stage involved alkaline trans-esterification with optimization of the process conditions to obtain a high yield of quality biodiesel. The salient features of this study include characterization of the oils using AOAC and AOCS standard methods to reveal properties that may determine the viability of the sample seeds as potential feedstocks for biodiesel production, such as acid value, saponification value, peroxide value, iodine value, specific gravity, kinematic viscosity, and free fatty acid profile. The optimization of process parameters in biodiesel production was investigated. Different concentrations of the alkaline catalyst (KOH) (0.25, 0.5, 0.75, 1.0, and 1.50% w/v), methanol/oil molar ratios (3:1, 6:1, 9:1, 12:1, and 15:1), reaction temperatures (50 °C, 55 °C, 60 °C, 65 °C, and 70 °C), and stirring rates (150 rpm, 225 rpm, 300 rpm, and 375 rpm) were used to determine the optimal conditions for maximum biodiesel yield. While optimizing one parameter, the other parameters were kept fixed.
The results show the optimal biodiesel yield at a catalyst concentration of 1% and a methanol/oil molar ratio of 6:1, except for the oil from Ricinus communis, for which the optimum ratio was 9:1. A reaction temperature of 65 °C was optimal for all samples; similarly, a stirring rate of 300 rpm was optimal for all samples except the oil from Ricinus communis, for which 375 rpm was observed. The properties of the biodiesel fuel were evaluated, and the results conformed favorably to ASTM and EN standard specifications for fossil diesel and biodiesel; the biodiesel fuel produced can therefore be used as a substitute for fossil diesel. The work also reports the results of a study evaluating the effect of biodiesel storage on its physicochemical properties, to ascertain the level of deterioration with time. The values obtained for all samples fell completely outside the standard specification for biodiesel before the end of the twelve-month test period, and the samples are clearly degraded. This suggests the biodiesels from the oils of Ricinus communis, Hevea brasiliensis, Gossypium hirsutum, Azadirachta indica, Glycine max, and Jatropha curcas cannot be stored beyond twelve months.
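The one-factor-at-a-time search described above (vary one parameter over its levels while holding the others fixed) can be sketched as follows. The response-surface function is entirely hypothetical; it is constructed only so that its peak sits at the optimum the abstract reports (1% KOH, 6:1 ratio, 65 °C, 300 rpm).

```python
def yield_model(catalyst, molar_ratio, temp_c, rpm):
    """Hypothetical yield surface (illustrative only) peaking at the
    optimum reported in the abstract: 1% KOH, 6:1, 65 degC, 300 rpm."""
    return (95 - 8 * (catalyst - 1.0) ** 2 - 0.4 * (molar_ratio - 6) ** 2
               - 0.05 * (temp_c - 65) ** 2 - 0.0004 * (rpm - 300) ** 2)

def best_one_at_a_time(levels, base, index):
    """Vary one parameter over its tested levels, others held fixed."""
    scored = []
    for v in levels:
        args = list(base)
        args[index] = v
        scored.append((yield_model(*args), v))
    return max(scored)[1]

base = [1.0, 6, 65, 300]  # current best guess for each parameter
best_catalyst = best_one_at_a_time([0.25, 0.5, 0.75, 1.0, 1.5], base, 0)
best_ratio = best_one_at_a_time([3, 6, 9, 12, 15], base, 1)
```

In the actual study, each "score" is a measured biodiesel yield from a laboratory run rather than a model evaluation.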

Keywords: biodiesel, characterization, esterification, optimization, transesterification

Procedia PDF Downloads 415
5296 Accuracy of VCCT for Calculating Stress Intensity Factor in Metal Specimens Subjected to Bending Load

Authors: Sanjin Kršćanski, Josip Brnić

Abstract:

The Virtual Crack Closure Technique (VCCT) is a method for calculating the stress intensity factor (SIF) of a cracked body that is easily implemented on top of basic finite element (FE) codes and, as such, can be applied to various component geometries. It is a relatively simple method that does not require any special finite elements and is usually used for calculating stress intensity factors at the crack tip for components made of brittle materials. This paper studies the applicability and accuracy of VCCT applied to standard metal specimens containing a through-thickness crack, subjected to an in-plane bending load. Finite element analyses were performed using regular 4-node, regular 8-node, and modified quarter-point 8-node 2D elements. The stress intensity factor was calculated from the FE model results for a given crack length, using the data available from the FE analysis and a custom-programmed algorithm based on the virtual crack closure technique. The influence of finite element size on the accuracy of the calculated SIF was also studied. The final part of this paper compares the calculated stress intensity factors with results obtained from analytical expressions found in the available literature and in the ASTM standard. Results calculated by this VCCT-based algorithm were found to correlate well with the results obtained from those analytical expressions.
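The core VCCT computation can be sketched for the simplest mode I case: the energy release rate is estimated from the crack-tip nodal force and the opening displacement behind the tip, G = F·Δu / (2·Δa·t), and under plane-stress assumptions K_I = √(E·G). The FE quantities below are hypothetical values, for illustration only.

```python
import math

def vcct_mode_i_sif(nodal_force, crack_opening, da, thickness, youngs_modulus):
    """One-step VCCT for mode I: G = F*du / (2*da*t); under plane stress
    K_I = sqrt(E * G). All inputs are hypothetical illustrative values."""
    g = nodal_force * crack_opening / (2.0 * da * thickness)  # J/m^2
    return math.sqrt(youngs_modulus * g)                      # Pa*sqrt(m)

# Hypothetical crack-tip FE results: force (N), opening (m), element
# length da (m), thickness (m), Young's modulus (Pa, steel-like)
k1 = vcct_mode_i_sif(nodal_force=1200.0, crack_opening=2.0e-5,
                     da=1.0e-3, thickness=5.0e-3, youngs_modulus=200e9)
```

For plane strain, E would be replaced by E/(1 - ν²); the element length da entering the formula is exactly why the paper studies the influence of finite element size on accuracy.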

Keywords: VCCT, stress intensity factor, finite element analysis, 2D finite elements, bending

Procedia PDF Downloads 301
5295 A Study on the Quantitative Evaluation Method of Asphalt Pavement Condition through the Visual Investigation

Authors: Sungho Kim, Jaechoul Shin, Yujin Baek

Abstract:

In recent years, due to environmental impacts, the time factor, etc., various types of pavement deterioration, such as cracking, potholes, rutting, and roughness degradation, are increasing rapidly. In Korea, the Ministry of Land, Infrastructure and Transport regularly monitors the pavement condition of highways and national highways using pavement condition survey equipment and structural survey equipment. Local governments that maintain local roads, farm roads, etc. find it difficult to monitor pavement condition with such survey equipment, depending on economic conditions, skills shortages, and local conditions such as narrow roads. This study presents a quantitative evaluation method of pavement condition through visual inspection to overcome these problems on roads managed by local governments. It is difficult to evaluate rutting and roughness with the naked eye; however, the condition of cracks can be evaluated this way. Linear cracks (m), area cracks (m²), and potholes (number, m²) were investigated with the naked eye every 100 meters to survey the cracks. In this paper, the crack ratio was calculated from the observed crack condition, and the pavement condition was evaluated from the calculated crack ratio. The pavement condition survey equipment also surveyed the same sections in order to evaluate the reliability of the pavement condition evaluation based on the calculated crack ratio. The pavement condition was evaluated through the SPI (Seoul Pavement Index) and the calculated crack ratio using the field survey results. A comparison between 'the SPI considering only the crack ratio' and 'the SPI also considering rutting and roughness', using the equipment survey data, showed a margin of error below 5% when the SPI is less than 5. An SPI of 5 is considered the base point for deciding whether to maintain the pavement.
It showed that the pavement condition can be evaluated using only the crack ratio. According to the analysis results of the crack ratio between the visual inspection and the equipment survey, it has an average error of 1.86%(minimum 0.03%, maximum 9.58%). Economically, the visual inspection costs only 10% of the equipment survey and will also help the economy by creating new jobs. This paper advises that local governments maintain the pavement condition through the visual investigations. However, more research is needed to improve reliability. Acknowledgment: The author would like to thank the MOLIT (Ministry of Land, Infrastructure, and Transport). This work was carried out through the project funded by the MOLIT. The project name is 'development of 20mm grade for road surface detecting roadway condition and rapid detection automation system for removal of pothole'.
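As a rough illustration of the crack-ratio computation described above, the sketch below converts the visually surveyed quantities into a percentage of cracked pavement area. The 100 m section length comes from the abstract; the lane width and the influence width assumed for linear cracks are illustrative defaults, not values reported by the authors.

```python
def crack_ratio(linear_crack_m, area_crack_m2, pothole_m2,
                section_length_m=100.0, lane_width_m=3.5,
                linear_crack_width_m=0.3):
    """Percentage of the surveyed pavement area that is cracked.

    Linear cracks are converted to an equivalent area using an assumed
    influence width (0.3 m here is an illustrative assumption, not a
    value given in the abstract); area cracks and potholes contribute
    their measured areas directly.
    """
    surveyed_area = section_length_m * lane_width_m
    cracked_area = (linear_crack_m * linear_crack_width_m
                    + area_crack_m2 + pothole_m2)
    return 100.0 * cracked_area / surveyed_area
```

For a 100 m lane section with 10 m of linear cracks, 5 m² of area cracks and 1 m² of potholes, the function returns the cracked share of the 350 m² surveyed area.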

Keywords: asphalt pavement maintenance, crack ratio, evaluation of asphalt pavement condition, SPI (Seoul Pavement Index), visual investigation

Procedia PDF Downloads 162
5294 Mathematical Modeling of the Working Principle of Gravity Gradient Instrument

Authors: Danni Cong, Meiping Wu, Hua Mu, Xiaofeng He, Junxiang Lian, Juliang Cao, Shaokun Cai, Hao Qin

Abstract:

Gravity field is of great significance in geoscience, the national economy and national security, and gravitational gradient measurement has been extensively studied due to its higher accuracy than gravity measurement. The gravity gradient sensor, being one of the core devices of the gravity gradient instrument, plays a key role in measurement accuracy. Therefore, this paper starts by analyzing the working principle of the gravity gradient sensor using Newton's law, and then considers the relative motion between inertial and non-inertial systems to build a relatively adequate mathematical model, laying a foundation for measurement error calibration and measurement accuracy improvement.

Keywords: gravity gradient, gravity gradient sensor, accelerometer, single-axis rotation modulation

Procedia PDF Downloads 323
5293 Improvement of Visual Acuity in Patient Undergoing Occlusion Therapy

Authors: Rajib Husain, Mezbah Uddin, Mohammad Shamsal Islam, Rabeya Siddiquee

Abstract:

Purpose: To determine the improvement of visual acuity in patients undergoing occlusion therapy. Methods: This was a prospective hospital-based study of newly diagnosed amblyopia seen at the pediatric clinic of Chittagong Eye Infirmary & Training Complex. Thirty-two subjects with refractive amblyopia were examined and a questionnaire was administered. Included were all patients diagnosed with refractive amblyopia between 5 and 8 years of age, without previous amblyopia treatment, and whose parents were willing to participate in the study. Patients diagnosed with strabismic amblyopia were excluded. Patients were first given the best correction for a month. When the VA in the amblyopic eye did not improve over that month, occlusion treatment was started. Occlusion was done daily for 6-8 h together with vision therapy and was carried out for three months. Results: Of the 32 children studied, 31 had good compliance with amblyopia treatment, whereas one child had poor compliance. About 6% of the children had amblyopia from myopia, 7% from hyperopia, 32% from myopic astigmatism, 42% from hyperopic astigmatism and 13% from mixed astigmatism. The mean and standard deviation of presenting VA was 0.452±0.275 logMAR, and after the intervention of amblyopia therapy with vision therapy it was 0.155±0.157 logMAR. Of the total respondents, 21.85% had BCVA in the range 0-0.2 logMAR, 37.5% in the range 0.22-0.5 logMAR, 35.95% in the range 0.52-0.8 logMAR, and 4.7% in the range 0.82-1 logMAR; after the intervention of occlusion therapy with vision therapy, 76.6% had VA in the range 0-0.2 logMAR, 21.85% in the range 0.22-0.5 logMAR, and 1.5% in the range 0.52-0.8 logMAR. Conclusion: Amblyopia is an important concern in the pediatric age group because it can lead to visual impairment.
Thus, this study concludes that occlusion therapy with vision therapy is probably one of the best treatment methods for amblyopic patients (age 5-8 years), and that compliance and age were the most critical factors predicting a successful outcome.

Keywords: amblyopia, occlusion therapy, vision therapy, eccentric fixation, visuoscopy

Procedia PDF Downloads 501
5292 Characterization of Onboard Reliable Error Correction Code for SDRAM Controller

Authors: N. Pitcheswara Rao

Abstract:

In the process of conveying information, there is a chance of the signal being corrupted, which leads to erroneous bits in the message. The message may contain single, double or multiple bit errors. In high-reliability applications, memory can sustain multiple soft errors due to single or multiple event upsets caused by environmental factors. The traditional Hamming code with SEC-DED capability cannot address these types of errors. It is possible to use a powerful non-binary BCH code such as the Reed-Solomon code to address multiple errors; however, it can take at least a couple dozen cycles of latency to complete the first correction and runs at a relatively slow speed. To overcome this drawback, i.e., to increase speed and reduce latency, we use a Reed-Muller code.

Keywords: SEC-DED, BCH code, Reed-Solomon code, Reed-Muller code

Procedia PDF Downloads 422
5291 CD133 and CD44 - Stem Cell Markers for Prediction of Clinically Aggressive Form of Colorectal Cancer

Authors: Ognen Kostovski, Svetozar Antovic, Rubens Jovanovic, Irena Kostovska, Nikola Jankulovski

Abstract:

Introduction: Colorectal carcinoma (CRC) is one of the most common malignancies in the world. Cancer stem cell (CSC) markers are associated with aggressive cancer types and poor prognosis. The aim of this study was to determine whether the expression of the colorectal cancer stem cell markers CD133 and CD44 could be significant in predicting the clinically aggressive form of CRC. Materials and methods: Our study included ninety patients (n=90) with CRC, divided into two subgroups: metastatic CRC and non-metastatic CRC. Tumor samples were analyzed with standard histopathological methods, and then immunohistochemical analysis was performed with monoclonal antibodies against the CD133 and CD44 stem cell markers. Results: High coexpression of CD133 and CD44 was observed in 71.4% of patients with metastatic disease, compared to 37.9% of patients without metastases. Discordant expression of the two markers was found in 8% of the subgroup with metastatic CRC and in 13.4% of the subgroup without metastatic CRC. Statistical analyses showed a significant association of increased expression of CD133 and CD44 with disease stage, T category and N nodal status. In multiple regression analysis, the stage of disease was designated the factor with the greatest statistically significant influence on the expression of CD133 (p < 0.0001) and CD44 (p < 0.0001). Conclusion: Our results suggest that the coexpression of CD133 and CD44 has an important role in predicting the clinically aggressive form of CRC. Both stem cell markers can be routinely implemented in standard histopathological diagnostics and can be useful markers for pre-therapeutic oncology screening.

Keywords: colorectal carcinoma, stem cells, CD133+, CD44+

Procedia PDF Downloads 144
5290 Direct CP Violation in Baryonic B-Hadron Decays

Authors: C. Q. Geng, Y. K. Hsiao

Abstract:

We study direct CP-violating asymmetries (CPAs) in the baryonic B decays B- -> p\bar{p}M and the Λb decays Λb -> pM and Λb -> J/ΨpM, with M = π-, K-, ρ-, K*-, based on the generalized factorization method in the standard model (SM). In particular, we show that the CPAs in the vector modes B- -> p\bar{p}K*- and Λb -> pK*- can be as large as 20%. We also discuss the simplest purely baryonic decays Λb -> p\bar{p}n, p\bar{p}Λ, Λ\bar{p}Λ, and Λ\bar{Λ}Λ. We point out that some of the CPAs are promising candidates to be measured by current as well as future B facilities.

Keywords: CP violation, B decays, baryonic decays, Λb decays

Procedia PDF Downloads 254
5289 Image Distortion Correction Method of 2-MHz Side Scan Sonar for Underwater Structure Inspection

Authors: Youngseok Kim, Chul Park, Jonghwa Yi, Sangsik Choi

Abstract:

The 2-MHz Side Scan SONAR (SSS) attached to a boat for inspection of underwater structures is affected by the boat's shaking, which makes it difficult to determine the exact scale of structural damage. In this study, a motion sensor was attached to the inside of the 2-MHz SSS to obtain roll, pitch and yaw data, and an image stabilization tool was developed to correct the sonar image. Experiments confirmed that reliable data can be obtained, with an average error rate of 1.99% between the measured value and the actual distance. This makes it possible to obtain sonar data accurate enough to inspect damage in underwater structures.

Keywords: image stabilization, motion sensor, safety inspection, sonar image, underwater structure

Procedia PDF Downloads 277
5288 Optimization Aluminium Design for the Facade Second Skin toward Visual Comfort: Case Studies & Dialux Daylighting Simulation Model

Authors: Yaseri Dahlia Apritasari

Abstract:

Visual comfort is an important need of building occupants and can be fulfilled through natural lighting (daylighting) and artificial lighting. One strategy to optimize natural lighting is a second-skin facade design, which can reduce glare and support visual comfort. However, such a design can also fail to provide sufficient light intensity, because the materials, design and opening percentage of the second-skin facade block sunlight. This paper discusses aluminum material for second-skin facade designs that can fulfill optimal visual comfort, with the Multi Media Tower building as a case study. The research methodology combines quantitative and qualitative methods, through field observation, lighting measurement and a visual comfort questionnaire, and then uses simulation modeling (DIALUX 4.13, 2016) for three second-skin facade design models, through the following steps: (1) measuring visual comfort factors, namely indoor and outdoor light intensity; (2) collecting visual comfort data from building occupants; (3) making models with different second-skin facade designs; (4) simulating and analyzing the light intensity for each model against the occupants' visual comfort standard of 350 lux (Indonesia National Standard, 2010). The results show that optimization of aluminum material for the second-skin facade design can provide optimal visual comfort for building occupants, and they yield recommended aluminum opening percentages for the second-skin facade that meet this standard.

Keywords: aluminium material, facade, second skin, visual comfort

Procedia PDF Downloads 349
5287 Vitex agnus-castus Anti-Inflammatory, Antioxidants Characters and Anti-Tumor Effect in Ehrlich Ascites Carcinoma Model

Authors: Abeer Y. Ibrahim, Faten M. Ibrahim, Samah A. El-Newary, Saber F. Hendawy

Abstract:

Objective: The aim of this study is the assessment of the in-vitro anti-inflammatory and antioxidant characteristics of Vitex agnus-castus berry alcoholic extract and its fractions, as well as the in-vivo antitumor ability of the alcoholic extract and chloroform fraction against Ehrlich ascites carcinoma. Material and methods: The antioxidant properties of the crude alcoholic extract of vitex berries, as well as the petroleum ether, chloroform, ethyl acetate and butanol fractions, were evaluated in-vitro in comparison with the standard materials l-ascorbic acid (vitamin C) and butylated hydroxytoluene (BHT). The anti-inflammatory activity was investigated in cyclooxygenase (COX)-1 and COX-2 inhibition assays. Moreover, the in-vivo antitumor effects of the berry alcoholic and chloroform extracts were evaluated using the Ehrlich ascites carcinoma model. Data are presented as mean±SE and were analyzed by a one-way analysis of variance test. Results and conclusion: The crude berry extract showed potent antioxidant activity, followed by its ethyl acetate and chloroform fractions, as compared with the standards (vitamin C and BHT). The ethyl acetate fraction showed good reduction capability, metal ion chelation, hydrogen peroxide scavenging, nitric oxide scavenging and superoxide anion scavenging, while the chloroform fraction produced the highest free radical scavenging activity and total antioxidant capacity. With respect to lipid peroxidation inhibition, the crude alcoholic extract and its fractions showed weak inhibition compared with the standard materials. Regarding anti-inflammatory activity, the chloroform fraction of V. agnus-castus berries was the best COX-2 inhibitor (IC50, 135.41 µg/ml) as compared to the vitex alcoholic extract or the ethyl acetate fraction, with a weak inhibitory effect on COX-1 (IC50, 778.432 µg/ml); the lowest effect on COX-1 was recorded with the alcoholic extract.
The alcoholic extract and its fractions showed weak COX-1 inhibition activity, whereas COX-2 was inhibited completely (100%), compared with the celecoxib drug (72% at 1000 ppm). The crude alcoholic and chloroform extracts of V. agnus-castus berries significantly reduced the viable Ehrlich cell count and increased the nonviable count, with amelioration of all hematological parameters. This amelioration was reflected in an increased median survival time and a significant increase (P < 0.05) in lifespan.

Keywords: anti-inflammatory, antioxidants, ehrlich ascites carcinoma, Vitex agnus-castus

Procedia PDF Downloads 142
5286 Assessment of Airtightness Through a Standardized Procedure in a Nearly-Zero Energy Demand House

Authors: Mar Cañada Soriano, Rafael Royo-Pastor, Carolina Aparicio-Fernández, Jose-Luis Vivancos

Abstract:

The lack of insulation, along with the existence of air leakages, has a meaningful impact on the energy performance of buildings: both lead to increases in energy demand through additional heating and/or cooling loads, and both cause thermal discomfort. In order to quantify these uncontrolled air currents, pressurization and depressurization tests can be performed. Among them, the Blower Door test is a standardized procedure to determine the airtightness of a space, characterizing the rate of air leakage through the envelope surface by calculating an air flow rate indicator. Low-energy buildings complying with the Passive House design criteria are required to achieve high levels of airtightness. Due to the invisible nature of air leakages, additional tools are often needed to identify where the infiltrations take place; among them, infrared thermography is a valuable technique because it enables their detection. The aim of this study is to assess the airtightness of a typical Mediterranean dwelling house located in the Valencian orchard area (Spain) and restored under the Passive House standard, using the Blower Door test. Moreover, the building energy performance modelling tools TRNSYS (TRaNsient System Simulation program) and TRNFlow (TRaNsient Flow) were used to determine its energy performance, and the infiltrations were identified by means of infrared thermography. The low levels of infiltration obtained suggest that this house may comply with the Passive House standard.

Keywords: airtightness, blower door, trnflow, infrared thermography

Procedia PDF Downloads 120
5285 Rainfall-Runoff Forecasting Utilizing Genetic Programming Technique

Authors: Ahmed Najah Ahmed Al-Mahfoodh, Ali Najah Ahmed Al-Mahfoodh, Ahmed Al-Shafie

Abstract:

In this study, the genetic programming (GP) technique was investigated for the prediction of a set of rainfall-runoff data. To assess the effect of the input parameters on the model, a sensitivity analysis was adopted. To evaluate the performance of the proposed model, three statistical indices were used, namely the Correlation Coefficient (CC), the Mean Square Error (MSE) and the Coefficient of Efficiency (CE). The principal aim of this study is to develop a computationally efficient and robust approach for predicting rainfall-runoff, which could reduce the cost and labour of measuring these parameters. This research concentrates on the Johor River in Johor State, Malaysia.
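The three evaluation indices named above can be computed directly from observed and predicted series; a minimal sketch (assuming CE is the Nash-Sutcliffe coefficient of efficiency, as is usual in rainfall-runoff modelling) is:

```python
import math

def evaluate(obs, pred):
    """Return (CC, MSE, CE) for paired observed and predicted values.

    CC: Pearson correlation coefficient.
    MSE: mean square error.
    CE: Nash-Sutcliffe coefficient of efficiency,
        1 - sum((o - p)^2) / sum((o - mean(o))^2).
    """
    n = len(obs)
    mo = sum(obs) / n
    mp = sum(pred) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    sp = math.sqrt(sum((p - mp) ** 2 for p in pred))
    sse = sum((o - p) ** 2 for o, p in zip(obs, pred))
    cc = cov / (so * sp)
    mse = sse / n
    ce = 1.0 - sse / sum((o - mo) ** 2 for o in obs)
    return cc, mse, ce
```

A perfect forecast gives CC = 1, MSE = 0 and CE = 1; CE drops toward (and below) zero as the model does no better than the observed mean.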

Keywords: genetic programming, prediction, rainfall-runoff, Malaysia

Procedia PDF Downloads 474
5284 Real Estate Trend Prediction with Artificial Intelligence Techniques

Authors: Sophia Liang Zhou

Abstract:

For investors, businesses, consumers, and governments, an accurate assessment of future housing prices is crucial to critical decisions in resource allocation, policy formation, and investment strategies. Previous studies are contradictory about the macroeconomic determinants of housing prices and largely focused on one or two areas using point prediction. This study aims to develop data-driven models to accurately predict future housing market trends in different markets. This work studied five metropolitan areas representing different market trends and compared three time-lag situations: no lag, a 6-month lag, and a 12-month lag. Linear regression (LR), random forest (RF), and artificial neural network (ANN) models were employed to model the real estate price using datasets with the S&P/Case-Shiller home price index and 12 demographic and macroeconomic features, such as gross domestic product (GDP), resident population, and personal income, in five metropolitan areas: Boston, Dallas, New York, Chicago, and San Francisco. The data from March 2005 to December 2018 were collected from the Federal Reserve Bank, FBI, and Freddie Mac. In the original data, some factors are monthly, some quarterly, and some yearly; thus, two methods of imputing missing values, backfill and interpolation, were compared. The models were evaluated by accuracy, mean absolute error, and root mean square error. The LR and ANN models outperformed the RF model due to RF's inherent limitations. Both the ANN and LR methods generated predictive models with high accuracy ( > 95%). It was found that personal income, GDP, population, and measures of debt consistently appeared as the most important factors. It was also found that the technique used to impute missing values and the implementation of time lags can have a significant influence on model performance and require further investigation.
The best performing models varied for each area, but the backfilled 12-month lag LR models and the interpolated no-lag ANN models showed the best stable performance overall, with accuracies > 95% for each city. This study reveals the influence of input variables in different markets. It also provides evidence to support future studies in identifying the optimal time lag and data imputing methods for establishing accurate predictive models.
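The two gap-filling methods compared in the study, backfill and linear interpolation, can be sketched in a few lines for a numeric series in which None marks missing months (pure-Python stand-ins for the equivalent pandas `bfill` and `interpolate` operations):

```python
def backfill(series):
    """Fill None gaps with the next available value (backfill)."""
    out = list(series)
    nxt = None
    for i in range(len(out) - 1, -1, -1):
        if out[i] is None:
            out[i] = nxt  # trailing gaps stay None: nothing to fill from
        else:
            nxt = out[i]
    return out

def interpolate(series):
    """Fill None gaps by linear interpolation between known neighbours."""
    out = list(series)
    known = [i for i, v in enumerate(out) if v is not None]
    for a, b in zip(known, known[1:]):
        step = (out[b] - out[a]) / (b - a)
        for i in range(a + 1, b):
            out[i] = out[a] + step * (i - a)
    return out
```

For a quarterly factor expanded to monthly resolution, e.g. [1, None, None, 4], backfill yields [1, 4, 4, 4] while interpolation yields [1, 2, 3, 4]; the study's finding is that this choice measurably affects model performance.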

Keywords: linear regression, random forest, artificial neural network, real estate price prediction

Procedia PDF Downloads 99
5283 Knowledge Graph Development to Connect Earth Metadata and Standard English Queries

Authors: Gabriel Montague, Max Vilgalys, Catherine H. Crawford, Jorge Ortiz, Dava Newman

Abstract:

There has never been so much publicly accessible atmospheric and environmental data. The possibilities of these data are exciting, but the sheer volume of available datasets represents a new challenge for researchers. The task of identifying and working with a new dataset has become more difficult with the amount and variety of available data. Datasets are often documented in ways that differ substantially from the common English used to describe the same topics. This presents a barrier not only for new scientists, but for researchers looking to find comparisons across multiple datasets or specialists from other disciplines hoping to collaborate. This paper proposes a method for addressing this obstacle: creating a knowledge graph to bridge the gap between everyday English language and the technical language surrounding these datasets. Knowledge graph generation is already a well-established field, although there are some unique challenges posed by working with Earth data. One is the sheer size of the databases – it would be infeasible to replicate or analyze all the data stored by an organization like The National Aeronautics and Space Administration (NASA) or the European Space Agency. Instead, this approach identifies topics from metadata available for datasets in NASA’s Earthdata database, which can then be used to directly request and access the raw data from NASA. By starting with a single metadata standard, this paper establishes an approach that can be generalized to different databases, but leaves the challenge of metadata harmonization for future work. Topics generated from the metadata are then linked to topics from a collection of English queries through a variety of standard and custom natural language processing (NLP) methods. The results from this method are then compared to a baseline of elastic search applied to the metadata. 
This comparison shows the benefits of the proposed knowledge graph system over existing methods, particularly in interpreting natural language queries and in interpreting the topics in metadata. For the research community, this work introduces an application of NLP to the ecological and environmental sciences, expanding the possibilities of how machine learning can be applied in this discipline. But perhaps more importantly, it establishes the foundation for a platform through which common English can access knowledge that previously required considerable effort and experience. By making these public data accessible to the broader public, this work has the potential to transform environmental understanding, engagement, and action.

Keywords: earth metadata, knowledge graphs, natural language processing, question-answer systems

Procedia PDF Downloads 144
5282 Cryptocurrency as a Payment Method in the Tourism Industry: A Comparison of Volatility, Correlation and Portfolio Performance

Authors: Shu-Han Hsu, Jiho Yoon, Chwen Sheu

Abstract:

With the rapid growth of blockchain technology and cryptocurrency, various industries, including tourism, have adopted cryptocurrency as a payment method. More and more tourism companies accept payments in digital currency for flights, hotel reservations, transportation, and more. For travellers and tourists, paying in cryptocurrency has become a way to reduce costs and avoid risks. Understanding the volatility dynamics and interdependencies between standard currencies and cryptocurrencies is important for appropriate financial risk management and assists policy-makers and investors in making more informed decisions. The purpose of this paper is to understand and explain the risk spillover effects between six major cryptocurrencies and the ten most traded standard currencies. We use daily closing prices of the cryptocurrencies and currency exchange rates from 7 August 2015 to 10 December 2019, giving 1,133 observations. The diagonal BEKK model was used to analyze the co-volatility spillover effects between cryptocurrency returns and exchange rate returns, that is, how shocks to returns in different assets affect each other's subsequent volatility. The empirical results show co-volatility spillover effects between the cryptocurrency returns and the GBP/USD, CNY/USD and MXN/USD exchange rate returns. Therefore, these currencies (British Pound, Chinese Yuan and Mexican Peso) and cryptocurrencies (Bitcoin, Ethereum, Ripple, Tether, Litecoin and Stellar) are suitable for constructing a financial portfolio from an optimal risk management perspective and also for dynamic hedging purposes.
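The inputs to a BEKK-type volatility model are daily returns rather than prices; a minimal sketch of that preprocessing step (the diagonal BEKK estimation itself requires a dedicated econometrics package and is not shown here) is:

```python
import math

def log_returns(prices):
    """Daily log returns r_t = ln(P_t / P_{t-1}) from a closing-price
    series; the output is one element shorter than the input."""
    return [math.log(p1 / p0) for p0, p1 in zip(prices, prices[1:])]

def sample_covariance(r1, r2):
    """Sample covariance of two equal-length return series, the raw
    co-movement measure that a BEKK model conditions over time."""
    n = len(r1)
    m1 = sum(r1) / n
    m2 = sum(r2) / n
    return sum((a - m1) * (b - m2) for a, b in zip(r1, r2)) / (n - 1)
```

From the 1,134 daily closing prices covering 7 August 2015 to 10 December 2019, this transformation yields the 1,133 return observations analyzed in the study.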

Keywords: blockchain, co-volatility effects, cryptocurrencies, diagonal BEKK model, exchange rates, risk spillovers

Procedia PDF Downloads 140
5281 The Effect of Female Access to Healthcare and Educational Attainment on Nigerian Agricultural Productivity Level

Authors: Esther M. Folarin, Evans Osabuohien, Ademola Onabote

Abstract:

Agriculture constitutes an important part of development and poverty mitigation in lower-middle-income countries like Nigeria. However, the level of agricultural productivity in the Nigerian economy falls short of the demand needed to meet the expectations of the Nigerian populace, threatening the attainment of the United Nations (UN) Sustainable Development Goals (SDGs), including SDG-2 (achieving food security through agricultural productivity). The overall objective of the study is to assess, among other factors, the performance of the interaction variable in the model in helping to achieve greater Nigerian agricultural productivity. The study makes use of Wave 4 (2018/2019) of the Living Standard Measurement Studies, Integrated Survey on Agriculture (LSMS-ISA). Qualitative analysis of the information was also used to complement the quantitative analysis. The study employed human capital theory and Grossman's theory of health demand to explain the relationships among the variables within the model. The study uses the instrumental variable regression technique for the broad objective, among other techniques for the specific objectives. The estimation results show a positive relationship between female healthcare and the level of female agricultural productivity in Nigeria. In conclusion, the study emphasises the need for greater provision of, and empowerment through, female access to healthcare and educational attainment, which support higher female agricultural productivity and consequently an improvement in the total agricultural productivity of the Nigerian economy.

Keywords: agricultural productivity, education, female, healthcare, investment

Procedia PDF Downloads 78
5280 PRENACEL: Development and Evaluation of an M-Health Strategy to Improve Prenatal Care in Brazil

Authors: E. M. Vieira, C. S. Vieira, L. P. Bonifácio, L. M. de Oliveira Ciabati, A. C. A. Franzon, F. S. Zaratini, J. A. C. Sanchez, M. S. Andrade, J. P. Dias de Souza

Abstract:

The quality of prenatal care is key to reducing maternal morbidity and mortality, and communication between the health service and users can stimulate prevention and care. M-health has been an important and low-cost strategy for health education. The PRENACEL programme (prenatal care on the cell phone) was developed: a programme of information delivered via SMS from the 20th week of pregnancy up to the 12th week after delivery. Messages covered prenatal care, birth, contraception and breastfeeding, and pregnant women could also send questions about their health. The objective of this study was to evaluate the implementation of PRENACEL as a useful complement to standard prenatal care. Twenty health clinics were selected and randomized by cluster, 10 to the intervention group and 10 to the control group. In the intervention group, women and their partners were invited to participate; the control group received standard prenatal care. All women were interviewed in the immediate post-partum period and in the 12th and 24th weeks post-partum. Most women were married, had more than 8 years of schooling and visited the clinic more than 6 times during prenatal care. The intervention group had the lowest percentage of higher-income participants (5.6%), fewer single mothers and no drug users. It also had more prenatal care visits than the control group, was less likely to present Severe Acute Maternal Morbidity, and had a higher percentage of partners (75.4%) present at the birth. Although the study is still being carried out, preliminary data show positive results for women's compliance with prenatal care.

Keywords: cellphone, health technology, prenatal care, prevention

Procedia PDF Downloads 386
5279 Effects of Potential Chloride-Free Admixtures on Selected Mechanical Properties of Kenya Clay-Based Cement Mortars

Authors: Joseph Mwiti Marangu, Joseph Karanja Thiong'o, Jackson Muthengia Wachira

Abstract:

The mechanical performance of hydrated cement mortars mainly depends on their compressive strength and setting time. These properties are crucial in the construction industry. Pozzolana-based cements are mostly characterized by low 28-day compressive strength and long setting times, which are among the major impediments to their production and diverse use despite the numerous technological and environmental benefits associated with them. The study investigated the effects of potential chemical activators on calcined clay-Portland cement blends with the aim of achieving high early compressive strength and shorter setting times in cement mortar. In addition, the standard consistency, soundness and insoluble residue of all cement categories were determined. The test cement was made by blending calcined clays with Ordinary Portland Cement (OPC) at replacement levels from 35 to 50 percent by mass of the OPC, and is labeled PCC for the purposes of this study. Mortar prisms measuring 40mmx40mmx160mm were prepared and cured in accordance with the KS EAS 148-3:2000 standard. Solutions of Na2SO4, NaOH, Na2SiO3 and Na2CO3 at concentrations of 0.5-2.5 M were separately added during casting. Compressive strength was determined at the 2nd, 7th, 28th and 90th days of curing. For comparison purposes, commercial Portland Pozzolana Cement (PPC) and Ordinary Portland Cement (OPC) were also investigated without activators under similar conditions. X-Ray Fluorescence (XRF) was used for chemical analysis, while X-Ray Diffraction (XRD) and Fourier Transform Infrared Spectroscopy (FTIR) were used for mineralogical analysis of the test samples. The results indicated that the addition of activators significantly increased the 2nd- and 7th-day compressive strength but only minimally increased the 28th- and 90th-day compressive strength. A relatively linear relationship was observed between compressive strength and the concentration of the activator solutions up to the 28th day of curing.
Addition of the said activators significantly reduced both initial and final setting times. Standard consistency and soundness varied with the amount of clay in the test cement and the concentration of activators. The amount of insoluble residue increased with increased replacement of OPC with calcined clays. Mineralogical studies showed that N-A-S-H is formed in addition to C-S-H. In conclusion, a concentration of 2 M for all activator solutions produced the optimum compressive strength and greatly reduced the setting times for all cement mortars.

Keywords: activators, admixture, cement, clay, pozzolana

Procedia PDF Downloads 259
5278 Detection of Trends and Break Points in Climatic Indices: The Case of Umbria Region in Italy

Authors: A. Flammini, R. Morbidelli, C. Saltalippi

Abstract:

The increase of air surface temperature at the global scale is a fact, with values around 0.85 °C since the late nineteenth century, as is a significant change in the main features of the rainfall regime. Nevertheless, the detected climatic changes are not equally distributed over the world, but exhibit specific characteristics in different regions. Therefore, studying the evolution of climatic indices in different geographical areas with a prefixed standard approach is very useful for detecting climatic trends and comparing results. In this work, a methodology to investigate climatic change and its effects on a wide set of climatic indices is proposed and applied at the regional scale to a Mediterranean case study, the Umbria region in Italy. From the data of the available temperature stations, nine temperature indices were obtained, and the existence of trends was checked by applying the non-parametric Mann-Kendall test, while the non-parametric Pettitt test and the parametric Standard Normal Homogeneity Test (SNHT) were applied to detect break points. In addition, to characterize the rainfall regime, data from 11 rainfall stations were used and a trend analysis was performed on cumulative annual rainfall depth, daily rainfall, rainy days, and dry-period length. The results show a general increase in all temperature indices, although the trend pattern depends on the index and the station, and a general decrease of cumulative annual rainfall and average daily rainfall, with a within-year rainfall distribution that differs from the past.
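The Mann-Kendall test used for the trend analysis can be sketched as follows; this minimal version computes the S statistic and its normal approximation Z, ignoring the tie correction that a production implementation would include:

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction).

    S sums the signs of all pairwise differences x[j] - x[i] for j > i;
    a large positive S indicates an increasing trend. Z is the
    continuity-corrected normal approximation, compared against 1.96
    for a two-sided test at the 5% significance level.
    """
    n = len(x)
    s = sum((xj > xi) - (xj < xi)
            for i, xi in enumerate(x)
            for xj in x[i + 1:])
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z
```

Applied to an annual temperature-index series, |Z| > 1.96 flags a significant monotonic trend; the Pettitt test and SNHT mentioned above address the separate question of where a break point occurs.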

Keywords: climatic change, temperature, rainfall regime, trend analysis

Procedia PDF Downloads 112