Search results for: evaluation methodology
5345 Library Screening and Evaluation of Mycobacterium tuberculosis Ketol-Acid Reductoisomerase Inhibitors
Authors: Vagolu S. Krishna, Shan Zheng, Estharla M. Rekha, Luke W. Guddat, Dharmarajan Sriram
Abstract:
Tuberculosis (TB) remains a major threat to human health. This is because current drug treatments are less than optimal, and because multidrug-resistant and extensively drug-resistant strains of the etiological agent, Mycobacterium tuberculosis (Mt), are increasingly common. Given the widespread significance of this disease, we have undertaken a design and evaluation program to discover new anti-TB drug leads. Here, our attention is focused on ketol-acid reductoisomerase (KARI), the second enzyme in the branched-chain amino acid biosynthesis pathway. Importantly, this enzyme is present in bacteria but not in humans, making it an attractive proposition for drug discovery. In the present work, we used high-throughput virtual screening to identify seventeen potential inhibitors of KARI using the Birla Institute of Technology and Science in-house database. Compounds were selected based on high docking scores, which were assigned as the result of favourable interactions between the compound and the active site of KARI. The Ki values for two leads, compounds 14 and 16, are 3.71 and 3.06 µM, respectively, for Mt KARI. To assess the mode of binding, 100 ns molecular dynamics simulations of these two compounds in association with Mt KARI were performed and showed that the complex was stable, with an average RMSD of less than 2.5 Å for all atoms. Compound 16 showed an MIC of 2.06 ± 0.91 µM and a 1.9-fold logarithmic reduction in the growth of Mt in an infected macrophage model. The two compounds exhibited low toxicity against murine macrophage RAW 264.7 cell lines. Thus, both compounds are promising candidates for development as anti-TB drug leads.
Keywords: ketol-acid reductoisomerase, macrophage, molecular docking and dynamics, tuberculosis
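The stability criterion reported here (average RMSD below 2.5 Å over the 100 ns trajectory) reduces to a simple formula. A minimal sketch in Python, assuming already-superimposed coordinate arrays (no rotational fitting, which real MD analysis tools perform before averaging over trajectory frames):

```python
import numpy as np

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation between two conformations.

    coords_a, coords_b: (N, 3) arrays of atomic positions (angstroms),
    assumed to be already superimposed (no rotational fitting here).
    """
    diff = coords_a - coords_b
    return np.sqrt((diff ** 2).sum(axis=1).mean())

# Toy example: 3 atoms all displaced by 1 A along x -> RMSD is exactly 1 A.
ref = np.zeros((3, 3))
moved = ref.copy()
moved[:, 0] += 1.0
print(rmsd(ref, moved))  # 1.0
```

In trajectory analysis, this quantity is evaluated frame by frame against a reference structure and then averaged, which is the figure the abstract quotes.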
Procedia PDF Downloads 121
5344 Decision Support System for Fetus Status Evaluation Using Cardiotocograms
Authors: Oyebade K. Oyedotun
Abstract:
The cardiotocogram is a technical recording of the heartbeat rate and uterine contractions of a fetus during pregnancy. During pregnancy, several complications can occur to both the mother and the fetus; hence it is crucial that medical experts have technical means to check the health of the mother and especially the fetus. It is very important that the fetus develops as expected in stages during the pregnancy period; however, monitoring the health status of the fetus is not easily achieved, as the fetus is not wholly physically available to medical experts for inspection. Hence, doctors have to resort to other tests that can give an indication of the status of the fetus. One such diagnostic test is to obtain cardiotocograms of the fetus. From the analysis of the cardiotocograms, medical experts can determine the status of the fetus and therefore the necessary medical interventions. Generally, medical experts classify examined cardiotocograms as ‘normal’, ‘suspect’, or ‘pathological’. This work presents an artificial neural network based decision support system which can filter cardiotocogram data, producing the corresponding statuses of the fetuses. The capability of artificial neural networks to explore cardiotocogram data and learn features that distinguish one class from the others has been exploited in this research. In this research, feedforward and radial basis neural networks were trained on a publicly available database to classify the processed cardiotocogram data into one of the three classes: ‘normal’, ‘suspect’, or ‘pathological’. Classification accuracies of 87.8% and 89.2% were achieved during the test phase for the feedforward and radial basis neural networks, respectively.
While the system described in this work may not be a complete replacement for a medical expert in fetus status evaluation, it is hoped that it can significantly reinforce the confidence in diagnoses reached by experts.
Keywords: decision support, cardiotocogram, classification, neural networks
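The feedforward approach described above can be sketched in a few lines. The snippet below trains a tiny one-hidden-layer network with a softmax output on synthetic three-class data standing in for CTG features; the data, layer sizes, and learning rate are illustrative only (the real cardiotocography dataset has about 21 features per recording):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for CTG features: three well-separated clusters
# playing the roles of 'normal', 'suspect', and 'pathological'.
X = np.vstack([rng.normal(c, 0.5, size=(50, 2)) for c in (0.0, 3.0, 6.0)])
y = np.repeat([0, 1, 2], 50)
Y = np.eye(3)[y]                        # one-hot targets

# One hidden layer, trained with full-batch gradient descent.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 3)); b2 = np.zeros(3)
lr = 0.1

for _ in range(2000):
    H = np.tanh(X @ W1 + b1)            # hidden activations
    Z = H @ W2 + b2
    P = np.exp(Z - Z.max(axis=1, keepdims=True))
    P /= P.sum(axis=1, keepdims=True)   # softmax class probabilities
    G = (P - Y) / len(X)                # softmax cross-entropy gradient
    GH = (G @ W2.T) * (1 - H ** 2)      # backprop through tanh
    W2 -= lr * H.T @ G;  b2 -= lr * G.sum(axis=0)
    W1 -= lr * X.T @ GH; b1 -= lr * GH.sum(axis=0)

acc = (P.argmax(axis=1) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

A radial basis network replaces the tanh hidden layer with Gaussian kernels centered on prototypes; the softmax output and training loop are otherwise analogous.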
Procedia PDF Downloads 331
5343 The Microbial Evaluation of Cow Raw Milk Used in Private Dairy Factories in Zawia City, Libya
Authors: Obied A. Alwan, Elgerbi, M. Ali
Abstract:
This study was conducted on the cow milk used in the local milk factories of Zawia. Samples were collected by completely random, unscheduled sampling. The microbiological results showed that the total bacterial count and the E. coli count were very high, and that all the manufacturing places included in the study lacked adequate sanitary conditions.
Keywords: raw milk, dairy factories, Libya, microbiologic
Procedia PDF Downloads 437
5342 Management of Cultural Heritage: Bologna Gates
Authors: Alfonso Ippolito, Cristiana Bartolomei
Abstract:
A growing demand is felt today for realistic 3D models enabling the cognition and popularization of historical-artistic heritage. Evaluation and preservation of cultural heritage are inextricably connected with the innovative processes of gaining, managing, and using knowledge. The development and perfecting of techniques for acquiring and elaborating photorealistic 3D models has made them pivotal elements for popularizing information about objects on the scale of architectonic structures.
Keywords: cultural heritage, databases, non-contact survey, 2D-3D models
Procedia PDF Downloads 421
5341 WWSE School Development in German Christian Schools Revisited: Organizational Development Taken to a Test
Authors: Marco Sewald
Abstract:
WWSE School Development (Wahrnehmungs- und wertorientierte Schulentwicklung) comprises surveys of pupils, teachers, and parents and enables schools to align their development with the requirements voiced by these three stakeholder groups. WWSE includes a derivative set of questions for Christian schools, meeting their specific needs. The research conducted on WWSE reflects contemporary questions on school development, examining the quality of the implementation of the results of past surveys delivered by WWSE School Development in Christian schools in Germany. The research focused on questions connected to organizational development, including leadership and change management, contoured against the two other areas of WWSE: human resources development and development of school teaching methods. The chosen research methods are: (1) a quantitative triangulation on three sets of data: data from a past evaluation taken in 2011, data from a second evaluation covering the same school conducted in 2014, and a structured survey among the teachers, headmasters, and members of the school board taken within the research; (2) interviews with teachers and headmasters, conducted as a second stage to fortify the results of the quantitative first stage. Results: WWSE supports modern school development. While organizational development, leadership, and change management are proven to be important for modern school development, these areas are widely underestimated by teachers and headmasters, especially in comparison to the field of human resource development and, to an even greater extent, in comparison to the area of development of school teaching methods.
The research concluded that additional efforts in the area of organizational development are necessary to meet modern demands, and it also shows which areas are the most important ones.
Keywords: school as a social organization, school development, school leadership, WWSE, Wahrnehmungs- und wertorientierte Schulentwicklung
Procedia PDF Downloads 224
5340 Evaluation of the Grammar Questions at the Undergraduate Level
Authors: Preeti Gacche
Abstract:
A considerable part of undergraduate-level English examination papers is devoted to grammar. Hence, the grammar questions in the question papers are evaluated, and the opinions of both students and teachers about them are obtained and analyzed. A grammar test of 100 marks was administered to 43 students to check their performance. The question papers were evaluated by 10 different teachers and their scores compared. The analysis of 38 university question papers reveals that, on average, 20 percent of marks are allotted to grammar. Almost all the grammar topics are tested. Abundant use of grammatical terminology is observed in the questions. Decontextualization, repetition, the possibility of multiple correct answers, and grammatical errors in framing the questions have been observed. Opinions of teachers and students about grammar questions vary in many respects. The students' responses are analyzed medium-wise and sex-wise. The medium of instruction at the school level and the sex of the students are found to play no role as far as interest in the study of grammar is concerned. English-medium students solve grammar questions intuitively, whereas non-English-medium students are required to recollect the rules of grammar. Prepositions, verbs, articles, and modal auxiliaries are found to be easy topics for most students, whereas the use of conjunctions is the most difficult topic. Out-of-context items of grammar are difficult to answer in comparison with contextualized items; hence, contextualized texts to test grammar items are desirable. No formal training in setting questions is imparted to teachers by competent authorities like the university. They need to be trained in testing. Statistically, there is no significant change of score with a change in the rater in the testing of grammar items. There is scope for future improvement.
The question papers need to be evaluated, and feedback needs to be obtained from students and teachers for future improvement.
Keywords: context, evaluation, grammar, tests
Procedia PDF Downloads 351
5339 Comprehensive Evaluation of Oral and Maxillofacial Radiology in "COVID-19"
Authors: Sahar Heidary, Ramin Ghasemi Shayan
Abstract:
The recent coronavirus disease 2019 (COVID-19) outbreak has brought considerable challenges to the world health system, including the practice of dental and maxillofacial radiology (DMFR). DMFR will keep a vital role in healthcare throughout this crisis. Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), the virus causing the current COVID-19 pandemic, is not only extremely contagious but can also cause serious consequences in susceptible persons, including dental patients and dental health care personnel (DHCP). Responses to COVID-19 have been published by the Centers for Disease Control and Prevention and the American Dental Association, but a more detailed response is necessary for the safe practice of oral and maxillofacial radiology. Our goal is to review the existing information on how the illness threatens patients and DHCP and how to determine which patients are likely to be SARS-CoV-2 infected; to study how the use of personal protective equipment and infection control measures based on current best practices and knowledge can decrease the risk of virus spread in radiologic procedures; and to examine how intraoral radiography, with its inherently higher risk of spreading infection, might be replaced by extraoral radiographic techniques for certain diagnostic tasks. During the pandemic, teleradiology has been used extensively for diagnostic purposes in COVID-19 patients, for consultations with radiologists in urgent cases, and for bridging the distance between radiology clinics. Dentists can receive the digital radiographic images of their emergency patients through online services, e-mail, or messaging applications and view them on their smartphones, laptops, or other electronic devices.
Keywords: radiology, dental, oral, COVID-19, infection
Procedia PDF Downloads 171
5338 Performance Evaluation of Fingerprint, Auto-Pin and Password-Based Security Systems in Cloud Computing Environment
Authors: Emmanuel Ogala
Abstract:
Cloud computing has been envisioned as the next-generation architecture of the Information Technology (IT) enterprise. In contrast to traditional solutions, where IT services are under physical, logical, and personnel controls, cloud computing moves application software and databases to large data centres, where the management of the data and services may not be fully trustworthy. This is because the systems are open to the whole world, and as legitimate users try to access the system, many people are also trying, day in and day out, to gain unauthorized access. This research contributes to the improvement of cloud computing security for better operation. The work is motivated by two problems: first, the observed ease of access to cloud computing resources and the complexity of attacks on vital cloud computing data systems require that dynamic security mechanisms evolve to stay capable of preventing illegitimate access; second, the lack of a good methodology for performance testing and evaluation of biometric security algorithms for securing records in a cloud computing environment. The aim of this research was to evaluate the performance of an integrated security system (ISS) for securing examination records in a cloud computing environment. We designed and implemented an ISS combining three security mechanisms, namely biometric (fingerprint), auto-PIN, and password, into one stream of access control, and used it to secure examination records at Kogi State University, Anyigba. The system we built has been able to overcome the guessing abilities of hackers who guess users' passwords or PINs. We are certain of this because the added security mechanism (fingerprint) requires the presence of the user before login access can be granted, based on the placement of the user's finger on the fingerprint biometric scanner for capture and verification of the user's authenticity.
The study adopted a quantitative design with an object-oriented design methodology. In the analysis and design, PHP, HTML5, CSS, Visual Studio, JavaScript, and Web 2.0 technologies were used to implement the ISS model for the cloud computing environment. PHP, HTML5, and CSS were used in conjunction with the Visual Studio front-end design tools; MySQL and Access 7.0 were used for the back-end engine; and JavaScript was used for object arrangement and for validation of user input as a security check. Finally, the performance of the developed framework was evaluated by comparing it with the two other security systems existing within the school (auto-PIN and password). The results showed that the developed approach (fingerprint) overcomes the two main weaknesses of the existing systems and will work well if fully implemented.
Keywords: performance evaluation, fingerprint, auto-pin, password-based, security systems, cloud computing environment
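The layered access-control idea described above, where password and PIN checks are reinforced by a fingerprint check, can be illustrated abstractly. All names, the hashing scheme, and the byte-equality "matcher" below are hypothetical placeholders, not the authors' implementation (which is PHP-based and uses a real biometric scanner):

```python
import hashlib
import hmac

def sha256(s: str) -> str:
    return hashlib.sha256(s.encode()).hexdigest()

# Hypothetical stored record for one user (names and scheme are
# illustrative only).
RECORD = {
    "pwd_hash": sha256("s3cret"),
    "pin_hash": sha256("4821"),
    "fp_template": b"\x10\x22\x31\x47",   # stand-in for a minutiae template
}

def fingerprint_match(scan: bytes, template: bytes) -> bool:
    # Real matchers compare minutiae sets with a similarity score;
    # byte equality is only a placeholder for that step.
    return hmac.compare_digest(scan, template)

def login(password: str, pin: str, scan: bytes) -> bool:
    """All three factors must pass before access is granted."""
    return (hmac.compare_digest(sha256(password), RECORD["pwd_hash"])
            and hmac.compare_digest(sha256(pin), RECORD["pin_hash"])
            and fingerprint_match(scan, RECORD["fp_template"]))

print(login("s3cret", "4821", b"\x10\x22\x31\x47"))  # True
print(login("s3cret", "4821", b"\x00\x00\x00\x00"))  # False: guessed credentials fail without the fingerprint
```

The point the abstract makes is visible in the last line: even fully guessed knowledge factors cannot pass without the presence-dependent biometric factor.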
Procedia PDF Downloads 139
5337 The Effects of the Introduction of a One-Day Waiting Period on Absences for Ordinary Illness of Public Employees
Authors: Mohamed Ali Ben Halima, Malik Koubi, Joseph Lanfranchi, Yohan Wloczysiak
Abstract:
This article assesses the consequences on the frequency and duration of ordinary sick leave of the January 2012 and 2018 reforms modifying the scope of sick leave reimbursement in the French civil service. These reforms introduced a one-day waiting period which removes compensation for the first day of ordinary sick leave. In order to evaluate these reforms, we use an administrative database from the National Pension Fund for local public employees (FPT). The first important result of our data analysis is that the one-day waiting period was not introduced at the same time in all French local public service establishments, and in some it was never introduced. This peculiarity allows for an identification strategy using a difference-in-differences method based on the definition, at each date, of groups of employees treated and not treated by the reform, since establishments that apply the one-day waiting period coexist with establishments that do not. Two types of estimators are used for this evaluation: individual and time fixed effects estimators, and DIDM estimators, which correct for the biases of the two-way fixed effects estimator. The results confirm that the change in the sick pay system decreases the probability of having at least one ordinary sick leave as well as the number and duration of these episodes. On the other hand, the estimates show that longer leave episodes are not less affected than shorter ones. Finally, the validity tests of the estimators support the results obtained for the second period of 2018-2019, but suggest estimation biases for the period 2012-2013. The extent to which the endogeneity of local implementation choices impacts these estimates needs to be further tested.
Keywords: sick leave, one-day waiting period, territorial civil service, public policy evaluation
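A two-way fixed effects difference-in-differences estimate of the kind used here can be sketched on simulated data. The data-generating process, group sizes, and effect size below are invented for illustration, and the design is simplified to a single adoption date (the paper exploits staggered and never-adopting establishments, which is what motivates its DIDM correction):

```python
import numpy as np

rng = np.random.default_rng(1)
n_units, n_periods = 40, 6
true_effect = -0.8          # assumed drop in sick-leave outcome (illustrative)

unit_fe = rng.normal(0, 1, n_units)
time_fe = rng.normal(0, 1, n_periods)
treated_unit = np.arange(n_units) < 20      # establishments applying the waiting period
post = np.arange(n_periods) >= 3            # periods after the reform

rows = []
for i in range(n_units):
    for t in range(n_periods):
        d = float(treated_unit[i] and post[t])
        y = unit_fe[i] + time_fe[t] + true_effect * d + rng.normal(0, 0.1)
        rows.append((i, t, d, y))

i_idx, t_idx, D, Y = map(np.array, zip(*rows))

# Two-way fixed effects: regress Y on the treatment dummy plus unit and
# time dummies (one time dummy dropped; unit dummies absorb the constant).
X = np.column_stack([D,
                     np.eye(n_units)[i_idx.astype(int)],
                     np.eye(n_periods)[t_idx.astype(int)][:, 1:]])
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(f"estimated treatment effect: {beta[0]:.3f}")   # close to -0.8
```

With staggered adoption and heterogeneous effects, this estimator can be biased, which is exactly why the authors complement it with DIDM estimators.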
Procedia PDF Downloads 83
5336 Evaluation of Cooperative Hand Movement Capacity in Stroke Patients Using the Cooperative Activity Stroke Assessment
Authors: F. A. Thomas, M. Schrafl-Altermatt, R. Treier, S. Kaufmann
Abstract:
Stroke is the main cause of adult disability. Upper limb function in particular is affected in most patients. Recently, cooperative hand movements have been shown to be a promising type of upper limb training in stroke rehabilitation. In these movements, which are frequently found in activities of daily living (e.g., opening a bottle, winding up a blind), the force of one upper limb has to be equally counteracted by the other limb to successfully accomplish a task. The use of standardized and reliable clinical assessments is essential to evaluate the efficacy of therapy and the functional outcome of a patient. Many assessments for upper limb function or impairment are available; however, cooperative hand movement tasks are rarely included in them. Thus, the aim of this study was (i) to develop a novel clinical assessment (CASA - Cooperative Activity Stroke Assessment) for the evaluation of patients' capacity to perform cooperative hand movements and (ii) to test its intra- and interrater reliability. Furthermore, CASA scores were compared to the current gold-standard assessments for the upper extremity in stroke patients (i.e., Fugl-Meyer Assessment, Box & Blocks Test). The CASA consists of five cooperative activities of daily living: (1) opening a jar, (2) opening a bottle, (3) opening and closing a zip, (4) unscrewing a nut, and (5) opening a clip box. Here, the goal is to accomplish the tasks as fast as possible. In addition to the quantitative rating (i.e., time), which is converted to a 7-point scale, the quality of the movement is also rated on a 4-point scale. To test the reliability of the CASA, fifteen stroke subjects were tested twice within a week by the same two raters. Intra- and interrater reliability was calculated using the intraclass correlation coefficient (ICC) for the total CASA score and single items.
Furthermore, Pearson correlation was used to compare the CASA scores to the scores of the Fugl-Meyer upper limb assessment and the Box & Blocks test, which were assessed in every patient in addition to the CASA. ICC scores indicated excellent intra- and interrater reliability for the total CASA score and good to excellent reliability for single items. Furthermore, the CASA score was significantly correlated with the Fugl-Meyer and Box & Blocks scores. The CASA provides a reliable assessment for cooperative hand movements, which are crucial for many activities of daily living. Due to its inexpensive setup and easy and fast implementation, we suggest it is well suited for clinical application. In conclusion, the CASA is a useful tool for assessing the functional status and therapy-related recovery of cooperative hand movement capacity in stroke patients.
Keywords: activities of daily living, clinical assessment, cooperative hand movements, reliability, stroke
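The ICC variant typically reported for this design (two-way random effects, absolute agreement, single rater; Shrout-Fleiss ICC(2,1)) can be computed directly from its mean-square components. The scores below are invented for illustration; the abstract does not publish raw ratings:

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    ratings: (n_subjects, k_raters) array."""
    n, k = ratings.shape
    mean_r = ratings.mean(axis=1)       # per-subject means
    mean_c = ratings.mean(axis=0)       # per-rater means
    grand = ratings.mean()
    msr = k * ((mean_r - grand) ** 2).sum() / (n - 1)   # between subjects
    msc = n * ((mean_c - grand) ** 2).sum() / (k - 1)   # between raters
    sse = ((ratings - mean_r[:, None] - mean_c[None, :] + grand) ** 2).sum()
    mse = sse / ((n - 1) * (k - 1))                     # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical CASA total scores: 6 patients scored by 2 raters.
scores = np.array([[28, 27], [14, 15], [20, 20], [7, 8], [31, 30], [18, 17]])
print(f"ICC(2,1) = {icc_2_1(scores.astype(float)):.3f}")
```

Because the two raters here differ by at most one point while the patients span a wide score range, the resulting ICC is close to 1, matching the "excellent reliability" interpretation used in the abstract.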
Procedia PDF Downloads 318
5335 Robust Segmentation of Salient Features in Automatic Breast Ultrasound (ABUS) Images
Authors: Lamees Nasser, Yago Diez, Robert Martí, Joan Martí, Ibrahim Sadek
Abstract:
Automated 3D breast ultrasound (ABUS) screening is a novel modality in medical imaging because of the characteristics it shares with other ultrasound modalities, in addition to the three orthogonal planes (i.e., axial, sagittal, and coronal) that are useful in the analysis of tumors. In the literature, few automatic approaches exist for typical tasks such as segmentation or registration. In this work, we deal with two problems concerning ABUS images: nipple and rib detection. The nipple and ribs are the most visible and salient features in ABUS images. Determining the nipple position plays a key role in some applications, for example, evaluation of registration results or lesion follow-up. We present a nipple detection algorithm based on the color and shape of the nipple, as well as an automatic approach to detect the ribs. In fact, rib detection is considered one of the main stages in chest wall segmentation. This approach consists of four steps. First, images are normalized in order to minimize the intensity variability for a given set of regions within the same image or a set of images. Second, the normalized images are smoothed using an anisotropic diffusion filter. Next, the ribs are detected in each slice by analyzing the eigenvalues of the 3D Hessian matrix. Finally, a breast mask and a probability map of regions detected as ribs are used to remove false positives (FP). Qualitative and quantitative evaluations were performed on a total of 22 cases. Over all cases, the average and standard deviation of the root mean square error (RMSE) between manually annotated points placed on the rib surface and detected points on the rib borders are 15.1188 mm and 14.7184 mm, respectively.
Keywords: automated 3D breast ultrasound, eigenvalues of Hessian matrix, nipple detection, rib detection
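The Hessian-eigenvalue step of the rib detection can be sketched on a toy volume. A bright sheet-like structure (a rib surface cross-section) yields one strongly negative eigenvalue and two near-zero ones. The sketch below uses plain finite differences via np.gradient; a real pipeline would first diffuse/smooth and compute Gaussian derivatives at an appropriate scale:

```python
import numpy as np

def hessian_eigenvalues(volume):
    """Per-voxel eigenvalues of the 3D Hessian, sorted by magnitude.
    Bright sheet-like structures give one large negative eigenvalue
    and two near-zero ones."""
    grads = np.gradient(volume)
    H = np.empty(volume.shape + (3, 3))
    for i in range(3):
        second = np.gradient(grads[i])
        for j in range(3):
            H[..., i, j] = second[j]
    eig = np.linalg.eigvalsh(H)                       # ascending order
    order = np.argsort(np.abs(eig), axis=-1)
    return np.take_along_axis(eig, order, axis=-1)    # |l1| <= |l2| <= |l3|

# Toy volume containing a bright plane at z = 8.
vol = np.zeros((16, 16, 16))
vol[8, :, :] = 1.0
lam = hessian_eigenvalues(vol)
print(lam[8, 8, 8])  # dominant eigenvalue strongly negative on the sheet
```

Thresholding the magnitude-dominant eigenvalue (and checking that the two others stay small) is the usual sheetness test behind this kind of rib candidate map.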
Procedia PDF Downloads 329
5334 Qualitative Analysis of Current Child Custody Evaluation Practices
Authors: Carolyn J. Ortega, Stephen E. Berger
Abstract:
The role of the custody evaluator is perhaps one of the most controversial and risky endeavors in clinical practice. A child-custody evaluation is the second most common reason for complaints filed with licensing boards. Although the evaluator is expected to answer for the family-law court what is in the “best interest of the child,” there is a lack of clarity on how to establish this in any empirically validated manner. Hence, practitioners must contend with a nebulous framework in formulating their methodological procedures, which inherently places them at risk in an already litigious context. This study qualitatively investigated patterns of practice among doctoral practitioners conducting child custody evaluations in the area of Southern California. Ten psychologists were interviewed who devoted between 25 and 100% of their California private practice to custody work. All held Ph.D. degrees, with a range of eight to 36 years of experience in custody work. Semi-structured interviews were used to investigate assessment practices, adherence to guidelines, risk management, and qualities of evaluators. Forty-three specific themes were identified using Interpretive Phenomenological Analysis (IPA). Seven higher-order themes clustered on salient factors: use of ethics, law, and guidelines; parent variables; child variables; psychologist variables; testing; literature; and trends. Evaluators were aware of the ever-present reality of a licensure complaint and thus presented idiosyncratic descriptions of risk management considerations. Ambiguity about quantifying and validly tapping parenting abilities was also reviewed. Findings from this study suggest a high reliance on unstructured and observational methods in child custody practices.
Keywords: forensic psychology, psychological testing, assessment methodology, child custody
Procedia PDF Downloads 283
5333 Household Climate-Resilience Index Development for the Health Sector in Tanzania: Use of Demographic and Health Surveys Data Linked with Remote Sensing
Authors: Heribert R. Kaijage, Samuel N. A. Codjoe, Simon H. D. Mamuya, Mangi J. Ezekiel
Abstract:
There is strong evidence that the climate has changed significantly, affecting various sectors including public health. The recommended feasible solution is adopting development trajectories which combine both mitigation and adaptation measures for improving resilience pathways. This approach demands consideration of the complex interactions between climate and social-ecological systems. While other sectors such as agriculture and water have developed climate resilience indices, the public health sector in Tanzania is still lagging behind. The aim of this study was to find out how Demographic and Health Surveys (DHS) linked with remote sensing (RS) technology and meteorological information can be used as tools to inform climate change resilient development and evaluation for the health sector. A methodological review was conducted whereby a number of studies were content-analyzed to find appropriate indicators and indices for household climate resilience and approaches for their integration. These indicators were critically reviewed, listed, filtered, and their sources determined. Preliminary identification and ranking of indicators were conducted using a participatory approach of pairwise weighting by selected national stakeholders from meetings/conferences on human health and climate change sciences in Tanzania. DHS datasets were retrieved from the MEASURE Evaluation project, processed, and critically analyzed for possible climate change indicators. Other sources for indicators of climate change exposure were also identified. For the purpose of preliminary reporting, the operationalization of selected indicators was discussed to produce a methodological approach to be used in a comparative resilience analysis study. It was found that the household climate resilience index depends on the combination of three indices, namely Household Adaptive and Mitigation Capacity (HC), Household Health Sensitivity (HHS), and Household Exposure Status (HES).
It was also found that DHS data alone cannot support resilience evaluation unless integrated with other data sources, notably flooding data as a measure of vulnerability, remote sensing imagery for the Normalized Difference Vegetation Index (NDVI), and meteorological data (deviation from rainfall patterns). It can be concluded that if these indices retrieved from DHS datasets are computed and scientifically integrated, they can produce a single climate resilience index, and resilience maps could be generated at different spatial and time scales to enhance targeted interventions for climate resilient development and evaluations. However, further studies are needed to test the sensitivity of the index in comparative resilience analysis among selected regions.
Keywords: climate change, resilience, remote sensing, demographic and health surveys
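One way such a composite index can be operationalized is to normalize the three sub-indices and combine them so that capacity raises resilience while sensitivity and exposure lower it. The abstract does not specify the aggregation formula or weights, so the min-max normalization, equal weights, and household values below are all assumptions for illustration:

```python
import numpy as np

def minmax(x):
    x = np.asarray(x, dtype=float)
    return (x - x.min()) / (x.max() - x.min())

def resilience_index(hc, hhs, hes, weights=(1/3, 1/3, 1/3)):
    """Illustrative composite: adaptive/mitigation capacity (HC) raises
    resilience; health sensitivity (HHS) and exposure (HES) lower it.
    The actual weighting scheme is not given in the abstract; equal
    weights are assumed here."""
    w1, w2, w3 = weights
    return w1 * minmax(hc) + w2 * (1 - minmax(hhs)) + w3 * (1 - minmax(hes))

# Five hypothetical households.
hc  = [0.9, 0.4, 0.7, 0.2, 0.6]   # adaptive & mitigation capacity
hhs = [0.1, 0.8, 0.3, 0.9, 0.5]   # health sensitivity
hes = [0.2, 0.7, 0.4, 0.8, 0.3]   # exposure (e.g. flood/NDVI/rainfall based)
print(np.round(resilience_index(hc, hhs, hes), 3))
```

Scores computed this way lie in [0, 1] per component and can be mapped per enumeration area, which is the "resilience maps at different spatial and time scales" idea the abstract describes.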
Procedia PDF Downloads 164
5332 Low-Cost Image Processing System for Evaluating Pavement Surface Distress
Authors: Keerti Kembhavi, M. R. Archana, V. Anjaneyappa
Abstract:
Most asphalt pavement condition evaluations use rating frameworks in which asphalt pavement distress is estimated by type, extent, and severity. Rating is carried out via the pavement condition rating (PCR), which is tedious and expensive. This paper presents the development of a low-cost technique for pavement distress image analysis that permits the identification of potholes and cracks. The paper explores the application of image processing tools for the detection of potholes and cracks. Longitudinal cracking and potholes are detected using Fuzzy C-Means (FCM) clustering followed by a spectral clustering algorithm. The framework comprises three phases: image acquisition, processing, and extraction of features. A digital camera (GoPro) with a holder is used to capture pavement distress images from a moving vehicle. The FCM classifier and spectral clustering algorithms are used to compute features and classify longitudinal cracking and potholes. The MATLAB R2016a image processing toolkit is used for performance analysis to identify the viability of pavement distress detection on selected urban stretches of Bengaluru city, India. The outcomes of image evaluation with the semi-automated image processing framework captured the features of longitudinal cracks and potholes with an accuracy of about 80%. Further, the detected images were validated against the actual dimensions, and the dimension variability is about 0.46. The linear regression model y = 1.171x - 0.155 is obtained using the existing and experimental (image processing) areas. The R² value obtained from the best-fit line is 0.807, which is considered a 'large positive linear association'.
Keywords: crack detection, pothole detection, spectral clustering, fuzzy c-means
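The Fuzzy C-Means step can be sketched in plain NumPy. The one-dimensional "intensity" feature below is a stand-in for the actual image features; a real crack/pothole pipeline clusters features extracted from the pavement images, and the parameter values are illustrative:

```python
import numpy as np

def fcm(X, c=2, m=2.0, iters=100, seed=0):
    """Minimal Fuzzy C-Means. X: (n, d) data; returns (centers, U)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)           # fuzzy memberships
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.maximum(d, 1e-12)                # avoid division by zero
        U = 1.0 / (d ** (2 / (m - 1)))          # inverse-distance update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Toy 'intensity' features: dark distress pixels vs. bright pavement.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.2, 0.05, (100, 1)),   # crack/pothole-like
               rng.normal(0.8, 0.05, (100, 1))])  # intact surface
centers, U = fcm(X, c=2)
print(np.sort(centers.ravel()))  # approximately [0.2, 0.8]
```

Thresholding the membership matrix U then yields a soft distress mask, on which connected regions can be classified as longitudinal cracks or potholes by shape.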
Procedia PDF Downloads 181
5331 MRI Quality Control Using Texture Analysis and Spatial Metrics
Authors: Kumar Kanudkuri, A. Sandhya
Abstract:
Typically, in an MRI clinical setting, several protocols are run, each indicated for a specific anatomy and disease condition. However, these protocols, or parameters within them, can change over time due to changes in the recommendations of physician groups, updates to the software, or the availability of new technologies. Most of the time, the changes are made by the MRI technologist to account for time, coverage, physiological, or Specific Absorption Rate (SAR) reasons. It is therefore important to give MRI technologists proper guidelines so that they do not change parameters in ways that negatively impact image quality. Typically, a standard American College of Radiology (ACR) MRI phantom is used for Quality Control (QC) in order to guarantee that the primary objectives of MRI are met. The visual evaluation of quality depends on the operator/reviewer and might change among operators, as well as for the same operator at various times. Overcoming these constraints is therefore essential for a more impartial evaluation of quality, which makes quantitative estimation of image quality (IQ) metrics very important for MRI quality control. To address this problem, we propose a robust, open-source, and automated MRI image quality control tool. We designed and developed an automatic analysis tool measuring MRI IQ metrics: Signal to Noise Ratio (SNR), Signal to Noise Ratio Uniformity (SNRU), Visual Information Fidelity (VIF), Feature Similarity (FSIM), gray level co-occurrence matrix (GLCM), slice thickness accuracy, slice position accuracy, and high-contrast spatial resolution. These metrics provided good accuracy assessment. A standardized quality report is generated that incorporates the metrics that impact diagnostic quality.
Keywords: ACR MRI phantom, MRI image quality metrics, SNRU, VIF, FSIM, GLCM, slice thickness accuracy, slice position accuracy
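The simplest of the listed metrics, SNR, can be sketched from two regions of interest on a phantom image. The ROI placement and synthetic image below are illustrative, and NEMA-style corrections for the Rician background distribution of magnitude MR images are omitted:

```python
import numpy as np

def snr(image, signal_roi, noise_roi):
    """Simple single-image SNR: mean of a signal ROI divided by the
    standard deviation of a background (air) ROI. Corrections for the
    Rician background statistics of magnitude images are omitted here."""
    sig = image[signal_roi].mean()
    noise = image[noise_roi].std()
    return sig / noise

rng = np.random.default_rng(0)
img = rng.normal(5.0, 1.0, (64, 64))        # background noise, sigma = 1
img[16:48, 16:48] += 100.0                  # bright phantom region
sig_mask = np.zeros_like(img, dtype=bool); sig_mask[24:40, 24:40] = True
bg_mask = np.zeros_like(img, dtype=bool);  bg_mask[:8, :8] = True
print(f"SNR = {snr(img, sig_mask, bg_mask):.1f}")
```

The other metrics in the list follow the same ROI-based pattern (SNRU compares SNR across positions; GLCM texture features are computed over a phantom region), which is what makes the tool automatable.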
Procedia PDF Downloads 168
5330 The Relationship between Anthropometric Obesity Indices and Insulin in Children with Metabolic Syndrome
Authors: Mustafa M. Donma, Orkide Donma
Abstract:
The number of indices developed for the evaluation of obesity, both in adults and in the pediatric population, is ever increasing. These indices are also used in cases with metabolic syndrome (MetS), mostly the ultimate form of morbid obesity. Aside from anthropometric measurements, formulas constituted using these parameters also find clinical use. These formulas can be listed in two groups: weight-dependent and weight-independent. Some are extremely sophisticated equations whose clinical utility is questionable in routine practice. The aim of this study is to compare the presently available obesity indices and find the most practical one. Their associations with MetS components were also investigated to determine their capacity in the differential diagnosis of morbid obesity with and without MetS. Children with normal body mass index (N-BMI) and morbid obesity were recruited for this study. Three groups were constituted. Age- and sex-dependent BMI percentiles for morbidly obese (MO) children were above 99 according to World Health Organization tables. Of them, those with MetS findings were evaluated as the MetS group. Children whose percentile values were between 15 and 85 were included in the N-BMI group. The study protocol was approved by the Ethics Committee of the Institution. Parents filled out informed consent forms to participate in the study. Anthropometric measurements and blood pressure values were recorded. Body mass index, hip index (HI), conicity index (CI), triponderal mass index (TPMI), body adiposity index (BAI), a body shape index (ABSI), body roundness index (BRI), abdominal volume index (AVI), waist-to-hip ratio (WHR), and waist circumference plus hip circumference over two ((WC+HC)/2) were the formulas examined within the scope of this study. Routine biochemical tests including fasting blood glucose (FBG), insulin (INS), triglycerides (TRG), and high density lipoprotein-cholesterol (HDL-C) were performed. The statistical package program SPSS was used for the evaluation of study data.
p<0.05 was accepted as the degree of statistical significance. Hip index did not differ among the groups. A statistically significant difference was noted between the N-BMI and MetS groups in terms of ABSI. All the other indices were capable of discriminating between the N-BMI-MO, N-BMI-MetS and MO-MetS groups. No correlation was found between FBG and any obesity index in any group. The same was true for INS in the N-BMI group. Insulin was correlated with BAI, TPMI, CI, BRI, AVI and (WC+HC)/2 in the MO group without MetS findings. In the MetS group, the only index correlated with INS was (WC+HC)/2. These findings point out that complicated formulas may not be required for evaluating the alterations among the N-BMI and various obesity groups, including MetS. The simple, easily computable, weight-independent index (WC+HC)/2 was unique because it was the only index that exhibited a valuable association with INS in the MetS group, and it did not correlate with the other obesity indices that were associated with INS in the MO group. It was concluded that (WC+HC)/2 is a practical and valuable index for discriminating MO children with and without MetS findings. Keywords: children, insulin, metabolic syndrome, obesity indices
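Several of the indices named above are simple closed-form expressions of anthropometric measurements. A minimal sketch (using Valdez's form of the conicity index, which the abstract does not spell out, and hypothetical example values):

```python
from math import sqrt

def bmi(weight_kg, height_m):
    # Body mass index: weight / height^2
    return weight_kg / height_m ** 2

def whr(wc_cm, hc_cm):
    # Waist-to-hip ratio
    return wc_cm / hc_cm

def wc_hc_mean(wc_cm, hc_cm):
    # The weight-independent (WC+HC)/2 index highlighted by the study
    return (wc_cm + hc_cm) / 2

def conicity_index(wc_m, weight_kg, height_m):
    # Valdez conicity index: WC / (0.109 * sqrt(weight / height))
    # (assumed form; the abstract does not state its CI formula)
    return wc_m / (0.109 * sqrt(weight_kg / height_m))

# Hypothetical child: 30 kg, 1.30 m, WC 60 cm, HC 70 cm
print(round(bmi(30, 1.30), 1))                    # -> 17.8
print(round(whr(60, 70), 2))                      # -> 0.86
print(wc_hc_mean(60, 70))                         # -> 65.0
print(round(conicity_index(0.60, 30, 1.30), 2))   # -> 1.15
```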
Procedia PDF Downloads 75
5329 Value Proposition and Value Creation in Network Environments: An Experimental Study of Academic Productivity via the Application of Bibliometrics
Authors: R. Oleko, A. Saraceni
Abstract:
The aim of this research is to provide a rigorous evaluation of the existing academic productivity in relation to value proposition and creation in networked environments. Bibliometrics is a rigorous approach for structuring existing literature in an objective and reliable manner; to that aim, a thorough bibliometric analysis was performed to assess the large volume of information encountered. A clear distinction between networks and service networks was considered indispensable in order to capture the effects of each network type's properties on value creation processes. Via the use of bibliometric parameters, this review captured the state of the art in value proposition and value creation, respectively. The results provide a rigorous assessment of the annual scientific production, the most influential journals, and the leading corresponding-author countries. By means of citation analysis, the most frequently cited manuscripts and countries for each network type were identified. Moreover, by means of co-citation analysis, existing collaborative patterns were detected through the creation of reference co-citation networks and country collaboration networks. Co-word analysis was also performed in order to provide an overview of the conceptual structure of both networks and service networks. The acquired results provide a rigorous and systematic assessment of the existing scientific output in networked settings. As such, they contribute to a better understanding of the distinct impact of service networks on value proposition and value creation when compared to regular networks.
The implications derived can serve as a guide for informed decision-making by practitioners during network formation and provide a structured evaluation that can stand as a basis for future research in the field. Keywords: bibliometrics, co-citation analysis, networks, service networks, value creation, value proposition
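Co-citation analysis of the kind described counts how often two works appear together in the same reference list. A minimal stdlib sketch with illustrative, made-up reference lists:

```python
from itertools import combinations
from collections import Counter

# Each citing paper contributes one reference list; two works are
# co-cited whenever they appear together in the same list.
reference_lists = [
    ["A", "B", "C"],
    ["A", "B"],
    ["B", "C"],
]

co_citations = Counter()
for refs in reference_lists:
    # Sorted, de-duplicated pairs so ("A", "B") and ("B", "A") count as one
    for pair in combinations(sorted(set(refs)), 2):
        co_citations[pair] += 1

print(co_citations[("A", "B")])  # -> 2 (A and B are co-cited in 2 lists)
```

The resulting pair counts are the edge weights of a reference co-citation network, which can then be visualized or clustered.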
Procedia PDF Downloads 203
5328 Review of Strategies for Hybrid Energy Storage Management System in Electric Vehicle Application
Authors: Kayode A. Olaniyi, Adeola A. Ogunleye, Tola M. Osifeko
Abstract:
Electric Vehicles (EVs) appear to be gaining increasing patronage as a feasible alternative to Internal Combustion Engine Vehicles (ICEVs), owing to their low emissions and high operating efficiency. EV energy storage systems are required to handle high energy and power density within the constraints of limited space, operating temperature, weight and cost. The choice of strategies for energy storage evaluation, monitoring and control remains a challenging task. This paper presents a review of various energy storage technologies and recent research in battery evaluation techniques used in EV applications. It also underscores strategies for hybrid energy storage management and control schemes for improving EV stability and reliability. The study reveals that, despite the advances recorded in battery technologies, there is still no cell that possesses both the optimum power and energy densities, among other requirements, for EV application. However, combining two or more energy storages into a hybrid, allowing the advantageous attributes of each device to be utilized, is a promising solution. The review also reveals that State of Charge (SoC) is the most crucial quantity in battery state estimation. The conventional method of SoC measurement is, however, questioned in the literature, and adaptive algorithms that include models of disturbances are being proposed. The review further suggests that a heuristic-based approach is commonly adopted in developing strategies for hybrid energy storage system management. The alternative, optimization-based approach is found to be more accurate but is memory- and computationally intensive and as such is not recommended in most real-time applications. Keywords: battery state estimation, hybrid electric vehicle, hybrid energy storage, state of charge, state of health
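The conventional SoC measurement that the review questions is typically Coulomb counting, i.e. integrating the measured current over time; its weakness is that sensor bias and an unknown initial SoC accumulate unchecked, which is what the adaptive (observer-based) estimators address. A minimal sketch under assumed pack parameters (the review itself gives no numbers):

```python
def soc_coulomb_counting(soc0, currents_a, dt_s, capacity_ah):
    """Update state of charge by integrating current over time.

    Positive current = discharge. Illustrative only: real estimators
    add correction terms for sensor bias and initial-SoC error.
    """
    soc = soc0
    for i in currents_a:
        soc -= (i * dt_s / 3600.0) / capacity_ah
    return soc

# Assumed 50 Ah pack, starting at 80% SoC, discharged at 10 A for 1 hour
# (3600 steps of 1 s): removes 10 Ah = 20% of capacity.
print(round(soc_coulomb_counting(0.8, [10.0] * 3600, 1.0, 50.0), 3))  # -> 0.6
```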
Procedia PDF Downloads 240
5327 Benchmarking Machine Learning Approaches for Forecasting Hotel Revenue
Authors: Rachel Y. Zhang, Christopher K. Anderson
Abstract:
A critical aspect of revenue management is a firm’s ability to predict demand as a function of price. Historically, hotels have used simple time series models (regression and/or pick-up based models) owing to the complexity of trying to build causal models of demand. Machine learning approaches are slowly attracting attention owing to their flexibility in modeling relationships. This study provides an overview of approaches to forecasting hospitality demand, focusing on the opportunities created by machine learning approaches, including k-nearest-neighbors, support vector machine, regression tree, and artificial neural network algorithms. The out-of-sample performance of the above approaches to forecasting hotel demand is illustrated using a proprietary sample of market-level (24 properties) transactional data for Las Vegas, NV. Causal predictive models can be built and evaluated owing to the availability of market-level (versus firm-level) data. This research also compares and contrasts the accuracy of firm-level models (i.e., predictive models for hotel A using only hotel A’s data) with models using market-level data (prices, review scores, location, chain scale, etc. for all hotels within the market). The resulting models will be valuable for predicting hotel revenue from the basic characteristics of a property, or can be applied in performance evaluation for an existing hotel. The findings will unveil the features that play key roles in a hotel’s revenue performance, which would have considerable potential usefulness in both revenue prediction and evaluation. Keywords: hotel revenue, k-nearest-neighbors, machine learning, neural network, prediction model, regression tree, support vector machine
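As an illustration of the simplest of the listed learners, a k-nearest-neighbors demand forecast can be written in a few lines. The data below are made up for the example, not the proprietary Las Vegas sample:

```python
def knn_forecast(history, query_price, k=3):
    """Predict demand at a given price from (price, demand) history
    using k nearest neighbours on price (illustrative, stdlib only)."""
    neighbours = sorted(history, key=lambda pd: abs(pd[0] - query_price))[:k]
    return sum(demand for _, demand in neighbours) / k

# Hypothetical nightly (price, rooms sold) observations
history = [(100, 90), (110, 85), (120, 80), (130, 70), (140, 60)]
print(knn_forecast(history, 115))  # -> 85.0, mean demand of the 3 nearest prices
```

In practice one would use a library implementation (e.g. a KNN regressor over multiple features such as price, review score and location) rather than this single-feature toy.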
Procedia PDF Downloads 131
5326 Economic Evaluation of Varying Scenarios to Fulfill the Regional Electricity Demand in Pakistan
Authors: Muhammad Shahid, Kafait Ullah, Kashif Imran, Arshad Mahmood, Maarten Arentsen
Abstract:
Poor planning and governance in the power sector of Pakistan have generated several issues, ranging from gradual reliance on a thermal-based, expensive energy mix, supply shortages, unrestricted demand, subsidization and inefficiencies at different levels of the value chain to, ultimately, circular debt. This situation in the power sector has also hampered the growth of allied economic sectors. This study uses the Long-range Energy Alternatives Planning (LEAP) system for electricity modelling of Pakistan over the period 2016 to 2040. The study has, for the first time in Pakistan, forecasted electricity demand at the provincial level. On the supply side, five scenarios, Business as Usual Scenario (BAUS), Coal Scenario (CS), Gas Scenario (GS), Nuclear Scenario (NS) and Renewable Scenario (RS), have been analyzed based on techno-economic and environmental parameters. The study has also included environmental externality costs in order to evaluate the actual costs and benefits of the different scenarios. Contrary to expectations, RS has a lower output than even BAUS; nevertheless, the study concluded that generation under RS costs five times less than under BAUS, CS and GS. NS can also be an alternative for the sustainable future of Pakistan. Generation from imported coal is not a good option; however, indigenous coal with clean coal technologies should be promoted. This paper proposes that the country's energy planners devise incentives for the utilization of indigenous energy resources, prioritizing renewables and then clean coal, to reduce the energy crises of Pakistan. Keywords: economic evaluation, externality cost, penetration of renewable energy, regional electricity supply-demand planning
Procedia PDF Downloads 115
5325 A Handheld Light Meter Device for Methamphetamine Detection in Oral Fluid
Authors: Anindita Sen
Abstract:
Oral fluid is a promising diagnostic matrix for drugs of abuse compared to urine and serum. Detection of methamphetamine in oral fluid would pave the way for the easy evaluation of impairment in drivers during roadside drug testing, as well as ensure safe working environments by facilitating the evaluation of impairment in employees at workplaces. A membrane-based, point-of-care (POC) friendly pre-treatment technique has been developed which eliminates the interference caused by salivary proteins and facilitated the demonstration of methamphetamine detection in saliva using a gold nanoparticle based colorimetric aptasensor platform. It was found that the colorimetric response in saliva was always suppressed owing to matrix effects. By navigating these challenging interference issues, we were able to detect methamphetamine at nanomolar levels in saliva, offering immense promise for the translation of these platforms into on-site diagnostic systems. This subsequently motivated the development of a handheld portable light meter device that can reliably transduce the aptasensor's colorimetric response into absorbance, facilitating quantitative on-site detection of analyte concentrations. This is crucial given the prevalent unreliability and sensitivity problems of conventional drug testing kits. The fabricated light meter device's response was validated against a standard UV-Vis spectrometer to confirm reliability. The portable and cost-effective handheld detector features sensitivity comparable to the well-established benchtop UV-Vis instrument, and the easy-to-use device could serve as a prototype for a commercial device in the future. Keywords: aptasensors, colorimetric gold nanoparticle assay, point-of-care, oral fluid
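Transducing a light-meter reading into absorbance follows the Beer-Lambert definition A = -log10(I/I0). A minimal sketch with hypothetical intensities (the abstract does not disclose the device's actual firmware or calibration):

```python
from math import log10

def absorbance(i_sample, i_reference):
    """Absorbance from transmitted light intensities: A = -log10(I/I0).
    i_sample is the intensity through the assay, i_reference through a blank."""
    return -log10(i_sample / i_reference)

# If the colorimetric assay transmits 50% of the reference intensity:
print(round(absorbance(500, 1000), 3))  # -> 0.301
```

Absorbance, unlike a raw intensity reading, is linear in analyte concentration over the assay's working range, which is what makes the quantitative comparison against a UV-Vis spectrometer meaningful.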
Procedia PDF Downloads 57
5324 Simulation-Based Evaluation of Indoor Air Quality and Comfort Control in Non-Residential Buildings
Authors: Torsten Schwan, Rene Unger
Abstract:
Simulation of thermal and electrical building performance is increasingly becoming part of an integrative planning process. Rising requirements on energy efficiency, the integration of volatile renewable energy, and smart control and storage management often pose tremendous challenges for building engineers and architects. This mainly affects commercial or non-residential buildings, whose energy consumption characteristics differ significantly from those of residential buildings. This work focuses on the many-objective optimization problem of indoor air quality and comfort, especially in non-residential buildings. Based on a brief description of the intermediate dependencies between different requirements on indoor air treatment, it extends existing Modelica-based building physics models with additional system states to adequately represent indoor air conditions. Interfaces to corresponding HVAC (heating, ventilation, and air conditioning) system and control models enable closed-loop analyses of occupants' requirements as well as energy efficiency and profitability aspects. A complex application scenario of a nearly-zero-energy school building shows the advantages of the presented evaluation process for engineers and architects. Clear identification of air quality requirements in individual rooms, together with a realistic model-based description of occupants' behavior, helps to optimize the HVAC system already in early design stages. Building planning processes can be substantially improved and accelerated by increasing the integration of advanced simulation methods, which provide suitable answers to engineers' and architects' questions regarding an ever broader and more complex variety of suitable energy supply solutions. Keywords: indoor air quality, dynamic simulation, energy efficient control, non-residential buildings
Procedia PDF Downloads 231
5323 The Potential in the Use of Building Information Modelling and Life-Cycle Assessment for Retrofitting Buildings: A Study Based on Interviews with Experts in Both Fields
Authors: Alex Gonzalez Caceres, Jan Karlshøj, Tor Arvid Vik
Abstract:
The life cycle of residential buildings is expected to span several decades, and 40% of European residential buildings have inefficient energy conservation measures. Existing buildings represent 20-40% of energy use and CO₂ emissions. Since net-zero-energy buildings are a short-term goal (to be achieved by EU countries after 2020), it is necessary to plan the next logical step, which is to prepare the existing outdated stock of buildings for retrofitting into energy-efficient buildings. In order to accomplish this, two specialized and widespread tools can be used: Building Information Modelling (BIM) and life-cycle assessment (LCA). BIM and LCA are tools used by a variety of disciplines; both are able to represent and analyze constructions at different stages. The combination of these technologies could greatly improve retrofitting techniques by incorporating the carbon footprint and introducing a single database source for the analysis of different materials. To this is added the possibility of considering different analysis approaches, such as costs and energy savings. These measures are expected to enrich decision-making. The methodology is based on two main activities. The first task involved the collection of data, accomplished through a literature review and interviews with experts in the retrofitting field and BIM technologies. The results of this task are presented as an evaluation checklist of BIM's ability to manage data and improve decision-making in retrofitting projects. The last activity involves an evaluation, using the results of the previous tasks, of how far the IFC format can support the requirements of each specialist and of its use by third-party software.
The result indicates that BIM/LCA has great potential to improve the retrofitting process in existing buildings, but some modifications must be made in order to meet the requirements of the specialists for both retrofitting and LCA evaluation. Keywords: retrofitting, BIM, LCA, energy efficiency
Procedia PDF Downloads 216
5322 Investigating the Behaviour of Composite Floors (Steel Beams and Concrete Slabs) under Man's Rhythmical Movement
Authors: M. Ali Lotfollahi Yaghin, M. Reza Bagerzadeh Karimi, Ali Rahmani, V. Sadeghi Balkanlou
Abstract:
Structural engineers have long been trying to develop solutions that use the full potential of their constituent materials. Therefore, there is no doubt that progress in structural solutions is directly related to an increase in materials science knowledge. These efforts, in conjunction with up-to-date modern construction techniques, have led to the extensive use of composite floors in large-span structures. On the other hand, the competitive trends of the world market have long been forcing structural engineers to develop minimum-weight and minimum-labour-cost solutions. A direct consequence of this new design trend is a considerable increase in problems related to unwanted floor vibrations. For this reason, structural floor systems become vulnerable to excessive vibrations produced by impacts such as human rhythmic activities. The main objective of this paper is to present an analysis methodology for the evaluation of human comfort on composite floors. This procedure takes into account a more realistic loading model developed to incorporate the dynamic effects induced by human walking. The investigated structural models were based on various composite floors, with main spans varying from 5 to 10 m. Based on an extensive parametric study, the dynamic response of the composite floors, in terms of peak accelerations, was obtained and compared to the limiting values proposed by several authors and design standards. This strategy was adopted to provide a more realistic evaluation for this type of structure when subjected to vibration due to human walking. Keywords: vibration, resonance, composite floors, people's rhythmic movement, dynamic analysis, Abaqus software
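A common first screening step in such a vibration evaluation is comparing the floor's fundamental frequency against the roughly 1.5-3 Hz pacing range of human rhythmic activity, before any detailed peak-acceleration analysis. A sketch for an idealized simply supported beam, with assumed section properties that are not taken from the paper:

```python
from math import pi, sqrt

def fundamental_frequency_hz(E, I, mass_per_length, span):
    """Fundamental bending frequency of a simply supported beam:
    f1 = (pi / (2 L^2)) * sqrt(E I / m_bar).
    E in Pa, I in m^4, mass_per_length in kg/m, span in m."""
    return (pi / (2 * span ** 2)) * sqrt(E * I / mass_per_length)

# Hypothetical composite floor beam: E = 210 GPa, I = 2e-4 m^4,
# distributed mass 800 kg/m, 10 m main span.
f1 = fundamental_frequency_hz(210e9, 2e-4, 800.0, 10.0)
print(round(f1, 2))  # -> 3.6 (Hz), uncomfortably close to rhythmic-activity harmonics
```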
Procedia PDF Downloads 300
5321 Northern Ghana’s Sustainable Food Systems: Evaluating the Impact of International Development
Authors: Maxwell Ladogo Abilla
Abstract:
As evidence from the 2007-2008 and 2010 global food and financial crises revealed that food systems were under stress, the idea of sustainable food systems rose to prominence in the discussion of food security. The idea suggests moving away from a conception of food security that emphasizes production, in favor of one that is more socially and environmentally conscious and interested in tackling the wide range of issues that have rendered the food system dysfunctional. Given the persistence of poverty and food insecurity in northern Ghana, this study evaluates the efforts made by international development organizations to increase food security in the area, utilizing the idea of sustainable food systems as the evaluation criterion. The study used triangulation to address the research questions by combining qualitative interview data with documentary analysis. To better comprehend the concept of sustainability, a variety of discourses and concepts are drawn upon, resulting in eight practical objectives for attaining sustainable food systems. With these practical objectives serving as the assessment criteria, the study finds that the food system in northern Ghana is unsustainable because of three kinds of barriers: natural, cultural and economic, and institutional. According to an evaluation of the World Food Programme's development support in northern Ghana, regional challenges to attaining sustainable food systems continue to be unaddressed by global development initiatives, and, due to institutional constraints, WFP's interventions fell short of their promise.
By demonstrating the need for development partners to enhance institutional efficiency and coordination, enable marginalized communities to access their rights, and prioritize agricultural irrigation in the area, the study makes a contribution to development policy and practice in northern Ghana. Keywords: sustainable, food security, development, institutional
Procedia PDF Downloads 89
5320 Morphometric Parameters and Evaluation of Persian Fallow Deer Semen in Dashenaz Refuge in Iran
Authors: Behrang Ekrami, Amin Tamadon
Abstract:
The Persian fallow deer (Dama dama mesopotamica) belongs to the family Cervidae and is only found in a few protected areas in the northwest, north, and southwest of Iran. The aims of this study were to analyze inbreeding and the morphometric parameters of semen in male Persian fallow deer, in order to investigate the cause of the reduced fertility of this endangered species in Dasht-e-Naz National Refuge, Sari, Iran. Semen was collected randomly from four adult bucks during the breeding and non-breeding seasons, from five dehorned and horned deer, by artificial vagina. Twelve blood samples were taken from Persian fallow deer; mitochondrial DNA was extracted, amplified and sequenced, and then considered for genetic analysis. Persian fallow deer semen, both with normal and abnormal spermatozoa, is similar to that of domestic ruminants, but the spermatozoa are much smaller and difficult to observe at primary observation. The ejaculates collected after the mating season contained abnormal spermatozoa, debris and accessory gland secretions in horned bucks, and accessory gland secretions free of any spermatozoa in dehorned or early velvet-budding bucks. Microscopic evaluation in all four bucks during the mating season showed a mean concentration of 9×10⁶ spermatozoa/ml. The mean ± SD of age, testis length and testis width was 4.60 ± 1.52 years, 3.58 ± 0.32 cm and 1.86 ± 0.09 cm, respectively. The results identified 1120 loci (assuming each nucleotide as a locus), of which 377 were polymorphic. In conclusion, the reduced fertility of male Persian fallow deer may be caused by inbreeding of the protected herd in the limited area of Dasht-e-Naz National Refuge. Keywords: Persian fallow deer, spermatozoa, reproductive characteristics, morphometric parameters
Procedia PDF Downloads 575
5319 Parsonage Turner Syndrome PTS, Case Report
Authors: A. M. Bumbea, A. Musetescu, P. Ciurea, A. Bighea
Abstract:
Objectives: The authors present a case of Parsonage Turner syndrome, a rare disease characterized by sudden onset, in an apparently healthy person, of shoulder and/or arm pain, sensory deficit and motor deficit. The causes are not established; it may be triggered by vaccination, surgery, immunologic disease, trauma, etc. Methods: The authors present the case of a 32-year-old woman (in 2006) with no medical history, presenting with arm pain and no other symptom. The onset was sudden, with pain at a very high level, quantified as 10 on a 0-10 scale, with no response to classical analgesics and corticoids. The only drugs that could reduce the intensity of the pain were oxycodone hydrochloride 60 mg daily and pregabalin 150 mg daily. After two weeks the intensity of the pain was reduced to 5, and the patient started a rehabilitation program. After 6 weeks the patient developed associated sensory and motor deficits. Electromyography of the upper limb showed incomplete denervation with reduced nerve conduction velocity. The patient received neurotrophic drugs and painkillers for a long period, together with physical and kinetic therapy. After 6 months the pain was reduced to level 2, and the patient maintained only pregabalin 150 mg for another 6 months. The subsequent evaluation showed no pain but general amyotrophy of the upper limb. Results: At the evaluation in 2009, the patient had developed a rheumatoid syndrome with tender and swollen joints, but no positive inflammation tests, antibodies or rheumatoid factor. Two years later, in 2011, the patient developed an increase in antinuclear antibodies. This context certified the diagnosis of lupus, and the patient received the specific therapy. Conclusions: This is not a typical case of lupus onset with PTS, but the onset of PTS could include the onset of an immune disease. Keywords: lupus, arm pain, patient, swelling
Procedia PDF Downloads 329
5318 Evaluation of Green Infrastructure with Different Woody Plants Practice and Benefit Using the Stormwater Management-HYDRUS Model
Authors: Bei Zhang, Zhaoxin Zhang, Lidong Zhao
Abstract:
Green infrastructures (GIs) for rainwater management can directly serve the multiple purposes of urban greening and non-point-source pollution control. To reveal the overall layout principles of GIs dominated by typical woody plants, and their impact on urban environmental effects, we constructed a coupled HYDRUS-1D and Storm Water Management Model (SWMM) to simulate the response of urban hydrology to typical planting methods for woody plants with different root systems. The results showed that the coupled model had high adaptability for simulating the urban surface runoff control effect under different woody plant planting methods (NSE ≥ 0.64 and R² ≥ 0.71). Regarding the regulation of surface runoff, the average runoff reduction rate of the GIs increased from 60% to 71% as the planting area increased from 5% to 25% under the design rainfall event with a 2-year recurrence interval. The reduction for Sophora japonica, with tap roots, was slightly higher than for the no-plant control and for Malus baccata (M. baccata), with fibrous roots. A comprehensive benefit evaluation system for rainwater utilization technology was constructed using an analytic hierarchy process. The coupled model was then used to evaluate the comprehensive environmental, economic and social benefits of woody plants with different planting areas in the study area. The comprehensive benefit value of planting 15% M. baccata was the highest, making it the first choice for the planting of woody plants in the study area. This study can provide a scientific basis for decision-making on the layout of green facilities with woody plants. Keywords: green infrastructure, comprehensive benefits, runoff regulation, woody plant layout, coupling model
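The runoff reduction rate reported above is simply the fraction of inflow retained by the GIs. A trivial sketch with hypothetical event volumes chosen to reproduce the 71% figure:

```python
def runoff_reduction_rate(inflow_mm, outflow_mm):
    """Fraction of event runoff retained by the green infrastructure:
    (inflow - outflow) / inflow. Values are illustrative, not the study's."""
    return (inflow_mm - outflow_mm) / inflow_mm

# Hypothetical event: 40 mm of runoff without GIs, 11.6 mm with 25% planting
print(round(100 * runoff_reduction_rate(40.0, 11.6)))  # -> 71 (%)
```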
Procedia PDF Downloads 68
5317 Educational Innovation through Coaching and Mentoring in Thailand: A Mixed Method Evaluation of the Training Outcomes
Authors: Kanu Priya Mohan
Abstract:
Innovation in education is one of the essential pathways to achieving both educational and development goals in today’s dynamically changing world. Over the last decade, coaching and mentoring have been applied in the field of education as positive intervention techniques for fostering teaching and learning reforms in developed countries. The context of this research was Thailand’s educational reform process, wherein a project on coaching and mentoring (C&M) was launched in 2014. The C&M project endeavored to support the professional development of school teachers in the various provinces of Thailand and to enable them to apply C&M to teach innovative instructional techniques. This research aimed to empirically investigate the learning outcomes for the master trainers, who were trained in coaching and mentoring as the first step in the process of training the school teachers. A mixed-method study was used to evaluate the learning outcomes of the training in terms of cognitive-behavioral-affective dimensions. In the first part of the research, a quantitative design was used to evaluate the effects of learner characteristics and instructional techniques on the learning outcomes. In the second phase, a qualitative method of in-depth interviews was used to uncover details about the training outcomes, as well as the perceived barriers and enablers of the training process. Despite sample-size constraints, these exploratory results, integrated from both methods, indicated the significance of evaluating training outcomes along the three dimensions, and the perceived role of other factors in the training. Findings are discussed in terms of their implications for C&M training and its impact in fostering positive education through innovative educational techniques in developing countries. Keywords: cognitive-behavioral-affective learning outcomes, mixed method research, teachers in Thailand, training evaluation
Procedia PDF Downloads 272
5316 Evaluation Method for Fouling Risk Using Quartz Crystal Microbalance
Authors: Natsuki Kishizawa, Keiko Nakano, Hussam Organji, Amer Shaiban, Mohammad Albeirutty
Abstract:
One of the most important tasks in operating desalination plants that use the reverse osmosis (RO) method is preventing RO membrane fouling caused by foulants found in seawater. Optimal design of the pre-treatment process enables the reduction of foulants; therefore, a quantitative evaluation of the fouling risk of the pre-treated water fed to the RO stage is required for optimal design. Some water quality measurements, such as the silt density index (SDI) and total organic carbon (TOC), have conventionally been applied for such evaluations. However, these methods have not been effective in some situations for evaluating the fouling risk of RO feed water. Furthermore, stable management of plants would be possible through alerts and appropriate control of the pre-treatment process if such a method could be applied in an inline monitoring system for the fouling risk of RO feed water. The purpose of this study is to develop a method to evaluate the fouling risk of RO feed water. We applied a quartz crystal microbalance (QCM) to measure the amount of foulants found in seawater, using a sensor whose surface is coated with a polyamide thin film, the main material of an RO membrane. The increase in the weight of the sensor after the sample water has passed over it for a certain length of time directly indicates the fouling risk of the sample. We classified these values as "FP: Fouling Potential". The method is characterized by its ability to measure the very small amount of substances in seawater in a short time (< 2 h) and from a small volume of sample water (< 50 mL). Using several RO cell filtration units in a laboratory-scale test, the FP from the method was confirmed to correlate more strongly with the pressure increase caused by RO fouling than SDI and TOC did.
Then, to establish this correlation in an actual bench-scale RO membrane module, and to confirm the feasibility of the monitoring system as a control tool for the pre-treatment process, we started a long-term test at an experimental desalination site by the Red Sea in Jeddah, Kingdom of Saudi Arabia. Implementing inline equipment for the method made it possible to measure FP intermittently (4 times per day) and automatically. Moreover, over two 3-month operations, the RO operating pressure was compared among feed water samples of different qualities. A pressure increase through the RO membrane module was observed in the high-FP RO unit, whose feed water was treated by a cartridge filter only. In contrast, no pressure increase was observed during the operation of the low-FP RO unit, whose feed water was treated by an ultrafilter. The correlation was therefore established in an actual-scale RO membrane over two runs with two types of feed water. The results suggest that the FP method enables the evaluation of the fouling risk of RO feed water. Keywords: fouling, monitoring, QCM, water quality
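A QCM infers adsorbed mass from the shift in its resonance frequency, commonly via the Sauerbrey relation. The sensitivity constant below assumes a 5 MHz AT-cut crystal, which the abstract does not specify, so treat the numbers as purely illustrative:

```python
def sauerbrey_mass_ng_per_cm2(delta_f_hz, c_f=17.7):
    """Areal mass uptake from a QCM frequency shift via the Sauerbrey
    relation, delta_m = -C_f * delta_f.

    c_f ~= 17.7 ng/(cm^2*Hz) for a 5 MHz AT-cut quartz crystal (assumed;
    valid for thin, rigid films, an approximation for soft foulant layers).
    """
    return -c_f * delta_f_hz

# A 10 Hz drop in resonance frequency after exposure to the sample
# corresponds to roughly 177 ng/cm^2 of adsorbed foulant.
print(sauerbrey_mass_ng_per_cm2(-10.0))
```

The accumulated mass over a fixed exposure time is what an FP-style index would report: the larger the frequency drop per unit time, the higher the fouling potential of the feed water.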
Procedia PDF Downloads 211