Search results for: diagnostic tool
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1858

1348 Correction of Frequent English Writing Errors by Using Coded Indirect Corrective Feedback and Error Treatment: The Case of Reading and Writing English for Academic Purposes II

Authors: Chaiwat Tantarangsee

Abstract:

The purposes of this study are 1) to study the frequent English writing errors of students registered in the course Reading and Writing English for Academic Purposes II, and 2) to find out the results of writing error correction by using coded indirect corrective feedback and writing error treatments. The sample comprised 28 second-year English major students of the Faculty of Education, Suan Sunandha Rajabhat University. The tool for the experimental study was the lesson plan of the course Reading and Writing English for Academic Purposes II, and the tool for data collection was a set of 4 short-text writing tests. The research findings disclose that the frequent English writing errors found in this course comprise 7 types of grammatical errors, namely sentence fragments, subject-verb agreement, wrong form of verb tense, singular or plural noun endings, run-on sentences, wrong form of verb pattern, and lack of parallel structure. Moreover, it is found that writing error correction by means of coded indirect corrective feedback and error treatment reveals an overall reduction of the frequent English writing errors and an increase in students’ achievement in the writing of short texts, significant at the .05 level.

Keywords: Coded indirect corrective feedback, error correction, error treatment.

1347 Arterial Stiffness Detection Depending on Neural Network Classification of the Multi-Input Parameters

Authors: Firas Salih, Luban Hameed, Afaf Kamil, Armin Bolz

Abstract:

Diagnosis and detection of arterial stiffness are very important, as arterial stiffness indicates an associated increased risk of cardiovascular disease. To provide a cheap and easy general screening technique that helps avoid future cardiovascular complications due to rising arterial stiffness, an algorithm based on the photoplethysmogram is proposed. The photoplethysmograph signals are processed in MATLAB: each signal is filtered, the baseline wandering is removed, peaks and valleys are detected, and the signal is normalized. The area under the catacrotic phase of the photoplethysmogram pulse curve is calculated using the trapezoidal rule and then used, in cooperation with other parameters such as age, height, and blood pressure, as input to a neural network for arterial stiffness detection. On the patient data, the neural network achieved a sensitivity of 80%, an accuracy of 85%, and a specificity of 90%. It is concluded that a neural network can detect arterial stiffness from these risk factor parameters.
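
A minimal sketch of the area computation described above, applying the trapezoidal rule to the descending (catacrotic) portion of a single pulse with NumPy; the synthetic pulse, the sampling rate, and the assumption that the catacrotic phase begins at the systolic peak are illustrative stand-ins, not the authors' actual MATLAB pipeline:

```python
import numpy as np

def catacrotic_area(pulse, fs):
    """Area under the catacrotic (descending) phase of one normalized
    PPG pulse, via the trapezoidal rule. `pulse` is a 1-D array holding
    a single pulse; `fs` is the sampling rate in Hz."""
    peak = int(np.argmax(pulse))        # systolic peak = assumed start of catacrotic phase
    t = np.arange(len(pulse)) / fs      # time axis in seconds
    return np.trapz(pulse[peak:], t[peak:])

# Hypothetical usage on a synthetic, already-normalized pulse at 100 Hz
fs = 100
t = np.linspace(0, 1, fs)
pulse = np.sin(np.pi * t) ** 2          # stand-in for a filtered, normalized PPG pulse
print(catacrotic_area(pulse, fs))
```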

Keywords: Arterial stiffness, area under the catacrotic phase of the photoplethysmograph pulse, neural network

1346 A Case Study on the Value of Corporate Social Responsibility Systems

Authors: José M. Brotons, Manuel E. Sansalvador

Abstract:

The relationship between Corporate Social Responsibility (CSR) and financial performance (FP) is a subject of great interest that has not yet been resolved. In this work, we have developed a new and original tool to measure this relation. The tool quantifies the value contributed to companies that are committed to CSR. The theoretical model used is the fuzzy discounted cash flow method. Two assumptions have been considered: first, that the company has implemented the IQNet SR10 certification, and second, that it has not. For the first, the growth rate used for the time horizon is the rate maintained by the company after obtaining the IQNet SR10 certificate. For the second, both the company's growth rates prior to the implementation of the certification and the evolution of the sector are taken into account. By using triangular fuzzy numbers, it is possible to deal adequately with each company’s forecasts as well as with the information corresponding to the sector. Once the annual growth rate of sales is obtained, the profit and loss accounts are generated from the estimated annual sales; for the remaining elements of this account, their regression against net sales has been considered. The difference between these two valuations, made in a fuzzy environment, yields the value of the IQNet SR10 certification. Although this study presents an innovative methodology to quantify the relation between CSR and FP, the authors are aware that only one company has been analyzed. This is precisely the main limitation of this study, which in turn opens up an interesting line for future research: to broaden the sample of companies.
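
To make the fuzzy discounted cash flow idea concrete, here is a minimal sketch with triangular fuzzy numbers represented as (low, mode, high) triples; the cash flows, the discount rate, and the interval-style subtraction used to value the certification are illustrative assumptions, not the paper's data:

```python
import numpy as np

# A triangular fuzzy number is a (low, mode, high) triple.
def fuzzy_npv(cash_flows, rate):
    """Discount a list of triangular fuzzy cash flows (years 1..n) at a
    crisp rate; returns a triangular fuzzy NPV, component-wise."""
    npv = np.zeros(3)
    for year, cf in enumerate(cash_flows, start=1):
        npv += np.array(cf) / (1 + rate) ** year
    return npv

def fuzzy_sub(a, b):
    """a - b for triangular fuzzy numbers, with interval-style bounds."""
    return (a[0] - b[2], a[1] - b[1], a[2] - b[0])

# Hypothetical cash flows (thousands of EUR) with and without certification
with_cert    = [(90, 100, 115), (95, 110, 130), (100, 120, 145)]
without_cert = [(85,  95, 105), (88, 100, 115), (90, 105, 122)]
value = fuzzy_sub(fuzzy_npv(with_cert, 0.08), fuzzy_npv(without_cert, 0.08))
print(value)   # fuzzy value attributable to the certification
```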

Keywords: Corporate social responsibility, case study, financial performance, company valuation.

1345 A 3-Year Evaluation Study on Fine Needle Aspiration Cytology and Corresponding Histology

Authors: Amjad Al Shammari, Ashraf Ibrahim, Laila Seada

Abstract:

Background and Objectives: The incidence of thyroid carcinoma has been increasing world-wide. In the present study, we evaluated the diagnostic accuracy of fine needle aspiration (FNA) and its efficiency in the early detection of neoplastic lesions of the thyroid gland over a 3-year period. Methods: Data were retrieved from the pathology files in King Khalid Hospital. For each patient, age, gender, FNA result, site and size of the nodule, and final histopathologic diagnosis were recorded. Results: The study included 490 cases, 419 female and 71 male, a male to female ratio of 1:6. Mean age was 43 years for males and 38 for females. Cases with confirmed histopathology numbered 131. In 101/131 (77.1%), concordance was found between FNA and histology; in 30/131 (22.9%), there was a discrepancy in diagnosis. Total malignant cases were 43, of which 14 (32.5%) were true positives and 29 (67.4%) were false negatives. No false positive cases were found in our series. Conclusion: FNA could diagnose benign nodules in all cases; in malignant cases, however, ultrasound findings have to be taken into consideration to avoid missing a microcarcinoma in the contralateral lobe.

Keywords: FNA, Hail, histopathology, thyroid.

1344 Ontology-Driven Generation of Radiation Protection Procedures

Authors: Chamseddine Barki, Salam Labidi, Hanen Boussi Rahmouni

Abstract:

In this article, we present the principle and a suitable methodology for the design of a medical ontology that captures the radiological and dosimetric knowledge applied in diagnostic radiology and radiation therapy. Our ontology, which we named «Onto.Rap», addresses radiation protection in medical and radiology centers by providing standardized regulatory oversight. Thanks to its added value in knowledge sharing, reuse, and ease of maintenance, this ontology tends to solve many problems, among them the confusion between radiological procedures that a practitioner might face while performing a patient radiological exam, as well as the difficulty of interpreting the applicable patient radioprotection standards. Here the ontology, thanks to its concept simplification and expressiveness capabilities, can ensure an efficient classification of radiological procedures. It also provides an explicit representation of the relations between the different components of the studied concept. In fact, an ontology-based radioprotection expert system, when used in a radiological center, could implement systematic radioprotection best practices during patient exams and a regulatory compliance auditing service afterwards.

Keywords: Ontology, radiology, medicine, knowledge, radiation protection, audit.

1343 An Experimental Design Approach to Determine Effects of the Operating Parameters on the Rate of Ru-Promoted Ir Carbonylation of Methanol

Authors: Vahid Hosseinpour, Mohammad Kazemini, Alireza Mohammadrezaee

Abstract:

Carbonylation of methanol in the homogeneous phase is one of the major routes for the production of acetic acid. Amongst the group VIII metal catalysts used in this process, iridium has displayed the best capabilities. To investigate the effects of operating parameters such as temperature, pressure, and methyl iodide, methyl acetate, iridium, ruthenium, and water concentrations on the reaction rate, an experimental design for this system based upon central composite design (CCD) was utilized. The statistical rate equation developed by this method contained the individual, interaction, and curvature effects of the parameters on the reaction rate. The model, with a p-value less than 0.0001 and R2 values greater than 0.9, confirmed a satisfactory fit between the experimental and theoretical studies. In other words, the developed model and the experimental data obtained passed all diagnostic tests, establishing the model as statistically significant.
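
A small sketch of the approach: a two-factor central composite design in coded units, with the full quadratic model fitted by least squares; the runs and responses below are hypothetical, and the actual study varied seven factors:

```python
import numpy as np

# Two-factor CCD: factorial, axial (alpha = sqrt(2)), and center points,
# all in coded units.
a = np.sqrt(2)
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
              [-a, 0], [a, 0], [0, -a], [0, a],
              [0, 0], [0, 0], [0, 0]])

# Hypothetical measured reaction rates for each run
y = np.array([4.1, 6.3, 5.0, 8.9, 3.8, 7.7, 4.4, 6.8, 6.1, 6.0, 6.2])

# Design matrix for the full quadratic model:
# rate = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
x1, x2 = X[:, 0], X[:, 1]
A = np.column_stack([np.ones(len(y)), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

ss_res = np.sum((y - A @ coef) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
print("coefficients:", coef.round(3), "R^2:", round(1 - ss_res / ss_tot, 3))
```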

Keywords: Acetic Acid, Carbonylation of Methanol, Central Composite Design, Experimental Design, Iridium/Ruthenium

1342 Monte Carlo Analysis and Fuzzy Sets for Uncertainty Propagation in SIS Performance Assessment

Authors: Fares Innal, Yves Dutuit, Mourad Chebila

Abstract:

The object of this work is the probabilistic performance evaluation of safety instrumented systems (SIS), i.e. the average probability of dangerous failure on demand (PFDavg) and the average frequency of failure (PFH), taking into account the uncertainties related to the different parameters that come into play: failure rate (λ), common cause failure proportion (β), diagnostic coverage (DC), etc. This leads to an accurate and safe assessment of the safety integrity level (SIL) inherent to the safety function performed by such systems, in keeping with the requirement of the IEC 61508 standard with respect to handling uncertainty. To do this, we propose an approach that combines (1) Monte Carlo simulation and (2) fuzzy sets. Indeed, the first method is appropriate where representative statistical data are available (using the probability density functions of the related parameters), while the latter applies in cases characterized by vague and subjective information (using membership functions). The proposed approach is fully supported with a suitable computer code.
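
A minimal sketch of the Monte Carlo side of the approach, propagating uncertain λ and DC through the simplified single-channel (1oo1) approximation PFDavg ≈ λ_DU·TI/2; the chosen distributions, proof-test interval, and architecture are assumptions for illustration, not values from the paper or the standard:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Uncertain inputs: dangerous failure rate (per hour) and diagnostic coverage
lam = rng.lognormal(mean=np.log(2e-6), sigma=0.4, size=n)
dc = rng.uniform(0.55, 0.65, size=n)
ti = 8760.0  # proof-test interval in hours (one year)

# Simplified 1oo1 approximation: PFDavg ~= lambda_DU * TI / 2,
# with lambda_DU = lambda_D * (1 - DC)
pfd = lam * (1 - dc) * ti / 2

lo, med, hi = np.percentile(pfd, [5, 50, 95])
print(f"PFDavg 5/50/95 percentiles: {lo:.2e} / {med:.2e} / {hi:.2e}")
print("fraction below 1e-3 (SIL 3 band or better):", np.mean(pfd < 1e-3))
```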

Keywords: Fuzzy sets, Monte Carlo simulation, Safety instrumented system, Safety integrity level.

1341 Fault Diagnosis of Nonlinear Systems Using Dynamic Neural Networks

Authors: E. Sobhani-Tehrani, K. Khorasani, N. Meskin

Abstract:

This paper presents a novel integrated hybrid approach for fault diagnosis (FD) of nonlinear systems. Unlike most FD techniques, the proposed solution simultaneously accomplishes fault detection, isolation, and identification (FDII) within a unified diagnostic module. At the core of this solution is a bank of adaptive neural parameter estimators (NPEs) associated with a set of single-parameter fault models. The NPEs continuously estimate unknown fault parameters (FPs) that are indicators of faults in the system. Two NPE structures, series-parallel and parallel, are developed, each with its exclusive set of desirable attributes. The parallel scheme is extremely robust to measurement noise and possesses a simpler, yet more solid, fault isolation logic; the series-parallel scheme, by contrast, displays short FD delays and is robust to closed-loop system transients due to changes in control commands. Finally, a fault tolerant observer (FTO) is designed to extend the capability of the NPEs to systems with partial-state measurement.

Keywords: Hybrid fault diagnosis, Dynamic neural networks, Nonlinear systems.

1340 Economic Loss due to Ganoderma Disease in Oil Palm

Authors: K. Assis, K. P. Chong, A. S. Idris, C. M. Ho

Abstract:

Oil palm, or Elaeis guineensis, is considered the golden crop in Malaysia, but the oil palm industry in this country is now facing its most devastating disease, Ganoderma basal stem rot. The objective of this paper is to analyze the economic loss due to this disease. Three commercial oil palm sites were selected for collecting the data required for the economic analysis. The yield parameter used to measure the loss was the total weight of fresh fruit bunches over six months. The predictors include disease severity, change in disease severity, number of infected neighboring palms, age of palm, planting generation, topography, and first-order interaction variables. The yield loss estimation model was identified using a backward elimination based regression method. Diagnostic checking was conducted on the residuals of the best yield loss model, and the mean absolute percentage error (MAPE) was used to measure the forecast performance of the model. The best yield loss model was then used to estimate the economic loss by using the current monthly price of fresh fruit bunches at the mill gate.
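
A sketch of backward elimination together with the MAPE check, assuming a p-value-based dropping rule (the abstract does not state its exact criterion); the predictors and records below are synthetic stand-ins for the plantation data:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def backward_eliminate(X, y, alpha=0.05):
    """Iteratively drop the predictor with the highest p-value until all
    remaining predictors are significant at `alpha`."""
    cols = list(X.columns)
    while cols:
        model = sm.OLS(y, sm.add_constant(X[cols])).fit()
        pvals = model.pvalues.drop("const")
        worst = pvals.idxmax()
        if pvals[worst] <= alpha:
            return model, cols
        cols.remove(worst)
    raise ValueError("no significant predictors remain")

def mape(actual, predicted):
    return np.mean(np.abs((actual - predicted) / actual)) * 100

# Hypothetical records: yield vs. disease severity, palm age, pure noise
rng = np.random.default_rng(0)
X = pd.DataFrame({"severity": rng.uniform(0, 4, 60),
                  "age": rng.uniform(5, 25, 60),
                  "noise": rng.normal(size=60)})
y = 800 - 120 * X["severity"] + 5 * X["age"] + rng.normal(0, 30, 60)

model, kept = backward_eliminate(X, y)   # "noise" should be eliminated
pred = model.predict(sm.add_constant(X[kept]))
print(kept, "MAPE:", round(mape(y, pred), 2), "%")
```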

Keywords: Ganoderma, oil palm, regression model, yield loss, economic loss.

1339 Neural Network Models for Actual Cost and Actual Duration Estimation in Construction Projects: Findings from Greece

Authors: Panagiotis Karadimos, Leonidas Anthopoulos

Abstract:

Predicting the actual cost and duration of construction projects remains a persistent problem for the construction sector. This paper addresses this problem with modern methods and data available from past public construction projects: 39 bridge projects constructed in Greece, with a similar type of available data, were examined. Considering each project’s attributes together with the actual cost and the actual duration, correlation analysis was performed and the most appropriate predictive project variables were defined. Additionally, the most efficient subgroup of variables was selected with the use of the WEKA application, through its attribute selection function. The selected variables were then used as input neurons for neural network models, constructed with the FANN Tool application. The optimum neural network model for predicting the actual cost produced a mean squared error of 3.84886e-05 and was based on the budgeted cost and the quantity of deck concrete. The optimum neural network model for predicting the actual duration produced a mean squared error of 5.89463e-05 and was likewise based on the budgeted cost and the quantity of deck concrete.
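
The study built its models with the FANN Tool; the sketch below reproduces the same idea, a small feed-forward network mapping budgeted cost and deck concrete quantity to actual cost, using scikit-learn instead and entirely synthetic stand-in data:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
n = 39  # the study examined 39 bridge projects; these records are synthetic
budget = rng.uniform(0.5, 8.0, n)      # budgeted cost, million EUR (assumed units)
deck = rng.uniform(100, 2500, n)       # deck concrete quantity, m^3 (assumed units)
actual = budget * 1.15 + deck * 1e-4 + rng.normal(0, 0.2, n)

X = np.column_stack([budget, deck])
X_tr, X_te, y_tr, y_te = train_test_split(X, actual, test_size=0.25, random_state=0)

# One small hidden layer; scaling first, as MLPs are scale-sensitive
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                                   random_state=0))
model.fit(X_tr, y_tr)
print("test MSE:", mean_squared_error(y_te, model.predict(X_te)))
```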

Keywords: Actual cost and duration, attribute selection, bridge projects, neural networks, predicting models, FANN TOOL, WEKA.

1338 Info-participation of the Disabled Using the Mixed Preference Data in Improving Their Travel Quality

Authors: Y. Duvarci, S. Mizokami

Abstract:

Today, the preferences and participation of transportation disadvantaged (TD) groups such as the elderly and disabled are still lacking in the decision-making of transportation planning, and their reactions to certain types of policies are not well known; a clear methodology is therefore needed. This study aimed to develop a method to extract the preferences of the disabled for use at the policy-making stage, one that can also guide future estimations. The method combines cluster analysis and data filtering, using data from Arao city (Japan). It proceeds as follows: the TD group is defined with the cluster analysis tool; their travel preferences are tabulated from the household surveys by policy variable-impact pairs, zones, and trip purposes; and the final outcome is the preference probabilities of the disabled. The preferences vary by trip purpose. For work trips, accessibility and transit system quality policies bring the accompanying impacts of a modal shift towards public modes, decreasing travel costs, and an increased trip rate; for social trips, the same accessibility and transit system policies lead to the same mode shift impact, together with the travel quality policy area leading to a trip rate increase. These results indicate which policies to focus on and can be used in scenario generation in models, or for any other planning purpose, as a decision support tool.

Keywords: Transportation Disadvantaged, Disabled, Mixed Preference, Stated Preference Data.

1337 Data Mining for Cancer Management in Egypt Case Study: Childhood Acute Lymphoblastic Leukemia

Authors: Nevine M. Labib, Michael N. Malek

Abstract:

Data mining aims at discovering knowledge out of data and presenting it in a form that is easily comprehensible to humans. One of its useful applications in Egypt is cancer management, especially the management of acute lymphoblastic leukemia (ALL), the most common type of cancer in children. This paper discusses the process of designing a prototype that can help in the management of childhood ALL, which has great significance in the health care field. Besides its social impact on decreasing the rate of infection in children in Egypt, it provides valuable information about the distribution and segmentation of ALL in Egypt, which may be linked to possible risk factors. Undirected knowledge discovery is used since, in the case of this research project, there is no target field, as the data provided are mainly subjective; this is done in order to quantify the subjective variables. Therefore, the computer is asked to identify significant patterns in the provided medical data about ALL. This is achieved by collecting the data necessary for the system, determining the data mining technique to be used, and choosing the most suitable implementation tool for the domain. The research makes use of a data mining tool, Clementine, to apply the decision trees technique, fed with data extracted from real-life cases taken from specialized cancer institutes. Relevant medical case details such as patient medical history and diagnosis are analyzed, classified, and clustered in order to improve disease management.
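
Clementine is a proprietary workbench; the sketch below illustrates the same decision-tree idea with scikit-learn on entirely hypothetical case records (the field names, values, and risk grouping are invented for illustration, not the institutes' data):

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical ALL case records (synthetic stand-ins for institute data)
df = pd.DataFrame({
    "age":       [4, 6, 3, 9, 5, 7, 2, 8],
    "wbc_count": [12.0, 55.0, 8.0, 60.0, 20.0, 48.0, 9.5, 70.0],  # x10^9/L
    "risk":      ["std", "high", "std", "high", "std", "high", "std", "high"],
})
X = df[["age", "wbc_count"]]
y = df["risk"]

# A shallow tree keeps the learned rules human-readable
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))
```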

Keywords: Data Mining, Decision Trees, Knowledge Discovery, Leukemia.

1336 Evaluating Factors Affecting Audiologists’ Diagnostic Performance in Auditory Brainstem Response Reading: Training and Experience

Authors: M. Zaitoun, S. Cumming, A. Purcell

Abstract:

This study aims to determine whether audiologists' experience characteristics in ABR (Auditory Brainstem Response) reading are associated with their performance in interpreting ABR results. Fifteen ABR traces with varying degrees of hearing level were presented twice, making a total of 30. Audiologists were asked to determine the hearing threshold for each of the cases after completing a brief survey regarding their experience and training in ABR administration. Sixty-one audiologists completed all tasks. Correlations between audiologists’ performance measures and experience variables showed significant associations (p < 0.05) between the training period in ABR testing and audiologists’ performance in terms of both sensitivity and accuracy. In addition, the number of years conducting ABR testing correlated with specificity; no other correlations approached significance. While there are relatively few significant correlations between ABR performance and experience, accuracy in ABR reading is associated with audiologists’ length of experience and period of training. To improve audiologists’ performance in reading ABR results, the importance of training should be emphasized, and standardized levels and periods for audiologist training in ABR testing should be set.

Keywords: ABR, audiology, performance, training, experience.

1335 Study on Numerical Simulation Applied to Moisture Buffering Design Method – The Case Study of Pine Wood in a Single Zone Residential Unit in Taiwan

Authors: Y.C. Yeh, Y.S. Tsay, C.M. Chiang

Abstract:

In a good green building design project, designers should consider not only energy consumption but also the health and comfort needs of inhabitants. In recent years, the Taiwan government has paid attention to both carbon reduction and indoor air quality issues, as reflected in the Building Codes and other regulations. Taiwan is located in a hot and humid climate, and dampness in buildings leads to significant microbial pollution and building damage; the high temperature and humidity therefore present a serious indoor air quality issue. The interactions between vapor transfer and energy fluxes are essential for the whole-building heat, air and moisture (HAM) response. However, a simulation tool with short calculation time, proper accuracy, and a practical interface is needed for building design processes. In this research, we consider the vapor transfer phenomenon of building materials as well as temperature, humidity, and energy consumption in a building space. The simulation is based on the Effective Moisture Penetration Depth (EMPD) method, which is well suited to practical building design processes, and was performed with EnergyPlus, a simulation tool developed by the DOE, to simulate the indoor moisture variation in a one-zone residential unit.

Keywords: Effective Moisture Penetration Depth Method, Moisture Buffering Effect, Interior Material, Green Material, EnergyPlus

1334 An Assessment of Water and Sediment Quality of the Danube River: Polycyclic Aromatic Hydrocarbons and Trace Metals

Authors: A. Szabó Nagy, J. Szabó, I. Vass

Abstract:

Water and sediment samples from the Danube River and the Moson Danube Arm (Hungary) were collected and analyzed for contamination by 18 polycyclic aromatic hydrocarbons (PAHs) and eight trace metal(loid)s (As, Cu, Pb, Ni, Cr, Cd, Hg and Zn) in the period 2014-2015. Moreover, the trace metal(loid) concentrations were measured in the Rába and Marcal rivers (parts of the tributary system feeding the Danube). Total PAH contents in water were found to vary from 0.016 to 0.133 µg/L, and concentrations in sediments varied in the range of 0.118 to 0.283 mg/kg. Source analysis of PAHs using diagnostic concentration ratios indicated that the PAHs found in sediments were of pyrolytic origin. The dissolved trace metal and arsenic concentrations were relatively low in the surface waters. However, higher concentrations were detected in the water samples of the Rába (Zn, Cu, Ni, Pb) and Marcal (As, Cu, Ni, Pb) compared to the Danube and Moson Danube. The concentrations of trace metals in sediments were higher than those found in water samples.
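
The abstract does not list which diagnostic ratios were computed; the sketch below uses two commonly cited ones, Ant/(Ant+Phe) and Fl/(Fl+Py), with the threshold values usually quoted in the literature, on a hypothetical sample:

```python
def pah_source_ratios(conc):
    """Diagnostic ratios from a dict of PAH concentrations (same units).
    Thresholds are the commonly cited literature values, not this paper's."""
    ant_phe = conc["Ant"] / (conc["Ant"] + conc["Phe"])
    fl_py = conc["Fl"] / (conc["Fl"] + conc["Py"])
    origin = ["pyrolytic" if ant_phe > 0.1 else "petrogenic",
              "grass/wood/coal combustion" if fl_py > 0.5 else
              "petroleum combustion" if fl_py > 0.4 else "petrogenic"]
    return round(ant_phe, 3), round(fl_py, 3), origin

# Hypothetical sediment sample (ng/g dry weight):
# anthracene, phenanthrene, fluoranthene, pyrene
sample = {"Ant": 9.0, "Phe": 52.0, "Fl": 61.0, "Py": 58.0}
print(pah_source_ratios(sample))
```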

Keywords: Surface water, sediment, PAH, trace metal.

1333 Additive Friction Stir Manufacturing Process: Interest in Understanding Thermal Phenomena and Numerical Modeling of the Temperature Rise Phase

Authors: A. Lauvray, F. Poulhaon, P. Michaud, P. Joyot, E. Duc

Abstract:

Additive Friction Stir Manufacturing, or AFSM, is a new industrial process that follows the emergence of friction-based processes. AFSM is a solid-state additive process using the energy produced by friction at the interface between a rotating non-consumable tool and a substrate; the friction depends on various parameters such as axial force, rotation speed, and friction coefficient. The feed material is a metallic rod that flows through a hole in the tool. There is still a lack of understanding of the physical phenomena taking place during the process. This research aims at a better understanding and implementation of the AFSM process through numerical simulation and experimental validation performed on a prototype effector; such an approach is considered a promising way to study the influence of the process parameters and finally to identify a relevant process window. The deposition of material through the AFSM process takes place in several phases; in chronological order, these are the docking phase, the dwell time phase, the deposition phase, and the removal phase. The present work focuses on the dwell time phase, during which the temperature of the system rises through pure friction. An analytic model of the frictional heat generation takes the rotational speed and the contact pressure as its main parameters. Another influential parameter is the friction coefficient, assumed to be variable due to the self-lubrication of the system as temperature rises and the smoothing of the contacting surfaces' roughness over time. Through numerical modeling followed by experimental validation, this study examines the influence of the various input parameters on the dwell time phase. Rotation speed, temperature, spindle torque, and axial force are the main parameters monitored during the experiments, and they serve as reference data for the calibration of the numerical model. This research shows that the geometry of the tool, as well as fluctuations of the input parameters such as axial force and rotational speed, strongly influence the temperature reached and/or the time required to reach the target temperature. The main outcome is the prediction of a process window, a key result for more efficient process implementation.

Keywords: numerical model, additive manufacturing, frictional heat generation, process

1332 Low Cost Microcontroller Based ECG Machine

Authors: Muhibul H. Bhuyan, Md. T. Hasan, Hasan Iskander

Abstract:

An electrocardiographic (ECG) machine is an important piece of equipment for diagnosing heart problems, and ECG signals are also used to detect many other features of the human body and behavior. However, such machines are neither cheap nor simple enough in operation to be used in countries like Bangladesh, where most people are very low income earners. Therefore, in this paper, we have tried to implement a simple and portable ECG machine. Since the Arduino Uno microcontroller is very cheap, we have used it in our system to minimize the cost. Our designed system is powered by a 2-voltage level DC power supply. It provides wireless connectivity to deliver ECG data either to a smartphone running the Android operating system or to a PC/laptop running the Windows operating system. A graphic user interface has been designed to display the data, and an Android application has been made using the IDE for Android 2.3 and API 10. Since it requires no USB host API, almost 98% of the Android smartphones available in the country will be able to use it. We calculated the heart rate from the ECG measured by our designed machine and by an ECG machine of a reputed diagnostic center in Dhaka city, for the same people at the same time on the same day. We then calculated the percentage error between the readings of the two machines and computed its average, finding that the average percentage error is within an acceptable limit.
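
A minimal sketch of the heart-rate calculation, assuming R-peak detection on a digitized trace with SciPy; the synthetic signal, sampling rate, and peak-detection settings are illustrative, not the machine's actual firmware method:

```python
import numpy as np
from scipy.signal import find_peaks

def heart_rate_bpm(ecg, fs):
    """Estimate heart rate from an ECG trace by R-peak detection.
    `ecg` is a 1-D array, `fs` the sampling rate in Hz."""
    # R peaks: prominent maxima at least 0.4 s apart (caps rate near 150 bpm)
    peaks, _ = find_peaks(ecg, distance=int(0.4 * fs),
                          prominence=0.3 * np.ptp(ecg))
    rr = np.diff(peaks) / fs          # R-R intervals in seconds
    return 60.0 / rr.mean()

# Hypothetical usage: a synthetic 75-bpm spike train sampled at 250 Hz
fs, bpm = 250, 75
t = np.arange(0, 10, 1 / fs)
ecg = np.where((t % (60 / bpm)) < 0.02, 1.0, 0.0) + 0.02 * np.random.randn(len(t))
print(round(heart_rate_bpm(ecg, fs), 1))
```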

Keywords: Low cost ECG machine, heart diseases, remote monitoring, Arduino microcontroller.

1331 A Case Study on Optimization of Contractor’s Financing through Allocation of Subcontractors

Authors: Helen S. Ghali, Engy Serag, A. Samer Ezeldin

Abstract:

In many countries, the construction industry relies heavily on outsourcing models in executing its projects and expanding its business to fit the diverse market. Such extensive integration of subcontractors is becoming an influential factor in the contractor’s cash flow management. Accordingly, subcontractors’ financial terms are important phenomena and pivotal components for the well-being of the contractor’s cash flow. The aim of this research is to study the contractor’s cash flow with respect to the owner's and subcontractors' payment management plans, considering variable advance payment, payment frequency, and lag and retention policies. The model is developed to provide contractors with a decision support tool that can assist in selecting the optimum subcontracting plan to minimize the contractor’s financing limits and optimize the profit values. The model is built using Microsoft Excel VBA coding, and the genetic algorithm is utilized as the optimization tool. Three objective functions are investigated: minimizing the highest negative overdraft value, minimizing the net present worth of overdraft, and maximizing the project net profit. The model is validated on a full-scale project which includes both self-performed and subcontracted work packages. The results show the model's potential in optimizing the contractor’s negative cash flow values while assisting contractors in selecting suitable subcontractors to achieve the objective function.

Keywords: Cash flow optimization, payment plan, procurement management, subcontracting plan.

1330 Bridging Quantitative and Qualitative of Glaucoma Detection

Authors: Noor Elaiza Abdul Khalid, Noorhayati Mohamed Noor, Zamalia Mahmud, Saadiah Yahya, and Norharyati Md Ariff

Abstract:

Glaucoma diagnosis involves extracting three features of the fundus image: the optic cup, the optic disc, and the vasculature. Present manual diagnosis is expensive, tedious, and time consuming. A number of studies have been conducted to automate this process; however, the variability between the diagnostic capability of an automated system and that of an ophthalmologist has yet to be established. This paper discusses the efficiency of, and variability between, ophthalmologist opinion and a digital thresholding technique. The efficiency and variability measures are based on image quality grading: poor, satisfactory, or good. The images are separated into four channels: gray, red, green, and blue. Three ophthalmologists graded the images based on image quality. The images were then thresholded using multi-thresholding and graded in the same way as done by the ophthalmologists, and the two sets of grades were compared. The results show a small variability between the results of the ophthalmologists and digital thresholding.
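
A minimal sketch of the multi-thresholding step, using scikit-image's multi-Otsu algorithm on a stand-in grayscale image; the choice of three classes and the test image are assumptions for illustration, not the paper's exact procedure:

```python
import numpy as np
from skimage import data, filters

# Stand-in image; a real run would load a fundus photograph channel instead
image = data.camera()

# Multi-Otsu: two thresholds split the grey levels into three classes,
# a crude stand-in here for separating bright and dark fundus structures
thresholds = filters.threshold_multiotsu(image, classes=3)
regions = np.digitize(image, bins=thresholds)

print("thresholds:", thresholds)
print("class pixel counts:", np.bincount(regions.ravel()))
```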

Keywords: Digital Fundus Image, Glaucoma Detection, Multithresholding, Segmentation.

1329 Distribution and Source of PAHs in Surface Sediments of Canon River Mouth, Taiwan

Authors: Chiu-Wen Chen, Chih-Feng Chen, Cheng-Di Dong

Abstract:

Surface sediment samples were collected from the Canon River mouth, Taiwan, and analyzed for polycyclic aromatic hydrocarbons (PAHs). Total PAH concentrations varied from 337 to 1,252 ng/g dry weight, with a mean of 827 ng/g dry weight. The spatial distribution of PAHs reveals that the PAH concentration is relatively high in the river mouth region and gradually diminishes toward the harbor region. Diagnostic ratios showed that the likely source of PAHs in the Canon River mouth is petroleum combustion. The toxic equivalent concentrations (TEQcarc) of PAHs varied from 47 to 112 ng TEQ/g dry weight, with the higher total TEQcarc values found in the river mouth region. Compared with the US Sediment Quality Guidelines (SQGs), the observed levels of PAHs at the Canon River mouth were lower than the effects range low (ERL) and would probably not exert adverse biological effects.

Keywords: PAHs, sediment, river mouth, sediment quality guidelines (SQGs), toxic equivalent (TEQcarc)

1328 Technical, Environmental, and Financial Assessment for the Optimal Sizing of a Run-of-River Small Hydropower Project: A Case Study in Colombia

Authors: David Calderón Villegas, Thomas Kalitzky

Abstract:

Run-of-river (RoR) hydropower projects represent a viable, clean, and cost-effective alternative to dam-based plants and provide decentralized power production. However, an RoR scheme's cost-effectiveness depends on the proper selection of site and design flow, which is a challenging task because it requires multivariate analysis. In this respect, this study presents the development of an investment decision support tool for assessing the optimal size of an RoR scheme considering technical, environmental, and cost constraints. The net present value (NPV) from a project perspective is used as the objective function supporting the investment decision. The tool has been tested by applying it to an actual RoR project recently proposed in Colombia. The results show that the optimum point in financial terms does not match the flow that maximizes energy generation from the river's available flow: the flow that maximizes energy corresponds to 5.1 m3/s, whereas a flow of 2.1 m3/s maximizes the investors' NPV. Finally, a sensitivity analysis is performed to determine the NPV as a function of changes in the debt rate, the electricity prices, and the CapEx. Even in the worst-case scenario, the optimal size represents a positive business case, with an NPV of USD 2.2 million and an internal rate of return (IRR) 1.5 times higher than the discount rate.
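
A minimal sketch of the financial comparison behind such a sizing decision, using the numpy-financial package; all cash flows, rates, and lifetimes below are hypothetical stand-ins, not the case study's figures:

```python
import numpy_financial as npf

# Hypothetical RoR cash flows in million USD: CapEx at year 0, then 20
# years of net revenue, for two candidate design flows
capex, years = -12.0, 20
flows_51 = [capex] + [1.55] * years          # 5.1 m3/s: maximum-energy sizing
flows_21 = [capex * 0.55] + [1.05] * years   # 2.1 m3/s: smaller, cheaper sizing

rate = 0.10  # assumed discount rate
for name, cf in [("5.1 m3/s", flows_51), ("2.1 m3/s", flows_21)]:
    print(name, "NPV:", round(npf.npv(rate, cf), 2),
          "IRR:", round(npf.irr(cf), 3))
# With these made-up numbers the smaller flow yields the higher NPV,
# mirroring the kind of result the study reports.
```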

Keywords: small hydropower, renewable energy, RoR schemes, optimal sizing, financial analysis

1327 Decision Support System for Flood Crisis Management using Artificial Neural Network

Authors: Muhammad Aqil, Ichiro Kita, Akira Yano, Nishiyama Soichi

Abstract:

This paper presents an alternative approach that uses an artificial neural network to simulate the flood level dynamics in a river basin. The algorithm was developed in a decision support system environment in order to enable users to process the data. The decision support system is found to be useful due to its interactive nature, flexibility in approach, and evolving graphical features, and it can be adopted for any similar situation to predict the flood level. The main data processing includes gauging station selection, input generation, lead-time selection/generation, and length of prediction. The program enables users to process the flood level data, to train/test the model using various inputs, and to visualize results; its code consists of a set of files that can be modified to match other purposes. The running results indicate that the decision support system applied to the flood level has reached encouraging results for the river basin under examination. The comparison of the model predictions with the observed data was satisfactory: the model is able to forecast the flood level up to 5 hours in advance with reasonable prediction accuracy. Finally, the program may also serve as a tool for real-time flood monitoring and process control.

Keywords: Decision Support System, Neural Network, Flood Level

1326 A Construction Management Tool: Determining a Project Schedule Typical Behaviors Using Cluster Analysis

Authors: Natalia Rudeli, Elisabeth Viles, Adrian Santilli

Abstract:

Delays in the construction industry are a global phenomenon, and many construction projects experience extensive delays exceeding the initially estimated completion time. The main purpose of this study is to identify construction projects' typical schedule behaviors in order to develop a prognosis and management tool: knowing a construction project's schedule tendency will enable evidence-based decision-making, allowing resolutions to be made before delays occur. This study presents an innovative approach that uses the cluster analysis method to support predictions during Earned Value Analyses. A clustering analysis was used to predict the future behavior of scheduling and of the principal Earned Value Management (EVM) and Earned Schedule (ES) indexes in construction projects. The analysis used a database of 90 different construction projects and was validated with additional data extracted from the literature and with another 15 contrasting projects. For all projects, planned and executed schedules were collected and the principal EVM and ES indexes were calculated. A complete linkage classification method was used; in this way, the cluster analysis considers that the distance (or similarity) between two clusters must be measured by their most disparate elements, i.e. that the distance is given by the maximum span among their components. Finally, through the use of the EVM and ES indexes and Tukey and Fisher pairwise comparisons, the statistical dissimilarity was verified and four clusters were obtained. Construction projects show an average delay of 35% of their planned completion time. Furthermore, four typical behaviors were found, and for each of the obtained clusters the interim milestones and the necessary rhythms of construction were identified. In general, the detected typical behaviors are: (1) projects that achieve 5% of the work in the first two tenths and maintain a constant rhythm until completion (greater than 10% for each remaining tenth), finishing on the initially estimated time; (2) projects that start with an adequate construction rate but suffer minor delays, culminating in a total delay of almost 27% of the planned time; (3) projects that start with performance below the planned rate and end up with an average delay of 64%; and (4) projects that begin with poor performance, suffer great delays, and end up with an average delay of 120% of the planned completion time. The obtained clusters compose a tool to identify the behavior of new construction projects by comparing their current work performance to the validated database, thus allowing the correction of initial estimations towards more accurate completion schedules.
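
A minimal sketch of the clustering step, assuming each project is described by its cumulative progress at each tenth of the planned schedule; SciPy's complete-linkage implementation corresponds to the most-disparate-elements distance described above, while the synthetic progress curves stand in for the 90-project database:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)

# Each row: a project's cumulative % of work done at each tenth of the
# planned schedule (synthetic stand-ins for the real project database)
projects = np.clip(np.cumsum(rng.uniform(2, 18, size=(20, 10)), axis=1), 0, 120)

# Complete linkage: inter-cluster distance = distance between the most
# disparate members of the two clusters
Z = linkage(projects, method="complete", metric="euclidean")
labels = fcluster(Z, t=4, criterion="maxclust")   # cut the tree into 4 clusters
print(labels)
```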

Keywords: Cluster analysis, construction management, earned value, schedule.

1325 Automated Thickness Measurement of Retinal Blood Vessels for Implementation of Clinical Decision Support Systems in Diagnostic Diabetic Retinopathy

Authors: S. Jerald Jeba Kumar, M. Madheswaran

Abstract:

The structure of the retinal vessels is a prominent feature that reveals information on the state of disease, reflected in the form of measurable abnormalities in thickness and colour. An analysis of the retinal vascular structure for the implementation of a clinical diabetic retinopathy decision-making system is presented in this paper. The retinal vascular structure consists of thin blood vessels, so the accuracy of the analysis is highly dependent upon the vessel segmentation. In this paper, blood vessel thickness is automatically detected using preprocessing techniques and a vessel segmentation algorithm. First, the captured image is binarized to bring out the blood vessel structure clearly; it is then skeletonized to obtain the overall structure with all the terminal and branching nodes of the blood vessels. By identifying the terminal nodes and the branching points automatically, the thickness of the main and branching blood vessels is estimated. Results are presented and compared with those provided by clinical classification on 50 vessels collected from Bejan Singh Eye Hospital.
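
A minimal sketch of the binarize-then-skeletonize step with scikit-image; the toy vessel map and the area-per-skeleton-pixel thickness estimate are illustrative simplifications, not the paper's full algorithm:

```python
import numpy as np
from skimage.morphology import skeletonize

# Hypothetical binarized vessel map (True = vessel); a real run would start
# from a segmented fundus image
vessel = np.zeros((64, 64), dtype=bool)
vessel[30:34, 5:60] = True     # a thick horizontal "vessel"
vessel[10:55, 30:33] = True    # a thick vertical branch

# Skeletonization reduces each vessel to a one-pixel-wide centerline
skeleton = skeletonize(vessel)

# A crude global thickness estimate: vessel area per unit centerline length
print("mean thickness (px):", vessel.sum() / skeleton.sum())
```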

Keywords: Diabetic retinopathy, binarization, segmentation, clinical decision support systems.

1324 An Empirical Study of the Effect of Robot Programming Education on the Computational Thinking of Young Children: The Role of Flowcharts

Authors: Wei Sun, Yan Dong

Abstract:

There is an increasing interest in introducing computational thinking at an early age. Computational thinking, like mathematical thinking, engineering thinking, and scientific thinking, is a kind of analytical thinking. Learning computational thinking skills not only improves technological literacy but also equips learners with practical skills such as problem-solving. As people realize the importance of computational thinking, the field of educational technology faces a problem: how to choose appropriate tools and activities to help students develop computational thinking skills. Robots are gradually becoming a popular teaching tool, as they provide a tangible way for young children to access technology, and controlling a robot through programming offers them opportunities to engage in developing computational thinking. This study explores whether the introduction of flowcharts into robotics programming courses can help children convert natural language into a programming language more easily, and thus better cultivate their computational thinking skills. An experimental study was adopted with a sample of children aged six to seven (N = 16), and a one-meter-tall humanoid robot was used as the teaching tool. Results show that children can master basic programming concepts through robotics courses, and their computational thinking improved significantly. Moreover, the results suggest that flowcharts do have an impact on young children’s development of computational thinking skills, but only a significant effect on the "sequencing" and "correspondence" skills. Overall, the study demonstrates that the humanoid robot and flowcharts have qualities that foster young children's learning of programming and development of computational thinking skills.

Keywords: Robotics, computational thinking, programming, young children, flowcharts.

1323 Influence of Deep Cold Rolling and Low Plasticity Burnishing on Surface Hardness and Surface Roughness of AISI 4140 Steel

Authors: P. R. Prabhu, S. M. Kulkarni, S. S. Sharma

Abstract:

Deep cold rolling (DCR) and low plasticity burnishing (LPB) are cold working processes that readily produce a smooth and work-hardened surface by plastic deformation of surface irregularities. The present study focuses on the surface roughness and surface hardness of AISI 4140 work material, using a fractional factorial design of experiments. The surface integrity aspects of the work material were assessed in order to identify the predominant factors amongst the selected parameters; these were then ranked in order of significance, and the levels of the factors were set for minimizing surface roughness and/or maximizing surface hardness. In the present work, the influence of the main process parameters (force, feed rate, number of tool passes/overruns, initial roughness of the workpiece, ball material, ball diameter, and lubricant used) on the surface roughness and hardness of AISI 4140 steel was studied for both the LPB and DCR processes, and the results are compared. It was observed that the LPB process improved surface hardness by 167%, while the DCR process improved it by 442%. It was also found that the force, ball diameter, number of tool passes, and initial roughness of the workpiece are the most pronounced parameters, having a significant effect on the workpiece's surface during deep cold rolling and low plasticity burnishing.

Keywords: Deep cold rolling, burnishing, surface roughness, surface hardness, design of experiments, AISI 4140 steel.

1322 Efficient Supplies to Assembly Areas from Storage Stages

Authors: Matthias Schmidt, Steffen C. Eickemeyer, Prof. Peter Nyhuis

Abstract:

Guaranteeing the availability of the required parts at the scheduled time represents a key logistical challenge, especially when several parts are required together. This article describes a tool that supports positioning within the area of conflict between low stock costs and a high service level for the consumer.

Keywords: Systems Modeling, Manufacturing Systems, Simulation & Control, logistics and supply chain management

1321 A Novel Prostate Segmentation Algorithm in TRUS Images

Authors: Ali Rafiee, Ahad Salimi, Ali Reza Roosta

Abstract:

Prostate cancer is one of the most frequent cancers in men and a major cause of mortality in most countries. Many diagnostic and treatment procedures for prostate disease require accurate detection of the prostate boundaries in transrectal ultrasound (TRUS) images, which is a challenging and difficult task due to weak prostate boundaries, speckle noise, and the short range of gray levels. In this paper a novel method for automatic prostate segmentation in TRUS images is presented, involving preprocessing (edge-preserving noise reduction and smoothing) followed by segmentation. The speckle reduction has been achieved using a stick filter, and a top-hat transform has been implemented for smoothing. A feed-forward neural network and local binary patterns are used together to find a point inside the prostate object. Finally, the prostate boundary is extracted from the inside point by an active contour algorithm. A number of experiments were conducted to validate this method, and the results showed that the new algorithm extracted the prostate boundary with an MSE less than 4.6% relative to the boundary provided manually by physicians.

Keywords: Prostate segmentation, stick filter, neural network, active contour.

1320 The DAQ Debugger for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

In general, state-of-the-art data acquisition systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency, and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, with thousands of lines of code, the debugging process is absolutely essential to reveal all software issues; unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating its source, and then either correcting the problem or determining a way to work around it. It provides a layer for easy integration into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information needed for a deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run; it helped to reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
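
The signal-handling idea can be illustrated in a few lines with Python's standard faulthandler module; this is an analogue of the concept for illustration only, not the iFDAQ's actual implementation:

```python
import faulthandler
import os
import signal
import sys

# Dump tracebacks for all threads if the process crashes
# (SIGSEGV, SIGFPE, SIGABRT, SIGBUS)
faulthandler.enable(file=sys.stderr)

# Also dump state on demand via SIGUSR1, without stopping the process --
# analogous to inspecting a running DAQ process during data taking (Unix only)
faulthandler.register(signal.SIGUSR1, file=sys.stderr, all_threads=True)

print("send SIGUSR1 to pid", os.getpid(), "for a live stack dump")
signal.pause()  # stand-in for the process's event loop
```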

Keywords: DAQ debugger, data acquisition system, FPGA, system signals, Qt framework.

1319 The Importance of Development in Laboratory Diagnosis at the Intersection

Authors: Agus Sahri, Cahya Putra Dinata, Faishal Andhi Rokhman

Abstract:

An intersection is a critical area of a highway, a place of conflict points and congestion where two or more roads meet. The conflicts that occur at an intersection include diverging, merging, weaving, and crossing; to deal with them, a crossing control system is needed, and an intersection may be controlled by one of two systems, signalized or unsignalized. The control system can affect the intersection's performance, and in Indonesia there are still many intersections with poor performance. Analysis of the parameters measuring the performance of an intersection in Indonesia is guided by the 1997 Indonesian Road Capacity Manual. For this reason, this study aims to develop laboratory diagnostics at intersections to analyze the parameters that can affect an intersection's performance. The research method used is research and development. The laboratory diagnosis includes anamnesis, differential diagnosis, inspection, diagnosis, prognosis, specimens, analysis, and sample data analysis. It is hoped that this research will encourage the development and application of laboratory diagnostics at intersections in Indonesia so that intersections can function optimally.

Keywords: Intersection, laboratory diagnostic, control systems, Indonesia.
